These are tests done in Photoshop:
These in Maxwell and Vray:
Could be animated and used for weapon sprites
Here I will be collecting tools and methods for advanced rendering. I have another post focussing on the volumetric end of advanced rendering.
Link: www.indigorenderer.com --- blog
-- Tools --
Link: www.maxwellrender.com
A physically accurate renderer; very slow. Can render very real SSS-type surfaces and accurate color dispersion for effects like color splitting in glass or diamonds, and thin-film effects like the coloration on bubbles. Cannot render volumetrics.
SSS - Subsurface Scattering
Can simulate a camera lens, flare, bloom and bokeh very well (it can also work as a post effect on any HDR image you send it).
Dispersion:
Accurate Caustics:
Accurate DOF and motion blur:
Link: www.indigorenderer.com
PC only, FREE, similar to Maxwell in its goals... plus it's also slow.
Dispersion and Aberration:
Volumetric/SSS
Here it's rendering what appears to be a volumetric effect using SSS, which is an interesting idea, as they are essentially the same thing when you're computing them accurately:
Vray and Vray4C4D
Link: www.chaosgroup.com
Link: www.vrayforc4d.com
Vray is brilliant and my rendering engine of choice right now, but it can't yet do accurate dispersion, aberration or volumetrics.
I think its DOF and motion blur are fairly accurate (some things are missing for C4D right now), and it can simulate bokeh.
Volumetric rendering isn't added yet, but it's coming eventually if these examples are anything to go by:
I've created a very rough example of this here, so check the link below for the movie:
It needs a lot of work; for a start, the averaging code works linearly, but our eyes see light exponentially/logarithmically. So the perceptual average brightness is not the same as the calculated one, which skews it a bit.
But it's a good start! And I already have a few ideas how to improve it greatly.
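To illustrate the gap, here's a rough Python/numpy sketch (not the actual AE expression) comparing a straight linear average against a log (geometric) average of a linear-light HDR frame:

```python
import numpy as np

def average_brightness(frame, eps=1e-6):
    # Rec. 709 weights give a reasonable luminance estimate for linear RGB
    lum = 0.2126 * frame[..., 0] + 0.7152 * frame[..., 1] + 0.0722 * frame[..., 2]
    linear_avg = lum.mean()                     # what a naive average reports
    log_avg = np.exp(np.log(lum + eps).mean())  # geometric mean, closer to perception
    return linear_avg, log_avg

# One very bright pixel skews the linear average far more than the log average:
frame = np.full((256, 256, 3), 0.18, dtype=np.float32)
frame[0, 0] = 1000.0
print(average_brightness(frame))
```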
Ok, new day. I've found a nice model of a house, fixed it up a little and boarded some windows up, all in the name of testing and fine-tuning my auto-exposure expression.
I've rendered it at super super super low quality and it's taking around 20-30 seconds per frame. Below is one such frame after some fine post work in AE:
That looks strikingly good considering all I did was put a physical sun/sky on and hit render. The realism really is down to the color correction; it looked kinda crap before I did the post work in AE.
Should look even better if I can find a way to batch Maxwell Simulens it.
Well, my auto-exposure expression is pretty nice, and it helps a lot, but it's kinda shit too. It's going to take a huge leap of math to make it work properly and not just be based on guesswork.
Working like this is HARD. You have to be so careful with color profiles and linear light; everyone and their dog wants to fiddle with the HDR before you get to see it. And to make matters worse, when I work in what I believe is the correct linear color space in AE, it looks fantastic in the realtime viewport, but if I hit render... I get a different result!?!?!?
Clearly even Adobe have a hard time with this color stuff.... fantastic...
--
Ok, I've figured out how to do this now, in theory. I just realized a checkbox on the exposure effect alters its behavior, bringing it in line with what I'd expect, whereby an exposure of 2 versus 1 results in doubling the average brightness of the image. Now I just have to figure out the math to work that out in reverse... I know it will involve log() in some way... I really can't wrap my head around the math... *hunts for math geeks AGAIN*
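If that checkbox really does make brightness scale with 2^exposure (so exposure 2 versus 1 is a doubling), the reverse math should reduce to a log2 ratio. A rough sketch of the idea in Python rather than an AE expression; the 0.18 target is just an illustrative mid-grey, not a magic number:

```python
import math

def exposure_for_target(measured_avg, target_avg=0.18):
    # If brightness scales with 2**exposure, the exposure that maps the
    # measured average onto the target is the log2 of their ratio.
    return math.log2(target_avg / measured_avg)

print(exposure_for_target(0.05))  # roughly +1.85 stops
```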
Simulating camera/eye optical effects and response
lens flares
depth of field
motion blur
lens effects/diffraction
bokeh
chromatic aberration
tone mapping/film response
-- Examples --
For more detail on a project of mine that involved it you can go here:
Link: helios.mine.nu --- Lens Simulation
I also did a bunch of research into simulating auto-exposure
Fresnel Lens Simulated in Maxwell:
Link: www.digitalartform.com --- fresnel_lens_-
Fryrender
Zipper.deviantart.com
Light splitting, chromatic aberration, rainbows = pretty!
Fringing
Link: www.dofpro.com
Link: www.missouriskies.org --- february_rainbow_2006
Some nice real world lens flares:
Link: www.flickr.com --- 1026048408
Link: flickr.com --- 288235379
-- Tools --
Make your own LensFlare by hand in AfterEffects with expressions:
Link: library.creativecow.net --- lens_flare
Other 2D solutions like Knoll Light Factory:
Link: www.redgiantsoftware.com --- knoll-light-factory-pro
Link: www.frischluft.com
Frischluft : Flair - BoxBlur with Aberrations:
Frischluft Lenscare: Simulated DOF:
Frischluft Lenscare:
Frischluft Lenscare: Distortion around edge:
Frischluft - Fresh Curves
Looks powerful; hopefully it's 32-bit, as I've been looking for a powerful 32-bit curves editor!
Frischluft - ZbornToy
Link: www.maxwellrender.com
With Maxwell Render's SimuLens tool you can (on any HDR image you feed it) produce a very realistic lens simulation, simulating the aperture and any dirt/obstruction on the lens:
The above was rendered in Vray by me, then color-treated and given to Maxwell for the lens effect.
You cannot (as far as I can see) do your typical cheesy lens flare with its several concentric rings.
Some tests of throwing it some brush strokes done in Photoshop in 32-bit:
Not quite realistic, but adds a nice touch
I've been racking my brain lately over human eyesight optics: the natural response curve of the human eye and how I can simulate that on a 3D render, including bloom effects, how the eye adapts to changes in brightness, and depth-of-field auto-focus.
I'm working now with a realistic type of scene; the end goal will be to carry the 'realism' knowledge and tools over to not-so-ordinary subject matter and retain the same level of realism with rather more abstract items.
I believe you should always render to a linear color space, with no tone mapping. Then take the 32-bit output and simulate the tone/color mapping in post using something like AfterEffects. This then gives you the ability to detect over-bright areas and 'bloom' them, and realistically simulate film/camera/eye response to color and brightness. So simulate the camera in post, not in render (if possible).
This isn't great for things like DOF, which really needs to be done as part of the ray tracing until we can save out volumetric images from renders. But hopefully enough information exists to simulate the other effects more interactively.
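As a rough idea of what the "detect over-bright areas and bloom them" step could look like on the 32-bit output, here's a hedged Python/scipy sketch (not an AE setup; the threshold, radius and strength values are arbitrary):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def add_bloom(frame, threshold=1.0, radius=8.0, strength=0.5):
    # Keep only the energy above the threshold -- the "over-bright" areas
    overbright = np.clip(frame - threshold, 0.0, None)
    # Blur each channel to spread that energy, like scatter in a lens or the eye
    bloomed = np.stack(
        [gaussian_filter(overbright[..., c], sigma=radius) for c in range(frame.shape[-1])],
        axis=-1,
    )
    # Add the blurred energy back on top of the original linear image
    return frame + strength * bloomed
```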
I was wondering why a lot of my renders didn't seem to be HDR, or looked 'weird'; then I figured out it was because I was relying on Vray's tone mapping to adjust my image to expose all areas evenly, which produces a weird, bland, but evenly exposed image.
All of this makes it increasingly important to have an accurate setup, including the scale of the scene and accurate sun and area-light brightness. And this can be hard if you use any HDRs to light the scene, as they are usually only relatively calibrated and don't in any way emit an accurately correct brightness of light.
I'm also working on a way to have an animation auto-expose like a camera or the human eye to changes in brightness.
Right now, for animation, the only way I can see of changing the exposure over time is to render the whole animation out to HDR once, then have another 3D program import each frame of that animation and display it on a camera-facing plane (or hang it in front of the camera). Then, with an interactive render region, keyframe the camera exposure settings so the image is exposed correctly. Then hide the plane and re-render the movie with those keyframed exposure settings, thus simulating the proper DOF and motion blur that go with it too. But I want REAL subtle and fast exposure changes, and there's no way you could keyframe that; it would be insanely painful.
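One way the hand keyframing could be avoided, at least in principle: measure each frame's log-average brightness, turn it into a target exposure, and ease toward it a little each frame so the changes stay subtle and fast. A sketch of that idea in Python; the adaptation rate is a made-up parameter, not anything measured from a camera or the eye, and the per-frame averages are assumed to come from something like the earlier averaging sketch:

```python
import numpy as np

def auto_exposure_curve(frame_log_averages, target=0.18, adapt_rate=0.15):
    # Per-frame exposure (in stops) that eases toward whatever would hit the
    # target mid-grey, instead of snapping to it -- a crude stand-in for the
    # way the eye (or a camera's auto-exposure) adapts over time.
    exposures = []
    current = 0.0
    for avg in frame_log_averages:
        wanted = np.log2(target / avg)              # exposure that would nail the target
        current += adapt_rate * (wanted - current)  # lag behind it a little each frame
        exposures.append(current)
    return exposures

# e.g. stepping out of a dark interior into bright sun over a few frames:
print(auto_exposure_curve([0.02, 0.02, 0.5, 0.5, 0.5, 0.5]))
```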
As a side note, I see 3ds Max Design is to include Exposure technology for simulating and analyzing sun, sky, and artificial lighting to assist with LEED 8.1 certification.
UPDATE: Ok, I should be able to do this within AfterEffects now. I've found it can detect average brightness, but the logarithmic math and such is beyond me, so I'm waiting on some serious math nerds to help me out with a working formula
:-)
I've found Maxwell simulates a camera lens really well, and you can load any HDR image into it and have it work its magic. The blurring is a bit extreme in the following images; think of it as a very, very dirty camera lens.
Here is a before and after:
The above is with some color work in AE, simulating some film color profiles. Simulating the response curve of film greatly improves the image: you get more interesting contrasts and color shifts, and in general it just looks nicer and more photographic; it removes the highly linear and boring CG look that's so typical otherwise.
Link: fnordware.blogspot.com --- hdr-tone-mapping-using-film-profiles
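Even without real film profiles, a simple global tone curve already breaks the flat linear look by rolling the highlights off. A quick Python sketch of the idea (a Reinhard-style curve, not the film-profile method from the link above):

```python
import numpy as np

def reinhard_tonemap(frame, exposure=0.0):
    # Simple global tone curve on a linear-light HDR frame: scale by the chosen
    # exposure, roll the highlights off, then gamma-encode for display.
    scaled = frame * (2.0 ** exposure)
    mapped = scaled / (1.0 + scaled)                  # Reinhard-style roll-off
    return np.clip(mapped, 0.0, 1.0) ** (1.0 / 2.2)   # rough display gamma
```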
Details on linear workflow, general 32-bit color-ness and jazz can be found at the following link; it's a good site full of info on cinematography, color correction, 32-bit linear workflow, etc.:
Link: prolost.blogspot.com
Odd super-bright pixels appeared in my render and resulted in this blow-out... cool though; no reason you couldn't use this effect on images drawn in Photoshop from scratch.
ooo.. I like it.. very much
Let the busted camera effects commence!:
Looks almost real
:-)
Like a scale model. Burns done with stock footage of film burns, thanks to Artbeats Film Clutter 1.
Not conscious of it, as I don't know anything about cinematography, but I have apparently been trying to simulate super-8 film. I wanted a hand-held look and some more interesting colors and exposures; a more textural... rich, warm... romantic? look to it, which I suppose super-8 achieves rather naturally. I'm not interested in recreating reality exactly; I want a better reality, an artistic reality. Technology (digital cameras) may be getting closer to capturing reality, but is that a good thing? Well, yes, as long as you can then go and post-produce it to be more interesting.
I get the feeling that capturing friends and family on a modern consumer camera is more akin to spying or surveillance footage, unflattering, cold. It's kinda funny really, using a computer to simulate what is essentially bad technology.
Hah, just found someone saying the same thing: "It's somewhat ironic that now we have the ability to rid our screens of the scratches, lines and bits of debris that find their way onto film, we choose to put them back in digitally"... "Similarly, just as lens manufacturers have worked hard over years developing special coatings to eliminate artifacts, those involved in CG can't get enough of these lens effects. Though considered undesirable in traditional photography, their popularity in CG lies in the fact that they are visual hooks that we associate with reality, or at least a photographed version of reality. They are also fairly useful when it comes to hiding something that's not rendering quite right too."
I want that same warmth and nice exposure, with all the beautiful blooms and light effects that go with it, but not so much the dirt, blur, shake and low frame rate. I'm going to have an experiment.
Just looking at the whole area of things, really: light can do some wonderful things, and digital technology isn't quite there yet in terms of recreating it. The way light splits, lens flares, aberrations, rainbows... we as human beings find them beautiful for whatever reason.
Also, things like simulating a huge depth of field and motion blur just seem to make images more natural and attractive, supposedly because the human eye has such a large depth of field but we normally don't notice it, as whatever we are looking directly at is always in sharp focus.
Looking at recreating the look of viewing through someone's eyes or via a hand-held camera, I found this tool for C4D:
Link: www.c4d-jack.de --- buy
Also of note is Cinema's Architectural Bundle, which includes some walkthrough tools that can be recorded, allowing you to freely walk around your model with the keyboard and mouse just like an FPS game.
There's also CSTools MoCam, which is free!
Link: circlesofdelusion.blogspot.com --- cstools-faq
Real Super 8 footage:
Link: www.youtube.com