If you’ve been following my blog and would like to know more about creating photorealistic 3D CGI renders, you can go straight to the source with these books from Amazon:
I’d hoped to gain more insight from reading Jeremy Birn’s chapter on Decals and Dirt, but found that I had already been using what he refers to as decals in my projects. In the example robot image below, I had already applied decals (stickers) to my robot; what I hadn’t done, however, was add dirt.
In this example I’ve probably gone a little over the top with the dirt but it helps to illustrate the point I’m making.
When I showed the animated version of this robot to a fellow artist, he immediately noticed the lack of movement in the robot’s hands and how clean the robot looked in comparison to the background. Although the film noise/grain and motion blur that had been added were helping to blend the two media together, it seemed the texture on the surface of the robot’s skin was too clean and shiny to begin with.
In the example above, I’ve used Ambient Occlusion (a technique that detects creases where two surfaces meet, often used to emphasise shadows) to create the dirt, which in my opinion is an inaccurate way to achieve the effect. When creating dirt for a photorealistic effect, you should paint the dirt onto a model by hand. Birn (p229) correctly states that you should “choose dirt maps that add specific, motivated detail to your objects. Think through the story behind all of the stains and imperfections on a surface – something has to cause any dirt, scratches, or stains that you would see”.
In the example above, dust would have fallen and settled onto the robot from above, which means that the robot’s shoulders, head, bridge of its nose and chest would probably have accumulated the most dust. Similarly, water dripping from the ceiling would have created vertical streaks running down the robot’s body.
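As a quick aside, the “dust settles from above” intuition can also be expressed procedurally: weight a dirt map by how closely each surface normal points upward. This is a standard trick rather than anything from Birn’s chapter, and the little function below is just my own sketch (it assumes unit-length normals):

```python
def dust_weight(normal, up=(0.0, 1.0, 0.0)):
    """Return a 0..1 dust weight for a unit surface normal.

    Surfaces facing straight up (shoulders, head, bridge of the nose)
    get a weight of 1.0; vertical or downward-facing surfaces get 0.0.
    """
    dot = sum(n * u for n, u in zip(normal, up))
    return max(0.0, dot)  # downward-facing surfaces collect no dust

print(dust_weight((0.0, 1.0, 0.0)))   # flat top, e.g. shoulders: 1.0
print(dust_weight((1.0, 0.0, 0.0)))   # vertical side: 0.0
```

Multiplying a hand-painted dirt texture by a mask like this keeps the dust where gravity would actually put it.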
Painting in these extra details will require considerable time, but it’s this attention to detail that will eventually sell the illusion of photorealism.
…and lots of other sources.
Dmitry Denisov’s Photoreal 3D Tutorial
I’d previously suggested that I’d be spending a few days following a tutorial in issue 48 of 3D Artist magazine on Photorealism in 3D.
I decided that dedicating 15 hours to the tutorial would not be time well spent; however, simply reading through Dmitry Denisov’s instructions has highlighted some interesting thoughts. Firstly, I was pleased to see that, in the same vein as my own research, Denisov attempts to replicate the unwanted by-products of real world cameras in his 3D visualisations. This close-up image shows how Denisov creates chromatic aberration in the water drops.
Currently absent from my own images is Denisov’s use of air particles. Quite often (always?) there are particles of dust and the like floating in the air, although they are not always visible to the naked eye. In the ladybug picture, Denisov generates 3D particles in the scene to represent the presence of dust. This effect would have been extremely useful for the robot animation I produced when studying animation artefacts, as that animation was staged in a very dusty environment.
Denisov develops this effect even further by adding bokeh (lens blur) to the dust particles. Combined with his use of volumetric (visible) lighting, the result is a very effective and realistic lighting setup. The image below shows the combination of volumetric lighting, air particles and bokeh in Denisov’s work.
If you look closely in the image, you can see that Denisov also has a layer in his file for vignetting, another unwanted artefact of real cameras that has previously been discussed.
Whilst giving some more thought to real camera artefacts, it occurred to me that I hadn’t yet considered exposure. It is often difficult to photograph a landscape scene and acquire correct exposure for both land and sky; it is quite common for the sky to become overexposed and the highlights clipped. To overcome this problem, Nikon DSLR cameras use a feature called Active D-Lighting. For a 3D artist trying to achieve photorealism, I wonder if these unwanted exposure problems should be artificially introduced?
As I had expected, only limited learning could come from reading a magazine, and so I resigned myself to obtaining some research materials that I could really sink my teeth into.
Research Papers on Photorealism
Many of the sources I’ve consulted confirm the observations I have made previously. In fact, one of the opening statements in Rackwitz and Sterner’s research paper (2007, p11) reads: “To achieve photorealism with a computer is a challenging task and requires understanding of the fundamental physics and psychophysics of light. How does light interact with materials and surfaces in the real world? What happens when light rays enter the human eye?”. It seems that when I wrote my hypothesis I was barking up the right tree!
In the same paper Rackwitz and Sterner (p26) give praise to the pursuit of replicating unwanted artefacts of real world cameras. They state that “Producing a photo with a digital camera includes minor noise, while rendering generates totally ‘crispy’ and clean images”.
They also confirm that hard/square edges should be avoided. They state that “Real objects can hardly be as hard‐edged as they are in 3D”… “The modeller has to be aware of those problems and correct them, at least for photorealistic rendering” (p26).
In his tutorial, Dmitry Denisov uses Directional Lights to achieve a volumetric (visible) lighting effect. When reading Antione Bishc’s research paper for his MSc in Media Production, entitled ‘Photorealism in 3D Images’ (2007), I noticed that he too discusses the use of directional lighting.
I was a little puzzled as to what a directional light was until I came across this video on Real World Lighting in Cinema 4D by Nick Campbell. In the video, Nick explains the difference between the hard shadows created by a distant light, such as the sun, compared to the soft shadows created by a large and up-close source of light such as a studio soft box.
I’d already come to understand that an infinite (directional) light should be used to emulate the sun but hadn’t understood why. I’d highly recommend that anybody interested in lighting watch this video; however, more research is needed on my part to understand the volumetric/visible lighting effect.
Warm and Cold Lighting
Although not necessarily related to photorealism, something else that I hadn’t been able to resolve had been bugging me for some time. Unexpectedly this week, I found the solution in two separate locations. I wanted to know why films use so much blue and orange lighting. Martin Scorsese’s film Hugo (2011) is a paradigm example of this, as its colour palette uses almost nothing but blue and orange.
In their research paper ‘Photorealistic Rendering with V‐ray’ (2007, p23) Rackwitz and Sterner explain that a balance of warm and cold light will “generate a feeling of depth and spatiality”.
In his video on real world lighting, Nick Campbell says that a warm light is often used for the key light and a cold light is used as a fill light. He speculates that this might be because our eyes are used to seeing the sky act as a fill light whilst the sun acts as a key light.
Rackwitz and Sterner (p48) go on to say that “One simple rule of thumb is daylight or lights that come from an outside environment should be cold and kind of blue with a high color temperature, while indoor lamps have a low color temperature, a yellow touch, a bit dim light and not that intensity as the bright cold light”.
Whilst on the subject of warm and cold lighting, I’ve found that a 3D artist in search of photorealism should create lights that have the same colour temperature as their real world counterparts. The following is a small selection of real world lights and their colour temperature.
- Candle flame: 1,900 K
- 100-watt household bulb: 2,865 K
- Daylight: 5,600 K
If a 3D artist in pursuit of photorealism was creating a directional light that was intended to emulate the sun, then the virtual light’s colour temperature should match that of its real world counterpart, i.e. approximately 5,600 K (depending on the time of day, etc.).
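For software that doesn’t accept a colour temperature directly, a kelvin value can be converted to an approximate RGB tint. The sketch below uses a widely circulated curve-fit approximation of black-body colour (the constants are empirical, not from any of the sources in this post), so treat it as an estimate rather than a physically exact mapping:

```python
import math

def kelvin_to_rgb(kelvin):
    """Approximate an sRGB tint (0-255 per channel) for a black-body
    colour temperature in kelvin; usable roughly from 1000 K to 40000 K."""
    t = min(max(kelvin, 1000), 40000) / 100.0
    if t <= 66:
        r = 255.0
        g = 99.4708025861 * math.log(t) - 161.1195681661
    else:
        r = 329.698727446 * ((t - 60) ** -0.1332047592)
        g = 288.1221695283 * ((t - 60) ** -0.0755148492)
    if t >= 66:
        b = 255.0
    elif t <= 19:
        b = 0.0
    else:
        b = 138.5177312231 * math.log(t - 10) - 305.0447927307
    clamp = lambda v: int(min(max(v, 0.0), 255.0))
    return clamp(r), clamp(g), clamp(b)

print(kelvin_to_rgb(1900))   # candle flame: a strongly orange tint
print(kelvin_to_rgb(5600))   # daylight: close to pure white
```

Feeding the resulting tint into a light’s colour channel gives a plausible warm/cold bias for the temperatures listed above.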
Even better than trying to match the correct colour temperature, an architectural 3D artist in search of photorealism should almost certainly make use of IES (Illuminating Engineering Society of North America) lights wherever possible. The IES has created a standard that allows manufacturers to record the photometric characteristics of the lights they make, such as intensity distribution and falloff. These measurements are saved in a plain-text file and made publicly available. 3D software (such as Cinema 4D) can then use these files to replicate a real world light.
Todd Dave has a useful video demonstrating IES lighting properties.
One of the biggest insights I have gained whilst conducting this research concerns the generation of randomness, or chaos. This is something that all sources seem to agree on: a 3D artist in search of realism should note that in real life, there is chaos.
In his tutorial, Dmitry Denisov uses 3D software to place water drops randomly into the scene.
Using software to do this is a very good idea because I believe (albeit without having conducted any research) that it is difficult for the human mind to conceive randomness. For example, if a person were asked to randomly draw some dots on a piece of paper, it’s unlikely that they would produce something that looks like this:
Of course, among the infinite number of ways the dots could have been drawn, a match to my example would be unlikely; although, depending on how you look at it, the opposite is also true: given infinite attempts, an exact match would eventually be found. Anyway, the argument I’m trying to make is that the human mind is fantastic at recognising patterns and is instinctively programmed to seek harmony. I’d guess that most people would try to fill the page with dots that were somewhat evenly spaced and would try to avoid leaving large areas of empty space. In effect, the brain applies a (simple) formula in an attempt to create something random, and the result is something that is far from random.
This is all speculation of course but, as a matter of fact, the computer is most certainly not confined by the same instinctive limitations as the human mind and is thus far more adept at creating randomness (despite requiring a human programmer to explain how to go about it). Without digressing (too much), software-generated randomness is a tool that 3D designers should put to good use wherever possible.
I’ll try not to go off on a tangent this time, but there are situations where the opposite is also true. For example, when creating a brick wall, a computer would use an algorithm to generate bricks that were exactly the same size, arranged in perfectly straight lines and spaced exactly the same distance apart. In this instance, the computer’s formula is far from the chaotic reality, as can be seen in the irregularities of the real brick wall below.
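The usual fix is to keep the regular algorithm but deliberately re-introduce a little chaos. A minimal sketch of the idea (the dimensions and jitter amount are made-up values of mine, not from any source in this post):

```python
import random

random.seed(42)  # fixed seed for reproducibility

BRICK_W, BRICK_H, MORTAR = 20.0, 6.0, 1.0
JITTER = 0.4  # maximum random nudge per brick, in the same units

def brick_positions(rows, cols):
    """Yield (x, y) origins for a running-bond wall, with each brick
    nudged by a small random amount so the courses aren't perfect."""
    for row in range(rows):
        # odd courses are offset by half a brick (running bond)
        row_shift = (BRICK_W + MORTAR) / 2 if row % 2 else 0.0
        for col in range(cols):
            x = col * (BRICK_W + MORTAR) + row_shift
            y = row * (BRICK_H + MORTAR)
            yield (x + random.uniform(-JITTER, JITTER),
                   y + random.uniform(-JITTER, JITTER))

wall = list(brick_positions(rows=4, cols=6))
```

The grid supplies the order a bricklayer aims for; the jitter supplies the imperfection they can’t avoid.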
Rackwitz and Sterner (2007) demonstrate this aspect of photorealism in their research paper by ensuring that the cupboard doors in the following 3D image were not perfectly aligned.
According to Rackwitz and Sterner (p7), when you ask someone in the 3D business what they think will make a picture more realistic, the answer will most often be “irregularity, dirt, grain” or “imperfect makes perfect”.
How black is black?
I’d already mentioned in my studies that the only thing in nature that is truly black is a black hole; everywhere else, albeit in infinitely small quantities, there is always some amount of light present. Similarly, in nature, a pure white does not exist. Whilst writing their research paper, Rackwitz and Sterner attended an internship with IKEA. Whilst there, they found that IKEA sets many specifications for the 3D images produced by the artists that work for them. Among those specifications, the brightest colour used for IKEA imagery should be the hexadecimal colour #F7F7F7 and the nearest colour to black should be #0A0A0A. Hexadecimal is far beyond the scope of this document, but suffice to say: “there are 10 types of people in this world; those who understand binary, and those who don’t”.
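Limits like IKEA’s are easy to enforce as a post-processing step. A minimal sketch in Python (the helper names are mine, not from the paper), clamping each channel into the #0A0A0A–#F7F7F7 range:

```python
def clamp_channel(value, lo=0x0A, hi=0xF7):
    """Clamp one 8-bit channel into the reported IKEA range:
    #0A0A0A (darkest allowed) .. #F7F7F7 (brightest allowed)."""
    return max(lo, min(hi, value))

def clamp_hex(colour):
    """Clamp a '#RRGGBB' colour string into the same range."""
    r, g, b = (int(colour[i:i + 2], 16) for i in (1, 3, 5))
    return "#{:02X}{:02X}{:02X}".format(*(clamp_channel(c) for c in (r, g, b)))

print(clamp_hex("#000000"))  # → #0A0A0A (pure black lifted)
print(clamp_hex("#FFFFFF"))  # → #F7F7F7 (pure white pulled down)
```

Applying this to a render’s darkest shadows and brightest highlights mimics the fact that nature never quite reaches pure black or pure white.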
It’s a little concerning that Rackwitz and Sterner suggest that “Photorealistic rendering is an extremely broad subject. To think, that one thesis could capture all the knowledge needed to explain photorealism, would be plain ignorance”. This suggests that my own ideas for a research paper might need to be re-evaluated; however, Rackwitz and Sterner do offer some alternative subjects for a research paper, such as “Photorealistic lighting for indoor environments” or “Texturing solid materials for photorealism”.
Rackwitz and Sterner conclude their research paper by suggesting something that I’ve come to realise in my own studies; pinpointing the essence of photorealism is very subjective.
Deeper Down the Rabbit Hole
As Rackwitz and Sterner have rightly noted, one research paper couldn’t possibly provide all the answers for producing photorealistic 3D renders. In light of this, I have turned my focus away from research papers and am now concentrating on books. At the time of writing, I have on my desk a book that has me in a state of pure excitement. Just looking through the table of contents makes the hairs on the back of my neck tingle! What follows is but a small excerpt from the table of contents:
- Directional Lights
- Modeling with Light
- The visual function of shadows
- Shadow algorithms
- Lights with negative brightness
- Software without colour balance
- Simulating real life cameras
- Gamma Correction
- Motion blur
- Colour and depth; warm and cool colours
- Diffuse and specular light transmission
- Realistic specularity
- The Fresnel effect
- Decals and dirt
- Transparency and refraction
- Photon mapping
- Particle Effects
I realise that this won’t appeal to all who read this, but it most certainly puts a massive smile on my face. The first page invites the reader to continue with an opening question: “How do you simulate the exposure process of a real camera and the natural side-effects of a cinematographer’s exposure controls?”. I think I’m going to find that the book sitting in my lap is a real gem.
As I work my way through Jeremy Birn’s ‘Digital Lighting and Rendering’ I’ll be reporting my findings on this blog. In the meantime, for those who are ‘champing at the bit’ like I am, here are some other books that may be of interest:
Can’t wait for the postman? Perhaps a Google search for ‘realistic image synthesis’ might also be in order.