As one of the steps necessary to support lakes, dikes and land below sea level in Outerra, I've made the sea level parametrized. For now it can be set globally to values above or below the mean sea level.
Here are some screens of North America for sea levels of 0, 100, 200, 500 and 1000 meters.
The same for Europe with sea levels of 0, 100, 200, 500 and 1000 meters.
With a 500 or 1000 m rise, the world changes beyond recognition.
While a 1000 m rise of the sea level is completely out of proportion and would require some serious biblical sinning, or Earth gradually absorbing a body of water the size of Sedna, it could still serve as a backdrop for a science-fiction game.
For the opposite change: Great Britain with the sea level lowered by 20, 50 and 100 m respectively:
Lastly, a view from the hills around the Los Angeles sea.
Friday, July 29, 2011
White balance
When implementing the fog mentioned in the previous post, I observed a weird thing: the fog wasn't white as I expected, but had a dirty beige tint, making it look a bit like smog. But since the implementation didn't use different absorption and scattering coefficients for the RGB components, and thus shouldn't have modified the color of the sunlight, I thought it was a bug and neglected it until most of the other issues were solved.
But then, after inspecting all the code paths, I came to the only conclusion possible: the computation is right, and the problem must be in the interpretation. So I tried to convince myself that the fog must be white and that the tint actually isn't there. Almost made it, too.
But the machine coldly asserted that the color wasn't white either. It didn't bother with any hints as to why, though.
Apparently the incoming light that was scattering on fog particles was already this color, even though the sun color was not modified in any way, unlike in the previous experiments.
Interpretation?
The thing is that sunlight really does get modified a bit before it arrives at the planet surface. The same thing that is responsible for the blue sky causes this: a small part of the blue light (and a smaller part of the green light too) gets scattered away from the sun ray. What comes down here has a slightly shifted spectrum.
But how come we see the fog white in real life?
Turns out, everything is fake.
The way we perceive colors is a purely subjective interpretation of a part of the electromagnetic spectrum.
And as it is easier for the brain to orient itself in the environment when the sensors don't move, it is also simpler for it to stick with constant properties of objects. Our brain "knows" that a sheet of paper is white, and so it will make it appear white under wildly varying lighting conditions. This becomes apparent when you use a digital camera without adjusting the white balance - the results will be ugly.
So basically that's why we have to implement automatic white balancing, at least until we all have full surround displays and our brains magically adapt by themselves. By the way, playing fullscreen in a dark room with uncorrected colors slowly makes the brain adapt too.
Implementation
Our implementation tries to mimic what the perception actually does. By definition, a white sheet appears white under a wide range of lighting conditions. So we run a quick computation on the GPU, using the existing atmospheric code, that determines what light reflects off a white horizontal surface. The light has two components: direct sunlight, arriving at an angle, whose illuminating power diminishes as the sun recedes from the zenith, and the aggregated light from the sky. Once this compound color is known we could perform the color correction as a post-process, but there's another way - adjusting the color of the sun so that the resulting surface color is white. This has the advantage of not affecting the performance at all, since the sun color already enters the lighting equation.
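To make the idea concrete, here is a minimal C++ sketch of just that balancing step. It is an illustration only: the actual evaluation happens in the GPU atmospheric code, and the input colors, the zenith cosine and the choice of the neutral target below are made-up placeholders.

```cpp
// Minimal sketch of the white-balance step only. The real computation runs
// on the GPU with the atmospheric code; 'sun', 'sky' and 'cos_zenith' are
// placeholder inputs, not engine values.
#include <algorithm>
#include <cstdio>

struct Color { float r, g, b; };

// Scale the sun color so that a horizontal white sheet lit by sun + sky
// would reflect a neutral (gray) color.
Color balance_sun_color(Color sun, Color sky, float cos_zenith)
{
    // Compound light reflected off the white surface: angled sunlight plus
    // the aggregated sky light.
    Color c = { sun.r * cos_zenith + sky.r,
                sun.g * cos_zenith + sky.g,
                sun.b * cos_zenith + sky.b };
    // Neutral target intensity (here simply the largest component).
    float target = std::max(c.r, std::max(c.g, c.b));
    // Per-channel correction that makes the compound color neutral, applied
    // to the sun color. The sky light is itself scattered sunlight, so
    // rescaling the sun rebalances the compound result, and since the sun
    // color is already part of the lighting equation this costs nothing.
    return { sun.r * target / c.r,
             sun.g * target / c.g,
             sun.b * target / c.b };
}

int main()
{
    Color sun = { 1.00f, 0.92f, 0.82f };  // slightly brownish transmitted sunlight (made-up)
    Color sky = { 0.06f, 0.08f, 0.12f };  // bluish sky contribution (made-up)
    Color s = balance_sun_color(sun, sky, 0.7f);
    std::printf("balanced sun color: %.3f %.3f %.3f\n", s.r, s.g, s.b);
}
```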
While this algorithm doesn't mimic human perception precisely - the actual process is more complex and depends on other factors - it seems to be pretty satisfactory, though I expect further tuning.
Some of its properties: it extends the period of the day that appears to have "normal" lighting, and it removes the unnatural greenish tint from the sky.
During the day it compensates for the brownish light color by making blue things bluer. Can't say the old colors were entirely bad, though.
[Screenshot: So long, and thanks for all the fish]
Friday, February 18, 2011
Ocean Rendering
Let me first say that I often visit my own blog to read how I did certain things. This is mostly true for some of the older, more technical posts. I decided to blog about the recent water rendering development in a way that will be helpful to me at the time when my brain niftily sends all the crucial bits to the desert. I apologize in advance if some pieces seem incoherent.
Now for the rendering of water in Outerra.
There are two types of waves mixed: open sea waves traveling in the direction of the wind (fixed for now), and shore waves (the surf) that orient themselves perpendicular to the shore, appearing as the result of the oscillating water volume getting compressed by the rising underwater terrain.
Open sea waves are simulated in the usual way, by summing a bunch of trochoidal (Gerstner) waves with various frequencies over a 2D texture that is then tiled over the sea surface. Obviously, the texture should be seamlessly tileable, and that puts some constraints on the possible wave frequencies. Basically, the wave should peak at the grid points of the tiling, which can be satisfied by guaranteeing that the wave has an integral number of peaks in both the u and v texture directions. For a tile of world-space size L and integer peak counts i, j, the resulting wave frequency is then √(i² + j²)/L.
Other wave parameters depend on the frequency (or its reciprocal, the wavelength). Generally, the wave amplitude should be kept below 1/20th of the wavelength, as larger waves would break.
The speed of deep-water waves can be computed from the wavelength λ as c = √(g·λ/2π), roughly 1.25·√λ m/s with λ in meters.
The direction of the waves can be controlled by manipulating the amplitudes of the generated waves; for example, the directions that lie closer to the wind direction can get larger amplitudes than the ones going against it. The opposing wave directions can even be suppressed completely, which may be usable e.g. for rivers.
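Below is a hedged C++ sketch of such a wave-set generation - integer peak counts for tileability, amplitudes capped at 1/20th of the wavelength, the deep-water speed formula, and wind-aligned weighting with opposing directions suppressed. It is an illustration under those assumptions, not the engine's code, and all names and constants are made up.

```cpp
// Illustrative sketch only: generate a set of Gerstner waves that tile
// seamlessly over a square texture of world-space size 'tile_size'.
#include <cmath>
#include <cstdio>
#include <vector>

struct GerstnerWave {
    float dir_x, dir_y;  // unit wave direction
    float k;             // wavenumber = 2*pi / wavelength
    float amplitude;     // kept below wavelength / 20
    float speed;         // deep-water phase speed sqrt(g * lambda / (2*pi))
};

std::vector<GerstnerWave> make_tileable_waves(float tile_size, int max_peaks,
                                              float wind_x, float wind_y)
{
    const float g = 9.81f, two_pi = 6.2831853f;
    std::vector<GerstnerWave> waves;
    // Integer peak counts (i, j) across the tile guarantee that every wave
    // repeats exactly at the tile borders, so the texture tiles seamlessly.
    for (int i = -max_peaks; i <= max_peaks; ++i)
        for (int j = -max_peaks; j <= max_peaks; ++j) {
            if (i == 0 && j == 0) continue;         // skip the zero wave vector
            float kx = two_pi * i / tile_size;
            float ky = two_pi * j / tile_size;
            float k  = std::sqrt(kx*kx + ky*ky);
            float wavelength = two_pi / k;
            // Weight by alignment with the wind; waves running against the
            // wind are suppressed completely here.
            float align = (kx*wind_x + ky*wind_y) / k;
            if (align <= 0.0f) continue;
            waves.push_back({ kx/k, ky/k, k, align * wavelength / 20.0f,
                              std::sqrt(g * wavelength / two_pi) });
        }
    return waves;
}

int main()
{
    // 1 km tile, up to 8 peaks per axis, wind blowing along +x.
    auto waves = make_tileable_waves(1000.0f, 8, 1.0f, 0.0f);
    std::printf("generated %zu tileable waves\n", waves.size());
}
```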
Shore waves form as the terrain rises and the water slows down, while the wave amplitude grows. These waves tend to be perpendicular to the shore lines.
In order to make the beach waves we need to know the distance from a particular point in the water to the shore. Additionally, a direction vector is needed to animate the foam.
The distance from the shore is used as an argument to the wave shape function, stored in a texture. The shape is again trochoidal, but to simulate a breaking wave the equation has been extended to a skewed trochoidal wave by adding another parameter, γ, that determines the skew; skew γ=1 gives a normal Gerstner wave.
Several differently skewed waves are precomputed in a small helper texture, and the algorithm chooses the right one depending on water depth.
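The exact skew formulation isn't reproduced here, so the C++ sketch below substitutes an assumed, illustrative skew - a forward shear proportional to elevation, chosen so that γ=1 degenerates to a plain trochoid - just to show how several skewed profiles could be precomputed into the rows of a small helper texture.

```cpp
// Illustrative only: precompute rows of a helper texture with differently
// skewed trochoid profiles. The skew used here is an assumption for the
// example, not the formulation from the post; gamma = 1 yields the ordinary
// symmetric trochoid.
#include <cmath>
#include <cstdio>
#include <vector>

struct WaveSample { float height; float offset; };  // vertical / horizontal displacement

// One wave period sampled uniformly in phase [0, 2*pi); q is the trochoid
// steepness (horizontal displacement amplitude, q < 1).
std::vector<WaveSample> make_skewed_trochoid(int samples, float q, float gamma)
{
    const float two_pi = 6.2831853f;
    std::vector<WaveSample> row(samples);
    for (int i = 0; i < samples; ++i) {
        float a = two_pi * i / samples;                    // phase parameter
        float h = std::cos(a);                             // crest at a = 0
        float shear = (gamma - 1.0f) * 0.5f * (h + 1.0f);  // 0 at trough, max at crest
        row[i] = { h, q * std::sin(a) + shear };           // lean the crest forward
    }
    return row;
}

int main()
{
    // Several differently skewed profiles -> rows of the helper texture;
    // the shader then picks a row based on the local water depth.
    for (float gamma : { 1.0f, 1.5f, 2.0f, 3.0f }) {
        auto row = make_skewed_trochoid(64, 0.6f, gamma);
        std::printf("gamma %.1f: crest offset %+.2f\n", gamma, row[0].offset);
    }
}
```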
The distance map is computed for terrain tiles that contain a shore, i.e. those with the maximum elevation above sea level and the minimum elevation below it. A shader finds the nearest point of the opposite type (above or below sea level) and outputs the distance. The resulting distance map is then filtered to smooth it out.
Gradient vectors are computed by applying a Sobel filter to the distance map.
[Figure: Gradient field created from Gaussian filtered distance map]
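As a rough sketch of those two steps, here is a brute-force CPU version for clarity (the engine does this in a shader and smooths the result); the search radius, the signed-distance choice and the grid layout are assumptions made for the example.

```cpp
// Rough CPU sketch of the shore distance map and its Sobel gradients for one
// terrain tile. 'height' holds elevations relative to sea level on an N x N grid.
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <vector>

struct Vec2 { float x, y; };

// For each texel, the distance to the nearest texel on the other side of sea
// level (made negative on the water side so the gradient points consistently).
std::vector<float> shore_distance(const std::vector<float>& height, int N, int radius)
{
    std::vector<float> dist(N * N);
    for (int y = 0; y < N; ++y)
        for (int x = 0; x < N; ++x) {
            bool under = height[y*N + x] < 0.0f;
            float best = float(radius);
            for (int dy = -radius; dy <= radius; ++dy)
                for (int dx = -radius; dx <= radius; ++dx) {
                    int sx = x + dx, sy = y + dy;
                    if (sx < 0 || sy < 0 || sx >= N || sy >= N) continue;
                    if ((height[sy*N + sx] < 0.0f) != under)
                        best = std::min(best, std::sqrt(float(dx*dx + dy*dy)));
                }
            dist[y*N + x] = under ? -best : best;
        }
    return dist;   // the engine then smooths this map before taking gradients
}

// Gradient of the (smoothed) distance map via a 3x3 Sobel filter.
Vec2 sobel_gradient(const std::vector<float>& d, int N, int x, int y)
{
    auto at = [&](int sx, int sy) {
        return d[std::clamp(sy, 0, N-1)*N + std::clamp(sx, 0, N-1)];
    };
    float gx = (at(x+1,y-1) + 2*at(x+1,y) + at(x+1,y+1))
             - (at(x-1,y-1) + 2*at(x-1,y) + at(x-1,y+1));
    float gy = (at(x-1,y+1) + 2*at(x,y+1) + at(x+1,y+1))
             - (at(x-1,y-1) + 2*at(x,y-1) + at(x+1,y-1));
    return { gx, gy };
}

int main()
{
    const int N = 16;
    std::vector<float> h(N * N);
    for (int y = 0; y < N; ++y)
        for (int x = 0; x < N; ++x)
            h[y*N + x] = float(x - N/2);      // a straight shoreline at x = N/2
    std::vector<float> d = shore_distance(h, N, 8);
    Vec2 g = sobel_gradient(d, N, N/4, N/2);
    std::printf("distance %.1f, gradient (%.1f, %.1f)\n", d[(N/2)*N + N/4], g.x, g.y);
}
```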
Both wave types are then added together. The beach waves are additionally conditioned by another texture with a mask that changes over time, so that they aren't continual all around the shore.
Water color is determined by several indirect parameters, most importantly by the absorption of the color components under water. For most of the screenshots shown here it was set to 7/30/70 m for the R/G/B components respectively; these values specify the distances at which the respective light components get reduced to approximately one third of their original value.
[Left: red 7 m, green 30 m, blue 70 m, scattering coefficient 0.005 | Right: red 70 m, green 30 m, blue 7 m]
Another parameter is a reflectivity coefficient that tells how much light is scattered towards the viewer. Interestingly, the scattering effect in pure water is negligible in comparison with the effect of light absorption; the main contributor to the observed scattering is dissolved organic matter, followed by inorganic compounds. This is also what gives different seas their slightly different colors.
[Left: scattering coefficient 0.000 | Right: scattering coefficient 0.020]
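The mapping from those absorption distances to a color can be sketched with plain Beer-Lambert attenuation plus a single in-scattering term along the view path. This is only an illustration using the example values above, not the shading the engine actually uses, and the background and light colors are made up.

```cpp
// Sketch only: per-channel extinction from the absorption distances above
// (a component drops to ~1/e, roughly one third, after that distance, so the
// extinction coefficient is taken as its reciprocal), combined with light
// scattered back toward the viewer along the path.
#include <cmath>
#include <cstdio>

struct Color { float r, g, b; };

const Color kAbsorbDist = { 7.0f, 30.0f, 70.0f };   // meters for R / G / B
const float kScatter    = 0.005f;                   // scattering coefficient

static float channel(float background, float light, float absorb_dist, float path)
{
    float sigma = 1.0f / absorb_dist;               // extinction coefficient [1/m]
    float t = std::exp(-sigma * path);              // transmittance of the water column
    // Attenuated background plus in-scattered light, which saturates with distance.
    return background * t + light * kScatter / sigma * (1.0f - t);
}

// Color seen through 'path' meters of water toward 'background' (bottom or
// deep-water color), lit by 'light'.
Color water_color(Color background, Color light, float path)
{
    return { channel(background.r, light.r, kAbsorbDist.r, path),
             channel(background.g, light.g, kAbsorbDist.g, path),
             channel(background.b, light.b, kAbsorbDist.b, path) };
}

int main()
{
    Color bottom = { 0.30f, 0.25f, 0.20f }, sun = { 1.0f, 1.0f, 1.0f };
    for (float d : { 1.0f, 5.0f, 20.0f, 100.0f }) {
        Color c = water_color(bottom, sun, d);
        std::printf("%6.1f m: %.3f %.3f %.3f\n", d, c.r, c.g, c.b);
    }
}
```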
Here's a short video showing it all in motion.
An earlier video that was posted on the forums with underwater scenes:
TODO
Water rendering is not yet finished; this should be considered a first version. Here's a list of things that will be enhanced:
- Better effect for wave breaking. This will probably require additional geometry; maybe a tessellation shader could be used for that.
- Animated foam
- Enhanced wave spectrum - currently the spectrum is flat, which doesn't correspond to reality. Wave frequencies could even be generated adaptively, reflecting the detail needed for the viewer.
- Fixing various errors - underwater lighting, waves against the horizon, lighting of objects on and under the water, LOD level switching ...
- Support for other types of wave breaking
- Integrating climate type support into the engine, which will allow different sea parameters across the world
- UI for setting water parameters
- Reflect the waves in physics for boats
A few of the ocean sunset and underwater screenshots that were posted on the forums during development.