Real-time Atmospheric Effects in Games Revisited
Carsten Wenzel

The deal
- Follow-up to a talk I gave at SIGGRAPH 2006
- Covers material presented at the time plus recent additions and improvements

Overview
- Introduction
- Scene depth based rendering
- Atmospheric effects breakdown
  - Sky light rendering
  - Fog approaches
  - Soft particles
  - Cloud rendering (updated/new)
  - Volumetric lightning approximation
  - River and ocean rendering (updated/new)
- Scene depth based rendering and MSAA (new)
- Conclusions

Introduction
- Atmospheric effects are important cues of realism (especially outdoors)
- Why?
  - Create a sense of depth
  - Increase the level of immersion

Motivation
- Atmospheric effects are mathematically complex (so far usually coarsely approximated, if at all)
- Programmability and power of today's GPUs allow implementation of sophisticated models
- How can these be mapped efficiently?

Related Work
- Deferred Shading (Hargreaves 2004)
- Atmospheric Scattering (Nishita et al. 1993)
- Cloud Rendering (Wang 2003)
- Real-time Atmospheric Effects in Games (Wenzel 2006)

Scene Depth Based Rendering: Motivation
- Many atmospheric effects require access to scene depth
- Similar to Deferred Shading [Hargreaves04]
- Mixes well with traditional-style rendering
  - Deferred shading is not a must
  - Think of it as writing a pixel shader with scene depth available
- Requires laying out scene depth first and making it available to subsequent rendering passes

Scene Depth Based Rendering: Benefits
- Decouple rendering of opaque scene geometry from application of other effects
  - Atmospheric effects
  - Post-processing
  - More
- Apply complex models while keeping the shading cost moderate
  - Features are implemented in separate shaders
  - Helps avoid hardware shader limits (can support older HW)

Scene Depth Based Rendering: Challenges
- Alpha-transparent objects
  - Only one color / depth value stored
  - However, per-pixel overdraw due to alpha-transparent objects is potentially unbounded
  - Workarounds for specific effects needed (mentioned later)

Scene Depth Based Rendering: API and Hardware Challenges
- Usually cannot directly bind the Z-buffer and reverse-map it
- Write linear eye-space depth to a texture instead
- Float format vs. RGBA8
- Supporting multi-sample anti-aliasing is tricky (more on that later)

Recovering World Space Position from Depth
- Many deferred shading implementations transform a pixel's homogeneous clip-space coordinate back into world space
  - 3 dp4 or mul/mad instructions
- There's often a simpler / cheaper way
  - For full-screen effects, have the distance from the camera's position to its four corner points at the far clipping plane interpolated
  - Scale the pixel's normalized linear eye-space depth by the interpolated distance and add the camera position (one mad instruction)
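
The cheaper reconstruction can be sketched as follows. This is an illustrative CPU-side sketch, not the original shader; the function and parameter names are made up:

```python
def world_pos_from_depth(cam_pos, far_corners, uv, lin_depth):
    """Reconstruct a pixel's world-space position from normalized linear
    eye-space depth (hypothetical helper; names are not from the talk).

    far_corners -- vectors from the camera to the four far-plane corners,
                   ordered [top-left, top-right, bottom-left, bottom-right]
    uv          -- pixel's normalized screen coordinate in [0, 1]^2
    lin_depth   -- linear eye-space depth in [0, 1] (1 = far plane)
    """
    def lerp(a, b, t):
        return tuple(x + (y - x) * t for x, y in zip(a, b))
    tl, tr, bl, br = far_corners
    # Bilinear interpolation emulates what the rasterizer does across a
    # full-screen quad with the corner vectors set up as interpolators.
    ray = lerp(lerp(tl, tr, uv[0]), lerp(bl, br, uv[0]), uv[1])
    # One mad per component: camera position + depth * interpolated ray.
    return tuple(o + r * lin_depth for o, r in zip(cam_pos, ray))
```

In the shader this collapses to a single mad, since the interpolated ray arrives for free from the vertex stage.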

Sky Light Rendering
- Mixed CPU / GPU implementation of [Nishita93]
- Goal: best possible quality at reasonable runtime cost
  - Trading in flexibility of camera movement
- Assumptions and constraints:
  - Camera is always on the ground
  - Sky is infinitely far away around the camera
- Win: sky update is view-independent, updates only over time

Sky Light Rendering: CPU
- Solve the Mie / Rayleigh in-scattering integral
- For 128x64 sample points on the sky hemisphere solve

    I_v(λ) = I_s(λ) · K(λ) · F(θ, g) · ∫_{P_a}^{P_b} e^{−h/H_0} · e^{−(t(PP_c, λ) + t(PP_a, λ))} ds    (1)

  using the current time of day, sunlight direction, and Mie / Rayleigh scattering coefficients
- Store the result in a floating-point texture
- Distribute the computation over several frames
  - Each update takes several seconds to compute

Sky Light Rendering: GPU
- Map the float texture onto the sky dome
- Problem: low-res texture produces blocky results even when filtered
  - Solution: move application of the phase function (F(θ, g) in Eq. 1) to the GPU
  - High-frequency details (sun spot) are now computed per pixel
- SM3.0/4.0 could solve Eq. 1 via pixel shader and render-to-texture
  - Integral is a loop of 200 asm instructions iterating 32 times
  - Final execution: 6400 instructions to compute in-scattering for each sample point on the sky hemisphere
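
The talk does not spell out the exact form of F(θ, g); the Henyey-Greenstein phase function is a common choice for the Mie term, so a hedged sketch of the per-pixel evaluation might look like this:

```python
import math

def henyey_greenstein(cos_theta, g):
    """Henyey-Greenstein phase function, one common choice for F(theta, g)
    (illustrative; the talk does not specify the exact form used).

    cos_theta -- cosine of the angle between view and sun direction
    g         -- anisotropy in (-1, 1): g > 0 favors forward scattering,
                 which produces the bright sun spot per pixel
    """
    denom = (1.0 + g * g - 2.0 * g * cos_theta) ** 1.5
    return (1.0 - g * g) / (4.0 * math.pi * denom)
```

Because this is cheap per pixel, the low-res in-scattering texture only needs to store the smooth part of Eq. 1, and the high-frequency sun spot stays sharp.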

Global Volumetric Fog
- Nishita's model is still too expensive to model fog / aerial perspective
- Want to provide an atmosphere model
  - To apply its effects to arbitrary objects in the scene
- Developed a simpler method to compute height/distance-based fog with exponential fall-off

Global Volumetric Fog

  f((x, y, z)^T) = b · e^{−c·z}

  v(t) = o + t·d

  F(v) = ∫_0^1 f(v(t)) · ‖d‖ dt
       = ∫_0^1 b · e^{−c·(o_z + t·d_z)} · √(d_x² + d_y² + d_z²) dt
       = b · e^{−c·o_z} · ‖d‖ · (1 − e^{−c·d_z}) / (c·d_z)    (2)

  f – fog density distribution; b – global density; c – height fall-off
  v – view ray from camera (o) to target pos (o + d), t ∈ [0, 1]
  F – fog amount along v

Global Volumetric Fog: Shader Implementation

Eq. 2 translated into HLSL:

  float ComputeVolumetricFog( in float3 cameraToWorldPos )
  {
      // NOTE: cVolFogHeightDensityAtViewer = exp( -cHeightFalloff * cViewPos.z );
      float fogInt = length( cameraToWorldPos ) * cVolFogHeightDensityAtViewer;
      const float cSlopeThreshold = 0.01;
      if( abs( cameraToWorldPos.z ) > cSlopeThreshold )
      {
          float t = cHeightFalloff * cameraToWorldPos.z;
          fogInt *= ( 1.0 - exp( -t ) ) / t;
      }
      return exp( -cGlobalDensity * fogInt );
  }
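
As a sanity check (an illustrative sketch, not from the talk), the closed form of Eq. 2, including the shader's slope-threshold branch for near-horizontal rays, can be compared against brute-force numerical integration:

```python
import math

def fog_amount_closed(o, d, b, c):
    """Closed form of Eq. 2:
    F = b * exp(-c*o_z) * |d| * (1 - exp(-c*d_z)) / (c*d_z)."""
    dist = math.sqrt(d[0]**2 + d[1]**2 + d[2]**2)
    if abs(d[2]) > 1e-2:                     # mirrors the shader's slope threshold
        t = c * d[2]
        slope = (1.0 - math.exp(-t)) / t
    else:
        slope = 1.0                          # limit of (1 - e^-t)/t as t -> 0
    return b * math.exp(-c * o[2]) * dist * slope

def fog_amount_numeric(o, d, b, c, n=20000):
    """Brute force: integrate f(v(t)) = b*exp(-c*z) along v(t) = o + t*d."""
    dist = math.sqrt(d[0]**2 + d[1]**2 + d[2]**2)
    acc = 0.0
    for i in range(n):
        t = (i + 0.5) / n
        acc += b * math.exp(-c * (o[2] + t * d[2]))
    return acc * dist / n
```

The final fog factor, as in the HLSL above, is then exp(-globalDensity * F).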

Combining Sky Light and Fog
- Sky is rendered along with the scene geometry
- To apply fog:
  - Draw a full-screen quad
  - Reconstruct each pixel's world-space position
  - Pass the position to the volumetric fog formula to retrieve fog density along the view ray
- What about fog color?

Combining Sky Light and Fog
- Fog color
  - Average in-scattering samples along the horizon while building the texture
  - Combine with the per-pixel result of the phase function to yield an approximate fog color
- Use fog color and density to blend against the back buffer

Combining Sky Light and Fog: Results

Fog Volumes
- Fog volumes via ray-tracing in the shader
- Currently two primitives supported: box, ellipsoid
- Generalized form of Global Volumetric Fog
  - Exhibits the same properties (additionally, the direction of height is no longer restricted to the world-space up vector, and the gradient can be shifted along the height dir)
- Ray-trace in object space: unit box, unit sphere
  - Transform results back to solve the fog integral
- Render bounding hull geometry
  - Front faces if outside, otherwise back faces
- For each pixel
  - Determine the start and end point of the view ray to plug into Eq. 2

Fog Volumes
- Start point
  - Either the camera pos (if the viewer is inside) or the ray's entry point into the fog volume (if the viewer is outside)
- End point
  - Either the ray's exit point out of the fog volume or the world-space position of the pixel, depending on which of the two is closer to the camera
- Render fog volumes back to front
- Solve the fog integral and blend with the back buffer
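
For the ellipsoid primitive (a unit sphere in object space), the start/end selection above can be sketched like this. This is an illustrative sketch under stated assumptions, not the original shader; the function name is made up:

```python
import math

def fog_ray_span(o, d, pixel_t):
    """Overlap of the view ray v(t) = o + t*d, t in [0, 1], with a
    unit-sphere fog volume in object space (hypothetical helper).

    pixel_t -- ray parameter of the scene pixel (from the depth buffer)
    Returns (start, end) parameters to plug into Eq. 2, or None.
    """
    # Solve |o + t*d|^2 = 1 for the sphere entry/exit parameters.
    a = sum(x * x for x in d)
    b = 2.0 * sum(x * y for x, y in zip(o, d))
    c = sum(x * x for x in o) - 1.0
    disc = b * b - 4.0 * a * c
    if disc < 0.0:
        return None                           # ray misses the volume entirely
    s = math.sqrt(disc)
    t_in, t_out = (-b - s) / (2.0 * a), (-b + s) / (2.0 * a)
    # Start: camera pos if inside (t_in < 0), else the entry point.
    start = max(t_in, 0.0)
    # End: volume exit or scene pixel, whichever is closer to the camera.
    end = min(t_out, pixel_t)
    return (start, end) if end > start else None
```

The box primitive works the same way with a slab test instead of the quadratic.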

Fog Volumes
Rendering of fog volumes: box (top left/right), ellipsoid (bottom left/right)

Fog and Alpha-Transparent Objects
- Shading of the actual object and application of the atmospheric effect can no longer be decoupled
  - Need to solve both and combine the results in the same pass
- Global Volumetric Fog
  - Approximate per vertex
  - Computation is purely math-op based (no lookup textures required)
  - Maps well to older HW
    - Shader Model 2.x
    - Shader Model 3.0 for performance reasons / due to lack of vertex texture fetch (IHV specific)

Fog and Alpha-Transparent Objects
- Fog Volumes
  - Approximate per object, computed on the CPU
  - Sounds awful, but it's possible when designers know the limitation and how to work around it
    - Alpha-transparent objects shouldn't become too big; the fog gradient should be rather soft
  - Compute the weighted contribution by processing all affecting fog volumes back to front w.r.t. the camera

Soft Particles
- Simple idea
  - Instead of rendering a particle as a regular billboard, treat it as a camera-aligned volume
  - Use per-pixel depth to compute the view ray's travel distance through the volume and use the result to fade out the particle
  - Hides jaggies at intersections with other geometry
- Some recent publications use a similar idea and treat particles as spherical volumes
  - We found a volume box to be sufficient (saves shader instructions; important as particles are fill-rate hungry)
- GS can set up interpolators so point sprites are finally feasible
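
The fade computation for the box-volume variant can be sketched as follows (an illustrative sketch; parameter names are made up, and the original shader may normalize differently):

```python
def soft_particle_fade(particle_depth, scene_depth, half_thickness):
    """Fade factor for a camera-aligned box volume ('soft particle').

    particle_depth -- eye-space depth of the billboard plane
    scene_depth    -- per-pixel scene depth behind the particle
    half_thickness -- half the extent of the volume along the view axis
    """
    near = particle_depth - half_thickness    # front face of the volume
    far = particle_depth + half_thickness     # back face of the volume
    # Distance the view ray travels inside the volume before hitting geometry.
    travel = min(scene_depth, far) - near
    # Normalize to [0, 1]: 0 where geometry cuts the volume off right at
    # its front face, 1 where the full thickness is traversed.
    return max(0.0, min(1.0, travel / (2.0 * half_thickness)))
```

Multiplying the particle's alpha by this factor makes it fade out smoothly as it approaches opaque geometry instead of producing a hard intersection edge.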

Soft Particles: Results
Comparison shots of particle rendering with soft particles disabled (left) and enabled (right)

Cloud Rendering Using Per-Pixel Depth
- Follow an approach similar to [Wang03], gradient-based lighting
- Use scene depth for soft clipping (e.g. rain clouds around mountains) – similar to soft particles
- Added rim lighting based on cloud density

Cloud Shadows
- Cloud shadows are cast in a single full-screen pass
- Use depth to transform the pixel position into shadow map space

Distance Clouds
- Dynamic sky and pre-baked sky box clouds don't mix well
- Real 3D cloud impostors can be expensive and are often not needed
- Limited to 2D planes above the camera, clouds can be rendered with volumetric properties
- Sample a 2D texture (cloud density) along the view dir
  - For each sample point, sample along the direction to the sun
  - Adjust the number of samples along both directions to fit into shader limits, save fill-rate, etc.
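
The nested sampling scheme can be sketched as follows (an illustrative sketch of the loop structure only; the function name, step encoding, and sample counts are assumptions, not the talk's actual values):

```python
def cloud_march(density_tex, start_uv, step_uv, sun_step_uv, n_view=8, n_sun=4):
    """March a 2D cloud-density texture along the view direction; at each
    view sample, also march toward the sun (hypothetical helper).

    density_tex -- callable (u, v) -> density
    step_uv     -- per-sample 2D offset along the view direction
    sun_step_uv -- per-sample 2D offset toward the sun
    Returns (view_density, sun_density) accumulations, to be turned into
    attenuation later, e.g. via exp(-k * density).
    """
    view_acc, sun_acc = 0.0, 0.0
    for i in range(n_view):
        u = start_uv[0] + step_uv[0] * i
        v = start_uv[1] + step_uv[1] * i
        view_acc += density_tex(u, v)
        # Secondary march from this sample point toward the sun.
        for j in range(1, n_sun + 1):
            sun_acc += density_tex(u + sun_step_uv[0] * j,
                                   v + sun_step_uv[1] * j)
    return view_acc, sun_acc
```

Keeping n_view * n_sun small is what lets this fit into shader instruction limits and fill-rate budgets, as the slide notes.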

Distance Clouds
- Use the accumulated density to calc attenuation and factor in the current sun / sky light

Distance clouds at different times of day

Volumetric Lightning Using Per-Pixel Depth
- Similar to Global Volumetric Fog
  - Light is emitted from a point, falling off radially
- Need to carefully select an attenuation function that can be integrated in closed form
- Can apply this lighting model just like global volumetric fog
  - Render a full-screen pass

Volumetric Lightning Model

  f((x, y, z)^T) = i / (1 + a · ‖l − (x, y, z)^T‖²)

  v(t) = o + t·d

  F(v) = ∫_0^1 f(v(t)) · ‖d‖ dt
       = (2i · √(d_x² + d_y² + d_z²) / √(4uw − v²))
         · (arctan((2u + v) / √(4uw − v²)) − arctan(v / √(4uw − v²)))    (3)

  where u·t² + v·t + w = 1 + a · ‖l − (o + t·d)‖², i.e.
  u = a·‖d‖², v = −2a · d·(l − o), w = 1 + a·‖l − o‖²

  f – light attenuation function; i – source light intensity
  l – lightning source pos; a – global attenuation control value
  v – view ray from camera (o) to target pos (o + d), t ∈ [0, 1]
  F – amount of light gathered along v
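
The closed form in Eq. 3 can be verified against brute-force integration of the attenuation function (an illustrative check, not part of the talk; names are made up):

```python
import math

def lightning_closed(o, d, l, i_src, a):
    """Closed form of Eq. 3:
    F = int_0^1 i / (1 + a*|l - v(t)|^2) * |d| dt with v(t) = o + t*d."""
    dist2 = d[0]**2 + d[1]**2 + d[2]**2
    lo = tuple(x - y for x, y in zip(l, o))
    # Quadratic u*t^2 + v*t + w = 1 + a*|l - o - t*d|^2.
    u = a * dist2
    v = -2.0 * a * sum(x * y for x, y in zip(d, lo))
    w = 1.0 + a * sum(x * x for x in lo)
    disc = math.sqrt(4.0 * u * w - v * v)    # always > 0 by Cauchy-Schwarz
    return (2.0 * i_src * math.sqrt(dist2) / disc) * (
        math.atan((2.0 * u + v) / disc) - math.atan(v / disc))

def lightning_numeric(o, d, l, i_src, a, n=20000):
    """Brute-force midpoint integration of the same integrand."""
    dist = math.sqrt(d[0]**2 + d[1]**2 + d[2]**2)
    acc = 0.0
    for k in range(n):
        t = (k + 0.5) / n
        p = tuple(x - (y + t * z) for x, y, z in zip(l, o, d))
        acc += i_src / (1.0 + a * sum(x * x for x in p))
    return acc * dist / n
```

This is why the 1/(1 + a·r²) fall-off is chosen over, say, a Gaussian: the line integral reduces to two arctan evaluations per pixel.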

Volumetric Lightning Using Per-Pixel Depth: Results

River shading
- Rivers (and water areas in general)
- Special fog volume type: plane
- Underwater fog rendered as described earlier (using a simpler uniform-density fog model though)
- Shader for the water surface enhanced to softly blend out at the riverside (difference between the pixel depth of the water surface and the previously stored scene depth)

River shading: Results
River shading – screens taken from a hidden section of the E3 2006 demo

Ocean shading
- Very similar to river shading, however...
- The underwater part uses a more complex model for light attenuation and in-scattering
- Assume a horizontal water plane, uniform density distribution, and light always falling in top-down
- Can be described as follows

Ocean shading

  f((x, y, z)^T, t) = e^{−c·(s − z)} · e^{−c·t} = e^{−c·(s − z + t)}

  v(t) = o + t·d  (d – normalized view dir),  p = o + l·d,  l = ‖p − o‖

  attenuation = f(p, l) = e^{−c·(s − p_z + l)}

  inscatter = ∫_0^l f(v(t), t) dt
            = e^{−c·(s − o_z)} · ∫_0^l e^{(d_z − 1)·c·t} dt
            = e^{−c·(s − o_z)} · (e^{(d_z − 1)·c·l} − 1) / ((d_z − 1)·c)    (4)

  finalCol = attenuation · sceneCol + inscatter · lightCol

  f – light attenuation for a point at height z seen through travel distance t
  c – uniform water density; s – water surface height
  v – view ray from camera (o) along d to target pos p at distance l
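
As with the other integrals, the closed form in Eq. 4 can be checked numerically (an illustrative sketch under the slide's assumptions: horizontal surface at height s, uniform density c, top-down light, unit view direction with d_z ≠ 1):

```python
import math

def ocean_inscatter_closed(o_z, d_z, c, s, dist):
    """Closed form from Eq. 4:
    inscatter = exp(-c*(s - o_z)) * (exp((d_z - 1)*c*l) - 1) / ((d_z - 1)*c)."""
    k = (d_z - 1.0) * c
    return math.exp(-c * (s - o_z)) * (math.exp(k * dist) - 1.0) / k

def ocean_inscatter_numeric(o_z, d_z, c, s, dist, n=20000):
    """Brute force: at travel distance t the sample sits at height
    z(t) = o_z + t*d_z; light is attenuated by its depth below the
    surface (s - z) and again along the travel distance t."""
    acc = 0.0
    for i in range(n):
        t = (i + 0.5) / n * dist
        acc += math.exp(-c * (s - (o_z + t * d_z))) * math.exp(-c * t)
    return acc * dist / n
```

The attenuation term of Eq. 4 needs no integral at all; only the in-scattering along the ray requires the closed form above.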

Ocean shading: Results
Underwater view: from the ground up (1st row), from underneath the surface down (2nd row). The same lighting settings apply. Higher density in the right column.

Scene depth based rendering and MSAA
- Several problems
  - Cannot bind a multi-sampled RT as a texture
  - Shading is per pixel and not per sample
  - Need to resolve the depth RT, which produces wrong values at silhouettes → potentially causes outlines in later shading steps
- Two problems we ran into
  - Fog
  - River / Ocean

Scene depth based rendering and MSAA: Fog
- Fog color doesn't change drastically for neighboring pixels, while density does
- Have fog density computed while laying out depth (two-channel RT)
- During the volumetric fog full-screen pass, only compute the fog color and read the density from the resolved RT
- Averaging density during resolve works reasonably well compared to depth

Scene depth based rendering and MSAA: River / Ocean
- Shader assumes dest depth ≥ plane depth (otherwise the pixel would have been rejected by the z-test)
- With a resolved depth RT this cannot be guaranteed (depends on the pixel coverage of object silhouettes)
- Need to enforce the original assumption by finding the max depth of the current pixel and all neighbors (direct neighbors suffice)
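
The neighborhood-max fix can be sketched as follows (an illustrative sketch using a 3x3 neighborhood; the function name is made up and the talk does not specify the exact kernel):

```python
def enforce_max_depth(depth, x, y):
    """Replace a pixel's resolved depth with the max over itself and its
    neighbors, restoring the 'dest depth >= plane depth' assumption that
    MSAA depth resolves break at silhouettes.

    depth -- 2D list of resolved depth values, indexed depth[y][x]
    Edge pixels are handled by clamping the neighbor coordinates.
    """
    h, w = len(depth), len(depth[0])
    best = depth[y][x]
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            nx = min(max(x + dx, 0), w - 1)
            ny = min(max(y + dy, 0), h - 1)
            best = max(best, depth[ny][nx])
    return best
```

On the GPU this is a few extra texture fetches in the water shader rather than a separate pass.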

Scene depth based rendering and MSAA: Results
Fog full-screen pass with MSAA disabled (left) / enabled (right)
River / ocean shading artifact (left) and fix (right)

Conclusion
- Depth based rendering offers lots of opportunities
- Demonstrated several ways it is used in CryEngine2
- Integration issues (alpha-transparent geometry, MSAA)

Kualoa Ranch on Hawaii – real-world photo (left), internal replica rendered with CryEngine2 (right)

References
- [Hargreaves04] Shawn Hargreaves, "Deferred Shading," Game Developers Conference, D3D Tutorial Day, March 2004.
- [Nishita93] Tomoyuki Nishita et al., "Display of the Earth Taking into Account Atmospheric Scattering," In Proceedings of SIGGRAPH 1993, pages 175-182.
- [Wang03] Niniane Wang, "Realistic and Fast Cloud Rendering in Computer Games," In Proceedings of SIGGRAPH 2003.
- [Wenzel06] Carsten Wenzel, "Real-time Atmospheric Effects in Games," SIGGRAPH 2006.

Acknowledgements
- Crytek R&D / Crysis dev team

Questions?
