Image-effect post-processing matters a great deal for a game's visual quality, but implementing impressive effects while keeping performance acceptable is hard.
I divide post-processing effects into two categories by lifetime: persistent and transient. A persistent effect stays active all the time; generally only one such effect should be allowed at once, or performance suffers. Unity's post-processing chain has significant overhead when multiple effects are stacked (reportedly this is optimized in newer Unity versions). Transient effects are enabled briefly and then turned off, so having several of them coexist has little impact on performance because they are short-lived.
The post-processing effects used in my game are mainly:
Persistent:
Bloom (full-screen glow)
GlobalFog (height fog; restricted use)
ScreenSpaceRain (post-processed rain; restricted use)
ScreenSpaceSnow (post-processed snow; restricted use)
Transient:
Blur
Dead (fade to black on death)
HeatDistortion (heat haze)
RadialBlur
WaterRipple (water ripple effect)
I defined a base class for all post-processing effects that checks whether the hardware supports them.
using UnityEngine;
using System.Collections;

namespace Luoyinan
{
    public class ImageEffect : MonoBehaviour
    {
        [HideInInspector]
        public Material m_Material;
        protected bool isSupport = false;

        // Every subclass must call this function.
        public void CheckSupport(string shaderName, DepthTextureMode mode = DepthTextureMode.None)
        {
            Shader shader = Shader.Find(shaderName);
            if (!shader || !shader.isSupported)
            {
                LogSystem.DebugLog("shader is not supported, shaderName:", shaderName);
                isSupport = false;
                enabled = false;
                return;
            }
            if (!SystemInfo.supportsImageEffects)
            {
                LogSystem.DebugLog("supportsImageEffects false, shaderName:", shaderName);
                isSupport = false;
                enabled = false;
                return;
            }
            if (!SystemInfo.supportsRenderTextures)
            {
                LogSystem.DebugLog("supportsRenderTextures false, shaderName:", shaderName);
                isSupport = false;
                enabled = false;
                return;
            }
            if (mode == DepthTextureMode.Depth || mode == DepthTextureMode.DepthNormals)
            {
                if (!SystemInfo.SupportsRenderTextureFormat(RenderTextureFormat.Depth))
                {
                    LogSystem.DebugLog("SupportsRenderTextureFormat(RenderTextureFormat.Depth) false, shaderName:", shaderName);
                    isSupport = false;
                    enabled = false;
                    return;
                }
            }

            m_Material = new Material(shader);
            isSupport = true;
            enabled = true;
        }
    }
}
--------------------------------------------------------------------------------------------------------------------------
Bloom (full-screen glow):
On PC, HDR would normally be used; on mobile, hardware limitations push us toward a cheaper Bloom effect instead. The Bloom implementation itself can follow Unity's official examples, so here I only cover my improvements.
1. Bloom usually combines the original image and the blurred image with additive (Add) blending, but that overexposes easily. I switched to Screen blending, which avoids blowing out the image.
// blend mode
//fixed4 final = bloom + color; // Add
fixed4 final = bloom + color * (1 - bloom); // Screen
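The difference is easy to see with plain numbers. A minimal Python sketch (not shader code) of the two blend modes, applied per channel:

```python
def blend_add(bloom, color):
    # Additive blend: two bright inputs easily exceed 1.0,
    # which then clips to pure white (overexposure).
    return bloom + color

def blend_screen(bloom, color):
    # Screen blend: bloom + color * (1 - bloom) == 1 - (1 - bloom) * (1 - color).
    # For inputs in [0, 1] the result always stays in [0, 1].
    return bloom + color * (1 - bloom)

# Two bright channel values:
# blend_add(0.8, 0.7) overshoots to 1.5 and would clip to white,
# blend_screen(0.8, 0.7) stays below 1.0 while still looking bright.
```

Screen blending approaches white asymptotically instead of clipping, which is why the bloom layer can be strong without blowing out the frame.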
2. Bloom extracts pixels above a brightness threshold to blur, but that is hard to control. If your whole scene uses specular maps, you can instead use the specular map's gloss to decide which pixels get extracted for blurring. So I added a mode that filters pixels by gloss.
#ifdef _ALPHA_GLOSS_ON
    return max(color * 0.25 - THRESHHOLD, 0) * ONE_MINUS_THRESHHOLD_TIMES_INTENSITY * color.a;
#else
    return max(color * 0.25 - THRESHHOLD, 0) * ONE_MINUS_THRESHHOLD_TIMES_INTENSITY;
#endif
Render the gloss of all specular maps into the screen's alpha channel:
Then this gloss value drives the scene's bloom, so the artists can control bloom brightness per material. For example, in the image below the wings are very bright, yet the character is not overexposed:
3. Add a contrast correction to the Bloom effect to strengthen the light/dark contrast.
// contrast
fixed3 AvgLumin = fixed3(0.5, 0.5, 0.5);
color.xyz = lerp(AvgLumin, color.xyz, _conAmount);
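The lerp toward the average luminance works like this in isolation (a Python sketch of the same math; `_conAmount` in the shader corresponds to `amount` here — values above 1 push colors away from mid-gray, increasing contrast, while values below 1 flatten the image):

```python
def contrast(channel, amount, avg=0.5):
    # lerp(avg, channel, amount) = avg + (channel - avg) * amount.
    # amount > 1 extrapolates away from the average luminance,
    # so bright values get brighter and dark values get darker.
    return avg + (channel - avg) * amount

# With amount = 1.5:
bright = contrast(0.7, 1.5)  # pushed up toward 0.8
dark = contrast(0.3, 1.5)    # pushed down toward 0.2
mid = contrast(0.5, 1.5)     # mid-gray is unchanged
```

Note the result may leave [0, 1] for large amounts; the GPU clamps it when writing to the render target.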
GlobalFog (height fog):
Height fog looks very good in underground-palace scenes, with the fog rising from the ground upward. Since it requires rendering an extra depth texture, I restrict it to small scenes.
Comparison of traditional fog and height fog:
ScreenSpaceRain (post-processed rain):
I built rain and snow for the weather system. A complete rain effect needs not only falling raindrops, but also ripples where rain hits the ground and a wet sheen over the scene. I implemented it as a post-process, which requires rendering the scene's depth and normals.
The ripples are animated by cycling through a sequence of ripple normal-map frames:
Because this is a post-process, the ripple animation would cover the whole screen, so we use the rendered depth-normals texture to filter out pixels that should not ripple. From the depth texture we can reconstruct each pixel's position, so we first skip pixels whose world position is far away, which both saves work and bounds the rain's range. From the normal texture we get each pixel's normal direction, so vertical walls are left unaffected by the ripples, as shown below:
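This per-pixel filtering can be sketched outside the shader. A minimal Python version of the weight, mirroring the `d < _MaxDistance` early-out and the `step(0.1, filter) * filter` term from the full shader later in this article (`world_normal_y` is the y component of the world-space normal):

```python
def ripple_weight(world_normal_y, distance, max_distance=100.0):
    # Skip pixels that are too far from the camera entirely;
    # this also bounds the visible range of the rain.
    if distance >= max_distance:
        return 0.0
    # step(0.1, ny) zeroes out near-vertical surfaces (walls),
    # while ny itself scales the effect up on flat, upward-facing ground.
    step_val = 1.0 if world_normal_y >= 0.1 else 0.0
    return step_val * world_normal_y

ripple_weight(1.0, 10.0)   # flat ground, close: full ripples
ripple_weight(0.0, 10.0)   # vertical wall: no ripples
ripple_weight(1.0, 500.0)  # too far away: no ripples
```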
Screen-space rendering sometimes needs the world position of a pixel, but the depth texture only gives us a depth value, so we must convert that depth into a world position. This space conversion is the core of this kind of screen-space post-processing.
Vertex program:
v2f vert (appdata v)
{
    v2f o;
    o.pos = mul(UNITY_MATRIX_MVP, v.vertex);
    o.uv = v.texcoord;

    // screen pos -> view pos
    float4 cameraRay = mul(unity_CameraInvProjection, float4(v.texcoord * 2 - 1, 1, 1)); // far plane
    cameraRay.z *= -1; // the camera looks down -Z in view space, opposite to Unity's default Z axis
    o.ray = cameraRay.xyz / cameraRay.w;
    return o;
}
Fragment program:
// view pos -> world pos
float4 viewPos = float4(i.ray * depth, 1);
float4 worldPos = mul(unity_CameraToWorld, viewPos);
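The idea behind `i.ray * depth`: the vertex program stores each pixel's view-space position on the far clip plane, and the depth decoded from the depth-normals texture is linear and normalized to [0, 1], so scaling the ray by it recovers the view-space position, which the camera-to-world matrix then lifts into world space. A standalone Python sketch with made-up numbers:

```python
# Assumptions for illustration: depth01 is linear depth in [0, 1]
# (as decoded by DecodeDepthNormal), and ray is the view-space
# position of this pixel on the far clip plane.
far = 100.0
ray = (30.0, 20.0, -far)  # view-space far-plane position for one pixel
depth01 = 0.25            # the surface is a quarter of the way to the far plane

# view pos = ray * depth01 (the shader's "i.ray * depth")
view_pos = tuple(c * depth01 for c in ray)
# view_pos is (7.5, 5.0, -25.0): the pixel's actual view-space position.
# Multiplying by the camera-to-world matrix then gives the world position.
```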
The complete shader:
Shader "Luoyinan/ImageEffect/ScreenSpaceRain"
{
    Properties
    {
        _MainTex ("Texture", 2D) = "white" {}
    }
    SubShader
    {
        Cull Off ZWrite Off ZTest Always
        Fog { Mode off }

        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            struct appdata
            {
                float4 vertex : POSITION;
                float2 texcoord : TEXCOORD0;
            };

            struct v2f
            {
                float4 pos : SV_POSITION;
                float2 uv : TEXCOORD0;
                float3 ray : TEXCOORD1;
            };

            v2f vert (appdata v)
            {
                v2f o;
                o.pos = mul(UNITY_MATRIX_MVP, v.vertex);
                o.uv = v.texcoord;

                // screen pos -> view pos
                float4 cameraRay = mul(unity_CameraInvProjection, float4(v.texcoord * 2 - 1, 1, 1)); // farPlane
                cameraRay.z *= -1; // the camera looks down -Z in view space, opposite to Unity's default Z axis
                o.ray = cameraRay.xyz / cameraRay.w;
                return o;
            }

            sampler2D _MainTex;
            sampler2D _CameraDepthNormalsTexture;
            float4x4 _CamToWorld;

            sampler2D _RippleTex;
            float _RippleTexScale;
            fixed _RippleIntensity;
            fixed _RippleBlendFactor;

            sampler2D _WaveTex;
            fixed _WaveIntensity;
            fixed _WaveTexScale;
            half4 _WaveForce;

            samplerCUBE _ReflectionTex;
            fixed _RainIntensity;
            half _MaxDistance;

            half4 frag (v2f i) : SV_Target
            {
                fixed4 finalColor = tex2D(_MainTex, i.uv);

                // normal & depth
                half3 normal;
                float depth;
                DecodeDepthNormal(tex2D(_CameraDepthNormalsTexture, i.uv), depth, normal);
                normal = mul((float3x3)_CamToWorld, normal);
                //normal = mul((float3x3)unity_CameraToWorld, normal);
                half filter = normal.y;

                // view pos -> world pos
                float4 viewPos = float4(i.ray * depth, 1);
                float4 worldPos = mul(unity_CameraToWorld, viewPos);

                // distance
                half d = length(worldPos.xyz - _WorldSpaceCameraPos.xyz);
                if (d < _MaxDistance) // performance
                {
                    // wave
                    half3 bump = UnpackNormal(tex2D(_WaveTex, worldPos.xz * _WaveTexScale + _Time.xx * _WaveForce.xy));
                    bump += UnpackNormal(tex2D(_WaveTex, worldPos.xz * _WaveTexScale + _Time.xx * _WaveForce.zw));
                    bump *= 0.5;

                    // ripple
                    half3 ripple = UnpackNormal(tex2D(_RippleTex, worldPos.xz * _RippleTexScale));
                    normal.xy = lerp(normal.xy, ripple.xy * _RippleIntensity + bump.xy * _WaveIntensity, _RippleBlendFactor);

                    // reflection
                    half3 viewDir = normalize(_WorldSpaceCameraPos - worldPos);
                    half3 reflUVW = normalize(reflect(-viewDir, normal));
                    //half fresnel = 1 - saturate(dot(viewDir, normal));
                    //fresnel = 0.25 + fresnel * 0.75;
                    half4 reflection = texCUBE(_ReflectionTex, reflUVW) * _RainIntensity;
                    finalColor += reflection * normal.y * step(0.1, filter) * filter * 2;
                }

                return finalColor;
            }
            ENDCG
        }
    }
}
C# script:
using UnityEngine;
using System.Collections;

namespace Luoyinan
{
    [AddComponentMenu("Image Effects/ScreenSpaceRain")]
    public class ScreenSpaceRain : ImageEffect
    {
        public float maxDistance = 100.0f;

        public Texture2D[] rippleTextures;
        public Texture2D waveTexture;
        public Cubemap reflectionTexture;

        public float rippleTextureScale = 0.3f;
        public float rippleFrequency = 20.0f;
        [Range(0.5f, 2)]
        public float rippleIntensity = 1.25f;
        [Range(0, 1)]
        public float rippleBlendFactor = 0.9f;

        public Vector4 waveForce = new Vector4(1.0f, 1.0f, -1.0f, -1.0f);
        public float waveIntensity = 0.2f;
        public float waveTextureScale = 0.15f;

        public float rainIntensity = 1.0f;

        private int m_CurRippleTextureIndex = 0;
        private float m_LastTime = 0;

        private static ScreenSpaceRain m_Instance;
        public static ScreenSpaceRain Instance
        {
            get { return m_Instance; }
        }

        void Awake()
        {
            if (null != m_Instance)
            {
                LogSystem.ErrorLog("Awake() must only be called once!");
                return;
            }
            m_Instance = this;

            CheckSupport("Luoyinan/ImageEffect/ScreenSpaceRain", DepthTextureMode.DepthNormals);

            if (rippleTextures == null)
            {
                int count = 24;
                rippleTextures = new Texture2D[count];
                for (int i = 0; i < count; ++i)
                {
                    rippleTextures[i] = Resources.Load("Texture/Rain/Ripple/ripple" + (i + 1) + "_ddn") as Texture2D;
                }
            }
            if (waveTexture == null)
            {
                waveTexture = Resources.Load("Texture/Rain/wave") as Texture2D;
            }
            if (reflectionTexture == null)
            {
                reflectionTexture = Resources.Load("Texture/Reflection_2") as Cubemap;
            }
        }

        void OnEnable()
        {
            if (!isSupport)
                return;
            ImageEffectMgr.Instance.AddDepthTextureModeCount(GetComponent<Camera>(), DepthTextureMode.DepthNormals);
        }

        void OnDisable()
        {
            if (!isSupport)
                return;
            ImageEffectMgr.Instance.RemoveDepthTextureModeCount(GetComponent<Camera>(), DepthTextureMode.DepthNormals);
        }

        void Update()
        {
            if (!isSupport)
                return;

            // Advance the ripple animation frame; heavier rain plays the sequence faster.
            float f = rippleFrequency * (rainIntensity * 0.5f + 0.5f);
            if (Time.time - m_LastTime > 1.0f / f)
            {
                m_LastTime = Time.time;
                ++m_CurRippleTextureIndex;
                if (m_CurRippleTextureIndex >= rippleTextures.Length)
                    m_CurRippleTextureIndex = 0;
            }
        }

        void OnRenderImage(RenderTexture src, RenderTexture dest)
        {
            if (!isSupport)
            {
                Graphics.Blit(src, dest);
                return;
            }

            m_Material.SetMatrix("_CamToWorld", GetComponent<Camera>().cameraToWorldMatrix);
            m_Material.SetFloat("_MaxDistance", maxDistance);
            m_Material.SetTexture("_RippleTex", rippleTextures[m_CurRippleTextureIndex]);
            m_Material.SetTexture("_WaveTex", waveTexture);
            m_Material.SetTexture("_ReflectionTex", reflectionTexture);
            m_Material.SetFloat("_RippleTexScale", rippleTextureScale);
            m_Material.SetFloat("_RippleFrequency", rippleFrequency);
            m_Material.SetFloat("_RippleIntensity", rippleIntensity);
            m_Material.SetFloat("_RippleBlendFactor", rippleBlendFactor);
            m_Material.SetFloat("_RainIntensity", rainIntensity);
            m_Material.SetVector("_WaveForce", waveForce);
            m_Material.SetFloat("_WaveIntensity", waveIntensity);
            m_Material.SetFloat("_WaveTexScale", waveTextureScale);

            Graphics.Blit(src, dest, m_Material);
        }

        void OnDestroy()
        {
            m_Instance = null;
        }
    }
}
ScreenSpaceSnow (post-processed snow):
The post-processed snow shares some techniques with the rain. The snow can gradually thicken and gradually melt.
A similar effect in Changyou's 蛮荒搜神记:
The snow needs a noise texture to give it some variation, as in the deeper and shallower patches of snow in the image below:
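The article does not list the snow shader, so the following is only a hypothetical Python sketch of how a noise-modulated coverage threshold might work: upward-facing surfaces (`normal_y` near 1) collect snow, the noise sample breaks up the boundary into deep and shallow patches, and a `coverage` parameter in [0, 1] animates accumulation and melting:

```python
def snow_amount(normal_y, noise, coverage):
    # Hypothetical sketch (not the article's shader):
    # t combines surface orientation with the noise texture sample;
    # coverage raises or lowers the threshold as snow accumulates or melts.
    t = normal_y * noise
    lo = 1.0 - coverage
    if t <= lo:
        return 0.0
    if t >= 1.0:
        return 1.0
    # smoothstep(lo, 1, t) for a soft snow edge
    x = (t - lo) / (1.0 - lo)
    return x * x * (3 - 2 * x)

snow_amount(1.0, 1.0, 0.5)  # flat ground, strong noise: fully covered
snow_amount(0.0, 0.8, 0.5)  # vertical wall: no snow
snow_amount(1.0, 0.6, 0.5)  # flat ground, weak noise: shallow snow
```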
Rain and snow are both part of the weather system, and setting them up in the editor is simple:
-----------------------------------------------------------------------------------------------------------------
There are a few other post-processing effects; I'll cover them when I have time.
Author: qq18052887, published 2018/05/22 15:30:23. Original link: https://blog.csdn.net/qq18052887/article/details/80403917