PRTGI


Introduction

Notes from studying Precomputed Radiance Transfer Global Illumination (PRTGI).

Preview

(preview images)

Sources

预计算辐照度全局光照(PRTGI)从理论到实战 (PRTGI from theory to practice), by AKG4e3 on Zhihu
Unity移动端可用实时GI方案细节补充 (details of a real-time GI solution usable on mobile Unity), by 方木君 on Zhihu
10.3 球面函数和半球函数 (spherical and hemispherical functions), by Justin on Zhihu
体探针漏光解决方案 (a fix for volume-probe light leaking), by jackie 偶尔不帅 on Zhihu
全局光照技术:从离线到实时渲染 (Global Illumination: From Offline to Real-Time Rendering)

Approach

  • Scene geometry
    • Each probe captures a CubeMap. Storing the full cubemaps takes too much memory, so only a fixed number of Surfels sampled from them are kept
  • Real-time lighting
    • At runtime, compute the radiance received by each Surfel
    • Since shading needs per-pixel lighting rather than per-Surfel lighting, the environment lighting represented by the Surfels is encoded onto Probes; during pixel shading, each pixel looks up nearby Probes to recover its surrounding lighting
  • Lighting encoding
    • With spherical harmonic basis functions and their coefficients, the lighting color in any given direction can be reconstructed (the radiance is encoded in SH)
    • By the property that "convolving a spherically symmetric kernel with a signal is equivalent to multiplying the kernel's SH coefficients with the signal's SH coefficients",
    • computing the SH coefficients of the radiance, multiplying them by the SH coefficients of the (cosine) transfer function, and applying a fixed per-band scale yields the corresponding irradiance
  • Computing the radiance SH coefficients
    • Estimate them numerically with Monte Carlo integration
  • Optimizations
    • Light leaking
      • When accumulating each probe's contribution, weight it by the dot product of the pixel-to-probe direction and the pixel normal
    • Supporting more than one light bounce
      • Feed the previous frame's trace results back into the current one: compute each Surfel's direct lighting, and additionally sample last frame's SH coefficients at the Surfel's world position to reconstruct bounced lighting
    • Probe data blending
      • In traditional forward rendering, four Probes are assigned per object and lighting is interpolated inside a tetrahedron; when an object is very large and spans multiple tetrahedra, this produces visible errors
      • (image)
      • Instead, pack all Probes' SH coefficients into a single buffer; at runtime, map the world position to its voxel cell and blend the colors of the 8 nearest Probes
      • (image)
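The encode-then-reconstruct pipeline described above can be sketched end to end in Python. This is a minimal sketch, not the project's code: the SH basis ordering mirrors the shader's SH(l, m, dir) calls seen later, and the clamped-cosine band factors π, 2π/3, π/4 are the standard ones for irradiance reconstruction.

```python
import math
import random

# Real SH basis, bands 0..2 (9 coefficients), ordered (0,0), (1,-1), (1,0),
# (1,1), (2,-2), (2,-1), (2,0), (2,1), (2,2)
def sh9(d):
    x, y, z = d
    return [
        0.28209479177,
        0.48860251190 * y,
        0.48860251190 * z,
        0.48860251190 * x,
        1.09254843059 * x * y,
        1.09254843059 * y * z,
        0.31539156525 * (3 * z * z - 1),
        1.09254843059 * x * z,
        0.54627421529 * (x * x - y * y),
    ]

# SH coefficients of the clamped-cosine (transfer) kernel, one factor per band
A_HAT = [math.pi] + [2 * math.pi / 3] * 3 + [math.pi / 4] * 5

def uniform_sphere(u, v):
    phi = 2 * math.pi * u
    cos_t = 1 - 2 * v
    sin_t = math.sqrt(max(0.0, 1 - cos_t * cos_t))
    return (sin_t * math.cos(phi), sin_t * math.sin(phi), cos_t)

def project_radiance(radiance_fn, n=4096, seed=1):
    """Monte Carlo SH projection: c_i = (4*pi / N) * sum(Y_i(d) * L(d))."""
    rng = random.Random(seed)
    c = [0.0] * 9
    for _ in range(n):
        d = uniform_sphere(rng.random(), rng.random())
        L = radiance_fn(d)
        basis = sh9(d)
        for i in range(9):
            c[i] += basis[i] * L * 4 * math.pi / n
    return c

def irradiance(c, normal):
    """Convolution property: multiply coefficients band-wise, then evaluate."""
    basis = sh9(normal)
    return sum(A_HAT[i] * c[i] * basis[i] for i in range(9))
```

For a constant unit radiance field the reconstructed irradiance should come out as π in every direction, which is a quick sanity check of the band factors.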

Implementation notes

Initialization

Disable Unity's default environment lighting contribution.
(image)
Result:
(image)

Scene geometry

  1. Use a camera to capture each Probe's cubemap G-buffer: world-space position, normal, and albedo
    (image)
Probe.cs

private Camera CreateCamera()
{
    GameObject cameraGo = new GameObject("CubeMapCamera");
    cameraGo.transform.SetLocalPositionAndRotation(transform.position, Quaternion.identity);
    Camera camera = cameraGo.AddComponent<Camera>();
    camera.clearFlags = CameraClearFlags.SolidColor;
    camera.backgroundColor = new Color(0, 0, 0, 0);
    return camera;
}

private void BatchGBufferData(Camera camera)
{
    GameObject[] objects = FindObjectsOfType(typeof(GameObject)) as GameObject[];

    // Render world position, normal and albedo into separate cubemaps
    // by temporarily swapping every renderer's shader
    BatchSetShader(objects, Shader.Find("PRTGI/GBufferWorldPos"));
    camera.RenderToCubemap(RT_WorldPos);
    BatchSetShader(objects, Shader.Find("PRTGI/GBufferNormal"));
    camera.RenderToCubemap(RT_Normal);
    BatchSetShader(objects, Shader.Find("Universal Render Pipeline/Unlit"));
    camera.RenderToCubemap(RT_Albedo);

    // Restore the original lit shader
    BatchSetShader(objects, Shader.Find("Universal Render Pipeline/Lit"));
    DestroyImmediate(camera.gameObject);
}

private void BatchSetShader(GameObject[] objs, Shader shader)
{
    for (int k = 0; k < objs.Length; k++)
    {
        MeshRenderer meshRenderer = objs[k].GetComponent<MeshRenderer>();
        if (meshRenderer)
        {
            meshRenderer.sharedMaterial.shader = shader;
        }
    }
}

public void CaptureGBufferCubeMaps()
{
    Camera camera = CreateCamera();
    if (camera == null)
    {
        return;
    }
    BatchGBufferData(camera);
}
GBufferWorldPos / GBufferNormal fragment outputs:

// GBufferWorldPos
return float4(o.worldPos, 1);

// GBufferNormal
return float4(normalize(o.normal), 1);

  2. Store the worldPos / normal / albedo / skyMask values sampled from the cubemaps in a Surfel array to compress the data
    (image)
    (image)
    The w channel of the normal cubemap encodes surface opacity: GBufferNormal.shader always writes 1 into the normal's w component, while sky pixels keep the cleared value of 0
Surfel data acquisition

public Surfel[] readBackBuffer;
public ComputeBuffer surfels;
public ComputeShader surfelSampleCS;

public RenderTexture RT_WorldPos;
public RenderTexture RT_Normal;
public RenderTexture RT_Albedo;

public struct Surfel
{
    public Vector3 position;
    public Vector3 normal;
    public Vector3 albedo;
    public float skyMask;
}

A compute shader randomly picks a subset of cubemap texels to use as ray-tracing sample points:
private void SampleSurfels(RenderTexture worldPosCubeMap, RenderTexture normalCubeMap, RenderTexture albedoCubeMap)
{
    var kid = surfelSampleCS.FindKernel("CSMain");

    Vector3 p = gameObject.transform.position;
    surfelSampleCS.SetVector("_probePos", new Vector4(p.x, p.y, p.z, 1f));
    surfelSampleCS.SetFloat("_randSeed", UnityEngine.Random.Range(0.0f, 1f));
    surfelSampleCS.SetTexture(kid, "_worldPosCubemap", worldPosCubeMap);
    surfelSampleCS.SetTexture(kid, "_normalCubemap", normalCubeMap);
    surfelSampleCS.SetTexture(kid, "_albedoCubemap", albedoCubeMap);
    surfelSampleCS.SetBuffer(kid, "_surfels", surfels);

    surfelSampleCS.Dispatch(kid, 1, 1, 1);
    surfels.GetData(readBackBuffer);
}
SurfelSampleCS.cs

// ref: https://stackoverflow.com/questions/4200224/random-noise-functions-for-glsl
// Pseudo-random number from a UV coordinate
float rand(float2 uv)
{
    return frac(sin(dot(uv, float2(12.9898, 78.233))) * 43758.5453);
}

// Uniformly sample a direction on the unit sphere
// ref: Unreal Engine 4, MonteCarlo.ush
float3 UniformSphereSample(float u, float v)
{
    const float C_PI = 3.14159265359f;
    float phi = 2.0 * C_PI * u; // cos/sin expect radians; do not convert to degrees
    float cosine_theta = 1.0 - 2.0 * v;
    float sine_theta = sqrt(1.0 - cosine_theta * cosine_theta);

    float x = sine_theta * cos(phi);
    float y = sine_theta * sin(phi);
    float z = cosine_theta;

    return float3(x, y, z);
}

[numthreads(32, 16, 1)]
void CSMain(uint3 id : SV_DispatchThreadID)
{
    // 32 x 16 = 512 sample directions per probe
    float2 xy = float2(id.x, id.y) / float2(32, 16);
    xy += float2(1, 1) * _randSeed;

    float u = rand(xy * 1.0);
    float v = rand(xy * 2.0);
    float3 dir = UniformSphereSample(u, v);

    Surfel result;
    result.position = _worldPosCubemap.SampleLevel(sampler_point_clamp, dir, 0).rgb;
    result.albedo = _albedoCubemap.SampleLevel(sampler_point_clamp, dir, 0).rgb;

    float4 normal_and_mask = _normalCubemap.SampleLevel(sampler_point_clamp, dir, 0);
    result.normal = normal_and_mask.xyz;
    result.skyMask = saturate(1.0 - normal_and_mask.w); // w == 1 for geometry, 0 for sky

    // Sky samples have no hit position; park them one unit away from the probe
    result.position += (_probePos.xyz + dir) * result.skyMask;

    uint surfelIndex = id.x * 16 + id.y;
    _surfels[surfelIndex] = result;
}
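The UniformSphereSample mapping can be checked in isolation. A small Python port of the same formula (assumed faithful to the HLSL above) confirms that it produces unit-length directions with no hemispherical bias:

```python
import math
import random

def uniform_sphere_sample(u, v):
    """Map two uniform [0,1) numbers to a uniformly distributed direction
    on the unit sphere (same mapping as the compute shader)."""
    phi = 2.0 * math.pi * u
    cos_theta = 1.0 - 2.0 * v          # cos(theta) uniform in [-1, 1]
    sin_theta = math.sqrt(max(0.0, 1.0 - cos_theta * cos_theta))
    return (sin_theta * math.cos(phi), sin_theta * math.sin(phi), cos_theta)

rng = random.Random(0)
samples = [uniform_sphere_sample(rng.random(), rng.random()) for _ in range(20000)]
```

Every sample should have unit norm by construction, and the mean of each component should hover near zero, which is what makes the Monte Carlo SH projection later unbiased.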

  3. Computing each Surfel's radiance
Radiance computation

uint surfelIndex = id.x * 16 + id.y;
Surfel surfel = _surfels[surfelIndex];

// Direct lighting from the main light, with shadow attenuation
Light mainLight = GetMainLight();
float4 shadowCoord = TransformWorldToShadowCoord(surfel.position);
float atten = SampleShadowmap(TEXTURE2D_ARGS(_MainLightShadowmapTexture, sampler_MainLightShadowmapTexture),
                              shadowCoord, GetMainLightShadowSamplingData(), GetMainLightShadowParams(), false);

float NdotL = saturate(dot(surfel.normal, mainLight.direction));
float3 radiance = surfel.albedo * mainLight.color * NdotL * atten * (1.0 - surfel.skyMask);

// Sky samples contribute the skybox color instead
float3 dir = normalize(surfel.position - _probePos.xyz);
float3 skyColor = SAMPLE_TEXTURECUBE_LOD(_GlossyEnvironmentCubeMap, sampler_GlossyEnvironmentCubeMap, dir, 0).xyz;
radiance += skyColor * surfel.skyMask * _skyLightIntensity;

(image)
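The per-surfel radiance above is just Lambert direct lighting masked out for sky samples, plus the sky color for sky samples. A hedged Python restatement (the function and parameter names are illustrative, not the project's API):

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def surfel_radiance(albedo, light_color, normal, light_dir, shadow, sky_mask,
                    sky_color, sky_intensity):
    """Radiance leaving a surfel: Lambert direct term for geometry samples
    (sky_mask == 0), sky light for sky samples (sky_mask == 1)."""
    ndotl = max(0.0, dot(normal, light_dir))
    surface = [a * lc * ndotl * shadow * (1.0 - sky_mask)
               for a, lc in zip(albedo, light_color)]
    sky = [sc * sky_mask * sky_intensity for sc in sky_color]
    return [s + k for s, k in zip(surface, sky)]
```

With a white albedo, the normal facing the light, full shadow visibility, and sky_mask = 0, the result is exactly the light color; with sky_mask = 1 the surface term vanishes and only the scaled sky color remains.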

  4. Computing irradiance
    (image)
SHDebug.shader

// Fixed-point integers are used to store fractions, because
// compute-shader InterlockedAdd does not support float.
// array size: 3x9 = 27
CBUFFER_START(UnityPerMaterial)
StructuredBuffer<int> _coefficientSH9;
CBUFFER_END

float3 dir = i.normal;

// decode SH coefficients
float3 c[9];
for (int i = 0; i < 9; i++)
{
    c[i].x = DecodeFloatFromInt(_coefficientSH9[i * 3 + 0]);
    c[i].y = DecodeFloatFromInt(_coefficientSH9[i * 3 + 1]);
    c[i].z = DecodeFloatFromInt(_coefficientSH9[i * 3 + 2]);
}

// reconstruct irradiance, then apply the Lambert BRDF (1/PI)
float3 irradiance = IrradianceSH9(c, dir);
float3 Lo = irradiance / PI;

return float4(Lo, 1.0);


MeshRenderer meshRenderer = gameObject.GetComponent<MeshRenderer>();
meshRenderer.sharedMaterial.shader = Shader.Find("CasualPRT/SHDebug");
matPropBlock.SetBuffer("_coefficientSH9", coefficientSH9);
meshRenderer.SetPropertyBlock(matPropBlock);

// SH projection: Monte Carlo estimate of the radiance SH coefficients,
// c_i = (4*PI / N) * sum(SH_i(dir) * radiance(dir)) over N uniform directions
const float N = 32 * 16;
float3 c[9];
c[0] = SH(0,  0, dir) * radiance * 4.0 * PI / N;
c[1] = SH(1, -1, dir) * radiance * 4.0 * PI / N;
c[2] = SH(1,  0, dir) * radiance * 4.0 * PI / N;
c[3] = SH(1,  1, dir) * radiance * 4.0 * PI / N;
c[4] = SH(2, -2, dir) * radiance * 4.0 * PI / N;
c[5] = SH(2, -1, dir) * radiance * 4.0 * PI / N;
c[6] = SH(2,  0, dir) * radiance * 4.0 * PI / N;
c[7] = SH(2,  1, dir) * radiance * 4.0 * PI / N;
c[8] = SH(2,  2, dir) * radiance * 4.0 * PI / N;

// atomically accumulate the result into the shared buffer (fixed-point encoded)
for (int i = 0; i < 9; i++)
{
    InterlockedAdd(_coefficientSH9[i * 3 + 0], EncodeFloatToInt(c[i].x));
    InterlockedAdd(_coefficientSH9[i * 3 + 1], EncodeFloatToInt(c[i].y));
    InterlockedAdd(_coefficientSH9[i * 3 + 2], EncodeFloatToInt(c[i].z));
}
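Because InterlockedAdd only operates on integers, the partial SH sums are accumulated in fixed point. A sketch of one possible EncodeFloatToInt / DecodeFloatFromInt pair (the scale factor here is an assumption; the project's actual factor may differ):

```python
SCALE = 1 << 16  # assumed fixed-point scale: 16 fractional bits

def encode_float_to_int(x):
    """Convert a float to fixed point so integer atomics can add it."""
    return int(round(x * SCALE))

def decode_float_from_int(i):
    """Recover the float from its fixed-point representation."""
    return i / SCALE
```

The key property is that summing encodings is equivalent (up to rounding) to encoding the sum, so many threads can accumulate their partial coefficients with integer atomic adds and the shader can decode the total afterwards.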

  5. Map the world position to its voxel cell and blend the colors of the 8 nearest Probes
    (image)
Color blending

float3 SampleSHVoxel(
    in float4 worldPos,
    in float3 albedo,
    in float3 normal,
    in StructuredBuffer<int> _coefficientVoxel,
    in float _coefficientVoxelGridSize,
    in float4 _coefficientVoxelCorner,
    in float4 _coefficientVoxelSize)
{
    // probe grid index for the current fragment
    int3 probeIndex3 = GetProbeIndex3DFromWorldPos(worldPos, _coefficientVoxelSize, _coefficientVoxelGridSize, _coefficientVoxelCorner);
    int3 offset[8] = {
        int3(0, 0, 0), int3(0, 0, 1), int3(0, 1, 0), int3(0, 1, 1),
        int3(1, 0, 0), int3(1, 0, 1), int3(1, 1, 0), int3(1, 1, 1),
    };

    float3 c[9];
    float3 Lo[8] = { float3(0, 0, 0), float3(0, 0, 0), float3(0, 0, 0), float3(0, 0, 0), float3(0, 0, 0), float3(0, 0, 0), float3(0, 0, 0), float3(0, 0, 0) };
    float3 BRDF = albedo / PI;
    float weight = 0.0005;  // small epsilon to avoid division by zero

    // visit the 8 nearest probes
    for (int i = 0; i < 8; i++)
    {
        int3 idx3 = probeIndex3 + offset[i];
        bool isInsideVoxel = IsIndex3DInsideVoxel(idx3, _coefficientVoxelSize);
        if (!isInsideVoxel)
        {
            Lo[i] = float3(0, 0, 0);
            continue;
        }

        // normal weight blend: probes behind the surface contribute nothing
        float3 probePos = GetProbePositionFromIndex3D(idx3, _coefficientVoxelGridSize, _coefficientVoxelCorner);
        float3 dir = normalize(probePos - worldPos.xyz);
        float normalWeight = saturate(dot(dir, normal));
        weight += normalWeight;

        // decode SH9
        int probeIndex = GetProbeIndex1DFromIndex3D(idx3, _coefficientVoxelSize);
        DecodeSHCoefficientFromVoxel(c, _coefficientVoxel, probeIndex);
        Lo[i] = IrradianceSH9(c, normal) * BRDF * normalWeight;
    }

    // trilinear interpolation between the 8 corner results
    float3 minCorner = GetProbePositionFromIndex3D(probeIndex3, _coefficientVoxelGridSize, _coefficientVoxelCorner);
    float3 rate = (worldPos - minCorner) / _coefficientVoxelGridSize;
    float3 color = TrilinearInterpolationFloat3(Lo, rate) / weight;

    return color;
}

float3 gi = SampleSHVoxel(worldPos, albedo, normal,
                          _coefficientVoxel, _coefficientVoxelGridSize, _coefficientVoxelCorner, _coefficientVoxelSize);

(image)
The dark ground is the result of computing only a single light bounce
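The trilinear part of the blend can be sketched on its own. The weights of the 8 cell corners always sum to 1 inside the cell, and the normal weighting is applied on top; this is a minimal sketch under the assumption that TrilinearInterpolationFloat3 uses the same corner ordering as the shader's offset table, not the project's actual helper:

```python
def trilinear_weights(rate):
    """Weight of each of the 8 cell-corner probes for a point at fractional
    position `rate` (each component in [0, 1]).  Corner order matches the
    shader's offset table: (0,0,0), (0,0,1), (0,1,0), (0,1,1), (1,0,0), ..."""
    rx, ry, rz = rate
    w = []
    for ox in (0, 1):
        for oy in (0, 1):
            for oz in (0, 1):
                w.append((rx if ox else 1 - rx) *
                         (ry if oy else 1 - ry) *
                         (rz if oz else 1 - rz))
    return w
```

At the cell's minimum corner (rate = (0, 0, 0)) all the weight lands on the first probe; anywhere else the weight is shared so the blended color varies continuously across cell boundaries.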

  6. Save the previous frame's data and use it in the next frame's computation

(image)



// radiance from last frame: sample last frame's SH voxel at the surfel position
float3 history = SampleSHVoxel(
    float4(surfel.position, 1.0),
    surfel.albedo,
    surfel.normal,
    _lastFrameCoefficientVoxel,
    _coefficientVoxelGridSize,
    _coefficientVoxelCorner,
    _coefficientVoxelSize);
radiance += history * _giIntensity;
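Feeding last frame's result back each frame forms a geometric series, so over successive frames the radiance converges to a steady state rather than growing without bound, as long as the effective feedback gain stays below 1. A toy scalar model (the real feedback is also attenuated by albedo and the SH reconstruction, which this sketch folds into a single gain):

```python
def converge_bounces(direct, gi_intensity, frames=100):
    """Iterate radiance = direct + gain * last_radiance for `frames` frames.
    For gain < 1 this converges to direct / (1 - gain)."""
    radiance = 0.0
    for _ in range(frames):
        radiance = direct + gi_intensity * radiance
    return radiance
```

For example, with direct = 1 and a gain of 0.5, the steady state is 1 / (1 - 0.5) = 2: each extra bounce adds half of the previous one.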

  7. The Surfel data is cleared at runtime, so persist it in a ScriptableObject and load it back in Start
ProbeVolumeData.cs
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using System;
#if UNITY_EDITOR
using UnityEditor;
#endif

[Serializable]
[CreateAssetMenu(fileName = "ProbeVolumeData", menuName = "ProbeVolumeData")]
public class ProbeVolumeData : ScriptableObject
{
    [SerializeField]
    public Vector3 volumePosition;

    [SerializeField]
    public float[] surfelStorageBuffer;

    // pack all probes' surfel data into a 1D array
    public void StorageSurfelData(ProbeVolume volume)
    {
        int probeNum = volume.probeSizeX * volume.probeSizeY * volume.probeSizeZ;
        int surfelPerProbe = 512;
        int floatPerSurfel = 10;
        Array.Resize<float>(ref surfelStorageBuffer, probeNum * surfelPerProbe * floatPerSurfel);

        int j = 0;
        for (int i = 0; i < volume.probes.Length; i++)
        {
            Probe probe = volume.probes[i].GetComponent<Probe>();
            foreach (var surfel in probe.readBackBuffer)
            {
                surfelStorageBuffer[j++] = surfel.position.x;
                surfelStorageBuffer[j++] = surfel.position.y;
                surfelStorageBuffer[j++] = surfel.position.z;
                surfelStorageBuffer[j++] = surfel.normal.x;
                surfelStorageBuffer[j++] = surfel.normal.y;
                surfelStorageBuffer[j++] = surfel.normal.z;
                surfelStorageBuffer[j++] = surfel.albedo.x;
                surfelStorageBuffer[j++] = surfel.albedo.y;
                surfelStorageBuffer[j++] = surfel.albedo.z;
                surfelStorageBuffer[j++] = surfel.skyMask;
            }
        }

        volumePosition = volume.gameObject.transform.position;

#if UNITY_EDITOR
        // save the asset (editor-only API)
        EditorUtility.SetDirty(this);
        AssetDatabase.SaveAssets();
#endif
    }

    // load surfel data from storage
    public void TryLoadSurfelData(ProbeVolume volume)
    {
        int probeNum = volume.probeSizeX * volume.probeSizeY * volume.probeSizeZ;
        int surfelPerProbe = 512;
        int floatPerSurfel = 10;
        bool dataDirty = surfelStorageBuffer.Length != probeNum * surfelPerProbe * floatPerSurfel;
        bool posDirty = volume.gameObject.transform.position != volumePosition;
        if (posDirty || dataDirty)
        {
            Debug.LogWarning("Probe volume data is stale; please re-capture it.");
            return;
        }

        int j = 0;
        foreach (var go in volume.probes)
        {
            Probe probe = go.GetComponent<Probe>();
            for (int i = 0; i < probe.readBackBuffer.Length; i++)
            {
                probe.readBackBuffer[i].position.x = surfelStorageBuffer[j++];
                probe.readBackBuffer[i].position.y = surfelStorageBuffer[j++];
                probe.readBackBuffer[i].position.z = surfelStorageBuffer[j++];
                probe.readBackBuffer[i].normal.x = surfelStorageBuffer[j++];
                probe.readBackBuffer[i].normal.y = surfelStorageBuffer[j++];
                probe.readBackBuffer[i].normal.z = surfelStorageBuffer[j++];
                probe.readBackBuffer[i].albedo.x = surfelStorageBuffer[j++];
                probe.readBackBuffer[i].albedo.y = surfelStorageBuffer[j++];
                probe.readBackBuffer[i].albedo.z = surfelStorageBuffer[j++];
                probe.readBackBuffer[i].skyMask = surfelStorageBuffer[j++];
            }
            probe.surfels.SetData(probe.readBackBuffer);
        }
    }
}

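The flatten/unflatten logic in StorageSurfelData / TryLoadSurfelData amounts to a 10-floats-per-surfel round trip, which is easy to sanity-check in isolation (a hypothetical tuple-based Surfel stands in for the Unity struct here):

```python
# hypothetical Surfel as a tuple: (position xyz, normal xyz, albedo xyz, skyMask)
def pack_surfels(surfels):
    """Flatten surfels into a 1D float list, 10 floats per surfel,
    mirroring the layout used by StorageSurfelData."""
    buf = []
    for pos, nrm, alb, sky in surfels:
        buf.extend(pos)
        buf.extend(nrm)
        buf.extend(alb)
        buf.append(sky)
    return buf

def unpack_surfels(buf):
    """Inverse of pack_surfels: rebuild the surfel tuples from the flat buffer."""
    out = []
    for j in range(0, len(buf), 10):
        s = buf[j:j + 10]
        out.append((tuple(s[0:3]), tuple(s[3:6]), tuple(s[6:9]), s[9]))
    return out
```

A round trip through pack and unpack must reproduce the input exactly; the same invariant is what lets the C# code rely on a bare running index `j` in both directions.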

  8. Light leaking
    体探针漏光解决方案 (a fix for volume-probe light leaking), by jackie 偶尔不帅 on Zhihu
    (image)
  • Compute the model's bent normal and bake it into the vertex colors; this data offsets the position when sampling the voxel texture
    (image)
    • Where the black mesh intersects the green mesh, the exposed vertex's red bent-normal direction is correct, but the rightmost vertex sits inside the other object, so no bent normal is computed for it. The bent normal across the yellow region is therefore interpolated between those two vertices, its offset angle ends up too small, and the sample still lands on probes inside or to the right of the green volume, leaking light
      Fix: do not cast the rays from the surface point itself; offset the origin a small distance along the ray direction first
      (image)
  9. Seams in the global illumination
    Fixed by removing the normal weighting from the contribution calculation
    Comparison:

(comparison images)