Physically Based Rendering V2

Overview
Since Sinoid finished working on PBR, he passed the torch to me and gave me the rights to change and redistribute the code as I see fit. Since taking on the project I have made a few small changes: the first was to allow PBR to work alongside the legacy renderer in Forward Rendering, and the second was to switch the diffuse model to Burley, which I personally prefer over Lambert.
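
For reference, the Burley (Disney) diffuse term as popularised by the Unreal 4 course notes looks roughly like this. This is only a sketch for illustration; the exact formulation and names in the repo's shaders may differ:

[code]// Burley (Disney) diffuse, roughly as given in the UE4 course notes.
// diffColor: albedo, roughness: perceptual roughness,
// nDotV / nDotL / vDotH: the usual dot products (names are illustrative).
float3 BurleyDiffuse(float3 diffColor, float roughness, float nDotV, float nDotL, float vDotH)
{
    float fd90 = 0.5 + 2.0 * vDotH * vDotH * roughness;
    float fdV = 1.0 + (fd90 - 1.0) * pow(1.0 - nDotV, 5.0);
    float fdL = 1.0 + (fd90 - 1.0) * pow(1.0 - nDotL, 5.0);
    return diffColor * ((1.0 / 3.14159265) * fdV * fdL);
}[/code]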

Results:
[video]https://youtu.be/6qfIHsG_MSU[/video]

Images:



How to use
When you create a material, you will need to use one of the PBR render techniques located in CoreData/Techniques/PBR.

If the selected technique uses diffuse, you will need to input an albedo texture into the diffuse channel. Albedo is similar to diffuse, except it does not contain any lighting data.
If the selected technique uses normal, you will need to input a traditional normal map into the normal channel.
If the selected technique uses metallic/roughness, you will need to input a PBR properties map into the specular channel. The PBR properties map contains roughness in the red color channel and metallic in the green color channel.
If the selected technique uses emissive, you will need to input an emissive texture into the emissive channel.

For additional control over the material you can add Roughness and Metallic as material parameters. These values let you adjust the PBR properties map, or act as defaults for materials that do not have one.
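
As a rough illustration of how these inputs combine in a pixel shader (a minimal sketch; the actual sampler, varying, and uniform names in the repo's techniques may differ):

[code]// Hypothetical example: sample the PBR properties map bound to the specular
// unit and scale it by the Roughness/Metallic material parameters.
float4 props = Sample2D(SpecMap, iTexCoord.xy);
float roughness = props.r * cRoughness;   // roughness stored in the red channel
float metallic  = props.g * cMetallic;    // metallic stored in the green channel[/code]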

Effect
Both the Roughness and Metallic values affect the overall look of the material. The effect each value has on the material is shown in the images below. These images were taken from the Material Test scene included in the PBR repository.

Roughness:

Metallic:

Area Lighting is still in development, although sphere lights are currently supported. The image below shows sphere lights in the Material Test scene.

Sphere Light

Download

Currently the download will not work with the OpenGL deferred renderer due to a known issue (no fix yet).
https://github.com/dragonCASTjosh/Urho3D


Great work! Thanks for sharing this.

Thanks for the support :slight_smile:

I wanted to implement the IBL system demonstrated in the Unreal 4 PBR notes, but it caused massive performance problems for my 970. You can enable it in the repo by changing iblColor in LitSolid.hlsl to float3 iblColor = ApproximateSpecularIBL(specColor, roughness, normal, -toCamera) and the number of samples in the Lighting shader to 1024:

[code]// Pre-filters the environment cubemap for a given roughness by importance
// sampling the GGX distribution (the split-sum approach from the UE4 notes).
float3 PrefilterEnvMap(float Roughness, float3 R)
{
    // Assume the normal and view direction are both equal to the reflection vector
    float3 N = R;
    float3 V = R;

    float3 PrefilteredColor = 0;
    const uint NumSamples = 1024;

    // Small starting weight avoids a division by zero if no samples contribute
    float TotalWeight = 0.0000001f;

    for (uint i = 0; i < NumSamples; i++)
    {
        // Quasi-random sample mapped to a GGX-importance-sampled half vector
        float2 Xi = Hammersley(i * 2 + 0, NumSamples);
        float3 H = ImportanceSampleGGX(Xi, Roughness, N);
        // Reflect the view direction about the half vector to get the light direction
        float3 L = 2 * dot(V, H) * H - V;
        float NoL = saturate(dot(N, L));
        if (NoL > 0)
        {
            // Weight each cubemap sample by N.L
            PrefilteredColor += SampleCubeLOD(ZoneCubeMap, float4(L, 0)).rgb * NoL;
            TotalWeight += NoL;
        }
    }
    return PrefilteredColor / TotalWeight;
}[/code]
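
For context, PrefilterEnvMap depends on the Hammersley and ImportanceSampleGGX helpers. The GGX importance sampling function from the UE4 course notes looks roughly like this (the version in the repo's Lighting shader may differ slightly):

[code]// Generates a half-vector H distributed according to the GGX NDF for the
// given roughness, oriented around the normal N. Xi is a 2D quasi-random
// sample in [0,1)^2, e.g. from a Hammersley sequence.
float3 ImportanceSampleGGX(float2 Xi, float Roughness, float3 N)
{
    float a = Roughness * Roughness;
    float Phi = 2.0 * 3.14159265 * Xi.x;
    float CosTheta = sqrt((1.0 - Xi.y) / (1.0 + (a * a - 1.0) * Xi.y));
    float SinTheta = sqrt(1.0 - CosTheta * CosTheta);

    // Spherical to Cartesian (tangent space)
    float3 H;
    H.x = SinTheta * cos(Phi);
    H.y = SinTheta * sin(Phi);
    H.z = CosTheta;

    // Build a tangent basis around N and transform H into world space
    float3 UpVector = abs(N.z) < 0.999 ? float3(0, 0, 1) : float3(1, 0, 0);
    float3 TangentX = normalize(cross(UpVector, N));
    float3 TangentY = cross(N, TangentX);
    return TangentX * H.x + TangentY * H.y + N * H.z;
}[/code]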

Anybody is welcome to make a PR to the repo, especially if it helps with the performance of the Unreal IBL path, as it looks a lot better.

Congratulations to all the parties involved! I hope to be able to use PBR on mobile/Linux some day.

OpenGL support should be done within a few days

[quote=“dragonCASTjosh”]OpenGL support should be done within a few days[/quote]

I am looking forward to it. Thanks for sharing it.

EDIT
I just cloned the repo and I have a few questions:
[ul][li] It seems that there are no changes whatsoever to the Urho3D engine code. Is that really true?[/li]
[li] I understand that Sinoid has transferred the copyright to you for all his work (IANAL, so sorry if I use a wrong term here). My question is: under which license are you now releasing your combined work? Please say MIT, or even better the Urho3D license (MIT) :wink:. Using the Urho3D license makes it easier for us to pull the good bits over into the upstream Urho3D project, as there should then be no copyright issue in copying the code over pro-actively.[/li][/ul]

Yes, it’s only shaders. You can see it is a fork of the original repo, github.com/souxiaosou/UrhoPBRCoreData. It contains detailed comments, like:

[code]/// Smith GGX Visibility
/// nDotL: dot-prod of surface normal and light direction
/// nDotV: dot-prod of surface normal and view direction
/// roughness: surface roughness
float SmithGGXVisibility(in float nDotL, in float nDotV, in float roughness)
{
    float rough2 = roughness * roughness;
    float gSmithV = nDotV + sqrt(nDotV * (nDotV - nDotV * rough2) + rough2);
    float gSmithL = nDotL + sqrt(nDotL * (nDotL - nDotL * rough2) + rough2);
    return 1.0 / (gSmithV * gSmithL);
}[/code]
but it is obsolete, I think. It is a pity that these comments have been removed.

Just previewing the test scene, it seems to miss a bit of the punch of other PBR implementations. I couldn’t guess why from the short test; it could be as simple as a poor environment map choice, or maybe something more. In my previous attempt at PBR I realised it’s important to have simple tests and comparisons.

I can tell a lot of work has been put into this though, nice work so far!

Most of the work is in the shaders, but a complete PBR implementation will need to touch on the engine source to at least provide some parameters for area lights.

[quote=“weitjong”]

I just cloned the repo and I have a few questions:
[ul][li] It seems that there are no changes whatsoever to the Urho3D engine code. Is that really true?[/li]
[li] I understand that Sinoid has transferred the copyright to you for all his work (IANAL, so sorry if I use a wrong term here). My question is: under which license are you now releasing your combined work? Please say MIT, or even better the Urho3D license (MIT) :wink:. Using the Urho3D license makes it easier for us to pull the good bits over into the upstream Urho3D project, as there should then be no copyright issue in copying the code over pro-actively.[/li][/ul][/quote]

Currently there are no changes to the Urho3D engine code, and it's possible to get away without any, but to include features like parallax correction for cubemaps and area lighting there will need to be small changes to the engine.
As for the licence, Sinoid provided me the source under MIT, as is visible in the licence image found under the shader dir (it is only there temporarily). I'm not sure if MIT would allow me to change the licence to the Urho3D one, but if it does I will change it.

The lack of comments is a temporary thing; I was planning on moving PBR to its own shader file and re-documenting it.

The issue likely lies within the IBL solution being used. Currently I am using the Unreal 4 mobile IBL (the same as Sinoid implemented). I tried implementing a more advanced IBL and it produced much better results, although its performance is currently horrible.
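
For anyone curious what the "Unreal 4 mobile IBL" refers to: the mobile path approximates the environment BRDF analytically instead of using a lookup texture, roughly as below. This is shown for context only; the repo's version may not be identical:

[code]// Analytic approximation of the environment BRDF from the UE4 mobile notes.
// Stands in for the BRDF lookup texture of the full split-sum approach.
float3 EnvBRDFApprox(float3 specColor, float roughness, float nDotV)
{
    const float4 c0 = float4(-1.0, -0.0275, -0.572, 0.022);
    const float4 c1 = float4(1.0, 0.0425, 1.04, -0.04);
    float4 r = roughness * c0 + c1;
    float a004 = min(r.x * r.x, exp2(-9.28 * nDotV)) * r.x + r.y;
    float2 AB = float2(-1.04, 1.04) * a004 + r.zw;
    return specColor * AB.x + AB.y;
}[/code]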

Comparison

[spoiler]Current Rough Metallic

More advanced IBL

Current Rough Non-Metallic

More advanced IBL

[/spoiler]

[quote=“dragonCASTjosh”]Currently there are no changes to the Urho3D engine code, and it's possible to get away without any, but to include features like parallax correction for cubemaps and area lighting there will need to be small changes to the engine.
As for the licence, Sinoid provided me the source under MIT, as is visible in the licence image found under the shader dir (it is only there temporarily). I'm not sure if MIT would allow me to change the licence to the Urho3D one, but if it does I will change it.[/quote]

Thanks for the prompt reply. Again, IANAL. The Urho3D license is the MIT license; the only difference is that it declares the work/material/code is copyrighted by the Urho3D project. Anyone submitting a PR to us must have read and agreed to the T&C to release it under the MIT license with the copyright statement “Copyright © 2008-2016 the Urho3D project” (see urho3d.github.io/documentation/H … klist.html). Basically, what I am asking is whether we have your permission to “copy” the combined work into the Urho3D project so that it adheres to the above term. We do not steal other people's code :wink: and claim it as ours. Alternatively, we can just wait passively until you send us a PR when/if you decide to merge your work into the upstream Urho3D project.

Looking great. :slight_smile:

[quote=“weitjong”][quote=“dragonCASTjosh”]Currently there are no changes to the Urho3D engine code, and it's possible to get away without any, but to include features like parallax correction for cubemaps and area lighting there will need to be small changes to the engine.
As for the licence, Sinoid provided me the source under MIT, as is visible in the licence image found under the shader dir (it is only there temporarily). I'm not sure if MIT would allow me to change the licence to the Urho3D one, but if it does I will change it.[/quote]

Thanks for the prompt reply. Again, IANAL. The Urho3D license is the MIT license; the only difference is that it declares the work/material/code is copyrighted by the Urho3D project. Anyone submitting a PR to us must have read and agreed to the T&C to release it under the MIT license with the copyright statement “Copyright © 2008-2016 the Urho3D project” (see urho3d.github.io/documentation/H … klist.html). Basically, what I am asking is whether we have your permission to “copy” the combined work into the Urho3D project so that it adheres to the above term. We do not steal other people's code :wink: and claim it as ours. Alternatively, we can just wait passively until you send us a PR when/if you decide to merge your work into the upstream Urho3D project.[/quote]

You're free to implement it into Urho3D. I was going to make a pull request once I got the OpenGL version done and cleaned up the code a little, but you are free to merge it when you feel it's ready :slight_smile:

Thanks in advance for that. I am not necessarily saying I am the best person to do it, nor am I stating the readiness of the combined work, but I think everyone here will definitely be happy to see it merged eventually.

I updated the repo to support OpenGL Forward Rendering, so now everyone should be able to access it :slight_smile:. As for the GL deferred renderer, I ran into some issues with the YCoCg encoding that Sinoid originally implemented as a way to boost performance by compressing the albedo and specular into a single RGBA color buffer. If anyone wants to help with fixing it, everything should be on the repo; if I can't find a fix within the next few weeks I may look into removing the functionality from all renderers and adding extra buffers for what is needed.
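
For anyone who wants to dig into the deferred path, the RGB to YCoCg transform itself is straightforward; the tricky part is how the resulting channels are packed into and reconstructed from the shared RGBA buffer. A minimal sketch of the standard transform (the repo's actual packing code may differ):

[code]// Standard RGB <-> YCoCg transform (sketch only; how the channels are packed
// into the shared G-buffer is a separate matter).
float3 RGBToYCoCg(float3 rgb)
{
    float Y  = dot(rgb, float3( 0.25, 0.5,  0.25));
    float Co = dot(rgb, float3( 0.5,  0.0, -0.5));
    float Cg = dot(rgb, float3(-0.25, 0.5, -0.25));
    return float3(Y, Co, Cg);
}

float3 YCoCgToRGB(float3 yCoCg)
{
    float Y = yCoCg.x;
    float Co = yCoCg.y;
    float Cg = yCoCg.z;
    return float3(Y + Co - Cg, Y + Cg, Y - Co - Cg);
}[/code]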

Made some advancements today. Nothing public yet, but I wanted to provide a short development screenshot of area lighting. In my private build area lighting is only supported on point lights, although I am looking to branch it out to other light sources and even one-up Unreal and include non-spherical area lights :slight_smile:


Awesome. Will definitely look at the OpenGL implementation once I have concluded the SDL 2.0.4 upgrade.

@weitjong How do you think it's best to implement the front end of area lighting? In my head I currently have either new light types or multiple new options on the existing lights.

Hm I just read this thread and was thinking about the environment maps and also about mirrors for a while. I also found this lecture about the topics: inf.ed.ac.uk/teaching/course … 9_2013.pdf
The environment maps used in the samples here (yeah just samples but still) look pretty fake as they don’t show the actual scene at all.
I remember the Half-Life 2 engine from Garry's Mod: they did environment maps by pre-rendering images in the level editor at various coordinates scattered throughout the map (a 3D raster). This looked pretty bad as well, because the maps jumped when an object moved around and came closer to a different map, and the maps did not reflect the actual scene at all (dynamic objects/lights).

Raytracing is still way too slow, but what about this:
Could a camera with a FOV of 90 be used to dynamically create cubemaps used as environment maps? Could there be a wandering camera that calculates a (new) cubemap for one object every six frames (one cubemap side per frame, so if you have 10 objects with an environment map, each one gets updated every 60 frames)? Could that special camera render all 6 sides in one “frame” (sub-frame if you want)?

Or could some mirror technique (like the Urho water sample) be used to get environment data?

Creating dynamic cubemaps without 6 rendertextures per cubemap would become very costly very quickly, and after filtering the cubemap you would find that for the majority of materials it was not worth the cost. A better solution would be SSR, where you do a screen-space raytrace to capture the reflections. There are multiple methods of doing SSR, but my preferred result is the one used by Frostbite 3, as it does not create a noisy texture like UE4 does under certain scenarios.

Results: http://www.frostbite.com/wp-content/uploads/2015/08/Stochastic-Screen-Space-Reflections.mp4
Paper: http://www.frostbite.com/2015/08/stochastic-screen-space-reflections/
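
To give an idea of what a screen-space raytrace involves, here is a heavily simplified sketch of a depth-buffer ray march. All names (uniforms, samplers, helpers) are illustrative assumptions, not code from the repo; real implementations such as Frostbite's stochastic SSR add jittered rays, hierarchical tracing and temporal filtering on top of this:

[code]// Very rough SSR sketch: march a reflected ray in view space and project each
// step into screen space to compare against the depth buffer.
// Assumes a linear view-space depth texture, a scene colour texture and a
// projection matrix uniform are available (all of these names are hypothetical).
float3 TraceScreenSpaceReflection(float3 viewPos, float3 viewNormal, float3 viewDir)
{
    float3 rayDir = normalize(reflect(viewDir, viewNormal));
    float3 rayPos = viewPos;
    const int maxSteps = 64;
    const float stepSize = 0.1;

    for (int i = 0; i < maxSteps; ++i)
    {
        rayPos += rayDir * stepSize;

        // Project the current ray position into screen space
        float4 clipPos = mul(float4(rayPos, 1.0), cProjMatrix);
        float2 uv = clipPos.xy / clipPos.w * float2(0.5, -0.5) + 0.5;
        if (uv.x < 0.0 || uv.x > 1.0 || uv.y < 0.0 || uv.y > 1.0)
            break; // ray left the screen

        // Compare the ray depth against the stored scene depth
        float sceneDepth = Sample2DLod0(DepthBuffer, uv).r;
        if (rayPos.z > sceneDepth)
            return Sample2DLod0(SceneColor, uv).rgb; // treat as a hit
    }
    return float3(0.0, 0.0, 0.0); // no hit; fall back to the environment map
}[/code]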

[quote=“dragonCASTjosh”]Creating dynamic cubemaps without 6 rendertextures per cubemap would become very costly very quickly, and after filtering the cubemap you would find that for the majority of materials it was not worth the cost. A better solution would be SSR, where you do a screen-space raytrace to capture the reflections. There are multiple methods of doing SSR, but my preferred result is the one used by Frostbite 3, as it does not create a noisy texture like UE4 does under certain scenarios.

Results: http://www.frostbite.com/wp-content/uploads/2015/08/Stochastic-Screen-Space-Reflections.mp4
Paper: http://www.frostbite.com/2015/08/stochastic-screen-space-reflections/[/quote]
:open_mouth:
Wow that’s awesome! I want that! :smiley:
I have no idea how to do that. :frowning:

I get the ideas behind that but how to do the raytracing? Is it done on the CPU? Or is the whole scene somehow moved to the graphics card to calculate there?
Are mirrors in principle also done like that?
In some games there is glass that partly mirrors, is partly transparent, and partly “bends” the viewing angle somehow. Is the bending similar?
I had no idea that such a thing is possible.

Edit: Added to wishlist github.com/urho3d/Urho3D/wiki/urho3d-wishlist

Edit2: Ah it seems to be not a real ray trace but a “depth buffer scan”: casual-effects.blogspot.de/2014/ … acing.html
"…Games march 3D rays across the height field defined by a depth buffer to create very approximate screen-space reflections…"