Ogre 2.x has an issue with its documentation. Most cool features are presented to the user only through the Samples (or not presented at all, assuming there are more features than samples).
I love learning through example code. But examples alone don't answer the more complex questions about how features interact and what their full feature set is.
This leaves the user feeling a bit lost about how and why to use certain features, because one is forced to experiment (and fail) a lot in order to learn.
Here are some questions I've asked myself during the time I've used Ogre 2.x:
- When should I use Instant Radiosity and when Voxel Cone Tracing (VCT)?
What are the pros and cons of each?
Should I use Instant Radiosity at all, given that it is the older technique?
- What is the difference between IrradianceVolume and IrradianceField? Why do both exist?
Is it that the first uses one sample per direction (six in total) while the other uses small cubemaps?
- Instant Radiosity can be used either with an IrradianceVolume or with VPLs alone. When should one choose which?
- VCT can be combined with an IrradianceField or used on its own. What are the pros and cons of each mode?
- Local Cubemaps have the nice feature that diffuse GI is calculated from the cubemap.
This feature is never explained; I discovered it by accident through the documentation of SceneManager::setAmbientLight().
Local Cubemaps can even capture the Skybox and thus allow working around the lack of officially supported Image Based Lighting.
Can VCT or Instant Radiosity somehow use the Sky or a Local Cubemap as the source of the rays cast into the scene, to simulate what Blender does with Eevee?
This is a very effective feature for realistic lighting in a scene.
I noticed this because my scene never really looked like it belonged in my skybox. Only with Local Cubemaps did the ambience seem realistic, because the color tones finally match.
- Local Cubemaps and VCT seem to be mutually exclusive. Is that actually the case?
- Why does SSAO apply to the whole image and not just the ambient term?
Yes, I know Forward Rendering makes this harder, as SSAO is easier with Deferred.
But it is still a limitation that should be communicated, because SSAO looks weird when applied to diffuse lighting.
- Should there be a feature to import Eevee Irradiance Volumes from Blender, so lighting can be baked for a scene externally?
Those are cubemaps as well, so some compatibility should be possible. But I agree this could probably be done by an external tool or a Blender Python script.
- Is VCT or Instant Radiosity usable for moving lights, e.g. to simulate a sun that moves slowly over the course of a day?
Are GPU-based GI solutions really the best way to go, or should GI be calculated asynchronously in a CPU thread?
I imagine that slowly moving lights don't need a high update rate.
- When should PccPerPixelGridPlacement be used? Aren't manually placed probes more accurate most of the time?
Sample_PccPerPixelGridPlacement feels very basic, as it doesn't really explain why this feature is good and when it should be used.
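To make the setAmbientLight() point above concrete, here is a minimal sketch of how I understand the hemisphere ambient setup in Ogre 2.x. The signature is written from memory of OgreSceneManager.h and may differ between 2.x versions, and the colour values and the `sceneManager` pointer are placeholders, so please treat this as an illustration rather than a reference:

```cpp
// Sketch (assumed API, verify against OgreSceneManager.h):
// hemisphere ambient lighting, which the Pbs implementation can
// combine with a bound cubemap for diffuse ambient.
#include <OgreSceneManager.h>
#include <OgreColourValue.h>
#include <OgreVector3.h>

void setupAmbient( Ogre::SceneManager *sceneManager )
{
    sceneManager->setAmbientLight(
        Ogre::ColourValue( 0.3f, 0.5f, 0.7f ) * 0.05f, // upper hemisphere (sky tint)
        Ogre::ColourValue( 0.6f, 0.45f, 0.3f ) * 0.05f,// lower hemisphere (ground bounce)
        Ogre::Vector3::UNIT_Y,                         // hemisphere direction
        1.0f );                                        // envmap scale for the cubemap term
}
```

If this is indeed where the cubemap-based diffuse GI is hooked in, a short paragraph in the manual explaining these parameters and their interaction with Local Cubemaps would already answer most of my question.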
I just have the feeling that I have to read a lot of the Pbs shader code to understand which features work in harmony with which others.
An illustrated manual for lighting would help a lot.
Maybe I'm missing something. I've also checked the 2.3 manual, but reading through class documentation doesn't really give a feeling for the engine. I hope this text doesn't come across as toxic; English is not my first language.