OpenGL 3+ RenderSystem
-
masterfalcon
- OGRE Retired Team Member

- Posts: 4270
- Joined: Sun Feb 25, 2007 4:56 am
- Location: Bloomington, MN
- x 126
Re: OpenGL 3+ RenderSystem
That's somewhat correct. UBOs correspond, mostly, to DX10/11 constant buffers. You can allocate a buffer, map it and update uniforms as needed, or just reference a buffer whose values are to be reused. They really shine when used with uniform blocks, which group uniforms together. These buffers can also be shared between multiple shader programs.
The only implementation detail that I haven't ironed out yet and would like some feedback on is whether we should use them for ALL uniforms or only those referenced in a shared parameter block in the material/program definition.
If it were the latter, I would probably go ahead and enforce a requirement that shared parameters in GLSL be implemented within a block.
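To make that concrete, here is a minimal sketch (not Ogre's actual implementation; the block and member names are invented) of a GLSL uniform block that a UBO could back:

Code: Select all
// A std140 uniform block; the same buffer can be bound to
// this block in several different shader programs.
layout(std140) uniform SharedParams
{
    mat4 viewProjMatrix;
    vec4 ambientColour;
};

On the application side you would create the buffer with glBufferData(GL_UNIFORM_BUFFER, ...), attach it to a binding point with glBindBufferBase, and point each program's block at that binding point with glUniformBlockBinding.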
-
sparkprime
- Ogre Magi
- Posts: 1137
- Joined: Mon May 07, 2007 3:43 am
- Location: Ossining, New York
- x 13
Re: OpenGL 3+ RenderSystem
So can you use multiple uniform buffers in a given program binding? That would allow having scene parameters that are updated only once.
-
masterfalcon
- OGRE Retired Team Member

- Posts: 4270
- Joined: Sun Feb 25, 2007 4:56 am
- Location: Bloomington, MN
- x 126
Re: OpenGL 3+ RenderSystem
Yup, that's right.
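For instance (a hypothetical sketch, block names invented), a shader could declare one block for per-scene data and another for per-object data, each on its own binding point, so the scene buffer only needs updating once per frame:

Code: Select all
layout(std140) uniform SceneParams   // updated once per frame
{
    mat4 viewMatrix;
    mat4 projMatrix;
    vec4 fogParams;
};

layout(std140) uniform ObjectParams  // updated per draw
{
    mat4 worldMatrix;
};

Each block is assigned its own binding point with glUniformBlockBinding, and a different UBO is attached to each binding point with glBindBufferBase.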
-
TheSHEEEP
- OGRE Retired Team Member

- Posts: 972
- Joined: Mon Jun 02, 2008 6:52 pm
- Location: Berlin
- x 65
Re: OpenGL 3+ RenderSystem
No love for CG? 
How come it does not work on OSX? I thought CG was platform independent.
-
masterfalcon
- OGRE Retired Team Member

- Posts: 4270
- Joined: Sun Feb 25, 2007 4:56 am
- Location: Bloomington, MN
- x 126
Re: OpenGL 3+ RenderSystem
Well, none of the old ARB program types are supported in the core profile. There could be something that we could do but I haven't looked into it at all yet.
-
sparkprime
- Ogre Magi
- Posts: 1137
- Joined: Mon May 07, 2007 3:43 am
- Location: Ossining, New York
- x 13
Re: OpenGL 3+ RenderSystem
I have been telling Linux users with ATI cards to go away and come back with an nvidia card for about 3 years now because of this.
-
TheSHEEEP
- OGRE Retired Team Member

- Posts: 972
- Joined: Mon Jun 02, 2008 6:52 pm
- Location: Berlin
- x 65
Re: OpenGL 3+ RenderSystem
Hmm, that kinda defeats the purpose of using CG in the first place. Might as well write all shaders in HLSL and GLSL from the start.
-
masterfalcon
- OGRE Retired Team Member

- Posts: 4270
- Joined: Sun Feb 25, 2007 4:56 am
- Location: Bloomington, MN
- x 126
Re: OpenGL 3+ RenderSystem
I plan on looking into Cg support but I just haven't yet.
-
sparkprime
- Ogre Magi
- Posts: 1137
- Joined: Mon May 07, 2007 3:43 am
- Location: Ossining, New York
- x 13
Re: OpenGL 3+ RenderSystem
TheSHEEEP wrote: Hmm, that kinda defeats the purpose of using CG in the first place. Might as well write all shaders in HLSL and GLSL from the start.
At the moment, you can write CG and get:
D3D9 (all cards)
GL (nvidia)
Which is enough to cover the vast majority of users with a single shading language. With GLSL you'd get
GL (all cards)
and with HLSL you'd get
D3D9 (all cards)
Together they'd cover everything but on their own, they each are less useful than CG.
note: you can generate arbvp1 arbfp1 with CG but the language is too limited for a modern engine.
-
TheSHEEEP
- OGRE Retired Team Member

- Posts: 972
- Joined: Mon Jun 02, 2008 6:52 pm
- Location: Berlin
- x 65
Re: OpenGL 3+ RenderSystem
Even if CG covers more ground, you'll end up having to write shaders in two languages, as nobody will seriously want to ignore non-nvidia users: CG + GLSL.
And if you are going to do two languages anyway, it IMHO makes more sense to do GLSL and HLSL right away. All shader syntaxes are really similar, so it's not that much work. Still annoying, of course.
masterfalcon wrote: I plan on looking into Cg support but I just haven't yet.
No need to hurry, this is really not your fault.
Also, what about CG and DirectX11+ and GL3+ features? In the pretty near future, D3D9 will be a very rare thing to see (good riddance!), as well as older GL versions. And I suppose CG doesn't cover this "next-gen" ground, correct?
To me all of this seems as if CG is pretty much dying. It might generally be a good idea to ditch it completely and focus on GLSL and HLSL, as those two are certainly here to stay. Sure, it will mean you have to support both on a multi-platform release, but you will also only have to support those two, and never more.
Why waste resources on something like that? I think Ogre should support the upcoming feature sets and not go retro as much as possible.
Maybe I'm wrong though and CG support will generally see a rise in popularity from hardware and software developers, but I seriously doubt it.
-
sparkprime
- Ogre Magi
- Posts: 1137
- Joined: Mon May 07, 2007 3:43 am
- Location: Ossining, New York
- x 13
Re: OpenGL 3+ RenderSystem
Actually, it is Ogre's fault. It should support compiling Cg to GLSL, but both Cg and GLSL are treated by Ogre as 'high level' languages, and the architecture is designed for high level languages to compile to low level languages, which are basically the ASMs. But the world has moved on and this isn't a sensible model anymore.
-
sparkprime
- Ogre Magi
- Posts: 1137
- Joined: Mon May 07, 2007 3:43 am
- Location: Ossining, New York
- x 13
Re: OpenGL 3+ RenderSystem
When I used to write HLSL and GLSL, it wasn't too bad because I used macros and some code generation to let the same shader code compile in both languages. However, HLSL took forever to compile, several seconds per shader. If you have a lot of shaders, as I do, that makes development too slow. CG compiles them (to sm3) instantly.
I don't think CG is going to get left behind. NVidia are really keen on it. They can just keep adding new targets as the render systems allow more things at shading time.
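A rough sketch of the macro approach described above (the macro and file names here are invented, and a real setup usually also needs per-language code generation for things like varyings and samplers):

Code: Select all
// common_defs.h - prepended to every shader before compilation
#ifdef TARGET_HLSL
    #define vec2 float2
    #define vec3 float3
    #define vec4 float4
    #define mat4 float4x4
    #define MUL(m, v) mul(m, v)
#else  // GLSL
    #define MUL(m, v) ((m) * (v))
#endif

// Shader code is then written once against the macros:
vec4 transformPosition(mat4 worldViewProj, vec4 position)
{
    return MUL(worldViewProj, position);
}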
-
CABAListic
- OGRE Retired Team Member

- Posts: 2903
- Joined: Thu Jan 18, 2007 2:48 pm
- x 58
Re: OpenGL 3+ RenderSystem
Actually, I did write a patch once to allow the CgProgramManager to compile to GLSL and use the output in Ogre. It was a little hacky (because as you said, the current architecture doesn't really account for this), but worked in theory. Except that the GLSL output was unusable, because apparently it makes different assumptions about coordinate systems and/or matrix ordering than the Ogre GL rendersystem does. I didn't get it to work, so I abandoned it.
Cg to HLSL, on the other hand, worked flawlessly, but is significantly less interesting, imho.
Also, another user here in the forum recently pointed me to MojoShader, which could perhaps be turned into a plugin that, when loaded, allows seamless use of HLSL shaders with GL. Could be worth a try, at least.
-
sparkprime
- Ogre Magi
- Posts: 1137
- Joined: Mon May 07, 2007 3:43 am
- Location: Ossining, New York
- x 13
Re: OpenGL 3+ RenderSystem
It'd be a very interesting research project to design a brand new shading language that fully integrates with Ogre. Obviously there would be no issues with left/right hand coordinate systems and all that junk. But also things like multiple passes, compositors, deferred shading, easy hook-up with scene / material / object properties, would be much cleaner. But it's a big project and maintaining it would be fairly tricky too. And that's coming from someone with a PhD in programming language design and with experience in compiling high level languages to CUDA 
-
CABAListic
- OGRE Retired Team Member

- Posts: 2903
- Joined: Thu Jan 18, 2007 2:48 pm
- x 58
Re: OpenGL 3+ RenderSystem
Yeah, I'm afraid that's very unlikely to happen 
-
sparkprime
- Ogre Magi
- Posts: 1137
- Joined: Mon May 07, 2007 3:43 am
- Location: Ossining, New York
- x 13
Re: OpenGL 3+ RenderSystem
If only Cg were a nice open source library, we might consider simply making it the only shading language usable in Ogre. That would simplify things a lot.
-
TheSHEEEP
- OGRE Retired Team Member

- Posts: 972
- Joined: Mon Jun 02, 2008 6:52 pm
- Location: Berlin
- x 65
Re: OpenGL 3+ RenderSystem
Having to use only one shader for all platforms would be super awesome, indeed.
But this really gets kinda off-topic here, as CG concerns more than just the new OpenGL3+ RenderSystem.
-
sleo
- Gremlin
- Posts: 171
- Joined: Sun Jun 05, 2011 6:49 am
- Location: Vodka Federation
- x 18
Re: OpenGL 3+ RenderSystem
Wanted to test this render system, but I can't compile with the main branch; it seems HardwareConstantBuffer is present only in the GL3+ fork!?
Code: Select all
ogre-gl3plus\RenderSystems\GL3Plus\include\OgreGL3PlusDefaultHardwareBufferManager.h(36): fatal error C1083: Cannot open include file: 'OgreHardwareConstantBuffer.h': No such file or directory
-
masterfalcon
- OGRE Retired Team Member

- Posts: 4270
- Joined: Sun Feb 25, 2007 4:56 am
- Location: Bloomington, MN
- x 126
Re: OpenGL 3+ RenderSystem
That's correct. It only exists in the fork. It's actually going to be removed soon.
-
masterfalcon
- OGRE Retired Team Member

- Posts: 4270
- Joined: Sun Feb 25, 2007 4:56 am
- Location: Bloomington, MN
- x 126
Re: OpenGL 3+ RenderSystem
BTW, for everyone who is following this thread: If there are any OpenGL 3.0-4.2 features that you would really like to see implemented in Ogre just mention them here.
-
PhilipLB
- Google Summer of Code Student

- Posts: 550
- Joined: Thu Jun 04, 2009 5:07 pm
- Location: Berlin
- x 108
Re: OpenGL 3+ RenderSystem
The DirectX 9 and current GL rendersystems are, as far as I can see, more or less equivalent feature-wise.
Maybe it would be desirable to have a similar situation with the GL 3+ RenderSystem and the DirectX 11 one?
Google Summer of Code 2012 Student
Topic: "Volume Rendering with LOD aimed at terrain"
Project links: Project thread, WIKI page, Code fork for the project
Mentor: Mattan Furst
Volume GFX, accepting donations.
-
sparkprime
- Ogre Magi
- Posts: 1137
- Joined: Mon May 07, 2007 3:43 am
- Location: Ossining, New York
- x 13
Re: OpenGL 3+ RenderSystem
Agreed. I need to support both windows and Linux with the same featureset in both, so anything that isn't in both I can't really use. Minor differences in the way features are exposed are OK but structural / performance differences aren't really OK.
-
Klaim
- Old One
- Posts: 2565
- Joined: Sun Sep 11, 2005 1:04 am
- Location: Paris, France
- x 56
Re: OpenGL 3+ RenderSystem
sparkprime wrote:Agreed. I need to support both windows and Linux with the same featureset in both, so anything that isn't in both I can't really use. Minor differences in the way features are exposed are OK but structural / performance differences aren't really OK.
Same here.
-
masterfalcon
- OGRE Retired Team Member

- Posts: 4270
- Joined: Sun Feb 25, 2007 4:56 am
- Location: Bloomington, MN
- x 126
Re: OpenGL 3+ RenderSystem
Oh absolutely. I wasn't suggesting that we break the feature parity between them, just asking which newer OpenGL features sound good to you guys. Feature parity may be broken short term until someone can fill in the DX portion, but otherwise the plan is to keep them in sync.
-
scrawl
- OGRE Expert User

- Posts: 1119
- Joined: Sat Jan 01, 2011 7:57 pm
- x 220
Re: OpenGL 3+ RenderSystem
Hm, did you forget to add a file to the repo?
Code: Select all
CMake Error at CMake/Utils/OgreAddTargets.cmake:100 (add_library):
Cannot find source file:
src/GLX/OgreFileSystemLayer.cpp
Tried extensions .c .C .c++ .cc .cpp .cxx .m .M .mm .h .hh .h++ .hm .hpp
.hxx .in .txx