OpenGL 3+ RenderSystem

Discussion area about developing or extending OGRE, adding plugins for it or building applications on it. No newbie questions please, use the Help forum for that.
AusSkiller
Gremlin
Posts: 158
Joined: Wed Nov 28, 2012 1:38 am
x 1

Re: OpenGL 3+ RenderSystem

Post by AusSkiller » Sun Jan 26, 2014 6:24 am

holocronweaver wrote:
AusSkiller wrote:Will it be easy to reference the multisample textures of one framebuffer as inputs to the shader for another using compositors?
Yes, using the output texture of one pass as the input texture of another is what compositors are all about! Most of my implementation time will be making sure multisample textures are accessible from the compositor.
Excellent, I'll have to have another go at using the compositors then. Hopefully it'll be a lot easier since I have all the shaders written and working this time, and there'll be less guesswork as to why something is not being displayed correctly or at all. Keep up the good work :)

scrawl
OGRE Expert User
Posts: 1119
Joined: Sat Jan 01, 2011 7:57 pm
x 2

Re: OpenGL 3+ RenderSystem

Post by scrawl » Tue Jan 28, 2014 6:07 pm

Just to let you know, I have upgraded to nvidia driver 331.20 and the logObjectInfo issue is indeed gone now.

Code:

nvidia-smi
Tue Jan 28 18:06:47 2014       
+------------------------------------------------------+                       
| NVIDIA-SMI 331.20     Driver Version: 331.20         |                       
|-------------------------------+----------------------+----------------------+
| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|===============================+======================+======================|
|   0  GeForce GTX 560 Ti  Off  | 0000:02:00.0     N/A |                  N/A |
| 40%   38C  N/A     N/A /  N/A |    203MiB /  1023MiB |     N/A      Default |
+-------------------------------+----------------------+----------------------+
                                                                               
+-----------------------------------------------------------------------------+
| Compute processes:                                               GPU Memory |
|  GPU       PID  Process name                                     Usage      |
|=============================================================================|
|    0            Not Supported                                               |
+-----------------------------------------------------------------------------+

stoyannk
Gnoblar
Posts: 2
Joined: Fri Feb 21, 2014 9:57 am

Re: OpenGL 3+ RenderSystem

Post by stoyannk » Fri Feb 21, 2014 11:02 am

Hi guys,

My company (http://coherent-labs.com) is building a demo project with Ogre. I'm integrating my voxel polygonization library (http://stoyannk.wordpress.com/voxels-library/) and stumbled upon some limitations of the OGL3Plus render system.

I have a couple of feature suggestions that are relatively easy to implement but would have a big impact on the render system. I'm using the 1.9 branch, which as I understand is the most stable.

1. There is a bug in the interpretation of IT_32BIT as GL_UNSIGNED_BYTE when using index buffers - it should be GL_UNSIGNED_INT (OgreGL3PlusRenderSystem.cpp:1856 and OgreGL3PlusRenderSystem.cpp:1840). It's fixed in the master branch.

2. There is no way to pass integer vertex attributes through the system as integers. glVertexAttribPointer is always used, which converts the integer attributes into floats. The solution I used, because we need this in our demo, is to add new Ogre attribute types that use glVertexAttribIPointer and thus allow passing integers to the vertex shader. You can see the changes in https://bitbucket.org/stoyannk/ogre-coh ... d8aae3ab35
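For the curious, the distinction in point 2 can be sketched as a small decision helper. This is a standalone sketch with stand-in enum names, not the actual Ogre code:

```cpp
#include <cassert>

// Stand-ins for the vertex element base types discussed above
// (hypothetical names, not the real Ogre enum).
enum VertexElementBaseType { BT_FLOAT, BT_INT, BT_UINT };

// glVertexAttribPointer converts integer attributes to float on the way
// to the vertex shader; glVertexAttribIPointer passes them through as
// int/uint. This helper captures which call a given base type needs when
// the shader declares the attribute as an integer.
bool needsIntegerAttribPointer(VertexElementBaseType base)
{
    switch (base)
    {
    case BT_INT:
    case BT_UINT:
        return true;  // use glVertexAttribIPointer
    default:
        return false; // use glVertexAttribPointer (converted to float)
    }
}
```

A renderer would consult a helper like this per vertex element when binding the vertex declaration, instead of adding new attribute types.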

All the changes we are making can be found on https://bitbucket.org/stoyannk/ogre-coherentlabs/

I'm unable to provide a full pull request for the feature due to lack of time and limited knowledge of the policies for adding features to render systems in Ogre. The forum seemed the better choice.
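For reference, the fix in point 1 boils down to this mapping. A standalone sketch using stand-in constants rather than the real GL headers (the values match glcorearb.h):

```cpp
#include <cassert>
#include <cstdint>

// Stand-ins so this compiles without GL headers.
const uint32_t GL_UNSIGNED_SHORT_ = 0x1403;
const uint32_t GL_UNSIGNED_INT_   = 0x1405;

enum IndexType { IT_16BIT, IT_32BIT }; // as in Ogre's HardwareIndexBuffer

// IT_32BIT must translate to GL_UNSIGNED_INT (not GL_UNSIGNED_BYTE)
// when the index type is passed to glDrawElements and friends.
uint32_t getGLIndexType(IndexType t)
{
    return (t == IT_32BIT) ? GL_UNSIGNED_INT_ : GL_UNSIGNED_SHORT_;
}
```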

User avatar
holocronweaver
Google Summer of Code Student
Posts: 273
Joined: Mon Oct 29, 2012 8:52 pm
Location: Princeton, NJ

Re: OpenGL 3+ RenderSystem

Post by holocronweaver » Wed Feb 26, 2014 9:51 pm

stoyannk wrote:I'm using the 1.9 branch which as I understand is the most stable.
Unless it has been backported, v1-9 does not include any of my work from the 2013 GSoC which fixed many, many bugs. I would recommend using the default branch if you want GL3+ bug fixes, and then switch to the 1.10 branch when it is released.
1. There is a bug in the interpretation of IT_32BIT as GL_UNSIGNED_BYTE when using index buffers - it should be GL_UNSIGNED_INT (OgreGL3PlusRenderSystem.cpp:1856 and OgreGL3PlusRenderSystem.cpp:1840). It's fixed in the master branch.
You mean the default branch? I fixed the IT_32BIT case you mention in the bleeding edge repo; the IT_16BIT case I spotted and fixed during last year's GSoC, so it should be in default.
2. There is no way to pass integer vertex attributes through the system as integers. glVertexAttribPointer is always used, which converts the integer attributes into floats. The solution I used, because we need this in our demo, is to add new Ogre attribute types that use glVertexAttribIPointer and thus allow passing integers to the vertex shader.
I overlooked this problem, thanks for spotting it! I think a new VET type is unnecessary - simply adding cases to use glVertexAttribIPointer for non-float base VET types should suffice. Unless I am missing something?

User avatar
masterfalcon
OGRE Team Member
Posts: 4270
Joined: Sun Feb 25, 2007 4:56 am
Location: Bloomington, MN
Contact:

Re: OpenGL 3+ RenderSystem

Post by masterfalcon » Thu Feb 27, 2014 5:58 am

2. There is no way to pass integer vertex attributes through the system as integers. glVertexAttribPointer is always used, which converts the integer attributes into floats. The solution I used, because we need this in our demo, is to add new Ogre attribute types that use glVertexAttribIPointer and thus allow passing integers to the vertex shader.
I overlooked this problem, thanks for spotting it! I think a new VET type is unnecessary - simply adding cases to use glVertexAttribIPointer for non-float base VET types should suffice. Unless I am missing something?
You're right, the INT and UINT VET types were added last year before GSoC but I never got around to hooking up everything else for it.

User avatar
holocronweaver
Google Summer of Code Student
Posts: 273
Joined: Mon Oct 29, 2012 8:52 pm
Location: Princeton, NJ

Re: OpenGL 3+ RenderSystem

Post by holocronweaver » Thu Feb 27, 2014 6:12 am

masterfalcon wrote:You're right, the INT and UINT VET types were added last year before GSoC but I never got around to hooking up everything else for it.
I have taken care of them in my repo. Looking to merge upstream within the next two weeks.

stoyannk
Gnoblar
Posts: 2
Joined: Fri Feb 21, 2014 9:57 am

Re: OpenGL 3+ RenderSystem

Post by stoyannk » Thu Feb 27, 2014 1:16 pm

holocronweaver wrote: I overlooked this problem, thanks for spotting it! I think a new VET type is unnecessary - simply adding cases to use glVertexAttribIPointer for non-float base VET types should suffice. Unless I am missing something?

You're right, the INT and UINT VET types were added last year before GSoC but I never got around to hooking up everything else for it.
Thank you very much for the swift response!
For the new VET types - it depends on how much flexibility you want to provide. The current behavior is still valid; it's just that INT and UINT attributes get converted to floats by OpenGL, and I suppose somebody might need that. That's why in my implementation I added new types - to cover all cases. I haven't checked how things are handled in your other render systems. If you want to keep the behavior consistent (which I believe you do) and on the other systems the attributes are not cast to float, then the new VETs are indeed unnecessary.

Ident
Gremlin
Posts: 155
Joined: Thu Sep 17, 2009 8:43 pm
Location: Austria
Contact:

Re: OpenGL 3+ RenderSystem

Post by Ident » Sun Mar 09, 2014 9:04 pm

Ever since I started using the Ogre GL3+ renderer in my application, I have noticed a rendering issue (the texture is just black, filled with (0,0,0,255) texels according to an OpenGL GPU debugger) when CEGUI renders to FBOs/RTTs. This bug doesn't appear in the same application when using the Ogre D3D9 renderer. Since we only access rendering indirectly via Ogre (except for providing HLSL and GLSL shaders), there must be some difference in the Ogre GL3+-internal FBO/RTT rendering that causes this. Specifically, rendering seems to fail on _only_ the first instance of any FBO ever being rendered. At least this is how it seems: when the window is re-rendered it works fine. I am not sure whether it is a CEGUI-related issue, but since it works with the Ogre D3D9 renderer I don't think it is. Has anything specifically changed in the behavior of FBOs in GL3+?

I should probably also try the old Ogre GL renderer. EDIT: I tried, and when using only that interface it works fine with FBOs in the GL renderer, but exactly the same setup with the GL3+ renderer again results in the first rendered FBO being black.

Btw I added changes that support the latest default branch version of Ogre to the CEGUI v0-8 repo.

TheSHEEEP
OGRE Retired Team Member
Posts: 972
Joined: Mon Jun 02, 2008 6:52 pm
Location: Berlin

Re: OpenGL 3+ RenderSystem

Post by TheSHEEEP » Sat Mar 22, 2014 10:30 am

I just discovered this:

http://blogs.nvidia.com/blog/2014/03/20/opengl-gdc2014/

Does the GL3+ system utilize the "gems" shown there somehow? Or is this even relevant to the render system?
I'm not really deep into low-level OpenGL stuff, but these videos and slides do seem helpful to those who are.
My site! - Have a look :)
Also on Twitter - extra fluffy

User avatar
masterfalcon
OGRE Team Member
Posts: 4270
Joined: Sun Feb 25, 2007 4:56 am
Location: Bloomington, MN
Contact:

Re: OpenGL 3+ RenderSystem

Post by masterfalcon » Sat Mar 22, 2014 4:41 pm

Not really yet but that's definitely something that I'd like to work toward.

User avatar
holocronweaver
Google Summer of Code Student
Posts: 273
Joined: Mon Oct 29, 2012 8:52 pm
Location: Princeton, NJ

Re: OpenGL 3+ RenderSystem

Post by holocronweaver » Sat Mar 22, 2014 4:51 pm

@Ident: I apologize for the long-delayed reply. To begin, are you using my GL3+ beta repo or the primary OGRE repo?
Ident wrote:Specifically, the rendering seems to fail on _only_ the first instance of any FBO ever being rendered.
Interesting. I will look into this on my own, but if you could provide a reproducible test case that would seriously speed things up.
I am not sure whether it is a CEGUI-related issue, but since it works with the Ogre D3D9 renderer I don't think it is. Has anything specifically changed in the behavior of FBOs in GL3+?
I am not sure as masterfalcon is the one who implemented it and I have barely touched it.
Btw i added changes that support the latest default branch version of Ogre to CEGUI v0-8 repo.
Great! :D

Ident
Gremlin
Posts: 155
Joined: Thu Sep 17, 2009 8:43 pm
Location: Austria
Contact:

Re: OpenGL 3+ RenderSystem

Post by Ident » Sat Mar 22, 2014 6:02 pm

I am not sure how to write a reproduction of this in a simple Ogre demo, because I am not sure what causes it in detail; I can only tell from my own observations. I assume it is too much to ask to expect you to build and set up CEGUI just for this bug. Basically, in CEGUI all that has to be done, afaik, is to create a FrameWindow (or better, two), both with autoRenderingSurface (that's the FBO) on. Then, if they are just displayed in their default state without anything being done to them that would request a redraw, one of them will not have its FBO texture correctly drawn (it's empty/black, as checked in the debugger). Currently I am extremely busy, otherwise I would simplify my project and send it to you :(

Syniurge
Halfling
Posts: 40
Joined: Thu Aug 30, 2012 1:43 pm
Location: France

Re: OpenGL 3+ RenderSystem

Post by Syniurge » Wed Apr 02, 2014 11:42 pm

Hi masterfalcon, holocronweaver and others,

I'd like to make the render system (finally) work on Mesa, which unlike nVidia's and AMD's proprietary OpenGL implementations doesn't expose symbols for extension functions:
./SampleBrowser: symbol lookup error: RenderSystem_GL3Plus.so.1.9.0: undefined symbol: glXCreateContextAttribsARB
Should I use GLEW or mimic the old GL rendersys's way of retrieving them?

edit: All things considered that was a rhetorical question - since the old GL plugin used GLEW to initialize the function pointers, I'll be back with a patch.

User avatar
holocronweaver
Google Summer of Code Student
Posts: 273
Joined: Mon Oct 29, 2012 8:52 pm
Location: Princeton, NJ

Re: OpenGL 3+ RenderSystem

Post by holocronweaver » Thu Apr 03, 2014 12:04 am

Are you using my cutting edge GL3+ repo? I am pretty sure this has already been fixed there, though the repo is in somewhat of a state of transition at the moment as I finalize the new window system.

Syniurge
Halfling
Posts: 40
Joined: Thu Aug 30, 2012 1:43 pm
Location: France

Re: OpenGL 3+ RenderSystem

Post by Syniurge » Thu Apr 03, 2014 9:16 pm

holocronweaver wrote:Are you using my cutting edge GL3+ repo? I am pretty sure this has already been fixed there, though the repo is in somewhat of a state of transition at the moment as I finalize the new window system.
Your repo is indeed working with Mesa, neat! When will your work be pushed into mainline?

There are still many issues on my Evergreen R600 laptop with Mesa built from git (Xorg Edgers):

  • Many rendering issues
  • (Most) characters are black squares
  • Quitting doesn't work, I have to kill the process
Here's the console output pastebin'd: http://paste.kde.org/pwppaxcwj

User avatar
holocronweaver
Google Summer of Code Student
Google Summer of Code Student
Posts: 273
Joined: Mon Oct 29, 2012 8:52 pm
Location: Princeton, NJ

Re: OpenGL 3+ RenderSystem

Post by holocronweaver » Thu Apr 03, 2014 11:12 pm

I am surprised it runs at all on Mesa. Very happy to hear it! :) Either today or tomorrow I will be pushing the new windowing system, which should fix at least the problem with Quit not quitting.

Crashy
Google Summer of Code Student
Posts: 988
Joined: Wed Jan 08, 2003 9:15 pm
Location: Lyon, France
x 12
Contact:

Re: OpenGL 3+ RenderSystem

Post by Crashy » Fri Oct 02, 2015 8:21 am

Hello !
After years of using the D3D Rendersystem, I just tried the GL3+ Rendersystem to see how it performs.
I'm porting my custom shaders and everything is ok so far.

But I'm trying to add support for hardware PCF, like in D3D11, which is now possible with GLSL 330+.
So, as specified in the various documentation I've found, I've implemented the functions related to texture comparison in the render system:

Code:

void GL3PlusRenderSystem::_setTextureUnitCompareFunction(size_t unit, CompareFunction function)
{
    if (!activateGLTextureUnit(unit))
        return;

    // glTexParameteri operates on the texture bound to the active unit;
    // glSamplerParameteri would expect a sampler object name here, not a
    // texture target like mTextureTypes[unit].
    OGRE_CHECK_GL_ERROR(glTexParameteri(mTextureTypes[unit], GL_TEXTURE_COMPARE_FUNC, convertCompareFunction(function)));

    activateGLTextureUnit(0);
}

void GL3PlusRenderSystem::_setTextureUnitCompareEnabled(size_t unit, bool compare)
{
    if (!activateGLTextureUnit(unit))
        return;

    mTextureCompareEnabled = compare;

    OGRE_CHECK_GL_ERROR(glTexParameteri(mTextureTypes[unit], GL_TEXTURE_COMPARE_MODE, compare ? GL_COMPARE_REF_TO_TEXTURE : GL_NONE));

    activateGLTextureUnit(0);
}
And changed my pixel shader shadow sampling code from this:

Code:

float shadow =	(shadowMapPos.z <= texture(shadowMap, shadowMapPos.xy ).r) ? 1.0 : 0.0;
to this:

Code:

float shadow = texture( shadowSampler, vec3(shadowMapPos.xy, shadowMapPos.z));
which should do the job, shadowSampler being a "sampler2DShadow", as required.

However, the result is always 0, so something must be wrong, but I have no clue, except that this may not work with a texture type other than GL_DEPTH. It's not clear, and I don't know whether it's possible to use PF_DEPTH as a format for shadow textures in Ogre. :?
Follow la Moustache on Twitter or on Facebook

User avatar
dark_sylinc
OGRE Team Member
OGRE Team Member
Posts: 4036
Joined: Sat Jul 21, 2007 4:55 pm
Location: Buenos Aires, Argentina
x 214
Contact:

Re: OpenGL 3+ RenderSystem

Post by dark_sylinc » Tue Oct 06, 2015 5:07 pm

Crashy wrote:However, the result is always 0, so something must be wrong, but I have no clue, except that this may not work with a texture type other than GL_DEPTH. It's not clear, and I don't know whether it's possible to use PF_DEPTH as a format for shadow textures in Ogre. :?
The GL specs leave using a sampler2DShadow on non-depth formats as undefined behavior. I've seen it work on AMD with a FLOAT32_R format, but I dunno about other vendors and other formats.
Depth texture support was added to Ogre in 2.1.

Btw, the comparison modes must be set after binding the textures, which is something you don't need to worry about in D3D11. If you set the modes first and then the texture, it's not going to work in GL.
Ogre 2.1 also got rid of that annoyance by using the newer GL sampler objects.

Crashy
Google Summer of Code Student
Posts: 988
Joined: Wed Jan 08, 2003 9:15 pm
Location: Lyon, France
x 12
Contact:

Re: OpenGL 3+ RenderSystem

Post by Crashy » Wed Oct 07, 2015 9:47 am

Thank you very much. I'll try to change a few things to see how it behaves.

Crashy
Google Summer of Code Student
Posts: 988
Joined: Wed Jan 08, 2003 9:15 pm
Location: Lyon, France
x 12
Contact:

Re: OpenGL 3+ RenderSystem

Post by Crashy » Thu Jan 26, 2017 11:09 am

Hi !
I've done some tests of my current project, which uses the GL3+ render system, on a computer fitted with an Intel HD 4000, and experienced a few errors during shader compilation because of this piece of code in OgreGLSLShader.cpp:

Code:

if (Root::getSingleton().getRenderSystem()->getCapabilities()->hasCapability(RSC_SEPARATE_SHADER_OBJECTS))
            {
                // Assume blocks are missing if gl_Position is missing.
                if (mSource.find("vec4 gl_Position") == String::npos)
                {
                    size_t mainPos = mSource.find("void main");
                    // Only add blocks if shader is not a child
                    // shader, i.e. has a main function.
                    if (mainPos != String::npos)
                    {
                        size_t versionPos = mSource.find("#version");
                        int shaderVersion = StringConverter::parseInt(mSource.substr(versionPos+9, 3));
                        if (shaderVersion >= 150)
                        {
                            size_t belowVersionPos = mSource.find("\n", versionPos) + 1;
                            switch (mType)
                            {
                            case GPT_VERTEX_PROGRAM:
                                mSource.insert(belowVersionPos, "out gl_PerVertex\n{\nvec4 gl_Position;\nfloat gl_PointSize;\nfloat gl_ClipDistance[];\n};\n\n");
                                break;
                            case GPT_GEOMETRY_PROGRAM:
                                mSource.insert(belowVersionPos, "out gl_PerVertex\n{\nvec4 gl_Position;\nfloat gl_PointSize;\nfloat gl_ClipDistance[];\n};\n\n");
                                mSource.insert(belowVersionPos, "in gl_PerVertex\n{\nvec4 gl_Position;\nfloat gl_PointSize;\nfloat gl_ClipDistance[];\n} gl_in[];\n\n");
                                break;
                            case GPT_DOMAIN_PROGRAM:
                                mSource.insert(belowVersionPos, "out gl_PerVertex\n{\nvec4 gl_Position;\nfloat gl_PointSize;\nfloat gl_ClipDistance[];\n};\n\n");
                                mSource.insert(belowVersionPos, "in gl_PerVertex\n{\nvec4 gl_Position;\nfloat gl_PointSize;\nfloat gl_ClipDistance[];\n} gl_in[];\n\n");
                                break;
                            case GPT_HULL_PROGRAM:
                                mSource.insert(belowVersionPos, "out gl_PerVertex\n{\nvec4 gl_Position;\nfloat gl_PointSize;\nfloat gl_ClipDistance[];\n} gl_out[];\n\n");
                                mSource.insert(belowVersionPos, "in gl_PerVertex\n{\nvec4 gl_Position;\nfloat gl_PointSize;\nfloat gl_ClipDistance[];\n} gl_in[];\n\n");
                                break;
                            case GPT_FRAGMENT_PROGRAM:
                            case GPT_COMPUTE_PROGRAM:
                                // Fragment and compute shaders do
                                // not have standard blocks.
                                break;
                            }
                        }
                    }
                }
            }
The Intel HD compiler doesn't at all like arrays without a specified size in programs other than geometry programs.
That said, if I comment out this code, everything works fine on both nVidia and Intel devices.

According to the GLSL documentation, the variables this code adds to the shader are built-in, so why are we doing this?

Thanks.
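For anyone following along, the effect of that snippet can be reproduced in isolation. A minimal standalone sketch of the injection for the vertex-shader case (hypothetical helper name, simplified from the Ogre code above):

```cpp
#include <cassert>
#include <string>

// Inserts the gl_PerVertex output redeclaration right below the #version
// line of a vertex shader, mirroring what OgreGLSLShader.cpp does when
// separate shader objects are in use. Returns the source unchanged for
// GLSL versions below 150 or when no #version line is found.
std::string injectPerVertexBlock(std::string source)
{
    const std::string block =
        "out gl_PerVertex\n{\nvec4 gl_Position;\nfloat gl_PointSize;\n"
        "float gl_ClipDistance[];\n};\n\n";
    size_t versionPos = source.find("#version");
    if (versionPos == std::string::npos)
        return source;
    int shaderVersion = std::stoi(source.substr(versionPos + 9, 3));
    if (shaderVersion < 150)
        return source;
    size_t belowVersionPos = source.find('\n', versionPos) + 1;
    source.insert(belowVersionPos, block);
    return source;
}
```

Note that the injected block contains the unsized gl_ClipDistance[] array, which appears to be exactly what the Intel compiler rejects outside geometry shaders.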

paroj
OGRE Team Member
Posts: 836
Joined: Sun Mar 30, 2014 2:51 pm
x 147
Contact:

Re: OpenGL 3+ RenderSystem

Post by paroj » Thu Jan 26, 2017 2:37 pm

They are not predefined when separate shader objects are used.

Crashy
Google Summer of Code Student
Posts: 988
Joined: Wed Jan 08, 2003 9:15 pm
Location: Lyon, France
x 12
Contact:

Re: OpenGL 3+ RenderSystem

Post by Crashy » Thu Jan 26, 2017 3:22 pm

Indeed. I was looking for some docs, and most examples of separate programs I've found mention this, but in my case commenting out this code while using separate programs works just the same, which is pretty confusing. :?

paroj
OGRE Team Member
Posts: 836
Joined: Sun Mar 30, 2014 2:51 pm
x 147
Contact:

Re: OpenGL 3+ RenderSystem

Post by paroj » Thu Jan 26, 2017 5:59 pm

Driver quirks...

I added RSC_GLSL_SSO_REDECLARE, which you can unset as needed:
https://github.com/OGRECave/ogre/pull/3 ... 0872ad6291

Crashy
Google Summer of Code Student
Posts: 988
Joined: Wed Jan 08, 2003 9:15 pm
Location: Lyon, France
x 12
Contact:

Re: OpenGL 3+ RenderSystem

Post by Crashy » Thu Jan 26, 2017 6:09 pm

Thanks a lot! I suspected a driver-related issue. I haven't tested on AMD devices yet, but I fear things will be different there too :D

frostbyte
Orc Shaman
Posts: 737
Joined: Fri May 31, 2013 2:28 am
x 14

Re: OpenGL 3+ RenderSystem

Post by frostbyte » Thu Jan 26, 2017 8:02 pm

Since my cursed "optimus" nvidia 660m started melting my lappy's plastic cover (I think it's totally fried), I'm left with only my integrated HD4000 (until I get rich).
Last time I checked (about a year ago), the HLMS port with GL/GL3+ on the HD4000 rendered nothing, while the other samples worked; with nvidia, the GL3+ PBS sample also worked. So I assumed it was a driver bug and carried on with my life, but without OpenGL...

crashy, did you have success with the PBS sample on GL/GL3+ and the HD4000?
Is this bug related somehow? Is it resolved?
Can I now have my shiny PBS robots on the HD4000 with GL/GL3+??? :P
thanks guys...
the woods are lovely dark and deep
but i have promises to keep
and miles to code before i sleep
and miles to code before i sleep..

coolest videos link (Two Minute Papers)...
https://www.youtube.com/user/keeroyz/videos
