Not detecting Multiple Cards in Laptop

A place for users of OGRE to discuss ideas and experiences of utilising OGRE in their games / demos / applications.
kulebril
Gnoblar
Posts: 16
Joined: Mon Mar 07, 2011 11:08 am

Not detecting Multiple Cards in Laptop

Post by kulebril » Thu Sep 08, 2011 11:00 pm

Hi:

When I run a demo on my laptop, I find that the system does not list both graphics cards.
I have an Alienware M11x laptop. It has an Intel HD Graphics family GPU and an nVidia GeForce GT 540.

The problem is that when the Ogre3D popup shows at the beginning of the demo, the Ogre Rendering Device list on DirectX shows only the Intel HD Graphics family (a card for desktop applications only) and does not list the GeForce GT 540.

Why is this?
How can I force Ogre to detect the card?
How can I list the correct cards and choose one manually?

Thanks in Advance

Jabberwocky
OGRE Moderator
Posts: 2819
Joined: Mon Mar 05, 2007 11:17 pm
Location: Canada
Contact:

Re: Not detecting Multiple Cards in Laptop

Post by Jabberwocky » Fri Sep 09, 2011 4:58 am

I had a similar problem using a modern Dell XPS. The XPS has 2 video cards: an integrated (crappy) one, and a high-powered ATI one.

The problem was that the XPS uses the integrated card by default for all applications that aren't explicitly listed inside the ATI Catalyst Control panel. The reason for this is to conserve power; the battery lasts much longer on the integrated video card.

I can't remember the exact instructions I followed to add my game to the list so it would use the proper video card, but here are some instructions I found online that seem to explain the process:
http://h20000.www2.hp.com/bizsupport/Te ... R1002_USEN
Scroll down to the part that talks about the ATI Catalyst Control Center window.

Perhaps you're having the same issue.

kulebril
Gnoblar
Posts: 16
Joined: Mon Mar 07, 2011 11:08 am

Re: Not detecting Multiple Cards in Laptop

Post by kulebril » Fri Sep 09, 2011 12:03 pm

Thanks, it looks like the same problem.

I'll check it out later, and tell you how it went.

Thanks!

kulebril
Gnoblar
Posts: 16
Joined: Mon Mar 07, 2011 11:08 am

Re: Not detecting Multiple Cards in Laptop

Post by kulebril » Fri Sep 23, 2011 6:04 pm

Even after putting my executable on the NVIDIA list,
Ogre still does not see the correct card.

I even tried forcing the laptop to use the nVidia card for EVERYTHING, and it still switches to the Intel HD.

What else can I do?

Jabberwocky
OGRE Moderator
Posts: 2819
Joined: Mon Mar 05, 2007 11:17 pm
Location: Canada
Contact:

Re: Not detecting Multiple Cards in Laptop

Post by Jabberwocky » Fri Sep 23, 2011 7:20 pm

Interesting.
I had the exact same problem as you, but putting my game in that list solved it for me. I'm still assuming this is probably related to your laptop's dual-card power-saving setup. Maybe you need to talk to Alienware or NVIDIA tech support.

NoVer
Gnoblar
Posts: 5
Joined: Fri Dec 02, 2011 5:58 am

Re: Not detecting Multiple Cards in Laptop

Post by NoVer » Fri Dec 02, 2011 6:45 pm

This is the same exact situation I'm in. If you found an answer please let me know!

NoVer
Gnoblar
Posts: 5
Joined: Fri Dec 02, 2011 5:58 am

Re: Not detecting Multiple Cards in Laptop

Post by NoVer » Mon Dec 05, 2011 2:29 am

Try building it in Release mode; that has solved my problem so far (not the multiple-cards issue, but I can at least create Ogre heads now).

kariem2k
Gnoblar
Posts: 18
Joined: Sun Jan 09, 2005 10:16 pm
Location: Alexandria, Egypt
Contact:

Re: Not detecting Multiple Cards in Laptop

Post by kariem2k » Mon Jun 04, 2012 9:38 pm

Yes, that happens to me too (Dell 5110 with a 525M).

That's because of NVIDIA's Optimus technology, which lets you select either the NVIDIA GPU or the built-in Intel GPU. It is not supported at all in NVIDIA's Linux drivers (although you can use Bumblebee on Linux, which still did not work with OGRE last time I checked).
Although the NVIDIA Windows drivers can select which GPU is used for the application you are running, it still fails with OGRE; it always goes to the Intel card.

I am doing some debugging, but I am no expert in the field. I hope I can find a solution.

*EDIT*
My mistake, it works without problems on Windows. What made me think my GPU was not detected is the "ShaderSystemMultiLight" sample, which I think checks for the ps_3_0 profile, and it is not listed for some reason. After changing the code to fp30 it worked without issues.
Kariem2k.blogspot.com

andrewfenn
Halfling
Posts: 62
Joined: Fri Mar 23, 2007 2:48 pm

Re: Not detecting Multiple Cards in Laptop

Post by andrewfenn » Fri Jun 29, 2012 10:42 am

kariem2k wrote:(although you can use Bumblebee on Linux which still does not work with OGRE last time i checked).
It's been working for me for over a year now with few issues.

I use the following to run my compiled games or Ogitor:

Code: Select all

optirun ./game
If I want to use the debugger, I do the following:

Code: Select all

optirun $SHELL
gdb ./game
exit

dark_sylinc
OGRE Team Member
Posts: 3997
Joined: Sat Jul 21, 2007 4:55 pm
Location: Buenos Aires, Argentina
Contact:

Re: Not detecting Multiple Cards in Laptop

Post by dark_sylinc » Thu Nov 15, 2012 3:56 am

This is an old thread.
However, NVIDIA released Optimus rendering policies guidelines not long ago.

If the user has driver 302 or higher, we can hint the driver to use the dedicated GPU. All we need to do is export a variable:

Code: Select all

extern "C" {
__declspec(dllexport) DWORD NvOptimusEnablement = 0x00000001;
}
Hope this info helps whoever is in need.
Cheers
Dark Sylinc

Jabberwocky
OGRE Moderator
Posts: 2819
Joined: Mon Mar 05, 2007 11:17 pm
Location: Canada
Contact:

Re: Not detecting Multiple Cards in Laptop

Post by Jabberwocky » Thu Nov 15, 2012 11:50 pm

Thanks dark_sylinc - appreciate you posting this info.

Tiffer
Kobold
Posts: 36
Joined: Thu Oct 04, 2012 12:00 pm

Re: Not detecting Multiple Cards in Laptop

Post by Tiffer » Tue Nov 20, 2012 6:46 pm

Hi guys. I have this problem. I tried the export suggested by dark_sylinc but it doesn't work. I have no doubt that it should work in most instances, but with Ogre it doesn't for some reason.

I've set the nvidia control panel to force the nvidia card always, and it works for other games I have, but not for any Ogre app I've tried. I'm trying this on a new Windows 8 PC. Ogre is listing only one device, the Intel card.

It seems that Ogre is doing something wrong that most games don't. Any clues?

dark_sylinc
OGRE Team Member
Posts: 3997
Joined: Sat Jul 21, 2007 4:55 pm
Location: Buenos Aires, Argentina
Contact:

Re: Not detecting Multiple Cards in Laptop

Post by dark_sylinc » Sun Nov 25, 2012 6:49 pm

Tiffer wrote:I've set the nvidia control panel to force the nvidia card always, and it works for other games I have, but not for any Ogre app I've tried. I'm trying this in a new Windows 8 PC. Ogre is listing only one device, the Intel card.

It seems that Ogre is doing something wrong that most games don't, any clues?
I'm no Optimus expert, but:
  • NVIDIA whitelists games. That's why they work and Ogre apps don't.
  • Read the PDF. First you need driver 302 or higher, otherwise the export trick won't work.
  • Read the PDF. As stated in that document, there are many settings that can override others (even the driver settings) and stop Optimus from being used; these are evaluated before the exported variable is read.
Good luck! and let us know your results!

Captain_Deathbeard
Gremlin
Posts: 179
Joined: Mon Nov 21, 2005 6:16 pm
Location: UK
Contact:

Re: Not detecting Multiple Cards in Laptop

Post by Captain_Deathbeard » Mon Apr 22, 2013 7:21 pm

BUMP! Instead of starting a new thread.

My game is released, and when you have tens of thousands of people playing your game this issue becomes a much bigger problem.
I have a lot of reports, and they are always from laptop users with dual cards.

I have no idea how to fix this on my end, and telling people things like "update your drivers" doesn't usually get any results.
Has anybody had any success fixing this? I think it should be considered an Ogre bug, though unfortunately we don't have a bug forum.

dark_sylinc
OGRE Team Member
Posts: 3997
Joined: Sat Jul 21, 2007 4:55 pm
Location: Buenos Aires, Argentina
Contact:

Re: Not detecting Multiple Cards in Laptop

Post by dark_sylinc » Mon Apr 22, 2013 8:01 pm

Captain_Deathbeard wrote:Has anybody had any results with fixing this? I think it should be considered an Ogre bug, though unfortunately we don't have a bug forum.
Not really. NVIDIA whitelists which programs automatically get the good card. Great way to piss off any non-mainstream devs.
We have JIRA for bugs.
The export trick should work, but it's not guaranteed.

So basically you're reduced to either getting very popular, or relying on tech-savvy users who know how to bypass the Optimus defaults and force the good card.

I wish I had one of these devices to experiment with so I could get the best possible results. I agree it's really annoying to get users/fans complaining they can't run your game on the 'good' card.

Transporter
Minaton
Posts: 933
Joined: Mon Mar 05, 2012 11:37 am
Location: Germany

Re: Not detecting Multiple Cards in Laptop

Post by Transporter » Tue Apr 23, 2013 7:12 am

I can check this. My laptop has an NVIDIA card and an Intel HD 4000. Just tell me what I should test.

Captain_Deathbeard
Gremlin
Posts: 179
Joined: Mon Nov 21, 2005 6:16 pm
Location: UK
Contact:

Re: Not detecting Multiple Cards in Laptop

Post by Captain_Deathbeard » Tue Apr 23, 2013 3:37 pm

Thanks Transporter,

Here's a link to my game demo:
http://www.indiedb.com/games/kenshi/dow ... enshi-0404

The standard ogre config window pops up at start, so it should show both your video cards.

Jabberwocky
OGRE Moderator
Posts: 2819
Joined: Mon Mar 05, 2007 11:17 pm
Location: Canada
Contact:

Re: Not detecting Multiple Cards in Laptop

Post by Jabberwocky » Fri Apr 26, 2013 8:31 pm

For ATI users, did you try the ATI Catalyst Control Center settings I discussed above? I'm not sure why that didn't work for some folks in this thread, but it worked for my game on one laptop I tested on. Perhaps there is an equivalent setting for nvidia cards.

Senzin
Halfling
Posts: 57
Joined: Wed Apr 06, 2011 8:54 pm

Re: Not detecting Multiple Cards in Laptop

Post by Senzin » Sun Apr 28, 2013 3:56 am

I also have a laptop with dual nVidia and Intel HD4000. When I start my project, only Intel HD4000 is shown.

I tried the NvOptimusEnablement export trick but it had no effect.

To fix this, I had to go into the nVidia control panel and add the game. Once the game was added, I had to explicitly set it to use the nVidia card instead of Intel as that's not the default setting when you add a new game.

dark_sylinc
OGRE Team Member
Posts: 3997
Joined: Sat Jul 21, 2007 4:55 pm
Location: Buenos Aires, Argentina
Contact:

Re: Not detecting Multiple Cards in Laptop

Post by dark_sylinc » Fri May 17, 2013 2:46 am

Senzin wrote:I also have a laptop with dual nVidia and Intel HD4000. When I start my project, only Intel HD4000 is shown.

I tried the NvOptimusEnablement export trick but it had no effect.
What happens if you also export this variable in OgreMain.dll, and what about doing the same in RenderSystem_Direct3D9.dll?

jsding
Greenskin
Posts: 105
Joined: Tue Dec 14, 2010 9:46 am
Contact:

Re: Not detecting Multiple Cards in Laptop

Post by jsding » Thu Oct 03, 2013 10:58 am

Is this issue fixed? My OGRE app can detect the NVIDIA card, but no video mode is detected. If I choose the NVIDIA card, there is a "request video mode can't found" error.

Thanks.

peanutandchestnut
Kobold
Posts: 36
Joined: Tue Nov 11, 2014 9:29 am

Re: Not detecting Multiple Cards in Laptop

Post by peanutandchestnut » Thu Nov 20, 2014 2:44 am

I also have the same issue on my notebook.
My notebook has an HD Graphics 3000 and a GeForce GT 635. Ogre can only detect the HD Graphics 3000.
I changed the preferred graphics processor to NVIDIA and added my game to the customized NVIDIA program settings; neither worked.
I disabled the HD Graphics 3000; Ogre still cannot detect the GT 635.
I don't have a clue. :(

spacegaier
OGRE Team Member
Posts: 4291
Joined: Mon Feb 04, 2008 2:02 pm
Location: Germany
Contact:

Re: Not detecting Multiple Cards in Laptop

Post by spacegaier » Thu Nov 20, 2014 10:06 am

Did you try the export trick that was mentioned earlier in this thread?

http://www.ogre3d.org/forums/viewtopic. ... 93#p477193
Ogre Admin [Admin, Dev, PR, Finance, Wiki, etc.] | BasicOgreFramework | AdvancedOgreFramework
Don't know what to do in your spare time? Help the Ogre wiki grow! Or squash a bug...

solarstandard
Gnoblar
Posts: 3
Joined: Thu Oct 16, 2014 6:15 am

Re: Not detecting Multiple Cards in Laptop

Post by solarstandard » Sat Jan 10, 2015 6:26 am

I have the same problem on a laptop with two video cards - Intel HD3000 and Nvidia GT540M, running Windows 7 64-bit. I have Ogre SDK v1.9.0 for VC11. I'm trying to run the Basic Tutorial 2 (http://www.ogre3d.org/tikiwiki/tiki-ind ... Tutorial+2) on the Nvidia card.

I've tried setting the Nvidia GPU as default in the Nvidia control panel and explicitly adding the executable to the list of programs to use the Nvidia GPU. Every time only the Intel card is available.

I've tried adding the NvOptimusEnablement declaration in one of the header files of the basic tutorial application I'm compiling. Still only one card.

There is no mention of "nvidia" or "540m" in Ogre.log, but strangely the Nvidia GPU Activity icon in the system tray shows the application as using the GPU while it is running.

Is there a solution to this problem?

frostbyte
Orc Shaman
Posts: 737
Joined: Fri May 31, 2013 2:28 am

Re: Not detecting Multiple Cards in Laptop

Post by frostbyte » Sat Jan 10, 2015 7:57 pm

solarstandard wrote:There is no mention of "nvidia" or "540m" in Ogre.log, but strangely the Nvidia GPU Activity icon in the system tray shows the application as using the GPU while it is running.
It's OK, you are probably running Ogre on the NVIDIA GPU.

It's just a driver info issue... Optimus uses your NVIDIA as a "co-processor" and the resulting framebuffer is sent to the Intel card for the actual display...
If you want to be sure, switch back to Intel and look for a performance/FPS change... try something heavy...
the woods are lovely dark and deep
but i have promises to keep
and miles to code before i sleep
and miles to code before i sleep..

coolest videos link( two minutes paper )...
https://www.youtube.com/user/keeroyz/videos
