Select graphic card to use & SLI

Discussion area about developing or extending OGRE, adding plugins for it or building applications on it. No newbie questions please, use the Help forum for that.
TheSHEEEP
OGRE Retired Team Member
Posts: 972
Joined: Mon Jun 02, 2008 6:52 pm
Location: Berlin

Select graphic card to use & SLI

Post by TheSHEEEP » Tue Dec 04, 2012 10:00 am

Hey folks,

I wanted to ask whether it is possible to select the graphics card an Ogre application should use when there are multiple cards in the machine (not SLI)?

My second question: is there anything special one has to do to enable the use of SLI-connected graphics cards?

Thanks :)
My site! - Have a look :)
Also on Twitter - extra fluffy

Zonder
Ogre Magi
Posts: 1133
Joined: Mon Aug 04, 2008 7:51 pm
Location: Manchester - England

Re: Select graphic card to use & SLI

Post by Zonder » Tue Dec 04, 2012 1:55 pm

Well, I know that for my NVIDIA SLI setup I have to actually enable it in the NVIDIA control panel (it requires a reboot to change, as well). Not sure how ATI handles it.
There are 10 types of people in the world: Those who understand binary, and those who don't...

TheSHEEEP
OGRE Retired Team Member
Posts: 972
Joined: Mon Jun 02, 2008 6:52 pm
Location: Berlin

Re: Select graphic card to use & SLI

Post by TheSHEEEP » Tue Dec 04, 2012 4:37 pm

Hmm, okay, that would be for SLI, then. I thought you could turn it on in code somehow, just like the NVIDIA flag you can set in code to make use of Optimus technology on Linux (without Bumblebee).

Selecting a specific graphics card seems rather hard to do.

I only found some very low-level code here for OpenGL.
Unfortunately, that code only shows offline rendering (nothing goes to a window/to the screen), while I need it on a window.

sparkprime
Ogre Magi
Posts: 1137
Joined: Mon May 07, 2007 3:43 am
Location: Ossining, New York

Re: Select graphic card to use & SLI

Post by sparkprime » Tue Dec 04, 2012 6:25 pm

Hey, tell me about that Optimus trick! Does it work on Windows?

sparkprime
Ogre Magi
Posts: 1137
Joined: Mon May 07, 2007 3:43 am
Location: Ossining, New York

Re: Select graphic card to use & SLI

Post by sparkprime » Tue Dec 04, 2012 6:29 pm

Is there not a setConfigOption for whatever rendersystem you're using that allows you to select the card?

I would guess SLI is a driver thing -- apps just see a single card for that.

TheSHEEEP
OGRE Retired Team Member
Posts: 972
Joined: Mon Jun 02, 2008 6:52 pm
Location: Berlin

Re: Select graphic card to use & SLI

Post by TheSHEEEP » Wed Dec 05, 2012 9:37 am

sparkprime wrote:Hey, tell me about that optimus trick! Does it work on windows?
Shouldn't that be automatic on Windows? I don't know, though, I never tried it. I only read about it here (look for "Global Variable NvOptimusEnablement"). I also saw one post here on the forums where someone claimed that this did not work on their Linux system, so I guess you'd have to test whether it really works.
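For reference, the trick is just to export a global from the executable; a minimal sketch of how I understand it (the `__declspec(dllexport)` part is Windows-only, the symbol name must be exactly `NvOptimusEnablement`, and whether the driver honours it obviously depends on your hardware):

```cpp
#include <cstdint>

// Exporting this symbol from the .exe asks the NVIDIA driver to prefer
// the discrete GPU over the integrated one on Optimus systems.
extern "C" {
#if defined(_WIN32)
__declspec(dllexport)
#endif
uint32_t NvOptimusEnablement = 0x00000001; // 1 = use the NVIDIA GPU
}
```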

sparkprime wrote:Is there not a setConfigOption for whatever rendersystem you're using that allows you to select the card?
Well, you can do something like this:

Code:

mRenderSystem->setConfigOption("Rendering Device", "NVIDIA GeForce FX Go5200"); // from the Intermediate Tutorial 7
But what if you have two of the same graphics card?

mkultra333
Gold Sponsor
Posts: 1889
Joined: Sun Mar 08, 2009 5:25 am

Re: Select graphic card to use & SLI

Post by mkultra333 » Wed Dec 05, 2012 10:17 am

I have SLI and, if I have it turned on from the Nvidia options panel, it runs automatically for my own Ogre project.
"In theory there is no difference between practice and theory. In practice, there is." - Psychology Textbook.

sparkprime
Ogre Magi
Posts: 1137
Joined: Mon May 07, 2007 3:43 am
Location: Ossining, New York

Re: Select graphic card to use & SLI

Post by sparkprime » Wed Dec 05, 2012 9:20 pm

Ah, a global extern. That's easy, thanks :)

I would expect the cards to be given names like Blah and Blah (2) if you have two of them. If not, that's a bug in Ogre :)

TheSHEEEP
OGRE Retired Team Member
Posts: 972
Joined: Mon Jun 02, 2008 6:52 pm
Location: Berlin

Re: Select graphic card to use & SLI

Post by TheSHEEEP » Thu Dec 06, 2012 9:49 am

Yeah, would be worth a try, at least.

This is (more or less) how the Ogre config dialog shows its selection:

Code:

// Print every config option the render system currently exposes
Ogre::ConfigOptionMap opts = renderSystem->getConfigOptions();
for (Ogre::ConfigOptionMap::iterator pOpt = opts.begin(); pOpt != opts.end(); ++pOpt)
{
  std::cout << pOpt->second.name << ": " << pOpt->second.currentValue << std::endl;
}
But this doesn't even list a graphics card; the rendering device does not show up at all.

On startup, Ogre does log the name of the device, but it uses the following GL function:

Code:

glGetString(GL_RENDERER);
which (I guess) returns the currently selected/active device, not the list of available ones.

How could I get the device name list, then, if not from the RenderSystem?
