Question About "Use All GPUs" CC_CONFIG.XML Parameter

Cruncher-American
Joined: 24 Mar 05
Posts: 71
Credit: 5521721291
RAC: 4219074
Topic 230414

I bought a 3080 Ti and a 2080 Ti today from Craigslist for a very good price, and installed them in my B550 machine, replacing an older 2080 Ti and a 2080. But I couldn't get the machine to use the new 2080 Ti, no matter what I did. Then I remembered the use_all_gpus parameter for cc_config, so I set the param = 1 (TRUE) in the cc_config file, and lo and behold, it is now using both cards.
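For anyone who hits the same thing, the relevant bit of cc_config.xml looks roughly like this (the file lives in the BOINC data directory; after editing it, restart the client or use Options -> Read config files in BOINC Manager):

<cc_config>
   <options>
      <use_all_gpus>1</use_all_gpus>
   </options>
</cc_config>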

My question is (out of curiosity only): what criterion does BOINC use to decide not to use a GPU?

 

The 2080 Ti has 14231 GFLOPS, and the 3080 Ti 36250. Obviously the 3080 Ti is much better, but 14231 shouldn't be sneezed at.

So how was it decided not to use the 2080 Ti in this situation?

GWGeorge007
Joined: 8 Jan 18
Posts: 3084
Credit: 4982167686
RAC: 1647658


Cruncher-American wrote:

I bought a 3080 Ti and a 2080 Ti today from Craigslist for a very good price, and installed them in my B550 machine, replacing an older 2080 Ti and a 2080. But I couldn't get the machine to use the new 2080 Ti, no matter what I did. Then I remembered the use_all_gpus parameter for cc_config, so I set the param = 1 (TRUE) in the cc_config file, and lo and behold, it is now using both cards.

Out of curiosity, what is the reason for buying a 2080 Ti to replace an 'older' 2080 Ti?  Also, I'm glad you figured out the <use_all_gpus> parameter.

Cruncher-American wrote:

My question is (out of curiosity only): what criterion does BOINC use to decide not to use a GPU?

The 2080 Ti has 14231 GFLOPS, and the 3080 Ti 36250. Obviously the 3080 Ti is much better, but 14231 shouldn't be sneezed at.

So how was it decided not to use the 2080 Ti in this situation?

Personally, I don't think it has as much to do with the GFLOPS as with how much memory the card has and which BOINC project you're going to run.  When a GPU gets so old that BOINC determines it can no longer meet the requirements to run BOINC at all, it drops the card from the list of acceptable GPUs.  In the case of your 2080 Ti (graphics processor TU102, 4352 cores, 272 TMUs, 88 ROPs, 11 GB of GDDR6 memory on a 352-bit bus), you shouldn't have any problem with it.

George

Proud member of the Old Farts Association

Cruncher-American
Joined: 24 Mar 05
Posts: 71
Credit: 5521721291
RAC: 4219074


Hey, George, that's a fair question, so here's my answer: I was originally going to buy just the 3080ti, but the seller offered me the 2080ti for cheap. Since I have been running my machines 24/7/365 for a rather lengthy time (just check my electric bills), I like to have backups that I can use when one of my cards dies. And the 2080ti I had is getting a little long in the tooth, so just a precaution. Better too many than too few, I say.

GWGeorge007
Joined: 8 Jan 18
Posts: 3084
Credit: 4982167686
RAC: 1647658


Cruncher-American wrote:

Hey, George, that's a fair question, so here's my answer: I was originally going to buy just the 3080ti, but the seller offered me the 2080ti for cheap. Since I have been running my machines 24/7/365 for a rather lengthy time (just check my electric bills), I like to have backups that I can use when one of my cards dies. And the 2080ti I had is getting a little long in the tooth, so just a precaution. Better too many than too few, I say.

LOL   Well said!   LOL

George

Proud member of the Old Farts Association

Keith Myers
Joined: 11 Feb 11
Posts: 4981
Credit: 18802538878
RAC: 7894899


That's not how the client determines which card to use.  The client always chooses the 'most capable' card when multiple cards are detected.

Hence the need for the use_all_gpus setting.  'Most capable' is decided from several card characteristics.

The only time the setting isn't needed is when the cards are of the same type or identical, as in two 2080s or two 3080s, etc.

 

Ian&Steve C.
Joined: 19 Jan 20
Posts: 3981
Credit: 47409992642
RAC: 63951303


To elaborate, for Nvidia GPUs, “best” or “most capable” is determined from the following priority order (there's a rough code sketch of this at the end of this post):

1. CUDA compute capability (the Nvidia parameter which basically says which generation it is)

2. driver version (which can be effectively ignored, since you'll always be running the same driver on every card in the machine)

3. memory size 

4. speed (driver-reported FLOPS)

 

So a 3080 Ti will be most capable by the first criterion: its newer CC of 8.6 is greater than the CC 7.5 of the 2080 Ti.

Say you had a 3060 and a 3070 together: the 3060 would actually be seen as the better card, because they have the same CC but the 3060 has more VRAM (12 GB vs. 8 GB).
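Here's a rough sketch of that priority ordering in C++. This is not BOINC's actual code; the struct, field names, and the numbers in main() are just illustrative assumptions to show how a lexicographic comparison over those four criteria plays out for a 3080 Ti vs. a 2080 Ti:

#include <cstdio>
#include <tuple>

// Hypothetical summary of what the client might know about each Nvidia card.
struct NvidiaGpu {
    int    cc_major;        // CUDA compute capability major (e.g. 8 for Ampere)
    int    cc_minor;        // CUDA compute capability minor (e.g. 6)
    long   driver_version;  // same for every card on one host, so rarely decisive
    double vram_bytes;      // memory size
    double peak_flops;      // driver-reported speed
};

// True if 'a' is "more capable" than 'b' under the priority described above:
// 1. compute capability, 2. driver version, 3. memory size, 4. flops.
// std::tuple's operator> compares element by element, left to right.
bool more_capable(const NvidiaGpu& a, const NvidiaGpu& b) {
    return std::make_tuple(a.cc_major, a.cc_minor, a.driver_version,
                           a.vram_bytes, a.peak_flops)
         > std::make_tuple(b.cc_major, b.cc_minor, b.driver_version,
                           b.vram_bytes, b.peak_flops);
}

int main() {
    // GFLOPS figures taken from the post above; driver version is arbitrary.
    NvidiaGpu rtx3080ti{8, 6, 550, 12e9, 36250e9};  // CC 8.6, 12 GB
    NvidiaGpu rtx2080ti{7, 5, 550, 11e9, 14231e9};  // CC 7.5, 11 GB
    std::printf("3080 Ti more capable: %s\n",
                more_capable(rtx3080ti, rtx2080ti) ? "yes" : "no");
}

Without use_all_gpus the client only schedules work on whichever card wins that comparison; setting it to 1 makes the client use every detected card, which is why it fixed the idle 2080 Ti.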

_________________________________________________________________________
