Used GPUs?

mikey
Joined: 22 Jan 05
Posts: 11960
Credit: 1833637842
RAC: 225433


GWGeorge007 wrote:

That is some good, compelling info on what a GPU can take as far as BOINC is concerned.  I've re-pasted my 2060 GPU and fitted some Thermalright thermal pads rated at 12.8 W/mK, just 0.5 mm thicker, and its temps are roughly 20°C COOLER while running BOINC projects such as Einstein.

It leads me to believe that Keith Myers is CORRECT after all: I may have a PSU about to go south, given some errant lockups and/or shutdowns.  It runs at 12.2V (per GkrellM) without BOINC running, and drops to 12.0V - 11.9V with BOINC.  It hasn't shut down yet without BOINC running, but if I start BOINC it drops in voltage and shuts down or locks up within ~0.5 - 5.0+ hours, even after re-pasting my GPU.

.....[EDIT].....

I have even gone as far as setting my Graphics Clock Offset to -200 so that it runs at the Nvidia max clock setting of 1680MHz in Nvidia X-Server Settings.

Try running something like MilkyWay or even Collatz and see what happens; it could just be something that Einstein stresses to excess on that GPU. There are over half a dozen GPU projects you can try beyond the two I already mentioned, including Minecraft, GPUGrid, PrimeGrid and its test project (http://boincvm.proxyma.ru:30080/test4vm/), although they don't always have GPU tasks; MLC (https://www.mlcathome.org/mlcathome/); SRBase, which even has GPU tasks; and of course WCG, if you can get them.

Ian&Steve C.
Joined: 19 Jan 20
Posts: 3713
Credit: 34661396416
RAC: 28392346


GWGeorge007 wrote:

It leads me to believe that Keith Myers is CORRECT after all: I may have a PSU about to go south, given some errant lockups and/or shutdowns.  It runs at 12.2V (per GkrellM) without BOINC running, and drops to 12.0V - 11.9V with BOINC.  It hasn't shut down yet without BOINC running, but if I start BOINC it drops in voltage and shuts down or locks up within ~0.5 - 5.0+ hours, even after re-pasting my GPU.

.....[EDIT].....

I have even gone as far as setting my Graphics Clock Offset to -200 so that it runs at the Nvidia max clock setting of 1680MHz in Nvidia X-Server Settings.

Honestly, those voltages are fine for the 12V rail and well within spec. Anything at 11.8V or above is fine.

But that raises the question: how do your other voltages look? Sag on the 5V or 3.3V rails could indicate impending failure too, the 3.3V rail especially, since the GPU also draws from it through the PCIe slot.
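For reference, the ATX specification allows a ±5% tolerance on the 12V, 5V, and 3.3V rails, which is why 11.8V and up is still in spec. A minimal Python sketch of that check (the sample readings below are hypothetical, not George's actual measurements):

```python
# ATX rail tolerance check: each rail must stay within +/-5% of nominal.
NOMINALS = {"12V": 12.0, "5V": 5.0, "3.3V": 3.3}
TOLERANCE = 0.05  # ATX spec allows +/-5% on these rails

def check_rails(readings):
    """Return a list of (rail, reading, status) tuples."""
    results = []
    for rail, volts in readings.items():
        nominal = NOMINALS[rail]
        low, high = nominal * (1 - TOLERANCE), nominal * (1 + TOLERANCE)
        status = "OK" if low <= volts <= high else "OUT OF SPEC"
        results.append((rail, volts, status))
    return results

# Hypothetical readings, e.g. as reported by GkrellM or lm-sensors:
readings = {"12V": 11.9, "5V": 5.02, "3.3V": 3.10}
for rail, volts, status in check_rails(readings):
    print(f"{rail}: {volts:.2f} V -> {status}")
```

With those sample numbers, 11.9V on the 12V rail passes (the lower bound is 11.4V), while 3.10V on the 3.3V rail would be flagged, since the floor is about 3.14V; that is the kind of sag worth watching for during a lockup.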

_________________________________________________________________________

GWGeorge007
Joined: 8 Jan 18
Posts: 2822
Credit: 4630808862
RAC: 3647915


Ian&Steve C. wrote:

Honestly, those voltages are fine for the 12V rail and well within spec. Anything at 11.8V or above is fine.

But that raises the question: how do your other voltages look? Sag on the 5V or 3.3V rails could indicate impending failure too, the 3.3V rail especially, since the GPU also draws from it through the PCIe slot.

Hi Ian,

I think my voltages are okay when I'm not running any GPU work, like Einstein.  I haven't tried any other projects yet, like MilkyWay.  I'll keep an eye on the 5.0V and 3.3V rails if/when it locks up again.

Here is a screenshot of my 3950X machine with the 2060 GPU

I hope you can read this...

Also, I'm periodically getting a warning from Ubuntu 20.04 that it has experienced an internal error.  This only began happening after the last update.  Is there any way to roll back the changes (which I can't remember making) to the previous settings?

George

Proud member of the Old Farts Association

Tom M
Joined: 2 Feb 06
Posts: 5658
Credit: 7739127077
RAC: 2531424


I have two RX 5700s I am trying to get rid of.  I would trade them for late-generation NVIDIA GPUs.  They are also listed on eBay.

Make an offer?

Tom M

A Proud member of the O.F.A.  (Old Farts Association).  Be well, do good work, and keep in touch.® (Garrison Keillor)
