I have a demonstrated lack of competence at straightening bent pins.
Back in the old days, when everyone used Intel dual- and quad-core CPUs, we used a mechanical pencil to straighten the pins. Now I just let someone else put the CPU into the motherboard for me, as CPUs are too expensive to screw up. Given what the local shops charge for that, I usually let them put the whole thing together, then tweak it the way I want once I get it back home. For example, one place complained that the benchmarking would have been MUCH higher if I'd given them an SSD to use instead of an older platter drive. For the next one I had them build, I did send an SSD along, and they said they played games on it for a couple of hours and LOVED IT. I don't have a problem with them playing games on the machines they build if that's part of their testing process, which it was.
I'm a little confused; I am talking about a Ryzen 9 5950X CPU (16c/32t).
I have seen a couple of listings with both an EPYC CPU and motherboard. The cheapest looks like an average deal but is likely out of my immediate price range.
To do robust BOINC CPU crunching, it pretty much needs to manage 3 GHz.
I bought what I could afford. If I had to do it over again, I would have stopped at the 8c/16t EPYC CPU I started with. It was more than sufficient to push the video cards.
Ah, I looked up something further up and thought I was looking at a listing for just an Epyc server motherboard for over $500.
Is Epyc the server CPU version of Ryzen?
I can run video cards on antique CPUs (even DDR2-era ones!) in Milkyway, Einstein, etc. If the GPU isn't being fed fast enough, have it run a few tasks at once; then it can use a few CPU cores at once. I've only ever bought a better CPU for my own use and for CPU crunching.
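As a concrete sketch of that tip: BOINC lets you run several tasks per GPU with an app_config.xml file placed in the project's directory under the BOINC data folder. The application name below is only illustrative; the real name has to be taken from the project's client_state.xml, and the right task count depends on your card and project.

```xml
<!-- app_config.xml: run 3 tasks per GPU, each also reserving a quarter of a CPU core. -->
<!-- "hsgamma_FGRPB1G" is a placeholder app name; check client_state.xml for yours. -->
<app_config>
  <app>
    <name>hsgamma_FGRPB1G</name>
    <gpu_versions>
      <gpu_usage>0.33</gpu_usage>  <!-- each task claims 1/3 of a GPU -->
      <cpu_usage>0.25</cpu_usage>  <!-- CPU fraction budgeted per GPU task -->
    </gpu_versions>
  </app>
</app_config>
```

After saving the file, use the BOINC Manager's "read config files" option (or restart the client) for it to take effect.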
If this page takes an hour to load, reduce posts per page to 20 in your settings, then the tinpot 486 Einstein uses can handle it.
Static electricity discharge is something to beware of as well while manipulating the pins.
Cheers, Mike.
Wasn't that only a problem in the 90s? I've never been careful with any computer parts and I've never electrocuted one. Maybe it depends on your clothing and the flooring. I had a colleague who refused to ever touch a computer part because he always broke them. But if he touched a metal object he got no shock and there was no spark; it was weird.
I use a very fine-pointed... er, not sure what it is or where I got it from, but it looks like a dental pick on the sharp end, with a screwdriver's handle.
When I used to build PCs, and prided myself on very quiet ones, I'd test that everything worked and the fans were set right by running BOINC flat out on all cores and the GPU for 48 hours while watching the temperatures. The only Nvidia card I ever used actually melted; it had no thermal cutout. That got sent back under warranty! I've insisted on AMD ever since.
Yes. Epyc also has advantages like a large number of PCIe lanes, so you don't have some lanes running at x16 while others run at x8, x4, x2, or x1 on the same motherboard. The Asus B450-F does things like running x8, x4, x2, and x1.
But even with that combination of speeds (x8, x4, x2, x1), I am getting very respectable processing on projects that don't need a lot of continuous communication between the CPU and the GPU (like E@H, PrimeGrid, SETI@home, etc.).
Tom M
A Proud member of the O.F.A. (Old Farts Association). Be well, do good work, and keep in touch.® (Garrison Keillor) I want some more patience. RIGHT NOW!
In my varied career in the computer business I spent about six years actually employed in Intel's Reliability activity.
Yes, electrostatic discharge is a real problem. I've seen the high-resolution SEM imagery of the aftermath of plenty of parts.
No, people don't believe it. Their logic resembles the logic of a person who has successfully crossed a road blindfolded several times, and concludes that since they did not die, there is no reason to confine themselves to a crosswalk, or even to look for traffic.
It's not just people like us building our own machines. In the mid-1980s I had a conversation with the manufacturing manager of a small company that built workstations for computer-aided design, who actually believed there were no real ESD problems and that 3M was just building on a myth in order to sell him stuff to help manage the problem in his facility.
EPYC is more analogous to Threadripper than to desktop Ryzen.
EPYC 7001 “Naples” uses the 14nm Zen architecture with 4 NUMA domains.
EPYC 7002 “Rome” uses the 7nm Zen2 architecture with a single NUMA domain.
EPYC 7003 “Milan” uses the 7nm Zen3 architecture with a single NUMA domain.
EPYC brings 128 PCIe lanes per CPU and 8-channel RDIMM memory. Threadripper has 64 PCIe lanes and 4-channel memory (but only UDIMM), while desktop Ryzen has only 20 PCIe lanes (16 on a single slot or as x8/x8, plus x4 for an NVMe drive), plus the chipset connection for some more lanes and peripherals with a cap on total bandwidth. EPYC is available in 2P configurations (only on non-P SKUs). EPYC also does not have a chipset, so all peripherals connect directly to the CPU.
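As a back-of-the-envelope illustration of those lane budgets (the counts below are the rough per-socket figures quoted above; exact numbers vary by SKU and by how a given motherboard wires its slots):

```python
# Rough CPU-direct PCIe lane budgets per socket, excluding chipset-provided lanes.
lanes = {"EPYC": 128, "Threadripper": 64, "desktop Ryzen": 20}

def max_gpus(platform: str, lanes_per_gpu: int) -> int:
    """How many GPUs fit on CPU-direct lanes at a given link width."""
    return lanes[platform] // lanes_per_gpu

for platform in lanes:
    # e.g. at x8 per card: EPYC fits 16, Threadripper 8, desktop Ryzen only 2
    print(platform, "at x8:", max_gpus(platform, 8), "GPUs")
```

The point of the tally is just that a multi-GPU cruncher runs out of CPU-direct lanes very quickly on desktop Ryzen, which is why the extra devices end up behind the chipset.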
When connecting a lot of PCIe devices, especially GPUs, things are generally more stable if they are on a dedicated PCIe connection rather than going through a chipset shared with other devices (SATA, USB, Ethernet, etc.).
Bandwidth per GPU really only matters for GPUGRID; most projects can get away with a lot less. You certainly don’t need x8 or x16 with Einstein, but avoiding cramming a bunch of devices onto the chipset can still be beneficial: even if you’re under the bandwidth limits, the chipset still has to do PCIe switching, and consumer chipsets really aren’t designed to do that 24/7 like enterprise-grade parts. So you can experience more issues, and since you can’t break out the x16 lanes into sixteen x1 devices, or eight x2 devices, or in most cases even into x4/x4/x4/x4 on consumer Ryzen, EPYC or Threadripper is a better platform for it.
There’s more than one factor that will determine success:
If you have unstable power connections, you will have issues.
If you have poorly shielded or bad connections on your PCIe lines, you will have issues.
If you have an unstable CPU or memory overclock, you will have issues.
Being able to effectively troubleshoot and find the root cause of your specific issue will go a lot further than blindly throwing parts at the problem.
I've worked in a university where they researched new techniques for creating microchips, and I was told everything has protection diodes nowadays. I must have handled several thousand computer parts with no precautions and no ill effects. Maybe I'm just not the type of person to create static? The humidity in Scotland probably helps! I have heard of one problem, though. I built a computer for someone and shipped it wrapped in bubblewrap. The customer told me he'd unwrapped it and got a static shock, caused by the bubblewrap, and he thought he had his fingers at the back near the graphics card outputs. The graphics card refused to work. From then on I bought anti-static bubblewrap (the pink stuff).
Protection structures are not new. All the Intel parts had them when I joined in 1974, and people have worked nonstop on making them more effective. I knew someone at my church in Silicon Valley, a very long-term employee of National Semiconductor, who spent much of his career on them. I even spent a few days of mine trying to improve the protection Mel Bazes had designed into his redo of the 8080 (yes, this was a very long time ago).
If you think you are immune, you are wrong; you've just been lucky. This stuff is a second cousin to lightning, and everyone knows how hard it is to guess just when and where that will strike.