Join the SatelliteGuys Folding@Home Team!

Looks like it's already set up as an SLI system. That power supply should easily be able to handle two 960s if you decide to really go crazy with the folding habit.

Edit: It looks like you have a very good cooling setup too.
I never used the SLI capability when it was my main computer. I have four displays connected, and SLI is not available when you do that. When I put the computer aside I enabled SLI, but have nothing to use it on ;).
The cooling was barely adequate. But then there are temp sensor issues with that motherboard, so it was really hard to tell where you actually stood. They changed the monitoring software to drop the displayed temps by some 20°, but third-party programs weren't fooled by that. For me, that was the major flaw in that build.
 
We're #94 now! And, with all the new hardware our team has been adding in the past month, we've moved into the Top 40 Folding teams in terms of PPD, less than 40K PPD behind those BadBirdies. A little more iron and we may retake our position from them.

Voyager6, the Dell is a beast, am I right? Check your power supply to see if you got the 1,000 W model. If so, you should be able to handle two GPU cards easily.

JoshuaM has passed 7,000,000 Points! And by the time I type this and it gets uploaded, it will probably be 8,000,000 Points! VOOOM!
 
Finally got all of the servers up and running... The only bad thing is that at home I can't keep the 980 Ti running 24 hours a day anymore. It heats up the room where the upstairs thermostat is, so the heater doesn't come on often enough and the bedrooms stay freezing cold...

 
I have a window in my den/computer room open with a fan sucking air out 24/7, even in the dead of winter. It's probably 20 degrees warmer in here than the rest of the house all the time.
 
Voyager6, the Dell is a beast, am I right? Check your power supply to see if you got the 1,000 W model. If so, you should be able to handle two GPU cards easily.
It does have the 1,000 W power supply. I am running into some difficulty loading an operating system. The system came with just one drive, but it had been set up in RAID and now shows as failed due to the missing drive. I bypassed the original drive and installed a new one. The system seems to recognize it, and Ubuntu installs on the new drive fine, but for some reason the system will not boot from the new drive. I am digging into the BIOS settings to try to figure it out.
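
One thing I plan to rule out first (just a guess at this point): the installer may have written GRUB to the old RAID disk instead of the new drive. If that turns out to be it, reinstalling GRUB onto the new drive from the Ubuntu live USB should look roughly like this (the device and partition names are only examples and would have to match what the live session actually reports):

    sudo fdisk -l                                           # find the new drive (say it shows up as /dev/sdb)
    sudo mount /dev/sdb1 /mnt                               # mount the root partition of the new install
    sudo grub-install --boot-directory=/mnt/boot /dev/sdb   # put GRUB on the new drive
    sudo umount /mnt

If GRUB is already on the right drive, then it's back to digging through the BIOS boot-order settings.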
 
I have some additional RAM arriving today. Since I was going to be opening up the case anyways I took the opportunity to take apart my PC and give it a good cleaning. When I put it back together I put a much greater emphasis on cable management. It was kind of a nightmare before. I had added several parts since my original build but took the easy road instead of re-routing my cables to get as many of them out of the way as possible.

I was able to completely remove a SATA power cable and two PCIe power cables from my modular power supply and get them out of the case by using a better layout that let me daisy-chain things onto fewer cables. My power supply has PCIe cables with one 8-pin plug on the power supply end and two 6+2-pin plugs on the GPU side. Before, I was using four separate power supply cables for the four 6-pin ports on my GPUs. Now I am using one of the split cables for each GPU.

My cable management is drastically better now and it should provide much better airflow throughout my case. I'm interested in seeing if this has any impact on my GPU and CPU temps under gaming and folding load. If I get any significant temperature reduction I will bump up my CPU and GPU overclocks a bit.

Hopefully everything boots up and works normally when I put the additional RAM in tonight. I don't typically like changing this many variables at once since it makes the troubleshooting process more difficult if something doesn't work. The cleaning and cable management improvements were needed pretty badly though.
 
I got my new RAM installed, booted up, and ran a quick stress test to make sure everything was still working properly after all my tinkering. Everything looks good to go. I'll get to see if the increased air flow has any impact on my temperatures with a full day of folding tomorrow.
 
The only bad thing is that at home I can't keep the 980 Ti running 24 hours a day anymore. It heats up the room where the upstairs thermostat is, so the heater doesn't come on often enough and the bedrooms stay freezing cold...
Hee-hee... That reminds me of when I first started Folding on my son's Dell Dimension 4700 with a 9800 GTS+, which just happened to be located right under the house thermostat. I ended up moving the thermostat, as it was in the living room, which got the morning sun. In both winter and summer that had the effect of cooling the rest of the house down, and when I was Folding, that was all the time!

I need to try to remember how I fixed my Mac Pro so I can update the T7400 to match. I applied an update to the T7400 last night and it rebuilt the kernel without the NVIDIA drivers. So when the system rebooted, it used the Nouveau drivers and no CUDA! Which meant the WU I was working on aborted, and I had to reinstall the NVIDIA drivers and reboot again. Argh!

It has to do with Xorg and the LightDM desktop GUI. I'll have to figure out how I fixed it so that applying Ubuntu updates doesn't matter.
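
If I ever write it up properly, my hunch is that the lasting fix is just making sure the NVIDIA kernel module is registered with DKMS, so it gets rebuilt automatically whenever a kernel update lands. A rough sketch of what I'd check on the T7400 (the exact driver package name here is from memory and may not match what's actually installed):

    dkms status                                   # the nvidia module should be listed for the running kernel
    sudo apt-get install dkms                     # make sure DKMS itself is present
    sudo apt-get install --reinstall nvidia-352   # reinstall the packaged driver so its DKMS hook registers
    sudo update-initramfs -u                      # rebuild the initramfs so Nouveau stays blacklisted

If the driver came from NVIDIA's .run installer instead of a package, rerunning that installer with its --dkms option should accomplish the same thing.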

Voyager6, good to hear you got your machine configured and running. I was going to recommend that you rebuild the RAID or set it up as JBOD. Mine came with a pair of 500 GB SATA drives, so it was no big deal. My big fight has been the single-bit DIMM error. It doesn't halt the reboot anymore, but I'm not sure how to fix it without installing Windows.
 
I need to try to remember how I fixed my Mac Pro so I can update the T7400 to match. I applied an update to the T7400 last night and it rebuilt the kernel without the NVIDIA drivers. So when the system rebooted, it used the Nouveau drivers and no CUDA! Which meant the WU I was working on aborted, and I had to reinstall the NVIDIA drivers and reboot again. Argh!
Looking forward to seeing your notes on this. I am looking at a couple of T7400s right now.
 
So I think if a guy is buying a Dell server and sticking some high-powered video cards in just to do FAH, he should be able to write off the whole amount, and the power to run it, as a charitable contribution when doing his taxes.
 
So what's the advantage to these T7400s? They seem to be a popular choice lately.
You can get them cheap, and they usually have two Xeon CPUs (eight cores total) and a beefy power supply. Put a good GPU (or two) in there and you have a Folding monster.

I just got my T7400 up and running. Had to install Win7; Ubuntu was just not working for me. Folding with 6 CPU cores and the GTX 980.
 
In my case, it's a big box with two PCIe x16 slots, a 1,000 W power supply, and eight 3.0 GHz Xeon cores to drive them, all for less than the cost of an entry-level PC. It may be that a bunch of these boxes are coming off lease and making it to market at reasonable prices.

I spent more on the GPU than I did for the computer, and Ubuntu 14.04 LTS doesn't add any extra cost, so if you look at Points per Day per Dollar, it's a better ratio (over 550 PPD/$ in my case) than going out and building a modern Folding rig, since the majority of the Points comes from the GPU.

That said, I wouldn't spend more than $350 on a Dell T7400. Mine was $256 delivered, and Voyager6 spent less than that for his. The box does weigh 60+ lbs, though, so shipping costs alone can be significant.
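
For anyone checking the PPD-per-dollar math, the rough arithmetic works out like this (the GPU price and PPD figures below are round illustrative numbers, not my exact ones):

    $256 box + ~$500 GPU  ≈  $756 total
    ~415,000 PPD ÷ $756   ≈  550 PPD per dollar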
 
I was wondering why you guys didn't just build your own folding machines. I guess that makes sense. If you have to buy your own case, CPU, motherboard, RAM, and power supply separately, it would be tough to match that Dell for $250.

Since most of you don't seem to fold on the CPU, I would think a Haswell or Skylake i3 would be enough to keep dual GPUs busy while folding.

If you are interested in CPU folding, the $250 Xeon E3-1231 v3 is faster, with a 3.4 GHz base clock and a 3.8 GHz turbo boost. It has Hyper-Threading to bring the thread count up to 8, and it is compatible with a standard Haswell motherboard. It's basically an i7-4770 without the integrated graphics. Those standard Haswell motherboards aren't going to support dual CPUs, though.

Neither of the build options I mentioned stacks up to what you are getting for $250, though. They would if you were comparing against new PCs, but the price-to-performance math doesn't add up against a used server like that.
 
