[Worklog] The Beast Slayer
#1
For those who have kept track of my updates on the Pride of Hiigara, you'll already be aware that I've had to cancel it due to a change in my company accommodation. Since my company is moving me to another apartment, where the wall dimensions will probably be different, there's no sense in continuing with the kind of wall-mounted computer I had wanted to go for.

However, the abandonment of my previous project does not mean I am completely stopping. After all, I still need an insanely overpowered system, so I shall begin a new project. Going with my obvious bias for the Homeworld series, the system shall be called The Beast Slayer.

For those who are unfamiliar with Homeworld lore, The Beast can be likened to a cancer on the galaxy, and the group that defeated it was known as the Beast Slayers (Full Beast article). For this reason, I found the name rather fitting.

So this is how the entire process will work:

1: Amass the necessary funds in the form of cash and deliver it to someone in the US willing to use their bank account and address to purchase the hardware.

2: Pick out and purchase the hardware at the time when the cash is handed over.

3: Parts will then be delivered to this person. Once received, the hardware will be tested to ensure nothing is DOA.

4: After successful testing, weight and volume reduction will be performed, provided it is safe to do so. For example, the CPUs and RAM will be installed into the motherboards and the assembled sets placed back in the original motherboard boxes, while the original CPU and RAM packaging is disassembled and flattened. Once completed, everything will be shipped in a single, larger box to my address.

5: Once I receive the parts on my end, my personal system will be assembled, along with the Beast Slayer hardware. While my system will be housed in a standard case, a large custom case will be built for everything else.

Basic details of the project:

1: There are two major parts to this project: the Beast Slayer itself, and my personal system. Both will be used for DC projects, but the former will run them exclusively, while the latter will double as my everyday machine.

2: The Beast Slayer will actually consist of two completely separate sets of hardware. They will be contained within the same case and will run headless, with no monitor or peripherals of their own; they will be accessed via SSH from the personal system (a rough sketch of what that access could look like follows this list).

3: One of the Beast Slayer systems will run on Windows, while the other will run on Linux. Each will run the projects best suited to its respective operating system.

4: The first case that the Beast Slayer will be placed in will likely be a temporary one. I am currently researching the feasibility of a chillbox, which is essentially an airtight case that is cooled in a manner similar to phase change cooling. Think 'computer in a freezer', except that actual freezers are never a good idea.
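Since both Beast Slayer systems run headless, remote access is worth sketching out early. Below is a minimal example of how the personal system could run a command on each node over SSH, using Python with paramiko. The hostnames, username, and key path are hypothetical placeholders, not the actual setup, and the Windows node would of course need an SSH server installed:

Code:
import os
import paramiko

NODES = ["beastslayer-linux", "beastslayer-windows"]  # hypothetical hostnames

def run_remote(host, command, user="hiigaran"):
    """Open an SSH session to a headless node, run one command, return output."""
    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect(host, username=user,
                   key_filename=os.path.expanduser("~/.ssh/id_rsa"))
    try:
        _, stdout, _ = client.exec_command(command)
        return stdout.read().decode()
    finally:
        client.close()

for node in NODES:
    print(node + ": " + run_remote(node, "uptime"))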



The Hardware:

You probably raised an eyebrow at how I planned to purchase the hardware. Why not just go online and buy the parts yourself? Surely that would have been easier, no? Well, while it would have been easier, it would also have been more expensive. Given that the US is one of the cheapest places to buy parts, and that I do not live in the US, I'd be paying more than necessary if I bought the parts locally.

Okay, so why not buy from a US site and have them ship it to me instead? Nope, still going to cost more than necessary; it would probably have been the most expensive option of all. One, my bank account is obviously not in USD, so a currency conversion has to be done. No business will ever exchange at the true rate, so cash is lost there. International transfers also incur a fee, so more cash is lost on that. Then, getting things shipped from the US ends up costing a ridiculous amount. Not a suitable option.

How about going to the US and buying the parts in retail stores then? I'm obviously flying around for work, and I get to go to the US often enough. Seems like a good solution, yes? While it is definitely better than the previous option, retail is more expensive. Plus, I wouldn't have enough room to fit everything in my suitcase, forcing me to still pay a lot in shipping.

Ideally, buying the parts from an online US store such as Newegg would be the best option. Since I can't actually do that myself due to the aforementioned points, and I can't open a bank account in the US as a foreigner, I needed a little help. Ruthalas was kind enough to lend a hand here. While I already had the required USD in cash, I needed some way of making that cash work in my favour. So a plan was made.

By requesting a flight to where Ruthalas lived, I could carry the cash with me, meet with him, pick the parts, and have him use his account to pay for them, at which point I would hand the cash over. Essentially, Ruthalas is the middleman who allowed this project to be much more than it could have been had I done everything on my own. This brings me to what we ended up purchasing:

Personal system:

GTX 1080:
- 1x EVGA
- $619.99

PSU:
- 1x Rosewill 700M
- $64.99

SSD:
- 1x ADATA 240 GB
- $64.99

Mobo:
- 1x ASRock Z170 Pro4S
- $68.49

RAM:
- 2x APACER 8GB DDR4 2133
- $31.99

CPU:
- 1x i5 6500
- $204.99

HSF:
- Cooler Master Hyper 212 EVO
- $29.99

HDD:
- 1x Seagate ST4000DM000 4TB
- $95.00

Monitor:
- ASUS VP247H-P
- 23.6 inches
- 1ms response time
- $125.99

Keyboard & Mouse:
- $29.99

Beast Slayer:

GTX 1080:
- 5x Gigabyte
-- $609.99
- 2x ASUS
-- $599.99
- 2x MSI
-- $615.00

PSU:
- 2x Rosewill 1600s
- $299.99 new
- $191.99 open box

Mobos:
- 2x ASRock X99 WS-E
- $399.00

RAM:
- 2x APACER 8GB DDR4 2133
- $31.99

CPU:
- 2x Xeon E5-2609 v4
- $609.84

HDD:
- 2x RuthalaDrives



Build log:

[Image: 01S3r4I.jpg]
[Image: KAvBFo6.jpg]
[Image: yvm05RG.jpg]

All the ordered hardware arrives with Ruthalas. Of course, I had to be stupid and order an incompatible CPU, so that had to be returned and exchanged for the right one. In my defense for making such a rookie mistake, there were sales which ended only minutes after we ordered everything.

[Image: wWMXE04.jpg]

Assembly of the personal rig.

[Image: DQBVQYV.jpg]

Personal rig up and running!

[Image: iJyTb7R.jpg]

The heart of Beast Slayer number 1!

[Image: eZML1aA.jpg]

PSU unboxing. These things are absolutely massive and probably weigh about two kilos each. I don't think PSUs get any bigger than this. Pity one of them died later on, so an EVGA SuperNOVA 1600 replaced the dead unit.

[Image: BI6L5JX.jpg]

The testbed. Everything getting stressed and checked for functionality.

[Image: gq87FLN.jpg]

Slight complication with the boards. It turns out that one of them shipped with outdated firmware on both the main and backup BIOS, making it incompatible with the CPU.

[Image: PnUgJZ2.jpg]

Luckily, the other board had a more recent version, so one BIOS chip was pulled from each board and swapped. Once each board booted from the newer chip, the older chips were flashed.

[Image: 8MUqivJ.jpg]

In preparation for my return to Seattle, Ruthalas packed everything into three bags and met me at the hotel I was staying in. Additional packing material was brought along.

[Image: ZeKyeSk.jpg]

Problem is, we needed to take two suitcases and a cabin bag's worth of hardware and fit it into one cabin bag and one suitcase. So all unnecessary packaging was discarded. Serial numbers were cut from the hardware boxes and kept separately.

[Image: m67t0cz.jpg]

As my personal rig was better left in its case, we immobilised the hardware by stuffing newspaper-wrapped foam, cut from the original packaging, into every gap. To make use of some of the safer spaces, some cables, smaller items such as bags of screws, and the mouse were also stashed inside.

[Image: vnu7s3d.jpg]

Only one thing loaded in the suitcase, and already running out of room!

[Image: mYkHtik.jpg]

That said, I think we did a pretty good job of playing the world's riskiest game of Tetris!

[Image: JNUhogd.jpg]

Monitor was placed last, upside down, and aligned to the edges of the case, so that the screen would not have anything pressing on it. Some additional foam was used to immobilise the parts as much as possible.

[Image: 3F2gSPN.jpg]

The most valuable items would go in my cabin bag, so I managed to fit the other 9 video cards and pad them with more foam. Was actually quite amazed that things fit so well.

[Image: NDS940v.jpg]

This is what I was left with. Unfortunately, the motherboards couldn't fit, despite having the CPUs and RAM already installed and both boards packed into a single box. Thankfully, one of the other crew agreed to take them in their suitcase.

I also did some last-minute rearranging of the suitcase and managed to fit in the bag of manuals and serial numbers, as well as the side panel of the case. The reason the panel was off is that it has a section that sticks out, which would have damaged the monitor, so it had to be removed.

All in all, the suitcase weighed 35 kilos. I'm not sure what surprised me more: the fact that they accepted that weight, or the fact that the TSA actually did a pretty good job of inspecting the contents and returning everything to the right place without damaging anything.

Anyway, once I got back to Dubai, I was pulled aside at customs. Because there was a heap of highly valuable items in my bag, I had to pay duty on them. Unfortunately, I did not have the invoice on me at the time, so they had to hold the parts until the value could be determined. On top of that, because I had all the drives in the case, the entire computer needed to be sent off to a media branch of the government to ensure nothing illegal was on the drives (which in an Islamic country would include porn and anything anti-Islamic). Apparently it would take three days before I could collect it, but since I had other flights in the meantime, collecting it after those three days wasn't possible.

On the bright side, I was able to keep a few of the items. The monitor, keyboard, mouse, and PSUs were good to go. The other crew member made it through customs with my motherboards as well, but went home because I was stuck in customs for an hour and a half. The parts were picked up and brought home the next day, however.

[Image: waoxb4e.jpg]

In the meantime, to pass the time, I prepared the motherboard mounting plate for the system. This piece of 10mm acrylic was cut from some scrap left over from my previous project. You can ignore the markings; they were from an initial idea for how to lay out the boards, which has since changed.

[Image: U3ln3fN.jpg]

Now that the boards were here, I could use them to take measurements for where the standoffs should go. After deciding on the board arrangement, I marked the standoff holes and drilled partway into the acrylic; deep enough for the standoffs to go in, but not enough to break through the other side.

[Image: RGaGMZ0.jpg]

Once the holes were drilled, I used a tapping tool to cut threads into the holes, so that the standoffs could be screwed in. Had I not done this, the standoffs could just be pulled out, unless I used glue to hold them in place.

At this point, I also removed the protective paper on the acrylic.

[Image: OvpY35t.jpg]

While screwing in my first standoff, I ended up snapping the threaded part off from excessive torque. This was partly because I didn't tap the hole deep enough (phrasing). Acrylic isn't like wood, which is soft enough that tapping isn't always required. With acrylic, if the hole isn't tapped, either the material will crack or the screw will break.

The solution in this case was to offset the rest of the standoffs 5mm to the left for the left board, and 5mm to the right for the right board. Thankfully, I had not drilled every hole beforehand, so I wasn't left with a bunch of unnecessary holes.

[Image: ZG6M4PL.jpg]

Boards mounted.

[Image: e9VhrEm.jpg]

...And well aligned! I'm proud of myself for that!

[Image: 91lDmbV.jpg]

The PSU mounting plate is next. This will be a base for the PSUs to sit on, while the plate hangs underneath the motherboard mounting plate. The idea is to stand the PSUs on their sides, next to each other, with the fans pointing outwards, the AC sides facing the back, and the DC sides facing the front.

So the PSUs were placed in this position on some more scrap acrylic and had their edges traced. Using a protractor, I drew 45-degree lines from each corner and marked a spot 2cm along each line. These spots were then drilled with 12mm bits so that 20cm lengths of M12 threaded rod would fit through them. The rods have nuts at each end, securing the plates to each other and also allowing the PSUs to be secured by tightening the nuts until the two plates essentially clamp down on them.
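As a quick sanity check on those marks: a point 2cm along a 45-degree line from a corner sits about 1.41cm in from each edge. A few lines of Python make the arithmetic explicit; the plate dimensions below are made-up examples, and only the 2cm offset comes from the build:

Code:
import math

# 2cm along a 45-degree line = 2*cos(45) ~= 1.41cm in from each edge
OFFSET_CM = 2.0 * math.cos(math.radians(45))

def hole_centres(width_cm, depth_cm):
    """Four hole centres, measured from the plate's bottom-left corner."""
    return [(x, y)
            for x in (OFFSET_CM, width_cm - OFFSET_CM)
            for y in (OFFSET_CM, depth_cm - OFFSET_CM)]

print(hole_centres(30.0, 25.0))  # hypothetical 30cm x 25cm plate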

[Image: hi0Fy1R.jpg]

Due to the location of the rods that will attach the PSU mounting plate to the motherboard plate, two of the rods will be positioned underneath the motherboards. While the other two rods are well clear of any components and may be secured with M12 nuts that are suitable for the M12 rods, I needed another solution for the pair under the boards.

The two options I had were to either find some low-profile nuts that would fit in the gap under the motherboards created by their standoffs, or find some way to use smaller fasteners. So I took a length of M12 rod and drilled a 5mm hole into its end. I then used an M6 tap to cut threads in the hole, which are barely visible in the photo. Not a pleasant experience, mind you; tapping steel by hand is an arduous task. Also, as I had no other suitable substance, I resorted to using cooking oil for lubrication.

You might also notice that the hole is slightly off-center. Let's just say that I'm quite surprised I was able to drill the hole that close to the center as it is, as well as drill more or less straight enough that I didn't break through the side of the rod. I wish I had the space for a drill press.

[Image: fIh8ezN.jpg]

A quick test with an M6 screw shows that both holes were drilled and tapped deep enough to hold a significant weight. The screws will need to be cut to an appropriate length to ensure there's no gap between the bottom of the motherboard plate and the top of the M12 rods.

[Image: XuE78vT.jpg]

The video cards and my personal system finally cleared customs after 13 days. Due to the delay, I didn't have to pay the storage fee, which would have been about $30 a day, and the duty was only 1%. So I paid the duty and carried the box home.

I guess I got a little too excited, because I only remembered to take this photo after I had pulled the first few cards out. Yes, those are paper towels that were used as packing material. A bit ridiculous that this was what I was provided with, but after a thorough inspection of the PCBs on the back of each card, everything looks intact.

At this point, I did not test any cards, due to time constraints, but my personal system was working, so there's that.

[Image: OgP09dn.jpg]

You can see how everything is starting to take shape now. The threaded rods were cut to length, and nuts secure them in place on the PSU plate. Additional nuts were added to the front two rods, and the motherboard plate was lowered into place. The rods at the back are the ones with the tapped screw holes, so as the plate was lowered, the screws were inserted through it, each with a nut, and screwed into the rods.

While the PSU plate aligns immediately, since the nuts are of uniform height and no rods protrude past it, the motherboard plate needs to be leveled by adjusting the appropriate nuts. Using a level, I can ensure everything ends up perfectly straight.

[Image: urzfWEz.jpg]

PSUs fit perfectly. At the moment, there is just enough height clearance for the hardware to slide in easily. Later when the hardware is mounted for the finished setup, the nuts will be tightened to the point where a small compressive force will hold the PSUs in place.

You can also see that the right rod is misaligned. I'm not sure if I mismeasured, drilled at an angle, or a combination of both, but thankfully it won't matter, as I don't plan on leaving the rods visible.

[Image: XPrsMrE.jpg]

To hide the unsightly threaded rods and nuts, I cut lengths of shower curtain rods (found some thrown away, so I took them). They were just the right diameter to fit around the nuts. Plus they are shiny!

[Image: CIS1gf6.jpg]

Curtain rods in place over the threaded rods. At this point, I began doing some leveling. With the mounting plates secure, I tightened the top nuts and screws until everything was reasonably level, both longitudinally and laterally.

[Image: OBnkWyw.jpg]

I changed my mind about securing the PSUs with compressive force. Instead, I used strips cut from rubber bands to prevent motion through friction. I also started peeling off the protective acrylic coverings. More satisfying than peeling the protective plastic off electronics, because you can put a little force into it!

[Image: yZC1oP4.jpg]

PSUs in place, and acrylic goes all naked and lewd on us!

[Image: Ncx33hX.jpg]
[Image: GZNQ85r.jpg]

Ladies and gentlemen, I give you the Clusterfuck! Without cable management, the simplicity and elegance that the acrylic and hardware had just moments prior simply vanished. I'll need to fix that at some point. Still plenty of work to do, but everything is running at this point.

You'll notice that the second system is running only three cards. That's because while I was able to reach the power button on the motherboard of the first system, the second system's button is blocked by the first. I can enable Wake-on-LAN and turn it on remotely, but I need to set that up first.
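The Wake-on-LAN side is simple enough to script, for anyone curious. A magic packet is just six 0xFF bytes followed by the target NIC's MAC address repeated sixteen times, sent as a UDP broadcast. A minimal sketch in Python, with a placeholder MAC rather than the actual one:

Code:
import socket

def wake(mac="00:11:22:33:44:55", broadcast="255.255.255.255", port=9):
    """Send a WOL magic packet: six 0xFF bytes, then the MAC sixteen times."""
    payload = bytes.fromhex(mac.replace(":", ""))
    packet = b"\xff" * 6 + payload * 16
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        s.sendto(packet, (broadcast, port))

wake()  # power on the blocked second system without reaching behind the first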

At this point, the first system was already operational, outputting just short of 3 million PPD on Folding@Home. Fans are LOUD at 100%, but necessary, as the hottest card runs at 82 degrees. The one sitting right on the end with unrestricted airflow is chugging away nicely at 58 degrees though.

[Image: LLez7Evh.jpg]

To help tame the cable jungle, I've cut a length of cable trunking, then cut two sets of holes for the PCI-e cables to plug into the video cards, and another hole in the center where the cables come together. Additional trunking will be made to hide more cables and guide them to where they need to go. When the chillbox casing is built, the trunking will be aligned properly and secured.

Speaking of the chillbox, as I would need a decent amount of insulation, I've doubled up on the standoffs, so there's a larger gap between the motherboards and the acrylic.

I'm also tempted to swap the hard drives out for some M.2 drives, just to get rid of some additional cables. A couple of 30 or 60 gig M.2 drives would probably cost about 60 to 80 dollars at current prices. I don't think it's worth it, though. These systems don't access the hard drives that frequently, and they're not set to save their data at intervals frequent enough to cause any major interruptions. Besides, that cash can go towards more cards!

[Image: y8xykALh.jpg]

Did a little more cable management. The vertical cable trunking is actually two pieces back to back, secured with super glue. The alignment is a little off, but if I do finally finish the rest of the system within the next decade, the alignment will be sorted when the trunking gets anchored to the case I plan to make. At the moment, all the trunking is supported by the cables themselves and can be adjusted to an extent by wiggling the cables around. There's still a lot of cable management to do, but the trunking handles most of it very well.

I should probably do a bit of dusting as well...
#2
This is good news, I'm glad to hear you'll be able to come up with something despite the change in living arrangements.

I know I keep going on about single rackmount solutions (such as the ASRock 3U8G-C612), but I honestly feel your concerns about PCI-E bandwidth are unwarranted. I recently started folding on my dual 980 Ti rig again to try to delay being overtaken in F@H, and I have been monitoring PCI-E bus usage constantly.

One GPU runs at 3.0 x16 and the other at 3.0 x8. Not once has the first ever reached 25% bandwidth usage; similarly, the second has never got near 50%. An off-the-shelf rackmount solution with two CPUs would be more than capable of running F@H on 8 of the latest GPUs with x8 lanes each, and then some!
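For anyone who wants to check their own cards, nvidia-smi can at least report the negotiated link per GPU. Note this shows the link configuration only, not live bus usage; the usage percentages above came from a separate monitoring tool. A quick sketch:

Code:
import subprocess

# Print each GPU's negotiated PCI-E generation and width (not live usage).
fields = "index,name,pcie.link.gen.current,pcie.link.width.current"
out = subprocess.check_output(
    ["nvidia-smi", "--query-gpu=" + fields, "--format=csv"])
print(out.decode())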
#3
I don't remember if I asked you how much one of those costs. Would buying one of those plus all the GPUs come out cheaper than buying multiple cheaper mobos that might only support two or three cards each? Assume the latter setup uses a single NAS drive for everything, and that the PSUs split their power across the boards.
#4
That particular chassis is around €4,000 (there are other similar ones on the market), but then all you need to add are the GPUs, two CPUs, a little RAM, and a hard drive or two. I don't know how the cost compares with using multiple cheaper motherboards, but I wouldn't be surprised if the single rackmount chassis comes out more expensive. You're bound to pay a small premium for the convenience of having everything in one simple, portable package. You won't get that with lots of small PCs, and that might make your life difficult if your living arrangements are getting less stable.
#5
I've got no need for the all-in-one system that I placed a lot of emphasis on with the previous project. What I'm building is essentially one oversized case that I can place several mobos in, side by side.

Still, at that cost, I'd probably only be able to buy two or three cards at the most.
#6
What's your budget like? I seem to recall $10k being mentioned before. I doubt you'll end up with more than two motherboards; here's a list I quickly threw together for a four-GTX 1080 build (LINK). You'd need to build an open-air case, but they're cheap to make, and there are tonnes of guides from the bitcoin mining community for this. You could buy two sets of this spec and end up with eight 1080s for around £6,000 (~$7,300 USD).
#7
Have you considered building two (or three) entirely separate systems, instead of having multiple motherboards in one case? My four GPU system is 55 pounds (25 kg), and it's already very hard to move around. Although if you can wheel the case everywhere then that makes it easier.

So are you planning on using risers to get 5+ cards on a motherboard (which also means better cooling for air-cooled cards), or just having 4 cards per motherboard?

If you had a specific parts list that would help. :P
#8
Budget will be at least $7000 USD, as that is the amount of US cash I should have upon arrival. I can bring additional cash and have it converted if need be, which I probably will.

I'd prefer not to build separate systems, as a single custom case will be simpler in the long run. The wheels are there only to make easy adjustments to positioning; everything will be dismantled during my move, because I obviously don't want things knocking around.

I don't plan on any risers for this build. I will, however, need to make a support so that the video cards won't get torn off the mobo. Simple enough to make.

Specific parts aren't decided for now. Likely, each system will have 8 gigs of DDR4, a LAN connection to a router with a NAS so it can boot from an appropriately partitioned drive of 4 gigs or so, and high-wattage PSUs with power cables split to reduce the number of PSUs needed.

I'm thinking three mobos would be sufficient and would also allow for a little expansion, provided those boards can run three PCI-e 3.0 x16 slots.

One thing, however, is that I also need to build my personal rig in its own case. That will probably take about $1200 out of the available budget.

EDIT: Doing a quick Newegg search, I can see the following items:

RAM for $32
PSU for $240
HDD for $109

Assuming three boards, that's three sticks of RAM, two PSUs, and one HDD in a NAS. A NAS enclosure is probably another $100, so this is an alright start (rough tally below). Like I said, though, the specific hardware will be chosen on the day in a joint decision, to ensure prices are up to date. Hopefully I can find some good combo deals as well.
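A rough tally of those base-platform numbers, excluding boards, CPUs, and GPUs; the $100 enclosure figure is an estimate:

Code:
# Rough base-platform tally from the Newegg prices quoted above.
parts = {
    "RAM stick":     (3, 32),   # one per board
    "PSU":           (2, 240),
    "HDD":           (1, 109),
    "NAS enclosure": (1, 100),  # estimated
}
total = sum(qty * price for qty, price in parts.values())
print("Base platform: $" + str(total))  # $785 before boards, CPUs, and GPUs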
#9
Why do you need a NAS? Why not boot locally off some cheap flash? There's nothing wrong with risers in the right situation, imo; if it's not a completely enclosed case, then they make sense for spacing the GPUs out for better airflow. It's easy to find motherboards which support more than 3 GPUs each, and the fewer motherboards you have, the further your money will stretch. Having three motherboards just to get three 3.0 x16 slots each is a complete waste of money.
#10
Well, another advantage of separate systems is that you could use off-the-shelf cases, which would provide enough structure to hold the GPUs firmly in place, even while moving. To move my machine, I just lifted it out of the crawlspace of my previous home and put it in the crawlspace of my new house; no disassembly required.

And yeah, you probably don't want to boot off a NAS. Believe me, I tried. In addition to being a giant pain to set up and maintain, it also caused a serious performance hit in F@H. In the end, I bought a bunch of cheap 80 GB drives off eBay for $60 (so $15 per system for a hard drive, plus spares; this was back on my four-computer CPU system). Much easier to set up, and F@H performed as expected.

Also, which GPUs are you planning to use? GTX 1080?
#11
Interesting. Didn't realise a NAS would take that much of a performance hit, considering the amount of data that can be transferred over ethernet. Righty, that's noted. Cheap individual drives it is, then. Out of curiosity, though, do you know if increasing the setting for time between disk accesses made a difference in performance? For a NAS setup, it sounds like the issue could be significantly mitigated through infrequent disk access. Either way, the reason I was interested in a NAS was that I also wanted a place to store movies for use with a home cinema, networked for all systems.

The plan is to go for 1080s, yes. However, as I am placing more emphasis on PPD per dollar this time, I might look into one of the other 10xx cards as well. My personal rig will definitely have a 1080 though.

As for not going with risers, I feel as if the airflow issue won't be as significant here, considering the flagship card uses a mere 186 watts at maximum.
#12
Actually, I might be mixed up; I think it was certain BOINC projects that took a hit with network drives. But yeah, it confused me too. The network drives tested fine for bandwidth and latency, so idk what was going on. Either way, it's still a huge pain, and I wouldn't do it again for just three or four computers.
#13
Interesting... I revisited the thread I originally made on the official folding forum to try to get a better understanding of PCI-e bandwidth limitations and how they might affect the performance of a flagship card. One of the unread posts was:

Quote:There is test of gtx 1080 on pcie 2.0 x1 and only 7% performance loss compared to pcie 2.0 x16

If I assume this scales linearly with bandwidth, the gen3 x1 equivalent would be a 3.5% loss, and dropping from x16 to x4 on gen3 would therefore cost somewhere under 1% (roughly 0.9% by the same scaling). The test was performed on a Linux distro, though, so I'm not sure how Windows will handle it.
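To make that back-of-envelope explicit, here it is in code. It assumes the performance loss is inversely proportional to usable link bandwidth, with a gen3 lane carrying roughly double a gen2 lane; that's a strong simplification, so treat the outputs as ballpark figures only:

Code:
# 7% loss measured at PCIe 2.0 x1; assume loss scales inversely with
# link bandwidth (a strong simplification of reality).
BASE_LOSS = 0.07   # measured at gen2 x1
GEN2_LANE = 0.5    # approx usable GB/s per gen2 lane; gen3 roughly doubles it

def estimated_loss(gen, lanes):
    bandwidth = GEN2_LANE * lanes * (2.0 if gen == 3 else 1.0)
    return BASE_LOSS * GEN2_LANE / bandwidth

for gen, lanes in [(3, 1), (3, 4), (3, 16)]:
    print("gen%d x%d: %.2f%%" % (gen, lanes, 100 * estimated_loss(gen, lanes)))
# gen3 x1: 3.50%, gen3 x4: 0.88%, gen3 x16: 0.22%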
#14
Barring a minor delay, everything has been ordered. More details will come in the next few days, but as I'm sure you're all curious: a total of ten 1080s have been ordered, along with everything else.

It was rather nerve-wracking to be carrying almost $8500 in cash through the city!
#15
That's great news! Would you be able to share the order details? I'm interested to see the configuration you've gone for.




SOON