Tuesday, October 13, 2009

Design Goals

Before we get our hands dirty building the Extreme Media Server, let's talk a bit about the design goals.

Key items that I want for my EMS:
  • Easy, straightforward, quick access to any and all media files and drives
  • Ability to run programs at the server level (such as TVersity to transcode files for playback on the XBOX, etc.)
  • Use only basic/commodity type hardware whenever possible
  • Use minimal runtime power
  • Generate minimal heat
  • Ability to run Virtual Machines for pet projects
  • Ability to install a high-end GPU/Graphics Card for potential server-side transcoding projects
  • Use a single power supply instead of the standard two supplies that BackBlaze typically uses
Maybe more importantly than what I want, is what I don't want:
  •  Hardware RAIDs
  •  Lack of power management capabilities for the drives
  •  Non-standard file systems (in my case, I'll be using NTFS - for those in the Linux world, please don't hold this against me!)
So with that in mind, here's my build list:

1 - BackBlaze Custom 4U Server Pod Case
1 - Associated Parts - Nylon Mounts, Nylon Screws, 90 degree molex connectors, fans, etc.
9 - 5-Port SATA Backplanes with Integrated Port Multipliers
2 - 3124 Based Silicon Image Controllers (4-Port, 64-Bit PCI)
1 - 3124 Based Silicon Image Controller (4-Port, PCI Express)
1 - ASUS P6T Workstation Edition MB (i7 Based)
1 - Intel Quad i7 Processor
2 - 6GB DDR3 Memory Kits (12GB Total)
1 - PCI Express Video Card - basic for now
1 - 1200 Watt Thermaltake ToughPower Power Supply with Modular Connectors
1 - Windows Server 2008R2 x64 with Hyper-V
1 - SATA Drive for Loading OS (i.e. Boot Drive)


I know I'll get a ton of questions on this, so let's talk a little turkey on the RAID setup.

Some of you will no doubt want to RAID your data for protection. I'm not a fan of anything more complex than mirrored RAID setups. I've lost enough data with drives in a typical RAID 5 setup that I frankly avoid them like the plague. Another consideration is that if this is going in a home, the usage is going to be extremely light. Keeping a RAID array powered up will use a lot of extra energy and generate a lot more heat. If you are like me and live in a Sun State, the last thing you need is more heat!

The other reason people want a RAID is for performance, but you'll never see it at the client level. The best connection most people will have to a server like this is a hard-wired gigabit network connection. A typical hard drive can read about 100MB/sec, or 800Mb/sec. In short, your gigabit ethernet (i.e. 1000Mb/sec) will quickly become the bottleneck for your data transfers.

The good news is that I did a test with my test EMS box and was able to stream 16 movies at once without a RAID. This was a mix of bitrates comparable to everything from iPod to DVD to BluRay. With all 16 movies playing/streaming, I used just over 100Mb/sec of bandwidth on my network wire and around 9 to 12MB/sec of disk throughput - only about 1/10 of what a decent desktop drive is capable of outputting. For a typical setup serving a few MP3s and a movie or two concurrently, you'll have more than enough bandwidth without a RAID.
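The back-of-the-envelope math here is easy to check. A quick sketch using only the numbers from the post:

```python
# Sanity-check the post's throughput numbers.
drive_read_MBps = 100                    # a typical desktop drive, ~100 MB/sec
drive_read_Mbps = drive_read_MBps * 8    # bytes -> bits
print(drive_read_Mbps)                   # 800, matching the 800Mb/sec figure

gigabit_Mbps = 1000                      # wire speed of gigabit ethernet
# A single drive's reads already fill most of the wire:
print(drive_read_Mbps / gigabit_Mbps)    # 0.8 -> the network is the bottleneck

# The 16-stream test: ~100Mb/sec on the wire, 9-12MB/sec off the disks.
disk_used_MBps = 12
print(disk_used_MBps / drive_read_MBps)  # 0.12, roughly 1/10 of one drive
```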

In short, RAID if you want, but you'll use more power and probably get no real-world performance benefit. What you might get is data protection, but with 45 slots for extra hard drives it might be just as easy to script a backup to a spare HD or use BackBlaze for online backups :)
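If you go the script route, here's a minimal sketch of the kind of backup I mean - the drive letters and folder names are made-up placeholders, and it only copies files that are new or have changed since the last run:

```python
import shutil
from pathlib import Path

def mirror(src: Path, dst: Path) -> None:
    """Copy files from src to dst, skipping anything already backed up
    with the same size and modification time."""
    for f in src.rglob("*"):
        if not f.is_file():
            continue
        target = dst / f.relative_to(src)
        if target.exists():
            s, t = f.stat(), target.stat()
            if s.st_size == t.st_size and int(s.st_mtime) == int(t.st_mtime):
                continue  # unchanged since the last run
        target.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(f, target)  # copy2 preserves timestamps for the next check

# mirror(Path(r"D:\Movies"), Path(r"Z:\Backup\Movies"))  # placeholder paths
```

Run it from Task Scheduler overnight and the spare drive stays current without keeping a RAID array spun up.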

The next big choice is whether to use Linux or Windows. I know Microsoft Windows Server like the back of my hand. I support it all day long, and for me there really wasn't any other choice. At the end of the day, it is cheaper for me to use Windows Server 2008R2, as I'd spend much more time learning the Linux side of things, and that time is a real cost for me. Either OS is great and both have support for Port Multipliers. Use what you are comfortable with, or use it as an opportunity to learn something new. I would also encourage you to use the 64-bit version of whatever OS you decide on.

If you've got an idea to make the EMS better, let me know. If I like it, I'll give it a try and post the results!

That covers the basics for now. Next up: installing and wiring up the port multipliers.

15 comments:

  1. Don,

    A few questions if you don't mind...

    1. Are you using each drive as a different "letter"?

    What about doing JBOD on half x2 so you only have 2 drive letters. One JBOD could then be used for backup via a script or program.

    Or would a software JBOD cause it to run slower/use more energy?

    Of course if you are using single drives, you could always have a script create a list of all the files on each drive. Very easily done w/ a batch file.
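A sketch of that kind of per-drive file list (shown here in Python rather than a batch file; the drive letters in the commented-out call are just examples):

```python
import csv
from pathlib import Path

def inventory(drives, out_file="media_index.csv"):
    """Write one CSV row per file found on each drive/folder."""
    with open(out_file, "w", newline="") as fh:
        writer = csv.writer(fh)
        writer.writerow(["drive", "path", "size_bytes"])
        for d in drives:
            root = Path(d)
            for f in root.rglob("*"):
                if f.is_file():
                    writer.writerow([d, str(f.relative_to(root)), f.stat().st_size])

# inventory([r"D:\\", r"E:\\"])  # placeholder drive letters
```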

    2. Would Windows XP Pro 32bit w/ 4GB of ram work? Or XP Pro 64bit w/ more ram?

    3. What about an AMD CPU?

    My plan is for this to mainly be for storing files and my other PCs will have Media Portal or other software on them to access the files.

    Of course I would like to find software that has the file info on the server so I don't have to add it to each PC. All the solutions I have found won't mount & play ISO files though.

    I am also going to build my case out of sheet metal and square tubing. Doing this should allow me to change the drive arrangements and even add more by using the onboard e/sata ports. I am considering 2 or 3 rows of vertical drives and mounting a modified drawer LCD for when I need to work on it.

    Making the case should be far less than $1k, and you can paint it any color you want - but mine is going in the closet. It could also be done to allow the addition of extra drives later. Have all the cables run already so you just have to plug it in.

    Lastly, you can get the CFI-B53PM Port Multiplier (PM) Backplane at: http://www.storage4mac.com/cfpomupba.html

    They are $48 though.

    Sorry for such a long post!

    Garrie

  2. @Don

    Again, you single out many of my "thoughts".

    Sure, everyone will have to make their own decisions regarding disk setups (RAID or not), OS, etc. based on their own experience. We all choose according to what we're more comfortable with.

    I have no real experience with RAID5, and was hoping it would be as good as theoretically expected for data safety. My only experience has been with RAID0 for performance reasons (not in a NAS role of course!)
    But indeed RAID adds an extra layer of complexity should you ever have to move those drives to some other setup due to unforeseeable events and get the data back.

    But... on the other hand, though I could probably figure out a way to replicate "critical data" on two or more drives for backup, there are terabytes of non-critical stuff that I don't consider worth replicating at "twice" the space - hence the attraction of a striped RAID5 array that would provide adequate protection against a single drive failure in a more efficient manner.

    ... decisions decisions...

    I'm also curious about how all those disks show up: Are they instantly recognized by the BIOS, or do they just show up in the OS after installing the SATA controller drivers?



    As for the power part, I'm also waiting to hear more details. How much power does it all use, how you handle the power-up surge, etc.

    Keep up the excellent posts! :)

  3. >> 1. Are you using each drive as a different "letter"?

    I am using NTFS Mount Points. These allow you to mount/associate a hard drive to a folder at the server level. I then take these folders and share them as needed for remote access. Since I'll have over 26 drives in the system that aren't RAID'd, as far as I know this is the only option available.
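For anyone curious what those mount points look like in practice, here's a small cross-platform sketch that lists which subfolders of a given folder are actually mounted volumes. The `C:\Media` path is just an example, and note that Python's `os.path.ismount` only detects NTFS mount points (not just drive roots) on reasonably recent versions:

```python
import os

def list_mount_points(root):
    """Return the subfolders of `root` that are actually mounted volumes."""
    mounts = []
    for name in sorted(os.listdir(root)):
        path = os.path.join(root, name)
        if os.path.isdir(path) and os.path.ismount(path):
            mounts.append(path)
    return mounts

# list_mount_points(r"C:\Media")  # placeholder folder holding the mount points
```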


    >> 2. Would Windows XP Pro 32bit w/ 4GB of ram work? Or XP Pro 64bit w/ more ram?

    I personally wouldn't use XP. Windows 7 would be a good candidate though. While I don't know of any technical reason that a 32-bit OS won't work, it just doesn't feel right for an 'Extreme Media Server' - going 64-bit just seems the way to go :)!

    Just keep in mind that Microsoft non-server OSes don't include support for software RAIDs if you are going that route.

    If you are trying to keep costs down, take a look at Ubuntu 9.04 or 9.10.


    >> 3. What about an AMD CPU?

    Any CPU/MB combination will work just fine. Just make sure you have enough slots and the right ports for the Silicon Image SATA Controllers.


    >> I am also going to build my case out of sheet metal and square tubing.

    That is one of the things I love about this stuff. With some time and tools you can build some really neat stuff and save a bundle. Thanks to BackBlaze, and hopefully what I'm putting online, we can give everyone a major head start - it would be really boring if everyone built just a plain red box!!

    For me, I just wanted a "quick 'n dirty" 45 bay server case. One of the items that I thought would be really neat is to build this into a piece of furniture. Maybe something like a media storage cabinet that holds hard drives where you just slide the hard drives into integrated backplanes would be cool.


    Thanks for the link on PMs! Much Appreciated!

    Don

  4. Hi Don,

    Thanks for the reply. :)

    I have considered making a cabinet and putting my TV on it. The main problem is if I decide I want a new look. If I keep it in the closet then it's not a big deal and I can change my furniture easier & cheaper. Plus I do want to build a new HTPC and have it look decorative.

    >>Just keep in mind that Microsoft non-server OSes don't include support for software RAIDs >>

    XP Pro allows you to use dynamic drives and build RAID 0, 1, and 5. It can also do JBOD, which is what I have on my current HTPC.

    To do RAID 1 & 5 you do need to hack the OS a little. The URL below tells how.

    http://www.optimiz3.com/low-cost-and-reliable-network-attatched-software-jbod-raid-0-1-or-5/

    Thanks again and love the posts. :)

    Garrie

  5. Where did you get
    "3124 Based Silicon Image Controller (4-Port, PCI Express)"
    I can't find any.

  6. Here are the controllers I'm using in my build:

    1 of these (4-Port 3124 Based, PCI - Express)

    http://addonics.com/products/host_controller/adsa3gpx8-4e.asp

    2 of these (4-Port 3124 Based, PCI 64-Bit):

    http://addonics.com/products/host_controller/adsa3gpx8-4e.asp

    Please note that the latter ones are PCI 64-Bit. I believe they'll work in 32-bit slots, but to be safe you'll probably want a MB with 64-bit PCI slots if you use them.

    Don

  7. Thanks for sharing this. My main question:
    What product is the "5-Port SATA Backplanes with Integrated Port Multiplier" exactly?

    Very interested. Also, I'm unfamiliar with the use of port multipliers and wonder about the compatibility issues that could arise when dealing with controllers.

    I imagine most controllers aren't designed to recognize 10 disks if they've only got 2 SATA connectors.

    Imagine ZFS RAID-Z on this. And then a pile running GlusterFS.

  8. I, too, am curious what you used for the port multipliers. Or did they come with the case? What did that case run you?

    Great job though man! This gives me such server envy!

  9. Opensolaris would have been a better choice, hardware permitting.

  10. >> Opensolaris would have been a better choice, hardware permitting.

    In my case, Windows Server 2008R2 was MY best choice for the reasons I've stated in the blog.

    Please post the reasons that you think Opensolaris would be a good or better choice. While not my first choice, it might be an option others would like to consider.

    Also, why Opensolaris instead of Ubuntu or similar?

    Don

  11. I'd imagine the main benefit of OpenSolaris is the zfs filesystem, especially combined with raidz1 or raidz2. With it end-to-end error checking etc.

  12. Hi!
    Some friends and I are going to build a BackBlaze pod as well...
    We were wondering about the weight - if the case is full, I can imagine it's quite heavy. Can it sit in the middle of a rack on the rails only? Will it not sag down in the middle?
    If you, for example, want to change a broken disk, is it possible to pull it out and open the cover without the whole box buckling?

    And then the heat... Is there really enough room for air to flow and cool the disks?
    We are thinking 42 x 2TB Samsung 5700rpm disks (green line).
    Hopefully they are not as hot as 7200rpm drives, but we are not sure... ;)

    Thank you for a great description of your project.

  13. Just some notes from another builder...
    Mine is now holding 2 x 12 1TB disks in Windows 2008 software RAID.
    I used the AddOnics RAID, but it was constantly resyncing at each reboot. Now using pass-through.
    Under stress testing, W2008 sometimes lost a drive, and a whole RAID, every day. After a BIOS upgrade it's pretty stable. Feeding 1.2 million documents per day as a stress test.
    For info: info@futureware.be

    Where do you purchase the SATA backplanes?

  15. I got mine straight from CFI. chyangfun.com
