The [H]ard Forum Storage Showoff Thread - READ requirements in first post

Knowing the right people helps. ;)

I used to have free web hosting, and I also got my friend free web hosting from a guy off [H]ere. We were using it for our sites, then bam, no more hosting. Now his sites are down and he isn't replying to my emails or PMs.

The point of this is to make sure you know the person well enough that you don't get cut off randomly and then ignored.
 
Definitely not an issue in my case. It's a job perk, and I work for a cool company that may even let me keep it after I leave, but there's no way they wouldn't give me at least a couple of weeks to find another home for it if that ever happened. I own the hardware, so it's not like they could just keep it and I couldn't get it back.
 

Looks like your company has a cage, which means the resources are already paid for; any extra unused capacity would just go to waste, so there's no issue with you sharing a little :)

Let me know if your company is ever looking to expand their colo hosting infrastructure, especially to the Florida or Michigan areas.
 

We are already pretty big (130 racks or so, using about 90-95% of our rack-space capacity). All the employees work in Southern California and we are based out of Los Angeles, so I doubt we have an interest in rack space anywhere else. Actually, we are getting big enough now that we will probably stop renting rack space from people and instead lease regular space, put in our own UPS/cooling, and set up our own data center.

Edit:
Also, those 2 left-most racks in the picture are specifically employee/friends-only racks. No customer servers go in there; they're reserved for employees, friends, etc. That is why the hardware varies pretty widely in what's in the rack. Pretty much all of our other racks are mainly Supermicro hardware / NetApp / Sun filers.

Today I hooked up 3 more disk-only (no CPU/mobo; they connect to the main machine with a miniSAS cable) 24-drive beast servers with 1.5 TB Seagate drives. One server with 120 TB of usable space:

~# tw_cli /c0 show unitstatus

Unit  UnitType  Status        %RCmpl  %V/I/M  Stripe  Size(GB)  Cache  AVrfy
------------------------------------------------------------------------------
u0    RAID-6    INITIALIZING  -       25%(A)  64K     30733.4   ON     ON
u1    RAID-6    INITIALIZING  -       25%(A)  64K     30733.4   ON     ON

~# tw_cli /c1 show unitstatus

Unit  UnitType  Status        %RCmpl  %V/I/M  Stripe  Size(GB)  Cache  AVrfy
------------------------------------------------------------------------------
u0    RAID-6    REBUILDING    0%(A)   -       64K     30733.4   OFF    OFF
u1    RAID-6    INITIALIZING  -       18%(A)  64K     30733.4   ON     ON


Pretty awesome except for the fact they are using shitty (IMHO) 3ware raid controllers.
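For anyone sanity-checking the 120 TB figure: if each 24-drive chassis is a single RAID-6 unit (so 22 data drives per unit, which is my assumption rather than something stated above), the tw_cli numbers line up:

Code:
per unit: 22 x 1.5 TB = 33 TB after parity   (~30,734 GiB, matching the 30733.4 "Size(GB)" shown)
4 units : 4 x 33 TB   = 132 TB               (~120 TiB usable)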
 
What OS are you running?
And what are the cables on the serial port used for?

It's running Sabayon Linux (basically a modified version of Gentoo). The serial port is for BIOS redirection and console output; it hooks up to a Digi CM console server. That way I can still get a terminal if the NICs die or the box is getting DDoS'd, and I can see output if it kernel panics. It also lets me update the BIOS remotely via a DOS boot disk image loaded from GRUB, and recover (all remotely) if a newly compiled kernel doesn't boot for some reason. Here is what it looks like:

(screenshots: serial2.png through serial13.png)
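For reference, the serial-console piece of that setup looks roughly like the following with legacy GRUB. This is only a sketch: the port (ttyS0), speed (115200) and root= device are assumptions that would need to match the actual box.

Code:
# /boot/grub/grub.conf - show the boot menu on the serial port as well as the attached console
serial --unit=0 --speed=115200 --word=8 --parity=no --stop=1
terminal --timeout=5 serial console

# kernel line - send boot messages and panic output to ttyS0 too
kernel /boot/vmlinuz root=/dev/sda1 console=tty0 console=ttyS0,115200n8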
 
Specs:

Lian Li PC-101B case
Antec Earthwatts 500w power supply
Scythe Mine CPU cooler
Arctic Cooling 120mm fan
MSI MA78GPM-DS2H mainboard
4GB Crucial 800MHz DDR2
AMD AM2 5050e 45W CPU
Icy Dock 5-bay enclosure
DVD-ROM drive
1x Samsung 750GB HD
3x WD 500GB HDs
1x WD Green 1TB
WHS with PP1

More info here
More/larger pics here


(photos: 3185388648_986ffb52d8.jpg through 3184547453_14ea850d8a.jpg)
 
Wow, that's a hard one to follow. So clean looking too!

Well, I'm in.
160GB Vista OS
400GB Main media (pics, music, wallpapers)
640GB Backups and extra media (pic backup, music backup, TV shows, and high-def stuff)

Advertised: 1,200GB
Formatted: 1,117GB

Also have 80GB, but my friend is using it, and Linux on a 120GB. Dual boot.

With those:
Advertised: 1,400GB
Formatted: 1,297GB
harddrivemanagerfixed.png
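For what it's worth, most of the advertised-vs-formatted gap is just decimal GB versus binary GiB (Windows reports GiB but labels them GB):

Code:
1,200 GB advertised = 1,200 x 10^9 bytes / 2^30 ~= 1,117.6 GiB -> shows up as ~1,117 "GB" formatted

The remaining few GB in the 1,400 GB case will come down to the drives' exact byte counts and how each partition rounds.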
 
Sadly enough, you don't actually meet the 'requirements' set out on the first page! Not that anybody even remembers there are requirements, nor cares...
 
>_>... I didn't even read that. I just looked at everyone else's and thought it qualified because they all said 1TB+ is in. Oh well.
 
Well, I got the N7700 NAS server.

I put in 7x 1TB WD10EADS drives.

Configuring the RAID is wasting so much time; it takes like 6 hours to build RAID 5. I've already built it twice because I'm trying to figure out how to use the iSCSI target feature.

The screenshot I'm posting is using JBOD (just a bunch of disks). I'm going to use RAID 5, but JBOD took only like a minute to get working. Using RAID 5 I will only get something like 5TB of free space to use.

Also in the screenshot you can see I already have three 1TB hard drives in my computer itself, plus a Raptor drive as the main HD.

So I have ten 1TB hard drives total now.
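That "5TB something" estimate checks out if all seven drives go into a single RAID 5 array:

Code:
7 x 1 TB in RAID 5 -> 6 x 1 TB usable = 6 x 10^12 bytes ~= 5.46 TiB reported free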


 
OK, formatting to RAID 5 for the 4th time now.

I decided to use 100 percent of the space for RAID data because iSCSI seems to be Windows-only.

So I'm going to use it as a network drive. At first I thought I couldn't see how much space is on the network drive, but I realized I can if I map it.

So back to formatting it again.
It seems like if I use iSCSI, the server doesn't even know how much free and used space there is.
 
iSCSI is an open standard that is supported on multiple OSes; I know of at least one iSCSI implementation on Linux, and there may be two. Also, iSCSI is a block-level interface, while Samba is a network file-sharing protocol. iSCSI is what it says: SCSI over IP, so you are transmitting actual SCSI commands across the network. There is some authentication and wrapping that iSCSI does, but in the end you use an iSCSI device just as if it were a local drive.

I'm not certain why you couldn't tell the size of an iSCSI partition; that seems kind of weird.
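For example, with the open-iscsi initiator on Linux it looks something like this (just a sketch; the IP is a placeholder for the NAS address):

Code:
# discover the targets the NAS exports
iscsiadm -m discovery -t sendtargets -p 192.168.1.50
# log in to the discovered target; the LUN then shows up as a normal block device (e.g. /dev/sdX)
iscsiadm -m node --login
# from there you partition and format it like any local disk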
 
I am guessing it might be that the N7700 can't read the data inside the iSCSI volume, so I just ended up doing RAID 5 with ext3 using 100 percent of the space.

The NAS runs a Samba server for the network connection.

Well, it doesn't matter too much, I guess.

Copied and cleaned up a lot of files from my other drives onto the server.
 
This won't impress anyone here, but I thought I would share.

Corsair HX520 power supply

RAID 5
Areca ARC-1210
four 1.5TB Seagate drives

one 500GB Seagate drive

5TB advertised capacity. 4.55TB formatted capacity.

All in Windows XP 32-bit :)

DSC02823%20(Medium).JPG
 
Colo box is:
2U Supermicro (CSE-825TQ-560LPB)
Core 2 Quad Q6600 2.4 GHz
8GB DDR2-800 memory
8x 750GB Seagate 7200.11 drives (RAID 5)
ARC-1220 (screenshot) but now ARC-1231ML
Onboard graphics

Mythbox is:
Cooler Master CM Stacker case
Core 2 Duo E6850 3 GHz
4GB DDR2-800 memory
Promise EX8350 RAID controller
8x 500GB Seagate 7200.10 drives (RAID 5)

Home box is in my sig:
NORCO RPC-4020 Case
MSI P6N SLI Platinum (nForce 650i) mobo
MSI GeForce GTX 260 (pic has an 8800 GTX)
8GB DDR2-1000 G.Skill RAM
ARC-1280ML RAID controller
20x 1TB Seagate SATA (RAID 6)

Wow, that thing is huge. I bet most of us are pirates if we're using that much at home.
 
Can we start this over or change the rules?

I mean, 1TB with a minimum of 4 HDDs is WEAK considering we have single 2TB HDDs now.
 

I agree, I think some new requirements may be in order. Also, is it possible to set up this thread so there's an option to show posts with images only?
 
Oh well about what was lost in this thread. As for making a new thread, the only thing I didn't like was the requirement of having your username in the pictures. That is quite annoying, as it means you have to re-take all your pictures =(. Anyway, I did say I would post a pic of the crazy 24-drive Supermicro server rack we have at work:



They are from top to bottom:
1. 24x1.5 TB JBOD box hooked up to 3.
2. 24x1.5 TB JBOD box hooked up to 3.
3. 24x1.5 TB head machine which the above 2 and below are hooked up to
4. 24x1.5 TB JBOD box hooked up to 3.
5. 24x1.5 TB JBOD box (not in use)
6. 24x1 TB JBOD box hooked up to 7.
7. 24x750 GB head machine which has the one above it hooked up to it

Total eventual usable disk space (when they are all in use) is:

203.5 TB

Total Space of drives is:

222 TB

Total usable space in all 4 of the racks in the screenshot (not counting the NetApp) is:

399.5 TB

Total space of drives is:

614TB
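For anyone checking the math on that stack (assuming each 24-drive box runs one RAID-6 array, i.e. 22 data drives per box, which matches the earlier tw_cli output):

Code:
Raw   : 5 x 24 x 1.5 TB + 24 x 1 TB + 24 x 0.75 TB = 180 + 24 + 18   = 222 TB
Usable: 5 x 22 x 1.5 TB + 22 x 1 TB + 22 x 0.75 TB = 165 + 22 + 16.5 = 203.5 TB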
 
well fuck, guess all my work on the new thread is gone
Or is it? :)
Code:
[QUOTE]DO NOT POST IN THIS THREAD IF YOU SEE THIS MESSAGE







Old thread was outdated and with 100,000+ views, it's time for a new one.
This is The [H]ard Forum Storage Showoff Thread

I will maintain an updated rankings post based on the largest single system and total storage across all systems.

If you want your name on the board, the following are required.

    * Pictures of your actual systems.
    * Minimum of 10 TB total storage or 8 drives in one system.
    * Full system specs
    * Post format must follow the example post.
    * Send me a picture of your system with a piece of paper with your username written on it to verify.


Once you are ready to be considered for a spot on the board, send me a PM and I'll look at your post.
If it doesn't meet the requirements, I'll tell you what to fix, or I may just delete your PM - this shouldn't be complicated. If you can build a machine that qualifies, you should be able to follow simple instructions. I'm not going to be a nazi about it, I just want a clean, organized thread.

Your post should include the following information. Try to keep it in the order I have it listed here.

Case
PSU
Motherboard
CPU
RAM
GPU (if discrete)
Controller Cards (if any)
Optical Drives
Hard Drives (include full model number)
Battery Backup Units (if any)
Operating System

A short paragraph here describing what you use your storage for and how you handle backups and organizing would be nice.

Then post a few pictures somewhere that has decent hosting. If your pictures go down, you get dropped from the board.[/QUOTE]

[QUOTE]Total Storage Rankings

   1. your mom
   2. your mom
   3. your mom


Largest Single System Rankings

   1. your mom
   2. your mom
   3. your mom[/QUOTE]
[QUOTE]This is an example of the format for posting. This is my current WHS and we're just gonna pretend it's 10TB to provide an example.

Antec Solo
Antec EarthWatts 380W
Gigabyte GA-MA74GM-S2
AMD X2 BE-2400
Scythe Zipang
Pioneer DVR-111D
Corsair 2 x 1GB TWIN2X2048-6400 G
6 x WD10EACS
APC Smart UPS SUA1500
Windows Home Server

I use this machine to back up all my other computers. Right now it is also working as a file server, but I am in the process of building a separate one.
[url]http://dl.getdropbox.com/u/35025/WHS/small/computer%20pics%20042%20%5B1280x768%5D.JPG[/url]
[url]http://dl.getdropbox.com/u/35025/WHS/small/computer%20pics%20043%20%5B1280x768%5D.JPG[/url]
[url]http://dl.getdropbox.com/u/35025/WHS/small/computer%20pics%20053%20%5B1280x768%5D.JPG[/url]

(Now if I had another system I would list it here.)

Antec Solo
Antec EarthWatts 380W
Gigabyte GA-MA74GM-S2
AMD X2 BE-2400
Scythe Zipang
Pioneer DVR-111D
Corsair 2 x 1GB TWIN2X2048-6400 G
6 x WD10EACS
APC Smart UPS SUA1500
Windows Home Server[/QUOTE]
 
Blue Fox to the rescue

I like it.

WTF has been up with [H] for the last month?
Seems like they have been down a lot.
 
Ender has some humor...


Most Total Storage
Ockie
Ockie
Ockie
Ockie
Ockie
Ockie
Ockie
Ockie
Ockie
Ockie


Most Storage in One System
Ockie
Ockie
Ockie
Ockie
Ockie
Ockie
Ockie
Ockie
Ockie
Ockie

haha
 
Again with the pic and paper?
C'mon, why?
Have we really hit the bottom where we can't trust each other? :D
 
ok, I guess we can do without it
if someone's a big enough loser to fake it, that's their problem
 

I couldn't agree more. It's annoying that I basically have to take new pictures just for this requirement... The only machine I wouldn't have to do this for is my work machine, because it already has my server's name and my name on it, which is also my forum name =(.
 
Pic and paper is a good idea, but it won't last because people like me are too lazy. However, a new requirement should be that if requested, you must be able to provide it... so we can call fakers out.
 
ok, I like that
 
Is the "advertised" space before or after raid? Suppose I have 10 1TB drives in raid6 for 8TB of "advertised" space after raid, does this count? I think the intended meaning in the thread is "sum advertised space of drives in your system" (e.g., 10 * 1TB => 10TB), but it could mean "compensate for advertising overhead on the amount of usable space you have" (e.g., 7.6TB because drives are smaller than 1TB each => 8TB).
 
Is the "advertised" space before or after raid? Suppose I have 10 1TB drives in raid6 for 8TB of "advertised" space after raid, does this count? I think the intended meaning in the thread is "sum advertised space of drives in your system" (e.g., 10 * 1TB => 10TB), but it could mean "compensate for advertising overhead on the amount of usable space you have" (e.g., 7.6TB because drives are smaller than 1TB each => 8TB).
just the sum of the manufacturer stated capacity for all your drives
ignore RAID, formatting, etc
:)
 

Keeps it simple, I like it. Even though I need more drives now. My 8TB WHS box doesn't look all that hot. :(
 