Dashcat2 Build

Discussion in 'Worklogs' started by MisterDNA, May 29, 2010.

  1. MisterDNA

    MisterDNA Gawd

    Messages:
    1,023
    Joined:
    Feb 8, 2004
    On September 2, the company I worked for downsized. I was one of the ~15% cut. I learned later that I was being paid about 40% less than my position demanded. Instead of going directly to another job (I was offered a great spot at a cool company, with interesting work, and a sweet pay rate), I'm refreshing myself and going back to school. I'm headed to Bridgerland ATC for the first half and USU for the second half. I was planning to get into Drafting, but the Robotics program is too appealing to ignore. I can learn SolidWorks on my own time, anyway, and a lot of what I'll cover in Robotics is integral to my USU studies.

    My later work will require Dashcat's computing power. I have no choice but to continue. What luck... obsolescence that isn't.

    I might have to scavenge Infiniband gear. It's the cables that are expensive, oddly.
     
  2. spugm1r3

    spugm1r3 Gawd

    Messages:
    800
    Joined:
    Sep 28, 2012
    Cheers to the positive turn on a big change. And good luck with the new career path.

    Having stared at this a bit, it seems to me you can pretty much count on about $1k per device for 40Gb/s connectivity if you are going with Mellanox, and a bit less for QLogic. This seems to hold true whether you are connecting as few as 8 ports or as many as 36.

    Either way, that's a monster get for home use. Good luck!
     
  3. MisterDNA

    MisterDNA Gawd

    Messages:
    1,023
    Joined:
    Feb 8, 2004
    Thanks for the ups. I appreciate it. I didn't think anyone still followed this pretty much defunct thread from almost six years ago.

    I can't go 40Gbps at this point. The nodes and server are all PCI-X 64/133 architecture. Basically, it's down to a 10Gbps (Infiniband SDR 4x) approach. I see dual-port cards for USD$10, cables for USD$10, and 24-port switches for USD$200.

    Dashcat is 16-node with two cold-spare worker nodes and a cold-spare server. Eighteen ports for the nodes, two ports each for the servers (if needed to quell saturation), and two ports to my PCIe workstation is a good fit. If I elect to cold-plug the backup server (or I just don't need two ports each for the servers), I have two spare ports to throw at a long-distance shotgunned fiber link reaching up to 10km away.
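
    Quick sanity check on the port budget in throwaway Python (counts straight from this post; pure napkin math, nothing queried from real hardware):

    Code:
    # Port budget for the 24-port IB SDR switch mentioned above.
    SWITCH_PORTS = 24

    def budget(spare_server_plugged=True, ports_per_server=2):
        nodes = 16 + 2  # 16 active workers + 2 cold spares
        servers = (2 if spare_server_plugged else 1) * ports_per_server
        workstation = 2  # dual-port card in the PCIe workstation
        used = nodes + servers + workstation
        return used, SWITCH_PORTS - used

    print(budget(True))   # (24, 0)  everything plugged, switch exactly full
    print(budget(False))  # (22, 2)  cold-plug the spare server, 2 ports free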

    That's my whole city and then some, even with our local fibers. Shooting twin beams to BATC from my house is just under three miles (4.67km). Granted, at that distance, speed of light in the fiber (~2/3 c, set by the refractive index of the glass) becomes a relevant issue, with a one-way propagation delay around 23 microseconds.
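
    For anyone checking my math, the delay is just distance over ~2/3 c (treating the run as 4.67km of straight fiber; a real cable route would be somewhat longer):

    Code:
    # One-way propagation delay over the fiber run.
    C = 299_792_458          # speed of light in vacuum, m/s
    V_FIBER = C * 2 / 3      # ~2/3 c in glass (refractive index ~1.5)

    distance_m = 4_670       # house to BATC, straight-line assumption
    delay_us = distance_m / V_FIBER * 1e6
    print(f"one-way:    {delay_us:.1f} us")      # ~23.4 us
    print(f"round trip: {2 * delay_us:.1f} us")  # ~46.7 us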

    I'm glad to have these problems to consider.
     
  4. MisterDNA

    MisterDNA Gawd

    Messages:
    1,023
    Joined:
    Feb 8, 2004
    I guess I'm at least persistent. I'm back in college, refreshing mah skillz.

    I expected to walk out with a UCAT certificate, and that guarantees me $18 an hour now. I also expected to press onward to an Associate degree that would boost that by some bucks. That's not the endpoint anymore: the Bachelor program in my field starts in 2017. This is perfect timing.
     
  5. MisterDNA

    MisterDNA Gawd

    Messages:
    1,023
    Joined:
    Feb 8, 2004
    VR lit this project back up: Unreal Engine 4's Swarm distributed lighting builds.

    Looks like I can make something out of this, after all.

    I didn't see this coming.
     
  6. pwrusr

    pwrusr 2[H]4U

    Messages:
    3,133
    Joined:
    Jul 30, 2009
    Nice!

    Great to see you're still at it :)
     
    MisterDNA likes this.
  7. MisterDNA

    MisterDNA Gawd

    Messages:
    1,023
    Joined:
    Feb 8, 2004


    This is new. Does it work?

    Preview says so. I'm going to have fun with this.
     
  8. TsingTao

    TsingTao n00bie

    Messages:
    35
    Joined:
    Nov 22, 2010
    It works, and it is good.

    Signed,
    Old Lurker
     
    MisterDNA likes this.
  9. MisterDNA

    MisterDNA Gawd

    Messages:
    1,023
    Joined:
    Feb 8, 2004
    It was yesterday that I stabbed my flag in the dirt, taking control by admitting defeat, in a way.

    As most of those who have followed this thread already know, I have had a lot going on, and at 13:43 MDT on 7/28/2016 I declared a "Fuck That Shit" to the world and took my pills.
    Bupropion (Generic for Wellbutrin) -- Originally just an antidepressant, which I needed anyway. Since 2004, used for ADD/ADHD for the same reason Strattera (Atomoxetine) is.
    Buspirone (Generic for Buspar) -- Anti-anxiety. Ever since my Mom's final ambulance ride and my subsequent chase in my Camaro SS, sirens going by my house have put me on edge.
    feat. Pitbull, er, Zolpidem (Generic for Ambien) -- Wellbutrin/Bupropion causes insomnia for the first month. Did the same when I was a teen. This fixes it.

    My entire bill for this month was like USD$48. Ambien was $9 of that. This should never have surprised me. Overclocking costs money, whether it's silicon or brain matter.

    I need to give a shoutout to whoever is behind the prescription discount card my town is giving out. I'd have paid more than double without it.
     
    Last edited: Jul 29, 2016
  10. MisterDNA

    MisterDNA Gawd

    Messages:
    1,023
    Joined:
    Feb 8, 2004
    I think I get it now.



    I gotta buy a hot glue gun like now.
     
  11. MisterDNA

    MisterDNA Gawd

    Messages:
    1,023
    Joined:
    Feb 8, 2004
    No, I don't. CR2032 cells pack 225mAh and last three years minimum. My LiSOCl2 AAs pack ten times that.
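
    The ratio, if anyone cares to check (the AA capacity is a typical datasheet figure, not something I measured):

    Code:
    CR2032_MAH = 225         # coin cell capacity from this post
    LISOCL2_AA_MAH = 2400    # typical Li-SOCl2 AA datasheet value
    print(f"{LISOCL2_AA_MAH / CR2032_MAH:.1f}x")  # ~10.7x the capacity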
     
  12. MisterDNA

    MisterDNA Gawd

    Messages:
    1,023
    Joined:
    Feb 8, 2004
    This has been really difficult for reasons I've already mentioned. I started this project around this time in 2009, and the technology was already four years old then. Today it's not the CPU but the GPU that dictates speed. The problem is that the motherboards I use are PCI-X, not PCIe of any kind. PCI-X 133 to PCIe 1.0 4x bridge cards are available, but they're expensive and still limited in a lot of ways. I'm working on mastering the cluster computing concepts. I will never give up.

    I really hope this is a Launch And Catch-Fire moment. Seven years is a long time. Steel is one thing, but dog years are longer than silicon years.
     
    spugm1r3 likes this.
  13. MisterDNA

    MisterDNA Gawd

    Messages:
    1,023
    Joined:
    Feb 8, 2004
    [IMG]

    Four of those nodes have more cores, at faster clocks, than my whole cluster.
    The upgrade price from 32GB to 64GB per node is dirt cheap and would give four of them as much RAM as my whole cluster.

    I wish I could get a bunch of the mobo/CPU/RAM sets and put those into my cases. I'm sure it doesn't work that way, though.

    PCIe slots mean GPUs become an option.
     
  14. Jorona

    Jorona 2[H]4U

    Messages:
    2,851
    Joined:
    Nov 6, 2011
    Funny you should say that...

    http://www.natex.us/ReWork-Intel-S2600CP2J-Motherboard-2x-E5-2670-SR-p/s2600cp-h8-32gb.htm
     
  15. MisterDNA

    MisterDNA Gawd

    Messages:
    1,023
    Joined:
    Feb 8, 2004
    This is probably the most solid progress I've made on this machine in a long time. I have the cable glands that will let me hook my 8AWG welder cable to a sub-panel for the cluster. I'm figuring out breaker sizes at the moment. My house wiring can take 50A intermittent and 40A continuous within spec, even though it's on a 30A circuit. (Note: USA power uses split-phase 120V for a net 240V between poles and 120Vrms versus neutral/ground.) I'd prefer to hold a steady 30A like a standard clothes dryer, but larger, cooler-running wires are a bonus on the voltage-drop front.

    My IceBox intelligent PDUs will take two standard IEC computer power plugs (normally rated for 10A each), feeding five nodes apiece. In practice, heavier-gauge cable yields better ampacity, as does better contact material. I can hold 20A per branch (120VAC split, of course) with 14AWG cable without coming anywhere near 60C. That's 480W per node. I don't need that much.
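
    The branch math, spelled out (figures from this post; one 120V leg of the split-phase feed):

    Code:
    # Per-branch power budget for one PDU inlet.
    BRANCH_VOLTS = 120       # one leg of the 240V split-phase service
    BRANCH_AMPS = 20         # what 14AWG held in my testing
    NODES_PER_BRANCH = 5

    branch_watts = BRANCH_VOLTS * BRANCH_AMPS
    per_node = branch_watts / NODES_PER_BRANCH
    print(f"{branch_watts}W per branch -> {per_node:.0f}W per node")
    # 2400W per branch -> 480W per node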
     
  16. MisterDNA

    MisterDNA Gawd

    Messages:
    1,023
    Joined:
    Feb 8, 2004
    When I first conceived the Dashcat Supercomputer, I thought about a massive bank of systems with dedicated GPUs rendering video in somewhat-realtime fashion: say, 120 nodes kicking out a completed frame every two seconds, with those frames stitched together to form a video stream. I used to wonder how to switch GPU outputs quickly enough to accomplish this, but after that Natex link provided by Jorona I realized it's no longer necessary to coordinate video outputs when each frame can be sent over a network connection.

    When I built Dashcat, the goal was Gigabit Ethernet, and that was economical at the time. It's still workable now if the job is kicking out a 4K frame every few minutes. The nodes in the listing Jorona posted have 10GbE onboard. That's a game changer. I got to thinking: what if I transplanted the guts from one of those nodes into the Dashcat server and swapped the Dell 5324 for a switch with Gigabit ports plus 10GbE uplinks, one uplink connected to the server to aggregate the worker nodes and another run to my workstation? I wouldn't need Infiniband.

    848x480x32bpp video at 60Hz consumes ~781Mbps.

    1920x1080x32bpp video at 60Hz consumes ~4Gbps.
    2160x1200x32bpp video at 90Hz consumes ~7.5Gbps. (today's VR HMDs)
    3840x2160x32bpp video at 24Hz consumes ~6.4Gbps.
    3840x2160x32bpp video at 30Hz consumes ~8Gbps.

    GbE can handle 3fps at 4K (I would need 20 nodes) (334ms additional latency)
    GbE can handle 15fps at 1080p with 12fps more likely (I would need 5 nodes) (67ms additional latency)
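
    All of those numbers fall out of one formula: width x height x bits-per-pixel x refresh, uncompressed and ignoring blanking/protocol overhead. A quick Python sketch of the whole table:

    Code:
    # Uncompressed video bandwidth, and what a given link can carry.
    def stream_gbps(w, h, hz, bpp=32):
        return w * h * bpp * hz / 1e9

    def link_fps(w, h, link_gbps, bpp=32):
        # frames/sec a link can move; 1/fps is the per-frame latency
        return link_gbps * 1e9 / (w * h * bpp)

    print(f"1080p60: {stream_gbps(1920, 1080, 60):.1f} Gbps")  # ~4.0
    print(f"VR HMD:  {stream_gbps(2160, 1200, 90):.1f} Gbps")  # ~7.5
    print(f"4K30:    {stream_gbps(3840, 2160, 30):.1f} Gbps")  # ~8.0
    print(f"8K90:    {stream_gbps(7680, 4320, 90):.1f} Gbps")  # ~95.5
    print(f"GbE at 4K:    {link_fps(3840, 2160, 1):.1f} fps")  # ~3.8
    print(f"GbE at 1080p: {link_fps(1920, 1080, 1):.1f} fps")  # ~15.1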

    I do believe Dashcat is going to become a general playground, not just a render farm. The capability is there now.

    What I find interesting is what happens when this is extended out to 100GbE (which will be cheap, eventually).

    7680x4320x32bpp is just over 1Gb per frame. The same at 90Hz is ~95.5Gbps.

    What happens when I use a 4x VR resolution?

    8640x4800x32bpp = 1.327Gb per frame, which works out to a ~75Hz ceiling in a raw 100Gbps stream (call it 70Hz once you allow for overhead).

    We'll get there eventually. I remember when 1024x768 XGA was the holy grail, then 1080p. Now I'm looking at buying my first 4K TV/monitor.
     
  17. MisterDNA

    MisterDNA Gawd

    Messages:
    1,023
    Joined:
    Feb 8, 2004
    So... last week I got a QLogic SilverStorm 24-port CX4 Infiniband SDR switch for $10 while I was out picking. And that was at the most unusual DI store on the Wasatch Front.

    There's a lot to be said for "How?", and I'm putting that to work in 2017. This is already disturbingly effective, like nabbing Mewtwo with an errant snotball.

    Oh... My Pokemon GO is showing.

    This thread may as well be my life history, at this rate.

    If anyone benefits from this at any age or level, just pay it forward. But leave the finer points for questions requiring answers later.

    "Fryode only ran at Infiniband 10G to learn what he did. That gear is free now if it hasn't gone to the trash."

    I'll adapt. Nobody will be left behind. (But I'll need to get helpers of Third Street Saints tier.)

    Holy shit....
     
    Last edited: Jan 31, 2017
    ballistic likes this.