• Read Bio
    link
    fedilink
    English
    17
    8 months ago

    Maybe Windows is not used in supercomputers often because Unix and Linux are more flexible for the CPUs they use (POWER9, SPARC, etc.)

    • Matt
      link
      fedilink
      13
      edit-2
      8 months ago

      Plus Linux doesn’t limit you in the number of drives, whereas Windows limits you from A to Z. I read it here.

      • @[email protected]
        link
        fedilink
        4
        edit-2
        8 months ago

        For people who haven’t installed Windows before, the default boot drive is G, and the default file system is C

        So you only have 25 to work with (everything but G)

        • The Ramen Dutchman
          link
          fedilink
          2
          8 months ago

          Almost: the default boot drive is C:, and everything gets mapped after that. So if you have a second HDD at D: and an optical drive at E:, any USB drives you plug in would go to F:.

      • @[email protected]
        link
        fedilink
        8
        8 months ago

        You can mount drives against folders in Windows. So while D: is one drive, D:\Logs or D:\Cake can each be a different disk.
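        This works via NTFS mount points. A hypothetical Windows cmd sketch using the built-in `mountvol` tool (the volume GUID below is a placeholder; use one reported by running `mountvol` with no arguments, from an elevated prompt):

```shell
:: List volumes and the \\?\Volume{GUID}\ path for each.
mountvol

:: Mount a volume at an empty NTFS folder instead of a drive letter.
:: The GUID here is a placeholder, not a real volume.
md D:\Logs
mountvol D:\Logs \\?\Volume{00000000-0000-0000-0000-000000000000}\
```

        After this, writes to D:\Logs land on the mounted volume while the rest of D: stays on the original disk.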

    • @[email protected]
      link
      fedilink
      5
      8 months ago

      That’s certainly a big part of it. When one needs to buy a metric crap load of CPUs, one tends to shop outside the popular defaults.

      Another big reason, historically, is that supercomputers typically had no way to interact with them other than the command line, and Windows needed a graphical one.

      Until PowerShell and Windows 8, there were still substantial configuration options in Windows that were managed 100% by graphical tools. They could be changed by direct file edits and registry editing, but that added a lot of risk. All of the “did I make a mistake” tools were graphical, and so unavailable from the command line.

      So any version of Windows stripped down enough to run on any super-computer cluster was going to be missing a lot of features, until around 2006.

      Since Linux and Unix started as command-line operating systems, both already had plenty of fully featured options for supercomputing.

        • Flying Squid
          link
          fedilink
          3
          8 months ago

          I think this is a Ship of Theseus thing that we’re going to argue about: at what point is it just UNIX-like and not UNIX?

          UNIX-like is definitely a descriptor currently used for Linux.

          Even the Wikipedia entry starts that way.

    • @[email protected]
      link
      fedilink
      6
      8 months ago

      To make it more specific, I guess: what’s the problem with that? It’s like having a “people living on boats” category and a “people with no long-term address” category. You could include the former in the latter, but then you are just conveying less information.

    • @[email protected]OP
      link
      fedilink
      40
      8 months ago

      Unix is basically a brand name.
      BSD had to be completely re-written to remove all Unix code, so it could be published under a free license.
      It isn’t Unix certified.

      So it is Unix-derived, but not currently a Unix system (which is a completely meaningless term anyway).

        • @[email protected]
          link
          fedilink
          2
          8 months ago

          It means nothing; it’s just a check you sign, and then you get to say “I certify my OS is Unix”. The slightly more technical part is POSIX compliance, but modern OSes are such massive and complex beasts that those compliance standards cover only a tiny part of them, and they are slowly but surely becoming irrelevant over time.

          Apple made OS X Unix certified because it was cheap and it got them off the hook from a lawsuit. That’s it.
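          To make the POSIX side concrete: a system can report which revision of the standard it claims to implement. A minimal Python sketch (the printed value depends on the host; 200809 corresponds to POSIX.1-2008, the base standard that Unix certification builds on):

```python
import os

# Ask the host which revision of POSIX it claims to implement.
# The value of _POSIX_VERSION is e.g. 200809 for POSIX.1-2008,
# the base standard underlying the Single UNIX Specification.
posix_version = os.sysconf("SC_VERSION")
print(posix_version)
```

          On a typical modern Linux or macOS box this reports POSIX.1-2008 or later, which is exactly the “tiny part” of the OS that certification actually covers.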

          • @[email protected]
            link
            fedilink
            5
            8 months ago

            Microsoft could technically get Windows certified as UNIX.

            I don’t think they could now that the POSIX subsystem and Windows Services for UNIX are both gone. Don’t you need at least some level of POSIX compliance (at least the parts where POSIX and Unix standards overlap) to get Unix certified?

  • The Menemen
    link
    fedilink
    35
    8 months ago

    Surprised to learn that there were Windows-based supercomputers.

    • @[email protected]
      link
      fedilink
      62
      8 months ago

      Those were the basic entry level configurations needed to run Windows Vista with Aero effects.

      • @[email protected]
        link
        fedilink
        5
        8 months ago

        Meh, you just needed a discrete GPU, and not even a good one either. Just a basic, bare-bones card with 128MB of VRAM and pixel shader 2.0 support would have sufficed, but sadly most users didn’t even have that back in 06-08.

        It was mostly the consumers’ fault for buying cheap garbage laptops with trash-tier iGPUs in them, and the manufacturers’ for slapping a “compatible with Vista” sticker on them and pushing those shitboxes on consumers. If you had a half-decent $700-800 PC back then, Vista ran like a dream.

        • @[email protected]
          link
          fedilink
          English
          3
          8 months ago

          Most computers sold are the lowest end models. At work we never got anything decent so it was always a bit of a struggle. Our office stayed with XP for way longer than we should have so we skipped Vista altogether and adopted Windows 7 a few years late.

        • @[email protected]
          link
          fedilink
          English
          18
          8 months ago

          No, it was mostly the manufacturers’ fault for implying that their machine would run the operating system it shipped with well. Well, that and Microsoft’s fault for strong-arming them into pushing Vista on machines that weren’t going to run it well.

          • @[email protected]
            link
            fedilink
            1
            8 months ago

            APUs obviously weren’t a thing yet, and it was common knowledge back then that contemporary iGPUs were complete and utter trash. I mean, they were so weak that you couldn’t even play HD video or enable some of XP’s very basic graphical effects with most integrated graphics.

            Everyone knew that you needed a dedicated graphics card back then, so you can and should in fact put some blame on the consumer for being dumb enough to buy a PC without one, regardless of what the sticker said. I mean I was a teenager back then and even still I knew better. The blame goes both ways.

            • @[email protected]
              link
              fedilink
              English
              3
              8 months ago

              No, if you weren’t “involved in the scene” and only had the word of the person at the store, then you had no idea what an iGPU was, let alone why it wasn’t up to the task of running the very thing it was sold with.

              You were a teenager in a time when teenagers’ average tech knowledge was much higher than before. That is not the same as someone who just learnt they now needed one of those computer things for work. Not everyone had someone near them who could explain it to them. Blaming them for not knowing the intricacies of the machines is ridiculous. It was pure greed by Microsoft and the manufacturers.

      • @[email protected]
        link
        fedilink
        4
        8 months ago

        How can there be N/A though? How can any functional computer not have an operating system? Or does just reading the really big MHz number off the CPU count as it being a supercomputer?

        • @[email protected]OP
          link
          fedilink
          6
          8 months ago

          Early computers didn’t have operating systems.
          You just plugged in a punch card or tape with the program you wanted to run, and the computer executed those exact instructions and nothing else.
          Those programs were specifically written for that exact hardware (not even for that model, but for that machine).
          To boot up the computer, you had to put a number of switches into the correct position (0 or 1), to bring its registers in the correct state to accept programs.

          So you were the BIOS and bootloader, and there was no need for an OS because the userspace programs told the CPU directly what bits to flip.
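          A toy Python sketch of that model (the instruction set here is made up, purely to illustrate a program driving the machine directly, with the operator supplying the initial register state and no OS in between):

```python
# A "computer" with no OS: the program IS the entire instruction
# stream, and the operator sets the initial register state by hand
# (the front-panel "switches").

def run(program, registers):
    """Execute raw (op, *args) instructions with nothing in between."""
    for op, *args in program:
        if op == "LOAD":       # LOAD reg, value
            registers[args[0]] = args[1]
        elif op == "ADD":      # ADD dst, src
            registers[args[0]] += registers[args[1]]
        elif op == "HALT":
            break
    return registers

# Operator "flips the switches": registers start in a known state.
state = run(
    [("LOAD", "A", 2), ("LOAD", "B", 3), ("ADD", "A", "B"), ("HALT",)],
    {"A": 0, "B": 0},
)
print(state["A"])  # prints 5
```

          There is no bootloader, no scheduler, no filesystem: the “program” addresses the registers directly, which is roughly how those punch-card era machines worked.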

        • @[email protected]
          link
          fedilink
          2
          8 months ago

          They of course had one, probably Linux or Unix. But that information about the cluster is not available.

  • @[email protected]
    link
    fedilink
    English
    9
    8 months ago

    We’re gonna take the test, and we’re gonna keep taking it until we get one hundred percent in the bitch!

      • @[email protected]
        link
        fedilink
        9
        edit-2
        8 months ago

        Apple built its current desktop environment for its proprietary ecosystem on BSD with their own twist, while supercomputers are typically multiuser parallel computing beasts, so I’d say it is really fucking surprising. Pretty and responsive desktop environments and breathtaking number crunchers are polar opposites as products. Fuck me, you’ll find UNIX roots in Windows NT, but my flabbers would be ghasted if Deep Blue had dropped a Blue Screen.

          • @[email protected]
            link
            fedilink
            29
            8 months ago

            I think it was PS3 that shipped with “Other OS” functionality, and were sold a little cheaper than production costs would indicate, to make it up on games.

            Only thing is, a bunch of institutions discovered you could order a pallet of PS3s, set up Linux, and have a pretty skookum cluster for cheap.

            I’m pretty sure Sony dropped “Other OS” not because of vague concerns of piracy, but because they were effectively subsidizing supercomputers.

            Don’t know if any of those PS3 clusters made it onto Top500.

  • @[email protected]
    link
    fedilink
    8
    8 months ago

    This looks impressive for Linux, and I’m glad FLOSS has such an impact! However, I wonder if the numbers are still this good if you consider more supercomputers. Maybe not. Or maybe yes! We’d have to see the evidence.

    • @[email protected]
      link
      fedilink
      4
      edit-2
      8 months ago

      I wonder if the numbers are still this good if you consider more supercomputers.

      Great question. My guess is not terribly different.

      “Top 500 Supercomputers” is arguably a self-referential term. I’ve seen “super-computer” defined as any machine that was among the 500 fastest computers in the world on the day it went live.

      As new super-computers come online, workloads from older ones tend to migrate to the new ones.

      So my impression is there usually aren’t a huge number of currently operating supercomputers outside of the top 500.

      When a super-computer falls toward the bottom of the top 500, there’s a good chance it is getting turned off soon.

      That said, I’m referring here only to the super-computers that spend a lot of time advertising their existence.

      I suspect there’s a decent number out there today that prefer not to be listed. But I have no reason to think those don’t also run Linux.

    • @[email protected]OP
      link
      fedilink
      17
      8 months ago

      There’s no reason to believe smaller supercomputers would have significantly different OSes.
      At some point you enter the realm of mainframes and servers.
      Mainframes almost all run Linux now; the last Unixes are close to EOL.
      Servers have about a 75% Linux market share, with the rest mostly running Windows and some BSD.

    • @[email protected]
      link
      fedilink
      5
      8 months ago

      Yes, in the Linux stat. The OtherOS option on the early PS3 allowed you to boot Linux, which is what most, if not all, of the clusters used.

    • @[email protected]OP
      link
      fedilink
      28
      8 months ago

      I think you can actually see it in the graph.
      The Condor Cluster with its 500 Teraflops would have been in the Top 500 supercomputers from 2009 till ~2014.
      The PS3 operating system is a BSD, and you can see a thin yellow line in that exact time frame.

  • @[email protected]
    link
    fedilink
    English
    34
    8 months ago

    As someone who worked on designing racks in the supercomputer space about 10 q5vyrs ago, I had no clue Windows and Mac even tried to enter the space

        • @[email protected]
          link
          fedilink
          8
          edit-2
          8 months ago

          but it did not stick.

          Yeah. It was bad. The job of a Supercomputer is to be really fast and really parallel. Windows for Supercomputing was… not.

          I honestly thought it might make it, considering the engineering talent that Microsoft had.

          But I think time proves that Unix and Linux just had an insurmountable head start. Windows, to the best of my knowledge, never came close to closing the gap.

          • @[email protected]
            link
            fedilink
            English
            6
            8 months ago

            At this point I think it’s most telling that even Azure runs on Linux. Microsoft’s twin flagship products somehow still only work well when Linux does the heavy lifting and acts as the glue in between.

            • @[email protected]
              link
              fedilink
              2
              8 months ago

              Where did you find that Azure runs on Linux? I have been curious for a while, but Google refuses to tell me anything but the old “a variant of Hyper-V” or “Linux is 60% of the Azure workload” (not what I asked about!)

              • @[email protected]
                link
                fedilink
                4
                edit-2
                8 months ago

                Where did you find that azure runs on linux?

                I don’t know of anywhere that Microsoft confirms, officially, that Azure itself is largely running on Linux. They share stats about what workloads others are running on it, but not, to my knowledge, about what it is composed of.

                I suppose that would be an oversimplification, anyway.

                But that Azure itself is running mostly on Linux is an open secret among folks who spend time chatting with engineers who have worked on the framework of the Azure cloud.

                When I have chatted with them, Azure cloud engineers have displayed huge amounts of Linux experience, while they sometimes needed to “phone a friend” to answer Windows Server edition questions.

                For a variety of reasons related to how much longer people have been scaling Linux clusters, than Windows servers, this isn’t particularly shocking.

                Edit: To confirm what others have mentioned, inference from chatting with MS staff suggests, more specifically, that Azure itself is mostly Linux OS running on a Hyper-V virtualization layer.

              • @[email protected]
                link
                fedilink
                English
                3
                8 months ago

                Good question! I can’t remember.

                I think I read a Microsoft blog or something like a decade ago that said they shifted from a Hyper-V based solution to Linux to improve stability, but honestly it’s been so long I wouldn’t be shocked if I just saw it in a reddit comment on a related article that I didn’t yet have the technical knowhow to fully comprehend and took it as gospel.

          • SayCyberOnceMore
            link
            fedilink
            English
            4
            edit-2
            8 months ago

            But, surely Windows is the wrong OS?

            Windows is a per-user GUI… supercomputing is all about crunching numbers, isn’t it?

            I can understand M$ trying to get into this market, and I know Windows Server can be used to run stuff, but again, you don’t need a GUI on each node of a supercomputer… they’d be better off with DOS…?

            • @[email protected]
              link
              fedilink
              3
              edit-2
              8 months ago

              But, surely Windows is the wrong OS?

              Oh yes! To be clear - trying to put any version of Windows on a super-computer is every bit as insane as you might imagine. By what I heard in the rumor mill, it went every bit as badly as anyone might have guessed.

              But I like to root for an underdog, and it was neat to hear about Microsoft engineers trying to take the Windows kernel somewhere it had no rational excuse to run (at the time - and I wonder if they had internal beta versions of stuff that Windows ships standard now, like SSH…), perhaps by sheer force of will and hard work.

            • Badabinski
              link
              fedilink
              7
              8 months ago

              I could see the NT kernel being okay in isolation, but the rest of Windows coming along for the ride puts the kibosh on that idea.

      • @[email protected]
        link
        fedilink
        English
        5
        8 months ago

        Yeah, it was System X I worked on; our default was Red Hat. I forget the other options, but Win and Mac sure as shit weren’t on the list

    • @[email protected]
      link
      fedilink
      English
      38
      8 months ago

      about 10 q5vyrs ago

      Have you been distracted and typed a password/PSK in the wrong field? 8)

  • @[email protected]
    link
    fedilink
    5
    8 months ago

    Any idea how it’d look if broken down into distros? I’m assuming enterprise support would be favoured so Red Hat or Ubuntu would dominate?

    • @[email protected]
      link
      fedilink
      2
      8 months ago

      I can’t imagine supercomputers using a mainstream operating system such as Ubuntu. But clearly people even put Windows on them, so I shouldn’t be surprised…

      • @[email protected]OP
        link
        fedilink
        4
        8 months ago

        They do use Ubuntu, Red Hat and SUSE mostly.
        But for customers like that, the companies are of course willing to adjust the distro to their needs, with full support.
        Microsoft uses their own Linux distro now.

    • @[email protected]OP
      link
      fedilink
      20
      edit-2
      8 months ago

      The previously fastest ran on Red Hat Enterprise Linux, the current fastest runs on SUSE Enterprise Linux.
      The current third fastest (owned by Microsoft) runs Ubuntu. That’s as far as I care to research.

        • CantWeJustCuddle
          link
          fedilink
          1
          8 months ago

          Because Cray has CrayOS, which is a slightly modded version of SUSE. Why did Cray choose SUSE? Probably because the licensed support was cheaper than RHEL 😂

        • veroxii
          link
          fedilink
          13
          8 months ago

          Because all the Arch consultants were busy posting on the internet.

        • @[email protected]
          link
          fedilink
          1
          8 months ago

          Super-computing is a pain in the ass, so my guess is some combination of SUSE picking up top talent that left other Linux vendors as IBM has been purchasing them, and SUSE just being willing to put in the extra work for the added brand recognition.

  • @[email protected]
    link
    fedilink
    23
    8 months ago

    Wow, that’s kind of a lot more Linux than I was expecting, but it also makes sense. Pretty cool tbh.