• SuperiorOne
    link
    fedilink
    English
    35
    10 months ago

    ‘Soon’ is a questionable claim from a CEO who sells AI services and GPU instances. A single faulty update recently caused worldwide downtime. Now, imagine all infrastructure written with today’s LLMs - which sometimes hallucinate so badly, they claim the ‘C’ in CRC-32C stands for ‘Cool’.

    I wish we could also add a “Do not hallucinate” prompt to some CEOs.
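    For the record, the ‘C’ stands for Castagnoli (polynomial 0x1EDC6F41). A minimal bit-at-a-time sketch in Python, purely to pin the fact down - slow, illustration only, not how production CRCs are computed:

```python
def crc32c(data: bytes) -> int:
    """CRC-32C (Castagnoli) - the 'C' is not for 'Cool'."""
    crc = 0xFFFFFFFF
    for byte in data:
        crc ^= byte
        for _ in range(8):
            # 0x82F63B78 is the reflected form of the Castagnoli polynomial 0x1EDC6F41
            crc = (crc >> 1) ^ 0x82F63B78 if crc & 1 else crc >> 1
    return crc ^ 0xFFFFFFFF
```

    The standard check value is `crc32c(b"123456789") == 0xE3069283`.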

  • @[email protected]
    link
    fedilink
    English
    8
    10 months ago

    Most companies can’t even give decent requirements for humans to understand and implement. An AI will just write any old stuff it thinks they want and they won’t have any way to really know if it’s right etc.

    They would have more luck trying to create an AI that takes whimsical ideas and turns them into quantified requirements with acceptance criteria. Once they can do that they may stand a chance of replacing developers, but it’s gonna take far more than the simpleton code generators they have at the moment which at best are like bad SO answers you copy and paste then refactor.

    This isn’t even factoring in automation testers who are programmers, build engineers, devops etc. Can’t wait for companies to cry even more about cloud costs when some AI is just lobbing everything into lambdas 😂

  • @[email protected]
    link
    fedilink
    English
    15
    10 months ago

    Let me weigh in with something. The hard part about programming is not the code. It is in understanding all the edge cases, making flexible solutions and so much more.

    I have seen many organizations with tens of really capable programmers that can implement anything. Now, most management barely knows what they want or what the actual end goal is. Since managers aren’t capable of delivering perfect products every time even with really skilled programmers, if I subtract programmers from the equation and substitute in a magic box that delivers code to managers whenever they ask for it, the managers won’t do much better. The biggest problem is not knowing what to ask for, and even if you DO know what to ask for, managers typically will ignore all the fine details.

    By the time there is an AI intelligent enough to coordinate a large technical operation, AIs will be capable of replacing attorneys, congressmen, patent examiners, middle managers, etc. It would really take a GENERAL artificial intelligence to be feasible here, and you’d be wildly optimistic to say we are anywhere close to having one of those available on the open market.

    • @[email protected]
      link
      fedilink
      English
      1
      edit-2
      10 months ago

      I agree with you completely, but he did say there’d be no need for ‘human programmers’, not ‘human software engineers’. The skill set you are describing is, I would put forward, one of the biggest differences between the two, if not the biggest.

      • @[email protected]
        link
        fedilink
        English
        4
        10 months ago

        This is really splitting hairs, but if you asked that cloud CEO if he employed programmers or ‘software engineers’ he would almost certainly say the latter. The larger the company, the greater the chance they have what they consider an ‘engineering’ department. I would guess he employs 0 “programmers” or ‘engineeringless programmers’.

        • @[email protected]
          link
          fedilink
          English
          1
          edit-2
          10 months ago

          Anyone in software engineering will tell you that as you get more senior you spend less time writing lines of code and more time planning, designing, testing, reviewing, and deleting code.

          This will continue to be true; it’s just that there will be fewer juniors below whose whole job is to produce code that meets a predefined spec or passes an existing test. Instead, a smaller number of juniors will use AI tools to increase their productivity, while still requiring the same amount of direction and oversight. The small amounts of code the seniors write will also get smaller and faster to write, as they also use AI tools to generate boilerplate while filling in the important details.

  • @[email protected]
    link
    fedilink
    English
    12
    10 months ago

    The sentiment on AI in the span of 10 years went from “it’s inevitable it will replace your job” to “nope, not gonna happen”. The difference is that back then, the jobs it was going to replace were not tech jobs. Just saying.

    • @[email protected]
      link
      fedilink
      English
      1
      edit-2
      10 months ago

      From the very beginning people were absolutely making connections between ai and tech jobs like programming.

      The fuck are you talking about? Are you seriously trying to imply that now that it’s threatening tech jobs (it’s not), suddenly the narrative around how useful it will be changed (it didn’t)?

      • @[email protected]
        link
        fedilink
        English
        1
        edit-2
        10 months ago

        From the very beginning

        When exactly do you have in mind? I’m talking about automation; roughly around 2010, the discourse was primarily centered on blue-collar jobs. The discussion was about those careers becoming obsolete if AI ever advanced to the point where the tasks required little to no human involvement.

        Back then, AI with regard to white-collar jobs was nowhere near the primary focus of the discourse, much less programming.

        Tech nerds back then were all gung ho about it making entire careers obsolete in the near future. Truck drivers were supposed to be a dead career by now. They absolutely do not hold the same enthusiasm right now when it’s being said about their own careers.

        Are you seriously trying to imply

        You’re way off the mark. Save your outrage.

  • @[email protected]
    link
    fedilink
    English
    5
    10 months ago

    How much longer until cloud CEOs are a thing of the past? Wouldn’t an AI sufficiently intelligent to solve technical problems at scale also be able to run a large corporate division? By the time this is actually viable, we are all fucked.

  • @[email protected]
    link
    fedilink
    English
    8
    edit-2
    10 months ago

    Everyone was always joking about how AI should just replace CEOs, but it turns out CEOs are so easily led by the nose that AI companies practically already run the show.

  • @[email protected]
    link
    fedilink
    English
    30
    edit-2
    10 months ago

    Extremely misleading title. He didn’t say programmers would be a thing of the past, he said they’ll be doing higher level design and not writing code.

    • @[email protected]
      link
      fedilink
      English
      6
      10 months ago

      So they would be doing engineering and not programming? To me that sounds like programmers would be a thing of the past.

    • @[email protected]
      link
      fedilink
      English
      1
      10 months ago

      We’ll be able to use the newly found time to realise our dream of making PMs redundant by automating them.

    • jackeryjoo
      link
      fedilink
      English
      37
      10 months ago

      Even so, he’s wrong. This is the kind of stupid thing someone without any first hand experience programming would say.

      • @[email protected]
        link
        fedilink
        English
        4
        edit-2
        10 months ago

        I heard a lot of programmers say it

        Edit: why is everyone downvoting me lol. I’m not agreeing with them but I’ve seen and met a lot that do.

        • jackeryjoo
          link
          fedilink
          English
          31
          10 months ago

          They’re falling for a hype train then.

          I work in the industry. With several thousand of my peers every day that also code. I lead a team of extremely talented, tenured engineers across the company to take on some of the most difficult challenges it can offer us. I’ve been coding and working in tech for over 25 years.

          The people who say this are people who either do not understand how AI (LLMs in this case) work, or do not understand programming, or are easily plied by the hype train.

          We’re so far off from this existing with the current tech, that it’s not worth seriously discussing.

          There are scripts, snippets of code that vscode’s llm or VS2022’s llm plugin can help with/bring up. But 9 times out of 10 there’s multiple bugs in it.

          If you’re doing anything semi-complex it’s a crapshoot if it gets close at all.

          It’s not bad for generating pseudo-code, or templates, but it’s designed to generate code that looks right, not be right; and there’s a huge difference.

          AI-generated code is exceedingly buggy, and if you don’t understand what it’s trying to do, it’s impossible to debug, because what it generates is trash-tier levels of code quality.

          The tech may get there eventually, but there’s no way I trust it, or anyone I work with trusts it, or considers it a serious threat or even resource beyond the novelty.

          It’s useful for non-engineers to get an idea of what they’re trying to do, but it can just as easily send them down a bad path.

          • @[email protected]
            link
            fedilink
            English
            1
            10 months ago

            Had to do some bullshit ai training for work. Tried to get the thing to remake cmatrix in python.

            Yeah no, that’s not replacing us anytime soon, lmao.

          • @[email protected]
            link
            fedilink
            English
            1
            10 months ago

            People use visual environments to draw systems and then generate code for specific controllers, that’s in control systems design and such.

            In that sense there are already situations where they don’t write code directly.

            But this has nothing to do with LLMs.

            Just for designing systems in one place visual environments with blocks might be more optimal.

            • @[email protected]
              link
              fedilink
              English
              3
              10 months ago

              And often you still have actual developers reimplementing this shit because EE majors don’t understand dereferencing null pointers is bad

      • @[email protected]
        link
        fedilink
        English
        3
        10 months ago

        Not really, it’s doable with chatgpt right now for programs that have a relatively small scope. If you set very clear requirements and decompose the problem well it can generate fairly high quality solutions.

        • @[email protected]
          link
          fedilink
          English
          8
          10 months ago

          right now, not a chance. it’s okay-ish at simple scripts. it’s alright as an assistant to get a buggy draft for anything even vaguely complex.

          ai doing any actual programming is a long ways off.

        • jackeryjoo
          link
          fedilink
          English
          18
          10 months ago

          This is incorrect. And I’m in the industry. In this specific field. Nobody in my industry, in my field, at my level, seriously considers this effective enough to replace their day-to-day coding beyond generating some boilerplate ELT/ETL-type scripts, which it is semi-effective at. It still contains multiple errors 9 times out of 10.

          I cannot be more clear. The people who are claiming that this is possible are not tenured or effective coders, much less 10x devs in any capacity.

          People who think it generates quality enough code to be effective are hobbyists, people who dabble with coding, who understand some rudimentary coding patterns/practices, but are not career devs, or not serious career devs.

          If you don’t know what you’re doing, LLMs can get you close, some of the time. But there’s no way it generates anything close to quality enough code for me to use without the effort of rewriting, simplifying, and verifying.

          Why would I want to voluntarily spend my day trying to decipher someone else’s code? I don’t need ChatGPT to solve a coding problem. I can do it, and I will. My code will always be more readable to me than someone else’s. This is true by orders of magnitude for AI code gen today.

          So I don’t consider anyone who thinks LLM code gen is a viable path forward to be a serious person in the engineering field.

          • @[email protected]
            link
            fedilink
            English
            3
            edit-2
            10 months ago

            It’s just a tool like any other. An experienced developer knows that you can’t apply every tool to every situation. Just like you should know the difference between threads and coroutines and know when to apply them. Or know which design pattern is relevant to a given situation. It’s a tool, and a useful one if you know how to use it.

            • @[email protected]
              link
              fedilink
              English
              6
              10 months ago

              This is like applying a tambourine made of optical discs as a storage solution. A bit better cause punctured discs are no good.

              A full description of what a program does is the program itself, have you heard that? (except for UB, libraries, … , but an LLM is no better than a human in that too)

      • @[email protected]
        link
        fedilink
        English
        6
        10 months ago

        Yeah, there are people who can “in general” imagine how this will happen, but programming is exactly 99% not about “in general” but about specific “dumb” conflicts in the objective reality.

        People think that what they generally imagine as the task is the most important part, and since they don’t actually do programming or anything requiring to deal with those small details, they just plainly ignore them, because those conversations and opinions exist in subjective bendable reality.

        But objective reality doesn’t bend. Their general ideas without every little bloody detail simply won’t work.

    • Todd Bonzalez
      link
      fedilink
      English
      1
      10 months ago

      How is “not writing code” different from programmers being a thing of the past?

      What do you think programmers do?

    • @[email protected]
      link
      fedilink
      English
      3
      edit-2
      10 months ago

      Sounds like he’s just repeating a common meme. I don’t see anything about higher level design that would make it more difficult for an AI (hypothetical future AI, not the stuff that’s available now) compared to lower level tasks.

    • @[email protected]
      link
      fedilink
      English
      2
      10 months ago

      When my job was outsourced a few years back, I was thinking there was probably a boatload of Indians coming out of management schools who would do a great job at C level! For a fraction of the price.

  • @[email protected]
    link
    fedilink
    English
    47
    10 months ago

    The job of CEO seems far easier to replace with AI. A fairly basic algorithm with weighted goals and parameters (chosen by the board) + LLM + character avatar would probably perform better than most CEOs. Leave out the LLM if you want it to spout nonsense like this Amazon Cloud CEO.
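    Tongue firmly in cheek, the “weighted goals” part is a couple of lines of Python; every goal name and weight below is invented for the joke:

```python
import random

# Hypothetical board-chosen goals and weights -- all invented for the bit.
BOARD_WEIGHTS = {"cut costs": 0.5, "chase the AI hype": 0.3, "stock buybacks": 0.2}

def ceo_decision(weights: dict) -> str:
    # Pick the next corporate initiative in proportion to the board's weights.
    goals = list(weights)
    return random.choices(goals, weights=[weights[g] for g in goals], k=1)[0]
```

    Bolt an avatar and an LLM on top for the earnings calls.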

  • Lettuce eat lettuce
    link
    fedilink
    English
    52
    10 months ago

    Lol sure, and AI made human staff at grocery stores a thing of the…oops, oh yeah…y’all tried that for a while and it failed horribly…

    So tired of the bullshit “AI” hype train. I can’t wait for the market to crash hard once everybody realizes it’s a bubble and AI won’t magically make programmers obsolete.

    Remember when everything was using machine learning and blockchain technology? Pepperidge Farm remembers…

    • @[email protected]
      link
      fedilink
      English
      14
      10 months ago

      It’s the pinnacle of MBA evolution.

      In their worldview, engineers are raw material, and all that matters in the world is knowing how to do business. So it just makes sense that one can guide and use and direct engineers to replace themselves.

      They don’t think of fundamentals, they really believe it’s some magic that happens all by itself, you just have to direct energy and something will come out of it.

      Lysenko vibes.

      This wouldn’t happen were the C-suite not mostly composed of bean counters. They really think they are to engineers what officers are to soldiers. The issue is, an officer must perfectly know everything a soldier knows, plus their own specialty, and also bears responsibility. Bean counters in general have less education, experience, and intelligence than the engineers they direct, and also avoid responsibility all the time.

      So, putting themselves as some superior caste, they really think they can “direct progress” to replace everyone else the way factories with machines replaced artisans.

      It’s literally a whole layer of people who know how to get power, but not how to create it, and imagine weird magical stuff about things they don’t know.

        • @[email protected]
          link
          fedilink
          English
          2
          10 months ago

          Yeah, that’s what I mean. Black boxes are a concept to accelerate development, but we can’t blackbox ourselves through civilization. They are also mostly useful for horizontal, not vertical relationships, which people misunderstand all the time (leaky abstractions).

          This actually should make us optimistic. If hierarchical blackboxing were efficient, it would be certain that human societies would become more and more fascist and hierarchical over time, while not slowing down in development. But it’s not.

  • @[email protected]
    link
    fedilink
    English
    27
    10 months ago

    amazon cloud CEO reveals that they have terminal CEO brain and have no idea what reality is like for the people they’re in charge of

    checks out