Just out of curiosity. I have no moral stance on it, if a tool works for you I’m definitely not judging anyone for using it. Do whatever you can to get your work done!

  • @[email protected]
    1 point · 2 years ago

    Used in small doses to generate text with some degree of precision, it’s helpful. I do find it to be a good way to cut out boring email writing. But I would recommend it more as a text-generation tool than a fact-generation tool. With the right expectations and workflow, it fits right in. And no, I don’t consider it plagiarism if the client’s demand is boring.

  • @[email protected]
    5 points · 2 years ago

    I use it to speed up writing scripts on occasion, while attempting to abstract out any possibly confidential data.

    I’m still fairly sure it’s not allowed, however. But considering it would be easy to trace API calls and I haven’t been approached yet, I’m assuming no one really cares.

    • @[email protected]OP
      4 points · 2 years ago

      i have used it to do simple shell scripts - like, “read a text file, parse out a semver, increment the minor version, set the last value to zero, write back out to the text file”. simple stuff that can be easily stated, which it’s pretty good at. mind you, it was a bit wrong and i had to fix it, but it saved me googling commands and writing the script myself. I wouldn’t have bothered normally, but i do that once every two weeks so it’s nice to just have a command to do it.
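
      For illustration, here is a minimal sketch of that kind of bump script. The commenter described a shell script; this version is in Python, and the file name and plain “MAJOR.MINOR.PATCH” layout are assumptions rather than details from the comment:

      ```python
      # Minimal sketch: read a version like "1.4.7" from a text file, bump the
      # minor version, reset the patch to zero, and write it back.
      # "version.txt" and the MAJOR.MINOR.PATCH format are assumed for illustration.
      from pathlib import Path

      path = Path("version.txt")
      major, minor, _patch = (int(part) for part in path.read_text().strip().split("."))
      path.write_text(f"{major}.{minor + 1}.0\n")
      ```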

  • @[email protected]
    5 points · 2 years ago

    As coders, we’ve had discussions about using it at work. Everyone’s fine with it for generating test data or initial code skeletons, but it still requires us to review every line. It saves a bit of time, but that’s all.

  • @[email protected]
    2 points · 2 years ago

    I tell everyone! I suggest my coworkers and bosses do the same.

    Why should I keep it a secret?

  • awkwardparticle
    9 points · 2 years ago

    My whole team was playing around with it, and for a few weeks it was working pretty well for a couple of things, until the answers started to become incorrect and not useful.

  • @[email protected]
    25 points · 2 years ago

    Some of my co-workers use it, and it’s fairly obvious, usually because they are putting out even more inaccurate info than normal.

    • @[email protected]
      5 points · 2 years ago (edited)

      Urgh, one of my coworkers (technically a client, but we work closely alongside each other) clearly uses it for every single email he sends, and it’s nauseating. He’s crass and very poorly spoken in person, yet overnight all his email correspondence is suddenly robotic and unnecessarily flowery. I use it regularly myself, for fast building of Excel formulas and so forth, but please, don’t dump every email into it.

  • Bruno Finger
    8 points · 2 years ago

    We openly use it, and abuse it, from the top to the bottom of the company, and for me, add Copilot to that as well.

  • Platypus
    42 points · 2 years ago

    I’ve been using it a little to automate really stupid simple programming tasks. I’ve found it’s really bad at producing feasible code for anything beyond the grasp of a first-year CS student, but there’s an awful lot of dumb code that needs to be written and it’s certainly easier than doing it by hand.

    As long as you’re very precise about what you want, you don’t expect too much, and you check its work, it’s a pretty useful tool.

    • jecxjo
      10 points · 2 years ago

      I’ve found it useful for basically finding the example code for a third-party library. It’s essentially a version of Stack Exchange that can be better or worse.

      • @[email protected]
        8 points · 2 years ago

        I essentially use it as interactive docs. As long as what you’re learning existed before 2021 it’s great.

        • jecxjo
          6 points · 2 years ago

          Yeah, sadly the times I’ve gotten screwed are when a major version change occurred in 2022. I got burned once doing that, and now I know to check whether we’ve upgraded past the version the code works for before spending too much time on it.

    • @[email protected]
      9 points · 2 years ago

      I don’t know you, the language you use, or the way you use ChatGPT, but I’m a bit surprised at what you say. I’ve been using ChatGPT on a nearly daily basis for months now, and while it’s not perfect, if the task isn’t super complicated and is described well, after a couple of back-and-forths I usually have what I need. It works, does what is expected, without being a horrendous way to code it.

      And GPT-4 is even better.

      • Platypus
        7 points · 2 years ago

        My job involves a lot of shimming code in between systems that are archaic, in-house, or very specific to my industry (usually some combination of the three), so the problems I’m usually solving don’t have much representation in gpt’s training data. Sometimes I get to do more rapid prototyping/sandbox kind of work, and it’s definitely much more effective there where I’m (a) using technologies that might pop up on stack overflow and (b) don’t have a set of arcane constraints the length of my arm to contend with.

        I’m absolutely certain that it’s going to be a core part of my workflow in the future, either when the tech improves or I switch jobs, but for right now the most value I get out of it is as effectively a SO search tool.

        • @[email protected]
          5 points · 2 years ago

          Got it. With context, it makes much more sense.

          I myself use some of the most widely used programming languages (PHP and React, mostly), so yeah, there’s plenty to be found for those.

    • @[email protected]
      2 points · 2 years ago (edited)

      I, like most people, find it easier to write code than to read it. That “check its work” step actually means more work for me.

  • @[email protected]
    28 points · 2 years ago (edited)

    Not ChatGPT, but I tried using Copilot for a month or two to speed up my work (backend engineer). I wound up unsubscribing and removing the plugin after not too long, because I found it had the opposite effect.

    Basically instead of speeding my coding up, it slowed it down, because instead of my thought process being

    1. Think about the requirements
    2. Work out how best to achieve those requirements within the code I’m working on
    3. Write the code

    It would be

    1. Think about the requirements
    2. Work out how best to achieve those requirements within the code I’m working on
    3. Start writing the code and wait for the auto complete
    4. Read the auto complete and decide if it does exactly what I want
    5. Do one of the following, depending on step 4:
       5a. Use the autocomplete as-is
       5b. Use the autocomplete, then modify it to fix a few issues or account for a requirement it missed
       5c. Ignore the autocomplete and write the code yourself

    idk about you, but the first set of steps just seems like a whole lot less hassle than the second set of steps, especially since for anything that involved any business logic or internal libraries, I found myself using 5c far more often than the other two. And as a bonus, I actually fully understand all the code committed under my username, on account of actually having written it.

    I will say, though, in the interest of fairness, that there were a few instances where I was blown away by Copilot’s ability to figure out what I was trying to do and give a solution for it. Most of those times were when I was writing semi-complex DB queries (via Django’s ORM), so if you’re just writing a dead simple CRUD API without much complex business logic, you may find value in it. But for the most part, I found that it just increased the cognitive overhead and the time spent on my tickets.

    EDIT: I did use ChatGPT for my peer reviews this year, though, and thought it worked really well for that sort of thing. I just put in what I liked about my coworkers and where I thought they could improve, in simple English, and it spat out very professional peer reviews in the format expected by the review form.

    • @[email protected]
      3 points · 2 years ago

      Those different sets of steps basically boil down to a student finding all the ways they can to cheat and spending hours doing it, when they could have just used less time to study for the test.

      Not saying that you’re cheating, just that it’s the same idea. Usually the quickest solution is to just tackle the thing head-on rather than find the lazy workaround.

      • @[email protected]
        1 point · 2 years ago

        What I think ChatGPT is great for in programming is ‘I know what I want to do but can’t quite remember the syntax for how to do it’. In those scenarios it’s so much faster than wading through the endless blogspam and SEO guff that search engines deal in now, and it’s got much less of a superiority complex than some of the denizens of SO too.

    • @[email protected]
      1 point · 2 years ago

      As a side note, whilst I don’t really use AI to help with coding, I was kinda expecting what you describe, even more so for stuff like having ChatGPT do whole modules.

      You see, I’ve worked as a freelancer (contractor) for most of my career now, and in practice that mostly means coming in and fixing/upgrading somebody else’s codebase, though I’ve also done some so-called “greenfield projects” (entirely new work). In my experience, “understanding somebody else’s code” is a lot more cognitively heavy than “coming up with your own stuff” - in fact, some of my projects would probably have gone faster if we had just rewritten the whole thing (but that wasn’t my call to make, and often the business side doesn’t want to risk it).

      I’m curious if multiple different pieces of code done with AI actually have the same coding style (at multiple levels, so also software design approach) or not.

  • @[email protected]
    12 points · 2 years ago (edited)

    I might tell them, just as I might tell them I used Google to find something out. It doesn’t really pop up in conversation that often, but I wouldn’t hide the fact. It’s just almost totally irrelevant.

    • @[email protected]
      1 point · 2 years ago

      For code snippets especially. I mean, the thing is limited in input size and doesn’t remember the context of running conversations that well.

  • flynnguy
    70 points · 2 years ago

    I had a coworker come to me with an “issue” he learned about. It was wrong, and it wasn’t really an issue, and then it came out that he got it from ChatGPT and didn’t really know what he was talking about, nor could he cite an actual source.

    I’ve also played around with it and it’s given me straight up wrong answers. I don’t think it’s really worth it.

    It’s just predictive text, it’s not really AI.

    • Echo71Niner
      26 points · 2 years ago

      I concur. ChatGPT is, in fact, not an AI; rather, it operates as a predictive text tool. This is the reason behind the numerous errors it tends to generate, and its lack of self-review prior to generating responses is the clearest indication that it is not an AI. You can identify instances where ChatGPT provides incorrect information, correct it, and within 5 seconds of asking again it repeats the same inaccurate information in its response.

      • @[email protected]
        24 points · 2 years ago

        It’s definitely not artificial general intelligence, but it’s for sure AI.

        None of the criteria you mentioned are needed for it be labeled as AI. Definition from Oxford Libraries:

        the theory and development of computer systems able to perform tasks normally requiring human intelligence, such as visual perception, speech recognition, decision-making, and translation between languages.

        It definitely fits in this category. It is being used in ways that previously required talking to customer support or a domain expert. Yes, it makes mistakes, but so do humans. And even if talking to a human would still be better, it’s still a useful AI tool, even if it’s not flawless yet.

        • @[email protected]
          5 points · 2 years ago

          It just seems to me that by this definition, the moment we figure out how to do something with a computer, it ceases to be AI because it no longer requires human intelligence to accomplish.

    • @[email protected]
      1 point · 2 years ago

      Isn’t that what humans also do, and what makes us intelligent? We analyze patterns and predict what will come next.

    • @[email protected]OP
      8 points · 2 years ago

      i think learning where it can actually help is a bit of an art - it’s just predictive text, but it’s very good predictive text - if you know what you need and get good at giving it the right input, it can save a huge amount of time. you’re right though, it doesn’t offer much if you don’t already know what you need.

      • 7bicycles [he/him]
        1 point · 2 years ago

        Can you give me an example? I keep hearing this, but every time somebody presents something, be it work-related or not, it feels like at best it would serve as better lorem ipsum.

        • @[email protected]
          4 points · 2 years ago

          I’ve had good success using it to write Python scripts for me. They’re simple enough I would be able to write them myself, but it would take a lot of time searching and reading StackOverflow/library docs/etc since I’m an amateur and not a pro. GPT lets me spend more time actually doing the things I need the scripts for.

          • @[email protected]
            1 point · 2 years ago

            I use it with web development by describing what I want something to look like and having it generate a React component based on my description.

            Is what it gives me the final product? Sometimes, but it’s such a help to knock out a bunch of boilerplate and get me close to what I want.

            Also generating documentation is nice. I wanted to fill out some internal wiki articles to help people new to the industry have something to reference. Spent maybe an hour having a conversation asking all of the questions I normally run into. Cleaned up the GPT text, checked for inaccuracies, and cranked out a ton of resources. That would have taken me days, if not weeks.

            At the end of the day, GPT is better with words than I am, but it doesn’t have the years of experience I have.

    • @[email protected]
      3 points · 2 years ago

      More often than not, you need to be very specific and have some knowledge of the stuff you ask it about.

      However, you can guide it to give you exactly what you want. I feel like knowing how to interact with GPT is becoming similar to being good at googling stuff.

  • @[email protected]
    75 points · 2 years ago

    A junior team member sent me an AI-generated sick note a few weeks ago. It was many, many neat and equally-sized paragraphs of badly written excuses. I would have accepted “I can’t come in to work today because I feel unwell” but now I can’t take this person quite so seriously any more.

    • @[email protected]
      29 points · 2 years ago

      Classic over explaining to cover up a lie.

      I never send anything other than “I’ll be out of the office today” for every PTO notice.

      • @[email protected]
        6 points · 2 years ago

        Exactly, and let’s be honest, your coworkers don’t want to hear about your explosive diarrhea problems or the weird mole on your butt.

      • @[email protected]
        14 points · 2 years ago

        I dunno, I’d consider it a moral failing on the part of the person who couldn’t be honest and direct, even if there’s a cultural issue in the workplace.

      • @[email protected]
        2 points · 2 years ago

        Exactly. If they’re too lazy to write a fake sick note, then they’re certainly too lazy to work. Either send them in for remediation or terminate them; either way, they shouldn’t be in the workplace.

  • @[email protected]
    1 point · 2 years ago

    I’ve used it for writing job descriptions. The final output is different after I’ve tweaked it but it’s much easier than starting with a blank page.

  • @[email protected]
    5 points · 2 years ago

    Technically not ChatGPT, but GPT-4 directly via the API. And I also told my boss about it ;) I used it as a baseline for a few translations, had it generate some boilerplate, and had it create classes in multiple languages from some JSON. But most of the time I have no need it can fulfill, and most of the (tiny) bill OpenAI sends me comes from personal stuff, like getting ideas for what to do for our wedding anniversary (the boat ride was a great idea :D), or running a funny bot in two Twitch chats I moderate (I use 3.5 there because it’s over an order of magnitude cheaper).
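
    For reference, a minimal sketch of what “GPT-4 directly via the API” can look like with the openai Python package. The 1.x client interface, the model name, and the JSON-to-class prompt are assumptions for illustration, not details from the comment:

    ```python
    # Hedged sketch: call GPT-4 through the OpenAI API instead of the ChatGPT UI.
    # Assumes the openai package (>= 1.0) is installed and OPENAI_API_KEY is set.
    from openai import OpenAI

    client = OpenAI()  # picks up OPENAI_API_KEY from the environment

    response = client.chat.completions.create(
        model="gpt-4",
        messages=[
            {"role": "system", "content": "You turn JSON samples into class definitions."},
            {"role": "user", "content": 'Write a Python dataclass for: {"name": "Ada", "age": 36}'},
        ],
    )
    print(response.choices[0].message.content)
    ```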