Ok, let’s give a little bit of context. I will turn 40 in a couple of months and I’ve been a C++ software developer for more than 18 years. I enjoy coding, I enjoy writing “good” code, readable and so on.
However, for the past few months, I’ve become really afraid for the future of the job I like, given the progress of artificial intelligence. Very often I don’t sleep at night because of this.
I fear that my job, while not completely disappearing, will become a very boring one consisting of debugging automatically generated code, or that the job will disappear altogether.
For now, I’m not using AI. I have a few colleagues who do, but I don’t want to, because one, it removes a part of the coding I like, and two, I have the feeling that using it is sawing off the branch I’m sitting on, if you see what I mean. I fear that in the near future, people not using it will be fired because management sees them as less productive…
Am I the only one feeling this way? I have the feeling all tech people are enthusiastic about AI.
I use GitHub Copilot through work. I generally use Python. It doesn’t take away anything, at least for me. It’s big thing is tab completion; it saves me from finishing some lines and adding else clauses. Like I’ll start writing a docstring and it’ll finish it.
Once in a while I can’t think of exactly what I want so I write a comment describing it and Copilot tries to figure out what I’m asking for. It’s literally a Copilot.
Now if I go and describe a big system or try to interface with existing code, it quickly gets confused and tends to get in the weeds. But man, if I need to describe a regex and have it written for me, it’s awesome.
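For example (a made-up snippet of mine, not literal Copilot output), I’ll write a comment describing the pattern I want and it fills in something along these lines:

```python
import re

# grab every email address out of a block of text
def extract_emails(text: str) -> list[str]:
    # rough pattern: name@domain.tld, good enough for casual use
    pattern = r"[\w.+-]+@[\w-]+\.[\w.-]+"
    return re.findall(pattern, text)

print(extract_emails("Contact alice@example.com or bob@test.org"))
# ['alice@example.com', 'bob@test.org']
```

I still read over what it suggests, but it beats puzzling out the regex syntax from scratch.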
Anyways I think there are free alternatives out there that probably work as well. At the end of the day, it’s up to you. Though I’d say don’t knock it till you try it. If you don’t like it, stop using it.
This. I’ve seen SO much hype and FUD and all the while there are thousands of developers grinding out code using these tools.
Does code quality suffer? In my experience, ONLY if they have belt-wielding bean counters forcing them to ship well before it’s actually ready for prime time :)
The tools aren’t perfect, and they most DEFINITELY aren’t a panacea. The industry is in a huge contraction phase right now, so I think we have a while before we have to worry about AI-induced layoffs. And if that happens, the folks doing the laying off are being incredibly short-sighted and are likely to have a high-impact date with a wall in the near future anyway.
Its* big thing
I don’t think you are disturbed by AI, but by capitalism doing anything it can to pay you as little as possible. From a pure value* perspective, assuming your niche skills in C++ are useful*, you have nothing to worry about. You should be paid the same regardless. But in our society, if replacing you with someone “good enough” will work for the business, then yes, you should be worried. But AI isn’t the thing you should be upset by.
*This is obviously subjective, but whether AI merely helps you troubleshoot or fully replaces you is out of scope here.
This is a real danger in the long term. If the advancement of AI and robotics reaches a certain level, it could detach a big portion of the lower and middle classes from society’s flow of wealth and disrupt structures that have existed since the early industrial revolution. The educated common man stops being an asset. The whole world becomes a banana republic where only industry and government are needed and there is an unpassable gap between the common people and the uncaring elite.
White collar never should have been getting paid so much more than blue collar, and I welcome seeing the balance shift, so everyone wants to eat the rich.
The rich will have the weapons and technology. I see a 1984 + Hunger Games scenario as more likely.
White collar never should have been getting paid so much more than blue collar
Actually I see that the other way around. Blue collar should have never been paid so much less than white collar.
This is exactly what I see as the risk. However, the elites running industry are, on average, fucking idiots. So we have been seeing frequent cases of them trying to replace people whose jobs they don’t understand, with technology that even leading scientists don’t fully understand, in order to keep those wages for themselves, all in spite of those who do understand the jobs saying that it is a bad idea.
Don’t underestimate the willingness of upper management to gamble on things and inflict the consequences of failure on the workforce. Nor their willingness to switch to a worse solution, not because it is better or even cheaper but because it means giving less to employees, if they think that they can get away with it.
Right. I agree that in our current society, AI is a net loss for most of us. There will be a few lucky ones who will almost certainly be paid more than they are now, but that will be at the cost of everyone else, and even they will certainly be paid less than the shareholders and executives. The end result is a much lower quality of life for basically everyone. Remember what the Luddites were actually protesting and you’ll see how AI is no different.
It doesn’t matter what you think about AI. It’s very clear that this technology is here to stay and will only improve. From this point on, AI will become deeply integrated into human culture and technology; after all, we’ve been fetishizing it for almost 100 years now. Your only logical option as a developer is to learn how to use it and abuse it. Choosing not to do so is career suicide, possibly even societal suicide depending on how quickly adoption happens.
You’re probably right, in the near future people who can’t use it will be fired. And to that point, they should be fired. Why the fuck would I allow my accountants to do their financial work on paper when Excel exists?
Welcome to the future.
If you are afraid of the capabilities of AI, you should use it. Take one week to use ChatGPT heavily in your daily tasks. Take one week to use Copilot heavily.
Then you can make an informed judgement instead of being irrationally scared of some vague concept.
Yeah, not using it isn’t going to help you when the bottom line is all people care about.
It might take junior dev roles, and turn senior dev into QA, but that skillset will be key moving forward if that happens. You’re only shooting yourself in the foot by refusing to integrate it into your workflows, even if it’s just as an assistant/troubleshooting aid.
It’s not going to take junior dev roles; it’s going to transform the whole workflow and make the dev job more like QA than an actual dev job, since the difference between junior, middle, and senior is often only the scope of their responsibility (I’ve seen companies that make juniors do a full-stack senior’s job while on paper they were still juniors and the paycheck was somewhere between junior and middle dev, and these companies are the majority in rural areas).
I wish your fear were justified! I’ll praise anything that can kill work.
Alas, we’re not there yet. Current AI is a glorified search engine. The problem it will have is that most code today is unmaintainable garbage. So that’s all AI can produce for now: unmaintainable garbage.
First the software industry needs to properly industrialise itself. Then there will be code to copy and reuse.
I’ll praise anything that can kill work under UBI. Without reform, I worry the rich will get richer, the poor will get even poorer, and it will lead to guillotines in the square.
Under capitalism the rich will get richer, and the poor poorer. That’s the whole point of it. Guillotines are a solution to get UBI.
Your last sentence is where I fear we will end up. The very wealthy would be wise to realise it and work on reform themselves.
I disagree that capitalism, at least in the way I understand it, always leads to the rich getting richer and the poor getting poorer. Many European countries have a happy medium that rewards risk-taking while looking after everyone. While most still slowly get worse on the Gini coefficient, that’s driven pretty much by the 0.1% pulling further and further away, while the rest of their societies actually stay roughly the same. So really they only have the top of the top of the top to deal with, whereas a country like the US has a much larger, all-encompassing inequality.
European countries are going fascist one after the other. Why, if there is no problem?
Europe kept capitalism on a leash because communism was there to threaten it. Since the ’90s, capitalism has been unleashed and inequalities are rising. The USA didn’t have communism to tame its capitalism, because it was basically forbidden due to the Cold War.
Capitalism is entirely focused on having companies make a profit. If you don’t have strong states to tame it and redistribute the money, inequalities increase. It’s mathematical.
The rise of fascism has more to do with people’s impression of immigration than it does capitalism.
Inequality in Europe isn’t rising if you disregard the top 0.1%. It’s the very very top that needs adjusting in Europe.
I agree with your last paragraph. Of course you need rules and redistribution. That doesn’t mean that capitalism, if well regulated, isn’t the most productive or the most effective at increasing wealth for everyone.
Fascism has everything to do with poverty and inequality. And inequalities in Europe are rising a lot. Where do you get your information?
Capitalism is a sickness. It breeds crises that lead to war, and it lives off war and exploitation. But that’s beside the point.
I’m a composer. My Facebook is filled with ads like “Never pay for music again!”. It’s fucking depressing.
Good thing there’s no Spotify for sheet music yet… I probably shouldn’t give them ideas.
It won’t replace coders as such. There will be devs who use AI to help them be more productive, and there will be unemployed devs.
If your job truly is in danger, then not touching AI tools isn’t going to change that. The best you can do for yourself is to explore what these tools can do for you and figure out if they can help you become more productive so that you’re not first on the chopping block. Maybe in doing so, you’ll find other aspects of programming that you enjoy just as much and that don’t yet get automated away by these tools. Or maybe you’ll find that they’re not all they’re hyped up to be, and that will ease your worry.
🙄 no I’m sure you’re the only one
I’m a 50+ year old IT guy who started out as a C/C++ programmer in the ’90s and I’m not that worried.
The thing is, all this talk about AI isn’t very accurate. There is a huge difference between the LLM stuff that ChatGPT etc. are built on and true AI. These LLMs are only as good as the data fed into them. The adage “garbage in, garbage out” comes to mind. Anybody who blindly relies on them is a fool. Just ask the lawyer who used ChatGPT to write a legal brief. The “AI” made up references to non-existent cases that looked and sounded legitimate, and the lawyer didn’t bother to check for accuracy. He filed the brief and it was the judge who discovered it was a work of fiction.
Now I know there’s a huge difference between programming and the law, but there are still a lot of similarities here. An AI-generated program is only going to be as good as the samples provided to it, and you’ll probably want a human to review that code to ensure it’s truly doing what you want, at the very least.
I’m also concerned that programming LLMs could be targeted by scammers and the like. Train the LLM to harvest sensitive information and obfuscate the code that does it, so that it’s difficult for a human to spot the malicious code without a highly detailed analysis of the generated output. That’s another reason to want to know exactly what the LLM is trained on.
AI allows us to do more with less just like any other tool. It’s no different than an electric drill or a powered saw. Perhaps in the future we will see more immersive environment games because much of the immersive environment can be made with AI doing the grunt work.
I am on the product side of things and have created some basic proof-of-concept tools with AI that my bosses wanted to sell off. No way, no how will I be able to service or maintain them. It’s incredibly impressive that I could even get this output.
I am not saying it won’t become possible, but I lack the fundamental knowledge and understanding to make anything beyond the most minor adjustments, and AI is still quite bad at only addressing specific issues or, god forbid, expanding code, without fully rewriting the whole thing and breaking everything else.
For our devs I see it as a much improved and less snide Stack Overflow and Google. The direct conversational nature really speeds things up with boilerplate code, and since they actually know what they are doing, it’s amazing. Not only that, but we used to have devs copy-paste from online searches without fully understanding the snippets. Now the AI can explain it in context.
There’s a massive amount of hype right now, much like everything was blockchains for a while.
AI/ML is not able to replace a programmer, especially not a senior engineer. Right now I’d advise you do your job well and hang tight for a couple of years to see how things shake out.
(me = a ~50-year-old DevOps person)
I’m only in my very first year of DevOps, and already I have five years’ worth of AI giving me hilarious, sad, and ruinous answers regarding the field.
I needed proper knowledge of Ansible ONCE so far, and it managed to lie about Ansible to me TWICE. AI is many things, but an expert system it is not.
Well, technically “expert system” is a type of AI from a couple of decades ago that was based on rules.
Great advice. I would just add: learn to leverage those tools effectively. They are a great productivity boost. Another side effect, once they become popular, is that some of the skills we already have will become harder to learn, so they might be in higher demand.
Anyway, make sure you put aside enough money to not have to worry about such things 😃
Currently at the crossroads between the trough of disillusionment and the slope of enlightenment
Betteridge’s law of headlines: No.
The trough of disillusionment is my favorite.
Kind of nice to see NFTs breaking through the floor at the trough of disillusionment, never to return.
Man, it’s a tool. It will change things for us, it is very powerful; but it’s still a tool. It does not “know” anything; there’s no true intelligence in the things we now call “AI”. For now, it’s really useful as a rubber duck: it can make interesting suggestions, help you explore big code bases faster, and even be useful for creating boilerplate. But the code it generates usually is not very trustworthy and is of lower quality.
The reality is not that we will lose our jobs to it, but that companies will expect more productivity from us using these tools. I recommend you try ChatGPT (the best in class for now) and try to understand its strengths and limitations.
Remember: this is just autocomplete on steroids; it does more than the regular version, but it makes the same types of errors.