Full disclosure: I work as a software developer in the US, and often have to keep my negative opinions about the tech industry to myself. I often post podcasts and articles critical of the tech industry here in order to vent and, in a way, commiserate over the current state of tech and its negative effects on our environment and on the global/American sociopolitical landscape.
I’m generally reluctant to express these opinions IRL as I’m afraid of burning certain bridges in the tech industry that could one day lead to further employment opportunities. I also don’t want to get into these kinds of discussions except with my closest friends and family, as I could foresee them getting quite heated and lengthy with certain people in my social circles.
Some of these negative opinions include:
- I think that the industries based around cryptocurrencies and other blockchain technologies have always been, and have repeatedly proven themselves to be, nothing more or less than scams run and perpetuated by scam artists.
- I think that the AI industry is particularly harmful to writers, journalists, actors, artists, and others. This is not because AI produces better pieces of work, but rather due to misanthropic viewpoints of particularly toxic and powerful individuals at the top of the tech industry hierarchy pushing AI as the next big thing due to their general misunderstanding or outright dislike of the general public.
- I think that capitalism will ultimately doom the tech industry, as it reinforces poor system design that deemphasizes maintenance and maintainability in favor of the "move fast and break things" mentality that still pervades many parts of tech.
- I think we’ve squeezed as much capital out of advertising as is possible without completely alienating the modern user, and we risk creating strong anti-tech sentiment among the general population if we don’t figure out a less intrusive way of monetizing software.
You can agree or disagree with me, but in this thread I’d prefer not to get into arguments over the particular details of why any one of our opinions are wrong or right. Rather, I’d hope you could list what opinions on the tech industry you hold that you feel comfortable expressing here, but are, for whatever reason, reluctant to express in public or at work. I’d also welcome an elaboration of said reason, should you feel comfortable to give it.
I doubt we can completely avoid disagreements, but I’ll humbly ask that we all attempt to keep this as civil as possible. Thanks in advance for all thoughtful responses.
Most IT infrastructure exists solely to justify pointless work.
One of the worst IT sectors is ad tech. The entire industry, rationally speaking, should not exist.
Good article on this:
‘Using cloud software will lead to lower costs and a better overall service quality’
When I was in undergrad I did debate, and a term used to describe some debate topics was “a solution in need of a problem”. I think that very often characterizes the tech industry as a whole.
There is legitimately interesting math going on behind the scenes with AI, and it has a number of legitimate, if specialized, use-cases - sifting through large amounts of data, etc. However, if you’re an AI company, there’s more money to be made marketing to the general public and trying to sell AI to everyone on everything, rather than keeping it within its lane and letting it do the thing that it does well, well.
Even something like blockchain and cryptocurrency is built on top of somewhat novel and interesting math. What makes it a scam isn’t the underlying technology, but rather the speculation bubbles that pop up around it, and the fact that the technology isn’t being used for applications other than pushing a Ponzi scheme.
For my own opinions - I don’t really have anything I don’t say out loud, but I definitely have some unorthodox opinions.
-
I think that the ultra-convenient mobile phone, always on your person at all times, has been a net detriment, societally speaking. That is to say, the average iPhone user would be living a happier, more fulfilling, more authentic life if iPhones had not become massively popular. Modern tech too often substitutes online interactions for genuine in-person ones, when the former only approximate the latter. The instant gratification of always having access to all these opinions at all times has created addictions to social media that are harder to quit than cocaine (source: I have a friend who successfully quit cocaine, and she said that she could never quit Instagram). Constantly-on GPS results in people not knowing how to navigate their own towns; if you automate something away without first learning how to do it, you will never learn how to do it. While that’s fine most of the time, there are emergency situations where it just results in people being generally less competent than they otherwise would have been.
-
For the same reason, I don’t like using IDEs. For example, when I code in Java, the ritual of typing “import javafx.application.Application;” or whatever helps make me consciously aware that I’m using that specific package, and gets me in the headspace. Plus, being constantly reminded of what every single little thing does makes it much easier, for me at least, to read and parse code quickly. (But I also haven’t done extensive coding since I was in undergrad.)
-
Microsoft Excel needs to remove February 29th, 1900, a date that never actually existed. I get that they keep it so that serial dates stay backwards compatible with Lotus 1-2-3, which had the same leap-year bug back in the 1980s; it’s still an annoying pet peeve.
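For anyone curious, the phantom date is easy to demonstrate. A quick sketch with the Python standard library (just an illustration, nothing Excel-specific) showing that the real Gregorian calendar has no leap day in 1900:

```python
import calendar
from datetime import date

# Century years are leap years only if divisible by 400,
# so 1900 is NOT a leap year (but 2000 is).
print(calendar.isleap(1900))  # False
print(calendar.isleap(2000))  # True

# The real calendar has no Feb 29, 1900; Python refuses to build it,
# while Excel happily assigns that day serial number 60 for compatibility.
try:
    date(1900, 2, 29)
except ValueError as err:
    print(err)  # e.g. "day is out of range for month"
```

A side effect of the bug: Excel serial dates before March 1, 1900 are off by one relative to the real calendar.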
-
Technology is not the solution to every problem, and technology can make things worse as much as it can make things better. Society seems to have a cult around technological progress, where any new tech is intrinsically a net good, and where, given any problem, the first attempted solution should be a technological one. But things like the Hyperloop and Tesla self-driving cars don’t come anywhere near as close to solving transportation problems as simply implementing a robust public transit network with tech that’s existed for 200 years (trains, trolleys, buses) would.
I’m interested in reading more about coding java without an IDE, what’s your usual workflow? Do you use maven or gradle or something else? Are there solutions or scripts you use to make up for some functionality of an IDE?
For the same reason, I don’t like using CLIs.
IDEs?
Yes, my bad, I get all the TLAs mixed up.
-
It’s one of the reasons I enjoy working on open source. Sure, the companies that pay the bills for that maintenance might not be ones you would work for directly, but I satisfy myself that we are improving a commons that everyone can take advantage of.
I told my lib colleague about how many software creators provide their stuff and its source code for free, and he could barely get why; I also told him that historically many nations just left their research and findings publicly available for people to learn from, and he can’t grasp why that was either.
He does truly believe the profit motive is the only (best?) way to advance science.
Yes and no. On a lot of the projects I work on, the majority of the engineers are funded by companies which have very real commercial drivers to do so. However, the fact that the code itself is free (as in freedom) means that everyone benefits from the commons, and as a result interesting contributions come up which aren’t on the commercial roadmap. Look at git, a source control system Linus built because he needed something to maintain Linux in and didn’t like any of the alternatives. It scratched his itch, but it is now the basis for a large industry of code forges with git at their heart.
While we have roadmaps for features we want they still don’t get merged until they are ready and acceptable to the upstream which makes for much more sustainable projects in the long run.
Interestingly, while we have had academic contributions, there are a lot more research projects that use the public code as a base but whose work is never upstreamed, because the focus is on getting the paper/thesis done. Code can work and prove the thing they’re investigating, but still need significant effort to get merged.
I think most people who actually work in software development will agree with you on those things. The problem is that it’s the marketing people and investors who disagree with you, but it’s also them who get to make the decisions.
I took some VC money to build some bullshit and I’ll do it again!
A very large portion (maybe not quite a majority) of software developers are not very good at their jobs. Just good enough to get by.
And that is entirely okay! Applies to most jobs, honestly. But there is really NO appropriate way to express that to a coworker.
I’ve seen way too much “just keep trying random things without really knowing what you’re doing, and hope you eventually stumble into something that works” attitude from coworkers.
I actually would go further and say that collectively, we are terrible at what we do. Not every individual, but the combination of individuals, teams, management, and business requirements mean that collectively we produce terrible results. If bridges failed at anywhere near the rate that software does, processes would be changed to fix the problem. But bugs, glitches, vulnerabilities etc. are rife in the software industry. And it just gets accepted as normal.
It is possible to do better. We know this, from things like the stuff that sent us to the moon. But we’ve collectively decided not to do better.
The tech industry is so very capitalistic. So many companies see devs as min-max churn machines. Tech debt? Nah: FEATURES! AI! MODERNITY! That new dev needs to be trained in the basics and best practices? Sorry, that’s not within scope.
Managers decided that by forcing people to deliver before it’s ready. It’s better for the company to have something that works but with bugs, rather than delaying projects until they are actually ready.
In most fields where people write code, writing code is just about gluing stuff together, and code quality doesn’t matter (simplicity does though).
Game programmers and other serious large app programmers are probably the only ones where it matters a lot how you write the code.
Kind of the opposite actually.
The Business™️ used to make all decisions about what to build and how to build it, shove those requirements down and hope for the best.
Then the industry moved towards Agile development where you put part of the product out and get feedback on it before you build the next part.
There’s a fine art to deciding which bugs to fix when. Most companies I’ve worked with aren’t very good at it to begin with. It’s a special skill to learn and practice.
Agile is horrible though. It sounds good in theory, but oh my god, it’s so bad.
It’s usually the implementation of Agile that’s bad.
The Manifesto’s organizing principles are quite succinct and don’t include a lot of the things that teams dislike.
We follow these principles:
- Our highest priority is to satisfy the customer through early and continuous delivery of valuable software.
- Welcome changing requirements, even late in development. Agile processes harness change for the customer's competitive advantage.
- Deliver working software frequently, from a couple of weeks to a couple of months, with a preference to the shorter timescale.
- Business people and developers must work together daily throughout the project.
- Build projects around motivated individuals. Give them the environment and support they need, and trust them to get the job done.
- The most efficient and effective method of conveying information to and within a development team is face-to-face conversation.
- Working software is the primary measure of progress.
- Agile processes promote sustainable development. The sponsors, developers, and users should be able to maintain a constant pace indefinitely.
- Continuous attention to technical excellence and good design enhances agility.
- Simplicity--the art of maximizing the amount of work not done--is essential.
- The best architectures, requirements, and designs emerge from self-organizing teams.
- At regular intervals, the team reflects on how to become more effective, then tunes and adjusts its behavior accordingly.
Main difference is, a bridge that fails physically breaks, takes months to repair, and risks killing people. Your average CRUD app… maybe a dev loses a couple of hours figuring out how to fix live data for the affected client, the bug gets fixed, and everybody goes on with their day.
Remember that we almost all code to make products that will make a company money. There’s just no financial upside to doing better in most cases, so we don’t. The financial consequences of most bugs just aren’t great enough to make the industry care. It’s always about maximizing revenue.
maybe a dev loses a couple of hours figuring out how to fix live data for the affected client, the bug gets fixed, and everybody goes on with their day.
Or thousands of people get stranded at airports as the ticketing system goes down or there is a data breach that exposes millions of people’s private data.
Some companies have been able to implement robust systems that can take major attacks, but that is generally because they are more sensitive to revenue loss when these systems go down.
Yup, this is exactly it. There are very few software systems whose failure does not impact people. Sure, it’s rare for it to kill them, but they cause people to lose large amounts of money, valuable time, or sensitive information. That money loss is always, ultimately, paid by end consumers. Even in B2B software, there are human customers of the company that bought/uses the software.
I’m not sure if you’re agreeing or trying to disprove my previous comment - IMHO, we are saying the exact same thing. As long as those stranded travelers or data breaches cost less than the missed business from not getting the product out in the first place, from a purely financial point of view, it makes no sense to withhold the product’s release.
Let’s be real here, most developers are not working on airport ticketing systems or handling millions of users’ private data, and the cost of those systems failing isn’t nearly as dramatic. Those rigid procedures civil engineers have to follow come from somewhere, and it’s usually not from any individual engineer’s good will, but from regulations and procedures written in the blood of previous failures. If companies really had to feel the cost of data breaches, I’d be willing to wager we’d suddenly see a lot more traction over good development practices.
… If companies really had to feel the cost of data breaches, I’d be willing to wager we’d suddenly see a lot more traction over good development practices.
That’s probably why downtime clauses are a thing in contracts between corporations; they set a cap on the losses a corporation can suffer, and it’s always significantly less than getting slapped by the gov’t if it ever went to court.
I’m just trying to highlight that there is a fuzzier middle ground than a lot of programmers want to admit. Also, a lot of regulations for that middle ground haven’t been written; the only attention it has gotten has come when companies have seen failures hit their bottom line.
I’m not saying the middle ground doesn’t exist, but that said middle ground visibly doesn’t cause enough damage to businesses’ bottom line, leading to companies having zero incentive to “fix” it. It just becomes part of the cost of doing business. I sure as hell won’t blame programmers for business decisions.
It just becomes part of the cost of doing business.
I agree with everything you said except for this. Often times, it isn’t the companies that have to bear the costs, but their customers or third parties.
That’s why I don’t work on mission critical stuff.
If my apps fail, some Business Person doesn’t get to move some bits around.
A friend of mine worked in software at NASA. If her apps failed, some astronaut was careening through space 😬
I think it’s definitely the majority. The problem is that a lot of tech developments, new language features, and frameworks pander to this lack of skill, and then those new things become buzzwords that are required at most new jobs.
So many things could be done away with if people would just write decent code in the first place!
maybe not quite a majority
VAST majority. This is 80-90% of devs.
I read somewhere that everyone is bad at their job. When you’re good at your job you get promoted until you stop being good at your job. When you get good again, you get promoted.
I know it’s not exactly true but I like the idea.
They call that the Peter Principle, and there’s at least one Ig Nobel Prize-winning study which found that it’s better to promote people randomly rather than promote based on job performance.
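That study (Pluchino et al., the Ig Nobel one) is fun to play with. Below is a toy Python sketch of the idea, not the paper’s actual model: every constant and distribution here is made up, and the “Peter hypothesis” is simply that competence at the new level is drawn fresh rather than carried over. Under that assumption, promoting your best performers skims the top off the lower level for no gain above:

```python
import random

random.seed(42)

LEVELS, PER_LEVEL, ROUNDS = 4, 20, 200

def new_org():
    # org[level] = competence scores in [0, 1); level 0 is the bottom
    return [[random.random() for _ in range(PER_LEVEL)] for _ in range(LEVELS)]

def simulate(promote):
    """Average competence of the whole org after ROUNDS of promotions."""
    org = new_org()
    for _ in range(ROUNDS):
        for level in range(LEVELS - 1, 0, -1):
            slot = random.randrange(PER_LEVEL)      # a vacancy opens here
            idx = promote(org[level - 1])           # pick someone from below
            org[level - 1].pop(idx)
            # "Peter hypothesis": skill at the new job is drawn afresh,
            # regardless of how good the promotee was before
            org[level][slot] = random.random()
            org[level - 1].append(random.random())  # backfill with a new hire
    return sum(sum(lvl) for lvl in org) / (LEVELS * PER_LEVEL)

promote_best = lambda lvl: max(range(len(lvl)), key=lvl.__getitem__)
promote_random = lambda lvl: random.randrange(len(lvl))

print(f"promote the best:  {simulate(promote_best):.3f}")
print(f"promote at random: {simulate(promote_random):.3f}")
```

With these made-up numbers, random promotion tends to edge out merit-based promotion, which is the counterintuitive direction the study reported.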
I don’t want to get promoted… Once my job isn’t mainly about programming anymore (in a pretty wide sense though), I took a wrong turn in life 😅
deleted by creator
The whole “tech industry” naming is bullshit. There is more technology in, say, the composites used to build an aircraft wing, or in a surgical robot, than in yet another mobile app showing you ads.
The whole tech sector also tends to be overvalued on the stock market. In no world is Apple worth 3 trillion while Coca-Cola or Airbus are worth around 200 billion.
More people own an iPhone than an Airbus plane.
If you want apples to apples: why the hell does Tesla, a company that makes under 2 million vehicles a year, have a market cap of 1.4T, while Toyota, a company that makes 10 million vehicles a year, has a market cap of 233B? No matter how you look at it, Toyota has better numbers in every way, but Tesla is a tech company as far as the market is concerned.
Tesla doesn’t just make cars. Tesla also makes batteries and photovoltaic panels.
I agree that Tesla is wildly overvalued and treated as a tech stock, but electric cars aren’t the only thing Tesla makes.
MOBILE USERS CAN GO FUCK THEMSELVES.
Phew. That felt good.
Only one in this thread willing to talk about the real problems.
A lot of what is sold to consumers is straight up shite.
I think that the AI industry is particularly harmful to writers, journalists, actors, artists, and others. This is not because AI produces better pieces of work, but rather due to misanthropic viewpoints of particularly toxic and powerful individuals at the top of the tech industry hierarchy pushing AI as the next big thing due to their general misunderstanding or outright dislike of the general public.
I’m a writer and my work is increasingly making me use AI to do things. I’m 98% sure I’m just training this thing to replace me at this point, and am planning accordingly.
I really don’t get the use of AI to replace creative roles. At most I’ve used it as a sort of “lorem ipsum” generator for various placeholders. I think AI’s true value is in understanding the sometimes overwhelming amount of documents, records, datasets, and databases that organizations can amass. Having an AI help sift through the garbage is actually really helpful.
I’ve seen governments using it to do things like handle access-to-information type requests or help patent examiners find relevant patents: those uses make a lot of sense.
The Microsh*t Office suite is atrocious, both from a software dev and an ordinary user perspective. Literally any alternative is better: LibreOffice, Google Docs, etc.
Word is bloated, slow, impractical, bad for collaboration, and politically dubious. Teams is buggy, impractical, also politically dubious, and lacks many basic features. At this point, I literally despise Microsoft. Also Windows really seems to be unusable, from the enlightened perspective of a Mac or Linux user (in my case the latter).
systemd is bloated and is stopping Linux from getting faster.
Most mainstream programming languages suck, Rust being the exception.
Alright, I’m done ;)
Edit: any website that breaks because of uBlock Origin medium mode is poorly made and not trustworthy. /endrant
There are two types of languages: the ones people bitch about, and the ones nobody uses.
Thoughts on rust? Is it a good programming language to learn as a beginner?
I love Rust but specifically for a beginner it could be challenging. Have you learned any other languages or would it be your first?
I am a Python beginner, so it would probably be a good idea to get better at Python before moving on to something like Rust. And if Rust is so good, is there any reason to learn another low-level language like C or C++, unless you’re working on an existing project that already uses them?
I have only learned a bit of the Cs, but others have written that learning Rust first is a great way to learn the concepts and best practices of those languages.
A lot of the difficult concepts (pointers, ownership, multi thread concurrency) are shared between all three.
That’s nice to know, thanks.
Fuck no. A beginner learning base concepts like arrays, conditionals, loops, variables, functions, etc. should use something much less punishing like Python. It’s much easier to iterate, to understand your mistakes, and to learn from others when you use a simpler language.
When you’re ready to learn about pointers, memory management, etc. then you can take on Rust.
Yeah, OK, makes sense, thanks!
What sucks the most about Rust is that 90% of Rust jobs are some crypto bullshit. I love the language, but finding normal jobs is near impossible.
At the same time, I could find 20 Go positions, but Go just isn’t exciting. It’s the new Java, IMO: working with it is probably good for job security, but I just don’t see myself working in Go as a main language in the future.
Hardly controversial I would say.
My office forces everyone to use Microsoft (there are a lot of Mac and Windows users), and whenever I complain, people get pissed at me. God knows why.
As for systemd, I think a lot of people think it’s fine and that people like me are exaggerating. I guess that’s fair, but non-systemd distros (Void Linux being my favorite) are so much faster, it’s unbelievable.
And then there are a lot of generic language programmers and business owners who are very willing to defend their income source. Like everyone I know. (I’m really dying here; I gotta find a cool Rust or Lisp company.)
As for uBO, it’s a “progress” thing. If using masses of third parties and trackers makes stuff more innovative (not to mention laggy), then it’s good, they claim.
I’m happy to hear that Lemmy shares my opinion though, that’s a little comforting :)
I use Artix Linux with runit as my daily driver. I’ll admit it’s very nice, but I haven’t run systemd except on my VPSs for years now, so I really don’t know whether it’s slow or not, as my point of reference is long gone.
The systemd take is goofy, but everything regarding Microsoft is spot on. Teams is an eldritch horror.
deleted by creator
Probably setup two entirely separate tenants
lol your admin is a dumbass I wager
Tech workers need to unionize
This is about more than self-interest: self-respecting tech workers would have refused to create our current panopticon-skinnerbox if they weren’t at the mercy of the tech lords. Seniority-based hiring and firing has to be demand number one; number two is layoff recall lists five years long.
Just for context: I work as an engineer, but I consider myself pretty low level. I am completely self-taught, as I sort of flunked out of college and didn’t pay much attention anyway. I’ve just been the sort of person who takes everything apart and tinkers around to figure things out, or reads documentation. So I am not some genius programmer or anything.

What I have noticed over the span of my 40+ years, though, is that the Internet and technology used to be a challenge, but a rewarding one. Things were skewed toward creativity, sharing, community, and knowledge. I remember spending lots of time on forums like Usenet and later bulletin boards of various types. I remember when Wikipedia first became a thing, and it really seemed to me that we were going to get this amazing platform to learn and self-teach just about any subject imaginable.

Then somehow the Internet just became an endless fucking scroll farm. My dumbass uncles and older family members, who used to be content with just eating aerosol cheese while channel surfing, got online and became complete fools. Instead of creativity and debate we just have endless AI slop, morons reacting to videos of nothing, bots, and clickbait. It seems like the industry just loves it, because before they could barely figure out how to make money off this crap, and now they have it figured out: “turn everyone into fucking zombies”.

People at work are at times blown away by my stamina for working through problems, and it’s like, bro, I used to sleep next to my 486 so I could put in disk 20 of 50 to install something, and it would take all friggin’ night. I used to have to find a dude with a catalog so I could get a CPU upgrade or part, because there was no Internet. I used to have to fight for every piece of documentation or software I could get my hands on. Now it’s all right there, and people have decided to watch TikTok instead of being able to do anything on their own.
We screwed the Internet up, and tech has made people lazy, less capable, and focused on instant gratification. It was supposed to make us curious, creative, and engaged. Now with AI we’re like, “hmm, how can I be even lazier?”. I would get it if they used AI to help solve really complex problems, and reserved that compute to assist with the things humans are not good at. Instead, we are using this shit just to circumvent having to think, and as a substitute for community. Why ask a friggin’ bot when all the answers were in forums, where you could interact with people, make friends, and learn? Now I am looking down the barrel of a gun, facing being replaced in the next 5 years or so, going: great, so this thing which was “my thing”, the only damn thing I was ever good at or interested in, is going to be taken away from me because of some lazy-ass people who just want to watch TikTok all day? End rant.
I don’t think it’s really uncomfortable to say but whatever.
There used to be a digital social contract that we were all stewarding a global information database. That was before the era of “influencers” and information arbitrage; in other words, people deriving monetized content from other content. Why would anyone want to do the legwork for some random jerk to take it all for personal gain?
The whole proposition is a negative spiral. The paradigm changed from stewardship to something shit: this scroll-zombie thing, or whatever. We have the few users who are the “creators”; everyone else consumes whatever is fed to them. It has discouraged people from thinking for themselves, and maybe even from adding something to the pot.
One thing I’ve noticed is the git repo snipers. People will camp on forks, looking at your work. If you don’t submit to upstream, then someone else will copy your patch(es) and make a pull request.
Also, more generally, things I do but don’t publish as posts/blogs are liable to be sniped. So I might as well keep them to myself, unless I’m willing to go the full mile and make a big show of staking ownership.
I 1000% agree with this sentiment, and honestly I’m similar (except I’m an engineering student, EE). I was managing fine with all the increased algorithmic feeds and whatnot until COVID hit. I went from immensely internet-literate and techy to depressed and stuck on social media all the time; from waking up to going to sleep, I’ll check Instagram (even though I avoided TikTok for that exact reason). Honestly, I still struggle with this, because social media is more toxic, painful, and mind-destroying than ever before. I hope I can cut this addiction before it’s too late.
This is why I love Lemmy so much. The only people here are people who have realised that mainstream social media is a steaming pile of trash and came here to find a nice community.
I agree with you wholeheartedly. Before I fully started using Lemmy maybe a month ago, Instagram and my former safe haven Reddit had become absolutely toxic. I’d go from one to the other, and it was always people being blown up, and comments saying the most racist shit ever. People being racist, misogynistic, and other BS all the time (which can be funny in certain circumstances, but not these). So far I love the Lemmy community; I’ve had some actual thought-provoking conversations here that I’ve not had since early Reddit. I pray Lemmy never changes too much in that regard.
I would argue that people you are describing enjoying Tik Tok and being too lazy to look stuff up themselves, are not really engineers. You could also frame it as, information technology got so mainstream, even people with no technical background whatsoever are part of it. Engineers and tech nerds still exist, but they are a minority now.
Engineers aren’t the only people who should want to learn and look stuff up, right? Everyone should want to learn. And TikTok is just not the way to do it at all. I think it’s wild how many people “learn” from platforms like TikTok and repeat it to others like it’s fact, without doing any extra research.
I’m personally very conflicted between my love of computers and the seeming necessity of conflict minerals in their construction. How much coltan is dug up every year just to be shoved into an IoT device whose company will be defunct in six months, effectively bricking the thing? Even if the mining practices were made humane, they wouldn’t be sustainable. My coworkers are very cool for tech workers. Vague anticapitalist sentiments. Hate Elon. But I don’t think they’re ready for this conversation.
How much coltan is dug up every year just to be shoved into an IoT device whose company will be defunct in six months, effectively bricking the thing?
Man, there’s a lot of this. But what really gets me going is electronics that are actually made to be disposable. Motherfuckers hitting a vape with a little LCD screen then littering it. No hope.
Companies don’t know how to interview. I don’t need someone to walk me through a sorting algorithm; I need someone who will be responsive and interested in the problems we actually face.
Also, any number of interviews that is more than one is too many interviews.
Not sure I agree with that. I mean, OK, I recently had three interviews for a company where each interviewer asked me almost the same questions. That was clearly a waste.
At my place, we do a 30-minute introductory call with the boss first, to quickly weed out unfit candidates and not waste employee and interviewee time on interviews. If that goes OK, then there are three interviews of 45-60 minutes: one with the product owner, focused on soft skills and team fit; one with the team you’re applying to; and one with the other team (like frontend or backend), covering more technical things and also just whether you’d like to work with this person.
No amount of interviewing will ever guarantee that things work out, and unfit people can slip through the cracks. And I hate wasting time in tons of interviews. But I’d also not want to work at a place where I know my coworkers were hired after just one hour of quick chatting. That’s so little time to get an idea of a person, to spot any red flags. Heck, the ‘tell me a bit about yourself’ section of an interview is already 15 minutes, and not usually very helpful.
Ok, but can you just whiteboard code me a Fibonacci sequence function.
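Sure, but only because you asked nicely. A quick Python sketch, iterative so the interviewer can’t complain about exponential recursion:

```python
def fib(n: int) -> int:
    """Return the n-th Fibonacci number (fib(0) == 0, fib(1) == 1)."""
    if n < 0:
        raise ValueError("n must be non-negative")
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

print([fib(n) for n in range(10)])  # [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]
```

Do I get the job, or do we move on to inverting a binary tree?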