this post was submitted on 25 Jan 2024
329 points (97.1% liked)

Asklemmy


Every day there are more big job cuts at tech and games companies. I've not seen anything explaining why they all seem to happen at once like this. Is it coincidence, or is there something driving all the job cuts?

[–] Lauchs@lemmy.world 145 points 11 months ago (5 children)

A few things happened pretty quickly.

During the pandemic, tech profits soared which led to massive hiring sprees. For all the press about layoffs at the big guys, I think most still have more workers than they did pre-pandemic.

Interest rates soared. Before the pandemic, interest rates were ludicrously low; in other words, it cost almost nothing to borrow money. This made it easier to spend on long-term or unclear projects where the hope seemed to be "get enough users, then you can monetize." Once interest rates rose, those became incredibly expensive projects, so funding is now much more scarce. Companies are pulling back on bigger projects or, like Reddit, trying to monetize them faster. Startups are also finding it harder, so fewer jobs.

And of course, AI. No one is quite sure how much that'll change the game, but some folks think most programmers will be replaceable, or at least that one programmer will be able to do the work of several. So rather than hire now and go through everything severance etc. might entail later, I think a lot of companies are taking a wait-and-see approach and thus not hiring.

[–] Badabinski@kbin.social 113 points 11 months ago* (last edited 11 months ago) (5 children)

I want to offer my perspective on the AI thing from the point of view of a senior individual contributor at a larger company. Management loves the idea, but any company that tries to lean on LLMs instead of good devs will end up with a lot of developers fixing auto-generated code full of bad practices and mysterious bugs. A large language model has no concept of good or bad, and it has no logic. It'll happily generate string-templated SQL queries that are ripe for SQL injection; I've had to fix this myself. Things get even worse when you have to deal with a shit language like Bash that is absolutely full of God-awful footguns. Sometimes you have to use that wretched piece of trash language, and the scripts generated are horrific. Remember that time when Steam on Linux was effectively running `rm -rf /*` on people's systems? I've had to fix that same type of issue multiple times at my workplace.
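To make the SQL-injection point concrete, here's a minimal sketch (hypothetical table and input, using Python's built-in sqlite3) of the string-templated pattern an LLM will happily emit, next to the parameterized form it should have written:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

# Attacker-controlled input that is obviously not a real username.
user_input = "' OR '1'='1"

# Injection-prone: a string-templated query. The input closes the
# string literal, and the OR clause then matches every row.
unsafe = f"SELECT role FROM users WHERE name = '{user_input}'"
print(conn.execute(unsafe).fetchall())  # [('admin',)] -- leaked despite a bogus name

# The fix: a parameterized query treats the input purely as data.
safe = "SELECT role FROM users WHERE name = ?"
print(conn.execute(safe, (user_input,)).fetchall())  # []
```

The templated version "works" in every happy-path test, which is exactly why it slips past a reviewer who trusts the generated code.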

I think LLMs will genuinely transform parts of the software industry, but I absolutely do not think they're going to stand in for competent developers in the near future. Maybe they can help junior developers who don't have a good grasp on syntax and patterns and such. I've personally felt no need to use them, since I spend about 95% of my time on architecture, testing, and documentation.

Now, do the higher-ups think the way that I do? Absolutely not. I've had senior management ask me about how I'm using AI tooling, and they always seem so disappointed when I explain why I personally don't feel the need for it and what I feel its weaknesses are. Bossman sees it as a way to magically multiply IC efficiency for nothing, so I absolutely agree that it's likely playing a part in at least some of these layoffs.

[–] bobs_monkey@lemm.ee 42 points 11 months ago

So basically, once again, management has no concept of the work and processes involved in creating/improving [thing], but still wants to throw in the latest and greatest [buzzword/tech-of-the-day], and is then flabbergasted when the devs/engineers/people who actually do the work tell them it's a bad idea.

[–] treadful@lemmy.zip 15 points 11 months ago (1 children)

I'm pretty excited about LLMs being force multipliers in our industry. GitHub's Copilot has been pretty useful (at times). If I'm writing a little utility function, I can basically just write out the function signature and it'll fill out the meat. It often makes little mistakes, but I just need to follow up with little tweaks and tests (which it'll also often write).
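That workflow, sketched with a hypothetical utility function: the signature and docstring are what you'd type, and the body is the kind of completion the assistant fills in, which still needs a human eye and a quick test.

```python
from typing import Iterator, List, TypeVar

T = TypeVar("T")

# You write only the signature and docstring...
def chunk(items: List[T], size: int) -> Iterator[List[T]]:
    """Yield consecutive chunks of `items` of length `size` (last may be shorter)."""
    # ...and the assistant fills in a body like this. Edge cases
    # (here, size <= 0) are exactly where the "little mistakes" hide,
    # so you verify before accepting.
    if size <= 0:
        raise ValueError("size must be positive")
    for i in range(0, len(items), size):
        yield items[i:i + size]

print(list(chunk([1, 2, 3, 4, 5], 2)))  # [[1, 2], [3, 4], [5]]
```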

It also somehow seems to pick up the context of my overall work at the time and occasionally infers what I'll do next, to my astonishment.

It's absolutely not replacing me any time soon, but it sure can be helpful in saving me time and hassle.

[–] conditional_soup@lemm.ee 10 points 11 months ago (1 children)

Those little mistakes drove me nuts. By the end of my second day with copilot, I felt exhausted from looking at bad suggestions and then second guessing whether I was the idiot or copilot was. I just can't. I'll use ChatGPT for working through broad issues, catching arcane errors, explaining uncommented code, etc. but the only LLM whose code output doesn't generally create a time cost for me is Cody.

[–] childOfMagenta@lemm.ee 1 points 11 months ago

If you tried Copilot back at the beginning, it's improved a lot since then; it's now using GPT-4.

[–] colonial@lemmy.world 13 points 11 months ago (1 children)

> A large language model has no concept of good or bad, and it has no logic.

Tragically, this seems to be the minority viewpoint - at least among CS students. A lot of my peers seem to have convinced themselves that the hallucination machines are intelligent... even when it vomits unsound garbage into their lap.

This is made worse by the fact that most of our work is simple and/or derivative enough for $MODEL to usually give the right answer, which reinforces the majority "thinking machine" viewpoint - while in reality, generating an implementation of `&` using only `~` and `|` is hardly an Earth-shattering accomplishment.
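For reference, the exercise mentioned is a one-line application of De Morgan's laws; a quick Python sketch:

```python
# De Morgan: a AND b == NOT(NOT a OR NOT b). Python ints are
# arbitrary-precision two's complement, so bitwise ~ and | satisfy
# the identity directly (no mask needed as with fixed-width words).
def and_from_not_or(a: int, b: int) -> int:
    """Compute a & b using only ~ (NOT) and | (OR)."""
    return ~(~a | ~b)

print(and_from_not_or(0b1100, 0b1010))  # 8, i.e. 0b1000
```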

And yes, it screws them academically. It doesn't take a genius to connect the dots when the professor who encourages Copilot use has a sub-50% test average.

[–] pkill@programming.dev 1 points 11 months ago

In my experience, Copilot for Neovim is pretty useful if you:

  1. Split the current window if you have anything like type declarations in a separate file
  2. Write pretty verbose documentation, e.g. using Swagger.

If you expect it to whip up out of thin air exactly what you need without correcting it in several places, learn to code without it first.

[–] shasta@lemm.ee 5 points 11 months ago (2 children)

To add to this, at my company, we've received a mandate to avoid putting any code into an AI prompt because of privacy concerns. So effectively no one is using it here.

[–] Rentlar@lemmy.ca 4 points 11 months ago

Yep, as far as most companies should be concerned, using something like Copilot means giving Microsoft a free license to all the trade secrets and code you put into it.

[–] RecallMadness@lemmy.nz 1 points 11 months ago* (last edited 11 months ago)

We had the same. And you would have thought for a heavily regulated industry we’d keep it that way.

But no, some executive wonk from Microsoft flew over, gave our c-suite a β€œit’s safe, promise” chat over champagne and lobster, and now we’re happily using copilot.

[–] PatMustard@feddit.uk 2 points 11 months ago

> a shit language like Bash

There's your mistake: treating Bash like a language and not like a scripting tool. Its strength is that it's a common standard available on almost every machine because it's older than most of us; its weakness is that it's full of horribly outdated syntax because it's older than most of us. Used to script other processes, it's great, but when you start using it as a language, the number of ways you can do something horrible that sort of works makes JavaScript look slick!

[–] OpenStars@startrek.website 8 points 11 months ago (1 children)
[–] anarchost@lemm.ee 12 points 11 months ago (2 children)

I'm here to repeal and replace good things, and I'm all out of "replace".

[–] OpenStars@startrek.website 10 points 11 months ago

OMG I luv this:-) So, in your honor:

[–] Donjuanme@lemmy.world 5 points 11 months ago (3 children)

Let's not throw the baby out with the bathwater. AI has the potential to alleviate a lot of society's pressures and to free up much of the time we spend on tedious, mindless tasks. We just need to make sure it's used for the benefit of the many rather than the profit of the few. I don't want a union that wants to keep labor busy and well compensated; I want a union that keeps people safe, happy, and compensated properly.

[–] anarchost@lemm.ee 20 points 11 months ago (1 children)

We're like a century past the point where innovation should have turned our 40-hour work week into a 20-hour one.

[–] rwhitisissle@lemmy.ml 9 points 11 months ago (1 children)

I fully believe we'll get a standardized 60-hour work week before we get a 20-hour one. Hell, I'm pretty sure we'd relegalize slavery before we get a 20-hour work week. Your average American will bend over backwards for a chance to please "the boss" and actively rat on their colleagues for avoiding work, because our cultural understanding of loyalty is functionally equivalent to bootlicking.

[–] jonne@infosec.pub 6 points 11 months ago

Yeah, except there's no way the owners would give up any of the profits for the betterment of society. Every technological improvement since the industrial revolution made productivity skyrocket, and yet the capitalists made sure working people were still hovering just above destitution. The only reason some of us have it better is because unions fought them, and that includes Luddites that would destroy the means of production.

[–] ininewcrow@lemmy.ca 2 points 11 months ago

A lot of technology problems, including how to utilize AI for the betterment of humanity, could be dealt with easily if we just removed a large chunk of the bloated administrative, management, and ownership hogs at the top who contribute nothing, stall everything, and constantly sabotage development with their politics, infighting, and warring with competitors. If you remove the profit factor, corporate greed, and economic shortsightedness from these situations, a lot of problems can be dealt with fairly easily and fairly quickly.

Unfortunately, we are greedy monkeys who want to rule the world and once you give power to one monkey or a small group of monkeys, they immediately try to overpower all the other monkeys and rule the jungle.

[–] sunbrrnslapper@lemmy.world 8 points 11 months ago

I completely agree, although I think AI is more likely to impact marketing, communications, PR, creative, and PM-type roles (and there are a lot of those in tech companies). I suspect we will see a noticeable reduction in tech workers over the next decade.

[–] tal@lemmy.today 2 points 11 months ago

> Interest rates soared. Before the pandemic, interest rates were ludicrously low; in other words, it cost almost nothing to borrow money. This made it easier to spend on long-term or unclear projects where the hope seemed to be "get enough users, then you can monetize." Once interest rates rose, those became incredibly expensive projects, so funding is now much more scarce. Companies are pulling back on bigger projects or, like Reddit, trying to monetize them faster. Startups are also finding it harder, so fewer jobs.

Note that this also impacted other projects that take a lot of capital up front, then provide a return over a very long term. There was a nuclear power plant project with NuScale in Utah that got shelved over this; with interest rates suddenly going from way low to way high, the economics get upended.

I'd bet that in general, infrastructure spending dropped across the board.