- cross-posted to:
- economics@lemmy.world
So they got all that money from Uncle Sam’s CHIPS Act only to lay off 10,000 employees and make themselves “lean”. Govt funded unemployment.
Two generations of bad CPUs, and their solution is to get rid of the workers so they can keep their bonuses.
MBA brain rot
Gotta keep that angle at 45° forever and ever.
Two words: bean counters.
I wish them a brain drain for being so greedy. Let their best people leave for greener pastures.
Part of the lackluster CPU problem is that Intel was pissing away their money on other adventures. CPUs were “in the bag”, so they kept spending money on other stuff to try to “create new markets”. Any casual observer knew their fundamental problem was simple: they got screwed on fabrication tech. Then they got screwed again as a lot of the heavy lifting went to the ‘GPU’ half of the world, and they were the only ones with zero high-performance GPU product/credibility. But instead they went in very different directions with their investments…
For example they did a lot to try to make Optane DIMMs happen, up to and including funding a bunch of evangelism to tell people they’ll need to rewrite their software to use entirely new methods of accessing data to make Optane DIMMs actually do any better than NAND+RAM. They had a problem where if it were treated like a disk, it was a little faster, but not really, and if it were used like RAM it was WAY slower, so they had this vision of a whole new third set of data access APIs… The instant they realized they needed the entire software industry to fundamentally change data access from how they’ve been doing it for decades for a product to work should have been the signal to kill it off, but they persisted.
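To make that “third set of data access APIs” concrete: the model Intel evangelized (e.g. via PMDK) was neither read()/write() syscalls like a disk nor plain volatile loads/stores like RAM, but memory-mapping the persistent media directly and flushing stores explicitly. Here is a minimal sketch of that access pattern, assuming a regular file as a stand-in for a real DAX-mapped Optane device (the file path and sizes are illustrative, and `flush()` stands in for a real `pmem_persist()`/CLWB):

```python
import mmap
import os
import tempfile

# Stand-in for a DAX-mapped persistent-memory region: a plain file.
# On real Optane DIMMs this mapping would be byte-addressable media,
# not page cache.
path = os.path.join(tempfile.mkdtemp(), "pmem_standin")
with open(path, "wb") as f:
    f.truncate(4096)  # size the backing "device"

with open(path, "r+b") as f:
    m = mmap.mmap(f.fileno(), 4096)
    m[0:5] = b"hello"  # store directly into the mapping; no write() syscall
    m.flush()          # explicit persistence point (pmem_persist() analogue)
    m.close()

with open(path, "rb") as f:
    print(f.read(5))   # the store was made durable
```

The pain point the comment describes is visible even in this toy: every piece of software wanting Optane's advertised speed had to restructure its durability logic around load/store plus explicit flush barriers, rather than reusing decades of file-I/O code.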
See also their adventures in weird PCIe interconnects no one asked for (notably they liked to show a single NVMe drive being moved between servers, which cost way more than just giving each server another NVMe drive and moving data over a traditional fabric). Embedding FPGAs into CPUs when they didn’t have the thermal budget to do so and no advantage over a discrete FPGA. Just a whole bunch of random-ass hardware and software projects with no connection to business results, regardless of how good or bad they were. Intel has a bad habit of “build it, and they will come”.