On the Dwarkesh podcast with SemiAnalysis's Dylan Patel, they forecast that the phone market will shrink by 50% this year because of RAM prices:
But that’s the high end of the market, which is only a few hundred million phones a year. Apple sells two or three hundred million phones annually. The bulk of the market is mid-range and low-end. It used to be that 1.4 billion smartphones were sold a year. Now we’re at about 1.1 billion. Our projections are that we might drop to 800 million this year, and down to 500 or 600 million next year.
We look at data points out of China from some of our analysts in Asia, Singapore, Hong Kong, and Taiwan. They’ve been tracking this, and they see Xiaomi and Oppo cutting low-end and mid-range smartphone volumes by half.
Yes, it’s only a $150 BOM increase on a $1,000 iPhone, where Apple has a larger margin. But for cheaper phones, the percentage of the BOM that goes to memory and storage is much larger. And the margins are lower, so there's less room to absorb the increase. And they have also generally tended not to do long-term agreements on memory.
Why this is a big deal: if smartphone volumes halve, that drop will happen in the low and mid-range, not the high end.
This is an extinction event for the low-cost cell phone companies. How are they going to survive if they can't sell their $100 phones profitably for 2 years? I think many of the low-end companies will simply sell their allocations of RAM and close up shop.
This is my greatest concern. So many small players will be wiped out. Consolidation is assured. Always great for consumers to be under the thumb of increasingly large companies.
There are a bunch of subbrands but there are also a lot of genuine small Android phone companies, especially in China.
Some of these serve some interesting niches that might now disappear due to this DRAM supply issue, e.g. Unihertz for extra small phones or CAT for extra durable worksite phones.
Will it be such a big deal though? Currently people are swapping out their phones for another model that's exactly the same but with a different number at the end of the name every 12 months. This could just mean that the unnecessary churn dies down a bit, and companies taking advantage of it have to find a new line of business.
> Will it be such a big deal though? Currently people are swapping out their phones for another model that's exactly the same but with a different number at the end of the name every 12 months.
I don't think they do that at the low-end (nor the high-end, though that doesn't matter here - higher-end manufacturers have a small margin they can eat into). People on the low-end phones want a new phone, they just cannot afford it!
Even in the mid-range: if you buy a phone you find decent and affordable, and you're not out chasing the latest gimmick, there is no reason it wouldn't easily last you 6 or 8 years before applications start assuming the presence of better hardware, or a newer Android version than you have, etc. Naturally you will have to protect it from physical damage, and maybe replace the battery at some point.
Because the phones stop working well? I write part of a post, open another tab to go look up some information, come back to the post and what I've written is gone, because the memory got dumped. That's the reality of using an old cheap phone.
And would you consider yourself representative of the phone-buying public in general?
My desktop PC is from 2008 but I'd never consider this to represent anything like common usage. In fact it's so unusual that I get to point it out in posts like this.
This comment makes no sense to me. I exclusively use very low-end phones from Xiaomi. I buy a new one roughly every two years. Each new phone has a better screen, camera, CPU/GPU, charging, and sometimes more RAM/storage.
But it had 4k 60fps video, optical image stabilisation, a "super retina display" etc five generations ago. The specs have kept improving, but it's not a quantum leap in performance.
The same applies at the low end; the grandparent comment even agrees.
You buy a new phone every two years, it comes with a camera, a cpu, a gpu, a host of sensors. Same as phones did two years ago, and ten years before that.
I don't use my current smartphone in any way that differs from the iMate PDA2K I had twenty years ago.
How often does your browser freeze up when you open a webpage? How often does your phone browser dump its memory when you switch to another tab and then switch back? E.g. if you were writing a post and opened another tab to go check some fact, the post in the original tab gets deleted.
Because that's what happens if you use an old cheap phone in the modern day.
I even had a phone that would occasionally just crash when on a heavy website and the onscreen keyboard popped up. That was not at all infuriating!!! Especially when it would crash when I try to refine a Google search.
Your comment also makes no sense to me. I have used exclusively very low-end Xiaomi phones for the past 6 years, and I only replace one when it's dead or can't run my apps (I'm afraid my current one won't last 2 more years). Before that I kept my first smartphone (an iPhone 3GS) for 10.
Forget developing countries, iPhone is a luxury even in some European countries, when rent is 500+ Euros and your take home pay is ~1000. After all the other bills you're not left with iPhone money, which is why 100-200 Euro models of Chinese brands are doing so well.
It's easier to name the countries where iPhone ISN'T a luxury, as you can count them on very few hands.
Many countries would develop much faster if they weren't bombed or kept down by puppet dictators from (economically) developed nations (the USA and France keep doing this intensively, while countries like Germany don't mind supporting fascist states). (PS: I'm not woke, not even Marxist.)
Thanks a lot, Sam Altman / OpenAI. Their little $100bn war chest being used for obstructive / destructive purposes will wipe out multiples of that amount via economic ripple effects. All in an attempt to keep a stranglehold over AI via competitive resource starvation. Basic.
> This is an extinction event for the low-cost cell phone companies. How are they going to survive if they can't sell their $100 phones profitably for 2 years?
This is a great thing to happen, actually. Those phones are all essentially trash that ends up in a landfill within a year or so. They should not exist at all.
Smartphones are widely available on the used goods market though, perhaps even more so than second-hand SBCs or old PCs. The "low and mid range" can be filled by the former high end.
My Samsung Galaxy S3 died after 8 years. EMMC failure. Just started boot looping while I was asleep. Everything gone. Known issue.
My Samsung Galaxy S8 died at 7 years. Some kind of thermal failure, I was able to recover my data by keeping the phone in the freezer while I copied. Known issue.
My Samsung Galaxy S21? I figure I've got another year or two in it before it, too, dies.
Having beautiful dead phones that have never had a broken screen or a hard drop is pretty depressing.
It can also depend on the hardware it's connected to. If the endless gigabytes of Samsung's value-add software are scribbling to eMMC nonstop then it's not surprising the flash is wearing out. A lot of this stuff is masked by the fact that most people swap out their phone for a new one that's exactly the same every 12 months so they never notice this, but if you hold onto a phone or similar device for longer the unnecessary wear starts to add up.
Google Android should get more praise for doing quality control by analyzing and killing apps and processes that attacked the hardware - at least back in the day.
The great filter for incompetence by the big G was real and necessary.
Yeah, the flash has a wear lifetime. The battery has a finite lifespan too. Anything over five years is pretty good going. My wife managed that with a Nokia 1020, the last and best of the Windows phones.
Like everything else, phones need to be backed up.
I just replaced my OnePlus 5 a couple of months ago at over 8.5 years old. No repairs needed; the battery was a bit crippled in active use, especially when making calls, but fine for a mostly idle phone. On standby it still lasts longer than a 1.5-year-old iPhone 15. I still use it for my backup phone number's SIM as it slowly approaches 9 years old.
The bigger issue was no more OS updates since 2020, and no Play updates since 2023. The battery can be replaced but getting a fully updated OS is more involved.
OnePlus 5 runs great with custom ROMs, including potentially ones based on mainline Linux as opposed to AOSP. (The Linux support is not as good as OP 6/6T but getting there pretty nicely.)
Too bad they have these long lists of "this doesn't work so well" and I'm too time constrained to troubleshoot for too long or dig for solutions. And I'd also need to replace the battery. It's an option for when I actually have some time.
The device integrity is a bigger deal; this is also a backup for some banking apps, so if they don't work it kind of defeats the purpose. I removed all other apps to minimize the attack surface.
If you're using it as backup for banking apps and the like I totally get not running a custom ROM on it! But you could also set that backup on something even cheaper, any one of the random not-bootloader-unlockable brands, and be left with the OP5 as a Linux phone. You're also right that the Linux support on OP5 is not up to standard yet, this is more of a question for the future if that support improves.
What's cheaper than an already existing phone that would otherwise stay unused or end up in the landfill (recycling center)? It could also be a great experimentation platform, play with Linux on the phone, but the time I have available now leaves little room for this kind of play.
The goal is not "experimentation" but having it eventually as an always up-to-date daily driver once the support for it matures. You're quite right that we're still a bit far from that, though.
If you're not a phone power user, you can get by on old low end stuff. When my pixel 4a died of a bad screen crack a couple years ago, I replaced it with a random used 4a on ebay for $80. Two years later and it's still completely fine for all my purposes (texting, phone calls, chrome browsing, tolerable camera, etc.), although I still haven't accepted google's deal for a free battery swap yet from sheer laziness. I've learned that I can accept a 90 minute screen-on phone battery, though it's an odd adjustment to make. Again, not a power user.
The free battery deal ended in January, but you're likely better off: mine ended up getting a damaged screen while being transported for the mail-in (because all local stores stopped doing the program), and they wanted to charge me an extortionate price to fix the screen. Support were useless.
>Smartphones are widely available on the used goods market though, perhaps even more so than second-hand SBCs or old PCs. The "low and mid range" can be filled by the former high end.
When new cars got more expensive, used cars got more expensive, too. I expect the same to happen with the phones.
When it becomes clear that the insanely expensive AI data center orders are not going to be filled, we can expect a huge reversal in the price of RAM and GPUs. There are 241 GW of orders in the pipeline, but only a third of that is under construction, and of that third, even less is being quickly finished and brought online. It's estimated that just 3 GW came online last year.
Don't count on it. There's a lot of money in killing other businesses, or even just keeping prices high. Even if the high prices are an accident, there is always someone looking to take advantage of any situation for profit.
I have to agree. You only have to look at car and junk food inflation after COVID.
The prices make no sense, but that doesn't matter, they got away with it and are fighting to hold onto high prices, even as consumers balk. Their solution? Ditch poorer consumers. New cars and (branded) junk foods are luxury items now, apparently.
The only two changes that matter to me are: no more iPhone Mini and no more hardware switch to mute the phone. Instead they got a new "thinnest iPhone ever" that's actually thicker than my 4+ year old one when measured honestly.
I had an iPhone 11. It was a good phone. It started giving up in early 2024. I held on with poor battery life until the new iPhones that year and bought the 16 Plus. I'm actually glad, because they've since discontinued the Plus models, annoyingly.
But I'm glad I don't need to upgrade for the next couple of years. I honestly want to get 4-5 years out of any phone going forward. There's basically no difference between models 12 months apart.
I can see the prices going up this year. It's already happened to the PS5, which is basically unheard of.
It really sucks more because the reason for it--AI--is just so godawful and pointless.
I bought my wife an iPhone 11 Pro Max in 2020, and, knock on wood, outside of replacing the battery it has been going on like a champ.
I've offered to buy her a replacement phone but at this point I think she's kind of curious as to how much life she can get out of it.
I have an iPhone 13 Pro Max; I bought it in 2023 but it was a refurb so I don't actually know how old it actually is. Regardless, it's still going strong, and I am hoping it can last through whatever RAM crunch is going on.
>But I'm glad I don't need to upgrade for the next couple of years.
Said the user who didn't learn the lesson.
With Apple, you do not own anything. If Apple wants to release an update next month that makes your current phone useless, there is nothing you can do to prevent it.
Apple was caught hacking battery levels, hacking users' GPS signals, etc.
> AI is the best thing I've seen in 30 years working in software and expensive RAM for 2 years is a price I think is worth it.
I think generative AI is pretty neat, but I'm not sure it's worth the RAM increases. I use Claude like everyone else does, and it's cool, but I am a little concerned at how much absolute low-effort crap is being produced with it.
It has made YouTube considerably worse; there was already a lot of low-effort shit flooding it, but now it's almost cartoonish. A lot of the videos that I'm recommended will have thousands of views, and give kind of a facsimile of a video with "effort", only for me to realize about a quarter of the way through a bunch of AI tropes in the writing and/or the visuals. It has made the already-mediocre experience of YouTube actively bad.
I am also not convinced that the prices will go down after two years. We already have big memory vendors completely leaving the consumer market, and we have these AI companies buying literal years worth of entire production lines of RAM chips.
This is something that could be solved by competitors jumping in to fill the niche, but it takes a lot of time to build new factories for this stuff, I think more than two years.
And if the AI bubble is still going then with datacenter construction going as planned, the RAM shortage will be even more extreme than today despite higher production capacities.
I would not call any Apple phone even mid-range, and certainly not low-end?
This is coming from an Android user who recently bought iPhones for my daughters. Paying $600 for not-even-top-of-the-line phones does not scream low-end to me.
My company-provided work phone is a base model iPhone, I'd definitely put it in a performance class lower than a flagship from any brand. Certainly not low-end, but I think mid-range would be a fair characterization.
Or optimize the OS, because I still find 8GB insane for everyday tasks. OK, gaming I can understand, but most common tasks should be runnable with at most 2GB of memory, and that is mostly for browsers.
It's going to be interesting when the big AI bubble, via pricing, directly attacks the government by preventing the sale of surveillance devices. Then the bubble will pop - not because it cannot sustain itself, but because its existence is adversarial to government demands for surveillance.
Just checked my Amazon history, and in late 2020 I bought two Raspberry Pi 4s with 4GB memory for ¥6,500 JPY (~$62 USD) each. At the time, they were in somewhat short supply and I paid a little over the $55 list price from a reseller on Amazon.
It looks like the current price on Amazon for the Raspberry Pi 4 4GB is ¥18,800 (~$117 at current rates), which is indeed expensive AF. Oddly, the Raspberry Pi 5 4GB is priced about the same, at ¥18,950 (~$119).
Considering inflation and the speed increases over the 4, the Raspberry Pi 5 price doesn't seem too unreasonable to me. But having the price go up well over ¥10,000 definitely takes it out of the realm of impulse buy and more into something I would only buy if I had a specific and urgent need. So I can definitely see this killing off a good chunk of the hobbyist market.
As it stands, my two older Pis are currently sitting unused in a closet, so I would definitely try to use those before buying anything new.
My big regret at the moment is not buying a 4TB M.2 SSD last year when prices were dipping below ¥30,000. Now they have more than doubled to ¥65,000 or more. I had one in my cart, but decided not to buy it with the rationale that "well, I still don't need the space right now, and the price per TB will probably come down even further by the time I do need it". That is, after all, the way that prices on computer components have worked for most of my life.
I bought a pair of 4 TB SSDs for like $300-350/ea two years ago. I don’t remember exactly.
Around Christmas I tried to order one more. They wanted well above MSRP, like $500. Given the price of everything else I decided to just bite the bullet and do it.
After about a month they canceled my order. Whether that’s because they didn’t actually have one and couldn’t get one, or because they just wanted to wait for the prices to go up further I don’t know.
I went looking again two weeks ago. The exact same drive is back in stock. MSRP is now $1000. Amazon has it “on sale” for $900. Other retailers that often have slightly higher prices are asking $1250.
I built a new desktop in 2023 and repurposed my old desktop for my daughter. The old desktop had a couple of smaller SSDs so I swapped them out for a 2TB Samsung SSD. Paid $99 on Amazon.
The exact same SSD is $479 on Amazon today. It's not a fancy super fast NVMe. It's a slow SATA drive. I have no idea why anyone would even consider building a PC with prices this inflated.
> I have no idea why anyone would even consider building a PC with prices this inflated.
I did recently, specifically targeting lower capacities for the components that have been increasing (RAM and storage).
It didn’t seem like prices would be going down for a while and I didn’t have a desktop pc otherwise, so just went for it. We’ll see how it all plays out but I don’t think it was a terrible decision, as long as prices stay high for a couple years it still makes sense to just suffer through the increases
Those will dry up soon enough. Corporate laptop refreshes will be drawn out as companies try to save costs in the face of the increased prices.
You'd also better hope the AliExpress sellers don't figure out a way to get the RAM out of those things, because they will start harvesting it for sure if there is money to be made.
We're talking about a Pi replacement. The Pi 5 is slower than a 10-year-old laptop. That gives us a very vast pool of used laptops.
> You'd also better hope the AliExpress sellers don't figure out a way to get the RAM
That is a real worry and I can see used machines being gutted because selling DDR3/4/5 sticks is way easier and profitable than the whole machine. Adapters for SODIMM to regular DIMM are readily available and cheap, too.
And worse, they're shucking surplus for RAM and SSDs now. I am seeing more and more eBay auctions for surplus PCs sans SSD and RAM. So the second-hand market is going to be invaded by the reseller parasites, leaving us with $50 CPU-in-a-box and $500+ RAM/SSD parts.
I recently did an install of Windows 11 on a machine without a TPM.
To bypass the check during installation:
1. Boot the laptop from your USB.
2. When you see the "This PC can't run Windows 11" screen, press Shift + F10 to open a command prompt.
3. Type regedit and hit Enter.
4. Navigate to HKEY_LOCAL_MACHINE\SYSTEM\Setup.
5. Right-click Setup and create a new key named LabConfig.
6. Inside LabConfig, create two DWORD (32-bit) values: BypassTPMCheck = 1 and BypassSecureBootCheck = 1.
7. Close the registry editor and the command prompt; the installer will now let you proceed.
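If you'd rather not click through regedit, the same keys can be set straight from that Shift + F10 command prompt. A minimal sketch using the standard reg add syntax; I haven't verified it against every installer build:

    reg add HKLM\SYSTEM\Setup\LabConfig /v BypassTPMCheck /t REG_DWORD /d 1 /f
    reg add HKLM\SYSTEM\Setup\LabConfig /v BypassSecureBootCheck /t REG_DWORD /d 1 /f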
It's a never-ending cat-and-mouse game, and unsupported hacks like these usually aren't well-received in corporate environments. Decent stop-gap for home use, though!
The Pi isn't a loss leader for user acquisition, nor does its maker enjoy Apple's economies of scale. Apple can take a small loss on this and it will still be worth it if they retain the users in their ecosystem.
Is there any evidence that's the case? They have always had massively bigger margins than all other PC manufacturers, so it's unlikely they are selling at a loss even if the margin is significantly reduced.
For most older laptops it's easy enough, you just open them up and take the RAM sticks out. There are SO-DIMM to DIMM adapters to fit a laptop memory stick in a DIMM socket.
Helium supply issues are only going to make this worse.
I feel like for the first time in our lives we might have seen peak technology for the next few years. Everyone is going to have to make do instead of depending on ever increasing performance.
> Helium supply issues are only going to make this worse.
I believe helium, although important, constitutes a small percentage of the cost of semiconductors, so its effect on prices will be less severe. It will be more noticeable in other uses of helium, though - party balloons could get very expensive, etc.
A hospital isn't going to shut down because their MRI's new helium load is getting more expensive - they'll pay a fortune for it. For a lot of other applications there are no suitable alternatives either.
The real question then becomes: what's going to happen when there's a 1000x price increase?
Reminds me of a demo my college physics professor did in our first class (presumably to get our attention).
He had two floating balloons, one about twice as big as the other. He pointed a blowtorch at the smaller one and it (of course) popped.
"That one was filled with helium. Now, there's only one gas less dense than helium..." and right as I thought to myself "he's not gonna do what I think he's gonna do", he pointed the blowtorch at the other balloon which exploded into a much larger (and much louder) fireball.
The problem is not that it's finite, the problem is that by the time prices rise enough to discourage people from using it frivolously, you might already be dangerously low on it.
This is a really interesting question. Is it? My intuition would say no since you have no inherent duty to protect or help others. I have no clue though.
The helium that goes into balloons is mostly a byproduct of industrial grade helium production that would otherwise just go to waste. It's not pure enough for industrial uses.
You could always purify it, it's just uneconomic to do so at a smaller scale. But if the price rises enough, that will change and no one will be using helium for party balloons.
This one might last longer. The AI race is on, and the US tries its best to make it as expensive for China as possible to participate in it. Every dollar China spends on GPUs they get at markup is one not spent on building navy ships.
If there is an escalation over Taiwan, then that will cause the loss of most of the world's high grade chip manufacturing capacity. TSMC is busy doing technology transfers into the US, but it is going to take time, those fabs won't have capacity for the whole world, and they still heavily depend on Taiwan based engineers if something goes wrong etc.
Just like with COVID you don't know how long this shortage will last.
It will be incredibly hard for China to conquer Taiwan. One hundred kilometers across the strait introduces a brutal geographic hurdle. If anything, the fabs will probably be severely damaged in the war. Plus most senior execs and elite engineers would be moved to US offices in Arizona.
We are going to have that now in a couple of months regardless. So it won't matter if Taiwan's manufacturing base gets disrupted, the hardware will have already effectively stopped.
Wow, I wasn't aware Samsung, Intel, and SMIC were unable to produce "modern technology." Not everything needs to be on a 3nm TSMC process, believe it or not.
TSMC makes a lot of stuff besides the EUV-scale parts that all the YouTube videos talk about.
Almost everything you own that runs on electricity has some parts from Taiwan in it. TSMC alone makes MEMS components, CMOS image sensors, NVRAM, and mixed-signal/RF/analog parts to name a few.
Also, people seem to assume that TSMC is an autonomous entity that receives sand at one loading dock and ships wafers out at another. That's not how fabs work. Their processes depend on a continuous supply of exotic materials and proprietary maintenance support from other countries, many of them US-aligned. There is no need to booby-trap any equipment at TSMC; it will grind to an unrecoverable halt soon after the first Chinese soldier fires a rifle or launches a missile.
Hopefully Xi understands that. But some say it's a personal beef/legacy thing with him, and that he doesn't even care about TSMC.
Russia wasn't able to take Ukraine even when they could just drive their tanks right up to Kyiv. Modern warfare tech just favors the defender too much. China has ninety km of sea to cross before they even get to Taiwan. Missiles and drones have already taken out the Russian naval fleet in the Black Sea. China will be losing a lot in the same way if they ever attempt the crossing.
That's what happens when consumer demand rapidly shifts, and businesses start panic-buying and panic-cancelling. As far as I recall, actual chip fab output didn't really change that much.
I asked ChatGPT about this. It says the root cause was demand collapse at the start of COVID, so fabs stopped producing the many low-end chips required for modern cars and retooled/pivoted to higher-end chips. When auto manufacturers came back knocking after COVID, the fabs didn't want or need their low-end chip business.
Moore's law only really works when at least part of the world is functioning under practically ideal conditions. Right now that's far from what's happening.
Finally, good efficient code is going to get its moment to shine! Which will totally happen because it's not like 80% of the industry is vibe coding everything, right?
Yeah, I got the AI to convert some code that ran at 30fps in Javascript to C, and it resulted in a program that generated 1 frame every 20 seconds. Then I told it to optimize it, and now it's running at 1 fps. After going back and forth with the AI for hours, it never got faster than 1 fps. I guess I'm "doing it wrong" as the hypesters like to tell me.
> Yeah, I got the AI to convert some code that ran at 30fps in Javascript to C, and it resulted in a program that generated 1 frame every 20 seconds. Then I told it to optimize it, and now it's running at 1 fps. After going back and forth with the AI for hours, it never got faster than 1 fps. I guess I'm "doing it wrong" as the hypesters like to tell me.
Remove the "I actually only want a slideshow" instruction from your prompt :-)
Honestly speaking, it has started to look like AI coders could actually do a better job than 80% of app developers at writing efficient apps, just by being set to adhere to best-practice programming conventions by default (notwithstanding their general tendency to be too clever instead of writing clear, straightforward code).
This is my theory: we're going to see a lot of languages with straightforward and obvious semantics, high guard rails, terrible DX, and great memory-allocation and performance behavior out of the box. Assembler or worse, but with extremely strong typing bolted on in a way that no human would ever tolerate - basically, something in that vibe.
Yeah, actually I worked with Pascal early in my career and that's kind of the vibe I am thinking of, though maybe with a stronger, more Ada-esque type system (composite, partial, and range-and-domain types, all that jazz).
I vibe coded a library in Nim the other day (a language I view very much as a spiritual continuation of the Pascal/Modula line), complete with a C ABI.
The language has well defined syntax, strong types, and I turned up the compiler strictness to the max, treat all warnings as errors etc. After a few hours I put the agent aside, committed to git then deleted everything and hand coded some parts from scratch.
I then compared the results. I found one or two bugs in the AI code, but honestly, the rest of our differences were "matters of taste" (is a helper function actually justified here or not, that kind of thing).
>I feel like for the first time in our lives we might have seen peak technology for the next few years.
This happened for a while with CPUs in 2004 or 2005, IIRC. At the end of the Pentium 4 era clock speeds and TDPs were so high that we hit a wall. Nobody was pushing past 4 GHz even with watercooling (I tried).
Dual-core processors were neither widely available nor mainstream yet, and those that were available had much lower clock speeds. It definitely felt like we hit a lull, or a stagnation, in those years. It picked back up with a fury when Intel released the Core 2 Duo in 2006, though.
Helium is almost all captured from gas wells by cryogenically liquefying the nitrogen out of it. I guess you could technically do that with the fab's air, but it is a LOT of air to liquefy and likely costs more than even inflated helium prices.
Most helium from most wells is simply vented because it is expensive to separate even with its relatively high concentration, and I imagine even the best case scenario for capturing it from a fab has abysmal concentration of helium. But because most of it is vented it also means if the capital is put down to build more helium separators on gas wells it wouldn't take long to increase supply. Short term for a year or two it can be a problem, but beyond that it is simply a cost versus demand issue. There is neither a technological nor source limitation, it is a pure capital investment limitation.
> Helium is almost all captured from gas wells by cryogenically liquefying the nitrogen out of it.
This is wild. I never thought about how they separated gases from natural gas fields. The carbon footprint of each kg of that helium must be astonishingly large.
> Most helium from most wells is simply vented because it is expensive to separate even with its relatively high concentration
I remember a similar situation with neon early in the Ukraine invasion a few years ago. What I expect to happen is some other source coming online that currently doesn't try to capture it for economic reasons.
Helium recovery in scientific settings for cost saving reasons is already done, so it's not like there isn't expertise in using it.
Helium is actually pretty hard to keep ahold of, being a very light and small noble gas. It can diffuse through a surprising amount of materials, flow through far smaller cracks than you would expect, and is quite hard to filter out of a mixture of gases.
Also, superfluid helium (a big chunk of the helium used for refrigeration, e.g. in the LHC) has the weird property of flowing at the same speed through a tiny hole as through a large one, and it coats everything with a molecular film. Superfluid helium is basically a Bose-Einstein condensate at macro scale, totally counterintuitive. Essentially a thermal superconductor. Zero viscosity.
AFAIK they recapture most, but recapturing all simply isn't possible / financially feasible. And they use a lot of helium, so even if they capture most of it, the losses are still higher than the currently available supply.
Even before the hikes, SBCs were $50-$100 a pop, compared to pennies for basic MCUs and maybe $4 for high-performance ones. People were clearly willing to pay 100x more just for familiarity and the ecosystem ("hats", forums, etc). I don't know if 300x is going to make more hobbyists see the light, or just result in fewer of them being able to afford the hobby?
> People were clearly willing to pay 100x more just for familiarity and the ecosystem
This is obviously logical. If I know how to program in Python or JS but not C and am familiar with SSH, I can do something with a SBC in a few minutes.
I get paid $200/hr. If I spent even one hour to learn what I need to deal with a microcontroller, the time cost is four times the cost of materials if I stick with what I know.
How many small projects do I need to do in my free time before it's financially smart to learn a whole new technology?
Most of the "professional" microcontrollers have complicated flashing schemes, expensive bespoke IDEs, and limited language support. The vendors treat a lot of that like a moat around their products.
I find it remarkable that they haven't tried to make all of that easier. Any board with Arduino support is easy to start using, with pared-down C++; boards similar to the micro:bit support MicroPython and JavaScript as well as a few others; and a ton of modern development boards have UF2 support.
UF2 is a step change in how easy it is to flash a binary onto a microcontroller. You hold down a button before connecting it to a USB port, and then it appears as a USB drive for you to drop a file onto, once it's done "copying" the board is flashed and will run your code as soon as it resets.
If you want to gain familiarity with a board, you can drop a .uf2 file with a REPL on it and run code on the board a line at a time.
As if it would make sense that spending 2hrs relaxing on the beach or gardening your orchids would cost $400 to you. Money not made is not money spent. If you were doing a hobby project for learning, you were not going to be working during that time anyways, so your hourly rate doesn't matter.
Microcontrollers don't really make sense for hobbyists (unless their hobby is programming microcontrollers, of course). They only make sense when you think about deploying an application at scale, at which point the per-unit price becomes important. OTOH, if your hobby project goes viral and you want to profit from selling SBCs with it preinstalled, a cheaper SBC is a plus, but that's not very likely to happen...
My point is that the FPGA boards are several orders of magnitude more expensive than the actual chip. To be fair, you should be comparing the cost of the SoC with that of the microcontroller.
Yeah, never understood why I would want an entire OS running just to blink an LED. I was going to make a pro-Arduino comment but I guess my LED example warrants little more than an R/C circuit and a transistor, ha ha.
(Anyway, I still remember the thrill of writing assembly for a 68HC11 and getting a pair of hobby servos to respond.)
Mostly for the network stack. Economics, also, sometimes.
These days, with ESP32, Pi Pico W etc... things have changed a lot.
But before they got popular, why deal with an MCU plus wiring up some weird peripheral for Wi-Fi/Ethernet when you could get a Pi Zero W or a clone with built-in Wi-Fi for the same price?
You jest, but I ended up getting a lot of use out of being able to do this in software for a dimmable LED lamp. Dimming the LED required PWM, and the potentiometer resistance -> PWM duty cycle map ended up fairly intricate to make the knob "feel right."
Now what I would have loved to have done is come up with some crazy analog circuit to implement an arbitrary transfer function from potentiometer input to LED voltage, but I didn't know how to do this at the time and the dev cycle would be a lot more painful than with software.
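For flavor, here's a minimal MicroPython sketch of that kind of mapping, assuming an RP2040-style board with the pot wiper on ADC0 (GPIO 26) and the LED on GPIO 15; the gamma value is an invented placeholder you'd tune by feel:

    from machine import ADC, Pin, PWM
    import time

    pot = ADC(26)        # potentiometer wiper on ADC0
    led = PWM(Pin(15))   # LED (via driver transistor) on GPIO 15
    led.freq(1000)       # 1 kHz PWM, fast enough to avoid visible flicker

    GAMMA = 2.2          # placeholder curve; tune until the knob "feels right"

    while True:
        x = pot.read_u16() / 65535             # normalize reading to 0..1
        led.duty_u16(int(x ** GAMMA * 65535))  # nonlinear map to duty cycle
        time.sleep_ms(10)

The point being: changing that transfer function is a one-line edit in software, versus a whole new circuit in analog.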
They don't call it C++ because that sounds too difficult. But it literally is C++ - not a simplified subset that compiles into an IL using a formally proven tool, but literally compiled using GCC as C++.
It's literally the hello world of micros: get an Arduino, plug it into USB, install the IDE, File -> Examples -> 01.Basics -> Blink, press Upload. Cool, you have now blinked an LED. Now use AI to draw the rest of the owl.
It's easy once you've done it - but before you've done it (for me at least) it was much easier to just install a Linux on a Pi and run a bash script than to learn how to program an Arduino.
(Of course, there are those to whom an Arduino is an overpriced piece of junk and they don't understand how I can't solder a three cent chip myself.)
But let's be realistic - all of these things are like my Steam library - purchases made but never used (I have a drawer full of Pis and other SBCs, and Arduino dev kits, etc. Someday I'll have time time time!).
It also gives you a GUI to easily flash devices and view the output from the serial port, and to import libraries that do all of the hard work, like making a serial port on any microcontroller pin or controlling external devices like light strips or displays.
I'd assume the average user on HN should be able to figure it out pretty easily.
With MicroPython or some of the JS-based frameworks for microcontrollers, it's really not that new or different, especially with the ESP32, Pi Pico W, and their clones...
In fact it's a lot more straightforward to not have to deal with the NetworkManager config files, systemd unit files, or read-only-rootfs headaches of the Linux world.
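As a rough illustration, joining a Wi-Fi network on a Pico W or ESP32 under MicroPython is a handful of lines and no config files; the SSID and password below are placeholders:

    import network, time

    wlan = network.WLAN(network.STA_IF)     # station (client) interface
    wlan.active(True)
    wlan.connect("my-ssid", "my-password")  # placeholder credentials

    while not wlan.isconnected():           # wait for association + DHCP
        time.sleep_ms(100)
    print("connected:", wlan.ifconfig())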
You're probably joking, but this is interesting. If we throw more RAM at AI, it can help us optimize programs to reduce our RAM needs, I haven't thought about it like that
For me it's primarily the ability to run a full TCP/IP stack. For hobby projects, I'd rather use a Pi or a Beaglebone with IRC or HTTP for data egress than, say, I2C or SPI. The ease of debugging alone makes it worth it.
Agree, but there was something special about SBCs being so cheap they were the default recommendation for new hobbyists and I'm sad to see that go.
I would not have fallen in love with microcontrollers without Raspberry Pi and PocketCHIP as stepping stones.
The messaging of "it's a tiny computer, make whatever you want with it" is so much more approachable than anything I've found on the microcontroller side. Even Arduino. I dismissed it for a long time because I misunderstood it. I thought I had to buy Arduino devices, then Arduino shields, then program them in the Arduino language using the Arduino IDE.
I’ve been having a lot of fun with the Pi Pico 2W. It can host an access point, a web server, be a USB host, and of course has GPIO. And not running an OS means it’s way simpler.
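To give a sense of scale, a bare-bones access point plus web server in MicroPython looks roughly like the sketch below; the network name, password, and page contents are made up for the example:

    import network, socket

    ap = network.WLAN(network.AP_IF)                    # access-point interface
    ap.config(ssid="pico-demo", password="letmein123")  # made-up credentials
    ap.active(True)

    s = socket.socket()
    s.bind(("0.0.0.0", 80))
    s.listen(1)

    while True:
        conn, addr = s.accept()  # handle one request at a time
        conn.recv(1024)          # read and discard the request
        conn.send(b"HTTP/1.0 200 OK\r\nContent-Type: text/html\r\n\r\n")
        conn.send(b"<h1>Hello from the Pico 2W</h1>")
        conn.close()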
The Raspberry Pi 4B 2GB is $55 on CanaKit, PiShop, Seeed Studio, and MicroCenter.
The Raspberry Pi 3B 1GB is $35 on CanaKit, Adafruit. The 3B+ is on PiShop for $40.
The Raspberry Pi 3A+ 512MB is $25 on CanaKit, Adafruit, PiShop, SparkFun.
The Raspberry Pi Zero 2 W is $16.35 on CanaKit, $17.25 on PiShop.
We are unavoidably headed for financial collapse, authoritarianism, and potentially the collapse of Western civilization. And y'all are worried that you won't have 8GB of RAM in an embedded GPIO computer for your hobby? Maybe it's time to make the ultimate sacrifice: use less RAM.
Maybe it will even mean that we software developers get more time to optimize our RAM usage while developing, instead of implementing that new user-tracking feature being pushed by the business...
I picture a scene with Richard Crenna knocking on our old fogey's cabin door to ask us to come out of retirement and help hand-optimize software in this new environment
If I spent a bajillion dollars on massive data centres I would be delighted if personal computing were also crippled for a while. It would allow me to further own your ability to do compute tasks and to help kill the concept of doing it yourself for a while.
There are ups and downs in the prices of components. People often forget that during COVID, SBC prices were high because of supply chain issues. Video cards just were not available in the UK during and after COVID (every supplier had long lead times) and are still relatively expensive (at least there are now lower-priced options). Raspberry Pis you couldn't get hold of at all, and many people (Jeff included) were using a stock-checking website; availability was non-existent for anything other than low-end models.
I remember 15-20 years ago when hard drive prices went through the roof because of a flood in Thailand, and it took years for prices to come down.
There are also going to be supply chain issues due to the current geopolitical situation (helium comes out of the Gulf and is needed in chip manufacturing), which will further affect the price of components.
Eventually in a few years (as the article states) the situation will change. It just sucks at the moment.
TBH I am more worried about my ability to fill up the tank on my car, as both petrol and diesel are unavailable locally. I can make do with whatever computer equipment I have.
> People often forget that during COVID, SBC prices were high because of supply chain issues.
inb4 AI has the same supply chain effects as a worldwide pandemic. I guess those AI doomers that talked about it being the end of the world had it right!
There is a saying often trotted out by economists: "The cure for high prices is high prices."
There is a consumer market and business need for DRAM outside of AI. Someone will fulfil the need as there is a high incentive to. It is just going to take a bit of time for this to happen. My equipment is going to be fine for another few years. So I am going to just hang tight and make do with what I've got for now.
The main producers actually reduced DRAM output in 2026. When you have few players and very high capital costs, you end up with cartels, like the light bulb cartel.
Someone will come in when the price goes up enough. It will take time, but it will happen. What people are complaining about is that the time for this to happen is too long.
Oh look, there is a player coming into the market it seems:
Reminder: the whole world is not the United States of America. While you chose to vote for someone who thinks tariffs are good for the local market, no other country joined your bandwagon.
Maybe they will. However, people often claim that no one will want to enter the market to take advantage of high DRAM prices, when two minutes of web searching would show that isn't true.
In reality, were they going to survive anyway? I would wager likely not.
The Raspberry Pi is the de facto standard for SBCs. Almost all the other SBCs had significant problems, usually around software support and also third-party support, e.g. HATs, cases, etc.
I’m just going to try and hang tight as well. But I do wonder if DRAM companies should or should not respond to this pricing situation. The actual AI model training companies buying all the RAM aren’t profitable yet, right? It’s all investment, which can dry up at the drop of a hat.
> Someone will fulfil the need as there is a high incentive to.
Unless the capital cost to compete is too high and the risk of the existing manufacturers undercutting you is very real. Plus it can take 5-10 years or more to build a new fab, debug/iterate your process, then start shipping product.
Markets are prone to natural distortions. This is one form of that. It can be perfectly natural for all potential competitors to choose not to compete no matter how much demand exists.
Frankly I'd expect nationalization of some of the DRAM makers before we see the rise of useful competitors. The more likely scenario is government pressure, up to and including arresting executives, to rattle the cages of the existing players who are way better placed to expand production quickly for relatively low capex. Not that I think any action is likely in the short term. My guess is the existing players are betting on an AI bubble pop so they don't see the use in really expanding capacity only to be left with idle fabs later. None of us really knows.
The price for a couple of 32GB sticks is now over $1200 after being stable at about $200 for several years until last September. That's not a blip; that's a 6-fold hike, and there is no sign it is slowing down any time soon.
Let's see, this is a low speed 2x16GB DDR4 kit for $300.
The closest option on the pcpartpicker chart was about $75 as a stable price. So that one's only a 4x increase.
Versus DDR5, where... it looks like a 5x increase to me? I'm seeing a jump from $200 up to $1,000. Edit: Oh, there's an extra jump in the last month on the CAD version but not the USD version.
Did you not read what I said? I couldn't even get a replacement video card at any price during the height of COVID, and believe you me, I had the money to pay for one. I couldn't even get a Raspberry Pi (any model) for about a year. They were constantly out of stock.
> That's not a blip; that's a 6-fold hike, and there is no sign it is slowing down any time soon.
How does that invalidate anything I said? As stated in the article, this will change; it will take years, but it isn't forever.
I find it hard to believe that people here cannot make do with whatever hardware they already have.
I also don't believe those small SBCs would have survived long term anyway. Most people just use a Raspberry Pi. It is either a mini PC or a Raspberry Pi.
Discord groups had real-time line counts and pictures of the line at most Best Buys across the country (US).
The only way I got one was overpaying and a lottery system that bundled it with other hardware because they knew everyone would still buy it. It was impossible to buy online normally as you needed some kind of automated way to buy it before stock zeroed the minute it was posted.
You could pay a scalper for a gfx card, but stores had none. Now, stores have RAM at least.
> Did you not read what I said? I couldn't even get a replacement video card at any price during the height of COVID, and believe you me, I had the money to pay for one.
You're comparing to memory sticks that went up 6x. If you were offering anywhere near 6x MSRP and you couldn't get a video card... I don't believe you.
> If you were offering 5x MSRP and you couldn't get a video card... I don't believe you.
My 1080 Ti had died. I had to use an 8800 GTS from the late 2000s for about a year, as that was the only GPU I had. I have no iGPU on my CPU.
There was, at one time, no stock available. Not on Amazon, not on Overclockers, not on Scan. They had some weird lotto systems taking place on most sites.
Scalpers claimed to have cards. But I wouldn't risk sending a lot of money to some random seller on ebay.
> Unless this article is massively misleading, sure it was out of stock at 1x price but it wasn't out of stock at 2-3x price.
Again, I am in the UK. You could not buy any Pi other than the 1GB model and maybe the Zero. Both of which were useless to me.
> Scalpers claimed to have cards. But I wouldn't risk sending a lot of money to some random seller on ebay.
Ah, so you could have bought one, but you judged the available suppliers to be too risky.
Completely fair, but then it's not true that you couldn't buy one "at any price". It was just not a price+risk that you were willing to take.
Also, re: Raspberry Pis, you couldn't always get the exact RAM configuration you wanted, but they were pretty continuously available during COVID on Aliexpress. You did have to pay 3-5x normal price, but you could do it. I really needed one after one at home died, and paid the 3x markup, and it was annoying but fine. Not sure if Aliexpress is equally as available in the UK as it is here in the US, though.
> Completely fair, but then it's not true that you couldn't buy one "at any price". It was just not a price+risk that you were willing to take.
You are being pedantic. I find this type of discussion very tiresome. I've explained why in other forks of this thread. Quite honestly it pisses me off.
> Also, re: Raspberry Pis, you couldn't always get the exact RAM configuration you wanted, but they were pretty continuously available during COVID on Aliexpress. You did have to pay 3-5x normal price, but you could do it. I really needed one after one at home died, and paid the 3x markup, and it was annoying but fine. Not sure if Aliexpress is equally as available in the UK as it is here in the US, though.
Not in the UK. Someone was running a site with all the places that you could buy from. I was checking most days. Stock was extremely limited other than a few models.
This was my experience, too. Pis would disappear from online retailers before you noticed the stock alert email.
I only got hold of a Pi 4 by chance when Raspberry Pi did an official pop-up store in Southampton for one day only. The queue to get in was about 45 mins long.
Okay, UK, maybe that changes things more than I expected. But what about ebay and the sites that replaced classified ads? And is it unreasonable for me to say that you could have bought a US listing and had it reshipped?
Edit since you added: Scalpers claimed to have cards. But I wouldn't risk sending a lot of money to some random seller on ebay.
Even with ebay's buyer protection?
Well not to be mean but I think "I refused to use ebay" invalidates your claim that you couldn't buy a card.
I've had problems with it before (I can't remember specifics as it was a while ago). I'd rather not going through the hassle and/or risk in the first place.
There are still plenty of scams on eBay. During this era there were people scamming, e.g. selling just the box for a GPU: listing the entire specs and then noting right at the bottom of the listing that it was only the box and not the card.
> Well not to be mean but I think "I refused to use ebay" invalidates your claim that you couldn't buy a card.
What you are doing is being hyper-pedantic. It is fucking tiresome when people do this online.
If you are going to be a smart arse, I will modify my statement to say "I could not get a card from a reputable online store as they were all out of stock and did not wish to risk buying from a less reputable one".
I would be foolish to trust some overpriced (or underpriced) listing on ebay. I've had an ebay/paypal account now for 25+ years, I've learned to never do this because I got screwed every time I did.
> What you are doing is being hyper-pedantic. It is fucking tiresome when people do this online.
That's not pedantry. There's a huge difference between "they were unavailable and I couldn't get one at any price" and "I could have bought one from a scalper but I didn't trust them". Even if it's reasonable not to trust them (it is!), the first statement is sensational, and untrue, especially considering you emphasized "at any price" in your comment upthread.
> If you are going to be a smart arse, I will modify my statement to say "I could not get a card from a reputable online store as they were all out of stock and did not wish to risk buying from a less reputable one".
That's what you should have said in the first place; that would have been honest and correct.
And please, there's no need to call the other poster names. That's uncalled-for and childish. You seem to be new here (9-day-old account), so please read the site guidelines and turn it down a notch or three.
> That's not pedantry. There's a huge difference between "they were simply unavailable and I couldn't get one at any price" and "I could have bought one from a scalper but I didn't trust them". Even if it's reasonable not to trust them (it is!), the first statement is sensational, and untrue, especially considering you emphasized "at any price" in your original comment.
It is, for any normal person in a relatively normal setting.
Only amongst technical people is this sort of discourse tolerated, where someone pretends that an unreasonable option (the scalper in this case, as you admitted yourself) should be included in a statement when it is perfectly obvious it should not be, because it is not in any way reasonable.
I could have flown to the US or China and bought a card. Is that reasonable? For most people it isn't. It wasn't for me. Buying from an untrustworthy seller is unreasonable too.
> the first statement is sensational, and untrue, especially considering you emphasized "at any price" in your original comment.
They were out of stock on every reputable site. Therefore I could not buy a card at any price from them because they didn't exist.
> That's what you should have said in the first place; that would have been honest and correct.
I was honest and correct to begin with. The poster was using prices and availability in the US and not the UK.
> And please, there's no need to call the other poster names.
I never called them names. I expressed my annoyance at their behaviour.
I would certainly consider "at any price" to mean that you'd be willing to pay the 5x price to 20 different scammers and still get no card.
There might be a cultural difference between the old world and the new world over what "at any price" means, but I'd take it to mean being willing to spend at least $1M on it.
You are being a pedant as far as I am concerned, and arguing semantics with me is not going to convince me or many others.
So I suggest that in future you learn that using this line of logic (where you expect me to have done something most people would consider unreasonable) is not something people are going to put up with. It is really annoying to have to converse in this manner; in fact I believe it is often wholly disingenuous, and I no longer wish to speak to you.
If I categorized these situations the way you do, and I said what I'm saying, I would be a pedant.
But I see things a different way. The logic I'm actually using is not pedantic.
You calling me disingenuous over this is painful to look at. Get out of your own head for a second. We're using different premises, and we're reaching different conclusions because of that. My logic is fine, and your logic is fine.
> If I categorized these situations the way you do, and I said what I'm saying, I would be a pedant.
I am not categorising any situation. The vast majority of people would omit unreasonable options.
I could have bought a racing bike that was £5000 new for £200 when I lived in London (back in the 2000s). The bike would most likely have been stolen. So technically I could buy a £5000 bike for £200. But most people wouldn't want to buy from a thief and would consider it unethical.
People feel similarly about scalpers and other untrustworthy sellers.
> You calling me disingenuous over this is painful to look at. Get out of your own head for a second.
You started the conversation claiming I was outright lying. Then when I clarified to you what I meant you continued claiming I was lying/misstating. That is really annoying.
If you could have just said "okay that is fair, while you might have been doing X and Y, I can understand why you didn't want to do that". That would have been fine. But that didn't happen.
> You started the conversation claiming I was outright lying. Then when I clarified to you what I meant you continued claiming I was lying/misstating. That is really annoying.
I said "If you were offering anywhere near 6x MSRP" I didn't believe you, and it turns out you weren't offering 6x MSRP. So I wasn't calling you a liar.
> If you could have just said "okay that is fair, while you might have been doing X and Y, I can understand why you didn't want to do that". That would have been fine. But that didn't happen.
So if I had explicitly said "I think it's fine you didn't use ebay" that would have fixed everything? Because I never argued about your personal choice, I argued about you calling ebay "unreasonable".
Well for the record, I was going to say something like that in response to "If you are going to be a smart arse, I will modify my statement to say "I could not get a card from a reputable online store as they were all out of stock and did not wish to risk buying from a less reputable one"."
But then I saw you had called me "hyper-pedantic" and I focused on rebuffing that attack instead.
Edit: And it doesn't help that you never actually did that modification, and instead keep insisting that what you originally said means the same thing.
> So if I had explicitly said "I think it's fine you didn't use ebay" that would have fixed everything? Because I never argued about your personal choice, I argued about you calling ebay "unreasonable".
Ebay in itself isn't unreasonable.
Ebay is unreasonable when the only sellers are untrustworthy ones, and when there were a bunch of scams going around at the time. Which there were.
I've clarified this many times now. I don't care what the interpretation of what I said is at this point.
> Well for the record, I was going to say something like that in response to "If you are going to be a smart arse, I will modify my statement to say "I could not get a card from a reputable online store as they were all out of stock and did not wish to risk buying from a less reputable one"."
I don't believe you. I've had plenty of stupid conversations like this, with plenty of tech nerds. It rarely happens with non-tech people. I spend some time in non-tech hobby spaces that are still technical (classic car / bike repairs) and this conversation style never happens there.
People like yourself think you are being clever by poking holes in everything that is said. I am quite happy to be quite obnoxious in pointing this out. I am tired of it. I am this cantankerous IRL about this, btw.
The fact is that you could not buy a new graphics card in the UK for some time during COVID via almost any online retailer. I had conversations with other people in the UK who wanted to buy PC hardware, and they were in the same situation. The same was true for the Pi 4 at the time. Making stupid semantic arguments doesn't change that fact.
> Edit: And it doesn't help that you never actually did that modification, and instead keep insisting that what you originally said means the same thing.
For all intents and purposes it is the same thing if you aren't engaging in pedantry and semantics. I try not to engage in it anymore (unless it is tit for tat), because I understand it pisses people off. You obviously don't care.
I like these many posts about how you, specifically, chose not to use any of the available systems to get a GPU that rapidly organized and became common globally during lockdown. The line from “I just didn’t feel like doing something once” through to “My predictions for the future about a different problem are obviously true” is clear as day. Can’t see why anyone would disagree
> I like these many posts about how you, specifically, chose not to use any of the available systems to get a GPU that rapidly organized and became common globally during lockdown.
You, like the other people I was arguing with, are pretending that the options were reasonable. They weren't at the time. Many other people I know thought the same.
There was no stock for any GPU except for absolute crap on any of the retail sites in the UK. There are not many options in the UK generally. It is not like the US.
As far as I am concerned, what you are engaging in is effectively gaslighting.
> The line from “I just didn’t feel like doing something once” through to “My predictions for the future about a different problem are obviously true” is clear as day. Can’t see why anyone would disagree
If you deliberately want to misunderstand what is said you could draw that conclusion. Which is blatantly what you are doing.
The only things I claimed about the current high DRAM price situation are:
1) It is likely to get worse before it gets better (due to supply chain issues caused by current wars).
2) It will resolve itself over time, and you should be patient and just make your existing stuff last as long as possible.
That is how crises often play out, and in my original statement I was actually telling people not to be all doom and gloom and just to be patient. It will sort itself out. It won't be this year, for sure.
My favorite part would have to be where you can’t remember the actual, structurally crucial piece of information that your argument rests on and just said that you didn’t feel like getting a GPU off eBay.
>I've had problems with it before (I can't remember specifics as it was a while ago). I'd rather not go through the hassle and/or risk in the first place.
As your evidence that
> Doomers IMO are just click baiting.
Like you admitted that you _do not remember_ why it was entirely unreasonable or impossible and are arguing against people that do possess memory of it being possible and reasonable enough for them at the time. Amazing stuff.
> My favorite part would have to be where you can’t remember the actual, structurally crucial piece of information that your argument rests on and just said that you didn’t feel like getting a GPU off eBay.
You are misunderstanding what is being said. I suspect it is deliberate.
It is often said that "prevention is better than the cure". Similarly, it is often better not to risk spending your money unwisely than to have to go through a process to recover it. The specifics of the situation don't matter (it happened a decade or more ago).
I communicated that quite clearly. So you either didn't understand or you are deliberately misunderstanding what I said.
> Like you admitted that you _do not remember_ why it was entirely unreasonable or impossible and are arguing against people that do possess memory of it being possible and reasonable enough for them at the time. Amazing stuff.
I bet you felt really clever constructing that. However, as explained, the specifics weren't the point. Avoiding the funds-recovery process entirely is the point.
I don't know why you're being so combative here. I said I liked your posts about vaguely feeling that a specific thing was probably worse during covid lockdown than everyone else remembers it, and how that means that you are equipped to predict the impact of a completely different phenomenon on something else. I like these posts! Responding to "hmm this specific thing looks bad" with "alright I don't actually remember what I'm basing this on but I saw a quote about economists that I think means it's good and it feels like everyone that doesn't vibe with me and my quote is wrong" is fantastic posting!
I wasn't trying to be a smart arse at all. "I couldn't get a new card from a store" and "I couldn't get a card at all" are extremely different claims in my mind.
I'd rate my pedantry level as quite low. From my point of view this is not a nitpick.
Especially because you emphasized "at any price". It's the scalpers and the used market that were selling at any price. Sticking to reputable stores means sticking close to MSRP.
Paying a scalper on ebay isn't. Which is what I said. Misstating what I said is disingenuous.
> You could have gotten another 1080Ti from a legitimate previous owner.
They were being scalped as well. Also, people were holding onto their 10-series cards because the newer cards were too expensive. So I would have had to buy an older card (a model I had already had fail on me) at an inflated price.
I could have bought a GT 710 or a GT 1030, but that wouldn't have been much better than my 8800 GTS, really.
I could have flown to Taiwan and bought a card. I could have stolen one. I am sure you will invent another fantasy scenario where I could have gotten a graphics card that I didn't think about at the time.
The fact is that I could not buy a new card from an online retailer in the UK as they were out of stock. Even when they did come into stock there was a lotto system. So you couldn't really buy one then. That is a fact.
I am not a hardware guy, so I am asking this in good faith: excluding people with corporate backing, who actually needs DDR5 RAM? Gamers? Why is DDR4 or DDR3 not good enough?
Because modern CPUs are on platforms that support only DDR5.
If you are a gamer, chances are you want one of the AMD X3D CPUs. Whilst AMD did produce the 5600X3D, 5700X3D and the highly sought-after 5800X3D, these are effectively unobtainable now (outside of the used market, where they already go for about 2x MSRP).
You are effectively forced into AM5 (or whatever Intel is doing) and they require DDR5. You don't have the "choice" to use DDR4 anymore in most circumstances.
If your question is more of a hypothetical (assuming we could use newer CPUs with DDR4 or even DDR3), the answer is a bit more blurred, but at least in a lot of gaming workloads you aren't memory-speed bound. There are some performance regressions, sometimes up to 15%, but a lot of this is negated with the X3D chips anyway (:
If you only need DDR3-like throughput you can keep a minimum of RAM for booting and caching, and set up swap on an Intel Optane drive: they're widely available and cheap (at least cheaper than RAM) on the second-hand market.
(For read only workloads (no writes or only very rare writes) any ordinary SSD would suffice; the point of resorting to Optane is its unique wearout resistance.)
I remember that literally everything, including basic necessities like food and housing, jumped 30% higher overnight and never really returned to pre-COVID prices. It erased about a decade's worth of wage increases for most people.
I think the doomers are probably anticipating another round of that and they're probably right.
At work we just got a quote to upgrade a couple servers, original price a few years ago was ~ $150k. Essentially the same hardware, just newer, is now quoted at ~ $450k.
We decided to just keep our current hardware for now and extend a support contract for ~ normal price.
Quite the opposite, I'd wager. Now that AI can figure everything out, we can have the AIs do the performance work. Performance work has also often gone against developer experience in terms of languages/patterns and such. AI doesn't need to care about DevEx, which might also drive a shift towards more memory-efficient languages and patterns. Only time will tell, though.
Between this revelation and that post recently on HN about the scanned receipts and egg prices, I find myself wondering if we're worrying about the wrong things.
We're seeing massive inflation in computing, but because the dollar is holding its value we call it increased prices. The buying by the big buyers is what's driving the inflation; its mechanism is scarcity.
But it's also localized. Only we experience this as a problem, because compared to the hyperscalers we're poor.
The same idea applies to the price of groceries. As prices increase, the base increase is inflation, but logistics efficiency also plays a big role.
The effect is the same: the ones with more spendable income don't experience an issue yet, while in the projects nobody is eating fresh veggies.
The part that scares me is the creep, as I call it. Over the years I've always been able to carry price shocks and such, but this time I'm out of the game. No more DRAM for me.
I then wonder if one day, without losing my job, I won't be able to pay for veggies.
You're right that fuel prices have risen. But usually the impact of fuel prices is mostly felt on bulkier, lower cost items first.
After all, a truck can carry a 10kg sack of rice, or a 10kg nvidia gpu. If shipping costs for 10kg rise by $15 the sack of rice has doubled in price, but the GPU is only 0.5% more expensive.
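A quick check of those numbers (this assumes the implied prices, a ~$15 sack of rice and a ~$3,000 GPU, neither of which is stated outright above):

    # Back-of-the-envelope only: both prices are assumptions, not quoted figures.
    rice_price, gpu_price, extra_shipping = 15.0, 3000.0, 15.0

    print(f"rice: +{extra_shipping / rice_price:.0%}")  # +100%: the price doubles
    print(f"gpu:  +{extra_shipping / gpu_price:.1%}")   # +0.5%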
For a truck yeah, but across the ocean, it isn't quite that simple because GPUs and grains are sent in different types of ships (or different modes entirely) that aren't interchangeable.
You're right - perishable goods have to be shipped fast. Your bananas, berries, fresh fish, and not-from-concentrate juice can't be on some slow-steaming container ship with the furniture, clothes, building materials and vehicles.
This is driven by AI datacenter demand, not fuel prices. RAM prices have actually dropped significantly in the last couple days as the Iran war hit and the possibility that interest rates might go up and pop the AI bubble sunk in. (Though let’s see where they go after the last couple days of whipsawing.)
Yeah. Not true. Or send me the name of your server vendor. I’m buying.
We're having issues with both price and availability on NVMe and SATA flash, starting to see it with some CPUs, and, for a personal project, with high-density spinning rust (24TB+).
Well, this brings back the incentive to stop throwing RAM at the problem and start optimising the code. I would like to see what smart people can do with the money saved by not buying more RAM.
For some uses right now, this makes sense, but it has to be at scale. If you're working on something that will ship in two years and is used by end users, it might not be worth the effort since production will catch up.
Taking a big, complex, already well-optimised program like Chrome or the Linux kernel and optimising its memory footprint is hard. But 90% of programs are just crappy web apps that nobody has even bothered to optimise at all. (Sometimes wrapped in Electron or something.)
If you go look, you often discover that 90% of the requests are useless, or at least could be combined. That 60% of bandwidth is used up by 3 high-res images which get displayed at 30x30 pixels. That CPU time is dominated by some rubbish code that populates an array of a million items on every call, looks up 1 element, then throws the whole thing away, only to regenerate the exact same list again a few microseconds later.
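For illustration, a minimal Python sketch of that last anti-pattern and two cheap fixes (all names here are hypothetical, not from any real codebase):

    from functools import lru_cache

    def price_of(i: int) -> int:
        # Stand-in for whatever per-item computation the real code does.
        return (i * i) % 97

    # The anti-pattern: rebuilds a million-item list on every call,
    # reads one element, then throws the whole list away.
    def lookup_wasteful(index: int) -> int:
        items = [price_of(i) for i in range(1_000_000)]
        return items[index]

    # Fix 1: compute only the element that is actually needed.
    def lookup_direct(index: int) -> int:
        return price_of(index)

    # Fix 2: if the whole list really is needed, build it once and reuse it.
    @lru_cache(maxsize=1)
    def all_prices() -> tuple:
        return tuple(price_of(i) for i in range(1_000_000))

    def lookup_cached(index: int) -> int:
        return all_prices()[index]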
We have plenty of RAM. In absolute terms, the 8GB of RAM in a base MacBook is 8 billion bytes: 64 billion ones and zeros. You don't need rocket science to make a CRUD app that runs well with that much RAM.
Computers don't get slower over time. If we were merely as lazy with computing resources as programmers were 10 years ago, most programs would scream on modern hardware.
It isn't that they are crappy web devs. It is that often the org paying for the development doesn't care.
I am a web developer of over 20 years. I can create insanely optimised pages using nothing other than vanilla CSS and JS.
I have been paid exactly once to do this. There is a site I built in 2023 that has a JS and CSS footprint of less than 100KB after gzip (and it's a large site). We even had the Go templates compiled when the web app initialised so the server responded as fast as possible.
Guess what happened when it went live? The content team used 8MB images for everything, and every single optimisation I did at the CSS/JS level was totally useless.
Devs don't care because the people above them don't care and therefore there is zero incentive to even bother.
This is a really great case study for why you only optimize when you actually have a problem, and only in the context of a profiler to define what needs optimization.
the engineering and leadership failure was at requirements time. why on earth would somebody pay for all that optimization without knowing what's gonna be on the page first?
There is something very wrong now with how companies operate in general.
You get beaten down eventually. Late last year I spent like an hour going over with my superior why a PR (and this developer's work in general) wasn't acceptable. He said he was perfectly fine with someone not understanding basic language features (after 6 months using the language). He then merged it.
It didn't work (as I had warned) and created a situation where I had to turn off tests in some projects because it totally broke them. I've spent months fixing his crap and still haven't recovered from one bad PR. Now add two other employees that are like this, and my manager does nothing about it. I bought an AI package from JetBrains and now have it do almost all the work; I normally spend some time cleaning it up. Management have made it clear to me that they don't care about quality, won't hold anyone accountable, and won't even fire people that clearly cannot program.
I am 43 years old this year. I just can't be bothered trying to be a hero anymore.
Similarly, my father, who retired last week, was a joiner/carpenter and would be considered a master boat builder. When my sister was little, my dad made her a new bed with hearts and flowers carved in the headboard.
He described how adversarial he was towards his employer before he retired. He was engaging in malicious compliance (he is a layman and didn't know it was called that) because management was making his life miserable by imposing the same sort of stand-up meeting ceremony nonsense on carpentry.
They managed to make someone with that level of skill hate their job because of process.
> There is something very wrong now with how companies operate in general.
Some companies. A lot of companies, maybe. But far from all of them.
I've done a lot of consulting work, which means I've done short stints at a lot of different places over the years. Some were absolute stinkers - like you describe. But I've also worked with some wonderful people and on some great, high performance teams. I understand that it's not so easy when you're 43 (and maybe, with kids). But you don't need to stay in a job like this. It's not worth getting ground down like this. It's bad for your health. And it's horrible for your career in the long run.
Move to a smaller company. Or sniff around and find a better team within your existing org. In the words of my favorite poet: The world is made to be free in. Give up all the other worlds except the one to which you belong.
> Some companies. A lot of companies, maybe. But far from all of them.
I honestly think it is most of them.
> Some were absolute stinkers - like you describe. But I've also worked with some wonderful people and on some great, high performance teams.
I've totally given up on it. People don't value your work. I did a piece of work for a particular company. It worked perfectly. It was thrown away after a year and a half because management decided everything should be rewritten in <new framework>, ignoring the fact that what I had written was well documented and worked absolutely fine.
Now, I shouldn't really care, right? I was paid and all. But it pissed me off. What's the point in doing a good job if people just throw your work in the bin?
I am looking at what my options are going forward. I am honestly considering becoming a car mechanic (I fix my own vehicles) or working outside for the canal trust. Realistically, I suspect I might pivot to QA or to something security related.
> I understand that it's not so easy when you're 43 (and maybe, with kids). But you don't need to stay in a job like this. It's not worth getting ground down like this. It's bad for your health. And it's horrible for your career in the long run.
I've been looking for over 2 years. I want to move closer to my family, who are 300 miles away (on the other side of the UK). So remote is a must. A large number of positions are hybrid, so not an option.
Outside of that, many of the positions in the UK are in Defence, Intelligence or Law Enforcement, all of which I have ethical reasons not to work for. Beyond those there is gambling, payday loans, and spooky stuff like tracking people via facial recognition.
> In the words of my favorite poet: The world is made to be free in. Give up all the other worlds except the one to which you belong.
Dear god, how do I get these jobs? I'm 35 and would work with you and accept your work, not jam crap code into things. I'm open-minded and realize when someone's idea or code is better than mine.
You can still write JS or TypeScript code that tries its best to keep memory use in check. JavaScript was around in the late 90s, when the memory footprint of software was at least an order of magnitude lower, so it's absolutely doable.
You don't have to go that deep. 99% of the time, when our analytics or risk management teams have some really memory-inefficient Python and want me to write them one of our "magic C things", it turns out to be fixable by replacing their in-memory iterations with a generator.
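To sketch what that kind of fix looks like (a made-up example; the field name and functions are hypothetical, and real code would be streaming rows from a file or database):

    # Eager: materialises every value in RAM before summing.
    def total_exposure_eager(rows):
        values = [abs(row["notional"]) for row in rows]  # whole list in memory
        return sum(values)

    # Lazy: a generator expression processes one row at a time,
    # so peak memory stays flat no matter how many rows there are.
    def total_exposure_lazy(rows):
        return sum(abs(row["notional"]) for row in rows)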
Most people don't have the chance to do that, but hopefully we can see some other languages get first class access on the web. At least there is the whole WASM project.
This feels kind of worn out. Yes, we use more memory, but we have more to work with. At the very worst you just let your favourite LLM take a pass at improving memory usage. For example, yesterday I was debugging an Electron crypto mining blockchain 2.0 app and the WebWorkers wou—-
Toolchains and product pressure did more damage than rusty malloc discipline, because modern stacks assume cheap RAM and pile on deps until a small board looks underpowered.
Just rewrite your biggest memory hogs in Rust; it routinely slashes RAM footprint and demand for RAM throughput. The effect is even bigger than the typical reduction in CPU use. You can even ask AI to help you with the task; it will use a lot less RAM doing it than the rewrite will save down the road.
Yes, languages with very strong type systems like Rust are incredible when paired with an LLM. Just like chatbots have a calculator as a "tool" because they are not the best at calculation themselves, they need a type system to deterministically validate the safety and cohesion of the code they generate.
It's languages like C that you have to watch out for, because the LLM will gladly say "this is safe!" when it's not.
LLMs are great at C, probably because C is historically the most popular language in the world, by far. It has only declined slightly, and very recently. There's an insane amount of code written in it.
Why is that less realistic than saying 'rewrite in rust, make sure there are no memory leaks'?
My point, which I should have been clearer with, is that we aren't at a state where you can just one shot a rewrite of a complex application into another language and expect some sort of free savings. Once we are at that state, and it's good enough to pull it off, why wouldn't the AI be able to pull it off in C as well?
You don't have to trust the AI to do it with Rust; you just have to ensure certain conventions are followed, and you can formally prove you're 'safe' from certain classes of issue, with no AI magic dice-roll.
A lot of people are very excited by the idea that now language capabilities (and almost every other technical nuance) somehow don't matter but much like gravity they will continue to assert themselves whether you believe in them or not.
So far humans have proven unable to write large apps in C without those issues. Given that their work is the training basis for LLMs, this creates two problems: the models don't 'know' what a safe app looks like either, and any humans reviewing the outputted code will be unable to validate it either.
Documentation and testing used to be only mildly important: you'd better have them, but the quality of the tests didn't matter as much, since you had to get the implementation right no matter how good or bad your tests were.
Now that the work is delegated to an LLM, test and documentation quality ultimately decides the quality of the product.
Since you as the programmer no longer have to deal with the language's annoyances directly, and instead make the LLM perform the drudgery for you, you can build a language that trades drudgery for quality and receive a software quality upgrade essentially for free.
LLMs are really good at producing tokens faster than developers, so make those tokens count.
Based on the FIDO2 spec, I used it to write a reasonably compliant security token implementation that runs on top of the Linux USB gadget subsystem (except for attestation, because that's completely useless anyway). It also extracted tests from a messy, proprietary, Electron-based compliance test suite that the FIDO Alliance uses and rewrote them in clean and much more understandable C, without the shitton of dependencies that the Electron mess uses. Without any dependencies but OpenSSL libcrypto, for that matter.
In like 4 hours. (and most of that was me copy pasting things around to feed it reasonable chunks of information, feature by feature)
It also wrote a real-time passive DTLS-SRTP decryptor in C in like 1 hour total based on just the DTLS-SRTP RFC and a sample code of how I write suckless things in C.
I mean, people can believe whatever they want. But I believe LLMs can write reasonably fine C.
I believe that coding LLMs are particularly nice for people who are into C and suckless.
Pain is an important signal that tells you something is going wrong, before it goes wrong really badly.
Rust gives you a lot of pain (= useful signals), before damage occurs.
Now imagine you build a reinforcement learning harness around Rust and C. Which is better for reinforcement learning? Impossible to detect failures in the final product or loud and annoying compiler errors that force you to address them?
The skyrocketing cost of high-end DRAM (4GB+) has caused some interesting shifts:
1. Shifting hobbyist focus: because hobbyists typically prefer parts under the $100 mark (so they don't "fret over breaking them"), the community is shifting away from modern, high-powered SBCs and toward older SBC models (like the Pi 3 or 4 with lower RAM) and microcontrollers (like the RP2040), which remain cheap.
2. Used hardware and "repurposing" old tech is retro-trending again.
IMO, perhaps there will be a push to make software/firmware more RAM-efficient with AI-assisted coding?
am I crazy for thinking that the 16GB Pi 5 is just there to absorb money from people who purchase the most expensive version of things? Like really nobody needs that much RAM on a Pi?
I am running a bunch of stuff on my 8GB Pi and I've run out of memory to put more stuff on. I use it as a low-power server running a bunch of Docker containers. Some of these require at least 200MB, and some use 2GB of memory.
I was going to buy a small nuc and load it up on memory but I've acquired an old Mac Mini with 16GB of ram, which will do.
Yes, you are crazy for thinking that. The extra RAM is useful for small LLMs and also for running lots of Docker containers. The very low power consumption makes it ideal for a low-end home server.
I use the 16GB SKU to host a bunch of containers and some light debugging tools, and the tiny amount of power it sips at idle will probably pay for the whole board, compared with my previous home server, within about 5 years.
Docker is about containerization/sandboxing; you don't need to duplicate the OS. You can run your app as the init process of the sandbox with nothing else running in the background.
That makes docker entirely useless if you use it just for sandboxing. Systemd services can do all that just fine, without all the complexity of docker.
I think that on Linux, Docker is not nearly as resource-intensive as on Mac. Not sure of the actual memory pressure due to things like (for example) not sharing shared libs between processes, granted.
Any node server app will be ~50-100 MiB (because that's roughly the size of the node binary + shared deps + some runtime state for your app). If you've failed to optimize things correctly and you're storing and working with lots of data in the node process itself, instead of having it serve as a thin intermediary between the http service and a database/other backend services, you may get spikes of memory use well above that, but that should be avoided in any case, for multiple reasons.
And most of this 50-100 MiB will be shared if you run multiple node services on the same machine the old way. So you can run 6 node app servers this way, and they'll consume e.g. 150MiB of RAM total.
With docker, it's anyone's guess how much running 6 node backend apps will consume, because it depends on how much can be shared in RAM, and usually it will be nothing.
Only Java qualifies under your arbitrary rules, and even then I imagine it's trying to catch up to .NET (after all, Blu-ray players execute Java), which can run on embedded systems: https://nanoframework.net/
I listed some popular languages used by web applications that I happen to run dockerised. They are not arbitrary.
If you run normal web applications they often take many hundreds of megabytes if they are built with some popular languages that I happened to list off the top of my head. That is a fact.
Comparing that to cut down frameworks with many limitations meant for embedded devices isn't a valid comparison.
"just as well"? lmao sure i guess i could just manually set up the environment and have differences from what im hoping to use in productio
> 1GiB machine can run a lot of server software,
this is naive
it really depends on whether you're crapping out some basic web app or doing something that's actually complicated and needs higher performance than synchronous web calls :)
in addition, my mq pays attention to memory pressure and tunes its flow control based on that, so i have a test harness that tests both conditions to ensure that some of my backoff logic works
> if RAM is not wasted on having duplicate OSes on one machine.
Yes, it's exactly how docker works if you use it where it matters for a hobbyist - which is when you are installing random third-party apps/containers that you want to run on your SBC locally.
I don't know why people instantly forget the context of the discussion, when their favorite way of doing things gets threatened. :)
Context is hobbyists and SBC market (mostly various ARM boards). Maybe I'm weird, but I really don't care about minor differences between my arch linux workstation, and my arch linux arm SBCs, because 1) they're completely different architectures, so I can't avoid the differences anyway 2) it's a hobby, I have one instance at most of any service. 3) most hobbyist run services will not work with a shitton of data or have to handle 1000s of parallel clients
> Yes, it's exactly how docker works if you use it where it matters for a hobbyist
What you described is exactly the opposite of how it works. There is no reasonable scenario in which that is how it works. In fact, what you're saying is the opposite of the whole point of containers versus using a VM.
> when their favorite way of doing things gets threatened
No, it's when someone (like you) thinks they have an absolute answer without knowing the context.
And by the way, in my scenario, container overhead is in the range of under a hundred MiB total. The thing I'm working on HAPPENS to require a fair amount of RAM.
But you confidently asserted that "1GiB machine can run a lot of server software". And that's true for many people (like you), but not true for a lot of other people (like me).
> most hobbyist run services will not work with a shitton of data or have to handle 1000s of parallel clients
neither of these is true for me, but you need to take a step back and maybe stop making absolute statements about what people are doing or working on :)
I bought a Pi 500+ (basically a 16GB Pi 5 in a keyboard with a built-in NVMe hat) to use as a family computer; otherwise I agree. Unless you're planning on using it as an actual desktop, there's no real reason for that much RAM.
Browsers treat RAM as infinite, so if you want to open LinkedIn for whatever reason, you might wanna get a bigger model. I'd personally rather buy more RAM than I need than deal with the cost of fixing / working around the issue in the future.
No, you are not crazy. It's silly to try to use a Raspberry Pi 5 16GB (or an equivalently priced product) as a desktop workstation with a GUI on it when much better actual x86-64 workstations exist: ones with real amounts of PCIe I/O lanes, NVMe SSD interfaces on the motherboard, multiple SATA3 interfaces on the motherboard, etc., in very small form factors, the same as you'd see in any $bigcorp office cubicle.
It's an incredibly lopsided machine. The Pi 5 is decently powerful, but you really, really should not be attempting to use one as a desktop replacement. While it's theoretically possible, you are so much better off with a $50 used SFF PC.
Living in Korea where Samsung and SK Hynix are headquartered, the DRAM pricing situation is interesting from the supply side too. Both companies have been aggressively shifting capacity toward HBM for AI/datacenter use because the margins are 3-5x higher than commodity DDR5. The hobbyist SBC market is essentially collateral damage of the AI boom — manufacturers are rationally choosing to serve the more profitable customer.
Unfortunately I don't see this reversing until HBM demand plateaus or new fabs come online, which is 2-3 years out at minimum.
I remember my company buying RAM expansion boards for our PCs back in 1989 so we could run OS/2. The 4MB boards (MB! Not GB.) cost around $2000 at the time.
Like everyone, I love getting tons of RAM or SSD storage on the cheap; but we have a ways to go before we reach the 'unaffordable' level.
this whole saga is having ripple effects even in the second-hand market. in 2020-2022 there was a glut of those 1L mini PCs on eBay and other resellers, which were WAY better value than the RPi4 at the time (which was in short supply due to COVID). these mini PCs were pretty affordable and could be upgraded with extra RAM, a new SSD/NVMe drive etc to make perfectly good little home servers. I still have mine, which has been running for a few years now: Intel 6th-gen CPU, Lenovo ThinkCentre.
nowadays the price of these 2nd-hand mini PCs has shot up, and even if you do get a chance to get one, upgrading it with more RAM is gonna be painful
If you can't find mini PCs at a reasonable price, there's always old enterprise SFF desktops or even thin clients, which have very low specs (and are usually not upgradable) but can be loaded with a custom lightweight OS.
I know this is a pipe dream (governments of the world working together to benefit their citizens instead of blowing some other country's citizens up!), but if we aren't gonna regulate AI collectively to ensure we are developing it responsibly, the least we can do is ensure AI is given bottom billing when it comes to all the resources it's sucking up: energy, components, engineers, construction, etc.
My preference is responsible AI development that prevents it from turning into an arms race, but that's clearly not on the cards, especially with the current leadership.
The extreme DRAM market has had an unexpected side effect of triggering a lot of panic buying. I know several people who delayed PC upgrades for years but then panic bought new systems in this market. The trigger was seeing all of the "It's only going to get worse" and "This is the end of personal computing" headlines.
They're already regretting spending so much now that prices have started to tick downward.
I keep telling everyone: If you don't have a pressing need to buy right now, please wait 6 months and check again.
wasn't "panic" buy but I built a new comp early 2025, cuz at worst case would be complete supply crash and at best case it was going to be more expensive.
Def don't regret doing that, though I regret not springing for the extra RAM.
That's actually a reasonable response to market volatility and illiquidity. It's not just high prices, but prices that still fail to be representative of the actual market stance despite the rises.
It's only a spike if it comes down. Every RAM chip is a lottery ticket with a plausible chance of giving one lucky winner fabulous prizes like absolute dictatorship of the entire world and physical immortality. What else are the billionaires going to spend their money on? Arms races can absorb unlimited resources.
In January I bought a barebone ASUS NUC, which is relatively expensive among mini-PCs, but I need to run it 24/7 for many years, so I made a choice based on expected reliability.
After adding DRAM and SSDs to it, the barebone ended up being only 40% of the total cost, i.e. the price of the memories was 50% higher than that of the barebone computer.
At that time, memory was still cheaper than it is today, so now the ratio would be even worse. (The barebone NUC had an Intel Arrow Lake H CPU and cost $500, while 32 GB of DDR5 plus 3 TB of SSDs cost $750.)
Last month I "panic bought" a $999 Macbook Mini (32G) so I could run small models, Image Generation, and Voice synthesis on it. I don't think I regret it yet, despite the fact that you can get a 16G for $599, which is honestly a much more efficient price per Gig.
I think it is interesting that, at least thus far, Apple has chosen not to raise the prices of its computers despite the price of RAM presumably going up by multiples.
Tipping point for me: It will be a pretty kickass media server for at least a decade.
I panic-bought a MacBook Pro M5 Max with 128GB of RAM. I YOLO'ed because I don't think RAM prices will get better within 18 months, so this might be the last time we see "cheap" memory, even though the laptop cost me $5000.
Bought these Patriot SSDs recently. They are cheap, but they are DRAM-less apparently. I'm using them to replace the HDDs in 17-year-old hardware, so I figure that's not terrible; plus, the read/write speed is still high.
Got my RPi 5 16GB quite a while ago for around $160 and already thought that was expensive... It’s still powerful enough for almost everything I throw at it, honestly a bit overkill in most scenarios.
With prices steadily going up, it's starting to feel more sensible for me to repurpose the RAM sticks I've collected from old PC builds and laptops and just throw together small amd64 boxes instead of buying more RPis.
I wonder if there are low-power Intel or AMD boards that accept DDR3. So many sticks of 2/4/8GB DDR3 inside laptops going into recycling or landfills would do perfectly fine for low-power purposes. Hell, performance for standard workloads scales with access times, not bandwidth, and DDR3 sits nicely at CAS 8 at 1600 MT/s and CAS 10 at 2133 MT/s.
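Back-of-the-envelope check on the latency point: first-word CAS latency in nanoseconds is CL x 2000 / (transfer rate in MT/s). A small sketch (the DDR4/DDR5 rows are assumed typical retail parts, not figures from this thread):

    def cas_ns(cl: int, mt_per_s: int) -> float:
        # One DRAM clock covers 2 transfers, so clock period (ns) = 2000 / MT/s.
        return cl * 2000 / mt_per_s

    for name, cl, rate in [
        ("DDR3-1600 CL8", 8, 1600),
        ("DDR3-2133 CL10", 10, 2133),
        ("DDR4-3200 CL16", 16, 3200),  # assumed typical part
        ("DDR5-4800 CL40", 40, 4800),  # assumed typical part
    ]:
        print(f"{name}: {cas_ns(cl, rate):.1f} ns")

    # Prints roughly 10.0, 9.4, 10.0 and 16.7 ns: newer generations win on
    # bandwidth, not on absolute access latency.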
People hate AI. This will make them hate it even more. Yet somehow the market has convinced/forced us to use their products even though we might not want to.
A recent nationwide poll[1] shows AI has a poorer approval rating than ICE — ICE! — probably due to their overlords being "those" SV types. Everyday AI features are being shoved down our throats. I can't even choose to not install Gemini related apps on my Android when I select "which apps to install" when booting a new phone.
But people are a weird bunch. They largely don't buy products aligned with their values. No one is jumping up and down for Graphene phones, even if they had amazing privacy-first software. People buy 6mpg Hummers and iPhones for fashion, brand, money, convenience/function. The pain threshold of all the bad effects is still not high enough for people to quit these products in any meaningful way. Values and privacy are way down their list. I wish people would not buy/install AI-related features from big tech and would be more discerning, but that is likely a pipe dream.
Bought a couple of 32gb SBCs before this all hit the fan. And also built a SSD NAS before the wave hit.
So timed that all pretty great. What worries me is my desktop is up for a full new buy somewhere around early '28. That could be a train wreck depending on how taiwan situation goes
> So timed that all pretty great. What worries me is my desktop is up for a full new buy somewhere around early '28
That's a very specific date / timeline. How do you decide to do a full new buy? I ask because I own a desktop that I built 15 years ago which I was flirting with replacing completely last year, but unfortunately I didn't pull the trigger ... oops :(
My old rig is still going strong. The motherboard can only take up to 32GB of DDR3, though. The CPU is an Intel i7-4790K, which is still very fair today if you are not running a resource-hog OS (looking at you, Windows). Overall it is completely serviceable for my needs. Being honest with myself, the only reason I wanted to upgrade was nerd cred, but I don't game much anymore and don't do any ML tasks that require lots of local compute.
My PC is similar. I upgraded it to a 4790K a few years ago (the best CPU on that socket). What's funny is I also maxed out the RAM, because I realised two more 8GiB sticks were like £30, so why not. I thought it was a funny thing to do at the time, as I didn't really need that much, but I'm glad I did now. It's going to have to do me for many more years to come, but I'm fine with that. I don't game at all. I just have to hope nothing fails. I did build it on solid foundations: a good, overprovisioned PSU and an Asus mobo, so here's hoping.
Unfortunately I do also have server gear now as well. I'm going to have to really think about what I actually need now...
A 2GB RAM (and no eMMC) Raspberry Pi 5 in Canada is $90. Around $150 is where you can get a used N100 mini PC with a proper SSD and at least 8GB of RAM. It's crazy.
Single Board Computer. Yes, it might help to explain such acronyms in the article. And yes, if you think the name is slightly misleading, I agree: most PC mainboards today are also "single board computers" - you don't have to add any additional cards to get a functional PC (unless you consider RAM modules and M.2 SSDs cards).
Maybe you should have searched for "SBC abbreviation" instead, as it's not exactly a philosophical question. It's a TLA (spoiler: TLA stands for Three-Letter Acronym).
SBC (single board computer) is a fairly widely used term.
Yep. I just bought a Pi CM5 for my son, for his ClockworkPi uConsole. CAD $200 for the 8GB module. I bought a whole Pi5 16GB not long ago for under CAD $200.
I will not be buying any more SBCs at this price point. I wonder if Raspberry Pi will survive.
Funnily enough the consumer impact of DRAM etc. costs came up in an unrelated interview I was doing at Kubecon last week. She also made the observation that a lot of these big companies are buying components to keep in reserve for data centers that haven’t even been built yet.
The title should say: "Collusion of large corporations promoting LLMs with RAM manufacturers is killing the hobbyist SBC market (and bankrupting anybody trying to get a PC or laptop)".
Because we all know that DRAM prices have spiked since production is going to those infernal chatbot training data centers. Same as a lot of the electricity in some parts of the world, BTW.
Can you elaborate on the collusion aspect? Is the implication that OpenAI and Anthropic are coordinating their purchases in such a way that they target the hobbyist market? What’s the collusion angle here?
That only works so long as you eventually pay up... well, unless the manufacturers make too much money this way. That said, are there some Chinese manufacturers that aren't part of the cabal and could undercut them?
Except that it doesn't work like that. If you buy DRAM and don't do anything genuinely worthwhile with it, you'll ultimately dump it all right back onto the market, and everyone knows that. The biggest worry is that it's actually OpenAI and their direct competition starving the rest of the market because they predict AI research and the like to be a highly valued use for the stuff, compared to building gaming PC battlestations or whatever the highest-valued use was before. Many observers think that this will also happen with GPUs and cutting-edge digital logic more generally.
> We all know that DRAM prices have spiked since production is going to those infernal chatbot training data centers
I know it's very fashionable here to talk about capitalism as some hand-washes-hand big corp organized scam, but if you put that ideology aside for a moment, you contradicted yourself here, I think.
I personally don't like conspiracy-theory-thinking. If I was a DRAM manufacturer and had to choose between servicing a single customer, who orders hundreds of millions worth of my product, or service a very large number of customers who order tiny amounts of the product a piece, then of course I would focus on the large client, because they are easier to service for the expected profit margin. I wouldn't even need to think about advertisement, sales, all that jazz. Looking at it from that perspective, it seems pretty logical to me that a spike in demand from datacenter operators would rise prices dramatically. I struggle to see room for collusion / conspiracy here.
A couple of issues: first, there is a history of price collusion (see the DRAM price-fixing scandal on Wikipedia), and second, while it may be "logical" from a seller's point of view to prefer large orders, this upsets a lot of people and used to be illegal in the United States (it may still be illegal, but it's not enforced).
There is a risk of having a single large customer. As a small food manufacturer we've been warned about it, like to not sell to Walmart even if given the chance.
If one customer buys a majority of your product, your entire business is at their mercy. They can dictate terms, or quit buying from you which can end your business.
So even with RAM - if a company goes all in on RAM for an AI company, what happens when the AI bubble bursts, or the AI company spins up/buys their own RAM factory and quits buying? Did you make enough money to tide you over until you can regain your old customers that have gotten used to not being your customer?
I didn't say "Criminal conspiracy" nor "Capitalism is bad" (although I'm not a fan, and not because of the DRAM price spike). What mean by collusion is that OpenAI apparently agreed with Samsung and Hynix to secure 40% of global DRAM output, for their own exclusive use.
Most software uses 10x more memory than is necessary to solve the problem. In an ideal world, developers would stop building bloatware if their customers can't afford the DRAM.
I agree. OTOH, there are many very cool things we could build if we were able to assume a user can spare 2GB of RAM, things we'd otherwise have to avoid entirely, like 3D scenes with Three.js or in-browser video/photo editing. We should be making sure that extra memory enables genuinely richer functionality, not just compensates for developer laziness (there are fewer excuses than ever for that).
After discovering Dell Alienware clearance and graphics card availability in those Alienware computers, I haven't felt the need to build a computer for the last five years.
It's good that OpenAI is failing to meet its obligations on hardware, but given what we know about the DRAM industry, I suspect drastically higher prices will be the permanent new normal, just like with most everything else.
I've been having fun getting Linux 7.0 running on my Milk-V Duo S. It's still available super cheap (though tariffs make buying single quantities expensive), so I stocked up on Duo boards. I guess I'm hoping for an upside where there's more interest in cheaper overstock boards from 2022+.
It's terrible. Fake money is fueling the exhaustion of real resources in search of questionable outcomes ("AGI"). Imagine if all of this money were invested in curing cancer.
Let's also imagine an alternative reality where some reasonable percentage of the $2.5T in current year AI spending was instead invested in the "general intelligence" researchers we already have for the same purpose. I think it's a pretty reasonable expectation that 1) they'd probably make more progress and 2) that money would help a lot more people in the process (through jobs and economic activity).
Obviously there's no "evidence". Why would you even think we need AGI? But I'm happy to hear your reasoning if you were one of the few (only?) people who imagined that software that could predict the next word could do what it is now doing.
I think his broader point is that life preservation doesn't seem like such a big win if overall quality of life is dropping to the point where people decide to not subject their potential children with the burden of living.
I've already seen at least one person who was pretty sure that the preprint paper they co-authored with AI (read: AI wrote for them) was going to cure cancer and make them billions of dollars.
There was only one problem. The paper jumped straight from "this paper will show how our new treatment cures cancer forever" to "as you can see, these results clearly show that our treatment cures cancer" - with neither any actual results nor any specifics on the treatment. And I don't just mean that the paper didn't go into details; writing the paper was the full extent of their "research".
AI was used extensively in COVID vaccine development, and AI is used in research for all modern drugs. It's a certainty that if cancer gets cured, AI will have played a fundamental role, since it's already fundamental to the precursors.
The main cost input is presumably the RAM; they are passing it through.
If everything on the board but the RAM costs $30, and RAM is going from $10/GB to $20/GB, then they have to move the price from $50 to $70 to break even on the 2GB board, and from $190 to $350 on the 16GB board.
In other words, the raspi is now priced like a stick of ram with a bonus computer attached because ram is massively more expensive than the rest of the computer.
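A quick sanity check of that break-even arithmetic (the $30 non-RAM cost and the per-GB prices are the parent comment's assumptions):

    def board_price(ram_gb: int, ram_per_gb: int, non_ram_cost: int = 30) -> int:
        # Break-even price = fixed non-RAM cost + RAM at the going rate.
        return non_ram_cost + ram_gb * ram_per_gb

    for gb in (2, 16):
        print(f"{gb}GB board: ${board_price(gb, 10)} -> ${board_price(gb, 20)}")

    # 2GB board:  $50 -> $70
    # 16GB board: $190 -> $350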
> memory prices won’t remain at their current very high level indefinitely; the circumstances in which we find ourselves are challenging, but in the future they will abate.
how long does it take to increase manufacturing capacity? how long will the decision to increase manufacturing capacity be postponed? if AI skeptics are right and the bubble bursts, increasing capacity inordinately will prove a big mistake. if AI skeptics are wrong, delaying the increase of capacity inordinately will prove a big mistake.
In a sense we are forcing DRAM manufacturers to play the judge, jury and executioner:
If they don't increase capacity corresponding with AI boom, the DRAM prices may ultimately cause an AI winter.
If they do increase capacity (lowering per unit costs), the lower DRAM prices may enable AI summer to continue.
This looks like a self-fulfilling prophecy scenario.
The barrier is the US demand that nobody sell semiconductor manufacturing equipment to China.
Otherwise, China's memory manufacturing companies would be happy to exploit this opportunity.
Actually some Chinese companies already sell cheap DDR5 memory modules, but their production capacity is severely limited by the US blockade, so the cheap memories are available in few places, mainly in Asia.
So the high memory prices are caused by the USA in two ways: by the AI companies that have bought up most of the existing production, and by the US government, which has been sabotaging the Chinese memory vendors for the past couple of years in order to protect the market share of Micron. (The US sanctions coincided with the moment when several companies, including Apple, intended to use the cheaper Chinese memories, so preventing that seems a much more likely reason for the "sanctions" than the BS excuse that consumer DDR DIMMs and SSDs are dual-use products that may benefit the military. Even if that were true, the US sanctions did not at all prevent the Chinese from producing anything needed in small quantities, as for a military application. The sanctions have prevented only the mass production of devices using state-of-the-art lithography, which would have impacted prices in consumer markets.)
Startup costs measured in the billions, with no guarantee of success, and a long payback time horizon in a market that almost everyone thinks is - in one way or another - a bubble.
Oh yeah, the market is also getting intense scrutiny from powerful geopolitical entities that are quite explicit that they don't believe in fair play or consistent, stable rules.
We're somehow in a race between LLMs curing cancer, destroying the planet by "You're right to be mad, I shouldn't have issued those launch codes, it's even in my Claude.md file, I'm sorry," and rendering modern technological civilization uneconomical. I know this is statistically the best time in history to live, but lord, I could use a vacation.
Is there anything (technically) preventing SBC manufacturers adding SODIMM slots?
I was expecting the Milk V Titan to avoid this memory nonsense since it has two unpopulated DDR4 slots, but it has fallen off the radar like several other SBCs.
SODIMMs are huge compared to a BGA memory package which is a problem if your goal is to minimize your board size (e.g. I don't think there's a reasonable way to fit it into a Raspberry Pi form factor without something weird and expensive like a mezzanine connector). Routing the signals is also somewhat more annoying because they all come out of one edge of the connector compared to a BGA package which has them fan out in every direction, giving more space for length matching traces, etc. You'll likely need additional PCB layers compared to a BGA chip.
Unless you're really using the GPIO pins or other weird I/O, I really fail to see the purpose in having an 8GB or 16GB RAM Raspberry Pi (at a much higher price than it used to be) as a desktop workstation with a GUI on it.
The idea of putting sixteen gigs of RAM in a Raspberry Pi is nuts. The legit things you want to use a Raspberry Pi (or a competitor) for, as an embedded headless box with no KB/mouse/display attached, should run fine in 2GB of RAM or less, assuming an ordinary Debian-based OS environment.
I would much rather have a used, ex-corporate/ex-lease, small form factor or ultra small form factor x86-64 desktop PC (Dell, HP, Lenovo, whatever) with 16GB of RAM in it and an SSD on a SATA3 or NVME interface. Whatever is the "best" SFF that you can buy via huge eBay used equipment dealers on any given month.
Despite being many years old, whatever you can buy on ebay for 200 bucks (at least before the recent RAM fiasco) with some recent-ish quad core core i5/i7 or Ryzen in it will run circles around a raspberry pi 5.
A few, until their current stocks run out. Orange Pi has already increased prices (their boards are now similar in price to, or more expensive than, the equivalent Pis), and Radxa seems to just stop selling certain models (at least in NA) once they run out of stock.
Arduino has one of the cheapest 4GB boards now, but I wonder if that's just because they made a ton of them and demand for their strange board has been low?
This is a good thing. Pis were priced too low for OEMs and too high for hobby work; it's no longer an accessible board for fledgling hackers. Reclaim old hardware for your nephews instead, which is good for the environment, too.
Not everyone earns tech-bro salaries and can sustain a thousand cuts. Many hobbyists are scraping and saving money to acquire hardware. For some it very well may be the end of their world.
We are talking about brand-new, latest-gen hardware here. People with low budgets are always scraping and saving for deals and don't need to buy something brand new from a pricey brand name like Raspberry Pi.
You can still jump on eBay and buy all kinds of dirt cheap used pieces of hardware.
My buddy just bought a used ThinkPad T14 with 32GB of RAM and 1TB of storage for about $500. You can get by with a whole lot less.
In this context, I will also present the idea that the Raspberry Pi has represented quite poor value for money for many years now.
Have you looked at how expensive international shipping is? eBay covers just a few countries; the rest of us can't buy there because we'd be paying 10 times the cost of the hardware to get it over here.
I already moaned about this recently, but to briefly reiterate: the only hardware that's becoming available for most people in my region are Frankenstein desktops built from heavily used 10+ year old Xeons running on suspicious motherboards made by obscure Chinese manufacturers you've never heard of. This is pushing ever more people towards smartphones and away from actual computers.
But at least we got the bullshit machine in return, that's something, I guess.
> Have you looked at how expensive international shipping is?
It really shocks me how bad shipping has gotten. It's nearly unaffordable to buy things on eBay from the US as a Canadian due to shipping costs, so I can only imagine just how bad it is for people from other countries.
It's probably unaffordable for anyone to buy things from the US due to shipping costs, because the Trump administration has completely screwed up everything there with tariffs and mismanagement of the USPS and more. But the US is not the world. A better comparison is how much it cost to ship things from China a year ago compared to today.
> Frankenstein desktops built from heavily used 10+ year old Xeons running on suspicious motherboards made by obscure Chinese manufacturers you've never heard of.
I've heard reports that these are actually surprisingly good. I wouldn't want to use one in a production environment, but for homelab stuff they're an incredible deal.
Sometimes goes the other way. I was recently looking for a specific PC case (Fractal Design Torrent Compact without a window) and it's entirely unavailable in North America.
Placed an order with a Polish seller on eBay, received a message that Fedex wouldn't take the package due to size, replied that they could send with any shipping company and that I'm not concerned with shipping speed, after which they cancelled the order on me.
Yes, 90%+ of sellers refuse to ship here (and we're not even under any sanctions and/or political pressure of any sort). I hear about these magical $100 Thinkpads all the time; I've yet to see anything cheaper than $300 (add another $100+ for shipping).
We need to come to terms with the possibly-irreparable harm that private capital has done to the West. Capital is long past serving the interests of the broader public; but we're now past the point of capital serving the interests of the corporations it's being invested into. The demand for shares in OpenAI and Anthropic is so high that it's pushing their valuations into territories they can never hope to drive enough revenue to fulfill; the cycle of this massive warchest of private capital inside the AI industry has for all intents and purposes created a communist economic structure, with all of its faults. Grifting, favored suppliers; if the stories about SK Hynix and Samsung guaranteeing 40% of their wafer supply to OpenAI on a letter of intent they cannot follow through on are true, we're even getting good old-fashioned communist misallocation of resources. The day may eventually come when the USG is faced with the decision of bailing out the trillion-dollar OpenAI Corporation, taking a stake to add to its portfolio next to Intel and others; maybe normies will then realize what is happening, but the writing has been on the wall for years.
I love capitalism; its ability to allocate resources on a macroeconomic scale, picking winners and, more importantly, losers, has no rival system. As a younger, more naive startup employee, I'm on the record making a total fool of myself by responding to our CEO talking about struggling to find PMF with "then maybe our company doesn't deserve to exist" (yeah...). But the "capitalists" who run the world aren't actually interested in capitalism, and thus definitionally can't be capitalists. At least once upon a time we had filthy rich titans you could look up to, like Buffett and Gates (Epstein stuff aside); at this point most of them aren't even enviable people. Despite being richer than God, people like Huang, Musk, Ellison, and Zuckerberg feel more like vampires; they want to spend their whole lives doing the exact same thing, getting richer and richer, refusing to put a ladder down for anyone else to take a shot at improving on what they've built. I actually have a modicum of respect for Bezos and, to whatever vanishingly small degree, Trump; at least they're trying something different.
Grabbed up as much RAM as they could, nearly no questions asked, at above-market rates in some cases, ramping up the perceived demand and decreasing supply significantly.
The SBC market's been on life support for a long time. YouTubers making videos about them don't seem to grasp that and keep pumping out reviews and projects like it's still 2019. The Pi specifically has plummeted in popularity, and for most use cases they just aren't a cost-effective option when second-hand micro PCs are dirt cheap and vastly more capable.
I don't think comparing new Pis to used micro PCs is fair. Compare a _used_ Pi with a used micro PC. If you have any geek friends, it's probably not hard to find a used Pi for free.
Imagine if there were a universal means to attach external devices, perhaps one of these external devices could handle GPIO. You might even call it a "Universal Serial Bus."
In the Dwarkesh podcast with Semi-Analysis's Dylan Patel they forecast the phone market will shrink by 50% this year because of RAM prices:
But that’s the high end of the market, which is only a few hundred million phones a year. Apple sells two or three hundred million phones annually. The bulk of the market is mid-range and low-end. It used to be that 1.4 billion smartphones were sold a year. Now we’re at about 1.1 billion. Our projections are that we might drop to 800 million this year, and down to 500 or 600 million next year.
We look at data points out of China from some of our analysts in Asia, Singapore, Hong Kong, and Taiwan. They’ve been tracking this, and they see Xiaomi and Oppo cutting low-end and mid-range smartphone volumes by half.
Yes, it’s only a $150 BOM increase on a $1,000 iPhone where Apple has some larger margin. But for smaller phones, the percentage of the BOM that goes to memory and storage is much larger. And the margins are lower, so there’s less capacity to even eat the margins. And they have also generally tended not to do long-term agreements on memory.
Why this is a big deal is that if smartphone volumes halve, that drop will happen in the low and mid-range, not the high end.
This is an extinction event for the low-cost cell phone companies. How are they going to survive if they can't sell their $100 phones profitably for 2 years? I think many of the low-end companies will simply sell their allocations of RAM and close up shop.
This is my greatest concern. So many small players will be wiped out. Consolidation is assured. Always great for consumers to be under the thumb of increasingly large companies.
Are there really that many small players? Aren't they all just subbrands of Xiaomi, Lenovo and Oppo?
There are a bunch of subbrands but there are also a lot of genuine small Android phone companies, especially in China.
Some of these serve some interesting niches that might now disappear due to this DRAM supply issue, e.g. Unihertz for extra small phones or CAT for extra durable worksite phones.
Is there any 'guide' to this ecosystem...because 'odd niche communications gear' is always interesting.
Transsion
Will it be such a big deal though? Currently people are swapping out their phones for another model that's exactly the same but with a different number at the end of the name every 12 months. This could just mean that the unnecessary churn dies down a bit, and companies taking advantage of it have to find a new line of business.
> Will it be such a big deal though? Currently people are swapping out their phones for another model that's exactly the same but with a different number at the end of the name every 12 months.
I don't think they do that at the low-end (nor the high-end, though that doesn't matter here - higher-end manufacturers have a small margin they can eat into). People on the low-end phones want a new phone, they just cannot afford it!
Even in the mid-end: If you buy a phone which you find to be decent, but affordable, and are not out for chasing the latest gimmick - there is no reason it would not last you 6 or 8 years easily - before applications start assuming the presence of better hardware, or a newer Android version than you have etc. Naturally you will have to protect it from physical damage, and maybe replace a battery at some point.
But why? I just replaced my Huawei P20 after eight years and only because nobody cares about app sizes and compatibility.
Because the phones stop working well? I write part of a post, open another tab to go look up some information, come back to the post and what I've written is gone, because the memory got dumped. That's the reality of using an old cheap phone.
You just answered your own question.
There's also the issue of phones occasionally getting broken, of course.
And would you consider yourself representative of the phone-buying public in general?
My desktop PC is from 2008 but I'd never consider this to represent anything like common usage. In fact it's so unusual that I get to point it out in posts like this.
This comment makes no sense to me. I exclusively use very low-end phones from Xiaomi. I buy a new one roughly every two years. Each new phone has a better screen, camera, CPU/GPU, charging, and sometimes more RAM/storage.
Take a look at a comparison of the iPhone 17 and the iPhone 12:
https://www.apple.com/iphone/compare/?modelList=iphone-17,ip...
Is the newer model better? Sure!
But it had 4k 60fps video, optical image stabilisation, a "super retina display" etc five generations ago. The specs have kept improving, but it's not a quantum leap in performance.
iPhone 12 wasn't low end
The same applies at the low end; the grandparent comment even agrees.
You buy a new phone every two years, it comes with a camera, a cpu, a gpu, a host of sensors. Same as phones did two years ago, and ten years before that.
I don’t use my current smartphone in any way that differs from the iMate PDA2K I had twenty years ago.
https://www.gsmarena.com/i_mate_pda2k-962.php
How often does your browser freeze up when you open a webpage? How often does your phone browser dump its memory when you switch to another tab and then switch back? E.g. if you were writing a post and opened another tab to go check some fact, the post in the original tab gets deleted.
Because that's what happens if you use an old cheap phone in the modern day.
I even had a phone that would occasionally just crash when on a heavy website and the onscreen keyboard popped up. That was not at all infuriating!!! Especially when it would crash when I try to refine a Google search.
Your comment also makes no sense to me. I have used exclusively very low-end phones from Xiaomi for six years, and I change them only when they die or when I can't run my apps (I'm afraid mine won't last two more years). Before this I kept my first smartphone (an iPhone 3GS) for 10 years.
This is a huge problem for developing countries. Most people here have $100-$200 phones. An iPhone is a luxury.
Forget developing countries; an iPhone is a luxury even in some European countries, where rent is 500+ euros and your take-home pay is ~1,000. After all the other bills you're not left with iPhone money, which is why 100-200 euro models of Chinese brands are doing so well.
It's easier to name the countries where iPhone ISN'T a luxury, as you can count them on very few hands.
Well, develop more then.
So funny to see my theory confirmed; this site's quality has really gone down.
Many countries would develop much faster if they weren't bombed or kept down by puppet dictators installed by (economically) developed nations (the USA and France keep doing this intensively, while countries like Germany don't mind supporting fascist states). (PS: I'm not woke, not even Marxist.)
What country would be the stereotypical example you are thinking of? I can't come up with any.
From the US point of view only:
Historically, most of Latin America.
Very recently: why was Venezuela attacked by the US?
Latin America isn’t a country, and Venezuela wasn’t developing in any sense of the word.
Also, why are they trying to do a genocide in Cuba?
What current fascist states is Germany supporting?
How could countries like Switzerland, Sweden, Norway or Germany get ahead?
>Many countries would develop much faster if there weren't bombed
Like Germany 1945?
Won't matter - if enough people in developing countries can afford iPhones, Apple will just raise the prices.
Thanks a lot, Sam Altman / OpenAI. Their little $100bn war chest being used for obstructive / destructive purposes will wipe out multiples of that amount via economic ripple effects. All in an attempt to keep a stranglehold over AI via competitive resource starvation. Basic.
> This is an extinction event for the low-cost cell phone companies. How are they going to survive if they can't sell their $100 phones profitably for 2 years?
This is a great thing to happen, actually. Those phones are all essentially trash that ends up in a landfill within a year or so. They should not exist at all.
Smartphones are widely available on the used goods market though, perhaps even more so than second-hand SBCs or old PCs. The "low and mid range" can be filled by the former high end.
My Samsung Galaxy S3 died after 8 years. eMMC failure. Just started boot looping while I was asleep. Everything gone. Known issue.
My Samsung Galaxy S8 died at 7 years. Some kind of thermal failure; I was able to recover my data by keeping the phone in the freezer while I copied it off. Known issue.
My Samsung Galaxy S21? I figure I've got another year or two in it before it, too, dies.
Having beautiful dead phones that have never had a broken screen or a hard drop is pretty depressing.
I bought a used Samsung and it started boot looping almost immediately; all these issues seem very specific to Samsung.
The Seagate of cellphones
>I bought a used Samsung and it started boot looping almost immediately
Maybe that's why the previous owner sold it.
I am noticing something those devices have in common.
My Galaxy Tab also has a dead eMMC. My HTC One M8 still works and even holds a day of charge. Too bad Android doesn't support 32-bit ARM anymore.
It can also depend on the hardware it's connected to. If the endless gigabytes of Samsung's value-add software are scribbling to eMMC nonstop then it's not surprising the flash is wearing out. A lot of this stuff is masked by the fact that most people swap out their phone for a new one that's exactly the same every 12 months so they never notice this, but if you hold onto a phone or similar device for longer the unnecessary wear starts to add up.
Google Android should get more praise for doing quality control by analyzing and killing apps and processes that attacked the hardware - at least back in the day.
The great filter for incompetence by the big G was real and necessary.
The S3 ran Samsung software for about, oh, a month after I bought it brand new?
I'd been running CyanogenMod until they quit giving updates to the S3.
Yeah, the flash has a wear lifetime. The battery has a finite lifespan too. Anything over five years is pretty good going. My wife managed that with a Nokia 1020, the last and best of the Windows phones.
Like everything else, phones need to be backed up.
Never. I am very gentle to my phones. Thing has one small dot of a scratch on the screen. Never been opened.
I just replaced my OnePlus 5 a couple of months ago at over 8.5 years old. No repairs needed; the battery was a bit crippled in active use, especially making calls, but fine for a mostly idling phone. In idle it still lasts longer than a 1.5-year-old iPhone 15. I still use it for my backup phone number's SIM, as it slowly gets to ~9 years old.
The bigger issue was no more OS updates since 2020, and no Play updates since 2023. The battery can be replaced but getting a fully updated OS is more involved.
OnePlus 5 runs great with custom ROMs, including potentially ones based on mainline Linux as opposed to AOSP. (The Linux support is not as good as OP 6/6T but getting there pretty nicely.)
Too bad they have these long lists of "this doesn't work so well" and I'm too time constrained to troubleshoot for too long or dig for solutions. And I'd also need to replace the battery. It's an option for when I actually have some time.
The device integrity is a bigger deal; this is also a backup for some banking apps, so if they don't work it kind of defeats the purpose. I removed all other apps to minimize the attack surface.
If you're using it as backup for banking apps and the like I totally get not running a custom ROM on it! But you could also set that backup on something even cheaper, any one of the random not-bootloader-unlockable brands, and be left with the OP5 as a Linux phone. You're also right that the Linux support on OP5 is not up to standard yet, this is more of a question for the future if that support improves.
What's cheaper than an already existing phone that would otherwise stay unused or end up in the landfill (recycling center)? It could also be a great experimentation platform, play with Linux on the phone, but the time I have available now leaves little room for this kind of play.
The goal is not "experimentation" but having it eventually as an always up-to-date daily driver once the support for it matures. You're quite right that we're still a bit far from that, though.
My Galaxy Note 8 is still going as my main daily music player/backup phone.
My Galaxy Note 4 still works. Had to sideload updated web certificates.
My Galaxy S1 would still be going, but somebody got the charging port wet.
iPhones usually live a pretty long life.
If you're not a phone power user, you can get by on old low end stuff. When my pixel 4a died of a bad screen crack a couple years ago, I replaced it with a random used 4a on ebay for $80. Two years later and it's still completely fine for all my purposes (texting, phone calls, chrome browsing, tolerable camera, etc.), although I still haven't accepted google's deal for a free battery swap yet from sheer laziness. I've learned that I can accept a 90 minute screen-on phone battery, though it's an odd adjustment to make. Again, not a power user.
The free battery deal ended in January, but you're likely better off: mine ended up with a damaged screen while being transported for the mail-in (because all local stores stopped doing the program), and they wanted to charge me an extortionate price to fix it. Support was useless.
> The "low and mid range" can be filled by the former high end.
With the 4-7 year support window on Android? Maybe that's why Google is trying to kill off Graphene et al.
>Smartphones are widely available on the used goods market though, perhaps even more so than second-hand SBCs or old PCs. The "low and mid range" can be filled by the former high end.
When new cars got more expensive, used cars got more expensive, too. I expect the same to happen with the phones.
When it becomes clear that the insanely expensive AI data center orders are not going to be filled, we can expect a huge reversal in the price of RAM and GPUs. There are 241 GW of orders in the pipeline, but only a third of that is under construction, and of that third, even less is being quickly finished and brought online. It's estimated that just 3 GW came online last year.
https://www.wheresyoured.at/the-ai-industry-is-lying-to-you/
Don't count on it. There's a lot of money in killing other businesses, or even just keeping prices high. Even if the high prices are an accident, there is always someone looking to take advantage of any situation for profit.
I have to agree. You only have to look at car and junk food inflation after COVID.
The prices make no sense, but that doesn't matter: they got away with it and are fighting to hold onto high prices even as consumers balk. Their solution? Ditch poorer consumers. New cars and (branded) junk foods are luxury items now, apparently.
Somebody needs to redub this with Hanoi Hannah from that video game as the voice of the AI industry.
My iPhone 11 hanging on for dear life…
I replaced my old iPhone XS when they stopped making iOS updates for it. I was also curious what the new ones could do.
Turns out not much more.
I switched from an 11 to a 15 because my friend got the 14 and it took amazingly stabilised videos while snowboarding.
Photos taken in the dark also became much better.
The default photos app can find words inside photos and translate without going on the internet. My wife’s 13 couldn’t even though we had the same OS.
I’m sure there’s more that I don’t even know about.
And it does the word search and image recognition locally, which is the killer feature for me
Anyone can do it in the cloud, but I don’t want them looking at my photos
The only two changes that matter to me are: no more iPhone Mini and no more hardware switch to mute the phone. Instead they got a new "thinnest iPhone ever" that's actually thicker than my 4+ year old one when measured honestly.
I don't see myself replacing mine any time soon.
HODLing my iPhone 8 here. I can’t use a lot of apps, but Venmo and Lyft work on it still.
I had an iPhone 11. It was a good phone. It started giving up in early 2024. I held on with poor battery life until the new iPhones that year and bought the 16 Plus. I'm glad, actually, because they've discontinued the Plus models, annoyingly.
But I'm glad I don't need to upgrade for the next couple of years. I honestly want to get 4-5 years out of any phone going forward. There's basically no difference between models 12 months apart.
I can see the prices going up this year. It's already happened to the PS5, which is basically unheard of.
It really sucks more because the reason for it--AI--is just so godawful and pointless.
I bought my wife an iPhone 11 Pro Max in 2020, and, knock on wood, outside of replacing the battery it has been going on like a champ.
I've offered to buy her a replacement phone but at this point I think she's kind of curious as to how much life she can get out of it.
I have an iPhone 13 Pro Max; I bought it in 2023 but it was a refurb so I don't actually know how old it actually is. Regardless, it's still going strong, and I am hoping it can last through whatever RAM crunch is going on.
>But I'm glad I don't need to upgrade for the next couple of years.
Said the user who didn't learn the lesson.
With Apple, you do not own anything. If Apple wants to release an update next month that makes your current phone useless, there is nothing you can do to prevent it.
Apple was caught hacking battery levels, hacking users' GPS signals, etc.
You don't own an Apple device; Apple owns you!!!
What? Apple support their phones for a very long time. I got 8 years out of my previous iPhone.
I had an iPhone 12 and just upgraded to a 17 because I don't see any point waiting - prices will just increase.
> It really sucks more because the reason for it--AI--is just so godawful and pointless.
Strong disagree.
AI is the best thing I've seen in 30 years working in software and expensive RAM for 2 years is a price I think is worth it.
> AI is the best thing I've seen in 30 years working in software and expensive RAM for 2 years is a price I think is worth it.
I think generative AI is pretty neat, but I'm not sure it's worth the RAM increases. I use Claude like everyone else does, and it's cool, but I am a little concerned at how much absolute low-effort crap is being produced with it.
It has made YouTube considerably worse; there was already a lot of low-effort shit flooding it, but now it's almost cartoonish. A lot of the videos I'm recommended have thousands of views and give a kind of facsimile of a video made with "effort", only for me to notice, about a quarter of the way through, a bunch of AI tropes in the writing and/or the visuals. It has made the already-mediocre experience of YouTube actively bad.
I am also not convinced that the prices will go down after two years. We already have big memory vendors completely leaving the consumer market, and we have these AI companies buying literal years worth of entire production lines of RAM chips.
This is something that could be solved by competitors jumping in to fill the niche, but it takes a lot of time to build new factories for this stuff, I think more than two years.
Listen to the podcast I linked: two years is the ramp-up time for new memory factories (which are currently being built).
I do not see any linked podcasts.
https://www.dwarkesh.com/p/dylan-patel
And if the AI bubble is still going then with datacenter construction going as planned, the RAM shortage will be even more extreme than today despite higher production capacities.
New Chinese vendors are scaling up, seeing the opportunity for large profits. Prediction: in 5 years half the market will be Chinese brands
Assuming the bubble doesn’t pop first.
It’s amazing how quickly people forget why we only have a handful of DRAM manufacturers in the first place.
Hanging in there with you. It's a great phone.
I would not call any Apple phone even mid-range, and certainly not low-end?
This is coming from an Android user who recently bought iPhones for my daughters. Paying $600 for not even top-of-the-line phones does not scream low-end to me.
My company-provided work phone is a base model iPhone, I'd definitely put it in a performance class lower than a flagship from any brand. Certainly not low-end, but I think mid-range would be a fair characterization.
I wonder if manufacturers will opt to install less memory.
Or optimize the OS, because I still find 8GB insane for everyday tasks. OK, gaming I can understand, but most common tasks should be runnable with at most 2GB of memory, and that's mostly for browsers.
Optimizing the OS won't do anything about shrinking sales when the spec sheet changes.
It's going to be interesting when the big AI bubble, via pricing, directly attacks the government by preventing the sale of surveillance devices. Then the bubble will pop: not because it cannot sustain itself, but because its existence is adversarial to government demands for surveillance.
Government will bail out the big tech companies related to the AI bubble when that pops, that's why they're all massive donors to TRUMP INC.
Just checked my Amazon history, and in late 2020 I bought two Raspberry Pi 4s with 4GB memory for ¥6,500 JPY (~$62 USD) each. At the time, they were in somewhat short supply and I paid a little over the $55 list price from a reseller on Amazon.
It looks like the current price on Amazon for the Raspberry Pi 4 4GB is ¥18,800 (~$117 at current rates), which is indeed expensive AF. Oddly, the Raspberry Pi 5 4GB is priced about the same, at ¥18,950 (~$119).
Considering inflation and the speed increases over the 4, the Raspberry Pi 5 price doesn't seem too unreasonable to me. But having the price go up well over ¥10,000 definitely takes it out of the realm of impulse buy and more into something I would only buy if I had a specific and urgent need. So I can definitely see this killing off a good chunk of the hobbyist market.
As it stands, my two older Pis are currently sitting unused in a closet, so I would definitely try to use those before buying anything new.
My big regret at the moment is not buying a 4TB M.2 SSD last year when prices were dipping below ¥30,000. Now they have more than doubled to ¥65,000 or more. I had one in my cart, but decided not to buy it with the rationale that "well, I still don't need the space right now, and the price per TB will probably come down even further by the time I do need it". That is, after all, the way prices on computer components have worked for most of my life.
I bought a pair of 4 TB SSDs for like $300-350/ea two years ago. I don’t remember exactly.
Around Christmas I tried to order one more. They wanted well above MSRP, like $500. Given the price of everything else, I decided to just bite the bullet and do it.
After about a month they canceled my order. Whether that’s because they didn’t actually have one and couldn’t get one, or because they just wanted to wait for the prices to go up further I don’t know.
I went looking again two weeks ago. The exact same drive is back in stock. MSRP is now $1000. Amazon has it “on sale” for $900. Other retailers that often have slightly higher prices are asking $1250.
That’s 3-4x price increase in 2 years.
I built a new desktop in 2023 and repurposed my old desktop for my daughter. The old desktop had a couple of smaller SSDs so I swapped them out for a 2TB Samsung SSD. Paid $99 on Amazon.
The exact same SSD is $479 on Amazon today. It's not a fancy super fast NVMe. It's a slow SATA drive. I have no idea why anyone would even consider building a PC with prices this inflated.
> I have no idea why anyone would even consider building a PC with prices this inflated.
I did recently, specifically targeting lower capacities for the components that have been increasing (RAM and storage).
It didn’t seem like prices would be going down for a while, and I didn’t have a desktop PC otherwise, so I just went for it. We’ll see how it all plays out, but I don’t think it was a terrible decision; as long as prices stay high for a couple of years, it still makes sense to just suffer through the increases.
Mine were SATA too!
Funny, I make cameras around the Pi ecosystem, and 4GB of RAM is pretty overkill; might as well put an LLM or vision model on there while you're at it.
I did think that for more basic cameras I'd use a lower-spec Pi, like the 1GB model; I use the full Pis now for the high-resolution DSI displays.
> The price increases bring the 16GB Pi 5 up to $299.99.
Meanwhile, a refurbished corporate laptop with 16GB RAM and a 512GB SSD can be yours for $199 [1]
I'm sure there will still be people who want the Pi 5 but at these prices, I ain't one of them.
[1] https://www.ebay.com/itm/327079631563
Those will dry up soon enough. Corporate laptop refreshes will be drawn out as companies try to save costs in the face of the increased prices.
You'd also better hope the AliExpress crowd doesn't figure out a way to get the RAM out of those things, because they will start harvesting it for sure if there is money to be made.
> Those will dry up soon enough.
We're talking about a Pi replacement. The Pi 5 is slower than a 10-year-old laptop. That gives us a very vast pool of used laptops.
> You also better hope the aliexpress dont figure out a way to get the RAM
That is a real worry and I can see used machines being gutted because selling DDR3/4/5 sticks is way easier and profitable than the whole machine. Adapters for SODIMM to regular DIMM are readily available and cheap, too.
> Those will dry up soon enough.
And worse, they're shucking surplus for RAM and SSDs now. I am seeing more and more eBay auctions for surplus PCs sans SSD and RAM. So the second-hand market is going to be invaded by reseller parasites, leaving us with $50 CPU-in-a-box and $500+ RAM/SSD parts.
Windows 11 requiring a TPM is still going to force a decent number of replacements: extended support on W10 is $61 Y1, $122 Y2, $244 Y3.
Delaying that refresh might actually end up the more expensive option.
I recently did an install of Windows 11 on a machine without TPM
To bypass the check during installation:
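For reference, the widely documented route (a sketch of the common LabConfig registry approach, not a guarantee for every build) is to press Shift+F10 at the first setup screen for a command prompt, then add the bypass keys:

    reg add HKLM\SYSTEM\Setup\LabConfig /v BypassTPMCheck /t REG_DWORD /d 1 /f
    reg add HKLM\SYSTEM\Setup\LabConfig /v BypassSecureBootCheck /t REG_DWORD /d 1 /f
    reg add HKLM\SYSTEM\Setup\LabConfig /v BypassRAMCheck /t REG_DWORD /d 1 /f

Close the prompt and setup carries on without the hardware checks.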
It's a never-ending cat-and-mouse game, and unsupported hacks like these usually aren't well-received in corporate environments. Decent stop-gap for home use, though!
The EDU Neo is $500; too bad it’s not as versatile.
What is the EDU Neo?
The MacBook Neo’s education price of $499.
It blows my mind that a Pi is a significant portion of the cost of it.
And the Pi doesn't even come with a monitor, keyboard, speakers, or power supply!
I’d bet a lot that the Neo has a better SSD in it too.
Having a SSD certainly is better than no SSD.
The Pi isn't a loss leader for user acquisition, nor does it enjoy Apple's economies of scale. Apple can take a small loss on this and it will still be worth it if they retain the users in their ecosystem.
Is there any evidence that’s the case? They’ve always had massively bigger margins than all other PC manufacturers, so it’s unlikely they are selling it at a loss even if the margin is significantly reduced.
I mean, it's Apple we're talking about. Selling at margins <50% can probably be considered "at a loss"
For most older laptops it's easy enough, you just open them up and take the RAM sticks out. There are SO-DIMM to DIMM adapters to fit a laptop memory stick in a DIMM socket.
RPis are used in a lot of embedded devices, from industrial IoT to music keyboards. You can't easily use refurbished laptops for those[1].
--
[1] Korg Kronos with its crazy Intel Atom based architecture notwithstanding.
Those are not in the same order of magnitude in power consumption or physical size. This drastically changes their optimal use cases.
Second hand equipment being cheaper than brand new equipment isn't much of a surprise.
Some people don't want trashy looking crap sitting around their family room in order to save $100
Helium supply issues are only going to make this worse.
I feel like for the first time in our lives we might have seen peak technology for the next few years. Everyone is going to have to make do instead of depending on ever increasing performance.
> Helium supply issues are only going to make this worse.
I believe helium, although important, constitutes a small percentage of the cost of semiconductors, so its effect on price will be less severe. It will be more noticeable in other uses of helium, though; party balloons could get very expensive, etc.
On the other hand: how flexible is the demand?
A hospital isn't going to shut down because their MRI's new helium load is getting more expensive - they'll pay a fortune for it. For a lot of other applications there are no suitable alternatives either.
The real question then becomes: what's going to happen when there's a 1000x price increase?
Parties would be more interesting if the balloons were filled with hydrogen gas anyway.
Reminds me of a demo my college physics professor did in our first class (presumably to get our attention).
He had two floating balloons, one about twice as big as the other. Pointed a blowtorch at the smaller one and it (of course) popped.
"That one was filled with helium. Now, there's only one gas less dense than helium..." and right as I thought to myself "he's not gonna do what I think he's gonna do", he pointed the blowtorch at the other balloon which exploded into a much larger (and much louder) fireball.
Attention captured, for sure.
Considering helium is a finite resource on earth, it should be made illegal to put it in a party balloon.
It’s not illegal to put gas in your car. That is a finite resource too.
There's more of it at least.
The problem is not that it's finite, the problem is that by the time prices rise enough to discourage people from using it frivolously, you might already be dangerously low on it.
Is it illegal to pull the ladder up behind you in a flood?
This is a really interesting question. Is it? My intuition would say no since you have no inherent duty to protect or help others. I have no clue though.
Not really: https://en.wikipedia.org/wiki/Fischer%E2%80%93Tropsch_proces...
If that is your argument, we can also produce helium in nuclear reactors. It is just impractical for the amounts that we use.
The helium that goes into balloons is mostly a byproduct of industrial grade helium production that would otherwise just go to waste. It's not pure enough for industrial uses.
You could always purify it, it's just uneconomic to do so at a smaller scale. But if the price rises enough, that will change and no one will be using helium for party balloons.
> although important constitutes a small percent of the cost of semiconductors, so its effect on price will be less severe
You should think about this some more.
You should elaborate your snarky rebuttal more.
--reasoning_effort: xhigh
I thought about it more. He's right though so I'm not sure what the extra thinking was meant to do...
First time in your life?
Were you born after COVID and the 24 months of dire component shortages that followed?
This one might last longer. The AI race is on, and the US tries its best to make it as expensive for China as possible to participate in it. Every dollar China spends on GPUs they get at markup is one not spent on building navy ships.
If there is an escalation over Taiwan, then that will cause the loss of most of the world's high grade chip manufacturing capacity. TSMC is busy doing technology transfers into the US, but it is going to take time, those fabs won't have capacity for the whole world, and they still heavily depend on Taiwan based engineers if something goes wrong etc.
Just like with COVID you don't know how long this shortage will last.
Hypothetically, what would happen if China took over Taiwan and TSMC?
It will be incredibly hard for China to conquer Taiwan. One hundred kilometers across the straits introduces a brutal geographic hurdle. If anything, the fabs will probably be severely damaged in the war. Plus, most senior execs and elite engineers would be moved to US offices in Arizona.
I’m not a military expert, but I’ll bet my left nut that if push comes to shove Taiwan will go scorched earth and just blow up the chip factories
Which will put the whole world back for a decade while we rebuild the factories from scratch.
All modern technology becomes unobtainable.
We are going to have that in a couple of months regardless, so it won't matter if Taiwan's manufacturing base gets disrupted; the hardware will have already effectively stopped.
Wow, I wasn't aware Samsung, Intel, and SMIC were unable to produce "modern technology." Not everything needs to be on a 3nm TSMC process, believe it or not.
TSMC makes a lot of stuff besides the EUV-scale parts that all the YouTube videos talk about.
Almost everything you own that runs on electricity has some parts from Taiwan in it. TSMC alone makes MEMS components, CMOS image sensors, NVRAM, and mixed-signal/RF/analog parts to name a few.
Also, people seem to assume that TSMC is an autonomous entity that receives sand at one loading dock and ships wafers out at another. That's not how fabs work. Their processes depend on a continuous supply of exotic materials and proprietary maintenance support from other countries, many of them US-aligned. There is no need to booby-trap any equipment at TSMC; it will grind to an unrecoverable halt soon after the first Chinese soldier fires a rifle or launches a missile.
Hopefully Xi understands that. But some say it's a personal beef/legacy thing with him, and that he doesn't even care about TSMC.
There would be a brief, or possibly extended depending on how much damage the fabs took, outage, then it'd be back to business as usual.
Russia wasn't able to take Ukraine even when they could just drive their tanks right up to Kyiv. Modern warfare tech just favors the defender too much. China has ninety km of sea to cross before they even get to Taiwan. Missiles and drones have already taken out the Russian naval fleet in the Black Sea. China will lose a lot in the same way if they ever attempt the crossing.
It is public knowledge that the critical equipment has "kill switches".
I wouldn't be surprised if there was enough damage that building a new fab from scratch is easier.
To the people who downvoted my comment: Are you doing that because you know it's not correct or because you really hope and wish it wasn't correct?
We couldn’t even make cars after COVID.
That's what happens when consumer demand rapidly shifts, and businesses start panic-buying and panic-cancelling. As far as I recall, actual chip fab output didn't really change that much.
I asked ChatGPT about this. It says the root cause was the demand collapse at the start of COVID, so fabs stopped producing the many low-end chips required for modern cars and retooled/pivoted to higher-end chips. When auto manufacturers came back knocking after COVID, the fabs didn't want/need their low-end chip business.
I expect my 5-year-old desktop will last a lot longer, but I'm starting to worry about the bathtub curve.
We stood still on Intel 14nm for YEARS, then a few years of decent progress, and now this. Moore's law is taking a beating.
Moore's law only really works when at least part of the world is functioning under practically ideal conditions. Right now that's far from what's happening.
Finally, good efficient code is going to get its moment to shine! Which will totally happen because it's not like 80% of the industry is vibe coding everything, right?
just do vibe performance optimization (I am not even kidding)
Yep I’ve seen multiple instances of this so far.
Yeah, I got the AI to convert some code that ran at 30fps in Javascript to C, and it resulted in a program that generated 1 frame every 20 seconds. Then I told it to optimize it, and now it's running at 1 fps. After going back and forth with the AI for hours, it never got faster than 1 fps. I guess I'm "doing it wrong" as the hypesters like to tell me.
> Yeah, I got the AI to convert some code that ran at 30fps in Javascript to C, and it resulted in a program that generated 1 frame every 20 seconds. Then I told it to optimize it, and now it's running at 1 fps. After going back and forth with the AI for hours, it never got faster than 1 fps. I guess I'm "doing it wrong" as the hypesters like to tell me.
Remove the "I actually only want a slideshow" instruction from your prompt :-)
Try converting the Javascript into a slide deck and spam the next button.
Honestly speaking, it has started to look like AI coders could actually do a better job than 80% of app developers at writing efficient apps, just by being set to adhere to best-practice programming conventions by default (notwithstanding their general tendency to try to be too clever instead of writing clear and straightforward code).
They would do well just by letting the AI generate Rust code.
Vibe coding might be a positive here since there's no need to optimize for DX over perf when the clanker is the one reading/writing code.
This is my theory: we're going to see a lot of languages with straightforward and obvious semantics, high guard rails, terrible DX, and great memory-allocation and performance behavior out of the box. Assembler or worse, but with extremely strong typing bolted on in a way that no human would ever tolerate; basically, something in that vibe.
So Pascal and Delphi are coming back? I'm actually cool with that.
Yeah, actually, I worked with Pascal early in my career and that's kinda the vibe I am thinking about, though maybe with a stronger, more Ada-esque type system (composite, partial, and range-and-domain types, all that jazz).
Have a look at Nim:
Pascal-inspired syntax
Ada-inspired type system
Lisp-inspired templating and macros
Compiles to C
I vibe coded a library in Nim the other day (a language I view very much as a spiritual continuation of the Pascal/Modula line), complete with a C ABI.
The language has well defined syntax, strong types, and I turned up the compiler strictness to the max, treat all warnings as errors etc. After a few hours I put the agent aside, committed to git then deleted everything and hand coded some parts from scratch.
I then compared the results. Found one or two bugs in the AI code, but honestly, the rest of our differences were "matters of taste" (is a helper function actually justified here or not, that kind of thing).
>I feel like for the first time in our lives we might have seen peak technology for the next few years.
This happened for a while with CPUs in 2004 or 2005, IIRC. At the end of the Pentium 4 era clock speeds and TDPs were so high that we hit a wall. Nobody was pushing past 4 GHz even with watercooling (I tried).
Dual-core processors were neither widely available nor mainstream yet, and those that were available had much lower clock speeds. It definitely felt like we hit a lull, or a stagnation, in those years. It picked back up with a fury when Intel released the Core 2 Duo in 2006, though.
Ultra clean rooms with massive air handling systems can't recapture all their helium?
Or is this just a temporary thing based on where processing is located?
Helium is almost all captured from gas wells by cryogenically liquefying the nitrogen out of it. I guess you could technically do that with the fab's air, but that is a LOT of air to liquefy and likely costs more than even inflated helium prices.
Most helium from most wells is simply vented because it is expensive to separate even at its relatively high concentration there, and I imagine even the best-case scenario for capturing it from a fab has an abysmal concentration of helium. But because most of it is vented, if the capital were put down to build more helium separators on gas wells, it wouldn't take long to increase supply. Short term, for a year or two, it can be a problem, but beyond that it is simply a cost-versus-demand issue. There is neither a technological nor a source limitation; it is purely a capital investment limitation.
I hope systems which separate helium (1) have very good thermal insulation and (2) use heat exchangers so the separated gases can cool down incoming gas.
> Most helium from most wells is simply vented because it is expensive to separate even with its relatively high concentration
I remember a similar situation with neon early in the Ukraine invasion a few years ago. What I expect to happen is some other source coming online that currently doesn't try to capture it for economic reasons.
Helium recovery in scientific settings for cost saving reasons is already done, so it's not like there isn't expertise in using it.
The fact that all helium eventually escapes the atmosphere, and is essentially impossible to produce, makes things a bit more complicated.
Helium is actually pretty hard to keep ahold of, being a very light and small noble gas. It can diffuse through a surprising amount of materials, flow through far smaller cracks than you would expect, and is quite hard to filter out of a mixture of gases.
Also, superfluid helium (a big chunk of helium is used for refrigeration, e.g. in the LHC) has the weird property of flowing at the same speed through a tiny hole as a large one, and of coating everything with a molecular film. Superfluid helium is basically a Bose-Einstein condensate at macro scale, totally counterintuitive. Essentially a thermal superconductor. Zero viscosity.
Unless you need it to be less than 3 kelvin for some reason, helium doesn't do that.
AFAIK they recapture most, but recapturing all simply isn't possible / financially feasible. And they use a lot of helium, so even if they capture most of it, the losses are still higher than the currently available supply.
OTOH things which belong on microcontrollers are now being pushed back to microcontrollers for cost reasons, so there is a win to be found there.
Even before the hikes, SBCs were $50-$100 a pop, compared to pennies for basic MCUs and maybe $4 for high-performance ones. People were clearly willing to pay 100x more just for familiarity and the ecosystem (HATs, forums, etc.). I don't know if 300x is going to make more hobbyists see the light, or just result in fewer of them being able to afford the hobby.
> People were clearly willing to pay 100x more just for familiarity and the ecosystem
This is obviously logical. If I know how to program in Python or JS but not C, and am familiar with SSH, I can do something with an SBC in a few minutes.
I get paid $200/hr. If I spent even one hour learning what I need to deal with a microcontroller, the time cost alone would be four times the cost of the materials I'd use by sticking with what I know.
How many small projects do I need to do in my free time before it's financially smart to learn a whole new technology?
Most of the "professional" microcontrollers have complicated flashing schemes, expensive bespoke IDEs, and limited language support. Treating a lot of that like a moat around their products.
I find it remarkable that they haven't tried to make all of that easier. Any board with Arduino support is easy to start using with pared-down C++; boards similar to the micro:bit support MicroPython and JavaScript as well as a few others, and a ton of modern development boards have UF2 support.
UF2 is a step change in how easy it is to flash a binary onto a microcontroller. You hold down a button before connecting it to a USB port, and then it appears as a USB drive for you to drop a file onto, once it's done "copying" the board is flashed and will run your code as soon as it resets.
If you want to gain familiarity with a board, you can drop a .uf2 file with a REPL on it and run code on the board a line at a time.
As if it would make sense that spending two hours relaxing on the beach or tending your orchids costs you $400. Money not made is not money spent. If you were doing a hobby project for learning, you weren't going to be working during that time anyway, so your hourly rate doesn't matter.
Microcontrollers don't really make sense for hobbyists (unless their hobby is programming microcontrollers, of course). They only make sense when you think about deploying an application at scale, at which point the per-unit price becomes important. OTOH, if your hobby project goes viral and you want to profit from selling SBCs with it preinstalled, a cheaper SBC is a plus, but that's not very likely to happen...
With LLMs, it's potentially a lot easier to use microcontrollers now, depending on how widely available the documentation is now.
That's an apples to oranges comparison. Might as well bring up how people pay thousands of dollars for FPGA boards.
No. Many hobbyists default to SBCs whether they need them or not. No one defaults to FPGA if they don't need it.
My point is that the FPGA boards are several orders of magnitude more expensive than the actual chip. To be fair, you should be comparing the cost of the SoC with that of the microcontroller.
Yeah, never understood why I would want an entire OS running just to blink an LED. I was going to make a pro-Arduino comment but I guess my LED example warrants little more than an R/C circuit and a transistor, ha ha.
(Anyway, I still remember the thrill of writing assembly for a 68HC11 and getting a pair of hobby servos to respond.)
Mostly for the network stack. Economics, also, sometimes.
These days, with ESP32, Pi Pico W etc... things have changed a lot.
But before they got popular: why deal with an MCU plus wiring up some weird peripheral for Wi-Fi/Ethernet when you could get a Pi Zero W or clone with built-in Wi-Fi for the same price?
You jest, but I ended up getting a lot of use out of being able to do this in software for a dimmable LED lamp. Dimming the LED required PWM, and the potentiometer resistance -> PWM frequency map ended up fairly intricate to make the knob "feel right."
Now what I would have loved to have done is come up with some crazy analog circuit to implement an arbitrary transfer function from potentiometer input to LED voltage, but I didn't know how to do this at the time and the dev cycle would be a lot more painful than with software.
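For the curious, a minimal MicroPython sketch of the software version (all assumptions: an RP2040 board with the pot wiper on GP26/ADC0 and the LED on GP15, a fixed PWM carrier with the knob mapped to duty cycle, and gamma 2.2 as one example of a curve that "feels right"):

    from machine import ADC, PWM, Pin
    from time import sleep_ms

    pot = ADC(26)        # potentiometer wiper on GP26 (ADC0)
    led = PWM(Pin(15))   # LED driver on GP15
    led.freq(1000)       # fixed carrier; the knob controls duty cycle

    while True:
        x = pot.read_u16() / 65535             # knob position, 0.0 .. 1.0
        led.duty_u16(int((x ** 2.2) * 65535))  # gamma curve for perceived brightness
        sleep_ms(20)

Swap the exponent for any transfer function you like; that one line is where the "feel" lives.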
Familiarity - it’s easy for us Linux dweebs to build a Pi that can flip an LED, but programming an Arduino is an entirely new area.
It's pretty trivial to do so on Arduino though.
Well first you have to learn the Arduino programming language. And the stdlib.
They don't call it C++ because that sounds too difficult. But it literally is C++: not a simplified subset that compiles into an IL using a formally proven tool, but literally compiled as C++ using GCC.
Calling it C++ might give the wrong impression to some people too, since it doesn't have the STL, RTTI, or exceptions on boards like the Uno R3.
It is C++ though, just limited in ways similar to the US Air Force's requirements for using the language.
It's literally the hello world of micros: get an Arduino, plug it into the USB, install the IDE, New -> Examples -> 01. Blink, press Run. Cool, you have now blunk an LED. Now use AI to draw the rest of the owl.
It's easy once you've done it - but before you've done it (for me at least) it was much easier to just install a Linux on a Pi and run a bash script than to learn how to program an Arduino.
(Of course, there are those to whom an Arduino is an overpriced piece of junk and they don't understand how I can't solder a three cent chip myself.)
But let's be realistic - all of these things are like my Steam library - purchases made but never used (I have a drawer full of Pis and other SBCs, and Arduino dev kits, etc. Someday I'll have time time time!).
It's C++, and basically what Arduino gives you is
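    void setup() {
      // runs once at reset
    }

    void loop() {
      // runs over and over, forever
    }

(that's the whole skeleton; a hidden main() calls setup() once and then loop() repeatedly)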
As well as a GUI to easily flash devices and view the output from the serial port, plus importable libraries that do all of the hard work, like making a serial port on any microcontroller pin or controlling external devices like light strips or displays. I'd assume the average user on HN should be able to figure it out pretty easily.
Good thing LLMs exist now
With MicroPython or some of the JS-based frameworks for microcontrollers, it's really not that new/different, especially with the ESP32, Pi Pico W, and their clones...
In fact, it's a lot more straightforward not to have to deal with the Network Manager config files, systemd unit files, or read-only rootfs headaches of the Linux world.
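To illustrate, the MicroPython hello world is about this much (assuming a Pico W with recent firmware, where the onboard LED is the named "LED" pin; on a non-W Pico it's Pin(25)):

    from machine import Pin
    from time import sleep

    led = Pin("LED", Pin.OUT)  # onboard LED on a Pico W
    while True:
        led.toggle()
        sleep(0.5)

No flashing ceremony beyond copying the firmware over, and the REPL is right there on the USB serial port.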
asking as a casual non-poweruser... how does one do that on linux exactly?
https://www.kernel.org/doc/html/v5.0/driver-api/gpio/index.h...
You can usually find libraries for your language of choice to speak GPIO and expose the pin's state as a variable in your code.
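For example, a minimal sketch with Python's gpiozero library (preinstalled on Raspberry Pi OS; assuming an LED on BCM pin 17 and a button on BCM pin 2):

    from signal import pause
    from gpiozero import LED, Button

    led = LED(17)       # LED on BCM GPIO 17
    button = Button(2)  # push button on BCM GPIO 2

    button.when_pressed = led.on
    button.when_released = led.off

    pause()  # idle forever; gpiozero runs the callbacks

Under the hood this drives the same pins the kernel docs above describe.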
Good news: now you can use all that DRAM you can't afford to vibe code an Arduino program. Think of the savings and the learnings!
You're probably joking, but this is interesting. If we throw more RAM at AI, it can help us optimize programs to reduce our RAM needs. I hadn't thought about it like that.
68HC11 would have been a luxury for us.
For me it's primarily the ability to run a full TCP/IP stack. For hobby projects, I'd rather use a Pi or a Beaglebone with IRC or HTTP for data egress than, say, I2C or SPI. The ease of debugging alone makes it worth it.
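The whole appeal fits in a couple of lines of ordinary Python on the Pi (hypothetical endpoint and payload, and it assumes the requests package is installed):

    import requests

    # push a reading to some collector; debugging is curl and print(), not a logic analyzer
    requests.post("http://collector.local/ingest", json={"sensor": "temp0", "value": 21.5})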
How is this a barrier to using a microcontroller? I've sent http requests from many different microcontrollers.
Agree, but there was something special about SBCs being so cheap they were the default recommendation for new hobbyists and I'm sad to see that go.
I would not have fallen in love with microcontrollers without Raspberry Pi and PocketCHIP as stepping stones.
The messaging of "it's a tiny computer, make whatever you want with it" is so much more approachable than anything I've found on the microcontroller side. Even Arduino. I dismissed it for a long time because I misunderstood it. I thought I had to buy Arduino devices, then Arduino shields, then program them in the Arduino language using the Arduino IDE.
I’ve been having a lot of fun with the Pi Pico 2W. It can host an access point, a web server, be a USB host, and of course has GPIO. And not running an OS means it’s way simpler.
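Standing up the access point, for instance, is only a few lines of MicroPython (a sketch with placeholder credentials; note that older firmware spells the ssid parameter essid):

    import network

    ap = network.WLAN(network.AP_IF)
    ap.config(ssid="pico-demo", password="letmein123")  # placeholders, not real credentials
    ap.active(True)
    print(ap.ifconfig())  # the AP's IP address, netmask, gateway, DNS

From there a small socket server gives you the web server part.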
The Raspberry Pi 4B 2GB is $55 on CanaKit, PiShop, Seeed Studio, and MicroCenter.
The Raspberry Pi 3B 1GB is $35 on CanaKit, Adafruit. The 3B+ is on PiShop for $40.
The Raspberry Pi 3A+ 512MB is $25 on CanaKit, Adafruit, PiShop, SparkFun.
The Raspberry Pi Zero 2 W is $16.35 on CanaKit, $17.25 on PiShop.
We are unavoidably headed for financial collapse, authoritarianism, and potentially the collapse of Western civilization. And y'all are worried that you won't have 8GB of RAM in an embedded GPIO computer for your hobby? Maybe it's time to make the ultimate sacrifice: use less RAM.
> use less RAM.
The same accounts that defended and promoted LLM use just a few weeks ago are now telling RPi users to use less RAM.
Maybe it will even mean that we software developers get more time to optimize our RAM usage while developing, instead of implementing that new user-tracking feature being pushed by the business...
I picture a scene with Richard Crenna knocking on our old fogey's cabin door to ask us to come out of retirement and help hand-optimize software in this new environment
If I spent a bajillion dollars on massive data centres I would be delighted if personal computing were also crippled for a while. It would allow me to further own your ability to do compute tasks and to help kill the concept of doing it yourself for a while.
Crippling personal computing isn't a side-effect of massive investments in data centers, but rather one of the primary drivers for it.
There are ups and downs in the prices of components. People are quick to forget that during COVID prices were high for SBCs because of supply chain issues. Video cards just were not available in the UK during and afterwards (every supplier had long lead times) and are still relatively expensive (at least there are now lower-priced options). Raspberry Pis you couldn't get hold of at all, and many people (Jeff included) were using a website that checked for availability, which was non-existent for anything other than low-end models.
I remember 15-20 years ago when hard drive prices went through the roof because of a flood in Thailand, and it took years for prices to come down.
There are also going to be supply chain issues due to the current geopolitical situation (helium comes out of the Gulf and is needed in chip manufacture), which will further affect the price of components.
Eventually in a few years (as the article states) the situation will change. It just sucks at the moment.
TBH I am more worried about my ability to fill up the tank on my car, as both petrol and diesel are unavailable locally. I can make do with whatever computer equipment I have.
> People are quick to forget that during COVID prices were high for SBCs because of supply chain issues.
inb4 AI has the same supply chain effects as a worldwide pandemic. I guess those AI doomers that talked about it being the end of the world had it right!
Doomers IMO are just click baiting.
There is a saying often trotted out by economists: "The cure for high prices is high prices."
There is consumer and business demand for DRAM outside of AI. Someone will fulfil that need, as there is a high incentive to. It's just going to take a bit of time for this to happen. My equipment is going to be fine for another few years, so I am going to just hang tight and make do with what I've got for now.
Main producers actually reduced DRAM output in 2026. When you have few players with very high capital costs, you will end up with cartels, like the light bulb cartel.
Someone will come in when the price goes up enough. It will take time, but it will happen. What people are complaining about is that the time for this to happen is too long.
Oh look, there is a player coming into the market it seems:
https://economy.ac/news/2026/02/202602288291#:~:text=If%20eq...
EDIT: In fact, many other Chinese companies are now expanding into DRAM because of the high prices, which confirms exactly what I said.
Reminder: the whole world is not the United States of America. While you chose to vote for someone who thinks tariffs are good for the local market, no other country joined your bandwagon.
Maybe they will. However, people often claim that no one will want to enter the market to take advantage of high DRAM prices, when two minutes of web searching would show that isn't true.
>"a good sign, but im guessing at some point these companies are gonna be tariffed heavily..."
In the US. The rest might do the other way. The US of course will try to do some arm twisting. Hopefully the world can learn to fight back.
> Someone will fulfil the need as there is a high incentive to
And those uses which fall short of the new threshold, e.g. hobbyist SBCs, slowly fall away.
In reality, were they going to survive anyway? I would wager likely not.
The Raspberry Pi is the de facto standard for SBCs. Almost all the other SBCs have significant problems, usually around software support and also third-party support, e.g. HATs, cases, etc.
I’m just going to try and hang tight as well. But I do wonder if DRAM companies should or should not respond to this pricing situation. The actual AI model training companies buying all the RAM aren’t profitable yet, right? It’s all investment, which can dry up at the drop of a hat.
> Someone will fulfil the need as there is a high incentive to.
Unless the capital cost to compete is too high and the risk of the existing manufacturers undercutting you is very real. Plus it can take 5-10 years or more to build a new fab, debug/iterate your process, then start shipping product.
Markets are prone to natural distortions. This is one form of that. It can be perfectly natural for all potential competitors to choose not to compete no matter how much demand exists.
Frankly I'd expect nationalization of some of the DRAM makers before we see the rise of useful competitors. The more likely scenario is government pressure, up to and including arresting executives, to rattle the cages of the existing players who are way better placed to expand production quickly for relatively low capex. Not that I think any action is likely in the short term. My guess is the existing players are betting on an AI bubble pop so they don't see the use in really expanding capacity only to be left with idle fabs later. None of us really knows.
So are AI evangelists to be fair.
It is almost as if two or more things can be true at the same time.
I’m not going to argue my comment was particularly substantive but these kinds of rude, canned meme-responses are not really appropriate here.
This time is different. https://ca.pcpartpicker.com/trends/price/memory/#ram.ddr5.60...
The price for a couple of 32GB sticks is now over $1200 after being stable at about $200 for several years until last September. That's not a blip; that's a 6-fold hike, and there is no sign it is slowing down any time soon.
https://www.memoryexpress.com/Products/MX00115488
Let's see, this is a low speed 2x16GB DDR4 kit for $300.
The closest option on the pcpartpicker chart was about $75 as a stable price. So that one's only a 4x increase.
Versus DDR5 where... it looks like a 5x increase to me? I'm seeing a jump from 200USD up to 1000USD. Edit: Oh there's an extra jump in the last month on the CAD version but not the USD version.
that was like $80 last year.
Did you not read what I said? I couldn't even get a replacement video card at any price during the height of COVID, and believe you me, I had the money to pay for one. I couldn't even get a Raspberry Pi (any model) for about a year. They were constantly out of stock.
> That's not a blip; that's a 6-fold hike, and there is no sign it is slowing down any time soon.
How does that invalidate anything I said? As stated in the article, this will change; it will take years, but it isn't forever.
I find it hard to believe that people here cannot make do with whatever hardware they already have.
I also don't believe those small SBCs would have survived long term anyway. Most people just use a Raspberry Pi: it is either a mini PC or a Raspberry Pi.
Yeah, I mean, graphics cards were pretty bad during COVID.
There were Discord groups that had real-time line counts and pictures of the line at most Best Buys across the country (US).
The only way I got one was overpaying and a lottery system that bundled it with other hardware because they knew everyone would still buy it. It was impossible to buy online normally as you needed some kind of automated way to buy it before stock zeroed the minute it was posted.
You could pay a scalper for a gfx card, but stores had none. Now, stores have RAM at least.
> Did you not read what I said? I couldn't even get a replacement video card at any price during the height of COVID, and believe you me, I had the money to pay for one.
You're comparing to memory sticks that went up 6x. If you were offering anywhere near 6x MSRP and you couldn't get a video card... I don't believe you.
https://www.pcmag.com/news/scalpers-have-sold-50000-nvidia-r...
https://www.pcmag.com/news/read-it-and-weep-heres-how-bad-nv...
These show GPUs available for 1.5-2.5x price, which fits what I remember.
> I couldn't even get a Raspberry Pi (any model) for about a year.
https://picockpit.com/raspberry-pi/why-are-raspberry-pi-pric...
I didn't look into Pi prices a whole lot, but this suggests they were continuously available for 2-3x price.
I am in the UK. Not the US!
> If you were offering anywhere near 6x MSRP and you couldn't get a video card... I don't believe you.
My 1080Ti had died. I had to use an 8800GTS from the late 2000s for about a year, as that was the only GPU I had. I have no iGPU on my CPU.
At one point there was no stock available. Not on Amazon, not on Overclockers, not on Scan. Most sites had some weird lottery system in place.
Scalpers claimed to have cards. But I wouldn't risk sending a lot of money to some random seller on ebay.
> Unless this article is massively misleading, sure it was out of stock at 1x price but it wasn't out of stock at 2-3x price.
Again, I am in the UK. You could not buy any Pi other than the 1GB model and maybe the Zero, both of which were useless to me.
> Scalpers claimed to have cards. But I wouldn't risk sending a lot of money to some random seller on ebay.
Ah, so you could have bought one, but you judged the available suppliers to be too risky.
Completely fair, but then it's not true that you couldn't buy one "at any price". It was just not a price+risk that you were willing to take.
Also, re: Raspberry Pis, you couldn't always get the exact RAM configuration you wanted, but they were pretty continuously available during COVID on Aliexpress. You did have to pay 3-5x normal price, but you could do it. I really needed one after one at home died, and paid the 3x markup, and it was annoying but fine. Not sure if Aliexpress is equally as available in the UK as it is here in the US, though.
> Completely fair, but then it's not true that you couldn't buy one "at any price". It was just not a price+risk that you were willing to take.
You are being pedantic. I find this type of discussion very tiresome. I've explained why in other forks of this thread. Quite honestly it pisses me off.
> Also, re: Raspberry Pis, you couldn't always get the exact RAM configuration you wanted, but they were pretty continuously available during COVID on Aliexpress. You did have to pay 3-5x normal price, but you could do it. I really needed one after one at home died, and paid the 3x markup, and it was annoying but fine. Not sure if Aliexpress is equally as available in the UK as it is here in the US, though.
Not in the UK. Someone was running a site with all the places that you could buy from. I was checking most days. Stock was extremely limited other than a few models.
>Not in the UK.
This was my experience, too. Pis would disappear from online retailers before you noticed the stock alert email.
I only got hold of a Pi 4 by chance when Raspberry Pi did an official pop-up store in Southampton for one day only. The queue to get in was about 45 mins long.
Okay, UK, maybe that changes things more than I expected. But what about ebay and the sites that replaced classified ads? And is it unreasonable for me to say that you could have bought a US listing and had it reshipped?
Edit since you added: Scalpers claimed to have cards. But I wouldn't risk sending a lot of money to some random seller on ebay.
Even with ebay's buyer protection?
Well not to be mean but I think "I refused to use ebay" invalidates your claim that you couldn't buy a card.
> Even with ebay's buyer protection?
I've had problems with it before (I can't remember specifics as it was a while ago). I'd rather not go through the hassle and/or risk in the first place.
There are still plenty of scams on ebay. During this era there were people scamming, e.g. selling just the box for a GPU: listing the entire specs and then putting right at the bottom of the listing that it was only the box and not the card.
> Well not to be mean but I think "I refused to use ebay" invalidates your claim that you couldn't buy a card.
What you are doing is being hyper-pedantic. It is fucking tiresome when people do this online.
If you are going to be a smart arse, I will modify my statement to say "I could not get a card from a reputable online store as they were all out of stock and did not wish to risk buying from a less reputable one".
I would be foolish to trust some overpriced (or underpriced) listing on ebay. I've had an ebay/paypal account now for 25+ years, and I've learned never to do this because I got screwed every time I did.
> What you are doing is being hyper-pedantic. It is fucking tiresome when people do this online.
That's not pedantry. There's a huge difference between "they were unavailable and I couldn't get one at any price" and "I could have bought one from a scalper but I didn't trust them". Even if it's reasonable not to trust them (it is!), the first statement is sensational, and untrue, especially considering you emphasized "at any price" in your comment upthread.
> If you are going to be a smart arse, I will modify my statement to say "I could not get a card from a reputable online store as they were all out of stock and did not wish to risk buying from a less reputable one".
That's what you should have said in the first place; that would have been honest and correct.
And please, there's no need to call the other poster names. That's uncalled-for and childish. You seem to be new here (9-day-old account), so please read the site guidelines and turn it down a notch or three.
> That's not pedantry. There's a huge difference between "they were unavailable and I couldn't get one at any price" and "I could have bought one from a scalper but I didn't trust them". Even if it's reasonable not to trust them (it is!), the first statement is sensational, and untrue, especially considering you emphasized "at any price" in your comment upthread.
It is for any normal person in a relatively normal setting.
Only amongst technical people is this sort of discourse tolerated, where someone insists that an unreasonable option (the scalper in this case, as you admitted yourself) must be included in a statement, when it is perfectly obvious it should not be, because it is not in any way reasonable.
I could have flown to the US or China and bought a card. Is that reasonable? For most people it isn't. It wasn't for me. Buying from an untrustworthy seller is unreasonable.
> the first statement is sensational, and untrue, especially considering you emphasized "at any price" in your original comment.
They were out of stock on every reputable site. Therefore I could not buy a card at any price from them because they didn't exist.
> That's what you should have said in the first place; that would have been honest and correct.
I was honest and correct to begin with. The poster was using prices and availability in the US and not the UK.
> And please, there's no need to call the other poster names.
I never called them names. I expressed my annoyance at their behaviour.
i would certainly consider "at any price" to mean that you'd be willing to pay the 5x price to 20 different scammers and still get no card.
there might be a cultural difference between the old world and new world for what "at any price" means, but i'd take it to mean being willing to spend at least $1M on it
> It is for any normal person in a relatively normal setting.
I disagree. But clearly I'm not going to convince you (and vice versa), so let's just call it a day.
> I never called them names. I expressed my annoyance at their behaviour.
"Smart arse" is name-calling.
Why don't you step back from the keyboard for a bit and cool down. Might do you some good.
> I disagree. But clearly I'm not going to convince you (and vice versa), so let's just call it a day.
Try it in an IRL conversation and see how quickly someone gets annoyed with you. It won't be very long.
> "Smart arse" is name-calling.
I said "If you are going to be a smart arse". Which means "If you are going to engage in this behaviour then ...".
I never called anyone names.
> Why don't you step back from the keyboard for a bit and cool down. Might do you some good.
I am perfectly fine. I can be mildly annoyed by someone and still be quite rational.
Also this sort of statement is close to concern trolling.
> It is for any normal person in a relatively normal setting.
A normal person understands scalping and that if they want it badly enough they can go on ebay.
They're not going to say it's "unavailable at any price" when it's right there for double the price.
If you're willing to pay the scalped price, the risk of using ebay is not in fact unreasonable.
You are being a pedant as far as I am concerned, and arguing semantics with me is not going to convince me or many others.
So I suggest in future you learn that this line of logic (where you expect me to have done something a huge number of people would consider unreasonable) is not something people are going to put up with. It is really annoying to have to converse in this manner, and in fact I believe it is often wholly disingenuous. I no longer wish to speak to you.
If I categorized these situations the way you do, and I said what I'm saying, I would be a pedant.
But I see things a different way. The logic I'm actually using is not pedantic.
You calling me disingenuous over this is painful to look at. Get out of your own head for a second. We're using different premises, and we're reaching different conclusions because of that. My logic is fine, and your logic is fine.
> If I categorized these situations the way you do, and I said what I'm saying, I would be a pedant.
I am not categorising any situation. The vast majority of people would omit unreasonable options.
When I lived in London (back in the 2000s) I could buy a racing bike that was £5000 new for £200. The bike would most likely have been stolen. So technically I could buy a £5000 bike for £200. But most people wouldn't want to buy from a thief and would consider it unethical.
People feel similarly about scalpers and other untrustworthy sellers.
> You calling me disingenuous over this is painful to look at. Get out of your own head for a second.
You started the conversation claiming I was outright lying. Then when I clarified to you what I meant you continued claiming I was lying/misstating. That is really annoying.
If you could have just said "okay that is fair, while you might have been doing X and Y, I can understand why you didn't want to do that". That would have been fine. But that didn't happen.
> You started the conversation claiming I was outright lying. Then when I clarified to you what I meant you continued claiming I was lying/misstating. That is really annoying.
I said "If you were offering anywhere near 6x MSRP" I didn't believe you, and it turns out you weren't offering 6x MSRP. So I wasn't calling you a liar.
> If you could have just said "okay that is fair, while you might have been doing X and Y, I can understand why you didn't want to do that". That would have been fine. But that didn't happen.
So if I had explicitly said "I think it's fine you didn't use ebay" that would have fixed everything? Because I never argued about your personal choice, I argued about you calling ebay "unreasonable".
Well for the record, I was going to say something like that in response to "If you are going to be a smart arse, I will modify my statement to say "I could not get a card from a reputable online store as they were all out of stock and did not wish to risk buying from a less reputable one"."
But then I saw you had called me "hyper-pedantic" and I focused on rebuffing that attack instead.
Edit: And it doesn't help that you never actually did that modification, and instead keep insisting that what you originally said means the same thing.
> So if I had explicitly said "I think it's fine you didn't use ebay" that would have fixed everything? Because I never argued about your personal choice, I argued about you calling ebay "unreasonable".
Ebay in itself isn't unreasonable.
Ebay is unreasonable when the only sellers are untrustworthy ones and there are a bunch of scams going around, which there were at the time.
I've clarified this many times now. I don't care how what I said is now being interpreted.
> Well for the record, I was going to say something like that in response to "If you are going to be a smart arse, I will modify my statement to say "I could not get a card from a reputable online store as they were all out of stock and did not wish to risk buying from a less reputable one"."
I don't believe you. I've had plenty of stupid conversations like this, with plenty of tech nerds. It rarely happens with non-tech people. I spend some time in hobby spaces that are technical but not tech-industry (classic car / bike repairs) and this conversation style never happens there.
People like yourself think you are being clever by poking holes in everything that is said. I am quite happy to be quite obnoxious in pointing this out. I am tired of it. I am this cantankerous IRL about this, btw.
The fact is that you could not buy a new graphics card in the UK for some time during COVID from almost every online retailer. I had conversations with other people in the UK who wanted to buy PC hardware and they were in the same situation. The same was true for the Pi 4 at the time. Making stupid semantic arguments doesn't change that fact.
> Edit: And it doesn't help that you never actually did that modification, and instead keep insisting that what you originally said means the same thing.
For all intents and purposes it is the same thing if you aren't engaging in pedantry and semantics. I try not to engage in it anymore (unless it is tit for tat), because I understand it pisses people off. You obviously don't care.
I like these many posts about how you, specifically, chose not to use any of the available systems to get a GPU that rapidly organized and became common globally during lockdown. The line from “I just didn’t feel like doing something once” through to “My predictions for the future about a different problem are obviously true” is clear as day. Can’t see why anyone would disagree
> I like these many posts about how you, specifically, chose not to use any of the available systems to get a GPU that rapidly organized and became common globally during lockdown.
You, like the other people I was arguing with, are pretending that the options were reasonable. They weren't at the time. Many other people I know thought the same.
There was no stock for any GPU except absolute crap on any of the retail sites in the UK. There are not many options in the UK generally. It is not like the US.
As far as I am concerned, what you are engaging in is effectively gaslighting.
> The line from “I just didn’t feel like doing something once” through to “My predictions for the future about a different problem are obviously true” is clear as day. Can’t see why anyone would disagree
If you deliberately want to misunderstand what is said you could draw that conclusion. Which is blatantly what you are doing.
The only thing I claimed about the current high price DRAM situation is:
1) It is likely to get worse before it gets better (due to supply chain issues caused by the current wars). 2) It will resolve itself over time, and you should be patient and just make your existing stuff last as long as possible.
That is how these crises often play out, and I was actually telling people in my original statement not to be all doom and gloom and just be patient. It will sort itself out. It won't be this year for sure.
My favorite part would have to be where you can’t remember the actual, structurally crucial piece of information that your argument rests on and just said that you didn’t feel like getting a GPU off eBay.
>I've had problems with it before (I can't remember specifics as it was a while ago). I'd rather not go through the hassle and/or risk in the first place.
As your evidence that
> Doomers IMO are just click baiting.
Like you admitted that you _do not remember_ why it was entirely unreasonable or impossible and are arguing against people that do possess memory of it being possible and reasonable enough for them at the time. Amazing stuff.
> My favorite part would have to be where you can’t remember the actual, structurally crucial piece of information that your argument rests on and just said that you didn’t feel like getting a GPU off eBay.
You are misunderstanding what is being said. I suspect it is deliberate.
It is often said that "prevention is better than the cure". Similarly, it is often better not to risk spending your money unwisely than to have to go through processes to recover it. It matters not what the specifics of the situation were (it happened a decade or more ago).
I communicated that quite clearly. So you either didn't understand or you are deliberately misunderstanding what I said.
> Like you admitted that you _do not remember_ why it was entirely unreasonable or impossible and are arguing against people that do possess memory of it being possible and reasonable enough for them at the time. Amazing stuff.
I bet you felt really clever constructing that. However as explained the specifics weren't the point. Avoiding the process entirely for funds recovery is the point.
I don't know why you're being so combative here. I said I liked your posts about vaguely feeling that a specific thing was probably worse during covid lockdown than everyone else remembers it, and how that means that you are equipped to predict the impact of a completely different phenomenon on something else. I like these posts! Responding to "hmm this specific thing looks bad" with "alright I don't actually remember what I'm basing this on but I saw a quote about economists that I think means it's good and it feels like everyone that doesn't vibe with me and my quote are wrong" is fantastic posting!
I wasn't trying to be a smart arse at all. "I couldn't get a new card from a store" and "I couldn't get a card at all" are extremely different claims in my mind.
I'd rate my pedantry level as quite low. From my point of view this is not a nitpick.
Especially because you emphasized "at any price". It's the scalpers and the used market that were selling at any price. Sticking to reputable stores means sticking close to MSRP.
Buying from scalpers and other untrustworthy people like thieves and other toerags is unreasonable.
I would expect people to understand that unreasonable options should be omitted from conversation.
There was no stock at any of the online outlets that are commonly used in the UK when it came to GPUs for what seemed like a long time.
> I'd rate my pedantry level as quite low. From my point of view this is not a nitpick.
"I have investigated myself and found that I did nothing wrong".
Ebay is reasonable.
Ebay is not all scalpers either. You could have gotten another 1080Ti from a legitimate previous owner.
> Ebay is reasonable.
Paying a scalper on ebay isn't. Which is what I said. Misstating what I said is disingenuous.
> You could have gotten another 1080Ti from a legitimate previous owner.
They were being scalped as well. Also, people were holding onto their 10 series cards because the newer cards were too expensive. So I would have had to buy an older card (and I had already had one fail) at an inflated price.
I could have bought a GT 710 or a GT1030, but that wouldn't have been any better than my 8800GTS really.
I could have flown to Taiwan and bought a card. I could have stolen one. I am sure you will invent another fantasy scenario where I could have gotten a graphics card that I didn't think about at the time.
The fact is that I could not buy a new card from an online retailer in the UK as they were out of stock. Even when they did come into stock there was a lotto system. So you couldn't really buy one then. That is a fact.
I am not a hardware guy, so I am asking this in good faith: excluding people with corporate backing, who actually needs DDR5 RAM? Gamers? Why is DDR4 or DDR3 not good enough?
Because modern CPUs are on platforms that support only DDR5.
If you are a gamer, chances are you want one of the AMD X3D CPUs. Whilst AMD did produce the 5600X3D, 5700X3D and the highly sought-after 5800X3D, these are effectively unobtainable now (outside of the used market, which is already at about 2x MSRP).
You are effectively forced into AM5 (or whatever Intel is doing) and they require DDR5. You don't have the "choice" to use DDR4 anymore in most circumstances.
If your question is more of a hypothetical (assuming we could use newer CPUs with DDR4 or even DDR3), the answer is a bit more blurred, but at least in a lot of gaming workloads you aren't memory-speed bound. There are some performance regressions, sometimes up to 15%, but a lot of this is negated with the X3D chips anyways (:
Who really needs more than warmth, shelter and something to eat and drink?
If you only need DDR3-like throughput you can keep a minimum of RAM for booting and caching, and set up swap on an Intel Optane drive: they're widely available and cheap (at least cheaper than RAM) on the second-hand market.
(For read only workloads (no writes or only very rare writes) any ordinary SSD would suffice; the point of resorting to Optane is its unique wearout resistance.)
I remember that literally everything, including basic necessities like food and housing jumped 30% higher overnight and never really returned to pre COVID prices. It erased about a decade worth of wage increases for most people.
I think the doomers are probably anticipating another round of that and they're probably right.
DRAM pricing is killing the everything market.
We just had a vendor uplift our quote 50% per unit for some machines because of a mix of memory + supply chain issues.
At work we just got a quote to upgrade a couple servers, original price a few years ago was ~ $150k. Essentially the same hardware, just newer, is now quoted at ~ $450k.
We decided to just keep our current hardware for now and extend a support contract for ~ normal price.
I wonder how long these shortages have to last until software developers are required to be mindful of RAM usage, like in the decades before.
Probably not. AI code generation will not allow memory to be used efficiently.
Quite the opposite, I'd wager. Now that AI can figure everything out, we can have the AIs do the performance work. Performance work a lot of the time also went against developer experience in terms of languages/patterns and such. AI doesn't need to care about DevEx, which might also drive a shift towards more memory-efficient languages and patterns. Only time will tell though.
Is it not enough to add to your prompt “use memory efficiently”?
> a vendor uplift our quote 50% per unit
Try 200% (tho tbf our boss sat on that quote for like a year and a half because he thought it was too pricey. Bet he regrets it now).
And all the quotes are now only valid for a week due to insane price fluctuation.
> We just had a vendor uplift our quote 50% per unit
Good thing they didn't increase it.
That’s strange, there aren’t wider market supply chain issues outside of DRAM. Maybe your vendor is just throwing excuses around.
>That’s strange, there aren’t wider market supply chain issues outside of DRAM.
GPUs, RAM, SSDs, HDDs, hell, even CPUs are starting to climb in price. It's an everything shortage and it's only getting worse.
A workstation that two years ago cost $3,000 was $10,000 last month and $10,500 this month. There are parts which aren't available at any price.
Wait what? That's over 300%.
Between this revelation and that post recently on HN about the scanned receipts and egg prices, I find myself wondering if we're worrying about the wrong things.
We're seeing massive inflation in computing, but because the dollar is holding its value we call it increased prices. The buying by the big buyers is the thing driving the inflation; its mechanism is scarcity.
But it's also localized. Only we experience this as a problem because compared to the hyperscalers we're poor.
The same idea applies to the price of groceries. As prices increase, the base increase is inflation, but logistics efficiency also plays a big role.
The effect is the same. The ones with more spendable income don't experience an issue yet, while in the projects nobody is eating fresh veggies.
The part that scares me is the creep, as I call it. Throughout the years I've always been able to carry price shocks and such but this time I'm out of the game. No more DRAM for me.
I then wonder if one day, without losing my job, I won't be able to pay for veggies.
At least with veggies you can stick seeds in the ground in the backyard.
My hard drive tree will take years to develop before it bears fruit!
DRAM and flash both seem to be up about 10x. HDDs are just impossible to buy.
Fuel price rises = logistics price rises.
You're right that fuel prices have risen. But usually the impact of fuel prices is mostly felt on bulkier, lower cost items first.
After all, a truck can carry a 10kg sack of rice, or a 10kg Nvidia GPU. If shipping costs for 10kg rise by $15, the ~$15 sack of rice has doubled in price, but the ~$3,000 GPU is only 0.5% more expensive.
For a truck yeah, but across the ocean, it isn't quite that simple because GPUs and grains are sent in different types of ships (or different modes entirely) that aren't interchangeable.
You're right - perishable goods have to be shipped fast. Your bananas, berries, fresh fish, and not-from-concentrate juice can't be on some slow-steaming container ship with the furniture, clothes, building materials and vehicles.
The GPUs can though.
Rice is a nonperishable grain, and grain ships in neither of those; it's shipped in bulk carriers.
And the GPUs are such high margin that they all take an airplane anyway.
That is other “different mode entirely” that exists to go across an ocean :)
This is driven by AI datacenter demand, not fuel prices. RAM prices have actually dropped significantly in the last couple days as the Iran war hit and the possibility that interest rates might go up and pop the AI bubble sunk in. (Though let’s see where they go after the last couple days of whipsawing.)
Yeah. Not true. Or send me the name of your server vendor. I’m buying.
Having issues with both price and availability on NVMe and SATA flash, starting to see it with some CPUs, and, for a personal project, high-density spinning rust (24TB+).
DRAM is up more than that 50% though.
Flash has supply (and price) problems too.
This isn't true: NAND flash prices are up too, though not nearly as dramatically. But the war means that fuel and shipping prices are way up as well.
They’re throwing something around.
I assume this is sarcasm.
SSDs and HDDs are being squeezed as well.
Don't forget SD cards.
"Memory card prices have TRIPLED in the last few months: when will this madness stop?!" https://www.digitalcameraworld.com/cameras/memory-cards/memo...
Sony stopped making their cards entirely, which stinks because I'd settled on their pro cards for all my camera bodies.
We just had a vendor tell us none of the HDDs we were looking for were available unless we also committed to a full NAS offering.
Well, this brings back the incentive to stop throwing RAM at the problem and start optimising the code. I'd like to see what smart people can do when there is money to be saved by not buying more RAM.
For some uses right now, this makes sense, but it has to be at scale. If you're working on something that will ship in two years and is used by end users, it might not be worth the effort since production will catch up.
> it might not be worth the effort since production will catch up.
Sounds unlikely.
Unless RAM use impacts financial metrics, it will be hard to justify work on memory footprint if customers don't complain a lot.
My migration from the Oberon System 3 to the Raspberry Pi 2 and 3 comes at just the right time, it seems (see https://github.com/rochus-keller/OberonSystem3Native/release...). There is also a stand-alone version of the compiler that runs on all standard systems (see https://github.com/rochus-keller/op2/).
Maybe the company will extend the availability of the Model 2b for a few more years and release versions with less RAM (<= 500 MB)?
it's probably time to call those old retired programmers to ask them how to reduce software memory footprint
or to teach that again
Taking a big, complex, already well-optimised program like Chrome or the Linux kernel and optimising the memory footprint is hard. But 90% of programs are just crappy web apps that nobody has even bothered to optimise at all. (Sometimes wrapped in Electron or something.)
If you go look, you often discover that 90% of the requests are useless, or at least could be combined. That 60% of bandwidth is used up by 3 high-res images which get displayed at 30x30 pixels. That CPU performance is dominated by some rubbish code that populates an array of a million items every call, looks up one element, then throws the whole thing away, only to regenerate the exact same list again a few microseconds later (see the sketch below).
We have plenty of RAM. In absolute terms, 8GB of RAM in a MacBook is 8 billion bytes: 64 billion ones and zeros. You don't need rocket science to make a CRUD app that runs well with that much RAM.
Computers don't get slower over time. If we were merely as lazy with computing resources as programmers 10 years ago, most programs would scream on modern hardware.
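To make that regenerate-and-discard pattern concrete, here's a minimal Python sketch; the function names and the price-table contents are invented purely for illustration:

    from functools import lru_cache

    # Anti-pattern: rebuild a million-item table on every call,
    # read one entry, then throw the whole thing away.
    def get_price(product_id):
        prices = {i: i * 1.1 for i in range(1_000_000)}  # rebuilt every call
        return prices[product_id]

    # Fix: build the table once and reuse it on later calls.
    @lru_cache(maxsize=1)
    def price_table():
        return {i: i * 1.1 for i in range(1_000_000)}

    def get_price_cached(product_id):
        return price_table()[product_id]

The cached version does the expensive build exactly once, which is usually all these hot paths need.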
It isn't that they are crappy web devs. It is that often the org paying for the development doesn't care.
I am a web developer of over 20 years. I can create insanely optimised pages using nothing other than vanilla CSS and JS.
I have been paid exactly once to do this. There is a site I built in 2023 that has a JS and CSS footprint of less than 100KB after GZip (large site). We even had the Go templates compiled when the web app initialised so the server responded as fast as possible.
Guess what happened when it went live? The content team used 8MB images for everything, and every single optimisation I did at the CSS/JS level was totally useless.
Devs don't care because the people above them don't care and therefore there is zero incentive to even bother.
This is a really great case study for why you only optimize when you actually have a problem, and only in the context of a profiler to define what needs optimization.
the engineering and leadership failure was at requirements time. why on earth would somebody pay for all that optimization without knowing what's gonna be on the page first?
> therefore there is zero incentive to even bother.
I hear you, and this is a real problem. But it's kind of depressing to need incentives to care about the quality of your work.
There is something very wrong now with how companies operate in general.
You get beaten down eventually. Late last year, I spent like an hour going through why a PR (and this developer's work in general) wasn't acceptable to my superior. He said to me that he was perfectly fine with someone not understanding basic language features (after 6 months using the language). He then merged it.
It didn't work (as I had warned) and created a situation where I had to turn off tests in some projects as it totally broke them. I've spent months fixing his crap and still haven't recovered from one bad PR. Now add two other employees that are like this, and my manager does nothing about it. I bought an AI package from Jetbrains and now have it do almost all the work. I normally spend some time cleaning it up. Management have made it clear to me that they don't care about quality, they won't hold anyone accountable, and they won't even fire people who clearly cannot program.
I am 43 years old this year. I just can't be bothered trying to be a hero anymore.
Similarly, my father who retired last week was a joiner/carpenter and would be considered a master boat builder. When my sister was little my dad made her new bed with hearts and flowers carved in the headboard.
He described how adversarial he was to his employer before he retired. He was engaging in malicious compliance (he is a layman and didn't know it was called that) because management was making his life miserable by employing the same sort of stand-up meeting ceremony nonsense in carpentry.
They managed to make someone with that level of skill hate their job because of process.
> There is something very wrong now with how companies operate in general.
Some companies. A lot of companies, maybe. But far from all of them.
I've done a lot of consulting work, which means I've done short stints at a lot of different places over the years. Some were absolute stinkers - like you describe. But I've also worked with some wonderful people and on some great, high performance teams. I understand that it's not so easy when you're 43 (and maybe with kids). But you don't need to stay in a job like this. It's not worth getting ground down like this. It's bad for your health. And it's horrible for your career in the long run.
Move to a smaller company. Or sniff around and find a better team within your existing org. In the words of my favorite poet: The world is made to be free in. Give up all the other worlds except the one to which you belong.
> Some companies. A lot of companies, maybe. But far from all of them.
I honestly think it is most of them.
> Some were absolute stinkers - like you describe. But I've also worked with some wonderful people and on some great, high performance teams.
I've totally given up on it. People don't value your work. I did a piece for a particular company. It worked perfectly. It was thrown away after a year and a half because management decided everything should be rewritten in <new framework>, ignoring the fact that what I had written was well documented and worked absolutely fine.
Now I shouldn't really care, right? I was paid and all. But it pissed me off. What's the point in doing a good job if people just throw your work in the bin?
I am looking at what my options are going forward. I am honestly considering being a car mechanic (I fix my own vehicles) or work outside for the canal trust. Realistically I suspect I might pivot to QA or doing something security related.
> I understand that it's not so easy when you're 43 (and maybe with kids). But you don't need to stay in a job like this. It's not worth getting ground down like this. It's bad for your health. And it's horrible for your career in the long run.
I've been looking for over 2 years. I want to move to be closer to my family, who are 300 miles away (on the other side of the UK). So remote is a must. A large number of positions are hybrid, so not an option.
Outside of that, many of the positions in the UK are in Defence, Intelligence or Law Enforcement, all of which I have ethical reasons not to work for. Beyond that there is gambling, payday loans, and spooky stuff like tracking people via facial recognition.
> In the words of my favorite poet: The world is made to be free in. Give up all the other worlds except the one to which you belong.
I find this condescending.
Dear god how do I get these jobs? I'm 35 yo and would work with you and accept your work, not jam crap code into things. I'm open minded and realize when someone's idea or code is better than mine.
I can save everyone a few MB of memory now:
1. Check that you really need a SaaS SPA to solve the communication issues between your team members.
2. HTML and CSS should be enough for 99% of corporate websites.
3. Resize the images on your websites; they're too big (see the sketch below).
4. Use Teams in the browser, not as a stand-alone app.
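For point 3, a minimal batch-resize sketch using Pillow; the directory, size bound and quality setting are made-up examples, not a recommendation:

    from pathlib import Path
    from PIL import Image  # pip install Pillow

    MAX_SIZE = (1200, 1200)  # hypothetical upper bound for display size

    for src in Path("static/img").glob("*.jpg"):
        with Image.open(src) as img:
            img.thumbnail(MAX_SIZE)  # shrinks in place, preserving aspect ratio
            img.save(src, quality=80, optimize=True)  # recompress the JPEG

Resizing at upload time (or letting a CDN do it) achieves the same thing without a batch job.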
The art is not lost, just not funded. Feel free to fund the programmers for your own software projects.
It isn't lost, but it also isn't a common skill set in programmers any more.
Most programmers are JS web devs writing client side code or server side CRUD.
I would guess < 10% of programmers writing code today get perf / valgrind out on the regular. I know I don't.
You can still write JS or TypeScript code that tries its best to keep memory use under check. JavaScript was around in the late 90s when the memory footprint of software was at least an order of magnitude lower, so it's absolutely doable.
JavaScript in the late 90s was doing a hell of a lot less than it is today.
You don't have to go that deep. 99% of the time, when our analytics or risk management teams have some really memory-inefficient Python and they want me to write them one of our "magic C things", it turns out to be fixable by replacing their in-memory iterations with a generator.
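A minimal sketch of that kind of fix; the CSV file and the filter threshold are invented for illustration:

    # Before: materialises every row in memory at once.
    def big_rows(path):
        rows = [line.rstrip("\n").split(",") for line in open(path)]
        return [r for r in rows if float(r[2]) > 100.0]

    # After: a generator yields matching rows one at a time, so peak
    # memory stays around a single line regardless of file size.
    def big_rows_streaming(path):
        with open(path) as f:
            for line in f:
                row = line.rstrip("\n").split(",")
                if float(row[2]) > 100.0:
                    yield row

    count = sum(1 for _ in big_rows_streaming("trades.csv"))

The caller's interface is the same (both are iterable), which is why this swap is usually painless.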
Most people don't have the chance to do that, but hopefully we can see some other languages get first class access on the web. At least there is the whole WASM project.
This is happening, sort of. All the big tech companies have major initiatives going to reduce RAM usage.
The old graybeards who know how to optimize efficiency may not work for them anymore, though.
This feels kind of worn out. Yes we use more memory, but we have more to work with. At the very worst you just let your favourite LLM take a pass at improving memory usage. For example, yesterday I was debugging an Electron crypto mining blockchain 2.0 app and the WebWorkers wou—-
Seems unnecessary. We can simply ask the LLMs to do it after, of course, imploring them to not make any mistakes.
That's circular logic. It's the demand from LLMs that are driving the DRAM shortage.
Why not ask your LLM?
You can surely trust the wolf on their brick-house knowledge
Toolchains and product pressure did more damage than rusty malloc discipline, because modern stacks assume cheap RAM and pile on deps until a small board looks underpowered.
Electron on an SBC is a bad joke.
memmaker for the win!
Nonsense. You prompt an adversarial agent to look for bottlenecks and suggest improvements in what your coding agent wrote, "and don't make mistakes".
Just rewrite your biggest memory hogs in Rust; it routinely slashes RAM footprint and demand for RAM throughput. The effect is even bigger than the typical reduction in CPU use. You can even ask AI to help you with the task; it will use a lot less RAM for the task than the rewrite will save down the road.
Why would we need rust, if the AI can just write really good code in C that doesn't exhibit any of the issues that rust protects you from?
Rust's compile-time checks are actually a nice set of guardrails for LLMs.
Nobody who works with LLM generated code believes that LLMs produce fault-free code.
Yes, languages with very strong type systems like Rust are incredible when paired with an LLM. Just like chat bots have a calculator as a "tool" because they are not the best at calculation themselves, they need a type system to deterministically validate the safety and cohesion of the code they generate.
It's languages like C that you have to watch out for, because the LLM will gladly say "this is safe!" when it's not.
AI can produce code that is about 70% as good as median quality code found on the internet.
I would not describe median quality C code as free from these issues
The Rust ecosystem and build tools are much easier to use than C. The value of a language isn't just syntax.
LLMs are great at C, probably because C is historically the most popular language in the world, by far. It only declined slightly very recently. But there's an insane amount of code written in it.
But are they "no UB in the code" great? For my use, Opus is absolutely good enough at Rust
> if the AI can just write really good code in C that doesn't exhibit any of the issues that rust protects you from?
"if"
If it could, you wouldn't need to use Rust. It can't, QED.
'rewrite in C, make sure there are no memory leaks'. You first.
Why is that less realistic than saying 'rewrite in rust, make sure there are no memory leaks'?
My point, which I should have been clearer with, is that we aren't at a state where you can just one shot a rewrite of a complex application into another language and expect some sort of free savings. Once we are at that state, and it's good enough to pull it off, why wouldn't the AI be able to pull it off in C as well?
You don't have to trust the AI to do it with Rust, you just have to ensure certain conventions are followed and you can formally prove you're 'safe' from certain classes of issue, no AI magic dice-roll.
A lot of people are very excited by the idea that now language capabilities (and almost every other technical nuance) somehow don't matter but much like gravity they will continue to assert themselves whether you believe in them or not.
So far humans have proven unable to write large apps in C without those issues. Given that their work is the training basis for LLMs, this creates two problems: the models don't 'know' what a safe app looks like either, and any humans reviewing the outputted code will be unable to validate it either.
Documentation and testing used to be mildly important: you'd better have them, but the quality of the tests didn't matter as much, since you had to get the implementation right no matter how good or bad your tests were.
Now that the work is delegated to an LLM, the test and documentation quality ultimately decides the quality of the product.
Since you as the programmer no longer have to deal with the language's annoyances directly and force the LLM to perform the drudgery for you, you can build a language that makes a trade off between drudgery and quality and receive a software quality upgrade essentially for free.
LLMs are really good at producing tokens faster than developers, so make those tokens count.
There are classes of bug that are easy to write in C that are impossible to express in Rust.
Hmmm... where could the oob access possibly be I can't tell
Easy to spot in a contrived example is not:
> impossible to express in Rust
I’m not going to argue with Rust folks who misrepresent the language.
You can prevent unsafe from being used in a repo with linter rules.
Because it can't?
Have you tried asking Claude 4.6 Opus?
Based on a FIDO2 spec, I used it to write a reasonably compliant security token implementation that runs on top of the Linux USB gadget subsystem (except for attestation, because that's completely useless anyway). It also extracted tests from a messy proprietary Electron-based compliance test suite that the FIDO Alliance uses and rewrote them in clean and much more understandable C, without the shitton of dependencies that the Electron mess uses. Without any dependencies but openssl libcrypto, for that matter.
In like 4 hours. (and most of that was me copy pasting things around to feed it reasonable chunks of information, feature by feature)
It also wrote a real-time passive DTLS-SRTP decryptor in C in like 1 hour total, based on just the DTLS-SRTP RFC and a sample of how I write suckless things in C.
I mean people can believe whatever they want. But I believe LLMs can write a reasonably fine C.
I believe that coding LLMs are particularly nice for people who are into C and suckless.
Pain is an important signal that tells you something is going wrong, before it goes wrong really badly.
Rust gives you a lot of pain (= useful signals), before damage occurs.
Now imagine you build a reinforcement learning harness around Rust and C. Which is better for reinforcement learning: impossible-to-detect failures in the final product, or loud and annoying compiler errors that force you to address them?
It can't, because there is no really good code to train off of.
The skyrocketing cost of high-end DRAM (4GB+) has caused some interesting shifts:
1. Shifting hobbyist focus: because hobbyists typically prefer parts under the $100 mark (so they don't "fret over breaking them"), the community is shifting away from modern, high-powered SBCs. Instead, people are moving toward:
2. Older SBC models (like the Pi 3 or 4 with lower RAM).
3. Microcontrollers (like the RP2040), which remain cheap. So used hardware and "repurposing" old tech is retro-trending again.
IMO, perhaps there will be a push to make software/firmware more RAM-efficient with AI-assisted coding?
am I crazy for thinking that the 16GB Pi 5 is just there to absorb money from people who purchase the most expensive version of things? Like really nobody needs that much RAM on a Pi?
I am running a bunch of stuff on my 8GB Pi and I've run out of memory to put more stuff on. I use it as a low-power server running a bunch of Docker containers. Some of these require at least 200MB and some use 2GB of memory.
I was going to buy a small nuc and load it up on memory but I've acquired an old Mac Mini with 16GB of ram, which will do.
Yes, you are crazy for thinking that. The extra RAM is useful for small LLMs and also for running lots of Docker containers. The very low power consumption makes it ideal for a low-end home server.
I use the 16GB SKU to host a bunch of containers and some light debugging tools, and the trickle of power it sips at idle will probably pay for the whole board, versus my previous home server, within about 5 years.
You can just as well not run docker. 1GiB machine can run a lot of server software, if RAM is not wasted on having duplicate OSes on one machine.
Docker is about containerization/sandboxing, you don't need to duplicate the OS. You can run your app as the init process for the sandbox with nothing else running in the background.
That makes docker entirely useless if you use it just for sandboxing. Systemd services can do all that just fine, without all the complexity of docker.
I think that on Linux, Docker is not nearly as resource-intensive as on Mac. Granted, I'm not sure of the actual memory pressure due to things like not sharing shared libs between processes.
That's the major problem. The more shared libs the app is using, the worse this is.
Containers are not Virtual Machines. 1GB cannot run a lot of server software.
If stuff is written in .NET, Java or JavaScript, hosting a non-trivial web app can use several hundred megabytes of memory.
Any node server app will be ~50-100 MiB (because that's roughly the size of node binary + shared deps + some runtime state for your app). If you failed to optimize things correctly, and you're storing and working with lots of data in the node process itself, instead of it serving as a thin intermediary between http service and a database/other backend services, you may get spikes of memory use well above that, but that should be avoided in any case, for multiple reasons.
And most of this 50-100 MiB will be shared if you run multiple node services on the same machine the old way. So you can run 6 node app servers this way, and they'll consume eg. 150MiB of RAM total.
With docker, it's anyone's guess how much running 6 node backend apps will consume, because it depends on how many things can be shared in RAM, and usually it will be nothing.
Only Java qualifies under your arbitrary rules, and even then I imagine it's trying to catch up to .NET (after all, Blu-ray players execute Java), which can run on embedded systems: https://nanoframework.net/
I listed some popular languages that web applications I happened to run dockerised are using. They are not arbitrary.
If you run normal web applications they often take many hundreds of megabytes if they are built with some popular languages that I happened to list off the top of my head. That is a fact.
Comparing that to cut down frameworks with many limitations meant for embedded devices isn't a valid comparison.
1GB is plenty for almost every case I've seen, 10-20x the need. Yes if you're running a repeated full OS underneath (hello VMs) then it'll waste more.
I run (regular) .NET (8) in <50mb, Javascript in <50mb, PHP in <50mb. C, Perl, Go in <20mb.
Unless you're talking about disk space.. runtimes take space.
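For anyone who would rather measure than argue, here's a minimal Python sketch that reports a Linux process's resident set size from /proc; pass the PID of whatever service you're curious about:

    import sys

    def rss_mib(pid):
        # /proc/<pid>/status reports "VmRSS:  <n> kB" on Linux.
        with open(f"/proc/{pid}/status") as f:
            for line in f:
                if line.startswith("VmRSS:"):
                    return int(line.split()[1]) / 1024
        raise RuntimeError("VmRSS not found")

    if __name__ == "__main__":
        print(f"PID {sys.argv[1]}: {rss_mib(int(sys.argv[1])):.1f} MiB resident")

Note that RSS is only one view of memory (shared pages are counted in every process that maps them), which is part of why these threads never agree on a number.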
> 1GB is plenty for almost every case I've seen, 10-20x the need
Couldn't have seen many then! Maybe you should look elsewhere.
> Yes if you're running a repeated full OS underneath (hello VMs) then it'll waste more.
Docker is not VMs. Other people have stated this.
> I run (regular) .NET (8) in <50mb, Javascript in <50mb, PHP in <50mb. C, Perl, Go in <20mb.
Good for you. I run web services that are heavier. The container has nothing to do with it.
It's not OS duplication per se, but systemd duplication.
> You can just as well not run docker.
this is naive
"just as well"? lmao sure i guess i could just manually set up the environment and have differences from what im hoping to use in productio
> 1GiB machine can run a lot of server software,
this is naive
it really depends if you're crapping out some basic web app versus doing something that's actually complicated and has a need for higher performance than synchronous web calls :)
in addition, my mq pays attention to memory pressure and tunes its flow control based on that. so i have a test harness that tests both conditions to ensure that some of my backoff logic works
> if RAM is not wasted on having duplicate OSes on one machine.
this is naive
that's not how docker works...
Yes, it's exactly how docker works if you use it for where it matters for a hobbyist - which is where you are installing random third-party apps/containers that you want to run on your SBC locally.
I don't know why people instantly forget the context of the discussion, when their favorite way of doing things gets threatened. :)
Context is hobbyists and SBC market (mostly various ARM boards). Maybe I'm weird, but I really don't care about minor differences between my arch linux workstation, and my arch linux arm SBCs, because 1) they're completely different architectures, so I can't avoid the differences anyway 2) it's a hobby, I have one instance at most of any service. 3) most hobbyist run services will not work with a shitton of data or have to handle 1000s of parallel clients
> Yes, it's exactly how docker works if you use it for where it matters for a hobbyist
What you described is exactly the opposite of how it works. There is no reasonable scenario in which that is how it works. In fact, what you're saying is the opposite of the whole point of containers versus using a VM.
> when their favorite way of doing things gets threatened
No, it's when someone (like you) thinks they have an absolute answer without knowing the context.
And by the way, in my scenario, container overhead is in the range of under a hundred MiB total. The thing I'm working on HAPPENS to require a fair amount of RAM.
But you confidently asserted that "1GiB machine can run a lot of server software". And that's true for many people (like you), but not true for a lot of other people (like me).
> most hobbyist run services will not work with a shitton of data or have to handle 1000s of parallel clients
neither of these are true for me but you need to take a step back and maybe stop making absolute statements about what people are doing or working on :)
I bought a Pi 500+ (basically a 16GB Pi 5 in a keyboard with a built-in NVMe HAT) to use as a family computer; otherwise I agree. Unless you're planning on using it as an actual desktop, there's no real reason for that much RAM.
Most Pis are sold to embedded customers, some of whom no doubt can use 16GB.
Browsers treat RAM as infinite; if you want to open LinkedIn for whatever reason, you might wanna get a bigger model. I'd personally rather buy more RAM than I need than deal with the cost of fixing or working around the issue in future.
No, you are not crazy. It's silly to try to use a Raspberry Pi 5 16GB (or an equivalently priced product) as a desktop workstation with a GUI on it when much better actual x86-64 based workstations exist. Ones with real amounts of PCIe I/O lanes, NVMe SSD interfaces on the motherboard, multiple SATA3 interfaces on the motherboard, etc. In very small form factors, same as you'd see in any $bigcorp office cubicle.
> as you'd see in any $bigcorp office cubicle.
Which bigcorp does use cubicles?
It's an incredibly lopsided machine. The Pi 5 is decently powerful, but you really, really should not be attempting to use one as a desktop replacement. While theoretically possible, you are so much better off with a $50 used SFF PC.
I bought the Pi 500+; it fit my requirement of being built into the keyboard.
Living in Korea where Samsung and SK Hynix are headquartered, the DRAM pricing situation is interesting from the supply side too. Both companies have been aggressively shifting capacity toward HBM for AI/datacenter use because the margins are 3-5x higher than commodity DDR5. The hobbyist SBC market is essentially collateral damage of the AI boom — manufacturers are rationally choosing to serve the more profitable customer.
Unfortunately I don't see this reversing until HBM demand plateaus or new fabs come online, which is 2-3 years out at minimum.
The whole point was to kill personal computing.
How spoiled we have become...
I remember my company buying RAM expansion boards for our PCs back in 1989 so we could run OS/2. The 4MB boards (MB! Not GB.) cost around $2000 at the time.
Like everyone, I love getting tons of RAM or SSD storage on the cheap; but we have a ways to go before we reach the 'unaffordable' level.
This whole saga is having ripple effects even in the second-hand market. In 2020-2022 there was a glut of those 1L mini PCs on eBay and other resellers, which were WAY better value than the RPi4 at the time, which was in short supply due to COVID. These mini PCs were pretty affordable and could be upgraded with extra RAM, a new SSD/NVMe drive, etc. to make perfectly good little home servers. I still have mine, which has been running for a few years now: Intel 6th-gen CPU, Lenovo ThinkCentre.
Nowadays the price of these second-hand mini PCs has shot up, and even if you do get a chance to get one, upgrading it with more RAM is gonna be painful.
If you can't find mini PCs at a reasonable price, there's always old enterprise SFF desktops or even thin clients, which have very low specs (and are usually not upgradable) but can be loaded with a custom lightweight OS.
I know this is a pipe dream (governments of the world working together to benefit their citizens instead of blowing up some other country's citizens!) but if we aren't gonna regulate AI collectively to ensure we are developing it responsibly, the least we can do is ensure AI is given bottom billing when it comes to all the resources it's sucking up: energy, components, engineers, construction, etc.
My preference is responsible AI development which prevents it from turning into an arms race but that’s clearly not on the cards, especially with current leadership.
I wish Korea or the US could pass a law requiring Samsung/SK Hynix or Micron to allocate a certain amount of their production for consumer use.
I think the least we can do is even less than that.
What is SBC? Session Border Controller? Small Business Consumer?
Single Board Computer.
The extreme DRAM pricing has had an unexpected side effect: triggering a lot of panic buying. I know several people who delayed PC upgrades for years but then panic bought new systems in this market. The trigger was seeing all of the "It's only going to get worse" and "This is the end of personal computing" headlines.
They're already regretting spending so much now that prices have started to tick downward.
I keep telling everyone: If you don't have a pressing need to buy right now, please wait 6 months and check again.
wasn't "panic" buy but I built a new comp early 2025, cuz at worst case would be complete supply crash and at best case it was going to be more expensive.
Def don't regret doing that, though I regret not springing for the extra RAM.
Same. I got 64GB for my new build the day this whole thing started, but I kind of wish I had gotten 128 just for bragging rights.
That's actually a reasonable response to market volatility and illiquidity. It's not just high prices, but prices that still fail to reflect the actual state of the market despite the rises.
It's not a reasonable response. If you don't need a PC right now, buying in the middle of a demand spike is the worst time to do it.
It's only a spike if it comes down. Every RAM chip is a lottery ticket with a plausible chance of giving one lucky winner fabulous prizes like absolute dictatorship of the entire world and physical immortality. What else are the billionaires going to spend their money on? Arms races can absorb unlimited resources.
What’s interesting is mini PCs are dirt cheap. The RAM for them costs as much as or more than a barebones Ryzen 7 mini PC.
In January I bought a barebone ASUS NUC, which is relatively expensive among mini-PCs, but I need to run it 24/7 for many years, so I made a choice based on expected reliability.
After adding DRAM and SSDs to it, the barebone came to only 40% of the total cost, i.e. the price of the memories was 50% higher than the barebone computer.
At that time, the memories were still cheaper than today, so now the price ratio would be even worse. (The barebone NUC had an Intel Arrow Lake H CPU and it cost $500, while 32 GB DDR5 + 3 TB SSDs cost $750.)
>They're already regretting spending so much now that prices have started to tick downward.
Where?
Last month I "panic bought" a $999 Macbook Mini (32G) so I could run small models, Image Generation, and Voice synthesis on it. I don't think I regret it yet, despite the fact that you can get a 16G for $599, which is honestly a much more efficient price per Gig.
I think it is interesting that, at least thus far, Apple has chosen not to raise the prices of their computers despite the price of RAM presumably going up by multiples.
Tipping point for me: It will be a pretty kickass media server for at least a decade.
Didn't they eliminate the highest tier Mac Pro and raise the price of the one under it?
I panic bought a MacBook Pro M5 Max with 128 GB of RAM. I yolo'ed because I don't think RAM prices will get better within 18 months, so this might be the last time we see "cheap" memory, even though the laptop cost me $5000.
High-speed NVMe is soaring too. Some popular Samsung drives are up 3x compared to 12 months ago.
Bought some Patriot SSDs recently; they are cheap, but apparently DRAM-less. I'm using them to replace the HDDs in 17-year-old hardware, so I figure it's not terrible, plus the read/write speed is still high.
Got my RPi 5 16GB quite a while ago for around $160 and already thought that was expensive... It’s still powerful enough for almost everything I throw at it, honestly a bit overkill in most scenarios.
With prices steadily going up, for me it's starting to feel more sensible to repurpose the RAM sticks I've collected from old PC builds / laptops and just throw together small amd64 boxes instead of buying more RPis.
I wonder if there are low-power Intel or AMD boards that accept DDR3. So many sticks of 2 / 4 / 8GB DDR3 inside laptops going into recycling or landfills which would do perfectly fine for low-power purposes. Hell, performance for standard workloads scales with access times, not bandwidth, and DDR3 sits nicely at CL8 1600 MT/s and CL10 2133 MT/s.
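To put rough numbers on the access-time point, here is a quick sketch using the standard first-word latency formula (the 2000 factor converts CAS cycles at a given MT/s data rate into nanoseconds):

    # First-word latency (ns) = CAS latency * 2000 / data rate (MT/s),
    # since the DDR I/O clock runs at half the transfer rate.
    def first_word_latency_ns(cas_latency, data_rate_mts):
        return cas_latency * 2000 / data_rate_mts

    print(first_word_latency_ns(8, 1600))   # DDR3-1600 CL8  -> 10.0 ns
    print(first_word_latency_ns(10, 2133))  # DDR3-2133 CL10 -> ~9.4 ns
    print(first_word_latency_ns(16, 3200))  # DDR4-3200 CL16 -> 10.0 ns

So on raw first-word latency, old DDR3 is in the same ballpark as much newer memory; it mainly falls behind on bandwidth.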
The Dell OptiPlex 3040 or some configurations of the Lenovo M700 may use 6th-gen Intel CPUs with DDR3.
SBC = Single Board Computer. (Betriebsblind: so immersed in the field that you forget outsiders don't know the acronym.)
People hate AI. This will make them hate it even more. Yet somehow the market has convinced/forced us to use these products even though we might not want to.
A recent nationwide poll[1] shows AI has a poorer approval rating than ICE — ICE! — probably due to their overlords being "those" SV types. Everyday AI features are being shoved down our throats. I can't even choose to not install Gemini related apps on my Android when I select "which apps to install" when booting a new phone.
But people are a weird bunch. They largely don't buy products aligning with their values. No one is jumping up and down for GrapheneOS phones even if they have amazing privacy-first software. People buy 6 mpg Hummers and iPhones for fashion, brand, money, convenience/function. The pain threshold of all the bad effects still is not high enough to make them quit these products in a meaningful way. Values and privacy are way down their list. I wish people would not buy/install AI-related features from big tech and be more discerning, but that is likely a pipe dream.
[1] https://pos.org/wp-content/uploads/2026/03/260072-NBC-March-...
I'm people. I like AI
Bought a couple of 32GB SBCs before this all hit the fan. And also built an SSD NAS before the wave hit.
So timed that all pretty great. What worries me is my desktop is up for a full new buy somewhere around early '28. That could be a train wreck depending on how the Taiwan situation goes.
> So timed that all pretty great. What worries me is my desktop is up for a full new buy somewhere around early '28
That's a very specific date / timeline. How do you decide to do a full new buy? I ask because I own a desktop that I built 15 years ago which I was flirting with replacing completely last year, but unfortunately I didn't pull the trigger ... oops :(
My old rig is still going strong. The motherboard can only take up to 32GB DDR3, though. The CPU is an Intel i7-4790K, which still holds up very well today if you are not running a resource-hog OS (looking at you, Windows). Overall it is completely serviceable for my needs. Being honest with myself, the only reason I wanted to upgrade was for nerd cred, but I don't game much anymore and don't do any ML tasks that require lots of local compute.
My PC is similar. I upgraded it to a 4790k a few years ago (best CPU on the socket). What's funny is I also maxed out the RAM as well because I realised two more 8GiB sticks were like £30 so why not. I thought it was a funny thing to do at the time as I didn't really need that much, but glad I did now. It's going to have to do me for many more years to come, but I'm fine with that. I don't game at all. Just have to hope nothing fails. I did build it with solid foundations: good and overprovisioned PSU, Asus mobo, so here's hoping.
Unfortunately I do also have server gear now as well. I'm going to have to really think about what I actually need now...
A 2GB RAM (and no eMMC) Raspberry Pi 5 in Canada is $90. Around $150 is where you can get used N100 mini PCs with a proper SSD and at least 8GB of RAM. It’s crazy.
Duckduckgo did not help me with "SBC meaning". Here, no help either. Guess I am asking an LLM next, which I hate.
Single Board Computer. Yes, it might help to explain such acronyms in the article. And yes, if you think the name is slightly misleading, I agree: most PC mainboards today are also "single board computers" - you don't have to add any additional cards to get a functional PC (unless you consider RAM modules and M.2 SSDs cards).
Maybe you should have searched for "SBC abbreviation" instead, as it's not exactly a philosophical question. It's a TLA (spoiler: TLA stands for Three-Letter Acronym).
SBC (single board computer) is a fairly widely used term.
https://en.wikipedia.org/wiki/Single-board_computer
Single Board Computer
Yep. I just bought a Pi CM5 for my son, for his ClockworkPi uConsole. CAD $200 for the 8GB module. I bought a whole Pi5 16GB not long ago for under CAD $200.
I will not be buying any more SBCs at this price point. I wonder if Raspberry Pi will survive.
Funnily enough, the consumer impact of DRAM etc. costs came up in an unrelated interview I was doing at KubeCon last week. My interviewee also made the observation that a lot of these big companies are buying components to keep in reserve for data centers that haven’t even been built yet.
The Raspberry Pi Zero 2 W (512MB) and Pi 3B (1GB) are both still super cheap, if you can cope with that little RAM.
Time to break out the Small Web protocols and start living within our means!
Sorry, best I can do is coal powered datacenter vibe coding.
I purchased a Beelink Mini S12 for $160 in May 2025 on Amazon. It's now at $260.
Yeah, mini PCs are quite a bit more expensive. An N100 with 16GB could be had for $135 or sometimes less on sale but is now about $250
In the U.S., some of it was tariffs, though.
I'm happy I picked up my Pi 500+ last week. Lovely little computer at $250, not so much at $400.
The title should say: "Collusion of large corporations promoting LLMs with RAM manufacturers is killing the hobbyist SBC market (and bankrupting anybody trying to get a PC or laptop)".
Because we all know that DRAM prices have spiked since production is going to those infernal chatbot training data centers. Same as a lot of the electricity in some parts of the world, BTW.
Can you elaborate on the collusion aspect? Is the implication that OpenAI and Anthropic are coordinating their purchases in such a way that they target the hobbyist market? What’s the collusion angle here?
OpenAI signed letters of intent for 40% of the DRAM supply because they have no moat and want to starve their competition.
Only works so long as they eventually pay up... well, unless the manufacturers make too much money this way. That said, are there some Chinese manufacturers that aren't part of the cabal and could undercut them?
Except that it doesn't work like that. If you buy DRAM and don't do anything genuinely worthwhile with it, you'll ultimately dump it all right back onto the market, and everyone knows that. The biggest worry is that it's actually OpenAI and their direct competition starving the rest of the market because they predict AI research and the like to be a highly valued use for the stuff, compared to building gaming PC battlestations or whatever the highest-valued use was before. Many observers think that this will also happen with GPUs and cutting-edge digital logic more generally.
> Collusion of large corporations promoting LLMs
> We all know that DRAM prices have spiked since production is going to those infernal chatbot training data centers
I know it's very fashionable here to talk about capitalism as some one-hand-washes-the-other big-corp organized scam, but if you put that ideology aside for a moment, I think you contradicted yourself here.
I personally don't like conspiracy-theory thinking. If I were a DRAM manufacturer and had to choose between servicing a single customer who orders hundreds of millions worth of my product, or servicing a very large number of customers who each order tiny amounts of the product, then of course I would focus on the large client, because they are easier to service for the expected profit margin. I wouldn't even need to think about advertising, sales, all that jazz. Looking at it from that perspective, it seems pretty logical to me that a spike in demand from datacenter operators would raise prices dramatically. I struggle to see room for collusion / conspiracy here.
A couple of issues: first, there is a history of price collusion (see the DRAM price-fixing scandal on Wikipedia), and while it may be "logical" from a seller's point of view to prefer large orders, this upsets a lot of people and used to be illegal in the United States (it may still be illegal, but it's not enforced).
Oh, I did not know that. Thanks for the clarification
There is a risk in having a single large customer. As a small food manufacturer we've been warned about it, e.g. not to sell to Walmart even if given the chance.
If one customer buys a majority of your product, your entire business is at their mercy. They can dictate terms, or quit buying from you which can end your business.
So even with RAM - if a company goes all in on RAM for an AI company, what happens when the AI bubble bursts, or the AI company spins up/buys their own RAM factory and quits buying? Did you make enough money to tide you over until you can regain your old customers that have gotten used to not being your customer?
You are making some very good points.
I didn't say "Criminal conspiracy" nor "Capitalism is bad" (although I'm not a fan, and not because of the DRAM price spike). What mean by collusion is that OpenAI apparently agreed with Samsung and Hynix to secure 40% of global DRAM output, for their own exclusive use.
See coverage here:
https://bizety.com/2025/12/28/the-dirty-dram-deal-how-openai...
https://www.tomshardware.com/pc-components/dram/openais-star...
Most software uses 10x more memory than is necessary to solve the problem. In an ideal world, developers would stop building bloatware if their customers can't afford the DRAM.
I agree. OTOH, there are many very cool things we can build if we're able to assume a user can spare 2GB of RAM, things we'd otherwise have to avoid entirely, like 3D scenes with Three.js or in-browser video/photo editing. We should be making sure that extra memory is enabling genuinely richer functionality, not just compensating for developer laziness (fewer excuses now than ever for that).
Totally agree. Just like graphics card prices. Is it worth building a PC now?
After discovering Dell Alienware clearance and graphics card availability in those Alienware computers, I haven't felt the need to build a computer for the last five years.
I looked on their site. I don’t see any section for clearance.
It's good that OpenAI is failing to meet its obligations on hardware, but given what we know about the DRAM industry, I suspect drastically higher prices will be the permanent new normal, just like with most everything else.
I've been having fun getting Linux 7.0 running on my Milk-V Duo S. It's still available super cheap (though tariffs make buying single quantities expensive), so I stocked up on Duo boards. I guess I'm hoping for an upside where there's more interest in cheaper overstock boards from 2022+.
It's terrible. Fake money is fueling the exhaustion of real resources in search of questionable outcomes ("AGI"). Imagine if all of this money were invested in curing cancer.
Imagine if AI cures cancer.
Let's also imagine an alternative reality where some reasonable percentage of the $2.5T in current year AI spending was instead invested in the "general intelligence" researchers we already have for the same purpose. I think it's a pretty reasonable expectation that 1) they'd probably make more progress and 2) that money would help a lot more people in the process (through jobs and economic activity).
You can imagine all you want, but my understanding is there is no credible evidence that scaling LLMs will result in true AGI.
Obviously there's no "evidence". Why would you even think we need AGI? But I'm happy to hear your reasoning if you were one of the few (only?) people who imagined that software that could predict the next word could do what it is now doing.
An already-ageing population living even longer while nobody wants kids anymore?
Am I correctly reading your argument that you are pro cancer for the purposes of demographic balance?
I think his broader point is that life preservation doesn't seem like such a big win if overall quality of life is dropping to the point where people decide not to subject their potential children to the burden of living.
You don't have to imagine; it will hallucinate slop for you with full confidence every time.
I've already seen at least one person who was pretty sure that the preprint paper they co-authored with AI (read: AI wrote for them) was going to cure cancer and make them billions of dollars.
There was only one problem. The paper jumped straight from "this paper will show how our new treatment could cure cancer forever" to "as you can see, these results clearly show that our treatment cures cancer" - with neither any actual results nor any specifics on the treatment. And I don't just mean that the paper didn't go into details; writing the paper was the full extent of their "research".
So QED then, I guess.
Proof by assertion.
AI was used fundamentally for COVID vaccine development. AI is used for research on all modern drugs. It's a certainty that if cancer gets cured, AI will have played a fundamental role, since it's already fundamental to the precursors.
Imagine if people with inexpensive tools cure cancer. You know, like they always have so far?
Curing cancer isn't profitable. But even if someone tries to mix AI and biotech, the result will be dangerous medical slop.
The market can stay irrational longer than I can stay solvent. I'm holding on to the hardware that I have.
Does anyone mind explaining why the 2GB model only increased by 20% in price while the 16GB model nearly tripled in price?
The main cost input is presumably RAM. They are passing it through.
If everything on the board but the RAM costs $30, and RAM is going from $10/GB to $20/GB, then they have to change the price $50 -> $70 to break even on the 2GB board, and $190 -> $350 for the 16GB board.
In other words, the RasPi is now priced like a stick of RAM with a bonus computer attached, because RAM is massively more expensive than the rest of the computer.
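To make the pass-through model concrete, here is a quick sketch using the illustrative numbers above ($30 of non-RAM cost, RAM going from $10/GB to $20/GB), not real BOM figures:

    # Break-even board price = fixed non-RAM cost + GB * RAM price per GB.
    def board_price(fixed_cost, gb, ram_price_per_gb):
        return fixed_cost + gb * ram_price_per_gb

    for gb in (2, 16):
        old = board_price(30, gb, 10)  # RAM at $10/GB
        new = board_price(30, gb, 20)  # RAM at $20/GB
        print(f"{gb}GB board: ${old:.0f} -> ${new:.0f} (+{new / old - 1:.0%})")

That prints a 40% rise for the 2GB board but an 84% rise for the 16GB one: the bigger RAM's share of the BOM, the larger the percentage jump, which is why the 16GB model's price moved so much more.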
The 16GB model has eight times more ram?
Does this mean the Atom 8GB boxes I have lying about are now more valuable?
> memory prices won’t remain at their current very high level indefinitely; the circumstances in which we find ourselves are challenging, but in the future they will abate.
How long does it take to increase manufacturing capacity? How long will the decision to increase manufacturing capacity be postponed? If AI skeptics are right and the bubble bursts, increasing capacity inordinately will prove a big mistake. If AI skeptics are wrong, delaying the increase of capacity inordinately will prove a big mistake.
In a sense we are forcing DRAM manufacturers to play judge, jury, and executioner:
If they don't increase capacity corresponding with AI boom, the DRAM prices may ultimately cause an AI winter.
If they do increase capacity (lowering per unit costs), the lower DRAM prices may enable AI summer to continue.
This looks like a self-fulfilling-prophecy scenario.
What are the barriers to new DRAM supply coming online?
The main barrier is the US demand that nobody sell semiconductor manufacturing equipment to China.
Otherwise the Chinese memory manufacturers would be happy to exploit this opportunity.
Actually, some Chinese companies already sell cheap DDR5 memory modules, but their production capacity is severely limited by the US blockade, so the cheap memory is available in only a few places, mainly in Asia.
So the high memory prices are caused by the USA twice over: by the AI companies that have bought up most of the existing production, and by the US government, which has been sabotaging the Chinese memory vendors for the last couple of years in order to protect Micron's market share. (The US sanctions coincided with the moment when several companies, including Apple, intended to use the cheaper Chinese memories, so preventing that seems a much more likely reason for the "sanctions" than the BS excuse that consumer DDR DIMMs and SSDs are dual-use products that may benefit the military. Even if that were true, the sanctions did not prevent the Chinese from producing anything needed in small quantities, as for a military application; they have only prevented mass production with state-of-the-art lithography, which is what would have impacted prices in consumer markets.)
Huge capital outlays and no guarantee the prices stay high.
Startup costs measured in the billions, with no guarantee of success, and a long payback time horizon in a market that almost everyone thinks is - in one way or another - a bubble.
Oh yeah, the market is also getting intense scrutiny from powerful geopolitical entities that are quite explicit that they don't believe in fair play or consistent, stable rules.
Would you place that bet?
Been building a home server recently, and the RAM prices really caught me off guard. 32GB kits are almost double what they were last year.
You're lucky then, if it's only double.
I have a receipt from August last year that prices 2x 48GB DDR5 modules at $179 AUD apiece. The same vendor currently has them at $708 AUD EACH!
It's not easy to have hobbies... I keep needing more money...
We're somehow in a race between LLMs curing cancer, destroying the planet by "You're right to be mad, I shouldn't have issued those launch codes, it's even in my Claude.md file, I'm sorry," and rendering modern technological civilization uneconomical. I know this is statistically the best time in history to live, but lord, I could use a vacation.
Is there anything (technically) preventing SBC manufacturers from adding SODIMM slots?
I was expecting the Milk V Titan to avoid this memory nonsense since it has two unpopulated DDR4 slots, but it has fallen off the radar like several other SBCs.
SODIMMs are huge compared to a BGA memory package which is a problem if your goal is to minimize your board size (e.g. I don't think there's a reasonable way to fit it into a Raspberry Pi form factor without something weird and expensive like a mezzanine connector). Routing the signals is also somewhat more annoying because they all come out of one edge of the connector compared to a BGA package which has them fan out in every direction, giving more space for length matching traces, etc. You'll likely need additional PCB layers compared to a BGA chip.
DDR4 is also crazy expensive right now, so this just depends on you having some around from a previous build.
If you're willing to check the used market, it is more affordable, as the spike there isn't as severe as for used DDR5.
It actually seems to be slightly less expensive than DDR5, perhaps due to the lower throughput that makes it uncompetitive for AI-adjacent workloads.
My PDP-11 runs fine on 512K
I bet the power draw is at least 50x though
Unless you're actually using the GPIO pins or other weird I/O, I really fail to see the purpose of having an 8GB or 16GB RAM Raspberry Pi (at a much higher price than it used to be) as a desktop workstation with a GUI on it.
The idea of putting sixteen gigs of RAM in a Raspberry Pi is nuts. The legit use for a Raspberry Pi (or a competitor), as an embedded headless thing with no KB/mouse/display attached, should run fine in 2GB of RAM or less, assuming an ordinary Debian-based OS environment.
I would much rather have a used, ex-corporate/ex-lease, small form factor or ultra small form factor x86-64 desktop PC (Dell, HP, Lenovo, whatever) with 16GB of RAM in it and an SSD on a SATA3 or NVME interface. Whatever is the "best" SFF that you can buy via huge eBay used equipment dealers on any given month.
Despite being many years old, whatever you can buy on eBay for 200 bucks (at least before the recent RAM fiasco) with some recent-ish quad-core Core i5/i7 or Ryzen in it will run circles around a Raspberry Pi 5.
SBCs are not just RPis. Other brands can still be bought cheaper.
A few, until their current stocks run out. Orange Pi already increased prices (their boards are similarly priced or more expensive than equivalent Pis now), and Radxa seems to just stop selling certain models (at least in NA) once they run out of stock.
Arduino has one of the cheapest 4GB boards now, but I wonder if it's just because they made a ton and the demand for their strange board has been low?
An 8GiB RPi 4B is 190 USD where I am; an 8GiB OPi 4 is 90 USD from AliExpress, shipped to where I am.
https://www.aliexpress.com/item/1005010198492129.html
Holy crap, it is super expensive now. I should have bought an extra one in the past.
It’s great that everything I love is getting ruined so that the most mediocre people on earth can generate slop on a daily basis.
This is a good thing. Pis were priced too low for OEMs and too high for hobby work. It's no longer an accessible board for fledgling hackers. Reclaim hardware for your nephews, which is good for the environment, too.
“Killing” is strong phrasing.
Yes, a $250 mini PC I bought last year is now $350.
Is this pricing bad? Yeah, compared to what it was.
Is this the end of the world? Not really, and we’ve seen price spikes for all kinds of PC components in the past. It’s rarely permanent.
That sounds pretty nice. The same mini PC I paid $195 for in 2023 is now $450. Seems to be life in Canada sometimes.
It has caused me to look around, though. I have found the Pi Zero 2W to be surprisingly capable for Pi-sized jobs.
Not everyone earns tech-bro salaries and can sustain a thousand cuts. Many hobbyists are scraping and saving money to acquire hardware. For some, it very well may be the end of their world.
We are talking about brand-new latest-gen hardware here. People with low budgets are always scraping and saving for deals and don't need to buy something brand new from a pricey brand name like Raspberry Pi.
You can still jump on eBay and buy all kinds of dirt cheap used pieces of hardware.
My buddy just bought a used ThinkPad T14 with 32GB of RAM and 1TB of storage for about $500. You can get by with a whole lot less.
In this context, I will also present the idea that the Raspberry Pi has represented quite poor value for money for many years now.
Have you looked at how expensive international shipping is? eBay covers just a few countries, the rest of us can't buy there because we'll be paying 10 times the cost of hardware to get it over here.
I already moaned about this recently, but to briefly reiterate: the only hardware that's becoming available for most people in my region are Frankenstein desktops built from heavily used 10+ year old Xeons running on suspicious motherboards made by obscure Chinese manufacturers you've never heard of. This is pushing ever more people towards smartphones and away from actual computers.
But at least we got the bullshit machine in return, that's something, I guess.
> Have you looked at how expensive international shipping is?
It really shocks me how bad shipping has gotten. It's nearly unaffordable to buy things on eBay from the US as a Canadian due to shipping costs, so I can only imagine just how bad it is for people from other countries.
It's probably unaffordable for anyone to buy things from the US due to shipping costs, because the Trump administration has completely screwed up everything there with tariffs and mismanagement of the USPS and more. But the US is not the world. A better comparison is how much it cost to ship things from China a year ago compared to today.
> Frankenstein desktops built from heavily used 10+ year old Xeons running on suspicious motherboards made by obscure Chinese manufacturers you've never heard of.
I've heard reports that these are actually surprisingly good. I wouldn't want to use one in a production environment, but for homelab stuff they're an incredible deal.
That cheap stuff from eBay that people talk about all the time seems to be available only in North America, or in the best case Western Europe.
Which generation ThinkPad T14?
Sometimes goes the other way. I was recently looking for a specific PC case (Fractal Design Torrent Compact without a window) and it's entirely unavailable in North America.
Placed an order with a Polish seller on eBay, received a message that FedEx wouldn't take the package due to size, replied that they could send it with any shipping company and that I'm not concerned with shipping speed, after which they cancelled the order on me.
Yes, 90%+ of sellers refuse to ship here (and we're not even under any sanctions and/or political pressure of any sort). I hear about these magical $100 ThinkPads all the time; I've yet to see anything cheaper than $300 (add another $100+ for shipping).
I think the post pertains to SBCs, whose viability mini PCs threaten as well.
We need to come to terms with the possibly-irreparable harm that private capital has done to the West. Capital is long past serving the interests of the broader public; and we're now past the point of capital even serving the interests of the corporations it's being invested into. The demand for shares in OpenAI and Anthropic is so high that it's pushing their valuations into territories they can never hope to drive enough revenue to fulfill; the churn of this massive warchest of private capital inside the AI industry has for all intents and purposes created a communist economic structure, with all of its faults. Grifting, favored suppliers; if the stories about SK Hynix and Samsung guaranteeing 40% of their wafer supply to OpenAI on a letter of intent they cannot follow through on are true, we're even getting good old-fashioned communist misallocation of resources. The day may eventually come when the USG is faced with the decision of bailing out the trillion-dollar OpenAI Corporation, taking a stake to add to its portfolio next to Intel and others; maybe normies will then realize what is happening, but the writing has been on the wall for years.
I love capitalism; its ability to allocate resources on a macroeconomic scale, picking winners and, more importantly, losers, has no rival system. As a younger, more naive startup employee, I'm on the record making a total fool of myself by responding to our CEO talking about struggling to find PMF with "then maybe our company doesn't deserve to exist" (yeah...). But the "capitalists" who run the world aren't actually interested in capitalism, and thus definitionally can't be capitalists. At least once upon a time we had filthy-rich titans you could look up to, like Buffett and Gates (Epstein stuff aside); but at this point most of them aren't even enviable people. Despite being richer than God, people like Huang, Musk, Ellison, and Zuckerberg feel more like vampires; they want to spend their whole lives doing the exact same thing, getting richer and richer, refusing to put a ladder down for anyone else to take a shot at improving on what they've built. I actually have a modicum of respect for Bezos and, to whatever vanishingly small degree, Trump; at least they're trying something different.
Apple and Sama didn't do the consumers any favors this year.
What did Apple do?
Grabbed up as much RAM as they could, nearly no questions asked, at above-market rates in some cases, ramping up the perceived demand and decreasing supply significantly.
Am I allowed to complain about this or do I have to get my VC's approval first
The SBC market's been on life support for a long time. YouTubers making videos about them don't seem to grasp that and keep pumping out reviews and projects like it's still 2019. The Pi specifically has plummeted in popularity, and for most use cases they just aren't a cost-effective option when second-hand micro PCs are dirt cheap and vastly more capable.
I don't think comparing new Pis to used micro pcs is fair. Compare a _used_ Pi with a used micro pc. If you have any geek friends, it's probably not hard to find a used Pi for free.
PCs don't have GPIO. They're different markets and the desktop replacement never materialized.
You can get GPIO and any other needed interfaces on any PC by adding a $10 microcontroller on a USB port, e.g. one of the STM Nucleo boards.
If you use a PC or mini-PC that you already have, that is much cheaper than using a Raspberry Pi or similar.
For most projects using GPIO, a <$10 ESP32 board or Arduino clone will suffice
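As a minimal sketch of that approach, assuming an Arduino clone flashed with the stock StandardFirmata firmware and the pyfirmata library on the host (the serial port name here is a guess; adjust it for your system):

    import time
    from pyfirmata import Arduino  # pip install pyfirmata

    # Open the board over USB serial; '/dev/ttyACM0' is an assumption.
    board = Arduino('/dev/ttyACM0')

    led = board.get_pin('d:13:o')  # digital pin 13, output mode
    for _ in range(10):
        led.write(1)               # drive the pin high
        time.sleep(0.5)
        led.write(0)               # and back low
        time.sleep(0.5)

    board.exit()

The same pattern covers reading inputs and PWM; for anything timing-critical you'd push the logic onto the microcontroller itself rather than round-tripping over USB.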
MCUs are great, but a lot of projects require Linux and an application processor. The Pi is the industry standard.
1. Most people aren't using GPIO.
2. If you are, then as others said, you can get very cheap GPIO add-ons.
Imagine if there were a universal means to attach external devices, perhaps one of these external devices could handle GPIO. You might even call it a "Universal Serial Bus."
An SBC is a great gateway to embedded development. There are some industrial PCs in that niche, but generally mini/client PCs don't fill that need.
For a couple of years, a Pi was a decent value as a cheaper small desktop replacement.