Further to what's listed elsewhere:
A RAM chip takes several months to make, starting from an empty silicon wafer. Each chip takes 8-10 weeks to go through the process of lithography, deposition, etching, cleaning, etc. It then must be tested, which can take another couple of weeks, then packaged, before it can be sold to manufacturers. Thus, even if fab capacity were available today (it isn't), you'd still see a multi-month lag before new supply hit the market.
(This is an extraordinarily sensitive process, and disrupting it can cause you to lose the entire batch. You might have heard of cases where "wafer starts" had to be discarded due to a tsunami or power disruption - this is why.)
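To make that lag concrete, here's a back-of-the-envelope sketch in Python. The fab duration is the 8-10 weeks quoted above; the test and packaging durations are assumed round numbers, not vendor data:

```python
# Rough supply lag from "start new wafers" to "chips on the market".
# Fab duration comes from the comment above; the rest are assumptions.
stages_weeks = {
    "wafer fab (litho, deposition, etch, clean)": (8, 10),
    "test": (2, 3),                  # assumed
    "packaging + shipping": (2, 4),  # assumed
}

low = sum(lo for lo, _ in stages_weeks.values())
high = sum(hi for _, hi in stages_weeks.values())
print(f"new supply reaches market in roughly {low}-{high} weeks "
      f"(~{low / 4.3:.0f}-{high / 4.3:.0f} months)")
```

That works out to roughly 12-17 weeks, i.e. the multi-month lag described above, and that's assuming the fab capacity already exists.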
The raw materials are cheap: it's mostly just quartz, one of the most abundant minerals in Earth's crust.
The problem is actually making chips. The machines used to make modern integrated circuits are some of the most precise equipment in the world, manufacturing structures just tens of atoms across.
Getting more factories online might take close to a decade, and that's if anyone wants to pay: the current demand showed up basically overnight as some of the companies (running on investor money with no way to make a profit) started a bidding war. Betting billions of dollars on them still being around in 5-10 years is just not a wise decision.
> Betting billions of dollars on them still being around in 5-10 years is just not a wise decision.
Historically, dynamic RAM has gone through several boom/bust cycles, oscillating between manufacturers struggling to break even and cutting production, then a few years later not being able to make enough chips. I remember the late '80s being another time when companies were delaying new product launches because they couldn't get DRAM.
RAM factories can only produce memory chips, so companies are careful about expanding their production lines. If demand for memory drops later, they could suffer huge losses.
In fact, memory prices often swing when a new factory comes online. Device makers usually buy memory when prices are low, so we don't really notice these swings in the products we buy.
I work in the semiconductor industry, and I recently asked the same question of some experts around me. They told me there is a lesson from the early days of personal computers: improvements in operating systems reduced the amount of RAM needed, which caused serious problems for the memory industry.
A big part of it is that RAM fabs are insanely capital‑intensive and slow to ramp: you’re talking years to build, then months per wafer, with very cyclical, boom‑bust demand. That makes it rational for the handful of DRAM vendors to be cautious, because overbuilding capacity in a bubble can destroy them for a decade.
I'm curious about manufacturing from like 10-20 years ago. Would it be cheap to make a RAM chip from like 15 years ago? Or would it just not even work with our modern hardware? Like if we want more of something, can we pay with performance and efficiency instead? We seem to have this option with other types of technology.
> Would it be cheap to make a RAM chip from like 15 years ago? Or would it just not even work with our modern hardware?
Modern CPUs have built-in memory controllers. Most of them can only talk to a small variety of memory: maybe DDR4 or DDR5, and sometimes an LP variant. If you wanted to make DDR3, you'd also need a special CPU with a DDR3-compatible memory controller.
I think once production stops for a given memory type at a specific fab, the lines get retooled for a new type of RAM. It doesn't make sense to keep the old equipment around, because enough DDR3 RAM has already been made to last the rest of time.
It's different for other kinds of fabs, where old lines can remain valuable for lower-cost production.
Only TSMC, as far as I know, keeps fully depreciated fab lines online to make older chips for products that don't need the latest tech.
The reason we have so few RAM manufacturers in the first place is that it was (until just a few months ago) an extremely low-margin business.
New production capacity takes years to bring online, and manufacturers are rightly cautious of the current demand bubble bursting, leaving them billions of dollars out of pocket.
"AI" is readily abundant all over the place, yet has no profit. I no longer believe the observably false claims that profit is somehow causal to availability.
I read recently that there is an effective cartel of Samsung, SK Hynix and Micron.
Price collusion, and dumping (flooding the market with low prices) if any real competitor shows up.
Someone please correct me if I'm wrong.
New fabs are being built, but that takes time. Estimates are that full production will start towards the end of 2027, or in some cases (Samsung, Micron) only in 2028. So expect RAM to remain scarce/expensive for the next two years.
Another factor is that the viability of building new fabs is based on the assumption that there is no AI bubble that will burst. Opinions differ on how large that risk is.
Outside of fabrication, memory chips also require some very fancy high-speed testers that need specialized ICs, which are most likely back-ordered.
What's the current thinking on used ram? Is it worth it?
Avoid suspiciously low-priced bargains, give it a good several days of solid memtesting (I used to use Memtest86 and Memtest86+, and Prime95 for good measure) and you should be fine.
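For a quick sanity check before (not instead of) a proper Memtest86+ run, you can sketch a crude user-space pattern test. This hypothetical snippet only exercises whatever memory the OS happens to map for the process, so it's far weaker than a boot-time tester:

```python
import sys

# Crude user-space pattern test: fill a big buffer with known byte
# patterns, read it back, and count mismatching pages. This only
# touches memory the OS maps for this process, so it's a quick sanity
# check, NOT a substitute for Memtest86+ or Prime95.
PAGE = 4096

def pattern_test(size_mb: int = 256, patterns=(0x00, 0xFF, 0xAA, 0x55)) -> int:
    size = size_mb * 1024 * 1024
    buf = bytearray(size)
    bad_pages = 0
    for p in patterns:
        chunk = bytes([p]) * PAGE
        for i in range(0, size, PAGE):   # write the pattern
            buf[i:i + PAGE] = chunk
        for i in range(0, size, PAGE):   # verify it survived
            if buf[i:i + PAGE] != chunk:
                bad_pages += 1
    return bad_pages

if __name__ == "__main__":
    bad = pattern_test()
    print(f"mismatched pages: {bad}")
    sys.exit(1 if bad else 0)
```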
Because SK Hynix has substantially ramped up its production of HBM memory for GPUs (due to AI demand), and HBM requires more silicon per bit than standard DRAM. Since companies produce HBM and DDR memory in the same factories, using (more or less) the same equipment, shifting production to HBM is a double whammy: less wafer capacity for DDR, and hence less DRAM production.
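To see why that hurts, here's an illustrative bit of arithmetic. Every figure below (die sizes, yields, stack height, base-die area) is a made-up round number for the sketch, not actual SK Hynix data:

```python
# Illustrative: wafer area consumed per sellable gigabyte,
# commodity DDR vs HBM. All numbers are assumed round figures.

def area_per_gb(die_mm2, gb_per_die, die_yield,
                stack_yield=1.0, base_die_mm2=0.0, dies_per_stack=1):
    # The HBM base logic die adds silicon that stores no bits, so
    # amortize it over the stack's DRAM dies. A failed stacking step
    # scraps every die in the stack, which is what stack_yield models.
    effective_die = die_mm2 + base_die_mm2 / dies_per_stack
    return effective_die / (gb_per_die * die_yield * stack_yield)

ddr = area_per_gb(die_mm2=70, gb_per_die=2, die_yield=0.90)
hbm = area_per_gb(die_mm2=85,  # TSV overhead makes each die bigger
                  gb_per_die=2, die_yield=0.90,
                  stack_yield=0.75, base_die_mm2=80, dies_per_stack=8)

print(f"DDR: {ddr:.0f} mm^2 of wafer per GB")
print(f"HBM: {hbm:.0f} mm^2 of wafer per GB (~{hbm / ddr:.1f}x)")
```

Under these assumptions, each HBM gigabyte eats nearly twice the wafer area of a DDR gigabyte, so every wafer shifted to HBM removes more than one wafer's worth of DDR supply.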
See this video by Anastasi In Tech to understand the memory crisis - https://www.youtube.com/watch?v=KghkI5Oh_lY
This could prevent people from making a lot of money.
It's price gouging, no matter what they say.