Moore’s Law



I can remember multiple times in my life that Moore’s law was predicted to fail.  It has held true since 1975.

Any of you techies want to make a prediction on when it will fail?
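To put rough numbers on it, here's a back-of-the-envelope sketch in Python. The two-year doubling period is the usual statement of the 1975 revision of the law; the base transistor count is an arbitrary illustrative figure, not a real chip's spec.

```python
# Back-of-the-envelope Moore's law projection.
# Assumes a clean doubling every two years from an arbitrary 1975 baseline.
def transistor_count(year, base_year=1975, base_count=10_000, doubling_years=2):
    """Projected transistor count under an idealized 2-year doubling."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

# 50 years = 25 doublings, i.e. about 33.5 million times the baseline.
print(f"{transistor_count(2025):,.0f}")
```

Whatever the real curve looks like today, 25 doublings in 50 years is the scale of growth we're talking about.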


11 hours ago, mikbone said:

I can remember multiple times in my life that Moore’s law was predicted to fail.  It has held true since 1975.

Any of you techies want to make a prediction on when it will fail?

They first worked on making the transistors themselves smaller and smaller.  Solid-state design has been optimized about as far as it can go.  So they changed the design to work in 3D by adding layers.  But they have had problems with reading the layers and keeping the heat down, so they could only do so much with that.  Now they are just making the chip itself bigger.

They can come up with all sorts of things to make the chip itself bigger.  And as long as it still fits into what we expect a computer to look like, they can continue this pattern.

They can't make the micro-circuitry much smaller because of quantum effects like electron tunneling (yes, they are that small).  But the geometry still has lots of options.  I'd bet that the next thing will be cooling technology.  If they can figure out how to get cooling mechanisms between layers, we're still golden.

There are still many avenues to keep this pattern going.  There is no way it can continue forever.  But it will probably keep going at the same rate for our lifetimes.


Is quantum computing technology going to reset this scale somehow?  I'm not geeky enough to guess, but if I had to pretend to be a geek, I'd guess maybe it'll slow the # of transistors, because you only need a fraction of qubits to do a billion times more than regular bits, or something.
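The "fraction of qubits" intuition comes from state-space sizes, which a quick sketch can show. To be clear, this is just counting, not a claim about what quantum computers can actually speed up:

```python
# n classical bits hold exactly one of 2**n values at any moment.
# n qubits are described by a state vector of 2**n complex amplitudes,
# which is why simulating them classically blows up so fast.
def classical_states(n_bits):
    return 2 ** n_bits      # possible values, only one held at a time

def qubit_amplitudes(n_qubits):
    return 2 ** n_qubits    # amplitudes all present in one state vector

# Around 50 qubits you'd need ~10**15 amplitudes to simulate classically.
print(qubit_amplitudes(50))
```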


Lowering the voltage that chips require could boost speed by an order of magnitude or more. But you have to have a much cleaner signal to use a lower voltage, which turns out not to be easy. Most chipsets currently operate at 3.3 V, CPUs can be under a volt, and memory is typically 1.2-1.5 V. Cutting these values in half could lead to denser packing and greater computing power per watt-second (aka per joule*).
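The reason voltage matters so much is that dynamic CMOS power scales roughly as C·V²·f, so the voltage term comes in squared. A quick sketch (the capacitance and frequency below are made-up illustrative values, not any real chip's):

```python
# Rough dynamic-power model for CMOS: P ~ C * V^2 * f.
def dynamic_power(capacitance_f, voltage_v, frequency_hz):
    return capacitance_f * voltage_v ** 2 * frequency_hz

p_full = dynamic_power(1e-9, 1.2, 3e9)  # illustrative: 1 nF switched at 1.2 V, 3 GHz
p_half = dynamic_power(1e-9, 0.6, 3e9)  # same chip, half the voltage
print(p_half / p_full)
```

Halving the voltage cuts dynamic power to a quarter, which is exactly why the "denser packing per joule" payoff is so attractive.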

Ironically, though I'm sort of a physics guy by educational background and interest, I don't really know much at all about computer engineering or the specifics of the problems chip designers encounter. So I'm mostly not speaking from personal knowledge, just from what I've heard and basic inferences I draw (like you must obviously need a cleaner signal for a lower voltage).

*I remember in the 2008 Iron Man movie, Tony Stark says to the Egyptian or Turkish or Iranian or whatever guy he's imprisoned with that his miniature, cave-built arc reactor can generate "if my math is right—and it always is—three gigajoules per second**." Of course, a watt is one joule per second, so he could have simply said "three gigawatts". But that would have been less cool.

**If my math is right—and often it is, but it's kind of like my spelling, so don't bet your life on it—that's enough to power almost two and a half DeLoreans.
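For anyone who wants to check that footnote without betting their life on it, the arithmetic is just the reactor's 3 GJ/s (i.e. 3 GW) against the 1.21 gigawatts the film gives for the flux capacitor:

```python
# Footnote sanity check: Stark's reactor output vs. DeLorean requirements.
reactor_gw = 3.0     # "three gigajoules per second" = 3 gigawatts
delorean_gw = 1.21   # per Back to the Future's flux capacitor
print(reactor_gw / delorean_gw)  # just under 2.5 DeLoreans' worth
```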

PS Tony Stark's subsequent efforts to perfect his Iron Man suit demonstrate that, um, well, his math wasn't always right. Ten percent, my foot.

 


On 5/15/2024 at 11:58 AM, NeuroTypical said:

Is quantum computing technology going to reset this scale somehow?  I'm not geeky enough to guess, but if I had to pretend to be a geek, I'd guess maybe it'll slow the # of transistors, because you only need a fraction of qubits to do a billion times more than regular bits, or something.

That's a good question.  That is a topic that has been around for a while.  But I've never taken the time to get to understand it.

@Traveler, could you answer this question?


20 hours ago, Carborendum said:

That's a good question.  That is a topic that has been around for a while.  But I've never taken the time to get to understand it.

@Traveler, could you answer this question?


I am not a qualified respondent to this question.  I tried to follow this stuff for a while, but the problem (as near as I understand) is that we do not have functioning quantum hardware yet.  Most such efforts are done not with quantum tech but with common tech: supercomputers simulating quantum systems in virtual mode.  What I have gathered about the hardware is that there are problems with maintaining quantum entanglement yet to be solved.   I do not know where things stand currently, but the last I read, the concepts are promising in theory; we are just not there yet.  What advancements are taking place are limited and thus very niche-oriented.

As to Moore’s Law in general – who knows where that will go next.  Microchips (wafers) are not dependent on single wired transistors as they were once understood – by definition a single transistor has an on-or-off charge flow, which is the basis of current binary computer hardware.  Currently chips are a little more complex – I am not so sure I understand the basic wafer reticle circuit representation anymore (I have been professionally out of the loop for about 5 years – more than that for these specifics).  I think we still have a way to go before we reach the dead end with what we can put on a wafer.

But there is a caveat.  If we ever get a super massive solar flare pointed at our little planet – we will not have any chip hardware left working (with some specific exceptions).  And it will take a while to replace everything, because silicon fabs themselves rely heavily on microchips for microchip fabrication.  We know how we got here, so it is a question of finding a place where we can start over.

 

The Traveler


22 minutes ago, Traveler said:

Microchips (wafers) are not dependent on single wired transistors as they were once understood – by definition a single transistor has an on-or-off charge flow, which is the basis of current binary computer hardware.  Currently chips are a little more complex – I am not so sure I understand the basic wafer reticle circuit representation anymore.

This is a good point.  Perhaps I should have put "transistors" in quotes, because they aren't the same as the original design of a transistor.  But the functionality we receive as end users is about the same.


