mikbone Posted May 15, 2024

I can remember multiple times in my life that Moore's law was predicted to fail. It has held true since 1975. Any of you techies want to make a prediction on when it will fail?
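A back-of-envelope sketch of the doubling rule behind Moore's law; the 1975 baseline transistor count below is an assumed order of magnitude for illustration, not a sourced figure.

```python
# Rough sketch of Moore's law as a doubling rule: transistor counts double
# roughly every two years. The 1975 baseline count is an assumed order of
# magnitude for a mid-1970s CPU, not a sourced figure.
BASELINE_YEAR = 1975
BASELINE_TRANSISTORS = 10_000  # assumption: roughly the scale of mid-70s microprocessors

def projected_transistors(year, doubling_period_years=2.0):
    """Transistor count if the doubling trend held since the baseline year."""
    doublings = (year - BASELINE_YEAR) / doubling_period_years
    return BASELINE_TRANSISTORS * 2 ** doublings

print(f"{projected_transistors(2024):,.0f}")  # roughly 2.4e11 -- a couple hundred billion
```

That lands in the same ballpark as today's largest conventional chips, which is roughly what "held true since 1975" amounts to.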
Carborendum Posted May 15, 2024

11 hours ago, mikbone said:
I can remember multiple times in my life that Moore's law was predicted to fail. It has held true since 1975. Any of you techies want to make a prediction on when it will fail?

They first worked on making the transistors themselves smaller and smaller. Solid state has been optimized as far as it can go. So they changed the design to work in 3D by adding layers. But they have had problems with reading the layers and keeping the heat down, so they could only do so much with that. Now they are just making the chip itself bigger, and they can come up with all sorts of ways to do that. As long as it still fits into what we expect a computer to look like, they can continue this pattern.

They can't make the micro-circuitry any smaller because of the uncertainty principle (yes, they are that small). But the geometry still has lots of options. I'd bet that the next thing will be cooling technology. If they can figure out how to get cooling mechanisms between layers, we're still golden.

There are still many avenues to keep this pattern going. There is no way it can continue forever, but it will probably keep going at the same rate for our lifetimes.
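A toy model of the geometric levers described in that post (bigger die, more stacked layers, fixed feature size); all the numbers are made-up illustrations rather than real process parameters.

```python
# Toy model: once feature size stops shrinking, transistor count can still grow
# with die area and stacked layers. All numbers are illustrative assumptions,
# not real process parameters.
def transistor_estimate(die_area_mm2, feature_pitch_nm, layers=1):
    """Very crude count: usable area / pitch^2, times layers (ignores wiring overhead)."""
    pitch_mm = feature_pitch_nm * 1e-6            # nm -> mm
    per_layer = die_area_mm2 / (pitch_mm ** 2)
    return per_layer * layers

base = transistor_estimate(die_area_mm2=100, feature_pitch_nm=50)
bigger_and_stacked = transistor_estimate(die_area_mm2=400, feature_pitch_nm=50, layers=4)
print(bigger_and_stacked / base)  # 16.0 -> 16x the transistors with no change in feature size
```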
NeuroTypical Posted May 15, 2024

Is quantum computing technology going to reset this scale somehow? I'm not geeky enough to guess, but if I had to pretend to be a geek, I'd guess maybe it'll slow the growth in the # of transistors, because you only need a fraction as many qubits to do a billion times more than regular bits, or something.
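For what it's worth, the usual counting argument behind that "fraction as many qubits" intuition is that an n-qubit state is described by 2^n amplitudes, so a small register spans an enormous state space (which is not the same as getting 2^n useful answers back out).

```python
# An n-qubit register is described by 2**n complex amplitudes, so the state
# space grows exponentially with n. (Getting useful answers back out is the
# hard part; this is just the counting argument behind the intuition.)
for n in (10, 20, 30):
    print(f"{n} qubits -> {2 ** n:,} basis states")
# 30 qubits -> 1,073,741,824 basis states, i.e. about a billion
```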
Vort Posted May 16, 2024

Lowering the voltage that chips require could boost speed by an order of magnitude or more. But you have to have a much clearer signal to use a lower voltage, which turns out not to be easy. Most chipsets currently operate at 3.3 V, CPUs can be under a volt, and memory is typically 1.2-1.5 V. Cutting these values in half could lead to denser packing and greater computing power per watt-second (aka per joule*).

Ironically, though I'm sort of a physics guy by educational background and interest, I don't really know much at all about computer engineering or the specifics of the problems chip designers encounter. So I'm mostly not speaking from personal knowledge, just from what I've heard and basic inferences I draw (like you must obviously need a cleaner signal for a lower voltage).

*I remember in the 2008 Iron Man movie, Tony Stark says to the Egyptian or Turkish or Iranian or whatever guy he's imprisoned with that his miniature, cave-built arc reactor can generate "if my math is right—and it always is—three gigajoules per second**." Of course, a watt is one joule per second, so he could have simply said "three gigawatts". But that would have been less cool.

**If my math is right—and often it is, but it's kind of like my spelling, so don't bet your life on it—that's enough to power almost two and a half DeLoreans.

PS Tony Stark's subsequent efforts to perfect his Iron Man suit demonstrate that, um, well, his math wasn't always right. Ten percent, my foot.
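Two quick sanity checks on the arithmetic in that post: one uses the standard dynamic-power rule of thumb (switching power scaling roughly with the square of supply voltage) and one uses the film's 1.21 GW figure for the DeLorean. Both are back-of-envelope, not chip-design numbers.

```python
# 1) Dynamic switching power scales roughly with V^2 (P ~ C * V^2 * f), so
#    halving the supply voltage cuts switching power to about a quarter at the
#    same clock frequency. Leakage and other effects are ignored here.
def relative_dynamic_power(v_new, v_old):
    return (v_new / v_old) ** 2

print(relative_dynamic_power(0.6, 1.2))  # 0.25

# 2) "Three gigajoules per second" is just 3 GW, and Back to the Future quotes
#    1.21 GW for the DeLorean's time circuits:
print(3.0 / 1.21)  # ~2.48 -> "almost two and a half DeLoreans"
```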
Carborendum Posted May 16, 2024

On 5/15/2024 at 11:58 AM, NeuroTypical said:
Is quantum computing technology going to reset this scale somehow? I'm not geeky enough to guess, but if I had to pretend to be a geek, I'd guess maybe it'll slow the growth in the # of transistors, because you only need a fraction as many qubits to do a billion times more than regular bits, or something.

That's a good question. It's a topic that has been around for a while, but I've never taken the time to get to understand it. @Traveler, could you answer this question?
askandanswer Posted May 16, 2024

It seems like they just keep finding moore and moore ways to stack moore chips into evermoore smaller areas. Mooreover, I suspect that this trend will continue for some time. It's just one of many ways of doing more with moore.
Traveler Posted May 17, 2024

20 hours ago, Carborendum said:
That's a good question. It's a topic that has been around for a while, but I've never taken the time to get to understand it. @Traveler, could you answer this question?

I am not a qualified respondent to this question. I tried to follow this stuff for a while, but the problem is (as near as I understand) that we do not have functioning quantum hardware yet. Such efforts are mostly not done with quantum tech but with common tech: supercomputers running in virtual mode. What I have gathered about the hardware is that there are problems with quantum entanglement yet to be solved. I do not know where things stand currently, but the last I read was that the concepts are promising in theory; we are just not there yet. What advancements are taking place are limited and thus very niche-oriented.

As to Moore's Law in general – who knows where that will go next. Microchips (wafers) are not dependent on single wired transistors as they were once understood – by definition a single transistor has an on-or-off charge flow, which is the basis of current binary computer hardware. Currently chips are a little more complex – I am not so sure I understand the basic wafer reticle circuit representation anymore (I have been professionally out of the loop for about 5 years – more than that for these specifics). I think we still have a way to go before we reach the dead end with what we can put on a wafer.

But there is a caveat. If we ever get a super-massive solar flare pointed at our little planet, we will not have any chip hardware left working (with some specific exceptions). And it will take a while to replace everything, because silicon fabs themselves rely heavily on microchips for microchip fabrication. We know how we got here, so it is a question of finding a place where we can start over.

The Traveler
Carborendum Posted May 17, 2024

22 minutes ago, Traveler said:
Microchips (wafers) are not dependent on single wired transistors as they were once understood – by definition a single transistor has an on-or-off charge flow, which is the basis of current binary computer hardware. Currently chips are a little more complex – I am not so sure I understand the basic wafer reticle circuit representation anymore.

This is a good point. Perhaps I should have put "transistors" in quotes, because they aren't the same as the original design of a transistor. But the functionality that we receive as end users is about the same.