
Can Our Computers Continue To Get Smaller and More Powerful?

timothy posted about 2 months ago | from the where-is-the-orchard-of-low-hanging-fruit? dept.

Upgrades 151

aarondubrow (1866212) writes In a [note, paywalled] review article in this week's issue of the journal Nature (described in a National Science Foundation press release), Igor Markov of the University of Michigan/Google reviews limiting factors in the development of computing systems to help determine what is achievable, in principle and in practice, using today's and emerging technologies. "Understanding these important limits," says Markov, "will help us to bet on the right new techniques and technologies." Ars Technica does a great job of expanding on the various limitations that Markov describes, and the ways in which engineering can push back against them.


Only under COMMUNISM (-1, Troll)

For a Free Internet (1594621) | about 2 months ago | (#47673375)

Capitalism is in its epoch of terminal decay. COMMUNISM is the only salvation for humankind!!!!


Obvious (4, Insightful)

Russ1642 (1087959) | about 2 months ago | (#47673385)

Yes. Next question please.

Re:Obvious (0)

Anonymous Coward | about 2 months ago | (#47673407)

How is it obvious? Did our jets get faster and lighter and cheaper? Not by the same degree as computing. It doesn't take a lot of energy to flip a bit, but it still takes the same amount of energy to fly across the Atlantic.

We're already at the atom by atom level when we manufacture ICs.

https://www.youtube.com/watch?... [youtube.com]

What's after atoms?

yes. Especially per passenger. (5, Interesting)

raymorris (2726007) | about 2 months ago | (#47673495)

> Did our jets get faster and lighter and cheaper?

Yes. Especially lighter and cheaper PER PASSENGER, which is the goal for passenger jets.

> it still takes the same amount of energy to fly across the Atlantic.

Nope, fuel efficiency and energy efficiency have improved significantly.

Re:yes. Especially per passenger. (0)

Anonymous Coward | about 2 months ago | (#47674007)

The sad fact though is that flying from London to New York still takes the same time as it did 40 years ago. Sure, more people can afford to do so now, but the flying experience is becoming more and more unbearable with each passing year. And, like I said, it is not getting any faster.

Re:yes. Especially per passenger. (1)

infolation (840436) | about 2 months ago | (#47674439)

The sad fact though is that flying from London to New York still takes the same time as it did 40 years ago.

Nope. (almost) 40 years ago we had... Concorde!

That miracle of modern engineering took 3h 30min instead of the subsonic 7-8 hours it takes now. And why was Concorde retired? It just didn't make any money.

At the end of the day, people prefer slow, cheap and unbearable to fast, stylish and extortionately expensive.

Re:yes. Especially per passenger. (1)

shoor (33382) | about 2 months ago | (#47674649)

The Concorde also had a sonic boom which limited the airports it could fly to. (Competitors may have exaggerated the problem, but I do believe it was a problem.)

Re:yes. Especially per passenger. (0)

Anonymous Coward | about 2 months ago | (#47675373)

Exactly. Technology is very limited in many areas; only computing has advanced so much, precisely because it takes so little energy to flip a bit.

Other fields peaked shortly after WWII, and we've been improving things, sure, but not at the same relative pace as computing.

But even that will reach a limit.

That's obvious.

Re:yes. Especially per passenger. (2)

Kjella (173770) | about 2 months ago | (#47674229)

You're being very dishonest when you leave this part out:

Not by the same degree as computing.

By those standards, airplanes have basically stood still for the last 50 years. Sure, they get a bit lighter, a bit better engines, a bit better aerodynamics, but they're not radically different nor faster. Already the very first commercial transatlantic flight, Berlin-New York, took 25 hours, far faster than a boat, and today's 8.5 hours is still on the same order. Same with cars: they've come a long way since the Model T Ford, but it could do 40-45 mph with 13-21 MPG. What would you get today, 35 MPG? You don't drive cross country on a thimble, that's for sure.

We're not talking about that kind of improvement when it comes to computers. We're talking about memory that was measured in kilobytes 30 years ago and is measured in gigabytes today. If computers only doubled in performance over 10 years, we'd think that was awfully slow progress. 30+ years for a 10x improvement? 100 years until a terabyte is last century's gigabyte? Let's be honest, the kind of marginal - or rather, normal - improvements you're talking about would only be scoffed at. When - not if - we hit the limit where the walls are so thin they can't get any thinner, computers may be roughly as good as they'll ever get.
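
To put those growth rates in perspective, here is a minimal sketch (Python, using assumed round figures of roughly 64 KB in the early 80s and 8 GB around 2014, not exact market data) of the doubling time they imply:

import math

# Rough growth-rate check. Assumed round figures for illustration only:
# ~64 KB of RAM in an early-1980s home computer, ~8 GB in a 2014-era PC.
start_bytes = 64 * 1024
end_bytes = 8 * 1024**3
years = 30

doublings = math.log2(end_bytes / start_bytes)
doubling_time_months = years * 12 / doublings

print(f"growth factor: {end_bytes / start_bytes:,.0f}x")
print(f"doublings: {doublings:.0f}")
print(f"implied doubling time: {doubling_time_months:.1f} months")
# 131,072x growth is 17 doublings in 30 years, a doubling roughly every
# 21 months; the airplane and car figures above never came close to that pace.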

Re:yes. Especially per passenger. (0)

Anonymous Coward | about 2 months ago | (#47674303)

> Yes. Especially lighter and cheaper PER PASSENGER

Yeah - but not for the passengers from USA =(

Re:Obvious (4, Insightful)

ShanghaiBill (739463) | about 2 months ago | (#47673599)

Did our jets get faster and lighter and cheaper?

The fastest air breathing aircraft was the SR-71, which went into production in 1962, based on technology from the 1950s. So for at least half a century, jets did not get faster. Aircraft improved enormously between 1903 and 1960. Then the rate of improvements fell off a cliff. That is why Sci-Fi from that era often extrapolated the improvements into flying cars, and fast space travel, but far fewer predicted things like the Internet or Wikipedia.

What's after atoms?

Silicon lithography will hit its limits after a few more iterations. But nano-assembly techniques may allow silicon transistors to be even smaller. After that we may be able to move to carbon nanotube transistors, based on spintronics to lower the heat dissipation. There is still plenty of room at the bottom.

Re:Obvious (1)

mlts (1038732) | about 2 months ago | (#47673793)

There is always going with distributed computing, both tightly coupled (cores) and loosely coupled (different CPUs.)

I wouldn't be surprised to see RAM chips with a part of the die dedicated to CPU/FPU/GPU functions. Add more RAM, add more CPUs.

Eventually the concept of a "central" processing unit may give way to passive backplanes and various speed buses, perhaps with a relatively lightweight chip directing everything.

Another example is the x86 architecture. Intel has been amazing at keeping it going, but eventually, moving to something like Itanium with 128+ integer registers, 128+ floating-point registers, etc. might be how Moore's "law" keeps going.

As for jets, it isn't a matter of "can't", but "why bother". Once commercial airlines got deregulated, good enough was good enough and the race to the bottom began, so there was no interest in trying to continue making progress with better planes, other than military aircraft.

Re:Obvious (4, Informative)

GrahamCox (741991) | about 2 months ago | (#47674343)

Then the rate of improvements fell off a cliff

That's only true if you're only judging it by outright speed, height, etc. Things have continued to improve in terms of efficiency, thrust-to-weight ratio, noise, cleanliness of fuel burn and above all, reliability.

The original RB211 turbofan (the first big fanjet of the type that all modern airliners use) had a total lifetime of 1,000 hours. Nowadays it's >33,000 hours. That's an incredible achievement. In 1970, as a young kid with a keen interest in aviation, I would watch Boeing 707s fly in and out of my local airport, all trailing plumes of black smoke, all whining loudly (and deafeningly, on take-off), and I understood where all the noise protesters that frequently appeared on the news were coming from. Nowadays you don't have that, because noise is just not the problem it was, there's no black smoke, and jets slip in and out of airports really very quietly, when you consider how much power they are producing (which in turn helps them climb away more quickly).

As far as computing is concerned, you're right - there's still plenty of room at the bottom. But the current fabrication technology is reaching its limits. Perhaps jet engine manufacturers in the late 60s couldn't see how they would overcome fundamental limits in materials technology to produce the jets we have today, but they did.

Re:Obvious (4, Insightful)

dnavid (2842431) | about 2 months ago | (#47674757)

Silicon lithography will hit its limits after a few more iterations. But nano-assembly techniques may allow silicon transistors to be even smaller. After that we may be able to move to carbon nanotube transistors, based on spintronics to lower the heat dissipation. There is still plenty of room at the bottom.

The point of the article and the article it references is that it's easy to say stuff like that, but also mostly irrelevant to practical computing, because in the history of modern computing it's never been absolute physical limits that caused major changes to how computing is implemented. Just because there's room at the bottom doesn't mean it's room we can use. We *may* be able to use nano-assemblers for silicon and *may* be able to use carbon nanotube transistors, but unless that gets translated into someone working on actual practical implementations of those technologies, they will mean as much to the average consumer as the SR-71 being discussed in this thread means to the average commercial air traveler. In other words, exactly zero.

When I was in college people were already talking about the exotic technologies we would have to migrate to in order to achieve better performance, and that was the late eighties. In the twenty-plus years since then, we're still basically using silicon CMOS. Granted, the fabrication technologies and gate technologies have radically improved, but the fundamental manufacturing technology is still the same. It's been the same because there are hundreds of billions of dollars of cumulative technological infrastructure and innovation behind silicon lithography. For these other "room at the bottom" technologies to be meaningful, and not just SR-71s, they need to reach the same point as silicon lithography, with its multi-decade head start and nearly trillion-dollar learning curve. It's not enough to work in theory, or even as a one-off in practice. If it can't work at the scale and scope of silicon lithography, it's just an SR-71. A cool museum piece of advanced technology almost no one will ever see, touch, use, or directly benefit from.

It isn't trivially obvious there exists a technology commercializable in the next few decades that can replace silicon lithography. Anyone who thinks that's obvious doesn't understand the practical realities of scaling these technologies.

Re:Obvious (0)

Anonymous Coward | about 2 months ago | (#47675003)

Air breathing aircraft have gotten somewhat faster. The Boeing X-51 did Mach 5.1 last year.

It is true that commercial aviation cruise speeds haven't increased in recent decades, but that's driven more by economic factors than technical ones. Rising fuel costs have focused design efforts on improving cruise economy rather than velocity. So, Mach ~0.8 or so is still the economic sweet spot.

Similarly, battery life has started to become more important than clock speed in many applications. Technical development hasn't slowed, but the metrics of concern have shifted.

--B.

Re:Obvious (1)

mjwx (966435) | about 2 months ago | (#47675053)

Did our jets get faster and lighter and cheaper?

The fastest air breathing aircraft was the SR-71, which went into production in 1962, based on technology from the 1950s. So for at least half a century, jets did not get faster. Aircraft improved enormously between 1903 and 1960. Then the rate of improvements fell off a cliff. That is why Sci-Fi from that era often extrapolated the improvements into flying cars, and fast space travel, but far fewer predicted things like the Internet or Wikipedia.

That's because you're basing all aircraft improvement on speed.

This is flat out wrong.

The reason aircraft have not gotten faster than the SR-71 is partly because you hit a serious wall at those speeds. The air literally becomes harder to push through. Physics is the enemy here; this is why it's expensive to produce a car that goes over 400 km/h, and even that car is not very reliable. Friction and air resistance need to be overcome, and heat dissipation has to be balanced against weight (the Veyron has 11 radiators) and aerodynamics. It's not as simple as strapping more rockets onto the arse of a 737.

But the biggest reason is there's no impetus. There's no demand for faster aircraft. Even with the Concorde, the economics of it never made sense, Air France and BA only ran the Concorde for pride. The demand for supersonic transport just isn't there. However there is a lot of demand for cheaper air travel.

How expensive was an air ticket in 1962? You'll find a US domestic flight cost around $1,500 in today's money; the same flight you can get for $200 today.

Safety: in 1962 the de Havilland Comet had a nasty problem of breaking up mid-flight, and hull-loss incidents led to crashes in most cases. That is no longer true: both the B777 and the A330 flew in commercial service for over a decade before their first fatal crash. Hull-loss incidents now rarely lead to crashes; in fact, an entire engine can blow up and the aircraft can still land safely without a single injury.

So cost and safety have changed a lot in the last 50 years of flight. Flying is more accessible and safer.

Complaining that flying hasn't improved in 50 years because of speed is like complaining that ovens have not improved in the last 200 years because they aren't any hotter. It ignores the advent of the thermostat and the convection oven, and the fact that prices are lower and the selection available to me has increased significantly.

Re:Obvious (1)

Dutch Gun (899105) | about 2 months ago | (#47675321)

I think it's fair to say that we've reached a point where we're flying "fast enough" for most practical purposes. Flying to the other side of the world only takes about 18 hours or so, which is pretty amazing, and the vast majority of flights are much shorter hops. Once cost, safety, reliability, and noise all reach a point where they can't be easily improved, aerospace engineers will probably start pushing harder against the speed barrier again. It's not that there's no impetus, it's just that there are currently higher priorities.

I think there are some interesting parallels to the improvements of tech components. We may be approaching a stabilizing trend because our computers are becoming "fast enough" for darn near whatever most people need to do with them, and because the physics for making components smaller and faster are really starting to get in the way. At some point, computers will be fast enough that they'll do whatever people want them to do, and there will be very little impetus to make them significantly faster. Besides gaming or other high-end jobs, personal computers are already ridiculously overpowered for what the user actually demands of them. And a lot of performance issues can simply be blamed on poor software design or overly deep and inefficient software abstractions. Note how the last two Windows OSes have actually *improved* CPU and memory performance since Vista, which was a pretty notorious hog.

I suppose this explains why most people are probably better off with a smartphone or a tablet, and why PC sales are dropping. I think the PC isn't dying so much as finding a more appropriate niche within the computational power spectrum.

Re:Obvious (1)

Beck_Neard (3612467) | about 2 months ago | (#47673675)

It takes zero energy to flip a bit. What does take energy is erasing bits, and as it turns out, that does not seem to be fundamental to the idea of computation. The limits of computation have nothing to do with energy per se. Rather, they are about entropy.

http://en.wikipedia.org/wiki/V... [wikipedia.org]
http://en.wikipedia.org/wiki/R... [wikipedia.org]

Re:Obvious (0)

Anonymous Coward | about 2 months ago | (#47674291)

Eh wot? Communication theory states there is a minimum amount of energy required to represent a change of state. In any case, all that does is show that the end product of information processing takes very little energy, but the practical matters of real life like flying or driving take a lot.

What do you think flipping a bit means? How is it different from erasing it?

Re:Obvious (1)

Beck_Neard (3612467) | about 2 months ago | (#47674697)

flipping:
1 -> 0
0 -> 1

erasing:
1 -> 0
0 -> 0

Re:Obvious (1)

Beck_Neard (3612467) | about 2 months ago | (#47674717)

Also, your entire reply is pretty much gibberish.

Re:Obvious (0)

Anonymous Coward | about 2 months ago | (#47675447)

Sorry, your bit reply doesn't make much sense to me either.

From your own links

"Landauer's principle asserts that there is a minimum possible amount of energy required to change one bit of information, known as the Landauer limit:

        kT ln 2,

where k is the Boltzmann constant (approximately 1.38×10⁻²³ J/K), T is the temperature of the circuit in kelvins, and ln 2 is the natural logarithm of 2 (approximately 0.69315).

At 25 °C (room temperature, or 298.15 K), the Landauer limit represents an energy of approximately 0.0178 eV, or 2.85 zJ. Theoretically, room-temperature computer memory operating at the Landauer limit could be changed at a rate of one billion bits per second with only 2.85 trillionths of a watt of power being expended in the memory media. Modern computers use millions of times as much energy.[1][2][3]"

It obviously takes energy to flip a bit.
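
For reference, a minimal sketch (Python, assuming T = 298.15 K) that reproduces the figures quoted above:

import math

k_B = 1.380649e-23            # Boltzmann constant, J/K
T = 298.15                    # assumed room temperature, K

E_joules = k_B * T * math.log(2)       # Landauer limit: k*T*ln(2) per bit
E_eV = E_joules / 1.602176634e-19      # joules -> electron volts
E_zJ = E_joules * 1e21                 # joules -> zeptojoules
power_at_1e9_bits = E_joules * 1e9     # watts for 10^9 bit operations per second

print(f"Landauer limit: {E_eV:.4f} eV ({E_zJ:.2f} zJ) per bit")
print(f"Power at one billion bits/s: {power_at_1e9_bits:.2e} W")
# Prints roughly 0.0178 eV (2.85 zJ) per bit and about 2.85e-12 W,
# matching the numbers in the quoted Wikipedia passage.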

Read your own stuff, understand it first, *then* come back to me about who is talking gibberish.

You're obviously a programmer, whose ego is boosted by the progress of actual scientists and engineers that develop the hardware you torture on a daily basis.

But you lack understanding.

Re:Obvious (1)

TWX (665546) | about 2 months ago | (#47673691)

Further integration of multiple functions into a single package, and more of the conversion chips (DACs, etc.) becoming multifunction parts.

From a practical perspective for a personal computing device there will always be a lower limit on what's useful. Think of the Star Trek: TNG communicator or the Dick Tracy watch, anything made small still has to have a good user interface. In the Star Trek example the UI is entirely voice activated, so we'll either have to rethink our UI, or attempt to cram a more recognizable UI into a smaller device like how the Android-powered smartwatches did it.

I don't think that it's unreasonable to split the UI from the main computing device though. Take the watch or comm example- one could have the equivalent of a graphical dumb terminal in the form of a tablet that wirelessly connects to the smartwatch or smartfob or whatever format we go with.

Re:Obvious (1)

BarbaraHudson (3785311) | about 2 months ago | (#47674345)

From a practical perspective for a personal computing device there will always be a lower limit on what's useful. Think of the Star Trek: TNG communicator or the Dick Tracy watch, anything made small still has to have a good user interface. In the Star Trek example the UI is entirely voice activated, so we'll either have to rethink our UI, or attempt to cram a more recognizable UI into a smaller device like how the Android-powered smartwatches did it.

I don't think that it's unreasonable to split the UI from the main computing device though. Take the watch or comm example- one could have the equivalent of a graphical dumb terminal in the form of a tablet that wirelessly connects to the smartwatch or smartfob or whatever format we go with.

We already have that. When I say "OK Google", my phone answers questions by hitting their servers. The voice recognition software lets me send texts by dictating, etc. But since we can cram a LOT into today's smartphones VERY CHEAPLY, why not do so, and depend less on the network infrastructure, not clog up the 'tubes so much, etc? Things like navigation don't require a visible UI, either for input or output. Handy for people with reduced/no vision, and also allow for a different form factor. The same applies for texting - text-to-speech and speech-to-text mean that the "User Interface" can be reduced to ear buds with a built-in mic.

About 285 million people are visually impaired worldwide: 39 million are blind and 246 million have low vision (severe or moderate visual impairment). Preventable causes account for as much as 80% of the total global visual impairment burden. About 90% of the world's visually impaired people live in developing countries.

A guide dog isn't enough to fully participate in society in the digital age. As someone who couldn't use a computer for most of the last 3 years, I felt cut off. The GUI (or even the TUI) isn't the only viable interface.

Re:Obvious (1)

TWX (665546) | about 2 months ago | (#47675299)

But since we can cram a LOT into today's smartphones VERY CHEAPLY, why not do so, and depend less on the network infrastructure, not clog up the 'tubes so much, etc?

I agree, and I don't have any visual impairment to speak of. That's almost more a storage-density matter though, rather than a processing power issue.

I want a good nonvisual UI because of driving. I think that this push for touchscreens in cars is foolhardy at best, outright hazardous at worst. We need to get away from interfaces for secondary functions in cars (radio, HVAC, etc) that require eyes to use.

Re:Obvious (1)

Anonymous Coward | about 2 months ago | (#47673439)

C-C-C-Combo Breaker! In your face Betteridge!

Re:Obvious (1)

Jason Levine (196982) | about 2 months ago | (#47673465)

More powerful, perhaps. Smaller? Maybe not. We're already at the point where we can have watch-sized displays and full keyboards on our phones. The limiting factor is going to be 1) displays that are small but still readable and 2) input devices that aren't too tiny for human-sized fingers. As far as smart phones go (which, in essence, are tiny computers), I don't see them becoming much smaller due to these factors. However, I'm sure something completely innovative will come along that will make us look back and wonder why we thought it couldn't get smaller. Perhaps a Google Glass type setup where the screen is extremely tiny but fools the eye into thinking it is huge.

Some unknown-right-now innovation isn't obvious, however, or it wouldn't be unknown-right-now.

Re:Obvious (2)

Shortguy881 (2883333) | about 2 months ago | (#47673541)

Lol, they meant chip size getting smaller not the human interface.

Re:Obvious (1)

rogoshen1 (2922505) | about 2 months ago | (#47673571)

Zoolander and his phone beg to differ.

Re:Obvious (2, Insightful)

bobbied (2522392) | about 2 months ago | (#47673479)

Actually, the answer is no and that is obvious. Eventually we are going to run into limits driven by the size of atoms (and are in fact already there).

Once you get a logic gate under a few atoms wide, there is no more room to make things smaller. No more room to make them work on less power. We will have reached the physical limits, at least in the realm of our current lithographic doping processes. We are just about there.

This is not to say there won't be continued advances. They are going to get more and more stuff onto each die for quite some time and manufacturing costs will continue to decline as yields go up. It's just that we are about at the limits of lowering the power consumption of the CPU and chipsets.

Re:Obvious (1)

Russ1642 (1087959) | about 2 months ago | (#47673503)

What's obvious is that we can continue to get smaller and more powerful than what we have already. Do you doubt that in a year's time, let alone five, computers will be smaller, more powerful, and consume less energy? And then there are mobile devices, which have a LONG way to go, especially in regards to batteries. Thinking that we've already reached the limits of speed and size is laughable. It really is up there with "shut down the patent office because everything has been invented" attitude.

Re:Obvious (4, Insightful)

bobbied (2522392) | about 2 months ago | (#47673709)

If you read my comment... I'm saying that we are very close to hitting the physical limits. In the past, the limits were set by the manufacturing process, but now we are becoming limited by the material itself, the size of silicon atoms.

There is basically only one way to reduce the current/power consumption of a device: make it smaller. A smaller logic gate takes less energy to switch states. We are rapidly approaching the size limits of the actual logic gates and are now doing gates measured in hundreds of atoms wide. You are not going to get that much smaller than a few hundred atoms wide. Which means the primary means of reducing power consumption is reaching its physical limits. Producing gates that small also requires some seriously exacting lithography and doping processes, and we are just coming up the yield curve on some of these, so there is improvement still to come, but we are *almost* there now.

There are still possible power-reducing technologies which remain to be fully developed, but they are theoretically not going to get us all that much more, or we'd already have been pushing them harder. So basic silicon technology is going to hit the physical limits of the material pretty soon.
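
To put a rough number on the "smaller gate, less switching energy" point, here is a minimal sketch (Python, with made-up but representative capacitance and voltage values, not measurements of any real process) of the usual dynamic-power relation for CMOS logic:

# Dynamic switching power of CMOS logic: P = alpha * C * V^2 * f.
def dynamic_power(alpha, capacitance_f, voltage_v, frequency_hz):
    """Approximate dynamic (switching) power of a CMOS circuit, in watts."""
    return alpha * capacitance_f * voltage_v**2 * frequency_hz

# Hypothetical "old" node vs. a shrunk node: smaller gates switch less
# capacitance and tolerate a lower supply voltage.
p_old = dynamic_power(alpha=0.1, capacitance_f=1.0e-9, voltage_v=1.2, frequency_hz=3e9)
p_new = dynamic_power(alpha=0.1, capacitance_f=0.5e-9, voltage_v=0.9, frequency_hz=3e9)

print(f"old node: {p_old:.3f} W   shrunk node: {p_new:.3f} W")
print(f"reduction: {100 * (1 - p_new / p_old):.0f}%")
# Halving C and dropping V from 1.2 V to 0.9 V cuts switching power by ~72%,
# which is why shrinking the gate has been the main lever for lowering power.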

Re:Obvious (1)

Anonymous Coward | about 2 months ago | (#47674685)

You're assuming the logic gate will continue to be used; if so, you are correct. But if we moved away from the logic gate to a more efficient system, then size could be reduced again.

Re:Obvious (0)

Anonymous Coward | about 2 months ago | (#47675381)

What's obvious is that we can continue to get smaller and more powerful than what we have already. Do you doubt that in a year's time, let alone five, computers will be smaller, more powerful, and consume less energy? And then there are mobile devices, which have a LONG way to go, especially in regards to batteries. Thinking that we've already reached the limits of speed and size is laughable. It really is up there with "shut down the patent office because everything has been invented" attitude.

Pray tell, how do you propose that we will make a transistor less than one atom wide? They're only a few atoms wide already.

We await your undoubtedly brilliant answer.

Re:Obvious (1)

Anonymous Coward | about 2 months ago | (#47673545)

What about HP's "the machine"? They make fantastic claims about its smaller size, greater computing power, and reduced energy consumption.

Re:Obvious (0)

Anonymous Coward | about 2 months ago | (#47673595)

Um, if you answer no, then you would have been one of the people saying that there was no way the earth could be round.

Re:Obvious (1)

tlhIngan (30335) | about 2 months ago | (#47673701)

The question is, will they have to?

I mean, back when the original iPhone was released, people were releasing ever-tinier cellphones, so it made sense then. But given that cellphones are getting bigger and bigger, the pressure to make smaller and smaller SoCs is decreasing.

I mean, 3.5" was ginormous before. Now that people are buying phones with 6" screens and larger, the amount of size reduction needed is practically nil.

Re:Obvious (3, Insightful)

Beck_Neard (3612467) | about 2 months ago | (#47673721)

We're eventually going to hit limits, but there's no reason to think that that limit is a logic gate a few atoms wide. There's isentropic computing, spintronics, neuromorphic computing, and further down the road, stuff like quantum computing.

Re:Obvious (0)

Anonymous Coward | about 2 months ago | (#47674651)

Well, if you think it takes no energy to flip a bit, might as well believe in leprechauns or space-time folds in the fabric of the universe. Why not? Imagination is the key!

Fool.

Re:Obvious (1)

AmiMoJo (196126) | about 2 months ago | (#47673753)

We can move a lot of processing off to servers now that we have a fast, cheap and ubiquitous network. That will allow our devices to be smaller and use the resources of a larger server somewhere else.

Re:Obvious (1)

Yunzil (181064) | about 2 months ago | (#47673777)

now that we have a fast, cheap and ubiquitous network.

We do?

Re:Obvious (1)

AmiMoJo (196126) | about 2 months ago | (#47673829)

Well, some of us do, others are catching up. The UK is currently about 14 years behind the curve, for example.

Re:Obvious (1)

bobbied (2522392) | about 2 months ago | (#47673885)

We can move a lot of processing off to servers now that we have a fast, cheap and ubiquitous network. That will allow our devices to be smaller and use the resources of a larger server somewhere else.

You have a point, sort of. We are already doing this. However, after the display and CPU (in that order), the third-largest power consumer in a cell phone is the radios. When you start transferring data at high rates, it takes a lot of power. Given the normal distances between the phone and the cell tower, we are just about at the physical limits on this too. It just takes X amount of RF to get your signal over the link, and there is not much you can do without violating the laws of physics.

WiFi, Bluetooth, and near-field chips suffer from the same minimum power limits dictated by physics.

So even this approach has its issues.
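
As a rough illustration of why the radio budget is physics-bound, here is a minimal sketch (Python, assuming a 2 km cell radius, an 1800 MHz band and a 23 dBm handset, free-space conditions only) of the path-loss part of a link budget:

import math

C_LIGHT = 299_792_458.0  # speed of light, m/s

def free_space_path_loss_db(distance_m, frequency_hz):
    """Free-space path loss in dB between isotropic antennas (Friis)."""
    return (20 * math.log10(distance_m)
            + 20 * math.log10(frequency_hz)
            + 20 * math.log10(4 * math.pi / C_LIGHT))

# Assumed example: a handset 2 km from the tower, transmitting at 23 dBm (~200 mW).
# Real links lose far more to buildings, bodies and fading; free space is the best case.
loss_db = free_space_path_loss_db(2_000, 1.8e9)
rx_dbm = 23 - loss_db   # ignoring antenna gains and every other loss

print(f"free-space path loss: {loss_db:.1f} dB")
print(f"received power: {rx_dbm:.1f} dBm ({10 ** (rx_dbm / 10):.1e} mW)")
# ~103.6 dB of loss even in ideal free space: the transmit budget is set by
# distance and frequency, not by how clever the baseband silicon gets.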

Re:Obvious (2)

jcochran (309950) | about 2 months ago | (#47673799)

I believe that we can get things smaller. I'll agree that we're approaching the limits of the basically two-dimensional layout we're currently using for chips, but that leaves the third dimension. Of course there are a lot of technical issues to overcome, but I believe that they will be overcome.

Re:Obvious (1)

bobbied (2522392) | about 2 months ago | (#47673959)

I don't think going 3D is going to fix the power density problem. You still have to get the heat out of the die and keep the device within its operational temperature range. Stacking things in 3D only makes this job harder, along with the question of how you interconnect stuff on multiple layers.

Could we develop technologies to make 3D happen? Sure, we actually are already doing this, albeit in very specific cases. But there are multiple technical issues with trying to dope areas in 3D. You can do it, it's just really hard to then build a gate on top of an already doped region.

Too much power dysfunctional (1)

Livius (318358) | about 2 months ago | (#47673851)

I really hope computers stop getting more powerful, because the trend in the last few years has been for software bloat to use up the added capacity, and now computers are getting more powerful but less useful.

Re:Too much power dysfunctional (1)

Stumbles (602007) | about 2 months ago | (#47674063)

That is why I say it really doesn't matter. Software bloat as you say gobbles up much of the progress.

Re:Obvious (1)

angelbar (1823238) | about 2 months ago | (#47674513)

Maybe they will disappear, so we will not see them, like the TNG "computer".

Re:Obvious (1)

Guppy (12314) | about 2 months ago | (#47674751)

Actually, the answer is no and that is obvious. Eventually we are going to run into limits driven by the size of atoms (and are in fact already there).

No problem with atomic size limits, let me just whip out my handy quark notcher!

Re:Obvious (1)

DivineKnight (3763507) | about 2 months ago | (#47675535)

Nonsense. We'll start making things out of quarks then.

EE, CE, SE, CS, and Physics majors of the future will stare into the void that is sub-atomic computing, and see something staring back at them.

Paywall/Flash Videos? (0)

Anonymous Coward | about 2 months ago | (#47673405)

Seriously? I'm drunk and I'm out. Bye Slashdot.

Battery, Screen, Body (3, Insightful)

mythosaz (572040) | about 2 months ago | (#47673411)

Even if the electronics fail to get much smaller, there's plenty of room to be had in batteries, screens, and the physical casings of our handheld devices.

Re:Battery, Screen, Body (1)

fahrbot-bot (874524) | about 2 months ago | (#47673807)

Even if the electronics fail to get much smaller, there's plenty of room to be had in batteries, screens, and the physical casings of our handheld devices.

At first glance, I read this as "Even if our electrons fail to get much smaller," and, for a second, I thought, "Whoa. Are people working on that?" Guess I gotta get my eyeglass prescription checked.

They're pretty small now. Efficiency will improve (2, Insightful)

Anonymous Coward | about 2 months ago | (#47673447)

We're running up against physical limitations but "3d" possibilities will take our 2d processes and literally add computing volume in a new dimension.

So of course it's going to continue, the only question is one of rate divided by cost/benefit.

Bettridge vs Moore in the battle of the laws (4, Funny)

raymorris (2726007) | about 2 months ago | (#47673463)

Bettridge's law says no.
Moore's law says yes.

In the battle of the eponymous laws, which law rules supreme? Find out in this week's epoch TFA.

Re:Bettridge vs Moore in the battle of the laws (1)

funwithBSD (245349) | about 2 months ago | (#47673585)

Darwin's Law?

Re:Bettridge vs Moore in the battle of the laws (1)

mythosaz (572040) | about 2 months ago | (#47673761)

Cole's Law.

[...thinly sliced cabbage...]

Re:Bettridge vs Moore in the battle of the laws (1)

Dutch Gun (899105) | about 2 months ago | (#47675343)

What if I don't want to slice my cabbage thin? What are you, some sort of cabbage Nazi? Hitler probably liked thin sliced cabbage too!

Godwin's Law

Re:Bettridge vs Moore in the battle of the laws (1)

stoploss (2842505) | about 2 months ago | (#47673713)

Finagle's law.

Re:Bettridge vs Moore in the battle of the laws (5, Funny)

riverat1 (1048260) | about 2 months ago | (#47674569)

In the battle of the eponymous laws, which law rules supreme?

Murphy's Law.

The net is the thing. (0)

Anonymous Coward | about 2 months ago | (#47673497)

It doesn't matter how small, as long as they can be interconnected.

performance never measured in MHz (1)

iggymanz (596061) | about 2 months ago | (#47673509)

three decades in the industry and I've never seen performance measured or stated in MHz. At various times MIPS (referencing a specific architecture, e.g. VAX MIPS or Mainframe MIPS) or MFLOPS might have been used, but never clock speed alone. As is the case now, other benchmarks were also used.

Re:performance never measured in MHz (1)

wonkey_monkey (2592601) | about 2 months ago | (#47673551)

three decades in the industry and I've never seen performance measured or stated in MHz.

Did someone do that in any of the linked articles?

Re:performance never measured in MHz (1)

iggymanz (596061) | about 2 months ago | (#47673823)

Yes, it was the first sentence of John Timmer's Ars article that set me off: "When I first started reading Ars Technica, performance of a processor was measured in megahertz"

Re:performance never measured in MHz (4, Insightful)

vux984 (928602) | about 2 months ago | (#47673555)

three decades in the industry and I've never seen performance measured or stated in MHz

Erm... from the 80286 through the Pentium III, CPU clock speed was pretty much THE proxy stat for "PC performance".

Re:performance never measured in MHz (2)

Misagon (1135) | about 2 months ago | (#47673787)

I can't tell if you are being sarcastic or not...

What you say is true only if you bought all your processors from Intel.

Once AMD came along, it was not entirely true if you compared to them. It was not true if you compared to Mac that used 680x0 and later PowerPC.

Re:performance never measured in MHz (1)

vux984 (928602) | about 2 months ago | (#47674001)

What you say is true only if you bought all your processors from Intel.

You say that like this wasn't common as dirt for most of a decade or so.

Once AMD came along

Yeah, that was mostly later. Pentium 4 vs Athlon XP etc. My suggested time frame ended with the Pentium III for a reason.

It was not true if you compared to Mac that used 680x0 and later PowerPC.

Also true, but comparatively few did that. Choosing a Mac vs a PC rarely had anything to do with performance. It was entirely about OS + applications; then once you chose a platform you compared models within it. Practically nobody cared whether their Centris 610 was faster than a 486, or if their Dell Pentium II 333 would have been faster had it been an iMac G3.

Re:performance never measured in MHz (1)

Anonymous Coward | about 2 months ago | (#47674419)

You must be too young for the Pentium 2 vs. K6-2 debates.

Re:performance never measured in MHz (1)

ranton (36917) | about 2 months ago | (#47675633)

You must be too young for the Pentium 2 vs. K6-2 debates.

You must be too young to remember that in the late 90s / early 00s, no one other than techies even knew there was competition between Intel and AMD. They just bought their Intel Inside Dells and Gateways.

Re:performance never measured in MHz (1)

iggymanz (596061) | about 2 months ago | (#47673789)

Marketing and sales to ignorant consumers don't count. The "MHz Myth" has been time and again a subject in many a PC magazine.

More meaningful benchmarks existed long before that era (e.g. Whetstone from the early 70s), and many (e.g. Dhrystone in the mid 80s) were used all through the rise of the microprocessor (8080, 6502, etc.).

Re:performance never measured in MHz (3, Insightful)

vux984 (928602) | about 2 months ago | (#47674059)

Marketing and sales to ignorant consumers don't count.

Originally it was useful enough. Marketing and sales perpetuated it long after it wasn't anymore.

The "MHz Myth" has been time and again a subject in many a PC magazine

Only once the truth had become myth. The MHz "myth" only existed because it was sufficiently useful and accurate to compare intel CPUs by MHz within a generation, and even within limits from generation to generation, for some 8 generations.

It wasn't really until Pentium 4 that MHz lost its usefulness. The Pentium 4 clocked at 1.4GHz was only about as fast as a P3 1000 or something; and AMD's Athlon XP series came out and for the first time in a decade MHz was next to useless. Prior to that, however, it was a very useful proxy for performance.

More meaningful benchmarks have existed long before that era (e.g. Whetstone from early 70s) and many were (e.g. Dhrystone in mid 80s) used all through the rise of the microprocessor (8080, 6502, etc.)

Sure they did. But for about a decade or so, if you wanted a PC, CPU + MHz was nearly all you really needed to know.

Re:performance never measured in MHz (1)

iggymanz (596061) | about 2 months ago | (#47674811)

But there were ALWAYS alternatives to intel processors, even for personal computers (e.g. Motorola), from day one of the personal computer movement, and so the Megahertz Myth was always meaningless. My home computer in 1991 had a Motorola chip (NeXTStation); in 1996 it had a Sparc chip.

Re:performance never measured in MHz (1)

iggymanz (596061) | about 2 months ago | (#47674853)

And if anyone is interested, in 1976 I had a SWTP 6800.

Re:performance never measured in MHz (1)

vux984 (928602) | about 2 months ago | (#47675419)

But there were ALWAYS alternatives to intel processors, even for personal computers (e.g. Motorola), from day one of the personal computer movement, and so the Megahertz Myth was always meaningless.

Only if you cared about comparison with non-intel PCs. People buying Macs weren't worried about performance comparisons with PCs, they were only concerned about performance compared to OTHER macs. The (much larger) DOS/Windows PC crowd only cared about performance relative to other intels.

My home computer in 1991 had a Motorola chip (NeXTStation), in 1996 it had a Sparc chip.

Heh, NeXT sold what, 50,000 units total? Very VERY few people were terribly interested in comparing the performance of those to DOS boxes -- and for that, sure, there were other benchmark methodologies. But, much as you seem not to want to admit it, CPU MHz *was*:

a) used to measure relative performance of DOS/Windows PCs for several years

b) a pretty reasonable and adequate means for doing so, for quite a few years

1996 it had a Sparc chip.

Again, its very few users weren't selecting it for performance vis-a-vis an intel DOS box. :)

Down with paywalls (1)

Anonymous Coward | about 2 months ago | (#47673527)

Get the original article here: Fuck paywalls [libgen.org]

Our own computers? In the FUTURE? (5, Insightful)

uCallHimDrJ0NES (2546640) | about 2 months ago | (#47673563)

Next you'll be telling me they'll let us run unsigned code on processors capable of doing so. You need to get onboard, citizens. All fast processing is to occur in monitored silos. Slow processing can be delegated to the personal level, but only with crippled processors that cannot run code that hasn't yet been registered with the authorities and digitally signed. You kids ask the wrong questions. Ungood.

Re:Our own computers? In the FUTURE? (1)

Anonymous Coward | about 2 months ago | (#47673601)

++ungood citizen

I run approved OS. It is good for us all.

check out the table on this article (0)

Anonymous Coward | about 2 months ago | (#47673623)

Micro Laptop (1)

mschoolbus (627182) | about 2 months ago | (#47673635)

I just want a micro x86_64 laptop with an outside screen as well for phone purposes. *dreams*

Considering (1)

msobkow (48369) | about 2 months ago | (#47673725)

Considering the raw power of today's typical smart phone and its form factor, I'd say we're rapidly approaching the limits on the size of devices, especially when you consider the rooms that far less powerful computers used to occupy in the days of yore.

There are physical limits to how small electronics can be made, even if new lithography technologies are developed. We'd need to come up with something energy based instead of physical in order to get smaller than those barriers.

Plus there's the fact that a user interface device can only be so small and still be useful to anyone. I already find virtually every cell phone on the market to be too small to be useful for anything. I'm not interested in squinting to read text on a 5 inch screen, thank you very much. Never mind the fact that fat fingers tend to be far bigger than the hot-spots on the user interfaces of such devices.

Considering (1)

BarbaraHudson (3785311) | about 2 months ago | (#47674405)

I'm not interested in squinting to read text on a 5 inch screen

So enlarge the fonts. Turn on triple-tap to zoom text in even more. No need to squint.

Re:Considering (1)

stepho-wrs (2603473) | about 2 months ago | (#47675109)

Making it much easier to read the 3 characters that fit on the 5" screen...

It ain't gonna matter. (1)

Stumbles (602007) | about 2 months ago | (#47673995)

I have been using computers since the early 80s. Things like the HP2114, Varian 77 and other stuff that never saw the light of civilian day. My i7 with 16GB of RAM boots no faster, nor gets into a usable state sooner, than the HP2114 with 8K of core memory and a CPU built from discrete components.

Will they be able to process more data? Yeah, probably, but that won't matter because they'll just be given more data to munch, so you will still need more machines. And so the cycle goes.

Obligatory: "There's Plenty of Room at the Bottom" (2)

noidentity (188756) | about 2 months ago | (#47674099)

Feynman's talk on this seems required reading: There's plenty of room at the bottom [zyvex.com] . None of the linked articles even mention Feynman's name.

Re:Obligatory: "There's Plenty of Room at the Bott (1)

frank_adrian314159 (469671) | about 2 months ago | (#47675145)

None of the linked articles even mention Feynman's name.

Why should they? Not many current astrophysics papers mention Galileo, either. Nor do most papers in modern computing reference the work of John von Neumann.

In science, an original idea or suggestion by someone, no matter how famous, is built upon by others, whose work is built upon by others, until someone actually turns an incomplete idea into a field of study. And by this time the literature has evolved to view the problem slightly differently, perhaps more completely, perhaps from a point of view that's more useful for research. And then these papers by the others who made these changes become the ones that are referenced. It's the cycle of scientific research. And don't think it's because we've forgotten our roots... If you asked the author of this paper, I'm pretty sure he'd start with either Shannon or Feynman. We leave older references off because often they're not relevant to the research you're talking about. And, frankly, your space is already so limited you don't want to spend any on name checks.

But come on, do you really think a 55 year old paper is going to be at the top of impact rankings when computed against current research in a field moving this fast? And, even if so, isn't it more likely this work has been superseded by others? IT'S BEEN 55 GOD DAMN YEARS, FOR CHRISSAKE!!! I think your hero worship is showing. At least find a more modern reference.

Re:Obligatory: "There's Plenty of Room at the Bott (0)

Anonymous Coward | about 2 months ago | (#47675403)

Feynman's talk on this seems required reading: There's plenty of room at the bottom [zyvex.com] . None of the linked articles even mention Feynman's name.

Did you ever even read the good Prof. Feynman's words?

When we get to the very, very small world – say circuits of seven atoms – we have a lot of new things that would happen that represent completely new opportunities for design.

The finest circuits are *already* about 7 atoms thick. What do you propose to do when it's down to one atom, slice it with a pizza cutter?

We're already at the goddamned bottom and Feynman's not around to bail us out.

Remove the Bloat (2)

Hamsterdan (815291) | about 2 months ago | (#47674121)

As we're nearing the size limit for IC manufacturing technology, what about reducing bloat and coding in a more efficient manner?

Let's look at the specs of earlier machines

Palm Pilot: 33 MHz 68000 with 8 MB of storage, yet it was fast and efficient.
C=64: 1 MHz 6510 with 64K RAM (38K usable), also fast and efficient; you could run a combat flight simulator on it (Skyfox).
Heck, even a 66 MHz 486 with 16 MB was considered almost insane in early 1994 (and it only had a 340 *MB* HDD), and everything was fine. (I bought that in high school for AutoCAD.)

Go back to the same efficient and small code, and our devices will seem about 10 times faster and will last longer.

Re:Remove the Bloat (1)

GrahamCox (741991) | about 2 months ago | (#47675185)

C=64: 1 MHz 6510 with 64K RAM (38K usable), also fast and efficient

It wasn't fast by any stretch (I had the European PAL spec, which was even slower). If you wanted to use "high resolution" mode (320x200 pixels) then it took minutes to draw even simple curves. If you programmed it using the built-in BASIC, anything non-trivial took minutes or more. The only way you could write anything like a useful program was to use assembler, coding directly to the bare metal. Some of the resulting games were impressive enough for their time, but wouldn't look like much today.

The problem isn't sloppy coding, but that expectations are higher: people want photographic fidelity for images and video, interfaces that look good, and the ability to download stuff over the internet quickly. All that takes a lot of processor power, and a certain amount of code. A modern PC is hardly wasting CPU cycles to get its work done (except in the trivial sense that it's using a lot of power for things that some people consider frivolous, like blurry translucent window backgrounds); there isn't a way to speed up our devices by 10x and still have them do what they do. The idea that modern code is wasteful and bloated is a myth.

Does it matter? (1)

Krishnoid (984597) | about 2 months ago | (#47674197)

There was a time when 1GHz/1GB was overkill, and while CPU/IO speed improves, usability doesn't seem to be getting all that much better. Considering we've had multiple orders of magnitude improvement in raw hardware performance, shouldn't other factors -- usability, reliability, security -- get more focus?

Sure, those could benefit from more raw hardware capability, but the increased 'power' doesn't seem to be targeted at improving anything other than raw application speed -- and sometimes, not even that.

Re:Does it matter? (1)

fyngyrz (762201) | about 2 months ago | (#47674459)

There was a time when 1GHz/1GB was overkill

Not for desktop computers, there wasn't. Perhaps for your watch. Then again, probably not.

There's no such thing as "overkill" in computing power and resources. There is only "I can't get (or afford) anything faster than this right now."

Re:Does it matter? (1)

Ambassador Kosh (18352) | about 2 months ago | (#47674835)

If I had a computer that was a million times faster than my current computer I could still use something even faster. Even at a billion times faster I could still use more power. We are at the stage where we can use computer simulations to help bring drugs to market. The computational power needed is HUGE but it is also helping bring drugs (including CURES) to market that would have never been possible otherwise. There are even potential cancer cures that will NOT make it to market ANY other way.

The average person may not need more computing power but as a species we desperately need insanely more computing power than we have now.

Re:Does it matter? (1)

Krishnoid (984597) | about 2 months ago | (#47675151)

I run Rosetta@Home [bakerlab.org] on my own computers -- I can't believe I forgot about that. Great point.

Re:Does it matter? (1)

Dutch Gun (899105) | about 2 months ago | (#47675397)

The scientists and engineers who design US nuclear weapons have computational problems that are measured in CPU-months. A senior scientist was talking to a consultant and explained the importance of these simulations.

"Just think about it," he said. "If we get those computations wrong, millions of people could accidentally live."

-credit to the unknown US nuclear scientist who told this joke to Scott Meyers, who in turn relayed it at a conference.

Re:Does it matter? (1)

Ambassador Kosh (18352) | about 2 months ago | (#47675577)

In my case, though, these calculations will save millions of lives and improve the quality of life for many millions more. Even the most powerful supercomputers in the world would take years to solve many of these problems, and we keep finding more to solve. We approximate solutions because that is still better than what we had before, and it is the best we can do for now.

With more computing power we can save more lives.

no, my dick is getting bigger and more powerful (-1)

Anonymous Coward | about 2 months ago | (#47674253)

just thought you should know

Moore's law (2)

jafffacake (1966342) | about 2 months ago | (#47674777)

Three years ago in the UK I bought my daughter a Dell laptop: i5 processor, 6GB RAM, 500GB hard drive, £350. Recently it died, so I looked around for a replacement. Listed in the bargain forums here (hotukdeals.com) only a couple of weeks ago was a laptop with an i5, 6GB RAM and a 1TB hard drive for £380. So in three years the price has barely changed for a remarkably similar spec. Moore's law seems dead? I agree with the original poster!

Re:Moore's law (0)

Anonymous Coward | about 2 months ago | (#47674999)

You are unfortunately incorrect. The i5 lineup varies greatly based on what you purchase. In reality, most of the i5s from 3 years ago didn't have HT, they had larger dies, they ate more power and they had lower benchmarks. The newer i5s being released now are on par with what the top-of-the-line i7s from 3 years ago were doing. Intel needs to start labeling their stuff better...

The Gating Issue (1)

rssrss (686344) | about 2 months ago | (#47675087)

The gating issue is now screen size and finger size. Nice big high def screens need big batteries to keep them lit. I don't think those items are going to get much smaller.

Re:The Gating Issue (0)

Anonymous Coward | about 2 months ago | (#47675449)

Funnily enough, they seem to be getting bigger these days.

Mobile phones were once huge briefcase-sized systems, then moved to a house-brick-sized hand-held model. They gradually shrunk until they bottomed out at the smallest possible size that you could still hold in your hand and press the buttons.

Those tiny Nokias (and other brands too) really couldn't get any smaller and still remain usable. They fit in a pocket or purse easily and had the same battery and performance as their larger brethren.

Then the smart phones came along where screen size was important, along with the ability to type long text messages and emails. So phones started to grow again, until some came along that are basically all screen and battery. Construction methods have improved so the phone is now basically limited by the usable size of the screen with onscreen keyboard.

A few years ago the vast majority of phones fit into the average pocket. Now the phones are growing as big as possible while still (kind of) fitting into a (large) pocket or purse. These days you need suit jacket pockets, cargo pants, or a hand bag to carry them around. The current iPhones barely fit into my jean pockets (and definitely not comfortably unlike those tiny Nokia phones). Belt holsters are annoying.

I'd love for Apple to bring out an iPhone nano that could be taken out with jeans and a t-shirt when you just need the bare minimum communications technology with you, but have it fully synced with the bigger full sized phone.

Betteridge's law of headlines - finally broken (1)

germansausage (682057) | about 2 months ago | (#47675375)

The one word answer is "Yes". Betteridge's law of headlines is finally broken.
