FRDB Archives

Freethought & Rationalism Archive

The archives are read only.


Old 06-20-2003, 04:06 AM   #1
Veteran Member
 
Join Date: Aug 2001
Location: Seattle
Posts: 2,280
Default Limits of CPU, memory speed?

I just got in a 3.0 GHz Pentium chip w/ an 800 MHz FSB and Hyper-Threading. It got me thinking: just how fast can computers go with the current way of making chips? Can they reach 50 GHz? Can they do many times more operations per clock cycle than they do now?

What are the new technologies on the horizon that will push memory bandwidth and number-crunching power up orders of magnitude from where they are now?

I can't remember a good discussion of this here, but hopefully some people have a specialty in this.
repoman is offline  
Old 06-20-2003, 06:06 AM   #2
Veteran Member
 
Join Date: Mar 2002
Location: anywhere
Posts: 1,976
Default

I have some interest (inversely proportional to the actual expertise I have) in this topic, because of its relevance to fine-tuning/ID arguments. By way of further introductory material, let me suggest the following bibliography, which I lifted from this site:
Quote:
Lloyd, Seth. "Ultimate Physical Limits to Computation." Nature 406, pp. 1047-1053 (31 August 2000). Also available in preprint form on the arXiv ePrint server.

Ng, Jack Y. "From Computation to Black Holes to Space-Time Foam." Physical Review Letters Vol. 86, No. 14, pp. 2946-2949 (2 April 2001). Also available in preprint form on the arXiv ePrint server (the version on the preprint server is more recent than the one published in PRL, and is corrected).

Susskind, Leonard and Uglum, John. "Black Hole Entropy in Canonical Quantum Gravity and Superstring Theory." Phys. Rev. D Vol. 50, pp. 2700-2711 (1994). A version of the paper is available in preprint form on the arXiv ePrint server.

Chuang, Isaac L. et al. "Experimental Realization of a Quantum Algorithm." Nature Vol. 393, pp. 143-146 (1998). A version of the paper is available in preprint form on the arXiv ePrint server.
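The Lloyd paper above puts an actual number on the ultimate bound: a system of energy E can perform at most about 2E/(pi*hbar) elementary logical operations per second. A minimal Python sketch of that arithmetic (the function name is illustrative; the 1 kg "ultimate laptop" is the paper's own headline case, assuming all rest energy E = mc^2 is available):

```python
import math

HBAR = 1.054571817e-34  # reduced Planck constant, J*s
C = 2.99792458e8        # speed of light, m/s

def max_ops_per_second(mass_kg):
    """Lloyd's upper bound on logical operations per second for a
    computer of the given mass, taking E = m c^2 as available energy."""
    energy = mass_kg * C ** 2
    return 2 * energy / (math.pi * HBAR)

# One-kilogram "ultimate laptop": roughly 5.4e50 operations per second,
# some 36 orders of magnitude beyond a 2003-era GHz processor.
print(f"{max_ops_per_second(1.0):.2e}")
```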
Principia is offline  
Old 06-20-2003, 07:43 AM   #3
Senior Member
 
Join Date: Apr 2003
Location: Omaha, Nebraska
Posts: 503
Default

Our computers would become much faster if they didn't have to maintain compatibility with the x86 architecture. They will continue to get faster at a modest rate, but once nanotechnology becomes viable, we shall see huge leaps in processing speed.
Jake
SimplyAtheistic is offline  
Old 06-20-2003, 09:54 PM   #4
Veteran Member
 
Join Date: Mar 2003
Location: Colorado
Posts: 1,969
Default

While I don't know what the limit is, there is certainly a speed limit brought on by quantum mechanics.
As chip architecture becomes denser, and conducting paths get smaller and closer together, the probability of electron tunneling (electrons passing through insulating material into nearby conductors) increases. If tunneling occurs often enough, the chip's behavior becomes unpredictable, and the chip is useless.
To overcome tunneling, the chip must be made larger as more circuits are imprinted on it. The larger the chip, the longer the electron path for a given function, and the slower the speed.
Therefore there is a speed limit for the technology we now employ, and thus the great interest in completely different technologies.
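The exponential sensitivity behind this argument can be sketched with the standard WKB estimate for a rectangular barrier: transmission falls off roughly as T ~ exp(-2*kappa*d), with kappa = sqrt(2m(V-E))/hbar. A toy Python illustration (the function name and the 3 eV barrier height are assumptions for illustration, roughly the scale of a silicon/oxide barrier):

```python
import math

HBAR = 1.054571817e-34   # reduced Planck constant, J*s
M_E = 9.1093837015e-31   # electron mass, kg
EV = 1.602176634e-19     # joules per electron volt

def tunneling_probability(barrier_ev, thickness_nm):
    """WKB estimate of electron tunneling through a rectangular
    barrier of the given height (eV) and thickness (nm)."""
    kappa = math.sqrt(2 * M_E * barrier_ev * EV) / HBAR   # decay constant, 1/m
    return math.exp(-2 * kappa * thickness_nm * 1e-9)

# Shrinking the insulating layer raises the leakage probability by
# many orders of magnitude per nanometer removed.
for d in (3.0, 2.0, 1.0):
    print(d, tunneling_probability(3.0, d))
```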

Ed
nermal is offline  
Old 06-20-2003, 11:51 PM   #5
Junior Member
 
Join Date: Jun 2003
Location: Bowling Green, KY USA
Posts: 15
Default

We have the ability now to take silicon to a new level, but not a much higher one. Replacing the silicon with carbon, and stretching the materials used to make transistors, will help with the speed increases. The next big step, and I mean a revolutionary change in processors, memory, and storage, will probably be DNA and RNA processing. It's already been used to help with the decoding of the human genome and has proven to be extremely powerful and extremely fast, and it can store the same information in about one trillionth of the space. It holds the most promise and should be the next revolution. As for the highest speed with current architecture and materials, we're probably looking at 25-35 GHz.
galt23 is offline  
Old 06-21-2003, 12:31 AM   #6
Moderator - Science Discussions
 
Join Date: Feb 2001
Location: Providence, RI, USA
Posts: 9,908
Default

This section of an article by Ray Kurzweil is somewhat relevant to this topic, especially the first two paragraphs as well as the last one:

Quote:
The exponential trend that has gained the greatest public recognition has become known as "Moore's Law." Gordon Moore, one of the inventors of integrated circuits, and then Chairman of Intel, noted in the mid 1970s that we could squeeze twice as many transistors on an integrated circuit every 24 months. Given that the electrons have less distance to travel, the circuits also run twice as fast, providing an overall quadrupling of computational power.

After sixty years of devoted service, Moore's Law will die a dignified death no later than the year 2019. By that time, transistor features will be just a few atoms in width, and the strategy of ever finer photolithography will have run its course. So, will that be the end of the exponential growth of computing?

Don't bet on it.

If we plot the speed (in instructions per second) per $1000 (in constant dollars) of 49 famous calculators and computers spanning the entire twentieth century, we note some interesting observations.

Moore's Law Was Not the First, but the Fifth Paradigm To Provide Exponential Growth of Computing

Each time one paradigm runs out of steam, another picks up the pace




It is important to note that Moore's Law of Integrated Circuits was not the first, but the fifth paradigm to provide accelerating price-performance. Computing devices have been consistently multiplying in power (per unit of time) from the mechanical calculating devices used in the 1890 U.S. Census, to Turing's relay-based "Robinson" machine that cracked the Nazi enigma code, to the CBS vacuum tube computer that predicted the election of Eisenhower, to the transistor-based machines used in the first space launches, to the integrated-circuit-based personal computer which I used to dictate (and automatically transcribe) this essay.

But I noticed something else surprising. When I plotted the 49 machines on an exponential graph (where a straight line means exponential growth), I didn't get a straight line. What I got was another exponential curve. In other words, there's exponential growth in the rate of exponential growth. Computer speed (per unit cost) doubled every three years between 1910 and 1950, doubled every two years between 1950 and 1966, and is now doubling every year.

But where does Moore's Law come from? What is behind this remarkably predictable phenomenon? I have seen relatively little written about the ultimate source of this trend. Is it just "a set of industry expectations and goals," as Randy Isaac, head of basic science at IBM contends? Or is there something more profound going on?

In my view, it is one manifestation (among many) of the exponential growth of the evolutionary process that is technology. The exponential growth of computing is a marvelous quantitative example of the exponentially growing returns from an evolutionary process. We can also express the exponential growth of computing in terms of an accelerating pace: it took ninety years to achieve the first MIPS (million instructions per second) per thousand dollars, now we add one MIPS per thousand dollars every day.

Moore's Law narrowly refers to the number of transistors on an integrated circuit of fixed size, and sometimes has been expressed even more narrowly in terms of transistor feature size. But rather than feature size (which is only one contributing factor), or even number of transistors, I think the most appropriate measure to track is computational speed per unit cost. This takes into account many levels of "cleverness" (i.e., innovation, which is to say, technological evolution). In addition to all of the innovation in integrated circuits, there are multiple layers of innovation in computer design, e.g., pipelining, parallel processing, instruction look-ahead, instruction and memory caching, and many others.

From the above chart, we see that the exponential growth of computing didn't start with integrated circuits (around 1958), or even transistors (around 1947), but goes back to the electromechanical calculators used in the 1890 and 1900 U.S. Census. This chart spans at least five distinct paradigms of computing, of which Moore's Law pertains to only the latest one.

It's obvious what the sixth paradigm will be after Moore's Law runs out of steam during the second decade of this century. Chips today are flat (although it does require up to 20 layers of material to produce one layer of circuitry). Our brain, in contrast, is organized in three dimensions. We live in a three dimensional world, why not use the third dimension? The human brain actually uses a very inefficient electrochemical digital controlled analog computational process. The bulk of the calculations are done in the interneuronal connections at a speed of only about 200 calculations per second (in each connection), which is about ten million times slower than contemporary electronic circuits. But the brain gains its prodigious powers from its extremely parallel organization in three dimensions. There are many technologies in the wings that build circuitry in three dimensions. Nanotubes, for example, which are already working in laboratories, build circuits from pentagonal arrays of carbon atoms. One cubic inch of nanotube circuitry would be a million times more powerful than the human brain. There are more than enough new computing technologies now being researched, including three-dimensional silicon chips, optical computing, crystalline computing, DNA computing, and quantum computing, to keep the law of accelerating returns as applied to computation going for a long time.
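The doubling times Kurzweil quotes (every three years from 1910 to 1950, every two years to 1966, then every year) compound into the century-scale growth he describes. A quick Python sketch of that arithmetic (era boundaries taken from the quote; extending the final era to the thread's 2003 date is my assumption):

```python
# Growth factor over each era is 2 ** (years / doubling_time).
eras = [
    (1910, 1950, 3.0),  # (start year, end year, doubling time in years)
    (1950, 1966, 2.0),
    (1966, 2003, 1.0),  # "now doubling every year"
]

growth = 1.0
for start, end, doubling in eras:
    growth *= 2 ** ((end - start) / doubling)

# Exponent is 40/3 + 16/2 + 37/1 = 58.3 doublings, i.e. roughly a
# 3.6e17-fold improvement in speed per unit cost over the century.
print(f"{growth:.1e}")
```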
Jesse is offline  
Old 06-22-2003, 07:09 PM   #7
Regular Member
 
Join Date: Jun 2003
Location: Brisbane, Australia
Posts: 109
Default

Would anyone be able to explain to a layperson how it is that we're constantly having these increases in processor speed? What I mean is, why is it possible for us to make a 3 GHz processor at an affordable price today when we weren't able to do it three years ago, and why will we be able to make much faster processors at an affordable price in three years when we can't make them today?
It seems odd to have such a steady and constant rate of improvement in a technology.
Anson is offline  
Old 06-22-2003, 11:53 PM   #8
Regular Member
 
Join Date: Feb 2003
Location: Canada
Posts: 127
Default

Great link Jesse. Ray Kurzweil is very cool.

Anson, how is it odd? It's the way it has been since the dawn of human history (actually, saying "since evolution began" would be more accurate).

We build tools, and with those tools we can build better tools, etc.
Elvithriel is offline  
Old 06-23-2003, 02:39 AM   #9
Banned
 
Join Date: Apr 2003
Posts: 7,834
Exclamation There is almost no theoretical limit!

Based on current technology, it seems the theoretical limit is probably somewhere in the 30-35 GHz range, although I've seen estimates up to 50+!!

However, there is a new technology based on using quantum mechanical principles (IIRC, bound spin states of electrons) to transmit/store information. I think this, in conjunction with the biological advances using DNA/RNA, will take the advances beyond anything we can even estimate now, although the technology won't be readily available for quite some time.

If we're lucky, right around the time we reach the upper limit with printed circuits....

Here's a link on quantum computers, http://cryptome.org/qc-grover.htm, that is well written in mostly layman's terms.

If you google "quantum circuit" you can find lots of good links. But be prepared for some mind-bending stuff!
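Grover's algorithm, the subject of the linked article, can be simulated directly for small sizes: about (pi/4)*sqrt(N) rounds of "flip the marked amplitude's sign, then invert all amplitudes about their mean" make the marked item almost certain on measurement. A small self-contained Python sketch (all names here are illustrative, not from the article):

```python
import math

def grover(n_items, marked, iterations):
    """State-vector simulation of Grover search over n_items entries."""
    amp = [1 / math.sqrt(n_items)] * n_items        # uniform superposition
    for _ in range(iterations):
        amp[marked] = -amp[marked]                  # oracle: flip marked sign
        mean = sum(amp) / n_items
        amp = [2 * mean - a for a in amp]           # inversion about the mean
    return [a * a for a in amp]                     # measurement probabilities

N = 64
best = round(math.pi / 4 * math.sqrt(N))            # ~6 iterations for N = 64
probs = grover(N, marked=13, iterations=best)
print(best, probs[13])                              # marked item ~99.7% likely
```

Note the square-root scaling: a classical search over 64 unsorted items needs 32 lookups on average, while Grover's algorithm gets there in about 6 oracle calls.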

Lane
Worldtraveller is offline  
 
