Blog Post

Your Own Personal Supercomputer

Most people "think" serially and consequently give both other people and computers instructions in a similar way. In fact, even the term used for the act of running a computer program, "processing," implies a predefined, linear structure (think about the "process" of renewing your driver's license). As such, it's no big surprise that the computer was designed to process one command after another, and that a programmer's natural instinct is to write code in a similar structure. Computer scientists have long been aware that well-designed parallel algorithms are far more powerful than their linear counterparts, but with the constant increase in processing power over the last 30 years, there has not been a large push toward teaching distributed (parallel) programming outside of supercomputing and other highly specialized fields.

As the popularity of video gaming and the average number of pixels on a computer screen have grown, the demand has also increased dramatically on the Graphics Processing Unit (GPU), which needs to calculate and update every pixel on the screen 50-100 times per second (almost all modern LCD screens have well over 1 million pixels; High Definition video has over 2 million). In order to satisfy this demand, which was growing far faster than the capacity of a standard CPU, graphics card designers began to increase the number of processors rather than the actual speed ("If 2 heads are better than 1, then 3 heads must be better than 2."). Modern graphics cards have anywhere from 8 to 240 individual cores, with computational power reaching into the teraflop range (1 teraflop = a trillion, i.e. a million million, operations per second). In comparison, the world's fastest supercomputer, the IBM RoadRunner, has a maximum sustained rate of 1 petaflop (1,000 teraflops), but uses 65 times as many processors as it would if it were built with GPUs. This high performance isn't limited to high-end systems, either: my laptop's GPU (8 cores) can process around 34 billion operations per second (34 gigaflops). This is far more powerful than my CPU, which can process just under 15 billion operations per second (15 gigaflops) and costs considerably more than the GPU.
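As a rough illustration of where throughput numbers like these come from, here is a back-of-envelope peak-throughput sketch. The core counts, clock speeds, and operations-per-cycle figures are illustrative assumptions chosen to land near the post's 34 and 15 gigaflop numbers, not measured specs:

```python
def peak_gflops(cores, clock_ghz, ops_per_cycle):
    """Theoretical peak throughput: cores x clock (GHz) x operations per cycle."""
    return cores * clock_ghz * ops_per_cycle

# Illustrative figures only (assumed, not taken from any spec sheet):
gpu = peak_gflops(cores=8, clock_ghz=2.1, ops_per_cycle=2)
cpu = peak_gflops(cores=2, clock_ghz=2.4, ops_per_cycle=3)
print(round(gpu, 1), round(cpu, 1))  # roughly 33.6 vs 14.4 GFLOPS
```

The point of the sketch is simply that multiplying the core count dominates: doubling cores doubles peak throughput, while clock speeds have largely stopped climbing.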

Having seen how powerful a GPU is due to its parallel processing capability, it is interesting to look at the brain in a similar light: while we may "think" serially, our brains process information as a massively parallel system. In fact, every single neuron could be considered equivalent to a CPU core (in speed, not flexibility). If we apply the same metrics used for calculating processor speeds to our brain and then compare it to the IBM RoadRunner, we see that it would take over fifty million of these supercomputers just to keep up with your brain! So, the next time your friend is bragging about how fast and fancy his new computer is, just think about how long it would have taken his computer to say that.


4 comments

This is an informative and delightful posting. Thanks!


http://www.youtube.com/watch?v=cL9Wu2kWwSY This is Scott McLeod's YouTube video that pushes some of these points in an entertaining way. I am not sure all the facts are right, or even what all the implications are, but it is interesting, and so are the various YouTube mashups and remixes of it.


Wow, that is really interesting, although I do question the claims about computational power compared to the human brain. Even if researchers were able to build a supercomputer capable of 100 billion gigaflops in the next 5 years, I highly doubt it would be capable of competing with the human brain (remember, the fastest supercomputer today is rated at just over a petaflop, 100,000 times slower). Although some studies from the late 1980s and the 1990s have estimated the brain's computing power at around 10 petaflops [1,2], they have shown total disregard for some of the unique aspects of the human brain; neuronal activation is far more complicated than a simple "on" or "off" due to, among other things, the varying chemicals released when the neuron fires [3]. Judging by the trend of "increased knowledge of the brain" == "increased estimation of the brain's computing power", it is safe to assume the computing power of the human brain far surpasses any previous (and probably current) estimations.
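To see how much the choice of assumptions drives these estimates, here is a sketch of the same back-of-envelope calculation with two different notions of what counts as an "operation." Every parameter below is an illustrative assumption, not a figure from the cited studies:

```python
def brain_ops_per_sec(neurons, firings_per_sec, ops_per_firing):
    """Crude brain-throughput estimate: neurons x firing rate x work per firing."""
    return neurons * firings_per_sec * ops_per_firing

# Treating each firing as a simple on/off event (assumed parameters):
low = brain_ops_per_sec(neurons=1e11, firings_per_sec=100, ops_per_firing=1e3)

# Counting rich synaptic/chemical effects per firing (assumed parameters):
high = brain_ops_per_sec(neurons=1e11, firings_per_sec=100, ops_per_firing=1e7)

print(low / 1e15, high / 1e15)  # 10.0 vs 100000.0 petaflops
```

The "simple spikes" assumption reproduces the ~10 petaflop figure from the older studies, while the richer model lands at 100 billion gigaflops; the four-orders-of-magnitude gap comes entirely from what you decide a neuron is doing.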

Additionally, the video's producers are probably basing their estimations of computing power in 2013 and 2049 on Moore's Law, which states that the number of transistors (and, consequently, computing power) will double roughly every 18 months. Unfortunately, this has not held true since 2004, when Intel abandoned clock-speed scaling in the move from the 90 nm to the 65 nm process node in favor of increasing the number of CPU cores (sound a little like what GPU designers did in the '90s?).
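For reference, the 18-month doubling described above compounds like this (a sketch; the starting transistor count is an arbitrary illustrative figure):

```python
def moores_law(start_count, years, doubling_months=18):
    """Project a transistor count forward, doubling every `doubling_months`."""
    return start_count * 2 ** (years * 12 / doubling_months)

# Starting from an illustrative 100 million transistors:
print(moores_law(100e6, years=9))  # 6 doublings -> 6.4 billion
```

Nine years of uninterrupted doubling is a 64x increase, which is why extrapolations to 2013 or 2049 are so sensitive to whether the doubling actually continues.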

Finally, while optical character and handwriting recognition (two relatively difficult tasks for a computer, but easy for a human) have come a long way, the limiting factor is still the algorithms and not the processing speed. Admittedly, if I were to have a 100 exaflop supercomputer at my disposal, it is definitely feasible that I could put something together that would work. However, that's like killing a mosquito with a bazooka: a little excessive, and definitely not representative of the capabilities of a computer. When Garry Kasparov lost to Deep Blue, a chess computer capable of 3 million billion chess operations per second (3 "special" petaflops), no one considered for a second that Deep Blue was smarter than Kasparov: it was merely better at following a static set of instructions. Kasparov, on the other hand, is capable of planning and adapting his playing style: he turned right around and won the next three games.

I was reading a research article just this week about sound recognition using artificial neural networks; although they were able to train the network to recognize individual pitches, the output became worthless when they tried to recognize chords, or even harmonics, much less instruments or music styles.

There is far too much I could say about this, and since this is exactly what I plan to talk about in my next post, I'll probably be writing a nice long one far sooner than I had planned!

References
[1] http://www.merkle.com/brainLimits.html
[2] http://www.transhumanist.com/volume1/moravec.htm
[3] http://cogsci.uwaterloo.ca/Articles/molecules.htm


Jeff Hawkins, in On Intelligence, makes the point that we're still a long way from the kind of intuitive, dexterous rethinking (not just calculation) that the human brain does. So maybe it will be more like Temple Grandin's idea of "animal genius." Rather than thinking about the computer as some simulacrum of the human brain, we need to think about a "computer genius" that isn't comparable to the human brain: it exceeds the brain in some capacities, is far less interesting and powerful in others, but, basically, is different from (thank goodness) brain power, almost in a "species" way. Thanks for these really interesting and provocative comments!
