Petaflop Computer
Has anybody heard about the computer that broke the petaflop barrier? We're one step closer to robots taking over the world.
Which one did you have in mind? Roadrunner or Jaguar?
I hadn't heard about Jaguar. I'll have to look it up.
Jesus H Christ. 1.64 petaflops! We're on the precipice of a huge shift in computing power. Just look at the specs on this beast! If only we could break the memory wall...
Wait, why are we calling the move into petaflops "breaking the petaflop barrier"? I'm fairly certain the next fastest supercomputer below Roadrunner, Blue Gene, maxed out below 500 teraflops. That's quite a gap to cover in one year (Roadrunner hit 1.1 PFLOPS in 2008), and quite a gap to cover if there really was a "barrier" at all. I do wonder why it gets termed that way. There are no fewer than three supercomputers I could find that have performed a sustained petaflop... some barrier.
Now the memory wall should be known as the memory barrier. There are very real physical problems with the speed of memory off the CPU, and of course with the speed between CPUs and memory. Too small and too hot... which would also be a problem for achieving petaflop performance if it weren't possible to just link many, many processors together until you get the desired performance (I know it's a bit more complicated than that).
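Just to put rough numbers on the "link many processors together" point: theoretical peak scales linearly with processor count, while sustained performance doesn't. A back-of-the-envelope sketch (every figure below is invented for illustration, not any real machine's configuration):

/* Peak-FLOPS arithmetic sketch -- all numbers are made up,
 * not Roadrunner's or any other machine's real specs. */
#include <stdio.h>

int main(void)
{
    double nodes           = 3000.0;  /* hypothetical node count      */
    double cores_per_node  = 8.0;     /* hypothetical cores per node  */
    double flops_per_cycle = 4.0;     /* hypothetical FLOPs per cycle */
    double clock_hz        = 3.2e9;   /* hypothetical clock rate      */

    double peak = nodes * cores_per_node * flops_per_cycle * clock_hz;
    printf("Theoretical peak: %.2f PFLOPS\n", peak / 1e15);
    /* Sustained Linpack numbers come in well under this kind of
     * figure, which is why "just add more processors" is easier
     * said than done. */
    return 0;
}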
If you want to consider distributed computing, Folding@Home is currently running at about 4.5 petaflops.
I called it a memory wall because a "barrier" usually refers to something that will soon be, or is often broken. Like the sound barrier. The term "wall" has a stronger connotation. The memory wall is going to be very hard to surmount. Not only are there the physical problems to overcome but there's also a lot of "catch-up" because it's lagged behind for so long. It's going to take a lot of innovation and research.
Meh, perhaps it's a regional connotation thing, but a wall is weaker than a barrier as far as I'm concerned.
Well, I don't suppose that the distinction between a wall and a barrier is all that important here. Suffice to say that there is one whatever it is called.
Actually, whatever term one uses, it should be noted that these terms tend to come from journalists trying to explain matters to the great unwashed masses. In reality, the people doing the work often have the problems and some of the solutions identified before the journalists start writing.
The idea of a sound barrier is a point on which this can be shown. It has its origin in the WW2 practice of delivering bombs on targets by taking the aircraft up fairly high and then power-diving at the target, pulling out of the dive after the bombs were released. Many pilots found that the aircraft of the day became unstable as they approached Mach 1.
The engineers were well aware of this and were working on solutions. And in fact, Chuck Yeager clearly broke the speed of sound in late 1947 (the exact date is not clear because the measurements on some of the test flights may not have been accurate enough to say when the first trans-sonic flight actually was). Also, there are reports from at least one German pilot that match well with later observations to suggest that the speed of sound may have been broken before the end of the war.
But I digress...
Being a hardware junkie myself, I can tell you that it is somewhat inaccurate to speak of a memory wall. We know what the problem and its cause are. The limiting factor is actually in the design of modern motherboards. At the >300 MHz bus speeds of today's motherboards, all of those circuit traces that connect the memory sockets to the processor have to be very carefully engineered to exacting lengths.
The signals travel along those traces at a substantial fraction of the speed of light, and even slight variations in the manufacturing process would cause the "words" of memory to not all arrive at the destination at the same time, forcing what are known as "wait states" (basically down time while the hardware catches up to itself).
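To put rough numbers on that (the bus clock, trace speed, and mismatch below are assumptions for illustration): at a few hundred MHz the clock period is only a couple of nanoseconds, and a signal covers on the order of 15 cm per nanosecond on a typical trace, so even a couple of centimetres of length mismatch eats a noticeable slice of the timing budget:

/* Rough trace-skew arithmetic behind the "exacting lengths" claim.
 * The clock, trace speed, and mismatch values are illustrative. */
#include <stdio.h>

int main(void)
{
    double clock_hz    = 400e6;  /* assumed bus clock                 */
    double period_ns   = 1e9 / clock_hz;
    double trace_speed = 15.0;   /* roughly c/2 on a PCB, in cm/ns    */
    double mismatch_cm = 2.0;    /* assumed trace-length mismatch     */

    double skew_ns = mismatch_cm / trace_speed;
    printf("Clock period: %.2f ns\n", period_ns);
    printf("Skew from %.1f cm mismatch: %.3f ns (%.0f%% of the period)\n",
           mismatch_cm, skew_ns, 100.0 * skew_ns / period_ns);
    return 0;
}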
That being said, if you want to build yourself a really fast computer, one of the more important numbers to look for is how much cache memory is in the processor itself. After all, that memory can communicate with the core logic directly, without going through the bottleneck known as the motherboard.
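A quick way to see the cache effect for yourself is to touch the same data in a cache-friendly order and a cache-hostile one and time both. A minimal sketch (the array size and stride are arbitrary choices, and exact timings will vary by machine):

/* Tiny demo of why on-die cache matters: walk the same array
 * sequentially (cache-friendly) and with a large stride
 * (cache-hostile) and compare the wall-clock time. */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N (1 << 24)     /* 16M ints, about 64 MB                   */
#define STRIDE 4096     /* jump far enough apart to miss the cache */

int main(void)
{
    int *a = malloc((size_t)N * sizeof *a);
    if (!a) return 1;
    for (long i = 0; i < N; i++) a[i] = 1;

    long sum = 0;
    clock_t t0 = clock();
    for (long i = 0; i < N; i++) sum += a[i];            /* sequential */
    clock_t t1 = clock();
    for (long s = 0; s < STRIDE; s++)                    /* strided    */
        for (long i = s; i < N; i += STRIDE) sum += a[i];
    clock_t t2 = clock();

    printf("sum=%ld sequential=%.3fs strided=%.3fs\n", sum,
           (double)(t1 - t0) / CLOCKS_PER_SEC,
           (double)(t2 - t1) / CLOCKS_PER_SEC);
    free(a);
    return 0;
}

Both loops read every element exactly once; the only difference is the order, and the strided version pays for it by constantly missing in cache and going out over the motherboard to main memory.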
Past that, I also buy most of my parts in OEM (aka white box) packaging, as that lets me pick my parts by their manufacturing codes. Knowing those, I can pick the ones from known good production runs. But that is just me.
As for where the researchers are at, integrating memory and core logic is where it's at for removing the memory wall. Not only do Intel and AMD tend to stay locked in a race to do more of this with each generation, but memory is headed in that direction too, with "processor in memory" being the probable next big thing. At least two universities have reached "tape out" on the first batches, which basically means that they can now send specific orders to chip foundries for test samples.
I don't think that motherboards will be phased out entirely in the foreseeable future. However, with parallel processing becoming more mainstream every day and predictive execution already fairly standard, the RAM, CPU and GPU(s) in the next couple of generations will be much better able to do the various pieces of processing work on the specific chips that are most suited to each piece, and then use the motherboard to pass around not so much the raw data as the completed work. That should bring us at least one more order of magnitude in home computing power.
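The "pass around completed work rather than raw data" idea is basically local reduction: each processing unit chews through its own slice of the data and only the finished result crosses the interconnect. A minimal sketch of the pattern, with plain POSIX threads standing in for the different chips (compile with -pthread):

/* Each worker reduces its own slice locally; only one partial
 * result per worker is passed back and combined at the end. */
#include <pthread.h>
#include <stdio.h>

#define N        (1 << 22)
#define NWORKERS 4

static int data[N];

struct slice { long start, end, partial; };

static void *reduce_slice(void *arg)
{
    struct slice *s = arg;
    long sum = 0;
    for (long i = s->start; i < s->end; i++)
        sum += data[i];                 /* heavy lifting stays local   */
    s->partial = sum;                   /* only this one value travels */
    return NULL;
}

int main(void)
{
    for (long i = 0; i < N; i++) data[i] = 1;

    pthread_t tid[NWORKERS];
    struct slice sl[NWORKERS];
    long chunk = N / NWORKERS;

    for (int w = 0; w < NWORKERS; w++) {
        sl[w].start = w * chunk;
        sl[w].end   = (w == NWORKERS - 1) ? N : (w + 1) * chunk;
        pthread_create(&tid[w], NULL, reduce_slice, &sl[w]);
    }

    long total = 0;
    for (int w = 0; w < NWORKERS; w++) {
        pthread_join(tid[w], NULL);
        total += sl[w].partial;         /* combine the completed work  */
    }
    printf("total = %ld\n", total);
    return 0;
}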
Isn't there also a particular breakthrough with z-ram? I understand it can't be used in place of regular memory yet, but it's being worked onto processors, and it is very powerful.
SO WHEN DO I GET TO PLAY SUPCOM AT 60 FPS ???
Well, a quick search for z-ram did not turn up any really recent articles, so it is possible that the technology has stalled. Then again, stalls are nothing new at the bleeding edge. Heck, z-ram only became something worth considering because of the earlier stall in SOI fabrication.
Although I must say that having a 100 MB cache on die does sound like we might be heading toward the day when games really are indistinguishable from hi-res photography.
It'll be before robots take over the world, that's for sure.
"Hey robot! Check out my EMP generator!" Click.