Technological Singularity
Hi Everyone,
Has anyone been looking at the technological singularity idea - and more particularly the possible time frames?
It has not been something I have taken seriously, but the more I think about it and its underlying premise (that technologies of various kinds are growing more sophisticated at an exponentially accelerating rate), the more plausible a drastic event in the near future seems.
Useful links :
http://en.wikipedia.org/wiki/Technological_singularity
http://en.wikipedia.org/wiki/Ray_Kurzweil
The reason I think this is important to these forums is that it has the potential to make humanity obsolete.
With regard to artificial intelligence taking over the world as in Terminator, it seems unlikely at best. There's a big distinction between a program that doesn't perform as intended and a sentient, willful, technological intelligence. The former is quite common, but technology produced for defense contracts tends to be thoroughly scrutinized for quality control, at least in the U.S. The latter seems close to impossible, especially as an accident. Intelligence alone produces neither an understanding of the world nor goals to work towards.
I consider it much more likely that technology would be used to allow humans to break the performance limitations imposed by a gradual process of evolution as per Ghost in the Shell or Gattaca. Whether or not this is a good thing is an entirely different question, but one that might be best answered with "good according to whom?"
I think humanity will be fine as long as we don't trap two superintelligent robots in a lab and tell them they can consume any source of energy around, as long as they avoid downloading the knowledge of good and evil from the ethernet port (a broadband Internet connection).
The last thing we want is robots that want larger penises while carrying around terabytes of porn and crying about how unfair the mass media is to Britney Spears.
But seriously, I think this concept is great fodder for stories. Just as there are fears that stem cell research will lead to clones, wars, and the end of humanity, this is yet another vehicle for how we are going to wipe ourselves off the planet.
Let's not innovate anymore because our technology will turn on us one day and supersede us.
Remember how you figured out there is no Santa? Well, their god is just like Santa. They just haven’t figured out he’s not real yet.
Before the "singularity" can happen we need a breakthrough in technology.
All of our current technology really comes down to a single component.
The transistor.
We've been using the transistor for over 60 years. The one real thing that has changed has been the size and number of the transistors in our technology.
We are rapidly approaching the limits of the transistor's capability.
We need to come up with the next thing, the way the transistor replaced the vacuum tube.
"I am an atheist, thank God." -Oriana Fallaci
Our circuits are still rather 2d...
"What right have you to condemn a murderer if you assume him necessary to "God's plan"? What logic can command the return of stolen property, or the branding of a thief, if the Almighty decreed it?"
-- The Economic Tendency of Freethought
First off, I dislike the term "technological singularity". I have studied both computers and physics my entire life; professionally I took the computer route, although physics has been my second favorite field of study. I have read about this quite a bit, but the terminology has never made sense to me. I do not see how any definition of the word "singularity" fits this concept, so maybe it was just a cool catchphrase someone picked.
Terminology aside, I have held a similar belief for a long time. At some point, technology will overcome human abilities. We are already witnessing our physical abilities being taken over by mechanical means. You could view this in two ways. One is that we use cars instead of running, and we use guns (or even nukes) instead of choking people to death; these examples demonstrate technological progress taking over our physical abilities. If you deny that as an example, a second example is how the Olympics recently barred a double amputee because his prosthetic legs gave him an unfair advantage (reportedly as much as 30% over able-bodied athletes). I don't think anyone is really disputing that we will be physically beaten by mechanical means.
For technology to overcome people in a mental respect is a whole different story. I would say that Deep Blue beating the best human chess player is not a good example of this, because the computer is not actually thinking; it is following a known algorithm for finding the best moves (even if it were learning to play, that learning process would be a pre-programmed method specific to chess). For real progress in artificial intelligence, we will need to develop a new method of storing and processing data (as watcher pointed out). I think this will come in the form of integrating computers and living things. There are other possibilities, and many companies are researching interesting data processing and storage techniques, such as holographic storage, memory based on electron spin, and even quantum computers.
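Just to make "following a known algorithm" concrete, here is a toy sketch of minimax search (my own illustration in Python, using tic-tac-toe so the whole game tree fits in memory; it is obviously not Deep Blue's code, which layered massive hand-tuned evaluation and pruning on top of the same basic idea):

# Minimax: mechanically score every reachable position and pick the move
# that maximizes your worst-case outcome. No understanding involved.

def winner(board):
    """Return 'X' or 'O' if someone has three in a row, else None."""
    lines = [(0, 1, 2), (3, 4, 5), (6, 7, 8),   # rows
             (0, 3, 6), (1, 4, 7), (2, 5, 8),   # columns
             (0, 4, 8), (2, 4, 6)]              # diagonals
    for a, b, c in lines:
        if board[a] != ' ' and board[a] == board[b] == board[c]:
            return board[a]
    return None

def minimax(board, player):
    """Return (score, move) from player's point of view: +1 win, 0 draw, -1 loss."""
    w = winner(board)
    if w is not None:
        return (1 if w == player else -1), None
    moves = [i for i, cell in enumerate(board) if cell == ' ']
    if not moves:
        return 0, None                          # board full: draw
    best_score, best_move = -2, None
    opponent = 'O' if player == 'X' else 'X'
    for m in moves:
        child = board[:m] + player + board[m + 1:]
        score, _ = minimax(child, opponent)
        score = -score                          # the opponent's gain is our loss
        if score > best_score:
            best_score, best_move = score, m
    return best_score, best_move

score, move = minimax(' ' * 9, 'X')
print("best opening move for X:", move, "expected outcome:", score)  # 0 means a forced draw

The point is that nothing in there resembles thought; it is exhaustive bookkeeping, and a chess engine is the same bookkeeping with a far cleverer scoring function.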
There has been some great progress in integrating electronics with living things, including the human brain. If anyone is not aware of this progress, I strongly suggest researching it further (see the links below). If this progress continues, we could reach this "technological singularity" as a cyborg species, if not with a completely independent computer; or perhaps the cyborgs could create the intelligent computer.
Here's somewhere to start the research:
brain-computer interface:
http://en.wikipedia.org/wiki/Brain-computer_interface
remote-controlled humans (some reports include video):
http://www.foxnews.com/story/0,2933,173500,00.html
Microsoft mind-reading patent:
http://www.newscientist.com/blog/invention/2007/10/microsoft-mind-reading.html
Sony brain-sensation control patent:
http://www.newscientist.com/article.ns?id=mg18624944.600
If you consider any of these sources unreliable, double and triple verify them with google searches. I personally don't trust anything from any single source, no matter how reputable it is.
As I understand it, they are attempting to convey that there will be a point in time when the rate at which technology changes is so extreme that no useful predictions about what follows are possible - the equivalent of attempting to see past the event horizon of a black hole.
Be that as it may, I know from my own experience that the rate at which technology changes is somewhat surprising. (I remember, for instance, that the first IBM computer I owned had two 360 KB floppy disk drives and CGA graphics, and the first PC I owned with a hard disk had only 10 MB.) Trying to imagine where we are today from that vantage point would have been very difficult.
We are progressing not only in our ability to create simulations of mental processes but also in creating synthetic, DNA-based life. We are getting better at imaging real-time brain activity (which is part of the basis for reverse engineering a brain). And this is before we even get to quantum and nanoscale technologies.
I think a few breakthroughs in any of these technologies will have a profound impact on the future.
That isn't actually what made me rethink this. What is interesting is the basic idea that, regardless of anything else, the rate at which technology has been developing is consistent with an exponential function. So regardless of how fast things changed before, they will change considerably faster in the future.
One of the examples Ray Kurzweil uses is the way the human genome was mapped. Something like 80% of the mapping happened in the last quarter of the project's duration.
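As a sanity check on that figure, here is some back-of-the-envelope arithmetic (my own numbers, assuming sequencing output doubles roughly once a year over a 13-year project, which is about how long the Human Genome Project ran):

# Under exponential growth, most of the total output lands at the very end.
years = 13.0
doubling_time = 1.0                                   # assumed: one doubling per year
total = 2 ** (years / doubling_time)                  # cumulative output at the finish
before_last_quarter = 2 ** (0.75 * years / doubling_time)
fraction_in_last_quarter = 1 - before_last_quarter / total
print(f"{fraction_in_last_quarter:.0%} of the output falls in the final quarter")  # about 90%

So "something like 80% in the last quarter" is not a surprise at all; it is exactly what an exponential predicts.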
BTW thanks for the replies they are all very interesting and it is great to have a place where you can get intelligent responses to a question or two!
A fine idea. I've watched some of Ray Kurzweil's lectures on YouTube before, and I'm about halfway through one of his books, "The Singularity Is Near". I love this man; he is my hero. It is very fascinating when he speaks of the exponential growth not only of technology but of evolution as well.

He predicts that we will merge with machines. When he says merge, I think he means nanomachines will prevent us from dying (if we desire it; as he says, "Death is a tragedy"), cure disease, and enhance our minds and bodies. He also talks about reverse engineering the human brain with either nanomachines or a brain-scanning method. Nanomachines will enable vastly more powerful computers to be constructed and enable simulation of the human mind. The idea of nanomachines is very fascinating and something I would love to get into myself, maybe in graduate studies. I've come to understand its vast potential for good in the world: cheaper products, dirt-cheap health care, an end to famine, and many other things. He also talks about modeling the human mind to see what each area does specifically, in order to create strong AI.

As for the timeframe, I think he discusses that in another of his books, "Live Long Enough to Live Forever" or something like that. I've been meaning to buy it because I think it discusses ways to stay healthy through your diet. I think he projects, at the current rate of exponential growth of certain technologies, about 50 years for either increased longevity or immortality; I can't remember exactly.
Doubt is the root of all wisdom. - Unknown
Knowing will come from the practice of understanding - Myself
Are you suggesting we stack layers of transistors? The heat build-up would destroy the inner transistors within a minute of operation.
"I am an atheist, thank God." -Oriana Fallaci
I don't think this is true. We do not have a clear grasp of what true intelligence is (a few hundred years ago, being able to do maths quickly would have demonstrated what people call intelligence). Also, most AI techniques used right now are treated as black boxes and cannot easily be checked for flaws; neural networks and genetic algorithms are two common examples. Once we create an AI whose intelligence surpasses our own, I doubt we will have any control over the way it thinks or the conclusions it comes to. It will be too complex for us to understand except on a very high level.
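To make the black-box point concrete, here is a toy illustration of my own (a tiny neural network trained on XOR with numpy; it has nothing to do with any production system). It learns the task, and afterwards its weights are effective but essentially unreadable:

# A minimal two-layer network trained on XOR by plain gradient descent.
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

W1, b1 = rng.normal(size=(2, 8)), np.zeros(8)   # 8 hidden units
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)

lr = 1.0
for step in range(20000):
    h = sigmoid(X @ W1 + b1)        # hidden activations
    out = sigmoid(h @ W2 + b2)      # network output
    err = out - y
    d_out = err * out * (1 - out)   # backpropagation: chain rule, layer by layer
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * (h.T @ d_out); b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * (X.T @ d_h);   b1 -= lr * d_h.sum(axis=0)

print("predictions:", out.round(2).ravel())      # should end up close to [0, 1, 1, 0]
print("hidden weights:\n", W1.round(2))          # correct, but nothing you could audit

The printed weight matrices solve the problem, yet there is no rule in them that you could inspect and check for flaws, which is exactly the difficulty I mean.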
Having said that, I don't think a Terminator situation will happen either, unless we deliberately design computers that are self-interested and dominance-oriented. While we won't be able to control computers' thoughts and actions, we will at least be able to control their high-level behaviour, much as our own emotions and instincts control us.
As for the technological singularity (if we treat it as the point where computers can make better versions of themselves), I think it's useless making predictions. AI researchers have been wrong time and time again about how easy or hard things are to do, simply because we don't understand the problem of intelligence well enough.
There have been some interesting developments over the past few decades, but all of our AI research really boils down to some simple but clever mathematical and logical techniques that give some semblance of limited intelligence in certain areas. If we really want intelligence in the human sense, we should be looking to neuroscience, rather than the current techniques of AI, to solve the problem.
On the other hand, the most promising AI research I've read about so far for capturing human intelligence is described here: http://en.wikipedia.org/wiki/Open_Mind_Common_Sense
This approach seems flawed to me, but it is still a better approach to human intelligence than any other current technology.
Am I ever glad I stumbled upon this place.
After reading thoroughly through "The Singularity Is Near" and sending Kurzweil some emails, I have come to the conclusion that he is indeed right. Moore's Law describes the doubling of processing power about every year and a half, and the time between doublings is shrinking constantly. Kurzweil also points out that once sufficient research and progress has been made in neuroscience, we will be able to map and reverse engineer the brain.
Which is one of the reasons I want to become a neuroscientist.
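For what it's worth, the arithmetic behind that doubling claim is trivial (assuming the roughly 18-month doubling time actually holds, which is of course the very thing being debated):

# Compounding at one doubling every 1.5 years
doubling_time_years = 1.5
for horizon in (10, 20, 30):
    factor = 2 ** (horizon / doubling_time_years)
    print(f"{horizon} years -> about {factor:,.0f}x the processing power")
# roughly 100x, 10,000x and 1,000,000x respectively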
"Religion is regarded by the common as true, the wise as false and the rulers as useful"