Sunday, 18 December 2011

Technological Singularity

The technological singularity is a hypothesized event in the future of humanity when artificial intelligence surpasses human intelligence, after which both the immediate and the long-term future become impossible to predict.
Simply put, the moment your robot becomes more intelligent than you will be the technological singularity.

Some people might argue that their smartphone is already more intelligent than them just because it can connect to the Internet and fetch whatever information it is programmed to find. But intelligence has more to do with decision making than with being a compendium of knowledge. A 'guy' called Wikipedia already knows more than any single individual in the world, yet he cannot make decisions in complex situations the way a human can. That barrier will be broken if and when the singularity occurs. After it, robots will have decision-making intelligence greater than ours, and it will therefore be impossible for any normal human to predict what happens next.


Many who study the question expect the singularity to occur within this very century, with around 2040 being a commonly predicted year. In the past, it has been predicted to occur as early as 2010 (which, as far as I know, did not happen) and as late as 'never'. The singularity will not be an event scheduled for a specific date and time; instead, it will be an era, one of hitherto unknown acceleration in knowledge and in the development of technology.
You might not have noticed, but if we look back into history, something like a singularity may already have occurred quite a few times. The earth before humans existed was a very different place than it is today, and the changes humans have brought about must have been unthinkable to the animals that came before us. Can't we think of that as an Evolutionary Singularity? Similarly, looking even further back, the newly formed earth must have been utterly different from the earth on which life first blossomed. That transition, too, could be termed a sort of Ecological Singularity.
Broadly, the technological singularity could take place via any of three scenarios (many more might be possible):
1. IA (Intelligence Amplification) Scenario
The scenario in which human intelligence is augmented or supplemented by technological advances until it reaches superhuman levels. We have already seen the beginning of this: instruments like calculators, computers, and other devices supplement humans and help them perform better.
2. AI (Artificial Intelligence) Scenario
This scenario proposes that humans create humanoids or robots with intelligence far greater than their own. AI is advancing rapidly on all fronts, and there may come a time when it actually overtakes us.
3. Biomedical Scenario
The scenario in which doctors and surgeons devise methods that directly multiply the intelligence of the human mind by improving the neurological functions of the brain.

Today, artificial intelligence is evolving faster than ever. We have advanced computer algorithms for analyzing and manipulating data that were previously unheard of. Our HRD minister wants social networking sites to create image- and text-analyzing algorithms that can detect sarcasm; that would be one major step towards the technological singularity. We have speech recognition software that can analyze human language, 'understand' it, and act accordingly. The personal assistant Siri, shipped by Apple with the iPhone 4S, is one such example: Apple claims that Siri can understand human expressions in their natural form, i.e. people need not learn any special syntax to operate it. Many movies, like the Terminator series and "I, Robot", have also depicted the scenario of singularity in some form or the other.
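To see why sarcasm detection is genuinely hard, here is a deliberately naive sketch (a toy of my own, not how any real product works): it classifies sentiment by keyword matching, and sarcasm defeats it precisely because sarcasm inverts the literal meaning of the words used.

```python
# A deliberately naive, hypothetical sentiment checker based on keywords.
POSITIVE_WORDS = {"great", "love", "wonderful", "fantastic"}

def naive_sentiment(text):
    # Lowercase, strip basic punctuation, and look for 'positive' keywords.
    words = set(text.lower().replace(",", " ").replace(".", " ").split())
    return "positive" if words & POSITIVE_WORDS else "neutral/negative"

print(naive_sentiment("Great, another Monday. I just love waiting in traffic."))
# Prints 'positive', even though the sentence is sarcastic. The tone and
# context that carry the real meaning are invisible to keyword matching,
# which is why detecting sarcasm would be a real milestone.
```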

To see how the singularity could happen, let's assume for a moment that humans are machines. We are really sophisticated machines, made up of billions and billions of biomolecules that interact according to well-defined, though not completely known, rules derived from physics and chemistry. The biomolecular interactions taking place inside our heads give rise to our intellect, our feelings, and our sense of self.
Accepting this hypothesis opens up a remarkable possibility. If we really are machines, and if we learn the rules governing our brains, then in principle there is no reason why we shouldn't be able to replicate those rules in, say, silicon and steel. I believe such a creation would exhibit genuine human-level intelligence, emotions, and even consciousness.

Another way the singularity could happen is through the creation of self-improving code. Currently, any source code or algorithm needs human intervention to improve. If code could improve itself, there would be no need for humans to do it: the code would identify its own shortcomings and fix them, and this loop of recursive improvement would, at some point, produce super-intelligent machines able to beat humans at their own game. A toy sketch of such a loop follows.
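Here is a minimal toy sketch of that recursive-improvement loop, under a big simplifying assumption: instead of rewriting its own source code, the 'program' here is just a list of numeric parameters, and 'improving itself' means mutating them and keeping whichever variant scores best. The performance function is entirely made up for illustration.

```python
import random

def performance(params):
    """Hypothetical capability score: higher means a 'smarter' program.

    The peak sits at [3, -1, 2]; distance from it lowers the score."""
    target = [3, -1, 2]
    return -sum((p - t) ** 2 for p, t in zip(params, target))

def improve(params):
    """One round of self-improvement: propose random mutations of the
    current program and keep whichever candidate performs best."""
    candidates = [params] + [
        [p + random.uniform(-0.5, 0.5) for p in params] for _ in range(20)
    ]
    return max(candidates, key=performance)

program = [0.0, 0.0, 0.0]        # the initial, 'unintelligent' program
for generation in range(100):    # the recursive-improvement loop
    program = improve(program)   # each output becomes the next input

print(program, performance(program))  # converges towards the peak
```

The essential feature is the feedback loop: each improved version becomes the starting point for the next round, so the gains compound, which is exactly the mechanism claimed to lead to an intelligence explosion.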

We humans cannot imagine the post-singularity world, precisely because it will be a product of superhuman intelligence. However, we can speculate about whether it will be good for humanity or devastating for it.
Having superhumanly intelligent machines could lead to a world similar to today's, but with a small role reversal. The position we enjoy today as the supreme authority on earth would then be enjoyed by the machines: they would take over all development on the planet, and we would merely become cogs in the wheel. This is not necessarily a bad thing. It is quite possible that those machines would help us understand far more about nature and the universe as a whole, and help us develop as well. Even today we know little about the secrets of faraway galaxies and have no way to reach them; we still treat the speed of light as the ultimate limit; we still know little about the God particle; and many more mysteries remain unsolved. Maybe those machines will help us uncover them. Maybe they'll help us find cures for the many incurable diseases that claim millions of lives each year.
The converse might take place as well. It is possible that those machines would eradicate humanity once and for all to claim a larger portion of land. We are already seven billion strong, and there is not much room left anywhere for a 'species' more intelligent than us to co-exist with in complete peace and harmony. This is similar to what we do today: we clear forests and kill wildlife to make more space for ourselves. So if those machines kill us, they will be just as 'justified' as we are.

We cannot avoid the technological singularity, nor do we need to. If we want technology to progress any further, we have to keep working to improve it; scientists cannot sit at home and do nothing for fear of creating something more intelligent than themselves.
Society, the economy, and humanity all need technological progress in order to grow. But who knows? The very same thing might be the reason for their downfall.
