Is it stupid to believe in the singularity?
(By singularity I mean, roughly, any future society so complex and advanced that it would be impossible to understand from our current vantage point. According to most proponents of this idea, it would likely involve the development of a superintelligence far removed from anything we know today.)
I'm fairly young, so I frequently fantasize that humanity will make incredible leaps forward within my lifetime. I can't help but dream that most of our problems will be solved in a matter of decades. I have held this belief for most of my short existence, but I am now starting to have doubts.
The main problem is that technology doesn't seem to be progressing as fast as it should. Sure, we get a new smartphone model every six months, but what about more fundamental long-term research that doesn't have immediate commercial applications? I don't have enough information or scientific education to judge whether we are heading in the right direction, but it does seem that A.I. and genetic manipulation research are progressing at a snail's pace.
A lesser but perhaps even more depressing problem is that belief in the singularity itself is starting to look awkwardly like a form of religion:
1) A belief in a second, better life in the future whose nature is incomprehensible to "mortals".
2) A low evidence-to-speculation ratio
3) Worship (in this case, technology or those who create it)
4) A "chosen few" mentality. (For example, the singularity benefitting only scientists or people who saw it coming and planned accordingly i.e. Kurzweil taking 40 pills a day to live long enough) This is extremely common in internet discussions around the subject. I had this same attitude myself until recently, patting myself on the back for being "in the know".
So do you think it is stupid to believe in the singularity? Am I delusional? I really need some outside perspective here.
You have a strange definition of the singularity. The way I understand it, the singularity is the point in time when AI becomes "smarter" than humans and begins a process of self-improvement. The big assumption here is that AI will actually be "like" humans, with goals and the motivation to achieve those goals.
However, if we assume that to be the case, AI will quickly become so smart that we will no longer be able to predict any of its actions, and therefore we will be unable to see the future beyond that point.
It's called a 'singularity' because it's analogous to a gravitational singularity in physics: a region in time (the Big Bang) or space (a black hole) where matter changes so much that we are unable to see what's going on.
As for your question: yes, it's pretty stupid to "believe" in the singularity. You should believe in yourself. You can change the world. Do you think technological progress is good? Do you want it to keep happening? Then do something about it. Learn, invent, make it happen.
People who think they are "in the know" but sit on their asses waiting for someone else to do it look pretty stupid.