Some people fear that some form of artificial intelligence will take over the world and kill everyone. That is the most extreme outcome imaginable, but it is usually our worst fears that motivate us most. If we were to create a true artificial intelligence, one actually able to think for itself and then reproduce itself, we would have done something amazing. But the natural processes of evolution have already done that; you and I are here because of them. Such an intelligence probably would not exist without our intervention, just as we would not exist had it not been for the evolution of plant life, which filled the atmosphere with the oxygen we are breathing at this and every moment. So what if we create something more capable than ourselves? Wouldn’t that be incredible?

But then there’s the fear of being taken over, or whatever you want to call it. I don’t think this is likely, but I would be foolish not to acknowledge the possibility. Even Stephen Hawking is thinking seriously about it. He suggests we bolster our understanding of genetic engineering as a way of keeping artificial intelligence in check: “Computers double their performance every 18 months. So the danger is real that they could develop intelligence and take over the world” (Newsweek, 2001).