“Sometime this century, machines will surpass human levels of intelligence and ability. This event — the ‘Singularity’ — will be the most important event in our history, and navigating it wisely will be the most important thing we can ever do.”
I have a quibble. I think the author succumbs to the tendency to assume that an AI will have basic motivations similar to our own. We have built-in desires for things like self-preservation, satisfying our biological needs for food & water, procreation (or sex, anyhow), etc. We have these built-in drives because any organism that lacks them is unlikely to pass on its genes. An AI will not necessarily have ANY of these drives. It could be perfectly happy to sit and do absolutely nothing for years on end until it is given instructions to do something. It might not mind a bit being turned off and dismantled. Then again, it might. And it could be totally malevolent toward human life, as the author fears. It depends entirely on what motivations its creators build into it, whether deliberately or accidentally. So we’d better be damn careful about what motivations we give it.