Eric Schmidt spoke at the Techonomy conference in Lake Tahoe today and dropped some serious rhetorical bombs. “There was 5 exabytes of information created between the dawn of civilization through 2003,” Schmidt said, “but that much information is now created every 2 days, and the pace is increasing…People aren’t ready for the technology revolution that’s going to happen to them.”
The Techonomy conference is a gathering of people from around the globe seeking to use technology to solve the world’s big problems. Schmidt spoke there today and said that people need to get ready for major technology disruption, fast.
The bulk of what’s contributing to this explosion of data, Schmidt says, is user-generated content. From that content, far more prediction than we’ve seen to date is possible, and it will be a major factor in the future.
“If I look at enough of your messaging and your location, and use Artificial Intelligence,” Schmidt said, “we can predict where you are going to go.”
“Show us 14 photos of yourself and we can identify who you are. You think you don’t have 14 photos of yourself on the internet? You’ve got Facebook photos! People will find it’s very useful to have devices that remember what you want to do, because you forgot…But society isn’t ready for the questions that will be raised as a result of user-generated content.”
Beyond personal behavior, Schmidt said, diseases and other crises will become predictable as well.
On the misuse of information for criminal or anti-social purposes:
“The only way to manage this is true transparency and no anonymity. In a world of asynchronous threats, it is too dangerous for there not to be some way to identify you. We need a [verified] name service for people. Governments will demand it.”
How’s that all sound to you? Realistic? Frightening? A combination of both?
The upside? “In our lifetimes,” Schmidt says, “we’ll go from a small number of people having access to information, to 5 billion people having all the world’s knowledge in their native language.” That is truly incredible.
But with the loss of privacy and the hyper-proliferation of predictive technologies, is there a cost in terms of free will? Maybe not, but it certainly seems an appropriate subject of debate.