What happens next is uncertain, and it has been the subject of speculation since the dawn of computing. The mathematician I. J. Good hypothesized that "ultraintelligent" machines, once created, could design even better machines. "Thus the first ultraintelligent machine is the last invention that man need ever make," he wrote.
Buzz about the coming singularity has escalated to such a pitch that there's even a book coming out next month, "Singularity Rising" (BenBella Books), by James Miller, an associate professor of economics at Smith College, about how to survive in a post-singularity world. But not everyone puts stock in this notion of a singularity, or thinks we'll ever reach it. Perhaps, skeptics suggest, without sensory inputs from the outside world, computers could never become self-aware.
Others argue that Moore's law will soon start to break down, or that it already has. The argument stems from the fact that engineers can't miniaturize transistors much further, because they are already pushing atomic limits. At the atomic scale, bizarre quantum effects set in: transistors no longer hold a single state represented by a "1" or a "0," but instead vacillate unpredictably between the two, rendering circuits and data storage unreliable. The other limiting factor, Denning says, is that transistors give off heat when they switch between states, and when too many transistors, regardless of their size, are crammed onto a single silicon chip, the heat they collectively emit melts the chip.
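To get a sense of why "pushing atomic limits" isn't hyperbole, here is a rough back-of-the-envelope sketch in Python. The starting process node, starting year and silicon atom spacing are illustrative assumptions of mine, not figures from the article.

```python
# Rough back-of-the-envelope sketch of how quickly Moore's-law-style scaling
# runs into atomic limits. The starting node (22 nm), starting year (2012)
# and silicon atom spacing (~0.2 nm) are illustrative assumptions.

feature_nm = 22.0        # assumed starting process node, in nanometres
atom_spacing_nm = 0.2    # rough spacing between neighbouring silicon atoms
year = 2012              # assumed starting year

# Doubling transistor density every two years shrinks the linear feature
# size by a factor of about sqrt(2) per two-year generation.
while feature_nm > atom_spacing_nm:
    feature_nm /= 2 ** 0.5
    year += 2
    print(f"{year}: features around {feature_nm:.2f} nm")

print(f"By roughly {year}, a 'transistor' would be about one atom wide.")
```

Under those assumptions the scaling runs out of atoms within a couple of decades, which is the intuition behind the end-of-Moore's-law argument.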
For these reasons, some scientists say computing power is approaching its zenith. But if that's the case, it's news to many. Doyne Farmer, a professor of mathematics at Oxford University who studies the evolution of technology, says there is little evidence for an end to Moore's law. Computers continue to grow more powerful as they become more brain-like, he says: they can already perform individual operations orders of magnitude faster than humans can, while the human brain remains far superior at parallel processing, or performing multiple operations at once.
For most of the past half-century, engineers made computers faster by increasing the number of transistors in their processors, but only recently did they begin "parallelizing" computer processors. To work around the fact that individual processors can't be packed with many more transistors, engineers have begun raising computing power by building multi-core processors, or systems of chips that perform calculations in parallel.
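As a concrete illustration of what "performing calculations in parallel" means, here is a minimal Python sketch that splits a made-up workload, counting primes below a limit, across one worker process per CPU core. The workload and numbers are hypothetical examples, not from the article.

```python
# Minimal sketch of multi-core parallelism: the same workload is divided into
# chunks and handed to one worker process per CPU core, using only the
# Python standard library. The prime-counting task is a deliberately naive,
# made-up example.
from multiprocessing import Pool, cpu_count

def count_primes(bounds):
    """Count primes in [lo, hi) by trial division."""
    lo, hi = bounds
    total = 0
    for n in range(max(lo, 2), hi):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            total += 1
    return total

if __name__ == "__main__":
    limit, cores = 200_000, cpu_count()
    step = limit // cores
    # One (lo, hi) chunk per core; the last chunk absorbs any remainder.
    chunks = [(i * step, limit if i == cores - 1 else (i + 1) * step)
              for i in range(cores)]
    with Pool(processes=cores) as pool:
        print("primes found:", sum(pool.map(count_primes, chunks)))
```

Each chunk runs on its own core, so the wall-clock time drops roughly in proportion to the number of cores, which is the same trade-off multi-core chips exploit in hardware.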
You can imagine a robot arm in a factory that automatically reconfigures itself when the object it is putting into boxes changes shape. Every sector is changing, and even the lines between industry sectors are becoming blurred, as 3D printing and machine learning come together, as manufacturing and information come together, or as manufacturing and the body come together. What needs to be done to ensure that the benefits of these technologies are maximized and the associated risks kept under control?
If you think about the future of computing as a convergence of the biological, the physical and the digital (and the post-digital: quantum), using as examples 3D printing, biotechnology, robotics for prosthetics, the internet of things, autonomous vehicles and other kinds of artificial intelligence, you can see the extent to which life will change.
We need to make sure that these developments benefit all of society: not just the wealthiest members who might want these prosthetics, but every person who needs them. One of our first questions in the Council is going to be: how do we establish governance for equitable innovation? How do we foster the equitable benefits of these technologies for every nation and every person in every nation? And is top-down governance the right model for controlling the use of these technologies, or is bottom-up ethical education of those who develop and distribute them a better way to ensure equitable use?
I believe that all technologists need to keep in mind a multi-level, multi-part model of technology, one that takes into account not only the technological but also the social, the cultural and the legal aspects of development. All technologists need to be trained in the human as well as the technological, so that they understand the uses to which their technology could be put and reflect on the uses they want it put to. As for what computing will look like further out, we have no idea yet, because change is happening so quickly.
We know that quantum computing, the introduction of quantum physics into the field of computer science, is going to be extremely important, and that computers are going to become really very tiny, the size of an atom.
My own work goes toward ensuring that social bonds, the relationships among people, and even the relationship between us and our technology support a social infrastructure, so that we never forget the values that make us human.
Setting aside the artificial intelligence debate for a moment, what might futuristic computers look like?
They might actually be invisible. Pervasive computing is a type of technology that incorporates computers into just about anything you can imagine.
Buildings, highways, vehicles and even the clothing you wear might have built-in computer elements. Coupled with networking technology, the world of the future may be one in which the very environment around you is part of a massive computing system.
In such a world, your digital life and your real life could overlap seamlessly. We see hints of this world in today's technology. There are hundreds of smartphone applications that add a digital layer over our perception of the real world.
They might help you navigate around a strange city or discover a new favorite restaurant tucked away in a corner somewhere. These applications still require us to activate programs on mobile devices and use those devices as a lens through which we can see the digital world. In the future, we may be able to accomplish the same thing using glasses, contact lenses or perhaps even ocular implants.
Imagine being able to look at the world through one of a million different filters, all of which provide different kinds of information to you instantaneously. Then again, it's possible that our ingenuity won't be enough to keep up with Moore's law after a few more microprocessor generations. Perhaps our computers will be more mundane and functional. But considering the way they've transformed our world over the last 50 years, I'm willing to bet the future will be an exotic, digital era.
What do you think? We want to hear what your predictions are for the future of computing. Share your ideas in our comments section!