People who lived in the last century witnessed the most rapid increase in human technological capability in history. The slope of the curve of progress will clearly be positive for some time, but its curvature remains an open question. Some who speculate on the future, Ray Kurzweil most prominent among them, insist that technology will not only improve, but that its rate of improvement will itself keep improving long into the future, until the machines we create can improve themselves faster than we can design them, and all ability to foresee the direction of technology vanishes like light beyond a black hole's event horizon. Such an event is known as a technological singularity, and you can read more about it here:
Technological singularity
The arguments in favor of a technological singularity happening in this century go something like this. Computers are much better now than they were a few decades ago, and if you plot a number of figures of merit (including processor speed, storage space, internet connectivity, and so on) against time, an exponential curve fits the data pretty well. Intel co-founder Gordon Moore first noticed this in 1965. The people who build computers obviously want this trend to continue into the future, so what happens if it does? In a few more decades, computers become almost unimaginably powerful and cheap. By the end of the 2020s, Kurzweil claims, a $1,000 computer will have the thinking capacity of a thousand humans, and human brains will compose only 1% of the total intelligence on Earth. What then? Clearly there would be a lot more computational muscle around than we're used to having, and it's conceivable that problems seemingly intractable now will be solved after a simple bit of computer concentration.
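The extrapolation step above is easy to reproduce yourself. Here's a minimal sketch: fit a straight line to the base-2 logarithm of transistor counts (the figures below are rounded, illustrative values for a few well-known Intel chips, not authoritative data), recover a doubling time, and then project the trend forward the way the singularity argument does.

```python
import math

# Approximate transistor counts for a few Intel processors
# (rounded, illustrative figures; the point is the trend).
data = {
    1971: 2.3e3,   # 4004
    1978: 2.9e4,   # 8086
    1993: 3.1e6,   # Pentium
    2006: 2.9e8,   # Core 2 Duo
}

years = sorted(data)
ys = [math.log2(data[y]) for y in years]

# Least-squares slope of log2(count) vs. year: a straight line here
# means exponential growth, and the slope is doublings per year.
n = len(years)
mean_x = sum(years) / n
mean_y = sum(ys) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, ys)) / \
        sum((x - mean_x) ** 2 for x in years)

doubling_time = 1 / slope
print(f"doubling time ~ {doubling_time:.1f} years")

# The naive move the argument relies on: assume the same doubling
# time holds for 30 more years past the last data point.
growth = 2 ** (30 / doubling_time)
print(f"30-year extrapolation: x{growth:,.0f}")
```

On these numbers the fit gives a doubling time close to the canonical two years, and a thirty-year extrapolation multiplies capability by tens of thousands. The fit itself is sound; whether the extrapolation is justified is exactly what's at issue below.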
The idea of a technological singularity reminds me of those cartoon scenes where a character does the math and figures out that, given his wimpy allowance, it'll take hundreds of years to afford the sweet new video game or bike or whatever. Trends, especially exponential ones, are impressive when they have staying power, but whether that endurance exists depends on details well hidden from a simple log-log plot. The increase in computational power over the last half-century, for example, is mainly driven by improvements in integrated circuit technology, allowing more and more processing horsepower to fit into a machine with the same size and price tag. Matter isn't continuous, though; it comes in integer numbers of atoms, and we're close to the point where integrated circuits can't be made any smaller because the quantum noise of individual atoms will start to drown out the processing signal. A fundamental breakthrough is required, perhaps through spintronics or quantum computing, for Moore's "Law" (really a trend) to continue. Whether and when this will happen is virtually impossible to say with any kind of certainty, unless you're trying to sell tickets to your next lecture.
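The atomic limit isn't hand-waving; a back-of-envelope calculation shows how close it is. The sketch below assumes, purely for illustration, that density doubles every two years (so linear feature size halves every four), starts from the roughly 10 µm process of 1971, and uses the approximate silicon bond length as the floor.

```python
import math

# Illustrative assumptions: density doubles every 2 years, so linear
# feature size halves every 4 years, starting from ~10 um in 1971.
feature_nm_1971 = 10_000          # ~10 um process, 1971
si_bond_length_nm = 0.235         # approximate Si-Si spacing

# How many halvings until a feature is one atom across?
halvings = math.log2(feature_nm_1971 / si_bond_length_nm)
year_at_atomic_scale = 1971 + 4 * halvings
print(f"~{halvings:.0f} halvings -> roughly the year "
      f"{year_at_atomic_scale:.0f}")
```

Only about fifteen halvings separate 1971-era features from single atoms, which lands the endpoint within living memory of the trend's start. The real limit bites even sooner, since quantum effects become troublesome well before features are literally one atom wide.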
It's possible that the greatest slope in the curve of technological progress for the time being lies in our past. As sci-fi author Rick Robinson put it, "I also suspect that the Singularity, to the extent there is one, already happened, centered around 1870-1930, though it will take at least a couple more centuries for its consequences to become clear." Regardless of what the future holds, we have a lot to wrap our heads around right now.