I think I've figured out my problem with a lot of the sf that talks about the "Singularity", and doubly so with the people who take it as a real-life given: evolution isn't that directional or predetermined. I can certainly accept the idea that people--or some people--may move in that direction. What I can't buy is the idea that this must happen, that it's as inevitable as entropy, or that if some people decide they'd rather be uploaded software, that will eliminate humans as a species of animal living on the planet in physical bodies.
From:
no subject
Can't think of any others off the top of my head, but then I'm not as well-read in SF as I should be.
From:
no subject
The 'loose' definition of the Singularity, as I understand it, is the notion that evolutionary change will happen soon, it will happen rapidly, and it will involve the intermediation of technology, with the result that we will all be something different and unrecognizable very soon now. It essentially takes Stephen Jay Gould's (?) idea of little bursts of evolutionary activity (for lack of a better term) and weds that to cyberpunkish ideas of the integration of humans and tech. Most discussions of Singularityness then proceed to focus less on the body-mod aspect of this stuff and go straight to the computer-aided consciousness aspect, but that's the cold dead hand of Descartes for you.
All of it still inevitably reminds me of KJF's story about Greg Bear: "Thirty years from now, you won't even recognize your children!" He's been saying that for thirty years now...
From:
no subject
I don't think that any particular change can be postulated at that point, just that it's unimaginable in practice.
Obviously, there are a bunch of SF authors running around with "The Singularity" as their central thesis/problem, Vinge particularly among them. But near as I can tell, the best of them -aren't- writing about the singularity as much as taking it as a given (however true or untrue that may be; I'm somewhat sceptical) and then writing around the concept rather than through it.
From:
no subject
Er, that should be "the rate of change", without the qualifier, I think -- there isn't really a term for the -acceleration- of change.
From:
no subject
An argument can and has been made that the rate of technological change at the end of the 19th and beginning of the 20th century was more rapid than it is now.
From:
no subject
Also, there's a -similar-, but different concept which is also called a singularity, but -still- isn't limited to what
Personally, I'd take either singularity -- I have no desire for death, nor can I see much wrong with becoming smarter/thinking faster than I do now. But that's just me speaking for me.
From:
no subject
Anyway, as for the rate of scientific progress. I suspect it has been speeding up in recent times, but I see no good reason to assume that this will be exponential. Two things have made a big, big difference in recent times: cheap access to massive computer power and the internet. The former allows both scientists and engineers to manipulate much larger, more complex designs. The latter makes communication of scientific ideas much faster -- minutes rather than the lag time of journals, conferences and workshops which were (still are!) measurable in months or years.
For years, people have talked about 'the wall' -- a point beyond which Moore's law (the doubling of computer power every 18 months that has been consistent since the 1970s) will break down. The way things are looking, the problem is more likely to hit first in the difficulties inherent in handling the sheer size of chip designs and verifying their correctness. At the time of writing, it costs about $2 million to have a typical custom chip fabricated. If you make a mistake, you've burnt all that money, and you'll have to pay all over again. For current CPUs like the Pentium 4, the cost is maybe two orders of magnitude higher. The existing software tools used by chip designers are really struggling to cope, even at current levels of complexity. There are better tools on the way, but this is likely to be, for practical purposes, 'the wall' rather than the limitations of silicon itself. Which brings us back to the singularity -- if humans are in the loop, our limited capacity for handling complexity will get in the way. But, if some smartass gets genetic programming really right, to the extent that humans are no longer in the loop, things could get really 'interesting' really quickly.
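Just to put numbers on that doubling, here's a quick back-of-the-envelope sketch (Python, purely illustrative -- the 18-month doubling period is the figure above; the starting gate count and the time spans are made-up assumptions):

```python
# Rough sketch of the Moore's-law growth described above: device count
# doubles roughly every 18 months. The starting count and time spans are
# arbitrary illustration values, not measurements.

def moores_law(initial_count, years, doubling_period_months=18):
    """Projected device count after `years`, doubling every
    `doubling_period_months` months."""
    doublings = (years * 12) / doubling_period_months
    return initial_count * 2 ** doublings

if __name__ == "__main__":
    start = 100_000_000  # assume ~10^8 gates on a current chip
    for years in (3, 6, 9, 15):
        print(f"after {years:2d} years: ~{moores_law(start, years):.2e} gates")
```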
There are wildcards out there, however. Quantum computing might allow classes of computation that are effectively intractable at the moment to become practical. Most of the press blah talks about reversing one-way functions and attacks on various cryptosystems. I have a strong hunch that once people figure out how to program quantum computers, everything will change. Many of the limitations on handling complexity are bound up in the idea that you just can't do some things without brute-force searching. Quantum computing could change that forever, making it possible to define algorithms in terms of their results, without any need to specify method.
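For what it's worth, the one well-established quantum result for unstructured search is Grover's algorithm, which needs on the order of sqrt(N) oracle queries where a classical brute-force search needs on the order of N. The sketch below only compares those two query counts -- it doesn't simulate anything quantum, and the problem sizes are arbitrary:

```python
# Compare query counts for classical brute-force search vs. Grover's
# algorithm over an unstructured space of N candidates. This is just
# arithmetic on the known scaling laws, not a quantum simulation.
import math

def classical_queries(n):
    # Worst case: check every candidate.
    return n

def grover_queries(n):
    # Grover's algorithm: roughly (pi / 4) * sqrt(N) oracle calls.
    return math.ceil((math.pi / 4) * math.sqrt(n))

if __name__ == "__main__":
    for exponent in (6, 9, 12):  # search spaces of 10^6, 10^9, 10^12 items
        n = 10 ** exponent
        print(f"N = 10^{exponent}: classical ~{classical_queries(n):.1e}, "
              f"Grover ~{grover_queries(n):.1e} queries")
```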
The other wildcard is nanotech. I don't know if we will ever see Drexler-class nanomachines, but the steps toward that will include significantly useful spinoffs. A good example is in the difference that nanotech 'assemblers' could make to the chip industry. One of the main reasons why chips have increased in power exponentially is so simple and obvious that no one ever seems to mention it -- chips are rectangular! Silicon technology progresses primarily by gradually reducing the size of 'features' (wires, transistors, etc). This has been fairly linear over time. But, because those features are tightly packed into a rectangle, every time you halve the feature size you quadruple the number of devices on a chip -- the shrink applies along both axes. Assemblers could feasibly build 3D chips - given current device geometries and assuming a cube with the same edge size as current chips, we are looking at going up from 100,000,000 gates to 1,000,000,000,000 gates - four orders of magnitude in one hit. And, of course, if you halve the feature size of a 3D chip, you get an eight-fold increase in component count within the same volume. As a further benefit, for topological reasons, it is much easier to route wires through a 3D chip than a 2D chip, so the density gains may be even greater.
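The scaling argument above is easy to check with a few lines of arithmetic. The 10^8-gate starting point and the same-edge cube are the figures quoted above; the 10,000-layer count is just what it takes to fill such a cube at current density, and the number of shrink steps is arbitrary:

```python
# Arithmetic behind the 2D-vs-3D scaling argument: halving the feature
# size multiplies device count by 2**dims -- x4 per shrink for a flat
# (2D) chip, x8 for a cubic (3D) one.

def gates_after_shrinks(initial_gates, shrinks, dims):
    """Device count after `shrinks` halvings of the feature size for a
    chip laid out in `dims` dimensions (2 or 3)."""
    return initial_gates * (2 ** dims) ** shrinks

if __name__ == "__main__":
    gates_2d = 100_000_000        # ~10^8 gates, the figure quoted for a current chip
    layers = 10_000               # assumed layer count to fill a same-edge cube
    gates_3d = gates_2d * layers  # ~10^12 gates: "four orders of magnitude in one hit"
    print(f"2D chip today:     {gates_2d:.1e} gates")
    print(f"same-edge 3D cube: {gates_3d:.1e} gates")
    for shrinks in (1, 2, 3):
        print(f"after {shrinks} halving(s): "
              f"2D {gates_after_shrinks(gates_2d, shrinks, 2):.1e}, "
              f"3D {gates_after_shrinks(gates_3d, shrinks, 3):.1e} gates")
```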
From:
no subject
PS: I know you know
PPS: Sorry for rambling!
From:
no subject
Most of the intelligent fiction I’ve read that’s about post-Singularity/post-human/Extropian/etc settings has ordinary people in it (sometimes for odd values of “ordinary”). Examples:
Vinge’s Zones of Thought stories (A Fire Upon the Deep, A Deepness in the Sky, and the short story “The Blabber”) all have fleshy human beings among the protagonists.
Vinge’s Marooned in Realtime might be the purest example of Singularity-as-Rapture — it has the only remaining humans being those left behind by the Singularity, but it needs to be that way for the inside-out-locked-room-murder-mystery plot, not because it’s a definitional part of the Singularity.
I just recently read Karl Schroeder’s Ventus, which has Vingean star-gods sharing the galaxy with human beings, both ordinary and with various levels of technological enhancement.
Greg Egan’s Diaspora, which is about computer intelligences, shows them interacting at one point with fleshy people. In Schild’s Ladder some people live in computer simulation, while others live in post-biological physical bodies, and there are still some old-fashioned meat-based people running around out there.
In Cory Doctorow’s Down and Out in the Magic Kingdom, there exists the option of having yourself translated into computer code and embedded in a spacecraft, but everyone the novel actually focuses attention on is flesh-and-blood, and most of them don’t even have tails or extra arms or anything exotic like that.