redbird: closeup of me drinking tea, in a friend's kitchen (Default)
([personal profile] redbird Oct. 2nd, 2003 12:09 pm)
I think I've figured out my problem with a lot of the sf that talks about the "Singularity", and doubly so with the people who take it as a real-life given: evolution isn't that directional or predetermined. I can certainly accept the idea that people--or some people--may move in that direction. What I can't buy is the idea that this must happen, that it's as inevitable as entropy, or that if some people decide they'd rather be uploaded software, that will eliminate humans as a species of animal living on the planet in physical bodies.
drplokta: (Default)

From: [personal profile] drplokta


I think you're setting up a straw man here. Could you give a few examples of SF in which humanity moves uniformly to some kind of post-humanism (without some coincidental natural disaster to explain it, as in Greg Egan's Diaspora)?

From: [identity profile] wordweaverlynn.livejournal.com


Arthur C. Clarke, Childhood's End.

Can't think of any others off the top of my head, but then I'm not as well-read in SF as I should be.
avram: (Default)

From: [personal profile] avram


Yeah, but in Childhood’s End humanity suddenly evolves into psychic godhood, via a mechanism that I don’t recall being explained. That’s very definitely not the Vingean Singularity, which is supposed to occur deliberately through technological development.

From: [identity profile] the-gardener.livejournal.com


I agree. The Singularity is extropian wank by people too excited to have noticed that until we have a mathematical description of consciousness they have no hope of downloading themselves into machines and setting off for the stars. And too excited to have noticed that the most likely scenario for humanity over the next hundred or so years isn't technological progress roaring unstoppably on and on but the beginning of our extinction, because we've buggered the global environment beyond its ability to support us.

From: [identity profile] supergee.livejournal.com


If I get uploaded, which may be my dearest wish, I will have no problem with others remaining corporeal. Doing otherwise would be as dumb as wanting to punish people for sexual activities I don't consider fun.

From: [identity profile] ailsaek.livejournal.com


I can't believe in all of humanity agreeing on anything, let alone something like the Singularity. I see it in the same category as FTL travel -- a useful SF prop for those who want it, but not necessarily something that will ever come to pass in real life.

From: [identity profile] pantryslut.livejournal.com


I think this is only one version of the Singularity, though. One I haven't actually seen much of in fiction (Charlie Stross excepted, and he's really making fun of the idea in a subtle way), but have seen a lot of in popular discussions. Of course, I may be reading the wrong fiction. Or the right fiction, depending on your perspective.

The 'loose' definition of the Singularity, as I understand it, is the notion that evolutionary change will happen soon, it will happen rapidly, and it will involve the intermediation of technology, with the result that we will all be something different and unrecognizable very soon now. It essentially takes Stephen Jay Gould's (?) idea of little bursts of evolutionary activity (for lack of a better term) and weds that to cyberpunkish ideas of the integration of humans and tech. Most discussions of Singularityness then proceed to focus less on the body-mod aspect of this stuff and go straight to the computer-aided consciousness aspect, but that's the cold dead hand of Descartes for you.

All of it still inevitably reminds me of KJF's story about Greg Bear: "Thirty years from now, you won't even recognize your children!" He's been saying that for thirty years now...


From: [identity profile] rysmiel.livejournal.com


Small grump about punctuated equilibrium: the evidence Gould and others cite for the idea refers to jumps which are rapid only on a geological scale, as in on the order of ten thousand years or so, which for most of the organisms I've seen it advanced for is several thousand generations at minimum; that requires very little gradual change per generation for weird and wonderful species changes to happen. It doesn't mean evolution sits on its heels for a quarter of a million years and then kicks in overnight.


From: [identity profile] pantryslut.livejournal.com


You're right, of course, and I knew this; I'm just noting the influence of Gould's idea on the Singularity people.
mneme: (Default)

From: [personal profile] mneme


Really, the singularity is about one and only one thing -- the apparent trend we've been seeing of the rate of technological progress increasing as time goes on, rather than going at a constant rate or leveling off. "The singularity" is the concept that this trend, if it goes on, will result in an unimaginable rate of change -- if one sees it as an upwards curve, eventually it can hit a limit such that, say, as the date approaches 2100, the change (not the rate of change) approaches infinity -- what happens after that, or at that time, is the classic singularity.

I don't think that any particular change can be postulated at that point, just that it's unimaginable in practice.
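
To make the mathematical sense of "singularity" concrete, here is a toy sketch in Python (an illustration only, not a model anyone in this thread is committed to): if the rate of progress grows with the square of the current level of progress, the curve reaches a vertical asymptote at a finite time.

    # Toy model only: dP/dt = k * P**2 has the closed-form solution
    # P(t) = P0 / (1 - k*P0*t), which diverges at t* = 1/(k*P0).
    def progress(t, p0=1.0, k=0.01):
        t_star = 1.0 / (k * p0)  # the vertical asymptote; here t* = 100
        if t >= t_star:
            raise ValueError("past the singularity -- the model is undefined")
        return p0 / (1.0 - k * p0 * t)

    for t in (0, 50, 90, 99):
        print(t, round(progress(t), 1))  # 1.0, 2.0, 10.0, 100.0 -- then it diverges

Whether real technological change follows anything like that curve is, of course, exactly the point in dispute.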

Obviously, there are a bunch of SF authors running around with "The Singularity" as their central thesis/problem, Vinge particularly among them. But near as I can tell, the best of them -aren't- writing about the singularity as much as taking it as a given (however true or untrue that may be; I'm somewhat sceptical) and then writing around the concept rather than through it.
mneme: (Default)

From: [personal profile] mneme


"the change (not the rate of change) "

Er, that should be "the rate of change", without the qualifier, I think -- there isn't really a term for the -acceleration- of change.

From: [identity profile] pantryslut.livejournal.com


I don't believe that the rate of technological change has been increasing. I think that's a myth that we tell ourselves -- for a variety of reasons.

An argument can be, and has been, made that the rate of technological change at the end of the 19th and beginning of the 20th century was more rapid than it is now.

From: [identity profile] nolly.livejournal.com


Your first paragraph is closer to my understanding of Vinge's singularity concept than what [livejournal.com profile] redbird described. It's the point we can't predict beyond, because things have changed too much. There's a similar concept in chaos math, but I've forgotten the term.
mneme: (Default)

From: [personal profile] mneme


Yup (don't know enough chaos math to know the term, though).

Also, there's a -similar-, but different, concept which is also called a singularity, but -still- isn't limited to what [livejournal.com profile] redbird described: technological change to the point where we're making ourselves smarter (thus making ourselves better able to invent, thus capable of making ourselves even smarter, etc.). Descriptions of this tend to involve a transition to machine intelligence, but mostly as a byproduct, not an end in and of itself.

Personally, I'd take either singularity -- I have no desire for death, nor can I see much wrong with becoming smarter/thinking faster than I do now. But that's just me speaking for me.

From: [identity profile] wild-irises.livejournal.com


That must be one of Greg Bear's argumentative styles or something. It's probably just about 15 years ago now that he and I mixed it up at a party over whether or not printed books would exist in 15 years ... since I write from my desk at a profitable book publishing company, I claim victory.

From: [identity profile] compilerbitch.livejournal.com


I have a copy of Vernor Vinge's 'Across Realtime' sitting on the shelf near me -- some people seem to credit this with the coining of the term. I have to say I didn't go for it in a big way as a book -- the argument didn't convince me. I do rather like his space opera 'A Deepness in the Sky', however. The spiders were great. To continue this wiggly line of reasoning, I'd also recommend Greg Egan's Schild's Ladder -- math-fi with some nice ideas. I like the concept of describing the non-uploaded as 'anachronauts'.

Anyway, as for the rate of scientific progress: I suspect it has been speeding up in recent times, but I see no good reason to assume that this will be exponential. Two things have made a big, big difference in recent times: cheap access to massive computer power and the internet. The former allows both scientists and engineers to manipulate much larger, more complex designs. The latter makes communication of scientific ideas much faster -- minutes rather than the lag times of journals, conferences and workshops, which were (still are!) measurable in months or years.

For years, people have talked about 'the wall' -- a point beyond which Moore's law (the doubling of computer power every 18 months that has been consistent since the 1970s) will break down. The way things are looking, the problem is more likely to hit first in the difficulties inherent in handling the sheer size of chip designs and verifying their correctness. At the time of writing, it costs about $2 million to have a typical custom chip fabricated. If you make a mistake, you've burnt all that money, and you'll have to pay all over again. For current CPUs like the Pentium 4, the cost is maybe two orders of magnitude higher. The existing software tools used by chip designers are really struggling to cope, even at current levels of complexity. There are better tools on the way, but this is likely to be, for practical purposes, 'the wall', rather than the limitations of silicon itself. Which brings us back to the singularity -- if humans are in the loop, our limited capacity for handling complexity will get in the way. But if some smartass gets genetic programming really right, to the extent that humans are no longer in the loop, things could get really 'interesting' really quickly.
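
For a sense of scale, the arithmetic behind that 18-month doubling figure (taking it at face value, purely as quoted above) works out like this:

    # Rough arithmetic only, assuming the 18-month doubling period cited above.
    def growth_factor(years, doubling_months=18):
        return 2 ** (years * 12 / doubling_months)

    print(round(growth_factor(10)))  # ~102: roughly a hundredfold in a decade
    print(round(growth_factor(30)))  # 1048576: about a millionfold over thirty years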

There are wildcards out there, however. Quantum computing might allow classes of computation that are effectively uncomputable at the moment to be computed. Most of the press blah talks about reversing one-way functions and attacks on various cryptosystems. I have a strong hunch that once people figure out how to program quantum computers, everything will change. Many of the limitations on handling complexity are bound up in the idea that you just can't do some things without brute-force searching. Quantum computing could change that forever, making it possible to define algorithms in terms of their results, without any need to specify method.

The other wildcard is nanotech. I don't know if we will ever see Drexler-class nanomachines, but the steps toward that will include significantly useful spinoffs. A good example is the difference that nanotech 'assemblers' could make to the chip industry. One of the main reasons why chips have increased in power exponentially is so simple and obvious that no one ever seems to mention it -- chips are rectangular! Silicon technology progresses primarily by gradually reducing the size of 'features' (wires, transistors, etc.). This has been fairly linear over time. But what happens when you reduce the size of things that are being tightly packed into a rectangle is that every time you halve the (linear) feature size, you quadruple the number of devices on a chip. Assemblers could feasibly build 3D chips -- given current device geometries, and assuming a cube with the same edge size as current chips, we are looking at going up from 100,000,000 gates to 1,000,000,000,000 gates -- four orders of magnitude in one hit. And, of course, if you halve the feature size of a 3D chip, you get an eight-times increase in component count within the same volume. As a further benefit, for topological reasons, it is much easier to route wires through a 3D chip than a 2D chip, so the density may be improved even further.
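
A back-of-envelope version of that scaling argument, using the gate counts above purely as illustrative figures rather than measured data:

    # Device count scales with area in 2D and volume in 3D, so halving the
    # feature size gives 4x the gates on a flat chip but 8x in a cube.
    def gates_2d(edge, feature):
        return (edge / feature) ** 2

    def gates_3d(edge, feature):
        return (edge / feature) ** 3

    print(gates_2d(1.0, 0.5) / gates_2d(1.0, 1.0))  # 4.0
    print(gates_3d(1.0, 0.5) / gates_3d(1.0, 1.0))  # 8.0

    # Stacking a 1e8-gate die into a cube with the same edge multiplies the
    # count by edge/feature = sqrt(1e8) = 10,000 layers: 1e8 * 1e4 = 1e12 gates.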



From: [identity profile] compilerbitch.livejournal.com



PS: I know you know [livejournal.com profile] livredor and have seen your posts on her journal. You've also just met [livejournal.com profile] doseybat, who alerted me to this conversation and suggested that it might be 'up my street'. Which it is. I've added you as a friend -- is that OK?

PPS: Sorry for rambling!
avram: (Default)

From: [personal profile] avram


Since when does the idea of the Singularity include the idea that nobody will stay behind? There’d be the Amish, if nobody else.

Most of the intelligent fiction I’ve read that’s about post-Singularity/post-human/Extropian/etc settings has ordinary people in it (sometimes for odd values of “ordinary”). Examples:

Vinge’s Zones of Thought stories (A Fire Upon the Deep, A Deepness in the Sky, and the short story “The Blabber”) all have fleshy human beings among the protagonists.

Vinge’s Marooned in Realtime might be the purest example of Singularity-as-Rapture — it has the only remaining humans being those left behind by the Singularity, but it needs to be that way for the inside-out-locked-room-murder-mystery plot, not because it’s a definitional part of the Singularity.

I just recently read Karl Schroeder’s Ventus, which has Vingean star-gods sharing the galaxy with human beings, both ordinary and with various levels of technological enhancement.

Greg Egan’s Diaspora, which is about computer intelligences, shows them interacting at one point with fleshy people. In Schild’s Ladder some people live in computer simulation, while others live in post-biological physical bodies, and there are still some old-fashioned meat-based people running around out there.

In Cory Doctorow’s Down and Out in the Magic Kingdom, there exists the option of having yourself translated into computer code and embedded in a spacecraft, but everyone the novel actually focuses attention on is flesh-and-blood, and most of them don’t even have tails or extra arms or anything exotic like that.