ChubbyHuggs
Member
Humans should never gain knowledge of immortality.
Just as soon as our colossal overlord decides the calculation has reached the proper uncertainty level to finish his overlord thesis, we are FUCKED.
Haven't we been seeking to do just that for a while without significant controversy?
But I think I can see some ethical dilemmas. For instance, in the Black Mirror episode White Christmas, humanity in some near future develops the capacity to create a "cookie" from an existing human brain. This is a simulation that has sentience and can thus be manipulated, threatened and tortured.
But that's not a hard boundary, as was suggested. Examples of beneficial uses of such a technology are to enable a sentient mind (or a reasonably faithful facsimile thereof) to survive the death of the body.
I would also suggest Surface Detail, one of Iain Banks' greatest novels, as a fairly good exploration of the pros and cons of that kind of machine intelligence. In particular, his discussion of what the narrator calls "The Simming Problem" is quite inclusive in its definition of machine intelligence.
http://ask.metafilter.com/178421/Effect-of-a-small-mass-at-near-light-speed-in-atmosphere
0.25c is 75,000,000 m/s. The specific kinetic energy of an object at that speed is 2.81×10^15 J/kg. (Neglecting relativistic effects, so this is a slight underestimate.) If we assume a 5 gram bullet, its kinetic energy is about 14 trillion joules. According to Wikipedia, that's on the order of about 30,000 bolts of lightning.
Another way to look at it: suppose the bullet disintegrates completely before hitting the ground. We can put a very loose lower bound on the rate of energy release by assuming it loses energy uniformly. The bullet would traverse 100 km of atmosphere in about a millisecond, so that's 14 quadrillion watts of power.
Divide that by the surface area of a sphere, radius 10 km, and at that distance the heat output would momentarily be about 10,000 times brighter than the sun. At 1 km, it would be a million times brighter than the sun. The incandescent gas trail would probably reach billions or trillions of Kelvin before it had time to cool.
This is all under the assumption that the bullet vaporizes before it hits the ground. If not, you still get the same amount of energy released, but it all goes into the ground instead. 2.8 petajoules (the kinetic energy per kilogram at that speed) is about the energy released in a magnitude 7.1 earthquake.
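A quick sanity check of the arithmetic above. This is only a sketch using classical kinetic energy, 100 km of atmosphere, a 5 g bullet, and the standard Gutenberg-Richter energy-magnitude relation; the results land within a factor of two of the round numbers quoted.

```python
import math

c = 299_792_458            # speed of light, m/s
v = 0.25 * c               # ~7.5e7 m/s
m = 0.005                  # 5 gram bullet, kg

specific_ke = 0.5 * v**2          # J/kg, ~2.8e15
bullet_ke = m * specific_ke       # J, ~1.4e13 (14 trillion joules)

# Energy dumped over the ~1 ms atmospheric transit
transit_time = 100_000 / v        # s, ~1.3e-3
power = bullet_ke / transit_time  # W, ~1e16

# Irradiance at 10 km, compared to the solar constant (~1361 W/m^2)
r = 10_000
irradiance = power / (4 * math.pi * r**2)
times_brighter_than_sun = irradiance / 1361  # order of 10,000

# Seismic moment magnitude equivalent of the per-kilogram energy:
# log10(E) = 1.5 M + 4.8, so M = (log10(E) - 4.8) / 1.5
quake_magnitude = (math.log10(specific_ke) - 4.8) / 1.5  # ~7.1
```

Nothing here is exotic; the "brighter than the sun" figure is just the inverse-square law applied to the average power output.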
Immortality.
I can't think of how it wouldn't bring the world to unavoidable ruin.
we have been trying to instill ai with the capacity to learn and become more functional, to the point that it is nearly indistinguishable from humanity, but that seems more focused on functionality, on making us comfortable and making technology better
imbuing something artificial with actual sentience, though, seems like the most morally reprehensible thing i have ever heard of in my life
Why do you think we should abandon all research into creating sentient machines?
Easily available space travel at high fractions of c
An easy way to fuck the planet up is to point a spacecraft at it and let it hit at relativistic velocity
Here's what just a single bullet at .25c would do:
http://ask.metafilter.com/178421/Effect-of-a-small-mass-at-near-light-speed-in-atmosphere
Now, warping space to get past the c speed limit via an Alcubierre drive is fine and dandy; just don't ACTUALLY have an easy way to propel objects to significant fractions of c that just anyone could get their hands on
Not quite what I meant. Going beyond a certain point in the development of sentient machines would pretty much be setting yourself up to be replaced,
and there's also the issue of bringing something self-aware into the world for the pure purpose of servitude. As for your point, they touched on that in a game called SOMA, where you repeatedly upload a scan of a guy's brain that thinks it's real and keep wiping it from existence when it has a mental breakdown. Your character has issues with this, but your buddy character is like "nah, it's fine. Robots."
because you dont have a right to bring something into an existence where it can suffer simply for your own benefit
basically thats just going to be like making the pass butter robot
Humans should never gain knowledge of immortality.
Well that's a good argument against child slavery. But would creating artificial children imply an obligation to mistreat them? If we can treat them well, then does your objection not evaporate?
You're not thinking hard enough then.
Of course we aren't. But without knowledge we will never know.
Everything we are, everything we can be is directing us to our ultimate destiny. The proliferation of humanity across the stars.
because you dont have a right to bring something into an existence where it can suffer simply for your own benefit
Unless "trickery" is involved that violates the laws of physics, you have to spend that much energy to gain it (gravity assists aside). So that technology would already be capable of achieving that kind of destruction by other means. I suppose thousands of perfect orbital slingshots could get an object to some fraction of it.
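The "thousands of slingshots" guess can be roughly sanity-checked. In the idealized best case a gravity assist adds about twice the planet's orbital speed to the flyby object, so (ignoring relativity and every practical obstacle, and assuming Jupiter-class planets to swing around) reaching 0.25c would take on the order of 3,000 assists:

```python
c = 299_792_458
v_target = 0.25 * c        # ~7.5e7 m/s

# Jupiter's mean orbital speed, m/s; the idealized maximum
# velocity gain from one flyby is twice this value.
v_jupiter = 13_070
dv_per_flyby = 2 * v_jupiter

n_flybys = v_target / dv_per_flyby  # ~2,900
```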
The logic is cyclic, but unless we are going to erase all knowledge (i.e. wipe out all sentient life), it's the best we have to work with.
Isaac Asimov said: "If knowledge can create problems, it is not through ignorance that we can solve them."
If we live forever we will ruin the galaxy.
What do you define as "ruin" here? Colonising planets and moons? Eating up stars?
heck, if you go with just someTHING, that'd extend even to plants. The stance is nonsense.
i was pretty clearly having a conversation about sentient robots, not (non-sentient) plants, so you can quit with the random bullshit i didn't say, thanks
What do you define as "ruin" here? Colonising planets and moons? Eating up stars?
If we can reproduce and never die anymore, humans will grow like a disease on the whole universe. So yes to all that you mentioned and much more.
If we can reproduce and never die anymore, humans will grow like a disease on the whole universe.
Why do you think humans growing out into the universe are a 'disease'?
That sounds more like a problem for the humans than for the universe, to be honest.
Why do you think humans growing out into the universe are a 'disease'?
On this planet we're destructive to other forms of life, but if we assume there's no (or very little) life out there, what are we a disease to? Barren asteroids, moons, and planets? I don't think they'll mind.
As far as I know, nothing is infinite. As long as we keep growing indefinitely, we will need more and more energy, more and more space, more and more resources. After a certain amount of time humanity would reach a critical state where there wouldn't be anything left, since energy can't be created from nothing. Thus, we would end the universe as we know it.
Obviously, this is just a layman's opinion on the subject.
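The runaway-growth worry can be put in rough numbers with a toy doubling model. All figures here are my own assumptions, not the poster's: 8 billion people at 70 kg each, population doubling every 50 years, and roughly 1.5×10^53 kg of ordinary matter in the observable universe.

```python
import math

human_mass = 8e9 * 70      # kg of humanity today, ~5.6e11
universe_mass = 1.5e53     # kg of ordinary matter, observable universe
doubling_period = 50       # years per population doubling (assumed)

doublings = math.log2(universe_mass / human_mass)  # ~138
years = doublings * doubling_period                # ~6,900
```

Under these (admittedly silly) assumptions, an immortal, steadily doubling humanity would outweigh all ordinary matter in the observable universe in under 10,000 years, which is the point the post is gesturing at: exponential growth exhausts any finite budget quickly.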
you can treat them as well as you want, but you simply have no way to guarantee that any one existence is going to be free from suffering. if you had control over every variable dictating whether a sentient being would really be free from distress 100% of their life, then i question what the point was in trying to generate sentience in the first place, since you'd basically have to manipulate them 24/7, i feel
If we developed FTL we could send super-powerful observatories out to various distances to look back at the Earth and watch what is now history.
Well I'm a father, so I kinda get how that whole thing works. Cool.
Why would we want to condemn sentient machines to servitude? Well, perhaps we're evil, in which case being replaced by our robot slaves might be a moral improvement.
But I notice that you're just exploring worst cases here, whereas I thought the idea was that the creation of sentient life absolutely must not be researched.
I think it's rather more nuanced than that. We're biological, we're evolved and we probably can't ever traverse deep space by some faster-than-light space opera magic. It might be a good idea to research the creation of a successor species that would share our curiosity but fit the form factor and time constraints needed for deep space travel. Or maybe not. But the argument can be made.
Thought of another one: The Terrible Secret of Space
Luckily I am protected, as I have stairs in my house.
How to create a true vacuum. (No, not a Dyson)
How to make nukes.
They have no use for us, and even if we were to try to fend off an asteroid, it's far better to nudge it off course gradually through other means.
Burning coal, oil or gas for energy.
Creating plastic.
Industrial farming.
Realistically no. But hypothetically time travel would be way too dangerous to mess with.