Is there knowledge that humanity shouldn't know?

Haven't we been seeking to do just that for a while without significant controversy?

But I think I can see some ethical dilemmas. For instance, in the Black Mirror episode White Christmas, we develop in some near future the capacity to create a "cookie" from an existing human brain. This is a simulation that has sentience and can thus be manipulated, threatened, and tortured.

But that's not a hard boundary, as was suggested. Examples of beneficial uses of such a technology are to enable a sentient mind (or a reasonably faithful facsimile thereof) to survive the death of the body.

I would also suggest Surface Detail, one of Iain Banks' greatest novels, as a fairly good exploration of the pros and cons of that kind of machine intelligence. In particular, his discussion of what the narrator calls "The Simming Problem" is quite inclusive in its definition of machine intelligence.
Not quite what I meant. Going beyond a certain point in the development of sentient machines would pretty much be setting yourself up to be replaced, and there's also the issue of bringing something self-aware into the world for the pure purpose of servitude. But for your point, they touched on that in a game called SOMA, where you repeatedly upload a scan of a guy's brain that thinks it's real and keep wiping it from existence when it has a mental breakdown. Your character takes issue with this, but your buddy character is like "nah, it's fine. Robots."


Also living forever.

Living forever sucks.
 
Easily available space travel at high fractions of c

Easy way to fuck the planet up is to point a spacecraft at it and let it hit at relativistic velocity
Here's what just a single bullet at .25c would do:
0.25c is 75,000,000 m/s. The specific kinetic energy of an object at that speed is 2.81×10^15 J/kg. (Neglecting relativistic effects, so this is a slight underestimate.) If we assume a 5 gram bullet, its kinetic energy is about 14 trillion joules. According to Wikipedia, that's on the order of about 30,000 bolts of lightning.

Another way to look at it: suppose the bullet disintegrates completely before hitting the ground. We can put a very loose lower bound on the rate of energy release by assuming it loses energy uniformly. The bullet would traverse 100km of atmosphere in about a millisecond, so that's 14 quadrillion watts of power.

Divide that by the surface area of a sphere, radius 10 km, and at that distance the heat output would momentarily be about 10,000 times brighter than the sun. At 1 km, it would be a million times brighter than the sun. The incandescent gas trail would probably reach billions or trillions of Kelvin before it had time to cool.

This is all under the assumption that the bullet vaporizes before it hits the ground. If not, you still get the same amount of energy released, but it all goes into the ground instead. The bullet's 14 terajoules is roughly the energy radiated by a magnitude 5.6 earthquake; a full kilogram at that speed (2.8 petajoules) would be closer to magnitude 7.1.
http://ask.metafilter.com/178421/Effect-of-a-small-mass-at-near-light-speed-in-atmosphere
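The arithmetic from that thread can be sketched in a few lines. This is a non-relativistic back-of-envelope under the same assumptions as the quoted numbers (5 g bullet, ~100 km of atmosphere, ~1361 W/m² solar irradiance at Earth); the earthquake comparison uses the standard Gutenberg-Richter energy relation:

```python
import math

C = 299_792_458.0          # speed of light, m/s
v = 0.25 * C               # ~7.5e7 m/s

ke_per_kg = 0.5 * v**2     # non-relativistic specific KE, ~2.8e15 J/kg
ke = 0.005 * ke_per_kg     # 5 g bullet: ~1.4e13 J (14 trillion joules)

t = 100_000 / v            # time to cross ~100 km of atmosphere, ~1.3 ms
power = ke / t             # ~1e16 W if the energy is shed uniformly

area_10km = 4 * math.pi * 10_000**2   # sphere of radius 10 km, m^2
flux = power / area_10km              # W/m^2 at 10 km distance
solar = 1361.0                        # solar irradiance at Earth, W/m^2
brightness_ratio = flux / solar       # thousands of "suns" at 10 km

# Gutenberg-Richter energy relation: log10(E) = 1.5*Mw + 4.8
mw = (math.log10(ke) - 4.8) / 1.5     # ~5.6 for the 5 g bullet
```

Using the exact value of c instead of the rounded 75,000,000 m/s, the brightness ratio comes out closer to several thousand suns than 10,000, but the order of magnitude holds.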

Now warping space to get around the c speed limit via an Alcubierre drive is fine and dandy, just don't ACTUALLY have an easy way to propel objects to significant fractions of c that just anyone could get their hands on
 
we have been trying to instill ai with the capacity to learn and to become more functional, to the extent that it is nearly indistinguishable from humanity, but that effort seems more focused on functionality, on making us comfortable, and on making technology better

imbuing something artificial with actual sentience though, seems like the most morally reprehensible thing i have ever heard of in my life
You're describing only the most successful and therefore the most prominent kind of AI research. But the other kind of research still exists and (in the early days) was the only kind that was known to exist. That is, early researchers were consciously attempting to reproduce the processes of human thought.

Although that strand of AI research ran into some dead ends that blunted its early promise, research into sentient machines hasn't been hampered by ethical concerns. It's just been overshadowed by statistical methods.

Why do you think we should abandon all research into creating sentient machines?
 

Easily available space travel at high fractions of c

Easy way to fuck the planet up is to point a spacecraft at it and let it hit at relative velocity
Here's what just a single bullet at .25c would do:

http://ask.metafilter.com/178421/Effect-of-a-small-mass-at-near-light-speed-in-atmosphere

Now warping space to get around the c speed limit via an Alcubierre drive is fine and dandy, just don't ACTUALLY have an easy way to propel objects to significant fractions of c that just anyone could get their hands on
Unless "trickery" is involved that violates the laws of physics: gravity aside, you have to spend that much energy to gain it. So that technology would already be capable of achieving that kind of destruction via other means. I suppose thousands of perfect orbital slingshots could get an object to some fraction of it.
 
Not quite what I meant. Going beyond a certain point in the development of sentient machines would pretty much be setting yourself up to be replaced,
Well I'm a father, so I kinda get how that whole thing works. Cool.

and there's also the issue of bringing something self-aware into the world for the pure purpose of servitude. But for your point, they touched on that in a game called SOMA, where you repeatedly upload a scan of a guy's brain that thinks it's real and keep wiping it from existence when it has a mental breakdown. Your character takes issue with this, but your buddy character is like "nah, it's fine. Robots."
Why would we want to condemn sentient machines to servitude? Well, perhaps we're evil, in which case being replaced by our robot slaves might be a moral improvement.

But I notice that you're just exploring worst cases here, whereas I thought the idea was that the creation of sentient life absolutely must not be researched.

I think it's rather more nuanced than that. We're biological, we're evolved and we probably can't ever traverse deep space by some faster-than-light space opera magic. It might be a good idea to research the creation of a successor species that would share our curiosity but fit the form factor and time constraints needed for deep space travel. Or maybe not. But the argument can be made.
 
because you dont have a right to bring something into an existence where it can suffer simply for your own benefit

basically that's just going to be like making the pass-butter robot
Well that's a good argument against child slavery. But would creating artificial children imply an obligation to mistreat them? If we can treat them well, then does your objection not evaporate?
 
Well that's a good argument against child slavery. But would creating artificial children imply an obligation to mistreat them? If we can treat them well, then does your objection not evaporate?
you can treat them as well as you want, but you simply have no way to guarantee that any one existence is going to be free from suffering. if you had control over every variable dictating whether or not a sentient being would really be free from distress 100% of their lives, then i question what the point was in trying to generate sentience in the first place since you'd basically have to manipulate them 24/7, i feel
 
You're not thinking hard enough then.
Well

a) Immortality is widespread - complete resource depletion quickly ensues as the human race is unable to reach an equilibrium of growth.

b) Immortality is restricted - a social divide would form, and wars would surely follow. 'Who decides who gets to live forever?'. The best case resolution of such wars would be a).

Is there a c?
 
Knowledge in itself is neither good nor bad; all that matters is how and for what purpose we use it.

Having said that, if i had to choose one piece of knowledge to be forbidden, it would be the differences in IQ and/or cognitive ability among people from different geographical and cultural backgrounds.

Such differences most likely exist, but i don't see any social benefit in knowing them as a hard fact, and the knowledge could be used as a segregation tool by many.
 
Of course we aren't. But without knowledge we will never know.

Everything we are, everything we can be is directing us to our ultimate destiny. The proliferation of humanity across the stars.
That's not our ultimate destiny, but only a midpoint goal. Our ultimate destiny is most likely extinction.

because you dont have a right to bring something into an existence where it can suffer simply for your own benefit
...are you a vegan?

heck, if you go with just someTHING, that'd extend even to plants. The stance is nonsense.
 
Unless "trickery" is involved that violates the laws of physics: gravity aside, you have to spend that much energy to gain it. So that technology would already be capable of achieving that kind of destruction via other means. I suppose thousands of perfect orbital slingshots could get an object to some fraction of it.
slingshots only work to a point; eventually you're coming and going so fast that the planet doesn't hold you in its sphere of influence long enough to pull on you and bend your trajectory much
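The "thousands of perfect slingshots" guess can be put to rough numbers. In the idealized limit, a single flyby adds at most twice the planet's heliocentric orbital speed to your own; assuming Jupiter's ~13 km/s orbital speed as the assist planet, reaching 0.25c this way would indeed take thousands of perfect assists, long before the too-fast-to-deflect problem above kicks in:

```python
import math

C = 299_792_458.0                   # speed of light, m/s
v_target = 0.25 * C                 # ~7.5e7 m/s

v_jupiter = 13_070.0                # Jupiter's mean orbital speed, m/s
dv_max = 2 * v_jupiter              # ideal upper bound on gain per flyby

flybys = math.ceil(v_target / dv_max)   # ~2,900 perfect slingshots
```

In practice each assist gains far less than the ideal 2v, and as the earlier post notes, the bend per flyby shrinks as your speed grows, so the real number would be much larger.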
 
was pretty clearly having a conversation talking about sentient robots, not (non sentient) plants, so you can quit with the random bullshit that i didn't say thanks
These are natural developments of your stance. And i see you ignored that we already do this to animals.

I'll quit when i want, b. Same as you.
 
I think the creation of life itself is one. I don't think we will ever get there, but it would be horrible.

More pressingly, genome editing. Once that Pandora's box is truly opened, Gattaca will look like paradise. I'd argue advanced AI and machine "intelligence" (not just doing what it's programmed to do) are extinction-level threats.

Finally, I'm going to say that if we discover that there are 12th dimensional beings and we aren't the first in the universe but rather the lowliest, restricted by time and woefully aware of it, able only to relate to them as an ant would to us, totally unimportant to truly intelligent species in every way... well, that would be catastrophically depressing. A "what's the point" scenario.

Edit: If plants feel pain and fear.
 
If we can reproduce and never die anymore, humans will grow like a disease on the whole universe.
Why do you think humans growing out into the universe are a 'disease'?

On this planet we're destructive to other forms of life, but if we assume there's no (or very little) life out there, what are we a disease to? Barren asteroids, moons, and planets? I don't think they'll mind.
 
That sounds more like a problem for the humans than for the universe, to be honest.
Yes, but we would take it with us to the end.

Why do you think humans growing out into the universe are a 'disease'?

On this planet we're destructive to other forms of life, but if we assume there's no (or very little) life out there, what are we a disease to? Barren asteroids, moons, and planets? I don't think they'll mind.
As far as I remember, nothing is infinite. As long as we keep growing indefinitely, we would need more and more energy, more and more space, more and more resources. After a certain amount of time humans would eventually reach a critical state where there wouldn't be anything left, since I don't recall the possibility of energy being created from nothing. Thus, we would end the universe as we know it.

Obviously, this is just a layman's opinion on the subject.
 
As far as I remember, nothing is infinite. As long as we keep growing indefinitely, we would need more and more energy, more and more space, more and more resources. After a certain amount of time humans would eventually reach a critical state where there wouldn't be anything left, since I don't recall the possibility of energy being created from nothing. Thus, we would end the universe as we know it.

Obviously, this is just a layman's opinion on the subject.
From what I understand the universe comes to an end with or without the intervention of a bunch of horny immortal humans. I am not sure how a universe that dies untouched and unwitnessed necessarily has any more moral value than one with humans crawling all over it.
 

we're kind of backlogged at the moment, seeing as we're just discovering how poorly we've handled knowledge of the combustion engine. Who knows what other surprises await us?

honestly, our ability to measure navigational latitude spelled disaster for multiple cultures

jury is still out on the atomic bomb

time travel is still the GOAT thought-experiment IMO
 
you can treat them as well as you want, but you simply have no way to guarantee that any one existence is going to be free from suffering. if you had control over every variable dictating whether or not a sentient being would really be free from distress 100% of their lives, then i question what the point was in trying to generate sentience in the first place since you'd basically have to manipulate them 24/7, i feel
So we shouldn't ever have kids?
 
Well I'm a father, so I kinda get how that whole thing works. Cool.



Why would we want to condemn sentient machines to servitude? Well, perhaps we're evil, in which case being replaced by our robot slaves might be a moral improvement.

But I notice that you're just exploring worst cases here, whereas I thought the idea was that the creation of sentient life absolutely must not be researched.

I think it's rather more nuanced than that. We're biological, we're evolved and we probably can't ever traverse deep space by some faster-than-light space opera magic. It might be a good idea to research the creation of a successor species that would share our curiosity but fit the form factor and time constraints needed for deep space travel. Or maybe not. But the argument can be made.
I'm exploring worst case scenarios because we're talking about the hard stop line. Going past that line is the only point where those worst cases become an issue. If you're gonna put a replacement species into effect, you'd have to do it after we're all dead, like the sack people from 9. At no point would we be willing to share with another species on our level. There are people who can't even handle having other people on their level without throwing a fit.
 
How to make nukes.

They have no use for us, and even if we were to try to fend off an asteroid, it's far better to nudge it off course gradually through other means.
Have you noticed how the early 20th century is defined largely by absolute slaughterfests like WWI and WWII? Have you noticed we haven't seen anything on that scale since nukes came on the scene?

There's a reason for that.