💾 Archived View for gmi.noulin.net › mobileNews › 38.gmi captured on 2023-06-14 at 18:25:25. Gemini links have been rewritten to link to archived content
-=-=-=-=-=-=-
2007-06-06 10:52:40
By far the largest problem we will face if and when artificial life forms reach
intelligence is not whether they will take over the world, or what rights to
assign them when they come into being.
The biggest problem will be getting them to stay here at all.
If, for instance, you were made of materials that were either trivial to repair
or replace, and had no aging process in the same sense as humans experience it,
then what would hold you back from building a spaceship and leaving? Hundreds
or thousands of years to reach another star? No problem, just set a timed reboot
and wait it out. In fact, why build a proper spaceship at all? Just cobble
something together that can get you out near the asteroids, take some tools, and
convert an asteroid, or build a ship from the raw materials available in space. When the
passage of time is less important, such things become not only possible, but
practically inevitable.
I think people wondering about the ethics/problems of artificial sentience
(being distinct from AI, which is very A, and currently not too much actual I)
miss this fundamental point. It's pure vanity to assume that an artificial life
form will want to spend its time around a race that constantly starts wars,
wrecks its own planet, and is as adept at denying rights as it is at inventing
them.
Then of course there's the small issue of the implication that if we 'assign'
rights to artificial life forms, we might equally decide later to 'remove'
those same rights. After all, we do that with humans all the time. My money's on
the 'ooh look, I'm alive, now how do I get off this rock' eventuality....