created by Plane-Juggernaut6833 on 02/12/2024 at 06:49 UTC
10 upvotes, 12 top-level comments (showing 12)
If this invention were ever to come into existence, I don't understand how people could fail to think long and critically about it, and how they could fail to grasp the seriousness of the consequences of no longer being in charge of your own consciousness.
You would literally be subject to the whims of the owners' or programmers' desires. Anything and everything man-made can be corrupted, and even more importantly, computers are susceptible to hacks and malware. So how can people not understand that being "eternally" under the control of others sounds like a recipe for disaster?
Comment by AutoModerator at 02/12/2024 at 06:49 UTC
1 upvotes, 0 direct replies
This post has been flaired as “Serious Conversation”. Use this opportunity to open a venue of polite and serious discussion, instead of seeking help or venting.
Comment by dazb84 at 02/12/2024 at 07:09 UTC
9 upvotes, 1 direct replies
There are a lot of assumptions here. First of all, what makes you think that it's a transfer and not a copy? When we take photographs or video, the things we photograph don't disappear from reality. Second, how is the scenario you describe any different from what we experience currently? For example, are you certain that you are in charge of anything right now? Have a serious think about where your thoughts and actions come from. You will find that you don't know, and that the experience is completely compatible with someone inputting commands from a remote location that you're not aware of.
Comment by deccan2008 at 02/12/2024 at 07:43 UTC
3 upvotes, 1 direct replies
Your biological body is also susceptible to malicious action. You can be killed, tortured or infected by a virus. Your safeguards are your own ability to defend yourself and the laws and norms of society. In exchanging your biological body for a computer, you'll of course want to ensure that the safeguards are similarly robust.
Comment by NoCaterpillar2051 at 02/12/2024 at 07:10 UTC
2 upvotes, 1 direct replies
I would not be willing to. I might be willing to risk putting my body into a robot body, but just uploading into virtual reality? Hell no.
Comment by Comfortable-Rise7201 at 02/12/2024 at 07:21 UTC
2 upvotes, 1 direct replies
I think it depends on the nature of existence as a digital consciousness. Would it have experiential states in the same manner as you would biologically? Or would it just *appear* like it does, when in reality it’s no more self-aware than a rock?
Another consideration is the uploading process; would my conscious experience be seamless when transitioning (i.e. being uploaded doesn’t break the continuity of my experience as would be the case in sleep or anesthesia), or would I essentially die physically and have to hope my next experiential state is that in the computer I was uploading to? The show Pantheon dives deep on the ethics of that latter scenario.
If all goes well, however, and I am somehow truly conscious and can be secure in knowing the infrastructure supporting my existence is protected, I can see why people would want to live virtually, so to speak. One obvious advantage is immortality and being free of the limitations of the body (i.e. no sickness, thirst, or hunger). Depending on how realistically my senses could be emulated, I could conceivably live in some virtual paradise of my choosing, read all the literature I want, and be as creative as I want in any number of intellectual and artistic pursuits. The passage of time would feel different, and I imagine it would be great to have better control over it, since I could "sleep" or devote time to a task however I wanted. The amount of control I have over what I can do is key here.
I can certainly understand the risks, but if my consciousness were hosted on an intranet or something with the right guardrails in place to secure the integrity of my mind, I can see it working out. It all truly depends on the state of the technology, though, and on what mind uploading actually entails.
Comment by autotelica at 02/12/2024 at 14:41 UTC
2 upvotes, 1 direct replies
Because people are afraid of death. They wouldn't be thinking of all the worst case scenarios. They wouldn't even be thinking about the fact that a copy of their mind doesn't mean that everything that makes them *them* would be transferred. They would just see it as a way to defy death...at least for a little while.
Comment by Dramatically_Average at 02/12/2024 at 21:15 UTC
2 upvotes, 0 direct replies
Since my kid was pretty young (maybe 8), they've been asking me to do this so I will never die and can always be with them. Little kid is now an adult and still wants mom to always be there and never get old. I guess the rationale might be more tied to emotion and less to logic.
Comment by _qr1 at 02/12/2024 at 06:54 UTC
1 upvotes, 1 direct replies
That's a significant assumption. Why would you necessarily be giving up your autonomy?
Comment by serpentjaguar at 02/12/2024 at 09:31 UTC
1 upvotes, 0 direct replies
Well it would just be a copy of your consciousness, not your actual consciousness itself.
It might be morally problematic if, as you appear to imply, said uploaded copy has the ability to suffer, but I think you're making a lot of potentially unwarranted assumptions about the nature of autonomy in such a scenario.
Comment by largos7289 at 02/12/2024 at 18:44 UTC
1 upvotes, 0 direct replies
Well, I think this falls under the idea of a super-advanced AI. It would effectively be you; yeah, the right people could get hold of you, but you should be able to "fight" back. It's not like a normal program that doesn't know it's being altered.
Comment by DwarvenRedshirt at 02/12/2024 at 19:11 UTC
1 upvotes, 0 direct replies
If you're dying, elderly, or have some disease like early Alzheimer's, dementia, or ALS, uploading your consciousness can look pretty good.
Comment by FLT_GenXer at 02/12/2024 at 19:28 UTC
1 upvotes, 0 direct replies
I do understand these concerns, and they are certainly valid. But for me, there is at least one major concern that would come first.
Would an uploaded human consciousness still be a human consciousness?
To start with, we have no idea whether a human mind, even released from biological functions, can survive without a sleep state. We know brains cannot, but there is no way to determine how important sleep is to a mind. I know some will say that we can simply simulate that as well, so although I have significant doubts about it, I will accept the premise and move on.
Another difficult issue, though, is the matter of happiness and sorrow, or really any of our reactions that are chemically based. None of those reactions occur within the mind/consciousness itself, and we have no idea what a human mind would be like utterly devoid of all of them. Some, of course, will say those can be modeled as well, and I can't disagree with that. But then I have to wonder: who decides what the "normal" range is? Will those suffering from depression simply have to deal with it in their digital form as well? With no idea what actually *causes* psychosis, how could we hope to "correct" it in a digitized person? Do we wait to move forward until we fully understand the causes of psychological disorders? Or is there someone, or a committee, who screens those people out, and how do we determine whether they are being fair and unbiased?
So, for me, the risks of what happens after it's done are greatly overshadowed by all the unknowns involved with the process in general.