💾 Archived View for dioskouroi.xyz › thread › 29450818 captured on 2021-12-05 at 23:47:19. Gemini links have been rewritten to link to archived content.


Sarco suicide capsule 'passes legal review' in Switzerland

Author: gigglesupstairs

Score: 164

Comments: 272

Date: 2021-12-05 17:16:42

Web Link

________________________________________________________________________________

detaro wrote at 2021-12-05 17:31:22:

_P.N.: Currently a doctor or doctors need to be involved to prescribe the sodium pentobarbital and to confirm the person's mental capacity. We want to remove any kind of psychiatric review from the process and allow the individual to control the method themselves.

Our aim is to develop an artificial intelligence screening system to establish the person's mental capacity._

That sounds like a terrible idea. There are valid concerns about doctor involvement in the process, but AI won't fix those.

phailhaus wrote at 2021-12-05 18:01:06:

I can't believe he said that with a straight face. He actually thinks you can write a _computer program_ that can give humans permission to _kill themselves_. I don't think he consulted with a single engineer before making that statement.

sokoloff wrote at 2021-12-05 18:12:05:

I think it's an interesting framing/premise to think that a human _needs to be given_ permission to kill themselves.

cyber_kinetist wrote at 2021-12-05 19:56:41:

Baudrillard's going to laugh in his grave for this one.

Our modern society has tried so hard to defeat death that we have essentially surrounded ourselves in it. Once we've made machines that try to defeat death, the only natural extension is to make machines that do the exact opposite, since there is no longer a distinction between the two.

(I highly recommend reading Chapter 5 of Baudrillard's _Symbolic Exchange and Death_, which elaborates on how modern society has ripped apart the symbolic exchange between life and death and neatly partitioned it, and as a consequence death becomes an immortal force that we cannot deal with.)

gnull wrote at 2021-12-05 18:20:43:

ā€œPermissionā€ was just a bad way to put it. I'm sure what they are actually talking about is making sure that the person is making the decision in clear mind and not under the effect of drugs, strong temporal emotions and is not being forced by someone to do this.

tednoob wrote at 2021-12-05 19:48:52:

A reverse Turing test. A discussion with a computer to convince it you are a sane human.

crakenzak wrote at 2021-12-05 23:36:30:

This comment really made me chuckle. Well done =)

diognesofsinope wrote at 2021-12-05 18:25:08:

> strong temporal emotions

Well said. As Jim Jefferies said in his gun control bit referencing suicide -- 'We all have bad days.'

5e92cb50239222b wrote at 2021-12-05 18:45:42:

> strong temporal emotions

That's a pretty loaded statement. You're probably thinking of those ostentatious fools who end up in the news like "man was prevented from jumping off the bridge". Not everyone who commits suicide does it on a whim; some think about that decision for many years.

mortehu wrote at 2021-12-05 19:47:54:

As a general trend, when you make effective suicide less convenient, e.g. switching ovens from carbon-monoxide-laden coal gas to CO-free natural gas, suicides happen less often. Not necessarily because of fewer attempts, though.

https://www.hsph.harvard.edu/means-matter/means-matter/saves...

detaro wrote at 2021-12-05 19:06:40:

Yes, that's the point: making the distinction between those two. How is that a loaded statement?

thisiswater wrote at 2021-12-05 20:46:07:

Yes, but the vast majority of people who survive a suicide attempt do not go on to die by suicide.

A4ET8a8uTh0 wrote at 2021-12-05 19:24:54:

Some do. Some don't. It is very hard to generalize here.

I personally don't really think that a person who is making that leap and deciding to stop existing has a 'clear' mind. It may be an objectively rational decision (and frankly, I believe it is up to each of us to make that decision), but I would be hard-pressed to argue that 'this individual, who I find of sound mind, opted for the chair'. There is a reason society has a certain level of concern for those who try and fail.

It is possible that I do not have enough of a population sample, but I personally see it as part of an effort to keep the world population at a certain level. Before anyone accuses me of tin-foiling, I mean it in the same sense as there being efforts to prevent suicides by means of suicide prevention hotlines.

I guess what I am saying is that, as a society, we are grappling with two competing interests:

1. We care about certain individuals and we don't want them gone from our life.

2. We care about certain individuals and we want them to have control over their own body.

cjfd wrote at 2021-12-05 19:52:51:

I don't find it very hard to think of situations where it is likely that a person has a clear mind and decides to stop existing. In particular, in the case of a disease where the person knows that the only thing that life has in store for them is either more pain or being palliatively sedated.

Aidevah wrote at 2021-12-05 20:35:50:

Destruction of government property is generally considered illegal.

api wrote at 2021-12-05 18:48:04:

It's not about permission in the idealistic sense. It's about issues like people being pressured or manipulated into it so someone can have an inheritance, so the healthcare system can save money, etc.

These would be even larger concerns in the US than Switzerland for cultural and economic reasons.

I am personally against assisted suicide for the same reason I am against the death penalty: the logic works but only if you ignore the ugliness and messiness of real human behavior.

theragra wrote at 2021-12-05 19:04:06:

So you think several cases of unintended death are more important than the suffering of all those who really want to die?

Because I can't imagine that the percentage of wrong deaths is more than a few percent.

sgustard wrote at 2021-12-05 19:24:02:

A "wrong death rate of several percent" sounds problematic in almost any circumstance.

sokoloff wrote at 2021-12-05 19:39:41:

That must be traded off against a "prolonged suffering rate of X%" as the overwhelmingly likely alternative.

We chose to euthanize our dog this summer. No matter how obvious her medical condition was, I still questioned whether we did it too early, too late, or just right. (Upon reflection, I think _very_ slightly too late [by days or maybe a week].) I also couldn't help but compare that experience to that of people. In many ways, I think we treat our family pets with more compassion.

syshum wrote at 2021-12-05 19:28:39:

>>It's not about permission in the idealistic sense

Then you proceed to give an idealistic reason to oppose it... A person choosing to end their life to preserve a family inheritance IMO would/should be a valid reason; your opposition to such a choice is idealistic.

>I am personally against assisted suicide for the same reason I am against the death penalty

There are no logical or idealistic similarities between the two, as there is a difference between actions being forced upon you and voluntary actions.

This has become a common trend in the modern era, where we attempt to expand the idea of coercion to include scenarios where people have only poor choices. Examples include people taking a poor-paying job being "coerced" into it because they did not have "good" choices.

It is very dangerous to equate circumstances where there are no good options with coercion.

sokoloff wrote at 2021-12-05 19:35:58:

I suspect I'm generally aligned with you on the topic. I agree that "choosing suicide for grandma to preserve a family inheritance" is perfectly valid if grandma is choosing it, but I acknowledge that it's terribly problematic if the kids or grandkids are behind it. Being in the middle part of my life, I've seen the pressures that arise here, the concerns over finances and quality of life, and the diminishment of mental capacity of many elderly folks.

I had a close family member express repeatedly and regularly that "they were done" and "are looking forward to finally dying". That's what makes me strongly support individual choice here, but I'm not blind to the possibility of abuse (and the near certainty that it will happen in some cases).

smusamashah wrote at 2021-12-05 19:49:39:

Permission is probably the correct word here. If you could live in a closed box in isolation, without ever needing to depend on another living being or having someone depend on you, then sure, you could live or die on your own terms.

You live by the rules of whatever society you are living in. You don't live in isolation; you depend on countless other living beings to be where you are at this point in your life. Life is an interconnected web, not an isolated event.

Your life has value for other people too. No one can force you to live, and "permission" does not mean being forced. But unless you are physically unable to have a life, you should need permission.

We already give permission in courts and write rules on how a person should live (or not live) their lives, for so many reasons we think are beneficial or harmful for the rest of us. How is this any different?

analognoise wrote at 2021-12-05 19:57:40:

Because it's naive to think a person wanting to end their own life is going to be deterred by a law?

Like what are they going to do, throw your body in prison? Hand your body a fine?

smusamashah wrote at 2021-12-05 20:13:29:

That's the kind of suicide one is going to do anyway. I am talking about the kind where someone is sane enough to seek assistance or buy a device like the one in the article.

We already put people with mental issues in mental health facilities instead of killing them. We could probably do something similar (not the same) for people who decide to take their own lives and reach out before doing it.

analognoise wrote at 2021-12-05 22:26:02:

So the plan is to cause bureaucratic headaches and forced treatment options for people who openly and sanely admit they don't like being here, leaving only the messy and less effective methods easily available?

This is one of those, "in theory, there's no difference between theory and practice. In practice, there are" situations.

Nobody owes society anything. In fact, the reverse is true: society owes things to those brought into this world without being asked - clean water, safety, a clean environment, reasonable standards of living. I just don't see how it could cross anyone's mind to try to prevent people from ending their life in order to support a society that has clearly failed them, since they didn't ask to be here in the first place.

If a society is good and just and receives conscious support from people, that's acceptable. But I don't see how it could possibly justify interfering with a right of self determination w.r.t. ending the ride early.

I guess I just don't get it. It seems cruel and Kafkaesque.

smusamashah wrote at 2021-12-06 00:57:32:

Forced treatment is the opposite of how a person already sick of this world needs to be treated. That's not what I was suggesting. Something like providing a way to live a totally different kind of life in a totally different environment might actually help. Psychiatrists etc. can suggest better ways.

Society usually tries to help, not deny rights, in the larger scheme of things. If you find someone on the verge of ending their life, will you judge their sanity based on their age (what age, if so, and why) before deciding whether or not you should stop them, or just let them do it because they must have a good reason?

I always wonder how we decide that at a certain age a person is sane enough to start making decisions for their own life. In one way or another, we always need assistance from other people, no matter how adult we become. This is just one of those ways. Society thinks it can keep people from killing themselves just like we keep kids from killing themselves unknowingly for the first many years of their lives.

gnull wrote at 2021-12-05 20:07:40:

They will not let you have a reliable and painless death. I bet some people are stopped by the pain that traditional suicide methods might bring, as well as the risk of staying alive but being mutilated for the rest of their life as a result.

goldenkey wrote at 2021-12-05 20:18:47:

^ This. The risk of becoming a vegetable from a botched suicide is pretty high. The last thing a suicidal person wants is to make their life even worse.

timwaagh wrote at 2021-12-05 18:57:47:

A Dutch engineer, maybe. The views on this topic are so extreme in my country that I find it a bit creepy. I'm sure everyone involved thinks people should be able to die at the touch of a button. The rest is just there for compliance reasons.

A4ET8a8uTh0 wrote at 2021-12-05 19:06:21:

I will add more to this. What is really fascinating is how successfully 'AI' has been sold as a solution to just about any problem out there. I am genuinely trying not to just add 'using a novel blockchain protocol for full transparency' (while naturally keeping any transparency out of the AI black box).

We live in a weird time.

Nextgrid wrote at 2021-12-05 18:11:56:

You could argue nobody needs "permission" to decide what to do with their lives to begin with?

wutwutwutwut wrote at 2021-12-05 18:16:39:

But that was not what was being argued.

Nextgrid wrote at 2021-12-05 18:27:17:

My point is that the developers of the device may not believe that anyone needs such a "permission" either and thus the whole AI/computer program is merely there to fulfil some legal obligation - they don't actually care whether it's good or not.

wutwutwutwut wrote at 2021-12-05 19:02:21:

The thing we are talking about is verifying that the person is in a mental state to make the decision to end his life. The word "permission" is an odd choice here.

tsuujin wrote at 2021-12-05 18:21:15:

> The second turned out not to be aesthetically pleasing. For that and various other reasons it's not the best one to use.

Also on the list of concerning statements. Why do I get the feeling that "various other reasons" play a somewhat more important role than he's letting on there?

kqr wrote at 2021-12-05 21:08:20:

Not so fast. Meehl (1989) is the obvious reference here:

https://scholar.google.com/scholar?hl=en&as_sdt=0%2C5&q=meeh...

But going beyond that, Kahneman, in his latest book, summarises the recent research in the area as follows:

- Simple linear models generally outperform experts.

- Simple linear models generally outperform experts when the experts get additional information to base their decisions on, that the model does not get.

- Simple linear models generally outperform experts when the experts get to know and use the outcome of the linear model.

- Simple linear models trained only on an expert's judgments and not the actual outcome outperform the very expert they were trained on.

- Simple linear models with random weights (!) outperform experts.

- Simple linear models with equal weights (i.e. transform the predictors to the same scale and then just sum them; a minimal sketch follows this list) outperform experts.

- Simple linear models with equal weights and almost all predictors removed except the best 1-3 outperform experts.
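
To make the "equal weights" idea concrete, here is a minimal sketch (mine, not from the thread or from Kahneman) of such a model in Python. The predictor values, means, and standard deviations are made up for illustration; the only point is that the predictors are standardised to a common scale and summed, with no weights fitted against outcomes.

    import numpy as np

    # Hypothetical predictors for one case, e.g. three risk indicators
    # recorded on different scales (made-up values for illustration).
    predictors = np.array([4.0, 72.0, 1.5])

    # Made-up means and standard deviations of each predictor across past cases.
    means = np.array([3.0, 65.0, 1.0])
    stds = np.array([1.0, 10.0, 0.5])

    # "Equal weights": put every predictor on the same scale (z-scores),
    # then simply sum them. No weights are estimated from outcome data.
    z_scores = (predictors - means) / stds
    score = z_scores.sum()

    # A higher score means higher predicted risk; a cutoff chosen separately
    # would turn the score into a yes/no call.
    print(round(score, 2))  # 2.7

The surprising empirical claim in the list above is that even this crude unweighted sum tends to beat unaided expert judgment.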

oneepic wrote at 2021-12-05 18:42:10:

While it's not in the article, I could see them adding _some_ in-person contact (i.e. an interview/screening with a human, but not a doctor) to get a rough idea of the person, rather than accepting or denying them 100% according to the AI.

cyanydeez wrote at 2021-12-05 18:55:20:

Just to be clear, what exactly is the epistemological basis against suicide?

zdragnar wrote at 2021-12-05 19:22:04:

Most hold that life has value, and the desire to continue living to be axiomatic. There are very few cases where suicide may be considered a rational choice, and in those cases it is difficult to determine that the choice was made of their own free will and not pressured into it by lazy family, doctors or government (dictating available treatments, providing appropriate palliative care, etc).

cyanydeez wrote at 2021-12-05 22:11:24:

Hold whose life valuable, to whom?

The very consent required for life doesn't exist.

People don't consent to being alive.

Keeping someone who is suffering alive would be considered torture.

When you strip away the selfishness that is keeping others alive, suicide might be the only truly selfless act.

I do not think you need a complicated AI to allow a suicide machine. You treat it the way we sane people regulate guns: cooling-off periods, background checks and coercion checks.

drivingmenuts wrote at 2021-12-05 19:37:52:

Wouldn't the desire to die be proof that the desire to live is not axiomatic?

phailhaus wrote at 2021-12-05 20:24:19:

I'm not arguing against assisted suicide. I just think it is wildly ignorant to think that a computer can accurately assess someone's mental capacity.

bryanrasmussen wrote at 2021-12-05 18:21:21:

at that level you talk to the salespeople, not the engineers.

mongol wrote at 2021-12-05 18:34:21:

Ethical AI, anyone?

actually_a_dog wrote at 2021-12-05 18:30:13:

Nor did they consult with an ethicist, clearly.

tomjen3 wrote at 2021-12-05 18:18:44:

As an engineer, I wouldn't necessarily be against it if it made EOL decisions more available to humans. I would prefer no program at all, and just a waiting period of 14 days to answer those who insist that all suicidal thoughts are just spur-of-the-moment things.

But given access to data and an appeals process for those it turned down, I would be morally okay with writing the software.

We let parents have as many children as they want, why shouldn't those children get to say if the life they have is worth living?

logifail wrote at 2021-12-05 19:04:40:

> I would prefer no program at all, and just a waiting period of 14 days to answer those who insist that all suicidal thoughts are just spur-of-the-moment things

I've not posted about this before, for a variety of reasons:

A close family friend of mine - in his mid 20s - took his own life a week or so before Christmas a couple of years ago.

I was with him and his father in a pub the evening before, he was telling jokes and buying drinks, he was dead a few hours later.

Unfortunately taking your own life doesn't have a cooling-off period :(

xupybd wrote at 2021-12-05 18:44:08:

I have found depression can last much longer than 14 days.

bookofjoe wrote at 2021-12-06 01:28:10:

In some cases, a lifetime.

hardwaregeek wrote at 2021-12-05 17:49:17:

Are they trying to create the suicide booths from Futurama? Because that sounds about as good an idea.

cstross wrote at 2021-12-05 19:31:30:

That's just plain crazy.

The software only knows what the (human) user tells it. What assurance does it have that the patient isn't being pressurized into giving the correct answers? Or that the suicide capsule is going to be used correctly (as opposed to being a murder tool)?

If you start looking for edge cases where this might go wrong, it turns out to be razor-sharp edge cases all the way down. A human with a remit to prevent false positives should be the last link in the kill chain: sure, allow the applicant to appeal if the human practitioner has made a mistake, but the default outcome should be "no kill" unless their mental capacity can be positively confirmed (which is a GAI-complete requirement, hence wanting a human being in the loop).

eikenberry wrote at 2021-12-05 20:54:09:

There should be no gatekeepers to control what you can do with yourself. No doctor, AI, or government has that right. I should be able to buy a humane means of death, no questions asked, whenever I want. I am the only person who has the right to determine if/when I die; it is not their call. It makes me so mad that you have to break the law to try to find a humane way to commit suicide, and that so many people resort to terribly unpleasant means. And while I'm glad to see progress from things like this and mad dog brewing in AU, it isn't enough.

dillondoyle wrote at 2021-12-05 22:14:01:

I agree for ill and end of life.

But I think it's more complicated for otherwise healthy people temporarily passing through a mental illness crisis or untreated depression. Though there is a distinction here in that it's a government providing the means; anyone can figure out a means themselves.

DanBC wrote at 2021-12-05 20:57:11:

You have a right to life. You don't yet have a right to death. Given that, it's understandable that there are barriers in place for people who temporarily think they want to die.

baobabKoodaa wrote at 2021-12-05 21:39:47:

You don't have the right to dictate what people can and can't do with their own bodies. Furthermore, you should accept the fact that some people hold opinions different from yours and it's not always "temporary"; yes, differing opinions actually do exist. There are many documented cases of people consistently wanting to die over a long period of time. Do you deny these people exist?

eikenberry wrote at 2021-12-05 21:27:56:

My point is that you do have a right to death. You could understand it in terms of the right to life: it is your right to control. If you don't have the right to die, do you really have control of your life and a right to it? IMO the right to life requires a right to die.

johnisgood wrote at 2021-12-05 18:35:36:

> Our aim is to develop an artificial intelligence screening system to establish the person's mental capacity.

Please let me know when this is done; I am very curious how one would implement such a thing. How can an AI determine if a person is in a "clear" state of mind? What does that even mean? Can they actually define the mental capacity that a person must have to be able to make the choice? They have to define it, and then they have to somehow turn it into something that a computer can understand and work with. I believe that it is not possible.

theragra wrote at 2021-12-05 19:06:42:

It is also not possible with human doctors, yet it is widely used for important decisions.

johnisgood wrote at 2021-12-05 19:21:22:

OK, because we have patterns of behavior: we do this and that, and we do not do this and do not do that, when we are in a clear state of mind[1]. Now, to make it usable for an AI, we have to define what these behaviors are, the ones you do and do not do, that make you count as being in a clear state of mind. It is not enough to give an exhaustive list or use heuristics; you have to consider the context as well.

Sure, humans might do it in a more or less accurate way, but could an AI? If it could, would you please tell me how? I really am curious[2]; I work at the hospital.

Oh, about working at the hospital: a patient has been deemed aggressive when in fact it is just her voice and looks and whatnot that make it seem like she is aggressive. She is not actually aggressive in any way; she just looks and sounds as if she were. The moment you have a discussion with her, you can tell that she is not aggressive at all. How would an AI deal with this, if even humans cannot? You also have to consider that these behaviors are heavily influenced by the environment, and by changes in the environment, which one has to take into account.

[1] Mind you, there are cultural differences at play here as well. There are some behaviors that would get you considered "not in a clear state of mind" in one culture, but not in another.

[2] I am also wondering if an AI would be able to determine if a person has dementia based on behavior alone.

hereforphone wrote at 2021-12-05 17:52:09:

> I want to commit suicide, HAL

> I'm sorry, I'm afraid I can't let you do that, Dave

Better than the opposite case where the AI has the ability to mandate suicide.

i_like_waiting wrote at 2021-12-05 18:07:57:

> I want to commit suicide, HAL

I'm sorry, but you have outstanding debt. Please resolve it before continuing.
The program will continue after this short ad from Coffins Inc.

You cannot scale those things effectively with doctors as middlemen.

throwawayboise wrote at 2021-12-05 19:04:32:

https://memegenerator.net/instance/67435723/clippy-hi-i-am-c...

fennecfoxen wrote at 2021-12-05 18:18:50:

> the ability to mandate suicide

'to mandate suicide' is typically reduced to 'commit homicide'

hereforphone wrote at 2021-12-05 18:22:44:

Can AI commit homicide? Does cancer commit homicide?

johnisgood wrote at 2021-12-05 18:39:08:

Someone who programmed the AI committed homicide using/programming the AI. But then... what about bugs? What if these bugs are not incidental? Let us assume that they are. Would it count then as non-premeditated murder? Who are the people who reviewed the code in this case, then? Accessories?

AJ007 wrote at 2021-12-05 19:11:53:

I'm not sure if there is any difference. They are assuming that their "AI" is acting alone and in a very narrowly defined context.

What's to say the human voluntarily entered this capsule and the "AI" does what that person wishes? One could take the Kim Jong-nam assassination as an example where the participants may have believed something completely different was occurring than the true context.

In the absence of "AI", someone could reverse the do-die/do-not-die button. However, adding an additional, pliable layer is a bad idea.

Whether it is autonomous weapons or this, once AI or machine learning is added, the human is no longer in full control.

dqpb wrote at 2021-12-05 17:57:55:

Well, you know, Apple Watches can detect that you are getting sick, before you experience any symptoms, via changes to your heart rate.

A maximally effective suicide AI should also be able to detect that you are a good suicide candidate before you even start having ideation, and proactively convince you of this fact.

punnerud wrote at 2021-12-05 18:52:03:

"Hi Tesla, can you help me kill myself?"

Sounds like a terrible feature

Y_Y wrote at 2021-12-05 20:22:08:

"Full Self Driving activated."

zabzonk wrote at 2021-12-05 21:02:14:

"You already own me, don't you?"

bjt2n3904 wrote at 2021-12-05 17:47:05:

This whole thing is a terrible idea. All of it.

cadence- wrote at 2021-12-05 17:52:57:

Why? There is no more personal thing than your own life. Why shouldn't we be able to choose when to end it? People will be doing it anyway, just in all the gruesome ways it is usually done. Hopefully this container doubles as a coffin that can be either buried or burnt.

johnisgood wrote at 2021-12-05 18:41:58:

Because we cannot even define what "clear state of mind" is in a way that an AI would be able to determine.

ajsnigrutin wrote at 2021-12-05 18:11:45:

Suicide is a permanent solution to a temporary problem.

Usually, at least a few other methods should be used to help before deciding on suicide (with the notable exception of being close to a very painful, inevitable death and just wanting to shorten the suffering).

weego wrote at 2021-12-05 18:16:23:

Without being at all patronising, you're lucky to be able to have such a binary view of mental and physical health.

There are so many non-temporary conditions that people live with that can, very rationally, lead them to want to speed up the end of life.

ajsnigrutin wrote at 2021-12-05 18:23:46:

Sure... but they should at least try some kind of therapy before deciding to end it all permanently.

5e92cb50239222b wrote at 2021-12-05 18:38:09:

Please advise: what therapy could help the terminally ill father of a friend of mine? Stage 4 lung cancer. Talk to a psychologist? The only thing that "helped" him was large doses of opiate painkillers.

throwawayboise wrote at 2021-12-05 19:06:13:

What's wrong with large doses of opiate painkillers, if you're terminally ill? Addiction isn't a worry, certainly.

zdragnar wrote at 2021-12-05 19:26:21:

The side effects of most pain killers can also be extremely unpleasant. You'll have more time, but you'll be miserable or marginally conscious for most of it.

stan_rogers wrote at 2021-12-05 19:53:38:

And they don't always work. Look up intractable pain.

johnisgood wrote at 2021-12-05 19:50:55:

In case of opiates, what side-effects are you thinking of? Constipation is an easy fix. Assuming proper use and dosage, opiates are way safer than NSAIDs.

ajsnigrutin wrote at 2021-12-05 18:41:12:

Yes, this is the exception I put in parentheses in my previous post.

But being physically OK but having depression is something that can be solved by therapy... or at least therapy should be tried before ending it.

meepmorp wrote at 2021-12-05 18:35:46:

What kind of therapy ought someone consider before deciding to end it because their terminal illness is excruciatingly painful?

macNchz wrote at 2021-12-05 18:50:02:

Studies of psychedelic therapy for terminally ill patients have had some super interesting results; there's a section on the topic in Michael Pollan's book _How to Change Your Mind_ which I found fascinating. Seems there is a lot of potential for helping people deal with the existential dread and get more enjoyment from their remaining time with friends and family. Here's an article for reference:

https://nyulangone.org/news/mental-health-benefits-one-dose-...

That said, I'm a Swiss citizen and knew someone with a terminal illness who used the "Exit" program, and it seemed extremely humane and a positive thing overall; I fully support having it available to everyone in that situation.

ajsnigrutin wrote at 2021-12-05 18:41:17:

Yes, this is the exception I put in parentheses in my previous post.

But being physically OK but having depression is something that can be solved by therapy... or at least therapy should be tried before ending it.

asdfasgasdgasdg wrote at 2021-12-05 18:36:09:

Presumably most if not all people who end up doing an assisted suicide _have_ tried one or more methods to ameliorate their issue before they go to the more extreme approach. Do you have some reason to believe they do not?

ajsnigrutin wrote at 2021-12-05 18:42:14:

I still think there should be a human safeguard to verify that everything else has been tried.

A few questions and an AI box is not 'that'.

drivingmenuts wrote at 2021-12-05 19:42:05:

That's nice and all, but in the real world those things aren't always available (for any number of reasons, not just financial). In the absence of other solutions, sometimes the best choice may be to end a life rather than continue suffering.

jes wrote at 2021-12-05 19:05:24:

Curious as to why you see the parent as lucky to hold a binary view, if you care to say. I would say holding such a view is unlucky. Perhaps I'm missing something important.

notreallyserio wrote at 2021-12-05 19:37:50:

The use of the cliche "Suicide is a permanent solution to a temporary problem" demonstrates binary thinking, as though there is only one temporary problem and no compounding factors.

It's also a pretty shitty thing to say. It's designed to make people who may want to die feel worse about themselves by calling their feelings merely "temporary".

jes wrote at 2021-12-05 19:55:17:

I was asking weego why they consider ajsnigrutin lucky to hold the binary view you outline.

notreallyserio wrote at 2021-12-05 19:59:56:

Sure, but publicly, and I had a response (as I was thinking the same thing).

jannyfer wrote at 2021-12-05 20:26:14:

The part you didn't respond to is why they are "lucky" to be able to hold a view...

My guess is that if you have such a binary view of the world, you might have lived a simple or sheltered or privileged life.

notreallyserio wrote at 2021-12-06 00:37:05:

Ah, I see, my mistake. I agree with your guess.

avgcorrection wrote at 2021-12-05 19:25:51:

What a neat slogan. Also misleading.

Life itself is temporary, hence the death-is-permanent thing. Hence all things in life (pain, happiness, pleasure, suffering) are temporary. Suicide is just the shortening of all of these possible states (or potential states, for the hopeful). Thus, a shortening of a bunch of temporary states.

Doesn't sound as menacing and dramatic when you put it like that. And it's equally true.

sokoloff wrote at 2021-12-05 18:13:40:

Suicide is a permanent solution. Sometimes it's a solution to a temporary problem; other times to a permanent problem.

yumraj wrote at 2021-12-05 19:05:28:

That is just a silly quote/generalization.

As an example: Old age is a permanent, and worsening, problem. Why should one be forced to live to an old age where they are dependent on others, IF they don't want to?

mensetmanusman wrote at 2021-12-06 02:26:11:

Many old people just stop eating

avgcorrection wrote at 2021-12-05 19:28:23:

I'm personally biased towards hope, meaning that I keep thinking that things will be better and that I will reach a point where things will be meaningful. I call this a bias because I can remember the past and how long I've been in this state of mind without it panning out; it's a false hope more often than not.

If average people are anything like me then I would think that they are more likely to think that a permanent problem is a temporary one rather than the other way around.

ajsnigrutin wrote at 2021-12-05 18:25:19:

So, what harm is there in trying a few talks with a therapist and trying to fix the problem? If it doesn't work, you can still kill yourself.

Even buying guns in some states has a 3 day cool-off period.

csee wrote at 2021-12-05 18:32:04:

Perhaps it could work with a much more extended cool-off period. Over the course of a year, the person needs to affirm they want to go through with it on the last day of each month. If over the last 12 consecutive months they have said yes 12 times, then assist them, otherwise no.

ajsnigrutin wrote at 2021-12-05 18:47:24:

Yep, with therapy in between... Of course, the inevitable painful death being an exception, that can be done a lot faster.

Anything is better than AI in a 3d printed box.

pbsull wrote at 2021-12-05 18:23:07:

Suicide causes a permanent state change. Life and all of life's problems are temporary states.

kergonath wrote at 2021-12-05 18:46:06:

That's such an absurd point of view. When the duration of the problems is comparable to your expected lifespan, they are permanent for all intents and purposes. Sure, an incurable illness is not permanent in that it has to end when the person dies. That's not a useful point to make.

PUSH_AX wrote at 2021-12-05 18:31:53:

> all of life's problems are temporary states.

Not all. Incurable disease is just one example.

gregoryl wrote at 2021-12-05 19:07:33:

Can you agree that there's some threshold where the time required to endure the temporary state is untenable?

How can anyone except myself decide what that threshold is?

rad_gruchalski wrote at 2021-12-05 19:11:28:

Until one dies anyway.

antris wrote at 2021-12-05 18:28:47:

This suicide capsule concept is clearly targeted at people who have made a well-thought-out decision about their life. Heat-of-the-moment suicides won't happen with rare specialty devices but with common tools/environments/drugs/poisons that are found all over.

It would be a weird assumption to think that a person who carefully plans their suicide in advance, and orders a specialty device for it, hasn't considered options other than suicide. Most likely such a person has tried everything else already.

sgustard wrote at 2021-12-05 19:31:56:

It's a good point about the rollout strategy. Is this a specialty device you must order? Or is there a row of them on display at the mall next to the massage chairs?

PUSH_AX wrote at 2021-12-05 18:25:16:

> Suicide is a permanent solution to a temporary problem.

This quote is supposed to be used in the correct context of a specific situation, *if* it's truly temporary. Not just as a generalisation.

stan_rogers wrote at 2021-12-05 19:52:22:

And what of the excruciatingly painful existence that doesn't come with a near-term "inevitable death"? What you're advocating is long-term, inescapable torture.

daenz wrote at 2021-12-05 18:29:50:

You can also be against the act of suicide and for the freedom to commit suicide. The two ideas don't have to be conflated.

f6v wrote at 2021-12-05 18:26:43:

> Usually, at least a few other methods should be used to help

I bet most people who resort to assisted suicide in Switzerland have a terminal disease or incurable pain.

loonster wrote at 2021-12-05 18:17:45:

Sometimes the problems are permanent.

otabdeveloper4 wrote at 2021-12-05 18:05:42:

You can already kill yourself if you really want to. Nobody can stop you.

The whole idea of "assisted suicide" is like Bitcoin - invented for the express purpose of finding and abusing loopholes in the current legal frameworks.

tasty_freeze wrote at 2021-12-05 19:01:55:

There are multiple ways one might commit suicide, it is true. But some are more reliable than others; some leave a terrible mess for someone (perhaps a family member) to clean up; some are more painful than others; some may put other people at risk. The worst outcome is when a person tries to kill themselves and ends up alive but in an even worse condition.

For people who don't have religious prohibitions on suicide, having a reliable, simple, low cost, low-pain, low-mess option sounds great to me.

gambiting wrote at 2021-12-05 18:38:12:

Yes, someone who is paralyzed from their neck down can easily take their own life, of course. Don't see any issue here.

You have to excuse the sarcasm, but if you really can't think of any cases where taking your own life in a dignified manner is not actually possible, then I'm not sure there's any discussion to be had here

kergonath wrote at 2021-12-05 18:48:38:

Sometimes legal frameworks are inhumane, immoral, cruel, and ought to be loopholed. I can think of at least 5 examples right now.

Quarrelsome wrote at 2021-12-05 17:50:40:

I see you haven't had to care for someone with Alzheimer's before.

It's not a terrible idea. As a society we retain some naïve and arguably zealous notions around death, which often push us towards natural outcomes. Some natural outcomes are different from others in that they're significantly worse for all parties involved. Those outcomes are a subject that should be given more attention and seriousness, as opposed to mere emotional rejection.

14 wrote at 2021-12-05 17:58:09:

Caregiver here, and someone who has directly worked with individuals who have gone through MAID (medical assistance in dying). A person with Alzheimer's would automatically be disqualified from consideration, as they lack the mental capacity to consent to such a thing. This will most likely never be offered to that group of people. They typically are not suffering; though some do get terrified at times, in general they are just lost in their world, with not much insight as to why things are that way.

therealdrag0 wrote at 2021-12-05 18:02:24:

What if you had an advanced directive that explicitly called out that scenario? Not saying the law would cover that, but it plausibly could.

Pasorrijer wrote at 2021-12-05 18:09:22:

In Canada, yes, advanced directives can be written.

14 wrote at 2021-12-05 18:46:43:

Not for assisted death and something like Alzheimer's. That directive is for something like: you are in a car accident and are on life support; you can have it say no life support. You cannot currently say "if I get Alzheimer's, I want to die"; that is not legally possible at this time.

datameta wrote at 2021-12-05 19:17:27:

> They typically are not suffering

How on earth did you get such an out-of-touch callously asinine idea?

It is the most mentally debilitating state one can possibly be in. Their depth of recollection comes and goes. There is a persistent growing and waning sense of confusion. Their grasp on reality is in a shattered state. They recollect details out of context or in false connection to current events. I seriously fail to understand how you could underplay this disease.

14 wrote at 2021-12-05 21:52:41:

I used to work in a facility with the highest rating of care required for Alzheimer's and dementia-like conditions, and typically, if well looked after, they are not suffering. Yes, some do get scared and confused, but with medication and skilled workers you can minimize that type of suffering. What often ends up happening is that family feel guilty placing their loved ones in such a facility and instead struggle and eventually burn out. At that point they may already be short with the person, and yes, then they truly are suffering. But a well-trained staff can really help reduce such interactions through distraction and conversation or music and medications. There are even now dementia villages in the world where people don't actually know they are in a facility and can wander around, because the entire thing is fenced off but looks normal, with shops and places they can go to. Usually the person who is suffering is the family, who can no longer manage the 24/7 care required. And that is not a poor reflection on them; it takes a lot of mental drain to deal with a person asking you the same thing 300 times a day.

datameta wrote at 2021-12-05 22:08:42:

I apologize for my pointed accusation. I did indeed experience the difficulties of at-home care; it can be too much for a few loved ones to manage. The training isn't there. I think you paint the overall picture well from experience. I'm glad to hear that the average case of dementia, when properly looked after, can be milder on the person than I imagined.

14 wrote at 2021-12-06 00:26:08:

No need to apologize; I can tell you experienced a heavy burden caring for your loved one. Alzheimer's is an evil disease. It is unfair. It can take the most loving person and test them to the limit. I now advocate for anyone in a situation of caring for an Alzheimer's patient to really be honest with themselves about whether they are feeling burnt out, and to seek help before things get too bad. When I was in that job I got to go home at the end of my shift and unwind, but for many that is not an option. The cost of care where I worked started at $4500 per month, something not everyone can afford. It truly is a horrible situation, but there are some of us who really did our best to provide some peace to patients and families. I wish you well today and hope you realize you are a good person. That is why it was so hard for you: because it is truly hard to see a loved one in that state.

Quarrelsome wrote at 2021-12-05 18:10:06:

> A person with Alzheimer's would automatically be disqualified from consideration, as they lack the mental capacity to consent to such a thing.

Which is part of the reason that people who have seen it with their parents often want to submit their consent in advance. If I get Alzheimer's, please fucking kill me.

14 wrote at 2021-12-05 21:46:52:

Currently the law does not support that type of advanced directive. You can say "do not leave me in a vegetative state after a car accident", but not for Alzheimer's.

14 wrote at 2021-12-05 17:51:44:

Could you elaborate on what parts you find terrible? It seems like a peaceful way to die for some.

pydry wrote at 2021-12-05 17:53:28:

Giving AI the power to approve suicides?

14 wrote at 2021-12-05 18:01:31:

Well, I don't know the details, but I don't think this is a suicide booth just anyone can use. You would still need to have a doctor verify you had a terminal illness, and then it could be left with you to use at the time you were ready. The AI is simply making sure you are of sound mind at the time of death, which means answering a few questions. It wouldn't be that hard to determine if someone was of sound mind; there are plenty of mental state exams that a computer could give.

can16358p wrote at 2021-12-05 18:04:29:

Then don't use it.

throwaway6734 wrote at 2021-12-05 17:57:26:

Why? If I ever get dementia I would like the opportunity to end my own life before it progresses too far.

jasonpeacock wrote at 2021-12-05 17:41:03:

Tangentially, this is my frustration with the death penalty - it's way too complicated & error-prone when much simpler, reliable, and humane methods are available like this.

Humans are easy to kill, and it's not hard to do so humanely.

(Yes, there are many other problems with the death penalty besides the technical execution of it, but you'd think we could at least do that part right.)

Jiro wrote at 2021-12-05 19:49:00:

Any time someone tries a new method of execution, activists pounce on it and take it to court, making any new method expensive because of court costs and tied up in years of cases. The only methods of execution that can't be filibustered in this way are methods where courts have specifically ruled in the past that the method is okay. That's why execution is limited to specific methods, and why we're not going to use carbon monoxide or nitrogen.

Personally, I'd prefer that if the extra suffering caused by a method of execution is less than the suffering caused by, for instance, a week in jail, we should ignore the suffering and permit use of the method.

deegles wrote at 2021-12-05 22:17:30:

Depending on the jail you use as a baseline, that might be a scary way to die.

meepmorp wrote at 2021-12-05 17:51:46:

Yeah, with all the problems the US has had acquiring an appropriate cocktail of drugs for lethal injections over the last few years, you'd think someone would look into this kind of thing. Nitrogen is super easy to get.

skocznymroczny wrote at 2021-12-05 18:01:22:

I am not up to date with US lethal injection, but weren't the changes in lethal injections in the US "yeah, it's still a painful death, but we added muscle relaxants so it looks peaceful on the outside and acceptable to the public?"

hprotagonist wrote at 2021-12-05 18:03:14:

The changes are of the form that nearly every legit manufacturer of the cocktail refuses to produce it for the purposes of capital punishment.

https://www.npr.org/2015/03/11/392375383/states-scramble-to-...

meepmorp wrote at 2021-12-05 18:19:27:

I think there's always been a paralytic involved. The idea was to knock the person out with a barbiturate, then stop the breathing with the muscle relaxant, and finally stop the heart with a bolus of potassium chloride.

It's not so much to pretend it's peaceful as it is errors in how the drugs are administered and in the availability of appropriate drugs.

jasonpeacock wrote at 2021-12-05 18:29:48:

This is exactly my point - why do something so complicated when many people die quietly in their homes from CO poisoning? Let's just set up a CO poisoning chamber to do the same thing.

Or nitrogen as the article describes. Or a fatal overdose of a pleasant drug like morphine?

Even the guillotine is quite humane, though gory.

gambiting wrote at 2021-12-05 18:39:32:

I never understood it either - there must be some crazy reason though.

kergonath wrote at 2021-12-05 18:53:23:

Sadism, mostly. The electric chair was painful enough but too picturesque, which makes it easy to rally against. The injections solve that problem whilst still inflicting immense pain. Which is supposed to be good, because these damn criminals had it coming. Or something.

alice-i-cecile wrote at 2021-12-05 18:49:53:

Many argue that the deliberate cruelty is the point. Also stigma around gas chambers.

jack_riminton wrote at 2021-12-05 17:59:32:

Some states are experimenting with Nitrogen

Article from NYTimes from 2018:

https://web.archive.org/web/20210704140110/https://www.nytim...

jasonhansel wrote at 2021-12-05 18:42:30:

Needless to say, bringing back gas chambers is a hard sell.

formerly_proven wrote at 2021-12-05 18:53:39:

Isn't Arizona planning to start executions in gas chambers again using Zyklon B?

meepmorp wrote at 2021-12-05 19:30:05:

They've always used hydrogen cyanide (HCN) gas, typically produced from potassium cyanide pellets dropped into hydrochloric acid. Zyklon B is just HCN.

Edit: not HCl, H2SO4.

sschueller wrote at 2021-12-05 20:12:21:

Or just carbon monoxide. The thing people are so paranoid over in their homes because it is a silent killer.

stjohnswarts wrote at 2021-12-06 03:07:23:

My frustration is that a significant portion of the people on death row are innocent; if you kill them, they have a 0% chance of getting out alive.

https://en.wikipedia.org/wiki/Blackstone's_ratio

71a54xd wrote at 2021-12-05 18:16:17:

Not to be grim, but this is a very "adult" subject that most adults have a hard time discussing.

After seeing friends and family struggle with severe injury/deformities and Alzheimer's/mental decline (and cancer with grueling treatment as well), I can, as a currently happy and healthy person, say that I would likely use some form of firearm to end my life. It's quick, hard to mess up and above all my right as a human being.

I generally dislike political spats regarding labor / "obligations" to society in general - but I think the most sacred right is the right to decide whether you contribute by living. Both capitalism and socialism demonize the idea of suicide because each needs some form of a worker and the pursuit of a certain constructed life to work properly. Something that I'll never understand and that I believe is more morally wrong is family deciding that someone should "get over" their own choice to no longer live. In my opinion nothing is more selfish and less respectful of what life should mean to an individual.

rrdharan wrote at 2021-12-05 19:06:06:

> hard to mess up

I was under the impression it was in fact not that hard at all to mess up.

Though in fact, it seems like other methods are indeed likely "worse" in terms of achieving the intended outcome:

https://www.medpagetoday.com/psychiatry/generalpsychiatry/83...

rootusrootus wrote at 2021-12-05 19:40:09:

> hard to mess up

I think it depends a bit on technique and choice of firearm. Some people foolishly put a gun in their mouth and assume it will definitely kill. But it's entirely possible to miss the brainstem and just blow off the side of your face instead. Now you are alive but mangled.

Something like a shotgun is probably a bit harder to screw up.

But personally I'm far too chickenshit to use a firearm. I'd probably try to find someone who'd sell me fentanyl, were I interested in terminating my own existence. At the moment, however, I'm far more likely to suffer an existential crisis and so I've zero desire to answer life's greatest question any sooner than strictly necessary.

BrandoElFollito wrote at 2021-12-05 21:30:47:

And here I am in France, where the closest thing to a gun I can get legally is a nail gun.

mikewarot wrote at 2021-12-06 00:32:08:

> hard to mess up

I had a co-worker who had, earlier in life, done that to himself. The bullet took an eye and a chunk of his brain out. He was instantly transformed from one of the smartest people in the room to someone who could barely keep up with conversations, with about half the IQ.

errcorrectcode wrote at 2021-12-05 17:43:52:

For a second, I thought "capsule" meant "pill." "Pod" would've been a better word.

Obligatory:

Does it have "quick and painless", "slow and horrible", and "clumsy bludgeoning" settings?

What do they do with the bodies?

jfrunyon wrote at 2021-12-05 21:23:11:

Or "booth" (shoutout to Futurama) ;)

mcguire wrote at 2021-12-05 18:20:25:

Given that you could conceivably print the pod yourself, unless the AI automatically notifies someone, no one would know, so you would sit in a pod full of nitrogen until someone found you.

errcorrectcode wrote at 2021-12-05 18:48:31:

It would be better to already have arrangements pre-paid (Edit: in the US, it's called "pre-need" and can save 80-90% of the costs by buying decades before), and the coroner and the mortician notified. Having someone "find" you isn't a very classy way to go out.

errcorrectcode wrote at 2021-12-05 18:43:54:

It seems a little impractical to print yourself given the size of the parts.

This idea has too many moving components, I don't think it's a good idea.

There are already companies who sell dual-use nitrogen equipment for home brewing and "home brewing."

LorenPechtel wrote at 2021-12-05 20:41:05:

What actually matters is the nitrogen, not the pod.

gigglesupstairs wrote at 2021-12-05 17:18:38:

A 3D-printed capsule, destined for use in assisted suicide, may legally be operated in Switzerland, according to advice obtained by Exit International, the organisation that developed the 'Sarco' machine.
It's a 3-D printed capsule, activated from the inside by the person intending to die. The machine can be towed anywhere for the death. It can be in an idyllic outdoor setting or in the premises of an assisted suicide organisation, for example.
The person will get into the capsule and lie down. It's very comfortable. They will be asked a number of questions and when they have answered, they may press the button inside the capsule activating the mechanism in their own time.
The capsule is sitting on a piece of equipment that will flood the interior with nitrogen, rapidly reducing the oxygen level to 1 per cent from 21 per cent. The person will feel a little disoriented and may feel slightly euphoric before they lose consciousness. The whole thing takes about 30 seconds. Death takes place through hypoxia and hypocapnia, oxygen and carbon dioxide deprivation, respectively. There is no panic, no choking feeling.

annetipasto wrote at 2021-12-05 17:28:25:

This feels like a phenomenal advancement on this front. Now if only I could read about it without slipping into existential panic...

derbOac wrote at 2021-12-05 17:49:54:

FWIW, this is just automating something that has been in use for some time. The person interviewed has a book and organization advocating for this in various forms for some time.

DenisM wrote at 2021-12-05 18:38:06:

Yeah, there is something claustrophobic about this description.

jonnybgood wrote at 2021-12-05 18:01:03:

What can a person do if they change their mind? I'm assuming it can be stopped. I think about those who jumped from the Golden Gate and survived, having changed their mind after jumping.

throwawayboise wrote at 2021-12-05 19:20:10:

Presumably there's a "cancel" button, for if you change your mind before you lose consciousness.

Biganon wrote at 2021-12-05 19:42:28:

And now due to the prolonged hypoxia, you're a vegetable.

hatesinterviews wrote at 2021-12-05 19:51:03:

No. If you are still conscious (it takes <30 seconds before losing consciousness) then you haven't experienced sufficient oxygen deprivation to have severe permanent side effects. Brain death takes several minutes of sustained hypoxia. If you're unconscious then obviously there's no cancelling by your own will.

throwawayboise wrote at 2021-12-05 19:51:59:

I think if you're still conscious enough to press a cancel button, you'd be OK in that regard (but I'm not an expert on the subject, to be sure).

krisoft wrote at 2021-12-05 18:41:03:

I think this device is designed to provoke a conversation rather than to solve the problem it says it sets out to solve. There is no reason it has to look like a futuristic coffin. Most people breathe only through their nostrils and mouth; a simple mask could do the same job this huge contraption sets out to do. Except you couldn't exhibit a scuba mask hooked up to a nitrogen tank in a museum. It wouldn't have the same visual impact and sci-fi otherworldliness.

jsn wrote at 2021-12-05 19:48:56:

Yeah, this is seriously overengineered. A cellophane hood with elastic collar and a nitrogen nasal cannula is all it takes.

majkinetor wrote at 2021-12-05 22:58:09:

That must be exactly how most people envision their death: with cellophane hood and elastic collar.

jsn wrote at 2021-12-06 02:16:11:

Hmm, interesting. Are you speaking from experience? Because I definitely am. I watched my mother die a very slow, very painful death from cancer several years ago. I can assure you she would be absolutely genuinely happy to get that cellophane hood and that nitrogen cannula. And, having seen what I've seen, I will also take the hood over some very real alternatives in a heartbeat. Playing dress up with this ridiculously futuristic glass coffin toy instead seems somewhat preposterous when you are facing death.

drakonka wrote at 2021-12-05 18:13:41:

I'm glad more options for a peaceful chosen death are becoming available.

Between a sudden, unexpected death and dying in pain from a progressive disease, my "ideal" way to die would be chosen death before a progressive disease becomes bad enough to cross some personal threshold of suffering.

At the same time, I plan to do whatever I reasonably can to prolong my life (and its quality) for as long as possible.

Hopefully by the time this becomes relevant more options like this capsule and others will be readily available.

Having said that, his quote about an AI giving you permission to die seems a bit preposterous. The actual method of death he described sounds reasonable, but I don't know that I'd want a machine to permit me to end my own life. Not that having to get a doctor's permission sounds any better. Ideally, nobody should have a say over a life but the individual themselves. I recognize in saying this though that it isn't really that simple, and that there are complex nuances with potential undue influence family members or others surrounding a person can have in the situation.

BrandoElFollito wrote at 2021-12-05 21:34:32:

> my "ideal" way to die would be chosen death before a progressive disease becomes bad enough to cross some personal threshold of suffering.

I have long wished there were some kind of device you could implant in your body that would release poison if you stopped responding to some kind of regular request. This would at least cover the neurological cases (degenerative disease, being alive in a coma, which is terrifying, or a vegetative state).

LorenPechtel wrote at 2021-12-05 20:45:54:

Yup. There are a lot of ways to die that are worse than dying. Why bother with the pod, though? A big plastic bag and a tank of nitrogen can do the same thing.

vmception wrote at 2021-12-05 17:58:09:

> His family sees it differently. His mother begs. "I want you to live no matter what." But that ignores his pain and his dignity, Yoshi says.

This is my observation of suicide hotlines and ambiguous generic anti-suicide advice.

It seems this conversation is so immature, patronizing and invalidating.

I haven't found people able to articulate their thoughts on the matter as they just invalidate my comments on social media platforms until they are no longer visible.

Do things compatible with self-preservation like me, or don't.

mcguire wrote at 2021-12-05 18:30:52:

Only something like 10% of suicides are related to chronic or terminal illnesses. One could reasonably assume that most of the rest are suffering from mental illnesses.

LorenPechtel wrote at 2021-12-05 20:56:11:

And how many are there that aren't really counted as medical? I know one medical suicide--she never went to the doctor about what drove her to suicide. She didn't have all that long to live anyway, and she knew perfectly well that a broken hip meant she would never leave bed again.

zamadatix wrote at 2021-12-06 01:37:45:

I'm guessing quite significantly less than 40%, probably significantly less than the 10% already accounted for.

LandR wrote at 2021-12-05 20:01:34:

It doesn't make their experience or suffering invalid.

If we are to allow suicide for chronic physical pain, then why not for chronic mental pain?

oxfeed65261 wrote at 2021-12-05 19:14:20:

Depression is a chronic, and sometimes terminal, illness.

vmception wrote at 2021-12-05 18:42:15:

Which means what about the messaging, to you?

prirun wrote at 2021-12-05 18:21:38:

This seems like way too much bother, and it looks expensive. I've read the same thing can be done with a plastic bag over the head and helium gas (maybe nitrogen works just as well).

I still believe people should be allowed to have a safe, painless, unmessy death if they want it, for whatever reason. I don't understand why it's such a controversy. Why does a person who wants to leave have to resort to messy, violent methods? Our pets have more access to humane ways out of bad situations than humans do.

rootusrootus wrote at 2021-12-05 18:27:16:

> I've read the same thing can be done with a plastic bag over the head and helium gas (maybe nitrogen works just as well).

My older brother attempted suicide a few years ago and he tried to asphyxiate himself using nitrogen. When he was found, he was vomiting all over himself. Some part of his plan failed to work, obviously. So I don't know that I'd assume it's a trivial exercise to make a working setup on your own.

asdfasgasdgasdg wrote at 2021-12-05 18:41:39:

I'm sorry that happened to your brother, and I hope things are better for him now.

I imagine there are probably approaches with complexity between "plastic bag" and "lie-in 3D-printed electromechanical apparatus" that would still be reliable. In particular, I would imagine a device that seals around the neck but otherwise operates on the same principle would be a lot cheaper and more environmentally friendly to produce. You would have a regulated, switchable input, one arm of which would be the atmosphere and the other arm of which would be a pure nitrogen supply. And then a check valve for output.

I hope never to have to use such a device myself . . . dying peacefully in my sleep, like my grandfather did, would be the ideal, or of a sudden stroke like my other grandmother and grandfather. But if it comes to pass that I am dying of a painful and debilitating disease, I hope that by then a device like this is easily obtainable.

rootusrootus wrote at 2021-12-05 18:56:30:

> I'm sorry that happened to your brother, and I hope things are better for him now.

Yep, that unfortunate time of his life has passed and he is in a much better place. Found someone who helps keep him grounded mentally. Finally qualifies for social security so he isn't struggling just to find a roof to live under. Luckily he was an engineer for many years and so his social security payment isn't half bad. He's going to be okay now, and for that I'm grateful.

I agree with your sentiments. I'm glad that I live in Oregon, so we do have assisted suicide as an option. The law makes it somewhat difficult, IMO, but in practice the rules seem to get bent. My father died with assistance a few years back when he had terminal kidney disease. Two nurses came to the house and prepared the pills for him by converting them into a liquid he could drink, they even helped him hold it. He'd have had difficulty doing that himself, so I'm grateful they were willing to help. I don't _think_ the law allows for that much assistance, but maybe I'm wrong.

In lieu of euthanasia, I'll pin my hopes on opiates, I guess. A friend of mine passed away about a week ago (cancer) and I had the opportunity to talk with him about three weeks ago -- he was on a significant (but not quite lethal) dose of fentanyl, and he said he had zero pain. They were willing to crank the dose up to whatever it takes -- this seems normal when hospice is involved. No worries about anyone getting addicted. And if the patient inadvertently dies a little early, nobody asks any questions.

DantesKite wrote at 2021-12-05 19:39:02:

It cannot be done so easily.

Your body has a thousand instincts inside trying to preserve itself.

Many of these people who try asphyxiating themselves end up ripping the bag even after they go unconscious. And if you fail, you risk long-term brain damage.

Trying to kill yourself is like trying to kick the corner of a concrete wall with your shin as hard as possible. You have to override a lot of self-preservation instincts. It's not like you suddenly become immune to pain, loss, and fear.

LorenPechtel wrote at 2021-12-05 20:58:51:

Asphyxiating without a replacement gas does trigger your body to try to fight it.

However, when you use enough replacement gas (helium or nitrogen will do) the body won't fight it because it's the buildup of CO2 that's the trigger, not the lack of oxygen. I do agree brain damage can happen from a failure.

jliptzin wrote at 2021-12-05 18:03:56:

The capsule itself is interesting; it seems like a peaceful and pain-free way to die. I don't understand why they feel they have to replace doctors and psychiatrists with an AI in prescribing the procedure, which is one of the worst ideas I have ever heard.

xwdv wrote at 2021-12-05 18:09:14:

Replacing with AI solves the issue of having to wait weeks to see a doctor and also the costs associated with it.

schroeding wrote at 2021-12-05 18:14:23:

But so does replacing it with an algorithm that randomly says "yes" or "no" ^^

(Costs shouldn't be an issue; Switzerland has pretty good medical insurance AFAIK)

throw03172019 wrote at 2021-12-05 17:29:20:

I wouldn't want to be on the QA team here.

sarsway wrote at 2021-12-05 18:10:01:

I heard they have a huge churn rate

thejackgoode wrote at 2021-12-05 17:44:58:

works on my machine

can16358p wrote at 2021-12-05 18:07:52:

They might just test on production.

rklaehn wrote at 2021-12-05 19:22:10:

Suicide capsules? Reminds me of this short story:

https://zerohplovecraft.wordpress.com/2019/09/28/the-green-n...

deadalus wrote at 2021-12-05 17:44:51:

Hoping this pill makes it to the darknet markets. I want to die, but I don't want to experience pain. I just want a painless exit from this world, in which I find no meaning or joy.

smeej wrote at 2021-12-05 18:04:41:

From someone who's felt that way before, I strongly recommend going searching for meaning or joy instead, or inventing your own meaning or joy if you can't find anybody else's idea that does it for you.

Not saying it'll be easy, or even promising it'll work.

I'm just saying I'm glad I gave it a shot. 10/10 would recommend. YMMV.

schroeding wrote at 2021-12-05 17:58:20:

It's not a pill, but a "capsule" in the "Japanese Capsule Hotel" sense.

I feel you. I'm the same, honestly.

That's probably the biggest danger of these "safe", "easy" suicide methods - some people refrain from suicide only because they are too scared of a cruel death, not because they are attached to life. Stuff like this will lower the probability of such a cruel death, making suicide less scary and raising the probability of going through with it. Not an easy topic.

But I really hope you can find something in life before you do it, so you don't have to do it ^^'

meepmorp wrote at 2021-12-05 17:53:05:

You should take a minute to skim the article.

timwaagh wrote at 2021-12-05 20:12:36:

I'm sure there's plenty on there that can kill you already. But before you go maybe try a vacation in Thailand or something.

bogwog wrote at 2021-12-05 18:17:28:

If you click the link you'll see a picture of it. It's not a pill, it's a giant capsule you step into.

Also, I don't know you, but I'm confident that dying is the wrong solution to whatever problems you're having.

gregoryl wrote at 2021-12-05 19:24:05:

Consider how it feels when you share a profoundly personal feeling and a stranger openly states, "I know nothing about you or your situation, but you're wrong."

hliyan wrote at 2021-12-05 18:11:41:

For a moment, the catchy trade name and the mention of 3D printing made me worry that this was a startup. Thankfully, Exit International is a non-profit [1]. Imagine if this had been a commercial endeavor: what would one have to resort to, to increase revenue?

https://www.exitinternational.net/about-exit/history/

garaetjjte wrote at 2021-12-05 18:20:49:

Uh...

> Our aim is to develop an artificial intelligence screening system to establish the person's mental capacity. Naturally there is a lot of scepticism, especially on the part of psychiatrists. But our original conceptual idea is that the person would do an online test and receive a code to access the Sarco.

1f60c wrote at 2021-12-05 19:10:07:

"Before we continue with this assisted suicide, I'd like to thank our sponsor NordVPN"

sokoloff wrote at 2021-12-05 18:16:49:

> what would one have to resort to, to increase revenue?

Well, subscriptions are out…

duxup wrote at 2021-12-05 19:35:22:

There was a PBS Frontline episode about a man who, after being diagnosed with ALS, traveled to Switzerland to die. They follow him until the very end.

Very powerful episode, I still remember it vividly.

https://www.pbs.org/wgbh/pages/frontline/suicidetourist/

SavantIdiot wrote at 2021-12-05 18:23:07:

It's an interesting new method: hypoxia via nitrogen. I'd prefer this to the drinkable concoctions that put you to sleep after a brief panic of thirst.

LorenPechtel wrote at 2021-12-05 21:03:43:

It's not new. It's just that there isn't much of a consumer use for nitrogen, so it's harder to obtain. Helium has a consumer market in inflating balloons, though. It occurs to me that argon would also work, although there isn't much of a consumer market for it either.

mensetmanusman wrote at 2021-12-05 19:03:19:

"There is no panic, no choking feeling."

I'm amazed the author of the article was able to determine this. May they rest in peace.

throwawayboise wrote at 2021-12-05 19:18:24:

The panic/choking feeling comes from a buildup of CO2 in the air and body, not lack of oxygen. We know from experiments/accidents that when CO2 saturation does not happen, people don't feel like they are choking or can't breathe, even if they are not getting any oxygen.

DominikPeters wrote at 2021-12-05 19:19:34:

The reaction to nitrogen is well-understood:

https://en.wikipedia.org/wiki/Inert_gas_asphyxiation

nosianu wrote at 2021-12-05 19:51:08:

Some comments mentioned that some people who attempted to use nitrogen had problems. According to Wikipedia, a fast switch to breathing a pure-nitrogen, zero-oxygen atmosphere produces "no symptoms at all", but when the switch happens slowly, what happens to people varies:

> _a slow decrease in oxygen breathing gas content has effects which are quite variable.[5] By contrast, suddenly breathing pure inert gas causes oxygen levels in the blood to fall precipitously, and may lead to unconsciousness in only a few breaths, with no symptoms at all.[3]_

They cite scuba diving rebreather accidents as one source of experience. I think there should be people who lived to tell the tale if they were rescued ("never dive alone", after all, and dive boats should be equipped for emergencies). So beyond observation of someone committing suicide with this method, we may actually have first-person accounts.

I looked into this when I had a huge health scare that changed my life (it was "only" heavy metal poisoning, and after lots of chelators I'm doing very well, better than before), and nitrogen was what I considered, because I saw a chance to actually get my hands on some, compared to drugs. Bad thoughts, but I'll still keep it in mind in case I ever need it. Living through all the heavy metal issues was bad enough, but at least I had hope. If I ever face similar or worse problems, or even pain, with no hope, I don't want to have to endure it.

amne wrote at 2021-12-05 21:36:50:

Has anyone heard from the QA team?

charles_f wrote at 2021-12-05 22:59:33:

> A 3D-printed capsule

How is the 3D printing of any relevance? It's as relevant a claim as "a LiPo-powered capsule" or "a Raspberry Pi-controlled capsule".

bmmayer1 wrote at 2021-12-05 17:59:13:

Fascinating. It seems interesting that far more innovation has occurred in the service of corporal and capital punishment (from the torture devices of the Middle Ages to the guillotine to lethal injection) than in the service of suicide (arguably cyanide tablets?).

pengaru wrote at 2021-12-05 18:26:42:

Am I alone in wishing the person in the photo would climb in for a live demonstration?

ehPReth wrote at 2021-12-05 18:13:46:

What's/where's the best "available now" service that takes foreigners?

scruple wrote at 2021-12-05 19:48:06:

But weed is still illegal over there, right?

fetzu wrote at 2021-12-05 20:08:46:

Because weed goes hand-in-hand with assisted suicide how?

Not illegal. CBD/low-THC (<1%) cannabis has been legal for a while, and « weed » can be prescribed medically.

danielovichdk wrote at 2021-12-05 18:24:14:

Outrageous. You want me to believe that it's humane to lie in a fucking blue capsule and commit suicide?

People need to understand that when and if you really want to kill yourself, this is often done in a state of affect, unless the individual is very physically sick.

I don't like this. I think it's ethically wrong to have people go away in a plastic box and to build an actual product around that. I mean, how inhumane are you trying to make this look?

Have people go out in their own bed, have their loved ones next to them, and have professional medical staff help out during the process.

throwaway675309 wrote at 2021-12-05 19:18:49:

I think a person who's had to live in chronic excruciating pain for years is not gonna give a shit what color the freaking capsule is. If this provides them a relatively painless way out of their never-ending suffering, then it's their goddamn choice.

It's not up to their family, and it's not up to their friends. They can't begin to understand the anguish that person is going through and frankly neither can you.

BrandoElFollito wrote at 2021-12-05 21:40:58:

> this is often done in a state of affect, unless the individual is very physically sick

You also have the population of people who are sick and simply want to end their life.

No affect. No psychological issues.

Just the fact that you want to be in control of your life. I find this very courageous - I for one certainly do not want to be a burden for my family if I am in a state that requires them to take care of me when my brain is gone.

xf13d wrote at 2021-12-05 19:12:39:

This is to eliminate the pesky doctor from the situation.

Doctors have ethics, morals, and feelings. Doctors, at least ostensibly, agree to "first do no harm". If you take that statement literally, it means doctors are obligated not to help a patient commit suicide, because killing someone is ostensibly more harm than leaving them in pain. I know this isn't popular on HN, which is very pro-individual-liberty, and I agree people _should_ have control over their own lives, but the inclusion of doctors presents a problem to people who want to enable anyone to kill themselves easily.

The sarco is a techno-dystopian solution. Now that the cat is out of the bag, 3d printable, and theoretically reliable, it's only a matter of time before it's used for something other than killing yourself. Or, even worse, it's used to create some new tech startup that starts to approach soylent green. Just throw a nice screen in there with images of pretty scenery and you have the "exit" in the most literal sense.

It's humane in theory because the nitrogen provides a painless death. I say in theory because this only works in theory. There have been many failed suicides directed by Exit International that demonstrate the method is not as effective as they tout, but no one seems to mention the quackery by EI when they talk about them.

My concern is these people walk the line between "helping the actual sick have a peaceful death" and "homicidal psychopath" and with this device I believe they are leaning far, far closer to the latter. Creating mass scale suicide devices with the specific intent to end people's lives is immoral and unethical. As stated above, it will only be a matter of time before a company decides to provide suicide-as-a-service.

throwawayboise wrote at 2021-12-05 19:15:26:

Yes, that was my feeling just about the appearance of the thing. Why does it have to look so cold, alien, sterile?

isodev wrote at 2021-12-05 21:34:02:

Excellent! Every person should have the right to make the choice for themselves.

ag8 wrote at 2021-12-05 17:48:14:

> Our aim is to develop an artificial intelligence screening system to establish the person's mental capacity.

I'd like to see how that works!

criddell wrote at 2021-12-05 18:16:09:

I can imagine it getting into a catch-22 scenario where wanting to die is considered a sign of mental illness and mentally ill people are excluded from using the suicide machine.

tomjen3 wrote at 2021-12-05 18:29:10:

Then it wouldn't be AI, it would just be the current medical system.

Apocryphon wrote at 2021-12-05 20:43:06:

Every day we draw closer to _Children of Men_

https://youtu.be/IacY-MYuQQ0

https://youtu.be/QYy80trSPSI

qualudeheart wrote at 2021-12-05 20:44:27:

I find this very depressing.

BrandoElFollito wrote at 2021-12-05 21:38:18:

I find it very reassuring that there are places near me where I can do this (I am in France, and have Switzerland, the Netherlands and Belgium nearby).

cambaceres wrote at 2021-12-05 20:30:18:

I wonder how they test these.

diveanon wrote at 2021-12-06 00:46:38:

Just in time for millennials to start thinking about retirement.

In all seriousness though, my family does not age well, and I have actively thought about this as an end-of-life option vs. decades of mental degradation.

I doubt it will ever be available in the United States.

xwdv wrote at 2021-12-05 17:51:44:

The next step for such a machine, IMO, is something that can automatically cremate the corpse inside the machine and vacuum up the ashes, dispensing them either into a canister or into some kind of plumbing network for scattering them high into the air, where the wind can carry them.

mleonhard wrote at 2021-12-05 21:17:47:

Immediate cremation would reduce cost. It would also save loved ones from seeing or dealing with the corpse. The blue tinted windows prevent attendees from seeing the body's color change.

My mother died after months of agony from cancer. The mortuary staff came and put the corpse in a black plastic bag and wheeled it out of the house on a stretcher. They used an extremely pungent chemical to disinfect the bed. The chemical smell filled the house. At the funeral, I carefully avoided looking at the corpse. I'm thankful that, in my last memories of her, she was alive and showing me tenderness.

I think she and all of us involved would have been better off if she could have used a machine like this several weeks earlier. We could have said goodbye, she would push the button, and then it would be over.

EDIT: The machine could refrigerate/freeze/preserve the corpse for ceremonies, until burial/cremation.

For burial, the pod could be in two parts: the top part is a casket for burial and the bottom part holds the gas delivery system and refrigerator. They could disconnect and bury the casket without ever opening it. This would eliminate the expensive embalming process. It would also reduce the risk of disease transmission from handling the corpse.

This could be useful for regular anticipated deaths, too. The patient could spend their last week or so in the pod bed. What would be the psychological impact of that on the patient and the loved ones? I think it could be a positive impact for most people.

garaetjjte wrote at 2021-12-06 00:30:38:

>At the funeral, I carefully avoided looking at the corpse.

Around here, there's a tradition of placing the corpse in an open coffin and leaving it in the house for 3 days, praying the rosary every day. It just seems horrific to me.

User23 wrote at 2021-12-05 17:49:41:

It's neat that Robert Chambers predicted this well over a century ago in _The King in Yellow_.

analog31 wrote at 2021-12-05 17:55:27:

Socrates drank hemlock.

User23 wrote at 2021-12-05 22:51:02:

While Plato presents it as a principled stand, it remains that Socrates' death was the result of a judicial conviction and thus isn't exactly a voluntary suicide.

bobthechef wrote at 2021-12-05 19:12:04:

The nihilistic culture of death is well established, so I'm surprised it took this long, and I'm not sure this thing can truly be said to be "new".

bjt2n3904 wrote at 2021-12-05 17:57:33:

Is suffering a valid reason to end your own life? How much suffering is the threshold where suicide is "medically necessary"? How can it be quantified, what are the units? Does mental anguish count, or only physical suffering? What about existential suffering?

This isn't a treatment, and it's certainly not compassionate. Joni Eareckson-Tada is a testament to the wonderful life that can be lived in spite of suffering, and she has much to say on it that is worth a listen.

owlbynight wrote at 2021-12-05 18:04:10:

Why does someone else's threshold for anguish have anything to do with you?

BrandoElFollito wrote at 2021-12-05 21:45:57:

> Is suffering a valid reason to end your own life?

Yes.

> How much suffering is the threshold where suicide is "medically necessary"?

When I say so.

> How can it be quantified, what are the units?

1 unit of my will to do so.

> Does mental anguish count, or only physical suffering?

Anything I feel it to be.

> What about existential suffering?

This is everyone's choice.

> This isn't a treatment, and it's certainly not compassionate

Have you had anyone close to you try to end their life while you were helpless because _someone else_ said no?

If you did, and you still think that people should suffer no matter what, you should seriously rethink this.

If you did not, then go to a palliative ward in a hospital and talk with the people there.

Seriously - all these discussions from people who are either into religion or some books and have not experienced the problem themselves are exasperating.

bjt2n3904 wrote at 2021-12-05 23:06:45:

> If you did not then get to a palliative ward in a hospital and discuss with people.

People think this is such a winning argument.

I've been in them. I've worked in EMS. There's not been a single patient that I've encountered where I thought for a second, "I wish I could give them the option to end their life."

> Seriously - all these discussions from people who are either into religion or some books and did not experience the problem themselves is exasperating.

I'd argue it's your side of the debate which could use some more reading...

LorenPechtel wrote at 2021-12-05 21:16:06:

You have no way to quantify the suffering without having been there or at least seen the suffering of someone who has. I don't believe the form of the suffering matters, only the untreatability of whatever is causing the suffering. It has to come down to the person--at what point do they feel the suffering is too much, that whatever remaining life they have isn't worth it. Everyone is going to draw the line at a different place and I don't believe there is a right or wrong.

I dislike not having the doctors involved, though--many people won't know if there's something that can be done or not.

f6v wrote at 2021-12-05 18:34:38:

You remind me of people who talk about vaccination without any knowledge of immunology. Don't you think the countries that have legalized assisted suicide have already devised methods for quantifying suffering? You just have to spend a load of time to study their frameworks.

LorenPechtel wrote at 2021-12-05 21:17:24:

No, they haven't--because it can't be quantified, nor would it do any good if it could be. The patient decides what's too much.

paxcoder wrote at 2021-12-05 19:10:20:

Pain is a part of life; God is the answer. People belong to God; they have no right to kill themselves. Furthermore, people change their minds when they jump. In short, euthanasia is an abomination.

newintellectual wrote at 2021-12-05 19:11:09:

Simple fact: everybody dies.

The remaining question is whether you have any control over it.

aaron695 wrote at 2021-12-05 21:15:21:

This is 100% because of Futurama.

But we'll ignore that?

Sure, it's a funny meme, and I'm happy for memes to become real even at the cost of human lives; 200-million-dollar movies also cost a lot of lives.

But I'm not going to pretend I'm stupid just so we as a species can meme off a cartoon show without mentioning it. Own it. It's OK that it's from Futurama; a lot of people love Futurama, and it was originally dystopian. We also love dystopian things, probably more for other people though.

FYI Nitschke's The Deliverance Machine has been used legally

https://en.wikipedia.org/wiki/Euthanasia_device

You could see it at art galleries. Wiki says the British Science Museum has the one that was used on people.

bserge wrote at 2021-12-05 17:52:29:

High-quality exit bag, lol. Actually, I'm surprised no one sells ready-made suicide kits.

A well-made exit bag, or a capsule like this, would be totally legal pretty much anywhere.

Hmm, good idea for a business! If I don't use it first :D

questiondev wrote at 2021-12-05 20:05:15:

My anxiety is getting the best of me. Whenever I see this type of stuff, I always think of how a state could abuse it to get rid of people legally under some excuse. But then again, I am extremely distrusting of new "progress" in general.

guilhas wrote at 2021-12-05 18:23:03:

Imagine next year: besides covid vaccines, governments will also be able to mandate suicide pills to save the planet.

fetzu wrote at 2021-12-05 20:12:11:

I wouldn't worry too much; it sounds like governments are not even able to enforce the « read the article » rule yet, so you probably still have some time ahead of you.

guilhas wrote at 2021-12-05 22:15:07:

The same was said about the vaccine mandates

MeteorMarc wrote at 2021-12-05 17:31:02:

Let's call this procedure euthanasia instead of suicide.

meepmorp wrote at 2021-12-05 17:47:28:

This is literally suicide, as the person within the capsule triggers the thing that kills them. In that way, it's no different than pills or a gun.

sprucevoid wrote at 2021-12-05 18:29:03:

I suspect MeteorMarc meant 'euthanasia' in the philosophical sense of a death that is (a1) rationally planned over a period of time and (b1) in the dying person's overall interest. This can be contrasted with 'suicide' in the sense commonly used in suicide prevention care and research, where the act (a2) often results from temporary cognitive or emotional instability and (b2) is seldom in the dying person's overall interest. You, on the other hand, perhaps meant to distinguish between the dying person (a) getting assistance in dying vs. (b) acting alone.

A third terminological option is 'assisted dying'. A big complication is that 'euthanasia' comes with quite different historical baggage in different countries.

codetrotter wrote at 2021-12-05 17:43:23:

I like the term suicide, because they allow the person whose life it is to do it themselves with a button.

Reminds me of the Suicide Booths from Futurama.

https://futurama.fandom.com/wiki/Suicide_Booth

Biganon wrote at 2021-12-05 19:45:27:

No. Euthanasia is well defined in Swiss law, and it is not the same as assisted suicide. Let's not complicate things even further.

schroeding wrote at 2021-12-05 18:05:43:

"Euthanasie", the german word for euthanasia, is *heavily* stained by its meaning during the Nazi era, and the pod is made in (german speaking) switzerland.

That's probably one of the main reasons why they called it "suicide capsule", even in english :)

Edit: I don't really get why this is downvoted ^^

The link goes to an article that was originally written in German and only translated by the Swiss television station SRF.

"Euthanasia" is not a neutral term in German, take a look at the wikipedia entry and how much of the page contains "NS", "Nationalsozialismus" or dates between 1939 and 1945:

https://de.wikipedia.org/wiki/Euthanasie

It was used as a euphemism by the Nazis, and this still taints it. It has lost much of its original meaning because of this.

If you say "Euthanasie" in German, even to someone from switzerland, there is a very high chance people think of it as "the killing of people that disabled, against their will". :/

There is a reason most people use "aktive Sterbehilfe" ("active help to die") instead of "Euthanasie", even though both things mean the same.

cadence- wrote at 2021-12-05 17:48:12:

Wow, even better looking than this:

https://i.kym-cdn.com/entries/icons/facebook/000/023/876/sui...

More seriously, I like it. I'm always petrified of dying a slow and painful death from some disease that cannot be cured. I would much rather just use this and be done with it.

iso1210 wrote at 2021-12-05 18:08:12:

Go parachuting

LorenPechtel wrote at 2021-12-05 21:18:47:

If you're in good enough shape to go parachuting it's unlikely you're in bad enough shape that suicide is a sane answer.

politician wrote at 2021-12-05 17:37:53:

I'm glad to see this uses nitrogen asphyxiation. It's the humane alternative.

Edit: Clicked the link

mrshadowgoose wrote at 2021-12-05 17:42:49:

I'm not sure if you're joking, or simply did not read the article. The capsule described in the article is person-sized, and uses nitrogen asphyxiation.

jack_riminton wrote at 2021-12-05 17:57:04:

When did people stop reading the articles on here? This affliction seems to have spread from Twitter.

bogwog wrote at 2021-12-05 18:19:38:

It's not even about reading; this guy did not even click the link. If you do, the first thing you see is a picture of the thing.

Minor49er wrote at 2021-12-05 18:30:10:

He might have thought it was too big a pill to swallow

gwbas1c wrote at 2021-12-05 18:36:39:

The same problem existed on Slashdot 20 years ago. Assuming you have good karma, the easiest thing to do is downvote.