item 33955563

New cognitive science tool to shed light on mental health

69 points | rglover | 3 years ago | darpa.mil | reply

37 comments

[+] elil17|3 years ago|reply
Contrary to many people's expectations, the most typical profile of someone who kills themselves is a 33- to 44-year-old man who presents no warning signs and leaves no note.

This is a tool that could be extremely helpful for addressing a large portion of suicides.

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3777349/

https://www.health.harvard.edu/blog/suicide-often-not-preced...

https://www.medpagetoday.com/meetingcoverage/cap/82375

[+] smeeth|3 years ago|reply
You and I seem to have different definitions of "warning signs." From the medpagetoday story you linked:

- 239/657 suicides were people under psychiatric care (please note they used present tense, more of them might have had a history of care)

- 187/657 had a previous attempt (!!!)

- "About 22% of the deaths were accompanied by no known inciting event or identified life stressor." Phrased differently, 78% of suicides were accompanied by a life event or stressor.

Contrary to popular belief, risk prediction isn't usually rocket science. Suicide correlates are extremely well studied, and any experienced mental health professional can point high risk individuals out to you if they ever cross paths with one.

The most challenging problem often is: what do you do about it? Can you get them the help they need? Can you navigate them through a big bureaucracy like the VA? Do you have grounds to forcibly keep them in in-patient psychiatry? Etc, etc.

Source: Relative is a psychologist @ the VA.

[+] derbOac|3 years ago|reply
My impression (I do research in this area, broadly speaking) is that this sort of tool is not really related to the reasons why such people present no warning signs, etc.

Implicit measures tend, under replication and rigorous scrutiny, to be very broad-brush and not very specific. There is a literature on implicit measurement and suicide, and the findings seem to hold up, but they don't add much to existing measures, and there are also tricky issues about the meaning of "preconscious" assessment. So: lots of false positives, and maybe even bigger questions about the meaning of implicit processing relative to conscious opportunities for "revision".

I think the problem is: what good does it do if someone has some predilection toward suicide but isn't aware of it? If they were going to be resistant to addressing it without this tool, why would they be any less resistant with it?

The problem with suicide prediction has never really been in predicting suicidal attitudes or cognitions, it's been predicting the actual follow-through with an attempt or success. That depends on a lot of stuff that exists outside of attitudinal space, for lack of a better way of putting it. Someone can basically feel disturbed by the idea until they don't, and then change rapidly in their feelings about it, or vice-versa.

I can't say I think this research line is a bad idea; the more information the better. But I'm really skeptical about how well it will hold up when skeptically evaluated, with lots of data, especially once you put it into practice. There are a lot more hurdles than people think.

[+] 90d|3 years ago|reply
The road to hell [...] good intentions.
[+] cphoover|3 years ago|reply
How about we fix the existing support systems? Popular media constantly exalts things like suicide hotlines/chat or mental health services. But those services often don't work when people need them: they are either unavailable (callers are put on hold indefinitely) or the applications and websites have bugs that make them totally unusable.

Also the way healthcare is tied to employment in the US is a huge health risk for anyone changing jobs or at risk of termination.

[+] barbazoo|3 years ago|reply
> How about we fix the existing support systems?

From the article:

> Using the preconscious will hopefully enable us to detect signs of depression, anxiety, or suicidal ideation earlier and more reliably than ever before. If successful, NEAT will not only significantly augment behavioral health screening, but it could also serve as a new way to assess ultimate treatment efficacy, since patients will often tell their clinicians what they think the clinician wants to hear rather than how they are truly feeling.

Wouldn't this count as an attempt to fix the existing support system by helping identify those in need, which is always the first step?

[+] boredumb|3 years ago|reply
This will most certainly be used to limit US citizens' Second Amendment rights. The unseen use cases for this will be equally horrifying. DARPA creating a precrime detection tool is about as dystopian as it gets.
[+] djexjms|3 years ago|reply
That's one take on it. The other is that it could potentially save literally tens of thousands of lives. Immediately classifying a potentially life-saving medical technology as a threat to the Second Amendment might not be the best look for 2A advocates.
[+] nbaugh1|3 years ago|reply
We could use a little more restriction on access to weapons in this country tbh
[+] thinkmcfly|3 years ago|reply
What is the precrime tool you're talking about?
[+] alexfromapex|3 years ago|reply
As "cool" as this is, providing actual healthcare for the veterans would be a much better step than creating a tool to flag people with mental health issues.
[+] rafaelero|3 years ago|reply
Why do you believe they don't already have access to healthcare?
[+] kayodelycaon|3 years ago|reply
So, they created what they think is a better lie detector. Even if they claim otherwise, it will get used for this purpose. And it will get used against people as much as possible.

The temptation is just too high. We've seen this happen with polygraph tests, to the point where they were being used in job interviews.

The only hope I see here is US courts treating it as a polygraph device and effectively banning it from common use.

[+] jessegavin|3 years ago|reply
> NEAT is not focused on lie detection, truth detection, or assessing someone’s credibility but, rather, on aggregating preconscious brain signals to determine what someone believes to be true.

How is this NOT a lie detector?

This would definitely be useful when interrogating someone right?

[+] rglover|3 years ago|reply
Yes. It's a "mind reader" and it would be granted blind trust/authority by whoever relies on it like all other technology ("but NEAT says you want to kill yourself, Jerry."). I view this as a weapon, not a benevolent tool for helping veterans.
[+] bjtitus|3 years ago|reply
Wonder how long it will take until people are being put on mental health holds based on this data.
[+] snapcaster|3 years ago|reply
This is really worrying; hopefully it doesn't work and this research fails.
[+] adventured|3 years ago|reply
If the research actually works, they'll pretend it failed and it'll vanish.
[+] rafaelero|3 years ago|reply
This is so obviously how things should be done in psychiatry. These types of diseases most certainly have some signals that can be collected and interpreted by an AI to offer a diagnosis. No idea why it is taking so long.
[+] NumberlessMan|3 years ago|reply
If you think this is a good idea, [I think] you need to understand the discipline of psychiatry better.

My impression is that anything about you that is non-functional to you as a rational, self-interested agent in the world counts as a disorder; see [1] for an example.

Read about how they characterize delusions [2]. Essentially, if you don't preface everything you say with the words "I think", you are deluded.

It's an interesting field, but [I think] it's dangerous to put too much power in the hands of this field. [I think] it blunts a lot of normal humanity.

[I think] having suicidal thoughts is a normal part of an open-minded human life. [I think] going through a very bad, near-suicidal crisis should be able to happen without the state labeling you in a fashion that can absolutely ruin your life should you decide to return from it. [I think] allowing people the privacy of their own minds is an essential human freedom, one that we abandon at our peril.

In addition, [I think] forcing people to state "I think" at the beginning of every sentence they say is a way to limit their persuasive power. Which is an interesting norm, but [I think] it needs to be universally applied, as otherwise [I think] it can be used as a way to defang opposition to authority.

That said, I wish all people tormented by their own minds the best possible future. I think therapy can help, as can reaching out to people who love you, if you have any in your life. But if you don't want to tell your therapist about your suicidal tendencies because you're worried about how a nervous bureaucracy will respond, I think that's your right.

Involuntary psychiatric care is no picnic. Having your freedoms taken away from you because of things you said, and finding you can't find the right sequence of magic words to leave is about as frustrating a position as a person can be in. What you might view as a serious bump in the road is viewed from the outside as a whole new social trajectory for you.

The people I met in there told me everything would get better once I just accepted I would never leave. That this was my life now. To stay calm, I did have to imagine a life where I never left, and figure out a way to be okay with that. It's a very hard mental exercise if you've ever wanted good things for your own life, and it doesn't surprise me not everyone can manage it without acting out.

[1] https://en.wikipedia.org/wiki/Scrupulosity

[2] https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3016695/

[+] armatav|3 years ago|reply
What about one that uses the preconscious to detect criminal intent before it happens?

Could file it into a sort of very small report