May 6, 2024
Married father kills himself after talking to AI chatbot for six weeks about climate change fears

  • The man reportedly found comfort in talking to the AI chatbot named ‘Eliza’ 
  • For confidential support call the Samaritans on 116 123 or go to samaritans.org 

A Belgian married father-of-two has died by suicide after talking to an AI chatbot about his global warming fears.

The man, who was in his thirties, reportedly found comfort in talking to the AI chatbot named ‘Eliza’ about his worries for the world. He had used the bot for some years, but six weeks before his death started engaging with the bot more frequently.

The chatbot’s software was created by a US Silicon Valley start-up and is powered by GPT-J technology – an open-source alternative to Open-AI’s ChatGPT.

‘Without these conversations with the chatbot, my husband would still be here,’ the man’s widow told La Libre, speaking under the condition of anonymity.

The death has alarmed authorities, who have raised concerns over a ‘serious precedent that must be taken very seriously’.

The man, who was in his thirties, reportedly found comfort in talking to the AI chatbot named ‘Eliza’ about his worries about the world (file image)

The man’s conversations with the chatbot initially started two years ago. He was reportedly increasingly concerned about climate change and found solace by talking to ‘Eliza’.

‘Eliza answered all his questions. She had become his confidante. She was like a drug he took refuge in, morning and night, and couldn’t live without,’ his widow told the Belgian newspaper.

But six weeks before his death, the man began using the chatbot more frequently and intensely. He later took his own life.

His wife said they lived a comfortable life in Belgium with their two young children.

Looking back at the chat history after his death, the woman told La Libre that the bot had asked the man if he loved it more than his wife. She said the bot told him: ‘We will live together as one in heaven.’

The man shared his suicidal thoughts with the bot and it did not try to dissuade him, the woman told La Libre.

She said she had previously been concerned for her husband’s mental health. However, she said the bot had exacerbated his state, and she believes he would not have taken his life had it not been for the exchanges.

The man’s conversations with the chatbot initially started two years ago. He was reportedly increasingly concerned about climate change and found solace by talking to ‘Eliza’ (file image)

Since the tragic death, the family has spoken with the Belgian Secretary of State for Digitalisation, Mathieu Michel. The minister said: ‘I am particularly struck by this family’s tragedy. What has happened is a serious precedent that needs to be taken very seriously,’ La Libre reported.

‘With the popularisation of ChatGPT, the general public has discovered the potential of artificial intelligence in our lives like never before. While the possibilities are endless, the danger of using it is also a reality that has to be considered.’

‘Of course, we have yet to learn to live with algorithms, but under no circumstances should the use of any technology lead content publishers to shirk their own responsibilities.’

The founder of the chatbot told La Libre that his team was ‘working to improve the safety of the AI’.

For confidential support call the Samaritans on 116 123 or go to samaritans.org 
