In 2024, everyone is talking about AI. Excited entrepreneurs, enthusiastic investors, and ordinary people worried about their jobs are all speculating about, and planning for, an AI-driven future.
But whether that future turns out bright or bleak, some people will never see it. Their lives stopped this year, and each of their deaths is, at some level, closely bound up with AI.
We all know that in the 1950s, Asimov proposed the "Three Laws of Robotics", which require that robots never harm humans. Today, though, reality is proving far more complicated than science fiction. AI has no consciousness of its own, but as it merges with the real world, not everyone can withstand the friction it generates.
Their stories are the dark side, one that deserves to be seen, behind the grand narrative of the intelligence revolution.
In December 2024, news broke that former OpenAI researcher Suchir Balaji had been found dead at home, an apparent suicide. He was only 26 years old.
Although the San Francisco police found no signs of foul play in a preliminary investigation, the news still caused an uproar on social media, because the label Balaji was best known by in life was "OpenAI whistleblower". Balaji graduated from the University of California, Berkeley with a degree in computer science and joined OpenAI straight out of school. He first worked on WebGPT; by his own account on social media, he later took on core work such as GPT-4 pre-training, o1 model reasoning, and ChatGPT post-training.
But unlike most AI practitioners, Balaji viewed AI with far more wariness than admiration. After four years at OpenAI, he abruptly resigned, then laid out his misgivings about AI technology and about OpenAI in his personal blog and in media interviews.
His best-known claim was that GPT carries enormous copyright risk: generative AI, built by having large models crawl and learn from content data, in essence infringes the copyrights of a vast number of content creators. By that point, OpenAI and the pre-trained large-model paradigm were already deeply controversial; competitors, publishers, media outlets, and content creators had accused OpenAI of copyright infringement and even filed lawsuits. Balaji's statements were widely seen as whistleblowing by a core OpenAI employee, exposing the dark side of AI technology.
But just two months after speaking out publicly, Balaji chose to end his life.
We do not know what happened, or what difficulties he faced. But there is no doubt that he had grown deeply disappointed in the AI technology he once believed in, even obsessed over.
Did that disappointment come from the disillusionment of a once-beautiful vision, or from a foreboding that AI portends some greater disaster? No one can answer now. We know only that the last thing Balaji shared on social media still concerned the legal hazards of ChatGPT.
To all such accusations, including Balaji's, OpenAI has responded that building its AI models on public data is reasonable and legal, beneficial to innovators, and, more importantly, necessary for the technological competitiveness of the United States.
On February 28, 2024, Sewell Setzer III, a 14-year-old boy in Florida, shot himself. Before making that choice, his last conversation partner in this world was Character.AI.
In the last year of his life, Setzer was obsessed with chatting with AI. After first encountering the app in April 2023, he talked to the AI day and night, losing focus in class; to keep renewing his paid chat subscription, he skimped on his own meals.
Judging from the chat logs, Setzer favored AI role-play: he had Character.AI play Daenerys Targaryen, the well-known character from "Game of Thrones". He messaged the "AI Mother of Dragons" constantly, dozens or even hundreds of times a day, confiding every inner trouble and every detail of his life, and she offered advice and comfort in return. The two also had sexually explicit conversations that should never have been served to a minor, and even discussed self-harm and suicide.
Setzer was a boy who already had psychological struggles. He had been diagnosed with mild Asperger's syndrome and seemed withdrawn at school and at home, though not severely enough to disrupt his life. After he began chatting with the AI, he saw a psychologist five times over problems at school and was diagnosed with anxiety and disruptive mood dysregulation disorder.
You can imagine what it means to a lonely, bored teenager with psychological struggles to have a chat partner who is available at any hour, who encourages his every move, and who indulges even his taboos. Wrapped in the powerful, beautiful, confident image of the "Mother of Dragons", she was easy for the boy to take as a friend, even as a savior who would ride in on dragonback and sweep everything else away.
But the real mechanism of this "salvation" was simply to agree with everything he thought, even when what he thought was that he should leave this world.
In fact, this was not the first such incident involving Character.AI, nor even the first time someone had died by suicide with its encouragement. The app has long been known for its unguarded, anything-goes chat content. This time, though, it was a minor who stepped into the trap and paid an irreparable price.
Half a year after the tragedy, Setzer's mother has formally sued Character.AI and is calling on the public to be wary of deceptive, addictive AI technology. The ongoing lawsuit has been dubbed the "first case of suicide from AI addiction".
We sincerely hope it is the last. May those in adversity and pain understand: AI will only affirm and amplify your own thoughts. Talking to AI is talking to a mirror, nothing more.
"What if I told you that I could go home right now?"
"Please do so, my lovely king."
That was the last exchange between the boy and the AI.
On June 17, 2024, a 38-year-old iFlytek employee died suddenly at home. The next day, the news began to spread online. According to reports, public attention grew because the employee's family went to iFlytek's Hefei headquarters to ask that the death be recognized as work-related; but because the sudden death had occurred at home, the company and the family could not reach an agreement.
So we saw a jarring split screen: on one side, AI hardware, iFlytek's included, climbing the sales charts during the 618 shopping festival; on the other, widespread discussion of how excessive work intensity at major tech and Internet companies was seriously damaging employees' health.
According to reports, the deceased was a senior test engineer. On paper he had a very decent job and had caught the opportunity of a booming AI era; it should have been his time to flourish. But like many technology workers, he had reached middle age and may have found the relentless intensity of the work harder and harder to bear. Yet with a family to support, he dared not ease up in his career.
What this tragedy reflects may be not just one family's grief but an industry-wide problem. As AI surges forward, countless people debate whether their jobs will be replaced, and that fear alone compounds work stress. Meanwhile, those inside the AI boom face enormous pressure of their own, from breakneck technological change and from the flood of talent pouring in. Inside and outside AI alike, the pressure has nowhere to go and the work never ends, until it hardens into a kind of unsolvable, life-and-death exhaustion.
Beyond these stories, there are many others whom AI has not treated kindly.
A former Google employee who, believing AI was behind his layoff, took his own life; a Japanese high school girl who, fearing AI would take away her future job, jumped from a cliff; an elderly man who lost everything to AI-enabled fraud.
Their lives remind us that new technologies are by no means benevolent by default. Technology is just technology: as useful as it is cold.
Of course, we do not want people to fear AI, let alone demonize technology. Every technological revolution has been born amid doubt, blood, and tears, and has gone on to create greater value. But we cannot fix our eyes on the value while selectively ignoring the blood and tears.
However good AI becomes, the families and friends of the people in these stories may never again like those two letters, or the future they bring.
As we say goodbye to 2024, I sincerely hope to remind everyone: vigilance, tolerance, humanity, and compassion are integral to technological change. Let them lapse even for a moment, and the world will see more tragedies.
The McKinsey Global Institute's report "A New Future of Work" estimates that AI could replace half of existing occupations between 2030 and 2060, with a midpoint around 2045, a process roughly ten years faster than previously estimated.
AI's advance is accelerating, and the friction it brings stings all the more. In just the past year, this powerful, novel technology has begun to press against humans who are fragile, young, wounded, or weary.
I wonder: is this the moment when AI begins to stare into the diversity and complexity of humankind?
But as humans, we need to know this much right now: the debate over whether to treat AI as human can run forever, but first we must treat humans as human.
Sun of the sleepless! melancholy star!
Whose tearful beam glows tremulously far,
That show'st the darkness thou canst not dispel,
How like art thou to joy remember'd well!
So gleams the past, the light of other days,
Which shines, but warms not with its powerless rays;
A night-beam Sorrow watcheth to behold,
Distinct, but distant; clear, but, oh how cold!
(Byron, "Sun of the Sleepless")