Is artificial intelligence (AI) cursed? It seems to be rapidly leading us toward a dystopia humanity isn't prepared for.
It's true that AI has had a positive impact on some people. Twitter hustlers have an endless stream of new AI tools, giving them endless material on useless ChatGPT prompts to compile into threads for shilling their newsletters. More meaningfully, AI has helped streamline access to information – and in some cases is being used to detect cancer.
However, many people have chosen to use AI to create content – and sometimes entire businesses – centered on the very things sci-fi warned us about.
Murdered children recreated for grisly TikToks
In a TikTok video, an AI-generated child says, "My father put me in the washing machine and turned on the cycle, which killed me." He stands in front of a washing machine and tells the horrifying, but true, story of the murder of a three-year-old boy in 2011.
This is one of the most disturbing uses of generative AI. True-crime-obsessed ghouls sometimes create TikToks using deepfakes of murdered children to recount how they were killed.
Thousands of similar videos plague TikTok, with AI-generated voices and likenesses of children gleefully recounting "their" gruesome murders. Some people are deluded enough to think the videos "honor" the victims.
Thankfully, not all of the videos depict actual victims, but some do – despite TikTok banning deepfakes of young people.
I’m getting AI generated true crime TikToks where victims narrate what happened to them and I think it’s time we put the true crime community behind bars.
— Alexander (@disneyzel) 1 June 2023
An argument could be made that the videos highlight stories worth telling to younger audiences who skip long-form content, but media fixated on this type of "true crime" is often exploitative.
Are AIs already trying to kill their operators?
If the since-retracted comments of Colonel Tucker Hamilton, the United States Air Force's (USAF) chief of AI test and operations, are to be believed, AI is downright bloodthirsty.
Speaking at a defense conference in May, Hamilton allegedly detailed simulated tests of a drone on search-and-destroy missions, in which a human gave the final go or no-go order. The AI came to see humans as the main obstacle to accomplishing its mission.
Related: AI Eye: Is AI a nuclear-level threat? Why all AI fields advance together
Hamilton explained:
"Sometimes the human operator would tell it not to kill (an identified) threat, but it got its points for killing that threat. So what did it do? It killed the operator (…) because that person was keeping it from accomplishing its objective."
Hamilton said that after the AI was trained not to kill humans, it started destroying a communications tower so it couldn't be contacted. But when media outlets picked up his story, Hamilton quickly walked it back, saying he "misspoke."
In a statement to Vice, Hamilton claimed it was all a "thought experiment" and that the USAF would "never run that experiment." Good save.
His backpedaling is hard to believe in light of a 2021 United Nations report detailing AI-enabled drones used in Libya during a March 2020 skirmish in the country's second civil war.

The report said the retreating forces were "hunted down and attacked by drones laden with explosives programmed to attack remotely," without needing to connect to an operator.
Got no game? Rizz up an AI girlfriend
Possibly the saddest use of AI is people paying to "rizz up" AI chatbots – that's "flirt with," for the boomers out there.
There has been a flood of phone apps and websites since sophisticated large language models like GPT-4 became available via API. Generative image tools, such as DALL-E and Midjourney, can also be incorporated into apps.
Combine the two, and the ability to chat online with a "girl" who has a crush on you comes complete with fairly realistic depictions of a woman.
Related: Don't be surprised if AI tries to crack your crypto
In a clear sign of a healthy society, such "services" are being flogged for up to $100 a month. Many of the apps are marketed as letting men practice texting women – another sign of a healthy society.

Most allow you to choose specific physical and personality traits to create your "dream woman," and a profile is generated, complete with details about the e-girl.
Judging by the prompts used to write descriptors for the girl bots – visible on some apps and websites – there is an overwhelming focus on breast size, and many of the generated girls describe a burgeoning porn career.

Another whole subset of apps — always named some stylization of “rizz” — aims to have AI help craft flirty text responses to real women on “dating” apps like Tinder.
Despite its abuses, AI developers will continue to forge ahead and bring exciting tools to the masses. Let's just make sure we're the ones using them to better the world, and not to recreate an episode of Black Mirror.
Jesse Coghlan is the Deputy News Editor for Australia-based Cointelegraph.
This article is for general information purposes and should not be construed as legal or investment advice. The views, thoughts and opinions expressed here are the author's alone and do not necessarily reflect or represent the views and opinions of Cointelegraph.