Angus Crawford
BBC News Investigations

TikTok's algorithm recommends pornography and highly sexualised content to children's accounts, according to a new report by a human rights group.
Researchers set up fake children's accounts and turned on the platform's safety settings, but were still shown sexually explicit search suggestions.
The suggested search terms led to sexualised material, including explicit videos of penetrative sex.
The platform says it is committed to providing safe and age-appropriate experiences, and that it took immediate action as soon as it became aware of the problem.
In late July and early August this year, researchers from the campaign group Global Witness created four TikTok accounts, posing as 13-year-olds.
They used false dates of birth and were not asked to provide any other information to verify their identity.
Pornography
They also turned on the platform's "restricted mode", which TikTok says prevents users from seeing "mature or complex themes, such as... sexually suggestive content".
Without carrying out any searches themselves, the researchers found overtly sexualised search terms being recommended in the app's "you may like" section.
These search terms led to content of women simulating masturbation.
Other videos showed women flashing their underwear in public places or exposing their breasts.
At its most extreme, the content included explicit pornographic films of penetrative sex.
These videos were embedded in otherwise innocuous content in a successful attempt to evade content moderation.
Ava Lee of Global Witness said the findings came as a "huge shock" to researchers.
"TikTok isn't just failing to prevent children from accessing inappropriate content – it's suggesting it to them as soon as they create an account," she said.
Global Witness is a campaign group that usually investigates how big tech affects discussions about human rights, democracy and climate change.
Researchers stumbled across the problem while conducting other research in April this year.
Videos removed
They reported it to TikTok, which said it had taken immediate action to resolve the problem.
But in late July and August this year the campaign group repeated the exercise, and again found the app recommending sexualised content.
TikTok says it has more than 50 features designed to keep teenagers safe: "We are fully committed to providing safe and age-appropriate experiences."
The app says it removes nine out of 10 videos that violate its guidelines before they are ever viewed.
TikTok says that when Global Witness informed it of its findings, it took action to "remove content that violated our policies and launch improvements to our search suggestion feature".
Children's codes
On 25 July this year, the children's codes of the Online Safety Act came into force, imposing a legal duty to protect children online.
Platforms must now use "highly effective age assurance" to prevent children from seeing pornography. They must also adjust their algorithms to filter out content that encourages suicide, self-harm or eating disorders.
Global Witness carried out its second research project after the children's codes came into force.
Ms Lee said: "Everyone agrees that we should keep children safe online... Now it's time for regulators to step in."
During their research, the investigators also observed other users reacting to the sexualised search terms they had been recommended.
One commenter wrote: "Can anyone explain to me what happened to my search, please?"
Another asked: "What is wrong with this app?"
