On intimate photos and videos: Distinguishing minors from adults is a problem for Facebook

Moderators who review images for Facebook face difficult decisions

Last year, Facebook reported 27 million sexual images of minors to the authorities. But it is not always clear how old the people in the photographs are, and those who have to decide quickly find themselves in a quandary.

When it comes to possible sexual images of minors, no platform reports as many suspected cases to US authorities as Facebook. Every day, thousands of moderators review photos and videos, looking for violations of the site’s terms of use or applicable law. The decisions they have to make are sometimes extremely difficult because the age of the people depicted is not always clear. New guidelines are meant to help: as The New York Times reports, moderators are instructed to assume that people are of legal age when in doubt.

For Facebook’s parent company Meta, this is plainly a choice between two evils. As Meta’s head of safety, Antigone Davis, said in an interview with the Times, “Internet sexual abuse of teens is disgusting,” but the consequences for people falsely accused of uploading underage sexual material can be life-destroying. The guidelines are therefore designed to protect users who post erotic material that complies with the platform’s rules.

Many cases remain unsolved

However, child protection experts warn that this could leave a large number of minors exposed to the publication of explicit material. Studies have shown that young people today develop earlier than in the past; Black and Hispanic adolescents in particular reach puberty considerably earlier than, for example, adolescents of European descent. Combined with the relaxed reporting rules, this leaves a whole group of teens unprotected, according to Lianna McDonald, executive director of the Canadian Centre for Child Protection.

There are also internal problems with how content is reported, employees say behind closed doors; confidentiality agreements keep them from speaking publicly. Although companies are required by law to report prohibited material immediately, they have little protection from lawsuits if a report turns out to be erroneous and damages the reputation of the people concerned.

Four former employees told The New York Times how this affected their day-to-day work. According to them, anyone who made too many bad calls risked a negative evaluation and, with it, their job. As a result, moderators were reluctant to report photos and videos unless they were entirely sure the material was illegal.

Overburdened authorities are no reason to look the other way

Apple, Snap, and TikTok, at least, confirmed to The New York Times that they take a different approach: when in doubt, they prefer to report images. But that is not the last word, Meta countered; after all, too many reports only tighten the real bottleneck, the overburdened agencies. “If the system is too crowded with useless stuff,” Davis said, “then it’s a real burden.”

Meanwhile, US authorities side with companies that report all suspicious cases as a precaution. According to the clearinghouse that reviews the reports and alerts police when material is illegal, easing the authorities’ workload cannot be a reason not to report cases. Investigators take the same view. “No one should be reluctant to report a possible crime, especially a crime against a minor, because they think the police are too busy,” said Chuck Cohen, who led the Indiana State Child Exploitation Task Force for 14 years.

Ultimately, companies on both sides of the debate seem to want to protect themselves as best they can without taking risks. The main problem is that US courts and laws are vague about exactly which criteria define “obscene” material and what companies should look for when evaluating sexual images. “I couldn’t find any court that even came close to answering the question of how to find that balance,” said Paul Ohm, a former prosecutor in the Department of Justice’s computer crime unit.