Child’s dance uploaded to YouTube and all Google accounts closed


When a Colorado woman found out that her account had been deactivated by Google, she felt, in her words, ‘as if her house had burned down’. In an instant, she lost access to her wedding photos, videos of her son growing up and everything else she had stored in what she thought was the safest place. She had no idea why.

Google refused to reconsider the decision in August, saying her YouTube account hosted harmful content that might be illegal. It took her weeks to work out what had happened: the cause was a YouTube Short of her child dancing naked. Her 9-year-old son confessed to uploading the video from an old smartphone.

Google uses an automated, algorithmic monitoring system to prevent the sharing and storage of abusive images of children on its platforms. If a photo or video uploaded to the company’s servers is judged to be sexually explicit material featuring a minor, Google deactivates the user’s account across all its services. Users have the right to challenge Google’s action, but in the past they had little opportunity to explain a nude photo or video of a child.

Google has updated its appeal process

According to the New York Times, Google will allow users facing accusations of child sexual abuse to prove their innocence. Content deemed abusive will still be removed from Google and reported, but users will be able to explain why it appeared on their accounts.

Susan Jasper, Google’s head of safety operations, said in a blog post that the company will “provide more detailed justifications for account suspensions,” adding, “We will also update our appeals process to allow users to provide more information to help us understand the content detected in their accounts.”

In recent months, the Times has reported on the power tech companies wield over the most private parts of their users’ lives, and examined several examples of Google’s previous review process.

In two separate cases, parents photographed their young children to facilitate a medical diagnosis. The algorithm automatically flagged the images, and then moderators decided that the images violated Google’s rules. Police found that the fathers had committed no offense, but the company deleted the accounts.

He got six months of data back on a USB stick

Two fathers, one in California and the other in Texas, were unable to get past Google’s previous appeal process: it gave them no way to submit medical records, communications with their doctors, or police documents clearing them of any crime. The father in San Francisco finally took legal action and got six months of data back from Google on a USB stick.

Google spokesperson Matt Bryant said in a statement: “When we find child sexual abuse material on our platforms, we remove it and suspend the account. We take the consequences of suspending an account seriously, and our teams are constantly working to minimize the risk of an incorrect suspension.”

Tech companies that offer free services to consumers are notoriously bad at customer support. Google has billions of users and in 2021 deactivated more than 270,000 accounts for violating its rules against child sexual abuse material. In the first half of 2022, the number of deactivated accounts was more than in all of 2021.

Colorado mom took four months to get her account back

“We don’t know what percentage of these are false alarms,” said Kate Klonick, an associate professor at St. John’s University School of Law, who predicted that even a 1 percent error rate would mean hundreds of appeals a month.

Evelyn Douek, an assistant professor at Stanford Law School, said more details are needed on how Google’s new appeals process will work. “Just creating a process doesn’t solve everything. The devil is in the details. Is reconsideration meaningful? What is the timeline?” she added.

It took four months for the Colorado mother, who did not want her name used to protect her son’s privacy, to get her account back. Google reinstated the account after the Times reported on it.

“We understand how upsetting it is to lose access to your Google account and the data stored in it due to a misunderstanding,” Google spokesperson Bryant said in a statement. “These cases are extremely rare, but we are working on ways to improve the appeals process when people come to us with questions about their accounts or believe we made the wrong decision.”

Google did not notify the woman that her account had been reactivated, and she learned of the decision ten days later from a Times reporter. When she logged in, she found that everything had been restored except the video of her son. A message appeared on YouTube saying that her content violated community guidelines. “As this is the first time this has happened, this is just a warning,” the message said.

“I wish they would have done this in the first place,” said the mother, who also received an email from the Google Team on December 9. “We understand that you have tried to appeal this decision several times, and we apologize for any inconvenience this has caused. We hope you can understand that we have strict policies in place to prevent our services from being used to share harmful or illegal content, especially horrific content like child sexual abuse.”

In addition to Google, many companies monitor their platforms to prevent the widespread sharing of images of child sexual abuse. Last year, more than 100 companies sent 29 million reports of suspected child abuse to the (US) National Center for Missing and Exploited Children. Data scientists at one company analyzed some of the flagged material and found examples that qualified as illegal under federal law but were not ‘malicious’: in a sample of 150 flagged accounts, researchers found that more than 75 percent “did not exhibit malicious intent.”

Jason Scott, a digital archivist who wrote a blog post in 2009 warning people not to trust the cloud, said companies should be legally obliged to give users their data even if an account is shut down for violations, adding: “Data storage should be like tenant law. You can’t hold someone’s data and not give it back.”

New York Times report. 
