A Houston man has been arrested after Google sent a tip to the National Center for Missing and Exploited Children saying the man had explicit images of a child in his email, according to Houston police.
The man was a registered sex offender, convicted of sexually assaulting a child in 1994, reports Tim Wetzel at KHOU Channel 11 News in Houston.
“He was keeping it inside of his email. I can’t see that information, I can’t see that photo, but Google can,” Detective David Nettles of the Houston Metro Internet Crimes Against Children Taskforce told Channel 11.
After Google reportedly tipped off the National Center for Missing and Exploited Children, the center alerted police, who used the information to get a warrant.
A search of the man’s other devices turned up more suspicious images and text messages. Police arrested him, and he is being held on $200,000 bond.
On one hand, most people would certainly applaud the use of technology to scan email in a case like this.
On the other, the case renews the debate over how much privacy users can expect when using Google services such as Gmail. In a word: none.
A year ago, Google said as much in a court brief. Then in April, shortly after a lawsuit over its email scanning failed to win class-action status, Google updated its terms of service to warn users that it automatically analyzes email content.
Considering Google has been working to fight online child sexual abuse since 2006, it stands to reason the company would scan emails looking for those sorts of images. Google has never come right out and said so, but it hinted strongly at it about a year ago when Jacquelline Fuller, director of Google Giving, specifically mentioned the National Center’s “CyberTipline,” which receives leads and tips regarding suspected crimes, in a blog post. Fuller wrote:
In 2011, the National Center for Missing & Exploited Children’s (NCMEC’s) CyberTipline Child Victim Identification Program reviewed 17.3 million images and videos of suspected child sexual abuse. …
Since 2008, we’ve used ‘hashing’ technology to tag known child sexual abuse images, allowing us to identify duplicate images which may exist elsewhere. …
We’re in the business of making information widely available, but there’s certain ‘information’ that should never be created or found. We can do a lot to ensure it’s not available online—and that when people try to share this …
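Fuller’s post doesn’t describe how the tagging works, but the idea of hash-matching known images is easy to sketch. The example below is a minimal illustration, not Google’s actual system: the KNOWN_HASHES set and both helper functions are hypothetical, and it uses plain SHA-256, which only catches byte-identical copies. Production systems rely instead on perceptual hashes (Microsoft’s PhotoDNA is the best-known example) that still match after resizing or re-encoding.

```python
import hashlib
from pathlib import Path

# Hypothetical digest set standing in for a clearinghouse database of
# known flagged images. A real deployment would receive these hashes
# from an organization such as NCMEC, not hard-code them.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def sha256_of_file(path: Path) -> str:
    """Compute a file's SHA-256 digest, reading in chunks to bound memory."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def flag_known_files(root: Path) -> list[Path]:
    """Return every file under root whose digest appears in KNOWN_HASHES."""
    return [
        p for p in root.rglob("*")
        if p.is_file() and sha256_of_file(p) in KNOWN_HASHES
    ]
```

Because an exact hash changes completely when even one byte of a file changes, this approach can identify duplicates of already-tagged material but cannot find new or altered images; that limitation is why perceptual hashing matters in practice.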