Texas Republican Senator Ted Cruz just tore into Mark Zuckerberg over why some Instagram users are directed toward potential child sexual abuse material.
His team puts up a slide showing an Instagram prompt that warns users they may be about to see CSAM and asks if they would like to “see the results anyway”.
“Mr Zuckerberg, what the hell were you thinking?” Cruz yells into his microphone.
The Meta CEO is interrupted several times as he attempts to answer, eventually pleading “give me time” to reply.
Zuckerberg promises to “personally look into it” as Cruz continues to grill him, complaining that the company’s official response, when it arrives in several days, will be a lawyerly one.
Zuckerberg tries to give nuanced answers – but lawmakers want ‘yes’ or ‘no’
Things got quite awkward just now in the hearing room.
Mark Zuckerberg is showing increasing frustration that he is not being allowed to provide answers with nuance as lawmakers push for quick yes or no answers.
Missouri Republican Senator Josh Hawley peppered Zuckerberg with questions about whether he is responsible for harming children and whether he should apologise.
He then essentially forced Zuckerberg to stand up, turn around and apologise to family members in the crowd.
Several parents held up placards of their children as he turned to face them.
Crowd laughs when Zuckerberg says sexually explicit content is not allowed
A short interaction between Senator Mike Lee and Mark Zuckerberg just caused some members of the crowd to burst into laughter.
Zuckerberg said: “My understanding is we don’t allow sexually explicit content for anyone.”
“How is that going?” Lee responds, causing claps and laughs from the audience.
Zuckerberg goes on to say that about 99% of the content Meta removes is identified by AI, and that he believes Meta is an industry leader in this area.
Zuckerberg wants Apple and Google to play bigger role in child safety
Facebook founder Mark Zuckerberg has come to Congress today with a clear vision – that Apple and Google could play a bigger role in keeping children off platforms or ensuring they have age-appropriate experiences on social apps.
Zuckerberg says Apple and Google’s app stores could be the “easiest” and “right” place to check the age of child users or let parents verify themselves, rather than have to upload ID to many different apps.
He adds that since the two giants’ app stores already require parental consent when children make payments in apps, “it should be pretty trivial to pass a law that requires them to make it so that parents have control any time a child downloads an app and offers consent of that”.
Senator Amy Klobuchar responds that such processes aren’t simple enough for parents – and the courtrooms and halls of Congress offer them a smoother path for protecting their kids online.
Senator says Facebook founder has ‘blood on his hands’
Each of the tech bosses is reading a prepared statement about their company; Mark Zuckerberg is currently speaking.
So let’s revisit a moment from earlier, when Republican Senator Lindsey Graham told the hearing that social media companies are “destroying lives” and “threatening democracy itself”.
Graham tells the story of a young man who was extorted on Instagram and ended up taking his own life.
He then addresses Mark Zuckerberg, whose company Meta owns Instagram.
“Mr Zuckerberg, you and the companies before us – I know you don’t mean it to be so, but you have blood on your hands. You have a product that’s killing people” – Lindsey Graham, Republican Senator
Many of the spectators in the audience cheer and clap in response.
What is CSAM – and how do platforms detect it?
Liv McMahon
Technology reporter
One of the biggest concerns platform bosses will have to answer questions about today is Child Sexual Abuse Material (CSAM) – sexually explicit content depicting a child.
The term also applies to self-generated imagery and material created using artificial intelligence.
Social media platforms typically use a combination of automated and human processes to review content flagged as potentially violating their policies, including CSAM.
Industry-developed tools such as Microsoft’s PhotoDNA, YouTube’s CSAI Match (Child Sexual Abuse Imagery Match) and Google’s Content Safety API help platforms identify and report violating content to the National Center for Missing and Exploited Children (NCMEC).
Many of these use image or hash matching to detect content which might be CSAM.
Hash matching sees a piece of content given a unique digital signature, or ‘hash’, so it can be checked against signatures belonging to existing content, such as images held in databases of known child abuse material, to find copies or matches.
The NCMEC’s Take It Down portal helps people in the US remove explicit imagery of themselves taken when under the age of 18 by using hash matching to identify and remove copies of the content if found on other platforms.
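As a rough illustration of the idea, the short Python sketch below computes a signature for an uploaded image and checks it against a made-up set of known signatures; the hash list and function names here are purely hypothetical. Real tools such as PhotoDNA use perceptual hashes that can still match an image after resizing or re-encoding, whereas the cryptographic hash used in this sketch only catches exact copies.

```python
import hashlib

# Hypothetical set of signatures of known abusive images.
# In practice, platforms check against hash lists maintained by
# bodies such as NCMEC rather than a hard-coded set like this.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def compute_signature(image_bytes: bytes) -> str:
    """Return a hex digest acting as the image's digital signature."""
    return hashlib.sha256(image_bytes).hexdigest()

def is_known_match(image_bytes: bytes) -> bool:
    """Check an uploaded image's signature against the known-hash set."""
    return compute_signature(image_bytes) in KNOWN_HASHES
```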