YouTube algorithm accused of glorifying death of victims in terrorist attack

WASHINGTON, 23 Feb 2023:

Gonzalez v. Google LLC is the name of the case reviewed by the US Supreme Court on Tuesday, pitting the family of a young woman murdered by the Islamic State (IS) against the Internet giant. The young woman’s mother and stepfather said they are not afraid and feel “confident” about the outcome.

“We feel confident after listening to all the arguments. We just hope that this (will) change the laws and it’ll be for the good,” said Jose Hernandez, the stepfather of Nohemi Gonzalez, the only American killed in the 2015 Paris terrorist attack.

“So other families don’t have the pain that we’re feeling,” he added after he and his wife Beatriz Gonzalez emerged from the oral hearing before the high court in Washington.

Their daughter was murdered in November 2015 in a terrorist attack perpetrated by the IS in the French capital, an attack that killed a total of 130 people.

The lawsuit holds Google liable for Nohemi’s death, arguing that the IS was able to post videos on YouTube that incited violence and urged people to join the terrorist group, and that Google’s recommendation algorithm steered potentially interested viewers towards the jihadists’ videos.

The case examines for the first time the extent to which online platforms can be held responsible for their algorithms’ recommendations of material posted by third parties, and it could ultimately change the current configuration of the Internet.

But the plaintiffs are not afraid of going up against the tech giant. Hernandez told reporters that if he and his wife had stopped to think about being afraid, they would never have pursued the case, because they know it could change the social networks, adding that Nohemi will be the catalyst for big changes in platform algorithms.

Nohemi had travelled to Paris to study there but was killed in the attack on the La Belle Equipe restaurant.

Beatriz, who is of Mexican origin, said she remembers her daughter with a lot of pride, adding that she was an independent, self-sufficient young woman of 23 and asserting “it wasn’t fair” that her life was taken.

She said a change is needed because “information moves so easily” on the social networks, where it is very easy to form groups and share information. Things need to be more strictly monitored, she asserted, not only to prevent terrorist incidents but also all sorts of other criminal activities, a sentiment with which Hernandez agreed.

The decision of the high court, which has a 6-3 conservative majority, will not be known for several months, but the couple is hopeful that when the ruling is handed down it will be “something good.”

Hernandez said they are hoping the ruling will “change everything,” noting they pursued the case so that the Supreme Court would bring some justice out of the sadness they are feeling.

“It’s a very important case,” he said.

In the plaintiffs’ sights is Section 230 of the Communications Decency Act, which is known in the tech world as the “26 words that created the modern Internet.”

Section 230 shields online platforms from liability for harm stemming from content posted by individuals, no matter how discriminatory, defamatory or dangerous that content might be.

However, a key issue still up in the air is how much liability companies may face if they are found to have enabled or abetted terrorists or other criminals in carrying out unlawful activity.

The US Supreme Court also began hearing a case yesterday involving Twitter, examining whether the firm, now owned by Elon Musk, is responsible for aiding in the commission of a terrorist attack by failing to properly remove content published by organisations like the IS.

The social network was sued along with Facebook and Google (as the owner of YouTube) by the family of Jordanian citizen Nawras Alassaf, who was killed on 1 Jan 2017 in an Istanbul nightclub by Abdulkadir Masharipov, a terrorist who attacked the nightspot and murdered 39 people.

The plaintiffs allege that, given that the terrorist organisation used these platforms to recruit members, issue threats, disseminate propaganda, create fear and intimidate the civilian population, the tech companies bear responsibility for instigating this attack.

In the plaintiffs’ opinion, the tech firms provided material support to the IS by supplying the infrastructure and services that enabled it to promote and carry out its terrorist activities, and by failing to monitor and proactively remove the terrorist content.

They are relying on the Anti-Terrorism Act (ATA) and the Justice Against Sponsors of Terrorism Act (JASTA), which permit victims of terrorism to file lawsuits for primary and secondary responsibility against any entity that aids in the commission of a terrorist act.

The high court justices will have to rule on whether, under the ATA, social platforms that host user content have aided in the commission of a terrorist act by allegedly failing to filter and remove content posted by terrorist organisations.

In yesterday’s hearing, Twitter attorney Seth Waxman focused his defence on the argument that failing to do everything possible to enforce Twitter’s rules and policies prohibiting this kind of harmful content is not equivalent to “knowingly providing substantial assistance” to posters of violent content.

He said the plaintiffs had not claimed Twitter had provided “substantial assistance, much less knowing substantial assistance, to that attack or, for that matter, to any other attack,” going on to say it was undisputed that Twitter “had no intent to aid ISIS’s terrorist activities.”

“What we have here,” he said, “is an alleged failure to do more to ferret out violations of a clear and enforced policy against assisting or allowing any postings supporting terrorist organisations or activities,” but that did not amount to “aiding and abetting an act of international terrorism.”

If the Istanbul chief of police had come to Twitter saying police had been following three user accounts that seemed to be planning some kind of terrorist act, and Twitter had not investigated, then the firm would have assumed responsibility for whatever attack those users carried out, he said.

The tech firm owned by magnate Elon Musk says the fact that the IS used the platform does not constitute knowing assistance, a stance shared by the Joe Biden administration.

Deputy solicitor general Edwin Kneedler, the government’s representative, said the firm cannot be held responsible under the ATA because Congress intended the law not to be so broad as to inhibit legitimate and important activities of companies, organisations and others.

But in the opinion of several of the high court’s justices, Twitter “knew all that” and “did nothing” about it, as progressive justice Elena Kagan put it.

How can it be said that Twitter did not provide substantial assistance, Kagan asked, adding that the social network is, in fact, providing a service to people with the explicit knowledge that those people are using the platform to promote terrorism.

As Nitsana Darshan-Leitner, an attorney for the Nawras Alassaf family, told reporters after the hearing, the suit seeks to end the “immunity of the social networks.”

“Every terror attack begins and ends on the social media. The social media have been immune for too many years. They felt that they’re untouchable, and therefore they allowed the terror organisations to use them as a tool that they never had before and cannot do without,” she said.

– EFE