New Zealand Privacy Commissioner John Edwards has used his keynote address to the IAPP ANZ Summit in Sydney this week to discuss some of the challenges of the new digital economic order, specifically how one country can take on the tech giants.
"Digital platforms need to adapt to the jurisdictions in which they operate -- not the other way around," he said.
There are over 100 data protection and privacy commissioners around the world, each working within their own country and providing independent regulation under a "patchwork quilt of domestic laws".
"It is a challenge for me to meaningfully coordinate domestically with authorities with a shared interest in the values, mores, and cultural settings we are ingesting with our imported technology," Edwards said.
"When you multiply the challenge across every jurisdiction with a censor, electoral commission, competition authority, consumer protection authority and online harms agency, the scale of the fragmentation becomes immediately evident. Yet 'They' remain 'One'."
Edwards shared his concern that in 2019 there are companies bigger than countries; Facebook has a "population" of over 2 billion, none of whom has voting or regulatory rights there.
"They move fast and break things, innovate at the speed of fibre optic broadband, deliver fantastic services that improve the lives of billions, and leave regulators in the dust; unable to keep up, unable to match their resources unable to assert effectively that their 'one size fits all product' does not in fact, fit all," he said.
Facebook previously said the video of the terror attack in Christchurch was viewed around 4,000 times before it was removed, and that the first user report came 29 minutes after the broadcast began.
Despite its removal, approximately 1.5 million attempts were made to re-upload the video in the first 24 hours after the attack. Over 1.2 million of those were blocked at the point of upload, leaving approximately 300,000 copies that were published before being taken down.
Over on YouTube, a copy of the video was uploaded as often as once every second during the first 24 hours following the terrorist attack.
Edwards said Facebook knew of the potential for its service to be used in this way before launching it.
"It knew, and failed to take steps to prevent its platform, and audience and technology from being used in that way," he said.
"It was predictable and predicted. And the company responsible was silent -- confident that the protection afforded by its home jurisdiction would shield it from liability everywhere."
Edwards said that neither New Zealand nor Australia voted on or was consulted about Section 230 of the US Communications Decency Act, yet both still feel its effects: "No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider."
But, as Edwards said, pharmaceutical entrepreneurs can't just launch a product without strict testing and regulation.
"Facebook launched its live-streaming product with the knowledge that it could be used for the purposes the gunman ultimately chose on March 15 and went ahead anyway," he said.
"They knew the platform could be used to broadcast rape, suicide, murder, and other such content. They assured the public that a combination of artificial intelligence, and human moderation would mitigate those risks and allow the public to have the benefits of the technology."
As disclosed in a statement to the US Congress by Facebook's policy director for counter-terrorism, the company's AI did not initially pick up the video because it had been trained mostly on ISIS beheading footage; at the time, the gunman's video simply did not contain "enough gore" to trigger it.
"Data protection laws alone cannot combat these harms. It may be that we need some more agile consumer protection mechanism to allow data protection and privacy authorities to work together with other consumer safety regulators to respond more assertively to the emergence of harmful products in the marketplace," Edwards said.
"One of the lessons from Christchurch is that when we join together and present the irrefutable moral right, it can force industry to act," he said, pointing to the Christchurch Call.
"But it mustn't take another terrorist atrocity, or corrupted election or referendum, or Myanmar genocide to get and hold their attention."
Edwards also pointed to the case of a murdered young woman, where the trial of her accused killer was put in peril as a result of Google's algorithms doing the very thing they were designed to do.
The man accused of her murder had been granted a suppression order preventing publication of his name. International media, however, were not bound by this order, and after 100,000 or so individuals "Googled" his name, the search giant's trending topics algorithm was triggered.
While Edwards said the individuals searching and the media reporting bear some of the blame, New Zealand's Minister of Justice singled out Google for criticism over its failure to ensure the orders of the courts were respected.
Google in July apologised and suspended its Trends Alert subscriber email system.
"It appears that Google's highly sophisticated, automated service simply acted as it was designed to," Edwards said.
"It seems to have been incapable of responding, as the humans in charge of New Zealand's media outlets did, to the combination of factors at play: the making of an order, the breaching of that order by irresponsible international media, and the many name searches made by curious individuals."
He said that unique combinations of events pose a real challenge to a global company.
"Its systems are designed to deploy at a scale that regards the entire population of New Zealand as trivial. So, it is possible to have some sympathy for its plight," Edwards said.
"But not much.
"The problem of scale is a problem of that platform's own making."