Wednesday, September 2, 2020

I speak with a computerized voice. Republicans used it to put words in my mouth

This piece has been updated.

Ady Barkan is a lawyer and co-founder of the Be A Hero PAC.

I speak with a computerized voice — think Stephen Hawking. It’s a result of ALS, the neurological disease I’ve had since 2016. And of all the painful parts of this entire ordeal, which has now almost completely paralyzed me, one of the worst is the way the disease has robbed me of my natural voice.

Last week, House Minority Whip Steve Scalise tried to use my computer-assisted voice to rob me of my agency, too. In a video aimed at Democratic presidential nominee Joe Biden, Scalise, a Republican from Louisiana, shared his team’s manipulated footage of an interview I conducted with Biden to make it appear I had said words that I never uttered, in an effort to distort Biden’s views and harm his electoral prospects.

Scalise eventually scrubbed the video from his Twitter feed after being criticized for the manipulation, but the ominous lessons of the episode remain: the ability to use technology not only for good but to mislead and manipulate; the willingness of those with political agendas to resort to such disinformation and propaganda; and the way in which America has cleaved into two separate information universes, with a conservative media ecosystem amplifying falsehoods that then take root.

The entirety of the Scalise video painted a bleak picture of the country, with cleverly spliced scenes designed to make major cities look like places of anarchy and violent chaos. That’s already disingenuous; protesters demanding an end to centuries of racial violence have largely been peaceful. But what made it so remarkable wasn’t just that Scalise twisted the truth about Black Lives Matter. His video went a step further, altering a question I had asked Biden about law enforcement to make it sound as though Biden had agreed to defund the police. I’m in favor of defunding the police, so I wish that were the case. But Biden has been clear that isn’t his position.

Now, I am of course grateful I can still speak, even if very slowly, using eye-gaze technology: A camera tracks the movements of my eyes on a screen-based keyboard, and then the resulting text is converted into speech by a synthetic voice generator. But because of my Hawking-esque voice, it’s particularly easy for others to manipulate what I say. Scalise’s team went the extra mile, apparently finding the exact voice generator I use when they whipped up the extra words meant to damn Biden. (Scalise’s team denies this.)

Scalise has since conceded the video “shouldn’t have been edited” in an interview on Fox News — even as he attempted to claim there was an underlying truthfulness to the message. That isn’t the same as an apology to me or, more important, to the more than 2 million people in this country who, like me, communicate using assistive technology.

It’s especially insulting to witness actors with the worst intentions hijack the technology that allows me to speak and use it to try to speak for me, but this duplicity also exposes the broader information crisis in our society. When President Trump claimed, as he did in the run-up to the 2018 election, that a “migrant caravan” threatened the safety of the United States, he was bolstered by a vast conservative media that runs coverage amplifying his claims from morning to midnight. The inauguration crowd size, the repeated lies about voter fraud, the claims about wiretapping: all of it is part of an attempt to shear one half of America away from the other by creating an alternate reality for Trump’s supporters.

That reality isn’t based on facts, but on polarized partisanship. Trump, like many other leaders around the world with authoritarian aspirations, understands that shaping reality is the most powerful tool at an autocrat’s disposal. His goal is a society in which it doesn’t matter whether what you say is true as long as your side loves it.

In that context, “deepfakes” such as the one Scalise posted aren’t missteps. They’re disinformation test balloons that should put every single one of us on alert. If they can without consequence make it seem as though I said something I didn’t, what else can they do? What else will they do? What fearmongering words can they put in Biden’s mouth in a video doctored to tip the election?

I’m not sure I know how to solve this problem. The collective outrage that got the video removed from Twitter is a good place to start; that pressure must not let up. Another step might be examining the polarizing effects of Facebook: Scalise took the video off his page, but elsewhere on the site it remains, gathering views.

That’s just the beginning, though. We need far more aggressive action across the board to identify and stop the spread of false information, because more is coming. But I can’t do that on my own. Every letter I’m typing here is difficult, each sentence its own hurdle, and my words aren’t enough. What we desperately need is others ready to speak their own — not speak false ones for me.

