
Cybersecurity whiz: Combined threats endanger 2020 vote

FILE - In this photo taken Friday, June 8, 2012, Alex Stamos, CTO of Artemis Internet, an NCC Group Company, poses by a domain name poster at their offices in San Francisco. Stamos served as chief security officer at Facebook for three years before joining Stanford University, where he studies internet security, including systems related to conducting elections. (AP Photo/Eric Risberg, File) | Photo: AP

By FRANK BAJAK
October 13, 2019 10:21 AM

Alex Stamos served as chief security officer at Facebook for three years before joining Stanford University, where he studies internet security, including systems related to conducting elections.


He warns that there's little the federal government can do now for the 2020 elections. And he laments the arrival of "disinformation as a service," where companies are hired to help spread misinformation on social media.

Stamos spoke recently with The Associated Press. Remarks have been edited for clarity and brevity.

Q: Can you summarize the chilling scenario you recently penned about how the 2020 presidential elections might be thrown into turmoil?

A: Most people focus on specific threats: a technical disruption of voting, social media rumors blaming a conspiracy, extreme media amplifying divisive information. Now imagine combining all three. A not-especially-powerful technical issue could be amplified into a significant election disruption.

Q: There are no real federal election security standards, despite expert consensus on how to make elections less hackable (voter-verifiable paper ballots, for starters). Government regulation is mostly absent. Is it too late to do anything for 2020?

A: Election authorities are already readying the primaries. It is difficult to imagine what we could do on a federal level to fix these issues at this point.

Q: Why is your outfit at Stanford prioritizing the study of disinformation as a threat to democracy?

A: The Russian playbook is not difficult to implement - and not illegal under many circumstances. The disinformation market is thriving. Domestic actors create fake outlets much as Russian agents did in 2016. Internationally, companies offer disinformation as a service. Foreign money cannot fund domestic electioneering ads. But people whose Facebook and Twitter accounts are deleted for terms of service violations can simply create new accounts and try again.

Q: You worked at Facebook and have a good idea of how disinformation became a problem. Facebook can't seem to get it under control. Why not?

A: Your assumption that "They can't control all this information" is not something I necessarily agree with.

Q: So it can be controlled?

A: An open society like ours is always going to be vulnerable to disinformation. We have a free and open internet. You don't need an ID to open a Twitter or Facebook account. We don't arrest people for spreading disinformation. So, in the last four to five years, a lot of semi-reputable, highly partisan media outlets have emerged that amplify extreme views to try to widen societal divisions. Tech companies, meantime, don't want to be the arbiters of truth, and I think that's reasonable.

Q: The U.S. news media got criticized for how it handled Democratic Party emails stolen in 2016. Has it gotten any better at squelching the weaponization of such information?

A: No. It's a super competitive media environment. Journalists want to be first. There's been very little self-reflection. Tech companies have self-flagellated a lot on these issues. You've seen almost nothing from any flagship media organizations.

Q: Facebook says it won't be fact-checking politicians' speech. Was that a mistake?

A: No. I think it's the right thing. We have to temper our desire for the companies to solve some of these problems with our concern about the power they have. It is reasonable for them to do things like limit access to advertising, which does the most damage, in part because it's targeted. The platforms are able to downgrade inaccurate content. But to artificially downgrade non-paid political speech by candidates is, I think, to give too much power to social media companies that are already spectacularly powerful.

Copyright 2019 The Associated Press. All rights reserved.
