 
White Paper (DRAFT) - U.S. Sen. Mark R. Warner
Potential Policy Proposals for Regulation of Social Media and Technology Firms
Social media and wider digital communications technologies have changed our world in innumerable ways. They have transformed the way we do everything from shopping for groceries to growing our small businesses, and have radically lowered the cost of, and barriers to, global communication. The American companies behind these products and services – Facebook, Google, Twitter, Amazon, and Apple, among others – have been some of the most successful and innovative in the world. As such, each of them deserves enormous recognition for the technological transformation they have engendered around the world. As their collective influence has grown, however, these tech giants now also deserve increased scrutiny.
In the course of investigating Russia's unprecedented interference in the 2016 election, the extent to which many of these technologies have been exploited – and their providers caught repeatedly flat-footed – has been unmistakable. More than illuminating the capacity of these technologies to be exploited by bad actors, the revelations of the last year have revealed the dark underbelly of an entire ecosystem. The speed with which these products have grown and come to dominate nearly every aspect of our social, political, and economic lives has in many ways obscured the shortcomings of their creators in anticipating the harmful effects of their use. Government has failed to adapt and has been unable or unwilling to adequately address the impacts of these trends on privacy, competition, and public discourse. Armed with this knowledge, it is time to begin to address these issues and work to adapt our regulations and laws. There are three areas that should be of particular focus for policymakers.
First, understanding the capacity for communications technologies to promote disinformation that undermines trust in our institutions, democracy, free press, and markets. In many ways, this threat is not new. For instance, Russians have been conducting information warfare for decades. During the Cold War, the Soviets tried to spread "fake news" denigrating Martin Luther King Jr. and alleging that the American military had manufactured the AIDS virus.[1]
Much like today, their aim was to undermine Americans' faith in democratic government. But what is new is the advent of social media tools with the power to magnify – and target – propaganda and fake news on a scale that was unimaginable back in the days of the Berlin Wall. As one witness noted during the March 2017 hearing on Russian disinformation efforts before the Senate Select Committee on Intelligence, today's tools seem almost purpose-built for Russian disinformation techniques.[2] Just as we're trying to sort through the disinformation playbook used in the 2016 election and as we prepare for additional attacks in 2018, a new set of tools is being developed that is poised to exacerbate these problems. Aided in large part by advances in machine learning, tools like DeepFake allow a user to superimpose existing images and videos onto unrelated images or videos. In addition, we are seeing an increasing amount of evidence that bad actors are beginning to shift disinformation campaigns to encrypted messaging applications rather than using the relatively more open social media platforms. Closed applications like WhatsApp, Telegram, Viber, and others present new challenges for identifying, rapidly responding to, and fact-checking misinformation and disinformation targeted to specific users.[3]
 
But it's also important to recognize that manipulation and exploitation of the tools and scale these platforms provide go beyond just foreign disinformation efforts. In the same way that bots, trolls, click-farms, fake pages and groups, ads, and algorithm-gaming can be used to propagate political disinformation, these same tools can be – and have been – used to assist financial frauds such as stock-pumping schemes, click fraud in digital advertising markets, schemes to sell counterfeit prescription drugs, and efforts to convince large numbers of users to download malicious apps on their phones.[4]
Addressing these diseconomies of scale – negative externalities borne by users and society as a result of the size of these platforms – represents a priority for technology policy in the 21st century.

[1] U.S. Department of State, Soviet Influence Activities: A Report on Active Measures and Propaganda, 1986–1987.
[2] U.S. Congress, Senate, Select Committee on Intelligence, Open Hearing: Disinformation: A Primer in Russian Active Measures and Influence Campaigns, 115th Cong., 1st sess., 2017.
[3] See Elizabeth Dwoskin and Annie Gowen, "On WhatsApp, fake news is fast – and can be fatal," Washington Post; "The Era of WhatsApp Propaganda Is Upon Us," Foreign Policy.
[4] See, e.g., Robert Gorwa, "Computational Propaganda in Poland: False Amplifiers and the Digital Public Sphere," Working Paper No. 2017.4, Oxford Internet Institute, University of Oxford.
A second dimension relates to consumer protection in the digital age. As online platforms have gained greater prominence in our lives, they have developed more advanced capabilities to track and model consumer behavior – typically across the multiple devices a consumer owns. This includes detailed information on viewing, window-shopping, and purchasing habits, but also more sensitive information. The prevailing business model involves offering nominally free services, with consumers providing ever more data in exchange for continued usage. User tracking can have important consumer benefits, for instance by showing users more relevant ads and helping to optimize the user experience across different apps. At the same time, these user profiles could provide opportunities for consumer harm – and in surreptitious, undetectable ways. Pervasive tracking may give platforms important behavioral information on a consumer's willingness to pay, or on behavioral tendencies that can be exploited to drive engagement with an app or service. These technologies might even be used to influence how we engage with our own democracy here at home, as we saw in recent months with the Cambridge Analytica scandal, where sensitive Facebook data from up to 87 million people may have been used to inappropriately target U.S. voters. The allure of pervasive tracking also creates incentives to predicate services and credit on user behavior. Users have no reason to expect that certain browsing behavior could determine the interest they pay on an auto loan, much less that what their friends post could be used to determine that. Further, numerous studies indicate users have no idea their information is being
[4, cont.] "Scheme created fake news stories to manipulate stock prices, SEC alleges," Los Angeles Times; Lauren Moss, "Xanax drug sold on social media found to be fake," BBC News, March 26, 2018; Danny Palmer, "Android malware found inside apps downloaded 500,000 times," ZDNet.
