'It knows what colors your brain lights up to if I give you a red button or a green button or a yellow button. It knows which words activate your psychology. It knows an unprecedented amount of information about what will manipulate you.'
Authored by Asher Schechter: The opening panel of the Stigler Center’s annual antitrust conference discussed the source of digital platforms’ power and what, if anything, can be done to address the numerous challenges presented by their ability to shape opinions and outcomes.

Google CEO Sundar Pichai caused a worldwide sensation earlier this week when he unveiled Duplex, an AI-driven digital assistant able to mimic human speech patterns (complete with vocal tics) so convincingly that it managed to hold real conversations with ordinary people without them realizing they were actually talking to a robot.
While Google presented Duplex as an exciting technological breakthrough, others saw something else: a system able to deceive people into believing they were talking to a human being, an ethical red flag (and a surefire route to robocall hell).
Following the backlash, Google announced on Thursday that the new service will be designed “with disclosure built-in.” Nevertheless, the episode created the impression that ethical concerns were an “after-the-fact consideration” for Google, despite the fierce public scrutiny it and other tech giants faced over the past two months. “Silicon Valley is ethically lost, rudderless and has not learned a thing,” tweeted Zeynep Tufekci, a professor at the University of North Carolina at Chapel Hill and a prominent critic of tech firms.
The controversial demonstration was not the only sign that the global outrage has yet to inspire the profound rethinking critics hoped it would bring to Silicon Valley firms. In Pichai’s speech at Google’s annual I/O developer conference, the ethical concerns regarding the company’s data mining, business model, and political influence were briefly addressed with a general, laconic statement: “The path ahead needs to be navigated carefully and deliberately and we feel a deep sense of responsibility to get this right.”
A joke regarding the flawed design of Google’s beer and burger emojis received roughly the same amount of time.
Google’s fellow FAANGs also seem eager to put the “techlash” of the past two years behind them. Facebook, its shares now fully recovered from the Cambridge Analytica scandal, is already charging full steam ahead into new areas like dating and blockchain.
Many of the mechanisms that fueled the platforms’ growth are opaque and rooted in manipulation. What are those mechanisms, and how should policymakers and antitrust enforcers address them? These questions, and others, were the focus of the Stigler Center panel, which was moderated by the Economist’s New York bureau chief, Patrick Foulis.
The Race to the Bottom of the Brainstem
“The way to win in Silicon Valley now is by figuring out how to capture human attention. How do you manipulate people’s deepest psychological instincts, so you can get them to come back?” said Tristan Harris, a former design ethicist at Google who has since become one of Silicon Valley’s most influential critics. Harris, who co-founded the Center for Humane Technology, an organization seeking to change the culture of the tech industry, described the tech industry as an “arms race for basically who’s good at getting attention and who’s better in the race to the bottom of the brainstem to hijack the human animal.”

The proliferation of AI, Harris said, creates an asymmetric relationship between platforms and users. “When someone uses a screen, they don’t really realize they’re walking into an environment where there’s 1,000 engineers on the other side of the screen who asymmetrically know way more about their mind [and] their psychology, have 10 years [of data] about what’s ever gotten them to click, and use AI prediction engines to play chess against that person’s mind. The reason you land on YouTube and wake up two hours later asking ‘What the hell just happened?’ is that Alphabet and Google are basically deploying the best supercomputers in the world—not at climate change, not at solving cancer, but at basically hijacking human animals and getting them to stay on screens.”
This asymmetric relationship, in which one party is able to massively exploit the other, is best exemplified by Facebook, which is akin to a “psychotherapist who knows every single detail in your life, including the details of your inner life, in the sense that it doesn’t just know who you click on at two in the morning and what you post and your TINs and your photos and your family and who you talk to the most and who your friends are. It also intermediates every single one of your communications. It knows what colors your brain lights up to if I give you a red button or a green button or a yellow button. It knows which words activate your psychology. It knows an unprecedented amount of information about what will manipulate you. If there’s ever been a precedent or a need for defining something as being an asymmetric or fiduciary relationship, it’s this one.”
Facebook’s ad-based business model, Harris argued, is “obviously misaligned” with its asymmetric power. “Would you want to be paying that psychotherapist or would you want that psychotherapist to instantly take all that personal information about you, the most intimate details of your life, and then sell it to car salesmen?”
It’s not that Silicon Valley lacks goodwill, he said. In 2013 Harris, then a product manager at Google, prepared a presentation arguing that Google, while having the power to shape elections and societies, often exploits users’ psychological vulnerabilities instead of acting in their best interest. The presentation went viral and got Harris promoted to the role of “design ethicist.”
Ultimately, though, the company quickly reverted to business as usual. The problem, said Harris, was the incentive to maximize users’ time and attention. “If you’re at YouTube, you’re incentivized to get people to spend time on videos, even if those videos are conspiracy theories. The product manager—25 years old, going to stay at YouTube for two years, went to a good school—their job is just to show on their resume that they made the engagement numbers on videos go up. Then you wake up two years later and YouTube has driven 15 billion views to Alex Jones’ videos. That’s not videos people sought out themselves. That’s actually YouTube driving the recommendation.”
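Harris’s point about incentives is mechanical rather than conspiratorial: a recommender rewarded only for engagement will converge on whatever holds attention longest, with no representation of what the videos actually say. The sketch below illustrates that dynamic with a simple epsilon-greedy bandit; the video categories and watch-time numbers are invented for illustration and are not drawn from YouTube.

```python
import random

# Hypothetical video categories and the average minutes a viewer keeps
# watching after each recommendation. All numbers are invented for
# illustration; the optimizer below never sees content, only watch time.
MEAN_WATCH_MINUTES = {
    "news": 3.0,
    "how-to": 4.0,
    "conspiracy": 9.0,  # assumption: outrage and novelty hold attention longer
}

def simulated_watch_time(category: str) -> float:
    """Noisy watch time for a single recommendation (toy model)."""
    return max(0.0, random.gauss(MEAN_WATCH_MINUTES[category], 1.5))

def run_recommender(rounds: int = 10_000, epsilon: float = 0.1) -> dict:
    """Epsilon-greedy bandit that maximizes watch time and nothing else."""
    totals = {c: 0.0 for c in MEAN_WATCH_MINUTES}
    counts = {c: 0 for c in MEAN_WATCH_MINUTES}
    for _ in range(rounds):
        if random.random() < epsilon:
            choice = random.choice(list(MEAN_WATCH_MINUTES))  # explore
        else:  # exploit the category with the best observed average
            choice = max(counts, key=lambda c: totals[c] / counts[c] if counts[c] else 0.0)
        counts[choice] += 1
        totals[choice] += simulated_watch_time(choice)
    return counts

if __name__ == "__main__":
    for category, n in run_recommender().items():
        print(f"{category:10s} recommended {n:5d} times")
```

Run it and the bandit pours nearly all of its recommendations into whichever category holds attention longest; nothing in the loop knows or cares whether that content is true.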
“The Search Engine Is the Most Powerful Source of Mind Control Ever Invented”
Robert Epstein, a senior research psychologist at the American Institute for Behavioral Research and Technology in California and the former editor of Psychology Today, is one of only a few scholars who have conducted empirical studies on the ability of digital platforms to manipulate opinions and outcomes. In a 2015 study, Epstein and Ronald E. Robertson reported the discovery of what they consider “one of the largest behavioral effects ever identified”: the search engine manipulation effect (SEME). Simply by placing search results in a particular order, they found, voters’ preferences can shift dramatically, “up to 80 percent in some demographic groups.”

“What I stumbled upon in 2012 or early 2013 quite by accident was a particular mechanism that shows how you can shift opinions and votes once you’ve got people hooked to the screen,” said Epstein.
While much of the political and public scrutiny of digital platforms has been focused on the behavior of bad actors like Cambridge Analytica, Epstein called these scandals a distraction, saying, “Don’t worry about Cambridge Analytica. That’s just a content provider.” Instead, he said, the power of digital platforms to manipulate users lies in the filtering and ordering of information: “It’s no longer the content that matters. It’s just the filtering and ordering.” Those functions, he noted, are largely dominated by two companies: Google and, to a lesser extent, Facebook.
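The leverage in “filtering and ordering” comes from position bias: clicks fall off steeply with rank, so reordering the same results redistributes attention without changing a word of content. A minimal sketch of that arithmetic, with per-position click-through rates that are assumed for illustration (real curves vary by study) and hypothetical result labels:

```python
# Toy illustration of the search engine manipulation effect (SEME):
# identical results, different ordering, very different exposure.
# The per-position click-through rates are assumptions chosen to mimic
# the steep rank drop-off reported in eye-tracking and CTR studies.
CTR_BY_POSITION = [0.30, 0.15, 0.10, 0.07, 0.05, 0.04, 0.03, 0.02, 0.02, 0.02]

def exposure_share(ordering):
    """Fraction of expected clicks landing on pro-candidate results."""
    pro_clicks = sum(ctr for ctr, label in zip(CTR_BY_POSITION, ordering)
                     if label == "pro")
    return pro_clicks / sum(CTR_BY_POSITION)

# The same ten results, two orderings: favorable pages first vs. buried.
favorable_first = ["pro"] * 5 + ["anti"] * 5
favorable_buried = ["anti"] * 5 + ["pro"] * 5

print(f"pro-candidate exposure, favorable first:  {exposure_share(favorable_first):.0%}")
print(f"pro-candidate exposure, favorable buried: {exposure_share(favorable_buried):.0%}")
```

Under these assumed rates, moving the favorable pages from the top half to the bottom half cuts their share of expected clicks from roughly 84 percent to 16 percent, with no change to the content itself.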
SEME, said Epstein, is but one of five psychological effects of using search engines he and his colleagues are studying, all of which are completely invisible to users. “These are some of the largest effects ever discovered in the behavioral sciences,” he claimed, “but since they use ephemeral stimuli, they leave no trace. In other words, they leave no trace for authorities to track.”
Another effect Epstein discussed is the search suggestion effect (SSE). Google, Epstein’s most recent paper argues, has the power to manipulate opinions from the very first character that people type into the search bar. Google, he claimed, is also “exercising that power.”
“We have determined, through our research, that the search suggestion effect can turn a 50/50 split among undecided [voters] into a 90/10 split just by manipulating search suggestions.”
One simple way to do this, he said, is to suppress negative suggestions. In 2016, Epstein and his coauthors noticed a peculiar pattern when typing the words “Hillary Clinton is” into Google, Yahoo, and Bing. In the latter two, the autocomplete suggested searches like “Hillary Clinton is evil,” “Hillary Clinton is a liar,” and “Hillary Clinton is dying of cancer.” Google, however, suggested far more flattering phrases, such as “Hillary Clinton is winning.” Google has argued that the differences can be explained by its policy of removing offensive and hateful suggestions, but Epstein argues that this is but one example of the massive opinion-shifting capabilities of digital platforms. Google, he argues, has likely been determining the outcomes of a quarter of the world’s elections in recent years through these tools.
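Mechanically, suppressing negative suggestions requires nothing more exotic than a filter over the candidate list before it is shown. A minimal sketch, assuming made-up suggestion strings and an invented policy list (real engines rank suggestions from query logs and apply policies outsiders cannot observe):

```python
# Toy model of autocomplete suppression. The suggestions and the policy
# list are invented for illustration.
RAW_SUGGESTIONS = [
    "candidate x is winning",
    "candidate x is a liar",
    "candidate x is evil",
    "candidate x is speaking today",
]

NEGATIVE_TERMS = {"liar", "evil", "crooked"}  # assumed policy list

def autocomplete(suggestions, suppress_negative):
    """Return what the user sees before finishing the query."""
    if not suppress_negative:
        return suggestions
    return [s for s in suggestions if not NEGATIVE_TERMS & set(s.split())]

print(autocomplete(RAW_SUGGESTIONS, suppress_negative=False))
print(autocomplete(RAW_SUGGESTIONS, suppress_negative=True))
```

Whether such a filter is a reasonable anti-harassment policy or an opinion-shifting lever is exactly the dispute between Google and Epstein; the code is the same either way.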
“The search engine is the most powerful source of mind control ever invented in the history of humanity,” he said. “The fact that it’s mainly controlled by one company in almost every country in the world, except Russia and China, just astonishes me.”
Epstein declined to speculate whether these biases are the result of deliberate manipulation on the part of platform companies. “They could just be from neglect,” he said. However, he noted, “if you buy into this notion, which Google sells through its PR people, that a lot of these funny things that happen are organic, [that] it’s all driven by users, that’s complete and utter nonsense. I’ve been a programmer since I was 13 and I can tell you, you could build an algorithm that sends people to Alex Jones’s videos or away from Alex Jones’s videos. You can easily alter whatever your algorithm is doing to send people anywhere you want to send them. The bottom line is, there’s nothing really organic. Google has complete control over what they put in front of people’s eyes.”
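Epstein’s claim that “you can easily alter whatever your algorithm is doing” is easy to make concrete: in any scored ranker, steering toward or away from a source is a single additive term. A minimal sketch, with invented items, relevance scores, and bias weights:

```python
# Minimal sketch of algorithmic steering in a scored ranker. The items,
# relevance scores, and bias weights are all invented; the point is that
# the steering knob is one additive term, invisible in the output.
ITEMS = [
    ("mainstream explainer", "outlet_a", 0.82),
    ("alex jones video",     "outlet_b", 0.80),
    ("academic overview",    "outlet_c", 0.78),
]

def rank(items, source_bias=None):
    """Sort by relevance score plus an optional per-source adjustment."""
    source_bias = source_bias or {}
    return sorted(items,
                  key=lambda item: item[2] + source_bias.get(item[1], 0.0),
                  reverse=True)

print([title for title, _, _ in rank(ITEMS)])                       # neutral
print([title for title, _, _ in rank(ITEMS, {"outlet_b": +0.05})])  # steer toward
print([title for title, _, _ in rank(ITEMS, {"outlet_b": -0.05})])  # steer away
```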
A “Nielsen-type” network of global monitoring, suggested Epstein, might provide a partial solution. Together with “prominent business people and academics on three continents,” he said, he has been working on developing such a system that would track the “ephemeral stimuli” used by digital platforms. By using such a system, he said, “we will make these companies accountable to the public. We will be able to report irregularities to authorities, to law enforcement, to regulators, antitrust investigators, as these various manipulations are occurring. We think long-term that is the solution to the problems we’re facing with these big tech companies.”
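Epstein offered no technical detail about the system, but the basic shape of a “Nielsen-type” monitor follows from his description: a panel of opted-in users logs the ephemeral stimuli they are shown before those stimuli vanish, and an aggregator looks for systematic divergence worth referring to regulators. A hypothetical sketch of the aggregation step, with invented records, field names, and threshold:

```python
from collections import defaultdict

# Hypothetical records logged by an opted-in monitoring panel; in a real
# system these would be captured client-side before the stimuli vanish.
LOGGED = [
    {"engine": "engine_a", "query": "candidate x is", "suggestion": "winning", "negative": False},
    {"engine": "engine_a", "query": "candidate x is", "suggestion": "speaking", "negative": False},
    {"engine": "engine_b", "query": "candidate x is", "suggestion": "a liar", "negative": True},
    {"engine": "engine_b", "query": "candidate x is", "suggestion": "evil", "negative": True},
]

DIVERGENCE_THRESHOLD = 0.5  # assumed cutoff for flagging human review

def negative_share_by_engine(records):
    """Share of logged suggestions flagged negative, per engine."""
    totals, negatives = defaultdict(int), defaultdict(int)
    for record in records:
        totals[record["engine"]] += 1
        negatives[record["engine"]] += record["negative"]
    return {engine: negatives[engine] / totals[engine] for engine in totals}

shares = negative_share_by_engine(LOGGED)
print(shares)
if max(shares.values()) - min(shares.values()) > DIVERGENCE_THRESHOLD:
    print("flag: engines diverge sharply on this query; refer for review")
```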
Is Antitrust the Solution?
In the past two years, a growing movement of scholars, policy wonks, and politicians has argued that many of the challenges associated with digital platforms are related to market concentration, and has favored increased antitrust enforcement, possibly even breaking platforms up, as a way to address their growing power. But is antitrust the best way to address things like addiction-enhancing business models? Kevin Murphy, a Chicago Booth economics professor, doesn’t think so.

“First off, most of what we’re talking about has nothing to do with concentration. These problems would exist absent concentration. Secondly, a focus on concentration as the bottom line of antitrust is misguided as well. The idea that we have some new world that doesn’t look like things we’ve seen before, I don’t know, I don’t see it,” said Murphy.
“Would this be an easier problem to solve if we had 100 firms out there all trying to influence people using these same methods? It might be a much more difficult regulatory process in that world. It might be difficult to measure, difficult to regulate. It’s not clear how this is related to the concentration issue per se,” he added.
Similar arguments about the ability of technology to manipulate elections, Murphy opined, were also made in the early days of television, and concerns over market power were heard during the early days of the Internet. “I remember the days where Yahoo was thought to have an insurmountable first-mover advantage in search, or [when] AOL had an insurmountable first-mover advantage in access to people’s eyeballs, or [Windows] Media Player was going to dominate digital music. The idea that we’re any good at predicting how these markets are going to move or any good at shaping how they’re going to move seems to me to be odd. It also seems like a poor use of antitrust.”
There is also the question of just what sort of impact digital platforms have on the economy as a whole. Despite their prominence in the political debate, noted Chad Syverson, also an economics professor at Chicago Booth, the rise of digital platforms has coincided with a decade of historically low productivity growth. Tech firms often like to portray themselves as bucking this trend, but evidence of this has so far been slim. It is possible, said Syverson, that “the brain space dedicated to these companies, right now at least, exceeds the economic space that they fill up. The entire information sector, which is all of telecom, all of broadcasting, publishing, online and off, and some other sectors, that’s less than five percent of GDP.”
For years, tech execs have fended off possible antitrust actions by claiming that their dominance is not a competition issue, invoking Alphabet CEO and Google co-founder Larry Page’s argument that “competition is only one click away.” The problem that faces antitrust enforcers, argued Ben Thompson, author of the technology newsletter Stratechery, is that it’s “kind of true.”
“You can go to Bing. You can go to DuckDuckGo, which doesn’t track your information. You can go to other e-commerce sites. You can go to other social networks,” said Thompson. “The issue is that customers don’t want to. It’s not that they can’t. It’s that no one wants to go anywhere else.” The services that platforms offer are vastly superior to what came before them, and network effects mean they can offer an overall better user experience than any fledgling competitor. According to Thompson, this is the paradox antitrust enforcers have to contend with: “The bigger you are, the better you are, at least from a consumer perspective.”
Concentration Really Does Matter
Responding to Murphy, Yale University economics professor Fiona Scott Morton argued that while there have been similar concerns over technology’s ability to influence and manipulate in the past, the difference is the precision with which digital platforms can target users at the individual level.

Concentration, she said, is a relevant issue because of the massive influence currently held by a small number of actors. “If there were 30 search engines and everybody was evenly distributed across those 30 search engines and each one had a bias, we would not think that any one of them was perhaps tipping an election. That’s the sense in which the concentration really does matter to the problems that we’re talking about.”
Responding to Syverson, Scott Morton said that “it is a little misleading” to say platforms are only a small part of GDP. The influence of their technologies on the rest of the economy, she noted, exceeds their actual share of GDP.
While there are potential costs associated with regulating digital platforms, these are not necessarily larger than the benefits that would come from regulating them, Scott Morton asserted. “Of course, we’re going to make a mistake, but we balance the mistakes of regulation. My photos aren’t shared quite as well as they might have been. The search term doesn’t come up quite as fast as it otherwise would because we’ve regulated the company away from innovating in that space. Then there’s the cost of not regulating, which is our democracy doesn’t work anymore, and we have to balance those two things. As a society, we’re having a national conversation about how that latter thing is a lot bigger than we thought it was before.”
Many of these challenges, Scott Morton noted, are not strictly related to competition. When it comes to antitrust, she said, “there’s a little bit of a shortage of really tight theories of harm,” which is why, she said, antitrust cases against digital platforms have not moved forward. “There’s also a question of political will to bring those cases,” she acknowledged, but “even with political will, you have to have a really good explanation of how competition is being harmed.”
At the conclusion of the panel, Foulis asked the panelists whether platform companies will be more or less powerful in 10 years. The panelists were divided. “In 10 years, I think the surveillance business model will have been made illegal,” said Epstein, whereas Scott Morton argued that platforms will ultimately become more powerful. “I’m afraid that I believe that people with profit are really good at hanging onto their profit,” she said.
Thompson also believes platforms will be more powerful. However, he said, “I do think people like Tristan are the biggest threat to these companies. The reason is because their power accrues not from controlling railways or telephone wires. Their power accrues from people continually making affirmative choices to use their platforms. That’s what gives them monopsony power. The way I think ultimately that power will be undone is through the political process.”
Related: http://stgeorgewest.blogspot.co.uk/2018/05/is-social-media-destroying-humanity-on.html