Balkin Reports on Regulation of Fake News

By Griselda Jarquin – Today, fake news is big news. At the 2017 Law in the Information Age Lecture on March 15, Jack M. Balkin of Yale Law School tackled the issue head-on. His talk was entitled “Soylent Green is the Right to Forget Your Robot is Spouting Fake News: Free Speech Theory in the 21st Century.”

Balkin, Knight Professor of Constitutional Law and the First Amendment at Yale Law School, began his talk by discussing big data, which is typically processed by artificial intelligence programs or algorithms. Big data, he argued, has become a major source of wealth.

The purpose of accumulating big data, Balkin argued, is to achieve omniscience; what has yet to be determined is who will hold or control that omniscience. Alluding to the 1973 science fiction film Soylent Green, whose famous revelation is that “Soylent Green is people,” Balkin explained that big data is likewise really about the people from whom it is gathered. “If big data is people then it represents a relationship between people that also leads to the distribution of power among people,” Balkin said.

Regulating artificial intelligence

This raises the question: how is big data regulated? Any regulation must contend with the First Amendment, specifically its protection of free speech. Balkin explained that government regulation of artificial intelligence can be squared with free speech doctrine through two key concepts: the information fiduciary and algorithmic nuisance.

“Information fiduciary” refers to companies that rely on big data to collect information about individuals. Because this collection is not necessarily transparent, it can be regulated through fiduciary obligations that require such companies to maintain a good faith relationship with the people whose data they hold. However, as Balkin noted, individuals do not always have such a good faith relationship with the companies holding their data.

For instance, when a person approaches a bank for a mortgage loan, the bank relies on information collected through big data because it would be too costly to research each applicant individually. Algorithms then ultimately decide whether the applicant should be approved. Externalizing decision-making to algorithms in this way, however, fails to account for unforeseen consequences and other problems that may arise, such as discriminatory effects. “Algorithmic nuisance” describes the costs that such algorithmic decision-making imposes on people.

New school

So far, consumer protections have been conceived to protect consumer privacy vis-à-vis personal robots. A digital assistant such as Apple’s Siri or Amazon’s Echo has a fiduciary relationship with its owner similar to the relationship a butler has with an employer: through its constant presence and proximity, it comes to possess intimate personal information about a specific individual.

“What are the appropriate channels or regulations that a government can employ for these new entities?” Balkin asked. To answer this question, he turned to the topic of fake news. Historically, “old-school speech regulation” monitored fake news by targeting the publishers or the speakers themselves. In today’s digital age, “new-school speech regulation” instead targets the digital infrastructure that hosts the information, such as the cloud, internet providers, search engines, credit card companies, and social media platforms, since speakers tend to be anonymous.

From peaceful coexistence to threats

There are three separate means of addressing this issue. “Collateral censorship” places pressure on information providers to censor their own speakers, even as those providers enjoy intermediary immunity. “Public-private cooperation” (or “cooptation”) describes instances where the government coaxes the private sector into helping in some capacity; this ranges from peaceful coexistence to threats. For instance, companies on WikiLeaks’ periphery were coaxed by the United States government to adjust their behavior: after the U.S. government declared it outrageous that Visa allowed its users to donate to WikiLeaks, Visa discontinued the practice.

Lastly, “private governance of speech communities” allows information channels such as Facebook or YouTube, rather than the state, to regulate the flow of information. Private entities like Facebook monetize the data they collect, but they also monitor the flow of information by implementing systems of private governance that regulate their users’ behavior. There are, however, two problems with private governance. One is substantive: a nation may put such pressure on private entities that private governance becomes full-blown censorship. The other is procedural: users may seek to censor one another or to turn private regulation against fellow users.

Right to forget

As an example, Balkin addressed the “right to be forgotten” within the European Union. The right to be forgotten offers a kind of practical obscurity for certain news stories, and it allows individuals to target search engines, which are database operators, under new-school speech regulation.

As part of this collateral censorship, states have deputized private entities, such as Google, to create bureaucracies that regulate the flow of data. Any remedy for the problem of fake news, Balkin argued, will necessarily employ new-school speech regulation. He concluded by predicting that these new media corporations will come to recognize a social obligation to solve the problem of fake news.

