“…and that is the way it is”

Back in the 1970s, the evening network news broadcasts were watched on more than 70% of all television sets in use. Today, that number has plunged to less than half [source]. When there were only a limited number of news outlets, one could argue that they were undifferentiated: since they all chased the same, entire audience, they could not express an editorial perspective or slant that differed from the mainstream. Politics was avoided as much as possible. Walter Cronkite was the most trusted man in America, and he famously signed off his broadcast with the line “And that is the way it is”.

Today things are different; one can only wonder how surreal that tagline now sounds to our ears. We have shifted from a single source of truth (Cronkite) to each person being entitled to their own personal device, personal feed, personal echo chamber. Quoting Wendy Chun, we are collectively becoming addicted to the update, and we fetishize the crisis [source]. On top of that, we all suffer from selective perception [best described in “They Saw a Game”] and confirmation bias; sometimes the more educated we are, the more prone we are to find arguments that support our own world view [Dan Kahan].

The frictionless nature of modern media nurtures extreme views and fringe ideas, particularly in the online world. The ever-growing volume of content shifts the emphasis from the content creator (or curator) to the consumer. As a result, traditional news sources, along with local news, lose their audience to alternative and fringe outlets, both nationally and internationally.

Figure 1 - taken from http://www.people-press.org/2014/06/12/section-1-growing-ideological-consistency

Slowly but surely, particularly in the US, the sociopolitical fabric is being stretched towards the two extremes, dangerously thinning its center [Figure 1, source: Pew study].

How to move forward?

At this point, it is not possible to go back to the era of Cronkitean truth. How can we move towards a future that is healthy, inclusive and fair for all sides? What interventions can we imagine, and what tools can we employ?

As has already been stated in our group discussions, the obvious directions are the legal, the technical, and the educational. Each has its upsides and downsides. We imagine, and have been operating with, a fourth one in mind: the (re)construction of a shared media reality, a reality that is constantly being fragmented by our devices’ screens, notifications, and sounds, and by our constant chase of the update. What we care about is mapping the media landscape, particularly that of broadcast TV news, in a way that spans all opinions, sides and contexts.

Some Viral interventions that operate within that space are described below.

Wall of Now (Tomer Weller, 2015)

Wall of Now is a multi-dimensional media browser of recent news items, which offers an exploration across the following dimensions: detected emotions, people, countries, states, companies and organizations. Each column contains the top-trending stories of that type in the last 24 hours. Pressing on an entity will reveal a stream of video that relates to that specific entity.
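The column logic described above can be sketched as a windowed frequency count per entity type. The entity names, timestamps and top-N cutoff below are illustrative assumptions, not data or code from the actual system:

```python
from collections import Counter
from datetime import datetime, timedelta

def top_trending(detections, now, hours=24, top_n=3):
    """Group entity detections by type (person, country, company, ...)
    and rank the most frequent ones within the trailing window --
    one ranked column per dimension, as in Wall of Now's layout."""
    window_start = now - timedelta(hours=hours)
    columns = {}
    for ts, etype, entity in detections:
        if ts >= window_start:
            columns.setdefault(etype, Counter())[entity] += 1
    return {etype: [e for e, _ in counts.most_common(top_n)]
            for etype, counts in columns.items()}

# Hypothetical detections extracted from a video stream.
now = datetime(2017, 12, 1, 12, 0)
detections = [
    (now - timedelta(hours=2), "person", "Walter Cronkite"),
    (now - timedelta(hours=3), "person", "Walter Cronkite"),
    (now - timedelta(hours=5), "person", "Edward Murrow"),
    (now - timedelta(hours=30), "person", "Stale Name"),  # outside 24h window
    (now - timedelta(hours=1), "country", "Syria"),
]
print(top_trending(detections, now))
```

Pressing an entity in the real interface then filters the video stream down to segments in which that entity was detected.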

Figure 2 - Screengrab from http://wallofnow.um-dokku.media.mit.edu/


Perspectives (Jasmin Rubinovitz, 2016)

Perspectives is a web application that aims to give a viewer different points of view on a specific story. By using clustering algorithms on the video story segments, it meaningfully groups news stories and generates a comparative news analysis view. By tracking and clustering published stories it helps identify which stories are reported and which news outlets have introduced new frames into the public debate.
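The grouping step could, in a highly simplified pure-Python form, look like the sketch below. The real system's clustering algorithm is not specified here, so word-overlap (Jaccard) similarity and greedy one-pass grouping are stand-ins:

```python
def jaccard(a, b):
    """Word-overlap similarity between two transcript snippets."""
    A, B = set(a.lower().split()), set(b.lower().split())
    return len(A & B) / len(A | B) if A | B else 0.0

def cluster_segments(segments, threshold=0.3):
    """Greedy one-pass clustering: a segment joins the first cluster whose
    seed it sufficiently overlaps with, otherwise it seeds a new cluster.
    Each resulting cluster is one story, told by several outlets."""
    clusters = []  # list of (seed_text, members)
    for seg in segments:
        for seed, members in clusters:
            if jaccard(seed, seg) >= threshold:
                members.append(seg)
                break
        else:
            clusters.append((seg, [seg]))
    return [members for _, members in clusters]

# Toy transcripts: two cover the same story, one is unrelated.
segments = [
    "senate votes on the health care bill today",
    "the health care bill goes to a senate vote",
    "wildfires spread across northern california",
]
print(cluster_segments(segments))
```

Once segments are clustered by story, comparing members of the same cluster across outlets is what surfaces the differing frames.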

Figure 3 - Screengrab from http://perspectives.um-dokku.media.mit.edu/


Panorama (Jasmin Rubinovitz, 2017)

Panorama is a web application that addresses online news content overload by allowing users to customize their news feed. It also uses machine learning to place news items in a four-dimensional space: political leaning, sentiment, subjectivity and trend detection. The user is encouraged to submit feedback that is sent back to the underlying training models, helping them improve.

Figure 4 - Screengrab from http://panorama.um-dokku.media.mit.edu/


Slantometer (Retzepi, Lingutla, 2017)

Slantometer is an effort to describe how reporting coverage of the same story evolves over time, and as the story gains or loses traction in the rest of the media world.

We are interested in learning whether the multiplicity of opinions in our polarized media landscape has resulted in major, or mainstream, news outlets now admitting a “slant” into their coverage. We are also interested in how and when an editorial slant is introduced into an ongoing story. For example, when an event such as an earthquake or a missile launch occurs, only raw facts are available in the early phases of reporting. As the story develops, there is time to add context and perspective to those facts. This is exacerbated by the news cycle itself, which requires ongoing coverage of an ongoing story even if there is little new to report [Lippmann, 1922 and Boorstin, 1962].

We use the term “slant” to describe the latent space inside which journalists bring their audience’s attention to issues, often by delineating certain points of view. In order to map this space, we gathered and annotated a body of both online articles and broadcast TV news transcripts, rating their “slanted-ness” on a sentence-by-sentence level.

We used green to characterize information that is essential to the understanding of the developing story and is also factual (rather than an opinion). Yellow denotes information that is either non-essential or non-factual, but not both. Red describes information that is both non-essential and non-factual.
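The three-color scheme reduces to a two-bit decision, which can be restated literally (this is our own restating of the rule, not project code):

```python
def slant_color(essential: bool, factual: bool) -> str:
    """Green: essential and factual. Red: neither. Yellow: exactly one."""
    if essential and factual:
        return "green"
    if not essential and not factual:
        return "red"
    return "yellow"

# All four combinations of the two annotation axes:
print(slant_color(True, True), slant_color(True, False),
      slant_color(False, True), slant_color(False, False))
# green yellow yellow red
```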

We employed two different annotation methods: manual annotation of online news articles using a Chrome extension that we developed, and crowdsourced annotation of a series of TV transcripts using Amazon Mechanical Turk. In all cases, we examined events that were not directly political or politicized within America; events that should (at least at a baseline level) show some degree of factual reporting versus biased, op-ed-style commentary.

Figure 5 describes the results of the online news analysis. We focused on two similar events (mass shootings), one before and one after the 2016 US Presidential Election. We view the Election as an inflection point in public media discourse, and particularly in the political language employed across the media landscape (print or broadcast).

Figure 5 - Static visualization for online news (October). Top row: an example of the visualization where the height of the triangles denotes the amount of "slanted" sentences we identified. The angle towards which it leans is the political affiliation of the source (left vs right). Middle row: Las Vegas shooting as it was covered by five major online news sources. Bottom row: Sandy Hook shooting, as it was covered by the same news sources. For both rows, from left to right: New York Times - CNN - ABC - Washington Times - FOX News.


For the TV broadcast transcripts, we focused on the evening news (as opposed to a talk show) of three major news channels (ABC, FOX and NBC), for four events that took place before and after the 2016 Presidential Election. Some examples are the Boston bombings in 2013, the Syria chemical attacks, and the Las Vegas shooting in 2017.

Setting up the mTurk HITs (Human Intelligence Tasks) involved presenting each TV transcript (a total of 124 files) sentence by sentence to five mTurk workers per transcript (a total of 650 people). Each worker provided two ratings per sentence, corresponding to the two axes of Figure 6.
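With five workers rating each sentence, some rule is needed to collapse their answers into one label per sentence. The source does not specify the aggregation rule, so majority vote with a cautious tie-break is an illustrative choice:

```python
from collections import Counter

def majority_label(ratings):
    """Collapse the per-worker labels for one sentence into a single
    label; ties fall back to 'yellow', the middle of the scale.
    (Illustrative aggregation, not necessarily the one we used.)"""
    counts = Counter(ratings)
    top_count = max(counts.values())
    winners = [label for label, c in counts.items() if c == top_count]
    return winners[0] if len(winners) == 1 else "yellow"

print(majority_label(["green", "green", "yellow", "green", "red"]))
# green
```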

Figure 6
Figure 6

Figure 7 shows some of the results for the Boston bombing and the Las Vegas shooting events. We did not receive many “red” ratings, so we have omitted that category from the presentation below. We notice more green reporting for the 2017 event across all channels, which goes against our initial hypothesis that the post-2016-election media landscape would be collectively more “slanted”. There also seemed to be no clear correlation between the political leaning of a news outlet and the amount of slant it presented.


Figure 7 - Dynamic visualization of the mTurk rating of TV news transcripts (December). From left to right for both rows: ABC, FOX, NBC.

Some attention needs to be directed to the fact that the mTurk sample is not representative of the average American broadcast news consumer (although we did try to aim for equal representation in terms of gender, age and political affiliation). We also noticed that some ratings were probably performed by automated bots rather than humans (the same value was filled across all input boxes).
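The constant-response pattern is easy to screen for after the fact; a minimal filter (our own sketch, not part of the original pipeline) flags any worker whose values never vary:

```python
def looks_automated(responses):
    """Flag a worker whose ratings are identical across every input box --
    the tell-tale pattern of the suspected bots."""
    return len(responses) > 1 and len(set(responses)) == 1

print(looks_automated([3, 3, 3, 3, 3]), looks_automated([3, 1, 2, 3, 2]))
# True False
```

A real screen would also check completion times and attention-check questions, since a careless human can produce the same pattern.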

Using the insights gained from this process, we can imagine curating a selection of sentences that scored high on each scale (green, yellow, red) and using them as input for training a machine learning algorithm. It will be interesting to see whether an automated process can be trained to pick up on the lexical slant that is inserted into news reporting by each news source. For example, it may be that certain “slanted” keywords or phrases are repeated across news programs of the same TV channel. The emotion detection functionality of SuperGlue [Figure 2, leftmost column] could also be a useful addition; perhaps detecting “disgust” or “irony” on the part of the news anchor could pinpoint subtle wording changes that indicate slant.
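As a first step toward that automation, one could estimate, per word, how strongly it skews toward sentences annotated as slanted. The sentences and labels below are toy placeholders, not our annotated data:

```python
from collections import Counter

def keyword_slant_scores(labeled_sentences):
    """For each word, the fraction of its occurrences that fall in sentences
    labeled yellow/red rather than green -- a crude lexical-slant signal
    that a fuller model could build on."""
    slanted, neutral = Counter(), Counter()
    for sentence, label in labeled_sentences:
        bucket = neutral if label == "green" else slanted
        bucket.update(sentence.lower().split())
    return {w: slanted[w] / (slanted[w] + neutral[w])
            for w in set(slanted) | set(neutral)}

# Toy annotated sentences (hypothetical):
toy = [
    ("officials confirmed the missile launch", "green"),
    ("a shocking and reckless missile launch", "yellow"),
    ("the launch was confirmed by officials", "green"),
]
scores = keyword_slant_scores(toy)
print(scores["shocking"], scores["officials"])
# 1.0 0.0
```

Words scoring near 1.0 would be candidate “slant markers” for a given channel; a real classifier would need far more data and context than single-word counts.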

Our (re)shared reality

The interventions and analyses described above are only a few example efforts toward reclaiming the shared reality that existed in previous decades but is now under assault by the ubiquity and marketization of media, particularly online platforms. No matter how precise and scalable our analyses are, they will not matter if we are talking to a public that cannot hear us. It is as if a speaker is preparing to address a large audience while the microphone sits in a different room. We have to bring the speaker, the microphone and the audience into the same room, into the same reality.