Thursday, August 31, 2023

jackie thinks about mis/dis/information for a long hard while

In 2012, Notre Dame football player Manti Te’o announced that his grandmother and girlfriend had died on the same day, dedicating his play that season to their memory. Sports news outlets picked up the story in a storm, reporting on both the player’s incredible success that season and the tragic passing of these two women. By the next year, it was revealed that only one of these women existed. His girlfriend, an online relationship whom Te’o never met, was ultimately a catfish.

While there are layers to this story beyond the slurry of reports as it was revealed to America (i.e., the girlfriend profile “Lennay Kekua” was run by a trans woman in her pre-transition youth, and Manti’s personal mental health drew him to social media and thus to this relationship), the initial sports news outlets that published this story fall into categories of misinformation where, as defined in Caroline Jack’s Lexicon of Lies, “accuracy is unintentional” (2). Articles were written in earnest, on the basis that an early-2010s social media footprint was enough to reflect a real person. The Deadspin article released to reveal this inaccuracy interestingly roots itself in the righteous need to correct the misinformed trades of Sports Illustrated and ESPN while further establishing Deadspin as a legitimate news site. Yet Deadspin is in the same boat as the misinformed media, as emphasized in Marwick and Lewis’s Media Manipulation and Disinformation Online discussion of “the attention economy” and its impact on news outlets’ transitions into online environments that draw from popular stories on social media (42). Though the catfishing of it all displayed a form of disinformation in Manti’s personal life, when brought to the public sphere the story is rooted in misinformation subject to carnivorous media.


Where this example was so clear to me throughout the reading, both Lexicon and Media Manipulation made me repeatedly think of meme cultures and traditional parody news accounts. The discussion of trolls, coupled with definitions and misplacements of satire on this spectrum of false information, requires repeated clarification of intent versus public use: if the joke was created to disinform, are consumers at fault for believing it’s true? Kuo and Marwick offer a great line on this, stating that “countering mis- and disinformation goes beyond solutions like “fact checking” or “media literacy” which place responsibility on individuals to become informed media consumers” (5). When considering accounts like Discussing Film, which exists across Twitter and Instagram, there are numerous parody accounts that have come and gone from singular like-styled tweets; I’m referring to accounts like DibussingFilm and DissussingFilm, both of which no longer exist at all or as fully run accounts, as the trend is to simply change your profile picture and submit a tweet that is almost perfect.



Jack’s Lexicon of Lies importantly points out that “online content often spreads far beyond its original context, and sometimes it can be difficult to judge whether a piece of content is serious or satirical in nature” (12), a factor we discussed last week with the Hurricane Hilary “live reports” using images from the Universal Studio Tour to represent flooding. While parody will always require forms of media literacy to combat, this week offered interesting takes on and definitions of these forms of comedy moving into the digital, and on the ethical responsibilities of their comedians, not unlike journalists.

Misinformation...

[1] https://www.nytimes.com/2023/05/31/world/asia/south-korea-alert.html

Just three months ago, South Koreans had a rude awakening [literally and metaphorically] when the emergency siren—to be truthful, I had no idea these still existed—began wailing at 6:32 a.m., quickly followed by government-issued SMS emergency alerts stating that North Korea had fired a rocket [at 6:27 a.m.]. The texts urged Seoul residents to “prepare to evacuate” and prioritize children and the elderly. At 7:03 a.m., approximately 22 minutes after the first round of mobile alerts was issued, the Ministry of the Interior and Safety of South Korea issued a second: “a notice that the 06:41 alert issued by the city of Seoul was a false alarm” [note: informal language; missing period].

It was intriguing to see how different friends responded to the fiasco: [1] one sent an angry text [and a string of profanities] to our group chat at 7:23 a.m., from the comfort of her home, infuriated by the idiocy of it all; [2] another friend, who had served in the special forces, had been walking home after drinking with his friends at the Han River [he had supposedly seen a body floating down the river the same day] when he received the texts; he packed his essentials immediately and headed straight to the subway station; [3] another somehow slept through the sirens and alerts.

The mayor later issued an official apology regarding the confusion but stated that the decision wasn’t a false alarm but rather an “overreaction”; though Seoul was several hundred kilometers away from the rocket’s trajectory, officials supposedly issued the alert as a precaution. Misinformation or disinformation [or somewhere in between]?


[2] 
Caroline Jack’s definition of misinformation is contingent on the question of accuracy. Is misinformation necessarily “inaccurate”? How do we define accuracy? I’m thinking of George Floyd and the unrelenting string of subscripts [ex-convict, drugs, etc.] appended to his story. Stripped of political intent, were these claims inaccurate? Falsehoods? How are truth and fact deployed to perpetuate what we might understand as a broader and nebulous arc of misinformation [e.g., racial stereotypes], especially as it pertains to Blackness? I found Kuo and Marwick’s claim that “technology did not create the problem of disinformation and technical solutions alone are not the answer” (6) rather banal, and am more interested in how technology exacerbates the problem, as in the example of the post-9/11 information infrastructures referenced in the paragraph. How do surveillance technologies partake in the dissemination of misinformation? What modes of praxis might we engage with in resistance? [I’m reminded of Saidiya Hartman’s annotations in her archival work (2019) and Christina Sharpe’s redactions (2016).]

Mis(sing)information: False Realities in the Entertainment and Performance Spheres

            In Lexicon of Lies: Terms for Problematic Information, Caroline Jack points to some of the major ways misinformation and disinformation spread. Within the first few pages, Jack emphasizes the dangers of computational systems which rely on algorithms concerned only with an item’s potential to trend (thereby creating a profit), but not with the item’s accuracy or intent (3). On this subject, Jack writes, “News content circulates on social media alongside entertainment content, and the lines between the two can be hard to discern” (4). This is especially true when that entertainment parodies news. This popular form of satire can quickly transform into misinformation when readers do not carefully read between the lines, discern a piece’s tone, or even check a site’s “about” tab for more information before sharing it with the incorrect context. A good example of this occurred in 2022 when a satirical news source on all things Disney (Mouse Trap News) posted an article and accompanying TikTok about Disney World’s plans to lower the legal drinking age within park limits to 18. Within a matter of days, the video had accrued over 3 million views as a legitimate informational piece. Despite the news source’s clearly satirical name and its bio, which reads, “Real Disney News That is 100% FAKE!”, the story circulated like wildfire and even ended up on ABC 10 News.


Had Mouse Trap News’ piece simply existed in written form on their website, it’s hard to imagine it going as viral as it did on TikTok. This is undoubtedly the result of TikTok’s algorithmic structure as well as its presentational format. Because the video sparked attention, the platform’s algorithms promoted it even further. This was then exacerbated by a medium which encourages users to scroll speedily through the short videos fed to them through their FYPs without stopping long enough to check who is posting them. This example is interesting because, while Mouse Trap News did intentionally post about something fictitious for comedic purposes, to critique the money-making machine that is Disney, they did not intend to participate in malicious disinformation. And yet, despite this, their post led to widespread misinformation which radiated outside the immediate Disney Adult imagined community.

The Mouse Trap News case illustrates a complex instance of circulating false information to simultaneously entertain viewers and encourage them to reconsider their relationship with the Disney brand. Next, I’d like to turn to a perhaps even more complicated example of disinformation and misinformation at the intersection of politics, religion, and performance. In recent years, Shen Yun has become infamous across social media for its aggressive marketing campaigns, a fact that quickly became meme-ified in 2019.

 
 
Despite this public acknowledgment of Shen Yun’s extensive publicity, few have explored its ties to the ideology its propaganda-cloaked-as-performance promotes. Though Shen Yun’s advertising claims that it is a theatrical experience spanning 5,000 years of Chinese history, and though it is an event that features highly trained professionals, such characterizations of Shen Yun neglect to mention the force that makes it all possible: Falun Gong, a nonprofit organization with anti-communist, anti-evolution, homophobic, racist politics. Perhaps the only end-times religion with its own thriving performing arts academy, Falun Gong utilizes Shen Yun to promote conspiracy theories and prejudice through song and dance. Intrigued by this enigmatic event, I attended Shen Yun in 2021 and was shocked by the magnitude of its unabashedly didactic messaging.

 

Woven throughout the musical numbers were tales of Chinese government officials harvesting the organs of young Falun Gong followers, as well as natural disasters smiting the earth because of gay couples and humankind’s overreliance on technology. It is ironic, then, that the entire performance relies upon a massive digital backdrop which transitions between numbers and features several wholly unnecessary special effects (including a tsunami in the shape of Karl Marx’s head), but that’s a discussion for a more in-depth research paper.

What I found particularly disturbing was the hate Shen Yun’s performance mapped not only onto the Chinese government, but onto China itself. As I looked incredulously around the audience during the final number, I realized the majority of people in attendance were elderly, white, and (based on their fervent nodding along to words bemoaning the corruption of today’s youth who are led astray by science) conservative. Within this echo chamber, those concert-goers were having their own biases about China confirmed by those they viewed as “authentic” representatives from China. In essence, Shen Yun utilizes a pre-existing Orientalist framework as a tool to promote its own historical narrative. As discussed by Rachel Kuo and Alice Marwick in “Critical Disinformation Studies: History, Power, and Politics,” disinformation is commonly disseminated through “the repetition of particular narratives and stereotypes” which often reify already established fear- or hate-based systems (3). In the context of Shen Yun, depicting Chinese citizens as evil, communist organ-harvesters allows the organization to build upon preconceived xenophobic beliefs within their viewers, thereby promoting the idea that contemporary China is not truly (or purely) Chinese because of communism’s influence. Shen Yun’s seemingly omnipresent advertisements, which depict nothing at all aside from a dancer and a vaguely positive descriptor of the performance, act as visual distractions which assert a sense of power and “traditional authenticity” over a version of Chinese history that Falun Gong has presented as truth for decades.

Wednesday, August 30, 2023

Mis (Disinformation) and Twitter

In this response, I take care to include "(dis)" in instances where I cannot be sure of the intentions.

In the era of Twitter (and I will not be calling it X) community notes, mis(dis)information on the site has taken a rather hilarious hit, but this has also revealed just how much of Twitter is just that: mis(dis)information. In their piece “Critical Disinformation Studies: History, Power and Politics,” Kuo and Marwick write, "First, positing a current crisis of fragmented “truth” due to technologically enabled polarization presumes that, prior to the advent of social platforms, the public agreed upon “facts” and “knowledge.”" The tweet below echoes a similar sentiment: many tweets that have gone viral in the months since the launch of the community notes feature have been marked with fact-checks, revealing how much wrong information permeates the site.

Similarly, Kuo and Marwick mention how misinformation prevailed even prior to social media, which many credit for its rise. Rather, misinformation has always been prevalent; the difference between then and now is simply the reach. I believe that the amount of disinformation has not changed, but rather that it is able to spread further. The writers discuss how Black spaces presented counters to the stereotypical assumptions peddled about the race by white-dominated media; in cases like this example, the reach of these white-centered mis(dis)information campaigns is wide due to social media, but because these Black spaces also exist on public social media, so is the reach of the counterinformation. The CNN article specifically is a great example: CNN published an article that holds some truth to it, but due to the existing view of many African countries as homophobic (not denying the truth of this), the headline presents the case in a way that highlights this perceived homophobia rather than the sexual assault performed by the men, furthering the image of Uganda as a homophobic country and not accounting for the nuances of this particular situation. This mis(dis)information is then foiled by Twitter community notes.

In the second assigned reading, “Lexicon of Lies: Terms for Problematic Information,” the writer notes that "Finally, digital platforms systematize incentives that can drive the spread of problematic information". With the launch of payouts for Twitter premium users with a lot of engagement, many of them have resorted to different tactics to farm for engagement. One such tactic is, of course, the spread of mis(dis)information, sometimes dangerous but other times harmless, such as in the tweet below.
This form of mis(dis)information is obviously debunked, and even prior to the community notes feature it was easily debunked. However, it is incentivized by the hope of a payout for the impressions and engagement it brings to the user, exemplifying the point made by Jack.

Twitter is a great site for analyzing the spread of mis(dis)information and consistently proves so every day.


Cricket and Conspiracies.

 1. On Jingoism and Cricket: 

With the incredibly niche cricket tournament, the Asia Cup, kicking off the season in a matter of days, part of national Indian news is the speed with which tickets are being snatched up. While we've already seen similar discourse around the Ticketmaster x Taylor Swift frenzy, with expensive India games selling out within seconds mirroring a tale we know all too well, there is a sinister underbelly to these workings. It includes a culmination of corruption involving parties such as resellers and the Indian cricketing body (the BCCI, or Board of Control for Cricket in India, the most popular and well-funded cricketing agency), but it also lies entrenched within national politics and instances of misinformation fed to the Indian public, fueling further jingoism and divisive opinions.

With ruling party members reflecting immense levels of (Hindu, right-wing) patriotism as well as a keen interest in the sport of cricket, there is a messy intertwining of politics and sport. This goes to the extent of naming the largest and newest stadium in India the Narendra Modi Stadium after the Prime Minister, and installing former BJP members in the highest ranks of the cricketing boards. Vraga and Bode offer a description of "best available evidence" (138) which melds neatly into the concept of media manipulation and rules of control operating in a grey area that feeds off confusion regarding accuracy and a lack of options to turn to. Several Twitter and Instagram users have complained under #IndiaVsPakistan of their (resoundingly similar to the Ticketmaster x Swift experience) inability to secure tickets for the fabled match. In encountering a crowd-verified and publicly fact-checked case of misinformation, there is room to exaggerate and promote the already frenzied event, not only for private monetary gain, but to capitalize on the jingoist, racist, and hyper-nationalist sentiments surrounding the India vs. Pakistan match. Perhaps this may be edging the line between misinformation and conspiracy theorizing, but it is something that has been increasingly concerning to me as of late.

Here's an ad promoting the match, positioning the match as #TheGreatestRivalry: https://www.youtube.com/watch?v=tcI9NYGXUPE&ab_channel=StarSports

2. Maui Fires, Disaster Capitalism and Conspiracies: The recent devastation in Maui not only produced a warning sign of the severity and increasing rapidity with which natural calamities may approach us, but has also been met with several threads on various social media platforms calling it a hoax; more specifically, a government-run conspiracy. Harkening back to Marwick and Lewis' words on sensationalism "driving faulty information flows within communities" (19), the conspiracy originated on Facebook and has spiraled online. Thankfully, most news sources are treating this as a warning sign regarding fake news and misinformation, urging readers to be more discerning and to fact-check rather than repost. This has been done by the likes of Forbes, USA Today, and The Guardian, to name a few.

While Marwick and Lewis specifically call out and detail the workings of alt-right influencers and their reach, this conspiracy has reached beyond the pale. Relatively apolitical influencers, even family vloggers as pictured below, have dabbled in repackaging the conspiracies, turning already bite-sized news (in this case, bite-sized social media threads) into a 'best of' highlights reel of the disaster. As mentioned in the Data & Society report Media Manipulation and Disinformation Online, influencers serve "as significant nodes in these networks...[they] hold the power to amplify particular messages and make otherwise fringe beliefs get mainstream coverage" (Marwick and Lewis 20), which comes with a certain responsibility that has clearly been eschewed in the case of this conspiracy.



In diving into the rabbit hole of climate change naysayers coupled with conspiracy theorists, I came up for air thanks to the words of Naomi Klein. She notes, "disasters have become the preferred moments for advancing a vision of a ruthlessly divided world, one in which the very idea of a public sphere has no place at all. Call it disaster capitalism" (Klein 49). While the latest barrage of misinformation sweeping the internet hasn't been catalyzed by the media, influencers - and by extension, social media news outlets ranging from YouTube bigwigs such as Philip DeFranco and Keemstar (ew) to TikTok green-screen reporters with a couple hundred followers - bear a certain burden in what they choose to spread. With audiences' growing mistrust of the mainstream media, and their more likely laziness in fact-checking and extending beyond Twitter for news updates, is a Community Note or Meta's blurred image and note stating that what we're reading is fake news enough? The public seems to want an alternate reading of these disasters, and "Given the boiling temperatures, both climatic and political, future disasters need not be cooked up in dark conspiracies. All indications are that if we simply stay the current course, they will keep coming with ever more ferocious intensity. Disaster generation can therefore be left to the market's invisible hand" (Klein 58).

Welcome!

 We'll use this blog to post various work for the CTCS 585 Countering Misinformation course.

