From left to right: Talha Belaig, Nichole Sessego, Liz Mori Tornheim, Sarah Amos, Theodora Skeadas, Rebecca Thein
This blog post is a summary of information and insights shared by the speakers at the Misinformation Village satellite event during RightsCon 2023.
This blog post was written largely by OpenAI's ChatGPT.
Embracing a Systemic Approach to Tackle Misinformation
In the era of digital connectivity and vast information exchange, tackling the growing issue of misinformation has become a critical task for both tech companies and society at large. Unfortunately, this problem cannot be resolved by addressing individual components in isolation. Instead, a systemic approach, encompassing various aspects of the internet, tech companies, governmental bodies, and users themselves, needs to be adopted.
Embracing a systemic approach to tackle misinformation, therefore, involves harmonizing efforts across multiple domains: technology, regulation, ethical journalism, and public education. Tech companies need to use their capabilities, collaborate, and operate within transparent regulatory frameworks. Journalists must be committed to ethical reporting, while the public must be educated to navigate the information landscape intelligently.
The goal of this event was to connect with the digital rights audience, who often sit outside the decision-making rooms and processes that drive the new products and social features that both connect us and degrade our communication, poisoning the digital spaces where we log on to share information and expertise, discover new interests, and organize. By connecting with a diverse, multi-hyphenate audience, we aim to tell fuller, more diverse stories about the internet and online disinformation.
Confronting Biases in AI: An Insight by Sahar Massachi
Sahar Massachi, with his experience as a former engineer at Facebook, did not mince words about the biases entrenched in AI systems. In his discourse, he shattered the illusion of AI neutrality, demonstrating how these systems can inadvertently reflect and even escalate societal prejudices. Massachi underscored the urgent need for rigorous auditing of these convoluted systems—not merely to mitigate public backlash or to meet regulatory compliance, but to hold tech giants accountable for their role in shaping societal perceptions, narratives, and attitudes.
Jeff Allen's Plea for Transparency in AI
Jeff Allen, a seasoned tech executive, championed transparency in AI and social media. He acknowledged the reality of our algorithm-driven lives and asserted that understanding these technologies, how they work and why they make the decisions they do, is essential for empowering users. In Allen's view, user-facing transparency is not merely a moral obligation; it's also a pathway for companies to foster trust and loyalty among users.
Pamela San Martin on The Rise of Social Media Oversight Boards
Pamela San Martin offered a fascinating perspective on the role of oversight boards within major social media conglomerates. As she pointed out, in a world where power dynamics are fluid and private entities sometimes wield more influence than governmental institutions, oversight boards serve as a crucial counterbalance. San Martin envisions these boards not just as regulatory bodies, but as culturally and geographically diverse entities capable of providing unique insights, navigating the complexities of global digital interactions, and setting fair standards for online communication.
Paulina Ibarra and The Weaponization of Disinformation
Paulina Ibarra, with her expertise in political influence operations, unveiled the grim reality of how disinformation campaigns are weaponized to target specific groups—women in politics, in her example. Ibarra shared compelling instances of how these campaigns can significantly influence political outcomes and alter the trajectory of national narratives. However, she emphasized the need to frame this issue as a human rights concern, not just a technological problem, warranting a comprehensive, multi-sectoral response.
Naming and Addressing Digital Violence: Insights from Ixchel Aguirre
Ixchel Aguirre, a relentless activist against gender-based digital violence, highlighted the importance of explicitly recognizing and naming the various forms of digital violence. She argued that to understand this form of violence, a broader lens—embracing diversity and inclusivity beyond binary gender perspectives—is required. Aguirre emphasized viewing these issues through not just a technical lens, but also from a human rights and social justice standpoint.
Navigating the Role of Government in Internet Governance: A Perspective by Adam Peake
Adam Peake of ICANN offered nuanced insights into the role of government in internet governance. He acknowledged the delicate equilibrium between the ideal scenario of government using the internet for societal welfare and the concern of potential power misuse. However, his primary focus lay on ensuring system security and stability—a crucial domain where ICANN plays a key role.
A Fresh Perspective on Tech Impact by Swati Punia
Swati Punia offered an innovative viewpoint on how we perceive the 'impact' of technology. She suggested that the true measure of impact should encompass not just the ultimate outcome, but also the transformative experiences and personal growth individuals undergo while interacting with technology—adding a new dimension to our understanding of how technology influences us.
These profound insights underscore the intricate interconnections among technology, society, and governance. As we delve deeper into the digital age, these conversations will only become more crucial, guiding us towards a more inclusive, fair, and transparent tech-scape.
Diverse Narratives of our Digital World and their Civic Impact
Human civilization has always been driven by narratives, guiding our understanding and shaping our behavior. Yet, as Jennifer Halweil emphasizes, our traditional narratives have been largely binary - good versus evil, right versus wrong. This binary view, while useful in simpler times, is woefully inadequate for our complex, interconnected world. With the advent of the internet, our perception of the world has exploded into a technicolor array of diverse perspectives. Communities now have a global stage to share their stories, spurring innovation and breaking down binary barriers. Halweil's focus is on nurturing these digital ecosystems to foster inclusive and efficient information exchanges.
However, our digital spaces come with their share of problems. Halweil argues that our internet, primarily driven by ad-based models, has created a culture of outrage and echo chambers. This environment feeds our implicit and confirmation biases, while users, the central commodity, are left with little reward or representation. A fundamental change to these underlying models is necessary to foster a more nuanced and balanced view of the world.
Shaping Civic Integrity in the Digital Age: Insights from Industry Leaders on Elections, Disinformation, and Collaboration - A Panel
When we shift the focus to the sphere of civic integrity, Theo Skeadas broadens our understanding. He reminds us that it's not just about elections but all significant civic events. In the wake of unplanned crises like natural disasters or wars, it is imperative for tech platforms to respond appropriately. Skeadas underscores the importance of combatting disinformation, validating candidates, and partnering with reliable news sources to fortify the legitimacy of these events.
Sarah Amos dives into the technical aspects, highlighting the role of AI and machine learning in combating misinformation. However, she insists, it isn't just about the technology. Human expertise is critical in staying ahead of the curve, identifying emerging narratives, and adapting our responses accordingly.
Nichole Sessego and Rebecca Thein offer insight into the internal workings of tech companies. Sessego underscores the value of cross-functional collaboration, especially during times of industry upheaval. Thein, on the other hand, argues for a focus on redundancy over efficiency. Drawing on her own experience at Twitter, she insists that early preparation and team building are essential for successful election support.
Liz Mori Tornheim extends the conversation to include external partners. Civil society, academia, and nonprofits can offer invaluable insights, especially when dealing with elections in regions where tech companies might lack expertise or personnel. These partners can provide feedback to help shape product interventions, policy developments, and process designs.
Navigating the Digital Landscape: The Interplay of Media Narratives and Networked Authoritarianism (Giovanna Fleck, Alex Esenler)
Alex Esenler frames journalism as a mechanism for positive civic impact, creating repositories of civically minded reporting that illuminate underrepresented narratives and provide comprehensive visual documentation of global events. Furthermore, Esenler highlights the necessity for ongoing observation, reporting, and analysis, emphasizing that in-depth, persistent research can reveal potential crises well before they occur.
Giovanna Fleck underscores the irreplaceable value of human analysis in understanding context-specific information. Only individuals immersed in the context can truly grasp and interpret these details, which may be missed by automated systems. Fleck also draws attention to the issue of digital authoritarianism, where technology is employed to curb freedoms, manipulate narratives, and preserve authoritarian practices. She argues that recognizing digital rights as human rights is fundamental to preserving individual integrity and freedom of expression in the digital world.
Neuroscience, Emotion, and the Disinformation Battle (Maia Mazurkiewicz, Hallie J. Stern)
Maia Mazurkiewicz advocates for a strategic shift beyond fact-checking and combating disinformation. Instead, she calls for promoting positive strategic communications, emphasizing the crucial roles of education, media literacy, and particularly emotional education in this process. The more individuals understand their emotional responses, the less susceptible they are to the manipulative tactics employed by disinformation campaigns.
Hallie J. Stern introduces a neuroscience perspective, elucidating that online information is designed to evoke specific responses by stimulating multiple emotions. The interplay between these responses and the information we encounter online is a product of our neurological processes, she explains. By marrying the technical understanding of these processes with the humanities, we can learn to measure and teach the societal impacts of disinformation.
Mazurkiewicz reiterates that fact-checking alone isn't enough; what's needed is a broader communication strategy that includes media literacy and emotional education.
An interesting case study discussed during the session was the current political situation in Poland. As the country heads towards elections, Maia and her team are piloting their tool there to identify disinformation narratives. The first-hand accounts shared painted a picture of how government narratives exploit the pro-Russian label against opposition parties, stirring public sentiment and prompting massive public demonstrations. These events underscore the urgency of their work.
Tackling disinformation isn't just about technology and tools. It's also about fostering education that promotes not only critical thinking but also emotional intelligence. The recognition of how emotions influence reactions to information, and how they can be manipulated, was highlighted as an important defense against disinformation. A poignant example was shared: at the onset of the war in Ukraine, Polish citizens experienced a rise in fear, a basic emotion that propagandists readily exploited with the narrative "Putin wants you to be afraid." This example underscores the urgent need for emotional education from a young age, teaching individuals to understand their emotions and their effects.
The session also shed light on the importance of studying disinformation from a marketing perspective. A fascinating analogy drawn was how horror movies elicit a much stronger emotional response because of the immersive experience they create. In a similar vein, it was explained that disinformation campaigns are designed to trigger specific emotional responses, capitalizing on our cognitive processing of familiarity and emotion.