Kate Eichhorn
Measuring the Effects of Online Hate Speech
Role: Principal Investigator
Timeline: 1998-2000
When digital communities were just beginning to form and norms around online behavior were still unsettled, this study examined how hate speech emerges, spreads, and gains meaning in virtual environments. It analyzed how platform features shape the conditions under which harmful or exclusionary language affects individuals and groups, and how circulation patterns such as resharing, archiving, and linking can intensify or transform its impact. The project helped frame early discussions about ethical online engagement and highlighted how digital platforms can both disrupt and reinforce existing forms of hate.
Key Questions
- How do speech acts (especially injurious or hateful ones) operate differently online compared to offline contexts?
- In what ways do circulation, archiving, and referencing of digital content (images, text, links) influence the impact of hate speech?
- What role do platform conditions (anonymity, pseudonymity, moderation, visibility) play in shaping the dynamics of cyberhate?
- How might the design of digital communication environments enable or constrain harmful speech acts?
- What are the implications for researchers and designers in terms of mitigating harm, supporting community resilience, and building ethical infrastructures for digital platforms?
Methodology and Rationale
This study was conducted at a formative moment in the evolution of online communication, when norms around digital interaction were still taking shape. It began with the premise that digital speech is not merely an extension of face-to-face or print communication. Unlike offline speech—which is often fleeting—online content can be copied, remixed, and recirculated indefinitely, reshaping its meaning and amplifying its reach.
To understand how hate speech operates in these environments and how it acts upon individuals and communities, I conducted a discourse analysis of selected online hate-speech events. The analysis drew on speech-act theory to examine how language not only conveys information but can, in some cases, act upon individuals, altering their status and causing harm. It further explored how the specific conditions of digital environments alter this injurious capacity.
Study Design
- Textual/Discourse Analysis: Examined selected instances of harmful speech online, tracing how they were composed, circulated, referenced, and archived.
- Case Studies: Focused in detail on specific incidents or threads of online hate speech, tracking how they moved through digital spaces and how users responded.
- Qualitative Interpretation: Rather than large-scale quantitative measurement, the research emphasized how meaning is produced, how harm is socially and digitally constituted, and how online infrastructures support or impede those processes.
Sampling
Because this was a discourse analysis rather than an ethnographic study, there were no participants; the data set consisted of examples of online speech. The study paid particular attention to godhatesfags.com, an early and widely publicized domain dedicated to spreading misinformation about the LGBTQ community.
Screener Logic
I focused on online contexts where acts of injurious speech occurred frequently, and traced how these acts were referenced, recirculated, reshared, and engaged with by visitors.
Bias and Validity
Research for this study was carried out before the arrival of comprehensive search engines; as a result, it relied on examples of online hate speech that were already visible enough to be reported in the print media. Because hate speech is itself open to interpretation, the study was also shaped by my own judgment of what constitutes a linguistic injury. Its findings therefore reflect both the limits of the search tools available at the time and my own potential interpretive biases.
Key Insights
- Digital hate speech can have amplified impacts: Because content can be archived, copied, reshared, and referenced indefinitely, a single injurious act can acquire an extended life and influence far beyond its original context.
- Circulation shapes meaning and harm: The path a harmful message takes (who reshared it, where it appeared, how it was framed) can shape its impact as much as its original content.
- Platform affordances matter: Features such as anonymity/pseudonymity alter accountability; persistence and archiving mean speech acts outlive their posting; hyperlinking enables harmful messages to spread across communities.
- Blurred boundaries between private and public: What might have been an ephemeral remark offline becomes searchable and referenceable, creating new forms of exposure and harm.
- Design and moderation shape experience: Moderation systems, visibility controls, anonymity features, and archiving policies directly influence how harmful speech is encountered and managed.
- Digital communities require new ethical frameworks: Traditional approaches to addressing harmful speech, and even existing regulatory frameworks, often fall short when applied to mediated digital environments.
Impact
Although published in 2001, the article anticipated many issues now central to discussions of digital experience, platform governance, and UX ethics: how harmful speech propagates online, how design choices amplify or mitigate harm, and how participation and visibility shape users' sense of safety, identity, and agency.
Publications and Presentations
Eichhorn, Kate. 2001. "Re-in/citing linguistic injuries: speech acts, cyberhate, and the spatial and temporal character of networked environments." Computers and Composition 18 (3): 293-304. doi: 10.1016/S8755-4615(01)00057-3.
Eichhorn, Kate. 2000. "Cyberhate and Performative Speech in Accelerated Time(s)." M/C Journal 3 (3). doi: 10.5204/mcj.1849.
Eichhorn, Kate. 2000. "Legal and Pedagogical Perspectives on Hate Speech in Cyberspace." Invited talk. Canadian Association for the Practical Study of Law in Education.
Eichhorn, Kate. 2000. "Cyberhate and Performative Speech." Electronic Communication and Culture Area, ACA/PCA National Convention, New Orleans, LA.