On 8chan, the site administration (global mods) took a "hands-off" approach, intervening only when required by United States law. This meant that while child sexual abuse material (CSAM) was banned, other forms of extreme content—including bestiality, gore, and hate speech—were permitted provided they stayed within their designated boards. Board moderators (Board Volunteers or BVs) were users themselves. In /zoo/, this resulted in a self-policing environment where the only rules were dictated by the necessity to keep the board online and avoid federal scrutiny.
Sociologically, participants in /zoo/ utilized mechanisms of moral disengagement to justify their presence. Common rationalizations found in the board's text posts included arguments of "animal consent," the rejection of "human-centric sexual morality," and the framing of their interests as a persecuted sexual orientation. This created an echo chamber where laws against bestiality were framed as oppressive government overreach, aligning the board's userbase with the broader libertarian/anarchist political ethos of 8chan at large.
Following the Christchurch mosque shootings in 2019, 8chan was deplatformed by its infrastructure providers (Cloudflare, etc.). When the site rebranded as 8kun, a massive restructuring occurred. To appease payment processors and infrastructure providers, 8kun implemented a "whitelist" system. Boards like /zoo/, which were deemed too risky and repugnant, were not whitelisted. This marked the official end of the board on the clear web, forcing the remaining community onto the dark web or decentralized file-sharing networks (like ZeroNet).
The Architecture of Anonymity and Radicalization: A Case Study of 8chan’s /zoo/ Board