From junta-era raids to algorithmic invisibility, Latin America’s memory work faces new gatekeepers. What might digital transitional justice look like in practice?
Content warning: This article contains mentions of state violence and human rights abuses.
In a dim back room of a Buenos Aires community centre, two archivists hover over a flatbed scanner. Each yellowed police report they digitise reclaims a story long suppressed under Operation Condor’s continental web of repression. What begins as a simple act of preservation becomes an act of defiance against erasure. Yet, in 2025, the threat no longer comes from burning files but from algorithmic invisibility. When a community standards notice halts an upload, history is once again placed on trial by an unseen authority.
Across Latin America, the memory of dictatorship is being reshaped not by state censors but by opaque platform policies. What once required armed raids and official decrees now occurs through automated flagging and downranking. Human rights groups from Argentina to Paraguay report posts about past atrocities disappearing overnight. The mechanisms differ, yet the logic of control persists: visibility remains a privilege, not a right. In the digital sphere, moderation has become the new censorship.
The resonance with the 1970s is chilling. Operation Condor was conceived to silence dissidents across borders, coordinating surveillance, kidnapping and information blackouts among Southern Cone regimes. Decades later, transnational power operates through servers instead of secret police, but the asymmetry endures. Big Tech companies headquartered in California or Dublin decide which South American memories meet ‘advertiser-friendly’ criteria. This shift from physical to algorithmic domination defines the region’s current archival struggle.

The discovery of Paraguay’s Archives of Terror in 1992 proved that bureaucratic documentation could be weaponised and later reclaimed. Activists from the Centro de Estudios Legales y Sociales (CELS) turned those files into evidence for truth commissions. Today, the same ethos animates digital archivists uploading testimonies and images once hidden in filing cabinets. But platforms often lack the nuance to distinguish documentation from incitement. As a result, the infrastructure meant to democratise memory can reproduce its exclusions.
From Buenos Aires to Santiago, volunteer networks have migrated their physical collections to cloud repositories, hoping to ensure permanence. They rely on YouTube, Meta and Google Drive — services whose terms can change overnight. When a channel sharing courtroom footage of dictatorship trials is suspended, volunteers scramble to mirror the content elsewhere. The irony is acute: the technology that promised decentralisation concentrates power in a handful of corporations. Memory work, once threatened by dictatorships, now depends on their digital successors.
For survivors and relatives, this renewed vulnerability reopens wounds. Many had trusted that digitisation would shield testimonies from the decay of paper and the corruption of governments. Instead, algorithmic filters treat evidence of violence as violence itself. Behind each deletion lies a symbolic silencing that echoes the impunity these archives sought to challenge.
From state censorship to algorithmic control
In the 1970s, censorship in South America was explicit and violent. Military juntas raided newsrooms, banned books and detained editors in the name of national security. The architecture of silence relied on fear and bureaucracy — ministries catalogued ‘subversives’ with precision. When Operation Condor connected intelligence services across Argentina, Chile, Uruguay, Paraguay and Brazil, censorship became continental. Its goal was to annihilate dissent, both in print and in the public imagination.
The Archives of Terror later revealed how information control operated as a shared technology of domination. Thousands of documents listed citizens targeted for their ideas, linking surveillance files to international police co-operation. Each record reflected the obsession with eliminating alternative narratives. Truth was not just hidden but administratively erased. The paperwork of repression created the illusion of legality for illegitimate acts.
As democracies returned, societies sought to rebuild their archives and recover public memory. State institutions such as Argentina’s Archivo Nacional de la Memoria began cataloguing testimonies and legal evidence. Regional organisations including CELS and Brazil’s Armazém Memória helped digitise and contextualise collections. Their mission was simple yet urgent: ensure that future generations could access the past unmediated. For many, digitisation symbolised closure — proof that history could not be undone.
Yet the promise of open access collided with the rise of platform governance. As social networks and hosting services became dominant spaces for information exchange, they quietly inherited the gatekeeping role once held by ministries. Instead of direct orders from a general, new controls arrived through opaque ‘community standards’. The language softened, but the result felt familiar: entire archives could disappear without explanation. Censorship became decentralised but no less powerful.
Algorithmic control reframed old logics under the guise of efficiency. Platforms insist moderation is neutral, a technical necessity to protect users from harm. But automated filters cannot reliably distinguish documentation from glorification, or testimony from propaganda. In practice, this ambiguity penalises users from the Global South, whose historical materials often contain violent imagery. The outcome resembles state censorship — not in intent, but in effect.
What platforms remove — and why
The architecture of moderation defines what the internet remembers. Every takedown, shadowban or downranking alters a story’s visibility. Unlike traditional censorship, these mechanisms operate silently and at scale, affecting millions of posts daily. A takedown is an outright deletion; a shadowban hides a user’s content without notice; downranking buries it beneath algorithmic clutter. Together, they create a hierarchy of suppression calibrated for plausible deniability.
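To make that hierarchy concrete, here is a minimal sketch in Python, with hypothetical names, of how the three mechanisms differ in who can still see a post. Only the takedown announces itself to the author, which is precisely what gives the quieter mechanisms their deniability.

```python
from dataclasses import dataclass
from enum import Enum, auto

class Action(Enum):
    """Three suppression mechanisms, from loudest to quietest."""
    TAKEDOWN = auto()   # content deleted; the poster is usually notified
    SHADOWBAN = auto()  # hidden from everyone except the poster
    DOWNRANK = auto()   # still public, but buried beneath fresher posts

@dataclass
class Post:
    author: str
    action: Action | None = None  # None means no moderation applied

def visible_to_author(post: Post) -> bool:
    # Only a takedown is observable to the poster; shadowbanned and
    # downranked posts look perfectly normal from the author's account.
    return post.action is not Action.TAKEDOWN

def visible_to_public(post: Post) -> bool:
    # Takedowns and shadowbans both vanish from public view; a
    # downranked post is technically reachable but rarely surfaced.
    return post.action not in (Action.TAKEDOWN, Action.SHADOWBAN)
```

The gap between the two functions is the plausible deniability described above: an author cannot easily prove a suppression they cannot see.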
Companies present moderation as a matter of safety, not politics. Yet the criteria for ‘community standards’ are often shaped by advertiser demands and geopolitical pressure. When a platform’s ‘brand safety’ filter blocks footage of a protest or atrocity, it is not protecting users — it is protecting revenue streams. The moderation process thus extends market logic into the domain of collective memory.
Human rights organisations across Latin America report that moderation errors disproportionately affect archival and activist content. In Argentina, footage from the trials of the military juntas has been removed for ‘graphic violence’. In Chile, photographs documenting the 2019 protests disappeared from Instagram during the peak of unrest. Each case illustrates how automation can misread context. Instead of balancing rights, platforms tend towards over-removal, sacrificing accuracy for reduced liability.
Appeals mechanisms frequently fail non-English speakers. Forms, help pages and responses are often written in corporate English, creating a linguistic barrier that mirrors geopolitical hierarchies. For many small organisations, translation becomes a cost of survival. Without accessible, multilingual routes to challenge decisions, false positives compound into silence.
Moderation policies also privilege ‘recency’ and ‘engagement’, favouring viral visibility over historical relevance. Archival uploads, therefore, perform poorly in algorithmic ranking, disappearing into digital obscurity. In societies marked by decades of silence, this temporal bias erodes the fragile continuity between historical truth and civic education — a slow deletion rather than an abrupt purge.
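A toy scoring function shows how that temporal bias compounds. The half-life below is an assumed parameter for illustration, not any platform's published formula.

```python
import math
from datetime import datetime, timedelta, timezone

def ranking_score(engagement: float, posted: datetime,
                  half_life_days: float = 7.0) -> float:
    """Toy feed score: engagement discounted by exponential recency decay.

    Purely illustrative; with a one-week half-life, a post's score
    halves every seven days regardless of its historical value.
    """
    age_days = (datetime.now(timezone.utc) - posted).total_seconds() / 86_400
    return engagement * math.exp(-math.log(2) * age_days / half_life_days)

now = datetime.now(timezone.utc)
fresh = ranking_score(10_000, now)                           # ~10000.0
archival = ranking_score(10_000, now - timedelta(days=365))  # ~2e-12
# With identical engagement, a year-old testimony has been halved some
# 52 times before its historical relevance is ever considered.
```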
The convergence of automation and commercial logic turns moderation into infrastructural governance. Decisions once made by public institutions are now executed through private code. When evidence of human-rights violations becomes ‘content’, it enters a marketplace of visibility governed by opaque incentives. Platforms claim neutrality, yet their algorithms embody political choices. The absence of public oversight allows silent distortions to flourish.
Who loses visibility
The victims of algorithmic erasure are rarely those with institutional protection. Grassroots journalists, independent archivists and memory activists occupy the margins of visibility by design. Their work challenges official narratives and therefore triggers moderation systems tuned for commercial neutrality. When a community radio station in Córdoba posts testimonies from survivors of police violence, its reach collapses overnight. The algorithm detects ‘graphic content’, not civic pedagogy.
This invisibility compounds structural inequality. Larger outlets can appeal through public-relations teams or legal departments; smaller collectives rely on luck and solidarity. In practice, digital moderation reproduces the media hierarchies that transitional justice sought to dismantle. The communities that risked most to document the past are again rendered peripheral. For them, deletion is not abstract but personal — a repetition of historical erasure in a new medium.
Across Latin America, women, LGBTQIA+ activists and Indigenous groups face higher rates of content removal. Feminist collectives denouncing dictatorship-era sexual violence are routinely flagged for ‘adult content’. Indigenous communities digitising oral histories also see their material labelled as ‘hate speech’ owing to mistranslated terms. Algorithms designed around Anglo-American linguistic norms often fail to comprehend local idioms of resistance. Bias becomes structural, embedded in syntax and semantics.
For Afro-descendant organisations in Brazil and Colombia, visibility depends on navigating double marginalisation — racial and geographic. Researchers have documented dozens of takedowns involving anti-racist memory campaigns. Each removal reinforces a familiar colonial hierarchy: deciding whose suffering is acceptable to see. Moderation metrics flatten historical difference into generic categories of ‘violence’ or ‘sensitivity’, erasing context in pursuit of comfort. In the digital archive, whiteness remains the default frame.

Diaspora voices in the UK and Spain encounter a subtler form of silencing. Their posts are not always removed; they are often downranked into invisibility. Spanish-language testimonies about Operation Condor trials reach only a fraction of potential readers, while English summaries by international NGOs circulate widely. Linguistic bias thus determines transnational empathy. Algorithmic translation amplifies what is already legible to the North and mutes what resists easy consumption.
This asymmetry shapes how transitional justice is remembered beyond Latin America. In Madrid and London, diasporic communities use digital platforms to commemorate the disappeared and educate new generations. Yet their archives depend on corporate servers that treat them as content producers rather than custodians of history. The result is a fragile ecology of remembrance, vulnerable to platform redesigns. When memory lives in the cloud, sovereignty drifts elsewhere.
The human toll of invisibility extends beyond data metrics. For families of the disappeared, each removed post can feel like a second death — a symbolic annihilation of testimony. Digital repression operates through exhaustion: constant appeals, uncertain outcomes and the fear that years of work may vanish overnight. This psychological burden mirrors the intimidation once exerted by military regimes. Erasure becomes automated grief.
Reclaiming visibility requires understanding power as both technical and affective. Algorithms amplify what fits predefined templates and demote what unsettles them. The challenge for memory activists is not only to archive but to outsmart moderation logics. Collaborative projects such as Memoria Abierta and the Archivo de la Memoria Trans experiment with decentralised back-ups and federated platforms. Their work turns remembrance into resistance — proof that even under digital constraint, memory finds a way to breathe.
Towards digital transitional justice
The concept of transitional justice emerged to address the aftermath of dictatorship and conflict through truth commissions, reparations and institutional reform. In the digital era, justice extends beyond the courtroom into the infrastructures that mediate memory. The challenge is no longer only to expose state crimes but to ensure their documentation remains visible online. This necessity gives rise to a new framework: digital transitional justice.
At its core, digital transitional justice recognises that archives, like courts, require safeguards. Evidence can no longer be separated from its medium; when the data disappears, accountability disappears with it. Scholars argue that digital preservation is itself a human right. Mechanisms are therefore needed to ensure that digital evidence of atrocities receives the same protection as physical documents. Without them, history remains hostage to algorithms.
Policy debates in Europe and Latin America are beginning to converge on this realisation. The European Union’s Digital Services Act introduces transparency obligations for content moderation, and regional NGOs advocate adapting those obligations for the Global South. Groups such as Derechos Digitales and ARTICLE 19 propose national frameworks that would treat human-rights archives as protected heritage. Under this model, platforms would require judicial review before deleting verified historical material. The goal is procedural fairness rooted in memory ethics.

Practical initiatives already hint at solutions. The Mnemonic project, in collaboration with WITNESS, develops decentralised databases for storing visual evidence of atrocities, enabling duplication across multiple servers and reducing dependency on any single platform. Latin American organisations are experimenting with similar methods, combining open-source tools and institutional partnerships. These prototypes translate the ethos of truth commissions into the digital domain: collective accountability through redundancy.
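The redundancy these projects rely on is simple enough to sketch. The snippet below is a hypothetical illustration of the principle, not Mnemonic's actual code: each copy of a file is named by its SHA-256 hash, so any holder can later verify integrity without trusting the others.

```python
import hashlib
import shutil
from pathlib import Path

def mirror_evidence(source: Path, mirrors: list[Path]) -> str:
    """Copy a file to several independent locations, keyed by its hash.

    Hypothetical sketch: the returned digest acts as a content address.
    As long as one honest mirror survives, deletion elsewhere is
    recoverable and tampering is detectable.
    """
    digest = hashlib.sha256(source.read_bytes()).hexdigest()
    for mirror in mirrors:
        mirror.mkdir(parents=True, exist_ok=True)
        shutil.copy2(source, mirror / f"{digest}{source.suffix}")
    return digest

def verify_copy(copy: Path, expected_digest: str) -> bool:
    # Recompute and compare: no trust in the storage provider required.
    return hashlib.sha256(copy.read_bytes()).hexdigest() == expected_digest
```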
Education also forms part of digital transitional justice. Activists in Argentina and Chile run workshops teaching young people how to recognise and report wrongful takedowns. The sessions blend historical testimony with digital literacy, linking past censorship to present algorithms. By understanding how visibility is engineered, participants learn to contest its hierarchies. Memory work thus becomes civic education — a bridge between history and algorithmic citizenship.
Transparency remains the weakest link in platform governance. Companies disclose takedown numbers but rarely detail the reasons or geographic patterns behind them. Without granular reporting, researchers cannot assess systemic bias. Digital transitional justice demands metrics that reveal whose voices vanish most often — and why.

Some governments are exploring legal recognition for ‘digital cultural heritage’. Paraguay’s Ministry of Culture has proposed a registry for online archives linked to human-rights documentation. In Argentina, CELS and the Ministry of Justice are drafting guidelines for the ethical handling of digital evidence in trials. These steps signal a shift from reactive to proactive policy. Justice, once confined to physical institutions, begins to inhabit the cloud.
In the end, digital transitional justice is about continuity rather than novelty. By linking archival preservation with governance reform, it redefines remembrance as civic infrastructure. The South American experience — born of dictatorships and rebirth — offers lessons for a world where deletion is the new denial. Protecting memory now means programming justice into the code itself.
Minimum safeguards for platform governance in memory work
To move from critique to construction, societies need standards that protect digital memory as rigorously as physical archives. Platform governance must recognise historical documentation as a public good, not a liability. The following minimum safeguards prioritise transparency, accountability and accessibility.
Transparency
- Platforms should publish detailed reports indicating what proportion of takedowns involve human-rights content, disaggregated by region and language.
- Independent audits should verify moderation accuracy across the Global South and expose biases in automated detection.
Accountability
- Oversight mechanisms — for example, via the Inter-American Commission on Human Rights — should mirror data-protection authorities, with transparent, multilingual procedures for wrongful-removal complaints.
- Ethical partnerships between platforms and memory institutions could recognise ‘protected heritage content’ categories exempt from automated deletion, aligning corporate policy with human-rights law.
Accessibility
- Appeals should be available in users’ native languages and supported by plain-language guidance. Incorporating local languages into moderation workflows ensures inclusion rather than tokenism.
Technical design
- Open-standard metadata, robust time-stamping and decentralised storage make content harder to erase without a trace; the sketch below illustrates the idea. Community servers hosted in public universities can combine technical resilience with civic oversight to reclaim autonomy over digital heritage.
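As a rough illustration of the first point, this sketch writes a minimal metadata sidecar with a fixity checksum. Field names are assumptions, loosely echoing Dublin Core; a production workflow would take its timestamp from a trusted authority (for instance an RFC 3161 service) rather than the local clock.

```python
import hashlib
import json
from datetime import datetime, timezone
from pathlib import Path

def write_sidecar(item: Path, title: str, rights: str) -> Path:
    """Write an open-format metadata sidecar next to an archival file.

    Illustrative only: field names loosely follow Dublin Core, and the
    timestamp comes from the local clock. A real workflow would use a
    trusted time-stamping service so the record remains independently
    verifiable even if this machine is later compromised.
    """
    record = {
        "title": title,
        "rights": rights,
        "source_file": item.name,
        "checksum_sha256": hashlib.sha256(item.read_bytes()).hexdigest(),
        "recorded_at": datetime.now(timezone.utc).isoformat(),
    }
    sidecar = item.with_name(item.name + ".json")
    sidecar.write_text(json.dumps(record, ensure_ascii=False, indent=2))
    return sidecar
```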
Public engagement
- Education and advocacy are essential to sustain change. Journalists, grassroots organisations and educators can translate complex policy debates into civic language and demand representation in moderation processes.
In the end, minimum safeguards mark a beginning, not an end. They transform remembrance from a fragile upload into a durable right. By embedding accountability, accessibility and transparency into platform design, societies honour the unfinished work of truth commissions. Memory survives when defended by structure as well as sentiment.
References
European Union. (2022, 19 October). Regulation (EU) 2022/2065 of the European Parliament and of the Council on a Single Market for Digital Services (Digital Services Act). Official Journal of the European Union, L 277, 1–102. https://eur-lex.europa.eu/eli/reg/2022/2065/oj/eng
UNESCO. (2009). Archives of Terror (Paraguay) — Memory of the World Register. https://www.unesco.org/en/memory-world/archives-terror
Comisión Nacional sobre la Desaparición de Personas (CONADEP). (1986). Nunca más: The report of the Argentine National Commission on the Disappeared. Farrar, Straus & Giroux. https://archive.org/details/nuncamsreportofa0000arge
McSherry, J. P. (2005). Predatory states: Operation Condor and covert war in Latin America. Rowman & Littlefield.
Inter-American Commission on Human Rights, Office of the Special Rapporteur for Freedom of Expression (RELE). (2024). Digital inclusion and internet content governance. Organization of American States. https://www.oas.org/en/iachr/expression/reports/Digital_inclusion_eng.pdf
ARTICLE 19. (2024, 17 April). Content moderation and local stakeholders in Colombia. https://www.article19.org/wp-content/uploads/2024/04/Content_Moderation_and_Local_Stakeholders_in_Colombia_17Apr.pdf
WITNESS. (2013). Activists’ guide to archiving video. https://commonslibrary.org/wp-content/uploads/Activists-Guide-to-archiving-video.pdf
Mnemonic. (n.d.). About. Retrieved 12 November 2025, from https://mnemonic.org/en/about/
The Santa Clara Principles. (n.d.). The Santa Clara Principles on Transparency and Accountability in Content Moderation. Retrieved 12 November 2025, from https://santaclaraprinciples.org/
Meta. (n.d.). Types of content we demote (Transparency Center). Retrieved 12 November 2025, from https://transparency.meta.com/features/approach-to-ranking/types-of-content-we-demote/
World Federation of Advertisers (WFA). (2022, 17 June). GARM Brand Safety Floor + Suitability Framework. https://wfanet.org/knowledge/item/2022/06/17/GARM-Brand-Safety-Floor–Suitability-Framework-3
Tech Policy Press. (2024, 8 July). A review of content moderation policies in Latin America. https://techpolicy.press/a-review-of-content-moderation-policies-in-latin-america
UNESCO. (2003, 15 October). Charter on the Preservation of the Digital Heritage. https://www.unesco.org/en/legal-affairs/charter-preservation-digital-heritage
Amnesty International. (2020, 14 October). Eyes on Chile: Police violence at protests. https://www.amnesty.org/en/latest/research/2020/10/eyes-on-chile-police-violence-at-protests/
Memoria Abierta. (n.d.). Sobre Memoria Abierta. Retrieved 12 November 2025, from https://memoriaabierta.org.ar/wp/sobre-memoria-abierta/
Archivo de la Memoria Trans Argentina (AMT). (n.d.). Catálogo del Archivo de la Memoria Trans. Retrieved 12 November 2025, from https://atom.archivotrans.ar/
Centro de Estudios Legales y Sociales (CELS). (n.d.). Archivo del CELS. Retrieved 12 November 2025, from https://archivo.cels.org.ar/
World Bank. (2018). Afro-descendants in Latin America: Toward a framework of inclusion. https://documents1.worldbank.org/curated/en/896461533724334115/pdf/129298-7-8-2018-17-29-37-AfrodescendantsinLatinAmerica.pdf
CIPDH-UNESCO. (n.d.). Archivo del Terror (Paraguay) – Memorias situadas. Retrieved 12 November 2025, from https://www.cipdh.gob.ar/memorias-situadas/en/lugar-de-memoria/archivo-del-terror/