Content warning: This article discusses sex work, the policy framing of trafficking, online harassment, and the loss of digital accounts and income.

Toronto, late autumn 2025: the crowd spills onto the pavement after a midweek show. A performer’s phone buzzes with the same message again and again. The notification is tidy, almost polite, as if the platform is simply doing admin. By the time the venue lights come up, the Instagram profile is gone. What remains is a blank space where bookings, posters and years of community once sat.

In Canada’s nightlife economy, a social account is not a vanity project. It is a calendar, a payroll, a safety check in the DMs and a living archive. When it disappears, work does not simply ‘move elsewhere’. It fractures into unpaid labour, awkward explanations to promoters and a scramble to rebuild trust. One Toronto burlesque artist has even used a replacement profile bio to state that Instagram deleted their old account.


Meta’s opaque enforcement is stripping sex worker artists in Canada of income and archives, while reported claims of leniency in trafficking-related enforcement raise serious questions about moral governance. This matters now because platform crackdowns are arriving in waves, not as isolated mistakes. Repro Uncensored has tracked a sharp rise in severe restrictions and removals affecting gender, health and queer organising across Meta’s apps in 2025. This is not a ‘nude or not nude’ debate. It is a question of who gets to circulate culturally without being treated as disposable.

The deletion economy: what disappears when an account vanishes

A deleted account looks like a single loss on a screen. In practice, it is a stack of losses at once. Contacts vanish with the DMs that held them. Past work disappears with reels, posters and tagged photos. A performer’s audience becomes an unreachable crowd. The first cost is always time.

For many artists, Instagram is the main booking route. Promoters search for a name and expect receipts. They want a style, a tone, a back catalogue. A blank profile reads like a risk to them. So work becomes proving you exist again. That proof is labour, and it is unpaid.

The harm is not evenly distributed across scenes. Sex worker artists already operate under stigma. They often keep multiple identities to stay safe. When a platform erases one identity, the ‘clean’ one can be tainted by association. In the worst cases, visibility becomes a trap rather than a resource.

The European Sex Workers’ Rights Alliance describes social media as a lifeline for community and income. Its April 2025 report links deplatforming to economic insecurity, especially for marginalised workers. The report treats shadowbanning and account suspension as part of a wider content moderation system. It also notes the damage done when there is no meaningful appeal.

Artists often describe this as losing infrastructure, not followers. A follower count is just one measurement. The real number is the number of paid gigs that came through the profile. Another is the number of collaborations that required platform tools. Another is the number of screenshots needed to prove past work. Another is the number of hours spent reporting, appealing, and waiting.

Toronto and Montreal scenes rely on fast circulation. A last-minute show needs a last-minute post. A touring performer needs a quick story to share. A new venue night needs a reliable tag. When accounts are restricted, reach becomes unpredictable. That unpredictability is a tax on creativity.

In Canada, criminalisation shapes what ‘risk’ looks like online. Bill C-36 and the Protection of Communities and Exploited Persons Act (PCEPA) frame sex work through ‘exploitation’ language. That framing has consequences that extend beyond the courts and police. It shapes how neighbours report, how the media narrates and how companies anticipate regulation. The result is a platform environment primed for over-removal.

The deletion economy is not only about money. It is also about the loss of a public record. Artists in nightlife build work that is ephemeral by design. Their digital archive is often the only lasting proof. When it is deleted, it can rewrite a career. It can also erase community histories that were already under-documented.

How Meta enforces ‘account integrity’ and why appeals feel opaque

Meta often frames enforcement as a safety function. The language is procedural and standardised. It leans on categories like ‘account integrity’ and ‘human exploitation’. For users, the message is usually vague. For workers, the vagueness can be dangerous. It can imply criminality without explanation.

Meta’s Transparency Center describes disabling accounts as a response to repeated violations. It also describes a strike system in which accumulated violations can lead to removal. That structure matters because it turns context into counting.

Counting is not neutral when categories are broad. When ‘sexual solicitation’ is interpreted expansively, art becomes suspicious. A pole clip can be treated like a transaction offer. A burlesque poster can be treated like adult services advertising. A link in bio can be misread or mass-reported. The strike system then becomes a fast road to deletion.

Appeals are where justice is supposed to live. In practice, appeals often feel like a dead end. UK organisations have reported appeals being rejected within minutes. PinkNews reported account owners saying there was no sign of human review. This experience turns ‘appeal’ into theatre. It also pushes users towards desperation.


Opacity is not only about silence from Meta. It is also about the inability to know what triggered enforcement. Without a clear post-level explanation, creators cannot adjust safely. They cannot know whether words, images or reports drove action. They cannot identify whether a rival or harasser targeted them. And they cannot separate error from policy.

The ESWA report describes a ‘context collapse’ problem in automated moderation. It argues that a lack of transparency makes discriminatory practices hard to track. It also stresses that terms like shadowbanning cover many covert actions, including reduced recommendations and hashtag blocking. The key point is that visibility can be removed without notification.

For sex worker artists, invisibility can be worse than deletion. Deletion is obvious and can trigger community support. A shadowban is quieter and more corrosive. It makes creators work harder for the same outcome. It can nudge them towards safer, more advertiser-friendly content. It can also distort an entire scene’s aesthetic over time.

Meta insists that rules apply equally across groups. The Guardian reports Meta denying enforcement based on affiliation, while campaigners describe patterns that look like targeted chilling. That tension is the story’s centre: governance without democratic accountability; private policy acting like public law.

The anti-trafficking paradox: expansive punishment vs documented tolerance

Anti-trafficking language carries moral weight for a reason. Trafficking is real, violent and worthy of a serious response. The problem is how platforms operationalise that language. ‘Exploitation’ becomes a bucket that catches consensual adult work. Art gets swept into the same net as abuse. The rhetoric becomes a tool for expansive punishment, and the results can harm the people it claims to protect.

In late 2025, a different story about Meta emerged through court filings. The Verge reported deposition testimony alleging a ‘17 strike’ policy for certain trafficking-related violations, describing accounts receiving many chances before suspension. Meta disputed the allegations through a spokesperson statement. The reporting is not proof of policy on its own, but it is evidence of contested governance.

The paradox is not abstract when you put it next to everyday decisions. Sex worker artists describe swift punishment for mild, non-explicit content. Meanwhile, the allegation suggests a high threshold for serious violations elsewhere. That contrast raises a question about incentives. It suggests engagement and growth may shape enforcement priorities. It also suggests ‘safety’ can be selectively applied.

The Guardian’s reporting on recent waves adds another layer. It describes accounts removed or restricted across queer and reproductive health spaces. Meta describes many disabled accounts as ‘correctly removed’ while also acknowledging some were errors. The pattern looks like heavy removal followed by partial reinstatement.

If enforcement is erratic, it becomes a form of structural power. It disciplines speech through uncertainty. It encourages self-censorship and coded language. It pushes creators into darker corners of the internet. It also makes workers easier to scam because they are desperate for reinstatement. Platform opacity becomes a market for intermediaries.

The ESWA report points to double standards inside platform economies. It cites evidence that adult content can slip through when it is profitable, including 2025 findings that thousands of pornographic adverts were allowed through in 2024. The implication is not that porn should be banned. The implication is that enforcement is shaped by profit, not just principle.

This is why ‘nudity’ is the wrong axis for debate. The axis is labour and legitimacy. Who is treated as a creator, and who is treated as a liability? Who gets channels for appeal, and who gets automated rejection? Who is granted presumed innocence, and who is smeared by category labels? A moral algorithm does not need to be consistent to be effective. It only needs to make some people afraid.

The UK case studies show the cultural cost clearly. PinkNews reported a combined loss of tens of thousands of followers across deleted accounts. It also reported a founder saying deletion destroyed years of community-building. That language mirrors what Canadian performers say, privately and publicly: the platform takes away not just reach, but the social fabric that makes art possible.

The anti-trafficking paradox is also a political story. It shows how ‘protect the vulnerable’ rhetoric can be used to expand control. It also shows how corporations manage public trust by invoking safety. When the same corporation is accused of tolerating harm elsewhere, the rhetoric collapses. The question becomes: who is protected, and who is made expendable?

Canada’s legal backdrop: PCEPA, Bill C-36 and the politics of ‘exploitation’

Canada’s sex work framework shapes how platforms interpret risk. PCEPA and Bill C-36 have been criticised for criminalising the conditions around sex work. The law’s moral framing can bleed into corporate policy. It also shapes public reporting, including malicious reporting. A platform does not operate in a vacuum. It absorbs the values of its loudest regulators and advertisers.

A 2024 position statement on sex work and migration describes PCEPA as creating offences that make many sex work-related activities illegal. It stresses that migrant and gender-diverse workers face compounded harms through surveillance and policing. It also notes that the law is often an entry point for state intrusion into migrant lives. This matters because platform enforcement often mirrors state suspicion. When the state treats sex work as a nuisance, platforms can treat it as brand risk.

Active History’s 2024 essay frames Bill C-36 as pushing street-based workers into more isolated conditions. It connects criminalisation to a reduced ability to screen and negotiate. The point is not only legal theory. It is material safety. When safety resources move online, deplatforming becomes a physical risk factor. Visibility and screening can be harm-reduction tools. Removal can undo that.


Now bring that into the arts. Nightlife work often overlaps with sex work because both are precarious labour markets. Both rely on networks, referrals and reputation. Both can involve stigma, policing and harassment. When a performer loses their Instagram, they lose the bridge between these economies. They also lose the ability to control the narrative about their own work. That narrative control is a safety measure.

Canada’s scene is also shaped by race and migration. Migrant sex workers face unique constraints, including immigration rules and community surveillance. The 2024 position statement links these constraints to human rights concerns and calls for full decriminalisation as a pathway to safety. Online enforcement that treats sex work as inherently suspect worsens these dynamics. It can push migrant workers towards riskier, less visible arrangements.

Platforms often claim they are simply enforcing ‘community standards’. Yet standards are cultural artefacts. They are made through social panic, policy lobbying and advertiser demands. When ‘exploitation’ is used as an umbrella, consensual work becomes easier to erase. The same word can mean rescue or punishment depending on who wields it. In Canada, that tension is built into the legal architecture.

This is why this piece is about governance, not morality. The question is not whether burlesque is art. It is also not whether sex work is work (though it is). The question is who gets to decide what counts as legitimate culture. Meta and its automated systems now make that decision at scale, with consequences in Toronto clubs and Montreal studios.

The Guardian describes affected organisations being offered vague explanations and closed-door briefings. It also reports critics saying those meetings reinforce power imbalances. That dynamic has a Canadian echo, even when the case study is global. Platforms ask communities to adapt, rather than fixing governance. The burden shifts from corporation to precarious worker.

The legal backdrop also shapes public sympathy. Anti-trafficking narratives can make it harder to defend sex worker artists publicly. People fear being seen as causing harm. That fear can isolate workers when accounts are deleted. It can also discourage venues and institutions from solidarity. Yet conflation is itself a harm, as ESWA argues. The report calls for ending the conflation of sex work and trafficking in policymaking.

Toronto and Montreal case files: cultural work as survival

In Toronto, a performer’s brand is often stitched together from scraps: a poster on a feed, a clip from a rehearsal and a tip link. It is also private logistics in DMs. When the account goes, those threads snap. The performer becomes harder to book overnight. They also become harder to vouch for. The platform holds the receipts that the scene relies on.

Some losses are immediate and easy to name. A cancelled gig means a specific amount of money. A venue holding back payment because the promotion failed is another. A collaborator who cannot find you is a third. A festival clip that disappears is a fourth. Together, they look like an invisible layoff.

Other losses are slower and more psychological. An archive is not only marketing. It is memory and proof of growth. For marginalised artists, it can be evidence against erasure. Losing it can feel like a second closet. It can also trigger a fear of posting anything at all. That fear shrinks scenes from the inside.

The Toronto artist who writes that their old account was deleted is grieving in public. It is also a warning to others in the same labour market. A replacement account is a new starting line. It asks audiences to re-follow and re-trust. It asks promoters to take a chance again. It also invites the same enforcement again.

Montreal’s nightlife ecosystem has its own rhythms. Language, venue cultures and community networks shape what travels. Yet the platform layer is shared with Toronto. The same Meta rules govern a bilingual poster and an English caption. The same report button sits under a burlesque image and a club flyer. Transnational platforms flatten local nuance.


The ESWA report stresses that platform discrimination affects many marginalised communities. It explicitly links sex worker discrimination to wider debates about platform governance. It also highlights how automated decision-making struggles with context. That matters in nightlife, where context is everything. A glittered costume can be a celebration, not a solicitation. A backstage selfie can be community, not commerce.

In both cities, deplatforming also intersects with harassment. A deleted account can make a worker easier to impersonate. A restricted account can make them harder to contact safely. A shadowban can push them to accept riskier outreach channels. Even without explicit threats, the environment becomes less controllable. Control is a form of safety for precarious labour.

The Guardian reports that more than half of some flagged accounts were reinstated, including ones taken down in error. That pattern creates a cruel loop for creators. You are asked to trust a process that admits it makes mistakes. You are also asked to absorb the cost of those mistakes. For artists who live gig to gig, a week offline can be the rent. For workers managing safety risks, a week offline can be exposure.

So artists and workers organise around redundancy. They share warnings in private chats. They swap screenshot templates and appeal language. They build community lists that do not depend on Meta. They also learn to read the platform’s moods as weather. That weather report is not a creative choice, but it becomes part of the job.

The UK mirror: when sex-positive arts get erased

The story is not contained within Canada’s borders. UK organisations have described a similar pattern of sudden deletions. PinkNews reported multiple queer-inclusive arts and sex-positive projects being removed, alongside claims that appeals were rejected almost instantly. The reported follower losses are a proxy for lost reach and lost work.

What connects these cases is not geography, but governance. A platform built in Silicon Valley sets rules for a Toronto burlesque show and a London sex-positive night. The same enforcement categories travel across borders. The same moral assumptions about bodies and labour are exported. The result is a transnational pattern of deplatforming. The pattern is not always intentional, but it is consistent in outcome.

The UK case also clarifies the stakes for organisers. PinkNews quoted a founder describing deletion as destroying years of community-building. That community is not only online. It includes real-life events, mutual aid and cultural space-making. When an account disappears, that work is made harder to sustain. Culture becomes more fragile because its infrastructure is treated as disposable.

The Guardian’s reporting adds a wider frame. It describes bans and restrictions affecting queer groups and reproductive health providers. It includes Meta’s denial of discriminatory enforcement. It also includes activists describing a power imbalance in closed-door engagements. That language maps onto arts organisers who cannot get a straight answer. The appeal process becomes a black box that shapes public culture.

This is vital for Rock&Art. Sex-positive arts have always been integral to British cultural life. Performance spaces and community festivals are not only labour markets; they are essential spaces where queer and migrant communities assert belonging. When platforms erase them, they are not just removing ‘content’; they are disrupting how people connect and form communities.


The anti-trafficking paradox also matters in the UK mirror. The Verge report about an alleged high strike threshold for serious violations lands badly beside rapid deletions of marginalised art spaces. Even as an allegation, the contrast demands scrutiny. It also undercuts the claim that harm reduction is the primary driver.

The ESWA report is useful here because it links sex worker discrimination to platform accountability. It argues that solutions are not only individual workarounds. It points towards regulatory safeguards and transparent redress. It also highlights cultural and civic entities as part of the solution. That includes arts organisations, venues and unions. It is a collective problem because the harm is collective.

If the UK mirror teaches anything, it is that going quiet does not guarantee safety. Some organisations report being removed even when the content is non-explicit. Others report restrictions on educational or artistic nudity. The target is not simply explicitness. The target is anything that triggers a conservative risk model. That model is moral, economic and political at once. The only honest response is to treat it as governance.

A resilience toolbox: what helps, what does not

Toolkits can sound like self-help when the problem is structural. This is not a promise of immunity. It is a way to reduce the blast radius when an Instagram account is deleted. It also recognises that every workaround carries risk. Some workers prefer visibility; others prefer safety through quiet. Both strategies can be valid.

Start with documentation, because memory is not evidence. Screenshot every warning, strike and appeal screen you receive. Keep a dated log of actions you took and responses you got. Save copies of key posts and captions off-platform. This is boring work, but it can help when you need to argue your case. Meta’s own strike logic makes counting central, so documentation matters.

Build audience channels that you control, even if they feel less glamorous. A mailing list is slow, but it is portable. A website with a simple calendar can reduce dependence on one platform. A link hub is useful, but it can also trigger enforcement if it looks like adult-services routing. Keep versions of your links and note which ones were live when a warning arrived. This turns guesswork into a clearer timeline.

Separate work streams when possible, but do not treat separation as shame. Many people keep a public arts profile and a private worker profile. Others keep a single, integrated identity because splitting can be exhausting. The choice should be yours, not Meta’s. If you separate, be careful about cross-tagging and shared phone numbers. If you integrate, be careful about the language that automated systems misread.

Plan for community-based recovery, not only platform-based appeal. When accounts are deleted, community signal can help. A trusted venue can repost your new account without naming the reason. A collective can share a link in a newsletter rather than in a story. Friends can help rebuild an archive by sending you images they saved. This spreads the labour so you are not alone.

Be cautious about ‘fast recovery’ services. Desperation creates a market for scams. Opaque appeals and instant rejections create panic, and panic is profitable. The Guardian reports affected groups receiving vague reasons and little support, which is exactly the environment scammers exploit. If someone offers a guarantee, treat it as a red flag. Focus on what you can verify and control.

Use institutions where they can help, and name their limits. Arts venues can write letters confirming bookings and legitimacy. Festivals can archive programmes and line-ups off-platform. Unions can collect anonymised case studies to show patterns. Universities and researchers can support public evidence, like ESWA’s accountability work. None of these steps fixes Meta, but together they build a record that Meta cannot erase.

Finally, insist on the political frame when you talk about this publicly. This is not a personal failure to ‘follow the rules’. It is a power imbalance between precarious cultural labour and corporate governance. PinkNews’ reporting shows how quickly years of work can be wiped out in the UK, and Canadian scenes recognise that risk too. The Verge’s reporting shows that even serious harm categories can be governed by contested thresholds. Put together, the picture is not about morality, but accountability. That is where the fight belongs.

Sources

The Guardian, 11 December 2025 (last modified 15 December 2025) — Reported a global wave of Meta restrictions, quotes from Repro Uncensored and Meta’s stated position on enforcement and reinstatements.

PinkNews, 8 December 2025 — Documented UK queer and sex-positive arts accounts being deleted, including reported follower losses and rapid appeal rejections, with founder testimony.

The Verge, 25 November 2025 — Reported deposition testimony alleging a ‘17 strike’ threshold for certain trafficking-related violations, used here as an explicitly labelled allegation.

European Sex Workers’ Rights Alliance, Sex Workers Belong on Social Media, April 2025 — Worker-led analysis of discrimination, shadowbanning, deplatforming and recommendations for platform accountability.

Gender Justice in Migration, Sex Work and Migration (position statement), February 2024 — Summarised PCEPA and Bill C-36 impacts with attention to migrant and gender-diverse workers and human rights concerns.

Active History, ‘[title not provided]’, 3 December 2024 — Contextualised Bill C-36 and safety impacts, used here to connect criminalisation with screening and harm reduction.

Meta Transparency Center, ‘Disabling accounts’ page (accessed via indexed result, July 2025) — Described strike-based enforcement logic referenced here to explain the counting model.

Public testimony in a Toronto burlesque performer’s Instagram bio (indexed result, 2025) — Used as an on-the-record example of account deletion affecting cultural workers, without sharing handles.




Sarah Beth Andrews (Editor)

A firm believer in the power of independent media, Sarah Beth curates content that amplifies marginalised voices, challenges dominant narratives, and explores the ever-evolving intersections of art, politics, and identity. Whether she’s editing a deep-dive on feminist film, commissioning a piece on underground music movements, or shaping critical essays on social justice, her editorial vision is always driven by integrity, curiosity, and a commitment to meaningful discourse.

When she’s not refining stories, she’s likely attending art-house screenings, buried in an obscure philosophy book, or exploring independent bookshops in search of the next radical text.

Sophie Renard (Author)

Bonjour! I’m Sophie, a cultural journalist who loves exploring the connections between the past and the present. My passion lies in bridging classical and contemporary art, showing how the old can inform the new. Join me as we journey across time and culture, uncovering the stories that link yesterday’s masterpieces with today’s creative expressions.

