February 18 2025, Asia Meetup

The Asia Regional Meetups are bimonthly text-based gatherings that bring together folks from the Asian region to share, connect, seek help, and release stress by celebrating each other. They are also a time for us to find ways to support each other and to help us understand what is happening in our part of the world. If you cannot attend the meetups, we are taking notes of each gathering and linking to them below.
The Asian community stays connected during the week in different ways: through the Asian channel on the TCU Mattermost, or via events organized on various topics throughout the year.
Date: Tuesday, February 18 2025
Time: 5:30pm IST/ 6:30pm MMT / 7am EST / 12pm UTC (What time is it in my city?)
Who: Facilitated by Sapni
Where: Text-based format in the Regional Asia channel on the TCU Mattermost.
- Don't have an account on the IFF Mattermost? You can request one by following the directions here.
Lezismore, Taiwan’s largest privacy-first platform supporting sexual minorities, will be sharing their work, experiences, and challenges with community work in the region.
Notes
Shin is a law and tech professional working on projects in the space of sexual minorities and content moderation, making online spaces and communities accessible and equitable. To begin with, we'd love to know more about you and your projects, particularly Lezismore!
This is a self-hosted, self-governed, community-driven platform for Taiwan’s queer community. It’s a space that aims to balance privacy with accountability, freedom with safety: a balancing act that might sound simple on paper but is incredibly complex once you’re in the trenches. What makes us different is that we support sex-positive content, the platform is free to use, and we collect no personal data.
Tell us more about Lezismore. What is the etymology of the name and the story behind its inception?
Around 2013, many independent Taiwanese websites and platforms serving sexual minorities were forced to shut down. The primary reason was that large platforms like Facebook had already cornered the mainstream advertising market, leaving these niche sites with neither funding nor a feasible path to survival. As a result, countless members of the LGBTQ+ community found themselves displaced onto mainstream social networks—where posting sex-positive content often led to account deletions, harassment, or other forms of pushback.
In 2015, I decided to create a platform of our own to address these challenges—a space where our community could truly flourish. The name “濡沫” (Ru Mo) originates from a passage in the ancient Chinese text Zhuangzi which says, “相濡以沫,不如相忘於江湖”—a story of two fish in a drying pond who sustain each other by sharing their saliva, yet ultimately long to swim freely in the vast river beyond. This concept also inspired our phrase “Lez is more,” underscoring how, in a difficult environment, we support one another for survival—but the end goal is to be free like fish in a boundless ocean.
Lezismore is our English name; we wish everyone here (lesbians and other sexual minorities) could be "more".
Beyond Lezismore, I’ve participated in various workshops and collaborative programs—like the upcoming “Connective (t)issues Workshop” this March—where I explore how digital ecosystems can be thoughtfully structured not just through code, but through norms, policies, user flows, and especially the subtle, often “invisible” design decisions. Too often, we prioritize convenience in user experience and narrow down options in a way that unintentionally stifles self-expression—something especially problematic for queer communities that thrive on diverse identities and nuanced self-representation. My focus is on crafting frameworks that truly empower users by embracing complexity, rather than pushing them into a handful of neatly packaged categories.
We also face ongoing challenges, because Lezismore emerged from two driving forces. First, we saw that mainstream platforms often take a hardline stance against “explicit” or “sexual” content in the name of fighting pornography. That policy might seem understandable at scale, but it disproportionately silences queer communities for whom discussions about sex, identity, and self-expression are integral to personal dignity and visibility.
Can you explain a bit what kind of spaces Lezismore provides? Are those forums, social networking, list of resources, or something else altogether?
It’s essentially a community composed of multiple open-source platforms. Our main site, built with WordPress, features podcasts, columns, and written pieces by queer writers and gender-focused commentators. Alongside that, we have a separate community forum—also built on open-source software. Together, our main site and community forum engage a total of over fifty thousand users every month. It functions like a traditional forum but also allows private messaging, and its user interface draws on the design aesthetic typical of the 2010–2020 era.
Most users spend most of their time on the forum-style community platform. They often overlook the main site—which actually hosts plenty of advocacy articles—and instead head straight to the forum for dating, hookups, and sex-positive conversations.
I have to say, users on our platform experience all sorts of outcomes. Some face harassment or have negative encounters, but others have found short- or long-term partners here, even gotten married and started families. For many in Taiwan’s queer community, it’s simply become a part of everyday life.
The second driving force is legal. In Taiwan there’s a criminal law commonly referred to as the “crime of fostering sexual intercourse.” In simple terms, if someone operating a platform in Taiwan profits from hosting user-generated sexual content, they can be held criminally liable. That legal landscape puts platform administrators in a precarious spot, especially if they want to create a safe environment for open sexual discourse without risking potential jail time. And somebody ACTUALLY went to jail for this crime in Taiwan. This makes it very difficult for us to survive.
These factors led us to build an “exclusive” online space that’s more than just a ban-all or allow-all approach. We wanted somewhere that protects sexual expression as a core aspect of queer identity—while still respecting local laws and avoiding blanket censorship. Sex, in many ways, is the canary in the coal mine for digital free speech: once sexual content is broadly suppressed, other crucial forms of queer communication often end up marginalized, too.
Given your background in law and tech, what points of friction do you see in global content moderation regulatory frameworks, which can be eased through the learnings from RuMo/Lezismore?
Most global content moderation frameworks are either too centralized (think big social media, where algorithms or faceless moderators make sweeping decisions) or too fragmented (federated platforms where different servers have wildly different norms). What we discovered at Lezismore is that neither extreme fully addresses the realities of smaller, culturally specific, or marginalized communities.
- Overly broad policies can miss context: Hate speech or harassment guidelines often come from Western-centric viewpoints, ignoring local nuances in language or culture. At Lezismore, we saw how crucial it is to have moderation that is culturally fluent—able to tell the difference between a local slang expression and a genuine threat.
- One-size-fits-all approaches can be exclusionary: Requiring real-name verification might protect some platforms from “trolls,” but it can also deter users who fear for their safety in conservative or discriminatory environments. Our experience showed that trust-based governance—where the community can see a user’s history and patterns of behavior—can work better than rigid ID checks.
- Mob rule vs. no rule: Many frameworks fall into the trap of letting the loudest voices dominate. We had to learn to avoid “mass-report = ban” scenarios by requiring context for every report and using transparent audit logs. This was crucial to resist harassment campaigns disguised as moderation requests.
In short, we learned that localized, participatory, and context-aware moderation—what you might call “community-led” or “self-governed”—is not just an idealistic dream. It’s a real, if challenging, option that can inform global regulatory standards.
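The "context for every report, plus a transparent audit log" pattern described above can be sketched in a few lines. This is a hypothetical illustration of the idea, not Lezismore's actual implementation; all names here (`Report`, `ModerationQueue`, etc.) are invented for the example:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class Report:
    reporter: str
    target_post: str
    context: str  # required free-text explanation of why this was reported

class ModerationQueue:
    """Reports without context are rejected outright, and every decision is
    appended to a public audit log, so a flood of empty mass-reports cannot
    trigger an automatic ban."""

    def __init__(self):
        self.audit_log: list[dict] = []
        self.pending: list[Report] = []

    def submit(self, report: Report) -> bool:
        # Refuse context-free reports: this is what blocks "mass-report = ban".
        if not report.context.strip():
            self._log("rejected", report, reason="missing context")
            return False
        self.pending.append(report)
        self._log("queued", report, reason="awaiting human review")
        return True

    def resolve(self, report: Report, action: str, moderator: str) -> None:
        # A named human moderator records an explicit action; nothing is automatic.
        self.pending.remove(report)
        self._log(action, report, reason=f"decided by {moderator}")

    def _log(self, action: str, report: Report, reason: str) -> None:
        # The audit log is append-only and intended to be publicly visible.
        self.audit_log.append({
            "time": datetime.now(timezone.utc).isoformat(),
            "action": action,
            "post": report.target_post,
            "reason": reason,
        })
```

The design choice to make even rejections visible in the log is what enables the community to spot harassment campaigns disguised as moderation requests.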
So the answer Big Tech should be looking at to address moderation issues is to hire well, and more locally?
We should build more independent platforms using OSS. As a former product manager, I have to say business decisions are profit-driven, and that's hard to change because of Big Tech's scale.
In the past, for instance, I used Big Tech platforms, and over time I’ve gradually transitioned away from them. It’s definitely been a journey—one that highlights how each step of trial, error, and experimentation is essential to figuring out what truly works. That leads to the next part I want to share: the open-source spirit is not only about technical things, but also about the "strategies" we use across different platforms (including Big Tech's)—how and when we use them, what the risks are, and what the backup plan is.
There’s much more to it than just picking the “right technology.” We have to decide which platforms to adopt—sometimes even using multiple platforms at once for both functionality and risk management. That kind of knowledge and experience should be shared, so people can choose and adapt based on their own needs. I don’t believe open-source software is the only solution; some communities simply don’t have the resources to build and maintain an entirely self-hosted system on their own.
That’s where a broader understanding of “open-source” as more than just code can help. We should be sharing not only our tech solutions, but also the operational methods, policy frameworks, and community insights that we’ve developed through trial and error. Running a platform—or simply deciding how and when to moderate—is never one-size-fits-all. Open-sourcing the lessons learned, the failures, and even the hidden costs of these decisions can give other communities a blueprint for what might (or might not) work in their own contexts.
The more we can publicly document our experiences, the more we empower other communities to adapt, iterate, and hopefully build platforms that are truly inclusive—not just technically advanced.