Nest Center: Building Mongolia's First AI-Powered Fact-Checking System
Project: Pluma Media
Newsroom size: 10–20
Solution: An AI-powered fact-checking and sentiment analysis system integrated into Nest’s Pluma.media platform, designed to detect bias, disinformation, and underreported topics in Mongolia’s public interest media.
The Nest Center for Journalism Innovation and Development, a pioneering NGO in Mongolia, confronted a critical challenge: combating disinformation while empowering independent journalism. Over five years of building innovative media platforms, the organisation has become a crucial player in Mongolia's media landscape through its commitment to serving underrepresented communities, its development of freelance journalism infrastructure, and its evidence-based approach to fighting false information in a media ecosystem dominated by political and business interests.
The problem: A media ecosystem under strain
Mongolia's media landscape presents a unique paradox. With 460 registered media organisations serving just 3.5 million people, the country has one of the world's highest per-capita media densities. Yet this apparent abundance masks a troubling reality: 75% of major outlets are owned by politicians or businesses, seemingly functioning more as propaganda tools than independent news sources.
"When you look at the market size and ownership structure, it becomes clear that most media organisations aren't run as businesses but as influence tools," explains Dulamkhorloo Baatar, the project lead. With 80% of news coverage focused on parliament and 60% of the population concentrated in the capital Ulaanbaatar, vast swathes of Mongolian society – particularly rural communities and marginalised groups – remain voiceless in mainstream media.
This concentration of coverage creates a perfect storm for disinformation. Without adequate editorial oversight for independent journalists or proper fact-checking resources for smaller outlets, false information spreads unchecked through Mongolia's media ecosystem.
Building the solution: AI-powered tools for truth
The Nest Center's response was ambitious: develop Mongolia's first AI-powered fact-checking and sentiment analysis system, integrated into their existing Pluma.media platform – a space designed for freelance journalists to reach underserved audiences.
The project introduced two key innovations. First, an AI-assisted fact-checking tool that scans incoming content for potential inaccuracies, drawing on a database of 4,400 fact-checks conducted over five years. "We identified five key features where compromised information typically appears," Baatar notes. "If content shows a 20% probability of being false, it's flagged for human review. At 60%, readers see an immediate warning that fact-checkers are investigating, which is then updated with the verdict of the fact-check, clearly labelling compromised information as either false, missing context or misleading."
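The two-tier review flow described above can be sketched as follows. The thresholds (20% and 60%) and the three verdict labels come from the article; all function and field names are illustrative, not Nest Center's actual implementation.

```python
# Sketch of the two-tier flagging logic: a model score routes content
# either to human review or to an immediate reader-facing warning.

FLAG_THRESHOLD = 0.20   # queue for human fact-checkers
WARN_THRESHOLD = 0.60   # show readers an immediate warning

VERDICTS = {"false", "missing context", "misleading"}

def triage(false_probability: float) -> dict:
    """Map a model's false-probability score to an editorial action."""
    if false_probability >= WARN_THRESHOLD:
        return {"action": "warn_readers",
                "note": "fact-checkers are investigating"}
    if false_probability >= FLAG_THRESHOLD:
        return {"action": "human_review",
                "note": "queued for fact-checkers"}
    return {"action": "publish", "note": "no flag"}

def apply_verdict(verdict: str) -> str:
    """After review, label compromised content with one of the verdicts."""
    if verdict not in VERDICTS:
        raise ValueError(f"unknown verdict: {verdict}")
    return f"Labelled as: {verdict}"
```

A score of 0.7 would trigger the reader warning, 0.3 would go to the human-review queue, and 0.1 would publish unflagged, with the warning later replaced by one of the three labels once fact-checkers reach a verdict.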
The second tool analyses sentiment across news sources, showing how different outlets cover the same story. "Readers can see whether they're reading an outlier perspective or mainstream consensus," Baatar explains. Early users report this increases engagement, encouraging deeper reading.
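One simple way to surface "outlier versus consensus" coverage, as the sentiment tool does, is to compare each outlet's sentiment score against the cross-outlet average. This is a hypothetical sketch (the function name, scoring scale, and one-standard-deviation cutoff are assumptions, not Nest's method):

```python
from statistics import mean, stdev

def consensus_view(scores: dict) -> dict:
    """Label each outlet's sentiment score (-1..1) as 'consensus' or
    'outlier' relative to the cross-outlet mean (> 1 std dev away)."""
    mu = mean(scores.values())
    sigma = stdev(scores.values())
    return {
        outlet: "outlier" if abs(s - mu) > sigma else "consensus"
        for outlet, s in scores.items()
    }
```

For example, if three outlets score a story mildly positive and a fourth scores it strongly negative, the fourth is labelled the outlier perspective.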
Building the solution with limited resources
Creating AI tools for Mongolian language presented unique challenges. As a "low-resource language" with limited digital presence, Mongolian lacks the vast datasets that power AI development in English or Chinese. "We have 100 years of newspaper history that isn't digitised," Baatar notes, "and no government plans for a national language model."
The team built most components from scratch, choosing a one-shot model for sentiment analysis while developing a custom fact-checking assistance system. "We couldn't use readily available tools because everything needed localisation. We literally compiled lists of words that commonly appear in problematic content," explains Byambajargal Ayushjav, a member of the technical team.
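The curated word-list approach mentioned above could look something like the sketch below. The terms shown are hypothetical English stand-ins; the real lists are Mongolian-language terms compiled by Nest's fact-checking team, and the matching logic here is an assumption for illustration.

```python
# Hypothetical stand-in for a curated list of terms that commonly
# appear in problematic content (real entries would be Mongolian).
PROBLEM_TERMS = {"miracle cure", "secret plan", "shocking truth"}

def term_hits(text: str, terms: set = PROBLEM_TERMS) -> list:
    """Return the curated terms found in the text, case-insensitively.
    A non-empty result can raise the content's review priority."""
    lowered = text.lower()
    return sorted(t for t in terms if t in lowered)
```

Matches from such a list would be one signal among several feeding the fact-checking assistance system, not a verdict on their own.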
The platform's complexity demanded a full-stack approach with careful integration across separate systems for journalists, readers, fact-checkers, and sources. Finding qualified engineers proved challenging, particularly those fluent enough in English to work with AI documentation while understanding Mongolian language nuances.
Trust: The unexpected barrier
Perhaps the biggest challenge wasn't technical but cultural. "We planned to onboard 50 news organisations but ended up with seven," said Baatar. "Even though we've operated for five years, asking for access to archives required executive-level decisions. The level of trust among Mongolian media organisations simply wasn't there."
This forced an ethical approach to development. Rather than scraping data without permission, the team pursued collaborative agreements. "We promised partner newsrooms they could use our trained model for their own journalism tools. We wanted this to be the baby of Mongolia's entire media ecosystem, not just ours."
Psychological barriers also emerged. "Journalists still tell us they're afraid of AI, worried it will replace them," Baatar reveals. This fear, combined with language barriers and limited AI literacy, created resistance even among potential beneficiaries.
The opportunities: AI literacy and niche audiences
Despite challenges, the project is reshaping Mongolia's media landscape. The sentiment analysis tool unexpectedly increased reader engagement, with time spent on articles tripling when users could explore different perspectives. A collaboration with another local organisation added an AI chatbot, the Voyager AI widget, that suggests related stories, further deepening engagement.
More significantly, the project catalysed Mongolia's first community of AI-literate journalists. "Nine months ago, these conversations weren't comfortable. Now we have a group that can discuss AI, ask questions, and solve problems together," Baatar notes.
The platform also enables journalists to serve niche audiences economically unviable for traditional newsrooms. "A newsroom might not survive serving 10,000 people with disabilities, but for an individual journalist, that's a substantial audience," Baatar explains.
Lessons for newsrooms
Set ethical standards early: In emerging AI markets, recognise that you are "breaking the ice," and it is your responsibility to proactively set clear ethical standards for AI collaboration in the country.
Prioritise ethics over commercial speed: Emphasise that ethical considerations must precede commercial ones. "Going slower with proper collaboration is better than setting exploitative precedents," Baatar emphasises. This patient approach, prioritising human rights over profits, offers a model for other developing nations navigating AI adoption.
Local understanding trumps resources: The Nest Center's journey demonstrates that meaningful AI innovation doesn't require Silicon Valley resources – just clear purpose, ethical commitment, and deep understanding of local needs.
Explore Previous Grantees' Journeys
Find our 2024 Innovation Challenge grantees, their journeys and the outcomes here. This grantmaking programme enabled 35 news organisations around the world to experiment and implement solutions to enhance and improve journalistic systems and processes using AI technologies.
The JournalismAI Innovation Challenge is organised by the JournalismAI team at Polis – the journalism think-tank at the London School of Economics and Political Science – and is powered by the Google News Initiative.
