Trust and Information - OGM Call, 2025-08-14
Author: Peter Kaminski
Issue: 2025-08-20
On the 2025-08-14 call, the OGM Thursday crew tackled the challenge of navigating information trustworthiness in our current societal and media landscape.
I listened to the recording of the call, and also used AI (™️) to distill themes covered in the call.
Before we start, some quotes from the call, chosen by me:
- Kevin: “Listen for the truth and don’t get scared.”
- Judith: “Credibility is at the heart of this, but there’s discernment associated with credibility, and there’s sourcing, also.”
- Rick: “Trust is something that’s overrated, and if we focus on discernment, verification, truth-seeking....”
- Mike: “We do need to restore shame. A feedback loop only works if you’re embarrassed when you’re caught out.”
- Klaus: “We get drawn into conversations and get excited about something that’s obviously wrong... But when we respond... we respond to this one particular misstatement, that doesn’t really deal with what the intention of [the] misinformation was.”
- Mike: “So it’s emancipation, indoctrination, or discombobulation.”
- Stacey: “How do we make it easier to question with curiosity? Genuine curiosity, not condescending curiosity, because there’s a big difference between the two.”
The following text was curated by me and mainly written by Claude Sonnet 4, using the call and chat transcripts as sources. I also used ChatGPT 5 and NotebookLM powered by Gemini 2.5 to assist Claude in this analysis. I have reviewed the text and believe it's a reasonable representation of the call. However, note that while a summary of a call is useful for structuring an understanding of it, it is not a replacement for listening to the audio yourself.
2025-08-14 OGM Call
Jerry Michalski framed the discussion around interconnected questions that strike at the heart of modern information consumption: What sources do we trust and why? How should we share information responsibly in online communities? How do we distinguish reliable information from conspiracy theories? And importantly, what emerges as traditional journalism continues its “ongoing protracted agonizing death”?
The conversation revealed a community genuinely wrestling with maintaining intellectual rigor while building accessible, trustworthy information-sharing practices. Rather than falling into despair about information chaos, participants engaged constructively with both theoretical foundations and practical mechanics of trust-building in digital spaces.
The East Germany Trust Paradox emerged as a central metaphor: Jerry's observation that extreme surveillance states (where 1 in 10 citizens were informants) paradoxically fostered the deepest underground trust networks, because people's lives depended on accurate information sharing. This contextual, layered nature of trust became a recurring theme, challenging simplistic binary notions of trustworthiness.
Participants shared concrete examples of information challenges they'd encountered, from fabricated stories about Elon Musk offering $500M to the Baltimore Ravens (plausible but completely false) to the persistent problem of misattributed quotes—like a Marcus Aurelius saying that was actually created by Leo Tolstoy. These weren't abstract concerns but lived experiences that demonstrated how even discerning people struggle with information validation.
[Image not included in the current archive. Images may be included in the future.]
Key Themes
The Trust Paradox
The central insight that trust operates in counterintuitive ways. Jerry’s East Germany example—where extreme surveillance coexisted with deep underground networks—suggests trust isn’t binary but contextual and layered. Rick Botelho challenged this framework entirely, arguing that “trust is overrated” and advocating instead for “discernment, verification, and truth-seeking.” This tension between trust-building and verification processes ran throughout the discussion.
Information Validation Crisis
The community grappled with distinguishing reliable from unreliable information, highlighted by real examples participants had encountered. The challenge isn’t just identifying lies, but dealing with the speed at which misinformation spreads. As someone noted, “A lie gets halfway around the world before the truth gets its pants on”—though the group chuckled that even this quote is misattributed.
Economic Incentives vs. Truth
A recurring theme that there’s “no business model” for fighting disinformation while spreading it is highly profitable. The attention economy rewards engagement over accuracy, creating systemic bias toward sensational, often false information. Gil Friend noted this fundamental asymmetry: there’s a clear business model for spreading lies, but none for spreading truth.
Community Standards and Protocols
The need for explicit agreements about how to share information responsibly within trusted communities. Kevin Jones advocated for treating the OGM list with more respect, as a “trusted channel” where members validate sources before sharing. This included practical measures like source validation, providing Archive.is links to bypass paywalls, and clearly labeling speculative content. Jerry suggested even more ambitious feedback loops—ways to gently remind people if they’d shared false claims in the past.
Emotion vs. Logic in Information Processing
Jose’s crucial insight that people follow information sources (like Trump) not because they’re factually accurate, but because they validate existing feelings and worldviews. This revealed the futility of fighting emotional narratives with logical corrections. As Jose put it, people don’t care about the accuracy of Trump’s numbers—they care that “he validates their emotions.”
Communication Complexity Dilemma
The tension between sophisticated analysis (Rick’s academic language) and accessible communication (Mike Nelson’s “bumper sticker” principle). How to maintain intellectual rigor while being understood by broader audiences became a meta-conversation about the very challenge they were discussing. Jerry’s frank feedback to Rick about his writing being unreadable despite his spoken insights illustrated this tension perfectly.
Technology as Both Problem and Solution
Recognition that current platforms are designed for addiction and misinformation spread, while also discussing potential technological solutions. These included digital identity systems that reveal only necessary information, news readers with “anti-echo chamber” features, and better attribution tracking for online content. Mike Nelson’s ideas about self-sovereign ID systems that protect privacy while enabling verification represented sophisticated thinking about technical solutions.
Scale and Scope of Engagement
Jose’s fundamental question about whether we need opinions on everything happening globally versus focusing on what we can directly influence. This touched on the psychological and practical costs of trying to stay informed about everything—the tendency to turn most information sharing into “gossip” that consumes our life force without productive purpose.
Power and Media Manipulation
Understanding that misinformation isn’t just accidental—it’s often strategic. The discussion of Trump, Bannon, and Putin’s deliberate use of contradictory messages to create “discombobulation” rather than promoting a single narrative revealed sophisticated understanding of modern information warfare. Gil’s point about “sputtering apoplexy” as the intended effect on opposition forces highlighted how reactive responses can play into manipulative strategies.
Process vs. Content Focus
Rick’s emphasis that getting the process right (how we think and communicate) is more important than winning individual content battles. His concept of “emancipatory learning” versus “indoctrination” suggested that the fundamental challenge is educational and cultural, not just technological or regulatory.
Minority Themes
Several important themes were mentioned but not deeply explored, suggesting rich areas for future investigation:
Journalism Infrastructure Collapse: Jerry mentioned “the ongoing protracted agonizing death of journalism” but the conversation didn’t dive into specifics of newsroom closures, local news deserts, or emerging journalism models beyond brief mentions of nonprofit news.
Generational and Digital Literacy Divides: Judy’s critical observation that “80% of people” rely on “oral-only communication” and “aren’t really literate” represents a massive educational challenge that got minimal discussion despite being fundamental to the problem.
Mental Health and Information Overload: Jose touched on whether “we need to have an opinion about everything” and the psychological costs, but the conversation didn’t explore information anxiety, overwhelm, or cognitive limits of processing global information.
International Information Warfare: While Putin and Russia were mentioned, there wasn’t much exploration of state-sponsored information operations, cross-border influence campaigns, or how different countries are responding to information warfare.
Platform-Specific Dynamics: Social media algorithms were mentioned, but there wasn’t deep exploration of how different platforms (TikTok, Telegram, Signal) have different information dynamics or platform-specific moderation challenges.
Identity and Attribution Systems: Mike Nelson’s sophisticated ideas about digital identity, self-sovereign ID, and attribute-based verification got limited exploration despite being potentially transformative for information verification.
Global South and Non-Western Perspectives: The conversation was very Western-centric, with no discussion of how information trust and verification work in different cultural contexts or developing nations.
AI’s Role in Information Verification: While AI was mentioned for translation and simplification, its potential role in automated fact-checking, deepfake detection, or information verification wasn’t explored.
Legal and Regulatory Frameworks: Brief mentions of the Fairness Doctrine, but no deep dive into Section 230, European approaches like the Digital Services Act, or how regulation might address information problems.
Historical Context and Cycles: Several participants hinted at historical parallels, but there wasn’t deep exploration of whether current information problems are fundamentally new or variations on historical themes.
Next Time
Given the energy and momentum from this conversation, several high-priority directions emerged for future OGM calls:
“Brain Defense” Workshop: Building on Mike Nelson’s concept, this could be a practical session where participants share personal techniques for information evaluation, practice identifying emotional manipulation in real-time, and develop a toolkit for cognitive self-protection.
OGM Community Protocol Development: Kevin’s call for better standards could become concrete guidelines for information sharing on the OGM list, including templates for framing uncertain content and peer accountability mechanisms.
“Anti-Echo Chamber” Experiment: Jose’s insight about not needing opinions on everything could lead to structured exercises in consuming opposing viewpoints, testing tools like GroundNews collectively, and discussing what information is actually worth tracking versus ignoring.
The Literacy Crisis Deep Dive: Judy’s observation about oral-only communication deserves serious attention—exploring how to reach people who don’t read for information, what “brain defense” looks like for different literacy levels, and whether audio/video formats can convey nuanced thinking.
Business Models for Truth: Moving beyond identifying the problem to exploring solutions through examining successful nonprofit news models, cooperative journalism experiments, and micropayment innovations.
Emotional Intelligence for Information: Building on Jose’s insight about feelings versus facts to explore how to address legitimate emotions behind “wrong” beliefs, techniques for curious conversations with people we disagree with, and when empathy is more effective than correction.
The group seemed ready to move from problem identification to solution development. The recommendation is to start with the "Brain Defense" Workshop combined with OGM Protocol Development—building on momentum from the current discussion while creating immediate practical value for the community and establishing foundations for future conversations.
For Future Exploration
Looking beyond immediate next steps, the minority themes suggest rich areas for deeper investigation:
The intersection of technology, education, psychology, and global governance in addressing information challenges represents fertile ground for future exploration. Questions around economic models for quality information—subscription journalism, public media funding, cooperative ownership models, micropayments—deserve serious attention given the “no business model” problem identified.
Neuroscience and cognitive biases could provide foundation for more effective “brain defense” strategies, while historical patterns might help distinguish what’s genuinely new about current information challenges versus recurring themes in human communication and propaganda.
The global perspective gap represents both an opportunity and a necessity—understanding how different cultures approach information trust could provide models for more resilient information ecosystems.
Perhaps most importantly, the conversation revealed a community ready to engage with these challenges constructively rather than defensively. The willingness to examine their own practices, accept feedback, and experiment with new approaches suggests OGM could model the very practices it hopes the world might adopt—building trust through transparency, validation through verification, and wisdom through collective sense-making.
Text Versions of Mindmap
Here are text representations of the graphical mindmap (the image itself is not included in the current archive). First, a simplified version used to create the image (simplified to make it more practical to render); then a somewhat more detailed version.
Simplified version:

Central: Trust & Information Sources
├── Trust Paradox
│   ├── East Germany Story
│   ├── 1 in 10 Informants
│   └── Deep Underground Trust
├── Misinformation Examples
│   ├── Elon Musk Ravens
│   └── Misattributed Quotes
├── Proposed Solutions
│   ├── Search X & Anti-X
│   └── Dual Sourcing
├── Business Model Problem
│   ├── No Profit in Truth
│   └── Attention Economy
├── Community Standards
│   ├── Trusted Channel
│   └── Validation Before Sharing
├── Philosophical Tensions
│   ├── Trust vs Verification
│   └── Complexity vs Simplicity
├── Emotional Dimensions
│   ├── Feeling vs Facts
│   └── Trump's Emotional Appeal
└── Technology Solutions
    ├── Digital Identity
    └── Anti-Echo Chamber

Detailed version:

Central Node: "Trust & Information Sources"
├── Trust Paradox
│   ├── East Germany Story
│   ├── 1 in 10 Informants
│   └── Deep Underground Trust
├── Misinformation Examples
│   ├── Elon Musk Ravens Story
│   ├── $500M Fake Story
│   ├── Misattributed Quotes
│   └── Marcus Aurelius/Tolstoy
├── Proposed Solutions
│   ├── Search X & Anti-X
│   ├── Dual Sourcing
│   ├── Archive.is Links
│   └── Multiple Perspectives
├── Business Model Problem
│   ├── No Profit in Truth
│   ├── Attention Economy
│   └── Addiction by Design
├── Community Standards
│   ├── Trusted Channel
│   ├── Validation Before Sharing
│   └── Kevin's Proposal
├── Philosophical Tensions
│   ├── Trust vs Verification
│   ├── Complexity vs Simplicity
│   ├── Rick's View
│   └── Jargon Debate
├── Emotional Dimensions
│   ├── Feeling vs Facts
│   ├── Trump's Emotional Appeal
│   └── José's Insight
└── Technology Solutions
    ├── Digital Identity
    ├── Anti-Echo Chamber
    ├── Self-Sovereign ID
    └── News Reader Feature
Related:
- Peter Kaminski (author)
- 2025 (year)
- Topics: Events and Gatherings, Health and Wellbeing, Media and Communication