<?xml version='1.0' encoding='UTF-8'?>
<?xml-stylesheet href="/rss/stylesheet/" type="text/xsl"?>
<rss xmlns:content='http://purl.org/rss/1.0/modules/content/' xmlns:taxo='http://purl.org/rss/1.0/modules/taxonomy/' xmlns:rdf='http://www.w3.org/1999/02/22-rdf-syntax-ns#' xmlns:itunes='http://www.itunes.com/dtds/podcast-1.0.dtd' xmlns:googleplay="http://www.google.com/schemas/play-podcasts/1.0" xmlns:dc='http://purl.org/dc/elements/1.1/' xmlns:atom='http://www.w3.org/2005/Atom' xmlns:podbridge='http://www.podbridge.com/podbridge-ad.dtd' version='2.0'>
<channel>
  <title>Terms of Service Podcast</title>
  <language>en-us</language>
  <generator>microfeed.org</generator>
  <itunes:type>episodic</itunes:type>
  <itunes:explicit>false</itunes:explicit>
  <atom:link rel="self" href="https://podcast.termsofservice.xyz/rss/" type="application/rss+xml"/>
  <link>https://termsofservice.xyz</link>
  <description>
    <![CDATA[<p><strong><em>Subscribe, rate, and share to support the show on </em></strong><a href="https://podcasts.apple.com/us/podcast/the-terms-of-service-podcast/id1774517287" rel="noopener noreferrer" target="_blank"><strong><em>Apple Podcasts</em></strong></a><strong><em>, </em></strong><a href="https://open.spotify.com/show/1OHsoeF5UugkdEAN6r04W4" rel="noopener noreferrer" target="_blank"><strong><em>Spotify</em></strong></a><strong><em>, </em></strong><a href="https://pca.st/podcast/940c4540-6ec7-013d-4cae-0affd1ac52c3" rel="noopener noreferrer" target="_blank"><strong><em>PocketCast</em></strong></a><strong><em> or wherever you listen.</em></strong></p><p><strong><em>Follow us on&nbsp;</em></strong><a href="https://www.linkedin.com/company/podcast-terms-of-service" rel="noopener noreferrer" target="_blank"><strong><em>LinkedIn</em></strong></a><strong><em>&nbsp;for updates and join the conversation.</em></strong></p><p><br></p><p><em>Welcome to “Terms of Service,” the podcast that dives deep into the fine print of our digital lives. Every time we check the box on an app, website, or online service, we’re making choices—often without knowing the full story. From giving away our privacy to navigating complex security settings, we’re all part of a digital landscape that’s constantly evolving.</em></p><p><br></p><p><em>Join us as we unpack the themes that shape our online experiences: privacy, security, safety, and the everyday permissions we grant without a second thought. We’ll explore how AI, agency, and decentralized technologies are reshaping our digital world, often in ways that fly under the radar. 
And because no conversation about our digital lives would be complete without it, we’ll tackle the legal and policy implications that come with our clicks, swipes, and taps.</em></p><p><br></p><p><em>Whether you’re tech-savvy or just trying to keep up, “Terms of Service” invites you to join the conversation about the hidden costs of convenience in the digital age. Tune in to explore, question, and rethink the terms we so often accept, and let’s challenge the norms of our digital lives together.</em></p><p><br></p><h3><strong><em>Credits</em></strong></h3><p><em>Produced by Mary Camacho &amp; Nicole Klau Ibarra. Music and sound production by Arthur Vincent at Sonorlab.</em></p><p><br></p><h3><strong><em>Behind the Mic</em></strong></h3><p><br></p><p><a href="https://www.linkedin.com/in/maryfcamacho/" rel="noopener noreferrer" target="_blank"><em><img src="https://media.termsofservice.xyz/termsofservice-xyz/production/media/rich-editor/channels/6TDNYiz6Umx/image-25fc934acfff709d9f9f80aa00494e2e.png"></em></a></p><p><em>Co-founder of Holochain and CEO of Holo, Mary leads the development of peer-to-peer and decentralized technologies that empower users and redefine digital interactions. With over 20 years in tech and telecom, her career has been dedicated to enhancing user control, privacy, and digital autonomy.</em></p><p><br></p><p><em>Mary's educational background as a social scientist grounds her explorations at the intersection of sociality and technology, exposing the trade-offs in privacy, security, and agency inherent in our digital choices. 
On “Terms of Service,” she invites listeners to rethink these everyday interactions and the broader implications of AI, distributed tech, and legal frameworks on our digital lives, advocating for a future where individuals have greater control over their data and decisions.</em></p><p><a href="https://www.linkedin.com/in/maryfcamacho/" rel="noopener noreferrer" target="_blank"><em><img src="https://media.termsofservice.xyz/termsofservice-xyz/production/media/rich-editor/channels/6TDNYiz6Umx/image-fea03ed04120277051a31cb932450e3a.png"></em></a></p><p><a href="https://www.linkedin.com/in/maryfcamacho/" rel="noopener noreferrer" target="_blank"><em>https://www.linkedin.com/in/maryfcamacho/</em></a></p><p><br></p><p><em><img src="https://media.termsofservice.xyz/termsofservice-xyz/production/media/rich-editor/channels/6TDNYiz6Umx/image-5ad7639f63416407575d251f59fc2ffd.png"></em></p><p><em>Nicole is a visionary entrepreneur with a diverse background. She is passionate about social system design and has helped launch multiple ventures, including the IKIGAI Project, her non-profit helping people build essential 21st-century skills, many of which intersect with the topics discussed in Terms of Service.</em></p><p><a href="https://www.linkedin.com/in/nicole-klau-ibarra-b26818137/" rel="noopener noreferrer" target="_blank"><em><img src="https://media.termsofservice.xyz/termsofservice-xyz/production/media/rich-editor/channels/6TDNYiz6Umx/image-7339db02160590446c70bd8ed111a7c8.png"></em></a></p><p><a href="https://www.linkedin.com/in/nicole-klau-ibarra-b26818137/" rel="noopener noreferrer" target="_blank"><em>https://www.linkedin.com/in/nicole-klau-ibarra-b26818137/</em></a></p><p><br></p><p><br></p><p><em><img src="https://media.termsofservice.xyz/termsofservice-xyz/production/media/rich-editor/channels/6TDNYiz6Umx/image-6076d176375ba31529acbbb2f715133e.png"></em></p><p><em>Arthur Vincent is a seasoned music and audio producer with a passion for pushing the boundaries of music 
technology. As a music producer and sound designer, he has crafted innovative audio experiences for global brands like Heineken, Philips, and Cupra. Alongside his creative work, Arthur is also an expert in audio technology, mastering both hardware and software tools to deliver high-quality, immersive sound. </em></p><p><a href="https://www.linkedin.com/in/arthur-vincent/" rel="noopener noreferrer" target="_blank"><em><img src="https://media.termsofservice.xyz/termsofservice-xyz/production/media/rich-editor/channels/6TDNYiz6Umx/image-a3a2f1765cb87b9daac4c46e9455a835.png"></em></a></p><p><a href="https://www.linkedin.com/in/arthur-vincent/" rel="noopener noreferrer" target="_blank"><em>https://www.linkedin.com/in/arthur-vincent/</em></a></p>]]>
  </description>
  <itunes:author>Mary Camacho</itunes:author>
  <itunes:image href="https://media.termsofservice.xyz/termsofservice-xyz/production/images/channel-552cfc6453e8f9e9674d2015afb0e0dd.jpg"/>
  <image>
    <title>Terms of Service Podcast</title>
    <url>https://media.termsofservice.xyz/termsofservice-xyz/production/images/channel-552cfc6453e8f9e9674d2015afb0e0dd.jpg</url>
    <link>https://termsofservice.xyz</link>
  </image>
  <copyright>©2024 Mary Camacho</copyright>
  <itunes:owner>
    <itunes:email>info@termsofservice.xyz</itunes:email>
    <itunes:name>Mary Camacho</itunes:name>
  </itunes:owner>
  <itunes:title>The Terms of Service Podcast</itunes:title>
  <itunes:category text="Business">
    <itunes:category text="Entrepreneurship"/>
  </itunes:category>
  <itunes:category text="Technology"/>
  <itunes:category text="Society &amp; Culture"/>
  <item>
    <title>Environments Are Not Neutral: Biology, Burnout, and the Design of Work with Dr. Elizabeth C. Nelson</title>
    <guid>chEimLvuErH</guid>
    <pubDate>Wed, 25 Feb 2026 15:42:31 GMT</pubDate>
    <itunes:explicit>false</itunes:explicit>
    <description>
      <![CDATA[<p>In this episode of <em>Terms of Service</em>, host Mary Camacho speaks with Dr. Elizabeth C. Nelson, biomedical engineer and founder of Learn, Adapt, Build, about a deceptively simple idea: environments are not neutral. From open office layouts to wearable wellness metrics, the spaces and systems we design encode assumptions about who they are built for—and who must adapt to survive inside them.</p><p>Drawing from her own burnout experience and years of research bridging academia and practice, Elizabeth explains how modern workplaces often optimize for the most resilient minority rather than the majority. They explore how environmental design affects stress, cognition, sleep, and performance; why high performers are often the first to hit the wall; and how leadership teams can make practical, measurable changes that improve both well-being and output.</p><p>This conversation extends this season’s focus on health technology governance into the physical workplace itself — asking how environmental measurement, workplace design, and performance metrics can either support human thriving or quietly optimize for institutional control.</p><h2>Key Takeaways</h2><ul><li><strong>Environments are not neutral.</strong> Physical layout, lighting, noise, air quality, and collaboration norms encode assumptions about the “default worker.”</li><li><strong>We do not design for the average.</strong> Many workplaces are optimized for the most resilient employees, not the most sensitive—despite evidence that designing for the sensitive improves outcomes for everyone.</li><li><strong>Focus is biologically powerful.</strong> Deep work and flow states (often lasting 60–90 minutes) support cognitive performance and emotional regulation. 
Constant interruption erodes both.</li><li><strong>Burnout is not a binary.</strong> It develops over time and often affects high performers who overextend without adequate recovery.</li><li><strong>Measurement can validate—or destabilize.</strong> Environmental sensors and wearables can reconnect people to their bodies, but poorly framed metrics can create shame or disconnect (as seen in early 10,000-step tracking experiences).</li><li><strong>Small structural changes matter.</strong> Separating deep-focus roles from interruption-heavy roles, improving air quality transparency, and removing unnecessary management friction can significantly improve performance and morale.</li></ul><h2>Topics Covered / Timestamped Sections</h2><ul><li><strong>00:00</strong> – Season framing: architecture, wellness technology, and why environments matter</li><li><strong>04:00</strong> – Burnout as origin story and the shift from academia to workplace research</li><li><strong>06:00</strong> – Open offices, evolutionary biology, and why protection and cover matter</li><li><strong>12:00</strong> – The cultural loss of focus and the cost of constant collaboration</li><li><strong>19:00</strong> – Burnout as a gray zone and the biological role of sleep</li><li><strong>22:00</strong> – Wearables, recalibrating step goals, and the psychology of measurement</li><li><strong>27:00</strong> – Air quality sensors, transparency, and the “Butterfly Air” example</li><li><strong>33:00</strong> – Designing for the most sensitive rather than the most resilient</li><li><strong>49:00</strong> – Case study: separating engineers from interruption-driven roles</li><li><strong>56:00</strong> – Leading with biology: why design becomes easier when aligned with human instincts</li></ul><h2>Guest Bio and Links</h2><p>Dr. Elizabeth C. Nelson is a biomedical engineer, researcher, and founder of Learn, Adapt, Build. 
Her work bridges scientific research and real-world application, focusing on workplace design, burnout prevention, environmental measurement, and biological alignment. She advises leadership teams and organizations on how to create spaces that support focus, recovery, and sustainable performance.</p><ul><li>Website: <a href="https://learnadaptbuild.com/" rel="noopener noreferrer" target="_blank">https://learnadaptbuild.com</a></li><li>LinkedIn: <a href="https://www.linkedin.com/in/nelsonelizabeth/" rel="noopener noreferrer" target="_blank">https://www.linkedin.com/in/nelsonelizabeth/</a></li><li>Book: <a href="https://www.amazon.com/Healthy-Office-Revolution-burnout-working/dp/9082692406" rel="noopener noreferrer" target="_blank"><em>The Healthy Office Revolution</em></a></li></ul><h2>Resources Mentioned</h2><ul><li><a href="https://www.amazon.com/Healthy-Office-Revolution-burnout-working/dp/9082692406" rel="noopener noreferrer" target="_blank"><em>The Healthy Office Revolution</em></a> by Dr. Elizabeth C. Nelson</li><li><a href="https://jamesclear.com/atomic-habits" rel="noopener noreferrer" target="_blank">Atomic Habits</a> (referenced in the discussion about incremental change)</li><li><a href="https://smartbuildingcollective.com/" rel="noopener noreferrer" target="_blank">Smart Building Collective</a> (Elizabeth’s professional affiliation)</li><li><a href="https://en.wikipedia.org/wiki/Workplace_exposure_monitoring" rel="noopener noreferrer" target="_blank">Workplace environmental testing</a> (CO₂, air quality, light disturbance measurement)</li></ul><h2>Call to Action</h2><p>If environments are not neutral, then design is a form of leadership. What assumptions are encoded in your workplace—or in the technologies you use every day? 
This episode invites you to rethink performance, burnout, and biology through the lens of space itself.</p><p>🎧 Listen now: <a href="https://termsofservice.xyz/" rel="noopener noreferrer" target="_blank">https://termsofservice.xyz/</a></p><h2>Credits</h2><p>Host: Mary Camacho</p><p>Guest: Dr. Elizabeth C. Nelson</p><p>Produced by Terms of Service Podcast</p><p>Sound Design: Arthur Vincent and Sonor Lab</p><p>Co-Producers: Nicole Klau Ibarra &amp; Mary Camacho</p>]]>
    </description>
    <link>https://podcast.termsofservice.xyz/i/chEimLvuErH/</link>
    <itunes:image href="https://media.termsofservice.xyz/termsofservice-xyz/production/images/item-e68910b3d28e0aba00cc3404df1c9cba.jpg"/>
    <itunes:episodeType>full</itunes:episodeType>
    <enclosure url="https://op3.dev/e/media.termsofservice.xyz/termsofservice-xyz/production/media/audio-c2f9b678438f5ae47e0a2d1cc4375962.mp3" type="audio/mpeg" length="124562201"/>
    <itunes:duration>00:51:53</itunes:duration>
  </item>
  <item>
    <title>Femtech&apos;s Reckoning: Privacy, Power, and Protection in Health Technology with Soribel Feliz</title>
    <guid>4_TzseN96dz</guid>
    <pubDate>Thu, 29 Jan 2026 14:00:00 GMT</pubDate>
    <itunes:explicit>false</itunes:explicit>
    <description>
      <![CDATA[<p>In this episode of Terms of Service, host Mary Camacho speaks with Soribel Feliz—AI governance and tech policy advisor—about the dangerous gaps between what health technology promises and what privacy law actually protects. Drawing from her experience advising on AI and emerging tech across the U.S. Senate, federal government, and Big Tech, Soribel examines how femtech and wellness apps claim to empower women while selling their most intimate data and leaving them vulnerable to law enforcement.</p><p>This conversation starts with a moment at a pitch competition: a pregnancy app founder dismissed a question about law enforcement access with "we're HIPAA compliant" and turned away. That turning away from hard questions reveals the problem. With 210 pregnant women facing criminal charges supported by data from apps that promised empowerment, this episode asks: what would it take to build health technology that actually protects the people who use it?</p><h2>Key Takeaways</h2><ul><li>HIPAA compliance doesn't mean privacy protection. Most consumer health apps aren't covered entities and can sell your data freely.</li><li>Your health data is being sold without meaningful consent. Period-tracking apps sold location data to anti-abortion organizations to target women visiting Planned Parenthood.</li><li>Pregnancy loss can become criminal evidence. In the first year after Dobbs, 210 pregnant women faced criminal charges in which app data served as evidence—HIPAA offered zero protection.</li><li>Compliance ≠ actual protection. Checking regulatory boxes doesn't mean users are safe. Founders and investors must ask harder questions.</li><li>Algorithms are personal. 
From hiring discrimination to insurance denials, AI systems make intimate decisions about people's lives with little transparency.</li></ul><h2>Topics Covered / Timestamped Sections</h2><ul><li><strong>02:40</strong> – The pitch competition moment: when "HIPAA compliant" became a shield against accountability</li><li><strong>04:31</strong> – What HIPAA actually does and doesn't do</li><li><strong>08:49</strong> – Period tracker data sold to Wisconsin Right to Life for anti-abortion targeting</li><li><strong>16:30</strong> – Why consumer health apps aren't covered by HIPAA</li><li><strong>20:15</strong> – Law enforcement access and pregnancy loss as criminal evidence</li><li><strong>32:18</strong> – ChatGPT Health and the risks of sharing complete medical records</li><li><strong>37:50</strong> – What Soribel learned in the Senate, at Meta, and Microsoft</li><li><strong>43:45</strong> – Algorithms are personal: Workday's hiring discrimination lawsuit</li><li><strong>46:23</strong> – Advice for founders: put your money where your mouth is</li><li><strong>50:26</strong> – Follow Soribel's work on LinkedIn, Substack, and YouTube</li></ul><h2>Guest Bio and Links</h2><p><strong>Soribel Feliz</strong> – AI governance and tech policy advisor with experience advising on AI and emerging tech across the U.S. Senate, federal government, and Big Tech. She focuses on how AI systems create legal, ethical, and operational risk, especially in health tech and femtech, and how organizations can govern AI responsibly at scale. 
Her work highlights where privacy law and AI governance fall short and why robust governance frameworks matter now.</p><ul><li><strong>LinkedIn:</strong> <a href="https://www.linkedin.com/in/soribelfeliz/" rel="noopener noreferrer" target="_blank">https://www.linkedin.com/in/soribelfeliz/</a></li><li><strong>Substack:</strong> <a href="https://soribelfeliz.substack.com/" rel="noopener noreferrer" target="_blank">https://soribelfeliz.substack.com/</a></li><li><strong>YouTube:</strong> Algorithms are Personal - <a href="https://www.youtube.com/channel/UC4gbKlwc5VQ6AeCq7kH0ZHA" rel="noopener noreferrer" target="_blank">https://www.youtube.com/channel/UC4gbKlwc5VQ6AeCq7kH0ZHA</a></li></ul><h2>Resources Mentioned</h2><ul><li><strong>"So You Got the Privacy Officer Title - Now What?"</strong> by Teresa "T" Froester-Falk</li><li><strong>Near Intelligence / Wisconsin Right to Life</strong> – <a href="https://www.commondreams.org/news/abortion-location-data" rel="noopener noreferrer" target="_blank">Senator Wyden's investigation</a> into location data sales</li><li><strong>Mobley v. Workday</strong> – <a href="https://www.cnn.com/2025/05/22/tech/workday-ai-hiring-discrimination-lawsuit" rel="noopener noreferrer" target="_blank">AI hiring discrimination lawsuit</a></li><li><strong>Pregnancy Justice Report</strong> – <a href="https://www.pregnancyjusticeus.org/resources/pregnancy-as-a-crime-a-preliminary-report-on-the-first-year-after-dobbs/" rel="noopener noreferrer" target="_blank">210 criminal charges post-Dobbs</a></li></ul><h2>Further Reading / Related Episodes</h2><ul><li>Soribel's "Femtech Reckoning" series on Substack</li></ul><h2>Call to Action</h2><p>What does it mean to build health technology that actually protects the people who use it? 
Soribel Feliz offers a clear-eyed examination of where femtech is failing—and what it would take for founders, investors, and policymakers to ask the hard questions.</p><h2>Credits</h2><p><strong>Host:</strong> Mary Camacho</p><p><strong>Guest:</strong> Soribel Feliz</p><p><strong>Produced by</strong> Terms of Service Podcast</p><p><strong>Sound Design:</strong> Arthur Vincent and Sonor Lab</p><p><strong>Co-Producers:</strong> Nicole Klau Ibarra &amp; Mary Camacho</p>]]>
    </description>
    <link>https://podcast.termsofservice.xyz/i/4_TzseN96dz/</link>
    <itunes:image href="https://media.termsofservice.xyz/termsofservice-xyz/production/images/item-2475b8617cb5d149a972370c89276403.jpg"/>
    <itunes:title>Femtech&apos;s Reckoning: Privacy, Power, and Protection in Health Technology with Soribel Feliz</itunes:title>
    <itunes:season>2</itunes:season>
    <itunes:episode>1</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <enclosure url="https://op3.dev/e/media.termsofservice.xyz/termsofservice-xyz/production/media/audio-c8cf3020f7983d2ec1e3a5cefad9155d.mp3" type="audio/mpeg" length="124406512"/>
    <itunes:duration>00:51:50</itunes:duration>
  </item>
  <item>
    <title>Changing Minds and Making Space: Curiosity, Emotion, and Democracy with Dr. Sarah Stein Lubrano</title>
    <guid>cFW-S4-ppEf</guid>
    <pubDate>Wed, 26 Nov 2025 14:00:00 GMT</pubDate>
    <itunes:explicit>false</itunes:explicit>
    <description>
      <![CDATA[<p>In this episode of <em>Terms of Service</em>, host Mary Camacho speaks with Dr. Sarah Stein Lubrano—author of <em>Don’t Talk About Politics: How to Change 21st Century Minds</em>—about what it takes to think, connect, and persuade in a time of rapid technological and cultural disruption. Drawing from her background in philosophy, psychology, and political theory, Sarah explores how emotions shape our cognition, why curiosity is a democratic virtue, and how design and technology can either open or close off possibilities for shared understanding.</p><p>Together, they examine how modern systems—from social media to AI agents—can reduce nuance, flatten emotional range, and reward performance over reflection. This conversation invites us to think more deeply about how we encounter difference—and what it takes to stay open when the world feels overwhelming.</p><h3><strong>Key Takeaways</strong></h3><ul><li><strong>Changing minds isn’t about winning arguments.</strong> It starts with curiosity, emotional intelligence, and building the cognitive space for reflection.</li><li><strong>Democracy requires mental infrastructure.</strong> That means not just freedom of speech, but the psychological and social capacity to listen, consider, and evolve.</li><li><strong>AI and social platforms risk “flattening” cognition.</strong> Speed and frictionless interaction can reduce the emotional and epistemic range of public discourse.</li><li><strong>Design can support or inhibit dignity.</strong> How we architect systems of learning, debate, or health shapes what kinds of people and conversations they enable.</li><li><strong>We don’t need agreement to coexist.</strong> But we do need structures that protect space for difference—both in ideas and identities.</li></ul><h3><strong>Topics Covered / Timestamped Sections</strong></h3><ul><li><strong>02:10</strong> – Sarah’s intellectual path: from Oxford and Harvard to emotional epistemology and political 
learning</li><li><strong>04:24</strong> – Why she wrote <em>Don’t Talk About Politics</em> and what “changing minds” really involves</li><li><strong>13:30</strong> – How certain academic and tech cultures mistake argument for insight, and why more discussion doesn’t necessarily lead to understanding or change</li><li><strong>17:50</strong> – The tension between emotional speed and civic depth — what technology amplifies, and what it erodes</li><li><strong>24:06</strong> – Designing for reflection: what it takes to build platforms that support empathy, not outrage</li><li><strong>39:11</strong> – Bringing emotional education into institutions, policymaking, and design</li><li><strong>46:01</strong> – Reflections on where we go from here — cultivating the emotional capacity democracy requires</li></ul><h3><strong>Guest Bio and Links</strong></h3><p><strong>Dr. Sarah Stein Lubrano</strong> – Researcher, educator, and author focused on the psychology of political learning and epistemic humility. 
She holds a doctorate from Oxford and is the author of <em>Don’t Talk About Politics: How to Change 21st Century Minds</em>.</p><ul><li><a href="https://www.sarahsteinlubrano.com/" rel="noopener noreferrer" target="_blank">Sarah’s Website</a></li><li><a href="https://www.amazon.com/Dont-Talk-About-Politics-21st-Century/dp/1399413929" rel="noopener noreferrer" target="_blank">Don’t Talk About Politics – Book Link</a></li></ul><h3><strong>Resources Mentioned</strong></h3><ul><li><a href="https://www.theschooloflife.com/" rel="noopener noreferrer" target="_blank">The School of Life</a> – Where Sarah developed emotional learning content</li><li><strong>Trauma-informed pedagogy</strong> – Educational design that recognizes emotional safety and regulation</li><li><strong>Patient experience research</strong> – How listening and context shape clinical outcomes</li><li><strong>AI as cognitive scaffolding</strong> – The potential and risks of AI agents in deliberative thinking</li></ul><h3><strong>Further Reading / Related Episodes</strong></h3><ul><li><strong>Episode 6</strong>: "Emotional Intelligence in the Age of AI: A Conversation with Marisa Zalabak"</li><li><strong>Episode 7</strong>: "Who Watches the Watchers? Privacy Law, AI, and Power with William McGeveran"</li></ul><h3><strong>Call to Action</strong></h3><p>How do we create room for real thought—and for each other—in an age of constant noise? Dr. Sarah Stein Lubrano offers a thoughtful and hopeful path forward, grounded in emotion, curiosity, and civic design.</p><p>🎧 Listen now: Episode Link</p><h3><strong>Credits</strong></h3><p>Host: Mary Camacho</p><p>Guest: Dr. Sarah Stein Lubrano</p><p>Produced by <em>Terms of Service Podcast</em></p><p>Sound Design: Arthur Vincent and Sonor Lab</p><p>Co-Producers: Nicole Klau Ibarra &amp; Mary Camacho</p>]]>
    </description>
    <link>https://podcast.termsofservice.xyz/i/cFW-S4-ppEf/</link>
    <itunes:image href="https://media.termsofservice.xyz/termsofservice-xyz/production/images/item-9f5cfb01726412d560f63103072775f7.jpg"/>
    <itunes:title>Changing Minds and Making Space: Curiosity, Emotion, and Democracy with Dr. Sarah Stein Lubrano</itunes:title>
    <itunes:season>1</itunes:season>
    <itunes:episode>12</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <enclosure url="https://op3.dev/e/media.termsofservice.xyz/termsofservice-xyz/production/media/audio-db92bfc9ec5beae4080f7f169baffbba.mp3" type="audio/mpeg" length="56383111"/>
    <itunes:duration>00:58:43</itunes:duration>
  </item>
  <item>
    <title>You Don’t Own It If You Can’t Fix It: The Fight for the Right to Repair</title>
    <guid>oJCPclMfqRb</guid>
    <pubDate>Fri, 19 Sep 2025 07:11:59 GMT</pubDate>
    <itunes:explicit>false</itunes:explicit>
    <description>
      <![CDATA[<h3><strong>Episode Summary</strong></h3><p>In this episode of <em>Terms of Service</em>, host Mary Camacho speaks with Gay Gordon-Byrne, Executive Director of the Digital Right to Repair Coalition, about how manufacturers are rewriting the rules of ownership in the digital age. Drawing on decades of experience in enterprise computing and leasing, Gay shares how restrictive repair policies—hidden behind software locks, proprietary tools, and legal fine print—are quietly eroding our rights as consumers.</p><p>From absurd real-world examples to legislative progress across the U.S., this conversation reveals what’s at stake when we lose the ability to fix the things we own—and how the Right to Repair movement is pushing back.</p><h3><strong>Key Takeaways</strong></h3><ul><li><strong>Repair is a right, not a loophole.</strong> Companies have used copyright law, contracts, and DRM to block basic repairs—redefining ownership in the process.</li><li><strong>You don’t void your warranty by repairing your own device.</strong> Under U.S. 
law, that’s been protected since the 1970s.</li><li><strong>Tractors, phones, and dishwashers now run on software.</strong> That means repair is increasingly a legal and digital issue, not just mechanical.</li><li><strong>Fixing things is a cultural practice.</strong> It’s being squeezed out by design, but it offers economic, environmental, and emotional benefits.</li><li><strong>State-level legislation is gaining traction.</strong> While federal regulators stall, local organizing and public pressure are driving change.</li></ul><h3><strong>Topics Covered / Timestamped Sections</strong></h3><ul><li><strong>04:30</strong> – Understanding Right to Repair: from leasing and enterprise sales to grassroots repair advocacy</li><li><strong>08:10</strong> – The slow erosion of repair rights through software and service bundling</li><li><strong>10:50</strong> – What “Right to Repair” actually means—and what it doesn’t</li><li><strong>12:52</strong> – The shift in consumer expectations</li><li><strong>13:50</strong> – The economics of repairability</li><li><strong>15:40</strong> – Legal implications of ownership</li><li><strong>19:00</strong> – Tractors, cars, and consumer electronics: software as the new lock</li><li><strong>24:00</strong> – The global perspective on repair culture</li><li><strong>27:26</strong> – The Magnuson-Moss Warranty Act and the myth of “voided” warranties</li><li><strong>28:00</strong> – Legislative changes and consumer power</li><li><strong>31:25</strong> – Antitrust and tying agreements: the legal dimension of forced service</li><li><strong>35:05</strong> – The role of consumers in advocacy: France’s repairability index and global momentum for consumer rights</li><li><strong>45:38</strong> – Stories from the field: absurd repair scenarios and growing public awareness</li></ul><h3><strong>Guest Bio and Links</strong></h3><p><strong>Gay Gordon-Byrne</strong> – Executive Director of the Digital Right to Repair Coalition (Repair.org). 
With decades of experience in the computer leasing industry, Gay has spent the past decade fighting to restore ownership and repair rights for consumers and independent businesses across the U.S.</p><ul><li><a href="http://repair.org" rel="noopener noreferrer" target="_blank">Repair.org</a>&nbsp;</li><li><a href="https://www.linkedin.com/in/gay-gordon-byrne-b167855/" rel="noopener noreferrer" target="_blank">Gay Gordon-Byrne on LinkedIn</a></li></ul><h3><strong>Resources Mentioned</strong></h3><ul><li><a href="https://www.ftc.gov/legal-library/browse/statutes/magnuson-moss-warranty-federal-trade-commission-improvements-act" rel="noopener noreferrer" target="_blank">Magnuson-Moss Warranty Act (FTC.gov) </a>– Protecting U.S. consumers from deceptive warranty practices</li><li><a href="https://repair.eu/news/the-french-repair-index-challenges-and-opportunities/" rel="noopener noreferrer" target="_blank">France’s Repairability Index</a> – Labeling systems that inform buyers on repair potential</li><li><a href="https://www.ifixit.com/" rel="noopener noreferrer" target="_blank">iFixit </a>– Repair guides, community support, and advocacy</li></ul><h3><strong>Further Reading / Related Episodes</strong></h3><ul><li><a href="https://termsofservice.xyz/episodes/empowerment-tech-unlocking-customer-data-for-bett-W7nErd8CLMY/" rel="noopener noreferrer" target="_blank">Episode 3: “Empowerment Tech: Unlocking Customer Data for Better Choices and Better Business”</a></li></ul><h3><strong>Call to Action</strong></h3><p>What if you couldn’t fix your own tools, car, or phone—even when it’s a simple repair? 
Listen to Gay Gordon-Byrne explain why the right to repair is about more than gadgets—it’s about autonomy, sustainability, and democratic accountability.</p><p>🎧 Listen now: <a href="https://termsofservice.xyz/episodes/you-dont-own-it-if-you-cant-fix-it-the-fight-for-the-right-to-repair/oJCPclMfqRb/" rel="noopener noreferrer" target="_blank">Episode Link</a></p><h3><strong>Credits</strong></h3><p>Host: Mary Camacho</p><p>Guest: Gay Gordon-Byrne</p><p>Produced by <em>Terms of Service Podcast</em></p><p>Sound Design: Arthur Vincent and Sonor Lab</p><p>Co-Producers: Nicole Klau Ibarra &amp; Mary Camacho</p>]]>
    </description>
    <link>https://podcast.termsofservice.xyz/i/oJCPclMfqRb/</link>
    <itunes:image href="https://media.termsofservice.xyz/termsofservice-xyz/production/images/item-bb28d5d12901ed87f24edcd91329b5d5.png"/>
    <itunes:title>You Don’t Own It If You Can’t Fix It: The Fight for the Right to Repair</itunes:title>
    <itunes:season>1</itunes:season>
    <itunes:episode>14</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <enclosure url="https://op3.dev/e/media.termsofservice.xyz/termsofservice-xyz/production/media/audio-d24d16ef2aba73ab47b4515868e33f3d.mp3" type="audio/mpeg" length="133046773"/>
    <itunes:duration>00:55:26</itunes:duration>
  </item>
  <item>
    <title>Designing Privacy You Can Feel: Smooth, Supportive, Empowering</title>
    <guid>wG1hz4E7I42</guid>
    <pubDate>Wed, 06 Aug 2025 13:00:00 GMT</pubDate>
    <itunes:explicit>false</itunes:explicit>
    <description>
      <![CDATA[<h3><strong>Episode Summary</strong></h3><p>In this episode of <em>Terms of Service</em>, host Mary Camacho speaks with <strong>Molly Willson</strong> and <strong>Eriol Fox</strong> from Superbloom, a nonprofit design and technology studio working at the intersection of open-source software, privacy, and human rights. Together, they unpack the <em>Privacy Experience Heuristics</em>—a framework designed to help teams build more intuitive, trust-centered experiences around privacy.</p><p>They explore why legal compliance isn’t enough, how tools like password managers and secure messaging apps can feel intimidating or unsafe, and why it’s crucial to center marginalized users in privacy and security design. From “personas non grata” to designing for digital dignity, this conversation explores how we can bridge the gap between secure systems and the real people who need them most.</p><h3><strong>Key Takeaways</strong></h3><ul><li><strong>Privacy isn’t just technical—it’s emotional and relational.</strong> Smooth, supportive, and empowering experiences help users trust and engage with privacy-respecting tools.</li><li><strong>The Privacy Experience Heuristics</strong> were created to guide open-source and nonprofit teams in building better UX for privacy without requiring specialized expertise.</li><li><strong>Designers have a critical role</strong> in shaping security culture and making privacy feel accessible, not punitive.</li><li><strong>Marginalized communities often bear the brunt of poor defaults and unsafe assumptions.</strong> Designing with their safety in mind improves tools for everyone.</li><li><strong>Security isn’t one-size-fits-all.</strong> Empowerment means giving users choices without overwhelming them with complexity.</li></ul><h3><strong>Topics Covered / Timestamped Sections</strong></h3><ul><li><strong>03:30</strong> – How Molly and Eriol came to focus on privacy-centered design</li><li><strong>08:56</strong> – Why compliance 
frameworks (like GDPR) don’t ensure a good user experience.</li><li><strong>10:22</strong> – Introducing the Privacy Experience Heuristics: smooth, supportive, empowering.</li><li><strong>16:08</strong> – The difference between supportive and empowering.</li><li><strong>21:00</strong> – Human-centered design doesn't start and end with the users.</li><li><strong>23:27</strong> – Designing for safety: why privacy must serve people on the margins.</li><li><strong>27:38</strong> – Should people have to worry about privacy?</li><li><strong>31:30</strong> – Personas Non Grata: preparing for misuse and unexpected users.</li><li><strong>36:21</strong> – Real-world examples where privacy or security is being built into design.</li><li><strong>43:38</strong> – Why can't you split the world into 'people who need privacy' and 'people who don’t'?</li><li><strong>44:30</strong> – WhatsApp, Signal, and the difference between them.</li><li><strong>56:00</strong> – Hope for the future: reframing privacy as a shared cultural value.</li></ul><h3><strong>Guest Bio and Links</strong></h3><p><strong>Molly Willson</strong> – Molly has been at Superbloom since 2018, where she leads design and research projects around a variety of open-source and public interest technologies. She has worked with teams on projects around privacy, security, transparency, open data, and internet governance, and has also done research projects together with funders and communities working in these areas. She also leads Superbloom's coaching program, helping pair experts with teams for high-impact design, community, and fundraising mentoring. Her background is in both design and education, making her particularly passionate about making design useful to everyone looking to create rights-friendly alternatives to big tech platforms. Before she joined Superbloom, she taught design at the Stanford d.school and the Hasso-Plattner-Institut at the University of Potsdam. 
She is originally from the US but has lived in Berlin, Germany, since 2015, where she lives with her husband and her two daughters.</p><p><strong>Eriol Fox</strong> – Eriol has been working as a designer for 15+ years, first in for-profits and then in NGOs and open-source software organisations, on complex problems like sustainable food systems, peace-building, and crisis-response technology. Eriol now works at Superbloom on design, research, open-source, and technology projects. They are also part of the core teams at Open Source Design (http://opensourcedesign.net/), the Human Rights Centred Design working group (https://hrcd.pubpub.org/), and the Sustain UX &amp; Design working group (https://sustainoss.org/working-groups/design-and-ux/), and they help host a podcast about open source and design (https://sosdesign.sustainoss.org/). Eriol is a non-binary, queer person who uses they/them pronouns.</p><ul><li><a href="https://superbloom.design/" rel="noopener noreferrer" target="_blank">Superbloom Website</a></li><li><a href="https://superbloom.design/learning/blog/measuring-the-privacy-experience/" rel="noopener noreferrer" target="_blank">Privacy Experience Heuristics</a></li><li><a href="https://github.com/sprblm/The-Design-We-Open" rel="noopener noreferrer" target="_blank">https://github.com/sprblm/The-Design-We-Open</a></li></ul><h3><strong>Resources Mentioned</strong></h3><ul><li><a href="https://signal.org/" rel="noopener noreferrer" target="_blank">Signal</a> – Encrypted messaging with strong privacy defaults</li><li><a href="https://www.torproject.org/download/" rel="noopener noreferrer" target="_blank">Tor Browser</a> – Privacy-first web browsing</li><li><a href="https://gdpr-info.eu/" rel="noopener noreferrer" target="_blank">GDPR</a> – European data protection law, often insufficiently implemented in UX</li><li><a href="https://www.firefox.com/" rel="noopener noreferrer" target="_blank">Firefox</a>, <a href="https://proton.me/" rel="noopener 
noreferrer" target="_blank">Proton</a>, <a href="https://keepassxc.org/" rel="noopener noreferrer" target="_blank">KeePassXC</a> – Examples discussed throughout.</li><li><a href="https://superbloom.design/learning/resources/persona-non-grata/" rel="noopener noreferrer" target="_blank">Personas Non Grata</a></li></ul><h3><strong>Further Reading / Related Episodes</strong></h3><ul><li><a href="https://termsofservice.xyz/episodes/beyond-honeypots-privacy-security-and-the-futur-YjzvQ-RnEQE/" rel="noopener noreferrer" target="_blank">Episode 2: "Beyond Honeypots: Privacy, Security, and the Future of Distributed Webs"</a></li><li><a href="https://termsofservice.xyz/episodes/the-great-disruption-building-human-centered-digi-goDcF1Nff0b/" rel="noopener noreferrer" target="_blank">Episode 8: "The Great Disruption: Building Human-Centered Digital Futures"</a></li></ul><h3><strong>Call to Action</strong></h3><p>How does privacy feel when you use your favorite app? Is it smooth? Supportive? Empowering? Molly and Eriol challenge us to design not just for policy, but for people. Listen to this episode and explore how design can help us reclaim digital agency.</p><p>🎧 Listen now:&nbsp;<a href="https://termsofservice.xyz/episodes/designing-privacy-you-can-feel-smooth-supportive-empowering/wG1hz4E7I42/" rel="noopener noreferrer" target="_blank">Episode Link</a></p><h3><strong>Credits</strong></h3><p>Host: Mary Camacho</p><p>Guests: Molly Willson &amp; Eriol Fox</p><p>Produced by <em>Terms of Service Podcast</em></p><p>Sound Design: Arthur Vincent and Sonor Lab</p><p>Co-Producers: Nicole Klau Ibarra &amp; Mary Camacho</p>]]>
    </description>
    <link>https://podcast.termsofservice.xyz/i/wG1hz4E7I42/</link>
    <itunes:image href="https://media.termsofservice.xyz/termsofservice-xyz/production/images/item-08306d3367ab16e1c10f23bbc1800eb2.png"/>
    <itunes:title>Designing Privacy You Can Feel: Smooth, Supportive, Empowering</itunes:title>
    <itunes:season>1</itunes:season>
    <itunes:episode>13</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <enclosure url="https://op3.dev/e/media.termsofservice.xyz/termsofservice-xyz/production/media/audio-421fcd4aad31e19f480b0e6e96246777.mp3" type="audio/mpeg" length="157607099"/>
    <itunes:duration>01:05:40</itunes:duration>
  </item>
  <item>
    <title>Mission, Complexity, and Crisis: Leading in a Rapidly Changing World</title>
    <guid>rsUED0m6jwj</guid>
    <pubDate>Wed, 02 Jul 2025 13:00:00 GMT</pubDate>
    <itunes:explicit>false</itunes:explicit>
    <description>
      <![CDATA[<h3><strong>Episode Summary</strong></h3><p>In this episode of <em>Terms of Service</em>, host Mary Camacho speaks with Dr. David Bray, a seasoned leader who has served in senior roles across the U.S. government, tech, and civil society. From bioterrorism response at the CDC to digital transformation efforts in national intelligence, Bray brings a unique perspective on leadership in complexity. They explore how institutions can adapt in times of disruption, why trust is a critical infrastructure, and how positive change agents can build bridges across sectors—even in polarized environments. With a deep systems lens, Bray challenges us to align technological innovation with human values and long-term mission.</p><p><strong>Key Takeaways</strong></p><ul><li><strong>Mission-driven leadership matters most in times of complexity and crisis.</strong> Leaders must be able to hold contradictions, listen deeply, and navigate uncertainty with clarity of purpose.</li><li><strong>Trust is infrastructure.</strong> Societal systems—especially in democracies—depend on mutual trust, and technology can either degrade or strengthen that foundation.</li><li><strong>The U.S. 
is structured for stalemate, not for rapid transformation.</strong> But transformation is still possible—especially in crises—through coalitions and adaptive strategies.</li><li><strong>Cross-sector collaboration is essential.</strong> Government, civil society, and private enterprise must learn to speak a shared language of values and resilience.</li><li><strong>We must redesign metrics for success.</strong> Quarterly profits aren’t the only or best measure; we need frameworks that value long-term human and ecological well-being.</li></ul><h3><strong>Topics Covered / Timestamped Sections</strong></h3><ul><li><strong>02:52</strong> – The Neutrality of Technology and Its Implications</li><li><strong>06:30</strong> – Agency in the Age of AI and Information</li><li><strong>09:32 </strong>– Policy Evolution in the Face of Rapid Technological Change</li><li><strong>13:43 </strong>– Building Trust Across Divided Sectors</li><li><strong>14:25</strong> – When institutions break down: adaptive leadership and finding windows of possibility</li><li><strong>18:10</strong> – Personal Journeys and Motivations in Leadership.&nbsp;</li><li><strong>22:48</strong> – Advice for Leaders Amidst Polarization</li><li><strong>23:20</strong> – Navigating polarized environments with shared values and pluralist frames</li><li><strong>28:10</strong> – Decision-Making Frameworks for Leaders</li><li><strong>30:15 </strong>– Fostering Healthy Tension in Leadership</li><li><strong>32:09 </strong>– Empowering Others and Agency in Leadership</li><li><strong>39:18 </strong>– The Ethics of Power and Responsibility</li></ul><h3><strong>Guest Bio and Links</strong></h3><p><strong>Dr. David Bray</strong> is a strategist and transformation leader working at the intersection of technology, policy, and complex change. 
Currently Distinguished Chair of the Accelerator at the Stimson Center and Principal at LeadDoAdapt Ventures, he’s led efforts ranging from bioterrorism preparedness to countering disinformation for U.S. Special Operations. A former FCC CIO and Executive Director for bipartisan national commissions, David has advised 12 startups, worked globally on the future of tech and data, and earned honors including the National Intelligence Exceptional Achievement Medal and CIO 100 Awards. He’s also served as Executive-in-Residence at Harvard and was named one of Business Insider’s “24 Americans Changing the World.”</p><ul><li><a href="https://www.linkedin.com/in/davidbray/" rel="noopener noreferrer" target="_blank">David Bray on LinkedIn</a></li><li><a href="https://www.cxotalk.com/bio/dr-david-a-bray-distinguished-chair-of-the-accelerator-stimson-center" rel="noopener noreferrer" target="_blank">CXO TALK</a></li><li><a href="https://www.stimson.org/ppl/david-bray/" rel="noopener noreferrer" target="_blank">Stimson Center – Bray’s Profile</a></li></ul><h3><strong>Resources Mentioned</strong></h3><ul><li><a href="https://peoplecentered.net/" rel="noopener noreferrer" target="_blank">People-Centered Internet Coalition</a> – Dr. Bray served as Executive Director for this initiative co-founded by Vint Cerf. 
It promotes digital infrastructure that empowers people.&nbsp;</li><li><a href="https://www.edelman.com/trust/trust-barometer" rel="noopener noreferrer" target="_blank">Edelman Trust Barometer</a> – He references the 2025 edition of the Edelman Trust Barometer, particularly noting statistics on global grievance and willingness to justify violence.</li><li><a href="https://en.wikipedia.org/wiki/Jean-Jacques_Rousseau" rel="noopener noreferrer" target="_blank">Rousseau’s Theory of Pluralities </a>– Bray refers to Rousseau’s idea that democracies require civic responsibility from at least 20% of people to function well—a power-law principle still relevant today.</li></ul><h3><strong>Further Reading / Related Episodes</strong></h3><ul><li><a href="https://termsofservice.xyz/episodes/who-watches-the-watchers-privacy-law-ai-and-pow-Q_Qd-RcKe9e/" rel="noopener noreferrer" target="_blank">Episode 11: "Who Watches the Watchers? Privacy Law, AI, and Power"</a></li><li><a href="https://termsofservice.xyz/episodes/the-great-disruption-building-human-centered-digi-goDcF1Nff0b/" rel="noopener noreferrer" target="_blank">Episode 8: "The Great Disruption: Building Human-Centered Digital Futures"</a></li><li><a href="https://termsofservice.xyz/episodes/regenerating-social-fabric-and-innovating-governance-YqXAtMJCkT2/" rel="noopener noreferrer" target="_blank">Episode 5: "Regenerating Social Fabric &amp; Innovating Governance"</a></li><li><a href="https://termsofservice.xyz/episodes/dynamics-of-digital-spaces-rethinking-democracy-o-7g_H6TrnPAq/" rel="noopener noreferrer" target="_blank">Episode 4: "Dynamics of Digital Spaces: Rethinking Democracy Online"</a></li></ul><h3><strong>Call to Action</strong></h3><p>How do we lead with courage and clarity when everything is changing? This conversation with Dr. 
David Bray offers a roadmap for leadership in uncertain times—grounded in systems thinking, public service, and a deep respect for human agency.</p><p>🎧 Listen now: <a href="https://termsofservice.xyz/episodes/mission-complexity-and-crisis-leading-in-a-rapidly-changing-world/rsUED0m6jwj/" rel="noopener noreferrer" target="_blank">Episode Link</a></p><h3><strong>Credits</strong></h3><p>Host: Mary Camacho</p><p>Guest: Dr. David Bray</p><p>Produced by <em>Terms of Service Podcast</em></p><p>Sound Design: Arthur Vincent and Sonor Lab</p><p>Co-Producers: Nicole Klau Ibarra &amp; Mary Camacho</p>]]>
    </description>
    <link>https://podcast.termsofservice.xyz/i/rsUED0m6jwj/</link>
    <itunes:image href="https://media.termsofservice.xyz/termsofservice-xyz/production/images/item-99466a7c0e03fc431592f39c8c825ff7.png"/>
    <itunes:title>Mission, Complexity, and Crisis: Leading in a Rapidly Changing World</itunes:title>
    <itunes:season>1</itunes:season>
    <itunes:episode>12</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <enclosure url="https://op3.dev/e/media.termsofservice.xyz/termsofservice-xyz/production/media/audio-9c47b83522dc09085edf75695b18d290.mp3" type="audio/mpeg" length="111366185"/>
    <itunes:duration>00:46:24</itunes:duration>
  </item>
  <item>
    <title>Who Watches the Watchers? Privacy Law, AI, and Power</title>
    <guid>Q_Qd-RcKe9e</guid>
    <pubDate>Tue, 03 Jun 2025 13:00:00 GMT</pubDate>
    <itunes:explicit>false</itunes:explicit>
    <description>
      <![CDATA[<h3><strong>Episode Summary</strong></h3><p>In this episode of <em>Terms of Service</em>, Mary Camacho sits down with William McGeveran—Dean of the University of Minnesota Law School and author of a leading privacy law casebook—to explore the evolving landscape of data protection, surveillance, and individual rights. With deep insights into both U.S. and European frameworks, McGeveran breaks down where current laws fall short, why consent alone doesn’t protect privacy, and how legal systems can (and should) evolve to meet the challenges posed by AI, big tech, and systemic data collection.</p><h3><strong>Key Takeaways</strong></h3><ul><li><strong>Most of the world—including the EU—follows a “data protection” model that assumes personal data must be protected on behalf of individuals.</strong> This gives people broad rights to know, limit, and contest how their data is collected and used. In contrast, the U.S. lacks a unified data protection framework. Instead, companies are largely free to collect and use personal data unless a specific law prohibits it—prioritizing institutional autonomy over individual rights.</li><li><strong>Consent is an inadequate foundation for privacy protection.</strong> Relying on individuals to understand and agree to complex data practices shifts responsibility away from those in power and undermines meaningful control.</li><li><strong>Legal design matters.</strong> Structural choices—like creating intentional silos for data—can strengthen protections rather than limit innovation.</li><li><strong>Data breaches are no longer unusual—they’re inevitable.</strong> But legal standards still play a critical role in enforcing accountability and incentivizing better security practices.</li><li><strong>Younger generations see privacy not as a personal failure but as a systemic issue.</strong> And they're looking for collective, enforceable solutions—not just more terms of service.</li></ul><h3><strong>Topics Covered / Timestamped 
Sections</strong></h3><ul><li><strong>01:39</strong> – From Capitol Hill to privacy casebooks: McGeveran’s path into data law.</li><li><strong>02:48</strong> – The wild west of the early internet and Lessig’s “Code”.</li><li><strong>04:32</strong> – Silos in surveillance and the importance of intentional data separation.</li><li><strong>08:00</strong> – Privacy law vs. data protection law: U.S. and EU’s contrasting assumptions.</li><li><strong>11:04</strong> – Why California's privacy laws are stronger—but still fundamentally U.S. in approach.</li><li><strong>14:11</strong> – Why it’s not “all over”: What legal protections still matter.</li><li><strong>17:33</strong> – Aggregation harms and why individuals can’t foresee long-term data consequences.</li><li><strong>24:03</strong> – How digital-native students view privacy today—and what gives them hope.</li><li><strong>27:00</strong> – Why privacy policies can’t be read, and how AI can help interpret them.</li><li><strong>35:30</strong> – GDPR’s global ripple effects and Max Schrems' legal victories.</li><li><strong>40:00</strong> – Casebooks, case studies, and how law students are shaping future data policy.</li><li><strong>41:45</strong> – Data breaches, legal gaps, and the human side of cybersecurity.</li><li><strong>50:35</strong> – AI is both revolutionary and familiar—and requires caution, not panic.</li></ul><h3><strong>Guest Bio and Links</strong></h3><p><strong>William McGeveran</strong> – William McGeveran was named the twelfth dean of the University of Minnesota Law School in 2024. He originally joined the faculty of Minnesota Law in 2006 and previously served as the interim dean and the associate dean for academic affairs. Dean McGeveran’s research focuses on information law, with particular focus on data privacy and trademark law. 
His scholarship in trademark law considers the balance between prevention of harmful consumer confusion and protection of valuable speech including parody, commentary, and comparative advertising. McGeveran is also the sole author of a casebook, Privacy and Data Protection Law, used by instructors at dozens of U.S. law schools. Dean McGeveran has been a resident fellow at the University of Minnesota Institute of Advanced Study, a visiting professor at University College Dublin School of Law, and an instructor in the Notre Dame Law School London Programme. He frequently speaks to the media, submits amicus briefs, works with policymakers, and teaches continuing legal education courses in his specialty areas. Dean McGeveran earned a J.D., magna cum laude, from New York University and a B.A., magna cum laude, in political science from Carleton College. While an undergraduate he spent one year as a nonmatriculated visiting student at Worcester College, Oxford. Prior to joining Minnesota Law, he was a resident fellow at the Berkman Center for Internet and Society at Harvard Law School. He previously clerked for Judge Sandra Lynch on the United States Court of Appeals for the First Circuit and practiced as an intellectual property litigator at Foley Hoag LLP in Boston. Before law school, Dean McGeveran worked in national politics for seven years, primarily as a senior legislative aide to then-Rep. 
Charles Schumer.</p><ul><li><a href="http://linkedin.com/in/william-mcgeveran-b119614" rel="noopener noreferrer" target="_blank">Follow William McGeveran on Linkedin</a>&nbsp;</li><li><a href="https://law.umn.edu/profiles/william-mcgeveran" rel="noopener noreferrer" target="_blank">Faculty Profile – University of Minnesota Law</a></li></ul><h3><strong>Resources Mentioned</strong></h3><ul><li><a href="https://gdpr-info.eu/" rel="noopener noreferrer" target="_blank">GDPR (General Data Protection Regulation) </a>– Europe’s landmark data privacy law</li><li><a href="https://oag.ca.gov/privacy/ccpa" rel="noopener noreferrer" target="_blank">California Consumer Privacy Act (CCPA)</a> – A leading example of enhanced U.S. state-level regulation.</li><li><a href="https://en.wikipedia.org/wiki/Max_Schrems" rel="noopener noreferrer" target="_blank">Max Schrems and NOYB</a> – Strategic litigation challenging EU-U.S. data sharing agreements.</li><li><a href="https://kb.osu.edu/server/api/core/bitstreams/a9510be5-b51e-526d-aea3-8e9636bc00cd/content" rel="noopener noreferrer" target="_blank">Carnegie Mellon - The Cost of Reading Privacy Policies Study</a> – Analysis of time required to read all privacy policies.</li><li><a href="https://www.amazon.com/Privacy-Data-Protection-University-Casebook-dp-164242112X/dp/164242112X/ref=dp_ob_title_bk" rel="noopener noreferrer" target="_blank">Privacy and Data Protection Law (University Casebook Series)</a></li></ul><h3><strong>Further Reading / Related Episodes</strong></h3><ul><li><a href="https://termsofservice.xyz/episodes/from-ai-anxiety-to-ip-integrity-navigating-rights-dFYKTIhrGDX/" rel="noopener noreferrer" target="_blank">Episode 1: "From AI Anxiety to IP Integrity: Navigating Rights in a Tech-Driven World"</a></li><li><a href="https://termsofservice.xyz/episodes/empowerment-tech-unlocking-customer-data-for-bett-W7nErd8CLMY/" rel="noopener noreferrer" target="_blank">Episode 3: “Empowerment Tech: Unlocking Customer Data for Better 
Choices and Better Business”</a></li></ul><h3><strong>Call to Action</strong></h3><p>Privacy isn't dead—but it is under pressure. If you're tired of shrugging at every “accept cookies” pop-up, this episode will help you rethink what’s possible through law, accountability, and systemic reform. Listen to Dean William McGeveran on how to reclaim digital dignity.</p><p>🎧 Listen now: <a href="https://termsofservice.xyz/episodes/who-watches-the-watchers-privacy-law-ai-and-power/Q_Qd-RcKe9e/" rel="noopener noreferrer" target="_blank">Episode Link</a></p><h3><strong>Credits</strong></h3><p>Host: Mary Camacho</p><p>Guest: William McGeveran</p><p>Produced by <em>Terms of Service Podcast</em></p><p>Sound Design: Arthur Vincent and Sonor Lab</p><p>Co-Producers: Nicole Klau Ibarra &amp; Mary Camacho</p>]]>
    </description>
    <link>https://podcast.termsofservice.xyz/i/Q_Qd-RcKe9e/</link>
    <itunes:image href="https://media.termsofservice.xyz/termsofservice-xyz/production/images/item-ca404632b2e1b1fe102246fd9466f2b9.jpg"/>
    <itunes:title>Who Watches the Watchers? Privacy Law, AI, and Power</itunes:title>
    <itunes:season>1</itunes:season>
    <itunes:episode>11</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <enclosure url="https://op3.dev/e/media.termsofservice.xyz/termsofservice-xyz/production/media/audio-2ab02ee7de169cc26b22305475cb08a6.mp3" type="audio/mpeg" length="129686381"/>
    <itunes:duration>00:54:02</itunes:duration>
  </item>
  <item>
    <title>When Alexa Says Sorry: What We Risk When AI Sounds Human</title>
    <guid>KKgk5udJvBA</guid>
    <pubDate>Tue, 13 May 2025 12:00:00 GMT</pubDate>
    <itunes:explicit>false</itunes:explicit>
    <description>
      <![CDATA[<h3><strong>Episode Summary</strong></h3><p>In this episode of <em>Terms of Service</em>, host Mary Camacho speaks with Marisa Zalabak, an AI ethicist and psychologist who explores how our relationships with artificial intelligence impact emotional intelligence, learning, communication, and mental health. With a rich background in education, social justice, psychology, and theater arts, Marisa offers deep insights into the emotional and ethical implications of anthropomorphizing AI, the risks of synthetic empathy, and the importance of slowing down to ask better questions. Together, they unpack how emotional and cognitive habits are being shaped by our daily interactions with machines—and what it means for our shared future.</p><h3><strong>Key Takeaways</strong></h3><ul><li><strong>Anthropomorphizing AI</strong>—treating machines as if they are human—is natural but dangerous, especially when synthetic empathy (like chatbots saying “I’m sorry”) reinforces emotional trust in non-human systems.</li><li>Marisa emphasizes the importance of asking better questions about the tools we use, why we use them, and what long-term effects they may have.</li><li>Research shows people increasingly treat AI systems as coworkers or even confidants, which can affect trust, mental health, and social connection.</li><li>Systems like Alexa and humanoid AIs often reinforce gender bias, particularly when defaulted to women’s voices.</li><li>Encouraging digital literacy, slow learning, and psychological grounding helps individuals—and especially children—build healthy habits with technology.</li></ul><h3><strong>Topics Covered / Timestamped Sections</strong></h3><ul><li><strong>01:55</strong> – Marisa’s unconventional journey from performing arts to educational psychology to AI ethics</li><li><strong>05:48</strong> – Discovering AI and contributing to one of the first IEEE standards on human well-being in AI design.</li><li><strong>08:27</strong> – First deep AI encounter: 
conversing with the humanoid robot BINA48 and the psychology of human-machine interaction.</li><li><strong>13:22</strong> – Synthetic empathy and the blurry boundaries of trust in conversational AI.</li><li><strong>18:10</strong> – How politeness and pronouns affect human habits and communication patterns.</li><li><strong>21:45</strong> – Designing meaningful research on the emotional and psychological effects of AI.</li><li><strong>23:14</strong> – Children and AI: the real impacts of early and normalized interaction with synthetic personalities.</li><li><strong>33:31</strong> – Gendered AI voice assistants and their unintended social consequences.</li><li><strong>37:40</strong> – Why education should be an invitation to inquiry, not a race toward certainty.</li><li><strong>42:05</strong> – Breaking down complexity through “Aunt Dorothy” explanations and slow, focused inquiry.</li></ul><h3><strong>Guest Bio and Links</strong></h3><p><strong>Marisa Zalabak</strong> is an AI ethicist, psychologist, and thought leader specializing in responsible AI, education, sustainability, and human well-being. Her talks emphasize adaptive leadership, ethical innovation, and climate action through sustainable practices. A two-time TEDx and international keynote speaker, Marisa has contributed to global forums such as Stratcom, the UN Summit of the Future, and AI House in Davos during the World Economic Forum. As Co-Founder of GADES (Global Alliance for Digital Education and Sustainability), Resident Fellow with The Digital Economist Center of Excellence, and faculty member at the Trocadéro Forum Institute, Marisa champions education that aligns responsible technology with regenerative design for human and planetary flourishing. Chairing IEEE's AI Ethics Education and Planet Positive 2030 initiatives, Marisa has co-authored ethical AI standards for human well-being with AI technologies. 
Collaborating across sectors with organizations like Microsoft, SAP, and Stanford University, Marisa addresses emerging issues in AI for a sustainable future.</p><ul><li><a href="https://www.marisazalabak.com" rel="noopener noreferrer" target="_blank">Marisa’s Website</a></li><li><a href="https://www.linkedin.com/in/marisazalabak/" rel="noopener noreferrer" target="_blank">Marisa’s LinkedIn</a></li><li><a href="https://www.instagram.com/mzalabak/" rel="noopener noreferrer" target="_blank">Marisa’s Instagram</a></li><li><a href="https://www.facebook.com/marisa.zalabak" rel="noopener noreferrer" target="_blank">Marisa’s Facebook</a></li><li><a href="https://www.ted.com/talks/marisa_zalabak_educational_fire_drills_for_flourishing" rel="noopener noreferrer" target="_blank">Marisa’s TEDx Talk</a></li><li><a href="https://ethicsinaction.ieee.org" rel="noopener noreferrer" target="_blank">IEEE Global Initiative on Ethics of Autonomous and Intelligent Systems</a></li></ul><h3><strong>Resources Mentioned</strong></h3><ul><li><a href="https://www.hansonrobotics.com/bina48-9/" rel="noopener noreferrer" target="_blank">BINA48</a> – One of the first advanced humanoids trained for human interaction and space exploration.</li><li><a href="https://www.dataexpertise.in/advancing-emotional-artificial-intelligence/" rel="noopener noreferrer" target="_blank">Synthetic Emotion in AI</a> – IEEE working group focused on standards for AI that emulates human emotions.</li><li><a href="https://www.brookings.edu/articles/how-ai-bots-and-voice-assistants-reinforce-gender-bias/" rel="noopener noreferrer" target="_blank">Digital Assistants &amp; Bias</a> – Ongoing research into how voice assistants perpetuate societal norms and stereotypes.</li></ul><h3><strong>Further Reading / Related Episodes</strong></h3><ul><li><a href="https://termsofservice.xyz/episodes/regenerating-social-fabric-and-innovating-governance-YqXAtMJCkT2/" rel="noopener noreferrer" target="_blank"><strong>Episode 5</strong>: 
“Regenerating Social Fabric &amp; Innovating Governance”</a></li></ul><h3><strong>Call to Action</strong></h3><p>How are your emotional habits being shaped by the tools you use every day? Marisa Zalabak invites us to slow down, ask better questions, and reimagine AI as a tool for well-being—not just productivity. Listen now and rethink the terms of service we accept in our digital lives.</p><p>🎧 Listen now:&nbsp;<a href="https://termsofservice.xyz/episodes/when-alexa-says-sorry-what-we-risk-when-ai-sounds-human/KKgk5udJvBA/" rel="noopener noreferrer" target="_blank">Episode Link</a></p><h3><strong>Credits</strong></h3><p>Host: Mary Camacho</p><p>Guest: Marisa Zalabak</p><p>Produced by <em>Terms of Service Podcast</em></p><p>Sound Design: Arthur Vincent and Sonor Lab</p><p>Co-Producers: Nicole Klau Ibarra &amp; Mary Camacho</p>]]>
    </description>
    <link>https://podcast.termsofservice.xyz/i/KKgk5udJvBA/</link>
    <itunes:image href="https://media.termsofservice.xyz/termsofservice-xyz/production/images/item-ed6af501f57cda58309eba93ba88f64b.jpg"/>
    <itunes:title>When Alexa Says Sorry: What We Risk When AI Sounds Human</itunes:title>
    <itunes:season>1</itunes:season>
    <itunes:episode>10</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <enclosure url="https://op3.dev/e/media.termsofservice.xyz/termsofservice-xyz/production/media/audio-c44cfaa98085c208b8fb5c97cae83f82.mp3" type="audio/mpeg" length="113126838"/>
    <itunes:duration>00:47:08</itunes:duration>
  </item>
  <item>
    <title>Beyond the Dataset: Building Human-Centered Research</title>
    <guid>3ZuIxiFkys2</guid>
    <pubDate>Tue, 22 Apr 2025 13:00:00 GMT</pubDate>
    <itunes:explicit>false</itunes:explicit>
    <description>
      <![CDATA[<h3><strong>Episode Summary</strong></h3><p>In this episode of <em>Terms of Service</em>, host Mary Camacho speaks with Elizabeth Eagen, Deputy Director of the Citizens and Technology (CAT) Lab at Cornell University. Elizabeth shares how her work in human rights led her to explore the impact of emerging technologies on civil society, and how citizen science can be used to shape better digital spaces. Together, they discuss algorithmic bias, data ownership, community-driven research, and how regulation often lags behind both the harm and the science. With sharp insights and powerful stories, Elizabeth unpacks the complex dynamics of platform accountability, participatory research, and digital equity.</p><h3><strong>Key Takeaways</strong></h3><ul><li>The CAT Lab supports communities in generating their own research questions and data to investigate online life, shifting power dynamics away from institutions.</li><li>Effective regulation of technology (e.g. AI hiring algorithms) requires better alignment between legal timelines and scientific inquiry.</li><li>Community-led research can influence both platform behavior and public policy—while honoring participant agency.</li><li>The human cost of data loss or mismanagement is real—whether in human rights documentation or everyday digital life.</li><li>Human values like mutual aid and accountability must be embedded into technological systems and policy frameworks to ensure equity and resilience.</li></ul><h3><strong>Topics Covered / Timestamped Sections</strong></h3><ul><li><strong>00:48</strong> – Introduction to Elizabeth Eagen and the CAT Lab’s mission: “citizen science for the internet.”</li><li><strong>02:53</strong> – Redefining research through community participation and shifting who owns the data.</li><li><strong>05:39</strong> – Institutional barriers to community-led science, including IRB processes and timeline mismatches.</li><li><strong>08:50</strong> – Real-world issues CAT Lab explores: 
algorithmic hiring bias, content moderation, and digital inclusion.</li><li><strong>11:20</strong> – Community impact: research results go first to participants, enabling operational improvements.</li><li><strong>12:52</strong> – Case Study: Local Law 144 and AI auditing for hiring discrimination in NYC.</li><li><strong>21:00</strong> – Regulation is often slower than technological impact—but still faster than science.</li><li><strong>23:40</strong> – Elizabeth’s journey from Human Rights Watch to building the Emerging Tech portfolio at Open Society Foundations.</li><li><strong>25:00</strong> – The responsibility to protect data as people—not just points.</li><li><strong>29:48</strong> – Tradeoffs in data ownership, portability, and government involvement.</li><li><strong>35:16</strong> – Digital identity and the folklore of “messifying” databases for privacy and security.</li><li><strong>37:08</strong> – The risks of corporate donations of tech tools to civil society—and the ethics of mutual dependency.</li></ul><h3><strong>Guest Bio and Links</strong></h3><p><strong>Elizabeth Eagen</strong> is Deputy Director of the Citizens and Technology Lab at Cornell University, which works with communities to study the effects of technology on society and test ideas for changing digital spaces to better serve the public interest, so that digital power is guided by evidence and accountable to the public. She was also a 2022-23 Practitioner Fellow at Stanford University’s Digital Civil Society Lab. Previously, she established and led the Emerging Technology portfolio at the Open Society Foundations’ Information Program. This initiative funded the use of emergent technologies in evidence and advocacy, building the role of knowledge management, and the use of data visualization tools, data science, statistics, and new media tactics by civil society and policymakers. 
She founded the Human Rights Data Initiative and led the Urbanization Working Group, which explored urbanization and open society through programming, research, and debate. She holds an MA/MPP in Public Policy and Russian and Eastern European Studies from the University of Michigan, and a BA from Macalester College.</p><ul><li><a href="https://citizensandtech.org" rel="noopener noreferrer" target="_blank">Citizens and Technology Lab</a></li><li><a href="http://www.linkedin.com/in/eeagen" rel="noopener noreferrer" target="_blank">Elizabeth on LinkedIn</a></li><li><a href="https://bsky.app/profile/eleag.bsky.social" rel="noopener noreferrer" target="_blank">Elizabeth on Bluesky</a></li></ul><h3><strong>Resources Mentioned</strong></h3><ul><li><a href="https://www.nyc.gov/assets/dca/downloads/pdf/about/DCWP-AEDT-FAQ.pdf" rel="noopener noreferrer" target="_blank">Local Law 144</a> – NYC legislation requiring audits for AI-based hiring tools.</li><li><a href="https://www.fda.gov/regulatory-information/search-fda-guidance-documents/institutional-review-boards-frequently-asked-questions" rel="noopener noreferrer" target="_blank">IRB (Institutional Review Board)</a> – Systems for ethical oversight of academic research.</li><li><a href="https://datasociety.net/" rel="noopener noreferrer" target="_blank">Data &amp; Society Research Institute</a> – Partner in algorithmic accountability research.</li><li><a href="https://www.opensocietyfoundations.org/" rel="noopener noreferrer" target="_blank">Open Society Foundations</a> – Supporting rights-based civil society through tech.</li></ul><h3><strong>Further Reading / Related Episodes</strong></h3><ul><li><a href="https://termsofservice.xyz/episodes/empowerment-tech-unlocking-customer-data-for-bett-W7nErd8CLMY/" rel="noopener noreferrer" target="_blank">Episode 3: "Empowerment Tech: Unlocking Customer Data for Better Choices and Better Business"</a></li><li><a
href="https://termsofservice.xyz/episodes/regenerating-social-fabric-and-innovating-governance-YqXAtMJCkT2/" rel="noopener noreferrer" target="_blank">Episode 5: Regenerating Social Fabric &amp; Innovating Governance</a></li><li><a href="https://termsofservice.xyz/episodes/breaking-the-binary-rethinking-law-power-and-po-fmeCP2iPMij/" rel="noopener noreferrer" target="_blank">Episode 7: "Breaking the Binary: Rethinking Law, Power, and Possibility"</a></li></ul><h3><strong>Call to Action</strong></h3><p>Want to know what it means to shift power in research, regulation, and digital life? Don’t miss this conversation with Elizabeth Eagen, and learn how citizen science can create more inclusive and accountable tech systems.</p><p>🎧 Listen now:&nbsp;<a href="https://termsofservice.xyz/episodes/Beyond-the-Dataset-Building-Human-Centered-Research/3ZuIxiFkys2/" rel="noopener noreferrer" target="_blank">Episode Link</a></p><h3><strong>Credits</strong></h3><p>Host: Mary Camacho</p><p>Guest: Elizabeth Eagen</p><p>Produced by <em>Terms of Service Podcast</em></p><p>Sound Design: Arthur Vincent and Sonor Lab</p><p>Co-Producers: Nicole Klau Ibarra &amp; Mary Camacho</p><p><br></p>]]>
    </description>
    <link>https://podcast.termsofservice.xyz/i/3ZuIxiFkys2/</link>
    <itunes:image href="https://media.termsofservice.xyz/termsofservice-xyz/production/images/item-cc213ced2e219f53c3bc0d28feb9276d.jpg"/>
    <itunes:title>Beyond the Dataset: Building Human-Centered Research</itunes:title>
    <itunes:season>1</itunes:season>
    <itunes:episode>9</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <enclosure url="https://op3.dev/e/media.termsofservice.xyz/termsofservice-xyz/production/media/audio-ab11209a0d4486d278b92ca22a1b252a.mp3" type="audio/mpeg" length="99447034"/>
    <itunes:duration>00:41:26</itunes:duration>
  </item>
  <item>
    <title>The Great Disruption: Building Human-Centered Digital Futures</title>
    <guid>goDcF1Nff0b</guid>
    <pubDate>Tue, 01 Apr 2025 13:00:00 GMT</pubDate>
    <itunes:explicit>false</itunes:explicit>
    <description>
      <![CDATA[<h3><strong>Episode Summary</strong></h3><p>In this episode of <em>Terms of Service</em>, host Mary Camacho sits down with Mei Lin Fung, co-founder of the People-Centered Internet and a pioneer in customer relationship management (CRM). Mei Lin shares her personal journey from Singapore to MIT, her role in shaping the CRM industry, and her commitment to ensuring that technology remains a tool for human flourishing. Together, they discuss the current “Great Disruption” brought on by digital transformation, the importance of community-driven technology, and why feedback and inclusion are key to resilient societies.</p><h3><strong>Key Takeaways</strong></h3><ul><li>Human collaboration is the foundation of societal survival and progress, magnified now by digital technologies.</li><li>The “Great Disruption” refers to the rapid changes in how humans interact due to digital transformation, with both challenges and opportunities.</li><li>Mei Lin’s career, including early CRM development and co-founding People-Centered Internet, emphasizes technology’s potential to empower rather than exploit.</li><li>Singapore’s example shows how numerate, inclusive governance and long-term investment can enable societies to thrive.</li><li>Effective leadership today requires participation, feedback systems, and learning from all voices—not just the powerful few.</li></ul><h3><strong>Topics Covered / Timestamped Sections</strong></h3><ul><li><strong>01:30</strong> – Introduction to Mei Lin Fung’s background and lifelong mission.</li><li><strong>03:30</strong> – Growing up in Singapore and the impact of investing in education and infrastructure.</li><li><strong>08:56</strong> – Early CRM days: Shaping the industry through community and engagement.</li><li><strong>12:30</strong> – Why she answers hundreds of questions on Quora: listening to people as a leadership practice.</li><li><strong>13:15</strong> – Defining the “Great Disruption” and how human collaboration is 
evolving.</li><li><strong>18:30</strong> – Lessons from Singapore’s digital governance and health policy success.</li><li><strong>21:00</strong> – Founding the People-Centered Internet: Technology as a tool for inclusion and equity.</li><li><strong>31:20</strong> – Building networks of communities to experiment and grow together.</li><li><strong>36:00</strong> – Policy leadership and shaping digital equity at the G7 level.</li><li><strong>38:15</strong> – Mei Lin’s magic wand wish: Making participation and feedback essential to future societies.</li></ul><h3><strong>Guest Bio and Links</strong></h3><p><strong>Mei Lin Fung</strong> – Co-founder, with Vint Cerf, of the People-Centered Internet in 2015, Mei Lin is a leading voice in digital public infrastructure for opportunity and community resilience. An early pioneer of customer relationship management (CRM) and ERP systems, she worked at Oracle with Marc Benioff, now CEO of Salesforce, and with Tom Siebel, who sold Siebel Systems, the first CRM company, to Oracle. She served as socio-technical lead for the U.S. Federal Health Futures initiative and was honored to be a business partner of Internet pioneer Douglas Engelbart. Mei Lin organized the 2024 UN Science Summit Digital Governance Series, which included the UN AI Report and a rare plenary by Turing Award winner Dr. Alan Kay. For the 50th anniversary of the Internet, she served as catalyst, initiator, and speaker for celebrations in Palo Alto, California (with IEEE), in London (with the Royal Society), and in Brussels (with I50Y, the youth celebration).
Today Mei Lin is the Chair of the Technical Committee on Sustainability for the IEEE Society on the Social Implications of Technology and serves as the liaison to the IEEE Industry Engagement Committee for IEEE-USA.</p><ul><li><a href="https://peoplecentered.net" rel="noopener noreferrer" target="_blank">People-Centered Internet</a></li><li><a href="https://www.quora.com/profile/Mei-Lin-Fung/" rel="noopener noreferrer" target="_blank">Follow Mei Lin on Quora</a></li></ul><h3><strong>Resources Mentioned</strong></h3><ul><li><a href="https://dougengelbart.org/content/view/191/" rel="noopener noreferrer" target="_blank">Douglas Engelbart’s concept of Networked Improvement Communities</a></li><li><a href="https://www.nobelprize.org/prizes/economic-sciences/2024/press-release/" rel="noopener noreferrer" target="_blank">Nobel Prize-winning research on social institutions and economic development</a></li></ul><h3><strong>Further Reading / Related Episodes</strong></h3><ul><li><a href="https://termsofservice.xyz/episodes/dynamics-of-digital-spaces-rethinking-democracy-o-7g_H6TrnPAq/" rel="noopener noreferrer" target="_blank">Episode 4: “Dynamics of Digital Spaces: Rethinking Democracy Online”</a></li><li><a href="https://termsofservice.xyz/episodes/regenerating-social-fabric-and-innovating-governance-YqXAtMJCkT2/" rel="noopener noreferrer" target="_blank">Episode 5: “Regenerating Social Fabric &amp; Innovating Governance”</a></li></ul><h3><strong>Call to Action</strong></h3><p>Curious about how to navigate the Great Disruption and build people-centered digital futures?
Listen to Mei Lin Fung’s inspiring insights and learn how inclusion, feedback, and collaboration can reshape our digital society.</p><p>🎧 Listen now: <a href="https://termsofservice.xyz/episodes/the-great-disruption-building-human-centered-digital-futures/goDcF1Nff0b/" rel="noopener noreferrer" target="_blank">Episode Link</a></p><h3><strong>Credits</strong></h3><p>Host: Mary Camacho</p><p>Guest: Mei Lin Fung</p><p>Produced by <em>Terms of Service Podcast</em></p><p>Sound Design: Arthur Vincent and Sonor Lab</p><p>Co-Producers: Nicole Klau Ibarra &amp; Mary Camacho</p>]]>
    </description>
    <link>https://podcast.termsofservice.xyz/i/goDcF1Nff0b/</link>
    <itunes:image href="https://media.termsofservice.xyz/termsofservice-xyz/production/images/item-bdd7152c8311422af01e693fc16d19bf.jpg"/>
    <itunes:title>The Great Disruption: Building Human-Centered Digital Futures</itunes:title>
    <itunes:season>1</itunes:season>
    <itunes:episode>8</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <enclosure url="https://op3.dev/e/media.termsofservice.xyz/termsofservice-xyz/production/media/audio-29c98a381702f0c9ac96d8f52fde18ef.mp3" type="audio/mpeg" length="98617469"/>
    <itunes:duration>00:41:05</itunes:duration>
  </item>
  <item>
    <title>Breaking the Binary: Rethinking Law, Power, and Possibility</title>
    <guid>fmeCP2iPMij</guid>
    <pubDate>Tue, 11 Mar 2025 14:00:00 GMT</pubDate>
    <itunes:explicit>false</itunes:explicit>
    <description>
      <![CDATA[<h3><strong>Episode Summary</strong></h3><p>In this episode of <em>Terms of Service</em>, host Mary Camacho speaks with Helen Slottje, an award-winning attorney and co-founder of the Regenerative Law Institute. Helen shares her journey from corporate law to leading a groundbreaking legal movement against fracking, which earned her the Goldman Environmental Prize. They discuss the deeper patterns of power and control in legal systems, how governance structures enforce extractive models, and the need for transformative legal frameworks that align with natural systems. Helen’s work challenges conventional legal thinking, moving beyond fixing broken systems to designing entirely new paradigms for governance and community resilience.</p><h3><strong>Key Takeaways</strong></h3><ul><li><strong>Beyond Extractive Systems</strong>: Legal and governance structures often reinforce power imbalances, prioritizing control over coherence.</li><li><strong>The Fracking Fight as a Model for Change</strong>: Helen’s legal strategy helped shift an "inevitable" industry into an impossible one, leading to New York’s fracking ban.</li><li><strong>Predator-Prey Dynamics in Law</strong>: Legal systems replicate extraction-based power structures, often reinforcing historical violence rather than challenging it.</li><li><strong>Regenerative Law vs. 
Sustainability</strong>: The goal is not just to sustain broken systems but to design new legal structures that support thriving, decentralized communities.</li><li><strong>Reframing Ownership and Control</strong>: From nonprofit governance to alternative currencies, emerging models challenge the idea that control must always be centralized.</li></ul><h3><strong>Topics Covered / Timestamped Sections</strong></h3><ul><li><strong>01:19</strong> - Introduction to Helen Slottje and her shift from corporate law to environmental law.</li><li><strong>03:19</strong> - The fight against fracking in New York: How legal strategy led to a statewide ban.</li><li><strong>07:42</strong> - The Power of Patterns &amp; Systemic Control – How binary thinking limits our ability to create real change.</li><li><strong>10:07</strong> - The Role of Narratives in Governance &amp; Social Change – The importance of redefining governance beyond the current nation-state model.</li><li><strong>13:45</strong> - Challenging the Predator-Prey Dynamic: How society reinforces power imbalances and why we must shift toward mutual thriving.</li><li><strong>22:38</strong> - Hope, Community Building, &amp; Mobilizing Change: The lessons from the fracking fight on organizing and redefining what’s possible.</li><li><strong>33:09</strong> - The Evolutionary Leap: Shifting Consciousness – Why we need new frameworks to break conventional thinking and evolve.</li><li><strong>35:50</strong> - The Process of Creating Transformational Change – The importance of starting with small groups before scaling change.</li><li><strong>40:40</strong> - The Future of Law &amp; Governance: Quantum Thinking – Applying nature, physics, and alternative governance models to drive systemic change.</li><li><strong>42:33</strong> - Helen’s "magic wand" wish: Creating language and legal tools to make transformative governance accessible.</li></ul><h3><strong>Guest Bio and Links</strong></h3><p><strong>Helen
Slottje</strong> is a Harvard-educated lawyer and a recipient of the Goldman Environmental Prize (the ‘Green Nobel’). As the founder of the Regenerative Law Institute, Helen helps leaders navigate high-pressure challenges with coherence and emergent design rather than brute force. At the core of her work is the conviction that real solutions emerge by leveraging pressure as a catalyst, embracing coherence with nature’s patterns, and making quantum leaps beyond the limits of conventional thinking.</p><ul><li><a href="https://www.regenerativelaw.com/" rel="noopener noreferrer" target="_blank">Regenerative Law Institute</a></li><li><a href="https://www.goldmanprize.org/" rel="noopener noreferrer" target="_blank">Goldman Environmental Prize</a></li><li><a href="https://www.linkedin.com/in/helenslottje/" rel="noopener noreferrer" target="_blank">LinkedIn</a></li><li><a href="https://www.instagram.com/hslottje?igsh=MTFhbWR2bjdvNGpwbQ==" rel="noopener noreferrer" target="_blank">Instagram</a></li></ul><h3><strong>Resources Mentioned</strong></h3><ul><li><a href="https://www.goldmanprize.org/recipient/helen-slottje/#recipient-bio" rel="noopener noreferrer" target="_blank">Fracking and Local Bans</a> - The legal strategy that led to New York’s statewide fracking prohibition.</li><li><a href="https://www.gsb.stanford.edu/faculty-research/publications/alternative-models-governance" rel="noopener noreferrer" target="_blank">Alternative Governance Models</a> - Expanding democratic decision-making structures.</li><li><a href="https://en.wikipedia.org/wiki/Mimetic_theory" rel="noopener noreferrer" target="_blank">René Girard’s Mimetic Theory</a> - Understanding power through hidden cycles of violence and control.</li></ul><h3><strong>Further Reading / Related Episodes</strong></h3><ul><li><strong>Episode 1</strong>: <a href="https://termsofservice.xyz/episodes/from-ai-anxiety-to-ip-integrity-navigating-rights-dFYKTIhrGDX/" rel="noopener noreferrer" target="_blank">"From AI
Anxiety to IP Integrity: Navigating Rights in a Tech-Driven World"</a></li><li><strong>Episode 3</strong>: <a href="https://termsofservice.xyz/episodes/dynamics-of-digital-spaces-rethinking-democracy-o-7g_H6TrnPAq/" rel="noopener noreferrer" target="_blank">"Dynamics of Digital Spaces: Rethinking Democracy Online"</a></li><li><strong>Episode 5</strong>: <a href="https://termsofservice.xyz/episodes/regenerating-social-fabric-and-innovating-governance-YqXAtMJCkT2/" rel="noopener noreferrer" target="_blank">"Regenerating Social Fabric &amp; Innovating Governance"</a></li></ul><h3><strong>Call to Action</strong></h3><p>Can legal systems evolve beyond extractive models? Listen to Helen Slottje’s transformative insights on law, governance, and power—and explore how regenerative law might shape the future.</p><p>🎧 Listen now: <a href="https://termsofservice.xyz/episodes/breaking-the-binary-rethinking-law-power-and-possibility/fmeCP2iPMij/" rel="noopener noreferrer" target="_blank">Episode Link</a></p><h3><strong>Credits</strong></h3><p>Host: Mary Camacho</p><p>Guest: Helen Slottje</p><p>Produced by <em>Terms of Service Podcast</em></p><p>Sound Design: Arthur Vincent and Sonor Lab</p><p>Co-Producers: Nicole Klau Ibarra &amp; Mary Camacho</p>]]>
    </description>
    <link>https://podcast.termsofservice.xyz/i/fmeCP2iPMij/</link>
    <itunes:image href="https://media.termsofservice.xyz/termsofservice-xyz/production/images/item-8cad30482f4f519096cbae74270debf5.jpg"/>
    <itunes:title>Breaking the Binary: Rethinking Law, Power, and Possibility</itunes:title>
    <itunes:season>1</itunes:season>
    <itunes:episode>7</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <enclosure url="https://op3.dev/e/media.termsofservice.xyz/termsofservice-xyz/production/media/audio-df07e40c1a62fe1e8fb48e3a1b3120a7.mp3" type="audio/mpeg" length="52705583"/>
    <itunes:duration>00:54:54</itunes:duration>
  </item>
  <item>
    <title>Constitutional Rights, Tech Governance, and Power Structures</title>
    <guid>nxnsRR4dW1i</guid>
    <pubDate>Tue, 18 Feb 2025 12:50:00 GMT</pubDate>
    <itunes:explicit>false</itunes:explicit>
    <description>
      <![CDATA[<p>In this episode of <em>Terms of Service</em>, host Mary Camacho welcomes Nora Mbagathi, Executive Director of the Katiba Institute, for a deep dive into constitutional rights, technology governance, and power dynamics in Kenya and beyond. They explore how constitutions function as the "terms of service" of a society, shaping citizen rights and responsibilities. Nora highlights the risks posed by centralized digital identity systems, the role of transnational corporations in shaping the digital landscape, and the importance of grassroots activism in defending constitutional protections.</p><h3><strong>Key Takeaways</strong></h3><ul><li><strong>Constitutions as Societal Contracts</strong>: Just like digital terms of service, constitutions define the relationship between citizens and power structures.</li><li><strong>Kenya’s 2010 Constitution</strong>: A strong rights-based document that emerged from political unrest, yet faces implementation challenges due to literacy gaps and power imbalances.</li><li><strong>Tech Governance in the Global South</strong>: Digital ID systems, centralized data collection, and lack of local tech solutions create unique vulnerabilities.</li><li><strong>Extractive Tech Models</strong>: Nairobi is often called "the Silicon Valley of Africa," but many systems prioritize corporate interests over community empowerment.</li><li><strong>Listening as a Solution</strong>: Instead of imposing external solutions, policymakers and tech companies need to engage meaningfully with affected communities.</li></ul><h3><strong>Topics Covered / Timestamped Sections</strong></h3><ul><li><strong>00:49</strong> - Introduction to Nora Mbagathi and her journey from human rights law to constitutional implementation.</li><li><strong>05:33</strong> - The role of constitutions in protecting citizens and the Katiba Institute’s mission.</li><li><strong>07:46</strong> - Kenya’s 2010 Constitution: A turning point in governance after election 
violence.</li><li><strong>12:31</strong> - Constitutional literacy: Why some citizens benefit while others remain unaware of their rights.</li><li><strong>16:23</strong> - The intersection of constitutional rights and technology governance.</li><li><strong>20:25</strong> - The role of centralized digital ID systems and their risks.</li><li><strong>25:14</strong> - The myth of Nairobi as the "Silicon Valley of Africa"—who really benefits?</li><li><strong>30:53</strong> - The dangers of centralization vs. the potential of decentralized identity solutions.</li><li><strong>36:04</strong> - The importance of designing technology with privacy, transparency, and equality at its core.</li><li><strong>40:25</strong> - Building international coalitions to challenge corporate and governmental overreach.</li></ul><h3><strong>Guest Bio and Links</strong></h3><p><strong>Nora Mbagathi</strong> is the Executive Director at Katiba Institute in Kenya. She is a qualified lawyer in multiple jurisdictions and has worked in human rights campaigning and strategic litigation for over ten years. Nora has participated in cases relating to digital ID, platform accountability, criminal justice, and the right to nationality in Kenya. 
Prior to joining Katiba Institute, Nora was a senior lawyer with the Open Society Justice Initiative, based in London.</p><ul><li><a href="https://x.com/NoraMbagathi" rel="noopener noreferrer" target="_blank">Nora Mbagathi on X</a>&nbsp;</li><li><a href="https://katibainstitute.org/" rel="noopener noreferrer" target="_blank">Katiba Institute Website</a></li><li><a href="https://x.com/KatibaInstitute" rel="noopener noreferrer" target="_blank">Katiba Institute on X</a></li></ul><h3><strong>Resources Mentioned</strong></h3><ul><li><a href="http://www.parliament.go.ke/sites/default/files/2023-03/The_Constitution_of_Kenya_2010.pdf" rel="noopener noreferrer" target="_blank">Kenya’s 2010 Constitution</a> - A landmark rights-based document.</li><li><a href="https://g0v.tw/intl/en/" rel="noopener noreferrer" target="_blank">GovZero</a> - A movement in Taiwan promoting citizen-driven government accountability.</li><li><a href="https://en.wikipedia.org/wiki/Digital_identity" rel="noopener noreferrer" target="_blank">Digital ID Systems</a> - Centralized identity databases and their risks in Kenya.</li></ul><h3><strong>Further Reading / Related Episodes</strong></h3><ul><li><a href="https://termsofservice.xyz/episodes/dynamics-of-digital-spaces-rethinking-democracy-o-7g_H6TrnPAq/" rel="noopener noreferrer" target="_blank">Episode 4: "Dynamics of Digital Spaces: Rethinking Democracy Online"</a></li><li><a href="https://termsofservice.xyz/episodes/regenerating-social-fabric-and-innovating-governance-YqXAtMJCkT2/" rel="noopener noreferrer" target="_blank">Episode 5: "Regenerating Social Fabric &amp; Innovating Governance"</a></li></ul><h3><strong>Call to Action</strong></h3><p>How can we ensure technology serves citizens rather than undermining their rights? 
Listen to this thought-provoking conversation with Nora Mbagathi and join the discussion on tech governance, rights, and digital power structures.</p><p>🎧 Listen now: <a href="https://podcast.termsofservice.xyz/i/nxnsRR4dW1i/" rel="noopener noreferrer" target="_blank">Episode Link</a></p><h3><strong>Credits</strong></h3><p>Host: Mary Camacho</p><p>Guest: Nora Mbagathi</p><p>Produced by <em>Terms of Service Podcast</em></p><p>Sound Design: Arthur Vincent and Sonor Lab</p><p>Co-Producers: Nicole Klau Ibarra &amp; Mary Camacho</p>]]>
    </description>
    <link>https://podcast.termsofservice.xyz/i/nxnsRR4dW1i/</link>
    <itunes:image href="https://media.termsofservice.xyz/termsofservice-xyz/production/images/item-45b3924a6088703c035e24e17bc5d893.jpg"/>
    <itunes:title>Constitutional Rights, Tech Governance, and Power Structures</itunes:title>
    <itunes:season>1</itunes:season>
    <itunes:episode>6</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <enclosure url="https://op3.dev/e/media.termsofservice.xyz/termsofservice-xyz/production/media/audio-c146e08c13354e4353a6d350719cbd60.mp3" type="audio/mpeg" length="103926512"/>
    <itunes:duration>00:43:18</itunes:duration>
  </item>
  <item>
    <title>Regenerating Social Fabric &amp; Innovating Governance</title>
    <guid>YqXAtMJCkT2</guid>
    <pubDate>Tue, 28 Jan 2025 13:22:00 GMT</pubDate>
    <itunes:explicit>false</itunes:explicit>
    <description>
<![CDATA[<p>In this episode of <em>Terms of Service</em>, host Mary Camacho speaks with Glen Weyl, founder of the Plurality Institute and the RadicalxChange Foundation. They explore transformative ideas such as quadratic voting, collaborative governance, and the importance of regenerating the social fabric in a polarized world. Glen shares insights from his work with Taiwan’s innovative digital governance models and his collaboration with Audrey Tang, offering a vision for a more interconnected, pluralistic digital future.</p><h3><strong>Key Takeaways</strong></h3><ul><li>Social media should shift from being divisive to regenerative, so as to rebuild the social fabric that sustains it.</li><li>Quadratic voting enables nuanced decision-making, fostering consensus while embracing diversity.</li><li>Taiwan’s digital governance innovations demonstrate the power of collective action and participatory design.</li><li>Technology must integrate with human communities to create sustainable systems that enrich rather than erode social foundations.</li><li>The Plurality framework focuses on bridging divides and creating richer, interconnected digital ecosystems.</li></ul><h3><strong>Topics Covered / Timestamped Sections</strong></h3><ul><li><strong>01:20</strong> - Introduction to Glen Weyl and his interdisciplinary work.</li><li><strong>04:33</strong> - Regenerating social media to build stronger, pro-social networks.</li><li><strong>07:43</strong> - Glen’s personal journey and the power of embracing contradictions.</li><li><strong>12:00</strong> - Quadratic voting: How it works and its applications in blockchain and governance.</li><li><strong>17:50</strong> - Collaboration with Audrey Tang and the creation of Plurality.</li><li><strong>24:26</strong> - Reimagining social media to foster community awareness and connection.</li><li><strong>27:22</strong> - Taiwan’s transformative governance practices: GovZero, Polis, and digital competence
education.</li><li><strong>34:14</strong> - A magic wand for change: Connecting global leaders with Audrey Tang to inspire collaborative solutions.</li></ul><h3><strong>Guest Bio and Links</strong></h3><p><strong>Glen Weyl</strong> - Founder and Research Lead of the Microsoft Research Plural Technology Collaboratory, Founder and Board Member of the RadicalxChange Foundation, and Founder and Chair of the Plurality Institute. Co-author with Eric Posner of Radical Markets: Uprooting Capitalism and Democracy for a Just Society, with Audrey Tang and dozens of open source collaborators of ⿻ 數位 Plurality: The Future of Collaborative Technology and Democracy, and with Puja Ohlhaver and Vitalik Buterin of "Decentralized Society: Finding Web3's Soul". Executive Producer of "Good Enough Ancestor".</p><h3><strong>Resources Mentioned</strong></h3><ul><li><a href="https://plurality.net/" rel="noopener noreferrer" target="_blank">Plurality Website</a></li><li><a href="https://en.wikipedia.org/wiki/Quadratic_voting" rel="noopener noreferrer" target="_blank">Quadratic Voting</a> - A voting method allowing individuals to express preferences with intensity.</li><li><a href="https://g0v.tw/intl/en/" rel="noopener noreferrer" target="_blank">GovZero</a> - A movement creating open-source alternatives to government services.</li><li><a href="https://pol.is/home" rel="noopener noreferrer" target="_blank">Polis</a> - A participatory platform for identifying shared values across divides.</li><li><a href="https://bit.ly/geancestor" rel="noopener noreferrer" target="_blank">Good Enough Ancestor Film Trailer</a></li></ul><h3><strong>Further Reading / Related Episodes</strong></h3><ul><li><a href="https://termsofservice.xyz/episodes/dynamics-of-digital-spaces-rethinking-democracy-o-7g_H6TrnPAq/" rel="noopener noreferrer" target="_blank">Episode 4: “Dynamics of Digital Spaces: Rethinking Democracy Online”</a></li></ul><h3><strong>Call to Action</strong></h3><p>Discover how collaborative governance 
and innovative tools can shape a pluralistic future. Listen to Glen Weyl’s vision for social regeneration and join the Plurality movement to help build a better digital world.</p><ul><li>X: <a href="https://x.com/glenweyl" rel="noopener noreferrer" target="_blank">@glenweyl</a></li></ul><p>🎧 Listen now: <a href="https://termsofservice.xyz/episodes/regenerating-social-fabric-and-innovating-governance-YqXAtMJCkT2/" rel="noopener noreferrer" target="_blank">Episode Link</a></p><h3><strong>Credits</strong></h3><p>Host: Mary Camacho</p><p>Guest: Glen Weyl</p><p>Produced by <em>Terms of Service Podcast</em></p><p>Sound Design: Arthur Vincent and Sonor Lab</p><p>Co-Producers: Nicole Klau Ibarra &amp; Mary Camacho</p>]]>
    </description>
    <link>https://podcast.termsofservice.xyz/i/YqXAtMJCkT2/</link>
    <itunes:image href="https://media.termsofservice.xyz/termsofservice-xyz/production/images/item-d2c92518bfd1505ab7a2a0dbc255d7d8.jpg"/>
    <itunes:title>Regenerating Social Fabric &amp; Innovating Governance</itunes:title>
    <itunes:season>1</itunes:season>
    <itunes:episode>5</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <enclosure url="https://op3.dev/e/media.termsofservice.xyz/termsofservice-xyz/production/media/audio-1464faeb7042987970015666149fd315.mp3" type="audio/mpeg" length="91327132"/>
    <itunes:duration>00:38:03</itunes:duration>
  </item>
  <item>
    <title>Dynamics of Digital Spaces: Rethinking Democracy Online</title>
    <guid>7g_H6TrnPAq</guid>
    <pubDate>Tue, 07 Jan 2025 16:04:26 GMT</pubDate>
    <itunes:explicit>false</itunes:explicit>
    <description>
      <![CDATA[<p>In this episode of <em>Terms of Service</em>, host Mary Camacho engages with Nathan Schneider, assistant professor of media studies at the University of Colorado Boulder and director of the Media Economies Design Lab. Together, they explore the challenges of building digital communities, the nuanced relationship between decentralisation and centralised power, and the need for democratic practices in online spaces. Nathan shares insights from his latest book, <em>Governable Spaces</em>, and explains how governance, technology, and collective action intersect to create better digital futures.</p><h3><strong>Key Takeaways</strong></h3><ul><li>Decentralisation often coexists with centralised structures, creating power dynamics that require deliberate accountability mechanisms.</li><li>Early internet protocols left governance vacuums, enabling centralised economic powers to dominate the digital landscape.</li><li>Treating online spaces with the same intentional governance as offline communities can empower users and address systemic challenges.</li><li>Democratic practices and community-driven governance frameworks can build resilient and equitable online spaces.</li><li>Tools like community rules and open protocols provide pathways for collaborative decision-making in digital environments.</li></ul><h3><strong>Topics Covered / Timestamped Sections</strong></h3><ul><li><strong>01:05</strong> - Introduction to Nathan Schneider and the Media Economies Design Lab.</li><li><strong>03:58</strong> - From religion to technology: Nathan’s journey to studying decentralised governance.</li><li><strong>08:38</strong> - Lessons from Occupy Wall Street and Ethereum: Empowering movements with digital tools.</li><li><strong>13:14</strong> - Implicit feudalism in online spaces: How admin power structures shape digital governance.</li><li><strong>17:52</strong> - The interplay between decentralisation and centralised authority in online 
systems.</li><li><strong>25:01</strong> - Friction as a teacher: Why democratic governance in tech requires investment and experimentation.</li><li><strong>34:27</strong> - Imagining equitable online sovereignty and collaborative digital communities.</li><li><strong>36:50</strong> - Local control in digital spaces: Lessons from social movements and cooperative models.</li></ul><h3><strong>Guest Bio and Links</strong></h3><p><strong>Nathan Schneider</strong> is an assistant professor of media studies at the University of Colorado Boulder, where he leads the Media Economies Design Lab and the MA program in Media and Public Engagement. He is the author of four books, most recently Governable Spaces: Democratic Design for Online Life, published by University of California Press in 2024, and Everything for Everyone: The Radical Tradition that Is Shaping the Next Economy, published by Bold Type Books in 2018. He edited Vitalik Buterin’s book Proof of Stake: The Making of Ethereum and the Philosophy of Blockchains and co-edited Beautiful Solutions: A Toolbox for Liberation and Ours to Hack and to Own: The Rise of Platform Cooperativism, a New Vision for the Future of Work and a Fairer Internet. Recent scholarship has been published in New Media &amp; Society, Feminist Media Studies, the Georgetown Law Technology Review, and Media, Culture &amp; Society, among other journals. He has also reported for publications including Harper’s, The Nation, The New Republic, The Chronicle of Higher Education, The New York Times, The New Yorker, and others, along with regular columns for America, a national Catholic magazine. He has lectured at universities including Columbia, Fordham, Harvard, MIT, NYU, the University of Bologna, and Yale. 
He serves on the boards of Metagov, Start.coop, and Waging Nonviolence.&nbsp;</p><h3><strong>Resources Mentioned</strong></h3><ul><li><a href="https://en.wikipedia.org/wiki/Section_230" rel="noopener noreferrer" target="_blank">Section 230</a> - A pivotal U.S. policy shaping platform liability and governance.</li><li><a href="https://www.colorado.edu/lab/medlab/" rel="noopener noreferrer" target="_blank">Media Economies Design Lab</a> - CU Boulder initiative fostering equitable governance models for online communities.</li></ul><h3><strong>Further Reading / Related Episodes</strong></h3><ul><li><strong>Episode 2</strong>: "<a href="https://115ffbb9.termsofservice.pages.dev/episodes/beyond-honeypots-privacy-security-and-the-futur-YjzvQ-RnEQE/" rel="noopener noreferrer" target="_blank">Beyond Honeypots: Privacy, Security, and the Future of Distributed Webs</a>"</li><li><strong>Episode 3</strong>: "<a href="https://115ffbb9.termsofservice.pages.dev/episodes/empowerment-tech-unlocking-customer-data-for-bett-W7nErd8CLMY/" rel="noopener noreferrer" target="_blank">Empowerment Tech: Unlocking Customer Data for Better Choices and Better Business</a>"</li></ul><h3><strong>Call to Action</strong></h3><p>Want to explore how we can build better digital communities? 
Listen to Nathan Schneider’s vision for governable spaces and find out how collective action can shape the future of the internet.</p><ul><li><a href="https://nathanschneider.info/" rel="noopener noreferrer" target="_blank">Nathan’s Website</a></li><li><a href="https://nathanschneider.info/books/governable-spaces/" rel="noopener noreferrer" target="_blank">Governable Spaces</a> – Learn more about his latest book.</li></ul><p>🎧 Listen now: <a href="https://termsofservice.xyz/episodes/dynamics-of-digital-spaces-rethinking-online-demo-7g_H6TrnPAq/" rel="noopener noreferrer" target="_blank">Episode Link</a></p><h3><strong>Credits</strong></h3><p>Host: Mary Camacho</p><p>Guest: Nathan Schneider</p><p>Produced by <em>Terms of Service Podcast</em></p><p>Sound Design: Arthur Vincent and Sonor Lab</p><p>Co-Producers: Nicole Klau Ibarra &amp; Mary Camacho</p>]]>
    </description>
    <link>https://podcast.termsofservice.xyz/i/dynamics-of-digital-spaces-rethinking-democracy-o-7g_H6TrnPAq/</link>
    <itunes:image href="https://media.termsofservice.xyz/termsofservice-xyz/production/images/item-57acade3fecbd5fd09e804515f6ac149.jpg"/>
    <itunes:title>Dynamics of Digital Spaces: Rethinking Democracy Online</itunes:title>
    <itunes:season>1</itunes:season>
    <itunes:episode>4</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <enclosure url="https://op3.dev/e/media.termsofservice.xyz/termsofservice-xyz/production/media/audio-f448dc274525bc8f5a354eb09e793213.mp3" type="audio/mpeg" length="123487001"/>
    <itunes:duration>00:51:27</itunes:duration>
  </item>
  <item>
    <title>Empowerment Tech: Unlocking Customer Data for Better Choices and Better Business</title>
    <guid>W7nErd8CLMY</guid>
    <pubDate>Tue, 17 Dec 2024 18:29:37 GMT</pubDate>
    <itunes:explicit>false</itunes:explicit>
    <description>
      <![CDATA[<p>In this episode of <em>Terms of Service</em>, host Mary Camacho welcomes Jamie Smith, founder of <em>Customer Futures Limited</em>, to explore the concept of empowerment tech. They dive into how giving customers control over their digital identities, data, and verified credentials can reshape customer experiences, unlock new business models, and enhance trust. Jamie shares practical examples and insights on digital wallets, identity attributes, and how AI intersects with customer empowerment in a rapidly evolving digital landscape.</p><h3><strong>Key Takeaways</strong></h3><ul><li><strong>Empowerment Tech</strong>: Tools that allow individuals to control, manage, and share their data, creating better outcomes for both customers and businesses.</li><li><strong>Digital Wallets</strong>: Evolving from payments to storing verified credentials like identity, loyalty status, and permissions – enabling trusted, seamless experiences.</li><li><strong>Trust vs. Privacy</strong>: Addressing concerns over big tech dominance while advocating for open-source, trusted wallet solutions.</li><li><strong>AI and Identity</strong>: The rise of AI agents will require wallets and digital identity systems to defend against fraud and ensure trust.</li><li><strong>Empowering customers</strong> with data reduces fraud, improves personalization, and stitches together fragmented experiences across businesses.</li></ul><h3><strong>Topics Covered / Timestamped Sections</strong></h3><ul><li><strong>01:47</strong> - Introduction to Jamie Smith and his 15-year journey into digital identity and empowerment tech.</li><li><strong>04:00</strong> - From "big data" to "little data": Shifting focus to customer-controlled data and its business value.</li><li><strong>08:50</strong> - Defining empowerment tech: Digital identity, verified facts, and better business outcomes.</li><li><strong>11:16</strong> - Creating new business models that benefit customers while improving governance and 
trust.</li><li><strong>16:03</strong> - The power shift: Why businesses can now trust data presented by individuals.</li><li><strong>23:14</strong> - Digital wallets: What they are, how they work, and the growing ecosystem of wallet providers.</li><li><strong>32:55</strong> - Trusting wallet providers: Big tech dominance, regulation, and open-source alternatives.</li><li><strong>37:42</strong> - Metadata, privacy, and how wallet usage may inadvertently expose patterns of behavior.</li><li><strong>44:53</strong> - The intersection of wallets and AI: Preparing for agent-centric AI and identity verification.</li></ul><h3><strong>Guest Bio and Links</strong></h3><p><strong>Jamie Smith</strong> is the Founder of Customer Futures Ltd, an advisory firm helping businesses seize the opportunity around disruptive and customer-empowering digital propositions. He has been working at the forefront of digital identity and customer-controlled personal data for nearly 15 years, and has led breakthrough innovation across a range of digital ventures at both large enterprises and startups. He brings a breadth of experience across product, commercial, technology and strategy. Jamie is passionate about creating new value with personal data, and empowering the consumer with new digital tools that work for them. 
He writes about the next billion-dollar market category - Empowerment Tech - at www.customerfutures.com.</p><ul><li><a href="https://www.linkedin.com/in/jamiedsmith/" rel="noopener noreferrer" target="_blank">Jamie’s LinkedIn</a></li><li><a href="https://www.customerfutures.com/" rel="noopener noreferrer" target="_blank">Customer Futures Newsletter</a> – Deep insights on digital identity, data empowerment, and emerging trends.</li></ul><h3><strong>Resources Mentioned</strong></h3><ul><li><strong>European eIDAS 2.0</strong> - A regulatory framework for digital identity and wallets in the EU.</li><li><strong>Open Wallet Foundation</strong> - An initiative to build secure, open-source digital wallets.</li><li><strong>Anthropic’s Claude AI</strong> - Agent-centric AI capable of automating workflows.</li><li><strong>Strava Case Study</strong> - Revealing metadata risks in location-based apps.</li></ul><h3><strong>Further Reading / Related Episodes</strong></h3><ul><li><strong>Episode 1</strong>: "From AI Anxiety to IP Integrity: Navigating Rights in a Tech-Driven World"</li><li><strong>Episode 2</strong>: "Beyond Honeypots: Privacy, Security, and the Future of Distributed Webs"</li></ul><h3><strong>Call to Action</strong></h3><p>Want to know how digital identity and AI are shaping the future? Listen to the full episode and subscribe for more insights on tech, trust, and innovation. Don’t forget to <a href="https://www.customerfutures.com/" rel="noopener noreferrer" target="_blank">follow Jamie’s newsletter</a> for expert perspectives on digital empowerment!</p><h3>Credits</h3><p>Host: Mary Camacho</p><p>Guest: Jamie Smith</p><p><em>Terms of Service Podcast produced by Mary Camacho &amp; Nicole Klau Ibarra</em></p>]]>
    </description>
    <link>https://podcast.termsofservice.xyz/i/empowerment-tech-unlocking-customer-data-for-bet-W7nErd8CLMY/</link>
    <itunes:image href="https://media.termsofservice.xyz/termsofservice-xyz/production/images/item-be27d6aaec1a16f7d9b1e6a39a41fc6d.jpg"/>
    <itunes:title>Empowerment Tech: Unlocking Customer Data for Better Choices and Better Business</itunes:title>
    <itunes:season>1</itunes:season>
    <itunes:episode>3</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <enclosure url="https://op3.dev/e/media.termsofservice.xyz/termsofservice-xyz/production/media/audio-8807e9eb4aab62c73c832776c9892adc.mp3" type="audio/mpeg" length="115526969"/>
    <itunes:duration>00:48:08</itunes:duration>
  </item>
  <item>
    <title>Beyond Honeypots: Privacy, Security, and the Future of Distributed Webs</title>
    <guid>YjzvQ-RnEQE</guid>
    <pubDate>Tue, 26 Nov 2024 14:00:00 GMT</pubDate>
    <itunes:explicit>false</itunes:explicit>
    <description>
      <![CDATA[<p>In this episode of&nbsp;<em>Terms of Service</em>, host Mary Camacho interviews Liz Steininger, CEO of Least Authority, to discuss privacy, security, and decentralization in the tech space. They explore the principles of least authority, the challenges of security in blockchain and Web3, and the delicate balance between convenience, transparency, and privacy. Liz shares insights into security-by-design, user empowerment, and the journey of implementing zero-knowledge proofs.</p><h3>Key Takeaways</h3><ul><li>The principle of "least authority" minimizes access to prevent vulnerabilities in tech systems.</li><li>Transparency vs. privacy: striking a balance is key to user trust and security.</li><li>Zero-knowledge proofs (like those explained in the&nbsp;<em>MoonMath Manual</em>) offer powerful tools for encryption without sacrificing usability.</li><li>Distributed systems reduce the risks of centralized honeypots but increase the responsibility of endpoints for security.</li><li>Marketing can overshadow security concerns in blockchain projects, highlighting the need for better industry standards.</li></ul><h3>Topics Covered / Timestamped Sections</h3><ul><li><strong>01:15</strong>&nbsp;- Introduction to Liz and Least Authority’s mission of "security and privacy for all."</li><li><strong>02:40</strong>&nbsp;- Origins of the company and the principle of least authority.</li><li><strong>08:52</strong>&nbsp;- Decentralization and its impact on user agency and security.</li><li><strong>10:40</strong>&nbsp;- Common misconceptions about internet security and privacy.</li><li><strong>19:32</strong>&nbsp;- Designing security that balances user convenience with effectiveness.</li><li><strong>23:00</strong>&nbsp;- The&nbsp;<em>MoonMath Manual</em>&nbsp;and its role in democratizing zero-knowledge proofs for developers.</li><li><strong>35:11</strong>&nbsp;- Evaluating blockchain projects: signals for trustworthy security 
practices.</li><li><strong>42:50</strong>&nbsp;- Liz’s "magic wand" wish for the tech industry: fostering openness about security failures.</li></ul><h3>Guest Bio and Links</h3><p><strong>Liz Steininger</strong> is an advocate for privacy and security in technology, and she leads efforts to create tools that empower users while pushing for stronger security standards across the industry. Liz is the CEO and managing director of Least Authority, a leading Web3 security consulting company and builder of privacy-enhancing technology products. The company focuses on cutting-edge security and on empowering users to control their right to privacy. It specializes in securing Web3 products, consulting and auditing for capability-based security, and implementing advanced cryptography, especially zero-knowledge proofs and multi-party computations.</p><p>Liz is an experienced entrepreneur, having grown Least Authority over the last seven years. Prior to that, she was the senior program manager at the Open Technology Fund. She has over 22 years of experience in the tech industry, working on numerous projects at the edge of innovation. 
She has an MS in management and technology and a bachelor's degree in digital media.&nbsp;</p><ul><li><a href="https://www.linkedin.com/in/lizsteininger/" rel="noopener noreferrer" target="_blank">LinkedIn Profile</a></li><li><a href="https://leastauthority.com/" rel="noopener noreferrer" target="_blank">Least Authority Website</a></li><li><a href="https://x.com/liz315" rel="noopener noreferrer" target="_blank">Twitter</a></li></ul><h3>Resources Mentioned</h3><ul><li><strong>Zero-Knowledge Proofs</strong>&nbsp;- A cryptographic technique enabling data to be validated without revealing the data itself.</li><li><strong><em>MoonMath Manual</em></strong>&nbsp;- A guide to creating zero-knowledge SNARKs using only high school math:&nbsp;<a href="https://github.com/LeastAuthority/moonmath-manual" rel="noopener noreferrer" target="_blank">Read More</a></li></ul><h3>Call to Action</h3><p>Did this episode get you thinking about privacy and security? Subscribe to&nbsp;<em>Terms of Service</em>&nbsp;for more conversations like this, and share your thoughts with us using #PrivacyMatters and #TermsOfServicePodcast.</p><h3>Credits</h3><p>Host: Mary Camacho</p><p>Guest: Liz Steininger</p><p>Terms of Service Podcast&nbsp;Produced by: Mary Camacho &amp; Nicole Klau Ibarra</p>]]>
    </description>
    <link>https://podcast.termsofservice.xyz/i/YjzvQ-RnEQE/</link>
    <itunes:image href="https://media.termsofservice.xyz/termsofservice-xyz/production/images/item-461b9803011c06217575d8aa4995cff2.jpg"/>
    <itunes:title>Beyond Honeypots: Privacy, Security, and the Future of Distributed Webs</itunes:title>
    <itunes:season>1</itunes:season>
    <itunes:episode>2</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <enclosure url="https://op3.dev/e/media.termsofservice.xyz/termsofservice-xyz/production/media/audio-ed445a198b0b2be25959047abfbf0b35.mp3" type="audio/mpeg" length="107206446"/>
    <itunes:duration>00:44:40</itunes:duration>
  </item>
  <item>
    <title>From AI Anxiety to IP Integrity: Navigating Rights in a Tech-Driven World</title>
    <guid>dFYKTIhrGDX</guid>
    <pubDate>Fri, 08 Nov 2024 07:30:00 GMT</pubDate>
    <itunes:explicit>false</itunes:explicit>
    <description>
      <![CDATA[<p>In this episode, host Mary Camacho sits down with Van Lindberg, a legal expert in technology and intellectual property, to explore the shifting landscape of AI and creator rights. They dive into how generative AI impacts the rights of creators, the legal nuances behind content training, and the fears surrounding AI’s growing influence on creative industries.</p><h3>Key Takeaways</h3><ul><li>Adobe's generative AI tool, Firefly, initially raised concerns about user content being used for AI training, sparking debates about trust and IP rights.</li><li>Many fears around AI stem from potential shifts in traditional business models, not just direct copying.</li><li>Generative AI doesn't "store" works but uses statistical measurements to create new content, distinguishing it from traditional copyright infringements.</li><li>The implications of AI extend beyond just art or writing; technology disrupts industries by altering skill demands and business models.</li></ul><h3>Topics Covered / Timestamped Sections</h3><ul><li><strong>00:01</strong>&nbsp;- Introduction to Van Lindberg and the legal changes with AI</li><li><strong>00:47</strong>&nbsp;- Adobe’s Firefly and controversy over AI training on user-uploaded content</li><li><strong>03:43</strong>&nbsp;- Subscription-based business models and user trust</li><li><strong>05:19</strong>&nbsp;- The difference between business model shifts and AI-specific issues</li><li><strong>07:51</strong>&nbsp;- Copyright law’s stance on learning from existing works vs. copying</li><li><strong>10:03</strong>&nbsp;- The need for defining AI in nuanced terms for clearer policies</li></ul><h3>Guest Bio and Links</h3><p><strong>Van Lindberg</strong> is an intellectual property attorney with Taylor English specializing in the intersection of computer technology and law. Mr. 
Lindberg has been named one of “America’s Top 12 Techiest Attorneys” by the American Bar Association Journal and was recognized as one of the world’s top IP strategists by Intellectual Asset Management magazine.</p><p>Mr. Lindberg specializes in open-source law and strategy, which has been his major focus for more than twenty-five years. He is the author of O’Reilly’s “Intellectual Property and Open Source,” and co-editor of the Open Source casebook. He also serves with many open-source foundations as a board member and legal counsel.</p><p>Mr. Lindberg is an expert in the emerging field of AI law, where he helps clients with the legal issues associated with creating and training machine learning models, using machine learning models to develop systems with novel capabilities, and using inference to generate new works.</p><p>In addition to Mr. Lindberg’s legal work, he is the founder of OSPOCO, the Open Source Program Office-as-a-Service company that pairs technical and community expertise with legal oversight and expertise. He develops natural language processing tools in his spare time and prefers programming in Python.</p><ul><li><a href="https://www.linkedin.com/in/van-lindberg/" rel="noopener noreferrer" target="_blank">LinkedIn Profile</a></li><li><a href="https://twitter.com/vanlindberg" rel="noopener noreferrer" target="_blank">Twitter</a></li></ul><h3>Resources Mentioned</h3><ul><li>Adobe's Firefly and Adobe Stock Photo Library</li><li>Legal definitions of AI in policy and IP law</li><li>Historical context of AI since the 1950s</li></ul><h3>Call to Action</h3><p>Enjoyed this episode? 
Subscribe, rate, and share to support the show on Apple Podcasts, Spotify, PocketCast, or wherever you listen.</p><p>Follow us on <a href="https://www.linkedin.com/company/podcast-terms-of-service/" rel="noopener noreferrer" target="_blank">LinkedIn</a> for updates and join the conversation.</p><h3>Credits</h3><p>Host: Mary Camacho</p><p>Guest: Van Lindberg</p><p>Terms of Service Podcast&nbsp;Produced by: Mary Camacho &amp; Nicole Klau Ibarra</p><p>Music, Sound &amp; Editing: Arthur Vincent at Sonorlab</p>]]>
    </description>
    <link>https://podcast.termsofservice.xyz/i/from-ai-anxiety-to-ip-integrity-navigating-rights-dFYKTIhrGDX/</link>
    <itunes:image href="https://media.termsofservice.xyz/termsofservice-xyz/production/images/item-ce1c38c29cd9622e69b85747f7ffef62.jpg"/>
    <itunes:title>From AI Anxiety to IP Integrity: Navigating Rights in a Tech-Driven World</itunes:title>
    <itunes:season>1</itunes:season>
    <itunes:episode>1</itunes:episode>
    <itunes:episodeType>full</itunes:episodeType>
    <enclosure url="https://op3.dev/e/media.termsofservice.xyz/termsofservice-xyz/production/media/audio-6b0763cb1294c99dc7c92f4657f2926d.mp3" type="audio/mpeg" length="95846316"/>
    <itunes:duration>00:39:56</itunes:duration>
  </item>
  <item>
    <title>Sample - Terms of Service Podcast</title>
    <guid>9WgYzyhusLj</guid>
    <pubDate>Tue, 01 Oct 2024 09:59:00 GMT</pubDate>
    <itunes:explicit>false</itunes:explicit>
    <description>
      <![CDATA[<h2>Welcome to the Terms of Service Podcast</h2><p>In this sample trailer, Mary chats with Catherine Stihler, former MEP from Scotland and former Executive Director of Creative Commons.</p>]]>
    </description>
    <link>https://podcast.termsofservice.xyz/i/trailer-1-terms-of-service-podcas-9WgYzyhusLj/</link>
    <itunes:image href="https://media.termsofservice.xyz/termsofservice-xyz/production/images/item-27e3c692f552adaaea03820f52fd1b28.jpg"/>
    <itunes:title>Welcome to Terms of Service, hosted by Mary Camacho</itunes:title>
    <itunes:episodeType>trailer</itunes:episodeType>
    <enclosure url="https://op3.dev/e/media.termsofservice.xyz/termsofservice-xyz/production/media/audio-771ecb0527ecf37f5a50412c1ce4bb64.mp3" type="audio/mpeg" length="1713482"/>
    <itunes:duration>00:02:58</itunes:duration>
  </item>
</channel>
</rss>