<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>Algorithms - Coda Story</title>
	<atom:link href="https://www.codastory.com/tag/algorithms/feed/" rel="self" type="application/rss+xml" />
	<link>https://www.codastory.com/tag/algorithms/</link>
	<description>stay on the story</description>
	<lastBuildDate>Thu, 30 Apr 2026 15:25:39 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.9.4</generator>

<image>
	<url>https://www.codastory.com/wp-content/uploads/2019/07/cropped-LogoWeb2021Transparent-1-32x32.png</url>
	<title>Algorithms - Coda Story</title>
	<link>https://www.codastory.com/tag/algorithms/</link>
	<width>32</width>
	<height>32</height>
</image> 
<site xmlns="com-wordpress:feed-additions:1">239620515</site>
	<item>
		<title>&#8220;All my fundees have blue eyes.&#8221; Epstein and the tech world&#8217;s dark ideology</title>
		<link>https://www.codastory.com/authoritarian-tech/blue-eyes-epstein-artificial-intelligence-eugenics-silicon-valley/</link>
		
		<dc:creator><![CDATA[Isobel Cockerell]]></dc:creator>
		<pubDate>Tue, 28 Apr 2026 18:15:02 +0000</pubDate>
				<category><![CDATA[Authoritarian Technology]]></category>
		<category><![CDATA[Algorithms]]></category>
		<category><![CDATA[Anti-science]]></category>
		<category><![CDATA[Artificial intelligence]]></category>
		<category><![CDATA[Feature]]></category>
		<category><![CDATA[Oligarchy]]></category>
		<guid isPermaLink="false">https://www.codastory.com/?p=63628</guid>

					<description><![CDATA[<p>The Epstein files reveal beliefs about race, eugenics, and engineering humans that run to the heart of Silicon Valley.</p>
<p>The post <a href="https://www.codastory.com/authoritarian-tech/blue-eyes-epstein-artificial-intelligence-eugenics-silicon-valley/">&#8220;All my fundees have blue eyes.&#8221; Epstein and the tech world&#8217;s dark ideology</a> appeared first on <a href="https://www.codastory.com">Coda Story</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<figure class="wp-block-video"><video height="720" style="aspect-ratio: 1280 / 720;" width="1280" autoplay loop muted poster="https://www.codastory.com/wp-content/uploads/2026/04/eye_mp4_std.original.jpg" src="https://videos.files.wordpress.com/WTFSNpE3/eye.mp4" playsinline></video></figure>



<p>It starts with a simple search term in the Department of Justice’s <a href="https://www.justice.gov/epstein">Epstein Library</a>. “Blue eyes.” Hundreds of results. Jeffrey Epstein’s international trafficking agents send him pictures and descriptions of blue-eyed young girls: potential victims to be dispatched to his various homes. “I spotted two skinny blond blue eyes 21 years old ladies in Monaco last weekend and asked them for CVs,” one agent, whose name has been redacted, wrote. “Trying her best to move from her small town to Moscow; English isn't great. Could be fun for Paris, blue eyes,” wrote another. “Can't understand if her breast is real. Otherwise very pretty and sweet…Very blue eyes as we like.”&nbsp;</p>



<p>One of Epstein’s victims wrote of being chosen for her eye color in a journal entry later shared with federal prosecutors. "Superior gene pool?!? Why me?" she wrote, describing Epstein's worldview as "Nazi like." "It makes no sense. Why my hair color and eye color?"&nbsp;</p>





<p>Epstein — himself blue-eyed — seemed to prefer both his victims, and the people he bankrolled, to have blue eyes. “All of my fundees have blue eyes,” he <a href="https://www.justice.gov/epstein/files/DataSet%2011/EFTA02453821.pdf">boasted</a> in one email. In the entryway of his Manhattan townhouse, he displayed dozens of prosthetic eyeballs in a frame. Epstein made <a href="https://www.justice.gov/epstein/files/DataSet%209/EFTA00863704.pdf">notes</a> and <a href="https://www.justice.gov/epstein/files/DataSet%2011/EFTA02554047.pdf">sent</a> article links to his contacts asking if having blue eyes meant you were more intelligent or a “genius”. He even had a <a href="https://www.justice.gov/epstein/files/DataSet%209/EFTA01192599.pdf">list</a> of scientists and tech leaders with blue eyes — including Elon Musk, Peter Thiel, and Google’s Ray Kurzweil. “Total — 70 people Blue eyes — 41 Unclear (might be blue, but not 100% sure)” the list says. Appearing in the files — whether on this list or elsewhere in Epstein's records — does not imply legal wrongdoing.<br><br>Going deeper into the files, Epstein and his network of contacts <a href="https://www.justice.gov/epstein/files/DataSet%209/EFTA00654948.pdf">discussed</a> beliefs about how physical characteristics and race might denote intelligence. They <a href="https://www.justice.gov/epstein/files/DataSet%209/EFTA00654948.pdf">exchanged</a> emails about population control. They spoke of engineering women’s sex <a href="https://stanforddaily.com/2026/02/03/former-stanford-professor-nathan-wolfe-92-planned-sexual-behavior-research-described-interns-with-epstein/">drives</a>, <a href="https://www.justice.gov/epstein/files/DataSet%209/EFTA01003966.pdf">building</a> designer babies, and living in a world full of superintelligent humans that could merge with robots. 
They spoke of getting rid of the <a href="https://www.justice.gov/epstein/files/DataSet%209/EFTA00823256.pdf">elderly</a>, the infirm, and the <a href="https://jmail.world/thread/vol00009-efta01156952-pdf">poor.</a></p>



<p>The files offer a glimpse into a world where ideas about eugenics and race science have never gone away. On the contrary, they run through our elite universities, through the most powerful companies in Silicon Valley, and through the tech industry itself. Epstein’s was an exclusive club that counted among its members people who harbor dreams of re-engineering human minds and bodies, seizing control of our collective future, and building technology that, they hope, will one day merge with — or even replace — all of us.</p>



<figure class="wp-block-image size-full"><img src="https://www.codastory.com/wp-content/uploads/2026/04/ja.png" alt="" class="wp-image-63663"/><figcaption class="wp-element-caption">Jeffrey Epstein, 27. Jeffrey Epstein's mansion El Brillo Way in Palm Beach. U.S. Virgin Islands, Department of Justice, Sexual Offender Registry Photograph.</figcaption></figure>



<p>In 2002, two decades before the launch of ChatGPT, Epstein hosted an Artificial Intelligence summit on his Caribbean island. In the years that followed, he cultivated close, regular contact with a network of (predominantly male) scientists, researchers, academics and tech leaders working at the vanguard of AI, biotech, genetics and cognitive science, meeting them at universities like Harvard and at his various homes.</p>



<p>In August 2018, a year before Epstein was found dead in his jail cell, he was in email correspondence with software consultant and bitcoin investor Bryan Bishop about funding a project to create “designer babies” — children with genes cherrypicked for their looks, health, strength, immune systems, sleep needs and even, in Bishop’s imaginings, abilities to live on a different planet.&nbsp;</p>



<p>&nbsp;&nbsp;“Attached is the doc you requested, it's the "use of funds" spreadsheet for the designer baby and human cloning company,” Bishop <a href="https://www.justice.gov/epstein/files/DataSet%209/EFTA01003966.pdf">wrote</a> to Epstein. “This gets us out of our self-funded ‘garage biology’ phase to the first live birth of a human designer baby, and possibly a human clone, within 5 years. Once we reach the first birth, everything changes and the world will never be the same again.”<br><br>Bishop went on to discuss how his ultimate ambition was to make “practically unlimited modifications to the cells before generating an embryo.”<br><br>In response to a request for comment, Bishop <a href="https://diyhpl.us/wiki/designer-baby-faq/">sent</a> Coda a publicly available set of answers to frequently asked questions about designer babies.</p>



<p>“The reason people have an aversion to eugenics, and rightfully so, is because countries used genocide and sterilization to prevent reproduction by populations that they didn’t like. We have no intention of doing anything of the sort,” Bishop writes in the public FAQ. “‘Designer baby’ simply describes a child whose genome has been intentionally altered or chosen by their parents, rather than left entirely to the genetic lottery of natural conception.”</p>



<p>“It’s such a great subject,” Epstein <a href="https://www.justice.gov/epstein/files/DataSet%209/EFTA01019549.pdf">responded</a> after he read Bishop’s proposal. “We need to get a read on legal. Can’t do anything where US rules apply to US citizens regardless of where [they are].”&nbsp;</p>



<p>Building a super-race of humans and parachuting humanity into a different evolutionary era — or even obsoleting the human race as we know it — is a running theme in the Epstein files, and an increasingly prominent ambition for tech evangelists today.<br><br>“It’s eugenics all the way down,” said Jacob Metcalf, a founding partner at Ethical Resolve, a consulting firm working with tech companies to develop their ethics protocols. A common fantasy in tech circles, he said, is “to essentially control human destiny. And a lot of the times that human destiny is for humans to be replaced. That's the really bleak thing here. What could be more eugenic than getting rid of humans.”</p>



<figure class="wp-block-gallery alignfull has-nested-images columns-default is-cropped wp-block-gallery-2 is-layout-flex wp-block-gallery-is-layout-flex">
<figure class="wp-block-image size-large"><img data-id="63644" src="https://www.codastory.com/wp-content/uploads/2026/04/Gif1-1800x1013.gif" alt="" class="wp-image-63644"/></figure>



<figure class="wp-block-image size-large"><img data-id="63645" src="https://www.codastory.com/wp-content/uploads/2026/04/Gif2-1800x1013.gif" alt="" class="wp-image-63645"/></figure>
</figure>



<p>In 2008, Epstein began conversations with the computer scientist Ben Goertzel. Over the years, Epstein would <a href="https://bengoertzel.substack.com/p/goertzel-vs-epstein">send</a> Goertzel more than $360,000 to fund the researcher’s plans to build towards Artificial General Intelligence (AGI), a term Goertzel himself popularized.<br><br>“I remain eager to move forward on working together to accelerate progress toward a human-obsoleting thinking machine,” Goertzel <a href="https://jmail.world/thread/3493d5a2cacca3edaeee1c6f08e678c9?view=inbox">wrote</a> to Epstein in May 2008. Eighteen years on, the idea of obsoleting humans with artificial intelligence is widely discussed in the tech world.</p>



<p>When asked to comment on his exchange with Epstein, Goertzel told Coda: “I do think we will create forms of transhuman intelligence going beyond the scope of humanity as we know it, but I also very much hope and envision a strong role for humans even after this happens.”<br><br>Goertzel went on to describe a future where the world reaches the “Singularity” — a Silicon Valley buzzword signifying a tipping point where AI surpasses human intelligence. “I do think AI will eventually gain its own superhuman autonomy, but I think this can happen in a way that respects and nourishes human life rather than being harmful to it,” he said. “Epstein and I discussed this face to face a few times and indeed I was a bigger fan of the human species than he was, and more optimistic about its flourishing post-Singularity.”</p>



<p>In an email to Epstein, Goertzel laid out a scenario where AI systems would start running their own economic activity. He envisioned this Artificial Intelligence economy acting as a “parasite to overcome the regular human economy” that would eventually “gain its own superhuman autonomy.” The ideas Epstein and Goertzel exchanged mirror a broader conversation unfolding in the tech world, one that imagines a future where human labour is rendered superfluous, ultimately replaced by artificial intelligence and robots.</p>



<p>Together, Goertzel and Epstein also discussed modifying human brains — a concept popular in Silicon Valley today, where numerous brain-computer projects are researching ways to cognitively enhance the human brain, and alter human personality, memory, and mental capabilities.<br><br>In 2008, when Epstein told Goertzel he was “off to jail” for a year, after he was convicted of soliciting a minor for prostitution, Goertzel suggested his problems might one day be solved if human brains could be re-programmed.</p>



<figure class="wp-block-image size-large"><img src="https://www.codastory.com/wp-content/uploads/2026/04/GettyImages-2245894269-1800x1183.jpg" alt="" class="wp-image-63682"/><figcaption class="wp-element-caption">Ben Goertzel with Desdemona the robot, at a tech event in Portugal in November. Sam Barnes/Sportsfile for Web Summit via Getty Images.</figcaption></figure>



<p>“According to my understanding, the girls you were involved with were old enough to know what they were doing, so society really has no ‘moral right’ to lock you up,” Goertzel <a href="https://jmail.world/thread/c8cfe07576b67908cc5cebafd1a37207?view=inbox">wrote</a> to Epstein. “This is a fucked-up society we live in. But past ones have really been no better -- the fault is really w/ the human brain architecture, which is precisely what I'm aiming to supercede in my AGI work.”<br><br>When asked to comment on these remarks — and in particular the implication that Epstein’s problems might be solved if his accusers' brains were one day re-engineered — Goertzel told Coda: “This was a general observation that the messed-up nature of our society generally is rooted in the way our brains have evolved... and that advanced tech will let us modify our brains to make ourselves and thus our society better.&nbsp; There was no implication intended (nor stated) that women’s brains are any more or less messed up or in need of improvement than men’s.”<br><br>Goertzel reflected that his comments on Epstein’s victims being “old enough” were “regrettable and unfortunate in hindsight,” adding that his impression was that Epstein had been involved with adult women, not “disgustingly curating high school students for sexual purposes. I should have paid more attention.”&nbsp;</p>



<p>In 2013, three and a half years after Epstein was released from jail, Goertzel approached Epstein for funding to build a “<a href="https://jmail.world/thread/vol00009-efta00700552-pdf">toddler robot</a>”. Given Epstein’s criminal history of abusing minors, this has inevitably attracted attention online. “When we were discussing measuring the IQ of robot toddlers, the topic was never sexualized in any way,” Goertzel told Coda when asked about the project. “While I had nothing to do with Epstein's perverse sexual tastes or abuse of women, what I have read about his awful doings in the newspapers relates to his interest in teenage girls not toddlers.”</p>



<p>Epstein was particularly interested in funding projects that built — like Goertzel’s — on transhumanist theories. Transhumanism is a worldview that captivates many of the most prominent tech leaders in Silicon Valley today. Its adherents believe in a future in which the human body can be endlessly altered, genetically engineered, and ultimately fused with artificial intelligence.</p>



<p>“Transhumanism is a much more radical concept than eugenics,” explained Timnit Gebru, a computer scientist and researcher who has <a href="https://firstmonday.org/ojs/index.php/fm/article/view/13636">written</a> extensively about eugenicist ideas within artificial intelligence. “In eugenics, you're trying to create a more superior human by breeding humans through generations. In transhumanism, you're trying to get rid of humans altogether.”</p>



<p>For transhumanists, she added, “their idea is to get rid of any undesirable properties they see with humans."</p>



<p>Perhaps the most well-known proponent of transhumanism in the Epstein files is Peter Thiel.<br><br>“I think you would prefer the human race to endure, right?” New York Times journalist Ross Douthat <a href="https://archive.is/qY99g#selection-617.0-617.12">asked</a> Thiel last year. “Uh—,” Thiel said. “This is a long hesitation!” Douthat said. “Should the human race survive?” “Yes, but I would like us to radically solve these problems,” Thiel said. “We want you to be able to change your heart and change your mind and change your whole body.”</p>



<figure class="wp-block-image size-large"><img src="https://www.codastory.com/wp-content/uploads/2026/04/Peter_Thiel-1800x1200.jpg" alt="" class="wp-image-63672"/><figcaption class="wp-element-caption">Peter Thiel. Creative Commons (CC BY 2.0) /Gage Skidmore.</figcaption></figure>



<p>Thiel’s name appears in the files more than 2000 times, and Epstein <a href="https://www.nytimes.com/2025/06/04/business/jeffrey-epstein-peter-thiel-estate.html">reportedly</a> invested some $40 million into Valar Ventures, a firm co-founded by Thiel. The two <a href="https://www.justice.gov/epstein/files/DataSet%2010/EFTA01738574.pdf">spoke</a> of building secret societies and shared an interest in transhumanism and cryogenics — Epstein <a href="https://www.nytimes.com/2019/07/31/business/jeffrey-epstein-eugenics.html">wanted</a> to freeze his brain and penis when he died, so that one day he could be revived, while Thiel has also <a href="https://fortune.com/2023/05/04/peter-thiel-cryonics-cryogenically-frozen-death-anti-aging-health/">stated</a> his body will be frozen after his death.&nbsp;</p>



<p>They also appeared to share an interest in bringing an end to the democratic systems of today, imagining a different system altogether. Epstein, for his part, spent his life puppeteering the most powerful people in the world and undermining democratic institutions. Thiel, meanwhile, first expressed his own anti-democratic views in 2009 when he <a href="https://www.cato-unbound.org/2009/04/13/peter-thiel/education-libertarian/">wrote</a>: “I no longer believe that freedom and democracy are compatible,” adding that since women were allowed to vote, the notion of a capitalist democracy became impossible. When the Brexit vote came through, Epstein <a href="https://www.justice.gov/epstein/files/DataSet%2011/EFTA02459362.pdf">wrote</a> to Thiel: “Brexit, just the beginning.” Thiel asked — “of what”; Epstein said — “Return to tribalism, counter to globalization, amazing new alliances.”</p>



<p>Globalization — and the idea of internationally powerful governing bodies — was something both Epstein and Thiel seemed to distrust. In March, in a palazzo in Rome, a stone’s throw from the Vatican, Thiel gave one of his infamous lectures in which he espoused his views about an “antichrist” that gets in the way of technological progress. This antichrist, he suggested, could be an internationally powerful body; the product of globalization. I stood outside the palace as attendees — priests, students, researchers — mutely hurried out, refusing to speak to the cluster of reporters waiting for Thiel’s black Mercedes.</p>



<p>“He has a totally irrational side, which lives on fear, of what danger might happen,” one audience member told me of Thiel on condition of anonymity, recalling how, up close, Thiel looked haunted and ill. “His head is full of future scenarios, which is what’s killing him. I think he’s scared.”<br><br>Thiel did not respond to multiple requests for comment.&nbsp;</p>



<p>Epstein didn’t confine himself to lofty conversations about a future collapse of the global order or re-engineering humanity. He also had ambitions for his own personal eugenics project. In 2019, it emerged that he wanted to <a href="https://archive.is/zVQEC#selection-1061.160-1061.195">seed</a> the world with his DNA — and reportedly have 20 women impregnated at a time at Zorro ranch, his New Mexico property.<br><br>Epstein tried to recruit Virginia Giuffre for this very project. He “fantasized about improving the human race by fathering children who carried his superior genes,” she recounted in her memoir, published posthumously late last year. “He’d talk about using his Zorro ranch as a literal breeding ground to propagate babies.” When Giuffre was 18 years old, she recalled, Epstein asked if she would carry his child and hand over all legal rights to it — “like a modern-day handmaid.”</p>



<figure class="wp-block-image size-large"><img src="https://www.codastory.com/wp-content/uploads/2026/04/7-1800x1013.png" alt="" class="wp-image-63683"/><figcaption class="wp-element-caption">Zorro Ranch, New Mexico. Diary of Epstein's victim.</figcaption></figure>



<p>In a haunting diary entry from another Epstein victim, written between the ages of 16 and 17 and <a href="https://archive.is/Yep6s">shared</a> with federal prosecutors, a girl <a href="https://www.justice.gov/epstein/files/DataSet%2012/EFTA02731361.pdf">describes</a> being told she will be sent to Zorro ranch — possibly to participate in the very same project. “Go to New Mexico? What in the hell? This makes no sense. What about school?” she writes, describing how Epstein chose her for her hair color and eye color, and tried to convince her she would create “perfect offspring.”</p>



<p>The teenager chronicles her pregnancy, pasting a sonogram into the scrapbook, before giving a traumatic account of giving birth with Ghislaine Maxwell beside her. “Ghislaine said to push all the pain away. I don't understand. Blood and water all over the bed.” As the baby was born, she writes, Maxwell covered her eyes. “I saw between her fingers this tiny head and body in the doctors hands.”</p>



<p>The girl describes hearing the baby’s “tiny cries” before “they took her.”</p>



<p>“I’m nothing but your property and incubator,” the teenager writes of Epstein. The diary is a terrifying piece of evidence that appears to link to Epstein’s longstanding fixation with creating genetically bespoke humans. The diary author’s lawyers, Wigdor LLP, declined to comment.</p>



<p>Epstein’s fever-dreams of creating an army of children carrying specific genes reflect a broader trend of “pronatalism” — a movement historically tied to eugenics — that’s thriving in Silicon Valley.</p>



<p>&nbsp;Millions of dollars of funding are currently being poured into projects <a href="https://www.geneticsandsociety.org/article/inside-silicon-valley-push-breed-super-babies">creating</a> “superbabies,” while billionaire tech oligarchs including Elon Musk — whose name appears more than 1000 times in the files — <a href="https://people.com/elon-musk-father-of-14-wants-to-have-legion-level-of-kids-before-apocalypse-report-11716621">reportedly</a> want to use surrogates “to reach legion-level before the apocalypse.” Musk did not respond to requests for comment.</p>



<p>In the files, women appear either as victims, as objects, or as vessels for genetic engineering experiments. They are an inconvenient reality, people to be controlled and re-booted. In a 2013 email, Epstein <a href="https://www.justice.gov/epstein/files/DataSet%2010/EFTA01971473.pdf">implied</a> that women “are like shrimp. You throw away the head and keep the body.”</p>



<p>“The obsession with "artificial" life appears tied to a masculine desire to try control the production of life – ultimately ridding themselves of their dependency on women," said Gabriella Razzano, Co-Founder of OpenUp, a social impact tech lab based in Cape Town, who is also a senior advisor at the African AI Observatory. “I think there is important work to be done on tying the narratives that are very revealing in the Epstein files to understand how, and why, technology is being developed as it is.”&nbsp;</p>



<p>The trading of ideas about intelligence — both artificial and human — takes a particularly sinister turn in a 2016 exchange between Epstein and the cognitive scientist and AI researcher Joscha Bach, whose research Epstein <a href="https://facultygovernance.mit.edu/sites/default/files/20200121GoodwinProcterReport.pdf">funded</a> to the tune of $300,000.</p>



<p>Bach <a href="https://www.justice.gov/epstein/files/DataSet%209/EFTA00824156.pdf">writes</a> to Epstein about a study claiming that “black children outperform white children in motor development, even in very poor and socially disadvantaged households, but they lag behind (and never catch up) in cognitive development even after controlling for family income.”<br><br>Epstein <a href="https://www.justice.gov/epstein/files/DataSet%209/EFTA00824156.pdf">responds</a> with racist ideas about his notion of how to “make blacks smarter”, adding — “maybe climate change is a good way of dealing with overpopulation. The Earth’s forest fire. Potentially a good thing for the species,” before contemplating a world with “too many people,” where “many mass executions of the elderly and infirm make sense.”&nbsp;</p>



<figure class="wp-block-image size-large"><img src="https://www.codastory.com/wp-content/uploads/2026/04/5-1800x1013.png" alt="" class="wp-image-63679"/><figcaption class="wp-element-caption">Bronze sculpture of a female torso at Jeffrey Epstein's Manhattan residence.</figcaption></figure>



<p>Epstein then imagines <a href="https://www.justice.gov/epstein/files/DataSet%209/EFTA00824159.pdf">creating</a> a future “Übermensch” — a superior human with cherry-picked attributes. “What I like is the idea that ubermensch could be the melding of humans, put together in one brain,” Epstein writes. This bespoke human, he suggests, would include traits from marginalized groups, who he appears to believe have a stronger awareness of how to navigate power structures because of their historical exclusion. “An increased motor system, an increased awareness, an increased status calculator (Blacks, jews, women). Ubermensch could be the combination of the best of humans, not the best of a specific race or gender. Fun idea.”</p>



<p>Bach told Coda in a statement: “I was summarizing a scientific study in a private email. Studies like this get often abused in ideological discourse to justify discrimination, which I strongly oppose and condemn.”</p>



<p>“I am firmly opposed to any form of racial discrimination, and I reject the use of group-level statistical claims to make judgments about individuals or to justify unequal treatment.”&nbsp;</p>



<p>He continued: “It goes without saying that if global warming were to lead to a reduction in the human population, it would be accompanied by immeasurable suffering. Our civilization would break down, leading to a return to dark ages, in which the elderly and infirm were often killed, because people could not support them, and often did not care about supporting them. Every reasonable person understands that this is horrible and not desirable in any way.”</p>



<p>Epstein “was often callous about human suffering in a way that I found disturbing but worth understanding, as a window into the perspectives of the rich and powerful,” Bach added.&nbsp;</p>





<p>Alongside his conversations about mass executions of the old and the sick, Epstein was also interested in Silicon Valley’s dream concept of <a href="https://www.theatlantic.com/health/2026/02/peter-attia-epstein-files-wellness/685861/">living forever</a> — he had numerous email conversations with the longevity guru Peter Attia about prolonging his own lifespan, and <a href="https://ogc.harvard.edu/sites/g/files/omnuum12481/files/ogc/files/report_concerning_jeffrey_e._epsteins_connections_to_harvard_university.pdf">funded</a> a Harvard project geared towards “the end of aging.” In an <a href="https://www.justice.gov/epstein/files/DataSet%209/EFTA00853878.pdf">email</a> to Attia, Epstein mused: “I’m not sure why women live past reproductive age at all.” Attia, who <a href="https://x.com/PeterAttiaMD/status/2018350892395774116">published</a> a statement about his relationship to Epstein, did not respond to requests for comment.</p>



<p>This interest in “longevity” — living for as long as possible, even forever — is popular among the elite precisely because they find themselves in an elite class, says David Robert Grimes, a scientist and disinformation expert who has <a href="https://www.scientificamerican.com/article/silicon-valley-is-reviving-the-discredited-and-discriminatory-idea-of-race/">written</a> about longevity and race science in Silicon Valley. “They’re both sides of the same coin — the Silicon Valley eugenics, and also the longevity stuff. They promote an idea that ‘we are exclusive and we are special’,” he said. “It helps them to justify deep social inequality.”</p>



<p>The tech elite did not inherit this ideology by accident. Stanford University, the intellectual heart of Silicon Valley, was once a major hub for the American eugenics movement, which later helped to inspire Nazi race laws. Stanford’s founding president, David Starr Jordan, was a prominent eugenicist, <a href="https://www.calacademy.org/scientists/library/the-problematic-legacy-of-david-starr-jordan">campaigning</a> for forced sterilization of people with undesirable genetic traits. The university removed his name from its buildings in 2020 — but in Palo Alto, his beliefs did not disappear with the nameplate.</p>



<p>"Instead of eugenics we just call it longevity or biohacking," Christopher Wylie, the Cambridge Analytica whistleblower who has spent years investigating Silicon Valley's belief systems, <a href="https://www.youtube.com/watch?v=wXo6isGKRNQ&amp;t=6s">said</a> on a panel with me at a journalism conference last year. "It's the same."</p>



<p>The ideology Epstein bankrolled in private is being built in public. It’s a vision of the future in which a select few get to upgrade and extend their lives, while tightening their grip on the systems that determine which humans are worth investing in — and which are not.<br><br>It sounds like a dark sci-fi fantasy, except, as the files show, that fantasy is being funded and pushed into reality. Most of us will never be in the rooms where these ideas are discussed. All of us will live with the results.</p>

<div class="wp-block-group alignright converted-related-posts is-style-meta-info is-layout-flow wp-block-group-is-layout-flow">
<h4 class="wp-block-heading">Related Articles</h4>



<div class="wp-block-group is-vertical is-layout-flex wp-container-core-group-is-layout-8cf370e7 wp-block-group-is-layout-flex">
<div class="wp-block-fabrica-article-preview wp-block-fabrica-article-preview--alignment-left wp-block-fabrica-article-preview--external-source-local is-style-featured category-authoritarian-tech post_tag-algorithms post_tag-artificial-intelligence post_tag-digital-id-systems post_tag-feature idea-captured author-cap-isobelcockerell ">
<div class="wp-block-fabrica-article-preview-image is-style-round"><a class="wp-block-fabrica-article-preview-image__link" href="https://www.codastory.com/authoritarian-tech/the-future-according-to-silicon-valleys-prophets/"><img class="wp-block-fabrica-article-preview-image__image" src="https://www.codastory.com/wp-content/uploads/2025/12/The-future-250x250.jpg" srcset="https://www.codastory.com/wp-content/uploads/2025/12/The-future-250x250.jpg 250w, https://www.codastory.com/wp-content/uploads/2025/12/The-future-72x72.jpg 72w, https://www.codastory.com/wp-content/uploads/2025/12/The-future-232x232.jpg 232w, https://www.codastory.com/wp-content/uploads/2025/12/The-future-900x900.jpg 900w" width="250" height="250"/></a></div>



<div class="wp-block-group is-vertical is-layout-flex wp-container-core-group-is-layout-8cf370e7 wp-block-group-is-layout-flex">
<h2 class="wp-block-fabrica-article-preview-title is-style-sans has-small-font-size"><a class="wp-block-fabrica-article-preview-title__link" href="https://www.codastory.com/authoritarian-tech/the-future-according-to-silicon-valleys-prophets/">The future according to Silicon Valley’s prophets</a></h2>


<div class="wp-block-post-author-name">Isobel Cockerell</div></div>
</div>



<div class="wp-block-fabrica-article-preview wp-block-fabrica-article-preview--alignment-left wp-block-fabrica-article-preview--external-source-local is-style-featured category-polarization post_tag-artificial-intelligence post_tag-attacks-on-press-freedom post_tag-censorship post_tag-information-war post_tag-perspective author-cap-nicholasdawes ">
<div class="wp-block-fabrica-article-preview-image is-style-round"><a class="wp-block-fabrica-article-preview-image__link" href="https://www.codastory.com/polarization/can-we-trust-an-ai-jury-to-judge-journalism/"><img class="wp-block-fabrica-article-preview-image__image" src="https://www.codastory.com/wp-content/uploads/2026/04/Objection.jpg" srcset="https://www.codastory.com/wp-content/uploads/2026/04/Objection.jpg 1920w, https://www.codastory.com/wp-content/uploads/2026/04/Objection-600x338.jpg 600w, https://www.codastory.com/wp-content/uploads/2026/04/Objection-1800x1013.jpg 1800w, https://www.codastory.com/wp-content/uploads/2026/04/Objection-768x432.jpg 768w, https://www.codastory.com/wp-content/uploads/2026/04/Objection-1536x864.jpg 1536w, https://www.codastory.com/wp-content/uploads/2026/04/Objection-1600x900.jpg 1600w" width="1920" height="1080"/></a></div>



<div class="wp-block-group is-vertical is-layout-flex wp-container-core-group-is-layout-8cf370e7 wp-block-group-is-layout-flex">
<h2 class="wp-block-fabrica-article-preview-title is-style-sans has-small-font-size"><a class="wp-block-fabrica-article-preview-title__link" href="https://www.codastory.com/polarization/can-we-trust-an-ai-jury-to-judge-journalism/">Peter Thiel is building a parallel justice system — Powered by AI</a></h2>


<div class="wp-block-post-author-name">Nic Dawes</div></div>
</div>



<div class="wp-block-fabrica-article-preview wp-block-fabrica-article-preview--alignment-left wp-block-fabrica-article-preview--external-source-local is-style-featured category-authoritarian-tech post_tag-artificial-intelligence post_tag-conspiracy-theories post_tag-first-person post_tag-information-war idea-captured author-cap-j-paulneeley ">
<div class="wp-block-fabrica-article-preview-image is-style-round"><a class="wp-block-fabrica-article-preview-image__link" href="https://www.codastory.com/authoritarian-tech/when-im-125/"><img class="wp-block-fabrica-article-preview-image__image" src="https://www.codastory.com/wp-content/uploads/2025/04/1.5.7mb-250x250.gif" srcset="https://www.codastory.com/wp-content/uploads/2025/04/1.5.7mb-250x250.gif 250w, https://www.codastory.com/wp-content/uploads/2025/04/1.5.7mb-72x72.gif 72w, https://www.codastory.com/wp-content/uploads/2025/04/1.5.7mb-232x232.gif 232w" width="250" height="250"/></a></div>



<div class="wp-block-group is-vertical is-layout-flex wp-container-core-group-is-layout-8cf370e7 wp-block-group-is-layout-flex">
<h2 class="wp-block-fabrica-article-preview-title is-style-sans has-small-font-size"><a class="wp-block-fabrica-article-preview-title__link" href="https://www.codastory.com/authoritarian-tech/when-im-125/">When I’m 125?</a></h2>


<div class="wp-block-post-author-name">J. Paul Neeley</div></div>
</div>
</div>
</div>
<p>The post <a href="https://www.codastory.com/authoritarian-tech/blue-eyes-epstein-artificial-intelligence-eugenics-silicon-valley/">&#8220;All my fundees have blue eyes.&#8221; Epstein and the tech world&#8217;s dark ideology</a> appeared first on <a href="https://www.codastory.com">Coda Story</a>.</p>
]]></content:encoded>
					
		
		<enclosure url="https://videos.files.wordpress.com/WTFSNpE3/eye.mp4" length="2483015" type="video/mp4" />

		<post-id xmlns="com-wordpress:feed-additions:1">63628</post-id>	</item>
		<item>
		<title>The future according to Silicon Valley’s prophets</title>
		<link>https://www.codastory.com/authoritarian-tech/the-future-according-to-silicon-valleys-prophets/</link>
		
		<dc:creator><![CDATA[Isobel Cockerell]]></dc:creator>
		<pubDate>Mon, 08 Dec 2025 13:44:27 +0000</pubDate>
				<category><![CDATA[Authoritarian Technology]]></category>
		<category><![CDATA[Algorithms]]></category>
		<category><![CDATA[Artificial intelligence]]></category>
		<category><![CDATA[Digital ID systems]]></category>
		<category><![CDATA[Feature]]></category>
		<guid isPermaLink="false">https://www.codastory.com/?p=59918</guid>

					<description><![CDATA[<p>Big Tech’s vision of the future has little room for the rest of us. These are some of their wildest dreams</p>
<p>The post <a href="https://www.codastory.com/authoritarian-tech/the-future-according-to-silicon-valleys-prophets/">The future according to Silicon Valley’s prophets</a> appeared first on <a href="https://www.codastory.com">Coda Story</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<div class="wp-block-group alignfull is-style-subnav is-nowrap is-layout-flex wp-container-core-group-is-layout-ad2f72ca wp-block-group-is-layout-flex">
<p class="is-style-sans hide-mobile">Sections:</p>



<div class="wp-block-buttons alignfull is-style-default is-layout-flex wp-block-buttons-is-layout-flex">
<div class="wp-block-button"><a class="wp-block-button__link wp-element-button" href="#introduction" style="border-radius:0px">Introduction</a></div>



<div class="wp-block-button"><a class="wp-block-button__link wp-element-button" href="#listicle" style="border-radius:0px">What they say</a></div>



<div class="wp-block-button top-button"><a class="wp-block-button__link wp-element-button" href="#" style="border-radius:0px">⇡</a></div>
</div>
</div>



<p id="introduction">We think of Silicon Valley as a nexus of tech moguls, innovators, power brokers and venture capitalists. But something bigger and more ideological is unfolding in the Valley — the building of an entire religion. Tech evangelists talk about artificial intelligence as if they’re building a higher power. Elon Musk believes AI will help us find a “digital God,” while biohacker and tech entrepreneur Bryan Johnson is adamant: “I think the irony is that we told stories of God creating us,” he said in an interview earlier this year. “And I think the reality is that we are creating God. We are creating God in the form of superintelligence.”</p>





<p>According to the tech prophets, the future is something the rest of us have no control over — in part, they say, because we don’t understand the technology well enough to have the power or authority to regulate it, and in part because the prophets themselves don’t want to bear any responsibility for the products they create. So how should we think about Silicon Valley’s version of the future, what promises are they really making, and how can we regain control over the story of the future?&nbsp;</p>



<p>This time two years ago, I was staying at an eco-retreat deep in the rainforest in Costa Rica. It was supposed to be a break from work — a time to unplug, recharge, sleep in a bamboo “pod” to the soundtrack of howler monkeys and toucans, that sort of thing. Instead, as often happens when I’m trying not to think too hard, I came across an interesting story. It began when I noticed that my fellow retreaters all came from California. They were unplugging too, and arguably they needed it more than I did, because they all worked in tech. What I had thought was a rustic, Costa Rican-owned eco-lodge was actually a favorite tech-bro getaway, founded by burnt-out former tech innovators who had invested their money in helping their similarly burnt-out friends recover.&nbsp;</p>



<p>Over my days in that steamy jungle, I learned that the place I was staying in often ran psychedelic retreats for venture capitalists, engineers, tech workers, and crypto-bros, and that the entire valley surrounding us was gradually being taken over by similar retreats. Parcels of land were being sold off to Californian buyers, with indigenous people pushed out before being invited back into “the space” to guide psychedelic rituals and help the tech bros unlock their “creative flow” and dream up their latest innovations.</p>



<p>Right now, Silicon Valley’s elite are obsessed with accelerating towards a future where the human race is re-engineered and the world’s resources are in the hands of a very few. After I got back from my trip, I couldn’t stop thinking about how psychedelics are being used to help some of the world’s most powerful tech evangelists build a vision of expanded human consciousness, fuel their ambition to build hyper-intelligent AI models, and push them to accelerate towards evolutionary transformation, with all the problems and delusions that entails — and about what that means for the rest of us.&nbsp;</p>



<p>“Come watch me trip balls,” Bryan Johnson, the longevity entrepreneur (whose catchphrase is “don’t die”), <a href="https://x.com/bryan_johnson/status/1994518006421230083">proclaimed</a> recently, before livestreaming himself taking a “heroic dose” of magic mushrooms. Johnson, who believes the tech world is “building God with superintelligence,” is determined to live until he can eventually merge with a machine and live forever. In recent years, he’s tried myriad interventions to biohack his body — everything from injecting himself with his son’s blood plasma to taking over 100 supplements a day — in an attempt to live longer. Experimenting with psychedelics is his latest venture, but he’s far from alone in the tech world. OpenAI’s Sam Altman has publicly said a psychedelic retreat was “life-changing,” while Elon Musk says he has used ketamine for depression, and Google’s Sergey Brin has invested millions in a psychedelic research project.</p>



<p>Upon my return from Costa Rica, I spoke to Johns Hopkins psychedelic humanities lecturer Neşe Devenot, who described how, spurred on by psychedelics, the tech elite are building a conviction that they are “the chosen steward of technology to help transmute the current phase of humanity and consciousness into a new form.”</p>



<p>The thing is, while psychedelic brews like ayahuasca have been used in shamanic practices within indigenous groups for centuries, the practice has been hijacked by the tech world — not to forge a closer connection with nature, or to confront their own existence, but to imagine a future where we transcend nature, transcend death, and terra-form the planet with datacenters to power ever-expanding artificial intelligence systems.<br><br>“A tech bro on acid is still a tech bro — they just become a psychedelically amplified tech bro,” is how writer and media theorist Douglas Rushkoff put it to me last year. “These guys have a hallucinatory confidence over their plans. And they’re developing tech that is as potentially disruptive to civilization as nuclear weapons.” Here are some of the most psychedelically inflected visions for the future that the tech bros are building for us and, soberly, let’s also look at what the costs of those visions are.</p>



<figure class="wp-block-image size-large"><img src="https://www.codastory.com/wp-content/uploads/2025/12/1-copy-1800x507.jpg" alt="" class="wp-image-59935"/></figure>



<h3 class="wp-block-heading" id="listicle"><strong>We’ll live in Utopia*&nbsp;</strong></h3>



<p><strong>Believers:</strong> Jeff Bezos, Ray Kurzweil, Elon Musk</p>



<p>Tech leaders like Jeff Bezos and Ray Kurzweil promise us a solved world. They say that with the help of AI, we can hack our way back into paradise. Some talk about it as “the Singularity” — a world where AI is billions of times more intelligent than humans — and say we just won’t be able to predict or even conceive of what the future will look like once we build artificial intelligence that powerful. But the most optimistic tech evangelists believe it will be a kind of heaven.</p>



<p>“It is a renaissance; it is a golden age. We are now solving problems with machine learning and artificial intelligence that were in the realm of science fiction for the last several decades,” says Amazon CEO Jeff Bezos. “By the time we get to the 2040s, we’ll be able to multiply human intelligence a billionfold. That will be a profound change that’s singular in nature,” adds computer scientist Ray Kurzweil, who has written extensively on the Singularity.</p>



<p>In our podcast <a href="https://www.audible.com/pd/Captured-Audiobook/B0DZJ5W4Y7?srsltid=AfmBOorKVtKwv7TbFl1cFcLBqBBn9r4HLtdHCaaqLpyo-SYIBsf7PBJ7"><em>Captured</em></a>, tech workers described what their utopia might look like from their San Francisco condos: “I see a city filled with gardens, filled with communities, a place where people can raise their kids together, a place where people can find a place to belong. And maybe there's sci-fi elements to that,” engineering physicist Andrew Cote told us, staring out over the horizon.</p>



<p><strong>The catch:</strong> But once everything is solved, what will we do with our time? Philosopher Nick Bostrom asks us to imagine what Utopia would actually look like — and whether it’s something we actually want: “Imagine we have all this technological abundance, and we’ve somehow managed not to use it to oppress one another or wage war, but have some reasonably good arrangement. What would human lives be like?” Well, for one thing…&nbsp;</p>



<figure class="wp-block-image size-large"><img src="https://www.codastory.com/wp-content/uploads/2025/12/2-2-1800x506.jpg" alt="" class="wp-image-59923"/></figure>



<h3 class="wp-block-heading"><strong>We’ll live forever*</strong></h3>



<p><strong>Believers: </strong>Bryan Johnson, Peter Thiel&nbsp;</p>



<p>Talk to anyone in Silicon Valley right now and they’ll wax lyrical about ways to live forever. At present, they accept it’s medically impossible — but they believe the day is coming when technology will let us transcend our bodies.</p>



<p>“I’m basically a brain with limbs… the rest is kind of undifferentiated,” said AI builder Kyle Morris when speaking to us for <em>Captured</em>, showing us the vast range of supplements he took to live long enough to see a technological shift where we’ll be able to merge with machines and continue to consciously live beyond the limits of our bodies. Bryan Johnson, tech CEO and leader of the “don’t die” movement, has experimented with injecting his son’s blood plasma into his veins in a bid to live longer — though he says it didn’t really work.</p>



<p><strong>The catch: </strong>*Not everyone will live forever. Only those who can afford it. “I suspect we're going to see a class divide between people who can live hundreds of years and people who live less than 50. That’s going to be a civil war of some sort, I would anticipate,” Kyle Morris told us.</p>



<figure class="wp-block-image size-large"><img src="https://www.codastory.com/wp-content/uploads/2025/12/3-copy-1800x506.jpg" alt="" class="wp-image-59936"/></figure>



<h3 class="wp-block-heading"><strong>We’re all going to die* <br></strong></h3>



<p><strong>Believers: </strong>Elon Musk, Daniel Kokotajlo, Effective Altruists</p>



<p id="story">This might seem contradictory, but in San Francisco it makes sense: there are two camps — those who believe AI will allow us to live forever, and those who believe it will kill us all. There are also people who believe both outcomes are possible. Elon Musk, for example, says there’s “only a 20% chance of annihilation” by super-powerful artificial intelligence programs.</p>



<p>While reporting for <em>Captured</em>, we spoke to Effective Altruists protesting outside Meta: <em>“</em>Pause AI because we don’t want to die!<em>”</em> they chanted. Earlier this year, a group of AI researchers released <a href="https://ai-2027.com/">AI2027</a>, a piece of science fiction charting the rise of runaway artificial intelligence, ending in a brutal showdown where every human is killed by an AI-activated biological weapon, and the Earth is terraformed by datacenters, laboratories, and particle colliders.</p>



<p id="story">*Except the tech-bro survivalists. Tech enthusiasts — with money — believe their inventions could trigger a catastrophic event on Earth: a global pandemic, climate breakdown, nuclear war, or AI apocalypse. They’re <a href="https://www.codastory.com/oligarchy/the-oligarchs-guide-to-sitting-out-a-nuclear-winter/">quietly prepping</a>. Some are building bunkers in Montana. Others see New Zealand as the ideal bolthole. Peter Thiel has constructed a fortified estate there, designed as a survival outpost.</p>



<figure class="wp-block-image size-large"><img src="https://www.codastory.com/wp-content/uploads/2025/12/4-1800x506.jpg" alt="" class="wp-image-59925"/></figure>



<h3 class="wp-block-heading"><strong>We’ll never have to work again*</strong></h3>



<p><strong>Believers:</strong> Sam Altman, Mark Zuckerberg, Alex Blania</p>



<p>Tech leaders building artificial intelligence talk openly about how they’re transforming the entire economy. They tell us that the world of work, as we know it, may not exist for much longer. “Entire classes of jobs will go away and not come back,” is how OpenAI CEO Sam Altman puts it. For <em>Captured</em>, we spoke to nurses who are already seeing chunks of their jobs taken over by artificial intelligence, and even a comedian who worries a day will come when AI starts writing her peers’ jokes. Already, entire industries are feeling the effects of the AI takeover.</p>



<p>But if we don’t have to work, how will we get paid? Silicon Valley has an answer for that too: Universal Basic Income, an old idea retrofitted for the AI age. The idea with UBI is that we’ll all get an allowance, a regular payment with no strings attached, to replace the income that would previously have come from a job. We traveled to Kenya to look at the prototype for one of these systems in action: a project called World, which gives you a monthly allowance of around $50. In return, you must submit your iris biometrics to World’s database via a camera device called the Orb. When the Orb arrived in Kenya, there were enormous, chaotic queues at shopping malls, packed with people vying to submit their iris data, get onto World’s system and get hold of the handouts.&nbsp;</p>



<p><strong>The catch:</strong> Universal Basic Income sounds great in principle, but look closer and it would completely change what it means to be human. If we don’t work and don’t pay taxes, we will no longer contribute to society and the economy. We would become completely reliant on — and powerless against — the whims and wishes of those in power, with no way to protest or strike if we’re unhappy with how things are going. If we accept Silicon Valley’s vision of a future where we depend on handouts from our tech overlords, we’d concede our freedom, independence and autonomy to a new set of masters. Beyond that, it’s difficult to imagine what we would do all day — as a species — if we didn’t have to work. “If there's nothing we need to do–if we could just press a button and have everything done, like, then what do we do all day long? What gives meaning to our lives?” philosopher Nick Bostrom <a href="https://www.codastory.com/authoritarian-tech/finding-meaning-in-human-lives/">mused</a> while speaking to us for <em>Captured</em>.<br></p>



<figure class="wp-block-image size-large"><img src="https://www.codastory.com/wp-content/uploads/2025/12/5-1800x506.jpg" alt="" class="wp-image-59926"/></figure>



<h3 class="wp-block-heading"><strong>Nation states will not exist*</strong></h3>



<p><strong>Believers:</strong> Balaji Srinivasan, Peter Thiel, Marc Andreessen</p>



<p>“Very few institutions that predated the internet will survive the internet,” Balaji Srinivasan, the former CTO of Coinbase, said in a recent lecture. By that, he means governments, and countries themselves. After all, governments come with a whole host of irritating traits that tech leaders loathe: they regulate companies, make them pay taxes, tell them what they can and can’t do. Why not secede from those countries entirely, then, and build your own? Srinivasan is one of the leading thinkers behind the idea of the “networked state” — a successor to the nation state, built and enabled by tech.&nbsp;</p>



<p>Proponents of the networked state dream of digital statehood: “startup nations” where they’ll be free of taxes and regulations, free of the bureaucracy of living in, well, a traditional country. They’re already doing it: pushing to draft legislation to create “freedom cities” in the U.S. — something Trump’s 2024 campaign proposed: enclaves unshackled by federal law where tech engineers can try out startups and clinical trials without regulation or approval from federal agencies. Meanwhile, on an island off the coast of Honduras sits Prospera, a semi-autonomous “private city” backed by Sam Altman, Marc Andreessen and Peter Thiel, and marketed as a libertarian fantasy utopia.&nbsp;</p>



<p><strong>The catch: </strong>The idea of getting rid of stifling government bureaucracy and living in a world without borders is an idealistic dream held by many people, not just tech leaders. But, as the Silicon Valley elite envisions it, we would replace sovereign nations with a collection of private, giant gated communities that would hoard resources, money, and power, while locking everyone else out. A world where democracies no longer exist and elected leaders are replaced by digital moguls would be a world that serves clients, not citizens, and cares only for profit and innovation, a world where international human rights laws are thrown out.&nbsp;</p>



<figure class="wp-block-image size-large"><img src="https://www.codastory.com/wp-content/uploads/2025/12/6-1800x506.jpg" alt="" class="wp-image-59927"/></figure>



<h3 class="wp-block-heading"><strong>We’ll spread out into the stars*</strong></h3>



<p><strong>Believers:</strong> Elon Musk, Jeff Bezos, Richard Branson</p>



<p>But what if we could take this idea of building crypto-states further — and leave Earth entirely to build Silicon Valley outposts on <a href="https://www.codastory.com/oligarchy/silicon-valley-elon-musk-colonizing-mars/">Mars</a>, or on the moons of Jupiter? Not only transcend our bodies, but transcend the Earth itself — after all, if we can’t fix the planet, we can just leave it. Jeff Bezos talks about moving “all polluting industry into space” and leaving Earth as a nature reserve — one of the tech industry’s many technofixes for climate change. And all of Elon Musk’s ventures, from Tesla to X, are designed to support his ultimate mission: making the human species “multiplanetary.”</p>



<p>“They want to ensure the light of consciousness persists by reducing the probability of human extinction,” says Émile P. Torres, a philosopher who used to be part of what they call the emergent “cult” of Silicon Valley. Torres told us about the tech bros’ vision of a utopian future where humans conquer the universe and plunder the cosmos. It sounds like something out of science fiction — and indeed it is: when we visited AI frat houses during our reporting for <em>Captured</em> we found bookshelves stuffed with science fiction about space and colonizing the universe.&nbsp;</p>



<p>Harvard historian Jill Lepore has a different way of seeing it — she calls it “extra-terrestrial capitalism,” mimicking a colonialist vision of expanding indefinitely, taking our extractivist mindset into the stars.&nbsp;</p>



<p><strong>The catch: </strong>Not everyone will be able to travel into space — or perhaps, not everyone will be able to stay on Earth. If you read enough sci-fi, and listen to enough conversations in Silicon Valley, you can envision all sorts of outcomes: Mars becoming a penal colony filled with slave workers extracting resources; Mars becoming independent from Earth; only the super-rich able to leave Earth as the planet burns. Musk and other tech-bro survivalists imagine a global pandemic, climate meltdown or nuclear-war extinction event — perhaps triggered by the runaway artificial intelligence they themselves built — and see space as the ultimate off-ramp for a chosen few.&nbsp;</p>



<p>“It’s important to get a self‐sustaining base on Mars… because it’s far enough away from Earth that it’s more likely to survive than a moon base,” Musk told the audience at South By Southwest in 2018. “In the hopefully unlikely event that something terrible happens to Earth, there’s a continuance of consciousness on Mars. One of the benefits of Mars is life insurance for life collectively,” he said this year.&nbsp;</p>



<figure class="wp-block-image size-large"><img src="https://www.codastory.com/wp-content/uploads/2025/12/7-1800x506.jpg" alt="" class="wp-image-59928"/></figure>



<h3 class="wp-block-heading"><strong>We’ll have all human knowledge in our brains*</strong></h3>



<p><strong>Evangelists:</strong> Elon Musk, Bryan Johnson</p>



<p>Why bother with school when you could install a chip in your brain? Right now, tech leaders are working on building chips — like Musk’s venture, Neuralink — that can be inserted in our brains so that, one day, we can merge with machines. When we met engineers in San Francisco, they told us about their ultimate ambition: to put all human knowledge inside human brains, from birth. “That’s the purpose of the education system, right?” said Jeremy Nixon, the founder of AGI House, which brings AI workers together in a San Francisco houseshare.<br><br>But why not skip over all that and simply install a chip into our brains, so that even from birth we can know everything, all at once? Imagine: we’d be able to speak every language on Earth, we’d know all of human history, all of science. OK, we might not be able to discover anything new — but our future would be boundless. “You hold your phone and it’s like a better prefrontal cortex. It tells you how to get places, tells you how to plan. It gives you answers. It gives you a better memory. I see in the next 50 years, that's going to enter us, that's going to become part of us,” Kyle Morris, another member of the AGI House, told us.&nbsp;</p>



<p><strong>The catch:</strong> Not everyone will necessarily be able to get this supersonic brain — those enhancements will only come to those who pay. So, as tech leaders see it, could there one day be an underclass of people who can’t afford — or don’t want to install — these brain enhancements? And will those with enhanced brains then oppress those without them? Just as the world is <a href="http://google.com/search?q=digital+exiles+coda&amp;oq=digital+exiles+coda&amp;gs_lcrp=EgZjaHJvbWUyBggAEEUYOTIGCAEQRRg8MgYIAhBFGDzSAQgzODQxajBqN6gCALACAA&amp;sourceid=chrome&amp;ie=UTF-8">becoming</a> harder and harder to navigate without a smartphone, perhaps in the future it will become harder to navigate without a chip in your brain — will you be able to travel, move freely, run simple errands? Last week, Mark Zuckerberg said that people without smart glasses like Meta’s model, which give their wearers instant and constant access to an AI assistant, will be at a cognitive disadvantage.&nbsp;</p>



<figure class="wp-block-image size-large"><img src="https://www.codastory.com/wp-content/uploads/2025/12/8-1800x506.jpg" alt="" class="wp-image-59929"/></figure>



<h3 class="wp-block-heading"><strong>Climate change will be fixed by tech*</strong></h3>



<p><strong>Evangelists:</strong> Larry Page, Elon Musk, Bill Gates</p>





<p>There’s an idea we came across while reporting in Silicon Valley that climate change, while problematic, is nothing much to worry about, because one day soon it too, like everything else, will be fixed by some technological intervention. Perhaps we’ll geoengineer the skies to create “sunscreen for the Earth” (as one pair of tech evangelists-turned-guerrilla geoengineers dubbed it); perhaps we’ll finally figure out nuclear fusion (a favorite prediction in Silicon Valley circles); or we’ll figure out how to get our oceans to sequester carbon. In November, Elon Musk proposed: “A large solar-powered AI satellite constellation would be able to prevent global warming by making tiny adjustments in how much solar energy reached Earth.” Though artificial intelligence datacenters suck up vast quantities of water and spew carbon into the atmosphere (Google’s newest datacenter in the UK will <a href="https://www.theguardian.com/technology/2025/sep/15/google-datacentre-kent-co2-thurrock-uk-ai">emit</a> 570,000 tonnes of CO2 a year, according to planning documents), the tech leaders tell us we’ll figure out the answers sooner or later — or AI will do it for us.&nbsp;</p>



<p><strong>The catch:</strong> Geoengineering, a favorite pipe dream of tech enthusiasts, could have unpredictable, Earth-shattering consequences. Climate experts say processes like these could throw Earth into deeper chaos by cooling the world unevenly and wreaking havoc on our climate systems. And once we start solar geoengineering, we won’t be able to stop: we’ll have to keep spewing chemicals into the atmosphere to dim the sun, or face a rapid and catastrophic heating event. Who would even be in charge of geoengineering the planet, and who would decide whether it was safe enough?</p>

<div class="wp-block-group alignright converted-related-posts is-style-meta-info is-layout-flow wp-block-group-is-layout-flow">
<h4 class="wp-block-heading">Related Articles</h4>



<div class="wp-block-group is-vertical is-layout-flex wp-container-core-group-is-layout-8cf370e7 wp-block-group-is-layout-flex">
<div class="wp-block-fabrica-article-preview wp-block-fabrica-article-preview--alignment-left wp-block-fabrica-article-preview--external-source-local is-style-featured category-oligarchy post_tag-feature idea-captured author-cap-isobelcockerell ">
<div class="wp-block-fabrica-article-preview-image is-style-round"><a class="wp-block-fabrica-article-preview-image__link" href="https://www.codastory.com/oligarchy/silicon-valley-elon-musk-colonizing-mars/"><img class="wp-block-fabrica-article-preview-image__image" src="https://www.codastory.com/wp-content/uploads/2024/06/NON-CC-GettyImages-109327001-250x250.jpg" srcset="https://www.codastory.com/wp-content/uploads/2024/06/NON-CC-GettyImages-109327001-250x250.jpg 250w, https://www.codastory.com/wp-content/uploads/2024/06/NON-CC-GettyImages-109327001-72x72.jpg 72w, https://www.codastory.com/wp-content/uploads/2024/06/NON-CC-GettyImages-109327001-232x232.jpg 232w, https://www.codastory.com/wp-content/uploads/2024/06/NON-CC-GettyImages-109327001-900x900.jpg 900w" width="250" height="250"/></a></div>



<div class="wp-block-group is-vertical is-layout-flex wp-container-core-group-is-layout-8cf370e7 wp-block-group-is-layout-flex">
<h2 class="wp-block-fabrica-article-preview-title is-style-sans has-small-font-size"><a class="wp-block-fabrica-article-preview-title__link" href="https://www.codastory.com/oligarchy/silicon-valley-elon-musk-colonizing-mars/">Silicon Valley’s sci-fi dreams of colonizing Mars</a></h2>


<div class="wp-block-post-author-name">Isobel Cockerell</div></div>
</div>



<div class="wp-block-fabrica-article-preview wp-block-fabrica-article-preview--alignment-left wp-block-fabrica-article-preview--external-source-local is-style-featured category-authoritarian-tech post_tag-perspective idea-captured author-cap-nataliaantelava ">
<div class="wp-block-fabrica-article-preview-image is-style-round"><a class="wp-block-fabrica-article-preview-image__link" href="https://www.codastory.com/authoritarian-tech/who-decides-our-tomorrow-challenging-silicon-valleys-power/"><img class="wp-block-fabrica-article-preview-image__image" src="https://www.codastory.com/wp-content/uploads/2025/07/IMG_7364.gif" width="1920" height="1080"/></a></div>



<div class="wp-block-group is-vertical is-layout-flex wp-container-core-group-is-layout-8cf370e7 wp-block-group-is-layout-flex">
<h2 class="wp-block-fabrica-article-preview-title is-style-sans has-small-font-size"><a class="wp-block-fabrica-article-preview-title__link" href="https://www.codastory.com/authoritarian-tech/who-decides-our-tomorrow-challenging-silicon-valleys-power/">Who decides our tomorrow? Challenging Silicon Valley’s power</a></h2>


<div class="wp-block-post-author-name">Natalia Antelava</div></div>
</div>



<div class="wp-block-fabrica-article-preview wp-block-fabrica-article-preview--alignment-left wp-block-fabrica-article-preview--external-source-local is-style-featured category-authoritarian-tech post_tag-artificial-intelligence post_tag-conspiracy-theories post_tag-first-person post_tag-information-war idea-captured author-cap-j-paulneeley ">
<div class="wp-block-fabrica-article-preview-image is-style-round"><a class="wp-block-fabrica-article-preview-image__link" href="https://www.codastory.com/authoritarian-tech/when-im-125/"><img class="wp-block-fabrica-article-preview-image__image" src="https://www.codastory.com/wp-content/uploads/2025/04/1.5.7mb-250x250.gif" srcset="https://www.codastory.com/wp-content/uploads/2025/04/1.5.7mb-250x250.gif 250w, https://www.codastory.com/wp-content/uploads/2025/04/1.5.7mb-72x72.gif 72w, https://www.codastory.com/wp-content/uploads/2025/04/1.5.7mb-232x232.gif 232w" width="250" height="250"/></a></div>



<div class="wp-block-group is-vertical is-layout-flex wp-container-core-group-is-layout-8cf370e7 wp-block-group-is-layout-flex">
<h2 class="wp-block-fabrica-article-preview-title is-style-sans has-small-font-size"><a class="wp-block-fabrica-article-preview-title__link" href="https://www.codastory.com/authoritarian-tech/when-im-125/">When I’m 125?</a></h2>


<div class="wp-block-post-author-name">J. Paul Neeley</div></div>
</div>
</div>
</div>
<p>The post <a href="https://www.codastory.com/authoritarian-tech/the-future-according-to-silicon-valleys-prophets/">The future according to Silicon Valley’s prophets</a> appeared first on <a href="https://www.codastory.com">Coda Story</a>.</p>
]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">59918</post-id>	</item>
		<item>
		<title>The digital exiles: Why people are abandoning their smartphones</title>
		<link>https://www.codastory.com/surveillance-and-control/the-digital-exiles-why-people-are-abandoning-their-smartphones/</link>
		
		<dc:creator><![CDATA[Isobel Cockerell]]></dc:creator>
		<pubDate>Fri, 21 Nov 2025 10:00:32 +0000</pubDate>
				<category><![CDATA[Surveillance and Control]]></category>
		<category><![CDATA[Algorithms]]></category>
		<category><![CDATA[Artificial intelligence]]></category>
		<category><![CDATA[Digital ID systems]]></category>
		<category><![CDATA[Facial recognition]]></category>
		<category><![CDATA[Feature]]></category>
		<guid isPermaLink="false">https://www.codastory.com/?p=59354</guid>

					<description><![CDATA[<p>A growing movement of “former screenagers” is calling for a screen-free, surveillance-free life, for a chance to build a future beyond tech capture</p>
<p>The post <a href="https://www.codastory.com/surveillance-and-control/the-digital-exiles-why-people-are-abandoning-their-smartphones/">The digital exiles: Why people are abandoning their smartphones</a> appeared first on <a href="https://www.codastory.com">Coda Story</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<figure class="wp-block-video alignfull"><video height="1080" style="aspect-ratio: 1920 / 1080;" width="1920" autoplay loop muted poster="https://www.codastory.com/wp-content/uploads/2025/11/the-digital-exiles_mp4_avc_240p.original.jpg" src="https://videos.files.wordpress.com/I0vY7Yfj/the-digital-exiles.mp4" playsinline></video></figure>



<p>There was no specific tipping point that made Logan Lane get rid of her smartphone. One day, the thought just arrived. “I was like, I just can’t fucking do this anymore.” And she put away the device that had dominated her life since she was 11. “I spent about five of my developmental years just tied to my smartphone,” she says. Logan, 20, bought a basic flip phone and relearned how to navigate the world without social media, without GPS, and without the constant, nagging cry for attention that had punctuated her days.</p>





<p>She grieves the early adolescence she lost to her phone. “In the years when you’re supposed to be reading and playing, we were on our phones and computers,” she says. “We had those years of play stolen from us.”&nbsp;</p>



<p>Lane is the founder of the <a href="https://www.theludditeclub.org/">Luddite Club</a>, a solidarity network of “former screenagers” growing a movement across America. Together, they’re pledging to give up their devices, choosing instead a life of voluntary exile from the digital world.&nbsp;</p>



<p>To speak to Lane, I placed an international call to her flip phone — an act that already felt anachronistic. The line crackled as we talked and her train rattled through New York City. For a moment, the world felt analog again.&nbsp;</p>



<p>Lane is part of the first generation with no memory of life before smartphones — a generation that became addicted to their phones before anyone truly understood the cost. “There’s no one person to blame,” she said. “Even though I was only 11 or 12 years old when I got a phone, I was responsible for facilitating this addiction in my life. But at the same time, I was a child.”&nbsp;</p>



<p>All around Lane on the subway, all along the train — and along every train in New York City; every train in every major city in the world — people stared into their smart devices. The smartphone penetration rate for the world <a href="https://www.pewresearch.org/internet/fact-sheet/mobile/">is</a> about 60%; in the U.S. it’s at 91%. Just a decade ago, global penetration was 10%, but now many of us can’t leave a room, let alone the house, without our phones.</p>



<p>Rising in response is a resilient counterculture; a growing group of people who have had enough. People who long for a simpler, more three-dimensional life in which they have control over their digital existence, and their thoughts and data are not harvested, nudged, monitored. So they check out. Power off their smartphone; lock it in a drawer; give it away; throw it in the trash. Hope they’ll never have to use one again. The Luddite Club now has local chapters all over the U.S., and young people are flocking to the myriad offline events where they talk about reclaiming their lives from <a href="https://www.codastory.com/captured/">tech capture</a>.&nbsp;</p>



<p>“I’m excited to read on the train in peace, to not look at social media, post or check up on exes, looking for validation or a small dopamine hit. I’ll get dopamine the right way,” a young woman recently wrote on Reddit. “It will be difficult at first,” someone responded, “but it will become more freeing after you break your chains.” Another young man wrote that he had “just wasted ten years of my life living in an alternate reality.” Having made the switch, he called on others to “come back to the real world and enjoy the struggles and solutions of analog life.”<br></p>



<p>These conversations unfold in a radical corner of the internet where thousands of people a day come to discuss getting rid of their smartphone. The “dumbphones” <a href="https://www.reddit.com/r/dumbphones/">subreddit</a> has the intimacy of an addiction support group. The page is full of pictures of what people call their “everyday carry” gear, the tech they bring with them on a typical day. For people of a certain age, the pictures are transfixing, nostalgic: Motorola Razr flip phones, old Nokias, candy-colored iPod minis, notebooks, A to Z maps, point-and-shoot cameras, MP3 players. The photos hark back to a moment in time before everything — as the Luddites see it — started to go wrong.</p>



<figure class="wp-block-image alignwide size-large"><img src="https://www.codastory.com/wp-content/uploads/2025/11/Nokia-gif-1800x1013.gif" alt="" class="wp-image-59476"/></figure>



<p>It was, if you want to put a specific year to it, 2006. Facebook had just opened up its usership beyond students, and tens of thousands of users were signing up every day. Back then, Facebook reunited long-lost schoolfriends, lovers, even relatives. Independent musicians blew up overnight on Myspace. Social media felt like something that would make people more open and connected. The first iPhone was still a year away. We still knew how to navigate our world without Google Maps. We still read books on commutes, took pictures on cameras and uploaded them in their joyful hundreds to Facebook for fun. The 2008 crash hadn’t happened. Attention algorithms didn’t yet exist. The tech companies still felt like harbingers of a better, more connected future.&nbsp;</p>



<p>Daisy Krigbaum, a dumbphone advocate who now runs a business around it, calls that era “the sweet spot.” It was a time, she says, “when online social platforms were there to facilitate in-person correspondence. They just filled the gap between when you could see somebody in person. You could talk to your friend while they’re abroad. You could talk to a family member who's bedridden. But then it evolved into a monster.”</p>



<p>The “sweet spot” is something Judy Estrin remembers well. One of the internet’s early architects, Estrin is a <a href="https://www.codastory.com/authoritarian-tech/stop-drinking-from-the-toilet/">Silicon Valley veteran</a> who helped build the foundations of the web in the 1970s. When I spoke to her at a sunny cafe in Palo Alto last year, she described the last days before technology stopped being built to cater to our needs. “It was human-centred,” she said of the internet back then. “It wasn’t until we got into the Cloud, mobile, social, that the dynamic shifted and it became more about humans adapting to the technology.”&nbsp;</p>



<p>One thing that had kept tech companies in check, Estrin explained, was limitations on computing power. “There were constraints on the technology. We kept moving up against processing, bandwidth, storage.” But once computing power got cheaper, those constraints disappeared. “The culture changed,” she said. Instead of designing carefully, companies could just keep <a href="https://www.codastory.com/authoritarian-tech/tech-design-ai-politics/">adding</a> features. “The design aesthetic was these continuous scrolling feeds. The design of mobile became more and more massively online.”</p>





<p>She remembered how computer scientists started designing for mobile first. “We stopped having to think in terms of constraints. We just started brute-forcing everything.” And then tech began not to respond to our lives but to shape them. “It was in 2010, 2011, 2012,” Estrin said, “you could see the incentives of the system and the ad-driven markets just completely starting to shift things.” She said she felt guilty for not noticing this <a href="https://www.codastory.com/authoritarian-tech/who-decides-our-tomorrow-challenging-silicon-valleys-power/">switch</a> sooner — and for playing some role in the world Silicon Valley <a href="https://www.codastory.com/authoritarian-tech/captured-silicon-valley-future-religion-artificial-intelligence/">created</a>; the world we all live in today. “I did and do feel increasingly disappointed. Just disappointed with the technologies that we created,” she said. “I think that I was so heads down and focused for so many years, between building companies and raising my son. And I think that I, then at some point, picked up my head. And it’s like, well, why wasn’t I paying attention to this stuff? What was I doing?”</p>



<p>For dumbphone business-owner Daisy Krigbaum and her partner Will Stults, the wake-up moment came one night in 2022, after hours of scrolling beside each other on the couch, when they finally looked up.</p>



<p>After basking in blue light and “looking at mindless stuff” for “an unreal amount of time,” Krigbaum said, they turned to each other and admitted they had a problem.&nbsp;</p>



<p>They decided to forgo the tech that had been dominating their existence. First, like Lane, they had to come to terms with the time they had lost, and why. “I think we both feel really grateful to have been born kind of on the cusp of the post-information age where we still had some foundational social skills,” said Krigbaum, who is 28. “I already feel impoverished by how much of my adolescence took place online.”</p>



<p>“Society’s relationship with tech has at least migrated to the point where we’re willing to admit that most or all of us have some sort of problem,” added Stults. “None of us have a completely healthy relationship with technology.” They started to look at flip phones and old-style cellphones to switch over to, but found the experience of detangling their lives from smartphones filled with knotty inconveniences, workarounds and sacrifices.&nbsp;</p>



<p>Contemporary life is full of small dependencies that keep people tethered to their phones — apps for work, school portals, two-factor authentication, maps, music, messaging. One tiny function you rely on can hold you hostage to the whole device. “It’s such a confusing world to get off a smartphone,” Stults said. So he and Krigbaum founded an online store called <a href="https://dumbwireless.com/">dumbwireless</a> selling dumbphones, and running a hotline to help people through the process. “We thought if we could streamline it a little bit, then people might be more inclined to follow their better instinct in those moments when they are like, ‘I can't do this anymore,’” said Krigbaum.</p>



<figure class="wp-block-image alignleft size-full is-resized"><img src="https://www.codastory.com/wp-content/uploads/2025/11/light-phone2.png" alt="" class="wp-image-59507" style="width:428px;height:auto"/><figcaption class="wp-element-caption">Light Phone II. Creative Commons (CC BY 2.0) Jordan Mansfield.</figcaption></figure>



<p>Krigbaum herself uses a <a href="https://www.thelightphone.com/">Light Phone</a>: a new type of device built for digital exiles. Like another phone, the <a href="https://mudita.com/products/phones/mudita-kompakt/?srsltid=AfmBOoqaqiSzmw_X-k_s5qR8qMa1Pp6AKQ3v9hAi5lXyQTIqHYqPSNk2">Kompakt</a>, it is intentionally boring. The screens are e-ink. They have maps, messages, a calculator, an alarm clock, and of course a telephone. The Kompakt can “sideload” any other apps you need, like Slack, Spotify and WhatsApp. But these phones don’t pull and nag at your attention.&nbsp;</p>



<p>At upwards of $200, these hybrid, dull phones are the ultimate connoisseur's choice for someone who wants to live in the modern world without being dependent on an attention-demanding device. But the true radicals go further, returning to the flip phones and Nokias of the “sweet spot” era, saying the joy of going back to the dumbphone is reclaiming parts of your brain — like your sense of direction — that have atrophied from smartphone use.</p>



<p>On Reddit’s dumbphones forum, people talk about the bigger aim of doing without the conveniences their phones provide and regaining control over their thought patterns. “Everything is a fucking struggle without a smartphone. The whole world is set up around them,” one Redditor wrote last month. “But I am focussed, I feel capable, I am so much more compassionate and understanding of others. I have more patience. I am less angry and more in control of my emotions. My anxiety is practically gone.”</p>



<p>Every so often, the author Zadie Smith — perhaps the world’s most famous flip-phone user — is reminded of the horrors of analog life, she told Ezra Klein on his <a href="https://www.youtube.com/watch?v=id_k43ZU8t4">podcast</a>. “Disaster. We’re at a party at three in the morning, there’s no way to get home, forget about it, walk five miles, disaster. Once a year. And every time it happened, I would think that was bad, but is it as bad as having my very consciousness colonized every moment of the day? And I’d be like, no. Definitely no competition.” It’s a trade-off dumbphone users are happy to make – lose their phone but regain their consciousness.</p>



<p>The other thing dumbphone users cherish is the solitude they get back. True solitude – where there’s no constant companion in your pocket that can listen to the sound of your voice, feel the pads of your fingertips, track your expressions, and follow you through your home city.</p>



<p>Someone who knows the importance of such solitude is Issa Amro, a Palestinian activist living in Hebron, on the West Bank. Hebron is one of the most intensely surveilled places on Earth, where the Israeli military uses facial recognition programs called Blue Wolf, Red Wolf, and White Wolf to track Palestinians. “I feel that I live in a lab and I’m a simulation object,” Amro told me, describing how the systems rely heavily on smartphones for data collection and enforcement.&nbsp;</p>



<p>In 2022, Amro filmed an Israeli soldier beating an Israeli-Jewish activist. The video was much-shared in Israel, and Amro knew it would only be a matter of time before he was arrested. So he gave his smartphone to a friend who drove a taxi around the city. When the police came for him, they were intent on getting hold of it. “The Israeli police were crazy to get my phone. And I refused to give it to them,” he remembers. Meanwhile, the phone’s location was moving all over Hebron, hidden in the taxicab. “My friends moved it from one car to another, trying to hide it. The phone was going all around the city until I was released,” he said with a laugh.</p>



<figure class="wp-block-image size-large"><img src="https://www.codastory.com/wp-content/uploads/2025/11/snake4-1800x645.gif" alt="" class="wp-image-59512"/></figure>



<p>After that, he traveled to Jordan and bought an analog phone with buttons — the first he’d owned in years. “Buy it from a random place, when you travel somewhere, go and buy one,” he advised. He swaps out his smartphone for the analog phone “to feel better,” when he wants to have a moment of respite from the suffocating surveillance of life in the occupied West Bank.&nbsp;</p>



<p>“It’s a very bad feeling to know all your life is being watched. Not just your political activities, but your [personal] life too. If you want to have a date, or something for yourself, the occupation will use it against you.” Once Amro started using the analog phone, the Israeli forces took notice. They didn’t like it. Just last month, as he was crossing the border from Jordan into Palestine, the customs officer rifled through his bag, looking for his smartphone. When the officer found the small analog phone, he took a picture of it and sent it to his superiors.&nbsp;</p>



<p>“I was waiting, waiting, waiting,” Amro said. “Then the interrogators came.”</p>



<p>They grilled him about the phone. “What’s wrong with you?” they said. “Why do you carry an analog phone? What do you do with it, who did you contact, and where is your smartphone?”</p>



<p>“I told them, ‘I’m not doing anything illegal. I live in Hebron. My house has one camera in the front and one in the back. Whenever I get in or get out, you know about it. Wherever I go, you know. My life has no privacy. Why do you care if I have an analog phone or a smartphone?’”</p>



<p>The border police questioned him solidly for two hours about the phone. “Everything is built on surveillance now and digitalization,” Amro said. “So if you go analog, you really make it hard for them. In the past, intelligence systems depended on analog tactics — on people. Now they depend on machines.”</p>



<p>They wanted him to have a smartphone because, as Amro put it, “The phone documents everything.”</p>



<p>He feels solidarity with other analog phone users around the world. “Whenever I see someone else with one, I feel — Here’s a friend. We are the same family.”</p>



<p>Sometimes, with his analog phone, Amro does nothing more than go to the forest for a moment of peace. “We’re skimming nature from our life, and it’s really important to understand the threat of digitalization. Going back to nature is really important.”&nbsp;</p>



<p>It’s a sentiment New York-based writer August Lamm, who has made a <a href="https://augustlamm.substack.com/p/you-dont-need-a-smartphone">zine</a> about dumbphones, shares. She can palpably feel the outside world re-entering her life since she got rid of her phone. “I feel more present and attuned to my surroundings, and I can feel my life changing,” she said. “My days feel long and rich and open, and I can trust my thoughts more because I don't feel they've been fed to me.”</p>



<p>She talked about how the physical realm opened up to her when she got rid of her smartphone, with its Instagram account and its tens of thousands of followers. She regained a sense of her surroundings. “If you live for fifty years and you’re aware every day of what’s going on around you, and you’re listening to people, and you’re present, that is more valuable than living into your nineties and when you flash back through your life it’s just screens.”</p>



<figure class="wp-block-image alignwide size-large"><img src="https://www.codastory.com/wp-content/uploads/2025/11/Game-Boy-1800x1013.jpg" alt="" class="wp-image-59404"/></figure>



<p>Lamm wants to help maintain a critical minority of society that isn’t captured by smartphones: people who don’t own them and never will. “I would love to live in a world where people say, ‘Wait, do you have a smartphone?’ as a matter of courtesy.”&nbsp;</p>



<p>It feels like an impossible dream, as big tech companies move to <a href="https://www.codastory.com/surveillance-and-control/nursing-ai-hospitals-robots-capture/">capture</a> even more areas of our lives. From the moment the scans of our unborn bodies are uploaded by our parents to Instagram, to our school days dominated by Google Classroom, to our first phone, to every thought we <a href="https://www.codastory.com/authoritarian-tech/ai-therapy-regulation/">commit</a> to a search term or AI model, to every beat of our heart recorded by our smart watch, to the steady decline of our health, to our hospital appointments booked on our phones, to the day we die and condolences are posted on our page, the phone is ever-present. “Our brains are captured. The industry is captured. Our politics is captured. We’re captured in so many different ways,” Judy Estrin said. “Our leadership is captured. In every industry, we’re captured by this mentality and worship of growth and innovation.”&nbsp;</p>



<p>And if tech leaders have their way, there’ll be a time when the smartphone is no longer an external device — but part of our bodies. Kyle Morris, a young AI builder I met in San Francisco last year, called the smartphone a “better prefrontal cortex. It tells you how to get places, tells you how to plan. It gives you answers. It gives you a better memory. I see in the next 50 years, that it's going to enter us. That it's going to become part of us.” He held up his phone in front of me: “It's weird that we have these like external things that we're using. People are going to start retrofitting themselves with improved memory, improved vision.”&nbsp;</p>



<p>As companies like Neuralink push towards merging technology with the body, and AI seeps into every corner of our world, Lamm says she still has days where she feels powerless and alone.&nbsp;</p>



<p>When Google rolled out AI search, with no way to turn it off, she broke down. “I googled how to undo an AI overview, and there wasn’t a way to do it. And I had a total meltdown… I was like, this is evil. Like, I can’t even do a Google search without being confronted by AI.”</p>





<p>I ask Lamm and Lane what they’ll do in the future in the face of this capture. With each passing year, it gets more difficult to live without a smartphone. The pandemic — which saw countries around the world rolling out QR-code green passes — cemented this, as restaurants spurned paper menus, airlines stopped issuing paper tickets, and health services made it so hospital appointments could only be booked on apps. Recently, Lamm couldn’t apply for a UK visa because she needed a smartphone to do it. She can’t get an electric car because charging can only be paid for with a QR code. So what then — when the drawbridge finally rises and modern life necessitates a smartphone?</p>



<p>Lamm has thought about this, and once she gets to her conclusion, it becomes as sci-fi as the imaginings of the tech workers who want to put chips in our heads. “There needs to be another option,” she reflected. “In the worst case scenario, people just defect from society and say, ‘Ok, there’s at least a few thousand of us that want to just live a normal life and we’ll go off and continue living a normal life somewhere else,’” she said. She quoted from Dave Eggers’s cult novel “The Every,” where a small tribe of tech-skeptics calling themselves the “Trogs” try to live outside a world where surveillance capitalism and tech have become all-encompassing.&nbsp;</p>



<p>“It wouldn’t be a commune situation because ideally through this activism, it would kind of be more of a split in society, rather than founding a new society,” Lamm said. “It would be like enough people that it would feel like normal life, and you just wouldn't interact with the tech.”<br></p>



<p>As my line with Logan Lane, the Luddite Club founder, crackled again, I asked her the same question — what will she do when life becomes impossible without a smartphone, when tech capture <a href="https://www.codastory.com/authoritarian-tech/who-decides-our-tomorrow-challenging-silicon-valleys-power/">becomes</a> complete? “I’m just like, fuck it. I’ll get to it when I get to it. But I am not OK with it. I am going to do everything I can before then to try to prevent that.” She paused. “I'm not so worried about what people in Silicon Valley think people want.” As her train went into a tunnel, the line went dead, and she continued with her journey — in exile from the digital world; fully present in the physical world.</p>



<p><em>Drop-in image 1: Teona Tsintsadze. Motion by Anna Jibladze. Drop-in image 3: Teona Tsintsadze/Creative Commons (CC BY 4.0) Reinhold Möller. Motion by Anna Jibladze. Drop-in image 4: Teona Tsintsadze/Creative Commons (CC BY 4.0) Ermell/Reinhold Möller.</em></p>

<div class="wp-block-group alignleft is-style-meta-info is-layout-constrained wp-block-group-is-layout-constrained">
<h5 class="wp-block-heading">PART OF THE BIG IDEAS</h5>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<div class="wp-block-group is-vertical is-layout-flex wp-container-core-group-is-layout-8cf370e7 wp-block-group-is-layout-flex">
<div class="wp-block-group is-horizontal is-nowrap is-layout-flex wp-container-core-group-is-layout-41b81202 wp-block-group-is-layout-flex">
<figure class="wp-block-image size-full is-resized"><img src="https://www.codastory.com/wp-content/uploads/2025/12/captured-icon.png" alt="" class="wp-image-59986" style="width:40px;height:auto"/></figure>



<h2 class="wp-block-heading">Captured</h2>
</div>



<p>This Big Idea explores how this new technology is intended to redefine not just the way we work, but what it means to be human.</p>



<div class="wp-block-buttons is-layout-flex wp-block-buttons-is-layout-flex">
<div class="wp-block-button is-style-outline is-style-outline--5"><a class="wp-block-button__link has-x-small-font-size has-custom-font-size wp-element-button" href="https://www.codastory.com/the-age-of-exile/">Explore Captured</a></div>
</div>
</div>



<hr class="wp-block-separator has-alpha-channel-opacity"/>



<div class="wp-block-group is-vertical is-layout-flex wp-container-core-group-is-layout-8cf370e7 wp-block-group-is-layout-flex">
<div class="wp-block-group is-nowrap is-layout-flex wp-container-core-group-is-layout-ad2f72ca wp-block-group-is-layout-flex">
<figure class="wp-block-image size-full is-resized"><img src="https://www.codastory.com/wp-content/uploads/2025/12/age-of-exile-icon.png" alt="" class="wp-image-59985" style="width:40px;height:auto"/></figure>



<h2 class="wp-block-heading">The Age of Exile</h2>
</div>



<p>This Big Idea explores how displacement has evolved from historical punishment into a defining condition of our time. </p>



<div class="wp-block-buttons is-layout-flex wp-block-buttons-is-layout-flex">
<div class="wp-block-button is-style-outline is-style-outline--6"><a class="wp-block-button__link has-x-small-font-size has-custom-font-size wp-element-button" href="https://www.codastory.com/the-age-of-exile/">Explore The Age of Exile</a></div>
</div>
</div>
</div>

<div class="wp-block-group alignright converted-related-posts is-style-meta-info is-layout-flow wp-block-group-is-layout-flow">
<h4 class="wp-block-heading">Related Articles</h4>



<div class="wp-block-group is-vertical is-layout-flex wp-container-core-group-is-layout-8cf370e7 wp-block-group-is-layout-flex">
<div class="wp-block-fabrica-article-preview wp-block-fabrica-article-preview--alignment-left wp-block-fabrica-article-preview--external-source-local is-style-featured category-polarization post_tag-authoritarianism post_tag-human-rights post_tag-migration post_tag-perspective post_tag-transnational-repression idea-the-age-of-exile author-cap-nataliaantelava ">
<div class="wp-block-fabrica-article-preview-image is-style-round"><a class="wp-block-fabrica-article-preview-image__link" href="https://www.codastory.com/polarization/welcome-to-the-age-of-exile/"><img class="wp-block-fabrica-article-preview-image__image" src="https://www.codastory.com/wp-content/uploads/2025/11/Exile-Opener-250x250.jpg" srcset="https://www.codastory.com/wp-content/uploads/2025/11/Exile-Opener-250x250.jpg 250w, https://www.codastory.com/wp-content/uploads/2025/11/Exile-Opener-72x72.jpg 72w, https://www.codastory.com/wp-content/uploads/2025/11/Exile-Opener-232x232.jpg 232w, https://www.codastory.com/wp-content/uploads/2025/11/Exile-Opener-900x900.jpg 900w" width="250" height="250"/></a></div>



<div class="wp-block-group is-vertical is-layout-flex wp-container-core-group-is-layout-8cf370e7 wp-block-group-is-layout-flex">
<h2 class="wp-block-fabrica-article-preview-title is-style-sans has-small-font-size"><a class="wp-block-fabrica-article-preview-title__link" href="https://www.codastory.com/polarization/welcome-to-the-age-of-exile/">Welcome to the age of exile</a></h2>


<div class="wp-block-post-author-name">Natalia Antelava</div></div>
</div>



<div class="wp-block-fabrica-article-preview wp-block-fabrica-article-preview--alignment-left wp-block-fabrica-article-preview--external-source-local is-style-featured category-armed-conflict post_tag-border-surveillance post_tag-dissidents post_tag-memory post_tag-photo-essay post_tag-syria idea-the-age-of-exile author-cap-sarakontar author-cap-nadia-beard ">
<div class="wp-block-fabrica-article-preview-image is-style-round"><a class="wp-block-fabrica-article-preview-image__link" href="https://www.codastory.com/armed-conflict/the-price-of-exile-a-syrian-photographer-trapped-by-the-laws-that-saved-her/"><img class="wp-block-fabrica-article-preview-image__image" src="https://www.codastory.com/wp-content/uploads/2025/11/To-Visit-My-Home-I-VIsit-its-Borders-SK-11.jpg" srcset="https://www.codastory.com/wp-content/uploads/2025/11/To-Visit-My-Home-I-VIsit-its-Borders-SK-11.jpg 1920w, https://www.codastory.com/wp-content/uploads/2025/11/To-Visit-My-Home-I-VIsit-its-Borders-SK-11-600x450.jpg 600w, https://www.codastory.com/wp-content/uploads/2025/11/To-Visit-My-Home-I-VIsit-its-Borders-SK-11-1600x1200.jpg 1600w, https://www.codastory.com/wp-content/uploads/2025/11/To-Visit-My-Home-I-VIsit-its-Borders-SK-11-768x576.jpg 768w, https://www.codastory.com/wp-content/uploads/2025/11/To-Visit-My-Home-I-VIsit-its-Borders-SK-11-1536x1152.jpg 1536w" width="1920" height="1440"/></a></div>



<div class="wp-block-group is-vertical is-layout-flex wp-container-core-group-is-layout-8cf370e7 wp-block-group-is-layout-flex">
<h2 class="wp-block-fabrica-article-preview-title is-style-sans has-small-font-size"><a class="wp-block-fabrica-article-preview-title__link" href="https://www.codastory.com/armed-conflict/the-price-of-exile-a-syrian-photographer-trapped-by-the-laws-that-saved-her/">The price of exile: a Syrian photographer trapped by the laws that saved her</a></h2>


<div class="wp-block-post-author-name">Sara Kontar</div></div>
</div>



<div class="wp-block-fabrica-article-preview wp-block-fabrica-article-preview--alignment-left wp-block-fabrica-article-preview--external-source-local is-style-featured category-rewriting-history post_tag-authoritarianism post_tag-china post_tag-essay post_tag-uyghurs idea-complicating-colonialism author-cap-abduweliayup ">
<div class="wp-block-fabrica-article-preview-image is-style-round"><a class="wp-block-fabrica-article-preview-image__link" href="https://www.codastory.com/rewriting-history/uyghur-language-xinjiang-prison/"><img class="wp-block-fabrica-article-preview-image__image" src="https://www.codastory.com/wp-content/uploads/2024/06/Ancient-City-Two-Beds-250x250.jpg" srcset="https://www.codastory.com/wp-content/uploads/2024/06/Ancient-City-Two-Beds-250x250.jpg 250w, https://www.codastory.com/wp-content/uploads/2024/06/Ancient-City-Two-Beds-72x72.jpg 72w, https://www.codastory.com/wp-content/uploads/2024/06/Ancient-City-Two-Beds-232x232.jpg 232w, https://www.codastory.com/wp-content/uploads/2024/06/Ancient-City-Two-Beds-900x900.jpg 900w" width="250" height="250"/></a></div>



<div class="wp-block-group is-vertical is-layout-flex wp-container-core-group-is-layout-8cf370e7 wp-block-group-is-layout-flex">
<h2 class="wp-block-fabrica-article-preview-title is-style-sans has-small-font-size"><a class="wp-block-fabrica-article-preview-title__link" href="https://www.codastory.com/rewriting-history/uyghur-language-xinjiang-prison/">I risked prison to keep the Uyghur culture alive</a></h2>


<div class="wp-block-post-author-name">Abduweli Ayup</div></div>
</div>
</div>
</div>
<p>The post <a href="https://www.codastory.com/surveillance-and-control/the-digital-exiles-why-people-are-abandoning-their-smartphones/">The digital exiles: Why people are abandoning their smartphones</a> appeared first on <a href="https://www.codastory.com">Coda Story</a>.</p>
]]></content:encoded>
					
		
		<enclosure url="https://videos.files.wordpress.com/I0vY7Yfj/the-digital-exiles.mp4" length="1444482" type="video/mp4" />

		<post-id xmlns="com-wordpress:feed-additions:1">59354</post-id>	</item>
		<item>
		<title>Finding meaning in human lives</title>
		<link>https://www.codastory.com/authoritarian-tech/finding-meaning-in-human-lives/</link>
		
		<dc:creator><![CDATA[Isobel Cockerell]]></dc:creator>
		<pubDate>Mon, 03 Nov 2025 15:26:19 +0000</pubDate>
				<category><![CDATA[Authoritarian Technology]]></category>
		<category><![CDATA[Algorithms]]></category>
		<category><![CDATA[Artificial intelligence]]></category>
		<category><![CDATA[Authoritarianism]]></category>
		<category><![CDATA[Human Rights]]></category>
		<category><![CDATA[Q&A]]></category>
		<guid isPermaLink="false">https://www.codastory.com/?p=59027</guid>

					<description><![CDATA[<p>Nick Bostrom literally wrote the book on superintelligence. When machines can do nearly everything better than we can, he says, we must ask what our purpose is.</p>
<p>The post <a href="https://www.codastory.com/authoritarian-tech/finding-meaning-in-human-lives/">Finding meaning in human lives</a> appeared first on <a href="https://www.codastory.com">Coda Story</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>AI is replacing humans in the workplace, with tech companies among the quickest to simply innovate people out of the job market altogether. Amazon announced <a href="https://www.reuters.com/business/world-at-work/amazon-targets-many-30000-corporate-job-cuts-sources-say-2025-10-27/">plans</a> to lay off up to 30,000 people. The company hasn’t commented publicly on why, but Amazon’s CEO Andy Jassy has talked about how AI will eventually replace many of his white-collar employees. And it’s likely the money saved will be used to — you guessed it — build out more AI infrastructure.&nbsp;</p>





<p>This is just the beginning. “Innovation related to artificial intelligence could displace 6-7% of the US workforce if AI is widely adopted,” <a href="https://www.goldmansachs.com/insights/articles/how-will-ai-affect-the-global-workforce">says</a> a recent Goldman Sachs report.</p>



<p>In the last week, over 53,000 people <a href="https://superintelligence-statement.org/">signed</a> a statement calling for “a prohibition on the development of superintelligence.” A wide coalition of notable figures, from Nobel-winning scientists to senior politicians, writers, British royals, and radio shock jocks agreed that AI companies are racing to build superintelligence with little regard for concerns that include “human economic obsolescence and disempowerment.”</p>



<p>The petition against superintelligence development could be the beginning of organized political resistance to AI's unchecked advance. The signatories span continents and ideologies, suggesting a rare consensus emerging around the need for democratic oversight of AI development. The question is: can it organize quickly enough to influence policy before the key decisions are made in <a href="https://www.codastory.com/authoritarian-tech/captured-silicon-valley-future-religion-artificial-intelligence/">Silicon Valley boardrooms</a> and government backrooms?</p>



<p>But it’s not just jobs we could lose. The petition talks about the “losses of freedom, civil liberties, dignity… and even potential human extinction.” It reflects a deeper unease about the quasi-religious zeal of AI evangelists who view superintelligence not as a choice to be democratically decided, but as an inevitable evolution the tech bros alone can shepherd.</p>



<p>Coda explored this messianic ideology at length in "<em>Captured</em>," a six-part investigative series available as a <a href="https://www.audible.com/pd/Captured-Audiobook/B0DZJ5W4Y7?srsltid=AfmBOork5id0usmWl-sD9Ol_jmLNX5udjH_nFe8S93VEndZJlDKf5_Id">podcast on Audible</a> and as a <a href="https://www.codastory.com/captured/">series of articles</a> on our website, in which we dove deep into the future envisioned by the tech elite for the rest of us.</p>



<figure class="wp-block-image size-large"><img src="https://www.codastory.com/wp-content/uploads/2025/11/GettyImages-502680318-1731x1200.jpg" alt="" class="wp-image-59032"/><figcaption class="wp-element-caption">Philosopher Nick Bostrom, author of the book <em>Superintelligence: Paths, Dangers, Strategies</em>.<br>The Washington Post / Contributor via Getty Images</figcaption></figure>



<p>During our reporting, data scientist Christopher Wylie, best known as the Cambridge Analytica whistleblower, and I spoke to the Swedish philosopher Nick Bostrom, whose 2014 <a href="https://books.google.co.in/books/about/Superintelligence.html?id=7_H8AwAAQBAJ&amp;redir_esc=y">book</a> foresaw the possibility that our world might be taken over by an uncontrollable artificial superintelligence.</p>



<div class="wp-block-group is-nowrap is-layout-flex wp-container-core-group-is-layout-ad2f72ca wp-block-group-is-layout-flex">
<p>A decade later, with AI companies racing toward Artificial General Intelligence with minimal oversight, Bostrom’s concerns have become urgent. What struck me most during our conversation was his belief that we’re on the precipice of a huge societal paradigm shift, and that it’s unrealistic to think otherwise. It’s implausible, Bostrom says, to think human civilization will simply potter along as it is.&nbsp;</p>
</div>



<p>Do we believe in Bostrom’s version of the future where society plunges into dystopia or utopia? Or is there a middle way? Judge for yourself whether his warnings still sound theoretical.</p>



<p><em>This conversation has been edited and condensed for clarity.</em></p>



<p><strong>Christopher Wylie:</strong> To start, could you define what you mean by superintelligence and how it differs from the AI we see today?</p>



<p><strong>Nick Bostrom:</strong> Superintelligence is a form of cognitive processing system that not just matches but exceeds human cognitive abilities. If we're talking about general superintelligence, it would exceed our cognitive capacities in all fields — scientific creativity, common sense, general wisdom.</p>



<p><strong>Isobel Cockerell: </strong>What kind of future are we looking at — especially if we manage to develop superintelligence?</p>



<p><strong>Bostrom:</strong> So I think many people have the view that the most likely scenario is that things more or less continue as they have — maybe a little war here, a cool new gadget there, but basically the human condition continues indefinitely.&nbsp;</p>



<p>But I think that looks pretty implausible. It’s more likely that it will radically change. Either for the much better or for the much worse.</p>



<p>The longer the timeframe we consider — and these days I don’t think in terms of that many years — the closer we get to this critical juncture in human affairs, where we will either go extinct or suffer some comparably bad fate, or else be catapulted into some form of utopian condition.</p>



<p>You could think of the human condition as a ball rolling along a thin beam — and it will probably fall off that beam. But it’s hard to predict in which direction.</p>



<p><strong>Wylie:</strong> When you think about these two almost opposite outcomes — one where humanity is subsumed by superintelligence, and the other where technology liberates us into a utopia — do humans ultimately become redundant in either case?</p>



<p><strong>Bostrom:</strong> In the sense of practical utility, yes — I think we will reach, or at least approximate, a state where human labor is not needed for anything. There’s no practical objective that couldn’t be better achieved by machines, by AIs and robots.</p>



<p>But you have to ask what it’s all for. Possibly we have a role as consumers of all this abundance. It’s like having a big Disneyland — maybe in the future you could automate the whole park so no human employees are needed. But even then, you still need the children to enjoy it.</p>



<p>If we really take seriously this notion that we could develop AI that can do everything we can do, and do it much better, we will then face quite profound questions about the purpose of human life. If there’s nothing we need to do — if we could just press a button and have everything done — what do we do all day long? What gives meaning to our lives?</p>



<p>And so ultimately, I think we need to envisage a future that accommodates humans, animals, and AIs of various different shapes and levels — all living happy lives in harmony.</p>



<p><strong>Cockerell:</strong> How far do you trust the people in Silicon Valley to guide us toward a better future?</p>



<p><strong>Bostrom:</strong> I mean, there’s a sense in which I don’t really trust anybody. I think we humans are not fully competent here — but we still have to do it as best we can.</p>



<p>If you were a divine creature looking down, it might seem like a comedy: these ape-like humans running around building super-powerful machines they barely understand, occasionally fighting with rocks and stones, then going back to building again. That must be what the human condition looks like from the point of view of some billion-year-old alien civilization.</p>



<p>So that’s kind of where we are.</p>



<p>Ultimately, it’ll be a much bigger conversation about how this technology should be used. If we develop superintelligence, all humans will be exposed to its risks — even if you have nothing to do with AI, even if you’re a farmer somewhere who has never heard of it, you’ll still be affected. So it seems fair that if things go well, everyone should also share some of the upside.</p>



<p>You don’t want to pre-commit to doing all of this open-source. For example, Meta is pursuing open-source AI — so far, that’s good. But at some point, these models will become capable of lending highly useful assistance in developing weapons of mass destruction.</p>



<p>Now, before releasing their model, they fine-tune it to refuse those requests. But once they open-source it, everyone has access to the model weights. It’s easy to remove that fine-tuning and unlock these latent capabilities.</p>



<p>This works great for normal software and relatively modest AI, but there might be a level where it just democratizes mass destruction.</p>



<p><strong>Wylie:</strong> But on the flip side — if you concentrate that power in the hands of a few people authorized to build and use the most powerful AIs, isn’t there also a high risk of abuse? Governments or corporations misusing it against people or other groups?</p>





<p><strong>Bostrom:</strong> When we figure out how to make powerful superintelligence, if development is completely open — with many entities, companies, and groups all competing to get there first — and it turns out alignment is actually hard, you might need a year or two to train, make sure it’s safe, and test and double-test before really ramping things up. That just might not be possible in an open competitive scenario.</p>



<p>You might be responsible — one of the lead developers who chooses to do it carefully — but that just means you forfeit the lead to whoever is willing to take more risks. If there are 10 or 20 groups racing in different countries and companies, there will always be someone willing to cut more corners.</p>



<p><strong>Wylie: </strong>More broadly, do you have conversations with people in Silicon Valley — Sam Altman, Elon Musk, the leaders of major tech companies — about your concerns, and their role in shaping or preventing some of the long-term risks of AI?</p>



<p><strong>Bostrom:</strong> Yeah. I’ve had quite a few conversations. What’s striking, when thinking specifically about AI, is that many of the early people in the frontier labs have, for years, been seriously engaged with questions about what happens when AI succeeds — superintelligence, alignment, and so on.</p>



<p>That’s quite different from the typical tech founder focused on capturing markets and launching products. For historical reasons, many early AI researchers have been thinking ahead about these deeper issues for a long time, even if they reach different conclusions about what to do.</p>



<p>And it’s always possible to imagine a more ideal world, but relatively speaking, I think we’ve been quite lucky so far. The impact of current AI technologies has been mostly positive — search engines, spam filters, and now these large language models that are genuinely useful for answering questions and helping with coding.</p>



<p>I would imagine that the benefits will continue to far outweigh the downsides — at least until the final stage, where it becomes more of an open question whether we end up with a kind of utopia or an existential catastrophe.</p>



<p><em>A version of this story was published in this week’s Coda Currents newsletter. </em><a href="https://www.codastory.com/newsletters/"><em>Sign up here</em></a><em>.</em></p>

<div class="wp-block-group alignleft is-style-meta-info is-layout-constrained wp-block-group-is-layout-constrained">
<h3 class="wp-block-heading" id="h-why-this-story">Your Early Warning System</h3>



<p class="is-style-sans has-small-font-size">This story is part of “<a href="https://www.codastory.com/idea/captured/">Captured</a>”, our special issue in which we ask whether AI, as it becomes integrated into every part of our lives, is now a belief system. Who are the prophets? What are the commandments? Is there an ethical code? How do the AI evangelists imagine the future? And what does that future mean for the rest of us? You can listen to the Captured audio series <a href="https://www.audible.com/pd/Captured-Audiobook/B0DZJ5W4Y7?qid=1743678504&amp;sr=1-1&amp;ref_pageloadid=not_applicable&amp;pf_rd_p=83218cca-c308-412f-bfcf-90198b687a2f&amp;pf_rd_r=E9Q9MZKWCN2NBSBC3PB0&amp;plink=tXvuPW1hHaatATEj&amp;pageLoadId=J06yHclGbh1Idv9o&amp;creativeId=0d6f6720-f41c-457e-a42b-8c8dceb62f2c&amp;ref=a_search_c3_lProduct_1_1">on Audible now.</a></p>
</div>

<div class="wp-block-group alignright converted-related-posts is-style-meta-info is-layout-flow wp-block-group-is-layout-flow">
<h4 class="wp-block-heading">Related Articles</h4>



<div class="wp-block-group is-vertical is-layout-flex wp-container-core-group-is-layout-8cf370e7 wp-block-group-is-layout-flex">
<div class="wp-block-fabrica-article-preview wp-block-fabrica-article-preview--alignment-left wp-block-fabrica-article-preview--external-source-local is-style-featured category-authoritarian-tech post_tag-q-and-a idea-captured author-cap-isobelcockerell ">
<div class="wp-block-fabrica-article-preview-image is-style-round"><a class="wp-block-fabrica-article-preview-image__link" href="https://www.codastory.com/authoritarian-tech/who-owns-the-rights-to-your-brain/"><img class="wp-block-fabrica-article-preview-image__image" src="https://www.codastory.com/wp-content/uploads/2025/04/Brain-250x250.gif" srcset="https://www.codastory.com/wp-content/uploads/2025/04/Brain-250x250.gif 250w, https://www.codastory.com/wp-content/uploads/2025/04/Brain-72x72.gif 72w, https://www.codastory.com/wp-content/uploads/2025/04/Brain-232x232.gif 232w, https://www.codastory.com/wp-content/uploads/2025/04/Brain-900x900.gif 900w" width="250" height="250"/></a></div>



<div class="wp-block-group is-vertical is-layout-flex wp-container-core-group-is-layout-8cf370e7 wp-block-group-is-layout-flex">
<h2 class="wp-block-fabrica-article-preview-title is-style-sans has-small-font-size"><a class="wp-block-fabrica-article-preview-title__link" href="https://www.codastory.com/authoritarian-tech/who-owns-the-rights-to-your-brain/">Who owns the rights to your brain?</a></h2>


<div class="wp-block-post-author-name">Isobel Cockerell</div></div>
</div>



<div class="wp-block-fabrica-article-preview wp-block-fabrica-article-preview--alignment-left wp-block-fabrica-article-preview--external-source-local is-style-featured category-authoritarian-tech post_tag-algorithms post_tag-artificial-intelligence post_tag-content-moderation post_tag-perspective idea-captured author-cap-isobelcockerell ">
<div class="wp-block-fabrica-article-preview-image is-style-round"><a class="wp-block-fabrica-article-preview-image__link" href="https://www.codastory.com/authoritarian-tech/captured-silicon-valley-future-religion-artificial-intelligence/"><img class="wp-block-fabrica-article-preview-image__image" src="https://www.codastory.com/wp-content/uploads/2025/03/Header-Captured.gif" width="1920" height="1080"/></a></div>



<div class="wp-block-group is-vertical is-layout-flex wp-container-core-group-is-layout-8cf370e7 wp-block-group-is-layout-flex">
<h2 class="wp-block-fabrica-article-preview-title is-style-sans has-small-font-size"><a class="wp-block-fabrica-article-preview-title__link" href="https://www.codastory.com/authoritarian-tech/captured-silicon-valley-future-religion-artificial-intelligence/">Captured: how Silicon Valley is building a future we never chose</a></h2>


<div class="wp-block-post-author-name">Isobel Cockerell</div></div>
</div>



<div class="wp-block-fabrica-article-preview wp-block-fabrica-article-preview--alignment-left wp-block-fabrica-article-preview--external-source-local is-style-featured category-authoritarian-tech post_tag-artificial-intelligence post_tag-conspiracy-theories post_tag-first-person post_tag-information-war idea-captured author-cap-j-paulneeley ">
<div class="wp-block-fabrica-article-preview-image is-style-round"><a class="wp-block-fabrica-article-preview-image__link" href="https://www.codastory.com/authoritarian-tech/when-im-125/"><img class="wp-block-fabrica-article-preview-image__image" src="https://www.codastory.com/wp-content/uploads/2025/04/1.5.7mb-250x250.gif" srcset="https://www.codastory.com/wp-content/uploads/2025/04/1.5.7mb-250x250.gif 250w, https://www.codastory.com/wp-content/uploads/2025/04/1.5.7mb-72x72.gif 72w, https://www.codastory.com/wp-content/uploads/2025/04/1.5.7mb-232x232.gif 232w" width="250" height="250"/></a></div>



<div class="wp-block-group is-vertical is-layout-flex wp-container-core-group-is-layout-8cf370e7 wp-block-group-is-layout-flex">
<h2 class="wp-block-fabrica-article-preview-title is-style-sans has-small-font-size"><a class="wp-block-fabrica-article-preview-title__link" href="https://www.codastory.com/authoritarian-tech/when-im-125/">When I’m 125?</a></h2>


<div class="wp-block-post-author-name">J. Paul Neeley</div></div>
</div>
</div>
</div>
<p>The post <a href="https://www.codastory.com/authoritarian-tech/finding-meaning-in-human-lives/">Finding meaning in human lives</a> appeared first on <a href="https://www.codastory.com">Coda Story</a>.</p>
]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">59027</post-id>	</item>
		<item>
		<title>The capture of journalism and the illusion of objectivity</title>
		<link>https://www.codastory.com/authoritarian-tech/the-capture-of-journalism-and-the-illusion-of-objectivity/</link>
		
		<dc:creator><![CDATA[Natalia Antelava]]></dc:creator>
		<pubDate>Wed, 07 May 2025 06:43:08 +0000</pubDate>
				<category><![CDATA[Authoritarian Technology]]></category>
		<category><![CDATA[Algorithms]]></category>
		<category><![CDATA[Attacks on press freedom]]></category>
		<category><![CDATA[Information War]]></category>
		<category><![CDATA[Perspective]]></category>
		<category><![CDATA[Rewriting history]]></category>
		<guid isPermaLink="false">https://www.codastory.com/?p=56295</guid>

					<description><![CDATA[<p>Faced with hostility, hollowed out by Big Tech, journalists must ask themselves a question: ‘What do we stand for?’</p>
<p>The post <a href="https://www.codastory.com/authoritarian-tech/the-capture-of-journalism-and-the-illusion-of-objectivity/">The capture of journalism and the illusion of objectivity</a> appeared first on <a href="https://www.codastory.com">Coda Story</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>In early April, I found myself in the breathtaking Chiesa di San Francesco al Prato in Perugia, Italy, talking about men who are on a mission to achieve immortality.</p>





<p>As sunlight filtered through glass onto worn stone walls, Cambridge Analytica whistleblower Christopher Wylie recounted a dinner with a Silicon Valley mogul who believes drinking his son's blood will help him live forever.</p>



<p>"We've got it wrong," Bryan Johnson told Chris. "God didn't create us. We're going to create God and then we're going to merge with him."</p>



<p>This wasn't hyperbole. It's the worldview taking root among tech elites who have the power, wealth, and unbounded ambition to shape our collective future.</p>



<p>Working on<a href="https://www.audible.com/pd/Captured-Audiobook/B0DZJ5W4Y7"> “Captured: The Secret Behind Silicon Valley's AI Takeover</a>” podcast, which we<a href="https://www.journalismfestival.com/programme/2025/captured-how-silicon-valleys-ai-emperors-are-reshaping-reality"> presented</a> in that church in Perugia, we realized we weren't just investigating technology – we were documenting a fundamentalist movement with all the trappings of prophecy, salvation, and eternal life. And yet, talking about it from the stage to my colleagues in Perugia, I felt, for a second at least, like a conspiracy theorist. Discussing blood-drinking tech moguls and godlike ambitions at a journalism conference felt jarring, even inappropriate. I sensed, instinctively, that not everyone was willing to hear what our reporting had uncovered. The truth is, these ideas aren’t fringe at all – they are the root of the new power structures shaping our reality.</p>



<p>“Stop being so polite,” Chris Wylie urged the audience, challenging journalists to confront the cultish drive for transcendence, the quasi-religious fervor animating tech’s most powerful figures.&nbsp;</p>



<p>We've ignored this story, in part at least, because the journalism industry has chosen to be “friends” with Big Tech: accepting platform funding, entering into “partnerships,” and treating tech companies as potential saviors instead of recognizing the fundamental incompatibility between their business models and the requirements of a healthy information ecosystem — an ecosystem as essential to journalism as air is to humanity.</p>



<p>In effect, journalism has been complicit in its own capture. That complicity has blunted our ability to fulfil journalism's most basic societal function: holding power to account.</p>



<p>As tech billionaires have emerged as some of the most powerful actors on the global stage, our industry—so eager to believe in their promises—has struggled to confront them with the same rigor and independence we once reserved for governments, oligarchs, or other corporate powers.</p>



<p>This tension surfaced most clearly during a<a href="https://www.journalismfestival.com/programme/2025/comment-is-free-facts-are-sacred-wasted"> panel</a> at the festival when I challenged Alan Rusbridger, former editor-in-chief of “The Guardian” and current Meta Oversight Board member, about resigning in light of Meta's abandonment of fact-checking. His response echoed our previous exchanges: board membership, he maintains, allows him to influence individual cases despite the troubling broader direction.</p>



<p>This defense exposes the fundamental trap of institutional capture. Meta has systematically recruited respected journalists, human rights defenders, and academics to well-paid positions on its Oversight Board, lending it a veneer of credibility. When board members like Rusbridger justify their participation through "minor victories," they ignore how their presence legitimizes a business model fundamentally incompatible with the public interest.</p>



<blockquote class="wp-block-quote is-layout-flow wp-block-quote-is-layout-flow">
<p>What once felt like slow erosion now feels like a landslide, accelerated by broligarchs who claim to champion free speech while their algorithms amplify authoritarians.</p>
</blockquote>



<p>Imagine a climate activist serving on an Exxon-established climate change oversight board, tasked with reviewing a handful of complaints while Exxon continues to pour billions into fossil fuel expansion and climate denial.&nbsp;</p>



<p>Meta's oversight board provides cover for a platform whose design and priorities fundamentally undermine our shared reality. The "public square" — a space for listening and conversation that the internet once promised to nurture but is now helping to destroy — isn't merely a metaphor; it's the essential infrastructure of justice and open society.</p>



<p>Trump's renewed attacks on the press, the abrupt withdrawal of U.S. funding for independent media around the world, platform complicity in spreading disinformation, and the normalization of hostility toward journalists have stripped away any illusions about where we stand. What once felt like slow erosion now feels like a landslide, accelerated by<a href="https://www.codastory.com/oligarchy/the-age-of-broligarchy/"> broligarchs</a> who claim to champion free speech while their algorithms amplify authoritarians.</p>



<h3 class="wp-block-heading"><strong>The Luxury of Neutrality</strong></h3>



<p>If there is one upside to the dire state of the world, it’s that the fog has lifted. In Perugia, the new sense of clarity was palpable. Unlike last year, when so many drifted into resignation, the mood this time was one of resolve. The stakes were higher, the threats more visible, and everywhere I looked, people were not just lamenting what had been lost – they were plotting and preparing to defend what matters most.</p>



<p>One unintended casualty of this new clarity is the old concept of journalistic objectivity. For decades, objectivity was held up as the gold standard of our profession – a shield against accusations of bias. But as attacks on the media intensify and the very act of journalism becomes increasingly criminalized and demonized around the world, it’s clear that objectivity was always a luxury, available only to a privileged few. For many who have long worked under threat, neutrality was never an option. Now, as the ground shifts beneath all of us, their experience and strategies for survival have become essential lessons for the entire field.</p>



<p>That was the spirit animating our “Am I Black Enough?” panel in Perugia, which brought together three extraordinary Black American media leaders, with me as moderator.</p>



<p>“I come out of the Black media tradition whose origins were in activism,” said Sara Lomax, co-founder of URL Media and head of WURD, Philadelphia’s oldest Black talk radio station. She reminded us that the first Black newspaper in America was founded in 1827, decades before emancipation, to advocate for the humanity of people who were still legally considered property.</p>



<p>Karen McMullen, festival director of Urbanworld, spoke to the exhaustion and perseverance that define the Black American experience: “We would like to think that we could rest on the successes that our parents and ancestors have made towards equality, but we can’t. So we’re exhausted but we will prevail.”</p>



<p>And as veteran journalist and head of the Maynard Institute Martin Reynolds put it, “Black struggle is a struggle to help all. What’s good for us tends to be good for all. We want fair housing, we want education, we want to be treated with respect.”</p>



<p>Near the end of our session, an audience member challenged my role as a white moderator on a panel about Black experiences. This moment crystallized how the boundaries we draw around our identities can both protect and divide us. It also highlighted exactly why we had organized the panel in the first place: to remind us that the tools of survival and resistance forged by those long excluded from "objectivity" are now essential for everyone facing the erosion of old certainties.</p>



<figure class="wp-block-gallery has-nested-images columns-default is-cropped wp-block-gallery-8 is-layout-flex wp-block-gallery-is-layout-flex">
<figure class="wp-block-image size-large"><img data-id="56304" src="https://www.codastory.com/wp-content/uploads/2025/05/DSC_6819-801x1200.jpg" alt="" class="wp-image-56304"/></figure>



<figure class="wp-block-image size-large"><img data-id="56305" src="https://www.codastory.com/wp-content/uploads/2025/05/DSC_6888-1797x1200.jpg" alt="" class="wp-image-56305"/></figure>



<figure class="wp-block-image size-large"><img data-id="56303" src="https://www.codastory.com/wp-content/uploads/2025/05/DSC_6874-1-1797x1200.jpg" alt="" class="wp-image-56303"/></figure>
<figcaption class="blocks-gallery-caption wp-element-caption">Sara Lomax (WURD/URL Media), Karen McMullen (Urbanworld) &amp; Martin Reynolds (Maynard Institute) discuss how the Black press in America was born from activism, fighting for the humanity of people who were still legally considered property - a tradition of purpose-driven journalism that offers critical lessons today. Ascanio Pepe/Creative Commons (CC BY ND 4.0)&nbsp;</figcaption></figure>



<h3 class="wp-block-heading"><strong>The Power of Protected Spaces</strong></h3>



<p>If there’s one lesson from those who have always lived on the frontlines and never had the luxury of neutrality, it’s that survival depends on carving out spaces where your story, your truth, and your community can endure, even when the world outside is hostile.</p>



<p>That idea crystallized for me one night in Perugia, when during a dinner with colleagues battered by layoffs, lawsuits, and threats far graver than those I face, someone suggested we play a game: “What gives you hope?” When it was my turn, I found myself talking about finding hope in spaces where freedom lives on. Spaces that can always be found, no matter how dire the circumstances.&nbsp;</p>



<p>I mentioned my parents, dissidents in the Soviet Union, for whom the kitchen was a sanctuary for forbidden conversations. And Georgia, my homeland – a place that has preserved its identity through centuries of invasion because its people fought, time and again, for the right to write their own story. Even now, as protesters fill the streets to defend the same values my parents once whispered about in the kitchen, their resilience is a reminder that survival depends on protecting the spaces where you can say who you are.</p>



<p>But there’s a catch: to protect the spaces where you can say who you are, you first have to know what you stand for – and who stands with you. Is it the tech bros who dream of living forever, conquering Mars, and who rush to turn their backs on diversity and equity at the first opportunity? Or is it those who have stood by the values of human dignity and justice, who have fought for the right to be heard and to belong, even when the world tried to silence them?&nbsp;</p>



<p>As we went around the table, each of us sharing what gave us hope, one of our dinner companions, a Turkish lawyer, offered a metaphor in response to my point about the need to protect spaces. “In climate science,” she said, “they talk about protected areas – patches of land set aside so that life can survive when the ecosystem around it collapses. They don’t stop the storms, but they give something vital a chance to endure, adapt, and, when the time is right, regenerate.”</p>



<p>That's what we need now: protected areas for uncomfortable truths and complexity. Not just newsrooms, but dinner tables, group chats, classrooms, gatherings that foster unlikely alliances - anywhere we can still speak honestly, listen deeply, and dare to imagine.</p>



<p>More storms will come. More authoritarians will rise. Populist strongmen and broligarchs will keep fragmenting our shared reality.</p>



<p>But if history has taught us anything – from Soviet kitchens to Black newspapers founded in the shadow of slavery - it’s that carefully guarded spaces where stories and collective memory are kept alive have always been the seedbeds of change.</p>



<p>When we nurture these sanctuaries of complex truth against all odds, we aren't just surviving. We're quietly cultivating the future we wish to see.</p>



<p>And in times like these, that's not just hope - it's a blueprint for renewal.</p>
<p>The post <a href="https://www.codastory.com/authoritarian-tech/the-capture-of-journalism-and-the-illusion-of-objectivity/">The capture of journalism and the illusion of objectivity</a> appeared first on <a href="https://www.codastory.com">Coda Story</a>.</p>
]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">56295</post-id>	</item>
		<item>
		<title>Captured: how Silicon Valley is building a future we never chose</title>
		<link>https://www.codastory.com/authoritarian-tech/captured-silicon-valley-future-religion-artificial-intelligence/</link>
		
		<dc:creator><![CDATA[Isobel Cockerell]]></dc:creator>
		<pubDate>Thu, 03 Apr 2025 14:04:54 +0000</pubDate>
				<category><![CDATA[Authoritarian Technology]]></category>
		<category><![CDATA[Algorithms]]></category>
		<category><![CDATA[Artificial intelligence]]></category>
		<category><![CDATA[Content moderation]]></category>
		<category><![CDATA[Perspective]]></category>
		<guid isPermaLink="false">https://www.codastory.com/?p=55514</guid>

					<description><![CDATA[<p>AI’s prophets speak of the technology with religious fervor. And they expect us all to become believers.</p>
<p>The post <a href="https://www.codastory.com/authoritarian-tech/captured-silicon-valley-future-religion-artificial-intelligence/">Captured: how Silicon Valley is building a future we never chose</a> appeared first on <a href="https://www.codastory.com">Coda Story</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>In April last year I was in Perugia, at the annual international journalism festival. I was sitting in a panel session about whether AI marked the end of journalism, when a voice note popped up on my Signal.&nbsp;</p>





<p>It came from Christopher Wylie. He’s a data scientist and the whistleblower who cracked open the Cambridge Analytica scandal in 2018. I had just started working with him on a new investigation into AI. Chris was supposed to be meeting me, but he had found himself trapped in Dubai at a party full of Silicon Valley venture capitalists.</p>



<p>“I don’t know if you can hear me — I’m in the toilet at this event, and people here are talking about longevity, how to live forever, but also prepping for when people revolt and when society gets completely undermined,” he had whispered into his phone. “You have in another part of the world, a bunch of journalists talking about how to save democracy. And here, you've got a bunch of tech guys thinking about how to live past democracy and survive.”</p>



<figure class="wp-block-audio"><audio controls src="https://www.codastory.com/wp-content/uploads/2025/04/Chris-voicenote-COMPLETE.mp3"></audio></figure>



<p>A massive storm and a once-in-a-generation flood had paralyzed Dubai when Chris was on a layover on his way to Perugia. He couldn’t leave. And neither could the hundreds of tech guys who were there for a crypto summit. The freakish weather hadn’t stopped them partying, Chris told me over a frantic Zoom call.&nbsp;</p>



<p>“You're wading through knee-deep water, people are screaming everywhere, and then…&nbsp; What do all these bros do? They organize a party. It's like the world is collapsing outside and yet you go inside and it's billionaires and centimillionaires having a party,” he said. “Dubai right now is a microcosm of the world. The world is collapsing outside and the people are partying.”</p>



<p>Chris and I eventually managed to meet up. And for over a year we worked together on a podcast that asks what is really going on inside the tech world.&nbsp; We looked at how the rest of us —&nbsp; journalists, artists, nurses, businesses, even governments — are being captured by big tech’s ambitions for the future and how we can fight back.&nbsp;</p>



<blockquote class="wp-block-quote is-layout-flow wp-block-quote-is-layout-flow">
<p>Mercy was a content moderator for Meta. She was paid around a dollar an hour for work that left her so traumatized that she couldn't sleep. And when she tried to unionize, she was laid off.</p>
</blockquote>



<p>Our reporting took us around the world from the lofty hills of Twin Peaks in San Francisco to meet the people building AI models, to the informal settlements of Kenya to meet the workers training those models.<br></p>



<p>One of these people was Mercy Chimwani, who we visited in her makeshift house with no roof on the outskirts of Nairobi. There was mud beneath our feet, and above you could see the rainclouds through a gaping hole where the unfinished stairs met the sky. When it rained, Mercy told us, water ran right through the house. It’s hard to believe, but she worked for Meta.&nbsp;</p>



<p>Mercy was a content moderator, hired by the middlemen Meta used to source employees. Her job was to watch the internet’s most horrific images and video –&nbsp; training the company’s system so it can automatically filter out such content before the rest of us are exposed to it.&nbsp;</p>



<p>She was paid around a dollar an hour for work that left her so traumatized that she couldn’t sleep. And when she and her colleagues tried to unionize, she was laid off. Mercy was part of the invisible, ignored workforce in the Global South that enables our frictionless life online for little reward.&nbsp;</p>



<p>Of course, we went to the big houses too — where the other type of tech worker lives. The huge palaces made of glass and steel in San Francisco, where the inhabitants believe the AI they are building will one day help them live forever, and discover everything there is to know about the universe.&nbsp;</p>



<p>In Twin Peaks, we spoke to Jeremy Nixon, the creator of AGI House San Francisco (AGI for <em>Artificial General Intelligence</em>). Nixon described an apparently utopian future, a place where we never have to work, where AI does everything for us, and where we can install the sum of human knowledge into our brains. “The intention is to allow every human to know everything that’s known,” he told me.&nbsp;</p>





<p>Later that day, we went to a barbecue in Cupertino and got talking to Alan Boehme, once a chief technology officer for some of the biggest companies in the world, and now an investor in AI startups. Boehme told us how important it was, from his point of view, that tech wasn’t stymied by government regulation. “We have to be worried that people are going to over-regulate it. Europe is the worst, to be honest with you,” he said. “Let's look at how we can benefit society and how this can help lead the world as opposed to trying to hold it back.”</p>



<p>I asked him if regulation wasn’t part of the reason we have democratically elected governments, to ensure that all people are kept safe, that some people aren’t left behind by the pace of change? Shouldn’t the governments we elect be the ones deciding whether we regulate AI and not the people at this Cupertino barbecue?</p>



<p>“You sound like you're from Sweden,” Boehme responded. “I'm sorry, that's social democracy. That is not what we are here in the U.S. This country is based on a Constitution. We're not based on everybody being equal and holding people back. No, we're not in Sweden.”&nbsp;</p>



<p>As we reported for the podcast, we came to a gradual realization – what’s being built in Silicon Valley isn’t just artificial intelligence, it’s a way of life — even a religion. And it’s a religion we might not have any choice but to join.&nbsp;</p>



<p>In January, the Vatican released a <a href="https://press.vatican.va/content/salastampa/it/bollettino/pubblico/2025/01/28/0083/01166.html#ing">statement</a> in which it argued that we’re in danger of worshiping AI as God. It's an idea we'd discussed with <a href="https://www.codastory.com/authoritarian-tech/stop-drinking-from-the-toilet/">Judy Estrin</a>, who worked on building some of the earliest iterations of the internet. As a young researcher at Stanford in the 1970s, Estrin was building some of the very first networked connections. She is no technophobe, fearful of the future, but she is worried about the zealotry she says is taking over Silicon Valley.</p>



<blockquote class="wp-block-quote is-layout-flow wp-block-quote-is-layout-flow">
<p>What if they truly believe humans are replaceable, that traditional concepts of humanity are outdated, that a technological "god" should supersede us? These aren't just ideological positions&nbsp;– they're the foundations for the world being built around us.</p>
</blockquote>



<p>“If you worship innovation, if you worship anything, you can't take a step back and think about guardrails,” she said about the unquestioning embrace of AI. “So we, from a leadership perspective, are very vulnerable to techno populists who come out and assert that this is the only way to make something happen.”&nbsp;</p>





<p>The first step toward reclaiming our lost agency, as AI aims to capture every facet of our world, is simply to pay attention. I've been struck by how rarely we actually listen to what tech leaders are explicitly saying about their vision of the future.&nbsp;</p>



<p>There's a tendency to dismiss their most extreme statements as hyperbole or marketing, but what if they're being honest? What if they truly believe humans, or at least most humans, are replaceable, that traditional concepts of humanity are outdated, that a technological "god" should supersede us? These aren't just ideological positions – they're the foundations for the world being built around us right now.&nbsp;</p>



<p>In our series, we explore artificial intelligence as something that affects our culture, our jobs, our media and our politics. But we should also ask what tech founders and engineers are really building with AI, or what they think they’re building. Because if their vision of society does not have a place for us in it, we should be ready to reclaim our destiny – before our collective future is captured.</p>



<p><em>Our audio documentary series, CAPTURED: The Secret Behind Silicon Valley’s AI Takeover is <a href="https://www.audible.com/pd/Captured-Audiobook/B0DZJ5W4Y7?qid=1743678504&amp;sr=1-1&amp;ref_pageloadid=not_applicable&amp;pf_rd_p=83218cca-c308-412f-bfcf-90198b687a2f&amp;pf_rd_r=E9Q9MZKWCN2NBSBC3PB0&amp;plink=tXvuPW1hHaatATEj&amp;pageLoadId=J06yHclGbh1Idv9o&amp;creativeId=0d6f6720-f41c-457e-a42b-8c8dceb62f2c&amp;ref=a_search_c3_lProduct_1_1">available now on Audible.</a> Do please tune in, and you can dig deeper into our stories and the people we met during the reporting below.</em></p>

<div class="wp-block-group alignleft is-style-meta-info is-layout-constrained wp-block-group-is-layout-constrained">
<h3 class="wp-block-heading" id="h-why-this-story">Your Early Warning System</h3>



<p class="is-style-sans has-small-font-size">This story is part of “<a href="https://www.codastory.com/idea/captured/" target="_blank" rel="noreferrer noopener">Captured</a>”, our special issue in which we ask whether AI, as it becomes integrated into every part of our lives, is now a belief system. Who are the prophets? What are the commandments? Is there an ethical code? How do the AI evangelists imagine the future? And what does that future mean for the rest of us? You can listen to the Captured audio series <a href="https://www.audible.com/pd/Captured-Audiobook/B0DZJ5W4Y7?qid=1743678504&amp;sr=1-1&amp;ref_pageloadid=not_applicable&amp;pf_rd_p=83218cca-c308-412f-bfcf-90198b687a2f&amp;pf_rd_r=E9Q9MZKWCN2NBSBC3PB0&amp;plink=tXvuPW1hHaatATEj&amp;pageLoadId=J06yHclGbh1Idv9o&amp;creativeId=0d6f6720-f41c-457e-a42b-8c8dceb62f2c&amp;ref=a_search_c3_lProduct_1_1">on Audible now. </a></p>
</div>

<div class="wp-block-group alignright converted-related-posts is-style-meta-info is-layout-flow wp-block-group-is-layout-flow">
<h4 class="wp-block-heading">Dig deeper into our CAPTURED series </h4>



<div class="wp-block-group is-vertical is-layout-flex wp-container-core-group-is-layout-8cf370e7 wp-block-group-is-layout-flex">
<div class="wp-block-fabrica-article-preview wp-block-fabrica-article-preview--alignment-left wp-block-fabrica-article-preview--external-source-local is-style-featured category-authoritarian-tech post_tag-q-and-a idea-captured author-cap-isobelcockerell ">
<div class="wp-block-fabrica-article-preview-image is-style-round"><a class="wp-block-fabrica-article-preview-image__link" href="https://www.codastory.com/authoritarian-tech/who-owns-the-rights-to-your-brain/"><img class="wp-block-fabrica-article-preview-image__image" src="https://www.codastory.com/wp-content/uploads/2025/04/Brain-250x250.gif" srcset="https://www.codastory.com/wp-content/uploads/2025/04/Brain-250x250.gif 250w, https://www.codastory.com/wp-content/uploads/2025/04/Brain-72x72.gif 72w, https://www.codastory.com/wp-content/uploads/2025/04/Brain-232x232.gif 232w, https://www.codastory.com/wp-content/uploads/2025/04/Brain-900x900.gif 900w" width="250" height="250"/></a></div>



<div class="wp-block-group is-vertical is-layout-flex wp-container-core-group-is-layout-8cf370e7 wp-block-group-is-layout-flex">
<h2 class="wp-block-fabrica-article-preview-title is-style-sans has-small-font-size"><a class="wp-block-fabrica-article-preview-title__link" href="https://www.codastory.com/authoritarian-tech/who-owns-the-rights-to-your-brain/">Who owns the rights to your brain?</a></h2>


<div class="wp-block-post-author-name">Isobel Cockerell</div></div>
</div>



<div class="wp-block-fabrica-article-preview wp-block-fabrica-article-preview--alignment-left wp-block-fabrica-article-preview--external-source-local is-style-featured category-authoritarian-tech idea-captured author-cap-isobelcockerell ">
<div class="wp-block-fabrica-article-preview-image is-style-round"><a class="wp-block-fabrica-article-preview-image__link" href="https://www.codastory.com/authoritarian-tech/the-hidden-workers-who-train-ai-from-kenyas-slums/"><img class="wp-block-fabrica-article-preview-image__image" src="https://www.codastory.com/wp-content/uploads/2025/04/ezgif-6426ce7769f3e3.webp" width="600" height="378"/></a></div>



<div class="wp-block-group is-vertical is-layout-flex wp-container-core-group-is-layout-8cf370e7 wp-block-group-is-layout-flex">
<h2 class="wp-block-fabrica-article-preview-title is-style-sans has-small-font-size"><a class="wp-block-fabrica-article-preview-title__link" href="https://www.codastory.com/authoritarian-tech/the-hidden-workers-who-train-ai-from-kenyas-slums/">In Kenya’s slums, they’re doing our digital dirty work</a></h2>


<div class="wp-block-post-author-name">Isobel Cockerell</div></div>
</div>



<div class="wp-block-fabrica-article-preview wp-block-fabrica-article-preview--alignment-left wp-block-fabrica-article-preview--external-source-local is-style-featured category-authoritarian-tech post_tag-algorithms post_tag-perspective post_tag-united-states idea-captured author-cap-judyestrin ">
<div class="wp-block-fabrica-article-preview-image is-style-round"><a class="wp-block-fabrica-article-preview-image__link" href="https://www.codastory.com/authoritarian-tech/stop-drinking-from-the-toilet/"><img class="wp-block-fabrica-article-preview-image__image" src="https://www.codastory.com/wp-content/uploads/2024/09/HeaderImagePipes-250x250.jpg" srcset="https://www.codastory.com/wp-content/uploads/2024/09/HeaderImagePipes-250x250.jpg 250w, https://www.codastory.com/wp-content/uploads/2024/09/HeaderImagePipes-72x72.jpg 72w, https://www.codastory.com/wp-content/uploads/2024/09/HeaderImagePipes-232x232.jpg 232w, https://www.codastory.com/wp-content/uploads/2024/09/HeaderImagePipes-900x900.jpg 900w" width="250" height="250"/></a></div>



<div class="wp-block-group is-vertical is-layout-flex wp-container-core-group-is-layout-8cf370e7 wp-block-group-is-layout-flex">
<h2 class="wp-block-fabrica-article-preview-title is-style-sans has-small-font-size"><a class="wp-block-fabrica-article-preview-title__link" href="https://www.codastory.com/authoritarian-tech/stop-drinking-from-the-toilet/">Stop Drinking from the Toilet!</a></h2>


<div class="wp-block-post-author-name">Judy Estrin</div></div>
</div>
</div>
</div>
<p>The post <a href="https://www.codastory.com/authoritarian-tech/captured-silicon-valley-future-religion-artificial-intelligence/">Captured: how Silicon Valley is building a future we never chose</a> appeared first on <a href="https://www.codastory.com">Coda Story</a>.</p>
]]></content:encoded>
					
		
		<enclosure url="https://www.codastory.com/wp-content/uploads/2025/04/Chris-voicenote-COMPLETE.mp3" length="1450404" type="audio/mpeg" />

		<post-id xmlns="com-wordpress:feed-additions:1">55514</post-id>	</item>
		<item>
		<title>Stop drinking from the toilet!</title>
		<link>https://www.codastory.com/authoritarian-tech/stop-drinking-from-the-toilet/</link>
		
		<dc:creator><![CDATA[Judy Estrin]]></dc:creator>
		<pubDate>Tue, 10 Sep 2024 13:02:17 +0000</pubDate>
				<category><![CDATA[Authoritarian Technology]]></category>
		<category><![CDATA[Algorithms]]></category>
		<category><![CDATA[Perspective]]></category>
		<category><![CDATA[United States]]></category>
		<guid isPermaLink="false">https://www.codastory.com/?p=51640</guid>

					<description><![CDATA[<p>We have systems to filter our water. Now we need systems to filter our tech</p>
<p>The post <a href="https://www.codastory.com/authoritarian-tech/stop-drinking-from-the-toilet/">Stop drinking from the toilet!</a> appeared first on <a href="https://www.codastory.com">Coda Story</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<div class="wp-block-cover alignfull" style="min-height:100vh;aspect-ratio:unset;"><span aria-hidden="true" class="wp-block-cover__background has-background-dim-0 has-background-dim" style="background-color:#414141"></span><img class="wp-block-cover__image-background wp-image-51866" alt="" src="https://www.codastory.com/wp-content/uploads/2024/09/HeaderImagePipes.jpg" style="object-position:47% 34%" data-object-fit="cover" data-object-position="47% 34%"/><div class="wp-block-cover__inner-container is-layout-constrained wp-block-cover-is-layout-constrained"><h1 class="has-text-align-center has-link-color wp-elements-63a3e475a1b19f09717383d71d3b5180 wp-block-post-title has-text-color has-yellow-color">Stop drinking from the toilet!</h1>


<h2 class="wp-block-heading has-text-align-center has-white-color has-text-color has-link-color wp-elements-295ed5b75d225a84f5b411a471668cfc" id="h-the-faucet-isn-t-much-better-digital-sewage-polluting-our-information-pipes-is-making-us-sick"><br>The faucet isn't much better: digital sewage polluting our information pipes is making us sick</h2>
</div></div>



<p><em>Judy Estrin has been thinking about digital connectivity since the early days of Silicon Valley. As a junior researcher at Stanford in the 1970s she worked on what became the Internet. She built tech companies, became Cisco’s Chief Technology Officer, and served on the board of Disney and FedEx. Now, she’s working to build our understanding of the digital systems that run our lives.</em></p>



<p class="has-drop-cap">We can’t live without air. We can’t live without water. And now we can’t live without our phones.&nbsp;Yet our digital information systems are failing us. Promises of unlimited connectivity and access have led to a fractionalization of reality and levels of noise that undermine our social cohesion. Without a common understanding and language about what we are facing, we put at risk our democratic elections, the resolution of conflicts, our health and the health of the planet. In order to&nbsp;move beyond just reacting to the next catastrophe, we can learn something from water. We turn on the tap to drink or wash, rarely considering where the water comes from–until a crisis of scarcity or quality alerts us to a breakdown. As AI further infiltrates our digital world, a crisis in our digital information systems necessitates paying more attention to its flow.</p>





<p>Water is life sustaining, yet too much water, or impure water, makes us sick, destroys our environment, or even kills us. A bit of water pollution may not be harmful but we know that if the toxins exceed a certain level the water is no longer potable. We have learned that water systems need to protect quality at the source, that lead from pipes leaches into the water, and that separation is critical–we don’t use the same pipes for sourcing drinking water and drainage of waste and sewage.</p>



<p>Today, digital services have become the information pipes of our lives. Many of us do not understand or care how they work. Like water, digital information can have varying levels of drinkability and toxicity–yet we don’t know what we are drinking. Current system designs are corroded by the transactional business models of companies that neither have our best interests in mind, nor the tools that can adequately detect impurities and sound the alarm. Digital platforms, such as Instagram, TikTok, or YouTube, don’t differentiate between types of content coming into their systems and they lack the equivalent of effective water filters, purification systems, or valves to stop pollution and flooding. We are both the consumers and the sources of this ‘digital water’ flowing through and shaping our minds and lives. Whether we want to learn, laugh, share, or zone-out, we open our phones and drink from that well. The data we generate fuels increasingly dangerous ad targeting and surveillance of our online movements. Reality, entertainment, satire, facts, opinion, and misinformation all blend together in our feeds.&nbsp;<br></p>



<blockquote class="wp-block-quote is-layout-flow wp-block-quote-is-layout-flow">
<p class="has-yellow-color has-text-color has-link-color wp-elements-c11d8edf4c353ebb36818b6d68b14509">“Technology is neither good nor bad; <em>nor is it neutral</em>."</p>
</blockquote>



<p>Digital platforms mix “digital water” and “sewage” in the same pipes, polluting our information systems and undermining the foundations of our culture, our public health, our economy, and our democracy. We see the news avoidance, extremism, loss of civility, reactionary politics, and conflicts. Less visible are other toxins, including the erosion of trust, critical thinking, and creativity. Those propagating the problems deny responsibility and ignore the punch line of <a href="https://en.wikipedia.org/wiki/Melvin_Kranzberg#:~:text=Kranzberg%20is%20known%20for%20his,its%20journal%20Technology%20and%20Culture.">Kranzberg’s first law</a> which states, “technology is neither good nor bad; <em>nor is it neutral</em>." We need fundamental changes to the design of our information distribution systems so that they can benefit society and not just increase profit to a few at our expense.</p>



<figure class="wp-block-image aligncenter size-full is-resized"><img src="https://www.codastory.com/wp-content/uploads/2024/09/det5.png" alt="" class="wp-image-51915" style="width:563px;height:auto"/></figure>



<p class="has-drop-cap">To start, let us acknowledge the monetary incentives behind the tech industry’s course of action that dragged the public down as they made their fortunes. The foundational Internet infrastructure, developed in the 1970s and 80s, combined public and private players, and different levels of service and sources. Individual data bits traveled in packets down a shared distributed network designed to avoid single points of failure. Necessary separation and differentiation were enforced by the information service applications layered on top of the network. Users proactively navigated the web by following links to new sites and information, choosing for themselves where they sourced their content, be it their favorite newspaper or individual blogs. Content providers relied heavily on links from other sites, creating interdependence that incentivized more respectful norms and behaviors, even when there was an abundance of disagreements and rants.<br></p>



<p>Then the 2000s brought unbridled consolidation as the companies that now make up BigTech focused on maximizing growth through ad-driven marketplaces. As with some privatized water systems, commercial incentives were prioritized above wellness. This was only amplified in the product design around the small screen of mobile phones, social discovery of content, and cloud computing. Today, we drink from a firehose of endless scrolling that has eroded our capacity for any differentiation or discernment. Toxicity is amplified and nuance eliminated by algorithms that curate our timelines based on an obscure blend of likes, shares, and behavioral data. As we access information through a single feed, different sources and types of content–individuals, bots, hyperbolic news headlines, professional journalism, fantasy shows, and <a href="https://www.nytimes.com/2024/03/29/opinion/ai-internet-x-youtube.html?searchResultPosition=1">human or AI generated</a>–all begin to feel the same.</p>



<blockquote class="wp-block-quote is-layout-flow wp-block-quote-is-layout-flow">
<p class="has-yellow-color has-text-color has-link-color wp-elements-ebf88a032e80accf015b51b21e73a67e">Toxicity is amplified and nuance eliminated by algorithms that curate our timelines based on an obscure blend of likes, shares, and behavioral data</p>
</blockquote>



<p>Social media fractured the very idea of truth by taking control of the <em>distribution</em> of information. Now, <a href="https://time.com/6302761/ai-risks-autonomy/">Generative AI</a> has upended the <em>production</em> of content through an opaque mixing of vast sources of public and private, licensed, and pirated data. Once again, an incentive for profit and power is driving product choices towards centralized, resource intensive Large Language Models (LLMs). The LLMs are trained to recognize, interpret, and generate language in obscure ways and then spit out, often awe inspiring, text, images, and videos on demand. The artificial sweetener of artificial intelligence entices us to drink, even as we know that something may be wrong. The social media waters are already muddied by algorithms and agents, as we are now seeing <a href="https://en.wikipedia.org/wiki/Enshittification">“enshittification”</a> (an aptly coined term by Cory Doctorow) of platforms as well as the overall internet, with increasing amounts of <a href="https://www.businessinsider.com/bad-ai-art-is-overrunning-facebook-2024-3">AI generated excrement</a> in our feeds and searches.<br></p>



<p>We require both behavioral change and a new more distributed digital information system–one that combines public and private resources to ensure that neither our basic ‘tap’ water nor our fancy bottled water will poison our children. This will require overcoming two incredibly strong sets of incentives. The first is a business culture that demands dominance through maximizing growth by way of speed and scale. Second is our prioritization of convenience with a boundless desire for a frictionless world. The fact that this is truly a “<a href="https://www.interaction-design.org/literature/topics/wicked-problems">wicked problem</a>” does not relieve us of the responsibility to take steps to improve our condition. We don’t need to let go entirely of either growth or convenience. We do need to recommit to a more balanced set of values.</p>



<figure class="wp-block-image alignright size-full"><img src="https://www.codastory.com/wp-content/uploads/2024/09/det2.png" alt="" class="wp-image-51904"/></figure>



<p>As with other areas of public safety, mitigating today’s harms requires broad and deep education programs to spur individual and collective responsibility. We have thrown out the societal norms that guide us to not spit in the proverbial drink of the other, or piss in the proverbial pool. Instead of continuing to adapt to the lowest common decency, we need <em>digital hygiene</em> to establish collective norms for kids and adults. <em>Digital literacy</em> must encourage critical thinking and navigation of our digital environments with discernment; in other words, with a blend of trust and mistrust. In the analog world, our senses of smell and taste warn us when something is off. We need to establish the ability to detect rotten content and sources–from sophisticated phishing to deep fakes. Already awash in conspiracy theories and propaganda, conversational AI applications bring new avenues for manipulation as well as a novel set of emotional and ethical challenges. As we have learned from food labeling or terms of service, transparency only works when backed by the education to decipher the facts.</p>



<blockquote class="wp-block-quote is-layout-flow wp-block-quote-is-layout-flow">
<p class="has-yellow-color has-text-color has-link-color wp-elements-7ceaf426efb296e4b682a0778b21590d">The artificial sweetener of artificial intelligence entices us to drink, even as we know that something may be wrong.</p>
</blockquote>



<p>Mitigation is not sufficient. We need entrepreneurs, innovators, and funders who are willing to rethink systems and interface design assumptions and build products that are more <em>proactive</em>, <em>distributed</em>, and <em>reinforcing of human agency</em>. Proactive design must incorporate safety valves or upfront filters. Distributed design approaches can use less data and special purpose models, and the interconnection of diverse systems can provide more resilience than consolidated homogeneous ones. We need not accept the inevitability of general purpose brute force data beasts. Human agency designs would break with current design norms.&nbsp; The default to everything looking the same leads to homogeneity and flattening. Our cars would be safer if they didn’t distract us like smart phones on wheels. The awe of discovery is healthier than the numbing of infinite scrolls. Questioning design and business model assumptions requires us to break out of our current culture of innovation, which is too focused on short term transactions and rapid scaling. The changes in innovation culture have influenced other industries and institutions, including journalism, which is too often hijacked by today's commercial incentives. We cannot give up on a common understanding and knowledge, or on the importance of trust and common truths.&nbsp;&nbsp;&nbsp;</p>



<figure class="wp-block-image alignleft size-full is-resized"><img src="https://www.codastory.com/wp-content/uploads/2024/09/det6.png" alt="" class="wp-image-51928" style="width:505px;height:auto"/></figure>



<p>We need policy changes to balance private and public sector participation. Many of the proposals on the table today lock in the worst of the problems, with legislation that reinforces inherently bad designs, removes liability, and/or targets specific implementations (redirecting us to equally toxic alternatives). Independent funding for education, innovation, and research is required to break the narrative and value capture of the BigTech ecosystem. We throw around words like safe, reliable, or responsible without a common understanding of what it means to really be safe. How can we ensure our <em>water is safe to drink</em>? Regulation is best targeted at areas where leakage leads to the most immediate harm–like algorithmic amplification and a lack of transparency and accountability. Consolidation into single points of power inevitably leads to broad-based failure. A small number of corporations have assumed the authority of massive utilities that act as both public squares and information highways–without any of the responsibility.</p>



<p>Isolation and polarization have evolved from a quest for a frictionless society with extraordinary systems handcrafted to exploit our attention. It is imperative that we create separation, valves, and safeguards in the distribution and access of digital information. I am calling not for a return to incumbent gatekeepers, but instead for the creation of new distribution, curation, and facilitation mechanisms that can be scaled for the diversity of human need. There is no single answer, but the first step is to truly acknowledge the <a href="https://washingtonmonthly.com/2019/01/13/the-world-is-choking-on-digital-pollution/">scope and scale of the problem</a>. The level of toxicity in our ‘digital waters’ is now too high to address reactively by trying to fix things after the fact, or lashing out in the wrong way. We must question our assumptions and embrace fundamental changes in both our technology and culture in order to bring toxicity back to a level that does not continue to undermine our society.</p>

<div class="wp-block-group alignleft is-style-meta-info is-layout-constrained wp-block-group-is-layout-constrained">
<h3 class="wp-block-heading" id="h-why-this-story"><em>Why This Story?</em></h3>



<p><em>We are fully immersed in the digital world, but most of us have very little idea what we’re consuming, where it’s coming from, and what harm it may be doing. In part, that’s because we love the convenience that tech brings and we don’t want to enquire further. It’s also because the companies that provide this tech, by and large, prioritize commercial incentives over wellness.</em></p>
</div>
<p>The post <a href="https://www.codastory.com/authoritarian-tech/stop-drinking-from-the-toilet/">Stop drinking from the toilet!</a> appeared first on <a href="https://www.codastory.com">Coda Story</a>.</p>
]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">51640</post-id>	</item>
		<item>
		<title>The AI apocalypse might begin with a cost-cutting healthcare algorithm</title>
		<link>https://www.codastory.com/newsletters-category/cigna-ai-healthcare-algorithm/</link>
		
		<dc:creator><![CDATA[Ellery Roberts Biddle]]></dc:creator>
		<pubDate>Thu, 27 Jul 2023 15:45:52 +0000</pubDate>
				<category><![CDATA[Authoritarian Tech newsletter]]></category>
		<category><![CDATA[Newsletters]]></category>
		<category><![CDATA[Algorithms]]></category>
		<category><![CDATA[Artificial intelligence]]></category>
		<category><![CDATA[Newsletter]]></category>
		<guid isPermaLink="false">https://www.codastory.com/?p=45546</guid>

					<description><![CDATA[<p>Authoritarian Tech is a weekly newsletter tracking how people in power are abusing technology and what it means for the rest of us. </p>
<p>Also in this edition: Google and Meta face new lawsuits over violent content, and Saudi Arabia is playing dirty on Snapchat.</p>
<p>The post <a href="https://www.codastory.com/newsletters-category/cigna-ai-healthcare-algorithm/">The AI apocalypse might begin with a cost-cutting healthcare algorithm</a> appeared first on <a href="https://www.codastory.com">Coda Story</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>On Monday, patients in California <a href="https://www.axios.com/2023/07/25/ai-lawsuits-health-cigna-algorithm-payment-denial">filed</a> a class action lawsuit against Cigna Healthcare, one of the largest health insurance providers in the U.S., for wrongfully denying their claims — and using an algorithm to do it. The algorithm, called PXDX, was automatically denying patients’ claims at an astonishing rate — the technology would spend an estimated 1.2 seconds “reviewing” each claim. During a two-month period in 2022, Cigna denied 300,000 pre-approved claims using this system. Of claim denials that were appealed by Cigna customers, roughly 80% were later overturned.</p>



<p>This is bad for people, but it could also sound wonky, banal or even “small bore” to tech experts. Yet it is precisely the kind of existential threat that we should worry about when we look at the consequences of bringing artificial intelligence into our lives.</p>



<p>You might remember this spring, when the biggest and wealthiest names in the tech world gave us some pretty grave warnings about the future of AI. After a flurry of opinion pieces and full-length speeches, they found a way to boil it all down to a simple “should” <a href="https://www.safe.ai/statement-on-ai-risk#open-letter">statement</a>:&nbsp;</p>



<p>“Mitigating the risk of extinction from AI should be a global priority alongside other societal-scale risks such as pandemics and nuclear war.”</p>



<p>This sentence and its most prominent signatories (Sam Altman, Bill Gates and Geoffrey Hinton among them) swiftly captured the headlines and our social media feeds. But have no fear, the statement’s authors <a href="https://www.safe.ai/press-release">said</a>. We will work with governments to ensure that AI regulations can prevent all this from happening. We will protect you from the worst possible consequences of the technology that we are building and profiting from. Oh really?</p>



<p>OpenAI CEO Sam Altman then <a href="https://foreignpolicy.com/2023/06/20/openai-ceo-diplomacy-artificial-intelligence/">jetted off</a> on a global charm tour, on which he seems to have won the trust of heads of state and regulators from Japan to the UAE to Europe. A week after he visited the EU, the highly anticipated AI Act was <a href="https://time.com/6288245/openai-eu-lobbying-ai-act/">watered down</a> to suit his company’s best interests. Mission accomplished.</p>



<p>Before the tech bros began this particular round of spreading doom and gloom about blockbuster-worthy, humanity-destroying AI, journalists at ProPublica had published an <a href="https://www.propublica.org/article/cigna-pxdx-medical-health-insurance-rejection-claims">investigation</a> into a much more clear and present threat: Cigna’s PXDX algorithm, the very subject of the aforementioned lawsuit.&nbsp;</p>



<p>In its official response to ProPublica’s findings, Cigna had noted that the algorithm’s reviews of patients’ claims “occur after the service has been provided to the patient and do not result in any denials of care.”&nbsp;</p>



<p>But hang on a second. This is the U.S., where medical bills can bankrupt people and leave them terrified of seeking out care, even when they desperately need it. I hear about this all the time from my husband, who is a physician and routinely treats incredibly sick patients whose conditions have gone untreated for years, even decades, often due to their being uninsured or underinsured.&nbsp;</p>



<p>This is not the robot apocalypse or nuclear annihilation that the Big Tech bros are pontificating about. This is a slow-moving-but-very-real public health disaster that algorithms are already inflicting on humanity.&nbsp;</p>



<p>Flashy tools like ChatGPT and LensaAI may get the lion’s share of headlines, but there is big money to be made from much less interesting stuff that serves the banal needs of companies of all kinds. If you <a href="https://venturebeat.com/ai/chatgpt-and-llm-based-chatbots-set-to-improve-customer-experience/">read</a> about what tech investors are focused on right now, you will quickly discover that the use of AI in areas like customer service is expected to become a huge moneymaker in the years to come. Again, forget the forecasted human extinction by robots that take over the world. Tech tools that help “streamline” processes for big companies and state agencies are the banal sort of evil that we’re actually up against.</p>



<p>Part of the illusion that seems to drive statements that prophesy human extinction is that technology will start acting alone. But right now, and for the foreseeable future, technology is the result of a multitude of choices made by real people. Right now, tech does not act alone.</p>



<p>I don’t know where we’d be without this kind of journalism or the AI researchers who have been studying these issues for years now. I’ve plugged them before, and now I’ll do it again — if you’re looking for experts on this stuff, start with <a href="https://www.freepress.net/sites/default/files/2023-05/global_coalition_open_letter_to_news_media_and_policymakers.pdf">this list</a>.</p>



<p>And now I’ll plug a new story of ours. Today, we’re publishing a deep dive that shows how a technical tool, even when it’s built by people with really good intentions, can contribute to bad outcomes. Caitlin Thompson has spent months getting to know current and former staff at New Mexico’s child welfare agency and speaking with them about a tool that the agency has been using since 2020. The tool’s intention? To help caseworkers streamline decisions about whether a child should be removed from their home, in cases where allegations of abuse or neglect have arisen. This is a far cry from the ProPublica story, in which Cigna seems to have quite deliberately chosen to deny people’s claims in order to cut costs. This is a story about a state agency trying to improve outcomes for kids while grappling with chronic staffing shortages, and it shows how the adoption of one tool — well-intentioned though it was — has tipped the scales in some cases, with grave effects for the kids involved. <a href="https://www.codastory.com/authoritarian-tech/new-mexico-child-welfare/">Give it a read</a> and let us know what you think.</p>



<h2 class="wp-block-heading" id="h-global-news"><strong>GLOBAL NEWS</strong></h2>



<p><strong>Google and Meta are facing new legal challenges over violent speech on their platforms.</strong> The families of nine Black people who were killed in a supermarket in Buffalo, New York in 2022 have <a href="https://www.nytimes.com/2023/07/23/nyregion/google-meta-buffalo-shooting.html">filed suit</a> against the two companies, arguing that their technologies helped shape the ideas and actions of Payton Gendron, the self-described white supremacist who murdered their loved ones. The U.S. Supreme Court has already heard and <a href="https://www.nytimes.com/2023/05/18/us/politics/supreme-court-google-twitter-230.html">decided</a> to punt on two cases with very similar characteristics, reasoning that the companies are shielded from liability for speech posted by their users under Section 230 of the Communications Decency Act. So the new filings may not have legs. But they do reflect an increasingly widespread feeling that these platforms are changing the way people think and act and that, sometimes, this can be deadly.</p>



<p><strong>The Saudi regime is using Snapchat to promote its political agenda — and to intimidate its critics.</strong> This should come as no surprise: An estimated 90% of Saudis in their teens and 20s use the app, so it has become a central platform for Saudi Crown Prince Mohammed “MBS” bin Salman to burnish his image and talk up his economic initiatives. But people who have criticized the regime on Snapchat are paying a high price. Earlier this month, the Guardian <a href="https://www.theguardian.com/technology/2023/jul/18/snapchat-saudi-arabia-ties">reported</a> allegations that the influencer Mansour al-Raqiba was sentenced to 27 years in prison after he criticized MBS’ “Vision 2030” economic plan. Snapchat didn’t offer much in the way of a response, but Gulf-based media have <a href="https://www.arabnews.com/node/2289516/business-economy">reported​</a> on the company’s “special collaboration” with the Saudi culture ministry. It’s also worth noting that Saudi Prince Al Waleed bin Talal — who is Twitter, er, X’s, biggest shareholder after Elon Musk — is a major investor in the company.</p>



<h2 class="wp-block-heading"><strong>WHAT WE’RE READING</strong></h2>



<ul class="wp-block-list">
<li>Writing for WIRED, Hossein Derakshan, the blogger who was famously imprisoned in Iran from 2009 until 2015, <a href="https://www.wired.com/story/information-truth-personalization/">reflects</a> on his time in solitary confinement and what it taught him about the effects of technology on humanity.</li>
</ul>



<ul class="wp-block-list">
<li>Justin Hendrix of Tech Policy Press has <a href="https://techpolicy.press/rescuing-the-future-from-silicon-valley/">written</a> a new essay on the “cage match” between Elon Musk and Mark Zuckerberg, the “age of Silicon Valley bullshit” and the overall grim future of Big Tech in the U.S. Read both pieces, and then take a walk outside.</li>
</ul>
<p>The post <a href="https://www.codastory.com/newsletters-category/cigna-ai-healthcare-algorithm/">The AI apocalypse might begin with a cost-cutting healthcare algorithm</a> appeared first on <a href="https://www.codastory.com">Coda Story</a>.</p>
]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">45546</post-id>	</item>
		<item>
		<title>Inside New Mexico’s struggle to protect kids from abuse</title>
		<link>https://www.codastory.com/authoritarian-tech/new-mexico-child-welfare/</link>
		
		<dc:creator><![CDATA[Caitlin Thompson]]></dc:creator>
		<pubDate>Thu, 27 Jul 2023 14:44:18 +0000</pubDate>
				<category><![CDATA[Authoritarian Technology]]></category>
		<category><![CDATA[Algorithms]]></category>
		<category><![CDATA[Feature]]></category>
		<category><![CDATA[United States]]></category>
		<guid isPermaLink="false">https://www.codastory.com/?p=44250</guid>

					<description><![CDATA[<p>A safety scoring tool was supposed to improve child welfare. But former caseworkers say it’s not helping</p>
<p>The post <a href="https://www.codastory.com/authoritarian-tech/new-mexico-child-welfare/">Inside New Mexico’s struggle to protect kids from abuse</a> appeared first on <a href="https://www.codastory.com">Coda Story</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Ivy Woodward can turn her emotions off like a water faucet.&nbsp;</p>



<p>It served her well when she worked in child protective services in Hobbs, a small oil town in southeastern New Mexico.</p>



<p>She looks at it this way: “If you give in to emotion, the job’s not going to get done. You don’t process emotion. You walk in on a scene, and the first thing you see isn’t a tragedy. The first thing you see is a checklist of things you need to do to resolve the issue.”</p>



<p>But when Woodward looks back on all the horrible things that she witnessed as a caseworker, the weight of the decisions she had to make is almost too much to bear.</p>



<p>“Each decision that you make changes your life. Every single, solitary decision that I made, I still carry it,” Woodward said when we met in the spring.</p>



<p>Woodward used to work for the state of New Mexico’s Children, Youth, and Families Department, supporting children who had been the victims of abuse or neglect. She was part of the CYFD staff teams that deliberated on whether to take kids away from their parents and put them into foster care.</p>



<p>Woodward is still haunted by the memories of one little girl in particular. Woodward had reason to believe that the girl, who was living in foster care with her grandfather, was being abused.</p>



<p>But something stood in the way: It was a safety assessment tool that the state requires caseworkers like Woodward to use. Formally known by its somewhat clunky brand name — “Structured Decision Making” — the tool is meant to help determine whether a child is in great enough danger to be removed from their home.&nbsp;</p>



<p>Her concern was based on more than just a hunch. The girl’s mother had told Woodward that the grandfather was an abuser – he had raped her when she was young. Woodward took this information to her team and called for another office to send a caseworker to investigate. But that caseworker’s report, based on the tool, indicated that there was no reason for concern about the girl’s safety. Despite Woodward’s pleas, CYFD staff decided to keep the girl with her grandfather.</p>



<p>It became clear months later that Woodward was right — the little girl’s grandfather had been sexually abusing her all along. The girl was eventually taken away from her grandfather and placed in a different foster home.</p>



<p>The agency is adamant that the tool isn’t meant to supersede a caseworker’s judgment. “It's not about giving the job of a caseworker to an electronic tool,” said Sarah Meadows, the head of the agency’s research, assessment and data bureau. “That’s 100% not what it’s intended to do.”</p>



<p>But in cases like this one, it felt to Woodward as if the tool had won out.</p>



<p>“You can no longer go on all of your training, all of your experience in the field. It’s a moot point because the tool said so,” Woodward told me.</p>



<blockquote class="wp-block-quote is-layout-flow wp-block-quote-is-layout-flow">
<p>“You're going off of a scoring system now. And if the family doesn’t meet the score, you have to turn around and walk out.”</p>
</blockquote>



<p>Across the U.S., child welfare agencies are looking to algorithms and risk assessment tools to help support the arduous labor of caseworkers in child protective services agencies. The hope is that these tools will help caseworkers make better and more equitable decisions that will ultimately improve outcomes for vulnerable children. But these agencies’ problems run deep. Oftentimes, there is no single tool or policy solution that can fix them.</p>



<p>Facing high rates of child abuse and neglect, the New Mexico Children, Youth, and Families Department rolled out the Structured Decision Making safety scoring tool in 2020. The goal was to help the agency decide whether or not children are safe living with their parents and to assess the risk of future abuse if a child remains in their home. But in the face of severe staffing shortages and a push to remove kids from their families in only the most extreme cases, former CYFD staff and children’s attorneys in New Mexico say that the safety scoring tool has been replacing caseworker judgment and leaving some kids in harm’s way.</p>



<p class="has-drop-cap">New Mexico <a href="https://www.acf.hhs.gov/cb/report/child-maltreatment-2021">had</a> the 15th highest rate of child abuse or neglect in the 2021 fiscal year, a drop from the <a href="https://www.nmlegis.gov/Entity/LFC/Documents/FINAL%20Evidence-Based%20Options%20To%20Address%20Child%20Maltreatment.pdf">8th highest</a> in 2020. About a third of children who died from abuse, neglect or homicide between 2015 and 2021 <a href="https://www.nmhealth.org/publication/view/report/8272/">had</a> prior involvement with child welfare, according to the New Mexico Department of Health.</p>



<p>One of them was named James Dunklee Cruz. There were countless warning signs that the little boy was at risk of harm. When he was just a few months old, caseworkers found ample evidence of neglect: The home where he lived with his mother was roach-infested and strewn with trash and dog feces.&nbsp;</p>



<p>In October 2019, when Dunklee Cruz was four, he was brought to the hospital with multiple injuries, including a black eye, a bruised penis and an injured shoulder. He told a CYFD investigator that he had been abused by three people in his life. But somehow those allegations were declared unsubstantiated. The Structured Decision Making tool classified Dunklee Cruz as “safe with a plan,” allowing him to stay with his mother.</p>



<p>Two months later, James Dunklee Cruz died as a result of blunt force trauma to his head and torso at the hands of Zerrick Marquez, one of the men he had named as his abuser two months before. Marquez <a href="https://www.kob.com/new-mexico/albuquerque-man-sentenced-to-30-years-in-prison-for-killing-of-4-year-old/">pleaded</a> guilty to killing Dunklee Cruz and was sentenced to 30 years in prison.</p>



<p>CYFD conducted nine investigations into allegations of abuse and neglect during the boy’s short life. Caseworkers put what they call “safety plans” in place for Dunklee Cruz, but this wasn’t enough to keep him safe. These details appear in a publicly posted child fatality review summary <a href="https://www.documentcloud.org/documents/21575265-dunklee-fatality-summary-report">report</a>. The section of the document drawing on the child’s autopsy also describes a litany of injuries, including “healing jaw fractures and healing subdural hemorrhage indicating significant blunt head trauma that occurred earlier than the acute injuries” — in other words, injuries that didn’t kill him but proved that Dunklee Cruz was at risk of serious harm before his final days.</p>



<p>A wrongful death lawsuit is also working its way through the courts. A <a href="https://www.codastory.com/wp-content/uploads/2023/07/Complaint_Dunklee_Cruz_12-06-22.pdf">complaint</a> filed in December 2022 in a federal district court in New Mexico accuses CYFD of failing in their duty to protect the boy and states that Dunklee Cruz’s mother repeatedly violated the safety plans CYFD put in place. The complaint also specifically points to the Structured Decision Making tool.&nbsp;</p>



<p>It reads: “Over the span of his four years of life, CYFD investigators repeatedly failed to rely on accurate and well-documented facts when it utilized the agency’s Safety Risk Assessment Tool, causing its repeated contacts with James to result in flawed and underestimated risk assessment and flawed decision-making resulting in James’ death.”</p>



<p>When I asked about the boy’s case, CYFD offered this response: “The death of James Dunklee Cruz is tragic. The loss of any child is felt deeply and grieved by our caseworkers and staff. Regarding the function of the tool in this case, he was identified as safe with a plan. This means that the safety assessment tool identified at least one danger indicator and that in order for the child to stay in the custody of the parent, a plan was required. Our caseworkers worked with James’ mother to find a safe place for her to live and alternative childcare for James to mitigate against the threats that were identified by the caseworker.”</p>



<figure class="wp-block-image alignwide size-large"><img src="https://www.codastory.com/wp-content/uploads/2023/07/NewMexicoChildServices1-1800x1200.jpg" alt="" class="wp-image-45479"/><figcaption class="wp-element-caption">The Juvenile Justice Center which houses the Bernalillo County Youth Services Center Children’s Court in Albuquerque, New Mexico.</figcaption></figure>



<p class="has-drop-cap">What happened to James Dunklee Cruz reflects the most significant problem that former CYFD workers raised when they talked to me about the Structured Decision Making safety tool: It doesn’t always convey how much danger kids are truly facing.</p>



<p>The tool’s launch coincided with a change in the agency’s approach to decision-making about when to remove a child from their family’s home. This was a part of a nationwide shift with the passage of the federal Family First Preservation Services Act, a policy that was <a href="https://www.childwelfare.gov/topics/systemwide/laws-policies/federal/family-first/">designed</a> to keep children “safely with their families to avoid the trauma that results when children are placed in out-of-home care.”</p>



<p>The safety tool doesn’t tell caseworkers what to do. It is meant to facilitate a conversation between the worker and their supervisor about whether to declare the child “safe,” “safe with plan” or “unsafe.” The tool sets the tone for what, ideally, should be an extensive, in-depth dialogue between people from across the agency. But due to staffing shortages, it doesn’t always play out this way. The tool “doesn’t take into account that there's not enough workers, there's not enough supervisors,” said Matt Esquibel, a regional manager at CYFD.</p>





<p>Some former caseworkers have told me that, in this context, the assessment takes on an outsized role in determining a child’s fate.</p>



<p>“It's not meant to streamline or fast-track decisions, but it helps focus the conversations, which is helpful to supervisors and to workers,” said Meadows, the head of the agency’s research, assessment and data bureau.</p>



<p>Former CYFD workers told me that the risk and safety assessments did not always match what they observed about the level of danger a child was facing, particularly when it came to substance abuse, domestic violence or repeated involvement with child protective services.</p>



<p>“We saw issues with the safety tool immediately,” said one former CYFD worker who had knowledge of the tool and reviewed investigations in which it was used. She requested anonymity out of fear of retaliation.&nbsp;</p>



<p>The former CYFD worker said she would see cases in which she thought a child should have been removed from the home but the safety tool didn’t reflect that.</p>



<p>“I’m reading a report that comes and I’m reading their notes that they’ve entered. And then I’m looking at their safety assessment, [and it] does not match what I’m reading,” she said.</p>



<p>Workers are only allowed to check off a danger on the safety tool if they can observe or otherwise prove it. But investigators don’t always have time to do multiple home visits or to gather more information, said Esquibel. They may not be able to gather all the details right away, and children may not initially disclose abuse. There is an “override” for the risk assessment that requires supervisor approval. If the worker thinks that the risk score is too low, they can bump the score up one level. CYFD’s Meadows emphasized that workers should use their judgment and critical thinking, work with supervisors and override the tool if necessary.</p>



<p>“I think the workers and supervisors do the best that they can when they’re out there,” Esquibel said. “But your assessment is only as good as what information you're gathering or who’s available at the time.”&nbsp;</p>



<p>Ultimately, the former CYFD staff member who requested anonymity thinks the assessments are not capturing the seriousness of some cases and that the consequences for kids are real.</p>



<p>“I think it’s leading to dangerous situations for children,” she said. “I think the agency is leaving the children in situations based on that tool when they should be removing them.”</p>



<p>Meadows said that shouldn’t happen. “If a worker feels strongly that a child is unsafe and they don’t want to walk away from that child in that home, they shouldn't. Safety tool be damned,” she said.</p>



<figure class="wp-block-image alignwide size-large"><img src="https://www.codastory.com/wp-content/uploads/2023/06/Coda_CYFD_23-1800x1200.jpg" alt="" class="wp-image-45047"/><figcaption class="wp-element-caption">Ivy Woodward at her home in West Texas.</figcaption></figure>



<p class="has-drop-cap">Even though she’s moved on from CYFD, this all still weighs heavily on Ivy Woodward, who has worked with children for most of her career. Before working in child welfare, Woodward, who is Native American and Hispanic, taught elementary school on the Apache reservation in San Carlos, Arizona and in southwestern New Mexico. In 2017, CYFD brought her on as a permanency planning supervisor. This meant she worked on cases where the agency had credible evidence that a child was being abused or neglected at home. The work spoke to core elements of Woodward’s personality.&nbsp;</p>



<p>“I’m a protector,” Woodward told me. “You can do a lot to me and get away with it. But if I see somebody doing something to someone else, that triggers my inner anger.”&nbsp;</p>



<p>She calls it like she sees it and pushes back when she disagrees. “I don’t know what is broken in my head, but I question everything,” Woodward said.</p>



<p>When Woodward left CYFD in the summer of 2020, she and a colleague filed a lawsuit alleging that they faced retaliation after raising concerns about a case in which a child was severely injured after she and her siblings were returned to their parents. The agency <a href="https://www.krqe.com/news/new-mexico/former-new-mexico-cyfd-case-workers-getting-big-settlement-in-whistleblower-lawsuit/">settled</a> the lawsuit for more than $300,000 without <a href="https://www.abqjournal.com/2516049/some-claims-settled-in-childabuse-case-ex-nm-agrees-to-pay-90k-to.html">acknowledging</a> liability.</p>



<p>Woodward has a fast, forceful way of speaking, a reflection of her often overly caffeinated state. But when she talks about the kids she worked with at CYFD and the horrible things she heard or saw on the job, her voice gets a little higher. Her emotions begin to flow.</p>



<p>“You do have to be able to turn off the emotions and make those cold, hard decisions when the time comes to make them,” she said. “But until that time comes, you have to see people, not casework.”</p>



<p>Woodward now lives across the border in a tiny county in West Texas with her husband and two daughters. She works as the chief of juvenile probation and coordinates emergency management for the county.</p>



<p>In some ways, Woodward was an outlier among other CYFD staff. Many start working with the agency soon after college and have little or no experience in child welfare. The agency <a href="https://www.togetherwethrivenm.org/dashboard/">struggles</a> to stay fully staffed — this spring, nearly a quarter of positions were unfilled. A July 2022 <a href="https://www.codastory.com/wp-content/uploads/2023/07/New_Mexico_CYFD_CS_Review_07-21-22.pdf">review</a> by an outside consultancy found that CYFD employees felt overwhelmed by the work they were being asked to do. Staff said they would rush from one emergency to the next and had little ability to make progress on other cases.</p>



<p>This is not unique to New Mexico. A caseworker I spoke with in Indiana described feeling stretched so thin that he would race from one emergency to the next without ever having time to put out the fires, just identifying each one and moving on to the next.</p>



<p>The issues of understaffing and high turnover rates were top of mind for many CYFD workers. High turnover isn’t just bad for morale. It directly affects the ability of those who remain on staff to do their work. When one person leaves, those who stay have to absorb their caseload. It is daunting. The review described a “culture of fear” in which staff were afraid that if something bad were to happen with a case, they would be punished or “scapegoated.”</p>



<p>And the forced intimacy of the work can be grating and even traumatic. Caseworkers must regularly intervene in painful moments of struggle and conflict within families, and they are sometimes met with resistance. As agents of the state, they are caught between a bureaucracy that requires them to treat each situation as consistently and objectively as possible and real life-and-death conflicts in which people’s actions are largely driven by emotion.</p>



<figure class="wp-block-image alignwide size-large"><img src="https://www.codastory.com/wp-content/uploads/2023/06/Coda_CYFD_42-1800x1200.jpg" alt="" class="wp-image-45051"/><figcaption class="wp-element-caption">The Children, Youth, and Families Department offices in Albuquerque, New Mexico.</figcaption></figure>



<p class="has-drop-cap">For strained child welfare agencies, algorithms and risk assessment tools are an attractive solution to the vexing challenge of maintaining consistent decision-making practices.</p>



<p>Some states have experimented with predictive analytics, with limited success. Illinois <a href="https://gizmodo.com/illinois-scraps-child-abuse-prediction-software-for-not-1821080730">used</a> an algorithm to estimate the likelihood that a child would die or be seriously injured as a result of abuse or neglect. Social workers were flooded with cases erroneously determined to be urgent, while children that the algorithm deemed low-risk were dying. The state soon <a href="https://gizmodo.com/illinois-scraps-child-abuse-prediction-software-for-not-1821080730">stopped</a> using the tool after the Illinois Department of Children and Family Services declared it ineffective.&nbsp;</p>



<p>A child welfare algorithm in Allegheny County, Pennsylvania, is currently <a href="https://apnews.com/article/justice-scrutinizes-pittsburgh-child-welfare-ai-tool-4f61f45bfc3245fd2556e886c2da988b">facing</a> scrutiny from the U.S. Department of Justice. Using arrest records, Medicaid data and documented struggles with substance abuse, the algorithm <a href="https://apnews.com/article/child-welfare-algorithm-investigation-9497ee937e0053ad4144a86c68241ef1">generates</a> a score from 1 to 20 that helps determine whether to open a neglect investigation. Reporting by the AP <a href="https://apnews.com/article/child-welfare-algorithm-investigation-9497ee937e0053ad4144a86c68241ef1">found</a> that the algorithm disproportionately flagged Black children for neglect investigations. There was also <a href="https://apnews.com/article/child-protective-services-algorithms-artificial-intelligence-disability-02469a9ad3ed3e9a31ddae68838bc76e">evidence</a> that the algorithm did the same for parents with disabilities.</p>



<p>Torn between pushback against opaque algorithms and the desire to use technology to streamline decision-making, some states are turning to scoring tools that are less opaque and less automated. New Mexico’s Structured Decision Making tool, created by the nonprofit Evident Change, is one of them. Oregon, New Hampshire and California also use assessment tools built by Evident Change.</p>



<p>Structured Decision Making offers a checklist that is meant to help the investigator understand “the risk of imminent and serious harm,” according to a CYFD <a href="https://s3.documentcloud.org/documents/21039319/cyfd-2019-2021-progressreport-final.pdf">progress and impact report</a>. Children are ranked “safe,” “safe with plan,” which involves in-home services, or “unsafe,” which is grounds for removal. There’s also an actuarial risk scoring tool, which is meant to assess “the likelihood of any future maltreatment” and additional CYFD involvement in the next 18 to 24 months, if the child remains with the family.</p>



<p>The safety scoring tool asks about abuse or neglect, including physical or sexual abuse, failure to meet the child’s basic needs, unsafe living conditions, emotional harm or unexplained injuries. Both assessments are intended to guide caseworkers to think about risk factors, vulnerabilities and the impact on the child.</p>



<p>“What Structured Decision Making tries to do is to help workers and supervisors make accurate, consistent and equitable decisions at these high-stakes moments,” said Phil Decter, the director of child welfare at Evident Change.</p>



<p>Structured Decision Making is also “intended to reduce bias, whether that’s related to race, ethnicity, socioeconomic status, making sure that we’re not conflating poverty with neglect,” said CYFD’s Meadows.</p>



<p>But in New Mexico, as the Dunklee Cruz case and insights from caseworkers make clear, the tool does not always work as intended. And the tool can’t solve some of CYFD’s biggest problems. The agency doesn’t have the workers to meet the needs of the population. Emblematic of a national trend, CYFD is chronically understaffed. Workers juggle heavy caseloads and often have precious little time to dedicate to each child’s case.</p>



<p>The safety tool isn’t meant to fix that. CYFD says hiring is a priority. “Structured Decision Making is not intended to replace human beings in terms of lightening their caseload,” Meadows said. “The role of it is to create consistency, making sure that we're looking at every angle of the case, every potential impact to a child.”</p>



<p>But for caseworkers racing from one emergency to the next, the tool begins to play a different role. It sometimes becomes a shortcut, they told me — a stand-in for real human decision-making, in a system already weighed down by the rigid requirements of the state.</p>



<figure class="wp-block-image alignwide size-large"><img src="https://www.codastory.com/wp-content/uploads/2023/06/Coda_CYFD_29-1800x1200.jpg" alt="" class="wp-image-45049"/><figcaption class="wp-element-caption">Reed Ridens at his home in New Mexico.</figcaption></figure>



<p class="has-drop-cap">Reed Ridens remembers everything about the day the state took him away from his father almost seven years ago. It was a typical January afternoon at school. About an hour before classes ended, Ridens, who was 15 at the time, was pulled out of orchestra practice and brought to a conference room. Waiting for him were two of his teachers, the school social worker, representatives from CYFD, a police officer and his dad.&nbsp;</p>



<p>“I’m just looking around like, what is going on?” Ridens recalled.&nbsp;</p>



<p>For nearly an hour, the adults in the room went back and forth about whether Ridens’ dad could take care of him. There were concerns, they said, about neglect and his father’s alcoholism.&nbsp;</p>



<p>“The entire time, I was just sitting there, crying, like, ‘Hey, please don’t take me out of my home,’” Ridens said.&nbsp;&nbsp;</p>



<p>His protests were futile. Ridens stayed in the foster care system until he was 18, moving between 15 different placements. It left him with a deep-seated trauma, compounded by his father’s death four years ago.</p>



<p>“I felt like the state was taking me out of my household and then not doing any better for me than my father did. And in fact, actually putting me in worse-off situations,” he said.&nbsp;</p>



<p>“I don’t really feel like they saw me as a person,” he told me when we met in Albuquerque.</p>



<p>“I feel like they didn’t see me as more than a list of checkmarks. I feel like they didn’t see my dad as anything more than a monster.”</p>



<p>Today, kids in a position like Ridens’ are not only dealing with adults trying to decide what’s best for them. Their fate is also shaped by tools like Structured Decision Making.</p>



<figure class="wp-block-image size-large"><img src="https://www.codastory.com/wp-content/uploads/2023/06/Coda_CYFD_27-900x1200.jpg" alt="" class="wp-image-45048"/><figcaption class="wp-element-caption">Ridens stayed in the foster care system until he was 18.</figcaption></figure>



<p class="has-drop-cap">How did New Mexico get here? In part, the objective was to prevent the wrong kids from entering foster care, said Beth Gillia, the former deputy secretary of CYFD.&nbsp;</p>



<p>“Foster care really should be the absolute last resort in extreme circumstances where needs cannot be met in the home and where a child cannot be safe at home,” she told me.</p>



<p>The state paid the nonprofit organization Evident Change $1.3 million to develop a risk and safety assessment tool, according to a state legislative finance committee <a href="https://www.nmlegis.gov/Handouts/ALFC%20072121%20Item%203%20CYFD%20PS%20Brief%20July%202021.pdf">report</a>. The nonprofit creates similar tools for criminal justice, education and adult protective services.&nbsp;</p>



<p>After a pilot in some counties in 2019, including the county where James Dunklee Cruz lived, Structured Decision Making was rolled out statewide in January 2020.</p>



<p>The tool works best in situations where there is plenty of time and staff capacity to dedicate to this kind of deliberation. But CYFD’s investigations unit was short almost 25% of its workforce as of May 2023, according to the state’s public statistics <a href="https://www.togetherwethrivenm.org/dashboard/">dashboard</a>, and maintains a steep turnover rate.</p>



<p>“If a child welfare organization is not being resourced well, if it’s understaffed or if caseloads are high, it’s going to be hard for optimal work to happen in any situation,” said Decter, who previously worked in child welfare in Massachusetts. “Good decision-making takes time.”</p>



<p>A report presented to a CYFD steering committee <a href="https://www.cyfd.nm.gov/wp-content/uploads/2022/12/Themes_from_Steering_Cmte_Focus_Groups.pdf">found</a> that, according to focus groups of CYFD workers, Structured Decision Making is “not being used as it was designed to be utilized. They go out and do their investigation and then come back in and click whatever needs to be clicked to show it has been done.”&nbsp;</p>



<p>A former investigator in Hobbs told me that the Structured Decision Making tool just added more work to her plate.&nbsp;</p>



<p>“It didn’t take a whole lot of time, but it was just another tedious step that you’re going through when you’ve already made up your mind,” she said.</p>



<p>As a result, she said, some people rushed through checking boxes on the safety tool.&nbsp;</p>



<p>“I watched people go click, click, click, click, click, and just move on,” she said. It wasn’t the deciding factor. But she did feel like it could be “manipulated” to justify a certain decision.</p>



<p>CYFD says this isn’t how it’s supposed to be used. “Safety assessment is not a quick activity,” said Meadows. “Workers should take their time with it, really do their best to engage the family to get as much information as possible so that the safety assessment is accurate.”</p>



<p class="has-drop-cap">Ivy Woodward, the former supervisor in Hobbs, had concerns about the safety scoring tool from the very beginning. In particular, she worried about how it dealt with a caregiver’s substance use, which is not listed as one of the danger indicators that must be checked in order for the agency to remove a child. In a sharp pivot from New Mexico’s previous assessment, substance use is treated as a “complicating factor” rather than a deciding one.</p>



<p>The risk tool adds points if the parent struggles with substance abuse. However, the tool doesn’t weigh substances differently. Meth gets the same number of points as marijuana, for example.&nbsp;</p>



<p>In the Structured Decision Making training, Woodward and some of the other experienced caseworkers challenged this, fearing that it would put children at risk. The discussion got so heated that the head of the agency came to intervene. Woodward said she was effectively shut down. It was clear that the agency would be using the tool, whether she liked it or not.</p>



<p>Other CYFD workers and child welfare attorneys also raised concerns about how the safety and risk assessments handle drug abuse, a factor affecting almost one-third of children who were victims of maltreatment in 2020, according to <a href="https://www.acf.hhs.gov/sites/default/files/documents/cb/cm2020.pdf">statistics</a> from the U.S. Department of Health and Human Services.&nbsp;</p>



<p>While investigators are supposed to consider substance use in their decision about removing a child, it’s not supposed to be the sole reason for removal. This is part of a recent change in the agency’s approach to substance use. Caseworkers are now told to focus not solely on substance use, but rather on the impact substance use has on the caregiver’s ability to care for their children, said Gillia, the former deputy secretary of CYFD.&nbsp;</p>



<p>“It’s only if the substance use interferes with parenting that it becomes abuse or neglect,” Gillia said. “So I think what the tool is trying to do is force a look at what parenting behavior is impacted by the substance use.”&nbsp;</p>



<p>Phil Decter at Evident Change says the safety tool also helps when it comes to an inexperienced workforce. It has detailed instructions that help workers decide whether to check ‘yes’ or ‘no’ for a danger indicator. It points staff without a background in child welfare in the direction of things to look for, he said.&nbsp;</p>



<p>But out in the field, Woodward sees problems with this. The decisions are so monumental — literally life or death. For Woodward, the tool is not a substitute for a seasoned supervisor guiding less experienced staff through decisions.</p>



<blockquote class="wp-block-quote is-layout-flow wp-block-quote-is-layout-flow">
<p>“It becomes a crutch for a lack of confidence,” said Woodward. “I don’t think that being armed with a piece of paper and a laptop is an adequate replacement for someone who’s been in the trenches for 20 years and can tell them this is what you do.”</p>
</blockquote>



<p>And the tool doesn’t capture the unspoken cues that an investigator may notice, like a child who can’t make eye contact with a family member or won’t answer open-ended questions, Woodward said.&nbsp;</p>



<p>The safety tool has an “other” option where investigators can write in safety concerns not addressed by the nine danger indicators. But that should be used “rarely and infrequently,” said Decter. “That’s by design. The other danger indicators should be sufficient.”</p>



<p>The success of the tool depends on how it’s used, and this is where Woodward hit roadblocks in Hobbs. She said her supervisors would tell her she was paying attention to things that the safety tool said weren’t an issue, rather than focusing on what she was called upon to investigate. Woodward felt like she was being instructed to ignore history, context and other dangers that she knew were significant from past experience.&nbsp;</p>



<p>Information about those more subtle cues may be presented to a judge if CYFD files a petition to remove a child. But if the tool indicates that the child doesn’t need to be removed, the case likely won’t reach that stage.</p>



<p>Former CYFD staff like Gillia emphasized that the agency wants to keep kids living with their families unless they are clearly at risk of imminent and grave harm. The agency <a href="https://kevinssettlement.com/faq-about-the-lawsuit/">settled</a> a lawsuit in 2020 that accused the state of failing to take adequate care of foster children in CYFD custody.</p>



<p>But former caseworkers I spoke with worried that the tool was being used as a way to all but ensure that kids would remain in the home, even in cases where it might leave them at risk. The worry, for people like Ivy Woodward, was that the tool was being used to justify decisions that had already been made.</p>



<p>Evident Change emphasizes that “tools don’t make decisions, people make decisions.” But former CYFD workers told me they worried that this particular tool has an outsized impact on the agency’s final decision.</p>



<p>CYFD commissioned a report from an outside group, Collaborative Safety, to look at what went wrong in five specific cases from 2021 in which children died. In the <a href="https://drive.google.com/file/d/1PcBm6u9OciIimyIenD-UGgBWB9uTcibx/view?usp=sharing">report</a>, released in July 2022, staff involved in those cases said that sometimes the Structured Decision Making tool would say the child is “safe,” even if the worker felt there were “significant concerns with the family.”</p>



<p>“This places staff in the position where they perceive they cannot act on those concerns as it would go against what the tool’s output is,” wrote the report’s authors.</p>



<p>“Investigators were just using the tool as the end-all-be-all to a decision and an assessment. That’s not correct. We don’t want it to substitute their good judgment,” former CYFD Secretary Barbara Vigil <a href="https://sg001-harmony.sliq.net/00293/Harmony/en/PowerBrowser/PowerBrowserV2/20230201/-1/70817?startposition=20230208184134&amp;mediaEndTime=20230208195836&amp;viewMode=3&amp;globalStreamId=3">told</a> members of the New Mexico House Appropriations and Finance Committee in February 2023. In response to the Collaborative Safety report, CYFD <a href="https://drive.google.com/file/d/1XkaoTS3D5a-Dpkb6f5e21oJDD9TPbvl7/view?usp=sharing">announced</a> they would overhaul their training protocols and pledged to “make sure that every member of staff uniformly knows how to use the tool, including through enhanced training to investigators and supervisors statewide.”</p>





<p>The former CYFD worker I spoke to who requested anonymity saw this reflected in the investigations she reviewed. “I don’t even know how many cases I reviewed where it’s like, you should have removed that kid immediately. And they didn’t because of the safety tool,” she said. “We would always say, use your common sense. This is a guide.” But some workers and managers still put too much emphasis on the tool.</p>



<p>Esquibel said the tool played a major role in facilitating decision-making. “The weight is 100% on your safety assessment because that’s really the snapshot of what happened the day that that worker was there,” he said.</p>



<p>CYFD’s Meadows put it differently: “It’s not just a snapshot in time,” she said. “Safety assessments are not a one-off, one-and-done thing. Safety is assessed on an ongoing basis when we have an open case because sometimes it does take effort and time to learn more about a family or child situation.”</p>



<p>Woodward doesn’t think the tool should carry so much weight. Instead, it should be “something in your toolbox that you utilize to help you through the process,” she said. “I don’t think they should be used as the ultimate decision maker.”</p>



<figure class="wp-block-image alignwide size-large"><img src="https://www.codastory.com/wp-content/uploads/2023/06/Coda_CYFD_45-1800x1200.jpg" alt="" class="wp-image-45052"/><figcaption class="wp-element-caption">Vanna at her home in New Mexico.</figcaption></figure>



<p class="has-drop-cap">When Vanna was first removed from her parents at age five, the adults in her life told her that her parents were going on vacation.&nbsp;</p>



<p>She remembers a woman pulling up to their house and talking to her parents. Her mother was crying, her father was trying to calm her down. The strange woman went up to her younger brother, who was four at the time, and said, “How would you like to go somewhere else?”<br><br>“I looked at her, I said, ‘You’re not taking my brother,’” said Vanna.&nbsp;</p>



<p>Vanna, who is now 21 and using a nickname to preserve her privacy, has been fiercely protective of her little brother since they were small.&nbsp;</p>



<p>As the woman stood talking to her parents, Vanna tried to get him out of his car seat. “And I tried to run with him, and she started running after us. And she said, ‘I’m not trying to take your brother. I’m trying to take you both. You’re going to this lady. Your parents are going on vacation.’”&nbsp;</p>



<p>She didn’t realize until later that she wasn’t returning home. Vanna spent 13 years in foster care until she aged out at 18. She estimates she lived in more than 50 placements.&nbsp;</p>



<p>In foster care, Vanna felt like she was treated like a case number. Someone else made decisions about every aspect of her life. Someone else had power over her.</p>



<blockquote class="wp-block-quote is-layout-flow wp-block-quote-is-layout-flow">
<p>“I got numb. I became this robot. You want me to be a puppet, guys? I’ll be a puppet. Pull my strings and do whatever you want because that’s how you treat me,” she said.&nbsp;</p>
</blockquote>



<p>Vanna would tell the adults around her what she wanted, but she didn’t feel like they listened.</p>



<p>“They would always say, ‘Honey, we wouldn't make any decision if it wasn't going to be safe for you or if we weren't keeping your best interests in mind,’” she said. “How do you know what my best interest is?”</p>



<p>The safety assessment that’s currently in place rolled out statewide the year Vanna aged out of the system. But when she looks back at her own experience, systems like this still worry her. She thinks the assessments used to make decisions need to be more personalized, otherwise they do more harm than good.&nbsp;</p>



<p>“How do you put everyone in the same box, the same population? You put them under the same microscope, but they’re not the same. They’ve had individual situations,” she said.</p>



<p>If the assessments are too generalized, kids won’t end up getting the help they need, Vanna said. Just as the assessments used to evaluate their needs are flattened and standardized, the care kids get is too.</p>



<figure class="wp-block-image size-large is-resized"><img src="https://www.codastory.com/wp-content/uploads/2023/07/Coda_CYFD_50-900x1200.jpg" alt="" class="wp-image-45188" style="width:736px;height:981px"/><figcaption class="wp-element-caption">Vanna spent 13 years in foster care.</figcaption></figure>



<p class="has-drop-cap">For people like Vanna, many aspects of the child welfare system were dehumanizing. Ernie Holland, who worked at CYFD for 25 years, thinks that by relying on assessment tools like Structured Decision Making, the agency could make these effects even worse. After leaving the agency, he ran the Guidance Center, a nonprofit that offers mental health and other community-based services in Hobbs.&nbsp;</p>



<p>Even as a young child protective services investigator, he never lost sight of the weight of the decisions he was making. He shares Ivy Woodward’s belief that “each decision you make changes your life.”</p>



<p>“Unless you’ve gone around the block three or four times to screw up your courage to knock on somebody’s door and ask them why they sodomized their infant, you don’t know what it’s like,” he said. “I’ve been there, done that, and I know what it's like. And I know you’re risking some of yourself doing that work.”</p>



<p>That pressure never goes away. Holland still remembers a family whose case he managed nearly 50 years ago. He’s still not sure he made the right decision.&nbsp;</p>



<p>As the agency relies more on standardized assessments, he worries humanity gets removed from the equation.&nbsp;</p>



<p>For Holland, there’s a big difference between being able to say, “I made the decision based on this tool” and “I made the decision.”</p>



<p>“If you can hide behind an assessment tool,” Holland told me, “it’s not personal anymore. If you get it to where it’s not a personal decision, the kid loses. If you’re making life and death decisions, you damn well better own ‘em.”</p>



<p><em>This project was supported by the <a href="https://globalreportingcentre.org/">Global Reporting Centre</a> and <a href="https://the-citizens.com/about-us/">The Citizens</a> through the Tiny Foundation Fellowships for Investigative Journalism.</em></p>

<p>The post <a href="https://www.codastory.com/authoritarian-tech/new-mexico-child-welfare/">Inside New Mexico’s struggle to protect kids from abuse</a> appeared first on <a href="https://www.codastory.com">Coda Story</a>.</p>
]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">44250</post-id>	</item>
		<item>
		<title>How TikTok influencers exploit ethnic divisions in Ethiopia</title>
		<link>https://www.codastory.com/authoritarian-tech/tktok-ethiopia-ethnic-conflict/</link>
		
		<dc:creator><![CDATA[Endalkachew Chala]]></dc:creator>
		<pubDate>Wed, 14 Jun 2023 13:29:06 +0000</pubDate>
				<category><![CDATA[Authoritarian Technology]]></category>
		<category><![CDATA[Africa]]></category>
		<category><![CDATA[Algorithms]]></category>
		<category><![CDATA[Content moderation]]></category>
		<category><![CDATA[Feature]]></category>
		<category><![CDATA[TikTok]]></category>
		<guid isPermaLink="false">https://www.codastory.com/?p=44479</guid>

					<description><![CDATA[<p>Social media influencers in Africa’s second-largest country are helping to stoke conflict – and making money along the way</p>
<p>The post <a href="https://www.codastory.com/authoritarian-tech/tktok-ethiopia-ethnic-conflict/">How TikTok influencers exploit ethnic divisions in Ethiopia</a> appeared first on <a href="https://www.codastory.com">Coda Story</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>When Ethiopians took to the streets in February in reaction to a highly politicized rift within the country’s Orthodox Tewahedo Church, government authorities temporarily <a href="https://www.reuters.com/world/africa/social-media-restricted-ethiopia-after-church-rift-turns-violent-2023-02-10/#:~:text=NAIROBI%2C%20Feb%2010%20(Reuters),within%20the%20country's%20Orthodox%20Church.">blocked</a> social media platforms. From the outside, it may have seemed like just another blunt-force measure by an authoritarian state trying to quell social unrest. But the move was more keenly calculated than that: the rhetoric of social media influencers was having an outsized impact on how Ethiopians, both in the country and in the politically influential diaspora, perceived what was happening. As in other moments of intense social conflict amid Ethiopia’s <a href="https://www.cfr.org/global-conflict-tracker/conflict/conflict-ethiopia">civil war</a>, TikTok became ground zero for much of the conflict playing out online.</p>



<p>In early February, three archbishops of the Orthodox Tewahedo Church — one of the oldest churches in Africa, dating back to the 4th century — accused fellow church leaders of discriminating against the Oromo people, who constitute the largest ethnic group in Ethiopia’s population of 120 million. While church members come from a diverse array of ethnic backgrounds, worship services are predominantly conducted in the liturgical language of Ge’ez and in Amharic, a language primarily spoken by the Amhara people. Amharic is the dominant language of Addis Ababa, Ethiopia’s capital, and the working language of the federal government. This linguistic dominance underlines the cultural clout of Amharic.&nbsp;</p>



<p>After the three archbishops — all of Oromo lineage — made their allegations of discrimination public, they were excommunicated by church authorities. They then declared their plans to form a breakaway synod, triggering an instant public outcry. The cleavages underlying Ethiopia’s <a href="https://www.cfr.org/global-conflict-tracker/conflict/conflict-ethiopia">civil conflict</a> bubbled to the surface and devolved into violent skirmishes, resulting in a combined total of 30 <a href="https://www.reuters.com/world/africa/ethiopian-orthodox-church-reaches-deal-with-breakaway-oromo-synod-2023-02-16/">fatalities</a> in the southern Ethiopian town of Shashemene and in Addis Ababa.</p>



<p>But what was a serious political crisis for the church and for the country amounted to a prime opportunity for TikTok influencers seeking to spread their messages and turn a profit along the way.</p>



<p class="has-drop-cap">A quick scroll through live sessions on TikTok reveals heated political discussions in Amharic, Oromo and Tigrinya, in which participants exchange barbs and strategize on how to confront their adversaries. Zemedkun Bekele is prominent among them. A self-proclaimed defender of the Orthodox Tewahedo Church, he is known for his forceful, admonitory videos that are often over an hour long. Bekele began broadcasting threats against the breakaway synod, claiming to have video evidence that its leaders had engaged in homosexual activity and <a href="https://www.tiktok.com/@elias33chebere/video/7196204759966027014?q=zemedkun%20bekele%20%E1%8B%98%E1%88%98%E1%8B%B4&amp;t=1683813109956">threatening</a> to release the tape to the public. Accusations like this resonate deeply in a nation steeped in conservatism, where homosexuality is viewed with considerable disdain.</p>



<p>A known social media influencer who had already been banned from both Facebook and YouTube in 2020 for violating their policies on hate speech and the promotion of violence, Bekele re-established himself on TikTok in February 2023, just in time to jump into the fray. Since then, he has <a href="https://www.tiktok.com/@zemedkun.bekelee/video/7196708246986575110?q=zemedkun%20bekele%20%E1%8B%98%E1%88%98%E1%8B%B4%20ethiopian&amp;t=1680709496027">amassed</a> a dedicated audience of more than 203,000 TikTok followers, most of whom appear to be members of the Amhara ethnic group and followers of the Orthodox Tewahedo Church.&nbsp;</p>





<p>In the midst of the crisis, Bekele also launched attacks against a senior church teacher, Daniel Kibret, who has become a staunch ally of Prime Minister Abiy Ahmed. Drawing on the fact that the prime minister comes from a mixed religious background (his father is Muslim and his mother is Christian), Bekele made unfounded claims that Kibret had secretly <a href="https://www.tiktok.com/@millionassefa1/video/7205339771290406150?q=Zemedkun%20Bekele&amp;t=1680708202181">converted</a> to Islam.</p>



<p>In a video with more than 19,000 views, Bekele <a href="https://www.tiktok.com/@lele3234/video/7196419974825757998?_r=1&amp;_t=8bqefCfnITO&amp;social_sharing=v2">maintained</a> that he would not relent. “We will not back down without making sure the Ethiopian Orthodox Church is as big as the country itself. We will not back down without toppling Abiy Ahmed,” he <a href="https://www.tiktok.com/@lele3234/video/7196419974825757998?_r=1&amp;_t=8bqefCfnITO&amp;social_sharing=v2">said</a>. “We will not back down without hanging Daniel Kibret upside down.” The video was posted on February 4, the same day that the three bishops declared their intentions to secede from the Church.</p>



<p>Another account called TegOromo also saw swift growth surrounding the church controversy. TegOromo has a passionate following and is on the opposite side of the conflict from Bekele. The person who runs the account has expressed support for the Oromo religious leaders who sought to establish the independent synod. The account's moniker fuses the first three letters of “Tegaru” with “Oromo” — a calculated move to represent harmony between the Tigrayan and Oromo ethnic groups.</p>



<p>With more than 60,000 followers, TegOromo’s <a href="https://www.tiktok.com/@tegoromo">account</a> is marked by overt threats, inflammatory language and aggressive rhetoric. One TikTok video <a href="https://www.tiktok.com/@welloamhara4/video/7174075402539781378?q=TegOromo&amp;t=1680711180930">urged</a> supporters to “chop the Amharas like an onion.” This video was later removed from the platform, but copies of it remain accessible. In a live session, a TegOromo follower <a href="https://www.tiktok.com/@abyssinian_papi/video/7207241149889826094?q=TegOromo&amp;t=1680711180930">called</a> on Oromo people to “kill all Amharas” and even specified that children should not be spared. TegOromo cheered him on, urging other followers to answer the call and take up arms.&nbsp;&nbsp;</p>



<p>Despite the controversial nature of TegOromo’s content, the influencer's popularity suggests a burgeoning trend. Republishing his material or circulating incisive and satirical clips featuring TegOromo has become a reliable strategy for Ethiopian content creators seeking higher engagement.</p>



<p>In another instance, the spotlight turned toward two emerging TikTok influencers, Dalacha and Betayoo, who garnered attention for their adept use of vitriol. In one <a href="https://www.tiktok.com/@dalecha7/video/7212637904903081262">video</a>, Dalacha, who identifies as Oromo, launched a barrage of insults and sexual slurs at his rival TikToker, <a href="https://www.tiktok.com/@maereg_kazachisi">Maereg</a>, who identifies as Amhara. The episode exemplified the depths to which Dalacha was willing to stoop in order to denigrate Maereg, using language that reduced the Amhara community to mere cattle. It was intended only to amplify the prevailing animosity between the two ethnic groups.</p>



<p>In another video, Betayoo, who consistently identifies as Amhara, used similarly troubling language, employing both sexual and ethnic slurs. She <a href="https://www.tiktok.com/@betayoo14/video/7221748142453050642">directed</a> her insults toward a rival TikToker who identifies as Tigrayan and who has publicly expressed disdain for the Amhara community. Betayoo's actions escalated beyond targeting an individual. She proceeded to insult the entire Tigrayan community, expressing a desire for their eradication.&nbsp;</p>



<figure class="wp-block-image aligncenter size-large is-resized"><img src="https://www.codastory.com/wp-content/uploads/2023/06/Screenshots-1300x1200.png" alt="" class="wp-image-44481" style="width:565px;height:521px"/><figcaption class="wp-element-caption">Left: Zemedkun Bekele and his co-host celebrate achieving 60 million views. Right: A screenshot from TegOromo's live session, subtitled in Amharic, in which he calls on his Oromo kin to eliminate the Amharas.</figcaption></figure>



<p class="has-drop-cap">All of the videos I reference above contain clear violations of TikTok’s terms of service, yet they remain on the platform. TikTok's Community Guidelines strictly prohibit hate speech and hateful behavior and promise to remove such material from the platform. Accounts that engage in severe or repeated violations of hate speech policies are supposed to be promptly banned. Despite these guidelines, plenty of Ethiopians who have exhibited hateful behavior remain active on the platform and continue to produce content for significant numbers of followers.</p>



<p>When I approached TikTok staff members to alert them about the videos and ask them to comment for this piece, they did not respond. It is difficult to definitively prove that this kind of discourse directly contributes to violence on the ground. But it is clear that discussions of political violence and religious conflicts on TikTok often result in the spread of misinformation and amplify interethnic hatred. Clips containing these influencers’ offensive remarks have also seeped onto other platforms, such as YouTube and Facebook, where reposting or critiquing such content has become a low-effort method for content creators to gain engagement.</p>



<p>Given the sheer volume of such live streams, TikTok's moderation team may be overwhelmed, struggling to monitor these discussions and remove inappropriate content. It is also worth noting that all of these accounts are run primarily in Amharic, Oromo or Tigrinya, languages that are spoken by millions of Ethiopians in and outside of the country but that have historically been underrepresented on major social media platforms. TikTok does not publicly disclose how many staff members or content moderators it employs for reviewing content in these languages.</p>



<p class="has-drop-cap">All this engagement is not driven purely by political vitriol — it is also a pursuit of profit. The TikTok LIVE feature has seen a swift uptick in popularity among Ethiopian users, catalyzing an emergence of politically-minded influencers who reap economic rewards through virtual gifts. These gifts can be converted into TikTok “diamonds,” which are in turn redeemable for actual cash.</p>



<p>Crafting politically-charged clickbait, designed to fan the flames of ethnic and religious discord, is emerging as a common tactic for financial gain. It has had especially strong uptake among individuals in the Ethiopian diaspora. Many of the most impactful Ethiopian TikTok figures are actually located in Western nations. Zemedkun Bekele, for instance, lives in Germany.</p>



<p>Amid the ongoing crisis, Bekele proudly claimed to have received one of the most sought-after TikTok LIVE gifts — the lion, which translates to a little over $400 in real-world currency. He has prominently featured a video on his profile displaying a virtual lion roaring at the screen, serving as both a symbol of his influence and a testament to the economic gains that one can reap through this kind of engagement on TikTok.</p>



<p class="has-drop-cap">In a 2021 <a href="https://www.nytimes.com/2021/12/05/business/media/tiktok-algorithm.html#:~:text=TikTok%20has%20publicly%20shared%20the,sought%20to%20crack%20its%20code.">essay</a>, former New York Times media critic Ben Smith showed how TikTok's algorithmic recommendation framework has helped to intensify cultural, linguistic and ideological divides among its global user base. The unfolding situation in Ethiopia could serve as a case study for Smith’s argument. With the videos I described — in addition to hundreds of others — the platform's content dissemination strategy appears to inadvertently encourage distinct factions to isolate themselves and push each other to commit hate speech and even physical violence.</p>





<p>The rise of these online strongholds poses significant challenges to the inclusive, cross-cultural understanding that TikTok <a href="https://newsroom.tiktok.com/en-us/one-year-later-our-commitment-to-diversity-and-inclusion">claims</a> to want to foster. Users now risk becoming trapped in ideological echo chambers, detached from diverse perspectives and viewpoints, and increasingly vulnerable to politically-motivated disinformation.</p>



<p>At the core of the issue lies the question of accountability. What obligation does TikTok, and by extension other social media platforms, have to curtail the spread of divisive content, particularly when it is financially incentivized? And moreover, could the pursuit of profit from politically-charged content inadvertently pave the way for more extreme or hazardous content, potentially triggering threats of violence in real life?</p>



<p>In the end, for onlookers familiar with Ethiopian culture and politics, it is clear that the platforms that invite us to share our lives online are failing to mediate the complexities of the world they seek to engage with.</p>

<div class="wp-block-group alignright converted-related-posts is-style-meta-info is-layout-flow wp-block-group-is-layout-flow">
<h4 class="wp-block-heading">Related Articles</h4>



<div class="wp-block-group is-vertical is-layout-flex wp-container-core-group-is-layout-8cf370e7 wp-block-group-is-layout-flex">
<div class="wp-block-fabrica-article-preview wp-block-fabrica-article-preview--alignment-left wp-block-fabrica-article-preview--external-source-local is-style-featured category-authoritarian-tech post_tag-algorithms post_tag-feature post_tag-surveillance post_tag-united-states author-cap-ericahellerstein ">
<div class="wp-block-fabrica-article-preview-image is-style-round"><a class="wp-block-fabrica-article-preview-image__link" href="https://www.codastory.com/authoritarian-tech/amazon-workers-surveillance/"><img class="wp-block-fabrica-article-preview-image__image" src="https://www.codastory.com/wp-content/uploads/2023/05/KEREM-YUCEL-AFP-via-Getty-Images-250x250.jpg" srcset="https://www.codastory.com/wp-content/uploads/2023/05/KEREM-YUCEL-AFP-via-Getty-Images-250x250.jpg 250w, https://www.codastory.com/wp-content/uploads/2023/05/KEREM-YUCEL-AFP-via-Getty-Images-72x72.jpg 72w, https://www.codastory.com/wp-content/uploads/2023/05/KEREM-YUCEL-AFP-via-Getty-Images-232x232.jpg 232w, https://www.codastory.com/wp-content/uploads/2023/05/KEREM-YUCEL-AFP-via-Getty-Images-900x900.jpg 900w" width="250" height="250"/></a></div>



<div class="wp-block-group is-vertical is-layout-flex wp-container-core-group-is-layout-8cf370e7 wp-block-group-is-layout-flex">
<h2 class="wp-block-fabrica-article-preview-title is-style-sans has-small-font-size"><a class="wp-block-fabrica-article-preview-title__link" href="https://www.codastory.com/authoritarian-tech/amazon-workers-surveillance/">How Somali workers in the US are fighting Amazon’s surveillance machine</a></h2>


<div class="wp-block-post-author-name">Erica Hellerstein</div></div>
</div>



<div class="wp-block-fabrica-article-preview wp-block-fabrica-article-preview--alignment-left wp-block-fabrica-article-preview--external-source-local is-style-featured category-polarization post_tag-anti-lgbtq-disinformation post_tag-content-moderation post_tag-facebook post_tag-feature post_tag-india author-cap-alishan-jafri author-cap-vipul-kumar ">
<div class="wp-block-fabrica-article-preview-image is-style-round"><a class="wp-block-fabrica-article-preview-image__link" href="https://www.codastory.com/polarization/india-same-sex-marriage/"><img class="wp-block-fabrica-article-preview-image__image" src="https://www.codastory.com/wp-content/uploads/2023/04/Photo-Alishan-Jafri-250x250.jpg" srcset="https://www.codastory.com/wp-content/uploads/2023/04/Photo-Alishan-Jafri-250x250.jpg 250w, https://www.codastory.com/wp-content/uploads/2023/04/Photo-Alishan-Jafri-72x72.jpg 72w, https://www.codastory.com/wp-content/uploads/2023/04/Photo-Alishan-Jafri-232x232.jpg 232w, https://www.codastory.com/wp-content/uploads/2023/04/Photo-Alishan-Jafri-900x900.jpg 900w" width="250" height="250"/></a></div>



<div class="wp-block-group is-vertical is-layout-flex wp-container-core-group-is-layout-8cf370e7 wp-block-group-is-layout-flex">
<h2 class="wp-block-fabrica-article-preview-title is-style-sans has-small-font-size"><a class="wp-block-fabrica-article-preview-title__link" href="https://www.codastory.com/polarization/india-same-sex-marriage/">In India, a trans woman stands up to the ‘YouTube Baba’</a></h2>


<div class="wp-block-post-author-name">Alishan Jafri</div></div>
</div>



<div class="wp-block-fabrica-article-preview wp-block-fabrica-article-preview--alignment-left wp-block-fabrica-article-preview--external-source-local is-style-featured category-polarization post_tag-africa post_tag-anti-lgbtq-disinformation post_tag-anti-science-politicians post_tag-dispatch post_tag-reproductive-rights coda_storyline-global-anti-lgbtq author-cap-prudencenyamishana ">
<div class="wp-block-fabrica-article-preview-image is-style-round"><a class="wp-block-fabrica-article-preview-image__link" href="https://www.codastory.com/polarization/uganda-fertility-treatment-law/"><img class="wp-block-fabrica-article-preview-image__image" src="https://www.codastory.com/wp-content/uploads/2023/04/ASSOCIATED-PRESS-250x250.jpg" srcset="https://www.codastory.com/wp-content/uploads/2023/04/ASSOCIATED-PRESS-250x250.jpg 250w, https://www.codastory.com/wp-content/uploads/2023/04/ASSOCIATED-PRESS-72x72.jpg 72w, https://www.codastory.com/wp-content/uploads/2023/04/ASSOCIATED-PRESS-232x232.jpg 232w, https://www.codastory.com/wp-content/uploads/2023/04/ASSOCIATED-PRESS-900x900.jpg 900w" width="250" height="250"/></a></div>



<div class="wp-block-group is-vertical is-layout-flex wp-container-core-group-is-layout-8cf370e7 wp-block-group-is-layout-flex">
<h2 class="wp-block-fabrica-article-preview-title is-style-sans has-small-font-size"><a class="wp-block-fabrica-article-preview-title__link" href="https://www.codastory.com/polarization/uganda-fertility-treatment-law/">Uganda is targeting reproductive rights alongside its ‘anti-gay’ bill</a></h2>


<div class="wp-block-post-author-name">Prudence Nyamishana</div></div>
</div>
</div>
</div>
<p>The post <a href="https://www.codastory.com/authoritarian-tech/tktok-ethiopia-ethnic-conflict/">How TikTok influencers exploit ethnic divisions in Ethiopia</a> appeared first on <a href="https://www.codastory.com">Coda Story</a>.</p>
]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">44479</post-id>	</item>
		<item>
		<title>How Somali workers in the US are fighting Amazon’s surveillance machine</title>
		<link>https://www.codastory.com/authoritarian-tech/amazon-workers-surveillance/</link>
		
		<dc:creator><![CDATA[Erica Hellerstein]]></dc:creator>
		<pubDate>Wed, 17 May 2023 13:36:26 +0000</pubDate>
				<category><![CDATA[Authoritarian Technology]]></category>
		<category><![CDATA[Algorithms]]></category>
		<category><![CDATA[Feature]]></category>
		<category><![CDATA[Surveillance]]></category>
		<category><![CDATA[United States]]></category>
		<guid isPermaLink="false">https://www.codastory.com/?p=43437</guid>

					<description><![CDATA[<p>Minnesota just passed a labor bill that could force Amazon to respect the rights of warehouse workers</p>
<p>The post <a href="https://www.codastory.com/authoritarian-tech/amazon-workers-surveillance/">How Somali workers in the US are fighting Amazon’s surveillance machine</a> appeared first on <a href="https://www.codastory.com">Coda Story</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Amazon’s unbelievably quick turnaround times on deliveries have become a given for many people in the U.S. Order a bottle of mouthwash on a weekday morning and your breath will be minty fresh within a day or even just a few hours.&nbsp;</p>



<p>But the success of the e-commerce giant’s rapid-fire delivery model depends on what happens inside Amazon’s “fulfillment centers” — sprawling warehouses where workers sort and pack orders for shipment, all under the watchful eye of technical systems that track their every move. For years, workers have said that the company’s algorithmically-driven approach pushes them to the brink, treating them “like robots” in the service of meeting unattainable productivity quotas and driving up injuries in the process.</p>



<p>On May 16, lawmakers in Minnesota passed a pioneering workplace safety bill that could improve labor conditions for Amazon employees subjected to the company’s worker tracking system. Organizers behind the legislation say it will provide the strongest labor protections in the nation for people working in warehouses like Amazon’s.</p>



<p>Work in Amazon warehouses is overseen entirely by technology: Algorithms <a href="https://www.washingtonpost.com/technology/2021/12/02/amazon-workplace-monitoring-unions/">track</a> workers’ speed and productivity, measure the so-called “time off task” that employees spend logged out of their workstations and alert managers when workers don’t meet their productivity quotas. Mohamed Farah, a 50-year-old Amazon employee who came to the U.S. in the mid-1990s as a refugee from Somalia, works a 10-hour shift packaging items for shipment at a Minnesota warehouse. He said the company’s grueling pace of work and “time off task” rules have worn on workers’ bodies and minds, including his own. “They say you have to pack a minimum of 80 boxes per hour, but you cannot do it,” he told me. “If you try to pack 80 per hour, you cannot go to the bathroom. If you go to the bathroom, the rate is down.”&nbsp;</p>





<p>Amazon’s “time off task” measurement is a constant source of worry for many workers. The company tracks the time employees are gone from their workstations. If you spend too much time away from your station, you get in trouble. Internal company documents obtained by VICE <a href="https://www.vice.com/en/article/5dgn73/internal-documents-show-amazons-dystopian-system-for-tracking-workers-every-minute-of-their-shifts">revealed</a> that Amazon can fire employees who accumulate 30 minutes of “time off task” on three separate days over the course of a year. The documents also showed that the company tracks the amount of time employees spend in the bathroom. Some employees have <a href="https://www.usatoday.com/videos/money/business/2018/04/17/some-amazon-workers-pee-bottles-make-work-deadlines-report-says/33881071/">described</a> needing to urinate in bottles while working to avoid penalties for using the bathroom.</p>



<p>Farah, who has worked for Amazon for seven years, said that workers get hurt trying to keep pace with packaging quotas. He has come home with three injuries over the last few years. “You go home feeling very tired. Headache, muscle aches, leg aches,” he told me. “They want us to work like robots.”</p>



<p>Farah’s experience is common across the company. A recent <a href="https://uniglobalunion.org/news/globalsurvey23/">survey</a> of more than 2,000 Amazon workers across eight countries found that the company’s performance monitoring and tracking system has taken a physical and emotional toll on employees’ well-being: 57% of respondents said their mental health suffered due to the company’s productivity monitoring, and 51% claimed it harmed their physical health. Amazon <a href="https://www.cnbc.com/2023/04/12/study-amazon-workers-seriously-hurt-at-twice-rate-of-other-warehouses.html">has</a> twice the injury rate of comparable warehouses in the U.S., according to a recent analysis of injury data submitted to federal safety regulators.</p>



<p>“You do a lot of bending and back-and-forth walking for hours. You get thirsty, and you go to the bathroom, and it’s on a different floor,” explained Qali Jama, a 39-year-old Amazon warehouse worker who also hails from Somalia. “And then you go to the bathroom, which only has two toilets. If those toilets are occupied, you need to wait to go to the bathroom. The whole time you’re gone your time accumulates, it adds up. And next thing you know, the manager goes up to you and tells you a couple of days later, ‘You have time off task.’”</p>



<p>It was these conditions that fueled a raft of organizing efforts in Amazon facilities across the U.S., including the nation’s first successful union drive at a company warehouse in New York last year. Workers from East Africa, like Qali, were among the first Amazon employees in the nation to confront the company over its labor practices and have been at the helm of organizing efforts in Minnesota.&nbsp;</p>



<p>The Minnesota <a href="https://www.revisor.mn.gov/bills/text.php?number=HF36&amp;type=ue&amp;version=1&amp;session=ls93&amp;session_year=2023&amp;session_number=0">bill</a>, which state lawmakers passed on May 16, will not apply only to Amazon — though lawmakers who supported the legislation <a href="https://www.house.mn.gov/sessiondaily/Story/17316">said</a> it was spurred by reports of injuries at Amazon and a lack of transparency around the company’s productivity quotas. The policy will require any warehouse with more than 250 employees to provide workers with the quotas and work speed metrics used to evaluate their performance. The law also requires this information to be communicated in employees’ preferred language. Organizers say this will force Amazon to be transparent with its employees about the company’s often opaque workplace productivity metrics — a system they claim increases injuries among workers. The legislation also prohibits employers from imposing quotas on workers that prevent them from taking bathroom, food, rest and prayer breaks.</p>



<p>The law is the product of years of organizing by East African migrant workers, many of whom came to Minnesota as refugees escaping Somalia’s civil war in the 1990s and formed what became the largest Somali diaspora community in the United States. Somali workers now make up large numbers of Amazon’s labor force at warehouses. They were the first in the country to <a href="https://www.nytimes.com/2018/11/20/technology/amazon-somali-workers-minnesota.html">take on</a> Amazon’s labor practices in 2018, when a cohort of workers in the Minneapolis area staged a walkout over working conditions at local warehouses, forcing the company to the negotiating table.&nbsp;</p>





<p>“Before anyone in the labor movement, we took on Amazon,” said Abdirahman Muse, the executive director of the Awood Center, a Minnesota-based nonprofit that advocates for East African workers’ rights. “And everybody thought we were crazy. But we were not.” Muse compared the Minnesota Somali workers’ trajectory to “the story of David and Goliath” — a small group of refugees and immigrants “facing one of the biggest companies in the world.”</p>



<p>It was the realities of working inside Amazon’s warehouses that prompted Qali to begin organizing for better labor conditions last year. “When I started working there, I said, ‘This is not right what they are doing,’” she recalled. “I always felt like we were slaves there. I always fought against them, I knew my rights. I felt that there were people who have only been in the United States for 30 days. They need money. They come from hunger. They will take anything. And I think that’s what Amazon depends on.”&nbsp;</p>



<p>“I want people when they come to America to know that they are still human,” she added. “This country does stand for what they believe, but you have to speak — and act.”</p>

<div class="wp-block-group alignright converted-related-posts is-style-meta-info is-layout-flow wp-block-group-is-layout-flow">
<h4 class="wp-block-heading">Related Articles</h4>



<div class="wp-block-group is-vertical is-layout-flex wp-container-core-group-is-layout-8cf370e7 wp-block-group-is-layout-flex">
<div class="wp-block-fabrica-article-preview wp-block-fabrica-article-preview--alignment-left wp-block-fabrica-article-preview--external-source-local is-style-featured category-disinformation post_tag-authoritarianism post_tag-internet-censorship post_tag-pakistan post_tag-q-and-a post_tag-social-media-censorship author-cap-javeriakhalid ">
<div class="wp-block-fabrica-article-preview-image is-style-round"><a class="wp-block-fabrica-article-preview-image__link" href="https://www.codastory.com/disinformation/pakistan-internet-shutdown/"><img class="wp-block-fabrica-article-preview-image__image" src="https://www.codastory.com/wp-content/uploads/2023/05/AAMIR-QURESHI-AFP-via-Getty-Images-250x250.jpg" srcset="https://www.codastory.com/wp-content/uploads/2023/05/AAMIR-QURESHI-AFP-via-Getty-Images-250x250.jpg 250w, https://www.codastory.com/wp-content/uploads/2023/05/AAMIR-QURESHI-AFP-via-Getty-Images-72x72.jpg 72w, https://www.codastory.com/wp-content/uploads/2023/05/AAMIR-QURESHI-AFP-via-Getty-Images-232x232.jpg 232w, https://www.codastory.com/wp-content/uploads/2023/05/AAMIR-QURESHI-AFP-via-Getty-Images-900x900.jpg 900w" width="250" height="250"/></a></div>



<div class="wp-block-group is-vertical is-layout-flex wp-container-core-group-is-layout-8cf370e7 wp-block-group-is-layout-flex">
<h2 class="wp-block-fabrica-article-preview-title is-style-sans has-small-font-size"><a class="wp-block-fabrica-article-preview-title__link" href="https://www.codastory.com/disinformation/pakistan-internet-shutdown/">Amid chaos, Pakistan shut down the internet to little effect</a></h2>


<div class="wp-block-post-author-name">Javeria Khalid</div></div>
</div>



<div class="wp-block-fabrica-article-preview wp-block-fabrica-article-preview--alignment-left wp-block-fabrica-article-preview--external-source-local is-style-featured category-surveillance-and-control post_tag-biometrics post_tag-facial-recognition post_tag-feature post_tag-migration post_tag-surveillance coda_storyline-surveillance-and-borders author-cap-ericahellerstein ">
<div class="wp-block-fabrica-article-preview-image is-style-round"><a class="wp-block-fabrica-article-preview-image__link" href="https://www.codastory.com/surveillance-and-control/us-ice-alternatives-to-detention/"><img class="wp-block-fabrica-article-preview-image__image" src="https://www.codastory.com/wp-content/uploads/2023/05/Immigrating-to-the-US--250x250.jpg" srcset="https://www.codastory.com/wp-content/uploads/2023/05/Immigrating-to-the-US--250x250.jpg 250w, https://www.codastory.com/wp-content/uploads/2023/05/Immigrating-to-the-US--72x72.jpg 72w, https://www.codastory.com/wp-content/uploads/2023/05/Immigrating-to-the-US--232x232.jpg 232w, https://www.codastory.com/wp-content/uploads/2023/05/Immigrating-to-the-US--900x900.jpg 900w" width="250" height="250"/></a></div>



<div class="wp-block-group is-vertical is-layout-flex wp-container-core-group-is-layout-8cf370e7 wp-block-group-is-layout-flex">
<h2 class="wp-block-fabrica-article-preview-title is-style-sans has-small-font-size"><a class="wp-block-fabrica-article-preview-title__link" href="https://www.codastory.com/surveillance-and-control/us-ice-alternatives-to-detention/">Immigrating to the US? ICE wants your biometrics</a></h2>


<div class="wp-block-post-author-name">Erica Hellerstein</div></div>
</div>



<div class="wp-block-fabrica-article-preview wp-block-fabrica-article-preview--alignment-left wp-block-fabrica-article-preview--external-source-local is-style-featured category-authoritarian-tech post_tag-feature post_tag-internet-censorship post_tag-reproductive-rights post_tag-surveillance post_tag-united-states author-cap-ericahellerstein ">
<div class="wp-block-fabrica-article-preview-image is-style-round"><a class="wp-block-fabrica-article-preview-image__link" href="https://www.codastory.com/authoritarian-tech/texas-isps-bill/"><img class="wp-block-fabrica-article-preview-image__image" src="https://www.codastory.com/wp-content/uploads/2023/03/Texas-250x250.jpg" srcset="https://www.codastory.com/wp-content/uploads/2023/03/Texas-250x250.jpg 250w, https://www.codastory.com/wp-content/uploads/2023/03/Texas-72x72.jpg 72w, https://www.codastory.com/wp-content/uploads/2023/03/Texas-232x232.jpg 232w, https://www.codastory.com/wp-content/uploads/2023/03/Texas-900x900.jpg 900w" width="250" height="250"/></a></div>



<div class="wp-block-group is-vertical is-layout-flex wp-container-core-group-is-layout-8cf370e7 wp-block-group-is-layout-flex">
<h2 class="wp-block-fabrica-article-preview-title is-style-sans has-small-font-size"><a class="wp-block-fabrica-article-preview-title__link" href="https://www.codastory.com/authoritarian-tech/texas-isps-bill/">Texas lawmakers want to erase abortion from the internet</a></h2>


<div class="wp-block-post-author-name">Erica Hellerstein</div></div>
</div>
</div>
</div>
<p>The post <a href="https://www.codastory.com/authoritarian-tech/amazon-workers-surveillance/">How Somali workers in the US are fighting Amazon’s surveillance machine</a> appeared first on <a href="https://www.codastory.com">Coda Story</a>.</p>
]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">43437</post-id>	</item>
		<item>
		<title>U.S. lawmakers grill TikTok, but it’s all bark and no bite</title>
		<link>https://www.codastory.com/newsletters-category/tiktok-hearing-china/</link>
		
		<dc:creator><![CDATA[Ellery Roberts Biddle]]></dc:creator>
		<pubDate>Thu, 30 Mar 2023 16:12:38 +0000</pubDate>
				<category><![CDATA[Authoritarian Tech newsletter]]></category>
		<category><![CDATA[Newsletters]]></category>
		<category><![CDATA[Algorithms]]></category>
		<category><![CDATA[Newsletter]]></category>
		<category><![CDATA[Surveillance]]></category>
		<category><![CDATA[TikTok]]></category>
		<guid isPermaLink="false">https://www.codastory.com/?p=42234</guid>

					<description><![CDATA[<p>Authoritarian Tech is a weekly newsletter tracking how people in power are abusing technology and what it means for the rest of us. </p>
<p>The post <a href="https://www.codastory.com/newsletters-category/tiktok-hearing-china/">U.S. lawmakers grill TikTok, but it’s all bark and no bite</a> appeared first on <a href="https://www.codastory.com">Coda Story</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>My early years in this field were dominated by news of the Arab uprisings and the ways in which protest movements were being organized — but also surveilled — through technology. When Egyptian authorities shut down the internet amid mass demonstrations in Cairo’s Tahrir Square, for many of us it seemed unprecedented.&nbsp;</p>



<p>But not for our colleagues from greater China, perhaps the world’s first mover on all things authoritarian tech. Years later, I worked on a <a href="https://advox.globalvoices.org/2013/12/17/in-chinas-ethnic-minority-regions-internet-blackouts-are-the-norm/">story</a> about how shutdowns were routine in China’s ethnic minority regions and could last for months at a time. Political riots in Lhasa triggered a shutdown that lasted from March until December of 2008. In 2009, protests in Xinjiang, home to a majority of China’s Uyghur Muslim population, led to a yearlong blackout. Chinese authorities were ahead of the curve on surveillance tech, too. In 2017, Xinjiang residents were made to <a href="https://advox.globalvoices.org/2013/12/17/in-chinas-ethnic-minority-regions-internet-blackouts-are-the-norm/">install</a> Jingwang, a surveillance software system as pervasive as products like Pegasus, but with added features like a remote control option that would allow the operator to manipulate the user’s phone.</p>



<p>Xinjiang and the systematic oppression of Uyghurs was the focus of one of the two major U.S. congressional hearings on China last week. But only one made major media headlines — and it wasn’t the hearing on what U.S. officials now <a href="https://www.state.gov/reports/2021-report-on-international-religious-freedom/china/xinjiang/">refer</a> to as the Uyghur genocide.</p>



<p>Instead, the congressional <a href="https://energycommerce.house.gov/events/full-committee-hearing-tik-tok-how-congress-can-safeguard-american-data-privacy-and-protect-children-from-online-harms">grilling</a> of TikTok CEO Shou Zi Chew grabbed most of the attention. Members of Congress from both parties put on a spectacle of Cold War-style grandstanding, complete with the Sinophobic pigeonholing of Chew. Representatives repeatedly suggested that Chew was an associate of the Chinese Communist Party. And when Texas Republican Dan Crenshaw insisted that Chew himself was Chinese, the executive had to state for the record that he is from, and lives in, Singapore, an entirely separate country.</p>



<p>The Chinese government’s abysmal human rights record and mass incarceration of Uyghur Muslims did make it into the hearing, as representatives heard again how in 2019, TikTok was <a href="https://www.vox.com/recode/2019/11/27/20985795/tiktok-censorship-china-uighur-bytedance">removing</a> content concerning human rights in western China. But the TikTok hearing wasn’t really about human rights, whether in China or in the U.S.&nbsp;</p>



<p>It also didn’t seem designed to get new information from the CEO — on a few occasions, representatives literally <a href="https://www.youtube.com/watch?v=KMjIz_vE6qc">cut him off</a> before he could answer their questions. Instead it was, as Republican Committee Chair Cathy McMorris Rodgers put it, about “American values” and a desire to show anti-China toughness towards the leader of TikTok, whose parent company is indeed Chinese.&nbsp;</p>



<p>Right now, TikTok is subject to more regulatory scrutiny and requirements than any other major social media platform in the U.S., essentially because of its connection to China. All the while, the most clear and present threats posed by China keep failing to capture enough public attention to drive meaningful change in how the U.S. responds to the challenge of this enormous world power.</p>



<p><strong>U.S. government agencies are now banned from using commercial spyware</strong>, <strong>thanks to an </strong><a href="https://www.whitehouse.gov/briefing-room/statements-releases/2023/03/27/fact-sheet-president-biden-signs-executive-order-to-prohibit-u-s-government-use-of-commercial-spyware-that-poses-risks-to-national-security/"><strong>executive order</strong></a><strong> from President Biden.</strong> The order is limited to spyware that poses security risks or “significant risks of improper use by a foreign government or foreign person.” Although this doesn’t quite constitute a wholesale ban on spyware, it is an important step, especially following last year’s revelations that the FBI had <a href="https://www.nytimes.com/2022/11/12/us/politics/fbi-pegasus-spyware-phones-nso.html">considered</a> using NSO Group’s Pegasus — software that is perhaps best known for being abused by governments — as an investigative tool. The order explicitly bans spyware that enables the collection of “information on activists, academics, journalists, dissidents, political figures, or members of non-governmental organizations or marginalized communities in order to intimidate such persons” — key details reflecting the reality of how spyware has been used to undermine democratic institutions around the world.&nbsp;</p>



<p><strong>Mexican President Andrés Manuel López Obrador </strong><a href="https://www.reuters.com/world/americas/facing-spying-claims-mexico-president-says-activists-phone-call-was-recorded-2023-03-23/?emci=8d441630-aacc-ed11-a8e0-00224832e811&amp;emdi=20168667-abcc-ed11-a8e0-00224832e811&amp;ceid=4614439"><strong>admitted</strong></a><strong> that his government spied on the human rights defender Raymundo Ramos. </strong>The announcement followed the release of internal <a href="https://ejercitoespia.r3d.mx/">documents</a> showing how the Mexican military used NSO Group’s Pegasus spyware to surveil Ramos, who had been helping families facing threats from drug traffickers. The president’s office also took the opportunity to cast doubt on the veracity of the full set of documents, which became public due to a major hacking operation but were then reviewed and verified by technical and legal experts. In any case, the admission is a big deal for Mexico, where the executive branch has a long history of denying or simply turning a blind eye to the evidence of state abuses of due process and human rights.</p>



<p><strong>Russian authorities are using facial recognition to stop protesters before they even hit the streets.</strong> Surveillance systems in the Moscow metro are being used to identify likely protesters and detain them before they can appear at a demonstration, often with social media posts cited as grounds for arrest. Although the facial recognition system has certainly attracted more attention since Russia began its war in Ukraine, it is nothing new — Moscow first deployed the technology in 2017. But a review of 2,000 pending criminal cases by Reuters, with the aid of the Russian human rights group OVD-Info, <a href="https://www.reuters.com/investigates/special-report/ukraine-crisis-russia-detentions/">shows</a> that the system has been used to prosecute hundreds of protesters since the start of the war.</p>



<h2 class="wp-block-heading" id="h-what-we-re-reading"><strong>WHAT WE’RE READING</strong></h2>



<p>This week, lots of people read a New York Times <a href="https://www.nytimes.com/2023/03/24/opinion/yuval-harari-ai-chatgpt.html">op-ed</a> by Tristan Harris and Aza Raskin, of the Center for Humane Technology, and historian Yuval Harari, about the apparent impending doom of AI. The piece drove a lot of experts nuts. I don’t want to spill more ink on the subject, but the issues at hand are really important. So I’m recommending two Twitter threads by esteemed scholars who know this stuff much better than most, have nothing to gain from speaking about tech in outlandish terms and are great at separating hype from reality.</p>



<ul class="wp-block-list"><li>Bentley University math professor Noah Giansiracusa dressed down the op-ed almost <a href="https://twitter.com/ProfNoahGian/status/1639698806420697089">line by line</a>. In my favorite tweet, he points out that the authors describe chatbots as “humanity's most consequential technology” and asks, “Are you seriously putting chatbots above antibiotics, pasteurization, the internet, cell phones, smart phones, cars, planes, electricity, the light bulb..?”</li></ul>



<ul class="wp-block-list"><li>“Design Justice” author and former MIT professor Sasha Costanza-Chock launched a fascinating, forward-looking <a href="https://twitter.com/schock/status/1640024767704227840">thread</a> with this provocation: “Generative AI systems are trained upon vast datasets of centuries of human creative and intellectual work. They should thus belong to the commons, to all humanity, rather than to a handful of powerful for-profit corporations.” </li></ul>
<p>The post <a href="https://www.codastory.com/newsletters-category/tiktok-hearing-china/">U.S. lawmakers grill TikTok, but it’s all bark and no bite</a> appeared first on <a href="https://www.codastory.com">Coda Story</a>.</p>
]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">42234</post-id>	</item>
		<item>
		<title>What a law designed to protect the internet has to do with abortion</title>
		<link>https://www.codastory.com/surveillance-and-control/scotus-section-230-abortion/</link>
		
		<dc:creator><![CDATA[Tamara Evdokimova]]></dc:creator>
		<pubDate>Mon, 23 Jan 2023 09:20:00 +0000</pubDate>
				<category><![CDATA[Surveillance and Control]]></category>
		<category><![CDATA[Algorithms]]></category>
		<category><![CDATA[Content moderation]]></category>
		<category><![CDATA[Explainer]]></category>
		<category><![CDATA[Social media censorship]]></category>
		<category><![CDATA[United States]]></category>
		<guid isPermaLink="false">https://www.codastory.com/?p=39414</guid>

					<description><![CDATA[<p>A Supreme Court ruling on Section 230 could limit online access to abortion information</p>
<p>The post <a href="https://www.codastory.com/surveillance-and-control/scotus-section-230-abortion/">What a law designed to protect the internet has to do with abortion</a> appeared first on <a href="https://www.codastory.com">Coda Story</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>The United States Supreme Court unleashed a political earthquake when it overturned Roe v. Wade in June 2022, reversing nearly fifty years of precedent establishing a constitutional right to abortion.&nbsp;</p>



<p>After the decision, red states moved quickly to ban or severely limit access to the procedure. This made the virtual sphere uniquely important for people seeking information about abortion, especially those living in states that have outlawed the procedure with few or no exceptions.&nbsp;</p>





<p>Google searches for abortion medications <a href="https://19thnews.org/2022/07/abortion-access-activists-google-keywords-seo/">increased</a> by 70% the month following the court ruling. People flocked to social media platforms and websites with resources about where and how to end a pregnancy, pay for an abortion or <a href="https://www.abortionfinder.org/#find-assistance">seek</a> help to obtain an abortion out of state.&nbsp;</p>



<p>Despite state laws criminalizing abortion, these digital spaces are legally protected from liability for hosting this kind of content. That’s thanks to the landmark Section 230 of the 1996 Communications Decency Act, the 26 words that are often <a href="https://www.jeffkosseff.com/home">credited</a> with creating today’s internet as we know it. Thanks to Section 230, websites of all kinds are protected from lawsuits over material that users might post on their platforms. This legal shield allows sites to host speech about all kinds of things that might be illegal — abortion included — without worrying about being sued.</p>



<p>But the future of 230 is on shaky ground. Next month, the U.S. Supreme Court will <a href="https://www.scotusblog.com/2022/12/court-schedules-february-arguments-on-student-loan-relief-tech-companies-liability/">hear</a> oral arguments on a case that challenges the scope of the landmark internet law. The Court’s decision could have sweeping consequences for digital speech about abortion and reproductive health in a post-Roe America.&nbsp;</p>



<h2 class="wp-block-heading" id="h-the-background"><strong>THE BACKGROUND</strong></h2>



<p>When armed ISIS assailants <a href="https://www.bbc.com/news/world-europe-34818994">staged</a> a series of attacks in central Paris in November 2015, an American college student named Nohemi Gonzalez was among the 130 people who lost their lives. Her family has since <a href="https://www.business-humanrights.org/en/latest-news/family-of-paris-attacks-victim-files-us-lawsuit-against-facebook-google-twitter-claiming-they-provide-material-support-to-isis/">taken</a> Google (the owner of YouTube) to court. Their lawyers argue that the tech giant aided and abetted terrorism by promoting YouTube videos featuring ISIS fighters and other material that could radicalize viewers and make them want to carry out attacks like the one that killed Nohemi. Central to the case is YouTube’s recommendation algorithm, which feeds users a never-ending stream of videos in an effort to keep them hooked. Independent research has <a href="https://dl.acm.org/doi/abs/10.1145/3351095.3372879">shown</a> that the algorithm tends to promote videos that are more “extreme” or shocking than what a person might have searched to begin with. Why? Because this kind of material is more likely to capture and sustain users’ attention.</p>



<p>Section 230 protects Google from legal liability for the videos it hosts on YouTube. But does it protect Google from legal liability for recommending videos that could inspire a person to join a terrorist group and commit murder? That is the central question of <a href="https://www.supremecourt.gov/docket/docketfiles/html/public/21-1333.html">Gonzalez v. Google</a>. If the Supreme Court decides that the legal shield of Section 230 does not apply to the recommendation engine, the outcome could affect all kinds of videos on the platform. Any video that could be illegal under state laws — like abortion-related content in the post-Roe era — could put the company at risk of legal liability and would probably cause Google to more proactively censor videos that might fall afoul of the law. This could end up making abortion and reproductive health-related information much harder to access online.</p>



<p>If this all sounds wonky and technical, that’s because it is. But the Court’s decision has the potential to “dramatically reshape the internet,” according to Eric Goldman, a professor at California’s Santa Clara University School of Law specializing in internet law.&nbsp;</p>



<p>Algorithmic systems are deeply embedded in the architecture of online services. Among other things, websites and social media platforms use algorithms to recommend material to users in response to their online activity. These algorithmic recommendations are behind the personalized ads we see online, recommended videos and accounts to follow on social media sites and what pops up when we look at search engines. They create a user’s newsfeed on social media platforms like Facebook and Twitter. They have become a core feature of how the internet functions.</p>



<h2 class="wp-block-heading"><strong>WHAT ARE THE STAKES IN A POST-ROE AMERICA?</strong></h2>



<p>If the Supreme Court rules in the plaintiffs’ favor, it could open up a vast world of possible litigation, as websites and platforms move assertively to take down content that could put them at legal risk, including speech about abortion care and reproductive health. Platforms then would face the threat of litigation for recommending content that stands in violation of state laws — <a href="https://www.nytimes.com/interactive/2022/us/abortion-laws-roe-v-wade.html">including</a>, in thirteen cases, laws against abortion.&nbsp;</p>



<p>“That's going to dramatically affect [the] availability of abortion-related material because, at that point, anything that a service does that promotes or raises the profile of abortion-related material over other kinds of content would no longer be protected by Section 230, would be open for all these state criminal laws, and services simply can't tolerate that risk,” Goldman explained.&nbsp;</p>



<p>In this scenario, technology companies could not only be exposed to lawsuits but could even find themselves at risk of criminal charges for algorithmically recommending content that runs afoul of state abortion bans. One example is Texas’ anti-abortion “bounty” law, SB 8, which <a href="https://legiscan.com/TX/text/SB8/2021">deputizes</a> private citizens to sue anyone who “aids or abets” another person seeking an abortion. If the Court decides to remove Section 230’s shield for algorithmic amplification, websites and platforms could be sued for recommending content that helps a Texas resident to obtain an abortion in violation of SB 8. Most sites would likely choose to play it safe and simply remove any abortion-related speech that could expose them to criminal or legal risks.</p>



<p>The abortion information space is just one realm where this could play out if the Court decides that Section 230’s protections do not apply to algorithmic promotion of content. Anupam Chander, a law professor at Georgetown University who focuses on international tech regulation, explained: “Making companies liable for algorithmically promoting speech when they haven't themselves developed it will lead to the speech that is most controversial being removed from these online services.”</p>



<p>Goldman had similar concerns. “We’ve never had this discussion about what kind of crazy things could a state legislature do if they wanted to hold services liable for third-party content. And that's because Section 230 basically takes that power away from state legislatures,” he said. “But the Supreme Court could open that up as a new ground for the legislatures to plow. And they're going to plant some really crazy stuff in that newly fertile ground that we've never seen before.”</p>



<p>Consider the #MeToo movement. Section 230 protects platforms against defamation lawsuits for hosting content alleging sexual harassment, abuse or misconduct. Without the law’s shield, the movement could have had a different trajectory. Platforms may have taken down content that could have exposed them to lawsuits from some of the powerful people who were subjects of allegations.</p>



<p>“That kind of speech, which we have seen the internet empower over the last decade in ways that have literally reshaped society, would lead to the kind of liability concerns that would mean that it would be suppressed in the future,” Chander added. “So, when someone claims that Harvey Weinstein assaulted them, companies are in a difficult position having to assess whether or not they can leave that up when Harvey Weinstein's lawyers might be sending cease and desist and saying, ‘we're going to sue you for it for defamation.’”&nbsp;</p>



<p>Proponents of Section 230, who have long argued that changing or eliminating the law would end up disproportionately <a href="https://thehill.com/opinion/technology/458227-in-debate-over-internet-speech-law-pay-attention-to-whose-voices-are/">censoring</a> the speech of marginalized groups, are hoping to avoid this scenario. But it’s hard to predict how the Supreme Court justices will rule in this case. Section 230 is one of the rare issues in contemporary American politics that doesn’t map neatly onto partisan or ideological lines. As I <a href="https://www.codastory.com/authoritarian-tech/global-consequences-section-230/">reported</a> for Coda in 2021, conservative and liberal politicians alike have taken issue with Section 230 in recent years, introducing dozens of bills seeking to change or eliminate it. Both U.S. President Joe Biden and former president Donald Trump have called for the law to be repealed.&nbsp;</p>



<p>“This is not just a left-right issue,” Chander explained. “It has this kind of strange bedfellows character. So I think there's a real possibility here of an odd coalition both from the left and the right to essentially rewrite Section 230 and remove much of its protections.”</p>





<p>If the Supreme Court decides that platforms are on the hook legally for recommendation algorithms, it may be harder for people seeking abortions to come across the information they need, say, in a Google search or on a social media platform like Instagram, as those companies will probably take down (or geoblock) any content that could put them at legal risk. It feels almost impossible to imagine this scenario in the U.S., where we expect to find the world at our fingertips every time we look at our phones. But that reality has been constructed, in large part, on the shoulders of Section 230. Without it, the free flow of information we have come to expect in the digital era may become a relic of the past — when abortion was a constitutional right and information about it was accessible online. The Supreme Court’s decision on this tech policy case could, once again, turn back the clock on abortion rights.</p>

<div class="wp-block-group alignright converted-related-posts is-style-meta-info is-layout-flow wp-block-group-is-layout-flow">
<h4 class="wp-block-heading">Related Articles</h4>



<div class="wp-block-group is-vertical is-layout-flex wp-container-core-group-is-layout-8cf370e7 wp-block-group-is-layout-flex">
<div class="wp-block-fabrica-article-preview wp-block-fabrica-article-preview--alignment-left wp-block-fabrica-article-preview--external-source-local is-style-featured category-infodemic category-newsletters-category author-cap-rebekah-robinson ">
<div class="wp-block-fabrica-article-preview-image is-style-round"><a class="wp-block-fabrica-article-preview-image__link" href="https://www.codastory.com/newsletters/tiktok-abortion-code/"><img class="wp-block-fabrica-article-preview-image__image" src="https://www.codastory.com/wp-content/uploads/2022/08/abortion-tiktok-250x250.jpeg" srcset="https://www.codastory.com/wp-content/uploads/2022/08/abortion-tiktok-250x250.jpeg 250w, https://www.codastory.com/wp-content/uploads/2022/08/abortion-tiktok-72x72.jpeg 72w, https://www.codastory.com/wp-content/uploads/2022/08/abortion-tiktok-232x232.jpeg 232w" width="250" height="250"/></a></div>



<div class="wp-block-group is-vertical is-layout-flex wp-container-core-group-is-layout-8cf370e7 wp-block-group-is-layout-flex">
<h2 class="wp-block-fabrica-article-preview-title is-style-sans has-small-font-size"><a class="wp-block-fabrica-article-preview-title__link" href="https://www.codastory.com/newsletters/tiktok-abortion-code/">Women develop a code to discuss abortion on TikTok</a></h2>


<div class="wp-block-post-author-name">Rebekah Robinson</div></div>
</div>



<div class="wp-block-fabrica-article-preview wp-block-fabrica-article-preview--alignment-left wp-block-fabrica-article-preview--external-source-local is-style-featured category-auth-tech category-newsletters-category post_tag-content-moderation post_tag-newsletter author-cap-ericahellerstein ">
<div class="wp-block-fabrica-article-preview-image is-style-round"><a class="wp-block-fabrica-article-preview-image__link" href="https://www.codastory.com/newsletters/auth-tech/meta-abortion-content-moderation/"><img class="wp-block-fabrica-article-preview-image__image" src="https://www.codastory.com/wp-content/uploads/2022/07/abortion--250x250.jpeg" srcset="https://www.codastory.com/wp-content/uploads/2022/07/abortion--250x250.jpeg 250w, https://www.codastory.com/wp-content/uploads/2022/07/abortion--72x72.jpeg 72w, https://www.codastory.com/wp-content/uploads/2022/07/abortion--232x232.jpeg 232w" width="250" height="250"/></a></div>



<div class="wp-block-group is-vertical is-layout-flex wp-container-core-group-is-layout-8cf370e7 wp-block-group-is-layout-flex">
<h2 class="wp-block-fabrica-article-preview-title is-style-sans has-small-font-size"><a class="wp-block-fabrica-article-preview-title__link" href="https://www.codastory.com/newsletters/auth-tech/meta-abortion-content-moderation/">Roe reversal puts spotlight on Meta’s abortion content moderation policies</a></h2>


<div class="wp-block-post-author-name">Erica Hellerstein</div></div>
</div>



<div class="wp-block-fabrica-article-preview wp-block-fabrica-article-preview--alignment-left wp-block-fabrica-article-preview--external-source-local is-style-featured category-authoritarian-tech post_tag-censorship post_tag-content-moderation post_tag-feature post_tag-privacy-laws post_tag-social-media-censorship author-cap-ericahellerstein ">
<div class="wp-block-fabrica-article-preview-image is-style-round"><a class="wp-block-fabrica-article-preview-image__link" href="https://www.codastory.com/authoritarian-tech/global-consequences-section-230/"><img class="wp-block-fabrica-article-preview-image__image" src="https://www.codastory.com/wp-content/uploads/2021/06/230.gif" width="1920" height="1080"/></a></div>



<div class="wp-block-group is-vertical is-layout-flex wp-container-core-group-is-layout-8cf370e7 wp-block-group-is-layout-flex">
<h2 class="wp-block-fabrica-article-preview-title is-style-sans has-small-font-size"><a class="wp-block-fabrica-article-preview-title__link" href="https://www.codastory.com/authoritarian-tech/global-consequences-section-230/">How changing a 26-word US internet law could impact online expression everywhere</a></h2>


<div class="wp-block-post-author-name">Erica Hellerstein</div></div>
</div>
</div>
</div>
<p>The post <a href="https://www.codastory.com/surveillance-and-control/scotus-section-230-abortion/">What a law designed to protect the internet has to do with abortion</a> appeared first on <a href="https://www.codastory.com">Coda Story</a>.</p>
]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">39414</post-id>	</item>
		<item>
		<title>China is gaining control of the world’s data as the US stands by</title>
		<link>https://www.codastory.com/surveillance-and-control/data-trafficking-china-us-tiktok/</link>
		
		<dc:creator><![CDATA[Liam Scott]]></dc:creator>
		<pubDate>Thu, 17 Nov 2022 14:45:59 +0000</pubDate>
				<category><![CDATA[Surveillance and Control]]></category>
		<category><![CDATA[Algorithms]]></category>
		<category><![CDATA[China]]></category>
		<category><![CDATA[Q&A]]></category>
		<category><![CDATA[TikTok]]></category>
		<guid isPermaLink="false">https://www.codastory.com/?p=36546</guid>

					<description><![CDATA[<p>Global data trafficking presents security risks that most countries are not prepared to handle, Aynne Kokas argues in her new book</p>
<p>The post <a href="https://www.codastory.com/surveillance-and-control/data-trafficking-china-us-tiktok/">China is gaining control of the world’s data as the US stands by</a> appeared first on <a href="https://www.codastory.com">Coda Story</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>There came a point ten years ago when Aynne Kokas realized that she could no longer keep WeChat on her personal phone. She had begun research on what would eventually become her <a href="https://global.oup.com/academic/product/trafficking-data-9780197620502?cc=us&amp;lang=en&amp;">new book</a>, “Trafficking Data: How China is Winning the Battle for Digital Sovereignty,” published this month.&nbsp;</p>



<p>WeChat is an omnipresent Chinese messaging app, and Kokas, a media studies professor at the University of Virginia, needed it to talk to Chinese sources for her research. But, as Kokas told me, it soon became “a very meta experience.” To have WeChat on her personal phone meant that “you were subjecting yourself to precisely the type of surveillance that you were writing about.”</p>



<p>In the book, Kokas analyzes how Chinese firms and the Chinese government gather data on U.S. citizens for political and commercial gain, putting U.S. national security at risk. China is able to do this, Kokas points out, in part because the U.S. government does not have substantial regulations in place to protect users and their data.</p>



<p>“By tracing how China and the US have shaped the global movement of data, I hope this book empowers citizens around the world to navigate the complex terrain created by Silicon Valley, Washington, and Beijing,” she writes.</p>



<p>I recently spoke with Kokas on the phone. Our conversation has been edited for length and clarity.&nbsp;</p>



<p><strong>What do “digital sovereignty” and “data trafficking” mean in layman's terms?</strong>&nbsp;</p>



<p>Digital sovereignty is the idea of control over a country’s digital resources. Digital sovereignty is something that we see in countries that are trying to protect their digital domain from oversight from other countries. The Chinese government has a more expansive vision called cyber sovereignty, which is that any digital space that a country touches should be part of their digital domain.</p>





<p>Data trafficking is the movement of data from one country to another without the consent of users and without their understanding of the implications of their data being moved between national data regimes. For example, if I sign up for TikTok here in the U.S. and I find out that my data has been accessed in another country, that would be data trafficking.&nbsp;</p>



<p><strong>My favorite line in the book is when you write, “Most people are simply not exciting intelligence targets.” So what are the implications of data trafficking for most Americans in their daily lives?</strong>&nbsp;</p>



<p>People are afraid that they are individually going to be targeted, and there are some scary stories, but ultimately the more interesting data for the Chinese government and for Chinese firms is actually at scale. So while you might not personally be interesting, you plus all of your neighbors, or you plus all of the people in your state, yield really rich insights that can enable the tracking and mapping of a whole society.</p>



<p>And while most people aren’t that interesting, there are specific subgroups that face intensive targeting, like Hong Kong democracy activists, as well as Uyghur and Tibetan activists.&nbsp;</p>



<p>I also think there are other layers that are significant. One is economic risk. U.S. companies can’t gather data in China the same way that Chinese companies can in the United States, and that creates a fundamental asymmetry in the development of the digital economy in ways that will have long-standing implications for the development of products. At a certain point, it’s not necessarily just about spying or surveillance. It’s about what types of products you can build.</p>



<p>The third issue is national security. These platforms are becoming essential in daily life and the functioning of society. For example, TikTok now functions as a form of critical communications infrastructure. Chinese firms have also become involved in gathering and using health data and agricultural data from the United States. If that breaks down or if the Chinese government decides to pull participation from these firms, which they can do, it leads to a fundamental destabilization of key areas in the U.S. and global economy — areas like communication, health, food production.&nbsp;</p>



<p>That’s not a risk that I think most people want to take.</p>



<p><strong>Do you think the United States is at fault for not better protecting user data? Or is China more at fault for taking advantage of those weaknesses?&nbsp;</strong></p>



<p>A lot of China’s ability to go into other countries and propose tech platforms that rapidly gather data builds on the fact that U.S.-based companies have already been there. A great example of this is TikTok being officially based in the Cayman Islands. This is a classic move by U.S. firms to escape U.S. government scrutiny. And TikTok adopted this, so while their headquarters are officially in Beijing, they’re domiciled in the Cayman Islands. The other thing that U.S. firms pioneered was a lack of algorithmic transparency. And that’s at the foundation of a lot of these business models from which many Chinese entrepreneurs learn to grow their businesses.</p>





<p>The first and most important thing the U.S. government should do is pass national data regulations that have actual enforcement requirements in place. But there are significant differences within the U.S. government about what is and is not acceptable in terms of government oversight over corporations, as well as oversight over data. And even if laws are passed, enforcement is still really challenging.&nbsp;</p>



<p><strong>You present these issues as being contested, but it seems that the U.S. isn’t putting up much of a fight.&nbsp;</strong></p>



<p>The title should be something like, “China is taking over the digital world, and the U.S. kind of agreed to it.” But people I interviewed in the U.S. government and tech corporations would argue that by not heavily regulating the U.S. digital landscape, U.S. platforms are able to grow and compete with China that way. The other aspect is this resistance to changing U.S. data governance policies because that would be “letting China win” by adopting too many aspects of the Chinese model. I don’t fully agree with that framework.&nbsp;</p>



<p><strong>You wrote that you felt a sense of urgency while working on the book. Why did you feel that way?&nbsp;</strong></p>



<p>A lot of people outside China haven’t experienced China’s digital control directly, so they don’t understand the seriousness of what it means for that model to be exported and how difficult it is to put the genie back in the bottle once it’s out.</p>

<div class="wp-block-group alignright converted-related-posts is-style-meta-info is-layout-flow wp-block-group-is-layout-flow">
<h4 class="wp-block-heading">Related Articles</h4>



<div class="wp-block-group is-vertical is-layout-flex wp-container-core-group-is-layout-8cf370e7 wp-block-group-is-layout-flex">
<div class="wp-block-fabrica-article-preview wp-block-fabrica-article-preview--alignment-left wp-block-fabrica-article-preview--external-source-local is-style-featured category-auth-tech category-newsletters-category post_tag-china post_tag-internet-censorship post_tag-newsletter post_tag-russia author-cap-liam-scott ">
<div class="wp-block-fabrica-article-preview-image is-style-round"><a class="wp-block-fabrica-article-preview-image__link" href="https://www.codastory.com/newsletters/breaking-up-global-internet/"><img class="wp-block-fabrica-article-preview-image__image" src="https://www.codastory.com/wp-content/uploads/2022/10/Photo-by-Mikhail-Svetlov-Getty-Images-250x250.jpg" srcset="https://www.codastory.com/wp-content/uploads/2022/10/Photo-by-Mikhail-Svetlov-Getty-Images-250x250.jpg 250w, https://www.codastory.com/wp-content/uploads/2022/10/Photo-by-Mikhail-Svetlov-Getty-Images-72x72.jpg 72w, https://www.codastory.com/wp-content/uploads/2022/10/Photo-by-Mikhail-Svetlov-Getty-Images-232x232.jpg 232w" width="250" height="250"/></a></div>



<div class="wp-block-group is-vertical is-layout-flex wp-container-core-group-is-layout-8cf370e7 wp-block-group-is-layout-flex">
<h2 class="wp-block-fabrica-article-preview-title is-style-sans has-small-font-size"><a class="wp-block-fabrica-article-preview-title__link" href="https://www.codastory.com/newsletters/breaking-up-global-internet/">Repressive regimes around the world are nationalizing the internet and isolating people</a></h2>


<div class="wp-block-post-author-name">Liam Scott</div></div>
</div>



<div class="wp-block-fabrica-article-preview wp-block-fabrica-article-preview--alignment-left wp-block-fabrica-article-preview--external-source-local is-style-featured category-surveillance-and-control post_tag-feature post_tag-tiktok author-cap-isobelcockerell ">
<div class="wp-block-fabrica-article-preview-image is-style-round"><a class="wp-block-fabrica-article-preview-image__link" href="https://www.codastory.com/surveillance-and-control/tiktok-uyghur-china/"><img class="wp-block-fabrica-article-preview-image__image" src="https://www.codastory.com/wp-content/uploads/2019/09/For-Coda-250x250.gif" srcset="https://www.codastory.com/wp-content/uploads/2019/09/For-Coda-250x250.gif 250w, https://www.codastory.com/wp-content/uploads/2019/09/For-Coda-72x72.gif 72w, https://www.codastory.com/wp-content/uploads/2019/09/For-Coda-232x232.gif 232w, https://www.codastory.com/wp-content/uploads/2019/09/For-Coda-300x300.gif 300w" width="250" height="250"/></a></div>



<div class="wp-block-group is-vertical is-layout-flex wp-container-core-group-is-layout-8cf370e7 wp-block-group-is-layout-flex">
<h2 class="wp-block-fabrica-article-preview-title is-style-sans has-small-font-size"><a class="wp-block-fabrica-article-preview-title__link" href="https://www.codastory.com/surveillance-and-control/tiktok-uyghur-china/">How TikTok opened a window into China’s police state</a></h2>


<div class="wp-block-post-author-name">Isobel Cockerell</div></div>
</div>



<div class="wp-block-fabrica-article-preview wp-block-fabrica-article-preview--alignment-left wp-block-fabrica-article-preview--external-source-local is-style-featured category-authoritarian-tech post_tag-dispatch post_tag-hong-kong author-cap-nithincoca ">
<div class="wp-block-fabrica-article-preview-image is-style-round"><a class="wp-block-fabrica-article-preview-image__link" href="https://www.codastory.com/authoritarian-tech/china-digital-wall-tibet/"><img class="wp-block-fabrica-article-preview-image__image" src="https://www.codastory.com/wp-content/uploads/2019/05/TIBETCoda-250x250.gif" srcset="https://www.codastory.com/wp-content/uploads/2019/05/TIBETCoda-250x250.gif 250w, https://www.codastory.com/wp-content/uploads/2019/05/TIBETCoda-72x72.gif 72w, https://www.codastory.com/wp-content/uploads/2019/05/TIBETCoda-232x232.gif 232w, https://www.codastory.com/wp-content/uploads/2019/05/TIBETCoda-300x300.gif 300w" width="250" height="250"/></a></div>



<div class="wp-block-group is-vertical is-layout-flex wp-container-core-group-is-layout-8cf370e7 wp-block-group-is-layout-flex">
<h2 class="wp-block-fabrica-article-preview-title is-style-sans has-small-font-size"><a class="wp-block-fabrica-article-preview-title__link" href="https://www.codastory.com/authoritarian-tech/china-digital-wall-tibet/">China’s Digital Wall Around Tibet</a></h2>


<div class="wp-block-post-author-name">Nithin Coca</div></div>
</div>
</div>
</div>
<p>The post <a href="https://www.codastory.com/surveillance-and-control/data-trafficking-china-us-tiktok/">China is gaining control of the world’s data as the US stands by</a> appeared first on <a href="https://www.codastory.com">Coda Story</a>.</p>
]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">36546</post-id>	</item>
		<item>
		<title>As anxiety about crime peaks, US cities look to surveillance tech. But does it actually work?</title>
		<link>https://www.codastory.com/surveillance-and-control/us-city-surveillance/</link>
		
		<dc:creator><![CDATA[Erica Hellerstein]]></dc:creator>
		<pubDate>Thu, 10 Nov 2022 16:18:54 +0000</pubDate>
				<category><![CDATA[Surveillance and Control]]></category>
		<category><![CDATA[Algorithms]]></category>
		<category><![CDATA[Feature]]></category>
		<category><![CDATA[Police surveillance]]></category>
		<category><![CDATA[United States]]></category>
		<guid isPermaLink="false">https://www.codastory.com/?p=36346</guid>

					<description><![CDATA[<p>From San Francisco to New York, even progressive enclaves are turning to authoritarian tech to appear tough on crime</p>
<p>The post <a href="https://www.codastory.com/surveillance-and-control/us-city-surveillance/">As anxiety about crime peaks, US cities look to surveillance tech. But does it actually work?</a> appeared first on <a href="https://www.codastory.com">Coda Story</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>In the run-up to the U.S. midterm elections, public anxiety about crime became a flashpoint. While campaigns for right-wing candidates in battleground states <a href="https://www.thedailybeast.com/these-2022-midterms-senate-republican-candidates-are-running-on-crime-without-any-ammo">painted</a> alarming pictures of cities riddled with crime under the control of Democrats, voters, too, expressed real concern about the issue. An October survey by Pew Research Center <a href="https://www.pewresearch.org/politics/2022/10/20/the-midterm-elections-and-views-of-biden/#h-top-midterm-issues-the-economy-future-of-democracy">showed</a> that 61% of registered voters viewed violent crime as “very important” to their vote.</p>



<p>Even in Democratic-majority cities, public anxiety about crime seems to be peaking. Determined to assuage people’s concerns (and keep their votes), major cities including San Francisco, Chicago, and New Orleans are turning to technical surveillance as a solution.</p>



<p>This marks a big shift, especially for a city like San Francisco, which in 2019 became the first U.S. city to ban the use of facial recognition technology by local public agencies, including the police. Boston, Portland, Oakland, and Jackson, Mississippi, have since followed San Francisco’s lead, passing similar restrictions of their own that prevent public agencies from using privately developed technologies to identify individuals in the course of criminal investigations or other procedures.&nbsp;</p>





<p>Spearheaded by privacy advocates and buoyed by mass protests against police abuse after the killing of George Floyd, these policies were intended to keep cities from treading into the legal and ethical gray area where facial recognition technology currently sits.&nbsp;</p>



<p>But now the tide seems to be turning. A recent poll <a href="https://finance.yahoo.com/news/san-francisco-standard-voter-poll-134500544.html">found</a> some 65% of voters in San Francisco report feeling less safe today than they did in 2019.</p>



<p>“We went from a long-term view to an extremely short-term view,” explained Tracy Rosenberg, the advocacy director for Oakland Privacy, a group that advocates for surveillance oversight in the Bay Area.&nbsp;</p>



<p>“The narrative that was dominant in 2019 was the long-term implications of the ubiquitous use of facial recognition, which is basically the end of public anonymity. And I think that narrative has largely been replaced by a narrative that [says]: ‘Who cares about the future when right now your car is getting stolen or your store is being looted?’ And that basically the short-term implications on your life right now are more important than any sort of future surveillance state that might develop.”</p>



<figure class="wp-block-image size-large"><img src="https://www.codastory.com/wp-content/uploads/2022/11/Mel-Melcon-Los-Angeles-Times-via-Getty-Images-1800x1147.jpg" alt="" class="wp-image-36378"/><figcaption class="wp-element-caption">Security cameras on Rodeo Drive, part of an extensive network of surveillance cameras throughout Beverly Hills. Photo:&nbsp;Mel Melcon / Los Angeles Times via Getty Images</figcaption></figure>



<p class="has-drop-cap">Public concern about crime has clearly gone up, but national crime data reveals a complex picture. The murder rate <a href="https://www.nytimes.com/2022/09/23/briefing/crime-rates-murder-robberies-us.html">spiked</a> in 2021, reaching its highest point in nearly 25 years, but now appears to be decreasing, with homicides in major cities <a href="https://www.ahdatalytics.com/dashboards/ytd-murder-comparison/">down</a> nearly 5% in 2022. All other kinds of violent crime have held steady or <a href="https://www.pewresearch.org/fact-tank/2022/10/31/violent-crime-is-a-key-midterm-voting-issue-but-what-does-the-data-say/">dropped</a> since 2019, according to Pew. And cities’ experiences with violent crime are not uniform. As of November 2022, murders have <a href="https://www.ahdatalytics.com/dashboards/ytd-murder-comparison/">increased</a> by nearly 30% in New Orleans and Charlotte compared to the same time period in 2021, and decreased in others, including San Francisco and Oakland.&nbsp;</p>



<p>Despite San Francisco’s pioneering ban on the use of facial recognition technology, in September 2022 the city’s Board of Supervisors passed a policy that will allow law enforcement to access the video footage of private security cameras in real time. During a 15-month pilot phase, San Francisco police will be able to view up to 24 hours of live video footage from private surveillance cameras during criminal investigations and large public events.&nbsp;</p>



<p>In a letter to city officials, a coalition opposing the ordinance, including the American Civil Liberties Union (ACLU) of Northern California and the San Francisco Public Defender's Office, <a href="https://www.aclunc.org/sites/default/files/2022.07.8_SFPD_cam_policy_coalition_letter.pdf">argued</a> the proposal “massively expands police surveillance” and could give officers the ability to “surveil any large gathering of people in San Francisco, including the crowds that gather for the Pride Parade, street markets, and other political and civic events.”</p>



<p>The Electronic Frontier Foundation’s Matthew Guariglia <a href="https://www.eff.org/deeplinks/2022/09/san-franciscos-board-supervisors-grants-police-more-surveillance-powers">described</a> the Board’s decision as an attempt to “[put] voters at ease that something, anything is being done about crime.”</p>





<p>These San Francisco legislators are not alone. Their decision reflects a broader trend playing out in left-leaning cities nationwide. Cities are expanding the use of surveillance technology to reduce crime, or at least to assuage some citizens’ concerns about crime, sometimes without clear evidence that these tools actually do either. These cities also risk entrenching a permanent surveillance infrastructure that may be difficult to dismantle down the road. “The history of surveillance suggests that it's not easy to put the genie back in the bottle,” argues Rosenberg.&nbsp;</p>



<p>One of the most high-profile examples of this dynamic comes out of New Orleans, where lawmakers are poised to expand police surveillance less than two years after passing a sweeping facial recognition ban. In July, the New Orleans City Council voted to allow the city police department to <a href="https://www.nola.com/news/crime_police/article_d31cb51a-090c-11ed-8929-7bc8922a7d0d.html">request</a> access to facial recognition technology from the Louisiana State Analytical and Fusion Exchange, which analyzes data for police, to investigate certain kinds of crimes, <a href="https://legis.la.gov/Legis/Law.aspx?d=78337">including</a> rape, murder, carjacking, robbery, and “purse snatching.”&nbsp;</p>



<p>The ordinance <a href="https://cityofno.granicus.com/MetaViewer.php?view_id=42&amp;clip_id=4142&amp;meta_id=591807">passed</a> amid a surge in violent crime in New Orleans not seen since the mid-1990s. In early July, just weeks before the city council approved the policy, New Orleans reportedly <a href="https://www.nola.com/news/crime_police/article_30197d8a-f95d-11ec-bafa-3f7db4d5f198.html">had</a> the highest murder rate in the nation. Supporters of the measure, including the city’s mayor, <a href="https://nola.gov/next/mayors-office/news/articles/july-2022/mayor-cantrell-praises-passage-of-facial-recognition-technology-ordinance/">claimed</a> that it would help police rein in crime by helping officers track down perpetrators more effectively.&nbsp;</p>



<p>This raises a critical question: Do these tools actually help reduce or solve crimes? As one city council member who voted against the New Orleans policy <a href="https://www.nola.com/news/crime_police/article_d31cb51a-090c-11ed-8929-7bc8922a7d0d.html">pointed out</a>, the argument was not backed up by empirical evidence.&nbsp;</p>



<p>During a hearing on the vote, an official with the police department admitted that he had no information about how frequently the department used facial recognition before it was banned in 2020 and whether its use had led to any arrests or convictions. “You have no data, sitting here today, telling me that this actually works, that it leads to arrests, admissions or clearances,” the councilmember Lesli Harris said.&nbsp;</p>



<p>The Louisiana chapter of the ACLU blasted the council’s decision to “expand racist technologies,” <a href="https://www.laaclu.org/en/press-releases/aclu-louisiana-issues-statement-after-new-orleans-city-council-reverses-surveillance">highlighting</a> research that has found that facial recognition disproportionately misidentifies women and people of color. A 2019 federal study <a href="https://www.nytimes.com/2019/12/19/technology/facial-recognition-bias.html">found</a> that the majority of facial recognition systems are biased, misidentifying Black and Asian faces at significantly higher rates than their white counterparts.&nbsp;</p>



<p>These flawed matches have real-world consequences: At least three Black men in the U.S. have been wrongfully <a href="https://www.wired.com/story/wrongful-arrests-ai-derailed-3-mens-lives/">arrested</a> after facial recognition software incorrectly identified them for crimes they did not commit.</p>



<p>Elsewhere, cities are embracing a controversial gunshot detection surveillance technology that a study from the Northwestern School of Law <a href="https://www.macarthurjustice.org/shotspotter-generated-over-40000-dead-end-police-deployments-in-chicago-in-21-months-according-to-new-study/">found</a> to be “inaccurate, expensive, and dangerous,” sending police on “unfounded deployments” in predominantly Black and Latino neighborhoods. The technology, <a href="https://www.shotspotter.com/system/content-uploads/SST_FAQ_January_2018.pdf">ShotSpotter</a>, uses a system of discrete acoustic sensors to identify the location of gunshots and then send an alert to the police, who can then decide to send an officer to the scene of the alleged crime.</p>



<p>The firm <a href="https://www.shotspotter.com/wp-content/uploads/2021/07/ShotSpotter-Respond-FAQ-Jul-2021.pdf">has</a> contracts in over 120 cities nationally, some of which have come under fire for pouring millions into a technology that critics say is error-prone and ineffective. ShotSpotter <a href="https://www.shotspotter.com/blog/the-aclu-was-wrong-about-shotspotter-technology-heres-why/">contests</a> claims of inaccuracy, saying the technology has a 97% accuracy rate. But a 2021 analysis of the Chicago Police Department’s use of ShotSpotter by the city’s Office of Inspector General <a href="https://igchicago.org/2021/08/24/oig-finds-that-shotspotter-alerts-rarely-lead-to-evidence-of-a-gun-related-crime-and-that-presence-of-the-technology-changes-police-behavior/">found</a> that just 9% of alerts were linked to gun-related crimes.</p>



<p>A recent class action lawsuit, filed by the MacArthur Justice Center at Northwestern University, <a href="https://www.macarthurjustice.org/wp-content/uploads/2022/07/Complaint-file-stamped.pdf">alleges</a> that the city “has intentionally deployed ShotSpotter along stark racial lines and uses ShotSpotter to target Black and Latinx people.”</p>



<p>Despite such criticisms about the technology and its impact on policing, cities are still using it. Earlier this month, the Detroit City Council ended a months-long, divisive debate about whether to expand ShotSpotter when it approved a $7 million contract to deploy the system to 10 new neighborhoods in the city. Detroit’s decision came just days after Cleveland’s City Council voted to <a href="https://www.cleveland.com/news/2022/10/cleveland-expected-to-approve-controversial-shotspotter-contract-stimulus-watch.html">quadruple</a> the size of ShotSpotter’s current use area. Other cities that have recently moved to expand or renew contracts include <a href="https://www.latimes.com/california/newsletter/2021-10-13/shotstopper-police-sacramento-essential-california">Sacramento</a>, <a href="https://www.houstonchronicle.com/politics/houston/article/Houston-may-expand-gunshot-detection-system-that-16750119.php">Houston</a> and <a href="https://chicago.suntimes.com/crime/2021/8/19/22633412/activists-slam-city-shotspotter-contract-gunshot-detection-system-policing">Chicago</a>.&nbsp;&nbsp;</p>



<p>Meanwhile, in New York, Mayor Eric Adams, whose ‘90s-style “tough on crime” rhetoric <a href="https://www.nytimes.com/2022/04/29/opinion/eric-adams-new-york-crime.html">has been</a> a hallmark of his campaign and time in office, <a href="https://twitter.com/SallyGold/status/1513956657381969929">is</a> a vocal proponent of high-tech policing, including facial recognition and gunshot detection technology like ShotSpotter. Adams, a former New York City police officer, has sought to dramatically <a href="https://www.politico.com/news/2022/02/08/adams-police-surveillance-technology-00006230">expand</a> the use of facial recognition within the police department and has expressed interest in installing metal detectors in city subway stations and replacing school metal detectors with new technology that would scan students for weapons.&nbsp;</p>



<p>The overall picture, says Albert Fox Cahn, the founder and executive director of the Surveillance Technology Oversight Project in New York, is one of “surveillance opportunism” in which technology companies are pitching surveillance systems to lawmakers and law enforcement agencies seeking to quell concerns about public safety. To promote these technologies, Fox Cahn added, some public officials have positioned the expansion of surveillance in cities as a more humane alternative to traditional policing.</p>



<p>Guariglia of the Electronic Frontier Foundation explained, “Surveillance doesn’t come without the iron fist of the police department. Because even if they capture something on surveillance and they want to arrest a person, that person is not going to be arrested by a camera. They're going to be arrested by a person with a nightstick and handcuffs and a gun.” At the end of the day, this trend pushes cities toward a vision of citywide surveillance <a href="https://www.codastory.com/newsletters/russia-smart-cities/">favored</a> by some of the world’s most authoritarian regimes.</p>



<p>For now, San Francisco’s facial recognition ban remains intact. But some civil liberties advocates worry that the decision by the city’s Board of Supervisors to grant the police wider surveillance powers could give license to other cities and jurisdictions to follow suit.&nbsp;</p>



<p>“I think that's one of the most disturbing parts of what happened in San Francisco,” explained Oakland Privacy’s Rosenberg. “Because when you don't have those facial recognition bans in place, the green light from a big city, a progressive city, a city that's been famous for innovations in surveillance and looking at things with a critical lens — I think it provides a sort of implicit invitation to other cities that don't have these bans in place to jump on the bandwagon.”</p>



<p>Still, as many privacy experts are quick to point out, it’s unclear if this trend will have staying power. They point to the general ebbs and flows of crime — at its peak, a sense of public insecurity tends to garner more support for policing and a greater willingness to erode civil liberties than when citizens feel safer — as well as the strength of the growing anti-surveillance movement.&nbsp;</p>



<p>“Five years ago, it was unimaginable that there could have been a ban on any type of surveillance technology,” Matt Cagle, a senior staff attorney for the Technology and Civil Liberties Program at the ACLU of Northern California, remarked. “When we started talking about this at the ACLU, we got laughed at by folks in political spaces when we proposed the idea of banning facial recognition.” Now, though, he adds, there are “more groups who are opposed to government surveillance at the local level…by an order of magnitude over what that was five or ten years ago. And I think that’s an important trend even though on the policy itself, the votes didn’t swing the right way this time.”&nbsp;</p>



<p>In the next five years, we will see if those groups have the power to put the genie back in the bottle.</p>

<div class="wp-block-group alignright converted-related-posts is-style-meta-info is-layout-flow wp-block-group-is-layout-flow">
<h4 class="wp-block-heading">Related Articles</h4>



<div class="wp-block-group is-vertical is-layout-flex wp-container-core-group-is-layout-8cf370e7 wp-block-group-is-layout-flex">
<div class="wp-block-fabrica-article-preview wp-block-fabrica-article-preview--alignment-left wp-block-fabrica-article-preview--external-source-local is-style-featured category-surveillance-and-control post_tag-anti-migrant post_tag-facial-recognition post_tag-feature post_tag-privacy-laws post_tag-surveillance post_tag-united-states author-cap-ericahellerstein ">
<div class="wp-block-fabrica-article-preview-image is-style-round"><a class="wp-block-fabrica-article-preview-image__link" href="https://www.codastory.com/surveillance-and-control/alternatives-to-detention-immigration/"><img class="wp-block-fabrica-article-preview-image__image" src="https://www.codastory.com/wp-content/uploads/2022/05/Surveillance_CodaStory_AK_Main_300dpi-250x250.jpg" srcset="https://www.codastory.com/wp-content/uploads/2022/05/Surveillance_CodaStory_AK_Main_300dpi-250x250.jpg 250w, https://www.codastory.com/wp-content/uploads/2022/05/Surveillance_CodaStory_AK_Main_300dpi-72x72.jpg 72w, https://www.codastory.com/wp-content/uploads/2022/05/Surveillance_CodaStory_AK_Main_300dpi-232x232.jpg 232w" width="250" height="250"/></a></div>



<div class="wp-block-group is-vertical is-layout-flex wp-container-core-group-is-layout-8cf370e7 wp-block-group-is-layout-flex">
<h2 class="wp-block-fabrica-article-preview-title is-style-sans has-small-font-size"><a class="wp-block-fabrica-article-preview-title__link" href="https://www.codastory.com/surveillance-and-control/alternatives-to-detention-immigration/">‘I felt like I was a prisoner’: The rapid rise of US immigration authorities’ electronic surveillance programs</a></h2>


<div class="wp-block-post-author-name">Erica Hellerstein</div></div>
</div>



<div class="wp-block-fabrica-article-preview wp-block-fabrica-article-preview--alignment-left wp-block-fabrica-article-preview--external-source-local is-style-featured category-surveillance-and-control post_tag-anti-migrant post_tag-border-surveillance post_tag-feature post_tag-mexico post_tag-police-surveillance coda_storyline-surveillance-and-borders author-cap-ericahellerstein ">
<div class="wp-block-fabrica-article-preview-image is-style-round"><a class="wp-block-fabrica-article-preview-image__link" href="https://www.codastory.com/surveillance-and-control/us-border-surveillance/"><img class="wp-block-fabrica-article-preview-image__image" src="https://www.codastory.com/wp-content/uploads/2021/07/USBorderSureveillanceBiden1-250x250.jpg" srcset="https://www.codastory.com/wp-content/uploads/2021/07/USBorderSureveillanceBiden1-250x250.jpg 250w, https://www.codastory.com/wp-content/uploads/2021/07/USBorderSureveillanceBiden1-72x72.jpg 72w, https://www.codastory.com/wp-content/uploads/2021/07/USBorderSureveillanceBiden1-232x232.jpg 232w" width="250" height="250"/></a></div>



<div class="wp-block-group is-vertical is-layout-flex wp-container-core-group-is-layout-8cf370e7 wp-block-group-is-layout-flex">
<h2 class="wp-block-fabrica-article-preview-title is-style-sans has-small-font-size"><a class="wp-block-fabrica-article-preview-title__link" href="https://www.codastory.com/surveillance-and-control/us-border-surveillance/">Between the US and Mexico, a corridor of surveillance becomes lethal</a></h2>


<div class="wp-block-post-author-name">Erica Hellerstein</div></div>
</div>



<div class="wp-block-fabrica-article-preview wp-block-fabrica-article-preview--alignment-left wp-block-fabrica-article-preview--external-source-local is-style-featured category-surveillance-and-control post_tag-dispatch post_tag-police-surveillance post_tag-surveillance author-cap-caitlinthompson ">
<div class="wp-block-fabrica-article-preview-image is-style-round"><a class="wp-block-fabrica-article-preview-image__link" href="https://www.codastory.com/surveillance-and-control/san-francisco-protests-surveillance/"><img class="wp-block-fabrica-article-preview-image__image" src="https://www.codastory.com/wp-content/uploads/2020/09/SFPD_used_private_cameras_for_protest_surveillance_-_CT-250x250.png" srcset="https://www.codastory.com/wp-content/uploads/2020/09/SFPD_used_private_cameras_for_protest_surveillance_-_CT-250x250.png 250w, https://www.codastory.com/wp-content/uploads/2020/09/SFPD_used_private_cameras_for_protest_surveillance_-_CT-72x72.png 72w, https://www.codastory.com/wp-content/uploads/2020/09/SFPD_used_private_cameras_for_protest_surveillance_-_CT-232x232.png 232w" width="250" height="250"/></a></div>



<div class="wp-block-group is-vertical is-layout-flex wp-container-core-group-is-layout-8cf370e7 wp-block-group-is-layout-flex">
<h2 class="wp-block-fabrica-article-preview-title is-style-sans has-small-font-size"><a class="wp-block-fabrica-article-preview-title__link" href="https://www.codastory.com/surveillance-and-control/san-francisco-protests-surveillance/">How San Francisco police surveillance closed in on Black Lives Matter protests</a></h2>


<div class="wp-block-post-author-name">Caitlin Thompson</div></div>
</div>
</div>
</div>
<p>The post <a href="https://www.codastory.com/surveillance-and-control/us-city-surveillance/">As anxiety about crime peaks, US cities look to surveillance tech. But does it actually work?</a> appeared first on <a href="https://www.codastory.com">Coda Story</a>.</p>
]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">36346</post-id>	</item>
		<item>
		<title>A photographer and artist walk into a fake news factory</title>
		<link>https://www.codastory.com/authoritarian-tech/jonas-bendiksen-book-of-veles/</link>
		
		<dc:creator><![CDATA[Katia Patin]]></dc:creator>
		<pubDate>Tue, 26 Oct 2021 11:33:35 +0000</pubDate>
				<category><![CDATA[Authoritarian Technology]]></category>
		<category><![CDATA[Algorithms]]></category>
		<category><![CDATA[Art & Surveillance]]></category>
		<category><![CDATA[Artificial intelligence]]></category>
		<category><![CDATA[Photography]]></category>
		<category><![CDATA[Q&A]]></category>
		<guid isPermaLink="false">https://www.codastory.com/?p=25089</guid>

					<description><![CDATA[<p>In the Book of Veles, Jonas Bendiksen's controversial new photobook, the joke is on us</p>
<p>The post <a href="https://www.codastory.com/authoritarian-tech/jonas-bendiksen-book-of-veles/">A photographer and artist walk into a fake news factory</a> appeared first on <a href="https://www.codastory.com">Coda Story</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>The Book of Veles, photographer Jonas Bendiksen’s latest project, is a fresh and unnerving meditation on authenticity, veracity and truth – questions that have dogged photojournalism with every new advance in imagemaking.&nbsp;</p>



<p>Ostensibly a photobook on the small Macedonian town of Veles, which made <a href="https://www.wired.com/2017/02/veles-macedonia-fake-news/">international headlines</a> in 2016 as the unlikely factory of pro-Trump fake news, The Book of Veles created a furore after Bendiksen revealed that the project’s images were synthetically generated using 3D software and the book’s text was written entirely by artificial intelligence.&nbsp;</p>



<p>Fascinated with the story of Veles as well as with developments in synthetic imagery, Bendiksen set out to see just how “real” he could make his images. Although he left breadcrumbs throughout the text, to his surprise the book was published in April 2021 to “nice, positive echo-chamber feedback,” said Bendiksen. No one questioned the authenticity of the images or text. He then raised the stakes by submitting his manipulated photographs to the world’s most prestigious photojournalism festival, Visa pour l'image, which screened his images in September.</p>



<p>“I thought, what could be a higher threshold for fooling someone with junk, synthetic images than this crowd?” said Bendiksen. “I gave it 24 hours for someone to come forward with some questions about the work. It didn’t happen.” The photographer’s final attempt to out himself involved buying a squadron of Facebook and Twitter bots to attack him online. The bots posted dozens of messages claiming that his work was fraudulent, only to have Bendiksen’s colleagues and supporters rush to his defense.</p>



<p>Bendiksen finally came clean in <a href="https://www.magnumphotos.com/newsroom/society/book-veles-jonas-bendiksen-hoodwinked-photography-industry/">an interview on September 17</a> with Magnum Photos. We spoke with Bendiksen about what he’s come away with from the experience and how the project has continued to take unexpected turns even after his revelation last month.</p>



<p><em>This conversation has been edited for length and clarity.</em></p>



<figure class="wp-block-gallery alignfull has-nested-images columns-1 is-cropped wp-block-gallery-12 is-layout-flex wp-block-gallery-is-layout-flex">
<figure class="wp-block-image alignfull size-large"><img data-id="25126" src="https://www.codastory.com/wp-content/uploads/2021/10/Jonas-Bendiksen-Book-of-Veles-press_001-1800x1200.jpg" alt="" class="wp-image-25126"/><figcaption class="wp-element-caption"><strong><em>Can you start by telling us about how your project came together and what surprised you along the way? <br></em></strong><br>I found this Veles story so fascinating, but by then the websites had closed down, the algorithms had changed, they were out of business. So I realized the only way I can really explore that story in Veles is by creating my own imaginary version of it.</figcaption></figure>



<figure class="wp-block-image size-large"><img data-id="25127" src="https://www.codastory.com/wp-content/uploads/2021/10/Jonas-Bendiksen-Book-of-Veles-press_005-2-1800x1200.jpg" alt="" class="wp-image-25127"/><figcaption class="wp-element-caption">It spoke to me that if synthetic technology has gotten to the point where one averagely nerdy freelance photographer can go into his basement, look at some YouTube tutorials and create a whole photographic documentary from scratch, then we are in trouble. Can I with no prior experience fake it altogether?<br><br>So I went to Veles, I photographed a bunch of empty locations and downloaded free software to create 3D models.</figcaption></figure>



<figure class="wp-block-image size-large"><img data-id="25115" src="https://www.codastory.com/wp-content/uploads/2021/10/Jonas-Bendiksen-Book-of-Veles-press_012-1800x1200.jpg" alt="" class="wp-image-25115"/><figcaption class="wp-element-caption">Bringing these avatars to life, I found it frightening. I was scaring myself because I saw how easy this is to do and how lifelike they are. It was like seeing myself wake this Frankenstein monster to life.<br><br>On one hand I thought I should stop now, on the other I was wondering how far can this go? The fact that I can do this with no understanding or training of how this works, it says something of the things to come.</figcaption></figure>
</figure>



<p class="has-medium-font-size"><strong><em>What were the main goals that you set out with?</em></strong></p>



<p>I wanted to create some discussion and awareness around this technology but I didn’t go into this project wanting it to be just a technical demonstration of something. I get into my projects because it’s a good story to tell. If it hadn’t been for all these exciting layers of Veles and mythology I wouldn’t have done this. All of these parts of the puzzle fell miraculously into place.</p>



<p class="has-medium-font-size"><strong><em>Yes, tell us more about how the puzzle fell into place, there are so many bizarre layers to this project.</em></strong></p>



<p>One was the discovery of the historical layers to the story. The town Veles is named after this pre-Christian Slavic god who was a kind of sneaky guy, a shapeshifter who turns into a bear, a god of chaos and deception and magic. I thought he would probably be super happy about all this deception going on in Veles.</p>



<figure class="wp-block-gallery alignfull has-nested-images columns-1 is-cropped wp-block-gallery-13 is-layout-flex wp-block-gallery-is-layout-flex">
<figure class="wp-block-image size-large"><img data-id="25092" src="https://www.codastory.com/wp-content/uploads/2021/10/Screen-Shot-2021-10-20-at-3.41.55-PM-1800x1200.jpg" alt="" class="wp-image-25092"/><figcaption class="wp-element-caption">Then I came across this discovery of an ancient epic manuscript found by a Russian army officer and a historian in 1919 called the Book of Veles. It’s still quite a popular text in nationalist and new-agey circles.</figcaption></figure>



<figure class="wp-block-image size-large"><img data-id="25096" src="https://www.codastory.com/wp-content/uploads/2021/10/Screen-Shot-2021-10-20-at-3.50.53-PM-1792x1200.jpg" alt="" class="wp-image-25096"/><figcaption class="wp-element-caption">The thing is, of course, all modern historians and linguists have concluded that it’s a forgery.</figcaption></figure>



<figure class="wp-block-image size-large"><img data-id="25095" src="https://www.codastory.com/wp-content/uploads/2021/10/Screen-Shot-2021-10-20-at-3.50.35-PM-1800x1200.jpg" alt="" class="wp-image-25095"/><figcaption class="wp-element-caption">Nobody quite knows why they did it; they didn’t get really famous for it or make much money. That’s an interesting parallel to these misinformation efforts in Veles.<br><br>Other layers kept falling onto my head.<br></figcaption></figure>



<figure class="wp-block-image size-large"><img data-id="25141" src="https://www.codastory.com/wp-content/uploads/2021/10/Screen-Shot-2021-10-20-at-4.43.09-PM-1-1800x1195.png" alt="" class="wp-image-25141"/><figcaption class="wp-element-caption">I realized that the development in artificial text generation is also moving very fast. There is software by OpenAI which is available to the common man for free.<br><br>I fed it all available articles by reporters who went to Veles in 2016 when the original media story broke, and this AI system then wrote this 5,000-word introductory essay to the book about my experiences in Veles.<br></figcaption></figure>
</figure>



<p class="has-medium-font-size"><strong><em>The introduction written by AI is still a bit clumsy. Some of your images, especially the one of a bear stomping through town, feel like they should have set off some alarm bells. Yet it was all “real” enough that no one questioned it.</em></strong></p>



<p>That was also something that frightened me. The technology in the field has developed a lot even from when I started using it to when the project was done. It’s clear to me that within a few years 95% or more of people will have a hard time decoding whether an article was written by a New York Times journalist or a bot. I wanted this to be a look into what I think is the near future of our information landscape.</p>



<p class="has-medium-font-size"><strong><em>How are our current concerns about automation or AI different from all the previous technology scares?</em></strong></p>



<p>In photography at every step when there is a new technology people say it’s the collapse of truth. Whether that was when digital cameras came or when Photoshop showed up. People always cried wolf like that. Maybe I’m the same and this is a bunch of wolf crying and this will sort itself out nicely.</p>



<p>But I think the difference is: automation and synthetics. We will always have good journalism but I think it will be mixed in with so much synthetic junk or half synthetic junk that it will just be very, very chaotic and difficult to navigate. The difference is that the automation of it gives it such a bigger potential for spread and makes it so difficult to contain.</p>



<p class="has-medium-font-size"><strong><em>You’ve repeatedly compared your work to a penetration test, which hackers run to search for vulnerabilities in software. These tests allow companies to fortify their systems and close loopholes in their code. What did your stress test reveal, and what solutions did it point to?</em></strong></p>



<p>I’m not trying to pretend I have all the answers but I think there are many levels to it. I think the content verification business will be an industry unto itself. This is also a call to social media platforms, which I believe have failed us in many ways on this front thus far, to step up their game. This is a call to our education system. As a father of four, I’m wondering how my children are going to handle this. To be a functional citizen in the next 50 years, navigating the information space should be at least as important a subject as mathematics in schools.</p>



<figure class="wp-block-gallery alignfull has-nested-images columns-1 is-cropped wp-block-gallery-14 is-layout-flex wp-block-gallery-is-layout-flex">
<figure class="wp-block-image size-large"><img data-id="25158" src="https://www.codastory.com/wp-content/uploads/2021/10/P5250121_G7.jpg" alt="" class="wp-image-25158"/><figcaption class="wp-element-caption">We as content creators, journalists, photographers, editors, publications, we all have roles to play in this. We have to strengthen institutions and publications that work for in-depth, context based journalism and storytelling.<br><br>To me this is not an issue that is in any way limited to the photography community, this is a societal issue: the question of how in the next years and decades we navigate a landscape where people are manipulating information so easily in so many ways.<br><br>That’s an issue for democracy.</figcaption></figure>



<figure class="wp-block-image size-large"><img data-id="25277" src="https://www.codastory.com/wp-content/uploads/2021/10/Jonas-Bendiksen-Book-of-Veles-press_002-1-857x1200.jpg" alt="" class="wp-image-25277"/><figcaption class="wp-element-caption"><em><strong>The goal of disinformation, or at least one of its results, is the breakdown of trust between people, and between people and institutions. There’s a reason why a channel like RT, for example, has the slogan “Question More.” But there’s a delicate balance between the need for more critical thinking and the need for more trust, isn’t there?</strong><br></em><br>This is the big issue, right? This is the central question, this question of trust. We are dependent on trust for functional societies. We’re seeing this fragmentation of the information landscape.</figcaption></figure>



<figure class="wp-block-image size-large"><img data-id="25278" src="https://www.codastory.com/wp-content/uploads/2021/10/Jonas-Bendiksen-Book-of-Veles-press_013-1-857x1200.jpg" alt="" class="wp-image-25278"/><figcaption class="wp-element-caption">We’re losing the common denominator. The common narrative is disappearing. We’re already losing the idea that we share a common experience or common truth.<br><br>That’s an unfortunate truth and a consequence of this technology. We have to try to create systems where we somehow contain it. We employ technology to help fight back which is of course an arms race.</figcaption></figure>
</figure>



<p class="has-medium-font-size"><strong><em>This story continues to go in unexpected directions. Tell us about the latest.</em></strong></p>



<p>A junk information site pretending to be a newspaper in Texas, called Texas News Today, came out with a very similar story that was stolen <a href="https://www.wired.com/story/true-story-bogus-photos-people-fake-news/">from Wired</a> about my project. I think it’s automated: they suck stuff into their system, rewrite it and then it gets blasted out again on all these websites, which look very similar to what the fake news websites out of Veles were doing. It’s the same business model. I looked into it, and it turns out Texas News Today is a junk news operation based in Pune, India, running a bunch of sites in different languages from a single location.</p>



<p>Now, these fake news sites are stealing stories about my fake news project. And there are people quoting the fake writer who supposedly wrote the piece as some credible source. It’s full circle. There you can see the whole mechanism of the chaos.</p>
<p>The post <a href="https://www.codastory.com/authoritarian-tech/jonas-bendiksen-book-of-veles/">A photographer and artist walk into a fake news factory</a> appeared first on <a href="https://www.codastory.com">Coda Story</a>.</p>
]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">25089</post-id>	</item>
		<item>
		<title>Future Wake: the AI art project that predicts police violence</title>
		<link>https://www.codastory.com/authoritarian-tech/future-wake-predictive-policing/</link>
		
		<dc:creator><![CDATA[Caitlin Thompson]]></dc:creator>
		<pubDate>Mon, 18 Oct 2021 11:23:00 +0000</pubDate>
				<category><![CDATA[Authoritarian Technology]]></category>
		<category><![CDATA[Algorithms]]></category>
		<category><![CDATA[Artificial intelligence]]></category>
		<category><![CDATA[Q&A]]></category>
		<guid isPermaLink="false">https://www.codastory.com/?p=24825</guid>

					<description><![CDATA[<p>Winner of the Mozilla Creative Media award for 2021, an interactive website calculates when and where fatal encounters with law enforcement will occur — and tells the stories of the victims</p>
<p>The post <a href="https://www.codastory.com/authoritarian-tech/future-wake-predictive-policing/">Future Wake: the AI art project that predicts police violence</a> appeared first on <a href="https://www.codastory.com">Coda Story</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>To pinpoint when and where future crimes will occur, law enforcement agencies from Amsterdam to Alabama are turning to predictive policing.</p>



<p>However, the technology has attracted significant criticism from observers who cite <a href="https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3333423">biases</a> inherent in its algorithms and allege that its use contributes to the over-policing of marginalized communities.&nbsp;</p>



<p>Now, following a number of high-profile incidents, including the 2020 murder of George Floyd by Minneapolis police officer Derek Chauvin, the conversation is turning to how the same methods can be used to combat police brutality.<br></p>



<p>Enter <a href="https://www.futurewake.com/#/">Future Wake</a>, an interactive website that has received the Mozilla 2021 Creative Media Award. The project uses artificial intelligence to analyze data on fatal police encounters in the U.S. and predict future incidents. It then creates computer-generated avatars to tell the stories of each composite victim.<br></p>



<p>We sat down with two of its creators — Oz, based in New York, and Tim in the Netherlands — to talk about the motivations behind their work. Both asked to be referred to by their given names only.</p>



<p><em>This conversation was edited for length and clarity.</em></p>



<p><strong>Tell us about Future Wake. What was your goal with this project?</strong></p>



<p><strong>Oz: </strong>Future Wake focuses on using the principles of predictive policing to predict when the next fatal encounter with the police will occur. The tactics of predictive policing and the way it’s implemented are relatively unknown. Most of the time, whenever we mention it, people bring up “Minority Report.” They only have a fictionalized understanding of the technology. A lot of people don't realize that it's actually in their own cities. So, in order to bring attention to it, we thought about just flipping its application.</p>



<p><strong>When someone enters the website, what do they see?</strong></p>



<p><strong>Tim:</strong> At first, you see a warning. We are very much aware of trauma for people who've lived through police violence. You don't see any data immediately. You see the five faces. I thought it was the most important thing to show that we're talking about humans here, not numbers.&nbsp;</p>



<figure class="wp-block-video alignwide"><video height="1080" style="aspect-ratio: 1920 / 1080;" width="1920" autoplay loop src="https://www.codastory.com/wp-content/uploads/2021/10/LATEST.mp4" playsinline></video></figure>



<p><strong>Oz:</strong> Each face is a computer-generated image of the next victim from one of the five most populous cities in the U.S. — Chicago, Houston, Los Angeles, New York and Phoenix. Below each victim, you see a countdown that refers to the moment we predict that they're going to die. The people are animated to bring this awareness that they are still alive. Once the countdown ends, that will be the end of their lives. We want to breathe life into the people we predicted.&nbsp;</p>



<p><strong>Tim: </strong>When you click on a person, you enter their space. We predicted the location of the fatal encounter with police. We have a Google Streetview in the background, and it's like you're having a call with them. Then this person tells the story of their own demise.&nbsp;</p>





<p><strong>The project has two elements. You ran data about fatal encounters with police in the U.S. through predictive algorithms to determine the details of future victims. Then you used deepfake technology to create avatars that represent them. Why did you feel like you needed both?&nbsp;</strong></p>



<p><strong>Oz:</strong> Our data set starts from 2000 until now. We wanted to highlight these recurring patterns of police brutality in the U.S. and to home in on the fact that a prediction still has consequences for a human being.&nbsp;</p>



<p><strong>Tell us about the databases you worked with.&nbsp;</strong></p>



<p><strong>Oz: </strong>We used two main ones, called Fatal Encounters and Mapping Police Violence. These are citizen-initiated projects that try to capture every fatal encounter with police officers in the U.S. We did try to find police databases. The FBI has one that it started in 2018, but they're mainly relying on self-reported outcomes from police agencies. It’s actually under-representative of what's going on. We put that data through algorithms to predict who — which consists of the gender and ethnicity of the victim — where and when the next fatal encounter would occur.</p>



<p><strong>Each potential victim has a backstory that describes the circumstances in which they were killed. When you click on their face, they tell their story. You used AI text-generation software to do this, right?</strong></p>



<p><strong>Oz:</strong> In the databases we used, there were two or three sentences detailing if it was a car chase, if somebody was wielding a weapon or how they were shot. We used this algorithm called GPT-2 to learn the aesthetic of all of these media reports. GPT-2 would then generate future police-related media reports. We then edited the text slightly, to make it in the future tense and the first person.</p>



<figure class="wp-block-image alignwide size-large"><img src="https://www.codastory.com/wp-content/uploads/2021/10/Graph-2-1800x506.jpg" alt="" class="wp-image-24957"/><figcaption class="wp-element-caption"><em>Source: Aggregate of Fatal Encounters and Mapping Police Violence. Infographic: Coda Story</em></figcaption></figure>



<p><strong>Why did you use the future tense?</strong></p>



<p><strong>Oz:</strong> I see more talk about the horrific events that happened to victims of police brutality after the fact. Occasionally, there are little bubbles of conversation about how we can prevent this in the future. By predicting future victims and showing that this is an ongoing issue, we're asking, “How can we protect this person from being a future victim?”</p>



<p><strong>Let’s talk about the countdown clock…</strong></p>



<p><strong>Oz:</strong> In traditional predictive policing, they use spatial-temporal models to predict where and when a crime will occur. We replicate that. We want to say that this person is going to die at this specific moment. I used a time series algorithm to predict recurring police-related fatal encounters, and was able to predict an estimated day of when someone would die. The clock is supposed to generate a sense of urgency.&nbsp;</p>



<figure class="wp-block-image size-large"><img src="https://www.codastory.com/wp-content/uploads/2021/10/Graph1-1-1800x506.jpg" alt="" class="wp-image-24974"/><figcaption class="wp-element-caption"><em>Source: Aggregate of Fatal Encounters and Mapping Police Violence. Infographic: Coda Story</em></figcaption></figure>



<p><strong>Was there anything that surprised you in the data?</strong></p>



<p><strong>Oz: </strong>We looked at the average time between each fatal encounter for each city and for each demographic. It was pretty creepy. Based on the data that we had, in Chicago, Black males had the shortest time in between each incident. It was an average of 34 days. It was quite shocking. Minorities are overrepresented in the database, but I was still surprised by the fact that everyone is represented.</p>
<p>The post <a href="https://www.codastory.com/authoritarian-tech/future-wake-predictive-policing/">Future Wake: the AI art project that predicts police violence</a> appeared first on <a href="https://www.codastory.com">Coda Story</a>.</p>
]]></content:encoded>
					
		
		<enclosure url="https://www.codastory.com/wp-content/uploads/2021/10/LATEST.mp4" length="1642731" type="video/mp4" />

		<post-id xmlns="com-wordpress:feed-additions:1">24825</post-id>	</item>
		<item>
		<title>&#8216;The people who control are also being controlled&#8217;</title>
		<link>https://www.codastory.com/surveillance-and-control/salvatore-vitale-art-surveillance/</link>
		
		<dc:creator><![CDATA[Marta Biino]]></dc:creator>
		<pubDate>Thu, 14 Oct 2021 16:12:03 +0000</pubDate>
				<category><![CDATA[Surveillance and Control]]></category>
		<category><![CDATA[Algorithms]]></category>
		<category><![CDATA[Italy]]></category>
		<category><![CDATA[Q&A]]></category>
		<category><![CDATA[Surveillance]]></category>
		<guid isPermaLink="false">https://www.codastory.com/?p=24761</guid>

					<description><![CDATA[<p>The art of Salvatore Vitale examines the array of surveillance technology that surrounds us all </p>
<p>The post <a href="https://www.codastory.com/surveillance-and-control/salvatore-vitale-art-surveillance/">&#8216;The people who control are also being controlled&#8217;</a> appeared first on <a href="https://www.codastory.com">Coda Story</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>For the past seven years, the Italian artist Salvatore Vitale has investigated surveillance in all its varied forms. </p>



<p>From 2014 to 2019, he worked on the acclaimed project and photographic book <em><a href="https://salvatore-vitale.com/#/project/how-to-secure-a-country">How to Secure a Country</a></em>, focusing on the tensions between security and control that exist in Switzerland, where he now lives. His most recent project, <em><a href="https://salvatore-vitale.com/#/project/persuasive-system">Persuasive System</a></em>, is an interactive installation focused on the use of CCTV in public spaces. We caught up with him for a chat about his work.</p>



<p><em>This conversation has been edited for length and clarity.</em></p>



<div class="wp-block-image"><figure class="aligncenter size-full"><a href="https://www.codastory.com/wp-content/uploads/2021/10/Salvatore_Vitale_How_to_Secure_a_Country-10-scaled.jpg"><img src="https://www.codastory.com/wp-content/uploads/2021/10/Salvatore_Vitale_How_to_Secure_a_Country-10-scaled.jpg" alt="" class="wp-image-24765"/></a><figcaption>Ceremony of the Protocol Section, part of the Defense Ministry.</figcaption></figure></div>



<p><strong>Coda Story: First of all, let’s talk about your most successful project, “How to Secure a Country.” Why did you decide to work on surveillance systems, and why in Switzerland?</strong></p>



<p><strong>Salvatore Vitale:</strong> The idea for “How to Secure a Country” came to me in 2014, after Switzerland held a referendum to amend its constitution to reduce mass immigration. I had been living in an Italian-speaking Swiss canton for 10 years, but I had never really asked myself what it meant to be an immigrant there. After that moment, I started noticing how many contradictions were at play in the country, torn between progressivism and conservatism, between a need for safety and an almost all-controlling surveillance system.&nbsp;</p>



<p>Switzerland was not only my country of residence, but also a valuable case study. As one of the safest countries in the world, I knew that its surveillance system would be a sort of heightened version of those in place elsewhere.</p>



<figure class="wp-block-gallery columns-2 is-cropped wp-block-gallery-20 is-layout-flex wp-block-gallery-is-layout-flex"><ul class="blocks-gallery-grid"><li class="blocks-gallery-item"><figure><a href="https://www.codastory.com/wp-content/uploads/2021/10/Salvatore_Vitale_How_to_Secure_a_Country-05-scaled.jpg"><img src="https://www.codastory.com/wp-content/uploads/2021/10/Salvatore_Vitale_How_to_Secure_a_Country-05-900x1200.jpg" alt="" data-id="24791" data-full-url="https://www.codastory.com/wp-content/uploads/2021/10/Salvatore_Vitale_How_to_Secure_a_Country-05-scaled.jpg" data-link="https://www.codastory.com/?attachment_id=24791" class="wp-image-24791"/></a><figcaption class="blocks-gallery-item__caption">A military ceremony.</figcaption></figure></li><li class="blocks-gallery-item"><figure><a href="https://www.codastory.com/wp-content/uploads/2021/10/Salvatore_Vitale_Persuasive_System-1-scaled.jpg"><img src="https://www.codastory.com/wp-content/uploads/2021/10/Salvatore_Vitale_Persuasive_System-1-900x1200.jpg" alt="" data-id="24792" data-full-url="https://www.codastory.com/wp-content/uploads/2021/10/Salvatore_Vitale_Persuasive_System-1-scaled.jpg" data-link="https://www.codastory.com/?attachment_id=24792" class="wp-image-24792"/></a><figcaption class="blocks-gallery-item__caption">A CCTV camera from Vitale's "Persuasive System."</figcaption></figure></li></ul></figure>



<p><strong>Surveillance takes so many different shapes that understanding them all seems almost impossible. How did you tackle this incredibly broad subject?</strong></p>



<p>I didn’t really envision a project about security and surveillance in the beginning. What I wanted was to understand where the need for safety in Switzerland comes from. It was an interesting angle to tackle, because safety is such an abstract concept that I didn’t know where to look. I did a lot of research, but I didn’t really talk to normal people about it. I partnered with academics and members of law enforcement, the insiders.&nbsp;</p>



<figure class="wp-block-gallery columns-2 is-cropped wp-block-gallery-21 is-layout-flex wp-block-gallery-is-layout-flex"><ul class="blocks-gallery-grid"><li class="blocks-gallery-item"><figure><a href="https://www.codastory.com/wp-content/uploads/2021/10/Salvatore_Vitale_How_to_Secure_a_Country-19.jpg"><img src="https://www.codastory.com/wp-content/uploads/2021/10/Salvatore_Vitale_How_to_Secure_a_Country-19-900x1200.jpg" alt="" data-id="24800" data-full-url="https://www.codastory.com/wp-content/uploads/2021/10/Salvatore_Vitale_How_to_Secure_a_Country-19.jpg" data-link="https://www.codastory.com/?attachment_id=24800" class="wp-image-24800"/></a><figcaption class="blocks-gallery-item__caption">A simulated injury from a military exercise.</figcaption></figure></li><li class="blocks-gallery-item"><figure><a href="https://www.codastory.com/wp-content/uploads/2021/10/Salvatore_Vitale_Persuasive_System-2-scaled.jpg"><img src="https://www.codastory.com/wp-content/uploads/2021/10/Salvatore_Vitale_Persuasive_System-2-900x1200.jpg" alt="" data-id="24801" data-full-url="https://www.codastory.com/wp-content/uploads/2021/10/Salvatore_Vitale_Persuasive_System-2-scaled.jpg" data-link="https://www.codastory.com/?attachment_id=24801" class="wp-image-24801"/></a><figcaption class="blocks-gallery-item__caption">Data collection from surveillance cameras.</figcaption></figure></li></ul></figure>



<p><strong>Many of the themes your work deals with are very abstract. How do you represent ideas like surveillance and control visually?</strong></p>



<p>I took an analytical approach, trying to make the abstract idea of surveillance tangible through its actors and its tools. I tried to be creative in the use of my medium. I knew that my subjects were going to be extremely varied, and I tailored my approach accordingly. It’s one thing to photograph police forces patrolling the border, a completely different thing to photograph malware.&nbsp;</p>



<p>I took a risk. I decided not to capture actions, because surveillance is a process — it’s constantly evolving. I decided to use my images as triggers. I wanted to convey a sense of oppression, in a way that is cynical and clinical.</p>



<figure class="wp-block-gallery columns-2 is-cropped wp-block-gallery-22 is-layout-flex wp-block-gallery-is-layout-flex"><ul class="blocks-gallery-grid"><li class="blocks-gallery-item"><figure><a href="https://www.codastory.com/wp-content/uploads/2021/10/Salvatore_Vitale_How_to_Secure_a_Country-02-scaled.jpg"><img src="https://www.codastory.com/wp-content/uploads/2021/10/Salvatore_Vitale_How_to_Secure_a_Country-02-900x1200.jpg" alt="" data-id="24804" data-full-url="https://www.codastory.com/wp-content/uploads/2021/10/Salvatore_Vitale_How_to_Secure_a_Country-02-scaled.jpg" data-link="https://www.codastory.com/?attachment_id=24804" class="wp-image-24804"/></a><figcaption class="blocks-gallery-item__caption">On the Italy-Switzerland border.</figcaption></figure></li><li class="blocks-gallery-item"><figure><a href="https://www.codastory.com/wp-content/uploads/2021/10/Salvatore_Vitale_How_to_Secure_a_Country-08-scaled.jpg"><img src="https://www.codastory.com/wp-content/uploads/2021/10/Salvatore_Vitale_How_to_Secure_a_Country-08-900x1200.jpg" alt="" data-id="24805" data-full-url="https://www.codastory.com/wp-content/uploads/2021/10/Salvatore_Vitale_How_to_Secure_a_Country-08-scaled.jpg" data-link="https://www.codastory.com/?attachment_id=24805" class="wp-image-24805"/></a><figcaption class="blocks-gallery-item__caption">A detail of some ruins after a simulated rescue mission.</figcaption></figure></li><li class="blocks-gallery-item"><figure><a href="https://www.codastory.com/wp-content/uploads/2021/10/Salvatore_Vitale_How_to_Secure_a_Country-09-scaled.jpg"><img src="https://www.codastory.com/wp-content/uploads/2021/10/Salvatore_Vitale_How_to_Secure_a_Country-09-903x1200.jpg" alt="" data-id="24807" data-full-url="https://www.codastory.com/wp-content/uploads/2021/10/Salvatore_Vitale_How_to_Secure_a_Country-09-scaled.jpg" data-link="https://www.codastory.com/?attachment_id=24807" class="wp-image-24807"/></a><figcaption class="blocks-gallery-item__caption">Employing private 
security agents is integral to the Swiss security apparatus.</figcaption></figure></li><li class="blocks-gallery-item"><figure><a href="https://www.codastory.com/wp-content/uploads/2021/10/Salvatore_Vitale_How_to_Secure_a_Country-20.jpg"><img src="https://www.codastory.com/wp-content/uploads/2021/10/Salvatore_Vitale_How_to_Secure_a_Country-20-896x1200.jpg" alt="" data-id="24808" data-full-url="https://www.codastory.com/wp-content/uploads/2021/10/Salvatore_Vitale_How_to_Secure_a_Country-20.jpg" data-link="https://www.codastory.com/?attachment_id=24808" class="wp-image-24808"/></a><figcaption class="blocks-gallery-item__caption">A military bunker in the Swiss Alps.</figcaption></figure></li></ul></figure>



<p><strong>“How to Secure a Country” took five years to complete. Did you notice the development of surveillance systems over this period of time?</strong></p>



<p>I definitely noticed how the dividing line between the people who control and the people who are being controlled became thinner and more fluid. As algorithms become more sophisticated and biometric surveillance more widespread, the system becomes circular. The people who control are also being controlled at the same time.&nbsp;</p>



<figure class="wp-block-image size-large is-resized"><img src="https://www.codastory.com/wp-content/uploads/2021/10/Salvatore_Vitale_How_to_Secure_a_Country-01-1600x1200.jpg" alt="" class="wp-image-24782" width="840" height="630"/><figcaption>A security cell on the border between Switzerland and Italy.</figcaption></figure>



<p><strong><em>“</em></strong><strong>How to Secure a Country” is closely intertwined with your current project. Can you tell us more about that?</strong></p>



<p>Yes, the premise for both projects is similar: understanding surveillance. But “Persuasive System”<em> </em>is more universal, not focused on a single country, and it’s not a work of photography, but an interactive installation, almost like a performance.&nbsp;</p>



<p>I essentially built a system with three CCTV cameras. When the installation is working, it constantly captures images and information about people who interact with it, and people can play together, collecting data about each other.</p>



<figure class="wp-block-gallery columns-2 is-cropped wp-block-gallery-23 is-layout-flex wp-block-gallery-is-layout-flex"><ul class="blocks-gallery-grid"><li class="blocks-gallery-item"><figure><a href="https://www.codastory.com/wp-content/uploads/2021/10/Salvatore_Vitale_Persuasive_System-5-878x1200.jpg"><img src="https://www.codastory.com/wp-content/uploads/2021/10/Salvatore_Vitale_Persuasive_System-5-878x1200.jpg" alt="" data-id="24812" data-link="https://www.codastory.com/?attachment_id=24812" class="wp-image-24812"/></a></figure></li><li class="blocks-gallery-item"><figure><a href="https://www.codastory.com/wp-content/uploads/2021/10/Salvatore_Vitale_Persuasive_System-7-scaled.jpg"><img src="https://www.codastory.com/wp-content/uploads/2021/10/Salvatore_Vitale_Persuasive_System-7-900x1200.jpg" alt="" data-id="24814" data-full-url="https://www.codastory.com/wp-content/uploads/2021/10/Salvatore_Vitale_Persuasive_System-7-scaled.jpg" data-link="https://www.codastory.com/?attachment_id=24814" class="wp-image-24814"/></a></figure></li><li class="blocks-gallery-item"><figure><a href="https://www.codastory.com/wp-content/uploads/2021/10/Salvatore_Vitale_Persuasive_System-4-1602x1200.jpg"><img src="https://www.codastory.com/wp-content/uploads/2021/10/Salvatore_Vitale_Persuasive_System-4-1602x1200.jpg" alt="" data-id="24811" data-link="https://www.codastory.com/?attachment_id=24811" class="wp-image-24811"/></a></figure></li></ul><figcaption class="blocks-gallery-caption">Security cameras constantly capture data about the viewers of the installation.</figcaption></figure>



<p><strong>What is its aim?</strong></p>



<p>I’m particularly interested in how video surveillance influences behavior. I wanted to understand what being a data subject means. We, as humans, have a physical body, but we also have an online persona, built on the basis of all the data that is collected about us. Algorithms that collect data from us online operate like black boxes and so do CCTV cameras. When we see them around, we know that they’re filming us and collecting data, but we don’t know which kind. With “Persuasive System,” people can see how the data is collected, and how much of it there is. It’s a lot. Through it, I’m hoping to foster a shock reaction in my audience.</p>



<p><strong>That’s a huge problem of our time. The amount of data that is collected about us online is unfathomable. How do you think your projects help to shed light on this?</strong></p>



<p>I don’t want to give solutions. I think rushing to solve these issues can be dangerous. My idea is to give my audience more awareness, the “weapons” to navigate our reality. By visualizing the technology, I hope to push people to think critically about their existence as data subjects.</p>



<figure class="wp-block-gallery columns-2 is-cropped wp-block-gallery-24 is-layout-flex wp-block-gallery-is-layout-flex"><ul class="blocks-gallery-grid"><li class="blocks-gallery-item"><figure><a href="https://www.codastory.com/wp-content/uploads/2021/10/Salvatore_Vitale_How_to_Secure_a_Country-03-1-scaled.jpg"><img src="https://www.codastory.com/wp-content/uploads/2021/10/Salvatore_Vitale_How_to_Secure_a_Country-03-1-900x1200.jpg" alt="" data-id="24777" data-full-url="https://www.codastory.com/wp-content/uploads/2021/10/Salvatore_Vitale_How_to_Secure_a_Country-03-1-scaled.jpg" data-link="https://www.codastory.com/?attachment_id=24777" class="wp-image-24777"/></a><figcaption class="blocks-gallery-item__caption">An assault rifle customized for sporting.</figcaption></figure></li><li class="blocks-gallery-item"><figure><a href="https://www.codastory.com/wp-content/uploads/2021/10/Salvatore_Vitale_How_to_Secure_a_Country-04-scaled.jpg"><img src="https://www.codastory.com/wp-content/uploads/2021/10/Salvatore_Vitale_How_to_Secure_a_Country-04-899x1200.jpg" alt="" data-id="24779" data-full-url="https://www.codastory.com/wp-content/uploads/2021/10/Salvatore_Vitale_How_to_Secure_a_Country-04-scaled.jpg" data-link="https://www.codastory.com/?attachment_id=24779" class="wp-image-24779"/></a><figcaption class="blocks-gallery-item__caption">On the Italy-Switzerland border.</figcaption></figure></li></ul></figure>



<p><em>All images copyright to Salvatore Vitale. Follow Salvatore on <a href="https://www.instagram.com/salvatorevitale_/">Instagram</a> or on his <a href="https://salvatore-vitale.com/">website</a>.</em></p>
<p>The post <a href="https://www.codastory.com/surveillance-and-control/salvatore-vitale-art-surveillance/">&#8216;The people who control are also being controlled&#8217;</a> appeared first on <a href="https://www.codastory.com">Coda Story</a>.</p>
]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">24761</post-id>	</item>
		<item>
		<title>Who&#8217;s homeless enough for housing? In San Francisco, an algorithm decides</title>
		<link>https://www.codastory.com/authoritarian-tech/san-francisco-homeless-algorithm/</link>
		
		<dc:creator><![CDATA[Caitlin Thompson]]></dc:creator>
		<pubDate>Tue, 21 Sep 2021 12:44:03 +0000</pubDate>
				<category><![CDATA[Authoritarian Technology]]></category>
		<category><![CDATA[Algorithms]]></category>
		<category><![CDATA[Feature]]></category>
		<category><![CDATA[Surveillance]]></category>
		<guid isPermaLink="false">https://www.codastory.com/?p=23266</guid>

					<description><![CDATA[<p>Replacing human decision making with a computerized scoring system is hurting California's most vulnerable residents</p>
<p>The post <a href="https://www.codastory.com/authoritarian-tech/san-francisco-homeless-algorithm/">Who&#8217;s homeless enough for housing? In San Francisco, an algorithm decides</a> appeared first on <a href="https://www.codastory.com">Coda Story</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>San Francisco’s Tenderloin district is chaotic. Sirens, music and loud conversations generate almost constant noise. In every direction, dozens of people are camped out, crouched outside tents or sleeping in the open. Planters filled with soil but no flowers line a sidewalk, nails sticking out to discourage sitting. A broken drinking fountain, installed to give people access to clean water during the pandemic, gushes into the street.&nbsp;</p>



<p>Twitter’s headquarters are a 15-minute walk away; the cloud-based software giant Salesforce is 20 minutes in the opposite direction. But right here — in the epicenter of a chronic housing crisis, recently exacerbated by high tech industry salaries — you would barely be able to guess that this city is home to some of the wealthiest and most powerful people in the world. According to the latest count in 2019, there are around 8,035 people experiencing homelessness in San Francisco, a per capita rate comparable to much larger cities like New York.</p>



<p>For the thousands experiencing homelessness in the Tenderloin, their chance of getting off the streets comes down to a single number, generated by an algorithm. It is meant to assess each person’s unique vulnerabilities and allocate assistance accordingly. But now, even the designers of such systems say that, far from solving the problem of homelessness in the United States, these algorithms are used by local governments to deny assistance to large numbers of people in need.</p>



<p>In 2013, Iain De Jong and his colleagues at OrgCode, a consulting firm specializing in issues relevant to the homeless, created the Vulnerability Index — Service Prioritization Decision Assistance Tool (VI-SPDAT), a scoring algorithm to help address America’s homelessness crisis. They were so successful that versions were adopted by authorities in at least 40 states. Other local governments, like San Francisco, have followed suit and created their own similar tools. Eight years later, De Jong and OrgCode say that cities are misusing their system — and that this has to stop.&nbsp;</p>



<p>VI-SPDAT was meant to help local social service providers assess what type of housing assistance might best suit a homeless person’s needs. Instead, resource-strapped cities are relying solely on tools such as VI-SPDAT to make a binary choice: who gets housing and who doesn’t.&nbsp;</p>



<p>“One of the gross misunderstandings and misuses of the tool was making housing decisions based upon the outputs of it,” De Jong told me. “It was never designed to do that.”&nbsp; In December, OrgCode<a href="https://www.orgcode.com/blog/the-time-seems-right-lets-begin-the-end-of-the-vi-spdat"> announced</a> that it would begin phasing out VI-SPDAT and will <a href="https://www.orgcode.com/blog/a-message-from-orgcode-on-the-vi-spdat-moving-forward">no longer</a> provide support for cities using the most common version of it.&nbsp;</p>



<p>But it might be too late. The tool was rolled out one year after the Department of Housing and Urban Development required all cities receiving federal funding for programs that house people who are homeless to adopt centralized assessment processes known as “coordinated entry systems,” built on tools such as VI-SPDAT, to allocate accommodation.&nbsp;</p>



<p>Today, VI-SPDAT offers a cautionary tale of how an algorithm meant to help people resolve a thorny societal dilemma replaced human decision making entirely, with devastating effects for its intended beneficiaries. Yet, it is not the classic story of viewing a complex human crisis through a reductive tech-bro lens. Instead, it is an example of the willful misuse of a well-intentioned tool by city administrators, who have turned to such systems to deflect attention from a persistent and multi-layered problem, rather than attempting to marshal the resources and political will to solve it.</p>



<p>VI-SPDAT grew out of a collaboration between OrgCode and Common Ground, a national organization working to house homeless people. The system generates a vulnerability score out of 17 based on a set of questions about mental health, physical health and risk factors for chronic homelessness, with the aim of aiding case managers to triage people to appropriate resources.</p>



<p>“It was intended to try and help frontline staff better understand, across multiple dimensions, what people's vulnerabilities were, what their risks to housing stability were, so that you can work with the individual to guide a plan of support,” said De Jong, explaining that the score was not meant to be the ultimate criterion for deciding whether or not a person should receive housing assistance.&nbsp;</p>



<p>But that, unfortunately, is exactly what happened. City administrators simply ignored the fact that VI-SPDAT was intended to provide a starting point for offering assistance and facilitating further conversations between homeless individuals and case workers.</p>



<p>“People just skipped over that step,” he said. “We even heard things like, ‘Well, we just don't have time,’ or, ‘It's inconvenient,’ or, ‘Following up with people to get that sort of information is hard work and the survey isn't.’”</p>



<figure class="wp-block-image size-large is-resized"><img src="https://www.codastory.com/wp-content/uploads/2021/09/d2b-1800x506.png" alt="" class="wp-image-23861" style="width:840px;height:236px"/><figcaption class="wp-element-caption">Posters in the offices of Coalition on Homelessness call for an end to sweeps of encampments and funding for the Compassionate Alternative Response Team (C.A.R.T), which seeks to end police response to homelessness.&nbsp;</figcaption></figure>



<p>Prior to the widespread adoption of coordinated entry systems, the process of providing housing for homeless people was built on relationships between case workers and the individuals in question. Social workers were supposed to develop an understanding of their clients’ needs and assist them accordingly.&nbsp;</p>



<p>Coordinated entry replaced individualized case management with a standardized system that is — depending on where you stand or who you talk to — either objective and less prone to favoritism, or cold, rigid and brutally mechanical.</p>



<p>“The goal of coordinated entry is to promote equity, and to ensure everyone has the equal ability to access resources,” wrote Denny Machuca-Grebe, the public information officer for the Department of Homelessness and Supportive Housing (HSH) in San Francisco.&nbsp;</p>



<p>Service providers working with people experiencing homelessness say that deemphasizing human relationships has created an unbending and soulless system that actively impedes the provision of individualized help. Tools such as VI-SPDAT or San Francisco’s similar Primary Assessment algorithm, critics say, have become a way to quickly eliminate people from housing eligibility under the guise of fairness and efficiency.</p>



<p>“You can propose a solution to meet the scale of the problem. Or you can shrink the problem to meet the available solution,” said Joe Wilson, the executive director of Hospitality House, a community-based organization in the Tenderloin that provides services for people experiencing homelessness and runs a shelter. “Coordinated entry reduces the scale of the problem.”</p>



<p>The reality is there simply isn’t enough housing. As of February, 225 of the roughly 7,755 supportive housing units allocated for people experiencing homelessness in San Francisco were vacant and ready for occupancy, according to <a href="https://www.sfpublicpress.org/1-in-10-s-f-housing-units-for-homeless-sit-vacant/">reporting</a> by the San Francisco Public Press. While over <a href="https://hsh.sfgov.org/wp-content/uploads/2020/06/06.30.20-Permanent-Supportive-Housing_833-Bryant-Street-1.pdf">10,800 people</a> lived in permanent supportive housing as of June of last year, 5,180 people are sleeping on the streets and another 2,855 are living in cars, on couches or in temporary shelters, according to the latest count in 2019.</p>



<p>Wilson, who was homeless in San Francisco himself in the early 1980s, has been a vocal critic of coordinated entry since its launch. “It’s designed not to help people get in, but to keep them out,” he said.&nbsp;</p>



<p>“I think we went from one extreme to the other,” De Jong said. “We went from a system of care that had really come down to luck, self-advocacy or first come, first served, in terms of how people got housing, to a very hands-off numeric-based approach that was very dehumanizing, in which people were not, in my opinion, always seen as people with potential, strengths and resiliency. They were resigned to a number on a waiting list. And that was just overwhelmingly disheartening.”&nbsp;</p>



<figure class="wp-block-image size-large is-resized"><img src="https://www.codastory.com/wp-content/uploads/2021/09/d1b-1800x506.png" alt="" class="wp-image-23860" style="width:840px;height:236px"/><figcaption class="wp-element-caption">Posters and signs in the office of the Coalition on Homelessness in San Francisco’s Tenderloin district. The organization works with social services providers and unhoused people to create permanent solutions to homelessness.</figcaption></figure>



<p>Across the park from San Francisco City Hall, a few blocks south of the Tenderloin, rows of tents are hidden behind a concrete barricade. It’s one of half a dozen city-sanctioned areas with 24/7 security and access to food, water and sanitation. These “safe sleep” sites are the city’s latest experiment in controlling sprawling sidewalk encampments and a strategy to limit the spread of Covid-19 among the homeless population. But, for many, permanent housing remains an unattainable dream.</p>



<p>Primary Assessment, the city’s scoring tool, was created by the local government with community input, but it functions in a similar way to VI-SPDAT. Its questions are phrased and weighted differently, but the fundamental principle of an algorithm assessing vulnerability is exactly the same.</p>



<p>If a person’s score meets or exceeds a certain threshold, they enter the queue to be allocated a housing placement. If their score is too low, they are put into what is known as “problem-solving status,” to be matched with programs that provide assistance other than permanent housing.&nbsp;</p>



<p>The questions asked of applicants are deeply personal: information on drug or alcohol use, mental health issues, developmental disorders and experiences with sexual assault or domestic abuse. Single adults and people aged 18 to 24 are asked if they have visited detox centers or called into suicide hotlines, or if they have traded sex for a place to sleep.</p>



<p>“You're not my therapist. You don't need to know how many times I've been sexually assaulted in my life,” said Roxie, who had been homeless for most of the past four years, before they were finally housed in a subsidized unit. “I hate the fact that the coordinated entry system is based on how traumatizing your life has been, essentially.”&nbsp;</p>



<p>Roxie is based in San Francisco, but has traveled the country from Nashville to St. Louis. They have been through the Primary Assessment process on four occasions, and each time, it reopened old wounds.&nbsp;</p>



<p>“The Department is transparent and respectful about the reality of trauma for the people we serve, and we strive to minimize such impacts through training and standardization across the Homelessness Response System,” Deborah Bouck, communications lead at HSH, wrote in response to questions for this article.&nbsp;</p>



<p>Applicants are advised that they don’t need to go into detail during the assessment, but not answering a question can adversely affect their score, which leads to people like Roxie feeling compelled to share far more than they are comfortable with.&nbsp;</p>



<p>Xander, whose mother was homeless and grandmother lived in supportive housing, has taken the assessment twice and experienced panic attacks both times.&nbsp;</p>



<p>“It really sucks trying to bring up these topics,” Xander said. “When I did my housing assessment, I was fresh out of an abusive situation with my family, so it was all raw. When I tried talking about it, I'm like, ‘Hey, can we skip this question?’”<br><br>Xander walked out without finishing their first assessment or getting a score and was forced to return to a bad situation with their family.</p>



<p>“Stepping back from the complex system and how it all works, fundamentally, that is a really intense thing to ask anyone to do, to offer up even a piece of their vulnerability to somebody,” said Kenn Sutto, who conducts Primary Assessments at the Homeless Youth Alliance, a grassroots harm reduction coalition which is contracted by the city to serve as an access point for young people experiencing homelessness. Because these access points are contracted by the city, they are required to use the Primary Assessment. People have told Sutto that the coordinated entry assessment feels like the “trauma Olympics.”</p>



<p>“If you need to know to be able to determine how much support someone is going to need in housing, that's a legitimate case,” said Wilson, explaining that it is necessary to know if an individual has a disability that makes it difficult to use stairs or a medical condition that requires a private bathroom. But he believes that questions about substance use or sexual violence should not be relevant to determining whether someone qualifies for housing.&nbsp;</p>



<p>“If you're asking it to determine the severity of one's homelessness, why do you need to know that? Everybody that comes needs a roof over their head,” he said.</p>



<p>For De Jong, the designer of VI-SPDAT, these intrusive questions conflate two very different concepts — eligibility, which determines the specific housing program from which a person will gain assistance, such as veteran or HIV-positive housing, and “depth of need,” which simply refers to a person’s unique risks and vulnerabilities.</p>



<p>“That's where I'll say the failure of implementation of VI-SPDAT to its true intention is remarkably pronounced,” he said.&nbsp;</p>



<p>San Francisco’s Department of Homelessness and Supportive Housing doesn’t share this view. “The assessment is used to prioritize those most vulnerable,” said Bouck at HSH. But in practice, many people don’t score high enough to be considered for housing assistance.</p>



<p>Roxie got lucky and recently moved into an apartment in San Francisco’s Mission district, but the trauma of the repeated assessments remains with them.&nbsp;</p>



<p>“It’s such bullshit that we have to be willing to expose ourselves and be that vulnerable in order to get housing. It’s exploitative,” they said.</p>



<figure class="wp-block-video alignwide"><video height="1080" style="aspect-ratio: 1920 / 1080;" width="1920" autoplay loop muted src="https://videos.files.wordpress.com/WhgvFZXd/divider.mp4" playsinline></video></figure>



<p>If you believe the strategic five-year plan, issued by San Francisco’s Department of Homelessness and Supportive Housing in 2017, coordinated entry is supposed to “make homelessness a rare, brief, and one-time event.”&nbsp;</p>



<p>On the other hand, you could listen to Jamale, whose experiences suggest an uncomfortable truth: coordinated entry systems will never fix homelessness. Instead, they will keep kicking people to the back of the queue, until the system deems them homeless <em>enough</em> for housing assistance.</p>



<p>Jamale, who asked that his name be changed for reasons of privacy, has taken the assessment twice. Both times he didn’t make the cut for housing priority status.&nbsp;</p>



<p>“I felt like being honest and trustworthy gets me nowhere in that system, because I didn't qualify,” he said. “And I opened myself up to every little incident that I've had, just to be told, ‘Nope, not now.’ It really drains you. It makes you wonder, ‘Why am I still here, or what's my purpose in society, if I can't access help?’”</p>



<p>Jamale, who is originally from Los Angeles, moved north about 15 years ago. A quiet, kind man in his early 30s, he enjoys building model rockets. In college, his favorite course was on legal terminology. “I wanted to be an attorney that could help people get out and stay out of jail,” he said.&nbsp;</p>



<p>But Jamale didn’t know anyone in San Francisco when he first arrived, and he couldn’t afford rent. Without any idea where to turn for help, he ended up on the streets, where he has stayed for most of the past 10 years.</p>



<p>By focusing on a specific definition of vulnerability, San Francisco is inadvertently excluding people like Jamale from housing services. With a roof over his head and support to process the trauma caused by years of homelessness, he could be one of the success stories that many service providers hope to find. Ironically, that could be part of the reason why he hasn’t been allocated housing yet.&nbsp;</p>



<p>Like the vast majority of people who go through the coordinated entry system in San Francisco, Jamale wasn’t determined to be vulnerable enough for permanent supportive housing. According to <a href="https://hsh.sfgov.org/about/research-and-reports/hsh-reports/">data</a> from HSH, only 17% of the 18,327 assessments conducted from January 2019 through May 2021 resulted in people moving into a new home.</p>



<p>Of the 11,979 single adults who were assessed between July 2018 and June 2021, a staggering 69% did not meet the threshold for housing priority status, according to HSH data obtained by Coda Story via a public records request. The numbers are better for other demographics — 88% of families and 58% of youth applicants received housing referrals.&nbsp;</p>



<p>“We're basically saying, ‘Wait until you're sick enough, until you've been impacted to a point that's very detrimental to a person who is experiencing homelessness, or until that point when this algorithm is going to score you very high,’” said Laura Valdez, executive director of Dolores Street Community Services, a nonprofit which will become a coordinated entry access point for adults later this year. Currently, there are only two such offices in the whole city. Valdez hopes that operating a third will give her team greater insight into how the system works.&nbsp;</p>



<p>It is hard to say why Jamale has not yet been housed. Coordinated entry’s scoring system is deliberately opaque. The people conducting the assessments are not even told how the questions are weighted. According to Jeff Kositsky, who served as director of HSH from 2016 to March 2020 and led the department throughout the implementation of coordinated entry, this is to avoid service providers coaching their clients.&nbsp;</p>



<p>“We want to make sure we’re getting an accurate picture, and we don’t want case managers to game the system to get their people to the top of the list. It’s a best practice,” Kositsky told the local online newsroom <a href="https://thefrisc.com/in-san-franciscos-homelessness-crisis-big-data-still-means-big-problems-38e9e794a0d3">The Frisc</a> in 2017.&nbsp;</p>



<p>“Any large, complex system that renders critical social services is subject to potential manipulation. While HSH understands there is a possibility for coaching, we also are aware such behavior occurs because staff care about those they serve and want to assist community members in getting housed,” said Bouck, communications lead at HSH, in a statement to Coda Story.</p>



<p>Still, people have guesses as to how it works. Joe Wilson believes that “chronicity of homelessness” — how long an individual has been unhoused — makes up a significant part of the score. He’s right, according to a breakdown of how questions are weighted reviewed by Coda Story, which has not been made public before and was obtained via a public records request by the Coalition on Homelessness. People who have been homeless for more than 15 years get 15 points added to their score. Someone who needs help carrying out daily activities or maintaining housing receives nine points. Experiencing sexual or physical violence in their current living situation can add 12 points, but only if the person is 24 years old or younger.&nbsp;</p>
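<p>The handful of weights reported above can be read as a simple additive score with a cutoff. A minimal sketch, for illustration only: the real assessment uses many more questions, and the actual weighting and the threshold value used here are not public (the threshold below is an invented placeholder).</p>

```python
# Hypothetical sketch of a VI-SPDAT-style prioritization score,
# using only the three weights reported in this article.

def priority_score(years_homeless, needs_daily_support,
                   experienced_violence, age):
    """Sum weighted vulnerability factors into a single score."""
    score = 0
    if years_homeless > 15:
        score += 15  # "chronicity of homelessness"
    if needs_daily_support:
        score += 9   # help with daily activities or maintaining housing
    if experienced_violence and age <= 24:
        score += 12  # violence weight applies only at age 24 or younger
    return score

def housing_status(score, threshold=20):  # threshold is a made-up value
    """Binary triage: housing queue vs. 'problem-solving status'."""
    return "housing queue" if score >= threshold else "problem-solving status"
```

Note how the age condition means the same experience of violence counts for a 20-year-old but not for a 30-year-old — the kind of opaque cutoff that critics quoted here object to.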



<p>Sometimes, things don’t add up. One of Wilson’s clients at Hospitality House is an 87-year-old woman, who has been homeless for 40 years. After going through the coordinated entry process, she didn’t qualify for a housing referral.&nbsp;</p>



<p>For many, the algorithm’s word is final. Case managers and the people conducting assessments can’t change a score. Individuals placed in problem-solving status can challenge the decision via a clinical case review, but that route is time-consuming and requires clients to hand over even more personal information. Many don’t pursue it and instead opt to take the assessment again in six months, remaining homeless in the meantime. Others, like Jamale, just walk away for good. He doesn’t plan on trying a third time.</p>



<p>Without permanent housing, it can be harder to stay engaged with social services like child care, mental health and addiction treatment. Megan Geary is the program director at Central City Access Point, which is contracted by the city to conduct coordinated entry assessments for families. She told me that a significant number of people turn away from other avenues of help after being told that they do not qualify for housing. “We see people cycling through the system much longer. It just feels counterintuitive.”&nbsp;</p>



<p>For Jamale, it feels like the system is telling him that things need to get much worse before he is deemed worthy of a home.</p>



<p>“Why do I need to be in a hospital bed fighting for my life in order to get housed? If that's the case, what do I have to do? Walking in off the street and asking for help, it seems like a dead road,” he said.</p>



<figure class="wp-block-image size-large"><img src="https://www.codastory.com/wp-content/uploads/2021/09/JoeW-1800x1013.jpg" alt="" class="wp-image-54973"/><figcaption class="wp-element-caption">Joe Wilson stands in front of a mural of San Francisco in the offices of Hospitality House, in the Tenderloin. Wilson slept in a shelter at Hospitality House when he was homeless in the 1980s and now serves as the organization’s executive director. Photo by Caitlin Thompson.</figcaption></figure>



<p>Joe Wilson has been working in homeless services for the better part of 20 years, starting while he was sleeping at the Hospitality House shelter. He’s furious that human judgment has been replaced by an algorithm.&nbsp;</p>



<p>“To cede that kind of decision-making authority to a computer, is that what we want to do in our field?” he asked. “We can’t do any better than that? That’s not what I came here to do. I came here to bring me to this mix.”</p>



<p>Because of the rigidity of the coordinated entry system, service providers now can’t get their clients into their own housing programs, even when they can clearly see that doing so is the right course of action. Mary Kate Bacalao, the policy director at Compass Family Services, said her team worked with a client who needed a place to live after giving birth to a baby with special needs. Compass runs a housing program for pregnant women and new mothers, but because the scoring system put the woman in problem-solving status, they couldn’t place her. Social workers ultimately lost track of her.&nbsp;</p>



<p>HSH disagrees with criticism of the assessment. “Human decision-making is still very much a part of the process. The pandemic has shown that coordinated entry still very much incorporates human decision-making into the model,” wrote public information officer Denny Machuca-Grebe in response to questions for this story.&nbsp;</p>



<p>He went on to say that the coordinated entry system “strives to center client choice.” To a degree, it does. Families who have made it through the process and receive a temporary <a href="https://hsh.sfgov.org/wp-content/uploads/2017/10/HSH-Strategic-Framework-Full.pdf">rapid rehousing</a> rental subsidy can request a housing case review when a spot in permanent supportive housing becomes available. People who qualify for permanent supportive housing can also be offered up to three units, in the hope of finding one that matches their needs. But many people don’t get that far.&nbsp;</p>



<p>Service providers highlight that not all types of housing intervention will work for everyone. For example, a rapid rehousing subsidy may not be a good fit for people who need more support.</p>



<p>“The housing resolutions at the end of the assessment are a very cookie cutter approach,” said Geary at Central City Access Point. “We can't really pivot to triaging them to a housing intervention that would maybe better meet those needs and be more helpful to them to be able to achieve longer-term stability. We kind of just have to refer them to whatever is available.”&nbsp;</p>



<p>The first time Roxie went through coordinated entry several years ago, they were placed in a single-room occupancy hotel (SRO) in the Mission district. It was right after they were sexually assaulted.</p>



<p>“It was one of the worst places that you could put somebody that had just recently been assaulted. It was like a crack den SRO,” they said.&nbsp;</p>



<p>“I ended up leaving after two weeks. I just said, fuck it, I’m done.”</p>



<p>Meanwhile, people like Kenn Sutto are doing their best to operate within the system. When he conducts a primary assessment, he tries to make people feel as comfortable as possible. Even though he is supposed to ask the questions exactly as written, he is allowed to clarify and steers clear of terms like “substance abuse,” in favor of less stigmatizing phrasing.&nbsp;</p>



<p>“I'm not here to be a person that's reading off a computer screen,” he said. “My job is to be there for people experiencing homelessness.”</p>



<p>For Wilson, handing a decision as monumental as whether someone gets housing over to an algorithm just isn’t a sustainable or moral option.&nbsp;</p>



<p>“There were people in my life, particularly when I became homeless, who refused to turn away from me,” he said. “I refuse to do less than that. No computer is going to help me decide the worth of another human being, and who gets what, when and how much.”</p>
<p>The post <a href="https://www.codastory.com/authoritarian-tech/san-francisco-homeless-algorithm/">Who&#8217;s homeless enough for housing? In San Francisco, an algorithm decides</a> appeared first on <a href="https://www.codastory.com">Coda Story</a>.</p>
]]></content:encoded>
					
		
		<enclosure url="https://videos.files.wordpress.com/WhgvFZXd/divider.mp4" length="1247168" type="video/mp4" />

		<post-id xmlns="com-wordpress:feed-additions:1">23266</post-id>	</item>
		<item>
		<title>The hacker who spent a year reclaiming his face from Clearview AI</title>
		<link>https://www.codastory.com/surveillance-and-control/clearview-ai-facial-recognition-face-surveillance-tracking/</link>
		
		<dc:creator><![CDATA[Isobel Cockerell]]></dc:creator>
		<pubDate>Mon, 19 Jul 2021 11:58:51 +0000</pubDate>
				<category><![CDATA[Surveillance and Control]]></category>
		<category><![CDATA[Algorithms]]></category>
		<category><![CDATA[Artificial intelligence]]></category>
		<category><![CDATA[Facial recognition]]></category>
		<category><![CDATA[Q&A]]></category>
		<guid isPermaLink="false">https://www.codastory.com/?p=22645</guid>

					<description><![CDATA[<p>Matthias Marx has spearheaded an international campaign to place more controls on facial recognition technology </p>
<p>The post <a href="https://www.codastory.com/surveillance-and-control/clearview-ai-facial-recognition-face-surveillance-tracking/">The hacker who spent a year reclaiming his face from Clearview AI</a> appeared first on <a href="https://www.codastory.com">Coda Story</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>Matthias Marx is a hacker and researcher studying security systems from Hamburg, Germany. For a year, he pursued the controversial facial recognition company Clearview AI after lodging a complaint with the Hamburg Data Protection Authority that the company was using his biometric data without his consent.</p>



<p>Clearview says that it is changing the way police investigations operate by providing a searchable database of billions of faces. Its technology has been used by law enforcement agencies all over the world to track down potential criminals. However, the company has now been hit by legal complaints in five countries for violating citizens’ privacy.&nbsp;</p>



<p>In January, the Hamburg data protection authority ordered Clearview to delete the code that identified Marx’s face, saying that the technology violated European data protection rules. His campaign has been at the forefront of an international push by privacy activists, condemning the company and calling for more stringent controls on facial recognition tech. We spoke to Marx about his odyssey to get his face back.&nbsp;</p>



<p>Clearview AI did not respond to a request for comment for this story.</p>



<p><em>This conversation has been edited for length and clarity.</em></p>



<p><strong>For anyone who missed the Clearview AI story, could you sum it up for us?</strong></p>



<p>On most search engines, you can upload a photo and the engine will show you similar photos. But Clearview AI is different because it lets you search for specific faces. If I took a photo of you and uploaded it to Clearview, it would look for the same face on the internet. So Clearview AI has trawled the internet, looked for photos, identified all faces in those photos, and built a huge database.&nbsp;</p>



<p><strong>How did your quest to get your face back from Clearview start?</strong></p>



<p>The whole trip started in January 2020, when I read a New York Times <a href="https://www.nytimes.com/2020/01/18/technology/clearview-privacy-facial-recognition.html">article</a> about Clearview AI. They’d crawled more than three billion images on the internet, so I had reason to believe that photos of me might be among them. There are a few images of my face around, so I just asked them if they had any.</p>



<p><strong>Were you conscious about maintaining privacy online at that time?</strong></p>



<p>I care about my privacy online, so I usually don’t upload pictures of my face on the internet. I don’t use Facebook. But I did appear on the internet a few times, because I participated in student projects.</p>



<p><strong>You’re part of a campaign called “<a href="https://reclaimyourface.eu/">Reclaim your Face</a>.” What does that mean?</strong></p>



<p>At the moment we don’t really own our faces. There are already lots of biometric experiments out there that use, say, CCTV to process our faces without our consent. We need to do something if we want to claim our faces back, because at the moment, companies could just use our faces to identify us.&nbsp;</p>



<p><strong>What does Clearview AI say it’s doing?</strong></p>



<p>They say they’re just looking for public images on the internet. But the Clearview AI search engine poses a much bigger risk to everyone’s privacy. It makes it impossible to remain anonymous in the offline world.</p>



<p><strong>Why is remaining anonymous in the offline world important?</strong></p>



<p>I think it should be important to everyone. Under surveillance, we change our behavior. If I want to attend a protest, but I know it’s easy to be identified, I might decide not to go, even if it was completely legal to do so. Likewise, I might not want to go to the psychologist’s office, if I knew that I was being identified wherever I went.&nbsp;</p>



<p><strong>Is there any way to disguise yourself from the algorithm?</strong></p>



<p>That doesn't work. The algorithms are just too good at identifying faces now. I would have to change my nose, ears, and eyes to trick the algorithms.&nbsp;</p>



<p><strong>What happened after you sent the request to Clearview?</strong></p>



<p>I didn’t expect them to respond. But after a month, they told me they’d found pictures of my face twice on the internet. I was surprised. I didn't know those images even existed. It was scary to be part of this database because anyone could use Clearview AI to identify me, just based on my photo.&nbsp;</p>



<p><strong>Did the technology work perfectly?</strong></p>



<p>Actually, no. Clearview later sent photos of eight people from different parts of the world. It’s a good example of how those algorithms fail from time to time, and it’s dangerous to blindly believe their results.&nbsp;</p>



<p><strong>What happened at the end of the process?</strong></p>



<p>It took more than 12 months. Eventually, the Hamburg data protection authority ordered Clearview to delete the biometric mathematical hash value that describes my face.</p>



<p><strong>What's your message to someone who doesn't care that their face is part of the system?&nbsp;</strong></p>



<p>Maybe it's not a danger to them, but it could be a danger to their friends, their family, or to minorities. This tech may not be that dangerous when democracy is perfectly functioning. But times change, countries change, and this technology in the hands of a dictator is very dangerous.&nbsp;</p>
<p>The post <a href="https://www.codastory.com/surveillance-and-control/clearview-ai-facial-recognition-face-surveillance-tracking/">The hacker who spent a year reclaiming his face from Clearview AI</a> appeared first on <a href="https://www.codastory.com">Coda Story</a>.</p>
]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">22645</post-id>	</item>
		<item>
		<title>How Amazon’s algorithms push people towards extremist content</title>
		<link>https://www.codastory.com/disinformation/amazon-algorithm-extremist-literature/</link>
		
		<dc:creator><![CDATA[Erica Hellerstein]]></dc:creator>
		<pubDate>Thu, 13 May 2021 18:19:39 +0000</pubDate>
				<category><![CDATA[Disinformation]]></category>
		<category><![CDATA[Algorithms]]></category>
		<category><![CDATA[Brief]]></category>
		<category><![CDATA[Far-right disinformation]]></category>
		<guid isPermaLink="false">https://www.codastory.com/?p=21326</guid>

					<description><![CDATA[<p>A new report says the retail giant’s book recommendation algorithms direct people toward conspiracy theories and far-right, white nationalist content</p>
<p>The post <a href="https://www.codastory.com/disinformation/amazon-algorithm-extremist-literature/">How Amazon’s algorithms push people towards extremist content</a> appeared first on <a href="https://www.codastory.com">Coda Story</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>A recently released report highlights how Amazon’s book recommendation algorithms can lead people to literature about extremism, white nationalism, and conspiracy theories, including QAnon and Covid-19 disinformation.</p>



<p>The <a href="https://www.isdglobal.org/wp-content/uploads/2021/04/Amazon-1.pdf">study</a>, published in April by the Institute for Strategic Dialogue, a UK-based think tank researching extremism, analyzed Amazon’s book recommendation algorithms for white nationalist and far-right literature, coronavirus denialism, and conspiracies about QAnon and fraud in the 2020 U.S. presidential election.&nbsp;</p>



<p>The analysis found that the platform’s recommendation algorithms, which suggest titles and authors to people browsing through books, are relatively harmless for most users but can create a disturbing pipeline into extremist literature, directing customers toward conspiracy theories and white nationalist texts.</p>



<p>The study found Amazon’s algorithmic recommendations “could serve as a gateway into a broader universe of conspiracy theories and misinformation, or to increasingly radical far-right and white nationalist content."</p>



<p>Amazon’s recommendation algorithm suggests books to users by showing them what other customers who clicked on the book bought and viewed. This algorithm, according to the ISD, creates a feedback loop: people who click on a book about a conspiracy theory are fed recommendations for the same and related conspiracies, including books about QAnon, debunked claims about vaccines, and fraud in the 2020 U.S. presidential election.</p>



<p>A descent into this algorithmic dystopia could take you to the book <a href="https://www.amazon.com/HAMMER-Coup-Political-Crime-Century-ebook/dp/B08GCC689D/ref=sr_1_1?crid=2UQ1BVHNQF7AS&amp;dchild=1&amp;keywords=the+hammer+is+the+key+to+the+coup&amp;qid=1620812139&amp;s=books&amp;sprefix=the+hammer+is+the+%2Cstripbooks%2C213&amp;sr=1-1">page</a> for <em>The Hammer is the Key to the Coup</em>:<em> How Obama, Brennan, Clapper, and the CIA spied on President Trump, General Flynn ... and everyone</em> <em>else</em>, written by the <a href="https://mediamanipulation.org/case-studies/viral-slogan-hammer-and-scorecard">proponents</a> of a debunked election fraud conspiracy about the 2020 U.S. presidential election. The page suggests links to books about the Illuminati, the Rothschilds, the “Scamdemic,” Pizzagate, and aliens.</p>



<p>“I was shocked by the cornucopia of weirdness in the recommendations,” Elise Thomas, the author of the report and an ISD analyst, told me. She was especially alarmed to see books on the platform by the founder of the Order of Nine Angles (O9A), <a href="https://www.bbc.com/news/uk-51682760">a UK-based </a>Nazi satanic group that has been tied to a number of terror offenses, prompting calls to outlaw it as a terrorist organization.&nbsp;</p>



<p>“These beliefs are very extreme, and I was genuinely quite taken aback to see them just sitting on Amazon,” she said. “As a result of doing this research, I learned about other O9A texts I didn’t previously know about because of Amazon’s recommendations. I think that’s an example of the potential harm here.”</p>



<p>Other examples directing readers to extremist literature abound. Click on the book <a href="https://www.amazon.com/Whiteness-Original-Sin-Jim-Goad/dp/1729700411/ref=sr_1_1?crid=32R3I56931JG9&amp;dchild=1&amp;keywords=whiteness+the+original+sin&amp;qid=1620843090&amp;s=books&amp;sprefix=whiteness%3A+the+origin%2Cstripbooks%2C225&amp;sr=1-1"><em>Whiteness: The Original Sin</em></a>, and recommendations include titles about European ethnostates, “white identity politics,” and New World Order conspiracies.</p>



<p>Amazon recently removed the 1978 white supremacist <a href="https://www.nytimes.com/2021/01/12/books/turner-diaries-white-supremacists.html">novel</a> <em>The Turner Diaries</em>, but a search for the book will now direct people to <em>The Anarchist Cookbook</em>, a bomb-making guide published in 1971 by William Powell that has been<a href="https://www.npr.org/2017/04/03/522474967/documentarian-says-anarchist-cookbook-author-was-filled-with-remorse#:~:text=of%20Gravitas%20Ventures-,William%20Powell%20was%2019%20when%20he%20wrote%20The%20Anarchist,He%20died%20in%20July%202016.&amp;text=Since%20then%2C%20The%20Anarchist%20Cookbook,as%20other%20acts%20of%20violence."> linked</a> to acts of violence including the Columbine shooting and the Oklahoma City bombing.</p>





<p>An Amazon spokesperson said the company takes “concerns from the Institute for Strategic Dialogue seriously” and is “committed to providing a positive experience for our customers. Similar to other stores that sell books, we provide our customers with access to a variety of viewpoints and our shopping and discovery tools are not designed to generate results oriented to a specific point of view.”</p>



<p>While any conversation about banning books is complex, Thomas says an overhaul of Amazon’s algorithm for extremist books could offer one solution. This would involve turning off the automatic recommendation of books promoting conspiracy theories, disinformation, or extremist views.&nbsp;</p>



<p>“The books are still on the platform, you can find them if you search for them, but if you search for one and go to that page it’s not going to recommend you 20 others,” she said.&nbsp;</p>



<p>The ISD report is the latest investigation into Amazon’s search algorithms. Previous analyses have also examined the role played by the digital giant's search algorithms in leading customers into conspiracies and misinformation. A January 2021 <a href="https://arxiv.org/pdf/2101.08419.pdf">study</a> by researchers at the University of Washington, for example, revealed that nearly 10.5% of searches involving the term “vaccine” promoted books containing anti-vaccine conspiracies and health misinformation.</p>



<p>The post <a href="https://www.codastory.com/disinformation/amazon-algorithm-extremist-literature/">How Amazon’s algorithms push people towards extremist content</a> appeared first on <a href="https://www.codastory.com">Coda Story</a>.</p>
]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">21326</post-id>	</item>
		<item>
		<title>Tech company&#8217;s ties to white supremacism trigger debate on surveillance algorithms</title>
		<link>https://www.codastory.com/surveillance-and-control/banjo-artificial-intelligence-bias/</link>
		
		<dc:creator><![CDATA[Brett Bachman]]></dc:creator>
		<pubDate>Mon, 04 May 2020 17:45:21 +0000</pubDate>
				<category><![CDATA[Surveillance and Control]]></category>
		<category><![CDATA[Algorithms]]></category>
		<category><![CDATA[Artificial intelligence]]></category>
		<category><![CDATA[Dispatch]]></category>
		<category><![CDATA[Surveillance]]></category>
		<guid isPermaLink="false">https://www.codastory.com/?p=13851</guid>

					<description><![CDATA[<p>Revelations call for transparency in how artificial intelligence is used by law enforcement</p>
<p>The post <a href="https://www.codastory.com/surveillance-and-control/banjo-artificial-intelligence-bias/">Tech company&#8217;s ties to white supremacism trigger debate on surveillance algorithms</a> appeared first on <a href="https://www.codastory.com">Coda Story</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>The sudden suspension of a controversial multi-million dollar surveillance system used by several government agencies in Utah has opened up a debate about the lack of oversight for artificial intelligence systems in law enforcement.</p>



<p>Last week, the Utah Attorney General’s office suspended a $20.7 million contract with Banjo — a technology firm using government surveillance data to develop crime detection software — following revelations of the founder’s past membership of a white supremacist group.</p>



<p>Damien Patton, who serves as CEO of the SoftBank-backed company, was reportedly an active member of the Ku Klux Klan as a teenager, and participated in a 1990 drive-by shooting of a synagogue in suburban Nashville, according to<a href="https://onezero.medium.com/ceo-of-surveillance-firm-banjo-once-helped-kkk-leader-shoot-up-synagogue-fdba4ad32829"> the tech blog OneZero.</a></p>



<p>In a statement, a spokesperson for Utah Attorney General Sean Reyes said the office would move forward with an already planned third-party audit of the software to “address issues like data privacy and possible bias.” Reyes recommended that other state agencies do the same.</p>



<p>“The Utah Attorney General’s office is shocked and dismayed at reports that Banjo’s founder had any affiliation with any hate group or groups in his youth,” said the statement. “Neither the AG nor anyone in the AG’s office were aware of these affiliations or actions. They are indefensible.”</p>



<p>According to documents obtained by Coda Story under Utah’s Government Records Access and Management Act, Banjo’s <a href="https://utah-das-contract-searchsp.s3.amazonaws.com/full_contract_AR3205_AR3205%20Banjo%20Utah%20State%20Cooperative%20Contract%20(executed).pdf">contract</a> with the state gave the company live access to an unprecedented number of government data streams, including 911 calls, traffic and CCTV cameras, and location data for state vehicles.</p>



<p>Banjo’s real-time access to this vast amount of information employs artificial intelligence to alert first responders in dozens of agencies across Utah to crimes and other public safety threats as they happen. Before the suspension of the contract, the company’s technology was already in use by all of the state’s 29 counties, at least 23 cities and even campus police at the University of Utah,<a href="https://www.vice.com/en_us/article/k7exem/banjo-ai-company-utah-surveillance-panopticon"> Motherboard reported</a> last month.</p>



<p>The Utah Department of Public Safety said in a statement that it is also conducting a review of Banjo’s technology, though it and other agencies could not unilaterally suspend their agreements with the company.&nbsp;</p>



<p>“We have not suspended the current contract because the contract is with the Department of Administrative Services Division of Purchasing. We are discussing and reviewing the matter with them,” a spokesperson wrote.</p>



<p>Banjo announced last Wednesday that <a href="https://www.newsbreak.com/news/0Ov0n2xV/banjo-halts-data-collection-in-utah-after-reports-of-ceos-neo-nazi-past">it had suspended </a>all of the company’s data collection operations in Utah.</p>



<p>The revelations about Patton’s past ties to the white supremacist movement have prompted new calls for transparency in how artificial intelligence is used by law enforcement. Several prominent Utah lawmakers called for a public audit of Banjo’s contract long before this week’s news. They have now renewed their push for increased oversight of state data sharing with private companies.</p>



<figure class="wp-block-image size-large"><img src="https://www.codastory.com/wp-content/uploads/2020/05/KKK-Patton-banjo-surveillance-utah-police-1691x1200.jpg" alt="" class="wp-image-13906"/><figcaption>A photograph from an August 23, 1992 edition of The Tennessean. Damien Patton is third from left. Source of the photo: Newspapers.com</figcaption></figure>



<p>“I’ve been one of the only people asking questions about this for years now, and nobody seemed to know what the answers are — for so long I thought I was crazy,” said Democratic Utah Representative Angela Romero, who sits on the state’s Law Enforcement and Criminal Justice Committee. “A lot of really powerful people seem to have a vested interest in keeping the details of this program a secret.”&nbsp;</p>



<p>Until this week’s revelations, Patton’s colorful past was a selling point for the company in an industry noted for its culture of founder-worship. Patton, known for his long beard and eccentric dress sense, tells a story that begins with a homeless teenage runaway. His roundabout career also includes stints in the U.S. Navy, as a NASCAR pit mechanic and as a crime scene investigator.</p>



<p>This tale has been told in <a href="https://www.inc.com/magazine/201504/will-bourne/banjo-the-gods-eye-view.html">magazines</a>, <a href="https://blogs.wsj.com/digits/2015/05/06/banjo-raises-100-million-to-detect-world-events-in-real-time/">newspapers</a> and in pitch meetings and introductions at technology conferences for nearly a decade.</p>



<p>But there is also a darker side to it. OneZero, which reviewed thousands of pages of public court documents, wrote that Patton was also involved with white supremacist groups, and, at 17 years old, party to a synagogue shooting.</p>



<p>According to court records, Patton was driving the vehicle on the day a Klan leader shot out the windows of the West End Synagogue in Nashville, Tennessee. No one was injured in the incident. OneZero reported that two Klansmen were later convicted of crimes connected to the shooting and Patton pleaded guilty to acts of related juvenile delinquency.&nbsp;</p>



<p>Patton <a href="https://boingboing.net/2020/04/28/damien-patton-ceo-of-tech-sur.html">also testified </a>at the trial about his beliefs at the time. “We believe that the blacks and the Jews are taking over America, and it’s our job to take America back for the white race,” he said.</p>



<h2 class="wp-block-heading"><strong>Artificial intelligence and bias</strong></h2>



<p>The details of Patton’s past highlight the importance of oversight in how surveillance algorithms are written, to ensure they are as unbiased as possible, said Michael German, a fellow at the Brennan Center for Justice’s Liberty &amp; National Security Program, in a telephone interview.</p>



<p>“When information like this is brought to light, it also brings reform,” he said. “Unfortunately, law enforcement agencies often hide how their systems work, and the only opportunity for reform is a good deal later, when we have the data to back it up.”</p>



<figure class="wp-block-image size-large"><img src="https://www.codastory.com/wp-content/uploads/2020/05/Utah-police-AI-surveillance-Banjo-1800x900.jpg" alt="" class="wp-image-13907"/><figcaption>Banjo uses real-time government data and artificial intelligence to detect crimes. Illustration by Gogi Kamushadze</figcaption></figure>



<p>Ties to white nationalist ideology and alt-right groups have marred the reputations of several law enforcement technology firms recently. The founder of Clearview AI, the facial recognition startup used by more than 600 U.S. law enforcement agencies, including Immigration and Customs Enforcement and Customs and Border Protection, was <a href="https://www.huffingtonpost.co.uk/entry/clearview-ai-facial-recognition-alt-right_n_5e7d028bc5b6cb08a92a5c48?ri18n=true">recently found </a>to have links to far-right figures, including Mike Cernovich, who spearheaded the “<a href="https://apnews.com/e0d30f6da17348ce9f354bfd6cb5cd9a/'Pizzagate'-gunman-in-DC-sentenced-to-4-years-in-prison">Pizzagate</a>” conspiracy theory, and Andrew “weev” Auernheimer, webmaster of the neo-Nazi website “The Daily Stormer.” Several Clearview AI employees were also exposed for sharing extremist content and attending events with other white nationalists, including alt-right figurehead Richard Spencer.</p>



<p>Peter Thiel, who sits on the board of Facebook and runs the data mining giant Palantir, is also reported to <a href="https://www.buzzfeednews.com/article/josephbernstein/heres-how-breitbart-and-milo-smuggled-white-nationalism">be linked </a>to several alt-right figures, including the former Breitbart writer Milo Yiannopoulos.</p>



<p>While these companies and individuals have written off criticism of their connections to far-right and white nationalist groups as partisan attacks, such exposures point to a need to examine biases in AI systems, says German.</p>



<p>“Any system made by human beings will mirror or even amplify the biases that already exist in both law enforcement and society,” he said.<br></p>
<p>The post <a href="https://www.codastory.com/surveillance-and-control/banjo-artificial-intelligence-bias/">Tech company&#8217;s ties to white supremacism trigger debate on surveillance algorithms</a> appeared first on <a href="https://www.codastory.com">Coda Story</a>.</p>
]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">13851</post-id>	</item>
		<item>
		<title>How anti-vaxxers get around Instagram&#8217;s new hashtag controls</title>
		<link>https://www.codastory.com/authoritarian-tech/antivax-instagram-hashtag/</link>
		
		<dc:creator><![CDATA[Isobel Cockerell]]></dc:creator>
		<pubDate>Fri, 06 Dec 2019 12:23:36 +0000</pubDate>
				<category><![CDATA[Authoritarian Technology]]></category>
		<category><![CDATA[Algorithms]]></category>
		<category><![CDATA[Anti-vaccine]]></category>
		<category><![CDATA[Dispatch]]></category>
		<guid isPermaLink="false">https://www.codastory.com/?p=10183</guid>

					<description><![CDATA[<p>Anti-vax users are hijacking sexual assault and abortion rights hashtags to spread their message</p>
<p>The post <a href="https://www.codastory.com/authoritarian-tech/antivax-instagram-hashtag/">How anti-vaxxers get around Instagram&#8217;s new hashtag controls</a> appeared first on <a href="https://www.codastory.com">Coda Story</a>.</p>
]]></description>
										<content:encoded><![CDATA[
<p>In March 2019, a teenager from Ohio, Ethan Lindenberger, sat before a Senate committee in Washington. Wearing glasses, a suit and tie, his hair carefully combed to the side, he described how his mother had refused to vaccinate him. Her beliefs, which he began challenging at age 13, were reinforced by conspiracy theories she found online. “My mother would turn to anti-vaccine groups and social media looking for evidence in her defense,” he told the committee.<br></p>



<p>The Senate hearing had been called to examine the reasons behind the recent resurgence of preventable diseases. Across the United States, physicians are battling the worst measles outbreaks in decades. And globally, polio – which had been on the brink of eradication – is making a deadly comeback.<br></p>



<p>Responding to mounting pressure in the wake of Lindenberger’s speech, Facebook pledged to stop recommending vaccine disinformation on its platforms and make it harder to find anti-vax content in searches. But eight months on, evidence gathered by Coda shows that the anti-vax movement is flocking to Facebook-owned Instagram, gaming the app’s algorithm to spread anti-vaccine disinformation.<br></p>



<figure class="wp-block-image size-large"><img src="https://www.codastory.com/wp-content/uploads/2019/12/Anastasia-Gviniashvili-isobel-.jpg" alt="" class="wp-image-10188"/><figcaption class="wp-element-caption">18-year-old Ethan Lindenberger testifying before the Senate Committee in March. <br>Illustration by Anastasia Gviniashvili</figcaption></figure>



<p>Until the spring of this year, anti-vaxxers enjoyed near total freedom on Instagram. “They had a white flag where they could do anything they wanted,” said Anne-Julie Dionne, 22, a Quebec-based pro-vaccine campaigner who runs an account called @queenofvaccines.<br></p>



<p>In May, Instagram shut down some of the most popular anti-vax hashtags, including #vaccinescauseAIDS, #vaccineskill and #vaccinescauseautism. But anti-vaccine Instagram users have been getting around the controls by employing more than 40 cryptic hashtags such as #learntherisk and #justasking. </p>



<p>“It doesn’t sound as straightforward and can be misleading for families who are looking up that information,” said Alice, 29, a California pediatrician who runs the pro-vaccine account @vaccinesinvogue and wished to remain anonymous for fear of online abuse.<br></p>



<p>Tactics like spelling vaccines with a cedilla (vaççines) or using a bracket (va((ines) to try to avoid detection by Instagram also proliferate on the platform. “The same amount of misinformation is floating around, if not more,” Alice said. “It kind of forms this almost exclusive group of people who are using the same hashtag to distribute their information.”<br></p>





<p>Another popular tactic has been simply to take over hashtags that were initially used by other Instagrammers campaigning for abortion rights and sexual consent, such as pro-choice hashtags #righttochoose, #mybodymychoice and #bodilyautonomy. Many of these hashtags are now saturated with anti-vaccine propaganda.&nbsp;<br></p>



<p>A consent hashtag, #idonotconsent, has been particularly heavily co-opted — around three quarters of last month’s posts had an anti-vaccine agenda.<br></p>



<figure class="wp-block-image size-full"><img src="https://www.codastory.com/wp-content/uploads/2019/12/gogi_hashtags-1.gif" alt="" class="wp-image-10261"/></figure>



<p>Holly, a Canadian anti-vaxxer in New York with the handle @novaccinesnoworries, simply latched on to hashtags used by vaccine advocacy groups. “What accounts like mine have done is actually started using the pro-vax hashtags. I use #vaxwithme and #provax just to try and get the information out there, since our hashtags are being so heavily censored,” the 24-year-old told Coda on condition of anonymity.&nbsp;<br></p>



<p>Anti-vax users have increasingly turned to Instagram as they felt the effects of Facebook’s effort to deprioritize anti-vaccine content and stop recommending it to its users. “I find it a lot harder to share information on Facebook. Posts on Facebook get taken down a lot faster and it seems to be more monitored,” said Holly. “But with Instagram, while it’s still censored, it’s not as heavily done. Posts don’t really get removed that easily – even if someone reports them it’s really rare.”<br></p>



<p>A scroll within any of the hashtags currently used by anti-vaxxers quickly brings up widely-debunked pseudoscience, such as the <a href="https://www.cdc.gov/vaccinesafety/concerns/autism.html">non-existent link</a> between vaccines and autism, or the <a href="https://www.politifact.com/facebook-fact-checks/statements/2019/oct/29/instagram-posts/instagram-post-falsely-says-flu-shot-causes-fetal-/">false claim</a> that the flu shot causes fetal death. “Hundreds of babies were murdered in the name of vaccines,” <a href="https://www.vaccinateyourfamily.org/questions-about-vaccines/vaccine-ingredients/?acc=What%20are%20the%20ingredients%20in%20vaccines%20and%20why%20are%20they%20in%20there">falsely claims</a> a particularly prevalent post from last month.<br></p>



<p>“Don’t tell me you [sic] vaccinated kids turned out “fine” if they have asthma, autism, epilepsy, depression, diabetes, cancer, food allergies, ADHD, anxiety, eczema,” another popular and much-reposted disinformation meme claimed on the #idonotconsent hashtag in November.&nbsp;<br></p>



<p>Together, the hashtags contain more than 240,000 posts. Instagram declined to say what threshold of misinformation triggers the removal of a particular anti-vax hashtag, saying that disclosing it would weaken the company’s anti-misinformation defenses.&nbsp;<br></p>



<p>Instagram clarified that anti-vaccine content itself does not violate the company’s guidelines. “Anti-vaccine content isn’t against our policies and we won’t take action on a hashtag because it contains this type of content,” Stephanie Otway, a Facebook company spokesperson, said in an email. “If a hashtag contains a certain amount of vaccine misinformation, we may restrict it or block it altogether.”<br></p>



<p>After Coda shared a list of hashtags with Instagram that contained significant volumes of anti-vax content, the #vaççineskill, #noflushot, #learntherisk, #researchdontregret and #stoppoisoningyourkids hashtags were taken down. Dozens of anti-vax hashtags remain, and more are created and followed every week.<br></p>



<figure class="wp-block-image size-large"><img src="https://www.codastory.com/wp-content/uploads/2019/12/Untitled-design-75.png" alt="" class="wp-image-10249"/><figcaption class="wp-element-caption">Sexual consent Instagram hashtags have been flooded with anti-vaccine propaganda</figcaption></figure>



<p>For Instagram, “the challenge is, they’re trying to find a balance between bad information and freedom of expression,” said Renee di Resta, a disinformation expert and co-founder of advocacy group <a href="https://vaccinatecalifornia.org/">Vaccinate California</a>.&nbsp;<br></p>





<p>Another issue Instagram is dealing with is what di Resta terms “the asymmetry of passion.” “There’s a real asymmetry of passion on who involves themselves in the issue. The overwhelming majority of people who vaccinate have a perfectly positive experience,” she said, explaining that this meant they were unlikely to begin campaigning about vaccines. “It’s the equivalent of saying, I’m going to tweet about how the earth is round.”<br></p>



<p>Pro-vaccine accounts like Dionne and Alice’s tend to have a much harder time gaining followers – Alice gets around this by using anti-vax hashtags to post her vaccine advocacy content. Her posts, distinctively colored “millennial pink,” stand out amid the sea of anti-vaccine memes. “My goal is not to get lots of followers, my goal is just to put that information out there.”&nbsp;<br></p>



<p>Otavio Freire, president and co-founder of U.S.-based cybersecurity company Safeguard Cyber, said anti-vaccine content is a favored pressure point in disinformation campaigns run by foreign bots and trolls. “There are some topics that are perfect to create that divisiveness and anger. That’s what these memes are trying to do,” he said.<br></p>



<p>“The overwhelming majority of these accounts make Instagram behave in a certain way,” he added. “The algorithm’s being spoofed.”<br></p>



<p>Freire calls this practice “memetic warfare,” a term for modern information warfare waged through memes. “It’s digestible, it’s easy to look at,” he said. “It’s great for amplification of disinformation, and lo and behold – what platform is great for images – Instagram.”<br></p>



<p>Another Instagram feature that proves useful to anti-vaxxers is that the app does not allow users to post links anywhere except their bios, making it easy to spread unfounded information without citations or accountability.<br></p>



<p>“What I like about Facebook is you can actually post articles and links,” said Dionne, the pro-vax campaigner. “Instagram keeps people from opening an article and reading it. I’ll post my links anyway,” she said, but added that it was a source of frustration – anti-vaxxers would frequently ask her for “proof” that vaccines were safe.&nbsp;<br></p>



<p>In September, Facebook and Instagram introduced a further measure to limit disinformation and increase awareness around vaccines: when users search for vaccines, they are greeted with a pop-up asking if they would like to visit the Centers for Disease Control and Prevention or the World Health Organization instead. In a statement, WHO Director-General Dr. Tedros Adhanom Ghebreyesus <a href="https://www.who.int/news-room/detail/04-09-2019-vaccine-misinformation-statement-by-who-director-general-on-facebook-and-instagram">welcomed the feature</a>. “Major digital organizations have a responsibility to their users – to ensure that they can access facts about vaccines,” he said.<br></p>



<figure class="wp-block-embed is-type-rich is-provider-instagram wp-block-embed-instagram"><div class="wp-block-embed__wrapper">
https://www.instagram.com/p/B5u1xK2hSca/
</div></figure>



<p>Those studying the day-to-day vaccine movement were less optimistic. “Now Instagram is picking the winners and losers in society — I don’t know if that’s a better freedom of speech approach to the actual problem,” said Freire, who believes the priority should be fixing the app’s algorithms, which allow falsely amplified content to reign supreme.</p>



<p>Instead of visiting the WHO websites, users can opt to “see posts anyway,” upon which they’re greeted with a rash of the most popular anti-vaccine accounts. “It’s like ‘we can’t deal with the underlying symptom, so let’s try to mediate it by giving you a pop-up,’” said Freire. “It’s ultimately not attacking the cause but a symptom of disinformation.”</p>





<p><em>Anastasia Gviniashvili contributed research.&nbsp;</em><br></p>



<p><strong>Correction: </strong>In the original version of this article, Otavio Freire's name was misspelled, and his title was misidentified.</p>
<p>The post <a href="https://www.codastory.com/authoritarian-tech/antivax-instagram-hashtag/">How anti-vaxxers get around Instagram&#8217;s new hashtag controls</a> appeared first on <a href="https://www.codastory.com">Coda Story</a>.</p>
]]></content:encoded>
					
		
		
		<post-id xmlns="com-wordpress:feed-additions:1">10183</post-id>	</item>
	</channel>
</rss>
