Regular readers will be familiar with our coverage of how authoritarian technologies lurk around the infrastructure of smart cities. Our journalists have looked at how Western companies are aiding the surveillance architecture of smart cities in China; we have also detailed how technology is assaulting the lives of ordinary Zimbabweans.
A new book, “The Smart City in a Digital World” (Emerald Publishing), provides a good overview of some of the challenges faced by local and national policymakers who are under pressure to innovate and save public finances. As the author Vincent Mosco demonstrates, smart city solutions often involve the outsourcing of data gathering and other services to companies like Amazon or Google. At the same time, in its most insidious form, the technology can be used to surveil minorities like Uyghurs in China.
Mosco’s research shows that there are around 1,000 smart city projects in various stages of planning or development worldwide, and around half of these are located in China. India plans to build 100 new smart cities and rejuvenate another 500.
I recently interviewed Mosco and began by asking him if smart cities were like utopian cities of the past. What follows are highlights of our conversation which have been edited for length and clarity.
Are the smart cities of today any different from the kinds of fiefdoms we have seen in the past – for example, Henry Ford’s Fordlandia, Disney’s Celebration or the large infrastructure projects built by planners like Robert Moses?
I think they are similar in certain specific respects. They embody a kind of private master builder approach, whether it is Ford or Disney or a tech company like IBM. The smart city movement represents a prominent viewpoint in American urban history: that it takes great men, and it tends to be men, unencumbered by governments and regulation, to build truly great cities. Companies like Google are the master builders of the digital world, and unlike in the past, they are not building an industrial society but an informational one.
Builders of large infrastructure projects in the past often looked to governments or officials as partners. Is that still true?
I don’t believe tech companies see it that way anymore. There is a sense of hubris, a belief that governments have gotten it wrong in the past and that it takes private innovation to get it right. The key difference with master builders in the industrial era was the sense that governments would be close partners to industrial leaders. While corporate executives would take the lead, the work would be done with government help and with regulation that would assist and legitimize it. Today, private entrepreneurs take the lead. People like Peter Thiel and Elon Musk look on their projects as a way to get away from governments.
Turning more directly to smart cities, we often see invasive digital technologies rolled out ahead of big events as a form of policing. I’m thinking of how Brazil and Russia used camera systems ahead of their FIFA World Cups, as well as the 2016 Summer Olympics. The technology then stays around long after the event has moved on.
One of the key selling points of this technology is the ability to better root out crime and manage police and security forces. This began years ago when IBM built an operations center in Rio de Janeiro ahead of the World Cup. Smart city technology was installed on the basis of saving money and better managing part of a city, but underlying it all was facial recognition, which became a tool for mass surveillance.
One of the points in my book is that smart city solutions in New York first grew out of an interest in bringing the 2012 Olympics to the city, and the redevelopment of huge swathes of the city that they spurred caused more privatization and gentrification.
Much has been written about biases in algorithms. What do you think can be done to counter this?
In the smart city and in the digital world, if we are going to make use of algorithms, we need to make the process of developing and using them much more transparent. We rely on tech experts who know very little about the racial biases built into the systems. We need to open this up to access by private citizens. This doesn’t necessarily provide us with the solution.
Technologies tend to embody the societies and social divisions in which they are used. We need to recognize from the start that the algorithms we deploy are biased. This will require more regulation, and it is no surprise that the technology industry would resist. My experience in this area shows that the communications and tech industry, starting with the introduction of the telegraph, has resisted regulation. There is nothing new about the resistance of Google, Amazon and Facebook to these efforts.
Can ordinary people do anything?
Urbanites need to take back their cities. We need to do this soon, before these technical systems are so influential in decision making that it would be difficult to redesign them to make them more human.
More and more people are coming to recognize that putting down devices alone will not be enough. What makes smart cities particularly interesting, given how Google’s Sidewalk Labs is setting up its projects, is that they will be tracking people simply by virtue of their being in the area. Cameras, point-of-sale registers, energy systems, communications, scanners, transportation, sidewalks and street lamps will monitor all of your activity. You don’t have any opportunity to sign up for this level of surveillance. With a website, we can click the “I Agree” box. In smart cities, that opportunity won’t be there.
Do you worry that populations in countries like India or China may be more susceptible to control from smart cities?
We need to be deeply concerned about this. Singapore is becoming the laboratory for smart city development. China has taken a page out of Singapore’s story by applying technologies that are quite authoritarian, like the social credit system, which keeps track of all of one’s activities and uses them as an index of citizenship or worthiness. Being surveilled by it, whether at a demonstration or not, can impact your social credit score, government benefits, schooling, etc. India is different in that, while the state is involved in the funding, a lot of private companies are involved in overseeing the projects.
Some people have gone so far as to call for a halt to the sales of authoritarian technology. Do you think a moratorium would be helpful?
It may not go far enough, in my view. Authoritarian regimes will be defined narrowly to include nations like China. But we are seeing the rise of authoritarian tendencies in countries like the United States and Britain. I am concerned about Google and Amazon assisting the reprehensible immigration system in the U.S. where tech companies are rooting out refugees and sending them to their deaths back in their home countries. I think a halt is a good start, but we need to recognize that China and Saudi Arabia are not the only governments to be concerned about. Western governments need to be examined as well.
- Earlier this week, I visited a new exhibition at the Tate Modern museum here in London. “Higher Resolution”, presented by a number of creators, including Romy Gad el Rab and Caroline Sinders, simulates platforms like Twitter and Facebook in everyday settings. Visitors are encouraged to sit in a public “living room,” a “town hall,” and even a “loo” and speak as loudly and as emphatically to strangers as they might on a social media platform. The British participants seemed to embrace the idea, albeit with some initial hesitancy. The exhibition includes curated playlists and informative talks about subjects like artificial intelligence and the feminist internet.
- In the forthcoming weeks, we are going to be looking at the use of facial recognition in public spaces like schools and private housing. This story looks at the backlash to the use of facial recognition systems in public housing in Detroit.
One of our highlights this week was this piece about how a global network of Uyghurs living outside of China are digging deep into the popular social media app TikTok (they are using the Chinese version, called Douyin) to uncover information about life in Xinjiang. From footage of demolished mosques to video of long lines of Uyghurs passing through security checkpoints, the story explains the extent of China’s determination to control its minority populations.
The story you just read is a small piece of a complex and ever-changing storyline we are following as part of our coverage. These overarching storylines, whether the disinformation campaigns feeding the war on truth or the new technologies strengthening growing authoritarianism, are the crises that Coda covers relentlessly and with singular focus. But we can’t do it without your help. Today, you have the opportunity to double the impact of your support for Coda Story. From now through the end of 2020, a year’s worth of monthly payments or a one-time contribution will be matched, up to $5,000. Support journalism that stays on the story.