🔮 Algorithmic decision-making; social credit; the green golden age; the best code, web tracking, colours++ #240
Rethinking welfare, before we automate it
Azeem Azhar | Oct 20, 2019
Hi, I’m Azeem Azhar. I’m exploring how our societies and political economy will change under the force of rapidly accelerating technology and other trends. Enjoy Exponential View in a comfy chair, with a cup of coffee (or tea)!
Please hit the ❤ to like this edition. Or leave a comment, especially if you are working in the area of algorithmic decision-making systems. Would love to hear your experiences.
💥 This issue has been supported by our partner, Toptal
The future is AI, but recruitment for your AI team is probably stuck in the past. Reduce your time to hire by working with Toptal to find AI-focused business and technology experts. Toptal pre-screens hundreds of thousands of candidates so that you’re matched with the right expert to tweak your algorithm, or even a full team to build out your custom AI solution. Build the future of AI with the world’s best talent, build with Toptal.
P.S. Not yet AI ready? We've got Big Data experts too!
Dept of the near future
🍀 Every technological revolution brings with it new social and economic dynamics, building up a socio-technical infrastructure for how society functions. The tricky thing is how to steer that process for the better, as I discuss in my conversation with the economist Carlota Perez. An optimistic chat on why and how we might achieve sustainable green growth.
🔎 ‘Publics are not natural. They don’t exist in the wild. We make public life.’ Mike Ananny unpacks the concept of tech platforms as the new public square, and what it means for ‘the public’ to be built on top of platforms owned and controlled by private corporations. In a similar vein, K. Sabeel Rahman argues that the power and influence of the large tech platforms puts them in a separate class from ordinary corporations: they are the new social and economic infrastructure and must be regulated as such.
🤳 EV reader Dev Lewis has studied China’s city-level social credit programs, using case studies of the cities of Xiamen and Fuzhou. Citizens receive algorithmically-derived personal scores based on ‘adherence to laws, promise keeping, and credit in daily life’ according to government data. The programs ‘mark the entry of the government in the business of scoring citizens however implementation so far reveals a very basic attempt with numerous gaps and question marks, but a far cry from the Western media picture of an all-encompassing score enabled by mass surveillance.’ (Another granular view of the apps making use of social credit in China is available here. See also, how China’s social credit system may extend to foreign companies and how they approach politically sensitive issues.)
💯 Is data a private good or a public commodity? What counts as ‘data’ anyway? Ideas about what data is, who it belongs to and how it should be used, are grounded in their historical and cultural context, argues Sabina Leonelli.
Dept of algorithmic government
Automated algorithms are coming to government services. The promise is straightforward: faster, lower-cost, more even-handed decision-making. It is an understandable promise. Welfare systems are expensive. Across the OECD, public social spending averages around 20 percent of GDP (31 percent in France, 21 percent in the UK, 19 percent in the US).
But the reality of these algorithmic systems is rather different from the promise.
In the name of efficiency, they are having the (completely expected) effect of harming the vulnerable groups they are meant to help, or resulting in prejudiced decisions with little hope of recourse. Computer says no.
The best way to understand some of the challenges is Karen Hao’s exceptional interactive essay on the COMPAS criminal justice scoring system, which I have written about frequently. ‘Can you make AI fairer than a judge?’ lets you play with the popular predictive algorithm and explore the issues of bias.
Fundamentally, there is a conflict between definitions of fairness that gets embedded in the data set, and there are no objective criteria to get you out of it:
Whenever you turn philosophical notions of fairness into mathematical expressions, they lose their nuance, their flexibility, their malleability.
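The tension is easy to demonstrate with a toy example (the groups, score buckets and headcounts below are invented for illustration, not drawn from COMPAS data). A score can be ‘calibrated’, meaning each score level predicts the same reoffence rate in both groups, yet if the groups have different underlying base rates, flagging everyone above a threshold still produces very different false positive rates:

```python
# Two groups with calibrated risk scores but different base rates.
# Each bucket is (score / reoffence probability, number of people);
# per-bucket reoffence rates are identical across groups (calibration),
# but group A has more people concentrated in the high-risk buckets.

def fpr(buckets, threshold=0.5):
    """False positive rate: flagged non-reoffenders / all non-reoffenders."""
    flagged_negatives = sum(n * (1 - p) for p, n in buckets if p >= threshold)
    total_negatives = sum(n * (1 - p) for p, n in buckets)
    return flagged_negatives / total_negatives

group_a = [(0.2, 100), (0.5, 200), (0.8, 300)]  # higher base rate
group_b = [(0.2, 300), (0.5, 200), (0.8, 100)]  # lower base rate

print(f"FPR group A: {fpr(group_a):.2f}")  # ~0.67
print(f"FPR group B: {fpr(group_b):.2f}")  # ~0.33
```

Both groups are scored by exactly the same calibrated model, yet innocent members of group A are flagged at twice the rate of group B. Fixing that disparity would mean breaking calibration: you must choose which notion of fairness to sacrifice.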
Philip Alston, the UN’s special rapporteur on extreme poverty and human rights, highlights how many efforts ostensibly aimed at boosting efficiency allow ‘neoliberal economic policies [to] seamlessly blend into what are presented as cutting edge welfare reforms.’
Instead of ‘obsessing about […] market-driven definitions of efficiency’, Alston argues that the focus should be on using technology to support a higher standard of living for those dependent on welfare.
This report acknowledges the irresistible attractions for governments to move in this direction, but warns that there is a grave risk of stumbling zombie-like into a digital welfare dystopia.
Alston gives an example of how a Dutch welfare surveillance system discriminates against the poorest members of society:
This system can have a hugely negative impact on the rights of poor individuals without according them due process. The problem is made much worse by the fact that when the program was eventually given a legal basis, there was virtually no debate in the Dutch Parliament, no media attention, and virtually no transparency in how the system works. The result is that its assumptions remain a mystery even to those who have studied it at length.
The Guardian identifies efforts in the US, Britain, India and Australia to digitise welfare systems, pointing out that
[t]he most glaring similarity is that all this is happening at lightning speed, with hi-tech approaches sweeping through social services, work and pensions, disability and health, often with minimal public debate or accountability.
‘Activists have tracked 13 cases in Jharkhand [India] where people who were refused support due to Aadhaar [India’s biometric system] glitches have allegedly died of starvation’, said one analysis of the Aadhaar system.
Pansure Murmu, who runs a ration shop in Jammu, said Aadhaar worked well on the whole, but she admitted the network signal was a problem. “We have to move the machine one or two kilometres to get the signal,” she said. Others have been known to take their customers to a hilly mound to get a signal for internet access. Residents in Lurgumi Kalan village in Mahuadanr, also in Jharkhand, went without rations for two months this summer because there was no connection.
In the UK, nearly a third of the 408 councils across the country have purchased software to automate decisions about benefit claims, child abuse and other issues.
There is a whole slew of issues at large here:
We don’t have good mechanisms for public scrutiny and accountability of these initiatives. Localised decisions end up creeping in scale and scope without robust investigation of the impact on rights and non-discrimination.
Algorithmic systems may deprive subjects of a clear explanation as to why a decision was made. Although a ‘right to explanation’ exists in the European Union’s General Data Protection Regulation to promote fairness and accountability, ‘the precise contours of this protection are less than clear.’
Governments are not the most sophisticated buyers of technology. They probably don’t have up-to-date standards by which to test whether the tools are up to the task. Nor will they have the skilled workers who can robustly run these assessments.
Replacing humans with technical systems owned by third parties can reduce the state’s ability to adapt or respond to unexpected changes in the environment.
It constructs a slippery slope towards data-driven governance, and a route into social credit style management. Every algorithmic decision-making system in welfare needs to be tied to a user ID. The temptation to maintain those histories and potentially join those interactions across multiple systems will be great—and dangerous without appropriate safeguards in place.
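That joining risk is worth making concrete. Once two unrelated welfare systems key their records to the same citizen ID, merging them into a combined profile that neither system was designed to hold is trivial. This is a hypothetical sketch (the systems, field names and ID are invented):

```python
# Two hypothetical, separately-run welfare databases keyed by the same
# citizen ID. Joining them is a one-liner — which is exactly why shared
# IDs across systems are dangerous without safeguards.

housing = {"id-123": {"benefit": "housing", "arrears": True}}
health = {"id-123": {"missed_appointments": 4}}

# Merge every record that appears in either system into one profile.
profile = {uid: {**housing.get(uid, {}), **health.get(uid, {})}
           for uid in housing.keys() | health.keys()}

print(profile["id-123"])
# A combined dossier: housing arrears plus health attendance history.
```

Nothing technical stands in the way of this join; only policy and law do. That is why the safeguards have to be designed in before the systems are deployed, not after.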
Welfare systems in many countries may well be broken. Not because they are inefficient. Not because they may defy explicability. Not because they might muddle the descriptive with the normative. Or confuse cause-and-effect with correlation.
Welfare systems, especially in the West, struggle because they are, as EV reader Hilary Cottam argues, organised in a hierarchical set of verticals, like industrial systems, which can’t adjust to meet today’s challenges. Putting in more rigid algorithmic systems to reinforce those broken systems can’t solve the problem. It can only make it worse.
Rather, we need to start rethinking our welfare systems before we automate them. Hilary’s book, Radical Help, is highly readable and offers some robust, field-tested ideas for just that. This essay in The Economist last year also describes some approaches to redesigning welfare systems for a time of ageing populations, mass immigration and the gig economy.
We’re understandably excited by the possibility of applying breakthroughs in machine learning and data science to such important issues as welfare. However, complex systems problems, especially those that involve human systems, are rarely resolved by chucking software at them.
Worse, we might find ourselves unknowingly moving from a system based on rule of law, rights, due process and protections to one based on analytics, overall rather than individual outcomes, and a strict utilitarianism—without an adequate democratic discussion of whether that is what we want.
The signs are that we are going about it the wrong way right now.
India’s early efforts to build a facial recognition system to help police have given a fillip to finding missing children, identifying nearly 11,000. But new plans call for a vast increase in the deployment of CCTV cameras, and a national facial recognition system may become a tool of aggressive social policing, especially given the nation’s wobbly personal data protection laws.
Watch I, Daniel Blake, which chronicles the agonizing journey of its eponymous lead through a byzantine welfare process.
Tom Loosemore’s special edition of Exponential View explores good practices for governments looking to use digital platforms.
The US military will stop using floppy disks in its nuclear weapons systems.
🌡️ Climate breakdown: 409.02ppm | 3,869 days
Each week, we’re going to remind you of the CO2 levels in the atmosphere and the number of days until reaching the 450ppm threshold.
The latest measurement (as of October 18): 409.02ppm; October 2017: 403.64ppm; 25 years ago: 360ppm; 250 years ago, est: 250ppm. Share this reminder with your community by forwarding this email or tweeting this.
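For the curious, here is roughly how such a countdown can be estimated. This is a naive back-of-envelope that assumes concentrations keep rising at the constant linear rate implied by the two October readings quoted above; the newsletter’s own countdown uses its own model (likely accounting for accelerating growth), so the figures differ:

```python
# Naive linear extrapolation of days until atmospheric CO2 hits 450ppm,
# using the two October readings quoted above. Illustrative only: real
# CO2 growth is accelerating, so a constant-rate estimate is optimistic.

ppm_now, ppm_two_years_ago, target = 409.02, 403.64, 450.0

rate_per_year = (ppm_now - ppm_two_years_ago) / 2   # ~2.69 ppm/year
years_left = (target - ppm_now) / rate_per_year
days_left = years_left * 365.25

print(f"~{days_left:.0f} days at a constant {rate_per_year:.2f} ppm/year")
```

Even this generous assumption leaves only about 15 years of headroom.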
👉 The earliest academic paper to link CO2 with global warming was published in 1856 by Eunice Newton Foote. Because she was a woman, Foote was barred from reading the paper describing her findings at the 1856 meeting of the American Association for the Advancement of Science and her paper was passed over for publication in the Association’s annual Proceedings.
💲 Australia’s leading scientific research agency says that wind and solar, even with the added costs of several hours of storage, are now unequivocally cheaper than new coal generation.
Qatar has begun air-conditioning outdoor spaces like football stadiums and shopping arcades in an attempt to maintain normal life in the face of spiralling and increasingly deadly temperatures.
Short morsels to appear smart at dinner parties
Slate asked 75 experts for their opinions on which pieces of computer code changed history. They picked 36, and also highlighted something interesting: ‘The most consequential code often creates new behaviors by removing friction.’
🚶 Google is walking away from its Daydream virtual reality platform. It turns out that people don’t like putting their phones in headsets.
🎓 Applications to America’s elite MBA programs are declining.
The world’s biggest 3D printer is capable of producing objects up to 100 feet long by 22 feet wide by 10 feet high. (I analysed the impact of 3D printing on world trade a couple of weeks ago.)
🔗 Here’s an interesting overview of projects working on the tricky issue of blockchain interoperability.
🛰️ SpaceX might expand to 30,000 satellites, in a sign of growing competition in the low-Earth satellite broadband market. The increasingly crowded low Earth orbit has the International Astronomical Union concerned about the impact of satellite ‘constellations’ for researchers, whilst Amazon has highlighted the risks of collision with space debris.
Dark web hackers tried to expose a child abuse site years before the government shut it down.
😮 With no experimental tests at all, researchers used AI to develop a new supercompressible material.
🍭 All of the languages studied appear to have colour words that draw from the same 11 basic categories.
The world’s top ten cities in 2035.
This video shows the frankly insane number of data trackers used on modern websites to support their advertising businesses.
We’re experimenting with a slightly different format of the wondermissive right now. I’m putting a longer-form essay at the heart of the newsletter. This will mean that each EV will be a little narrower. This one, for example, has very little to say about deep tech, AI research, trends in the startup community, and so on. I’ll cover topics like that in other weeks.
We’re trying to be even more selective around the Dept of the near future to make everything less overwhelming. We’re also taking the brave step of putting images in the newsletter too...
I also know that the What you’re up to section is exceptionally popular. Have a glance below. You, dear reader, are awesome. Keep it up. And keep telling us what you’re up to.
I’ll ask for constructive feedback on these changes in a couple of weeks.
Finally, we’re on a sprint to hit 50,000 signups by the end of this year. You can help by forwarding this to a few friends with a recommendation.
🍬 P.S. Premium members of EV support this newsletter and get more as a result. Sign-up here.
What you are up to—notes from EV readers
Gerald Huff was a friend of EV. He passed away last year. His family has set up the Gerald Huff Fund for Humanity to raise awareness of technological unemployment and UBI.
Eleanor Nell Watson’s report on automated externality accounting.
Investor, Brian Laung Aoaeh, on why the world is a supply chain. (Good read.)
Congrats to Marie Steinthaler for her new role as Head of Asia Pacific for open banking platform, True Layer.
Giedrimas Jeglinskas writes on how states achieve effective deterrence.
Matt Stoller for the NYT: Tech companies are destroying the free press.
Congrats to John Bradford for joining Dynamo Ventures as a Partner!
Francesco Cavalli is one of the authors of The State of Deepfakes report.
Azeem was quoted in the WSJ article about the impact of AI.
Roy Bahat announced Bloomberg Beta Fund 3, a $75m seed fund.
Brian Aoaeh profiles Wolfgang Lehmacher, author of The Global Supply Chain: How Technology and Circular Thinking Transform Our Future.
Matt Clifford of Entrepreneur First schooled The Apprentice’s Alan Sugar about whose organisation did more for the economy.
Daniel Stanley invites you to Mozilla Festival from 21 - 27 October, in London.
Tom Wheeler writes for Time: The Presidential Candidates Need a Plan for Big Tech That Isn’t “Break Up Big Tech”
Precious Maskoli shares his new newsletter about email marketing.
Paul Cuatrecasas’s book Go Tech, Or Go Extinct is out.
Rafael Kaufmann writes about upgrading business to an open operating system.