Learning Machines: Algorithmic Dreams

The apparent brains behind the Johnson ministry, Dominic Cummings, has been ruffling feathers with a blog post calling for ‘data scientists, project managers, policy experts, assorted weirdos’ (tinyurl.com/yx3yrpd8). Tellingly, the positions are advertised via his blog rather than the usual civil service recruitment routes, and applicants are asked to apply to a Gmail address. Cummings has form for trying to use personal accounts to circumvent freedom of information rules. This is all part of his ‘disruptive’ persona, looking to shake up the stuffy old civil service and get things done:

‘We need some true wild cards, artists, people who never went to university and fought their way out of an appalling hell hole, weirdos from William Gibson novels like that girl hired by Bigend as a brand “diviner” who feels sick at the sight of Tommy Hilfiger or that Chinese-Cuban free runner from a crime family hired by the KGB. If you want to figure out what characters around Putin might do, or how international criminal gangs might exploit holes in our border security, you don’t want more Oxbridge English graduates who chat about Lacan at dinner parties with TV producers and spread fake news about fake news’.

It is a bit strange to complain about liberal arts graduates by citing a work of fiction, and, tellingly, Cummings remembers the name of Hubertus Bigend, the uber-capitalist anti-hero of Gibson’s Blue Ant series, but not that of Cayce Pollard (‘that girl’).

He also wants data scientists to look at models of system change and dynamic modelling of viral information. Seemingly, Cummings wants to bring systems analysis and mathematical thinking into the heart of government. He’d have to be prepared to pay well, since people with such skills can find lucrative work in automated trading and business analysis in the City.

Maths not everything

The mathematics populariser Dr Hannah Fry took issue with Cummings’ call:

‘There is some truth to this – there are a host of government questions that could benefit from a more mathematical take. In everything from bin collection timetables to Brexit policy, I’d love to see more decisions made on the basis of evidence over instinct. The big-data revolution has transformed the private sector, and I wholeheartedly believe it has the potential to profoundly benefit broader society too’ (Guardian, 5 January).

She points out that there are limits to using maths to model society, and that an understanding of humans must go alongside any such scientific modelling.

As Noam Chomsky has pointed out:

‘A vision of a future social order is in turn based on a concept of human nature. If in fact man is an indefinitely malleable, completely plastic being, with no innate structures of mind and no intrinsic needs of a cultural or social character, then he is a fit subject for the “shaping of behavior” by the state authority, the corporate manager, the technocrat, or the central committee. Those with some confidence in the human species will hope this is not so and will try to determine the intrinsic human characteristics that provide the framework for intellectual development, the growth of moral consciousness, cultural achievement, and participation in a free community’ (Language and Freedom).

Fry herself had addressed these questions in the Royal Institution Christmas Lectures (which should be available via YouTube by the time this article goes to print). In one of them she discussed the:

‘decade in which we learned the lessons of charging ahead without first carefully thinking about the ethics of forcing equations on to human systems. There were the stories about racist algorithms in the criminal justice system, and sexist algorithms designed to filter job applications. YouTube was accused of unwittingly radicalising some of its viewers. Indeed, some would argue that the world is still reeling from the consequences of mathematical equations gone awry, both during the time leading up to the 2008 financial crash and Facebook failing to consider the consequences of its newsfeed algorithms’.

She also explained, however, how algorithms are being taught to learn through deep reinforcement learning: they are ‘rewarded’ for successfully finding the correct result in their task and ‘punished’ for failing. She demonstrated the principle with a pile of matchboxes that had been ‘taught’ to play noughts and crosses.
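
For the technically minded, the matchbox idea can be sketched in a few lines of Python (a hypothetical toy with invented bead counts and rewards, not Fry’s actual prop): every board position gets a ‘matchbox’ holding beads for each legal move, a move is drawn in proportion to its beads, and beads are added after a win and removed after a loss, so winning play gradually becomes more probable.

```python
import random
from collections import defaultdict

EMPTY, X, O = ' ', 'X', 'O'
LINES = [(0, 1, 2), (3, 4, 5), (6, 7, 8),
         (0, 3, 6), (1, 4, 7), (2, 5, 8),
         (0, 4, 8), (2, 4, 6)]

def winner(board):
    """Return 'X' or 'O' if someone has three in a row, else None."""
    for a, b, c in LINES:
        if board[a] != EMPTY and board[a] == board[b] == board[c]:
            return board[a]
    return None

class MatchboxPlayer:
    def __init__(self):
        # one 'matchbox' per board position: a dict of move -> bead count
        self.boxes = defaultdict(dict)

    def choose(self, board):
        key = ''.join(board)
        box = self.boxes[key]
        for move, cell in enumerate(board):
            if cell == EMPTY:
                box.setdefault(move, 3)       # every legal move starts with 3 beads
        moves, beads = zip(*box.items())
        return key, random.choices(moves, weights=beads)[0]

    def learn(self, history, reward):
        # a positive reward adds beads to each move played this game,
        # a negative reward takes beads away (never dropping below one)
        for key, move in history:
            self.boxes[key][move] = max(1, self.boxes[key][move] + reward)

def play(player):
    """One training game against a random opponent; returns the winner or None."""
    board, history = [EMPTY] * 9, []
    for turn in range(9):
        if turn % 2 == 0:                     # the matchbox player is X
            key, move = player.choose(board)
            history.append((key, move))
            board[move] = X
        else:                                 # the opponent plays at random
            board[random.choice([i for i, c in enumerate(board) if c == EMPTY])] = O
        win = winner(board)
        if win:
            player.learn(history, +3 if win == X else -1)
            return win
    player.learn(history, +1)                 # a draw earns a small reward
    return None

player = MatchboxPlayer()
for phase in range(5):
    results = [play(player) for _ in range(2000)]
    print(f'after {(phase + 1) * 2000} games: '
          f'{results.count(X)} wins, {results.count(O)} losses')
```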

The problem, as with the racist and sexist algorithms, is that if the original data used to teach the algorithm is skewed, then what it learns will be skewed too. Fry, in her Guardian article, linked to an example of how Amazon’s recruitment algorithm had learned that most successful applications came from men, and thus decided that being male was a desirable quality in an applicant (reut.rs/3af79FS).
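
How little ‘intelligence’ that takes can be shown with a toy model (invented figures, and nothing like Amazon’s real system): a naive scorer trained only on who was hired in the past will faithfully reproduce whatever skew the past contains.

```python
from collections import Counter

# invented historical records of (gender, was_hired) -- the skew is deliberate
history = ([('M', True)] * 80 + [('M', False)] * 120 +
           [('F', True)] * 10 + [('F', False)] * 90)

hired = Counter(gender for gender, was_hired in history if was_hired)
total = Counter(gender for gender, was_hired in history)

for gender in ('M', 'F'):
    print(f"learned 'hire' score for {gender}: {hired[gender] / total[gender]:.2f}")

# Output: 0.40 for M, 0.10 for F. A scorer built on these figures rates
# otherwise identical CVs differently by gender, because the skew in past
# hiring, not merit, is what it has learned.
```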

Resource allocation

Deep reinforcement learning, though, is a powerful tool. AlphaGo defeated one of the world’s top human players at the immensely complex boardgame Go after teaching itself largely through self-play (this is radically different from the way human programmers helped Deep Blue to defeat Garry Kasparov at chess).

‘Systems employing this approach have already been used to optimise system performance in areas including resource management, device payment optimisation and data centre cooling’ (Gemma Church, The maths problems that could bring the world to a halt, https://tinyurl.com/s72y323).

Such systems are beginning to solve problems that have hitherto been considered computationally intractable: involving too many permutations and requiring too much processing time to be realistically solved using traditional computing techniques.

The capacity of such algorithms, Church points out, opens up the possibility of using them for resource allocation, actively supplying human wants in real time:

‘Over the last few decades, researchers have developed a range of pretty effective mathematical solutions that can allocate resources across a variety of industries and scenarios so they can attempt to keep up with the daily demands our lives place on them. But when an allocation made at one time affects subsequent allocations, the problem becomes dynamic, and the passing of time must be considered as part of the equation. This throws a mathematical spanner in the works, requiring these solutions to now take into account the changing and uncertain nature of the real world’.

Amazon is already using the data it collects to train algorithms to anticipate what it needs to stock and ship. It has to solve optimisation problems, trying to maximise use of its warehouses while minimising its delivery routes and matching these against courier availability, flights, trains and so on. The trick with such immensely complex problems is not to solve them absolutely, but to approximate as closely as possible in a computationally realistic amount of time. It is this non-absolute characteristic that enables algorithms to steadily improve. The solution to the millions of variables in a delivery system is to look for the good enough, not the perfect.
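
The trade-off can be sketched in a few lines of Python (the depot, stops and distances below are invented, and a real logistics system juggles vastly more constraints): a greedy ‘nearest-neighbour’ heuristic picks a route that is merely good enough in an instant, while the exhaustive search for the perfect route is already straining at a mere eight stops.

```python
import itertools
import math
import random

random.seed(1)
depot = (0.0, 0.0)
stops = [(random.uniform(-10, 10), random.uniform(-10, 10)) for _ in range(8)]

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def route_length(order):
    """Total distance of depot -> each stop in order -> back to depot."""
    route = [depot] + list(order) + [depot]
    return sum(dist(route[i], route[i + 1]) for i in range(len(route) - 1))

def greedy_route(stops):
    """Nearest-neighbour heuristic: always drive to the closest unvisited stop."""
    remaining, here, order = set(range(len(stops))), depot, []
    while remaining:
        nearest = min(remaining, key=lambda i: dist(here, stops[i]))
        order.append(stops[nearest])
        here = stops[nearest]
        remaining.remove(nearest)
    return order

greedy = greedy_route(stops)

# Exhaustive search is only feasible because the example is tiny:
# 8 stops already means 8! = 40,320 possible orderings.
best = min(itertools.permutations(stops), key=route_length)

print(f'greedy (fast, approximate) route length: {route_length(greedy):.1f}')
print(f'optimal (slow, exhaustive) route length: {route_length(best):.1f}')
```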

The algorithms that YouTube and Amazon use to recommend content to customers are changing and evolving in an ecosystem that drives them to improve and, ultimately, to provide us with the goods and services we want, effectively producing a profit-driven, private planned economy.

Algorithms are unhuman: as per Searle’s Chinese room (see en.wikipedia.org/wiki/Chinese_room), they can learn to follow a set of rules and respond to inputs, but they lack intentionality and the inherent structures of feeling and humanness that people possess. Humans are essential to working with algorithmic artificial intelligence to make it serve our needs, rather than the needs of that special form of artificial intelligence that is the capitalist firm. The computing resources are there to enable us to better model and predict chaotic systems, but it will take political determination on the part of us all to stop them being used to service the needs of the ruling minority. We don’t need Cummings’ technocratic weirdos to change and shape our world, but we can use their ideas to improve our lives, though only if we are running society on our own behalf.

In the words of D.H. Lawrence:

For God’s sake, let us be men

not monkeys minding machines

or sitting with our tails curled

while the machine amuses us, the radio or film or gramophone.

Monkeys with a bland grin on our faces.

PIK SMEET