On re-reading Peter Drucker

I’ve been reading Peter Drucker recently. Or re-reading: you couldn’t not be aware of him in the 1980s and 1990s, when a new tome with his name on it dropped on your desk every year or two (he wrote 40 in all), with titles that often seemed to have little to do with management – e.g. Post-Capitalist Society, The New Realities, or, somewhat bafflingly, Landmarks of Tomorrow.

At the time, though, I didn’t get him. He seemed to write in sentences that were both obvious and obscure. ‘The purpose of a business is to create and keep a customer’. ‘Management is doing things right; leadership is doing the right things.’ ‘Management and managers are the … constitutive organ of society’. What on earth did some of his famous one-liners even mean?

Had I been listening, one of the things those titles were saying was that Drucker was quite different from nearly all other management writers (apart perhaps from Charles Handy) in that management was not actually his primary concern. Paradoxically the ‘father of management’ was interested in management only because he was more urgently interested in something else. And – another paradox – management is the richer for it.

As you would be as a clever, well-connected young man in Vienna in the 1930s, Drucker was deeply preoccupied with world politics and society, and particularly the subterranean social currents from which totalitarianism had welled up in the shape of communism and fascism. It’s impossible to overestimate the importance of Drucker’s formative Viennese influences, rubbing shoulders as he did with Hayek, Mises, Schumpeter and Polanyi, as well as artists and musicians. ‘Management was neither my first nor has it been my foremost concern. I only became interested in it because of my work on community and society’, he wrote later. Either implicitly or explicitly, that wider interest is the subject of all his work.

For Drucker, the reason management matters is simple and basic. It isn’t an end but a means. The end is a free and functioning society, which can’t exist without thriving independently-run organisations and institutions. They depend in turn on good management. ‘Our society has become… a society of institutions,’ he wrote. When organisations are ineffective and corrupt, a command economy and society is the only alternative. That is what he meant by management being ‘constitutive’: ‘Performing, responsible management is the alternative to tyranny and our only protection against it.’

From this everything else follows. For Drucker, management was a moral profession, with a duty primum non nocere, first to do no harm. Companies, being part of society, had a direct stake in its health; too systemically important to be under the control of any one constituency, they and their managers had a first positive duty to make productive the resources that society put at their disposal. Profit was both a test of their effectiveness and the essential down payment on the cost of the future jobs and useful products it was their task to provide.

On the other hand, what profit wasn’t was a business’s purpose. Indeed, the nearest Drucker comes to a rant is his exasperation with managers for the complacent and circular way they used the profit motive (which he dismissed impatiently as the invention of neo-liberal economists to justify their equations) and profit maximization to explain their behaviour. Widespread public hostility to profit was their own fault, he declared: ‘In the terms management uses when it talks to the public, there is no possible justification for profit, no explanation for its existence, no function it performs. There is only the profit motive, that is, the desire of some anonymous capitalists – and why that desire should be indulged in by society any more than bigamy is never explained. But profitability is a crucial need of economy and society.’

This was written in 1974, when Drucker was already alarmed at the prospect of growing inequality and managers trashing their organisations’ reputation and legitimacy by ignoring their own social impacts and obligations – particularly the duty to create good jobs, which he correctly saw as the glue that kept society stitched together.

You don’t have to agree with everything Drucker said to see many further resonances with today. He would have treated with contempt Boris Johnson’s crass assertion that the warp-speed arrival of Covid vaccines was due to ‘greed’ and ‘capitalism’. On the contrary, he would have described the UK experience, at least, as a too-rare case of ‘the society of institutions’ working as it should: decisive action by government to de-risk vaccine manufacture and procurement with bold advance orders, meshing with Oxford University’s public-sector research and AstraZeneca’s acceptance of the challenge to distribute the vaccine initially on a non-profit basis, all followed up by the NHS’s near-faultless execution of the vaccination campaign.

As for capitalism, Drucker judged it potentially a better basis for a free society than anything else on offer – but by no means unconditionally. It was constantly in danger of being subverted by the blind pursuit of money and profit. He hated managers benefiting directly from laying people off. Capitalism wasn’t an end goal in itself: ‘Free enterprise cannot be justified as being good for business. It can only be justified as being good for society,’  he wrote in 1954’s The Practice of Management.

Drucker was the moral conscience of management, which he viewed as a ‘liberal art’ – something that required broad human wisdom and judgment to harness the technological tools available and guide the art of practice. These were deeply unfashionable concepts in the era of financialisation when the only social responsibility of business was ‘to increase its profits’. (This is why his books barely figure in business-school curricula, majoring as the latter often still do on finance and shareholder value.)

Characteristically, Drucker’s last book was a collection of essays entitled A Functioning Society. If he were alive today, that would surely still be his central concern. And as we ponder the future of our economic institutions in the light of Covid, in the wake of Brexit and Trump, management itself would be right in the front line.

As Jerry Davis points out in a powerful recent essay on the weakness of purpose in the face of omnipresent pressures of shareholder value, ‘nearly every major societal pathology in the West today – certainly in the USA – is caused or exacerbated by profit-oriented corporations’. Think opioids (thank you, Big Pharma), obesity (Big Sugar), nicotine addiction (Big Tobacco) and climate change (Big Oil), all of whose managers have used relentless lobbying and misleading scientific evidence to confuse opinion and protect their profits, at the expense of the wider community.

But today the even more immediate danger to Drucker’s ‘free and functioning society’ is the potential reengineering of humanity itself through social media, aka Big Tech. The threat is no longer a blunt totalitarian ideology from outside but (Davis again) ‘a dystopian nightmare of increasing corporate dominance, in which a handful of unaccountable corporate hegemons use pervasive information technology to control our daily lives’ for their, their shareholders’ and manipulative politicians’ profit.

Deployed differently, those same technologies could also open up prospects for democratic renewal, and there are some pressures from below in this direction. Will they prevail, though? It will take support from governments to weight incentives against doing the wrong things – no honest company should be handicapped against less scrupulous competition – and Biden’s support for trade unions is also welcome. But in the end it won’t happen unless management finally lives up to the responsibilities Drucker ascribed to it. As he almost said, good intentions, like plans, are worthless ‘unless it all immediately degenerates into hard work’.

A tale of two viruses

Covid-19 is giving us a grim crash course in evolutionary biology. Every day we anxiously check the state of the existential race – are people being inoculated fast enough to suppress the disease before a ‘fitter’ Covid variant emerges to take up the baton for an even tougher stage?

But wait. As in a horror movie, just as the frantic vaccination programme seems to give grounds for a glimmer of optimism on one front, it dawns on us that on another we are struggling with an equally toxic mutant – man-made this time – which has erupted in the last few months and may be even harder to douse than the coronavirus.

In a blistering essay in the New York Times, Shoshana Zuboff, author of the monumental The Age of Surveillance Capitalism, makes a direct link between the storming of the Capitol in Washington on 6 January and the data extraction and manipulation business model of Big Tech, specifically the social media companies.

Facebook, Google, and Twitter are to blame for the riot, she declares: ‘The intolerable truth of our current condition is that America and most other liberal democracies have, so far, ceded the ownership and operation of all things digital to the political economics of private surveillance capital, which now vies with democracy over the fundamental rights and principles that will define our social order in this century’.

We have a choice between surveillance capitalism and democracy, she believes. We can’t have both.

Over the top? Judge for yourself. Zuboff’s charge is that we are in the third stage of what she calls an ‘epistemic coup’ that began 20 years ago with the discovery by Google of the value of the personal data that internet users unwittingly gave up in surfing the web, followed by its permissionless annexation of that data for its own use.

That was the first stage of the coup. It quickly led to a second phase of growing epistemic inequality as companies amassed incomparably more data about us than we imagine or than we have about them and their algorithms. The perilous third stage is epistemic chaos, ‘caused by the profit-driven amplification, dissemination and microtargeting of corrupt information, much of it produced by coordinated schemes of disinformation.’

That’s where she thinks we are now, and looking around – at Trump and Trumpism, at Brexit, at mushrooming conspiracy theories and social unrest – it’s hard to argue she’s wrong.

And this of course is the nightmare scenario, in which the twin viruses of Covid and surveillance capitalism intersect and reinforce each other. The surveillance model of capitalism is so toxic because it has no interest in the truth or otherwise of the content it carries; its currency is engagement – and, as all the research confirms, wild stories, fake news and conspiracy theories (previously majoring on Trump, now on Covid) get more clicks, and therefore yield more personal data to store, analyse and sell on, than boring reality.

Which is why social media firms only censor or ban purveyors of such theories as a last resort, and why they will fight to the bitter end to retain the protected status granted them by Section 230 of the Communications Decency Act of 1996 – their great enabler, which absolves them of responsibility for the content they carry. It is their major asset.

It also explains why other companies, with a few exceptions, do little to contest the power of Big Tech. As Rana Foroohar succinctly puts it in Don’t Be Evil: The Case Against Big Tech, ‘they are the ones buying what the Valley is selling’. What’s more, they are also collecting and trading data on their own account.

Surveillance is catching. Every company with a website does it (including the newspapers that we read online) – and that is before the full advent of the Internet of Things, which will multiply the opportunities for data capture exponentially.

In effect, hidden below the visible economy where companies buy and sell products and services there is a swelling invisible one consisting of the extraction and manipulation of personal information. It is claimed to be the fastest-growing industry in the world, its value (although no one knows for sure) estimated at around $200bn in the US alone.

In politics as in commerce, the hacking of humans has real-world effects. Crudely, these are what’s being bought and sold. One of the most serious is the growing difficulty of splicing splintered truth back together. Simple truth no longer carries weight. It is no accident that those who believe the ‘alternative facts’ of political conspiracy are also more likely to believe that Covid is a hoax, or caused by 5G masts, or part of a plot for a world government of the elite. Adherents of QAnon not only stormed the Capitol in support of Trump; they are also more likely to refuse face coverings, social distancing and vaccination.

Worryingly, this viral superstrain is not confined to the US. Anti-vax sentiment is particularly strong in France, where the gilets jaunes and other extremist groups, no strangers to violence, have emerged as its natural carriers. In the UK, health chiefs warn of facing two pandemics: one spread by a virus, the other by unregulated social media companies. ‘We have to fight both with equal vigour,’ says the head of NHS England.

The starkness of the civil dilemma this poses is summed up by a reluctant quarantiner in Hong Kong who ruefully notes that the powers that enable Hong Kong to tackle the virus so effectively – the power to lock her up in a quarantine hotel room for three weeks with an electronic tag and the threat of legal sanctions – are the self-same ones being used to stamp out political uprising.

Bringing Covid under control therefore has a double urgency: not only to prevent the outbreak of more virulent disease variants but also to head off what can only be called a looming mental-health pandemic, whose effects are unpredictable. They could be as dire as those of the physical infection.

Stamping out the surveillance bug will be harder, requiring whole-system change. Yet ironically Facebook itself just may have brought the belated regulation of social media a small step closer.

Its decision to black out news feeds in Australia (including government and local information sites) in response to that country’s proposal to make platforms pay publishers for content was quickly rescinded. But in a fierce backlash it was widely branded not just as petulant and hubristic, but also counterproductive.

And not just by its sworn enemies. Interviewed on the BBC’s Today programme (at 2:48:40 in) on 23 February, a former Facebook high-flyer expressed fears that strikingly echoed Zuboff’s. While Facebook did good things, and Zuck was ‘a good person’, he was just too powerful, said Stephen Scheeler, CEO of Facebook Australia and New Zealand until 2017.

‘In Australia, I can vote for or against the government at the next election, but I can’t vote for or against Mark Zuckerberg – his own shareholders don’t control him. That’s the problem we’re up against here: sovereign nations are coming up against Facebook, yet they’re not on the same playing field in terms of power.

‘A few years ago I thought breaking up Facebook or Big Tech was a fool’s errand, and ridiculous, because their powers weren’t anything like what the critics were saying. But in the past couple of years I’ve come round to the view that the scale, size and influence of these platforms, particularly on our minds and brains and all the things that we do as consumers and citizens, are so powerful that leaving them in the hands of a very few closely controlled companies like Facebook is a recipe for disaster’.

Facebook and social media generally had played a major part in events ranging from the controversy around the 2016 US presidential election, through the Cambridge Analytica scandal, to the recent riots in Washington, he pointed out. ‘It’s not going to get better if we allow the industry to regulate itself,’ he concluded. ‘We need to hear the government’s voice in here’.

A ‘great rebalancing’?

Brexit is the UK’s Trump. It’s a symptom, but a malign one that makes the original condition worse. The original disease, revealed in hyperrealistic detail by Covid, is inequality in all its forms (income, health, wealth, housing, productivity, demography). The UK today, as the Institute for Fiscal Studies (IFS) summed up last year, ‘is one of the most geographically unequal countries in the developed world’. Starkly, London and its surrounds are the only UK region to make a positive contribution to the Treasury. Yet while London far outranks every other city in terms of productivity, income and wealth, the capital is itself riven with the same jagged inequalities, due largely to sky-high housing costs and gig-economy wages. The pattern is fractal.

All this means that the glib remedy of ‘levelling up’, first advanced at the last election and many times repeated since, raises as many questions as the presenting problem. Theresa May as premier got as far as talking about an ‘economy that benefits everyone’ – levelling up by another name – but her fragile government was too embroiled in Brexit even to define what that meant. With Brexit done, sort of, and a reliable majority, Boris Johnson is one step on, but faces a similar problem: where to start. The first issue is definitional: is levelling up about places or individuals? The prescriptions are different for each.

Regional – or, to use modern jargon, place-based – policy in the UK has been tried since at least the 1970s, with very little result. The private sector can’t be coerced into locating in the small towns, coastal areas or run-down cities that need it most, and there’s a limit to the number of civil servants or agency staff who can be decanted there. Brexit complicates the issue for small entrepreneurial companies. Deindustrialisation is as easy (just leave it to the market) as reindustrialisation is hard – if the citizens of the old East Germany, having received massive transfers as well as all kinds of other aid from a competent and concerned national government, are still poorer than their counterparts in the west, ask yourself what hope there is for our left-behind in Stoke, Barnsley or any other brick in the once ‘Red Wall’.

If on the other hand the target is individuals, solutions are in principle more obvious. But they are paradoxically less likely to be chosen, for political reasons: they would involve reversing the austerity policies that have reduced services such as health, education, local government and welfare to near-anorexia in the decade since the GFC, and radically increasing public spending.

The other evident way of addressing individual need would be to make a start on correcting the drastically out-of-whack balance between capital and labour. Labour’s share of national income has been shrinking for decades on both sides of the Atlantic, and the process has speeded up since 2000. President Biden is already setting an example here, with promises to raise the US minimum wage and roll back some of the inroads into workers’ rights made under Trump and before. Needless to say, this is hardly likely to go down well with far-right Tories who seem to believe, with Trumpian lack of justification, that the British labour market is vastly overregulated and that employers are champing at the bit to be liberated from red tape.

But ruling out such options would leave the government with a still bigger difficulty. After at least three decades in which neo-liberal dogma ruled out any economic solution other than deregulation or, in the presence of ‘market failure’, the outsourcing of provision to the private sector, UK governments have hollowed themselves out. The continuing war on the civil service is perversely Pyrrhic, undermining not only continuity but the state’s broad capacity for independent action. As one observer described it, governments have progressively ‘infantilised themselves’: each successive regime is less capable of thinking or acting for itself and ever more dependent on Big Consultancy to supply the answers.

The dangers of this capture are evident in both the long and short term. Look no further than the repeated missteps and U-turns in handling the pandemic. The obsession with scale, centralisation and the private sector, as in the only partly effective testing and test-and-trace operations, comes straight out of Big Consultancy’s management 1.0 playbook, whose obsolescence is only partly disguised by a few digital trimmings. Further back, the New Public Management policies of marketisation, competition, performance management and audit that have demoralised and de-professionalised public-sector workers, killing initiative and trust, came from the same stable. Through ignorance, a policy aimed at shaking up public-sector management has done just that – but also made it less capable than before.

In his book The Great Transformation, describing the rise of the market economy, the great economic historian Karl Polanyi (incidentally a good friend of Peter Drucker, who helped support him during the writing of the book) argued that, left to self-regulate, the free market would cut itself loose from society with profoundly destructive results, endangering capitalism itself. What was needed was what he called a ‘double movement’, in which a variety of countervailing legal, regulatory and institutional responses – trade unions, health and safety regulations, social security, among many others – combine to curb excesses, retether markets to society and oblige them, however imperfectly, to work for the public good.

For Polanyi, looking back from the 1940s at the turbulent inter-war years, this was the only way capitalism could work – and the events of the last two equally volatile decades do nothing to suggest he was wrong. It’s early days, but it looks as if Biden shares this view. If the diagnosis is indeed correct, any UK attempt at a ‘great rebalancing’ that tries to skirt this central reality, whatever its avowed focus, will be a sham.

Making management great again

Like Hemingway’s bankruptcy, the collapse of the conventional management model has come in two ways – ‘gradually, then suddenly.’

In obvious decay since the financial crash of 2008 – although the rot had set in long before that – management as we know it has finally been finished off by Covid.

It’s as if a switch has been flipped. As Gary Hamel reflected at this year’s Global Peter Drucker Forum at the end of October, everything we thought we knew about management has been derived from observation of organisations that came into being in the first Industrial Revolution. It was perfected against the bureaucratic template of the early 20th century, and has been locked in place since the 1970s by the toxic doctrine of shareholder value.

The pseudo-scientific pretensions of this technocratic, numbers-driven and inhuman model have been stripped bare by a pandemic that has systematically inverted the values it embodied. Human cooperation has been more use than competition – and should have been pursued much more at international level. Centralisation and scale have been no match for a nimble disease that strikes one person at a time (witness the failure and waste in our huge outsourced testing and test-and-trace centres). Above all, the pandemic has reasserted in the most basic of terms the centrality of people.

One of the flaws of the exclusive focus on shareholders is that it disables companies’ immune systems, blinding them to their own long-term interest. Covid reminds companies that they need people to be employed and paid not just to solve problems and make stuff, but also to buy it. Absent people with jobs, governments have to invent surrogates in enormous stimulus and recovery programmes, as now. Welcome back, employment policy.

At the same time, the pandemic underlines how dangerously out of kilter we have allowed our value system to become. As Mark Carney is exploring in his current Reith Lectures, the market as currently constituted overvalues the present at the expense of the future, and undervalues essential work like care, transport and other basic services to the benefit of a host of inessential ones. Hence our bullshit economies, built on work that is often not worth doing. This too is due for a reset.

But it’s at the company level that the divergence between old and new is most spectacular. The recent online Drucker Forum got off to an electrifying start (I mean that) by showcasing a number of companies that, unlike struggling competitors, are sailing unscathed through Covid not only while ignoring conventional management practices, but because they ignore them.

Among the five presenting firms – Nucor (US), Buurtzorg (Netherlands), Michelin (France), Handelsbanken (Sweden) and GE Appliances (Sino-US) – only Buurtzorg, the Dutch nurse-run healthcare operator, is a start-up, the rest being solid corporate citizens of many years’ standing. They prove that it is perfectly possible to ‘transform’ – to use a catastrophically traduced word – if, but only if, managers throw off the blindfold of the old and devote as much attention to management innovation as they do to product and technology development, attempting to make their companies as inventive and creative as their employees are.

Currently, that’s a big ask. At a time when we need to harness every scrap of human ingenuity – and when three new vaccines stand as shining testimony to what can be achieved when that happens – it should be a global emergency that 80 per cent of workers think their opinions are disregarded at work; 70 per cent of jobs require little or no ingenuity; and just 18 per cent of workers are engaged at work – present physically but absent (at best) mentally. Baldly, companies in their present form squander much more human capacity than they use, or than we can afford.

As amply shown by the Forum five, it doesn’t have to be like that. Buurtzorg and Handelsbanken, the Swedish bank, are already rightly well known. Buurtzorg, now 15,000 nurses strong, continues to attract a further 100 recruits a month, and as it expands is starting to transform the Dutch healthcare model from the inside. Handelsbanken’s decentralised, relationship-based banking model ensures that it can respond instantly to its customers’ changing circumstances – one reason it has outperformed its Swedish rivals for the 49th year in succession.

As for the others, disparate as they are in culture, history and nationality, they are united in an unshakable belief that success is driven by people. This is absolutely nothing to do with being ‘nice’. It’s the conviction that ‘27,000 minds are more powerful than any single one,’ in the words of former Nucor CEO John Ferriola. Ferriola talks of a ‘chain of trust’ in which top management’s job is to build teams rather than products, and then provide the environment in which they can focus single-mindedly on the effectiveness that makes Nucor ‘the safest, highest quality, lowest cost, most productive and most profitable steel and steel products company in the world’.

‘Never underestimate the casual genius in every human being’, says Florent Menegaux, CEO of 130-year-old Michelin, the French tyre-maker – while admitting that most of the time corporate bullshit prevents people from using it. Starting from small experiments, Michelin is now riding an upsurge of frontline improvement welling up from below. Menegaux sees his mission as taking the stress out of operational pressures – including on middle managers – and feeding energy back. The manager takes care of the team; the team takes care of everything else, as one slogan neatly puts it.

At GE Appliances the divergence from management’s mainstream is even more dramatic. When the traditionally run white-goods maker was sold to China’s Haier in 2016, the culture shock was colossal. It didn’t realise it, but the company ‘was slowly dying’, in the words of CEO Kevin Nolan, strangled by its 100-year past. Now broken up into ever smaller micro-enterprises, a re-energised GEA is thriving like never before. ‘We need more CEOs!’ says Nolan. ‘It sounds counterintuitive, but you have to get more CEOs within your company. You have to let people control their future and their decision-making to unlock their creativity.’

The reason these companies have done so well during the pandemic is blindingly clear. Simply put, their decentralised structures and carefully fostered cult of trustworthiness mean that their people don’t have to wait for orders from above – they know what to do and do it. At Handelsbanken, local knowledge and branch responsibility for all lending have translated into a fraction of rivals’ bad loans during the crisis. GEA’s ambition of ‘zero distance’ formalises its recognition that closing the gap between the enterprise and its true boss, the customer, is a key metric of success. Nolan notes that without central direction GEA’s micro-enterprises were solving issues daily ‘at the speed of the market’; under Covid they see the future as brighter than at any time in the company’s history.

To achieve zero distance with the customer, omission – eliminating what gets in the way – becomes as important as commission. What gets in the way is management. Buurtzorg has a slide entitled ‘what we don’t do’ that lists ‘management meetings, policy notes, strategic documents, HR strategies, year plans, and other useless things’. The latter include budgets and intermediate goals like targets, two things that Handelsbanken also eschews. Threats and opportunities don’t come in 12-month packages, so why should decisions?

As the technology of human accomplishment, ‘management sets the outer limits on what we can do as a species. It is humankind’s most important technology’, Hamel noted at the Drucker Forum, channelling Drucker himself. After a long pause, companies like those described (among many others) are beginning to test those limits, redefining as they do so management’s fundamental laws in human rather than purely economic terms.

Unlike sheer physical size, trust and decentralisation appear to scale without diminishing returns. Effective relationships trump efficient transactions. Companies succeed by working with the grain of the ecosystems they operate in, not against it. Zero (response time, distance from the customer, management itself) is often the best score. IT belongs in the background, not the foreground. Having spent the last 40 years trying to eliminate all traces of the human, companies are belatedly beginning to realise that it’s when they betray the human that things start to go wrong. With that established, perhaps management at last has a chance to live up to the gurus’ claims for it.

Wasting a good crisis?

Never waste a good crisis. That glib slogan is heard less this time round. Not surprising, perhaps, given what happened after the financial crash a decade ago – which, after the dust had settled, consisted of a return to business as usual, only with added austerity. That didn’t turn out so well for anyone who wasn’t part of the global 1 per cent, and the delayed reaction brought us Brexit, Trump and the election of Boris Johnson.

So will today’s pandemic crisis be more productive? Nine months on from the first coronavirus fatality, the signs aren’t good. After an initial burst of good behaviour (research collaboration among pharma groups, repurposing of manufacturing plants to turn out medical supplies, a few bosses forgoing raises), firms are in danger of reverting to bad old habits instead of taking the opportunity to institute better new ones.

Take working from home. You might think that this was a rare win-win. Employers and workers both get to cut costs. Workers like it. In a recent survey of 10,000 European and Middle Eastern workers, 87 per cent said they wanted a choice over their place of work. Corporates, meanwhile, have discovered to their relief and surprise that under WFH not only does office workers’ productivity not suffer – in many cases it goes up. Unilever and Google found that at home their office workers were putting in more time than before, not less.

The unspoken corollary of that, of course, is that the office environment in general, and management in particular, add no value to employees’ work; rather the reverse. This isn’t new. The late Peter Drucker used to complain that too much management consisted of preventing people from doing their work, and advised every company to subject all its work processes to a zero-based budgeting exercise every few years to strip out the friction-generating clutter and grit.

Alas, rather than take the lesson to heart, managers have swiftly reverted to their default setting of control. Witness soaring demand for, and burgeoning start-ups in the field of, what are euphemistically termed ‘collaboration tools’: software which, as well as collaboration, also facilitates remote monitoring of computer keystrokes, websites visited and pauses taken – even generating infrared-style hotspots pinpointing the staff who provide the ‘pivotal point that people go to for information and answers’ (and, by the same token, presumably those who don’t). Bizarrely, apps are also springing up that mimic the background noise of a busy office, or even the ‘gentle chatter’ of a Danish coffee house.

As Rana Foroohar points out in her latest book, anything that can be used for surveillance sooner rather than later will be. That’s because behind the drive to control lies another obsession: reducing cost. Understandable as that is in today’s hard times, it is leading to behaviour that spectacularly misses the point. À la Drucker, the crisis would be the perfect moment to go back to ground zero and redesign the work to meet current and projected demand in the light of the new conditions, including WFH, social distancing and other consequences of Covid.

But no. Spurred on by the big consultants, companies are instead splurging on ‘digital transformation’. That has led to a dramatic decline in customer service as the punters are peremptorily herded online whether they like it or not, often with no recourse to human contact. Pleading the crisis, companies resort to rationing – ‘due to Covid, we are experiencing exceptional call volumes: expect wait times of more than one hour’ – directing callers to FAQs online, or simply deleting any other means of contact. In one prominent NHS operation, sad to relate, the phone is permanently off the hook and a broken email link goes unrepaired, leaving no means of changing an urgent appointment. The cost in terms of frustration, anxiety and wasted time for citizens and customers is off the scale, while the build-up of failure demand is invisible to managers who are probably congratulating themselves on having cut their (comparatively irrelevant) transaction costs. If anyone was wondering where productivity goes in these ‘transformations’, look no further: it lies in a grave marked ‘digital services’.

The other favourite corporate cost-cutting initiative is to chop full-time staff in favour of agency or ‘contingent’ workers. Around 5m people in the UK were in mostly low-paid, precarious employment even before the pandemic hit, and that total will have surged over the last few months. As the FT’s Sarah O’Connor recently noted, the accepted risk-reward ratio in finance – the higher the risk, the higher the reward – is reversed in today’s labour market: a truth rubbed in by the news that the boards of a number of US companies have begun quietly adjusting bonus formulae to compensate CEOs for ‘Covid-related’ loss of earnings. Good luck finding revisions in the opposite direction to adjust for undeserved strokes of good fortune.

All of these things involve choices. Not all companies are choosing to recalibrate CEO pay. Companies that signed the US Business Roundtable’s historic 2019 retreat from shareholder primacy seem to be behaving better towards their employees in the pandemic than others. In the UK, Aviva and Standard Life Aberdeen have signed up to a ‘living hours’ agreement that guarantees shift patterns (and payment) for workers four weeks ahead. Companies like these, and others that have chosen to maintain or improve levels of customer support (food retailers large and small, John Lewis, Waterstones), may gain in the long term when things have returned to something nearer the previous normal.

But these are exceptions. And the real test of an organisation’s purpose is not being nice to stakeholders. It is bending all its energy and ingenuity to challenge the seemingly inevitable and find new ways of fulfilling what it exists to do. Consider this. When all the world’s theatres and cultural festivals were shutting down – the New York Met won’t reopen until at least autumn 2021 – the Salzburg Music Festival, the largest of its kind, resolved after fierce debates to defy the odds and go ahead with its 100th-anniversary event in June. This involved starting from scratch: in double-quick time developing a new programme, preparing a distancing and safety strategy that has become a model for others, reimbursing 180,000 previously sold tickets and selling 76,000 new ones, quite apart from the normal artistic work. ‘We were deeply conscious of our dual responsibility as both a source of meaning and employer,’ says Salzburg president Helga Rabl-Stadler. The result of Salzburg’s courage: ‘a sold-out festival, a giant step forward in terms of digitization, and a thousand good ideas on how to offer our greatest asset, regular customers from 80 countries around the world, faster and even better service.’

Well: just encore.

D for dunce: the great exam failure

The current educational algo-debacle is an exquisitely English cock-up*: a slow-motion train wreck that is the product of 30 years of educational initiatives, reorganisations and adjustments to alleviate the problems generated by previous changes, all piled up on each other without consistent architecture or, needless to say, political consensus. Finally this year Covid nudged it over the cliff of its own contradictions.

Our education system is a perfectly designed generator of grade inflation. Like executive pay under shareholder capitalism, it’s an escalator engineered to move in one direction only: up.

This year’s events are the culmination of a story that began in 1992, when 38 polytechnics were elevated to university status, nearly doubling the overall estate. Growth has continued ever since: there are now no fewer than 132 UK universities, with a student body that has expanded to match. Nineteen-seventy’s total of 200,000 students had mushroomed to almost 2m by 2019.

At a stroke, higher education morphed from an elite into a mass education system. Unfortunately, having willed the end – naturally with no diminution of quality – the government neglected to provide the means to bridge the gulf in standards, judged on traditional measures, between the old institutions and the new. Accurately reflecting the gulf in their respective resources, it was, and in some cases remains, large.

Real levelling up would have required a massive injection of resources into the new-borns. Instead, as ever, the government opted for a sleight of hand whose costs would only surface later. Traditionally, to maintain standards, new universities underwent an adjustment period during which they administered degrees set by longer-established institutions. By contrast, the post-1992 cohort were granted degree-awarding powers from the start. There was no way a first from an under-resourced new university could be worth the same as a first from a top established one, but at a stroke the difference was made invisible – except to external examiners, who are often still pressured to verify marks that they know are too high or, less often, too low, depending on where they come from.

Tuition fees did provide universities with extra resources. But they were a two-edged sword. Particularly after 2010, when they jumped to £9,000, they set in motion a programme of marketisation that, as the government intended, turned students from learners into consumers – a process encouraged by the creation of (albeit unofficial) league tables and increasingly important student satisfaction surveys. By the same token, universities became fierce competitors for students’ custom. Much of the extra resource was diverted into marketing, facilities and highly paid administration, while students began to argue that shelling out £9,000 a year entitled them to a good degree and the teaching that ensured they got it. Lecturers and their employers had strong incentives to oblige. The casualty: a steady inflation of students’ grades.

As part of the supply chain, schools have naturally been sucked into the upward vortex. They were also subject to strong pressures of their own. Exam boards are competing commercial entities, and schools exploit discreet exam arbitrage between them. Moreover, education was an early testing ground for the New Public Management (NPM), the drive to sharpen up the public sector by subjecting it to private-sector methods and techniques. Unsurprisingly, the regime of targets, inspection, league tables and fierce performance management (‘targets and terror’) had the same dismal effects as in other public services such as health. Particularly harmful were the inducements for heads and teachers to play the numbers game by quietly dropping ‘harder’ subjects, excluding poor performers and ‘teaching to the test’ – a classic illustration of the folly of making professionals accountable to ministers and inspectors rather than to those they directly serve. While it is widely accepted that many schools, e.g. in London, have improved, the cost has been high in the shape, again, of grade inflation.

Briefly, consider that the percentage of top ‘A’ passes at A-level went up from 12 per cent in 1990 to 26 per cent in 2017, and of ‘A’s plus ‘B’s from 27 to 55 per cent. The upward progression in degrees is even more marked. As a New Statesman article put it last year, ‘British universities… have increased the number of degrees they award fivefold since 1990, while the proportion of firsts they hand out has quadrupled – from 7% in 1994 to 29% in 2019. For every student who got a first in the early 1990s, nearly 20 do now… The proportion of students getting “good honours” – a first or 2:1 – has leapt from 47% to 79%: at 13 universities more than 90% of students were given at least a 2:1 [in 2018].’ In a perfect self-reinforcing cycle, universities justify this progression by pointing at the schools: it’s not surprising we’re giving more good degrees, they say, because we’re getting better students – just look at the A-level results.
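(For the record, the ‘nearly 20’ follows straight from the quoted figures. Writing $N$ for the number of degrees awarded annually in the early 1990s – an illustrative variable, not a figure from the article:

\[ \frac{\text{firsts in 2019}}{\text{firsts in the early 1990s}} = \frac{5N \times 0.29}{N \times 0.07} \approx 20.7 \]

Five times as many degrees, each roughly four times as likely to be a first, compounds to a twentyfold increase.)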

This is the backstory to this year’s school shenanigans, when the creaking system was brought crashing down by the cancellation of GCSEs and A-levels during the lockdown. Without the restraining influence of real marks for real work, the government invented two unreal ones – centre assessment grades (CAGs) and a version moderated by the famous algorithm to damp down what it saw as alarming grade inflation. Both measures are barely comprehensible in their complexity (sample: ‘CAGs are not teacher grades or predicted grades, but a centre’s profile of the most likely grades distributed to students based on the professional views of teachers’). But the circle was unsquarable. While the algorithm did moderate the grades, it could only do so at the price of such manifestly unfair side effects that the government hastily retreated. CAGs – and by extension grade inflation, on this occasion however justified – rolled on.

So we arrive at a familiar destination. Grade inflation is a symptom of what Ray Ison and Ed Straw, authors of the important new book The Hidden Power of Systems Thinking, call a system-determined problem – one that can’t be resolved by first-order change, only by rethinking the system itself. Tinkering with the existing system to make it work better is our old friend doing the wrong thing righter, which ends up making it wronger. So we get the worst of both worlds: private-sector market competition moderated by Soviet-style regulation that achieves neither efficiency nor accountability, and whose figures won’t bear the mildest scrutiny. When we most needed a system based on professional trust and respect, we got the reverse: a regime established to assure academic standards that has overseen their almost complete debasement.

This has the potential to be much more than a little local difficulty. Higher and to a lesser extent secondary education, backed up by league tables that conveniently big up their strengths, have long been talked up as one of this country’s strongest international success stories. Covid’s inconvenient intervention suggests a more accurate characterisation might be a house of cards, built on statistical foundations that don’t even come up to O-level standards.

* As a Scottish reader correctly notes, it is increasingly hard to generalise across the component parts of the union in such matters.

How masks became a weapon in the culture wars

Trust in government is emerging as an important factor in how a country fares on what might be called the coronavirus performance league table. That stands to reason: in the absence of a vaccine, ‘beating the virus’ is a collective social enterprise as much as a medical one – just as ‘saving our NHS’ was at the peak of the infection, although the government appears to have forgotten it. (The cost was perilously high, but that’s another story.) In other words, performance is less a matter of science, more a matter of political competence and leadership.

New support for that idea comes from a recent paper in the Lancet describing the ‘Cummings effect’. When the story of the adviser’s dash for Durham, breaching official lockdown advice, broke in May, the result wasn’t just an immediate and continuing loss of public confidence in the government – it changed people’s behaviour. Their growing unwillingness to follow the guidelines was the other side of the coin of declining trust. Rubbing it in, Durham’s former chief constable noted: ‘People were actually using the word “Cummings” in encounters with the police to justify antisocial behaviour’.

A more insidious seepage of confidence – leading to an almost virus-like spike of consternation, rage and conspiracy theories – has been triggered by the government’s vacillation over the desirability of wearing face masks. Indeed, when the history of the pandemic is written, there will likely be a special section on this mundane piece of cloth and gauze, which has become an unlikely symbol of the contradictions and jagged social and political divides that the coronavirus has generated.

It should have been simple. When everyone wears one, the face mask is an important element – along with distancing, hand-washing and limiting social contact – in cutting transmission of the virus.

But it is not quite as straightforward as it looks. The mask has a systemic dimension, and the benefits are asymmetric. For the individual, wearing a mask is a mild inconvenience for not much return. For the collective, on the other hand, there is no downside, and the benefit is multiplicative because of a kind of network effect: the more widespread the use, the greater the value, including to individuals. If sufficient numbers mask up, in protecting other people you protect yourself. This also makes the mask – and this likewise has been much neglected – a powerful signifier: a badge of common endeavour, a recognition that your health depends partly on the behaviour of others, just as theirs depends on yours.
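To put rough numbers on that network effect – a back-of-envelope sketch, not an epidemiological model, with $s$, $w$ and $p$ as illustrative assumptions: suppose a mask cuts the wearer’s outward emission by a fraction $s$ and their inward exposure by a fraction $w$, and that a proportion $p$ of people wear one. The average risk of transmission in a random encounter then falls roughly as

\[ \text{relative risk} \approx (1 - sp)(1 - wp) \]

so each extra wearer lowers the risk for everyone, masked or not – which is why the individual pay-off grows with collective coverage.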

Yet for many in the individualistic US and UK, these scraps of fabric have become objects of scorn (‘face nappies’) and wearing them an affront to liberty – ‘facemask hell’ and ‘a monstrous imposition’, according to one MP. For some Americans they are a symbol of oppression, even totalitarianism, an insult to religious feeling (‘denial of the God-created means of breathing’) or even a threat to wellbeing (one American woman bizarrely shouted to camera: ‘the reason I don’t wear a mask is the same as for not wearing underwear: things gotta breathe!’). According to a trade union poll, 44% of McDonald’s employees had been threatened or abused for insisting that customers don a mask. At least one American has been shot.

In short, instead of being a simple precaution, covering your face has morphed into a weapon in the culture wars – a sign of wokeness or meek compliance with an oppressive state on one hand, an identifier of aggressive right-wing libertarianism on the other.

How has this come about? In microcosm, the depressing story of the face mask mirrors the convulsive progress of the crisis as a whole: a drunken lurch from under- to over-reaction, accompanied by mixed messaging and consequent public cynicism, augmented by the Cummings effect and the utter untrustworthiness of the testing statistics. In the absence of trust, leaders have no levers to pull when they want to get a scared, suspicious and increasingly resentful country back to work. They can only beg and bribe.

In the UK no one has ever explained in simple, clearly understandable terms the cumulative benefits of mask-wearing. And, disastrously, western authorities, including the World Health Organisation (WHO), initially played down the value of masks not for medical reasons but because they feared that a rush on them would aggravate the strains on national health services then struggling with critical shortages of PPE, including face coverings. Not surprisingly, people now instructed to wear one are apt to take a cynical view.

The consequences of these failures to come clean are now coming home to roost. Ironically, even in the US and UK most people are in principle in favour of wearing masks, and even of making them compulsory. Yet in the UK, uniquely, this has not translated into behaviour: in an Ipsos MORI poll of 23 July, four months after the start of lockdown, just 28 per cent said they wore one, compared with double that proportion in France, Italy and the US. This is one reason why the UK now has another dubious Europe-beating qualification to add to its list: alongside the highest number of Covid-related deaths and the worst-hit economy, we are the slowest and most reluctant to return to work.

But if citizens now are slow to wear masks and resist going back to work, it’s largely not because they are bloody-minded or stupid. Inadequate leadership is squarely to blame.

Slavery, Inc

Like most people, including Alfred Chandler in his magnum opus The Visible Hand, I always accepted that – with a nod to ancient institutions like universities, the army and the Catholic church – the origins of modern management lay in the US railroads and the factories of the Industrial Revolution. 

But although long denied or ignored, it is becoming clear that some of the founding practices were already well developed in the 18th-century slave plantations of the Caribbean and the southern states of America. When F.W. Taylor’s The Principles of Scientific Management appeared in 1911, echoes of the earlier ‘scientific agriculture’ practised on some of the sugar and cotton plantations were not lost on contemporary critics, who found some of Taylor’s practices uncomfortably reminiscent of ‘slave-driving’ – nor on supporters who, on the contrary, praised them for the advance they represented over slaveholding.

This is troubling stuff to write about. But the aim is not to pick at the scabs of the past for the sake of it. It is that, as ever, the present is the child of the past, and coming to terms with the history is the first step to resolving the unfinished business it has left behind.

In Accounting for Slavery: Masters and Management, a remarkable piece of primary research, Caitlin Rosenthal, a young McKinsey consultant turned academic, parses surviving plantation account and record books to paint a chilling picture of the blend of violence and innovative data practices that turned plantations into extreme exemplars of scientific management – ‘machines built out of men, women and children’ where ‘the soft power of quantification supplemented the driving force of the whip.’ 

Slavery, Rosenthal notes, ‘plays almost no role in histories of management’. Whether conscious or not, this is denial, the erasure accomplished by Chandler’s comforting categorisation of plantation management as primitive and pre-modern. Not a bit of it, counters Rosenthal. Sophisticated information and accounting practices thrived precisely because slavery suppressed the key variable that makes management difficult – the human. As she puts it, ‘Slavery became a laboratory for the development of accounting because the control drawn on paper matched the reality of the plantation more closely than that of almost any other American business’.

The combination of labour that was essentially free, unspeakably brutal management and smart accounting meant that slaveholding was exceptionally profitable. Plantation owners were among the one per cent of their day; at the time of the Civil War, there were more millionaire slave-owners in the South than factory-owners in the North. In the UK, as we are sharply reminded, many Downtons were built on the trade or forced labour of slaves. Historians mostly don’t include human capital in their calculations, but plantation owners did, using depreciation to assess the changing value of slaves according to age, strength and fertility well before the concept was in use in the North, and routinely using them as collateral for loans and mortgages. By buying and selling judiciously, slave-owners could add steady capital accumulation to the profits from cotton and sugar.

Pace Chandler, plantations were management- as well as capital-intensive: according to one calculation, in 1860, when the railroads were emerging as the acceptable crucible of management, 38,000 plantation overseers, or middle managers, were managing 4m slaves using techniques that included incentives as well as indescribable punishments. Rosenthal recounts that in 1750 a British pamphleteer launched a prospectus for a kind of business school whose target clientele included sons of American planters. Slaveholders, concludes Rosenthal, ‘built an innovative, profit-hungry labor regime that contributed to the emergence of the modern economy… Slavery was central to the emergence of the economic system that goes by [the name of capitalism].’ 

With some estates numbering thousands of slaves, the plantations represented a milestone in managing scale. Even more important, the tools developed there enabled owners to manage their enterprise remotely. The slaveholder no longer had to suffer the physical discomforts of colonial life – or the mental discomfort of seeing at first hand the appalling human cost of his or her mounting wealth. Studying the numbers in the account books – embryonic spreadsheets – in a study in Bristol, London or Liverpool, he (or she) could see at a glance the productivity and profitability of each slave and decide their fate with a tick or a cross. 

This was a genuine management innovation, perfectly aligning the need for distant control with conditions on the ground. It was also crucial in another way. Representing humans as numbers not only put them out of sight and out of mind. It also encoded them as simple instruments of profit, no different in that respect from mules or horses, or the machinery for turning raw cane into sugar. It was this vision of unfettered capitalism, where the only sanctity was property, that the southern states (and the British ‘West India interest’) clung to so tenaciously for so long – and that, in the former’s case, they went to war to protect.

They lost that battle. But even after abolition the ghost of the old regime lived on in the South in the infamous penal-labour and convict-leasing schemes – and it endures today in the for-profit prison-industrial complex that has seen the (disproportionately black) US prison population quadruple since 1970. A whole raft of blue-chip US companies continues to profit from captive prison labour.

The debate about economic freedoms and ends and means in business that slavery started rumbles on in 2020. When Milton Friedman wrote in 1970 that the social responsibility of business was to increase its profits, he was reasserting the primacy of capital owners’ property rights and, in an extreme version of Adam Smith’s ‘invisible hand’ argument, insisting that anything they do to increase those profits contributes to the common good. Now the management wheel is turning again towards a more inclusive view, although with how much conviction remains to be seen. If there is any hesitation, slavery should remind us with crystal clarity how far people will go in pursuit of profit if allowed to; that management’s urge to reduce everything to numbers can all too easily result in the destruction of its own humanity as well as the lives of those being managed; in short, that management can be a force for evil rather than for good. Making a clean breast of the dark side of its history is the only way to close off those bleakest avenues for ever.

Remind me: what is HR for?

In case you missed it, May 20 was International HR Day. To celebrate it, the CIPD tweeted five reasons ‘to recognise HR right now’: putting people first, enabling remote and flexible working, championing physical and mental wellbeing, encouraging virtual collaboration, and supporting people and organisations to adjust to the new normal.

Nothing much to object to there – it’s motherhood and apple pie. Yes. And that’s the problem.

Like a great deal – most? – of management advice, what is proposed is true but useless; preaching, as Jeff Pfeffer puts it.

One clue is that you can’t imagine many people arguing a case for putting people last or stubbornly upholding the old normal. More deviously, the five reasons for celebrating HR are actually nothing of the sort. They are really abstract desired outcomes (practices that companies ought to have) pretending to be inputs (processes or principles that companies and organisations actually observe).

But they don’t: the banality of the desiderata is in inverse ratio to their occurrence in real life. As such, the list gives reasons to despair of HR, not to celebrate it.

Managing with rather than against the grain of human needs is not a new prescription, nor a controversial one. As big-name researchers from Herzberg (‘to get people to do a good job, give them a good job to do’) in the 1970s and 1980s to Pfeffer (The Human Equation) in 1998 to Julian Birkinshaw (Becoming A Better Boss, 2013) have emphasized in their different ways, effective work arrangements that enlist people’s abilities and motivation are a better and more sustainable route to economic success than downsizing, contracting out and relying on sharp incentives and sanctions. Countless research studies say the same thing.

And it is true today. At the recent launch of a joint RSA-Carnegie Trust report on the question, ‘Can Good Work Solve the [UK’s] Productivity Puzzle?’, top representatives from the Bank of England, the TUC, McKinsey and the RSA all agreed: yes, it can and should. There are simply no downsides.

Except that it doesn’t happen. Despite the lip service, ‘good work’ is almost exclusively honoured in the breach rather than the observance. Standard management practices unambiguously put shareholders first, and people last, literally.

In today’s economy, companies create full-time ‘good work, at a good wage’ (the RSA’s hopeful formulation) only as a last resort. They rely instead on contingent workers who can be turned on and off at will and are increasingly managed by algorithm, thus dispensing with another tranche of the workforce. Pay is wildly unequal, even though studies again show that wide dispersion undermines teamwork, involvement and attachment to the organisation. Tight supervision and micromanagement kill trust and initiative – and even where coronavirus has pushed companies into home working and virtual collaboration, both are almost comically sabotaged by the growing use of digital surveillance to monitor and control remote employees. Meet the new work, actually a return to the old work, where all the risk and responsibility is borne by the individual, none by the corporation.

Given the yawning mismatch between the ideal and the grubby reality – that most employees think their company doesn’t care about them, and don’t care about their work in return – the obvious question is: where on earth is HR in all this? If it and its nominal agenda are so comprehensively disregarded, why does it even exist?

There is much hand-wringing within HR and the academic literature over this. Every few years HR is called on to ‘reinvent itself’ or ‘make itself more relevant to business’ in one of the top management journals. But cynicism continues to grow, along with ineffectual programmes and surveys with no follow-up. ‘I do whatever the CEO wants,’ one HR head shrugged to HBR in 2015.

But the frustrations of HR can be explained if you think of it, at least in its current form, as a figleaf. In Beyond Command and Control, John Seddon describes HRM as a by-product of the industrialisation of service organisations along command-and-control lines. HR departments, he says, ‘grew up to treat the unwelcome symptoms of command-and-control management and have steadily expanded as the symptoms have got worse’. HR is, bluntly, damage limitation – yet another example of management consuming itself in trying to do the wrong thing righter (Ackoff), or doing more efficiently that which shouldn’t be done at all (Drucker).

As with so much of management, the way forward isn’t for HR to invent new things to do, but to give up doing old pointless ones. Managers should quit obsessing over individual performance and instead pay attention to the system that governs it. If they stopped demotivating people, removed the conditions that get in the way of doing good work (‘So much of management consists of making it difficult for people to work’ – Drucker), ceased measuring activity rather than achievement of purpose, and above all did away with incentives that distort priorities and divert ingenuity into gaming the system – bingo: the need for most of what passes for HR today (performance monitoring and surveillance, inspection, culture and engagement surveys, appraisals, courses on coping with change and other fake subjects that add no value) would simply evaporate. When the system changes, says Seddon, so does behaviour; as people act their way into a new way of thinking, culture change comes free.

That’s what an organisation that puts people first looks like. But it’s a result, not a cause. And you may have to kill off HR to get there.

Hitting the target and missing the point

Targets. Stretch targets. 100,000 coronavirus tests a day by the end of April. That turned out well, didn’t it?

When on 2 April health secretary Matt Hancock announced his goal of carrying out the famous 100,000 tests a day by the end of April, the result was predictable.

Given that at the time the daily testing rate was around 11,000, attention naturally focused on the number, and whether it would be achieved. And that’s where the debate stuck for the month: not on why 100,000, nor on the purpose of the testing – just the number.

On 1 May Hancock used the daily coronavirus briefing to declare that testing numbers had hit 122,347: the pledge had been met. Again, the number hogged the attention. Was it true? Had it really been hit? How?

Well, yes and no. It transpired that between the announcement of the target and the declaration of victory, the definition of ‘completed tests’, which previously meant ‘completed tests’, had quietly changed to ‘completed tests plus test kits in the post’. Subtracting the latter category left a ‘real’ figure of 82,000 actually carried out. Cue a new furore – again about the numbers.

What happened is a textbook illustration of the unintended effects of targets and their faithful sidekick, Goodhart’s Law.

To paraphrase W. Edwards Deming: if a system is stable, there’s no point in setting a target, because you’ll get what the system delivers. If it isn’t stable, there’s no point in setting a target either, because you have no idea what it will deliver. A numerical target in such circumstances is a finger stuck up in the air. Unless you know how to improve the system’s capability permanently (unlikely, in this case), to hit the number you have to be either incredibly lucky (in which case you’ll have to be even luckier to do it again tomorrow) or alter the parameters to make the target attainable.
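To see what Deming means in miniature, here is a toy simulation (my sketch, not Deming’s; the capability and variation figures are invented): a stable process delivers what its capability allows, whatever target is announced, so the target changes nothing except how often you are deemed to have failed.

```python
import random

random.seed(1)

# Invented figures: a stable testing system whose daily output varies
# randomly around a fixed capability, regardless of any target set.
CAPABILITY = 85_000
DAY_TO_DAY_VARIATION = 8_000

def daily_output():
    return int(random.gauss(CAPABILITY, DAY_TO_DAY_VARIATION))

for target in (75_000, 100_000, 250_000):
    month = [daily_output() for _ in range(30)]
    hits = sum(1 for day in month if day >= target)
    print(f"target {target:>7,}: hit on {hits}/30 days, "
          f"mean output {sum(month) // 30:,}")

# Mean output is ~85,000 in every run: announcing a bigger target changes
# nothing except the count of 'failures'. To 'hit' 100,000 you need luck --
# or a new definition of output.
```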

Hancock did what everyone does when faced with the imperative to hit an arbitrary target: he managed the thing that he could – in this case, the definition of success.

But this is not a harmless bit of jugglery. Deming again: ‘What do “targets” accomplish? Nothing. Wrong: their accomplishment is negative.’ There is a high cost to his action – which is where Goodhart comes in.

As economic adviser at the Bank of England, Charles Goodhart noted that attempts to manage monetary policy by using any definition of the money supply were constantly subverted by actors finding novel ways to circumvent the definition. Hence his law, usually formulated as: ‘when a measure becomes a target, it ceases to be useful as a measure.’ A metric can be either a target or a measure. It can’t be both.
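A toy illustration of the mechanism (my sketch, not Goodhart’s; all numbers invented; requires Python 3.10+ for statistics.correlation): while the number is merely a measure it tracks the underlying activity closely, but the moment it becomes a target and gets padded to clear the bar, the reported figure tells you almost nothing about what is really happening.

```python
import random
from statistics import correlation  # Python 3.10+

random.seed(42)

def true_activity():
    # The thing we actually care about (say, tests genuinely completed).
    return random.gauss(100, 10)

# Phase 1: the number is only a measure -- reporting just records activity.
real1, reported1 = [], []
for _ in range(1_000):
    a = true_activity()
    real1.append(a)
    reported1.append(a + random.gauss(0, 2))  # honest measurement noise

# Phase 2: the number becomes a target -- whatever can be counted is
# counted (kits in the post, antibody tests) until the bar is cleared.
TARGET = 130
real2, reported2 = [], []
for _ in range(1_000):
    a = true_activity()
    r = a + random.gauss(0, 2)
    if r < TARGET:
        r = TARGET + abs(random.gauss(0, 3))  # pad the definition to 'hit'
    real2.append(a)
    reported2.append(r)

print(f"as a measure: corr(reported, real) = {correlation(real1, reported1):.2f}")
print(f"as a target:  corr(reported, real) = {correlation(real2, reported2):.2f}")
# Typical output: ~0.98 in the first phase, close to zero in the second --
# the metric has stopped measuring anything.
```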

Take Hancock and his tests. To meet his target, he included in his count for 30 April around 40,000 test kits mailed out to the public and to hospitals. For these kits (pay attention here), the Department of Health and Social Care counts the people who test positive, but collects no figures at all for tests actually completed.

What’s worse, since mid-April the government figures have included on the same basis (ie people testing positive but not tests completed) a variegated batch of 17,500 tests, both diagnostic and antibody, thus adding oranges to uneaten, partially eaten and completely eaten apples. As Tim Harford declared incredulously on his latest ‘More or Less’ show: ‘It’s almost as if they don’t care if the number of tests is consistent or indeed accurate, as long as it’s big.’
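Setting the quoted figures out as arithmetic makes the trick plain. This is a toy reconstruction using only the numbers cited above; the breakdown is illustrative, not an official statistic.

```python
# Toy reconstruction from the figures quoted above -- illustrative only.
reported_headline = 122_347   # 'tests' announced on 1 May
kits_in_the_post = 40_000     # approx. kits mailed out, completion unknown

old_definition = reported_headline - kits_in_the_post
print(f"tests on the old 'completed' definition: ~{old_definition:,}")
# -> ~82,347, i.e. the 'real' figure of about 82,000 -- comfortably short
# of the 100,000 the headline was constructed to clear. And with 17,500
# mixed diagnostic and antibody tests folded into the series since
# mid-April, the headline is no longer comparable over time either.
```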

At any rate, the upshot of this piece of target-setting is exactly as Deming and Goodhart predicted: the system is beyond comprehension and the figures such a dog’s breakfast that no one can tell what they mean. It seems highly unlikely that Hancock’s original target has been met at all since 30 April, but how can anyone know for sure, including the government? The only certainty about the figures is that they are bogus. You might think that when the subject is life or death, this matters, no?

Yet the damage done by targets doesn’t stop there. What most people don’t get (including a ‘science writer’ on a previous edition of ‘More or Less’) is that the problem with targets isn’t that they don’t work. It’s that they do.

A target is typically a one-club solution to a problem with many moving parts. But the first law of systems is that you can’t optimise one part of a multipart system without sub-optimising others. Any benefits are outweighed by unintended consequences elsewhere in the system. Focusing attention (often with added incentives) on the target rather than the purpose ensures that even if the target is hit, the point is missed.
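A toy pipeline makes the sub-optimisation point concrete (my illustration, with invented capacities, not a model of the actual testing system): drive the targeted stage past the capacity of the stage downstream and the queue simply moves, so the targeted number improves while end-to-end performance gets worse.

```python
# Toy two-stage pipeline: swabs taken (the targeted stage) feed a lab
# with fixed processing capacity (not targeted). Figures are invented.
def backlog_after(swabs_per_day, lab_capacity=90_000, days=30):
    backlog = 0
    for _ in range(days):
        backlog += swabs_per_day               # targeted stage 'performs'
        backlog -= min(backlog, lab_capacity)  # downstream fixed capacity
    return backlog

for swabs in (80_000, 100_000, 120_000):
    print(f"{swabs:,} swabs/day -> unprocessed tests after 30 days: "
          f"{backlog_after(swabs):,}")
# 80,000/day clears daily; 100,000/day leaves ~300,000 unprocessed;
# 120,000/day leaves ~900,000. The target is hit, results slow down, and
# the purpose -- fast answers for test-and-trace -- is missed.
```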

Targets displace purpose. Tests are a means, not an end. But reporting 100,000 of them became the purpose, both for Hancock and his critics. Yet why 100,000 a day, rather than 75,000 or 250,000? What are we testing for in the first place? Deming once more: ‘Focus on outcome is not an effective way to improve a process or an activity…[M]anagement by numerical goal is an attempt to manage without knowledge of what to do’. Another finger in the air. Or, more tersely: ‘Having lost sight of our objectives, we redoubled our efforts.’

Consistent failure to meet the daily target since then underlines the point: the target bears no relation to purpose, or indeed to any other kind of reality. Not to production capacity, as we have seen. More seriously still, not to demand either – a shortage of which, or a shortage in the right places, has been put forward as a reason for the target debacle.

To be effective, a system needs to be designed against demand. And demand is determined locally. Testing is the first step in the ‘test, trace, isolate’ strategy that the government first initiated, then discontinued in March, and has now resurrected. By definition, that strategy has to play out locally, where the infection occurs, tracing begins and treatment takes place. But bypassing hospitals and the 400 or so existing small labs dotted around the country, all tightly linked to local primary care, the government, as with the Nightingale hospitals, is relying on giant regional testing factories, set up from scratch and remote from their users in every sense. A lurch backward to early 20th-century industrial thinking, these are, in the view of many observers, the exact opposite of what is needed.

We can all support a goal of ramping up testing capacity to the level necessary to meet the purpose, whatever that number is. In fact it would be a good idea. But the minute you set it as a numerical target, it is subject to Goodhart. Managing backwards from an outcome plucked from thin air is a feature of command-and-control management, the only kind of management that government knows. But it is back-to-front. Targets are a disease. They destroy purpose, distort priorities, and soak up energy in games-playing and bureaucracy. They are the problem, to be avoided like, well, the plague.