Understanding measurement madness

Modern-day management is subject to several debilitating diseases, but the most damaging and pervasive may be the measurement fetish. As Jerry Z. Muller puts it in his new book The Tyranny of Metrics, we don’t just live in an age of measurement – ‘we live in an age of mismeasurement, over-measurement, misleading measurement, and counter-productive measurement’. Despite the manifest and mounting costs of measurement failure, there’s no sign of the fetish diminishing; if anything, the reverse.

The first reaction on reading Muller’s concise and non-strident study is relief: we’re not actually alone, or mad, to believe we are surrounded by measurement madness. As Muller confirms, it is rampant in the public and private sectors throughout the US and UK, the chief subjects of the book, and its unintended consequences – distortion of purpose and effort, fake figures, short-termism, discouraged risk-taking, innovation and cooperation, burgeoning bureaucracy, degradation of work and worker demoralisation, and costs both direct and indirect – are the same, equally huge, and equally disregarded, everywhere. As Muller remarks: ‘A question that ought to be asked is to what extent the culture of metrics – with its costs in employee time, morale and initiative, and its promotion of short-termism – has itself contributed to economic stagnation?’

Leading on from the first, the second reaction is a mixture of stupefaction and rage: in view of the evidence, including ‘a large body of literature’ dealing with, say, the problems with goal-setting, pay for performance (P4P), and performance rankings, why does this dysfunctional obsession persist and even grow, spreading like a virus from the source of infection in the US and UK to the rest of the Anglosphere and thence to other countries in Europe and the rest of the world?

Muller attempts to answer this question. ‘In a vicious circle,’ he writes, ‘a lack of social trust leads to the apotheosis of metrics, and faith in metrics contributes to a declining reliance on judgment’. He shrewdly notes that a metrics of accountability appeals equally but for different reasons to both the political right and left – the right being suspicious of public-sector empire building and protectionism (Yes, Minister, public choice theory), the left from the 1960s onwards distrusting authority and being convinced that leaving things to experts was to limply accept the prejudices of established elites. Both wings therefore wanted to make institutions more accountable and transparent, ‘using the purportedly objective and scientific standards of measured performance’, while the institutions themselves had no choice but to use similar performance data to defend themselves.

These tendencies were turbocharged both positively and negatively by related developments on the business front, where growing distrust of managerial motives led to the recasting of management into a system of goal-setting, monitoring and incentivisation, all dependent on standardised measured performance. New Public Management applied the same principles to the public sector, supplemented by consumer choice theory which held that giving people the right information to make rational choices was the basis of economic democracy. When this didn’t work (as it couldn’t), the response was not to change course but to try harder, using technology to measure more things and invent new rules about how to do it, giving another savage twist to the measurement ratchet.

The massive irony is that the choice of measures may be the most important thing that management does. And that being so it is subject to measurement’s all-powerful Catch-22: the point of measurement is assumed to be to supplant fallible human judgment and thus enable better decisions; but the choice of measures itself requires human judgment, and without it measurement is actually worse than useless – it misleads and hides the real situation, both from management and everyone else. This is why no one knows the real demand on the NHS or social care, for example; why politicians are puzzled to see services that win stars and plaudits from regulators and inspectors getting a resounding thumbs down from their constituents; and why none of the figures reported by the public sector – whether schools, universities, medicine or the police, all covered in the book – can be relied on.

The truth is that today’s metrics fixation has produced a culture of gaming and manipulation worthy of the Soviet Union – a resemblance that Muller does not fail to observe: ‘Just as Soviet managers responded by producing shoddy goods that met the numerical targets set by their overlords, so do schools, police forces and businesses find ways of fulfilling quotas with shoddy goods of their own: by graduating pupils with minimal skills, or downgrading grand theft to misdemeanor-level petty larceny or opening dummy accounts for bank clients.’

In the age of Big Data and the gig economy where tasks are measured by the second, it is hard to see the measurement obsession diminishing any time soon. It’s not that the alternative to dysfunctional measurement – choosing measures that relate to purpose and that support rather than degrade professional judgment – is conceptually or practically more difficult; it just requires a different kind of thinking (and an implicit admission that previous methods were wrong) – which is why companies that model the alternatives are treated as one-off curiosities rather than exemplars to emulate.

But as a sobering reminder of how far we have strayed from the ideal of minimal (self-)management, consider the example of the Medical Research Council’s Laboratory of Molecular Biology in Cambridge. The most successful biological research lab in history, the LMB was set up after the war by Max Perutz, who had arrived in the UK in the 1930s as a penniless refugee from Vienna. Recognising that creativity, in science as in the arts, can be fostered but not organised, still less planned, since it arises from individual talent, Perutz saw his task as removing anything that got in the way of his recruits following their desire to do the best possible science. (Compare with Peter Drucker’s rueful observation that ‘So much of management consists of making it difficult for people to work.’) Lab administration was kept to a bare minimum: ‘No politics, no committees, no reports, no referees, no interviews – just highly motivated people picked by a few men of good judgment,’ as pharmacologist Sir James Black, another Nobel prizewinner, described it. The result of this recipe for anarchy? Not surprisingly, the lab became a magnet for scientific talent, and while the MRC may not have got much paperwork from Perutz in return for its cash, by the time he died in 2002 it had chalked up the extraordinary total of nine Nobel Prizes, four Orders of Merit and nine Copley Medals (the highest honour from the Royal Society). London Business School’s Jules Goddard comments that Perutz deserves to be feted as much for the brilliance of his management as for his scientific example.

Hidden ways technology is reshaping the economy

Data, algorithms and the artificial intelligence that manipulates them dominate the headlines these days. A few weeks ago AI experts were highlighting some of the more extreme dangers of intentional misuse of these technologies. The large-scale hacking of physical infrastructure that they warned of hasn’t – yet – come to pass, but the recent Facebook/Cambridge Analytica revelations are disquieting proof enough of what can be done without going to those extremes.

But just as concerning if much more insidious are the underlying changes to the economy brought about not by malicious intent but by the evolution of technology itself, with AI to the fore.

And here’s the thing. I am not a techno-determinist. I strongly believe that the course technology has taken has been shaped directly or indirectly by ideology, incentives and tax regimes, among other things, and if we put sufficient mind to it (a big if) we are capable of shaping it in other directions, towards other, more socially favourable outcomes, for example. That said, however, it is impossible to go back. We are where we are. And that is at a point, or so some people think, where today’s technology, formed by the influences mentioned above, is becoming self-organising. It is evolving semi-autonomously, outside conscious human control, and as it does so it is creating an invisible second ‘intelligent’ economy that is steadily subsuming parts of the physical one. Software is eating the world, in the famous phrase – including companies and jobs that will not reappear.

In two articles in the McKinsey Quarterly – ‘The Second Economy’ (2011) and ‘Where is technology taking the economy?’ (2017) – Brian Arthur explains why. Arthur is a highly regarded complexity scientist and economics professor at the multidisciplinary Santa Fé Institute, and several years ago he wrote a highly original book about technology that I reviewed here, arguing that technology is more like chemistry or biology than physics, building out from itself in ways that are non-linear and organic.

In the two essays he takes his insights further. Basically, he argues that just as steam power and then electricity bulked up the pre-modern economy by supplying muscle power that enabled mass-production and the huge increase in physical stuff that accompanied it, so the combination of computers, the internet and now cheap ubiquitous sensors is supercharging it by backing up the physical economy with a neural system – a virtual back office if you like – where more and more of the coordination, administration and linkages get done.

Perhaps more accurate than software eating the world is that it is modularising it, generating ‘libraries’ of digital modules available for use, with transformational effect, right across the old industrial landscape. Take driverless cars. It’s no accident that the frontrunners in autonomous vehicles are tech titans like Google and Apple, or start-ups Uber and Tesla, rather than traditional automakers, struggling to stay in the race. Data will be the most important component of the cars of the future (as long as they exist), not metal.

Modularisation is increasingly breaking down conventional industry sectors, blurring their boundaries and reshaping them into loose clusters of technology-linked suppliers, competitors and customers that behave more like ecologies than separate industries. In transport, some observers see the outlines emerging of a ‘mobility ecosystem’ based not on car ownership and mass production but on flexible individual preference – expressed perhaps in subscription models providing access to a variable combination of private, public and shared transport, refined as data from vehicles, the road and personal preference is collected and processed in the virtual economy. If the data and connections are valuable enough – for example generating a market for onboard information and entertainment – the transport element might eventually come free.

Even if it doesn’t, the direction of travel is clear. Increasing portions of the physical economy will migrate to the hidden digital one – and all industries will be affected. Arthur doesn’t mince words: ‘I think it may well be the biggest change ever in the economy. It is a deep qualitative change that is bringing intelligent, automatic response to the economy. There’s no upper limit to this, no place where it has to end’.

He is in no doubt that as with previous great economic shifts, many jobs will disappear – as they are already doing – but after agriculture and manufacturing, this time it is the last repository of employment, the service sector, that is in the firing line. It is correspondingly harder to see where new ones might come from. As Arthur notes, invoking another historical precedent: ‘Offshoring in the last few decades has eaten up physical jobs and whole industries, jobs that were not replaced. The current transfer of jobs from the physical to the virtual economy is a different sort of offshoring, not to a foreign country but to a virtual one. If we follow recent history we can’t assume these jobs will be replaced either’.

That of course poses a problem. Jobs don’t just provide work. Up to now, they have also been the vehicle for distributing wealth and ensuring access to the fruits of production (thank you, Henry Ford). But this benevolent circle is now breaking down under pressures that are partly ideological but increasingly technological, as networked developments feed on each other. The issue now, at least in the leading economies, is no longer production. In the US, for example, if total household income were shared among all households, the mean would be enough for a decent middle-class living. Instead the agenda-topping item is distribution.

We are shifting, suggests Arthur, from the age of production to the age of distribution. Distribution, being political rather than technical (as production is), is likely to be trickier to deal with – although Europe, with its experience of social legislation and safety nets and suspicion of free-market fundamentalism, may be in a better starting position to do it than the US. But there is no doubt that just as past dislocations needed far-reaching institutional innovation (pensions, the welfare state, trade unions) to palliate transitions and smooth rough edges, similar large-scale adjustments will be required this time.

Meanwhile, technology will continue to feed off itself as it grows like an invisible root system beneath the tangible economy. It’s a remarkable thought, if slightly creepy in its implications, given how effectively the unscrupulous have learned to bend existing technologies to nefarious ends. ‘We just put information [like ‘crooked Hillary’] into the bloodstream of the internet and watch it grow,’ explained Cambridge Analytica CEO Alexander Nix before he was suspended. ‘Give it a little push now and again over time to watch it take shape… So this stuff infiltrates the online community with no branding, so it’s unattributable, untrackable.’ Those anxious AI experts, along with the economists, may just have found something else to worry about.

We need to talk about British management

I was in the middle of writing about something completely different when I read a piece in the FT lamenting the UK’s mediocre management record. At the same time financial journalist Ian Fraser, a tireless critic of the UK banks, particularly RBS, and their auditors, sent me an incredulous tweet – ‘Sir Win Bischoff: “25 years on from the original Cadbury principles, the UK’s Corporate Governance Code has greatly enhanced the quality of corporate governance and is now rightly globally renowned.” Seriously??’

The two things taken together are such a magnificent illustration of what might be the UK’s most glaring and neglected long-term problem that they are impossible to ignore. Of course Gavin Kelly, who wrote the FT article, is right: with employment regulation as business-friendly as any in the world, busted unions and management as free to manage (and pay itself) as it could dream of, there is no longer anywhere to hide: in the aggregate UK management is just not very good. Conspicuously, there has been no protest about the diagnosis on the FT’s letters page.

At the end of his piece, Kelly goes off on a riff about the harm bad bosses do to worker engagement and the quality of working life in general. Well, yes. But pace Kelly the question is not about making nice but giving workers a good job to do, which plays into the bigger issue: that management has a lot more than that to answer for. If jobs, growth, dependable infrastructure and services are lacking, it’s because of management shortcomings in the widest sense. In other words, as I bore myself repeating, management has macroeconomic consequences. And, via the macroeconomy, great and unpredictable political consequences, too.

The truth is that hapless UK management is a knotty systemic problem of which people making poor decisions in offices and boardrooms are just the symptomatic part. Apart from companies, business schools, the big consultancies, the education system, politicians, civil servants and the public sector are all implicated, not to mention the large gap where an informed business press ought to be, in a web of self-reinforcing ignorance and complacency which seems impervious to assault from outside.

You might have imagined that getting something as important as this right would be a national priority. But imagine again. Perhaps the most damaging aspect of this willful blindness – a problem within a problem – is that we have no idea how bad we are. As Kelly notes, managers, like motorists, ‘tend to think they are better than average’. Exhibit no 1 in this chamber of horrors: the aforementioned Sir Win Bischoff, chairman of the Financial Reporting Council, who in a speech of astounding complacency in January celebrating 25 years of the UK governance codes, showed no sign of appreciating how deeply they are implicated in what has gone wrong.

Thus the UK’s ‘globally renowned’ governance code signally failed to stave off Carillion’s collapse. Nor does it offer any protection to the likes of Unilever or GKN, two substantial corporate citizens obliged to make damaging concessions to bribe shareholders not to sell out to opportunist bids from break-up artists. There was not a word from Bischoff about the code’s failure to nurture a viable UK manufacturing sector, nor about long-playing conflicts of interest at the banks and their auditors. Most of all, not a hint of acknowledgement that the shareholder-first ideology that the code explicitly enshrined for the first time in 2006, just before the 40-year experiment proved its failure by almost blowing up the world financial system, is directly connected to the finding, which Bischoff did deign to note in passing, that ‘public confidence in business has been diminished’ and the ‘perception that business is not delivering for all’.

The fact is that the regime prescribed by the City code is no healthier for a corporation than a diet of fast food is for humans. It provides a short-term hit but in the long term leads to obesity, a weak heart, myopia and early demise. It locks in place the sterile, bureaucratic, command-and-control management that makes for the destructive workplaces that Kelly complains of; it also anchors the obsession with budgets and control that prevents managers seeing how their sky-high costs and dismal service are two sides of the same coin, and kills off customer-focused innovation at birth. No wonder people are unhappy. The same killjoy methods have been imported into the public sector via the New Public Management, with predictable results; and possibly most insidiously have encouraged politicians to outsource their most important capabilities to the market, so that – ironically, despite the Brexit vote – the last thing the UK has the ability to control is its own destiny.

The Brexit omnishambles is of course the culmination of UK management fecklessness, undertaken apparently without any system awareness or anticipation of likely consequences, eg for Ireland. Could the bruising negotiations, plus the mockery of the rest of the world, at last cause people to make ‘a link between the state of British management and national prosperity’ and ‘trace persistent managerial weaknesses back to root causes’, as Kelly hopes? It would be nice to think so.

The great levellers

For the first time, Davos man seems to be genuinely freaked.

Although it surfaced only indirectly in the World Economic Forum’s final communiqué – in measures such as the setting up of a Global Centre for Cybersecurity to counter cyber threats – the scare quotient in the annual survey of WEF members’ concerns was at an all-time high. The main worries are no longer to do with finance and the economy, but above all politics – social turbulence, populism (as reflected in giant hedge fund Bridgewater’s famous memo last year), even war. Ninety-three per cent of those polled thought that political and economic conflict would rise this year; two-thirds believed the world was becoming a riskier place.

They are right to be concerned. We should be too. We have after all been here before – many times, according to Stanford economic historian Walter Scheidel, author of The Great Leveler, a study of inequality from ancient times to the present, which was shortlisted for the FT’s Business Book of 2017. Inspired by Thomas Piketty’s work to extend the latter’s research to the long term, Scheidel discovered a grim and unmistakable pattern.

In good times, a rising tide initially floats all boats. But some rise higher than others, and at a certain point a self-reinforcing cycle kicks in. Wealth begets power and power begets wealth, and not surprisingly, equipped with such advantages, the wealthy and powerful from the earliest times have done a pretty good job of entrenching and extending their (and their offspring’s) interests. Until, at a certain point, the social strains become so intolerable that one of Scheidel’s four ‘great levellers’ – war, pestilence, revolution or famine – intervenes.

All these catastrophes have a side-effect of compressing inequality. War massively raises taxes and creates full employment. It also has to offer recompense for suffering, which is why it is so often followed by social reform: think of ‘homes fit for heroes’ after WW1 and the welfare state in the 1940s, together with extensions of the franchise, the growth of trade unions and more liberal attitudes. Redistribution, including direct confiscation of the assets of the wealthiest, is usually one of the first items on the agenda of revolution. Epidemics and famine decimate the workforce, but also raise wages for the survivors and depress rents and other returns on capital.

At which point the cycle begins again. So is there a way of heading off the apocalyptic ending? In theory, yes. That the tendency towards growing inequality can be offset by political choices is shown by the relative stability of the more egalitarian economies of Northern Europe – the Nordic countries and to some extent Germany (although the latter is also experiencing some of the populist strains affecting other advanced economies).

But in the UK and US especially, and also in many other advanced economies, there is little sign of countervailing measures – rather the reverse.

As we saw in the Great Financial Crash of 2008, the fiercest capitalists are like sharks devouring their own tails – the last people on earth to trust with the future of capitalism. Sure enough, since the crash instead of recognising that their period of grace is over and only serious institutional reform can save it, and them, the powerful and wealthy, together with their political allies, have strained every muscle to restore business as usual, doubling down on the fundamentalist free-market measures that brought us to this perilous pass in the first place – austerity, hatred of state action, and determination to throw people back on their own resources rather than collective solutions. President Trump’s ‘pluto-populist’ tax policies, and to a lesser extent dogged UK insistence on ‘business friendliness’ are classic examples of this willful blindness.

Which is where the insights of another perceptive observer suddenly take on a new relevance.

The work of social democrat thinker Karl Polanyi, part of the extraordinary flowering of Austrian intellect that also produced Sigmund Freud, Joseph Schumpeter and Peter Drucker, has unsurprisingly been eclipsed in a period when the thought of his neoliberal opponents Friedrich Hayek, Ludwig von Mises (also Austrian) and followers such as Milton Friedman has been hegemonic.

But Polanyi’s insights now look prophetic. He would have seen Brexit, Trump, the not-so-stealthy rise of populism, nativism and retreat from democratic principles as more evidence for his proposition that the unfettered free market delivers not proletarian revolution, as the other Karl proposed, but mounting instability and inequality leading quickly to protectionism, nationalism, and demands for strong leadership to make the US (UK, France, Hungary, Poland, the Netherlands…) great again. In other words, something resembling fascism.

Polanyi used the example of 19th century Britain to show that there was nothing natural, or inherently democratic, about free markets: they were the construction of the rich and powerful, for the rich and powerful. ‘Laissez-faire,’ he famously declared, ‘was planned’. But its champions neglected the historical evidence that markets only work sustainably when embedded in society and tempered by politics, not the other way round. Ultraliberalism eventually endangers democracy itself, as it did before WW1, before WW2 and, is doing again today.

As it happens, one of Polanyi’s friends in Vienna was Peter Drucker, who helped him find a job in the US in 1940. Like Polanyi, Drucker was a humanist who insisted on management’s importance as a social as much as an economic institution. ‘The very survival of society is dependent on the performance, the competence, the earnestness and the values of their managers’, he wrote in 1993; and he would certainly maintain the same today.

The difference now is that management, at least in the UK and US, has unmistakably become a constituent part of the wealthy and powerful in whose interests society has been reshaped. Does it have the foresight to look beyond its own self-interest to mitigate the dangers swirling through fractured, unequal, austerity-gripped societies? Two contrasting post-Davos developments spring to mind. On one hand, the announcement by Amazon’s Jeff Bezos, Berkshire Hathaway’s Warren Buffett and JP Morgan’s Jamie Dimon that they were setting up a non-profit company to disrupt the dysfunctional US healthcare system directly answers the Viennese call for productive innovation around a social need to temper capitalism’s excesses. On the other hand, Amazon’s patented wristband for tracking worker movements in its warehouses, the use of algorithms rather than humans for performance management and growing threats of ‘data dictatorship’ point management in a different and much darker direction. It now has to choose between the two. If Scheidel and Polanyi are right, the effects of the decision will ramify far beyond the organisations that managers work for.

Carillion: for whom the bell tolls

The opening words of the 2016 Carillion annual report, signed by chairman Philip Green and published less than a year ago, run: ‘In 2016, we made good progress in a number of our markets, while managing and mitigating the effects of more difficult trading conditions in others. Importantly, the Board maintained its focus on overseeing and developing the Group’s strong governance and management framework, on scrutinising the Group’s performance, on assessing the Group’s risk management and control processes and on constructively challenging the Executive Directors’. The whole report is written in the same comforting vein, with soothing sections on risk, vision and values, and why investing in the company would be a good idea.

It’s easy to mock. But given that four months later the company issued a substantial profit warning and the CEO departed with instant effect, Green – formerly adviser to David Cameron on corporate responsibility, no less – was either in the dark about the real situation or being exceptionally parsimonious with the truth. And where were the auditors? It’s hard to know which is worse and where to start. Either way, it’s a brutal slap round the face for current governance practice, which has been singularly unable to head off, or even warn of, one more spectacular corporate disappearing act.

Why did Carillion evaporate so quickly? The answer is that it was really a shell company. As a fully financialised product of the neo-liberal political philosophy ‘that for years has elevated financial engineering above real engineering; off balance-sheet finance above paying for things openly; and lauded the private sector above the public sector’, as the Evening Standard’s Anthony Hilton noted, it had nothing meaningful to sell.

It grew to be the UK’s second largest construction firm and a member of the FTSE 250 not organically but through a flurry of acquisitions of facilities management and other service companies including Mowlem, Alfred McAlpine, John Laing and several others. But as with other large outsourcers such as Serco, G4S, Capita and Atos (incidentally all equally reviled), its main expertise, such as it was, was not in operations – much of the messy real business was subcontracted – but in doing deals and managing outsourcing contracts. When that became harder to pull off, Carillion had no option but to up the stakes and add bigger construction contracts, such as £1.4bn of work for HS2, to keep the cash coming in and the bailiffs away from the door. Eventually, in the colourful phrase of Matthew Vincent in the FT, it had become ‘a sort of lawful Ponzi scheme’ in which new or expected revenues went straight out again to pay pressing current demands. Pay incentives may have played a part in this, bonuses in the industry frequently being tied to increasing revenues rather than profits. In its 2016 ‘Say on Pay’ report, shareholder adviser Manifest noted serious reservations about Carillion’s remuneration policies, raising doubts about quantum, the lack of forward-looking, stretching targets, and the exemption of share-based awards from clawback provisions.

Carillion’s demise shows in sharp relief the real toxicity of most large public outsourcing deals, which might have been designed to be poor value for money for the public and to encourage wrong behaviour by companies. Politicians think it’s a simple matter of pricing, but the real problem is the deals’ design and the thinking they reflect. In effect, deals are constructed as zero-sum games in which one party can only gain at the expense of the other. To some extent outsourcers are hoist with their own petard. Taking advantage of naive negotiators, they comprehensively won the opening rounds, getting themselves typically paid by volume (number of calls answered, appointments made, applications processed) rather than service improvement. They compensated for low opening bids by back-loading contracts and making changes subject to punitive cost penalties – in effect locking in existing high-cost, low-quality designs.

But this triggered an arms race which no party could win. To avoid accusations of waste, Whitehall slashed margins and drove ever harder (but not smarter) bargains. Operators responded by cutting costs, their own and those of their subcontractors, the burden of which fell heavily on labour – substantially contributing to the explosive growth of zero-hours and other exploitative contracts – and by cutting corners. There is no incentive to understand demand and improve response to it. One branch of this path to destruction leads to Carillion, and the multiplying knock-on effects through the wider economy. Another ends at the burnt-out shell of Grenfell Tower. A third is the dismal overall state of public services, with low pay, low productivity and no innovation co-existing with gross inefficiencies.

Pace Jeremy Corbyn, it’s not outsourcing of itself that is to blame. It’s perfectly possible to devise positive partnerships rewarding innovation that pays off in services that are both better and cheaper. But not with present ideologically blocked thinking. As Hilton puts it: ‘The collapse of Carillion is an indictment of management, but one in which the government, Whitehall, City bankers and even investors are complicit’. It is yet another example of the UK’s unique penchant for ending up with lose-lose: pseudo-marketisation without any of the benefits on one side, intense but ineffective government involvement on the other. The biggest loser of all in the Carillion debacle is the rest of us.

Hold the front page

When Max Clifford died in prison before Christmas, most British newspapers ran lengthy obituaries detailing the lurid lengths the disgraced publicist went to to keep celebrity clients either on or off their front pages. Truth, it was clear, played only a bit part in these stories. ‘I was always instinctively good at lying’, Clifford cheerfully admitted. In a debate at the Oxford Union, he boasted: ‘Every day, every week, every month, a lot of the lies that you see in the newspapers, in the magazines, on television, on the radio, are mine.’

What the obits omitted was the newspapers’ own role in enabling the perpetration of these frauds. Their willingness to be co-opted into his fictions if they led to a good headline was a component part of the Clifford business model. It came back to bite him in the end, but, though hidden at the time, it was an even greater Faustian bargain for the press. True, everyone remembers those headlines. But no one, not even at the Sun – tellingly, the only paper not to carry a full-length Clifford obituary – actually believed that comedian Freddie Starr ate someone’s hamster or that David Mellor wore a Chelsea football shirt for trysts with a lover.

The full costs of this airy disregard for readers’ trust (indeed for readers tout court) are only now becoming evident. They play out on two dimensions. At industry level, very roughly speaking, global press circulation has historically reflected levels of trust in the media. A simpler world, admittedly, but peak trust and circulation coincided in the late 1950s and have broadly declined in parallel ever since. As ever, coincidence is not necessarily cause, but a plausible story is that declining press standards and dumbing down, induced by the advent of TV and burgeoning celebrity culture, eroded first trust and then the number of people willing to shell out for products now competing in an entertainment rather than an information economy – a game the papers were ill-equipped to play.

With supreme irony, as demonstrated in every performance of Ink, James Graham’s beautifully spiky play about the birth of Rupert Murdoch’s Sun, when for once a newspaper took the trouble to listen to readers it was a brilliant success. Unfortunately, it was about the first and last time until nearly too late. From its joyful iconoclastic beginnings the Sun rapidly lapsed into the complacent cynicism that led directly to the Clifford conspiracies. As for the rest of the industry, ‘I don’t think anyone thought about readers at all, except vaguely as circulation numbers,’ one ex-newspaper manager told a forum recently.

By 2011, levels of trust in the media were so diminished that when the phone-hacking scandal broke, some publications had no reserves of good will left to draw on. As a result, the News of the World, then selling 2.5m copies a week, evaporated from one week to the next in the most dramatic corporate vanishing act since accountancy firm Arthur Andersen was dragged down by the implosion of Enron. Note that the NotW was probably not the sole offender. Newspapers are gossipy places, and as the same manager concedes, it stretches credibility to hamster-consuming dimensions to believe that once hacking techniques were known about, rival publications weren’t also tempted to use them. They just weren’t caught.

Newspapers are nonetheless right to insist that a vibrant fourth estate is essential to keep others on the straight and narrow. But the corollary, which they are less keen to stress, is that the press itself must be honest as well as competent. A corrupt, incompetent press is as dangerous to democracy as to itself – the second dimension of its importance. For the dire consequences, look no further than the vacuum it has left for the spread of fake news (the very definition of Clifford’s stock-in-trade), the solipsism of social media and recent attempts at voter manipulation, on both sides of the Atlantic. Equally culpable is the failure, in the UK at least, to develop a real business press that would go beyond the advertising-driven reporting of company figures and the views of City analysts to probe and critique the workings of the business-political complex as a whole (the BBC, so good in many other areas, is the limpest performer of all in this respect). The demise of the monthly Management Today in a country whose economic problems are so clearly rooted in the way its companies are run says it all.

How has newspaper management gone so far off track? If a week is a long time in politics, it is an eternity in the news business, whose greatest strength – its simple, brilliantly grooved daily or weekly production schedules – is also reflected in its greatest weakness: the inability to think beyond them. Hence the infirmity of purpose, leading to a fatal slide from informing to entertaining as operating principle; and the inability to see that the only distinctive thing it had to monetise – and therefore to nurture and measure – in the long term was trust, and that was for readers to deliver rather than for it to invent in the newsroom.

For all their faults, I still love newspapers, consuming them in unreasonable quantity and at unreasonable cost. Can they survive? We have to hope so. What is certain is that to do so, they will have to do a better job than in the past. That means changing attitudes, which doesn’t come naturally to a trade that has always thought it knows best. But this time it can’t wing it. As the late Peter Preston noted in his last column for the Observer, it’s time for journalism to do a job it has been avoiding for ages: clean its own stables. We should celebrate its triumphs, he said: ‘They burnish our business. But they are not, by any means, the whole of the business: a business that means treating readers in a jam like human beings, identifying distress, becoming a functioning part of society rather than commentators at its edges. In short, seeking to be worthy of trust in the hole where admiration ought to be.’

Fireworks in Vienna

The 2017 Global Peter Drucker Forum, which took place earlier this month in Vienna, saved the best until last. Under the title ‘Growth and Inclusive Prosperity’, for the first day and a half it was its usual lively, eclectic and social self; Steve Denning’s notes on some major themes here. But on the final afternoon it burst alight.

The match was struck by Carlota Perez, the economic historian and development scholar, who painted a tantalising picture of what-could-be. Taking a 250-year view, Perez sees the four previous great technology revolutions – factories and canals, coal and railways, steel and heavy engineering, and cars and plastics – following a strikingly similar pattern. The first phase is enthusiastic uptake as entrepreneurs pile into the new technology, leading to a speculation-fuelled bubble (think successive canal, railway and auto manias in past surges). Irrational exuberance is followed by a sharp recession, even depression, as reality kicks in and ambition is scaled back.

This is the point we have reached in the fifth great revolution wrought since the 1970s by IT, telecoms and the internet. The initial internet bubble burst at the millennium, to be followed by the great casino bust of 2008. As the dust settles on the financial crash, the question now is: can we hope for a second phase of stable, sustained, broad-based growth – a new ‘golden age’, as Perez calls it – as has happened at each of the technological surges in the past? Yes, replies Perez. But if – and only if – in the new phase investment in the technology is production- rather than finance-led; there is a guiding ‘direction’ for the innovation effort; and supporting institutions evolve, or are put in place by governments, to underpin and ease the jolting social change such transformations set in motion.

Those are evidently big ‘buts’. Production and manufacturing, such as they are, still dance to the tune of Wall Street and the City. The only guiding direction for Silicon Valley’s new masters of the universe seems to be technology for its (and their) own sake. And most business today, at least in the Anglo-Saxon world, channels Milton Friedman in asserting it has no social obligation except to increase its profits. Its interests have diverged so far from those of the societies in which it is nominally embedded that it cannot be relied on to provide a way forward. Governments meanwhile are supine and timid, having bought hook, line and sinker the neo-liberal line that the market alone can provide, and are in any case way behind the curve. Supply-side education and training (as I keep saying), while valuable in themselves, have limited effect when companies only employ humans as a last resort. The one nod to institution-building is the emerging debate about a universal basic income – which although interesting looks increasingly like Silicon Valley’s attempt to offload what should be its own wealth- and job-creating responsibilities on to those who pay taxes, a category which does not include itself.

Yet the outlines of a new positive-sum game between business and society are tantalisingly clear. Says Perez: ‘The direction for innovation is clear: smart, green growth’ that would work within environmental constraints to bring further areas of the world into a new-look, carbon-free prosperity. For that, governments need to rouse themselves to extend the fiduciary duties of corporate managers and directors to a wider stakeholder group, prodding enterprise to embrace its proper function of innovating, as Peter Drucker put it, ‘to tame the dragon, that is, to turn a social problem into economic opportunity and economic benefit’. Management’s ‘job-to-be-done’ would switch from the paradigm of the last 100 years, efficiency, the foundation for mass-production and consumption, to effectiveness, which redefines prosperity as the availability of solutions to pressing human needs, and growth as the increasing pace with which they can be generated. Imaginative institutional reform would underpin the changes with a new social contract reshaping obligations and entitlements for the new era.

So far so good. Perez’s outline was greeted with enthusiasm. But that left one large question outstanding: how would we get from here to there? Above all, who was going to make it happen? Perez couldn’t answer that, and, perhaps predictably, the reaction was a resort to what might be called the great leadership lament: where were the great leaders of old, and who would step up to lead us to the promised land? It seemed for a moment as if the conference was stumbling to an uncharacteristically downbeat end. But that was to reckon without a second coup de théâtre courtesy of the very last speaker, social philosopher Charles Handy. Handy brought the curtain down and the hall to its feet (literally) by providing a frame for Perez’s ideas that was at the same time more daring historically and a direct call to action. Like the Catholic Church 500 years ago, Handy said, management was ripe for its own ‘95 theses’, or manifesto for reform, that would take on a corrupted institution and reinstate basic human values at its centre. As to where to start and who should lead – what better than the Drucker Forum to stand in for Wittenberg and Peter Drucker for Luther, or even Luther King, their message spread and amplified by every person in the room? His message received a standing ovation. A defining moment for the Drucker Forum? Judge for yourself: watch here.

The robots’ tale

Many Saturdays, I attend a furniture restoration workshop. Restoring battered pieces to life is inordinately satisfying in itself (as is the interchange with a richly varied, even eccentric cast of characters united only by their craft motivation). But it is also fascinating to reflect on in the context of current concerns over work and automation.

Could this kind of work ever be automated? Unlikely. It’s simply too analogue, too human, the timber (sometimes literally) too crooked.

John, the instructor/boss, has an array of carpentry and upholstery skills that can only be marvelled at. Some of these could be reproduced in machines, although only up to a point. But as impressive as the craft skills are the prodigies of improvisation brought to bear on some of the repairs. Sometimes I laugh aloud with incredulity at the exuberant ingenuity of the resolutely low-tech means used to accomplish a repair that to me looks impossible. ‘Nothing’s impossible,’ John instructs. Thus the contraption rigged up to mend and straighten the leg of a delicate chest brutalised in a previous repair came straight out of Heath Robinson, involving clamps, stray bits of wood, a steel straightedge, sellotape, twine, and a bit of blanket. This in turn illustrates why all material offcuts – everything – are kept, not in the spirit of meanness, but the exact reverse: because of their potential, almost always eventually fulfilled, for imaginative and joyful reuse.

Two contrasting recent stories about industrial automation bring this extraordinary everyday human ingenuity into sharp perspective. One concerns Tesla, Elon Musk’s electric car company, as it attempts to ramp up production of its smaller Model 3 vehicle. Model 3 is crucial for Tesla’s ambition to move out of its high-end niche to compete as a mass-market manufacturer, and it had confidently predicted that by the end of 2017 it would be pumping out 5,000 of them a week at state-of-the-art California and Nevada facilities. Needless to say, automation is a key part of Silicon Valley’s assumption that it can reinvent car manufacturing, and Musk has boasted of his vision of lights-out plants with almost no humans in attendance at all.

That now seems a distant prospect. In its most recent update Tesla conceded that a series of teething problems and production bottlenecks had left workers struggling to produce cars by hand, with the result that just 260 Model 3s were completed in the last quarter. Closer reading suggests that Tesla is waiting to commit to the huge capital outlays necessary to get production up to the planned 10,000-a-week capacity until it knows it can hit the lower target – that is, until it can make the current levels of automation work. In other words Tesla – which is burning money, reporting a larger-than-expected loss in 2017 Q3 – is betting the farm on robots doing things better than humans. Some investors are beginning to think it is not a foregone conclusion. As Gary Hamel tweeted: ‘Software may eat the world, but hardware is eating Tesla. Turns out making cars is harder than coding. Who knew?’

Well, Toyota for one, which as Fast Company reports is taking a remarkably different approach to advanced automation. Backstory: Toyota, while vying to be the largest, has always been the most profitable of the major automakers. The Toyota Production System is one of the undisputed wonders of the management world, a living, evolving thing that represents more than half a century of organisational learning.

Nonetheless, in the early 2000s, Toyota went through a bad patch. It ran into quality problems as it chased the global number one spot, culminating in a loss in 2009, followed by hugely embarrassing product recalls and a $1.2bn penalty imposed by the US Justice Department. Out of the debacle emerged a revised production system, now known as the Toyota New Global Architecture (TNGA), and in 2015 a new head of manufacturing, Mitsuru Kawai.

But in a startling case of back to the future, the new regime represents a bold advance not towards more automation but less – a return to the human craftsmanship that the TPS was built on but which was neglected in the go-go years. Kawai worked his way up from the shop floor, and in line with the conviction that automation should ‘grow organically out of human innovation’, he has launched training exercises using string-and-sealing-wax methods to devise small improvements to workplace activity – a bit more sophisticated than the rigs in my furniture workshop, certainly, but recognisably similar in their reliance on craft and human ingenuity.

Kawai’s view is straightforward and radical: ‘Humans should produce goods manually and make the process as simple as possible. Then when the process is thoroughly simplified, machines can take over. But rather than gigantic multi-function robots, we should use equipment that is adept at single simple purposes’.

These sentences should be stamped on the brow of ministers, civil servants, CEOs – anyone in danger of succumbing to the idea that digital is the automatic answer to a business problem. If they had been, we might not have wasted more than £12bn on failed NHS IT and hundreds of millions more on the grotesque inhumanities of Universal Credit, among many other examples. Robots are the apprentice, the servant, not the master; they are used not to cut costs but to free up people to do things better for customers. To rub it in: in tests, Toyota consistently finds that people can assemble cars faster than robots. What’s more, unlike machines they can improve their own efficiency and work quality.

Human by default, supplemented by frugal automation to do the boring bits that humans can’t improve on. It’s a formula that works for Toyota’s vast Georgetown plant in the US. It’s one we’d recognise in our weekly workshop in North London, too.