Needed – another extinction rebellion

In a recent column, The Guardian’s George Monbiot noted that ‘Of all the varieties of media bias, the deepest is the bias against relevance. The more important the issue, the less it is discussed’, since opening it up might trigger demands for change that would threaten powerful vested interests.

What aroused Monbiot’s ire was the lack of concern over the collapse of nature. But he could have been talking about management. Management is ubiquitous, the invisible link between the ideology of the day and what we do every day at work. It’s astonishing that although the discipline draws more students at both undergraduate and postgraduate level (God help them, and us) than any other subject, no one writes or talks about what they are being taught in the mainstream media. It is simply treated as a given. So no one would guess that our conceptual understanding of management lags far behind that of any proper science, while the practice of management is not even standing still: it is clearly getting worse. At least we are beginning to understand our effect on the rest of nature – but it’s not clear that the same can be said for management’s effects on ourselves.

The ultimate reasons go right to the top, to the heart of corporate governance. But here I want to point to a few of the small absurdities and inhumanities we are subjected to, which also speak of the bigger ones behind them.

Take first a despairing article by an anonymous policeman in The Guardian describing the near-intolerable pressures of policing in the capital. Like one in five of his colleagues in the Met, according to a recent report, the writer suffers from PTSD. He believes the police have lost control of the streets and, with just 10 officers in his borough available at any one time to attend to an emergency, admits to being scared himself. Direct cuts of 20,000 police officers (thank you, Mrs May) are bad enough; but equally sapping are reductions not only in police support staff but also in related services – ‘our duties are being stretched beyond our capabilities to include non-criminal matters regarding mental health and social services, because cuts have debilitated those sectors too.’ Unlike hospitals, the police can’t close their doors when they’re full. Instead, they turn off their telephones – so, with foot patrols almost non-existent, intelligence suffers and the chance of anything less than serious crime being dealt with falls to zero. Finally, unofficial targets for things like stop and search not only divert attention from their real purpose but increase the public disengagement and distrust that makes police work harder. This is the very opposite of smart policing: dumb, stupid, bullshit public service that disillusions both people who suffer crime and those who are supposed to prevent it.

Item two is my GP of the last 15 years, a cheerful, no-nonsense senior partner of a busy central London practice. At my last appointment, she announced that she was retiring early because although she loved the work, the strain had become unbearable. The last straw was an inspection, preparation for which had terrified staff and consumed huge amounts of time that should have been devoted to patients. The surgery passed the inspection – but only after the doctor had endured five hours of hostile interrogation which, she said, clearly started from the assumption that she was incompetent or hiding something. She went home, wept for two hours, persuaded her husband not to track down the inspectors and punch them in the face, and resigned the same evening.

My doctor isn’t alone. Most GPs intend to retire before the age of 60, according to a recent Pulse survey, blaming overwork, rampant bureaucracy and a plummeting standard of living. GP leaders called the exodus of doctors a ‘genuine tragedy and waste’. Again, this is anti-management – practice that makes the condition worse.

Item three may seem trivial, but it’s a symptom of the same deadly disease. At the university where my wife works, departments used to have their own administrators, who knew both staff and students and formed a friendly and effective link between them. To ‘streamline and professionalise’ this arrangement, administrators were brought into a central grouping, redubbed ‘professional services’. This performed the feat of upsetting staff, students and administrators themselves, who, having lost their links with people they had worked with for years, now stay in the job for months rather than years. To remedy the situation, managers are now proposing to edit and circulate a monthly newsletter describing new developments and ‘enhancements’ to a service which is infinitely less effective, more cumbersome and more costly than before. Words fail me.

Writ large, the dire financial consequences of such institutionalised idiocy can be read almost every week on the website of the National Audit Office in a report on a new shared services, outsourcing or IT project disaster. Recent examples include the complete failure of the privatised probation service (step forward, Chris Grayling), the ferry-with-no-ships saga (ditto), and, a new one on me, the lamentable story of the Emergency Services Network: years late, the new communications system for the emergency services is now forecast to cost £9.3bn, 50 per cent more than originally anticipated – and with much of the technology still not proven, the NAO doubts whether it will meet its ‘reset’ launch date of 2022. Doesn’t anyone learn anything?

Aside from mind-boggling financial ineptitude, what all these things, small and large, have in common is contempt for the human – most directly obvious in the public service examples, but equally present, in compounded form, in the NAO cases. Failed IT projects always grossly overestimate the relative importance of the technology versus that of the humans who use it, for example.

The private sector is no better – in fact it is often worse. As author Dan Lyons noted in a recent RSA presentation self-explanatorily entitled ‘Why modern work makes us miserable’, companies obsessively trying to make humans fit with what computers want rather than the other way round have given up even pretending that workers are assets. The result is not only hateful service (automated tills, chatbots, interactive voice systems) but also dehumanising work practices (gigs and precarity, or alternatively long hours under tight surveillance). It’s not even ‘efficient’: Gary Hamel estimates that excess bureaucracy costs the US private sector, supposedly the leanest in the world, trillions of dollars a year. Even Chinese workers are getting restive under the country’s ‘996’ (9 till 9, six days a week) work regime. Jeff Pfeffer in his angry Dying for a Paycheck calculated that simply going to work was the fifth biggest killer in the US, with extra costs to the health system of $2bn a year.

Do we live to work or work to live? The balance has swung so far towards the former that it sometimes seems that far from advancing, we are being propelled back to Victorian levels of exploitation and inequality. Until there’s a sharp change of direction, and we start seriously talking about what management is doing to us, humanity may be in as much danger of collapse as the planet’s rainforests or the oceans.

Managing in the wild

At first sight, Wilding, by Isabella Tree, doesn’t have much to do with business and management. But bear with me. This compelling book tells the story of the extraordinary consequences of turning a large, long-established farming estate in West Sussex back to nature. The historic 3,500-acre Knepp estate had been in the same aristocratic family for centuries. Taking over in the 1980s, author Tree and her farmer husband, heir to the estate, enthusiastically did everything the farming textbooks and consultants told them: intensifying the inputs, investing in labour-saving machinery, betting the farm, literally, on scale. But with all the advantages of ownership, experience, and access to the best conventional advice that the owners could muster, on their marginal land (Sussex Wealden clay is a nightmare to work) they simply couldn’t make it pay.

So in 2000, at the end of their tether, they sold off their expensive farm machinery and dairy herd, stopped the losing struggle against the clay and began, amateurishly at first, the process of allowing the land to rest. To begin with, they just felt relief. But almost instantly, they started to see nature pushing back with astonishing speed. Starting with semi-domesticated parkland, then extending the experiment to other parts of the estate by bringing in ponies, cattle and pigs, all allowed to roam and breed free, the owners watched in amazement as soil fertility, plant and insect life, and bird and mammal diversity exploded, making the estate an island of exuberant fertility among the surrounding swathes of agri-industrial monoculture – calling into question some basic tenets of ecology and conservation science in the process.

And the relevance to management and business? Well, this year’s Global Peter Drucker Forum, management’s Davos (for which, disclosure, I do some editing), has taken as its theme ‘the power of business ecosystems’. This is an important milestone, for two reasons. It belatedly requires management to start thinking in terms of systems, which it has astonishingly managed to avoid up till now. And while business ecosystems are man-made rather than natural, they are based on the same kind of principles. Managers may not have realised it yet, but changing the ruling business metaphor from the machine to biology has the potential to alter management’s entire underlying philosophy. In short, it is a real evolutionary shift.

Although the idea has been around in the background for a decade or more, researchers are still exploring the concept of ecosystems, and there is plenty of interesting work still to do. In the meantime, however, and in no particular order, here are a few tentative thoughts on the implications for management that suggest themselves from Tree’s remarkable story.

Control…not. Evolution, as Francis Crick once said, ‘is cleverer than you are.’ You can’t control ecosystems. Apple may have been characteristically ahead of the game in thinking in ecosystem terms with the iPhone, but the platform only took off when Steve Jobs reluctantly allowed third-party developers on to the App Store (they now number 500,000, contributing a total of 2.5m apps). In reverse, today’s intensive farming is Fordist, command-and-control agriculture. It sort of worked for a bit, but only at huge cost – requiring huge volumes of artificial inputs (aka incentives) and pesticides and herbicides (rules and sanctions), all of which destroy diversity and the ability to innovate and change – and, like command and control itself, has now become the problem. Recent concern over soil degradation only underlines the point. As Tree shows, agriculture is a classic case of the UK’s unerring ability to get the worst of both worlds – a supposedly market-based system so hedged around with the bureaucracy of regulation and subsidy that many farmers feel they have no choice over what to grow on, or even what to do with, their land. Ironically, Knepp had to struggle to be allowed to do nothing.

Balance. The first law of ecology is that everything is connected to everything else. So ‘maximising anything is fatal for the balance of the whole system’ (Charles Hampden-Turner). It’s when balance is lost that things go wrong. This means that in a business ecosystem maximising shareholder value (or short-term land yields, for that matter) was always going to be destructive and, like command and control with its panoply of targets, KPIs and other measures unrelated to purpose, has to be abandoned.

Change. For conventional management, change is a mystery and a nightmare. It takes for ever and most of the time (70 per cent, according to HBR) it doesn’t work. But under ecological rules, change is natural, happens all the time, and, as at Knepp, can take place at astonishing speed. It’s just that it’s emergent – so while you can predict the direction of travel, you can’t tell exactly where you will end up. The course wilding took, and continues to take, at Knepp has been a source of amazement and controversy to scientists, the owners and Natural England alike. Like evolution, of which it is obviously part, ecological change can’t be planned. But it can be observed, understood, learned from, and should be treated as normal.

Economy. Like any low-trust command-and-control operation, intensive farming is highly inefficient and costly as a system, both internally (bureaucratic, management intensive) and externally (obtrusive regulation, subsidies). One of the many surprises at Knepp has been the extent to which, the expensive artificial uppers and downers once removed, the ecosystem has become self-regulating. Without herbicides, ‘weeds’ have sprung up, to the great disapproval of townees who like their countryside trimmed and twee. But it turns out, from observation, that left to themselves farm and other animals use many of them for self-medication (another surprise for scientists). They already have a better, organic diet; together with the ability to dose themselves, Knepp’s cattle, ponies, pigs and deer are now largely vet- and thus expense-free, even when giving birth. The cherry on the cake, as it were, is that the solely pasture-fed meat is shown to be not only better tasting, but actually good for you, even the fat. Managing with rather than against the grain of the ecosystem is better, simpler, and lower cost, since there’s less of it.

Simpler; but – the sting in the tail – not necessarily easier. It’s already clear that ecosystems will involve managing a richer and much more nuanced range of relationships than before, which may include both competition and cooperation in different areas, formal or informal partnerships, cooperative networks in which products co-evolve, sometimes even co-creation with customers. This won’t suit managers who like things black and white, preferably in numerical form, or alpha-male CEOs who can’t bear not to be dominant. But, hey, dealing with ambiguities and interpreting relationships are a large part of what it is to be human. It’s called life. So they’ll just have to get over it and start managing as if it, and humans, mattered.

Telling tales: the difficulties of telling the truth

The ability to tell stories, it is said, is one of the qualities that differentiates humans from other animals. Stories have brought us extraordinary riches – Homer, the gods (or God), perhaps even evolution: would we have left Africa without a story of destiny, greener grass or just curiosity about what lies beyond the next hill? It’s through our ability to connect utterly different elements – a butterfly’s wings and a hurricane, say – to form a narrative that we make sense of the world and allow ourselves to feel that we are in control of our lives.

But storytelling also has a darker side, as an absorbed audience heard at a February forum organised by The Foundation devoted to that complicated subject. The human-ness that allows us to recognise – or invent – a good story also inflects the way we receive it. All too often stories lead us to terrible places. In a minor key, actor Mark Rylance related how at the Globe theatre he was forced to throttle back the famous call to arms in Shakespeare’s Henry V when he realised the frenzy of anti-French hostility he was whipping up in the groundling audience. For the real-life consequences, look no further than the Inquisition, National Socialism, or Isis.

Yet the potency of storytelling takes on an added importance today, in an age that has been widely characterised as ‘post truth’ – an age of fake news and ‘alternative facts’, where our natural inventiveness on one side and gullibility on the other are sometimes supplemented by deliberate manipulation by ever more sophisticated technological means. To such a degree that, as one Forum speaker, satirist John Morton, put it, stories become all that we have, in the sense that, in the absence of absolute truth, ‘we think, okay, we live with a collection of competing narratives, that’s all we have to sustain us’. We no longer accept having ‘truth’ curated for us by the church, the mainstream press, or the political parties. We have had enough, Michael Gove said, of experts. So which, or whose, stories are we to privilege? How do we know which to trust and which to dismiss?

And here’s the rub. As humans, the speakers noted, we respond to stories not with Enlightenment-style logic and rationality, but with a very human logic in which ‘facts’ and ‘evidence’ are just the instruments that get us from A to B, B being the place our emotions have already decided we are going. As story-telling coach (and comic) Tom Salinsky compellingly showed with the example of the first and last voyage of the Titanic, the irresistible appeal of stories is fuelled by the combination of a few relatively simple components: an unexpected event or paradox – an unsinkable ship that sinks, in this case compounded by the fact that it was on its maiden voyage (you couldn’t make it up); a dramatic immediate cause (the iceberg), hiding a deeper one (hubris, or human folly); and a poignant human hook (the band played on as the ship sank). What changes the account from an encyclopedia entry into a story is, first, the human hook, which plays to the primacy of emotion in our responses, overriding reason in the process of decision-making almost every time. ‘The factual stuff is necessary to provide the context, but it’s that moment of emotional catharsis that we remember and that moves us,’ said Salinsky. ‘That’s what gives stories their power, and why also they’re potentially so dangerous’.

The second essential feature is cause and effect. ‘Cause and effect is what stories run on – without it there isn’t a story’, Salinsky noted. No accident, then, that identifying cause and effect is the central quest of much of literature, including the entire genre of detective fiction.

Directly causal connections are of course much harder to establish in social and human affairs than they are in the physical world. Hence the flourishing of fake news, poisonous rumour and conspiracy theories, which in turn augment the violence of political or ideological arguments or assertions – about Brexit or Trump, for example – that rapidly colonise the truth-free space and crowd out less extreme interpretations. Both of these are extraordinary illustrations of the power of a good narrative (‘Make America great again!’, ‘Take back control!’), whether real or imagined, to trump mountains of earnest but story-less facts and figures. Less obviously, both, as Morton pointed out, are classic examples of stories escaping control and developing lives of their own, independent of their makers. He cited the ‘humiliating’ authorial experience of having characters or story-lines refuse to follow the course allotted to them in the outline. But it wasn’t just that stories could take you to a destination you didn’t intend – nor could you control how they landed in and interacted with the real world, sometimes even altering it in their own image, as, arguably, in the case of political satire: what started out with the relatively benign Spitting Image ended up with the scabrous The Thick of It, in which all politicians are duplicitous, stupid or borderline criminal. ‘So I’m wondering,’ said Morton, ‘whether one unintended consequence of the satirical brilliance of The Thick Of It when it got out into the real world was that it was one of the causal factors in the kind of mad, terrible world we live in now’.

Some part of the problem may be epistemological. As the Danish philosopher Søren Kierkegaard famously noted, while life is understood backwards, it is lived forwards. The only way the realities of a present life can be force-fitted to a destination decided in advance is by doing violence either to one’s own beliefs or to those of others. This may help explain why we live in what might be called, as in the title of a recent BBC Radio 4 series, ‘the age of denial’. Life contains so many unspeakable, awful things that we can’t individually do much about – climate change, plunging biodiversity, exclusion, slavery, child abuse – that the only way we can deal with them is to blot them out. Complicating matters, blotting out the unacceptable – the ‘optimism bias’ – may be evolutionarily essential: otherwise why go on living? Its complement, the ‘negativity bias’ (the salience of bad news over good), is equally essential in keeping us alert to the constant threat of danger. So where’s the balance?

There’s little doubt that all today’s tendencies, but particularly denial, have been supercharged by social media and the internet, both of which radically expand the scope for group polarisation. As a respected journalist and commentator, Gavin Esler has watched with concern as fantasy and malevolence make it ever harder for balance and insight to be heard. Again, Trump is the telltale example here. Never mind all the other disqualifications: how is it, Esler demanded, that a president of the United States can get away with telling, on the Washington Post’s reckoning, 6,420 lies in his first two years in office – latterly at a rate of 15 a day, many of them breathtakingly blatant untruths – and still retain the confidence of 40 per cent of Americans, who would never allow the same latitude to his opponents? The answer, Esler suggested, was that there was no pretence about Trump. No one could doubt that what they saw was what they got. Trump was authentically himself – a liar – knew it, and acted it to the hilt. He didn’t need to be an earnest or careful denier – it simply wasn’t important. This puts Trump so far ahead of the curve that some have termed him a ‘post-denialist’ – someone so unconcerned about truth or fact that he doesn’t even bother to justify his lies.

Generalised post-denialism would be an internet-age dystopia beyond anything that Orwell or Huxley could have invented, with implications that scarcely bear thinking about. If we are not to go that way, at some stage the fightback has to start. ‘At some point it seems to me we have to reassert that facts do matter’, said Esler. ‘No matter how flat I feel the world is, it isn’t, and if the facts don’t matter, any of us in the journalism or communications business might as well pack up and go home.’

Part of the answer, it was suggested, was to get beyond the facile notion of authenticity that certain demagogues have learned to play so effectively: ‘I just say what I think’ is the common justification, but just saying what you think doesn’t make it right, clever or of any value at all. Sincerity, Esler proposed, was a better criterion to judge by. On the positive side, he added, belying today’s fashionable stereotypes, many politicians are good, intelligent people genuinely motivated by the desire to improve lives: their story deserves to be heard too. Another part of the answer is surely to oblige social media and tech companies to face up to their responsibilities by making them accountable for their content in the same way as the struggling traditional media – as they should have been from the beginning.

Finally, of course, the stories swirling around us are ours too, and it is up to us to handle them with as much care as we can. ‘Nowadays, anyone who wishes to combat lies and ignorance and to write the truth must overcome at least five difficulties. He must have the courage to write the truth when truth is everywhere opposed; the keenness to recognize it, although it is everywhere concealed; the skill to manipulate it as a weapon; the judgment to select those in whose hands it will be effective; and the cunning to spread the truth among such persons. These are formidable problems for writers living under Fascism, but they exist also for those writers who have fled or been exiled; they exist even for writers working in countries where civil liberty prevails.’ That text for today was penned by the great committed German poet, Bertolt Brecht. He wrote it in 1935.

The power of words

A couple of days ago, I came across a copy of Winston Churchill’s ‘We shall fight on the beaches’ speech of June 1940. It had languished unread in my study since it was reprinted by The Guardian in a series of ‘Great speeches of the 20th century’ in 2007. I’m well aware that ‘Churchill, hero or villain?’ has recently been at the epicentre of a ludicrously trumped-up controversy on Twitter. But whatever your opinion of the old bruiser (and I recommend a quick read of Simon Jenkins’ article in The Guardian to put the matter in perspective), Churchill’s words, by both omission and commission, contain some lessons that any of today’s politicians making dismal idiots of themselves over Brexit could profitably take to heart.

Reading the speech now is an instructive experience. It is absolutely calculated with one end in view: to create unity. From the first word, you know you are not just in the presence of momentous events – you are a participant in them. Every paragraph is about ‘we’. I was familiar with the rousing final ‘We shall go on to the end…’ peroration, of course – but taking in the rest of what is quite a long and dense speech for the first time, the well-known ending is not the most remarkable thing about it. As Simon Schama notes in his short introduction, by far the most striking feature for today’s reader or listener, a companion to the ‘we’, is the startling rhetorical tactic from which the speech draws its persuasive force: honesty.

There is no fancy introduction. Churchill launches straight into a vivid description of the German May blitzkrieg and the desperate retreat of French and British troops to the Channel ports that had me, as no doubt listeners at the time, reaching for a map to follow the strategic sweep and grasp its implications. He makes no attempt to hide the losses and their consequences. While Dunkirk, which immediately preceded the speech, is a ‘miracle of deliverance’, he leaves no room for doubt about the height of the stakes (‘the whole root and core and brain of the British army… seemed about to perish upon the field’), nor that deliverance has been plucked at the last second from the jaws of disaster. ‘We must be very careful not to assign to this deliverance the attributes of a victory. Wars are not won by evacuations,’ he warns. Make no mistake, the whole episode has been ‘a colossal military disaster’ that has left the French army weakened, the ‘fine’ Belgian army lost, the channel ports and swathes of northern France including ‘valuable mining districts’ in enemy hands, and a daunting quantity of guns and equipment, all of which would have to be painfully built up again, simply abandoned.

Nor is this all. With ‘remorseless candour’ (the description of the reporter of the then Manchester Guardian), the Prime Minister goes on to set out the likely next developments. Hitler might be expected to follow up quickly with a strike at France. Or at Britain: ‘When we see the originality of malice, the ingenuity of aggression, which our enemy displays, we may certainly prepare ourselves for every kind of novel stratagem and every kind of brutal and treacherous manoeuvre.’

But although there is no mistaking the gravity of the situation, or the possibility of worse to come, the tone is above all one of facing down the adversity. When Churchill pays tribute to the bravery of the retreating troops (including French and Belgian) and the extraordinary efforts of the RAF (with a prescient nod to its future role in protecting our own shores), Navy and little ships that brought off 335,000 allied soldiers from the beaches, the statement of national unity in the defiance is uncompromising. We really are all in this together. Yet there is no sense of Little Englandism. Unlike today’s MPs with their excruciatingly emphasized ‘our country’, ‘the British people’, not to mention ‘the will of the British people’, Churchill doesn’t do the virtue-signalling patriotism – he sometimes uses ‘this island’ or ‘the nation’, but mostly simply ‘we’. He repeats the pronoun no less than 10 times in the first half of the famous peroration. Then in a brilliant final coup, he first glancingly evokes the possibility of a British defeat (‘even if… this island or a large part of it were subjugated and starving…’), before closing off the conditional by broadening the ‘us’ to include the British empire and the US, which would in that case carry on the fight until the job was done.

But first there’s another lesson for 2019’s MPs. Taking stock of the need, even after the recent losses, to balance home defence with ‘the largest possible potential of offensive effort’, Churchill proposes that the house discuss the subject in secret session – this partly to avoid giving useful information to the enemy, but mainly because the government ‘would benefit by views freely expressed in all parts of the house by members with their knowledge of so many different parts of the country.’

Let that sink in a bit.

The UK is not currently at war in the most literal sense (although to use the phrase is to be aware of the just-subterranean parallels). But contrast the inclusion, unity of purpose and clarity of vision set out in the 1940 speech – a perfect statement for the time – with today’s sorry equivalent at another moment of national crisis: ‘a jumble of jargon, jousting and gibberish, with everyone sucked into the vortex of confusion, to the exclusion of every other issue in the world’, in the words of New York Times columnist Roger Cohen, in principle a friendly observer. Like all the key terms used in Parliament at the moment, the desperate protestations of clarity (‘Let me be clear that…’, ‘The Prime Minister has made it very clear that…’) only underline the reverse: the sole clarity on offer, as Cohen notes, is that no one has a clue what will happen next. The hero-worship of some Brexiters for Churchill (Richard Evans’ takedown of Boris Johnson’s ‘biography’ in the New Statesman is irresistible) is in this context deeply ironic, since the cacophony reproduced daily in the House of Commons displays all the qualities of the June 1940 speech in reverse. It is a monument to muddle, fudge, discord and dissembling that can only comfort enemies and dismay friends. The rhetoric is meretricious and sham, and – again unlike the Churchill example – nothing good can possibly come out of it.

Rogue state

Just now it’s hard to avoid the subject of Brexit, which – irrespective of your referendum choice – is a tale of government ineptitude so great that it inspires a kind of awe: misbegotten in concept, disastrously managed and losing no opportunity to make things worse as it went along. And behind Brexit lurks something even more worrying than our fate within or without Europe: an atrophy of the idea of the state that not only makes a mockery of the notion of taking back control of anything, but makes one fearful for our ability to preserve a functioning democracy.

In fact, the Brexit tale starts well before the referendum was even a mention in a Tory manifesto – and ironically, except in the fevered nationalist imaginings of the European Research Group, it has very little to do with Europe. In this version, the origins of Brexit are to be found in the backwash from the 2008 financial crisis.

The reasoning goes like this. The GFC – a banking, not a spending, crisis, itself the result of the policy failure to rein in the financial sector – severely dented the UK’s public finances. That triggered austerity, the brunt of which (89 per cent) fell on public expenditure, in particular on local authorities in already deprived areas, whose citizens responded by enthusiastically voting UKIP in local elections. This in turn frightened the Conservatives into promising a referendum in the election of 2015, with the results that we know. The correlation between votes for UKIP in elections and Leave in the referendum suggests that without austerity – and the sadistic form in which it was administered – the referendum result would have been different. So there is a straight line from the Great Financial Crash through austerity to Brexit that never touches Europe, except of course as collateral damage.

The story of Brexit post-referendum is just as hapless – as far-fetched, although not as funny, as a Gilbert and Sullivan plot (martial law and the evacuation of the royals after a crash exit, anyone?). Displaying what can only be described as reckless negligence, the government appears to have done no due diligence on either the external or the internal consequences of its actions, underestimating the negative effects on its own integrity (Northern Ireland, Scotland) as much as it overestimated European willingness to satisfy our notorious appetite for cake both to scoff now and to hoard for future midnight feasts. There was an advisory referendum on the most important political decision for a couple of generations that somehow morphed into the will of the people, despite possible manipulation and a majority that wouldn’t have sufficed to alter a local golf club’s membership dues. Then there were negotiations not with the EU but between different governmental factions; and when government members now disingenuously warn of social unrest, the opposition is again not Europe but internal. Brexit has turned into a nightmare Catch-22: if we knew our history, including that of two world wars in the last century, we wouldn’t do it in a million years; so the only way to do it is by ignoring history – which of course guarantees that we shouldn’t do it. We’re reliving history without learning its lessons.

That, however, is not all. Writing in The Guardian, Fintan O’Toole, one of the sharpest commentators on the whole saga, noted: ‘Brexit is really just the vehicle that has delivered a fraught state to a place where it can no longer pretend to be a settled and functioning democracy… It is time to move on from the pretence that the problem with British democracy is the EU and to recognise that it is with itself.’ Part of the problem, as O’Toole notes, is what to do with English nationalism. But that is compounded by the fact that the UK no longer believes in its ability to carry out many of the traditional roles of the state, which it has meekly abandoned to the market and business. It is a self-hating state which now finds itself almost completely bereft of its traditional defences and competencies at exactly the moment – with danger and turmoil swirling around – when statecraft is most needed.

In retrospect, the extent and urgency of these failings were brought into sharp focus in a forthright presentation by the FT’s Martin Wolf at last November’s Global Peter Drucker Forum in Vienna. In a good session on ‘Beyond market failures: how the state creates value’ (you can watch it here), Wolf laid out some basic home truths. The state, he asserted, was ‘the most important institutional innovation in human history, as essential now as it’s ever been’ (the idea that we would all be in heaven if it only got out of the way, he added, was ‘only possible for people who are so used to strong and powerful states that they cannot imagine their disappearance’). Leaving aside taken-for-granted aspects like security, the justice and legal system, and the laws governing the roles, purpose and legitimate operations of business, all of which just happen to be ‘a total mess’ – ‘and if this isn’t important I don’t know what is’ – all our current priorities of broadly shared and sustainable prosperity, financial stability and environmental protection require the active intervention of a state that is ‘ambitious, effective and … under democratic control’. For many states, perhaps especially our own, that is a very big ask. Yet without it, and without states and governments getting better at cooperating with each other than they presently are, said Wolf, we face a crisis in which the brave new technologies that we set such store by ‘will in my view destroy us.’

Those, in the words of the FT’s chief economic commentator, are the stakes. It is little comfort to know that Brexit in that perspective is just a taster of bigger tests to come.

Managing as if the world matters

As the transmission belt between ideology and the behaviour of companies, the most important actors in the modern economy – and one moreover that transmits both ways via economic and political power – management has always been much more important than most people imagine. Never mind Brexit: on the way it develops from here may now depend the future of the world.

It has done so before. In the decades after WWII, the (mostly) virtuous circle of rising profits feeding into investment in new plant and, above all, good jobs fuelled a (mostly) balanced rise in prosperity across Western economies that could not be matched in the rigid Soviet bloc. But as Henry Mintzberg has often insisted, the fall of the Soviet Union wasn’t a case of capitalism defeating socialism. It was a case of balanced, plural societies, comprising vibrant private, public and civic sectors, proving more flexible and productive than unbalanced ones consisting only of a public sector. Now, having learned the wrong lesson, the West, particularly the Anglophone West, is making the same mistake as the Soviet Union but in reverse: encouraging a rapacious private sector to exploit or eat everything else in its path.

The upheavals of 2016 allow us to connect up the suddenly visible economic dots: these, together with other manifestations of populism and nationalism erupting like a plague of boils all over Europe, are the direct outcome of our own lack of balance, which, if left unchecked, will see our pretensions crumble as comprehensively as the Berlin Wall in 1989.

The irony is that what today’s malcontents actually want, however crude their expression of it, is neither complicated nor outlandish. In a 2017 Legatum survey, respondents listed their priorities, in order, as food, water, emergency services, healthcare, housing, jobs and education. A car and air travel came way down the list, which probably hasn’t changed much since the 1950s – the main difference being that then everything on it was in the realm of the realistically attainable.

Now as then, the key ingredient is jobs, which are largely the creation of a vigorous private sector – ie companies. The corollary, as commentators such as the FT’s Martin Wolf never tire of pointing out, is that how companies are managed and to what ends – the ideology of management – is a macroeconomic issue of the highest importance. In case there is any doubt just how high, here is Colin Mayer, former dean of Oxford’s Saïd Business School, in his much-noticed new book on the corporation, Prosperity:

With the emergence of the mindful corporation we could therefore be on the edge of the most remarkable prosperity and creativity in the history of the world. On the other hand we could equally well be at the mercy of corporations that are the seeds of our destruction through growing inequality, national conflicts and environmental collapse on scales that are almost impossible to conceive of today. We are therefore on the border between creation and cataclysm, and the corporation is in large part the determinant of what way we will go.

Our very future, he goes on, ‘depends on reinventing the corporation’. The good news is that that is perfectly possible – Mayer points to the six previous ages that it has evolved through since Roman times, steadily increasing its purview through public services, guilds, towns, universities, the church, hospitals, trading, manufacture, transport, safekeeping, lending, insurance and financial instrument trading, as evidence of the corporation’s extraordinary protean ability to adapt to and embrace new purposes and functions according to the demands of the times. The bad news is that this animate, dynamic form has been captured by an abstract, mechanistic management and governance model that prevents further evolution by positing a corporate ‘end of history’, converged on a one-dimensional, shareholder-controlled, profit-oriented enterprise model.

Changing that means changing the underlying ideology of management. And that won’t be easy. Back in 1998 Sumantra Ghoshal and I wrote a piece for the FT about the spread of ‘asshole management’ – ‘an infernal cycle’, driven by the demands of the capital markets for ever greater returns, ‘in which managers, change agents and academics were all collaborating to make both work and leadership … a crippling and inhuman experience’. Twenty years later that ‘profoundly coercive system’ has tightened more than we could have imagined as activist hedge funds and private equity have squeezed companies until the pips – pensions, careers, welfare and now jobs – squeaked and popped. Surveillance capitalism, allowing companies to manipulate and hack humans – to the extent of compromising free will, in the view of some observers – has added another vicious twist to the screw. In self-fulfilling prophecy, this has now become the norm: we no longer imagine there is an alternative. Ruthlessness begets ruthlessness, so we now have asshole companies and asshole (‘hostile’) government departments as management and corporations recast everything they come into contact with in their own reductive, inhuman image. It’s no accident that the US now has an asshole president, the perfect incarnation of a capitalism that makes assholes of us all.

Yet could we, just maybe, have reached – apologies – peak asshole? The idiocies in Trump’s self-contradictions are too gross to be treated with anything other than contempt. Big tech’s excesses have triggered a backlash that is gathering momentum. The Orwellian ‘reversifications’ (John Lanchester’s term) that current management systematically leads to – banks that make people poorer, hospitals that kill, welfare that immiserates, a Home Office that creates aliens not citizens – are beginning to create a similar outcry. The insulting contrast between the jobs, housing and public services that we want and the HS2, Heathrow expansion and Brexit that we get has come into focus like never before. Above all, the miasma of fascism drifting up from our fractured societies puts any idea that we can return to business as usual out of the question.

In their 1986 blockbuster Megatrends, John Naisbitt and Patricia Aburdene wrote: ‘Whenever a new technology is introduced into society, there must be a counterbalancing human response… We must learn to balance the material wonders of technology with the spiritual demands of our human nature’. That’s true. But just so there’s no misunderstanding, the technology that most urgently needs humanising is the meta-technology that underpins how all the others are used: management.

The building of a bullshit economy

In 2013, anthropologist David Graeber, now a professor at LSE, crashed the website of a small magazine with a short essay that struck a chord all over the world: ‘On the Phenomenon of Bullshit Jobs’.

Graeber, who later extended the article into a book, was struck by the number of jobs thought pointless even by those who did them. He pondered Keynes’ much-quoted prediction that we would all work 15 hours a week by the year 2000, and noted capitalists’ aversion to spending money on unnecessary jobs (or even necessary ones: ‘No – give me back my fucking money!’ Trump reportedly raged on finding he was supposed to employ a transition team when moving into the White House). So what was going on?

Graeber was acute in nailing the proliferation of non-jobs, but less so in explaining it. In fact the situation is more insidious than his account suggests, if admittedly duller. It is not, as he argued, primarily the result of ‘managerial feudalism’ (employing flunkies to big up your status), nor a dark plot by the ruling class to keep workers out of mischief by insisting on the sanctity of work even when it is valueless, although that is an outcome. Instead it is the predictable consequence of our current destructive management beliefs and the work designs they lead to.

The reasons are fairly simple. Since companies put their own short-term interests above those of society, there is constant friction at the margins of what’s legal or at least acceptable. Pushing too far leads to scandal (Enron), crash (Lehman) or both (2008), and, as surely as night follows day, to regulation that bolts the door after the horse has departed. As John Kay wearily explains, ‘We have dysfunctional structures that give rise to behaviour that we don’t want. We respond to these structures by identifying the undesirable behaviour, and telling people to stop. We find the same problem emerges, in a slightly different guise. So we construct new rules. And so on. And on. And on.’

As regulation gets ever more complicated, it evolves into an industry in its own right, with its own vested interests and bureaucracy – a monstrously growing succubus symbiotic with the industries it is supposed to control. You can watch the process playing out again in Silicon Valley now. ‘Facebook puts profits above care for democracy’, proclaimed the FT in a recent article. Of course it does: that’s what managers have been taught to do. The demand for regulation is steadily building as a consequence.

Don’t get me wrong – Big Tech needs reining in as urgently as Big Finance. But as a manifestation of a bigger problem – the ‘dysfunctional structure’ that generates regulation that is simultaneously necessary and useless – the only solution is to reduce the need for regulation in the first place by placing a duty of care on companies for the society they form part of. In other words, regulatory jobs are net energy- and value-sapping jobs which shouldn’t exist – the creation of philosopher John Locke’s madman, ‘someone reasoning correctly from erroneous premises’. As Peter Drucker put it, ‘There is nothing quite so useless as doing with great efficiency something that should not be done at all’.

And here’s the thing. The dysfunctional structure is fractal, replicated at every level down through the organisation. Since it assumes at least some workers, including managers, will shirk and skive, management is geared for control rather than trust. Low-trust organisations run on rules, surveillance and performance management – which, through the process of self-fulfilling prophecy, actually makes untrustworthy, or at least unengaged, behaviour more likely. Look no further for the cause of the apparent paradox, noted by Graeber, that bureaucracy proliferates just as much in the supposedly lean and efficient private sector as in the public. In effect, each company carries the burden of its own regulatory apparatus. In 2016 Gary Hamel estimated that excess bureaucracy was costing the US $3tr a year in lost productivity, or 17 per cent of GDP. Across the OECD, what we might call the ‘bullshit tax’ amounted to $5.4tr. ‘Bureaucracy must die!’ proclaims Hamel. Yet he concedes that despite his campaign, it seems to get worse, not better.

Finally, with the ideology of Public Choice, the same pessimistic assumptions and stultifying management structures have been visited on the public sector in the form of New Public Management, with exactly the same results. Marketisation has added a further overlay of bullshit. Symptomatic is the experience of the university sector: compare the stagnant salaries and worsening conditions of academic staff with burgeoning jobs (and salary levels) in administration and management (especially at the top) and the creation of entirely new departments concerned with branding, PR and massaging the all-important student satisfaction figures – an enormous increase in pointless overhead on the real work of turning out critical citizens who can distinguish real value from hot air.

Putting all this together, it is hardly surprising that the US and UK, as the most extreme proponents of deregulation and privatisation, are, with delicious irony, more subject to this systemic bureaucratisation than other, less laissez-faire economies. So much so that it is tempting to characterise the UK in particular as a bullshit economy. Having largely abandoned manufacturing, it prides itself on being a purveyor of financial and professional services, selling advice and other products whose social value is dubious, to say the least. The extreme and paradigmatic case is advertising. ‘The UK advertising industry,’ a recent House of Lords report solemnly intoned, ‘is a success story. Advertising fuels the economy by helping businesses to grow and compete against one another. It is also a significant sector of the economy on its own. The UK, especially London, is a global centre for advertising, exporting services to clients around the world’ – and plenty more in the same vein.

Well, maybe. But in its own terms, as senior adman Dave Trott succinctly told a BBC Radio 4 audience recently, of the £23bn worth of ads purchased annually in the UK, ‘4 per cent are remembered positively, 7 per cent are remembered negatively, and 89 per cent are neither noticed nor remembered at all’. Let that sink in a minute. £20bn of ads that might as well never have been created – that is bullshit of an awesome order.

Bullshit generates more bullshit. ‘The best minds of my generation are thinking about how to make people click on ads’, one Silicon Valley techie accurately noted. ‘And that sucks.’ Or about spin and fakery – another British ‘success story’ that bloats as newsrooms shrink. PR people now outnumber reporters five to one, compared with two to one 15 years ago. Which is why this kind of bullshit/bureaucracy is so hard to root out. It’s what happens when economic incentives are out of line with society’s interests. It’s not a bug in the system – it’s a feature. It won’t change, in other words, until everything changes.

Corporate reform grows in unlikely places

It’s mea culpa time. After a grief-and-denial phase, the growth of populism is producing a rare outbreak of handwringing among the liberal elite, as we now have to call them, as they own up, at least partially, to their part in bringing about the angry, polarised world that we now inhabit. Theresa May was the first to put her hand up with her ‘barely managing’ and ‘capitalism for all’ on the doorstep of No 10 after the botched election of 2017, but all that has long since disappeared into the black hole of Brexit (from which, pace the late Stephen Hawking, nothing ever returns).

More recent owners-up include former Treasury Secretary Larry Summers, who in a bizarre and awkward FT piece recounted his astonished discovery of ‘the way of life’ of ‘the rest of America’ on a two-week transcontinental car trip this summer – a wonderful example of class cluelessness. And The Economist, where editor Zanny Minton Beddoes penned a 10,000-word manifesto for liberalism in which she lamented, rightly, that too many liberals had turned conservative, shunning calls for bold reforms to an economic neoliberalism out of which they had actually done rather well.

Minton Beddoes puts forward a number of proposals to put liberalism back on track, ranging from upholding free trade to moderating immigration, enforcing competition policy and dreaming up a new social contract. But although she is right about the need to do something about ‘left behind’ places and people, nowhere, surprisingly, is there an acknowledgement of the extent of the challenge to liberal ‘business as usual’ from the pincer jaws of neoliberal global financialisation on one side and what economic historian Carlota Perez insists is the burgeoning fifth (not fourth) industrial revolution, comprising the internet and mobile communications, on the other.

What no one has picked up is one of the most obvious things of all. Whether what we are going through is the fourth or the fifth revolution, it is different in one crucial sense from all those that have gone before. Right up until this one, economic incentives were largely aligned with the interests of wider society. Broadly speaking, corporate growth led to prosperity through the creation of well-paid, full-time jobs.

Two important things have happened to change that. First, economic incentives have been yanked round to pull in a different direction, encouraging businesses to treat human employment as just another means to the end of enriching shareholders – another cost to be minimised.

Now, companies are still growing all right; but they only employ people when they have no alternative, and then preferably on minimum pay and zero hours. Look at Uber, the totemic platform enterprise, which is rushing as fast as it can to perfect autonomous vehicles that would spare it from employing anyone at all, bar a few economists and quants to fine-tune its surge-pricing algorithms. This kind of work is not a reliable route out of poverty, and growth can no longer spread wider prosperity when big companies are spending 90 per cent of their earnings on stock buybacks for the benefit of shareholders. The sums are stupendous: over the last decade Apple has spent $102bn (with another $210bn to come!); Microsoft $878bn; Cisco $228bn; Oracle $67bn; JPM Chase $63bn; Wells Fargo $56bn; Intel $55bn; Home Depot $51bn. Meanwhile real US wage levels have barely budged in getting on for half a century.

The second thing that has changed since the last great growth surge is the power and wealth of the largest corporations, and the monstrous accumulation of vested interest that has resulted, knitting together a formidable fellow-travelling ecology of consultancies, business schools and investment funds, which together have effectively captured the political process. ‘There is no force on earth that can stand up effectively, year after year, against the thousands of individuals and hundreds of millions of dollars in the Washington swamp aimed at influencing the legislative and electoral process,’ former Fed chairman Paul Volcker declared in the New York Times recently.

While economic incentives conflict with society’s interests, prospects of dealing with the immediate Frankensteins of ramping inequality and the desertification of the jobs market, let alone resolving major problems like climate change and shrinking biodiversity, are grim. Other adjustments, both social and institutional, will be needed too. But there will be no lasting solutions until business and social needs are pulling in the same direction. That means altering the incentives. And given the weight of the aforementioned vested interests and lobbying power, only the most determined effort will prevail.

This is why Elizabeth Warren’s Accountable Capitalism Act, now before the US Senate, is cautiously encouraging. Warren, a hard-nosed Democrat law professor who is mulling a run at the presidency in 2020, knows business, having worked in bankruptcy and consumer protection; she also grasps that the big issue is not identity – it’s the economy, stupid.

Warren’s bill is radical and simple. It would set up an Office of US Corporations which would require the biggest firms to adopt a federal charter mandating them to consider the interests of all stakeholders – workers, customers and communities – and not just shareholders. Workers would elect 40 per cent of the board, and there would be restrictions on the way executive stock options (which have played a huge part in the systematic enrichment of the executive class) are exercised.

Warren’s bill of course has little chance of making it into law any time soon. But it is being taken seriously and draws on a number of strands of public approval, notably worker representation in boardrooms. Not only that: the concept of ‘accountable capitalism’ can be read as a legislative response to the much-remarked call of Larry Fink, head of BlackRock, the world’s largest investor, for firms to show they were making a positive as well as financial contribution to society. ‘Companies must benefit all of their stakeholders, including shareholders, employees, customers, and the communities in which they operate’, he wrote to CEOs earlier this year.

Warren and Fink are a powerful duo. Their initiatives should give much-needed heart, and sinew, to British progressives, who are now paying the same high price as US Democrats for backtracking from corporate reform before the Crash (yes, Tony Blair, we do mean you). A similar approach by the two major shareholder-dominated economies would give a much better chance of making reforms stick – and is probably the only hope of overcoming the abject funk of Westminster and Whitehall at the possibility of being labelled anti-business. Perhaps the UK has more riding on the outcome of the US mid-terms than we thought.

CEO activism is a dangerous game

Time was when trying to get CEOs to talk about anything more controversial than last year’s profits or the new marketing campaign was like drawing teeth. Even on matters that affected them directly, like tax or budget changes, they preferred to let their organisations ventriloquise for them. But that is changing – not so much in the UK (where even Brexit and Corbyn’s worker-share proposals have failed to evoke more than a few strangulated yelps from the nation’s boardrooms) as in the US, where a recent wave of ‘CEO activism’ has caused excited comment.

Several factors are driving the phenomenon. On the demand side, in a bitterly divided post-trust world, the appetite for figures able (or at least willing) to supply answers to existential questions has never been more intense, especially among needy, brand-conscious younger employees – which puts their CEOs squarely on the spot. On the supply side, CEOs are increasingly pleased to step into the spotlight – partly because they feel they have an undeniable right to strong opinions on controversial issues, and perhaps more cynically because they are confident of reaping the rewards in exposure and lobbying influence available to those able and unsqueamish enough to exploit an increasingly competitive celebrity culture. Whatever the reasons, over the last two years CEOs have become ever bolder in speaking out on issues including immigration, the environment, gun law, LGBT rights, inequality, and race and gender relations, to name the most prominent.

They could plausibly argue that they have little choice. In the world we live in, controversy is unavoidable. Yet it is uncomfortable, even dangerous territory. As too many have come to realise, the first law of celebrity is that it is as easy to die by as to live by. In the age of social media, a reputation can be, and often is, swept away by a twitterstorm overnight. Ask Elon Musk, Travis Kalanick or Elizabeth Holmes at Theranos. United Airlines only survived the ‘unfortunate incident’ in which it brutally removed a doctor from an overbooked flight in Chicago by making a grovelling apology, paying an undisclosed sum in compensation, sacking two officials and upping the offer to passengers willing to give up their seat from $400 to $10,000. To state the obvious, high-profile CEO interventions come with consequences, sometimes costly, first and foremost for direct stakeholders and rippling out to the wider society. In turn, that requires CEOs to weigh those consequences carefully – which only gets them into even deeper moral waters. Calculating the relative costs and benefits of standing up for a cause for, say, shareholders and society is not only difficult – even in the unlikely event of its being clear cut, it is a hopelessly unreliable guide to action. As John Kay (for the record, an economist) helpfully put it, ‘ethics is about what you do when good behaviour and profitable business are not necessarily the same thing.’ Damned if you do; even more damned if you don’t.

There is an even more dangerous side to CEO activism, however – and it is playing out in front of us every day. It’s a fairly small step from activism in public affairs to wanting to shape or even control them (that’s surely the point). For where this can lead, look no further than the US White House, the antics of whose current occupant suggest that the direct import of business into government makes for explosive, possibly nuclear, results. The temptation to wish Trump out at any cost is strong; but is the prospect of the only viable opponents consisting of other businessmen (Bezos, Zuckerberg, Bloomberg, anyone?) much more enticing? Giant companies and the very rich already have much too much sway, both direct and indirect, over our lives. What we need is a straight alternative to the money- and profit-centric view of life that has got us into the current mess, not just a less voracious version.

The irony is that there is one obvious area where CEO activism would not only be welcome but has been awaited in vain for years. It is an area that, unlike many others, business is uniquely well qualified both to speak and to act on – and it would benefit everyone, not just sectional interests. Give up? It’s business itself.

Long before the Great Financial Crash in 2008, it was clear to everyone not wearing earplugs that the tocsins were ringing. Shareholder value maximisation, that simplistic and treacherous mantra, was a corrupt, busted flush, benefiting only chief executives loaded with share options and hedge funds that take no thought for the health of the overall system. In an angry review of Deborah Hargreaves’ ‘devastating’ take-down of executive pay, Are CEOs Overpaid?, Margaret Heffernan, herself an entrepreneur and business leader, charges that the real failing of the current generation of Anglosphere CEOs is not, as so often posited, greed. She writes: ‘I’m not sure chief executives are merely greedy. What I’ve seen, in the US and UK, is more disheartening than greed. These men – and they are mostly men – are not leaders but followers. They are afraid to step out of line and set a better example. Instead, accepting their huge salaries, they hide behind an old, discredited alibi: everyone’s doing it.’

Coming out for minorities is all well and good, but away from the public eye many of those worst treated are toiling within the very firms those CEOs speak for. From rock ‘n’ roll to grunge, commerce has always been quick to spot the possibilities in coopting rebellion – see Nike’s recent ‘Just Do It’ ad campaign featuring Colin Kaepernick, the mixed-race American footballer who initiated the practice of black players kneeling during the national anthem as a protest against racial injustice. For Nike, causing controversy was part of the point. CEO activism could, and almost certainly will, be read as something similar – cynical virtue-signalling or personal brand advertising – unless CEOs first deploy it to put their own damaged house in order.

And we’ll take the low road

On 15 September, economist Mariana Mazzucato tweeted:

Great 2 days in San Francisco for West Coast launch of Value of Everything [her new book]. But the number of suffering homeless people on the street was much greater than I have ever seen and left bitter taste. Only one word can describe it: barbaric. Humans in 21st capitalism deserve better.

Her equally eminent colleague at UCL’s Institute for Innovation and Public Purpose, Carlota Perez, the analyst of great technological surges, replied:

@MazzucatoM Homelessness, precarious zero hours and gig economy contracts plus the many that have dropped out of the workforce make a mockery of the current celebrations of full employment in the US and UK. We had better start looking at real reality in the face

Their vignettes made a deft counterpoint to a piece the same week by Sarah O’Connor, the FT’s first employment correspondent for decades, entitled ‘Workers have right to gig economy that delivers for 21st century’. It carried the self-explanatory subhead, ‘Flexible working is touted as the future but too often resembles an exploitative past’.

It is worth reading, and sits easily (or make that ‘profoundly uneasily’) beside recent books like James Bloodworth’s Hired: Six Months Undercover in Low-Wage Britain and Jeff Pfeffer’s angry Dying for a Paycheck. Besides the daily tribulations of those trying to make a living in the new precarious economy, she made the wider point that for many of them the tech-enabled ‘flexible labour market of the future’ resembled nothing so much as the bad old piecework system of the past.

Quite starkly, the labour market is dividing in two. In future a minority will enjoy high-paid, full-time jobs, while the lot of a growing segment of the working population will be self-employment, part-time and gig work, with little security and precious few prospects. Worryingly, as O’Connor wrote,

‘The UK’s low-paid sectors are 20 to 57 per cent less productive on average than the same sectors in Belgium, France, Germany and the Netherlands. Take the low-productivity activities that have recently been “reshored” to the UK, from manufacturing screws to stitching £7 dresses. A developed economy that finds itself sliding back down the global value chain is an economy where something is going awry’.

What’s going awry shouldn’t come as a surprise. In 2012 Guardian economics editor Larry Elliott and Dan Atkinson summed up their view of the situation in a book called Going South: Why Britain Will Have a Third-World Economy by 2014. Widely brushed off as an exaggeration at the time, the book may have got the date wrong, but otherwise the jeremiad looks depressingly accurate. One of the points the authors made was that, given the option, the UK economy unerringly took what looked like the easy way out in the short term, only to find that in the long term it simply reinforced the narrative of decline.

Not one but two examples of this lethal addiction figured in the press last week. The first was by Elliott himself, who brought his 2012 analysis up to date by noting that it took a rather special interpretation of success to qualify UK labour-market flexibility as such. He pointed out that in the much less flexible market of 1975, at least work paid – unlike today, when it is anything but a reliable route out of penury: as witness the fact that two-thirds of the UK poor live in households in which at least one person works. True to form, employers have seized on the negative, cost-cutting possibilities of ultra-flexibility with relish, accentuating the downward slide that Elliott-Atkinson identified in 2012. Britain, Elliott wrote last week, ‘now appeared to be permanently locked into a low-wage, low-skill, low-productivity economy, in which workers compensate for a lack of earnings power by taking on more debt’.

Meanwhile, back in the FT, economics editor Chris Giles was mercilessly rehearsing the sorry story of British devaluation, the nation’s fall-back macro-economic easy way out. In 1948, he recounted, one pound bought more than $4 and DM13.5. That compares with today’s $1.30 and the equivalent of DM2.2. Over the 70-year period, only Canada has grown more slowly than the UK among the G7, and we have underperformed the eurozone since it was formed in 1999.

Cheaper sterling is ‘no route to prosperity’, concluded Giles, qualifying the most recent in the series, the 20 per cent depreciation of sterling since late 2015, as particularly disappointing. There has been no boost to exports, and no import substitution. Confirming the week’s low-road consensus, he noted that ‘a mini revival in manufacturing employment is overwhelmingly in making simple, low-productivity products such as food or metal goods such as radiators, cutlery and screws’; all in all, ‘the latest depreciation is challenging to be the worst in British history’.

That was all in one week. And we haven’t even mentioned Brexit. It was left to O’Connor among our doom correspondents to attempt to pull something positive out of the wreckage. The ‘good news’, she opined bravely, was that ‘the best possible time to reform a labour market is when unemployment is low and bargaining power is naturally on the rise’. With the labour market at its tightest since the 1970s, surely now was the time to drag it into its high-pay, high-productivity future. ‘Brexit uncertainties are not a reason to drift and dither, but an impetus to act’.

Mmmm. Well. She’s right in theory, of course. But in a week which has picturesquely reminded us that we seem unable to implement a railway timetable, let alone a political one, all one can really say is: good luck with that.