Read my column in FT Business Education, ‘Why it’s time we rethink the nature of work’, here
Will the digital revolution make us all poorer?
My article in Professional Manager, 20 August 2015, ‘Will the digital revolution make us all poorer?’, is available here
The unstoppable rise of the robots
The French newspaper Libération devoted four pages of a recent issue to an investigation of the idea of a minimum basic income for every adult. Among the surprises for an Anglo-Saxon reader, for whose politicians such a notion is so far out of mind as to be science fiction, is the discovery that in other countries it is very much on the map. The left-wing Spanish party Podemos has it in its manifesto. The Finns are considering it, and the Swiss will vote on it in a referendum later this year. The practical Dutch are testing it in a pilot in Utrecht.
If Martin Ford is right, such initiatives, far from being a rush of Corbynitis to the head, are necessary and urgent. Rather than signalling the demise of capitalism, he suggests in his impressively researched and soberly argued book, The Rise of the Robots, a basic minimum income may turn out to be the only way to save it.
Ford is a successful Silicon Valley software entrepreneur, which at first glance makes him an unlikely technological doomsayer. But unlike gung-ho contemporaries such as Ray Kurzweil who enthuse about the coming ‘singularity’, the moment when machine intelligence surpasses the human kind, and the achievement of immortality (sic), Ford concentrates on what is happening in the here and now to make a powerful case that this time it really is different; meaning that the threat in his subtitle – ‘technology and the threat of a jobless future’ – is very likely to come true.
The techno-optimists’ cheerful view of the economic consequences of technological advance is based on faith and precedent. They point out that since the Industrial Revolution first disrupted craft-working in the 18th century, each succeeding wave of progress has produced new markets and industries that generate more and better-paid jobs than they destroy. This is Schumpeter’s ‘creative destruction’ in action, a virtuous circle in which technological innovation drives higher wages and increasing demand that floats all boats. Better education will keep human capability ahead of the machines; economic growth will provide a steady flow of new jobs in partly-mechanised sectors and others that spring up to serve them.
But although many economists (and almost all politicians) are still parroting the old mantras about a return to growth, Ford notes that the positive relationships had started breaking down long before the financial crash of 2008, and substantially pre-dating the current round of techno-acceleration. (Ford writes mainly about the US, but in economics as in other spheres the UK can be relied on to imitate the transatlantic experience with a short time lag, if in slightly less extreme form.)
Thus Ford lists ‘seven deadly trends’ that have been ticking away behind the economists’ comfortable assumptions like a colony of deathwatch beetles: average wages stagnant since the 1980s; a shrinking labour share of GDP; declining labour-force participation (particularly among less qualified men); jobless recoveries (it took until mid-2014 for US employment to regain pre-crash totals, by which time the working population had increased by 15 million – in fact the US economy has put on no net new jobs this century); soaring inequality; a diminishing premium and growing underemployment for graduates; and the polarisation of the jobs market between well-paid full-time employment for the very few and part-time and freelance for the many – ‘uberisation’, let’s say.
Now take this malign dynamic and supercharge it with the most powerful general-purpose technology (ie one whose effects will leave no industry untouched) ever devised – one, moreover, whose advance is accelerating with the undiminished momentum of Moore’s Law. Machine intelligence is improving by leaps and bounds. IBM’s machines have beaten human champions not only at chess, a bounded problem (Deep Blue against Garry Kasparov), but also at the game show Jeopardy!, a cryptic, unbounded one (the Watson supercomputer). Watson is now being deployed commercially. While Artificial Intelligence (AI) is still the ‘narrow’ variety, the ‘strong’ version, or Artificial General Intelligence (AGI), is now being vigorously pursued not just in research labs, as in the past, but competitively by ambitious, well-resourced giants such as Google, Facebook and Apple, pushed by powerful commercial incentives to make it work. Another AI, Artificial Intuition, is in the works.
A recent Forbes contribution (entitled none too subtly ‘Deep Learning And Machine Intelligence Will Eat The World’) could not have put it more clearly: ‘The effects of this technology will change the economics of virtually every industry. And although the market value of machine learning and data science talent is climbing rapidly, the value of most human labor will precipitously fall.’ Publications (reputedly including Forbes) already employ software to write news stories and reports; in a few years’ time 90 per cent will be machine-generated, by one estimate. In many other industries, automation, says Ford, is simply ‘the logical next step’. Beware, he warns: if you’re working with computer software, you’re probably training it to replace you.
In a much-quoted 2013 report, the Oxford Martin School suggested that 47 per cent of US jobs would be susceptible to computerisation in the next two decades. Later estimates raise that to 80 per cent. Given that the essence of computerisation is enabling more to be done with less, is it conceivable that new industries based on it will be labour intensive? Looking at early evidence from those avatars of the new economy – YouTube (valued at $1.65bn when it was acquired, with a workforce of 65), Instagram ($1bn and 13) and WhatsApp (a staggering $19bn and 55) – the answer seems pretty clear. Uber and Airbnb just underline the point: for the first time new technology is not only creating fewer jobs than it consumes; it is also creating worse ones. The circumstantial evidence keeps flooding in. Thirty per cent of US science and technology graduates are currently labouring in jobs that don’t need degrees; most UK graduates are in non-graduate jobs. ‘The assumption that we will transition to a more productive … economy just by increasing the conveyor belt of graduates [the method used in the past] is proven to be flawed,’ says the Chartered Institute of Personnel and Development (CIPD).
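The value-per-employee arithmetic behind those three acquisitions is worth spelling out. A quick back-of-the-envelope calculation (using only the figures quoted above) shows just how little labour the new economy’s flagship businesses employ per unit of value:

```python
# Back-of-the-envelope: acquisition value per employee for the
# three companies cited above (figures as quoted in the text).
companies = {
    "YouTube":   (1.65e9, 65),  # $1.65bn, workforce of 65 at acquisition
    "Instagram": (1e9, 13),     # $1bn, 13 staff
    "WhatsApp":  (19e9, 55),    # $19bn, 55 staff
}

for name, (value, staff) in companies.items():
    per_head = value / staff
    print(f"{name}: ${per_head / 1e6:.0f}m per employee")
```

The result ranges from roughly $25m per employee (YouTube) to around $345m (WhatsApp) – orders of magnitude beyond anything an industrial-era employer could manage.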
So far, so not very good at all. But if the scenario follows Ford’s trajectory, the effects will be massively self-reinforcing through the resulting demand deflation. Ford quotes the following exchange between Henry Ford II and union boss Walter Reuther on a factory visit: ‘Walter, how are you going to get those robots to pay your union dues?’ ‘Henry, how are you going to get them to buy your cars?’
At the extreme, radical inequality is unsustainable in the most basic sense: by hollowing out the middle classes, the winner-takes-all economy becomes a contradiction in terms. There won’t be anything to take, because the very rich simply don’t consume enough to keep the wheels turning. The 2008 crash – partly the result of average consumers continuing to consume, but on debt rather than cash – should be a warning here. As Ford relates, companies are already abandoning the middle market to chase the 1 per cent of super-spenders; despite increasingly frantic advertising, the average US car is now 12 years old, a forerunner of the tangible consequences of rising inequality. In this context, the gathering cloud of graduate debt overhanging the US and UK looks increasingly ominous not only for individuals with increasingly uncertain earning prospects but also for the economy as a whole, while the policies that created it are revealed as the monstrous false economy they seemed to many at the time.
This is the background against which Ford sets out his proposal for a universal minimum basic income. Ironically, the original progenitor was the English radical Tom Paine, who advocated it as a blow for social justice against the cruelty of emerging capitalism. By contrast, for Ford it is primarily a prop to keep that system going. As he concedes, the idea is controversial, with weighty considerations on both sides. Yet the need for solutions may be even more pressing than he thinks it is. Ford didn’t foresee another emerging result of global inequalities, the rising tide of immigration. And like almost every other commentator, whether optimist or pessimist, Ford has internalised, and thus leaves out of his reckoning, the most deadly trend of all: the pernicious incentives at the heart of today’s shareholder capitalism.
This is the first great wave of technological evolution whose justification is not that it benefits humanity but that it benefits shareholders. The lesson of the last 30 years is that investments driven by self-interest and shareholder value – where the benefits are supposed to accrue to one group in society – do not produce a generalised increase in wellbeing, because they are designed not to. As Jeff Pfeffer succinctly puts it, ‘Economic performance and costs trump employee [and societal] wellbeing’. Under today’s incentives, investments in accelerating technology will just destroy more of it. To be clear about this, consider a quote from the founder of a start-up planning to automate the production of customised gourmet hamburgers: ‘Our device isn’t meant to make employees more efficient. It’s meant to completely obviate them.’ Or this from another start-up entrepreneur warning that executive jobs too are in the firing line: ‘It will not be possible to hide in the C-Suite for much longer. The same cost/benefit analyses performed by shareholders against line workers and office managers will soon be applied to executives and their generous salaries’. Oh yes, the name of this start-up: iCEO.
At this point, as Ford logically if bleakly notes at the end of his book, demand may fall so far that even further automation ceases to pay. It might. But it would surely be unwise to wait until then to find out.
****I’m away at the moment: next piece beginning of September****
Productivity is only half the story
And another thing…
The pother over productivity has reached such a pitch that I make no apology for returning to it.
In one of the latest contributions, the newly-Japanese-owned FT recently carried an article suggesting that the solution for the UK’s low-productivity problem was to cancel August – in other words, everyone should take less holiday. I think it was meant seriously.
It brought irresistibly to mind W. E. Deming’s acerbic, ‘Having lost sight of our objective, we redoubled our efforts’, instantly putting a finger on what’s missing from the productivity debate: it’s all very well working harder, but to what end?
‘Productivity’ is about ‘efficiency’, and there is no argument that UK productivity/efficiency is historically poor, lagging that of many of our national competitors. But efficiency is a crude proxy for a more important and more profound measure: effectiveness. Like GDP, productivity is a measure of activity in general, irrespective of whether it is useful and beneficial for society or not. It measures outputs against inputs and is about means – doing things right. Effectiveness measures outputs against objectives and is about ends, or doing the right thing. Unless it’s related to the right thing, productivity is of secondary importance. As Peter Drucker decisively summed it up, ‘There is nothing so useless as doing efficiently that which should not be done at all.’
The economic story told by effectiveness differs in important ways from the one derived from the productivity figures. For example, the official narrative puts much emphasis on improving productivity through indirect supply-side measures such as investment in infrastructure and education, and hiving off as much activity as possible to the supposedly more efficient private sector.
Of course, functioning infrastructure, including education, is essential. But it has nothing to do with organisational effectiveness, which is not a private-vs-public-sector issue: it is one of system design. All over the economy people are being beaten up to do more efficiently stuff which shouldn’t be done at all – either because they are attempting to do the wrong thing righter, which, as systems guru Russell Ackoff points out, just makes them wronger, or because they are redoing something that wasn’t done or wasn’t done properly the first time round: ‘failure demand’, in John Seddon’s term.
The dirty secret is that the NHS and many public services are stuffed full of failure demand – in some cases 60 to 80 per cent of contacts are repeat calls from folk who haven’t had their problem solved the first time. But so too are the customer-service departments of banks, phone companies and other utilities which measure their activities in terms of efficiency (x number of calls per hour, x number of rings to answer the phone) rather than effectiveness (the overall time it takes to fix a customer’s problem). In other words, measured against their purpose, whatever the productivity figures say, they are hopelessly ineffective.
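The gap between the two measures can be made concrete with a toy calculation (all numbers invented for illustration). A contact centre handling a high proportion of failure demand can score well on the standard efficiency metric, calls handled per hour, while an effectiveness metric, the share of contacts that actually resolve a fresh problem, tells the opposite story:

```python
# Toy illustration (invented numbers): an efficiency metric
# (calls handled per hour) masks failure demand, while an
# effectiveness metric (share of contacts resolving a new problem)
# exposes it.
calls_handled = 1000   # total contacts in the period
repeat_rate = 0.70     # 70% are repeat calls about unsolved problems
hours_worked = 100

# Genuinely new demand: contacts that aren't failure demand
distinct_problems = calls_handled * (1 - repeat_rate)

efficiency = calls_handled / hours_worked          # 10 calls/hour: looks fine
effectiveness = distinct_problems / calls_handled  # only 30% of activity adds value

print(f"Efficiency:    {efficiency:.0f} calls per hour")
print(f"Effectiveness: {effectiveness:.0%} of contacts address a new problem")
```

On these (made-up) figures, the same operation is simultaneously ‘productive’ and hopelessly ineffective: seven in ten calls exist only because the work wasn’t done right the first time.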
At a stroke, this nullifies a second part of the official narrative: in shorthand, the death spiral of public services. The conventional story is that services are unsustainable, groaning under ever-increasing demand and expectation, necessitating constant cost-cutting to make them more efficient. But this is a travesty of the truth. What they are groaning under is the weight of citizens failing to get routine problems fixed by a system that is designed against cost, not to meet predictable demand.
Nor is it true that there is an insurmountable resource crisis; or rather, there is, but it is caused by a work design in which jobs are so tenuously connected with needs that they mostly make things worse rather than better. That includes anyone working to internal service agreements or standards, having to meet numerical or time targets or quotas, managing demand (rationing) rather than meeting it, working in back offices, shared-service and most contact centres; central specifiers, commissioners, inspectors and regulators; and anyone managing the same, basically policing other people and administering the rules. That’s a lot of people.
The bad news is that all these are what anthropologist David Graeber has termed ‘bullshit jobs’, make-work employment offering no meaning or pride (no wonder levels of engagement are so dismal). The good news is that any improvement (doing the right thing, however imperfectly at first) brings a double benefit, reducing wasted work, and therefore cost, on one side, while freeing up capacity to do more on the other. Climbing morale as people are reconnected with a meaningful purpose adds a third whammy. So yes, there is a resource crisis – but it’s a crisis of management effectiveness, not individual productivity.
I recently came across an intriguing concept called ‘Eroom’s Law’ – Moore’s Law backwards, if you haven’t got there – which applies to processes that, unlike computer processors, get slower and more difficult over time. It was first applied to the seemingly inexorable slow-down in new drug discovery, but it also usefully illustrates what’s happening to management as it proliferates and slows under its own friction (I wrote a bit about how it happens previously here). Unless we can prise management out of the grip of Eroom’s Law, any increase in overall productivity will be more than eaten up by the rising tide of bullshit and a corresponding decrease in effectiveness.
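Eroom’s Law is easiest to grasp as Moore’s Law run in reverse: where one quantity doubles each period, its mirror image halves. A minimal sketch (arbitrary units and starting values, purely for illustration) makes the symmetry plain:

```python
# Minimal sketch: Moore's Law vs 'Eroom's Law' (Moore backwards).
# One quantity doubles each period; its mirror image halves.
# Starting values and units are arbitrary, for illustration only.

def moore(start, periods):
    """Capacity doubling each period (e.g. transistors per chip)."""
    return start * 2 ** periods

def eroom(start, periods):
    """Output per unit of effort halving each period (as observed
    in new drugs approved per billion dollars of R&D spending)."""
    return start / 2 ** periods

for t in range(5):
    print(t, moore(1, t), eroom(64, t))
```

The point of the mirror image is the compounding: after a handful of periods the halving process has lost most of its output per unit of effort, which is exactly the pattern drug discovery has shown and, the argument runs, management overhead now mimics.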
Being more customer-led – do you listen to research or to your intuition?
To be blunt: how can you be customer-led if customers don’t know or can’t tell you what they want themselves? From the company angle the Holy Grail is a market game-changer: the automobile to replace the horse, say, or an iPhone, or an iTunes or Spotify. But if as a customer you’ve been thinking in terms of more rapid equine transport, a telephone that’s better for talking on, or a larger collection of CDs, what do you make of a machine with four wheels and a clattering engine, a supercomputer with a touch-screen, or a virtual jukebox hosted in something called ‘the cloud’? Round the other way, how come, despite decades of experience, pollsters still find surprises and contradictions in people’s actual behaviour, even when polled the day before an election about how they are going to vote?
The answer that came back from an intriguing Foundation Forum on 10 June was that human reactions remain more or less as infuriatingly difficult to call as ever. We have a lot more data about why we react as we do. But while that helps us understand the degree of our unpredictability, knowing more accurately the scale of the problem doesn’t make it easier to get business decisions right. To paraphrase Lord Leverhulme, we know that half our knowledge about the way people will jump in any situation is wrong: we just don’t know which half.
The challenge for the research industry therefore remains as great as ever. As Ben Page, chief executive of pollster Ipsos MORI, noted, while some of the techniques are remarkably unchanged – companies still knock on doors, do telephone interviews, and send out postal surveys – they have deepened. ‘If you’d told me in 1987 that I’d employ 100 filmmakers in 2015, I’d have wondered why I’d ever do that – or that I’d employ 20 economists’. That reflects more sophisticated attempts to tease meaning out of the information gathered, with a discernible shift of emphasis from asking questions to watching and observing what people do. Today’s key trends, says Page, are speed (clients want reports in hours, not weeks), mobile (location recording, instant selfies at the breakfast table) – and the dawning realisation that since subjects aren’t completely rational, it’s not enough just to record what people think they do.
That means that no one research tool is adequate on its own. ‘Instead what we’re seeing is layering of these different techniques, so clients will be looking at a whole range of different data sources. And as we better understand these things using all the techniques at our disposal we’re getting a much better and richer understanding of human behaviour’, he said. So it’s not a question of either intuition or research, but both/and, and a lot of other things besides. Indeed, intuition remains a powerful force – ‘an Ipsos MORI chief executive wouldn’t be advising anyone to scrap research and just listen to instinct, but it’s amazing how many clients pay millions of pounds to evaluate an ad campaign and then cheerfully ignore the data and go with their gut feel.’
Marc Michaels, the second panellist, agreed that to understand how people work ‘you need research, you need data and you need psychology’. And as someone initially recruited to set up a government direct-marketing unit and then more generally to work on changing behaviour – persuading people to eat more sensibly, give blood, join the army: a tough brief – he is clear that in making research and data actionable, the new findings of psychology and behavioural economics are critical. It’s not that we lie, he says; but as Daniel Kahneman demonstrated in Thinking, Fast and Slow, we each run two mental systems: System 1, a ‘Homer Simpson’ mode for instinctive, holistic, instant decision-making, and System 2, a ‘Spock’ mode for more analytical, deliberative, demanding thought. The cognitive biases that undermine strict ‘rationality’ also create opportunities to ‘nudge’ people towards certain choices or behaviours, by fitting the way they are engaged to their innate predisposition to respond in one direction or another.
Thus there’s a general human tendency to fear losses more than to value gains (‘a bird in the hand is actually worth two point five, sometimes even three, in the bush’). The slacker, Homer Simpson brain will try to get away with answering an easier question than the one asked. Big, complicated issues are often avoided, so ‘chunking’ them down is likely to win a more positive response. As Stanley Milgram’s famous 1960s experiments showed, people respect and obey authority, sometimes to a frightening degree. So, in one celebrated example, the Department of Health and COI recruited Anne Diamond, a respected and well-loved newsreader who had suffered a cot death, to counter the grandmother-sanctioned traditional wisdom of putting infants to sleep on their fronts and persuade young mothers to sleep them instead on their back or side. The result: a reduction in cot deaths of 70 per cent. But doing it needed authority to fight authority.
In a sense, research has gone full circle, observed Clive Humby, co-founder of data-led research business dunnhumby and now Starcount, which uses social media and ‘fan science’ to craft brand influence strategies. ‘Really, what we’re really talking about with data is understanding customers through the things that we observe. We’ve heard about watching them through videos, asking them questions, and obviously looking at what people physically do close-up and the transactions they’ve made using information. It’s gone through a complete revolution’.
Humby is credited with the line that data is the new oil, and he drew two important comparisons. First, the gusher in its raw state has little value, only becoming usable when it is processed into something else. Data is the same: ‘Data is everywhere. I’ve got 147 devices in my house that have their own IP address – smart TVs, a lighting system, computers, phones, all those items are generating data about you all the time. The data on ourselves generated in the last day exceeds all the data generated in a year 12 months ago. So the real challenge isn’t in collecting data any more, there’s far too much of it – it’s making it useful’. So it’s the algorithm guys, the pattern-identifiers, who are the stars – the car designers building on the potential of oil.
Yet solving one problem often just reveals another. Made usable, into petrol for instance, oil becomes volatile. So too with data, which turns not just volatile but nuclear when it collides with privacy. Benign nudging and behaviour-recording with consent are one thing; but what about using your shopping list as a basis for insurance premiums? Or a real example from Tesco: ‘One of the most important correlations we found in terms of data we could have commercially exploited, was the one between £4.99 Chardonnay and condoms. But we never acted on it. The reality is that just because you know, doesn’t mean you should. And that is the dilemma we’re all facing’: Is it cool or is it creepy?
The dilemma can only intensify with the rapidly emerging internet of things, in ways that have barely yet been registered. Suddenly the issue is no longer the quantity of personal information being given away to a faceless corporate. ‘We think about privacy in terms of our big corporate systems and frontline operators who talk to customers’, Humby pointed out. But actually the data is available to developers, app people and potentially everyone in the organisation who has access to it. The people who repair your car know everything about where it has been, how fast it was driven, how long it was parked. How easy would it be for someone to get this and use it for something unforeseen and with bad intent? ‘Once that happens, everyone becomes a possible liability. And we have to really worry about that as leaders in organisations’.
The paradox of research, as with most things human, is that the more we know, the more complicated it gets – and a simple scientific synthesis seems as far off as ever. At least for the time being, it’s humans that rule, not algorithms. Noted Page: ‘It’s taken some time, but I think the industry is getting there. One of the things that’s holding us back is that sometimes we’re conservative, and our clients are just as conservative because they’ve been tracking data the same way for 30 years, and it’s consistent and tells them things’. His advice is: relax a bit, be creative, and remember the lesson of Alex Salmond – who on the basis of extremely expensive US analysis through social media knew for certain that the Scots would vote for independence, never mind the opinion polls.
‘People aren’t rational,’ summed up Michaels. ‘When they tell you they want to do something, you may see from the data side that they’re doing something, but you’ve got to think about what is going on there. You can think data, but you need to talk human’.
The Foundation’s thoughts
Four of the most significant points which emerged for us were as follows:
• Collaborative businesses are succeeding because they bring together at least three useful characteristics that reinforce each other. Any new and better business model tends to do this, creating a virtuous circle different enough from the incumbents’ for it to be impossible to copy with a simple adjustment.
• Reaching a good understanding of what people think, feel and, crucially, do can take all of the approaches described above. On the evening we talked about triangulation: using market research to understand the landscape, then conducting deeper exploration in the areas of interest. This might draw on more extensive real-world data, its vastness made useful by developing and testing hypotheses grounded in a human understanding that respects the instinctive ways we often act. Another way to describe the process is detective work: an overall conundrum to be solved, with lines of enquiry established around the possibilities, each explored creatively (what could be going on here?) and then challenged, eliminating as much as possible from the enquiries using data, further specific research or experiments.
• As Clive reminded us, there is a rear-view-mirror issue with data. It can tell us what’s happened, and used well it can give us insight into why. But it can’t predict the future. Which might make some of the big-data investment going on right now look a bit optimistic.
• The vastness of the data we each generate creates real ethical issues that aren’t currently being addressed. It is much easier than we realise to share information on everywhere we’ve been, everyone we’ve spoken to and a fair bit of what’s been exchanged with all sorts of organisations and individuals that we might be wary of if we sat down and thought about it. As we heard, modern cars contain information on where they have been and how they were driven, all easily accessed by your local car dealer, the police or your insurance company… or anyone who knew how to hack and steal it. We often allow apps to get this kind of information from our mobile phones, because we click ‘allow’ and because the Apple Ts&Cs are, in Clive’s words, longer than Shakespeare’s The Tempest.
• Our human intuition isn’t just at the end of the telescope trained on customers. The users of insight have just the same biases, from the more entertaining ‘I don’t care what the research says, we’re running the ad’, to the more important problems we find with inconvenient market research findings getting short shrift from a leadership team trapped in a world they see from the inside of their business looking out. It can be useful to see the conclusions from insight work as the start of another challenge, giving it the impact it needs to get the organisational response it requires. For example, getting leaders speaking to customers themselves so they create their own stories and beliefs in line with the bigger picture.
George Osborne’s productivity delusion
The air of unreality that made the election so weird has only deepened with George Osborne’s ‘big budget’ last week. It’s a novelty to find The Economist, Guardian and Financial Times in unison on anything much, but all three judged that the budget’s political astuteness was only matched by its economic irrelevance. The Economist was particularly severe on ‘indefensible’ cuts to benefits for the lowest paid, ‘barmy’ inheritance-tax reductions on houses, and the ‘outrageous favouritism’ of the welfare cuts.
The Chancellor inhabits a Humpty Dumpty world in which words mean what he chooses them to mean, not what anyone else understands. So in a budget that will do the opposite of what he claims, it is perhaps no surprise to find a ‘productivity review’ that claims to ‘set the agenda for the whole of government over the parliament to reverse the UK’s long-term productivity problem and secure rising living standards and a better quality of life for all our citizens’. In fact it is just a mash-up of what was in the budget – which itself, apart from a levy to fund apprenticeships, offered nothing either new or remotely relevant to the real productivity issues.
No hint here that productivity is a problem which the UK has been failing to fix using exactly the same tired and half-hearted supply-side means for more than half a century (I was writing about failure to electrify the railways in the 1980s); no hint that we are moving from the old economy, which was already tough enough, into a qualitatively different new technological era where the challenge may be to create any jobs at all; no hint that with companies already bulging with cash and labour’s share of the economy shrinking by the minute, there are no strings left for the government to pull to tickle entrepreneurs’ and managers’ jaded animal spirits.
For anyone with eyes to read, the pages of Harvard Business Review, not a publication of the hard left, have been sounding the alarm for the past two or three years: in the US and UK capitalists have given up on the virtuous circle of reinvestment and innovation that kept wages rising and economies moving forward since the Second World War. So any benefit of lowering corporation tax to 18 per cent will simply disappear in bigger payouts to shareholders in the shape of dividends and share buy-backs, thank you very much, with at best some no-risk investment in cost, and job, cutting. This is the new normal, and it’s to do with relationships and incentives between the firm and its stakeholders – its corporate governance – not the state of the infrastructure, education or housing, for goodness sake (memo to George: if builders haven’t already built on brownfield sites it’s because they’re too expensive – the land is contaminated or low lying – and people don’t want to live there).
In these circumstances, the otherwise welcome announcement that the government has recruited John Lewis chairman Charlie Mayfield to lead a taskforce developing ideas for raising business productivity is unlikely to lead very far. As it happens, Mayfield put his thumb right on the sore point that is the UK productivity record in a recent interview on the BBC’s Today programme. Asked about his role at John Lewis, he replied: ‘I work for the partners in the Partnership. My job is to invest in them, help them to work as well as they can, and if we do that, we’ll succeed as a business…. They hold me to account for that.’ Well, yes. It’s not rocket science. Giving people a job with a purpose, the means to improve, and a pay packet that takes wages off the agenda is what sustains the engagement that feeds the high-productivity workplace. All the rest is secondary.
The catch is that most companies don't have what goes with it at John Lewis – in particular the committed long-term governance that aligns bosses with the workforce that actually creates value, not with shareholders. In his other capacity as chairman of the UK Commission on Employment and Skills, Mayfield recently illustrated the extent of the management switch that the country needs to make to plug the productivity gap by drawing attention to OECD research showing that an astonishing 22 per cent of UK jobs require only the educational level of an 11-year-old, a proportion exceeded solely in Spain among our competitors. By contrast, Germany has just 5 per cent of jobs that are as undemanding as this, and the US 10 per cent.
Underlining the point, Will Hutton notes that lackadaisical governance and financialisation have turned the UK into a sub-contract economy with 'a string of technology-light, productivity-poor small companies', a yawning trade deficit, a hollowed-out industrial base, and a record of unending decline in its share of world exports. The erstwhile workshop of the world's current champion industrial sector? Food processing.
Reversing the productivity spiral ideally wouldn't start from here. It is difficult, though, not impossible. What it does require, as Mariana Mazzucato has eloquently laid out, is a new, richer and more optimistic narrative of innovation and wealth creation that emphasises the importance of patient, committed capital and recognises that productive capitalism 'is one in which business, the state, and the working population work together to create wealth', not appropriate it. This is the opposite of the risible, infantile obsession with 'business friendliness', and indeed of almost everything in Osborne's budget. Don't hold your breath.
The nature of technology
What is technology and how does it develop? Remarkably for something so dominant in our lives, until recently no one had much systematic idea. With a few exceptions economists treat it as a black box. There’s masses of work on technicalities, but technology’s nature, its relationship with innovation and economics and how it evolves, have largely gone by default.
Enter in 2009 W. Brian Arthur and a remarkable book called The Nature of Technology. Arthur is well known for his groundbreaking work on economics and complexity at the multidisciplinary Santa Fe Institute in New Mexico, and he draws on both for his project, the formulation of an overarching theory of technology and technological development.
In (very) short, Arthur concludes that technology – roughly defined as natural phenomena repurposed for human ends – is not a collection of arbitrary standalone techniques and inventions, as previously viewed. Instead, it is something much more like chemistry or biology than Newtonian physics, sharing with life its ‘connectedness, its adaptiveness, its tendency to evolve, its organic quality. Its messy vitality.’
Technology, says Arthur, ‘builds itself organically from itself’ as individual developments feed on each other and cumulate. In a process of ‘combinatorial evolution’, proliferating possible technology combinations become almost infinite. Advance is non-linear, so that abrupt discontinuities can occur as tipping points arrive in record time.
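As a back-of-envelope illustration of why 'combinatorial evolution' explodes so fast (my gloss on the idea, not a calculation Arthur himself makes): if every grouping of two or more existing building blocks is a candidate new combination, then n technologies yield 2^n − n − 1 candidates, a number that grows exponentially.

```python
# Rough sketch of combinatorial explosion: counts the subsets of
# size >= 2 that can be formed from n existing technologies.
def candidate_combinations(n):
    """Number of subsets of size >= 2 from n building blocks: 2^n - n - 1."""
    return 2**n - n - 1

for n in (10, 25, 50):
    print(n, candidate_combinations(n))
# With just 50 building blocks the count already exceeds a quadrillion -
# 'almost infinite' for any practical purpose.
```

The point of the toy sum is only that the space of possible combinations quickly dwarfs what any innovator could explore, which is why advance is non-linear and tipping points arrive unannounced.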
Although as qualified as anyone, Arthur doesn’t do technology prediction – he doesn’t even mention Moore’s Law, perhaps today’s most powerful innovation and technology amplifier. But it’s easy to see the process he describes in today’s digital transformation, the startling speed and unpredictability of technological evolution strikingly borne out by events.
For example, when he was writing just six years ago the idea of drones on sale on the High Street would have been strictly science fiction. Even three years ago a self-driving car was assumed to be decades away. In robotics, too, new devices suddenly sit 'at the nexus of visual perception, spatial computation, and dexterity [reaching] the final frontier of machine automation', as Martin Ford puts it in his (also impressive) The Rise of the Robots. Ford notes that a San Francisco startup has devised a robot that aims to automate the production of custom-made gourmet hamburgers – not, that is, as in 'make burger flippers more efficient', but as in 'obviate the need for them altogether'.
As Arthur sees it, as technologies, collections of technologies and sub-technologies interact with each other in ‘messy vitality’, they generate a teeming ‘supply’ of possible new technology combinations, all available to be used and built on by innovators. But although they emerge from their own history, the form technologies actually take is by no means inevitable, being inflected by human agency and historical small events.
One such human agency is management – itself, as a 'purposeful system', a technology in the broadest sense. Although it is rarely considered by commentators, and Arthur doesn't go into it in depth, management is obviously a crucial influence on demand for technologies and how they are used. When writers such as Ford (himself incidentally a successful Silicon Valley entrepreneur) raise worries about 'technology and the threat of a jobless future', to borrow the subtitle of his book, the response of techno-optimists is invariably something like – 'Trust us. The great technological leaps of the past have always created more jobs than they have destroyed. Invest in the supply side (education, infrastructure, easing the transitions), have faith, and all will be well'.
In the light of Arthur’s theory and Ford’s practice, the rejoinder has to be, ‘What about the demand side? Look at the context: the technological combinations managers and businesses have used, and more particularly what they have chosen to use them for.’
As Ford points out, while production technologies helped raise productivity by 107 per cent between 1973 and 2013, in the latter year a typical US production worker took home 13 per cent less in real terms than 40 years before. In the decade to 2010, the US economy created no net new jobs. Inequality soared as productivity gains were monopolised by shareholders (including and especially top managers). In 2010, the US computer industry employed 166,000 fewer people than in 1975. Meanwhile the 'sharing economy' shreds jobs and spits them out as micro-employment, and more generally the internet economy is based on a business model of surveillance which turns consumers into products and only incidentally (and then mostly unpaid) producers.
To emphasize, none of these developments was inevitable. The same digital technology crossed with a different management technology would have produced different outcomes: it’s not hard to imagine peer-to-peer platforms devoted to medical or social ends, for example, or an internet which put individuals in charge of their own data and reversed the current relationship between consumers and companies.
All this suggests that it would be unwise to bank on historical precedent providing a reliable guide to our economic evolution from here on in (that’s what discontinuity means). Arthur ends his book by noting the increasing ambivalence with which humanity views its miraculous technological creation. On one hand it is undeniably a blessing serving our lives; yet on the other there is growing unease at the way it has estranged us from nature, now endangering the future of the planet, and a dawning fear that the apprentice’s magic is outstripping that of the erstwhile sorcerer.
Seeing the manifestations of Arthur's 'combinatorial innovation' – the internet of things, learning machines, automation, rapidly progressing Artificial Intelligence – emerging around us, all turbocharged by Moore's Law, it seems probable that in terms of sheer processing power the race against the machine is already being lost. In which case we'd better sort ourselves out as humans and decide what, and whom, this awesome thing is to be used for – before it does it for itself.
Interfering politicians and a dysfunctional market – how we got the worst of both worlds
It’s not where we wanted to be. Somehow we have ended up with a weird mutant capitalism that cumulates the worst of both worlds: on the one hand a predatory and amoral market (motto: ‘If you ain’t cheating, you ain’t trying’, as a Barclays vice-president pithily summed it up) which systematically generates crashes and inequality, and on the other an increasingly dictatorial and interfering administrative state that thinks nothing of casually dispossessing housing charities (the new right to buy), micromanaging everything from GP’s diaries to the number of rooms people are allowed to live in, and now, if you please, outlawing future (Keynesian) changes to economic policy – in sum, a nightmare cross between Ayn Rand and Stalin, or, if you prefer a home-grown version, Orwell and Bullingdon.
How did we get here? After all, the whole point of the market was to strip out useless rules and non-value-adding activity. Outsourcing to the private sector was supposed to get politicians out of management. Submitting public services to the discipline of market forces would diminish the purchase and interference (not to mention cost) of the state in favour of individual economic choice – just let the marvel of the market decide.
How wrong can you be. Instead the UK has developed a model that is both state-dominated and market-driven. As John Kay has pointed out, a huge and expanding regulatory state, extending across the private as well as the public sector, manages to be both intrusive and ineffective. Meanwhile, David Graeber (The Utopia of Rules) has noted the proliferating bureaucracy (in the double sense of red tape and low-level jobs administering it) of public and private administration, to which the internet, far from reducing it, has simply added another layer. In Graeber's categorisation, these are 'bullshit jobs', adding no value, demoralising user and agent alike, and paying too little to keep those who perform them out of subservience to the state.
Paradoxically, much-derided bureaucracy and much-lauded market are two sides of the same coin. One of the drivers of the dynamic between them is the careless political conflation of ‘the market’ with ‘business’ or ‘companies’. The market is indeed a uniquely powerful mechanism, but like an F1 engine it needs constant care and attention to keep it in balance. Ironically, its most troublesome constituents are companies, which at least in Anglophone countries have been absolved by today’s corporate governance from any duty of care to the markets they claim to live by or the society they are part of.
When the business of business is business, all legal means are fair ones, including those that prevent markets working as they should – tactics such as buying up competitors, predatory pricing, rent extraction, or, less obviously, cutting back on investment in R&D and training to benefit the short-term share price. While these are legal, such a culture easily tips over into real market rigging, as with the banks. There's a weary inevitability about the subsequent process, as Kay describes: 'We have dysfunctional structures that give rise to behaviour that we don't want. We respond to these structures by identifying the undesirable behaviour, and telling people to stop. We find the same problem emerges, in a slightly different guise. So we construct new rules. And so on. And on. And on.'
The insistence on an 'unfettered' market based on self-interest is thus self-defeating, paradoxically driving its own hobbling as retribution for compulsive gaming of a rule-based system. A similar process of remorseless regulatory tightening operates in the public sector, and, as an important forthcoming report by think-tank Respublica will show, in the professions too. In both cases, assumptions of self-interest and producer capture have led to a dispiriting public-private mix of central bureaucratic target-setting with profit-oriented delivery that has reduced the relationship of professional and citizen to one of lowest-common-denominator contractual exchange, disengaging both citizen and service provider and reducing service from a concern with individual lives to bureaucratic box-ticking. Government promises to simplify and reduce the number of targets are comprehensively trumped by what we might call Kay's Law. Thus, for example, a deficiency in NHS care caused by pressure to meet financial targets (as at Mid-Staffs) is countered by a target for compassion, or a too-obvious preoccupation with exam results driven by schools league tables generates the forlorn absurdity of a target for making lessons engaging.
Either way we end up with a horrible combination of cynical low-cost private utility policed by an authoritarian state that has replaced individuals and their needs as sole Soviet-style arbiter of the public good. The focus on performance management, outcomes and accountability saps professional purpose and pride, all too easily shading into the surveillance state. No wonder workforce engagement is so low.
This is the vicious circle that results from a system of rules based on mistrust of human nature and a perceived need to prevent people doing bad things rather than incentivising them to do good. People generally behave according to the expectations their environment generates – it is a self-fulfilling prophecy. To break the cycle, we need to cut off the supply of commercial incentives to do bad things, at the same time relieving the pressure to create ever more rules, and internalise the requirement to behave responsibly. Until we do, work will continue to cut humans off from their better nature, stultifying the ambitions of both public and private sector, and people will continue to wonder why, as Peter Drucker once put it, 'so much of management consists of making it difficult for people to work'.
High pay is a symptom of diseased organisations
It’s high time to get past platitudes and hand-wringing about CEO pay – even if that leads in unexpected directions.
The more you look at the present situation, the more remarkable it is. You wouldn't know it from the election, but current forms of executive pay are a (perhaps the) central economic and social policy issue faced by the UK (and US) economy – key to innovation, productivity, growth and jobs, as well as to the question of what to do about growing wage inequality.
As a recent report by the High Pay Centre (HPC) lays out, performance-related pay, ‘a firmly established practice at nearly every major UK-listed [and US-listed] company’, not only doesn’t but can’t work. But the case against the soaring salaries it produces goes far beyond unfairness and ineffectiveness in its own terms. The unacknowledged reality is that executives are receiving telephone-number salaries for acting as corporate Harold Shipmans, euthanatising the companies in their charge and systematically undermining the economy’s capacity to create full-time jobs and decent wages.
An exaggeration? Then consider this. The publicly-quoted corporation, the engine of capitalism for the last 150 years, is on its way out. Over the past decade and a half, on both sides of the Atlantic the number of publicly-quoted companies has halved. The reason is hidden in plain sight: corporations run for the short-term benefit of shareholders and highly incentivised managers are an evolutionary dead end. They do not invest enough in the future to survive in the long term. They underspend on research, capital equipment and human capital, and overspend – sometimes to the equivalent of 100 per cent of earnings – on dividends and stock buybacks for the sole benefit of shareholders. In short, they are dinosaurs.
Not surprisingly, the retreat of the public corporation has huge economic and social implications. On the one hand, their management-inflicted handicaps leave those that remain 'ill-equipped to provide long-term employment, opportunities for economic advancement, and benefits such as health care and retirement security,' in the words of US academic Gerald Davis. At the level of the whole economy, skewed investment (or non-investment) decisions by managers under the influence of perverse financial incentives are holding back innovation, job creation and growth. Look no further for the causes of dismal productivity and snail-like post-crash recovery – they are now structural, not cyclical, insists City economist Andrew Smithers. US academics such as Clayton Christensen and William Lazonick agree.
Yet here’s the other remarkable thing about runaway executive rewards: the completeness of their failure is only matched by the inability to curb them. In one sense this is not a mystery. Far from being an outrageous aberration, the pay dynamic, amazingly enough, is part of ‘best practice’, baked into governance codes founded on the idea that companies are run for the benefit of shareholders and executives need to be incentivised to do their bidding. Together, shareholder primacy and executive bonuses form the flywheel of short-termism and an instrument of corporate mass extinction.
Keynes once noted that ‘the real difficulty in changing any enterprise lies not in developing new ideas, but in escaping the old ones’. All the assumptions our pay systems are based on are false or unprovable, and 30 years of not making it work is surely trying to tell us something. While differing on the remedies, every constituency the HPC researchers consulted on the subject – the Institute of Directors (credit where it’s due), the TUC, management academics, economists, investing institutions, even many remuneration consultants – agreed that the system was broken, couldn’t be allowed to continue and had to be replaced. But continue it does, and no one has a clue how to end it.
As Upton Sinclair famously put it, ‘It is difficult to get a man to understand something, when his salary depends upon his not understanding it’. So, given the lack of official appetite or proposition for change, what should our response be?
Well, suppose that, rather than wasting more time and effort fighting hopeless odds, we instead accept that the public listed company is a lost cause. Think about it. Is saving dinosaurs likely to succeed? So leave evolution to take its course.
At first sight that seems unthinkable. After all, our economies are structured round the PLC. All our current management thinking is based on it.
But look at it another way. The decline of the listed company just seems to confirm what some of us have come to think anyway: it isn’t just the way executives are paid that is wrong with current management thinking – all of it is. If that is the case, as the companies that behave according to its logic succumb to the inevitable and disappear from the scene, the high-pay problem, and perhaps many others associated with it, will solve themselves. We can then reboot management too. So… the sooner the better. As a matter of urgency we should focus attention instead, as Davis has proposed, on what comes next – new shapes of collaboration and enterprise that are forming under the surface of the economy.
Once the unthinkable has been thought, it's possible to perceive a number of green shoots poking through. Recent research shows that private US firms invest at twice the rate of public ones, indirectly supporting the idea that entrepreneurs are preferring to remain private because it is more favourable to the building of long-term value. Although from a tiny base, the number of 'benefit corporations', b-corps for short, companies set up explicitly to serve social as well as profitable ends, is everywhere increasing fast. With Ben & Jerry's as a b-corp cuckoo in his nest, Unilever boss Paul Polman has publicly wondered what it would take for the whole group to follow suit.
Meanwhile, Forbes commentator Steve Denning argues that we are already in a crossover period in which a new economy of vigorous agile young firms unencumbered by past bad habits is growing up alongside the declining dumb old one which it already far surpasses in ambition and soon will also in amplitude and achievement.
Such optimism perhaps comes more easily to Americans than cautious Europeans. Yet while it goes against the grain to ignore the huge injustice of the present position, the charms of beating one’s head against a brick wall have long since palled. And remembering Keynes and the remarkable half-life of zombie ideas, it’s as well to take on board the lesson of experience that says it’s easier to act your way into a new way of thinking than the reverse. Conclusion, then: it’s time to follow the lead of Davis and the US Academy of Management, which at its last two annual meetings has run well-attended discussion sessions on something that may be nearer than we thought: life after the corporation.
The collaborative economy – a disruptive revolution or another freshly-dressed emperor?
Read my thought-piece with the Foundation, ‘The collaborative economy – a disruptive revolution or another freshly-dressed emperor?’, here