The concept of a minimum or living wage is not new. The 18th century economist Adam Smith advocated “a wage sufficient to provide the necessities and comforts essential to an acceptable standard of living.”
But while living wage levels are still mostly set by think tanks, pressure groups and charitable organisations rather than by legislation, the minimum wage is widely recognised as good not only for society but also for a nation’s economy.
However, only 18 of the 27 European Union countries currently have a minimum wage. Countries such as Austria rely on collective bargaining, while Switzerland recently voted to reject the idea altogether. In the U.S., protective wage laws can be traced back over 100 years; a federal minimum wage was mandated in 1938, and the most recent increase, to $7.25 per hour, came in 2009.
The Rise of the Machine
A minimum or living wage may help to protect blue collar workers and those at the foot of the income ladder, but it potentially has other impacts too.
The threat comes less from price competition within the existing workforce, and more from partial or total replacement by technology. This is not new, of course, but where technology once complemented workers, it now competes with them head on, and the ripple effects are being felt across the wider workforce.
With trials already underway to deliver packages by drone and driverless trucks in development at Mercedes-Benz, projects like these could have far-reaching effects. Fewer drivers would mean fewer motorway cafes, rest stops and even uniforms, while extended operating hours for vehicles would reduce fleet requirements. In the U.S., truck driving is listed in many states as the most common job, employing about 3.5 million people directly and another 5.2 million in ‘truck-related industries’.
So, should we be worried? Certainly, some machine technology offers a very productive and effective solution, removing boredom, increasing productivity and ensuring consistency. But with increased use, who takes responsibility if something goes wrong, and how will blue collar workers be affected?
As humans, we make choices based on information, experience and moral judgement, but computers act on logic and learnt ‘artificial intelligence’. Last year, Professor Stephen Hawking predicted that, “The development of full artificial intelligence could spell the end of the human race… Humans, who are limited by slow biological evolution, couldn’t compete, and would be superseded.” While not all experts agree, we should perhaps take note.
Who’s Flipping Those Burgers?
We may never see job applications that state ‘humans need not apply’, but a mix of semi-skilled labour and machines will become ever more prevalent. Take the local burger bar. We will still need someone to smile and take your order, ask if you want fries, and wish you a good day. But flipping the burger, adding the sauce, and putting it in the bun can be outsourced to technology.
The choice for employers is about Return on Investment (ROI), and while human workers require breaks and holidays, modern machines can run for years with little maintenance and will never tire or need time off.
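The employer's ROI calculation can be sketched as a simple payback-period comparison. All of the figures below are hypothetical assumptions chosen for illustration, not real market data for any machine or wage:

```python
# Illustrative payback-period comparison: replacing one worker with a machine.
# Every figure here is a hypothetical assumption, not real market data.

def payback_years(machine_cost, annual_maintenance, annual_labour_saving):
    """Years until the machine's up-front cost is recovered."""
    net_annual_saving = annual_labour_saving - annual_maintenance
    if net_annual_saving <= 0:
        return float("inf")  # the machine never pays for itself
    return machine_cost / net_annual_saving

# Hypothetical burger-flipping machine vs. one full-time employee.
annual_wage_bill = 28_000  # wage plus employer costs (assumed)
machine_price = 60_000     # up-front purchase price (assumed)
maintenance = 3_000        # per year (assumed)

years = payback_years(machine_price, maintenance, annual_wage_bill)
print(f"Payback in {years:.1f} years")  # → Payback in 2.4 years
```

Under these assumed numbers the machine pays for itself in under three years, after which its running cost is a fraction of the wage bill, which is why the ROI case only strengthens as minimum wages rise.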
What this means for labour is the need to become more flexible and adaptable. History has shown that, on balance, technology creates jobs rather than destroys them, but that could change. The challenge is not to fight the machines but to embrace change and evolve new skillsets to work together. The rise of the machines could also facilitate the rise of the blue collar worker.