A Robot Stole My Job (And I Couldn't Be Happier)
TL;DR:
Most value chains are still—and will remain for a long time—largely ‘analog’, manual, and human-centered.
‘Digitalizing’ only the customer-facing side of things does not even remotely count as ‘digitalization’.
The foundational technologies to enable ‘digitalization’ of the economy are in their infancy.
Our individualistic way of doing business collides with our dreams of a ‘digital’ economy.
The environmental and societal impact of a digital economy is poorly understood.
Digi-washing will only increase: digitalization is being increasingly manipulated as a vapid PR buzzword and quickly losing its meaning.
The education system in all its layers is still creating people for an analog society.
Imagine you need to do groceries. We’ve all been there: dragging our feet to the shop to end up mechanistically touring the aisles, putting in the cart the usual bread, tomatoes, bananas, etc. Then queueing in a long line while browsing Reddit. Sure, home delivery of groceries is a thing: there are plenty of apps to order online and get the stuff delivered to your door. But here lies the first question: is that what we mean by ‘digitalization’? Hell no. Why? Because ‘digitalizing’ only the last stage—the customer-facing stage—of an otherwise incredibly analog, long chain is not exactly what ‘digitalizing’ means. Let’s see why.
Adding nice-looking software to the last mile does not replace the humongous choreography of human flesh involved in the process: it does not replace the person ringing your doorbell or leaving the stuff at your door, nor the person driving the van, the person loading the van, or the people rushing around the warehouse taking things from shelves and putting your order in a box. Nor does it replace the people who unloaded the trucks at the supermarket warehouse or those who stocked the shelves, the person driving the truck to the warehouse, or the people working at the factories that supplied the supermarket, where the problem replicates all over again. It is a largely manual, human-driven activity of moving stuff around, lifting things up and down, turning machines on and off, and a lot of people moving, watching, working, sleeping, driving, walking, talking, smiling. The same goes for any other delivery service, such as your typical food delivery, where sleek apps give customers the illusion of a ‘digital’ experience while in reality there is a long list of low-wage workers in flesh and bone using their muscles to cook and ship your food, including the neglected, sweaty ‘partner’—don’t you hate the cruelty of naming them that way?—pedaling their life away like there is no tomorrow to get your stuff home before it gets cold. Digi-washing at its best.
Now think of the opposite: a hypothetical, truly ‘digital’ scenario. You order your stuff with your phone, but instead of all the humans referred to above, this time your order is processed by software that sweeps your list of desired products and tasks machines with bringing your stuff from shelves that were stocked by other machines, while yet another piece of software ensured beforehand that the shelves always hold enough stock according to the historical demand for each product: ordering more of what’s popular, phasing out things people don’t seem to like. That software places orders by commanding the supplier-side software, which in turn tasks their factory accordingly, adjusting—and to some extent predicting—demand, and also defines the routes and tasking of the delivery machines, optimizing for the shortest path and providing estimated times of arrival (ETA) with good accuracy.
Now think about all the technology that must be in place for this full ‘digital’ scenario to be possible: we need autonomous trucks, robots or similar machinery for loading/unloading lorries and organizing the stuff in the warehouse while updating all related databases and systems. And lots of software capable of processing time-series consumption trends and extracting valuable data for tuning the ordering and manufacturing process.
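To make the replenishment part less abstract, here is a minimal sketch in Python of the kind of decision logic such software could run, assuming a naive moving-average forecast of historical sales; the product names, window size, lead time, and safety stock are made up for illustration, not a reference implementation.

    from collections import deque

    def forecast_daily_demand(sales_history, window=7):
        """Naive forecast: average of the last `window` days of sales."""
        recent = list(sales_history)[-window:]
        return sum(recent) / len(recent) if recent else 0.0

    def reorder_quantity(stock_on_shelf, sales_history, lead_time_days=2, safety_stock=10):
        """How much to order so the shelf survives the supplier's lead time."""
        expected_demand = forecast_daily_demand(sales_history) * lead_time_days
        shortfall = expected_demand + safety_stock - stock_on_shelf
        return max(0, round(shortfall))

    # Hypothetical shelf state: product -> (units on shelf, recent daily sales)
    shelves = {
        "bread":    (12, deque([30, 28, 35, 31, 29, 33, 30])),
        "tomatoes": (40, deque([15, 14, 16, 15, 13, 17, 16])),
        "bananas":  (5,  deque([22, 25, 21, 24, 23, 26, 22])),
    }

    for product, (stock, history) in shelves.items():
        qty = reorder_quantity(stock, history)
        if qty > 0:
            # In the scenario above, this would become a call to the supplier-side software.
            print(f"order {qty} units of {product}")

A real supply chain would use far more sophisticated forecasting (seasonality, promotions, spoilage), but the point stands: the decision loop is pure data processing, with no human walking the aisles with a clipboard.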
Have you traveled abroad recently? That’s some analog experience, isn’t it? You need to drag your bags to the car, drive to the airport, manually hand over the bags, walk your soul through security, kill time in the duty free, visually poll screens, freak out, run to the gate, board the plane. Although it is hard to imagine a future of robots at home packing your underwear, it’s still possible to think of a future where traveling abroad involves an autonomous vehicle picking you and your bags up from your door, connecting you to a hub (train or bus), and ensuring your bag is routed to the right flight and destination without you having to roll it. Then, all you have to do is the physical security check (impossible not to go analog here). No screen-polling (you’d be informed of any change at any time), no bag carrying, no more running, unless you purposely want to suffer. Then, at your destination, the counterpart of this more ‘digitalized’ door-to-door transportation system would deliver you and your bag to where you wanted to be. Today, it is hard to even find electrical outlets to charge your phone at airports. Not a very ‘digital’ experience, as much as the marketing departments of the air transportation industry want us to believe with their smooth-voiced ads.
For all this, we surely need a variety of robotic vehicles—that is, machines capable of moving from A to B carrying mass through land, air, and water—and industrial (pun intended) amounts of autonomy for those vehicles to be able to drive themselves safely from that point A to the other point B. And a strong maturation of the underlying technologies enabling all this: software of all kinds running locally and in ‘the cloud’, electronics, batteries and other energy storage and generation technologies, machine vision, LIDAR, radar, signal processing, radio frequency, cybersecurity. And a good understanding of the safety and legal implications of connected autonomous traffic.
But, more fundamentally, a full digital economy calls for an insane amount of standardization and consensus if we dream of a future of machines talking to other fellow machines. Said machines must be capable of ‘talking’ the same language, that is, exchanging information in formats they can process and ‘understand’ accordingly. For the air transportation example above, it would mean your autonomous shuttle and its handling software know at all times where your flight is and, what’s more, where your luggage is.
A realistic ‘digital’ economy requires almost any data set produced by any system to be directly understood, processed, and augmented by any other system that has complementary capabilities the former system does not have.
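As a toy illustration of what ‘talking the same language’ would mean in practice, here is a minimal sketch assuming a hypothetical shared schema for flight and baggage status that both the airline’s systems and the autonomous shuttle’s software have agreed on beforehand. The field names, the flight number, and the schema itself are invented; the point is that the payload is self-describing and machine-readable instead of being buried in a proprietary format.

    import json
    from dataclasses import dataclass, asdict

    @dataclass
    class FlightStatus:
        """Hypothetical shared schema agreed on by airline and shuttle software."""
        flight_id: str
        departure_gate: str
        scheduled_departure_utc: str   # ISO 8601 timestamp
        delay_minutes: int

    @dataclass
    class BaggageStatus:
        bag_tag: str
        flight_id: str
        location: str                  # e.g. "sorting_facility", "loaded", "arrived"

    # The airline system publishes an update...
    update = FlightStatus(
        flight_id="XX123",
        departure_gate="22",
        scheduled_departure_utc="2030-07-01T14:40:00+00:00",
        delay_minutes=25,
    )
    message = json.dumps(asdict(update))

    # ...and the shuttle's planning software consumes it, no humans or screens involved.
    incoming = FlightStatus(**json.loads(message))
    if incoming.delay_minutes > 15:
        print(f"Replan pickup: flight {incoming.flight_id} delayed by {incoming.delay_minutes} min")

The hard part, of course, is not the thirty lines of code; it is getting every party in the chain to agree on and maintain that schema, which is exactly the consensus problem listed below.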
This hypothetical scenario belongs to an almost—at least in today’s terms—unthinkable reality where all human-made systems are not only capable of talking to each other but also capable of using the information they exchange; a reality where interoperability and collaboration are always present, by design. A reality that will take time to create. Meanwhile, it remains in our short-term interest to keep creating isolated systems, and isolated systems are the enemy of any realistic ‘digitalizing’ intentions. The Rise of the Machines, as many dread, will remain a sci-fi thing as long as we keep engineering isolated systems. Some forces that shape this isolation are:
Protection, which comes in different flavors: data protection, intellectual-property protection, privacy protection, access protection. Largely driven by competition and regulations. Note that protection is not a synonym for encapsulation; we can have a properly encapsulated system that is still capable of interoperating.
Cost saving, in both money and time: it appears cheaper and faster to build things without caring about what’s behind the door.
Market dominance: the maker of system A is not very attracted to the idea of letting other commercial actors interface with its product and make a profit; that is, to sharing the value chain.
Consensus: if we want system A from entity X to talk to system B from entity Y, we have to get entities X and Y to sit down and discuss how. Moreover, if we want other entities to be able to eventually join, we must open that conversation up. This is, in a nutshell, the standardization nightmare. Standardizing is hard, and a very human-driven process. Too many standardization efforts become complex pissing contests between corporate wills. Too many standards become a mess too quickly.
Not surprisingly, the lack of interoperability in the artifacts we make is the result of our intrinsic self-interest. Collaboration is the only way if we want to pursue any true ‘digital’ future, which poses a lovely paradox for our typically individualistic minds when it comes to doing business. Group life is an evolutionary response to threats: the wildebeest that inhabit the open plains of East Africa live in massive herds because there is safety in numbers. And in fact, there is no conflict between collaboration and self-interest: herds can be opportunistic, transient, and fickle, and clustering can revert to solitary life when the threat is removed.
We humans can afford to be individualistic jerks largely thanks to our rich evolutionary history of collaboration; collaboration of our ancestors, but also collaboration of the cells and genes in the organisms that evolved into our ancestors’ ancestors and ultimately into ourselves. I can’t help but find it quite ironic that the very cognitive ability we use to decide to act self-interested is the result of billions of years of cooperation. As long as privacy and ethical integrity are ensured, a collaborative, interoperable system has a better chance of succeeding than one moated behind air gaps, secrecy, and proprietary formats and interfaces. And disaster kicks in when insular systems are forced to interoperate by means of ad hoc artifacts such as shady middleware and odd translators, creating the grounds for a game of Chinese whispers and sprinkling the whole thing with multiple points of failure.

Universities don’t teach us engineers to conceive systems with interoperability as a strong driver, or as a driver at all; they teach us to conceive systems at the lowest cost possible. In other words: make it cheap, make it fast. Uncomfortable truth: the way to a ‘digital’ economy or society will require substantial initial investments whose results may take time to be seen. We can’t go cheap.
Destroyed jobs? More like, better jobs
From what was stated above, it appears as if the ‘digital’ future is about huge, pristine warehouses full of robots nervously rushing around, with no human presence at all. Or about airports without counters and without polite servers shouting ‘next!’. It appears as if the ‘digital’ economy means that all the people we described as part of the ‘choreography’ behind every grocery run would eventually lose their jobs. Is this true? It is, and that in fact couldn’t be better news.
As much as driving a truck is as respectable an activity as any other, it does not cognitively exploit the best of what a human brain can do. A human sitting behind a wheel for days with the sole intention of keeping the vehicle between two parallel lines and avoiding crashing into other vehicles, so that the cargo can reach its destination in one piece, sounds, looked at from a reasonable distance, insulting to our evolution as a species. The same goes for using humans to unload a lorry, carry boxes, or drive a forklift across a warehouse to move stuff. Or to scan products through a laser beam for hours while listening to seniors talking about their newest illness, or to check passports. We’re way better than that, and the fact those jobs still exist is a sad reminder of the ‘analog’ society we still live in. Robots and software will not destroy jobs irreversibly; they will incentivize the workforce to shift into more cognitively meaningful jobs, because the ‘digital’ economy, as such, will need lots of thinking. Human thinking: architects, designers, implementers, debuggers, configurators, maintainers, testers. Our human brains are capable of extremely complex, nuanced operations: from poetry and music all the way to complex systems engineering. A somewhat hidden face of inequality in the world is the cognitive difference between low-wage jobs and white-collar, better-paid jobs. The fact that today ‘low wage’ equals ‘low cognitive intensity’ should concern us more than it actually does.
If a serious ‘digitalization’ process were ever to get past the PR buzz, all layers of the education system would need to adjust accordingly, tuning their curricula for the skills and capabilities needed to work in an economy where humans driving trucks are no longer required; where humans stacking boxes in a warehouse will be remembered with humor the way knocker-uppers are remembered today. Instead of the sterile discussion about ‘robots coming to steal our jobs’, the discussion should be about how well prepared we are to speed up the process of robots taking our jobs so we can go and do better, more meaningful things.
Always eat your 0s and 1s
Someone will quickly argue that not all of the economy can be digitalized. For example, can you download toilet paper over the internet? Granted, you cannot wipe your ass with a PDF file.
More importantly, there’s the question of food security when discussing extensive industry digitalization. Bluntly: can you eat digital bread? No, you can’t. So food is an industry that can be digitalized to a certain extent, but not entirely. But how relevant is the agriculture industry to the global economy?
In 2018, agriculture accounted for 4% of global gross domestic product (GDP), about one third of its contribution a few decades ago. However, in the least developed countries, more than 25% of GDP is derived from agriculture [1][2].
But there’s another side to this: agriculture, forestry, and land use change are responsible for about 25% of greenhouse gas emissions (see footnote 1 for details). Our current way of producing physical things, including food, is environmentally unsustainable, rooted in practices that are obsolete and aggressive toward the environment. We can’t digitalize tractors, can we? Let’s see.
Recently, an article caught my attention. It is about how John Deere has turned tractors into computers, and the backlash this has caused among farmers, who are increasingly frustrated at having lost the ability to repair their own tractors. Farmers seem to find software-intensive tractors a hard pill to swallow. The tractor manufacturers claim this is the future and that it is here to stay. In reality, what the farmers have lost is the chance of understanding the machines they own. Even if provided with all the tools to diagnose and repair their high-tech farming devices, the large majority of farmers would be incapable of doing anything, since this would require quite specific, multidisciplinary technical knowledge. Reading the article, you get a different impression: tractors still physically resemble old tractors but are already something else—they are tractor-shaped computers. More fuel efficient, more data-driven, more environmentally friendly.
At my wife’s grandma’s farm, there is a very old tractor, I think from the 50s. My kid loves to jump on it and pretend he’s driving fast while honking and arguing with other pretend drivers, and I love to just stare at the machine and appreciate the beauty of its greasy, neat mechanics. I always find it extremely fascinating to be 100% sure there are zero transistors on board, let alone any computing. The old tractor still works, and it is used to collect potatoes every summer in Levänen.
In the last 75 years, the transistor conquered the mechanical machine. As soon as this happened, machines stopped being the result of single-disciplined engineering, and the fun began. Transistors quickly clustered into logic gates, logic gates clustered into arithmetic units, data paths, and memories, and all that turned into microprocessors capable of executing instructions defined by an outsider (the programmer). Higher-level languages came to help cope with the somewhat obscure nature of machine code, and the software industry was born. But microprocessors remained rather harmless while isolated from the physical world. That did not last long: we made them capable of interfacing with the outer world by means of sensors and actuators, and computing took over machines very rapidly. Things which were born purely mechanical became computerized: factories, cars, airplanes, ships, and tractors. Today, those cyber-physical systems produce tons of data, and that data is used to make them work more efficiently.
But there’s something odd here. Why is it that most acquiesce to the computer conquest, whereas a few still fiercely resist? I’ll risk a theory here: the tractor remains connected to a past when work was highly physical and animal-dependent, and that past is still vividly passed down from generation to generation of farmers. In a way, farming equipment is (still) perceived as animal replacement and augmentation, but not more than that. Maybe John Deere should stop making them tractor-shaped and stop calling them tractors, so that farmers would finally understand the new machines are here to do agriculture in a different way.
At some point, we understood that the car is not a replacement for a horse anymore but a thing of its own (a youngster today may not even know there is such an equine past). The tractor remains, in the minds of many, a sort of metal horse, an extension of their arms and muscles. I could perceive what John Deere’s CTO was trying to say in that article, between the lines:
Old farming is gone, and new farming requires a new mental model;
Physical force is not a factor anymore, get over it;
Software enables many great things and pays off, get over it;
Stop thinking about the tractor as your horse or your ox. Start thinking about APIs, about autonomous driving, embrace efficiency. Monitor your farm from your smartphone while you sip a lemonade, or read a book. You don't even need to ride on the thing anymore.
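To make the ‘think about APIs’ point concrete, here is a minimal sketch of what monitoring a machine from a phone or laptop could boil down to, assuming a purely hypothetical telemetry endpoint and JSON payload; real vendors expose their own (and mutually incompatible) APIs, with authentication on top.

    import json
    from urllib.request import urlopen

    # Hypothetical endpoint and fields, for illustration only.
    TELEMETRY_URL = "https://farm.example.com/api/v1/machines/tractor-07/telemetry"

    def fetch_telemetry(url=TELEMETRY_URL):
        with urlopen(url) as response:
            return json.loads(response.read())

    def summarize(t):
        return (f"fuel {t['fuel_percent']}%, "
                f"hectares covered today {t['hectares_today']}, "
                f"seed rate {t['seed_rate_per_ha']}/ha")

    if __name__ == "__main__":
        print(summarize(fetch_telemetry()))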
This is the kernel of the ‘digitalization’ conundrum: the shock seems to be too strong. The gap between how things were done versus how they must be done in the ‘digital’ age is way too wide.
Here’s an uncomfortable truth for all of us doing tech: new technology can make the world a better place, but not when decoupled from its inherent societal consequences.
Physiocracy is an economic theory developed by a group of 18th-century French economists of the Age of Enlightenment who believed that the wealth of nations derived solely from the value of "land agriculture" or "land development" and that agricultural products should be highly priced. Physiocracy became one of the first well-developed theories of economics. We have already proved, as a society, that physiocracy is not the only way to function. But to gear up towards a more ‘digital’ economy, the key will lie in making sure we all—globally—transition as jointly as possible, understanding the skills needed so that we can let the robots do all those jobs we’ve always hated doing.
Robots are fine with it.
We are programmed just to do
Anything you want us to
We are the robots
Truly Digitalized Engineering? Not so much
Sadly, the hype industry has also hijacked and bastardized the concepts of “digitalization” and “digital transformation” in engineering practice until they mean nothing anymore. We just can’t have nice things. This is sad if we consider that there is a need for a true, honest discussion on how much the process of engineering complex things stems from a messy digital origin, and on how it steps into the not-so-well-behaved physical world.
Up until recently, the only way to have extensive knowledge of a physical object was to be in close proximity to that object. The information about any physical object was relatively inseparable from the physical object itself. We could have superficial descriptions of that object, but at best they were limited in both extensiveness and fidelity.
Basic information about an object, such as its dimensions (height, width, and length), only became consistently available in the late 1800s (!) with the creation of the standard meter and a way to measure things uniformly. Before that, everyone had their own version of measurement definitions, which meant that interchangeable manufacturing was nearly impossible. It was only in the last half of the twentieth century that we could strip the information from a physical object and create a digital representation of it. This digital counterpart started off relatively sparse, as a CAD description, and has been growing richer and more robust over the years. While at first this digital “reflection” was merely descriptive, in recent years it has become increasingly behavioral. Behavioral means that the CAD object is no longer simply a three-dimensional object hanging in empty space, independent of time: we can now simulate physical forces on this object over time to determine its behavior. Where CAD models were static representations of form, simulations are dynamic representations not only of form but also of function.
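The jump from descriptive to behavioral is easier to see with a toy example: below is a minimal sketch that simulates a mass on a spring with damping, stepped through time with semi-implicit Euler integration. The geometry alone (the ‘CAD view’) tells you nothing about this; the time-stepping loop is what turns form into function. All parameter values are arbitrary.

    # Toy behavioral model: a 1-DOF mass-spring-damper, stepped through time.
    mass = 2.0         # kg
    stiffness = 800.0  # N/m
    damping = 4.0      # N*s/m
    dt = 0.001         # s

    position = 0.01    # initial displacement, m
    velocity = 0.0     # m/s

    for step in range(5001):                   # simulate 5 seconds
        force = -stiffness * position - damping * velocity
        acceleration = force / mass
        velocity += acceleration * dt          # semi-implicit Euler: velocity first...
        position += velocity * dt              # ...then position, using the new velocity
        if step % 1000 == 0:
            print(f"t={step*dt:4.1f}s  x={position*1000:+8.3f} mm")

Swap the hard-coded constants for material properties and loads pulled from the design data, scale the idea up to thousands of coupled equations, and you get the kind of dynamic representation described above.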
One of the problems with only being able to work with a physical instance of an object is that experimentation on its behavior was both expensive and time-consuming. We first had to physically create the object as a one-off. We then had to create a physical environment in which the object was subjected to the elements, for instance mechanical vibration and temperature swings. We were therefore limited to investigating the scenarios, and the associated load levels, that we thought were of concern. Often, the test scenarios would damage the object, increasing the expense. As a result, the first time we actually saw a condition not covered by a physical test would be when the physical object was in actual use, which meant unforeseen conditions or emergent behaviors resulting in failures that could cause harm and even death to users. Aggregating these objects into systems compounds the problem: systems are much more sophisticated than simple objects, and we need better ways to understand them. If the system contains computing elements, we also have to simulate the behavior of those computers and how they execute the code the system requires to achieve its expected behavior. Here a new problem appears: the fidelity of the simulators cannot be ignored. A CPU emulator which contains an error when emulating architecture X or Y will change the execution flow of the code, which may lead to totally different behavior compared to the real thing.
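Here is a minimal sketch of that fidelity problem, using a made-up toy instruction set rather than any real CPU architecture: two runs of the same program differ only in how the emulator implements a ‘branch if less than’ comparison (one has an off-by-one bug), and the program ends up taking a completely different path.

    def run(program, blt_buggy=False):
        """Tiny interpreter for a made-up ISA: SET r v | ADD r v | BLT r v target | OUT msg | HLT."""
        registers, pc, trace = {}, 0, []
        while pc < len(program):
            op, *args = program[pc]
            if op == "SET":
                registers[args[0]] = args[1]
            elif op == "ADD":
                registers[args[0]] += args[1]
            elif op == "BLT":                  # branch if register < value
                reg, value, target = args
                taken = registers[reg] <= value if blt_buggy else registers[reg] < value
                if taken:
                    pc = target
                    continue
            elif op == "OUT":
                trace.append(args[0])
            elif op == "HLT":
                break
            pc += 1
        return trace

    program = [
        ("SET", "temperature", 0),          # 0
        ("ADD", "temperature", 100),        # 1
        ("BLT", "temperature", 100, 5),     # 2: if temperature < 100, jump to 5
        ("OUT", "overheat: shutting down"), # 3
        ("HLT",),                           # 4
        ("OUT", "keep running"),            # 5
    ]

    print(run(program))                   # faithful emulator: ['overheat: shutting down']
    print(run(program, blt_buggy=True))   # buggy emulator:    ['keep running']

The same program, the same inputs, and two opposite decisions; now imagine that divergence buried deep inside the simulation of a full system.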
The idea of a seemingly inevitable “Digital Transformation”, where we push most of our work as engineers into the digital domain, has made headlines recently, with incumbents preaching about the beauties of embracing “Digital Twins” and how they are very much doing it already. The Digital Twin concept, by definition, means being able to design, evaluate, verify, and use/operate a virtual version of the system of interest during a large portion of the design lifecycle. In theory, with Digital Twins we could understand whether our designs are actually manufacturable, and determine their performance and modes of failure, all before the physical system is actually produced. It also downplays the role of physical prototyping, as the need to produce rough instances of the design is replaced by analyses done in the digital realm.
That’s the hype side. In reality, there is an ugly side to “Digital Twins” and the transformation they are supposed to bring. Far from the ideal of a cohesive digital replica that stock images of people in VR headsets touching screens with revolving 3D models have tried to make us believe in, representing a complex system in the digital domain requires large amounts of unstructured information spread across disaggregated tools and software environments that do not easily talk to each other. This includes mechanical design data, thermal and fluid dynamics data, power systems, multiphysics simulation, finite element analysis, electromagnetics, application software, firmware, and programmable logic, just to name a few. As much as the product lifecycle data tool makers have tried hard to sell us the idea that they can easily handle all of that, it remains largely untrue; PDM/PLM tools only bite off a small portion of the challenge, usually the portion without complex runtime behavior.
Don’t be deceived. The digitalization of the engineering practice and product design is still in its infancy and looks more like an intention than a ‘transformation’. Today, it’s just a dysfunctional collection of expensive tools that come with industrial amounts of vendor lock-in. Just like lifeboats on the Titanic or space tourism, the ‘Digital Transformation’ appears to be a rather bourgeois thing, only possible for those with deep pockets.
For us mortals, the “digital transformation” is still largely Excel.
Some paragraphs are adapted from “Digital Twin: Mitigating Unpredictable, Undesirable Emergent Behavior in Complex Systems”, by Michael Grieves and John Vickers
[1] https://www.fao.org/3/i2490e/i2490e01c.pdf