In his article, “Automating Work. Humanising Jobs”, Andrew Cleminson argues, “There’s nothing wrong with getting [intelligent and robotic software] to do the jobs we no longer want to do ourselves. In some cases, it’s simply better as well.”
But what happens if human beings are becoming less competent at those things where artificial intelligence is becoming more efficient?
Cleminson points out that [almost] everyone is taking advantage of new software capabilities. He gives the example of using GPS to find our way to our destination. But have you noticed that the more you use GPS and Google Maps, the less able you are to give and follow directions? Have you noticed how much longer it seems to take to orient yourself in a new city without using technology? (Where is the sun here … and which way does it go?) What impact could ‘outsourcing’ the ability to give and follow directions have on your own skills?
Cleminson boasts, “high performance systems are enabling us to provide our clients with better service whilst reducing errors, exceptions, transaction times and cost.” While the efficiency and expediency of digital platforms are appreciated, what are the consequences of ‘outsourcing’ easily resolved errors to technology rather than having human beings practise to the point of proficiency so they reduce errors themselves? Does off-loading small-scale risks to technology leave humans with the higher-stakes risks — without giving us the basic error-correction skills? Do we not lose something important along the way?
“Application forms, claims, complaints and requests for information can all be read by software,” Cleminson says. But this means that one person isn’t helping another with their application. This means that a person isn’t practising how to address a customer’s complaint. This means that a person isn’t learning to find the answer for a client but is, de facto, only prepared to tell the client, “Just Google it.” How do you feel when you are confronted with an automated process but desperately want to talk to a real person?
Have you ever noticed how the restrictions of ‘technological paperwork’ are sometimes more rigid than the most lamentable bureaucracy? The technological ways of doing things, with their extreme emphasis on accuracy, often eclipse common sense and the human touch.
Consider this story of a Canadian man with cerebral palsy trying to get his passport renewed. There is a rule barring staff from filling out forms for clients, including those with disabilities, allegedly to prevent errors. The man expressed frustration to media saying, “The fact that I was denied and told, ‘Here's a paper form,’ which I just told you I can't use — I’d just like to see them have some compassion.” How terribly dehumanising is this experience?
Still, Cleminson is convinced that “we have reached a new crossroads. Increasingly, clever software carries out routine admin tasks, reducing the need for us to do these things ourselves.” Have you also noticed how many people are losing the ability to do basic administrative tasks well? Apps promising “context-optimized” editing of all of your writing and countless apps promising to “organize your life” make me wonder if we are not losing the ability to do the most basic things for ourselves. Are those basic skills not the building blocks for more advanced skills?
Somewhat paradoxically, time-saving, efficiency-promoting technologies are already robbing us of time by keeping us constantly distracted. Are they now starting to “steal” our skills as well? What are the consequences for your own skill set when you outsource such fundamental tasks?
While I am not necessarily opposed to artificial intelligence, or to technological developments generally, I remain concerned about the implications and the trade-offs. In addition to my suspicion that “automation” leads to “dehumanisation”, I wonder whether we are really gaining enough new human skills to compensate for those we are surrendering to AI. We need to upgrade our personal and interpersonal capabilities faster than we are upgrading technology.