In a remarkable product demonstration, Google unveiled its improved artificial intelligence (AI) application, Google Assistant. In the demo, the application phones a hairdresser and, using uncannily natural-sounding speech peppered with “uhms”, books an appointment by conversing with the hairdresser. In doing so, Google Assistant appears to pass the Turing Test, proposed by the British mathematician Alan Turing in 1950. The test holds that if a person cannot tell whether they are communicating with a human or a machine, then the machine has passed the test and can therefore be said to “think”.
In the demo, it is a machine that (or perhaps who?) is calling the business to book the appointment, and the individual answering the phone is human. However, this could easily be reversed, so that it is a person who is calling the business and a machine that answers on the business’s behalf.
This raises an interesting question: what if there were a machine at both ends of the conversation, that is, one Google Assistant calling another? If the AI engine running both assistants is advanced enough, they could, in theory, carry on a meaningful conversation. Although this might seem like the ultimate AI prize, there is a much simpler solution: using a website to book the appointment. Granted, it doesn’t have all the nuances of a regular conversation, but if the goal is simply to book an appointment, then the user’s computer merely has to connect with the business’s.
This use of advanced AI is part of a larger phenomenon: the degree to which our daily tasks have been automated or performed by others. Until a mere 200 years ago, people made and repaired what they needed, including clothes, tools, furniture, and machinery, and often grew their own food. The industrial and agricultural revolutions changed all that. Goods could be mass-manufactured more efficiently and at a lower cost. Food could be grown on a mass scale. We have moved away from a society in which individuals made their possessions to one in which we let others do this for us.
As recently as the 1960s, many people maintained and fixed their own cars; most people today leave this to a mechanic. We have outsourced nearly everything. Although we have gained much in quality, price, and selection, we have, in the process, lost many practical skills.
This trend continues as more and more processes are automated or simplified. Coffee makers that use pre-packaged pods are easier to use than regular coffee makers. However, it would be a sad thing if an entire generation did not know how to brew coffee the regular way. Even brewing coffee “the regular way” still involves using a machine that others have made and that we cannot fix, powered by electricity that we do not generate, using beans that we can neither grow nor process ourselves, and water that is automatically pumped into our homes through an infrastructure that we cannot maintain. The parts that make up the parts that make up still larger parts are designed and built by others.
At its heart, Google Assistant uses algorithms: sets of sequential rules or instructions that solve a problem. A simple example is converting Celsius to Fahrenheit: multiply by 9, divide by 5, and then add 32. The algorithms used by software applications are, of course, vastly more complex than this example, often spanning millions of lines of code.
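The three-step temperature rule above can be written out as a short program; here is a minimal sketch in Python (the function name is our own choice for illustration):

```python
def celsius_to_fahrenheit(celsius: float) -> float:
    """Convert a Celsius temperature to Fahrenheit using the
    three-step algorithm: multiply by 9, divide by 5, add 32."""
    return celsius * 9 / 5 + 32

# Water freezes at 0 °C and boils at 100 °C:
print(celsius_to_fahrenheit(0))    # 32.0
print(celsius_to_fahrenheit(100))  # 212.0
```

Even this tiny example has the defining shape of an algorithm: a fixed sequence of steps that, given any valid input, produces the correct output.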
Algorithms are ubiquitous. They are used extensively by online retailers (such as Amazon) to recommend purchases based on our previous purchases and browsing habits. Facebook uses them to track our activity and then sell that data to others, often with dire results. Algorithms are also used in two of the most important decisions a person can make: whom they love (in dating applications) and where they work (in résumé and interview screening applications).
Algorithms have even been used to determine how likely a criminal defendant is to re-offend, based on attributes such as race, gender, age, neighbourhood, and past criminal record. But is it ethical for a judge to use an algorithm to determine the length of a sentence? This happened in the case of Eric Loomis, who received a six-year prison sentence in part due to a report the judge received based on a software algorithm.
Society is facing the same trade-off that it faced 200 years ago as it moved from personal to mass manufacturing: convenience and comfort versus knowledge and independence. As we relinquish more and more power to machines and let algorithms make more of our decisions, we achieve more comfort but less freedom. We are, bit by (computer) bit, quietly choosing to live in a massive hotel. It is pleasant, and we don’t have to do much, but it does not prepare us for life.
For in life, there is often sadness, pain and hardship. There is no algorithm that tells us how to deal with these things, nor will there ever be.