Resisting New Transformative Ideas

 

Ashok Ogra
From the acceptance of coffee as an everyday drink to the invention of the printing press, from the discovery of the X-ray to the launch of the motor car, and on to the current debate over the potential threats of drones, gene editing and much more, history is littered with new ideas, processes and technologies that sparked resistance before becoming fixtures in everyday life.
Most of us are familiar with how blue jeans conquered pop culture and fortified the civil rights movement of the 1960s in the US. However, in the 17th century, the dye that would put the blue in blue jeans was banned by many European countries.
The reason: the fermentation process yields a putrid stench not unlike that of a decaying body. It took a hundred years before indigo dye was accepted by Europeans.
Therefore, the current apprehensions about AI (ChatGPT and the like) are perhaps to be expected, considering what Elon Musk said: "AI could be one of our biggest existential threats."
Humans often greet massive technological shifts, or any change, really, with anxiety. We are also instinctively wired to react to novel things in ways that protect ourselves.
The most interesting case of resistance is certainly that of coffee, a product that was opposed for over 300 years, all the way from Asia to Europe. According to some historians, coffee is the most controversial product of all time, certainly more controversial than nuclear power.
During the 1650s, British wives bemoaned the fact that coffee-drinking was such an intellectual, effeminate pastime that it had rendered their husbands impotent. They also complained that coffee made men too talkative.
When coffee emerged in Tamil Nadu in the late 19th century, there was much opposition from the intelligentsia, who thought filter coffee more addictive than even beer and arrack. More worrying was coffee's transgression into the habits of womenfolk. A correspondent wrote to Gandhi that the greatest obstacle to the success of the non-cooperation movement in Madras 'is our women who have become addicted to drinking coffee.'
Rewind to ancient Greece: Socrates resisted writing because it was mute, did not encourage dialogue and debate, and would breed forgetfulness, since memory would no longer be exercised.
When printing finally arrived around 1450, the major opposition came from the Church and the Ottomans. The concern expressed by Islamic religious leaders was that allowing the printing of the Koran could introduce variations in the text and, therefore, lead to different interpretations of the religious code.
In the initial decades, the Church exercised full control over the printing of books and discouraged the printing of new discoveries that challenged its established viewpoint. For instance, the Church taught that God was responsible for illness and propagated the belief that disease was sent as punishment for sin or to cleanse the soul.
Medical students who tried to make new discoveries were therefore forced to fit them into the older theories rather than experiment to explain them. As a result, medical understanding made very little progress during the 15th and 16th centuries.
The period between the 16th and 18th centuries, however, saw a rapid increase in experimental investigation and advances in anatomy.
Initial public skepticism greeted even inventions that promised huge benefits to society. When Wilhelm Röntgen discovered the X-ray in 1895, entirely by chance, it raised hopes of medical miracles on the one hand and fears of a loss of privacy on the other.
Consider that tobacco smoking as a cause of lung cancer was first researched as early as the 1920s, yet the link was not widely supported in the scientific literature until the 1950s.
Up until the latter half of the 20th century, the prevailing wisdom among gastroenterologists was that ulcers were caused by stress, spicy food, or too much stomach acid. But in the early 1980s, two Australian doctors, Dr. Barry Marshall and Dr. Robin Warren, discovered that the stomachs of patients with ulcers were colonized by a then-unknown microorganism, which they named H. pylori. To gastroenterologists, the idea of a germ causing ulcers was like saying the Earth is flat, so it is no wonder their findings were initially greeted with skepticism. Both doctors were awarded the Nobel Prize in 2005.
When farm tractors were first introduced in the US, some saw little advantage in the new machines over horses. Some even argued that their value could be marginally improved if they could reproduce themselves like horses.
At the dawn of its invention in the 1870s, no one foresaw how transformative the telephone would become. The majority were confused and amazed but largely indifferent. Many people, content with the telegraph, even insisted that the telephone was entirely unnecessary and would discourage families from visiting each other's homes. Some wondered if the machine might be used to communicate with the dead.
And when electric lighting arrived, people had great difficulty adjusting to a new normal that interrupted the biological rhythms of life and altered schedules for work and leisure.
Even promoting the idea of a self-propelled motorcar in the early 20th century invited ridicule. The eminent German sociologist Werner Sombart complained bitterly of a world in which "one person was permitted to spoil thousands of walkers' enjoyment of nature." Initially, accidents happened because other users of the street refused to adapt to the changed circumstances brought about by the appearance of the motor car.
The history of the airplane is even more interesting. Plenty of 'oohs' and 'aahs' greeted the airplane when it made its first appearance in the early 20th century, yet commercial aviation was very slow to catch on with the general public, most of whom were afraid to ride in the new flying machines.
When cinema shows began at the turn of the 20th century, “movies were seen to rapidly intensify the process of demoralization; they drew people out of their homes, tempted them into dark spaces and served them content of low quality and despicable moral standards.”
When Vint Cerf and his colleagues developed the Internet in the early 1980s, few took notice of it, as it crept into our lives slowly. In the case of Tim Berners-Lee, who formulated the WWW code in 1991, the significance of the moment wasn't immediately obvious either; his own colleagues paid little attention.
Just forty years ago, bank employees across the country strongly resisted the computerization of their operations. Since then, however, banks in India have witnessed a radical shift from 'conventional banking' to 'convenience banking', and they are now moving towards 'digital banking' at a rapid pace.
In his seminal book 'Innovation and Its Enemies: Why People Resist New Technologies', Harvard professor Calestous Juma argues that society tends to reject new technologies when they substitute for, rather than augment, our humanity.
That explains why not every invention is recognized as a game-changer when it first comes out. Plenty of inventions and technologies throughout history were unappreciated, underestimated, and feared at their debut.
Even as beneficial a product as the knitting machine was received with great skepticism when first introduced, as it was seen as threatening the livelihoods of a certain class of workers.
From ancient times, when 'the divine was everywhere', to the present, where 'AI is omnipotent', one is reminded of Victor Hugo: "Nothing is more powerful than an idea whose time has come."
However, we are now entering the most uncharted territory yet: 3D printing, AI and machine learning, drones, gene editing and more, with the last of these provoking a group of US scientists and activists to call for a global ban on the genetic modification of human embryos, warning that the technology could have an irreversible impact on humanity. These technologies have also generated fears of various jobs becoming redundant.
According to one technology writer, “Artificial intelligence will reach human levels by around 2029. Follow that out further to, say, 2045, and we would have multiplied the intelligence – the human biological intelligence of our civilization – a billion-fold.”
In simple terms, humans, limited by slow biological evolution, won't be able to compete and stand to be superseded.
It is this scary scenario that perhaps explains why we humans often resist transformative technologies: they bring changes, big or small, to our immediate world, a world to which we have worked hard to adjust. Remember, the majority still prefer the familiar. The only weapon we humans have to minimize AI's presence is to 'turn off the system…'
(The author works for Apeejay Education, New Delhi)