The Fourth Industrial Revolution

Since the First Industrial Revolution of the 18th and 19th centuries, humanity has been trying to make life easier for itself by having machines do the hard work. The First Industrial Revolution was a simple idea: make the hardest manual tasks less exhausting. The Second Industrial Revolution was born of growing populations and a greater need for mass production. The third was the digitalisation of the late 20th century, computerising the control of the machines that had already made mass production far more efficient. The politics of these revolutions will always be cause for debate, but it is the social aspect of this fourth revolution that has got me thinking: are we too smart?

Stephen Hawking is just one prominent scientist to have claimed that Industry 4.0 could be the death of mankind. With greater resources being spent on machine learning and artificial intelligence, it is only a matter of time before we are living in a world like Isaac Asimov’s I, Robot. A post-apocalyptic world is a popular setting for books and films, and one of the most popular in recent memory was Disney Pixar’s Wall-E: a cute clean-up robot, tasked with sorting through the waste of an Earth humanity has destroyed, finds life in the form of a plant, which is quickly whisked away to a spaceship where robots perform every task except the bodily functions of the human inhabitants. I am not ashamed to say that I found the animated x-ray of a future human’s skeleton a harrowing experience: bones that were almost completely useless, encased in a body consisting almost entirely of fat, all due to reliance on machines. So I share Hawking’s fears that AI and machine learning could be the end of humanity if we let them, but it really doesn’t have to be that way.

I am 26 years old and work for a data platform provider and consultancy. I remember my first mobile phone: an Ericsson (before they merged with Sony), previously owned by my mother, that barely fit in the pocket of my school blazer. I sent text messages to friends and family, and that was all. Now I own all three of the standard devices: mobile phone, tablet, and laptop. Not only do I work with data, but I generate far more of it than I did when I only had the Ericsson. On each device there are applications constantly sending information to service providers, tracking usage over WiFi and the 4G mobile network. Most of the companies collecting this data analyse it to streamline their processes; some analyse it to give feedback to the consumer.

Fitbit is one of the leading proponents of personal data collection, making it easier for consumers to access data about their health and fitness. No more writing down personal best times for that half marathon you’ve been training for. You can even track where you have been, and for those of you happy enough with your times, you can let everyone else know at the touch of a button.

The issue I am putting forward here is that we are unknowingly altering our cognitive abilities. I use Google Calendar, which I can access from any device at any time, and my entire address book can be accessed in the same way. I no longer have to remember things or carry a diary with me. If I get lost, Google Maps or Apple Maps can help me find where I need to be. I can contact almost anyone, anywhere in the world, at any time, so making plans has become a diabolical situation in which I can postpone or cancel at the last minute, knowing that whomever I am meeting will not be left waiting or worrying. And in this age of information, is there too much of it? Will the old-fashioned pub quiz slowly die out as people forget trivial facts, knowing that a device holds the answer?

As homes become digitalised and ‘smart’, there are likely to be fewer functions we have to control physically. All we will have to do is check a screen and tap a few buttons to control the temperature, humidity, alarm system, curtains, fridge, lights, and eventually even the toilet flush. There have already been studies into how technology has affected us; one of them, iDisorder (Prof. Larry Rosen, 2012), catalogues a list of syndromes related to mobile devices, including phantom vibrations and the need to check social media even when no notification has been received. The IoT and smart technology have many benefits, from making cattle breeding more efficient to GPS tracking, but are they creating as many problems as they fix?

I mentioned above that I am with Professor Hawking in fearing a future shaped by AI, but what I fear most is that we will become the humans in Wall-E, physically incapable of the simplest of tasks, allowing the AI robots to take over without so much as a fight. We can stop this from becoming a reality, but it will take a sustained effort to remove some technology from our lives. You don’t have to quit altogether (work might become difficult for some of us), but rather find other ways to use your time. Maybe take up a new hobby where the statistics don’t matter.

When technology moves as fast as it has done over the last century, we can only speculate that within the next century a human workforce could well be obsolete, and we’ll all be confined to a sofa.

Not a bad option for some.
