Think of artificial intelligence (AI) and the advent of powerful thinking machines, and images of Arnold Schwarzenegger in The Terminator come to mind.
While machines can take over many jobs currently done by people, there is still room for both to co-exist. After all, it was people who built the machines in the first place.
But a shift in mindset is needed for this to happen.

“Today, not only industry but our education system is still trapped in a mindset of training us to do what machines can do better than us. We need to shift that and equip ourselves for a future where machines are our helpers, doing the things that we don’t want to do so that gives us more time to enjoy life itself,” said Drew Perez, Managing Director of Adatos, a venture-backed company that builds data intelligence solutions for the private sector.
The Fourth Industrial Revolution
The World Economic Forum has dubbed this the “Fourth Industrial Revolution”. The first was the use of water and steam power to mechanise production in 1784. Next came the use of electric power for mass production, while the Third Industrial Revolution involved using electronics and information technology to automate production.
The Fourth Revolution is about an increasingly digital economy that fuses various technologies to shape the way people live, work, play, and relate to one another.
“We have to look as humans on where that symbiotic relationship is where leveraging machines for what they are very good at, which is repetitive work, high accuracy and high speed, and leveraging us to focus on what we do best as humans, which are the creative skills and imagination,” said Perez.
AI has existed as an idea for a long while, but modern AI began with the invention of the programmable digital computer in the 1940s. After many false starts and shattered dreams, the technology faltered and was put on the back burner in the early 1990s. The main problem was the lack of computing power.
IBM-NVIDIA joint project catalyst
Perez highlighted that it was the IBM-NVIDIA joint project in 2014 that proved to be the turning point for AI. That venture resulted in the NVIDIA DGX-1 and IBM Minsky production servers, the first to be available for commercial use.
He noted that the computing power “allows us to not only do this type of AI building AI but also provide a complete transparency to make us as humans feel comfortable on how this stuff works”.
“Right now, as far as GPUs are concerned, that type of computing is only available through the IBM-NVIDIA relationship. We need that computing power. It is only through the parallel processing power of general purpose GPUs that this is available,” he added.
NVIDIA provides the computing capability through parallelisation. Unique to this is the interconnect architecture, which started off with Mellanox and is now called NVLink, allowing information to move between the GPU and the CPU and back. This is essential to creating a system of systems.
This computing power, coupled with Big Data, has helped Adatos build machines that can themselves build machines, self-code, and self-evolve.
With such capabilities, the role of the data scientist may become obsolete. No human is needed to build AI or even to write code, because the machines can write their own.
“There is a general sense in the AI community to completely democratise this where a non-technical person has the productive capability of a PhD scientist. There is no reason that anybody who can play on a smartphone cannot build these solutions that we have,” said Perez.
“We are not talking about single algorithms. We are talking about analogous to autonomous vehicles. Today, we have autonomous AI. There is no reason why we need humans to intervene,” he added.
While this does sound like the death knell for many jobs, AI will open new doors of opportunity in areas where soft skills, creativity, imagination, and the human touch matter. All it takes is a change of mindset in the days ahead.