Talking to the future

It’s been 16 years since Minority Report (2002), starring Tom Cruise, gave us a glimpse into an intriguing, futuristic world of technology. At the time, the film’s director, Steven Spielberg, consulted numerous scientists to develop a future world more plausible than those other science fiction movies had portrayed. In fact, so visionary were these consultants that some of the technology designs in the film have proven prescient.

So, like Steven, we are using trends from recent years to make some predictions of our own. At CERAWeek, and with some trepidation, we are setting the scene of our control room in the year 2025. Before describing this futuristic environment in detail, let’s consider the trends on which it is based.

In the past 18 months alone, terms like machine learning (ML), artificial intelligence (AI), augmented reality (AR) and digital twins (DT) have become mainstream topics, blurring the lines between fact and fiction.

However, it is not in the industrial context but in the consumer world that we are encountering these disruptive technologies. Today many of us are experiencing the joys of communicating with voice-activated, interactive tools like Amazon’s Alexa, Microsoft’s Cortana or the Google Assistant. We are engaging with chatbots on platforms like Facebook Messenger and finding, through our smart devices, a world far beyond simply making telephone calls. The arrival of these intelligent personal assistants is transforming the way we do things at home, from smart lighting and entertainment to ordering the weekly groceries or a taxi.

Similarly, space travel is now being controlled, not by governments, but by private enterprises (Jeff Bezos’s Blue Origin, Elon Musk’s SpaceX and Richard Branson’s Virgin Galactic). It is the consumer space that is inspiring us and driving developments in our own industrial world. One of the pillars of digitalization at ABB is to adapt these consumer technologies and apply them across the process industries.

Two standout fields of computer science leading the way are natural language processing (NLP) and machine learning (ML). NLP is the art of speaking naturally to a plant and having it interpret the operator’s intent, without all the complex queries a computer system would require today. ML gives computer systems the ability to “learn” by progressively improving performance and understanding on specific tasks, without being explicitly programmed. Put another way, ML is a rapid way of finding the things you are looking for in data. It provides a tool for gathering unimaginable volumes of data and extracting insights or interpreting repeatable scenarios needed to solve problems. The good news for humans is that people’s creativity in taking the data and solving the problems will be in demand for some time to come.
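As a toy illustration of the “learning without being explicitly programmed” idea (all names and readings below are hypothetical, invented only to make the point), a system might infer its own alarm limit from annotated historical data instead of having an engineer hard-code one:

```python
# Toy illustration of ML "learning" a rule from data instead of being
# explicitly programmed. All names and readings are hypothetical.

def learn_threshold(readings, labels):
    """Infer the temperature threshold that best separates 'normal'
    from 'alarm' readings in annotated historical data."""
    best_threshold, best_correct = None, -1
    for candidate in sorted(set(readings)):
        correct = sum(
            (reading > candidate) == (label == "alarm")
            for reading, label in zip(readings, labels)
        )
        if correct > best_correct:
            best_threshold, best_correct = candidate, correct
    return best_threshold

# Historical sensor readings with operator annotations (hypothetical):
readings = [61.0, 63.5, 70.2, 88.9, 92.4, 95.1]
labels = ["normal", "normal", "normal", "alarm", "alarm", "alarm"]

print(learn_threshold(readings, labels))  # 70.2
```

The rule itself (here, a simple threshold) comes from the data; with more data the inferred limit improves, which is the essence of “progressively improving performance” on a task.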

Together, NLP and ML form a formidable force for eradicating the inefficiencies that are rife in the process industries today. Take something as simple as a keyboard and mouse. These two stalwarts of our lives are arguably the last barriers to our ability to run plants efficiently. They are, quite simply, not human enough. They provide an unnatural interface to the processes we are trying to operate. What if the human could simply talk directly to the plant? What if operator notes could be instantly interpreted by the system? What if nuances of language were not a hindrance? Can we really have hands-free collaboration with operational technology in a plant?

Our demo at CERAWeek paints just this scenario. It highlights the use of an AI companion – or intelligent knowledge assistant – to tackle everyday inefficiencies faced by a typical plant. In today’s world, plant operations are handled reactively, and success hinges on the experience of the human on duty at the time. The many scattered applications only add to the complexity of finding relevant data.

However, an AI companion proactively highlights problems. More importantly, it assesses all relevant historical data and makes recommendations using ML-based decision support, in a way that goes well beyond the more constrained, experience-based decision-making of humans. Its algorithms run constantly to provide context to the data. Then, using the operator’s voice description of the anomaly, the system sifts through masses of data to find a pattern that fits the challenge. The really clever part is that it understands industrial language and speaks fluently to the human operator. Our AI companion supports three humans – the control operator, the field operator and the operations officer – with a structured interaction that provides a totally integrated, and productive, user experience.
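The interaction pattern described above can be sketched in miniature. In this hypothetical example, an operator’s (already transcribed) spoken request is matched against annotated historical events, and the closest precedent is returned as decision support. The events and the crude word-overlap matcher are invented for illustration; a real system would rely on far richer NLP and ML models.

```python
# Hypothetical sketch: match an operator's transcribed query against
# annotated historical events and return the closest precedent's
# recommendation. All events and recommendations are invented.

HISTORICAL_EVENTS = {
    "pressure spike in separator after valve closure":
        "Recommended action: re-open bypass valve gradually.",
    "pump vibration rising during startup":
        "Recommended action: check bearing lubrication before ramp-up.",
    "temperature drift on reactor feed line":
        "Recommended action: recalibrate feed-line sensor.",
}

def suggest(operator_query: str) -> str:
    """Return the recommendation attached to the historical event that
    shares the most words with the operator's query."""
    query_words = set(operator_query.lower().split())
    best_event = max(
        HISTORICAL_EVENTS,
        key=lambda event: len(query_words & set(event.split())),
    )
    return HISTORICAL_EVENTS[best_event]

print(suggest("I'm seeing vibration on the pump while starting up"))
# -> Recommended action: check bearing lubrication before ramp-up.
```

Even this toy version shows the division of labor the demo relies on: the human supplies a natural-language observation, and the system supplies the search over historical data.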

Today we have no shortage of data, and no shortage of computing power to manipulate that data through algorithms and in-depth analytics. There is now a realization, however, that the future role of the human needs to change. People have a lot to offer, but the way they interface with the plant must evolve. The dialogue between human and machine needs to be fluent, free-flowing and effective.

As Spielberg contemplated during the making of Minority Report: “I wanted all the toys to come true someday. I want there to be a transportation system that doesn't emit toxins into the atmosphere. And the newspaper that updates itself... In the future, television will be watching us, and customizing itself to what it knows about us. The thrilling thing is, that will make us feel we're part of the medium.”

Well, Steven, the future may be closer than we all think.