Dasudian Industrial Intelligence Summit - Wuhan 2018

This week I gave a short 30-minute introduction to "Industrial Intelligence" at the Dasudian Industrial Intelligence Summit in Wuhan. As the participants came from a broad range of industries and had very different backgrounds, I kept the session on how artificial intelligence can support the industrial sector at a very high level. To make the content available to a broader audience, I am publishing a summary here on my personal blog as well as on https://www.dasudian.com/

Below I have added the slides shown, together with my speaker notes.

Good afternoon, my name is Matthias Hub, I am with Dasudian, and this session will be about how artificial intelligence can support the industrial sector.

To talk about industrial intelligence, I will first establish some common ground in the introduction. After that I will present market research on which use cases are the most promising ones for artificial intelligence. I will then walk through a very simple example case to demonstrate how the different bits and pieces technically work together.


Artificial intelligence is currently at the peak of the hype cycle, the peak of inflated expectations. Here we see the hype cycle for 2017, and the one for 2018 looks very similar. We see these buzzwords in the news almost every day. And with events like AlphaGo, where a machine defeated a human at the game of Go, it has clearly become something real in the last few years.

For more details visit https://www.gartner.com/smarterwithgartner/top-trends-in-the-gartner-hyp...
and https://www.gartner.com/smarterwithgartner/5-trends-emerge-in-gartner-hy...

That brings a few challenges with it. For example, it is not easy to have a discussion based on long-term experience, as AI has not been adopted widely yet. And so misconceptions arise: first, everyone thinks that all the others are already doing something with AI or machine learning, and second, at the same time, everyone thinks that it is very hard to get started and to do something useful with it.

Actually, both are untrue: not many manufacturing companies are really doing anything with AI yet, so don't feel left behind if you are not doing anything either, and it is actually very easy to start with AI if one accepts that very small steps are OK at the beginning. That is also why I will later show an example of such a small but helpful step, to show that this approach makes much more sense than a sophisticated solution requiring a lot of upfront investment.

Within the manufacturing industry a broad variety of sensors and machines are used. An essential first step towards using AI later is that the data from the various sources, such as sensors, smart devices and production machines, is connected to a computer system, or to put it in more general terms: a platform, possibly even using the internet and servers in the cloud. This connection enables the data to flow from the devices into an analytics platform which contains the AI component; AI without input data does not work. One important point to note is that the incoming data is usually in a very raw and thus unusable state and needs to be "refined" (similar to oil): the data needs to be filtered, sampled, or some basic aggregations have to be applied. The processed data from the analytics platform then provides answers to various questions, or gives insights into a sea of unstructured data.
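To make the "refining" step a bit more concrete, here is a minimal sketch in Python. The outlier threshold and the per-minute aggregation window are illustrative assumptions, not part of any specific platform: raw readings are first filtered for implausible values, then averaged per minute.

```python
from statistics import mean

def refine(readings, max_plausible=10000):
    """Filter out implausible raw sensor readings and
    aggregate the rest into per-minute averages."""
    # 1. Filter: drop negative or implausibly large values.
    valid = [(ts, v) for ts, v in readings if 0 <= v <= max_plausible]

    # 2. Aggregate: group readings by minute and average each group.
    buckets = {}
    for ts, v in valid:
        buckets.setdefault(ts // 60, []).append(v)
    return {minute: mean(vals) for minute, vals in buckets.items()}

# Raw readings as (unix_timestamp, watts); one value is sensor noise.
raw = [(0, 480.0), (20, 500.0), (40, -1.0), (65, 520.0)]
print(refine(raw))  # {0: 490.0, 1: 520.0}
```

The same pattern (filter, then aggregate) applies whether the source is a smart meter at home or a production machine on the shop floor.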


What we see here on the left is a big research field of its own. It is usually referred to as the "Internet of Things" or the "Industrial Internet of Things": devices of all sorts get connected with each other, and especially also connected to a platform, to make use of the generated data. And on the right side we see the field where artificial intelligence and machine learning are located: the analytics platform. The interesting point is that both overlap and actually need to be integrated to create real value for the manufacturer and deliver on the high promises of artificial intelligence.

These two areas are a focus of Dasudian, and the overlap of the two research fields is a really interesting area to work on and brings value to customers.
That was the very high-level introduction; now we will look at which use cases this technology can be applied to.

When we look at the industrial sector and want to know how AI can be used in manufacturing, a good approach is to look at some high-impact use cases. I quote here a McKinsey report from last year; we found very similar use cases in our own research.

For example the first one, autonomous vehicles, is a trend that will impact the whole world and thus also industries besides automotive itself. To give one example: to provide solid autonomous driving, object detection and object recognition from video signals need to be available in real time. Rule-based approaches are hard to create and need to be maintained manually, whereas a neural network with some training can do object recognition about as well as humans can, see e.g. https://towardsdatascience.com/google-ais-new-object-detection-competiti...

Another use case is collaborative and context-aware robots. This is closely related to the mainstream opinion of what AI can currently do: a personal assistant like Siri from Apple, Cortana from Microsoft, Google Assistant, Alexa from Amazon, etc. Some robots will also become assistants to human workers: they know what context they are in by using object recognition methods, and they can interact with humans via voice using speech recognition and natural language processing algorithms.

For the next few minutes I want to focus on the use case "yield enhancement in manufacturing", meaning reducing the error rate of manufactured products.

Direct link to McKinsey report: https://www.mckinsey.com/~/media/McKinsey/Industries/Semiconductors/Our%...

One high-impact use case is yield enhancement in manufacturing. Especially when cycle times between the first process step and the final product are long (in semiconductor production, for example, they can be weeks or even months), it is very important to detect yield losses very early in the cycle. There is usually so much data available that an expert or a rule-based system cannot easily detect patterns; in particular, there can be unexpected correlations between data points which an expert would not be able to detect. In this scenario, AI can detect patterns by linking quality data and yield loss in the large amount of available data.

Let's walk through the steps shown in the diagram:

  1. AI enabled root cause analysis
  2. Data across production tools is linked and fuels the AI engine
  3. Prediction of yield detractor locations

Now let's have a closer look at what a simple use case supported by artificial intelligence can look like.

As promised, I want to walk you through a simple case of how AI can be used. I mentioned earlier that there is a wrong assumption about starting with AI and machine learning: that it is "too hard to get started". It is not so hard if we accept making just small steps. In this demonstration I want to walk you through the first simple steps from sensors with raw data to a prediction result. And to make it easy to understand, I am using an environment we can all relate to: a smart home with a smart energy meter installed to measure the current consumption. The question I want an answer to is the following: "How much energy will the home consume overnight?" This is very useful information when you think about electric cars with large batteries, which could possibly power a house through the whole night and thus be used as energy storage.

In the sample smart home we have more than just the smart energy meter: we also need a system to store the data, a system to do the data preprocessing, a system to run the prediction algorithm, and a system for visualization and alerting. Together, that is a small version of an analytics platform.

What to remember when looking at this example: the same steps and principles are also valid in an industrial scenario.

Now it will get a little more technical, so let's start: we have a smart meter here on the left, which makes the current consumption and other historical values available through the Smart Message Language (SML) protocol via infrared. There is a wire from the meter to the integration hardware, a Raspberry Pi. On the Pi a small component is running to process that data and send it every few seconds to a flow engine, in this case via the HTTP protocol. To give an analogy, this is something like a bridge from the sensor world to the analytics-platform world.
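A minimal sketch of what such a bridge component could look like in Python, using only the standard library. The endpoint URL, device ID and payload fields are illustrative assumptions, and the decoding of the SML frames from the meter is omitted:

```python
import json
import time
from urllib import request

FLOW_ENGINE_URL = "http://flow-engine.local:1880/meter"  # assumed endpoint

def build_payload(consumption_watts, device_id="smartmeter-01"):
    """Wrap a decoded meter reading into a JSON-ready message."""
    return {
        "device": device_id,
        "timestamp": int(time.time()),
        "consumption_w": consumption_watts,
    }

def send_reading(consumption_watts):
    """POST one reading to the flow engine via HTTP."""
    data = json.dumps(build_payload(consumption_watts)).encode("utf-8")
    req = request.Request(FLOW_ENGINE_URL, data=data,
                          headers={"Content-Type": "application/json"})
    request.urlopen(req)  # fire-and-forget; real code would handle errors

# In the bridge this would run in a loop, e.g.:
# while True:
#     send_reading(read_consumption_from_sml())  # hypothetical SML decoder
#     time.sleep(5)
```

The important point is not the specific code but the role it plays: translating from a device protocol (here SML over infrared) to a platform protocol (here JSON over HTTP).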

Within the flow engine we pre-process the data: we give it a label, add a timestamp, store it in a database, and so on. Here we see a small JavaScript snippet which parses the data received from the bridge and then uses a time-series NoSQL database to store it. That code is called within the flow, which is a kind of graphical programming interface where you model the processing steps visually.

As we want to do a prediction, we need to select a prediction algorithm. A very straightforward one is linear regression. In Python there are already libraries for many machine learning algorithms, and linear regression is among them. It is a supervised algorithm, which means we first need to train it on part of the data and can then validate the learned model on held-out test data. First we split the available data into training and test data and run the training with the fit function. Optionally, we can check how far off the predictions are by calculating the MSE (mean squared error). In the next steps we can then make predictions on new incoming data which the algorithm hasn't seen yet.
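The steps just described can be sketched like this with scikit-learn. The feature (average evening consumption) and the synthetic history are illustrative assumptions; the code shown on the slide may differ in detail:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

# Synthetic history: one feature (average evening consumption in W)
# and the target (energy consumed overnight in kWh).
rng = np.random.default_rng(42)
evening_w = rng.uniform(200, 800, size=(100, 1))
night_kwh = 0.01 * evening_w[:, 0] + 1.5 + rng.normal(0, 0.2, 100)

# Split into training and test data, then train with fit().
X_train, X_test, y_train, y_test = train_test_split(
    evening_w, night_kwh, test_size=0.2, random_state=0)
model = LinearRegression()
model.fit(X_train, y_train)

# Optionally check how far off the predictions are (MSE).
mse = mean_squared_error(y_test, model.predict(X_test))
print(f"MSE on test data: {mse:.3f}")

# Predict for new, unseen data: tonight's average evening consumption.
tonight = model.predict([[550.0]])
print(f"Predicted overnight consumption: {tonight[0]:.2f} kWh")
```

Linear regression with a single feature is deliberately the smallest possible step; once the data pipeline is in place, the algorithm can be swapped for a more capable one without changing the overall setup.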

We can now apply the algorithm to further incoming data. In our case, the energy consumption prediction should run daily in the evening, just before the night. To see graphically how the prediction with just one feature works, a graph can show the prediction function.

As we have seen, data can be visualized in different ways; here is another visualization of the energy consumption.

That was the short walkthrough of the energy consumption prediction case in a smart home. The same principles and steps, i.e. data retrieval, data preprocessing, feature selection, and training and deployment of machine learning algorithms, are valid for other industrial use cases as well.

I hope I could show you that, with a little help from experts, the first steps towards using artificial intelligence and machine learning are not so hard. Now I come to the conclusion.

What we have learned today is that artificial intelligence and machine learning are currently at the peak of inflated expectations. That means companies are only now actually starting to work with AI and to try things out. We have also seen that it is not so hard to get started with a small project.

Another learning was that in the industrial context the IoT world and AI have quite some overlap, which companies need to be aware of. Both areas need to be addressed when doing a project.

Market research shows that there are high-impact use cases available for the industrial sector, from autonomous vehicles, which will affect a lot of industries, to yield enhancement in manufacturing. That means it is possible to build a business case using AI.

Two closing remarks:

  • Start connecting devices now: this is a prerequisite for AI
  • Start an AI project now: a small pilot as the initial small step

And we think of course that this is best done with Dasudian.
