Düsseldorf, December 03, 2020

The role of Artificial Intelligence and Machine Learning for the Learning Steel Plant

Data-driven models help to find the optimal operation of a steel plant and improve defined KPIs

The Learning Steel Plant enables machinery to autonomously optimize operations in an ever-changing environment using artificial intelligence and machine learning. By combining data from the automation system with domain know-how and new AI techniques, important production results can be predicted and outcomes optimized, taking different business goals into account.

The central premise of the Learning Steel Plant is to enable machinery to optimize its operations autonomously in an ever-changing manufacturing environment with the help of artificial intelligence and machine learning. The machinery evaluates data from sensors and different systems to understand its condition and respond in the most efficient way possible. However, its response is not triggered by a fixed programmed schedule, fixed automation, or a fixed set of answers. Instead, the machinery continuously observes and learns how to react to different situations. The Learning Steel Plant will program itself.

Machine learning for multi-objective optimization problems

Most machine learning algorithms are designed to minimize a single cost function that represents the success or failure of a business process (e.g. the cost of metallics per ton of tapped liquid steel). However, decision-makers aim to optimize multiple business goals at once (e.g. reduce both the variability and the costs of the process). Translating multiple business goals into a single cost function can be challenging from both a business and an algorithmic perspective. In general, even for trivial multi-objective optimization problems, there is no solution that optimizes all sub-targets at the same time. Instead, decision-makers must weigh the individual sub-targets and decide how to prioritize them. Hence, it is not uncommon in artificial intelligence (AI) projects to spend a significant amount of time accurately translating multiple business objectives into suitable objective functions.
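As a minimal sketch of this translation step, a weighted-sum scalarization can turn two sub-targets into a single cost function. The sub-targets and weights below are purely illustrative, not the actual SMS digital objective function:

```python
def combined_cost(cost_per_ton, variability, w_cost=0.7, w_var=0.3):
    """Scalarize two hypothetical sub-targets (metallics cost per ton,
    process variability) into a single cost function. The weights encode
    how decision-makers prioritize the sub-targets against each other."""
    return w_cost * cost_per_ton + w_var * variability

# A heat costing 100 per ton with variability score 10 under these weights
print(combined_cost(100.0, 10.0))
```

The choice of weights is exactly the prioritization decision described above; changing them changes which solutions the algorithm prefers.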

Aside from profitability, there are other beneficial optimization targets: the Learning Steel Plant might aim for minimal process variability, meaning that processes should vary as little as possible to increase the predictability of operations and to fulfill tight product specifications. Another target might be to maximize the cash flow of operations. In any case, for AI to shine, goals formulated in terms of business KPIs need to be translated into fitting learning objectives before data scientists can develop models that optimize them.

From description to prediction

Once a learning objective is derived from the business KPIs, relevant data are collected and pre-processed. Then, data scientists compare different algorithms to optimize the defined cost function. From a top-level perspective, we can differentiate between four levels of maturity of the developed analytical systems: descriptive, diagnostic, predictive, and prescriptive analytics. For a start, descriptive analytics such as data mining or correlation analyses can give a clear idea of what kind of patterns have occurred in the past.

One step more mature than descriptive analytics is diagnostic analytics, which additionally lets the Learning Steel Plant know why an event happened. Here, the domain knowledge of process experts is incorporated to make sense of the patterns found.

The next step towards a comprehensive Learning Steel Plant involves predictive analytics, which, in contrast to descriptive analytics, looks not only at the past but also into the future. Predictive analytics works out what is most likely to happen. It can detect faulty operation ahead of time but is not yet able to determine by itself how to prevent a critical situation.

The final step in the maturity of the system is prescriptive analytics. It advances predictive analytics to include an understanding of the plant’s own reaction. Prescriptive analytics provides the plant with instructions for future actions to achieve a particular goal, such as preventing a breakout event that would cause the casting line to stop.

Machine learning builds on data-driven models

Machine learning is the study of computer algorithms that improve automatically through experience. It is seen as a subset of both research in artificial intelligence as well as of statistics and computer science. Machine learning algorithms build a mathematical model based on sample data, known as "training data," to make predictions or decisions without being explicitly programmed to do so.

Learning here stands for “picking up patterns from data”. An example of such a pattern: if the temperature in a certain steel treatment step stays above a threshold, e.g. 800 degrees, for some time, the end product could suffer from a higher risk of surface defects. Data scientists working on such algorithms need to work closely with domain experts and customers to evaluate such patterns. In a complex industry such as steelmaking, it is nearly impossible to develop algorithms purely based on data. Close collaboration between data and process experts is necessary to successfully develop, evaluate, and deploy such models.
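The temperature pattern above can be sketched as a simple rule. The 800-degree threshold comes from the text; the maximum duration is an invented illustrative value, not a real process limit:

```python
def time_above_threshold(temps, threshold=800.0, dt=1.0):
    """Total time (assuming one sample every dt seconds) the
    temperature trace spends above the threshold."""
    return sum(dt for t in temps if t > threshold)

def elevated_defect_risk(temps, threshold=800.0, max_seconds=30.0):
    """Flag a heat whose trace exceeds the threshold for too long
    (max_seconds is hypothetical, chosen here for illustration)."""
    return time_above_threshold(temps, threshold) > max_seconds

trace = [790.0] * 10 + [815.0] * 40 + [795.0] * 10  # 40 s above 800
print(elevated_defect_risk(trace))  # True: 40 s exceeds the 30 s limit
```

A machine learning model would not use hand-picked values like these; it would learn the threshold and duration that best separate defective from defect-free heats, which is where the collaboration between data and process experts comes in.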

A machine learning algorithm utilizes information from different layers of the automation pyramid, and the learning itself generally takes place in the upper layers. Frequently, data from levels 0 to 3 are combined to construct algorithms. Different levels, systems, and data sources are integrated into the Learning Steel Plant to allow the fast development of various machine learning applications.

Once meaningful patterns are found in the data, they are translated into model weights: the machine learning model picks up the patterns and converts them into mathematical equations. Later, during operation, the algorithm looks for the patterns it learned from historical data in live data to trigger alerts or to control and adapt a process.
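How patterns become model weights can be illustrated with a least-squares fit on invented data: after fitting, the relation between a temperature-excess feature and a defect rate lives entirely in the learned coefficients.

```python
import numpy as np

# Hypothetical training data: seconds a heat spent above the temperature
# threshold, and the surface-defect rate observed for the product
seconds_above = np.array([0.0, 10.0, 20.0, 30.0])
defect_rate = np.array([0.01, 0.03, 0.05, 0.07])

# Fit defect_rate ~ w * seconds_above + b; the learned pattern is (w, b)
A = np.column_stack([seconds_above, np.ones_like(seconds_above)])
(w, b), *_ = np.linalg.lstsq(A, defect_rate, rcond=None)
print(f"weight: {w:.4f}, bias: {b:.4f}")

# During operation, the stored weights turn a live feature into a prediction
predicted = w * 25.0 + b
```

Real models in the plant are far richer than this two-parameter line, but the principle is the same: the equations, not a fixed program, carry what was learned.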

In a traditional plant, the operation is automated. The plant reacts in a defined way based on rules and fixed algorithms. In a Learning Steel Plant, the plant can reprogram itself to respond in the best way. It reacts dynamically to its condition based on past experience. Machine learning will be the key enabler of this shift in responsibility.

Integrative approach for Artificial Intelligence

SMS digital aims to optimize learning objectives that are formulated by domain experts. Together with the customer, data experts translate business goals into learning objectives. Such learning objectives will be synchronized across multiple process stages to enable a holistic optimization of the Learning Steel Plant. In the end, that holistic view will be implemented by a mixture of physical models with traditional optimization algorithms as well as new data-driven techniques.

Solutions at SMS digital are designed with AI in mind, meaning that data are cleanly tracked and integrated between systems. Having these systems available on a single platform allows the fast development of AI apps that combine data from different sources. SMS digital is anticipating the needs of future AI applications in the design of new machinery and plants.

The reliability of algorithms is a core quality aspect. Consequently, understandable white-box algorithms are preferred over black boxes, which are not maintenance-friendly. Robustness is more important in the development of algorithms than pure performance. When applying algorithms, scientific research principles are followed: the results of the analyses should be reproducible and verified by experiments.

Application Example: The Metallics Optimizer

Raw materials represent one of the biggest cost factors in the production of crude steel. Implementing the right strategy for allocating materials opens up vast potential for cost savings in production. In electric steelmaking, producers are facing a particular challenge: operators need to maximize the amount of low-priced scrap in a melt while at the same time ensuring that steel quality meets the requisite production goals.

To make things more challenging, the chemical composition of the input materials is not precisely known. This introduces process variability and causes unnecessarily large amounts of expensive raw materials to be used, because low-cost scrap with unwanted tramp elements puts product quality at risk. Only an analysis carried out after the feedstock has been melted can show how high the proportion of these tramp elements in the scrap actually is.

To deal with this process variability, the Metallics Optimizer uses machine learning techniques to predict the chemical concentrations of different elements in the available commodities. These concentrations vary over time as different layers of the scrap piles on the scrapyard are consumed. The prediction of the chemical properties gives operators a better idea of how to use charge materials. The figure shows that the estimated copper content of commodity 3 fluctuated between 0.05 and 0.20 percentage points between September 2019 and July 2020. Depending on how much copper is targeted for the steel grades cast during that time span, the share of commodity 3 in the charge mixes has to be adjusted.

On top of this machine-learning-based commodity characterization, the Metallics Optimizer employs physical (mass and energy balance) models to predict the chemical properties of different charges in future sequences. Those proven mass and energy balance equations further decrease the process variability because operators can reliably forecast the chemical properties of heats in future sequences.
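The mass-balance part of such a prediction can be sketched as a mass-weighted mixing rule. The commodity masses and copper contents below are invented for illustration and are not taken from the figure:

```python
def charge_concentration(masses_t, concentrations_pct):
    """Predict the element concentration of a charge as the mass-weighted
    average of the commodity concentrations. A bare-bones mass balance;
    the real models also account for yields and the energy balance."""
    total = sum(masses_t)
    return sum(m * c for m, c in zip(masses_t, concentrations_pct)) / total

# 60 t of a low-Cu commodity (0.05 %) blended with 40 t of a
# high-Cu commodity (0.20 %)
cu = charge_concentration([60.0, 40.0], [0.05, 0.20])
print(f"predicted Cu content of the charge: {cu:.3f} %")
```

Because the commodity concentrations themselves come from the data-driven characterization, the quality of this forecast depends directly on the quality of those predictions.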

Many factors influence the cost of producing a heat, such as scrap costs, electric energy, wear on electrodes, or tap-to-tap time, to name just a few. Based on its commodity characterization and its physical models, the Metallics Optimizer also employs an optimization routine that considers many of these factors. With it, the solution picks the cheapest charge mixes that fulfill product specifications over a whole future sequence. This optimization allows production at the lowest cost without sacrificing quality.
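A toy version of this optimization step is a brute-force search over two-commodity blends under a copper limit. All prices, concentrations, and the spec are invented; the actual solution optimizes many more factors over whole sequences:

```python
def cheapest_mix(prices, cu_pct, cu_max_pct, steps=100):
    """Return (cost, fraction of commodity 0) of the cheapest blend whose
    copper content stays within spec. A hypothetical stand-in for the real
    optimizer, which also weighs energy, electrode wear, tap-to-tap time."""
    best = None
    for i in range(steps + 1):
        f = i / steps  # fraction of commodity 0 in the charge
        mix_cu = f * cu_pct[0] + (1 - f) * cu_pct[1]
        if mix_cu <= cu_max_pct:
            cost = f * prices[0] + (1 - f) * prices[1]
            if best is None or cost < best[0]:
                best = (cost, f)
    return best

# Cheap but copper-rich scrap vs. expensive clean scrap; Cu spec of 0.12 %
cost, fraction = cheapest_mix(prices=[200.0, 300.0],
                              cu_pct=[0.20, 0.05],
                              cu_max_pct=0.12)
print(f"cheapest feasible mix: {fraction:.0%} cheap scrap at {cost:.0f}/t")
```

Even this toy shows the core trade-off the text describes: the optimizer maximizes the share of low-priced scrap right up to the point where the quality spec would be violated.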

Prediction of chemical concentration of different elements by the Metallics Optimizer.
Prediction of chemical properties of different charges in future sequences based on data-driven predictions about chemical concentrations in available commodities.

SMS digital’s Metallics Optimizer combines data-driven models to predict the amount of undesired tramp elements in the scrap before it is melted. The software uses this prediction to calculate the lowest-cost composition of the melt's feedstock by means of optimization algorithms combined with theory-based models that simulate the melting process. Here, the Metallics Optimizer takes into account the feedstock's costs and all costs related to the production of the melt, such as wear and tear of electrodes, usage of alloys, or energy consumption. The Metallics Optimizer is a prime example of a predictive solution that combines data-driven and theory-based models with the vast expert knowledge of the SMS group. SMS digital sees enormous potential in this approach and continues to combine innovative AI techniques with proven theory-based models while utilizing the widespread expert knowledge of the SMS group.