Articles
March 11, 2024 · 11 minute read

Achieving production consistency: tools and tactics for engineers


Achieving repeatability and consistency is a priority for today’s engineering teams. Fluctuating prices, unpredictable supply and demand, and stringent customer requirements mean every run counts.

However, any manufacturing engineer knows that variability is inevitable where complexity reigns. If optimized repeatability was easy to accomplish, scrap and defect rates wouldn’t be persistent issues for manufacturers. Replicating successful results takes time, expertise, and the right data for teams to execute. Applying the right practices to an entire line—or multiple lines—is a massive undertaking with countless variables at play.

Luckily, there are tools available to help engineers repeatedly produce accurate products with greater ease, even amid changing conditions. These tools help teams understand and execute the ideal production parameters necessary for equipment, materials, people, processes, and other aspects of production to flow together simultaneously.

In this article, we’ll look at some key tools and tactics engineers can use to achieve and maintain production consistency.

Streamline efficiency with improved data visibility

When teams achieve consistent results time and time again, they know they are implementing the right production parameters for a given product and scenario. Understanding—and working within—these optimal conditions enables teams to measure against process stability goals.

A key element of repeatable success is visibility into what’s happening on the shop floor and what has worked in the past. That means access to both historical and live data.

Historical data shows which settings produced your best runs. By analyzing previous runs, you can identify the conditions that were in place and determine what it takes to replicate them.
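
For illustration, here is a minimal sketch of that kind of historical analysis in Python with pandas. The file name, column names, and grade are invented stand-ins for whatever your historian or MES exports; Braincube performs this kind of analysis natively within its platform.

```python
import pandas as pd

# Hypothetical historian export: one row per run, with the settings
# used and the outcome achieved. All column names are invented.
runs = pd.read_csv("historical_runs.csv")

SETTING_COLS = ["oven_temp_c", "line_speed_mpm", "moisture_setpoint_pct"]

# Keep only in-spec runs of the grade we're about to produce.
candidates = runs[(runs["grade"] == "A-240") & (runs["in_spec"])]

# Treat the highest first-pass yield as "best" for this sketch.
best_run = candidates.loc[candidates["yield_pct"].idxmax()]

# These are the settings to try to replicate on the next run.
print(best_run[SETTING_COLS])
```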


You also need an ongoing stream of live data, which enables teams to adjust and adapt accordingly. Without real-time visibility, you can’t know if the settings you’re running will generate optimal results. 

It’s important to remember that teams can’t simply run the same parameters every time and expect the same results. Manufacturers need to continuously adjust for set-point drift, grade changes, asset performance, and other variables. This is where advanced tools can make a significant difference in performance and outcome. 
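
To make set-point drift concrete, a simple watchdog could compare live readings against the validated baseline and flag any variable that has wandered outside tolerance. This is an illustrative sketch with invented variable names and tolerances, not Braincube’s implementation:

```python
# Validated "golden" settings for the current grade (values invented).
GOLDEN = {"oven_temp_c": 182.0, "line_speed_mpm": 310.0, "moisture_setpoint_pct": 6.5}

# Allowed drift per variable before someone should intervene.
TOLERANCE = {"oven_temp_c": 2.0, "line_speed_mpm": 5.0, "moisture_setpoint_pct": 0.3}

def check_drift(live_readings: dict) -> list:
    """Return the variables that have drifted outside tolerance."""
    return [var for var, target in GOLDEN.items()
            if abs(live_readings[var] - target) > TOLERANCE[var]]

# Example: line speed has crept up run over run without anyone noticing.
print(check_drift({"oven_temp_c": 182.4,
                   "line_speed_mpm": 319.0,
                   "moisture_setpoint_pct": 6.6}))  # -> ['line_speed_mpm']
```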

Use case: reduced set-point drift in real time

International Paper, a long-time Braincube customer, wanted to identify how to replicate their best runs, even as shop floor conditions changed throughout the day. 

“If you just go back to the last settings you ran, after about six or seven runs, you realize you’ve drifted off into some poor setting,” said Andrew Jones, a Senior Engineering Fellow at International Paper. “No one really notices it happening, but it happens. With Braincube, teams can evaluate what happened and ask themselves, ‘How do we run this next grade best?’” 

Braincube’s tools make it easier for International Paper’s engineering teams to look at prior runs and identify which parameters resulted in the best runs. For example, teams or individuals can select the grade and speed range they’d like to view and pull optimal run settings for any product. These insights can be used to set standardized operating parameters for a wide range of production scenarios.

The optimal rules and settings for various production scenarios are validated in advance by engineering, based on the best-performing settings from historical runs. Operations teams can then access these pre-determined settings, cross-reference them with the most current production data, and display the validated settings for the next grade run within Braincube’s Studio App.
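
Conceptually, the retrieval step is a keyed lookup: given a grade and a speed band, return the engineering-validated settings. A toy version, with invented grades and values:

```python
# Engineering-validated settings keyed by (grade, speed band).
# Grades, bands, and values are invented for illustration.
VALIDATED_SETTINGS = {
    ("A-240", "high"): {"oven_temp_c": 182.0, "moisture_setpoint_pct": 6.5},
    ("A-240", "low"):  {"oven_temp_c": 179.0, "moisture_setpoint_pct": 6.8},
    ("B-110", "high"): {"oven_temp_c": 191.0, "moisture_setpoint_pct": 5.9},
}

def settings_for(grade: str, speed_band: str) -> dict:
    """Pull the pre-validated run settings for the next grade change."""
    return VALIDATED_SETTINGS[(grade, speed_band)]

print(settings_for("A-240", "high"))
```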

[Image: Braincube’s Studio App]

Depending on current conditions and the product being produced, International Paper’s engineers can use these insights to guide operators on how to set up subsequent runs for optimal outcomes. From there, Braincube’s Studio App automatically populates with the running conditions of key input variables from the most recent golden batch. 

By pulling the most recent optimal settings, teams can better account for process drift. They are not relying on what worked in the distant past; instead, they are implementing the most up-to-date settings to drive the best possible outcome.

Evaluate historical performance to achieve process stability

Understanding historical performance is a crucial step in uncovering the ideal settings for whichever product you’re producing. Engineering teams typically have to comb through and analyze large amounts of historical data to make these discoveries. Engineers must evaluate different runs to determine which were successful and which were unsuccessful. 

There are also likely different outcomes to consider when evaluating what success looks like. It’s not just about finding runs that produce the highest-quality products; it’s more valuable to identify the runs that meet quality standards while using minimal resources. In other words, production runs that generate the most consistent outcomes under the most stabilized conditions are runs you’ll likely want to repeat since these runs keep costs low and output high. 

Finding historical runs that meet the right conditions for repeatable success often requires advanced AI that can detect the patterns behind positive results. Tools such as Braincube’s CrossRank or Machine Learning can identify your optimal runs, continuously adapt to changing conditions, and integrate with other Braincube products to send changes directly to the shop floor—all within the same platform.
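
As a rough illustration of ranking runs on “meets spec at minimal resource use” (not a description of how CrossRank works internally), you could filter to in-spec runs and score them on a composite of stability and resource cost. The file and column names are hypothetical:

```python
import pandas as pd

runs = pd.read_csv("historical_runs.csv")

# Only consider runs that met the quality spec.
ok = runs[runs["in_spec"]].copy()

def to_score(series):
    """Scale so lower raw values (less energy, less variation) score higher."""
    return 1 - (series - series.min()) / (series.max() - series.min())

ok["energy_score"] = to_score(ok["energy_kwh"])
ok["stability_score"] = to_score(ok["temp_stddev"])

# Equal weights here; in practice the weighting is a strategic choice.
ok["score"] = 0.5 * ok["energy_score"] + 0.5 * ok["stability_score"]

# The top-ranked runs are the ones worth trying to repeat.
print(ok.nlargest(5, "score")[["run_id", "energy_kwh", "temp_stddev", "score"]])
```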

[Image: Braincube’s Advanced Analysis Application]

Sometimes what you discover is different from what you might expect. For example, one of Braincube’s paper manufacturing customers saved the cost of purchasing new equipment by making adjustments identified by Braincube’s AI. These modifications enabled them to maintain product quality while using fewer resources. 

AI is often helpful in identifying patterns, but it rarely prescribes the operating settings needed to repeat your success. Manufacturers can’t afford to take risks—if something is changed upstream, teams must understand the implications for final quality, machine wear and tear, and other downstream elements.

For example, wanting to reduce energy consumption doesn’t mean you can isolate that KPI from the rest of production. You must understand how to reduce energy without sacrificing other factors like quality, speed, or throughput. If manufacturers home in on only a handful of KPIs, overall performance may drift.
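
In other words, energy becomes a constrained objective rather than a standalone target. A simplified sketch, again with invented columns and thresholds:

```python
import pandas as pd

runs = pd.read_csv("historical_runs.csv")

# Treat energy as a constrained objective: minimize it only over runs
# that still satisfy quality and throughput requirements.
feasible = runs[(runs["quality_index"] >= 0.98) &
                (runs["throughput_tph"] >= 42.0)]

best = feasible.loc[feasible["energy_kwh"].idxmin()]
print(best[["run_id", "energy_kwh", "quality_index", "throughput_tph"]])
```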

It’s important to approach process stabilization with a strategy and goal in mind. For example, are you willing to use more energy for quality improvements? Are processes agile enough that, as energy sources fluctuate, you have fall-back plans that enable you to pivot? Take the time to evaluate what you want to achieve and what you may be willing to sacrifice for improved process stability. 

Use case: asset stability results in $1.2 million in savings

A global food and beverage manufacturer wanted to reduce gas consumption without sacrificing the quality its products are known for. With energy costs continuously rising, the company wanted to source more sustainable options, such as steam generation via biomass boilers.

Specifically, the company wanted to make its biomass boiler a focus of the new energy-reduction strategy. This asset consistently scored below the company’s machine average for OEE. With more than 1,300 variables in the biomass boiler phase alone, there was a lot of data to dissect to stabilize the boiler’s efficiency.

The company’s engineering teams used Braincube’s AI tools to determine that they could lower the product’s water content to regulate CO2 levels. The tools provided optimized settings for stabilizing boiler usage and output. By implementing these new settings, the company reduced its gas consumption by 21,600 tons and cut its annual CO2 emissions by 4,200 tons.

Once they stabilized production, engineers could compare different active rules, view their average application rate or results, and uncover additional optimizations. Combined, these changes resulted in over $1.2 million in savings. 

Because Braincube is designed to close the production loop, engineering teams can send updated settings directly to frontline teams with just a few clicks. This continuously connected process is why many customers stay with Braincube.

Improve results with improved data quality

Once you stabilize production, the next logical step is to further refine the optimal conditions for repeatable success. A Digital Twin can help here: it provides a continuous data model with an as-built snapshot of everything you produce.

Data may be an asset for many manufacturing companies, but simply having data isn’t enough. Say, for example, you know the average boiler temperature on one of your lines. That’s nice information, but what can you do with an average value? This metric doesn’t tell you what to change. It simply tells you what the temperature was, on average, over a set period.

Instead, you need to identify the optimal set points for each product line to make the best use of your data. Doing so requires the right data in the first place, which can be a challenge for some companies.
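
A quick illustration of why averages fall short: compare a line-wide average against per-grade targets derived from each grade’s best runs. The data and column names here are hypothetical:

```python
import pandas as pd

runs = pd.read_csv("historical_runs.csv")

# A single line-wide average hides the fact that each grade has its own optimum.
print("Line-wide average boiler temp:", runs["boiler_temp_c"].mean())

# More actionable: the boiler temperature of the best-yield run for each grade.
best_per_grade = runs.loc[runs.groupby("grade")["yield_pct"].idxmax()]
print(best_per_grade[["grade", "boiler_temp_c", "yield_pct"]])
```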

Braincube-powered Digital Twins equip teams with a dynamic digital replica of how a product was made. In turn, you can use the digital twin database in third-party systems, Braincube applications, or other custom-built models. 


With data aggregation, cleansing, and distribution handled by a Digital Twin, teams can focus on making the right changes. For example, data from a Digital Twin can help engineers identify how to manufacture a product at high yield and low cost.

When you run an Advanced Analysis within the Braincube platform, you define the outcomes you want to achieve, and the algorithm prescribes how to achieve them. For example, maybe you want to prioritize energy savings above other needs, so you develop the right parameters for alternative energy sources. Or perhaps you want to use less costly material and add the highest volume of filler you can before impacting quality. With Digital Twin data and AI, this work becomes drastically easier.
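
One way to picture that outcome-first workflow is as a declarative specification: state the objective and the constraints, and let the analysis search the settings space. The structure below is a hypothetical illustration, not Braincube’s actual API:

```python
# A hypothetical way to express "define the outcome first":
# the objective, the constraints, and the settings the analysis may vary.
analysis_spec = {
    "objective": {"minimize": "energy_kwh"},
    "constraints": [
        {"variable": "quality_index", "min": 0.98},
        {"variable": "filler_pct", "max": 12.0},  # cheapest acceptable recipe
    ],
    "search_over": ["oven_temp_c", "line_speed_mpm", "filler_pct"],
}
```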

Use case: using AI to fact-check custom models

Global paper products manufacturer Kimberly-Clark uses Python to build predictive models. These models help them to understand how to optimize basis weight and other standards to generate their golden batches—the runs that produce the optimal products. However, it’s important to remember that models are only as good as their data. 

To ensure the right data would be used, the team leveraged Braincube’s Advanced Analysis app to validate the model. Braincube’s AI flagged surprising data in the original model: some of the data points used to build it were incorrect. Wrong tags had mischaracterized the data, showing inconsistencies that did not exist.
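
A simplified version of that kind of sanity check compares each tag’s observed values against its expected engineering range and flags mismatches. This sketch is illustrative, with invented tag names and ranges; it is not the Advanced Analysis app itself:

```python
import pandas as pd

data = pd.read_csv("machine_tags.csv")

# Expected engineering range per tag (tag names and ranges invented).
EXPECTED_RANGES = {
    "basis_weight_gsm": (40.0, 90.0),
    "reel_speed_mpm": (800.0, 1400.0),
    "headbox_pressure_kpa": (15.0, 45.0),
}

# Flag tags whose observed values fall outside their plausible range --
# a hint that a tag is mislabeled or mapped to the wrong sensor.
for tag, (lo, hi) in EXPECTED_RANGES.items():
    outside = ((data[tag] < lo) | (data[tag] > hi)).mean()
    if outside > 0.05:  # more than 5% implausible readings
        print(f"{tag}: {outside:.0%} of readings outside [{lo}, {hi}] -- check mapping")
```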

“What would have taken hours of work going through 4,000 different data tags that we use on our machine, we’re able to do in just a few clicks,” said Tyler Shirley, Asset Development, Electrical Engineer at Kimberly-Clark.

The right data and analysis tools can save companies time and resources while reducing overall production costs. AI often uncovers unexpected optimization opportunities along the way.

Conclusion

While it’s true that you need a large amount of data to uncover and set optimal operating conditions, tools like Braincube are here to help. From data visualization and storytelling to robust self-service analytics, our crawl, walk, run approach to digital transformation will help you not only reach your goals but exceed them.

Tools like Braincube give teams the resources to think creatively and critically about what happened during production and why. With the right tools, data, and strategy, you can achieve consistent results time and time again.

See how Braincube’s IIoT Platform can help you improve traceability and find new success.
