Quality - the clanging gauntlet part II

Posted in General on Monday, 21 March 2016

Where we left off in Part 1, we’d posed the question of adapting Lean value-stream mapping concepts into a way of thinking about satisfying the many requirements that fall under the “quality” umbrella. In a sense, that adaptation already exists: consider how recipes, with their bills of material (BOMs) and routes, form something of a quality event roadmap. And this map lays out the many intersections (sorry, couldn’t help that) where work processes aimed at getting product through a “go/no-go gate” naturally encounter sources of useful-for-analysis data.

Those gates can simply be thought of as knowing what should happen at each step of production to ensure the right quality outcomes – what materials should come together, in what quantities, and under what conditions.
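
To make that concrete with a toy example (the material name, tolerance and temperature limit below are invented for illustration, not taken from any real recipe), a gate can be pictured as a simple check of what actually happened at a step against what the recipe says should have happened:

```python
from dataclasses import dataclass

@dataclass
class StepSpec:
    """What the recipe says should happen at one production step."""
    material: str          # material expected at this step
    target_qty: float      # quantity called for by the BOM
    qty_tolerance: float   # allowable +/- deviation
    max_temp_c: float      # example process-condition limit

def gate_check(spec: StepSpec, actual_qty: float, actual_temp_c: float) -> bool:
    """Go/no-go: did the step stay inside the recipe's definition of 'good'?"""
    qty_ok = abs(actual_qty - spec.target_qty) <= spec.qty_tolerance
    temp_ok = actual_temp_c <= spec.max_temp_c
    return qty_ok and temp_ok

# A hypothetical mixing step
mix_step = StepSpec(material="Resin-A", target_qty=50.0, qty_tolerance=0.5, max_temp_c=80.0)
print(gate_check(mix_step, actual_qty=50.2, actual_temp_c=78.5))  # True -> "go"
```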

The sources of useful-for-analysis data are generally instrumentation and asset automation, providing a time-series traffic flow – the streams of data that can be used as indicators of WHY the gate data is what it is.

The intersections have traditionally posed a challenge, because the data behind gate decisions is fundamentally different in quantity, structure and accumulation rate from asset and sensory data. Gates are about momentary snapshots, whereas sensory and automation data form a steady stream of information, of which only a tiny portion is needed to support a go/no-go decision “in the moment”.
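
A quick sketch of that mismatch, with made-up numbers and sampling rates: the gate decision is a single record, while the automation side piles up thousands of timestamped samples, of which only the one nearest the gate’s moment is consulted for the decision itself.

```python
from datetime import datetime, timedelta

# One gate decision: a single snapshot record
gate_event = {
    "batch_id": "B-1001",
    "step": "Mixing",
    "timestamp": datetime(2016, 3, 21, 10, 30),
    "measured_qty": 50.2,
    "decision": "go",
}

# The automation side: a steady stream of samples (here, one per second)
start = datetime(2016, 3, 21, 10, 0)
sensor_stream = [
    {"timestamp": start + timedelta(seconds=i), "temp_c": 75.0 + (i % 60) * 0.05}
    for i in range(3600)  # an hour of data -> thousands of rows per sensor
]

# Only the sample nearest the gate's moment is needed for the go/no-go itself
at_gate = min(sensor_stream, key=lambda s: abs(s["timestamp"] - gate_event["timestamp"]))
print(len(sensor_stream), "samples collected;", "1 used at the gate:", at_gate["temp_c"])
```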

Thanks to new developments, industrial systems have evolved to let us more easily follow the value stream, overlaying the definitions associated with good outcomes (recipes and quality targets) with key events (gate-related data) and with whole streams of asset and sensory data. Modelling frameworks like GE software’s SOA-enabled suite of applications, or its Predix platform, let us “stretch” gate data (like process targets) along accumulations of time-series data, so that quality processes can be driven by patterns within a much more detailed set of data streams rather than by limited snapshots from single measurement systems. Similarly, understanding how reference data (like raw material lot IDs, or even characteristic data from a certificate of analysis, or CofA) can be “stretched” into context alongside event data and automation data gives an entirely different perspective on supplier quality or material performance analysis.
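
As a rough illustration of the “stretching” idea in plain Python (the real modelling frameworks do this declaratively; the window, limits and lot ID here are invented), a process target and a raw-material lot are laid over every sample in a step’s time window, so the quality question becomes one about a pattern rather than a single reading:

```python
from datetime import datetime, timedelta

# Gate-level context: a process target and a supplier lot, each valid over a time window
step_window = (datetime(2016, 3, 21, 10, 0), datetime(2016, 3, 21, 10, 30))
temp_target = {"low_c": 74.0, "high_c": 82.0}   # from the recipe
raw_material_lot = "LOT-7731"                   # from receiving / the CofA

# Time-series side: samples that fall inside that window
samples = [
    {"timestamp": step_window[0] + timedelta(minutes=m), "temp_c": t}
    for m, t in [(2, 75.1), (8, 78.4), (15, 81.7), (22, 83.2), (28, 79.9)]
]

# "Stretch" the target across the whole window instead of checking one snapshot
in_window = [s for s in samples if step_window[0] <= s["timestamp"] <= step_window[1]]
excursions = [s for s in in_window
              if not temp_target["low_c"] <= s["temp_c"] <= temp_target["high_c"]]

# Every sample now carries the lot context, so supplier analysis can use the same data
print(f"lot {raw_material_lot}: {len(excursions)} excursion(s) out of {len(in_window)} samples")
```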

The best thing is that this approach is non-disruptive – it doesn’t demand reworking or replacing systems that are doing what they’re supposed to. From SCADA to LIMS to MES, existing systems carry on performing their functions, delivering all the processes we laid out in Part 1 – but a new modelling environment lets us create a virtual mapping that eliminates the silos from a reporting or integration perspective. And that makes it easier to match data in support of both release decisions and improvement efforts. So the business can much more easily cope with (dare we say thrive in the face of?) product and process innovation coupled with never-ending demands to control cost!
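
Here’s a minimal sketch of that virtual mapping, assuming stub “connectors” in place of real queries (the function names and fields are purely illustrative, not any product’s API): each existing system keeps doing its job, and a thin layer simply assembles their answers around a shared key such as the batch ID.

```python
# Purely illustrative stubs - in practice these would be queries against the
# existing SCADA historian, LIMS and MES, which keep running unchanged.
def scada_history(batch_id):
    return [{"timestamp": "2016-03-21T10:05:00", "temp_c": 75.1}]

def lims_results(batch_id):
    return [{"test": "viscosity", "result": 412, "spec_max": 450}]

def mes_genealogy(batch_id):
    return {"recipe": "Adhesive-R2", "raw_material_lots": ["LOT-7731"]}

def batch_view(batch_id):
    """The 'virtual map': one report-friendly record, with no system replaced."""
    return {
        "batch_id": batch_id,
        "process_data": scada_history(batch_id),
        "lab_results": lims_results(batch_id),
        "genealogy": mes_genealogy(batch_id),
    }

print(batch_view("B-1001"))
```

The point of the sketch is the shape of the design, not the stubs themselves: the mapping layer owns no data, so release reporting and improvement analysis can be added without disturbing the systems of record.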
