THESIS
2015
xvi, 187 pages : illustrations ; 30 cm
Abstract
Consistency and economies of scale are among the most pivotal features of processes.
They enable operations that run efficiently day in and day out, with predictable outcomes
and uniform quality. Paradoxically, in today's service economy, high-grade process delivery
requires accounting for variation in customers, environments, resources, scenarios,
and other aspects. It is therefore often inappropriate to directly apply traditional process
control and process management tools used, for example, in the mass production of goods,
where standardization prevails and variation is minimized. To still address key
management issues such as quality, cost, and schedules in service processes and in the
production of customized products alike, a proper characterization of variation in processes
is imperative.
To date, however, no attempt has been made to systematically explain and statistically
assess the characteristics of dynamic process implementations in a holistic manner, i.e.,
from the initial need for variation to its final, actual implementation in the process.
This research therefore formalizes the concept of process context as the integral source of
process variation, under the hypothesis of a causal relationship with the process.
At the same time, it is theorized that rising variation in this process context does not
necessarily cause an increase in process variation, which creates the opportunity for various
process realization strategies. To model process context variation and its translation into
dynamic process behavior, approaches from two independent analogies discovered in
theoretical physics and information theory are adopted. A practical implementation
is proposed and applied to two relevant real-world cases with extensive data sets; with the
dawning digital age, such fine-granular process data is now abundant.
As a result, a methodology has been developed that, for the first time, allows variation in
processes to be holistically explained, modeled, and measured. It introduces the fundamental
concept of entropy to the process field and establishes a theoretical understanding of the
response mechanisms underlying the wide range of hitherto detached process realization
strategies. As an analytical framework, it facilitates a data-driven characterization of
processes while addressing the need for an estimation approach and the impact of two
trade-offs involved. Both case studies demonstrate the fitness of the modeling and
corroborate the underlying hypotheses. The novel concept of responsive processes thus
emerges as a central contribution. It is defined on the basis of newly derived fundamental
entropy measures. By reinterpreting variation in processes as a necessary or even desirable
feature, the accomplished management of a responsive process becomes a remarkable
competitive edge. This relaxation of a persistent focus on standardization has further been
found to pave the way toward a novel consistency paradigm. Building on this consistency
requirement, a hypothesized economic limit for process entropy is proven.
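The entropy-based measurement of process variation mentioned above can be illustrated with a minimal sketch. This is not the thesis's own formulation; it assumes Shannon entropy computed over the frequencies of process variants (distinct activity orderings) in an event log, and the function name and example log are illustrative only:

```python
import math
from collections import Counter

def process_entropy(traces):
    """Shannon entropy (in bits) of the process-variant distribution.

    Each trace is a sequence of activity labels; traces with the same
    activity order count as the same variant. Zero entropy means every
    execution follows one standardized path; higher values indicate
    more varied (more "responsive") process behavior.
    """
    variants = Counter(tuple(t) for t in traces)
    total = sum(variants.values())
    return -sum((n / total) * math.log2(n / total)
                for n in variants.values())

# Illustrative event log: four executions of a simple service process
log = [
    ["receive", "check", "approve"],
    ["receive", "check", "approve"],
    ["receive", "check", "reject"],
    ["receive", "escalate", "check", "approve"],
]
print(process_entropy(log))  # 1.5 bits (variant probabilities 1/2, 1/4, 1/4)
```

A fully standardized process (every trace identical) yields 0 bits, so such a measure directly quantifies how far a process departs from the standardization paradigm the abstract contrasts with responsive processes.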