Whether a collection of data is useful to a business is largely a matter of perspective. Data in its raw form is like a tangled set of wires: before it can be useful, it needs to be untangled.
To make the most of Big Data, the data must also be rationalized in the context of the business’s processes: where the data is used, by whom, and how. This is what process modeling aims to achieve. Without process modeling, businesses will find it difficult to quantify and prioritize data from a business perspective – making a truly business outcome-focused approach harder to realize.
“Process modeling is the documentation of an organization’s processes designed to enhance company performance,” said Martin Owen, erwin’s VP of Product Management.
It does this by enabling a business to understand what it does, and how it does it.
As is common for disciplines of this nature, multiple industry standards govern how this documentation is handled.
The most common is the Business Process Model and Notation (BPMN) standard. With BPMN, businesses can analyze their processes from different perspectives, such as a human capital perspective, shining a light on the roles and competencies required for a process to be performed.
Historically, industry analysts have viewed data and process modeling as two competing approaches. It’s time that notion was cast aside, as the benefits of the two working in tandem are too great to ignore.
The secret to making the most of data is being able to see the full picture, as well as drill down – or rather, zoom in – on what’s important in the given context.
From a process perspective, users can see what data is used in the process and architecture models. And from a data perspective, they can see the context of the data and the impact of all the places it is used in processes across the enterprise. This provides a more well-rounded view of both the organization and the data, enabling data modelers to create and manage better data models and implement more context-specific data deployments.
It could be that this siloed approach to data and process modeling was born out of the cost of investing in both being too high for some businesses, the difficulty of aligning the two approaches, or a cocktail of the two.
The latter is perhaps the more common culprit. This is evident when we consider the many companies already modeling both their data and their processes. The problem with the current approach is that the two model types are siloed, severing the valuable connections between them, which makes alignment difficult to achieve. Although all the data is there, those severed connections are just as valuable as the data itself, and losing them means a business isn’t seeing the full picture.
However, there are now examples of data and process modeling being united under one banner.
“By bringing both data and process together, we are delivering more value to different stakeholders in the organization by providing more visibility of each domain,” suggested Martin. “Data isn’t locked into the database administrator or architect, it’s now expressed to the business by connections to process models.”
The added visibility provided by a connected data and process modeling approach is essential to a Big Data strategy. And there are further indications this approach will soon be (or already is) more crucial than ever. The Internet of Things (IoT), for example, continues to gain momentum, and with it will come more data, at greater speeds, from more disparate sources. Businesses will need to adopt this sort of approach to govern how that data is moved and united, and to identify and tackle any security issues that arise.