A logical data model is a graphical representation of the information requirements for a given business area. It is constructed by taking the data descriptions depicted in a conceptual data model and adding the elements, definitions and context needed to describe the data’s structure.
This stage is important because while the more streamlined conceptual data model is easier to communicate, its lack of context can make it difficult to move from modeling to implementation. More detail is required to support that progression. Such detail includes describing entities and classes and defining their owned attributes, primary keys, foreign keys and relationship cardinality. At this stage, the nature of relationships between data is established and defined, and data from different systems is normalized.
At this stage, the data model’s primary function is to visualize data elements and how they relate to one another. Logical data modeling also details the attributes associated with each data element. For example, a logical data model would specify the data type of each attribute, e.g., account name (string), account number (integer).
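To make that concrete, here is a minimal sketch in Python of how the logical-level detail described above could be written down: entities, typed attributes, primary and foreign keys, and relationship cardinality. The Customer and Account entities and their attribute names are hypothetical examples for illustration only, not taken from any particular model or tool.

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Hypothetical logical-model fragment: two entities, their owned attributes
# with logical data types, primary/foreign keys, and the cardinality of the
# relationship between them. All names here are illustrative.

@dataclass
class Attribute:
    name: str
    data_type: str                        # logical type, e.g. "string", "integer"
    is_primary_key: bool = False
    foreign_key_to: Optional[str] = None  # name of the referenced entity, if any

@dataclass
class Entity:
    name: str
    attributes: List[Attribute] = field(default_factory=list)

@dataclass
class Relationship:
    parent: str
    child: str
    cardinality: str                      # e.g. "1:N" = one parent, many children

customer = Entity("Customer", [
    Attribute("customer_id", "integer", is_primary_key=True),
    Attribute("customer_name", "string"),
])

account = Entity("Account", [
    Attribute("account_number", "integer", is_primary_key=True),
    Attribute("account_name", "string"),
    Attribute("customer_id", "integer", foreign_key_to="Customer"),
])

# One customer may hold many accounts.
customer_accounts = Relationship("Customer", "Account", cardinality="1:N")
```

Note that nothing in this sketch commits to a particular database or storage technology; the data types are still logical ones.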
The three types of data models provide increasing degrees of context and detail, so their use can be viewed sequentially: a logical data model should be considered once the conceptual data model has been built.
This more structured stage of data modeling is most relevant during application design, when it can serve as a communication mechanism in the more technical environments where database analysts and designers work. It helps us understand the details of the data to a greater degree than conceptual data models – but similarly stops short of providing perspective on how it should be implemented.
As with conceptual data modeling, this means teams aren’t bound by technological considerations. This is important because the technology landscape within organizations is often dynamic.
An easy way to understand when and why a logical data model is relevant is to consider the model’s intended audience: database analysts, system analysts and designers. Given that audience and its place in the application-design process, there is less need to trade context and detail for accessibility. The extra layer of detail (when compared to conceptual data modeling) is the context architects require to ensure new applications are compatible with the data they will encompass. Essentially, a logical data model provides the foundation necessary for productive database design.
Without a logical data model, designers can only really figure out a new application’s requirements as they go. This will often mean working with unorganized data elements that make overlooking such requirements more likely. So, skipping the logical data modeling stage in favor of building a physical data model can lead to poor database design and applications that do not function as intended. Addressing such missteps requires a reactive approach that can slow down time to market and increase the total costs associated with the development process.
Additionally, the technology-agnostic nature of a logical data model helps organizations establish opportunities for process improvements. This means new applications can be built to be as effective as possible, rather than as effective as current technological constraints allow.
Context, context, context. Every stage of the data modeling process benefits an organization by introducing a new layer of context to the model. This helps develop a much clearer picture of the current state of an organization’s systems and processes, shining a light on how best to navigate to the desired future state. The specific benefits of logical data modeling are:
Help organizations identify areas for business process improvement
By building a model untethered to current technological constraints, organizations can identify what is required to realize the ideal version of the model.
Design well-informed applications
By accounting for the attributes of data elements, we can reduce the semantic oversights that could lead to problems down the road. Data elements are better defined, and the relationships between them are more complete.
Reduce costs and increase efficiency
By mitigating the potential for oversights, organizations reduce the risk of botched implementations and the need for revisions post-launch. Additionally, data re-use and sharing are encouraged, and data redundancy and inconsistencies can be avoided.
Provide a basis for future models
Just as a conceptual data model provides the basis for a logical data model, a logical data model provides the detailed design to be targeted and tuned to a specific technology in the physical data modeling stage.
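As a rough, hypothetical illustration of that hand-off (continuing the Python sketch above, and not describing any specific tool’s mechanism), the technology-neutral types in a logical model might later be bound to the physical types of a chosen database during physical data modeling:

```python
# Hypothetical mapping from logical types to the physical types of one
# target technology (generic SQL-style names, chosen for illustration only).
LOGICAL_TO_PHYSICAL = {
    "string": "VARCHAR(255)",
    "integer": "BIGINT",
}

def to_physical_columns(entity):
    """Translate an entity's logical attributes into physical column specs."""
    return [
        f"{attr.name} {LOGICAL_TO_PHYSICAL[attr.data_type]}"
        + (" PRIMARY KEY" if attr.is_primary_key else "")
        for attr in entity.attributes
    ]

# Using the Account entity defined in the earlier sketch:
print(to_physical_columns(account))
# ['account_number BIGINT PRIMARY KEY', 'account_name VARCHAR(255)', 'customer_id BIGINT']
```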
Data is erwin’s business and has been for more than 30 years. We’re recognized as the data modeling market leader, and that expertise has produced a platform well suited to every stage of the data modeling process – conceptual, logical and physical.
erwin Data Modeler by Quest allows business and technical stakeholders to collaborate on the design and implementation of new systems. Users also benefit from capabilities such as NoSQL support and automated metadata harvesting, which greatly reduce implementation times.