Master Data Management (MDM)

Master Data Management (MDM) is a discipline, supported by technology, that consistently defines and manages an organization’s critical data to provide a single point of reference. The benefits of the MDM paradigm increase as the number and diversity of a company’s departments, employee roles, and computing applications grow. For this reason, MDM is more likely to be of value to large or complex businesses than to small or medium-sized enterprises. When companies merge, implementing MDM can minimize confusion and optimize the efficiency of the newly expanded organization.

Axional ERP/MDM provides a unified view of the critical areas typically handled by different applications (finance, logistics, etc.). Our tool avoids potential duplications and provides a singular, trustworthy view of your data. This view can be used across all the modules and functionalities of the Axional ERP suite.

Curating and managing master data is key to ensuring its quality. Analysis and reporting is greatly dependent on the quality of an organization’s master data; as such, the end goal of a complete MDM solution is to possess one “trusted version of the truth” for all decision-making.

Overall master data functionality

The main aim of MDM processes is to consolidate variants of instances of core data objects distributed across the enterprise, creating one unified representation. In turn, that representation is continually synchronized across the application architecture to make master data available as a shared resource.

Master data management processes identify the sources from which to collect descriptions of these entities. In the course of transformation and normalization, system administrators adapt descriptions to conform to standard formats and data domains, making it possible to remove duplicate instances of any entity.

Such Axional MDM processes result in an organizational MDM repository, from which all requests for an entity instance produce the same description, irrespective of the source and the requested destination.

This module includes the main processes required to manage master data:

  • source identification
  • data collection, transformation, and normalization
  • rule administration, error detection, and correction
  • data consolidation
  • data storage and distribution
  • data classification and taxonomy services
  • item master creation
  • schema mapping
  • product codification, data enrichment, and data governance


These processes are described in the sections that follow.

Source identification

In MDM, a source is any instance of a software application providing business information using its own codes and identifiers for each data element (customer, item, etc.). The same application running in different countries may use different codes for the same data elements, which is the main reason to identify sources inside MDM.

The module takes advantage of two concepts: Source System and Source Application. It assigns a single code to each Source System and records several pieces of information:

  • The application type: accounting, human resources, etc.
  • The server where it runs
  • Information on location
  • Status (active, temporarily disabled, etc.)
  • Frequency of updates
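As a rough illustration of how such a registry might look, the following Python sketch models a source-system entry carrying the attributes listed above. The class and field names are invented for this example and do not reflect the actual Axional MDM schema.

```python
from dataclasses import dataclass

# Hypothetical sketch of a source-system registry entry; field names are
# illustrative, not the actual Axional MDM data model.
@dataclass
class SourceSystem:
    code: str                      # single code assigned to each source system
    app_type: str                  # e.g. "accounting", "human-resources"
    server: str                    # server where the application runs
    location: str                  # country or site information
    status: str = "active"         # "active", "disabled", ...
    update_frequency: str = "daily"

class SourceRegistry:
    """Keeps one entry per source-system code and rejects duplicates."""

    def __init__(self):
        self._systems = {}

    def register(self, system: SourceSystem) -> None:
        if system.code in self._systems:
            raise ValueError(f"duplicate source code: {system.code}")
        self._systems[system.code] = system

    def active_sources(self):
        return [s for s in self._systems.values() if s.status == "active"]

# Example usage with invented codes: the same accounting application
# deployed in two countries is registered as two distinct sources.
registry = SourceRegistry()
registry.register(SourceSystem("FIN-ES", "accounting", "srv01", "Spain"))
registry.register(SourceSystem("FIN-FR", "accounting", "srv02", "France",
                               status="disabled"))
```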

Data collection, transformation, and normalization

Data acquisition includes both near real-time processes and batch processes built on standard integration approaches, such as ETL and SOA, used to acquire and aggregate data from one or more sources.

The MDM functionality is fully integrated with the ETL (Extract, Transform, Load) processes that feed the data marts used in business analytics. This MDM process converts a set of values from the data format of a source system into the format of a destination system. On a technical level, it uses the capabilities of the Axional Integration Bus embedded in the Axional software architecture.

Data transformation maps elements from the source system to the destination system (a process also known as data mapping) and executes any transformation that must occur. When mapping data element to data element, Axional MDM supports complex transformations that require one-to-many and many-to-one transformation rules.
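The mapping idea can be sketched in a few lines of Python. This is a minimal illustration of many-to-one and one-to-many rules, not the Axional Integration Bus API; all field names and the record layout are assumptions made for the example.

```python
# Illustrative sketch of element-level data mapping. Each rule takes a
# source record and returns the destination fields it produces.

def many_to_one(record):
    """Many-to-one rule: combine several source fields into one."""
    return {"full_name": f"{record['first_name']} {record['last_name']}"}

def one_to_many(record):
    """One-to-many rule: split one source field into several."""
    street, city = record["address"].split(";")
    return {"street": street.strip(), "city": city.strip()}

def transform(record, rules):
    """Apply every mapping rule and merge the destination fields."""
    out = {}
    for rule in rules:
        out.update(rule(record))
    return out

src = {"first_name": "Ada", "last_name": "Lovelace",
       "address": "12 Main St; London"}
dst = transform(src, [many_to_one, one_to_many])
# dst == {"full_name": "Ada Lovelace", "street": "12 Main St", "city": "London"}
```

In a real deployment the rules would be declarative configuration rather than hand-written functions, but the shape of the problem is the same.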

The ETL process has been specifically designed to move large volumes of data and apply complex business rules to bring data together in a standardized, homogeneous environment. A business rule is a statement that describes a business policy or procedure. Business logic describes the sequence of operations associated with data in a database to carry out said rule.

Rule administration, error detection, and correction

The module provides basic cleansing and standardization capabilities. In addition, the interfaces provided enable the use of third-party tools and optional custom routines, such as real-time data validation lookups.

The module includes user-definable functions to be executed during the ETL process. These functions will perform specific error detection and correction operations.

Data consolidation

Axional MDM includes Match and Merge engines that use business rules to identify matching source records, enabling their automatic or manual merging into a master data record.
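A toy version of such a rule-based match-and-merge can be written in Python. The matching rule and survivorship policy below are simplified assumptions for illustration; the real engine's rules are considerably richer.

```python
# Hedged sketch of match-and-merge: records whose normalized key fields
# agree are merged, preferring whichever record has a non-empty value.

def norm(s):
    """Normalize for matching: lowercase, collapse whitespace."""
    return " ".join(s.lower().split())

def match(a, b):
    """Example business rule: same normalized name, and the same tax id
    when both records have one."""
    if norm(a["name"]) != norm(b["name"]):
        return False
    if a.get("tax_id") and b.get("tax_id"):
        return a["tax_id"] == b["tax_id"]
    return True

def merge(a, b):
    """Survivorship: keep the first non-empty value found for each field."""
    return {k: a.get(k) or b.get(k) for k in set(a) | set(b)}

# Two source records describing the same customer (invented data).
r1 = {"name": "ACME  Corp", "tax_id": "", "phone": "555-0100"}
r2 = {"name": "acme corp", "tax_id": "B123", "phone": ""}

golden = merge(r1, r2) if match(r1, r2) else None
# golden combines the best-filled fields of both source records
```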

The system also gives users the ability to handle historical data. In this case, all changes to records are kept, allowing the user to restore any previous version of a record.

Data storage and distribution

Axional MDM provides a central repository where master/reference data is stored. It also handles all the mapping schemas required to integrate with source systems and local applications.

By definition, the MDM module contains reference data. This information is frequently needed in other systems outside the ERP, e.g. on the company website where certain information about products and articles must be displayed. Axional MDM provides the mechanism to synchronize those systems with the appropriate data at predefined intervals of time. There are three ways in which master data may be collated and distributed to other systems:

  • Data consolidation: Master data is captured from multiple sources and integrated into a single repository (operational data store) for replication into other destination systems.
  • Data federation: The MDM module provides a single, virtual view of master data from one source to one or more destination systems.
  • Data propagation: Master data is copied from one system to another, typically through point-to-point interfaces in legacy systems.

Data classification and taxonomy services

Ambiguity can pose a major challenge for MDM data architecture. When the same term can have different meanings, two Axional MDM tools help determine the most appropriate meaning through context: Taxonomy and Hierarchy Management.

Taxonomy concerns “semantic architecture”: naming, as well as making decisions about how to map different concepts and terms to a consistent structure.

The MDM thesaurus can map terms as synonyms to account for inconsistencies. Taxonomies can also represent related concepts (technically also part of a thesaurus), which can be used to connect processes, business logic, or dynamic/related content to support specific tasks. This tool can also be used to map terms corresponding to different regulatory systems, e.g. when reconciling multiple versions of industry classification schemes used in different nations.
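At its simplest, a thesaurus of this kind is a lookup from variant terms to one preferred term. The sketch below shows the idea in Python; the vocabulary and the flat-dictionary representation are invented for the example (a production thesaurus would also carry broader/narrower and related-term links).

```python
# Minimal sketch of a thesaurus mapping synonyms and local or regulatory
# variants to one preferred term. All terms here are invented examples.

THESAURUS = {
    "laptop": "notebook",
    "portable computer": "notebook",
    "cell phone": "mobile phone",
    "handy": "mobile phone",       # regional variant
}

def preferred_term(term):
    """Resolve a term to its preferred form; unknown terms pass through."""
    t = term.strip().lower()
    return THESAURUS.get(t, t)
```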

Hierarchy Management: The Hierarchy Manager provides the ability to manage relationship structures across master data records in order to view them in a hierarchical presentation (top-down, vertical, customers by territory, etc.).
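A hierarchy over master records can be represented as a simple parent map, as in the Python sketch below. The geography-style example data and function names are illustrative only, not the Hierarchy Manager's actual model.

```python
# Sketch of top-down hierarchy management: each master record stores its
# parent, from which child lists and roll-up paths can be derived.

parents = {
    "EMEA": None,          # root of the territory hierarchy
    "Spain": "EMEA",
    "France": "EMEA",
    "Barcelona": "Spain",
}

def children(node):
    """Direct children of a node, e.g. territories under a region."""
    return sorted(k for k, p in parents.items() if p == node)

def path_to_root(node):
    """Roll-up path from a record to the top of the hierarchy."""
    out = [node]
    while parents[node] is not None:
        node = parents[node]
        out.append(node)
    return out
```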

The MDM module supports user-defined data models for each subject area, such as Customer, Product, Reference Data, etc. These models can contain attributes which identify the business structures of the master data record. The source attributes of enterprise master data typically span multiple source systems.

Item master creation

This module has the functionality to create and maintain the organization’s item masters. Items are assigned to a corresponding Master Taxonomy category. Although categories with assigned items should be fairly static, Axional MDM supports multiple taxonomies for different business functions, allowing enough flexibility to adapt to business needs.

Schema mapping

A key MDM process, schema mapping allows the linkage (or translation) of local identifiers. The use of such mapping allows the local application to continue using its own coding system, while the ERP provides users with a unified view of corporate information.

The module includes the ability to automate, to some extent, the build-up of such mappings using business rules.
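The core of such a cross-reference is a table keyed by (source, local code) that resolves to a master identifier, plus rules that populate it automatically when a match is unambiguous. The Python sketch below illustrates this; all codes, names, and the exact-name-match rule are assumptions made for the example.

```python
# Sketch of identifier cross-referencing: each source keeps its local
# codes, while a mapping table resolves them to one master identifier.

xref = {}  # (source, local_code) -> master_id

def link(source, local_code, master_id):
    """Record that a local code corresponds to a master record."""
    xref[(source, local_code)] = master_id

def resolve(source, local_code):
    """Translate a local code to the master id, or None if unmapped."""
    return xref.get((source, local_code))

def auto_link(source, local_code, master_names, name):
    """Example business rule: if the local record's name matches a master
    name exactly (case-insensitive), link automatically; otherwise leave
    the record for a data steward to map by hand."""
    master_id = master_names.get(name.strip().lower())
    if master_id:
        link(source, local_code, master_id)
    return master_id

# Invented data: one known master item, auto-linked from a local source.
masters = {"blue widget": "ITEM-0001"}
auto_link("FIN-ES", "A-77", masters, "Blue Widget")
```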

Product codification, data enrichment, and data governance

Axional MDM includes controls to ensure that data entered by a team member or an automated process meets precise standards, such as business rules, data definitions, and data integrity constraints. These controls are made possible by User Roles and a set of custom-defined validations that ensure important data assets are formally managed throughout the enterprise.

Access and security

The Axional ERP suite includes a comprehensive set of functions to manage user access and data security. These user management functions are also used in the MDM module, where various roles exist: administrator, data modeler, rules designer, and data steward.

User-friendly interfaces provide access to rules and data management functions, supporting both stewards and administrators. Specific capabilities include:

  • Data steward functions:
    • Identify and manage candidate master data sources as well as trusted sources
    • Manage data cleansing and standardization rules
    • Match and merge data
    • Manage and monitor data quality


  • Administrator functions:
    • Manage hub schemas, metadata, and data models
    • Manage user access
    • Manage database resources
    • Monitor and manage performance and scalability
    • Manage operational tools and services

In some cases, data originates from sources containing confidential data. Fortunately, these sources never need to be shown to users or even stored in a readable format. The module includes functions and processes to ensure that the original protected data cannot be derived backwards from currently stored data.

It is imperative that this type of data never travels during data transfers. To accomplish this objective, the fields containing such data are anonymized before storage in destination servers. Uploading and anonymization are performed in a single step, as anonymization and normalization are applied on the fly while external data is loaded from files. For each line of the source file, the following steps are executed:

  • The line is read into memory.
  • Filtering operations are executed in memory, performing anonymization and normalization. Locally defined functions can be used as filters to implement customized anonymization procedures.
  • The resulting data is loaded into the database table.
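The three steps can be sketched in Python as a per-line load function. The document states that Whirlpool is the hash used; Python's `hashlib` only exposes Whirlpool when the underlying OpenSSL build provides it, so this sketch falls back to SHA-512 where it is unavailable. The field layout, the semicolon-delimited format, and the in-memory "table" are all assumptions for the example.

```python
import hashlib

def anonymize(value: str) -> str:
    """Non-reversible hash of a confidential field. Prefers Whirlpool
    (as in the module), falling back to SHA-512 if unsupported."""
    try:
        h = hashlib.new("whirlpool")
    except ValueError:
        h = hashlib.sha512()
    h.update(value.encode("utf-8"))
    return h.hexdigest()

def load_line(line: str, table: list) -> None:
    # 1. The line has been read into memory (passed in by the caller).
    name, national_id = line.rstrip("\n").split(";")
    # 2. Filtering in memory: normalize the name, anonymize the
    #    confidential field, all before anything touches storage.
    row = {"name": name.strip().title(),
           "national_id": anonymize(national_id)}
    # 3. Load the resulting row into the destination table.
    table.append(row)

# Invented sample input: one record per line, fields separated by ";".
table = []
for line in ["ada lovelace;X1234567\n", "alan turing;Y7654321\n"]:
    load_line(line, table)
```

Because each line is hashed before it is appended, an interruption mid-load leaves no original confidential values behind, matching the row-by-row guarantee described below.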


The anonymization process is compliant with the following requirements:

  • It uses a proven, non-reversible hash method (Whirlpool).
  • Anonymization is done in memory, before any data is stored on permanent devices (even as temporary storage).
  • The process is performed row by row, so if the process is interrupted, no original data remains stored.
