The Basics of ERP Systems for Manufacturing Supply Chains
Published in Odd Jøran Sagegg, Erlend Alfnes, ERP Systems for Manufacturing Supply Chains, 2020
Odd Jøran Sagegg, Erlend Alfnes
MDM involves collecting and managing critical master data across different systems and business functions. The ERP system is a source of much of this data, and some ERP vendors have built MDM functionality into their applications. These functions may include searching for, tracking, and resolving issues concerning redundant data, as well as eliminating duplicate records in the database. The MDM functions of an ERP system can also include the ability to share and update master data, such as items, customers, and vendors, across companies that use the same installation of an ERP application.
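For concreteness, the duplicate-detection capability described above can be sketched in a few lines of Python. The record fields, the name normalization rules, and the similarity threshold below are illustrative assumptions and do not reflect the behaviour of any particular ERP product.

```python
# Minimal sketch of duplicate detection in customer master data.
# Field names and the similarity threshold are illustrative assumptions,
# not the behaviour of any specific ERP system.
from difflib import SequenceMatcher

customers = [
    {"id": "C001", "name": "Acme Industries Ltd.", "city": "Oslo"},
    {"id": "C002", "name": "ACME Industries Limited", "city": "Oslo"},
    {"id": "C003", "name": "Nordic Steel AS", "city": "Trondheim"},
]

def normalize(name: str) -> str:
    """Lower-case the name and drop punctuation and legal-form suffixes."""
    cleaned = "".join(ch for ch in name.lower() if ch.isalnum() or ch.isspace())
    suffixes = {"ltd", "limited", "as", "inc"}
    return " ".join(w for w in cleaned.split() if w not in suffixes)

def find_duplicates(records, threshold=0.9):
    """Return pairs of records whose normalized names are nearly identical."""
    pairs = []
    for i, a in enumerate(records):
        for b in records[i + 1:]:
            score = SequenceMatcher(None, normalize(a["name"]), normalize(b["name"])).ratio()
            if score >= threshold and a["city"] == b["city"]:
                pairs.append((a["id"], b["id"], round(score, 2)))
    return pairs

print(find_duplicates(customers))  # e.g. [('C001', 'C002', 1.0)]
```

In practice an ERP's MDM functions would flag such pairs for review or merge them into a single record rather than simply listing them.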
Data Management Software Solutions for Business Sustainability – An Overview
Published in Pedro Novo Melo, Carolina Machado, Business Intelligence and Analytics in Small and Medium Enterprises, 2019
MDM comprises “business applications, information management methods, and data management tools to implement the policies, procedures, and infrastructures that support the capture, integration, and subsequent shared use of accurate, timely, consistent, and complete master data” (Loshin, 2009, p. 8). Loshin also highlights a crucial point: “master data management success depends on high-quality data” (Loshin, 2009, p. 17).
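To illustrate the dependence on high-quality master data, the following sketch applies simple completeness, consistency, and conformity rules to item master records; the field names and rules are assumptions made purely for the example.

```python
# Minimal sketch of rule-based quality checks on master data.
# Field names and validation rules are assumptions for illustration only.
import re

REQUIRED_FIELDS = ["item_id", "description", "unit_of_measure"]
VALID_UNITS = {"PCS", "KG", "L"}

def validate(record: dict) -> list[str]:
    """Return a list of data quality issues found in one master record."""
    issues = []
    for field in REQUIRED_FIELDS:
        if not record.get(field):
            issues.append(f"missing {field}")                       # completeness
    if record.get("unit_of_measure") not in VALID_UNITS:
        issues.append("invalid unit_of_measure")                    # consistency
    if record.get("item_id") and not re.fullmatch(r"ITM-\d{4}", record["item_id"]):
        issues.append("item_id does not match agreed format")       # conformity
    return issues

items = [
    {"item_id": "ITM-0001", "description": "Hex bolt M8", "unit_of_measure": "PCS"},
    {"item_id": "ITM-2", "description": "", "unit_of_measure": "BOX"},
]
for item in items:
    print(item["item_id"], validate(item))
```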
Information Management
Published in Steve Pittard, Peter Sell, BIM and Quantity Surveying, 2017
Malcolm Taylor, Peter Sell, Tahir Ahmad
In line with the principles outlined above, the PIM team is responsible for the following:
Common data environment (CDE) – the creation, management and validation of the chosen CDE; this involves working with the client and the rest of the project team to determine the most appropriate CDE, instigating the environment and then managing the CDE through the life of the project.
Ensuring the data within the CDE is appropriately structured so that it can be:
measured in terms of its data quality by appropriate metadata (similar to quality control of physical assets) for progress monitoring and key performance indicator (KPI) management
validated against the future maintainers’ requirements and readily assured, to ensure data is acceptable for future maintenance activity.
Information standards – the setting of the information standards to be adopted for the project, which will entail agreeing the standards with the client and the project team (see Figure 6.4 below).
Ensuring that the information created matches the requirements for operation and maintenance in form and suitability, so that whatever system is ultimately chosen (e.g. for asset management), the appropriate data set can be made available.
Information security – agreeing with the client and the project team the appropriate levels of information security (PAS1192-5:2015), including the information security marking of information to conform to the relevant standards.
Master Data Management (MDM) – the creation, standardisation and maintenance of a library of master data, which is the common reference data that is used across the project (e.g. locations, terminology, glossary).
Data validation – auditing and checking of information content and structure to ensure that the information contained in the CDE conforms to the agreed standards and structure.
Support – offering support to the client and project team, both in terms of the information standards and in the use of the CDE.
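The MDM and data validation responsibilities listed above can be pictured with a minimal sketch: a small library of common reference data (locations, disciplines) and a check that records placed in the CDE use only the agreed values. All reference values and record fields are illustrative assumptions, not part of any published standard.

```python
# Minimal sketch of a project master-data library and a CDE validation check.
# Reference values and record fields are illustrative assumptions only.

MASTER_DATA = {
    "locations": {"Level 01", "Level 02", "Plant Room"},
    "disciplines": {"ARC", "STR", "MEP"},
}

def validate_cde_record(record: dict) -> list[str]:
    """Check a CDE record against the agreed master reference data."""
    issues = []
    if record.get("location") not in MASTER_DATA["locations"]:
        issues.append(f"unknown location: {record.get('location')}")
    if record.get("discipline") not in MASTER_DATA["disciplines"]:
        issues.append(f"unknown discipline: {record.get('discipline')}")
    return issues

records = [
    {"asset": "AHU-01", "location": "Plant Room", "discipline": "MEP"},
    {"asset": "DOOR-17", "location": "Lvl 1", "discipline": "ARC"},
]
for rec in records:
    print(rec["asset"], validate_cde_record(rec) or "conforms to master data")
```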
‘Un’-blocking the industry 4.0 value chain with cyber-physical social thinking
Published in Enterprise Information Systems, 2023
Subodh Mendhurwar, Rajhans Mishra
With the evolution of supply chain technology (Banerjee 2018b) from material (manufacturing, enterprise) requirement (resource) planning to advanced optimisation, the various ecosystem stakeholders (partners, suppliers, manufacturers, service providers, customers, etc.) and their ERPs (in combination with blockchain and IoT technology) increasingly interact with each other, bridging technology gaps and enhancing transparency and integration efficiencies through use cases such as (i) ecosystem-wide Master Data Management, (ii) Engineering Design, (iii) Ordering and Procurement, (iv) Demand and Supply Management, (v) Manufacturing and Logistics Management, (vi) Tracking, (vii) Product Provenance, and (viii) Distribution Management. Blockchains help securely record and share transactional data (e.g., Al-Jaroodi and Mohamed 2019), efficiently automating supply chain processes (e.g., blockchain-based Smart Contracts – Chang, Chen, and Lu 2019) to augment transparency throughout the entire value chain (Ferrantino and Koten 2019).
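The ecosystem-wide, blockchain-backed sharing of master data mentioned above can be pictured as a hash-chained log of master data updates that any partner can verify. The following is a toy sketch under that assumption, not a description of any platform cited in the study; the partner and item names are invented for illustration.

```python
# Toy sketch of a hash-chained log of master data updates shared across
# supply chain partners. Not a real blockchain implementation; record
# fields and partner names are illustrative assumptions.
import hashlib, json, time

def make_block(prev_hash: str, payload: dict) -> dict:
    """Append-only block linking a master data update to the previous block."""
    body = {"prev_hash": prev_hash, "timestamp": time.time(), "payload": payload}
    body["hash"] = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    return body

def verify(chain: list[dict]) -> bool:
    """Recompute every hash and check the links between blocks."""
    for i, block in enumerate(chain):
        body = {k: v for k, v in block.items() if k != "hash"}
        if hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest() != block["hash"]:
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

chain = [make_block("0" * 64, {"partner": "supplier-A", "item": "ITM-0001", "lead_time_days": 14})]
chain.append(make_block(chain[-1]["hash"], {"partner": "manufacturer-B", "item": "ITM-0001", "lead_time_days": 12}))
print(verify(chain))  # True; tampering with any payload breaks verification
```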
Data Governance Model To Enhance Data Quality In Financial Institutions
Published in Information Systems Management, 2023
Data quality is addressed at the level of corrective measures. Data quality is mostly managed only reactively, based on ad-hoc needs. Metadata is described only for data stored and processed in the data warehouse (DWH). The DWH stores the documentation of data models (both logical and physical) in Power Designer. All data models are presented to users through a locally implemented web application. Data products, such as reports or analytics, are part of the Cognos documentation. Data architecture is driven by DWH architects. Data security is driven by the Information security department, which operates separately from the DWH, to ensure that data is used by the right people for the right purposes. Data integration and interoperability are managed by the Integration platform department, which augments existing data with data combined from different sources. Reference and master data management is a pilot solution, and this area is part of further research. Although the master data management application uses data from more than 10 different sources, the results of the master data procedures are not used in any way and are only stored in DWH tables. The source systems therefore continue to hold low-quality data.
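Consolidating master data from many sources, as in the pilot described above, typically produces a single "golden" record per entity. The sketch below uses one simple survivorship rule (the non-empty value from the most recently updated source wins); the source names, fields, and the rule itself are assumptions for illustration, not the institution's actual procedure.

```python
# Minimal sketch of consolidating one customer's master data from several
# source systems into a single "golden" record. The survivorship rule
# (latest update wins per field) and all field names are assumptions.
from datetime import date

source_records = [
    {"source": "CRM",  "updated": date(2022, 3, 1),  "name": "Jana Novakova", "email": None},
    {"source": "CORE", "updated": date(2022, 6, 15), "name": "Jana Nováková", "email": "jana@example.com"},
    {"source": "LOAN", "updated": date(2021, 11, 2), "name": "J. Novakova",   "email": "old@example.com"},
]

def golden_record(records: list[dict]) -> dict:
    """For each attribute, take the non-empty value from the freshest source."""
    golden = {}
    for field in ("name", "email"):
        candidates = [r for r in records if r.get(field)]
        best = max(candidates, key=lambda r: r["updated"])
        golden[field] = best[field]
        golden[f"{field}_source"] = best["source"]
    return golden

print(golden_record(source_records))
# {'name': 'Jana Nováková', 'name_source': 'CORE',
#  'email': 'jana@example.com', 'email_source': 'CORE'}
```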
A systematic review of the integration of Industry 4.0 with quality-related operational excellence methodologies
Published in Quality Management Journal, 2023
Tim Komkowski, Jiju Antony, Jose Arturo Garza-Reyes, Guilherme Luz Tortorella, Tanawadee Pongboonchai-Empl
Aside from integration, Xu, Xu, and Li (2018) identified workflow management as a prominent subject for process monitoring, control, and optimization. Based on a cross-industry study, Mishra, Sree Devi, and Badri Narayanan (2019a) identified robotic process automation (RPA) as a prominent integration topic. RPA automates process execution based on earlier simplification, standardization, and reengineering efforts (van der Aalst, Bichler, and Heinzl 2018). The study proposes that RPA may be used for processes that are likely to stay constant and that require process modifications to be made without considerable coding capabilities (Mishra, Sree Devi, and Badri Narayanan 2019a). Additional issues, such as cultural and behavioral change, require investment in building digital capabilities. Early deployment should focus on procedures where people operate like robots (Mishra, Sree Devi, and Badri Narayanan 2019b). For example, successful applications include the use of RPA in master data management (Radke, Dang, and Tan 2020).
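The kind of rule-based work RPA takes over in master data management can be sketched as a small script that standardizes pending vendor master changes and routes anything it cannot resolve to manual review. The queues, field names, and mapping tables below are illustrative assumptions, not the API of any specific RPA product.

```python
# Minimal sketch of RPA-style automation for vendor master data maintenance.
# Field names, standardization rules, and queues are illustrative assumptions.

pending_changes = [
    {"vendor_id": "V100", "country": "germany", "payment_terms": "net 30"},
    {"vendor_id": "V200", "country": "DE",      "payment_terms": "45 days??"},
]

COUNTRY_CODES = {"germany": "DE", "de": "DE", "norway": "NO", "no": "NO"}
TERMS = {"net 30": "NET30", "net 45": "NET45"}

def process(change: dict) -> tuple[str, dict]:
    """Apply fixed standardization rules; route anything unrecognized to a human."""
    country = COUNTRY_CODES.get(change["country"].strip().lower())
    terms = TERMS.get(change["payment_terms"].strip().lower())
    if country is None or terms is None:
        return "manual_review", change
    return "auto_applied", {**change, "country": country, "payment_terms": terms}

for change in pending_changes:
    queue, result = process(change)
    print(queue, result)
# auto_applied  {'vendor_id': 'V100', 'country': 'DE', 'payment_terms': 'NET30'}
# manual_review {'vendor_id': 'V200', 'country': 'DE', 'payment_terms': '45 days??'}
```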