Produktionsmanagement
Failure management in the production area has been intensively analyzed by the research community. Although several efficient methods have been developed and in part successfully implemented, manufacturing companies still face many challenges. The resulting main question is how manufacturers can be assisted by a sustainable approach that enables them to proactively detect and prevent failures before they occur. A high-resolution production system based on analyzed real-time data enables manufacturers to answer this question. In this context, Big Data technologies have gained importance, since the critical success factor is not only to collect real-time data in production but also to structure the data. Therefore, this paper presents the implementation of Big Data technologies in the production area using the example of an actual research project. After the literature review, we describe a Big Data-based approach to prevent failures in the production area. This approach centers on a real-time-capable platform with complex event processing algorithms to define appropriate improvement measures.
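The complex event processing idea behind such a platform can be sketched as a simple rule over a stream of readings: raise an alert when a failure pattern completes. The threshold, window size, and data values below are illustrative assumptions, not the project's actual algorithm.

```python
# Minimal complex-event-processing sketch: flag the points in a sensor
# stream where several consecutive readings exceed a threshold
# (illustrative rule; thresholds and data are hypothetical).
from collections import deque

def detect_failure_pattern(readings, threshold=80.0, window=3):
    """Return indices where `window` consecutive readings all exceed threshold."""
    recent = deque(maxlen=window)
    alerts = []
    for i, value in enumerate(readings):
        recent.append(value)
        if len(recent) == window and all(v > threshold for v in recent):
            alerts.append(i)
    return alerts

stream = [75.0, 82.0, 85.0, 90.0, 78.0, 95.0]
print(detect_failure_pattern(stream))  # index where the pattern completes
```

A production platform would apply many such rules concurrently over live data; this sketch only shows the core windowed-matching step.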
Outsourcing of logistics operations (especially transportation, distribution, and warehousing) is one of the most viable options customers exercise to excel in their logistics operations. Despite the growing outsourcing of logistics services to 3PL providers, both the service providers and their customers face considerable problems in synchronizing their business processes and analyzing performance using common key performance indicators. There is strong demand for an integrated approach that helps 3PL providers and their customers better synchronize their business processes and share common goals and perspectives. Such integrated approaches often take the shape of a process-oriented reference model covering many diverse aspects of the operations and controlling of a business. In this paper, an integrated reference model to support 3PL service operations is presented. The Logistics Reference Model (LRM), developed and validated in several 3PL service companies, encompasses standard business processes, a performance measurement system, and best practices.
Due to shorter product life cycles, the number of production ramp-ups is increasing, while customers demand ever more variable and individualized products. In the future, optimizing the production ramp-up will become an important differentiation criterion for companies. It therefore becomes indispensable to consider the whole supply chain in the ramp-up process, which is what the research presented in this paper concentrates on. The intention of the research project is to develop a model of a supply chain in the production ramp-up stage. From this model, approaches for optimizing the production ramp-up across the whole supply chain will be derived.
Furthermore, the research project concentrates on measuring production ramp-up performance in the supply chain, showing the impact on economic and financial measures. The result of this research is an approach to align the tasks and objectives of Supply Chain Management with those of ramp-up management in order to optimize the whole supply chain in the ramp-up stage.
Rebound Logistics
(2009)
Today, the flow of product returns is becoming a significant concern for many manufacturing companies. In this research area, three fundamental aspects of product returns need to be taken into consideration. First, companies are increasingly aware that product returns may offer an opportunity for substantial profit generation and for improving a manufacturer's competitive advantage when the accretive value of the products and technology is taken into account. Second, the impact of green laws and legislative provisions, as well as the growing importance of sustainable production management for marketing reasons, forces companies to design and manage the reverse supply chain actively. Third, the importance of managing reverse supply chains effectively is reinforced by the currently volatile economic climate. This paper outlines first results of designing a methodological framework for implementing an integrative reverse supply chain for manufacturing companies based on a type-specific Reverse Supply Chain Reference Model.
With Big Data technologies on the rise, new fields of application appear in analyzing data to find new relationships for improving process understanding and stability. Manufacturing companies often cope with a high number of deviations but struggle to resolve them with reasonable effort. The research project BigPro aims to develop a methodology for implementing countermeasures to disturbances and deviations derived from Big Data. This paper proposes a methodology for practitioners to assess predefined countermeasures. It consists of a morphology with several criteria, each of which can take a certain characteristic. These are then combined with weighting factors to assess the feasibility of a countermeasure for prioritization.
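The weighted combination of criterion characteristics described above can be sketched as a normalized weighted sum. The criteria names, ratings, and weights below are hypothetical placeholders, not the project's actual morphology.

```python
# Weighted-sum sketch for prioritizing a countermeasure: each criterion
# gets a rating in [0, 1] and a weight; the feasibility score is the
# weight-normalized sum (criteria and values are hypothetical).

def feasibility_score(characteristics, weights):
    """Weighted average of criterion ratings for one countermeasure."""
    assert characteristics.keys() == weights.keys()
    total_weight = sum(weights.values())
    return sum(characteristics[c] * weights[c] for c in weights) / total_weight

measure = {"cost": 0.8, "implementation_time": 0.5, "effectiveness": 0.9}
weights = {"cost": 2.0, "implementation_time": 1.0, "effectiveness": 3.0}
score = feasibility_score(measure, weights)  # higher score = higher priority
```

Ranking several candidate countermeasures by this score gives the prioritization the paper aims at; the actual assessment criteria come from the morphology itself.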
Manufacturing companies are facing an increasingly turbulent market – a market defined by products growing in complexity and shrinking product life cycles. This leads to a boost in planning complexity accompanied by higher error sensitivity. In practice, IT systems and sensors integrated into the shop floor in the context of Industry 4.0 are used to deal with these challenges. However, while existing research provides solutions in the field of pattern recognition or recommended actions, a combination of the two approaches is neglected. This leads to an overwhelming amount of data without contributing to an improvement of processes. To address this problem, this study presents a new platform-based concept to collect and analyze high-resolution data using self-learning algorithms. Hereby, patterns can be identified and reproduced, allowing an exact prediction of future system behavior. Artificial intelligence maximizes the automation of the reduction and compensation of disruptive factors.
The research outlines a concept to conduct the double materiality assessment through the synergistic use of Generative AI and the AHP method. In the first step, we employ interactive, moderated workshops as our chosen methodology to create a tailored set of sustainability target criteria. This process is enriched by the inclusion of Generative AI. The outcome is a comprehensive set of company-specific sustainability target criteria.
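The AHP step of the concept can be illustrated with the standard row-geometric-mean shortcut for deriving a priority vector from a pairwise comparison matrix. The three criteria and comparison values below are hypothetical, not results of the described workshops.

```python
# AHP priority-vector sketch using the common row-geometric-mean
# approximation (pairwise comparison values are hypothetical).
import math

def ahp_priorities(matrix):
    """Approximate AHP priority vector from a pairwise comparison matrix."""
    n = len(matrix)
    geometric_means = [math.prod(row) ** (1.0 / n) for row in matrix]
    total = sum(geometric_means)
    return [g / total for g in geometric_means]

# Hypothetical comparisons of three sustainability target criteria,
# on Saaty's 1-9 scale (matrix[i][j] = importance of i relative to j).
pairwise = [
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 3.0],
    [1 / 5, 1 / 3, 1.0],
]
weights = ahp_priorities(pairwise)  # normalized priorities, sum to 1
```

In the described concept, the Generative-AI-assisted workshops would supply the criteria; AHP then turns stakeholder comparisons into the weights used for the double materiality assessment.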
In the last decade, enterprises realized the high value of data, learned to successfully utilize it for internal processes and business models, and are trying to find more ways to acquire relevant data. Since enterprises are part of complex networks, data from their partners and customers can also be beneficial: from adjusting demand and supply to planning production and aligning capacities. One such example is adaptive process control: detailed material data from a supplier can be used to adjust process parameters in the customer's own production. This approach may be especially beneficial for the steel industry, as material properties can be adjusted by changing the speed, force, or temperature of downstream production processes. However, such an approach requires tight collaboration, e.g., regarding improving the IT infrastructure, ensuring data acquisition and transfer, and, most importantly, the utilization of such data.
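A minimal sketch of the adaptive-process-control idea: a process parameter is adjusted proportionally to the deviation of a measured supplier material property from its nominal value. Parameter names, the gain, and the proportional rule are illustrative assumptions, not an industrial control law.

```python
# Adaptive process control sketch: scale a rolling force by the relative
# deviation of measured material hardness from nominal (all values and
# the proportional rule are hypothetical).

def adjust_force(base_force, measured_hardness, nominal_hardness, gain=0.5):
    """Proportionally adapt a process parameter to supplier material data."""
    deviation = (measured_hardness - nominal_hardness) / nominal_hardness
    return base_force * (1.0 + gain * deviation)

# Incoming material measured 10 % harder than nominal -> force raised 5 %.
force = adjust_force(base_force=100.0, measured_hardness=220.0,
                     nominal_hardness=200.0)
```

The point of the sketch is the data dependency: the adjustment is only possible if the supplier's measurement actually reaches the downstream process, which is the collaboration challenge the abstract names.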
Companies in the manufacturing sector are confronted with an increasingly dynamic environment. Thus, corporate processes and, consequently, the supporting IT landscape must change. This need is not yet fully met in the development of information systems. While best-of-breed approaches are available, monolithic systems that no longer meet the manufacturing industry's requirements are still prevalent in practical use. A modular structure of IT landscapes could combine the advantages of individual and standard information systems and meet the need for adaptability. At present, however, there is no established standard for the modular design of IT landscapes in the field of manufacturing companies' information systems. This paper presents different ways of modularly designing IT landscapes and information systems and analyzes their objects of modularization. For this purpose, a systematic literature review is carried out in the subject area of software and modularization. Starting from the V-model as a reference model, a framework for different levels of modularization was developed, identifying that most scientific approaches carry out modularization at the data structure-based and source code-based levels. Only a few sources address modularization at the software environment-based and software function-based levels. In particular, no domain-specific application of these levels of modularization, e.g., for manufacturing, was identified. (Literature base: https://epub.fir.de/frontdoor/index/index/docId/2704)
Process mining has emerged as a crucial technology for digitalization, enabling companies to analyze, visualize, and optimize their processes using system data. Despite significant developments in the field over the years, companies – notably small and medium-sized enterprises – are not yet familiar with the discipline, leaving untapped potential for its practical application in the business domain. They often struggle with understanding the potential use cases, associated benefits, and prerequisites for implementing process mining applications. This lack of clarity and concerns about the effort and costs involved hinder the widespread adoption of process mining. To address this gap between process mining theory and real-world business application, we introduce the “Process Mining Use Case Canvas,” a novel framework designed to facilitate the structured development and specification of suitable use cases for process mining applications within manufacturing companies. We also connect the canvas to established methodologies and models for developing and specifying use cases from related domains targeting data analytics and artificial intelligence projects. The canvas has already been tested and validated through its application in the ProMiConE research project, in collaboration with manufacturing companies.
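For readers unfamiliar with the discipline, the core process mining step the canvas targets can be sketched in a few lines: deriving a directly-follows graph from an event log. The log format and activity names below are illustrative, not tied to any specific process mining tool.

```python
# Process mining sketch: count directly-follows relations per case from a
# simple event log of (case_id, activity) tuples, assumed time-ordered
# within each case (log contents are hypothetical).
from collections import Counter, defaultdict

def directly_follows(event_log):
    """Build a directly-follows graph as a Counter of (activity_a, activity_b)."""
    traces = defaultdict(list)
    for case_id, activity in event_log:
        traces[case_id].append(activity)
    dfg = Counter()
    for trace in traces.values():
        for a, b in zip(trace, trace[1:]):
            dfg[(a, b)] += 1
    return dfg

log = [
    ("c1", "receive order"), ("c1", "produce"), ("c1", "ship"),
    ("c2", "receive order"), ("c2", "ship"),
]
dfg = directly_follows(log)  # e.g. reveals that c2 skipped "produce"
```

Real process mining tools add filtering, conformance checking, and visualization on top of exactly this kind of relation counting; a use case specified with the canvas would define which system data feeds such a log.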
Influenced by the high dynamics of markets, the optimization of supply chains is gaining importance. However, analyzing different procurement strategies and the influence of various production parameters is difficult in industrial practice. Therefore, simulations of supply chains are used to improve the production process. The objective of this research is to evaluate different procurement strategies in a four-stage supply chain. In addition, this research aims to identify the main factors influencing the supply chain’s performance. The performance of the supply chain is measured by means of back orders (backlog). A scenario analysis of different customer demands and a Design of Experiments analysis enhance the significance of the simulation results.
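A heavily simplified version of such a simulation can illustrate how backlog serves as the performance measure: demand propagates upstream through four stages, and unmet demand accumulates. This toy model has no replenishment or lead times and is not the paper's simulation; all quantities are hypothetical.

```python
# Toy four-stage serial supply chain: each stage ships what it can from
# stock (no replenishment in this sketch); unmet demand carries over as
# backlog and the full order volume is passed upstream.

def simulate_backlog(demands, stages=4, initial_stock=5):
    """Return the total remaining backlog across all stages after the run."""
    stock = [initial_stock] * stages
    backlog = [0] * stages
    for customer_demand in demands:
        demand = customer_demand
        for s in range(stages):
            required = demand + backlog[s]
            shipped = min(stock[s], required)
            stock[s] -= shipped
            backlog[s] = required - shipped
            demand = required  # order volume passed to the next stage upstream
    return sum(backlog)

# A demand spike in period 2 exhausts stocks and builds up backlog.
total = simulate_backlog([3, 6, 4])
```

Comparing this backlog figure across procurement strategies and demand scenarios is, in miniature, what the scenario and Design of Experiments analyses do at full scale.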
The complexity and volatility of companies’ environment increase the relevance of disruption preparation. Resilience enables companies to deal with disruptions, reduce their impact and ensure competitiveness. Especially in the context of procurement, disruptions can cause major challenges, while resilience contributes to ensuring material availability. Even though past disruptions have posed various challenges and companies have recognized the need to increase resilience, resilience is often not designed systematically. One major challenge is the number of potential measures to increase resilience. The systematic design of resilience thus requires a detailed understanding of domain-specific measures. This also includes an understanding of the contribution of these measures to different resilience components and their interdependencies. This paper proposes a systematic approach for configuring resilience in procurement which enables the evaluation and selection of resilience measures. Based on a resilience framework, a resilience configurator is developed. The basis of the configurator is a set of resilience potentials that have been characterized and clustered. Overarching approaches to design resilience and indicators to evaluate resilience are presented. Moreover, a procedure is proposed to ensure practical applicability. To evaluate the results, two case studies are conducted. The results enable companies to systematically design their resilience in procurement.
Based on increasingly complex value creation networks, more and more event-based systems are being used for decision support. One example of a category of event-based systems is supply chain event management, whose aim is to enable the best possible reaction to critical exceptional events based on event data. The central element is the event, which represents the information basis for mapping and matching process flows in event-based systems. However, since data quality is insufficient in numerous application cases, and building on how the identification of incorrect data in supply chain event management is considered in the literature, this paper deals with the theoretical derivation of the data attributes necessary for identifying incorrect event data. In particular, error types that require complex identification strategies are considered. Accordingly, the relevant existing error types of event data are specified into subtypes in this paper. Subsequently, the necessary information requirements and the information available for identification are compared using a gap analysis. Based on this gap, the necessary data attributes can be derived. Finally, an approach is presented that enables the generation of the complete data set. This serves as a basis for recognizing and filtering out erroneous events, in contrast to standard and exception events.
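The filtering step at the end of the approach can be sketched as a check of each event against a set of required data attributes: events with missing attributes are flagged as erroneous rather than treated as exception events. The attribute set and the event records below are hypothetical, not the attributes derived in the paper.

```python
# Sketch of attribute-based event filtering: split an event stream into
# usable and erroneous events by checking for missing required attributes
# (the required set and the sample events are hypothetical).
REQUIRED_ATTRIBUTES = {"event_id", "timestamp", "order_id", "location"}

def classify_events(events):
    """Return (usable, erroneous) lists; an event is erroneous if any
    required attribute is absent or None."""
    usable, erroneous = [], []
    for event in events:
        present = {k for k, v in event.items() if v is not None}
        missing = REQUIRED_ATTRIBUTES - present
        (erroneous if missing else usable).append(event)
    return usable, erroneous

events = [
    {"event_id": 1, "timestamp": "2024-01-05T10:00", "order_id": "A1", "location": "WH1"},
    {"event_id": 2, "timestamp": None, "order_id": "A2", "location": "WH1"},
]
usable, erroneous = classify_events(events)
```

The paper's contribution is deriving *which* attributes belong in such a required set so that complex error types become detectable at all; this sketch only shows how the resulting filter would be applied.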
Gap Analysis for CO2 Accounting Tool by Integrating Enterprise Resource Planning System Information
(2023)
Detailed carbon accounting is the foundation for reducing CO2 emissions in manufacturing companies. However, existing accounting approaches are primarily based on manual data preparation, although manufacturing companies already have a variety of IT systems and resulting data available. The gap analysis carried out based on the GHG Protocol and a reference ERP system shows how much of the information required for CO2 accounting can be integrated from an ERP system. The ERP system can cover 20 % of the required information. Information availability can be increased to 49 % through additionally identified modifications of the ERP system. Integrating the CO2 accounting tool with other systems of the IT landscape, e.g., an Energy Information System, enables a further increase.
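The coverage figures from such a gap analysis boil down to a simple set comparison between required information items and items a system can supply. The item names and counts below are hypothetical stand-ins (2 of 10 and 5 of 10 covered), not the paper's 20 % and 49 % results.

```python
# Gap-analysis coverage sketch: share of required CO2-accounting
# information items covered by a system (item sets are hypothetical).

def coverage(required, available):
    """Fraction of required information items present in `available`."""
    return len(required & available) / len(required)

required = {f"item{i}" for i in range(1, 11)}        # 10 required items
erp_covers = {"item1", "item2"}                      # baseline ERP coverage
erp_modified = erp_covers | {"item3", "item4", "item5"}  # after modifications

base = coverage(required, erp_covers)        # 2 of 10 items covered
extended = coverage(required, erp_modified)  # 5 of 10 items covered
```

In the paper, the required set comes from the GHG Protocol and the available sets from the reference ERP system, before and after the identified modifications.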
Maximising economies of scale in individualised production is a vital issue for producing companies in high-wage countries. A decisive enabler for this is the management of product and process complexity through systematic standardisation. Due to the strong and far-reaching impact of complexity on the value-added chain, its management requires an integrative consideration of the entire product and production system.
The following paper introduces a methodology addressing this challenge. The core element of this methodology is an integrative, complexity-focused assessment model. This assessment model has been validated empirically by analysing key company data from more than 50 German toolmaking firms. Findings of this empirical investigation are presented in this paper.
Manufacturing companies in the machinery and equipment industry find themselves more than ever exposed to a rapidly changing competitive environment. In particular, the resulting diversity of planning and control processes confronts organisations and information systems with a significant coordination effort. To this day, the planning and execution of order processing – from offer processing to the final shipment of the product – is part of production planning and control (PPC), which is almost entirely integrated into information systems. However, when it comes to managing dynamic influences on processes within order processing, there is a deficiency in the processing of decision-relevant, real-time information. This is partly due to missing or incorrect feedback of process-relevant data, so that the planning results obtained with information systems differ from the current process situation.
The concept of Manufacturing Resource Planning (MRP II) still represents the central logic of production planning and control. However, the centralised, push-oriented MRP II planning logic is not able to plan and measure adequately the dynamic processes that, due to diverse disturbances, often occur in production environments. Specific weaknesses of MRP II-based systems are the lack of support for order releases, a planning principle based on average values, the successive planning method, and the use of limited partial models. The successive planning method dissects PPC tasks into smaller work packages and thus moves away from a holistic approach and from reaching an optimal solution. Similarly, planning oriented towards an overall business objective system is not possible with a partial planning approach because of its isolated considerations. Insufficient consideration of the current load horizon and capacity utilisation, missing or delayed feedback on order progress and faults, and poor availability and transparency of information can be named as further weaknesses of MRP II-based systems.
Due to shortening product life cycles and increasing product variety, manufacturing companies are confronted with a growing number of product ramp-ups. The aim of current research activities is therefore to enable ramp-up-intensive companies to create reliable production programs in a short time. Learning effects are to be exploited without neglecting diversification effects. To achieve this objective, a model for cybernetic production program planning (PPP) during product ramp-ups is developed.
Production systems are exposed to increasing planning uncertainty and susceptibility to disruption. Inter-company coordination has not been sufficiently considered in contemporary concepts of supply chain management. Against this background, it is crucial to provide a suitable tool that increases the planning capability of the players and the robustness of the supply chain as a whole. Therefore, this article identifies the relevant causes and effects of planning uncertainties within production planning and, based on these, presents an inter-company supply chain planning concept.
Producing companies are confronted with a growing number of product ramp-ups, since product life cycles are decreasing and product diversity is increasing. Production Planning and Control (PPC) of ramp-up products is particularly challenging, as there is a significant lack of reliable empirical data.
The information deficit is exceptionally high for the first step of the PPC process, namely Production Program Planning (PPP). The paper at hand proposes an innovative approach to cybernetic PPP that enables companies with numerous ramp-ups to design reliable and fast PPP processes that can react highly adaptively to unpredictable environmental disturbances. The Viable System Model (VSM) is used as a frame of reference for designing PPP processes in line with principles from management cybernetics.