Reinforced by the pandemic and shaped by digitalization, today's professional working environment is in a state of transformation. Working remotely has become a vital component of many professions' regular routines, and the design of remote work environments presents challenges to organizations of all sizes. By providing a classification, this paper offers a comprehensive understanding of the fields of design that must be considered to establish lasting remote work concepts in organizations. A hierarchical classification with four dimensions (human, technology, organization, and culture), seven design elements, and twenty design parameters indicates to organizations the fields of design that need to be examined. To ensure both theoretical foundation and practical applicability, the design elements are derived through a systematic literature review representing key areas of interest for remote work; they are then verified and complemented by dedicated case study research to incorporate practice-oriented design parameters.
Robotic Process Automation (RPA) is gaining importance through its ability to automate repetitive administrative processes and unlock efficiency potential. In practice, however, many implementation projects fail. This results primarily from a lack of understanding of how the introduction of RPA affects the organization as a whole. A growing gap is emerging between the performance promise of RPA and companies' ability to exploit it. Despite the exponential pace of technological progress, many companies lack the adaptability that is essential for the sustainable success of an RPA implementation. In this context, the joint optimization of the aligned dimensions of human, technology, and organization plays a central role. A systematic literature review shows that previous approaches consider this interrelation only insufficiently. In the current research landscape, no model exists that sets out the technical, social, and organizational components to be taken into account in the course of an RPA introduction. Following sociotechnical systems thinking and the case study research process, dimensions and elements of an RPA-specific sociotechnical system architecture are identified and explained in a theory-driven manner. The resulting model for supporting companies in RPA adoption was validated with numerous industry representatives within the public research project RPAsset of FIR e. V. an der RWTH Aachen.
The aim of this article is to show how manufacturing companies can systematically collect customer-related data along the customer journey. After an introduction motivating the topic, a clarification of terms, and a presentation of the study design, a reference process model of the customer interactions of manufacturing companies is designed; building on this, a data model of the digital shadow of customer interactions is derived; and finally, a procedure model for implementing the digital shadow of customer interactions is presented.
In short-term production management of the Internet of Production (IoP), the vision of a Production Control Center is pursued in which interlinked decision-support applications contribute to increasing decision-making quality and speed. The applications developed focus in particular on use cases near the shop floor, with an emphasis on the key topics of production planning and control, production system configuration, and quality control loops.
Within the Predictive Quality application, predictive models are used to derive insights from production data and subsequently improve the process- and product-related quality as well as enable automated Root Cause Analysis. The Parameter Prediction application uses invertible neural networks to predict process parameters that can be used to produce components with desired quality properties. The application Production Scheduling investigates the feasibility of applying reinforcement learning to common scheduling tasks in production and compares the performance of trained reinforcement learning agents to traditional methods. In the two applications Deviation Detection and Process Analyzer, the potentials of process mining in the context of production management are investigated. While the Deviation Detection application is designed to identify and mitigate performance and compliance deviations in production systems, the Process Analyzer concept enables the semi-automated detection of weaknesses in business and production processes utilizing event logs.
With regard to the overall vision of the IoP, the developed applications contribute significantly to the intended interdisciplinarity of production and information technology. For example, application-specific digital shadows are drafted based on the ongoing research work, and the applications are prototypically embedded in the IoP.
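The process-mining idea behind the Deviation Detection application can be pictured with a minimal sketch: an event log is grouped into one activity trace per case, and traces that differ from a reference process are flagged as deviations. The toy log, activity names, and reference sequence below are invented for illustration and are not artifacts of the actual IoP applications.

```python
from collections import defaultdict

# Hypothetical event log: (case_id, activity) pairs, already ordered
# by timestamp within each case.
event_log = [
    ("c1", "receive_order"), ("c1", "mill"), ("c1", "inspect"), ("c1", "ship"),
    ("c2", "receive_order"), ("c2", "mill"), ("c2", "ship"),  # skips inspection
    ("c3", "receive_order"), ("c3", "inspect"), ("c3", "mill"), ("c3", "ship"),
]

# Illustrative reference process the traces are checked against.
REFERENCE = ("receive_order", "mill", "inspect", "ship")

def trace_variants(log):
    """Group events into one activity sequence (trace) per case."""
    traces = defaultdict(list)
    for case_id, activity in log:
        traces[case_id].append(activity)
    return {case: tuple(acts) for case, acts in traces.items()}

def compliance_deviations(log, reference):
    """Return the cases whose trace deviates from the reference process."""
    return {case: trace for case, trace in trace_variants(log).items()
            if trace != reference}

deviations = compliance_deviations(event_log, REFERENCE)
# c2 skips the inspection step; c3 swaps inspection and milling.
```

Real process-mining tooling additionally discovers the reference model from the log itself and quantifies deviation severity; this sketch only shows the core comparison of traces against an expected process.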
Long-term production management defines the future production structure and ensures long-term competitiveness. Companies around the world currently face the challenge of making decisions in an uncertain and rapidly changing environment. Decision-making quality suffers from rapidly changing global market requirements and from the uniqueness and infrequency with which such decisions are made. Since decisions in long-term production management can rarely be reversed and are associated with high costs, an increase in decision quality is urgently needed. To this end, four different applications are presented in the following, which support the decision process by increasing decision quality and making uncertainty manageable. For each of the applications presented, a separate digital shadow was built with the objective of enabling better decisions from existing data from production and the environment. In addition, a linking of the applications is being pursued:
The Best Practice Sharing App creates transparency about existing production knowledge through the data-based identification of comparable production processes in the production network and helps to share best practices between sites. With the Supply Chain Cockpit, resilience can be increased through a data-based design of the procurement strategy that enables companies to manage disruptions. By adapting the procurement strategy, for example by choosing suppliers at different locations, the impact of disruptions can be reduced. While the Supply Chain Cockpit focuses on the strategy and on decisions that affect external partners (e.g., suppliers), the Data-Driven Site Selection concentrates on determining the sites of the company-internal global production network by creating transparency in the decision process of site selection. Different external data from various sources are analyzed and visualized appropriately to support the decision process. Finally, the issue of sustainability is also crucial for successful long-term production management. Thus, the Sustainable Footprint Design App presents an approach that takes key sustainability indicators into account for network design. [https://link.springer.com/referenceworkentry/10.1007/978-3-030-98062-7_15-1]
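The kind of decision support pursued by the Data-Driven Site Selection can be pictured as a weighted scoring of candidate sites against normalized external indicators. The criteria, weights, and scores below are purely illustrative assumptions, not data from the application.

```python
# Hypothetical candidate sites, each scored on normalized criteria in [0, 1]
# (higher is better, e.g. 0.8 labor_cost means comparatively low cost).
candidates = {
    "site_a": {"labor_cost": 0.8, "logistics": 0.6, "energy_supply": 0.7},
    "site_b": {"labor_cost": 0.5, "logistics": 0.9, "energy_supply": 0.6},
    "site_c": {"labor_cost": 0.7, "logistics": 0.7, "energy_supply": 0.9},
}

# Illustrative weights reflecting assumed strategic priorities.
weights = {"labor_cost": 0.5, "logistics": 0.3, "energy_supply": 0.2}

def rank_sites(candidates, weights):
    """Rank candidate sites by weighted score, best first."""
    scored = {site: sum(weights[c] * v for c, v in crit.items())
              for site, crit in candidates.items()}
    return sorted(scored.items(), key=lambda kv: kv[1], reverse=True)

ranking = rank_sites(candidates, weights)
# With these toy numbers, site_c ranks first (0.74), ahead of site_a (0.72).
```

A transparent scoring like this makes the trade-offs in the decision process explicit; the actual application additionally handles the collection and visualization of the underlying external data.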
With the development of publicly accessible broker systems over the last decade, the complexity of data-driven ecosystems is expected to become manageable for self-managed digitalisation. Having identified event-driven IT architectures as a suitable solution for the architectural requirements of Industry 4.0, the producing industry is now offered a relevant alternative to prominent third-party ecosystems. Although the technical components are readily available, the realisation of an event-driven IT architecture in production is often hindered by a lack of reference projects and hence by uncertainty about its success and risks. The research institute FIR and the IT specialist synyx are therefore developing an event-driven IT architecture in the Center Smart Logistics' producing factory, which is designed to be a multi-agent testbed for members of the cluster. Based on the experience gained in industrial projects, a target IT architecture was conceptualised that proposes a solution for a self-managed data ecosystem built on open-source technologies. Through the iterative integration of factory-relevant Industry 4.0 use cases, the target is continuously realised and validated. The paper presents the developed solution for a self-managed event-driven IT architecture and discusses the implications of the decisions made. Furthermore, the progress of two use cases, namely an IT-OT integration and a smart product demonstrator for the research project BlueSAM, is presented to highlight the iterative technical implementability and the merits enabled by the architecture.
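The core pattern of such an event-driven architecture — producers publishing events to topics from which decoupled consumers receive them via a broker — can be sketched in a few lines. The in-process broker below is a toy stand-in for a real open-source message broker, and all topic and event names are illustrative assumptions.

```python
from collections import defaultdict

class EventBroker:
    """Toy in-process publish/subscribe broker, standing in for the
    broker component of a real event-driven IT architecture."""

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        """Register a consumer callback for a topic."""
        self._subscribers[topic].append(handler)

    def publish(self, topic, event):
        # Deliver the event to every consumer of the topic; producer and
        # consumers share only the topic name, not direct references.
        for handler in self._subscribers[topic]:
            handler(event)

broker = EventBroker()
received = []

# A shop-floor sensor (producer) and a monitoring service (consumer)
# are coupled only through the topic, as in an IT-OT integration.
broker.subscribe("machine.temperature", received.append)
broker.publish("machine.temperature", {"machine": "m1", "celsius": 71.5})
```

In a production setting the broker additionally provides persistence, delivery guarantees, and network transport; this sketch only isolates the decoupling that makes new use cases integrable without touching existing producers.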
Companies in the manufacturing sector are confronted with an increasingly dynamic environment. Thus, corporate processes and, consequently, the supporting IT landscape must change. This need is not yet fully met in the development of information systems. While best-of-breed approaches are available, monolithic systems that no longer meet the manufacturing industry's requirements still prevail in practice. A modular structure of IT landscapes could combine the advantages of individual and standard information systems and meet the need for adaptability. At present, however, there is no established standard for the modular design of IT landscapes in the field of manufacturing companies' information systems. This paper presents different approaches to the modular design of IT landscapes and information systems and analyzes their objects of modularization. For this purpose, a systematic literature review is carried out in the subject area of software and modularization. Starting from the V-model as a reference model, a framework for different levels of modularization was developed, revealing that most scientific approaches carry out modularization at the data structure-based and source code-based levels. Only a few sources address modularization at the software environment-based and software function-based levels. In particular, no domain-specific application of these levels of modularization, e.g., for manufacturing, was identified. (Literature base: https://epub.fir.de/frontdoor/index/index/docId/2704)
Smart Services – the effective triad of product, service, and customer-oriented value proposition – offer production-oriented companies opportunities to achieve differentiation and open up new markets. The so far limited adoption of Smart Services shows that the manufacturing sector faces multilayered challenges in combining the building blocks of product, service, and value proposition into sustainable and competitive Smart Services, in deriving successful business models, and in adapting organizations to the Smart Service business. Only the big players manage this on their own, yet Germany's standing as a location for innovation also depends on its hidden champions: small and medium-sized enterprises.
Manufacturing companies face the challenge of managing vast amounts of unstructured data generated by various sources such as social media, customer feedback, product reviews, and supplier data. Text-mining technology, a branch of data mining and natural language processing, provides a solution to extract valuable insights from unstructured data, enabling manufacturing companies to make informed decisions and improve their processes. Despite the potential benefits of text-mining technology, many manufacturing companies struggle to implement use cases for various reasons. Therefore, the project VoBAKI (IGF-Project No.: 22009 N) aims to enable manufacturing companies to identify and implement text-mining use cases in their processes and decision-making. The paper presents an analysis of text-mining use cases in manufacturing companies using Mayring's content analysis and case study research. The study explores how text-mining technology can be used effectively to improve production processes and decision-making in manufacturing companies.
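A minimal flavour of such a text-mining use case — extracting the most frequent content words from unstructured customer feedback — can be sketched with the standard library alone. The feedback snippets and the stop-word list below are invented for illustration and do not stem from the VoBAKI project.

```python
import re
from collections import Counter

# Hypothetical unstructured customer feedback.
feedback = [
    "The delivery was late and the packaging was damaged.",
    "Great product, but delivery took too long.",
    "Damaged packaging again, please improve the delivery process.",
]

# Tiny illustrative stop-word list; real pipelines use curated lists.
STOP_WORDS = {"the", "was", "and", "but", "too", "a", "please", "again"}

def top_terms(texts, n=3):
    """Tokenize, drop stop words, and return the n most frequent terms."""
    tokens = (t for text in texts
              for t in re.findall(r"[a-z]+", text.lower())
              if t not in STOP_WORDS)
    return Counter(tokens).most_common(n)

terms = top_terms(feedback)
# "delivery" dominates, pointing decision-makers at a recurring issue.
```

Real text-mining use cases add stemming, named-entity recognition, or topic models on top of this; the sketch only shows how structure (recurring complaint themes) emerges from unstructured text.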
Digital technologies have gained significant importance in the course of the fourth industrial revolution and are widely implemented nowadays. However, it is necessary to bear in mind that ill-considered use can quickly have a negative impact on the environment in which a technology is deployed. For more responsible and sustainable use, the regulation of digital technologies is therefore necessary today. Since governments are responding very slowly, as the example of the AI Act shows, companies need to take action themselves. In this context, one of the central questions for companies is: "Which digital technologies are relevant for manufacturing companies in terms of regulation?" To answer this question, this paper presents a quantitative Delphi study. The results of the Delphi study are presented and evaluated within the framework of a data analysis. In addition, it is discussed how to proceed with the results so that manufacturing companies can benefit from them. Furthermore, the paper contributes to the development of an AI platform in the German research project PAIRS by investigating the compliance relevance of artificial intelligence applications.