Explainable Software for Cyber-Physical Systems
Date: January 6 – 11, 2019
Location: Wadern, Germany
January 6 – 11, 2019, GI-Dagstuhl Seminar 19023
Explainable Software for Cyber-Physical Systems
Organizers
- Joel Greenyer (Leibniz Universität Hannover, DE)
- Malte Lochau (TU Darmstadt, DE)
- Thomas Vogel (HU Berlin, DE)
Motivation
Collaborating, autonomously driving cars, smart grids, and modern systems in industry (Industry 4.0) or health care are examples of communicating embedded systems in which software enables increasingly advanced functionality. These novel kinds of (software) systems are frequently summarized under the term cyber-physical systems (CPS). CPS are usually described by three essential characteristics: they perform complex computations, they conduct control tasks involving discrete and continuous data and signal processing, and they are (parts of) distributed, and even mobile, communication systems.
As a consequence, CPS are becoming more and more complex, for several reasons: (1) they control increasingly complex processes in the physical world, (2) their software is distributed, concurrent, and hybrid in nature, (3) their system topologies change (e.g., cars move, smart factory components are reconfigured, suppliers and consumers join or leave electrical grids), and (4) the systems often learn and continuously adapt themselves to ever-changing contexts and environmental conditions.
This increasing complexity poses challenges throughout all software development and analysis phases, but also during a system's operation and maintenance. In particular, it becomes increasingly difficult for system and software engineers, but also for users, insurers, lawyers, auditors, etc., to comprehend the behavior of a system, especially when the software relies more and more on learning and self-adaptive functionality. Why did the system respond in a certain way? How will the system react to certain inputs? How can the system achieve a certain goal? What are the reasons for an observed failure of the system, and how can this behavior be reproduced? Being able to answer these questions is important, especially for safety-critical systems, so that (1) during development, engineers can effectively ensure the quality of the system, and (2) during operation, users can develop trust in the reliability of their systems. Furthermore, in the case of accidents, lawyers and insurers must be able to analyze the system in order to determine the cause of the failure and who can be held responsible.
It will thus be increasingly relevant for future CPS to explain their behavior (past, current, and future behavior, why a certain action was taken, how a certain goal can be achieved) to users and other stakeholders, such as lawyers and insurers, in a graspable and dependable way. To this end, it will be pivotal for the different application domains of CPS and their respective engineering tools and techniques to be able to infer, update, document, and provide such explanations during different stages of system development and the system's operation. We use the term explainability to describe the capability of both the system and its engineering tools to explain certain aspects of interest about the system, in both a human-comprehensible and a machine-processable format. In order to increase the explainability of current and future CPS and their engineering tools, fundamental, interdisciplinary research is required; solutions from multiple disciplines within software engineering, systems engineering, and related fields may have to be applied and combined, for example:
- Model-based Development
- Requirements Engineering
- Formal Methods
- Dependable Systems Engineering
- Control Engineering
- Testing
- Simulation
- Visualization
- Product-Line Engineering
- Software Architecture
- Self-Adaptive Systems
- Human-Computer Interaction
- Machine Learning
- Organic Computing
- Areas of application:
  - Robotics
  - Automotive Systems
  - Health Care Systems
We observe that research related to Explainable Software for Cyber-Physical Systems (ES4CPS) is indeed being conducted in these different communities, but this research is currently only weakly linked, and there are no venues for interdisciplinary coordination of research activities focused on ES4CPS.
Goal
The goal of this GI-Dagstuhl Seminar is to serve as a starting point for an interdisciplinary coordination of research activities targeting ES4CPS. The seminar shall serve as an incubator of a new research community around this topic. From this main goal, we derive the following sub-goals for the seminar:
- Build a network of young researchers working on the topic of ES4CPS.
- Facilitate cross-disciplinary insights into research activities.
- Support researchers in identifying common goals, intersections, and connection points in their research.
- Foster cooperation among researchers, leading to joint papers or project proposals.
The seminar is also part of an effort to focus the software engineering community, and adjacent fields, on the topic of explainable software.
Format
The 5-day seminar includes the following components:
- sessions that activate cross-disciplinary knowledge exchange, and
- focused group working sessions, supported by
  - position talks or poster presentations given by attendees, and
  - two or three keynotes from invited senior researchers.
Attendees of this GI-Dagstuhl seminar are young researchers who are either already working on explainable software or cyber-physical systems, or whose recent work can provide valuable contributions to the open challenges in the emerging field of ES4CPS.