Independent procedures used together to check that a product, service, or system meets its requirements and specifications and fulfills its intended purpose.
Errors in cyber-physical systems can lead to disastrous consequences. Classic examples date back to the Therac-25 radiation overdoses of 1985-1987 and the Ariane 5 rocket failure in 1996. More recently, Toyota's unintended-acceleration bug was traced to software errors, and certain cars were found vulnerable to attacks that take over key parts of the control software, allowing attackers even to disable the brakes remotely. Pacemakers have also been found vulnerable to attacks that can have deadly consequences for the patient. To reduce the chance of such errors, this project investigates the application of a technique called Foundational Verification to cyber-physical systems. In Foundational Verification, the system being developed is proved correct, in full formal detail, using a proof assistant. The main intellectual merit of the proposal is the attainment of previously unattainable levels of safety for cyber-physical systems, because proofs in Foundational Verification are carried out in complete detail. To ensure that the techniques developed in this project are practical, they are evaluated in the context of a real flying quadcopter. The project's broader significance is the improved correctness, safety, and security of cyber-physical systems. In particular, the project lays the foundation for a new level of formal correctness for cyber-physical systems. Although the initial work focuses on quadcopters, the concepts, ideas, and research contributions have the potential for transformative impact on other kinds of systems, including power-grid software, cars, avionics, and medical devices (from pacemakers and insulin pumps to defibrillators and radiation machines).
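To give a flavor of what "proved correct, in full formal detail, using a proof assistant" means, here is a minimal sketch in Lean 4 (assuming a recent toolchain where the omega tactic is built in). The abstract does not name a proof assistant or a specific property; the clamp function, its bounds, and the theorem below are purely illustrative assumptions, not artifacts of the project.

-- Hypothetical actuator-saturation function, chosen only to illustrate the idea.
def clamp (lo hi x : Int) : Int :=
  if x < lo then lo
  else if hi < x then hi
  else x

-- Machine-checked guarantee: the clamped command never exceeds the upper bound.
-- The proof assistant rejects the proof unless every case is covered, which is
-- what "carried out in complete detail" refers to.
theorem clamp_le_hi (lo hi x : Int) (h : lo ≤ hi) : clamp lo hi x ≤ hi := by
  unfold clamp
  split
  · omega          -- case x < lo: the result is lo, and lo ≤ hi by assumption h
  · split
    · omega        -- case hi < x: the result is hi itself
    · omega        -- otherwise the result is x, with ¬(hi < x) in context

In a foundationally verified controller, properties of this kind would be stated and discharged for the actual control code rather than for a toy function, but the workflow (state the theorem, have the machine check every step) is the same.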
University of California-San Diego
National Science Foundation
Miroslav Krstic
Submitted by Sorin Lerner on September 23rd, 2016
Computation is everywhere. Greeting cards have processors that play songs. Fireworks have processors for precisely timing their detonation. Computers are in engines, monitoring combustion and performance. They are in our homes, hospitals, offices, ovens, planes, trains, and automobiles. These computers, when networked, will form the Internet of Things (IoT). The resulting applications and services have the potential to be even more transformative than the World Wide Web. The security implications are enormous. Internet threats today steal credit cards. Internet threats tomorrow will disable home security systems, flood fields, and disrupt hospitals. The root problem is that these applications span software on tiny low-power devices and cloud servers, involve difficult networking, and collect sensitive data that deserves strong cryptography, yet they are usually written by developers who have expertise in none of these areas. The goal of the research is to make it possible for two developers to build a complete, secure Internet of Things application in three months. The research focuses on four important principles. The first is "distributed model view controller." A developer writes an application as a distributed pipeline of model-view-controller systems. A model specifies what data the application generates and stores, while a new abstraction called a transform specifies how data moves from one model to another. The second is "embedded-gateway-cloud." A common architecture dominates Internet of Things applications: embedded devices communicate with a gateway over low-power wireless, and the gateway processes the data and communicates with cloud systems in the broader Internet. Focusing distributed model view controller on this dominant architecture constrains the problem sufficiently to make hard problems, such as system security, tractable. The third is "end-to-end security." Data emerges encrypted from embedded devices and can only be decrypted by end-user applications. Servers can compute on encrypted data, and many parties can collaboratively compute results without learning the inputs. Analysis of the data-processing pipeline allows the system and runtime to assert and verify security properties of the whole application. The final principle is "software-defined hardware." Because designing new embedded device hardware is time consuming, developers rely on general, overkill solutions and ignore the resulting security implications. The data-processing pipeline can be compiled into a prototype hardware design and supporting software, as well as test cases, diagnostics, and a debugging methodology for a developer to bring up the new device. These principles are grounded in Ravel, a software framework that the team collaborates on, jointly contributes to, and integrates into its courses and curricula on cyber-physical systems.
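As a rough illustration of the "model + transform" idea described above, here is a minimal, self-contained Python sketch. None of these class or function names come from Ravel's actual API; they are hypothetical stand-ins showing how a record defined by one model might be converted into a record of a downstream model as data moves along the embedded-gateway-cloud pipeline.

# Hypothetical sketch of models and a transform; not Ravel's API.
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Reading:
    """Model on the embedded tier: what data the device generates and stores."""
    sensor_id: str
    celsius: float

@dataclass
class DailySummary:
    """Model on the gateway/cloud tier: the aggregated view forwarded upstream."""
    sensor_id: str
    max_celsius: float

# A "transform" specifies how data moves from one model to another.
Transform = Callable[[List[Reading]], DailySummary]

def summarize(readings: List[Reading]) -> DailySummary:
    # Pure function over declared models: reduce raw readings to a summary record.
    return DailySummary(
        sensor_id=readings[0].sensor_id,
        max_celsius=max(r.celsius for r in readings),
    )

if __name__ == "__main__":
    # Records collected on the embedded tier; in the embedded-gateway-cloud
    # architecture this transform would run on the gateway before the summary
    # is forwarded (encrypted end to end) to the cloud.
    device_records = [Reading("node-7", 18.5), Reading("node-7", 24.1)]
    print(summarize(device_records))

Keeping each transform a declared, pure mapping between models is what would let a framework analyze the whole data-processing pipeline and check security properties end to end, as the abstract describes.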
Stanford University
National Science Foundation
Submitted by Philip Levis on September 23rd, 2016
Event
ANT-17
The 8th International Conference on Ambient Systems, Networks and Technologies (ANT-17). The goal of the ANT-2017 conference is to provide an international forum for scientists, engineers, and managers in academia, industry, and government to address recent research results and to present and discuss their ideas, theories, technologies, systems, tools, applications, work in progress, and experiences on all theoretical and practical issues arising in the ambient systems paradigm and in the infrastructures, models, and technologies that contribute to the advancement of ambient systems.
Submitted by Anonymous on September 15th, 2016
Event
DATE 2017
Design, Automation and Test in Europe (DATE 2017). DATE 2017 will take place from 27 to 31 March 2017 at the SwissTech Convention Centre in Lausanne, Switzerland.
Submitted by Anonymous on August 30th, 2016
Event
CRTS 2016
9th International Workshop on Compositional Theory and Technology for Real-Time Embedded Systems (CRTS 2016), co-located with RTSS 2016.
Submitted by Anonymous on August 19th, 2016
Exploring the Dimensions of Trustworthiness: Challenges and Opportunities. The National Institute of Standards and Technology (NIST) will host a workshop on Trustworthiness in Cyber-Physical Systems on August 30-31, 2016 at the NIST Gaithersburg, MD campus. Workshop information and registration: http://www.nist.gov/el/exploring-dimensions-trustworthiness.cfm
Submitted by Anonymous on August 19th, 2016
Event
AC16
Workshop on Approximate Computing (AC16). AC16 is part of ESWEEK 2016.
Submitted by Anonymous on August 19th, 2016
Event
ICPE 2017
8th ACM/SPEC International Conference on Performance Engineering (ICPE 2017), sponsored by ACM SIGMETRICS, SIGSOFT, and SPEC RG.
Submitted by Anonymous on July 6th, 2016
12th International Conference on Semantic Systems (SEMANTiCS 2016). The annual SEMANTiCS conference is the meeting place for professionals who make semantic computing work, who understand its benefits and encounter its limitations. Every year, SEMANTiCS attracts information managers, IT architects, software engineers, and researchers from organisations ranging from NPOs, through public administrations, to the largest companies in the world.
Submitted by Anonymous on July 6th, 2016
Event
WDES 2016
CALL FOR PAPERS - Submissions in July. Workshop on Dependability in Evolving Systems (WDES 2016), Cali, Colombia, 19-21 October 2016. Co-located with LADC 2016: Latin-American Symposium on Dependable Computing. http://rcl.dsi.unifi.it/wdes2016/ http://www.unicauca.edu.co/ladc2016/node/55
Submitted by Anonymous on July 6th, 2016