Applications of CPS technologies involving automated machines that can take the place of humans in dangerous environments or manufacturing processes, or that resemble humans in appearance, behavior, and/or cognition.
Human-in-the-loop control strategies in which the user performs a task better, and feels more confident doing so, are an important area of research for cyber-physical systems. Humans are very adept at learning to control complex systems, particularly those with non-intuitive kinematic constraints (e.g., cars, bicycles, wheelchairs, steerable needles). With the advent of cyber-physical systems (physical systems integrated with a cyber control layer), human control is no longer constrained to system inputs. Users can also control system outputs through a number of different teleoperation mappings. Given all this flexibility, what is the most intuitive way for a human user to control an arbitrary system, and how is intuitiveness quantified? The project focuses on human-in-the-loop control for medical needles, which steer with bicycle-like kinematics. These needles could be used in a variety of medical interventions including tissue biopsy, tumor ablation, abscess drainage, and local drug delivery. We have explored a variety of teleoperation mappings for human control of these steerable needles; yet, we have found inconsistencies between objective performance metrics (e.g., task time and error) and post-experimental surveys on comfort or ease of use. Users occasionally report a preference for control mappings that objectively degrade performance, and vice versa. It is important to measure the real-time engagement of the user with the physical system in order to capture the nuances of how different control mappings affect physical effort, mental workload, distraction, drowsiness, and emotional response. Physiological sensors such as electroencephalography (EEG), galvanic skin response (GSR), and electromyography (EMG) can be used to provide these real-time measurements and to quantitatively classify the intuitiveness of new teleoperation algorithms. Broader Impacts: Intuitive and natural human-in-the-loop control interfaces will improve human health and well-being through applications in surgery and rehabilitation. The results of this study will be disseminated publicly on the investigator's laboratory website, at a conference workshop, and through a new medical robotics seminar to be held jointly between UT Dallas and UT Southwestern Medical Center. Outreach activities, lab tours, and mentoring of underrepresented students at all levels will broaden participation in STEM. Additionally, the proximity of the investigator's hospital-based lab to medical professionals will engage non-engineers in design and innovation.
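For illustration only, the sketch below reduces the bicycle-like needle kinematics mentioned above to the plane and contrasts two hypothetical teleoperation mappings: one that drives the system inputs (insertion speed and curvature) directly, and one that lets the user specify a desired tip heading. All function names, gains, and parameter values are assumptions for this sketch, not the project's actual interfaces.

```python
import numpy as np

KAPPA_MAX = 0.05  # assumed maximum path curvature (1/mm)

def step(state, insertion_speed, curvature, dt=0.01):
    """Integrate the planar needle-tip pose (x, y, heading) over one time step."""
    x, y, theta = state
    x += insertion_speed * np.cos(theta) * dt
    y += insertion_speed * np.sin(theta) * dt
    theta += insertion_speed * curvature * dt
    return np.array([x, y, theta])

def input_space_mapping(joystick):
    """Map joystick axes directly to system inputs (insertion speed, curvature)."""
    speed = max(joystick[1], 0.0) * 10.0      # forward axis -> insertion speed (mm/s)
    curvature = joystick[0] * KAPPA_MAX       # lateral axis -> signed path curvature
    return speed, curvature

def output_space_mapping(joystick, state, gain=2.0):
    """Map the joystick to a desired tip heading; a simple controller picks the curvature."""
    desired = np.arctan2(joystick[1], joystick[0])
    error = np.arctan2(np.sin(desired - state[2]), np.cos(desired - state[2]))
    return 10.0, float(np.clip(gain * error, -KAPPA_MAX, KAPPA_MAX))

# Example: drive the tip for one second with the output-space mapping.
state = np.array([0.0, 0.0, 0.0])
for _ in range(100):
    speed, kappa = output_space_mapping([0.5, 0.5], state)
    state = step(state, speed, kappa)
print(state)
```

Comparing user performance across mappings like these, alongside physiological signals, is the kind of study design the abstract describes.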
University of Texas at Dallas
National Science Foundation
Ann Majewicz Submitted by Ann Majewicz on December 22nd, 2015
This CAREER project responds to an urgent need to develop mobile power distribution systems that lower deployment and operating costs while simultaneously increasing network efficiency and response in dynamic and often dangerous physical conditions. The significant need for an efficient and effective mobile power distribution system became evident during search and rescue/recovery missions following the Japan tsunami and the disappearance of the Malaysia MH370 airplane. The technology outcomes from this project will apply to a broad range of environments (in space, air, water, or on the ground) where the success of long-term robotic network missions is measured by the ability of the robots to operate, for an extended period of time, in highly dynamic and potentially hazardous environments. These advanced features will provide the following advantages: efficiency, efficacy, guaranteed persistence, enhanced performance, and increased success in search/rescue/recovery/discovery missions. Specifically, this project addresses the following technology problems as it translates from research discovery toward commercial application: the inflated energy use currently required when autonomous vehicles break from a mission to return to a recharging station, and the lack of multi-robot coordination that takes into account both the fundamental hardware and network science challenges necessary to respond to energy needs and dynamic environmental conditions. By addressing these gaps in technology, this work establishes the theoretical, computational, and experimental foundation for mobile power delivery and onsite recharging capability. Moreover, the new technology developed in this project is universally adaptable for disparate autonomous vehicles, especially autonomous underwater vehicles (AUVs). In more technical terms, this project creates network optimization and formation strategies that will enable a power distribution system to reconfigure itself depending on the number of operational autonomous vehicles and recharging specifications to meet overall mission specifications, the energy consumption needs of the network, situational conditions, and environmental variables. Such a system will play a vital role in real-time controlled applications across multiple disciplines such as sensor networks, robotics, and transportation systems, where limited power resources and unknown environmental dynamics pose major constraints. In addition to addressing technology gaps, undergraduate and graduate students will be involved in this research and will receive interdisciplinary education/innovation/technology translation/outreach experiences through: developing efficient network energy routing, path planning, and coordination strategies; designing and creating experimental test-beds and educational platforms; and engaging K-12 students in Science, Technology, Engineering and Math, including those from underrepresented groups. This project engages Michigan Tech's Great Lakes Research Center (GLRC) and Center for Agile Interconnected Microgrids (AIM) to develop experimental test-beds and conduct tests that validate the resulting methods and algorithms, and ultimately facilitate the technology translation effort from research discovery toward commercial reality.
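A minimal sketch of one ingredient such a system needs: assigning mobile chargers to vehicles that request energy so that total travel distance is minimized. The positions, cost metric, and use of the Hungarian algorithm are assumptions for illustration, not the project's network optimization method.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

chargers = np.array([[0.0, 0.0], [50.0, 10.0], [20.0, 40.0]])   # mobile charger positions (m)
auvs = np.array([[45.0, 5.0], [5.0, 35.0], [30.0, 30.0]])       # AUVs requesting energy (m)

# Cost matrix: distance each charger would travel to reach each AUV.
cost = np.linalg.norm(chargers[:, None, :] - auvs[None, :, :], axis=2)
row, col = linear_sum_assignment(cost)                           # Hungarian algorithm
for r, c in zip(row, col):
    print(f"charger {r} -> AUV {c} ({cost[r, c]:.1f} m)")
```

In the full problem described above, the assignment would also account for remaining energy, mission deadlines, and formation constraints rather than distance alone.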
Michigan Technological University
National Science Foundation
Nina Mahmoudian Submitted by Nina Mahmoudian on December 22nd, 2015
Brain-computer interfaces (BCIs) are cyber-physical systems (CPSs) that record human brain waves and translate them into the control commands for external devices such as computers and robots. They may allow individuals with spinal cord injury (SCI) to assume direct brain control of a lower extremity prosthesis to regain the ability to walk. Since lower extremity paralysis due to SCI leads to as much as $50 billion of health care cost each year in the US alone, the use of a BCI-controlled lower extremity prosthesis to restore walking can have a significant public health impact. Recent results have demonstrated that a person with paraplegia due to SCI can use a non-invasive BCI to regain basic walking. While encouraging, this BCI is unlikely to become a widely adopted solution since the poor signal quality of non-invasively recorded brain waves may lead to unreliable BCI operation. Moreover, lengthy and tedious mounting procedures of the non-invasive BCI systems are impractical. A permanently implantable BCI CPS can address these issues, but critical challenges must be overcome to achieve this goal, including the elimination of protruding electronics and reliance on an external computer for brain signal processing. The goal of this study is to develop a benchtop version of a fully implantable BCI CPS, capable of acquiring electrocorticogram signals, recorded directly from the surface of the brain, and analyzing them internally to enable direct brain control of a robotic gait exoskeleton (RGE) for walking. The BCI CPS will be designed as a low-power system with revolutionary adaptive power management in order to meet stringent heat and power consumption constraints for future human implantation. Comprehensive measurements and benchtop tests will ensure proper function of the BCI CPS. Finally, the system will be integrated with an RGE, and its ability to facilitate brain-controlled walking will be tested in a small group of human subjects. The successful completion of this project will have broad bioengineering and scientific impact. It will revolutionize medical device technology by minimizing power consumption and heat production while enabling complex operations to be performed. The study will also help deepen the understanding of how the human brain controls walking, which has long been a mystery to neuroscientists. Finally, this study's broader impact is to promote education and lifelong learning in engineering students and the community, broaden the participation of underrepresented groups in engineering, and increase the scientific literacy of persons with disabilities. Research opportunities will be provided to undergraduate and graduate students. Their findings will be broadly disseminated and integrated into teaching activities. To inspire underrepresented K-12 and community college students to pursue higher education in STEM fields, and to increase the scientific literacy of persons with disabilities, outreach activities will be undertaken in the form of live scientific exhibits and actual BCI demonstrations. Recent results have demonstrated that a person with paraplegia due to SCI can use an electroencephalogram (EEG)-based BCI to regain basic walking. While encouraging, this EEG-based BCI is unlikely to become a widely adopted solution due to EEG's inherent noise and susceptibility to artifacts, which may lead to unreliable operation. Also, lengthy and tedious EEG (un-)mounting procedures are impractical.
A permanently implantable BCI CPS can address these issues, but critical CPS challenges must be overcome to achieve this goal, including the elimination of protruding electronics and reliance on an external computer for neural signal processing. The goal of this study is to implement a benchtop analogue of a fully implantable BCI CPS, capable of acquiring high-density (HD) electrocorticogram (ECoG) signals, and analyzing them internally to facilitate direct brain control of a robotic gait exoskeleton (RGE) for walking. The BCI CPS will be designed as a low-power modular system with revolutionary adaptive power management in order to meet stringent heat dissipation and power consumption constraints for future human implantation. The first module will be used for acquisition of HD-ECoG signals. The second module will internally execute optimized BCI algorithms and wirelessly transmit commands to an RGE for walking. System and circuit-level characterizations will be conducted through comprehensive measurements. Benchtop tests will ensure the proper system function and conformity to biomedical constraints. Finally, the system will be integrated with an RGE, and its ability to facilitate brain-controlled walking will be tested in a group of human subjects. The successful completion of this project will have broad bioengineering and scientific impact. It will revolutionize medical device technology by minimizing power consumption and heat dissipation while enabling complex algorithms to be executed in real time. The study will also help deepen the physiological understanding of how the human brain controls walking. This study will promote education and lifelong learning in engineering students and the community, broaden the participation of underrepresented groups in engineering, and increase the scientific literacy of persons with disabilities. Research opportunities will be provided to undergraduate students. Their findings will be broadly disseminated and integrated into teaching activities. To inspire underrepresented K-12 and community college students to pursue higher education in STEM fields, and to increase the scientific literacy of persons with disabilities, outreach activities will be undertaken in the form of live scientific exhibits and actual BCI demonstrations.
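As a rough illustration of the kind of on-board processing an implanted module must perform within its power budget, the sketch below decodes a single (simulated) channel by estimating band power and thresholding it into a binary walk/idle command. The sampling rate, frequency band, threshold, and decoding rule are assumptions for this sketch, not the project's algorithms.

```python
import numpy as np

FS = 512             # assumed sampling rate (Hz)
BAND = (8.0, 12.0)   # assumed control band (Hz)

def band_power(window, fs=FS, band=BAND):
    """Power of one channel within a frequency band, estimated with the FFT."""
    spectrum = np.abs(np.fft.rfft(window * np.hanning(len(window)))) ** 2
    freqs = np.fft.rfftfreq(len(window), d=1.0 / fs)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return float(spectrum[mask].sum())

def decode(window, threshold):
    """Map the band power of a one-second window to a discrete exoskeleton command.
    Suppressed band power is taken here to indicate walking intent."""
    return "WALK" if band_power(window) < threshold else "IDLE"

# Synthetic check: a strong 10 Hz rhythm (idling) versus low-amplitude broadband noise (intent).
rng = np.random.default_rng(0)
t = np.arange(FS) / FS
idle_window = np.sin(2 * np.pi * 10.0 * t) + 0.1 * rng.standard_normal(FS)
walk_window = 0.1 * rng.standard_normal(FS)
print(decode(idle_window, threshold=100.0), decode(walk_window, threshold=100.0))
```

An actual implant would run a far richer classifier over many HD-ECoG channels, but even this toy pipeline shows why per-sample arithmetic and memory traffic drive the heat and power constraints discussed above.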
University of California at Irvine
National Science Foundation
Payam Heydari Submitted by Payam Heydari on December 22nd, 2015
Advances in technology mean that computer-controlled physical devices that currently still require human operators, such as automobiles, trains, airplanes, and medical treatment systems, could operate entirely autonomously and make rational decisions on their own. Autonomous cars and drones are a concrete and highly publicized face of this dream. Before this dream can be realized, we must address the need for safety - the guaranteed absence of undesirable behaviors emerging from autonomy. Highly publicized technology accidents such as rocket launch failures, uncontrolled exposure to radiation during treatment, aircraft automation failures, and unintended automotive accelerations serve as warnings of what can happen if safety is not adequately addressed in the design of such cyber-physical systems. One approach for safety analysis is the use of software tools that apply formal logic to prove the absence of undesired behavior in the control software of a system. In prior work, this approach has been proven to work for simple controller software that is generated automatically by tools from abstract models like Simulink diagrams. However, autonomous decision making requires more complex software that is able to solve optimization problems in real time. Formal verification of control software that includes such optimization algorithms remains an unmet challenge. The project SORTIES (Semantics of Optimization for Real Time Intelligent Embedded Systems) draws upon expertise in optimization theory, control theory, and computer science to address this challenge. Beginning with the convergence properties of convex optimization algorithms, SORTIES examines how these properties can be automatically expressed as inductive invariants for the software implementation of the algorithms, and then incorporates these properties inside the source code itself as formal annotations which convey the underlying reasoning to the software engineer and to existing computer-aided verification tools. The SORTIES goal is an open-source, semantics-carrying autocoder, which takes an optimization algorithm and its convergence properties as input, and produces annotated, verifiable code as output. The demonstration of the tool on several examples, such as a Mars lander, an aircraft avionics system, and a jet engine controller, shows that the evidence of quality produced by annotations is fully compatible with its application to truly functional products. Project research is integrated with education through training of "tri-lingual" professionals, who are equally conversant in system operation, program analysis, and the theory of control and optimization.
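To make the idea of a convergence property serving as an inductive invariant concrete, here is a toy sketch: gradient descent on a strongly convex quadratic, with its contraction property written as runtime-checked assertions that stand in for the formal annotations an autocoder would emit. This is not the SORTIES toolchain; the objective, step size, and tolerance are assumptions.

```python
import numpy as np

Q = np.array([[2.0, 0.0], [0.0, 4.0]])   # positive definite => strongly convex objective
b = np.array([1.0, -2.0])

def solve(x0, step=0.2, iters=50):
    x_star = np.linalg.solve(Q, b)                              # minimizer, referenced by the invariant
    rho = np.max(np.abs(1.0 - step * np.linalg.eigvalsh(Q)))    # contraction factor, here 0.6 < 1
    x = x0
    for _ in range(iters):
        dist_before = np.linalg.norm(x - x_star)
        x = x - step * (Q @ x - b)                              # gradient step
        # Inductive invariant: the distance to the minimizer contracts by at least rho each
        # iteration (checked at runtime here; proved statically and annotated in the project).
        assert np.linalg.norm(x - x_star) <= rho * dist_before + 1e-12
    return x

print(solve(np.array([5.0, 5.0])))   # converges to the minimizer [0.5, -0.5]
```

The point of the annotation-carrying approach is that the inequality asserted inside the loop becomes machine-checkable documentation of why the real-time optimizer is safe to embed in a controller.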
University of Colorado at Boulder
National Science Foundation
Submitted by John Hauser on December 22nd, 2015
Tracking Fish Movement with a School of Gliding Robotic Fish
This project is focused on developing the technology for continuously tracking the movement of live fish implanted with acoustic tags, using a network of relatively inexpensive underwater robots called gliding robotic fish. The research addresses two fundamental challenges in the system design: (1) accommodating significant uncertainties due to environmental disturbances, communication delays, and apparent randomness in fish movement, and (2) balancing competing objectives (for example, accurate tracking versus long lifetime for the robotic network) while meeting multiple constraints on onboard computing, communication, and power resources. Fish movement data provide insight into choice of habitats, migratory routes, and spawning behavior. By advancing the state of the art in fish tracking technology, this project enables better-informed decisions for fishery management and conservation, including control of invasive species, restoration of native species, and stock assessment for high-value species, and ultimately contributes to the sustainability of fisheries and aquatic ecosystems. By advancing the coordination and control of gliding robotic fish networks and enabling their operation in challenging environments such as the Great Lakes, the project also facilitates the practical adoption of these robotic systems for a myriad of other applications in environmental monitoring, port surveillance, and underwater structure inspection. The project enhances several graduate courses at Michigan State University, and provides unique interdisciplinary training opportunities for students, including those from underrepresented groups. Outreach activities, including robotic fish demos, museum exhibits, teacher training, and a "Follow That Fish" smartphone app, are specifically designed to pique the interest of pre-college students in science and engineering. The goal of this project is to create an integrative framework for the design of coupled robotic and biological systems that accommodates system uncertainties and competing objectives in a rigorous and holistic manner. This goal is realized through the pursuit of five tightly coupled research objectives associated with the application of tracking and modeling fish movement: (1) developing new robotic platforms to enable underwater communication and acoustic tag detection, (2) developing robust algorithms with analytical performance assurance to localize tagged fish based on time-of-arrival differences among multiple robots, (3) designing hidden Markov models and online model adaptation algorithms to capture fish movement effectively and efficiently, (4) exploring a two-tier decision architecture for the robots to accomplish fish tracking, which incorporates model predictions of fish movement, energy consumption, and mobility constraints, and (5) experimentally evaluating the design framework, first in an inland lake for localizing or tracking stationary and moving tags, and then in Thunder Bay, Lake Huron, for tracking and modeling the movement of lake trout during spawning. This project offers fundamental insight into the design of robust robotic-physical-biological systems that addresses the challenges of system uncertainties and competing objectives. First, a feedback paradigm is presented for tight interactions between the robotic and biological components, to facilitate the refinement of biological knowledge and robotic strategies in the presence of uncertainties.
Second, tools from estimation and control theory (e.g., Cramer-Rao bounds) are exploited in novel ways to analyze the performance limits of fish tracking algorithms, and to guide the design of optimal or near-optimal tradeoffs to meet multiple competing objectives while accommodating onboard resource constraints. On the biology side, continuous, dynamic tracking of tagged fish with robotic networks represents a significant step forward in acoustic telemetry, and results in novel datasets and models for advancing fish movement ecology.
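A simplified sketch of the time-of-arrival-difference localization described above: a tag position is estimated from arrival-time differences measured at several robots via nonlinear least squares. The sound speed, robot geometry, and noise-free measurements are assumptions for illustration and are not the project's algorithms.

```python
import numpy as np
from scipy.optimize import least_squares

C = 1500.0                                                        # assumed sound speed in water (m/s)
robots = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0], [100.0, 100.0]])   # robot positions (m)
tag_true = np.array([60.0, 35.0])                                 # ground truth, used only to simulate data

# Simulated measurements: arrival-time differences relative to robot 0.
ranges = np.linalg.norm(robots - tag_true, axis=1)
tdoa = (ranges - ranges[0]) / C

def residual(p):
    r = np.linalg.norm(robots - p, axis=1)
    return (r - r[0]) / C - tdoa

estimate = least_squares(residual, x0=np.array([50.0, 50.0])).x
print(estimate)                                                   # close to tag_true
```

With noisy arrival times, the covariance of such an estimator depends strongly on the robots' geometry relative to the tag, which is exactly where Cramer-Rao-type bounds guide how the school should position itself.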
Michigan State University
National Science Foundation
Guoliang Xing
Charles Krueger
Submitted by Xiaobo Tan on December 22nd, 2015
Designing semi-autonomous networks of miniature robots for inspection of bridges and other large civil infrastructure
According to the U.S. Department of Transportation, the United States has 605,102 bridges, of which 64% are 30 years or older and 11% are structurally deficient. Visual inspection is a standard procedure to identify structural flaws, possibly predict the imminent collapse of a bridge, and determine effective precautionary measures and repairs. Experts who carry out this difficult task must travel to the location of the bridge and spend many hours assessing the integrity of the structure. The proposal is to establish (i) new design and performance analysis principles and (ii) technologies for creating a self-organizing network of small robots to aid visual inspection of bridges and other large civilian infrastructure. The main idea is to use such a network to aid the experts in remotely and routinely inspecting complex structures, such as the typical girder assemblage that supports the decks of a suspension bridge. The robots will use wireless information exchange to autonomously coordinate and cooperate in the inspection of pre-specified portions of a bridge. At the end of the task, or whenever possible, they will report images as well as other key measurements back to the experts for further evaluation. Common systems to aid visual inspection rely either on stationary cameras with restricted fields of view, or on tethered ground vehicles. Unmanned aerial vehicles cannot access constricted spaces and must be tethered due to power requirements and the need for uninterrupted communication to support the continual safety-critical supervision by one or more operators. In contrast, the system proposed here would be able to access tight spaces, operate under any weather, and execute tasks autonomously over long periods of time. The fact that the proposed framework allows remote expert supervision will reduce cost and time between inspections. The added flexibility as well as the increased regularity and longevity of the deployments will improve the detection and diagnosis of problems, which will increase safety and support effective preventive maintenance. This project will be carried out by a multidisciplinary team specialized in diverse areas of cyber-physical systems and robotics, such as locomotion, network science, modeling, control systems, hardware sensor design, and optimization. It involves collaboration between faculty from the University of Maryland (UMD) and Resensys, which specializes in remote bridge monitoring. The proposed system will be tested in collaboration with the Maryland State Highway Administration, which will also provide feedback and expertise throughout the project. This project includes concrete plans to involve undergraduate students throughout its duration. The investigators, who have an established record of STEM outreach and education, will also leverage existing programs and resources at the Maryland Robotics Center to support this initiative and carry out outreach activities. In order to make student participation more productive and educational, the structure of the proposed system conforms to a hardware architecture adopted at UMD and many other schools for the teaching of undergraduate courses relevant to cyber-physical systems and robotics. This grant will support research on fundamental principles and design of robotic and cyber-physical systems.
It will focus on algorithm design for control and coordination, network science, performance evaluation, microfabrication, and system integration to address the following challenges: (i) Devise new locomotion and adhesion principles to support mobility within steel and concrete girder structures. (ii) Investigate the design of location estimators, omniscience and coordination algorithms that are provably optimal, subject to power and computational constraints. (iii) Develop methods to design and analyze the performance of energy-efficient communication protocols to support robot coordination and localization in the presence of the severe propagation barriers caused by the metal and concrete structures of a bridge.
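A back-of-the-envelope sketch of the power and duty-cycle trade-off such coordination and communication designs must respect: the longer the radio is on and the more the robot moves, the shorter the deployment. All battery and power figures below are assumed values for illustration, not measurements from the project.

```python
BATTERY_WH = 10.0     # assumed battery capacity (Wh)
P_MOVE = 2.0          # assumed average drive power while moving (W)
P_RADIO = 0.8         # assumed radio power while transmitting (W)
P_IDLE = 0.1          # assumed sleep/idle power (W)

def endurance_hours(radio_duty_cycle, moving_fraction):
    """Rough mission endurance for given radio and locomotion usage fractions."""
    avg_power = (moving_fraction * P_MOVE
                 + radio_duty_cycle * P_RADIO
                 + (1.0 - moving_fraction) * P_IDLE)
    return BATTERY_WH / avg_power

# Example: a robot that moves 30% of the time and transmits 5% of the time.
print(f"{endurance_hours(radio_duty_cycle=0.05, moving_fraction=0.30):.1f} hours")
```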
University of Maryland College Park
National Science Foundation
Nuno Martins Submitted by Nuno Martins on December 22nd, 2015
Advances in technology mean that computer-controlled physical devices that currently still require human operators, such as automobiles, trains, airplanes, and medical treatment systems, could operate entirely autonomously and make rational decisions on their own. Autonomous cars and drones are a concrete and highly publicized face of this dream. Before this dream can be realized, we must address the need for safety - the guaranteed absence of undesirable behaviors emerging from autonomy. Highly publicized technology accidents such as rocket launch failures, uncontrolled exposure to radiation during treatment, aircraft automation failures, and unintended automotive accelerations serve as warnings of what can happen if safety is not adequately addressed in the design of such cyber-physical systems. One approach for safety analysis is the use of software tools that apply formal logic to prove the absence of undesired behavior in the control software of a system. In prior work, this approach has been proven to work for simple controller software that is generated automatically by tools from abstract models like Simulink diagrams. However, autonomous decision making requires more complex software that is able to solve optimization problems in real time. Formal verification of control software that includes such optimization algorithms remains an unmet challenge. The project SORTIES (Semantics of Optimization for Real Time Intelligent Embedded Systems) draws upon expertise in optimization theory, control theory, and computer science to address this challenge. Beginning with the convergence properties of convex optimization algorithms, SORTIES examines how these properties can be automatically expressed as inductive invariants for the software implementation of the algorithms, and then incorporates these properties inside the source code itself as formal annotations which convey the underlying reasoning to the software engineer and to existing computer-aided verification tools. The SORTIES goal is an open-source, semantics-carrying autocoder, which takes an optimization algorithm and its convergence properties as input, and produces annotated, verifiable code as output. The demonstration of the tool on several examples, such as a Mars lander, an aircraft avionics system, and a jet engine controller, shows that the evidence of quality produced by annotations is fully compatible with its application to truly functional products. Project research is integrated with education through training of "tri-lingual" professionals, who are equally conversant in system operation, program analysis, and the theory of control and optimization.
Georgia Tech Research Corporation
National Science Foundation
Eric Feron Submitted by Eric Feron on December 22nd, 2015
Recent developments in nanotechnology and synthetic biology have enabled a new direction in biological engineering: synthesis of collective behaviors and spatio-temporal patterns in multi-cellular bacterial and mammalian systems. This will have a dramatic impact in such areas as amorphous computing, nano-fabrication, and, in particular, tissue engineering, where patterns can be used to differentiate stem cells into tissues and organs. While recent technologies such as tissue- and organoid-on-a-chip have the potential to produce a paradigm shift in tissue engineering and drug development, the synthesis of user-specified, emergent behaviors in cell populations is a key step to unlock this potential and remains a challenging, unsolved problem. This project brings together synthetic biology and micron-scale mobile robotics to define the basis of a next-generation cyber-physical system (CPS) called biological CPS (bioCPS). Synthetic gene circuits for decision making and local communication among the cells are automatically synthesized using a Bio-Design Automation (BDA) workflow. A Robot Assistant for Communication, Sensing, and Control in Cellular Networks (RA), which is designed and built as part of this project, is used to generate desired patterns in networks of engineered cells. In RA, the engineered cells interact with a set of micro-robots that implement control, sensing, and long-range communication strategies needed to achieve the desired global behavior. The micro-robots include both living and non-living matter (engineered cells attached to inorganic substrates that can be controlled using externally applied fields). This technology is applied to test the formation of various patterns in living cells. The project has a rich education and outreach plan, which includes nationwide activities for CPS education of high-school students, lab tours and competitions for high-school and undergraduate students, workshops, seminars, and courses for graduate students, as well as specific initiatives for under-represented groups. Central to the project is the development of theory and computational tools that will significantly advance the state of the art in CPS at large. A novel, formal methods approach is proposed for synthesis of emergent, global behaviors in large collections of locally interacting agents. In particular, a new logic whose formulas can be efficiently learned from quad-tree representations of partitioned images is developed. The quantitative semantics of the logic maps the synthesis of local control and communication protocols to an optimization problem. The project contributes to the nascent area of temporal logic inference by developing a machine learning method to learn temporal logic classifiers from large amounts of data. Novel abstraction and verification techniques for stochastic dynamical systems are defined and used to verify the correctness of the gene circuits in the BDA workflow.
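As a concrete illustration of the quad-tree image representation mentioned above, the minimal sketch below recursively partitions a binary "pattern" image into uniform quadrants. The data structure is an assumed simplification; the spatial-logic learning that the project performs over such trees is omitted.

```python
import numpy as np

def quadtree(img):
    """Return a uniform leaf value, or a dict of four recursively built quadrants."""
    if img.min() == img.max() or img.shape[0] == 1:
        return int(img[0, 0])
    h = img.shape[0] // 2
    return {"NW": quadtree(img[:h, :h]), "NE": quadtree(img[:h, h:]),
            "SW": quadtree(img[h:, :h]), "SE": quadtree(img[h:, h:])}

# A toy 8x8 binary pattern with a square "colony" of engineered cells in the middle.
pattern = np.zeros((8, 8), dtype=int)
pattern[2:6, 2:6] = 1
print(quadtree(pattern))
```

Formulas of the logic described above are learned over trees like this one, so that "the desired pattern holds" becomes a quantitative score that local control and communication protocols can optimize.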
University of Pennsylvania
National Science Foundation
Submitted by Vijay Kumar on December 22nd, 2015
Recent developments in nanotechnology and synthetic biology have enabled a new direction in biological engineering: synthesis of collective behaviors and spatio-temporal patterns in multi-cellular bacterial and mammalian systems. This will have a dramatic impact in such areas as amorphous computing, nano-fabrication, and, in particular, tissue engineering, where patterns can be used to differentiate stem cells into tissues and organs. While recent technologies such as tissue- and organoid-on-a-chip have the potential to produce a paradigm shift in tissue engineering and drug development, the synthesis of user-specified, emergent behaviors in cell populations is a key step to unlock this potential and remains a challenging, unsolved problem. This project brings together synthetic biology and micron-scale mobile robotics to define the basis of a next-generation cyber-physical system (CPS) called biological CPS (bioCPS). Synthetic gene circuits for decision making and local communication among the cells are automatically synthesized using a Bio-Design Automation (BDA) workflow. A Robot Assistant for Communication, Sensing, and Control in Cellular Networks (RA), which is designed and built as part of this project, is used to generate desired patterns in networks of engineered cells. In RA, the engineered cells interact with a set of micro-robots that implement control, sensing, and long-range communication strategies needed to achieve the desired global behavior. The micro-robots include both living and non-living matter (engineered cells attached to inorganic substrates that can be controlled using externally applied fields). This technology is applied to test the formation of various patterns in living cells. The project has a rich education and outreach plan, which includes nationwide activities for CPS education of high-school students, lab tours and competitions for high-school and undergraduate students, workshops, seminars, and courses for graduate students, as well as specific initiatives for under-represented groups. Central to the project is the development of theory and computational tools that will significantly advance the state of the art in CPS at large. A novel, formal methods approach is proposed for synthesis of emergent, global behaviors in large collections of locally interacting agents. In particular, a new logic whose formulas can be efficiently learned from quad-tree representations of partitioned images is developed. The quantitative semantics of the logic maps the synthesis of local control and communication protocols to an optimization problem. The project contributes to the nascent area of temporal logic inference by developing a machine learning method to learn temporal logic classifiers from large amounts of data. Novel abstraction and verification techniques for stochastic dynamical systems are defined and used to verify the correctness of the gene circuits in the BDA workflow.
Trustees of Boston University
National Science Foundation
Calin Belta Submitted by Calin Belta on December 22nd, 2015
The project focuses on swarming cyber-physical systems (swarming CPS), consisting of a collection of mobile networked agents, each of which has sensing, computing, communication, and locomotion capabilities; such systems have a wide range of civilian and military applications. Different from conventional static CPS, swarming CPS rely on mobile computing entities, e.g., robots, which collaboratively interact with phenomena of interest at different physical locations. This unique feature calls for novel sensing-motion co-design solutions to accomplish a variety of increasingly complex missions. Towards this, the overall research objective of this project is to establish and demonstrate a generic motion-sensing co-design procedure that will significantly reduce the complexity of mission design for swarming CPS, and greatly facilitate the development of effective, efficient, and adaptive control and sensing strategies under various environmental uncertainties. This project aims to offer comprehensive scientific understanding of the dynamic nature of swarming CPS, contribute to generic engineering principles for designing collaborative control and sensing algorithms, and advance the enabling technologies for practically applying CPS in challenging environments. The research solutions of this project aim to bring significant advances in environmental sustainability, homeland security, and human well-being. The project provides unique interdisciplinary training opportunities for graduate and undergraduate students through both research work and related courses that the PIs will develop and offer. The project significantly advances the state of the art in cooperative control and sensing and provides an enabling technology for swarming CPS through five highly interrelated thrusts: (1) a generic sensing and motion co-design procedure, which reveals the fundamental interplay between the sensing dynamics and motion dynamics of swarming CPS, will be proposed to facilitate the development of effective and efficient control and sensing strategies; (2) by following such a co-design procedure, provably correct, computationally efficient, and communication-light control and sensing strategies will be developed for swarming CPS with constrained resources to accomplish specific missions, e.g., locating pollutants in an unknown field while navigating through uncertain spaces; (3) to provide an enabling mobile platform to verify the proposed strategies, innovative small, highly 3D-maneuverable, noiseless, energy-efficient, and robust robotic fish fully actuated by smart materials will be designed to meet the maneuvering requirements of the proposed algorithms; (4) novel Magnetic Induction (MI)-based underwater communication and localization solutions will be developed, which allow robotic fish to exchange messages in a timely and reliable fashion while simultaneously providing accurate inter-fish localization in the harsh 3D underwater environment; and (5) the proposed sensing-motion co-design strategies will be verified and demonstrated using a school of wirelessly interconnected robotic fish in both lab-based experiments and field experiments.
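A hedged illustration of sensing-motion co-design in its simplest form: a small swarm estimates the local gradient of an unknown scalar field (for example, a pollutant concentration) from its members' point samples and steps toward the source. The field, formation, and step size are synthetic assumptions for this sketch, not the project's strategies.

```python
import numpy as np

def field(p):
    """Unknown scalar field (e.g., pollutant concentration); robots can only sample it."""
    source = np.array([8.0, 3.0])
    return np.exp(-0.05 * np.sum((p - source) ** 2, axis=-1))

positions = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])  # four robots in a square

for _ in range(40):
    samples = field(positions)
    centroid = positions.mean(axis=0)
    # Fit a plane c + g.(p - centroid) to the samples; g estimates the local gradient.
    A = np.hstack([np.ones((len(positions), 1)), positions - centroid])
    coeffs, *_ = np.linalg.lstsq(A, samples, rcond=None)
    direction = coeffs[1:] / (np.linalg.norm(coeffs[1:]) + 1e-9)
    positions = positions + 1.0 * direction                # fixed-length step up the estimated gradient

print(positions.mean(axis=0))                              # centroid ends near the source at (8, 3)
```

The co-design question posed above is how formation geometry (which shapes the gradient estimate) and motion strategy (which spends energy and communication) should be chosen together, rather than separately as in this toy example.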
Wichita State University
National Science Foundation
Submitted by Pu Wang on December 22nd, 2015