The goal of this project is to investigate a low-cost, energy-efficient hardware and software system that closes the loop between sensor data processing, semantically high-level detection, and real-time trajectory generation. To safely integrate Unmanned Aerial Vehicles (UAVs) into the national airspace, there is an urgent need to develop onboard sense-and-avoid capability. While deep neural networks (DNNs) have significantly improved the accuracy of object detection and decision making, their complexity is prohibitively high for implementation on small UAVs. Moreover, existing UAV flight control approaches ignore the nonlinearities of UAVs and do not provide trajectory assurance. The research thrusts of this project are: (i) FPGA implementation of DNNs: both fully connected and convolutional layers of deep (convolutional) neural networks will be trained using (block-)circulant weight matrices and implemented using custom-designed universal Fast Fourier Transform (FFT) kernels on FPGA. This research thrust will enable efficient implementation of DNNs, reducing memory and computation complexity from O(N^2) to O(N) and O(N log N), respectively; (ii) autonomous detection and perception for onboard sense-and-avoid: existing regional detection neural networks will be extended to work with images taken from different angles and with multi-modal sensor inputs; (iii) real-time waypoint and trajectory generation: an integrated trajectory generation and feedback control scheme for steering under-actuated vehicles through desired waypoints in 3D space will be developed. For efficient implementation and hardware reuse, both the detection and control problems will be formulated and solved using DNNs with (block-)circulant weight matrices. Deep reinforcement learning models will be investigated for waypoint generation and for assigning artificial potentials around obstacles to guarantee a safe distance. The fundamental research results will enable onboard computing and real-time detection and control, which are cornerstones of autonomous, next-generation UAVs.
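The claimed complexity reduction comes from the fact that a circulant matrix is fully determined by a single length-N vector, and that multiplying it with an input vector is a circular convolution, which the FFT computes in O(N log N). Below is a minimal NumPy sketch of this identity; the sizes, random data, and the single-block (rather than block-circulant) setting are illustrative assumptions, not the project's actual FPGA implementation.

```python
import numpy as np

# Minimal sketch: a circulant weight "matrix" is stored as one length-N vector w
# (O(N) memory instead of O(N^2)), and W @ x is computed as a circular
# convolution via the FFT (O(N log N) instead of O(N^2)).
# Sizes and data here are illustrative assumptions, not values from the project.
n = 8
rng = np.random.default_rng(0)
w = rng.standard_normal(n)   # defining vector of the circulant block
x = rng.standard_normal(n)   # layer input

# Dense reference: materialize the full N x N circulant matrix (what the FFT path avoids).
W = np.column_stack([np.roll(w, j) for j in range(n)])  # W[i, j] = w[(i - j) % n]
y_dense = W @ x

# FFT path: elementwise product in the frequency domain, then inverse FFT.
y_fft = np.fft.ifft(np.fft.fft(w) * np.fft.fft(x)).real

assert np.allclose(y_dense, y_fft)
```

In a block-circulant layer the same identity is applied independently to each block of the weight matrix, which is what makes a single shared FFT kernel attractive for hardware reuse on an FPGA.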
Syracuse University
National Science Foundation
Amit Sanyal
Yanzhi Wang
Jian Tang
Senem Velipasalar
Submitted by Qinru Qiu on November 28th, 2017
This proposal is for research on Mobile Automated Rovers Fly-By (MARS-FLY) for Bridge Network Resiliency. Bridges are often in remote locations, and the cost of installing electricity and a data acquisition system in hundreds of thousands of bridges is prohibitive. The MARS-FLY project will develop a cyber-physical system (CPS) designed to monitor the health of highway bridges, control the loads imposed on bridges by heavy trucks, and provide visual inspectors with quantitative information for data-driven bridge health assessment, while requiring no electricity and minimal data acquisition electronics on site. For fly-by monitoring, GPS-controlled, auto-piloted drones will periodically carry data acquisition electronics to the bridge and download data from the sensors at close range. Larger imaging drones carrying infrared (IR) cameras will be used to detect detailed damage such as concrete delamination. The research objectives will be accomplished, first, through drone-based wireless recharging of remote sensor motes to extend sensor operational lifetime, while wireless recharging of the drone battery will extend drone operational efficiency, payload, and range. The novel multi-coil wireless powering approach will investigate an engineered material, i.e., a metamaterial, coupled with the resonant link to achieve power levels and link distances that are otherwise unachievable. Second, the project pursues a scientific breakthrough in using small quantities of low-quality sensor data and IR images to determine damage information at all levels: detection of a change in behavior, its location, and its magnitude; streamlining of reliability analysis to incorporate the new damage information into the bridge's reliability index using combined numerical and probabilistic approaches such as Ensemble Empirical Mode Decomposition with the Hilbert Transform; and detection of nonlinearities in the signals within a Bayesian updating framework. Moreover, an instrumented drive-by vehicle will complement damage detection on the bridge. A Bayesian updating framework will be used to update the probability distribution for bridge condition, given the measurements. Image processing of the infrared images to distinguish environmental effects from true bridge deterioration (e.g., delamination in concrete) will be used to develop a better method of site-specific and environment-specific calibration.
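As a concrete illustration of the Bayesian updating step described above, the sketch below updates the probability that a deck region is delaminated after an anomalous IR/sensor feature is observed. All probabilities are placeholder values assumed for illustration; the project's actual likelihood models would come from the EEMD/Hilbert analysis and the calibrated IR image processing.

```python
# Illustrative Bayesian update for bridge condition given one measurement.
# All numbers below are assumed placeholders, not project data.
p_damage = 0.05              # prior P(delamination) for the inspected deck region
p_anom_given_damage = 0.80   # assumed P(anomalous IR/vibration feature | delamination)
p_anom_given_healthy = 0.10  # assumed P(anomalous feature | healthy deck)

# Total probability of observing the anomalous feature.
p_anom = (p_anom_given_damage * p_damage
          + p_anom_given_healthy * (1.0 - p_damage))

# Bayes' rule: posterior probability of delamination given the observation.
p_damage_given_anom = p_anom_given_damage * p_damage / p_anom
print(f"P(delamination | anomaly) = {p_damage_given_anom:.3f}")  # about 0.30 here
```

In the full framework the same update would be applied to a probability distribution over damage location and magnitude rather than a single binary state, with each fly-by contributing new measurements to the bridge's reliability index.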
University of Alabama at Birmingham
National Science Foundation
Mohammad Haider
Submitted by Nassim Uddin on November 28th, 2017