Exemplar Project Descriptions

Below is a listing of exemplar project descriptions from prior years. Please read through them for guidance on which areas of research most interest you, and indicate those areas in your application. Your preferences are shared with all PIs reviewing your application, helping to ensure that you are matched with a project and/or PI who most closely fits your interests for the summer research experience. Unfortunately, a list of PIs or projects hiring for Summer 2024 will not be available at the time of application.

2024 Projects

Analytics and data tools for the world’s largest vehicle trajectory datasets (I-24 MOTION) 
Principal Investigators: Dan Work, Will Barbour, Gergely Zachar

Project Description

The I-24 MOTION testbed is the world’s largest source of complete, high-resolution vehicle trajectory data from a real-world highway. The testbed operates continuously, and a single day of operation generates around 1 billion data points. Student researchers will be involved in a variety of tasks: system-level tools that help operate and monitor the system as it generates data, algorithms and data curation that aid in analyzing such a large dataset, and spatial processing to ensure the highest positional accuracy possible for the system. Students will learn skills in building production-grade software, curating tools and data for public consumption, and modern techniques for working with truly massive datasets.
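
As a purely illustrative sketch of the kind of workflow involved, the snippet below streams a hypothetical day of trajectory records in chunks with pandas and accumulates simple per-vehicle counts without loading everything into memory; the file name and columns (vehicle_id, timestamp, x_position) are assumptions, not the actual I-24 MOTION schema.

    # Hedged sketch: chunked processing of a large trajectory CSV with pandas.
    # File name and columns are hypothetical, not the real I-24 MOTION schema.
    import pandas as pd

    counts = {}  # vehicle_id -> number of observed data points

    for chunk in pd.read_csv("trajectories_day.csv", chunksize=1_000_000,
                             usecols=["vehicle_id", "timestamp", "x_position"]):
        for vid, n in chunk["vehicle_id"].value_counts().items():
            counts[vid] = counts.get(vid, 0) + n

    print(f"{len(counts)} distinct vehicles, {sum(counts.values())} points total")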

Desired Qualifications

Proficiency in the Python programming language, databases, and/or large-scale data analytics is helpful but not strictly required

Compositional DSLs for Enhancing Software          
Principal Investigator: Daniel Balasubramanian

Project Description 

The goal of this project is to study how domain-specific languages (DSLs) can be used to represent components of legacy systems and to use the DSLs to enhance those components. The project involves formal methods, compilers, and program analysis, such as symbolic execution. Experience with LLMs is helpful but not required.
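
For orientation only, here is a minimal, hypothetical sketch of the kind of thing a DSL can do: a tiny Python-embedded language that represents a component's guard condition as data, which can then be interpreted, analyzed, or transformed. It is a toy, not the DSLs studied in this project.

    # Toy illustrative DSL: arithmetic guards over named signals.
    # This is a minimal sketch, not the project's actual DSL.
    from dataclasses import dataclass

    @dataclass
    class Var:
        name: str

    @dataclass
    class Const:
        value: float

    @dataclass
    class Add:
        left: object
        right: object

    @dataclass
    class Less:
        left: object
        right: object

    def evaluate(node, env):
        """Interpret a DSL expression against an environment of signal values."""
        if isinstance(node, Var):
            return env[node.name]
        if isinstance(node, Const):
            return node.value
        if isinstance(node, Add):
            return evaluate(node.left, env) + evaluate(node.right, env)
        if isinstance(node, Less):
            return evaluate(node.left, env) < evaluate(node.right, env)
        raise TypeError(f"unknown node {node!r}")

    # Example: the guard "temperature + margin < limit" expressed as data.
    guard = Less(Add(Var("temperature"), Var("margin")), Const(100.0))
    print(evaluate(guard, {"temperature": 92.0, "margin": 5.0}))  # True

Because the guard is plain data rather than opaque code, the same representation could be walked by an analysis or a symbolic-execution pass instead of an interpreter.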

Desired Qualifications 

A background or interest in formal methods, program analysis, or software engineering. Experience interfacing with LLMs is helpful but not required.

Cyber Agents for Security Testing and Learning Environments (CASTLE)         
Principal Investigator: Daniel Balasubramanian

Project Description 

This project addresses the pressing need to develop the capability to defend computer networks from sophisticated adversaries and advanced persistent threats. As the scale and complexity of cyber attacks grow, network operators need the ability to understand and defend against these attacks in real time. Autonomous cyber agents offer the promise of automating the defense against such complex attacks, but developing agents capable of reasoning across multiple layers of the computing stack is a challenging task. This project investigates the use of reinforcement learning and AI in developing autonomous cyber agents capable of defending a computer network against such attacks.
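
As a rough, hypothetical illustration of the reinforcement-learning side of such work (not the actual CASTLE environment or agents), the sketch below runs tabular Q-learning on a toy single-host "defend the network" problem with invented states, actions, and rewards.

    # Toy tabular Q-learning sketch; states, actions, transitions, and rewards
    # are invented for illustration and do not reflect the CASTLE environment.
    import random
    from collections import defaultdict

    STATES  = ["clean", "scanned", "compromised"]
    ACTIONS = ["monitor", "patch", "isolate"]

    def step(state, action):
        """Hypothetical transition/reward model for a single host."""
        if state == "compromised":
            return ("clean", 5.0) if action == "isolate" else ("compromised", -5.0)
        if state == "scanned":
            return ("clean", 1.0) if action == "patch" else ("compromised", -1.0)
        # Clean host: the attacker may begin scanning regardless of our action.
        return ("scanned" if random.random() < 0.3 else "clean", 0.0)

    Q = defaultdict(float)
    alpha, gamma, eps = 0.1, 0.95, 0.1
    for episode in range(2000):
        s = "clean"
        for t in range(50):
            if random.random() < eps:
                a = random.choice(ACTIONS)                       # explore
            else:
                a = max(ACTIONS, key=lambda act: Q[(s, act)])    # exploit
            s2, r = step(s, a)
            best_next = max(Q[(s2, a2)] for a2 in ACTIONS)
            Q[(s, a)] += alpha * (r + gamma * best_next - Q[(s, a)])
            s = s2

    # Learned policy per state (e.g., isolate when compromised).
    print({s: max(ACTIONS, key=lambda a: Q[(s, a)]) for s in STATES})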

Desired Qualifications 

Experience with reinforcement learning and/or machine learning; experience programming in Python

Cyber-Physical Systems Virtual Organization          
Principal Investigator(s): Jonathan Sprinkle, Janos Sztipanovits

Project Description

The Cyber-Physical Systems Virtual Organization (CPS-VO) is a portal used by thousands of researchers across the country to collaborate on topics involving the intersection between computing and the physical world. Our goal is to bring together academia, government, and industry; some of the ways we do that include integrating research tools and models within our website and making this content citable. Students will be involved in data science and machine learning tasks to aid in search and classification of research. Students may also be involved in helping to integrate existing tools into the website.

Desired Qualifications

For those involved in data science tasks, a background in Computer Science/Software Development with strong experience in either databases or website backend is needed. Those involved in integrating tools should be comfortable working in a Linux command line environment, installing/debugging complex applications and dependencies, and working with Docker, VNC, nginx, and related technology.

Cybersecurity in Industrial Networks  
Principal Investigator: Bryan C. Ward

Project Description

Industrial networks connect many industrial machines and processes in applications ranging from the power grid to the factory floor. The specific type of network used to communicate among such industrial devices is called the Operational Technology (OT) network, and it is often disjoint from the Information Technology (IT) network that is used for email, web browsing, and many other general-purpose services. OT networks differ from IT networks in that the timing of communication can be critical to ensuring the correct operation of the industrial devices, where a timing failure could lead to equipment failures, damage, or even human injury or loss of life.

The Time-Sensitive Networking (TSN) standard is emerging as a key technology to enable such timely communication in these OT networks. Recently, industry practitioners have been moving toward more converged IT/OT networks to enable new features and functionality, as well as greater efficiency. However, this convergence introduces new threat vectors, and therefore new cybersecurity architectures are needed to protect critical industrial OT devices. We will explore cybersecurity in the context of industrial networks, especially the convergence of IT/OT networks, using the North American Industrial Internet Consortium (IIC) TSN testbed, which is hosted at Vanderbilt.

https://www.isis.vanderbilt.edu/tsn

Desired Qualifications

Computer science students with junior standing or higher are preferred, but this is not required. Students should have programming experience in C/C++, and preferably experience with network programming.

Human Activity Tracking using Multimodal Data and Deep Learning Algorithms  
Principal Investigator: Gautam Biswas

Project Description

In this project, the student will work closely with the PI, a Post-Doctoral Researcher, and Graduate Research Assistants in designing and training deep learning architectures for human activity recognition from videos. This work will extend current deep learning work on pose and gaze tracking from video to build transformer architectures that combine video and speech data to recognize humans performing specific activities in mixed reality learning and training environments. We will extract video segments for pre-processing, and then train and fine-tune specific deep learning architectures using few-shot learning methods. As a last step, we will study methods for validating the developed architectures and then integrating them into a multimodal learning analytics pipeline being developed in the PI’s lab.

Physics-Constraint-Guided Deep Learning Architectures for Tracking Performance of Cyber-Physical Systems
Principal Investigator: Gautam Biswas

Project Description

The Modeling and Analysis of Complex Systems Lab has been developing Reinforcement Learning-based algorithms for fault-adaptive control of Cyber-Physical Systems, along with methods that track degradation in systems during operation and activate replanning algorithms to ensure continued safe operation. Previously, such methods used physics-based models for tracking degradation and for implementing control and replanning. The recent trend has been to use the large amounts of operational data collected from such systems to support the modeling, control, and replanning tasks. The success of the control and replanning tasks is highly dependent on the accuracy of the models derived from the operational data, and approaches have recently been developed that combine physics constraints with operational data to construct more accurate models. In this summer project, we will develop deep learning models of system behavior and degradation that combine operational data with physics-guided constraints.
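
To make the idea of physics-guided constraints concrete, here is a minimal PyTorch sketch in which a degradation model is fit to operational data while a penalty enforces a simple physical prior (predicted health should be non-increasing over time). The network, synthetic data, and constraint are placeholders for illustration, not the lab's actual models.

    # Hedged sketch of a physics-constrained training loss in PyTorch.
    # The "physics" prior (health never increases over time) is a placeholder.
    import torch
    import torch.nn as nn

    model = nn.Sequential(nn.Linear(1, 32), nn.Tanh(), nn.Linear(32, 1))
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)

    t_obs = torch.linspace(0, 1, 50).unsqueeze(1)                      # observation times
    h_obs = torch.exp(-2.0 * t_obs) + 0.02 * torch.randn_like(t_obs)   # noisy health data
    t_phys = torch.linspace(0, 1, 200).unsqueeze(1)                    # collocation points

    for epoch in range(2000):
        opt.zero_grad()
        data_loss = ((model(t_obs) - h_obs) ** 2).mean()
        h = model(t_phys)
        # Penalize any increase in predicted health between consecutive times.
        physics_loss = torch.relu(h[1:] - h[:-1]).mean()
        loss = data_loss + 10.0 * physics_loss
        loss.backward()
        opt.step()

In practice the penalty term would encode a richer constraint (e.g., a degradation rate law), but the structure of the combined loss is the same.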

Understanding Human Error in Software Vulnerabilities  
Principal Investigator: Kevin Leach

Project Description

Software defects lead to critical vulnerabilities that threaten to disclose sensitive information, disable critical infrastructure, or put human life and limb at risk. While many complex systems and techniques have been developed to automatically find certain classes of vulnerabilities in software, little work has investigated the role the human developer plays in creating software defects. This project focuses on understanding how and why human software developers mistakenly create software defects that lead to security vulnerabilities.

During this project, the student will develop an interface for collecting human-subject data. In particular, we will design experiments that measure which defects humans are more likely to overlook and what analyses we can conduct on source code to make vulnerabilities more apparent. The end result will be a suite of indicative software vulnerabilities, as well as a web-based interface for collecting human-subject data. We will run a pilot study by the end of the project, which will directly contribute to a longer-term study suitable for publication.

Desired Qualifications

Students who are studying computer science and/or psychology are preferred.  Sophomore standing or higher is preferred.  Students should be willing to learn Python and Bash scripting, as well as tools for understanding software vulnerabilities like Ghidra or Infer.

2023 Projects

Cognitive Processes for Programming          
Principal Investigator: Yu Huang

Project Description 

This project aims to investigate the cognitive processes involved in programming-related tasks and to develop cognitive models of them. In this project, students will design user interfaces, programming tasks, and experimental protocols, and will analyze behavioral, visual, and/or neuroimaging data. The goal is to understand how programmers solve different programming tasks (e.g., debugging, testing) and to leverage the findings to improve automated tool design (e.g., AI-based automation for software engineering tasks) or training strategies.

Desired Qualifications

Students should have basic programming and data analysis experience. Knowledge of AI and/or some background in psychology and behavioral science is helpful but not required.

Compositional DSLs for Enhancing Software          
Principal Investigator: Daniel Balasubramanian

Project Description 

The goal of this project is to study how domain-specific languages (DSLs) can be used to represent components of legacy systems and to use the DSLs to enhance those components. The project involves formal methods, compilers, and program analysis, such as symbolic execution.

Desired Qualifications 

A background or interest in formal methods, program analysis, or software engineering

Cyber-Physical Systems Virtual Organization          
Principal Investigator(s): Jonathan Sprinkle, Janos Sztipanovits

Project Description

The Cyber-Physical Systems Virtual Organization (CPS-VO) is a portal used by thousands of researchers across the country to collaborate on topics involving the intersection between computing and the physical world. Our goal is to bring together academia, government, and industry; some of the ways we do that include integrating research tools and models within our website and making this content citable. Students will be involved in data science and machine learning tasks to aid in search and classification of research. Students may also be involved in helping to integrate existing tools into the website.

Desired Qualifications

For those involved in data science tasks, a background in Computer Science/Software Development with strong experience in either databases or website backend is needed. Those involved in integrating tools should be comfortable working in a Linux command line environment, installing/debugging complex applications and dependencies, and working with Docker, VNC, nginx, and related technology.

Data-centric Robustness for Machine Learning

Principal Investigator(s): Kevin Leach

Project Description

In this project, students will help develop techniques for evaluating the quality and robustness of machine learning models. Specifically, students will create software to help identify or generate natural adversarial training examples that "break" machine learning models, showing when a model needs further data to be sufficiently robust. Anticipated contributions include working with a graduate student to develop scripts, collect data, and answer important research questions to evaluate deep neural networks, culminating in the submission of a peer-reviewed manuscript.
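
For orientation, the snippet below sketches one common way to generate adversarial examples, the fast gradient sign method (FGSM), in TensorFlow/Keras. The data and model are random stand-ins, and the project itself focuses on natural adversarial examples rather than this particular gradient-based attack.

    # Hedged FGSM sketch in TensorFlow/Keras (illustrative only).
    import tensorflow as tf

    # Hypothetical stand-in data and untrained model, just to show the mechanics.
    x = tf.random.uniform((32, 784))                         # batch of "images"
    y = tf.random.uniform((32,), maxval=10, dtype=tf.int64)  # random labels

    model = tf.keras.Sequential([
        tf.keras.layers.Dense(128, activation="relu"),
        tf.keras.layers.Dense(10),                           # logits for 10 classes
    ])
    loss_fn = tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True)

    with tf.GradientTape() as tape:
        tape.watch(x)                                        # x is not a variable, so watch it
        loss = loss_fn(y, model(x))
    grad = tape.gradient(loss, x)

    epsilon = 0.1                                            # perturbation budget
    x_adv = tf.clip_by_value(x + epsilon * tf.sign(grad), 0.0, 1.0)
    print("max perturbation:", float(tf.reduce_max(tf.abs(x_adv - x))))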

Desired Qualifications

Students at all levels are welcome to apply. Preferably, applicants will have programming experience, especially in Python and frameworks such as Keras or TensorFlow.

Data-driven Analysis of Equity and Fairness in Public Transit          
Principal Investigator(s): Abhishek Dubey, Ayan Mukhopadhyay

Project Description

Micro-transit is becoming increasingly common in urban areas across the globe. Typically, data-driven algorithmic approaches are used to allocate resources to spatial regions of cities. However, existing data is often biased against specific demographics, and using such data reinforces bias in resource allocation. For example, the distribution of rental bikes has been shown to discriminate along features such as socioeconomic status. Crucially, such discrimination is often unintentional and not explicitly modeled in algorithmic approaches. This makes it imperative that existing real-world resource allocation methods be validated against state-of-the-art standards of fairness, and that algorithmic approaches be designed to take fairness explicitly into account. As part of this project, students will work on two major socio-technical problems. First, they will conduct data analysis on open-source transportation data to evaluate the fairness of resource allocation. Second, they will evaluate the performance of existing resource allocation algorithms under constraints that explicitly represent fairness.
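
A hedged illustration of the first task: given a hypothetical table of allocated micro-transit stops and tract-level demographics, the snippet below computes per-group service rates and a simple disparity ratio. The column names, data, and fairness notion are assumptions for illustration, not the project's prescribed metric.

    # Illustrative fairness check on hypothetical transit allocation data.
    import pandas as pd

    tracts = pd.DataFrame({
        "tract":        ["A", "B", "C", "D"],
        "income_group": ["low", "low", "high", "high"],
        "population":   [5000, 8000, 6000, 4000],
        "stops":        [2, 1, 6, 5],          # allocated micro-transit stops
    })

    per_group = tracts.groupby("income_group").agg(
        stops=("stops", "sum"), population=("population", "sum"))
    per_group["stops_per_10k"] = 1e4 * per_group["stops"] / per_group["population"]

    disparity = per_group["stops_per_10k"].min() / per_group["stops_per_10k"].max()
    print(per_group)
    print(f"disparity ratio (1.0 = parity): {disparity:.2f}")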

Desired Qualifications

The students should have a strong background in data analytics and be comfortable with combinatorial optimization. Knowledge of Python is expected.

Infrastructure for Computing the Biome          
Principal Investigators: Janos Sztipanovits, Yogesh Barve

Project Description

This summer research work is linked to the Computing the Biome project (https://digitalbiome.org), funded by the NSF Convergence Accelerator Program. The Computing the Biome team is creating a data and AI platform for monitoring and predicting biothreats in a major U.S. city and for driving economic sustainability by empowering businesses and advanced science missions to deliver valuable consumer apps and breakthroughs.

The focus of this summer research work will be to support the development of a software framework that facilitates the execution, testing, validation, and visualization of simulation and AI/ML computational workflows on cloud computing platforms. The student will work closely with a Research Scientist and undergraduate students working on this project to create modular and reusable software modules to manage, deploy, and execute scientific workflows running on cloud computing platforms.

Desired Qualifications

Sufficient programming experience, e.g., having taken CS 3251 - Intermediate Software Design. CS/CmpE/EE background with familiarity with the concepts and techniques of software design. Knowledge of the Java/Python/C++/Kotlin languages is desired, as is experience with Git, Docker, and cloud technologies.

Heterogeneous Simulation Integration for Analyzing Critical Infrastructures          
Principal Investigator: Himanshu Neema

Project Description

This project is about enabling simulation-based evaluation of large-scale systems. These systems (e.g., critical infrastructures such as transportation networks, electricity networks, or water distribution networks, or even large command-and-control organizations such as the military or the air force) have many different subsystems, which are themselves quite complex. Thus, each of these subsystems requires its own specific simulation tools for modeling and analysis. For evaluating the large systems (or, as we call them, 'systems-of-systems'), evaluating their different parts in isolation is not sufficient. What is really needed is to integrate different simulators in a logically and temporally coherent manner so that they work together and provide the mechanisms to evaluate these large systems as a whole. Over the last several years, we have developed a model-based framework for modeling these large systems-of-systems; its code-generation tools automatically synthesize the integrated system-of-systems simulations. When executed, the integrated simulations run concurrently, are time synchronized, and exchange data corresponding to their system-level interdependence. These simulators are highly heterogeneous in nature: they use different modeling languages, represent different real-world systems, have different models of computation, and are written in different programming languages. Hence, wiring them together in a consistent manner is extremely challenging. The project involves heavy use of meta-modeling, automated simulation configuration, web programming using NodeJS, JavaScript, and REST APIs, and general programming using Java, C++, and Python.
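
To illustrate the core integration idea only (not the project's actual model-based framework), here is a toy coordinator that steps two hypothetical simulators in lockstep and exchanges their interface variables at each synchronization point.

    # Toy time-synchronized co-simulation loop (illustrative only; the real
    # framework uses model-based integration and code generation, not this).
    class TrafficSim:
        def __init__(self):
            self.vehicles_at_charger = 0
        def advance(self, t, grid_load):
            # Toy rule: more vehicles divert to chargers when grid load is low.
            self.vehicles_at_charger = 10 if grid_load < 0.5 else 3
            return self.vehicles_at_charger

    class PowerGridSim:
        def __init__(self):
            self.load = 0.4
        def advance(self, t, vehicles_at_charger):
            self.load = 0.3 + 0.05 * vehicles_at_charger   # toy load model
            return self.load

    traffic, grid = TrafficSim(), PowerGridSim()
    vehicles, load = 0, grid.load
    for step in range(5):                    # 5 synchronization points
        t = step * 1.0                       # shared logical time (seconds)
        vehicles = traffic.advance(t, load)  # each simulator advances to time t
        load = grid.advance(t, vehicles)     # ...and exchanges interface data
        print(f"t={t:.0f}s vehicles_at_charger={vehicles} grid_load={load:.2f}")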

Desired Qualifications

Familiarity with full-stack development, Java, C++, Python, and simulation tools is desired in successful interns.

Multimodal Analytics Pipeline to Study Learners’ Problem-Solving Behaviors
Principal Investigator: Gautam Biswas

Project Description

The increasing availability of non-intrusive sensors is making it possible to collect data from learners as they work in computer-based learning environments and augmented reality environments. More recently, researchers have started developing approaches that give a more holistic picture of how people learn by combining learners’ activity data captured in log files with gaze data obtained from eye-tracking devices, and with gesture, posture, and speech data obtained from video recording systems. Our group is developing an architecture to capture, align, and organize this multimodal data for analyzing students’ learning behaviors, including their cognitive and metacognitive processes, their focus of attention, and how they interact with collaborators. The undergraduate student intern will work with graduate students in the group to develop and analyze machine learning algorithms for analyzing video, speech, and log data to understand learner behaviors in the context of their learning and problem-solving tasks.
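
As a small, hypothetical example of the alignment step described above, the snippet below uses pandas merge_asof to attach the nearest gaze sample (within 50 ms) to each log event; the streams and column names are placeholders.

    # Illustrative multimodal alignment: join log events to nearest gaze samples.
    import pandas as pd

    log = pd.DataFrame({"t": [0.10, 0.52, 1.31],          # seconds since start
                        "event": ["open_task", "run_model", "view_plot"]})
    gaze = pd.DataFrame({"t": [0.08, 0.50, 0.95, 1.30],
                         "aoi": ["toolbar", "code", "console", "plot"]})  # area of interest

    aligned = pd.merge_asof(log.sort_values("t"), gaze.sort_values("t"),
                            on="t", direction="nearest", tolerance=0.05)
    print(aligned)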

Desired Qualifications

The students should have taken CS 3250 and CS 3251. Sufficient background in Python programming is also important. Some background in data analytics and machine learning algorithms is desired.

Studying Reinforcement Learning-Based Algorithms for Fault-Tolerant Control           
Principal Investigator(s): Gautam Biswas and Marcos Quinones-Grueiro

Project Description

This summer research work is linked to a System-Wide Safety Assurance project funded by NASA. Our group has been developing online risk analysis and fault-tolerant control algorithms to ensure safe UAV flights under adverse weather conditions and when faults occur in the UAV system.

The focus of this project will be to study and evaluate a number of Reinforcement Learning-based Fault-Tolerant Control (FTC) algorithms for safe UAV flights. The student will work closely with a Research Scientist and graduate students working on this project to conduct experiments in the Gazebo environment to test FTC algorithms for octocopter UAV models. The end result will be an empirical study of FTC approaches that we will publish in conferences and journals.

Desired Qualifications

The students should have taken CS 3250 and CS 3251. Sufficient background in Python programming is also important. Some background in modeling of physical systems and in machine learning approaches is desired.

Transit System Optimizer          
Principal Investigator(s): Abhishek Dubey, Ayan Mukhopadhyay

Project Description

Public transportation systems are under immense performance and economic pressures. Our team develops data-driven solutions to deal with these pressures and designs novel applications that can optimize the performance of these systems. For example, the team is working on a set of core services that help transit operators take bookings, operate fleets, schedule trips, and trace real-time operations. At the core of the service are innovative AI algorithms that combine the advantages of offline optimization with online refinements that account for travel-time and weather-induced uncertainties. As part of this project, students will work with the team running the pilot operations and will help test the system and identify improvements. In the second part of the project, students will develop monitoring scripts and quality-of-service assessment scripts that identify whether the service is performing as intended.
           
Desired Qualifications

The students should have a strong background in data analytics and cloud computing. Knowledge of Python is expected.

Transportation testbeds for automated vehicles and traffic control

Principal Investigator(s): Dan Work and Will Barbour

Project Description

We are seeking undergraduate research assistants to work on the nationally recognized Interstate 24 MOTION testbed and the TDOT SMART Corridor. These two projects are massive-scale transportation cyber-physical systems, combining artificial intelligence algorithms developed at Vanderbilt to achieve the largest traffic data instrument and one of the largest corridor management systems, respectively. I-24 MOTION uses 300 video cameras to produce time-space data that is critical to transportation science and provides a location for evaluating the effects of connected and automated vehicles on a real highway. The TDOT I-24 SMART Corridor was designed to manage acute congestion and incident issues on the roadway using lane and speed limit control. Research assistants will have the opportunity to work with tera-scale datasets representing the driving of millions of vehicles – one of the richest traffic datasets ever created. This data can be fused with the operational status and control of the SMART Corridor infrastructure. Researchers will also assist in the continued development of software responsible for controlling hundreds of live transportation control assets.

Desired Qualifications

All of our software development is programmed in Python, so familiarity with the language or willingness and ability to learn it quickly is needed. Experience with algorithms and data science is advantageous.

2022 Projects

Compositional DSLs for Enhancing Software          
Principal Investigator: Daniel Balasubramanian

Project Description 

The goal of this project is to study how domain-specific languages (DSLs) can be used to represent components of legacy systems and to use the DSLs to enhance those components. The project involves formal methods, compilers, and program analysis, such as symbolic execution.

Desired Qualifications 

A background or interest in formal methods, program analysis, or software engineering

CPS Multi-Domain Modeling via Digital Twins          
Principal Investigator: Theodore Bapty

Project Description

The goal of this research is to model a cyberphysical system design and automatically perform virtual evaluation of its behavior. These models form a Digital Twin of the system, allowing rapid assessment of performance and early identification of potential problems. The research focuses on transforming the design model into custom models for a wide range of engineering simulations and analyses, without manual construction of these models. These models can be used by human designers, or to provide a design oracle to automated, Artificial Intelligence/Machine Learning-based designers.

Several aspects of the research can be addressed by summer researchers:

  • Construction of component, subsystem, and system models that provide the basis for building systems.
  • Mapping of the models to target engineering tools, e.g. structural, fluid, thermal, or multi-domain dynamics simulations.
  • Design studies on target systems, e.g. robotics, unmanned air vehicles, unmanned underwater vehicles, electric vehicles, etc.

Desired Qualifications

Background in mechanical engineering is advantageous, as is familiarity with CFD, FEA, CAD, and Matlab tools. Programming experience in C++/C# and/or Python.

Cyber-Physical Systems Virtual Organization          
Principal Investigator(s): Jonathan Sprinkle, Janos Sztipanovits

Project Description

The Cyber-Physical Systems Virtual Organization (CPS-VO) is a portal used by thousands of researchers across the country to collaborate on topics involving the intersection between computing and the physical world. Our goal is to bring together academia, government, and industry; some of the ways we do that include integrating research tools and models within our website and making this content citable. Students will be involved in data science and machine learning tasks to aid in search and classification of research. Students may also be involved in helping to integrate existing tools into the website.

Desired Qualifications

For those involved in data science tasks, a background in Computer Science/Software Development with strong experience in either databases or website backend is needed. Those involved in integrating tools should be comfortable working in a Linux command line environment, installing/debugging complex applications and dependencies, and working with Docker, VNC, nginx, and related technology.

Data-driven Analysis of Equity and Fairness in Public Transit          
Principal Investigator(s): Abhishek Dubey, Ayan Mukhopadhyay

Project Description

Micro-transit is becoming increasingly common in urban areas across the globe. Typically, data-driven algorithmic approaches are used to allocate resources to spatial regions of cities. However, existing data is often biased against specific demographics, and using such data reinforces bias in resource allocation. For example, the distribution of rental bikes has been shown to discriminate along features such as socioeconomic status. Crucially, such discrimination is often unintentional and not explicitly modeled in algorithmic approaches. This makes it imperative that existing real-world resource allocation methods be validated against state-of-the-art standards of fairness, and that algorithmic approaches be designed to take fairness explicitly into account. As part of this project, students will work on two major socio-technical problems. First, they will conduct data analysis on open-source transportation data to evaluate the fairness of resource allocation. Second, they will evaluate the performance of existing resource allocation algorithms under constraints that explicitly represent fairness.

Desired Qualifications

The students should have a strong background in data analytics and be comfortable with combinatorial optimization. Knowledge of Python is expected.

Driver-Assistance Feature Exploration          
Principal Investigator: Jonathan Sprinkle

Project Description

Advanced driver-assistance features on modern cars provide adaptive cruise control, blind-spot monitoring, and more. This project will explore data records of these features in order to bring forward patterns of use for training machine learning controllers. The results of this work will be used to develop safety thresholds for new controllers deployed in experiments in Nashville in the Fall of 2022. Participants may also take part in summer vehicle experiments that validate feature extraction through controlled driving tests.

Desired Qualifications

A background in Python is required, with a preference for knowledge of database use. Experience using Machine Learning approaches is preferred. Participants should be comfortable working in a Linux command line environment, suitable for working with Docker and related technology.

Enabling Decentralized Computation Markets using Distributed Ledgers          
Principal Investigators: Abhishek Dubey, Ayan Mukhopadhyay

Project Description

Cloud computing is the current standard for on-demand computing resources; however, many idle computing resources are available within community departments and local edge computing deployments, and such resources are not effectively aggregated or provisioned. To gain access to these resources, we are developing an open-source market platform that leverages distributed ledger technologies such as Ethereum, along with Apache Pulsar, to enable the market. As part of the project, students will be expected to help develop, mature, and evaluate the platform using the Google Kubernetes Engine as the testbed.

Desired Qualifications

The student should be comfortable with Python and ideally Solidity or a similar smart contract language. Experience with Git, Docker, Apache Pulsar, Kubernetes, and Google Cloud is a plus.

Improving Community and Neighborhood Safety Through Open Data Collection          
Principal Investigator: Daniel Balasubramanian

Project Description

The goal of this project is to study how data obtained by members of the public through camera pictures, video feeds, and similar technology can be used to improve public safety without unduly infringing on personal rights such as privacy. There are two main aspects to the project: the social aspect involves surveys, focus groups, and other community interactions to study how technology can benefit local neighborhoods, while the technical aspect involves building a prototype using cameras, machine learning algorithms, mobile apps, and backend infrastructure.

Desired Qualifications

A background or interest in web development, backend development, machine learning, or social aspects like focus groups and surveys.

Multimodal Analytics Pipeline to Study Learners’ Problem-Solving Behaviors
Principal Investigator: Gautam Biswas

Project Description

The increasing availability of non-intrusive sensors is making it possible to collect data from learners as they work in computer-based learning environments and augmented reality environments. More recently, researchers have started developing approaches that give a more holistic picture of how people learn by combining learners’ activity data captured in log files with gaze data obtained from eye-tracking devices, and with gesture, posture, and speech data obtained from video recording systems. Our group is developing an architecture to capture, align, and organize this multimodal data for analyzing students’ learning behaviors, including their cognitive and metacognitive processes, their focus of attention, and how they interact with collaborators. The undergraduate student intern will work with graduate students in the group to develop and analyze machine learning algorithms for analyzing video, speech, and log data to understand learner behaviors in the context of their learning and problem-solving tasks.

Desired Qualifications

The students should have taken CS 3250 and CS 3251. Sufficient background in Python programming is also important. Some background in data analytics and machine learning algorithms is desired.

Neural Network and Machine Learning Verification          
Principal Investigator: Taylor Johnson

Project Description

In this project, students will help develop benchmarking processes for recent machine learning and neural network verification algorithms and tools, such as our nnv tool (https://github.com/verivital/nnv). These approaches make it possible, for example, to detect, or to prove the absence of, perturbations that can cause various computer vision and machine perception tasks to misbehave. Such perturbations are known colloquially as adversarial perturbations, but their source could be environmental uncertainty, noise, attackers, etc. Anticipated contributions include developing scripts for benchmarking our methods and other research groups' recent approaches, primarily evaluated on convolutional neural networks (CNNs) on standard data sets such as MNIST, CIFAR, and ImageNet.
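
For intuition about what "proving the absence of perturbations" can mean, here is a small, self-contained interval bound propagation sketch for a toy two-layer ReLU network in NumPy. It only illustrates the general idea; nnv implements far more sophisticated reachability methods.

    # Toy interval bound propagation (IBP) for a 2-layer ReLU network.
    # Illustrative only; not the algorithms implemented in nnv.
    import numpy as np

    rng = np.random.default_rng(0)
    W1, b1 = rng.normal(size=(8, 4)), rng.normal(size=8)
    W2, b2 = rng.normal(size=(3, 8)), rng.normal(size=3)

    def affine_bounds(W, b, lo, hi):
        """Propagate an interval [lo, hi] through x -> W @ x + b."""
        center, radius = (lo + hi) / 2, (hi - lo) / 2
        c = W @ center + b
        r = np.abs(W) @ radius
        return c - r, c + r

    x = rng.normal(size=4)
    eps = 0.05                                      # allowed perturbation per input
    lo, hi = x - eps, x + eps
    lo, hi = affine_bounds(W1, b1, lo, hi)
    lo, hi = np.maximum(lo, 0), np.maximum(hi, 0)   # ReLU is monotone
    lo, hi = affine_bounds(W2, b2, lo, hi)

    pred = int(np.argmax(W2 @ np.maximum(W1 @ x + b1, 0) + b2))
    others = [hi[k] for k in range(3) if k != pred]
    # If the predicted logit's lower bound beats every other logit's upper bound,
    # no perturbation within eps can change the classification.
    print("provably robust within eps:", lo[pred] > max(others))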

Desired Qualifications

Students at all levels (freshman through senior) are welcome and will be able to help refine our prototype systems and approach. Programming experience in Matlab, Java, and Python would all be desirable, as would prior experience with machine learning frameworks, such as Keras, TensorFlow, etc. All code will be version controlled using Git/Mercurial. Experience with this is desired, but not required.

StatResp – A toolchain for statistical methods in emergency response management          
Principal Investigator(s): Abhishek Dubey, Ayan Mukhopadhyay

Project Description

Emergency response management (ERM) is a critical problem faced by communities across the globe. First responders are constrained by limited resources and must attend to different types of incidents, such as traffic accidents, fires, and distress calls. In prior art, as well as in practice, incident forecasting and response are typically siloed by category and department, reducing the effectiveness of prediction and precluding efficient coordination of resources. Further, most of these approaches are offline and fail to capture the dynamically changing environments in which critical emergency response occurs. As a consequence, statistical and algorithmic approaches to emergency response have received significant attention in the last few decades. Governments in urban areas are increasingly adopting methods that enable Smart Statistical Emergency Response: a combination of forecasting models and visualization tools to understand where and when incidents occur, and optimization approaches to allocate and dispatch responders. We are building ‘StatResp’ – an open-source integrated tool-chain to aid first responders in understanding where and when incidents occur and how to allocate responders in anticipation of incidents. The historical analysis module of the toolchain is available as a public data dashboard at https://dashboard.statresp.ai. As part of the project, students will be expected to help with feature engineering, learning demand models, and developing visualization engines. Students will work with modern big data tools as part of the project.
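
As a rough sketch of the demand-modeling component, the snippet below fits a Poisson regression to synthetic incident counts per cell-hour using scikit-learn; the features and data are invented for illustration and are not the StatResp models.

    # Hedged sketch: Poisson incident-count model on synthetic data.
    import numpy as np
    from sklearn.linear_model import PoissonRegressor

    rng = np.random.default_rng(1)
    n = 500
    hour = rng.integers(0, 24, size=n)                    # hour of day
    traffic = rng.uniform(0, 1, size=n)                   # normalized traffic volume
    rush = ((hour >= 16) & (hour <= 19)).astype(float)    # evening rush indicator
    rate = np.exp(-2.0 + 1.5 * traffic + 0.4 * rush)      # synthetic ground truth
    y = rng.poisson(rate)                                 # incidents per cell-hour

    X = np.column_stack([traffic, rush])
    model = PoissonRegressor(alpha=1e-3).fit(X, y)
    print("expected incidents, rush hour with heavy traffic:",
          model.predict([[0.9, 1.0]])[0])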

Desired Qualifications

The students should have a background in machine learning and data analytics. Knowledge of Python is expected.

Studying Reinforcement Learning-Based Algorithms for Fault-Tolerant Control           
Principal Investigator(s): Gautam Biswas and Marcos Quinones-Grueiro

Project Description

This summer research work is linked to a System-Wide Safety Assurance project funded by NASA. Our group has been developing online risk analysis and fault-tolerant control algorithms to ensure safe UAV flights under adverse weather conditions and when faults occur in the UAV system.

The focus of this project will be to study and evaluate a number of Reinforcement Learning-based Fault-Tolerant Control (FTC) algorithms for safe UAV flights. The student will work closely with a Research Scientist and graduate students working on this project to conduct experiments in the Gazebo environment to test FTC algorithms for octocopter UAV models. The end result will be an empirical study of FTC approaches that we will publish in conferences and journals.

Desired Qualifications

The students should have taken CS 3250 and CS 3251. Sufficient background in Python programming is also important. Some background in modeling of physical systems and in machine learning approaches is desired.

Tools for Assured Autonomy          
Principal Investigator(s): Gabor Karsai, Abhishek Dubey

Project Description

Autonomous vehicles (cars, drones, underwater vehicles, etc.) have started using software components that are built using AI and machine learning techniques. These vehicles must operate in highly uncertain environments, and it is very difficult to design a correct algorithm for all situations. Instead, engineers collect data from a real or simulated environment and train a general-purpose system - typically a neural net - to perform a certain function using machine learning techniques. The challenge is that the training data cannot cover all possible cases, yet one needs assurances that the system works safely and has acceptable performance.

This project is about fundamental research and the construction of design tools for the engineering of such systems. The tools help in modeling the system (e.g., an underwater vehicle), executing the training and testing of the learning-based software components, and building formal arguments (called assurance cases) to show that the system is safe.

Desired Qualifications

CS/CmpE/EE background with familiarity with concepts and techniques of signals and systems, computer architecture, software design, and embedded systems. Knowledge of the Python/C/C++ languages is a plus, as well as experience with ROS (the Robot Operating System).

2021 Projects

Data-driven Analysis of Equity and Fairness in Public Transit          
Principal Investigator(s): Abhishek Dubey, Ayan Mukhopadhyay

Project Description

Micro-transit is becoming increasingly common in urban areas across the globe. Typically, data-driven algorithmic approaches are used to allocate resources to spatial regions of cities. However, existing data is often biased against specific demographics, and using such data reinforces bias in resource allocation. For example, the distribution of rental bikes has been shown to discriminate along features such as socioeconomic status. Crucially, such discrimination is often unintentional and not explicitly modeled in algorithmic approaches. This makes it imperative that existing real-world resource allocation methods be validated against state-of-the-art standards of fairness, and that algorithmic approaches be designed to take fairness explicitly into account. As part of this project, students will work on two major socio-technical problems. First, they will conduct data analysis on open-source transportation data to evaluate the fairness of resource allocation. Second, they will evaluate the performance of existing resource allocation algorithms under constraints that explicitly represent fairness.

Desired Qualifications

The students should have a strong background in data analytics and be comfortable with combinatorial optimization. Knowledge of Python is expected.

High Performance CPS Co-simulations          
Principal Investigator(s): Janos Sztipanovits, Yogesh Barve

Project Description

CPS co-simulations enable understanding of complex scenarios that span different simulation domains - computer networks, wireless communications, physics modeling, electric power grids, vehicles, transportation, fluid dynamics, etc. - which is a hallmark of Cyber-Physical Systems (CPS). However, the performance of these co-simulations is affected by the underlying platform resource availability across heterogeneous cloud-edge run-time platforms. This research project will holistically explore the design and execution of distributed co-simulations across cloud-edge computing platforms, allowing for low-overhead monitoring of performance indicators and for high-performance execution of co-simulations that adheres to the simulations' quality of service (QoS) and cost constraints.

Desired Qualifications

CS/CmpE/EE background with familiarity with concepts and techniques of simulations, software design, and embedded systems. Knowledge of the Java/C++/Python languages is a plus, as is experience with Git, Docker, cloud technologies, and Raspberry Pi.

StatResp – A toolchain for statistical methods in emergency response management          
Principal Investigator(s): Abhishek Dubey, Ayan Mukhopadhyay

Project Description

Emergency response management (ERM) is a critical problem faced by communities across the globe. First responders are constrained by limited resources and must attend to different types of incidents, such as traffic accidents, fires, and distress calls. In prior art, as well as in practice, incident forecasting and response are typically siloed by category and department, reducing the effectiveness of prediction and precluding efficient coordination of resources. Further, most of these approaches are offline and fail to capture the dynamically changing environments in which critical emergency response occurs. As a consequence, statistical and algorithmic approaches to emergency response have received significant attention in the last few decades. Governments in urban areas are increasingly adopting methods that enable Smart Statistical Emergency Response: a combination of forecasting models and visualization tools to understand where and when incidents occur, and optimization approaches to allocate and dispatch responders. We are building ‘StatResp’ – an open-source integrated tool-chain to aid first responders in understanding where and when incidents occur and how to allocate responders in anticipation of incidents. The historical analysis module of the toolchain is available as a public data dashboard at https://dashboard.statresp.ai. As part of the project, students will be expected to help with feature engineering, learning demand models, and developing visualization engines. Students will work with modern big data tools as part of the project.

Desired Qualifications

The students should have a background in machine learning and data analytics. Knowledge of Python is expected.

2020 Projects

CPS Network Simulation with Variable Fidelity          
Principal Investigator: Himanshu Neema

Project Description

Cyber-Physical systems are engineering systems where the functionality emerges from networked interactions among the physical and computational components. Thus, for simulation-based evaluation of CPS, communication network simulation is a central piece that must almost always be integrated with the rest of the heterogeneous simulations. In such large-scale CPS simulation studies, the actual mechanism by which certain network effects occur, such as cyber-attacks or packet delays and drops, is less important than the overall impact of such effects. Simulating communication networks at the packet level through all the OSI layers alongside large CPS simulations is highly expensive; at the same time, certain effects can only be simulated at that level of detail. In order to balance the increased efficiency of higher-level network simulation with the high fidelity of packet-level simulation, we propose to create an architecture that allows simulators to dynamically vary the fidelity level of the network simulation at run-time. The key research challenges that we want to address in this project are: (1) to develop, using an open-source simulation tool, a network simulation architecture that enables varying the network model's fidelity level at run-time, (2) to maintain consistency of network data during such transitions, and (3) to generate use-cases that demonstrate both the feasibility and applicability of this approach.
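
The toy sketch below is one hypothetical way to picture challenge (1): a link model that exposes the same delivery interface but can switch at run-time between a cheap aggregate-latency mode and a more detailed queueing mode, while carrying some state across the switch (challenge (2)). It is an illustration of the concept only, not the proposed architecture.

    # Illustrative variable-fidelity link model (not the proposed architecture).
    import random

    class Link:
        def __init__(self):
            self.fidelity = "abstract"     # "abstract" or "packet"
            self.queue_backlog = 0.0       # state preserved across fidelity switches

        def set_fidelity(self, level):
            self.fidelity = level          # backlog carries over to keep consistency

        def delay(self, packet_bytes):
            if self.fidelity == "abstract":
                return 0.010 + self.queue_backlog          # mean latency + backlog
            # Packet-level mode: serialization delay plus a simple queue model.
            service = packet_bytes * 8 / 10e6              # 10 Mb/s link
            self.queue_backlog = max(0.0, self.queue_backlog + service - 0.001)
            jitter = random.uniform(0.0, 0.002)
            return 0.005 + service + self.queue_backlog + jitter

    link = Link()
    print("abstract delay:", link.delay(1500))
    link.set_fidelity("packet")
    print("packet delay:  ", link.delay(1500))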

Desired Qualifications

C++ Programming; Computer Networking

Cyber-Physical Systems Virtual Organization          
Principal Investigator: Janos Sztipanovits

Project Description

The Cyber-Physical Systems Virtual Organization (CPS-VO) is a portal used by thousands of researchers across the country to collaborate on topics involving the intersection between computing and the physical world. Our goal is to bring together academia, government, and industry; some of the ways we do that include integrating research tools and models within our website and making this content citable. Students will be involved in helping to upgrade code modules from Drupal 6 to Drupal 8 as part of our ongoing effort to modernize and update this platform. Students may also be involved in helping to integrate existing tools into the website.

Desired Qualifications

For those involved in code upgrades, a background in Computer Science/Software Development with strong experience in PHP and Drupal is necessary. Those involved in integrating tools should be comfortable working in a Linux command line environment, installing/debugging complex applications and dependencies, and working with Docker, VNC, nginx, and related technology.

Cybersecurity AI, and Autonomous Systems Research          
Principal Investigator: Jules White

Project Description

This project will involve students in one or more topics related to cybersecurity, AI, or autonomous systems. Students will work to understand cybersecurity issues in domains ranging from software engineering to manufacturing. Work with AI and autonomous systems may also be performed as needed.

Desired Qualifications

Acceptance for an internship will be determined on a case-by-case basis for all interested students.

Data Science for Micro-Mobility          
Principal Investigator: Dan Work

Project Description

Urban environments are increasingly experiencing revolutions in transportation, including shared bikes and scooters. In this project we will work on a variety of research questions concerning these shared personal transportation devices, referred to broadly as “micromobility”. These modes of transportation have the potential to serve short trips around dense areas of cities, but they are subject to numerous management challenges for operators and city governments. Nashville is one such city dealing with the benefits and challenges of micromobility. Vanderbilt has already served in multiple ways as a testing ground for new micromobility strategies and data analysis. Areas of study for this project include micromobility infrastructure planning (e.g., bike lanes, designated parking), urban development, transportation demand management, and sustainability.

Desired Qualifications

Data science, programming in Python or ability to learn quickly, possible familiarity with GIS software, possible experience with web servers or databases

Innovative Illustrations of Climate Change in the Classroom          
Principal Investigator: Akos Ledeczi

Project Description

The goal of the research project is to develop innovative projects that simultaneously raise awareness about climate change and teach computer science in high schools. NetsBlox is an educational visual programming environment specifically designed to introduce advanced computing concepts to novices. For a 3-minute introduction to NetsBlox, watch this video. The project aims to 1) identify interesting online data sources and services that can be used to quantify climate change, 2) extend NetsBlox to be able to access these, and 3) devise corresponding innovative projects that novice programmers can create in NetsBlox. For example, one simple project shows carbon dioxide concentrations as a function of temperature variations for the past 800,000 years, using ice core measurements from Antarctica, and for the past 100 years, using other data sources.

Desired Qualifications

Software development experience, JavaScript

Neural Network and Machine Learning Verification          
Principal Investigator: Taylor Johnson

Project Description

In this project, students will help develop benchmarking processes for recent machine learning and neural network verification algorithms and tools, such as our nnv tool (https://github.com/verivital/nnv). These approaches make it possible, for example, to detect, or to prove the absence of, perturbations that can cause various computer vision and machine perception tasks to misbehave. Such perturbations are known colloquially as adversarial perturbations, but their source could be environmental uncertainty, noise, attackers, etc. Anticipated contributions include developing scripts for benchmarking our methods and other research groups' recent approaches, primarily evaluated on convolutional neural networks (CNNs) on standard data sets such as MNIST, CIFAR, and ImageNet.

Desired Qualifications

Students at all levels (freshman through senior) are welcome and will be able to help refine our prototype systems and approach. Programming experience in Matlab, Java, and Python would all be desirable, as would prior experience with machine learning frameworks such as Keras, TensorFlow, etc. All code will be version controlled using Git/Mercurial; experience with these is desired but not required.

Tools for Assured Autonomy          
Principal Investigator(s): Gabor Karsai, Abhishek Dubey

Project Description

Autonomous vehicles (cars, drones, underwater vehicles, etc.) have started using software components that are built using machine learning techniques. This is because these vehicles must operate in highly uncertain environments and we cannot design a correct algorithm for all situations. Instead, we collect data from a real or simulated environment and train a general-purpose system - typically a neural net - to perform a certain function using machine learning techniques. But the challenge is that the training data cannot cover all possible cases, yet we need to know that the system works safely and has acceptable performance.

Our project is doing fundamental research and building tools for supporting the engineering of such systems. The tools are for modeling the system (e.g., an underwater vehicle), executing the training and testing of the learning-based components, and building formal arguments (called assurance cases) to show that the system is safe.

Desired Qualifications

CS/CmpE/EE background with familiarity with concepts and techniques of signals and systems, computer architecture, software design, and embedded systems. Knowledge of the Python/C/C++ languages is a plus, as well as experience with ROS (the Robot Operating System).

Traffic Control with Connected and Autonomous Vehicles          
Principal Investigator: Dan Work

Project Description

This project will look at using Connected and Autonomous Vehicles (CAVs) to beneficially control the flow of traffic. Traffic is known to exhibit complicated, nonlinear behavior that often results in so-called phantom traffic jams, which can appear seemingly out of thin air. These jams are known to cause decreases in fuel efficiency, increases in commute time, and decreases in driver safety. An emerging technology for attempting to mitigate this phenomenon is the use of CAVs, which employ novel control methods specifically designed to stop phantom jams.

Students will work on developing algorithms to implement on real instrumented vehicles, as well as developing techniques for modeling their effect on traffic.
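
As background for the modeling side, the sketch below simulates a small ring of cars with the intelligent driver model (IDM), a standard car-following model often used when studying phantom jams and CAV-based smoothing; the parameters are textbook-style values chosen for illustration.

    # Minimal IDM car-following sketch on a ring road (illustrative only).
    import numpy as np

    n, L, dt = 20, 260.0, 0.1                         # vehicles, ring length (m), step (s)
    v0, T, s0, a_max, b = 30.0, 1.5, 2.0, 1.0, 1.5    # IDM parameters
    veh_len = 5.0                                     # vehicle length (m)

    rng = np.random.default_rng(0)
    x = np.linspace(0.0, L, n, endpoint=False) + rng.uniform(-1.0, 1.0, n)  # positions
    v = np.full(n, 5.0)                               # initial speeds (m/s)

    for _ in range(5000):                             # simulate 500 seconds
        gap = (np.roll(x, -1) - x) % L - veh_len      # bumper-to-bumper gap to leader
        dv = v - np.roll(v, -1)                       # closing speed on the leader
        s_star = s0 + v * T + v * dv / (2.0 * np.sqrt(a_max * b))
        accel = a_max * (1.0 - (v / v0) ** 4 - (np.maximum(s_star, 0.0) / gap) ** 2)
        v = np.maximum(v + accel * dt, 0.0)
        x = (x + v * dt) % L

    print(f"mean speed {v.mean():.1f} m/s, speed std dev {v.std():.1f} m/s")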

Desired Qualifications

Background in data science, proficiency in programming with Python/Matlab, and background in simulation or transportation modeling

Creation and Evaluation of CPS System Model Library          
Principal Investigator: Theodore Bapty

Project Description

Cyber-Physical systems, combinations of physical and computer components, are challenging to design and evaluate. To manage complexity, these systems are often composed from high-level components and subsystems, rather than built from elemental parts. Prior work at ISIS has produced tools (OpenMETA) that allow modeling these systems and automatically composing models for various types of engineering analysis. Upcoming research will attempt to use AI to automatically create systems from libraries of components and subsystems to achieve required performance. For these tools to work for a particular domain, libraries of components and subsystems must be created, along with the transforms (Test Benches) that map the models to the relevant engineering tools. This internship will focus on creating component libraries for small unmanned underwater vehicles (UUVs) and engineering tools to evaluate their hydrodynamic properties (Computational Fluid Dynamics and dynamic simulations). Students may also create example systems and execute system analysis campaigns.

Desired Qualifications

Background in mechanical engineering is advantageous, as is familiarity with CFD and CAD tools. Programming experience in C++/C# and/or Python.