Predicting the Difficulty of Compromise through How Attackers Discover Vulnerabilities

PI(s), Co-PI(s), Researchers:

PI: Andrew Meneely; Co-PI: Laurie Williams; Researchers: Ben Meyers and Nasif Imtiaz

This project addresses the following Hard Problems (released November 2012):

  • Metrics

Papers written as a result of this research during the current quarter:

  • Imtiaz, Nasif; Thorn, Seaver; Williams, Laurie, "A comparative study of vulnerability reporting by software composition analysis tools", 15th ACM/IEEE International Symposium on Empirical Software Engineering and Measurement (ESEM 2021).

  • Bhattacharya, Saikath; Singh, Munindar P.; Williams, Laurie, "Software Security Readiness and Deployment", 2021 IEEE International Symposium on Software Reliability Engineering Workshops (ISSREW 2021).

Highlights from the current quarter, written for the general reader of IEEE S&P. These highlights provide our sponsors a body of evidence that the funding they provide under the SoS lablet model is delivering results that more than justify the investment.

  • We have developed an initial taxonomy of human errors in software engineering, based on a prior apology-mining study. We are now conducting a systematic literature review to refine and validate this taxonomy.
  • We have implemented an update-auditing tool that (i) identifies the files and code changes in a package update that cannot be traced back to the package's source repository ("phantom artifacts"), and (ii) measures what proportion of the remaining changes in the update passed through a code review process ("code review coverage").
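The two measurements above can be sketched as follows. This is a minimal, illustrative sketch only: the function names and data structures are hypothetical, and the actual tool operates on real package registries and git histories rather than in-memory sets and dictionaries.

```python
def find_phantom_artifacts(update_files, repo_files):
    """Files shipped in the update that cannot be traced back to the
    package's source repository (i.e., phantom artifacts)."""
    return set(update_files) - set(repo_files)

def code_review_coverage(changed_lines, reviewed_lines, phantom):
    """Fraction of changed lines, excluding phantom artifacts, that
    passed through code review.

    changed_lines:  {file: number of lines changed in the update}
    reviewed_lines: {file: number of those lines that were code-reviewed}
    phantom:        set of phantom-artifact files to exclude
    """
    total = sum(n for f, n in changed_lines.items() if f not in phantom)
    reviewed = sum(n for f, n in reviewed_lines.items() if f not in phantom)
    return reviewed / total if total else 1.0

# Example with made-up data: a bundled, minified file appears in the
# update but has no counterpart in the source repository.
update = {"src/core.py": 40, "dist/bundle.min.js": 500}
repo = {"src/core.py", "README.md"}
phantom = find_phantom_artifacts(update, repo)   # {"dist/bundle.min.js"}
coverage = code_review_coverage(update, {"src/core.py": 30}, phantom)  # 0.75
```

The key design point is that phantom artifacts are excluded before computing coverage, so unreviewable generated files do not distort the metric.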

  • None.