Scalable Privacy Analysis - January 2023

PI(s), Co-PI(s), Researchers:

  • Serge Egelman (ICSI)
  • Narseo Vallina-Rodriguez (IMDEA)
  • Primal Wijesekera (ICSI)

Scalability and Composability, Policy-Governed Secure Collaboration, Metrics


Publications:

  • Accepted:
    C. Gilsenan, F. Shakir, N. Alomar, and S. Egelman. "Security and Privacy Failures in Popular 2FA Apps." Proceedings of the 2023 USENIX Security Symposium.
  • Under Submission:
    Nikita Samarin, Shayna Kothari, Zaina Siyed, Oscar Bjorkman, Reena Yuan, Primal Wijesekera, Noura Alomar, Jordan Fischer, Chris Hoofnagle, and Serge Egelman. "Measuring the Compliance of Android App Developers with the California Consumer Privacy Act (CCPA)." PETS 2023.
  • Submitted:
    Allan Lyons, Julien Gamba, Austin Shawaga, Joel Reardon, Juan Tapiador, Serge Egelman, and Narseo Vallina-Rodriguez. "Oh the Places Your Logs May Go! Measuring the Logging of Sensitive Data in the Android Ecosystem." USENIX Security '23.
  • Submitted:
    Noura Alomar, Yulie Park, Frank Li, Primal Wijesekera, and Serge Egelman. "Understanding Organizational Vulnerability Remediation Processes." USENIX Security '23.


Third-party 2FA TOTP authenticator apps
In this work, we used dynamic analysis tools to examine the security of the apps' network backup features and then reverse-engineered their security protocols. The Time-based One-Time Password (TOTP) algorithm is a 2FA method that is widely deployed because of its relatively low implementation costs and purported security benefits over SMS 2FA. However, users of TOTP 2FA apps must maintain access to the secrets stored within the TOTP app or risk getting locked out of their accounts. To help users avoid this fate, popular TOTP apps implement a wide range of backup mechanisms, each with varying security and privacy implications. For this study, we identified all general-purpose Android TOTP apps in the Google Play Store that had at least 100k installs and implemented a backup mechanism. Most of the 22 apps we identified used backup strategies that place trust in the very technologies that TOTP is meant to supersede: passwords and SMS. In addition, many backup implementations shared personal user information with third parties, had serious flaws in their implementation and/or usage of cryptography, and allowed the app developers access to users' TOTP secrets.
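For context, the TOTP computation itself is small: the shared secret (the very value the backup mechanisms must protect) is combined with the current 30-second time window via HMAC and truncated to a short numeric code. A minimal sketch of the RFC 6238 computation (HMAC-SHA1 variant):

```python
import hashlib
import hmac
import struct
import time

def totp(secret, timestamp=None, step=30, digits=6):
    """Compute an RFC 6238 TOTP code (HMAC-SHA1 variant)."""
    if timestamp is None:
        timestamp = int(time.time())
    counter = timestamp // step                        # moving time factor T
    msg = struct.pack(">Q", counter)                   # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                         # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 Appendix B test vector: 8-digit SHA-1 code at T=59.
assert totp(b"12345678901234567890", timestamp=59, digits=8) == "94287082"
```

Because the entire scheme reduces to possession of this one secret, any backup channel that exposes it (for example, developer-accessible cloud storage or a weakly encrypted export) effectively defeats the second factor.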

A paper detailing this work, "Security and Privacy Failures in Popular 2FA Apps," has been accepted for presentation at the 32nd USENIX Security Symposium (USENIX Security '23). USENIX brings together researchers, practitioners, system administrators, system programmers, and others interested in the latest advances in the security and privacy of computer systems and networks.

Over the last few months, we have worked with the vendors named in the paper as part of responsible vulnerability disclosure: we disclosed the issues we found in their apps and offered suggestions for fixing them. Some vendors have since fixed the issues in their apps, while others have not.

California Consumer Privacy Act (CCPA) compliance
In this work, we investigated CCPA compliance for top-ranked Android mobile app developers from the U.S. Google Play Store. The CCPA requires developers to provide accurate privacy notices and to respond to "right to know" requests by disclosing the personal information that they have collected, used, or shared about consumers for a business or commercial purpose. Of the 69 app developers who substantively replied to our requests, all but one provided specific pieces of personal data (as opposed to only categorical information). We found that a significant percentage of apps collected information that was not disclosed, including identifiers (55 apps, 80%), geolocation data (21 apps, 30%), and sensory data (18 apps, 26%). We identified several improvements to the CCPA that could help app developers comply.

After major revisions, and based on the current reviews, we expect that our paper on this work, "Measuring the Compliance of Android App Developers with the California Consumer Privacy Act (CCPA)," will be accepted for presentation at the 23rd Privacy Enhancing Technologies Symposium (PETS). We were required to add an ethics statement explaining that our IRB declined to review the study because it does not involve human subjects. PETS brings together privacy experts from around the world to discuss recent advances and new perspectives on research in privacy technologies.

Logging of sensitive data
In this study, we examined how sensitive data is logged in the Android ecosystem. Android mobile phones use a shared logging system that multiplexes log data from all system components, including the operating system and the console output of all apps. Although a security mechanism ensures that user-space apps can only read log entries that they themselves wrote, many "privileged" apps--including preloaded system apps--are exempt. We conducted a comprehensive end-to-end study of Android's logging behavior. This involved testing a variety of stock smartphone models to measure the presence and variation of device and user identifiers that appear in the log due to the operating system or other preinstalled components. We also conducted a field study to examine the presence of personally identifying information in logs across a variety of users' devices. Finally, the study involved identifying phone vendors who claim to upload personal information and log data, auditing preinstalled apps to understand how many can collect this data, and using static analysis to understand the context in which an app collects the system log.

Our analysis revealed that several types of sensitive data are entered into system logs by various system components, including device drivers, in violation of Google's policies. We also discovered that some user-installed apps log data inappropriately; this includes some third-party SDKs that log incoming and outgoing network traffic. A paper on this work, "Oh the Places Your Logs May Go! Measuring the Logging of Sensitive Data in the Android Ecosystem," was submitted to USENIX Security '23 (and so far has received only positive reviews; we expect it to be accepted this cycle).
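The core of the field-study measurement can be sketched as a needle search over a captured logcat dump for identifiers known to belong to the test device. The identifier values and function below are illustrative placeholders of our own, not the study's actual tooling:

```python
# Hypothetical identifiers harvested from a test device, used as search needles.
DEVICE_IDENTIFIERS = {
    "imei": "356938035643809",
    "mac": "02:00:5e:10:00:25",
    "email": "user@example.com",
}

def find_identifier_leaks(log_lines, identifiers=DEVICE_IDENTIFIERS):
    """Return (line_number, identifier_type) pairs for log lines that
    contain a known device/user identifier verbatim."""
    hits = []
    for lineno, line in enumerate(log_lines, 1):
        for kind, value in identifiers.items():
            if value.lower() in line.lower():
                hits.append((lineno, kind))
    return hits
```

A real pipeline also has to account for encoded, truncated, or hashed variants of each identifier, which a verbatim search like this would miss.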

Organizational vulnerability remediation
For this study, we interviewed professionals involved in remediation processes to learn more about the steps organizations take to remediate vulnerabilities they uncover. Sustainable vulnerability remediation processes involve taking timely action to address any vulnerabilities that are discovered, as well as implementing strategies to prevent them from reappearing in the future. To remediate a vulnerability, organizations likely make many strategic decisions, including how to evaluate its potential impact, identify all affected systems, determine who can remediate it, and validate the fix. These decisions require contextualizing the vulnerability as well as timely discussions with those involved in building, testing, or managing the affected system(s). To evaluate how organizations remediate vulnerabilities, we interviewed and surveyed more than 60 security professionals who had been involved in remediation processes.

We found that the best remediation processes occur when 1) there is better coordination between everyone involved (internal and external testers, developers, managers, and security engineers), 2) decision makers have up-to-date information on their assets and code owners, and 3) those responsible for remediation receive technical instructions on how to carry it out. Our paper on this work, "Understanding Organizational Vulnerability Remediation Processes," was submitted to USENIX Security '23. The reviews we received last week were positive, split among major revisions, minor revisions, and accept, so we expect it to be published this year.

New SDK detection method for mobile apps
Building on our prior work in combining static and dynamic analysis, we have developed LibSeeker, a novel hybrid approach for SDK detection and analysis. The ability to detect third-party SDKs and their embedded libraries in compiled software objects and to characterize their behavior is vital for the privacy analysis of any software. This is particularly critical for mobile applications, due to the increasing presence of potentially intrusive SDKs for analytics or advertising and the lack of mechanisms in the mobile operating system to discern whether permissions are requested to enable primary features of the app or for secondary purposes (to which the user might object). LibSeeker combines static and dynamic analysis to overcome the accuracy and scalability limitations of existing state-of-the-art tools. Unlike existing approaches, it focuses on identifying features that point to third-party services or components, which means it does not require any prior knowledge about SDKs. To analyze LibSeeker's performance, we used it to detect embedded SDKs in 1,000 mobile apps and compared the results with those from multiple open-source SDK detectors as well as a commercial solution.
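As a rough illustration of the static half of such a hybrid pipeline, one common building block is grouping an app's class names by package prefix to surface candidate third-party packages without any prior SDK list. The function and thresholds below are our own simplified sketch, not LibSeeker's actual algorithm:

```python
from collections import Counter

def candidate_sdk_packages(class_names, depth=3, min_classes=5):
    """Group fully qualified class names by package prefix and return
    prefixes with enough classes to plausibly be an embedded SDK."""
    prefixes = Counter()
    for name in class_names:
        parts = name.split(".")[:-1]          # drop the class name itself
        if len(parts) >= depth:
            prefixes[".".join(parts[:depth])] += 1
    return {p: n for p, n in prefixes.items() if n >= min_classes}

# A package with many classes outside the app's own namespace is a candidate SDK.
classes = ["com.example.app.MainActivity"] + [
    "com.ads.track.C%d" % i for i in range(6)
]
assert candidate_sdk_packages(classes) == {"com.ads.track": 6}
```

Dynamic traces (for example, which package initiated a given network flow) can then confirm or refute each candidate, which is where a hybrid approach gains accuracy over purely static matching.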

The effect of privacy guidance on developers' abilities to discover privacy compliance issues/vulnerabilities
After finding that developers don't always understand the privacy issues in their apps and need guidance on how to improve them, we designed a series of studies to evaluate new tools intended to help developers write software that is more likely to comply with privacy laws. This project will emphasize the importance of code review for discovering privacy vulnerabilities and shed light on how to help developers strengthen their ability to discover them. We will design a privacy checklist and then investigate whether using it helps developers discover the privacy vulnerabilities and compliance issues that we embed in a skeleton child-directed app implemented for the purposes of the study. The study will show whether providing explicit privacy guidance helps developers make their apps compliant with applicable privacy regulations, and which form of guidance is most effective.


  • PI Egelman has been interviewed by several reporters about online privacy issues.


  • Several graduate and undergraduate students are participating in this research.
  • The studies evaluating new developer tools for writing software that complies with privacy laws will form the basis of a PhD dissertation that should be completed in the next year.