Biblio

Found 102 results

Filters: Keyword is Complexity theory
2020-07-03
Lisova, Elena, El Hachem, Jamal, Causevic, Aida.  2019.  Investigating Attack Propagation in a SoS via a Service Decomposition. 2019 IEEE World Congress on Services (SERVICES). 2642-939X:9—14.
The term system of systems (SoS) refers to a setup in which a number of independent systems collaborate to create value that none of them could achieve on its own. The complexity of an SoS is higher than that of its constituent systems, which brings challenges in analyzing critical properties such as security. An SoS can be seen as a set of connected systems or services that need to be adequately protected. Communication between such systems or services can be considered a service in itself, and it is paramount to the establishment of an SoS because it enables connections, dependencies, and cooperation. Given that reliable and predictable communication contributes directly to the correct functioning of an SoS, communication as a service is one of the main assets to consider, and protecting it from malicious adversaries should be one of the highest priorities in SoS design and operation. This study investigates the attack propagation problem in terms of service guarantees by decomposing services into sub-services enriched with preconditions and postconditions at the service level. Such analysis is a prerequisite for efficient SoS risk assessment at the design stage of the SoS development life cycle, protecting the SoS from potentially high-impact attacks capable of affecting the safety of the systems and of the humans who use them.
2020-06-26
Padmashree, M G, Arunalatha, J S, Venugopal, K R.  2019.  HSSM: High Speed Split Multiplier for Elliptic Curve Cryptography in IoT. 2019 Fifteenth International Conference on Information Processing (ICINPRO). :1—5.

Securing data in the Internet of Things (IoT) relies on encryption to provide a stable, secure system. IoT devices possess constrained main and secondary memory, which mandates the use of an Elliptic Curve Cryptography (ECC) scheme. Scalar multiplication has a great impact on ECC implementations: reducing its computation and space complexity enhances the performance of an IoT system while providing high security and privacy. The proposed High Speed Split Multiplier (HSSM) for ECC in IoT is a lightweight multiplication technique that uses split multiplication with a pseudo-Mersenne prime and a Montgomery curve to withstand power analysis attacks. The proposed algorithm reduces the computation time and space complexity of the cryptographic operations, measured in clock cycles and RAM, compared with the multiplication algorithms of Liu et al. [1].
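
As a rough illustration of the kind of field arithmetic the abstract refers to, the following sketch shows multiplication with reduction modulo a pseudo-Mersenne prime. It is not the authors' HSSM algorithm; the prime p = 2^255 - 19 and the helper names are illustrative assumptions only.

```python
# Illustrative sketch only (not the authors' HSSM): field multiplication with
# a pseudo-Mersenne prime p = 2^K - C, the kind of prime that makes ECC
# modular reduction cheap on constrained devices. Parameters are hypothetical.

K = 255
C = 19
P = (1 << K) - C            # p = 2^255 - 19

def reduce_mod_p(x: int) -> int:
    """Reduce x modulo p using the identity 2^K = C (mod p)."""
    while x >> K:                               # fold down the bits above 2^K
        hi, lo = x >> K, x & ((1 << K) - 1)     # x = hi * 2^K + lo
        x = hi * C + lo                         # replace 2^K by C
    return x - P if x >= P else x               # at most one final subtraction

def mul_mod_p(a: int, b: int) -> int:
    """Multiply two field elements and reduce the product."""
    return reduce_mod_p(a * b)

# Sanity check against Python's built-in modular arithmetic.
a, b = 2**200 + 12345, 2**190 + 6789
assert mul_mod_p(a, b) == (a * b) % P
```

Montgomery-curve scalar multiplication and the power-analysis countermeasures described in the paper would sit on top of field arithmetic of this kind.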

2020-06-22
Santini, Paolo, Baldi, Marco, Chiaraluce, Franco.  2019.  Cryptanalysis of a One-Time Code-Based Digital Signature Scheme. 2019 IEEE International Symposium on Information Theory (ISIT). :2594–2598.
We consider a one-time digital signature scheme recently proposed by Persichetti and show that a successful key recovery attack can be mounted with limited complexity. The attack we propose exploits a single signature intercepted by the attacker and relies on a statistical analysis of that signature, followed by information set decoding. We assess the attack complexity and show that a full recovery of the secret key can be performed with a work factor that is far below the claimed security level. The efficiency of the attack stems from the sparsity of the signature, which leads to significant information leakage about the secret key.
Kuznetsov, Alexandr, Kiian, Anastasiia, Pushkar'ov, Andriy, Mialkovskyi, Danylo, Smirnov, Oleksii, Kuznetsova, Tetiana.  2019.  Code-Based Schemes for Post-Quantum Digital Signatures. 2019 10th IEEE International Conference on Intelligent Data Acquisition and Advanced Computing Systems: Technology and Applications (IDAACS). 2:707–712.
The paper deals with the design and operating principles of code-based schemes for generating and verifying electronic digital signatures. Comparative studies of the effectiveness of the known CFS scheme and the proposed scheme are carried out, covering their capabilities, disadvantages, and prospects for use in the post-quantum period.
2020-06-19
Haefner, Kyle, Ray, Indrakshi.  2019.  ComplexIoT: Behavior-Based Trust For IoT Networks. 2019 First IEEE International Conference on Trust, Privacy and Security in Intelligent Systems and Applications (TPS-ISA). :56—65.

This work takes a novel approach to classifying the behavior of devices by exploiting the single-purpose nature of IoT devices and analyzing the complexity and variance of their network traffic. We develop a formalized measurement of complexity for IoT devices and use this measurement to precisely tune an anomaly detection algorithm for each device. We postulate that IoT devices with low complexity lead to high confidence in their behavioral model and have a correspondingly more precise decision boundary on their predicted behavior. Conversely, complex general-purpose devices have lower confidence and a more generalized decision boundary. We show that there is a positive correlation between our complexity measure and the number of outliers found by an anomaly detection algorithm. By tuning this decision boundary based on device complexity, we are able to build a behavioral framework for each device that reduces false-positive outliers. Finally, we propose an architecture that can use this tuned behavioral model to rank each flow on the network and calculate a trust score ranking of all traffic to and from a device, which allows the network to autonomously make access control decisions on a per-flow basis.
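
The paper's formal complexity measure is not reproduced in the abstract; the toy sketch below only illustrates the general idea of scoring a device's traffic complexity (here, crudely, by the Shannon entropy of its flow endpoints) and loosening the anomaly decision boundary for more complex devices. The device data, names, and scaling rule are hypothetical assumptions, not the authors' measure.

```python
# Toy illustration, not the paper's formal complexity measure: use the Shannon
# entropy of a device's observed (destination, port) pairs as a stand-in for
# traffic complexity, and widen the anomaly-detection boundary for more
# complex devices. All device data below is hypothetical.

from collections import Counter
from math import log2

def traffic_entropy(flows) -> float:
    """Shannon entropy (bits) of the device's observed flow endpoints."""
    counts = Counter(flows)
    total = sum(counts.values())
    return -sum((c / total) * log2(c / total) for c in counts.values())

def decision_threshold(base: float, entropy: float, slack: float = 0.5) -> float:
    """Low-entropy (single-purpose) devices get a tight boundary, complex ones a looser one."""
    return base * (1.0 + slack * entropy)

smart_plug = [("cloud.example", 443)] * 95 + [("ntp.example", 123)] * 5
laptop = [(f"host{i}.example", p) for i in range(50) for p in (80, 443)]

for name, flows in [("smart_plug", smart_plug), ("laptop", laptop)]:
    h = traffic_entropy(flows)
    print(name, round(h, 2), round(decision_threshold(1.0, h), 2))
```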

2020-05-15
Wang, Shaolei, Zhou, Ying, Li, Yaowei, Guo, Ronghua, Du, Jiawei.  2018.  Quantitative Analysis of Network Address Randomization's Security Effectiveness. 2018 IEEE 18th International Conference on Communication Technology (ICCT). :906—910.

Quantitative analysis of security effectiveness is a difficult problem in research on network address randomization techniques. In this paper, a system model and an attack model are proposed based on the attack processes of common attacks and the technical principle of network address randomization. Based on these models, the security effectiveness of network address randomization is quantitatively analyzed from the perspective of the attacker's attack time and attack cost, for both the static network address and network address randomization cases. The results of the analysis show that the security effectiveness of network address randomization is determined by the randomization frequency, the randomization space, the states of hosts in the target network, and the capabilities of the attacker.
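
The paper's own models are not given in the abstract; the toy calculation below merely illustrates how randomization can enter an attacker's expected cost, comparing a sequential scan of a static address layout with probing under frequent re-randomization. The formulas and parameters are illustrative assumptions, not the authors' results.

```python
# Toy model, not the paper's formulas: rough expected attacker effort to find
# one live host in an address space of size N containing h reachable hosts,
# for a static layout versus addresses rehashed faster than the scan completes.
# Parameters are hypothetical.

def expected_probes_static(N: int, h: int) -> float:
    """Sequential scan of a fixed layout: mean position of the first live host."""
    return (N + 1) / (h + 1)

def expected_probes_randomized(N: int, h: int) -> float:
    """Frequent rehashing makes each probe an independent trial (geometric distribution)."""
    return N / h

N, h = 2 ** 16, 10                                # a /16 space with 10 live hosts
print(round(expected_probes_static(N, h)))        # ~5958 probes
print(round(expected_probes_randomized(N, h)))    # ~6554 probes
```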

2020-04-24
Zhang, Lei, Zhang, Jianqing, Chen, Yong, Liao, Shaowen.  2018.  Research on the Simulation Algorithm of Object-Oriented Language. 2018 3rd International Conference on Smart City and Systems Engineering (ICSCSE). :902—904.

The security model is an important subject in the field of low-energy independence complexity theory. It takes security strategy as its core, changes the system from static protection to dynamic protection, and provides the basis for the rapid response of the system. A large number of empirical studies have been conducted to verify cache consistency. Object-oriented languages have developed along two lines: pure object-oriented languages, and hybrid object-oriented languages that add classes, inheritance, and other elements to procedural and other languages. This paper studies a new object-oriented language application, namely GUT for a write-back cache, which is based on the study of simulation algorithms to address these challenges in the field of low-energy independence complexity theory.

2020-04-06
Hu, Xiaoyan, Zheng, Shaoqi, Zhao, Lixia, Cheng, Guang, Gong, Jian.  2019.  Exploration and Exploitation of Off-path Cached Content in Network Coding Enabled Named Data Networking. 2019 IEEE 27th International Conference on Network Protocols (ICNP). :1—6.

Named Data Networking (NDN) intrinsically supports in-network caching and multipath forwarding. These two salient features offer the potential to simultaneously transmit the content segments that comprise requested content from both original content publishers and in-network caches. However, due to the complexity of maintaining the reachability information of off-path cached content at the fine-grained packet level of granularity, multipath forwarding and off-path cached copies are significantly underutilized in NDN so far. Network coding enabled NDN, referred to as NC-NDN, was proposed to effectively utilize multiple on-path routes to transmit content, but off-path cached copies remain unexploited. This work enhances NC-NDN with an On-demand Off-path Cache Exploration based Multipath Forwarding strategy, dubbed O2CEMF, to take full advantage of multipath forwarding and efficiently utilize off-path cached content. In O2CEMF, each network node reactively explores the reachability information of nearby off-path cached content when consumers begin to request a generation of content, and maintains this reachability at the coarse-grained generation level of granularity instead. Consumers then simultaneously retrieve content from the original content publisher(s) and the explored capable off-path caches. Our experimental studies validate that this strategy improves content delivery performance compared to the present NC-NDN.

Shen, Yuanqi, Li, You, Kong, Shuyu, Rezaei, Amin, Zhou, Hai.  2019.  SigAttack: New High-level SAT-based Attack on Logic Encryptions. 2019 Design, Automation Test in Europe Conference Exhibition (DATE). :940–943.
Logic encryption is a powerful hardware protection technique that uses extra key inputs to lock a circuit against piracy or unauthorized use. The recent discovery of the SAT-based attack with Distinguishing Input Pattern (DIP) generation has rendered all traditional logic encryptions vulnerable and has thus prompted the creation of new encryption methods. However, a critical question for any new encryption method is whether security against the DIP-generation attack implies security against all other attacks. In this paper, a new high-level SAT-based attack called SigAttack is presented and thoroughly investigated. It is based on extracting a key-revealing signature in the encryption. A majority of known SAT-resilient encryptions are shown to be vulnerable to SigAttack. By formulating the condition under which SigAttack is effective, the paper also provides guidance for future logic encryption design.
2020-03-23
Karlsson, Linus, Paladi, Nicolae.  2019.  Privacy-Enabled Recommendations for Software Vulnerabilities. 2019 IEEE Intl Conf on Dependable, Autonomic and Secure Computing, Intl Conf on Pervasive Intelligence and Computing, Intl Conf on Cloud and Big Data Computing, Intl Conf on Cyber Science and Technology Congress (DASC/PiCom/CBDCom/CyberSciTech). :564–571.
New software vulnerabilities are published daily. Prioritizing vulnerabilities according to their relevance to the collection of software an organization uses is a costly and slow process. While recommender systems have previously been proposed to address this issue, they ignore the security of the vulnerability prioritization data. As a result, a malicious operator or a third-party adversary can collect vulnerability prioritization data to identify the security assets in the enterprise deployments of client organizations. To address this, we propose a solution that leverages isolated execution to protect the privacy of vulnerability profiles without compromising data integrity. To validate an implementation of the proposed solution, we integrated it with an existing recommender system for software vulnerabilities. The evaluation of our implementation shows that the proposed solution can effectively complement existing recommender systems for software vulnerabilities.
2020-03-16
Noori-Hosseini, Mona, Lennartson, Bengt.  2019.  Incremental Abstraction for Diagnosability Verification of Modular Systems. 2019 24th IEEE International Conference on Emerging Technologies and Factory Automation (ETFA). :393–399.
In a diagnosability verifier with polynomial complexity, a non-diagnosable system generates uncertain loops. In this paper, such forbidden loops are transformed into forbidden states by simple detector automata. The forbidden state problem is trivially transformed into a nonblocking problem by considering all states except the forbidden ones as marked states. This transformation is combined with one of the most efficient abstractions for modular systems, called conflict equivalence, under which nonblocking properties are preserved. In the resulting abstraction, local events are hidden, and more events become local as subsystems are synchronized. This incremental abstraction is applied to a scalable production system, including parallel lines where the buffers and machines in each line include some typical failures and feedback flows. For this modular system, the proposed diagnosability algorithm shows excellent results: the diagnosability of systems comprising millions of states is analyzed in less than a second.
Gajavelly, Raj Kumar, Baumgartner, Jason, Ivrii, Alexander, Kanzelman, Robert L., Ghosh, Shiladitya.  2019.  Input Elimination Transformations for Scalable Verification and Trace Reconstruction. 2019 Formal Methods in Computer Aided Design (FMCAD). :10–18.
We present two novel sound and complete netlist transformations that substantially improve verification scalability while enabling very efficient trace reconstruction. First, we present a 2QBF variant of input reparameterization, capable of eliminating inputs without introducing new logic and without complete range computation. While weaker in reduction potential, it yields up to four orders of magnitude speedup in trace reconstruction when used as a fast-and-lossy preprocess to traditional reparameterization. Second, we present a novel scalable approach that leverages sequential unateness to merge selective inputs, in some cases greatly reducing netlist size and verification complexity. Extensive benchmarking demonstrates the utility of these techniques. Connectivity verification particularly benefits, with reductions of up to 99.8%.
2020-03-09
Hettiarachchi, Charitha, Do, Hyunsook.  2019.  A Systematic Requirements and Risks-Based Test Case Prioritization Using a Fuzzy Expert System. 2019 IEEE 19th International Conference on Software Quality, Reliability and Security (QRS). :374–385.
The use of risk information can help software engineers identify software components that are likely vulnerable or require extra attention during testing. Some studies have shown that requirements risk-based approaches can be effective in improving the effectiveness of regression testing techniques. However, the risk estimation processes used in such approaches can be subjective, time-consuming, and costly. In this research, we introduce a fuzzy expert system that emulates human thinking to address the subjectivity-related issues in the risk estimation process in a systematic and efficient way, and thus further improve the effectiveness of test case prioritization. Further, the required data for our approach were gathered by employing a semi-automated process that made the risk estimation process less subjective. The empirical results indicate that the new prioritization approach can improve the rate of fault detection over several existing test case prioritization techniques, while reducing the threats associated with subjective risk estimation.
2020-03-02
Sultana, Kazi Zakia, Chong, Tai-Yin.  2019.  A Proposed Approach to Build an Automated Software Security Assessment Framework using Mined Patterns and Metrics. 2019 IEEE International Conference on Computational Science and Engineering (CSE) and IEEE International Conference on Embedded and Ubiquitous Computing (EUC). :176–181.

Software security is a major concern for developers who intend to deliver reliable software. Although there is research that focuses on vulnerability prediction and discovery, there is still a need for security-specific metrics to measure software security and vulnerability-proneness quantitatively. Existing methods are based either on software metrics (defined on the physical characteristics of code, e.g. complexity or lines of code), which are not security-specific, or on generic patterns known as nano-patterns (Java method-level traceable patterns that characterize a Java method or function). Other methods predict vulnerabilities using text mining approaches or graph algorithms, which perform poorly in cross-project validation and fail to generalize across systems. In this paper, we envision an automated framework that will assist developers in assessing the security level of their code and guide them towards developing secure code. To accomplish this goal, we aim to refine and redefine the existing nano-patterns and software metrics to make them more security-centric so that they can be used to measure the software security level of source code (either a file or a function) with higher accuracy. We present our visionary approach through a series of three consecutive studies in which we (1) study the challenges of current software metrics and nano-patterns in vulnerability prediction, (2) redefine and characterize the nano-patterns and software metrics so that they capture security-specific properties of code and measure the security level quantitatively, and finally (3) implement an automated framework for developers to automatically extract the values of all the patterns and metrics for a given code segment and flag the estimated security level as feedback based on our research results. We have carried out preliminary experiments and present results indicating that our vision can be practically implemented and will have valuable implications for the software security community.

2020-02-10
Hoey, Jesse, Sheikhbahaee, Zahra, MacKinnon, Neil J..  2019.  Deliberative and Affective Reasoning: a Bayesian Dual-Process Model. 2019 8th International Conference on Affective Computing and Intelligent Interaction Workshops and Demos (ACIIW). :388–394.
The presence of artificial agents in human social networks is growing. From chatbots to robots, human experience in the developed world is moving towards a socio-technical system in which agents can be technological or biological, with increasingly blurred distinctions between the two. Given that emotion is a key element of human interaction, enabling artificial agents to reason about affect is a key stepping stone towards a future in which technological agents and humans can work together. This paper presents work on building intelligent computational agents that integrate both emotion and cognition. These agents are grounded in the well-established social-psychological Bayesian Affect Control Theory (BayesAct). The core idea of BayesAct is that humans are motivated in their social interactions by affective alignment: they strive for their social experiences to be coherent at a deep, emotional level with their sense of identity and general world views as constructed through culturally shared symbols. This affective alignment creates cohesive bonds between group members and is instrumental for collaborations to solidify as relational group commitments. BayesAct agents are motivated in their social interactions by a combination of affective alignment and decision-theoretic reasoning, trading the two off as a function of the uncertainty or unpredictability of the situation. This paper provides a high-level view of dual-process theories and advances BayesAct as a plausible, computationally tractable model grounded in social-psychological and sociological theory.
2020-01-20
Krasnobaev, Victor, Kuznetsov, Alexandr, Babenko, Vitalina, Denysenko, Mykola, Zub, Mihael, Hryhorenko, Vlada.  2019.  The Method of Raising Numbers, Represented in the System of Residual Classes to an Arbitrary Power of a Natural Number. 2019 IEEE 2nd Ukraine Conference on Electrical and Computer Engineering (UKRCON). :1133–1138.

Methods for implementing the integer arithmetic operations of addition, subtraction, and multiplication in the system of residual classes are considered. It is shown that their practical use in computer systems can significantly improve the performance of arithmetic operations. A new method has been developed for raising numbers represented in the system of residual classes to an arbitrary power of a natural number, in both the positive and negative number ranges. An example of applying the proposed method to raise numbers represented in the system of residual classes to the power k = 2 is given.
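
As a minimal illustration of the underlying idea (not the authors' method), the sketch below raises a number represented in the system of residual classes to a power by exponentiating each residue channel independently and reconstructing the result with the Chinese Remainder Theorem. The moduli and the k = 2 example are chosen for demonstration only.

```python
# Minimal sketch, not the authors' algorithm: exponentiation of a number
# represented in a residue number system (system of residual classes).
# Each residue channel is raised to the power k independently; the Chinese
# Remainder Theorem recovers the positional result.

from math import prod

MODULI = (7, 11, 13, 17)                    # pairwise coprime base, dynamic range 17017

def to_rns(x: int):
    return tuple(x % m for m in MODULI)

def rns_pow(residues, k: int):
    # No carries cross channels: each residue is exponentiated modulo its own base.
    return tuple(pow(r, k, m) for r, m in zip(residues, MODULI))

def from_rns(residues) -> int:
    # CRT reconstruction; pow(..., -1, m) computes a modular inverse (Python 3.8+).
    M = prod(MODULI)
    return sum(r * (M // m) * pow(M // m, -1, m) for r, m in zip(residues, MODULI)) % M

x, k = 123, 2
assert from_rns(rns_pow(to_rns(x), k)) == x ** k    # 123**2 = 15129 < 17017
```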

2019-12-30
Razaque, Abdul, Jinrui, Wang, Zancheng, Wang, Hani, Qassim Bani, Khaskheli, Murad Ali, Bhutto, Waseem Ahmed.  2018.  Integration of CPU and GPU to Accelerate RSA Modular Exponentiation Operation. 2018 IEEE Long Island Systems, Applications and Technology Conference (LISAT). :1-6.

Nowadays, data security is becoming more and more important, as people store a great deal of personal information on their phones, and that stored information requires security and privacy. Encryption algorithms have become the main means of maintaining data security, so algorithm complexity and encryption efficiency have become the main measures of whether an encryption algorithm is safe and practical. With the development of hardware, many tools are now available to improve such algorithms. Because modular exponentiation in the RSA algorithm can be divided into several parts mathematically, this paper introduces an approach that divides the encryption process and offloads part of the computation to the graphics processing unit (GPU). By exploiting the GPU's capacity for parallel computing, the core of RSA can be accelerated using the central processing unit (CPU) and GPU together. Compute Unified Device Architecture (CUDA) is a platform that combines the CPU and GPU to realize GPU parallel programming, and it is the tool we use to carry out the experiments on accelerating the RSA algorithm. This paper also builds a mathematical model to help understand the mechanism of the RSA encryption algorithm.
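
For context, the sketch below shows the operation being accelerated: RSA modular exponentiation by square-and-multiply. It is plain sequential Python, not the paper's CPU/GPU CUDA implementation, and the toy key is illustrative only.

```python
# Hedged sketch of the core arithmetic, not the paper's CUDA implementation:
# RSA modular exponentiation via left-to-right square-and-multiply.

def mod_exp(base: int, exponent: int, modulus: int) -> int:
    """Square-and-multiply: O(log exponent) modular multiplications."""
    result = 1
    base %= modulus
    for bit in bin(exponent)[2:]:               # exponent bits, most significant first
        result = (result * result) % modulus    # square every step
        if bit == "1":
            result = (result * base) % modulus  # multiply when the bit is set
    return result

# Toy RSA parameters (real keys are 2048+ bits): n = 61 * 53, e = 17, d = 2753.
n, e, d = 3233, 17, 2753
message = 65
ciphertext = mod_exp(message, e, n)             # 2790
assert mod_exp(ciphertext, d, n) == message
```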

2019-12-18
Chen, Jim Q..  2017.  Take the rein of cyber deterrence. 2017 International Conference on Cyber Conflict (CyCon U.S.). :29–35.
Deterrence is badly needed in the cyber domain, but it is hard to achieve. Why is conventional deterrence not working effectively in the cyber domain? What specific characteristics should be considered when deterrence strategies are developed in this man-made domain? These are the questions that this paper intends to address. The research conducted helps to reveal what cyber deterrence can and cannot do, so that focus can be put on enhancing what it can do. To include varied perspectives, a literature review is conducted and some research works are specifically examined. Based on these studies, this research proposes a holistic approach to cyber deterrence that is empowered by artificial intelligence and machine learning. This approach is capable of making sudden, dynamic, stealthy, and random changes initiated by different contexts. It is able to catch attackers by surprise. The surprising and changing impact inflicts a cost on attackers and makes them re-calculate the benefits they might gain through further attacks, thus discouraging or defeating adversaries both mentally and virtually, and eventually controlling the escalation of cyber conflicts.
2019-12-16
Ferdowsi, Farzad, Barati, Masoud, Edrington, Chris S..  2019.  Real-Time Resiliency Assessment of Control Systems in Microgrids Using the Complexity Metric. 2019 IEEE Green Technologies Conference(GreenTech). :1-5.

This paper presents a novel technique to quantify the operational resilience of power-electronics-based components affected by High-Impact Low-Frequency (HILF) weather-related events such as high-speed winds. In this study, resilience quantification is used to investigate how promptly the system returns to the pre-disturbance state or another stable operational state. A complexity quantification metric is used to assess the system resilience. The test system is a Solid-State Transformer (SST), representing a complex, nonlinear interconnected system. Results show the effectiveness of the proposed technique for quantifying operational resilience in systems affected by weather-related disturbances.

2019-12-11
Canetti, Ran, Stoughton, Alley, Varia, Mayank.  2019.  EasyUC: Using EasyCrypt to Mechanize Proofs of Universally Composable Security. 2019 IEEE 32nd Computer Security Foundations Symposium (CSF). :167–16716.

We present a methodology for using the EasyCrypt proof assistant (originally designed for mechanizing the generation of proofs of game-based security of cryptographic schemes and protocols) to mechanize proofs of security of cryptographic protocols within the universally composable (UC) security framework. This allows, for the first time, the mechanization and formal verification of the entire sequence of steps needed for proving simulation-based security in a modular way: Specifying a protocol and the desired ideal functionality; Constructing a simulator and demonstrating its validity, via reduction to hard computational problems; Invoking the universal composition operation and demonstrating that it indeed preserves security. We demonstrate our methodology on a simple example: stating and proving the security of secure message communication via a one-time pad, where the key comes from a Diffie-Hellman key-exchange, assuming ideally authenticated communication. We first put together EasyCrypt-verified proofs that: (a) the Diffie-Hellman protocol UC-realizes an ideal key-exchange functionality, assuming hardness of the Decisional Diffie-Hellman problem, and (b) one-time-pad encryption, with a key obtained using ideal key-exchange, UC-realizes an ideal secure-communication functionality. We then mechanically combine the two proofs into an EasyCrypt-verified proof that the composed protocol realizes the same ideal secure-communication functionality. Although formulating a methodology that is both sound and workable has proven to be a complex task, we are hopeful that it will prove to be the basis for mechanized UC security analyses for significantly more complex protocols and tasks.

2019-12-09
Gao, Yali, Li, Xiaoyong, Li, Jirui, Gao, Yunquan, Yu, Philip S..  2019.  Info-Trust: A Multi-Criteria and Adaptive Trustworthiness Calculation Mechanism for Information Sources. IEEE Access. 7:13999–14012.
Social media have become increasingly popular for sharing and spreading user-generated content thanks to their easy access, fast dissemination, and low cost. Meanwhile, social media also enable the wide propagation of cyber frauds, which leverage fake information sources to reach an ulterior goal. The prevalence of untrustworthy information sources on social media can have significant negative societal effects. In a trustworthy social media system, trust calculation technology has become a key requirement for identifying information sources. Trust, as one of the most complex concepts in network communities, has multi-criteria properties. However, existing work focuses only on a single trust factor and does not fully consider the complexity of trust relationships in social computing. In this paper, a multi-criteria trustworthiness calculation mechanism called Info-Trust is proposed for information sources, in which identity-based, behavior-based, relation-based, and feedback-based trust factors are incorporated to present an accuracy-enhanced full view of the trustworthiness of information sources. More importantly, the weights of these factors are dynamically assigned by the ordered weighted averaging and weighted moving average (OWA-WMA) combination algorithm. This mechanism surpasses the limitations of existing approaches in which the weights are assigned subjectively. Experimental results based on real-world datasets from Sina Weibo demonstrate that the proposed mechanism achieves greater accuracy and adaptability in identifying the trustworthiness of network information.
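
As a hedged illustration of one ingredient named in the abstract, the sketch below shows an ordered weighted averaging (OWA) aggregation of four trust factors; it is not the Info-Trust implementation, and the scores and weight vector are hypothetical. In the paper the weights themselves are assigned dynamically via the OWA-WMA combination, which is not reproduced here.

```python
# Illustrative sketch, not the Info-Trust mechanism: the ordered weighted
# averaging (OWA) step for combining trust factors. OWA weights attach to
# rank positions rather than to specific factors. All values are hypothetical.

def owa(values, weights):
    """Ordered weighted average of the factor scores."""
    assert abs(sum(weights) - 1.0) < 1e-9, "weights must sum to 1"
    ordered = sorted(values, reverse=True)          # strongest evidence first
    return sum(w * v for w, v in zip(weights, ordered))

# Identity-, behavior-, relation-, and feedback-based trust for one source.
factors = {"identity": 0.9, "behavior": 0.6, "relation": 0.7, "feedback": 0.4}
weights = [0.4, 0.3, 0.2, 0.1]                      # emphasis on the top-ranked factors
print(round(owa(factors.values(), weights), 3))     # 0.73
```
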
2019-11-25
Abdulwahab, Walled Khalid, Abdulrahman Kadhim, Abdulkareem.  2018.  Comparative Study of Channel Coding Schemes for 5G. 2018 International Conference on Advanced Science and Engineering (ICOASE). :239–243.
In this paper we look into the 5G requirements for channel coding and review candidate channel coding schemes for 5G. A comparative study is presented of possible channel coding candidates for 5G, covering Convolutional, Turbo, Low Density Parity Check (LDPC), and Polar codes. Polar codes with Successive Cancellation List (SCL) decoding using a small list length (such as 8) appear to be a promising choice for short message lengths (≤128 bits) due to their error performance and relatively low complexity. Adopting non-binary LDPC can also provide good performance, at the expense of increased complexity but with better spectral efficiency. Considering the implementation, polar codes with SCL-based decoding algorithms require a small area and low power consumption compared to LDPC codes. For larger message lengths (≥256 bits), turbo codes can provide better performance at low coding rates (<1/2).
2019-11-12
Wei, Shengjun, Zhong, Hao, Shan, Chun, Ye, Lin, Du, Xiaojiang, Guizani, Mohsen.  2018.  Vulnerability Prediction Based on Weighted Software Network for Secure Software Building. 2018 IEEE Global Communications Conference (GLOBECOM). :1-6.

To build secure communications software, Vulnerability Prediction Models (VPMs) are used to predict vulnerable software modules in a software system before software security testing. Many software security metrics have been proposed for designing VPMs. In this paper, we predict vulnerable classes in a software system by establishing the system's weighted software network. The metrics are obtained from the nodes' attributes in the weighted software network. We design and implement a crawler tool to collect all public security vulnerabilities in Mozilla Firefox. Based on these data, the prediction model is trained and tested. The results show that the VPM based on the weighted software network performs well in terms of accuracy, precision, and recall. Compared to other studies, prediction performance is greatly improved in terms of precision (Pr) and recall (Re).

2019-10-23
Zieger, Andrej, Freiling, Felix, Kossakowski, Klaus-Peter.  2018.  The β-Time-to-Compromise Metric for Practical Cyber Security Risk Estimation. 2018 11th International Conference on IT Security Incident Management & IT Forensics (IMF). :115-133.

To manage cybersecurity risks in practice, a simple yet effective method to assess such risks for individual systems is needed. With time-to-compromise (TTC), McQueen et al. (2005) introduced such a metric, measuring the expected time that a system remains uncompromised given a specific threat landscape. Unlike other approaches that require complex system modeling, TTC combines simplicity with expressiveness and has therefore evolved into one of the most successful cybersecurity metrics in practice. We revisit TTC and identify several mathematical and methodological shortcomings, which we address by embedding all aspects of the metric in the continuous domain and by adding the possibility of incorporating information about vulnerability characteristics and other cyber threat intelligence into the model. We propose β-TTC, a formal extension of TTC that includes information from CVSS vectors as well as a continuous attacker skill based on a β-distribution. We show that our new metric (1) remains simple enough for practical use and (2) gives more realistic predictions than the original TTC, using data from a modern, productively used vulnerability database of a national CERT.

2019-10-22
Deb Nath, Atul Prasad, Bhunia, Swarup, Ray, Sandip.  2018.  ArtiFact: Architecture and CAD Flow for Efficient Formal Verification of SoC Security Policies. 2018 IEEE Computer Society Annual Symposium on VLSI (ISVLSI). :411–416.
Verification of security policies represents one of the most critical, complex, and expensive steps of modern SoC design validation. SoC security policies are typically implemented as part of the functional design flow, with a diverse set of protection mechanisms sprinkled across various IP blocks. An obvious upshot is that their verification requires comprehension and analysis of the entire system, representing a scalability bottleneck for verification tools. The scale and complexity of industrial SoCs are far beyond the analysis capacity of state-of-the-art formal tools; even simulation-based security verification is severely limited in effectiveness because of the need to exercise subtle corner cases across the entire system. We address this challenge by developing a novel security architecture that accounts for verification needs from the ground up. Our framework, ArtiFact, provides an alternative architecture for security policy implementation that exploits a flexible, centralized infrastructure IP and enables scalable, streamlined verification of these policies. With our architecture, verification of system-level security policies reduces to the analysis of this single IP and its interfaces, enabling off-the-shelf formal tools to successfully verify these policies. We introduce a CAD flow that supports both formal and dynamic (simulation-based) verification and is built on top of such off-the-shelf tools. Our approach reduces verification time by over 62X and bug detection time by 34X for illustrative policies.