Security Measurement and Metric Methods, 2014

SoS Newsletter- Advanced Book Block



Measurement and metrics are hard problems in the Science of Security. The research cited here examines methods and techniques for developing valid measurements and metrics. This work was presented in 2014.

Moeti, M.; Kalema, B.M., "Analytical Hierarchy Process Approach for the Metrics of Information Security Management Framework," Computational Intelligence, Communication Systems and Networks (CICSyN), 2014 Sixth International Conference on, pp. 89-94, 27-29 May 2014. doi:10.1109/CICSyN.2014.31
Abstract: Organizations' information technology systems are increasingly being attacked and exposed to risks that lead to loss of valuable information and money. The vulnerable systems and applications are, basically, networks, databases, web services, internet-based services and communications, mobile technologies, and the people issues associated with them. The major objective of this study, therefore, was to identify the metrics needed for the development of an information security management framework. From related literature, relevant metrics were identified using textual analysis and grouped into six categories: organizational, environmental, contingency management, security policy, internal control, and information and risk management. These metrics were validated in a framework by using the analytical hierarchy process (AHP) method. Results of the study indicated that environmental metrics play a critical role in information security management compared to other metrics, whereas the information and risk management metrics were found to be less significant in the rankings. This study contributes to the information security management body of knowledge by providing a single empirically validated framework that can be used theoretically to extend research in the domain of the study and practically by management when making decisions relating to security management.
Keywords: Internet; analytic hierarchy process; risk management; security of data; AHP; Internet-based services; Web services; analytical hierarchy process approach; databases; information security management framework metrics; mobile technologies; organizations information technology systems; risk management metrics; security management; Contingency management; Educational institutions; Information security; Measurement; Organizations; Risk management; analytical hierarchical process; information security metrics; integrated system theory; theories of information security (ID#: 15-6060)
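The AHP validation step described above can be sketched in a few lines. This is a minimal geometric-mean approximation of the AHP priority vector; the 3x3 comparison matrix and the category labels are illustrative assumptions, not values from the paper.

```python
import math

def ahp_weights(M):
    """Approximate the AHP priority vector with the geometric-mean method.
    M is an n x n pairwise-comparison matrix where M[i][j] states how much
    more important criterion i is than criterion j."""
    n = len(M)
    gm = [math.prod(row) ** (1.0 / n) for row in M]
    total = sum(gm)
    return [g / total for g in gm]

# Hypothetical comparison of three metric categories
# (environmental vs. security policy vs. information/risk management).
M = [[1,   3,   5],
     [1/3, 1,   2],
     [1/5, 1/2, 1]]
weights = ahp_weights(M)  # environmental dominates under these judgments
```

A full AHP application would also check the consistency ratio of the judgment matrix before trusting the resulting ranking.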


Manandhar, K.; Xiaojun Cao; Fei Hu; Yao Liu, "Combating False Data Injection Attacks in Smart Grid Using Kalman Filter," Computing, Networking and Communications (ICNC), 2014 International Conference on, pp. 16-20, 3-6 Feb. 2014. doi:10.1109/ICCNC.2014.6785297
Abstract: The security of the Smart Grid, being one of the most important aspects of the Smart Grid system, is studied in this paper. We first discuss different pitfalls in the security of the Smart Grid system, considering the communication infrastructure among the sensors, actuators, and control systems. Following that, we derive a mathematical model of the system and propose a robust security framework for the power grid. To effectively estimate the variables of a wide range of state processes in the model, we adopt a Kalman Filter in the framework. The Kalman Filter estimates and system readings are then fed into the χ2 detectors and the proposed Euclidean detectors, which can detect various attacks and faults in the power system, including False Data Injection Attacks. The χ2 detector is a proven-effective exploratory method used with the Kalman Filter to measure the relationship between dependent variables and a series of predictor variables; it can detect system faults/attacks such as replay and DoS attacks. However, the study shows that the χ2 detectors are unable to detect statistically derived False Data Injection Attacks, while the Euclidean distance metrics can identify such sophisticated injection attacks.
Keywords: Kalman filters; computer network security; electric sensing devices; fault diagnosis; power engineering computing; power system faults; power system security; power system state estimation; smart power grids; X2-square detector; DoS attacks; Euclidean detector; Euclidean distance metrics; Kalman filter; actuators; communication infrastructure; control systems; false data injection attack detection; fault detection; mathematical model; power system; predictor variable series; sensors; smart power grid security; state process; Detectors; Equations; Kalman filters; Mathematical model; Security; Smart grids (ID#: 15-6061)
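The two detector tests described above can be sketched as follows, assuming a scalar χ2 (normalized-innovation) test and a vector Euclidean-distance test over Kalman predictions; the thresholds and variances are illustrative, not the paper's values.

```python
import math

def chi_square_detector(innovation, innovation_var, threshold=6.63):
    """Normalized innovation squared test used alongside a Kalman filter.
    6.63 is the chi-square 99% critical value for 1 degree of freedom."""
    g = innovation ** 2 / innovation_var
    return g > threshold

def euclidean_detector(measurements, predictions, threshold):
    """Euclidean distance between raw measurements and Kalman predictions;
    intended to flag stealthy injections that keep the normalized residual
    small enough to pass the chi-square test."""
    dist = math.sqrt(sum((m - p) ** 2 for m, p in zip(measurements, predictions)))
    return dist > threshold
```

In the paper's framework both tests run side by side: an alarm from either one marks the reading as a fault or an attack.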


Karabat, C.; Topcu, B., "How to Assess Privacy Preservation Capability of Biohashing Methods?: Privacy Metrics," Signal Processing and Communications Applications Conference (SIU), 2014 22nd, pp. 2217-2220, 23-25 April 2014. doi:10.1109/SIU.2014.6830705
Abstract: In this paper, we evaluate the privacy preservation capability of biometric hashing methods. Although there is some work in the literature on privacy evaluation of biometric template protection methods, it fails to cover all such methods. To the best of our knowledge, there is no work on privacy metrics and assessment for biometric hashing methods. In this work, we use several metrics under different threat scenarios to assess the privacy protection level of biometric hashing methods. The simulation results demonstrate that biometric hash vectors may leak private information, especially under advanced threat scenarios.
Keywords: authorisation; biometrics (access control); data protection; biometric hash vectors; biometric hashing methods; biometric template protection methods; privacy metrics; privacy preservation capability assessment; privacy preservation capability evaluation; privacy protection level assessment; private information leakage; threat scenarios; Conferences; Internet; Measurement; Privacy; Security; Signal processing; Simulation; biometric; biometric hash; metrics; privacy (ID#: 15-6062)


Hong, J.B.; Dong Seong Kim; Haqiq, A., "What Vulnerability Do We Need to Patch First?," Dependable Systems and Networks (DSN), 2014 44th Annual IEEE/IFIP International Conference on, pp. 684-689, 23-26 June 2014. doi:10.1109/DSN.2014.68
Abstract: Computing a prioritized set of vulnerabilities to patch is important for system administrators, since it determines the order in which the vulnerabilities most critical to network security are patched. One way to assess and analyze security to find vulnerabilities to be patched is to use attack representation models (ARMs). However, security solutions using ARMs are optimized only for the current state of the networked system, so the ARM must reanalyze the network security after each change, causing multiple iterations of the same task to obtain the prioritized set of vulnerabilities to patch. To address this problem, we propose to use importance measures to rank network hosts and vulnerabilities, and then combine these measures to prioritize the order in which vulnerabilities are patched. We show that a nearly equivalent prioritized set of vulnerabilities can be computed in comparison with an exhaustive search method in various network scenarios, while the performance of computing the set is dramatically improved.
Keywords: security of data; ARM; attack representation models; importance measures; network hosts; network security; networked system; prioritized set; security solutions; system administrators; vulnerability patch; Analytical models; Computational modeling; Equations; Mathematical model; Measurement; Scalability; Security; Attack Representation Model; Network Centrality; Security Analysis; Security Management; Security Metrics; Vulnerability Patch (ID#: 15-6063)
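One plausible reading of the combined importance measure above is "centrality of the hosting node times vulnerability severity." The combination rule, the host names, and the CVE labels below are all illustrative assumptions, not the paper's exact formulation.

```python
def prioritize_vulns(centrality, severity, host_of):
    """Rank vulnerabilities by a hypothetical combined importance measure:
    (network centrality of the hosting node) x (vulnerability severity)."""
    score = {v: centrality[host_of[v]] * s for v, s in severity.items()}
    return sorted(score, key=score.get, reverse=True)

# A severe flaw on a peripheral host can rank below a lesser flaw
# on a highly central host.
ranked = prioritize_vulns(
    centrality={"web": 0.9, "db": 0.6, "printer": 0.1},
    severity={"CVE-A": 7.5, "CVE-B": 9.8, "CVE-C": 10.0},
    host_of={"CVE-A": "web", "CVE-B": "db", "CVE-C": "printer"},
)
```

Because the ranking is computed from per-host and per-vulnerability measures rather than from a full model re-analysis, it avoids rerunning the ARM after every patch.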


Nascimento, Z.; Sadok, D.; Fernandes, S.; Kelner, J., "Multi-Objective Optimization of a Hybrid Model for Network Traffic Classification by Combining Machine Learning Techniques," Neural Networks (IJCNN), 2014 International Joint Conference on, pp. 2116-2122, 6-11 July 2014. doi:10.1109/IJCNN.2014.6889935
Abstract: Considerable effort has been made by researchers in the area of network traffic classification, since the Internet is constantly changing. This characteristic makes traffic identification not a straightforward process. Besides that, encrypted data is being widely used by applications and protocols. There are several methods for classifying network traffic, such as known ports and Deep Packet Inspection (DPI), but they are not effective since many applications constantly randomize their ports and the payload may be encrypted. This paper proposes a hybrid model that makes use of a classifier based on computational intelligence, the Extreme Learning Machine (ELM), along with Feature Selection (FS) and Multi-objective Genetic Algorithms (MOGA), to classify computer network traffic without making use of payload or port information. The proposed model presented good results when evaluated against the UNIBS data set, using four performance metrics: Recall, Precision, Flow Accuracy, and Byte Accuracy, with most rates exceeding 90%. Besides that, it identifies the best features and feature selection algorithm for the given problem, along with the best ELM parameters.
Keywords: Internet; computer network security; cryptography; feature selection; genetic algorithms; learning (artificial intelligence); pattern classification; protocols; telecommunication traffic; DPI; ELM parameters; Internet; MOGA; UNIBS data set; byte accuracy; computational intelligence; computer network traffic classification; deep packet inspection; encrypted data; extreme learning machine; feature selection algorithm; flow accuracy; hybrid model; machine learning techniques; multiobjective genetic algorithms; multiobjective optimization; payload encryption; precision; protocols; recall; traffic identification; Accuracy; Computational modeling; Genetic algorithms; Measurement; Optimization; Ports (Computers); Protocols (ID#: 15-6064)


Hatzivasilis, G.; Papaefstathiou, I.; Manifavas, C.; Papadakis, N., "A Reasoning System for Composition Verification and Security Validation," New Technologies, Mobility and Security (NTMS), 2014 6th International Conference on, pp. 1-4, 30 March-2 April 2014. doi:10.1109/NTMS.2014.6814001
Abstract: Proving that a system-of-systems is composable and secure is a very difficult task. Formal methods are mathematically based techniques used for the specification, development and verification of software and hardware systems. This paper presents a model-based framework for dynamic embedded system composition and security evaluation. Event Calculus is applied to model the security behavior of a dynamic system and to calculate its security level as time progresses. The framework includes two main functionalities: composition validation and derivation of security and performance metrics and properties. Starting from an initial system state and given a series of further composition events, the framework derives the final system state as well as its security and performance metrics and properties. We implement the proposed framework as an epistemic reasoner using the rule engine JESS, extended with DECKT for the reasoning process, and the Java programming language.
Keywords: Java; embedded systems; formal specification; formal verification; reasoning about programs; security of data; software metrics; temporal logic; DECKT; JAVA programming language; composition validation; composition verification; dynamic embedded system composition; epistemic reasoner; event calculus; formal methods; model-based framework; performance metrics; reasoning system; rule engine JESS; security evaluation; security validation; system specification; system-of-systems; Cognition; Computational modeling; Embedded systems; Measurement; Protocols; Security; Unified modeling language (ID#: 15-6064)


Axelrod, C.W., "Reducing Software Assurance Risks for Security-Critical and Safety-Critical Systems," Systems, Applications and Technology Conference (LISAT), 2014 IEEE Long Island, pp. 1-6, 2 May 2014. doi:10.1109/LISAT.2014.6845212
Abstract: According to the Office of the Assistant Secretary of Defense for Research and Engineering (ASD(R&E)), the US Department of Defense (DoD) recognizes that there is a “persistent lack of a consistent approach ... for the certification of software assurance tools, testing and methodologies” [1]. As a result, the ASD(R&E) is seeking “to address vulnerabilities and weaknesses to cyber threats of the software that operates ... routine applications and critical kinetic systems ...” The mitigation of these risks has been recognized as a significant issue to be addressed in both the public and private sectors. In this paper we examine deficiencies in various software-assurance approaches and suggest ways in which they can be improved. We take a broad look at current approaches, identify their inherent weaknesses and propose approaches that serve to reduce risks. Some technical, economic and governance issues are: (1) development of software-assurance technical standards; (2) management of software-assurance standards; (3) evaluation of tools, techniques, and metrics; (4) determination of update frequency for tools and techniques; (5) focus on the most pressing threats to software systems; (6) suggestions as to risk-reducing research areas; and (7) establishment of models of the economics of software-assurance solutions, and of testing and certifying software. We show that, in order to improve current software assurance policy and practices, particularly with respect to security, there has to be a major overhaul in how software is developed, especially with respect to the requirements and testing phases of the SDLC (Software Development Lifecycle). We also suggest that the current preventative approaches are inadequate and that greater reliance should be placed upon avoidance and deterrence.
We also recommend that those developing and operating security-critical and safety-critical systems exchange best-of-breed software assurance methods to prevent the vulnerability of components leading to compromise of entire systems of systems. The recent catastrophic loss of a Malaysia Airlines airplane is then presented as an example of possible compromises of physical and logical security of on-board communications and management and control systems.
Keywords: program testing; safety-critical software; software development management; software metrics; ASD(R&E);Assistant Secretary of Defense for Research and Engineering; Malaysia Airlines airplane; SDLC; US Department of Defense; US DoD; component vulnerability prevention; control systems; critical kinetic systems; cyber threats; economic issues; governance issues; logical security; management systems; on-board communications; physical security; private sectors; public sectors; risk mitigation; safety-critical systems; security-critical systems; software assurance risk reduction; software assurance tool certification; software development; software development lifecycle; software methodologies; software metric evaluation; software requirements; software system threats; software technique evaluation; software testing; software tool evaluation; software-assurance standard management; software-assurance technical standard development; technical issues; update frequency determination; Measurement; Organizations; Security; Software systems; Standards; Testing; cyber threats; cyber-physical systems; governance; risk; safety-critical systems; security-critical systems; software assurance; technical standards; vulnerabilities; weaknesses (ID#: 15-6066)


Chulhee Lee; Jiheon Ok; Guiwon Seo, "Objective Video Quality Measurement Using Embedded VQMs," Heterogeneous Networking for Quality, Reliability, Security and Robustness (QShine), 2014 10th International Conference on, pp. 129-130, 18-20 Aug. 2014. doi:10.1109/QSHINE.2014.6928671
Abstract: Video quality monitoring has become an important issue as multimedia data is increasingly being transmitted over the Internet and wireless channels where transmission errors can frequently occur. Although no-reference models are suitable to such applications, current no-reference methods do not provide acceptable performance. In this paper, we propose an objective video quality assessment method using embedded video quality metrics (VQMs). In the proposed method, the video quality of encoded video data is computed at the transmitter during the encoding process. The computed VQMs are embedded in the compressed data. If there are no transmission errors, the video quality at the receiver would be identical with that of the transmitting side. If there are transmission errors, the receiver adjusts the embedded VQMs by taking into account the effects of transmission errors. The proposed method is fast and provides good performance.
Keywords: data compression; video coding; embedded VQM; embedded video quality metric; multimedia data; objective video quality measurement; video data encoding; video quality monitoring; Bit rate; Neural networks; Quality assessment; Receivers; Transmitters; Video recording; Video sequences; embedded VQM; no-reference; quality monitoring; video quality assessment (ID#: 15-6067)


Shangdong Liu; Jian Gong; Jianxin Chen; Yanbing Peng; Wang Yang; Weiwei Zhang; Jakalan, A., "A Flow Based Method to Detect Penetration," Advanced Infocomm Technology (ICAIT), 2014 IEEE 7th International Conference on, pp. 184-191, 14-16 Nov. 2014. doi:10.1109/ICAIT.2014.7019551
Abstract: With the rapid expansion of the Internet, network security has become more and more important. An IDS (Intrusion Detection System) is an important technology for coping with network attacks and comes in two main types: network based (NIDS) and host based (HIDS). In this paper, we propose the concept of NFPPB (Network Flow Patterns of Penetrating Behavior) for network vulnerable ports and design a NIDS algorithm to detect the infiltration behaviors of attackers. Essentially, NFPPB is a set of metrics calculated from network activities exploiting the vulnerabilities of hosts. The paper investigates the choice, generation, and comparison of NFPPB metrics. Experiments show that the method is effective and highly efficient. Finally, the paper addresses future directions and the points that need to be improved.
Keywords: computer network security; IDS; flow based method; intrusion detection system; network attacks; network flow patterns of penetrating behavior; network security; network vulnerable ports; Educational institutions; IP networks; Law; Measurement; Ports (Computers); Security; Flow Records; IDS; Infiltration Detection; Penetration Detection (ID#: 15-6068)


Feng Li; Chin-Tser Huang; Jie Huang; Wei Peng, "Feedback-Based Smartphone Strategic Sampling for BYOD Security," Computer Communication and Networks (ICCCN), 2014 23rd International Conference on, pp. 1-8, 4-7 Aug. 2014. doi:10.1109/ICCCN.2014.6911814
Abstract: Bring Your Own Device (BYOD) is an information technology (IT) policy that allows employees to use their own wireless devices to access the internal network at work. Mobile malware is a major security concern that impedes BYOD's further adoption in enterprises. Existing works identify the need for better BYOD security mechanisms that balance the strength of such mechanisms against the costs of implementing them. In this paper, based on the idea of a self-reinforced feedback loop, we propose a periodic smartphone sampling mechanism that significantly improves the effectiveness of BYOD security mechanisms without incurring further costs. We quantify the likelihood that “a BYOD smartphone is infected by malware” by two metrics, vulnerability and uncertainty, and base the iterative sampling process on these two metrics; the updated values of these metrics are fed back into future rounds of the mechanism to complete the feedback loop. We validate the efficiency and effectiveness of the proposed strategic sampling via simulations driven by publicly available, real-world collected traces.
Keywords: invasive software; iterative methods; mobile computing; sampling methods; smart phones; telecommunication security; BYOD security; BYOD smartphone; Bring Your Own Device; IT policy; feedback-based smartphone strategic sampling; information technology; iterative sampling process; mobile malware; periodic smartphone sampling mechanism; self-reinforced feedback loop; wireless device; Feedback loop; Malware; Measurement; Topology; Uncertainty; Wireless communication; Enterprise network; probabilistic algorithm; smartphone security; social network; strategic sampling; uncertainty metric; vulnerability metric (ID#: 15-6069)
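A single round of the feedback loop above can be sketched as follows. The score combination (vulnerability x uncertainty) and the update constants are assumptions made for illustration, not the paper's exact formulas.

```python
def sampling_round(devices, k):
    """One round of strategic sampling: scan the k devices with the highest
    vulnerability x uncertainty score, then feed the results back so that
    scanned devices have low uncertainty and unscanned ones grow stale."""
    ranked = sorted(devices,
                    key=lambda d: devices[d]["vuln"] * devices[d]["unc"],
                    reverse=True)
    chosen = ranked[:k]
    for name, state in devices.items():
        if name in chosen:
            state["unc"] = 0.1                          # just scanned
        else:
            state["unc"] = min(1.0, state["unc"] + 0.05)  # staleness grows
    return chosen
```

Over repeated rounds, devices that are never sampled accumulate uncertainty and eventually win a sampling slot, which is the self-reinforcing part of the loop.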


Vaarandi, R.; Pihelgas, M., "Using Security Logs for Collecting and Reporting Technical Security Metrics," Military Communications Conference (MILCOM), 2014 IEEE, pp. 294-299, 6-8 Oct. 2014. doi:10.1109/MILCOM.2014.53
Abstract: During recent years, establishing proper metrics for measuring system security has received increasing attention. Security logs contain vast amounts of information which are essential for creating many security metrics. Unfortunately, security logs are known to be very large, making their analysis a difficult task. Furthermore, recent security metrics research has focused on generic concepts, and the issue of collecting security metrics with log analysis methods has not been well studied. In this paper, we first focus on using log analysis techniques for collecting technical security metrics from security logs of common types (e.g., network IDS alarm logs, workstation logs, and NetFlow data sets). We also describe a production framework for collecting and reporting technical security metrics which is based on novel open-source technologies for big data.
Keywords: Big Data; computer network security; big data; log analysis methods; log analysis techniques; open source technology; security logs; technical security metric collection; technical security metric reporting; Correlation; Internet; Measurement; Monitoring; Peer-to-peer computing; Security; Workstations; security log analysis; security metrics (ID#: 15-6070)
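A minimal example of deriving a technical security metric from raw logs, in the spirit of the entry above: failed SSH login attempts per source IP, extracted from syslog-style auth logs. The log format and regex are assumptions; production frameworks like the one described would handle far larger volumes and more log types.

```python
import re
from collections import Counter

FAILED = re.compile(r"Failed password .* from (\d+\.\d+\.\d+\.\d+)")

def failed_logins_per_source(log_lines):
    """Count failed SSH login attempts per source IP from auth log lines
    (assumed OpenSSH-style format) -- a simple technical security metric."""
    counts = Counter()
    for line in log_lines:
        m = FAILED.search(line)
        if m:
            counts[m.group(1)] += 1
    return counts
```

Reported over time, such a counter already supports metrics like "failed logins per host per day" and trend alarms on sudden spikes.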


Scholtz, J.; Endert, A., "User-Centered Design Guidelines for Collaborative Software for Intelligence Analysis," Collaboration Technologies and Systems (CTS), 2014 International Conference on, pp. 478-482, 19-23 May 2014. doi:10.1109/CTS.2014.6867610
Abstract: In this position paper we discuss the necessity of using User-Centered Design (UCD) methods in order to design collaborative software for the intelligence community. We discuss a number of studies of collaboration in the intelligence community and use this information to provide some guidelines for collaboration software.
Keywords: groupware; police data processing; user centred design; UCD methods; collaborative software; intelligence analysis; intelligence community; user-centered design guidelines; Collaborative software; Communities; Guidelines; Integrated circuits; Measurement; Software; Intelligence community; collaboration; evaluation; metrics; user-centered design (ID#: 15-6071)


Keramati, Marjan; Keramati, Mahsa, "Novel Security Metrics for Ranking Vulnerabilities in Computer Networks," Telecommunications (IST), 2014 7th International Symposium on, pp. 883-888, 9-11 Sept. 2014. doi:10.1109/ISTEL.2014.7000828
Abstract: With the daily increase in the appearance of vulnerabilities and the various ways of intruding into networks, network hardening has become one of the most important fields in network security, and it can be achieved by patching vulnerabilities. But patching all vulnerabilities may impose high costs on the network, so we should try to eliminate only the most perilous vulnerabilities. CVSS can score vulnerabilities based on the amount of damage they incur in the network, but its main problem is that it scores individual vulnerabilities without considering their relationships with other vulnerabilities in the network. To help fill this gap, in this paper we define some attack-graph and CVSS-based security metrics that can help prioritize vulnerabilities in the network by measuring both the probability of exploiting them and the amount of damage they will impose on the network. The proposed security metrics are defined by considering the interaction between all vulnerabilities of the network, so our method can rank vulnerabilities based on the network they exist in. Results of applying these security metrics to one well-known network example are also shown, demonstrating the effectiveness of our approach.
Keywords: computer network security; matrix algebra; probability; CVSS-based security metrics; common vulnerability scoring system; computer network; intruding network security; probability; ranking vulnerability; Availability; Communication networks; Complexity theory; Computer networks; Educational institutions; Measurement; Security; Attack Graph; CVSS; Exploit; Network hardening; Security Metric; Vulnerability (ID#: 15-6072)
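One common way to combine attack graphs with CVSS, as the entry above describes, is to treat each edge's exploit probability as the base score divided by 10 and take the maximum over attack paths. The combination rule and the toy graph below are illustrative assumptions, not the paper's exact metrics.

```python
def max_path_probability(graph, base_score, src, dst, seen=frozenset()):
    """Maximum, over attack paths from src to dst, of the product of
    per-step exploit probabilities (CVSS base score / 10)."""
    if src == dst:
        return 1.0
    seen = seen | {src}
    best = 0.0
    for nxt in graph.get(src, ()):
        if nxt not in seen:
            p = (base_score[nxt] / 10.0) * max_path_probability(
                graph, base_score, nxt, dst, seen)
            best = max(best, p)
    return best

# Hypothetical network: attacker -> web server -> {db, file server} -> db
g = {"attacker": ["web"], "web": ["db", "file"], "file": ["db"]}
scores = {"web": 9.0, "db": 6.0, "file": 8.0}
risk = max_path_probability(g, scores, "attacker", "db")
```

Here the direct path attacker→web→db (0.9 x 0.6 = 0.54) dominates the longer path through the file server (0.9 x 0.8 x 0.6 = 0.432), so the metric ranks the web-server vulnerability as the one whose patching most reduces risk to the database.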


Samuvelraj, G.; Nalini, N., "A Survey of Self Organizing Trust Method to Avoid Malicious Peers from Peer to Peer Network," Green Computing Communication and Electrical Engineering (ICGCCEE), 2014 International Conference on, pp. 1-4, 6-8 March 2014. doi:10.1109/ICGCCEE.2014.6921379
Abstract: Networks are subject to attacks from malicious sources, and sending data securely over a network is one of the most tedious processes. A peer-to-peer (P2P) network is a type of decentralized and distributed network architecture in which individual nodes act as both servers and clients of resources. Peer-to-peer systems are incredibly flexible and can be used for a wide range of functions, but they are also prone to malicious attacks. To provide security over a peer-to-peer system, the self-organizing trust model has been proposed. Here the trustworthiness of peers is calculated based on past interactions and recommendations, which are evaluated using importance, recentness, and satisfaction parameters. In this way, good peers are able to form trust relationships in their proximity and avoid malicious peers.
Keywords: client-server systems; computer network security; fault tolerant computing; peer-to-peer computing; recommender systems; trusted computing; P2P network; client-server resources; decentralized network architecture; distributed network architecture; malicious attacks; malicious peers; malicious sources; peer to peer network; peer to peer systems; peer trustworthiness; satisfaction parameters;self organizing trust method; self-organizing trust model; Computer science; History; Measurement; Organizing; Peer-to-peer computing; Security; Servers; Metrics; Network Security; Peer to Peer; SORT (ID#: 15-6073)
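A trust score built from the three parameters named above (satisfaction, importance, recentness) can be sketched as a recency-weighted average. The geometric fade factor and the exact weighting are illustrative assumptions, not the SORT model's actual equations.

```python
def trust_score(history, fade=0.9):
    """Recency-weighted satisfaction: each interaction is a (satisfaction,
    importance) pair with satisfaction in [0, 1]; older interactions fade
    geometrically, so recent behavior dominates the score."""
    score = norm = 0.0
    for age, (satisfaction, importance) in enumerate(reversed(history)):
        w = importance * fade ** age
        score += w * satisfaction
        norm += w
    return score / norm if norm else 0.0
```

A peer whose recent interactions are unsatisfactory sees its score drop quickly even if its distant history was good, which is what lets good peers shed malicious neighbors.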


Desouky, A.F.; Beard, M.D.; Etzkorn, L.H., "A Qualitative Analysis of Code Clones and Object Oriented Runtime Complexity Based on Method Access Points," Convergence of Technology (I2CT), 2014 International Conference for, pp. 1-5, 6-8 April 2014. doi:10.1109/I2CT.2014.7092292
Abstract: In this paper, we present a new object oriented complexity metric based on runtime method access points. Software engineering metrics have traditionally indicated the level of quality present in a software system. However, the analysis and measurement of quality has long been captured at compile time, rendering results that are useful but potentially incomplete, since all source code is considered in metric computation rather than the subset of code that actually executes. In this study, we examine the runtime behavior of our proposed metric on an open source software package, Rhino 1.7R4. We compute and validate our metric by correlating it with code clones and bug data. Code clones are considered to make software more complex and harder to maintain; when cloned, a code fragment containing an error quickly transforms into two (or more) errors, each of which can affect the software system in unique ways. Thus a larger number of code clones is generally considered to indicate poorer software quality. For this reason, we treat clones as an external quality factor, in addition to bugs, for metric validation.
Keywords: object-oriented programming; program verification; public domain software; security of data; software metrics; software quality; source code (software); Rhino 1.7R4; bug data; code clones; metric computation; metric validation; object oriented runtime complexity; open source software package; qualitative analysis; runtime method access points; software engineering metrics; source code; Cloning; Complexity theory; Computer bugs; Correlation; Measurement; Runtime; Software; Code Clones; Complexity; Object Behavior; Object Oriented Runtime Metrics; Software Engineering (ID#: 15-6074)
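The idea of counting runtime method access points, as opposed to static metric computation, can be illustrated with a toy call tracer. The paper studies Rhino (a Java codebase); the Python `sys.settrace` approach below is purely an illustrative stand-in for runtime method-access counting.

```python
import sys
from collections import Counter

def runtime_call_counts(entry_point):
    """Count method/function call events while entry_point runs -- a toy
    analogue of measuring complexity from what actually executes rather
    than from all source code."""
    counts = Counter()
    def tracer(frame, event, arg):
        if event == "call":
            counts[frame.f_code.co_name] += 1
        return tracer
    sys.settrace(tracer)
    try:
        entry_point()
    finally:
        sys.settrace(None)
    return counts
```

A method that is never reached at runtime contributes nothing to the count, which is exactly the gap between compile-time and runtime complexity that the paper examines.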


Snigurov, A.; Chakrian, V., "The DoS Attack Risk Calculation Based on the Entropy Method and Critical System Resources Usage," Infocommunications Science and Technology, 2014 First International Scientific-Practical Conference Problems of, pp. 186-187, 14-17 Oct. 2014. doi:10.1109/INFOCOMMST.2014.6992346
Abstract: The paper focuses on an algorithm for denial-of-service risk calculation that uses the entropy method and considers additional coefficients of critical system resource usage on the network node. Decisions on traffic routing or attack prevention can then be made based on the level of risk.
Keywords: computer network security; telecommunication traffic; DoS attack risk calculation; critical system resource usage; denial of service risk calculation; entropy method; traffic routing; Computer crime; Entropy; Information security; Random access memory; Routing; Sockets; Time measurement; DoS attack; entropy; information security risk; routing metrics (ID#: 15-6075)
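A sketch of such an entropy-plus-resources risk score: Shannon entropy of the observed source-IP distribution is turned into an anomaly term (traffic concentrated on few sources scores high), then combined with CPU and memory usage coefficients. The direction of the entropy term and all weights are assumptions for illustration; the paper's actual coefficients differ.

```python
import math
from collections import Counter

def shannon_entropy(values):
    """Shannon entropy (bits) of the empirical distribution of values."""
    counts = Counter(values)
    n = len(values)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def dos_risk(src_ips, cpu_load, mem_load,
             w_entropy=0.6, w_cpu=0.2, w_mem=0.2):
    """Hypothetical DoS risk score in [0, 1]: entropy anomaly of the
    source-IP distribution plus critical-resource usage coefficients."""
    max_h = math.log2(len(set(src_ips))) or 1.0
    anomaly = 1.0 - shannon_entropy(src_ips) / max_h  # source concentration
    return w_entropy * anomaly + w_cpu * cpu_load + w_mem * mem_load
```

A routing decision could then be, for example, to divert or rate-limit traffic from a node whose risk exceeds a chosen threshold.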


Zabasta, A.; Casaliccio, E.; Kunicina, N.; Ribickis, L., "A Numerical Model for Evaluation Power Outages Impact on Water Infrastructure Services Sustainability," Power Electronics and Applications (EPE'14-ECCE Europe), 2014 16th European Conference on, pp. 1-10, 26-28 Aug. 2014. doi:10.1109/EPE.2014.6910703
Abstract: The security, stability, and reliability of critical infrastructures (CIs) (electricity, heat, water, and information and communication technology networks) are closely related to the phenomenon of their interaction. Due to the increasing amount of data transferred, dependence on telecommunications and internet services grows, and data integrity and security are becoming very important aspects for utility service providers and energy suppliers. In such circumstances, there is an increasing need for methods and tools that enable infrastructure managers to evaluate and predict their critical infrastructure operations when failures, emergencies, or service degradation occur in other related infrastructures. Using a simulation model, we experimentally test a method that explores how the average downtime of water supply network nodes depends on battery lifetime and battery replacement time cross-correlations, within the parameter set, when outages arise in the power infrastructure, also taking into account the impact of telecommunication nodes. The model studies the real case of the Latvian city of Ventspils. The proposed approach to the analysis of critical infrastructure interdependencies will be useful for the practical adoption of methods, models, and metrics by CI operators and stakeholders.
Keywords: critical infrastructures; polynomial approximation; power system reliability; power system security; power system stability; water supply; CI operators; average down time dependence; battery life time; battery replacement time cross-correlations; critical infrastructure operations; critical infrastructure security; critical infrastructures interdependencies; data integrity; data security; energy suppliers; infrastructure managers; interaction phenomenon; internet services; power infrastructure outages; stakeholders; telecommunication nodes; utility services providers; water supply network nodes; Analytical models; Batteries; Mathematical model; Measurement; Power supplies; Telecommunications; Unified modeling language; Estimation technique; Fault tolerance; Modelling; Simulation (ID#: 15-6076)
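The downtime dependence described in the abstract can be sketched with a small Monte Carlo model. All rates, durations, and parameter names below are invented for illustration and are not taken from the paper:

```python
import random

def average_downtime(battery_life_h, replace_time_h,
                     outage_rate_per_h=0.001, mean_outage_h=8.0,
                     horizon_h=8760, runs=2000, seed=42):
    """Monte Carlo estimate of a water node's average downtime (hours/year).

    The node stays up during a power outage while its backup battery
    lasts; once the battery is exhausted, it is down until either power
    returns or the battery is replaced, whichever comes first.
    """
    rng = random.Random(seed)
    total_down = 0.0
    for _ in range(runs):
        t = 0.0
        down = 0.0
        while t < horizon_h:
            # Time to next outage (Poisson arrivals) and its duration.
            t += rng.expovariate(outage_rate_per_h)
            if t >= horizon_h:
                break
            outage = rng.expovariate(1.0 / mean_outage_h)
            # The node is down only for the part of the outage that
            # exceeds battery life and precedes battery replacement.
            uncovered = max(0.0, outage - battery_life_h)
            down += min(uncovered, replace_time_h)
            t += outage
        total_down += down
    return total_down / runs
```

Sweeping `battery_life_h` and `replace_time_h` over a grid would reproduce the kind of cross-correlation study the abstract describes.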


Shittu, R.; Healing, A.; Ghanea-Hercock, R.; Bloomfield, R.; Muttukrishnan, R., "Outmet: A New Metric for Prioritising Intrusion Alerts Using Correlation and Outlier Analysis," Local Computer Networks (LCN), 2014 IEEE 39th Conference on, vol., no., pp. 322, 330, 8-11 Sept. 2014. doi:10.1109/LCN.2014.6925787
Abstract: In a medium-sized network, an Intrusion Detection System (IDS) can produce thousands of alerts a day, many of which may be false positives. Identifying which of these triggered intrusion alerts to prioritise is highly challenging. Alert correlation and prioritisation are both viable analytical methods commonly used to understand and prioritise alerts. However, to the authors' knowledge, very few dynamic prioritisation metrics exist. This paper proposes a new prioritisation metric, OutMet, based on measuring the degree to which an alert belongs to anomalous behaviour. OutMet combines alert correlation with prioritisation analysis. We illustrate the effectiveness of OutMet by testing its ability to prioritise alerts generated from a 2012 red-team cyber-range experiment carried out as part of the BT Saturn programme. In one of the scenarios, OutMet reduced false positives by 99.3%.
Keywords: computer network security; correlation methods; graph theory; BT Saturn programme; IDS; OutMet; alert correlation and prioritisation analysis; correlation analysis; dynamic prioritisation metrics; intrusion alerts; intrusion detection system; medium sized network; outlier analysis; red-team cyber-range experiment; Cities and towns; Complexity theory; Context; Correlation; Educational institutions; IP networks; Measurement; Alert Correlation; Attack Scenario; Graph Mining; IDS Logs; Intrusion Alert Analysis; Intrusion Detection; Pattern Detection (ID#: 15-6077)
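The paper's actual OutMet formula is not given in the abstract; the following is a generic sketch of outlier-based alert prioritisation in the same spirit, scoring each alert by its mean absolute z-score across correlation features. The feature names and values are invented:

```python
from statistics import mean, pstdev

def outlier_scores(alert_features):
    """Score each alert by how far it lies from the bulk of the alerts.

    alert_features: list of equal-length numeric feature vectors per
    alert (e.g. alerts in its correlation cluster, distinct source IPs,
    time spread).  Returns one score per alert: the mean absolute
    z-score across features, so larger means more anomalous.
    """
    dims = list(zip(*alert_features))
    mus = [mean(d) for d in dims]
    sds = [pstdev(d) or 1.0 for d in dims]  # avoid division by zero
    scores = []
    for vec in alert_features:
        z = [abs(x - mu) / sd for x, mu, sd in zip(vec, mus, sds)]
        scores.append(sum(z) / len(z))
    return scores

# Three ordinary-looking alerts and one clear outlier (invented data).
alerts = [[3, 1, 0.2], [4, 1, 0.3], [3, 2, 0.25], [40, 18, 9.0]]
scores = outlier_scores(alerts)
ranked = sorted(range(len(alerts)), key=lambda i: -scores[i])
```

Alerts at the top of `ranked` would be presented to the analyst first.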


Cain, A.A.; Schuster, D., "Measurement of Situation Awareness Among Diverse Agents in Cyber Security," Cognitive Methods in Situation Awareness and Decision Support (CogSIMA), 2014 IEEE International Inter-Disciplinary Conference on, vol., no., pp. 124, 129, 3-6 March 2014. doi:10.1109/CogSIMA.2014.6816551
Abstract: Development of innovative algorithms, metrics, visualizations, and other forms of automation is needed to enable network analysts to build situation awareness (SA) from large amounts of dynamic, distributed, and interacting data in cyber security. Several models of cyber SA can be classified as taking an individual or a distributed approach to modeling SA within a computer network. While these models suggest ways to integrate the SA contributed by multiple actors, implementing more advanced data center automation will require consideration of the differences and similarities between human teaming and human-automation interaction. The purpose of this paper is to offer guidance for quantifying the shared cognition of diverse agents in cyber security. The recommendations presented can inform the development of automated aids to SA as well as illustrate paths for future empirical research.
Keywords: cognition; security of data; SA; cyber security; data center automation; diverse agents; shared cognition; situation awareness measurement; Automation; Autonomous agents; Cognition; Computer security; Data models; Sociotechnical systems; Situation awareness; cognition; cyber security; information security; teamwork (ID#: 15-6078)


Sirsikar, S.; Salunkhe, J., "Analysis of Data Hiding Using Digital Image Signal Processing," Electronic Systems, Signal Processing and Computing Technologies (ICESC), 2014 International Conference on, vol., no., pp. 134, 139, 9-11 Jan. 2014. doi:10.1109/ICESC.2014.28
Abstract: Data hiding embeds data into digital media for the purpose of security. A digital image is one of the best media in which to store data: it provides large capacity for hiding secret information and yields a stego-image imperceptible to human vision. This paper describes a novel steganographic approach based on a data hiding method, pixel-value differencing (PVD), which provides both high embedding capacity and outstanding imperceptibility for the stego-image. Different image processing techniques related to pixel-value differencing are described, and PVD-based techniques are combined to produce a modified data hiding method. Hamming coding is an error-correcting method useful for hiding information, since lost bits can be detected and corrected. OPAP is used to minimise the embedding error, improving the quality of the stego-image without disturbing the secret data. The ZigZag method enhances the security and quality of the image. In the modified method, the Hamming, OPAP, and ZigZag methods are combined; in the adaptive method, the image is divided into blocks before the data are hidden. The objective of the proposed work is to increase both stego-image quality and secret data capacity. Result analysis is compared for BMP images only, with calculation of the evaluation metrics MSE, PSNR, and SSIM.
Keywords: image processing; steganography; BMP images; MSE; OPAP; PSNR; SSIM; ZigZag method; data hiding analysis; data hiding method; data hiding process; digital image signal processing; digital media; embedding capacity; error correcting method; human vision; pixel value differencing; pixel-value differencing; secret information hiding; security; steganographic approach; stego-image imperceptible; stego-image quality; Color; Cryptography; Digital images; Image quality; Measurement; PSNR; Data Hiding; Digital image; Pixel Value Differencing; Steganography (ID#: 15-6079)
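As a sketch of the pixel-value differencing idea the abstract builds on, the following implements a simplified embedding step in the style of the classic Wu-Tzeng PVD scheme. Boundary fall-off checks are omitted for brevity, so this is an illustration rather than a complete implementation:

```python
# Range table in the style of the classic Wu-Tzeng PVD scheme: each
# band [lower, upper] of pixel differences hides log2(width) bits.
RANGES = [(0, 7), (8, 15), (16, 31), (32, 63), (64, 127), (128, 255)]

def embed_pair(p1, p2, bits):
    """Embed a bit string into one pixel pair.

    Returns the adjusted pixel pair and the number of bits consumed.
    """
    d = abs(p2 - p1)
    lo, hi = next((l, h) for l, h in RANGES if l <= d <= h)
    t = (hi - lo + 1).bit_length() - 1       # bits this pair can hold
    chunk = bits[:t].ljust(t, "0")           # pad a short tail with zeros
    new_d = lo + int(chunk, 2)               # encode the bits as a new difference
    m = new_d - d
    # Spread the difference change over both pixels to limit distortion.
    if p1 >= p2:
        q1, q2 = p1 + (m + 1) // 2, p2 - m // 2
    else:
        q1, q2 = p1 - m // 2, p2 + (m + 1) // 2
    return (q1, q2), t
```

Extraction inverts the mapping: look up the band of the new difference and read off `new_d - lo` as `t` bits.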


Younis, A.A.; Malaiya, Y.K., "Using Software Structure to Predict Vulnerability Exploitation Potential," Software Security and Reliability-Companion (SERE-C), 2014 IEEE Eighth International Conference on, vol., no., pp. 13, 18, June 30 2014-July 2 2014. doi:10.1109/SERE-C.2014.17
Abstract: Most attacks on computer systems are due to the presence of vulnerabilities in software. Recent trends show that the number of newly discovered vulnerabilities continues to be significant, and studies have shown that the time gap between the public disclosure of a vulnerability and the release of an automated exploit is getting smaller. Assessing vulnerability exploitability risk is therefore critical, as it helps decision-makers prioritize among vulnerabilities, allocate resources, and choose between alternatives. Several methods have recently been proposed in the literature to deal with this challenge; however, these methods are either subjective, require human involvement in assessing exploitability, or do not scale. In this research, our aim is first to identify the vulnerability exploitation risk problem. We then introduce a novel vulnerability exploitability metric based on software structure properties, viz.: attack entry points, vulnerability location, presence of dangerous system calls, and reachability. Based on our preliminary results, reachability and the presence of dangerous system calls appear to be good indicators of exploitability. Next, we propose using the suggested metrics as features to construct a model, using machine learning techniques, for automatically predicting the risk of vulnerability exploitation. To build the vulnerability exploitation model, we propose using Support Vector Machines (SVMs). Once the predictor is built, given an unseen vulnerable function and its exploitability features, the model can predict whether the function is exploitable.
Keywords: decision making; learning (artificial intelligence); reachability analysis; software metrics; support vector machines; SVM; attack entry points; computer systems; decision makers; machine learning; reachability; software structure; support vector machines; vulnerabilities exploitability risk; vulnerability exploitability metric; vulnerability exploitation model; vulnerability exploitation potential; vulnerability exploitation risk problem; vulnerability location; vulnerability public disclosure; Feature extraction; Predictive models; Security; Software; Software measurement; Support vector machines; Attack Surface; Machine Learning; Measurement; Risk Assessment; Software Security Metrics; Software Vulnerability (ID#: 15-6080)
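The proposed SVM predictor could be prototyped along these lines with scikit-learn (assumed available). The feature names, training vectors, and labels below are invented placeholders, not the paper's dataset:

```python
from sklearn.svm import SVC

# Illustrative features per vulnerable function (all names assumed):
# [is_entry_point, reachable_from_entry, n_dangerous_syscalls, call_depth]
X = [
    [1, 1, 3, 0],   # labeled exploitable
    [0, 1, 2, 1],   # labeled exploitable
    [0, 1, 1, 2],   # labeled exploitable
    [0, 0, 0, 5],   # labeled not exploitable
    [0, 0, 1, 6],   # labeled not exploitable
    [0, 0, 0, 4],   # labeled not exploitable
]
y = [1, 1, 1, 0, 0, 0]

# RBF-kernel SVM, as the abstract proposes using SVMs for prediction.
clf = SVC(kernel="rbf", gamma="scale").fit(X, y)

# Score unseen functions: one reachable with dangerous calls (the
# strongest indicators per the abstract), one neither.
risky = clf.predict([[0, 1, 2, 1]])[0]
safe = clf.predict([[0, 0, 0, 6]])[0]
```

In practice the features would be extracted from the program's call graph and system-call usage, and the labels from exploit databases.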


Younis, A.A.; Malaiya, Y.K.; Ray, I., "Using Attack Surface Entry Points and Reachability Analysis to Assess the Risk of Software Vulnerability Exploitability," High-Assurance Systems Engineering (HASE), 2014 IEEE 15th International Symposium on, vol., no., pp.1, 8, 9-11 Jan. 2014. doi:10.1109/HASE.2014.10
Abstract: An unpatched vulnerability can lead to security breaches. When a new vulnerability is discovered, it needs to be assessed so that it can be prioritized. A major challenge in software security is the assessment of the potential risk due to vulnerability exploitability. CVSS metrics have become a de facto standard commonly used to assess the severity of a vulnerability. The CVSS Base Score measures severity based on exploitability and impact measures. CVSS exploitability is measured with three metrics: Access Vector, Authentication, and Access Complexity. However, the CVSS exploitability measures assign subjective numbers based on the views of experts, and two of its factors, Access Vector and Authentication, are the same for almost all vulnerabilities. CVSS does not specify how the third factor, Access Complexity, is measured, so it is unclear whether it considers software properties at all. In this paper, we propose an approach that assesses the risk of vulnerability exploitability based on two software properties: attack surface entry points and reachability analysis. A vulnerability is reachable if it is located in one of the entry points or in a function that is called, directly or indirectly, by the entry points. The likelihood of an entry point being used in an attack can be assessed using the damage potential-effort ratio in the attack surface metric and the presence of system calls deemed dangerous. To illustrate the proposed method, five reported vulnerabilities of Apache HTTP server 1.3.0 have been examined at the source code level. The results show that the proposed approach, which uses more detailed information, can yield a risk assessment that differs from the CVSS Base Score.
Keywords: reachability analysis; risk management; security of data; software metrics; Apache HTTP server 1.3.0; CVSS base score; CVSS exploitability; CVSS metrics; access complexity; access vector; attack surface entry point; attack surface metric; authentication; damage potential-effort ratio; reachability analysis; risk assessment; security breach; severity measurement; software security; software vulnerability exploitability; Authentication; Complexity theory; Measurement; Servers; Software; Vectors; Attack Surface; CVSS Metrics; Measurement; Risk assessment; Software Security Metrics; Software Vulnerability (ID#: 15-6081)
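The reachability criterion defined in the abstract (a vulnerability is reachable if its function is an entry point or is called, directly or indirectly, from one) can be sketched as a breadth-first search over a call graph. The graph, entry points, and function names below are hypothetical:

```python
from collections import deque

# Hypothetical call graph: function -> functions it calls.
CALLS = {
    "main": ["parse_request", "log"],
    "parse_request": ["decode_uri", "strcpy_wrapper"],
    "decode_uri": [],
    "strcpy_wrapper": [],        # wraps a dangerous system call
    "log": [],
    "cleanup": ["unlink_tmp"],   # never called from an entry point
    "unlink_tmp": [],            # also dangerous, but unreachable
}
ENTRY_POINTS = ["main"]
DANGEROUS = {"strcpy_wrapper", "unlink_tmp"}

def reachable(graph, entries):
    """BFS over the call graph from the attack-surface entry points."""
    seen, queue = set(entries), deque(entries)
    while queue:
        f = queue.popleft()
        for g in graph.get(f, []):
            if g not in seen:
                seen.add(g)
                queue.append(g)
    return seen

def exploitable_candidates(graph, entries, dangerous):
    """Flag functions that are both reachable from an entry point and
    use a call deemed dangerous, per the abstract's two properties."""
    return sorted(reachable(graph, entries) & dangerous)
```

Here `strcpy_wrapper` is flagged, while `unlink_tmp` is not, because no entry point reaches it.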


Akbar, M.; Sukmana, H.T.; Khairani, D., "Models and Software Measurement Using Goal/Question/Metric Method and CMS Matrix Parameter (Case Study Discussion Forum)," Cyber and IT Service Management (CITSM), 2014 International Conference on, vol., no., pp. 34, 38, 3-6 Nov. 2014. doi:10.1109/CITSM.2014.7042171
Abstract: CMSs are used extensively by communities as tools for building websites, and many CMSs are now available as options, especially bulletin-board CMSs. The number of options is an obstacle for someone trying to choose a CMS that suits their needs. Because of the lack of research comparing bulletin-board CMSs, this study compares them in search of the best one. It uses metrics for modelling and software measurement to identify the characteristics of existing bulletin-board CMSs, with Goal/Question/Metric (GQM) as the modelling method and the CMS Matrix as the source of parameters. The bulletin boards chosen for this study are phpBB, MyBB, and SMF. The results indicate that the SMF bulletin board has the best score compared to the MyBB and phpBB bulletin boards.
Keywords: content management; software development management; software metrics; CMS bulletin board; CMS matrix parameter; GQM; MyBB; PhpBB; SMF; Website; goal-question-metric method; software measurement; Browsers; Databases; Operating systems; Security; Software measurement; CMS; Software Measurement (ID#: 15-6082)
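GQM-style scoring of the three bulletin boards can be sketched as a weighted sum of per-metric scores. The goals, weights, and scores below are invented for illustration; they are not the paper's actual data beyond SMF ranking first:

```python
# Hypothetical GQM breakdown: each goal is answered by questions whose
# metrics are scored 0-10 from CMS Matrix data (all values invented).
WEIGHTS = {"security": 0.40, "features": 0.35, "ease_of_use": 0.25}
SCORES = {
    "SMF":   {"security": 8, "features": 7, "ease_of_use": 8},
    "MyBB":  {"security": 7, "features": 7, "ease_of_use": 7},
    "phpBB": {"security": 7, "features": 8, "ease_of_use": 6},
}

def weighted_score(scores, weights):
    """Combine per-metric scores into one figure per CMS."""
    return sum(scores[k] * w for k, w in weights.items())

ranking = sorted(SCORES, key=lambda c: weighted_score(SCORES[c], WEIGHTS),
                 reverse=True)
```

With these invented weights and scores, SMF comes out on top, matching the study's conclusion.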


Articles listed on these pages have been found on publicly available internet pages and are cited with links to those pages. Some of the information included herein has been reprinted with permission from the authors or data repositories. Direct any requests via Email to for removal of the links or modifications to specific citations. Please include the ID# of the specific citation in your correspondence.