Measurement and Metrics: Testing, 2014

SoS Newsletter- Advanced Book Block




Measurement and metrics are hard problems in the Science of Security. The research cited here looks at methods and techniques of testing valid measurement. This work was presented in 2014.

Awad, F.; Taqieddin, E.; Mowafi, M.; Banimelhem, O.; AbuQdais, A., "A Simulation Testbed to Jointly Exploit Multiple Image Compression Techniques for Wireless Multimedia Sensor Networks," Wireless Communications Systems (ISWCS), 2014 11th International Symposium on, vol., no., pp. 905, 911, 26-29 Aug. 2014. doi:10.1109/ISWCS.2014.6933482
Abstract: As the demand for large-scale wireless multimedia sensor networks increases, so does the need for well-designed protocols that optimize the utilization of available network resources. This requires experimental testing for realistic performance evaluation and design tuning. However, experimental testing of large-scale wireless networks using hardware testbeds is usually very hard to perform due to the need for collecting and monitoring the performance metrics data for multiple sensor nodes all at the same time, especially the nodes' energy consumption data. On the other hand, pure simulation testing may not accurately replicate real-life scenarios, especially those parameters that are related to wireless signal behavior in special environments. Therefore, this work attempts to close the gap between experimental and simulation testing. This paper presents a scalable simulation testbed that attempts to mimic our previously designed small-scale hardware testbed for wireless multimedia sensor networks by tuning the simulation parameters to match the real-life measurements obtained via experimental testing. The proposed simulation testbed embeds the JPEG and JPEG2000 image compression algorithms and potentially allows for network-controlled image compression and transmission decisions. The simulation results show a very close match to the small-scale experimental testing as well as to the hypothetical large-scale extensions based on the experimental results.
Keywords: data compression; energy consumption; image coding; multimedia communication; protocols; wireless sensor networks; JPEG; JPEG2000 image compression algorithms; hardware testbeds; hypothetical large-scale extensions; jointly exploit multiple image compression techniques; large-scale wireless multimedia sensor networks; multiple sensor nodes; network resources; network-controlled image compression; node energy consumption data; performance metrics data collection; performance metrics data monitoring; scalable simulation testbed; small-scale hardware testbed; transmission decisions; well-designed protocols; wireless signal behavior; Energy consumption; Hardware; Image coding; Multimedia communication; Routing; Transform coding; Wireless sensor networks; Imote2; JPEG; JPEG2000; Simulation;Testbed; WMSN (ID#: 15-6045)


Kowtko, M.A., "Biometric Authentication for Older Adults," Systems, Applications and Technology Conference (LISAT), 2014 IEEE Long Island, vol., no., pp. 1, 6, 2-2 May 2014. doi:10.1109/LISAT.2014.6845213
Abstract: In recent times, cyber-attacks and cyber warfare have threatened network infrastructures across the globe. The world has reacted by increasing security measures through the use of stronger passwords, strict access control lists, and new authentication means; however, while these measures are designed to improve security and Information Assurance (IA), they may create accessibility challenges for older adults and people with disabilities. Studies have shown that the memory performance of older adults declines with age. Therefore, it becomes increasingly difficult for older adults to remember random strings of characters or passwords of 12 or more characters. How are older adults challenged by security measures (passwords, CAPTCHA, etc.) and how does this affect their ability to engage in online activities or with mobile platforms? While username/password authentication, CAPTCHA, and security questions do provide adequate protection, they are still vulnerable to cyber-attacks. Passwords can be compromised by brute force, dictionary, and social engineering style attacks. CAPTCHA, a type of challenge-response test, was developed to ensure that user inputs were not manipulated by machine-based attacks. Unfortunately, CAPTCHA is now being defeated by new vulnerabilities and exploits. Insecure implementations through code or server interaction have circumvented CAPTCHA. New viruses and malware now utilize character recognition as a means to circumvent CAPTCHA [1]. Security questions, another challenge-response test that attempts to authenticate users, can also be compromised through social engineering attacks and spyware. Since these common security measures are increasingly being compromised, many security professionals are turning towards biometric authentication. Biometric authentication is any form of human biological measurement or metric that can be used to identify and authenticate an authorized user of a secure system. Biometric authentication can include fingerprint, voice, iris, facial, keystroke, and hand geometry [2]. Biometric authentication is also less affected by traditional cyber-attacks. However, is biometric authentication completely secure? This research will examine the security challenges and attacks that may put biometric authentication at risk. Recently, medical professionals in the TeleHealth industry have begun to investigate the effectiveness of biometrics. In the United States alone, the population of older adults has increased significantly, with nearly 10,000 adults per day reaching the age of 65 and older [3]. Although people are living longer, that does not mean that they are living healthier. Studies have shown the U.S. healthcare system is being inundated by older adults. As security within the healthcare industry increases, many believe that biometric authentication is the answer. However, there are potential problems, especially in the older adult population. The largest problem is authentication of older adults with medical complications. Cataracts, stroke, congestive heart failure, hard veins, and other ailments may challenge biometric authentication. Since biometrics often rely on metrics and measurements between biological features, any one of these conditions, among others, could potentially affect the verification of users. This research will analyze older adults and the impact of biometric authentication on the verification process.
Keywords: authorisation; biometrics (access control); invasive software; medical administrative data processing; mobile computing; CAPTCHA; Cataracts; IA; TeleHealth industry; US healthcare system; access control lists; authentication means; biometric authentication; challenge-response test; congestive heart failure; cyber warfare; cyber-attacks; dictionary; hard veins; healthcare industry; information assurance; machine-based attacks; medical professionals; mobile platforms; network infrastructures; older adults; online activities; security measures; security professionals; social engineering style attacks; spyware; stroke; username-password authentication; Authentication; Barium; CAPTCHAs; Computers; Heart; Iris recognition; Biometric Authentication; CAPTCHA; Cyber-attacks; Information Security; Older Adults; Telehealth (ID#: 15-6046)


Axelrod, C.W., "Reducing Software Assurance Risks for Security-Critical and Safety-Critical Systems," Systems, Applications and Technology Conference (LISAT), 2014 IEEE Long Island, vol., no., pp. 1, 6, 2-2 May 2014. doi:10.1109/LISAT.2014.6845212
Abstract: According to the Office of the Assistant Secretary of Defense for Research and Engineering (ASD(R&E)), the US Department of Defense (DoD) recognizes that there is a “persistent lack of a consistent approach ... for the certification of software assurance tools, testing and methodologies” [1]. As a result, the ASD(R&E) is seeking “to address vulnerabilities and weaknesses to cyber threats of the software that operates ... routine applications and critical kinetic systems ...” The mitigation of these risks has been recognized as a significant issue to be addressed in both the public and private sectors. In this paper we examine deficiencies in various software-assurance approaches and suggest ways in which they can be improved. We take a broad look at current approaches, identify their inherent weaknesses and propose approaches that serve to reduce risks. Some technical, economic and governance issues are: (1) Development of software-assurance technical standards (2) Management of software-assurance standards (3) Evaluation of tools, techniques, and metrics (4) Determination of update frequency for tools, techniques (5) Focus on most pressing threats to software systems (6) Suggestions as to risk-reducing research areas (7) Establishment of models of the economics of software-assurance solutions, and testing and certifying software. We show that, in order to improve current software assurance policy and practices, particularly with respect to security, there has to be a major overhaul in how software is developed, especially with respect to the requirements and testing phases of the SDLC (Software Development Lifecycle). We also suggest that the current preventative approaches are inadequate and that greater reliance should be placed upon avoidance and deterrence. 
We also recommend that those developing and operating security-critical and safety-critical systems exchange best-of-breed software assurance methods to prevent the vulnerability of components leading to compromise of entire systems of systems. The recent catastrophic loss of a Malaysia Airlines airplane is then presented as an example of possible compromises of the physical and logical security of on-board communications and management and control systems.
Keywords: program testing; safety-critical software; software development management; software metrics; ASD(R&E);Assistant Secretary of Defense for Research and Engineering; Malaysia Airlines airplane; SDLC;US Department of Defense; US DoD; component vulnerability prevention; control systems; critical kinetic systems; cyber threats; economic issues; governance issues; logical security; management systems; on-board communications; physical security; private sectors; public sectors; risk mitigation; safety-critical systems; security-critical systems; software assurance risk reduction; software assurance tool certification; software development; software development lifecycle; software methodologies; software metric evaluation; software requirements; software system threats; software technique evaluation; software testing; software tool evaluation; software-assurance standard management; software-assurance technical standard development; technical issues; update frequency determination; Measurement; Organizations; Security; Software systems; Standards; Testing; cyber threats; cyber-physical systems; governance; risk; safety-critical systems; security-critical systems; software assurance; technical standards; vulnerabilities; weaknesses (ID#: 15-6047)


Yihai Zhu; Jun Yan; Yufei Tang; Sun, Y.L.; Haibo He, "Coordinated Attacks Against Substations and Transmission Lines in Power Grids," Global Communications Conference (GLOBECOM), 2014 IEEE, vol., no., pp. 655, 661, 8-12 Dec. 2014. doi:10.1109/GLOCOM.2014.7036882
Abstract: Vulnerability analysis on the power grid has been widely conducted from the substation-only and transmission-line-only perspectives. In other words, it is assumed that attacks occur on substations or transmission lines separately. In this paper, we naturally extend the two existing perspectives and introduce the joint-substation-transmission-line perspective, in which attacks can occur concurrently on substations and transmission lines. Vulnerabilities here refer to multiple-component combinations that can yield large damage to the power grid; one such combination may consist of substations, transmission lines, or both. The new perspective is promising for discovering more power grid vulnerabilities. In particular, we conduct the vulnerability analysis on the IEEE 39 bus system. Compared with known substation-only/transmission-line-only vulnerabilities, joint-substation-transmission-line vulnerabilities account for the largest percentage. Among three-component vulnerabilities, for instance, joint-substation-transmission-line vulnerabilities account for 76.06%; substation-only and transmission-line-only vulnerabilities account for 10.96% and 12.98%, respectively. In addition, we adopt two existing metrics, degree and load, to study the joint-substation-transmission-line attack strategy. Generally speaking, the joint-substation-transmission-line attack strategy based on the load metric has better attack performance than comparison schemes.
Keywords: power grids; power transmission reliability; substations; IEEE 39 bus system; coordinated attacks; joint-substation-transmission-line perspective; joint-substation-transmission-line vulnerabilities; load metric; multiple-component combinations; power grid vulnerabilities; vulnerability analysis; Benchmark testing; Measurement; Power grids; Power system faults; Power system protection; Power transmission lines; Substations; Attack; Cascading failures; Power grid security; Vulnerability analysis (ID#: 15-6048)
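
As a rough illustration of the degree metric used above to rank attack targets, the sketch below computes node degrees on an invented six-node topology (not the IEEE 39-bus system from the paper) and orders candidate targets by degree:

```python
# Degree-based target ranking, a common first step in grid vulnerability
# studies. The topology below is a made-up example for illustration.
from collections import defaultdict

edges = [("A", "B"), ("A", "C"), ("B", "C"), ("C", "D"), ("D", "E"), ("E", "F")]

degree = defaultdict(int)
for u, v in edges:
    degree[u] += 1
    degree[v] += 1

# Rank components by degree: higher-degree nodes are attacked first.
# Ties keep insertion order because Python's sort is stable.
targets = sorted(degree, key=degree.get, reverse=True)
print(targets[:3])  # ['C', 'A', 'B'] -- C has the most connections
```

The load metric would replace this ranking with a flow- or betweenness-based score; the paper reports that the load-based joint attack strategy performs better.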


Duncan, I.; De Muijnck-Hughes, J., "Security Pattern Evaluation," Service Oriented System Engineering (SOSE), 2014 IEEE 8th International Symposium on, vol., no., pp. 428, 429, 7-11 April 2014. doi:10.1109/SOSE.2014.61
Abstract: Current Security Pattern evaluation techniques are shown to be incomplete with respect to quantitative measurement and comparison. A proposal for a dynamic testbed system is presented as a potential mechanism for evaluating patterns within a constrained environment.
Keywords: pattern classification; security of data; dynamic testbed system; security pattern evaluation; Complexity theory; Educational institutions; Measurement; Security; Software; Software reliability; Testing; evaluation; metrics; security patterns; testing (ID#: 15-6049)


Sanchez, A.B.; Segura, S.; Ruiz-Cortes, A., "A Comparison of Test Case Prioritization Criteria for Software Product Lines," Software Testing, Verification and Validation (ICST), 2014 IEEE Seventh International Conference on, vol., no., pp. 41, 50, March 31 2014 - April 4 2014. doi:10.1109/ICST.2014.15
Abstract: Software Product Line (SPL) testing is challenging due to the potentially huge number of derivable products. To alleviate this problem, numerous contributions have been proposed to reduce the number of products to be tested while still achieving good coverage. However, not much attention has been paid to the order in which the products are tested. Test case prioritization techniques reorder test cases to meet a certain performance goal. For instance, testers may wish to order their test cases so as to detect faults as soon as possible, which would translate into faster feedback and earlier fault correction. In this paper, we explore the applicability of test case prioritization techniques to SPL testing. We propose five different prioritization criteria based on common metrics of feature models and compare their effectiveness in increasing the rate of early fault detection, i.e., a measure of how quickly faults are detected. The results show that different orderings of the same SPL suite may lead to significant differences in the rate of early fault detection. They also show that our approach may contribute to accelerating the detection of faults in SPL test suites based on combinatorial testing.
Keywords: fault diagnosis; program testing; SPL test suites; SPL testing; combinatorial testing; fault detection; software product line testing; test case prioritization criteria comparison; test case prioritization techniques; Analytical models; Complexity theory; Fault detection; Feature extraction; Measurement; Security; Testing; Software product lines; automated analysis; feature models. (ID#: 15-6050)
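
One common prioritization criterion of this kind is greedy maximization of incremental feature coverage. The sketch below is an illustrative stand-in, not necessarily one of the paper's five criteria, with invented product and feature names:

```python
# Greedy test-case prioritization: order products so each next product
# covers the most not-yet-tested features. Ties break by insertion order.
products = {
    "P1": {"base", "gui"},
    "P2": {"base", "net", "crypto"},
    "P3": {"base", "gui", "net"},
}

def prioritize(products):
    remaining = dict(products)
    covered, order = set(), []
    while remaining:
        # Pick the product adding the most uncovered features.
        best = max(remaining, key=lambda p: len(remaining[p] - covered))
        order.append(best)
        covered |= remaining.pop(best)
    return order

print(prioritize(products))  # ['P2', 'P1', 'P3']
```

Running the suite in this order front-loads feature diversity, which is one plausible proxy for increasing the rate of early fault detection.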


Zabasta, A.; Casaliccio, E.; Kunicina, N.; Ribickis, L., "A Numerical Model for Evaluation Power Outages Impact on Water Infrastructure Services Sustainability," Power Electronics and Applications (EPE'14-ECCE Europe), 2014 16th European Conference on, vol., no., pp. 1, 10, 26-28 Aug. 2014. doi:10.1109/EPE.2014.6910703
Abstract: The security, stability, and reliability of critical infrastructures (CI) (electricity, heat, water, and information and communication technology networks) are closely related to interaction phenomena. As the amount of data transferred grows, dependence on telecommunications and internet services increases, and data integrity and security become very important concerns for utility service providers and energy suppliers. In such circumstances, there is a growing need for methods and tools that enable infrastructure managers to evaluate and predict their critical infrastructure operations when failures, emergencies, or service degradation occur in other related infrastructures. Using a simulation model, the authors experimentally test a method for exploring how the average downtime of water supply network nodes depends on battery lifetime and battery replacement time cross-correlations, within a given parameter set, when outages arise in the power infrastructure, also taking into account the impact of telecommunication nodes. The model studies the real case of the Latvian city of Ventspils. The proposed approach to the analysis of critical infrastructure interdependencies will be useful for practical adoption of methods, models, and metrics by CI operators and stakeholders.
Keywords: critical infrastructures; polynomial approximation; power system reliability; power system security; power system stability; water supply; CI operators; average down time dependence; battery life time; battery replacement time cross-correlations; critical infrastructure operations; critical infrastructure security; critical infrastructures interdependencies; data integrity; data security; energy suppliers; infrastructure managers; interaction phenomenon; internet services; power infrastructure outages; stakeholders; telecommunication nodes; utility services providers; water supply network nodes; Analytical models; Batteries; Mathematical model; Measurement; Power supplies; Telecommunications; Unified modeling language; Estimation technique; Fault tolerance; Modelling; Simulation (ID#: 15-6051)
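
The downtime dependency the model explores can be caricatured in a few lines: a battery-backed node survives an outage until the battery drains and fails only for the portion of the outage that exceeds battery life, capped by the replacement delay. All durations below are invented:

```python
# Toy version of the battery-life / replacement-time dependency: downtime
# per outage is the drained interval, cut short by battery replacement.
def node_downtime(outage_hours, battery_life, replacement_time):
    """Downtime for one outage: zero while the battery holds, then down
    until power returns or the battery is replaced, whichever is first."""
    if outage_hours <= battery_life:
        return 0.0
    return min(outage_hours - battery_life, replacement_time)

outages = [2.0, 6.0, 12.0]  # hypothetical outage durations in hours
avg = sum(node_downtime(o, battery_life=4.0, replacement_time=5.0)
          for o in outages) / len(outages)
print(round(avg, 3))  # 2.333 hours average downtime
```

Sweeping `battery_life` and `replacement_time` over a grid would reproduce, in miniature, the kind of cross-correlation study the abstract describes.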


Hemanidhi, A.; Chimmanee, S.; Sanguansat, P., "Network Risk Evaluation from Security Metric of Vulnerability Detection Tools," TENCON 2014 - 2014 IEEE Region 10 Conference, vol., no., pp. 1, 6, 22-25 Oct. 2014. doi:10.1109/TENCON.2014.7022358
Abstract: Network security is always a major concern in any organization. To ensure that the organization's network is well protected from attackers, vulnerability assessment and penetration testing are implemented regularly. However, auditing and analyzing these testing results is a highly time-consuming procedure that depends on the administrator's expertise. Thus, security professionals prefer proactive, automatic vulnerability detection tools to identify vulnerabilities before they are exploited by an adversary. Although these vulnerability detection tools are very useful for security professionals, making audit and analysis much faster and more accurate, they have some important weaknesses as well. They only identify surface vulnerabilities and are unable to address the overall risk level of the scanned network. Also, they often use different standards for network risk level classification, habitually tied to particular organizations or vendors. Thus, these vulnerability detection tools are likely, more or less, to produce biased risk evaluations. This article presents a generic idea of a "Network Risk Metric" as an unbiased risk evaluation drawing on several vulnerability detection tools. In this paper, NetClarity (hardware-based), Nessus (software-based), and Retina (software-based) are implemented on two networks from an IT department of the Royal Thai Army (RTA). The proposed metric is applied to evaluate overall network risk from these three vulnerability detection tools. The result is a more accurate risk evaluation for each network.
Keywords: business data processing; computer crime; computer network performance evaluation; computer network security; IT department; Nessus; NetClarity; RTA; Retina; Royal Thai Army; attackers; hardware-based; network risk evaluation; network risk level classification; network risk metric; network security; organization network; proactive-automatic vulnerability detection tools; security metric; security professionals; software-based; unbiased risk evaluation; vulnerabilities identification; vulnerability assessment; vulnerability penetration testing; Equations; Measurement; Retina; Security; Servers; Software; Standards organizations; Network Security; Risk Evaluation; Security Metrics; Vulnerability Detection (ID#: 15-6052)
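
The core idea of an unbiased combined metric can be sketched as score normalization before aggregation. The scoring scales and host scores below are invented placeholders, not the tools' actual output formats:

```python
# Different scanners score on different scales, so normalize each tool's
# score to [0, 1] before combining into one network risk value.
def normalize(score, scale_max):
    return score / scale_max

# Per-tool (score, scale_max) for the same host -- hypothetical values.
tool_scores = {
    "NetClarity": (7.0, 10.0),
    "Nessus": (3.0, 5.0),
    "Retina": (60.0, 100.0),
}

risk = sum(normalize(s, m) for s, m in tool_scores.values()) / len(tool_scores)
print(round(risk, 3))  # mean of 0.7, 0.6, 0.6
```

Averaging normalized scores is only the simplest aggregation; weighting tools by coverage or false-positive rate would be a natural refinement.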


Shittu, R.; Healing, A.; Ghanea-Hercock, R.; Bloomfield, R.; Muttukrishnan, R., "OutMet: A New Metric for Prioritising Intrusion Alerts Using Correlation and Outlier Analysis," Local Computer Networks (LCN), 2014 IEEE 39th Conference on, vol., no., pp. 322, 330, 8-11 Sept. 2014. doi:10.1109/LCN.2014.6925787
Abstract: In a medium-sized network, an Intrusion Detection System (IDS) can produce thousands of alerts a day, many of which may be false positives. Among the vast number of triggered intrusion alerts, identifying those to prioritise is highly challenging. Alert correlation and prioritisation are both viable analytical methods commonly used to understand and prioritise alerts. However, to the authors' knowledge, very few dynamic prioritisation metrics exist. In this paper, a new prioritisation metric, OutMet, based on measuring the degree to which an alert belongs to anomalous behaviour, is proposed. OutMet combines alert correlation and prioritisation analysis. We illustrate the effectiveness of OutMet by testing its ability to prioritise alerts generated from a 2012 red-team cyber-range experiment that was carried out as part of the BT Saturn programme. In one of the scenarios, OutMet significantly reduced the false positives by 99.3%.
Keywords: computer network security; correlation methods; graph theory; BT Saturn programme; IDS; OutMet; alert correlation and prioritisation analysis; correlation analysis; dynamic prioritisation metrics; intrusion alerts; intrusion detection system; medium sized network; outlier analysis; red-team cyber-range experiment; Cities and towns; Complexity theory; Context; Correlation; Educational institutions; IP networks; Measurement; Alert Correlation; Attack Scenario; Graph Mining; IDS Logs; Intrusion Alert Analysis; Intrusion Detection; Pattern Detection (ID#: 15-6053)
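
A drastically simplified stand-in for OutMet's outlier analysis is a z-score on a single per-cluster feature (the real metric combines correlation-graph structure); the alert data below is invented:

```python
# Score each correlated alert cluster by how far one of its features
# deviates from the rest, then prioritise the most anomalous cluster.
from statistics import mean, stdev

# e.g. number of distinct target hosts per correlated alert cluster
feature = [2, 3, 2, 4, 3, 2, 25]

mu, sigma = mean(feature), stdev(feature)
scores = [abs(x - mu) / sigma for x in feature]

# Rank clusters by deviation; the most outlying cluster comes first.
ranked = sorted(range(len(feature)), key=lambda i: scores[i], reverse=True)
print(ranked[0])  # 6 -- the 25-host cluster stands out
```

In this framing, false positives tend to look like the dense, mutually similar majority and sink to the bottom of the ranking, which is the intuition behind the reported false-positive reduction.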


Gaurav, C.; Chandramouleeswaran, D.; Khanam, R., "Progressive Testbed Application for Performance Analysis in Real Time Ad Hoc Networks Using SAP HANA," Advances in Computing and Communications (ICACC), 2014 Fourth International Conference on, vol., no., pp. 171, 174, 27-29 Aug. 2014. doi:10.1109/ICACC.2014.48
Abstract: This paper proposes and subsequently delineates the quantification of network security metrics in real time using a software-defined networking approach and a progressive testbed. This comprehensive testbed implements the computation of trust values, which lend sentient decision-making qualities to the participant nodes in a network and fortify it against threats such as blackhole and flooding attacks. The AODV and OLSR protocols were tested in real time under ideal and malicious environments using the testbed as the controlling point. With emphasis on reliability, interpretation of voluminous data, and immediate attack monitoring with negligible time lag, the paper concludes by justifying the use of SAP HANA and UI5 for the testbed.
Keywords: ad hoc networks; routing protocols; telecommunication security; AODV protocol; OLSR protocol; SAP HANA; network security metrics; progressive testbed; real time ad hoc networks; sentient decision making; software defined networking; trust values; Ad hoc networks; Equations; Mathematical model; Measurement; Protocols; Routing; Security; Ad-Hoc Network; HANA- High Performance Analytic Appliance; Performance Analysis; Security Metrics; Trust Model; UI5 SAP User Interface Technology (ID#: 15-6054)
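
Trust computation of the kind the testbed describes is often a weighted blend of history and fresh forwarding observations. The update rule, weights, and packet events below are assumptions for illustration, not the paper's actual model:

```python
# Exponentially weighted trust update: each node keeps a trust value per
# neighbour, decayed toward the latest forwarding observation.
def update_trust(old_trust, forwarded, alpha=0.8):
    observation = 1.0 if forwarded else 0.0
    return alpha * old_trust + (1 - alpha) * observation

trust = 0.5  # start neutral
for forwarded in [True, True, False, False, False]:  # blackhole drops packets
    trust = update_trust(trust, forwarded)
print(trust < 0.5)  # repeated drops erode trust below neutral
```

A node whose trust falls below a threshold would then be excluded from route selection, which is how such a scheme counters blackhole behaviour.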


Renchi Yan; Teng Xu; Potkonjak, M., "Semantic Attacks on Wireless Medical Devices," SENSORS, 2014 IEEE, vol., no., pp. 482, 485, 2-5 Nov. 2014. doi:10.1109/ICSENS.2014.6985040
Abstract: Security of medical embedded systems is of vital importance. Wireless medical devices used in wireless health applications employ a large number of sensors and are particularly susceptible to security attacks. They are often not physically secured and are usually used in hostile environments. We have developed a theoretical and statistical framework for creating semantic attacks, in which data is altered in such a way that the consequences include incorrect medical diagnosis and treatment. Our approach maps a semantic attack to an instance of an optimization problem in which medical damage is maximized under constraints on the probability of detection and root cause tracing. We use a popular medical shoe to demonstrate that the low energy and low cost of embedded medical devices increase the probability of successful attacks. We propose two types of semantic attacks, a pressure-based attack and a time-based attack, under two scenarios: a shoe with 99 pressure sensors and a shoe with 20 pressure sensors. We test the effects of the attacks and compare them. Our results indicate that it is surprisingly easy to attack several essential medical metrics and to alter the corresponding medical diagnosis.
Keywords: biomedical communication; data communication; intelligent sensors; optimisation; pressure sensors; security of data; wireless sensor networks; detection probability; low cost embedded medical devices; low energy embedded medical devices; medical embedded system security; medical shoe; optimization problem; pressure based attack; pressure sensors; root cause tracing; semantic attacks; sensor security attacks; time based attack; wireless health applications; wireless medical devices; Measurement; Medical diagnostic imaging; Medical services; Security; Semantics; Sensors; Wireless sensor networks (ID#: 15-6055)
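
The attack formulation maps to constrained optimization; the toy sketch below biases a derived metric (the mean pressure) while keeping each per-sample perturbation under a hypothetical detection threshold. Readings and threshold are invented:

```python
# Toy semantic attack: shift every pressure reading by the largest
# per-sample change assumed to stay below the detector's threshold,
# maximizing the bias in the derived metric (here, mean pressure).
def semantic_attack(readings, max_delta):
    return [r + max_delta for r in readings]

readings = [1.0, 1.2, 0.9, 1.1]        # hypothetical sensor values
attacked = semantic_attack(readings, max_delta=0.05)

bias = sum(attacked) / len(attacked) - sum(readings) / len(readings)
print(round(bias, 2))  # 0.05 shift in the mean, per-sample undetected
```

The paper's real optimization also accounts for root-cause tracing and time-based perturbations; this sketch shows only the pressure-based, detection-constrained flavor.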


Riecker, M.; Thies, D.; Hollick, M., "Measuring the Impact of Denial-of-Service Attacks on Wireless Sensor Networks," Local Computer Networks (LCN), 2014 IEEE 39th Conference on, vol., no., pp. 296, 304, 8-11 Sept. 2014. doi:10.1109/LCN.2014.6925784
Abstract: Wireless sensor networks (WSNs) are especially susceptible to denial-of-service attacks due to the resource-constrained nature of motes. We follow a systematic approach to analyze the impact of these attacks on network behavior; therefore, we first identify a large number of metrics that are easily obtained and calculated without incurring too much overhead. Next, we statistically test these metrics to assess whether they exhibit significantly different values under attack when compared to those of the baseline operation. The metrics look into different aspects of the motes and the network, for example, MCU and radio activities, network traffic statistics, and routing-related information. Then, to show the applicability of the metrics to different WSNs, we vary several parameters, such as traffic intensity and transmission power. We consider the most common topologies in wireless sensor networks, such as central data collection and meshed multi-hop networks, by using the collection tree and the mesh protocol. Finally, the metrics are grouped into different classes according to their capability of distinguishing attacks. In this work, we focus on jamming and blackhole attacks. Our experiments reveal that certain metrics are able to detect a jamming attack on all motes in the testbed, irrespective of the parameter combination, and at the highest significance value. To illustrate these facts, we use a standard testbed consisting of the widely-employed TelosB motes.
Keywords: jamming; telecommunication network routing; telecommunication network topology; telecommunication security; wireless sensor networks; TelosB motes; blackhole attack; central data collection; collection tree; denial-of-service attack; jamming attack; mesh protocol; meshed multihop network; network behavior; network topology; network traffic statistics; routing related information; wireless sensor networks; Computer crime; Jamming; Measurement; Protocols; Routing; Topology; Wireless sensor networks; Denial-of-Service; Measurements; Wireless Sensor Networks (ID#: 15-6056)
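
The statistical step described above can be sketched as a two-sample comparison of a metric's baseline and under-attack values. The Welch t-statistic and the sample data below are illustrative, not the paper's exact test:

```python
# Compare a per-interval metric under baseline vs. attack conditions
# with a Welch t-statistic (unequal variances, invented sample data).
from statistics import mean, variance
from math import sqrt

baseline = [101, 98, 100, 102, 99, 100]   # e.g. packets received / interval
under_attack = [40, 45, 38, 42, 41, 39]   # jamming collapses throughput

def welch_t(a, b):
    va, vb = variance(a) / len(a), variance(b) / len(b)
    return (mean(a) - mean(b)) / sqrt(va + vb)

t = welch_t(baseline, under_attack)
print(t > 4)  # far beyond any usual critical value: metric separates attack
```

A metric for which this statistic stays near zero across parameter combinations would land in a weaker distinguishing class in the paper's grouping.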


Kundi, M.; Chitchyan, R., "Position on Metrics for Security in Requirements Engineering," Requirements Engineering and Testing (RET), 2014 IEEE 1st International Workshop on, vol., no., pp. 29, 31, 26-26 Aug. 2014. doi:10.1109/RET.2014.6908676
Abstract: A number of well-established software quality metrics are in use in code testing. It is our position that, for many code-testing metrics for security, equivalent requirements-level metrics should be defined. Such requirements-level security metrics should be used to evaluate the quality of software security early on, in order to ensure that the resultant software system possesses the required security characteristics and quality.
Keywords: formal specification; program testing; security of data; software metrics; software quality; code-testing metrics; requirements engineering; requirements-level security metrics; software quality metrics; software security; Conferences; Security; Software measurement; Software systems; Testing (ID#: 15-6057)


Rostami, M.; Wendt, J.B.; Potkonjak, M.; Koushanfar, F., "Quo Vadis, PUF?: Trends and Challenges of Emerging Physical-Disorder Based Security," Design, Automation and Test in Europe Conference and Exhibition (DATE), 2014, vol., no., pp. 1, 6, 24-28 March 2014. doi:10.7873/DATE.2014.365
Abstract: The physical unclonable function (PUF) has emerged as a popular and widely studied security primitive based on the randomness of the underlying physical medium. To date, most of the research emphasis has been placed on finding new ways to measure randomness, on hardware realization and analysis of a few initially proposed structures, and on conventional secret-key based protocols. In this work, we present our subjective analysis of the emerging and future trends in this area that aim to change the scope, widen the application domain, and make a lasting impact. We emphasize the development of new PUF-based primitives and paradigms, robust protocols, public-key protocols, digital PUFs, new technologies, implementations, metrics and tests for evaluation/validation, as well as relevant attacks and countermeasures.
Keywords: cryptographic protocols; public key cryptography; PUF-based paradigms; PUF-based primitives; Quo Vadis; application domain; digital PUF; hardware realization; physical medium randomness measurement; physical unclonable function; physical-disorder-based security; public-key protocol; secret-key based protocols; security primitive; structure analysis; subjective analysis; Aging; Correlation; Hardware; NIST; Protocols; Public key (ID#: 15-6058)


Singh, P.; Shivani, S.; Agarwal, S., "A Chaotic Map Based DCT-SVD Watermarking Scheme for Rightful Ownership Verification," Engineering and Systems (SCES), 2014 Students Conference on, vol., no., pp. 1, 4, 28-30 May 2014. doi:10.1109/SCES.2014.6880048
Abstract: A chaotic map-based hybrid watermarking scheme incorporating the concepts of the Discrete Cosine Transform (DCT) and exploiting the stability of singular values is proposed here. Homogeneity analysis of the cover image has been performed to identify appropriate sites for embedding, and thereafter a reference image has been obtained from it. The singular values of the reference image have been modified to embed the secret information. The chaotic map-based scrambling enhances the security of the algorithm, as only the rightful owner possessing the secret key can retrieve the actual image. A comprehensive set of attacks has been applied and robustness tested with the Normalized Cross Correlation (NCC) and Peak Signal to Noise Ratio (PSNR) metric values. High values of these metrics signify the appropriateness of the proposed methodology.
Keywords: chaos; discrete cosine transforms; image retrieval; image watermarking; singular value decomposition; NCC; PSNR metric values; chaotic map based DCT-SVD hybrid watermarking scheme; chaotic map based scrambling; cover image; discrete cosine transform; homogeneity analysis; image retrieval; normalized cross correlation; peak signal to noise ratio metric values; reference image; rightful ownership verification; secret information; singular value decomposition; Discrete cosine transforms; Image coding; Measurement; PSNR; Robustness; Transform coding; Watermarking; Chaotic Map; Discrete cosine transformation (DCT); Homogeneity Analysis; Normalized Cross Correlation (NCC); Peak Signal to Noise Ratio (PSNR); Reference Image; Singular values  (ID#: 15-6059)
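
The chaotic scrambling component can be sketched with a logistic map: the secret key seeds the map, the resulting sequence induces a permutation, and only the key holder can invert it. Parameters below are illustrative; the paper's full scheme also operates in the DCT/SVD domain:

```python
# Key-dependent scrambling via a logistic map: sort indices by the
# chaotic sequence to obtain a permutation only the key holder can redo.
def logistic_permutation(n, key=0.7, r=3.99):
    x, seq = key, []
    for _ in range(n):
        x = r * x * (1 - x)      # logistic map iteration
        seq.append(x)
    return sorted(range(n), key=lambda i: seq[i])

def scramble(data, key=0.7):
    perm = logistic_permutation(len(data), key)
    return [data[i] for i in perm]

def unscramble(data, key=0.7):
    perm = logistic_permutation(len(data), key)
    out = [0] * len(data)
    for j, i in enumerate(perm):
        out[i] = data[j]
    return out

block = [10, 20, 30, 40, 50, 60]  # e.g. a row of watermark coefficients
assert unscramble(scramble(block)) == block  # round-trips with right key
```

Because the logistic map is sensitive to its initial condition, even a slightly wrong key yields a different permutation and a garbled retrieval, which is the security property the abstract relies on.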



Articles listed on these pages have been found on publicly available internet pages and are cited with links to those pages. Some of the information included herein has been reprinted with permission from the authors or data repositories. Direct any requests via Email to for removal of the links or modifications to specific citations. Please include the ID# of the specific citation in your correspondence.