Biblio

Filters: Keyword is Time measurement
2021-10-04
LAPIQUE, Maxime, GAVAGSAZ-GHOACHANI, Roghayeh, MARTIN, Jean-Philippe, PIERFEDERICI, Serge, ZAIM, Sami.  2020.  Flatness-based control of a 3-phases PWM rectifier with LCL-filter & disturbance observer. IECON 2020 The 46th Annual Conference of the IEEE Industrial Electronics Society. :4685–4690.
In more-electric aircraft, the embedded electrical network is handling more and more vital functions and is increasingly strained as well. Attenuation of switching harmonics is a key step toward network reliability, so filtering elements play a central role. To keep the weight of the embedded network reasonable, weakly damped high-order filters are preferred. Flatness-based control (FBC) can offer both high-bandwidth regulation and a large-signal stability proof. This makes FBC a good candidate to handle the inherent oscillating behavior of the aforementioned filters. However, this control strategy can be tricky to implement, especially with high-order systems. Moreover, FBC demands more sensors than classic PI-based control. This paper addresses these two drawbacks. First, a novel trajectory planning method for high-order systems is proposed; it does not require multiple derivations. Then the input sensors are removed thanks to a parameter estimator. Feasibility and performance are verified with experimental results. A performance comparison with cascaded-loop topologies is given in the final section to prove the relevance of the proposed control strategy.
2021-09-07
Zebari, Rizgar R., Zeebaree, Subhi R. M., Sallow, Amira Bibo, Shukur, Hanan M., Ahmad, Omar M., Jacksi, Karwan.  2020.  Distributed Denial of Service Attack Mitigation Using High Availability Proxy and Network Load Balancing. 2020 International Conference on Advanced Science and Engineering (ICOASE). :174–179.
Nowadays, cybersecurity threats are a big challenge to all organizations that present their services over the Internet. The Distributed Denial of Service (DDoS) attack is the most effective and most frequently used attack and seriously affects the quality of service of each e-organization. Hence, mitigating this type of attack is a persistent need. In this paper, we used Network Load Balancing (NLB) and High Availability Proxy (HAProxy) as mitigation techniques. NLB is used on the Windows platform and HAProxy on the Linux platform. Moreover, Internet Information Services (IIS) 10.0 is implemented on Windows Server 2016 and Apache 2 on Linux Ubuntu 16.04 as web servers. We evaluated each load balancer's efficiency in mitigating synchronize (SYN) DDoS attacks on each platform separately. The evaluation process is accomplished in a real network, and average response time and average CPU usage are utilized as metrics. The results illustrate that NLB on the Windows platform achieved better performance in mitigating SYN DDoS than HAProxy on the Linux platform, as the average response time of the Windows web servers is reduced with NLB. However, the impact of the SYN DDoS attack on the average CPU usage of the IIS 10.0 web servers was greater than on the Apache 2 web servers.
2021-08-02
Gafurov, Davrondzhon, Hurum, Arne Erik.  2020.  Efficiency Metrics and Test Case Design for Test Automation. 2020 IEEE 20th International Conference on Software Quality, Reliability and Security Companion (QRS-C). :15–23.
In this paper, we present our test automation work applied to the national e-health portal for residents of Norway, which has over a million monthly visits. The focus of the work is threefold: delegating automation tasks and increasing reusability of test artifacts; metrics for estimating efficiency when creating test artifacts; and designing robust automated test cases. Delegating (part of) test automation tasks from a technical specialist (e.g., a programmer, an expensive resource) to a non-technical specialist (e.g., a domain expert or functional tester) is carried out by transforming low-level test artifacts into high-level test artifacts. Such transformations not only reduce dependency on specialists with coding skills but also enable involving more stakeholders with domain knowledge in test automation. Furthermore, we propose simple metrics that are useful for estimating efficiency during such transformations, such as implementation creation efficiency and test creation efficiency. We describe how we design automated test cases in order to reduce the number of false positives and minimize code duplication in the presence of a test data challenge (i.e., using the same test data for both manual and automated testing). We have been using our test automation solution for over three years and have successfully applied it to 2 out of 6 Scrum teams in Helsenorge. In total, there are over 120 automated test cases with over 600 iterations (as of today).
2021-07-02
Yang, Yang, Wang, Ruchuan.  2020.  LBS-based location privacy protection mechanism in augmented reality. 2020 International Conference on Internet of Things and Intelligent Applications (ITIA). :1–6.
With the development of augmented reality (AR) technology and location-based service (LBS) technology, combining AR with LBS will create a new way of living and socializing. In AR, users are concerned about the privacy and security of their data. In LBS, leakage of user location privacy is an important threat to LBS users. Therefore, managing positioning information and user location privacy so as to avoid loopholes and abuse is very important. In this review, the concepts and principles of AR technology and LBS are introduced, the existing privacy measurement and privacy protection frameworks are analyzed and summarized, and future research directions for location privacy protection are discussed.
2021-05-18
Zhang, Chi, Chen, Jinfu, Cai, Saihua, Liu, Bo, Wu, Yiming, Geng, Ye.  2020.  iTES: Integrated Testing and Evaluation System for Software Vulnerability Detection Methods. 2020 IEEE 19th International Conference on Trust, Security and Privacy in Computing and Communications (TrustCom). :1455–1460.
Finding software vulnerabilities using vulnerability detection technology is an important way to ensure system security. Existing software vulnerability detection methods have limitations, as each can only play a certain role in specific situations. To accurately analyze and evaluate the existing vulnerability detection methods, an integrated testing and evaluation system (iTES) is designed and implemented in this paper. The main functions of the iTES are: (1) vulnerability cases with source code covering common vulnerability types are collected automatically to form a vulnerability case library; (2) fourteen methods, including static and dynamic vulnerability detection, are evaluated in iTES, involving the Windows and Linux platforms; (3) furthermore, a set of evaluation metrics is designed, including accuracy, false positive rate, utilization efficiency, time cost and resource cost. The final evaluation and test results of iTES provide useful guidance for selecting appropriate software vulnerability detection methods or tools according to the actual situation in practice.
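For illustration, here is a minimal Python sketch of the confusion-matrix-style metrics the abstract lists (accuracy and false positive rate, with time and resource cost carried alongside); the function name and fields are placeholders, not part of iTES.

```python
# Minimal sketch of accuracy / false-positive-rate style metrics for one
# vulnerability detection tool. Names and the time/resource fields are
# illustrative, not taken from iTES.

def detection_metrics(tp, fp, tn, fn, elapsed_s=None, peak_mem_mb=None):
    """Return basic evaluation metrics from a tool's confusion counts."""
    total = tp + fp + tn + fn
    metrics = {
        "accuracy": (tp + tn) / total,
        "false_positive_rate": fp / (fp + tn) if (fp + tn) else 0.0,
    }
    if elapsed_s is not None:
        metrics["time_cost_s"] = elapsed_s          # wall-clock time of the scan
    if peak_mem_mb is not None:
        metrics["resource_cost_mb"] = peak_mem_mb   # peak memory during the scan
    return metrics

print(detection_metrics(tp=42, fp=7, tn=880, fn=13, elapsed_s=95.0))
```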
2021-04-29
Fischer, A., Janneck, J., Kussmaul, J., Krätzschmar, N., Kerschbaum, F., Bodden, E..  2020.  PASAPTO: Policy-aware Security and Performance Trade-off Analysis–Computation on Encrypted Data with Restricted Leakage. 2020 IEEE 33rd Computer Security Foundations Symposium (CSF). :230–245.

This work considers the trade-off between security and performance when revealing partial information about encrypted data computed on. The focus of our work is on information revealed through control flow side-channels when executing programs on encrypted data. We use quantitative information flow to measure security, running time to measure performance and program transformation techniques to alter the trade-off between the two. Combined with information flow policies, we perform a policy-aware security and performance trade-off (PASAPTO) analysis. We formalize the problem of PASAPTO analysis as an optimization problem, prove the NP-hardness of the corresponding decision problem and present two algorithms solving it heuristically. We implemented our algorithms and combined them with the Dataflow Authentication (DFAuth) approach for outsourcing sensitive computations. Our DFAuth Trade-off Analyzer (DFATA) takes Java Bytecode operating on plaintext data and an associated information flow policy as input. It outputs semantically equivalent program variants operating on encrypted data which are policy-compliant and approximately Pareto-optimal with respect to leakage and performance. We evaluated DFATA in a commercial cloud environment using Java programs, e.g., a decision tree program performing machine learning on medical data. The decision tree variant with the worst performance is 357% slower than the fastest variant. Leakage varies between 0% and 17% of the input.
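As a rough illustration of the trade-off the abstract describes, the sketch below filters a set of program variants down to those that are not dominated in both leakage and running time; the variant names and numbers are invented and are not DFATA output.

```python
# Toy sketch of the Pareto filtering step implied by the abstract: keep only
# program variants that are not dominated in both leakage and running time.

variants = {
    "v1": {"leakage": 0.00, "runtime": 4.57},
    "v2": {"leakage": 0.05, "runtime": 2.10},
    "v3": {"leakage": 0.17, "runtime": 1.00},
    "v4": {"leakage": 0.17, "runtime": 3.40},   # dominated by v3
}

def pareto_front(vs):
    front = {}
    for name, a in vs.items():
        dominated = any(
            b["leakage"] <= a["leakage"] and b["runtime"] <= a["runtime"]
            and (b["leakage"] < a["leakage"] or b["runtime"] < a["runtime"])
            for other, b in vs.items() if other != name
        )
        if not dominated:
            front[name] = a
    return front

print(pareto_front(variants))   # v4 is dropped; v1, v2, v3 remain
```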

2021-03-29
Zimmo, S., Refaey, A., Shami, A..  2020.  Trusted Boot for Embedded Systems Using Hypothesis Testing Benchmark. 2020 IEEE Canadian Conference on Electrical and Computer Engineering (CCECE). :1–2.

Security has become a crucial consideration and is one of the most important design goals for an embedded system. This paper examines types of boot sequence, and more specifically trusted boot, which utilizes the chain-of-trust method. After defining these terms, the paper examines the limitations of the existing safe boot and finally proposes a trusted boot method based on a hypothesis testing benchmark, together with the cost of performing this method.

2021-03-17
Kushal, T. R. B., Gao, Z., Wang, J., Illindala, M. S..  2020.  Causal Chain of Time Delay Attack on Synchronous Generator Control. 2020 IEEE Power Energy Society General Meeting (PESGM). :1–5.

Wide integration of information and communication technology (ICT) in modern power grids has brought many benefits as well as the risk of cyber attacks. A critical step towards defending grid cyber security is to understand the cyber-physical causal chain, which describes how an intrusion progresses in cyber space until it produces consequences on the physical power grid. In this paper, we develop an attack vector for a time delay attack on load frequency control in the power grid. Distinct from existing works, which focus separately on cyber intrusion, grid response, or testbed validation, the proposed attack vector for the first time provides a full cyber-physical causal chain. It targets specific vulnerabilities in the protocols, performs a denial-of-service (DoS) attack, induces delays in the control loop, and destabilizes grid frequency. The proposed attack vector is proved in theory, presented as an attack tree, and validated in an experimental environment. The results will provide valuable insights for developing security measures and robust controls against time delay attacks.

Sadu, A., Stevic, M., Wirtz, N., Monti, A..  2020.  A Stochastic Assessment of Attacks based on Continuous-Time Markov Chains. 2020 6th IEEE International Energy Conference (ENERGYCon). :11–16.

With the increasing interdependence of critical infrastructures, the probability that a specific infrastructure will experience a complex cyber-physical attack is increasing. Thus it is important to analyze the risk of an attack and the dynamics of its propagation in order to design and deploy appropriate countermeasures. Attack trees, commonly adopted for this purpose, have inherent shortcomings in representing interdependent, concurrent and sequential attacks. To overcome this, the work presented here proposes a stochastic methodology using Petri nets and Continuous-Time Markov Chains (CTMC) to analyze attacks, considering the individual attack occurrence probabilities and their stochastic propagation times. A procedure to convert a basic attack tree into an equivalent CTMC is presented. The proposed method is applied in a case study to calculate different attack propagation characteristics, namely the probability of reaching the root node and the sub-attack nodes, the mean time to reach the root node, and the mean time spent in the sub-attack nodes before reaching the root node. Additionally, the method quantifies the effectiveness of specific defenses in reducing the attack risk, considering the efficiency of individual defenses.
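A small numerical sketch of one of the quantities listed above, assuming a toy three-state chain with a single absorbing root node; the states and rates are invented, and the first-passage formula used is the standard one for absorbing CTMCs, not code from the paper.

```python
# Mean time to reach the (absorbing) root attack node in a toy CTMC.
import numpy as np

# States: 0 = initial foothold, 1 = sub-attack achieved, 2 = root node (absorbing).
# Generator matrix Q (rows sum to zero); rates in events per hour, invented.
Q = np.array([
    [-0.5,  0.5,  0.0],
    [ 0.2, -0.6,  0.4],
    [ 0.0,  0.0,  0.0],
])

# Restrict to transient states and solve  -Q_tt @ t = 1  for the expected
# time-to-absorption vector t (standard first-passage formula).
Q_tt = Q[:2, :2]
t = np.linalg.solve(-Q_tt, np.ones(2))
print("mean time to reach the root node from each transient state:", t)
```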

2021-03-09
Susanto, Stiawan, D., Arifin, M. A. S., Idris, M. Y., Budiarto, R..  2020.  IoT Botnet Malware Classification Using Weka Tool and Scikit-learn Machine Learning. 2020 7th International Conference on Electrical Engineering, Computer Sciences and Informatics (EECSI). :15–20.

Botnets are one of the main threats to Internet network security: the botmaster carries out attacks on the network by relying on communication over network traffic. Internet of Things (IoT) network infrastructure consists of devices that are inexpensive, low-power, always on and always connected to the network, and their ubiquity and inconspicuousness make IoT devices an attractive target for botnet malware attacks. Machine learning classification methods can be used to identify whether packet traffic is a malware attack or not. Using the Weka and Scikit-learn machine learning analysis tools, this paper implements four machine learning algorithms: AdaBoost, Decision Tree, Random Forest, and Naïve Bayes. Experiments are then conducted to measure the performance of the four algorithms in terms of accuracy, execution time, and false positive rate (FPR). The experiment results show that the Weka tool provides more accurate and efficient classification methods. However, in terms of false positive rate, Scikit-learn provides better results.
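The scikit-learn half of such a comparison can be sketched as follows; the synthetic data stands in for the IoT botnet traffic features, and the default classifier settings are assumptions rather than the configuration used in the paper.

```python
# Train the four classifiers named in the abstract and report accuracy,
# elapsed time, and false positive rate on synthetic stand-in data.
import time
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, RandomForestClassifier
from sklearn.metrics import accuracy_score, confusion_matrix
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=5000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "AdaBoost": AdaBoostClassifier(),
    "Decision Tree": DecisionTreeClassifier(),
    "Random Forest": RandomForestClassifier(),
    "Naive Bayes": GaussianNB(),
}
for name, clf in models.items():
    start = time.perf_counter()
    pred = clf.fit(X_tr, y_tr).predict(X_te)
    elapsed = time.perf_counter() - start
    tn, fp, fn, tp = confusion_matrix(y_te, pred).ravel()
    print(f"{name}: acc={accuracy_score(y_te, pred):.3f} "
          f"time={elapsed:.2f}s FPR={fp / (fp + tn):.3f}")
```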

Mashhadi, M. J., Hemmati, H..  2020.  Hybrid Deep Neural Networks to Infer State Models of Black-Box Systems. 2020 35th IEEE/ACM International Conference on Automated Software Engineering (ASE). :299–311.
Inferring behavior model of a running software system is quite useful for several automated software engineering tasks, such as program comprehension, anomaly detection, and testing. Most existing dynamic model inference techniques are white-box, i.e., they require source code to be instrumented to get run-time traces. However, in many systems, instrumenting the entire source code is not possible (e.g., when using black-box third-party libraries) or might be very costly. Unfortunately, most black-box techniques that detect states over time are either univariate, or make assumptions on the data distribution, or have limited power for learning over a long period of past behavior. To overcome the above issues, in this paper, we propose a hybrid deep neural network that accepts as input a set of time series, one per input/output signal of the system, and applies a set of convolutional and recurrent layers to learn the non-linear correlations between signals and the patterns, over time. We have applied our approach on a real UAV auto-pilot solution from our industry partner with half a million lines of C code. We ran 888 random recent system-level test cases and inferred states, over time. Our comparison with several traditional time series change point detection techniques showed that our approach improves their performance by up to 102%, in terms of finding state change points, measured by F1 score. We also showed that our state classification algorithm provides on average 90.45% F1 score, which improves traditional classification algorithms by up to 17%.
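A hedged Keras sketch of the kind of architecture the abstract describes (convolutional layers over per-signal time series feeding a recurrent layer and a state classifier); the layer sizes, signal count and state count are assumptions, not the authors' configuration.

```python
# Hybrid convolutional + recurrent model over a multivariate time series,
# one channel per input/output signal, ending in a state classifier.
import tensorflow as tf
from tensorflow.keras import layers

n_timesteps, n_signals, n_states = 200, 8, 5   # assumed shapes, not the paper's

model = tf.keras.Sequential([
    tf.keras.Input(shape=(n_timesteps, n_signals)),
    layers.Conv1D(32, kernel_size=5, activation="relu"),   # local per-window patterns
    layers.Conv1D(64, kernel_size=5, activation="relu"),
    layers.LSTM(64),                                        # longer-range temporal correlations
    layers.Dense(n_states, activation="softmax"),           # predicted system state
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```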
2021-01-11
Farokhi, F..  2020.  Temporally Discounted Differential Privacy for Evolving Datasets on an Infinite Horizon. 2020 ACM/IEEE 11th International Conference on Cyber-Physical Systems (ICCPS). :1–8.
We define discounted differential privacy, as an alternative to (conventional) differential privacy, to investigate privacy of evolving datasets, containing time series over an unbounded horizon. We use privacy loss as a measure of the amount of information leaked by the reports at a certain fixed time. We observe that privacy losses are weighted equally across time in the definition of differential privacy, and therefore the magnitude of privacy-preserving additive noise must grow without bound to ensure differential privacy over an infinite horizon. Motivated by the discounted utility theory within the economics literature, we use exponential and hyperbolic discounting of privacy losses across time to relax the definition of differential privacy under continual observations. This implies that privacy losses in the distant past are less important to an individual than current ones. We use discounted differential privacy to investigate privacy of evolving datasets using additive Laplace noise and show that the magnitude of the additive noise can remain bounded under discounted differential privacy. We illustrate the quality of privacy-preserving mechanisms satisfying discounted differential privacy on smart-meter measurement time series of real households, made publicly available by Ausgrid (an Australian electricity distribution company).
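One plausible reading of the exponentially discounted bookkeeping is sketched below: each Laplace-noised report incurs a fixed privacy loss, and losses are down-weighted geometrically with age, so the discounted total stays bounded over an unbounded horizon. The discount factor, noise scale and toy data are illustrative assumptions, not the paper's definitions.

```python
# Exponentially discounted privacy-loss accounting for a Laplace mechanism
# releasing one noisy reading per time step (illustrative parameters).
import numpy as np

rng = np.random.default_rng(0)
sensitivity, noise_scale, gamma = 1.0, 2.0, 0.9   # gamma < 1 gives exponential discounting

readings = rng.uniform(0.0, 1.0, size=1000)                       # toy smart-meter stream
reports = readings + rng.laplace(scale=noise_scale, size=readings.size)

# Each Laplace-noised report incurs privacy loss eps = sensitivity / scale.
eps = sensitivity / noise_scale

# Discounted loss at the final time t: sum_k gamma**(t - k) * eps,
# which is bounded by eps / (1 - gamma) no matter how long the horizon is.
t = readings.size - 1
discounted_loss = sum(gamma ** (t - k) * eps for k in range(t + 1))
print(f"discounted loss {discounted_loss:.3f} <= bound {eps / (1 - gamma):.3f}")
```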
2020-11-09
Wheelus, C., Bou-Harb, E., Zhu, X..  2018.  Tackling Class Imbalance in Cyber Security Datasets. 2018 IEEE International Conference on Information Reuse and Integration (IRI). :229–232.
It is clear that cyber-attacks are a danger that must be addressed with great resolve, as they threaten the information infrastructure upon which we all depend. Many studies have been published expressing varying levels of success with machine learning approaches to combating cyber-attacks, but many modern studies still focus on training and evaluating with very outdated datasets containing old attacks that are no longer a threat, and also lack data on new attacks. Recent datasets like UNSW-NB15 and SANTA have been produced to address this problem. Even so, these modern datasets suffer from class imbalance, which reduces the efficacy of predictive models trained using these datasets. Herein we evaluate several pre-processing methods for addressing the class imbalance problem, using several of the most popular machine learning algorithms and a variant of UNSW-NB15 based upon the attributes from the SANTA dataset.
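As a concrete example of one simple pre-processing option in this family, the sketch below applies random minority oversampling to a toy imbalanced label set; it is not claimed to be one of the specific methods evaluated in the paper, and the synthetic records stand in for UNSW-NB15/SANTA features.

```python
# Random minority oversampling to balance an imbalanced two-class dataset.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 10))
y = np.array([0] * 950 + [1] * 50)          # 95% benign, 5% attack records

minority = np.flatnonzero(y == 1)
n_extra = (y == 0).sum() - minority.size     # copies needed to balance the classes
extra = rng.choice(minority, size=n_extra, replace=True)

X_bal = np.vstack([X, X[extra]])
y_bal = np.concatenate([y, y[extra]])
print("class counts after oversampling:", np.bincount(y_bal))
```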
2020-09-14
Wang, Hui, Yan, Qiurong, Li, Bing, Yuan, Chenglong, Wang, Yuhao.  2019.  Sampling Time Adaptive Single-Photon Compressive Imaging. IEEE Photonics Journal. 11:1–10.
We propose a time-adaptive sampling method and demonstrate a sampling-time-adaptive single-photon compressive imaging system. In order to achieve self-adapting adjustment of the sampling time, a threshold on the accuracy of light-intensity estimation is derived theoretically. According to this threshold, a sampling control module based on a field-programmable gate array is developed. Finally, the advantage of the time-adaptive sampling method is demonstrated experimentally. Imaging performance experiments show that the time-adaptive sampling method can automatically adjust the sampling time in response to changes in the light intensity of the imaged object, obtaining an image with better quality and avoiding speculative selection of the sampling time.
2020-07-20
Lekidis, Alexios, Barosan, Ion.  2019.  Model-based simulation and threat analysis of in-vehicle networks. 2019 15th IEEE International Workshop on Factory Communication Systems (WFCS). :1–8.
Automotive systems are currently undergoing a rapid evolution through the integration of Internet of Things (IoT) and Software Defined Networking (SDN) technologies. The main focus of this evolution is to improve the driving experience, including automated controls, intelligent navigation and safety systems. Moreover, the extremely rapid pace at which such technologies are brought into vehicles necessitates adequate testing of new features to avoid operational errors. Apart from testing, though, IoT and SDN technologies also widen the threat landscape of cyber-security risks due to the number of connectivity interfaces that are nowadays exposed in vehicles. In this paper we present a new method, based on OMNET++, for testing new in-vehicle features and assessing security risks through network simulation. The method is demonstrated through a case study on a Toyota Prius, whose network data are analyzed for the detection of anomalies caused by security threats or operational errors.
2020-06-19
Khandani, Amir K., Bateni, E..  2019.  A Practical, Provably Unbreakable Approach to Physical Layer Security. 2019 16th Canadian Workshop on Information Theory (CWIT). :1–6.

This article presents a practical approach for secure key exchange exploiting reciprocity in wireless transmission. The method relies on the reciprocal channel phase to mask points of a Phase Shift Keying (PSK) constellation. Masking is achieved by adding (modulo 2π) the measured reciprocal channel phase to the PSK constellation points carrying some of the key bits. As the channel phase is uniformly distributed in [0, 2π], knowing the sum of the two phases does not disclose any information about either of its two components. To enlarge the key size over a static or slow-fading channel, the Radio Frequency (RF) propagation path is perturbed to create independent realizations of multi-path fading. Prior techniques have relied on quantizing the reciprocal channel state measured at the two ends and thereby suffer from information leakage in the process of key consolidation (ensuring the two ends have access to the same key). The proposed method does not suffer from such shortcomings, as raw key bits can be equipped with Forward Error Correction (FEC) without affecting the masking (zero information leakage) property. To eavesdrop on a phase value shared in this manner, the eavesdropper (Eve) would need to solve a system of linear equations defined over angles, each equation corresponding to a possible measurement by Eve. Channel perturbation is performed such that each new channel state creates an independent channel realization for the legitimate nodes, as well as for each of Eve's antennas. As a result, regardless of Eve's Signal-to-Noise Ratio (SNR) and number of antennas, Eve will always face an under-determined system of equations. On the other hand, trying to solve any such under-determined system of linear equations in terms of an unknown phase will not reveal any useful information about the actual answer, meaning that the distribution of the answer remains uniform in [0, 2π].
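The masking step itself can be illustrated numerically as below, assuming QPSK and a single shared reciprocal channel phase; synchronization, FEC, channel perturbation and Eve's system of equations are all omitted from this sketch.

```python
# Mask PSK key symbols with the reciprocal channel phase (modulo 2*pi),
# then have the legitimate receiver remove the same phase.
import numpy as np

rng = np.random.default_rng(0)
M = 4                                    # QPSK: 2 key bits per symbol
key_symbols = rng.integers(0, M, size=8)
psk_phase = 2 * np.pi * key_symbols / M

channel_phase = rng.uniform(0, 2 * np.pi)     # reciprocal phase both ends measure

masked = np.mod(psk_phase + channel_phase, 2 * np.pi)     # transmitted phases

# The legitimate receiver measures the same reciprocal phase and subtracts it.
recovered = np.mod(masked - channel_phase, 2 * np.pi)
recovered_symbols = np.rint(recovered / (2 * np.pi / M)).astype(int) % M
assert np.array_equal(recovered_symbols, key_symbols)
print(recovered_symbols)
```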

2020-03-09
Portolan, Michele, Savino, Alessandro, Leveugle, Regis, Di Carlo, Stefano, Bosio, Alberto, Di Natale, Giorgio.  2019.  Alternatives to Fault Injections for Early Safety/Security Evaluations. 2019 IEEE European Test Symposium (ETS). :1–10.
Functional Safety standards like ISO 26262 require a detailed analysis of the dependability of components subjected to perturbations. Radiation testing, or even much more abstract RTL fault injection campaigns, are costly and complex to set up, especially for SoCs and Cyber-Physical Systems (CPSs) comprising intertwined hardware and software. Moreover, some approaches are only applicable at the very end of the development cycle, making potential iterations difficult when market pressure and cost reduction are paramount. In this tutorial, we present a summary of classical state-of-the-art approaches, then alternative approaches for dependability analysis that can give an early yet accurate estimation of the safety or security characteristics of HW-SW systems. Designers can rely on these tools to identify issues in their design to be addressed by protection mechanisms, ensuring that system dependability constraints are met with limited risk when the design is later subjected to the usual fault injections and to, e.g., radiation testing or laser attacks for certification.
2019-02-13
Gevargizian, J., Kulkarni, P..  2018.  MSRR: Measurement Framework For Remote Attestation. 2018 IEEE 16th Intl Conf on Dependable, Autonomic and Secure Computing, 16th Intl Conf on Pervasive Intelligence and Computing, 4th Intl Conf on Big Data Intelligence and Computing and Cyber Science and Technology Congress(DASC/PiCom/DataCom/CyberSciTech). :748–753.
Measurers are critical to a remote attestation (RA) system to verify the integrity of a remote untrusted host. Run-time measurers in a dynamic RA system sample the dynamic program state of the host to form evidence in order to establish trust by a remote system (appraiser). However, existing run-time measurers are tightly integrated with specific software. Such measurers need to be generated anew for each software, which is a manual process that is both challenging and tedious. In this paper we present a novel approach to decouple application-specific measurement policies from the measurers tasked with performing the actual run-time measurement. We describe MSRR (MeaSeReR), a novel general-purpose measurement framework that is agnostic of the target application. We show how measurement policies written per application can use MSRR, eliminating much time and effort spent on reproducing core measurement functionality. We describe MSRR's robust querying language, which allows the appraiser to accurately specify the what, when, and how to measure. We evaluate MSRR's overhead and demonstrate its functionality.
Carpent, X., Tsudik, G., Rattanavipanon, N..  2018.  ERASMUS: Efficient remote attestation via self-measurement for unattended settings. 2018 Design, Automation Test in Europe Conference Exhibition (DATE). :1191–1194.
Remote attestation (RA) is a popular means of detecting malware in embedded and IoT devices. RA is usually realized as a protocol via which a trusted verifier measures software integrity of an untrusted remote device called prover. All prior RA techniques require on-demand operation. We identify two drawbacks of this approach in the context of unattended devices: First, it fails to detect mobile malware that enters and leaves the prover between successive RA instances. Second, it requires the prover to engage in a potentially expensive computation, which can negatively impact safety-critical or real-time devices. To this end, we introduce the concept of self-measurement whereby a prover periodically (and securely) measures and records its own software state. A verifier then collects and verifies these measurements. We demonstrate a concrete technique called ERASMUS, justify its features, and evaluate its performance. We show that ERASMUS is well-suited for safety-critical applications. We also define a new metric - Quality of Attestation (QoA).
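A rough Python sketch of the self-measurement idea follows: the prover periodically hashes its own software state and appends a timestamped digest to a log that the verifier later collects and checks. The in-memory "state" bytes and the plain list used as a log are stand-ins for device memory and protected storage; this is not ERASMUS itself.

```python
# Periodic self-measurement (prover) and verification of the collected log (verifier).
import hashlib
import time

SOFTWARE_STATE = b"example firmware image contents"   # stand-in for the prover's code/memory
measurement_log = []                                   # would live in rollback-protected storage

def self_measure():
    """Prover side: hash the current software state and record it with a timestamp."""
    digest = hashlib.sha256(SOFTWARE_STATE).hexdigest()
    measurement_log.append((time.time(), digest))

def verify(expected_digest):
    """Verifier side: every collected measurement must match the known-good digest."""
    return bool(measurement_log) and all(d == expected_digest for _, d in measurement_log)

# In practice this would run on a secure periodic timer; here we take three measurements.
for _ in range(3):
    self_measure()

good = hashlib.sha256(SOFTWARE_STATE).hexdigest()
print("attestation passes:", verify(good))
```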
2019-01-16
Kimmich, J. M., Schlesinger, A., Tschaikner, M., Ochmann, M., Frank, S..  2018.  Acoustical Analysis of Coupled Rooms Applied to the Deutsche Oper Berlin. 2018 Joint Conference - Acoustics. :1–9.
The aim of the project SIMOPERA is to simulate and optimize the acoustics in large and complex rooms, with special focus on the Deutsche Oper Berlin as an example of application. Firstly, characteristic subspaces of the opera house are considered, such as the orchestra pit, the stage and the auditorium. Special attention is paid to the orchestra pit, where high sound pressure levels can occur, leading to noise-related risks for the musicians. However, lowering the sound pressure level in the orchestra pit should not violate other objectives such as the propagation of sound into the auditorium, the balance between the stage performers and the orchestra across the hall, and the mutual audibility between performers and orchestra members. For that reason, a hybrid simulation method is developed, consisting of the wave-based Finite Element Method (FEM) and the Boundary Element Method (BEM) for low frequencies and geometrical methods like the mirror source method and ray tracing for higher frequencies, in order to determine the relevant room acoustic quantities such as impulse response functions, reverberation time, clarity, center time, etc. Measurements in the opera house will continuously accompany the numerical calculations. Finally, selected constructive means for reducing the sound level in the orchestra pit will be analyzed.
2018-02-27
Canetti, R., Hogan, K., Malhotra, A., Varia, M..  2017.  A Universally Composable Treatment of Network Time. 2017 IEEE 30th Computer Security Foundations Symposium (CSF). :360–375.
The security of almost any real-world distributed system today depends on the participants having some "reasonably accurate" sense of current real time. Indeed, to name one example, the very authenticity of practically any communication on the Internet today hinges on the ability of the parties to accurately detect revocation of certificates, or expiration of passwords or shared keys. However, as recent attacks show, the standard protocols for determining time are subvertible, resulting in wide-spread security loss. Worse yet, we do not have security notions for network time protocols that (a) can be rigorously asserted, and (b) rigorously guarantee security of applications that require a sense of real time. We propose such notions, within the universally composable (UC) security framework. That is, we formulate ideal functionalities that capture a number of prevalent forms of time measurement within existing systems. We show how they can be realized by real-world protocols, and how they can be used to assert security of time-reliant applications - specifically, certificates with revocation and expiration times. This allows for relatively clear and modular treatment of the use of time consensus in security-sensitive systems. Our modeling and analysis are done within the existing UC framework, in spite of its asynchronous, event-driven nature. This allows incorporating the use of real time within the existing body of analytical work done in this framework. In particular it allows for rigorous incorporation of real time within cryptographic tools and primitives.
2018-02-06
Yasumura, Y., Imabayashi, H., Yamana, H..  2017.  Attribute-Based Proxy Re-Encryption Method for Revocation in Cloud Data Storage. 2017 IEEE International Conference on Big Data (Big Data). :4858–4860.

In the big data era, many users upload data to the cloud while security concerns are growing. By using attribute-based encryption (ABE), users can securely store data in the cloud while exerting access control over it. Revocation is necessary for real-world applications of ABE so that revoked users can no longer decrypt data. In actual implementations, however, revocation requires re-encryption of data on the client side through download, decrypt, encrypt, and upload, which results in a huge communication cost between the client and the cloud depending on the data size. In this paper, we propose a new method where the data can be re-encrypted in the cloud without downloading any data. The experimental results show that our method reduces the communication cost by one quarter in comparison with the trivial solution where re-encryption is performed on the client side.

2017-12-12
Pan, X., Yang, Y., Zhang, G., Zhang, B..  2017.  Resilience-based optimization of recovery strategies for network systems. 2017 Second International Conference on Reliability Systems Engineering (ICRSE). :1–6.

Network systems, such as transportation systems and water supply systems, play important roles in our daily life and industrial production. However, a variety of disruptive events occur during their lifetime, causing a series of serious losses. Due to the inevitability of disruption, we should not only focus on improving the reliability or the resistance of the system, but also pay attention to its ability to respond in a timely manner and recover rapidly from disruptive events. That is to say, we need to pay more attention to resilience. In this paper, we describe two resilience models, quotient resilience and integral resilience, to measure the final recovered performance and the cumulative performance during the recovery process, respectively. Based on these two models, we optimize the system recovery strategies after disruption, focusing on the repair sequence of the damaged components and the allocation scheme of resources. The proposed research can serve as guidance to prioritize repair tasks and allocate resources reasonably.
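Under one common reading of these two notions, quotient resilience compares the final recovered performance to the nominal level, while integral resilience averages performance over the recovery window; the toy recovery curve below is invented for illustration and is not taken from the paper.

```python
# Back-of-the-envelope computation of quotient and integral resilience
# for an invented post-disruption recovery curve.
import numpy as np

t = np.linspace(0.0, 10.0, 101)                       # hours after the disruption
performance = np.clip(0.4 + 0.06 * t, 0.4, 1.0)       # drops to 40%, recovers linearly
nominal = 1.0                                          # pre-disruption performance level

quotient_resilience = performance[-1] / nominal
integral_resilience = np.trapz(performance, t) / (nominal * (t[-1] - t[0]))
print(quotient_resilience, integral_resilience)
```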

2017-02-21
Liang Zhongyin, Huang Jianjun, Huang Jingxiong.  2015.  Sub-sampled IFFT based compressive sampling. TENCON 2015 - 2015 IEEE Region 10 Conference. :1–4.

In this paper, a new approach based on the Sub-sampled Inverse Fast Fourier Transform (SSIFFT) is proposed for efficiently acquiring compressive measurements, motivated by random-filter-based methods and the sub-sampled FFT. In our approach, we first multiply the FFT of the input signal and that of a random-tap FIR filter in the frequency domain and then utilize the SSIFFT to obtain compressive measurements in the time domain. This requires less data storage and computation than existing methods based on random filters. Moreover, it is suitable for both one-dimensional and two-dimensional signals. Experimental results show that the proposed approach is effective and efficient.
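A numeric sketch of this measurement pipeline is given below; for simplicity it takes a full inverse FFT and then keeps a subset of samples, whereas the paper's SSIFFT evaluates only the retained samples directly, and the sizes and subsampling pattern are illustrative assumptions.

```python
# Multiply the FFT of the signal by the FFT of a random-tap FIR filter,
# return to the time domain, and keep a subset of samples as measurements.
import numpy as np

rng = np.random.default_rng(0)
n, m = 1024, 128                         # signal length and number of measurements

x = rng.standard_normal(n)               # input signal (would be sparse/compressible)
h = rng.standard_normal(n)               # random-tap FIR filter, length n here for simplicity

# Frequency-domain multiplication = circular convolution with the random filter.
Y = np.fft.fft(x) * np.fft.fft(h)

# Sub-sample the inverse FFT: keep m evenly spaced time-domain samples.
y_full = np.fft.ifft(Y).real
measurements = y_full[:: n // m]
print(measurements.shape)                # (128,) compressive measurements
```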

2015-05-06
Sanandaji, B.M., Bitar, E., Poolla, K., Vincent, T.L..  2014.  An abrupt change detection heuristic with applications to cyber data attacks on power systems. American Control Conference (ACC), 2014. :5056–5061.

We present an analysis of a heuristic for abrupt change detection of systems with bounded state variations. The proposed analysis is based on the Singular Value Decomposition (SVD) of a history matrix built from system observations. We show that monitoring the largest singular value of the history matrix can be used as a heuristic for detecting abrupt changes in the system outputs. We provide sufficient detectability conditions for the proposed heuristic. As an application, we consider detecting malicious cyber data attacks on power systems and test our proposed heuristic on the IEEE 39-bus testbed.
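A quick numeric illustration of the heuristic, with an invented signal, window size and threshold: build a sliding history matrix from recent outputs and flag a change when its largest singular value jumps.

```python
# Monitor the largest singular value of a sliding history matrix to flag
# an abrupt change (here, an injected bias) in the observed output.
import numpy as np

rng = np.random.default_rng(0)
y = rng.normal(0.0, 0.1, size=400)
y[250:] += 1.0                            # abrupt change at sample 250

window, rows = 60, 6                      # history matrix: 6 x 10 slices of the window
threshold = 2.0                           # illustrative detection threshold

for k in range(window, len(y)):
    H = y[k - window:k].reshape(rows, -1)           # history matrix of recent outputs
    sigma_max = np.linalg.svd(H, compute_uv=False)[0]
    if sigma_max > threshold:
        print("change flagged at sample", k)
        break
```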