Dong, Xingbo, Jin, Zhe, Zhao, Leshan, Guo, Zhenhua.  2021.  BioCanCrypto: An LDPC Coded Bio-Cryptosystem on Fingerprint Cancellable Template. 2021 IEEE International Joint Conference on Biometrics (IJCB). :1—8.
Biometrics as a means of personal authentication has demonstrated strong viability in the past decade. However, directly deriving a unique cryptographic key from biometric data is a non-trivial task due to the fact that biometric data is usually noisy and presents large intra-class variations. Moreover, biometric data is permanently associated with the user, which leads to security and privacy issues. Cancellable biometrics and bio-cryptosystem are two main branches to address those issues, yet both approaches fall short in terms of accuracy performance, security, and privacy. In this paper, we propose a Bio-Crypto system on fingerprint Cancellable template (Bio-CanCrypto), which bridges cancellable biometrics and bio-cryptosystem to achieve a middle-ground for alleviating the limitations of both. Specifically, a cancellable transformation is applied on a fixed-length fingerprint feature vector to generate cancellable templates. Next, an LDPC coding mechanism is introduced into a reusable fuzzy extractor scheme and used to extract the stable cryptographic key from the generated cancellable templates. The proposed system can achieve both cancellability and reusability in one scheme. Experiments are conducted on a public fingerprint dataset, i.e., FVC2002. The results demonstrate that the proposed LDPC coded reusable fuzzy extractor is effective and promising.
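The key-binding step described here is an instance of the code-offset fuzzy extractor construction. Below is a minimal sketch of that construction in Python, using a simple repetition code as a stand-in for the paper's LDPC code; all parameters and data are illustrative.

    import numpy as np

    rng = np.random.default_rng(42)
    R = 5  # repetition factor; a stand-in for the LDPC code used in the paper

    def gen(template_bits, key_bits):
        """Enrollment: bind the key to the cancellable template, emit helper data."""
        codeword = np.repeat(key_bits, R)               # encode the key
        return np.bitwise_xor(codeword, template_bits)  # helper = codeword XOR template

    def rep(noisy_template_bits, helper):
        """Authentication: recover the key from a noisy template reading."""
        codeword = np.bitwise_xor(helper, noisy_template_bits)
        # majority-vote decoding of the repetition code
        return (codeword.reshape(-1, R).sum(axis=1) > R // 2).astype(np.uint8)

    key = rng.integers(0, 2, 32, dtype=np.uint8)
    template = rng.integers(0, 2, 32 * R, dtype=np.uint8)
    helper = gen(template, key)

    noisy = template.copy()
    noisy[::R] ^= 1                    # intra-class noise: one flip per group
    assert np.array_equal(rep(noisy, helper), key)

The reusability and cancellability properties of the actual scheme come from the LDPC code and the cancellable transform, which this toy repetition code does not attempt to reproduce.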
Bentahar, Atef, Meraoumia, Abdallah, Bendjenna, Hakim, Chitroub, Salim, Zeroual, Abdelhakim.  2021.  Eigen-Fingerprints-Based Remote Authentication Cryptosystem. 2021 International Conference on Recent Advances in Mathematics and Informatics (ICRAMI). :1—6.
Nowadays, biometrics is one of the most widely used techniques to authenticate and identify human beings, owing to its resistance against theft, loss, or forgetfulness. However, biometric data is subject to various transmission attacks. Protecting sensitive biometric information is a major challenge, especially in current wireless networks such as the Internet of Things, where transmitted data is easy to sniff. To that end, this paper proposes an Eigen-Fingerprint-based biometric cryptosystem, in which the biometric feature vectors are extracted by the Principal Component Analysis technique with an appropriate quantization. The key-binding principle, incorporated with bit-wise and byte-wise error-correcting codes, is used for encrypting data and sharing the key. Several recognition-rate and computation-time measures are used to evaluate the proposed system. The findings show that the proposed cryptosystem achieves high security without decreasing accuracy.
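As a rough illustration of the eigen-fingerprint idea, the sketch below extracts PCA ("eigen") features from a matrix of fingerprint feature vectors and quantizes them to bits. The data, the number of retained components, and the median-threshold quantizer are illustrative assumptions, not the paper's exact quantization.

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 64))   # stand-in fingerprint feature vectors

    # PCA via SVD of the centered training set
    mean = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mean, full_matrices=False)
    components = Vt[:16]                 # keep the 16 leading eigen-directions

    def eigen_features(x):
        return components @ (x - mean)

    # per-coefficient quantization to a binary feature vector
    train_proj = (X - mean) @ components.T
    thresholds = np.median(train_proj, axis=0)
    bits = (eigen_features(X[0]) > thresholds).astype(np.uint8)
    print(bits)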
Cooley, Rafer, Cutshaw, Michael, Wolf, Shaya, Foster, Rita, Haile, Jed, Borowczak, Mike.  2021.  Comparing Ransomware using TLSH and @DisCo Analysis Frameworks. 2021 IEEE International Conference on Big Data (Big Data). :2084—2091.
Modern malware indicators utilized by the current top threat feeds are easily bypassed and generated through enigmatic methods, leading to a lack of detection capabilities for cyber defenders. Static hash-based algorithms such as MD5 or SHA generate indicators that are rendered obsolete by modifying a single byte of the source file. Conversely, fuzzy hash-based algorithms such as SSDEEP and TLSH are more robust to alterations of source information; however, these methods often utilize context boundaries that are hard to define or not based on meaningful information. In previous work, a custom binary analysis tool was created called @DisCo. In this study, four current ransomware campaigns were analyzed using TLSH fuzzy hashing and the @DisCo tool. While TLSH works on the binary level of the entire program, @DisCo works at an intermediate function level. The results from each analysis method were compared to provide validation between the two as well as introduce a narrative for using combinations of these types of methods for the creation of stronger indicators of compromise.
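For reference, a TLSH comparison of two samples looks roughly like the following, assuming the third-party py-tlsh package and placeholder file names. Unlike MD5/SHA, small edits shift the diff score gradually instead of invalidating the indicator outright.

    import tlsh  # pip install py-tlsh (assumed dependency)

    # TLSH needs a reasonable amount of input data (at least ~50 bytes)
    with open("sample_a.bin", "rb") as f:
        h1 = tlsh.hash(f.read())
    with open("sample_b.bin", "rb") as f:
        h2 = tlsh.hash(f.read())

    # 0 means identical digests; larger scores mean greater divergence
    print(tlsh.diff(h1, h2))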
Thao Nguyen, Thi Ai, Dang, Tran Khanh, Nguyen, Dinh Thanh.  2021.  Non-Invertibility for Random Projection based Biometric Template Protection Scheme. 2021 15th International Conference on Ubiquitous Information Management and Communication (IMCOM). :1—8.
Nowadays, biometric-based authentication systems are widely used, which has led to increased attacks on users' biometric data. Biometric template protection therefore continues to hold the attention of researchers concerned with the security of authentication systems. Many previous works proposed biometric template protection schemes that transform the original biometric data into a secure domain, or that establish a cryptographic key with the use of biometric data. The main purpose was to fulfill all three requirements, cancelability, security, and performance, as fully as possible. In this paper, we introduce a hybrid biometric template protection scheme that merges random projection with fuzzy commitment, aiming to limit the drawbacks of each technique while taking full advantage of both. In addition, the non-invertibility property of random projection is analyzed, with the goal of enhancing the security of the system while preserving the discriminability of the original biometric template.
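The non-invertibility argument rests on projecting an n-dimensional template down to m < n dimensions, which discards the component of the template lying in the projection's null space. A minimal numeric illustration (dimensions and data are arbitrary):

    import numpy as np

    rng = np.random.default_rng(7)
    n, m = 128, 64                        # m < n: the map loses n - m dimensions
    A = rng.standard_normal((m, n)) / np.sqrt(m)  # user-specific projection

    x = rng.standard_normal(n)            # original biometric template
    y = A @ x                             # protected (projected) template

    # best-effort inversion: least-norm solution via the pseudoinverse
    x_hat = np.linalg.pinv(A) @ y
    print(np.allclose(A @ x_hat, y))      # True: x_hat explains y perfectly...
    print(np.linalg.norm(x_hat - x))      # ...yet differs from x (null-space loss)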
Sreemol, R, Santosh Kumar, M B, Sreekumar, A.  2021.  Improvement of Security in Multi-Biometric Cryptosystem by Modulus Fuzzy Vault Algorithm. 2021 International Conference on Advances in Computing and Communications (ICACC). :1—7.
Numerous prevalent techniques build Multi-Modal Biometric (MMB) systems that struggle to offer security and revocability for the templates. This work proposes an MMB system centered on the Modulus Fuzzy Vault (MFV) aimed at resolving these issues. The proposed methodology uses Fingerprint (FP), Palmprint (PP), ear, and retina images. All images are enhanced using the Boosted Double Plateau Histogram Equalization (BDPHE) technique. Unnecessary regions are removed from the ear images, and the blood vessels are segmented from the retina images, using the Modified Balanced Iterative Reducing and Clustering using Hierarchy (MBIRCH) technique. Next, features are extracted from the input traits, and the essential features are selected from those extracted using the Bidirectional Deer Hunting Optimization Algorithm (BDHOA). The selected features are merged using Normalized Feature Level and Score Level (NFLSL) fusion, and the fused features are stored securely using the Modulus Fuzzy Vault. The same procedure, up to fusion, is repeated for the query image template. The de-Fuzzy-Vault procedure is then executed on the query template, and the key is recovered by matching the features of the query template against those of the enrolled biometric template. The recovered key is compared against a threshold that classifies the user as genuine or impostor. The proposed BDPHE and MFV techniques perform more efficiently than existing techniques.
Korenda, Ashwija Reddy, Afghah, Fatemeh, Razi, Abolfazl, Cambou, Bertrand, Begay, Taylor.  2021.  Fuzzy Key Generator Design using ReRAM-Based Physically Unclonable Functions. 2021 IEEE Physical Assurance and Inspection of Electronics (PAINE). :1—7.
Physical unclonable functions (PUFs) are used to create unique device identifiers from their inherent fabrication variability. Unstable readings and variation of the PUF response over time are key issues that limit the applicability of PUFs in real-world systems. In this project, we developed a fuzzy extractor (FE) to generate robust cryptographic keys from ReRAM-based PUFs. We tested the efficiency of the proposed FE using BCH and Polar error correction codes. We use ReRAM-based PUFs operating in the pre-forming range to generate binary cryptographic keys at ultra-low power with an objective of tamper sensitivity. We investigate the performance of the proposed FE with real data, using readings of the resistance of pre-formed ReRAM cells under various noise conditions. The results show a bit error rate (BER) in the range of 10⁻⁵ for the Polar-codes-based method when 10% of the ReRAM cell array is erroneous at a Signal-to-Noise Ratio (SNR) of 20 dB. This error rate is achieved by using a helper data length of 512 bits for a 256-bit cryptographic key. Our method uses a 2:1 ratio of helper data to key, much lower than the majority of previously reported methods. This property makes our method more robust against helper data attacks.
Simjanović, Dušan J., Milošević, Dušan M., Milošević, Mimica R..  2021.  Fuzzy AHP based Ranking of Cryptography Indicators. 2021 15th International Conference on Advanced Technologies, Systems and Services in Telecommunications (TELSIKS). :237—240.
The progression of cryptographic attacks in the ICT era undoubtedly leads to the development of new cryptographic algorithms and to the assessment and evaluation of existing ones. In this paper, artificial intelligence is applied, through the fuzzy analytic hierarchy process (FAHP), to rank the criteria and sub-criteria on which the algorithms are based, in order to determine the most promising criteria and optimize their use. Out of fifteen criteria, security soundness, robustness, and hardware failure stood out as the significant ones.
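One standard way to realize FAHP is Buckley's geometric-mean method over triangular fuzzy pairwise comparisons. The sketch below, with a made-up 3-criterion comparison matrix rather than the paper's fifteen criteria, derives crisp ranking weights by centroid defuzzification.

    import numpy as np

    # pairwise comparisons as triangular fuzzy numbers (l, m, u)
    M = np.array([
        [[1, 1, 1],       [2, 3, 4],     [4, 5, 6]],
        [[1/4, 1/3, 1/2], [1, 1, 1],     [1, 2, 3]],
        [[1/6, 1/5, 1/4], [1/3, 1/2, 1], [1, 1, 1]],
    ])

    r = np.prod(M, axis=1) ** (1 / M.shape[0])   # fuzzy geometric mean per row
    total = r.sum(axis=0)                        # component-wise sum (l, m, u)
    # fuzzy weight: r_i * total^-1, where inverting reverses (l, m, u)
    w_fuzzy = r * (1.0 / total[::-1])
    w_crisp = w_fuzzy.mean(axis=1)               # centroid defuzzification
    print(w_crisp / w_crisp.sum())               # normalized criteria weights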
Shaomei, Lv, Xiangyan, Zeng, Long, Huang, Lan, Wu, Wei, Jiang.  2021.  Passenger Volume Interval Prediction based on MTIGM (1,1) and BP Neural Network. 2021 33rd Chinese Control and Decision Conference (CCDC). :6013—6018.
The ternary interval number contains more comprehensive information than an exact number, and predicting ternary interval numbers is more conducive to intelligent decision-making. In order to reduce the overfitting problem of the neural network model, a combined prediction method for ternary interval number sequences, based on the BP neural network and the matrix GM (1, 1) model, is proposed in this paper and applied to passenger volume prediction. The matrix grey model for ternary interval number sequences (MTIGM (1, 1)) can stably predict the overall development trend of a time series. Considering the integrity of interval numbers, the BP neural network model is established by combining the lower, middle and upper boundary points of the ternary interval numbers. The combined weights of MTIGM (1, 1) and the BP neural network are determined based on the grey relational degree. The combined method is used to predict the total passenger volume and railway passenger volume of China, and its prediction is better than that of MTIGM (1, 1) or the BP neural network alone.
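The GM(1,1) building block that MTIGM(1,1) generalizes can be sketched compactly; applying it separately to the lower, middle, and upper boundary sequences of a ternary interval number yields interval forecasts. The series below is made up.

    import numpy as np

    def gm11_forecast(x0, steps=1):
        """Classic GM(1,1): fit dx1/dt + a*x1 = b on the accumulated series."""
        x1 = np.cumsum(x0)
        z = 0.5 * (x1[1:] + x1[:-1])                 # background values
        B = np.column_stack([-z, np.ones(len(z))])
        a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]
        k = np.arange(len(x0) + steps)
        x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
        return np.diff(x1_hat)[-steps:]              # back to the original series

    lower = np.array([5.1, 5.4, 5.9, 6.3, 6.8])      # e.g. interval lower bounds
    print(gm11_forecast(lower, steps=2))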
Emadi, Hamid, Clanin, Joe, Hyder, Burhan, Khanna, Kush, Govindarasu, Manimaran, Bhattacharya, Sourabh.  2021.  An Efficient Computational Strategy for Cyber-Physical Contingency Analysis in Smart Grids. 2021 IEEE Power & Energy Society General Meeting (PESGM). :1—5.
The increasing penetration of cyber systems into smart grids has resulted in these grids being more vulnerable to cyber-physical attacks. The central challenge of higher order cyber-physical contingency analysis is the exponential blow-up of the attack surface due to a large number of attack vectors. This gives rise to computational challenges in devising efficient attack mitigation strategies. However, a system operator can leverage private information about the underlying network to maintain a strategic advantage over an adversary equipped with superior computational capability and situational awareness. In this work, we examine the following scenario: A malicious entity intrudes into the cyber layer of a power network and trips the transmission lines. The objective of the system operator is to deploy security measures in the cyber layer to minimize the impact of such attacks. Due to budget constraints, the attacker and the system operator have limits on the maximum number of transmission lines they can attack or defend. We model this adversarial interaction as a resource-constrained attacker-defender game. The computational intractability of solving large security games is well known. However, we exploit the approximately modular behaviour of an impact metric known as the disturbance value to arrive at a linear-time algorithm for computing an optimal defense strategy. We validate the efficacy of the proposed strategy against attackers of various capabilities and provide an algorithm for a real-time implementation.
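When the impact metric is approximately modular, the disturbance of a set of tripped lines is close to the sum of per-line disturbance values, so a budget-k defense reduces to protecting the k individually most disruptive lines. A sketch of that rule (line names, values, and budgets are illustrative):

    # per-line disturbance values, e.g. precomputed from power-flow analysis
    disturbance = {"L1": 0.9, "L2": 0.4, "L3": 1.7, "L4": 0.2, "L5": 1.1}
    k_defend, k_attack = 2, 2

    # modularity => defend the k individually worst lines
    ranked = sorted(disturbance, key=disturbance.get, reverse=True)
    defended = set(ranked[:k_defend])

    # attacker's best response: trip the worst undefended lines
    attacked = [line for line in ranked if line not in defended][:k_attack]
    impact = sum(disturbance[line] for line in attacked)
    print(defended, attacked, impact)   # {'L3', 'L5'} ['L1', 'L2'] 1.3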
Yao, Pengchao, Hao, Weijie, Yan, Bingjing, Yang, Tao, Wang, Jinming, Yang, Qiang.  2021.  Game-Theoretic Model for Optimal Cyber-Attack Defensive Decision-Making in Cyber-Physical Power Systems. 2021 IEEE 5th Conference on Energy Internet and Energy System Integration (EI2). :2359—2364.
Cyber-Physical Power Systems (CPPSs) currently face an increasing number of security attacks and lack methods for making optimal proactive security decisions to defend against them. This paper proposes an optimal defensive method based on game theory to minimize the performance deterioration of CPPSs under cyberspace attacks. A reinforcement learning solution is used to obtain the Nash equilibrium, and a set of system vulnerability metrics is adopted to quantify the cost of defense against cyber-attacks. The minimax-Q algorithm is utilized to obtain the optimal defense strategy without the availability of the attacker's information. The proposed solution is assessed through experiments based on a realistic power generation microsystem testbed, and the numerical results confirmed its effectiveness.
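At the heart of minimax-Q is solving, in every state, the zero-sum matrix game defined by the current Q-values. Here is a sketch of that inner step using linear programming with SciPy; the payoff matrix is illustrative, standing in for Q(s, ., .).

    import numpy as np
    from scipy.optimize import linprog

    def minimax_value(Q):
        """Defender's mixed strategy maximizing the worst-case payoff of Q."""
        m, n = Q.shape
        c = np.r_[np.zeros(m), -1.0]              # maximize v == minimize -v
        A_ub = np.c_[-Q.T, np.ones(n)]            # v - sum_i Q[i,j]*x_i <= 0
        b_ub = np.zeros(n)
        A_eq = np.r_[np.ones(m), 0.0].reshape(1, -1)   # strategy sums to 1
        res = linprog(c, A_ub, b_ub, A_eq, [1.0],
                      bounds=[(0, None)] * m + [(None, None)])
        return res.x[:m], -res.fun                # strategy, game value

    Q = np.array([[2.0, -1.0], [-1.0, 1.0]])      # illustrative Q(s, a_d, a_a)
    strategy, value = minimax_value(Q)
    print(strategy, value)                        # [0.4 0.6] 0.2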
Pereira, Luiz Manella, Iyengar, S. S., Amini, M. Hadi.  2021.  On the Impact of the Embedding Process on Network Resilience Quantification. 2021 International Conference on Computational Science and Computational Intelligence (CSCI). :836—839.
Network resilience is crucial to ensure reliable and secure operation of critical infrastructures. Although graph theoretic methods have been developed to quantify the topological resilience of networks, i.e., measuring resilience with respect to connectivity, in this study we propose to use the tools from Topological Data Analysis (TDA), Algebraic Topology, and Optimal Transport (OT). In our prior work, we used these tools to create a resilience metric that bypassed the need to embed a network onto a space. We also hypothesized that embeddings could encode different information about a network and that different embeddings could result in different outcomes when computing resilience. In this paper we attempt to test this hypothesis. We will utilize the WEGL framework to compute the embedding for the considered network and compare the results against our prior work, which did not use an embedding process. To our knowledge, this is the first attempt to study the ramifications of choosing an embedding, thus providing a novel understanding of how to choose an embedding and whether such a choice matters when quantifying resilience.
Ndemeye, Bosco, Hussain, Shahid, Norris, Boyana.  2021.  Threshold-Based Analysis of the Code Quality of High-Performance Computing Software Packages. 2021 IEEE 21st International Conference on Software Quality, Reliability and Security Companion (QRS-C). :222—228.
Many popular metrics used for the quantification of the quality or complexity of a codebase (e.g. cyclomatic complexity) were developed in the 1970s or 1980s, when source code sizes were significantly smaller than they are today and before a number of modern programming language features were introduced in different languages. Thus, the many thresholds that were suggested by researchers for deciding whether a given function is lacking in a given quality dimension need to be updated. In pursuit of this goal, we study a number of open-source high-performance codes, each of which has been in development for more than 15 years (a characteristic which we take to imply good design), to score them in terms of their source code quality and to relax the above-mentioned thresholds. First, we employ the LLVM/Clang compiler infrastructure and introduce a Clang AST tool to gather AST-based metrics, as well as an LLVM IR pass for those based on a source code's static call graph. Second, we perform statistical analysis to identify the reference thresholds of 22 code quality and callgraph-related metrics at a fine-grained level.
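Deriving updated reference thresholds from measured metric distributions is commonly done with percentiles over the benchmark corpus. A tiny sketch of that statistical step (the metric values are placeholders, not the paper's measurements):

    import numpy as np

    # per-function cyclomatic complexities harvested by an AST tool (placeholder)
    ccn = np.array([1, 1, 2, 2, 3, 3, 3, 4, 5, 6, 8, 9, 12, 15, 24, 38])

    # quantile-based reference thresholds: moderate / high / very-high bands
    moderate, high, very_high = np.percentile(ccn, [70, 80, 90])
    print(moderate, high, very_high)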
Singh, Jagdeep, Behal, Sunny.  2021.  A Novel Approach for the Detection of DDoS Attacks in SDN using Information Theory Metric. 2021 8th International Conference on Computing for Sustainable Global Development (INDIACom). :512—516.
The Internet always remains a target for cyberattacks, and attackers are becoming equipped with more potent tools, due to the advancement of technology, to breach its security. Industries and organizations are sponsoring many projects to avoid these kinds of problems. As a result, the SDN (Software Defined Network) architecture is becoming an accepted alternative to traditional IP-based networks and seems a better approach to defend the Internet. However, SDN is also vulnerable to many new threats because of its architectural concept. SDN might be a primary target for DoS (Denial of Service) and DDoS (Distributed Denial of Service) attacks due to its centralized control and the linking of the data plane and control plane. In this paper, we propose a novel technique for the detection of DDoS attacks using an information theory metric. We compared our approach with widely used Intrusion Detection Systems (IDSs) based on Shannon entropy and Renyi entropy, and showed that our proposed methodology has more power to detect malicious flows in SDN-based networks. We used precision, detection rate and FPR (False Positive Rate) as performance parameters for comparison, and validated the methodology using a topology implemented in the Mininet network emulator.
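An entropy-based detector of this kind watches the distribution of a header field (for example, destination IP) over a window of flows and raises an alarm when the entropy collapses toward a few targets. Below is a generic sketch of the Shannon and Renyi entropy computation; the window, field, and threshold are illustrative, not the paper's parameters.

    import numpy as np
    from collections import Counter

    def shannon(counts):
        p = np.array(list(counts.values()), dtype=float)
        p /= p.sum()
        return -np.sum(p * np.log2(p))

    def renyi(counts, alpha=2.0):
        p = np.array(list(counts.values()), dtype=float)
        p /= p.sum()
        return np.log2(np.sum(p ** alpha)) / (1.0 - alpha)

    window = ["10.0.0.5", "10.0.0.7", "10.0.0.5", "10.0.0.5",
              "10.0.0.5", "10.0.0.5", "10.0.0.9", "10.0.0.5"]  # destination IPs
    counts = Counter(window)

    THRESHOLD = 1.5   # tuned on benign traffic (illustrative value)
    if shannon(counts) < THRESHOLD:   # traffic concentrating on one victim
        print("possible DDoS:", shannon(counts), renyi(counts))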
Chandramouli, Athreya, Jana, Sayantan, Kothapalli, Kishore.  2021.  Efficient Parallel Algorithms for Computing Percolation Centrality. 2021 IEEE 28th International Conference on High Performance Computing, Data, and Analytics (HiPC). :111—120.
Centrality measures on graphs have found applications in a large number of domains including modeling the spread of an infection/disease, social network analysis, and transportation networks. As a result, parallel algorithms for computing various centrality metrics on graphs are gaining significant research attention in recent years. In this paper, we study parallel algorithms for the percolation centrality measure which extends the betweenness-centrality measure by incorporating a time dependent state variable with every node. We present parallel algorithms that compute the source-based and source-destination variants of the percolation centrality values of nodes in a network. Our algorithms extend the algorithm of Brandes, introduce optimizations aimed at exploiting the structural properties of graphs, and extend the algorithmic techniques introduced by Sariyuce et al. [26] in the context of centrality computation. Experimental studies of our algorithms on an Intel Xeon(R) Silver 4116 CPU and an Nvidia Tesla V100 GPU on a collection of 12 real-world graphs indicate that our algorithmic techniques offer a significant speedup.
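NetworkX ships a sequential reference implementation of percolation centrality, which is convenient for validating the output of parallel implementations on small graphs. The graph and percolation states below are made up, and a recent NetworkX release is assumed.

    import networkx as nx

    G = nx.karate_club_graph()
    # time-dependent percolation state of every node, in [0, 1]
    states = {v: (v % 5) / 4.0 for v in G.nodes}

    pc = nx.percolation_centrality(G, states=states)
    top5 = sorted(pc, key=pc.get, reverse=True)[:5]
    print([(v, round(pc[v], 3)) for v in top5])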
Zhao, Lianying, Oshman, Muhammad Shafayat, Zhang, Mengyuan, Moghaddam, Fereydoun Farrahi, Chander, Shubham, Pourzandi, Makan.  2021.  Towards 5G-ready Security Metrics. ICC 2021 - IEEE International Conference on Communications. :1—6.
The fifth-generation (5G) mobile telecom network has been garnering interest in both academia and industry, offering better flexibility and higher performance compared to previous generations. Along with functionality improvements, new attack vectors have also emerged. Network operators and regulatory organizations wish to have a more precise idea about the security posture of 5G environments. Meanwhile, various security metrics for IT environments have been around and have attracted the community's attention. However, 5G-specific factors are less taken into consideration. This paper considers such 5G-specific factors to identify potential gaps if existing security metrics are to be applied to 5G environments. In light of the layered nature and multi-ownership, the paper proposes a new approach to the modular computation of security metrics based on cross-layer projection as a means of information sharing between layers. Finally, the proposed approach is evaluated through simulation.
Koteshwara, Sandhya.  2021.  Security Risk Assessment of Server Hardware Architectures Using Graph Analysis. 2021 Asian Hardware Oriented Security and Trust Symposium (AsianHOST). :1—4.
The growing complexity of server architectures, which incorporate several components with state, has necessitated rigorous assessment of the security risk both during design and operation. In this paper, we propose a novel technique to model the security risk of servers by mapping their architectures to graphs. This allows us to leverage tools from computational graph theory, which we combine with probability theory for deriving quantitative metrics for risk assessment. Probability of attack is derived for server components, with prior probabilities assigned based on knowledge of existing vulnerabilities and countermeasures. The resulting analysis is further used to compute measures of impact and exploitability of attack. The proposed methods are demonstrated on two open-source server designs with different architectures.
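Mapping the architecture to a graph makes the risk computation mechanical. Below is a toy version of such an analysis, assuming independent per-component compromise probabilities multiplied along attack paths; the component names, priors, and independence assumption are invented for illustration.

    import networkx as nx

    # server components with prior compromise probabilities from known CVEs
    priors = {"BMC": 0.30, "NIC": 0.20, "Firmware": 0.10, "CPU": 0.05}
    G = nx.DiGraph()
    G.add_edges_from([("NIC", "BMC"), ("BMC", "Firmware"),
                      ("Firmware", "CPU"), ("NIC", "Firmware")])

    def path_probability(path):
        prob = 1.0
        for comp in path:          # independence along the path (assumption)
            prob *= priors[comp]
        return prob

    # probability that at least one entry path from the NIC reaches the CPU
    p_fail_all = 1.0
    for path in nx.all_simple_paths(G, "NIC", "CPU"):
        p_fail_all *= 1.0 - path_probability(path)
    print("P(CPU compromised) ~", round(1.0 - p_fail_all, 5))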
Boutaib, Sofien, Elarbi, Maha, Bechikh, Slim, Palomba, Fabio, Said, Lamjed Ben.  2021.  A Possibilistic Evolutionary Approach to Handle the Uncertainty of Software Metrics Thresholds in Code Smells Detection. 2021 IEEE 21st International Conference on Software Quality, Reliability and Security (QRS). :574—585.
A code smells detection rule is a combination of metrics with their corresponding crisp thresholds and labels. The goal of this paper is to deal with the uncertainty of metrics' thresholds, as such thresholds usually cannot be exactly determined when judging the smelliness of a particular software class. To deal with this issue, we first propose to encode each metric value into a binary possibility distribution with respect to a threshold computed by a discretization technique, using the Possibilistic C-means classifier. Then, we propose ADIPOK-UMT, an evolutionary algorithm that evolves a population of PK-NN classifiers for the detection of smells under threshold uncertainty. The experimental results reveal that the possibility distribution-based encoding allows the implicit weighting of software metrics (features) with respect to their computed discretization thresholds. Moreover, ADIPOK-UMT is shown to outperform four relevant state-of-the-art approaches on a set of commonly adopted benchmark software systems.
Anh, Dao Vu, Tran Thi Thanh, Thuy, Huu, Long Nguyen, Dung Truong, Cao, Xuan, Quyen Nguyen.  2021.  Performance Analysis of High-Speed Wavelength Division Multiplexing Communication Between Chaotic Secure and Optical Fiber Channels Using DP-16QAM Scheme. 2020 IEEE Eighth International Conference on Communications and Electronics (ICCE). :33—38.
In this paper, we propose a numerical simulation investigation of the wavelength division multiplexing mechanism between a chaotic secure channel and a traditional fiber channel using the advanced modulation method DP-16QAM at a bitrate of 80 Gbps, a fiber length of 80 km, and 100 GHz channel spacing in the C-band. Our paper investigates the correlation coefficients between the transmitter and the receiver for the two types of communication channels. Our simulation results demonstrate that, in all cases, the BER is always below 2×10⁻⁴ even without forward error correction. In addition, cross-interaction between the chaotic channel and the non-chaotic channel is negligible, showing a high level of independence between the two channels.
Liu, Shiqin, Jiang, Ning, Zhang, Yiqun, Peng, Jiafa, Zhao, Anke, Qiu, Kun.  2021.  Security-enhanced Key Distribution Based on Chaos Synchronization Between Dual Path-injected Semiconductor Lasers. 2021 International Conference on UK-China Emerging Technologies (UCET). :109—112.
We propose and numerically demonstrate a novel secure key distribution scheme based on the chaos synchronization of two semiconductor lasers (SLs) subject to symmetrical double chaotic injections, which are output by two mutually-coupled semiconductor lasers. The results show that high quality chaos synchronization can be observed between two local SLs with suitable injection strength and identical injection time delays for Alice and Bob. On the basis of satisfactory chaos synchronization and a post-processing technology, identical secret keys for Alice and Bob are successfully generated with a bit error ratio (BER) below the HD-FEC threshold of 3.8×10⁻³.
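The post-processing that turns synchronized chaotic waveforms into identical bit strings is typically a quantization step followed by error-rate estimation. Here is a bare-bones sketch, where the synchronized signals are simulated with correlated noise rather than a laser model:

    import numpy as np

    rng = np.random.default_rng(3)
    alice = rng.standard_normal(4096)                # Alice's chaotic samples
    bob = alice + 0.05 * rng.standard_normal(4096)   # Bob's copy, residual sync error

    # single-threshold quantization around each party's own median
    key_a = (alice > np.median(alice)).astype(np.uint8)
    key_b = (bob > np.median(bob)).astype(np.uint8)

    ber = np.mean(key_a != key_b)
    print(f"key BER = {ber:.4f}")   # compare against the 3.8e-3 HD-FEC threshold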
Garcia, Daniel, Liu, Hong.  2021.  A Study of Post Quantum Cipher Suites for Key Exchange. 2021 IEEE International Symposium on Technologies for Homeland Security (HST). :1—7.
Current cryptographic solutions used in information technologies today, like Transport Layer Security, utilize algorithms with underlying computationally difficult problems to solve. With the ongoing research and development of quantum computers, these same computationally difficult problems become solvable within reasonable (polynomial) time. The emergence of large-scale quantum computers would put the integrity and confidentiality of today's data in jeopardy. It then becomes urgent to develop, implement, and test a new suite of cybersecurity measures against attacks from a quantum computer. This paper explores, understands, and evaluates this new category of cryptosystems as well as the many tradeoffs among them. All the algorithms submitted to the National Institute of Standards and Technology (NIST) for standardization can be categorized into three major categories, each relating to a new underlying hard problem: error-correcting codes, algebraic lattices (including ring learning with errors), and supersingular isogenies. These new mathematical hard problems have been shown to be resistant to the same types of quantum attack. Utilizing hardware clock cycle registers, the work sets up benchmarks of the four Round 3 NIST algorithms in two environments: cloud computing and embedded systems. As expected, each algorithm presents tradeoffs and advantages for particular applications. Saber and Kyber are exceedingly fast but have a larger ciphertext size for transmission over a wire. McEliece's key size and key generation time are its largest drawbacks, but its smallest-in-class ciphertext size and only slightly lower performance allow a use case where key reuse is prioritized. NTRU finds a middle ground in these tradeoffs: it performs better than McEliece and produces smaller ciphertexts than Kyber and Saber, allowing for a use case of highly varied environments that need to value speed and ciphertext size equally. Going forward, the benchmarking system developed could be applied to digital signatures, another vital aspect of a cryptosystem.
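The benchmarking approach generalizes to any primitive: wrap each operation and collect timing samples. Python cannot read the hardware cycle registers used in the paper, so this sketch substitutes a monotonic nanosecond clock, and keygen is a placeholder workload rather than a real NIST candidate implementation.

    import time
    import hashlib
    import statistics

    def keygen():
        # placeholder workload standing in for a KEM key generation
        return hashlib.sha3_512(b"seed").digest()

    def bench(fn, runs=1000):
        samples = []
        for _ in range(runs):
            t0 = time.perf_counter_ns()
            fn()
            samples.append(time.perf_counter_ns() - t0)
        return statistics.median(samples), statistics.stdev(samples)

    median_ns, jitter_ns = bench(keygen)
    print(f"median {median_ns} ns, jitter {jitter_ns:.0f} ns")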
Asaithambi, Gobika, Gopalakrishnan, Balamurugan.  2021.  Design of Code and Chaotic Frequency Modulation for Secure and High Data rate Communication. 2021 5th International Conference on Computer, Communication and Signal Processing (ICCCSP). :1—6.
In Forward Error Correction (FEC), redundant bits are added for detecting and correcting bit errors, which increases the bandwidth requirement. To address this issue, we combine the FEC method with higher-order M-ary modulation to provide a bandwidth-efficient system. An input bit stream is mapped to a bi-orthogonal code at different levels based on the code rate used (4/16, 3/16, or 2/16). Jamming attacks on wireless networks are mitigated by the Chaotic Frequency Hopping (CFH) spread spectrum technique. In this paper, to achieve a better data rate and to transmit data securely, we combine the FEC and CFH techniques, represented as Code and Chaotic Frequency Modulation (CCFM). In addition, two rate adaptation algorithms, Static retransmission rate ARF (SARF) and Fast rate reduction ARF (FARF), are employed in the CFH technique to dynamically adapt the code rate based on channel conditions and reduce packet retransmissions. The Symbol Error Rate (SER) performance of the system is analyzed for different code rates against conventional OFDM in the presence of AWGN and Rayleigh channels, and the reliability of the CFH method is tested under different jammers.
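Chaotic frequency hopping replaces a conventional pseudo-random hop sequence with one driven by a chaotic map that both parties regenerate from a shared initial condition. Below is a minimal sketch using a logistic map; the map, seed, and channel count are illustrative choices, not the paper's exact design.

    N_CHANNELS = 64

    def chaotic_hops(x0, r=3.99, n=16):
        """Logistic-map hop sequence; x0 in (0, 1) acts as the shared secret."""
        hops, x = [], x0
        for _ in range(n):
            x = r * x * (1.0 - x)             # chaotic iteration
            hops.append(int(x * N_CHANNELS))  # map state to a channel index
        return hops

    tx_hops = chaotic_hops(0.61803)
    rx_hops = chaotic_hops(0.61803)   # receiver regenerates the same sequence
    assert tx_hops == rx_hops
    print(tx_hops)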
Kaur, Satwinder, Kuttan, Deepak B, Mittal, Nitin.  2021.  An Energy-saving Approach for Error control Codes in Wireless Sensor Networks. 2021 2nd International Conference on Smart Electronics and Communication (ICOSEC). :313—316.
Wireless Sensor Networks (WSNs) have limited energy resources, which require authentic data transmission at minimum cost. The major challenge is to deploy WSNs with the limited energy and lifetime of nodes while taking care of secure data communication. Transmission over wireless channels may incur many losses, such as fading and noise, increase the bit error rate, and deplete the energy resources of the nodes. To reduce the adverse effects of these losses and to save power, error control coding (ECC) techniques are widely used, and they also bring coding gain. Since WSNs have limited energy resources, the selection of an ECC is difficult, as both power consumption and BER must be taken into consideration. This paper reviews different types of models, their applications, the limitations of sensor networks, and future work aimed at overcoming these limitations.
Silvério, Tiago, Figueiredo, Gonçalo, André, Paulo S., Ferreira, Rute A.S..  2021.  Privacy Increase in VLC System Based on Hyperchaotic Map. 2021 Telecoms Conference (ConfTELE). :1—4.
Visible light communications (VLC) have been the focus of many recent investigations due to their potential for transmitting data at a higher bitrate than conventional communication systems. Alongside the advantage of being energy efficient through the use of LEDs (Light Emitting Diodes), it is imperative that these systems also take into consideration the available privacy and security measures. This work highlights the technical aspects of a typical 16-QAM (Quadrature Amplitude Modulation) VLC system incorporating an enhanced privacy feature that uses a hyperchaotic map to scramble the symbols. The results obtained in this study showed a low-dispersion symbol constellation while communicating at 100 Baud over a 1 m link. Using the measured EVM (Error Vector Magnitude) of the constellation, the BER (Bit Error Rate) of this system was estimated to be below 10⁻¹², which is lower than the threshold of 3.8×10⁻³ corresponding to the 7% hard-decision forward error correction (HD-FEC) limit for optimal transmission, showing that this technique can be implemented at higher bitrates and with a higher modulation index.
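The quoted BER figure follows the standard Gray-coded square M-QAM approximation that treats 1/EVM^2 as the effective SNR. A sketch of that estimator (the EVM value plugged in is illustrative):

    import math

    def ber_from_evm(evm_rms, M=16):
        """Approximate BER of Gray-coded square M-QAM from RMS EVM (linear)."""
        snr = 1.0 / evm_rms ** 2                   # EVM^2 ~ 1/SNR at unit power
        arg = math.sqrt(3.0 * snr / (M - 1))
        q = 0.5 * math.erfc(arg / math.sqrt(2.0))  # Gaussian Q-function
        return 4.0 * (1.0 - 1.0 / math.sqrt(M)) * q / math.log2(M)

    print(ber_from_evm(0.10))   # e.g. 10% RMS EVM on 16-QAM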
Mishra, Suman, Radhika, K, Babu, Y.Murali Mohan.  2021.  Error Detection And Correction In TCAMS Based SRAM. 2021 6th International Conference on Signal Processing, Computing and Control (ISPCC). :283—287.
Ternary content addressable memories (TCAMs) are widely utilized in network systems to implement the labeling of packets. For example, they are used for packet forwarding, security, and software-defined networks (SDNs). TCAMs are typically deployed as standalone devices or as an embedded intellectual property component on application-specific integrated circuits. However, field-programmable gate arrays (FPGAs) do not include native TCAM blocks. Nevertheless, the versatility of FPGAs makes them appealing for SDN deployment, and most FPGA vendors have SDN development kits; these need to support TCAM functionality and therefore emulate TCAMs using the FPGA logic blocks. Several methods to emulate TCAMs on FPGAs have been introduced in recent years, some of which use the large number of memory blocks within modern FPGAs to implement TCAMs. A concern with memories is that soft errors that corrupt the stored bits can affect them. Memories can be protected with a parity bit to detect errors, or with an error correction code, although this requires extra bits per word. This brief considers the protection of the memories used to emulate TCAMs. It is shown in particular that, by leveraging the fact that only part of the possible memory contents are valid, most single-bit errors can be corrected when the memories are protected with a parity bit.
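The correction idea fits in a few lines: when a parity check fails, flip each bit in turn and accept the unique candidate that is a valid entry of the emulated TCAM. This sketch assumes the set of valid rule words is known, which is precisely what the brief exploits; the word width and rule set are made up.

    VALID_RULES = {0b10110010, 0b01100111, 0b11010001}   # emulated TCAM contents
    WIDTH = 8

    def parity(word):
        return bin(word).count("1") & 1

    def read_with_correction(stored_word, stored_parity):
        if parity(stored_word) == stored_parity:
            return stored_word                # parity consistent: accept as-is
        # single-bit error: try every flip, keep candidates that are valid rules
        candidates = [stored_word ^ (1 << i) for i in range(WIDTH)]
        matches = [c for c in candidates if c in VALID_RULES]
        if len(matches) == 1:
            return matches[0]                 # unambiguous correction
        raise ValueError("uncorrectable or ambiguous error")

    word = 0b10110010
    corrupted = word ^ 0b00000100             # a soft error flips one bit
    print(bin(read_with_correction(corrupted, parity(word))))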
Cheng, Junyuan, Jiang, Xue-Qin, Bai, Enjian, Wu, Yun, Hai, Han, Pan, Feng, Peng, Yuyang.  2021.  Rate Adaptive Reconciliation Based on Reed-Solomon Codes. 2021 6th International Conference on Communication, Image and Signal Processing (CCISP). :245—249.
Security of physical layer key generation is based on the randomness and reciprocity of the wireless fading channel, which has attracted more and more attention in recent years. This paper proposes a rate-adaptive key agreement scheme that utilizes the received signal strength (RSS) of the channel between two wireless devices to generate the key. In the conventional information reconciliation process, inconsistent bits are usually eliminated by filtering, which increases the possibility of exposing the generated key bit string. Building on the strengths of existing secret key extraction approaches, this paper develops a scheme that uses Reed-Solomon (RS) codes, a family of forward error correction channel codes, for information reconciliation. Owing to the strong error correction performance of RS codes, the proposed scheme can resolve inconsistent key bit strings arising in the channel sensing process. At the same time, the structure of RS codes helps the scheme realize rate adaptation, since the construction of the error correction code allows the code rate to be controlled freely and supports reconciliation for key bit strings of different lengths. Through experiments, we find that when the number of inconsistent key bits is not greater than the maximum number of errors the RS code can correct, the scheme fulfills the purpose of reconciliation well.
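A code-offset style reconciliation with RS codes can be prototyped with the third-party reedsolo package (an assumption; the tuple-returning decode below matches recent reedsolo releases). Alice masks an RS codeword with her RSS-derived bytes; Bob unmasks with his own measurements and decodes, and the parity budget nsym is the rate-adaptation knob.

    import os
    from reedsolo import RSCodec  # pip install reedsolo (assumed dependency)

    nsym = 16                     # parity symbols; raise for noisier channels
    rsc = RSCodec(nsym)

    key = os.urandom(32)                          # Alice's random secret key
    rss_alice = bytearray(os.urandom(32 + nsym))  # stand-in RSS-derived bytes
    rss_bob = bytearray(rss_alice)
    rss_bob[5] ^= 0xFF                            # a few mismatched measurements
    rss_bob[17] ^= 0x0F

    codeword = rsc.encode(key)                    # length 32 + nsym
    helper = bytes(c ^ r for c, r in zip(codeword, rss_alice))  # public message

    # Bob unmasks with his own measurements and decodes the residual errors
    noisy = bytearray(h ^ r for h, r in zip(helper, rss_bob))
    recovered, _, _ = rsc.decode(noisy)           # corrects up to nsym // 2 symbols
    assert bytes(recovered) == key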