Biblio

Found 327 results

Filters: First Letter Of Last Name is R
R
R P, Jagadeesh Chandra Bose, Singi, Kapil, Kaulgud, Vikrant, Phokela, Kanchanjot Kaur, Podder, Sanjay.  2019.  Framework for Trustworthy Software Development. 2019 34th IEEE/ACM International Conference on Automated Software Engineering Workshop (ASEW). :45–48.
Intelligent software applications are becoming ubiquitous and pervasive, affecting various aspects of our lives and livelihoods. At the same time, the risks to which these systems expose organizations and end users are growing dramatically. Trustworthiness of software applications is becoming a paramount necessity. Trust is to be regarded as a first-class citizen in the total product life cycle and should be addressed across all stages of software development. Trust can be looked at from two facets: one at an algorithmic level (e.g., bias-free, discrimination-aware, explainable and interpretable techniques) and the other at a process level, by making development processes more transparent and auditable, and ensuring adherence to regulations and best practices. In this paper, we address the latter and propose a blockchain-enabled governance framework for building trustworthy software. Our framework supports the recording, monitoring, and analysis of various activities throughout the application development life cycle, thereby bringing in transparency and auditability. It facilitates the specification of regulations and best practices, verifies adherence to them, raises alerts on non-compliance, and prescribes remedial measures.
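
The append-only, auditable recording the authors describe can be illustrated with a minimal hash-chained log, a toy stand-in for the paper's blockchain ledger (the class and method names here are ours, not the framework's):

```python
import hashlib
import json

class AuditLog:
    """Toy append-only log: each entry commits to the previous entry's hash,
    so any retroactive edit breaks the chain and is detectable."""

    def __init__(self):
        self.entries = []

    def append(self, activity):
        prev_hash = self.entries[-1]["hash"] if self.entries else "0" * 64
        payload = json.dumps({"activity": activity, "prev": prev_hash},
                             sort_keys=True)
        self.entries.append({"activity": activity, "prev": prev_hash,
                             "hash": hashlib.sha256(payload.encode()).hexdigest()})

    def verify(self):
        prev_hash = "0" * 64
        for e in self.entries:
            payload = json.dumps({"activity": e["activity"], "prev": e["prev"]},
                                 sort_keys=True)
            if e["prev"] != prev_hash or \
               hashlib.sha256(payload.encode()).hexdigest() != e["hash"]:
                return False
            prev_hash = e["hash"]
        return True

log = AuditLog()
log.append("code review passed")
log.append("static analysis: 0 critical findings")
```

A compliance checker can then walk the verified entries against a rule set; the tamper-evidence comes entirely from the hash chaining.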
R. Lee, L. Mullen, P. Pal, D. Illig.  2015.  "Time of flight measurements for optically illuminated underwater targets using Compressive Sampling and Sparse reconstruction". OCEANS 2015 - MTS/IEEE Washington. :1-6.

Compressive Sampling and Sparse reconstruction theory is applied to a linearly frequency modulated continuous wave hybrid lidar/radar system. The goal is to show that high resolution time of flight measurements to underwater targets can be obtained utilizing far fewer samples than dictated by Nyquist sampling theorems. Traditional mixing/down-conversion and matched filter signal processing methods are reviewed and compared to the Compressive Sampling and Sparse Reconstruction methods. Simulated evidence is provided to show the possible sampling rate reductions, and experiments are used to observe the effects that turbid underwater environments have on recovery. Results show that by using compressive sensing theory and sparse reconstruction, it is possible to achieve significant sample rate reduction while maintaining centimeter range resolution.

R. Leszczyna, M. Łosiński, R. Małkowski.  2015.  "Security information sharing for the polish power system". 2015 Modern Electric Power Systems (MEPS). :1-6.

The Polish Power System is becoming increasingly dependent on Information and Communication Technologies, which results in its exposure to cyberattacks, including evolved and highly sophisticated threats such as Advanced Persistent Threats or Distributed Denial of Service attacks. The most exposed components are SCADA systems in substations and Distributed Control Systems in power plants. When addressing this situation, the usual cyber security technologies are a prerequisite, but not sufficient. With the rapidly evolving cyber threat landscape, the use of partnerships and information sharing has become critical. However, due to several anonymity concerns, the relevant stakeholders may become reluctant to exchange sensitive information about security incidents. In the paper, a multi-agent architecture is presented for the Polish Power System which addresses the anonymity concerns.

R. Mishra, A. Mishra, P. Bhanodiya.  2015.  "An edge based image steganography with compression and encryption". 2015 International Conference on Computer, Communication and Control (IC4). :1-4.

Security of secret data has been a major issue of concern from ancient times. Steganography and cryptography are the two techniques used to reduce the security threat. Cryptography is the art of converting a secret message into other than human-readable form. Steganography is the art of hiding the existence of a secret message. These techniques are required to protect data from theft over the rapidly growing network. To achieve this, there is a need for a system that is barely perceptible to the human visual system. In this paper, a new technique is introduced for data transmission over an insecure channel. The secret data is first compressed using the LZW algorithm before embedding it behind any cover media; data is compressed to reduce its size. After compression, data encryption is performed to increase the security. Encryption is performed with the help of a key, which makes it difficult to recover the secret message even if the existence of the secret message is revealed. The edges are then detected using the Canny edge detector and the encrypted secret data is embedded there with the help of a hash function. The proposed technique is implemented in MATLAB, and its key strengths are a large data-hiding capacity and least distortion in the stego image. The technique is applied over various images and the results show least distortion in the altered image.
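
The embed/extract round trip can be sketched with a plain LSB scheme. This is a simplification: the paper's LZW compression, Canny edge selection and hash-based placement are not reproduced here, and the XOR "encryption" below is only a toy stand-in for the key-based cipher:

```python
def xor_encrypt(data: bytes, key: bytes) -> bytes:
    """Toy stream cipher standing in for the paper's key-based encryption."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def embed_lsb(cover: list, payload: bytes) -> list:
    """Hide each payload bit in the least significant bit of a cover pixel."""
    bits = [(byte >> i) & 1 for byte in payload for i in range(8)]
    assert len(bits) <= len(cover), "cover too small"
    stego = cover[:]
    for i, bit in enumerate(bits):
        stego[i] = (stego[i] & 0xFE) | bit   # changes each pixel by at most 1
    return stego

def extract_lsb(stego: list, n_bytes: int) -> bytes:
    out = bytearray()
    for j in range(n_bytes):
        byte = 0
        for i in range(8):
            byte |= (stego[j * 8 + i] & 1) << i
        out.append(byte)
    return bytes(out)

cover = list(range(200))                       # stand-in for pixel values
secret = xor_encrypt(b"meet at dawn", b"key")  # compress+encrypt step (toy)
stego = embed_lsb(cover, secret)
recovered = xor_encrypt(extract_lsb(stego, len(secret)), b"key")
```

Since only the least significant bit of each selected pixel changes, the per-pixel distortion is bounded by 1, which is why such schemes have low visual impact.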

R. Saravanan, V. Saminadan, V. Thirunavukkarasu.  2015.  "VLSI implementation of BER measurement for wireless communication system". 2015 International Conference on Innovations in Information, Embedded and Communication Systems (ICIIECS). :1-5.

This paper presents the Bit Error Rate (BER) performance of a wireless communication system. The complexity of modern wireless communication systems is increasing at a fast pace, and it becomes challenging to design their hardware. The proposed system consists of a MIMO transmitter and a MIMO receiver along with a realistic fading channel. To make the data transmission more secure when the data are passed into the channel, a Crypto-System with Embedded Error Control (CSEEC) is used. The system supports data security and reliability using forward error correction (FEC) codes. Security is provided through the use of a new symmetric encryption algorithm, and reliability is provided by the use of FEC codes. The system aims at speeding up the encryption and encoding operations and reduces the hardware dedicated to each of these operations. The proposed system allows users to achieve more secure and reliable communication, and the proposed BER measurement system consumes low power compared to existing systems. The advantage of VLSI-based BER measurement is that it can be used in real-time applications and provides a single-chip solution.
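
The BER metric itself reduces to counting bit disagreements between the transmitted and received streams. A minimal software model (ours, not the paper's VLSI design; the binary symmetric channel is a crude stand-in for the fading channel):

```python
import random

def ber(tx: list, rx: list) -> float:
    """Bit Error Rate: fraction of received bits differing from transmitted."""
    assert len(tx) == len(rx)
    return sum(t != r for t, r in zip(tx, rx)) / len(tx)

def bsc(bits: list, p: float, rng: random.Random) -> list:
    """Binary symmetric channel: flip each bit with probability p."""
    return [b ^ (rng.random() < p) for b in bits]

rng = random.Random(42)
tx = [rng.randint(0, 1) for _ in range(100_000)]   # transmitted bitstream
rx = bsc(tx, 0.01, rng)                            # received through channel
measured = ber(tx, rx)                             # should be near 0.01
```

A hardware BER tester does the same comparison with a counter per clock cycle; the software model is useful for validating the expected error statistics.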

Ra, Gyeong-Jin, Lee, Im-Yeong.  2019.  A Study on Hybrid Blockchain-based XGS (XOR Global State) Injection Technology for Efficient Contents Modification and Deletion. 2019 Sixth International Conference on Software Defined Systems (SDS). :300—305.

Blockchain is a database technology that provides integrity and trust: being an append-only distributed ledger, it does not permit arbitrary modifications or deletions. That is, the blockchain does not support modification or deletion but follows a CRAB (Create-Retrieve-Append-Burn) method in which data can be read and written according to a legitimate user's access right (for example, the owner's private key). However, data once created can never be deleted, which causes problems such as privacy breaches. In this paper, we propose an on-off block-chained Hybrid Blockchain system that separates the data from the ledger and saves the connection history to the blockchain. In addition, the state is changed in a distributed database separately from the ledger record by generating an arbitrary injection in XOR form, so that the history of modification/deletion off the blockchain can be efficiently retrieved.
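
The XOR-form injection can be illustrated in a few lines: the on-chain side stays append-only (it records only hashes of state transitions), while the off-chain state is updated by XOR-ing in an injection value. Variable names are ours; this is a sketch of the idea, not the paper's XGS protocol:

```python
import hashlib
import secrets

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

# Off-chain content, padded to a fixed block size for the toy example
content = b"user profile: alice ".ljust(32)

# On-chain side: append-only ledger stores only hashes of state transitions
ledger = [hashlib.sha256(content).hexdigest()]

# "Modification/deletion" via XOR global state injection:
# state' = state XOR injection. Keeping the injection allows audit/revert;
# discarding it makes the original content unrecoverable (deletion),
# while the ledger's append-only history remains intact.
injection = secrets.token_bytes(32)
new_state = xor_bytes(content, injection)
ledger.append(hashlib.sha256(new_state).hexdigest())

reverted = xor_bytes(new_state, injection)   # XOR is its own inverse
```

The key property is that privacy-sensitive bytes never need to be stored on-chain, so "burning" the injection value deletes them without touching the ledger.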

Raber, Frederic, Krüger, Antonio.  2018.  Deriving Privacy Settings for Location Sharing: Are Context Factors Always the Best Choice? 2018 IEEE Symposium on Privacy-Aware Computing (PAC). :86–94.
Research has observed context factors like occasion and time as influential factors for predicting whether or not to share a location with online friends. In other domains like social networks, personality was also found to play an important role. Furthermore, users are seeking a fine-grained disclosure policy that also allows them to display an obfuscated location, like the center of the current city, to some of their friends. In this paper, we observe which context factors and personality measures can be used to predict the correct privacy level out of seven privacy levels, which include obfuscation levels like center of the street or current city. Our results show that a prediction is possible with a precision 20% better than a constant value. We give design indications to determine which context factors should be recorded, and how much the precision can be increased if personality and privacy measures are recorded using either a questionnaire or automated text analysis.
Rabie, Asmaa.  2016.  The RSA Trap. XRDS. 23:65–65.
Rabie, R., Drissi, M..  2018.  Applying Sigmoid Filter for Detecting the Low-Rate Denial of Service Attacks. 2018 IEEE 8th Annual Computing and Communication Workshop and Conference (CCWC). :450–456.

This paper focuses on optimizing the sigmoid filter for detecting low-rate DoS attacks. Though a sigmoid filter can help detect the attacker, it can severely affect network efficiency. Unlike high-rate attacks, low-rate DoS attacks such as ``Shrew'' and ``New Shrew'' are hard to detect. Attackers choose a malicious low-rate bandwidth to exploit TCP's congestion control window algorithm and the retransmission timeout mechanism. We simulated the attacker traffic using NS-3. The sigmoid filter was used to create a threshold bandwidth filter at the router that allowed a specific bandwidth, so that traffic exceeding the threshold would be dropped, or redirected to a honeypot server instead. We simulated the sigmoid filter using MATLAB, taking the attacker's and legitimate user's traffic generated by NS-3 as input. We ran the experiment three times with different threshold values correlated to the TCP packet size. We found the probability of detecting the attacker traffic as follows: the first was 25%, the second 50% and the third 60%. However, we observed a drop in legitimate user traffic with the following probabilities, respectively: 75%, 50%, and 85%.
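
The filter itself is a logistic function of the offered rate relative to a threshold: traffic well below the threshold passes, traffic well above is dropped or redirected. A sketch (the threshold and steepness values here are illustrative, not the paper's tuned parameters):

```python
import math

def sigmoid_drop_probability(rate_bps: float, threshold_bps: float,
                             steepness: float = 1e-5) -> float:
    """Probability of dropping (or redirecting to a honeypot) traffic at
    a given rate: ~0 well below the threshold, ~1 well above it."""
    return 1.0 / (1.0 + math.exp(-steepness * (rate_bps - threshold_bps)))

threshold = 1_000_000          # 1 Mbps, illustrative
legit = sigmoid_drop_probability(200_000, threshold)     # well below threshold
attack = sigmoid_drop_probability(1_800_000, threshold)  # shrew burst above it
```

The trade-off the paper measures falls directly out of the curve's shape: a steeper, lower-threshold sigmoid catches more attack bursts but also drops more legitimate traffic near the threshold.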

Radhakrishnan, Kiran, Menon, Rajeev R, Nath, Hiran V.  2019.  A survey of zero-day malware attacks and its detection methodology. TENCON 2019 - 2019 IEEE Region 10 Conference (TENCON). :533—539.

The recent malware outbreaks have shown that the existing end-point security solutions are not robust enough to secure systems from getting compromised. Techniques like code obfuscation, along with one or more zero-days, are used by malware developers to evade security systems. Such malware is used for large-scale attacks involving Advanced Persistent Threats (APT), botnets, cryptojacking, etc. Cryptojacking poses a severe threat to various organizations and individuals. We summarise multiple methods available for the detection of such malware.

Radhakrishnan, Vijayanand, Durairaj, Devaraj, Balasubramanian, Kannapiran, Kamatchi, Kartheeban.  2019.  Development Of A Novel Security Scheme Using DNA Biocryptography For Smart Meter Data Communication. 2019 3rd International Conference on Computing and Communications Technologies (ICCCT). :237-244.

Data security is a major requirement of smart meter communication with the control server through the Advanced Metering Infrastructure (AMI). The easy physical access to smart meters and the multi-faceted nature of the AMI communication network are the main reasons smart meters face a large number of attacks. The differing topology, bandwidth and heterogeneity of the communication network prevent the existing security mechanisms from satisfying the security requirements of smart meters. Hence, advanced security mechanisms are essential to encrypt smart meter data before transmitting it to the control server. The emerging biocryptography technique has several advantages over existing techniques and is most suitable for securing the communication of low-processing devices like smart meters. In this paper, a lightweight encryption scheme using DNA sequences, with a suitable key management scheme, is proposed for efficient and secure smart meter communication. The proposed 2-phase DNA cryptography provides confidentiality and integrity for the transmitted data, and the authenticity of keys is attained by exchanging them through the Diffie-Hellman scheme. The strength of the proposed encryption scheme is analyzed and its efficiency is evaluated by simulating an AMI communication network using Simulink/MATLAB. Comparison of the simulation results with various techniques shows that the proposed scheme is suitable for secure communication of smart meter data.
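
Two of the building blocks are easy to sketch: 2-bits-per-base DNA encoding and Diffie-Hellman key agreement. The group parameters and the single XOR pass below are illustrative only; the paper's 2-phase DNA cryptography is richer, and real deployments use standardized large primes or elliptic curves:

```python
BASES = "ACGT"                       # 2 bits per nucleotide: 00 01 10 11

def dna_encode(data: bytes) -> str:
    """Map each byte to four DNA bases, most significant bit pair first."""
    return "".join(BASES[(b >> s) & 3] for b in data for s in (6, 4, 2, 0))

def dna_decode(seq: str) -> bytes:
    out = bytearray()
    for i in range(0, len(seq), 4):
        b = 0
        for ch in seq[i:i + 4]:
            b = (b << 2) | BASES.index(ch)
        out.append(b)
    return bytes(out)

# Diffie-Hellman over a small illustrative group
p, g = 0xFFFFFFFB, 5                 # small prime (2**32 - 5) and generator
a_secret, b_secret = 123456, 654321  # meter's and server's private values
A, B = pow(g, a_secret, p), pow(g, b_secret, p)
shared = pow(B, a_secret, p)         # meter side
assert shared == pow(A, b_secret, p) # server derives the same secret

key = shared.to_bytes(4, "big")
reading = b"kWh=42.7"
cipher = bytes(x ^ key[i % 4] for i, x in enumerate(reading))
strand = dna_encode(cipher)          # transmitted as a DNA sequence
```

The control server reverses the pipeline: decode the strand, then decrypt with the shared key derived from its own Diffie-Hellman exponent.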

Radhika, K. R., Nalini, M. K..  2017.  Biometric Image Encryption Using DNA Sequences and Chaotic Systems. 2017 International Conference on Recent Advances in Electronics and Communication Technology (ICRAECT). :164–168.

Emerging communication technologies in distributed network systems require the transfer of biometric digital images with high security. Network security is identified by the changes in system behavior, which is either dynamic or deterministic. Performance computation is complex in dynamic systems, where cryptographic techniques are not highly suitable. Chaotic theory solves complex problems of nonlinear deterministic systems. Several chaotic methods are combined to obtain a hyper-chaotic system for more security. Chaotic theory along with DNA sequences enhances the security of biometric image encryption. The implementation shows that the encrypted image is highly chaotic and resistant to various attacks.
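
The chaotic component can be sketched with a logistic map driving a keystream. This is a minimal single-map illustration; the paper's hyper-chaotic system combines several maps and adds DNA-sequence operations on top:

```python
def logistic_keystream(x0: float, n: int, r: float = 3.99) -> bytes:
    """Iterate x -> r*x*(1-x) and quantize each state to a key byte.
    x0 in (0, 1) acts as the secret key; tiny changes diverge quickly."""
    x, out = x0, bytearray()
    for _ in range(n):
        x = r * x * (1.0 - x)
        out.append(int(x * 256) % 256)
    return bytes(out)

def chaotic_xor(data: bytes, x0: float) -> bytes:
    """Encrypt/decrypt (XOR is its own inverse) with the chaotic keystream."""
    ks = logistic_keystream(x0, len(data))
    return bytes(d ^ k for d, k in zip(data, ks))

image = bytes(range(64))                   # stand-in for biometric pixel data
cipher = chaotic_xor(image, x0=0.3141592653)
plain = chaotic_xor(cipher, x0=0.3141592653)
```

The sensitivity to initial conditions is the security argument: decrypting with a key that differs in the tenth decimal place yields garbage after only a few dozen iterations.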

Radlak, Krystian, Smolka, Bogdan.  2016.  Automated Recognition of Facial Expressions Authenticity. Proceedings of the 18th ACM International Conference on Multimodal Interaction. :577–581.

Recognition of facial expressions authenticity is quite troublesome for humans. Therefore, it is an interesting topic for the computer vision community, as the developed algorithms for facial expressions authenticity estimation may be used as indicators of deception. This paper discusses the state-of-the-art methods developed for smile veracity estimation and proposes a plan of development and validation of a novel approach to automated discrimination between genuine and posed facial expressions. The proposed fully automated technique is based on the extension of the high-dimensional Local Binary Patterns (LBP) to the spatio-temporal domain and combines them with the dynamics of facial landmark movements. The proposed technique will be validated on several existing smile databases and a novel database created with the use of a high speed camera. Finally, the developed framework will be applied for the detection of deception in real life scenarios.
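
The basic LBP operator that the spatio-temporal extension builds on is simple to state: threshold the 8 neighbors of a pixel against the center and read the results as a byte. A minimal 2-D sketch (not the paper's high-dimensional or temporal variant):

```python
def lbp_code(img, y, x):
    """8-bit Local Binary Pattern code for pixel (y, x):
    each neighbor contributes a 1 bit if it is >= the center pixel."""
    center = img[y][x]
    # Clockwise neighbor order starting at the top-left
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    code = 0
    for bit, (dy, dx) in enumerate(offsets):
        if img[y + dy][x + dx] >= center:
            code |= 1 << bit
    return code

img = [[10, 20, 30],
       [40, 50, 60],
       [70, 80, 90]]
code = lbp_code(img, 1, 1)   # center pixel is 50
```

Histograms of such codes over image regions (and, in the spatio-temporal case, over frame volumes) form the texture descriptor fed to the classifier.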

Radoglou-Grammatikis, Panagiotis, Sarigiannidis, Panagiotis, Giannoulakis, Ioannis, Kafetzakis, Emmanouil, Panaousis, Emmanouil.  2019.  Attacking IEC-60870-5-104 SCADA Systems. 2019 IEEE World Congress on Services (SERVICES). 2642-939X:41–46.
The rapid evolution of the Information and Communications Technology (ICT) services transforms the conventional electrical grid into a new paradigm called Smart Grid (SG). Even though SG brings significant improvements, such as increased reliability and better energy management, it also introduces multiple security challenges. One of the main reasons for this is that SG combines a wide range of heterogeneous technologies, including Internet of Things (IoT) devices as well as Supervisory Control and Data Acquisition (SCADA) systems. The latter are responsible for monitoring and controlling the automatic procedures of energy transmission and distribution. Nevertheless, the presence of these systems introduces multiple vulnerabilities because their protocols do not implement essential security mechanisms such as authentication and access control. In this paper, we focus our attention on the security issues of the IEC 60870-5-104 (IEC-104) protocol, which is widely utilized in the European energy sector. In particular, we provide a SCADA threat model based on a Coloured Petri Net (CPN) and emulate four different types of cyber attacks against IEC-104. Last, we used AlienVault's risk assessment model to evaluate the risk level that each of these cyber attacks introduces to our system to confirm our intuition about their severity.
Rady, Mai, Abdelkader, Tamer, Ismail, Rasha.  2018.  SCIQ-CD: A Secure Scheme to Provide Confidentiality and Integrity of Query results for Cloud Databases. 2018 14th International Computer Engineering Conference (ICENCO). :225–230.
Database outsourcing introduces a new paradigm, called Database as a Service (DBaaS). Database Service Providers (DSPs) have the ability to host outsourced databases and provide efficient facilities for their users. However, the data and the execution of database queries are under the control of the DSP, which is not always a trusted authority. Therefore, our problem is to ensure the outsourced database security. To address this problem, we propose a Secure scheme to provide Confidentiality and Integrity of Query results for Cloud Databases (SCIQ-CD). The performance analysis shows that our proposed scheme is secure and efficient for practical deployment.
Rafailidis, Dimitrios.  2016.  Modeling Trust and Distrust Information in Recommender Systems via Joint Matrix Factorization with Signed Graphs. Proceedings of the 31st Annual ACM Symposium on Applied Computing. :1060–1065.

We propose an efficient recommendation algorithm, by incorporating the side information of users' trust and distrust social relationships into the learning process of a Joint Non-negative Matrix Factorization technique based on Signed Graphs, namely JNMF-SG. The key idea in this study is to generate clusters based on signed graphs, considering positive and negative weights for the trust and distrust relationships, respectively. Using a spectral clustering approach for signed graphs, the clusters are extracted on condition that users with positive connections should lie close, while users with negative ones should lie far. Then, we propose a Joint Non-negative Matrix factorization framework, by generating the final recommendations, using the user-item and user-cluster associations over the joint factorization. In our experiments with a dataset from a real-world social media platform, we show that we significantly increase the recommendation accuracy, compared to state-of-the-art methods that also consider the trust and distrust side information in matrix factorization.

Raff, Edward, Nicholas, Charles.  2017.  Malware Classification and Class Imbalance via Stochastic Hashed LZJD. Proceedings of the 10th ACM Workshop on Artificial Intelligence and Security. :111–120.

There are currently few methods that can be applied to malware classification problems which don't require domain knowledge to apply. In this work, we develop our new SHWeL feature vector representation, by extending the recently proposed Lempel-Ziv Jaccard Distance. These SHWeL vectors improve upon LZJD's accuracy, outperform byte n-grams, and allow us to build efficient algorithms for both training (a weakness of byte n-grams) and inference (a weakness of LZJD). Furthermore, our new SHWeL method also allows us to directly tackle the class imbalance problem, which is common for malware-related tasks. Compared to existing methods like SMOTE, SHWeL provides significantly improved accuracy while reducing algorithmic complexity to O(N). Because our approach is developed without the use of domain knowledge, it can be easily re-applied to any new domain where there is a need to classify byte sequences.
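
The core of the Lempel-Ziv Jaccard Distance is easy to sketch: build an LZ78-style dictionary of each byte sequence and take the Jaccard distance between the two dictionaries. This is the plain LZJD idea only; the paper's SHWeL extension (stochastic hashing, per-class vector stretching) is not reproduced:

```python
def lz_set(data: bytes) -> set:
    """LZ78-style parsing: collect the set of distinct substrings produced
    while scanning the sequence left to right."""
    seen, current = set(), b""
    for byte in data:
        current += bytes([byte])
        if current not in seen:
            seen.add(current)
            current = b""
    return seen

def lzjd_distance(a: bytes, b: bytes) -> float:
    """1 - Jaccard similarity of the two LZ dictionaries (0 = identical)."""
    sa, sb = lz_set(a), lz_set(b)
    return 1.0 - len(sa & sb) / len(sa | sb)

d_same = lzjd_distance(b"ABABABABAB" * 10, b"ABABABABAB" * 10)
d_diff = lzjd_distance(b"ABABABABAB" * 10, b"XYZXYZXYZ" * 10)
```

Because the dictionary is built from raw bytes, no parsing or domain knowledge about the file format is needed, which is the property the paper exploits for malware classification.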

Rafii, Z., Coover, B., Jinyu Han.  2014.  An audio fingerprinting system for live version identification using image processing techniques. Acoustics, Speech and Signal Processing (ICASSP), 2014 IEEE International Conference on. :644-648.

Suppose that you are at a music festival checking out an artist, and you would like to quickly know about the song that is being played (e.g., title, lyrics, album, etc.). If you have a smartphone, you could record a sample of the live performance and compare it against a database of existing recordings from the artist. Services such as Shazam or SoundHound will not work here, as this is not the typical framework for audio fingerprinting or query-by-humming systems: a live performance is neither identical to its studio version (e.g., variations in instrumentation, key, tempo, etc.) nor is it a hummed or sung melody. We propose an audio fingerprinting system that can deal with live version identification by using image processing techniques. Compact fingerprints are derived using a log-frequency spectrogram and an adaptive thresholding method, and template matching is performed using the Hamming similarity and the Hough Transform.
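
The matching step comes down to Hamming similarity: the fraction of agreeing bits between two equal-length binary fingerprints. A toy illustration (deriving the fingerprints from the log-frequency spectrogram, and the Hough-transform alignment, are not shown):

```python
def hamming_similarity(f1: int, f2: int, n_bits: int) -> float:
    """Fraction of agreeing bits between two n-bit binary fingerprints."""
    diff = (f1 ^ f2) & ((1 << n_bits) - 1)
    return 1.0 - bin(diff).count("1") / n_bits

studio = 0b1011_0010_1110_0001            # toy 16-bit fingerprint frame
live = studio ^ 0b0000_0100_0000_1000     # live version: 2 bits flipped
score = hamming_similarity(studio, live, 16)
```

A live rendition produces fingerprints that agree on most, but not all, bits with the studio version, so the system ranks candidates by this similarity rather than requiring exact hash matches as Shazam-style systems do.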

Rafique, Ansar, Van Landuyt, Dimitri, Reniers, Vincent, Joosen, Wouter.  2017.  Towards Scalable and Dynamic Data Encryption for Multi-tenant SaaS. Proceedings of the Symposium on Applied Computing. :411–416.
Application-level data management middleware solutions are becoming increasingly compelling to deal with the complexity of a multi-cloud or federated cloud storage and multi-tenant storage architecture. However, these systems typically support traditional data mapping strategies that are created under the assumption of a fixed and rigorous database schema, and mapping data objects while supporting varying data confidentiality requirements therefore leads to fragmentation of data over distributed storage nodes. This introduces performance overhead at the level of individual database transactions and negatively affects the overall scalability. This paper discusses these challenges and highlights the potential of leveraging the data schema flexibility of NoSQL databases to accomplish dynamic and fine-grained data encryption in a more efficient and scalable manner. We illustrate these ideas in the context of an industrial multi-tenant SaaS application.
Rafiuddin, M. F. B., Minhas, H., Dhubb, P. S..  2017.  A dark web story in-depth research and study conducted on the dark web based on forensic computing and security in Malaysia. 2017 IEEE International Conference on Power, Control, Signals and Instrumentation Engineering (ICPCSI). :3049–3055.
The following is research conducted on the Dark Web to study and identify its ins and outs: what the dark web is all about, the various methods available to access it, and more. The researchers have also included the steps and precautions taken before the dark web was accessed. Apart from that, the findings and the website links/URLs are included along with a description of the sites. The primary usage of the dark web and some of the researchers' experiences have been further documented in this research paper.
Raghothaman, Mukund, Kulkarni, Sulekha, Heo, Kihong, Naik, Mayur.  2018.  User-Guided Program Reasoning Using Bayesian Inference. Proceedings of the 39th ACM SIGPLAN Conference on Programming Language Design and Implementation. :722-735.

Program analyses necessarily make approximations that often lead them to report true alarms interspersed with many false alarms. We propose a new approach to leverage user feedback to guide program analyses towards true alarms and away from false alarms. Our approach associates each alarm with a confidence value by performing Bayesian inference on a probabilistic model derived from the analysis rules. In each iteration, the user inspects the alarm with the highest confidence and labels its ground truth, and the approach recomputes the confidences of the remaining alarms given this feedback. It thereby maximizes the return on the effort by the user in inspecting each alarm. We have implemented our approach in a tool named Bingo for program analyses expressed in Datalog. Experiments with real users and two sophisticated analyses (a static datarace analysis for Java programs and a static taint analysis for Android apps) show significant improvements on a range of metrics, including false alarm rates and number of bugs found.
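
The inspect-then-recompute loop can be illustrated with a toy Bayes update: here each alarm's confidence is the posterior mean that alarms from its producing rule are true (a Beta-Bernoulli stand-in for Bingo's much richer Datalog-derived probabilistic model; alarm and rule names are invented):

```python
from collections import defaultdict

# Each alarm is produced by one analysis rule; an alarm's confidence is the
# posterior mean that alarms from its rule are true (Beta(1,1) prior).
alarms = {"a1": "race_rule", "a2": "race_rule", "a3": "taint_rule",
          "a4": "taint_rule", "a5": "taint_rule"}
counts = defaultdict(lambda: [1, 1])      # rule -> [true + 1, false + 1]

def confidence(alarm):
    t, f = counts[alarms[alarm]]
    return t / (t + f)

def label(alarm, is_true):
    """User inspects an alarm; update its rule's posterior."""
    counts[alarms[alarm]][0 if is_true else 1] += 1

before = confidence("a2")
label("a1", True)         # confirming one race alarm...
after = confidence("a2")  # ...raises confidence in its sibling race alarm
label("a3", False)        # a false taint alarm lowers taint confidences
```

The point of the loop is that one label propagates to every alarm sharing structure with the labeled one, so the user's effort on each inspection is amortized across the remaining alarms.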

Ragmani, Awatif, El Omri, Amina, Abghour, Noreddine, Moussaid, Khalid, Rida, Mohammed.  2016.  An Improved Scheduling Strategy in Cloud Computing Using Fuzzy Logic. Proceedings of the International Conference on Big Data and Advanced Wireless Technologies. :22:1–22:9.

Within a few years, Cloud computing has emerged as the most promising IT business model. Thanks to its various technical and financial advantages, Cloud computing continues to attract new users from scientific and industrial sectors every day. To satisfy the various users' requirements, Cloud providers must maximize the performance of their IT resources to ensure the best service at the lowest cost. Performance optimization efforts in the Cloud can be made at different levels and aspects. In the present paper, we propose to introduce a fuzzy logic process into the scheduling strategy for the public Cloud in order to improve response time, processing time and total cost. In fact, fuzzy logic has proven its ability to solve optimization problems in several fields such as data mining, image processing, networking and much more.
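
A minimal sketch of what a fuzzy scheduling rule looks like: triangular membership functions map a host's load into fuzzy sets, and a small rule base is defuzzified into a crisp priority by weighted average. The membership breakpoints and rule outputs below are illustrative, not the paper's:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def schedule_priority(load):
    """Fuzzy rules: low load -> priority 1.0, medium -> 0.5, high -> 0.1.
    Defuzzify with a weighted average of the rule outputs."""
    mu = {"low": tri(load, -0.01, 0.0, 0.5),
          "medium": tri(load, 0.2, 0.5, 0.8),
          "high": tri(load, 0.5, 1.0, 1.01)}
    out = {"low": 1.0, "medium": 0.5, "high": 0.1}
    total = sum(mu.values())
    return sum(mu[k] * out[k] for k in mu) / total if total else 0.5

idle, busy = schedule_priority(0.1), schedule_priority(0.9)
```

Because membership grades overlap, the priority varies smoothly with load instead of jumping at hard thresholds, which is the usual argument for fuzzy schedulers over crisp rules.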

Rahayu, T.M., Sang-Gon Lee, Hoon-Jae Lee.  2014.  Security analysis of secure data aggregation protocols in wireless sensor networks. Advanced Communication Technology (ICACT), 2014 16th International Conference on. :471-474.

In order to conserve wireless sensor network (WSN) lifetime, data aggregation is applied. Some researchers consider the importance of security and propose secure data aggregation protocols. The essence of those secure approaches is to ensure that the aggregators aggregate the data in an appropriate and secure way. In this paper we give a description of the ESPDA (Energy-efficient and Secure Pattern-based Data Aggregation) and SRDA (Secure Reference-Based Data Aggregation) protocols that work on cluster-based WSNs, together with a deep security analysis that differs from those previously presented.

Rahayuda, I. G. S., Santiari, N. P. L..  2017.  Crawling and cluster hidden web using crawler framework and fuzzy-KNN. 2017 5th International Conference on Cyber and IT Service Management (CITSM). :1–7.
Today almost everyone uses the internet for daily activities, whether social, academic, work or business. But only a few of us are aware that we generally access only a small part of the internet. The internet, or the world wide web, is divided into several levels, such as the surface web, deep web and dark web. Accessing the deep or dark web is a dangerous thing. This research covers both surface web content and deep web content. For a faster and safer search, a crawler framework is used. From the search process, various kinds of data are obtained and stored in a database. A classification process is then applied to the database to determine the level of each website. The classification is done using the fuzzy-KNN method, which classifies the results of the crawler framework contained in the database. The crawler framework generates data in the form of URL addresses, page info and more. The crawled data are compared with predefined sample data, and the fuzzy-KNN classification yields the web level based on the value of the words specified in the sample data. From the research conducted on several test data sets, we found 20% surface web, 7.5% bergie web, 20% deep web, 22.5% charter web and 30% dark web. Since the research was only done on some test data, more data needs to be added in order to get better results. Better crawler frameworks can speed up crawling, especially at certain web levels, because not all crawler frameworks can work at a particular web level; the Tor browser can be used, but the crawler framework sometimes cannot work there.
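
The fuzzy-KNN classification step can be sketched as follows: each of the k nearest labeled samples votes with a weight inversely related to its distance, yielding a membership degree per web level rather than a hard label. The features and training samples below are made up purely for illustration:

```python
def fuzzy_knn(sample, training, k=3, m=2.0):
    """Fuzzy k-NN: distance-weighted memberships over class labels.
    `training` is a list of (feature_vector, label) pairs; m > 1 controls
    how strongly closer neighbors dominate."""
    dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    neighbors = sorted(training, key=lambda t: dist(sample, t[0]))[:k]
    weights = {}
    for vec, label in neighbors:
        w = 1.0 / (dist(sample, vec) ** (2 / (m - 1)) + 1e-9)
        weights[label] = weights.get(label, 0.0) + w
    total = sum(weights.values())
    return {label: w / total for label, w in weights.items()}

# Toy features, e.g. (onion_link_ratio, keyword_score) -- illustrative only
training = [((0.0, 0.1), "surface"), ((0.1, 0.2), "surface"),
            ((0.8, 0.9), "dark"), ((0.9, 0.8), "dark"), ((0.7, 0.95), "dark")]
memberships = fuzzy_knn((0.85, 0.9), training)
level = max(memberships, key=memberships.get)
```

The membership vector is what makes the method "fuzzy": a page near the boundary between two web levels gets partial membership in both, which a crisp KNN would discard.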