Biblio

Found 1683 results

Filters: First Letter Of Last Name is Z
2014-09-17
Tembe, Rucha, Zielinska, Olga, Liu, Yuqi, Hong, Kyung Wha, Murphy-Hill, Emerson, Mayhorn, Chris, Ge, Xi.  2014.  Phishing in International Waters: Exploring Cross-national Differences in Phishing Conceptualizations Between Chinese, Indian and American Samples. Proceedings of the 2014 Symposium and Bootcamp on the Science of Security. :8:1–8:7.

One hundred sixty-four participants from the United States, India and China completed a survey designed to assess past phishing experiences and whether they engaged in certain online safety practices (e.g., reading a privacy policy). The study investigated participants' reported agreement regarding the characteristics of phishing attacks, types of media where phishing occurs and the consequences of phishing. A multivariate analysis of covariance indicated that there were significant differences in agreement regarding phishing characteristics, phishing consequences and types of media where phishing occurs for these three nationalities. Chronological age and education did not influence the agreement ratings; therefore, the samples were demographically equivalent with regard to these variables. A logistic regression analysis was conducted to analyze the categorical variables and nationality data. Results based on self-report data indicated that (1) Indians were more likely to be phished than Americans, (2) Americans took protective actions more frequently than Indians by destroying old documents, and (3) Americans were more likely to notice the "padlock" security icon than either Indian or Chinese respondents. The potential implications of these results are discussed in terms of designing culturally sensitive anti-phishing solutions.

Layman, Lucas, Diffo, Sylvain David, Zazworka, Nico.  2014.  Human Factors in Webserver Log File Analysis: A Controlled Experiment on Investigating Malicious Activity. Proceedings of the 2014 Symposium and Bootcamp on the Science of Security. :9:1–9:11.

While automated methods are the first line of defense for detecting attacks on webservers, a human agent is required to understand the attacker's intent and the attack process. The goal of this research is to understand the value of various log fields and the cognitive processes by which log information is grouped, searched, and correlated. Such knowledge will enable the development of human-focused log file investigation technologies. We performed controlled experiments with 65 subjects (IT professionals and novices) who investigated excerpts from six webserver log files. Quantitative and qualitative data were gathered to: 1) analyze subject accuracy in identifying malicious activity; 2) identify the most useful pieces of log file information; and 3) understand the techniques and strategies used by subjects to process the information. Statistically significant effects were observed in the accuracy of identifying attacks and time taken depending on the type of attack. Systematic differences were also observed in the log fields used by high-performing and low-performing groups. The findings include: 1) new insights into how specific log data fields are used to effectively assess potentially malicious activity; 2) obfuscating factors in log data from a human cognitive perspective; and 3) practical implications for tools to support log file investigations.
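To make the log-field discussion concrete, here is a minimal sketch (not taken from the study's materials) that parses Apache Common Log Format lines and flags requests matching a few assumed attack signatures; the field layout is standard, but the signature list and sample lines are invented for illustration.

```python
import re

# Common Log Format: host ident user [time] "request" status size
LOG_PATTERN = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<request>[^"]*)" (?P<status>\d{3}) (?P<size>\S+)'
)

# Illustrative signatures of potentially malicious requests (assumed, not from the paper).
SUSPICIOUS = ["../", "union select", "/etc/passwd", "<script"]

def flag_suspicious(lines):
    """Return (host, request) pairs whose request field matches a signature."""
    hits = []
    for line in lines:
        m = LOG_PATTERN.match(line)
        if not m:
            continue  # skip lines that are not in Common Log Format
        request = m.group("request").lower()
        if any(sig in request for sig in SUSPICIOUS):
            hits.append((m.group("host"), m.group("request")))
    return hits

sample = [
    '10.0.0.5 - - [17/Sep/2014:10:00:00 +0000] "GET /index.html HTTP/1.1" 200 512',
    '10.0.0.9 - - [17/Sep/2014:10:00:02 +0000] "GET /../../etc/passwd HTTP/1.0" 404 0',
]
print(flag_suspicious(sample))
```

In the experiment's terms, the request field is one of several log fields a human investigator would correlate; a signature list like this only covers the automatable part of that judgment.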

Layman, Lucas, Zazworka, Nico.  2014.  InViz: Instant Visualization of Security Attacks. Proceedings of the 2014 Symposium and Bootcamp on the Science of Security. :15:1–15:2.

The InViz tool is a functional prototype that provides graphical visualizations of log file events to support real-time attack investigation. Through visualization, both experts and novices in cybersecurity can analyze patterns of application behavior and investigate potential cybersecurity attacks. The goal of this research is to identify and evaluate which cybersecurity information to visualize so as to reduce the time required to perform cyber forensics.

2015-03-03
Li, Bo, Vorobeychik, Yevgeniy.  2014.  Feature Cross-Substitution in Adversarial Classification. Advances in Neural Information Processing Systems 27. :2087–2095.

The success of machine learning, particularly in supervised settings, has led to numerous attempts to apply it in adversarial settings such as spam and malware detection. The core challenge in this class of applications is that adversaries are not static data generators, but make a deliberate effort to evade the classifiers deployed to detect them. We investigate both the problem of modeling the objectives of such adversaries, as well as the algorithmic problem of accounting for rational, objective-driven adversaries. In particular, we demonstrate severe shortcomings of feature reduction in adversarial settings using several natural adversarial objective functions, an observation that is particularly pronounced when the adversary is able to substitute across similar features (for example, replace words with synonyms or replace letters in words). We offer a simple heuristic method for making learning more robust to feature cross-substitution attacks. We then present a more general approach based on mixed-integer linear programming with constraint generation, which implicitly trades off overfitting and feature selection in an adversarial setting using a sparse regularizer along with an evasion model. Our approach is the first method for combining an adversarial classification algorithm with a very general class of models of adversarial classifier evasion. We show that our algorithmic approach significantly outperforms state-of-the-art alternatives.
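The cross-substitution idea can be illustrated with a toy bag-of-words spam filter: the adversary swaps high-weight words for assumed synonyms that the classifier weights differently, driving the score below the decision threshold. The weights and synonym table below are invented, not from the paper.

```python
# Toy linear bag-of-words classifier: score > 0 means "spam".
WEIGHTS = {"free": 2.0, "winner": 1.5, "meeting": -1.0, "gratis": 0.1, "champion": 0.1}

# Assumed synonym table the adversary can substitute across (cross-substitution).
SYNONYMS = {"free": "gratis", "winner": "champion"}

def score(words):
    """Linear classifier score: sum of per-word weights."""
    return sum(WEIGHTS.get(w, 0.0) for w in words)

def evade(words):
    """Greedily replace each word with a synonym whenever that lowers the score."""
    out = []
    for w in words:
        sub = SYNONYMS.get(w, w)
        out.append(sub if WEIGHTS.get(sub, 0.0) < WEIGHTS.get(w, 0.0) else w)
    return out

msg = ["free", "winner", "meeting"]
print(score(msg), score(evade(msg)))  # spam score before and after substitution
```

The paper's point is that aggressive feature reduction makes this kind of evasion easier, because fewer, coarser features leave more synonym mass outside the classifier's view.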

2015-04-30
Zhuoping Yu, Junxian Wu, Lu Xiong.  2014.  Research of stability control of distributed drive electric vehicles under motor failure modes. Transportation Electrification Asia-Pacific (ITEC Asia-Pacific), 2014 IEEE Conference and Expo. :1-5.

With the application and promotion of electric vehicles, vehicle security problems caused by actuator reliability have become increasingly prominent. First, this paper analyzes and summarizes the failure modes, and their effects, of the permanent magnet synchronous motor (PMSM), which is commonly used in electric vehicles. It then designs a hierarchical vehicle control strategy with corresponding algorithms, which adjust according to the different failure modes. Finally, simulations are conducted in the CarSim environment to verify that the control strategy and algorithms can maintain vehicle stability and reduce the burden on the driver under motor failure conditions.

Yufei Gu, Yangchun Fu, Prakash, A., Zhiqiang Lin, Heng Yin.  2014.  Multi-Aspect, Robust, and Memory Exclusive Guest OS Fingerprinting. Cloud Computing, IEEE Transactions on. 2:380-394.

Precise fingerprinting of an operating system (OS) is critical to many security and forensics applications in the cloud, such as virtual machine (VM) introspection, penetration testing, guest OS administration, kernel dump analysis, and memory forensics. The existing OS fingerprinting techniques primarily inspect network packets or CPU states, and they all fall short in precision and usability. As the physical memory of a VM always exists in all these applications, in this article, we present OS-SOMMELIER+, a multi-aspect, memory exclusive approach for precise and robust guest OS fingerprinting in the cloud. It works as follows: given a physical memory dump of a guest OS, OS-SOMMELIER+ first uses a code hash based approach from the kernel code aspect to determine the guest OS version. If the code hash approach fails, OS-SOMMELIER+ then uses a kernel data signature based approach from the kernel data aspect to determine the version. We have implemented a prototype system and tested it with a number of Linux kernels. Our evaluation results show that the code hash approach is faster but can only fingerprint known kernels, while the data signature approach complements the code hash approach and can fingerprint even unknown kernels.
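The code-hash step can be sketched roughly as follows: hash the kernel-code region extracted from a memory dump and look it up in a table of hashes precomputed from known kernels, falling back (where OS-SOMMELIER+ would apply its kernel-data signatures) when no match is found. The byte strings and table entries here are placeholders, not real kernel images.

```python
import hashlib

def code_hash(kernel_code: bytes) -> str:
    """Hash the (assumed already-extracted) kernel code region of a memory dump."""
    return hashlib.sha256(kernel_code).hexdigest()

# Illustrative table of hashes precomputed from known kernel builds (placeholders).
KNOWN = {
    code_hash(b"linux-3.2-kernel-code"): "Linux 3.2",
    code_hash(b"linux-3.10-kernel-code"): "Linux 3.10",
}

def fingerprint(dump_code: bytes) -> str:
    """Return the known kernel version, or fall back (here: 'unknown'),
    which is where a data-signature approach would take over."""
    return KNOWN.get(code_hash(dump_code), "unknown")

print(fingerprint(b"linux-3.10-kernel-code"))  # matches a known build
print(fingerprint(b"some-other-kernel"))       # would need the data-signature step
```

The real system must first locate and normalize the kernel code pages inside the raw dump, which is the hard part this sketch assumes away.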

Liu, Yuanyuan, Cheng, Jianping, Zhang, Li, Xing, Yuxiang, Chen, Zhiqiang, Zheng, Peng.  2014.  A low-cost dual energy CT system with sparse data. Tsinghua Science and Technology. 19:184-194.

Dual Energy CT (DECT) has recently gained significant research interest owing to its ability to discriminate materials, and hence is widely applied in the field of nuclear safety and security inspection. With the current technological developments, DECT can be typically realized by using two sets of detectors, one for detecting lower energy X-rays and another for detecting higher energy X-rays. This makes the imaging system expensive, limiting its practical implementation. In 2009, our group performed a preliminary study on a new low-cost system design, using only a complete data set for the lower energy level and a sparse data set for the higher energy level. This could significantly reduce the cost of the system, as it contains a much smaller number of detector elements. The reconstruction method is the key point of this system. In the present study, we further validated this system and proposed a robust method involving three main steps: (1) iteratively estimating the missing data with TV constraints; (2) using the reconstruction from the complete lower-energy CT data set to form an initial estimate of the projection data for the higher energy level; (3) using ordered views to accelerate the computation. Numerical simulations with different numbers of detector elements have also been examined. The results obtained in this study demonstrate that 1 + 14% CT data is sufficient to provide a rather good reconstruction of both the effective atomic number and electron density distributions of the scanned object, instead of two complete CT data sets.

Baofeng Wu, Qingfang Jin, Zhuojun Liu, Dongdai Lin.  2014.  Constructing Boolean functions with potentially optimal algebraic immunity based on additive decompositions of finite fields (extended abstract). Information Theory (ISIT), 2014 IEEE International Symposium on. :1361-1365.

We propose a general approach to construct cryptographically significant Boolean functions of (r + 1)m variables based on the additive decomposition F2rm × F2m of the finite field F2(r+1)m, where r ≥ 1 is odd and m ≥ 3. A class of unbalanced functions is constructed first via this approach, which coincides with a variant of the unbalanced class of generalized Tu-Deng functions in the case r = 1. Functions belonging to this class have high algebraic degree, but their algebraic immunity does not exceed m, which cannot be optimal when r > 1. By modifying these unbalanced functions, we obtain a class of balanced functions which have optimal algebraic degree and high nonlinearity (shown by a lower bound we prove). These functions have optimal algebraic immunity provided that a combinatorial conjecture on binary strings, which generalizes the Tu-Deng conjecture, is true. Computer investigations show that, at least for small numbers of variables, functions from this class also behave well against fast algebraic attacks.
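For background on the algebraic degree mentioned above: it is the degree of the function's algebraic normal form (ANF), computable from the truth table by the binary Möbius transform. The sketch below does this for an arbitrary 3-variable example unrelated to the constructed class.

```python
from itertools import product

def anf(truth_table):
    """Binary Möbius transform: truth table over n variables -> ANF coefficients.
    Bit i of an index selects variable x_i; a coefficient 1 at index m means the
    monomial over the variables in m appears in the ANF."""
    n = len(truth_table).bit_length() - 1
    coeffs = list(truth_table)
    for i in range(n):
        for m in range(1 << n):
            if m & (1 << i):
                coeffs[m] ^= coeffs[m ^ (1 << i)]
    return coeffs

def algebraic_degree(truth_table):
    """Largest number of variables in any monomial of the ANF."""
    coeffs = anf(truth_table)
    return max((bin(m).count("1") for m, c in enumerate(coeffs) if c), default=0)

# f(x0, x1, x2) = x0*x1 XOR x2 over all 8 inputs (an arbitrary example).
tt = [(x0 & x1) ^ x2 for x2, x1, x0 in product((0, 1), repeat=3)]
print(algebraic_degree(tt))  # degree of x0*x1 XOR x2 is 2
```

Algebraic immunity is a different (and costlier) computation, requiring the minimum degree over all annihilators of f and f XOR 1; this sketch covers only the degree.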

Yexing Li, Xinye Cai, Zhun Fan, Qingfu Zhang.  2014.  An external archive guided multiobjective evolutionary approach based on decomposition for continuous optimization. Evolutionary Computation (CEC), 2014 IEEE Congress on. :1124-1130.

In this paper, we propose a decomposition based multiobjective evolutionary algorithm that extracts information from an external archive to guide the evolutionary search for continuous optimization problems. The proposed algorithm uses a mechanism to identify the promising regions (subproblems) by learning information from the external archive to guide the evolutionary search process. To demonstrate the performance of the algorithm, we conduct experiments comparing it with other decomposition based approaches. The results validate that our proposed algorithm is very competitive.
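Decomposition-based algorithms of this kind scalarize the multiobjective problem into subproblems, commonly with the weighted Tchebycheff function; the sketch below (with invented archive points and weight vectors) shows how each weight vector selects a different trade-off point from an external archive.

```python
def tchebycheff(objectives, weights, ideal):
    """Weighted Tchebycheff scalarization: minimizing this for one weight
    vector defines one scalar subproblem of the multiobjective problem."""
    return max(w * abs(f - z) for f, w, z in zip(objectives, weights, ideal))

def best_for_subproblem(population, weights, ideal):
    """Pick the archive/population member that is best for one subproblem."""
    return min(population, key=lambda objs: tchebycheff(objs, weights, ideal))

# Illustrative bi-objective points (e.g., an external archive) and ideal point.
archive = [(1.0, 4.0), (2.0, 2.0), (4.0, 1.0)]
ideal = (0.0, 0.0)

# Each weight vector emphasizes a different region of the trade-off front.
print(best_for_subproblem(archive, (0.9, 0.1), ideal))
print(best_for_subproblem(archive, (0.5, 0.5), ideal))
```

How the archive is then mined to decide which subproblems are "promising" is the paper's contribution; the scalarization above is just the standard decomposition machinery it builds on.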

Foroushani, V.A., Zincir-Heywood, A.N..  2014.  TDFA: Traceback-Based Defense against DDoS Flooding Attacks. Advanced Information Networking and Applications (AINA), 2014 IEEE 28th International Conference on. :597-604.

Distributed Denial of Service (DDoS) attacks are one of the challenging network security problems to address. The existing defense mechanisms against DDoS attacks usually filter the attack traffic at the victim side. The problem is exacerbated when there are spoofed IP addresses in the attack packets. In this case, even if the attacking traffic can be filtered by the victim, the attacker may reach the goal of blocking access to the victim by consuming the computing resources or by consuming a large portion of the bandwidth to the victim. This paper proposes a Traceback-based Defense against DDoS Flooding Attacks (TDFA) approach to counter this problem. TDFA consists of three main components: Detection, Traceback, and Traffic Control. In this approach, the goal is to place the packet filtering as close to the attack source as possible. In doing so, the traffic control component at the victim side aims to set up a limit on the packet forwarding rate to the victim. This mechanism effectively reduces the rate of forwarding the attack packets and therefore improves the throughput of the legitimate traffic. Our results based on real world data sets show that TDFA is effective to reduce the attack traffic and to defend the quality of service for the legitimate traffic.
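The traffic control component's limit on the packet forwarding rate can be sketched with a standard per-source token bucket; the rate and burst values below are illustrative, not taken from TDFA.

```python
class TokenBucket:
    """Forward a packet only if a token is available; tokens refill at `rate`
    per second up to `burst`. One bucket per traced attack source."""
    def __init__(self, rate, burst):
        self.rate = rate
        self.burst = burst
        self.tokens = float(burst)
        self.last = 0.0

    def allow(self, now):
        # Refill tokens for the time elapsed since the last packet, then
        # forward the packet only if a whole token is available.
        self.tokens = min(self.burst, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False

# Limit a traced source to 2 packets/s with a burst of 2 (illustrative values).
bucket = TokenBucket(rate=2.0, burst=2)
arrivals = [0.0, 0.1, 0.2, 0.3, 1.5]          # packet arrival times in seconds
decisions = [bucket.allow(t) for t in arrivals]
print(decisions)
```

Placing such buckets at routers near the traced source, rather than at the victim, is what lets the legitimate traffic's share of the bottleneck recover.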

Chiang, R., Rajasekaran, S., Zhang, N., Huang, H..  2014.  Swiper: Exploiting Virtual Machine Vulnerability in Third-Party Clouds with Competition for I/O Resources. Parallel and Distributed Systems, IEEE Transactions on. PP:1-1.

The emerging paradigm of cloud computing, e.g., Amazon Elastic Compute Cloud (EC2), promises a highly flexible yet robust environment for large-scale applications. Ideally, while multiple virtual machines (VM) share the same physical resources (e.g., CPUs, caches, DRAM, and I/O devices), each application should be allocated to an independently managed VM and isolated from one another. Unfortunately, the absence of physical isolation inevitably opens doors to a number of security threats. In this paper, we demonstrate in EC2 a new type of security vulnerability caused by competition between virtual I/O workloads: by leveraging the competition for shared resources, an adversary could intentionally slow down the execution of a targeted application in a VM that shares the same hardware. In particular, we focus on I/O resources such as hard-drive throughput and/or network bandwidth, which are critical for data-intensive applications. We design and implement Swiper, a framework which uses a carefully designed workload to incur significant delays on the targeted application and VM with minimum cost (i.e., resource consumption). We conduct a comprehensive set of experiments in EC2, which clearly demonstrates that Swiper is capable of significantly slowing down various server applications while consuming a small amount of resources.

Gong Bei, Zhang Jianbiao, Ye Xiaolie, Shen Changxiang.  2014.  A trusted measurement scheme suitable for the clients in the trusted network. Communications, China. 11:143-153.

The trusted network connection is a hot spot in the trusted computing field, where trust measurement and access control technologies are used to deal with network security threats. However, the trusted network connection lacks fine-grained state and real-time measurement support for the client, and its authentication mechanism is difficult to apply, which easily causes the loss of identity privacy. To solve these problems, this paper presents a trust measurement scheme suitable for clients in the trusted network. The scheme integrates an authentication mechanism, initial state measurement, and real-time state measurement; building on the authentication mechanism and the initial state measurement, it uses real-time state measurement as the core method to complete the trust measurement for the client. The scheme supports both static and dynamic measurements. Overall, its fine granularity and dynamic, real-time state measurement make it possible to enforce more fine-grained security policies, thereby overcoming inadequacies in the current trusted network connection.

Zhuo Lu, Wenye Wang, Wang, C..  2015.  Camouflage Traffic: Minimizing Message Delay for Smart Grid Applications under Jamming. Dependable and Secure Computing, IEEE Transactions on. 12:31-44.

Smart grid is a cyber-physical system that integrates power infrastructures with information technologies. To facilitate efficient information exchange, wireless networks have been proposed to be widely used in the smart grid. However, the jamming attack that constantly broadcasts radio interference is a primary security threat to prevent the deployment of wireless networks in the smart grid. Hence, spread spectrum systems, which provide jamming resilience via multiple frequency and code channels, must be adapted to the smart grid for secure wireless communications, while at the same time providing latency guarantee for control messages. An open question is how to minimize message delay for timely smart grid communication under any potential jamming attack. To address this issue, we provide a paradigm shift from the case-by-case methodology, which is widely used in existing works to investigate well-adopted attack models, to the worst-case methodology, which offers delay performance guarantee for smart grid applications under any attack. We first define a generic jamming process that characterizes a wide range of existing attack models. Then, we show that in all strategies under the generic process, the worst-case message delay is a U-shaped function of network traffic load. This indicates that, interestingly, increasing a fair amount of traffic can in fact improve the worst-case delay performance. As a result, we demonstrate a lightweight yet promising system, transmitting adaptive camouflage traffic (TACT), to combat jamming attacks. TACT minimizes the message delay by generating extra traffic called camouflage to balance the network load at the optimum. Experiments show that TACT can decrease the probability that a message is not delivered on time by an order of magnitude.

Zheng, J.X., Dongfang Li, Potkonjak, M..  2014.  A secure and unclonable embedded system using instruction-level PUF authentication. Field Programmable Logic and Applications (FPL), 2014 24th International Conference on. :1-4.

In this paper we present a secure and unclonable embedded system design that can target either an FPGA or an ASIC technology. The premise of the security is that the executed machine code and the executing environment (the embedded processor) will authenticate each other at a per-instruction basis using Physical Unclonable Functions (PUFs) that are built into the processor. The PUFs ensure that the execution of the binary code may only proceed if the binary is compiled with the correct intrinsic knowledge of the PUFs, and that such intrinsic knowledge is virtually unique to each processor and therefore unclonable. We will explain how to implement and integrate the PUFs into the processor's execution environment such that each instruction is authenticated and de-obfuscated on-demand and how to transform an ordinary binary executable into PUF-aware, obfuscated binaries. We will also present a prototype system on a Xilinx Spartan6-based FPGA board.
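A toy software model of the per-instruction scheme: each instruction word is XOR-obfuscated at compile time with a per-address response, and only a processor producing the same responses can de-obfuscate it. Here the PUF is simulated with a keyed hash; a real PUF derives responses from device physics and cannot be reproduced in software, which is precisely what makes the binary unclonable.

```python
import hashlib

def puf_response(secret: bytes, address: int) -> int:
    """Simulated PUF: per-address 32-bit response. A real PUF derives this from
    physical device variation rather than a stored secret."""
    digest = hashlib.sha256(secret + address.to_bytes(4, "big")).digest()
    return int.from_bytes(digest[:4], "big")

def obfuscate(program, secret):
    """Compile-time step: XOR each 32-bit instruction with its address's response."""
    return [insn ^ puf_response(secret, addr) for addr, insn in enumerate(program)]

def fetch_and_deobfuscate(binary, secret, addr):
    """Per-instruction step on the processor holding the matching PUF."""
    return binary[addr] ^ puf_response(secret, addr)

program = [0xE3A00001, 0xE2800001, 0xE12FFF1E]   # arbitrary 32-bit opcodes
binary = obfuscate(program, secret=b"device-A")

recovered = [fetch_and_deobfuscate(binary, b"device-A", a) for a in range(len(binary))]
wrong_dev = [fetch_and_deobfuscate(binary, b"device-B", a) for a in range(len(binary))]
print(recovered == program, wrong_dev == program)
```

A binary compiled for device A decodes to garbage on device B, so execution fails unless code and processor authenticate each other instruction by instruction, as the paper describes.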

Wei, Lifei, Zhu, Haojin, Cao, Zhenfu, Dong, Xiaolei, Jia, Weiwei, Chen, Yunlu, Vasilakos, Athanasios V..  2014.  Security and Privacy for Storage and Computation in Cloud Computing. Inf. Sci.. 258:371–386.

Cloud computing emerges as a new computing paradigm that aims to provide reliable, customized and quality of service guaranteed computation environments for cloud users. Applications and databases are moved to the large centralized data centers, called the cloud. Due to resource virtualization, global replication and migration, and the physical absence of data and machines in the cloud, the stored data in the cloud and the computation results may not be well managed and fully trusted by the cloud users. Most of the previous work on cloud security focuses on storage security rather than also taking computation security into consideration. In this paper, we propose a privacy cheating discouragement and secure computation auditing protocol, SecCloud, which is the first protocol to bridge secure storage and secure computation auditing in the cloud, achieving privacy cheating discouragement by designated verifier signature, batch verification and probabilistic sampling techniques. A detailed analysis is given to obtain an optimal sampling size to minimize the cost. Another major contribution of this paper is that we build a practical security-aware cloud computing experimental environment, SecHDFS, as a test bed to implement SecCloud. Further experimental results have demonstrated the effectiveness and efficiency of the proposed SecCloud.

2015-05-01
Xuezhong Guan, Jinlong Liu, Zhe Gao, Di Yu, Miao Cai.  2014.  Power grids vulnerability analysis based on combination of degree and betweenness. Control and Decision Conference (2014 CCDC), The 26th Chinese. :4829-4833.

This paper proposes a method for analyzing power grid vulnerability based on complex networks. The method effectively combines the degree and betweenness of nodes or lines into a new index. Through the combination of the two indexes, the new index can help to analyze the vulnerability of power grids. Attacking the lines ranked by the new index yields a smaller size of the largest cluster and lower global efficiency than attacking by the pure degree index or betweenness index. Finally, fault simulation results on the IEEE 118 bus system show that the new index reveals the vulnerability of power grids more effectively.
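A small sketch of combining the two indexes for nodes of an unweighted graph; the abstract does not give the paper's exact combination formula, so a weighted sum of normalized degree and betweenness is assumed here, with betweenness computed by Brandes' algorithm.

```python
from collections import deque

def betweenness(adj):
    """Brandes' algorithm for node betweenness on an unweighted graph.
    Each ordered source/target pair is counted, so undirected pairs count twice."""
    bc = {v: 0.0 for v in adj}
    for s in adj:
        stack, pred = [], {v: [] for v in adj}
        sigma = {v: 0 for v in adj}; sigma[s] = 1   # shortest-path counts
        dist = {v: -1 for v in adj}; dist[s] = 0
        q = deque([s])
        while q:                                     # BFS from s
            v = q.popleft(); stack.append(v)
            for w in adj[v]:
                if dist[w] < 0:
                    dist[w] = dist[v] + 1; q.append(w)
                if dist[w] == dist[v] + 1:
                    sigma[w] += sigma[v]; pred[w].append(v)
        delta = {v: 0.0 for v in adj}
        while stack:                                 # back-propagate dependencies
            w = stack.pop()
            for v in pred[w]:
                delta[v] += sigma[v] / sigma[w] * (1 + delta[w])
            if w != s:
                bc[w] += delta[w]
    return bc

def combined_index(adj, alpha=0.5):
    """Assumed combination: weighted sum of normalized degree and betweenness."""
    bc = betweenness(adj)
    deg = {v: len(adj[v]) for v in adj}
    max_d = max(deg.values()) or 1
    max_b = max(bc.values()) or 1
    return {v: alpha * deg[v] / max_d + (1 - alpha) * bc[v] / max_b for v in adj}

# Tiny illustrative network: node "c" bridges the two sides.
adj = {"a": ["c"], "b": ["c"], "c": ["a", "b", "d"], "d": ["c", "e"], "e": ["d"]}
idx = combined_index(adj)
ranking = sorted(adj, key=idx.get, reverse=True)
print(ranking[0])  # the most critical node under the combined index
```

For line (edge) rankings as in the paper, the same idea applies with edge betweenness and an edge-degree surrogate such as the endpoint degree sum.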

Zhe Gao, Xiaowu Cai, Chuan Lv, Chao Liang.  2014.  Analysis on vulnerability of power grid based on electrical betweenness with information entropy. Control Conference (CCC), 2014 33rd Chinese. :2727-2731.

This paper investigates the vulnerability of power grids based on complex networks combined with information entropy. The difference in current directions over a link is considered and is characterized by the information entropy. By incorporating the information entropy, the electrical betweenness is improved to evaluate the vulnerability of power grids. Attacking the link with the largest electrical betweenness with the information entropy yields a larger size of the largest cluster and a lower loss of load, compared with the electrical betweenness without the information entropy. Finally, the IEEE 118 bus system is tested to validate the effectiveness of the novel index in characterizing the vulnerability of power grids.

Zonouz, S., Davis, C.M., Davis, K.R., Berthier, R., Bobba, R.B., Sanders, W.H..  2014.  SOCCA: A Security-Oriented Cyber-Physical Contingency Analysis in Power Infrastructures. Smart Grid, IEEE Transactions on. 5:3-13.

Contingency analysis is a critical activity in the context of the power infrastructure because it provides a guide for resiliency and enables the grid to continue operating even in the case of failure. In this paper, we augment this concept by introducing SOCCA, a cyber-physical security evaluation technique to plan not only for accidental contingencies but also for malicious compromises. SOCCA presents a new unified formalism to model the cyber-physical system including interconnections among cyber and physical components. The cyber-physical contingency ranking technique employed by SOCCA assesses the potential impacts of events. Contingencies are ranked according to their impact as well as attack complexity. The results are valuable in both cyber and physical domains. From a physical perspective, SOCCA scores power system contingencies based on cyber network configuration, whereas from a cyber perspective, control network vulnerabilities are ranked according to the underlying power system topology.

Xiang, Yingmeng, Zhang, Yichi, Wang, Lingfeng, Sun, Weiqing.  2014.  Impact of UPFC on power system reliability considering its cyber vulnerability. T D Conference and Exposition, 2014 IEEE PES. :1-5.

The unified power flow controller (UPFC) has attracted much attention recently because of its capability in controlling the active and reactive power flows. The normal operation of UPFC is dependent on both its physical part and the associated cyber system. Thus malicious cyber attacks may impact the reliability of UPFC. As more information and communication technologies are being integrated into the current power grid, more frequent occurrences of cyber attacks are possible. In this paper, the cyber architecture of UPFC is analyzed, and the possible attack scenarios are considered and discussed. Based on the interdependency of the physical part and the cyber part, an integrated reliability model for UPFC is proposed and analyzed. The impact of UPFC on the overall system reliability is examined, and it is shown that cyber attacks against UPFC may adversely influence it.

Chen, K.Y., Heckel-Jones, C.A.C., Maupin, N.G., Rubin, S.M., Bogdanor, J.M., Zhenyu Guo, Haimes, Y.Y..  2014.  Risk analysis of GPS-dependent critical infrastructure system of systems. Systems and Information Engineering Design Symposium (SIEDS), 2014. :316-321.

The Department of Energy seeks to modernize the U.S. electric grid through the SmartGrid initiative, which includes the use of Global Positioning System (GPS)-timing dependent electric phasor measurement units (PMUs) for continual monitoring and automated controls. The U.S. Department of Homeland Security is concerned with the associated risks of increased utilization of GPS timing in the electricity subsector, which could in turn affect a large number of electricity-dependent Critical Infrastructure (CI) sectors. Exploiting the vulnerabilities of GPS systems in the electricity subsector can result in large-scale and costly blackouts. This paper seeks to analyze the risks of the electric grid's increased dependence on GPS through the introduction of PMUs and provides a systems engineering perspective on the GPS-dependent System of Systems (S-o-S) created by the SmartGrid initiative. The team started by defining and modeling the S-o-S, followed by the use of a risk analysis methodology to identify and measure risks and evaluate solutions for mitigating their effects. The team expects that the designs and models resulting from the study will prove useful in terms of determining both current and future risks to GPS-dependent CI sectors along with the appropriate countermeasures as the United States moves towards a SmartGrid system.

Bo Chai, Zaiyue Yang, Jiming Chen.  2014.  Impacts of unreliable communication and regret matching based anti-jamming approach in smart grid. Innovative Smart Grid Technologies Conference (ISGT), 2014 IEEE PES. :1-5.

Demand response management (DRM) is one of the main features in smart grid, which is realized via communications between power providers and consumers. Due to the vulnerabilities of communication channels, communication is not perfect in practice and will be threatened by jamming attack. In this paper, we consider jamming attack in the wireless communication for smart grid. Firstly, the DRM performance degradation introduced by unreliable communication is fully studied. Secondly, a regret matching based anti-jamming algorithm is proposed to enhance the performance of communication and DRM. Finally, numerical results are presented to illustrate the impacts of unreliable communication on DRM and the performance of the proposed anti-jamming algorithm.
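Regret matching itself, independent of the smart-grid setting, can be sketched as follows: the transmitter tracks cumulative regret for each channel and plays channels with probability proportional to positive regret. The payoff function below, which simulates a jammer parked on channel 0, is purely illustrative and not the paper's model.

```python
import random

def regret_matching(payoff, n_actions, rounds, seed=0):
    """Play for `rounds` steps; payoff(action, t) gives the utility an action
    earns (and, counterfactually, what any other action would have earned)."""
    rng = random.Random(seed)
    regret = [0.0] * n_actions
    counts = [0] * n_actions
    for t in range(rounds):
        positive = [max(r, 0.0) for r in regret]
        total = sum(positive)
        if total > 0:
            # Sample an action with probability proportional to positive regret.
            x, a = rng.random() * total, 0
            while x > positive[a]:
                x -= positive[a]; a += 1
        else:
            a = rng.randrange(n_actions)  # no regret yet: explore uniformly
        counts[a] += 1
        earned = payoff(a, t)
        for b in range(n_actions):
            regret[b] += payoff(b, t) - earned  # update counterfactual regret
    return counts

# Jammer always blocks channel 0 (illustrative): channel 0 pays 0, channel 1 pays 1.
counts = regret_matching(lambda a, t: 0.0 if a == 0 else 1.0, n_actions=2, rounds=500)
print(counts)  # play concentrates on the unjammed channel
```

Against an adaptive rather than static jammer, the same update rule still guarantees vanishing average regret, which is why regret matching is a natural fit for anti-jamming.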

Zahid, A., Masood, R., Shibli, M.A..  2014.  Security of sharded NoSQL databases: A comparative analysis. Information Assurance and Cyber Security (CIACS), 2014 Conference on. :1-8.

NoSQL databases are easy to scale out because of their flexible schema and support for BASE (Basically Available, Soft State and Eventually Consistent) properties. The process of scaling out in most of these databases is supported by sharding, which is considered the key feature in providing faster reads and writes to the database. However, securing the data sharded over various servers is a challenging problem, because the data are distributedly processed and transmitted over an unsecured network. Although extensive research has been performed on NoSQL sharding mechanisms, no specific criterion has been defined to analyze the security of the sharded architecture. This paper proposes an assessment criterion comprising various security features for the analysis of sharded NoSQL databases. It presents a detailed view of the security features offered by NoSQL databases and analyzes them with respect to the proposed assessment criteria. The presented analysis helps various organizations in the selection of an appropriate and reliable database in accordance with their preferences and security requirements.

Kun Wen, Jiahai Yang, Fengjuan Cheng, Chenxi Li, Ziyu Wang, Hui Yin.  2014.  Two-stage detection algorithm for RoQ attack based on localized periodicity analysis of traffic anomaly. Computer Communication and Networks (ICCCN), 2014 23rd International Conference on. :1-6.

Reduction of Quality (RoQ) attack is a stealthy denial of service attack. It can decrease or inhibit normal TCP flows in the network. Victims can hardly perceive it, because the overall network throughput decreases rather than increases during the attack. The attack is therefore strongly hidden and difficult for existing detection systems to detect. Based on the principle of time-frequency analysis, we propose a two-stage detection algorithm which combines anomaly detection with misuse detection. In the first stage, we try to detect the potential anomaly by analyzing network traffic through the wavelet multiresolution analysis method. According to different time-domain characteristics, we locate the abrupt change points. In the second stage, we further analyze the local traffic around each abrupt change point and extract the potential attack characteristics by autocorrelation analysis. Through this two-stage detection, we can ultimately confirm whether the network is affected by the attack. Results of simulations and real network experiments demonstrate that our algorithm can detect RoQ attacks with high accuracy and efficiency.
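The second-stage periodicity check can be illustrated with a plain autocorrelation of the local traffic series: pulses repeating every T intervals produce an autocorrelation peak at lag T. The synthetic traffic below is invented; the paper's first stage additionally uses wavelet multiresolution analysis to locate the change points.

```python
def autocorrelation(series, lag):
    """Normalized autocorrelation of a series at a given lag."""
    n = len(series)
    mean = sum(series) / n
    var = sum((x - mean) ** 2 for x in series)
    cov = sum((series[t] - mean) * (series[t + lag] - mean) for t in range(n - lag))
    return cov / var

def dominant_period(series, max_lag):
    """Lag (>= 1) with the highest autocorrelation: a strong peak suggests
    periodic pulses such as an RoQ attack's bursts."""
    return max(range(1, max_lag + 1), key=lambda lag: autocorrelation(series, lag))

# Synthetic local traffic: baseline 10 pkts/interval with a burst of 50 every 5 intervals.
traffic = [50.0 if t % 5 == 0 else 10.0 for t in range(60)]
print(dominant_period(traffic, max_lag=10))
```

On genuinely aperiodic traffic the autocorrelation stays low at all lags, which is what lets this stage separate RoQ pulses from ordinary congestion around a change point.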

Ding, Shuai, Yang, Shanlin, Zhang, Youtao, Liang, Changyong, Xia, Chenyi.  2014.  Combining QoS Prediction and Customer Satisfaction Estimation to Solve Cloud Service Trustworthiness Evaluation Problems. Know.-Based Syst.. 56:216–225.

The collection and combination of assessment data in trustworthiness evaluation of cloud service is challenging, notably because QoS values may be missing in offline evaluation situations due to the time-consuming and costly cloud service invocation. Considering the fact that many trustworthiness evaluation problems require not only objective measurement but also subjective perception, this paper designs a novel framework named CSTrust for conducting cloud service trustworthiness evaluation by combining QoS prediction and customer satisfaction estimation. The proposed framework considers how to improve the accuracy of QoS value prediction on quantitative trustworthy attributes, as well as how to estimate the customer satisfaction of the target cloud service by taking advantage of the perception ratings on qualitative attributes. The proposed methods are validated through simulations, demonstrating that CSTrust can effectively predict assessment data and produce evaluation results of trustworthiness.