Biblio

1994
Amoroso, E., Merritt, M..  1994.  Composing system integrity using I/O automata. Tenth Annual Computer Security Applications Conference. :34–43.
The I/O automata model of Lynch and Tuttle (1987) is summarized and used to formalize several types of system integrity based on the control of transitions to invalid states. Type-A integrity is exhibited by systems with no invalid initial states and that disallow transitions from valid reachable to invalid states. Type-B integrity is exhibited by systems that disallow externally-controlled transitions from valid reachable to invalid states. Type-C integrity is exhibited by systems that allow locally-controlled or externally-controlled transitions from reachable to invalid states. Strict-B integrity is exhibited by systems that are Type-B but not Type-A. Strict-C integrity is exhibited by systems that are Type-C but not Type-B. Basic results on the closure properties that hold under composition of systems exhibiting these types of integrity are presented in I/O-automata-theoretic terms. Specifically, Type-A, Type-B, and Type-C integrity are shown to be composable, whereas Strict-B and Strict-C integrity are shown not to be composable in general. The integrity definitions and compositional results are illustrated using the familiar vending machine example, specified as an I/O automaton and composed with a customer environment. The implications of the integrity definitions and compositional results for practical system design are discussed, and a research plan for future work is outlined.
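As a toy illustration of these integrity types (not the authors' I/O-automata formalism), the following Python sketch classifies a small labeled transition system, with transitions marked as locally or externally controlled, according to whether it satisfies Type-A, Type-B, or only Type-C integrity. All state and transition names are hypothetical.

```python
# Toy transition system: transitions are (source, target, controller),
# with controller in {"local", "external"}.  Illustrative sketch only.

def reachable(initial, transitions):
    """All states reachable from the initial states."""
    seen, frontier = set(initial), list(initial)
    while frontier:
        s = frontier.pop()
        for (src, dst, _) in transitions:
            if src == s and dst not in seen:
                seen.add(dst)
                frontier.append(dst)
    return seen

def integrity_type(initial, valid, transitions):
    reach = reachable(initial, transitions)
    bad_local = any(src in reach and src in valid and dst not in valid and ctl == "local"
                    for (src, dst, ctl) in transitions)
    bad_ext = any(src in reach and src in valid and dst not in valid and ctl == "external"
                  for (src, dst, ctl) in transitions)
    if all(s in valid for s in initial) and not bad_local and not bad_ext:
        return "Type-A"
    if not bad_ext:
        return "Type-B"   # may still allow locally-controlled corruption
    return "Type-C"       # allows some transition into an invalid state

# Hypothetical vending-machine-style example.
initial = {"idle"}
valid = {"idle", "coin_inserted", "dispensing"}
transitions = [
    ("idle", "coin_inserted", "external"),
    ("coin_inserted", "dispensing", "local"),
    ("dispensing", "jammed", "local"),   # locally-controlled transition to an invalid state
]
print(integrity_type(initial, valid, transitions))  # -> "Type-B"
```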
2006
Sekine, Junko, Campos-Náñez, Enrique, Harrald, John R., Abeledo, Hernán.  2006.  A Simulation-Based Approach to Trade-off Analysis of Port Security. Proceedings of the 38th Conference on Winter Simulation. :521–528.

Motivated by the September 11 attacks, we address the problem of policy analysis of supply-chain security. The potential economic and operational impacts of inspection, together with the inherent difficulty of assigning a reasonable cost to an inspection failure, call for a policy analysis methodology in which stakeholders can understand the trade-offs between diverse and potentially conflicting objectives. To obtain this information, we use a simulation-based methodology to characterize the set of Pareto optimal solutions with respect to the multiple objectives represented in the decision problem. Our methodology relies on simulation and the response surface method (RSM) to model the relationships between inspection policies and relevant stakeholder objectives in order to construct a set of Pareto optimal solutions. The approach is illustrated with an application to a real-world supply chain.
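To make the Pareto-front idea concrete, here is a minimal Python sketch (not the authors' RSM-based methodology): simulated inspection policies, each scored on hypothetical "delay cost" and "missed-threat risk" objectives, are filtered down to the non-dominated set. The simulation model and its coefficients are made up.

```python
import random

def simulate_policy(inspection_rate):
    """Hypothetical stand-in for a port-security simulation run: higher
    inspection rates raise delay cost but lower missed-threat risk."""
    delay_cost = 10.0 * inspection_rate + random.uniform(0, 1)
    missed_risk = 5.0 * (1.0 - inspection_rate) + random.uniform(0, 1)
    return delay_cost, missed_risk

def pareto_front(points):
    """Keep points not dominated in both (minimized) objectives."""
    front = []
    for i, (c_i, r_i) in enumerate(points):
        dominated = any(
            c_j <= c_i and r_j <= r_i and (c_j < c_i or r_j < r_i)
            for j, (c_j, r_j) in enumerate(points) if j != i
        )
        if not dominated:
            front.append((c_i, r_i))
    return front

random.seed(0)
policies = [simulate_policy(rate / 10) for rate in range(11)]
for cost, risk in sorted(pareto_front(policies)):
    print(f"delay cost {cost:5.2f}  missed-threat risk {risk:5.2f}")
```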

2009
Halderman, J. Alex, Schoen, Seth D., Heninger, Nadia, Clarkson, William, Paul, William, Calandrino, Joseph A., Feldman, Ariel J., Appelbaum, Jacob, Felten, Edward W..  2009.  Lest We Remember: Cold-boot Attacks on Encryption Keys. Commun. ACM. 52:91–98.

Contrary to widespread assumption, dynamic RAM (DRAM), the main memory in most modern computers, retains its contents for several seconds after power is lost, even at room temperature and even if removed from a motherboard. Although DRAM becomes less reliable when it is not refreshed, it is not immediately erased, and its contents persist sufficiently for malicious (or forensic) acquisition of usable full-system memory images. We show that this phenomenon limits the ability of an operating system to protect cryptographic key material from an attacker with physical access to a machine. It poses a particular threat to laptop users who rely on disk encryption: we demonstrate that it could be used to compromise several popular disk encryption products without the need for any special devices or materials. We experimentally characterize the extent and predictability of memory retention and report that remanence times can be increased dramatically with simple cooling techniques. We offer new algorithms for finding cryptographic keys in memory images and for correcting errors caused by bit decay. Though we discuss several strategies for mitigating these risks, we know of no simple remedy that would eliminate them.
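The paper's key-finding algorithms exploit the redundancy of in-memory key schedules and tolerate bit decay; as a far simpler stand-in, the Python sketch below merely flags high-entropy windows in a memory dump as candidate key material. It is an illustrative heuristic only, not the authors' algorithm, and the dump file name is hypothetical.

```python
import math
from collections import Counter

def shannon_entropy(window: bytes) -> float:
    """Byte-level Shannon entropy of a window, in bits per byte."""
    counts = Counter(window)
    total = len(window)
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

def candidate_key_offsets(image: bytes, window=32, threshold=4.5):
    """Offsets of windows whose entropy suggests random key material.
    Real key recovery (as in the paper) instead checks key-schedule
    structure; this is only a rough first-pass filter."""
    return [
        off for off in range(0, len(image) - window, 16)
        if shannon_entropy(image[off:off + window]) > threshold
    ]

if __name__ == "__main__":
    with open("memory.dump", "rb") as f:   # hypothetical dump file
        image = f.read()
    for off in candidate_key_offsets(image):
        print(f"possible key material at offset {off:#x}")
```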

2010
Schwartz, E.J., Avgerinos, T., Brumley, D..  2010.  All You Ever Wanted to Know about Dynamic Taint Analysis and Forward Symbolic Execution (but Might Have Been Afraid to Ask). 2010 IEEE Symposium on Security and Privacy (SP). :317-331.

Dynamic taint analysis and forward symbolic execution are quickly becoming staple techniques in security analyses. Example applications of dynamic taint analysis and forward symbolic execution include malware analysis, input filter generation, test case generation, and vulnerability discovery. Despite the widespread usage of these two techniques, there has been little effort to formally define the algorithms and summarize the critical issues that arise when these techniques are used in typical security contexts. The contributions of this paper are two-fold. First, we precisely describe the algorithms for dynamic taint analysis and forward symbolic execution as extensions to the run-time semantics of a general language. Second, we highlight important implementation choices, common pitfalls, and considerations when using these techniques in a security context.
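As a minimal illustration of the taint-propagation idea (a toy, not the formal operational semantics the paper develops), the Python sketch below interprets straight-line assignments over a tiny three-address language and marks a variable tainted whenever any operand of its defining expression is tainted. The program, variable names, and taint policy are all hypothetical.

```python
# Toy dynamic taint analysis over straight-line three-address code.
# Each statement is (dest, op, arg1, arg2); args are variable names or ints.

def run_with_taint(program, inputs, tainted_inputs):
    env = dict(inputs)                      # concrete values
    taint = {v: (v in tainted_inputs) for v in inputs}

    def value(a):
        return env[a] if isinstance(a, str) else a

    def is_tainted(a):
        return taint.get(a, False) if isinstance(a, str) else False

    for dest, op, a, b in program:
        if op == "add":
            env[dest] = value(a) + value(b)
        elif op == "mul":
            env[dest] = value(a) * value(b)
        elif op == "const":
            env[dest] = value(a)
        # Taint policy: output is tainted iff any operand is tainted.
        taint[dest] = is_tainted(a) or is_tainted(b)
    return env, taint

program = [
    ("t1", "add", "user_input", 1),
    ("t2", "mul", "t1", "config"),
    ("t3", "const", 42, None),
]
env, taint = run_with_taint(program,
                            inputs={"user_input": 7, "config": 3},
                            tainted_inputs={"user_input"})
print(env)    # {'user_input': 7, 'config': 3, 't1': 8, 't2': 24, 't3': 42}
print(taint)  # t1 and t2 tainted; config and t3 untainted
```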

2011
Tootaghaj, Diman Zad, Farhat, Farshid, Pakravan, Mohammad-Reza, Aref, Mohammad-Reza.  2011.  Game-theoretic approach to mitigate packet dropping in wireless Ad-hoc networks. 2011 IEEE Consumer Communications and Networking Conference (CCNC). :163–165.
Routing performance is severely degraded when misbehaving nodes drop packets instead of properly forwarding them. In this paper, we propose a Game-Theoretic Adaptive Multipath Routing (GTAMR) protocol to detect and punish selfish or malicious nodes that try to drop information packets in the routing phase, and to defend against collaborative attacks in which nodes try to disrupt communication or save their power. Our proposed algorithm outperforms previous schemes because it is resilient against attacks in which more than one node coordinates its misbehavior, and it can be used in networks in which wireless nodes use directional antennas. We then propose a game-theoretic strategy, ERTFT, for nodes to promote cooperation. In comparison with other proposed TFT-like strategies, ours is resilient to systematic errors in the detection of selfish nodes and does not lead to unending death spirals.
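The ERTFT strategy itself is defined in the paper; purely to illustrate how a tit-for-tat-style forwarding strategy can tolerate noisy misbehavior detection, here is a hypothetical Python sketch of two neighboring nodes that forward or drop packets over repeated rounds, with a small forgiveness probability so that detection errors do not trigger endless retaliation. The error and forgiveness rates are made up.

```python
import random

def noisy_observation(action, error_rate=0.1):
    """The neighbor's action is misread with some probability (imperfect watchdog)."""
    return action if random.random() > error_rate else not action

def generous_tit_for_tat(last_observed_cooperation, forgiveness=0.2):
    """Cooperate if the neighbor seemed to cooperate; otherwise occasionally forgive."""
    return True if last_observed_cooperation else (random.random() < forgiveness)

def simulate(rounds=1000, error_rate=0.1):
    random.seed(1)
    a_coop, b_coop = True, True          # both start by forwarding
    forwarded = 0
    for _ in range(rounds):
        a_next = generous_tit_for_tat(noisy_observation(b_coop, error_rate))
        b_next = generous_tit_for_tat(noisy_observation(a_coop, error_rate))
        a_coop, b_coop = a_next, b_next
        forwarded += int(a_coop) + int(b_coop)
    return forwarded / (2 * rounds)

print(f"fraction of packets forwarded: {simulate():.2f}")
```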
Alperovitch, Dmitri.  2011.  Towards establishment of cyberspace deterrence strategy. 2011 3rd International Conference on Cyber Conflict. :1–8.
The question of whether strategic deterrence in cyberspace is achievable given the challenges of detection, attribution and credible retaliation is a topic of contention among military and civilian defense strategists. This paper examines traditional strategic deterrence theory and its application to deterrence in cyberspace (the newly defined 5th battlespace domain, following the land, air, sea and space domains), which is being used increasingly by nation-states and their proxies to achieve information dominance and to gain tactical and strategic economic and military advantage. It presents a taxonomy of cyberattacks that identifies which types of threats in the confidentiality, integrity, availability (CIA) cybersecurity triad present the greatest risk to nation-state economic and military security, including their political and social facets. The argument is presented that attacks on confidentiality cannot be subject to deterrence in the current international legal framework and that the focus of strategy needs to be applied to integrity and availability attacks. A potential cyberdeterrence strategy is put forth that can enhance national security against devastating cyberattacks through a credible declaratory retaliation capability that establishes red lines which may trigger a counter-strike against all identifiable responsible parties. The author believes such a strategy can credibly dissuade nation-state threat actors, who themselves exhibit serious vulnerabilities to cyber attacks, from launching a devastating cyber first strike.
Dogrul, Murat, Aslan, Adil, Celik, Eyyup.  2011.  Developing an international cooperation on cyber defense and deterrence against Cyber terrorism. 2011 3rd International Conference on Cyber Conflict. :1–15.
Information Technology (IT) security is a growing concern for governments around the world. Cyber terrorism poses a direct threat to the security of nations' critical infrastructures and IT systems as a low-cost asymmetric warfare element. Most of these nations are aware of the vulnerability of information technologies and of the significance of protecting critical infrastructures. To counteract the threat of potentially disastrous cyber attacks, nations' policy makers are increasingly pondering the use of deterrence strategies to supplement cyber defense. Nations create their own national policies and strategies covering cyber security countermeasures, including cyber defense and deterrence against cyber threats. But it is rather hard to cope with the threat by means of merely `national' cyber defense policies and strategies, since cyberspace spans the globe and an attack's origin can be overseas. The term "cyber terrorism" is another source of controversy: an agreement on a common definition of cyber terrorism among nations is needed, yet the international community has not succeeded in developing a commonly accepted comprehensive definition even of "terrorism" itself. This paper evaluates the importance of building international cooperation on cyber defense and deterrence against cyber terrorism. It aims to improve and extend existing definitions of cyber terrorism, and discusses the attractiveness of cyber attacks for terrorists and past experiences of cyber terrorism. It emphasizes establishing international legal measures and cooperation between nations against cyber terrorism in order to maintain international stability and prosperity. In accordance with NATO's new strategic concept, it focuses on developing the member nations' ability to prevent, detect, defend against and recover from cyber attacks, and to enhance and coordinate national cyber defense capabilities. It outlines necessary steps that have to be taken globally in order to counter cyber terrorism.
Armknecht, F., Maes, R., Sadeghi, A., Standaert, O.-X., Wachsmann, C..  2011.  A Formalization of the Security Features of Physical Functions. 2011 IEEE Symposium on Security and Privacy (SP). :397-412.

Physical attacks against cryptographic devices typically take advantage of information leakage (e.g., side-channel attacks) or erroneous computations (e.g., fault injection attacks). Preventing or detecting these attacks has become a challenging task in modern cryptographic research. In this context, intrinsic physical properties of integrated circuits, such as Physical(ly) Unclonable Functions (PUFs), can be used to complement classical cryptographic constructions and to enhance the security of cryptographic devices. PUFs have recently been proposed for various applications, including anti-counterfeiting schemes, key generation algorithms, and the design of block ciphers. However, currently only rudimentary security models for PUFs exist, limiting the confidence in the security claims of PUF-based security primitives. A useful model should at the same time (i) define the security properties of PUFs abstractly and naturally, making it possible to design and formally analyze PUF-based security solutions, and (ii) provide practical quantification tools that allow engineers to evaluate PUF instantiations. In this paper, we present a formal foundation for security primitives based on PUFs. Our approach requires as little as possible from the physics and focuses more on the main properties at the heart of most published works on PUFs: robustness (generation of stable answers), unclonability (not provided by algorithmic solutions), and unpredictability. We first formally define these properties and then show that they can be achieved by previously introduced PUF instantiations. We stress that such a consolidating work allows for a meaningful security analysis of security primitives that take advantage of physical properties, which is becoming increasingly important in the development of the next generation of secure information systems.
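To give a feel for how such properties are quantified in practice (a toy model, not the paper's formal framework), the Python sketch below simulates hypothetical PUF instances as random response tables with measurement noise, and estimates robustness via intra-distance and an unclonability/unpredictability proxy via inter-distance between instances. All parameters are made up.

```python
import random

N_CHALLENGES = 128
NOISE = 0.05          # probability a response bit flips on re-evaluation

def make_puf():
    """A hypothetical PUF instance: a fixed random challenge->bit mapping."""
    return [random.randint(0, 1) for _ in range(N_CHALLENGES)]

def evaluate(puf):
    """Noisy evaluation models physical measurement noise."""
    return [b ^ (random.random() < NOISE) for b in puf]

def hamming_fraction(x, y):
    return sum(a != b for a, b in zip(x, y)) / len(x)

random.seed(0)
puf_a, puf_b = make_puf(), make_puf()

# Robustness: responses of the same PUF should be close across evaluations.
intra = hamming_fraction(evaluate(puf_a), evaluate(puf_a))
# Unclonability/unpredictability proxy: different PUFs should look unrelated.
inter = hamming_fraction(evaluate(puf_a), evaluate(puf_b))

print(f"intra-distance (same PUF, two readings): {intra:.2f}")   # near 2*NOISE
print(f"inter-distance (two different PUFs):     {inter:.2f}")   # near 0.5
```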

Vorobeychik, Yevgeniy, Mayo, Jackson R., Armstrong, Robert C., Ruthruff, Joseph R..  2011.  Noncooperatively Optimized Tolerance: Decentralized Strategic Optimization in Complex Systems. Phys. Rev. Lett.. 107:108702.

We introduce noncooperatively optimized tolerance (NOT), a game theoretic generalization of highly optimized tolerance (HOT), which we illustrate in the forest fire framework. As the number of players increases, NOT retains features of HOT, such as robustness and self-dissimilar landscapes, but also develops features of self-organized criticality. The system retains considerable robustness even as it becomes fractured, due in part to emergent cooperation between players, and at the same time exhibits increasing resilience against changes in the environment, giving rise to intermediate regimes where the system is robust to a particular distribution of adverse events, yet not very fragile to changes.

2012
Atkinson, Simon Reay, Walker, David, Beaulne, Kevin, Hossain, Liaquat.  2012.  Cyber – Transparencies, Assurance and Deterrence. 2012 International Conference on Cyber Security. :119–126.
Cyber has often been considered a coordination-and-control medium, as opposed to a collaborative-influence medium. This conceptual-design paper, uniquely, builds upon a number of entangled, cross-disciplinary research strands – integrating engineering and conflict studies – and a detailed literature review to propose a new paradigm of assurance and deterrence models. We consider an ontology for Cyber-sûreté, which combines both the social trusts necessary for [knowledge & information] assurance, such as collaboration by social influence (CSI), and the technological controls and rules for secure information management, referred to as coordination by rule and control (CRC). We posit Cyber-sûreté as enabling both a 'safe-to-fail' ecology (in which learning, testing and adaptation can take place) within a fail-safe supervisory control and data acquisition (SCADA-type) system, e.g. in a nuclear power plant. Building upon traditional state-based threat analysis, we consider Warning Time and the Threat equation in relation to policies for managing Cyber-Deterrence. We examine how the goods of Cyber might be galvanised so as to encourage virtuous behaviour and deter and/or dissuade ne'er-do-wells through multiple transparencies. We consider how the Deterrence-escalator may be managed by identifying both weak influence and strong control signals so as to create a more benign and responsive cyber-ecology, in which strengths can be exploited and weaknesses identified. Finally, we consider declaratory/mutual transparencies as opposed to legalistic/controlled transparency.
Ahmed Khurshid, University of Illinois at Urbana-Champaign, Wenxuan Zhou, University of Illinois at Urbana-Champaign, Matthew Caesar, University of Illinois at Urbana-Champaign, P. Brighten Godfrey, University of Illinois at Urbana-Champaign.  2012.  VeriFlow: Verifying Network-Wide Invariants in Real Time. First Workshop on Hot Topics in Software Defined Networks (HotSDN 2012).

Networks are complex and prone to bugs. Existing tools that check configuration files and data-plane state operate offline at timescales of seconds to hours, and cannot detect or prevent bugs as they arise. Is it possible to check network-wide invariants in real time, as the network state evolves? The key challenge here is to achieve extremely low latency during the checks so that network performance is not affected. In this paper, we present a preliminary design, VeriFlow, which suggests that this goal is achievable. VeriFlow is a layer between a software-defined networking controller and network devices that checks for network-wide invariant violations dynamically as each forwarding rule is inserted. Based on an implementation using a Mininet OpenFlow network and Route Views trace data, we find that VeriFlow can perform rigorous checking within hundreds of microseconds per rule insertion.
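The sketch below is a drastically simplified illustration of checking an invariant incrementally as each forwarding rule is inserted; it uses exact-match destination rules and a loop/black-hole check rather than VeriFlow's equivalence-class and trie machinery, and all device names are hypothetical.

```python
# Toy incremental checker: rules are exact-match (device, destination) -> next hop.
fib = {}   # (device, dst) -> next_hop, "DELIVER", or absent (black hole)

def check_invariants(dst, devices):
    """Re-trace forwarding of dst from every device; flag loops and black holes."""
    violations = []
    for start in devices:
        seen, node = set(), start
        while True:
            hop = fib.get((node, dst))
            if hop is None:
                violations.append(f"black hole for {dst} at {node}")
                break
            if hop == "DELIVER":
                break
            if hop in seen:
                violations.append(f"loop for {dst} starting at {start}")
                break
            seen.add(node)
            node = hop
    return violations

def insert_rule(device, dst, next_hop, devices):
    """Insert a rule, then immediately verify invariants for the affected destination."""
    fib[(device, dst)] = next_hop
    return check_invariants(dst, devices)

devices = ["s1", "s2", "s3"]
print(insert_rule("s1", "10.0.0.0/24", "s2", devices))        # black holes reported
print(insert_rule("s2", "10.0.0.0/24", "s3", devices))        # black hole at s3
print(insert_rule("s3", "10.0.0.0/24", "s1", devices))        # loop detected
print(insert_rule("s3", "10.0.0.0/24", "DELIVER", devices))   # all clear: []
```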

Abdullah Akce, University of Illinois at Urbana-Champaign, James Norton, University of Illinois at Urbana-Champaign, Timothy Bretl, University of Illinois at Urbana-Champaign.  2012.  A Brain-Machine Interface to Navigate Mobile Robots Along Human-Like Paths Amidst Obstacles. IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS).

This paper presents an interface that allows a human user to specify a desired path for a mobile robot in a planar workspace with noisy binary inputs that are obtained at low bit-rates through an electroencephalograph (EEG). We represent desired paths as geodesics with respect to a cost function that is defined so that each path-homotopy class contains exactly one (local) geodesic. We apply max-margin structured learning to recover a cost function that is consistent with observations of human walking paths. We derive an optimal feedback communication protocol to select a local geodesic (equivalently, a path-homotopy class) using a sequence of noisy bits. We validate our approach with experiments that quantify both how well our learned cost function characterizes human walking data and how well human subjects perform with the resulting interface in navigating a simulated robot with EEG.
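The paper's feedback protocol is information-theoretically grounded; as a much simpler hedged illustration of selecting one of several path-homotopy classes from noisy binary inputs, the Python sketch below runs a Bayesian update over a small set of hypothetical candidate classes, where each noisy bit answers "is the intended class in the currently most likely half of the candidates?". Candidate names, bit-error rate, and bit budget are all made up.

```python
import random

CANDIDATE_PATHS = ["class_0", "class_1", "class_2", "class_3"]  # hypothetical homotopy classes
BIT_ERROR = 0.15    # probability the EEG-decoded bit is flipped

def noisy_bit(true_answer):
    return true_answer if random.random() > BIT_ERROR else not true_answer

def select_path(intended, n_bits=12):
    """Maintain a posterior over candidate classes; each query asks whether the
    intended class lies in the half of candidates currently judged most likely."""
    posterior = {p: 1.0 / len(CANDIDATE_PATHS) for p in CANDIDATE_PATHS}
    for _ in range(n_bits):
        ranked = sorted(posterior, key=posterior.get, reverse=True)
        left = set(ranked[: len(ranked) // 2])
        answer = noisy_bit(intended in left)          # user's (noisy) response
        for p in posterior:                           # Bayes update with the error model
            likelihood = (1 - BIT_ERROR) if (p in left) == answer else BIT_ERROR
            posterior[p] *= likelihood
        total = sum(posterior.values())
        posterior = {p: v / total for p, v in posterior.items()}
    return max(posterior, key=posterior.get)

print(select_path("class_2"))   # usually recovers the intended class despite bit errors
```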

Andrew Clark, University of Washington, Quanyan Zhu, University of Illinois at Urbana-Champaign, Radha Poovendran, University of Washington, Tamer Başar, University of Illinois at Urbana-Champaign.  2012.  Deceptive Routing in Relay Networks. Conference on Decision and Game Theory for Security.

Physical-layer and MAC-layer defense mechanisms against jamming attacks are often inherently reactive to experienced delay and loss of throughput after being attacked. In this paper, we study a proactive defense mechanism against jamming in multi-hop relay networks, in which one or more network sources introduce a deceptive network flow along a disjoint routing path. The deceptive mechanism leverages strategic jamming behaviors, causing the attacker to expend resources on targeting deceptive flows and thereby reducing the impact on real network traffic. We use a two-stage game model to obtain deception strategies at Stackelberg equilibrium for selfish and altruistic nodes. The equilibrium solutions are illustrated and corroborated through a simulation study.
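As a hedged toy version of the two-stage (Stackelberg) structure, not the paper's actual game model, the Python sketch below lets a source choose how much of a unit transmission budget to spend on a deceptive flow; a jammer that cannot distinguish the flows jams the busier-looking path, and the source picks the deception level that maximizes surviving real throughput. The loss model and parameter values are made up.

```python
# Toy Stackelberg deception game (illustrative only, not the paper's model).
# The source (leader) splits a unit budget between a real flow and a deceptive
# flow on a disjoint path; the jammer (follower) jams the larger observed flow.

JAM_LOSS = 0.9   # fraction of a flow destroyed when its path is jammed

def jammer_best_response(real_rate, fake_rate):
    """Follower: jam the busier-looking path (ties split evenly)."""
    if fake_rate > real_rate:
        return "fake"
    if real_rate > fake_rate:
        return "real"
    return "tie"

def surviving_throughput(fake_rate):
    real_rate = 1.0 - fake_rate
    target = jammer_best_response(real_rate, fake_rate)
    if target == "real":
        return real_rate * (1.0 - JAM_LOSS)
    if target == "fake":
        return real_rate
    return real_rate * (1.0 - JAM_LOSS / 2.0)

# Leader: pick the deception level that maximizes surviving real throughput.
best_fake = max((f / 100.0 for f in range(101)), key=surviving_throughput)
print(f"optimal deceptive-flow rate ~ {best_fake:.2f}, "
      f"surviving throughput ~ {surviving_throughput(best_fake):.2f}")
```

With these made-up numbers, sending no deceptive traffic leaves the real flow exposed to the jammer, while a deceptive flow slightly larger than the real one diverts the jammer entirely, which is the qualitative benefit the paper quantifies at equilibrium.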

Quanyan Zhu, University of Illinois at Urbana-Champaign, Andrew Clark, Radha Poovendran, Tamer Başar, University of Illinois at Urbana-Champaign.  2012.  Deceptive Routing Games. 51st IEEE Conference on Decision and Control.

The use of a shared medium leaves wireless networks, including mobile ad hoc and sensor networks, vulnerable to jamming attacks. In this paper, we introduce a jamming defense mechanism for multiple-path routing networks based on maintaining deceptive flows, consisting of fake packets, between a source and a destination. An adversary observing a deceptive flow will expend energy on disrupting the fake packets, allowing the real data packets to arrive at the destination unharmed. We model this deceptive flow-based defense within a multi-stage stochastic game framework between the network nodes, which choose a routing path and flow rates for the real and fake data, and an adversary, which chooses which fraction of each flow to target at each hop. We develop an efficient, distributed procedure for computing the optimal routing at each hop and the optimal flow allocation at the destination. Furthermore, by studying the equilibria of the game, we quantify the benefit arising from deception, as reflected in an increase in the valid throughput. Our results are demonstrated via a simulation study.

2013
Aldrich, Jonathan.  2013.  The Power of Interoperability: Why Objects Are Inevitable. Proceedings of the 2013 ACM International Symposium on New Ideas, New Paradigms, and Reflections on Programming & Software. :101–116.
Three years ago in this venue, Cook argued that in their essence, objects are what Reynolds called procedural data structures. His observation raises a natural question: if procedural data structures are the essence of objects, has this contributed to the empirical success of objects, and if so, how? This essay attempts to answer that question. After reviewing Cook's definition, I propose the term service abstractions to capture the essential nature of objects. This terminology emphasizes, following Kay, that objects are not primarily about representing and manipulating data, but are more about providing services in support of higher-level goals. Using examples taken from object-oriented frameworks, I illustrate the unique design leverage that service abstractions provide: the ability to define abstractions that can be extended, and whose extensions are interoperable in a first-class way. The essay argues that the form of interoperable extension supported by service abstractions is essential to modern software: many modern frameworks and ecosystems could not have been built without service abstractions. In this sense, the success of objects was not a coincidence: it was an inevitable consequence of their service abstraction nature.
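A concrete, hypothetical flavor of the service-abstraction argument, in Python rather than the essay's own examples: clients of an output-stream "service" depend only on its behavior, so independently written extensions, such as an in-memory stream and a transforming wrapper, interoperate in a first-class way with framework code written before either existed. The class and function names below are illustrative.

```python
# Service abstraction: callers depend on the behavior of `write`, not on any
# particular data representation, so third-party extensions compose freely.

class MemoryStream:
    """One independently written implementation of the output-stream service."""
    def __init__(self):
        self.chunks = []
    def write(self, data: bytes) -> None:
        self.chunks.append(data)

class UppercasingStream:
    """A wrapper extension: also an output stream, layered over any other one."""
    def __init__(self, inner):
        self.inner = inner
    def write(self, data: bytes) -> None:
        self.inner.write(data.upper())

def framework_report(stream) -> None:
    """'Framework' code written before either extension existed."""
    stream.write(b"report line 1\n")
    stream.write(b"report line 2\n")

sink = MemoryStream()
framework_report(UppercasingStream(sink))   # extensions interoperate first-class
print(b"".join(sink.chunks).decode())       # uppercased report lines
```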
Omar, Cyrus, Chung, Benjamin, Kurilova, Darya, Potanin, Alex, Aldrich, Jonathan.  2013.  Type-directed, whitespace-delimited parsing for embedded DSLs. Proceedings of the First Workshop on the Globalization of Domain Specific Languages. :8–11.
Domain-specific languages improve ease-of-use, expressiveness and verifiability, but defining and using different DSLs within a single application remains difficult. We introduce an approach for embedded DSLs where 1) whitespace delimits DSL-governed blocks, and 2) the parsing and type checking phases occur in tandem so that the expected type of the block determines which domain-specific parser governs that block. We argue that this approach occupies a sweet spot, providing high expressiveness and ease-of-use while maintaining safe composability. We introduce the design, provide examples and describe an ongoing implementation of this strategy in the Wyvern programming language. We also discuss how a more conventional keyword-directed strategy for parsing of DSLs can arise as a special case of this type-directed strategy.
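A hypothetical Python analogue of the type-directed dispatch idea (Wyvern's actual mechanism is language-level and whitespace-delimited): the expected type at a use site selects which domain-specific parser is applied to an embedded literal block. The type names and parsers below are invented for illustration.

```python
# Toy type-directed parsing: the expected type chooses the DSL parser.

def parse_html(block: str):
    return ("html", block.strip())

def parse_sql(block: str):
    return ("sql-query", " ".join(block.split()))

DSL_PARSERS = {"HTML": parse_html, "SQL": parse_sql}   # type name -> parser

def parse_literal(expected_type: str, block: str):
    """During type checking, the expected type of the block determines
    which domain-specific parser governs it."""
    try:
        return DSL_PARSERS[expected_type](block)
    except KeyError:
        raise TypeError(f"no DSL registered for expected type {expected_type}")

# The same kind of literal text is interpreted by whichever parser the context expects.
print(parse_literal("HTML", "  <b>hi</b>  "))
print(parse_literal("SQL", "SELECT *\n  FROM users"))
```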
Nistor, Ligia, Kurilova, Darya, Balzer, Stephanie, Chung, Benjamin, Potanin, Alex, Aldrich, Jonathan.  2013.  Wyvern: A Simple, Typed, and Pure Object-oriented Language. Proceedings of the 5th Workshop on MechAnisms for SPEcialization, Generalization and inHerItance. :9–16.
The simplest and purest practical object-oriented language designs today are seen in dynamically-typed languages, such as Smalltalk and Self. Static types, however, have potential benefits for productivity, security, and reasoning about programs. In this paper, we describe the design of Wyvern, a statically typed, pure object-oriented language that attempts to retain much of the simplicity and expressiveness of these iconic designs. Our goals lead us to combine pure object-oriented and functional abstractions in a simple, typed setting. We present a foundational object-based language that we believe to be as close as one can get to simple typed lambda calculus while keeping object-orientation. We show how this foundational language can be translated to the typed lambda calculus via standard encodings. We then define a simple extension to this language that introduces classes and show that classes are no more than sugar for the foundational object-based language. Our future intention is to demonstrate that modules and other object-oriented features can be added to our language as no more than such syntactic extensions, while keeping the object-oriented core as pure as possible. The design of Wyvern closely follows both historical and modern ideas about the essence of object-orientation, suggesting a new way to think about a minimal, practical, typed core language for objects.
Ivars, Eugene, Armands, Vadim.  2013.  Alias-free compressed signal digitizing and recording on the basis of Event Timer. 2013 21st Telecommunications Forum Telfor (TELFOR). :443–446.

Specifics of an alias-free digitizer application for compressed digitizing and recording of wideband signals are considered. Signal sampling in this case is performed on the basis of picosecond-resolution event timing; the digitizer is in fact a subsystem of the Event Timer A033-ET, and the specific events that are detected and then timed are the crossings of the signal and a reference sine wave. The approach used to develop this subsystem is described and some results of experimental studies are given.

Titus Barik, Arpan Chakraborty, Brent Harrison, David L. Roberts, Robert St. Amant.  2013.  Modeling the Concentration Game with ACT-R. The 12th International Conference on Cognitive Modeling.

This paper describes the development of subsymbolic ACT-R models for the Concentration game. Performance data is taken from an experiment in which participants played the game under two conditions: minimizing the number of mismatches/turns during a game, and minimizing the time to complete a game. Conflict resolution and parameter tuning are used to implement an accuracy model and a speed model that capture the differences for the two conditions. Visual attention drives exploration of the game board in the models. Modeling results are generally consistent with human performance, though some systematic differences can be seen. Modeling decisions, model limitations, and open issues are discussed.

Alejandro Domínguez-García, University of Illinois at Urbana-Champaign, Bahman Gharesifard, University of Illinois at Urbana-Champaign, Tamer Başar, University of Illinois at Urbana-Champaign.  2013.  A Price-Based Approach to Control of Networked Distributed Energy Resources.

We introduce a framework for controlling the energy provided or absorbed by distributed energy resources (DERs) in power distribution networks. In this framework, there is a set of agents, referred to as aggregators, that interact with the wholesale electricity market and, through some market-clearing mechanism, are requested (and will be compensated) to provide or absorb a certain amount of active (or reactive) power over some period of time. In order to fulfill the request, each aggregator interacts with a set of DERs and offers them some price per unit of active (or reactive) power they provide (or absorb); the objective is for the aggregator to design a pricing strategy that incentivizes the DERs to change their active (or reactive) power consumption (or production) so that they collectively provide the amount the aggregator has been asked for. In order to make a decision, each DER uses the price information provided by the aggregator and some estimate of the average active (or reactive) power that neighboring DERs can provide, computed through some exchange of information among them; this exchange is described by a connected undirected graph. The focus is on the DERs' strategic decision-making process, which we cast as a game. In this context, we provide sufficient conditions on the aggregator's pricing strategy under which this game has a unique Nash equilibrium. Then, we propose a distributed iterative algorithm, adhering to the graph that describes the exchange of information between DERs, that allows them to seek this Nash equilibrium. We illustrate our results through several numerical simulations.

Presented as part of the DIMACS Workshop on Energy Infrastructure: Designing for Stability and Resilience, Rutgers University, Piscataway, NJ, February 20-22, 2013
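The decision-making structure in the abstract above can be given a rough flavor in code. The Python sketch below is not the paper's distributed algorithm (which has DERs estimate the network average through neighbor-to-neighbor exchange over a connected graph); it simply damps each hypothetical DER's best response to a price that declines with the current average provision. The price law, cost coefficients, and damping factor are all made up.

```python
# Hedged toy of the pricing/best-response structure (not the paper's
# distributed algorithm; the average is computed centrally for simplicity).

A, B = 10.0, 2.0                 # aggregator's price per unit: p = A - B * (average provision)
COSTS = [1.0, 1.5, 2.0, 2.5]     # hypothetical quadratic cost coefficients, one per DER

def best_response(price, cost):
    """DER i maximizes price*x - (cost/2)*x^2, so its best response is price/cost."""
    return max(0.0, price / cost)

def seek_equilibrium(iterations=100, damping=0.3):
    x = [0.0] * len(COSTS)
    for _ in range(iterations):
        avg = sum(x) / len(x)
        price = max(0.0, A - B * avg)
        x = [(1 - damping) * xi + damping * best_response(price, ci)
             for xi, ci in zip(x, COSTS)]
    return x, price

x, price = seek_equilibrium()
print("clearing price ~", round(price, 2))
print("per-DER provision ~", [round(v, 2) for v in x])
```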

Hui Lin, University of Illinois at Urbana-Champaign, Adam Slagell, University of Illinois at Urbana-Champaign, Catello Di Martino, University of Illinois at Urbana-Champaign, Zbigniew Kalbarczyk, University of Illinois at Urbana-Champaign, Ravishankar K. Iyer, University of Illinois at Urbana-Champaign.  2013.  Adapting Bro into SCADA: Building a Specification-based Intrusion Detection System for the DNP3 Protocol. Eighth Annual Cyber Security and Information Intelligence Research Workshop (CSIIRW 2013).

When SCADA systems are exposed to public networks, attackers can more easily penetrate the control systems that operate electrical power grids, water plants, and other critical infrastructures. To detect such attacks, SCADA systems require an intrusion detection technique that can understand the information carried by their usually proprietary network protocols.

To achieve that goal, we propose to attach to SCADA systems a specification-based intrusion detection framework based on Bro [7][8], a runtime network traffic analyzer. We have built a parser in Bro to support DNP3, a network protocol widely used in SCADA systems that operate electrical power grids. This built-in parser provides a clear view of all network events related to SCADA systems. Consequently, security policies to analyze SCADA-specific semantics related to the network events can be accurately defined. As a proof of concept, we specify a protocol validation policy to verify that the semantics of the data extracted from network packets conform to protocol definitions. We performed an experimental evaluation to study the processing capabilities of the proposed intrusion detection framework.
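In the spirit of the protocol-validation policy described above, here is a hypothetical Python sketch (not Bro script, and with simplified field names and limits rather than the real DNP3 encoding) that checks parsed request fields against specification-style constraints before they are accepted.

```python
# Simplified specification-based validation of parsed control-protocol requests.
# Field names and allowed ranges are illustrative, not the actual DNP3 spec.

ALLOWED_FUNCTION_CODES = {1: "READ", 2: "WRITE", 5: "DIRECT_OPERATE"}
MAX_PAYLOAD_BYTES = 292          # hypothetical fragment-size limit

def validate_request(msg: dict) -> list:
    """Return a list of specification violations for one parsed request."""
    violations = []
    if msg.get("function_code") not in ALLOWED_FUNCTION_CODES:
        violations.append(f"unknown function code {msg.get('function_code')}")
    if not (0 <= msg.get("destination_address", -1) <= 0xFFFF):
        violations.append("destination address out of range")
    if len(msg.get("payload", b"")) > MAX_PAYLOAD_BYTES:
        violations.append("payload exceeds allowed fragment size")
    return violations

requests = [
    {"function_code": 1, "destination_address": 10, "payload": b"\x00" * 8},
    {"function_code": 99, "destination_address": 70000, "payload": b"\x00" * 1024},
]
for req in requests:
    problems = validate_request(req)
    print("OK" if not problems else "ALERT: " + "; ".join(problems))
```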

Andrew Clark, Quanyan Zhu, University of Illinois at Urbana-Champaign, Radha Poovendran, Tamer Başar, University of Illinois at Urbana-Champaign.  2013.  An Impact-Aware Defense against Stuxnet. IFAC American Control Conference (ACC 2013).

The Stuxnet worm is sophisticated malware designed to sabotage industrial control systems (ICSs). It exploits vulnerabilities in removable drives, local area communication networks, and programmable logic controllers (PLCs) to penetrate the process control network (PCN) and the control system network (CSN). Stuxnet was successful in penetrating the control system network and sabotaging industrial control processes since the targeted control systems lacked security mechanisms for verifying message integrity and source authentication. In this work, we propose a novel proactive defense system framework, in which commands from the system operator to the PLC are authenticated using a randomized set of cryptographic keys. The framework leverages cryptographic analysis and control- and game-theoretic methods to quantify the impact of malicious commands on the performance of the physical plant. We derive the worst-case optimal randomization strategy as a saddle-point equilibrium of a game between an adversary attempting to insert commands and the system operator, and show that the proposed scheme can achieve arbitrarily low adversary success probability for a sufficiently large number of keys. We evaluate our proposed scheme, using a linear-quadratic regulator (LQR) as a case study, through theoretical and numerical analysis.
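To convey the flavor of the success-probability argument (this is an illustrative toy, not the paper's scheme), the Python sketch below has the operator and PLC share n MAC keys and agree on a fresh random key index per command; an adversary who has compromised exactly one key can forge a command only when that key happens to be the one currently expected, so its success rate falls roughly as 1/n. Command strings and key sizes are made up.

```python
import hmac, hashlib, secrets

def mac(key: bytes, command: bytes) -> bytes:
    return hmac.new(key, command, hashlib.sha256).digest()

def adversary_success_rate(n_keys: int, trials: int = 5000) -> float:
    keys = [secrets.token_bytes(16) for _ in range(n_keys)]
    compromised = keys[0]                         # adversary learned exactly one key
    wins = 0
    for _ in range(trials):
        expected_idx = secrets.randbelow(n_keys)  # key the PLC expects for this command
        forged = mac(compromised, b"SET_ROTOR_SPEED 1400")
        # The PLC verifies the tag against the key it currently expects.
        wins += hmac.compare_digest(mac(keys[expected_idx], b"SET_ROTOR_SPEED 1400"), forged)
    return wins / trials

for n in (1, 4, 16):
    print(f"{n:2d} keys -> adversary success rate ~ {adversary_success_rate(n):.3f}")
```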

Bahman Gharesifard, University of Illinois at Urbana-Champaign, Tamer Başar, University of Illinois at Urbana-Champaign, Alejandro D. Domínguez-García, University of Illinois at Urbana-Champaign.  2013.  Price-based Distributed Control for Networked Plug-in Electric Vehicles. 2013 American Control Conference.

We introduce a framework for controlling the charging and discharging processes of plug-in electric vehicles (PEVs) via pricing strategies. Our framework consists of a hierarchical decision-making setting with two layers, which we refer to as the aggregator layer and the retail market layer. In the aggregator layer, there is a set of aggregators that are requested (and will be compensated) to provide a certain amount of energy over a period of time. In the retail market layer, the aggregator offers some price for the energy that PEVs may provide; the objective is to choose a pricing strategy that incentivizes the PEVs so that they collectively provide the amount of energy the aggregator has been asked for. The focus of this paper is on the decision-making process that takes place in the retail market layer, where we assume that each individual PEV is a price-anticipating decision-maker. We cast this decision-making process as a game, and provide conditions on the pricing strategy of the aggregator under which this game has a unique Nash equilibrium. We propose a distributed consensus-based iterative algorithm through which the PEVs can seek this Nash equilibrium. Numerical simulations are included to illustrate our results.

Navid Aghasadeghi, University of Illinois at Urbana-Champaign, Huihua Zhao, Texas A&M University, Levi J. Hargrove, Northwestern University, Aaron D. Ames, Texas A&M University, Eric J. Perreault, Northwestern University, Timothy Bretl, University of Illinois at Urbana-Champaign.  2013.  Learning Impedance Controller Parameters for Lower-Limb Prostheses. 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS).

Impedance control is a common framework for control of lower-limb prosthetic devices. This approach requires choosing many impedance controller parameters. In this paper, we show how to learn these parameters for lower-limb prostheses by observation of unimpaired human walkers. We validate our approach in simulation of a transfemoral amputee, and we demonstrate the performance of the learned parameters in a preliminary experiment with a lower-limb prosthetic device.
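For readers unfamiliar with the controller being parameterized, a common impedance-control form computes joint torque from a stiffness, damping, and equilibrium angle per gait phase; the Python sketch below uses this generic form with made-up phase names and parameter values, not anything learned or reported in the paper.

```python
import math

# Generic phase-dependent impedance control law (illustrative parameters only):
#   torque = k * (theta_eq - theta) - b * theta_dot
# The paper's contribution is learning k, b, theta_eq per phase from
# observations of unimpaired walking; the numbers below are made up.

PHASE_PARAMS = {
    "early_stance": {"k": 180.0, "b": 4.0, "theta_eq": math.radians(5)},
    "late_stance":  {"k": 120.0, "b": 3.0, "theta_eq": math.radians(15)},
    "swing":        {"k": 40.0,  "b": 1.5, "theta_eq": math.radians(55)},
}

def knee_torque(phase: str, theta: float, theta_dot: float) -> float:
    p = PHASE_PARAMS[phase]
    return p["k"] * (p["theta_eq"] - theta) - p["b"] * theta_dot

# Example: slightly flexed knee moving into further flexion during early stance.
print(round(knee_torque("early_stance", math.radians(8), 0.5), 2))
```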

Hui Lin, University of Illinois at Urbana-Champaign, Adam Slagell, University of Illinois at Urbana-Champaign, Zbigniew Kalbarczyk, University of Illinois at Urbana-Champaign, Peter W. Sauer, University of Illinois at Urbana-Champaign, Ravishankar K. Iyer, University of Illinois at Urbana-Champaign.  2013.  Semantic Security Analysis of SCADA Networks to Detect Malicious Control Commands in Power Grids. First ACM Workshop on Smart Energy Grid Security.

In the current generation of SCADA (Supervisory Control And Data Acquisition) systems used in power grids, a sophisticated attacker can exploit system vulnerabilities and use a legitimate but maliciously crafted command to cause a wide range of system changes that traditional contingency analysis does not consider and remedial action schemes cannot handle. To detect such malicious commands, we propose a semantic analysis framework based on a distributed network of intrusion detection systems (IDSes). The framework combines system knowledge of both the cyber and physical infrastructure of the power grid to help the IDSes estimate the execution consequences of control commands and thus reveal an attacker's malicious intentions. We evaluated the approach on the IEEE 30-bus system. Our experiments demonstrate that: (i) by opening 3 transmission lines, an attacker can avoid detection by traditional contingency analysis and instantly put the tested 30-bus system into an insecure state, and (ii) the semantic analysis provides reliable detection of malicious commands with a small amount of analysis time.
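As a drastically simplified, hypothetical stand-in for consequence estimation (the paper combines cyber context with power-system analysis; this Python sketch only checks whether a line-opening command would island a load bus), consider the following. The bus and line data are invented, not the IEEE 30-bus system.

```python
# Toy consequence check for an "open breaker on line X" command: does the
# remaining network still connect every load bus to a generator bus?
# Bus/line data are made up; the paper's analysis uses real power-system models.

LINES = {              # line id -> (bus_a, bus_b)
    "L1": (1, 2), "L2": (2, 3), "L3": (1, 3), "L4": (3, 4),
}
GENERATOR_BUSES = {1}
LOAD_BUSES = {2, 3, 4}

def connected_component(start, lines):
    seen, frontier = {start}, [start]
    while frontier:
        bus = frontier.pop()
        for a, b in lines.values():
            for nxt in ((b,) if a == bus else (a,) if b == bus else ()):
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append(nxt)
    return seen

def command_is_safe(lines_to_open):
    remaining = {lid: ln for lid, ln in LINES.items() if lid not in lines_to_open}
    energized = set()
    for g in GENERATOR_BUSES:
        energized |= connected_component(g, remaining)
    return LOAD_BUSES <= energized

print(command_is_safe({"L2"}))          # True: bus 3 is still fed via L3
print(command_is_safe({"L2", "L3"}))    # False: buses 3 and 4 are islanded
```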