Biblio

Filters: Keyword is Buildings
2020-02-10
Carneiro, Lucas R., Delgado, Carla A.D.M., da Silva, João C.P..  2019.  Social Analysis of Game Agents: How Trust and Reputation can Improve Player Experience. 2019 8th Brazilian Conference on Intelligent Systems (BRACIS). :485–490.
Video games normally use Artificial Intelligence techniques to improve Non-Player Character (NPC) behavior, creating a more realistic experience for their players. However, rational behavior in general does not consider social interactions between players and bots. Because of that, a new framework for NPCs was proposed, which uses a social bias to mix the default strategy of finding the best possible plays to win with an analysis that decides whether other players should be categorized as allies or foes. Trust and reputation models were used together to implement this demeanor. In this paper we discuss an implementation of this framework inside the game Settlers of Catan. New NPC agents are created for this implementation. We also analyze the results obtained from simulations among agents and players to conclude how the use of trust and reputation in NPCs can create a better gaming experience.
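
A minimal illustrative sketch (not taken from the paper; the weights and threshold are hypothetical) of how an NPC could blend its own trust estimate with peer-reported reputation to label another player as ally or foe:

# Illustrative only: blend direct trust (own interactions) with reputation
# (reports from other agents) into a single social score.
def classify_player(own_experiences, peer_reports, w_trust=0.6, w_rep=0.4, threshold=0.5):
    # both inputs are lists of outcomes in [0, 1]: 1 = cooperative, 0 = hostile
    trust = sum(own_experiences) / len(own_experiences) if own_experiences else 0.5
    reputation = sum(peer_reports) / len(peer_reports) if peer_reports else 0.5
    social_score = w_trust * trust + w_rep * reputation
    return "ally" if social_score >= threshold else "foe"

print(classify_player([1, 1, 0], [0.8, 0.9]))  # -> ally
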
2020-01-20
Ajaei, F. Badrkhani, Mohammadi, J., Stevens, G., Akhavan, E..  2019.  Hybrid AC/DC Microgrid Configurations for a Net-Zero Energy Community. 2019 IEEE/IAS 55th Industrial and Commercial Power Systems Technical Conference (I&CPS). :1–7.

The hybrid microgrid has attracted great attention in recent years, as it combines the main advantages of the alternating current (AC) and direct current (DC) microgrids. It is one of the best candidates to support a net-zero energy community. Thus, this paper investigates and compares different hybrid AC/DC microgrid configurations that are suitable for a net-zero energy community. Four different configurations are compared with each other in terms of their impacts on overall system reliability, expandability, load shedding requirements, power sharing issues, net-zero energy capability, the number of required interface converters, and the requirement for costly medium-voltage components. The results of the investigations indicate that the best results are achieved when each building is enabled to supply its critical loads using an independent AC microgrid that is interfaced to the DC microgrid through a dedicated interface converter.

2019-12-09
Henzel, Robert, Herzwurm, Georg.  2018.  A preliminary approach towards the trust issue in cloud manufacturing using grounded theory: Defining the problem domain. 2018 4th International Conference on Universal Village (UV). :1–6.
In Cloud Manufacturing, trust is an important but under-investigated issue. This paper carries out the noncommittal phase of the grounded theory method by investigating the trust topic across several research streams and defining the problem domain. This novel approach fills a research gap and can be treated as a snapshot and blueprint of research. The findings were obtained through a structured literature review and can help future researchers pursue the integrative phase of grounded theory by building on the preliminary results of this paper.
2019-10-14
Angelini, M., Blasilli, G., Borrello, P., Coppa, E., D’Elia, D. C., Ferracci, S., Lenti, S., Santucci, G..  2018.  ROPMate: Visually Assisting the Creation of ROP-based Exploits. 2018 IEEE Symposium on Visualization for Cyber Security (VizSec). :1–8.

Exploits based on ROP (Return-Oriented Programming) are increasingly present in advanced attack scenarios. Testing systems for ROP-based attacks can be valuable for improving the security and reliability of software. In this paper, we propose ROPMATE, the first Visual Analytics system specifically designed to assist human red team ROP exploit builders. In contrast, previous ROP tools typically require users to inspect a puzzle of hundreds or thousands of lines of textual information, making it a daunting task. ROPMATE presents builders with a clear interface of well-defined and semantically meaningful gadgets, i.e., fragments of code already present in the binary application that can be chained to form fully-functional exploits. The system supports incrementally building exploits by suggesting gadget candidates filtered according to constraints on preserved registers and accessed memory. Several visual aids are offered to identify suitable gadgets and assemble them into semantically correct chains. We report on a preliminary user study that shows how ROPMATE can assist users in building ROP chains.
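
As an illustration of the kind of constraint-based gadget filtering described above (a hedged sketch, not ROPMate's implementation; the gadget fields and addresses are made up):

from dataclasses import dataclass

@dataclass
class Gadget:
    address: int
    asm: str
    clobbered_regs: set     # registers the gadget overwrites
    accesses_memory: bool   # whether it dereferences memory

def candidate_gadgets(gadgets, preserve=(), allow_memory=True):
    # keep only gadgets that do not clobber preserved registers and,
    # if requested, do not touch memory
    preserve = set(preserve)
    return [g for g in gadgets
            if not (g.clobbered_regs & preserve)
            and (allow_memory or not g.accesses_memory)]

gadgets = [
    Gadget(0x401123, "pop rdi; ret", {"rdi"}, False),
    Gadget(0x401200, "mov [rax], rbx; ret", set(), True),
]
print(candidate_gadgets(gadgets, preserve={"rbx"}, allow_memory=False))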

2019-09-26
Chung, S., Shieh, M., Chiueh, T..  2018.  A Security Proxy to Cloud Storage Backends Based on an Efficient Wildcard Searchable Encryption. 2018 IEEE 8th International Symposium on Cloud and Service Computing (SC2). :127-130.

Cloud storage backends such as Amazon S3 are a potential storage solution for enterprises. However, to couple enterprises with these backends, at least two problems must be solved: first, how to make these semi-trusted backends as secure as on-premises storage; and second, how to retrieve files selectively as easily as with on-premises storage. A security proxy can address both problems by building a local index from keywords in files before encrypting and uploading the files to these backends. But if the local index is built in plaintext, file content is still vulnerable to malicious local staff. Searchable Encryption (SE) removes this vulnerability by keeping the index in ciphertext; however, its known constructions often require modifications to the index database and, to support wildcard queries, are not efficient at all. In this paper, we present a security proxy that, based on our wildcard SE construction, can securely and efficiently couple enterprises with these backends. In particular, since our SE construction can work directly with existing database systems, it incurs only a little overhead, and, when needed, permits the security proxy to run with a constantly small storage footprint by readily outsourcing all built indices to existing cloud databases.
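
A heavily simplified sketch of the security-proxy flow (a toy keyed-token index, not the paper's wildcard SE construction, and it does not support wildcard queries; the cryptography package is assumed to be available for file encryption):

import hmac, hashlib, os
from cryptography.fernet import Fernet    # assumed dependency; any AEAD scheme would do

index_key = os.urandom(32)
file_key = Fernet.generate_key()
index = {}                                # HMAC token -> list of file ids (no plaintext keywords)

def token(keyword):
    return hmac.new(index_key, keyword.encode(), hashlib.sha256).hexdigest()

def ingest(file_id, plaintext, keywords, backend):
    backend[file_id] = Fernet(file_key).encrypt(plaintext)   # upload only ciphertext
    for kw in keywords:
        index.setdefault(token(kw), []).append(file_id)

def search(keyword):
    return index.get(token(keyword), [])

backend = {}                              # stands in for a cloud backend such as Amazon S3
ingest("report.docx", b"quarterly numbers", ["finance", "q3"], backend)
print(search("finance"))                  # -> ['report.docx']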

2019-07-01
Medeiros, N., Ivaki, N., Costa, P., Vieira, M..  2018.  An Approach for Trustworthiness Benchmarking Using Software Metrics. 2018 IEEE 23rd Pacific Rim International Symposium on Dependable Computing (PRDC). :84–93.

Trustworthiness is a paramount concern for users and customers in the selection of a software solution, especially in the context of complex and dynamic environments such as Cloud and IoT. However, assessing and benchmarking trustworthiness (worthiness of software for being trusted) is a challenging task, mainly due to the variety of application scenarios (e.g., business-critical, safety-critical), the large number of determinative quality attributes (e.g., security, performance), and, above all, the subjective notion of trust and trustworthiness. In this paper, we present trustworthiness as a measurable notion in relative terms based on security attributes and propose an approach for the assessment and benchmarking of software. The main goal is to build a trustworthiness assessment model based on software metrics (e.g., Cyclomatic Complexity, CountLine, CBO) that can be used as indicators of software security. To demonstrate the proposed approach, we assessed and ranked several files and functions of the Mozilla Firefox project based on their trustworthiness score and conducted a survey among several software security experts in order to validate the obtained rank. Results show that our approach is able to provide a sound ranking of the benchmarked software.
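
A hedged sketch of the general idea (the weights, caps, and metric values below are hypothetical, not the paper's calibrated model): normalize a few static metrics used as security indicators and combine them so that lower-risk code gets a higher trustworthiness score.

METRIC_WEIGHTS = {"cyclomatic_complexity": 0.5, "count_line": 0.2, "cbo": 0.3}   # assumed weights
METRIC_CAPS = {"cyclomatic_complexity": 50, "count_line": 1000, "cbo": 30}       # assumed caps

def trustworthiness(metrics):
    score = 0.0
    for name, weight in METRIC_WEIGHTS.items():
        normalized = min(metrics[name], METRIC_CAPS[name]) / METRIC_CAPS[name]
        score += weight * (1.0 - normalized)      # higher metric value -> lower trust
    return score

functions = {
    "parse_header":   {"cyclomatic_complexity": 4,  "count_line": 60,  "cbo": 2},
    "handle_request": {"cyclomatic_complexity": 35, "count_line": 800, "cbo": 18},
}
ranking = sorted(functions, key=lambda f: trustworthiness(functions[f]), reverse=True)
print(ranking)   # functions ordered from most to least trustworthy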

2019-06-24
Chouikhi, S., Merghem-Boulahia, L., Esseghir, M..  2018.  Energy Demand Scheduling Based on Game Theory for Microgrids. 2018 IEEE International Conference on Communications (ICC). :1–6.

The advent of smart grids offers us the opportunity to better manage the electricity grids. One of the most interesting challenges in modern grids is consumer demand management. Indeed, the development of Information and Communication Technologies (ICTs) encourages the development of demand-side management systems. In this paper, we propose a distributed energy demand scheduling approach that uses minimal interactions between consumers to optimize the energy demand. We formulate the consumption scheduling as a constrained optimization problem and use game theory to solve it. On the one hand, the proposed approach aims to reduce the total energy cost of a building's consumers, which requires cooperation among all the consumers to achieve the collective goal. On the other hand, the privacy of each user must be protected, which means that our distributed approach must operate with minimal information exchange. The performance evaluation shows that the proposed approach reduces the total energy cost, each consumer's individual cost, and the peak-to-average ratio.
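
A hedged sketch of a best-response style scheduling loop in the spirit of this approach (not the paper's exact game formulation; the base profile and loads are made up): each consumer repeatedly moves its flexible load into the hour where aggregate demand is currently lowest, exchanging only the aggregate profile rather than individual schedules.

HOURS = 24
base = [2.0] * HOURS
for h in range(18, 22):                   # evening peak in the inflexible base demand
    base[h] = 6.0

flexible = {"A": 3.0, "B": 2.0, "C": 4.0}                  # shiftable load per consumer (kWh)
slot = {c: 19 for c in flexible}                           # naive start: everyone runs at the peak hour

def aggregate(slots):
    total = base[:]
    for c, h in slots.items():
        total[h] += flexible[c]
    return total

def par(total):
    return max(total) / (sum(total) / HOURS)               # peak-to-average ratio

print("PAR before:", round(par(aggregate(slot)), 2))

for _ in range(5):                                          # best-response iterations
    for c in flexible:
        others = aggregate({o: h for o, h in slot.items() if o != c})
        slot[c] = min(range(HOURS), key=lambda h: others[h])

print("PAR after:", round(par(aggregate(slot)), 2))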

2019-03-28
Costantino, G., Marra, A. La, Martinelli, F., Mori, P., Saracino, A..  2018.  Privacy Preserving Distributed Computation of Private Attributes for Collaborative Privacy Aware Usage Control Systems. 2018 IEEE International Conference on Smart Computing (SMARTCOMP). :315-320.

Collaborative smart services provide functionalities which exploit data collected from different sources to provide benefits to a community of users. Such data, however, might be privacy sensitive, and their disclosure has to be avoided. In this paper, we present a distributed multi-tier framework intended for smart-environment management, based on usage control for policy evaluation and enforcement on devices belonging to different collaborating entities. The proposed framework exploits secure multi-party computation to evaluate policy conditions without disclosing the actual values of the evaluated attributes, in order to preserve privacy. As a reference example, a smart-grid use case is presented.

2019-01-21
Fei, Y., Ning, J., Jiang, W..  2018.  A quantifiable Attack-Defense Trees model for APT attack. 2018 IEEE 3rd Advanced Information Technology, Electronic and Automation Control Conference (IAEAC). :2303–2306.
In order to deal with APT (Advanced Persistent Threat) attacks, this paper proposes a quantifiable Attack-Defense Tree model. First, the model gives both attack and defense leaf nodes a variety of security attributes. It then quantifies the nodes through the analytic hierarchy process (AHP). Finally, it analyzes the impact of the defense measures on the attack behavior. Through the application of the model, we can see that the quantifiable Attack-Defense Tree model can well describe the impact of defense measures on attack behavior.
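
A hedged sketch of the analytic hierarchy process step (the pairwise comparisons and attribute ratings are hypothetical, not taken from the paper): weights for a leaf node's security attributes are derived from a Saaty-style pairwise-comparison matrix via the normalized geometric mean of its rows.

import math

# Pairwise comparisons among three attributes of an attack leaf node:
# attack cost, technical difficulty, and probability of detection.
pairwise = [
    [1,   3,   5],
    [1/3, 1,   3],
    [1/5, 1/3, 1],
]

def ahp_weights(matrix):
    n = len(matrix)
    geo = [math.prod(row) ** (1.0 / n) for row in matrix]   # geometric mean of each row
    total = sum(geo)
    return [g / total for g in geo]

weights = ahp_weights(pairwise)
ratings = [0.7, 0.4, 0.2]                     # hypothetical normalized attribute ratings
leaf_value = sum(w * r for w, r in zip(weights, ratings))
print([round(w, 3) for w in weights], round(leaf_value, 3))
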
2018-12-03
Molka-Danielsen, J., Engelseth, P., Olešnaníková, V., Šarafín, P., Žalman, R..  2017.  Big Data Analytics for Air Quality Monitoring at a Logistics Shipping Base via Autonomous Wireless Sensor Network Technologies. 2017 5th International Conference on Enterprise Systems (ES). :38–45.
The indoor air quality in industrial workplace buildings, e.g. air temperature, humidity and levels of carbon dioxide (CO2), plays a critical role in the perceived levels of workers' comfort and in reported medical health. CO2 can act as an oxygen displacer, and in confined spaces humans can experience, for example, dizziness, increased heart rate and blood pressure, headaches, and in more serious cases loss of consciousness. Specialized organizations can be brought in to monitor the work environment for limited periods. However, new low-cost wireless sensor network (WSN) technologies offer the potential for more continuous and autonomous assessment of industrial workplace air quality. Central to effective decision making is the data analytics approach and the visualization of what is potentially big data (BD) in monitoring the air quality in industrial workplaces. This paper presents a case study that monitors air quality collected with WSN technologies. We discuss the potential BD problems. The case trials are from two workshops that are part of a large on-shore logistics base serving the regional shipping industry in Norway. This small case study demonstrates a monitoring and visualization approach for facilitating BD in decision making for health and safety in the shipping industry. We also identify other potential applications of WSN technologies and visualization of BD in workplace environments; for example, for monitoring of other substances for worker safety in high-risk industries and for quality of goods in supply chain management.
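
A hedged sketch of the kind of rule such a monitoring dashboard might apply to a CO2 stream (illustrative only, not the paper's analytics pipeline; the thresholds are the commonly cited ~1000 ppm comfort level and the 5000 ppm eight-hour occupational average):

from collections import deque

WINDOW = 12                       # e.g. 12 samples = 1 hour at 5-minute sampling (assumed)

def classify(readings_ppm):
    window = deque(maxlen=WINDOW)
    for ppm in readings_ppm:
        window.append(ppm)
        avg = sum(window) / len(window)
        if avg >= 5000:
            yield (ppm, round(avg), "ALARM: above occupational exposure average")
        elif avg >= 1000:
            yield (ppm, round(avg), "WARNING: poor air quality / comfort")
        else:
            yield (ppm, round(avg), "OK")

for sample in classify([650, 800, 1200, 1500, 5200, 5600]):
    print(sample)
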
2018-02-21
Pak, W., Choi, Y. J..  2017.  High Performance and High Scalable Packet Classification Algorithm for Network Security Systems. IEEE Transactions on Dependable and Secure Computing. 14:37–49.

Packet classification is a core function in network and security systems; hence, hardware-based solutions, such as packet classification accelerator chips or Ternary Content Addressable Memory (T-CAM), have been widely adopted for high-performance systems. With the rapid improvement of general hardware architectures and the growing popularity of multi-core multi-threaded processors, software-based packet classification algorithms are attracting considerable attention, owing to their high flexibility in satisfying various industrial requirements for security and network systems. For high classification speed, these algorithms internally use large tables, whose size increases exponentially with the ruleset size; consequently, they cannot be used with large rulesets. To overcome this problem, we propose a new software-based packet classification algorithm that simultaneously supports high scalability and fast classification performance by merging partition decision trees in a search table. While most partitioning-based packet classification algorithms show good scalability at the cost of low classification speed, our algorithm shows very high classification speed, irrespective of the number of rules, with small tables and short table-building time. Our test results confirm that the proposed algorithm enables network and security systems to support heavy traffic in the most effective manner.
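
A hedged, highly simplified illustration of the partition-plus-table idea (not the paper's algorithm, which merges partition decision trees): rules are grouped into a hash table by a coarse key so that a lookup only scans the small partition that could possibly match.

import ipaddress

rules = [
    {"id": 1, "dst_prefix": "10.0.0.0/8",     "dport": 80,  "action": "allow"},
    {"id": 2, "dst_prefix": "10.1.0.0/16",    "dport": 22,  "action": "deny"},
    {"id": 3, "dst_prefix": "192.168.0.0/16", "dport": 443, "action": "allow"},
]

table = {}
for r in rules:
    net = ipaddress.ip_network(r["dst_prefix"])
    key = int(net.network_address) >> 24          # coarse partition key: first octet
    table.setdefault(key, []).append((net, r))

def classify(dst_ip, dport):
    addr = ipaddress.ip_address(dst_ip)
    for net, r in table.get(int(addr) >> 24, []):  # scan only the matching partition
        if addr in net and dport == r["dport"]:
            return r["action"]
    return "default-deny"

print(classify("10.1.2.3", 22))     # -> "deny"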

2018-02-06
Verma, D. C., de Mel, G..  2017.  Measures of Network Centricity for Edge Deployment of IoT Applications. 2017 IEEE International Conference on Big Data (Big Data). :4612–4620.

Edge computing is a scheme to improve the performance, latency, and security guarantees of IoT applications. However, edge deployment of an application also comes with additional management complexity, an increased attack surface for security vulnerabilities, and could potentially result in a more expensive solution. As a result, the conditions under which an edge deployment of IoT applications delivers a better solution are not always obvious. Metrics that can predict whether or not an IoT application is suitable for edge deployment would provide useful insights to address this question. In this paper, we examine the key performance indicators for IoT applications, namely the responsiveness, scalability, and cost models for different types of IoT applications. Our analysis identifies that the network centrality of an IoT application is a key characteristic which determines whether or not an IoT application is a good candidate for edge deployment. We discuss the different measures of network centrality that can be used to characterize applications, and the relative performance of edge deployment compared to centralized deployment for various IoT applications.
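
A hedged sketch of using a simple network-centrality measure to flag an edge-deployment candidate (not the paper's model; the graph, the choice of degree centrality, and the threshold are all assumptions):

edges = [                      # (device, service) communication links in an IoT application
    ("cam1", "aggregator"), ("cam2", "aggregator"), ("cam3", "aggregator"),
    ("aggregator", "cloud-analytics"),
]

def degree_centrality(edges):
    nodes = {n for e in edges for n in e}
    deg = {n: 0 for n in nodes}
    for a, b in edges:
        deg[a] += 1
        deg[b] += 1
    n = len(nodes)
    return {node: d / (n - 1) for node, d in deg.items()}

centrality = degree_centrality(edges)
EDGE_THRESHOLD = 0.5           # hypothetical cut-off
candidate = max(centrality, key=centrality.get)
print(candidate, centrality[candidate],
      "edge deployment" if centrality[candidate] >= EDGE_THRESHOLD else "centralized")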

2018-02-02
Whitmore, J., Tobin, W..  2017.  Improving Attention to Security in Software Design with Analytics and Cognitive Techniques. 2017 IEEE Cybersecurity Development (SecDev). :16–21.

There is a widening chasm between the ease of creating software and the difficulty of "building security in". This paper reviews the approach, the findings, and recent experiments from a seven-year effort to enable consistency across a large, diverse development organization and software portfolio via policies, guidance, automated tools, and services. Experience shows that developing secure software is an elusive goal for most. It requires every team to know and apply a wide range of security knowledge in the context of what software is being built, how the software will be used, and the projected threats in the environment where the software will operate. The drive for better outcomes for secure development and increased developer productivity led to experiments to augment developer knowledge and eventually realize the goal of "building the right security in".

2018-01-23
AbuAli, N. A., Taha, A. E. M..  2017.  A dynamic scalable scheme for managing mixed crowds. 2017 IEEE International Conference on Communications (ICC). :1–5.

Crowd management in urban settings has mostly relied on either classical, non-automated mechanisms or spontaneous notifications/alerts through social networks. Such management techniques are heavily marred by a lack of comprehensive control, especially in terms of averting risks in a manner that ensures crowd safety and enables prompt emergency response. In this paper, we propose a Markov Decision Process (MDP) scheme to realize a smart infrastructure that is directly aimed at crowd management. A key emphasis of the scheme is robust and reliable scalability that provides sufficient flexibility to manage a mixed crowd (i.e., pedestrians, cyclists, manned vehicles and unmanned vehicles). The infrastructure also spans various population settings (e.g., roads, buildings, game arenas, etc.). To realize a reliable and scalable crowd management scheme, the classical MDP is decomposed into Local MDPs (L-MDPs) with smaller action-state spaces. Preliminary results show that the MDP decomposition can reduce the system's global cost and facilitate fast convergence to a local near-optimal solution for each L-MDP.
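
A hedged sketch of value iteration on a tiny, made-up local MDP (L-MDP), to illustrate the kind of sub-problem the decomposition produces; the states, actions, transition probabilities and costs are hypothetical:

STATES = ["normal", "congested"]
ACTIONS = ["keep_open", "reroute"]
GAMMA = 0.9

# P[s][a] = list of (probability, next_state, cost)
P = {
    "normal": {
        "keep_open": [(0.8, "normal", 0.0), (0.2, "congested", 5.0)],
        "reroute":   [(1.0, "normal", 1.0)],
    },
    "congested": {
        "keep_open": [(0.6, "congested", 10.0), (0.4, "normal", 2.0)],
        "reroute":   [(0.9, "normal", 3.0), (0.1, "congested", 6.0)],
    },
}

V = {s: 0.0 for s in STATES}
for _ in range(100):                               # value iteration (minimizing expected cost)
    V = {s: min(sum(p * (c + GAMMA * V[s2]) for p, s2, c in P[s][a])
                for a in ACTIONS)
         for s in STATES}

policy = {s: min(ACTIONS, key=lambda a: sum(p * (c + GAMMA * V[s2])
                                            for p, s2, c in P[s][a]))
          for s in STATES}
print(V, policy)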

2018-01-10
Vincur, J., Navrat, P., Polasek, I..  2017.  VR City: Software Analysis in Virtual Reality Environment. 2017 IEEE International Conference on Software Quality, Reliability and Security Companion (QRS-C). :509–516.
This paper presents a software visualization tool that utilizes a modified city metaphor to represent a software system and related analysis data in a virtual reality environment. To better address all three kinds of software aspects, we propose a new layout algorithm that provides a higher level of detail and positions the buildings according to the coupling between the classes that they represent. The resulting layout allows us to visualize software metrics and source code modifications at the granularity of methods, visualize method invocations involved in program execution, and support remodularization analysis. To further reduce the cognitive load and increase the efficiency of 3D visualization, we allow users to observe and interact with our city in an immersive virtual reality environment that also provides a source code browsing feature. We demonstrate the use of our approach on two open-source systems.
2017-08-22
Lazarova-Molnar, Sanja, Logason, Halldór Þór, Andersen, Peter Grønbæk, Kjærgaard, Mikkel Baun.  2016.  Mobile Crowdsourcing of Data for Fault Detection and Diagnosis in Smart Buildings. Proceedings of the International Conference on Research in Adaptive and Convergent Systems. :12–17.

Energy use of buildings represents roughly 40% of the overall energy consumption. Most national agendas contain goals related to reducing energy consumption and carbon footprint. Timely and accurate fault detection and diagnosis (FDD) in building management systems (BMS) has the potential to reduce energy consumption cost by approximately 15-30%. Most FDD methods are data-based, meaning that their performance is tightly linked to the quality and availability of relevant data. Based on our experience, data on faults and relevant events are very sparse and inadequate, mostly because of the lack of will and incentive for those who would need to keep track of faults. In this paper we introduce the idea of using crowdsourcing to support FDD data collection processes, and illustrate our idea through a mobile application that has been implemented for this purpose. Furthermore, we propose a strategy for successfully deploying this building occupants' crowdsourcing application.

2017-03-08
Varma, P..  2015.  Building an Open Identity Platform for India. 2015 Asia-Pacific Software Engineering Conference (APSEC). :3–3.

Summary form only given. Aadhaar, India's Unique Identity Project, has become the largest biometric identity system in the world, already covering more than 920 million people. Building such a massive system required significant design thinking, alignment with the core strategy, and a technology platform that is scalable enough to meet the project's objective. The entire technology architecture behind Aadhaar is based on principles of openness, linear scalability, strong security, and, most importantly, vendor neutrality. All application components are built using open source components and open standards. The Aadhaar system currently runs across two data centers within India managed by UIDAI and handles 1 million enrollments a day, at the peak doing about 900 trillion biometric matches a day. The current system has about 8 PB (8000 terabytes) of raw data. The Aadhaar Authentication service, which requires sub-second response time, is already live and can handle more than 100 million authentications a day. In this talk, the speaker, who has been the Chief Architect of Aadhaar since its inception, shares his experience of building the system.

2017-02-21
Clark, M., Lampe, L..  2015.  Single-channel compressive sampling of electrical data for non-intrusive load monitoring. 2015 IEEE Global Conference on Signal and Information Processing (GlobalSIP). :790–794.

Non-intrusive load monitoring (NILM) extracts information about how energy is being used in a building from electricity measurements collected at a single location. Obtaining measurements at only one location is attractive because it is inexpensive and convenient, but it can result in large amounts of data from high frequency electrical measurements. Different ways to compress or selectively measure this data are therefore required for practical implementations of NILM. We explore the use of random filtering and random demodulation, techniques that are closely related to compressed sensing, to offer a computationally simple way of compressing the electrical data. We show how these techniques can allow one to reduce the sampling rate of the electricity measurements, while requiring only one sampling channel and allowing accurate NILM performance. Our tests are performed using real measurements of electrical signals from a public data set, thus demonstrating their effectiveness on real appliances and allowing for reproducibility and comparison with other data management strategies for NILM.
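
A hedged sketch of random demodulation-style compression of an electrical waveform (illustrative only, not the authors' exact measurement chain; the rates and window length are assumptions): the signal is multiplied by a pseudorandom +/-1 chipping sequence, integrated over short windows, and sampled once per window.

import numpy as np

rng = np.random.default_rng(0)
fs = 12_000                               # original sampling rate (Hz), assumed
t = np.arange(fs) / fs                    # one second of signal
signal = np.sin(2 * np.pi * 60 * t) + 0.3 * np.sin(2 * np.pi * 180 * t)   # 60 Hz + 3rd harmonic

chips = rng.choice([-1.0, 1.0], size=signal.size)    # pseudorandom demodulation sequence
window = 100                                          # integrate-and-dump length
mixed = signal * chips
compressed = mixed.reshape(-1, window).sum(axis=1)    # one sample per window

print(signal.size, "->", compressed.size, "samples")  # 12000 -> 120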

2015-05-05
Pirinen, R..  2014.  Studies of Integration Readiness Levels: Case Shared Maritime Situational Awareness System. Intelligence and Security Informatics Conference (JISIC), 2014 IEEE Joint. :212-215.

The research question of this study is: how can Integration Readiness Level (IRL) metrics be understood and realized in the domain of border control information systems? The study addresses the IRL metrics and their definition, criteria, references, and questionnaires for the validation of border control information systems, in the case of the shared maritime situational awareness system. The study targets improvements in the ways of acceptance, operational validation, risk assessment, and the development of sharing mechanisms, integration of information systems, and border control information interactions and collaboration concepts in the Finnish national and European border control domains.
 

Srivastava, M..  2014.  In Sensors We Trust – A Realistic Possibility? Distributed Computing in Sensor Systems (DCOSS), 2014 IEEE International Conference on. :1-1.

Sensors of diverse capabilities and modalities, carried by us or deeply embedded in the physical world, have invaded our personal, social, work, and urban spaces. Our relationship with these sensors is a complicated one. On the one hand, these sensors collect rich data that are shared and disseminated, often initiated by us, with a broad array of service providers, interest groups, friends, and family. Embedded in this data is information that can be used to algorithmically construct a virtual biography of our activities, revealing intimate behaviors and lifestyle patterns. On the other hand, we and the services we use increasingly depend, directly and indirectly, on information originating from these sensors for making a variety of decisions, both routine and critical, in our lives. The quality of these decisions and our confidence in them depend directly on the quality of the sensory information and our trust in the sources. Sophisticated adversaries, benefiting from the same technology advances as the sensing systems, can manipulate sensory sources and analyze data in subtle ways to extract sensitive knowledge, cause erroneous inferences, and subvert decisions. The consequences of these compromises will only amplify as our society increasingly builds complex human-cyber-physical systems with increased reliance on sensory information and real-time decision cycles. Drawing upon examples of this two-faceted relationship with sensors in applications such as mobile health and sustainable buildings, this talk will discuss the challenges inherent in designing a sensor information flow and processing architecture that is sensitive to the concerns of both producers and consumers. For the pervasive sensing infrastructure to be trusted by both, it must be robust to active adversaries who are deceptively extracting private information, manipulating beliefs, and subverting decisions. While completely solving these challenges would require a new science of resilient, secure, and trustworthy networked sensing and decision systems that would combine the hitherto separate disciplines of distributed embedded systems, network science, control theory, security, behavioral science, and game theory, this talk will provide some initial ideas. These include an approach to enabling privacy-utility trade-offs that balance the tension between the risk of information sharing to the producer and the value of information sharing to the consumer, and a method to secure systems against physical manipulation of sensed information.
 

Campbell, S..  2014.  Open science, open security. High Performance Computing Simulation (HPCS), 2014 International Conference on. :584-587.

We propose that, to address the growing problems with complexity and data volumes in HPC security, we need to refactor how we look at data by creating tools that not only select data, but analyze and represent it in a manner well suited for intuitive analysis. We propose a set of rules describing what this means, and provide a number of production-quality tools that represent our current best effort in implementing these ideas.
 

2015-05-04
Wiesner, K., Feld, S., Dorfmeister, F., Linnhoff-Popien, C..  2014.  Right to silence: Establishing map-based Silent Zones for participatory sensing. Intelligent Sensors, Sensor Networks and Information Processing (ISSNIP), 2014 IEEE Ninth International Conference on. :1-6.

Participatory sensing tries to create cost-effective, large-scale sensing systems by leveraging sensors embedded in mobile devices. One major challenge in these systems is to protect the users' privacy, since users will not contribute data if their privacy is jeopardized. Location data especially needs to be protected if it is likely to reveal information about the users' identities. A common solution is the blinding-out approach, which creates so-called ban zones in which location data is not published. Thereby, a user's important places, e.g., her home or workplace, can be concealed. However, ban zones of a fixed size are not able to guarantee any particular level of privacy. For instance, a ban zone that is large enough to conceal a user's home in a large city might be too small in a less populated area. For this reason, we propose an approach for dynamic map-based blinding out: the boundaries of our privacy zones, called Silent Zones, are determined in such a way that at least k buildings are located within the zone. Thus, our approach adapts to the habitat density, and we can guarantee k-anonymity in terms of surrounding buildings. In this paper, we present two new algorithms for creating Silent Zones and evaluate their performance. Our results show that, especially in worst-case scenarios, i.e., in sparsely populated areas, our approach outperforms standard ban zones and guarantees the specified privacy level.
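
A hedged sketch of the core idea (not the paper's two algorithms): grow the radius of a Silent Zone around a sensitive location until it contains at least k buildings, so the zone adapts to building density; coordinates and distances are simplified to a flat plane and the building positions are made up.

import math

buildings = [(0.2, 0.1), (0.5, -0.4), (1.8, 0.9), (2.5, 2.1), (-3.0, 0.7)]   # km offsets

def silent_zone_radius(center, buildings, k, step_km=0.1, max_km=50.0):
    radius = step_km
    while radius <= max_km:
        inside = sum(1 for b in buildings if math.dist(center, b) <= radius)
        if inside >= k:
            return radius
        radius += step_km
    raise ValueError("fewer than k buildings within max_km")

home = (0.0, 0.0)
print(silent_zone_radius(home, buildings, k=3))   # radius (km) needed to cover at least 3 buildings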
