Biblio

Filters: Keyword is interoperability
2020-06-01
Dhas, Y. Justin, Jeyanthi, P..  2019.  A Review on Internet of Things Protocol and Service Oriented Middleware. 2019 International Conference on Communication and Signal Processing (ICCSP). :0104–0108.
This paper presents a review of Internet of Things (IoT) protocols and service-oriented middleware for the IoT. Modern IoT development is expected to create many diverse applications in health care that operate without human intervention. Various protocols are involved in developing such applications, and researchers continue to search for a protocol that provides all the desired functionality. Middleware for the IoT provides interoperability between devices and applications. IoT architectures are often based on Service Oriented Architecture (SOA), which operates as middleware. We survey the existing SOA-based IoT middleware and its functionalities.
2020-05-11
Enos, James R., Nilchiani, Roshanak R..  2018.  Merging DoDAF architectures to develop and analyze the DoD network of systems. 2018 IEEE Aerospace Conference. :1–9.
The Department of Defense (DoD) manages capabilities through the Joint Capabilities Integration and Development System (JCIDS) process. As part of this process, sponsors develop a series of DoD Architecture Framework (DoDAF) products to help analysts understand the proposed capability and how it fits into the broader network of DoD legacy systems and systems under development. However, the Joint Staff, responsible for executing the JCIDS process, often analyzes these architectures in isolation without considering the broader network of systems. DoD leadership, the Government Accountability Office, and others have noted in various reports and papers the DoD's limited ability to manage the broader portfolio of capabilities. Several efforts have proposed merging DoDAF architectures into a larger meta-architecture based on individual system architectures. This paper specifically targets the Systems View 3 (SV-3), the system-to-system matrix, as an opportunity to merge multiple DoDAF architecture views into a network of systems and to understand the potential benefits of analyzing from this broader perspective. The goal of merging multiple SV-3s is to better understand the interoperability of a system within the network of DoD systems, since network metrics may provide insights into the relative interoperability of a DoD system. Currently, the DoD's definition of interoperability focuses on the system or capability's ability to enter and operate within the DoD Information Network (DoDIN); however, this view limits the definition of interoperability as it focuses solely on information flows and not the resource flows or physical connections that should be present in an SV-3. The paper demonstrates the importance of including all forms of connections between systems in a network by comparing the network metrics associated with the different types of connections. Without a complete set of DoDAF architectures for each system within the DoD, and given the potential classification of these products, the paper collates data that should be included in an SV-3 from open-source, unclassified references to build the overall network of DoD systems. From these sources, a network of over 300 systems with almost 1000 connections emerges, based on the documented information, resource, and physical connections between these legacy and planned DoD systems. With this network, the paper explores the quantification of individual systems' interoperability through the application of nodal and network metrics from social network analysis (SNA). An SNA perspective on a network of systems provides additional insights beyond traditional network analysis because of its emphasis on the importance of the nodes (systems) in the network as well as the relationships (connections) between them. Finally, the paper proposes future work to explore the quantification of additional attributes of systems as well as a method for further validating the findings.
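
As a rough illustration of the social network analysis step described above (not the authors' data or tooling), the following Python sketch builds a small, invented network of systems with typed connections and computes nodal metrics such as degree and betweenness centrality using networkx; restricting the graph to information-only links shows why that narrower view understates connectedness.

    import networkx as nx

    # Hypothetical system-of-systems network; nodes, edges and connection types
    # are invented for illustration, not taken from the paper's collated dataset.
    G = nx.Graph()
    edges = [
        ("SystemA", "SystemB", "information"),
        ("SystemA", "SystemC", "resource"),
        ("SystemB", "SystemD", "physical"),
        ("SystemC", "SystemD", "information"),
        ("SystemD", "SystemE", "information"),
    ]
    for src, dst, kind in edges:
        G.add_edge(src, dst, connection_type=kind)

    # Nodal metrics often used as proxies for a system's relative interoperability.
    degree = nx.degree_centrality(G)
    betweenness = nx.betweenness_centrality(G)
    for system in G.nodes:
        print(f"{system}: degree={degree[system]:.2f}, "
              f"betweenness={betweenness[system]:.2f}")

    # Comparing metrics on a subgraph restricted to one connection type shows
    # why an information-only view understates a system's connectedness.
    info_only = G.edge_subgraph(
        (u, v) for u, v, d in G.edges(data=True)
        if d["connection_type"] == "information"
    )
    print("Info-only degree centrality:", nx.degree_centrality(info_only))
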
2020-01-27
Li, Zhangtan, Cheng, Liang, Zhang, Yang.  2019.  Tracking Sensitive Information and Operations in Integrated Clinical Environment. 2019 18th IEEE International Conference On Trust, Security And Privacy In Computing And Communications/13th IEEE International Conference On Big Data Science And Engineering (TrustCom/BigDataSE). :192–199.
Integrated Clinical Environment (ICE) is a standardized framework for achieving device interoperability in medical cyber-physical systems. The ICE utilizes high-level supervisory apps and a low-level communication middleware to coordinate medical devices. The need to design complex ICE systems that are both safe and effective has presented numerous challenges, including interoperability, context-aware intelligence, security and privacy. In this paper, we present a data flow analysis framework for ICE systems. The framework performs a combination of static and dynamic analysis of the sensitive data and operations in ICE systems. Our experiments demonstrate that the data flow analysis framework can record how medical devices transmit sensitive data and can perform misuse detection by tracing the runtime context of sensitive operations.
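
As a loose illustration of dynamic tracking of sensitive data flows (a toy stand-in, not the ICE analysis framework from the paper), the Python sketch below tags values with their origin and flags sink operations that receive tainted data outside an allowed policy; all names and the policy itself are invented.

    from dataclasses import dataclass

    @dataclass
    class Tainted:
        """Wraps a sensitive value together with a label for its origin."""
        value: object
        source: str

    # Policy: which sinks may receive data from which sources (illustrative only).
    ALLOWED_SINKS = {"patient_monitor": {"clinical_display", "alarm_service"}}

    def read_heart_rate() -> Tainted:
        # Stand-in for a reading delivered by a device over the ICE middleware.
        return Tainted(value=72, source="patient_monitor")

    def send(sink: str, data) -> None:
        """Sink operation: refuse tainted data flowing to unapproved sinks."""
        if isinstance(data, Tainted):
            if sink not in ALLOWED_SINKS.get(data.source, set()):
                raise PermissionError(
                    f"sensitive data from {data.source} must not flow to {sink}")
            data = data.value
        print(f"{sink} <- {data}")

    send("clinical_display", read_heart_rate())      # permitted flow
    try:
        send("cloud_analytics", read_heart_rate())   # detected misuse
    except PermissionError as err:
        print("blocked:", err)
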
2020-01-20
Klarin, K., Nazor, I., Celar, S..  2019.  Ontology literature review as guidelines for improving Croatian Qualification Framework. 2019 42nd International Convention on Information and Communication Technology, Electronics and Microelectronics (MIPRO). :1402–1407.

Development of information systems dealing with education and the labour market using web and grid service architecture enables their modularity, expandability and interoperability. Applying ontologies to the web helps with collecting and selecting the knowledge about a certain field in a generic way, thus enabling different applications to understand, use, reuse and share that knowledge among them. A necessary step before publishing computer-interpretable data on the public web is the implementation of common standards that will ensure the exchange of information. The Croatian Qualification Framework (CROQF) is a project for the standardization of occupations for the labour market, as well as the standardization of sets of qualifications, skills and competences and their mutual relations. This paper analyses a considerable body of research dealing with the application of ontologies to information systems in education during the last decade. The main goal is to compare the achieved results according to: 1) phases of development/classifications of education-related ontologies; 2) areas of education; and 3) standards and structures of metadata for educational systems. The collected information is used to provide insight into the building blocks of CROQF, both those well supported by experience and best practices and those that are not, together with guidelines for developing its own standards using ontological structures.

2020-01-13
Mohamed, Nader, Al-Jaroodi, Jameela.  2019.  A Middleware Framework to Address Security Issues in Integrated Multisystem Applications. 2019 IEEE International Systems Conference (SysCon). :1–6.
Integrating multiple programmable components and subsystems developed by different manufacturers into a final system (a system of systems) can create security concerns. While there are many efforts to develop interoperability approaches that enable smooth, reliable and safe integration among different types of components to build final systems for different applications, less attention is usually given to the security aspects of this integration. This may leave the final systems exposed and vulnerable to potential security attacks. The issues escalate further when such systems are also connected to other networks such as the Internet or to systems like fog and cloud computing. This issue can be found in important industrial applications like smart medical, smart manufacturing and smart city systems. As a result, along with performance, safety and reliability, multisystem integration must also be highly secure. This paper discusses the security issues instigated by such integration. In addition, it proposes a middleware framework to address the security issues for integrated multisystem applications.
2019-11-25
Cui, Hongyan, Chen, Zunming, Xi, Yu, Chen, Hao, Hao, Jiawang.  2019.  IoT Data Management and Lineage Traceability: A Blockchain-based Solution. 2019 IEEE/CIC International Conference on Communications Workshops in China (ICCC Workshops). :239–244.
The Internet of Things is stepping out of its infancy into full maturity, requiring massive data processing and storage. Unfortunately, because of the unique characteristics of resource constraints, short-range communication, and self-organization in IoT, it often resorts to the cloud or fog nodes for outsourced computation and storage, which has brought about a series of novel and challenging security and privacy threats. For this reason, one of the critical challenges of having numerous IoT devices is the capacity to manage them and their data. A specific concern is from which devices or edge clouds to accept join requests or interaction requests. This paper discusses a design concept for developing an IoT data management platform, along with a data management and lineage traceability implementation of the platform based on blockchain and smart contracts, which addresses two major challenges: how to implement effective data management and enable rational interoperability for trusted groups of linked Things, and how to settle conflicts between untrusted IoT devices and their requests while taking security and privacy preservation into account. Experimental results show that the system scales well, with the loss of computing and communication performance remaining within an acceptable range, and that it effectively defends against unauthorized access while enabling data provenance and transparency. This verifies the feasibility and efficiency of the design concept in providing private, fine-grained, integrity-preserving data management over IoT devices through the blockchain-based data management platform.
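
The Python sketch below is a deliberately simplified, single-process illustration of the hash-chained lineage idea underlying such a blockchain-based data management layer; it is not the authors' platform and omits consensus, smart contracts, and access control entirely.

    import hashlib
    import json
    import time

    def block_hash(block: dict) -> str:
        """Deterministic hash of a block's canonical JSON form."""
        return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

    class LineageLedger:
        """Append-only chain of data-lineage records for IoT readings."""
        def __init__(self):
            self.chain = [{"index": 0, "prev": "0" * 64,
                           "record": "genesis", "timestamp": 0.0}]

        def append(self, record: dict) -> dict:
            block = {"index": len(self.chain),
                     "prev": block_hash(self.chain[-1]),
                     "record": record,
                     "timestamp": time.time()}
            self.chain.append(block)
            return block

        def verify(self) -> bool:
            # Any tampering with an earlier record breaks the hash links.
            return all(self.chain[i]["prev"] == block_hash(self.chain[i - 1])
                       for i in range(1, len(self.chain)))

    ledger = LineageLedger()
    ledger.append({"device": "sensor-42", "op": "produce", "data_id": "d1"})
    ledger.append({"device": "edge-gw-7", "op": "transform", "data_id": "d1"})
    print("lineage intact:", ledger.verify())
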
2019-11-19
Wimmer, Maria A., Boneva, Rositsa, di Giacomo, Debora.  2018.  Interoperability Governance: A Definition and Insights from Case Studies in Europe. Proceedings of the 19th Annual International Conference on Digital Government Research: Governance in the Data Age. :14:1-14:11.

Interoperability has become a crucial value in European e-government developments, as promoted by the Digital Single Market strategy and the Tallinn Declaration. The European Union and its Member States have made considerable investments in improving the understanding of interoperability and in developing interoperable building blocks to support cross-border data exchange and public service provisioning. This includes recent updates of the European Interoperability Framework (EIF) and the European Interoperability Reference Architecture (EIRA), as well as the publication of a number of generic and domain-specific architecture and solution building blocks such as digital identification or electronic delivery services. While interoperability governance was not clearly developed in the previous version of the EIF, the new version of 2017 positions interoperability governance as a concept that spans the different interoperability layers (legal, organizational, semantic and technical) and that builds the frame for interoperability overall. In this paper, we develop a definition of interoperability governance from a literature review and put forward a model to investigate interoperability governance models at European and Member State levels. Based on several case studies of EU institutions and Member States, we draw recommendations on the key aspects of interoperability governance needed to successfully diffuse interoperability into public service provisioning.

2019-08-05
Akkermans, Sven, Crispo, Bruno, Joosen, Wouter, Hughes, Danny.  2018.  Polyglot CerberOS: Resource Security, Interoperability and Multi-Tenancy for IoT Services on a Multilingual Platform. Proceedings of the 15th EAI International Conference on Mobile and Ubiquitous Systems: Computing, Networking and Services. :59–68.
The Internet of Things (IoT) promises to tackle a range of environmental challenges and deliver large efficiency gains in industry by embedding computational intelligence, sensing and control in our physical environment. Multiple independent parties are increasingly seeking to leverage shared IoT infrastructure, using a model similar to the cloud, and thus require constrained IoT devices to become microservice-hosting platforms that can securely and concurrently execute their code and interoperate. This vision demands that heterogeneous services, peripherals and platforms be provided with an expanded set of security guarantees: protection against third-party services hijacking the platform, resource-level access control and accounting, and strong isolation between running processes to prevent unauthorized access to third-party services and data. This paper introduces Polyglot CerberOS, a resource-secure operating system for multi-tenant IoT devices that is realised through a reconfigurable virtual machine which can simultaneously execute interoperable services written in different languages. We evaluate Polyglot CerberOS on IETF Class-1 devices running both Java and C services. The results show that interoperability and strong security guarantees for multilingual services on multi-tenant commodity IoT devices are feasible, in terms of performance and memory overhead, and transparent for developers.
2019-03-28
Schroeder, Jill M., Manz, David O., Amaya, Jodi P., McMakin, Andrea H., Bays, Ryan M..  2018.  Understanding Past, Current and Future Communication and Situational Awareness Technologies for First Responders. Proceedings of the Fifth Cybersecurity Symposium. :2:1-2:14.
This study builds a foundation for improving research for first responder communication and situational awareness technology in the future. In an online survey, we elicited the opinions of 250 U.S. first responders about effectiveness, security, and reliability of past, current, and future Internet of Things technology. The most desired features respondents identified were connectivity, reliability, interoperability, and affordability. The top barriers to technology adoption and use included restricted budgets/costs, interoperability, insufficient training resources, and insufficient interagency collaboration and communication. First responders in all job types indicated that technology has made first responder equipment more useful, and technology that supports situational awareness is particularly valued. As such, future Internet of Things capabilities, such as tapping into smart device data in residences and piggybacking onto alternative communication channels, could be valuable for future first responders. Potential areas for future investigation are suggested for technology development and research.
2019-01-16
Ayers, Hudson, Crews, Paul Thomas, Teo, Hubert Hua Kian, McAvity, Conor, Levy, Amit, Levis, Philip.  2018.  Design Considerations for Low Power Internet Protocols. Proceedings of the 16th ACM Conference on Embedded Networked Sensor Systems. :317–318.
Examining implementations of the 6LoWPAN Internet Standard in major embedded operating systems, we observe that they do not fully interoperate. We find this is due to some inherent design flaws in 6LoWPAN. We propose and demonstrate four principles that can be used to structure protocols for low power devices that encourage interoperability between diverse implementations.
2018-06-11
Andročec, D., Tomaš, B., Kišasondi, T..  2017.  Interoperability and lightweight security for simple IoT devices. 2017 40th International Convention on Information and Communication Technology, Electronics and Microelectronics (MIPRO). :1285–1291.

The Semantic Web can be used to enable the interoperability of IoT devices and to annotate their functional and non-functional properties, including security and privacy. In this paper, we show how to use an ontology and JSON-LD to annotate the connectivity, security and privacy properties of IoT devices. Building on that, we present our prototype of a lightweight, secure application-level protocol wrapper that ensures communication consistency, secrecy and integrity for low-cost IoT devices such as the ESP8266 and the Particle Photon.
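
As a minimal sketch of annotating an IoT device's connectivity, security and privacy properties with JSON-LD, a device description might look like the following; the vocabulary IRI and term names are invented placeholders, not the ontology used in the paper.

    import json

    # Hypothetical vocabulary IRI; the paper's actual ontology terms will differ.
    device_description = {
        "@context": {
            "iot": "http://example.org/iot-security#",
            "name": "iot:deviceName",
            "connectivity": "iot:connectivity",
            "transportSecurity": "iot:transportSecurity",
            "storesPersonalData": "iot:storesPersonalData",
        },
        "@id": "urn:dev:esp8266-kitchen-01",
        "name": "ESP8266 temperature node",
        "connectivity": ["WiFi-802.11n", "MQTT"],
        "transportSecurity": "AES-128-CCM",      # illustrative value
        "storesPersonalData": False,
    }

    print(json.dumps(device_description, indent=2))
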

2018-02-02
Grewe, D., Wagner, M., Frey, H..  2017.  ICN-based open, distributed data market place for connected vehicles: Challenges and research directions. 2017 IEEE International Conference on Communications Workshops (ICC Workshops). :265–270.

Currently, the networking of everyday objects such as vehicles and home automation environments, the so-called Internet of Things (IoT), is progressing rapidly. Formerly deployed as domain-specific solutions, these systems are increasingly being linked across domains to form a large heterogeneous IoT ecosystem. This development raises challenges in different fields such as the scalability of billions of devices, interoperability across different IoT domains and the need for mobility support. The Information-Centric Networking (ICN) paradigm is a promising candidate to form a unified platform connecting different IoT domains, including infrastructure, wireless, and ad-hoc environments. This paper describes a vision of a harmonized architectural design providing dynamic access to data and services based on an ICN. Within the context of connected vehicles, the paper introduces the requirements and challenges of this vision and contributes open research directions in Information-Centric Networking.

2017-12-28
Liu, X., Leon-Garcia, A., Zhu, P..  2017.  A distributed software-defined multi-agent architecture for unifying IoT applications. 2017 8th IEEE Annual Information Technology, Electronics and Mobile Communication Conference (IEMCON). :49–55.

During the development and expansion of the Internet of Things (IoT), the main challenges to be addressed are the heterogeneity, interoperability, scalability, flexibility and security of IoT applications. In this paper, we view the IoT as a large-scale distributed cyber-physical-social complex network and analyze the above challenges from that perspective. We then propose a distributed multi-agent architecture to unify a number of different IoT applications by designing software-defined sensors, actuators and controllers. Furthermore, we analyze the proposed architecture and clarify why and how it can tackle the heterogeneity of IoT applications, enable them to interoperate with each other, make it efficient to introduce new applications, and enhance the flexibility and security of different applications. Finally, the use case of a smart home with multiple applications is applied to verify the feasibility of the proposed IoT architecture.

2017-12-12
Jiang, L., Kuhn, W., Yue, P..  2017.  An interoperable approach for Sensor Web provenance. 2017 6th International Conference on Agro-Geoinformatics. :1–6.

The Sensor Web is evolving into a complex information space, where large volumes of sensor observation data are often consumed by complex applications. Provenance has become an important issue in the Sensor Web, since it allows applications to answer “what”, “when”, “where”, “who”, “why”, and “how” queries related to observations and consumption processes, which helps determine the usability and reliability of data products. This paper investigates characteristics and requirements of provenance in the Sensor Web and proposes an interoperable approach to building a provenance model for the Sensor Web. Our provenance model extends the W3C PROV Data Model with Sensor Web domain vocabularies. It is developed using Semantic Web technologies and thus allows provenance information of sensor observations to be exposed in the Web of Data using the Linked Data approach. A use case illustrates the applicability of the approach.
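A minimal sketch of this modelling style using rdflib: a sensor observation is expressed as a PROV entity generated by a sensing activity, with domain terms added under an invented sensor-web namespace (the namespace and its terms are illustrative assumptions, not the paper's vocabulary).

    from rdflib import Graph, Namespace, Literal, RDF, URIRef

    PROV = Namespace("http://www.w3.org/ns/prov#")
    SW = Namespace("http://example.org/sensorweb#")   # illustrative namespace

    g = Graph()
    g.bind("prov", PROV)
    g.bind("sw", SW)

    obs = URIRef("http://example.org/obs/temp-001")
    sensing = URIRef("http://example.org/activity/sensing-001")
    sensor = URIRef("http://example.org/sensor/ts-07")

    g.add((obs, RDF.type, PROV.Entity))
    g.add((obs, RDF.type, SW.Observation))            # domain extension of PROV
    g.add((sensing, RDF.type, PROV.Activity))
    g.add((obs, PROV.wasGeneratedBy, sensing))
    g.add((sensing, PROV.wasAssociatedWith, sensor))
    g.add((obs, SW.observedProperty, Literal("air_temperature")))

    # Exposing the provenance as Linked Data is then a matter of serialization.
    print(g.serialize(format="turtle"))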

Suh, Y. K., Ma, J..  2017.  SuperMan: A Novel System for Storing and Retrieving Scientific-Simulation Provenance for Efficient Job Executions on Computing Clusters. 2017 IEEE 2nd International Workshops on Foundations and Applications of Self* Systems (FAS*W). :283–288.

Compute-intensive simulations typically place substantial workloads on an online simulation platform backed by limited computing clusters and storage resources. Some (or most) of the simulations initiated by users may be accompanied by input parameters/files that have already been provided by other (or the same) users in the past. Unfortunately, these duplicate simulations may degrade the performance of the platform by drastically consuming the limited resources shared by a number of users on the platform. To minimize or avoid conducting repeated simulations, we present a novel system, called SUPERMAN (SimUlation ProvEnance Recycling MANager), that can record simulation provenances and recycle the results of past simulations. This system presents a great opportunity not only to reutilize existing results but also to perform various analytics helpful for those who are not familiar with the platform. The system also offers interoperability with other systems by collecting the provenances in a standardized format. In our simulated experiments we found that over half of past computing jobs could be answered by our system without actual executions.
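A minimal sketch of the result-recycling idea, assuming nothing about SUPERMAN's actual provenance store or format: hash the simulation's input parameters into a provenance key, reuse a stored result when the key has been seen before, and run the job only on a cache miss.

    import hashlib
    import json

    _results = {}   # provenance key -> stored result (stand-in for a provenance store)

    def provenance_key(params: dict) -> str:
        """Canonical hash of the input parameters; order-insensitive."""
        return hashlib.sha256(json.dumps(params, sort_keys=True).encode()).hexdigest()

    def run_simulation(params: dict) -> float:
        # Placeholder for an expensive job submitted to the computing cluster.
        return sum(v * v for v in params.values())

    def simulate(params: dict) -> float:
        key = provenance_key(params)
        if key in _results:
            print("recycled previous result")
            return _results[key]
        result = run_simulation(params)
        _results[key] = result
        return result

    simulate({"temperature": 3.0, "pressure": 2.0})   # executes the job
    simulate({"pressure": 2.0, "temperature": 3.0})   # recycled: same provenance key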

2017-11-13
Patti, E., Syrri, A. L. A., Jahn, M., Mancarella, P., Acquaviva, A., Macii, E..  2016.  Distributed Software Infrastructure for General Purpose Services in Smart Grid. IEEE Transactions on Smart Grid. 7:1156–1163.

In this paper, the design of an event-driven middleware for general-purpose services in the smart grid (SG) is presented. The main purpose is to provide a peer-to-peer distributed software infrastructure that allows multiple new, authorized actors to access SG information in order to provide new services. To achieve this, the proposed middleware has been designed to be: 1) event-based; 2) reliable; 3) secure against malicious information and communication technology attacks; and 4) able to provide hardware-independent interoperability between heterogeneous technologies. To demonstrate practical deployment, a numerical case study applied to the whole U.K. distribution network is presented, and the capabilities of the proposed infrastructure are discussed.
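A minimal sketch of the event-driven pattern such a middleware builds on (not the authors' implementation, which additionally addresses reliability, security and distribution): publishers emit topic-tagged events and registered subscribers receive them.

    from collections import defaultdict
    from typing import Callable

    class EventBus:
        """Toy publish/subscribe broker for smart-grid events."""
        def __init__(self):
            self._subscribers = defaultdict(list)

        def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
            self._subscribers[topic].append(handler)

        def publish(self, topic: str, event: dict) -> None:
            # Deliver the event to every handler registered for the topic.
            for handler in self._subscribers[topic]:
                handler(event)

    bus = EventBus()
    bus.subscribe("meter/reading", lambda e: print("DSO dashboard:", e))
    bus.subscribe("meter/reading", lambda e: print("Aggregator service:", e))
    bus.publish("meter/reading", {"meter_id": "UK-031", "kwh": 1.7})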

2017-08-02
Kubler, Sylvain, Robert, Jérémy, Hefnawy, Ahmed, Cherifi, Chantal, Bouras, Abdelaziz, Främling, Kary.  2016.  IoT-based Smart Parking System for Sporting Event Management. Proceedings of the 13th International Conference on Mobile and Ubiquitous Systems: Computing, Networking and Services. :104–114.

By connecting devices, people, vehicles and infrastructures everywhere in a city, governments and their partners can improve community wellbeing and other economic and financial aspects (e.g., cost and energy savings). Nonetheless, smart cities are complex ecosystems that comprise many different stakeholders (network operators, managed service providers, logistics centers...) who must work together to provide the best services and unlock the commercial potential of the IoT. This is one of the major challenges facing today's smart city movement, and more generally the IoT as a whole. Indeed, while new smart connected objects hit the market every day, they mostly feed "vertical silos" (e.g., vertical apps, siloed apps...) that are closed to the rest of the IoT, thus hampering developers from producing new added value across multiple platforms. Within this context, the contribution of this paper is twofold: (i) present the EU vision and ongoing activities to overcome the problem of vertical silos; (ii) introduce recent IoT standards used as part of a recent Horizon 2020 IoT project to address this problem. The implementation of those standards for enhanced sporting event management in a smart city/government context (FIFA World Cup 2022) is developed, presented, and evaluated as a proof of concept.

2017-02-27
Lever, K. E., Kifayat, K., Merabti, M..  2015.  Identifying interdependencies using attack graph generation methods. 2015 11th International Conference on Innovations in Information Technology (IIT). :80–85.

Information and communication technologies have augmented interoperability and rapidly advanced various industries, with vast complex interconnected networks being formed in areas such as safety-critical systems, which can be further categorised as critical infrastructures. What must also be considered is the paradigm of the Internet of Things, which is rapidly gaining prevalence within the field of wireless communications and is being incorporated into areas such as e-health and automation for industrial manufacturing. As critical infrastructures and the Internet of Things begin to integrate into much wider networks, their reliance upon communication assets provided by third parties to ensure collaboration and control of their systems will significantly increase, along with system complexity and the requirement for improved security metrics. We present a critical analysis of the risk assessment methods developed for generating attack graphs. The failings of these existing schemes include the inability to accurately identify the relationships and interdependencies between risks, and the inability to reduce attack graph size and generation complexity. Many existing methods also fail due to their heavy reliance on human intervention for input, identification of vulnerabilities, and analysis of results. Conveying our work, we outline our approach to modelling interdependencies within large heterogeneous collaborative infrastructures, proposing a distributed schema which utilises network modelling and attack graph generation methods to provide a means for vulnerabilities, exploits and conditions to be represented within a unified model.
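As a rough illustration of attack graph generation with invented hosts and vulnerabilities (not the schema proposed in the paper), the sketch below models exploit steps as directed edges and enumerates attack paths from an entry point to a critical asset, then uses a centrality metric to hint at interdependencies.

    import networkx as nx

    G = nx.DiGraph()
    # Each edge is an exploit step: an attacker at 'src' can reach 'dst'
    # via the labelled condition (all labels are illustrative).
    G.add_edge("internet", "web_server", exploit="unpatched web framework")
    G.add_edge("web_server", "app_server", exploit="weak credentials")
    G.add_edge("app_server", "scada_gateway", exploit="unpatched service")
    G.add_edge("internet", "vpn_concentrator", exploit="phished credentials")
    G.add_edge("vpn_concentrator", "scada_gateway", exploit="flat network")

    for path in nx.all_simple_paths(G, source="internet", target="scada_gateway"):
        print("attack path:", " -> ".join(path))

    # Nodes that appear on many attack paths indicate interdependencies
    # worth prioritising for mitigation.
    print("betweenness:", nx.betweenness_centrality(G))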

2017-02-21
Machado, M., Byers, J. W..  2015.  Linux XIA: an interoperable meta network architecture to crowdsource the future internet. 2015 ACM/IEEE Symposium on Architectures for Networking and Communications Systems (ANCS). :147-158.

With the growing number of proposed clean-slate redesigns of the Internet, the need for a medium that enables all stakeholders to participate in the realization, evaluation, and selection of these designs is increasing. We believe that the missing catalyst is a meta network architecture that welcomes most, if not all, clean-slate designs on a level playing field, lowers deployment barriers, and leaves the final evaluation to the broader community. This paper presents Linux XIA, a native implementation of XIA in the Linux kernel, as a candidate. We first describe Linux XIA in terms of its architectural realizations and algorithmic contributions. We then demonstrate how to port several distinct and unrelated network architectures onto Linux XIA. Finally, we provide a hybrid evaluation of Linux XIA at three levels of abstraction in terms of its ability to: evolve and foster interoperation of new architectures, embed disparate architectures inside the implementation's framework, and maintain a forwarding performance comparable to that of the legacy TCP/IP implementation. Given this evaluation, we substantiate a previously unsupported claim of XIA: that it readily supports and enables network evolution, collaboration, and interoperability - traits we view as central to the success of any future Internet architecture.

2015-05-05
Chenine, M., Ullberg, J., Nordstrom, L., Wu, Y., Ericsson, G.N..  2014.  A Framework for Wide-Area Monitoring and Control Systems Interoperability and Cybersecurity Analysis. Power Delivery, IEEE Transactions on. 29:633-641.

Wide-area monitoring and control (WAMC) systems are the next-generation operational-management systems for electric power systems. The main purpose of such systems is to provide high-resolution real-time situational awareness in order to improve the operation of the power system by detecting and responding to fast-evolving phenomena in power systems. From an information and communication technology (ICT) perspective, the nonfunctional qualities of these systems are becoming increasingly important, and there is a need to evaluate and analyze the factors that impact these nonfunctional qualities. Enterprise architecture methods, which capture properties of ICT systems in architecture models and use these models as a basis for analysis and decision making, are a promising approach to meet these challenges. This paper presents a quantitative architecture analysis method for the study of WAMC ICT architectures, focusing primarily on the interoperability and cybersecurity aspects.
 

2015-05-04
Cherkaoui, A., Bossuet, L., Seitz, L., Selander, G., Borgaonkar, R..  2014.  New paradigms for access control in constrained environments. Reconfigurable and Communication-Centric Systems-on-Chip (ReCoSoC), 2014 9th International Symposium on. :1-4.

The Internet of Things (IoT) is here: more than 10 billion units are already connected, and five times more devices are expected to be deployed in the next five years. Technological standardization and the management and fostering of rapid innovation by governments are among the main challenges of the IoT. However, security and privacy are the key to making the IoT reliable and trusted. Security mechanisms for the IoT should provide features such as scalability, interoperability and lightness. This paper addresses authentication and access control in the frame of the IoT. It presents Physical Unclonable Functions (PUF), which can provide cheap, secure, tamper-proof secret keys to authenticate constrained M2M devices. To be successfully used in the IoT context, this technology needs to be embedded in a standardized identity and access management framework. On the other hand, the Embedded Subscriber Identity Module (eSIM) can provide cellular connectivity with scalability, interoperability and standard-compliant security protocols. The paper discusses an authorization scheme for a constrained resource server taking advantage of PUF and eSIM features. Concrete IoT use cases are discussed (SCADA and building automation).
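A minimal sketch of the challenge-response authentication pattern that PUFs enable, under a large assumption: a keyed HMAC stands in for the physical function, whereas a real PUF derives responses from manufacturing variation and stores no key at all. Names and parameters are illustrative.

    import hashlib
    import hmac
    import secrets

    class SimulatedPUF:
        """Software stand-in for a PUF: deterministic per-device challenge responses."""
        def __init__(self):
            self._device_secret = secrets.token_bytes(32)   # a real PUF stores no key

        def response(self, challenge: bytes) -> bytes:
            return hmac.new(self._device_secret, challenge, hashlib.sha256).digest()

    # Enrollment: the verifier records challenge-response pairs for the device.
    device_puf = SimulatedPUF()
    enrolled_crps = {}
    for _ in range(4):
        c = secrets.token_bytes(16)
        enrolled_crps[c] = device_puf.response(c)

    # Authentication: the verifier issues an unused challenge and checks the response.
    challenge, expected = enrolled_crps.popitem()
    claimed = device_puf.response(challenge)
    print("device authenticated:", hmac.compare_digest(claimed, expected))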

Friedman, A., Hu, V.C..  2014.  Presentation 9. Attribute assurance for attribute based access control. IT Professional Conference (IT Pro), 2014. :1-3.

In recent years, Attribute Based Access Control (ABAC) has evolved as the preferred logical access control methodology in the Department of Defense and Intelligence Community, as well as many other agencies across the federal government. Gartner recently predicted that “by 2020, 70% of enterprises will use attribute-based access control (ABAC) as the dominant mechanism to protect critical assets, up from less than 5% today.” A definition and introduction to ABAC can be found in NIST Special Publication 800-162, Guide to Attribute Based Access Control (ABAC) Definition and Considerations, and in Intelligence Community Policy Guidance (ICPG) 500.2, Attribute-Based Authorization and Access Management. Within ABAC, attributes are used to make critical access control decisions, yet standards for attribute assurance have just started to be researched and documented. This presentation outlines factors influencing attributes that an authoritative body must address when standardizing attribute assurance and proposes some notional implementation suggestions for consideration. Attribute Assurance brings a level of confidence to attributes that is similar to levels of assurance for authentication (e.g., guidelines specified in NIST SP 800-63 and OMB M-04-04). There are three principal areas of interest when considering factors related to Attribute Assurance. Accuracy establishes the policy and technical underpinnings for semantically and syntactically correct descriptions of Subjects, Objects, or Environmental conditions. Interoperability considers different standards and protocols used for secure sharing of attributes between systems in order to avoid compromising the integrity and confidentiality of the attributes or exposing vulnerabilities in provider or relying systems or entities. Availability ensures that the update and retrieval of attributes satisfy the application to which the ABAC system is applied. In addition, the security and backup capability of attribute repositories needs to be considered. Similar to a Level of Assurance (LOA), a Level of Attribute Assurance (LOAA) assures a relying party that the attribute value received from an Attribute Provider (AP) is accurately associated with the subject, resource, or environmental condition to which it applies. An Attribute Provider (AP) is any person or system that provides subject, object (or resource), or environmental attributes to relying parties regardless of transmission method. The AP may be the original, authoritative source (e.g., an Applicant). The AP may also receive information from an authoritative source for repackaging or store-and-forward (e.g., an employee database) to relying parties, or it may derive the attributes from formulas (e.g., a credit score). Regardless of the source of the AP's attributes, the same standards should apply to determining the LOAA. As ABAC is implemented throughout government, attribute assurance will be a critical, limiting factor in its acceptance. With this presentation, we hope to encourage dialog between attribute relying parties, attribute providers, and federal agencies that will be defining standards for ABAC in the immediate future.
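
A minimal sketch of an ABAC decision that also enforces a minimum Level of Attribute Assurance (LOAA) before trusting an attribute value; the policy structure, attribute names and assurance levels are illustrative assumptions, not a standardized scheme.

    from dataclasses import dataclass
    from typing import List

    @dataclass
    class Attribute:
        name: str
        value: str
        loaa: int          # Level of Attribute Assurance reported by the provider

    POLICY = {
        "resource": "report-7",                                # illustrative
        "required": {"clearance": "secret", "unit": "analysis"},
        "min_loaa": 3,     # do not trust attributes asserted below this level
    }

    def decide(subject_attrs: List[Attribute], policy: dict) -> bool:
        by_name = {a.name: a for a in subject_attrs}
        for name, required_value in policy["required"].items():
            attr = by_name.get(name)
            if attr is None or attr.loaa < policy["min_loaa"]:
                return False                  # missing or insufficiently assured
            if attr.value != required_value:
                return False
        return True

    subject = [Attribute("clearance", "secret", loaa=4),
               Attribute("unit", "analysis", loaa=2)]    # under-assured attribute
    print("access granted:", decide(subject, POLICY))    # False: LOAA too low
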
 

Pandey, A.K., Agrawal, C.P..  2014.  Analytical Network Process based model to estimate the quality of software components. Issues and Challenges in Intelligent Computing Techniques (ICICT), 2014 International Conference on. :678-682.

Software components are software units designed to interact with other independently developed software components. These components are assembled by third parties into software applications. The success of the final software application largely depends upon the selection of appropriate and easy-to-fit components according to the needs of the customer. It is a primary requirement to evaluate the quality of components before using them in the final software application system. All quality characteristics may not be of the same significance for a particular software application in a specific domain. Therefore, it is necessary to identify only those characteristics/sub-characteristics which have higher importance than the others. The Analytical Network Process (ANP) is used to solve decision problems where the attributes of decision parameters form dependency networks. The objective of this paper is to propose an ANP-based model to prioritize the characteristics/sub-characteristics of quality and to estimate the numeric value of software quality.
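
A minimal sketch of the prioritization step that AHP/ANP-style methods rely on: deriving weights for quality sub-characteristics from a pairwise comparison matrix via its principal eigenvector. The comparison values are invented, and a full ANP model would additionally arrange such matrices into a supermatrix to capture the dependency network.

    import numpy as np

    criteria = ["reusability", "interoperability", "maintainability"]

    # Pairwise comparisons (Saaty scale); entry [i, j] = importance of i relative to j.
    A = np.array([
        [1.0, 3.0, 0.5],
        [1/3, 1.0, 0.25],
        [2.0, 4.0, 1.0],
    ])

    # Principal eigenvector gives the priority weights after normalization.
    eigenvalues, eigenvectors = np.linalg.eig(A)
    principal = eigenvectors[:, np.argmax(eigenvalues.real)].real
    weights = principal / principal.sum()

    for name, w in sorted(zip(criteria, weights), key=lambda x: -x[1]):
        print(f"{name}: {w:.3f}")
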
 

2015-04-30
Bovet, G., Hennebert, J..  2014.  Distributed Semantic Discovery for Web-of-Things Enabled Smart Buildings. New Technologies, Mobility and Security (NTMS), 2014 6th International Conference on. :1-5.

Nowadays, our surrounding environment is more and more scattered with various types of sensors. Due to their intrinsic properties and representation formats, they form small islands isolated from each other. In order to increase interoperability and release their full capabilities, we propose to represent device descriptions, including data and service invocation, with a common model that allows mashups of heterogeneous sensors to be composed. Pushing this paradigm further, we also propose to augment service descriptions with a discovery protocol easing the automatic assimilation of knowledge. In this work, we describe the architecture supporting what can be called a Semantic Sensor Web-of-Things. As a proof of concept, we apply our proposal to the domain of smart buildings, composing a novel ontology covering heterogeneous sensing, actuation and service invocation. Our architecture also emphasizes the energy aspect and is optimized for constrained environments.

Shafagh, H., Hithnawi, A..  2014.  Poster Abstract: Security Comes First, a Public-key Cryptography Framework for the Internet of Things. Distributed Computing in Sensor Systems (DCOSS), 2014 IEEE International Conference on. :135-136.

Novel Internet services are emerging around an increasing number of sensors and actuators in our surroundings, commonly referred to as smart devices. Smart devices, which form the backbone of the Internet of Things (IoT), enable alternative forms of user experience by means of automation, convenience, and efficiency. At the same time, new security and safety issues arise, given the Internet connectivity of smart devices and their ability to interact with humans' immediate living space. Hence, security is a fundamental requirement of the IoT design. In order to remain interoperable with the existing infrastructure, we postulate a security framework compatible with standard IP-based security solutions, yet optimized to meet the constraints of the IoT ecosystem. In this ongoing work, we first identify the necessary components of interoperable, secure end-to-end communication that incorporates Public-key Cryptography (PKC). To this end, we tackle the computational and communication overheads involved. The required components are affordable hardware acceleration engines for cryptographic operations on the hardware side, and header compression and long-lasting secure sessions on the software side. In future work, we will focus on integrating these components into a framework and on the evaluation of an early prototype of this framework.
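
A minimal sketch, using the pyca/cryptography package rather than anything targeted at constrained hardware, of establishing a shared session key with public-key cryptography: an ECDH exchange followed by HKDF, the kind of operation the authors propose to accelerate in hardware and reuse across long-lasting sessions.

    from cryptography.hazmat.primitives.asymmetric import ec
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.kdf.hkdf import HKDF

    # Each party generates an EC key pair (P-256 chosen purely for illustration).
    device_key = ec.generate_private_key(ec.SECP256R1())
    server_key = ec.generate_private_key(ec.SECP256R1())

    # ECDH: both sides derive the same shared secret from the peer's public key.
    device_shared = device_key.exchange(ec.ECDH(), server_key.public_key())
    server_shared = server_key.exchange(ec.ECDH(), device_key.public_key())

    def session_key(shared_secret: bytes) -> bytes:
        """Derive a symmetric session key from the ECDH shared secret."""
        return HKDF(algorithm=hashes.SHA256(), length=16,
                    salt=None, info=b"iot end-to-end session").derive(shared_secret)

    assert session_key(device_shared) == session_key(server_shared)
    print("session key established:", session_key(device_shared).hex())
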