Biblio

Oliveira, Luis, Luton, Jacob, Iyer, Sumeet, Burns, Chris, Mouzakitis, Alexandros, Jennings, Paul, Birrell, Stewart.  2018.  Evaluating How Interfaces Influence the User Interaction with Fully Autonomous Vehicles. Proceedings of the 10th International Conference on Automotive User Interfaces and Interactive Vehicular Applications. :320–331.
With increasing automation, occupants of fully autonomous vehicles are likely to be completely disengaged from the driving task. However, even with no driving involved, there are still activities that will require interfaces between the vehicle and passengers. This study evaluated different configurations of screens providing operation-related information to occupants for tracking the progress of journeys. Surveys and interviews were used to measure trust, usability, workload and experience after users were driven by an autonomous low-speed pod. Results showed that participants want to monitor the state of the vehicle and see details about the ride, including a map of the route and related information. There was a preference for this information to be displayed via an onboard touchscreen device combined with an overhead letterbox display, rather than a smartphone-based interface. This paper provides recommendations for the design of devices with the potential to improve user interaction with future autonomous vehicles.
Smith, E., Fuller, L.  2017.  Control systems and the internet of things – Shrinking the factory. 2017 56th FITCE Congress. :68–73.
In this paper we discuss the Internet of Things (IoT) by exploring aspects which go beyond the proliferation of devices and information enabled by: the growth of the Internet, increased miniaturization, prolonged battery life and an IT-literate user base. We highlight the role of feedback mechanisms and illustrate this with reference to implemented computer-enabled factory control systems. As the technology has developed, the cost of computing has reduced drastically, programming interfaces have improved, sensors are simpler and more cost-effective, and high-performance communications across a wide area are readily available. We illustrate this by considering an application based on the Raspberry Pi, which is a low-cost, small, programmable and network-capable computer based on a powerful ARM processor with a programmable I/O interface, which can provide access to sensors (and other devices). The prototype application running on this platform can sense the presence of a human being, using inexpensive passive infrared detectors. This can be used to monitor the activity of vulnerable adults, logging the results to a central server using a domestic Internet solution over a Wireless LAN. Whilst this demonstrates the potential for the use of such control/monitoring systems, practical systems spanning thousands of sites will be more complex to deliver and will have more stringent data processing, management and security requirements. We will discuss these concepts in the context of delivery of a smart interconnected society.
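The abstract does not include code, but the monitoring loop it describes (poll a PIR detector, record motion onsets, forward events to a central server) can be sketched in a few lines. The function below is a minimal illustration only: `read_pir` stands in for a GPIO read on the Raspberry Pi, and the event dictionaries stand in for whatever payload the real prototype logs; both names are assumptions, not the authors' implementation.

```python
import time
from typing import Callable, List


def collect_activity_events(read_pir: Callable[[], bool], samples: int) -> List[dict]:
    """Poll a PIR detector and record each rising edge (onset of motion)
    as an activity event. In a real deployment read_pir would sample a
    GPIO pin and the events would be POSTed to the central server over
    the Wireless LAN; here the sensor is a pluggable callable."""
    events = []
    previous = False
    for _ in range(samples):
        current = read_pir()
        if current and not previous:  # motion just started
            events.append({"event": "motion", "timestamp": time.time()})
        previous = current
    return events


# Simulated reading sequence: two separate bursts of motion.
readings = iter([False, True, True, False, True, False])
events = collect_activity_events(lambda: next(readings), samples=6)
print(len(events))  # -> 2
```

Edge detection (logging only transitions, not every "motion present" sample) keeps the traffic to the central server proportional to activity rather than to the polling rate, which matters once the system spans thousands of sites.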

Chernyshov, George, Chen, Jiajun, Lai, Yenchin, Noriyasu, Vontin, Kunze, Kai.  2016.  Ambient Rhythm: Melodic Sonification of Status Information for IoT-enabled Devices. Proceedings of the 6th International Conference on the Internet of Things. :1–6.
In this paper we explore how to embed status information of IoT-enabled devices in the acoustic atmosphere using melodic ambient sounds while limiting obtrusiveness for the user. The user can use arbitrary sound samples to represent the devices they want to monitor. Our system combines these sound samples into a melodic ambient rhythm that contains information on all the processes or variables that the user is monitoring. We focus on continuous rather than binary information (e.g. "monitoring progress status" rather than "new message received"). We evaluate our system in a machine-monitoring scenario focusing on 5 distinct machines/processes to monitor, with 6 priority levels for each. 9 participants used our system to monitor these processes with a detection rate of up to 92.44% when several levels are combined. Participants had no previous experience with this or similar systems and had only a 5–10 minute training session before the tests.
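The paper does not specify how device state is mapped onto the ambient rhythm, but one plausible step-sequencer encoding of its setup (5 machines, 6 priority levels each) is to let a device's priority level control how often its sample recurs per loop. The sketch below is an illustration under that assumption; `build_rhythm`, the step grid, and the level-to-repetition rule are all hypothetical, not the authors' method.

```python
from typing import Dict, List


def build_rhythm(levels: Dict[str, int], steps: int = 24) -> Dict[str, List[int]]:
    """Place each device's sound sample on beats of a looping step grid.
    A device at priority level k triggers its sample k times per loop,
    spaced evenly, so higher-priority states are heard more often."""
    pattern = {}
    for device, level in levels.items():
        pattern[device] = sorted({(i * steps) // level for i in range(level)})
    return pattern


# Two of the five monitored machines: one at the highest priority level,
# one at the lowest.
pattern = build_rhythm({"press": 6, "conveyor": 1})
print(pattern["press"])     # -> [0, 4, 8, 12, 16, 20]
print(pattern["conveyor"])  # -> [0]
```

Encoding a continuous variable as repetition density (rather than as a one-off alert sound) matches the paper's stated focus on continuous status information over binary notifications.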