International Conferences: Incident Management and Forensics (IMF), Germany, 2015

SoS Newsletter – Advanced Book Block



International Conferences:

Incident Management and Forensics (IMF)

Germany, 2015

The 2015 Ninth International Conference on IT Security Incident Management & IT Forensics (IMF) was held 18-20 May 2015 in Magdeburg, Germany. Papers were presented on recent trends in forensics, memory and file system analysis, database aspects, detection of encrypted content, response selection in automated incident handling, mobile payment fraud, forensics education, and evidence modeling.

Lösche, Ulf; Morgenstern, Maik; Pilz, Hendrik, “Platform Independent Malware Analysis Framework,” in IT Security Incident Management & IT Forensics (IMF), 2015 Ninth International Conference on, pp. 109-113, 18-20 May 2015. doi:10.1109/IMF.2015.21
Abstract: Over the past years malicious software has evolved into a persistent threat on all major computer platforms. Due to the high number of new threats released every day, security researchers have developed automatic systems to analyze and classify unknown pieces of software. While these techniques are technically mature on the Windows platform, they still have to be improved on many other platforms, such as Linux and Mac OS X. As the process of malware analysis is very similar on all platforms, we have developed a platform-independent framework that makes it easy to implement malware analysis on a new platform. This paper covers our experience with malware analysis and presents our generic approach, which can be applied to any platform.
Keywords: Androids; Humanoid robots; Linux; Malware; Monitoring; Operating systems; Virtual machine monitors; Android; Dynamic analysis; Forensic; Linux; Mac OS X; Malware analysis; Platform independent; Sandbox; Virtualization; Windows (ID#: 15-6738)


Thurner, Simon; Grun, Marcel; Schmitt, Sven; Baier, Harald, “Improving the Detection of Encrypted Data on Storage Devices,” in IT Security Incident Management & IT Forensics (IMF), 2015 Ninth International Conference on, pp. 26-39, 18-20 May 2015. doi:10.1109/IMF.2015.12
Abstract: The detection of persistently stored encrypted data plays an increasingly important role in digital forensics. This is especially true during live analysis of IT systems, when the encrypted data structures are temporarily decrypted in main memory and thus can be accessed as plaintext. One method commonly used to detect the presence of encrypted data on a storage device is the calculation of entropy. However, this method has a significant drawback: both random and compressed data have an entropy very similar to that of encrypted data, which yields a high false positive rate. Entropy alone is therefore not suitable to differentiate between these types of data. In this work we suggest both a workflow for the detection of encrypted data structures on a storage device and an improved classification algorithm. The classification part of the workflow is based on statistical tests. For the convenience of the investigator, an important goal is to minimize the number of falsely classified unencrypted data structures (e.g., compressed data classified as encrypted data). Our approach to achieve this goal is to combine different statistical tests. As a practical proof of concept we provide and evaluate a tool for automated analysis of storage devices that implements a multitude of statistical tests for improved detection of encrypted data, compared to both the application of only one such test and the calculation of entropy. More precisely, our tool is able to reliably distinguish high-entropy file formats (e.g., DOCX, JPG, PDF, ZIP) from encrypted files (e.g., a TrueCrypt container).
Keywords: Ciphers; Data structures; Encryption; Entropy; Generators; Reliability; digital forensics; encryption detection; entropy; statistical tests (ID#: 15-6739)
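The entropy pitfall the abstract describes can be illustrated with a short sketch (not the authors' tool; function names are illustrative): Shannon entropy alone scores encrypted, compressed, and random data alike, which is why the paper combines it with statistical tests, for example a chi-square test against a uniform byte distribution.

```python
import math
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Shannon entropy in bits per byte (0.0 to 8.0)."""
    if not data:
        return 0.0
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def chi_square(data: bytes) -> float:
    """Chi-square statistic against a uniform byte distribution.
    Values far above ~293 (df=255, p=0.05) indicate non-uniform,
    hence likely non-encrypted, data."""
    if not data:
        return 0.0
    expected = len(data) / 256
    counts = Counter(data)
    return sum((counts.get(b, 0) - expected) ** 2 / expected
               for b in range(256))
```

Both encrypted and well-compressed data approach 8 bits per byte of entropy, so a second, distribution-sensitive test is needed to separate them, which is the gap the paper's combined-test workflow addresses.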


Schiefer, Michael, “Smart Home Definition and Security Threats,” in IT Security Incident Management & IT Forensics (IMF), 2015 Ninth International Conference on, pp. 114-118, 18-20 May 2015. doi:10.1109/IMF.2015.17
Abstract: The home of the future should be a smart one that supports us in our daily life. Up to now, only a few security incidents in this area are known. Judging from various security analyses, this is rather a result of the low market penetration of Smart Home products than of the security of such systems. Given that Smart Homes are becoming more and more popular, we consider current incidents and analyses to estimate potential future security threats. Definitions of a Smart Home drift widely apart; thus we first define Smart Home for ourselves and additionally provide a way to categorize the large mass of products into smaller groups.
Keywords: Cameras; Heating; Internet; Monitoring; Security; Smart homes; Web pages; internet of things; security threats; smart home (ID#: 15-6740)


Ossenbühl, Sven; Steinberger, Jessica; Baier, Harald, “Towards Automated Incident Handling: How to Select an Appropriate Response against a Network-Based Attack?,” in IT Security Incident Management & IT Forensics (IMF), 2015 Ninth International Conference on, pp. 51-67, 18-20 May 2015. doi:10.1109/IMF.2015.13
Abstract: The increasing number of network-based attacks has evolved into one of the top causes of network infrastructure and service outages. In order to counteract these threats, computer networks are monitored to detect malicious traffic and initiate suitable reactions. Initiating a suitable reaction, however, is a process of selecting an appropriate response to the identified network-based attack, and this selection requires taking into account the economics of a reaction, e.g., its risks and benefits. The literature describes several response selection models, but they are not widely adopted. In addition, these models and their evaluation are often not reproducible due to closed testing data. In this paper, we introduce a new response selection model, called REASSESS, that mitigates network-based attacks by incorporating an intuitive response selection process that evaluates the negative and positive impacts associated with each countermeasure. We compare REASSESS with the response selection models of IE-IRS, ADEPTS, CS-IRS, and TVA and show that REASSESS is able to select the most appropriate response to an attack in consideration of the positive and negative impacts, and thus reduces the effects caused by a network-based attack. Further, we show that REASSESS is aligned with the NIST incident life cycle. We expect REASSESS to help organizations select the most appropriate response measure against a detected network-based attack, and hence to contribute to mitigating such attacks.
Keywords: Adaptation models; Biological system modeling; Delays; Internet; NIST; Network topology; Security; automatic mitigation; cyber security; intrusion response systems; network security (ID#: 15-6741)
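The core idea of weighing each countermeasure's positive impact against its negative impact can be sketched generically (an illustrative simplification, not the published REASSESS model; the class, field names, and 0-to-1 scales are assumptions):

```python
from dataclasses import dataclass

@dataclass
class Response:
    name: str
    positive_impact: float  # e.g., expected attack mitigation, 0..1
    negative_impact: float  # e.g., collateral service disruption, 0..1

def select_response(candidates: list[Response]) -> Response:
    """Pick the response with the best net impact
    (positive impact minus negative impact)."""
    return max(candidates,
               key=lambda r: r.positive_impact - r.negative_impact)
```

For example, a blanket null route might mitigate an attack almost completely but score poorly once its collateral disruption is subtracted, so a more targeted block would be preferred; this trade-off is exactly what an impact-aware selection model formalizes.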


Kier, Christof; Madlmayr, Gerald; Nawratil, Alexander; Schafferer, Michael; Schanes, Christian; Grechenig, Thomas, “Mobile Payment Fraud: A Practical View on the Technical Architecture and Starting Points for Forensic Analysis of New Attack Scenarios,” in IT Security Incident Management & IT Forensics (IMF), 2015 Ninth International Conference on, pp. 68-76, 18-20 May 2015. doi:10.1109/IMF.2015.14
Abstract: As payment cards and mobile devices are equipped with Near Field Communication (NFC) technology, electronic payment transactions at physical Point of Sale (POS) environments are changing. Payment transactions no longer require the customer to insert a card into a slot of the payment terminal: the customer simply swipes the payment card or mobile phone in front of a dedicated zone of the terminal to initiate a payment transaction. Secure Elements (SEs) in mobile phones and NFC payment cards should keep sensitive application data in a safe place to protect it from abuse by attackers. Although the hardware and operating system of such a chip have to go through an intensive process of security testing, the current integration of such chips in mobile phones easily allows attackers to access the stored information. In the following paper we present the implementation of two different proof-of-concept attacks. From the analysis of the attack scenarios, we propose various starting points for forensic analysis in order to detect such fraudulent transactions. The presented concept should lead to fewer fraudulent transactions as well as protected evidence in case of fraud.
Keywords: Credit cards; Google; ISO Standards; Relays; Security; Smart phones; EMV Payment; Mobile Payment; NFC Transaction; Payment Fraud (ID#: 15-6742)


Bellin, Knut; Creutzburg, Reiner, “Conception of a Master Course for IT and Media Forensics Part II: Android Forensics,” in IT Security Incident Management & IT Forensics (IMF), 2015 Ninth International Conference on, pp. 96-105, 18-20 May 2015. doi:10.1109/IMF.2015.19
Abstract: The growth of Android in the mobile sector and the interest in investigating these devices from a forensic point of view have rapidly increased. Many companies have security problems with mobile devices in their own IT infrastructure. To respond to these incidents, it is important to have professionally trained staff. Furthermore, it is necessary to further train existing employees in the practical applications of mobile forensics, owing to the fact that many companies are entrusted with very sensitive data. Inspired by these facts, this paper addresses training approaches and practical exercises for investigating Android mobile devices.
Keywords: Androids; Forensics; Humanoid robots; Mobile communication; Oxygen; Smart phones; mobile forensics training education Android small scale digital device (ID#: 15-6743)


Kiltz, Stefan; Dittmann, Jana; Vielhauer, Claus, “Supporting Forensic Design — A Course Profile to Teach Forensics,” in IT Security Incident Management & IT Forensics (IMF), 2015 Ninth International Conference on, pp. 85-95, 18-20 May 2015. doi:10.1109/IMF.2015.16
Abstract: There is a growing demand for experts with dedicated knowledge of forensics, especially in the domain of digital and digitised forensics, alongside a general shortage of digital forensics teaching. Further, there is a prominent lack of standardisation in designing a curriculum [18]. We address this by offering the profile ForensikDesign@Informatik [23] at the bachelor's degree level. In teaching digital and digitised forensics, we propose a model-based approach combining the practitioner's and the computer scientist's views [19], also to address the standardisation issue. We identify three main application areas: teaching conventional digital forensic examinations using existing tools and methods following the model-based approach; the design of new forensic tools and methods; and system design to achieve a desired degree of forensic readiness in tension with a desired degree of anonymity. The last two application areas, we believe, also justify teaching at university level. We set an international focus and highlight the science part of forensic sciences. Selected legal aspects are addressed for both motivational and comparative purposes. We implement different teaching strategies and provide dedicated resources (technical, organisational and personnel). Finally, we outline the two options for the profile ForensikDesign@Informatik, depending on the students' level of commitment.
Keywords: Computational modeling; Data models; Digital forensics; Documentation; Education; Security; Existing and planned teaching programs with goals and concepts; basic and emerging trends to provide education; from theory to practical approaches (ID#: 15-6744)


Ramisch, Felix; Rieger, Martin, “Recovery of SQLite Data Using Expired Indexes,” in IT Security Incident Management & IT Forensics (IMF), 2015 Ninth International Conference on, pp. 19-25, 18-20 May 2015. doi:10.1109/IMF.2015.11
Abstract: SQLite databases have tremendous forensic potential. In addition to active data, expired data remain in the database file if the option secure delete is not applied. Tests of available forensic tools show that indexes are not considered, although they may complete the recovery of the table structures. We work out algorithms for their recovery and for their combination with each other or with table data. A new tool, SQLite Index Recovery, was developed for this study. Its use with test data and data from Apple Mail shows that the recovery of indexes is possible and enriches the recovery of ordinary table data.
Keywords: File systems; Forensics; Indexes; Metadata; Oxygen; Postal services; Apple Mail; SQLite; database; expired data; forensic tool; free block; index recovery (ID#: 15-6745)
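The premise that expired rows linger in the database file unless secure delete is enabled is easy to reproduce (an illustrative sketch, not the SQLite Index Recovery tool; the helper name is an assumption):

```python
import os
import sqlite3
import tempfile

def deleted_text_survives(secure_delete: bool) -> bool:
    """Insert a marker row, delete it, and check whether its bytes
    remain in the raw database file as expired data."""
    path = os.path.join(tempfile.mkdtemp(), "demo.db")
    con = sqlite3.connect(path)
    con.execute(f"PRAGMA secure_delete = {'ON' if secure_delete else 'OFF'}")
    con.execute("CREATE TABLE notes (body TEXT)")
    con.execute("INSERT INTO notes VALUES ('FORENSIC-MARKER-1234')")
    con.commit()
    con.execute("DELETE FROM notes")
    con.commit()
    con.close()
    # Scan the raw file for the deleted row's content.
    with open(path, "rb") as f:
        return b"FORENSIC-MARKER-1234" in f.read()
```

With secure delete off, SQLite merely marks the freed cell space as a freeblock, so most of the deleted row's content, and likewise expired index entries, stays recoverable; with secure delete on, the freed space is overwritten with zeros.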


Gruhn, Michael, “Windows NT pagefile.sys Virtual Memory Analysis,” in IT Security Incident Management & IT Forensics (IMF), 2015 Ninth International Conference on, pp. 3-18, 18-20 May 2015. doi:10.1109/IMF.2015.10
Abstract: As hard disk encryption, RAM disks, persistent data avoidance technology and memory-resident malware become more widespread, memory analysis becomes more important. In order to provide more virtual memory than is actually physically present on a system, an operating system may transfer frames of memory to a page file on persistent storage. Current memory analysis software does not incorporate such page files and thus misses important information. We therefore present a detailed analysis of Windows NT paging. We use dynamic gray-box analysis, in which we place known data into virtual memory and examine where it is mapped to, in either the physical memory or the page file, and cross-reference these findings with the Windows NT Research Kernel source code. We demonstrate how to decode the non-present page table entries and accurately reconstruct the complete virtual memory space, including non-present memory pages, on Windows NT systems using 32-bit, PAE, or IA32e paging. Our analysis approach can be used to analyze other operating systems as well.
Keywords: Forensics; Hardware; Kernel; Random access memory; Resource management; Digital Forensics; Pagefile Analysis; Virtual Memory Analysis; Windows NT Paging (ID#: 15-6746)


Dewald, Andreas, “Characteristic Evidence, Counter Evidence and Reconstruction Problems in Forensic Computing,” in IT Security Incident Management & IT Forensics (IMF), 2015 Ninth International Conference on, pp. 77-82, 18-20 May 2015. doi:10.1109/IMF.2015.15
Abstract: Historically, forensic computing (as digital forensics) developed pragmatically, driven by specific technical needs. Indeed, in comparison with other forensic sciences the field is still rather immature and has many deficits, such as the unclear terminology used in court. In this paper, we introduce notions of (digital) evidence, characteristic evidence, and (characteristic) counter evidence, as well as definitions of two fundamental forensic reconstruction problems. We show how the observability of the different types of evidence relates to the solvability of those problems. By doing this, we wish to exemplify the usefulness of formalization in establishing a precise terminology. While this will not remedy all terminological shortcomings, it (1) may provide the basis for a better understanding between experts, and (2) helps to understand the significance of different types of digital evidence for answering questions in an investigation.
Keywords: Computational modeling; Computers; Digital forensics; Electronic mail; Hard disks; Radiation detectors; characteristic evidence; counter evidence; digital forensics; evidence; reconstruction; terminology (ID#: 15-6747)


Freiling, Felix; Gruhn, Michael, “What is Essential Data in Digital Forensic Analysis?,” in IT Security Incident Management & IT Forensics (IMF), 2015 Ninth International Conference on, pp. 40-48, 18-20 May 2015. doi:10.1109/IMF.2015.20
Abstract: In his seminal work on file system forensic analysis, Carrier defined the notion of essential data as "those that are needed to save and retrieve files." He argues that essential data is therefore more trustworthy, since it has to be correct in order for the user to use the file system. In many practical settings, however, it is unclear whether a specific piece of data is essential, because either file system specifications are ambiguous or the importance of a specific data field depends on the operating system that processes the file system data. We therefore revisit Carrier's definition and show that there are two types of essential data: strong and weak. While strongly essential data corresponds to Carrier's definition, weakly essential data refers to application-specific interpretations. We empirically show the amount of strongly and weakly essential data in DOS/MBR and GPT partition systems, thereby complementing and extending Carrier's findings.
Keywords: Computers; Data structures; Digital forensics; Metadata; Operating systems; Standards; file system; forensic investigations; operating systems (ID#: 15-6748)
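To make the distinction concrete for the DOS/MBR case examined in the paper, here is an illustrative sketch (not from the paper itself) of decoding a 16-byte partition table entry: the partition type, starting LBA, and sector count are what an operating system needs to locate the partition, and are thus candidates for essential data, while the legacy CHS fields are typically ignored by modern systems.

```python
import struct

def parse_mbr_entry(entry: bytes) -> dict:
    """Decode one 16-byte DOS/MBR partition table entry.
    Layout: boot flag (1), CHS start (3), type (1), CHS end (3),
    starting LBA (4, little-endian), sector count (4, little-endian)."""
    boot_flag, _chs_start, ptype, _chs_end, start_lba, num_sectors = \
        struct.unpack("<B3sB3sII", entry)
    return {
        "bootable": boot_flag == 0x80,   # 0x80 = active partition
        "type": ptype,                   # e.g., 0x83 = Linux
        "start_lba": start_lba,          # needed to locate the partition
        "num_sectors": num_sectors,      # needed to bound the partition
    }
```

Whether a field like the type byte is essential in practice depends on the operating system reading it, which is exactly the ambiguity the paper's strong/weak distinction captures.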


Merkel, Ronny, “Latent Fingerprint Aging from a Hyperspectral Perspective: First Qualitative Degradation Studies Using UV/VIS Spectroscopy,” in IT Security Incident Management & IT Forensics (IMF), 2015 Ninth International Conference on, pp. 121-135, 18-20 May 2015. doi:10.1109/IMF.2015.18
Abstract: Latent print age estimation is an important topic in the emerging field of digitized crime scene forensics. While several capturing devices have recently been studied towards this goal, hyperspectral imaging in the UV/VIS (ultraviolet and visible light) range of the electromagnetic spectrum has not been investigated so far. Addressing this research gap, a first qualitative evaluation of the aging behavior of 30 latent print time series from 6 different donors is conducted, utilizing an optical reflection spectrometer. Results show more unpredictable aging tendencies in the ultraviolet spectral range, whereas a general logarithmic trend from prior work (using non-spectral capturing devices) is confirmed for the visible light band. Furthermore, a different behavior of eccrine and sebaceous print components is found, especially in the ultraviolet band, where sebaceous components seem to become reflective to the emitted radiation and might furthermore be utilized for studying longer aging periods, in contrast to eccrine prints. Overall, the combined degradation information of the ultraviolet and the visible light bands seems to provide the most reliable results for measuring a reproducible aging trend, serving as a potential opportunity to address the strong influence of different sweat compositions on the aging behavior of latent prints.
Keywords: Aging; Degradation; Estimation; Fingerprint recognition; Hyperspectral imaging; Lipidomics; Optical surface waves; UV/VIS spectroscopy; age estimation; digitized crime scene forensics; eccrine vs. sebaceous; hyperspectral imaging; latent fingerprints (ID#: 15-6749)
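The logarithmic aging trend reported for the visible light band can be fit by ordinary least squares after a log transform of the time axis (an illustrative sketch under assumed variable names, not the paper's measurement pipeline):

```python
import math

def fit_log_trend(times, values):
    """Least-squares fit of v = a*ln(t) + b, the form of the
    logarithmic aging trend; times must be positive."""
    xs = [math.log(t) for t in times]
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(values) / n
    a = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, values))
         / sum((x - mean_x) ** 2 for x in xs))
    b = mean_y - a * mean_x
    return a, b
```

Because the model is linear in ln(t), the usual closed-form simple-regression solution applies directly; the fitted slope a then characterizes how quickly a print feature degrades over time.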


Articles listed on these pages have been found on publicly available internet pages and are cited with links to those pages. Some of the information included herein has been reprinted with permission from the authors or data repositories. Direct any requests for removal of the links or modifications to specific citations via email. Please include the ID# of the specific citation in your correspondence.