2013 BES Workshop
Date: Sep 28, 2013 11:45 pm – Sep 29, 2013 10:45 pm
Location: Montreal, Canada
Benchmarking of Embedded Systems Workshop
Held in conjunction with Embedded Systems Week, this workshop will be presented on Sunday, September 29, by Sebastian Fischmeister of the University of Waterloo, Shay Gal-On of EEMBC, and Peter Stokes and Darshika G. Perera of CMC Microsystems.
Call for Position Statements
Prospective participants are invited to submit a one-page position paper. Submitters will have the chance to give a short stand-up presentation on their position. All position papers will be made available before the workshop. The deadline for submitting position papers is September 10, 2013. Download information (PDF).
Program/Agenda
TBA
Abstract and Goals:
Empirical systems research is facing a dilemma. Minor aspects of an experimental setup can significantly affect the resulting performance measurements and potentially invalidate the conclusions drawn from them. The growing complexity and size of modern systems will further aggravate this dilemma, especially given the time pressure to produce results. So how can one trust any reported empirical analysis of a new idea or concept?
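As a minimal illustration of this sensitivity (not part of the workshop material), the C sketch below times a hypothetical workload() once "cold" and then several more times. Setup details such as cold caches, compiler optimization flags, and the number of repetitions can all shift the reported number, so a single measurement may mislead.

```c
/*
 * Minimal sketch: the same workload can report different timings
 * depending on measurement setup. workload() is a hypothetical
 * stand-in for any kernel being benchmarked.
 */
#include <stdio.h>
#include <time.h>

#define N 1000000

static volatile long sink;   /* prevents the compiler from removing the loop */

static void workload(void)
{
    long acc = 0;
    for (long i = 0; i < N; i++)
        acc += i % 7;
    sink = acc;
}

/* Time one invocation of workload() in milliseconds. */
static double time_once(void)
{
    struct timespec t0, t1;
    clock_gettime(CLOCK_MONOTONIC, &t0);
    workload();
    clock_gettime(CLOCK_MONOTONIC, &t1);
    return (t1.tv_sec - t0.tv_sec) * 1e3 +
           (t1.tv_nsec - t0.tv_nsec) / 1e6;
}

int main(void)
{
    /* A single cold run and the spread of repeated runs can disagree
     * noticeably; reporting only one of them may be misleading. */
    printf("first (cold) run: %.3f ms\n", time_once());
    for (int i = 0; i < 5; i++)
        printf("repeat %d:        %.3f ms\n", i + 1, time_once());
    return 0;
}
```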
This workshop aims to bring together researchers and industry concerned with the quantitative and empirical evaluation of embedded computing systems. The workshop program will consist of brief presentations of position statements, breakout sessions on specific topics, and a report summarizing the workshop outcomes. The agenda will address topics such as the quality of benchmarking, state-of-the-art workloads, standardization of workloads, proper experimental setups, replicability and reproducibility of results, and sharing of infrastructure.
Topics of interest include, but are not limited to:
Major Topics:
- Is there a need to benchmark? What are the alternatives? Benchmark creation; Relevance and importance of benchmarking
- Trustworthiness and reproducibility of (academic) empirical measurements; Benchmark characterization
- Benchmarking heterogeneous and multicore systems
- Benchmarking with peripherals, including sensors, actuators, and buses; Energy
Organization
Sebastian Fischmeister, University of Waterloo, Canada
Peter Stokes, CMC Microsystems, Canada
Shay Gal-On, EEMBC, USA
Darshika G. Perera, CMC Microsystems, Canada