Saddek Bensalem

VERIMAG, University Joseph Fourier/CNRS/INPG, Grenoble, France

Rigorous Component-based System Design Using the BIP Framework

Rigorous system design requires the use of a single powerful component framework
allowing a representation of the designed system at different levels of detail,
from a high-level model of application software to its implementation. The use
of a single framework allows one to maintain overall coherency and correctness
by comparing different architectural solutions and their properties. In this
talk, I will first present the BIP (Behaviour, Interaction, Priority) component
framework which encompasses an expressive notion of composition for
heterogeneous components by combining interactions and priorities. This allows
description at different levels of abstraction from high-level application
software to mixed hardware/software systems. Second, I will introduce a
rigorous design approach that uses BIP as a unifying semantic model to derive
a correct implementation from the application software, a model of the target
architecture, and a mapping. Correctness of the implementation is ensured by
application of source-to-source transformations in BIP which preserve
correctness of essential design properties. The design is fully automated and
supported by a toolset including a compiler, the D-Finder verification tool,
and model transformers. I will illustrate the use of BIP as a modelling
formalism, as well as crucial aspects of the design flow for ensuring
correctness, through an autonomous robot case study.

Peter Bentley

University College London, United Kingdom

Modelling Complex Systems - Lessons from the Life Sciences

Our technology is becoming increasingly complex. Not just complicated,
but complex in the sense that the behaviour of a system may be
unpredictable. Peter Bentley describes his work in complex systems,
providing examples of various collaborations with biologists in areas
such as evolution, development, and cancer research. He describes how
many of the underlying principles can be modelled, not only to improve
our understanding of the biology, but also in order to provide new
technologies that exploit those principles. Through this work he aims
to investigate the technologies of the future - devices that can
adapt, be fault tolerant, and even assemble themselves. Examples of a
self-repairing robot and physical self-assembling systems will be shown,
and he will describe his systemic computer concept which aims to be
the first parallel fault tolerant computer that is based on general
biological systems. Through examples such as these, he argues that
while we may never be able to predict exactly what a complex system
may do, this does not prevent such systems from being extremely useful
for us - after all, we are unpredictable complex systems ourselves.

Krishnendu Chatterjee

IST Austria, Austria

Games and Probabilistic Systems with Mean-payoff, Energy and
Parity Objectives

The analysis of games and probabilistic systems with
quantitative objectives (such as mean-payoff and energy objectives)
and \omega-regular objectives (such as parity objectives) provides
the mathematical foundation for performance analysis and verification
of various classes of systems.
In this talk, we will present a survey of both classical and
recent results about mean-payoff, energy, and parity objectives.
We will discuss how to solve their combinations and their
inter-relationships, and mention interesting open problems.
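For readers unfamiliar with these objectives, the standard definitions can be sketched as follows (general textbook formulations, not taken from the talk): for an infinite play \pi visiting edges with integer weights w_0 w_1 w_2 ... and states with priorities p_0 p_1 p_2 ...,

```latex
% Mean-payoff objective: the long-run average weight of the play
\mathrm{MP}(\pi) \;=\; \liminf_{n \to \infty} \frac{1}{n} \sum_{i=0}^{n-1} w_i

% Energy objective with initial credit c_0: every prefix sum stays non-negative
\forall n \ge 0 : \quad c_0 + \sum_{i=0}^{n-1} w_i \;\ge\; 0

% Parity objective: the minimal priority occurring infinitely often is even
\min \bigl( \mathrm{Inf}(p_0 p_1 p_2 \ldots) \bigr) \;\equiv\; 0 \pmod{2}
```

The combinations mentioned in the abstract (e.g. energy parity games) constrain a single play by two such conditions simultaneously.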

Görschwin Fey

University of Bremen, Germany

Assessing System Vulnerability using Formal Verification Techniques

Hardware systems are becoming more and more vulnerable to soft errors
caused by radiation or process variations. Design measures to cope with
these problems are built into the system. But how can one verify that the
final system is as resilient as expected?

The presentation will consider soft errors caused, e.g., by electrical bugs or radiation. Automatic approaches to analyze the effect of such faults on the system will be presented. The presentation will briefly cover modeling issues related to assessing fault tolerance: which abstraction level is appropriate for the analysis, what the potential pitfalls are, and what kind of models will be required. Based on this analysis, existing approaches that analyze faults on the electrical as well as the logical level will be briefly introduced. Their trade-offs regarding resource requirements and quality of results will be discussed and the individual advantages highlighted. One approach suitable for assessment on the gate level and above will be explained in detail. Extensions to improve the performance using approximation approaches will be considered. Use cases of the resulting robustness checker tool will illustrate the practical application of the techniques.
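To give a concrete flavour of gate-level fault analysis, here is a toy sketch (my own illustration, not the robustness checker discussed in the talk): exhaustively inject a single-net fault into a small combinational circuit, in this case a full adder, and count for how many input vectors the fault propagates to an output.

```python
from itertools import product

def full_adder(a, b, cin, flip=None):
    # Gate-level evaluation of a 1-bit full adder.
    # `flip` optionally names one internal net whose value is inverted,
    # a crude sketch of a transient bit-flip fault model.
    x = a ^ b            # half-sum
    if flip == "x": x ^= 1
    s = x ^ cin          # sum output
    if flip == "s": s ^= 1
    g = a & b            # carry generate
    if flip == "g": g ^= 1
    p = x & cin          # carry propagate
    if flip == "p": p ^= 1
    cout = g | p         # carry output
    if flip == "cout": cout ^= 1
    return s, cout

# Exhaustive single-fault injection: for each net, count the input
# vectors under which the faulty circuit's outputs deviate from the
# fault-free reference. Faults that never deviate would be masked.
for net in ["x", "s", "g", "p", "cout"]:
    affected = sum(
        full_adder(a, b, c) != full_adder(a, b, c, flip=net)
        for a, b, c in product((0, 1), repeat=3)
    )
    print(f"fault on net {net}: propagates for {affected}/8 input vectors")
```

Real tools replace this brute-force simulation with formal techniques so the analysis scales to full designs, but the question answered per fault is the same.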

Renato Renner

ETH Zurich, Switzerland

Security in a quantum world

In this talk, I address the question of whether classical cryptographic
systems are secure against attacks by an adversary equipped with
quantum devices. This question is of increasing relevance as such
devices may be built in the near future (although the construction of
a full-fledged quantum computer is currently out of reach). As an
example, I will demonstrate that a cryptographic scheme may be
susceptible to quantum attacks even if it is provably secure against
any classical adversary. (This talk does not require prior knowledge
in quantum mechanics.)

Petr Tuma

Charles University in Prague, Czech Republic

Computer Memory: Why We Should Care What Is Under The Hood

The memory subsystems of contemporary computer architectures are
increasingly complex - in fact, so much so that it becomes difficult to
estimate the performance impact of many coding constructs; some
long-known coding patterns have even turned out to be fundamentally wrong. In
contrast, many researchers still reason about algorithmic complexity
in simple terms, where memory operations are sequential and of
equal cost. The goal of this talk is to give an overview of
some memory subsystem features that violate this assumption
significantly, with the ambition to motivate development of
algorithms tailored to contemporary computer architectures.
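To make the unequal-cost point concrete, here is a minimal sketch (my own illustration, not from the talk): two loops do identical arithmetic over the same flat array, but one walks memory with stride 1 and the other with stride N. In low-level languages the strided walk is typically markedly slower because of cache behaviour; in Python the interpreter overhead dilutes the effect, but the access patterns are the same.

```python
import timeit

N = 512                      # matrix dimension chosen for illustration
data = list(range(N * N))    # flat row-major "matrix"

def row_major():
    # Visit elements in memory order: stride 1, cache-friendly.
    total = 0
    for i in range(N):
        base = i * N
        for j in range(N):
            total += data[base + j]
    return total

def col_major():
    # Visit elements column by column: stride N, cache-hostile.
    total = 0
    for j in range(N):
        for i in range(N):
            total += data[i * N + j]
    return total

# Same element count, same additions, same result either way --
# any timing difference comes purely from the access pattern.
print("row-major:", timeit.timeit(row_major, number=5))
print("col-major:", timeit.timeit(col_major, number=5))
```

A complexity analysis that charges every memory operation the same unit cost assigns both loops identical cost, which is exactly the assumption the talk challenges.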