19th International Symposium on Formal Methods


Tutorial 1: Collaborative Modelling and Co-simulation: Tools and Techniques for Designing Embedded Systems
Date: 13 May 2014 (full day)
Presenters: John Fitzgerald and Ken Pierce (Newcastle University, UK), Peter Gorm Larsen (Aarhus University, Denmark)

Abstract: The embedded systems market is a lively place, and there is growing demand for rapid innovation of products that exploit new materials, sensors and computing hardware, often through clever and complex software. In this context, developers have to form creative teams from disparate disciplines, but the semantic gaps between disciplines cost time and money, because misunderstandings are often only detected when the physical product is built and the software fails to control it properly. How can model-based development work if these teams of specialist engineers describe different parts of the product and its environment in very different ways, and can formal techniques help?
We have been developing practical methods for collaborative creation of “co-models” composed of discrete-event models of control devices/software and continuous-time models of the controlled devices and the environment, bridging gaps between software and other engineering disciplines. Reconciled operational semantics permit co-models to be “co-simulated”, allowing us to explore the design spaces of physics and software together, so that we can trade off alternatives on such bases as performance, energy consumption and cost before committing to a solution.
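The lock-step style of co-simulation can be illustrated with a minimal sketch. This is our own illustrative example, not the Crescendo algorithm: a hypothetical continuous-time water-tank model is advanced with a fixed-step Euler integrator, while a discrete-event controller samples the level at each communication point and chooses the valve setting for the next step.

```python
# Illustrative lock-step co-simulation sketch (not the Crescendo algorithm).
# A continuous-time (CT) water-tank model is integrated with forward Euler;
# a discrete-event (DE) controller is sampled at each communication point.

def plant_step(level, valve_open, dt, inflow=1.0, outflow=2.0):
    """CT model: d(level)/dt = inflow - outflow when the valve is open."""
    dlevel = inflow - (outflow if valve_open else 0.0)
    return level + dlevel * dt

def controller(level, high=8.0, low=2.0, valve_open=False):
    """DE model: open the valve above `high`, close it below `low`."""
    if level >= high:
        return True
    if level <= low:
        return False
    return valve_open  # hysteresis: keep the previous valve state

def co_simulate(t_end=20.0, dt=0.1):
    """Advance both models in lock step over a shared time base."""
    level, valve_open, t = 5.0, False, 0.0
    while t < t_end:
        valve_open = controller(level, valve_open=valve_open)  # DE side
        level = plant_step(level, valve_open, dt)              # CT side
        t += dt
    return level

print(co_simulate())
```

Real co-simulation engines negotiate step sizes and resolve shared variables through a reconciled operational semantics; this sketch only shows the basic alternation between the two models.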
Our tutorial mixes lectures, discussions and opportunities for hands-on tool-based modelling using the new Crescendo toolset linking Overture-VDM and 20-sim tools. We will introduce the principles of co-modelling and co-simulation and give participants the opportunity to experiment with co-model creation and modification. We will present the experience gained in developing Crescendo, and with its industry application in the DESTECS project (www.destecs.org).

Tutorial 2: Unifying Theories of Programming in Isabelle/HOL
Date: 12 May 2014 (full day)
Presenters: Jim Woodcock and Simon Foster (University of York, UK)

Abstract: This is a tutorial introduction to three theories in Hoare & He's Unifying Theories of Programming (UTP) and their implementation in the Isabelle interactive theorem prover. We describe mechanised theories of (i) relations, (ii) designs (pre-postcondition pairs), and (iii) reactive processes. UTP is a long-term research agenda to understand the relationship between different programming theories and pragmatic programming paradigms. The origin of UTP is in the work on predicative programming, which was started by Hehner. It supports three different ways to study such relationships: (i) by computational paradigm; (ii) by level of abstraction; and (iii) by method of presentation.
In this tutorial, we start with the basic concepts of UTP: alphabets, signatures, and healthiness conditions, and the concept of theory mechanisation in Isabelle/HOL. We go on to describe the alphabetised relational calculus, the formalism used to describe predicates in UTP theories. We introduce a basic nondeterministic programming language and its laws of programming and show how this is mechanised. We complete the initial presentation of UTP by describing the organisation of UTP theories into complete lattices. We show how Hoare logic and the weakest precondition calculus are defined in UTP.
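In UTP both calculi are defined over alphabetised relations rather than taken as primitive. Following Hoare and He, for a relation Q, precondition p and postcondition r (with r′ denoting r on the dashed, after-state variables, and square brackets denoting universal closure):

```latex
p \{Q\} r \;\mathrel{\widehat{=}}\; [\, p \land Q \Rightarrow r' \,]
\qquad\qquad
Q \;\mathrm{wp}\; r \;\mathrel{\widehat{=}}\; \lnot(Q \mathbin{;} \lnot r)
```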
The relational semantics of our programming language allows us to construct a theory of partial correctness; for total correctness, we need a more powerful theory. The UTP theory of designs captures the notion of total correctness using assumptions and commitments (preconditions and postconditions). The theory depends on two observational variables to detect when a program has started and when it has finished. We show how to mechanise the theory in Isabelle/HOL and how to characterise the set of designs syntactically, semantically, and through healthiness conditions.
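The theory's central definition (following Hoare and He) is short enough to state here; ok and ok′ are the two observational variables mentioned above:

```latex
P \vdash Q \;\mathrel{\widehat{=}}\; (ok \land P) \Rightarrow (ok' \land Q)
```

This reads: if the program has started (ok) in a state satisfying the assumption P, then it terminates (ok′) in a state satisfying the commitment Q.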
The third and final part of the tutorial adds concurrency and communication to the programming language using the UTP theory of reactive processes. This theory is characterised by more healthiness conditions to record the history of events and responsiveness to external events. Once more, we show how this theory can be mechanised and provide example proofs in Isabelle/HOL.
The mechanisation in Isabelle/HOL of the nondeterministic, imperative, concurrent programming language with communication channels forms the basis of the Circus family of modelling languages, including CML (the COMPASS Modelling Language), which is used for developing systems of systems.

Tutorial 3: Formal Model-Driven Engineering
Date: 13 May 2014 (1/2 day, AM)
Presenters: James Welch, Jeremy Gibbons and Jim Davies (University of Oxford, UK)

Abstract: Booster is a formal method designed for the automatic production of information systems from precise object models. Inspired by B and Z, it is supported by a collection of Eclipse-based (Spoofax) model transformations for the generation of relational database implementations with a matching Java API and default 'object browser' interfaces. This tutorial provides a brief introduction to Booster, explaining the approach taken to language design and implementation, together with a first-hand account of the lessons learnt in applying the method to real-world systems development. This is followed by a hands-on demonstration: participants will be guided through the generation of an information system from a simple object model.

Tutorial 4: BACH: From Bounded Reachability Analysis of Linear Hybrid Automata to Online Verification of Industrial Cyber-Physical Systems
Date: 13 May 2014 (1/2 day)
Presenters: Lei Bu and Xuandong Li (Nanjing University, China)

Abstract: Safety assurance of real-time safety-critical systems is a very important problem. Hybrid automata (HA) are a natural modeling language for real-time systems with both discrete and continuous state changes. However, as the behavior of HA is highly complex, state-of-the-art techniques cannot scale up to industrial cases. Furthermore, a Cyber-Physical System (CPS) is even more difficult to analyze than a standalone real-time embedded system. In this tutorial, we will give a brief introduction to HA and present an overview of our work on efficiently model checking large-scale real-time systems. We give a tour of path-oriented methods for controlling complexity, and introduce an online modeling and verification framework that handles nondeterministic CPSs by periodically modeling and checking the system's time-bounded behavior over a short-run future, online.
The above techniques are all implemented in a toolset, BACH, which supports graphical modeling, reachability verification, and online runtime verification of hybrid and CPS systems. BACH outperforms many competitors on well-recognized benchmarks. Since its debut five years ago, BACH has attracted hundreds of registered users globally, including researchers from top institutions (e.g. CMU, UC Berkeley, UBC) and engineers from industry sectors such as aerospace, railway and telecommunications.
We will demonstrate BACH during the tutorial. Specifically, we will use an animated simulation platform based on a real industrial CPS, a communication-based train control (CBTC) system, to highlight success stories of using BACH in industrial practice. We will show that BACH can perform online modeling and verification of this complex industrial CBTC system in only milliseconds.

Tutorial 5: The Correctness-by-Construction Approach to Programming
Date: 12 May 2014 (full day)
Presenters: Bruce Watson (Stellenbosch University, South Africa) and Derrick Kourie (University of Pretoria, South Africa)

Abstract: The purpose of the tutorial is to influence deeply the way participants approach the task of developing algorithms, with a view to improving code quality. The tutorial features: a step-by-step explanation of how to derive provably correct algorithms using small and tractable refinement rules; a detailed illustration of the presented methodology through a set of carefully selected graded examples; and a demonstration of how practical, non-trivial algorithms have been derived.
The focus is on bridging the gap between two extreme approaches to developing software. On the one hand, some approaches are so formal that they scare off all but the most dedicated theoretical computer scientists. On the other hand, some practitioners believe that any measure of formality is a waste of time, resulting in software developed by following gut feeling and intuition.
The “correctness-by-construction” approach to developing software relies on a formal theory of refinement, and requires the theory to be deployed in a systematic but pragmatic way. We provide the key theoretical background (refinement laws) needed to apply the method. We then detail a series of graded examples to show how it can be applied to increasingly complex algorithmic problems.
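As a flavour of the style, here is a small illustrative example of our own (not taken from the tutorial material): a maximum-finding loop derived invariant-first, with the derivation artefacts recorded as assertions. The postcondition is fixed before any code is written, a loop invariant is chosen by weakening it, and the body is constructed to maintain the invariant while decreasing a variant.

```python
# Illustrative correctness-by-construction sketch (hypothetical example):
# derive a maximum-finding loop for a non-empty list xs.
# Postcondition:  result == max(xs)
# Invariant (chosen before the body): result == max(xs[:i])
# Variant: len(xs) - i, which strictly decreases each iteration.

def maximum(xs):
    assert len(xs) > 0                   # precondition
    result, i = xs[0], 1                 # establishes invariant for i == 1
    while i < len(xs):                   # guard; !guard and invariant => postcondition
        assert result == max(xs[:i])     # invariant holds on loop entry
        if xs[i] > result:
            result = xs[i]               # restore invariant for i + 1
        i += 1                           # variant len(xs) - i decreases
    assert result == max(xs)             # postcondition
    return result
```

The point of the discipline is that each refinement step (initialisation, guard, body) is justified by a proof obligation, so the final code is correct by the way it was constructed rather than by after-the-fact testing.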
