Mar 24, 2014

This week I am attending SAC 2014, the 29th Symposium on Applied Computing, in Gyeongju, Korea, where I am also presenting our work with Inria Rennes on Runtime Enforcement of Regular Timed Properties.

Edit: here are the slides I presented.

Mar 11, 2014

The paper Efficient and Generalized Decentralized Monitoring of Regular Languages has been accepted to FORTE’14, the 34th IFIP International Conference on Formal Techniques for Distributed Objects, Components and Systems, June 3-6, 2014, Berlin, Germany.

Below is the abstract of the paper.

This paper proposes an efficient and generalized decentralized monitoring algorithm that allows local monitors alone to detect the satisfaction or violation of any regular specification in a system without a central observation point. Our algorithm does not assume any form of synchronization between system events and the communication of monitors, uses state machines as its underlying mechanism for efficiency, and tries to keep the number and size of messages exchanged between monitors to a minimum. We provide a full implementation of the algorithm together with an open-source benchmark to evaluate its efficiency in terms of the number and size of exchanged messages and the delay induced by communication between monitors. Experimental results demonstrate the effectiveness of our algorithm, which outperforms the previously most general one along several (new) monitoring metrics.

This is joint work with Tom Cornebize and Jean-Claude Fernandez, LIG and Verimag lab, Grenoble.

Mar 10, 2014

This week I am visiting HUST, the Hanoi University of Science and Technology (aka IPH, Institut Polytechnique de Hanoi), to work with Thanh Hung Nguyen.

Feb 12, 2014

PERSYVAL-Lab and NASA-JPL are organizing the second edition of the CPS Summer School. The broad objective of the CPS Summer School is to explore the manifold relationship between networked embedded systems ("the Internet of Things") and humans as their creators, users, and subjects. The format of the Summer School is a five-day meeting organized around different aspects of the rigorous engineering of Cyber-Physical Systems.

This year, the objective of the school is to survey fundamental and applied aspects of modelling, monitoring and learning of systems as well as to identify novel opportunities and research directions in these areas through a series of lectures by international experts. Participants will also experience the relevant technologies during hands-on courses and be given a chance to present their own work. The school will provide a great opportunity to know other people working in the field, to meet distinguished scholars, and to establish contacts that may lead to research collaborations in the future.

The school will concentrate on the fields of system modelling, monitoring, and learning. Over the last ten years, these areas have grown considerably, building on strong theoretical foundations to apply and extend techniques to new application domains. Runtime verification is a growing field with more and more effective applications in safety/mission-critical systems, enterprise and systems software, autonomous and reactive control systems, health management and diagnosis systems, and system security and privacy. The field of specification mining (learning specifications from system behaviour) has also seen a surge in research effort, with the establishment of a number of competitions to drive forward the development of practical tools. This research community is at an ideal stage to benefit from a school such as this one, which can inspire, motivate, and instruct new researchers entering the field.

Students participating in this summer school will learn the current state of the art in modeling, monitoring, and learning, and will be able to apply new techniques coming from various communities and backgrounds to their own domains. The CPS Summer School will be held at Grenoble University. Courses will be given in English by experts from industry and academia working in various fields of CPS. Topics will include:


• System modeling.
• Monitoring.
• Learning.
• Medical devices.
• Sensor networks.

Confirmed Speakers:

• Eric Bodden (TU Darmstadt and EC SPRIDE, Germany).
• Olivier Coutelou (Schneider Electric, France).
• Radu Grosu (Vienna University of Technology, Austria).
• Jean Goubault-Larrecq (ENS Cachan, France).
• Klaus Havelund (NASA JPL, USA).
• Kim G. Larsen (Aalborg University, Denmark).
• Martin Leucker (University of Lübeck, Germany).
• Roberto Passerone (Università degli Studi di Trento, Italy).
• Grigore Rosu (University of Illinois at Urbana Champaign, USA).
• Mohamad Sawan (École Polytechnique de Montréal, Canada).
• Bernhard Steffen (Technical University Dortmund, Germany).
• Andreas Zeller (Saarland University, Germany).

Organization Committee:

• Saddek Bensalem – University of Grenoble, France.
• Yliès Falcone – University of Grenoble, France.
• Klaus Havelund – NASA JPL, USA.

The registration fee is €250 for students and €400 for non-students; it includes lunches and coffee breaks from Monday 8th through Thursday 11th, and a party. The registration fee only partially covers the costs incurred; the remaining costs are covered by PERSYVAL-Lab. The local organization committee has arranged university accommodation for students. Please refer to the website for more details.

Application Procedure and Important Dates (please refer to the Website for the full procedure):

• Deadline for Application: April 14, 2014.
• Response to Applicants: April 21, 2014.
• Online Registration and Fee payment: May 10, 2014.
• Summer school on CPS: July 7-11, 2014.
Since attendance is limited, priority will be given to Ph.D. students and company staff.

More details can be found at:

Applications can be submitted at:

Enquiries can be sent to

Jan 20, 2014

This week I am visiting Laboratoire Bordelais de Recherche en Informatique (Bordeaux computer science research lab) to work with Antoine Rollet, Serge Chaumette, Hervé Marchand, and Srinivas Pinisetty.

Thank you Antoine for the invitation.

Nov 26, 2013

The paper Runtime Enforcement of Regular Timed Properties has been accepted to the ACM Symposium on Applied Computing, Software Verification and Testing track, Gyeongju, Korea, March 24-28, 2014.

Runtime enforcement is a verification/validation technique that aims at correcting (possibly incorrect) executions of a system of interest. In this paper, we consider enforcement monitoring for systems with timing specifications (modeled as timed automata); in particular, we consider runtime enforcement of any regular timed property specified by a timed automaton. To ease their design and correctness proofs, enforcement mechanisms are described at several levels: enforcement functions, which specify the input-output behavior; constraints that such functions should satisfy; enforcement monitors, which implement an enforcement function as a transition system; and enforcement algorithms, which describe the implementation of enforcement monitors. The feasibility of enforcement monitoring for timed properties is validated by prototyping the synthesis of enforcement monitors.
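As a rough, untimed illustration of the enforcement idea (this is only a sketch, not the paper's timed-automaton construction; the DFA, states, and event names are invented for the example): an enforcement monitor buffers input events and releases them only once the observed sequence satisfies the property.

```python
# Minimal untimed sketch of an enforcement monitor: events are buffered
# and released only when the run of a property DFA reaches an accepting
# state, so the output is always a correct prefix of the input.
# The DFA and event names below are purely illustrative.

class EnforcementMonitor:
    def __init__(self, transitions, initial, accepting):
        self.transitions = transitions  # (state, event) -> state
        self.state = initial
        self.accepting = accepting
        self.buffer = []   # events received but not yet safe to release
        self.output = []   # events released so far

    def input(self, event):
        """Receive one event; flush the buffer when the run is accepting."""
        self.state = self.transitions[(self.state, event)]
        self.buffer.append(event)
        if self.state in self.accepting:
            self.output.extend(self.buffer)
            self.buffer.clear()
        return list(self.output)

# Toy property: every "req" must be followed by an "ack".
dfa = {("ok", "req"): "wait", ("wait", "ack"): "ok", ("ok", "ack"): "ok"}
m = EnforcementMonitor(dfa, initial="ok", accepting={"ok"})
m.input("req")         # "req" alone is buffered, not released
print(m.input("ack"))  # prints ['req', 'ack']
```

The actual paper additionally handles physical time (clock constraints of the timed automaton), which this untimed sketch deliberately omits.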

This is joint work with S. Pinisetty (Inria, Rennes – Bretagne Atlantique), T. Jéron (Inria, Rennes – Bretagne Atlantique), H. Marchand (Inria, Rennes – Bretagne Atlantique).

Nov 19, 2013

CSRV-2014, the 1st International Software Runtime Verification Competition, will be held as part of the 14th International Conference on Runtime Verification in September 2014 in Toronto, Canada.

CSRV-2014 will draw attention to the invaluable effort of software developers and researchers who contribute to this field by providing the community with new or updated tools, libraries, and frameworks for the instrumentation and runtime verification of software.

Runtime verification is a verification technique for the analysis of software at execution time: it extracts information from a running system and checks whether the observed behaviors satisfy or violate the properties of interest. During the last decade, many important tools and techniques have been developed and successfully employed. However, there is a pressing need to compare such tools and techniques, since we currently lack a common benchmark suite as well as scientific evaluation methods to validate and test new prototype runtime verification tools.
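The extract-and-check loop can be pictured with a toy offline monitor (this is only an illustration, not any competition entry; the property and event names are invented):

```python
# Toy offline monitor for the invented property "every file opened is
# eventually closed, and no file is closed without being open".
# It replays a recorded trace of (event, file) pairs and emits a verdict.

def monitor(trace):
    """Return 'violation', 'satisfied', or 'inconclusive' for a finite trace."""
    open_files = set()
    for event, f in trace:
        if event == "open":
            open_files.add(f)
        elif event == "close":
            if f not in open_files:
                return "violation"  # close without a matching open
            open_files.remove(f)
    # A finite trace with files still open is neither satisfied nor violated.
    return "satisfied" if not open_files else "inconclusive"

trace = [("open", "a.txt"), ("open", "b.txt"), ("close", "a.txt")]
print(monitor(trace))  # prints "inconclusive": b.txt is still open
```

Real runtime verification tools differ mainly in how they obtain the trace (instrumentation, online vs. offline) and in the specification languages they support, which is precisely what the competition tracks below compare.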

The main aims of the CSRV-2014 competition are to:

  • Stimulate the development of new, efficient, and practical runtime verification tools, and the maintenance of already developed ones.
  • Produce a benchmark suite for runtime verification tools, by sharing case studies and programs that researchers and developers can use in the future to test and validate their prototypes.
  • Discuss the metrics employed for comparing the tools.
  • Provide a comparison of the tools run on different benchmarks and evaluated using different criteria.
  • Enhance the visibility of the presented tools among the different communities (verification, software engineering, cloud computing, and security) involved in software monitoring.

Please direct any enquiries to the competition co-organizers:

CSRV-2014 Jury

The CSRV jury will include a representative of each participating team and some representatives of the Demonstration Tools Committee of the Runtime Verification conference.

Call for Participation

The main goal of the CSRV-2014 competition is to compare tools for runtime verification. We invite and encourage participation with benchmarks and tools for the competition. The competition will consist of three main tracks based on the input language used:

  • Track on monitoring Java programs (online monitoring)
  • Track on monitoring C programs (online monitoring)
  • Track on monitoring of traces (offline monitoring)

The competition will follow three phases:

  • Benchmark/specification collection phase - participants are invited to submit their benchmarks (C or Java programs and/or traces). The organizers will collect them in a common, publicly available repository, and participants will then train their tools on the shared benchmarks;
  • Monitor collection phase - participants are invited to submit their monitors. Participants whose tools/monitors (see more information in the following section) meet the qualification requirements will be qualified for the evaluation phase;
  • Evaluation phase - the qualified tools will be evaluated by running the benchmarks and will be ranked using different criteria (e.g., memory utilization/overhead, CPU utilization/overhead, …). The final results will be presented at the RV 2014 conference.

Important Dates

  • Dec 15, 2013: Declaration of intent (by email).
  • March 1, 2014: Submission deadline for benchmark programs and the properties to be monitored.
  • March 15, 2014: Tool training starts.
  • June 1, 2014: Monitor submission.
  • July 1, 2014: Notifications and reviews.