[loginf] CfPart: RV'08

Martin Leucker leucker at in.tum.de
Tue Feb 26 22:09:00 CET 2008


We apologize if you receive multiple copies of this email.

                           Call for Participation

                 8th Workshop on Runtime Verification

                            March 30, 2008
                          Budapest, Hungary

                       Affiliated with ETAPS'08

Invited Talks

Jean Goubault-Larrecq
LSV - Laboratoire Spécification et Vérification
École Normale Supérieure de Cachan

Title: Orchids, and Bad Weeds


Orchids is a real-time intrusion detection system. The original
algorithmic idea is online model checking of a temporal logic,
specialized to intrusion detection. We shall explain how this can be
seen, alternatively, as dynamically spawned (and dynamically killed)
monitors. We shall also demonstrate Orchids in action, catching bad
weeds -- actual, recent, and subtle system and network attacks.

The presentation is based on joint work with Julien Olivain (formerly
LSV, now Above Security, Montreal, Canada).


John Rushby
SRI International Computer Science Laboratory

Title: Runtime Certification

Software often must be certified for safety, security, or other
critical properties. Traditional approaches to certification require
the software, its systems context, and their assurance artifacts to
be available for scrutiny in their final, completed forms. But modern
development practices postpone the determination of the final system
configuration from design time to integration time, load time, or even
runtime. Adaptive systems go beyond this and modify or synthesize
functions at runtime.

Developments such as these require an overhaul of the basic framework
for certification, so that some of its responsibilities may also be
discharged at integration time, load time, or runtime.

I will outline a suitable framework, in which the basis for
certification is changed from compliance with standards to the
construction of explicit goals, evidence, and arguments (generally
called an ``assurance case''). I will describe how runtime
verification can be used within this framework, thereby allowing
certification to be performed partially at runtime -- or, more
provocatively, enabling ``runtime certification.''


Program

March 30, 2008
09:00 - 10:30 SESSION 1

    Introduction and Welcome 

    Invited talk: Orchids, and Bad Weeds 
    Jean Goubault-Larrecq (LSV/ENS Cachan) with contributions from
    Julien Olivain (Above Security)   

    Model-based Run-time Checking of Security Permissions using
    Guarded Objects  
    Jan Jürjens (Open University) 

10:30 - 11:00 Coffee
11:00 - 12:30 SESSION 2

    Synthesizing Monitors for Safety Properties - This Time With Calls
    and Returns  
    Grigore Rosu, Feng Chen (UIUC) and Thomas Ball (Microsoft Research) 

    Forays into Sequential Composition and Concatenation in EAGLE 
    Joachim Baran and Howard Barringer (University of Manchester) 

    Checking Traces for Regulatory Conformance 
    Nikhil Dinesh, Aravind Joshi, Insup Lee and Oleg Sokolsky (UPenn) 

12:30 - 14:00 Lunch
14:00 - 16:00 SESSION 3

    Invited talk: Runtime Certification 
    John Rushby (SRI) 

    Deadlocks: from Exhibiting to Healing 
    Yarden Nir-Buchbinder, Rachel Tzoref and Shmuel Ur (IBM Haifa) 

    A Scalable, Sound, Eventually-Complete Algorithm for Deadlock Immunity 
    Horatiu Jula and George Candea (EPFL) 

16:00 - 16:30 Coffee
16:30 - 18:00 SESSION 4

    Property Patterns for Runtime Monitoring of Web Service Conversations 
    Jocelyn Simmonds, Marsha Chechik, Shiva Nejati (University of
    Toronto), and Elena Litani and Bill O'Farrell (IBM Toronto)  

    Runtime Monitoring of Object Invariants with Guarantee 
    Madhu Gopinathan (Indian Institute of Science) and Sriram Rajamani
    (Microsoft Research India)   

    A Lightweight Container Architecture for Runtime Verification 
    Hakim Belhaouari and Frederic Peschanski (Laboratoire
    d'Informatique de Paris 6)  


RV'08 brings together researchers to debate how to monitor and
analyze the execution of programs. The focus of runtime verification
ranges from testing software before deployment to detecting errors
after deployment. Approaches to runtime verification include checking
conformance with a formal specification written in a temporal or
history-tracking logic. One of the longer-term goals of the workshop
is to investigate the use of lightweight formal methods applied at
runtime as a viable complement to methods aimed mainly at proving
programs correct prior to execution, e.g., theorem proving and model
checking.

Moreover, the focus of RV ranges from detecting (non-)conformance to
triggering fault-protection mechanisms once non-conformance has been
detected. This allows for new software design and programming
paradigms. Thus, RV's topics partially overlap with those of related
areas such as aspect-oriented programming, self-healing systems,
autonomous systems, and adaptive systems.

The subject covers several technical fields as outlined below.

    * Specification languages and logics:

      Formal methods scientists have investigated logics and developed
      technologies that are suitable for model checking and theorem
      proving, but monitoring can reveal new observation-based
      foundational logics.

    * Aspect-oriented languages with trace predicates:

      New results in extending aspect languages, such as AspectJ,
      with trace predicates replacing the standard pointcuts.
      Aspect-oriented programming provides specific solutions to
      program instrumentation and program guidance.

    * Program instrumentation in general:

      Any techniques for instrumenting programs, at the source-code or
      object-code/bytecode level, to emit relevant events to an
      observer.

    * Program Guidance in general:

      Methodologies, architectures, and techniques for guiding the
      behavior of a program once its specification is violated, for
      developing self-healing, autonomous, or adaptive systems.
      Techniques ranging from standard exceptions to advanced
      planning lead to new development methodologies and software
      architectures, such as monitor-oriented programming or
      monitor-based runtime reflection.

    * Combining static and dynamic analysis:

      Monitoring a program with respect to a temporal formula can have
      an impact on the monitored program, in terms of both execution
      time and memory consumption. Static analysis can be used to
      minimize this impact by optimizing the program instrumentation.
      Runtime monitors can be seen as proof obligations left over
      from proofs -- what remains that could not be proved statically.

    * Dynamic program analysis:

      Techniques that gather information during program execution and
      use it to conclude properties about the program, either during
      test or in operation. Algorithms for detecting multi-threading
      errors in execution traces, such as deadlocks and data races.
      Algorithms for generating specifications from runs -- dynamic
      reverse engineering.
    * Security analysis:

      Monitoring for the enforcement of security policies. Successful
      applications include operating-system and middleware access
      control, firewalls, stack-inspection-based sandboxing, detection
      of threats from untrustworthy (malicious or buggy) code,
      intrusion detection, etc.

    * Contract analysis:

      Monitoring for the enforcement of contract fulfillment in SOA
      and web services, especially in contract-oriented software
      development.

Contributions on both foundational and practical aspects are encouraged.


Program Committee

Mehmet Aksit            (University of Twente, NL)
Howard Barringer        (University of Manchester, UK)
Mads Dam                (KTH Stockholm, SE)
Bernd Finkbeiner        (Saarland University, DE)
Klaus Havelund          (NASA Jet Propulsion Laboratory, US)
Bengt Jonsson           (Uppsala Universitet, SE)
Moonzoo Kim             (KAIST, KR)
Martin Leucker (Chair)  (Technical University of Munich, DE)
Dejan Nickovic          (Verimag, FR)
Doron Peled             (Bar Ilan University, IL)
Mauro Pezzè             (University of Lugano, CH)
Shaz Qadeer             (Microsoft Research, US)
Grigore Rosu            (University of Illinois, Urbana-Champaign, US)
Gerardo Schneider       (University of Oslo, NO)
Henny Sipma             (Stanford University, US)
Oleg Sokolsky           (University of Pennsylvania, US)
Scott Stoller           (State University of New York, US)
Mario Südholt           (Ecole des Mines de Nantes-INRIA, LINA, FR)
Serdar Tasiran          (Koc University, TR)
Stavros Tripakis        (Cadence Labs, US)
Yaron Wolfsthal         (IBM, IL)


Steering Committee

Klaus Havelund          (NASA Jet Propulsion Laboratory)
Gerard Holzmann         (NASA Jet Propulsion Laboratory)
Insup Lee               (University of Pennsylvania)
Grigore Rosu            (University of Illinois, Urbana-Champaign)
