Reactive Property Monitoring of Hybrid Systems with Aggregation

Nicolas Rapin
CEA LIST, Boîte Courrier 174, Gif-sur-Yvette, F-91191 France
[email protected]

Abstract. This work¹ is related to our monitoring tool called ARTiMon for the property monitoring of hybrid systems. We explain how the aggregation operator of its language derives naturally from a generalization of the eventually operator as introduced by Maler & Nickovic for MITL[a,b]. We present its syntax and its semantics using an interval-based representation of piecewise-constant functions. We define an on-line algorithm for its semantics calculus, coupled with an elimination of irrelevant intervals in order to keep the memory resource bounded.

¹ The final publication is available at Springer via http://dx.doi.org/10.1007/978-3-319-46982-9_28.

1 Introduction

Property monitoring is a unified solution for detecting failures at many stages of a system's life-cycle. Supervision, applied during the exploitation phase, requires reactive monitoring: monitors have to run on-line, in real time and indefinitely. The motivation of our work is to define an expressive specification language suitable for systems evolving in dense time, like continuous and hybrid systems, coupled with an effective monitoring approach suitable for supervision purposes. In this short paper we restrict the presentation of this approach to a single operator, called the aggregation operator, which makes real-time temporal logics restricted to the boolean type, or to a booleanization [5] of non-boolean types, more expressive. Our presentation is strongly based on a work due to Maler & Nickovic [4]. Signals and the eventually operator are recalled and discussed in Section 2. Section 3 is dedicated to the aggregation operator and gives some examples of properties. Section 4 describes an algorithm for reactive monitoring of aggregation properties.

2 Signals

In [4] Maler & Nickovic study MITL[a,b], a bounded version of MITL, and its interpretation over behaviors of continuous and hybrid systems modelled by signals, defined as partial piecewise-constant boolean time functions satisfying the finite variability property. Formally, a signal s is a function ranging in B = {⊥, ⊤} and whose time definition domain is a bounded interval of R, noted
|s|. This domain is bounded because, in the context of monitoring, a running system always delivers a partial trace. But as time elapses the monitoring process extends the domain of signals, i.e. |s| is successively of the forms ∅, δ1, δ1 ∪ δ2, . . . where δi, δi+1 are adjacent intervals (δi ∪ δi+1 is an interval and δi ∩ δi+1 = ∅) satisfying δi ≺ δi+1 (which holds if t < t′ holds for any t ∈ δi, t′ ∈ δi+1). Of course monitoring produces only conservative extensions: noting sn the signal s at the nth extension, for any n > 1, the restriction of sn+1 to |sn| is sn. The finite variability property ensures that any signal can be represented by a finite and minimal set of intervals carrying the value ⊤ (called positive intervals). Notice that finite variability does not imply the bounded variability property, which is satisfied when the function changes with a bounded rate χ ∈ N (i.e. at most χ variations over any interval of length 1). A signal changing n times on [n, n + 1[ satisfies finite variability but not bounded variability.

The eventually operator is derived from the until operator, which is primary in MITL[a,b]. Its syntax is ◇i ϕ, where ϕ is a boolean sub-term and i a bounded interval of R+. Its semantics is defined with |=, called the satisfaction relation: (s, t) |= ◇i ϕ iff ∃t′ ∈ t ⊕ i. (s, t′) |= ϕ, where t ⊕ i denotes the interval i shifted by t (for example t ⊕ [a, b[ is [t + a, t + b[). Notice that the notation (s, t′) |= ϕ is equivalent to s(t′) = ⊤ when s is a time function. As far as we know the relation |= comes from model theory. Using it presupposes that signals are considered as models and that all terms of the logic should be interpreted with respect to those models.

Our approach, which constitutes one of our contributions, is different. We do not really interpret terms over models as usual. Instead we consider that there exists a set of ground signals and that the operators of the logic act as constructors for building new time functions or, as will be proven, new signals. According to this point of view the term ◇i ϕ builds a time function noted (◇i ϕ) (we add parentheses to the term to denote the function it builds) which derives inductively from (ϕ). Let us begin with the definition of (◇i ϕ)(t); we will then focus on its definition domain. Derived from the above definition based on the |= relation, a first definition is: (◇i ϕ)(t) = ⊤ iff ∃t′ ∈ (t ⊕ i). (ϕ)(t′) = ⊤. Another, equivalent definition can be given by introducing the set of values taken by a time function over a restriction of its domain. Let g be a signal and r be a set satisfying r ⊆ |g|; then g(r) denotes the set {g(t) / t ∈ r}. The semantic definition of an eventually term at a time instant becomes:

Definition 1 (Eventually as Aggregation).

(◇i ϕ)(t) = ⋁_{b ∈ (ϕ)(t ⊕ i)} b

Since (ϕ) is a boolean function we have (ϕ)(t ⊕ i) ⊆ {⊥, ⊤}. It suffices that ⊤ ∈ (ϕ)(t ⊕ i) for (◇i ϕ) to be true at t. This definition emphasizes the fact that the eventually modality is the result of the aggregation of a set of values using disjunction. Extending this aggregation notion, which is the main idea of this work, will be studied below. For now let us define |(◇i ϕ)|. We consider that (◇i ϕ) is reliable at time instant t if (t ⊕ i) ⊆ |(ϕ)|. We define |(◇i ϕ)| as the set of all reliable time instants, so |(◇i ϕ)| = {t / t ⊕ i ⊆ |(ϕ)|}. We will note this latter set i # |(ϕ)| in the sequel. As |(ϕ)| is a bounded interval, i # |(ϕ)| is also a bounded interval.
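To make this pointwise reading of the eventually semantics concrete, the following minimal sketch (not taken from ARTiMon, names purely illustrative) assumes a simplified representation in which a boolean signal is given by its bounded domain and a chronologically ordered list of half-open positive intervals [lb, ub); the closedness flags introduced later in the paper are deliberately ignored.

```python
# A minimal sketch of the eventually semantics (assumed, simplified
# representation: half-open intervals given as (lb, ub) pairs).

def shrink(i, dom):
    """i # dom = {t | t + i is included in dom}: the reliable domain."""
    (ilb, iub), (dlb, dub) = i, dom
    lb, ub = dlb - ilb, dub - iub
    return (lb, ub) if lb < ub else None      # empty when dom is too short

def eventually_at(pos, dom, i, t):
    """(eventually_i phi)(t): true iff a positive interval of phi meets t + i."""
    reliable = shrink(i, dom)
    if reliable is None or not (reliable[0] <= t < reliable[1]):
        raise ValueError("t is outside the reliable domain i # |phi|")
    lo, hi = t + i[0], t + i[1]
    return any(lb < hi and lo < ub for (lb, ub) in pos)

# Example: phi true on [2, 3) within |phi| = [0, 10), window i = [0, 1.5).
print(eventually_at([(2.0, 3.0)], (0.0, 10.0), (0.0, 1.5), t=1.0))  # True
```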

Definition 2 (Eventually Semantics). Let ϕ be a boolean signal.

|(◇i ϕ)| = i # |(ϕ)|        (◇i ϕ)(t) = ⋁_{b ∈ (ϕ)(t ⊕ i)} b

Remark. An important point to notice here, which constitutes one of our contributions, is that the completeness and reliability of the domain enable an incremental and inductive computation of signals. Basic case: as mentioned in Section 2, the extension of a ground signal is conservative. Induction step: consider the conservative extension of (ϕ) from domain D to D ∪ λ. According to Definition 2, the domain |(◇i ϕ)| is extended from i # D to i # (D ∪ λ). By the induction hypothesis (ϕ) remains the same on D. It follows that (◇i ϕ) remains the same on i # D. Hence the extension of (◇i ϕ) is also conservative. Thus one has to compute (◇i ϕ) only on ∆ = (i # (D ∪ λ)) \ (i # D) in order to know the function (◇i ϕ) on i # (D ∪ λ). One can already feel the benefit of such a restriction for the on-line calculus. It will be detailed in Section 4.

Lemma 1 (Signal Property Conservation). If (ϕ) is a signal then the time function (◇i ϕ), as defined in Definition 2, is also a signal.

We have already mentioned that |(◇i ϕ)|, defined as i # |(ϕ)|, is bounded (in the algorithm below we give an operational calculus for i # |(ϕ)|). What remains to be proved is that (◇i ϕ) satisfies the finite variability property. To establish this we need to describe the operational semantics calculus of (◇i ϕ). The so-called backward propagation proposed in [4] plays an important role in this calculus. For ease of presentation, in an algorithmic context, any signal is assimilated to its interval-based representation, being a chronologically ordered list of positive intervals (i.e. ordered by ≺). It is also useful to formalize intervals and their associated operations before introducing backward propagation. Formally an interval is a 4-tuple (l, lb, ub, u) of B × R × R × B (for example (⊤, a, b, ⊥) stands for [a, b[). We use dotted notation to denote interval attributes: (l, a, b, u).ub denotes b. The opposite of i, noted −i, is (i.u, −i.ub, −i.lb, i.l); the notation t ⊕ i stands for (i.l, t + i.lb, t + i.ub, i.u) and t ⊖ i for t ⊕ −i. The ⊕ operation can be extended to an interval: given two intervals k, i, k ⊕ i denotes ∪_{t∈k} t ⊕ i, which is (k.l ∧ i.l, k.lb + i.lb, k.ub + i.ub, k.u ∧ i.u). A valued interval is an interval carrying a value; val(i) denotes the value carried by i. For example a boolean positive interval is an interval carrying the value ⊤.

Backward Propagation. Let us suppose that for t′ ∈ |(ϕ)| we have (ϕ)(t′) = ⊤; then also (◇i ϕ)(t) = ⊤ provided t satisfies t′ ∈ t ⊕ i, i.e. t ∈ t′ ⊕ −i. The interval t′ ⊕ −i, also noted t′ ⊖ i, is the backward propagation of the true value of ϕ at t′. This can be extended to an interval: if ϕ is valid over k then so is (◇i ϕ) over k ⊖ i. Given that signal representations are based on positive intervals, an algorithm for computing (◇i ϕ) could be the following. Init: (◇i ϕ) = ∅. Iteration: for each interval k of (ϕ), aggregate j = k ⊖ i to (◇i ϕ). Post-treatment: merge adjacent intervals of (◇i ϕ) (until no more adjacent ones can be found). We will refer to this algorithm as the off-line algorithm, as (ϕ) is assumed to exist as an input. In the Iteration step, aggregate has different meanings depending on how j covers the existing positive intervals of (◇i ϕ): if it covers none, it is simply added to (◇i ϕ); if it covers some empty spaces, each empty space covered by j is converted into a positive interval added to (◇i ϕ). The merging step is performed in order to obtain minimality of the representation. It follows that the backward propagation of one interval of ϕ produces three kinds of modifications of (◇i ϕ): (1) it adds one interval, (2) it extends one interval, (3) it reduces the number of intervals (when j fills the gap between intervals, which are then merged). By the induction hypothesis (ϕ) is a signal; it satisfies the finite variability property and hence its interval representation is composed of a finite number of positive intervals. So, according to modifications (1), (2), (3), this is also the case for (◇i ϕ), which satisfies the finite variability assumption; hence it is a signal. With the same argumentation we can prove that bounded variability is preserved.
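The off-line algorithm just described can be sketched as follows. This is an illustrative reconstruction rather than ARTiMon's implementation: intervals are simplified to half-open (lb, ub) pairs, so the closedness flags of the 4-tuple representation are dropped, and the aggregate and merge steps are fused, which is possible because the propagated intervals arrive in chronological order.

```python
# A sketch of the off-line algorithm for the eventually operator, using
# backward propagation (half-open intervals; names are illustrative).

def back_propagate(k, i):
    """k - i : every t in this interval has its window t + i meeting k."""
    (klb, kub), (ilb, iub) = k, i
    return (klb - iub, kub - ilb)

def eventually_offline(phi_pos, phi_dom, i):
    """phi_pos: chronologically ordered positive intervals of phi;
    phi_dom: |phi|. Returns the positive intervals of the result on i # |phi|."""
    (dlb, dub), (ilb, iub) = phi_dom, i
    dom_lb, dom_ub = dlb - ilb, dub - iub            # i # |phi|
    if dom_lb >= dom_ub:
        return []
    out = []                                          # Init
    for k in phi_pos:                                 # Iteration
        jlb, jub = back_propagate(k, i)
        jlb, jub = max(jlb, dom_lb), min(jub, dom_ub) # keep the reliable part
        if jlb >= jub:
            continue
        if out and jlb <= out[-1][1]:                 # Post-treatment: merge
            out[-1] = (out[-1][0], max(out[-1][1], jub))
        else:
            out.append((jlb, jub))
    return out

# Example: phi true on [2, 3) and [4, 5), |phi| = [0, 10), window i = [0, 1.5):
# the propagated intervals [0.5, 3) and [2.5, 5) are merged into [0.5, 5).
print(eventually_offline([(2.0, 3.0), (4.0, 5.0)], (0.0, 10.0), (0.0, 1.5)))
```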

3 Aggregation Operator

Our idea of the aggregation operator, which first appeared in [6], stems from the algorithm described in the previous section. Let us interpret propagation as an aggregation process. Distinguishing (◇i ϕ) before the propagation of k (with superscript bf) and after it (with superscript af), we have ∀t ∈ k ⊖ i. (◇i ϕ)af(t) = ⊤ ∨ (◇i ϕ)bf(t). This equality shows that ⊤ is aggregated by disjunction to the value of (◇i ϕ)bf for every t of k ⊖ i. This is coherent with Definition 2, which relates the eventually modality to disjunction. Now ϕ could have a type other than the boolean type, and the aggregation could be based on functions other than disjunction. This is what we investigate in the remainder. A non-boolean signal differs from a boolean one by its range, which is of the form E = E′ ∪ {∅} where E′ gives the type of the signal. Notice that a non-boolean signal may take the value ∅, which stands for the undefined value. The interval-based representation of non-boolean signals is also based on positive intervals, whose definition is extended to intervals not carrying the special value ∅. The syntax for an aggregation term is A{f}i ϕ where f is an aggregation function, i is an interval of R with finite bounds and ϕ is a term. An aggregation function is any binary function f(e, a) which aggregates an element e to an aggregate a (where a can be the special value ∅). Formally it is a function of E × A → A where E and A are sets, both containing the special value noted ∅, and satisfying f(∅, a) = a. A term A{f}i ϕ is well formed if ϕ and f are compatible regarding their types: if the range of ϕ is E then f must be of the form E × A → A. Then the type of A{f}i ϕ is A \ {∅}.

Examples of aggregation functions. Let max_min : R × ((R × R) ∪ {∅}) → ((R × R) ∪ {∅}) be the aggregation function satisfying max_min(x, (M, m)) = (max(x, M), min(x, m)) and max_min(x, ∅) = (x, x). Let sum : R × (R ∪ {∅}) → (R ∪ {∅}) be the function satisfying sum(e, a) = e + a, sum(e, ∅) = e; and disj the function satisfying disj(e, a) = e ∨ a, disj(e, ∅) = e. For aggregation the backward propagation satisfies: ∀t ∈ k ⊖ i. (A{f}i ϕ)af(t) = f(val(k), (A{f}i ϕ)bf(t)). If f is an aggregation function we note f̄ its extension to finite sequences. For e1, . . . , en being elements of E it satisfies f̄(()) = ∅ and f̄((e1, . . . , en)) = f(en, f̄((e1, . . . , en−1))). For Definition 1 we introduced g(r), denoting a set of values; for general aggregation we need to denote a sequence. Let g be a signal and r ⊆ |g| be an interval; the restriction of g to r is then the concatenation of a finite number of constant functions g1 → c1, . . . , gn → cn satisfying |gw| ≺ |gw+1| for w ∈ [1, n − 1]. We note gseq(r) the sequence (c1, . . . , cn).
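The aggregation functions above can be written down directly; in the following sketch Python's None stands for the undefined value ∅, and seq_fold plays the role of the extension f̄ to finite sequences (an illustrative encoding, not the tool's).

```python
# Sketches of the aggregation functions of this section (None plays the
# undefined value).

def max_min(x, a):
    """Aggregates a real x into a running (max, min) pair."""
    if a is None:
        return (x, x)
    M, m = a
    return (max(x, M), min(x, m))

def agg_sum(e, a):                 # 'sum' in the text; renamed to avoid the builtin
    return e if a is None else e + a

def disj(e, a):
    return e if a is None else (e or a)

def seq_fold(f, values):
    """Extension of f to finite sequences: fold(()) = None,
    fold((e1..en)) = f(en, fold((e1..en-1)))."""
    acc = None
    for e in values:
        acc = f(e, acc)
    return acc

print(seq_fold(max_min, [100.0, 104.0, 98.0]))   # (104.0, 98.0)
print(seq_fold(agg_sum, [1, 0, 1, 1]))           # 3
```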

Definition 3 (Aggregation). Let ϕ be a signal of range E, i a bounded interval of R, and f an aggregation function of E × A → A; then:

|(A{f}i ϕ)| = i # |(ϕ)|,        (A{f}i ϕ)(t) = f̄((ϕ)seq(t ⊕ i))

Though Maler & Nickovic also introduce non-boolean signals in their logic in [4], those are always composed with non-temporal predicative functions, reducing the composition to the boolean framework. We claim that with aggregation the logic is more expressive, as one can form terms with a spread temporal dependency (not reduced to the current time). The off-line semantics calculus for (A{f}i ϕ) can be obtained by a slight modification of the backward propagation in the algorithm described for the eventually modality. Iteration over the intervals of (ϕ) is performed in chronological order. At each step of the iteration, it aggregates j = k ⊖ i with value val(k) to (A{f}i ϕ). Due to the chronological iteration, there are only three cases: (1) j covers an empty space beyond (w.r.t. ≺) all existing positive intervals of (A{f}i ϕ), if any; this space is converted into an interval with value f(val(k), ∅) and added; (2) the value of any interval m ⊆ j is changed to f(val(k), val(m)); (3) there may exist one interval m partially covered by j; it is split in two, the value of the uncovered part is set to val(m) and that of the covered part to f(val(k), val(m)). It follows that one propagation adds at most two intervals into (A{f}i ϕ). The number of intervals of (A{f}i ϕ) is then at most double that of (ϕ).

Examples. Notice that ◇i ϕ and A{disj}i ϕ are equivalent. The invariant a ⇒ A{disj}[−1,1] b specifies that b should always be present around a in the time window [−1, 1]. Notice that our logic supports pairing of signals and the application of functions and predicates, like STL [5]. For example, if (s, s′) is a pair of signals and g a binary function or predicate, then g(s, s′)(t) = g(s(t), s′(t)). For readability we may note s g s′ instead of g(s, s′) (typically s < s′ for < (s, s′)).
Example 2. The variation of the flow over 60 seconds should be under 10 percent. Let us consider the term ÷(A{max_min}[−60,0] flow). At t its value is ÷(M, m) = M ÷ m where M and m are respectively the max and min values of flow over t ⊕ [−60, 0]. The invariant is then formalized by: ÷(A{max_min}[−60,0] flow) < 1.1.
Example 3. When the temperature is under 10 degrees the motor should not be started more than 3 times during the next hour. Here we consider that the motor_starts signal has the form of a Dirac function (its value is 0 except at some time instants where its value is 1). The invariant is formalized by: (temp < 10) ⇒ (A{sum}[0,3600] motor_starts ≤ 3).
Example 4. With inc(e, a) satisfying inc(e, ∅) = (⊤, e), inc(e, (b, a)) = ((e > a) ∧ b, e), the invariant motor_starts ⇒ (A{inc}[0,150] temp)[0] formalizes that the temperature should not decrease for 150 seconds after the motor starts.
Example 5. A{owr}[c,c] ϕ, where owr (for overwrite) satisfies owr(x, ∅) = x and owr(x, y) = x, shifts (ϕ) by c in time.
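To connect Definition 3 with Example 2, here is a direct pointwise sketch on the same simplified piecewise-constant representation as before, reusing max_min and seq_fold from the previous snippet; the flow data is hypothetical.

```python
# A pointwise reading of Definition 3. phi is a chronologically ordered list
# of (lb, ub, value) pieces, half-open [lb, ub); data below is made up.

def aggregate_at(phi, i, f, t):
    """(A{f}i phi)(t): fold f over the values of phi whose pieces meet the
    window t + i, in chronological order."""
    lo, hi = t + i[0], t + i[1]
    window = [v for (lb, ub, v) in phi if lb < hi and lo < ub]
    return seq_fold(f, window)

# Example 2 on hypothetical flow data: variation over the last 60 s below 10 %.
flow = [(0.0, 20.0, 100.0), (20.0, 45.0, 104.0), (45.0, 80.0, 98.0)]
M, m = aggregate_at(flow, (-60.0, 0.0), max_min, t=70.0)
print(M / m < 1.1)   # True: 104/98 is about 1.06, so the invariant holds at t = 70
```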

4 On-line Monitoring

For us, supervision consists in checking that some invariants remain true. To achieve this we exploit the remark made in Section 2 about an incremental semantics calculus, completing it with a garbage collection mechanism which, assuming ground signals also satisfy the bounded variability property (see Section 2), keeps the memory bounded. Our on-line algorithm for (A{f}i ϕ) is called after (ϕ) has been extended. start denotes an interval of ϕ or nil, and nxt(i) denotes, if it exists, the next interval in the list containing i, else nil. Dom is an interval initially equal to ∅. Basically the algorithm restricts the off-line calculus to the domain extension and remembers, thanks to start, where to restart the iteration. (ϕ).last denotes the last interval of the interval representation of (ϕ).

Step 0. If Dom ≠ ∅ goto Step 2.
Step 1. If Dom = i # |(ϕ)| is equal to ∅ then RETURN, else perform the off-line calculus on Dom; make start refer to (ϕ).last; RETURN.
Step 2. ∆ = (i # |(ϕ)|) \ Dom. While ((start ⊖ i) ≺ ∆ and nxt(start) ≠ nil) do start = nxt(start).
Step 3. Perform the off-line algorithm on domain ∆ by iterating on (ϕ) from start. Concatenate the result over ∆ to Dom. Dom = (i # |(ϕ)|). RETURN.

Now let us examine which part of (ϕ) is irrelevant regarding the computation of any extension of (A{f}i ϕ), i.e. regarding the next call. It is clear that t′ is irrelevant if its propagation t′ ⊖ i does not exceed Dom, i.e. if t′ ⊖ i is included in Dom−∞ = (⊥, −∞, (i # |(ϕ)|).ub, (i # |(ϕ)|).u), the left-unbounded version of Dom. This is equivalent to t′ ∈ (−i # Dom−∞), an interval whose upper bound is |(ϕ)|.ub − i.ub + i.lb (its remaining attributes are determined by the closedness flags of i and |(ϕ)|). The main interesting data is this upper bound |(ϕ)|.ub − i.ub + i.lb = |(ϕ)|.ub − (i.ub − i.lb), revealing the constant value (i.ub − i.lb). Finally, for computing further extensions of (A{f}i ϕ), only the definition of (ϕ) over |(ϕ)| \ (−i # Dom−∞) is useful, an interval which always has the constant length (i.ub − i.lb). If ϕ is a sub-term of several terms, each determining a relevant interval, we keep the largest one.

Suppose now that ground signals satisfy bounded variability and that their extensions are always bounded (in length or in number of new intervals). With our definition of time domains and the argumentation used for Lemma 1, one can show that this is preserved for complex terms. Then, firstly, the number of intervals over the relevant interval is bounded, say by Kϕ. And secondly, an extension of ϕ is bounded in number of new intervals, say by Nϕ ∈ N. So, when the extension of (ϕ) is computed, the monitor must preserve Kϕ intervals for upper terms and create at most Nϕ new intervals, for a total cost of Kϕ + Nϕ intervals. This principle, extended to all operators of our logic and applied to all sub-terms of invariants, ensures that the memory resource remains bounded.

Related Works. We first introduced aggregation in [6]. Aggregation has been studied by Basin et al. in [2,1] for discrete sequences of data; in that work the aggregation calculus is based on sliding windows, potentially bringing some complexity reduction for associative functions. Earlier, Finkbeiner et al. [3] studied an extension of LTL with aggregation in order to collect statistics over runtime executions.
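As an illustration of Steps 0–3 and of the garbage collection principle described above, the following sketch restricts the computation to the domain extension ∆ and drops the pieces of (ϕ) that can no longer be reached by any future window. It is a simplified reconstruction, not ARTiMon's algorithm: intervals are half-open without closedness flags, the pieces of (ϕ) are assumed to cover its whole domain, and max_min from the Section 3 sketch is reused in the usage example.

```python
# A sketch of the on-line principle: compute only over the domain extension,
# keep only the relevant suffix of phi (illustrative names and data).

class OnlineAggregation:
    def __init__(self, i, f):
        self.ilb, self.iub = i       # aggregation window i
        self.f = f                   # aggregation function
        self.phi = []                # relevant suffix of phi: (lb, ub, value) pieces
        self.dom_ub = None           # upper bound of Dom = i # |phi| computed so far

    def _value_at(self, t):
        # fold f over the values of phi met by the window t + i, in order
        lo, hi = t + self.ilb, t + self.iub
        acc = None                                    # None plays the undefined value
        for (lb, ub, v) in self.phi:
            if lb < hi and lo < ub:
                acc = self.f(v, acc)
        return acc

    def extend(self, new_pieces, phi_ub):
        """Called after phi grows by new_pieces up to |phi|.ub = phi_ub.
        Returns the new pieces of the aggregation over the domain extension."""
        self.phi.extend(new_pieces)
        if not self.phi:
            return []
        new_dom_ub = phi_ub - self.iub
        if self.dom_ub is not None:
            delta_lb = self.dom_ub
        else:
            delta_lb = self.phi[0][0] - self.ilb      # lower bound of i # |phi|
        if new_dom_ub <= delta_lb:
            return []                                  # reliable domain not grown yet
        # output breakpoints inside the extension: breakpoints of phi shifted
        cuts = {delta_lb, new_dom_ub}
        for (lb, ub, _) in self.phi:
            for p in (lb, ub):
                for s in (self.ilb, self.iub):
                    if delta_lb < p - s < new_dom_ub:
                        cuts.add(p - s)
        cuts = sorted(cuts)
        out = [(a, b, self._value_at((a + b) / 2)) for a, b in zip(cuts, cuts[1:])]
        self.dom_ub = new_dom_ub
        # garbage collection: only the last (i.ub - i.lb) time units of phi
        # can influence future extensions; older pieces are dropped
        cutoff = phi_ub - (self.iub - self.ilb)
        self.phi = [p for p in self.phi if p[1] > cutoff]
        return out

# Usage (hypothetical data): two successive extensions of the flow signal.
mon = OnlineAggregation(i=(-60.0, 0.0), f=max_min)
mon.extend([(0.0, 20.0, 100.0), (20.0, 45.0, 104.0)], phi_ub=45.0)   # domain still empty
print(mon.extend([(45.0, 80.0, 98.0)], phi_ub=80.0))   # [(60.0, 80.0, (104.0, 98.0))]
```

On each call the aggregation is computed only over the domain extension, and at most the last (i.ub − i.lb) time units of (ϕ) are retained, which is exactly the bounded-memory argument made above.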

5 Conclusion and Future Works

Starting from the eventually operator of MITL[a,b], we defined an aggregation operator suitable for specifying complex invariants about hybrid systems. The main principles of our reactive monitoring approach have been presented, though focused only on the bounded aggregation of our logic. In future work we plan to show how the same principles extend to conventional modalities like since and until, or to less conventional ones like the unbounded aggregation operator and operators strongly inspired by the interval-based operational calculus, for example the incremental length operator which, combined with aggregation, can express properties involving integrals.

References

1. David Basin, Felix Klaedtke, Srdjan Marinovic, and Eugen Zălinescu. Monitoring of temporal first-order properties with aggregations. Formal Methods in System Design, 46(3):262–285, June 2015.
2. David Basin, Felix Klaedtke, and Eugen Zălinescu. Greedily computing associative aggregations on sliding windows. Information Processing Letters, 115(2):186–192, 2015.
3. Bernd Finkbeiner, Sriram Sankaranarayanan, and Henny B. Sipma. Collecting statistics over runtime executions. Formal Methods in System Design, 27(3):253–274, November 2005.
4. Oded Maler and Dejan Nickovic. Monitoring temporal properties of continuous signals. In FORMATS 2004. Springer Berlin Heidelberg, 2004.
5. Dejan Nickovic and Oded Maler. AMT: a property-based monitoring tool for analog systems. Lecture Notes in Computer Science. Springer Berlin Heidelberg, 2007.
6. Nicolas Rapin. Procédé et dispositif permettant de générer un système de contrôle à partir de comportements redoutés spécifiés. Patent PCT/EP2011/072221, December 2011.
