MECHANICAL PROOFS OF SECURITY PROPERTIES

PUBLICATION INTERNE No 825

MECHANICAL PROOFS OF SECURITY PROPERTIES

ISSN 1166-8687

JEAN-PIERRE BANÂTRE, CIARÁN BRYCE, DANIEL LE MÉTAYER

IRISA CAMPUS UNIVERSITAIRE DE BEAULIEU - 35042 RENNES CEDEX - FRANCE

IRISA

INSTITUT DE RECHERCHE EN INFORMATIQUE ET SYSTÈMES ALÉATOIRES

Campus de Beaulieu – 35042 Rennes Cedex – France
Tél. : (33) 99 84 71 00 – Fax : (33) 99 84 71 71

Mechanical Proofs of Security Properties

Jean-Pierre Banâtre, Ciarán Bryce, Daniel Le Métayer



Programme 1 – Architectures parallèles, bases de données, réseaux et systèmes distribués
Projets Lande & Solidor
Publication interne n°825 – mai 1994 – 28 pages

Abstract: We give a formal definition of the notion of information flow for a simple guarded command language. We propose an axiomatisation of security properties based on this notion of information flow and we prove its soundness with respect to the operational semantics of the language. We then identify the sources of non-determinism in proofs and we derive in successive steps an inference algorithm which is both sound and complete with respect to the inference system. The complexity of the resulting algorithm is linear in terms of the size of the program and the analysis can realistically be integrated within a compiler. Thus, the contribution of the paper is the derivation of a formally based and effective tool for checking security properties of sequential programs.

Key-words: formal verification, program analysis, verification tools, computer security, information flow.

(Résumé : tsvp)

email :jpbanatre, bryce, [email protected]

Centre National de la Recherche Scientifique (URA 227) – Université de Rennes 1 – Insa de Rennes

Institut National de Recherche en Informatique et en Automatique – unité de recherche de Rennes

Une mécanisation des preuves de sécurité

Résumé : Nous donnons une définition de flux d'informations dans un langage à commandes gardées séquentiel. Nous proposons une axiomatisation des propriétés de sécurité basée sur cette notion de flux d'information et nous fournissons une démonstration de sa correction par rapport à la sémantique de ce langage. Nous définissons les sources de non-déterminisme dans les preuves et nous dérivons un algorithme d'inférence qui est correct et complet par rapport à notre système axiomatique. La complexité de cet algorithme est proportionnelle à la taille du programme et donc l'algorithme peut être facilement intégré dans un compilateur. Ainsi, la contribution principale de ce rapport est la dérivation d'un algorithme complet et efficace destiné à la vérification des propriétés de sécurité dans les programmes séquentiels.

Mots-clés : vérification formelle, analyse de programmes, outils de vérification, sécurité informatique, flux d'information.


1 Introduction

With the widespread use of distributed systems and networks, the problem of ensuring security constraints of information systems is becoming increasingly important. The notion of security itself has given rise to a wide range of interpretations and has been studied in various contexts [6, 11, 13, 20]. We propose a security proof system which relies on the notion of information flow. The significance of the approach is that it allows the user to describe security constraints with much more flexibility than the traditional security levels [10, 20, 25]. We first provide a formal definition of the notion of information flow embodying the intuitive idea that information does not flow from a variable x to a variable y if variations in the original value of x cannot produce any variation in the final value of y. We define an information flow logic and we prove its correctness with respect to the operational semantics of the language. Then we identify the sources of non-determinism in proofs in this logic and we successively refine it into a correct and complete algorithm for information flow analysis. Our approach is inspired by previous work on type inference and static analysis by abstract interpretation. The techniques used to transform the original system into an algorithmic presentation are akin to methods used to get a syntax-directed version of type inference systems including weakening rules [4, 8, 14]. We believe that the availability of mechanical tools can have a major impact on the use of formal methods to ensure security properties and our proposal is a first step in this direction.

The rest of the paper is organized in the following way. In section 2 we provide a short introduction to the concept of computer security and we motivate our position. Section 3 introduces our guarded command language with its operational semantics and our definition of information flow. We propose an information flow logic SS1 and we prove its correctness with respect to the semantics of the language. In section 4 we present more deterministic versions of the original system (SS2 and SS3). The basic idea is that SS2 avoids the use of a specific weakening rule and SS3 derives at each step the most precise property provable in SS2 (or the conjunction of all the properties derivable in SS2). We show the soundness and a form of completeness of the new system (with respect to the information flow logic SS1). SS3 still contains a source of non-determinism in the rule of the repetitive command. We propose in section 5 a fourth system SS4 which can be seen as a property transformer. The rule for the repetitive command is implemented as an iteration in the style of abstract interpretation [1, 9]. We prove the convergence of the algorithm and its soundness and completeness with respect to SS3. In section 6, we show that properties can be represented as graphs for an efficient implementation. The iteration itself can be replaced by a simple graph transformation. A property can be extracted from a graph using a path finding algorithm. The complexity of the final algorithm is linear in terms of


the size of the program. In conclusion we review related work and we provide insights on the extension to a parallel language.

2 Information flow security

The most important aspect of computer security is the non-disclosure, or secrecy, of the information stored in a system. Saltzer & Schroeder [26] define a secrecy violation as occurring when "an unauthorized person is able to read or take advantage of information stored in the computer". Ensuring secrecy is equivalent to saying that certain information transmissions, or flows, between system objects must be prohibited. There are many examples of this: a tax program must not leak (let flow) the private information it accesses to unauthorized processes in the system. A bank application has particularly strict secrecy requirements: information about a client's account may flow to the teller objects but not to other clients.

So what kind of mechanisms are needed to achieve information secrecy? Consider a mail application; each user process has a letter box object. One might consider two secrecy policies. Firstly, only the owning user is allowed to retrieve the contents of his letter box. Secondly, if a user A sends a message to user B marked fyeo (signifying For Your Eyes Only), then B must not be able to forward the contents of that message to some other user.

Access controls are universally used to reduce the information flows in a system [19, 23]. To execute an operation on an object, the calling process must possess a key for the operation. Each process is given a set of keys, one for each of the operations on the objects that it may legally invoke. All operating systems use this abstraction. In Unix, the key abstraction is implemented with access control lists. If a user's process has a "read" key for a file then the r mode bit for the file will be set for "world" or for that user's group. Capability based systems are a direct implementation of the key abstraction [21].

Going back to the mail application example, access controls can be used to enforce the secrecy requirement that only the owning user reads his own letter box, by only granting that user a key for the retrieve operation on the letter box. However, the problem arises with the second secrecy requirement. A user might receive a fyeo message and forward the contents in another message to some other user. Access controls do not understand how information flows in a system. The only way that access controls can stop a user A illegally forwarding a message to user B directly, or indirectly via other user objects, is to ensure that there is no sequence of user processes in the system starting with A and finishing with B such that all members of the sequence have a


send key for the next member's letter box. Unfortunately, this solution would prohibit A from sending anything to B; hence, the access control solution is insufficient for information flow security. We need some other approach for preventing undesired transmission of information between program objects.

An information flow mechanism must tag each variable with the set of variables from which it has received information flows, the effects of which have not been lost due to subsequent flows. The flow security of the system can then be determined from these tags. Such a mechanism requires that the behavior of each construct of the language used to program the system be precisely defined with respect to the information flows it generates. One can use the resulting semantics to statically analyse the program text for flows and reject the program if any of these flows violate the security constraints placed on the program. This analysis can be done by hand or automatically (by a compiler for example) if enough information is available. Alternatively, the semantics can be used to specify a dynamic mechanism. This consists of extra program instructions that a compiler inserts so that the program information flows are logged at runtime. The approach supposes that an attacker has the program text; he/she thus understands the expected behavior of the program and so may be able to deduce information by examining the value of some program variable.

Information flow control mechanisms have traditionally used security levels [10, 25]. Each variable is assigned a level denoting the sensitivity of the information it contains. After an operation, the level of the variable which received the information flow must be no less than the level of the flow source variables. However, the security level approach severely restricts the range of policies that one might like to support. A flow mechanism should log the variables that have flown to each variable rather than the level of the data. Jones & Lipton's surveillance set mechanism [16] is in this spirit and has some similarities with the mechanism proposed here.

In this paper we concentrate primarily on a static information flow mechanism in the form of a proof system. The goal is to be able to formally verify information flow properties from the program text. Each command of the language is analysed for flows and the (axiomatic) semantics of the command regarding how it effects information flows is presented. This enables flow security proofs, that is, for any policy which the programmer may care to define the proof system can be used to determine if the program state at some point satisfies this policy. In the next section we present our inference rules for a small imperative language which is a restriction to the sequential part of the language considered in [5]. We suggest in the conclusion how our techniques can be applied to a complete version of CSP.
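The dynamic tagging mechanism described above is easy to prototype. The following sketch logs direct flows at each assignment; the dictionary representation and the function names are illustrative assumptions of ours, not the paper's:

```python
# A sketch of the dynamic tagging mechanism: each variable carries the set
# of variables that have flowed into it, updated at every assignment.
# (Encoding and names are illustrative assumptions, not the paper's.)

def tagged_assign(tags, target, sources):
    """Log the flows sources -> target, overwriting target's old tags."""
    tags[target] = set(sources)
    for s in sources:
        tags[target] |= tags.get(s, set())  # inherit the sources' own tags
    return tags

tags = {}
tagged_assign(tags, 'y', ['x'])   # y := f(x)
tagged_assign(tags, 'z', ['y'])   # z := g(y): x's influence persists
print(tags['z'])                  # contains both 'y' and 'x'
```

Note that the tag of the destination is overwritten, matching the remark that the information previously held in the destination variable is lost.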


3 An inference system for security properties

We consider in this paper a simple guarded command language whose syntax is defined as follows:

Program ::= Decl ; (Decl ;)* Stmt                      program
Decl    ::= var v                                      declarations
Stmt    ::= (p, Comm)                                  statements
Comm    ::= v := E | skip | Stmt; Stmt | Alt | Rep     commands
Alt     ::= [ guard → Stmt (□ guard → Stmt)* ]         alternative
Rep     ::= *[ guard → Stmt (□ guard → Stmt)* ]        repetitive
guard   ::= B                                          guard

where ( )* stands for zero or more repetitions of the enclosed syntactical units, 'v' stands for a variable or a list of variables, 'E' for an integer expression and 'B' for a boolean expression. Commands are associated with program points p. All program points are assumed to be different and p0 and p stand, respectively, for the entry point and the exit point of the program. We omit program points in the text of the programs but they are used to state certain properties.

The alternative and repetitive commands consist of one or more guard–branch pairs. A guard is a boolean expression. A guard is passable if it evaluates to true. When an alternative command is executed, a branch whose guard is passable is chosen. If more than one guard is passable, then any one of the corresponding branches can be executed. If no guard is passable then the command fails and the program terminates. On each iteration of the repetitive command, a branch whose guard is passable is executed. If more than one guard is passable, then like for the alternative, any one of the branches is chosen. When no guard is passable, the command terminates and the program continues.

The structural operational semantics of this language is defined in Figure 1. The rules are expressed in terms of rewritings of configurations. A configuration is either a pair < S; σ >, where S is a statement and σ a state, or a state σ. The latter is a terminal configuration.

Let us now turn to the problem of defining the information flow for this language. There are two classes of information flows in programs. An assignment command causes a direct flow of information from the variables appearing on the right hand side of the (:=) operator to the variable on the left hand side. This is because the information in each of the right hand side operands can influence the value of the left hand side variable [7]. The information that was in the destination variable is lost. Conditional commands introduce a new class of flows [10]. The fact that a command is conditionally executed signals information to an observer concerning the value of the command guard.


< y := exp; σ > → σ[val(exp, σ)/y]

< skip; σ > → σ

< S1; σ > → < S1'; σ' >
───────────────────────────────
< S1; S2; σ > → < S1'; S2; σ' >

< S1; σ > → σ'
──────────────────────────
< S1; S2; σ > → < S2; σ' >

< S1; σ > → abort
────────────────────────
< S1; S2; σ > → abort

< Ci; σ > → true
──────────────────────────────────────────────────────
< [C1 → S1 □ C2 → S2 □ ... □ Cn → Sn]; σ > → < Si; σ >

< Ci; σ > → abort
─────────────────────────────────────────────────
< [C1 → S1 □ C2 → S2 □ ... □ Cn → Sn]; σ > → abort

∀i. < Ci; σ > → false
─────────────────────────────────────────────────
< [C1 → S1 □ C2 → S2 □ ... □ Cn → Sn]; σ > → abort

< Ci; σ > → true
──────────────────────────────────────────────────────────────────────────────
< *[C1 → S1 □ C2 → S2 □ ... □ Cn → Sn]; σ > → < Si; *[C1 → S1 □ ... □ Cn → Sn]; σ >

< Ci; σ > → abort
──────────────────────────────────────────────────
< *[C1 → S1 □ C2 → S2 □ ... □ Cn → Sn]; σ > → abort

∀i. < Ci; σ > → false
──────────────────────────────────────────────
< *[C1 → S1 □ C2 → S2 □ ... □ Cn → Sn]; σ > → σ

Figure 1: Operational semantics
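The rules of Figure 1 can be exercised with a small interpreter. The following sketch is ours: the tuple encoding of commands, the use of Python callables for expressions and guards, and the random resolution of non-determinism are assumptions for illustration, not the paper's notation:

```python
import random

# A minimal interpreter sketch for the guarded-command language.
# Commands: ('assign', var, f) where f maps a state dict to a value,
# ('skip',), ('seq', c1, c2), ('alt', branches), ('rep', branches);
# a branch is a pair (guard_fn, command).

class Abort(Exception):
    """Raised when no guard of an alternative command is passable."""

def run(cmd, state):
    op = cmd[0]
    if op == 'assign':
        _, var, f = cmd
        state = dict(state); state[var] = f(state); return state
    if op == 'skip':
        return state
    if op == 'seq':
        return run(cmd[2], run(cmd[1], state))
    if op == 'alt':
        passable = [c for g, c in cmd[1] if g(state)]
        if not passable:
            raise Abort              # no passable guard: the program fails
        return run(random.choice(passable), state)  # non-deterministic choice
    if op == 'rep':
        while True:
            passable = [c for g, c in cmd[1] if g(state)]
            if not passable:
                return state         # repetitive command terminates normally
            state = run(random.choice(passable), state)
    raise ValueError(op)

# The implicit-flow example of section 3:
# a := 0; b := 0; [ x = 0 -> a := 1  []  x != 0 -> b := 1 ]
prog = ('seq',
        ('assign', 'a', lambda s: 0),
        ('seq',
         ('assign', 'b', lambda s: 0),
         ('alt', [(lambda s: s['x'] == 0, ('assign', 'a', lambda s: 1)),
                  (lambda s: s['x'] != 0, ('assign', 'b', lambda s: 1))])))
print(run(prog, {'x': 0}))   # a reveals that x was zero
print(run(prog, {'x': 7}))   # b reveals that x was nonzero
```

Running the program on different initial values of x shows the indirect flow discussed next: the final values of a and b depend on x even though x never appears on a right hand side.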


Consider the following program segment, where e is some expression:

x := e; a := 0; b := 0;
[ x = 0 → a := 1
□ x ≠ 0 → b := 1 ]

In this program segment, the values of both a and b after execution indicate whether x was zero or not. This is an example of an implicit flow [10] or what we more generally refer to as an indirect flow. We note IFₚ the set of indirect information flow variables at a particular program point p. IFₚ can be defined syntactically as the set of variables occurring in embedding guards.

We need some way of representing the set of variables which may have flown to, or influenced, a variable v. We call this set the security variable of v, denoted v̄. We define ĪFₚ as:

ĪFₚ = {x | x ∈ v̄ and v ∈ IFₚ}

Our inference system for the proof of information flow properties is described in Figure 2.

{P[ȳ ← ⋃ᵢ x̄ᵢ ∪ ĪFₚ]} y := exp(x1, x2, ..., xn) {P}

{P} skip {P}

{P} S1 {Q};  {Q} S2 {R}
───────────────────────
{P} S1; S2 {R}

∀i = 1..n  {P} Sᵢ {Q}
───────────────────────────────────────────
{P} [C1 → S1 □ C2 → S2 □ ... □ Cn → Sn] {Q}

∀i = 1..n  {P} Sᵢ {P}
────────────────────────────────────────────
{P} *[C1 → S1 □ C2 → S2 □ ... □ Cn → Sn] {P}

P ⇒ P';  {P'} S {Q'};  Q' ⇒ Q
─────────────────────────────
{P} S {Q}

Figure 2: System SS1

The last rule in Figure 2 is called the consequence rule or the weakening rule. We use the notation ⊢₁ {P} S {Q} to denote the fact that {P} S {Q} is provable in SS1.
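Read backwards, the assignment axiom of SS1 is a substitution on the security variable of the assigned variable. A minimal sketch, assuming a representation that maps each variable to its security variable (an encoding of ours, not the paper's):

```python
# Reading the assignment axiom of SS1 backwards: to establish P after
# y := exp(x1,...,xn), establish, before it, P with the security variable
# of y replaced by the union of those of the x_i and of IF_p's variables.
# The dict-of-sets encoding and names are illustrative assumptions.

def substitute(post, y, rhs_vars, indirect):
    """The substitution of the assignment axiom, read as a precondition."""
    pre = dict(post)
    merged = set()
    for v in list(rhs_vars) + list(indirect):
        merged |= post[v]      # security variables of the x_i and of IF_p
    pre[y] = merged
    return pre

post = {'x': {'x'}, 'y': {'y'}, 'k': {'k'}}
# y := x with no embedding guard: the security variable of y before the
# command must equal that of x
print(substitute(post, 'y', ['x'], [])['y'])   # {'x'}
```

The same substitution with a non-empty indirect set models an assignment under a guard, which is how implicit flows enter the axiom.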


We define a correspondence relation between properties and the semantics of statements and we use it to state the correctness of the information flow logic of Figure 2.

Definition 3.1

C(P, S) = ∀x, y: (P ⇒ x ∉ ȳ) ⇒
  ∀σ, v such that < S; σ > ↓ and < S; σ[v/x] > ↓:
    {v' | < S; σ > →* σ', σ'(y) = v'} = {v'' | < S; σ[v/x] > →* σ'', σ''(y) = v''}

Proposition 3.2 (correctness of SS1)

∀S, P: if ⊢₁ {Init} S {P} then C(P, S)

Init is defined as ∀x, y, x ≠ y: x ∉ ȳ. It represents the standard (minimal) initial property. σ[v/x] is the same as σ except that variable x is assigned value v. < S; σ > ↓ stands for ∃σ' ≠ abort: < S; σ > →* σ', which means that the program may terminate successfully. The above definition characterises our notion of information flow. If P ⇒ x ∉ ȳ holds, then the value of x before executing S cannot have any effect on the possible values possessed by y after the execution of S. In other words, no information can flow from x to y in S. The condition < S; σ > ↓ and < S; σ[v/x] > ↓ is required because the execution of S may terminate or not depending on the original value of x. We prove the correctness of SS1 by induction on the structure of terms as a consequence of the more general property:

Proposition 3.3 (correctness of SS1 (general form))

∀S, S0, P, Q: if C(P, S0) and ⊢₁ {P} S {Q} then C(Q, (S0; S))

The original correctness property can be derived from this general form by stating S0 = skip. We consider the assignment statement first. We assume:

C(P', S0) and P' = P[ȳ ← ⋃ᵢ x̄ᵢ ∪ ĪFₚ]

and we prove:

C(P, (S0; y := exp(x1, x2, ..., xn)))

Let z, t be such that P ⇒ z ∉ t̄. Let σ, v be such that < S0; y := exp(x1, x2, ..., xn); σ > ↓ and < S0; y := exp(x1, x2, ..., xn); σ[v/z] > ↓. We have to prove:

{v' | < S0; y := exp(x1, x2, ..., xn); σ > →* σ', σ'(t) = v'} =
{v'' | < S0; y := exp(x1, x2, ..., xn); σ[v/z] > →* σ'', σ''(t) = v''}

From the rules for the sequential command in the operational semantics of the language (Figure 1), this is equivalent to:

{v' | < S0; σ > →* σ₁', < y := exp(x1, x2, ..., xn); σ₁' > → σ', σ'(t) = v'} =
{v'' | < S0; σ[v/z] > →* σ₁'', < y := exp(x1, x2, ..., xn); σ₁'' > → σ'', σ''(t) = v''}

We first consider the case t = y. Applying the rule for the operational semantics of assignment to the above equality, we get:

{v' | < S0; σ > →* σ', v' = val(exp(x1, x2, ..., xn), σ')} =
{v'' | < S0; σ[v/z] > →* σ'', v'' = val(exp(x1, x2, ..., xn), σ'')}     (1)

But

P ⇒ z ∉ ȳ and P' = P[ȳ ← ⋃ᵢ x̄ᵢ ∪ ĪFₚ]

implies

∀j ∈ [1, ..., n]: P' ⇒ z ∉ x̄ⱼ

This, together with C(P', S0), implies:

{v' | < S0; σ > →* σ', σ'(xⱼ) = v'} = {v'' | < S0; σ[v/z] > →* σ'', σ''(xⱼ) = v''}

which entails (1). If t ≠ y then we have to prove:

{v' | < S0; σ > →* σ', σ'(t) = v'} = {v'' | < S0; σ[v/z] > →* σ'', σ''(t) = v''}

But

P ⇒ z ∉ t̄ and t ≠ y

implies

P' ⇒ z ∉ t̄

and C(P', S0) allows us to conclude.

The proof of the sequential command follows from the induction hypothesis and the associativity of the operational semantics of the command. Let lhs(S) denote the set of variables occurring in the left-hand side of an assignment command in S. The proof in the case of the alternative command follows from the induction hypothesis and the following two properties:

if ⊢₁ {P} S {Q} and y ∉ lhs(S)
then ∀x: (P ⇒ x ∉ ȳ) ⇔ (Q ⇒ x ∉ ȳ)

if ⊢₁ {P} S {Q} and S = [C1 → S1 □ C2 → S2 □ ... □ Cn → Sn] and y ∈ lhs(S)
then ∀z such that ∃i: z ∈ Cᵢ: ¬(Q ⇒ z ∉ ȳ)

The case of the repetitive command is made by induction on the number of iterations. It makes use of the two properties defined above and relies on the fact that only terminating executions are considered in the definition of C.

Let us now consider, as an example, a library decryption program. The program has three inputs and two outputs. The input consists of a string of encrypted text, or cipher, a key for decryption and a unit rate which the user is charged for each character decrypted. The variable cipher is an array of characters. A character is decrypted by applying it to an expression D with the key parameter. To save computing resources, some characters may not have been encrypted. The user pays twice the price for every encrypted character that goes through the decryption program. The boolean expression encrypted() determines if the character passed is encrypted or not. The outputs are the decrypted text, or clear, and the charge for the decryption. We assume that clear is output to the user and that charge is output to the library owner. To be usable, the user must trust the program not to secretly leak the clear text or the key to the library owner via the charges output. Such a leakage is termed a covert channel in [18]. The proof system described in Figure 2 allows us to prove the following property (an array assignment a[k] := e being treated as a := exp(a, k, e)):

⊢₁ {Init} Library {(clear ∉ charge̅) and (key ∉ charge̅)}

that is, the charge output may not receive a flow of information from the clear variable or from the key input. We show in section 6 that this property can in fact be proven mechanically.
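Definition 3.1 can also be checked by brute force on tiny deterministic programs: if x ∉ ȳ is claimed, varying the initial value of x must never change the set of final values of y. A sketch of such a test, under assumed toy programs and an assumed finite range of initial values:

```python
# Brute-force test of the correspondence of Definition 3.1 on toy
# deterministic programs (programs, variable set and value range ours).

def final_ys(prog, x):
    """Run prog from initial value x and collect the final value(s) of y."""
    state = {'x': x, 'y': 0}
    prog(state)
    return {state['y']}

copy_prog = lambda s: s.update(y=s['x'])    # y := x   (x flows to y)
const_prog = lambda s: s.update(y=5)        # y := 5   (no flow from x)

def no_flow(prog):
    """x does not flow to y iff the final ys are the same for every x."""
    results = {frozenset(final_ys(prog, x)) for x in range(5)}
    return len(results) == 1

print(no_flow(copy_prog))    # information flows from x to y
print(no_flow(const_prog))   # y's final value is independent of x
```

For non-deterministic programs the same test would compare the full sets of possible final values, as the definition requires.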

4 A more deterministic system

We consider now the problem of mechanising the proof of security properties. As suggested above, the sort of properties we are interested in are of the form x ∉ ȳ. The language of properties is:

P ::= x ∉ ȳ | P1 ∧ P2

where ∧ represents the logical "and" connective. The system SS1 presented in section 3 is not suggestive of an algorithm for several reasons:

var: i, charge, key, unit;
array: clear, cipher;

cipher := ⟨message to be decrypted⟩;
unit := ⟨unit rate constant⟩;
charge := unit;
i := 0;
*[ cipher[i] ≠ null constant →
     [ encrypted(cipher[i]) →
         clear[i] := D(cipher[i], key);
         charge := charge + 2*unit;
     □ not encrypted(cipher[i]) →
         clear[i] := cipher[i];
         charge := charge + unit;
     ];
     i := i + 1
 ]

Figure 3: Library decryption program

• The relationship between the input and the output property of the rule for assignment is not one to one.
• The weakening rule can be applied at any time in a proof. The combination of the weakening rule with the rule for the repetitive command in particular requires some insight. In general this amounts to guessing the appropriate invariant for the loop.

There are two possible ways of proving a property of the form {P} Prog {Q}: one can either start with P and try to find a postcondition implying Q or start with Q and derive a precondition implied by P. In the terminology of mechanical program analysis [1, 9, 14] these techniques are called respectively forwards analysis and backwards analysis. The method we present here for deriving security properties belongs to the forwards analysis category. Let us note however that the inference system SS1 is not biased towards one technique or the other and we can apply the same idea to derive a backwards analysis. In order to reduce the amount of non-determinism we first distribute the weakening rule over the remaining rules, getting system SS2 presented in Figure 4. The soundness of SS2 is obvious and its completeness follows from the transitivity of implication:

Proposition 4.1 (soundness and completeness of SS2)

∀S, P, Q: ⊢₁ {P} S {Q} if and only if ⊢₂ {P} S {Q}


P ⇒ P'[ȳ ← ⋃ᵢ x̄ᵢ ∪ ĪFₚ];  P' ⇒ Q
──────────────────────────────────
{P} y := exp(x1, x2, ..., xn) {Q}

P ⇒ P'
──────────────
{P} skip {P'}

P ⇒ P';  {P'} S1 {Q'};  Q' ⇒ Q'';  {Q''} S2 {R'};  R' ⇒ R
──────────────────────────────────────────────────────────
{P} S1; S2 {R}

P ⇒ P';  ∀i = 1..n {P'} Sᵢ {Q'};  Q' ⇒ Q
───────────────────────────────────────────
{P} [C1 → S1 □ C2 → S2 □ ... □ Cn → Sn] {Q}

P ⇒ P';  ∀i = 1..n {P'} Sᵢ {P'};  P' ⇒ Q
────────────────────────────────────────────
{P} *[C1 → S1 □ C2 → S2 □ ... □ Cn → Sn] {Q}

Figure 4: System SS2

This first transformation still yields a highly non-deterministic proof procedure but it paves the way for the next refinement. Let us first note that the new system SS2 is syntax directed. In order to derive an algorithm from SS2 we want to factor out all the possible proofs of a program into a single most precise proof. This proof should associate with any property P the greatest property Q (in the sense of set inclusion) such that ⊢₂ {P} S {Q}. This requirement allows us to get rid of most of the uses of ⇒ in the rules (but not all of them) and imposes a new rule for the assignment command. The new system SS3 is described in Figure 5. The intuition behind the new rule for the assignment command is that T_y(R) represents the conjunction of all the properties x ∉ z̄ derivable from the input property R. R_y is the restriction of R to properties of the form (x ∉ z̄) with z ≠ y. ⊔ is the approximation in our language of properties of the logical "or" connective (∨). It is expressed in terms of sets as an intersection. For instance:

((x ∉ ȳ) ∧ (z ∉ t̄)) ⊔ ((x ∉ t̄) ∧ (z ∉ t̄)) = (z ∉ t̄)

It is easy to see that (Q1 ∨ Q2) ⇒ (Q1 ⊔ Q2)
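Using the set-of-pairs representation introduced in section 6 (a property is the set of pairs (y, x) such that it implies x ∉ ȳ), the join ⊔ is literally set intersection, which reproduces the example above:

```python
# The join of two conjunctive properties, sketched on the set-of-pairs
# representation: a property is the set of pairs (y, x) such that it
# implies x not-in ybar.  The approximation of logical "or" is then plain
# set intersection.  (Variable names are ours.)

def join(p, q):
    """Q1 join Q2: keep only the non-flow facts asserted by both sides."""
    return p & q

p = {('y', 'x'), ('t', 'z')}   # (x not-in ybar) and (z not-in tbar)
q = {('t', 'x'), ('t', 'z')}   # (x not-in tbar) and (z not-in tbar)
print(join(p, q))              # only (z not-in tbar) survives
```

Since the intersection may keep fewer facts than either disjunct implies, this is an approximation of ∨, as the text notes.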


{R} y := exp(x1, x2, ..., xn) {T_y(R)}

{P} skip {P}

{P} S1 {Q};  {Q} S2 {R}
───────────────────────
{P} S1; S2 {R}

∀i = 1..n {P} Sᵢ {Qᵢ}
──────────────────────────────────────────────
{P} [C1 → S1 □ C2 → S2 □ ... □ Cn → Sn] {⊔ᵢ Qᵢ}

P ⇒ P';  ∀i = 1..n {P'} Sᵢ {Qᵢ};  ⊔ᵢ Qᵢ ⇒ P'
───────────────────────────────────────────────
{P} *[C1 → S1 □ C2 → S2 □ ... □ Cn → Sn] {⊔ᵢ Qᵢ}

with:

R_y = ⋀ {(x ∉ z̄) | R ⇒ (x ∉ z̄) and z ≠ y}

T_y(R) = R_y ∧ ⋀ {(x ∉ ȳ) | ∀j ∈ [1, ..., n]: R ⇒ (x ∉ x̄ⱼ) and ∀v ∈ IFₚ: R ⇒ (x ∉ v̄)}

⊔ᵢ Qᵢ = ⋀ {(x ∉ ȳ) | ∀i ∈ [1, ..., n]: Qᵢ ⇒ (x ∉ ȳ)}

Figure 5: System SS3
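The transformer T_y is directly executable on the set-of-pairs representation of section 6. The following sketch is ours; the parameter names and the explicit variable universe are assumptions for illustration:

```python
# A sketch of the assignment transformer T_y on the set-of-pairs
# representation: a property is the set of pairs (z, x) meaning
# x not-in zbar.  (Encoding and names are illustrative assumptions.)

def t_y(r, y, rhs_vars, indirect, variables):
    """Most precise postcondition of y := exp(rhs_vars) under IF_p."""
    # R_y: facts about targets other than y survive the assignment
    r_y = {(z, x) for (z, x) in r if z != y}
    # x not-in ybar holds afterwards iff x could not reach any source of
    # the assignment: neither the x_j nor the indirect-flow variables
    kept = {(y, x) for x in variables
            if all((xj, x) in r for xj in rhs_vars)
            and all((v, x) in r for v in indirect)}
    return r_y | kept

vars_ = ['x', 'y', 'k']
# Init: no flows at all between distinct variables
init = {(a, b) for a in vars_ for b in vars_ if a != b}
after = t_y(init, 'y', ['x'], [], vars_)
# k not-in ybar still holds (k is not a source), but x not-in ybar is lost
print(('y', 'k') in after, ('y', 'x') in after)
```

The quadratic cost of the membership tests here is exactly the inefficiency that motivates the accessibility-graph representation of section 6.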


This approximation is required because the "or" connective does not belong to our language of properties. The language could be extended with ∨ but it makes the treatment more complex and it does not seem to allow the derivation of more useful information (the situation can be compared with the lack of union types in many type systems [4, 8, 14]). We cannot get rid of the implication in a straightforward way in the rule for the repetitive command because:

P ⇒ P' and {P'} Sᵢ {P'} does not imply {P} Sᵢ {P}

In order to prove a property of the repetitive command an appropriate invariant P' has to be discovered. We show in the next section how the maximal invariant can be computed iteratively. The following properties state respectively the soundness and the completeness of SS3 with respect to SS2.

Proposition 4.2 (soundness of SS3)

∀S, P, Q: if ⊢₃ {P} S {Q} then ⊢₂ {P} S {Q}

Proposition 4.3 (completeness of SS3)

∀S, P, Q: if ⊢₂ {P} S {Q} then ∃Q': ⊢₃ {P} S {Q'} and Q' ⇒ Q

Both properties are proved by induction on the structure of commands. Let us consider soundness first. The base case is the rule for the assignment statement. We have to prove:

R ⇒ T_y(R)[ȳ ← ⋃ᵢ x̄ᵢ ∪ ĪFₚ]

From the definition of T_y we have:

T_y(R)[ȳ ← ⋃ᵢ x̄ᵢ ∪ ĪFₚ]
= R_y ∧ ⋀ {(x ∉ ⋃ᵢ x̄ᵢ ∪ ĪFₚ) | ∀j ∈ [1, ..., n]: R ⇒ x ∉ x̄ⱼ and ∀v ∈ IFₚ: R ⇒ x ∉ v̄}
⇐ R

The other cases follow trivially from Qᵢ ⇒ ⊔ᵢ Qᵢ and the fact that {P'} Sᵢ {Qᵢ} and ⊔ᵢ Qᵢ ⇒ P' imply {⊔ᵢ Qᵢ} Sᵢ {⊔ᵢ Qᵢ}.

Let us now turn to the proof of the completeness of the information flow logic. For the base case (assignment) we have to show that:

P ⇒ P'[ȳ ← ⋃ᵢ x̄ᵢ ∪ ĪFₚ] and P' ⇒ Q


implies

T_y(P) ⇒ Q

Using the monotonicity of T_y it is enough to prove:

T_y(P'[ȳ ← ⋃ᵢ x̄ᵢ ∪ ĪFₚ]) ⇒ P'

Let P'' = P'[ȳ ← ⋃ᵢ x̄ᵢ ∪ ĪFₚ]. From the definition of T_y we have:

T_y(P'') = P''_y ∧ ⋀ {(x ∉ ȳ) | ∀j ∈ [1, ..., n]: P'' ⇒ x ∉ x̄ⱼ and ∀v ∈ IFₚ: P'' ⇒ x ∉ v̄}

From the definition of our language of properties P' = ⋀ₖ,ₗ xₖ ∉ ȳₗ. Let us distinguish three possible cases for the xₖ ∉ ȳₗ:

- yₗ ≠ y: we have P'' = P'[ȳ ← ⋃ᵢ x̄ᵢ ∪ ĪFₚ] ⇒ xₖ ∉ ȳₗ and T_y(P'') ⇒ xₖ ∉ ȳₗ.

- yₗ = y and ∀j ∈ [1, ..., n]: P'' ⇒ xₖ ∉ x̄ⱼ and ∀v ∈ IFₚ: P'' ⇒ xₖ ∉ v̄: then T_y(P'') ⇒ xₖ ∉ ȳ.

- otherwise let us assume ¬(T_y(P'') ⇒ xₖ ∉ ȳ). From the definition of T_y(P'') we have:

¬(∀j ∈ [1, ..., n]: P'' ⇒ xₖ ∉ x̄ⱼ and ∀v ∈ IFₚ: P'' ⇒ xₖ ∉ v̄)

that is to say:

¬(P'' ⇒ xₖ ∉ ⋃ᵢ x̄ᵢ ∪ ĪFₚ)

But we have: P' ⇒ xₖ ∉ ȳ and P'' = P'[ȳ ← ⋃ᵢ x̄ᵢ ∪ ĪFₚ], hence P'' ⇒ xₖ ∉ ⋃ᵢ x̄ᵢ ∪ ĪFₚ, which leads to a contradiction. Thus we must have:

T_y(P'') ⇒ xₖ ∉ ȳ

5 Mechanical analysis of the repetitive command

In order to be able to treat the repetitive statement mechanically we must be able to compute a property P' such that

P ⇒ P' and ∀i = 1..n {P'} Sᵢ {Qᵢ} and ⊔ᵢ Qᵢ ⇒ P'


Furthermore it must be the greatest of these properties in order to retain completeness. We compute this property using an iterative technique akin to the method used for finding least fixed points in abstract interpretation [1, 9]. Figure 6 presents SS4 which is a refinement of SS3 with an effective rule for the repetitive statement. The following properties show that SS4 is the expression, in the form of an inference system, of a terminating, correct and complete algorithm.

{R} y := exp(x1, x2, ..., xn) {T_y(R)}

{P} skip {P}

{P} S1 {Q};  {Q} S2 {R}
───────────────────────
{P} S1; S2 {R}

∀i = 1..n {P} Sᵢ {Qᵢ}
──────────────────────────────────────────────
{P} [C1 → S1 □ C2 → S2 □ ... □ Cn → Sn] {⊔ᵢ Qᵢ}

∀i = 1..n {P⁰} Sᵢ {Qᵢ⁰};  Q⁰ = ⊔ᵢ Qᵢ⁰;  Q⁰ ⇏ P⁰;  P¹ = P⁰ ⊔ Q⁰
∀i = 1..n {P¹} Sᵢ {Qᵢ¹};  Q¹ = ⊔ᵢ Qᵢ¹;  Q¹ ⇏ P¹;  P² = P¹ ⊔ Q¹
...
∀i = 1..n {Pⁿ⁻¹} Sᵢ {Qᵢⁿ⁻¹};  Qⁿ⁻¹ = ⊔ᵢ Qᵢⁿ⁻¹;  Qⁿ⁻¹ ⇒ Pⁿ⁻¹;  Pⁿ = Qⁿ⁻¹
────────────────────────────────────────────────────────────────────────
{P⁰} *[C1 → S1 □ C2 → S2 □ ... □ Cn → Sn] {Pⁿ}

Figure 6: System SS4

Proposition 5.1 (termination of SS4)

∀S, P: ∃Q: ⊢₄ {P} S {Q} and Q is unique

Proposition 5.2 (soundness of SS4)

∀S, P, Q: if ⊢₄ {P} S {Q} then ⊢₃ {P} S {Q}

Proposition 5.3 (completeness of SS4)

∀S, P, Q: if ⊢₃ {P} S {Q} then ∃Q': ⊢₄ {P} S {Q'} and Q' ⇒ Q
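The iteration of Figure 6 terminates because the property sets strictly decrease. A sketch on the set-of-pairs representation of section 6, where ⊔ is intersection and ⇒ is the superset relation; the toy loop body (a single branch y := x) is ours:

```python
# The iterative rule of SS4 for the repetitive command, sketched on the
# set-of-pairs representation where the join is set intersection and
# implication is the superset relation.  (Names and toy body are ours.)

def rep_fixpoint(p, body):
    """Iterate P^(k+1) = P^k join Q^k until Q ⇒ P; return P^n = Q^(n-1)."""
    while True:
        q = body(p)
        if q >= p:        # Q^k implies P^k: an invariant has been reached
            return q      # P^n = Q^(n-1)
        p = p & q         # P^(k+1) = P^k join Q^k (set intersection)

def body(p):
    # postcondition of the single branch "y := x" (no indirect flows),
    # over the three variables x, y, k
    kept = {(z, s) for (z, s) in p if z != 'y'}
    kept |= {('y', s) for s in 'xyk' if ('x', s) in p}
    return kept

init = {(a, b) for a in 'xyk' for b in 'xyk' if a != b}  # Init
inv = rep_fixpoint(init, body)
print(('y', 'k') in inv, ('y', 'x') in inv)  # True False
```

Since each non-final step removes at least one pair and the set of pairs is finite, the loop converges, mirroring the termination argument for Proposition 5.1.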


The three properties are proven by induction on the structure of commands. Let us consider them in turn. The only non-trivial part in the proof of the first proposition is the existence of the property Q in the case of the repetitive command. In fact this amounts to proving the termination of the algorithm described by SS4. In order to prove it, we just have to observe that ∀k: P^k ⇒ P^{k+1} and P^k ≠ P^{k+1}. Thus the P^k form a strictly decreasing sequence of properties. The set of properties is finite, so the sequence must converge. The soundness of SS4 follows directly from the fact that ∀k: P^k ⇒ P^{k+1}. In order to prove completeness, we have to show that:

  P ⇒ P′ and ∀i = 1..n: {P′} S_i {Q_i} and ⊓_i Q_i ⇒ P′

implies

  P^n ⇒ ⊓_i Q_i

with P^n defined as in Figure 6. We prove ∀k: Q^k ⇒ ⊓_i Q_i and ∀k: P^k ⇒ P′ by induction on k. Q^0 is defined by Q^0 = ⊓_i Q_i^0 with ∀i = 1..n: {P^0} S_i {Q_i^0}. Furthermore we assume: P^0 ⇒ P′ and ∀i = 1..n: {P′} S_i {Q_i}. Thus, from the general induction hypothesis, the monotonicity of the inference rules and the monotonicity of ⊓, we have:

  Q^0 = ⊓_i Q_i^0 ⇒ ⊓_i Q_i

By assumption we also have: P^0 ⇒ P′.

We assume

  Q^k ⇒ ⊓_i Q_i and P^k ⇒ P′

and we prove:

  Q^{k+1} ⇒ ⊓_i Q_i and P^{k+1} ⇒ P′

From Figure 6 we have:

  P^{k+1} = P^k ⊓ Q^k

Q^k ⇒ ⊓_i Q_i ⇒ P′ and the induction hypothesis P^k ⇒ P′ imply:

  P^{k+1} ⇒ P′

Furthermore: ∀i = 1..n: {P^{k+1}} S_i {Q_i^{k+1}} and Q^{k+1} = ⊓_i Q_i^{k+1} and ∀i = 1..n: {P′} S_i {Q_i}. And

  Q^{k+1} ⇒ ⊓_i Q_i

as required.
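Anticipating the set representation of Section 6 (where ⊓ is set intersection and ⇒ is the superset relation), the iteration performed by SS4 on the repetitive command can be sketched as follows. This is our own illustrative Python, with a made-up branch transformer, not the paper's system:

```python
# Sketch of the SS4 loop rule: iterate P_{k+1} = P_k ⊓ Q_k until
# Q_k ⇒ P_k, over properties represented as finite sets of pairs.
def loop_property(P0, branches):
    """branches: one function per guarded branch, mapping a
    precondition P to the branch's postcondition Q_i."""
    P = P0
    while True:
        Q = set.intersection(*[b(P) for b in branches])  # Q_k = ⊓_i Q_i
        if Q >= P:          # Q_k ⇒ P_k : fixed point reached
            return P
        P = P & Q           # P_{k+1} = P_k ⊓ Q_k, strictly smaller

# toy branch transformer that erases every pair mentioning "y"
# (as if y were assigned an arbitrary expression in the branch)
kill_y = lambda P: {(a, b) for (a, b) in P if a != "y" and b != "y"}
assert loop_property({("x", "z"), ("y", "z")}, [kill_y]) == {("x", "z")}
```

Termination mirrors the argument above: whenever Q does not imply P, the meet P ⊓ Q is strictly smaller, and the set of properties is finite.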

6 Inference as transformations on graphs

A conjunctive property P can alternatively be represented as a set of pairs of variables:

  {(y, x) | P ⇒ x ∉ ȳ}

For instance z ∉ t̄ ∧ t ∉ ū is represented as {(t, z), (u, t)}. We present in Figure 7 a new version of SS4 expressed in the form of an algorithm T5 taking as arguments a property P represented as a set and a program Prog, and returning the property Q such that ⊢₄ {P} Prog {Q}. The proof of the equivalence of SS4 and SS5 is obvious. In terms of sets, ⊓ is implemented as set intersection (∩) and ⇒ corresponds to the superset relation (⊇).

Proposition 6.1 (Correctness and completeness of T5) ∀S, P, Q: ⊢₄ {P} S {Q} if and only if T5(P, S) = Q

The representation of properties as sets of pairs leads to a very expensive implementation of the rule for assignment, involving a quadratic number of tests in the sets R and IF_p. We propose instead to represent properties as accessibility graphs. We consider directed graphs defined as pairs of a set of nodes and a set of arcs:

  G ::= (N, A)   N ::= {n}   n ::= V_p   A ::= {a}   a ::= (n_1, n_2)
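The set representation and its two operations can be sketched in Python. This is our own illustration, not the paper's code:

```python
# A conjunctive property such as z ∉ t̄ ∧ t ∉ ū is the set of pairs
# {(t, z), (u, t)}: pair (y, x) records that x cannot flow into y.
P = {("t", "z"), ("u", "t")}

# The meet ⊓ of two properties is set intersection ...
def meet(p1, p2):
    return p1 & p2

# ... and implication P ⇒ Q corresponds to the superset relation P ⊇ Q.
def implies(p1, p2):
    return p1 >= p2

Q = {("t", "z")}
assert meet(P, Q) == {("t", "z")}
assert implies(P, Q)        # P is stronger: it guarantees all of Q
assert not implies(Q, P)
```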


  T5(P, (y := exp(x_1, x_2, ..., x_n))) = T_y(P)

  T5(P, skip) = P

  T5(P, S_1) = Q,  T5(Q, S_2) = R
  ───────────────────────────────
  T5(P, S_1; S_2) = R

  ∀i = 1..n: T5(P, S_i) = Q_i
  ───────────────────────────────
  T5(P, [C_1 → S_1 □ C_2 → S_2 □ ... □ C_n → S_n]) = ⊓_i Q_i

  ∀i = 1..n: T5(P^0, S_i) = Q_i^0;   Q^0 = ⊓_i Q_i^0;   Q^0 ⇏ P^0;   P^1 = P^0 ⊓ Q^0
  ∀i = 1..n: T5(P^1, S_i) = Q_i^1;   Q^1 = ⊓_i Q_i^1;   Q^1 ⇏ P^1;   P^2 = P^1 ⊓ Q^1
  ...
  ∀i = 1..n: T5(P^{n-1}, S_i) = Q_i^{n-1};   Q^{n-1} = ⊓_i Q_i^{n-1};   Q^{n-1} ⇒ P^{n-1};   P^n = P^{n-1}
  ───────────────────────────────
  T5(P^0, [C_1 → S_1 □ C_2 → S_2 □ ... □ C_n → S_n]) = P^n

with:

  T_y(R) = {(z, x) ∈ R | z ≠ y} ∪ {(y, y_i) | ∀j ∈ [1, ..., n]: (x_j, y_i) ∈ R and ∀v ∈ IF_p: (v, y_i) ∈ R}

Figure 7: System SS5
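The assignment rule of SS5 can be sketched on the set representation as follows. The names (VARS, IF_p, T_assign) are ours, and the universe of variables is illustrative:

```python
# Sketch of T_y, the assignment transformer of SS5, on properties
# represented as sets of pairs (y, x) meaning "x cannot flow into y".
VARS = ["x", "y", "z"]

def T_assign(R, y, xs, IF_p):
    """Transform property R across the assignment y := exp(xs),
    where IF_p is the indirect flow at the current program point."""
    # pairs about variables other than y survive unchanged
    kept = {(z, x) for (z, x) in R if z != y}
    # v cannot flow into y afterwards iff it could flow neither into
    # any operand x_j nor into any variable of the indirect flow IF_p
    new = {(y, v) for v in VARS
           if all((xj, v) in R for xj in xs)
           and all((w, v) in R for w in IF_p)}
    return kept | new

# y := x under "z flows into neither x nor y, x does not flow into y":
# afterwards z still cannot flow into y, but x now can.
R = {("x", "z"), ("y", "z"), ("y", "x")}
assert T_assign(R, "y", ["x"], []) == {("x", "z"), ("y", "z")}
```

Note how the quadratic cost mentioned above shows up directly: each candidate pair (y, v) requires tests over all operands and the whole indirect flow set.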


V_p is the set of the variables of the program subscripted by program points. The property represented by a graph G at program point p is given by the function H defined as follows:

  H(p, G) = {(y, x) | Nopath(G, x^0, y^p)}

Nopath(G, x^0, y^p) returns True if there is no path from node x^0 to node y^p in the graph G. We now have to show how the operations on properties required by T5 are implemented in terms of graphs. Since the set of nodes of the graphs manipulated by our algorithm is constant, it is convenient to introduce the following notation:

  if G = (N, A) then G + A′ = (N, A ∪ A′)

Furthermore the symbol + is overloaded to operate on two graphs (no ambiguity can arise from this overloading):

  if G = (N, A) and G′ = (N, A′) then G + G′ = (N, A ∪ A′)

The final version of our algorithm is described in Figure 8 (variables v are implicitly quantified over the whole set of variables of the program).

  T6(G, p, (q, (y := exp(x_1, x_2, ..., x_n)))) = G + {(x_i^p, y^q) | i = 1..n} + {(z^p, z^q) | z ≠ y} + {(z^r, y^q) | z^r ∈ IF_q}

  T6(G, p, (q, skip)) = G

  T6(G, p, (q_1, S_1)) = G_1,  T6(G_1, q_1, (q_2, S_2)) = G_2
  ───────────────────────────────
  T6(G, p, (q, (q_1, S_1); (q_2, S_2))) = G_2 + {(v^{q_2}, v^q)}

  ∀i = 1..n: T6(G, p, (q_i, S_i)) = G_i
  ───────────────────────────────
  T6(G, p, (q, [C_1 → (q_1, S_1) □ C_2 → (q_2, S_2) □ ... □ C_n → (q_n, S_n)])) = Σ_i G_i + {(v^{q_i}, v^q)}

  ∀i = 1..n: T6(G^0, p, (q_i, S_i)) = G_i^0;   G^1 = Σ_i G_i^0 + {(v^{q_i}, v^q)} + {(v^q, v^p)}
  ───────────────────────────────
  T6(G^0, p, (q, [C_1 → (q_1, S_1) □ C_2 → (q_2, S_2) □ ... □ C_n → (q_n, S_n)])) = G^1

Figure 8: System SS6
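The graph reading of properties, the function H and its Nopath test, can be sketched in Python (a minimal sketch of ours, not the paper's implementation):

```python
# H(p, G) holds the pair (y, x) exactly when no path leads from the
# initial node x^0 to the node of y at point p.
def nopath(arcs, src, dst):
    """True iff there is no path from src to dst in the arc set."""
    seen, stack = set(), [src]
    while stack:
        n = stack.pop()
        if n == dst:
            return False
        if n in seen:
            continue
        seen.add(n)
        stack.extend(m for (k, m) in arcs if k == n)
    return True

# G + A' is simply the union of arc sets, the node set being fixed
G = {("x0", "y1"), ("y1", "y2")}
assert not nopath(G, "x0", "y2")   # x may flow into y at point 2
assert nopath(G, "y2", "x0")       # but not the other way round
```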

T6 takes three arguments: a graph G, a program point p and a statement S ∈ Stmt. The program point characterises a statement "preceding" the



current statement in (the execution of) the program. The program is analysed with the input program point p0 as argument. Program points are made explicit in the statements because they play a crucial rôle at this stage. The rule for assignment can be explained as follows. An arc is added to the graph from each occurrence of a variable x_i at the preceding program point p to y at the current program point q, and from each variable in the set of indirect flow to y. Other variables are not modified and an arc is added from their occurrence at point p to their occurrence at point q. In the rules for the alternative command and the repetitive command the operation + is used to implement ⊓. This comes from the fact that graphs record accessibility whereas sets contain negative information of the form x ∉ ȳ. Let us note that the graph manipulated by T6 is monotonically increasing:

Proposition 6.2 (Monotonicity of T6) ∀S, G, p: T6(G, p, S) = G′, G = (N, A), G′ = (N, A′) ⇒ A ⊆ A′

This property can easily be checked by inspection of the different cases. The rule for the repetitive command is an optimization of its counterpart in Figure 7 which arises naturally from the use of graphs to represent properties. The natural rule would have been the following:

  ∀i = 1..n: T6(G^0, p, (q_i, S_i)) = G_i^0;   G^1 = Σ_i G_i^0 + {(v^{q_i}, v^q)};   G^1 ≠ G^0
  ∀i = 1..n: T6(G^1, q, (q_i, S_i)) = G_i^1;   G^2 = Σ_i G_i^1 + {(v^{q_i}, v^q)};   G^2 ≠ G^1
  ...
  ∀i = 1..n: T6(G^{n-1}, q, (q_i, S_i)) = G_i^{n-1};   G^n = Σ_i G_i^{n-1} + {(v^{q_i}, v^q)};   G^n = G^{n-1}
  ───────────────────────────────
  T6(G^0, p, (q, [C_1 → (q_1, S_1) □ C_2 → (q_2, S_2) □ ... □ C_n → (q_n, S_n)])) = G^n

It is easy to prove however that no iteration above the second step can increase the current graph. The basic reason is that the actions performed on the graph during a call to T6(G, p, S) depend only on the entry program point p. Since all the iteration steps above the second one call T6 with the same program point q, their effect is null. Furthermore the second iteration differs from the first one only in the first call to T6, which has q as argument instead of p, hence the simplification. The correctness of this last algorithm is stated as follows:

Proposition 6.3 (Correctness of T6) ∀S, P, Q, G, p, q: H(p, G) = P and T5(P, S) = Q ⇒ H(q, T6(G, p, (q, S))) = Q


This property can be proved by induction on the structure of statements. We consider the rule for assignment first. Assuming

  G′ = G + {(x_i^p, y^q) | i = 1..n} + {(y_i^p, y_i^q) | y_i ≠ y} + {(z^r, y^q) | z^r ∈ IF_q}

and H(p, G) = P, we have to prove:

  H(q, G′) = T_y(P)

Unfolding the two terms of this equality, we get:

  ∀x. ∀z. Nopath(G′, x^0, z^q) ⇔ ((z, x) ∈ P and z ≠ y) or (z = y and ∀j ∈ [1, ..., n]: (x_j, x) ∈ P and ∀v ∈ IF_p: (v, x) ∈ P)

It is important to note that any call to T6(G, p, (q, S)) is such that there is no incident arc at the nodes y^q:

  ∀x^t, y^q: (x^t, y^q) ∉ G

We distinguish two cases: z ≠ y and z = y. If z ≠ y we have:

  Nopath(G′, x^0, z^q) ⇔ Nopath(G, x^0, z^p) ⇔ (z, x) ∈ H(p, G) ⇔ (z, x) ∈ P

If z = y then:

  Nopath(G′, x^0, y^q)
  ⇔ ∀j ∈ [1, ..., n]: Nopath(G, x^0, x_j^p) and ∀z^r ∈ IF: Nopath(G, x^0, z^r)
  ⇔ ∀j ∈ [1, ..., n]: (x_j, x) ∈ H(p, G) and ∀z^r ∈ IF: (z, x) ∈ H(p, G)
  ⇔ ∀j ∈ [1, ..., n]: (x_j, x) ∈ P and ∀v ∈ IF_p: (v, x) ∈ P

Notice that the notation IF is overloaded here: IF stands both for the set of variables used in T5 and for the set of superscripted variables used in T6. No ambiguity arises from this abuse of notation. The rule for the alternative command follows from the following property:

  ∀x. ∀z: Nopath(Σ_i G_i + {(v^{q_i}, v^q)}, x^0, z^q) ⇔ ∀i: Nopath(G_i, x^0, z^{q_i})

The rule for the repetitive statement can be proven using the more natural version presented earlier and showing that it is equivalent to the rule appearing in Figure 8 (applying the reasoning sketched above).


It should be clear that some straightforward optimizations can be applied to this algorithm. First, it is not necessary to keep one occurrence of each variable per program point in the graph. As can be noticed from the rule for assignment, most of these variables would just receive one arc from the previous occurrence of the variable. All these useless arcs can be short-circuited, and the only nodes kept in the graph are the occurrences x^p where p is an assignment to x or an alternative (or repetitive) statement with several assignments to x. Also, a naïve implementation of the rules for the alternative and the repetitive statements would lead to duplications of the graph. The monotonicity of T6 allows us to get rid of this duplication. Instead the graph can be constructed iteratively as follows:

  G^0 = G;   ∀i = 1..n: T6(G^{i-1}, p, (q_i, S_i)) = G^i
  ───────────────────────────────
  T6(G, p, (q, [C_1 → (q_1, S_1) □ C_2 → (q_2, S_2) □ ... □ C_n → (q_n, S_n)])) = G^n + {(v^{q_i}, v^q) | i = 1..n}

Let us now return to the library decryption program to illustrate the algorithm. Figure 9 is a new presentation of the program making some program points explicit (we do not include all of them for the sake of readability).

  var: i, charge, key, unit;
  array: clear, cipher;

  cipher := "message to be decrypted";
  unit := "unit rate constant";
  (p1, charge := unit);
  i := 0;
  (p2, [ cipher[i] ≠ null_constant →
      (p3, (p4, [ encrypted(cipher[i]) →
              (p5, (p6, clear[i] := D(cipher[i], key));
                   (p7, charge := charge + 2*unit));
            □ not encrypted(cipher[i]) →
              (p8, (p9, clear[i] := cipher[i]);
                   (p10, charge := charge + unit));
          ]);
          (p11, i := i + 1))
    ])

Figure 9: Library decryption program

Figure 10 presents the main steps of the application of T6 to this program. We write P_i for the command associated with p_i and we consider only the arc component of the graph. We avoid the introduction of useless nodes and arcs as described above. As a consequence, only 13 nodes are necessary for this program. Figure 11 shows the graph returned by the algorithm. Applying the Nopath function to this graph, we can derive the property mentioned in section 3 (p4 is the exit program point for charge):

  (clear ∉ charge) and (key ∉ charge)

  T6(∅, p0, P1) = G1     G1 = {(unit^0, charge^1)}
  T6(G1, p0, P6) = G2    G2 = G1 + {(cipher^0, clear^6), (key^0, clear^6), (i^0, clear^6)}
  T6(G2, p6, P7) = G3    G3 = G2 + {(charge^1, charge^7), (unit^0, charge^7), (i^0, charge^7)}
  T6(G3, p0, P9) = G4    G4 = G3 + {(cipher^0, clear^9), (i^0, clear^9)}
  T6(G4, p9, P10) = G5   G5 = G4 + {(charge^1, charge^10), (unit^0, charge^10), (i^0, charge^10)}
  T6(G5, p0, P4) = G6    G6 = G5 + {(charge^7, charge^4), (charge^10, charge^4), (clear^6, clear^4), (clear^9, clear^4)}
  T6(G6, p0, P2) = G7    G7 = G6 + {(charge^4, charge^1), (clear^4, clear^0)}

Figure 10: Analysis of the library decryption program

Figure 11: Results of analysis of library decryption program (the graph over the 13 nodes charge^0, unit^0, i^0, key^0, cipher^0, clear^0, charge^1, charge^7, charge^10, charge^4, clear^6, clear^9, clear^4)
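The final check can be replayed by transcribing the arc sets of Figure 10 into plain Python and applying Nopath; a sketch of ours, not the paper's implementation:

```python
# Arcs of G7, the graph returned by the analysis (Figure 10).
arcs = {
    ("unit0", "charge1"),
    ("cipher0", "clear6"), ("key0", "clear6"), ("i0", "clear6"),
    ("charge1", "charge7"), ("unit0", "charge7"), ("i0", "charge7"),
    ("cipher0", "clear9"), ("i0", "clear9"),
    ("charge1", "charge10"), ("unit0", "charge10"), ("i0", "charge10"),
    ("charge7", "charge4"), ("charge10", "charge4"),
    ("clear6", "clear4"), ("clear9", "clear4"),
    ("charge4", "charge1"), ("clear4", "clear0"),
}

def nopath(arcs, src, dst):
    """True iff there is no path from src to dst."""
    seen, stack = set(), [src]
    while stack:
        n = stack.pop()
        if n == dst:
            return False
        if n not in seen:
            seen.add(n)
            stack.extend(m for (k, m) in arcs if k == n)
    return True

# p4 is the exit point for charge: neither clear nor key reaches it,
assert nopath(arcs, "clear0", "charge4")
assert nopath(arcs, "key0", "charge4")
# but unit does flow into charge, as expected for the billing rate.
assert not nopath(arcs, "unit0", "charge4")
```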

7 Conclusion

In this paper we have proposed a formal definition of the notion of information flow and an axiomatisation of security properties. We then derived a practical inference algorithm which is both correct and complete with respect to the inference system. Thus our main contribution is to provide a formally based and effective tool for checking security properties of sequential programs. To our knowledge there have been surprisingly few attempts to



achieve these goals so far. Most of the approaches described in the literature either lead to manual verification techniques [3, 20, 25] or rely on informal correctness proofs [12]. The closest work in spirit to the contribution presented here is [22]. They derive a flow control algorithm as an abstract interpretation of the denotational semantics of the programming language. The programmer associates each variable with a security class (such as unclassified, classified, secret, ...). Security classes correspond to particular abstract semantic domains forming a lattice of properties, and the analysis computes abstract values to check the security constraints. In contrast with this approach, we do not require security classes to be associated with variables, but we check that the value of one particular variable cannot flow into another variable. We have shown in [5] that this approach provides more flexibility in the choice of a particular security policy. Our algorithm could in fact be applied to synthesise the weakest constraints on the security classes of the variables of an unannotated program. These two options can be compared with the choice between explicit typing and type synthesis in strongly typed programming languages. [3, 25] present a certification technique based on Hoare logic. Their proof rules capture a notion of information flow through properties expressed as collections of inequalities between variable classes. However they do not consider the mechanisation of the proofs and they do not relate their axiomatics to any other formal semantics of the language. As a result the underlying notion of security is unclear. Let us consider for instance the following program [3]:

  if b then y := x;
  if ¬b then z := y

where b does not refer to x, y, or z. Andrews and Reitman correctly point out that since only one branch can execute, there is no flow from x to z. Using a combined proof system that allows assertions about both classes and values, they are able to prove (the property is rephrased using our terminology):

  (b ⇒ (x ∈ ȳ) and (b ∈ ȳ)) and (¬b ⇒ (y ∈ z̄) and (b ∈ z̄))

This clearly does not cater for all indirect flows, since in the first case z does not have the flow from b registered, nor does y in the second. There are several directions in which we plan to extend this work. The most important is the treatment of a parallel language. Indeed the need for ensuring security properties becomes especially crucial in the context of distributed systems. We are currently studying the generalisation of our work to a full version of CSP [15]. In CSP communication commands may occur



in guards and in statements. The notion of indirect flow has to be extended to take such communications into account. The semantics of CSP introduces two main technical difficulties for a correct treatment of control flow:

- Indirect control flow can occur even in the absence of a rendez-vous (when such a rendez-vous would have been made possible by a different execution of a guarded command).

- The non-termination of a process can influence the values of the variables of the processes it might have communicated with.

As an example of how indirect flows can occur in the absence of a rendez-vous, consider the following program segment. Suppose that y of process P1 is either 1 or 0. Whatever the value of y, at the end of process P1, x will equal y. The reason for this is that, if y = 0 in process P1, then P1 passes the value 1 to b of process P2, which then passes 0 back to x. Conversely, if y is 1 in P1, then P1 signals 0 to process P3, which signals 1 to P2's b, which in turn passes this value back to x.

  P1:: [ var x, y;
         y := e();
         [ y = 0 → P2 ! 1
         □ y ≠ 0 → skip ];
         P3 ! 0;
         P2 ? x ] ||
  P2:: [ var a, b;
         [ P1 ? b → b := b - 1
         □ P3 ? b → skip ];
         a := b;
         P1 ! a ] ||
  P3:: [ var s;
         P1 ? s;
         P2 ! 1 ]

Our solution consists of associating each program point p_i with a control flow variable c_i containing all the variables which may influence the fact that the execution of the program reaches that point. When a communication occurs between p1 : P2 ! v and p2 : P1 ? x, the control flow c1 at point p1 is added to the security variable x̄. Furthermore both control flows c1 and c2 become c1 ∪ c2. As far as algorithmic aspects are concerned, communications introduce a new source of non-determinism in the proof. The traditional technique consists in carrying out the proof of each process independently before checking a cooperation condition on the individual proofs. The first phase places few constraints on communication commands, and appropriate properties have to be guessed in order to derive proofs that satisfy the cooperation conditions. Our graph algorithm can be extended in a natural way to simulate this reasoning. The set of nodes includes control flow variables and the required arcs are added between matching communication commands. The important property allowing us to retain the simplicity of the algorithm described here is the fact that we derive for each point of the program the strongest property provable at this point. As a consequence the graph can still be built incrementally, avoiding the need for an iterative process. Last but not least, we are currently working on an implementation of our analysis technique. The algorithm yielded by our derivation is akin to the data flow and control flow analysis techniques used in compilers [2, 17] (but the particular kind of analysis carried out here cannot be classified in either of these two categories). It can be shown that the complexity of a slightly improved version of our algorithm is linear in terms of the length of the program. The optimization consists in introducing intermediate nodes in order to factorise the indirect flow from the guards in the alternative and the repetitive commands. This improvement allows us to generate a number of arcs in the graph which is at most linear in terms of the size of the program. We hope to be able to report experimental results soon.
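The two updates performed at a communication, adding c1 to x̄ and merging the control flows, can be transcribed literally in set notation. This is our own illustrative Python for the mechanism just described, not the extended system itself:

```python
# Communication p1: P2 ! v  /  p2: P1 ? x, over flow sets.
def communicate(xbar, c1, c2):
    """xbar: variables recorded as flowing into x;
    c1, c2: control flow sets at the two communication points."""
    xbar = xbar | c1    # the sender's control flow is added to x̄
    c = c1 | c2         # both control flows become c1 ∪ c2
    return xbar, c, c

# x already carries v; the sender's reachability depends on y,
# the receiver's on s (variable names are hypothetical).
assert communicate({"v"}, {"y"}, {"s"}) == ({"v", "y"},
                                            {"y", "s"}, {"y", "s"})
```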

References

[1] Abramsky (S.) and Hankin (C. L.), "Abstract Interpretation of Declarative Languages", Ellis Horwood, 1987.

[2] Aho (A. V.), Sethi (R.) and Ullman (J. D.), "Compilers: Principles, Techniques and Tools", Addison-Wesley, Reading, Mass., 1986.

[3] Andrews (G. R.), Reitman (R. P.), "An Axiomatic Approach to Information Flow in Programs", in ACM Transactions on Programming Languages and Systems, volume 2 (1), January 1980, pages 56-76.

[4] van Bakel (S.), "Complete Restrictions of the Intersection Type Discipline", in Theoretical Computer Science, volume 102 (1), 1992, pages 135-163.

[5] Banâtre (J.-P.) and Bryce (C.), "A Security Proof System for Networks of Communicating Processes", Irisa research report, no 744, June 1993.

[6] Clark (D. D.), Wilson (D. R.), "A Comparison of Commercial and Military Computer Security Policies", in Proceedings of the IEEE Symposium on Security and Privacy, Oakland, April 1987, pages 184-194.

[7] Cohen (E.), "Information Transmission in Computational Systems", in Proceedings of the ACM Symposium on Operating System Principles, 1977, pages 133-139.



[8] Coppo (M.), Giannini (P.), "A Complete Type Inference Algorithm for Simple Intersection Types", in Proceedings of the 17th CAAP, LNCS 581, 1992.

[9] Cousot (P.) and Cousot (R.), "Systematic Design of Program Analysis Frameworks", in Proceedings ACM POPL, 1979, pages 269-282.

[10] Denning (D. E.), Secure Information Flow in Computer Systems, PhD Thesis, Purdue University, May 1975.

[11] Denning (D. E.), "A Lattice Model for Secure Information Flow", in Communications of the ACM, volume 19 (5), May 1976, pages 236-243.

[12] Denning (D. E.), Denning (P. J.), "Certification of Programs for Secure Information Flow", in Communications of the ACM, volume 20 (7), July 1977, pages 504-513.

[13] Denning (D. E.), Cryptography and Data Security, Addison-Wesley, 1982.

[14] Hankin (C. L.) and Le Métayer (D.), "Deriving Algorithms from Type Inference Systems: Application to Strictness Analysis", to appear in Proceedings ACM POPL, 1994.

[15] Hoare (C. A. R.), Communicating Sequential Processes, Prentice-Hall, London, 1985.

[16] Jones (A.), Lipton (R.), "The Enforcement of Security Policies for Computations", in Proceedings of the 5th Symposium on Operating System Principles, November 1975, pages 197-206.

[17] Kennedy (K. W.), "A Survey of Data Flow Analysis Techniques", in Program Flow Analysis, S. S. Muchnick and N. D. Jones, Eds, Prentice-Hall, Englewood Cliffs, NJ, 1981.

[18] Lampson (B.), "A Note on the Confinement Problem", in Communications of the ACM, volume 16 (10), October 1973, pages 613-615.

[19] Lampson (B.), "Protection", in Proceedings of the 5th Princeton Symposium on Information Sciences and Systems, Princeton University, March 1971, pages 437-443; reprinted in Operating Systems Review, 8 (1), January 1974, pages 18-24.

[20] Landwehr (C. E.), "Formal Models of Computer Security", ACM Computing Surveys, 13 (3), 1981, pages 247-278.



[21] Levy (H.), Capability-Based Computer Systems, Digital Press, Mass., 1984.

[22] Mizuno (M.), Schmidt (D.), "A Security Flow Control Algorithm and Its Denotational Semantics Correctness Proof", Journal on the Formal Aspects of Computing, 4 (6A), November 1992, pages 722-754.

[23] Neuman (B. C.), "Protection and Security Issues for Future Systems", in Operating Systems of the 90s and Beyond, Lecture Notes in Computer Science, volume 563, pages 184-201.

[24] Plotkin (G.), "An Operational Semantics of CSP", in D. Bjørner, editor, Formal Description of Programming Concepts II, North-Holland, 1983, pages 199-225.

[25] Reitman (R.), Information Flow in Parallel Programs: an Axiomatic Approach, PhD Thesis, Cornell University, August 1978.

[26] Saltzer (J.), Schroeder (M.), "The Protection of Information in Computer Systems", in Proceedings of the IEEE, volume 63 (9), September 1975, pages 1278-1308.