by Simon Foley
[The following article will appear in the next issue of SIGSAC Review
and appears here with permission of its editor, Catherine Meadows, and
the author. This piece supplements Cipher's earlier account by Trent
Jaeger in EI#7. Simon Foley chaired the meeting reported.-- CEL]
The purpose of the Computer Security Foundations Workshop is to bring together researchers to explore fundamental issues in computer security. Each year, papers and panel sessions are presented in foundational areas such as access control, cryptographic protocols, database security, integrity and availability, information flow, and formal methods for security.
This year the workshop was held at Dromquinna Manor, Kenmare, County Kerry, Ireland, June 13-15, 1995. It was the first time the workshop had been held outside the USA. Dromquinna Manor, situated on a peninsula on the south-west coast of Ireland, provided the kind of peaceful environment, in an idyllic setting, that has become synonymous with the workshop. The number of submissions to the workshop was up on the previous three years, perhaps due in part to the attractive venue, but most certainly due to the lively technical discussions that can be expected during the workshop. It was the hard work of Program Chair Li Gong (SRI, USA), the Program Committee, the authors and the panelists that made the workshop such a success.
The first session, chaired by Simon Foley, was on Composition and comprised three papers about relationships between information flow properties and composition. In his presentation of The Composability of Non-Interference (A. Zakinthinos and E.S. Lee, U. Toronto, Canada), Aris Zakinthinos proposed a novel, composable, non-interference property that permits a degree of feedback. He argued that non-interference is more appealing than restrictiveness because it has a more intuitive feel and a larger class of systems satisfies it. Lars Wulf presented Composing and Decomposing Systems under Security Properties, co-authored with A.W. Roscoe (both Oxford U., UK). He argued that a more expressive model than traces should be used when capturing information flow properties in CSP. The authors studied separability, and its relationship to composition, in terms of the failures-divergences model. One conclusion was that separability is not a sound property for systems that have (internal) nondeterminism. The final paper, Algebraic Properties of System Composition in the Loral, Ulysses and McLean Trace Models, was presented by Alfred Maneki (DOD, USA), who gave a catalogue of algebraic properties for the three information flow properties. His conclusion was one of caution: one should be mindful of how compositions are made, even if the property is composable.
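As background for readers new to the area (this is the standard trace-based formulation, not necessarily the exact property Zakinthinos and Lee propose), non-interference requires that deleting all high-level inputs from any trace leaves the low-level view of the system unchanged:

\[
\forall \tau \in \mathrm{traces}(S): \quad \mathit{view}_L(\tau) \;=\; \mathit{view}_L(\mathit{purge}_H(\tau))
\]

where \(\mathit{purge}_H\) removes the high-level input events from the trace \(\tau\) and \(\mathit{view}_L\) projects out the low-level observations. Restrictiveness strengthens this condition so that it is preserved under composition, which is precisely why it is less intuitive and is satisfied by a smaller class of systems.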
Michael Reiter (AT&T Bell Labs, USA) chaired the session on Authentication Protocols. Li Gong was to present a paper, Optimal Authentication Protocols Resistant to Password Guessing Attacks, but in the tradition of CSFW spontaneity he instead proposed ten foundational issues for computer security. They were: secure initial access (by a universal authentication format); high-integrity and easily accessible pseudo-random number generators; a highly available, non-tamperable global time service on the internet; sensible placement of security mechanisms within the internet; secure, dynamically constituted groups; provably secure protocols; refinement of security properties and specifications; secure (heterogeneous) system composition; secure system inter-operation; and sensible integration of security and fault tolerance.
The second paper, Key Distribution without Individual Trusted Authentication Servers, was presented by Liqun Chen (co-authors D. Gollmann and C. Mitchell, all U. London). Chen described the problem of establishing secure (symmetric) channels between entities who share no secret, and where some authentication servers cannot be trusted. The advantages of their protocol over existing solutions include fewer (and smaller) messages, greater choice of hash function, and lower computational complexity.
The session after lunch, Analysis of Cryptographic Protocols, was chaired by Gene Tsudik (IBM, Switzerland). Colin Boyd (U. Manchester, UK) presented Towards a Classification of Key Agreement Protocols, and advised caution when selecting hash functions. His `menagerie' of hash functions (with the acronyms BOW, MIOW and WOOF) helps in classifying key agreement protocols into three different types, which he described. The second speaker, Wenbo Mao (HP Labs, UK), proposed An Augmentation of BAN-Like Logics. He noted that errors are easily made during the protocol idealization step. His approach does not change the axioms of the logic, but proposes that stages of the idealization be described using new operators.
After a break for cream teas on the lawns of Dromquinna, the first day was concluded by a panel session on What Makes a Cryptographic Protocol Dependable? The moderator was Catherine Meadows (NRL, USA), with panelists Colin Boyd, Dieter Gollmann and Michael Merritt (AT&T Bell Labs, USA). The motivation for the panel was that there exists a large body of work describing various conditions that cryptographic protocols should satisfy to be reliable/dependable. However, these requirements are often contradictory when taken together.
Colin Boyd's position was that we should follow the formal top-down principles of dependable systems development. He suggested a layered approach, with refinement between layers providing correctness: first, specify the security requirements for messages in the system; second, design and describe the protocol in an abstract manner, avoiding details of specific cryptographic algorithms; and third, implement. He argued that many existing protocols are not specified properly, feeling that approaches such as using CSP/FDR are good because they force the specifier to be precise. In light of this, Bob Morris wondered whether we even know what cryptographic protocols are intended to do in the first place.
Dieter Gollmann asked the same question of protocols and repudiation, suggesting that the problem is not being able to focus on what we want to talk about. He agreed with Boyd's layered approach and argued that when designing protocols we should: always use the correct level of abstraction; disassociate signature from encryption with a secret key; keep proofs at a distance from the protocol; and make all initial assumptions on keys, nonces and algorithms explicit.
Michael Merritt wondered if insight could be gained by viewing dependable protocols as a form of fault-tolerant distributed algorithm. With fault tolerance one is interested in the reachable states of a system, while with a protocol we are interested in the reachable states of an adversary. However, he pointed to a number of problems that make developing dependable protocols harder. He felt that secure refinement would be a difficult problem, especially given that an adversary attacks the implementation, not the specification. He suggested that refinement could be a process of re-examining the steps from previous levels to make sure they still hold. Bill Roscoe agreed that refinement of specifications involving true nondeterminism is very hard. John McLean wondered if properties like subliminal channels might be overlooked by refinement, and suggested that perhaps refinement should be viewed as just a way to narrow down what one needs to look at when developing a protocol.
Mark Lomas outlined a system which used two individually `dependable' protocols together in such a way that they failed. Others had similar experiences, often due to protocol assumptions not being made explicit. Michael Merritt wondered how we specify `you do not do anything incredibly stupid', and suggested analyzing substitution attacks as a way to avoid problems when extending protocols later on. Li Gong felt that at the protocol level we should not be concerned with low-level details such as attacks on the cryptosystem, which would only make the analysis harder. He argued that it is the person who codes the protocol who should ask the relevant questions about which ciphers are appropriate to use.
Before chairing the first session of day two, Jonathan Millen (MITRE, USA) asked the audience to consider the state of computer security foundations and its relationship to the photograph of Staigue Fort on the proceedings' cover (an early medieval stone fort near Dromquinna Manor). He speculated that the fort looked like a foundation, but nothing was ever built on it and, furthermore, it was crumbling on top!
The first paper in this session on Issues in Implementations was The Security Checker: A Semantics-based Tool for the Verification of Security Properties, by R. Focardi, R. Gorrieri and V. Panini (U. Bologna, Italy). In his talk, Riccardo Focardi described work on adapting the CCS Concurrency Workbench to model-check information flow properties of CCS-style specifications. The checking algorithms have poor (some exponential) worst-case complexity. In practice, however, the authors have found the time and space requirements of the algorithms to be reasonable, especially if algebraic properties of the security property (for example, with respect to parallel composition) are used to reduce the number of states to be checked.
Trent Jaeger presented Implementation of a Discretionary Access Control Model for Script-based Systems (co-author Atul Prakash, both U. Michigan, USA). He proposed an access control model tailored specifically for script-based systems. The model provides roles for processes running command scripts: accesses are determined by the rights of the process and of the script author. The model has been implemented using Safe-Tcl under Kerberos and Taos. In the absence of the authors, J.V. Janeri (MITRE, USA), D.B. Darby and D.D. Schnackenberg (Boeing, USA), Jonathan Millen presented their paper Building Higher Resolution Synthetic Clocks for Signaling in Covert Timing Channels. Millen described a LAN covert channel, and its countermeasures, based on process scheduling. Controlling the channel by adjusting clock granularity can conflict with the timing requirements of network software. This was exemplified by the discovery of a Heisenberg-like principle: the covert channel worked when debugging code was present in the software, but decayed when it was removed!
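To give a rough sense of why clock granularity is the lever here (an illustrative back-of-envelope estimate, not a figure from the paper): if the receiver can time events only to a granularity of \(g\), and the sender can choose delays within a window of length \(W\), then at most about \(W/g\) distinct timings are distinguishable, so each event can leak at most about

\[
\log_2\!\left(\frac{W}{g}\right) \text{ bits.}
\]

Coarsening the synthetic clock (increasing \(g\)) therefore shrinks the channel, but, as the paper notes, such coarsening may conflict with the legitimate timing needs of the network software itself.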
Michael Merritt chaired the session on cryptographic protocols, which focused on using algebraic techniques for verifying protocols (as opposed to BAN-style approaches). A.W. Roscoe's paper was on Modeling and Verifying Key-Exchange Protocols using CSP and FDR, but he spoke about subsequent ongoing work at Oxford with P. Gardiner, G. Lowe and M. Goldsmith. Their approach is to use CSP to describe the agents in the protocol (responder, server, initiator, adversary) and to use the FDR tool to model-check safety and liveness properties of the protocol. The second paper, Using Temporal Logic to Specify and Verify Cryptographic Protocols (Progress Report), by J.W. Gray III (Hong Kong U. of Science and Technology) and J. McLean, was presented by John McLean. The goal of their research is to provide a single logic in which requirements specification, protocol specification and proof of correctness can all be carried out within the same formalism. In this respect, they have effectively recast the Prolog, context-free grammar and temporal logic components of the NRL Protocol Analyzer tool into Lamport's Raw Temporal Logic of Actions. The session adjourned for lunch, which was taken down by the sea shore.
The session on Secure Systems was chaired by John McLean. Tom Keefe (Penn State, USA) presented Concurrency Control for Federated Multilevel Secure Database Systems (co-author I.E. Kang, GTE Labs, USA). He explained their interest in building federated multilevel DBMSs on top of autonomous, pre-existing multilevel DBMSs. Their approach uses a secure validation protocol which ensures serializability across the federation, maintains the local autonomy of individual DBMSs and ensures multilevel security. While their protocol currently requires the security lattices of the local DBMSs to be totally ordered, the resulting lattice for the secure federation is not necessarily total. Keefe also described a novel method for untrusted timestamp generation that can be used in their protocol. Jeremy Jacob (University of York, UK) presented the paper Specifying Security for CSCW Systems (co-author S. Foley). He described what the authors believe to be a reasonable method for specifying functionality and security (confidentiality) requirements for CSCW applications. Testing the method against a simple case study, they were surprised that security turned out to be very simple to specify. Jacob suggested that this came about as a result of the way the application is modularized, and because so much is captured by the functionality requirements.
The panel session Considering the Common Criteria concluded the second day, with moderator Jane Sinclair (Open University, UK) and panelists Jeremy Jacob, Jonathan Millen and Bronia Szczygiel (NPL, UK). Jane Sinclair gave the motivation for the panel and asked a number of questions about the Criteria to which she hoped we might find answers.
The first panelist, Jonathan Millen, outlined the key components of the Criteria, described how it might be used, and gave pointers to those aspects of the Criteria that offer opportunities for further technical work (practically everything). He noted that the Criteria typically defines what is required but gives no assistance on how to do it; he felt that further support and guidance documentation should be provided. He finished with a meta-challenge: do we really know enough to do what is required by the Criteria?
Jeremy Jacob felt there was little trace of the security foundations community in the Criteria documentation, and asked whether the people developing it are paying any attention to us, and whether we have anything to offer them anyway. John McLean noted that the Orange Book did enshrine the view of the technical community, and in doing so halted it in its tracks.
Bronia Szczygiel argued that the Criteria does not meet users' needs, is incomplete and inflexible, and should address the integrity of the entire enterprise, not just the computer system. But she felt that its principles were good, i.e., addressing what your security is, rather than low-level details. She suggested that conformance testing for security can give an added objectivity that is not obtained from evaluation alone.
There was speculation from the audience as to whether `evaluated to CC' would be like `Intel Inside': will it help sell systems? This was countered by the suggestion that perhaps it is a warning. This seemed indicative of the audience's mood: one of skepticism as to whether the Criteria would be used in commercial practice. Perhaps using the Criteria is like the Irish dancing seen that night in Dromquinna: just a lot of fancy footwork?
The session for the final day of the workshop, Information Flow, was chaired by Joshua Guttman (MITRE, USA). Paul Syverson (NRL, USA) presented the paper The Epistemic Representation of Information Flow Security in Probabilistic Systems, co-authored with J.W. Gray III. He outlined a logic for security in which information flows due to time and probability can be accounted for. Their logic is built on the standard Halpern-Tuttle model of knowledge and probability in systems, and represents a bringing together of previous work in the areas of noninterference and epistemic logic for modeling security. Connection Policies and Controlled Interference, by W.R. Bevier, R.M. Cohen and W.D. Young (Computational Logic, USA), was presented by Bill Young. He described a generalization of their work (presented at this workshop last year) on intransitive non-interference. Typical intransitive noninterference requirements are at the granularity of agents: whether agents may or may not interfere. Controlled interference permits a finer granularity by using connection policies to define what channels connect agents, effectively capturing how agents may or may not interfere with each other. He concluded by examining related work on intransitive noninterference, separability and type enforcement (assured pipelines).
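For readers unfamiliar with the intransitive setting (this is the standard formulation in the style of Haigh-Young and Rushby, given as background rather than the authors' controlled-interference definition): a policy \(\leadsto\) between agents is enforced if every agent's observation of a trace equals its observation of the intransitively purged trace,

\[
\forall \tau,\ \forall a: \quad \mathit{obs}_a(\tau) \;=\; \mathit{obs}_a(\mathit{ipurge}_{\leadsto}(\tau, a)),
\]

where \(\mathit{ipurge}_{\leadsto}\) deletes exactly those actions that cannot influence the observer \(a\) through any chain of steps permitted by \(\leadsto\). Connection policies refine this agent-level view by stating which channels between agents may carry the permitted interference.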
We were fortunate to have three days of glorious sunshine, giving delegates the opportunity to discuss their research over walks by the sea, across mountains, and as with every previous year, over games of croquet. Congratulations to this year's Croquet Tournament winner, Bill Roscoe, and the runner-up Aris Zakinthinos. At the business meeting, held after the last session, it was agreed to use Dromquinna Manor as the workshop venue for 1996. The workshop will run June 10--12, 1996. Simon Foley will remain as General Chair and the Programme Chair will be Michael Merritt. For further information about the workshop contact Simon Foley (at the address above), or access the CSFW web page at: http://www.csl.sri.com/ieee-csfw/csfw.html. Information about obtaining copies of this year's CSFW8 proceedings is also available at this site.