Notes from the Eleventh Annual Computer Security Applications Conference, New Orleans, December 11-15, 1995


by Charlie Payne, Secure Computing Corporation and
Ron Ross, Institute for Defense Analyses

[Copies of the conference proceedings are available as:

IEEE Computer Society Press Order Number PR07159
ISBN 0-8186-7159-9
ISSN 1063-9527
Copies may be ordered from:
IEEE Computer Society Press
Customer Service Center
10662 Los Vaqueros Circle
P.O. Box 3014
Los Alamitos, CA 90720-1264
e-mail: cs.books@computer.org
Voice: +1-714-821-8380
Fax: +1-714-821-4641
I don't have prices, but the current IEEE publication catalog lists last year's ACSAC proceedings as available for $60 list price, $30 to IEEE CS members. --CEL]

Sessions reported by Charlie Payne:

Amid the glamour and pomp of old New Orleans, the Annual Computer Security Applications Conference (ACSAC '95) convened for the eleventh time. Paul Strassman of SAIC delivered the keynote address, and Bob Courtney of Robert Courtney Company presented the Distinguished Lecture. While the conference covered many traditional INFOSEC topics, e.g., cryptography, database security and applications of formal methods, it was clear that securing the Internet was the hot topic, with cost-effective methods for assurance a close second.

Paul Strassman opened the conference with a call to top management to take personal responsibility for information security in their organizations. Too often, he insisted, the task is delegated to techies. Strassman advocated a security organization based on systems of governance so that there is separation of power. The executive committee is the legislative branch, while operating management is the executive branch. Strassman stressed that only operating management, with its direct responsibility for profits, should be charged with trade-offs between security and other concerns. The judicial branch arises from the requirement that the organization enforce standards. Strassman concluded his address with a few security "bumper stickers".

Bob Courtney spoke next. He proposed that the 1996 conference committee arrange a debate between Strassman and him. He didn't elaborate on their points of disagreement; instead, he decried organizations for not understanding what they are trying to protect. We're not doing a good job of defining the problem, he said, because protecting the integrity of information is a far greater problem than ensuring its confidentiality. Integrity is the property of data (or anything) "being no worse than you think it is". Access control is a poor countermeasure for protecting integrity because individuals are not held accountable. He advocated widening the internal auditor's scope to include all business processes and controls.

At the conclusion of the opening session, Marshall Abrams of The MITRE Corp. announced that "LAFS: A Logging and Auditing File System" by Christopher Wee of the University of California, Davis, won the student paper award. Later in the conference it was announced that a team of authors from the Naval Research Laboratory won the outstanding paper award with their submission, "Improving Inter-Enclave Information Flow for a Secure Strike Planning Application". The conference was presented in three tracks: two technical presentation tracks and a vendor track. The summaries below address only the technical presentation tracks that this reviewer attended.

Wednesday late morning - Track A: Firewalls

Jeremy Epstein of Cordant, Inc. introduced the three presentations: two described applications of firewall technology, while the third presentation described the secure remote control of uninterruptible power supply systems.

In "A Community of Firewalls: An Implementation Example", Dan Woycke of The MITRE Corp. described a new, firewall-based architecture for the Open Source Information System (OSIS), which is an unclassified confederation of systems serving the intelligence community. The old architecture did not support direct connection between the internal network and the Internet, and each node required strong authentication. After considering encrypting routers and managed IP service, MITRE settled on a hybrid approach: virtual private networks. The new architecture stresses usability and connectivity between nodes while retaining an acceptable level of security. It permits WWW between OSIS nodes. MITRE concludes that virtual private networks provide node-to-node connectivity and Internet connectivity while reducing the need for strong authentication.

In "Sidewinder: Combining Type Enforcement and UNIX", Dan Thomsen of Secure Computing Corp. described how Type Enforcement and other mechanisms were added to BSDi UNIX to create the company's firewall product. Four primary features were added: an administration kernel, to which there is no access from the network; an operational kernel, which enforces Type Enforcement and access to the administration kernel; triggers for detecting malicious behavior; and controlled system calls. Finally, Thomsen summarized the results of the initial Sidewinder Challenge, for which there were no successful penetration attacks.

Gerd Enste of debis Systemhaus, Germany, concluded this session with his presentation on the "Secure Remote Control and Administration of Uninterruptible Power Supply-Systems with SNMP". The primary threats are message modification and replay attacks. Message disclosure is not a major threat because the content of the control messages may be publicly known. Denial of service attacks are indistinguishable from ordinary network failures. The solution is a modified protocol that uses monotonic counters instead of synchronized clocks, together with a cryptographic algorithm that generates a message authentication code which changes for every session or message.
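
The talk is summarized here only at the level of its design goals; the sketch below merely illustrates the general pattern of pairing a monotonic counter with a per-message authentication code to defeat replay and modification. The key handling, counter width, and choice of HMAC-SHA256 are assumptions made for the example, not the authors' algorithm.

    import hmac, hashlib

    SECRET_KEY = b"shared-secret"          # hypothetical pre-shared key

    def protect(counter, payload):
        """Sender: bind a monotonically increasing counter to the message with a MAC."""
        msg = counter.to_bytes(8, "big") + payload
        tag = hmac.new(SECRET_KEY, msg, hashlib.sha256).digest()
        return counter, payload, tag

    def verify(last_seen, counter, payload, tag):
        """Receiver: reject stale counters (replays) and bad tags (modifications)."""
        if counter <= last_seen:
            return False, last_seen        # replayed or reordered message
        msg = counter.to_bytes(8, "big") + payload
        expected = hmac.new(SECRET_KEY, msg, hashlib.sha256).digest()
        if not hmac.compare_digest(tag, expected):
            return False, last_seen        # modified in transit
        return True, counter               # accept and advance the counter

    ctr, payload, tag = protect(42, b"UPS: switch to battery")
    print(verify(41, ctr, payload, tag))   # (True, 42)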

Wednesday early afternoon - Track B: Forum

Steve LaFountain of NSA moderated a forum titled "Experiences Using the Common Criteria (CC) to Develop Protection Profiles (PP) and Security Targets". Forum participants were Bernard Roussely of the NATO Office of Security, Leslie LaFountain of NSA, Jon Millen of The MITRE Corp., and Ken Elliot of The Aerospace Corp. Version 1.0 of the CC will be available in January 1996, with another version appearing a year later.

Roussely described NATO's efforts to develop PPs for NATO critical systems, including a PP with C2 features (plus information labels) and B1 assurance as well as a PP for a firewall. He concluded that product PPs should be developed from system PPs, and that a method is needed to design PPs from system requirements. Roussely liked the rich database of information that appears in part 2 of the CC, but he faulted the perceived gap between a system's security objectives and the PP components. He also noted that the granularity of the components was often inappropriate.

Leslie LaFountain described NSA's goal to develop generic high and low assurance PPs for operating systems. Overall she found the CC to be very flexible and to provide useful guidance on the dependencies between features and assurance. However, the documentation requirements may be overwhelming, and some requirements, such as for assurance of modularity, are still too poorly stated.

Millen developed a PP for an application-level firewall with user authentication, access control and auditing. While most of the needed requirements were present in the CC, there was no requirement for exporting audit data. In addition, he would have liked a special interpretation of subjects and objects for clients and services, respectively, and support for different authentication mechanisms. Millen concluded that the Common Criteria needed improvement in the area of refinement.

Elliot relayed his experience developing B2-level PPs. He shared Roussely's frustration with component granularity. He also noted that the profile structure is not backward-compatible with the TCSEC. In general, the CC is still too large and too informal. However, it does provide a common language for expressing requirements, and it is flexible and extensible. Outstanding issues seem to be the unnecessary dependencies between components, the duplication of functionality and the identification of a process for profile assessment.

One audience member noted that the CC is just the TCSEC in more shades of colors; instead a PP should be something the vendor provides to make claims about what is built. Another individual noted that the utility of the CC is still unknown since only evaluators and accreditors have developed PPs.

Wednesday late afternoon - Track A: Trusted Distribution Systems

In the final session of the day, Emilie Siarkiewicz of Rome Laboratory moderated technical presentations on trusted distributed systems.

In "The Triad System: The Design of a Distributed, Real-Time Trusted System", John Sebes of Trusted Information Systems, Inc. described the addition of real-time features to B3 Trusted Mach (TMach). The system is so-named because it merges three areas of operating system functionality: multilevel security, real-time, and distributed processing. Triad's security functions are provided by the TMach system and by a distributed interprocess communication (IPC) mechanism for propagating security data between hosts. Real-time extensions were made to the Mach microkernel and to the TMach servers. Enforcing the security policy depends on attaching security ID tags (of the sending task) to each IPC and interpreting the tags at each node in the system. A prototype operating system has been produced. TIS believes the architecture is ideal for CORBA security.

In "Immediacy (and Consistency) in Distributed Trusted Systems", Gary Grossman of Cordant, Inc. pondered the meaning of "immediacy" for distributed systems. The term refers to the latency between the time of check and the time of use. For local systems, an action is considered "immediate" if it occurs within a bounded period, or if it occurs before some other action that could be affected by the change. However, that definition fails for distributed systems. If a security database is replicated and distributed, immediacy and consistency will conflict, and immediacy must take priority. In fact, the Trusted Network Interpretation (TNI) of the TCSEC refers to "indeterminate delays" instead of "immediacy". Grossman concludes that immediacy should only be a local system requirement, not a distributed system requirement.

Thomas Darr of CTA, Inc. concluded the session with a presentation on "Multilevel Security Issues in Real-Time Embedded Systems". He discussed the issues and problems in applying existing security technology, guidance and criteria to real-time embedded computer systems with multilevel security requirements, including finding suitable security models; identifying and characterizing the "TCB"; the requirements for mandatory access controls, label mechanisms and authentication mechanisms; and the interpretation of security assurance requirements. He recommended closer ties between research and acquisition organizations and suggested further research into threat- and risk-based rationales for real-time embedded system security mechanisms.

Thursday late morning - Track B: Forum

Charles Payne of Secure Computing Corp. moderated a forum titled "Roads to Assurance". Forum participants included Doug Landoll of Arca Systems, Inc., David Ferraiolo of NIST, Jody Heaney of The MITRE Corp, John Adams of NSA, and Jan Filsinger of Trusted Information Systems.

Landoll set the tone for the other presentations with a framework for understanding, comparing and reasoning about assurance methods. The framework, which was introduced at the 1995 Workshop on Information Technology Assurance and Trustworthiness (WITAT), considers an assurance method in three dimensions: assurance type (correctness, effectiveness, usability, workmanship), assurance source (system, process, people, environment), and assurance technique (evidence production, evidence evaluation).

Ferraiolo discussed criteria-based assurance methods, focusing primarily on the Common Criteria. The Common Criteria, whose merits were discussed earlier at this conference (see the forum above, "Experiences Using the Common Criteria to Develop Protection Profiles and Security Targets") define a set of components for expressing the assurance requirements for products and systems. The Criteria are consistent with the TCSEC (US), ITSEC (EC) and CTCPEC (Canada). In terms of Landoll's assurance framework, the Common Criteria address the correctness and effectiveness of a product or system through evidence production and evaluation.

Heaney considered the contribution of process-based methods to assurance. The two most popular methods in use, the Software Engineering Institute's Capability Maturity Model (CMM) for Software and ISO 9000, have spawned a new generation of process-based assessments, including the System Engineering CMM, the System Security Engineering CMM and the Acquisition Maturity Model. Process-based assessments not only improve the process but the product as well; however, they require significant management support. Heaney questioned whether the proposed assurance framework adequately captures process improvement and the dependencies between pieces of assurance evidence.

Adams described a recent NSA thrust to develop the Trusted Capability Maturity Model (TCMM). The TCMM is the marriage of the CMM for Software and the Trusted Software Development Methodology (TSDM). The TSDM describes a development process for building software with a high degree of assurance that it is free from inadvertent errors or malicious code. The TCMM is a technique that yields quantitative measures of both process maturity and compliance with developmental principles that increase the trustworthiness of software. NSA's goal for the project is to replace several assessments (e.g., process-based, product-oriented, etc.) with a single assessment. The project went from ideas to results in only eight months.

Filsinger concluded the formal segment of the forum discussion with an assessment of certification and accreditation methods. C&A is unique in that it is a lifecycle assurance process, and the certifier is a consumer of the assurance evidence produced by the techniques described above. It requires claims about the operational environment and remains in effect throughout the lifecycle of the system.

An audience member noted that the success of any of these techniques still relies heavily on the pedigree of the organization. Another individual questioned why the commercial sector would ever undergo security process improvement. The answer, of course, lies in the financial reward for taking this road. Finally, someone asked if the NSA planned to combine the TCMM and the SSE-CMM (both funded by NSA). Adams responded that this issue was still unresolved.

Thursday early afternoon - Track A: Intrusion Detection

Vince Reed of The MITRE Corp. moderated a technical presentation session on recent efforts in intrusion detection technology.

In "Monitoring and Controlling Suspicious Activity in Real-time With IP-Watcher", Michael Neumann of En Garde Systems described a program that allows the security administrator to monitor network logons in real-time, record only the relevant streams of data as evidence, and control the intruder by terminating or assuming control of his connection. Readers can get more information on the product at URL http://nad.infostructure.com/watcher.html.

In "Addressing Threats in World Wide Web Technology", Kraig Meyer of The Aerospace Corp. discussed threats inherent in the use of World Wide Web (WWW) technology, including disclosure, modification, fabrication and repudiation. Security services are available to address these threats at several architectural levels. None of the commercial security solutions currently being developed for Web security (e.g., SSL, SHTTP and DCE Web), however, appear to both completely address the computer security issues and be easily integrated into legacy environments.

Friday early morning - Track A: Assurance

Moderated by Marshall Abrams of The MITRE Corp., this session included presentations on the Trust Technology Assessment Program (TTAP), managing risk in software systems, and new perspectives on combining assurance evidence.

In "A New Perspective on Combining Assurance Evidence", Jay Kahn of The MITRE Corp. described an assurance evidence framework that is designed to reduce the cost of assurance while improving the chances of success for trusted system integration. The framework seeks to produce evidence that is useful, understandable, and that can be generalized and extended to provide insight into the system as a whole. The framework has three properties: it can be applied over the system lifecycle, it provides a structure for contingency planning, and it provides the ability to perform tradeoffs. It treats assurance evidence as products and focuses on their interfaces.

Friday late morning - Track A: Formal Tools

Moderated by Klaus Keus of the German Information Security Agency, this session included presentations on a method for specifying the interfaces for a C2 system, tools for developing trusted applications, and the Verification Support Environment (VSE).

In "A Semi-Formal Method for Specification of Interfaces to a C2 System", Jeremy Epstein of Cordant Inc., described a semi-formal security semantics for describing TCB interfaces. The security semantics was applied to two existing products (Novell's Netware and Cordant's Assure) to build a C2 component with over 500 TCB entry points. Using the technique, Cordant discovered many security flaws and undocumented interfaces in the underlying products; however, they were disappointed with the steep learning curve required, the difficulty of maintaining consistency between authors and with the code, and the semantics' support for testing. In retrospect, Cordant would have put more emphasis on training, and they would have based the semantics on a real programming language rather than pseudo-code. Overall the effort was worthwhile since it forced analysts and developers to understand what was security-relevant in the system, and while the effort might seem great for a C2 system, the authors felt that the semantics helped illuminate a complex discretionary access control (DAC) policy.

ACSAC-11 reported by Ron Ross, Institute for Defense Analyses

Track B: Security Engineering

The session on Security Engineering, chaired by Ron Ross, Institute for Defense Analyses, presented two interesting and complementary papers in the areas of security metrics and trusted software reuse. Leading off the session, Mark Aldrich of GRC Inc. presented his paper "Trusted Software, Repositories and Reuse", in which he made several observations and recommendations regarding the ability of system developers to reuse trusted software that would reside in public repositories. He surveyed current approaches used to identify specific components that are candidates for reuse (described as domain analysis) and described six different categories of assets that could be reused (i.e., program source code, design specifications, plans, documentation, experience and specialized expertise, and meta-information about assets). The speaker then discussed the complications introduced when trusted code is shared among systems with differing security policies. He concluded that reuse of components could be very beneficial if employed correctly. Special consideration must be given to ensuring that the concept of trust is not focused on a single aspect of a component, such as the source code or documentation, but instead reflects a continuing thread of trust extending from the original security policy down through the machine code executing on the target system.

Following the reuse presentation, Deb Bodeau, Mitre Corporation, presented her paper, "INFOSEC Metrics: Issues and Future Directions". She began her presentation with a survey of current INFOSEC metrics, including COMPUSEC metrics, COMSEC metrics, personnel security metrics, and risk metrics. She also discussed the critical area of information valuation, i.e., the determination of asset values as part of the risk management process. She then presented the three forms of security metrics (ranking, qualitative, and dollar equivalent) and the pros and cons of each. Building on her previous definitions and discussion, the speaker next provided some recommendations for security engineers and architects needing to use security metrics. These recommendations focused on selecting the appropriate level of abstraction for the metric, understanding the target audience (i.e., the person receiving the resulting information from the metric), and providing the correct scope for the metric. She concluded her presentation with several goals for INFOSEC metrics to ensure the utility of current, emerging, and new metrics. These include integrating discipline-specific INFOSEC metrics, providing a common language spanning organizational boundaries, closing the gap between risk management and security engineering, and producing scalable metrics that can be targeted to systems of varying complexity.

Track B: The System Security Engineering Capability Maturity Model: Does It Provide Appropriate System Assurance?

This panel session, chaired by Rick Hefner, TRW Corporation, addressed the continuing controversial topic of using developmental assurance techniques to achieve requisite assurances about products and systems in lieu of the more traditional approaches involving system evaluation and certification. Specifically, the panel, consisting of Aaron Cohen, CSE, Milan Kuchta, CSE, John Adams, NSA, and Bill Wilson, ARCA Systems, Inc., presented position statements regarding the use of the System Security Engineering Capability Maturity Model (SSECMM) as an assurance alternative or supplement. John Adams discussed the NSA effort to develop an assurance framework that draws on a variety of sources, including the SSECMM. The use of the SSECMM will provide NSA a more cost-effective and timely alternative for achieving product and system assurance and facilitate the widespread use of commercial off-the-shelf products. Aaron Cohen stressed the importance of developmental assurance as an alternative approach and indicated that the new Common Criteria addresses this aspect of assurance. He also stressed the need to understand the relationship among the various types of assurances and highlighted the continuing controversy within the evaluation community regarding the sufficiency of developmental assurance when applied to products. Milan Kuchta focused on understanding the specific capabilities and limitations of the SSECMM to ensure community awareness that developmental assurance will not address all aspects of assurance and trustworthiness. Bill Wilson stressed the increased sophistication and frequency of attacks on systems and argued that products containing only security features, without appropriate assurances, are highly vulnerable to attack and easily defeated. He also talked about the benefits of organizations viewing security from a broad perspective, relying on sound security analysis from the initial requirements phase through the design, development, and implementation phases.

Track B: Access Control

This session provided an interesting contrast between role-based access control issues and a logging and auditing file system. Christopher Wee, University of California, Davis, presented his paper, "LAFS: A Logging and Auditing File System", which describes an extension to the traditional Unix-based file system that provides policy-directed security logging and audit analysis for non-privileged users. LAFS allows users to specify a security policy and assists users in the configuration of file system protection mechanisms. The system also logs file accesses and audits the file access logs against the user-specified security policy. All of these activities are transparent to the user. The principal difference between LAFS auditing and more traditional auditing mechanisms is the granularity of the audit: LAFS limits the audit capability to files only and does not allow any finer-grained monitoring of objects within the system. The user-specified security policy is derived from a policy language that uses predicate logic and regular expressions. The language facilitates expression of Clark-Wilson style integrity policies. The speaker also described prototype implementations of LAFS on a variety of common platforms.
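
Wee's policy language is not reproduced here; the sketch below only illustrates the general idea of checking a file-access log against user-written rules built from regular expressions. The rule syntax and log format are invented for this example.

    import re

    # Invented rule format: (path pattern, users allowed to write matching files).
    POLICY = [
        (re.compile(r"^/project/specs/.*\.tex$"), {"alice", "bob"}),
        (re.compile(r"^/project/payroll/.*"),     {"carol"}),
    ]

    def audit(log_entries):
        """Flag write accesses that violate the user-specified policy."""
        violations = []
        for user, path, operation in log_entries:
            for pattern, allowed in POLICY:
                if operation == "write" and pattern.match(path) and user not in allowed:
                    violations.append((user, path))
        return violations

    print(audit([("mallory", "/project/payroll/q3.dat", "write")]))   # one violation reported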

Dave Ferraiolo, National Institute of Standards and Technology, presented his paper, "Role-Based Access Control (RBAC): Features and Motivations", which began with a detailed description of the role-based access control paradigm. He discussed the central aspects of RBAC and the relationships between users, subjects, roles, objects and operations. Next, he described the notions of role hierarchies, role authorization, role activation, and separation of duty, giving formal rules to specify the interactions among these entities. By administratively associating access permissions with roles and making users members of those roles, the management of authorizations within the system is simplified. This provides greater flexibility in the specification and enforcement of organizational security policies and greatly reduces the cost of security management. After a detailed presentation of the formal aspects of RBAC, the speaker stated that while certain commercial vendors have implemented RBAC features, there is still confusion within the community about precise definitions of RBAC functionality. He mentioned several research efforts that have been initiated to better define RBAC features and the capabilities and limitations of the approach.
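
The core RBAC idea is compact enough to sketch: permissions attach to roles, users acquire permissions only through role membership, and a role hierarchy lets senior roles inherit junior roles' permissions. The role and permission names below are invented for illustration; this is not Ferraiolo's formal model.

    # Minimal RBAC sketch; role and permission names are illustrative only.
    ROLE_PERMS = {
        "clerk":      {("account", "read")},
        "supervisor": {("account", "approve")},
    }
    ROLE_INHERITS = {"supervisor": {"clerk"}}      # role hierarchy: supervisor inherits clerk
    USER_ROLES    = {"alice": {"supervisor"}, "bob": {"clerk"}}

    def effective_roles(user):
        """Expand direct role memberships through the role hierarchy."""
        roles, frontier = set(), set(USER_ROLES.get(user, ()))
        while frontier:
            role = frontier.pop()
            if role not in roles:
                roles.add(role)
                frontier |= ROLE_INHERITS.get(role, set())
        return roles

    def check_access(user, obj, operation):
        """Access is granted through roles, never directly to users."""
        return any((obj, operation) in ROLE_PERMS.get(r, set())
                   for r in effective_roles(user))

    assert check_access("alice", "account", "read")        # inherited via the clerk role
    assert not check_access("bob", "account", "approve")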

Track A: Assurance

Julie Connolly, Mitre Corporation, presented a paper, "The Trust Technology Assessment Program and Benefits to U.S. Evaluation", in which she described the changing paradigm of trusted product evaluations. Currently, all trusted product evaluations in the U.S. are performed by the Government at the National Computer Security Center (NCSC). Under the emerging Trust Technology Assessment Program (TTAP), selected evaluations would be conducted by commercially-licensed evaluation facilities. The goals of the new program include an increase in the number of evaluated products available to the community, shorter and more timely evaluations, and the framework for mutual recognition and international reciprocity of evaluations. The speaker also described the TTAP implementation plan which initially focuses on using Trusted Computer System Evaluation Criteria (TCSEC) and an evaluation methodology based largely on the current Trusted Product Evaluation Program (TPEP). Stating the eventual goal of moving from TCSEC and TPEP to the Common Criteria and associated methodology, the speaker emphasized the need to increase the availability of commercial products and keep the focus of the commercial evaluations on lower levels of assurance. The speaker also provided a brief status report on current TTAP activities including the prototype experimental evaluation and then outlined, in detail, the expected benefits TTAP will bring to the U.S. evaluation community. These benefits included defined and consistent evaluation procedures, shorter evaluation schedules, more timely evaluations, more evaluated products, increased access to evaluations, increased vendor flexibility, and smoother transition to the Common Criteria.

Sharon Fletcher presented an interesting and informative paper entitled "Understanding and Managing Risk in Software Systems", in which she described a risk assessment methodology and toolset developed for software systems involved in safety-critical, security-critical, or mission-critical activities. The methodology introduced a framework for defining perceived risk and desired risk reduction within the context of a risk identification matrix. A system risk model captures the interactions and effectiveness of risk mitigators (or barriers) within a risk mitigators matrix. The purpose is to define an overall process by which an analyst develops a system risk model, using the risk identification and mitigators matrices to guide the effort and provide essential information. Analysts perform a barrier analysis and threat analysis for each identified threat. An analysis engine then determines the remaining risk within the system given the stated conditions. The model also has the ability to use cost information as part of the overall analysis effort.
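
As a rough illustration of the kind of arithmetic such a model performs, the sketch below discounts each threat's expected loss by the effectiveness of the barriers placed against it. The numbers and the independence assumption are invented for the example; Fletcher's matrices and analysis engine are considerably richer.

    # Invented residual-risk arithmetic; not Fletcher's actual model.
    THREATS = {
        # threat: (likelihood 0-1, consequence in arbitrary cost units)
        "insider_modification": (0.30, 100.0),
        "external_intrusion":   (0.10, 250.0),
    }
    BARRIERS = {
        # threat: effectiveness of each mitigator (fraction of risk removed)
        "insider_modification": [0.5, 0.4],
        "external_intrusion":   [0.8],
    }

    def residual_risk(threat):
        likelihood, consequence = THREATS[threat]
        remaining = 1.0
        for effectiveness in BARRIERS.get(threat, []):
            remaining *= (1.0 - effectiveness)     # barriers assumed independent
        return likelihood * consequence * remaining

    for t in THREATS:
        print(t, round(residual_risk(t), 2))       # expected loss after mitigation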