Summary of ACSAC'99


Mahesh V. Tripunitara

CERIAS, Purdue University


The fifteenth Annual Computer Security Applications Conference (ACSAC'99) took place December 6-10 in Scottsdale, Arizona, USA. It was sponsored by Applied Computer Security Associates, in cooperation with ACM SIGSAC. The proceedings are published by the IEEE Computer Society.


The location of the conference was well chosen, and the organizers did an excellent job of arranging facilities for the talks, extra-curricular activities after hours, and "mingle time" for the participants. As suggested by the name of the conference, and as pointed out by Dee Akers, the conference chair, in her introductory talk, the conference focuses on applied information security, yet looks for far-reaching and long-lasting solutions.


The conference had three parallel tracks for the first two days: one for product and service vendors and two for research work. The last day had two research tracks only. There were about 200 attendees, of whom about 20 were students. The organizers funded three students (one of whom was this author) to attend the conference; the students were chosen based on their answers to questions about their interests in information security and on recommendation letters from faculty.


In addition to a "best paper" award, the conference presented a "best student paper" award. There was also an audience poll for the best presentation in the product/service vendor track; some vendors lobbied intensely for that award during their presentations, and the competition turned out to be quite fierce. Attendees from fifteen countries other than the USA made up 25% of the total.


The remainder of this summary gives a sampling of the talks. Not all talks from the conference are summarized, partly to keep this summary short and partly because two or three tracks ran in parallel.


Prof. Ross Anderson of the Computer Laboratory at the University of Cambridge was the keynote speaker. His talk, "How to Cheat at the Lottery," summarized three pieces of his recent research. The one this author found most interesting was the "resurrecting duckling" paradigm for instantiating keying material in devices that work together, such as a TV and its remote control. Prof. Anderson spoke about the need to tie such devices together "at birth," and, in case of a compromise, to "resurrect" them (with respect to their security context) rather than simply trying to roll back changes or apply other heuristics. He received rousing applause for his talk.
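
As a rough illustration of the idea (the class and method names below are this summary's invention, not Prof. Anderson's design), a device accepts keying material only "at birth" and must be explicitly resurrected, i.e. wiped, before it can be bound to a new controller:

import os

class Duckling:
    """Toy device following the resurrecting-duckling policy: it is bound
    ("imprinted") to the first controller that loads keying material into it."""

    def __init__(self):
        self.imprinted_key = None            # no "mother duck" yet

    def imprint(self, key: bytes) -> bool:
        if self.imprinted_key is not None:   # imprinting only works "at birth"
            return False
        self.imprinted_key = key
        return True

    def accepts(self, key: bytes) -> bool:
        # Commands are honored only when presented with the imprinted key.
        return self.imprinted_key is not None and key == self.imprinted_key

    def resurrect(self):
        # After a compromise or change of ownership, wipe the security context
        # entirely instead of trying to roll back individual changes.
        self.imprinted_key = None

remote_key = os.urandom(16)              # the remote control's keying material
tv = Duckling()
assert tv.imprint(remote_key)            # TV and remote are tied together at birth
assert tv.accepts(remote_key)
assert not tv.imprint(os.urandom(16))    # a second "mother" is refused
tv.resurrect()                           # after a compromise: back to a clean state
assert tv.imprint(os.urandom(16))        # and ready to be re-imprinted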


On the first day, Andre dos Santos from the University of California at Santa Barbara spoke on "Safe Areas of Computing with Insecure Applications." He discussed Safe Areas of Computing (SAC) in the context of smart cards, and showed how smart cards can serve as "safe areas" for computation even though they are used in conjunction with unsafe areas such as the Internet. He spoke about the generic nature of the paradigm, and presented data structures and a client-server configuration consisting of a client SAC and a server SAC. He also described how authentication and access control are achieved in such a setup.
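
As a loose illustration only (the SafeArea class and the challenge-response exchange below are this summary's invention, not the paper's protocol), the essential point is that long-term secrets never leave the safe areas; the insecure application in between merely relays bytes:

import hmac, hashlib, os

class SafeArea:
    """Toy 'safe area of computing': secrets stay inside, and only the results
    of computations (here, MACs) cross into the insecure environment."""

    def __init__(self, secret: bytes):
        self._secret = secret                # never leaves the SAC

    def respond(self, challenge: bytes) -> bytes:
        return hmac.new(self._secret, challenge, hashlib.sha256).digest()

shared_secret = os.urandom(32)
client_sac = SafeArea(shared_secret)         # e.g. the smart card
server_sac = SafeArea(shared_secret)         # e.g. a hardware module at the server

# The insecure application only shuttles challenge/response bytes around.
challenge = os.urandom(16)
response = client_sac.respond(challenge)
assert hmac.compare_digest(response, server_sac.respond(challenge))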

Aaron Temin demonstrated his work on "Automated Intrusion Detection Environment, Advanced Concept Technology." Dr. Temin spoke about the need to collect audit data from a large number of sensors, and the scalability problem of analyzing such data in a single place. He wondered whether the nature of intrusions will allow analysis engines to be pushed closer to the sources of the audit logs and, when questioned on this front, commented that this is an open problem. Nonetheless, he stressed the need for distributed audit logging and agglomeration of such data, since intrusions are increasingly coordinated, target multiple victims, and consist of several steps in a single attack.
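
As a loose illustration of why agglomeration matters (this sketch is generic and is not Dr. Temin's system), a central correlator can flag a source whose individual steps look unremarkable to any single sensor:

from collections import defaultdict

# Sensors forward audit records to a central correlator, which groups them by
# source to spot coordinated, multi-step activity spanning several sensors.
def correlate(records, threshold=3):
    by_source = defaultdict(set)
    for sensor, source, event in records:
        by_source[source].add((sensor, event))
    return {src for src, steps in by_source.items() if len(steps) >= threshold}

records = [
    ("sensor-A", "10.0.0.5", "port-scan"),
    ("sensor-B", "10.0.0.5", "login-failure"),
    ("sensor-C", "10.0.0.5", "privilege-escalation"),
    ("sensor-A", "10.0.0.9", "port-scan"),
]
print(correlate(records))   # {'10.0.0.5'}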


Carsten Benecke of the University of Hamburg presented his work on "A Parallel Packet Screen for High Speed Networks." This excellent presentation showed empirically that a parallel packet-filter implementation scales well with increased network traffic. Mr. Benecke assumed a simple hash-based function for deciding which packet filter is responsible for the pass/drop decision on a given packet. The implementation used Ethernet broadcast, with hubs sending each packet to every packet filter. He showed, both empirically and analytically, that the speedup with 4 processors over a single processor was a factor of about 3. He went on to discuss how the implementation could be improved by using multicast instead of broadcast, or a switch instead of a hub.
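
A minimal sketch of such a dispatch rule follows; the flow key, the hash function and the filter count are this summary's assumptions, not details from the paper. Every filter sees every broadcast frame, but only the filter whose index matches the hash of the flow identifier evaluates the pass/drop rules:

import zlib

NUM_FILTERS = 4

def responsible_filter(src_ip: str, dst_ip: str, src_port: int, dst_port: int) -> int:
    flow = f"{src_ip}:{src_port}->{dst_ip}:{dst_port}".encode()
    return zlib.crc32(flow) % NUM_FILTERS   # the same flow always maps to the same filter

def handle(filter_id: int, packet: dict) -> bool:
    """Return True if this filter instance should evaluate the packet."""
    return filter_id == responsible_filter(packet["src_ip"], packet["dst_ip"],
                                            packet["src_port"], packet["dst_port"])

pkt = {"src_ip": "192.0.2.1", "dst_ip": "198.51.100.7", "src_port": 40000, "dst_port": 80}
print([handle(i, pkt) for i in range(NUM_FILTERS)])   # exactly one True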


The best paper award of the conference went to the work of A. Arona, D. Bruschi and E. Rosti, from the Università degli Studi di Milano, on "Adding Availability to Log Services of Untrusted Machines." Their work addressed the problem of potential corruption of audit logs and the need to distribute the logs in a space- and time-efficient way such that fault tolerance is achieved. Their solution used Rabin's Information Dispersal Algorithm (IDA), which splits the information into n parts, any m (m < n) of which are sufficient to reconstruct the original data. Their implementation shows that the IDA encoding is efficient enough for their purposes. They also presented a "log availability filter" that transparently processes the log files and disperses the information.
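
To make the splitting concrete, here is a minimal sketch of Rabin-style information dispersal over GF(257); it is this summary's own toy construction (the field, the Vandermonde encoding and the function names are illustrative), not the authors' implementation or their log availability filter. Total storage is roughly n/m times the original, and any m of the n shares rebuild the data:

P = 257  # prime larger than 255, so every byte is a field element

def disperse(data: bytes, m: int, n: int):
    padded = data + bytes((-len(data)) % m)          # pad to a multiple of m
    shares = [(x, []) for x in range(1, n + 1)]      # share i is evaluated at x = i
    for k in range(0, len(padded), m):
        block = padded[k:k + m]
        for x, vals in shares:
            vals.append(sum(b * pow(x, j, P) for j, b in enumerate(block)) % P)
    return len(data), shares

def reconstruct(length: int, shares):                # works with any m shares
    m = len(shares)
    xs = [x for x, _ in shares]
    out = bytearray()
    for k in range(len(shares[0][1])):
        ys = [vals[k] for _, vals in shares]
        # Solve the m x m Vandermonde system by Gauss-Jordan elimination mod P.
        A = [[pow(x, j, P) for j in range(m)] + [y] for x, y in zip(xs, ys)]
        for col in range(m):
            piv = next(r for r in range(col, m) if A[r][col])
            A[col], A[piv] = A[piv], A[col]
            inv = pow(A[col][col], P - 2, P)
            A[col] = [a * inv % P for a in A[col]]
            for r in range(m):
                if r != col and A[r][col]:
                    f = A[r][col]
                    A[r] = [(a - f * b) % P for a, b in zip(A[r], A[col])]
        out.extend(A[j][m] for j in range(m))
    return bytes(out[:length])

length, shares = disperse(b"audit log entry", m=3, n=5)
print(reconstruct(length, shares[1:4]))              # any 3 of the 5 shares suffice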


David A. Cooper from NIST spoke on his paper titled "A Model of Certificate Revocation." Mr. Cooper analyzed efficiency issues in issuing Certificate Revocation Lists (CRLs). He showed that the traditional technique of issuing CRLs is highly inefficient because cached CRLs expire at around the same time, leading to a rush to obtain fresh CRLs all at once. He proposed two alternatives: over-issued CRLs and segmented CRLs. Over-issuing means releasing new CRLs before the previously issued ones expire, so that the copies cached by different relying parties do not all time out at the same moment. Segmenting means splitting each CRL into segments and issuing each segment at a different time. He analyzed the efficiency gains of each approach over the traditional method by deriving the request rate in the limit, given the number of CRLs valid at a given time and the probability that a party will validate a certificate in a given interval.
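
The intuition behind over-issuing can be shown with a toy simulation; every parameter below (number of relying parties, validation probability, CRL lifetime) is invented for illustration and does not come from the paper. Each party fetches a fresh CRL only when it validates a certificate and its cached copy has expired; traditional issuance makes all cached copies expire together, producing a burst of requests, whereas over-issuing staggers the expiry times:

import random

random.seed(42)
PARTIES, P_VALIDATE, LIFETIME, HORIZON = 10_000, 0.1, 24, 20 * 24   # hours

def peak_request_rate(issue_interval: int) -> int:
    expiry = [0] * PARTIES                   # expiry time of each party's cached CRL
    per_hour = [0] * HORIZON
    for hour in range(HORIZON):
        for p in range(PARTIES):
            # The party validates a certificate this hour with prob. P_VALIDATE;
            # it fetches a new CRL only if its cached copy has expired.
            if random.random() < P_VALIDATE and hour >= expiry[p]:
                latest_issue = (hour // issue_interval) * issue_interval
                expiry[p] = latest_issue + LIFETIME
                per_hour[hour] += 1
    return max(per_hour[HORIZON // 2:])      # peak rate after a warm-up period

print("traditional issuance (every 24 h): peak", peak_request_rate(24), "requests/hour")
print("over-issuance (every 1 h):         peak", peak_request_rate(1), "requests/hour")

In this toy model the over-issued case shows a markedly lower peak, since the relying parties' cache expiries spread out over the day instead of coinciding.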


Wenliang Du of Purdue University presented his work on "Security Relevancy Analysis on the Registry of Windows NT 4.0." Mr. Du applied dependency analysis, a static analysis technique, to determine whether keys in the Windows NT 4.0 registry are security-relevant. He asserted that the input to programs is crucial in determining security relevancy. He then showed how dependency digraphs are built, with function names as labels for the nodes, starting with an "S node"; "I nodes" are then added to the graph based on their association with the S node or with a previously added I node. Mr. Du presented statistics from his analysis of a portion of the registry and reported that about half of the keys examined were shown to be security-relevant. This result was greeted with amazement by the audience, and following the talk, audience members expressed concern about the state of information security of such a popular operating system.
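
A rough sketch of that graph construction follows; the association relation, the function names and the registry key are placeholders invented for illustration, not Mr. Du's actual analysis. Starting from a seed "S node," nodes associated with anything already in the graph are pulled in transitively, and registry keys read by any function in the resulting graph are flagged as security-relevant:

from collections import deque

def build_dependency_graph(seed: str, associations: dict) -> set:
    """Transitively add nodes associated with the seed or with already-added nodes."""
    graph, frontier = {seed}, deque([seed])
    while frontier:
        node = frontier.popleft()
        for neighbor in associations.get(node, ()):    # nodes associated with `node`
            if neighbor not in graph:
                graph.add(neighbor)
                frontier.append(neighbor)
    return graph

associations = {                                       # hypothetical dependencies
    "S": {"LogonUser", "CheckToken"},
    "LogonUser": {"ReadRegistryKey"},
    "CheckToken": set(),
}
keys_read_by = {"ReadRegistryKey": {r"HKLM\SYSTEM\Lsa\AuditPolicy"}}

nodes = build_dependency_graph("S", associations)
relevant_keys = {k for fn in nodes for k in keys_read_by.get(fn, ())}
print(relevant_keys)                                   # keys reachable from the S node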


ACSAC'99 was well organized and well attended. This author also presented a paper there, enjoyed being at the conference, and plans to attend again next year. The applied flavor of the conference is particularly appealing, and it is one of the few information security conferences in which industry and academia come together.