Q&A with Ulf Lindqvist
December 14, 2015
Dr. Ulf Lindqvist is the vice chair (2014-2015) and chair-elect (2016-2017) of the IEEE Computer Society’s Technical Committee on Security and Privacy. He is a program director at SRI International, where he manages R&D projects with an emphasis on protecting critical infrastructure. In August 2015, Lindqvist was named vice chair of the IEEE Cybersecurity Initiative. This is his first interview in that capacity.
Question: Congratulations on your appointment as vice chair of the IEEE Cybersecurity Initiative. Would you describe your background and your work at SRI International?
Lindqvist: At SRI International, an independent, nonprofit research center, I manage research and development in the area of security for critical infrastructure systems. I also manage the support that we provide to the Cyber Security Division of the U.S. Department of Homeland Security’s Science and Technology Directorate.
Back in 1997, as a Ph.D. student at Chalmers University of Technology in Sweden, I got a paper accepted at the annual IEEE Symposium on Security and Privacy. That turned out to be fundamental to advancing my career. My work in that paper was based on some very early work done at SRI in the 1970s by Peter G. Neumann and Donn Parker. After I presented the paper at the Symposium, Peter Neumann invited me to spend the summer of 1998 at SRI, an invitation I happily accepted. After my summer visit, I returned to Sweden, finished my thesis work, and then moved to California to start at SRI in 1999. Participation in the Symposium really launched my career path, and I’ve attended every Symposium since 1997.
The connection with the Symposium and its community has been very personal to me. Over time I became more involved. I served as vice chair for organizing the Symposium in 2009, as general chair in 2010, and in the following years I continued to handle media relations. Then I got involved in the Technical Committee, for which I’ve been the vice chair for two years, and I’m taking the position of chair on 1 January 2016 for two years.
Question: Would you compare and contrast the work of the IEEE Computer Society’s Technical Committee on Security and Privacy with that of the IEEE Cybersecurity Initiative?
Lindqvist: Rob and I are working to ensure that the Symposium, the Initiative, and IEEE Security & Privacy magazine are more connected than they have been in the past. The Symposium is the premier scientific research conference for security and privacy. One of the Technical Committee’s mandates is to serve as the steering committee for the Symposium and other events. The Initiative focuses more on the engineering and implementation side of things. Naturally, there’s overlap along the continuum from R&D to advanced development to product development. Coordinating these respective activities creates valuable benefits: research is informed by the problems that come up in engineering, and research results are transitioned to practice through close connections with engineering and development.
My work with Homeland Security has involved baking technology transfer into cybersecurity R&D programs to ensure useful, evaluated, deployable results at the end. We developed several principles to foster useful outcomes for research projects.
Question: That sounds like a very useful insight to share with readers. What principles get “baked in” to make sure that research has applicable results?
Lindqvist: We identified four principles. The first is that technology transfer be given pervasive emphasis throughout a project, not just at the end. Before you even start the program, you need to think about the end user or customer and how you will transition your findings. Then there’s “early involvement,” meaning that eventual end users are engaged from the start. “Active engagement” means that the researchers must continue to engage their customers throughout the project, because that’s a well-established path to creating something useful. Fourth is something we call “tangible support,” meaning that funding in the research program must be set aside specifically for transition activities such as events, showcases, and matchmaking between the resulting technologies and potential users or partners. A lot of research is done at universities and small companies, and it’s often the bigger companies that have the resources and infrastructure, such as testing facilities, to take the next step with customers.
Question: One often hears that developing cybersecurity technology is one challenge, but that human behavior and the adoption of best practices present challenges of their own. What are your thoughts on that dichotomy?
Lindqvist: Some research is addressing “usable security,” but I think that’s an area that needs more emphasis. Too often we create a solution that requires the end user to be an expert to use it correctly. Even now, with commercially available tools, when I want to send an encrypted e-mail it’s easy to make a mistake and have it go out unencrypted, for example. What we need are tools that make it easy to do the right thing, to do it securely. We’re certainly not there yet.
There’s a flip side to this challenge. If you make a mistake somewhere in the chain of actions needed to secure something, the result is frustration or something that simply doesn’t work, and that often pushes people to bypass security measures. If you use the wrong encryption key, your recipient can’t read the message on the other end. Some people will then disable security features just to make sure everything works, and they forget to turn the features back on. So, lots of challenges there.
In terms of encouraging the adoption of best practices, some companies have very successful internal security programs for getting people engaged and aware of effective security practices. These can take the form of internal awareness campaigns. I’d like to see those best practices more widely shared so that other organizations can learn what works and what doesn’t, without having to reinvent the wheel.
Question: What “keeps you up at night,” in so many words, and lends urgency to your work?
Lindqvist: As we increasingly rely on computer and communication systems, there’s no longer a manual backup for so many things. In the early days of computerization, computers were expected to be unreliable and you could always go back to some sort of manual system. Our global society is way beyond that today. Nearly all our value is digitized in one way or another and that creates enormous dependence on computers and networks. That state of affairs, in turn, requires that we address the cybersecurity issues because we know that there are people out there with various motivations who want to mess with our systems. Today, intrusions and hacks are way too easy and the consequences of successful attacks are way too severe to be acceptable.
In terms of severity, attacks on many of today’s networks and data can have a direct impact on human health and safety. From critical infrastructures such as electric power and energy, oil and gas, water, and transportation to wearable and implantable medical devices, an attack on a computer system could have real, tangible consequences: not just your e-mail disappearing, but people getting hurt or dying. As a result, I think these are incredibly important fields to work on.