CERIAS Recap: Featured Commentary and Tech Talk #3

Once again, I’ve attended the CERIAS Security Symposium held on the campus of Purdue University. This is the final post summarizing the talks I attended.

I’m combining the last two talks into a single post. The first was fairly short, and by the time the second one rolled around, my brain was too tired to focus.

Thursday afternoon included a featured commentary from The Honorable Mark Weatherford, Deputy Under Secretary for Cybersecurity at the U.S. Department of Homeland Security. Mr. Weatherford was originally scheduled to speak at the Symposium in person, but restrictions on federal travel budgets forced him to present via pre-recorded video. Mr. Weatherford opened with the observation that “99% secure means 100% vulnerable”: there have been many cases where a single security failure resulted in a complete compromise.

The cyber threat is real. DHS Secretary Napolitano has said that infrastructure is dangerously vulnerable to cyber attack. Banks and other financial institutions have been under sustained DDoS attack, and the attacks have become very predictable. In the future, there will be more attacks, they will be more disruptive, and they will be harder to defend against.

So what does DHS do in this space? DHS provides operational protection for the .gov domain. They work with the .com sector to improve protection, especially protection of critical infrastructure. DHS also responds to national events and works with other agencies to foster international cooperation.

Cybersecurity got two paragraphs in President Obama’s 2013 State of the Union address. Obama’s recent cybersecurity executive order has the goals of establishing an up-to-date cybersecurity framework and enhancing information sharing among key stakeholders. DHS is also involved in the Scholarship for Service program, which works to develop professionals to meet current and future cybersecurity needs.

The final session was a tech talk by Stephen Elliott, Associate Professor of Technology Leadership and Innovation at Purdue University, entitled “What is missing in biometric testing.” Traditional biometric testing is algorithmic, with well-established metrics and methodologies. Operational testing is harder to do because test methodologies are sometimes dependent on the test itself. Many papers have been written about the contribution of individual errors to performance, and some on the contribution of metadata errors. Elliott is focused on training: how users get accustomed to devices, how they remember how to use them, and how training can be delivered with a consistent message.
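For readers unfamiliar with those metrics: the well-established numbers in algorithmic biometric testing are error rates such as the false accept rate (FAR) and false reject rate (FRR), measured at a decision threshold. Here is a minimal sketch of how they are computed; the scores and thresholds are invented for illustration and are not from the talk.

```python
def far_frr(genuine_scores, impostor_scores, threshold):
    """Compute FAR and FRR for a score-based biometric matcher.

    Assumes higher scores mean a better match.
    FAR: fraction of impostor attempts wrongly accepted.
    FRR: fraction of genuine attempts wrongly rejected.
    """
    far = sum(s >= threshold for s in impostor_scores) / len(impostor_scores)
    frr = sum(s < threshold for s in genuine_scores) / len(genuine_scores)
    return far, frr

# Invented example scores; sweeping the threshold trades FAR against FRR.
genuine = [0.91, 0.85, 0.78, 0.95, 0.66, 0.88]
impostor = [0.32, 0.45, 0.51, 0.12, 0.60, 0.28]

for t in (0.4, 0.5, 0.6, 0.7):
    far, frr = far_frr(genuine, impostor, t)
    print(f"threshold={t:.1f}  FAR={far:.2f}  FRR={frr:.2f}")
```

The point of the talk is that these algorithmic numbers, however rigorously measured, capture only part of a deployed system's performance.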

One way to improve biometrics is understanding the stability of the user’s response. If we know how stable a subject is, we can reduce transaction time by requiring fewer measurements, as in the sketch below. Many factors, including the user, the agent, and system usability, affect the performance of biometric systems. Improving performance is not a matter of simply improving the algorithms, but of improving the entire system.
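To be clear, Elliott did not present an algorithm; the following is a hypothetical sketch of the stability idea. The capture_sample callable is an assumed stand-in for a real sensor read, and the stability threshold is invented.

```python
import statistics

def adaptive_capture(capture_sample, stability_threshold=0.01,
                     min_samples=2, max_samples=10):
    """Capture biometric samples until the subject's scores stabilize.

    capture_sample: hypothetical callable returning one match score per read.
    The transaction ends as soon as the sample variance of the scores
    drops below stability_threshold, so a stable subject finishes in
    min_samples reads instead of the full max_samples.
    """
    scores = []
    for _ in range(max_samples):
        scores.append(capture_sample())
        # variance() needs at least two points before we can test stability
        if len(scores) >= min_samples:
            if statistics.variance(scores) < stability_threshold:
                break  # subject is stable; stop early, saving transaction time
    return scores
```

A consistent subject would finish after two reads while an erratic one would use all ten, which is exactly the transaction-time saving the stability argument points to.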

Other posts from this event: