2008-11-25

What security can learn from the recent financial crisis

In the most recent Scientific American Perspectives, the editors lament the state of our economic system and place a great deal of the blame on software models. In their words, risk management models should serve only as aids, not substitutes, for the human factor. While this is certainly not the only example of the perils of algorithms replacing analysts, it is perhaps the most poignant.

In the security industry, software vendors and managers have been pushing hard for years to supplant analysts with software -- the theory being that automated software can do just as good a job, and after all, labor in this day and age is expensive. The danger, of course, is that the security field is far less mature than the study of capitalism. Instead of dangerously repurposed algorithms originally designed for unrelated fields of physics and mathematics, though, our industry employs algorithms that never had a connection to any causal relationship in the first place. Indeed, the end state of "security" has been elusive even in the most anecdotal of terms; we are a long way from quantitative methods to define the risk management that is our job. Yet software vendors are happy to hand-wave their way through a sale in an effort to provide what amounts to a false sense of security, riding on principles often far enough from empirically proven that they are better described as "faith" than "science," even though they are presented as the latter. Management, without the requisite technical skill set or the trust of their subordinates to identify the b.s., is too often eager to buy into the hype.

The information security industry as a whole would be wise to learn from this painful lesson in economics: technologies are tools, to be used by skilled analysts to digest large and complicated data sets and produce actionable intelligence. Analysts should drive the tools; the tools should not drive the analysts. Otherwise, you find yourself dangerously reliant on inflexible tools incapable of identifying the larger systemic problems, and the only means left to identify a problem is the collapse of the entire system - in our case, a catastrophic compromise of security.
