Washington - An aeronautics professor from the Massachusetts Institute of Technology told a roundtable panel of senior market executives that there were three vital practices that the best systems managers recognised and employed.
Dr. Nancy Leveson, professor of aeronautics and astronautics and engineering systems at MIT, was invited to share her views at the discussion, which was convened in the wake of the Knight Capital trading disaster in August.
Leveson said there was no such thing as software that didn't have errors. Eventually, all software had problems that would emerge in one situation or another.
"While I'm not suggesting that anyone shouldn't use the highest standards, it's not going to be enough," she told the group. "I wish it were. It's not."
But Leveson said there were three practices used by the most successful entities to limit risk.
The first was oversight. Leveson said most industries that depend on high-reliability software have government agencies overseeing what is being done "with an iron hand". Citing the airline industry, which knows full well that if planes fell from the sky people would lose confidence, she said the industry does a great deal of its own self-policing and that the Federal Aviation Administration works in partnership with it.
A second critical tactic that the best risk-conscious groups employ: they are extremely conservative in their use of technology.
"It's not that they don't use the latest technology. They do. They use the latest and greatest technology. But they limit software functionality complexity," Leveson said. Software in these industries contains for the most part only the minimum function required to achieve the goals of the system.
Finally, they apply systems engineering beyond the technology.
In these industries, firms realise that software errors are ultimately going to happen no matter what they do, so they build a larger system designed to keep those errors from causing mayhem.
"And they do this through providing a control structure that limits and controls risk by enforcing constraints on the non-technology-related system behaviour, precluding, or at least greatly reducing, serious losses. They understand that they need to fix the system, not just fix the technology," Leveson said.
The bottom line, she said, was that there was 100 percent certainty that there would be more problems caused by software in the financial system. "And probably in not too long a time."
Such problems will occur more frequently, she added, because people keep adding functionality, and with it risk, into the system.
"And there is no technical fix. Doesn't mean that we shouldn't test," she said.
"The industries that have learned this lesson the hard way limit their risk with discipline and by establishing controls. The biggest mistake the Titanic designers made was believing that they could build an unsinkable ship and therefore didn't have to prepare contingencies for calamities."