Featured Commentary - SEC Technology Roundtable - Zachary Ziliak - October 2012

Zachary Ziliak
Occupation: Managing Member
Employer: Ziliak Law, LLC
Location: Chicago, IL
Web site: www.ziliak.com/

Zachary Ziliak runs Ziliak Law, LLC, a law firm that focuses on the needs of trading companies and the vendors that service them. Prior to law school, Ziliak worked in the financial sector as a trader and quantitative analyst. He then spent six years in Mayer Brown's Litigation & Dispute Resolution practice, concentrating on complex financial, mathematical, and computer-related matters, before breaking off in May 2013 to found his own firm.[1] He is therefore particularly well suited to assess the legal ramifications of regulating algorithmic and high-frequency trading. Following is his summary of, and commentary on, the SEC Market Technology Roundtable of October 2, 2012.

Roundtable Recap: SEC's Technology Panels, October 2, 2012

On Tuesday, October 2, 2012, technologists from automated trading companies told the Securities and Exchange Commission that the industry would benefit internally from enhanced quality control and externally from the adoption of safeguards such as "kill switches."

In the wake of Knight Capital’s loss of $440 million on August 1, 2012 and the glitch-ridden IPOs of BATS and Facebook earlier this year, the SEC convened a roundtable discussion of market technology on October 2. In an opening statement, SEC Chairman Mary Schapiro noted that “thanks to technology, our securities markets are more efficient and accessible than ever before.” Nonetheless, she highlighted automated trading’s role in this year’s high-profile market disturbances, ascribing the problems to “basic Technology 101 issues.” The Commissioners sought advice on responses to such issues from two panels—one on “preventing errors” and one on “responding to errors.”

Dr. Nancy Leveson of the Massachusetts Institute of Technology quickly put a damper on that first goal, stating that “all software contains errors.” Leveson rejected as “myths” claims that certain industries had managed to defeat this rule, citing software problems in aircraft and the space shuttle. “There’s 100% certainty that you will have more episodes caused by the financial system’s software,” she said.

Short of eliminating errors entirely, trading companies must work to reduce their frequency and impact. Saro Jahani, Chief Information Officer of Direct Edge, advocated the adoption of more mature software development practices as a step in that direction.

According to Mr. Jahani, “We cannot operate the exchanges and financial institutions—no longer—as a development shop. We have to do it as a production shop.” Dave Lauer, a consultant at Better Markets, Inc. and a former analyst at Allston Trading and Citadel, proposed the Information Quality Management Capability Maturity Model and ISO 9000 as appropriate quality management systems for the trading industry. (As previously reported on MarketsReformWiki, some academics and industry professionals are currently working on adapting ISO 9000 to automated trading firms, in an effort currently called AT 9000.)

Under such quality management standards, organizations manage their activities as documented processes. Several panelists advocated specific processes that could decrease the incidence of errors. Mr. Jahani said that firms should start coding differently, expanding instrumentation of automated systems. Lou Pastina, the Executive Vice President of NYSE Operations, called for all exchanges to provide test symbols in their live trading environments, so that companies could confirm that their systems linked properly with the exchanges without generating actual trades. Chris Isaacson, the Chief Operating Officer of the BATS Exchange, favored increased use of “drop copies,” real-time position statements from exchanges to trading firms.
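
To make the drop-copy idea concrete, the sketch below shows one way a firm might use such reports defensively: rebuilding its position from the exchange's fill reports and comparing the result against its internal book. It is a minimal, hypothetical illustration only; the field names, data shapes, and zero-share tolerance are assumptions, not anything a panelist specified.

    # Hypothetical sketch: reconcile the firm's internal book against
    # positions implied by an exchange drop-copy feed. A break between
    # the two can signal a runaway or desynchronized trading system.
    def reconcile(internal_positions, drop_copy_fills, tolerance=0):
        """Return symbols where the exchange-implied position differs from
        the internal book by more than `tolerance` shares."""
        exchange_positions = {}
        for fill in drop_copy_fills:  # e.g. {"symbol": "XYZ", "side": "BUY", "qty": 100}
            signed_qty = fill["qty"] if fill["side"] == "BUY" else -fill["qty"]
            exchange_positions[fill["symbol"]] = (
                exchange_positions.get(fill["symbol"], 0) + signed_qty
            )
        breaks = {}
        for symbol in set(internal_positions) | set(exchange_positions):
            diff = exchange_positions.get(symbol, 0) - internal_positions.get(symbol, 0)
            if abs(diff) > tolerance:
                breaks[symbol] = diff  # positive: exchange shows more bought than the book records
        return breaks

A break flagged by a check of this kind is the sort of early warning that the error-response measures discussed below are meant to act on.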

In Dr. Leveson’s view, however, while all such best practices are helpful, they are not sufficient. Instead, she called for three overarching approaches, which various panelists seconded.

First, the industry would benefit from additional governmental oversight, just as such oversight has helped encourage airlines to produce high-quality software. Through this roundtable, the SEC has shown that it is prepared to provide that oversight, although it is not yet clear how much will be done through prescriptive regulations.

Second, firms must anticipate errors and design systems to limit their impact. Most panelists focused on kill switches as a useful control of this type. Such “switches” are manual or automatic procedures that separate a trading firm from exchanges, preventing additional trades when an automated system has gone out of control. Interest in such systems climbed after Knight Capital’s $440 million loss, with many suggesting that an appropriate kill switch might have enabled the firm or an outside entity to curtail the losing trades in much less time.
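
At its core, a kill switch of the kind described above reduces to a pre-trade check and a latch: once the latch trips, no further orders leave the firm until a human deliberately re-arms it. The sketch below is a minimal, hypothetical illustration of that logic; the exposure measure, the limit value, and the class and method names are assumptions rather than any panelist's design.

    # Minimal, hypothetical kill-switch sketch: a latched block on order flow.
    class KillSwitch:
        def __init__(self, max_gross_exposure):
            self.max_gross_exposure = max_gross_exposure  # illustrative dollar limit
            self.tripped = False

        def allow_orders(self, current_gross_exposure):
            """Called before each outbound order; returns False once tripped."""
            if current_gross_exposure > self.max_gross_exposure:
                self.tripped = True  # latch: stays tripped until explicitly reset
            return not self.tripped

        def rearm(self):
            """Resuming trading requires a deliberate human decision."""
            self.tripped = False

    switch = KillSwitch(max_gross_exposure=50_000_000)
    if not switch.allow_orders(current_gross_exposure=62_000_000):
        pass  # stop sending orders, cancel open ones, and alert operations staff

The latch is the essential feature: a runaway system should not be able to resume trading simply by momentarily dipping back below the limit.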

While the call for kill switches in general led to what Lou Steinberg, Chief Technology Officer of TD Ameritrade, termed “violent agreement” among the panelists, there was less unanimity as to the optimal contours of such switches. Exchanges worried that if they used a kill switch to block a firm from trading, they could run afoul of the Fair Access rule, 17 CFR 242.301(b). Mr. Steinberg suggested that if kill switches are fully automated, market participants will set trigger levels far out so as to cover all conceivable intended use cases, thereby greatly reducing their utility.

Conversely, Chairman Schapiro expressed concern about whether a system that relied on human intervention could block automated trading systems in time. Leveson stated more generally that “it’s almost impossible for humans to monitor computers.” As one reason, she cited the “incredulity response” that arises when a system suddenly exhibits an error after months of proper function.

Attempting to address these shortcomings, panelists such as Anna Ewing, the Chief Information Officer of NASDAQ OMX, and Chad Cook, the Chief Technology Officer of Lime Brokerage, proposed a layered approach. Exchanges, broker-dealers, and trading firms could all have access to various kill switches. Those systems could have hard limits that would automatically trigger blockage, as well as less extreme soft limits that would prompt human intervention. For instance, if a firm’s exposure reached 70% of its kill-switch limit, an alarm might prompt the exchange to phone the firm for an explanation, with an automatic cutoff kicking in after ten minutes absent an explicit human override.
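
That example is concrete enough to sketch. In the illustration below, a soft limit at 70% of the hard cutoff raises an alarm for human follow-up, and the cutoff triggers automatically if no explicit override arrives within ten minutes. The function name, return values, and the choice of exposure as the monitored quantity are assumptions for illustration, not a design any panelist proposed.

    # Hypothetical layered-limit sketch: a soft limit alerts humans; the hard
    # limit, or an unanswered alarm, cuts the firm off automatically.
    import time

    SOFT_FRACTION = 0.70   # alarm threshold as a fraction of the hard limit
    GRACE_SECONDS = 600    # ten minutes for an explanation or explicit override

    def evaluate(exposure, hard_limit, alarm_started_at=None, human_override=False, now=None):
        """Return (action, alarm_started_at); action is 'allow', 'alarm', or 'cut_off'."""
        now = time.time() if now is None else now
        if exposure >= hard_limit:
            return "cut_off", alarm_started_at      # hard limit: block immediately
        if exposure >= SOFT_FRACTION * hard_limit:
            if alarm_started_at is None:
                return "alarm", now                 # start the clock; phone the firm
            if not human_override and now - alarm_started_at >= GRACE_SECONDS:
                return "cut_off", alarm_started_at  # no override within the grace period
            return "alarm", alarm_started_at
        return "allow", None                        # back below the soft limit; clear the alarm

Splitting the decision this way preserves a role for human judgment while keeping the final backstop automatic, addressing the concern that a purely manual switch might not act in time.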

Such hybrid systems, however, potentially conflict with Dr. Leveson’s third recommendation: the intentional limitation of software functionality and complexity. Dr. M. Lynne Markus of Bentley University similarly called for the reduction, by design, of system complexity and interconnectedness. How to draw the line between the level of complexity necessary to limit the impact of errors and complexity that would unduly increase the frequency of errors remains an open question.

In summary, several panelists called for implementation at trading firms of quality management systems and specific processes aimed at reducing the incidence of errors. At the same time, it was widely acknowledged that such measures would not eliminate all errors, for which reason mitigating solutions such as kill switches would be necessary. Less clear is who would operate those switches, how they would be triggered, or how much control regulators would exercise over the process.

References

  1. Zachary Ziliak. Ziliak Law, LLC. Retrieved on June 3, 2013.
