Featured Commentary - CFTC Technology Advisory Committee - Zachary Ziliak - November 2012

From MarketsReformWiki


Zachary Ziliak
Occupation: Managing Member
Employer: Ziliak Law, LLC
Location: Chicago, IL
Web site: www.ziliak.com/

Zachary Ziliak runs Ziliak Law, LLC, a law firm that focuses on the needs of trading companies and the vendors that serve them. Before law school, Ziliak worked in the financial sector as a trader and quantitative analyst. He then spent six years in Mayer Brown's Litigation & Dispute Resolution practice, concentrating on complex financial, mathematical, and computer-related matters, before breaking off in May 2013 to found his own firm.[1] This background makes him particularly well suited to assessing the legal ramifications of algorithmic and high-frequency trading regulation. What follows is his summary of and commentary on the CFTC Technology Advisory Committee meeting of October 30, 2012.

TAC Recap: CFTC's Technology Advisory Committee, October 30, 2012

The Commodity Futures Trading Commission’s Technology Advisory Committee (TAC) met in Chicago on October 30, 2012 to hear updates from its Subcommittee on Automated and High Frequency Trading. Commissioner Scott O’Malia, chairman of the TAC, heard industry professionals recommend turning regulatory focus away from high-frequency trading (HFT) as such and toward the particular activities that cause concern over HFT. Market participants cautioned against controls on HFT that could negatively impact market quality. Instead, panelists called for an industry-led initiative to implement quality management practices and appropriate trading controls.

HFT Definition

The TAC received reports from four working groups. Greg Wood, Director at Deutsche Bank Securities, explained the reasoning behind the first working group’s proposed definition for HFT. The key elements are

  1. algorithmic trading;
  2. high speed; and
  3. some objective measure of high message rates.

The team defined algorithmic trading broadly, so as to capture strategies that rely on an algorithm at any stage. For instance, trading systems in which a trader decides what to buy or sell but then launches code that routes or executes trades without further human intervention would qualify. The working group suggested that a regulator update the test for high message rates periodically, to keep it in line with current speeds and areas of concern.
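To make the three-part test concrete, consider a minimal sketch of how such a screen might be coded. The latency and message-rate cutoffs below are invented for illustration only; the working group proposed merely that a regulator set, and periodically revise, an objective message-rate threshold.

  from dataclasses import dataclass

  # Hypothetical thresholds for illustration; the working group suggested a
  # regulator recalibrate the message-rate test periodically.
  MAX_ROUND_TRIP_LATENCY_MS = 5.0    # "high speed": assumed cutoff
  MIN_MESSAGES_PER_SECOND = 100.0    # "high message rate": assumed cutoff

  @dataclass
  class FirmActivity:
      uses_algorithm: bool           # any stage automated, e.g. routing or execution
      round_trip_latency_ms: float   # typical order round-trip time
      peak_messages_per_second: float

  def qualifies_as_hft(firm: FirmActivity) -> bool:
      """Apply the three-part test: algorithmic trading, high speed,
      and an objective measure of high message rates."""
      return (
          firm.uses_algorithm
          and firm.round_trip_latency_ms <= MAX_ROUND_TRIP_LATENCY_MS
          and firm.peak_messages_per_second >= MIN_MESSAGES_PER_SECOND
      )

  # Example: a trader picks the instrument, but code routes and executes the order.
  print(qualifies_as_hft(FirmActivity(True, 2.0, 450.0)))  # True under these assumed cutoffs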

This proposed definition strips away much of the public perception of HFT (e.g., fully automated trade logic and short holding periods) to focus only on measures correlated with the specific activities that worry observers, such as “quote stuffing,” an attempt to slow a market connection through issuance of high numbers of spurious, evanescent orders. Jim Northey, Partner at the LaSalle Technology Group, described an approach that takes this idea one step further.

Mr. Northey recounted how Jorge Herrada, Associate Director at the CFTC, suggested that regulations not be triggered by an entity’s status as a high-frequency trader, but rather by its involvement in deleterious conduct of the sort sometimes ascribed to HFT companies. In this “inverted” approach, a trading firm would qualify for additional restrictions if it sent out and canceled numerous orders, or pursued other strategies that regulations were designed to discourage, regardless of the frequency of the firm’s trades. Conversely, a firm that executed frequent trades but avoided the potentially harmful market activities that concern the CFTC would not face such restrictions. This approach accords with the observation from Edward Dasso III, Vice President of Market Regulation at the National Futures Association, that non-automated trading companies can harm the market as well.
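A rough way to picture this "inverted," conduct-based approach is a screen on order and cancellation counts rather than on a firm's status as a high-frequency trader. The cancellation-ratio threshold and minimum sample size in the sketch below are assumptions, not figures proposed at the meeting.

  def flags_for_review(orders_sent: int, orders_canceled: int,
                       cancel_ratio_threshold: float = 0.95,
                       min_orders: int = 10_000) -> bool:
      """Illustrative conduct-based screen: flag a firm whose session-wide
      cancellation ratio is extreme, regardless of how frequently it trades.
      The threshold and minimum sample size are assumptions, not CFTC rules."""
      if orders_sent < min_orders:
          return False  # too little activity to judge
      return (orders_canceled / orders_sent) >= cancel_ratio_threshold

  # A firm that sends and cancels vast numbers of orders is flagged ...
  print(flags_for_review(orders_sent=500_000, orders_canceled=495_000))  # True
  # ... while a frequent trader whose orders mostly rest or fill is not.
  print(flags_for_review(orders_sent=500_000, orders_canceled=200_000))  # False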

Marketplace Quality

To decide what market activities are potentially harmful and thus worthy of restrictions, the second working group looked to effects on overall “marketplace quality.” Specifically, Working Group 2 proposed four factors for measuring marketplace quality:

  1. high liquidity, to enable investors to trade into and out of positions;
  2. facilitation of price discovery, an essential function of the markets;
  3. low volatility, suggesting that spurious sources of volatility such as abusive market practices have been removed, leaving only the natural level of volatility associated with price discovery; and
  4. low trading costs, including tight bid-ask spreads, which enable more investors to participate in the market.
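As a purely illustrative aid, the four factors could be combined into a single toy score, as in the sketch below. Working Group 2 proposed the factors, not a formula; the equal weights, the [0, 1] normalization of the inputs, and the 10-basis-point spread cap are assumptions.

  def marketplace_quality_score(liquidity_index: float,
                                price_discovery_index: float,
                                volatility_index: float,
                                avg_spread_bps: float) -> float:
      """Toy composite of the four factors (equal weights are an assumption):
      higher liquidity and better price discovery raise the score; excess
      volatility and wide bid-ask spreads (a trading cost) lower it. The first
      three inputs are assumed normalized to [0, 1]; the spread is in basis points."""
      trading_cost_index = min(avg_spread_bps / 10.0, 1.0)  # assumed 10 bps cap
      return 0.25 * (liquidity_index
                     + price_discovery_index
                     + (1.0 - volatility_index)
                     + (1.0 - trading_cost_index))

  print(round(marketplace_quality_score(0.8, 0.7, 0.2, 2.0), 3))  # 0.775 on this toy scale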

Panelists sought to ensure that regulations in this area would tend to improve, not harm, marketplace quality. Along these lines, many speakers referenced a recent report by Foresight in the UK, which gathered evidence from multiple studies and advised against implementing various proposed HFT controls because of potential harm to marketplace quality. Dean Payton, Managing Director at CME Group, urged the CFTC not to let rules impair the benefits resulting from HFT, specifically citing new execution opportunities that HFT had created for buy-side traders.

Working Group 2 also introduced the related concept of “market participant quality,” which represents an individual company’s marginal impact on marketplace quality. The team proposed factors for estimating market participant quality, such as a market-making order’s size, duration, and proximity to the top of the book, which presumably correlate with the order’s likely effect on liquidity. But market participant quality proved much more contentious than marketplace quality. Richard Haynes, a CFTC economist, opposed the implicit labeling of particular trading strategies as inherently “good” or “bad,” preferring to focus on strategies that were truly abusive.
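For illustration only, the factors mentioned for a single resting market-making order might be combined as in the sketch below. The weights and scaling are invented, and the working group did not endorse any particular scoring formula, a point underscored by Dr. Haynes' objection above.

  def order_quality_contribution(size: int, resting_time_s: float,
                                 ticks_from_top: int) -> float:
      """Toy score for one resting market-making order, using the factors
      Working Group 2 listed: size, duration, and proximity to the top of
      the book. Weights and caps are assumptions made purely for illustration."""
      size_component = min(size / 100.0, 1.0)               # assumed 100-lot cap
      duration_component = min(resting_time_s / 10.0, 1.0)  # assumed 10-second cap
      proximity_component = 1.0 / (1 + ticks_from_top)      # nearer the top counts more
      return (size_component + duration_component + proximity_component) / 3.0

  # A sizable order resting near the top of the book for several seconds scores highly.
  print(round(order_quality_contribution(size=50, resting_time_s=8.0, ticks_from_top=0), 3))  # 0.767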

As a counterpart to marketplace quality, Working Group 2 highlighted quality practices that are applied within an individual market participant to ensure that its actions match its intentions (“market system quality”). Mr. Northey explained the CFTC’s interest in such internal processes, arguing that while market participants should be free to lose all of their money on foolhardy investment strategies, they should not be allowed to take the market down with them. A market disruption resulting from one firm’s error affects investor confidence in the market as a whole, Mr. Northey reasoned. Thus, market participants are all in it together in this area.

Quality Management Systems

This notion of shared exposure to market perceptions led to proposals for market participants both individually and as a group. Looking first at individual market participants, Mr. Northey echoed statements made to the SEC four weeks earlier, saying that one can never engineer a defect-free system. Rather, he argued, the focus must be on minimizing the frequency of errors and mitigating their impact. Quality management systems, such as ISO 9000 or Six Sigma, have just such an aim and have worked in numerous other industries.

Specifically, Working Group 2 cited the AT 9000 project, an attempt by academics and practitioners to formulate an ISO 9000 analogue geared toward automated trading. Mr. Northey presented a timeline for AT 9000 that envisioned adoption by the American National Standards Institute in the first half of 2013. Market participants could then voluntarily seek AT 9000 certification, a process that ideally would both decrease a company’s operational risk and improve its ability to communicate its quality practices externally.

Information Sharing

In relation to market participants as a group, Working Group 4 recommended that market participants share information regarding their errors and “near misses” to spread understanding of valuable control processes. Dr. Haynes supported this idea, stating that a consensus had emerged that trading errors harm everyone, so that companies should not compete with one another by keeping secret their methods for preventing market disruption.

While many at the meeting agreed with the working group’s general proposal and Dr. Haynes’ reasoning, the precise contours of such a program of information sharing remain unclear. Speaking for Working Group 4, Jitesh Thakkar, Founder of Edge Financial Technologies, described one idea for an investigatory body like the National Transportation Safety Board that could similarly monitor the safety of markets, but he noted that the working group was divided over the proposal. Cliff Lewis, Executive Vice President at State Street Global Markets, opposed the NTSB proposal and expressed doubt over firms’ willingness to share the “crown jewels” by telling competitors about their strategies. Mr. Thakkar noted the risk companies would run by exposing their errors if the CFTC used such admissions to support fines. Whether some limited data sharing program—which does not expose trading strategies or details sufficient to support enforcement actions—might work remains to be seen.

Risk Controls

While AT 9000 focuses on operations within one market participant, Working Group 3 emphasized the way risk spreads through the market, necessitating controls that operate at multiple levels and bridge the gap between market participants. An exchange can manage its own processes flawlessly and yet still suffer from poor marketplace quality if the trading firms using that exchange do not follow similar quality principles. Speaking for the group, Mr. Payton of the CME argued that in addition to monitoring their own processes, trading venues should expand testing of automated trading systems that connect to them and could adversely affect their marketplace quality. Equally, where a trading venue cannot independently verify the quality attributes of a trading company’s systems, Working Group 3 would have the venue demand certification from the connecting entity that it has implemented the appropriate controls.

The third working group proposed a number of controls to be implemented by trading firms, clearing firms, and market venues. Working Group 4, meanwhile, emphasized that such controls are not without costs. Mr. Thakkar of Edge Financial noted that while pre-trade risk controls would likely decrease the risk of market disruption, they would also slow down processing at the companies that used them. Trading strategies that depend on low latency could thus be rendered unprofitable, generating significant incentive for trading firms not to apply such tests. One way around this, Mr. Thakkar suggested, was to ensure that pre-trade risk controls would be applied equally to all trading systems, generating a “level playing field.” No structure has yet emerged for such a regulation, but Working Group 4 highlighted the similar SEC Rule 15c3-5, which could potentially serve as a model.
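As a hypothetical illustration of the kind of pre-trade risk control under discussion, the sketch below rejects an order that breaches simple per-firm limits before it reaches the matching engine. The specific limits and checks are assumptions; SEC Rule 15c3-5 prescribes categories of controls, not any particular implementation.

  from dataclasses import dataclass

  @dataclass
  class RiskLimits:
      # Illustrative per-firm limits; in practice these would be set by the firm,
      # its clearing firm, or the trading venue.
      max_order_qty: int = 1_000
      max_notional_usd: float = 5_000_000.0
      max_open_orders: int = 500

  def pre_trade_check(qty: int, price: float, open_orders: int,
                      limits: RiskLimits) -> bool:
      """Reject an order before it is sent to the matching engine if it breaches
      any limit. Each added comparison costs latency, which is the burden the
      working group highlighted for low-latency strategies."""
      if qty > limits.max_order_qty:
          return False
      if qty * price > limits.max_notional_usd:
          return False
      if open_orders >= limits.max_open_orders:
          return False
      return True

  print(pre_trade_check(qty=200, price=50.0, open_orders=10, limits=RiskLimits()))    # True
  print(pre_trade_check(qty=2_000, price=50.0, open_orders=10, limits=RiskLimits()))  # False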

Summary and Conclusion

The four working groups thus presented a picture of growing consensus regarding the kinds of activities that can harm the market, practices that market participants can implement to reduce their operational risk, trading controls that can reduce the risk of market disruption, and the desire for some level of information sharing, if such a program should prove workable. At the same time, many details and a few broader policy decisions remain unclear.

The working groups apparently have a few months yet in which to work through their differences: Commissioner O’Malia indicated that he hoped the TAC could meet again “in the first quarter of 2013 with recommendations to the Commission on the HFT Subcommittee’s efforts as well as a gap analysis of existing market controls and regulations.”

References

  1. Zachary Ziliak. Ziliak Law, LLC. Retrieved on June 3, 2013.
