TABB Group has released a study predicting that new data-recording requirements promulgated under Dodd-Frank could send derivatives data levels soaring by as much as 400%. Electronic trading, clearing, and reporting will generate an immense amount of information, to be stored in Swap Data Repositories.
Despite the projected surge in data levels, derivatives market participants will still record and store much less information than their exchange-traded counterparts. Yet Kevin McPartland of TABB Group’s fixed income unit says that the record-keeping requirements will make the day of an OTC derivatives trader considerably more complicated: “The data we expect to see is not overwhelming, but compared to what the OTC derivatives areas are used to, it’s a huge change. Things that were done in a batch overnight now need to be done intraday. So not only are they doing it faster, but they need to consume considerably more data to make those same decisions and calculations.”
Firms will have to rely on sound technology and sensible organization to cope with the increased workload. The report estimates that OTC derivatives market participants will spend $3.4 billion to comply with the new rules. TABB Group also predicts that the move to electronic platforms will bring speed to the relatively sluggish derivatives market. Though derivatives trading will remain a world away from the breakneck pace of equities, latency will decrease sizably. “Latency is all relative. So we’re not going to be talking about microseconds in swaps trading,” explains McPartland. “However, you’re going from overnight batch to intraday, or an hour to a minute. That’s a huge drop in latency.”
Surprisingly, given the data and speed increases, networks are expected to accommodate the changes within current bandwidth levels. The harder problem, McPartland suggests, is data management: “Bringing the data mart concept to the front office is essential. The challenge will be in creating a best-of-breed solution based on a combination of in-house and third-party technology to solve the vast array of data challenges.”