The consolidated audit trail of orders placed and matched on the nation’s stock markets should cost a ‘fraction’ of what the SEC has estimated, according to a financial data expert and author speaking at the TradeTech 2012 conference in New York.

Given existing computing technology, collecting the data and storing it in real time should cost roughly half of what the federal regulator projected when the system was first proposed nearly two years ago, said David Leinweber, co-founder of the Center for Innovative Financial Technology at the Computational Research Division of the Lawrence Berkeley National Laboratory and author of “Nerds on Wall Street: Math, Machines and Wired Markets.”

In announcing the plan for a system to consolidate all order information from the national exchanges, SEC Chairman Mary L. Schapiro estimated that the real-time trail of market activity would cost $4 billion to build and $2 billion a year to maintain.

She estimated that such a system would need to take in roughly 100 gigabytes of data a day, the equivalent of about 100 billion characters of information.


That, Leinweber said, is not much data in a world flooded with both structured data, such as market quotes and tick data, and unstructured data, such as news, images and social communication.

By comparison, physicists measure the data they collect for analysis in petabytes. A petabyte is one quadrillion characters of information, or one thousand terabytes; a terabyte, in turn, is one thousand gigabytes.
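To make that unit arithmetic concrete, here is a minimal, purely illustrative Python sketch; the 100-gigabyte daily figure is the SEC estimate cited above, and the decimal unit conversions are standard:

    # Back-of-the-envelope arithmetic: the SEC's estimated daily intake
    # (100 GB, as cited above) expressed at annual scale, using decimal
    # units (1 TB = 1,000 GB; 1 PB = 1,000 TB).
    GB_PER_DAY = 100
    GB_PER_TB = 1_000
    TB_PER_PB = 1_000

    annual_gb = GB_PER_DAY * 365            # 36,500 GB per year
    annual_tb = annual_gb / GB_PER_TB       # 36.5 TB per year
    annual_pb = annual_tb / TB_PER_PB       # 0.0365 PB per year

    print(f"Annual volume: {annual_tb:.1f} TB, or {annual_pb:.4f} PB")
    # Output: Annual volume: 36.5 TB, or 0.0365 PB

A full year of the audit trail, in other words, would amount to a few dozen terabytes, a small fraction of a single petabyte.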

Market data barely registers on a scale that compares the information needs of financial analysis with those of physics research, he said.

At the Berkeley Lab, market movements are studied on a Cray XE6 supercomputer known as Hopper. The machine has 153,216 processor cores, 212 terabytes of main memory and 2 petabytes of disk storage.
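As a purely illustrative comparison, not one the article attributes to Leinweber, the following Python sketch uses the figures above to show how long the estimated daily intake would take to fill that one machine’s disk:

    # Hypothetical comparison: days for the SEC's estimated 100 GB/day
    # intake to fill Hopper's reported 2 PB of disk (decimal units).
    GB_PER_DAY = 100
    DISK_GB = 2 * 1_000_000                 # 2 PB expressed in gigabytes

    days_to_fill = DISK_GB / GB_PER_DAY     # 20,000 days
    print(f"{days_to_fill:,.0f} days, roughly {days_to_fill / 365:.0f} years")
    # Output: 20,000 days, roughly 55 years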

The lab has been studying capital markets since the 1970s.

Tom Steinert-Threlkeld writes for Securities Technology Monitor.