The XBRL tagging requirement in SEC filings is part of a broader trend in financial regulation, domestically and globally, toward standardizing the data used by investors, analysts, regulators, and others in the capital markets. Like barcoding in the retail, shipping, and related industries, standardization of financial information involves the use of machine-readable structured data such as XBRL, as well as related technologies. This consistency makes it easier to convey, extract, and consume vast amounts of financial data swiftly and accurately. Data standards benefit not only investors making market decisions but also companies that need to tell their financial stories clearly to investors and market analysts.
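To illustrate what "machine-readable structured data" means in practice, the sketch below parses a single tagged fact from a drastically simplified, hypothetical XBRL instance fragment. The element name, namespace URIs, and values are stand-ins for illustration only; real SEC filings use the full official taxonomies and contain thousands of facts.

```python
import xml.etree.ElementTree as ET

# Hypothetical, minimal XBRL-style instance fragment: one tagged revenue
# fact bound to a reporting context and unit. Illustrative only.
INSTANCE = """\
<xbrl xmlns="http://www.xbrl.org/2003/instance"
      xmlns:us-gaap="http://fasb.org/us-gaap/2023">
  <us-gaap:Revenues contextRef="FY2023" unitRef="usd" decimals="-6">1234000000</us-gaap:Revenues>
</xbrl>
"""

root = ET.fromstring(INSTANCE)
ns = {"us-gaap": "http://fasb.org/us-gaap/2023"}

# Because the fact is tagged with a standard concept name, software can
# extract it directly -- no scraping of formatted text is needed.
fact = root.find("us-gaap:Revenues", ns)
print(fact.tag.split("}")[-1], fact.attrib["contextRef"], int(fact.text))
# prints: Revenues FY2023 1234000000
```

The point of the standard is exactly this: once a figure is tagged with an agreed concept, context, and unit, any consumer's software can locate and compare it across thousands of filings automatically.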
The efficiencies of XBRL-tagged data have driven its increasing adoption. Major data providers are incorporating XBRL-tagged disclosures into their feeds, and product innovators have built XBRL-based platforms that deliver financial data to companies efficiently for strategic research. XBRL is also a crucial tool in the SEC’s own review of company disclosures by its Divisions of Corporation Finance and Enforcement.
Another example of standardization is the Legal Entity Identifier (LEI), discussed in a recent issue of Dimensions. The LEI is a universal 20-character alphanumeric code that identifies legal entities across markets, products, and regions. In addition to facilitating financial transactions, the LEI is a significant factor in creating market transparency.
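Because the LEI format is itself standardized (ISO 17442), validity can be checked mechanically: the code is 20 alphanumeric characters, and its last two characters are check digits computed under the ISO 7064 MOD 97-10 scheme (the same scheme used for IBANs). A minimal validator might look like the following sketch; the sample identifier in the comment is simply a string in valid LEI format, not a reference to any particular entity.

```python
import re

def is_valid_lei(lei: str) -> bool:
    """Check an LEI's length, character set, and ISO 7064 MOD 97-10 check digits."""
    lei = lei.strip().upper()
    if not re.fullmatch(r"[A-Z0-9]{20}", lei):
        return False
    # Expand each character to its base-36 value as decimal digits
    # (0-9 stay as-is, A=10 ... Z=35); the resulting integer must be
    # congruent to 1 modulo 97 for the check digits to be valid.
    expanded = "".join(str(int(ch, 36)) for ch in lei)
    return int(expanded) % 97 == 1

# Example: a syntactically valid 20-character code passes; flipping
# one check digit makes it fail.
print(is_valid_lei("5493001KJTIIGC8Y1R12"))  # prints: True
print(is_valid_lei("5493001KJTIIGC8Y1R13"))  # prints: False
```

This kind of deterministic validation is part of what makes a standardized identifier useful: any counterparty or regulator can verify an LEI's integrity without consulting the issuer.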
Despite the benefits of standardization, researchers find that obstacles remain
Yet despite its universal benefits, data standardization has not progressed quickly, according to researchers Richard Berner (Executive-in-Residence at NYU Stern School of Business) and Kathryn Judge (Professor of Law at Columbia Law School). Their paper, The Data Standardization Challenge, explains the benefits of standardization, the obstacles it faces, why it should be a high priority for regulators, and how to facilitate its implementation. The research is also summarized in a commentary published on Columbia Law School’s CLS Blue Sky Blog (When Good Incentives Are Not Enough: The Quest for Financial Data Standardization).
“Data standardization offers significant benefits for industry and regulators alike, suggesting that it should be easy,” observe the researchers. “In practice, however, the process has been hard and slow moving.” If the advantages of data standardization are universal and obvious, why has it proved so difficult to establish?
The study identifies several major reasons for the slow adoption of standardized data. First, companies bear the costs of implementing a standard up front, while the benefits emerge only over time. As the researchers observe:
Standardizing data is not a free good. The costs of developing standards, testing them, retooling firm and regulatory systems to use them, and working out the kinks in implementation are considerable, and the costs of these investments are incurred early on. The benefits follow with a lag, and the benefits aren’t restricted to those who incur the bulk of the costs.
Compounding these issues are cultural obstacles, since data standardization needs coordinated action from many different groups in the public and private sectors. Moreover, companies may lack “the technology and enterprise-wide data management and governance practices that are needed to allow them to use the data to better identify and manage their risks.” In other words, for many companies, data standardization seems to fail a cost/benefit analysis. The US regulatory structure also makes standardization more complex than it should be. Individual regulatory agencies can specify any data standards or none at all. “Adding to the challenge,” the researchers note, “no higher authority can compel agencies to use a particular standard.”