December 11, 2014

Final Data Breakfast of 2014: Federal CFOs Take on the DATA Act Transformation



It was standing room only on Monday for the final installment of this year’s Data Transparency Breakfast series, presented by PwC, exploring the impact of the Digital Accountability and Transparency Act (DATA Act) and similar reforms across government.

For the first time, we were joined by the federal financial officers who will be responsible, under the DATA Act, for applying government-wide data standards to make federal spending information fully searchable, interoperable, and open to all. Our panel—Dorrice Roth, Deputy CFO, Treasury; Sheila Conley, Deputy CFO, HHS; Mark Reger, Deputy Controller, OMB; and Stacy Marcott, Deputy CFO, DHS—explored the challenges and opportunities of this transformation.

The hour-long panel session was packed with insight—perspectives on how the DATA Act will benefit agencies’ internal management, what the current focus of implementation should be, deadline concerns, new business opportunities for the private sector, and more. We’ve outlined our key takeaways below – and you can check Twitter for the play-by-play.

Government-wide data standards for federal spending should allow agencies to improve internal management—without forcing them to redesign or replace their systems.

OMB’s Reger put it this way: "We’ve got to use data as a management tool and we've got to make the data we're collecting valuable to the people who collect the data. The DATA Act is a new orientation for federal government management." Each federal financial manager represented had a unique take on how standardized data would help their agency better manage itself.

To Sheila Conley at HHS, data is critical to the agency’s ability to function. With 300 programs and $1.4 trillion in resources in 2014, HHS would rank among the top ten economies in the world if it were a country. Conley explained that the DATA Act provides an impetus to standardize across all programs, giving the agency the ability to evaluate performance and manage risks.

Conley said her “Aha moment” came during a predictive analytics pilot program at HHS that sought to identify grantees that might be at risk. The agency faced a series of problems while attempting to provide data for analysis: 1) access to the data was difficult, 2) the data it could obtain had quality issues, and 3) the format was seldom machine-readable. Conley explained that federal grant making is what HHS does; it’s how the agency carries out its mission. And “structured data is key to knowing whether or not HHS is making a difference.”

Stacy Marcott of DHS stressed that her agency has six accounting systems, none of which links to the resource management systems that manage grants and contracts: “We’re as stove-piped as it gets.”

“Standards are scary,” admitted Marcott, “[but] I need a way to be able to pull the data from those systems and map it to a standard format.” For example, she explained, securing the U.S. border involves three different DHS entities—each with different programs, systems, and procurement—all aligning to a common goal. If DHS could align the data of all the different programs involved in border security, the agency could manage its border security spending more effectively. Marcott is hopeful that Treasury and OMB's government-wide data standards will offer a smarter way to align individual assets and to evaluate and manage organizations better than the current no-standards approach allows.

Treasury and OMB are working on a way to map federal spending data to a standard without requiring agencies’ systems to be changed. The panel agreed that this, as Marcott put it, would be “way easier.”

The executives in charge of the DATA Act and complying agencies agree on a key implementation priority: government-wide data standards start with common data elements for common concepts.

OMB Controller David Mader
The panel agreed on the importance, and difficulty, of establishing a government-wide data element dictionary for federal spending. This comes on the heels of comments made last Wednesday (Dec. 3) by OMB Controller David Mader at the House Oversight and Government Reform Committee's hearing on DATA Act implementation. Mader indicated that OMB's current implementation focus is on "intellectual work," which he said meant establishing a data dictionary, rather than "moving data around."

Agreeing on definitions for each individual data element is a key aspect of the government-wide data standards. The fact that agencies and the executives in charge of the DATA Act are on the same page and, as the saying goes, "putting first things first" is great news during these early days of implementation.

Agencies seem to be fully aware of, but not yet fully embracing, the aggressive timeline of DATA Act implementation.

The DATA Act puts Treasury and OMB on an aggressive timeline to establish the data standards by May 9, 2015, with agencies starting to use them for financial, payment, budget, grant, and contract reporting by May 9, 2017.

About midway through the breakfast Monday morning, moderator Joe Kull of PwC offered the agencies represented an opportunity to “give advice to OMB.” The agency officers suggested some deadlines might not be met. Dorrice Roth of Treasury stressed that “meeting the intent” of the DATA Act is of paramount importance, not necessarily meeting deadlines. “Let’s get this right,” Roth said. Conley also downplayed the importance of looming deadlines: “We need to take a long view of ‘how do we comply and meet statutory requirements.’ Let’s take this as an opportunity to think about our data longer term.”

OMB on the future of implementation: “There is a ton of work to be done.”

OMB’s Mark Reger compared the DATA Act to the Full Employment Act, noting, "there is a ton of work to be done." Reger said that input from data transparency consultants, contractors, and data specialists is needed to tell implementing federal executives which data is most important and to help with analysis.

The DATA Act could bring unprecedented accountability for U.S. citizens and advances in management for federal agencies—but it all depends on implementation. To that end, our Coalition will keep scrutinizing the federal government’s progress.

Last month we issued a formal comment explaining what federal spending data standards must look like in order to be effective. Next year we’ll be connecting with Treasury, OMB, and implementing agencies through informal Coalition member roundtables and through public events.

December 3, 2014

Here's what DATA standards should look like. (Dump DUNS.)


Last week, just before the Thanksgiving holiday, the U.S. Treasury Department collected formal comments from all interested parties on the crucial first step of implementing the Digital Accountability and Transparency Act (DATA Act). By May 2015, Treasury and the White House Office of Management and Budget (OMB) must establish government-wide data standards for federal spending.

Data standards will transform U.S. federal spending from disconnected documents into open, machine-readable data. So it's crucial for Treasury and OMB to get the data standards right.

The Data Transparency Coalition's comment - full text available here - runs over 20 pages. If you prefer a quick summary to the full text, you're in luck. Here's what we said.

Data standards, done right, will make your laptop look like this!

Sing from the Same Data Hymnbook!

Once government money managers start singing from the same data hymnbook, taxpayers and their representatives in Congress will be able to follow the money. Agencies will be able to use Big Data tools to better manage their own finances. And recipients of federal grants and contracts will be able to save money by reporting automatically. Data standards can deliver these benefits.

But in order to deliver the promised benefits, the standards, and the way they're maintained and used, must meet certain basic criteria.

1. The data standards must be complete. The law requires that the standards cover all of the existing reporting regimes that agencies and recipients use to report how they receive and spend federal money.

Agencies report their spending through five key reporting regimes: (1) Financial Account Balances, reported to Treasury; (2) Payment Requests, submitted to Treasury; (3) Budget Actions, reported to OMB; (4) Contracts, reported to the GSA; and (5) Grants and Other Assistance, formerly reported to the Commerce Department but now reported to Treasury. For recipients, the picture is more complicated. Grantees and contractors might report to the agency that gave them the funds; to various systems at the General Services Administration; and to the FFATA Subaward Reporting System, if they've packaged the money they received into sub-grants or sub-contracts.

The data standards have to incorporate all these reporting regimes - and must link them together with common data elements. In our comment, we discussed five groups of core concepts that could become a government-wide core taxonomy, or basic data element dictionary, for federal spending:

What for? (Transaction Subject), How much? (Dollar Amount), Who? (Payor/Payee), When? (Time), and Where? (Location).
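To make the idea of a core taxonomy concrete, here is a minimal sketch in Python. The element names, definitions, and types below are our own illustration (the official DATA Act data element dictionary has not yet been established), so treat this as a picture of the structure, not the content.

```python
# Illustrative sketch only: these element names and definitions are hypothetical,
# not the official DATA Act data element dictionary.
from dataclasses import dataclass

@dataclass
class DataElement:
    name: str        # standardized, government-wide element name
    definition: str  # single agreed definition for the element
    data_type: str   # expected format

# A core taxonomy grouped around the five common concepts discussed above.
CORE_TAXONOMY = {
    "Transaction Subject (What for?)": [
        DataElement("award_description", "Plain-language purpose of the award", "text"),
    ],
    "Dollar Amount (How much?)": [
        DataElement("obligation_amount", "Amount obligated by the transaction", "decimal, USD"),
    ],
    "Payor/Payee (Who?)": [
        DataElement("awarding_agency", "Agency making the award or payment", "text"),
        DataElement("recipient_id", "Non-proprietary identifier of the recipient", "identifier"),
    ],
    "Time (When?)": [
        DataElement("action_date", "Date the transaction took effect", "date, ISO 8601"),
    ],
    "Location (Where?)": [
        DataElement("place_of_performance", "Where the funded activity occurs", "geocode"),
    ],
}

for concept, elements in CORE_TAXONOMY.items():
    print(concept)
    for e in elements:
        print(f"  {e.name}: {e.definition} [{e.data_type}]")
```

The value of a dictionary like this is not the syntax, of course; it is the single agreed definition behind each element name, shared by every reporting regime.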

2. The data standards must be fully accepted. Treasury and OMB should start with data elements and data formats that others have already built, like the Legal Entity Identifier for grantees and contractors and XML / XBRL to format reports.

3. The data standards must be nonproprietary. The data standards can't be restricted by licensing requirements, or the data won't be truly open. There's more (much more!) on this topic below.

4. The data standards must be adopted incrementally. Treasury and OMB should start with the reporting regimes they already control and encourage the others to start using the data standards more gradually.

5. The data standards must be fully enforced. The failure of the Securities and Exchange Commission to enforce the quality of the structured financial data that it collects from public companies shows it's crucial to reject, and correct, submissions that don't adhere to the data standards.

6. The data standards must be fully sustainable. There should be some sort of governance framework to maintain the data standards and update them as needed. There's more on this below.

7. Once federal spending information is reported using the data standards, these reports must easily support validation. It should be easy for agencies and recipients to submit standardized spending reports, see where there are errors, and correct the errors.
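As a rough illustration of what easy validation could mean in practice, here is a minimal Python sketch, using hypothetical field names rather than any official schema: a submission is checked against the required elements, and the submitter gets back specific, correctable errors instead of a silent acceptance of bad data.

```python
# Minimal validation sketch. The field names are hypothetical, not an official
# schema; the point is that a submission is checked element by element and the
# submitter gets back errors that can be corrected before the report is accepted.
from datetime import date

REQUIRED_ELEMENTS = {
    "recipient_id": str,
    "obligation_amount": float,
    "action_date": date,
    "awarding_agency": str,
}

def validate(record):
    errors = []
    for name, expected_type in REQUIRED_ELEMENTS.items():
        if name not in record:
            errors.append("missing required element: " + name)
        elif not isinstance(record[name], expected_type):
            errors.append(
                f"{name} should be {expected_type.__name__}, "
                f"got {type(record[name]).__name__}"
            )
    return errors

# Example: a submission with a malformed amount is flagged with a clear message.
submission = {
    "recipient_id": "EXAMPLE-RECIPIENT-001",   # placeholder identifier
    "obligation_amount": "1,250,000",          # a string, not a number
    "action_date": date(2014, 12, 1),
    "awarding_agency": "Department of Health and Human Services",
}
print(validate(submission))
# ['obligation_amount should be float, got str']
```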

DATA Implementation needs a Governance Structure.

Treasury and OMB can't create and maintain the data standards on their own. They must create a government-wide advisory body, involving everybody who participates in the existing reporting regimes. This advisory body can provide input on what the data standards should look like - and later be responsible for harmonizing the different reporting regimes to conform to the data standards.

The Data Transparency Coalition comment encouraged Treasury and OMB to create an advisory body representing each existing reporting regime. These regime-specific advisory bodies could supplement the work of the government-wide DATA Act inter-agency advisory committee that Treasury and OMB have already established. Working together, the inter-agency and regime-specific advisory groups could recommend first a government-wide core taxonomy and then regime-specific supplements to that core, each mapping to the core taxonomy. After the announcement of the first version of the standards (by the legislative deadline of May 9, 2015), the regime-specific groups could be put in charge of harmonizing their own reporting regime with the core and with its own regime-specific taxonomy.

Our comment includes examples of standards governance and the lessons already learned from them. For example, we noted that the National Information Exchange Model (NIEM) handles many disparate domains, from health care to homeland security, with varying needs for extensibility. The NIEM program offers two types of standards updates, major and minor releases, and we suggested that Treasury and OMB could maintain the DATA Act data standards in a similar manner.

Want DATA to work? Dump DUNS.

If high quality government spending information is to be freely available to taxpayers and their representatives in Congress, then the government needs to use a non-proprietary identifier for grantees and contractors. Treasury and OMB must stop using the proprietary Data Universal Numbering System (DUNS) number, which is owned by Dun & Bradstreet, Inc.

The government’s continued reliance on the DUNS number prevents federal spending data from being truly open. And it unduly restricts competition in the transparency marketplace.

Why is the method of identifying recipients so important? Here's one example.

Our comment describes Treasury’s collection of payment requests and GSA’s collection of contract summaries, which both require agencies to identify the recipient of contract payments. Today the two reporting regimes express this information differently. Most payment requests submitted to Treasury identify the recipient using an Employer Identification Number (EIN). In contrast, contract summaries identify the contractor using the DUNS number. If these two reporting regimes used the same number, it would become possible to match payments to recipients.
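A small, hypothetical Python sketch shows why this matters. The records and identifier values below are invented for illustration; the point is that a join on the recipient identifier fails when one regime reports an EIN and the other a DUNS number, and succeeds the moment both report a common, non-proprietary code.

```python
# Hypothetical records, invented for illustration. Today the payment request
# carries an EIN while the contract summary carries a DUNS number, so a join
# on the recipient identifier finds nothing to connect.
payments = [
    {"payment_id": "P-001", "recipient_ein": "12-3456789", "amount": 250000},
]
contracts = [
    {"contract_id": "C-900", "recipient_duns": "987654321", "ceiling": 1000000},
]

def match(payments, contracts, payment_key, contract_key):
    """Pair each payment with the contract whose recipient identifier matches."""
    index = {c[contract_key]: c for c in contracts}
    return [(p["payment_id"], index.get(p[payment_key], {}).get("contract_id"))
            for p in payments]

print(match(payments, contracts, "recipient_ein", "recipient_duns"))
# [('P-001', None)]  -- the two regimes cannot be linked

# If both regimes reported the same non-proprietary identifier (a placeholder
# LEI-style code is used here), the identical join would connect them.
payments[0]["recipient_lei"] = "529900EXAMPLELEI0001"
contracts[0]["recipient_lei"] = "529900EXAMPLELEI0001"
print(match(payments, contracts, "recipient_lei", "recipient_lei"))
# [('P-001', 'C-900')]
```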

(Dun & Bradstreet’s own comment to Treasury has a different perspective.)

The Data Transparency Coalition generally encourages the government to engage industry - companies like Dun & Bradstreet - to help improve the quality of data. However, the core “Who” element is so central to federal spending that no one company should hold a monopoly on how it's expressed.

Increase Transparency, Reduce Compliance Costs, and Create Tech Jobs.

If implemented successfully, the DATA Act will deliver cost savings within government by reducing red tape and allowing information to flow better between silos. The DATA Act has the potential to modernize government, giving stakeholders the tools to understand, monitor, predict, and make decisions using timely data on actual expenses – just as America’s leading corporations do already. The DATA Act could also relieve the burden of compliance on industry through automated reporting of data rather than documents.

Our members have business opportunities to pursue, too.

The pursuit of the DATA Act's enactment and successful implementation was a primary motivation for the Data Transparency Coalition’s founding in 2012. As the nation’s only open data trade association, we represent leading technology and consulting firms, including both industry leaders and growing startups. Some of our members offer software solutions and platforms. Others offer solution-agnostic expert advice. The DATA Act transformation will also enable many of our members to pursue new business models and create high-tech jobs.

We're confident that Treasury and OMB can deliver data standards that will do all these things. We'll be watching to make sure.

December 2, 2014

We're Expanding! Job Opening: Director of Development


The Data Transparency Coalition is seeking a full-time Director of Development. We are offering a competitive salary and benefits, with the potential for performance-based bonuses.

The Data Transparency Coalition (datacoalition.org) is the only trade association pursuing the publication of government information as standardized, machine-readable data. Through advocacy, education, and collaboration, the Coalition supports policy reforms that require consistent data standardization and publication. Data transparency enhances accountability, improves government management, reduces compliance costs, and stimulates innovation. Representing a cross-section of the technology industry, the Coalition's membership includes market leaders such as Teradata Corporation, Workiva, RR Donnelley, PwC, Booz Allen Hamilton, and CGI Federal, as well as growing startups such as FindTheBest, Enigma.io, and Level One Technologies.

Interested candidates should submit a resume to info@datacoalition.org.

DUTIES

1. Reporting to the Executive Director, develop and execute an annual development plan aimed at building the Coalition's membership base of Executive, Partner, Regular, Startup, and Individual Members.

2. Identify and pursue potential new Coalition members and potential membership upgrades.

3. Research, recommend, and execute potential opportunities to establish and fund complementary organizations such as a research foundation and a political action committee.

4. Work with our events and member services team to coordinate services to existing members with outreach to potential members.

5. Work with our events and member services team to ensure that Coalition events are structured to maximize new member recruitment.

6. Work with our communications team to ensure that Coalition messaging communicates the opportunities and benefits of Coalition membership.

7. Work with our policy team to align the Coalition's advocacy and development efforts.

8. Work with our Executive Director to ensure that members of the Coalition's Board of Directors and Board of Advisors participate actively in development efforts.

9. Manage corporate and supporter constituency lists within the Coalition's overall contact management system.

10. Recruit and supervise a development intern.

QUALIFICATIONS

1. Bachelor's degree.

2. Four or more years of development or sales experience.

3. Background in successfully securing gifts or donations in the $10,000 - $50,000 range.

4. Strong communications skills.

5. Experience in volunteer management.

6. Basic design skills.

7. Experience in membership-based or political organization management.

November 19, 2014

SEC to Congress: Don't force us to stop collecting open data

In response to questions from a member of the House Financial Services Committee, Securities and Exchange Commission Chair Mary Jo White has voiced her agency's desire to keep collecting structured financial data from public companies--contrary to a proposal that would force the SEC to collect this information from most companies as less-useful documents.

Rep. Keith Ellison (D-MN) submitted written follow-up questions after Chair White's testimony before the committee in April 2014, and the Chair has now answered them.

The Chair's statements are the clearest picture we've got of her view of the need for the SEC to transform its disclosure system from documents into standardized data. In a nutshell, the SEC wants to improve the quality and usefulness of the structured data it collects--and hopes Congress will keep allowing it to collect such data from all public companies, rather than a subset of them.

Here's what Chair White's statements tell us:

1. The SEC's Position on XBRL Exemption Legislation. Asked to comment on Rep. Robert Hurt's proposal to require the SEC to stop collecting structured-data financial statements from all public companies below $250 million in revenue, Chair White responds that the exemption would harm investors' interests because it would restrict investors' and the SEC's ability to "search and analyze the financial information dynamically and ... [compare] financial and business performance across companies, reporting periods, and agencies" (page 3).

2. Adopting the iXBRL Format for Corporate Financial Statements. Chair White confirms (page 1) that her agency is considering adopting the inline XBRL format for corporate financial statements. The inline XBRL format would allow public companies to submit a single version of each financial statement that is both human-readable and machine-readable, replacing the current duplicative system that requires companies to submit two versions of every financial statement (a document version and an XBRL version).
The SEC hopes for a more modern perspective from Congress.
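Here is a deliberately simplified sketch of the inline XBRL idea described above, not a complete or valid iXBRL filing: a single XHTML page that a person can read in a browser, while software pulls out the tagged figure.

```python
# A stripped-down illustration of the inline XBRL concept, not a complete or
# valid iXBRL document: the sentence is readable by a person, and the tagged
# revenue figure is extractable by a program.
import xml.etree.ElementTree as ET

DOC = """<html xmlns="http://www.w3.org/1999/xhtml"
      xmlns:ix="http://www.xbrl.org/2013/inlineXBRL">
  <body>
    <p>Revenues for the year were
      $<ix:nonFraction name="us-gaap:Revenues" contextRef="FY2014"
                       unitRef="USD" decimals="0">1,250,000</ix:nonFraction>.
    </p>
  </body>
</html>"""

root = ET.fromstring(DOC)
ns = {"ix": "http://www.xbrl.org/2013/inlineXBRL"}

# A reader sees an ordinary sentence; software pulls out the tagged fact.
for fact in root.findall(".//ix:nonFraction", ns):
    print(fact.get("name"), fact.get("contextRef"), fact.text)
# us-gaap:Revenues FY2014 1,250,000
```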

3. Replacing Document-Based Reporting with Data. In July 2013, the SEC's Investor Advisory Committee recommended that the SEC adopt structured data formats for all information it collects under the securities laws. (Currently, a few of the more than 600 forms and submissions are expressed as structured data; the rest are documents.) The SEC has not yet formally responded to the Advisory Committee's recommendation, though it's obligated to do so under the Dodd-Frank Act. Chair White doesn't answer Rep. Ellison's request that she state when the SEC will finally respond--but she does say (page 2) that the SEC now considers the need for data structure with each new disclosure requirement.
 
Rep. Ellison is one of a growing and bipartisan group of Congressional supporters of transforming securities disclosure from disconnected documents into open, structured data.

November 14, 2014

SEC Incorporates Coalition’s Views in New Strategic Plan, Aims for Data Transformation


For years following the Securities and Exchange Commission’s (SEC) 2009 adoption of the eXtensible Business Reporting Language (XBRL) open data format for corporate financial statements, the agency remained mum about its intentions for the rest of corporate disclosure. The SEC still collects most regulatory information as documents, not as data. Investors, companies, and markets do not know whether the agency intends to transform its whole disclosure system into open data.

SEC's pivot towards an open data transformation works well with headquarters' main architectural feature: transparency.
In an apparent response to the Coalition’s advocacy for data standards and data publication, the SEC is now embracing the need for transformation.

In February the SEC released its Draft Strategic Plan for fiscal years 2014 to 2018. The draft laid out the agency’s mission, planned initiatives, and performance goals. It outlined a framework of strategic objectives for the five-year period.

Our Coalition criticized the draft plan for not fully embracing the transformation from documents into open data. In March the Coalition submitted a formal comment to the SEC, detailing the crucial role data standards and data publication must play in order for the agency to modernize its disclosure system and realize the overarching goals set forth in the draft strategic plan.

The SEC responded to the Coalition’s recommendations.

In mid-September the agency quietly released the final version of its plan for 2014 to 2018. The February draft mentioned the need for data transformation only in passing—as part of Strategic Objective 4.2. The published final version endorses data transformation throughout.

The very first element of the SEC’s final plan, Strategic Objective 1.1, illustrates this change. The final plan commits the SEC to major efforts in data standardization and publication—and reflects the impact of the Coalition’s advocacy.


Draft:

STRATEGIC OBJECTIVE 1.1
Improve the quality and usefulness of disclosure: The SEC will continue to evaluate and, where necessary, amend its requirements to improve the quality and usefulness of registrants’ disclosures to investors. Areas of focus will include disclosure about registrants’ financial condition, operations, risk management and executive compensation decisions and practices.

Final (changes highlighted):

STRATEGIC OBJECTIVE 1.1
Improve the quality and usefulness of disclosure: The SEC will continue to evaluate and, where necessary, amend its requirements to improve the quality and usefulness of registrants’ disclosures to investors, including continuing to modernize the collection and dissemination of timely, machine-readable, structured data to investors when appropriate. Areas of focus will include disclosure about registrants’ financial condition, operations, risk management and executive compensation decisions and practices. Additionally, the SEC will continue to pursue data standards and methods that permit investors to more efficiently search for information within forms as well as aggregate and compare financial data across filers.


Other notable commitments come in objectives 3.1 and 4.2.

In Strategic Objective 3.1 the agency makes a pledge for XBRL improvements. "Design and implement enhancements to EDGAR and SEC.gov to facilitate investor and market participant access to and utilization of disclosure documents and other information: The SEC will continue to modernize its IT systems and the dissemination and rendering of electronic disclosure documents to improve investor access to relevant information and the ease of interacting with the SEC. The SEC is working on enhancements to data standards and XBRL filing requirements that improve the quality of structured data and reduce burdens on filers."

And in Strategic Objective 4.2: "Make disclosure information more useful for analysis: Disclosure documents are submitted to the Commission electronically and, as appropriate, disseminated electronically to the investing public. This initiative will review the current disclosure systems and processes and identify ways to optimize the use of technology to improve the way disclosure documents are constructed and submitted with more emphasis on data collection. A new filing system that is optimized for data retrieval and analysis will provide features that help users create filings that are appropriate to their purpose and that allow computers to extract data from the filings for automated analysis. The system will be more flexible, so that, as new disclosure documents are defined, they can be implemented much more quickly, with all of the features of a modern, web-based filing system. Eventually, new filings structured for automated data retrieval and analysis will replace all filings submitted through the EDGAR system."

Our Coalition is encouraged by the SEC’s recent pivot towards an open data transformation.

The agency’s willingness to incorporate our suggestions throughout its strategic plan is the latest in a string of good news. In July, the SEC took its first steps toward XBRL quality enforcement. This fall, its brand-new chief economist, Mark Flannery, announced at our policy conference that the agency is poised to stop collecting separate document-based and data-based versions of company financial statements and collect a single version that is both human-readable and machine-readable.

We hope that this progress and leadership will continue at the SEC—and we will continue to urge Congress not to cut it off through short-sighted legislation.

October 30, 2014

Open Data Could Improve Access to the Great Outdoors


Our Coalition focuses on pursuing open data in two key areas of the U.S. government's information: federal spending and financial regulation. We think transforming these two areas from disconnected documents into open data will create huge benefits for government and society.

Not everybody agrees that government information is most valuable when it's published online as machine-readable data. In both our key areas, supporters of data standardization and publication have had to overcome entrenched opposition.

Would you believe the same is true of the U.S. government's camping information? It is.

On October 8, the United States Forest Service posted a draft Request for Proposal (RFP) for “Recreation One Stop Support Services (R1S).” The draft RFP seeks a private contractor to build software for a new reservation system covering campsites, cabins, and tours at all federal parks. The contract will span up to 11 years, cover over $1 billion in revenue, and critically define how our nation will access public lands for years to come.

As written, the draft RFP places a single contractor in charge of managing campsite availability and reservation information rather than releasing all that information as open data.

If campsite availability and reservations were published as open data, entrepreneurs could build competing platforms to help campers make reservations and advocates could track campsite use. These are the types of innovations envisioned in President Obama’s May 2013 Open Data Policy, which pushes all agencies to “default to open.”

But the Forest Service's draft RFP does the opposite. In fact, it would altogether remove from public view data that is currently accessible.

  • The goal of this contract is a reservation system that will provide the best access to our nation's parks and public lands.
  • The problem is that the winning contractor will be able to build a website that meets minimum requirements without opening up campsite availability and reservation data. Because the draft RFP gives the contractor the authority to keep all these important data sets private, competition will not exist.

This month the Data Transparency Coalition spoke with Alyssa Ravasio, the founder of Hipcamp. Hipcamp has created a comprehensive search engine for campgrounds across government agencies using publicly available government data. Alyssa is one of the entrepreneurs who'd eagerly build new ways for Americans to interact with government campsite data—if that data were open.

Chief among Alyssa’s concerns about the Forest Service's draft RFP is the importance of including an Application Programming Interface (API). At a basic level, APIs allow applications to talk to each other. Yelp uses the Google Maps API to show you where restaurants are located, and TurboTax uses an API provided by the IRS to give you an easier way to file your taxes. Failing to make an API a primary requirement of this contract will greatly restrict entrepreneurs’ ability to develop innovative applications on top of the open data.
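As a rough sketch of what an open availability API would make possible, here is a short Python example. The endpoint, parameters, and response fields are invented for illustration; they are not part of any existing Recreation One Stop system.

```python
# Hypothetical sketch: the endpoint, parameters, and response fields below are
# invented to show what an open availability API would enable. They are not an
# existing Recreation One Stop or Recreation.gov interface.
import json
import urllib.parse
import urllib.request

BASE_URL = "https://api.example.gov/campsites/availability"  # placeholder URL

def find_open_sites(park, start, end):
    """Return campsites with openings in the date range, from the open data feed."""
    query = urllib.parse.urlencode({"park": park, "start": start, "end": end})
    with urllib.request.urlopen(BASE_URL + "?" + query) as response:
        sites = json.load(response)
    return [s for s in sites if s.get("available")]

# Any developer could build a trip planner or a competing reservation tool on
# the same data, for example:
#   find_open_sites("Yosemite", "2015-07-01", "2015-07-05")
```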

Members of Congress who agree that this draft RFP is not a 21st century approach to conservation and outdoor recreation sent a letter requesting an extension for the public comment period.

To its credit, the Forest Service has responded to the concerns from the open data industry and from Congress. The RFP comment period has been extended an additional two weeks until November 7—and the Forest Service has announced it'll host an in-person meeting on November 13th to collect additional views.

The Forest Service has a chance to amend its RFP to transform campsite information into open data. If it doesn't, Congress and the White House should step in.

October 22, 2014

A Jungle of Entity Identifiers, and What We're Going to Do About It

It's awfully hard to track what you can't identify.

Every U.S. government agency has to uniquely identify the businesses, organizations, and other legal entities that must submit information to it. This task may sound straightforward, but it isn't. Across the federal government there are scores of different identification schemes. And they don't agree.

The jungle of entity identifiers creates big problems--for citizens seeking accountability, for entities trying to comply, and for the government itself.

Let's name just a few of the entity identification schemes that make up this jungle. The IRS uses Employer Identification Numbers (EIN) and Global Intermediary Identification Numbers (GIIN) to identify companies and organizations reporting their taxes. The GSA requires contractors (and, indirectly, most grant recipients too) to apply for a DUNS Number in order to get an award and supply an EIN to receive the cash.

In the financial regulatory world, the thicket gets thicker. The FDIC uses Certificate Numbers, the OCC uses Charter or Docket Numbers, and the Federal Reserve Bank uses RSSD ID Numbers. The Securities and Exchange Commission assigns a completely different identification scheme depending on which of the securities laws it's operating under. An SEC registrant might be identified using the Central Registration Depository (CRD), the Central Index Key (CIK), or one of the multiple filing numbers required under the 1933 Securities Act, the 1934 Securities Exchange Act, or the 1940 Investment Company Act.

And the Defense Department is its own fenced-off grove--a grove dominated by Commercial And Government Entity (CAGE) Codes.

Because these different agencies assign different and incompatible electronic identifier codes, one entity can be denoted by dozens of unrelated numbers. Investors seeking to track a public company's government contracts, regulators trying to determine a firm's exposure to SEC-regulated and CFTC-regulated derivatives, and inspectors general trying to find suspicious patterns in a grantee's federal payments are all out of luck. Even with ready access to huge repositories of filings, transactions, and forms, incompatible identifiers make matching difficult or impossible.

And it’s nobody’s job to make sense of it all.

Watch DTC's Hudson Hollister explain the grantee/contractor identifier mess. 

Help may be on the way. Since 2010, the U.S. Treasury's Office of Financial Research has been working with financial regulators around the world to establish the Legal Entity Identifier (LEI)--a global standard for entity identification. The LEI is built for interoperability. And it's domain-agnostic, which means it's just as easily used by a securities regulator as by a contracting office.
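One way to picture the transition is a crosswalk that maps each legacy identifier to a single LEI. The identifiers below are made up, and the mapping is our own sketch rather than an existing government system, but it shows how records filed under different schemes could be recognized as describing the same entity.

```python
# Illustrative crosswalk with made-up identifiers: each agency keeps its legacy
# code, but every code resolves to one LEI, so records filed under different
# schemes can be recognized as describing the same legal entity.
CROSSWALK = {
    ("IRS_EIN", "12-3456789"):  "529900EXAMPLELEI0001",
    ("GSA_DUNS", "987654321"):  "529900EXAMPLELEI0001",
    ("SEC_CIK", "0000123456"):  "529900EXAMPLELEI0001",
    ("DOD_CAGE", "1ABC2"):      "529900EXAMPLELEI0001",
}

def same_entity(id_a, id_b):
    """True if both legacy identifiers resolve to the same LEI."""
    lei_a = CROSSWALK.get(id_a)
    lei_b = CROSSWALK.get(id_b)
    return lei_a is not None and lei_a == lei_b

# A contract filed under a DUNS number and a tax record filed under an EIN
# turn out to describe the same entity:
print(same_entity(("GSA_DUNS", "987654321"), ("IRS_EIN", "12-3456789")))  # True
```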

At Data Transparency 2014 last month, Matthew Reed, chief counsel in the U.S. Treasury’s Office of Financial Research, suggested that the LEI is ready to be used by agencies to tame the jungle of identifiers. Reed has previously described the LEI as “a bar code for identifying entities” and “a linchpin for making connections in the massive volumes of financial data that course through the international economy every day ... [the LEI will] help companies manage their risk and government regulators analyze data related to financial stability.”

Matthew Reed at DT2014
Reed outlined three important questions the LEI could answer:

  1. Who is who? LEIs uniquely identify entities.
  2. Who owns whom? LEIs show parent-child corporate relationships.
  3. Who owns what? LEIs attached to products show who owns what.


Our Coalition has called on the Treasury Department and the White House to start taming the jungle of identifiers when it comes time to set up government-wide data standards for federal spending, under the DATA Act.

Whenever the government collects information, it should reuse an identifier code that is already being used somewhere else. Equally important is choosing a standard that is non-proprietary. We must avoid identifiers carrying restrictions that impede the free reuse of open data.

The LEI fits both these criteria. That's why our Coalition urges the Treasury Department and the White House, when they adopt the common identifier for grantees and contractors required by the DATA Act, to adopt the LEI system. And as we pursue future open data mandates in other areas of government information, we'll try to make sure that the decision-makers in those areas consider the LEI as well.

Gradually, outdated and incompatible identification schemes can be retired and replaced by the LEI. This won't happen fast, but it can happen eventually.

With the LEI uniting disparate reporting requirements, the jungle of entity identifiers will become a garden of open, interoperable data--producing abundant insights and opportunities for all.
