Tuesday, February 11, 2020
A Conversation with Nancy Potok, former Chief Statistician of the U.S.

An effective and efficient U.S. federal government requires evidence about where needs are greatest, what works and what does not work, where and how programs could be improved, and how programs of yesterday may no longer be suited for today. Having access to timely, accurate, reliable statistical data enables the federal government to make reasoned and disciplined decisions about where to target resources to get the largest possible return for the American taxpayer. The federal government’s statistical agencies and programs play a vital role in generating that data. Timely, accurate, and relevant statistical data are the foundation of evidence-based decision making.

Nancy Potok, then chief statistician of the U.S. federal government, joined me on The Business of Government Hour to share her insights into how the U.S. federal government is leveraging data as a strategic asset, how it is building the infrastructure for evidence-based policymaking, and what the future holds for the federal data and statistical communities.

Would you tell us more about the work of the Statistical and Science Policy Office and the duties and responsibilities of the U.S. federal government chief statistician?

When I tell people that I’m the chief statistician of the United States, there is this pause. People say, that’s the coolest title I ever heard in government. There is another pause with an immediate, “What do you do?” I don’t produce statistics. It’s a policy job. It was established as part of the Paperwork Reduction Act and put into the Office of Information and Regulatory Affairs (OIRA). The job is threefold. First and foremost, it is to safeguard the integrity of federal data. I am charged with making sure that federal statistics are objective, unbiased, not politically influenced, accurate, timely, and relevant. All statistical directives that outline standards and rules on handling federal statistical data come from my office. This office puts out methods and standards that federal agencies have to follow if they’re going to assert that their statistical data is official U.S. government data. Given we have a decentralized statistical system in the U.S., the chief statistician also coordinates all federal statistical agencies. I’ll give you a sense of the size and scope: 13 principal federal statistical agencies and three recognized statistical units—agencies whose principal mission is to produce official federal statistics—are joined by over 100 other federal programs in statistical activities, spanning measurement, information collection, statistical products, data management, and dissemination. The chief statistician heads the Interagency Council on Statistical Policy (ICSP), which promotes integration across the federal statistical system. The third role is to represent the U.S. internationally. I lead the U.S. delegation to the UN Statistical Commission. I also represent the U.S. at the Organization for Economic Co-operation and Development (OECD) on statistical matters. These are very important partnerships.
We collaborate with the international statistical community and have a good working relationship with our international counterparts.

Would you highlight some of your key strategic priorities while chief statistician?

One of my key priorities focused on modernizing data collection methods in order to get data out faster. Surveys take a long time to process and they’re expensive. Also, people increasingly don’t like to answer surveys. It’s an intrusion. It’s hard to collect information that way. We also have a proliferation of data accessible in less traditional ways that can be used for statistical purposes. For example, if you are releasing a monthly retail sales economic indicator and you want to put it out faster than, say, six weeks after you complete each monthly survey asking businesses about their retail sales, you can start to look at data from companies that aggregate credit card records, because more and more purchases are on credit cards. The Census Bureau and the Bureau of Economic Analysis have done research into using aggregated credit card records to calculate retail sales. The individual purchases are de-identified because they’re aggregated, but you can see what was purchased using credit cards in Chicago or in New York the day after the purchases took place. That’s how fast the data are aggregated. You no longer have to go to the businesses to ask about sales, because you can see the sales from the purchase end. But you need to be careful that you are not missing sales that are paid for by means other than credit cards in the released indicator.
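The aggregation step described above can be sketched in a few lines. This is a minimal illustration, not the agencies' actual methodology: the record fields, city names, and amounts are hypothetical, and a real pipeline would also adjust for purchases made by cash or other non-card means.

```python
from collections import defaultdict
from datetime import date

# Hypothetical de-identified transaction records. No cardholder
# identifiers are present; only location, date, and amount are kept.
transactions = [
    {"city": "Chicago",  "date": date(2019, 6, 1), "amount": 42.50},
    {"city": "Chicago",  "date": date(2019, 6, 1), "amount": 17.25},
    {"city": "New York", "date": date(2019, 6, 1), "amount": 99.00},
    {"city": "Chicago",  "date": date(2019, 6, 2), "amount": 5.75},
]

def aggregate_sales(records):
    """Sum purchase amounts by (city, date).

    De-identification here comes from the aggregation itself:
    the output carries daily totals, never person-level rows.
    Note: totals cover only card purchases, so an indicator built
    this way would still need a correction for cash sales.
    """
    totals = defaultdict(float)
    for r in records:
        totals[(r["city"], r["date"])] += r["amount"]
    return dict(totals)

daily_sales = aggregate_sales(transactions)
# Chicago's total for June 1 combines both purchases: 42.50 + 17.25
print(daily_sales[("Chicago", date(2019, 6, 1))])  # → 59.75
```

Because the totals can be computed as soon as the day's transactions are processed, an estimate is available the day after the purchases took place, rather than weeks after a survey closes.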

Would you tell us more about the federal data strategy?

To help agencies leverage their data as a strategic asset, the federal data strategy includes four components. These components are the building blocks and guides for federal agency actions over the next several years. The first component is enterprise data governance. It includes standardizing metadata, creating inventories, safeguarding confidentiality and privacy, and so on. The more expansive governance vision includes collaboration across agencies and agency program silos in order to bring multidisciplinary expertise together to formulate and address the ‘big questions’ that have been so difficult for agencies to tackle. To be successful means changing federal agency cultures not only to ask priority questions that are meaningful and specific to the agency, including operational and mission-strategic questions, but also to share data across silos within and across agencies. The change for many agencies will be that the priority questions to be answered must drive the research methods, rather than methods being determined by what data have been readily available in the past.

The second component focuses on access, use, and augmentation of data. It calls on agencies to make data available to the public more quickly and in more useful formats. In addition, agencies should be using the best available technologies to increase access to sensitive, protected data while protecting privacy, confidentiality, and security, including the interests of the data providers. The Evidence Commission envisioned a National Secure Data Service that would be a center of excellence for statistical activities that support evidence building. The strategy’s action plan calls for the creation of toolkits and methodologies to help agencies build their own competencies as well. Agencies would also be expected to seek out new sources for building statistical data sets, which could include commercially available data and data from state and local governments.

The third component, decision making and accountability, addresses the need for policy and decision makers to increase their use of high-quality data and analyses to inform evidence-based decision making and improved operations. Agencies are expected to use the most rigorous methods possible that align with and are appropriate to answering the identified ‘big’ questions. Agencies may answer questions using existing evidence, including literature reviews, meta-analyses, and research clearinghouses. But they are also encouraged to explore opportunities for acquiring new evidence, including utilizing outside expertise.

Finally, the federal agencies need to facilitate the use of government data assets by external parties, such as academic researchers, businesses, and community groups. To accomplish this through commercialization, innovation, and public use will require agencies to reach out to partners outside of government to assess which data are most valuable and should be prioritized for making available.

Would you tell us more about the implementation of the Foundations for Evidence-Based Policymaking Act (the Evidence Act)?

The federal data strategy and the Evidence Act are a powerful match. Their collective vision: to create partnerships between U.S. federal agencies, state, tribal, and local governments, academia, and industry to effectively realize the value of shared federal data—accomplished by putting ‘open’ non-sensitive data in the hands of the public and using secure technology to increase legitimate researcher access to more restricted, sensitive data while still protecting the privacy and confidentiality of those data.

The requirements of the Evidence Act are geared toward a fundamental change in the way agencies think about what they’re doing. The law enacts 11 of the 27 recommendations of the Commission on Evidence-Based Policymaking. The purpose is to address the fractured federal statistical landscape. It illustrates a shift in thinking. The Evidence Act requires that agencies become more transparent with their data and create a comprehensive data inventory and data catalogue accessible to the public that can be accessed through a single site for the federal government. In addition, each agency must create an Open Data Plan. To help facilitate easier access to protected statistical data, the Act mandates that a single application be developed and put in place for researchers to request access to statistical agency data. Currently, each agency has its own application, making the process cumbersome for researchers. The Act also requires agencies to develop evaluation plans tied to their strategic goals. Agencies then create learning agendas focused on first asking the big questions, and then getting the information needed to answer those questions. What kinds of questions might agencies have? The Act envisions that agencies will begin to understand the longer-term societal outcomes of their programs, be able to visualize the results of multiple federal programs in various geographic areas, improve their operations, and better serve the public.