LEADERSHIP SERIES INNOVATIONS in OPEN GRANTMAKING by Beth Simone Noveck Andrew Miller Andrew Young In partnership with #opengrantmaking  This guide was written by Beth Simone Noveck, Andrew Miller, and Andrew Young, The Governance Lab (The GovLab), and edited by Jen Bokoff, Foundation Center. Design by Christine Innamorato, Foundation Center. The authors would like to thank everyone who contributed information and perspectives to this piece, and particularly would like to acknowledge David Bringle, DARPA, for his contributions. To access this guide and other resources, please visit grantcraft.org. You are welcome to excerpt, copy, or quote from GrantCraft materials, with attribution to GrantCraft and The GovLab, and inclusion of the copyright. GrantCraft is a service of Foundation Center. For further information, please e-mail info@grantcraft.org. Resources in the GrantCraft library are not meant to give instructions or prescribe solutions; rather, they are intended to spark ideas, stimulate discussion, and suggest possibilities. This paper is part of GrantCraft's Leadership Series. © 2016 The GovLab. This work is made available under the terms of the Creative Commons Attribution-NonCommercial 4.0 Unported License, creative commons.org/licenses/by0nc/4.0  Contents 3 Introduction to Open and Effective Grantmaking 5 Innovations Pre-Granting: Ideation Challenges 7 Innovations Pre-Granting: Improving the Quality of Applications through Matchmaking 9 Innovations Pre-Granting: Prioritizing Bottom-Up Participation 11 Innovations in Granting: Open Peer Review and Participatory Judging 13 Innovations in Granting: Evidence-Based Grantmaking 15 Innovations in Granting: Expert Networking 17 Innovations in Granting: Open Alternatives to Grants 19 Innovations Post-Granting: Opening Data About Grants, Grantors, and Grantees 21 Innovations Post-Granting: Standardizing Reporting 23 Innovations Post-Granting: Opening Access to Grant-Funded Work Product 25 Conclusion and Reflection INNOVATIONS IN OPEN GRANTMAKING 1 WHY THIS PAPER? Despite grantmaking's importance, we have a decidedly 20th-century system in place for deciding how we make these billions of dollars of crucial public investments. To make the most of limited funding— and help build confidence in the ability of public investments to make a positive difference—it is essential for our government agencies to try more innovative approaches to designing, awarding, and measuring their grantmaking activities. HOW CAN I USE THIS AS A RESOURCE? Innovations in Open Grantmaking seeks to provide inspiration and early proof of concept regarding innovative practices at every stage of the grantmaking process. The examples and lessons included can act as suggested guidelines for future research and experimentation around more openly and effectively providing access to public money. WHO ARE THE AUTHORS? Beth Simone Noveck is Florence Rogatz Visiting Clinical Professor of Law, Yale Law School and Jerry M. Hultin Global Network Professor of Engineering, New York University. She is Director of the GovLab and its MacArthur Research Network on Opening Governance. Funded by the John D. and Catherine T. MacArthur Foundation, the John S. and James L. Knight Foundation, and Google.org, the GovLab strives to improve people’s lives by changing how we govern. 
The GovLab designs and tests technology, policy, and strategies for fostering more open and collaborative approaches to strengthen the ability of people and institutions to work together to solve problems, make decisions, resolve conflict, and govern themselves more effectively and legitimately. Andrew Miller is a former Research Fellow at the GovLab currently pursuing a J.D. at Yale Law School. Andrew joined the GovLab after three years in the U.S. Senate, where he served on the legislative teams of Senator Jeff Merkley (D-OR) and Senator Richard Blumenthal (D-CT). During his time in the Senate, Andrew’s portfolio included banking, consumer finance, budget, tax, international trade, labor, and housing issues. While in Sen. Merkley’s office, Andrew helped draft the CROWDFUND Act, which legalized certain kinds of online crowdfunded investments and established a regulatory framework for the new crowdfunding marketplace. Prior to his work in the Senate, Andrew spent two years in the UK on a Marshall Scholarship, where his research focused on the changing politics of the Chinese media and its approach to covering sensitive foreign affairs topics. Andrew Young is the Associate Director of Research at the GovLab, where he leads research efforts focusing on the impact of technology on public institutions. Among the grant-funded projects he has directed are a global assessment of the impact of open government data; comparative benchmarking of government innovation efforts against those of other countries; and crafting the experimental design for testing the adoption of technology innovations in federal agencies. He has written extended work on how public sector institutions use new technology to coordinate work and developed original public resources on new ways of governing with technology. WHERE CAN I LEARN MORE? To learn more about the GovLab's work on open grantmaking, and innovations in governance more generally, visit thegovlab.org, contact info@thegovlab.org, follow us on Twitter @thegovlab, and sign up for our weekly The GovLab Digest newsletter at thegovlab.org/govlab-digest/. An earlier version of this publication was shared on Medium earlier this year, which includes links and comments from other readers, and can be found at medium.com/open-grantmaking-innovations. You can learn more about Beth Simone Noveck's work on governance innovation in her books Smart Citizens, Smarter State: The Technologies of Expertise and the Future of Governing (Harvard University Press, 2015) and Wiki Government: Technology Can Make Government Better, Democracy Stronger and Citizens More Powerful (Brookings, 2009). GrantCraft, a service of Foundation Center, offers resources to help funders be more strategic about their work, and has published this paper as part of its leadership collection to encourage a conversation about this topic. Explore GrantCraft’s resources at grantcraft.org and on Twitter by following @grantcraft. Other services and tools that Foundation Center offers can be accessed at foundationcenter.org. 2 GRANTCRAFT, A SERVICE OF FOUNDATION CENTER Introduction to Open and Effective Grantmaking What It Is, Why It Matters Of its $4.1 trillion fiscal year 2016 budget, the U.S. federal government and its grantmaking agencies will give out billions of dollars in the form of grants to states, localities, and individuals, supporting a dizzying array of activities, from scientific research and economic development to arts, culture, and education. 
Unlike contracts — where government gives a business or organization X dollars in exchange for a specific product or service defined in advance — grants generally provide greater flexibility for the recipient to decide how, precisely, to use the funds to advance a particular goal. As the U.S. federal government defines it: "A grant is a way the government funds your ideas and projects to provide public services and stimulate the economy."1 When invested well, such grant funding has the power to yield cutting-edge research and innovation, create jobs, deepen the impact that state and local organizations have in their communities, and support smart solutions to hard problems. Grantmaking, in short, plays a vital role in helping our government, our researchers, and our communities confront 21st-century challenges. Yet we still have a decidedly 20th-century system in place for deciding how we make these billions of dollars of crucial investments. In order to make the most of limited funding — and to help build confidence in the ability of public investments to make a positive difference — it is essential for our government agencies to try more strategic approaches to designing, awarding, and measuring their grantmaking. That is the sort of change we hope to advance with this publication on innovations in public grantmaking. This paper is part of GrantCraft's Leadership Series. GrantCraft publishes papers written by leaders in the field of philanthropy to spark ideas, stimulate discussion, and suggest possibilities. While you read, push yourself to learn from, but also critically reflect on, this text. What do you agree with? What other perspectives do you see? What questions does it raise for you? At the end of the paper, you'll find additional questions that you can use to spark conversation with colleagues and others, which you can also discuss further with an online community on grantcraft.org. As you're reading, think about what examples you have that can contribute to ongoing learning. E-mail info@grantcraft.org and info@thegovlab.org to share your perspective. THE SYSTEM WE HAVE NOW AND THE SYSTEM WE COULD HAVE In most instances, grantmaking by government agencies follows a familiar lifecycle: the agency describes and publicizes the grant in a public call for proposals; qualifying individuals or entities send in applications; and the agencies select their winners through internal deliberations. Members of the public — including outside experts, past grantees, and service recipients in the community — often have few opportunities to provide input before, during, or after the judging process. After awarding grants, the agencies themselves usually have limited continuing interactions with those they fund. The current system, to be sure, developed for a number of reasons. In an effort to safeguard the legitimacy and fairness of the grantmaking process, agencies have traditionally conducted grantmaking strictly behind closed doors. From application to judging, most government grantmaking processes have been confidential and at arm's length. For statutory, regulatory, or even cultural reasons, the grantmaking process in many agencies is characterized by caution rather than by creativity.
But it doesn’t have to be this way. Innovators in government, philanthropy, and private sector companies have begun to experiment with greater transparency and collaboration at all stages of grantmaking. Perhaps counterintuitively, these innovations in “open grantmaking” have the potential to yield more legitimate and more accountable processes than their closed-door antecedents. These processes, in turn, have the potential to result in more creative strategies for solving problems and, ultimately, more effective outcomes, including greater economic growth. Encouragingly, the federal government has begun to take note. Since the White House organized a conference on open grantmaking, prizes, and challenges in 2010,2 experiments in open grantmaking have indeed proliferated. But there has been no systematic policy adoption of these techniques. The global mandate for trans- parency and open government data, as well as the adoption of alternative funding mechanisms to complement traditional grants and contracts (such as prize-backed challenges), makes the time especially ripe for more systemic change. Our hope is to encourage the broader adoption of open and innovative grantmaking practices, the incorporation of these practices into policy, and a more sustained empirical assessment of their impact. What This Publication Covers Open and effective grantmaking innovations can take many forms, including techniques that: l enable broader and more diverse groups of people to participate, with the aim of bringing greater expertise and creativity into the process; l mandate more transparency, with the aim of improving accountability; and l incorporate greater use of data and evidence, with the aim of evolving the design of the grant and informing future judging decisions. Often, innovative grantmaking processes will combine more than one technique, such as the use of bottom-up crowdsourcing to engage people in gathering data about what’s working on the ground. Throughout this publication, we will take a closer look at several such categories of open grantmaking innovations, organized chronologically along the lifecycle of the grantmaking process: innovations pre-granting, innovations in judging and awarding grant funds, and innovations post-granting. For each type of innovation, we will explore a selection of examples from across the public, private, and philanthropic sectors, as well as their particular advantages and potential drawbacks. Certainly, not every innovation is appropriate for every agency or every grant. But all grantmaking agencies could benefit by taking a long, hard look at their existing procedures and determining how best to modernize and improve them. This publication will provide practitioners throughout government a menu of options to learn from — and some important issues to consider — as they decide how to do so. Through practical examples, this publication will attempt to sketch out the range and potential impact of open and effective grantmaking innovations at all stages of the process. We begin with a series of three stories on innovations before the judging/ awarding process even begins. 4 GRANTCRAFT, A SERVICE OF FOUNDATION CENTER Innovations Pre-Granting: Ideation Challenges Using the Crowd to Develop Grant Strategy Institutions can use “the crowd” to brainstorm ideas for the design or goal of the grant itself. 
In other words, outsiders can be useful for helping in defining the questions institutions ought to use their funding to answer or the problems they should focus on solving. When a government agency seeks to improve how it uses its grant funding, the hard work begins long before the judging process. Grantmaking practitioners know that the quality of a grant’s design (e.g., the problem definition, the application process, the communication and outreach strategy) can determine the quality of applicants and the success of the projects that are ultimately funded. During this pre-judging stage, the objective is to identify the most worthwhile problems toward which to direct grant funding, as well as the best mechanisms for addressing those problems. Openness to outside input, in particular, has the potential to bring to bear greater expertise — including both on-the-ground know-how and more formal training — when determining where and how to fund. Grantmaking institutions have begun to solicit ideas from outside their walls very early in the process, turning to “the crowd” for help to inform how they design the grant opportunity in the first place. Crowdsourcing the grant design has the potential to bring in new ideas and better information from more diverse sources before a single application has been sent in. In 2010, for example, Harvard University launched its Type 1 Diabetes Challenge3 to get creative suggestions for combating the disease and to open up how universities generate their research questions.4 Typically, an academic decides on the direction for their lab and seeks out funding in support of the pre-existing idea. In an effort to expand participation beyond the usual prospects and uncover new ideas for fighting type 1 diabetes prior to investing full- fledged research funding, Harvard sponsored a $30,000 prize–backed challenge to come up with promising approaches that could become the basis for a larger, subsequent research grant. The challenge did not ask people to come up with answers, as is typically the case in grantmaking projects. Rather, contributors — the prize competitors — supplied the questions. This enabled people to propose ideas whether or not they had the resources or desire to solve the problem they proposed. During this pre-judging stage, the objective is to identify the most worthwhile problems toward which to direct grant funding, as well as the best mechanisms for addressing those problems. After six weeks, 150 solid research hypotheses were submitted, encompassing a broad range of approaches from different disciplines. One winner out of the 12 selected was a college chemistry major, who believed there ought to be more focus on the chemical origins of the disease. As she put it, “I was drawn to the fact that the challenge promised to create a dialogue spanning scientific disciplines and based on the merit of people’s ideas. Opportunities like this are extremely rare.”5 Another winning applicant was herself a diabetes patient. The Leona Helmsley Trust then offered $1 million in grant funding to encourage qualified biomedical INNOVATIONS IN OPEN GRANTMAKING 5 researchers to create experiments based on these newly generated research questions, including the approaches suggested by the college student and the diabetes patient. Using a similar model, other foundations such as the Alfred P. 
Sloan Foundation awarded $15,000 for good ideas about what to fund in connection with research for the White House Smart Disclosure Initiative.6 "Smart Disclosure" refers to creating tools to help consumers make better and safer decisions using the data that government collects from companies and then publishes openly in machine-readable formats. The challenge asked people to answer five questions to help guide future Smart Disclosure research. Good proposals received between $5,000 and $15,000 and did not require the submitter to implement the research. Rather, inspired by these suggestions, Sloan and Russell Sage plan on pursuing further grantmaking. The public sector has also begun replicating this model of separating idea generation from grant implementation. In 2013, the National Science Foundation (NSF) held the Basic Research to Enable Agricultural Development (BREAD) Ideas Challenge7 to get good ideas from diverse sources for grantmaking in the agricultural sciences, with a focus on improving the lives of millions of smallholder farmers in the developing world. Crucially, the submissions needed to be challenges in need of solving, rather than solutions to a preselected challenge. Examples of winning challenge ideas include "Develop knowledge, methods, and tools to identify drought-productive microbiomes and facilitate their use by smallholder farmers" and "Develop means for 'root swelling' of small wild roots, leading the way to the creation of hundreds of new root crops that could improve the nutrition and incomes of developing world farmers."8 In holding this competition, the NSF is both signaling its own interest in solving the problems it has selected, and also using its convening power to convince others — including other funders — of the importance of these challenges. While there is no formal evidence of impact from the BREAD challenge just yet, separating the "idea generation" from the "execution" phase potentially allows more diverse people to suggest ideas and inform how funding agencies frame later grant offerings, even if these first-round applicants may not be eligible for or interested in applying for subsequent funding. WHY DO IT l Diversity of input: The quality of grant design can determine the quality of grant applicants and recipients. Using "the crowd" to brainstorm offers a way to harness the knowledge, experience, and diversity of a broader group of people to make sure you're answering the right questions and solving the right problems. l Flexibility: This approach offers the flexibility to decide whether to engage those outside the organization in helping to design the grant through an open call to a broad public, or through targeted outreach to a specific audience. l Buzz: The publicity and outreach that go into crowdsourcing grant design can, in turn, generate enthusiasm about the grant (or its overarching goals) and attract more applicants. WHY NOT DO IT l Institutional constraints: In circumstances where statutes, regulations, or bylaws tightly constrain a government entity's or organization's grantmaking activity, there may be insufficient room for outside applicants to shape the parameters of the grant call itself.
l Time: Crowdsourcing the grant design necessarily turns the grant into a two-stage process (the first focusing on problem definition and the second on generating solutions), which may be inappropriate if time is of the essence. Innovations Pre-Granting: Improving the Quality of Applications through Matchmaking Helping Complementary Applicants Find Each Other Online matchmaking tools can help connect grant applicants with potential partners who have complementary expertise or who might otherwise strengthen their application by joining it. Although relatively new, "matchmaking" has emerged as a method for potentially improving the quality of grant applications. With this approach, grantmaking institutions can use online tools to connect grant applicants with potential partners who might strengthen their applications or join forces in their eventual projects. In the United States, a key example of matchmaking is a joint 2010 effort by the Department of Commerce and the Department of Agriculture to award $7.2 billion in stimulus dollars for broadband deployment grants. To improve the quality of grant proposals and promote collaboration between large companies and small community groups, they set up Broadbandmatch.gov, an online tool to allow potential grant applicants to find partners with complementary expertise.9 Conceived as part of the Obama administration's Open Government Initiative,10 the tool enabled applicants and would-be partners to see not only which small and minority-owned companies might supply goods or services for their projects, but also which nonprofit organizations, educational institutions, and state and municipal governments were working to improve broadband access and digital literacy. During the first weeks of use leading up to the next application deadline, "Over 1,500 organizations established profiles on the website, including hundreds of community anchor institutions like libraries and community colleges, hundreds more Internet service providers, dozens of small and minority-owned for-profit businesses, over 100 states or municipalities, as well as various technology vendors, public safety institutions, venture capital firms, and tribal entities."11 In Europe, the North Atlantic Tourism Association (NATA), which uses grants to support tourism and cultural-exchange projects in Greenland, Iceland, and the Faroe Islands, also offers applicants the opportunity to find each other via a matchmaking tool. As they describe it, "Our new partner database provides opportunities for people who have exciting tourism development ideas to link up. If you have a project that requires partnership in one or both of the other participating countries, this is the perfect way to find the right people who can help you make it happen."12 In fact, it is a requirement for funding that projects involve at least two of the three countries under NATA's jurisdiction, making the tool of central importance for assembling a successful application. In a variation on this matchmaking concept, convening organizations are helping practitioners find and learn from each other. Projects like C40 convene the world's 40 largest megacities to exchange best practices and cooperate on reducing greenhouse gas emissions.13 By creating an intermediary and convener, the cities have been able to identify and invest in doing what works.
One notable example started in 2004, when the VNA Foundation and Michael Reese Health Trust convened about 50 people from agencies that specifically provided health care for homeless community members. After the third convening, the funders bowed out and the agencies continued to meet regularly. There was a direct outcome: several of the member agencies eventually banded together to form the West Side Collaborative, a group with a newly honed strategy to tackle the issues that formed the basis of the convenings, which the VNA Foundation later funded. Matchmaking has emerged as a method for potentially improving the quality of grant applications. We draw attention to these exciting first movers because examples of matchmaking for grant applicants are still somewhat few and far between. WHY DO IT l Applicant quality: Matchmaking can help improve applicant quality by allowing well- rounded teams or partnerships to form from complementary individuals/groups that might otherwise not have found each other. l Idea quality: Bringing together individuals and groups with complementary skills and experiences has the potential to yield better idea outcomes. l Capacity building: Matchmaking tools can also address equity and capacity-building concerns. Applicants with a strong need or compelling case to be awarded a particular grant may not always be best-placed to meet certain technical or other require- ments. These tools can help connect them with potential partners to strengthen their applications. WHY NOT DO IT l Confidentiality: Where confidentiality of applicants or their submission materials is an issue, this approach would perhaps be less appropriate. l Too many cooks: In situations where the grantmaking entity prefers working with individuals or smaller groups, it may be preferable to limit the size of “teams” in the applicant pool. 8 GRANTCRAFT, A SERVICE OF FOUNDATION CENTER Innovations Pre-Granting: Prioritizing Bottom-Up Participation Using Distributed Participants to Improve Agility and Impact14 From environmental monitoring to collaborative art spaces, social innovation projects are increasingly harnessing the power and creativity of bottom-up participation. In order to break out of the traditional top-down approach to solving public problems, government agencies may consider making bottom-up participation (e.g., a scientist engaging non-professionals in data gathering) a condition of funding in some instances. As economist Friedrich Hayek wrote in 1945, “the knowledge of the circumstances of which we must make use never exists in concentrated or integrated form.”15 Networks enable institu- tions of all types to quickly and efficiently access the wealth of knowledge, creativity, insight, and enthusiasm that is out there in the wider society. This is perhaps nowhere more evidenced than when researchers find ways to tap into broad networks of nonprofessional contributors. There are estimated to be one hundred billion galaxies in the observable universe, each containing billions of stars. For many years, deep-view telescopes like the Hubble Space Telescope have recorded images of the Milky Way and other galaxies to help us understand how galaxies form. The volume of data that has resulted is enormous. After more than 20 years in orbit, Hubble alone has recorded over one million data points. 
In 2007, to begin to translate this raw information into useful scientific knowledge, the scientists at NASA launched Zooniverse,16 turning to “citizen scientists” — volunteer hobbyists, amateur science buffs, and space enthusiasts — to classify the images according to their shape: elliptical, spiral, lenticular, irregular. This information, in turn, illuminates the age of the galaxy. In contrast to Zooniverse, where amateurs assist professional scientists, Public Laboratory for Open Technology and Science (Public Lab) dubs itself a “Civic Science” project.17 Public Lab views citizens not as mere amanuenses, but as field scientists fully capable in their own right. In one project, it provides tools to help people make maps and aerial images of environmental conditions using balloons and kites. These sorts of “grassroots mapping” projects have been used to contest official maps. In 2010, for instance, members of an informal settlement in Lima, Peru, developed maps of their community as evidence of their habitation.18 On the Gulf Coast of the United States, locally produced maps of oil spillage are being used to document damage that is underreported by company or government officials.19 Map Kibera, launched in 2009 in Nairobi with support from UNICEF and the Gates Foundation, enlisted slum residents—especially young people—to identify and map formal and informal social service delivery points, as well as community risks and vulnerabilities. Similarly, Ureport, an SMS reporting tool, mobilized 300,000 volunteers across Uganda to spot the problem of banana bacterial wilt, a scourge affecting the country’s most important crop.20 Within five days of the first INNOVATIONS IN OPEN GRANTMAKING 9 text message going out, 190,000 Ugandans had gotten notice of the disease and how to save bananas on their farms. In recognition of the potential of this sort of approach, the Environmental Protection Agency (EPA) has also experimented with awarding grants for citizen science projects such as community-led air and water monitoring initiatives.21 The integration of community participation into grantees’ work product has not been limited to scientific research. Through its Exploring Engagement Fund, the California-based James Irvine Foundation awards funds to community arts initiatives that “aim to engage new and diverse populations by adding active participation oppor- tunities for participants and/or incorporating the use of nontraditional arts spaces.”22 This fund provides an example of how specific project criteria can help grant dollars go beyond the scope of a given project or organization, helping build both capacity and community that could outlast the grant itself. It is conceivable to imagine making citizen engagement a precondition or at least a plus point for successful proposals in a variety of contexts. Prioritizing bottom-up engagement and feedback loops also occupies a major portion of the Fund for Shared Insight’s mission. Shared Insight is a partnership between 30 foundations including the Ford, MacArthur, Hewlett, and Packard Foundations, among others. The initiative was launched in 2014 with the aim of pooling “financial and other resources to make grants to improve philanthropy”—primarily through increased coordination and openness between funders and potential grantees, and supporting initiatives that establish feedback loops with the communities the funded work is intended to benefit. 
These examples suggest not only a future in which more grants might be awarded to non-professionals, but also the possibility of changing grant policy to require engagement with citizen-amateurs as a condition of funding. It is conceivable to imagine making citizen engagement — i.e., involving citizens in measuring, monitoring, and policing on-the- ground conditions such as environmental indicators, prices, or when and where services have or have not been delivered — a precondition or at least a plus point for successful proposals in a variety of contexts. Using amateur participants to engage in distributed “sensing” of conditions is already improving feedback loops in scientific context, and could very well be fruitfully incorpo- rated into grantmaking more systematically. WHY DO IT l Ear to the ground: Bottom-up participation creates an important channel for people to stay in close touch with needs, ideas, and views of the communities they serve. This, in turn, makes it less likely that publicly funded projects will be received as white elephants. l Efficiency: For projects requiring monitoring or mapping over a vast expanse of space or long period of time, tight government budgets can severely limit the amount of ground that can be covered. Making crowds an integral part of grantee work (e.g., citizen scientists helping to spot an invasive species or signs of a plant disease) can maximize the bang for the buck that government projects generate. WHY NOT DO IT l Expertise: Highly technical projects requiring all participants to have a particular skill set may not be amenable to widespread bottom-up participation. l Community-building is hard: The skills needed to achieve the goals of the grant might not be commensurate with the skills needed to organize and maintain a community of participants. 10 GRANTCRAFT, A SERVICE OF FOUNDATION CENTER Innovations in Granting: Open Peer Review and Participatory Judging Changing How “Winners” Are Picked In contrast to the traditional closed-door review process, many organizations have begun exploring new ways to make the judging or awarding of grants more collaborative. These more open judging processes can involve opportunities for public input at the outset to narrow a broad field or, later on, to select final winners from a shortlist. This input could consist of public comments or voting, judging by panels of outside peer reviewers, or a combination of both. During the selection phase, organizations are concerned with ensuring fairness, decreasing the costs, and increasing the efficiency of grant administration. They want to recognize and select the most promising proposals. By bringing more people (and data) from more diverse backgrounds into the process at the selection stage, open grantmaking techniques have the potential to make the grant award process more informed and legitimate. The philanthropic sector has led the way in exploring new ways to open up the process of judging grant applications to participants from outside the awarding entity.23 Beginning in 2007, the Case Foundation involved the public in every aspect of decision making in connection with its Make it Your Own Awards, from determining grant guidelines and judging criteria to voting.24 Although judges worked behind closed doors to winnow the 4,600 applications down to a top 20, the public was then invited to vote for the final winners. More than 15,000 people participated. 
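None of the funders described above publish a single formula for balancing expert judgment against public input, so the sketch below is purely illustrative: a minimal, hypothetical way a grantmaker might blend closed-door panel scores with public votes when ranking a shortlist. The weighting, field names, and data are assumptions, not any foundation's actual method.

```python
from dataclasses import dataclass

@dataclass
class Finalist:
    name: str
    reviewer_scores: list[float]  # panel scores on an assumed 0-10 scale
    public_votes: int             # votes cast by the public

def rank_finalists(finalists, vote_weight=0.3):
    """Blend the average panel score with the share of public votes.

    vote_weight is the fraction of the combined score driven by the public
    vote; the remainder comes from the expert panel. Both channels are
    normalized to 0-1 so neither one swamps the other.
    """
    total_votes = sum(f.public_votes for f in finalists) or 1
    max_score = 10.0  # assumed top of the reviewer scale

    def combined(f):
        panel = (sum(f.reviewer_scores) / len(f.reviewer_scores)) / max_score
        popular = f.public_votes / total_votes
        return (1 - vote_weight) * panel + vote_weight * popular

    return sorted(finalists, key=combined, reverse=True)

# Illustrative data only
shortlist = [
    Finalist("Community garden network", [8.5, 7.0, 9.0], public_votes=4200),
    Finalist("Mobile health clinic", [9.0, 8.5, 8.0], public_votes=2600),
    Finalist("After-school coding lab", [7.5, 8.0, 7.0], public_votes=8100),
]

for finalist in rank_finalists(shortlist):
    print(finalist.name)
```

Capping the weight given to public votes in this way is one possible response to the "popularity" risk flagged later in this section, since name recognition alone cannot overturn a large gap in panel scores.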
Participatory innovations like these offer grantmakers the opportunity not only to make better decisions by broadening the sources of knowledge and expertise brought to bear, but also to build relationships with the communities they serve by involving them directly in the process. It is worth noting, however, that safeguards should be put in place to ensure that finalists do not lobby for votes — a concern held by many regarding such a system. The Wikimedia Foundation, for example, with a grantmaking budget of over $2 million, integrates community input throughout the lifecycle of proposals and awards. As they explain, "In the same way that Wikipedia articles are born and grown on a public platform through the collaboration of a global community, so too are our grant proposals workshopped and reviewed on public wikis, as well as improved by volunteer editors."25 Wikimedia's model offers a powerful example of an "open peer review" alternative to traditional closed models of judging. A report26 by philanthropic consultancy The Lafayette Practice (commissioned by Wikimedia itself to evaluate its grantmaking practices and compare them with those of its peers) documented the growth in "Participatory Grantmaking Funds" (PGFs) more broadly, including the Disability Rights Fund,27 the HIV Young Leaders Fund,28 and FRIDA — The Young Feminist Fund.29 The report found that "PGFs serve as a powerful intermediary between grassroots organizing and traditional and institutional donors, functioning as a learning hub for institutional donors and participants." In the United Kingdom, the cooperatively run Edge Fund has found success using participatory grantmaking specifically to bring marginalized communities directly into the grantmaking process.30 After receiving awards, grantees then have the opportunity to become part of the co-op, helping to reach out to potential applicants and eventually participate in future funding decisions. The Fund also invites other community members (beyond the grantees) to apply for membership in the co-op. As co-founder Sophie Pritchard writes, a unique advantage of the Edge Fund's collaborative approach is that "members scoring applications [that affect] their own community... [give] guidance to the rest of the members" who weigh in later.31 These alternative models in participatory grant assessment fall on a spectrum between the traditional closed judging approach and the wide-open wiki-based process. The White House Social Innovation Fund, another such example, outsources the awarding of grants for social innovators to a handful of organizations with a successful track record of social innovation.32 By giving grants to the grantmakers, the Social Innovation Fund diversifies access to innovative proposals and applicants. In an alternative version of this approach, The Other Foundation,33 a South Africa–based LGBT rights organization, used small teams of distributed peer reviewers — under the guidance of foundation board members — to vet applications and decide on awards in its inaugural year of grantmaking.34 The public nominated reviewers from across six countries to assess 114 pending funding applications. The organization then chose 12 peer reviewers, including academics, activists, health practitioners, and representatives from other nonprofits.
As part of the process of conducting their evaluations, these peer reviewers had the chance to meet each other in person, agree on funding priorities, and develop a relevant theory of change.35 As the examples of the Case Foundation and The Other Foundation demonstrate, it is possible to combine closed with open and carefully curate the sources and channels of outside input. WHY DO IT l Smarter judging: Collaborative judging processes can bring to bear a wider range of knowledge and expertise, e.g., regarding what sorts of funded projects have or haven’t worked in the past. l Community: Open judging can also help funders build relationships with the commu- nities they serve by involving them directly in the process. l Legitimacy: The transparency provided by a more open judging process can help build public confidence in the grantmaking body and assuage concerns about corruption, cronyism, or bias. l Skill building: When members of the public are given the opportunity to weigh in on grant opportunities, they stand to gain new knowledge not only about the issue addressed by the grant, but also regarding philanthropic decision-making processes. WHY NOT DO IT l Confidentiality: Where confidentiality of applicants or their submission materials is an issue, more traditional judging processes may be more appropriate. However, grant- making bodies can also pursue a “middle ground” with special safeguards relevant to these concerns, e.g., where applicants would know in advance the limited circle of peer reviewers who, exclusively, would have access to their materials. l Timing: If a very fast turnaround is a priority, a wider circle of judges (be they busy peer reviewers or members of the crowd) may slow down the process excessively. l Popularity: Participatory judging results could be skewed in cases where a popular organi- zation with a high level of name recognition is competing against smaller entities. 12 GRANTCRAFT, A SERVICE OF FOUNDATION CENTER Innovations in Granting: Evidence-Based Grantmaking A Little Evidence, A Little Money; A Lot of Evidence, A Lot of Money Greater openness in grantmaking processes has the potential to lead to the availability of more and better evidence, which, in turn, could enable funders to use data to help steer money toward interventions that have already been proven to create economic and scientific value. One example of this technique involves giving more money where there is more evidence and giving smaller amounts to riskier and more entrepreneurial endeavors. Traditionally, grantmaking organizations have had to rely in large part on the text of a grant application and the submitting organization’s reputation when facing funding decisions, leading to the frequent practice of funding those who have been previously funded. 
In recent years, however, government and private funders have been experimenting with more evidence-based grantmaking strategies as a way to ensure greater impact but also to open the field to new applicants.36 To inform this burgeoning practice, America Achieves and The Bridgespan Group, for instance, collaborated on a report seeking to identify best practices for evidence-based funding at the city level based on interviews with dozens of practitioners.37 The trend toward evidence-based grantmaking is part of a larger movement, enabled by better tools for managing data, toward evidence-based policymaking generally.38 The United States government is leading the way in this Pay for Success movement with new policies and over $100 million invested in such initiatives,39 including over $10.6 million allocated in 2016 for Pay for Success social innovation grants to be awarded by the White House Office of Social Innovation and Civic Participation to nonprofits and state and local governments trying to develop projects using data-driven decision making.40 The Laura and John Arnold Foundation has also prioritized Pay for Success by providing support to the Urban Institute's Pay for Success Initiative,41 among other investments.42 As the authors of Moneyball for Government write: "Building evidence about the practices, policies and programs that will achieve the most effective and efficient results so that policymakers can make better decisions; investing limited taxpayer dollars in practices, policies and programs that use data, evidence and evaluation to demonstrate they work; and directing funds away from practices, policies, and programs that consistently fail to achieve measurable outcomes."43 Part of Pay for Success is the idea of starting small and agile — and waiting for results before going big, rather than merely evaluating after the fact. This was the strategy behind the Department of Education's Investing in Innovation Fund (i3),44 which provides tiered grants contingent on the degree of demonstrated results. By dividing grantees into "development," "validation," and "scaling-up" stages, each with different maximum grant amounts, i3 helps advance the principle that better evidence should be a prerequisite for bigger grants.45 The tiered approach could also enable funders to use data to help steer money toward interventions that have already been proven to create economic and scientific value. It is notable that the creator of the Department of Education's i3 program — Jim Shelton, one of the leaders of this more evidence-based and entrepreneurial grantmaking movement — was tapped by the Facebook founder to lead the $45 billion Chan Zuckerberg Initiative.46 Some funders have recognized that small, entrepreneurial, early-stage investments can help generate the evidence to support later efforts to scale up. The John S. and James L. Knight Foundation's Prototype Fund, launched in 2012, gives small grants of up to $50,000 for innovators to "research, test core assumptions, and iterate before building out an entire project."47 Similarly, the J.M. Kaplan Fund's J.M.K.
Innovation Prize, launched in early 2015, awarded up to $50,000 annually for three years to 10 “high-risk, early stage ideas being piloted or prototyped by dynamic visionaries.”48 Starting on an even smaller scale, the Awesome Foundation gives $1,000 grants on a monthly basis to projects deemed “awesome” by a chapter of the foundation.49 Some funders have recognized that small, entrepreneurial, early-stage investments can help generate the evidence to support later efforts to scale up. In addition to the policy preference for informed decision making, new technology platforms like The Giving Common in Massachusetts, which help funders to collect and make sense of more data, are driving the movement toward evidence-based grantmaking. The platform invites nonprofits to “tell their story in their own words in an organized, detailed way” to provide potential donors with more comprehensible, uniform, and useful data on different entities.50 Prospective donors can then search by issue area, geography, and other variables. Other tools, like Foundation Center’s Foundation Maps,51 show who is funding what and where, so funders can connect with others who have supported a given organization and learn about the structure of other grants they have received. Mandates to collect more information coupled with the policy of openness and sharing what funders learn have the potential to lead to more innovation. WHY DO IT l Cost savings: In an era of limited government funding, evidence-based grantmaking can help funding bodies avoid duplication, ineffi- ciency, and waste. l Entrepreneurial innovation: So-called “tiered” grantmaking — where increasing amounts of evidence yield increasing amounts of funding — can help unproven projects scale up while providing incentives for sharing evidence as they progress. WHY NOT DO IT l When impact is less quantifiable: While evidence-based grantmaking has tremendous potential in domains where funders seek to effect quantifiable outcomes — e.g., health, safety, or learning outcomes — other areas, e.g., beautification projects or cultural offerings, may present challenges in defining or gathering measurable or easily compa- rable “evidence.” 14 GRANTCRAFT, A SERVICE OF FOUNDATION CENTER Innovations in Granting: Expert Networking Matching Experts to Opportunities The evolution of information retrieval technology and the large-scale availability of relevant data about people’s skills have made it possible to develop platforms that can automate the process of expressing, locating, and matching expertise within and across organizations. Such systems could help grantmaking bodies target potential judges (e.g., for peer review panels) and/or applicants based on their knowledge, experience, and expertise. Due to recent advances in information retrieval technology and the large-scale availability of digital traces of knowledge-related activities, it is possible to develop platforms that fully automate the process of expressing, locating, and matching expertise within and across organizations. Expert networking platforms such as LinkedIn — also called people search, expert discovery, expertise retrieval, expert finding, expert profiling, and e-expertise tools — entail software and associated algorithms that help to answer the question: who is an expert on a topic? Generally speaking, these tools incorporate profiles showcasing what people know. 
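Commercial expert-discovery systems rely on far richer profile schemas and ranking models than can be shown here. The minimal sketch below, with invented profile data and names, simply illustrates the core idea described above and below: indexing expertise profiles and matching them against an opportunity's topic keywords.

```python
# Minimal sketch of keyword-based expertise matching (illustrative only;
# real expert-discovery platforms use far richer profiles and ranking).

from collections import defaultdict

# Hypothetical profiles aggregated from CVs, publications, or referrals
profiles = {
    "Dr. Rivera": {"medical devices", "biostatistics", "regulatory science"},
    "Prof. Chen": {"soil science", "drought", "smallholder agriculture"},
    "A. Osei":    {"broadband", "digital literacy", "municipal networks"},
}

# Build an inverted index: topic -> people who list it
index = defaultdict(set)
for person, topics in profiles.items():
    for topic in topics:
        index[topic].add(person)

def match_experts(opportunity_topics, top_n=3):
    """Score each person by how many of the opportunity's topics they cover."""
    scores = defaultdict(int)
    for topic in opportunity_topics:
        for person in index.get(topic, ()):
            scores[person] += 1
    ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    return ranked[:top_n]

# e.g., recruiting peer reviewers for a medical-device grant call
print(match_experts({"medical devices", "regulatory science", "biostatistics"}))
```

In practice, most of the effort goes into assembling and cleaning the underlying profiles; once those exist, the matching step itself can remain quite simple.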
Drawing on a combination of data scraped from social and professional networking sites such as Twitter and LinkedIn, from public sources such as website profiles and publication records, and from profiles provided by users themselves or through referrals, these tools rely on rich schema for organizing data from myriad sources into easily searchable directories. Where a MacArthur Research Network tends to fund and mobilize groups of people it already knows (the rolodex approach) and prize-backed challenges throw open the invitation to participate and hope people come (the open call approach), expert networking tools target specific people with the right expertise and match them to opportunities to participate. LinkedIn itself gives nonprofits tools they can use to search its membership for board members with relevant knowledge and experience.52 Early adopters within government are also making use of such tools. The Food and Drug Administration, for example, is experimenting with the use of an expert network called Harvard Profiles to algorithmically match government employees to opportunities to serve on medical device regulatory review panels.53 The hope is that matching technology can accelerate the process of finding the right people to assess ever more complicated and cutting-edge inventions. Although they are not yet widely used in the grantmaking context, it is a small leap to imagine the application of such technologies of expertise to connect people with the appropriate know-how to opportunities to serve on peer review panels for grants, or even to target particular people to ensure they are aware of the opportunity to apply for particularly relevant grants. Tools and approaches for expert networking are still evolving, but the world is on the cusp of an expertise revolution, not just an information revolution.54 WHY DO IT l Stronger applicants: Automation technology can make it easier and more efficient for government agencies to connect with the strongest potential applicants, many of whom might not otherwise learn about the grant at all. l Stronger judges: Peer review panels work best when they draw from a strong and diverse group of relevant experts. Expert networking technology can augment human recruiting efforts to improve the quality of judging. WHY NOT DO IT l Leaving room for serendipity: When attracting an especially broad or diverse body of input into the judging process is a particular priority, grantmaking entities may not wish to use overly specific criteria in targeting invitations to participate. Nuts and Bolts: Finding Government Grants While there are many innovations in grantmaking, there are a few key points and best practices to keep in mind if you're looking for government grants, regardless of the type. GrantSpace, a free service of Foundation Center, shares extensive information about accessing government grants in its knowledge base, but here are a few highlights: l Federal funders generally prefer projects that serve as prototypes or models for others to replicate; local government funders require strong evidence of community support for a project. l The majority of government grants are awarded to eligible nonprofit organizations, not to individuals. l Government grants nearly always have stiff reporting requirements. Careful record keeping is a must, since an audit is always a possibility.
l Research funding opportunities thoroughly. Be sure to record details on the program itself, application guidelines, the timeline for submittal and notification, agency contacts, the review process, past grants awarded, and any other relevant information. Bookmark or follow the agencies you apply to so that you do not miss future funding opportunities. l Since government funding programs and priorities change frequently, it is a good idea to call or e-mail the appropriate agency contact person to obtain the most up-to-date information on funding guidelines and application procedures. l Government grant applications often have strict content and formatting guidelines. Be sure to follow any instructions closely, especially deadlines for submission. Innovations in Granting: Open Alternatives to Grants Crowdfunding, Micropayments, and Prize-Backed Challenges Through crowdfunding, micropayments, and prize-backed challenges, government can use its convening power to harness more broad-based sources of funds and collaborate with private sector partners to fund innovation in new ways and generate problem-solving ideas. Crowdfunding, whereby would-be grantees raise funds to support their projects from distributed donors, uses openness and collaboration to circumvent the centralized grantmaking process altogether.55 Sites like Kickstarter, Indiegogo, and others56 make the grants process radically participatory. Micropayment platforms such as Flattr distribute the process even further down the long tail by enabling large numbers of people to support philanthropic efforts with very small donations, rather than the larger contributions possible on crowdfunding sites. Microlenders, such as Kiva.org, similarly pool small-scale loans to entrepreneurs and students in poor communities. These models suggest ways that government agencies could use their convening power to harness more broad-based sources of funds, ideas, and judgment. Prize-backed challenges also present an alternative mechanism to fund innovation in lieu of traditional grants. In 2010, President Obama, in his Strategy for American Innovation,57 called on all U.S. government agencies to increase their use of prizes and challenges to address the most pressing problems facing the country. Subsequently, the Office of Management and Budget (OMB) issued a policy framework to guide agencies in using prizes to mobilize "American ingenuity."58 Taking up the call, the General Services Administration (GSA) launched Challenge.gov, a one-stop shop where entrepreneurs and citizen-solvers have had the chance to participate in over 400 of these public sector prize competitions run by a wide range of government agencies. NASA, in particular, has been a standard-bearer in the use of prize-backed challenges in the U.S. government. Successful challenges coordinated by NASA's Center of Excellence for Collaborative Innovation (CoECI)59 have generated useful, implementable means for improving the design of astronauts' gloves, noninvasively measuring intracranial pressure, and advancing repeated rocket travel to the moon.
In an article looking back on the use of challenges at the agency, NASA's Jenn Gustetic, Jason Crusan, Steve Rader, and Sam Ortega list diverse beneficial outcomes ranging from research advancement, education, and public outreach to advancing the state of the art, demonstrating proof of concept, and creating new aerospace vendors and companies.60 The philanthropic sector has also begun to explore prize-backed challenges. The American Society for the Prevention of Cruelty to Animals (ASPCA), for instance, used a $100,000 prize to challenge shelters to save more animal lives over a three-month period than they had over the same time period in the previous year. The impetus for this prize lay in a growing recognition that animal adoption had seemingly peaked and existing adoption-promotion techniques had not evolved sufficiently. Since its inception, the ASPCA's Rachael Ray $100K Challenge has not only saved over 280,000 animals from euthanasia, but has also led shelters to develop new and effective approaches to animal foster care, in which providers take temporary custody of animals and become responsible for finding them permanent homes.61 As this example shows, prize-backed challenges can both help generate innovative new ideas and inspire performance improvements among service providers. These sorts of challenges, in turn, rely on crowdsourcing to engage more people in supplying novel ideas to tackle a problem. Stefaan Verhulst and Andrew Young of the GovLab write that "prizes and challenges allow governments to establish ambitious goals without having to predict which individual, team or approach is most likely to succeed (thus reducing the riskiness of funding decisions at the outset), and to stimulate private-sector investment that is potentially much greater in value than the prize amount itself."62 However, the promise of prize-backed challenges over grants-incorporating-crowdsourcing comes not only from widening the pool of potential problem solvers, but also from the absence of statutory requirements. Crowdfunding, micropayments, and prize-backed challenges all open up the possibility for organizational funders to steer applicants toward these new platforms and processes as an alternative or supplement to institutionalized grantmaking (or procurement). But in the many contexts in which grantmaking is required by statute and subject to a defined statutory framework, the flexible and participatory techniques used in challenges could still be incorporated to attract more diverse and innovative solutions. WHY DO IT l Cost efficiency: Crowdsourcing could provide a "force multiplier" for government to advertise and attract outside "micro-sponsors" for its own grantee projects. This could be especially attractive in an era of funding constraints. l Community: When government requires community contributions to match its grantmaking efforts, it increases the odds that grant recipients are providing relevant services for their communities. WHY NOT DO IT l Legal constraints: Statutes may constrain the extent to which government can solicit or incorporate outside funds for its grantee projects.
● Legitimacy: Private sponsorship can erode, or be perceived to erode, the public-mindedness of grantmaking efforts and the projects they fund.

Innovations Post-Granting: Opening Data About Grants, Grantors, and Grantees
Amplifying Impact by Increasing Access

When others can easily discover and manipulate data about what activities are funded in a given region, population, or issue area, this has the potential to avoid duplication of investment, decrease fraud and abuse, enable better analysis of impact, and create a marketplace so that other funders can match funds or support nonwinning proposals.

When data about grants, grantors, grantees, and their grant-funded work product is made open and available to the public, the entire funding ecosystem benefits. Grantors get the chance to learn about what has already been funded and with what impact, helping guide future investment decisions. Grantees, in turn, can use demonstrated successes (and funding commitments) to attract follow-on funding. Future applicants can study past grant decisions to improve their understanding of a funder's priorities and patterns of investment. And citizens get the benefit of greater transparency about what their tax dollars are accomplishing.

While the United States does not yet have such a comprehensive system in place,63 the United Kingdom's commitment to opening the outputs of publicly funded research (because these outputs offer "significant social and economic benefits as well as aiding the development of new research") speaks to the potential of such an open system for grantmaking.64 For instance, although the federal government already makes grant calls available via grants.gov, there is no agreed practice for disclosing data about grants once awarded. Indeed, in 2010, the philanthropic world was in an uproar about whether it was appropriate for the White House to release the names of applicants or winners in connection with the awarding of $50 million in social innovation grants.65 There is no consensus yet on what kinds of grant data (applicants, awardees, impact reports, etc.) should be transparent, and therefore no parallel global movement, but various disclosure initiatives are proliferating in the public and philanthropic sectors.

Some federal agencies, such as the National Institutes of Health (NIH), are making information about grant awards available as raw, machine-readable open data via the data.gov open data portal. Each month, the agency publishes data about the type of projects funded, the area of research, the lead researchers, and their organizations. The Department of Education's Investing in Innovation (i3) program posts all relevant application materials, and also posts overview information after each closing date about the number of applications received and the list of applicants.66 When the program announces the highest-rated applicants each fall, these applicants' project narratives and technical review forms are posted on the website along with an overview document that discusses the i3 competitions for that year. Interestingly, the Department of Education's unusually open process allowed several large national foundations to conduct a parallel vetting process of i3 applicants to award their own separate grants.
These foundations adopted streamlined procedures for board approval of projects that had initially gone through the i3 process and set aside funds for applicants that made the cut. "In the end," comments Foundation Center, "large foundations were critical to the success of the i3 matching-grant requirement, but each individual organization followed its own guidelines and approval processes and made its grants directly to their grantees."67 As the i3 experience suggests, one potential benefit for grantees of making award data transparent is the ability to create a marketplace so that other funders can match funds or support nonwinning proposals. For funders, opening up an after-market decreases the costs of grant administration because they get to piggyback on an existing process.

Opening up data about grantmaking (including in the philanthropic sector) is also translating into improved understanding of the impact of such investments. Glasspockets, an initiative led by Foundation Center, champions greater philanthropic transparency by aggregating and sharing information on the financials, governance, grantmaking, and performance assessment of grantmaking organizations, and provides tools to help them become more transparent. Additionally, Glasspockets is home to the Reporting Commitment Initiative, in which 19 U.S. foundations currently aim to improve the quality of grant information by advancing transparency and open data. Grant information is reported at least quarterly (daily in some cases) by each foundation, coded for geographic focus, and made available on Glasspockets, which includes an interactive map of the data68 that illustrates the national and global reach of America's largest foundations. All data is completely open and can be downloaded through an API.69

In the public sector, the NIH and the National Science Foundation, both of which make data about the grants they award available as open data via data.gov, have jointly launched the STAR METRICS initiative.70 STAR METRICS is a first-of-its-kind effort to measure the concrete impact of federally funded scientific research on the economy and on downstream innovations. Hopefully, as these open data initiatives demonstrate their value for the various participants in the grantmaking process, the move toward greater transparency will gain even more traction throughout government and beyond.

WHY DO IT
● Improving grant effectiveness: Opening grantee data can allow resource-constrained agencies to draw on third-party analysis to measure impact, detect fraud, and more.
● Inviting collaborators: When an agency creates an "after-market" based on open grantee data, other grantmaking entities inside and outside government can more easily multiply the impact of limited grant funding by matching funds.
● Promoting greater learning: Open grantee data could give the public an expansive new learning resource of information on public issues and potential solutions.

WHY NOT DO IT
● Privacy: Where confidentiality of applicants or their submission materials is of special concern, limiting the openness of applicant and grantee data may be appropriate. This might not be a question of whether to open data at all, but of what to open and when.
● Lack of infrastructure: Opening data, though a simple concept, requires significant resource investment and back-end infrastructure. For open data mandates to be successful, significant investment must be made in an infrastructure for storing and releasing data.

Innovations Post-Granting: Standardizing Reporting
Improving the Clarity and Utility of Grant Reporting

In order to make open grantmaking data more useful, it is important to develop more uniform reporting standards for grantors and grantees alike.

There is a growing awareness that making sense of open grantmaking data (as described above) will also require better, more uniform data-reporting standards, such as unique identifiers for specific grantors, grantees, and projects. Achieving anything close to a unified reporting standard is easier said than done, though work toward that goal has, thankfully, begun within both the public and philanthropic sectors.

In 2014, President Obama signed the DATA Act,71 which mandates that the federal government standardize the reporting of federal spending data. The statute addresses how both grants and contracts are reported, and it empowers the White House Office of Management and Budget and the Department of the Treasury to set standards that apply to grants as well as contracts. The Act is aimed at standardizing all government data elements and, especially, creating a single unified open data set in which all government spending will be made accessible. Once fully implemented, the DATA Act will expand and improve USASpending.gov, a government website that makes public the value of grant awards but, importantly, does not actually track how much money is ultimately disbursed.72

On an international level, the Open Contracting Partnership is a global initiative working to secure international transparency commitments for all stages of government contracting. As an important component of its work, it has developed a standard to make published contracting data more uniform and usable.73 The Open Contracting Partnership is a compelling example of how, when data is released (as discussed in the previous section) in a standardized fashion, there is the potential to avoid duplication of investment, decrease fraud and abuse, and enable better analysis of impact.

The public sector, though, generally lags behind the nonprofit sector in adopting data-reporting standards for grants and evaluations. The International Aid Transparency Initiative (IATI), for instance, endeavors to systematize the reporting of aid and development spending data. As the initiative describes it, "Organizations implement IATI by publishing their aid information in IATI's agreed electronic format (XML) — usually on their website — before linking it to the IATI Registry. The Registry acts as an online catalogue and index of links to all of the raw data published to the IATI Standard."74 This model may prove useful as governments work to set standards for their own grant reporting.

A further challenge to greater transparency about grant funding (as with contracts) is the absence of a common taxonomy for describing the underlying entities and organizations themselves. (To put it differently: it's difficult to report on financial flows between agency A and organization B when there is no agreed-on way to name or describe agency A or organization B.)
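To make the idea of uniform, machine-readable grant reporting concrete, here is a minimal sketch of what a single standardized grant record might look like. It is illustrative only: the element names are loosely inspired by IATI's XML activity format rather than drawn from any actual standard, and the award, funder, and grantee identifiers are hypothetical placeholders for the kinds of unique IDs discussed below.

```python
# Illustrative sketch of a standardized, machine-readable grant record.
# Element names are loosely IATI-inspired; all identifiers are hypothetical.
import xml.etree.ElementTree as ET

def build_grant_record(grant):
    """Build a simplified XML element describing one grant award."""
    activity = ET.Element("grant-activity")

    # A persistent identifier for the award itself.
    ET.SubElement(activity, "grant-identifier").text = grant["award_id"]

    # Unique organization identifiers (an LEI- or BRIDGE-style ID, for example)
    # let analysts link records about the same funder or grantee across datasets.
    funder = ET.SubElement(activity, "funding-org", ref=grant["funder_id"])
    funder.text = grant["funder_name"]
    grantee = ET.SubElement(activity, "receiving-org", ref=grant["grantee_id"])
    grantee.text = grant["grantee_name"]

    ET.SubElement(activity, "title").text = grant["title"]
    ET.SubElement(activity, "amount", currency="USD").text = str(grant["amount"])
    ET.SubElement(activity, "geographic-focus").text = grant["geography"]
    return activity

if __name__ == "__main__":
    sample = {
        "award_id": "US-EXAMPLE-2016-0042",        # hypothetical award number
        "funder_id": "US-GOV-EXAMPLE-AGENCY",      # hypothetical funder ID
        "funder_name": "Example Federal Agency",
        "grantee_id": "BRIDGE-0000001",            # hypothetical entity ID
        "grantee_name": "Example Community Organization",
        "title": "Workforce training pilot",
        "amount": 250000,
        "geography": "Oregon, United States",
    }
    root = ET.Element("grant-activities")
    root.append(build_grant_record(sample))
    print(ET.tostring(root, encoding="unicode"))
```

Because every record uses the same elements and the same identifier scheme, a third party could aggregate thousands of such files from different funders and compare activities, amounts, and geographic focus without manual cleanup.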
There is already underway an international, multi-stakeholder effort to establish legal-entity identifiers (LEIs) in the financial services sector.75 The BRIDGE project, short for Basic Registry of Identified Global Entities and funded by the Gates Foundation, aims to assign a unique ID to every NGO in order to make open data about grants more easily analyzable.76 Although each nonprofit files a tax return with the IRS and registers in the states in which it operates, there is no plan at present to coordinate between these administrative authorities and the BRIDGE project. A similar effort aimed at disambiguating among entities, called ORCID, assigns unique identifiers to researchers. ORCID is intended to make it possible to track publications and their authors more accurately and to ensure that data about grantmaking translates into improved understanding of downstream impact.77 It is important to note that, although similar, LEI, ORCID, and BRIDGE are working in parallel and not in collaboration.

The sheer number of initiatives mentioned above makes clear that a critical mass of interest in data standardization exists. But coordinating among these various actors, and ensuring that they do not end up at cross-purposes, will be crucial.

WHY DO IT
● Signal, not noise: With standardized reporting categories, grantors and third parties will have a greater ability to compare activities, outputs, and impacts across different sorts of grants.
● Ease of reporting: With predefined categories, there is less guesswork for grantees in knowing how to report to funders.

WHY NOT DO IT
● Stifling creativity: Just as metrics cannot always be standardized, rigidly defined reporting categories can artificially constrain grantees and the activities they undertake.
● Compliance burden: Box-ticking exercises take time and resources away from core activities.

Innovations Post-Granting: Opening Access to Grant-Funded Work Product
Making Public the Fruits of Public Money

Increasing access to the work product developed as a result of a grant helps ensure that the public can benefit from the knowledge that grantees produce.

In contrast to opening data about grantmaking expenditures, there are also innovations focused on increasing access to the knowledge developed through grantmaking practices, to ensure that the public can benefit from and build on the knowledge that grantees produce.
Dr. John Holdren, Director of the Office of Science and Technology Policy (OSTP), promulgated a memorandum, Expanding Public Access to the Results of Federally Funded Research, in 2013 to advocate for the use of open-access policies throughout the federal government, directing "each federal agency with over $100 million in annual conduct of research and development expenditures to develop a plan to support increased public access to the results of research funded by the Federal Government."78 The Holdren memo was, in turn, inspired by the practice of the National Institutes of Health (NIH), which requires79 that all papers it funds that are accepted for publication be made publicly accessible, for free, on its PubMed Central archive.80

Taking a slower approach, the National Science Foundation (NSF) funded a one-year pilot study to encourage researchers to file a "data management plan" explaining how they would share their research data (or why they could not).81 NSF opted for a pilot so that it could hear from grantees about the challenges, such as unforeseen costs or privacy issues, that opening up the underlying datasets used in a research project would create. Now NSF is following suit and is headed toward open access across the board.82

Relatedly, the NIH also mandates that certain categories of clinical trials make results available through clinicaltrials.gov, and it can withhold funding if an applicable trial fails to do so. However, partly out of a concern that this NIH mandate was both overly narrow in scope and insufficiently enforced, a parallel advocacy campaign, AllTrials,83 is securing commitments from private companies and research institutions to publish the full results of clinical drug trials. AllTrials is also in discussions with hedge funds that invest in these companies to make funding contingent upon disclosure.

Outside the realm of scientific research, the Department of Labor's Trade Adjustment Assistance Community College and Career Training (TAACCCT) program requires grantees to make the educational training materials they develop with the Department's $2 billion in initial workforce training grants fully reusable under a Creative Commons license, including by commercial third parties.84 Many NGOs are also starting to adopt this model: Foundation Center's IssueLab, for instance, brings together case studies, evaluations, white papers, and issue briefs from a broad range of nonprofits to make the collective intelligence of the social sector more easily accessible.85

Defense Advanced Research Projects Agency (DARPA) Open Catalog87

Launched in 2014, DARPA's Open Catalog is in many ways the standard-bearer for opening access to data and other information from publicly funded grants. The Open Catalog is a "public web portal that organizes and shares the publicly releasable results of DARPA research in the form of software, peer-reviewed publications, data and experimental details."88 The platform was created to provide a mechanism for sharing the diversity of potentially useful information related to DARPA's many projects, including, notably, sharing across internal agency silos. Its name refers to its original form as a "card catalog" for DARPA-funded projects.
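To illustrate the general pattern (this is not DARPA's actual schema, which is described at the link in the endnotes), the sketch below shows the kind of minimal, structured entry an open catalog of grant-funded outputs might maintain; all field names and values are hypothetical.

```python
# Illustrative only: a minimal record structure for cataloging the publicly
# releasable outputs of funded projects. Not DARPA's actual Open Catalog schema.
from dataclasses import dataclass, field
from typing import List

@dataclass
class CatalogEntry:
    project: str                 # program or project that funded the output
    title: str                   # name of the software, paper, or dataset
    artifact_type: str           # e.g., "software", "publication", "dataset"
    link: str                    # where the artifact can be retrieved
    license: str                 # reuse terms, e.g., an open source or CC license
    keywords: List[str] = field(default_factory=list)

# The catalog itself is then just a collection of entries that can be
# filtered, searched, and republished as open data.
catalog = [
    CatalogEntry(
        project="Example Research Program",
        title="example-analysis-toolkit",
        artifact_type="software",
        link="https://example.org/code/example-analysis-toolkit",
        license="Apache-2.0",
        keywords=["data analysis", "open source"],
    ),
]

software_entries = [entry for entry in catalog if entry.artifact_type == "software"]
print(f"{len(software_entries)} software entries available for reuse")
```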
Chris White, the Open Catalog's program manager, described the platform's value proposition in a statement released at launch: "Making our open source catalog available increases the number of experts who can help quickly develop relevant software for the government. Our hope is that the computer science community will test and evaluate elements of our software and afterward adopt them as either standalone offerings or as components of their products."89

In addition to the potential impacts of opening DARPA's information to the public, the platform's creation was inspired by the belief that government-funded work product should naturally be made accessible to the public. While grantees are not required to open their data on the platform upon program completion, program managers attempt to articulate the value proposition of more openness (e.g., future uptake of findings and identification of potential collaborators), and an open publishing agreement is reached before the formal contract is signed.

As a result of what DARPA's David Bringle terms the "double-edged sword" of openness, DARPA has little insight into the downstream impacts of making information accessible on the Open Catalog. While there is confidence within the agency that the site is being put to use and is helping to establish collaborations, the agency has few means of tracking those impacts. According to Bringle, the two central challenges of maintaining the Open Catalog are: 1) gaining access to new content to be published on the site in a timely manner; and 2) keeping the content available on the site up to date. Given the difficulty of meaningfully identifying the platform's impacts, combined with the resource-intensive work of maintaining the site, replication in other government contexts will likely require a consistent funding stream (which the Open Catalog enjoys) or high-level policy requirements for hosting such a platform. Despite these challenges, the Open Catalog could provide a roadmap for other government agencies to work with grantees and improve the stock of useful data, evidence, and research available to those seeking to address public problems.

Sphaera, a cloud-based platform developed by a consortium of organizations including the Rockefeller Foundation and Oxfam America, is also aiming to help foundations and other organizations "serve up curated solutions that have been funded and implemented." The Sphaera Solutions Hub, currently in beta, offers users across sectors access to a "growing, global, living library of solutions to the hard problems of the 21st century."86

WHY USE IT
● Enhancing access: Open access broadens access to the knowledge produced with grant funding. By giving people the ability to scrutinize the underlying work product, open access can potentially accelerate additional advances in knowledge and new kinds of problem solving. In addition to enabling collaboration, it can enhance trust in the grantmaking process.
● Spurring innovation: Open access can promote third-party innovation by enabling newcomers to build upon the work product created by grantees.
● Magnifying impact: By opening up access to underlying work product, agencies can make their limited grantmaking dollars go further in advancing their missions.

WHY NOT TO USE IT
● IP incentives: There may be instances in which allowing grantees to retain certain intellectual property rights is important for attracting quality applicants.
● Absence of evidence: Little is yet understood about the circumstances under which open access best promotes innovation.
● Monetizing first-sale rights: In some grant competitions, investors will offer to fund second- and third-place entries not funded by the government, provided they can retain rights.

Conclusion and Reflection

With a few years of experience with new open and innovative grantmaking processes pre-, during, and post-award, it is now time to invest in more systematic and empirical review of agencies' progress, and in assessing which innovators are using the newly available techniques, and how, to solve public problems and advance the public good. Given that grantmaking accounts for half the federal budget, every presidential candidate, regardless of party, should be committed to ensuring that we do grantmaking in the best possible way and that we use the data we collect about which practices lead to better outcomes to enact policies that scale up these new ways of working.

Now, think about how you can use these ideas and examples to influence your own work, and how to move government grantmaking to increasingly innovative and strategic heights. Here, we offer some questions to reflect on the text and discuss with your colleagues:

● Which examples stood out to you most, and why?
● How can innovations in grantmaking move beyond anecdotal wins and pilot projects to become more ingrained in the business of disseminating public money?
● What are the institutional, cultural, and legal barriers to more open and effective grantmaking processes?
● What institutional arrangements could be leveraged to increase the openness of the grantmaking process?
● What innovations in grantmaking did we miss in this publication, and what new grantmaking innovations are on the horizon?

ENDNOTES
1. grants.gov/web/grants/learn-grants/grants-101.html
2. casefoundation.org/resource/promoting-innovation
3. news.harvard.edu/gazette/story/2010/02/open-innovation-challenge-seeks-solutions-to-type-1-diabetes
4. thegovlab.org/innovating-the-innovation-process-re-imagining-university-research
5. hms.harvard.edu/news/harvard-medicine/ideation-challenge-diabetes
6. innocentive.com/ar/challenge/9933073
7. nsf.gov/pubs/2013/nsf13035/nsf13035.jsp
8. nsf.gov/news/news_summ.jsp?cntn_id=128546
9. govtech.com/wireless/Like-Matchcom-Feds-Launch-BroadbandMatch.html
10. whitehouse.gov/open
11. ntia.doc.gov/blog/2010/connecting-broadbandmatch
12. northatlantic-islands.com/nata-grants/project-matchmaking.html
13. c40.org
14. Examples in this section are drawn from Beth Simone Noveck, Smart Citizens, Smarter State: The Technologies of Expertise and the Future of Governing (Harvard University Press, 2015), chap. 1.
15. econlib.org/library/Essays/hykKnw1.html
16. galaxyzoo.org/?utm_source=Zooniverse%20Home&utm_medium=Web&utm_campaign=Homepage%20Catalogue
17. publiclab.org
18. grassrootsmapping.org/tag/lima
19. grassrootsmapping.org/gulf-oil-spill
20. theguardian.com/global-development-professionals-network/2015/feb/11/open-data-how-mobile-phones-saved-bananas-from-bacterial-wilt-in-uganda
21. www3.epa.gov/region02/citizenscience/grants.html
22. irvine.org/arts/who-we-fund/exploring-engagement-fund
23. huffingtonpost.com/jane-wales/if-you-want-an-answer-ask_b_742111.html
24. casefoundation.org/resource/citizen-centered-solutions
25. blog.wikimedia.org/2015/02/19/wmf-largest-participatory-grantmaking
26. thelafayettepractice.com/reports/whodecides
27. disabilityrightsfund.org
28. hivyoungleadersfund.org
29. youngfeministfund.org
30. edgefund.org.uk
31. grantcraft.org/blog/building-transparency-into-the-operational-structure-of-a-new-fund
32. nationalservice.gov/programs/social-innovation-fund/our-model
33. theotherfoundation.org
34. africangrantmakersnetwork.org/Documents/Report-Pilot-Peer-Review-Grant-Making-Initiative.pdf
35. atlanticphilanthropies.org/app/uploads/2015/09/Report-Pilot-Peer-Review-Grant-Making-Initiative.pdf
36. whitehouse.gov/sites/default/files/omb/memoranda/2012/m-12-14.pdf
37. bridgespan.org/Publications-and-Tools/Performance-Measurement/Geek-Cities-Data-Improves-Lives.aspx#.V4U3qFcsjNW
38. brookings.edu/research/books/2014/show-me-the-evidence
39. nationalservice.gov/build-your-capacity/grants/funding-opportunities/2016/social-innovation-fund-2016-pay-success-grant
40. whitehouse.gov/blog/2015/12/11/social-innovation-fund-launches-new-pay-success-grant-program
41. urban.org/urban-wire/introducing-pay-success-initiative-urban-institute
42. issuu.com/ljaf/docs/ljfa-issuu/3?e=14811366/31893762
43. moneyballforgov.com/moneyball-principles
44. ed.gov/news/press-releases/us-secretary-education-arne-duncan-announces-highest-rated-applications-investing-innovation-i3-2014-competition-during-visit-high-school-students-north-carolina
45. foundationcenter.org/grantmakers/eupdates/pdf/philp_foundationreview.pdf
46. fortune.com/2016/05/04/chan-zuckerberg-jim-shelton
47. knightfoundation.org/funding-initiatives/knight-prototype-fund
48. jmkfund.org/the-jmk-innovation-prize-2/
49. awesomefoundation.org
50. givingcommon.org/about
51. maps.foundationcenter.org
52. nonprofits.linkedin.com/find-board-members
53. thegovlab.org/static/files/smarterstate/HHS.pdf
54. For more about uses for expert networking technology in the public sector, please see the GovLab, Smarter State Case Studies, smarterstate.org.
55. crowdsnyustern.com
56. crowdfunding.com
57. whitehouse.gov/sites/default/files/uploads/InnovationStrategy.pdf
58. whitehouse.gov/sites/default/files/omb/assets/memoranda_2010/m10-11.pdf
59. nasa.gov/offices/COECI
60. sciencedirect.com/science/article/pii/S0265964615300072
61. grantcraft.org/case-studies/paws-up-for-philanthropic-prizes
62. medium.com/@sverhulst/governing-through-prizes-and-challenges-677f3ef861d1
63. ced.org/reports/single/the-future-of-taxpayer-funded-research-who-will-control-access-to-the-resul
64. rcuk.ac.uk/research/openaccess/policy
65. tacticalphilanthropy.com/2010/08/transparency-controversy-at-the-social-innovation-fund
66. www2.ed.gov/about/opengov-plan-v30.pdf
67. foundationcenter.org/grantmakers/eupdates/pdf/philp_foundationreview.pdf
68. glasspockets.org/philanthropy-in-focus/reporting-commitment-map
69. glasspockets.org/philanthropy-in-focus/reporting-commitment-api-and-querybuilder
70. starmetrics.nih.gov
71. datacoalition.org/issues/policy-agenda
72. Frank Landefeld, Jamie Yachera, Hudson Hollister, "The DATA Act: Vision & Value," July 2016, www.morganfranklin.com/insights/article/data-act-vision-value-research-paper
73. open-contracting.org/data-standard
74. aidtransparency.net/about
75. sifma.org/uploadedfiles/issues/technology_and_operations/legal_entity_identifier/lei-global-calls.pdf?n=07707
76. marketsforgood.org/bridge-to-somewhere-a-conversation-with-global-giving-guidestar-the-foundation-center-and-techsoup-global
77. orcid.org
78. whitehouse.gov/sites/default/files/microsites/ostp/ostp_public_access_memo_2013.pdf
79. publicaccess.nih.gov
80. ncbi.nlm.nih.gov/pmc
81. nsf.gov/eng/general/dmp.jsp
82. nsf.gov/about/budget/fy2014/pdf/45_fy2014.pdf
83. alltrials.net
84. creativecommons.org/tag/taaccct
85. issuelab.org
86. sphaera.world/what-we-do-1
87. Special thanks to DARPA's David Bringle for sharing his insights on the Open Catalog in an interview conducted on April 4, 2016.
88. darpa.mil/work-with-us/darpa-open-catalog
89. theverge.com/2014/2/4/5377492/darpa-publishes-all-its-open-source-code-in-one-place-open-catalog

ABOUT FOUNDATION CENTER
Established in 1956, Foundation Center is the leading source of information about philanthropy worldwide. Through data, analysis, and training, it connects people who want to change the world to the resources they need to succeed. Foundation Center maintains the most comprehensive database on U.S. and, increasingly, global grantmakers and their grants: a robust, accessible knowledge bank for the sector. It also operates research, education, and training programs designed to advance knowledge of philanthropy at every level. Thousands of people visit Foundation Center's website each day and are served in its five library/learning centers and at more than 450 Funding Information Network locations nationwide and around the world.

ABOUT GOVLAB
The GovLab's mission is to improve people's lives by changing the way we govern. Our goal is to strengthen the ability of institutions (including but not limited to governments) and people to work more openly, collaboratively, effectively, and legitimately to make better decisions and solve public problems.

For additional guides and other materials in the GrantCraft series, see grantcraft.org