Summary (TL;DR):
The goal of this project is to review and produce a report on the current state of grant programs in the web3 space. This will also entail looking at grant programs outside of web3 to better understand best practices. Finally, the report will focus on best practices for decentralizing grant programs, specifically tailored for Gitcoin.
I discussed this proposal with @azeem, @Sov, and @Viriya and am looking forward to wider feedback. Let me know if I can add any other info / answer any questions.
Abstract:
This project would entail: producing a list of active grant programs as of Q1 2023; conducting interviews and surveys both with those who structured/ran the programs and with those who applied for/received grants from them; and writing a report on the current state of grants in web3.
Grant programs to assess will include: Aave, Algorand, Aragon, BitDAO, Cardano, Celo, Chainlink, Consensys, Ethereum Foundation, NEAR, Ocean Protocol, Optimism, PolygonDAO, Protocol Labs, Radicle, Safe DAO, Solana, Uniswap, and Web3 Foundation, amongst others. The first step in expanding and refining this list will be to split the programs into a tiered structure based on size of treasury, size of grant issuance, age of the program, and/or kind of organization.
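To illustrate the tiering idea, here is a minimal sketch in Python. The thresholds and field names are entirely hypothetical placeholders; the actual tiers and cutoffs will come out of the research itself.

```python
from dataclasses import dataclass

@dataclass
class GrantProgram:
    name: str
    treasury_usd: float         # size of treasury
    annual_issuance_usd: float  # size of grant issuance
    age_years: float            # age of the program
    org_kind: str               # e.g. "foundation", "DAO", "company"

def tier(p: GrantProgram) -> str:
    """Bucket a program into a rough tier. All thresholds here are
    placeholders for illustration, not the cutoffs the research will use."""
    if p.treasury_usd >= 1_000_000_000 or p.annual_issuance_usd >= 50_000_000:
        return "Tier 1"
    if p.treasury_usd >= 100_000_000 or p.annual_issuance_usd >= 5_000_000:
        return "Tier 2"
    return "Tier 3"

# Hypothetical example program
print(tier(GrantProgram("ExampleDAO", 250_000_000, 8_000_000, 2.5, "DAO")))  # Tier 2
```

The point of the sketch is simply that a small number of quantitative fields plus the kind of organization is enough to produce comparable cohorts, so that programs are only benchmarked against peers of similar scale and maturity.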
Questions to explore and compare will include: volume of grants (quantity, funds), team size, review process, number of reviewers, post-issuance process, general operations structure, focus of grants, as well as the user experience of applying for and receiving grants. It is important to explore the user journey as part of this research to better understand the perceived value and impact of the programs from the perspective of those applying for grants.
In order to gain the right level of insight, a series of surveys and interviews will be conducted with 50–75+ individuals. These interviews will explore how the grant programs were created, structured, and managed, and what kind of impact they have had.
In addition to this information, larger scale surveys and data gathering will take place in order to conduct some analysis across grant programs and to provide some learnings for the space. A report will then be written, bringing together the results of the surveys and analysis, along with any other potential suggestions for how to improve the state of grants in web3. This report will provide a history of grant programs in web3, highlight common and best practices, share any data that was collected, and provide both open questions and potential solutions.
Benefits & Motivation:
Grant programs play an interesting role in the web3 space. At their core, most programs are intended to support the overall growth of the specific ecosystem they exist within (e.g. the Ethereum Foundation's grant program is meant to grow Ethereum broadly, Aave's program is meant to grow Aave, etc.). That's not to say these grants are myopically focused on short-term ecosystem growth at the expense of long-term success. Quite the contrary: some programs are inspired by foundation or other long-term-focused funding/giving models. However, many programs have struggled with various elements, such as what to focus grants on, who should review grants, and how to best assess grants once issued.
Especially given the bear market, just about every grant program is evaluating its progress over the last few years. Some programs are outright pausing or indefinitely stopping, while others are trying to get a better sense of what impact they want to have with their capital allocation and how to measure said impact.
The results of this report will help all grant programs in the space more clearly understand the current state of affairs. The report will also include suggestions for improvements broadly, as well as specific learnings tailored for Gitcoin.
Strategic fit:
This project is aligned with Gitcoin's desire to better understand the landscape of grant programs, to learn from non-web3 grant programs as far as best practices go, and to think about how Gitcoin can best support decentralized grant funding.
Scope:
The scope of this project will be to:
- Better understand the current state of grant programs in web3 and how they are changing
- Learn from non-web3 grant programs (foundation, government, research funding, arts funding, etc.)
- Glean insights on the user experience of going through web3 grant programs
Below is a list of elements to explore for each grant program. This is not an exhaustive list and is intended to give a general sense of the specific questions that will be asked. The surveys are currently being developed and more detail can be provided as needed.
- History of grant programs and how they came together
  - When was the program first conceived, and by whom?
  - What examples were looked to when structuring the grant program?
  - What was the desired impact?
  - Was there a grant lead, and who decided on that person?
  - How was the initial grant budget decided?
  - When was the grant program "launched"? What did launch entail? When was the first grant application received?
  - Who reviewed the first grant(s)?
- Nature of the operations of grant programs
  - How many grants have been issued to date (including size and focus, ideally)?
  - How many people are on the grants team, both part-time and full-time? If there's no dedicated team, how many people support the program?
  - How many people review each grant? How do you source these reviewers?
  - Is there a component of community review / voting?
  - Are grant decisions on-chain? If yes, does that trigger payout or signal for someone else to pay out? If the latter, who has control over the treasury?
  - What are the average and range of response times from when a grant is first submitted?
  - What kind of interactions take place with grantees once grants have been issued?
  - Is there any kind of follow-up on impact / achievement of milestones? If yes, what does that look like?
- Impact of grant programs
  - Is there a measure of impact already used in the ecosystem?
  - Is there some overly simplistic breakdown of "was this grant impactful or not"?
  - How many grants were product / development focused vs. other?
- Experience going through grant programs
  - How many grants did you apply to, across how many ecosystems?
  - How hard was it to find grants?
  - How hard was it to apply?
  - How long did it take to hear back?
  - How much communication and feedback did you get?
  - How much did you receive in funding?
- Select non-web3 grant programs to explore
  - Explore the grant programs being run by Experiment Foundation, Speculative Technologies, Artizen Fund, Big Green DAO, etc.
  - How many grants have been issued to date (including size and focus, ideally)?
  - How many people are on the grants team, both part-time and full-time? If there's no dedicated team, how many people support the program?
  - How many people review each grant? How do you source these reviewers?
  - Is there a component of community review / voting?
  - How is impact assessed?
- How web3 grant programs compare to grant programs overall
  - Landscape of funding that research orgs apply for
- How can grant programs improve
  - Synthesize best practices to be shared publicly
Risk assessment:
Increased transparency across grant programs will be a net positive for the space overall, though individual programs may see positive or negative outcomes depending on the rigor and logic of their own processes.
The main risks with this project arise from the operations and execution of the project itself. There are no known systemic risks that can result from going forward with this project.
Budget:
The main costs for this project will be labor, with some additional costs around travel, equipment rental, and marketing related expenses. The roles required for this project include, at a minimum:
- Lead researcher(s)
- Research assistants
We are seeking a total budget of $50,000. This will cover the labor hours for a general survey of grant programs as well as a more comprehensive deep dive into a few programs. If it makes sense, additional funding can be sought beyond GCP.
Proposed fund distribution for this project would be on a monthly basis and would roughly correspond to:
- First distribution: $10k USD equivalent upon approval to support through milestones 1 & 2 during the first month
- Second distribution: $10k USD equivalent for month 2, expectation of wrapping up milestones 1 and 2 and commencing milestone 3
- Third distribution: $10k USD equivalent for month 3, expectation of getting 75+ survey responses and conducting 20 interviews, initial assessment of data and blog post
- Fourth distribution: $10k USD equivalent for month 4, expectation of getting to 200+ survey responses and 50 interviews, assessment of more complete dataset, second blog post
- Fifth distribution: $10k USD equivalent in month 5 or 6 upon release of the report
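For clarity on the arithmetic: the five monthly distributions of $10k USD equivalent sum to the $50k requested, and under the conditional funding option described in the Vote section below, five distributions of $6k would sum to $30k.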
Timeline:
The first milestone revolves around mapping grant programs in the space and providing a clear way for others to learn about active grant programs, with more insight than tools such as Grantfarm (Crypto Grants Directory - Blockworks) and https://wiki.defillama.com/wiki/LlamaoGrants. This will involve proposing a tier-based breakdown of grant programs. Progress has already started on this milestone, and a first version would be delivered within two weeks of approval.
The second milestone revolves around developing two surveys: one for those running/building grant programs and one for those who have applied to grant programs. Progress has already started on this milestone, and it would be completed within four to six weeks of approval in order to allow for an initial draft, sharing with a wider community of practitioners and survey design specialists, and revisions.
The third milestone will involve deploying the aforementioned surveys to as many individuals as possible, with an initial focus on getting core contributors to major grant programs to fill out the survey. This milestone will take at least two months, as it will also entail running 50+ hours of one-on-one interviews as well as deploying the survey at scale.
The fourth milestone will entail analyzing the data from the third milestone and synthesizing the results into a series of blog posts (at least one) and a longer report. This milestone will take at least two months, with one month overlapping with milestone 3 to explore some initial results and at least one month after to synthesize the data, produce a preliminary blog post, and work on the report. Updates will be provided if any delays occur with data collection in milestone 3 or with analysis or report generation in milestone 4.
Customer acceptance criteria:
Outcomes and deliverables:
- List of grant programs to assess
- Surveys to run
- Data gathered from surveys and interviews
- At least 2 blog posts and 1 report
Relevant metrics:
- Number of grant programs assessed
- Number of survey responses
- Number of interviews (along with time of each interview)
- Time it took to complete milestone 3
- Time it took to complete milestone 4
- Likert scale measure of satisfaction with the results
- Number of grant programs that implement any learnings from the report
- Specific points that Gitcoin, Allo, or other related projects/products take from the report, and internal adjustments made as a result
Vote:
Yes = full support of the project as outlined with $50k in funding
Yes, conditional on other funding = full support of the project as outlined with $30k in funding (milestone payments go down to $6k) and the assumption that the remaining funds will be raised from other programs
No = full rejection of this proposal, whether due to the scope or funding amount
Abstain = neither support nor reject the proposal