[Proposal] - State of Web3 Grants Report

Summary (TL;DR):

The goal of this project is to review grant programs in the web3 space and produce a report on their current state. This will also entail looking at grant programs outside of web3 to better understand best practices. Finally, the report will focus on best practices for decentralizing grant programs, specifically tailored to Gitcoin.

I discussed this proposal with @azeem @Sov @Viriya and am looking forward to wider feedback. Let me know if I can add any other info / answer any questions.

Abstract:

This project would entail: producing a list of active grant programs as of Q1 2023; conducting interviews and running surveys both with those who structured/ran the programs and with those who applied for/received grants from them; and writing a report on the current state of grants in web3.

Grants programs to assess will include: Aave, Algorand, Aragon, BitDAO, Cardano, Celo, Chainlink, Consensys, Ethereum Foundation, NEAR, Ocean Protocol, Optimism, PolygonDAO, Protocol Labs, Radicle, Safe DAO, Solana, Uniswap, and Web3 Foundation, amongst others. The first step in expanding and refining this list of grant programs will be to split the programs into a tiered structure based on size of treasury, size of grant issuance, age of the program, and/or kind of organization.
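To make the tiering step concrete, here is a minimal sketch of how programs could be bucketed. The fields mirror the four criteria above, but the thresholds, tier count, and names are hypothetical placeholders for illustration, not figures from this proposal.

```python
from dataclasses import dataclass

@dataclass
class GrantProgram:
    name: str
    treasury_usd: float         # size of treasury
    annual_issuance_usd: float  # size of grant issuance
    age_years: float            # age of the program
    org_kind: str               # e.g. "foundation", "DAO", "company"

def assign_tier(p: GrantProgram) -> int:
    """Bucket a program into tier 1 (largest/most established) through 3.
    Thresholds are illustrative placeholders only."""
    if p.treasury_usd >= 1e9 or p.annual_issuance_usd >= 5e7:
        return 1
    if p.treasury_usd >= 1e8 or p.age_years >= 3:
        return 2
    return 3

example = GrantProgram("ExampleDAO", 2e8, 1e7, 4, "DAO")
print(assign_tier(example))  # → 2
```

The actual cut-offs would come out of milestone 1, once the program list and treasury/issuance data have been gathered.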

Questions to explore and compare will include: volume of grants (quantity, funds), team size, review process, number of reviewers, post-issuance process, general operations structure, focus of grants, as well as the user experience of applying for and receiving the grants. It is important to explore the user journey as part of this research to better understand the perceived value and impact from the perspective of those applying for grants.

In order to gain the right level of insight, a series of surveys and interviews will be conducted with at least 50-75 individuals. These interviews will explore how the grant programs were created, structured, and managed, and what kind of impact they have had.

In addition to this information, larger scale surveys and data gathering will take place in order to conduct some analysis across grant programs and to provide some learnings for the space. A report will then be written, bringing together the results of the surveys and analysis, along with any other potential suggestions for how to improve the state of grants in web3. This report will provide a history of grant programs in web3, highlight common and best practices, share any data that was collected, and provide both open questions and potential solutions.

Benefits & Motivation:

Grant programs play an interesting role in the web3 space. At their core, most programs are intended to support the overall growth of the specific ecosystem they exist within (e.g. the Ethereum Foundation’s grant program is meant to grow Ethereum broadly, Aave’s program is meant to grow Aave, etc.). That’s not to say these grants are myopically focused on short-term ecosystem growth at the expense of long-term success. Quite the contrary: some programs are inspired by foundation or other long-term-focused funding/giving models. However, many programs have struggled with various elements, such as what to focus grants on, who should review grants, and how to best assess grants once issued.

Especially given the bear market, just about every grant program is evaluating its progress over the last few years. Some programs are outright pausing / indefinitely stopping while others are trying to get a better sense of what impact they want to have with their capital allocation and how to measure said impact.

The results of this report will help all grant programs in the space more clearly understand the current state of affairs. The report will also include suggestions for improvements broadly, as well as specific learnings tailored for Gitcoin.

Strategic fit:

This project is aligned with Gitcoin’s desire to better understand the landscape of grant programs, to learn from non-web3 grant programs as far as best practices go, and to think about how Gitcoin can best support decentralized grant funding.

Scope:

The scope of this project will be to:

  • Better understand the current state of grant programs in web3 and how they are changing
  • Learn from non-web3 grant programs (foundation, government, research funding, arts funding, etc.)
  • Glean insights on the user experience of going through web3 grant programs

Below is a list of elements to explore for each grant program. This is not an exhaustive list and is intended to give a general sense of the specific questions that will be asked. The surveys are currently being developed and more detail can be provided as needed.

  • History of grant programs and how they came together
    • When was the program first conceived, and by whom?
    • What examples were looked to when structuring the grant program?
    • What was the desired impact?
    • Was there a grant lead and who decided on that person?
    • How was the initial grant budget decided?
    • When was the grant program ‘launched’? What did launch entail? When was the first grant application received?
    • Who reviewed the first grant(s)?
  • Nature of the operations of grant programs
    • How many grants issued to date (including size and focus, ideally)
    • How many people are on the grants team, both part-time and full-time? If there’s no dedicated team, how many people support the program?
    • How many people review each grant? How do you source these reviewers?
    • Is there a component of community review / voting?
    • Are grants decisions on-chain? If yes, does that trigger payout or signal for someone else to payout? If the latter, who has control over the treasury?
    • Average and range of response time from the time the grant was first submitted.
    • What kind of interactions take place with grantees once grants have been issued?
    • Is there any kind of follow up on impact / achievement of milestones? If yes, what does that look like?
  • Impact of grant programs
    • Is there a measure of impact already used in the ecosystem?
    • Is there some overly simplistic breakdown of ‘was this grant impactful or not’?
    • How many were product / development focused vs. other?
  • Experience going through grant programs
    • How many grants did you apply to across how many ecosystems?
    • How hard was it to find grants?
    • How hard was it to apply?
    • How long did it take to hear back?
    • How much communication and feedback did you get?
    • How much did you receive in funding?
  • Select non-web3 grant programs to explore
    • Explore the grant programs that are being run by Experiment Foundation, Speculative Technologies, Artizen Fund, Big Green DAO, etc.
    • How many grants issued to date (including size and focus, ideally)
    • How many people are on the grants team, both part-time and full-time? If there’s no dedicated team, how many people support the program?
    • How many people review each grant? How do you source these reviewers?
    • Is there a component of community review / voting?
    • How is impact assessed?
  • How grant programs in web3 compare to overall grant programs
    • Landscape of funding that research orgs apply for
  • How can grant programs improve
    • Synthesize best practices to be shared publicly

Risk assessment:

Increased transparency across grants programs will be a net positive for the space overall, though specific programs might see positive or negative outcomes depending on the rigor and logic of their individual programs.

The main risks with this project arise from the operations and execution of the project itself. There are no known systemic risks that can result from going forward with this project.

Budget:

The main costs for this project will be labor, with some additional costs around travel, equipment rental, and marketing related expenses. The roles required for this project include, at a minimum:

  • Lead researcher(s)
  • Research assistants

We are starting out by seeking a total budget of $50,000. This will cover the labor hours for a general survey of grant programs as well as a more comprehensive deep dive into a few programs. If it makes sense, additional funding can be sought beyond GCP.

Proposed fund distribution for this project would be on a monthly basis and would roughly correspond to:

  • First distribution: $10k USD equivalent upon approval to support through milestones 1 & 2 during the first month
  • Second distribution: $10k USD equivalent for month 2, expectation of wrapping up milestones 1 and 2 and commencing milestone 3
  • Third distribution: $10k USD equivalent for month 3, expectation of getting 75+ survey responses and conducting 20 interviews, initial assessment of data and blog post
  • Fourth distribution: $10k USD equivalent for month 4, expectation of getting to 200+ survey responses and 50 interviews, assessment of more complete dataset, second blog post
  • Fifth distribution: $10k USD equivalent in month 5 or 6 upon release of the report
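As a sanity check on these numbers, the schedule amounts to five equal monthly payments, which also explains the $6k milestone figure in the conditional-funding vote option below. A minimal sketch (the function name is illustrative, not part of the proposal):

```python
def milestone_payments(total_budget: float, n_milestones: int = 5) -> list[float]:
    """Split a total budget evenly across milestone distributions."""
    return [total_budget / n_milestones] * n_milestones

print(milestone_payments(50_000))  # full funding: five $10k distributions
print(milestone_payments(30_000))  # conditional option: five $6k distributions
```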

Timeline:

The first milestone revolves around mapping grant programs in the space and providing a clear way for others to learn about active grant programs, with more insight than existing tools such as Grantfarm (Blockworks’ Crypto Grants Directory) and https://wiki.defillama.com/wiki/LlamaoGrants. This would involve proposing a tier-based breakdown of grant programs. Progress has already started on this milestone, and the first version would be generated within two weeks of approval.

The second milestone will revolve around developing two surveys: one for those running/building grant programs and one for those who have applied to grant programs. Progress has already started on this milestone, and this version would be completed within four to six weeks of approval in order to allow for the initial draft, sharing with a wider community of practitioners and survey-design specialists, and making revisions.

The third milestone will involve deploying the aforementioned surveys to as many individuals as possible, with an initial focus on getting as many core contributors to major grant programs as possible to fill out the survey. This milestone will take at least two months, as it will also entail running 50+ hours of one-on-one interviews as well as deploying the survey at scale.

The fourth milestone will entail analyzing the data from the third milestone and synthesizing the results into a series of blog posts (at least two) and a longer report. This milestone will take at least two months, with one month overlapping with milestone 3 to explore some initial results and at least one month after to synthesize the data, produce a preliminary blog post, and work on the report. Updates will be provided if any delays occur with data collection in milestone 3 or with analysis or report generation in milestone 4.

Customer acceptance criteria:

Outcomes and deliverables:

  1. List of grant programs to assess
  2. Surveys to run
  3. Data gathered from surveys and interviews
  4. At least 2 blog posts and 1 report

Relevant metrics:

  • Number of grant programs assessed
  • Number of survey responses
  • Number of interviews (along with time of each interview)
  • Time it took to complete milestone 3
  • Time it took to complete milestone 4
  • Likert scale measure of satisfaction with the results
  • Number of grant programs that implement any learnings from the program
  • Specific points that the Gitcoin or Allo or other related projects/products take from the report and make internal adjustments as a result

Vote:


Yes = full support of the project as outlined with $50k in funding

Yes, conditional on other funding = full support of the project as outlined with $30k in funding (milestone payments go down to $6k) and the assumption that the remaining funds will be raised from other programs

No = full rejection of this proposal, whether due to the scope or funding amount

Abstain = neither support nor reject the proposal

7 Likes

Thanks for the thoughtful proposal and the awesome writing you’ve already done on the topic of grants @eleventh19. Really enjoyed this read that Sov pointed me towards.

I deeply love the implicit focus here on the grantee experience, and would definitely be excited for a more holistic understanding of the web3 grants landscape right now.

I do want to push back against this risk assessment a bit, though -

My primary concern is about the precedent of a single grants program funding what’s meant to be an industry-wide, impartial audit of ongoing grants programs (of which the Gitcoin Grants Program is one, but notably not included). Would be curious to hear the community’s thinking around this as a more generalized topic, and whether we’re comfortable with the ethical precedents therein… (A prominent unethical example of such a precedent is pharmaceutical companies using pseudo-DAFs to fund medical research and silencing non-patentable outcomes, etc.; I’m sure this group is well-versed.)

Funding research that informs our product, frameworks, or core mechanisms seems well within scope since we are the stakeholders and consumers of this research. I wonder how public-facing research will be disseminated and published, however, and if there’s any reputational risk we want to address at the outset of a project intending to be so comprehensive. How closely will Gitcoin be associated with the results of the study? What publication methods do we foresee being used for the report? Are the blog posts for Gitcoin’s own blog?

I appreciate the presence of this option but don’t quite follow the math here. I wonder if we could consider co-funding in a way that equally distributes cost assumed to all funding partners… And I also wonder if there’s appetite here from other grants programs. This would definitely help, although not eliminate, some of the concern I raise above.

3 Likes

Interesting, I would be looking for Gitcoin to be included in the assessment.

Additionally, I would be interested in seeing the creation of a grant maturity matrix. Based on your research @eleventh19, I would be interested in using this assessment as a platform for building a quantifiable assessment which, based on some key measures, can be repeated 1-2x per year to show progress across the grant ecosystem. Is this possible to work into the report?

6 Likes

Appreciate the proposal and the detail that’s gone into it! I’ve only been working in the grants space for a short time, but one of the things that’s impressed me the most about it is that it’s so new! Best practices are very nascent and the history of the discipline is only about 18 months old. Given this, I’m not sure that a report of this scope is necessary to communicate the state of grants. I’m also not sure that interviewing 50-75 individuals will yield many more insights than interviewing 10 individuals, since many of them are interacting with the same programs and likely have quite similar experiences and themes.

I’d be supportive of this in a scoped down format that focuses on operations/best practices for grant program managers and the experiences of grantees in Web3. A budget of 5-10k and a 4 week turnaround might be appropriate for that scope. In order to be confident in the proposed output, I think it would also be helpful for the authors to present examples of previous research work and their research credentials.

5 Likes

I appreciate the thoughtful approach of this proposal and find its comprehensive examination of grant programs in the web3 space very informative. Moreover, drawing insights from non-web3 grant programs could yield valuable best practices that could guide the refinement of our efforts in Gitcoin (specifically with the development of Grants Stack and Allo).

As someone who conducts similar research, as evidenced in my blog, I understand the complexities and challenges of this kind of study. I attempted to develop a similar understanding of grant programs, but it proved challenging due to the difficulty of data collection, so I instead pivoted my research to individual programs. By better understanding grant programs’ operations, challenges, and successes, we can adapt and implement more efficient, transparent, and impactful strategies.

@meglister I understand your comments and concerns about the scope. From my experience, the nature of data in the web3 space is unique, often nonstandard, dispersed across various sources, and not always easily accessible or understandable. The proposed budget reflects the time-intensive process of gathering, collating, and interpreting this scattered data. I do think if we want to scale down the number of interviews, this could have a meaningful impact on the hours in the scope.

@eleventh19 could you provide your perspective on scaling down the number of interviews and how that might impact the proposed budget?

1 Like

Love the discussion that’s happening here – especially love @meglister’s comments as this work will directly tie into her work as a product manager for Grants Stack.

I’m also wondering what a slightly scoped-down, phased approach might look like.
I see you’ve listed interviews with at least 50-75 individuals. Will these interviews be done exclusively with grants program managers, or with grantees of those programs as well? I’m curious about this, as only 25 grants programs are listed in the assessment list. Given that logic, it makes sense to conduct 25 interviews max? Maybe I’m missing something?

As a thought experiment, I wonder about the timing of this and what would have to be true to tighten up the proposed timeline.

On a completely other note
I think it’s important to reiterate that this proposal isn’t only going to benefit market analysis work for Grants Stack but also (and mainly) intends to be packaged as a public good from which the entire ecosystem can learn and benefit (not just Gitcoin). I think it’s important to consider this proposal through both lenses 🙂

From a brand perspective, I think that funding research like this and creating educational public goods for the wider ecosystem can help us continue to position ourselves as thought leaders in the space. MMM intends to use the results of this report for brand marketing content that aims to better position Gitcoin in this way. Beyond that, based on the results of this research, we will create a plan to use it to bolster a narrative that Gitcoin is a leading provider of OS grants solutions for web3 and beyond.

3 Likes

Thanks for all of the thoughtful comments!

That’s an important concern and I appreciate you bringing it up. In terms of funding, to be totally transparent, and apologies if I forgot to mention this in the proposal, but I have already applied to a few grant programs and want to apply to a few others [Aave and ESP have rejected, waiting to hear from Solana, want to apply to Public Nouns, can set up some better reporting infrastructure and will share that]. The hope is that in making this a project that gets supported by multiple ecosystems, there can be enough funds to support the general research. I will be open with all parties about the funding size and scope of support.

There’s also the more general question of who funds research. Unfortunately, there aren’t any non-web3 foundation/government grants that cover this kind of work, at least none that I could find. I think trying to get a group of orgs to support this work is the most realistic path towards getting this research going.

At the same time, for programs that do provide larger-scale support, I see it as reasonable to split research from more consulting-style work and to include both in scope, drawing lines as clearly as possible between work happening for the public report and work happening specifically for an individual funder.

In regards to the pharma example, 100% agree that we want to avoid that. I think a huge problem in those scenarios is also a lack of transparency. All of the research work (interviews, surveys, list of grants, final report as outlined) will be shared. In addition to that, some of the budgeted time is meant to support work informed by the public research that is intended to be more applicable to the Gitcoin community. We can also separate out the Gitcoin specific activities more clearly and into separate outputs to be more distinct with the division of the activities.

I’ll address the idea of what a lower budget, faster turnaround version could look like below. Let me know if I can share any more info on it and sorry for not providing better logic around the smaller version in the initial proposal.

Yes, definitely want to include Gitcoin as part of the research as well.

This is an interesting idea and I want to think more about what this could look like in terms of research operations. One of the challenges I imagine would be around reporting (survey vs just publish certain metadata). Especially if it’s the latter, that might be related to a working group that Metagov, Gitcoin, Nouns, and some others are working on with some metadata standardization for grant programs.

I personally think that creating an assessment is beyond what I was initially imagining but happy to rescope if that’s of particular interest to the community. I also just want to be upfront that I’m not a data scientist (see my bio below) and am more confident with data analysis vs proposing and validating assessment tools. We can always look to supplement the team appropriately if that is a desired output.

Point taken, and I think it’s reasonable to rescope / do a first research output that can be produced on a shorter timeline. The idea of getting 50-75 interviews would be to talk to those who helped structure grant programs, those who are running them, those who went through them, and those working in grant programs outside of web3. I do think there is value in getting a larger set of interviews, as that would allow us to explore the history of these programs and how they changed, which itself can shed some insight on what it’s like to scale programs.

We can definitely explore a more pared-down version that would focus on a smaller deliverable. As you mentioned, we can start with a smaller set of interviews and a shorter time window for surveys and produce a report based on that. If we do 10-12 interviews, we can focus 6 interviews on individuals who were involved in building or operating multiple grant programs and 6 on those who applied to / were funded by multiple grant programs. In addition to that, we can run surveys for a week or so and incorporate that data. That can be synthesized into a report. We can also spend some time with Grants Stack team members to better understand the topics they’re most interested in and to think about how the research can inform relevant solutions. From there, I can propose a potential continuation if there’s interest from the Gitcoin community / other grant programs.

Sorry I forgot to include that in the proposal! The final team will depend on the size of the scope of the work that gets funded but I can provide the names of some potential collaborators if that’s helpful. Let me reach out to them to make sure they’re ok with my including them here as such and I’ll follow up on that.

In terms of my background - I’m currently the Head of Ops and Partnerships at Metagov, where I help lead our push on grant applications, both in web3 and beyond. I’m also the outgoing Executive Director at SCRF, where we issued grants, applied for grants, and explored scientific funding systems.

Prior to jumping into web3 full-time, I finished my master’s and then worked at CMU, where I helped apply for government and foundation funding and also helped run grant programs around blockchain and IoT research. Throughout my time there, I also helped research and launch a few podcasts (Heinz Radio first ~20 eps, Consequential first 3 seasons, History of Drugs in Society, On Meaning). You can check out my LinkedIn for more on my professional experience. This was a talk I gave on Decentralized Research Centers, an idea exploring the different networks needed to help accelerate web3 research. I admittedly haven’t done much dedicated research since grad school; this was an independent study thesis on the state of blockchain regulation in 2019 - link. Let me know if I can share any other info on my background.

Thanks for sharing your insights and I really appreciate the materials you’ve created! Let me know if there’s any other info I can share with the shorter timeline scenario I wrote out above.

In the full scope of the research, I think it would be good to talk to a few people who were active in the formation of many programs, to some current operators who may be newer to the space, and to some grantees, ideally ones who applied with different types of projects (e.g. dev vs community vs research). Additionally, I wanted to talk to at least a few people outside of web3 (large foundations, government grant-giving agencies, etc.). As mentioned above, definitely happy to rescope if something that can be achieved in 4-6 weeks works better to start.

Totally agree, and I do want to stress that the ultimate intention with this is to do research to better understand web3 grant programs in the interest of anyone who wants to learn from it. Thanks for calling that out!

3 Likes

@Viriya thank you for pointing out the brand perspective on educational public goods! Love that this could have multiple utilities.

@eleventh19 what an impressive background! Looking forward to connecting with you soon.

3 Likes