[Gitcoin 3.2] Accelerating GTC Value Capture

IMPORTANT - This is a draft direction for Gitcoin 3.0 => 3.2. Digest it, debate it, fork it, make changes. Nothing is official until ratified by @gitcoin governance. This post is for informational purposes only. Do not make any financial decisions based on this post.

Gitcoin 3.2: Accelerating GTC Value Capture

Building on Gitcoin 3.1’s TEV foundation to create the ultimate systematic alpha generation & capture machine

What is Gitcoin 3.2?

TLDR -

A Bittensor-inspired version of Gitcoin that evolves the best capital allocation tooling possible

Bittensor is a decentralized network that incentivizes development through a market-based system using its TAO token and Proof of Intelligence. Gitcoin could apply a similar model to capital allocation innovation by creating competitive subnets where novel funding mechanisms are tested, evaluated, and rewarded based on performance.

Gitcoin 3.0 creates Tokenizable Exportable Value (TEV), 3.1 enables TEV capture, and 3.2 creates TEV acceleration.

What is Bittensor?

Bittensor was recommended to us by Allo.Capital cofounder Juan Benet in February as one of the most innovative blockchain plays out there. It is a groundbreaking decentralized network that sits at the intersection of blockchain technology and artificial intelligence. It creates a peer-to-peer market for digital services (trading, compute networks) where participants can collaborate, train, share, and monetize intelligence.

The network operates through a unique consensus mechanism called Proof of Intelligence (PoI), which rewards participants based on the value of their contributions to the collective intelligence. Unlike traditional blockchain networks that use Proof of Work or Proof of Stake, Bittensor evaluates the quality and value of machine learning outputs.

At the core of Bittensor’s ecosystem is TAO, its native cryptocurrency, which serves multiple functions:

  • Rewarding miners who contribute computational resources and AI models
  • Compensating validators who evaluate model quality
  • Enabling users to access and extract information from the network
  • Facilitating governance through staking

Bittensor’s architecture is organized into subnets, specialized domains where miners contribute computational resources to solve specific computational tasks (defined by subnets) while validators evaluate their performance. This structure creates a competitive environment that drives continuous innovation and improvement in capabilities.

By democratizing access to development and creating an incentivized framework for collaboration, Bittensor accelerates the advancement of intelligence generating technology while ensuring that rewards are distributed based on the value contributed to the network.

How Gitcoin Could Create a Bittensor-style Arena for Capital Allocation Tools

Bittensor has successfully created an ecosystem where AI models compete and collaborate to continuously improve, with rewards distributed based on the value contributed. Gitcoin could apply this model to capital allocation by creating a similar competitive arena for evolving optimal capital allocation tools. Here’s how:

Proposed Implementation:


  1. Subnet Model for Capital Allocation
  • Create specialized subnets focused on different capital allocation strategies (e.g., grant distribution, investment evaluation, risk assessment)
  • Each subnet would contain miners developing and deploying novel capital allocation algorithms and validators evaluating their performance

  2. Proof of Allocation Intelligence (proof of AI)
  • Develop a consensus mechanism that evaluates capital allocation strategies based on predefined metrics (value flowed, ROI, distribution efficiency, community satisfaction).
  • A simple mechanism could be “proof of flow”.
  • Alternatively, have each subnet tokenize and judge success by how well that token performs.
  • Allocate rewards to strategies that consistently outperform others.
  • Allocate more rewards to projects that are outperforming AND agree to do GTC token swaps.
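As a rough illustration of what a “proof of flow”-style evaluator could look like, here is a minimal sketch. The metric names, weights, and the GTC-swap bonus multiplier are all assumptions invented for illustration, not a spec:

```python
# Hypothetical sketch of a "proof of flow"-style strategy evaluator.
# Metric names, weights, and the swap bonus are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class StrategyResult:
    name: str
    value_flowed: float   # normalized total capital routed to projects
    roi: float            # normalized return on allocated capital
    gini: float           # 0 = perfectly even distribution, 1 = concentrated
    did_gtc_swap: bool    # whether the project agreed to a GTC token swap

def score(r: StrategyResult, weights=(0.5, 0.3, 0.2)) -> float:
    """Weighted score; distribution efficiency rewards low concentration."""
    w_flow, w_roi, w_dist = weights
    base = w_flow * r.value_flowed + w_roi * r.roi + w_dist * (1 - r.gini)
    # Bonus multiplier for strategies that also agree to GTC swaps.
    return base * (1.2 if r.did_gtc_swap else 1.0)

results = [
    StrategyResult("quadratic_funding", 0.8, 0.4, 0.3, did_gtc_swap=True),
    StrategyResult("direct_grants", 0.9, 0.2, 0.6, did_gtc_swap=False),
]
ranked = sorted(results, key=score, reverse=True)
print([r.name for r in ranked])  # → ['quadratic_funding', 'direct_grants']
```

In practice the weights themselves would be a governance surface: the community could vote on how much “value flowed” counts versus distribution fairness.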

  3. Synthetic Capital Markets
  • Create simulated environments where allocation strategies can be tested using historical data or synthetic scenarios
  • Allow real-time competition between strategies to identify optimal approaches for different contexts
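One way such a synthetic environment could be sketched: freeze a historical snapshot of community funding data, let each strategy allocate blindly against it, then score against outcomes revealed after the freeze. The project data, signals, and strategies below are all made-up placeholders:

```python
# Hypothetical synthetic-market backtest: strategies allocate against a
# frozen data snapshot, then are scored on outcomes revealed afterwards.
# Project data and strategy logic here are illustrative assumptions.

# Snapshot: per-project community signals visible at allocation time.
snapshot = {"proj_a": {"donors": 120, "raised": 5000},
            "proj_b": {"donors": 15,  "raised": 9000},
            "proj_c": {"donors": 60,  "raised": 2000}}

# Outcome multipliers revealed only after the snapshot (unknown to strategies).
future_value = {"proj_a": 3.0, "proj_b": 0.5, "proj_c": 1.5}

def by_donors(snap, budget):
    """Allocate proportionally to unique donor count (a QF-like signal)."""
    total = sum(p["donors"] for p in snap.values())
    return {k: budget * p["donors"] / total for k, p in snap.items()}

def by_raised(snap, budget):
    """Allocate proportionally to capital already raised."""
    total = sum(p["raised"] for p in snap.values())
    return {k: budget * p["raised"] / total for k, p in snap.items()}

def backtest(strategy, snap, outcomes, budget=100.0):
    """Realized value of a strategy's allocation under later outcomes."""
    alloc = strategy(snap, budget)
    return sum(alloc[k] * outcomes[k] for k in alloc)

for strat in (by_donors, by_raised):
    print(strat.__name__, round(backtest(strat, snapshot, future_value), 2))
```

Running thousands of randomized snapshots like this would produce the strategy leaderboard described in the feedback, without putting real capital at risk.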

  4. Progressive Learning System
  • Enable successful strategies to build upon one another through a knowledge-sharing framework
  • Create incentives for continuous improvement and adaptation to changing market conditions
  • Identify friction points for builders and reduce the friction, enabling faster evolution.

  5. Real-world Implementation Track
  • Once strategies prove successful in simulated environments, provide pathways to deploy them with actual capital (perhaps via GG)
  • Create a feedback loop where real-world performance influences future development

Benefits:

  • Evolutionary Improvement: By creating competitive pressure between different capital allocation mechanisms, the system would naturally evolve toward increasingly effective strategies.
  • Context-aware Solutions: Different strategies could emerge for different scenarios (early-stage funding, mature project governance, emergency response).
  • Transparency and Trust: All allocation decisions would be traceable and explainable, increasing confidence in the system.
  • Community Governance: The community could vote on which metrics should be prioritized in evaluating allocation strategies.

The Strategic Evolution:

Gitcoin 3.0 established the foundation: a Network-First Funding Festival for Ethereum’s biggest problems through diverse allocation mechanisms. This arena creates competitive evolutionary pressure on builders to build the dopest crowdfunding and capital allocation technology out there. Access to this tech is alpha; it’s worth a lot. This is TEV.

Gitcoin 3.1 proved the thesis: Every breakthrough crypto project shows early signals in community funding data before markets recognize their value. We capture this TEV through tokenized products, institutional licensing, and managed funds—transforming public goods funding from cost center to profit center.

Gitcoin 3.2 accelerates the capability: By creating competitive evolutionary pressure on capital allocation mechanisms themselves, we don’t just capture alpha from funding outcomes—we systematically improve the signal quality of our entire TEV generation engine.

In summary, Gitcoin 3.1 established our ability to extract Tokenizable Exportable Value (TEV) from community funding metadata. Gitcoin 3.2 accelerates this capability by creating a Bittensor-inspired competitive arena where capital allocation mechanisms evolve under market pressure, dramatically amplifying our TEV generation while creating the most sophisticated capital allocation intelligence in crypto.

Conclusion

By applying Bittensor’s competitive intelligence model to the challenge of capital allocation, Gitcoin could create an arena where diverse strategies compete, collaborate, and continuously improve, ultimately developing more efficient and effective ways to distribute resources across the Web3 ecosystem.

Appendix A - The GTC Token gets new life

Potential Role of the GTC Token in This World

Launching new GTC token utility could provide specific advantages for a Bittensor-style arena:

  1. Incentive Alignment: A native token could directly incentivize participants who develop and improve capital allocation strategies, similar to how TAO rewards AI model contributions. (TAO is a top-50 token)
  2. Specialized Governance: A Bittensor-inspired GTC token could enable weighted voting on which capital allocation strategies should receive more resources and which metrics should be prioritized.
  3. Value Capture: The token could capture value generated by successful allocation strategies, distributing it to developers, validators, and other ecosystem participants.
  4. Network Effects: A token could help bootstrap the network by attracting early participants through token incentives.

GTC Token vs. TAO: A Comparison

If implemented, a GTC token with Bittensor-like utility would differ from TAO in several key ways:

| Feature | GTC Token | TAO (Bittensor) |
| --- | --- | --- |
| Primary Purpose | Incentivize capital allocation innovation | Incentivize AI model contribution |
| Value Metric | Capital efficiency, ROI, distribution fairness | Intelligence contribution quality |
| Economic Model | Could use a different supply model focused on sustainable funding | Fixed supply of 21 million with Bitcoin-like halvings |
| Staking Dynamics | Would likely stake to validate allocation strategies | Stakes to run validators that evaluate AI models |
| Target Participants | DeFi developers, economists, governance experts | AI/ML developers, compute providers |

Recommendation

For Gitcoin to successfully implement a Bittensor-style arena for capital allocation tools, a dedicated token is not strictly necessary but could provide advantages for network growth and alignment.
The most practical approach might be a hybrid model:

  • Begin without a token, leveraging existing infrastructure (as described in the 3.1 post)
  • Measure adoption and effectiveness of the arena.
  • If the system proves valuable and a token would enhance its utility, introduce new GTC token utility with carefully designed tokenomics.
  • Ensure the token has genuine utility beyond speculation, with mechanics that directly tie its value to the effectiveness of the allocation strategies it supports

Appendix B - Feedback on this proposal so far

Summary of all comments to date on a previous version of this proposal:

• Synthetic test-bed first – Several commenters (cerv1, carlb) like starting with “synthetic capital markets”: freeze a historical data snapshot, have agents allocate blindly, then fast-forward and score thousands of randomized runs to build a leaderboard of strategies that generalize. This lowers risk and lets researchers compare mechanisms side-by-side before real money is involved.

• Key design questions – Repeated feedback (Griff, thelastjosh, multiple anonymous posts) stresses that capital allocation is harder to judge than ML models. You must nail:
– Which metrics truly matter (value-flow, fairness, ROI, community satisfaction, long-term impact)
– How to blend objective data with subjective evaluations without Goodharting the metric
– Who produces/validates the evals and how to prevent gaming or validator fatigue.

• Evaluation scope & phasing – Consensus is to begin with a tiny number of subnets (2-4) and a single simple metric such as “total value flowed,” then iteratively add richer metrics once the system runs. Deep-funding style “indirection” (voting on mechanisms, not projects) and AI-assisted info compression are seen as promising ways to scale evaluation.

• Normalization & comparability – Mechanisms have very different input structures (direct grants, QF, retro funding). Commenters suggest normalizing for apples-to-apples comparisons and perhaps using Shapley values or counterfactual analysis to attribute impact.
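The Shapley-value suggestion above could be sketched as follows: treat each mechanism as a “player,” assign a value to every coalition of mechanisms, and credit each mechanism its average marginal contribution over all orderings. The coalition values below are made-up stand-ins for a real outcome evaluator:

```python
# Illustrative Shapley-value attribution across funding mechanisms.
# The coalition_value table is a hypothetical stand-in for a real evaluator.

from itertools import permutations

mechanisms = ["direct_grants", "quadratic_funding", "retro_funding"]

# Value achieved by each coalition of mechanisms (hypothetical numbers,
# superadditive to reflect mechanisms complementing each other).
coalition_value = {
    frozenset(): 0.0,
    frozenset({"direct_grants"}): 2.0,
    frozenset({"quadratic_funding"}): 3.0,
    frozenset({"retro_funding"}): 1.0,
    frozenset({"direct_grants", "quadratic_funding"}): 6.0,
    frozenset({"direct_grants", "retro_funding"}): 4.0,
    frozenset({"quadratic_funding", "retro_funding"}): 5.0,
    frozenset(mechanisms): 8.0,
}

def shapley(players, value):
    """Average marginal contribution of each player over all orderings."""
    shares = {p: 0.0 for p in players}
    orderings = list(permutations(players))
    for order in orderings:
        seen = frozenset()
        for p in order:
            shares[p] += value[seen | {p}] - value[seen]
            seen = seen | {p}
    return {p: s / len(orderings) for p, s in shares.items()}

shares = shapley(mechanisms, coalition_value)
# Efficiency property: shares sum to the grand-coalition value.
assert abs(sum(shares.values()) - coalition_value[frozenset(mechanisms)]) < 1e-9
print(shares)
```

The hard part in practice is the value function itself, which is exactly the evaluation problem the commenters flag; the Shapley machinery only distributes whatever credit that function defines.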

• Distribution edge for builders – A benefit pointed out by owocki: founders can focus on code while Allo Arenas supplies distribution and user flow, filling a common go-to-market gap.

• Token debate – Views range from “stick with GTC” (Griff) to a future token or even a two-token model. The prevailing advice: start tokenless or hybrid, gather data, then launch a token only if it clearly enhances incentives and governance. Legal and design ramifications of subnet-level tokens were flagged.

• Viability of evaluation - The more short-term and tightly scoped a grant allocation mechanism is, the easier it is to evaluate. For anything long-term it becomes incredibly hard: there is a ton of noise, and it becomes almost impossible to infer any causality.

• Open questions called out – How to create, update, and wind down subnets; where fees come from; current friction points for Allo builders; why deployment would route via Gitcoin Grants versus Allo Capital; concrete incentive examples; and what a knowledge-sharing framework looks like.

• Strategic framing – Private-chat feedback urges focusing on “missing primitives” rather than immediate market demand, keeping the system product-led, and avoiding metric over-fitting by allowing the weight-setting process to evolve as new data and metrics appear.

• Overall tone: strong enthusiasm for an evolutionary “arena” but consistent concern about evaluation methodology, governance overhead, and practical implementation details. Most contributors urge starting narrow, instrumenting heavily, and expanding only once simple pilots prove robust.

3 Likes

Reviewed. I agree with the recommendations based on the Appendix B feedback and my perspective.

When it comes to tokenomics, my POV is that we should simulate what pieces of the ecosystem are required for the entire flywheel (including getting payouts from successful alpha bets, which could take months/years) to start becoming regenerative.

Based on that, define what amount of capital we estimate needing to realize this vision, and in which buckets. If we have enough - or can organize/raise enough - for the unified vision, it can be a single token. If the demand for these elements is more diverse (certain stakeholders believe primarily in the value capture side of part X and not part Y), we should opt for a multi-token economy, and potentially merge things back once parts have proven to be valuable as part of the larger system.

2 Likes

Please don’t introduce another token and end up diluting GTC. Concentrate all energies into making GTC a worthwhile token first, with utility for builders AND investors. People have lost big money on GTC; they deserve some empathy.

2 Likes

No doubt if they introduce a new token it will end up like GTC; the only difference is that it will happen much faster.

Gitcoin is great at producing solutions for real needs, but it moves extremely slowly. It has also been using GTC as fuel rather than as a store of value. That is why people do not buy $GTC, even though it belongs to a useful and productive ecosystem and has only a 100M circulating supply. People see no hope in $GTC. If you don’t care about your own token, why would anyone else care? This is something the Gitcoin team should consider and think about.

1 Like

This direction really resonates with the kind of work I’m doing. I’m building a community-rooted recovery platform that uses tech and incentives to support people impacted by addiction, incarceration, and systemic neglect. It’s a real-world use case that benefits from capital coordination — but we’ve been flying under the radar because there’s no reliable track for builders like us to be seen and supported.

I’m excited by the idea of an arena that tests and rewards capital allocation strategies. If Gitcoin builds out these subnets and synthetic models, I hope there’s room to uplift strategies that serve grassroots, culturally grounded, and recovery-centered communities. We’re not just “early-stage,” we’re often excluded-stage — and systems like this could open new doors.

Please keep accessibility, mentorship, and real-world friction in mind as this evolves. There are builders out here doing high-impact work who are ready to participate — if the door is open.

3 Likes

Correct, however at the end of the day the people with maximum voting rights will rule the roost. They are trying to move to the next token while the first one is in the doldrums. I hope GTC can still be salvaged from here. 100 million is still a small supply and people will come flocking in this bull market IF THERE IS THE WILL FROM THE GITCOIN TEAM. Not that my voice even matters, but NO more tokens like Allo or something.

not sure why we are talking about the introduction of new tokens on a thread that is 100% about accelerating GTC value capture.

unless you are talking about subnet tokens? which (if introduced) are different in scope, similar to how TAO and TAO subnet tokens are different/complementary in the bittensor ecosystem. the idea here is that the subnet tokens represent the TEV of each subnet and would be partially owned by GTC

2 Likes

GTC was introduced as a valueless governance token in 2021. Then in 2021-2022 it seems like it was swept up in the bull market and given a very, very inflated valuation for a brief moment of time. I don’t see these specific circumstances happening again.

I think that showing up and just demanding that number go up doesn’t achieve anything.

Your sense of expectation that OTHERS should be doing more, or that others should be pumping up your bags: where does it come from? Were you promised something specific that wasn’t delivered? Or is it just an expectation, but not an agreement?

From expectations vs agreements:

  • Expectations are often unspoken, while agreements are usually explicit. When we have expectations, we don’t always communicate them to the other person; they are often what we think someone should do or be. This can lead to misunderstandings and disappointment. Agreements, on the other hand, are made explicit. This means that both parties know what is expected of them and can hold each other accountable.
  • Expectations can be unrealistic, often based on our hopes and dreams and set too high. Agreements are more realistic: based on what is possible and achievable, and on what both parties are willing to do (boundaries, anyone?!)
  • When we have expectations, they can be one-sided and we often focus on what we want from the other person. This can lead to resentment and anger. Agreements, on the other hand, are usually mutual. This means that both parties are making promises and commitments to each other.

IMO We need to turn more of these expectations into agreements…

To do that, we need more gov proposals that create agreements about resources, their uses, and how that impacts GTC. Doing the work is weaving the path forward: making intros to help, listening, reading, understanding, and providing feedback… This is doing the work of turning around the DAO.

Just driveby posting demanding that number go up IS NOT doing the work.

I want Gitcoin to be successful too. That’s why you see me engaging on these posts (and pretty much every post on Gitcoin’s path to success here 2021-2025, with the exception of my formal 15-month disaffiliation with the project around 2023).

1 Like

Appreciate the transparency in this thread. As someone new to Gitcoin but deeply aligned with the mission, I’ve been working on a decentralized recovery project for underserved communities:

🧠 [Daily Haven / Recovery Haven App]

We’re building a peer-led platform for anonymous daily check-ins, biometric kiosk access in detox centers, and future AI/DAO integrations for mental health and recovery.

This is what doing the work looks like for me—showing up, building, and learning. I don’t expect Gitcoin to pump my bags—I’m here to learn how to help strengthen the public goods movement and maybe one day contribute something that actually helps revive trust in $GTC and the mission behind it.

Thanks for holding space for these tough convos. Respect.