[Karma GAP] Grant updates

Hello!

We recently received a grant from Plurality Labs to build a decentralized grant evaluation platform. We will use this thread to keep the community updated on our progress.

Grant Details

Provide a summary of your intended program & the impact

Karma GAP is a protocol for tracking and ensuring accountability of community-issued grants. It enables grantees to set and update milestones, allowing communities to monitor progress and assess the impact of projects.

Our mission is to create a neutral system for evaluating funding, giving communities and grant programs the data needed to discern effective projects and decide on funding allocations.

Problem

Below are specific problems we would like to solve for the Arbitrum Grant ecosystem.

  1. Insufficient visibility of grant issuance, progress, and outcomes.
  2. Lack of data for community-driven decisions on grant category funding adjustments.
  3. No mechanism for grantees to establish a track record.
  4. Inadequate information to gauge the impact of specific projects or categories.
  5. The community is willing to engage but lacks a structured way to contribute.

How does your project meet one or more of our strategic priorities outlined above?

Our project aligns with the following strategic priorities:

  1. Achieve GOVERNANCE OPTIMIZATION by identifying and iteratively improving key capabilities to increase DAO performance and accountability.
  • Capital allocation is crucial for DAOs. Implementing GAP yields insights and transparency, leading to improved capital distribution choices.
  2. GROW THE COMMUNITY through awareness, participation, efficient inquiry handling, and bias reduction.
  • Creating a reviewer system to engage the community aims to boost DAO awareness and involvement, potentially enhancing other activities such as governance voting and delegation.

How do you plan to execute this program (Specification & Implementation)?

We intend to implement Karma GAP for Arbitrum DAO’s grantees to report updates, allowing community evaluation of the projects. All data will be on-chain on Arbitrum One.

Our solution addresses the outlined problems as follows:

  1. Visibility: Integration with Arbitrum One allows grantees to create projects, outline milestones for received grants, and post regular updates. The Arbitrum community gains the ability to monitor all grants and their progress (a rough sketch of the data involved follows this list).

  2. Data-Driven Decisions: We’re developing a feature for grantees and community members to assess grants on various metrics. This data, stored on-chain, will be available for public analysis.

  3. Grantee Reputation: A profile page will showcase grantees’ grants and their statuses, useful for future grant applications within the Arbitrum ecosystem or elsewhere.

  4. Impact Measurement: Projects will be categorized so the community can easily access evaluation outcomes by category.

  5. Community Contribution: A reviewer system will enable community members to assess grants matching their expertise. We will collaborate with Plurality Labs to establish measurable reviewing standards.
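
To make these concrete, here is a minimal sketch of the kind of records involved; every name below is an assumption made for this post, not GAP’s actual on-chain schema:

```typescript
// Illustrative shapes only -- field names are assumptions for this post,
// not Karma GAP's actual on-chain schema.
interface Milestone {
  title: string;
  description: string;
  dueDate: string;      // ISO 8601 date, e.g. "2023-12-08"
  completed: boolean;
}

interface GrantRecord {
  projectId: string;    // identifier of the grantee's project
  grantTitle: string;
  category: string;     // e.g. "tooling" -- categories to be defined with Plurality Labs
  milestones: Milestone[];
  updates: string[];    // progress updates posted by the grantee
}

interface Review {
  grantId: string;
  reviewer: string;                 // reviewer's wallet address
  scores: Record<string, number>;   // metric name -> score
  comment?: string;
}
```

As noted above, records like these will live on-chain on Arbitrum One, so anyone can read and analyze them.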

What are your Milestones?

Below are all the specifications broken down by milestone.

Milestone 1: Add support for Arbitrum One (2 weeks) and categorize grants for better evaluation (1 week) - Due Dec 8, 2023

Deploy smart contracts on Arbitrum One and update front-end and back-end systems for Arbitrum One compatibility. Post-Milestone 1, grantees can create projects and update grants, with all data stored on Arbitrum One.

In collaboration with Plurality Labs, we will define project categories and update our system to classify projects accordingly. Post-implementation, community members can browse grants by category.

Milestone 2: Feature for grantees to describe how their project should be evaluated (1 week), ability for anyone to evaluate projects (3 weeks), and integration with Open Source Observer (1 week) - Due Jan 12, 2024

To assess impact, projects must be evaluated using criteria set by those who know them best—the grantees. We will build a feature allowing grantees to specify their evaluation metrics, ensuring transparency as these will be publicly viewable. This information will be on-chain and retrievable via our SDK or an indexer for quick access.
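
As a rough illustration of the “retrievable via an indexer” part, the sketch below reads attestations for a grantee from a generic attestation indexer on Arbitrum One. The endpoint, schema ID, and filter shape are assumptions made for this example, not values published by GAP:

```typescript
// Rough sketch: reading on-chain evaluation data through an attestation indexer.
// The endpoint, schema ID, and filter shape below are assumptions for this example,
// not values published by Karma GAP.
const INDEXER_URL = "https://arbitrum.easscan.org/graphql"; // example indexer endpoint
const EVALUATION_SCHEMA_ID = "0x..."; // placeholder schema ID

async function fetchEvaluations(granteeAddress: string) {
  const query = `
    query ($schemaId: String!, $recipient: String!) {
      attestations(
        where: { schemaId: { equals: $schemaId }, recipient: { equals: $recipient } }
      ) {
        id
        attester
        data
        timeCreated
      }
    }`;

  const res = await fetch(INDEXER_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      query,
      variables: { schemaId: EVALUATION_SCHEMA_ID, recipient: granteeAddress },
    }),
  });

  const { data } = await res.json();
  return data.attestations; // decoding the `data` field depends on the schema definition
}
```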

Community members can declare their expertise, self-nominate as reviewers, and evaluate grants. A dedicated page will display each reviewer’s grant assessments, aiding in building their reputation. We will introduce measures to filter out Sybil (fraudulent) reviews. Recognizing the complexity of this issue, we will refine the reviewer system continually, guided by community input.
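
As one hypothetical illustration (not the mechanism we have committed to), reviews could be tallied only from reviewers who pass some verification check:

```typescript
// Hypothetical example of filtering reviews before they are tallied -- not the
// actual Sybil-resistance mechanism, which we plan to refine with community input.
interface ReviewerProfile {
  address: string;
  declaredExpertise: string[]; // e.g. ["developer tooling", "education"]
  verified: boolean;           // e.g. passed an attestation or proof-of-personhood check
}

interface GrantReview {
  grantId: string;
  reviewer: string; // reviewer's wallet address
  score: number;
}

function tallyableReviews(
  reviews: GrantReview[],
  profiles: Map<string, ReviewerProfile>
): GrantReview[] {
  // Naive rule for illustration: only count reviews from verified reviewers.
  return reviews.filter((r) => profiles.get(r.reviewer)?.verified === true);
}
```

The verification step itself could take several forms (attestations, on-chain history, community vetting); as noted above, we will refine this continually based on community input.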

We will integrate with Open Source Observer to include their analytics in the GAP application for projects linked to GitHub repositories, offering funders insight into the impact of open source contributions on ecosystem health.

How will the community validate impact?

The community can apply GAP’s evaluation framework to assess GAP itself using these metrics:

  1. The proportion of grantees regularly updating their grant progress.
  2. The count of reviewers analyzing different grants.
  3. The community’s ability to discern impactful grant categories and inform future funding decisions.

Success is marked by the community’s capacity to use these indicators to guide funding strategies; a rough sketch of how the first indicator could be computed follows.
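
Below is a toy calculation for the first indicator, using made-up data shapes purely for illustration (none of these names come from GAP):

```typescript
// Toy calculation for indicator 1: the share of grantees posting updates within a window.
// The shapes here are illustrative, not GAP's real data model.
interface GrantActivity {
  grantId: string;
  lastUpdate: Date | null; // most recent progress update posted by the grantee, if any
}

function updateRate(grants: GrantActivity[], windowDays = 30): number {
  const cutoff = Date.now() - windowDays * 24 * 60 * 60 * 1000;
  const active = grants.filter((g) => g.lastUpdate !== null && g.lastUpdate.getTime() >= cutoff);
  return grants.length === 0 ? 0 : active.length / grants.length;
}

// Example: 2 of 3 grants updated in the last 30 days -> ~0.67
const rate = updateRate([
  { grantId: "grant-a", lastUpdate: new Date() },
  { grantId: "grant-b", lastUpdate: new Date(Date.now() - 10 * 24 * 60 * 60 * 1000) },
  { grantId: "grant-c", lastUpdate: null },
]);
console.log(rate.toFixed(2)); // "0.67"
```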


As of today, we have uploaded most of the Arbitrum grants onto GAP. You can see all the grants here: Karma GAP - Arbitrum community grants

You can see our own project and milestone updates here: https://gap.karmahq.xyz/project/0x86c61f601d498bf6081573434452fea50ea40c27ee2d323e1dab89c9317a323c/?tab=grants&grant=0xa8bb9aa56d854f6b3d5ddf287c96d3120732e5143abbab0da360547666888a5e&grant-tab=overview

You can review all Arbitrum citizen round grants here: https://gap.karmahq.xyz/arbitrum/?categories=&status=all&sortBy=milestones. Click on any grant and review.

We worked closely with Feems from Plurality Labs on setting up these questions, and we plan to iterate on them.


We completed all the milestones (except the integration with OSO, which will happen in the next couple of weeks). Feems from Plurality Labs has been helping us run weekly workshops to onboard Arbitrum grantees to GAP. We have run 3 workshops to date, and the turnout has been good (ranging from 20 to 40 users per session).

Contributors have also started reviewing various grants. You can see an example here.

cc: @karmagap
