[STIP Monitoring - Objective 1&2] - Ouroboros Research - [FINAL]

Ouroboros Research is delighted to present our proposal for the Arbitrum Short-Term Incentive Program (STIP) Data Monitoring and Reporting RFP.

TL;DR: We propose a partial scope for Objective 1 and the full scope for Objective 2, for 100k ARB.

We are keen to hear the feedback of the Arbitrum community as well as the drafters of this RFP - @DisruptionJoe, @tnorm, @Burns, and @epowell101.


Understanding of Scope

The primary goals of the Arbitrum Short-Term Incentive Program (STIP) are as follows:

  1. Growth: Promote user, TVL, activity and ecosystem growth

  2. Experimentation: Experiment with grant distribution models and use the information generated from this program to inform future programs and new strategies

The large grant program size of 50m ARB spanning 30 projects justifies the use of a service provider (SP) to monitor and report misuse of grant funds. Furthermore, a primary objective of the Arbitrum STIP is to inform the best use of funds in the future, thereby requiring a comprehensive impact analysis.

In the initial proposal, we also note that “participating grantees will be expected to self-report data, dashboards, and summarize grant performance on an ongoing basis.” This means a service provider will likely have to blend 1) on-chain data with 2) reported data / data sources from grantees to formulate a full picture.

The following are the specific deliverables for the SP as posted in the RFP. We will be expanding on our proposed approach in the following section.

1. Open source dashboard and analytics tooling

Real-time Fraud Detection / Misuse of Funds: Wallet and Fund Tracking, Protocol/Core Team Farming, Transaction Monitoring, Wash Trading, Claimant Analysis.

Grant/Protocol Data and Impact Analysis: Ecosystem metrics (TVL, transactions, users, fees and volume grouped by sector / protocol, as well as pool-level analysis and impact analysis), ARB spending (distributed, claimable, planned), claimant analysis (distribution of claimants by protocol / asset, percentage of claimants selling / delegating ARB).

2. Reporting to the Arbitrum Community

Bi-weekly reporting of activity and progress of grantees, focusing on 1) whether funds are appropriately used and 2) update summaries / KPI tracking.

Coordinate with each project individually and communicate issues to inform any potential decision to halt funding streams.

Execute on requests from the STIP-ARB multisig and conduct investigations into issues raised by the community.

3. Final Impact Report

Deliver a comprehensive STIP impact report covering Arbitrum ecosystem-wide, sector-, protocol-, pool- and asset-specific metrics.


Key Signals and Metrics to Monitor

We believe our capabilities will allow us to effectively fulfil parts of Objective 1 and the entirety of Objective 2.

Partial coverage of Objective 1: Our proposal will cover fund and wallet tracking, claimant analysis and transaction monitoring, but will not cover wash trading or team farming. We believe the latter are hard to detect and to report on in real time. That said, we propose dashboards that open-source information in such a way that the community (as well as us) will be able to dig deeper to identify misuse of funds.

Broadly speaking, we have classified the 30 projects’ plans into 3 categories - usage-based rewards, pool / vault incentivization, and other. We will use these categories as a framework to inform our data architecture and dashboard design.


1. Grant funds movements tracker: We will build a Dune dashboard to track all ARB stored in the grantees’ multisigs and their corresponding movements. This dashboard will also label all disclosed incentivized contracts, showing the flow of granted ARB into those contracts. Flows to non-whitelisted contracts will be immediately apparent, serving as monitoring to enforce proper use of funds.
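As an illustration, a minimal sketch of the non-whitelisted-flow check is below. All addresses and transfer data are hypothetical placeholders; in practice this logic would live in a Dune SQL query over the ERC-20 transfer tables for the grantee multisigs.

```python
# Sketch: flag granted-ARB transfers from grantee multisigs to
# non-whitelisted contracts. All addresses/amounts are hypothetical.

GRANTEE_MULTISIGS = {"0xGranteeSafe"}          # hypothetical grantee multisig
WHITELISTED_CONTRACTS = {"0xIncentiveVault"}   # disclosed incentivized contracts

transfers = [  # (from, to, amount_arb) -- placeholder transfer log
    ("0xGranteeSafe", "0xIncentiveVault", 50_000),
    ("0xGranteeSafe", "0xUnknownContract", 10_000),
]

def flag_suspect_transfers(transfers):
    """Return outflows from grantee multisigs to non-whitelisted targets."""
    return [
        t for t in transfers
        if t[0] in GRANTEE_MULTISIGS and t[1] not in WHITELISTED_CONTRACTS
    ]

for frm, to, amount in flag_suspect_transfers(transfers):
    print(f"ALERT: {amount} ARB moved from {frm} to non-whitelisted {to}")
```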

We have previously built a dashboard with similar functionality for the initial DAO airdrop distribution here.


We are also currently exploring the possibility of building an X / Twitter alert bot that leverages this dashboard’s functionality, with the aim of increasing grantee accountability and transparency.

2. Activity tracker: We will build a Dune dashboard for depositors and users (traders) of each pool showing activity from the initiation of grants, including a firehose view of activity (subject to a threshold filter) as well as a ranking of top depositors and users. This should help with auditing user activity.
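A minimal sketch of the tracker’s two views follows, using placeholder events; the threshold value and all addresses are assumptions, and the production version would be a Dune query per incentivized pool.

```python
# Sketch: threshold-filtered "firehose" of deposits plus a top-depositor
# ranking. All events and the threshold are hypothetical.
from collections import Counter

deposits = [  # (depositor, pool, usd_value) -- placeholder events
    ("0xAlice", "poolA", 120_000),
    ("0xBob",   "poolA", 900),
    ("0xAlice", "poolB", 75_000),
]

FIREHOSE_THRESHOLD_USD = 10_000  # assumed filter to cut out noise

# Firehose view: only deposits above the threshold.
firehose = [d for d in deposits if d[2] >= FIREHOSE_THRESHOLD_USD]

# Ranking view: cumulative deposit value per address.
top_depositors = Counter()
for depositor, _pool, usd in deposits:
    top_depositors[depositor] += usd

print("Firehose:", firehose)
print("Top depositors:", top_depositors.most_common(10))
```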

3. Impact dashboard: We will build a KPI dashboard showing the difference between several KPI metrics (including TVL, users and usage volumes) and a baseline level of activity measured prior to grant deployment. This will be done on a per-pool basis and then rolled up to the protocol, sector and ecosystem levels.

Users will also be segmented into new users, defined as users who had not used the pools before, and existing users, which will help provide further insight into the efficacy of grants in attracting new users.

This data will be tracked even after the grant program concludes to evaluate the stickiness of incentivized users.
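A minimal sketch of the dashboard’s core computation is below: per-pool KPI uplift vs. a pre-grant baseline, a protocol-level rollup, and the new-vs-existing user split. All names and numbers are hypothetical; real inputs would come from indexed on-chain data.

```python
# Sketch: KPI uplift vs. baseline, rolled up from pool to protocol level.
# Protocols, pools and figures are hypothetical placeholders.

pools = [  # (protocol, pool, baseline_tvl, current_tvl)
    ("ProtoX", "poolA", 1_000_000, 1_600_000),
    ("ProtoX", "poolB",   500_000,   550_000),
]

def uplift(baseline, current):
    """KPI change vs. the pre-grant baseline, as a fraction."""
    return (current - baseline) / baseline

# Pool-level uplift, then a protocol-level rollup of the raw figures.
by_protocol = {}
for protocol, pool, base, cur in pools:
    print(f"{pool}: {uplift(base, cur):+.1%} TVL vs. baseline")
    agg = by_protocol.setdefault(protocol, [0, 0])
    agg[0] += base
    agg[1] += cur

for protocol, (base, cur) in by_protocol.items():
    print(f"{protocol} (rollup): {uplift(base, cur):+.1%}")

# New-user segmentation: "new" = never interacted before grant start.
pre_grant_users = {"0xBob"}                # hypothetical address sets
active_users = {"0xAlice", "0xBob"}
print("New users:", active_users - pre_grant_users)
```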

4. Claimant dashboard: We will build a claimant dashboard showing hold / sell / transfer / delegation activity.
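A minimal sketch of the bucketing logic follows; the thresholds are hypothetical, and the real logic would trace DEX swaps, transfers and delegation events on-chain.

```python
# Sketch: bucket a claimant into hold / sell / transfer / delegate based on
# what happened to their claimed ARB. Thresholds are assumptions.

def classify_claimant(claimed, balance_now, sold, transferred, delegated):
    """Coarse bucketing of post-claim behaviour."""
    if delegated:
        return "delegate"
    if sold >= 0.5 * claimed:
        return "sell"
    if transferred >= 0.5 * claimed:
        return "transfer"
    if balance_now >= 0.5 * claimed:
        return "hold"
    return "mixed"

print(classify_claimant(claimed=1_000, balance_now=950, sold=0,
                        transferred=50, delegated=False))  # -> "hold"
```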


Value Definition (Objective 2 only)

The goal of the STIP data monitoring and reporting exercise is to understand, decipher and act on learnings.

Conclusions that we think are important to extract from this exercise include:

  • (Short term) How effective are the grants in incentivizing activity? This can be broadly defined as TVL increase per ARB, trading volume increase per ARB and new users per ARB, and can be segmented by new vs. existing users, sector and project size. We will focus on capital efficiency to answer this question.

  • (Long term) How sticky are users once incentivization ends? Tracking KPI metrics past the end of the program should help inform which kinds of grants are most effective at retaining users. Depending on the project, we will track the decline in KPIs over different time periods as a % of activity generated during the incentivized period, which will allow us to formulate statements such as: “X% of activity has been retained Y months after incentivization”.

  • Are the funds fully utilised? A significant portion of the initial ARB airdrop was largely unutilised (as incentives) and was instead kept in the original wallets. This should be less of an issue with STIP, given clear mandates for grants to be consumed by a certain timeline. We define this as % of ARB utilised vs. planned. (A sketch of these three computations follows this list.)
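As an illustration, a minimal sketch of how the three metrics above would be computed; function names and all numbers are hypothetical.

```python
# Sketch: the three value metrics, with made-up inputs.

def capital_efficiency(tvl_increase_usd, arb_spent_usd):
    """(Short term) TVL increase per dollar of ARB emitted."""
    return tvl_increase_usd / arb_spent_usd

def retention(activity_after, activity_during):
    """(Long term) share of incentivized-period activity retained later."""
    return activity_after / activity_during

def utilization(arb_distributed, arb_planned):
    """Share of granted ARB actually emitted vs. planned."""
    return arb_distributed / arb_planned

print(f"{capital_efficiency(2_000_000, 100_000):.1f}x TVL per $ of ARB")
print(f"{retention(30_000, 100_000):.0%} of activity retained")
print(f"{utilization(800_000, 1_000_000):.0%} of planned ARB utilised")
```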

These questions will be answered in each bi-weekly report as well as the final impact report.


Project Timeline

We propose the following timelines:

  • RFP Decision: 22 Oct’23 - 27 Oct’23

Dashboard setup

  • Grant funds movements tracker: 8 Nov’23
  • Activity tracker: 29 Nov’23
  • Impact dashboard: 29 Nov’23
  • Claimant dashboard: 5 Dec’23

Reports

  • First bi-weekly report: 30 Nov’23, then every 2 weeks thereafter
  • Final impact report (Part 1: Post STIP completion): 12 Feb’24
  • Final impact report (Part 2: Three months after STIP completion): 13 May’24

Milestone / Tranche amounts

Total request: 100k ARB

Payment schedule:

  • Milestone 1: 25% to be paid upfront
  • Milestone 2: 50% to be paid on completion of dashboards
  • Milestone 3: 25% to be paid on completion of final report

Prior experience and Proposed Team

Ouroboros Research (the research arm of Ouroboros Capital) comprises a team of analysts that conducts fundamental investment analysis and on-chain data gathering / compilation to provide insights to our users and readers. Our experience spans blockchain research companies, investment banks and hedge funds (crypto and otherwise).

We pride ourselves on being hands-on investors who provide value-adding advice to our portfolio companies, which include Radiant Capital (designing tokenomics), Good Entry and many others. We are also active on governance forums (see our STIP extension post here, and our Frax buyback proposal). This oftentimes includes conducting DeFi-related analysis and protocol design.

Our deep understanding of DeFi allows us to objectively evaluate and critique incentive structures, which we believe is required of this engagement.

More recently, we have conducted analysis on STIP using a dashboard we created to track voting progress and grant details, which was widely used by various projects. It was built using individual project data, then segmented by sector.

[Screenshot: STIP voting progress dashboard]

We propose a team of 4: myself, 0xRamen (head of research), and 3 other analysts (0xMize, FabulousDegen and 0xVega), to work on this project full time. We will deploy further analyst resources as needed.


Information / resources required

  • Wallet addresses of multisigs holding ARB
  • Contract addresses which are the target recipients of ARB

Woowowowow good

Why not?


I’ll happily do this for 50% of this ARB request


Thank you for your proposal. To assist in understanding this proposal in full, can you answer the following:

  1. What ecosystem metrics will be shown in the dashboard? Is there any deviation from those in the RFQ?
  2. Will grant matching and ROIs be tracked/displayed? If so, how is ROI determined?
  3. Please expand more on what the final report covers - eg assessment of incentive effectiveness, creation of value, recommendations for future programs or summary of data only.
  4. Can you provide more background on the team with any experience in data analysis beyond presentation and providing data driven incentive program recommendations?
  5. What is the team’s current capacity to work on this project over the next ~5 months?

Hey Burns, thanks for the questions! Here are our replies:

1. What ecosystem metrics will be shown in the dashboard? Is there any deviation from those in the RFQ?

We’ll be showing the metrics as defined in the RFP - TVL, Transactions, Users, Fees Generated, Volume.

On top of this, we intend to present several different cuts of the data, including 1) growth of the above metrics from the incentive start date, 2) TVL, transactions, users, fees generated and volume arising from new addresses that have never interacted with the contracts, 3) average depositor stay duration for vault depositors, and 4) retention rate for traders.
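For 3), a minimal sketch of the stay-duration calculation, using hypothetical day offsets from grant start:

```python
# Sketch: average depositor stay duration. Timestamps are hypothetical
# day offsets; open positions are measured up to the evaluation date.

stays = [  # (depositor, deposit_day, withdraw_day or None if still deposited)
    ("0xAlice", 0, 21),
    ("0xBob",   3, None),
]

AS_OF_DAY = 30  # evaluation date (days since grant start)

durations = [(w if w is not None else AS_OF_DAY) - d for _, d, w in stays]
print(f"Average stay: {sum(durations) / len(durations):.1f} days")
```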

2. Will grant matching and ROIs be tracked/displayed? If so, how is ROI determined?

Grant matching: Yes, grant matching will be displayed.

ROI: We also intend to show ROI, which can be defined as follows.

For pools:
ROI = (TVL increase vs. baseline) / (value of ARB + grant matching emitted)
where baseline = average TVL from T-2 weeks to T-1 week

For trading volume:
ROI = (trading volume or fee increase vs. baseline) / (value of ARB + grant matching emitted)
where baseline = average trading volume or fees from T-2 weeks to T-1 week
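To make the arithmetic concrete, a toy calculation matching the pool definition above, with entirely made-up numbers:

```python
# Sketch: pool ROI under the definition above. All figures hypothetical.
baseline_tvl = 1_000_000        # avg TVL from T-2w to T-1w
current_tvl = 1_400_000
arb_emitted_usd = 150_000       # value of ARB emitted in the period
grant_matching_usd = 50_000

roi = (current_tvl - baseline_tvl) / (arb_emitted_usd + grant_matching_usd)
print(f"Pool ROI: {roi:.2f}x TVL increase per $ of incentive")  # -> 2.00x
```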

3. Please expand more on what the final report covers - eg assessment of incentive effectiveness, creation of value, recommendations for future programs or summary of data only.

This will not just be a summary of data. It is probably best to illustrate with a sample conclusion, which could read:

Perp DEX incentives specifically targeted at trading rebates are the most effective in terms of ROI, generating $X of trading activity per $ of incentive spend. More importantly, user retention is higher than average at X% 3 months after emissions end. In comparison, direct emissions framed as “trade to earn” do not have a similar effect. We recommend that future incentive programs for perp DEXs focus on trading rebates as a result.

The idea here is to provide actionable recommendations that can be used or added to an “incentive playbook” and not just putting data together.

4. Can you provide more background on the team with any experience in data analysis beyond presentation and providing data driven incentive program recommendations?

Without being too specific: we have a team member who worked at a blockchain research company (analyzing risk by monitoring KPIs of large projects), a team member who worked at a startup putting together analytics for decision making, and two crypto fund analysts (one of whom was a data analytics major) who produce data-backed investment recommendations (gathering data, designing data architecture, running correlation analysis).

5. What is the team’s current capacity to work on this project over the next ~5 months?

Collectively, around 100 hours / week for the entire team.
