[ARDC] Final Report

TL;DR

  • The ARDC completed its 6-month term, which ran from April 1st to October 1st, 2024.
  • During its term, it published 42 deliverables in the forum.
  • The ARDC’s multisig was funded with 1,761,000 ARB, of which 1,753,466.42 ARB was used.
  • The remaining 7,533.58 ARB are to be returned to the treasury.
  • We will be hosting a call on Monday 7th of October at 14:00 UTC to discuss the conclusion of ARDC and answer any questions you might have.

Intro

The ‘Arbitrum Research and Development Collective (ARDC)’ was established on January 26, 2024, and it consisted of 4 member seats occupied by 6 DAO-elected entities:

  • L2BEAT & Ant Federation as the DAO Advocate(s)
  • Blockworks Research & Delphi Digital as the Research Member(s)
  • OpenZeppelin as the Security Member
  • Chaos Labs as the Risk Member

The primary purpose of the ARDC was to ‘provide objective assessment of governance posts by distilling information and expediting governance decision-making, thereby enabling participants in the ArbitrumDAO to make better-informed choices.’ Please see the corresponding section below for a more detailed breakdown of the exact mandate outlined in the proposal.

The ARDC’s official start date was April 1st since there were delays with the KYC/KYB process and the signing of legal agreements between members and the Arbitrum Foundation. The term of the ARDC was to be six months, and therefore, the ARDC was to conclude on October 1st.

ARDC’s Mandate

Although the ARDC’s mandate is challenging to capture in a single sentence or paragraph, we’ve compiled a list of references from the original proposal to help you understand it.

The mandate of the ARDC was outlined as follows:

  • The primary mandate of the ARDC will be to provide objective assessments of governance posts by distilling information and expediting governance decision-making, thereby enabling participants in the ArbitrumDAO to make better-informed choices.

  • Aid in ‘Governance Optimization’ through research & development of tooling related to the ArbitrumDAO’s governance framework.

  • Forum Proposal Review & Assistance

    • Impartial data-driven research conducted by ARDC to aid delegates in making well-informed decisions by offering a comprehensive understanding of proposal contexts and competitive landscapes.
    • These reports would double as tools for proposal authors, helping them enhance their work by incorporating feedback and addressing shortcomings. In this regard, the ARDC will assist proposers on an on-request basis to optimize and structure proposals that could prove to be a value-add to Arbitrum.
  • Review of Onchain Proposal Code Updates

    • Through manual and automated code reviews, the ARDC will offer security assessments, identifying design flaws and security issues.
    • Audit the executables of different proposals.
  • Quantitative Assistance

    • Adding an element of quantitative rigor to proposal evaluations, offering insights into economic risk, design optimization, and overall proposal viability.
    • Identifying and mitigating economic risks associated with proposed initiatives, promoting sound decision-making.
  • Project Management [On a Request-Basis]

    • Aim to assist in facilitating effective communication & management of proposals between the ArbitrumDAO, stakeholders, and service providers.
  • Tooling Creation and Enhancement

    • Develop and enhance security assessment tools to strengthen the integrity of the Arbitrum ecosystem and ensure that proposal code updates meet stringent security standards.
  • Research New Mechanisms

    • ARDC will objectively analyze and contribute to developing innovative mechanisms, promoting data-driven decision-making and enhancing the ecosystem’s capabilities.
  • Delegate Engagement

    • ARDC’s processes will incentivize delegates to actively contribute to proposal refinement, fostering a more engaged and collaborative governance community.
  • Growth Initiatives

    • Through content creation, including podcasts, Twitter threads, and newsletters, ARDC will attract developers and users to the Arbitrum ecosystem, promoting growth and awareness.
  • Operational Parameters
    The following operational parameters were to be implemented by the ARDC for the DAO to oversee its operations properly.

    • Meeting minutes to be taken for every meeting and published on a public Notion site for review by the ArbitrumDAO;
    • Monthly report detailing the performance of the ARDC;
    • Bi-weekly calls between the ARDC and the community;
    • A public Asana/Airtable dashboard for members to submit updates on specific tasks and sub-tasks, keeping the ArbitrumDAO in the loop.

Deliverables

Over the six-month term, the ARDC worked on and published 42 deliverables, ranging from retroactive analysis of incentive programs to risk assessments of different proposals and security reviews of executable proposals.

Complete list of ARDC deliverables
  1. [Blockworks Research] STIP Analysis Case Study: GMX 2
  2. [Blockworks Research] STIP Analysis Case Study: JOJO 2
  3. [Blockworks Research] STIP Analysis: Concerns Regarding Possible Misconduct by Synapse with Respect to the Usage of ARB Incentives Allocated Through the STIP
  4. [Blockworks Research] STIP Bridge: Support Material for the Community
  5. [OpenZeppelin] Using Hedgey for Proposal Payment Vesting
  6. [Delphi Digital] Gaming Catalyst Program: SWOT Analysis
  7. [Delphi Digital] Gaming Catalyst Program - Compensation Structure Memo
  8. [Chaos Labs] STIP Risk Analysis — Case Study #1: Vertex Protocol
  9. [Chaos Labs] STIP Risk Analysis — Case Study #2: Silo Finance
  10. [Delphi Digital] BOLD Dispute Mechanism Summary & Comparisons
  11. [OpenZeppelin] Security Council Improvement Proposal
  12. [Blockworks Research] STIP Retroactive Analysis - Perp DEX Protocols Volume Report
  13. [OpenZeppelin] BOLD Security Analysis
  14. [Chaos Labs] STIP Risk Analysis — Case Study #3: Pendle Finance
  15. [Blockworks Research] STIP Retroactive Analysis – Spot DEX TVL
  16. [Blockworks Research] STIP Analysis of Operations and Incentive Mechanisms
  17. [Delphi Digital] Arbitrum DAO Treasury Research
  18. [OpenZeppelin] ETH Staking Options and Risks for the DAO
  19. [OpenZeppelin] Event Horizon Franchiser Contract Audit
  20. [Chaos Labs] STIP Analysis | Insights & Key Findings
  21. [Chaos Labs] Risk Analysis of Adjusting the Minimum Base Fee on Arbitrum
  22. [OpenZeppelin] ArbOS 31 “Bianca” Proposal Review
  23. [Blockworks Research] STIP Retroactive Analysis – Sequencer
  24. [Delphi Digital] Response to Arbitrum Staking Proposal
  25. [Blockworks Research] Treasury-Backed Vault Research
  26. [OpenZeppelin] Arbitrum Governor V2 Review
  27. [OpenZeppelin] Arbitrum Governor Upgrade Rollout & Timeline
  28. [Chaos Labs] Treasury Backed Vaults Risk Analysis
  29. [OpenZeppelin] Security Analysis of Arbitrum Staking Proposal
  30. [Delphi Digital] Follow Up - DAO Incomes Sources and the Path to Staking
  31. [Chaos Labs] Risk Analysis of Adjusting the minimum base fee on Arbitrum
  32. [Blockworks Research] Incentives Research Summary
  33. [Blockworks Research] Timeboost Revenue and LP Impact Analysis*
  34. [Delphi Digital] Transaction Ordering Policies & Value Accrual in L2s: Timeboost, OP PGA, Fastlane & OEV Network
  35. [Chaos Labs] Timeboost Risk Analysis
  36. [OpenZeppelin] Timeboost Security Analysis
  37. [OpenZeppelin] Arbitrum L2 Time Lock Delay Proposal Security Review
  38. [OpenZeppelin] RARI Multichain Governance Proposal Security Review
  39. [OpenZeppelin] Arbitrum daoURI Proposal Security Review
  40. [Blockworks Research] Retroactive LTIPP Analysis**
  41. [Delphi Digital] Incentives Programs in other protocols**
  42. [Chaos Labs] Treasury Management Risk Assessment**

It’s important to note that members’ performance should not be assessed solely on the number of deliverables, which is why we didn’t create a total for each member. Deliverables varied widely in complexity, so it’s not an apples-to-apples comparison.

*Although Blockworks didn’t pick up this workstream specifically as a member of the ARDC, we’ve included it in the list since it was relevant to the Timeboost discussion and since it was published in the overall context of the work ARDC was doing on that front.
**These deliverables haven’t been published on the forum yet. Although the ARDC term has concluded, the engagement of the members will be officially regarded as ‘concluded’ upon publication of the final deliverables.


The DAO Advocate’s Role

The DAO Advocate was responsible for coordinating the ARDC and acting as a bridge between the DAO and the collective’s members. Over the ARDC’s term, we received 11 requests, 8 of which materialized into deliverables published on the forum.

One issue that became apparent was that although the DAO Advocate was responsible for representing the DAO and maintaining the ability to direct and oversee the ARDC, the mandate to ‘function as a bridge between the DAO and the ARDC’ was somewhat vague.

Since the level of discretion the DAO Advocate could exercise was left open to interpretation, we decided to operate so that anyone could submit a request to the ARDC (see the ‘Transparency & Communication’ section for more info on how), but the DAO Advocate was in charge of prioritizing the work undertaken by the ARDC members.

In theory, that meant that all requests were optimistically ‘approved’ to be worked on by the ARDC, but the DAO Advocate could at any point prioritize new requests. In practice, that meant that some requests were constantly left on the back burner and eventually never materialized.

When a member of the ARDC had no active workstream, and the DAO had no requests for them to undertake one, we took the initiative to assign one to them internally.

Transparency & Communication

Throughout the ARDC’s term, we implemented all the operational parameters outlined in the original proposal and went the extra mile to ensure constant communication with the DAO and its stakeholders.

  1. Meeting minutes for every ARDC meeting

We kept publicly accessible meeting minutes for all ARDC meetings, including the ones not open to the DAO’s participation. For each meeting, we kept attendance, included the agenda going into the meeting, and noted down the items that we discussed (categorized by member), including action items and ETA of deliverables.

  2. Monthly report detailing the performance of the ARDC

From the very beginning, we published a ‘DAO Advocate Communication Thread’ on the forum under a specially created forum category, which we updated monthly. The monthly updates included a summary of the deliverables the ARDC had published in the month prior, an overview of active workstreams the ARDC members were working on, and a list of requests from DAO contributors (if any).

  3. Bi-weekly calls between the ARDC and the community

The ARDC synced every Monday at 12:00 UTC (except public holidays, in which case the call was moved). Every second Monday, the meeting was open to the public and was shared on the Arbitrum DAO Governance Calendar.

  4. A public Asana/Airtable dashboard for members to submit updates on specific tasks and sub-tasks, keeping the ArbitrumDAO in the loop.

We used Notion to host a dashboard outlining all the tasks the ARDC had completed or was working on at any given time. The Notion dashboard was publicly accessible to everyone.

  5. DAO Advocate Discussion with Arbitrum DAO

On Friday 10th of May, we hosted an open call to discuss the ARDC and the DAO Advocate’s role and invited delegates and active contributors. Notes from the call can be found here.

  6. L2BEAT Arbitrum Office Hours

As the DAO Advocate, L2BEAT leveraged their weekly office hours to discuss anything relevant to the ARDC with interested parties. This was another avenue for people to reach the ARDC.

  7. Telegram availability

Anyone could reach the DAO Advocate (L2BEAT) on Telegram to communicate via instant messages or async in case of timezone differences.

  8. Monthly Open Governance Call

Lastly, we attended all monthly Arbitrum Open Governance Calls to provide updates, answer any questions, and generally discuss ARDC matters with the broader community.

Finances

Disclaimer: Although the DAO Advocate wasn’t responsible for the finances of the ARDC, we decided to include this section for ease of access and reference. The information was collected from publicly accessible sources (e.g., forum posts and onchain transactions). The payments to members and the general handling of funds were up to the multisig set up for this purpose.

The ARDC was funded with a total of 1,761,000 ARB, valued at $3,009,385.55 at the time of transfer (the same amount of ARB would now be worth ~$1,058,265.91, or roughly 65% less). Since the MSS did not yet exist when the ARDC was established, the funds were kept in a 3/5 multisig at the following address: arb1:0xd19Ed0E8E723fDb85a6a1480e5345FbCcE0BFF85.
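
As a quick back-of-the-envelope check on these figures (our own sketch, not part of the original report), the following Python snippet reproduces the ~65% value decline and the implied ARB price at the time of transfer:

```python
# Sanity check on the ARDC funding figures quoted above.
# All dollar values are taken directly from the report.
arb_funded = 1_761_000
usd_at_transfer = 3_009_385.55   # value when the multisig was funded
usd_now = 1_058_265.91           # value of the same ARB at the time of writing

# Implied ARB price when the multisig was funded (~$1.70)
price_at_transfer = usd_at_transfer / arb_funded

# Decline in the USD value of the allocation (~65%)
value_decline = 1 - usd_now / usd_at_transfer

print(f"Implied price at transfer: ${price_at_transfer:.2f}")
print(f"Value decline: {value_decline:.0%}")
```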

Budget Breakdown

The original proposal earmarked the following amounts for their respective purposes:

  • 665,000 ARB [Security] [Applicable Cap]
  • 665,000 ARB [Research] [Applicable Cap]
  • 335,000 ARB [Risk] [Applicable Cap]
  • 50,000 ARB [DAO Advocate]
  • 1,000 ARB [per signer - per month for the 6-month term]
  • $16,000 paid in ARB to ImmutableLawyer as retroactive compensation for drafting the proposal, coordinating its establishment and elections, and managing and administrating all related processes until the ARDC was fully operational.

Applicable caps are the maximum amount of ARB a member was eligible to receive throughout their engagement in the ARDC. They were introduced to avoid having to spend huge amounts of ARB in case ARB’s price declined, as ended up happening.

Although the funds earmarked for each seat were denominated in ARB, the members quoted their service fees in USD during their application process. Only the DAO Advocate was to be compensated with a fixed, ARB-denominated amount of 50,000 ARB. Specifically, the quoted fees were as follows:

Total of quoted fees - $2,030,000

Please note that the quoted fees were included in each member’s application, which the DAO voted on and was fully aware of.

When the ARDC multisig was first funded, the ARB was more than enough to cover the total compensation of all members, the multisig signers, and the retroactive compensation to ImmutableLawyer (1,761,000 ARB against the ~1,280,000 ARB of expected spend with ARB at $1.70).
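
That arithmetic can be sketched as follows (our own reconstruction, assuming the $1.70 ARB price and the figures quoted above):

```python
# Rough reconstruction of the expected ARDC spend at funding time,
# assuming ARB at $1.70 (all figures as quoted in this report).
ARB_PRICE = 1.70

quoted_fees_usd = 2_030_000      # total USD-denominated member fee quotes
immutablelawyer_usd = 16_000     # retroactive compensation, paid in ARB

usd_obligations_in_arb = (quoted_fees_usd + immutablelawyer_usd) / ARB_PRICE
dao_advocate_arb = 50_000        # fixed ARB-denominated compensation
signer_arb = 5 * 1_000 * 6       # 5 signers x 1,000 ARB x 6 months

expected_spend_arb = usd_obligations_in_arb + dao_advocate_arb + signer_arb
funded_arb = 1_761_000

print(f"Expected spend: {expected_spend_arb:,.0f} ARB")  # ~1,280,000 ARB
print(f"Surplus at funding: {funded_arb - expected_spend_arb:,.0f} ARB")
```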

ARB Price Decrease Impact on ARDC’s Budget

While the proposal included a buffer of about 30% to absorb a decline in ARB’s price, the price dropped significantly more than the buffer accounted for (~60%), which meant the ARDC ended up with fewer assets on hand than its outstanding obligations.

While there were discussions and even a proposal to inject additional funds into the ARDC, all members agreed to complete their engagement without extra funding to meet their USD-denominated fee quotes. Essentially, the members would be absorbing the ‘loss’ caused by ARB’s price decrease.

The ‘Applicable Caps’ for each member came into effect, and the ARDC would only pay each member up to their respective ARB cap.

Final Spending

The 1,761,000 ARB the ARDC was funded with was spent as follows:

  • 50,000 ARB went to the DAO Advocate
  • 665,000 ARB went to the Research Member(s)
  • 665,000 ARB went to the Security Member
  • 335,000 ARB went to the Risk Member
  • 30,000 ARB went to the five multisig signers
  • 8,466.42 ARB went to ImmutableLawyer

For a total spend of 1,753,466.42 ARB. The remaining 7,533.58 ARB are to be returned to the Arbitrum DAO Treasury.
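
The line items above can be checked against the stated totals (our own sketch; the ImmutableLawyer figure of 8,466.42 ARB is implied by the stated total spend):

```python
# Verify that the itemized ARDC spending matches the stated totals.
spending = {
    "DAO Advocate": 50_000,
    "Research Member(s)": 665_000,
    "Security Member": 665_000,
    "Risk Member": 335_000,
    "Multisig signers": 30_000,
    "ImmutableLawyer": 8_466.42,  # implied by the stated 1,753,466.42 ARB total
}

funded = 1_761_000
total_spend = sum(spending.values())
remainder = funded - total_spend

print(f"Total spend: {total_spend:,.2f} ARB")   # 1,753,466.42 ARB
print(f"To be returned: {remainder:,.2f} ARB")  # 7,533.58 ARB
```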

ARDC Conclusion Call

We’re inviting interested parties to a final call on the ARDC to discuss all of the above and answer any questions you might have. The call will take place on Monday 7th of October at 14:00 UTC via this link.


Any chance this call was recorded?


The call wasn’t recorded, but we basically just went over the report written above. If you have any questions that are not covered in the report, please drop them below and I’ll answer them :)

Nothing specific from us! Was just curious, always good to listen in on discussions when they happen, might hear things you hadn’t thought of yourself.


The following reflects the views of L2BEAT’s governance team, composed of @krst and @Sinkas, and it’s based on the combined research, fact-checking, and ideation of the two.

While we put together and published the above report, we did so in our capacity as the DAO Advocate and tried to be unbiased and deliver objective facts about what the ARDC accomplished during its 6-month tenure.

Now, we also want to drop our opinion as delegates, informed both by the objective results and by our experience as the DAO Advocate.

Before we begin, we want to thank all the participating members for the very good collaboration throughout the term. Our feedback has more to do with the structure of the ARDC and all relevant details and less to do with any individual member. We will also focus solely on reflecting on the ARDC’s past term and won’t touch on our feedback on the ‘ARDC Term 2’ proposal, which we’ll share in a separate comment under that forum post.

ARDC 1 Reflections

From the very beginning, it became apparent that some things had been overlooked, and while in theory they were details to be figured out along the way, in practice they turned out to be very important.

Compensation Oversight

One such thing has to do with the compensation of the ARDC members and the fact that it wasn’t clear who was responsible for overseeing it. While there was a multisig set up to handle the streams to ARDC members on a monthly basis, oversight of the hours each member had worked wasn’t included in its purpose, nor was any sort of financial management of the funds (we’ll see later why that was important). Those things were also not included in the DAO Advocate’s mandate. Even though the proposal suggested there would be some check of the hours worked against the fee quoted, nothing like this happened.

Instead, what ended up happening was a silent ‘agreement’ that members would be paid the total fee quoted, split over the 6-month duration. This wouldn’t be a problem if the workload the ARDC had to undertake justified paying the full quote. However, delegate/DAO engagement wasn’t as high as expected (more on this below), and we basically had to ‘create’ demand for the ARDC to work on things without those things necessarily being requested by the DAO. We think we’ve done a good job of managing that, and we’ve had a good outcome from the ARDC overall, but it’s important to take this into account when designing future programs.

USD Denominated fees paid in ARB

The members of the ARDC quoted their fees in USD, but they were paid in ARB. The problem with this setup, apart from the obvious, is twofold.

  1. The proposal was passed before significant token unlocks. While there was a buffer (~30%) included in the budget to account for negative price movements, it’s safe to say the estimation for the buffer was quite off. Since the execution of the proposal, the price of ARB has plummeted by >60%.
  2. As noted above, nobody was responsible for the financial management of the funds in the multisig. So, as the price of ARB continued to decrease over time, even past the point where the buffer would have to be used, no action was taken to convert the ARB to stables so the multisig could meet its obligations. That wouldn’t have been necessary if the fees had been both denominated and paid in ARB, or if the necessary amount of dollars had been secured in stables after the proposal passed.

Vague mandate

The ARDC’s mandate was too vague and included things that ended up not happening in practice. For example, the mandate included security audits for executables and proposal-improvement ‘services’ at the request of proposal authors.

Performing security audits didn’t make sense because we faced a flood of proposals from authors looking for free audits. That was troublesome on three levels:

  1. Security audits can be very time-consuming and costly. We wouldn’t have the capacity to perform all security audits requested.
  2. In practice, a proposal author receiving a security audit through the ARDC could also be perceived as receiving a considerable grant from the DAO. The ARDC’s mandate was vague enough to theoretically ‘permit’ it, but we felt it was out of its scope.
  3. There was already an avenue for projects to receive audits (although subsidized and not free) through the ADPC.

Instead, we decided only to perform security reviews of onchain executables once proposals had gone to Tally. The purpose of those reviews was to inform the community on whether a proposal would execute only the actions outlined and nothing additional or malicious.

When it came to assisting in proposal improvement, we’d often be asked to review a proposal without a specific request from the author or the person requesting the review. Basically, it was something along the lines of ‘Here’s proposal X, can the ARDC take a look?’

Although that sounds feasible in theory, it quickly becomes impossible with many people asking for the ARDC to identify any and all gaps in their proposals and then work with the authors on improving them.

Instead, we chose to try to identify risks or points of concern in proposals that were already on Snapshot, or had garnered significant and vocal support from delegates without necessarily having gone to Snapshot, so we could help delegates make informed decisions.

There’s also the risk of using ARDC to outsource some of the execution costs of proposals. For example, one of the most valuable outcomes of ARDC has definitely been the analysis of the results of selected protocols’ incentive programs. This has led to a better understanding of these programs and the recovery of funds from some of the protocols that did not use them properly. While obviously useful and valuable, this analysis should be a part of these incentive programs, along with the costs associated with it. On the other hand, one could argue that ARDC was successful in this case because we had resources available that could easily be used to fill an existing gap in the program we’ve been running - this practical use case should also be considered when thinking about future programs.

In future programs, it would be good to either have a precise mandate or make it clear that the mandate will be defined during the course of the program and have a process for clarifying it.

Delegate Engagement

When the ‘Arbitrum Coalition’ was first proposed, there was pushback against centralizing too much power in the hands of the proposed coalition. However, it turned out that the idea of having the DAO Advocate act as a ‘bridge’ between the DAO and the ARDC, although well-intentioned, didn’t really mitigate anything.

The reason was that delegate engagement wasn’t as high as one would expect. There were multiple avenues for people to reach out to the DAO Advocate to request that the ARDC work on something. We had biweekly calls that were open to the DAO (which people outside the ARDC’s members rarely attended, as the ARDC’s meeting notes attest), but we still didn’t see much engagement. As seen in the report, out of the 42 deliverables from the ARDC, only 11 stemmed from DAO requests, with 4 of them coming from a single source.

One learning from this setup that could be addressed in a future version of the ARDC is that the DAO Advocate should have the freedom and discretion to proactively assign workstreams to members of the ARDC without having to rely on delegates’ input. That’s what we chose to do during our engagement, but it should have been part of the scope of the proposal.

In addition, in future programs, ARDC could serve not only delegates, but also other key stakeholders in the DAO, such as the Arbitrum Foundation, Offchain Labs, or leaders of key initiatives. Especially towards the end of ARDC’s operations, we saw significant interest in ARDC’s support from these sources, which we found not only interesting but also valuable for the DAO.

Value received from the ARDC

Overall, we spent 1,753,466 ARB, worth on average over $1,000,000, for 6 months of the ARDC and received 42 distinct deliverables, some of which helped return over 1,000,000 ARB to the DAO’s treasury. The ‘ROI’ of the ARDC isn’t something that can be objectively assessed; everyone needs to subjectively judge whether the value they feel the DAO received was worth the cost.

We’re overall skeptical of the efficacy of the ARDC, not because of the outputs it delivered, but because we believe that the DAO wasn’t positioned to leverage the ARDC to its full potential. That has to do with a lot of other things irrelevant to the ARDC or its structure, and therefore, things that cannot be addressed in a new proposal or with a simple amendment in the previous one.

Looking Forward

While we do believe the concept of the ARDC to be something worth exploring further, we are not convinced that the DAO is at a point where we should expend additional effort in trying to make it work.

We do recognize the need for the DAO to have access to unbiased information on different proposals from relevant domain experts, and to have an avenue to easily procure it, so it’d make sense to revisit the topic in the future.

Right now, however, there are things in the DAO that need to be addressed in order for us to be in a position to leverage the ARDC to the fullest.
