[ARDC] Final Report

The following reflects the views of L2BEAT’s governance team, composed of @krst and @Sinkas, and is based on the combined research, fact-checking, and ideation of the two.

While we put together and published the above report, we did so in our capacity as the DAO Advocate, and we tried to remain unbiased and deliver objective facts about what the ARDC accomplished during its 6-month tenure.

Now, we also want to share our opinion as delegates, informed both by the objective results and by our experience as the DAO Advocate.

Before we begin, we want to thank all the participating members for the very good collaboration throughout the term. Our feedback has more to do with the structure of the ARDC and all relevant details and less to do with any individual member. We will also focus solely on reflecting on the ARDC’s past term and won’t touch on our feedback on the ‘ARDC Term 2’ proposal, which we’ll share in a separate comment under that forum post.

ARDC 1 Reflections

From the very beginning, it became apparent that some things had been overlooked. While in theory they were details to be figured out along the way, in practice they turned out to be very important.

Compensation Oversight

One such thing has to do with the compensation of the ARDC members and the fact that it wasn’t clear who was responsible for overseeing it. While there was a multisig set up to handle the monthly streams to ARDC members, oversight of the hours each member had worked wasn’t included in its purpose, nor was any sort of financial management of the funds (we’ll see later why that was important). Those things were also not included in the DAO Advocate’s mandate. Even though the proposal suggested that the hours worked would be checked against the fee quoted, nothing like this ever happened.

Instead, what ended up happening was a silent ‘agreement’ that members would be paid the total fee quoted, split over the 6-month duration. This wouldn’t have been a problem if the ARDC’s workload had justified paying the full quote. However, delegate/DAO engagement wasn’t as high as expected (more on this below), and we basically had to ‘create’ demand for the ARDC to work on things without those things necessarily being requested by the DAO. We think we’ve done a good job of managing that, and we’ve had a good outcome from the ARDC overall, but it’s important to take this into account when designing future programs.

USD-denominated fees paid in ARB

The members of the ARDC quoted their fees in USD, but they were paid in ARB. Beyond the obvious issue, the problem with this setup is twofold.

  1. The proposal was passed before significant token unlocks. While there was a buffer (~30%) included in the budget to account for negative price movements, it’s safe to say that the estimation for the buffer was quite off. Since the execution of the proposal, the price of ARB plummeted by >60%.
  2. As mentioned in the ‘Compensation Oversight’ section above, nobody was responsible for the financial management of the funds in the multisig. So, as the price of ARB continued to decrease over time, even after it hit the point where the buffer would have to be used, no action was taken to convert the ARB to stables so the multisig could meet its obligations. That wouldn’t have been necessary if the fees had been both denominated and paid in ARB, or if the necessary USD amount had been secured in stables right after the proposal passed.

Vague mandate

The ARDC’s mandate was too vague and included things that ended up not happening in practice. For example, the mandate included security audits for executables, as well as proposal improvement “services” at the request of proposal authors.

Performing security audits didn’t make sense because we faced a flood of proposals from authors looking for free audits. That was troublesome on three levels:

  1. Security audits can be very time-consuming and costly. We wouldn’t have the capacity to perform all security audits requested.
  2. In practice, a proposal author receiving a security audit through the ARDC could also be perceived as receiving a considerable grant from the DAO. The ARDC’s mandate was vague enough to theoretically ‘permit’ it, but we felt it was out of its scope.
  3. There was already an avenue for projects to receive audits (although subsidized and not free) through the ADPC.

Instead, we decided to perform security reviews only of onchain executables once proposals had gone to Tally. The purpose of those reviews was to inform the community on whether a proposal would execute only the actions outlined and nothing additional or malicious.

When it came to assisting with proposal improvement, we’d often be asked to review a proposal without any specific request from the author or the person asking for the review. Basically, it was something along the lines of ‘Here’s proposal X, can the ARDC take a look?’

Although that sounds feasible in theory, it quickly becomes impossible with many people asking for the ARDC to identify any and all gaps in their proposals and then work with the authors on improving them.

Instead, we chose to try to identify risks or points of concern in proposals that were already on Snapshot, or had garnered significant and vocal support from delegates without necessarily having gone to Snapshot, so we could help delegates make informed decisions.

There’s also the risk of using the ARDC to outsource some of the execution costs of proposals. For example, one of the most valuable outcomes of the ARDC has definitely been the analysis of the results of selected protocols’ incentive programs. This has led to a better understanding of these programs and the recovery of funds from some of the protocols that did not use them properly. While obviously useful and valuable, this analysis should have been part of those incentive programs, along with its associated costs. On the other hand, one could argue that the ARDC was successful in this case precisely because we had resources available that could easily be used to fill an existing gap in a program we had been running; this practical use case should also be considered when thinking about future programs.

In future programs, it would be good to either have a precise mandate or make it clear that the mandate will be defined during the course of the program and have a process for clarifying it.

Delegate Engagement

When the ‘Arbitrum Coalition’ was first proposed, there was pushback against centralizing too much power in the hands of the proposed coalition. However, it turned out that the idea of having the DAO Advocate act as a ‘bridge’ between the DAO and the ARDC, although well intentioned, didn’t really mitigate anything.

The reason for that was that delegate engagement wasn’t as high as one would expect. There were multiple avenues for people to reach out to the DAO Advocate and request that the ARDC work on something. We held biweekly calls that were open to the DAO (which people outside of the ARDC members rarely attended, as the ARDC’s meeting notes show), but we still didn’t see much engagement. As seen in the report, out of the 42 deliverables from the ARDC, only 11 were requests from the DAO, with 4 of them coming from a single source.

One learning from this setup that could be addressed in a future version of the ARDC is that the DAO Advocate should have the freedom and discretion to proactively assign workstreams to members of the ARDC without having to rely on delegates’ input. That’s what we chose to do during our engagement, but it should have been part of the scope of the proposal.

In addition, in future programs, ARDC could serve not only delegates, but also other key stakeholders in the DAO, such as the Arbitrum Foundation, Offchain Labs, or leaders of key initiatives. Especially towards the end of ARDC’s operations, we saw significant interest in ARDC’s support from these sources, which we found not only interesting but also valuable for the DAO.

Value received from the ARDC

Overall, we spent 1,753,466 ARB, valued at over $1,000,000 USD on average, for 6 months of the ARDC and received 43 distinct deliverables, some of which helped return over 1,000,000 ARB to the DAO’s treasury. The ‘ROI’ of the ARDC isn’t something that can be objectively assessed; everyone needs to judge for themselves whether the value they feel the DAO received was worth the cost.

We’re overall skeptical of the efficacy of the ARDC, not because of the outputs it delivered, but because we believe that the DAO wasn’t positioned to leverage the ARDC to its full potential. That has to do with a lot of other things irrelevant to the ARDC or its structure, and therefore, things that cannot be addressed in a new proposal or with a simple amendment in the previous one.

Looking Forward

While we do believe the concept of the ARDC to be something worth exploring further, we are not convinced that the DAO is at a point where we should expend additional effort in trying to make it work.

We do recognize the DAO’s need to have access to unbiased information on different proposals from relevant domain experts, and to have an avenue to easily procure it, so it would make sense to revisit the topic in the future.

Right now, however, there are things in the DAO that need to be addressed in order for us to be in a position to leverage the ARDC to the fullest.
