Thank you for your comment, @danielo, and for sharing the summary from the async retrospective. Critical feedback like this is essential for the ARDC and DAO programs to iterate and improve. As the communications lead for the ARDC, I’d like to address the points raised, providing context and referencing the work we’ve done over the past term.
On Transparency, Accessibility, and Awareness
A core focus from the beginning of ARDC V2 was to establish a high degree of transparency and create multiple channels for delegate engagement.
- Communication Structure: From the outset, we established comprehensive communication channels, including an ARDC Homepage on Notion for all documentation, bi-weekly community calls, and regular updates in the forum. Our Notion page includes meeting minutes, budget utilization, task tracking, and service provider progress.
- Information Accessibility: We have worked to make information accessible through numerous avenues. We maintain a dedicated forum thread for continuous updates, provide direct access to researchers in our bi-weekly calls, and have posted numerous X (Twitter) threads and articles to summarize and disseminate our findings. The goal has always been to bring information to the delegates.
- Compensation Transparency: Compensation details are publicly available on our Notion page, but we welcome feedback on how to make this information more accessible. All service provider hours and costs are tracked publicly. Furthermore, when changes to the Supervisory Council occurred, we publicly detailed the rationale for my role consolidation and the corresponding compensation adjustment, including the expected weekly hours and effective hourly rate, in a detailed forum post.
On Research Prioritization and Influence
The ARDC’s research agenda was designed to be a collaborative process, drawing from key stakeholders while remaining open to the entire community.
- Research Drivers: The initial research agenda was built upon recommendations from the Arbitrum Foundation, the Supervisory Council, and delegates who reached out to us. The Foundation’s initial list of topics was explicitly presented as “non-prescriptive suggestions”, not mandates. From the very beginning, we issued a broad call for delegate input, encouraging the community to propose research topics through the forum, direct messages, and our community calls.
- Prioritization Framework: We established a structured task prioritization framework early on, where prioritization occurs weekly in collaboration with the Arbitrum Foundation and OCL to focus on high-impact initiatives. To ensure our work remained aligned with the DAO’s needs, we continuously solicited feedback and research ideas from delegates during our bi-weekly calls and in every major communication update we published.
- “Mouthpiece” Concern: A review of our completed deliverables shows a wide range of topics that benefit the entire ecosystem, not a specific entity. Among other things, we have analyzed grants programs across multiple L1s and L2s, studied sequencer revenue and market share challenges, examined governance attack vectors, and provided recommendations to improve the Security Council. This work is foundational for the DAO’s strategic decision-making.
On Value and Feedback
We have strived to deliver high-value, actionable research and welcome feedback, while also recognizing the scope of our mandate.
- Quality and Value: While perceptions of value are inherently subjective, the comprehensive nature of the reports has been recognized by community members. Our reports on Incentive Programs, for example, provided concrete recommendations such as the need for a dedicated operations team, clear KPIs, and tapering rewards to avoid cliff effects. The Vote Buying Services analysis gave the DAO a clear framework for responding to this emerging challenge.
- Feedback Mechanisms: While a formal rating system is an interesting idea for the future, we established direct feedback loops from day one. Our bi-weekly office hours are the primary forum for this, where service providers present their findings and listen directly to delegate feedback and questions. We explicitly state that these calls are open for discussion.
I also want to share some observations on potential structural improvements for future iterations.
I believe OpCo could play a valuable role in research operational activities; this would bring direct alignment with operational needs while reducing coordination overhead. The retainer model has created some pressure from service providers to justify their pre-paid hours, which has led the Supervisory Council to spend valuable time analyzing whether specific research initiatives are truly needed.
I agree that a more flexible, on-demand model with a pre-vetted vendor list could prove more efficient and would enable request-driven research with flexible engagement terms. However, the current structure remains valuable, and the lessons learned during this first phase will help us address many of the concerns raised.
We appreciate your engagement and view this feedback as a vital part of the ARDC’s evolution should this extension be approved. We are committed to working transparently through the remainder of this term and delivering impactful research for the Arbitrum DAO.
Finally, I’d like to invite all delegates to our next bi-weekly call this Thursday, June 26, where we’ll open discussion for feedback and provide a comprehensive overview of our first six months of work.
Thank you.