[Non-constitutional] Subsidy Fund for Security Services

Subsidy Fund Proposal from the Arbitrum DAO Procurement Committee

Executive Summary

As voted on in the ADPC proposal here, one of the key tasks of the ADPC was to conceptualize and structure a subsidy fund for the Arbitrum DAO. This proposal sets up that subsidy fund, seeking $10 million worth of ARB to be administered by the Arbitrum DAO Procurement Committee (hereinafter referred to as the ‘ADPC’), which will select the projects that benefit from the whitelisted security audit service providers chosen via the ADPC’s procurement framework. The ultimate decision will be made by the ADPC based on the Means Test and the Application Process Terms.

Proposal Request

We propose the creation of a procurement subsidy fund allocating up to $10 million worth of ARB to provide financial assistance to both new and existing projects within the Arbitrum ecosystem.

These subsidies may only be spent with a pre-approved, whitelisted set of security audit service providers, selected by the ADPC, who will publicly display their fees. This approach eliminates the need for the ADPC to assess the reasonableness of funding requests.

The aim of the subsidy fund is to incentivise participation and growth among smaller projects, helping them overcome barriers to entry such as the difficulty of acquiring funding to pay for robust security audits.

The figure of up to $10 million worth of ARB was determined via a benchmarking exercise conducted with various security audit service providers. This form was shared with these service providers, and based on the responses of 10 of them (including the likes of Spearbit, Halborn, Nethermind, Three Sigma, Guardian, Zellic, etc.) on their scope of services and associated fees, we estimate that each project will require a 2-month security audit at an average cost of $200K. This would enable the ADPC to fund up to 50 projects; however, it should be noted that the $200K average is an estimate: fees are specific to each project, each engagement requires a scoping exercise, and audit costs will vary with the size and complexity of the codebase.
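As a rough illustration of the sizing arithmetic described above, the sketch below (in Python, using the benchmarking estimates; actual quotes are project-specific) shows how the fund size translates into an approximate number of fundable projects.

```python
# Back-of-the-envelope arithmetic behind the fund sizing described above.
# Figures are benchmarking estimates only; real audit quotes depend on
# codebase size, complexity, and the scoping exercise for each project.

AVERAGE_AUDIT_COST_USD = 200_000   # ~2-month audit, averaged across 10 provider responses
FUND_SIZE_USD = 10_000_000         # requested fund size (ARB at an equivalent USD value)

estimated_projects_funded = FUND_SIZE_USD // AVERAGE_AUDIT_COST_USD
print(f"Estimated projects fundable at the average cost: {estimated_projects_funded}")  # -> 50
```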

Any feedback on this proposal is encouraged via a public discussion on the community forum.

Subsidy Fund Design & Approach

Exec View

  • Up to $10 million worth of ARB allocated across ~50 projects to spend on whitelisted security audit service providers, with an average amount of $200K worth of ARB allocated per project and a max cap of $500K per project.
  • ADPC responsible for selection and oversight.
  • Our aim is to distribute the Subsidy Fund across cohorts of 8 weeks each. The number of cohorts is to be decided by governance.

Key Pillars

(A) Subsidy Fund Principles and Criteria

Core Principles Underlying the ADPC Subsidy Fund

Before considering a subsidy application, applicants should carefully evaluate the need for support. The purpose of these guidelines is to clarify the subsidy program and make the process as straightforward as possible.

All applicants should keep in mind the following key principles:

  1. Transparency & integrity - Applicants must disclose any potential conflicts of interest and receipt of any prior funding.
  2. Encouragement of investment - Subsidies should incentivise investment that would not otherwise occur without the subsidy. They should not cover costs that the beneficiary would have funded independently in the absence of any subsidy.
  3. Proportionate and necessary - Subsidies should be proportionate to their intended objective and limited to what is necessary to achieve it.
  4. Economic neutrality - The requested subsidy amount should not confer an economic advantage. It should be provided on fair terms, comparable to what could reasonably be obtained on the market.
  5. Consideration of funding options - Applicants are expected to explore all available funding options and responsibly choose the option that best suits their needs.
  6. Enhancement of beneficiary security - Subsidies should bring about changes that enhance the security of the beneficiary’s protocol.
  7. Positive impact on the Arbitrum ecosystem - Subsidies should have a beneficial effect on the Arbitrum ecosystem, contributing to its growth and sustainability.
  8. Value for Money - The evaluation method should ensure decisions maximize both financial and non-financial value to Arbitrum.
  9. Probity - Subsidies should be issued in an environment that ensures fairness, impartiality, and compliance with established guidelines and rules. There will be an emphasis on transparency, accountability, and integrity throughout the procurement process, evidenced by the development of this publicly documented Means Test methodology. This focus on Probity will mitigate procurement risks and safeguard the integrity of the procurement process.
  10. Risk Management - Risk management policies will be implemented to identify and mitigate procurement risks.
  11. Engagement with SMEs - We have invited a wide range of organisations to participate, all of whom will be subject to the eligibility criteria.
  12. Cyclical Whitelist events - We will have clear rules for cyclical whitelist reviews to open a pathway for future applicants and new market entrants.

Means Test: Criteria for Evaluation

The reason for this approach over a purely quantitative one is that most projects, especially the smaller ones targeted by this subsidy program, do not possess obvious, immediately measurable metrics.

The development of this Means Test aims to provide a structured approach for the ADPC to evaluate applications for financial assistance. This tool is designed to identify applicants who would benefit most from support, ensuring equitable access to subsidies within the Arbitrum Ecosystem, particularly for smaller entities with valuable contributions.

The intent is to allocate subsidies to those most in need, avoiding exploitation by larger players looking for a ‘free lunch/handout’. Such exploitation could give recipients an unfair advantage over their competitors or be an inefficient use of the DAO’s funds if it does not bring about a net positive change.

The means test will include a scoring system ranging from 1 to 5, reflecting the merit of each application.

  • Rating 1: Unsatisfactory
  • Rating 2: Below expectations
  • Rating 3: Meets expectations
  • Rating 4: Above expectations
  • Rating 5: Exceptional

Each of the sub-criteria in the means test has a different level of importance and will carry a weighting: a weighting of 1 indicates low importance, 2 indicates neutral importance, and 3 indicates high importance.

Each application will be scored by ADPC members, followed by a collective decision on the most deserving grant recipients, taking into account the rating against the eligibility criteria, a value-for-money evaluation and the funds available. The ADPC may make other decisions in relation to the operation of the fund and selection of applicants as further detailed in the Application Process Terms.

After the applications have been reviewed and decisions taken as to the grant beneficiaries, the average score assigned to each project will be shared publicly, ensuring that transparency is maintained throughout the process. In the event that an applicant receives a high score but is not chosen as a grant recipient, explanatory feedback will be provided either on an individual or collective basis to the cohort.
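To make the scoring and weighting mechanics concrete, the sketch below shows one way a weighted means-test score could be aggregated from the 1-5 ratings and 1-3 weights described above. The proposal does not fix an aggregation formula, so the weighted average (and the sub-criteria chosen for the example) is an illustrative assumption rather than the ADPC's prescribed method.

```python
# Illustrative sketch only: the proposal defines ratings (1-5) and weights (1-3)
# per sub-criterion but does not prescribe an aggregation formula, so a simple
# weighted average is assumed here.

def weighted_score(ratings: dict[str, int], weights: dict[str, int]) -> float:
    """Return the weighted average rating across the rated sub-criteria."""
    total_weight = sum(weights[name] for name in ratings)
    weighted_sum = sum(ratings[name] * weights[name] for name in ratings)
    return weighted_sum / total_weight

# Hypothetical applicant rated on three sub-criteria from the Means Test.
weights = {"Funding Gap Rationale": 3, "KPIs": 3, "Scalability Potential": 1}
ratings = {"Funding Gap Rationale": 4, "KPIs": 3, "Scalability Potential": 5}

print(round(weighted_score(ratings, weights), 2))  # -> 3.71
```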

Evaluation Criteria

Each criterion below is broken into weighted sub-criteria (weight 1 = low importance, 2 = neutral, 3 = high importance).

Arbitrum Ecosystem Contribution - How aligned is the project with the Arbitrum ecosystem and how easy will it be to track the applicant’s use of the subsidy funds?

  • Ecosystem Contribution (weight 3): How does the applicant’s project contribute towards the growth of the Arbitrum ecosystem?
  • Transparency Practices (weight 2): To what extent does the applicant demonstrate transparency in its operations?
  • Community Engagement (weight 1): How does the applicant engage with the DAO community and solicit feedback/input on its project, incorporating this into its decision-making?
  • Accountability Measures (weight 3): What mechanisms does the project have in place to ensure accountability and responsible stewardship of subsidy funds, including the governance structures in place?

Business Model & Need for the Subsidy - How effectively does the applicant’s business model align with their need for the subsidy?

  • Clarity of Business Model (weight 2): How well-defined and understandable is the applicant’s business model?
  • Team Experience (weight 2): What is the track record of the team and their ability to execute their plan?
  • Funding Gap Rationale (weight 3): Is there a clear explanation of the funding gap the applicant is facing, along with the rationale for why additional subsidy funding is necessary to achieve its objectives?
  • Reasonableness of Subsidy Amount Requested (weight 3): Does the requested subsidy amount make sense within the context of the project’s needs and potential impact?
  • Scalability Potential (weight 1): What is the scalability potential of the applicant’s business model following the support of the subsidy?

Financial Analysis - How realistic and stress-tested are the applicant’s financial status and projections, and is their plan for the use of the subsidy funds clearly outlined?

  • Accuracy of Projections (weight 1): How realistic and well-supported are the financial projections provided by the applicant, inclusive of revenue forecasts and cost analysis?
  • Sensitivity to Scenarios (weight 1): To what extent does the applicant’s financial analysis consider different scenarios, such as base, target and stress scenarios, to assess the project’s resilience and adaptability to changing market conditions?
  • KPIs (weight 3): Are there clearly defined KPIs that will be used to track the project’s performance and measure progress towards achieving its goals?
  • Preferred Funding Distribution (weight 2): Does the applicant have a preferred distribution plan for the subsidy funds, and is there a rationale provided for this distribution approach, such as front-loading funds for critical start-up costs or phased funding based on project milestones?

Risk Analysis - Is the applicant aware of the risks to their project, and what is their plan for mitigating these risks?

  • Risk Identification (weight 2): How effectively does the applicant identify and assess potential risks and vulnerabilities that the project may have?
  • Security Requirements (weight 3): Does the applicant have a clear understanding of its security requirements and the measures needed to protect against security breaches, such as through the conducting of a security audit?
  • Mitigation Strategies (weight 2): What strategies does the applicant have in place or intend to implement to safeguard against the aforementioned risks?

Regarding the ‘Ecosystem Contribution’ metric above, we have conducted an initial assessment of the types of projects that are currently building in the Arbitrum ecosystem and identified a few verticals that the ecosystem would benefit from funding. These are set out below, along with the rationales for choosing them. We will provide more weight to these areas and welcome input from the community on our selection.

RWAs & Tokenization

  • Commands a small proportion of Arbitrum attention and building activity; however, it is a focus of big institutions & banks deploying large amounts of capital into tokenization.
  • Arbitrum, as the home of DeFi, is currently trailing in this burgeoning key category (e.g., Mantle passed a $60 million support program for tokenized RWAs on its chain, and Polygon and Avalanche have been very active in the space).
  • Tokenized treasuries and private credit have passed $1.3 billion in terms of value tokenized, and are only projected to grow in importance in the near future, highlighted by the recent launch of BlackRock’s tokenized fund.
  • Forum discussions have raised RWAs as a tool for diversifying the Arbitrum DAO treasury. See here.

Gaming

  • Becoming an increasingly relevant Web3 vertical, with major games finally launching this year and more tailwinds to come (e.g., Shrapnel, Star Atlas, etc.).
  • Arbitrum significantly lags behind other major ecosystems (especially Polygon), both in terms of games and in terms of support & incentives for development on its network.
    • Discussion on ramping up incentives and acknowledging Arbitrum’s position in gaming. See here.
  • Attracting gamers in the right demographic (18-34, tech-savvy) supports onboarding to Arbitrum’s broader DeFi and blockchain ecosystem.
  • Aligns with the ‘Catalyze Gaming Ecosystem Growth on Arbitrum’ proposal here.

Collab Tech

  • Definition: Tooling for governance, operations, community, and contributors.
  • Arbitrum governance is currently sprawling, exacerbated by the introduction of Orbit, which will result in an ecosystem of decentralized governance organisations making collaboration even more challenging.
  • Focusing on and incentivizing new Collab Tech represents an opportunity to clean up governance practices for Arbitrum DAO and beyond.
  • This sector is underfunded since it is less understood by VCs; thus, focus and resources will have to come from within.
  • The majority of Arbitrum’s sequencer fees come from this category (e.g., Quest Protocol).
  • Discussion on spinning up a Collab Tech business cluster. See here.

You can find more detailed information on the rubrics informing the Means Test here.

(B) Application Process

Application & Review Windows

The Subsidy Fund will run in cohorts of 8 weeks each, with the number of cohorts decided by DAO governance. Each cohort will consist of an initial 2-week submission period followed by a 6-week review period. Moreover, a maximum of 25% of the total Subsidy Fund can be disbursed per cohort, ensuring that the fund remains open to new entrants over time. Each cohort will review applications on a first-come-first-served basis; once the maximum capital for a cohort has been allocated, the remaining applicants will be rolled over to the next cohort.
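The sketch below illustrates the per-cohort cap and rollover rule just described: applications are processed first-come-first-served until 25% of the total fund has been committed, and the rest roll over to the next cohort. The function and example figures are hypothetical and only meant to show the mechanics.

```python
# Hypothetical illustration of the per-cohort allocation rule: first-come-first-served
# until 25% of the total fund is committed; remaining applicants roll over.

def allocate_cohort(applications, total_fund_usd, cap_fraction=0.25):
    """Split (project, approved_subsidy_usd) pairs into funded and rolled-over lists."""
    cohort_cap = total_fund_usd * cap_fraction
    committed = 0.0
    funded, rolled_over = [], []
    for project, subsidy in applications:
        if committed + subsidy <= cohort_cap:
            funded.append((project, subsidy))
            committed += subsidy
        else:
            rolled_over.append((project, subsidy))
    return funded, rolled_over

# Six hypothetical applicants, each at the $500K per-project maximum.
apps = [(f"Project {i}", 500_000) for i in range(1, 7)]
funded, rolled = allocate_cohort(apps, total_fund_usd=10_000_000)
print(len(funded), len(rolled))  # -> 5 1 (the sixth applicant rolls over to the next cohort)
```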

Initial Screening

To efficiently handle the anticipated surge in applications and to ensure that the highest-quality, most relevant applicants are selected, the 5 sub-criteria below (those carrying the highest weights in the Means Test above) will first be applied to all applicants, with the top-scoring applicants moving forward in the evaluation process to be assessed in greater depth:

  1. Funding Gap Rationale
  2. Reasonableness of Subsidy Amount Requested
  3. KPIs
  4. Ecosystem Contribution
  5. Accountability Measures

The ADPC reserves the right to introduce Mandatory Requirements over time that operate as threshold tests and will publish those requirements if introduced.
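As an illustration of how this screening step could work, the sketch below ranks applicants on the five screening sub-criteria only and shortlists the top scorers for in-depth review. The proposal does not specify the exact cut-off or aggregation method, so the simple sum and the `top_n` parameter are assumptions.

```python
# Hypothetical sketch of the initial screening filter described above.
# The five screening sub-criteria all carry the highest weight (3), so a plain
# sum of their 1-5 ratings is used; the cut-off (top_n) is not fixed in the proposal.

SCREENING_CRITERIA = [
    "Funding Gap Rationale",
    "Reasonableness of Subsidy Amount Requested",
    "KPIs",
    "Ecosystem Contribution",
    "Accountability Measures",
]

def screening_score(ratings: dict[str, int]) -> int:
    """Sum an applicant's 1-5 ratings on the five screening sub-criteria."""
    return sum(ratings[c] for c in SCREENING_CRITERIA)

def shortlist(applicants: dict[str, dict[str, int]], top_n: int) -> list[str]:
    """Return the top_n applicants by screening score for in-depth review."""
    ranked = sorted(applicants, key=lambda name: screening_score(applicants[name]), reverse=True)
    return ranked[:top_n]
```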

In-Depth Review & Feedback

Projects that pass the initial screening will undergo due diligence (DD) by the ADPC, including interviews and constructive feedback (provided either individually or on a collective basis).

Award & Monitoring

Once approved, projects receive subsidies, with periodic check-ins and a concluding evaluation to measure impact and success.

(C) Selection Process & Reporting

Transparency and continuous dialogue form the backbone of our selection and reporting process, ensuring that each funded project remains aligned with program expectations.

Bi-Monthly Reports

We will provide updates every two months (i.e., in line with each cohort) on our selections and on funded projects. These updates will include general project trajectory and progress toward milestones. To compile the reports, we will set regular monthly check-in dates at which projects fill in a template/slide giving the key information about the project’s status, such as:

  • Summary of Achievements for the Month
  • Funds Utilized
  • Milestones Reached
  • Challenges Faced & Plan of Action
  • Feedback Integration, i.e., how projects have incorporated feedback provided.
  • Next Steps & Priorities

Output Metrics

With the initial priorities in mind, the following output metrics will be used to measure meaningful output:

  • Number of Projects Funded: Number of projects funded during each cohort.
  • Total Funds Allocated: Cumulative sum of funds distributed in each cohort, showcasing the program’s financial impact.
  • Percentage of projects funded in target verticals: As outlined in the Means Test, the three key verticals we have identified are RWAs & Tokenization, Gaming, and Collab Tech.

Outcome Metrics

Depending on the final portfolio of funded projects, we will gauge the success rate of awarded projects through specific outcome metrics. While these metrics can be influenced by a wide range of external factors, such as market conditions and individual decisions on a project level, we are committed to supporting and funding the most promising projects to the best of our ability. Metrics include:

  • Percentage of funded projects successfully deployed on Arbitrum
  • Percentage of KPIs outlined in the application achieved by funded projects
  • Percentage of projects successfully deployed as Orbit chains: One of the key aims of the Arbitrum DAO is to build and expand the Orbit ecosystem. Funding projects that grow the Orbit ecosystem is a net positive to the DAO.

(D) Project Allocation

Our approach to subsidy fund allocation focuses on achieving high impact while ensuring that a meaningful number of projects obtain funding.

To ensure that the subsidy is spread across a large number of projects rather than concentrated in a few larger ones, the maximum subsidy granted to any one project will be 5% of the available subsidy fund. Given a fund of up to $10 million, the maximum subsidy per project will therefore be no more than $500K worth of ARB.

(E) Team Setup

The administration and selection process of these subsidies will be managed by the ADPC. Even though the ultimate decision will lie with the judgment of the ADPC, their assessment will be strongly guided by a means test that evaluates key metrics to determine deserving projects.

The activation of the ADPC to manage the Subsidy Fund will hinge on extending the current 6-month mandate once the Subsidy Fund becomes operational. Should the DAO or the ADPC opt against a continuation of the ADPC, a Subsidy Fund Management Committee will need to be elected. The ADPC will allocate ample time for this process to ensure the Subsidy Fund operation is not reliant on the ADPC’s mandate extension.

(F) Governance

The Subsidy Fund governance aims for transparency, efficiency, and broad community involvement. It outlines mechanisms to ensure fair and balanced decision-making for all stakeholders.

Multi-Sig

All providers must undergo and successfully complete the standard Know-Your-Business (hereinafter referred to as the ‘KYB’) verification processes with the Arbitrum Foundation prior to receiving the service-subsidy.

Subsequently, the designated multi-sig members, established at the inception of the ADPC and voted in by the ArbitrumDAO, will take charge of disbursing funds to the selected beneficiaries, with the transactions streamed using Hedgey.

In recognition of the additional responsibilities undertaken, each of the five multi-sig signers is proposed to receive supplementary compensation of 500-1,000 ARB per month.

It is also important to note, as per the ratified proposal which led to the formation of the ADPC, that the multi-sig committee grants the ArbitrumDAO the authority to claw back funds from the ADPC’s multi-sig wallet using the Zodiac Governor Module, if necessary.

Checks & Balances

Kindly note that the subsidy fund will be subject to the same checks and balances found within the procurement committee proposal, regulated by an agreement entered into by all elected ADPC Members, with the Arbitrum Foundation serving as a counterparty to the agreement. These checks and balances include:

Conflict of Interest Provision: ADPC Members will be bound to act in absolute good faith and with utmost honesty, to refrain from deriving unauthorized profits from their position, and to disclose conflicts of interest. ADPC Members should always disclose any potential or actual conflicts of interest to the other ADPC Members, who will then limit the conflicted Member’s involvement in the task in question.

In short, all ADPC Members must declare the nature and extent of any interest, direct or indirect, which they are aware of having in a proposed task at hand.

Record-keeping and Reporting: Comprehensive and precise record-keeping is imperative. ADPC Members will be required to maintain detailed accounts and documentation of the ADPC’s internal operational workflow, together with meeting minutes. Furthermore, periodic reporting is essential to keep the ArbitrumDAO updated regarding task-specific progress and internal ADPC administration.

Duty of Impartiality: ADPC Members will have an obligation to act in an impartial manner in relation to their tasks & workflow, ensuring that the ADPC is not compromised by personal interests or external influences.

Obligation of Recusal: ADPC Members with a conflict of interest involving a project and/or service provider being reviewed by the ADPC should recuse themselves from participating in the evaluation, facilitation & administration of the applicable procurement process.

Prohibition of Self-Dealing: Participants should refrain from voting on sending funds to themselves or organizations where any portion of those funds is expected to flow to them, their other projects, or anyone they have a close personal or economic relationship with.

Ethical Trading: Members are required to follow ethical trading standards concerning ARB and any other relevant digital assets.

Grant Application Terms and Conditions can be found here.

CCing: @Immutablelawyer; @Pablo, @sid_areta, @cliffton.eth, @raam

10 Likes

We appreciate the ADPC for this clear proposal that outlines the goals, purpose, and evaluation criteria for projects that might receive the security audit subsidy. We believe this proposal could be quite beneficial to onboarding new applications. A few questions came to mind:

  1. Is there data or evidence to suggest that the cost of security audits is a big hurdle for new protocols wanting to launch on Arbitrum?
  2. Do you foresee a large number of non-VC funded projects utilizing this security audit subsidy?
  3. Do you foresee all of the $10M budget being distributed after the 8-month period?
3 Likes

Is the idea to pay for the audit in full? I know a previous discussion was to do something like:

  1. Projects that meet X criteria receive an annual stipend of $50,000 to spend on audits.
  2. The Sec Service Subsidy Fund will share 50% of the costs with the project up to $X amount annually.
  3. Special Exceptions can occur on a case-by-case basis if needed, although the expectation would be that 95% of projects would go down the outlined path.

This would allow for casting a wider net and be more of a cost-sharing program for Arbitrum-aligned projects rather than a blanket welfare program, IMO. The numbers I used were just ballpark, but it was my understanding that it would be more like a coinsurance program. This also allows projects at various stages to choose the best auditor for their project based on negotiated rates, auditor expertise, and specific needs.

Also, the Google Docs are not publicly shared; we need to change the permissions so they are viewable.

5 Likes

In life, markets, and Magic: The Gathering, anything that costs 0 will eventually be exploited by external forces.
A partial payment would, in my opinion, be quite good:

  • it would be positive for every participant;
  • it would still require participants to de facto have skin in the game by paying a portion (and so they won’t go around asking for audits for whatever reason).

Maybe the numbers could be a bit different to favour smaller teams, which could tap into a budget that can potentially be 65-70k (effectively, if the audit costs 100k, a normal project has 50% covered while a smaller project has 65-70% covered; in practice this would likely be two audits of 50k covered at 70%, since a smaller team likely means an initially smaller product). To do this, though, due diligence on the team’s effective runway and spending would be needed, which in some cases might be very, very hard to do.

4 Likes

Hello, Bernard! Thank you for this proposal. I’m from the Crypto Unicorns project, which is about to join the XAI and Arbitrum ecosystems late this month.

I tried clicking on the Application Process Terms document but I don’t seem to have access to it. How do I request access? Thank you!

2 Likes

Thank you for the comments everyone! We’re drafting our responses and will share them soon.

@CU-ManicUnicorn, @dk3, apologies - the Google Doc for the Grant Application Terms and Conditions is now viewable.

3 Likes

@dk3 @JoJo

Sharing replies to feedback hereunder:

Thanks for the constructive feedback lads!

Re. the co-insurance method:

This method/mode of administration was discussed internally when structuring the operational and administrative parameters that would underpin fund disbursement. In this regard, we opted for a service-by-service approach for the following reasons:

  1. The ADPC’s current mandate is a mere 6 months (extendable by a further period that is still undetermined given our nascent stage). Hence, we cannot in good faith enter into an agreement with a third-party project-applicant and guarantee annual funding, given that the mandate would not align with such a timeline.

  2. Giving a one-time grant for security services in general, as opposed to having projects elect an SP, choose the service they need, and then apply for the subsidy, would entail more workload on the administration side in ensuring that allocated funds are actually spent on security services and not on other unintended purposes. Hence, from a proper administration perspective, it would be more difficult to trace fund expenditure in this manner.

  3. Were we to go with this model (in line with Point 1), should we grant an annual stipend to Project X to be used within a 12-month period and the ADPC be terminated within 6 months, there would be no one in place to vet whether the funds are being spent in accordance with their intended purposes (the concern highlighted in Point 2).

Re. the %-based approach: following your comments and internal discussions with fellow ADPC Members, we have decided to amend the proposal to reflect a %-based approach to subsidies, under which the ADPC will cover up to 70% (maximum threshold) of the cost of the corresponding service solicited by the project.
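As an illustration of what the %-based approach means for an individual applicant, the snippet below combines the 70% maximum coverage with the $500K per-project cap from the main proposal. Whether the two limits stack exactly this way is an assumption for illustration; the binding terms would sit in the Grant Agreement.

```python
# Hypothetical illustration of the amended %-based subsidy: the ADPC covers up to 70%
# of the quoted service fee. Combining this with the proposal's $500K per-project cap
# is assumed here for illustration only.

def subsidy_amount(service_fee_usd: float,
                   coverage: float = 0.70,
                   per_project_cap_usd: float = 500_000) -> float:
    return min(service_fee_usd * coverage, per_project_cap_usd)

print(subsidy_amount(200_000))    # -> 140000.0 (the project pays the remaining $60K)
print(subsidy_amount(1_000_000))  # -> 500000.0 (capped at the per-project maximum)
```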

3 Likes

Appreciate the comment @PennBlockchain !

Answering your query below:

The need for the ADPC initially came about when several projects were requesting subsidies from the DAO, primarily for security services, to get their projects up and running. This is where the initial concern (and subsequent need for the ADPC) materialised. In addition, some projects (mainly non-VC funded ones) have already shown interest in the Subsidy program.

This was further substantiated following our public consultation with security service providers wherein we solicited their input in relation to the average/median security-service fees applicable to certain service-classes.

As a final point, the intention is not necessarily to fully allocate the $10 million budget. The per-cohort threshold (as delineated above) is a maximum that cannot be exceeded, not a sum that must be fully disbursed. Naturally, unutilised funds will be sent to the ArbitrumDAO Treasury or, if the ADPC’s mandate is extended, transitioned over to the next iteration of the ADPC.

Should you have any further questions or queries, we remain at your disposal!

4 Likes

I agree that since we have a committee, it needs funds to carry out its functions.
The proposal itself is good and I support it.
But to begin with, I would suggest a more modest amount.
I suggest offering options for funding amounts during a temperature check.

1 Like

Appreciate the comment @cp0x !

The amount requested was based on data from a public consultation carried out by the Procurement Committee with the participation of security service providers.

It is based on the average maximum amount needed to provide sufficient funding for X projects per cohort. Hence, offering multiple funding options would depart from the data-driven approach we utilised in structuring the internal mechanics and funding allocation applicable to project subsidies.

Yes, thanks for the clarification.
But there is still one question remaining regarding the number of projects:
why exactly so many?

The SF is expected to attract a large number of projects, as we have already received numerous inquiries pre-launch.

Hence, we needed to impose a maximum cap.

1 Like

@Bernard @Immutablelawyer thank you for the work on getting this proposal up for evaluation and for taking feedback from the DAO; it’s good to see the ADPC progressing on putting up a framework for the audit subsidy support program.

I am supportive of the need for a framework and program to support our ecosystem’s developers in accessing the best audit and security partners, hopefully including in later rounds a wider selection of safety and security tools and inputs (threat detection, economic security, formal verification (especially for Stylus protocols), etc.).

Having said all that, I have some concerns with the proposed framework, because it feels like it has veered somewhat from the original mandate, which was to propose and assist the DAO in handling procurement operations, establish frameworks, and set up procurement programs, with the Subsidy Fund named as one of the first initiatives.

While I appreciate the urgent need for these programs, intentionally or unintentionally you have proposed a system where the ADPC is seeking to deploy up to $10m under the following process:

  1. ADPC are the screening committee to decide which vendors to whitelist for this program
  2. ADPC has not made public (or not linked here) which vendors qualified or didn’t qualify and why
  3. ADPC will directly administer and decide on which protocols are recipients of this $10m
  4. no oversight board or technical board with specific expertise in the area of these grants
  5. no DAO voting either directly or via an Optimistic process with challenge

To be clear, do not take my comments as a total rebuke; there is much to like in this proposal, including the Application Process and Selection Process & Reporting sections, which show great thought. Still, we might be better served by the ADPC not trying to take on all roles in this process.

A good procurement process, even if not wholly transparent, should be auditable and have good checks and balances; the current process simply lacks any material checks and balances, which isn’t a good precedent for the ADPC to set out of the gate when establishing such frameworks.

For reference, here are other grant programs of $1m or more in the DAO to date, their processes, and how each has some clear aspect of checks and balances.

  • LTIPP, where up to 45m ARB is being distributed, has a council of 5 members and three teams of advisors who work to provide recommendations to the DAO, and all proposals are approved by the DAO.
  • PL / Thrive, where there is now an oversight board that provides approval or has challenge authority over expenditures depending on spend levels, and provides insight into areas of focus for the DAO (like deciding what areas we want to focus on).
  • Questbook, which allows the domain experts to singularly or jointly deploy grants, but with each grant and the available budgets being much smaller, plus each domain expert was elected by the DAO based on their specific competency in the subject of the grants.

I hope the ADPC takes this feedback to help refine the proposal, and to critically evaluate what role it is intended to have during this initial 6-month mandate: is it to run procurement for the DAO, or is it to recommend and hopefully implement systems so that the DAO can handle such activities at scale across many areas of DAO procurement?

P.S. If you feel the proposal should go to the DAO as is, please do consider giving delegates multiple voting options with significantly smaller program sizes, because there may be interest in funding a quick-and-dirty pilot program while a full program is formally put in place.

5 Likes

These comments and thoughts reflect my personal opinions on this proposal. Whilst I am a member of the Arbitrum Representative Council (ARC), they do not necessarily represent the overall views of the council or provide an indication of the final voting decision.

I’m directionally in favour of this proposal as it aligns well with my vision for the DAO as a support service for protocols building on Arbitrum. This would be a competitive advantage which would draw builders to Arbitrum by offering them a range of free/discounted support services they could not access elsewhere.

I’m also in favour of this change for the reasons pointed out by @dk3 and @JoJo. Thanks for being flexible.

@coinflip makes an interesting point re: checks and balances. I lean towards greater empowerment of groups like the ADPC to make decisions on behalf of the DAO. However, adding greater transparency (i.e., details on which vendors qualified / didn’t qualify) and/or an optimistic process with challenge on decisions may be beneficial.

1 Like

@coinflip, thanks for the feedback! Appreciate the time you took going through, and happy to clarify a few aspects about the ADPC’s mandate and process - some of them might directly resolve a few of your remarks:

(1) On your comment ‘ADPC has not made public (or not linked here) which vendors qualified or didn’t qualify and why’:

  • Exactly right, the ADPC is currently in the process of setting up the procurement framework to whitelist security service providers for the DAO. Given the large amount of legal work required to structure an RFP, it is still in the drafting phase and has not yet been published to procure any security service providers.
  • To shed some more light as to why, the RFP includes a Framework Agreement which outlines detailed considerations around general provisions of the framework, administration of the framework, financial provisions, personnel, information management, risk management, IP rights, termination and dispute, delivery, security, and reporting. The RFP also includes an Order Form and Contract Details for all applicants. As you can imagine, this framework is comprehensive and requires time to put together, refine, and publish. Moreover, we are also putting time into defining the infrastructure around accepting RFP responses and evaluating them fairly, including sourcing a security expert as an advisor to aid us in judging applications.
  • The aim is for the ADPC to publish the RFP at the end of April, with applications due 4 weeks after the RFP is published, followed by an 8-week review period; this will be a rolling process in which the ADPC approves applicants over the course of that period. We will start accepting applications for the Subsidy Fund only after vendors are whitelisted through this process.
  • As such, the ADPC will make public the vendors that have been whitelisted, along with a rationale for their selection. The intention was never to make decisions in a silo, and the reasons for selection will always be made public. We will also publicise which vendors have not been selected but, in the interest of privacy, will not provide a reason for their non-selection.
  • Furthermore, the ADPC’s mandate was always to act as the screening committee to decide the vendors to whitelist for the program. As you can see in the Tally vote which established the ADPC, ‘the ADPC bears the responsibility of diligently executing the steps essential to implement the aforementioned procurement framework’.

(2) On your remarks ‘ADPC will directly administer and decide on which protocols are recipients of this $10m’ and ‘no oversight board or technical board with specific expertise in the area of these grants’:

  • Fully agree here on the gap in security expertise - as mentioned above, we are in the process of sourcing a neutral security expert as an advisor to aid us in judging both applications from service providers during the RFP process and applications from projects looking to receive subsidies from the Subsidy Fund.
  • On management, the original intention when drafting the ADPC proposal was that the ADPC would execute the Subsidy Fund, and we have received no other indication from the rest of the community that another committee is required to handle fund disbursement.
  • Beyond that, from an operational standpoint we would feel comfortable managing it, drawing on our experience running the Uniswap-Arbitrum Grants Program (UAGP).
  • Obviously, if there is consensus from the DAO on standing up another separate committee to disburse grants from the Subsidy Fund, we are very happy to take that into consideration.

(3) On your question ‘no DAO voting either directly or via an Optimistic process with challenge’:

  • Good idea; we’d be happy to institute an Optimistic Challenge process on the selection of specific security service providers or grant recipients if it works operationally. At this point in time, the Optimistic Governance Module being developed by Axis Advisory & Tally is not yet available for use.

(4) On your comment ‘if you feel the proposal should go to the DAO as is please do consider providing delegates multiple voting options on significantly smaller program size’:

  • We hear you on this one; we are not married to this number but merely see it as a reasonable starting point. In that sense, we agree with your suggestion to offer different funding amounts as options in the initial Snapshot. We propose $2.5M over a 2-month period, $5M over a 4-month period, or the original $10M over an 8-month period.
  • For more context on the $10M: as explained in @ImmutableLawyer’s response to @cp0x above, the amount requested was based on data we obtained via a public consultation in which we asked security service providers about their fee structures and scope of services. The size was seen as high enough to provide sufficient impact and justify the effort going into structuring the program.
  • The $10M size is based on the assumption of a 2-month audit at an average cost of $200K, which will allow the ADPC to fund 50 projects over the 8-month program. Any unutilised funds will be sent to the ArbitrumDAO Treasury or, if the ADPC’s mandate is extended, transitioned over to the next iteration of the ADPC.

Just to recap, if there is consensus from delegates on the below, we are happy to:

  1. Stand up a separate committee to disburse the Subsidy Fund.
  2. Institute an Optimistic Challenge process on the selection of specific security service providers or grant recipients.
  3. Start with a smaller program, e.g., of $2.5M over a 2-month period or $5M over a 4-month period before additional funds are requested to continue the program.

If you have time to respond, it is obviously much appreciated, and thanks again for the helpful guidance.

3 Likes

Thank you all for the feedback! The proposal will incorporate the following amendments and subsequently be posted on Snapshot:

Amendments

  • The ADPC will cover up to 70% (maximum threshold) of the corresponding service solicited by the project.
  • Institute an Optimistic Challenge process on the selection of specific security service providers or grant recipients.
  • Provide the following options for the initial size of the fund and length of the program:
    • 1 cohort of 8 weeks (2 months) for a total fund size of $2.5 million.
    • 2 cohorts of 8 weeks each (4 months) for a total fund size of $5 million.
    • 4 cohorts of 8 weeks each (8 months) for a total fund size of $10 million.

We also discussed forming an entirely separate committee to disburse the Subsidy Fund and collectively landed on the view that, for this initial iteration, it may make more sense to operate in a “committee-light” way, i.e., ADPC + security expert, given:

  • The additional security expert will fill gaps in subject-matter knowledge;
  • The ADPC’s experience equips it to manage the fund operationally;
  • There is no clear consensus around the need for a separate committee to manage fund disbursement;
  • The implementation of the Optimistic Challenge process (when the module is available) and the transparency of providing rationales for the selection of vendors and grant recipients will act as a check and balance on any decisions taken by the ADPC.

Lastly, we are currently in discussion with the Foundation’s legal counsel regarding the addition of an exclusivity clause for projects receiving subsidies, namely around the provision being added to the Grant Agreement and the practicality of enforcing oversight post-disbursement.

3 Likes

Hello! Thank you very much for developing this proposal and process. With @SEEDGov we’ve been involved as an Advisor in the LTIPP and I think this process has a very interesting and well-thought-out design. So, congratulations and thank you.

That said, I have some questions and concerns very similar to those expressed by @coinflip, which I support and believe need further discussion.

According to the proposal approved in Tally “The mandate of the ADPC aims to create an optimal organizational framework for service procurement while also creating a marketplace for service providers that would have gone through preemptive quality assurance.”

It is my understanding that mandate 1/5 was to develop the RFP for the selection of these service providers. How is it that it now depends on a pre-approved whitelist? Am I wrong?

Also, when listing your mandates, you mention that:

Where can I find the “input collated” for the evaluation criteria (mandate 2) of small projects that will receive the subsidies? Have you discussed the rubric with the LTIPP Council? There have been very useful learnings from their scoring experience that I think can be helpful.

As @coinflip mentions, and I agree with him regarding the question of the ADPC acting as program manager, committee, and decision maker in this process, I believe it is better to limit your participation to a program-management role (similar to StableLab in the LTIPP, where the other decisions are made by the Advisors and Council).

I don’t believe this response satisfactorily addresses his question. ‘Diligently executing the steps’ means managing or advancing a process, not positioning oneself as the decision maker for everything related to it.

Why is there a need to rush this snapshot vote before having this defined?

Assuming it is approved, wouldn’t this incur additional costs? Given that you acknowledge a lack of security experience, the role of the ‘security expert’ would be crucial in assessing the applicants. Why not directly appoint a manager or a committee of security experts and provide them with compensation to carry out the task?

It might be interesting to involve the ARDC in this process, as it includes a DAO advocate, a member qualified for Risk Assessment, and a member qualified for Security Assessment.

I sincerely believe that it was not sufficiently discussed. I have requested information about the public Notion, the biweekly reports, and the minutes of the meetings, none of which were made available to the DAO.

The only message with information about a call was in the Telegram group regarding the first call, which was announced approximately an hour before it actually took place. If I am mistaken or do not have the correct source of information, I apologize.

What is the need to expand the budget that has already been approved for controlling one (or the same) multisig?

I believe it would be ideal for this entire document to be presented on the forum rather than in a separate PDF on Drive that could be modified in the future.

Regarding this, I believe it would be better for the funds paid by the DAO to be denominated in ARB.

Thanks again, I think overall the proposal is good and well thought out.

7 Likes

ADPC Recording for call 18/04/2024:

I am concerned by the current trend and the perception that the Arbitrum Foundation is a pie everyone wants a slice of for an easy life. Most proposals lack transparency, requesting huge sums (yes, I find 10 million ARB excessive for these purposes), with no specifics and often just a collection of abstract ideas without practical applicability. People are only asking for funds. But instead of asking, why not start contributing?

Create a pilot project, conduct audits, show how it works. Demonstrate your success stories from past projects and please be more modest in your demands. Let’s start with smaller amounts, remembering that there are people who can’t afford food and water.

My message might be a bit emotional; I don’t intend to offend or devalue anyone’s work. I like this proposal, it’s important and needed, but let’s approach targeted funding more responsibly.

Here are some comments from the UADP:

Directionally, we are in favor of this proposal. There is clearly a need to help front some of the costs associated with audits. However, the proposal does seem a bit rushed. A couple of aspects could have been addressed prior to taking this proposal to Snapshot, which would have assured a higher success rate as opposed to the current divisiveness we’re seeing in the polls.

“we are in the process of sourcing a neutral security expert as an advisor to aid us in judging both, applications from service providers during the RFP process and applications from projects looking to receive subsidies from the Subsidy Fund”

“the ADPC is currently in the process of setting up the procurement framework to whitelist security service providers for the DAO. Given the large amount of legal work required to structure an RFP, it is still in the drafting phase and has not yet been published to procure any security service providers.”

The above comments from Bernard illustrate some pending work that should be completed before snapshot.

Perhaps a better order of operations is to obtain a soft commitment from the DAO regarding how much an initial pilot cohort will require. Say the Snapshot vote leads to the $2.5M fund being selected. The ADPC can then run an RFP process, collect the projects that require a subsidy, present the findings to the DAO, and follow up with an onchain vote finalizing the payment transfer from the DAO to the ADPC for distribution. This way, there is a soft commitment from the DAO, and the contingency is that the initiative receives the earmarked funds only if it is run in a reasonable manner. What if the DAO rejects the onchain vote? We doubt that will happen as long as the ADPC delivers on its promises. To those ends, we would like to signal our support for $2.5M for a single cohort, treating this as a pilot. If this proposal fails the Snapshot vote, the ADPC should return to the DAO with a more comprehensive proposal once the aspects in the above quotes have been addressed.

Also, regarding the stated areas of interest: RWA, gaming, and collab tech are noted as the main sectors for audits since there is a lot of development occurring there. I may be wrong in my assumption, but isn’t the best use of audits for protocols that have the most value at risk? That would largely mean DeFi protocols, especially high-TVL ones like money markets. RWA seems like another sector that falls under this umbrella. I’d assume the cost of audits for tooling/collab tech and even gaming is probably lower. All this to say, I’d think critically about which teams really need an audit to begin/sustain operations, versus those that can delay a full-fledged audit until they either raise more or earn more revenue.

1 Like