Thanks to everyone who has shown interest in the SOS proposal!
We’ve made the following edits on January 20th:
- Added “High-level estimates of non-capital resources needed for each objective” to the list of minimum information required for a valid submission.
- Revised the timeline with up-to-date information and more accurate dates.
Barring additional notable edits, this proposal will move to Snapshot on January 23rd.
Responses to comments and feedback below:
We feel as though this might influence/restrict the proposers unnecessarily. The framework’s idea is to foster collaboration between proposers and the community while producing several options to vote on, with the community ultimately converging on an objectives matrix that they are most aligned with and that covers all the necessary strategic points. We expect that most, if not all, submissions will cover objectives across several different categories or verticals. As stated in the proposal, submitters are encouraged to include the main area each of their proposed objectives is focused on. This should help a reviewer understand what, in their opinion, a submission might be missing, as well as which parts of a submission they should be reviewing based on their specialization.
This doesn’t necessarily have to be the case. If a proposer wanted to, they could submit 2-year objectives that are unrelated to the 1-year objectives. We don’t want to restrict proposers with respect to how they structure their objectives (as long as there is no overlap across objectives within a specific year). Having said that, it’s worth reiterating that delegates should be critical when it comes to judging the feasibility of laid-out objectives—adopting a matrix that is unrealistic with a vast number of different objectives is likely to be counterproductive.
The DAO’s short- and medium-term priorities will be decided by the objectives chosen through this framework, with submitters being responsible for crafting the actual focus areas. When it comes to the objectives, we don’t think they require a ranking since a well-constructed matrix will have a proper number of objectives such that the DAO can work on most/all of them in tandem (another reason why a specific year’s objectives shouldn’t have any overlap).
To be clear, Entropy would not manage goal evaluation. As described in the proposal, we suggest this be handled by the research member of the ARDC through quarterly updates. All matrix submissions will be posted directly to the forum, the process is open to anyone, Entropy isn’t in a position to withhold any information, and our capacity to assess submissions will be similar to any other delegate. That is, providing feedback to submitters during the feedback period and voting on the matrix we’re most aligned with during the voting period.
Simply put, we’d make sure that during the initial SOS process, all the required announcements are posted, periods in phase two begin and end at the correct time, and the Snapshot vote comprising all proposed matrices is submitted correctly. Ideally, when the time comes to initiate the review phase, OpCo has already been set up, and the DAO could instruct the entity to manage the process. Moreover, OpCo could take over the objectives evaluation task from the research member if the DAO so chooses.
We strove to make the process as inclusive as possible. Having said that, objective setting for a blockchain ecosystem isn’t an easy task; getting it right inevitably requires a notable time commitment and effort. We expect that one-goal matrix submissions wouldn’t be taken seriously. However, we at Entropy will make a submission through the framework, and we encourage anyone who has one-off objectives in mind but doesn’t want to create a full-fledged submission to send them our way, and we’ll consider them for our submission. We might also utilize Harmonica to collect one-off objectives and associated key results during the Notice Period such that anyone can incorporate the information in their matrix submission if they so choose.
This was something we thought about adding when we drafted the proposal. However, we foresaw quite a few frictions with this approach. For example, it’s well-known that DAOs often move slowly, which is inherent to their decentralized nature, meaning that, e.g., 6 months might not be enough time for the objectives’ “correctness” to materialize. Delegates may therefore be inclined to vote for a change prematurely, especially given that delegates voting for a change can’t be obligated to create a submission. Compare this to the current structure, where anyone can propose ad hoc objective adjustments at any time, but doing so requires them to post their reasoning for the change together with adjusted objectives and key results. As such, objectives could still be changed 2/4/6/etc. months after their implementation, but the proposer must be able to clearly justify the change and exert effort in creating an adjusted objectives matrix. This ensures that proposals for change are tied to accountability and require proposers to think deeply about the current objectives as well as what should be changed and why. The proposal also suggests instructing the research member of the ARDC to create quarterly reports on how the DAO is doing against its objectives, meaning that active delegates will receive timely updates on any potential changes that might be needed.
This is another structure we considered but decided to drop due to its excessive complexity for decentralized decision-making, the unreasonable strain it would exert on delegates, and the high risk of producing a matrix that is, for example, bloated and/or has overlapping objectives. If delegates feel as though none of the submitted matrices are adequate, they can abstain during the voting period. It’s also worth noting that the barrier to proposing a single objective is much lower, which could lead to a situation where a vast number of not-so-well-thought-out objectives are proposed.
We agree that the second phase is lengthy, but this is a tradeoff required to ensure decision-making is done in a decentralized manner. Shortening the timeline would likely exclude many delegates from participating or require a more centralized approach, such as having a single submission made through a committee of only a few contributors. In our opinion, both of these options are far from optimal compared to a longer process.
We are afraid that such a structure could lead to unintended consequences, with proposers creating submissions that are the most likely to pass rather than ones that are optimal for the ecosystem. Ideally, proposers would be naturally encouraged to produce optimal submissions by being intrinsically aligned with the ecosystem and having skin in the game, such as being ARB tokenholders. Additionally, many delegates already receive incentives to participate in governance processes through the Delegate Incentive Program.
- Implementation responsibility: The idea is that anyone can create a proposal that would push the DAO toward reaching one/several of the chosen objectives. In other words, the process would work in the same way as before, but it’ll hopefully orient contributors to produce synergistic proposals that form a cohesive structure while making it easier for delegates to judge whether a proposal is high value-add or not. It’s important to reiterate that contributors will still maintain the freedom to submit proposals that don’t fit the chosen objectives, but we expect that this will be discouraged via the creation of a social contract among delegates. If OpCo is stood up and operationalized in the future, delegates could instruct the entity to execute a strategy or strategies that would help the DAO reach an objective/several objectives. Depending on the DAO’s instructions, OpCo will also be able to procure/hire contributors from outside the Arbitrum ecosystem, meaning that if most Arbitrum-native facilitators are at capacity or missing the required skills, external talent can be brought in.
- Budget: The total budget would only be decided in the next and final step of this overarching initiative (MVP → SOS → budgeting framework) and would require another vote to be approved. We think bundling the objective-setting exercise with budgeting introduces too many variables for possible disagreement, which could prevent the process from moving forward. Moreover, if a large number of objectives matrices are submitted, delegates will already be under considerable strain to evaluate all submissions. Adding a budgeting decision to the mix would be impractical and unreasonable for delegates.
- Evaluation: As stated in the proposal, we suggest the research member of the ARDC be tasked with creating quarterly reports assessing how the DAO has improved as well as suggesting what objectives require more attention or changes while recommending solutions to address shortfalls.
- Adjustments: These will be made either during the review phase or through any ad hoc strategic objective adjustments that might be proposed.
Hey @ChrisB! Thanks for the overview of Harmonica—the solution looks interesting. We could potentially utilize Harmonica during the Notice Period to gather objectives and key results from community members who don’t want to create a full-fledged submission. This information could then be considered by anyone who wants to submit a complete objectives matrix during the Submission Period.
We’ll contact you async to explore the feasibility of implementing the solution in more detail!
Submitters are free to include as many objectives as they want. However, as mentioned earlier, we think that including a vast number of objectives is likely to be counterproductive. Even 10 objectives might be unrealistic for the DAO to focus on simultaneously, at least in the beginning, but this is just our opinion.