Grant Progress Tracking Platform

Honestly I’m not sure how well it’ll work long term if there’s only one person working on development.

With such a limited budget I think you will require additional funding later for data collection, testing, or even future maintenance.

Also, even if the platform works well, grant tracking still requires human involvement—like updating Notion/sheets, contacting project leads, and having experts review and make decisions.

At the end of the day, if a project truly cares about getting funded, they’ll likely stay on top of updates and won’t miss any milestones—maybe even check progress daily :slight_smile:


We appreciate the proposal and agree on the benefits, but we are unsure that creating a new independent tracking platform is the best approach. It might be more effective to update the “grants” page in the Arbitrum Hub portal with a built-in tracking system; this would reuse current infrastructure and avoid additional fragmentation.


Thank you for the thoughtful feedback and for taking the time to dig into this.

You’re right that the initial budget is small for the scope of development. My initial plan was to keep the scope lean, as this was meant to gauge the DAO’s interest in the concept. The original goal was to create an interface for grant recipients to update their milestones, with notifications sent to users (via email, in-app alerts, Discord, or Telegram) for updates or missed deadlines.

However, based on the feedback and interest from the community, I’m expanding the scope and adjusting the budget to better align with the desired features and technical requirements. To elaborate on how the core features will function: I will implement webhooks to monitor APIs like Questbook, Discord, Twitter, etc., catching events in real time as they happen. For sources without real-time API support, I will use periodic polling as a fallback.
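
For concreteness, here is a minimal sketch of that ingestion pattern, assuming the TypeScript/Express stack mentioned later in this thread. The endpoint path, the `pollQuestbook()` helper, and the ten-minute polling interval are illustrative placeholders, not commitments from the proposal:

```typescript
// Sketch: webhook ingestion with a polling fallback (TypeScript + Express).
import express from "express";

const app = express();
app.use(express.json());

// Real-time path: external services (e.g. Discord) push events to this endpoint.
app.post("/webhooks/:source", (req, res) => {
  handleEvent({ source: req.params.source, payload: req.body, receivedAt: Date.now() });
  res.sendStatus(204);
});

// Fallback path: sources without real-time webhooks are polled on an interval.
setInterval(async () => {
  const updates = await pollQuestbook(); // hypothetical helper calling a public API
  updates.forEach((u) =>
    handleEvent({ source: "questbook", payload: u, receivedAt: Date.now() })
  );
}, 10 * 60 * 1000);

function handleEvent(event: { source: string; payload: unknown; receivedAt: number }) {
  // Placeholder: normalization, persistence, and notification fan-out go here
  // (see the notification sketch below).
  console.log("event from", event.source);
}

async function pollQuestbook(): Promise<unknown[]> {
  return []; // stub: replace with a real API call once the integration is defined
}

app.listen(3000);
```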

The notification system will be event-driven: for example, Discord data will be normalized through webhook events, and Firebase will manage notification state to ensure consistency. I will also use WebSockets to broadcast real-time updates to all connected front-end users.
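
Below is a hedged sketch of how that event flow could look in TypeScript: a raw Discord payload is normalized, the notification state is persisted to Firestore via `firebase-admin`, and the update is broadcast to connected clients with the `ws` library. The collection name and payload shape are assumptions for illustration only:

```typescript
// Sketch: normalize an event, persist state in Firestore, broadcast over WebSockets.
import admin from "firebase-admin";
import { WebSocketServer, WebSocket } from "ws";

admin.initializeApp(); // assumes credentials are provided via the environment
const db = admin.firestore();
const wss = new WebSocketServer({ port: 8080 });

interface Notification {
  projectId: string;
  message: string;
  status: "unread" | "read";
  createdAt: number;
}

// Normalize a raw Discord webhook payload into the platform's notification shape.
function normalizeDiscordEvent(payload: { guild_id?: string; content?: string }): Notification {
  return {
    projectId: payload.guild_id ?? "unknown",
    message: payload.content ?? "",
    status: "unread",
    createdAt: Date.now(),
  };
}

// Would be called by handleEvent() in the ingestion sketch above.
async function notify(raw: { guild_id?: string; content?: string }): Promise<void> {
  const notification = normalizeDiscordEvent(raw);

  // Firestore keeps the canonical notification state so every client stays consistent.
  await db.collection("notifications").add(notification);

  // Broadcast the update to all connected front-end clients in real time.
  const data = JSON.stringify(notification);
  wss.clients.forEach((client) => {
    if (client.readyState === WebSocket.OPEN) client.send(data);
  });
}
```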

You brought up another valid point about long-term maintenance. The tool will definitely need some post-launch upkeep. I’m considering introducing a monthly maintenance retainer to ensure it stays functional and receives updates. I would like to hear the community’s thoughts on what they think is reasonable in terms of the retainer. It would cover the cost of bug fixing, monitoring third-party API disruptions, server patching, uptime monitoring, following up on users who miss milestone updates, etc.

Ezreal brought up another excellent suggestion: requiring grant recipients to report their progress to me through the platform. This would make tracking milestones even easier and more accurate while adding a layer of accountability. If Questbook is renewed, this platform will integrate easily with their API. However, if it isn’t renewed, the tool will still function fully as a standalone solution.

Haha, I don’t mind the questions at all. Thanks for the encouragement! It means a lot. I’m committed to creating tools that truly add value to the community, and your input helps me refine this vision. Let’s keep building together :metal:


Hey, thank you for the feedback! I agree that relying on just one person for development can introduce risks, especially long-term. That’s why I’m open to exploring collaborations or partnerships as the project progresses. The goal isn’t to build a standalone tool and walk away but to create something scalable, adaptable, and integrated with the community’s evolving needs. Additional funding would make this easier, especially for data collection, testing, and long-term maintenance.

This tool isn’t meant to replace all human oversight but to streamline it by improving tracking and visibility across the entire grant landscape with less manual effort. You’re right: if a project truly values funding, it will likely stay on top of its milestones. This tool adds clarity and oversight on top of that. Following a suggestion from the thread, I’ll update the proposal to make reporting progress through the platform mandatory.

Thanks again for sharing your thoughts! Feedback like this is invaluable and ensures the proposal aligns with community needs.

Hi, thanks for the suggestion! I’m fully on board with this idea and would love to collaborate with the team to explore how we can implement the tracking features directly into the platform.


This is a needed tool to ensure better visibility of ongoing DAO-funded projects. A system or hub for progress tracking will help the community see what is being done and where projects stand.

Right now, milestones are being tracked, but there is no standardized framework across projects; the DAO should consider one to improve consistency. The current approach gives projects flexibility, but inconsistent criteria for evaluating progress will not help the DAO direction-wise. A future iteration of this dashboard could address this gap by integrating a milestone framework or guidelines to standardize progress measurement.

I agree with other delegates about the budget being tight. This tool could have long-term potential. It could even evolve into a data feeder for a performance system, such as a Balanced Scorecard. This would enable the DAO to track KPIs aligned with community engagement, funding efficiency, and project impact with real-time visibility.

If the OpCo proposal is approved, it should implement a similar system for its operations monitoring. A tracking tool like this would help with transparency and decision-making for governance, resource allocation and accountability.


Hi @EmmanuelO, thank you for your proposal. Based on its current needs, we think what you’ve put forward makes sense for the DAO, and we are supportive of the proposal.

Some questions:

  1. For the interface, do you have any references of what you had in mind for the final product?
  2. How would you propose setting up data ingestion? We’re asking because we feel it should be part of your overall proposal. At the moment, it is unclear if the current proposal requires individual working teams to input data into a CMS.
  3. And if by manual input, have you considered what the fields would be?
  4. We’re assuming hosting costs are not factored in your proposal. What are the server requirements that you’re looking at or would like to propose, and is the expectation that the DAO will provide them?
  5. What are the frameworks that will be used for the build?

Also, please factor in an estimate for platform maintenance into your cost for a 12-month period. That’s all from us; we feel that this is a beneficial tool to have for delegates to gain more clarity on the progress of the different funded initiatives.


I’m glad you decided to go through the Questbook path, which I believe is the right choice.

I would like to add that, in my opinion, the budget seems too small to develop a truly robust platform. It might be a good idea to request additional funds when applying for the grant, in case further resources are needed down the line.

I also think it’s fantastic that we have a system in place to track each grant awarded and monitor how the corresponding milestones were achieved. This transparency ensures accountability and allows for a clear assessment of progress.

Also, could you please explain whether running the platform will incur annual costs in the future, and what those would be?

I would also suggest working with some of the grant stakeholders, like the DA, to understand what information is most relevant for a platform like this.


The idea itself is good.
However, there are many parallel projects that should partially fulfill this task.

  1. First, the much-talked-about proposal to organize OpCo.
  2. Second, funds have already been allocated for producing reports on each grant (if you want to do this work, you would need to take over the funding allocated for it).
  3. Third, I only saw the starting cost of the project; grants are not standing still, and you have not determined how much it will cost to support this project over time.

The rest of the questions that came to me have already been asked by most of the delegates (thanks to them for this).

Thank you for the thoughtful feedback! I’m glad to see that you resonate with the importance of a standardized tracking system. Your suggestion of evolving the tool into a data feeder for a performance system, like a Balanced Scorecard, is fascinating. I’ll explore how this could be implemented in the tool’s roadmap, especially for tracking KPIs like engagement, funding efficiency, and impact.

If the OpCo proposal is approved, integrating this system for operations monitoring is an excellent idea. It would be a natural extension of the tool’s purpose. I also appreciate your input on the budget constraints. While I kept the initial budget lean to gauge community interest, I’m reevaluating it now to better reflect the project’s features and long-term potential and will update it shortly.

Hi, thank you for your thoughtful feedback! I appreciate your support and the insightful questions you’ve raised.

  1. Interface References: For the interface, I envision a dashboard with clear project tracking and milestone progress indicators. It will be similar to tools like Monday.com in terms of layout and interactivity. I plan to share initial wireframes once the proposal is approved so I can gather early feedback from the community. Throughout the development process, we will allow everyone to review and provide feedback.
  2. Data Ingestion: The platform will support both manual data input (from grant recipients and project leads) and automatic data ingestion through APIs (e.g., Questbook, Discord, etc.). This approach will allow for flexibility in how data is captured. For instance, grant recipients can manually update their progress, while other data points can be automatically pulled from external tools.
  3. Manual Input Fields: Yes, the standard fields would be (a typed sketch of these fields appears after this list):
  • Project Name
  • Milestone Description
  • Spending for the milestone
  • Team member/s
  • Outcome Type (Whether it’s a technical, operational, financial, or community-based outcome)
  • Milestone Impact (the immediate, short-term effect of the milestone)
  • Overall Project Impact (the broader outcome of the entire project)
  • KPI (specific measurable metrics or indicators tied to project objectives)
  • Completion Percentage
  • Deadline
  • Status (On Track/Delayed/Completed)
  • Comments/Notes or Complaints (e.g., Dependencies—if the milestone depends on or is dependent on other factors or milestones)
  4. Hosting Costs: I have not included hosting costs in the initial budget and estimate a monthly hosting cost of $300–$500. I will update the proposal with a new budget shortly.
  5. Technical Stack: I intend to use Next.js for the front end and Express.js for the back end.
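
To make points 2, 3, and 5 concrete, here is an illustrative TypeScript/Express sketch: the manual-input fields above expressed as a typed record, plus a minimal endpoint a grant recipient could use to submit an update. The route path, validation, and exact field types are assumptions, not final design decisions:

```typescript
// Sketch: milestone update record + manual-input endpoint (TypeScript + Express).
import express from "express";

interface MilestoneUpdate {
  projectName: string;
  milestoneDescription: string;
  milestoneSpend: number;           // spending for this milestone (USD)
  teamMembers: string[];
  outcomeType: "technical" | "operational" | "financial" | "community";
  milestoneImpact: string;          // immediate, short-term effect of the milestone
  overallProjectImpact: string;     // broader outcome of the entire project
  kpi: string;                      // measurable metric tied to project objectives
  completionPercentage: number;     // 0–100
  deadline: string;                 // ISO date
  status: "On Track" | "Delayed" | "Completed";
  comments?: string;                // notes, complaints, dependencies
}

const app = express();
app.use(express.json());

// Grant recipients POST their milestone updates here; automated sources
// (Questbook, Discord, etc.) would feed the same store via the webhook path.
app.post("/api/milestones", (req, res) => {
  const update = req.body as MilestoneUpdate;
  if (!update.projectName || !update.milestoneDescription) {
    res.status(400).json({ error: "projectName and milestoneDescription are required" });
    return;
  }
  // Persistence (e.g. Firestore) and notification fan-out would happen here.
  res.status(201).json({ ok: true });
});

app.listen(3000);
```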

I am writing up a new budget to accommodate all the feedback received so far, including the cost for platform maintenance across 12 months. Thanks again, I truly appreciate CastleCapital’s support.


Thank you for your support of the idea! I appreciate your feedback and the points you’ve raised. The intent is for this project not to duplicate efforts but to complement the existing initiatives you mentioned; their activities would themselves be tracked by this platform, which would act as a constant external auditor. I agree that ongoing support and maintenance are critical, and I’m revising the budget to include estimates for the long-term costs of maintaining the platform. There has been a lot of valuable feedback so far, and I’m in the process of updating the proposal to reflect it. I truly appreciate your continued input and will ensure it is all taken into account in the revised version.


Hi Gabriel. Thank you for the insightful feedback! I’m glad you agree with the Questbook path. Yes, I will be revising the budget to allow for a more robust build. Regarding the annual costs, running the platform will indeed incur ongoing expenses, particularly for hosting, API integrations, and regular maintenance (including server patching, third-party service monitoring, and bug fixes); these costs will be reflected in the new budget.
I also appreciate your suggestion about engaging with stakeholders to understand which information is most relevant for the platform. I’ll take this into account. Thanks again for the feedback and for pushing the proposal forward! I’m excited to keep refining it and make it a valuable tool for everyone.


Isn’t Karma GAP supposed to do this?

I don’t see Karma functioning well, but it’s important to understand why, so this doesn’t end up the same way.

Hey, Paulo alerted me about this thread, so jumping in.

Karma GAP is built to solve this exact problem and we were funded by Arbitrum through Plurality Labs. We have many communities using GAP, so the application is working well and thriving.

When we got funded, Feems was super helpful and helped load all the grants across various programs, you can see them all here: Karma GAP - Arbitrum community grants. Many grantees have continued to use it to share milestones and updates.

So, it’s not a tech issue anymore; it’s a process issue. I have messaged and talked to almost all the providers who run grant programs for Arbitrum. Some providers have their own internal systems for tracking, so maybe they don’t have a reason to use GAP. Grantees will do what the program operators tell them.

We have continued to add more features, impact measurement, and much more to GAP, and we would really like to make it work for Arbitrum if I can get some help on process, or at least someone to talk to!

I’m happy to find a way to work with you as well @EmmanuelO if you are open to it.


Thank you for sharing the details about Karma GAP! A key part of my updated proposal is the mandatory reporting requirement, which ensures that all grant recipients consistently update their progress on a scheduled basis, something Karma GAP does not currently offer. If they are open to some UI/UX changes, I could use their platform to achieve this goal.

Sure, I’m happy to discuss how our efforts could complement each other if that aligns with Arbitrum’s goals. That said, my focus is on delivering a tailored solution based on the specific needs outlined in the proposal discussions. If your platform is open to UI/UX changes, I’d be interested in exploring whether it could be adapted to meet these requirements.

Hey @Srijith-Questbook what’s Questbook’s policy for having the grants data composable for other apps to use? Is the grants data on the Arbitrum Questbook tracks put onchain somehow? Or is there a Questbook API that Karma GAP and others could integrate with, to be able to display that data and have it up to date?

Same thing for all the other grant platforms other than Questbook, that Arbitrum is/will be using. cc/ @DisruptionJoe


Hi. It seems to us that this proposal leans more towards the Questbook model, but we don’t want to miss the opportunity to raise a few questions and ideas:

This is a solid approach, but it could be further strengthened by specifying how you plan to structure these collaborations. Are you considering bringing on additional technical team members from the outset or in later phases? Detailing this would provide confidence in the long-term sustainability of the project.

This is an excellent idea, but it might help to define an initial cost range for this retainer. For example, are you considering a percentage of the initial budget or a fixed figure?

You explained that you are “expanding the scope and adjusting the budget to better align with the desired features and technical requirements” and plan to use technologies such as “webhooks to monitor APIs like Questbook, Discord, Twitter, etc.” and “WebSockets to broadcast real-time updates.” While this expansion adds valuable functionality, it also increases technical complexity. It might be helpful to detail how you will prioritize the initial features to ensure the MVP is functional before implementing the full scope. Do you plan to iterate on the MVP based on early feedback before moving forward?

The tool’s goal is clear, but examples of the practical impact would make the proposal more compelling. For instance, how much manual effort do you expect this tool to save, or how will it specifically improve the experience for grant recipients and the community?

From our perspective, GAP appears to be a solid tool that addresses the technical challenges well, so the issue doesn’t seem to lie in its design or functionality. Instead, the real hurdle seems to be the human coordination needed to ensure all grant programs actively use it. This isn’t a problem that can be solved by simply improving the platform; it’s about creating alignment among operators and grantees.
To tackle this, it might be worth exploring ways to motivate operators and grantees to engage with the platform consistently. These are just initial thoughts, but it could be something like introducing reputational incentives, where those who diligently maintain and update their milestones, or who enforce compliance, are rewarded with recognition or other benefits.
Alternatively (or perhaps in addition) having a dedicated facilitator or team to manage onboarding and act as a bridge between program operators and grantees could also help. This team could oversee compliance, answer questions, and ensure smooth communication, which might lower barriers for adoption and make the process more seamless for everyone.

Good questions. From my research, they seem to offer public APIs for this, but we are waiting for them to confirm.
