As outlined in our proposal to expand Tally support for the Arbitrum DAO, we integrated Karma’s delegate score and contributor metrics into Tally’s delegate page so token holders and DAO stakeholders have access to holistic participation information. Specifically, we added Karma Score, Snapshot voting %, onchain voting %, and forum score to Tally.
Forum score is calculated according to the following formula:

Forum Score = 100 × [(Proposals Discussed Percentile × 1) + (Proposals Initiated Percentile × 1) + (Forum Post Count Percentile × 1) + (Forum Topic Count Percentile × 1) + (Forum Likes Received Percentile × 1) + (Forum Posts Read Count Percentile × 1)] / (Sum of Weights × Max Score Setting)
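For illustration, here is a minimal TypeScript sketch of that calculation. The field names, and the assumption that each percentile is a 0–100 value with a Max Score Setting of 100, are ours for the example and are not taken from Karma's actual implementation.

```typescript
// Sketch of the forum score formula above. Field names and the assumption
// that each percentile is a 0-100 value (maxScoreSetting = 100) are
// illustrative only, not Karma's actual code.
interface ForumStats {
  proposalsDiscussedPercentile: number;
  proposalsInitiatedPercentile: number;
  forumPostCountPercentile: number;
  forumTopicCountPercentile: number;
  forumLikesReceivedPercentile: number;
  forumPostsReadCountPercentile: number;
}

function forumScore(stats: ForumStats, maxScoreSetting = 100): number {
  // Every component currently carries a weight of 1.
  const weights = [1, 1, 1, 1, 1, 1];
  const components = [
    stats.proposalsDiscussedPercentile,
    stats.proposalsInitiatedPercentile,
    stats.forumPostCountPercentile,
    stats.forumTopicCountPercentile,
    stats.forumLikesReceivedPercentile,
    stats.forumPostsReadCountPercentile,
  ];
  const weightedSum = components.reduce((acc, c, i) => acc + c * weights[i], 0);
  const sumOfWeights = weights.reduce((a, b) => a + b, 0);
  // 100 * weighted sum / (sum of weights * max score setting)
  return (100 * weightedSum) / (sumOfWeights * maxScoreSetting);
}
```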
Karma Score is calculated according to the following formula:

Karma Score = 100 × [(Forum Activity Score × 1) + (Off-chain Votes % × 3) + (On-chain Votes % × 5)] / (Sum of Weights × Max Score Setting)
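And a matching sketch of the Karma Score, with the same caveats: the parameter names and the maxScoreSetting = 100 scaling are assumptions for the example.

```typescript
// Sketch of the Karma Score formula above; names and scaling are illustrative.
function karmaScore(
  forumActivityScore: number, // assumed to be on the same 0-100 scale as the voting percentages
  offchainVotesPct: number,
  onchainVotesPct: number,
  maxScoreSetting = 100
): number {
  // Weights from the formula: forum activity 1, off-chain votes 3, on-chain votes 5.
  const weightedSum =
    forumActivityScore * 1 + offchainVotesPct * 3 + onchainVotesPct * 5;
  const sumOfWeights = 1 + 3 + 5;
  return (100 * weightedSum) / (sumOfWeights * maxScoreSetting);
}
```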
As always, please reach out if you have any feedback or suggestions for Tally. Cheers!
Super stoked to see the integration of Karma’s delegate stats into Tally’s delegate dashboard!
I would like to address a few common questions we regularly receive from delegates:
Why is my forum score blank?
This is most likely because we don’t have your forum handle linked to your delegate wallet address. You can link them here: Link Forum Handle.
Why don’t I see any of my stats?
We index the top 500 delegates by voting power. If you want your profile to be indexed, log in with your delegate address on arbitrum.karmahq.xyz and create your profile, and we will start indexing your address.
If you have any questions, feel free to reach out to our team on Telegram or Discord.
Hi @mmurthy, I noticed a potential discrepancy in the forum score calculations while comparing delegate scores. For instance, take two delegates: Arana Digital and Curia-delegates.eth.
The “Forum Likes Received Percentile” metric shows Arana Digital with a significantly higher percentile, despite having only 32 likes received, whereas Curia-delegates.eth has 248 likes—substantially more. This seems counterintuitive, and I’m wondering if there might be an issue with how the percentile is calculated or the dataset being used.
Additionally, “Forum Likes Received Percentile” is just one example; several other metrics show similar discrepancies across delegates.
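To make the question concrete, here is a hypothetical percentile-rank sketch showing how strongly the result depends on which population a delegate is ranked against. All of the populations and numbers below are made up for illustration and are not Karma's data or method.

```typescript
// Hypothetical illustration: a delegate's percentile depends entirely on
// which population it is ranked against. All numbers below are made up.
function percentileRank(value: number, population: number[]): number {
  const below = population.filter((v) => v < value).length;
  return (100 * below) / population.length;
}

// Ranked against all forum users (mostly low like counts), 32 likes scores fairly high...
const allForumUsers = [0, 0, 1, 2, 3, 5, 8, 12, 20, 32, 60, 248];
console.log(percentileRank(32, allForumUsers)); // 75

// ...but ranked against active delegates only, the same 32 likes scores much lower.
const activeDelegates = [32, 75, 110, 150, 200, 248, 300];
console.log(percentileRank(32, activeDelegates)); // 0
```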
Could you clarify:
What subset of users is used to calculate the percentile scores? Is it based on all forum users, users by trust level, or some other criteria (e.g., days read, activity levels)?
Are there additional factors influencing the percentile that might explain the discrepancy?
Thanks for your time and for your contributions so far towards governance transparency!