I'm in support of this proposal!
One question regarding the allocation % increase:
What factors usually contribute to the decision to increase the % of excess reserves? For example, is there some hypothesis the team is testing, such that if it holds above a certain threshold for a sustained period, the team may further increase the allocation from 50% to something like 65% in the future?
Additionally, I want to add that I like how you included some risk/protocol analysis in this writeup! A couple of questions/comments:
1. For future writeups, I think it would be helpful to include the full dataset of votes from the DAO members. Statistics like the min/max, mode, and median, for example, also help reveal how core DAO members feel about a certain protocol in ways the mean doesn't necessarily capture.
2. It's unclear to me how the "Risk of losing funds" value is calculated. For example, what are all the factors (as well as factor weights) used to calculate that percentage? Additionally, does it make sense to include insurance coverage (via something like Risk Harbor) in the "risk of losing funds" %, since it is a mitigating factor? Or is that already factored in?
As an example of what I think will be helpful to know:
Risk of losing funds % is derived from:
1. Reputable auditors (did samczsun approve???) [factor weight: x%] [score=<1-100>]
2. Absolute savage dev team [factor weight: y%] [score=<1-100>]
3. Lindy effect (how long protocol has been around/TVL) [factor weight: z%] [score=<1-100>]
…
n. WAGMI vibes [factor weight: xyz%] [score=<1-100>]
Risk % = 100 - [(1)×x% + (2)×y% + (3)×z% + … + (n)×xyz%]
Something like that.
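To make the idea concrete, here's a minimal sketch of that weighted-factor scheme. All factor names, weights, and scores below are made up for illustration; they're not the team's actual methodology:

```python
# Hypothetical weighted risk-score sketch. Each factor gets a weight
# (weights sum to 100%) and a safety score from 1-100 (higher = safer).
# These numbers are illustrative only.
factors = {
    "reputable_audits": (0.30, 90),
    "dev_team_quality": (0.25, 85),
    "lindy_effect":     (0.25, 70),
    "wagmi_vibes":      (0.20, 60),
}

def risk_of_losing_funds(factors):
    """Risk % = 100 minus the weighted sum of safety scores."""
    total_weight = sum(w for w, _ in factors.values())
    assert abs(total_weight - 1.0) < 1e-9, "factor weights must sum to 100%"
    safety = sum(w * s for w, s in factors.values())
    return 100 - safety

print(f"Risk of losing funds: {risk_of_losing_funds(factors)}%")
```

Publishing something in this shape (even with the team's own factors and weights) would make it easy for the DAO to sanity-check how insurance coverage or new audits move the final %.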
3. Additionally, I think it would be a cool metric to track the DAO's perceived protocol risk over time: maybe a quarterly review that shows the change in perceived risk for each protocol our treasury is involved in. Obviously, this is easier requested than done, so it's just a nice-to-have (perhaps as part of the agora newsletter? 🤔)