Should Compound Retroactively Airdrop Tokens to Early Users?

@allthecolors

I have run the script against my geth node and received the output. I have a pull request back to your repo to add the file. Let me know if anything is amiss and we can try again.

Thanks,
GraspOnCrypto

2 Likes

Option 1 is the best and easiest solution.

Just as @alive suggested.

Include vesting over 4 years; then it doesn’t matter whether recipients decide to sell or hold, since it will take years for most of those tokens to enter the market.

And that also means fewer tokens for whales to farm and dump.

I guess people forgot about the $500M DAI whale from October, who single-handedly kept the price at $100-110 by dumping tons of tokens every day.

Now, with the current popularity of crypto, there are going to be whales farming with even larger budgets, just like Justin is currently doing with $1.2B in ETH.

3 Likes

I know I have spoken my mind before, but I’d just like to raise it again. In my opinion, if you want this idea to go through, it needs to be as simple and straightforward as possible. The more complex you make it, the more you attempt to create little niches of ideas, the more likely this is to go nowhere.

I’ve put quite a bit of effort into moving this along, more so than most who are just replying to this thread to put their voice out there, and I don’t want all of that to be for naught because the community decided to spend all its effort deciding which address deserves more than another.

IMO, benefiting the Compound community by distributing to strong early supporters in any way is better than trying to nitpick every detail. KISS - keep it simple. Let’s find progress, get the information required, find someone with enough COMP to put in a proposal, and get it moving forward.

Thank you for listening to this rant
-GraspOnCrypto

6 Likes

Thanks everyone for your input! I want to start this update by harkening all the way back to the beginning of the forum thread:

As my earlier posts show, I found significantly more early users than estimated by @alive; my analysis gives ~30,000 direct users, not including interactions mediated by contracts. So,

  • either we adopt @blck’s rationale to increase the available pool of COMP tokens to reach 100 COMP per eligible address;
  • or we maintain the initially suggested 500,000 COMP pool, which yields fewer than 20 COMP per eligible address.

I anticipate it will be much harder to pass 100 COMP per early user through full governance if it means sextupling the early-user allocation beyond what was initially envisioned. So the proposal I share below is based on the 500,000 COMP pool originally envisioned, inflated only slightly (by about 9%) to bring the minimum COMP distribution from 18.305842454139047 COMP to an even 20 COMP.
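
As a sanity check on the "about 9%" figure, here is the arithmetic behind inflating the 500,000 COMP pool so the floor rises from 18.305842454139047 to 20 COMP (a quick sketch; the exact final pool size is whatever the published list sums to):

```python
# Back-of-the-envelope check of the ~9% pool inflation mentioned above:
# scale the 500,000 COMP pool so the minimum grant rises from
# 18.305842454139047 COMP to an even 20 COMP.
BASE_POOL = 500_000
OLD_FLOOR = 18.305842454139047
NEW_FLOOR = 20.0

inflation_factor = NEW_FLOOR / OLD_FLOOR          # ~1.0925
inflated_pool = BASE_POOL * inflation_factor      # ~546,300 COMP

print(f"inflation: {(inflation_factor - 1) * 100:.1f}%")  # inflation: 9.3%
```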

Proposed distribution list
Having now sliced the data several ways (socialized, capital-weighted, sqrt-cap-weighted, log-cap-weighted…), the approach that strikes me as the most fair is a “95% socialized, 5% capital-weighted” distribution.

:comp: :comp: :comp: Here is the list :comp: :comp: :comp:
(updated 12 Apr '21 to exclude addresses associated with Sybil attack on early governance)

How does this specific proposal balance fairness and simplicity?

  • There are compelling arguments for both socialized and capital-weighted distributions. Including both strategies in the formula is a way to reach a compromise on this question.
  • It is sensible to reward users who supplied and borrowed more capital to the protocol, but even a 50-50 split between socialized and cap-weight distributions is so heavily whale-dominated that it fails to achieve the goal of empowering the early user community with a meaningful shot at a role in governance.
  • A 95-5 split grants the largest whale (who has already outed themselves in this forum, @borovan) roughly 100x the smallest grantee.
  • More than 9 out of 10 early users would receive between 20 and 21 COMP under this proposal; 45 users would receive > 100 COMP; 4 users would receive > 1000 COMP.
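
For concreteness, here is a minimal sketch of how a 95% socialized / 5% capital-weighted split works mechanically. The addresses, capital weights, and pool size below are hypothetical, not taken from the actual list:

```python
# Toy 95/5 split: every address gets an equal share of 95% of the pool,
# plus a share of the remaining 5% proportional to its capital weight.
POOL = 500_000.0
capitals = {                      # address -> capital weight (hypothetical)
    "0xwhale": 1_000_000.0,
    "0xearly1": 500.0,
    "0xearly2": 100.0,
    "0xearly3": 100.0,
}

socialized_each = 0.95 * POOL / len(capitals)
total_capital = sum(capitals.values())

grants = {
    addr: socialized_each + 0.05 * POOL * cap / total_capital
    for addr, cap in capitals.items()
}

# The whole pool is allocated, and the whale's edge comes only
# from the 5% capital-weighted slice.
assert abs(sum(grants.values()) - POOL) < 1e-6
```

In the real list the socialized slice is spread over roughly 30,000 addresses, which is why the floor lands near 20 COMP rather than the large per-address numbers in this 4-address toy.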

Finally, to align this distribution with the goal of broadening early user participation in governance, I suggest we develop a separate proposal to lower the minimum COMP required to submit a CAP to 20 COMP.

What about contract-mediated interactions?
The elephant in the room with early-user distributions is contract-driven interactions with the Compound protocol.

Context: Uniswap’s genesis UNI airdrop excluded contract interactions, spurring an initiative from application integration and DEX aggregator projects to propose an airdrop for users who interacted with Uniswap via proxy. The proposers took the position that this exclusion was accidental, and while the proposal received enough support to go to a vote, it ultimately did not pass.

Now that I’ve attempted to account for proxy interactions with Compound through on-chain data myself (thanks to @grasponcrypto for running part of the analysis!), I’ve come to a different conclusion about the Uniswap team’s motivations for excluding contract interactions: it’s simply way more labor-intensive to unwind these proxy interactions and find the initiating user on-chain – like, by orders of magnitude. And that’s just for projects that drop cTokens in the user’s self-custodied wallet. If a project uses Compound in a less user-direct way, e.g. holding everyone’s cTokens in a contract and periodically distributing interest in the underlying token, then it would not be possible to determine users’ distributions without additional knowledge about the project’s contract structure.

Application integration projects are welcome to develop separate proposal(s) for their users who interacted with Compound. Most of those projects have the skills in-house to do this kind of analysis much more efficiently than me, and they’ve had a few months to take the initiative. I am just a user/fan of the protocol, not representing any company in the space, so I choose to focus on early users who interacted directly with the protocol.

Next steps

  • Give folks a few days to review the code behind this analysis and make the case for any revisions.
  • Develop a CAP: Here, I think we can borrow @arr00’s merkle-distributor code originally developed for Proposal 32 to address the DAI liquidation event. This is venturing even further out of my coding comfort zone, so I would love it if someone with experience submitting a CAP were willing to step in here and use this list to produce the merkle tree and contracts required to submit a CAP (or at least help walk me through it!)
  • Persuade one of the ~400 community members holding > 100 COMP to submit the CAP.
17 Likes

This is awesome work, thank you for the great effort! I’m slightly out of the loop, but does this list make any attempt at removing those “manipulating” votes? Or did it take the KISS approach?

2 Likes

Thank you for the awesome work you put into this proposal,
the remark about the proxy makes sense.
Also note that for the socialized reward it might be possible to remove many addresses that took part in the Sybil governance attack (they supplied 0.1 ETH, voted, and withdrew). That could mean fewer addresses included, if we can find a way to remove them and only them - assuming this is technically possible, which may be feasible since not many people were voting at that stage. Note this wouldn’t affect the capital-weighted distribution.

About the share that should be split between socialized and capital-weighted, I would say that a 95-5 split really neglects the capital-weighted part.
Looking at the UNI example we discussed, around 49 million UNI were allocated to early liquidity providers as a capital-weighted allocation, while around 100 million were allocated to early users as a socialized allocation.
So that works out to roughly a 67%-33% socialized/cap-weighted split in that example.

Even if there is no rule forcing us to match what other projects have done, we could balance these two types of reward more evenly without really affecting the socialized reward, or put the ratio to a vote.
I would say the capital-weighted allocation isn’t only there to distinguish ONE whale, but also smaller early users who dedicated a large part of their funds to the protocol since its earliest days, which gave it traction and permitted Compound to develop. Hence the importance, for the capital-weighted part of the allocation, of weighting the oldest deposits higher, taking into account the exponential increase in TVL: a dollar supplied on Compound in 2019 was a larger fraction of the total than a dollar supplied at the cutoff date. We should consider those who took the risk of supplying to the new protocol that was Compound, back when its TVL was peanuts. That’s how UNI did it, by the way, as pyggie mentioned.

In January 2019 TVL was $10 million, and by June 2020 it had increased 60x to $600 million.

2 Likes

@allthecolors has the following info saved in the files:
6400580,0x13dcf605c12832359ca64768c2a0c246a48181167fbc271cf5cd281378ba9f31,0x29884627385E3F7bb763Eb72E3d5B9A98143452d,supply,0.0,WETH

That’s a random line pulled as an example. There are six columns; I’m not sure what the first two represent (the first looks like a block number and the second a transaction hash), but the third is the user’s address, the fourth the action, the fifth the amount, and the sixth the token.
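
Here is a small parsing sketch of that line. The column meanings are my best guess from the shapes of the values (the 66-character hex string has the length of a transaction hash, the 42-character one of an address):

```python
import csv
import io

# The raw line quoted above, from the early-user transaction dump.
raw = ("6400580,"
       "0x13dcf605c12832359ca64768c2a0c246a48181167fbc271cf5cd281378ba9f31,"
       "0x29884627385E3F7bb763Eb72E3d5B9A98143452d,"
       "supply,0.0,WETH")

row = next(csv.reader(io.StringIO(raw)))
# Guessed columns: block number, tx hash, user address, action, amount, token
block, tx_hash, user, action, amount, token = row

print(int(block), action, float(amount), token)  # 6400580 supply 0.0 WETH
```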

If anyone has an idea about which information could be used to purge the Sybil attackers, I’d be glad to work with this data and create an output that excludes them. I could purge all accounts that added 0.1 ETH, but I think that might also exclude valid accounts.

Anyway, I’m not familiar with the Sybil attack or what it looks like, so if anyone wants to DM me on Discord or paste info here on how we could weed those out, I’d be happy to work on it.

Is there not already a list of the addresses that participated in this attack? I’d think it would be easier to generate a list of malicious addresses as a separate task, and then use that list to purge records from the one currently being proposed. Further, I’d suggest we also use MEW’s list of naughty addresses to purge any of those from this list: ethereum-lists/addresses-darklist.json at master · MyEtherWallet/ethereum-lists · GitHub
Edit: Don’t have to worry about that - I did a quick search and there were no matches! However, if anyone has a list of addresses that should be filtered (maybe there are other blacklists out there, or perhaps someone at Compound has a list of malicious addresses used in the Sybil attack), I have a basic script I can use to modify the current list.
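
A filtering pass like the one described could look like this sketch. The blacklist entry and rows here are made up; a real run would load the darklist JSON or a curated Sybil-attacker list:

```python
# Purge rows whose user address appears on a blacklist.
# Ethereum addresses are case-insensitive hex, so normalize before comparing.
blacklist_raw = {"0xBADbadBADbadBADbadBADbadBADbadBADbadBAd0"}  # hypothetical
blacklist = {a.lower() for a in blacklist_raw}

rows = [  # (block, user address, action, amount, token) -- made-up examples
    ("6400580", "0x29884627385E3F7bb763Eb72E3d5B9A98143452d", "supply", "0.0", "WETH"),
    ("6400581", "0xbadBADbadBADbadBADbadBADbadBADbadBADbad0", "supply", "0.1", "ETH"),
]

clean = [r for r in rows if r[1].lower() not in blacklist]
print(len(rows) - len(clean), "row(s) purged")  # 1 row(s) purged
```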

3 Likes

I tried to look at the code a bit, but I am not a programmer at all.
What I was wondering was: which price did you set for the tokens that are not stablecoins?
Did you take the price of these tokens at each block to calculate the notional value lent on Compound by the address? Or did you simply pick an average?
I tried to read SimpleCapitalWeights.py
token_weights[token] = (0.5*(StartPrice+EndPrice))

Since some coin prices could have gone 30x or 100x, it would make more sense not to take the current price for a volume that was lent in 2018, when the price was 1% of its current worth; otherwise it totally distorts the capital-weighted reward. It would mean that if I lent REP, then withdrew and sold it, and REP later went 100x, the notional would count 100 times instead of 1 for the capital-weighted allocation.
Just raising it, as I’m not sure which way was picked.

2 Likes

It’s a good question @Andre1. StartPrice and EndPrice are the prices on the date the Compound V1 money market contract was deployed (Sept 26, 2018) and the date of the launch of the COMP governance token (June 15, 2020), respectively. These dates bracket the eligibility window for the early-user distribution, so current prices are not used. The weights are just a “two-point average”, the simplest possible approximation of the average price over the eligibility window. Stablecoin weights were fixed at exactly 1 for simplicity.

A daily TWAP (time-weighted average price) for each token would not be too much extra work if folks would like to see whether that has any significant impact on the proposed distribution…

The most surgical approach would be to use the Coingecko API to produce customized weights for each transaction by estimating the USD value of the supply/borrow at the relevant block. That is possible, but it seemed like a lot of work for something that is ultimately not likely in my opinion to change the final distribution amounts that much.
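
To illustrate the difference on toy numbers (prices hypothetical), the two-point average and a daily TWAP only diverge when the price path is lopsided relative to the endpoints:

```python
# Hypothetical daily prices for one token over an eligibility window.
daily_prices = [1.0, 4.0, 2.0, 8.0, 5.0]

# Two-point average, as in SimpleCapitalWeights.py:
# token_weights[token] = 0.5 * (StartPrice + EndPrice)
two_point = 0.5 * (daily_prices[0] + daily_prices[-1])

# Daily TWAP: the mean of the daily prices.
twap = sum(daily_prices) / len(daily_prices)

print(two_point, twap)  # 3.0 4.0
```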

2 Likes

This is my point of view, but I am not sure it will change anything, as I didn’t check the price of every token.
We could use the integral ∫ BTC(t) dt over t = 26 Sept 2018 … 8 June 2020 (or 15 June). The formula would then perfectly match for someone who deposited at the start date and left the funds through 15 June, without any distortion compared to other tokens, and we would only need to compute this integral once for each of the few tokens available on Compound V2.
But it seems there were no huge token price movements between these two dates, so maybe this integral matches your arithmetic average.

1 Like

@TragedyStruck good point, and @grasponcrypto thanks for collaborating with me on a solution. I prefer the “keep it simple” approach in general, but I also agree that if someone’s main interaction with the protocol was to spam an early governance vote with a bunch of small-value addresses that did nothing else, they should not receive a separate distribution for every address they used.

I’ve posted a solution to the early user analysis Github repo (see SybilAttackRemover.py, SybilAttackRemover.output.txt, and edits to EarlyUserProposal.weighted.py) and updated the early user list (link in previous post now returns the updated list).

There are probably other users with multiple addresses who will benefit disproportionately from the distribution, but I am satisfied that we have removed the most egregious (and hopefully the only malicious) example.

2 Likes

I like it. I completely agree as well. Keep it as simple as possible, but we need to do all we can to ensure malicious actors do not get rewarded for their misbehavior. Especially considering this malicious actor would benefit exponentially due to the nature of their mischief.

Nice work.

2 Likes

I do believe there may be one issue: your solution calls for 500,000 COMP to be distributed, but the Comptroller contract only holds 228,000 COMP.

My interpretation of @alive’s post kicking off this forum discussion is that the proposal can request a fraction of COMP be redirected from the community token allocation, that is, the allocation currently being streamed at 2,312 COMP/day to active lenders and borrowers:

The Compound Governance announcement on Medium specifies:

  • 4,229,949 COMP are reserved for users of the protocol
  • 775,000 COMP are reserved for the community to advance governance through other means

It was later announced that part of the 775K COMP had been apportioned to Compound’s participation in Coinbase Earn, with the rest sitting in the Reservoir contract.

In short, it seems reasonable for the proposal to include a transfer of COMP from the Reservoir to fund the early-user distribution.

Edit: The background to Proposal 32 does an excellent job describing the relationship between the Reservoir and Comptroller and has helped me better understand how this would look in practice.

The Reservoir is immutable and drips 0.5 COMP/block to the Comptroller, of which ~0.352 COMP is distributed to suppliers and borrowers. Assuming we include four-year vesting in this proposal – which I support and which seems likely necessary to pass – linearly distributing 500K COMP over 4 years works out to about 0.052 COMP/block.

A vesting-based proposal would “cost nothing” up front and would instead have the effect of changing the per-block distribution of COMP from “100% current users / 0% pre-COMP users” to “~85% (0.300 COMP) to current users / 15% (0.052 COMP) to pre-COMP users” while the early user distribution vests.
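
A quick check of the ~0.052 COMP/block figure, assuming an average Ethereum block time of about 13.2 seconds (an assumption; actual block times vary):

```python
SECONDS_PER_BLOCK = 13.2                                  # assumed average
BLOCKS_PER_YEAR = 365 * 24 * 3600 / SECONDS_PER_BLOCK     # ~2.39M blocks

drip = 500_000 / (4 * BLOCKS_PER_YEAR)         # COMP/block over 4 years
current_user_share = (0.352 - drip) / 0.352    # fraction left for current users

print(f"{drip:.3f} COMP/block, {current_user_share:.0%} to current users")
# 0.052 COMP/block, 85% to current users
```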

5 Likes

Also, sorry for the redundancy - just thinking out loud, since it’s now or never to submit our thoughts as we approach a proposal. Now that I analyze your post from 4 days ago, it seems your program went for (1) the socialized and (2) the capital-weighted distribution without any account of the TVL, which went from roughly zero to $239 million.

Without going for a fully exhaustive pro-rata ratio, which I am not sure would be technically easy, do you think it would be simple to apply a linear factor that reduces the curve of Compound’s TVL through time, knowing that:

  • on 18 June, TVL was $239.737 million (according to DefiPulse).
    We would make a linear reduction of the TVL curve as a line going from 30 Sept 2018 ($2.779 million) to 15 June 2020 ($239.737 million), multiplying the weight by a pro-rata factor P(t) that accounts for the TVL each day, where t is the number of days elapsed since 30 Sept 2018.

As P(t = 0, i.e. 30 Sept 2018) = 239.737/2.779 = 86.26
and P(15 June 2020) = 1, i.e. P(655 days) = 1,

we get P(t) = 86.26 - 0.13016*t, a coefficient we could apply per day to calibrate a pro-rata distribution that accounts for the 86.26x increase of the TVL from the end of the launch month to the 15 June cutoff date.
Taking such a factor (or a more developed one) into account would fit the current COMP distribution and would resemble what UNI did. I also think it would permit a better distribution and spreading of the capital-weighted part of the reward, as well as a rebalancing of the 95-5 split to something more even.
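
The linear TVL factor described above can be written as a straight-line interpolation so the endpoints come out exactly (the numbers are the ones quoted in the post; the derived slope reproduces the quoted 0.13016):

```python
TVL_START = 2.779     # $M on 30 Sept 2018 (from the post)
TVL_END = 239.737     # $M near the June 2020 cutoff (DefiPulse)
T_DAYS = 655          # days in the eligibility window

X = TVL_END / TVL_START          # ~86.26, the TVL growth multiple

def p(t):
    """Pro-rata factor: X at t = 0, decaying linearly to 1 at t = T_DAYS."""
    return X - (X - 1) * t / T_DAYS

print(round(X, 2), round((X - 1) / T_DAYS, 5))
```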

The flaws in my reduction are that 30 September is chosen arbitrarily and that the reduction is linear; extracting the day-to-day TVL data would give a more accurate result.

The only points that remain subjective are the socialized/capital ratio and the linear TVL reduction factor - shall we add them as choices in the proposal? Even though I am not sure that’s the way to go, since it mustn’t divide us on the main purpose; all in all, the stakes of a fair distribution matter too, and in the end I would vouch for feasible solutions only.

We could give several choice :
a-50/50
b-55/45
c-60/40
d-65/35
e-70/30
f-75/25 … up to 95/5 for the socialized/weighted ratio, or even 100/0, meaning no capital weighting

and several choices for the reduction factor from X to 1 where
a-X=80
b-X=…
c-X=1 (ignoring the TVL)
d-extracting the exact daily TVL

This could be submitted as a preamble vote, if feasible, in this form: “What should the early-user allocation proposal contain? Pick among these factors.” Once this vote ended, the final proposal could be submitted with the winning factors. Again, I am only thinking out loud; this might not be the right way to go.

2 Likes

I hear you @Andre1; the good news is that your “linear TVL” proposal is almost identical to the time multiplier in my proposal formula, except that in my proposal this multiplier only ranges from 1 to 2 (which was not chosen for any particularly good reason), while for the linear TVL formula it should range from 1 to 86.26 (i.e. by how many times TVL grew during the eligibility window). I think this is more rational than arbitrarily having it run from 1 to 2, so I think we should adopt it.

I think it will also partially address your concern about the balance of social and capital weights. I agree that 95-5 appears imbalanced, but I don’t see a way for anything closer to 50-50 to keep the “floor” amount at 20 COMP without significantly inflating the 500,000 COMP request (we would already be asking for a 9% boost to bring the 95-5 distribution floor up to 20 COMP).

FWIW a 50-50 distribution shifts the floor to about 10 COMP – with the other ~10 COMP per small user (and by small we’re talking anyone under millions of USD equivalent supplied/borrowed) going mostly to a handful of very deep-pocketed users, as any capital-weighted distribution would do.

3 Likes

Alright, I have several questions then, as I am not able to understand this code easily; this will also be useful for others who cannot read code.

–1) Basic question, just to make sure I understood the choice you made, as I didn’t see the formula used: your system (for the capital-weighted part) allocates a reward proportional to the number of blocks during which someone was lending (and possibly borrowing) a volume on Compound, with a factor for each block (but that’s question 2)? In that sense, your system is kind of similar to the current COMP distribution? I mean, if someone came and lent 1 million for 1 block, he wouldn’t be eligible for a capital-weighted reward, since it is proportional to the time he allocated his funds to the protocol?

–2) So you wrote
" * There are compelling arguments for both socialized and capital-weighted distributions. Including both strategies in the formula is a way to reach a compromise on this question."
Do you mean that you applied a “boost” decreasing linearly from 2 at the start date to 1 at the end date (to take the TVL change into account)?
I mean, would providing 10 dollars for 1 block in September 2018 be equivalent to providing 20 dollars for one block in June 2020 in the capital-weighted allocation your calculation generates?

–3) Were you able to take into account both Compound V1 and V2 for the capital-weighted allocation (including Compound V1 users after the launch of V2)?

3 Likes

@Andre1 I agree it would be helpful to share a simpler document laying out the current calculation and hope to add this to the repo; in the meantime, please take a look at the comments (specifically the lines starting with # in EarlyUserProposal.weighted.py). The formulas are written out and explained there.

(1) No, the time contribution to the weight is not proportional to the number of blocks during which someone was lending (and possibly borrowing). It is based exclusively on the time of the address’s first interaction with the protocol. Disentangling the duration of supplies and borrows using the on-chain withdraw/repay data is much, much harder because of the way multiple interactions can layer on each other. If a user supplied 100 SAI at block A and withdrew 100 SAI plus interest at block B, that would be easy enough to track with the tx data. But many users executed multiple supplies, withdraws, borrows, and repays of differing amounts that would basically require tracking every address’s “state” at each block to achieve a duration-based weight factor. That would be a much heavier data forensics effort. The current approach is perhaps overly generous to folks who dipped their toes in for just a few blocks and quickly withdrew in the early days; but I’m okay with that, because the risk at the beginning was arguably concentrated in the choice to supply (and how much to supply) rather than in how long the user kept their assets in the protocol.

(2) Yes, that’s correct – although as I was implementing your TVL suggestion, I found a mistake in the previous implementation which basically reversed the effect of the time multiplier, rewarding later users instead of early ones (!) This is now fixed in the repo and reflected in the updated list, which also implements your TVL suggestion.

(3) Yes, interactions with V1 and V2 are treated in the same manner with respect to the weights for the capital-weighted distribution.
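
Putting answers (1)-(3) together, the capital weight for an address can be sketched like this. Function names, token weights, and amounts are hypothetical illustrations of the approach described above, not the repo code:

```python
def tvl_factor(t_days, X=86.26, T=655):
    """Linear TVL multiplier: ~86x at launch, decaying to 1 at the cutoff."""
    return X - (X - 1) * t_days / T

def capital_weight(txs, token_weights):
    """txs: list of (t_days, amount, token) supply/borrow events for one address.

    Each transaction is weighted by the token's price weight and by the
    TVL factor at the time of that transaction; duration is not tracked.
    """
    return sum(amount * token_weights[token] * tvl_factor(t)
               for t, amount, token in txs)

token_weights = {"SAI": 1.0, "WETH": 250.0}   # hypothetical price weights

early = capital_weight([(10, 100.0, "SAI")], token_weights)
late = capital_weight([(600, 100.0, "SAI")], token_weights)
assert early > late   # the same capital counts for more, earlier
```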

4 Likes

Thanks for your answer

Now that it’s clearer I can say

(1) The problem with your method, if I understand it correctly, is that if someone supplied $1 million at launch and withdrew it a minute later, it would grant him exactly the same as if he had locked that money in the protocol for 2 years?

I would argue this is not really fair: someone who supplied from 2019, for 2 years, would be treated (for the capital-weighted part specifically) the same as someone who deposited and then, after a week, left the protocol for another one, or for his bank.

Regarding the risks, I would disagree with your assumption: the risks of an oracle hack, liquidation failure, etc. (which we’ve seen are the biggest in DeFi) are proportional to the time you lent.

Do you think there is an easier way to obtain the total interest earned + paid by an address? That would make it easier and would match the current COMP distribution.

(2) My TVL suggestion works only if we proceed block by block: we cannot apply an 86x boost to someone who lent (and maybe withdrew, by the way) over the whole period. That boost should apply only to the first week of his deposit (as if we were retroactively distributing COMP as if it had been distributed since day one).

(3) Do you think the huge difference between @alive’s address count and yours is due to the cutoff date? (You picked 15 June and he chose 8 June.)

4 Likes

Regarding (1), I don’t disagree; however, I think the practical difference between your

“integral over the eligibility window of capital supplied+borrowed times TVL weight factor, int[C(t)*w(t) dt]”
and my
“sum of capital supplies/borrows times TVL weight factor at the time of the interaction, sum_i[C(t_i)*w(t_i)]”

will be negligible for all but a handful of addresses that supplied and borrowed multi-million USD worth of assets. Both formulas will deliver between 20 and ~21 COMP per user unless the user is in the multimillionaire early user club. But I can’t prove this hunch without actually implementing it, which is a fair amount of extra work.
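
To make the two formulas concrete, here is a toy computation (the window length, weight function, and the user’s deposit are hypothetical). Note the integral carries extra units of time (dollar-days), so the two scores are only comparable after normalization; this sketch just shows the mechanics:

```python
X, T = 86.26, 655.0
w = lambda t: X - (X - 1) * t / T      # linear TVL weight factor

# Hypothetical user: supplies $1,000 at day 10, withdraws at day 20.
C, t_in, t_out = 1000.0, 10.0, 20.0

# "Sum at interaction time": the supply counts once, weighted at t_in.
score_sum = C * w(t_in)

# "Integral over the window": for a linear w, the exact integral of
# C * w(t) dt from t_in to t_out equals duration * C * w(midpoint).
score_int = (t_out - t_in) * C * w((t_in + t_out) / 2)

print(score_sum, score_int)
```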

For the same reasons, I follow your argument (2) but also believe that the impact of replacing the sum involving w(t_i) with the full integral involving w(t) will be minor except at the very, very top of the list.

For (3), the June 15 date is only used as the date at which EndPrice for each underlying token was determined. The EarlyUserCutoffBlock everywhere else (specifically the address list generated by CompoundEarlyUsers.py and the distribution list generated by EarlyUserProposal.weighted.py) is block 10228172, the last block produced on June 8, 2020.

I can’t speak for @alive but would guess that the 5,000 user estimate was either a back-of-the-envelope calculation (and honestly not a bad one either, off by less than a full order of magnitude) or based on higher thresholds for what’s considered a small-value (dust) transaction and therefore not counted as a bona fide user.

Edited: previously I stated that my formula exclusively used the TVL at time of first interaction, w(t_0), but actually I am using w(t_i), that is, each transaction’s capital is weighted by the TVL factor at the time of the transaction.

2 Likes