Should Compound Retroactively Airdrop Tokens to Early Users?

It would be nice if some of the VC founders/funders took a position on this retroactive airdrop/reward. It seems that much of the DeFi space is concluding that rewarding early users can give a major boost to governance. Just look at how Badger Finance is having a golden era because its community is so strong.

And it's true, I may be biased because I used the protocol in March 2019! But do you know why I am in this Ethereum space? Because I believe in the community and in true decentralization. I hope Compound doesn't make the mistake of only looking to the big pockets/money.

10 Likes

I feel that raining a large sum of money on anyone who happened to use Compound in the past is not an efficient use of the protocol's capital. The community should align itself with "what is best for ensuring the platform's future of becoming the global community-owned lending platform?" There is a lot of work to be done, and where we are now is just the very beginning. I'm not opposed to a modest degree of financial reward for early users, but I think the bulk of the benefit should come in the form of giving them a powerful voice in the stewardship of Compound. That voice comes through voting power, but it doesn't have to come in the form of unrestricted COMP tokens given away freely.

I propose that the COMP token allocation to early adopters be locked in a voting delegation contract in perpetuity. This contract could allow the early user to vote directly or delegate their votes to others.
The contract could also include a feature to transfer ownership of who controls those votes, which could allow an early adopter to extract value through private sales if, in the future, the market deems voting rights on Compound to be valuable in and of themselves.

This eliminates the immediate financial gain and the perverse incentive structure that's causing a buzz on this forum among people who stand to gain from raining money. The true stewards of Compound's future will be able to use this gift to help build its legacy. For those just seeking short-term financial gain, this gift will be worthless; and if, because of that, they choose not to use their votes, that just makes everyone else's vote more powerful.

Yes, large retroactive airdrops have been popular in the DeFi space and they create a flurry of publicity within crypto-insider communities, but Compound is already so well established that the marginal benefit seems small. Airdrops aren't effective beyond this insular community, and the majority of the tokens awarded in past airdrops just end up being sold on the market. We need to think much bigger and longer term.

You can think of this perpetual lock as a burn of the COMP tokens, but not of the voting rights attached to them. The burn of value does enrich existing COMP holders to a marginal degree by decreasing the fully diluted supply, but that's more than offset by increasing the attractiveness of future COMP emissions and grants. In my opinion, grants are a much more effective use of the protocol's capital in pursuit of its long-term goals.

Locking the COMP rewards those who came early and truly care about this project in the purest form.

3 Likes

Here's one reason why a vested airdrop of 100 COMP should take place: right now, when the VCs vote, it's pretty much already a set decision.

5 Likes

Is there any result on the retroactive airdrop?

1 Like

The problem is that normal users now have nearly no voting weight to change the result of a proposal. The protocol is controlled by a few citadels. An airdrop can distribute more COMP into users' hands rather than those citadels', and it would make the protocol more decentralized.

5 Likes

That is the big problem for Compound; I think it's more CeFi than DeFi currently. I hope the protocol will become more decentralized in the future.

4 Likes

I say yes to an initial airdrop of COMP tokens to early users, but only through some sort of "claim" mechanism. Also, the COMP not claimed by those users/addresses could still be airdropped to users who interacted with Compound after the launch of the COMP token but before the initial airdrop. Those users would also have to "claim" their airdrop.

My ETH address, 0xe84d25b1C4fe0E5A9bEe95934AB24C9867Aac2cc, was created using Coinbase Wallet; however, I have lost the keys to this wallet, and the address will always be earning interest on $360 worth of USDC, LOL! This is the reason I believe the early adopters should have to claim their airdrop, with the amount of COMP not claimed being airdropped to addresses that are currently supporting the protocol.

Airdropping COMP to early adopters and not current supporters could be a mistake, as early adopters may not have any use for a governance token besides its monetary value, which means they'll most likely dump the COMP as soon as they claim it. Current supporters of the Compound protocol would most likely use the COMP for voting or supply it to Compound.

Just my 2 cents.

2 Likes

How did it go in this topic for the retroactive airdrop?
Many DeFi platforms have already done their airdrops… only we remain.

3 Likes

If Compound/Coinbase lasts for the next 500 years and you get an average of 5% on that 360 USDC, you will have 14 trillion dollars in the year 2521 (360 × 1.05^500 ≈ $1.4 × 10^13).

3 Likes

Justin Sun (founder of TRON) added $1,000,000,000 worth of ETH to Compound. He is now farming 80,000 COMP a day!!!

Well… we could have done this job (giving governance to early users), but the VCs and other whales waited until this governance attack happened. Justin is not going to use it for Compound or Ethereum; he has his own agenda, focused on his truly bad blockchain, TRON.

Anyhow… Compound is moving slowly; governance for early users with a vesting/lock-up period could make this protocol more open.

Happy Easter, enjoy the governance attack.

7 Likes

This is a serious problem

3 Likes

I was inspired by @grasponcrypto's effort with the Dune Analytics data to see if I could put together a more flexible tool for us to gather detailed info on early users of the Compound protocol to help move this discussion forward. I've played around with the Compound Subgraph but haven't found a way to extract all the data we need that way, so I'm using the web3 Python module to scan the blockchain for cToken interactions directly.

A first attempt is posted here.

This tool has been spot-tested on a few randomly selected blocks bearing cToken interactions but hasn't been run over the full range of relevant blocks yet: I don't have access to a local Ethereum node or the funds for a remote/paid service for the remote procedure calls (RPCs), so I cannot loop over all relevant blocks to produce an exhaustive list of addresses.

A possible next step would be to start a conversation in the grants channel on Discord and see if some funds to cover the RPCs, or an ethereum full node, could be made available to run the script and collect the data, which we could then analyze together to develop a robust proposal.

I welcome comments, suggestions, and pull requests, especially any ideas for reducing the number of required RPCs. As currently implemented, the script needs to have a peek at every ethereum transaction that took place between the deployment of the first cToken contract and the deployment of the COMP token contract.

Like many in the conversation here, I am a small-potatoes early user with a completely unrelated day job who appreciates the enormous amount of labor that has gone into the protocol's development and believes that empowering early users with a greater voice in governance will be a net benefit to the project.

8 Likes

Is the proposal still alive?

1 Like

Thanks to @blck for pointing me toward web3's Contract.events, I no longer need to loop over all transactions and can obtain all the metadata on early users we should need with a very modest number of RPCs. @blck also pointed out that my first attempt ignored V1 users(!); I think I've rectified that in the new version.

Here is the updated tool.

I've also posted the resulting information in CSV format, separately for V1 and for V2:

Compound V1 Early User List (36942 txs by 28021 unique addresses)
Compound V2 Early User List (254436 txs by 218725 unique addresses)

I think/hope this is moving us in the direction of what @getty had in mind.

I didn't provide headers, but you'll see in the lists that each row contains the block height, user address, type of interaction with the Compound protocol, amount of tokens, and underlying token ID (cashtag) for each transaction. Also, I didn't remove duplicate addresses: if you interacted with the protocol several times, there should be a separate line for each transaction.
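For anyone who wants to poke at the lists programmatically, here's a minimal sketch of how the headerless rows could be parsed and summarized with the Python standard library. The sample rows, addresses, and exact column order are assumptions based on the description above (block height, address, interaction type, amount, token), not actual protocol data.

```python
import csv
import io
from collections import Counter

# Toy rows in the described format (no header):
# block_height, user_address, interaction_type, token_amount, underlying_token
SAMPLE_CSV = """\
7711104,0xAaa...,MINT,100.0,DAI
7711210,0xBbb...,BORROW,50.0,USDC
7711999,0xAaa...,REPAY,25.0,USDC
"""

def summarize(csv_text):
    """Count transactions per unique address from a headerless early-user list."""
    txs_per_address = Counter()
    for row in csv.reader(io.StringIO(csv_text)):
        if not row:
            continue
        _block, address, _action, _amount, _token = row
        txs_per_address[address] += 1
    return txs_per_address

counts = summarize(SAMPLE_CSV)
print(len(counts))          # number of unique addresses
print(counts["0xAaa..."])   # tx count for one address
```

The same loop over the real V1/V2 files should reproduce the unique-address counts quoted above.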

As a first sanity check, it would be awesome if the folks who didn't see their addresses in @grasponcrypto's Dune Analytics results could check these outputs and see if they're listed here.

As a next step, I would suggest we slice the data in a few of the different ways that have been suggested on this forum already. Here are a few of the ways I noticed:

1. (UNI-like or socialized distribution style) equal distribution to all early user addresses

  • with or without a minimum value threshold (what value?)
  • with or without a bonus multiplier for V1 users (what multiplier?)

2. (COMP-like or capital-weighted distribution style) pro-rata distribution based on (value supplied+borrowed)*(time)

  • how should early liquidators be included in a pro-rata distribution, if at all? They lack the "time" axis that suppliers and borrowers have.

3. Pro-rata distribution based only on total value supplied, borrowed, and liquidated, not based on time.

  • for either 2 or 3, should the pro-rata distribution be rescaled to elevate small early users relative to early whales? I have seen quadratic scaling suggested; or we could take even more aggressive logarithmic scaling, where the amount received grows linearly with each order of magnitude of value supplied to, and/or borrowed from, the protocol.

The script does not currently access the price data that would be needed for distribution styles 2 and 3. If someone else wants to add that piece to the puzzle, feel free to submit a pull request! Otherwise I'll take a stab at it, but again this isn't really my wheelhouse, so it might take me a while.
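To make the options above concrete, here's a toy sketch of what each distribution rule could look like. The addresses, value*time figures, and pool size are all made up for illustration; a real proposal would plug in the data from the lists above plus historical prices.

```python
import math

# Hypothetical per-address value*time totals in a common unit (e.g. USD-days).
# These numbers are invented for illustration only.
value_time = {"0xAaa": 1_000_000.0, "0xBbb": 10_000.0, "0xCcc": 100.0}

POOL = 300.0  # COMP to distribute in this toy example

def equal_split(users, pool):
    """Option 1: UNI-like socialized distribution, one equal share each."""
    share = pool / len(users)
    return {u: share for u in users}

def pro_rata(weights, pool):
    """Options 2/3: capital-weighted distribution on raw weights."""
    total = sum(weights.values())
    return {u: pool * w / total for u, w in weights.items()}

def sqrt_scaled(weights, pool):
    """Rescaled pro-rata: the square root dampens the whale advantage."""
    damped = {u: math.sqrt(w) for u, w in weights.items()}
    return pro_rata(damped, pool)

print(equal_split(value_time, POOL))
print(pro_rata(value_time, POOL))
print(sqrt_scaled(value_time, POOL))
```

With these toy numbers, the raw pro-rata split gives the whale 10,000× the smallest user, while the square-root rescaling compresses that to 100×, which is the kind of trade-off the rescaling question is about.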

What do folks think? Are there other ways of slicing the data you'd like to see? Can we narrow it down to one or two of these options before producing the actual list of addresses and distribution amounts for a proposal?

Last but not least, will we be able to pull together enough COMP to submit a CAP once we've converged on the details? I can get us a few percent of the way there; maybe if there are a few dozen of us with a few COMP each…

12 Likes

Thanks for the awesome work to start this off.
But wouldn't a block-height cutoff be better than going by which Compound protocol version was used?

Based on the list:
Compound V1 has transactions until Feb-27-2020 (block 9567234),
while Compound V2's first transaction was on May-07-2019 (block 7711104).

I think this would narrow it down to these 3 choices:

  1. Airdrop to V1 users (cutoff before the start of Compound V2, block 7711104)
  2. Airdrop to V1 & V2 users with a June 8 cutoff, one week before the COMP token launched, as originally suggested by @alive
  3. No airdrop at all

3 Likes

Great effort! Thanks for putting in the time on this.

I think you may be using the wrong decimals for USDC (6) and WBTC (8). The numbers in the report are mostly zeros.
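For reference, this is the kind of per-token scaling that's needed: ERC-20 amounts are raw integers that must be divided by 10^decimals, and the decimals differ per token. The map below covers only a few common tokens; real values should be read from each token contract's decimals() call.

```python
# Token decimals differ per underlying: dividing a raw USDC amount by 1e18
# instead of 1e6 makes it look like (nearly) zero in the output.
DECIMALS = {"USDC": 6, "WBTC": 8, "DAI": 18}  # common values; verify per contract

def to_human(raw_amount, token):
    """Scale a raw on-chain integer amount to human-readable units."""
    return raw_amount / 10 ** DECIMALS[token]

print(to_human(360_000_000, "USDC"))  # 360.0 USDC
print(to_human(360_000_000, "DAI"))   # 3.6e-10 -- the near-zero bug described above
```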

1 Like

I would favor a distribution like (2) that increases somewhat with larger value*time, but sub-linearly. So something like sqrt(value*time), but I wouldn't worry too much about the exact form. This achieves several goals:

  • wide distribution by rewarding small users
  • recognizing bigger players as contributing more to protocol success
  • being more efficient in getting a significant quantity of COMP to small users without breaking the bank with huge rewards to big users

Just to clarify how UNI was distributed: as far as trading users went, it was a simple constant value, but for liquidity providers it was more complicated. Something like a fixed amount of UNI per day, distributed pro rata among LPs according to their capital value. So early LPs got big rewards because they were supplying a large fraction of the total. I think the idea was to reward those who took the most risk and contributed to the early bootstrapping.

You can make a good argument for that approach here, but personally I'd support it either way.

2 Likes

Yes I am, thank you! I'll have that corrected in the next update.

2 Likes

I think you summed it up well.
First, thank you for the big amount of work, which is necessary to move this proposal forward.
I'll try to sum up my ideas; they are just ideas open to discussion, of course.

I would agree with the others and think we should propose 1 and 2.
1: a socialized allocation, which would grant a fixed amount to each early user.
2: a capital-weighted allocation, which would match the current COMP distribution in a way.
We could make it proportional to the total amount of capital lent+borrowed through time.
Or, maybe better, proportional to the total amount of interest paid+received through time.

However, as pyggie said, a dollar supplied in Compound's earliest days would represent a larger share of the total liquidity than a dollar supplied a week before the COMP distribution. Thus we would have two choices:

a) Either try to perfectly match the current COMP distribution: at each block, a dollar lent on Compound would represent a certain % of the network depending on the total liquidity, so basically we would simulate the behaviour as if the COMP distribution had started for early users at the earliest stage. But I am not even sure that is possible.

b) Or do it MUCH more simply, but still fairly: apply a (linear?) coefficient which, like (a), would be able to weight the share of the network taken by a liquidity provider at a given time.
For example, we could apply a coefficient that decreases linearly from 4 to 1, starting at the Compound V1 launch and finishing at the cutoff date. This might not totally follow the exponential liquidity growth, but it would at least partly weight the oldest liquidity at its share of the network.
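As an illustration, a linear early-bonus coefficient of this kind could look like the sketch below. The block numbers are placeholders, not the actual V1 launch or cutoff blocks.

```python
# Hypothetical block bounds for the linear bonus; the real values would come
# from the actual V1 launch block and the chosen cutoff block.
START_BLOCK = 6_400_000    # assumed V1-launch-era block (placeholder)
CUTOFF_BLOCK = 10_200_000  # assumed cutoff block (placeholder)

def early_bonus(block, start=START_BLOCK, cutoff=CUTOFF_BLOCK, hi=4.0, lo=1.0):
    """Coefficient decreasing linearly from `hi` at `start` to `lo` at `cutoff`."""
    # Clamp so blocks outside the window get the boundary values.
    t = min(max((block - start) / (cutoff - start), 0.0), 1.0)
    return hi - (hi - lo) * t

print(early_bonus(START_BLOCK))                     # 4.0
print(early_bonus(CUTOFF_BLOCK))                    # 1.0
print(early_bonus((START_BLOCK + CUTOFF_BLOCK) // 2))  # 2.5
```

Each supplied/borrowed dollar would then be weighted by this coefficient before the pro-rata split, so the oldest liquidity counts for up to 4× as much.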

I would be in favour of putting these features to a vote.

2 Likes

Awesome work. I have a local geth node running, so if you want some help there, DM me on Discord.

2 Likes