[RFC] Partial Liquidations

Hello, dear community

As a follow-up to our previous post on the partial liquidations discussion, we are sharing a short update on the topic.

Next steps

Since the partial liquidations solution has been finalized and enhanced with the ideas and innovations offered during the community discussion, we are moving to the next steps:

  1. A Snapshot vote is planned to be pushed later this week;
  2. Development kickoff shortly after that (the team is getting ready);
  3. Start of the implementation of the presented solution.

However, to avoid losing time, we are starting parallel work on the retro-analysis and modeling section described in the last post.

What it is about

The goal of the data gathering, retro-analysis, and modeling section of the scope is to ensure that the partial liquidation solution is applied in the most efficient and secure way. As a reminder of why it is needed:

  • The primary goal is to determine the most efficient order for updating markets, identify markets exposed to a certain collateral imbalance risk, and thereby determine which markets may need changes in collateral order or certain tweaks to the solution;
  • An additional goal is to suggest the order of market updates for building a deployment pipeline and to offer an initial setting for the target HF (health factor) for each of those markets;
  • To achieve that, we will need to model the market state after the partial liquidation solution is implemented, as well as the change in users’ risk profiles under the updated liquidation mechanism;
  • The previous step requires a specific model to start with; thus, it will require a retro-analysis of the Comet markets’ behavior over the last 1-1.5 years. The observed market states and user behavior will serve as the basis for the modeling;
  • And to perform the retro-analysis, we will need a dataset reflecting the Compound market state. Our engineers will gather data on the few most suitable markets over the last 1-1.5 years.
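As a rough illustration of the modeling target above, the sketch below shows how a target HF could determine how much debt a partial liquidation repays to bring a position back to health. This is a simplified, hypothetical model: the health-factor formula, the liquidation factor, and the penalty value are all illustrative assumptions, not the actual Comet mechanics or the finalized solution.

```python
# Illustrative sketch only: a simplified position model showing how a
# target health factor (HF) could size a partial liquidation.
# All names and numbers are hypothetical stand-ins.

def health_factor(collateral_value: float, liquidation_factor: float,
                  debt: float) -> float:
    """HF = collateral value at the liquidation threshold / outstanding debt."""
    return (collateral_value * liquidation_factor) / debt

def repay_to_reach_target(collateral_value: float, liquidation_factor: float,
                          debt: float, target_hf: float,
                          liquidation_penalty: float = 0.05) -> float:
    """Debt amount to repay so the position returns to target_hf.

    Repaying r seizes roughly r * (1 + penalty) worth of collateral,
    so we solve: (C - r*(1+p)) * lf = target_hf * (D - r).
    """
    lf, p = liquidation_factor, liquidation_penalty
    num = target_hf * debt - collateral_value * lf
    den = target_hf - lf * (1 + p)
    return num / den

# Example: $120k collateral, 80% liquidation factor, $100k debt -> HF = 0.96
hf = health_factor(120_000, 0.80, 100_000)
repay = repay_to_reach_target(120_000, 0.80, 100_000, target_hf=1.10)
```

Running a model like this across historical positions is one way the retro-analysis could suggest a per-market target HF.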

Why start it in parallel

While that data is directly needed for the partial liquidation implementation, collecting it is partly a separate task that serves several sub-goals.

  1. The collected data will be used in future research, including several ongoing projects initiated by Woof! engineers and researchers (e.g., periphery improvements for the deployment flow, testing scenarios, and the future improvements track);
  2. The data will also power insightful illustrations for future posts and community updates, especially since it will be continuously updated, e.g., for regular state-of-the-market reports;
  3. The developed model of the market state and user behavior can be shared with the community and with teams interested in innovations for the Compound ecosystem;
  4. It will also be used to model protocol behavior after other updates, as it provides a basis to work with, so proposed solutions can be checked retrospectively.

So, even though the partial liquidation solution must first pass a Snapshot vote, the data gathering and analytical model creation will still be needed for future ecosystem enhancements.

What we are working on

We start with basic data and plan to extend the collection to more complex forms:

  1. The first data to collect is a simple snapshot of several markets:
    1.1. The initial state of the market;
    1.2. Collection of the subsequent supplies and withdrawals (of both the base asset and collaterals);
    1.3. Collection of the subsequent liquidations, recording the collateral value seized and the debt closed;
    1.4. Collection of the subsequent collateral purchases, recording the amount of the base asset returned to the market and the amount of collateral purchased;
  2. While the initial data is basic, it will be further aggregated into daily and weekly values, allowing analysis of behavior and state changes;
  3. Furthermore, the data will be enhanced with in-moment asset prices and a price impact analysis (based on the scraped sell transactions following collateral purchases);
  4. After that, a separate dataset will be collected purely for liquidation analysis (especially for the dynamics of reserves) and for analysis of interest paid and earned;
  5. As a “big goal”, we have in mind collecting data on different user behavior patterns (e.g., risk profile, risk tolerance in terms of % above the liquidation threshold, portfolio size in terms of position size, etc.).
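As a minimal sketch of the daily aggregation in step 2, assuming a hypothetical event schema (unix timestamp, event type, USD amount) that stands in for the actual dataset, the raw records could be rolled up like this:

```python
# Illustrative sketch only: rolling up raw market events (supplies,
# withdrawals, liquidations) into daily totals. The event tuples below
# are a hypothetical stand-in for the dataset described above.
from collections import defaultdict
from datetime import datetime, timezone

events = [
    # (unix timestamp, event type, base-asset amount in USD)
    (1700000000, "supply",      5_000.0),
    (1700003600, "withdraw",    1_200.0),
    (1700086400, "liquidation", 3_500.0),
    (1700090000, "supply",        800.0),
]

def aggregate_daily(events):
    """Sum amounts per (UTC day, event type) pair."""
    daily = defaultdict(float)
    for ts, kind, amount in events:
        day = datetime.fromtimestamp(ts, tz=timezone.utc).date().isoformat()
        daily[(day, kind)] += amount
    return dict(daily)

daily_totals = aggregate_daily(events)
```

The same grouping key extended to ISO weeks would give the weekly view, and joining in-moment prices onto the same timestamps would support the price impact analysis in step 3.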

“Next” next steps

While the partial liquidation implementation goes through all the necessary steps, we will work on the data collection and analysis, so that once development starts, it is efficient and oriented toward risk mitigation. We will keep the community updated on our progress and will post the next update, including some of the data and insights, in about two weeks.