What is CoFi? In-depth discussion on the application of “computable finance” in the field of DeFi oracles

What formula can make the oracle quotation more accurate and minimize the risk for market makers and users?

Original Title: “Distributed Classroom: Sharing on the Application of Computable Finance to DeFi Oracles”
Written by: James, CoFi Researcher, NEST Community

An oracle is generally regarded as the bridge between decentralized protocols and data outside the blockchain; in DeFi, its main job is price quotation. So, can current DeFi oracles really produce accurate quotes?

In this second session of Distributed Classroom, NEST Community CoFi researcher James was invited to talk about computable finance: what formula can make an oracle's quotes more accurate while minimizing the risk for market makers and users?

James is a firm believer in blockchain. He focuses on research into decentralized native assets and computable finance, looks at the external world from within the blockchain world, and firmly believes that human society's wealth will gradually shift to decentralized assets.



What is the difference between decentralized assets and real-world assets? The difference is that a decentralized asset, once formed, cannot be replicated by real-world assets: you cannot reproduce its risk-return structure with any existing large class of assets.

From the perspective of finance, what does finance do? Broadly, it deals with all kinds of human uncertainty. Each uncertainty corresponds to a risk-return structure, and whenever you discover a new risk-return structure and build the corresponding asset or product, you reduce the uncertainty humanity faces in asset allocation.

The easiest case to understand is volatility risk. Take Markowitz's portfolio theory: if two uncorrelated assets are combined, the portfolio's volatility falls without changing the return structure, improving the risk profile of the investment. The goal of decentralized assets is, likewise, to reduce the uncertainty facing all of humanity.
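
A minimal numerical sketch of that diversification effect (the volatilities, weights, and zero correlation below are assumptions chosen for illustration):

```python
import math

def portfolio_volatility(w1, sigma1, w2, sigma2, rho):
    """Standard deviation of a two-asset portfolio (Markowitz)."""
    variance = (w1 * sigma1) ** 2 + (w2 * sigma2) ** 2 \
        + 2 * w1 * w2 * rho * sigma1 * sigma2
    return math.sqrt(variance)

# Two assets with 30% annualized volatility each, held 50/50.
# With zero correlation the portfolio volatility drops to ~21%,
# while the expected return stays the weighted average of the two.
print(portfolio_volatility(0.5, 0.30, 0.5, 0.30, rho=0.0))  # ~0.212
print(portfolio_volatility(0.5, 0.30, 0.5, 0.30, rho=1.0))  # 0.30, no benefit
```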

Credibility and availability

There are two variables: one is credibility and the other is availability. Availability eliminates uncertainty for ordinary people; credibility eliminates uncertainty for humanity as a whole. BTC spends nothing on availability, but burns roughly 20 billion in electricity bills every year to solve the credibility problem. So who solves availability? Probably Bitcoin holders: they explain to ordinary people what Bitcoin is about, so that people who do not understand Bitcoin can come to understand it. Satoshi Nakamoto created Bitcoin to propose a new risk-return structure and thereby eliminate uncertainty for the whole human race.

In the entire decentralized market, as soon as a centralized institution is introduced, your risk-return structure becomes, to some extent, equity-like. That is equivalent to copying "Bitcoin + equity" without creating anything genuinely new. This is why we want the project to be thoroughly decentralized. Here I also want to mention DeFi, which has been especially popular recently.

The current problems of DeFi are, first, that project risk is unquantifiable and incalculable, and second, that it is hard for DeFi to bring centralized assets on chain. If centralized assets cannot be brought on chain, a DeFi project may ultimately be nothing more than a computation contract. A computation contract means that no matter how you compute, the amount of information does not increase; the point of the computation is merely to recode messy information into a form that is easier to understand. In this process it does not eliminate human uncertainty in general, although it may eliminate the uncertainty of certain specific groups.

If a DeFi project cannot bring centralized assets on chain, then once competition is pushed to the limit, no fee can be charged. Bitcoin makes a transfer function credible and thereby realizes transfers; Ethereum expands this transfer function into a logic function. A key property of this logic function today is that all computation must finish within polynomial time and bounded resources: because it consumes resources, it must terminate within the given limits.


Asset pricing

Like many problems in the real world, asset pricing is a hard problem. Think of designing traffic lights, the optimal transportation network, the optimal social network, or the optimal business network: these network problems are all complex problems.

Asset pricing is an optimal-price computation problem, and it cannot be solved by a P-class (polynomial-time) computation. Ethereum cannot price assets with smart contracts alone. Therefore, without an oracle, Ethereum can only do three things: trading (Uniswap), stablecoins (USDT), and ETH-WETH conversion. This is determined by the limits of P-class functions.
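
For contrast, the kind of pricing a contract can do is a closed-form, polynomial-time rule such as Uniswap's constant product: given an input amount X, it deterministically outputs Y. A minimal sketch (the reserve sizes and the 0.3% fee are illustrative assumptions):

```python
def constant_product_output(dx, reserve_x, reserve_y, fee=0.003):
    """Output amount for a swap on an x*y=k constant-product pool.

    This is a pure P-style computation: one closed-form evaluation,
    with no search for an external equilibrium price.
    """
    dx_after_fee = dx * (1 - fee)
    k = reserve_x * reserve_y
    new_reserve_x = reserve_x + dx_after_fee
    new_reserve_y = k / new_reserve_x
    return reserve_y - new_reserve_y

# Swap 10 ETH into a pool holding 1,000 ETH and 400,000 USDT.
print(constant_product_output(10, 1_000, 400_000))  # ~3,948 USDT out
```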

Since such a complex asset-pricing computation is hard to complete on chain, a new mechanism has to be built that lets the chain approach the result. So we constructed the following scheme: assuming there is no external market, how do we approach this price; or, if there is an external market, how do we pass the off-chain price up to the chain?

If the result of an NP-class problem can be placed on chain, it increases the amount of information on chain and provides brand-new information to the whole ecosystem. The important point is that this keeps expanding the boundary of the blockchain, and only expanding the boundary counts as substantive progress. If the whole network only progresses horizontally, say faster execution or bigger blocks, that is not a substantive improvement. In fact, much of our understanding of blockchain needs to be adjusted.

In terms of eliminating human uncertainty, not everyone needs to verify the ledger; as long as the system is open, everyone has the power to verify the ledger. By analogy with Layer 1 and Layer 2, Layer 2 eliminates uncertainty for ordinary people, while Layer 1 creates new value. Developers can discuss how to increase block size, improve packaging time, and so on.


So, against this background, what will NEST do? First, NEST will form a new decentralized risk-return structure. Second, it needs to extend the blockchain's capabilities so that things that could not previously be done on chain can now be done, and all of this must remain decentralized.

When NEST's oracle is used, a series of by-products are produced. The first very important by-product is making risk calculable at the moment of quotation. Once credit risk is stripped away, the whole calculation becomes relatively accurate. Credit risk here mainly means counterparty (entity) risk left over after project risk is excluded. Counterparty risk is generally hard to calculate: how much cash flow a project can eventually generate and its probability of failure can be calculated and analyzed, but what do you do if the party holding the money absconds? This reflects the incompleteness of the system.

After the blockchain strips away credit risk (in a decentralized way), only liquidity risk and volatility risk remain. Liquidity is a natural advantage of blockchain, so I will not dwell on it here. Volatility risk can be calculated, and this has a strong theoretical basis: back in the 1970s, Samuelson, Black, Merton, Fama and others put forward the relevant financial ideas. If historical models have studied this risk so thoroughly, and these risks can be priced, can't they also be managed automatically?
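
That volatility is measurable requires nothing more than a historical estimate over a price series. A minimal sketch (the sample prices are made up for illustration):

```python
import math

def annualized_volatility(prices, periods_per_year=365):
    """Annualized volatility estimated from a series of daily closing prices."""
    log_returns = [math.log(p1 / p0) for p0, p1 in zip(prices, prices[1:])]
    mean = sum(log_returns) / len(log_returns)
    variance = sum((r - mean) ** 2 for r in log_returns) / (len(log_returns) - 1)
    return math.sqrt(variance) * math.sqrt(periods_per_year)

sample_prices = [380, 392, 385, 401, 396, 410, 388, 395]  # hypothetical closes
print(annualized_volatility(sample_prices))
```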

These ideas were later absorbed by hedge funds, which built many new investment and risk-management models. The most typical was Long-Term Capital Management: although they built a very sophisticated model, they were ultimately undone by a risk they had not calculated, counterparty risk, when Russia defaulted.

Today we all talk about AlphaGo beating top Go players and other artificial-intelligence topics, but this dream already existed in Turing's era, and despite the stop-start development of AI it persists. Although a computable model is still exposed to counterparty risk, in the decentralized blockchain world there is (at least logically) no need to worry about that risk. Without credit entities, and setting aside market inefficiency, making risk management algorithmic is clearly much easier than it was in the era Long-Term Capital Management faced.

Can the calculation of risk management be done on the chain?

In fact, DeFi today boils down to trading, interest rates, and securities (positive and negative securities). These three structures are a very rough division of finance.

A discipline rests on two things. The first is its basic concepts: the debate over each basic concept may take hundreds of years, and each concept is forged through countless refinements. The second is that these concepts must drive the field toward its theoretical commanding heights; if the theory amounts to common sense, the discipline is meaningless.

When Satoshi Nakamoto designed the blockchain architecture, computing, storage, and communication technologies already existed. What a digital currency had to solve was the double-spending problem: money spent once cannot be spent a second time. Zero-knowledge proofs and homomorphic encryption had already been discussed, but they cannot solve double-spending. Decentralization is more than being distributed: it does not mean I distribute work to you, but that you voluntarily submit something and the pieces combine automatically.

Looking at it now, Bitcoin's architecture is rather inefficient: it is extremely redundant in computation, storage, and communication, with many machines repeating the same calculations and storing the same data. Such a redundant architecture makes order matching hard to complete (that is, asset pricing is very difficult), because the problem cannot be solved by a simple function or by voting.


From a market perspective, order matching is the pinnacle of information interaction in the modern economy, and matching is not well suited to the chain. In other words, even if on-chain matching were feasible, off-chain exchanges would retain the greater advantage; it is a question of comparative advantage.

What types of traders exist in the microstructure of a matching market? The first level is insider traders, the second information traders, the third market makers, the fourth value traders, and the fifth noise traders. Let me focus on information traders. An information trader is someone whose information gives them an edge; at every moment they must reprice the asset according to the information they hold. For example, every time new information arrives, they will tell you by how much the price has changed. These information holders have the strongest appetite for trading and the highest demands on computation, storage, and communication, and they are also the main providers of trading volume and liquidity on exchanges.

For example, high-frequency traders on American exchanges work down to the nanosecond, and they keep trading elsewhere when an exchange is closed, because they believe information is changing constantly and that their informational edge requires them to price assets at all times. For such people, exchanges with advantages in computation, storage, and communication are the better fit. Likewise, whatever asset proves valuable on Uniswap will also be listed on a centralized exchange, because information traders want to profit from it, they have the greater advantage there, and they are more willing to provide trading volume and liquidity.

Assuming there is an off-chain price sequence, how is it formed on chain?

Since the blockchain has no advantage in pricing, we believe price computation should happen off chain. The blockchain does not make you more efficient; it makes you more credible. Assuming there is an off-chain price sequence, how do we generate that price sequence on chain and keep it truthful?

  • The first is that the generation and verification of the data must be decentralized.
  • The second is to ensure that the on-chain price and the off-chain price do not deviate.
  • The third is that the generation mechanism cannot be manipulated by others.
  • The fourth is that every price sequence is generated with some delay, unless there is a price in every block; here delay means the block interval between the latest effective price and the block in which the price is called and the transaction completes.

When you call off-chain data onto the chain, the first problem is price deviation, and this deviation is exactly what off-chain, replicable trades can arbitrage. At the same time, delay indirectly affects price deviation, because prices at different times necessarily differ. Current DeFi oracles do not manage risk on either point. In fact, when you provide liquidity on Uniswap, the market maker bears both price-volatility risk and the risk of being arbitraged; the question is how big that risk is and whether the fees the market maker charges can cover it. At present this type of oracle model is still at a fairly crude stage.

The formation of the external equilibrium price is an NP problem (not solvable in polynomial time), whereas what Uniswap does is a P-class computation (polynomial time: give me an X and I output a Y). So when the market price moves, the pool will certainly be arbitraged. If you can ensure that gains cover losses, an equilibrium can form; you then also need that equilibrium to be stable, so that a disturbance does not push it to where losses can no longer be covered, otherwise the market makers will withdraw.
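
To see why an unmanaged price deviation is simply handed to arbitrageurs at the market maker's expense, here is a minimal sketch of a constant-product pool whose implied price lags the external market (pool sizes and the external price move are assumptions; fees are ignored):

```python
def arbitrage_profit(reserve_x, reserve_y, external_price):
    """Approximate profit an arbitrageur extracts from a constant-product pool
    whose implied price (reserve_y / reserve_x) lags an external price.

    The arbitrageur trades until the pool's marginal price matches the external
    market; all flows are valued at the external price. Fees are ignored.
    """
    k = reserve_x * reserve_y
    # Pool state at which the pool's marginal price equals the external price.
    target_x = (k / external_price) ** 0.5
    target_y = k / target_x
    # Net value of the arbitrageur's flows, priced at the external market.
    return (reserve_x - target_x) * external_price + (reserve_y - target_y)

# Pool quotes 400 USDT/ETH while the external market has moved to 420 USDT/ETH.
print(arbitrage_profit(1_000, 400_000, external_price=420))  # ~ +244 USDT
```

The positive number is what the market maker gives up; whether trading fees cover it is exactly the question raised above.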

When off-chain information is brought on chain there will always be a price deviation, so how do we bound it? Because minimizing the price deviation directly is very difficult, we require instead that the price deviation be less than or equal to the cost of arbitrage. The NEST oracle mechanism therefore combines a two-way option, a price chain, and a Beta coefficient. The Beta coefficient defends against self-dealing attacks; the price chain means the information flow can continue indefinitely; the two-way option is part of the quotation mechanism. There is a verification period, mainly to resist congestion attacks; otherwise it could in theory be as short as possible, converging to the equilibrium price.
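
A highly simplified toy of the quote-and-verify flow described above (the 25-block window and Beta = 2 come from the talk; the class names, amounts, and price are illustrative assumptions, not the actual NEST contract):

```python
from dataclasses import dataclass

@dataclass
class Quote:
    """A two-sided quote: during the verification window anyone may trade
    against it in either direction at `price` (a two-way option)."""
    eth_amount: float        # ETH posted by the quoter
    price: float             # tokens per ETH
    blocks_remaining: int    # verification window, e.g. 25 blocks

BETA = 2  # a taker must post a new quote at BETA x the scale of the one taken

def take_quote(quote: Quote, eth_taken: float, new_price: float) -> Quote:
    """Verify (arbitrage) a quote: trade against it, then post a new quote at
    BETA x the scale taken, extending the price chain."""
    assert quote.blocks_remaining > 0, "verification window closed"
    assert eth_taken <= quote.eth_amount
    return Quote(eth_amount=BETA * eth_taken,
                 price=new_price,
                 blocks_remaining=25)

# A quoter posts 10 ETH at 400 tokens/ETH; a verifier who believes the true
# price is 420 takes the quote and must post a 20 ETH quote in turn.
q0 = Quote(eth_amount=10, price=400, blocks_remaining=25)
q1 = take_quote(q0, eth_taken=10, new_price=420)
print(q1)
```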

There are two costs an oracle must cover: gas cost and hedging cost. Chainlink uses first and verifies afterwards; NEST verifies first and uses afterwards. What is wrong with use-first, verify-later? The first problem is node asymmetry, which inevitably runs into centralization: if collateral is staked there, some centralized party has to act as the punisher. The second is the mismatch between the scale of staked collateral and the downstream risk: one million US dollars of collateral cannot support a one-trillion-dollar project, and even if there is no incentive to cheat, downstream users will still worry about the risk of cheating.

Our society relies on completeness: the law, for example, supports a use-first, punish-later mechanism. But the blockchain cannot be handled in that traditional way; we have to verify before use and control the risk of error before the price is consumed. What is the cost of NEST's verification mechanism? When a verifier arbitrages a quote, they must post a new quote at double the scale, so the verification cost is Beta coefficient × two-way option (here Beta = 2). The cost of arbitrage is therefore gas cost + hedging cost + Beta coefficient × two-way option.
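
Putting those components into one expression (the numbers below are placeholders expressed as fractions of the quoted notional, not measured NEST parameters):

```python
def cost_of_arbitrage(gas_cost, hedging_cost, two_way_option_cost, beta=2):
    """Cost an arbitrageur bears to move a quote off the true price:
    gas + hedging + beta * (two-way option written during verification)."""
    return gas_cost + hedging_cost + beta * two_way_option_cost

# Placeholder inputs chosen so the total lands at 0.004 of notional,
# the 4/1000 order of magnitude discussed below.
print(cost_of_arbitrage(gas_cost=0.0005,
                        hedging_cost=0.001,
                        two_way_option_cost=0.00125))  # 0.004
```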

So how large is the cost of arbitrage? From our statistics on NEST's price deviation, the cost of arbitrage is roughly 4/1000. This is statistical, not absolute, arbitrage: it is the calculated probability that a fixed price becomes arbitrageable under market fluctuations. At Ethereum's current volatility, the probability of a quote being arbitraged is about 7%; that is, out of 100 quotations roughly 7 can be arbitraged, and in practice about 2-3 actually are.
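
The ~7% figure can be reproduced in spirit by asking how often a price moves further than the arbitrage cost within one verification window. A sketch assuming normally distributed returns (the per-window volatility used here is my assumption):

```python
import math

def prob_move_exceeds(threshold, sigma_window):
    """P(|return over the verification window| > threshold), assuming the
    return is normally distributed with standard deviation sigma_window."""
    z = threshold / sigma_window
    # Two-sided tail probability via the complementary error function.
    return math.erfc(z / math.sqrt(2))

# If the arbitrage cost is ~0.4% of notional and volatility over one
# verification window is ~0.22% (assumed), roughly 7% of quotes drift far
# enough to be worth arbitraging.
print(prob_move_exceeds(0.004, sigma_window=0.0022))  # ~0.069
```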

Congestion attacks are the first variable. (A congestion attack fills blocks with transactions so that other people's transactions cannot be packed.) The oracle requires 25 blocks of verification to resist congestion attacks. The Ethereum miner community has also gradually begun to act consciously: whenever dust attacks or congestion attacks are launched, miners recognize the malicious intent and can choose not to pack those transactions. A miner's choice not to pack is an imperfect correction to the system, because it is not defined by the protocol.

Second, the verification period and volatility determine the cost of the two-way option; no verification mechanism can get around the verification period and volatility. When volatility rises, NEST's price deviation becomes very large. For example, on March 2 volatility reached three thousandths, which is forty to fifty times that of A-shares. At such volatility, normal investors do not know how to forecast or control risk and are often at a loss. At that point the design of financial products should be based on the flow of information on chain rather than focused on off-chain data.
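
A hedged sketch of that dependence, using the standard at-the-money straddle approximation (both the approximation and the parameter values are my assumptions, not the NEST pricing model):

```python
import math

def two_way_option_cost(sigma_per_block, verification_blocks, notional=1.0):
    """Approximate cost of writing a two-way (straddle-like) at-the-money
    option over the verification window, per unit of notional.

    Uses the common approximation: straddle value ~ 0.8 * sigma * sqrt(T).
    """
    sigma_window = sigma_per_block * math.sqrt(verification_blocks)
    return 0.8 * sigma_window * notional

# 25-block verification window; the per-block volatility is an assumed figure.
print(two_way_option_cost(sigma_per_block=0.0004, verification_blocks=25))
# Doubling volatility, or quadrupling the window, doubles the option cost.
```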

In fact, the two-way option + price chain + Beta coefficient mechanism could also be used in traditional finance. It puts the smartest information traders and arbitrageurs together to do the pricing; if you bring the smartest people together, you do not need so many participants to price.

The second variable is delay. In the NEST system, delay is reduced by the incentive mechanism. If the NEST price does not change and one quote per block becomes two quotes per block, your income doubles. However congested the network, as long as the NEST price has not fallen too far, someone will always quote, because the rate of return is too attractive.

When market makers trade against NEST prices, there is a layer of risk protection that ensures the market maker does not lose their expected profit over the course of trading. Because the price fluctuates, market makers can always be arbitraged by others; but if the oracle builds this price compensation in, then over long-run trading neither side loses and both are satisfied. If market makers do not protect themselves with this risk function, realized trade prices will vary widely and they will keep losing money until they exit.

The above is our discussion on Computable Finance and DeFi oracles.

Source link: mp.weixin.qq.com