2022-02-10 Exploring Terra Data with Flipside Crypto
Mentor's Workshop #2

Recording

Resources

We've created a new page under the Mentor's Circle GitBook! Resources we find helpful and that are referenced in the MetricsDAO Workshops will be added to an ever-expanding repository.

Follow Along with Starter Queries!

Feel free to Copy & Edit and make them your own!

0 - Exploring the Data

Start by running a simple SELECT statement so we can see the data in the anchor.collateral table. When using SELECT * from a table for this purpose, always use a LIMIT.
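A minimal sketch of that first look, using the `anchor.collateral` table named above (always keep the `LIMIT` on exploratory queries):

```sql
-- Peek at the raw anchor.collateral table.
-- SELECT * is fine for exploration, but always LIMIT it.
select *
from anchor.collateral
limit 100;
```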

1 - Collateral Transactions

Using this query, we take a high level view of the collateral transaction volume happening with Anchor Protocol.
We're using a few SQL techniques here to aggregate data and look at trends across time. As mentioned in the workshop, the initial goal is to look at general trends, so we'll start with daily transaction volume: a simple count of collateral adds and withdrawals each day.
To do that, we need to combine all of a day's data into one number. We use date_trunc() to flatten every timestamp from a given day into one common date, then use the aggregation function count() to count up the number of records associated with that day.
When using an aggregate function, we need to tell SQL what the "groups" are. In this case, we are going to group by the date that was calculated above, with date_trunc().
Finally, let's order the data by day using order by. Flipside's charting tool will generally automatically do this, but it's good practice.
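The steps above can be sketched as a single query (the timestamp column name, `block_timestamp`, is an assumption here; check the column list from your exploratory query):

```sql
-- Daily count of collateral transactions.
-- block_timestamp is assumed; verify the actual timestamp column.
select
  date_trunc('day', block_timestamp) as date,
  count(*) as tx_count
from anchor.collateral
group by 1       -- group by the truncated date (column 1)
order by 1;      -- order chronologically
```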

Additional tables from the workshop:

2 - Transaction Volume

Using the techniques learned above, let's determine the gross volume of collateral, in USD, for these transactions. This utilizes the sum() aggregate function.
The amount_usd column is a value computed by the Flipside team. When working with computed columns, always check for missing values: Checking for Null Values in amount_usd. From that query we see that there are a few dates with significant missing data. Accounting for missing data is beyond the scope of this workshop, but checking the quality and completeness of the data tables is essential to good analysis, and this should be noted in our final dashboard.
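A sketch of both steps, assuming the same `block_timestamp` column as before:

```sql
-- Gross collateral volume in USD per day.
select
  date_trunc('day', block_timestamp) as date,
  sum(amount_usd) as usd_volume
from anchor.collateral
group by 1
order by 1;

-- Null check: which days have missing amount_usd values?
select
  date_trunc('day', block_timestamp) as date,
  count(*) as null_rows
from anchor.collateral
where amount_usd is null
group by 1
order by 2 desc;   -- worst-affected days first
```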

Additional tables:

3 - User Behavior

High-level volume is a useful starting point, but transaction count and USD volume flows only tell us so much. What are users doing, and how is that changing across time? We can compute some powerful summary statistics using the exact same techniques.
Replace count() or sum() with avg() in the aggregation, and we start to see trends emerge.
Outliers should be investigated, and there are a couple of things we can do about them. First, let's figure out why the average bETH amount is spiking by finding the top transactions by amount_usd.
Query Here
We use order by to specify the column, and desc to ensure the highest values appear at the top. As above, we exclude null values with where amount_usd is not null. Again, set a limit! We only need the top few results.
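Those pieces together look something like this:

```sql
-- Largest transactions by USD value, excluding nulls.
select *
from anchor.collateral
where amount_usd is not null
order by amount_usd desc
limit 10;   -- only the top few results are needed
```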
median() is another aggregate function that we can use to interpret user behavior. Averages tend to be skewed upward as the max() continues to grow, so the median gives us valuable insight into what a more "typical" user is doing.
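Putting avg() and median() side by side makes the skew visible (again assuming a `block_timestamp` column):

```sql
-- Average vs. median transaction size per day.
-- A large gap between the two suggests a few whales are skewing the mean.
select
  date_trunc('day', block_timestamp) as date,
  avg(amount_usd)    as avg_usd,
  median(amount_usd) as median_usd
from anchor.collateral
where amount_usd is not null
group by 1
order by 1;
```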

Additional tables:

Final Product - the Dashboard

Flipside Crypto
Final dashboard using our queries from above.
Key points about the final product: describe the data! What insights do you see? What was interesting along the course of the investigation? Write-ups do not need to be long, by any means, but telling the story of the data is very important.