2022-02-10 Exploring Terra Data with Flipside Crypto
Mentor's Workshop #2
We've created a new page under the Mentor's Circle GitBook! Resources we find helpful and that are referenced in the MetricsDAO Workshops will be added to an ever-expanding repository.
Feel free to Copy & Edit and make them your own!
Using this query, we take a high-level view of the collateral transaction volume happening on Anchor Protocol.
We're using a few SQL techniques here to aggregate data and look at trends across time. As mentioned in the workshop, the initial goal is to look at general trends, so we'll start with transaction volume: a simple count of collateral adds or withdrawals each day.
High-level volume is a useful starting point, but transaction count and USD volume flows only tell us so much. What are users doing, and how is that changing across time? We can compute some powerful summary statistics using the same exact techniques.
Outliers should be investigated, and we can do a couple of things about those. First, let's figure out why average bETH is spiking by finding the top transactions by `amount_usd`.
Key points about the final product: describe the data! What insights do you see? What was interesting along the course of the investigation? Write-ups do not need to be long, by any means, but telling the story of the data is very important.
Start by running a simple `SELECT` statement so we can see the data in the `anchor.collateral` table. When using `SELECT *` from a table for this purpose, always use a `limit`.
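A minimal sketch of that first look; the `limit` keeps the exploratory query cheap:

```sql
-- Peek at the table to see what columns are available;
-- always cap a SELECT * with a limit
select *
from anchor.collateral
limit 100
```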
To do that, we need to combine all data from a day into one number, so we use `date_trunc()` to flatten all timestamps from throughout the day into one common date. Then, we use the `count()` aggregate function to simply count up the number of records associated with that day. When using an aggregate function, we need to tell SQL what the "groups" are. In this case, we are going to `group by` the date that was calculated above with `date_trunc()`.
Finally, let's order the data by day using `order by`. Flipside's charting tool will generally do this automatically, but it's good practice.
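Putting those pieces together, a sketch of the daily count (assuming the table's timestamp column is named `block_timestamp`):

```sql
-- Daily count of collateral transactions
select
  date_trunc('day', block_timestamp) as day, -- flatten timestamps to a common date (block_timestamp is an assumed column name)
  count(*) as num_transactions               -- count the records in each day's group
from anchor.collateral
group by 1  -- group on the truncated date computed above
order by 1  -- order the results by day
```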
Using the techniques learned above, let's determine the gross volume of collateral, in USD, for these transactions. This utilizes the `sum()` aggregate function.
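The same daily rollup with `sum()` swapped in, a sketch under the same column-name assumption:

```sql
-- Gross daily collateral volume in USD
select
  date_trunc('day', block_timestamp) as day, -- assumed column name
  sum(amount_usd) as volume_usd              -- total USD value moved that day
from anchor.collateral
group by 1
order by 1
```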
The `amount_usd` column is a value computed by the Flipside team. When working with these, always check for missing values. The query sketched below surfaces a few dates with significant missing data. Accounting for missing data is beyond the scope of this workshop, but checking the quality and completeness of the data tables is essential to good analysis. This should be noted in our final dashboard.
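A minimal sketch of that null check, counting rows per day where `amount_usd` was never computed (`block_timestamp` is an assumed column name):

```sql
-- Find dates with missing amount_usd values
select
  date_trunc('day', block_timestamp) as day,
  count(*) as missing_rows
from anchor.collateral
where amount_usd is null -- only rows with no computed USD value
group by 1
order by 2 desc          -- worst days first
```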
Replace `count()` or `sum()` with `avg()` in the aggregation, and we start to see trends emerge.
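The swap is a one-line change, sketched here with the same assumed `block_timestamp` column:

```sql
-- Average collateral transaction size per day, in USD
select
  date_trunc('day', block_timestamp) as day,
  avg(amount_usd) as avg_amount_usd
from anchor.collateral
where amount_usd is not null -- drop rows where the USD value is missing
group by 1
order by 1
```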
We use `order by` to specify which column to sort on and `desc` to ensure the highest numbers are ordered at the top. As above, we need to exclude `null` values, this time with a `where ... is not null` filter. Again, set a `limit`! We only need the top few results.
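A sketch of that outlier hunt; apart from `amount_usd`, the column names here are assumptions:

```sql
-- Top collateral transactions by USD value
select
  tx_id,           -- assumed name for the transaction hash column
  block_timestamp, -- assumed name for the timestamp column
  amount_usd
from anchor.collateral
where amount_usd is not null -- exclude missing values before sorting
order by amount_usd desc     -- largest transactions first
limit 10                     -- we only need the top few
```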
`median()` is another aggregate function that we can use to interpret user behavior. Averages tend to be skewed as outlier transactions continue to grow, so the median provides us with valuable insight into what the more "average" user is doing.
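Snowflake, which Flipside queries run on, supports `median()` directly, so this too is a one-line swap (a sketch, same assumed timestamp column):

```sql
-- Median collateral transaction size per day, in USD
select
  date_trunc('day', block_timestamp) as day,
  median(amount_usd) as median_amount_usd -- robust to whale-sized outliers
from anchor.collateral
where amount_usd is not null
group by 1
order by 1
```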