2022-02-10 Exploring Terra Data with Flipside Crypto
Mentor's Workshop #2
We've created a new page under the Mentor's Circle GitBook! Resources we find helpful and that are referenced in the MetricsDAO Workshops will be added to an ever-expanding repository.
Feel free to Copy & Edit and make them your own!
Start by running a simple `SELECT` statement so we can see the data in the `anchor.collateral` table. When using `SELECT *` from a table for this purpose, always use a `LIMIT`.
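A minimal sketch of that first look (`anchor.collateral` is the table named in the workshop; the limit value is arbitrary):

```sql
-- Peek at the raw collateral data; always set a LIMIT with SELECT *
select *
from anchor.collateral
limit 10
```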
Using this query, we take a high-level view of the collateral transaction volume happening with Anchor Protocol.
We're using a few SQL techniques here to aggregate data and look at trends across time. As mentioned in the workshop, the initial goal is to look at general trends, so we will start with transaction volume: simply the count of collateral adds or withdrawals each day.
To do that, we need to combine all of a day's data into one number, so we use `date_trunc()` to flatten every timestamp from throughout the day into one common date. Then we use the aggregation function `count()` to count the number of records associated with that day.

When using an aggregate function, we need to tell SQL what the "groups" are. In this case, we `group by` the date calculated above with `date_trunc()`.

Finally, let's order the data by day using `order by`. Flipside's charting tool will generally do this automatically, but it's good practice.

Using the techniques learned above, let's determine the gross volume of collateral, in USD, for these transactions. This utilizes the `sum()` aggregate function.

The `amount_usd` column is a value computed by the Flipside team. When working with columns like these, always check for missing values. From this query, we see that there are a few dates with significant missing data. Accounting for missing data is beyond the scope of this workshop, but checking the quality and completeness of the data tables is essential to good analysis. This should be noted in our final dashboard.

High-level volume is a useful starting point, but transaction counts and USD volume flows only tell us so much. What are users doing, and how is that changing across time? We can compute some powerful summary statistics using the same exact techniques.
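Taken together, the steps above might look like the following Snowflake-style query. The table and the `amount_usd` column come from the workshop; the `block_timestamp` column name is an assumption, so adjust it to the actual schema:

```sql
-- Daily Anchor collateral activity: count, gross USD volume, a simple
-- summary statistic, and a check for missing amount_usd values.
-- Assumes a block_timestamp column exists on anchor.collateral.
select
    date_trunc('day', block_timestamp) as day,   -- flatten timestamps to one common date
    count(*) as n_transactions,                  -- count of collateral adds/withdrawals
    sum(amount_usd) as gross_usd_volume,         -- gross collateral volume in USD
    avg(amount_usd) as avg_usd_amount,           -- example summary statistic
    count_if(amount_usd is null) as missing_usd  -- rows where the computed value is null
from anchor.collateral
group by 1   -- group by the truncated date
order by 1   -- order by day
```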
Outliers should be investigated, and we can do a couple of things about those. First, let's figure out why average bETH is spiking by finding the top transactions by `amount_usd`. We use `order by` to specify which column and `desc` to ensure the highest numbers are ordered at the top. As above, we need to exclude null values using `where amount_usd is not null`. Again, set a `limit`! We only need the top few results.
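A sketch of that top-transactions query (using `select *` to avoid assuming column names beyond `amount_usd`):

```sql
-- Top collateral transactions by USD value, to explain the average bETH spike
select *
from anchor.collateral
where amount_usd is not null  -- exclude rows where the computed value is missing
order by amount_usd desc      -- highest amounts first
limit 10                      -- we only need the top few results
```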
Final dashboard using our queries from above.
Key points about the final product: describe the data! What insights do you see? What was interesting along the course of the investigation? Write-ups do not need to be long, by any means, but telling the story of the data is very important.