I first started playing around with crypto when I was in college (what feels like a decade ago already, thanks COVID), but it wasn't until this past summer, with the DeFi boom, that I really started paying close attention to the space and investing more seriously.
I quickly realized that the crypto world isn't exactly optimized for UX, and found myself juggling tokens distributed across multiple platforms and wallets; namely, Robinhood, Coinbase, and Metamask. I wanted a quick overview of the balances in all my accounts in one place, so I spent a couple of hours on a lazy Sunday afternoon putting together some Python scripts that fetch my account balances and send me a message with how much money I have.
My hope in sharing this is that (1) it will give new programmers a quick, real-world example of how to run serverless functions and interact with APIs, (2) it will let other people borrow my Python code if they face the same pain point or want to replicate a similar workflow, and (3) it will open me up to constructive criticism, both on the clarity of my writing and on program design, logic, and software development best practices.
Section 1. Robinhood and Coinbase setup and code
Setting up and using Robinhood and Coinbase programmatically was relatively straightforward, since some wonderful people on the internet have already created Python API wrappers that fulfill all of our requirements. I used robin_stocks for Robinhood and coinbase-python (which is no longer maintained, but still works wonders as of Jan 2021) for Coinbase. Take a look at their respective GitHub repos for setup instructions (super straightforward).
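Pieced together, the balance-fetching code for the two platforms might look roughly like the sketch below. The environment-variable names and helper functions are my own; robin_stocks exposes login() and load_portfolio_profile() (under the robin_stocks.robinhood namespace in recent versions), and coinbase-python's Client provides get_accounts().

```python
import os


def get_robinhood_balance() -> float:
    """Log in and return the total Robinhood portfolio equity in USD."""
    # Imported lazily so the pure helper below works without the SDK installed.
    import robin_stocks.robinhood as rh

    rh.login(os.environ["RH_USERNAME"], os.environ["RH_PASSWORD"])
    profile = rh.load_portfolio_profile()
    return float(profile["equity"])


def sum_native_balances(accounts) -> float:
    """Sum the native-currency (fiat) balance across a list of Coinbase accounts."""
    return sum(float(a["native_balance"]["amount"]) for a in accounts)


def get_coinbase_balance() -> float:
    """Return the combined fiat value of all Coinbase wallets."""
    from coinbase.wallet.client import Client

    client = Client(os.environ["COINBASE_KEY"], os.environ["COINBASE_SECRET"])
    return sum_native_balances(client.get_accounts()["data"])
```

Keeping the summing logic in its own small function makes it easy to test without hitting either API.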
Using environment variables for your credentials isn't the most secure option, so I'd recommend using GCP's Secret Manager or another secret-management service instead; in the meantime, and for PoC purposes, environment variables work.
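If you do go the Secret Manager route, fetching a secret is a few lines. This is a sketch of my own; the injectable `client` parameter is just there so the function can be exercised without real GCP credentials.

```python
def access_secret(project_id: str, secret_id: str,
                  version: str = "latest", client=None) -> str:
    """Fetch one secret value from GCP Secret Manager."""
    if client is None:
        # Lazy third-party import; the default path needs GCP credentials.
        from google.cloud import secretmanager
        client = secretmanager.SecretManagerServiceClient()

    # Secret Manager addresses versions by this fully qualified resource name.
    name = f"projects/{project_id}/secrets/{secret_id}/versions/{version}"
    response = client.access_secret_version(name=name)
    return response.payload.data.decode("utf-8")
```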
Section 2. Etherscan setup
We are using the Etherscan API to get our Metamask account balance.
Etherscan was just as simple to set up, since they also have a well-documented API, but there were a few caveats to getting the balance correctly. The first step is to manually create a tokens.json dictionary with every token we own (except for $ETH), along with its contract address and decimal count. We'll use this to look up how much of each of these tokens we have in our account. Here is my example file:
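For illustration, a tokens.json in that shape might look like this; the two entries are the well-known Ethereum mainnet DAI and USDC contracts, but verify any address and decimal count on Etherscan before relying on it:

```json
{
  "DAI": {
    "address": "0x6B175474E89094C44Da98b954EedeAC495271d0F",
    "decimals": 18
  },
  "USDC": {
    "address": "0xA0b86991c6218b36c1d19D4a2e9Eb0cE3606eB48",
    "decimals": 6
  }
}
```

The decimals field matters because ERC-20 balances come back as raw integers: a USDC balance of 1500000 with 6 decimals is really 1.50 USDC.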
Before we can interact with their API service we need to create a free API key. It's important to note that the free tier is rate-limited to 5 requests per second, so if you hold more than 5 different tokens, you'll likely have to sleep() between loop iterations.
The logic here is to iterate through each token in the dictionary we created above, fetch the token's price from CoinGecko, fetch the amount of that token held in the account, and compute its dollar value. Checking the ETH balance, on the other hand, is much simpler because the Etherscan API gives us a dedicated endpoint for it (i.e. get_eth_balance(address)). At the end we sum everything up and return the total balance for the Ethereum address.
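The loop described above might be sketched like this, hitting Etherscan's raw tokenbalance/balance endpoints and CoinGecko's token-price endpoint directly with requests. The function names and the 0.25 s pause are my own choices, not anything prescribed by either API.

```python
import json
import time


def token_value(raw_balance: int, decimals: int, price_usd: float) -> float:
    """Convert a raw on-chain integer balance into a USD value."""
    return raw_balance / 10 ** decimals * price_usd


def get_wallet_balance(address: str, api_key: str,
                       tokens_file: str = "tokens.json") -> float:
    import requests  # lazy import keeps token_value dependency-free

    with open(tokens_file) as f:
        tokens = json.load(f)

    total = 0.0
    for symbol, info in tokens.items():
        # Raw ERC-20 balance from Etherscan (returned as an integer string).
        r = requests.get("https://api.etherscan.io/api", params={
            "module": "account", "action": "tokenbalance",
            "contractaddress": info["address"], "address": address,
            "tag": "latest", "apikey": api_key,
        })
        raw = int(r.json()["result"])

        # Spot price in USD from CoinGecko, keyed by (lowercase) contract address.
        p = requests.get(
            "https://api.coingecko.com/api/v3/simple/token_price/ethereum",
            params={"contract_addresses": info["address"], "vs_currencies": "usd"},
        )
        price = p.json()[info["address"].lower()]["usd"]

        total += token_value(raw, info["decimals"], price)
        time.sleep(0.25)  # stay under Etherscan's 5 requests/second limit

    # ETH itself has a dedicated Etherscan action: module=account&action=balance.
    r = requests.get("https://api.etherscan.io/api", params={
        "module": "account", "action": "balance",
        "address": address, "tag": "latest", "apikey": api_key,
    })
    eth_raw = int(r.json()["result"])  # balance in wei (18 decimals)
    p = requests.get("https://api.coingecko.com/api/v3/simple/price",
                     params={"ids": "ethereum", "vs_currencies": "usd"})
    return total + token_value(eth_raw, 18, p.json()["ethereum"]["usd"])
```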
Section 3. Twilio setup
By this section you'll probably start noticing a pattern: all I'm doing is sending and receiving requests from a variety of APIs. Twilio is a service that lets us send and receive text messages programmatically; I'm using it to send a message to my phone with an update of my total account balance.
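The Twilio part might look like the sketch below; the message formatting and environment-variable names are my own, while Client(...).messages.create(body=..., from_=..., to=...) is Twilio's standard Python helper-library call.

```python
def format_update(balances: dict) -> str:
    """Build the text-message body from a {source: balance-in-USD} mapping."""
    lines = [f"{name}: ${amount:,.2f}" for name, amount in balances.items()]
    lines.append(f"Total: ${sum(balances.values()):,.2f}")
    return "\n".join(lines)


def send_update(body: str) -> None:
    """Send the message to my phone via Twilio."""
    import os
    from twilio.rest import Client  # lazy import keeps format_update testable

    client = Client(os.environ["TWILIO_SID"], os.environ["TWILIO_TOKEN"])
    client.messages.create(
        body=body,
        from_=os.environ["TWILIO_FROM"],  # your Twilio phone number
        to=os.environ["MY_PHONE"],
    )
```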
Section 4. Setting up the Cloud Function
Now that we have all the code written, nicely organized into individual functions, and tested locally, it's time to create a serverless function on your favorite cloud service provider. I chose GCP's Cloud Functions simply because it's the easiest to work with of the major cloud providers; AWS and Azure have similar serverless services (AWS Lambda and Azure Functions, respectively).
The reason we're using a serverless (FaaS) approach here is so we don't have to worry about maintaining any infrastructure or containers. We simply give Google the code we want executed, set up a schedule, and boom: everything miraculously works.
The first steps are to create a GCP account, set up a billing account, create a new project and associate it with the billing account, and activate the Cloud Functions API; follow these steps in Google's official documentation.
Once the initial configuration is complete, it's time to locate the Cloud Functions service in the GCP Console. Simply type 'Cloud Functions' into the search bar, or expand the hamburger menu and scroll through the services:
Next, click the Create Function button at the top:
To configure the Cloud Function, you'll have to do the following: (1) set the name and region of the function, (2) toggle the radio button to allow unauthenticated invocations.
Next, (3) set your environment variables (usernames, passwords, API keys, etc).
Finally, (4) choose your runtime (Python in this case), create the Python script file (along with a requirements.txt and the tokens.json mentioned above), and press Deploy 🚀
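For reference, an HTTP-triggered Cloud Function's main.py needs an entry point that accepts a Flask request object, and that function's name goes in the "Entry point" field when deploying. The sketch below uses a placeholder list of balance sources; in the real script these would be the functions from the earlier sections.

```python
# Hypothetical placeholder: in the deployed script this list would hold the
# balance-fetching functions from Sections 1 and 2.
BALANCE_SOURCES = []


def handler(request):
    """HTTP entry point; Cloud Functions calls this with a Flask request.

    Set "handler" as the function's entry point when deploying.
    """
    total = sum(fetch() for fetch in BALANCE_SOURCES)
    body = f"Total balance: ${total:,.2f}"
    # send_update(body)  # the Twilio call from Section 3 would go here
    return body, 200
```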
Section 5. Putting the function on a schedule
The last stage of this project is to leverage another GCP service, Cloud Scheduler, which lets us schedule a task (also known as a cron job) that sends an HTTP request to our Cloud Function at specific times. The first step is to locate the Cloud Scheduler service in the console:
Next create a new job:
And lastly, in the configuration portion you'll be prompted for how frequently you want the Cloud Function to trigger, and where you want the trigger to send its request. A handy tool for building the right frequency expression for your cron job is crontab.guru. The URL is the endpoint that was generated when we created the Cloud Function.
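For example, a job that texts me every morning would fill in the two fields roughly like this (the URL pattern is the standard Cloud Functions trigger format; substitute your own region, project, and function name):

```
# Frequency, in unix-cron format: every day at 9:00 AM
0 9 * * *

# URL: your function's HTTPS trigger, shown on its "Trigger" tab, e.g.
# https://<region>-<project-id>.cloudfunctions.net/<function-name>
```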
Conclusion + Next Steps
I wrote all of this code in one afternoon, so it's by no means perfect (use at your own risk). The next step in this project is to set up continuous integration so that whenever new changes are made to the repository on GitHub, they are automatically pushed to GCP. My suggestion for this would be Cloud Build, since it's also a Google service and would keep everything internal to that cloud project.
Feel free to connect with me on LinkedIn if you have any questions.