Stockr

Tech Stack

Finnhub API

JavaScript

Node

PostgreSQL

SQL

Vitest

REST API

Git

GitHub

Express


The idea

Returning the two best-performing and worst-performing stocks of the day. A chance to practice working with external data.
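
As a sketch of the end goal (all naming here is my own, not from the project): once each symbol has a daily percent change attached, it comes down to a sort and two slices.

```js
// Given [{ symbol, percentChange }, ...], sort by daily percent change
// and take each end of the list.
function bestAndWorst(quotes, n = 2) {
  const sorted = [...quotes].sort((a, b) => b.percentChange - a.percentChange);
  return { best: sorted.slice(0, n), worst: sorted.slice(-n).reverse() };
}
```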


The initial plan

The first step was to break the problem down into chunks to start understanding how to tackle it.


Getting started

First I searched for a data source for stock information and found Finnhub.io. They had robust documentation, a Node module and, crucially for me, a free tier to play around with.


I made a test API call for the current quote of Amazon using Thunder Client.
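
For reference, a minimal sketch of the same request in code, assuming Node 18+ for the built-in fetch; the getQuote wrapper and the FINNHUB_API_KEY variable are my naming, not Finnhub's:

```js
// Placeholder for the free-tier key.
const token = process.env.FINNHUB_API_KEY;

// Hit Finnhub's /quote endpoint for a single symbol.
async function getQuote(symbol) {
  const res = await fetch(`https://finnhub.io/api/v1/quote?symbol=${symbol}&token=${token}`);
  return res.json();
}

// Fields in the response: c = current price, d = change, dp = percent
// change, h = high, l = low, o = open, pc = previous close.
console.log(await getQuote('AMZN'));
```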

I then created a function to call in a loop, so I could repeat this request for every stock in a list, as sketched below.
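
A sketch of that loop, reusing the getQuote wrapper from the previous sketch; the 1.1-second pause is my assumption for staying under the one-request-per-second limit mentioned later:

```js
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

// Walk the list one symbol at a time, keeping the daily percent change (dp)
// from each quote and pausing between calls to respect the rate limit.
async function quoteAll(symbols) {
  const results = [];
  for (const symbol of symbols) {
    const quote = await getQuote(symbol);
    results.push({ symbol, percentChange: quote.dp });
    await sleep(1100); // ~1 request per second
  }
  return results;
}
```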


As this was going to have to run over an extended period of time, I created unit tests to ensure the return values were as expected.
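
Something along these lines, as a Vitest sketch; the ./quotes.js module path is hypothetical:

```js
import { describe, it, expect } from 'vitest';
import { getQuote } from './quotes.js';

describe('getQuote', () => {
  it('returns numeric price fields for a known symbol', async () => {
    const quote = await getQuote('AMZN');
    expect(quote).toMatchObject({
      c: expect.any(Number),  // current price
      dp: expect.any(Number), // percent change
      pc: expect.any(Number), // previous close
    });
  });
});
```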


With the API requirements understood and tested, the next step was to set up a PostgreSQL database so I could manage and query the data the API returned.
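
A minimal connection sketch using node-postgres; the stockr database name and the companies table are assumptions about the schema:

```js
import pg from 'pg';

// Connection details beyond the database name come from the usual
// PG* environment variables.
const pool = new pg.Pool({ database: 'stockr' });

const { rows } = await pool.query('SELECT symbol, name FROM companies LIMIT 5');
console.log(rows);

await pool.end();
```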


I now needed a way to reset the database: dropping any tables that exist, creating a new one and then seeding it with data from a JSON file of the S&P 500 listed companies. I had real issues getting readFile to work until I came across a GitHub thread discussing how it handles file paths. Relative paths work, but from the current working directory, not from the file the line is written in, so my numerous attempts using ../../../sANDp500.json didn't work.
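
The fix, sketched: anchor the path to the module itself rather than the working directory. The exact relative location of sANDp500.json here is illustrative:

```js
import { readFile } from 'node:fs/promises';

// new URL(relative, import.meta.url) resolves relative to this file,
// no matter where `node` was launched from.
const seedPath = new URL('../../../sANDp500.json', import.meta.url);
const companies = JSON.parse(await readFile(seedPath, 'utf8'));
```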


The next step was adding the REST API calls to allow data to be accessed and updated on the database. I created a function to return all the symbols from the database and parse them into a neat array.
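
Roughly what that looks like; the table, column and route names are my assumptions:

```js
import express from 'express';
import pg from 'pg';

const app = express();
const pool = new pg.Pool({ database: 'stockr' });

// Flatten the query result into a plain array of ticker symbols.
async function getAllSymbols() {
  const { rows } = await pool.query('SELECT symbol FROM companies');
  return rows.map((row) => row.symbol); // ['AAPL', 'AMZN', ...]
}

app.get('/api/symbols', async (req, res) => {
  res.json(await getAllSymbols());
});

app.listen(3000);
```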


Then, using this array, I could call the quote endpoint in a loop. This passed all of my unit testing. However, it kept running into errors when I tried to run it on the whole S&P 500, which turned out to come down to two main problems:



Lesson learned: add real stress testing to the unit tests and I would have caught some of this much earlier.
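
A sketch of what that stress testing could look like in Vitest, driving the loop with a flaky mock instead of a handful of hand-picked symbols; all names here are hypothetical:

```js
import { it, expect, vi } from 'vitest';

it('survives intermittent API failures across a large symbol list', async () => {
  // Every seventh call fails, simulating the errors that only show up at scale.
  let calls = 0;
  const flakyGetQuote = vi.fn(async (symbol) => {
    if (++calls % 7 === 0) throw new Error('429: rate limited');
    return { c: 100, dp: 1.2, pc: 98.8 };
  });

  const symbols = Array.from({ length: 500 }, (_, i) => `SYM${i}`);
  const results = [];
  const failed = [];

  for (const symbol of symbols) {
    try {
      results.push(await flakyGetQuote(symbol));
    } catch {
      failed.push(symbol);
    }
  }

  // Every symbol should be accounted for: fetched, or recorded as failed.
  expect(results.length + failed.length).toBe(500);
  expect(failed.length).toBeGreaterThan(0);
});
```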

As it stands right now the system works, and takes around ten minutes to run due to the number of API requests it has to make while being limited to one per second (roughly 500 requests at that rate is already over eight minutes). The video below is sped up around 20x to show it working.

Future additions