Stockr
Tech Stack
Finnhub API
JavaScript
Node
PostgreSQL
SQL
Vitest
REST API
Git
GitHub
Express
The idea
Return the 2 best-performing and 2 worst-performing stocks of the day. A chance to practice working with external data.
The initial plan
The first step was to break the problem down into chunks to understand how to tackle it:
- First get stock price data for a range of stocks for today
- Then store those somewhere
- Finally search the stored data and return the 2 with the highest gain and the 2 with the biggest loss
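That final ranking step maps neatly onto a SQL query. As a sketch, assuming a companies table with a percent_change column (the names are mine, not from the write-up):

```sql
-- Two biggest gainers of the day; flip DESC to ASC for the two biggest losers.
SELECT symbol, percent_change
FROM companies
ORDER BY percent_change DESC
LIMIT 2;
```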
Getting started
First I searched for a stock data source and found Finnhub.io. They had robust documentation, a Node module and, crucially for me, a free tier to play around with.

Test API call for the current Amazon quote using Thunder Client.
Created a function to call in a loop so I could repeat this request for every stock in a list.
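A minimal sketch of what that function could look like, assuming Node 18+ (global fetch) and Finnhub's /quote endpoint, where `c` is the current price and `dp` the daily percent change; the function names and the FINNHUB_KEY environment variable are my own assumptions:

```javascript
// Shape the raw Finnhub /quote payload (c = current price, dp = percent
// change) into the fields the rest of the app cares about. Pure, so it is
// easy to unit test without hitting the network.
function shapeQuote(symbol, { c, dp }) {
  return { symbol, price: c, percentChange: dp };
}

// Fetch a single quote; FINNHUB_KEY is an assumed environment variable.
async function getQuote(symbol) {
  const url = `https://finnhub.io/api/v1/quote?symbol=${symbol}&token=${process.env.FINNHUB_KEY}`;
  const res = await fetch(url);
  if (!res.ok) throw new Error(`Finnhub returned ${res.status} for ${symbol}`);
  return shapeQuote(symbol, await res.json());
}
```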

As this was going to run over an extended period of time, I created unit tests to ensure the return values were as expected.
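The Vitest suite itself isn't shown in the write-up, but the kind of shape check it might assert can live in a small pure helper. A sketch, with hypothetical names:

```javascript
// Hypothetical guard the unit tests could assert against: a quote is only
// valid if it has a string symbol and finite numeric price/percentChange.
function isValidQuote(quote) {
  return (
    typeof quote.symbol === 'string' &&
    Number.isFinite(quote.price) &&
    Number.isFinite(quote.percentChange)
  );
}

// In a Vitest test this would be asserted with something like:
// expect(isValidQuote(await getQuote('AMZN'))).toBe(true);
```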

With the API requirements understood and tested, the next step was to set up a PostgreSQL database so I could manage and query the data the API returned.
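The write-up doesn't show the schema; a minimal sketch of what a single-table version could look like (all column names here are assumptions):

```sql
-- Hypothetical schema: one row per S&P 500 company, seeded from the JSON
-- file and then updated with each day's quote data.
CREATE TABLE companies (
  symbol TEXT PRIMARY KEY,   -- ticker, e.g. 'AMZN'
  name TEXT,                 -- company name from the seed JSON
  price NUMERIC,             -- latest quoted price
  percent_change NUMERIC     -- daily % change, used for the final ranking
);
```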

I now needed a way to reset the database. This consisted of dropping any existing tables, creating a new one and then seeding it with data from a JSON file of the S&P 500 listed companies. I had real issues getting readFile to work until I came across a GitHub thread discussing how it handles file paths: relative paths resolve from the current working directory, not from the file the call is written in, so my numerous attempts using ../../../sANDp500.json didn't work.
The next step was adding the REST API calls to allow data to be accessed and updated on the database. I created a function to return all the symbols from the database and parse them into a neat array.
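That function might look something like this sketch, assuming a node-postgres connection pool and a companies table (both names are assumptions); the pool is passed in so the query logic stays easy to test:

```javascript
// Return every ticker from the database as a flat array like
// ['AAPL', 'AMZN', ...]. `pool` is a shared node-postgres Pool created once
// at startup; the table and column names here are assumptions.
async function getAllSymbols(pool) {
  const { rows } = await pool.query('SELECT symbol FROM companies;');
  return rows.map((row) => row.symbol);
}
```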

Then, using this array, I could call the quote endpoint in a loop. This passed all of my unit tests, but it kept running into errors when I tried to run it on the whole S&P 500. This came down to two main problems:
- Finnhub's documentation was incorrect: the free tier allows 60 calls/min, not 60 calls/sec. This was solved by adding a 1001 ms timeout between requests to ensure the limit was never exceeded.
- I had set up my database PATCH request to create a new connection each time. This worked fine at small scale, but tripped the connection limit of 5 when I tried to make 505 requests in the space of a few seconds.
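The two fixes together can be sketched like this: one shared pool for every write, and a pause slightly over a second between Finnhub calls (the fetchQuote/saveQuote callbacks stand in for the real functions, which aren't shown in the write-up):

```javascript
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

// Walk the symbol list sequentially: one API call, one write through the
// shared pool, then a pause. 1001 ms per call keeps the rate safely under
// Finnhub's 60 calls/min free-tier limit. delayMs is parameterised so tests
// can run fast.
async function updateAllQuotes(symbols, fetchQuote, saveQuote, delayMs = 1001) {
  for (const symbol of symbols) {
    const quote = await fetchQuote(symbol); // single Finnhub request
    await saveQuote(quote);                 // reuses one shared connection pool
    await sleep(delayMs);
  }
}
```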
Lesson learned: had I added real stress testing to the unit tests, I would have caught some of this much earlier.
As it stands, the system works. It takes around 10 minutes to run due to the number of API requests it has to make while being limited to one per second. The video below is sped up around 20x to show it working.
Future additions
- Add a dashboard interface
- Deploy the project
- Have the system update itself automatically once a day
- Store a log of each day's best and worst performers