This project was bootstrapped with Create React App.

The bulk of the code lives in `/src/components`. To find the code for a specific part of the app:
- Inspect the element in the browser.
- I've named each element's `class` based on the BEM methodology. If you see a class that starts with a capital letter (e.g. `Network__type-path`), the root of that string is the name of the component it belongs to (`Network`).
- Find the component with the same name in `/src/components`.
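The class-to-component mapping described above can be sketched as a tiny helper. This function is hypothetical, for illustration only; the app doesn't necessarily contain it:

```javascript
// Hypothetical helper illustrating the BEM convention described above:
// the "block" before the "__" separator names the owning component.
function componentForClass(className) {
  return className.split("__")[0];
}

console.log(componentForClass("Network__type-path")); // → "Network"
```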
The root of the app is in `src/components/App.js`. You can always trace the render logic from there to find where components are rendered.
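As one example of the kind of branching you'll find while tracing, the app exposes an objectives view at `?viz=objectives` (see the run instructions below). A hypothetical sketch of that branching follows; the function and view names here are invented, and the real logic lives in `src/components/App.js`:

```javascript
// Hypothetical sketch of top-level render branching on a ?viz= query
// parameter. The view names are placeholders, not the app's actual names.
function viewFor(search) {
  const viz = new URLSearchParams(search).get("viz");
  return viz === "objectives" ? "ObjectivesView" : "DefaultView";
}

console.log(viewFor("?viz=objectives")); // → "ObjectivesView"
```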
Static files live in the `/public` folder and are accessible at `/filename` within the app. For example, `data.json` is loadable at the URL `/data.json`, which resolves to `localhost:3000/data.json` when running locally.
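That resolution works the same way the browser resolves any root-relative request against the current origin; a quick sketch with the standard URL API:

```javascript
// A root-relative path like "/data.json" resolves against the current origin,
// so during local development it maps to the dev server on port 3000:
const resolved = new URL("/data.json", "http://localhost:3000").href;
console.log(resolved); // → "http://localhost:3000/data.json"
```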
- Clone this repository onto your local machine.
- Install npm modules using `yarn`.
- Run `yarn start` to start the development server.
- Open http://localhost:3000 to view the app in the browser. The objectives view is visible at http://localhost:3000?viz=objectives.

The page will reload if you make edits. You will also see any lint errors in the console.
To deploy to GitHub Pages, run `npm run deploy`. This will build the production bundle and push it to the `gh-pages` branch, which is used to host the site.
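For reference, Create React App's standard GitHub Pages setup wires `deploy` to the `gh-pages` package in `package.json`. A sketch of what this repo's configuration likely looks like (the `homepage` URL is a placeholder, and the actual scripts may differ):

```json
{
  "homepage": "https://<username>.github.io/<repo>",
  "scripts": {
    "predeploy": "npm run build",
    "deploy": "gh-pages -d build"
  }
}
```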
I have a file named `.env.local` at the base of this repository that contains my Airtable API key:

`AIRTABLE_API_KEY=XXX`
When `yarn data` is run, it executes the node script `./getData.js`, which uses that API key to fetch data from Airtable and parse it.
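A minimal sketch of what a script like `./getData.js` plausibly does: the record envelope shown is the shape Airtable's REST API actually returns, but the field names and the parsing helper are assumptions, not the script's real contents:

```javascript
// Hypothetical sketch: the key comes from the environment (populated from
// .env.local locally, or from secrets in CI).
const apiKey = process.env.AIRTABLE_API_KEY;

// Airtable's REST API returns { records: [{ id, fields, createdTime }, ...] };
// a parser typically keeps each record's id plus its fields:
function parseRecords(payload) {
  return payload.records.map((r) => ({ id: r.id, ...r.fields }));
}

console.log(parseRecords({ records: [{ id: "rec1", fields: { name: "A" } }] }));
// → [ { id: 'rec1', name: 'A' } ]
```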
The repo has an attached GitHub Action that can access that API key (please don't steal it), runs `yarn data`, and commits any changes to the `data.json` file.
This Action (defined in `fetch-data.yml`) runs whenever a change is pushed to `master`, and at midnight UTC every day. You can see past runs or re-run the Action in the GitHub interface.
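Those two triggers correspond to `push` and `schedule` events in the workflow file. A hedged sketch of what `fetch-data.yml` might contain (the step details and action versions are assumptions, not the actual file):

```yaml
on:
  push:
    branches: [master]
  schedule:
    - cron: "0 0 * * *"   # midnight UTC every day

jobs:
  fetch:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: yarn install
      - run: yarn data
        env:
          AIRTABLE_API_KEY: ${{ secrets.AIRTABLE_API_KEY }}
      # then commit data.json if it changed (details assumed)
```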
There are several node scripts for scraping benchmark data, which output data into the `src/data/` folder. These can all be run using `yarn scrape-data`, and the `scrape-dashboard-data.yml` GitHub Action runs them automatically every Sunday.