
Commit ec556e7 (1 parent: 06d2e7a)

docs: update README
File tree: 1 file changed, +7 −23 lines


README.md (+7 −23)
````diff
@@ -21,6 +21,7 @@ Some tools used do not represent the best choice, they were only used for learning
 * [Golangci-lint](https://golangci-lint.run)
 * [Godoc](https://pkg.go.dev/golang.org/x/tools/cmd/godoc)
 * [Govulncheck](https://pkg.go.dev/golang.org/x/vuln/cmd/govulncheck)
+* [Deadcode](https://go.dev/blog/deadcode)
 * [Viper](https://github.com/spf13/viper)
   * Configuration solution
 * [Cobra](https://github.com/spf13/cobra)
````
````diff
@@ -44,27 +45,6 @@ You can run the command below to see all the useful commands available and your
 ```
 make help
 ```
-```
-help: show this help.
-setup: run the command mod download and tidy from Go
-vet: run the command vet from Go
-tests: run all unit tests
-integration-tests: run all integration tests
-all-tests: run all unit and integration tests
-cover: run the command tool cover to open coverage file as HTML
-lint: run all linters configured
-sonarqube-up: start sonarqube container
-sonarqube-down: stop sonarqube container
-sonarqube-analysis: run sonar scanner
-fmt: run go formatter recursively on all files
-compose-ps: list all containers running
-compose-up: start API and dependencies
-compose-down: stop API and dependencies
-build: create an executable of the application
-build-run-api: build project and run the API using the built binary
-clean: run the go clean command and removes the application binary
-doc: run the project documentation using HTTP
-```

 ## ⚙️ Running the Application
 To run the project locally you need to export some environment variables and this can be done using `direnv`. You can export the variables below.
````
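The `make help` listing removed in the hunk above is the kind of output a self-documenting `help` target produces. A minimal sketch of that common convention follows; it is an assumption about the pattern, not this repository's actual Makefile:

```shell
# Sketch of the common self-documenting Makefile convention (assumed, not
# taken from this repo): each target carries a "## description" comment.
cat > /tmp/Makefile.demo <<'EOF'
vet: ## run the command vet from Go
	go vet ./...
tests: ## run all unit tests
	go test ./...
EOF

# Extract "target: description" pairs, mimicking the removed `make help` output.
grep -E '^[a-zA-Z_-]+:.*## ' /tmp/Makefile.demo | awk -F':.*## ' '{printf "%s: %s\n", $1, $2}'
# prints:
# vet: run the command vet from Go
# tests: run all unit tests
```

A `help` target can then run that same `grep | awk` pipeline against `$(MAKEFILE_LIST)` so descriptions never drift from the targets themselves.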
````diff
@@ -93,7 +73,7 @@ If you want to run the API outside of Docker:
     - ${MONGODB_PORT}:${MONGODB_PORT}
 ```
 * comment out the `api` and `nginx` services in `docker-compose.yml`, run `make compose-up` and then `make build-run-api`; the API will run on the default port at `http://localhost:8888/index`
-* if you want to debug the API, you don't need to run `make build-run-api` and in your IDE you need to set the command to `api` as the application is using [cobra library](https://github.com/spf13/cobra)
+* if you want to debug the API, you don't need to run `make build-run-api`, but in your IDE you need to set the command to `api` when starting the application, since it uses the [cobra library](https://github.com/spf13/cobra)

 ## 🏁 How to crawl the page
 Fill in the URI and Depth in the form (Depth limits how deep the crawler follows links, since pages with many links can underperform and take a long time to fetch).
````
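The `${MONGODB_PORT}` mapping shown above is interpolated from the shell environment; with `direnv`, a `.envrc` sketch could look like the following (`GRAFANA_PORT='3000'` appears in this diff, while the MongoDB port here is only an assumed default):

```shell
# Hypothetical .envrc for direnv. GRAFANA_PORT='3000' is taken from this diff;
# the MONGODB_PORT value is the assumed MongoDB default, not from the repo.
export MONGODB_PORT='27017'
export GRAFANA_PORT='3000'
```

With direnv installed, running `direnv allow` once in the project root loads these variables automatically whenever you `cd` into it.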
````diff
@@ -129,7 +109,11 @@ GRAFANA_PORT='3000'

 The application metrics are exposed using the [ginmetrics library](https://github.com/penglongli/gin-metrics) and can be accessed at `http://localhost:8888/metrics`. These exposed metrics are collected by Prometheus and can be accessed at `http://localhost:9090`.

-The collected metrics are sent to Grafana and can be accessed at `http://localhost:3000`. The default credentials are `admin`/`admin` (Grafana may prompt you to reset the password, but it is optional). After that, you need to configure the `data source` by clicking on the `Configuration` option in the left-hand panel and then clicking on `Data source`. Click on the `Add Data Source` button and select `Prometheus` under `Time Series Database`. Fill in the address in the HTTP option as in the image below:
+The collected metrics are sent to Grafana and can be accessed at `http://localhost:3000`. The default credentials are `admin`/`admin` (Grafana may prompt you to reset the password, but it is optional).
+
+_Please note that due to some changes to the tools, the arrangement of items may be different._
+
+After that, you need to configure the `data source` by clicking on the `Configuration` option in the left-hand panel and then clicking on `Data source`. Click on the `Add Data Source` button and select `Prometheus` under `Time Series Database`. Fill in the address in the HTTP option as in the image below:

 [![datasource](/metrics/assets/datasource.png)](/metrics/assets/datasource.png)
````
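Since Prometheus scrapes the `/metrics` endpoint described above, the corresponding scrape job might look like the following sketch; the job name and interval are assumptions, and the repository's actual `prometheus.yml` may differ:

```yaml
# Hypothetical prometheus.yml fragment. ginmetrics serves metrics at /metrics,
# which is Prometheus's default metrics_path, so no override is needed.
scrape_configs:
  - job_name: "crawler-api"     # assumed job name
    scrape_interval: 15s        # assumed interval
    static_configs:
      - targets: ["localhost:8888"]
```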
