[README] Added documentation
Stanislas Girard committed Jan 10, 2021
1 parent 509d5c3 commit b9ae3c3
Showing 3 changed files with 57 additions and 88 deletions.
2 changes: 1 addition & 1 deletion .env-example
@@ -43,4 +43,4 @@ TINI_VERSION="v0.19.0"

# Traefik
TRAEFIK_VERSION="v2.3.4"
TRAEFIK_LOG_PREFIX_PATH="/var/log/traefik"
143 changes: 56 additions & 87 deletions README.md
@@ -33,131 +33,100 @@ After implementing the first features of **OSAT** I decided to introduce other


## Installation
### Docker

You can run everything with **Docker**:
- Install Docker and Docker Compose

### Manual

You need:
- **Python3**
- **[Redis Server](https://redis.io/topics/quickstart)**



```Bash
git clone https://github.com/StanGirard/SEOToolkit
cd SEOToolkit
```

Then install the dependencies:

```Bash
pip3 install -r requirements.txt
```
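Before launching, you can sanity-check that the Redis server mentioned above is reachable. This small helper is a convenience sketch, not part of the repository; it assumes Redis on its default port 6379:

```python
# Sanity check (an assumption, not part of the repo): verify that the
# Redis server required above is accepting TCP connections.
import socket


def redis_is_up(host="localhost", port=6379, timeout=1.0):
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # covers refused connections and DNS failures
        return False
```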

## Running

### Docker
```Bash
docker-compose --env-file .env-example up
```
## Dashboard

You can access the dashboard by going to [localhost:3000](http://localhost:3000)

## Config

If needed, create a `.env` file with any values you would like to override from `config.py`.
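For example, a minimal `.env` overriding the Traefik values from `.env-example` can be written by hand or generated with a short sketch like this one (the values shown are only illustrations, not recommendations):

```python
# Sketch: write a minimal .env overriding two variables that appear in
# .env-example (the values below are examples, not recommendations).
from pathlib import Path

overrides = {
    "TRAEFIK_VERSION": "v2.3.4",
    "TRAEFIK_LOG_PREFIX_PATH": "/var/log/traefik",
}

env_text = "".join(f'{key}="{value}"\n' for key, value in overrides.items())
Path(".env").write_text(env_text)
```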

## Initialisation

### Create a superuser
To get started, you first need to create a superuser. Run the following command:

```Bash
docker-compose run osat_server python manage.py createsuperuser
```

Once this is done, you need to go to [localhost:8000/admin](http://localhost:8000/admin)

Log in using the superuser that you just created.
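If you prefer a non-interactive setup (for scripting or CI), Django 3.0+ reads the `DJANGO_SUPERUSER_*` environment variables when `--noinput` is passed. The helper below is a hypothetical convenience, assuming the `osat_server` service name from the command above and that the bundled Django is recent enough:

```python
# Hypothetical helper: build a non-interactive createsuperuser command.
# Relies on Django >= 3.0 reading DJANGO_SUPERUSER_* with --noinput;
# the service name "osat_server" comes from the docker-compose command above.
def createsuperuser_cmd(username: str, email: str, password: str) -> list:
    return [
        "docker-compose", "run",
        "-e", f"DJANGO_SUPERUSER_USERNAME={username}",
        "-e", f"DJANGO_SUPERUSER_EMAIL={email}",
        "-e", f"DJANGO_SUPERUSER_PASSWORD={password}",
        "osat_server", "python", "manage.py", "createsuperuser", "--noinput",
    ]

# import subprocess
# subprocess.run(createsuperuser_cmd("admin", "admin@example.com", "change-me"), check=True)
```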

### Create organization

You need to go to `Org -> Organization` and create a new organization. You can create as many as you want. Organizations are used to implement RBAC in the project, so that users only see information about their own organization. Here is a quick link to access it: [http://localhost:8000/admin/org/website/](http://localhost:8000/admin/org/website/)


### Add user to organization

Once your organization is created, add your user to it.
Go to `Organizations -> Organizations Users` and add your user to the organization created before: [http://localhost:8000/admin/organizations/organizationuser/](http://localhost:8000/admin/organizations/organizationuser/)

### Create periodic task

We have implemented multiple periodic tasks in **OSAT**, such as the Lighthouse audit and the security audit.
Their parameters are all saved in the database, so you need to instantiate your crawlers yourself.

Go to `Periodic Tasks -> Periodic Tasks` and click on **ADD PERIODIC TASK**.

You need to create two periodic tasks:
- One for `lighthouse_crawler`
- One for `security_crawler`
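The "Periodic Tasks" admin section suggests the project uses django-celery-beat. If so, the two tasks can also be registered programmatically from a `manage.py shell` session. This is only a sketch: the task names match the list above, but the daily 03:00 schedule and the exact task paths are assumptions.

```python
import json


def register_crawlers():
    # Local imports: these only resolve inside the project's Django environment
    # (run this from `manage.py shell`), assuming django-celery-beat is used.
    from django_celery_beat.models import CrontabSchedule, PeriodicTask

    # Assumed schedule: every day at 03:00 (adjust to taste).
    daily, _ = CrontabSchedule.objects.get_or_create(minute="0", hour="3")
    for task_name in ("lighthouse_crawler", "security_crawler"):
        PeriodicTask.objects.get_or_create(
            name=task_name,
            task=task_name,
            crontab=daily,
            defaults={"args": json.dumps([])},
        )
```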

My settings for Lighthouse and security are as follows:

<p align="center"><img src="./docs/images/lighthouse-crawler.png" width="400px" /></p>

I'm using a cron schedule that runs every day for both security and Lighthouse, but feel free to crawl more or less often. :)

Once you've done all of the above, you are ready to go. You can create as many organizations as you'd like, add users to them, and access the whole database from the admin panel.

## Links

- **Webapp**: [http://localhost:3000](http://localhost:3000)
- **Admin dashboard**: [http://localhost:8000/admin](http://localhost:8000/admin)
- **Swagger-like API interface**: [http://localhost:8000](http://localhost:8000)

## Screenshots

### SERP Rank

![](examples/SERP-rank.png)

### Internal Links Graphs

![](examples/graphs.png)

### Keywords Finder

![](examples/keywords-finder.png)

### Lighthouse Audit

![](examples/lighthouse-primates.png)

### Images Extractor

![](examples/images.png)

## Contributions

Contributions are very welcome; the code is documented as much as possible to help you get started.

### Backend

You can add a Django module and I'll take care of integrating it into the frontend. I know how hard that can be. :D

### Frontend

The frontend is built with React Admin. If you want to help improve the UI or add new functionalities, please feel free to contribute.

## Disclaimers

I'm neither a Python nor a frontend developer, but I'll keep working on this project.

## API

### Lighthouse

| METHOD | DESCRIPTION | ENDPOINT | PARAMS |
| :-------------: |-------------| -----|-----|
| **GET** | All Audits | `/api/audit/lighthouse/score` | `None` |
| **GET** | Audit by Id | `/api/audit/lighthouse/score/<id>` | `id` |
| **POST** | Generates an Audit | `/api/audit/lighthouse/score` | `url` |

### Extract
#### Headers
| METHOD | DESCRIPTION | ENDPOINT | PARAMS |
| :-------------: |-------------| -----|-----|
| **GET** | All Extracted Headers | `/api/extract/headers` | `None` |
| **GET** | Headers by Id | `/api/extract/headers/<id>` | `id` |
| **POST** | Extract Header from URL | `/api/extract/headers` | `url` |
| **POST** | Deletes Headers by Id | `/api/extract/headers/delete` | `id` |

#### Status Code Links
| METHOD | DESCRIPTION | ENDPOINT | PARAMS |
| :-------------: |-------------| -----|-----|
| **GET** | All Links status extracted from pages| `/api/extract/links` | `None` |
| **GET** | Links Status by ID | `/api/extract/links/<id>` | `id` |
| **POST** | Extracts link status from URL | `/api/extract/links` | `url` |
| **POST** | Delete Link status by ID | `/api/extract/links/delete` | `id` |

#### Internal & External Links
| METHOD | DESCRIPTION | ENDPOINT | PARAMS |
| :-------------: |-------------| -----|-----|
| **GET** | All Internal & External links extracted from pages | `/api/extract/links/website` | `None` |
| **GET** | Internal & External by ID | `/api/extract/links/website/<id>` | `id` |
| **POST** | Extracts Internal & External links from URL | `/api/extract/links/website` | `url` |
| **POST** | Deletes Internal & External links by ID | `/api/extract/links/website/delete` | `id` |

#### Images
| METHOD | DESCRIPTION | ENDPOINT | PARAMS |
| :-------------: |-------------| -----|-----|
| **GET** | All Images extracted from pages | `/api/extract/images` | `None` |
| **GET** | Images by ID | `/api/extract/images/<id>` | `id` |
| **POST** | Extracts Images from URL | `/api/extract/images` | `url` |
| **POST** | Deletes Images by ID | `/api/extract/images/delete` | `id` |

### Internal Linking Graphs
| METHOD | DESCRIPTION | ENDPOINT | PARAMS |
| :-------------: |-------------| -----|-----|
| **GET** | All Internal Linking Graphs generated | `/api/graphs` | `None` |
| **GET** | Graphs by ID | `/api/graphs/<id>` | `id` |
| **POST** | Extracts graph from domain | `/api/graphs` | `domain` |
| **POST** | Deletes Graphs by ID | `/api/graphs/delete` | `id` |

### Query Keywords Generator
| METHOD | DESCRIPTION | ENDPOINT | PARAMS |
| :-------------: |-------------| -----|-----|
| **GET** | All Keywords generated | `/api/keywords` | `None` |
| **GET** | Keywords by ID | `/api/keywords/<id>` | `id` |
| **POST** | Extracts keywords from query | `/api/keywords` | `query` |
| **POST** | Deletes Keywords by ID | `/api/keywords/delete` | `id` |

### Search Engine Result Page Rank
| METHOD | DESCRIPTION | ENDPOINT | PARAMS |
| :-------------: |-------------| -----|-----|
| **GET** | All Ranks | `/api/rank` | `None` |
| **POST** | Extracts ranks from query and domain | `/api/rank` | `query` & `domain` |
| **POST** | Deletes ranks by ID | `/api/rank/delete` | `id` |
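As an illustration (not part of the repository), the endpoints above can be called with Python's standard library. The request encoding is an assumption: form-encoded POST bodies are used here, so adjust if the API expects JSON.

```python
# Hypothetical client sketch for the API tables above; only the paths
# come from the documentation, the encoding is an assumption.
import urllib.parse
import urllib.request

BASE = "http://localhost:8000"


def build_request(method, path, params=None):
    """Build (but do not send) a request against the API above."""
    if method == "GET":
        return urllib.request.Request(BASE + path, method="GET")
    data = urllib.parse.urlencode(params or {}).encode()
    return urllib.request.Request(BASE + path, data=data, method="POST")


# Trigger a Lighthouse audit, then list all audits:
audit = build_request("POST", "/api/audit/lighthouse/score", {"url": "https://example.com"})
all_audits = build_request("GET", "/api/audit/lighthouse/score")
# with urllib.request.urlopen(audit) as resp:  # run only with the stack up
#     print(resp.read())
```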



Binary file added docs/images/lighthouse-crawler.png
