
Merge pull request #25 from Paperspace/docs-move-local-setup-as-first
docs: clean up virtual env & naturalize language
mkulaczkowski authored May 31, 2019
2 parents 741fd58 + 8763d00 commit 6ec139d
Showing 1 changed file (README.md) with 32 additions and 33 deletions.
You can also use Gradient SDK to ensure you have the correct path:
```
from gradient_sdk.utils import data_dir, model_dir, export_dir
```
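For instance, a minimal sketch of how these helpers might be used inside a training script — assuming they are callables that return directory paths (the call signatures here are an assumption; check the Gradient SDK docs for the exact API):
```
# Hypothetical usage: the helper names come from the import above,
# but treating them as zero-argument callables is an assumption.
from gradient_sdk.utils import data_dir, model_dir, export_dir

print("Reading training data from:", data_dir())
print("Writing checkpoints to:", model_dir())
print("Exporting SavedModels to:", export_dir())
```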

# (Optional) Local Setup using a Virtual Environment
Users sometimes run into local machine environment issues when trying to use Python. A common solution is to create a Python virtual environment and run Python from inside it. To do so:

1. Create and activate a Python virtual environment (we recommend using python3.7+):
```
cd mnist-sample
python3 -m venv venv
source venv/bin/activate
```

2. Install the required Python packages:
```
pip install -r requirements-local.txt
```
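Optionally, you can sanity-check the environment by confirming that TensorFlow imports from within the activated virtualenv (this assumes TensorFlow is among the packages pinned in `requirements-local.txt`):
```
python -c "import tensorflow as tf; print(tf.__version__)"
```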
# Local Training

To train the mnist model locally:

1. Make sure you have the latest version of TensorFlow installed.

2. Also make sure you've [added the models folder to your Python path](/official/#running-the-models); otherwise you may encounter an error like `ImportError: No module named mnist`.
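   The linked instructions amount to adding the models directory to your `PYTHONPATH`; a sketch, with `/path/to/models` as a hypothetical placeholder for wherever that folder lives on your machine:
```
export PYTHONPATH="$PYTHONPATH:/path/to/models"
```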

3. Download the code from GitHub:
```
git clone [email protected]:Paperspace/mnist-sample.git
```

4. Start training the model:

```
python mnist.py
```

_Note: local training will take a long time, so be prepared to wait!_

If you want to shorten model training time, you can change the max steps parameter:
```
python mnist.py --max_steps=1500
```

The mnist dataset is downloaded to the `./data` directory.

Model results are stored in the `./models` directory.

Both directories can be safely deleted if you would like to start the training over from the beginning.
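For example, to wipe the previous run and start training from scratch, you could run the following from the repository root:
```
rm -rf ./data ./models
```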

You can export the model into a specific directory, in the Tensorflow SavedModel format:
```
python mnist.py --export_dir /tmp/mnist_saved_model
```
If no export directory is specified, the model is saved to a timestamped directory under the `./models` subdirectory (e.g. `mnist-sample/models/1513630966/`).
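If you want to inspect what was exported, TensorFlow ships with the `saved_model_cli` tool; for example (where `<timestamp>` is a placeholder for the timestamped subdirectory the exporter actually created):
```
saved_model_cli show --dir /tmp/mnist_saved_model/<timestamp> --all
```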

## Testing a Tensorflow Serving-deployed model on Paperspace
To test the prediction endpoint of a model deployed with Tensorflow Serving on Paperspace, run the following commands, replacing `your-deployment-id` with your deployment's id:
```
python serving_rest_client_test.py --url https://services.paperspace.io/model-serving/your-deployment-id:predict
```
Optionally, you can provide a path to an image file to run a prediction on, for example:
```
python serving_rest_client_test.py --url https://services.paperspace.io/model-serving/your-deployment-id:predict --path example5.png
```

_Note: it may be useful to run this test from within a virtual environment to guard against issues in your local environment. To do so, use the instructions above._

## Testing a Tensorflow Serving-deployed model on your local machine using Docker
Open another terminal window and run the following in the directory where you cloned this repo:
```
docker run -t --rm -p 8501:8501 -v "$PWD/models:/models/mnist" -e MODEL_NAME=mnist tensorflow/serving
```
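Once the container is up, you can check that the model is being served by querying TensorFlow Serving's REST status endpoint (the port and model name below match the `docker run` command above); predictions would then go to the corresponding `:predict` URL:
```
curl http://localhost:8501/v1/models/mnist
```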
