OpenAPI support #146

Closed · wants to merge 1 commit
1 change: 1 addition & 0 deletions docs/index.rst
@@ -18,6 +18,7 @@ Welcome to grpc4bmi's documentation!
container/building
container/usage
cli
openapi
python_api

Indices and tables
79 changes: 79 additions & 0 deletions docs/openapi.rst
@@ -0,0 +1,79 @@
OpenAPI
=======

Your model might be written in a language that does not have gRPC support.
In that case, you can use the OpenAPI specification to wrap your model in a JSON web service.

Generate spec
-------------

The OpenAPI spec can be generated from the gRPC protobuf definition using the `protoc-gen-openapiv2` plugin,
which is part of https://github.com/grpc-ecosystem/grpc-gateway/

```bash
protoc -I . --openapiv2_out . \
--openapiv2_opt=output_format=yaml \
--openapiv2_opt=generate_unbound_methods=true \
./proto/grpc4bmi/bmi.proto
```

Generate Python client
----------------------

```bash
npx --package @openapitools/openapi-generator-cli openapi-generator-cli generate -i proto/grpc4bmi/bmi.swagger.yaml -g python -o openapi/python-client
```

Consuming
---------

To consume a web service using the BMI OpenAPI specification, you can use the Python client:

```python
from grpc4bmi.bmi_openapi_client import BmiOpenApiClient
model = BmiOpenApiClient(host='localhost', port=50051, timeout=10)
model.initialize(config_file)
model.update()
```
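Under the hood, each client call is a JSON-over-HTTP request. Since the spec is generated with `generate_unbound_methods=true`, grpc-gateway exposes each method as `POST /<package>.<Service>/<method>`. The following sketch shows that mapping; the service name `BmiService` is an assumption here, so check the actual name in `bmi.proto`:

```python
import json


def route_for(method: str, package: str = "bmi", service: str = "BmiService") -> str:
    """HTTP path grpc-gateway uses for an unbound gRPC method.

    The package/service names are assumptions for illustration.
    """
    return f"/{package}.{service}/{method}"


def request_body(**fields) -> str:
    """JSON body sent with the POST request."""
    return json.dumps(fields)


# e.g. model.initialize(config_file) roughly becomes:
path = route_for("initialize")
body = request_body(config_file="config.yaml")
```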

To spin up a web service inside a container together with a client in one go, you can use

```python
from grpc4bmi.bmi_openapi_client import BmiOpenApiApptainerClient, BmiOpenApiDockerClient

model = BmiOpenApiApptainerClient(
image='wflowjl.sif', work_dir='/tmp/workdir', input_dirs=[]
)
model = BmiOpenApiDockerClient(
image='ghcr.io/eWatercycle/wflowjl', work_dir='/tmp/workdir', input_dirs=[]
)
```
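Whichever client you pick, the BMI calls themselves are identical. A typical run loop might look like the sketch below, where `model` is any object exposing the standard BMI methods (a minimal sketch, not part of grpc4bmi itself):

```python
def run_to_end(model, config_file: str) -> int:
    """Drive a BMI model from initialize to finalize; returns step count."""
    model.initialize(config_file)
    steps = 0
    while model.get_current_time() < model.get_end_time():
        model.update()
        steps += 1
    model.finalize()
    return steps
```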

Providing
---------

To provide a web service using the BMI OpenAPI specification, you will need to implement a web service in the language your model is written in.

Python
~~~~~~

Generate the server stubs with:

```shell
npx --package @openapitools/openapi-generator-cli openapi-generator-cli generate -i proto/grpc4bmi/bmi.swagger.yaml -g python-fastapi -o openapi/python-server
```

Inside each stub, call the corresponding BMI method of your model.
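The generated FastAPI stubs follow this forward-to-the-model pattern. As a standard-library-only illustration of the idea (this is not the generated code, and `DemoModel` is a stand-in for a real BMI model), each route simply dispatches to the matching BMI method:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer


class DemoModel:
    """Stand-in for a real BMI model (illustration only)."""
    def initialize(self, config_file):
        self.time = 0.0

    def update(self):
        self.time += 1.0

    def get_current_time(self):
        return self.time


def make_handler(model):
    class Handler(BaseHTTPRequestHandler):
        def do_POST(self):
            # Route POST /initialize, /update, ... to the matching BMI method.
            name = self.path.strip("/")
            length = int(self.headers.get("Content-Length", 0))
            args = json.loads(self.rfile.read(length) or b"{}")
            result = getattr(model, name)(**args)
            payload = json.dumps({"result": result}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(payload)))
            self.end_headers()
            self.wfile.write(payload)

        def log_message(self, *args):
            # Silence per-request logging for this sketch.
            pass

    return Handler


# To serve: HTTPServer(("", 8080), make_handler(DemoModel())).serve_forever()
```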

Since Python is supported by gRPC directly, you should not need OpenAPI for a Python model.

Julia
~~~~~

Generate the server stubs with:

```shell
npx --package @openapitools/openapi-generator-cli openapi-generator-cli generate -i proto/grpc4bmi/bmi.swagger.yaml -g julia-server -o openapi/julia-server
```

Inside each stub, call the corresponding BMI method of your model.
1 change: 1 addition & 0 deletions proto/grpc4bmi/bmi.proto
@@ -1,6 +1,7 @@
syntax = "proto3";

package bmi;
option go_package = "github.com/eWatercycle/grpc4bmi";

message Empty{}
