This document will highlight how to add new APIs to the OpenStorage SDK.
The SDK follows the Protocol Buffer Style Guide. From the guide:
Note that protocol buffer style has evolved over time, so it is likely that you will see .proto files written in different conventions or styles. Please respect the existing style when you modify these files. Consistency is key. However, it is best to adopt the current best style when you are creating a new .proto file.
Therefore, for any new messages, enums, etc, please follow the style guide.
All SDK APIs and values must satisfy the following:
Any changes to the protocol must bump the version by one. On the master branch, the minor number is bumped. On release branches, the patch number is bumped.
- APIs must be readable. SDK APIs and values must be concrete, clear values. They are used not just by Portworx, but also by non-Portworx developers who do not have an understanding of the internals of the Portworx cluster.
- `string` types should be used only for ids, messages, or opaque values. They are not meant to marshal information as YAML. Instead, create a concrete message.
- Only use `map<string, string>` for opaque values like labels, key-value pairs, etc. Do not use them for operations; use enums instead.
- Value options should not be passed as `string`. Instead of passing "Done" or "paused", use enums for these values, making the meaning clear to the reader.
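To make the enum guidance concrete, here is a sketch of what the generated Go code looks like for a wrapped status enum instead of a free-form string; `RestoreStatus` and its values are hypothetical names for illustration, not part of the SDK:

```go
package main

import "fmt"

// Hypothetical generated code for an enum wrapped in a message,
// e.g. `message RestoreStatus { enum Value { ... } }` in the proto file.
type RestoreStatus_Value int32

const (
    RestoreStatus_UNSPECIFIED RestoreStatus_Value = 0
    RestoreStatus_PAUSED      RestoreStatus_Value = 1
    RestoreStatus_DONE        RestoreStatus_Value = 2
)

// Name map like the ones protoc generates, so values print as
// readable strings instead of bare integers.
var RestoreStatus_Value_name = map[int32]string{
    0: "UNSPECIFIED",
    1: "PAUSED",
    2: "DONE",
}

func main() {
    s := RestoreStatus_DONE
    // A reader sees the well-defined name "DONE", not an ad-hoc
    // "Done"/"done" string that every client must guess at.
    fmt.Println(RestoreStatus_Value_name[int32(s)])
}
```

With an enum, an unset field is unambiguously `UNSPECIFIED` (`0`), which a plain string cannot express.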
- Services contain RPC functions.
- Services should be in the format `OpenStorage<Service Name>`.
  - Note that a service is a collection of APIs, and they are grouped as such in the documentation.
  - Here is an example for OpenStorageClusterService.
- If it is a new service, then it should have `Create`, `Inspect`, `Delete`, or `Enumerate` style APIs, if possible.
- All APIs must have a single message for the request and a single message for the response with the following style: `Sdk<Service Type><Api Name>Request|Response`
- RPCs will be created as methods on the service object, therefore there is no need to add the service name as part of the RPC. For example, use `Foo` or `Bar` instead of `ServiceFoo` or `ServiceBar` as RPC names.
- Follow the Google protobuf style for enums.
- According to the Google guide, the zero-value enum should be labeled `UNSPECIFIED` so callers can check whether it was set, since `0` is the default value used when the client does not provide one.
- Wrap enums in messages so that their string values are clearer. Wrapping an enum in a message also has the benefit of not needing to prefix the enum values with namespace information. For example, instead of using the enum `XATTR_UNSPECIFIED`, the example below uses just `UNSPECIFIED` since it is inside the `Xattr` message. The generated code will be namespaced:
Proto:

```proto
// Xattr defines implementation specific volume attribute
message Xattr {
  enum Value {
    // Value is uninitialized or unknown
    UNSPECIFIED = 0;
    // Enable on-demand copy-on-write on the volume
    COW_ON_DEMAND = 1;
  }
}
```
Using the enum in a proto message:

```proto
message VolumeSpec {
  // Holds the extended attributes for the volume
  Xattr.Value xattr = 1;
}
```
Notice the namespaced names and string values in the generated output code:
```go
type Xattr_Value int32

const (
    // Value is uninitialized or unknown
    Xattr_UNSPECIFIED Xattr_Value = 0
    // Enable on-demand copy-on-write on the volume
    Xattr_COW_ON_DEMAND Xattr_Value = 1
)

var Xattr_Value_name = map[int32]string{
    0: "UNSPECIFIED",
    1: "COW_ON_DEMAND",
}

var Xattr_Value_value = map[string]int32{
    "UNSPECIFIED":   0,
    "COW_ON_DEMAND": 1,
}

type VolumeSpec struct {
    // Holds the extended attributes for the volume
    Xattr Xattr_Value `protobuf:"varint,36,opt,name=xattr,enum=openstorage.api.Xattr_Value" json:"xattr,omitempty"`
}
```
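Since the generated code exposes both a name map and a value map, converting between the enum and its string form is straightforward. A small self-contained sketch reusing the declarations shown above:

```go
package main

import "fmt"

// Redeclared here so the sketch is self-contained; in real code these
// come from the protoc-generated bindings.
type Xattr_Value int32

const (
    Xattr_UNSPECIFIED   Xattr_Value = 0
    Xattr_COW_ON_DEMAND Xattr_Value = 1
)

var Xattr_Value_name = map[int32]string{
    0: "UNSPECIFIED",
    1: "COW_ON_DEMAND",
}

var Xattr_Value_value = map[string]int32{
    "UNSPECIFIED":   0,
    "COW_ON_DEMAND": 1,
}

func main() {
    // Enum -> string, e.g. for logging or CLI output
    fmt.Println(Xattr_Value_name[int32(Xattr_COW_ON_DEMAND)]) // COW_ON_DEMAND

    // String -> enum, e.g. when parsing a user-provided flag
    v := Xattr_Value(Xattr_Value_value["COW_ON_DEMAND"])
    fmt.Println(v == Xattr_COW_ON_DEMAND) // true
}
```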
OpenStorage has a number of default roles used to authorize access to an API. Please make sure your API is accessible by the appropriate role. For example, the role `system.admin` has access to all APIs, while the role `system.user` only has access to volume management APIs.

The default roles are configured under `pkg/roles`. See SdkRule for more information.
If you have any questions, please do not hesitate to ask.
- Try not to use `uint64`. Instead, try to use signed `int64`. (There is a reason for this, which is why CSI changed all uint64s to int64s in version 0.2, but I can find out why. I think it has to do with Java gRPC.)
- If it is a new message, start with the field number `1`.
- If it is an addition to a message, continue the field number sequence by one.
- If you are using `oneof`, you may want to start with a large value for the field number so that its fields do not interfere with other values in the message:
```proto
string s3_storage_class = 7;

// Start at field number 200 to allow for expansion
oneof credential_type {
  // Credentials for AWS/S3
  SdkAwsCredentialRequest aws_credential = 200;
  // Credentials for Azure
  SdkAzureCredentialRequest azure_credential = 201;
  // Credentials for Google
  SdkGoogleCredentialRequest google_credential = 202;
}
```
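In the generated Go code, a `oneof` becomes an interface-typed field with one wrapper struct per option, and server code typically type-switches on it. A simplified sketch of that pattern, using hand-written stand-ins for the generated types:

```go
package main

import "fmt"

// Hand-written stand-ins for the protoc-generated oneof wrapper types.
type isCredentialType interface{ isCredentialType() }

type AwsCredential struct{ AccessKey string }
type AzureCredential struct{ AccountName string }

func (*AwsCredential) isCredentialType()   {}
func (*AzureCredential) isCredentialType() {}

type CredentialCreateRequest struct {
    // Mirrors the `oneof credential_type` field from the proto file.
    CredentialType isCredentialType
}

// provider reports which cloud the request targets by inspecting
// which oneof option was set.
func provider(req *CredentialCreateRequest) string {
    switch req.CredentialType.(type) {
    case *AwsCredential:
        return "aws"
    case *AzureCredential:
        return "azure"
    default:
        return "unknown"
    }
}

func main() {
    req := &CredentialCreateRequest{CredentialType: &AwsCredential{AccessKey: "key"}}
    fmt.Println(provider(req)) // aws
}
```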
REST endpoints are autogenerated from the proto file by the grpc-gateway protoc compiler. All OpenStorage SDK APIs should add the appropriate information to generate a REST endpoint for the service. Here is an example:
```proto
rpc Inspect(SdkRoleInspectRequest)
    returns (SdkRoleInspectResponse) {
  option (google.api.http) = {
    get: "/v1/roles/inspect/{name}"
  };
}

// Delete an existing role
rpc Delete(SdkRoleDeleteRequest)
    returns (SdkRoleDeleteResponse) {
  option (google.api.http) = {
    delete: "/v1/roles/{name}"
  };
}

// Update an existing role
rpc Update(SdkRoleUpdateRequest)
    returns (SdkRoleUpdateResponse) {
  option (google.api.http) = {
    put: "/v1/roles"
    body: "*"
  };
}
```
Here are the guidelines for REST in the OpenStorage SDK:

- Endpoints must be prefixed as follows: `/v1/<service name>/<rpc name if needed>/{any variables if needed}`.
- Use the appropriate HTTP method. Here are some guidelines:
  - For Create RPCs use the `post` HTTP method
  - For Inspect RPCs use the `get` HTTP method
  - For Update RPCs use the `put` HTTP method
  - For Delete RPCs use the `delete` HTTP method
- Use `get` for non-mutable calls.
- Use `put` with `body: "*"` for most calls that need to send a message to the SDK server.
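The endpoint prefix convention can be illustrated with a tiny helper. This function is purely illustrative and not part of the SDK; the real routes come from the `google.api.http` options in the proto file:

```go
package main

import (
    "fmt"
    "strings"
)

// endpoint builds a path following the
// /v1/<service>/<rpc if needed>/<variables if needed> convention.
func endpoint(service, rpc string, vars ...string) string {
    parts := []string{"/v1", service}
    if rpc != "" {
        parts = append(parts, rpc)
    }
    parts = append(parts, vars...)
    return strings.Join(parts, "/")
}

func main() {
    fmt.Println(endpoint("roles", "inspect", "my-role")) // /v1/roles/inspect/my-role
    fmt.Println(endpoint("roles", "", "my-role"))        // /v1/roles/my-role
}
```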
- It is imperative that the comments are correct since they are used to automatically generate the documentation for https://libopenstorage.github.io . The documentation for these values in the proto files can be in Markdown format.
- Documenting Messages:
  - Document each value of the message.
  - Do not use Golang style. Do not repeat the name of the variable in Golang camel format in the comment to document it, since the variable could be in other styles in other languages. For example:
```proto
// Provides volume's exclusive bytes and its total usage. This cannot be
// retrieved individually and is obtained as part of a node's usage for a
// given node.
message VolumeUsage {
  // id for the volume/snapshot
  string volume_id = 1;
  // name of the volume/snapshot
  string volume_name = 2;
  // uuid of the pool that this volume belongs to
  string pool_uuid = 3;
  // size in bytes exclusively used by the volume/snapshot
  uint64 exclusive_bytes = 4;
  // size in bytes by the volume/snapshot
  uint64 total_bytes = 5;
  // set to true if this volume is a snapshot created by cloudbackups
  bool local_cloud_snapshot = 6;
}
```
- NOTE: Most importantly, these APIs must be supported forever once released.
- They will almost never be deprecated, since at some point we will have many versions of the clients. So please be clear and careful with the APIs you create.
- If you need to change or update, you can always add.
Here is the process if you would like to deprecate:
- According to the proto3 Language Guide, set the value in the message to deprecated and add a `(deprecated)` string to the comment as follows:
```proto
// (deprecated) Field documentation here
int32 field = 6 [deprecated = true];
```
- Comment in the SDK_CHANGELOG that the value is deprecated.
- Provide at least two releases before removing support for that value in the message. Make sure to document the deprecation in the product release notes.
- Once at least two releases have passed, reserve the field number as shown in the proto3 Language Guide:
```proto
message Foo {
  reserved 6;
}
```
It is essential that no new values reuse the field number when updating or replacing a field. From the guide:
Note: If you update a message type by entirely removing a field, or commenting it out, future users can reuse the field number when making their own updates to the type. This can cause severe issues if they later load old versions of the same .proto, including data corruption, privacy bugs, and so on.
If you are adding a new service, use the following steps:

- Create a new service in the proto file.
- For Volume services:
  - Create a new file under `api/server/sdk` with the name `<service>.go`.
  - In it, create an object which will house the API implementation of the server functions for this service. See Example.
  - Initialize this object in `server.go::New()`.
  - Add the endpoint to the REST gRPC Gateway.
- For non-Volume services:
  - Create a Golang implementation of the service in `pkg/`.
  - Implement the gRPC generated server definition there.
  - Write unit tests for your implementation if possible.
  - See Role Manager as an example.
- Adding your service to the SDK server:
  - NOTE: We are moving away from adding services directly in the SDK `server.go`. Instead, instantiate your object, then add it to the SDK `ServerConfig{}`. This can be done by adding services to `ServerConfig.GrpcServerExtensions[]` and generated REST services to `ServerConfig.RestServerExtensions[]`. For more information see `ServerConfig{}`.
To add an API, follow these steps:

- Create a new API in a service proto file and create its messages.
  - It is HIGHLY recommended that you have these messages reviewed first before sending your PR. The easiest way is to create an Issue with a description of the plan and the proto file API and messages. Otherwise, you may have to change all your code if there is a suggestion on changes to your proto file.
- Generate the Golang bindings by running `make docker-proto`.
- Add the implementation as described above.
  - The implementation should only communicate with the OpenStorage Golang interfaces, never the REST API.
  - APIs must check for the required parameters in the message, and unit tests must confirm these checks.
- If your API is not supported by the `fake` driver, please add support for it. It is essential that the `fake` driver supports your API since it will be used by developers to write their clients using the Docker container as shown in the documentation.
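As an example of the required-parameter checks mentioned above, a server method typically validates the request before calling the driver. Below is a simplified sketch using plain errors and a hypothetical request type; the real SDK implementations return gRPC status codes such as `codes.InvalidArgument`:

```go
package main

import (
    "errors"
    "fmt"
)

// Hypothetical request type standing in for a generated message.
type SdkVolumeInspectRequest struct {
    VolumeId string
}

// GetVolumeId mirrors the nil-safe getters protoc generates.
func (r *SdkVolumeInspectRequest) GetVolumeId() string {
    if r == nil {
        return ""
    }
    return r.VolumeId
}

// inspect validates required parameters before doing any work, as the
// SDK server implementations do. Unit tests should cover this check.
func inspect(req *SdkVolumeInspectRequest) (string, error) {
    if req.GetVolumeId() == "" {
        return "", errors.New("must supply a volume id")
    }
    // ... call the OpenStorage driver interface here ...
    return "volume:" + req.GetVolumeId(), nil
}

func main() {
    _, err := inspect(&SdkVolumeInspectRequest{})
    fmt.Println(err != nil) // true: missing required parameter rejected

    v, _ := inspect(&SdkVolumeInspectRequest{VolumeId: "vol-1"})
    fmt.Println(v) // volume:vol-1
}
```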
To do development testing using the `fake` driver, do the following:

- Type `make launch-sdk`.
  - This will create a new container with the SDK and run it on your system. Note, if you have one already running, you must stop that container before running this command.
- Use a browser to execute your command:
  - Go to http://127.0.0.1:9110/swagger-ui then click on the command you want to try, then click on `Try it now`.
  - Change or adjust the input request as needed, then click on the `Execute` command.
  - Inspect the response from the server.
When rebasing, you may get conflicts on generated files. If you do, just accept the incoming generated files (referred to by git as `--ours`), then once all the rebases are done, regenerate again and commit.

Here are the commands you may need:

```
$ git rebase master
<-- Conflicts. For each conflict on a generated file, repeat: -->
$ git checkout --ours <file with conflict>
$ git add <file with conflict>
$ git rebase --continue
```
Once the SDK is ready for a new release and the version number has been updated, the following steps can be used to publish it:
- Update sdk-test to test the new functionality.
- Update docs Reference and Changelog.
- Update openstorage-sdk-clients to regenerate new clients.