
Back-end

The back-end is entirely powered by Amazon Web Services. See platform schema. If you don't have an AWS account, create one; Amazon provides the AWS Free Tier for new accounts. The demo platform is lightweight and should fit within the free tier limits.


Components

  • AWS IoT is used to communicate with the Nucleo board as well as process data coming from the device. Data is automatically processed by AWS IoT Rules Engine.
  • AWS Lambda is the computing component of the platform. Lambdas are used to process data from the IoT rules engine and to implement business logic for the API. There is also a "bot" implemented on Lambdas. The bot emulates the Nucleo board and can be used when the board is not available, as well as for debugging. In addition to processing Nucleo board data, Lambda is used to fetch weather data for a number of cities from the OpenWeatherMap API.
  • Amazon DynamoDB is a key-value storage where data is persisted by the IoT rules engine and Lambdas.
  • API is powered by Amazon API Gateway.
  • Amazon CloudWatch is used as a scheduler.
  • Amazon Cognito is used for providing read-only public access to IoT data streams via MQTT over Websockets for the web dashboard.
  • Web dashboard is hosted on Amazon S3.

Amazon DynamoDB

DynamoDB is used as a data storage for the demo platform. We will need two tables: one for sensor data and one for weather data.

Sensor Data

Create the sensor data table with the following parameters:

  • Name: nucleo-metrics
  • Primary partition key: metric (String)
  • Primary sort key: timestamp (Number)

Weather Data

Create the weather data table with the following parameters:

  • Name: nucleo-weather
  • Primary partition key: city (Number)
  • Primary sort key: timestamp (Number)

NOTE: You can use any table names, but don't forget to change them in the Lambda code as well.
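If you prefer the command line, the two tables above can also be created with the AWS CLI. This is a sketch assuming the default table names and on-demand billing (the console steps above use the console's defaults instead):

```shell
# Sensor data table: partition key "metric" (String), sort key "timestamp" (Number)
aws dynamodb create-table \
    --table-name nucleo-metrics \
    --attribute-definitions AttributeName=metric,AttributeType=S AttributeName=timestamp,AttributeType=N \
    --key-schema AttributeName=metric,KeyType=HASH AttributeName=timestamp,KeyType=RANGE \
    --billing-mode PAY_PER_REQUEST

# Weather data table: partition key "city" (Number), sort key "timestamp" (Number)
aws dynamodb create-table \
    --table-name nucleo-weather \
    --attribute-definitions AttributeName=city,AttributeType=N AttributeName=timestamp,AttributeType=N \
    --key-schema AttributeName=city,KeyType=HASH AttributeName=timestamp,KeyType=RANGE \
    --billing-mode PAY_PER_REQUEST
```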

AWS IoT

AWS IoT works as a middleware between "things" (Nucleo board in our case) and other system components.

NOTE: This guide is based on the latest interface of the AWS IoT console.

Open the AWS IoT console. First, register a Thing: go to "Registry -> Things" and click "Register a thing". Name is the only required parameter here; set it to Nucleo. The Thing will reflect the Nucleo board status.

After registering the Thing, we should add a Policy. A Policy defines access rules for the thing. In the AWS IoT console, go to "Security -> Policies" and click "Create a policy". Name it Nucleo-Policy and set the proper parameters.
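The guide does not spell out the policy document. As a sketch, a permissive demo policy (broad `iot:*` access, fine for a demo but too wide for production) could be created from the command line; the JSON document can equally be pasted into the console's policy editor:

```shell
# Broad demo policy: allows all IoT actions on all resources (tighten for production)
aws iot create-policy \
    --policy-name Nucleo-Policy \
    --policy-document '{
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Action": ["iot:*"],
                "Resource": ["*"]
            }
        ]
    }'
```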

Now we should generate custom Certificates and register them in AWS IoT (see the Certificates section below).

After that, click the "Attach Policy" button and attach the Nucleo-Policy policy to the certificate.

In the next step we will need two rules.

The first rule's aim is to store sensor data in DynamoDB. Click "Create a rule" and set the following parameters:

  • Name: any, e.g. store_temperature
  • Attribute: state.reported.temperature, state.reported.humidity, state.reported.pressure, state.reported.accelerometer, state.reported.gyroscope, state.reported.magnetometer, timestamp
  • Topic filter: $aws/things/Nucleo/shadow/update/accepted
  • Choose an Action: "Insert message into a database table (DynamoDB)"

  • Table name: select the sensor data table created earlier (nucleo-metrics)
  • Hash key value: temperature
  • Range key value: ${metadata.reported.temperature.timestamp}
  • Write message data to this column: payload
  • Role name: click "Create new role", enter a name (for example, iot-dynamo-insert-role), and then click "Create a new role".
  • Click "Add action" to create the action.

NOTE: This will generate an AWS IAM role which allows write operations to the table.

Another action should be created for this rule:

  1. Select "Republish this item to another topic"
  2. Set Nucleo/data as the target topic
  3. Click "Create a new role" (for example, iot-iot-republish-role) then "Allow"
  4. Click "Add action"

After that submit the rule by clicking "Create rule" button.

The second rule's aim is to store markers. Click "Create a rule" and set the following parameters:

  • Name: any, e.g. store_markers
  • Attribute: *
  • Topic filter: Nucleo/data
  • Condition: marker = true

  • Choose an Action: "Insert message into a database table (DynamoDB)"
  • Table name: select the sensor data table created earlier
  • Hash key value: temperature
  • Range key value: ${timestamp() / 1000}
  • Write message data to this column: payload
  • Role name: select the role you generated for the previous rule (iot-dynamo-insert-role)

After that submit the rule by clicking "Create rule" button.
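Behind the scenes, the console builds an IoT SQL statement from the attribute, topic filter, and condition fields; for this second rule it is equivalent to `SELECT * FROM 'Nucleo/data' WHERE marker = true`. As a sketch, the same rule could be created with the AWS CLI — the role ARN below is a placeholder for the iot-dynamo-insert-role created earlier:

```shell
# Rule payload: the SQL the console builds from the topic filter and condition,
# plus a DynamoDB action (the role ARN below is a placeholder)
cat > store_markers.json <<'EOF'
{
    "sql": "SELECT * FROM 'Nucleo/data' WHERE marker = true",
    "actions": [{
        "dynamoDB": {
            "tableName": "nucleo-metrics",
            "roleArn": "arn:aws:iam::123456789012:role/iot-dynamo-insert-role",
            "hashKeyField": "metric",
            "hashKeyValue": "temperature",
            "rangeKeyField": "timestamp",
            "rangeKeyValue": "${timestamp() / 1000}",
            "payloadField": "payload"
        }
    }]
}
EOF

aws iot create-topic-rule --rule-name store_markers --topic-rule-payload file://store_markers.json
```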

Certificates

NOTE: For this step, you should have OpenSSL installed.

First, you should generate a private key (locally) using the standard elliptic curve prime256v1 over a 256-bit prime field. Run the following command in a terminal (macOS / Linux):

openssl ecparam -genkey -name prime256v1 -out nucleo.key.pem

The next step is to create a certificate signing request (CSR). You will be prompted for additional information. Run the following command in a terminal (macOS / Linux):

openssl req -new -sha256 -key nucleo.key.pem -out nucleo.csr

After that, in the AWS IoT console, go to "Security -> Certificates" and click "Create a certificate". Choose "Create with CSR".

Upload the nucleo.csr file we generated before. After that, save the generated device certificate and the root CA certificate to your local folder.

As a result, your local folder will contain the private key (nucleo.key.pem), the CSR (nucleo.csr), the device certificate, and the root CA certificate.

Finally, activate your certificate in the AWS IoT console.
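Alternatively, the CSR can be submitted and the resulting certificate activated in one step from the command line; a sketch:

```shell
# Upload the CSR, issue the device certificate, activate it,
# and save the certificate PEM locally
aws iot create-certificate-from-csr \
    --certificate-signing-request file://nucleo.csr \
    --set-as-active \
    --query certificatePem --output text > nucleo.cert.pem
```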

AWS Lambda

There are three Lambdas which should be set up. See the lambdas folder for their sources.

Build Lambda Functions

First, we need to build the lambdas. Install Node.js.

Copy the config.dist.js file as config.js: cp config.dist.js config.js

Edit the config.js file. There is one configuration parameter: the IoT endpoint hostname. It is unique for every AWS account. You can get it in the AWS IoT console: go to the console, click "Registry -> Things -> Nucleo -> Interact", then copy the hostname and paste it into the iotEndpoint config parameter.

To build lambdas run in command line: npm i && npm run build && npm run zip

Create Lambda Functions

Now there is a ZIP file for each lambda in the dist folder. Open the AWS Lambda console and create a lambda for each file.

Choose "Create Lambda function" (or "Get Started Now", if it's a new AWS account).

On the Select blueprint page choose the "Blank Function" blueprint.

Configure triggers page will be populated. Choose Next.

On the Configure function page, do the following:

  1. Give a name to the function and select the Node.js 6.10 runtime
  2. Set "Code entry type" to "Upload a *.ZIP file", click "Upload" and select the corresponding ZIP file

In the Role field, choose "Create a role". Enter a name for the new role that will be created (for example, lambda-dynamo-role). Click Allow.

Open IAM console and select "lambda-dynamo-role" in Roles tab.

Click "Attach Policy" and select "AmazonDynamoDBFullAccess" on Permissions tab. Click "Attach Policy".

Come back to the Configure function page and review the "Lambda function handler and role" section.

NOTE: Select this role for the next lambdas as well.

In "Advanced settings" set:

  • Memory: 256 MB
  • Timeout: 10 sec

Click "Next" then "Create function"

Get Nucleo Data

The getNucleoData lambda provides initial data set for client applications. We need to assign an API endpoint to it so that clients will be able to call it remotely.

To do that click "Add trigger":

Then select "API Gateway".

Click "Submit".

NOTE: Any method (GET, POST, etc.) will trigger your Lambda function. To set up more advanced method mappings or subpath routes visit Amazon API Gateway console.

Go to Amazon API Gateway console and click on the API created within the previous step.

Choose Create Method from the Actions drop-down menu. For the HTTP method, choose GET, and then save your choice:

NOTE: Delete ANY.

For Integration type in the GET method Setup pane, choose Lambda Function. For Lambda Region, choose the region (e.g., us-east-1) where you created the Lambda functions. In Lambda Function, type getNucleoData. Choose Save to finish setting up the integration request for this method.

NOTE: For Add Permission to Lambda Function, choose OK.

In the GET method click on "Method Request".

Choose "AWS IAM" in Authorization.

Expand "URL Query String Parameters" and click on "Add query string".

Add "metric" and "since" parameters (use "Add query string" link).

Return to Method Execution.

Click on "Integration Request" and expand "Body Mapping Templates".

Click on "Add mapping template" and specify application/json as content type.

Copy and paste this JSON into text area:

```
{
    "metric": "$input.params('metric')",
    "since": $input.params('since')
}
```

Click "Save"

Select resource in the resources list and click "Actions"

Select "Enable CORS" and then click on "Enable CORS and replace existing CORS headers"

Now the API endpoint is open and available for invocation by user browsers.
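Once deployed, the endpoint can be smoke-tested from the command line. The API id, region, and stage in the URL below are placeholders for the values API Gateway assigned to your deployment:

```shell
# Placeholder URL: substitute your own API id, region, and stage name
curl "https://abc123.execute-api.us-east-1.amazonaws.com/prod/getNucleoData?metric=temperature&since=0"
```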

Nucleo Fetch Weather

The nucleoFetchWeather lambda fetches weather data for a number of cities from the OpenWeatherMap API. Historical data is not available for free accounts, so we have to fetch current data from time to time to build a temperature history. In order to be able to invoke the API, sign up for a free account, get an API key, and copy and paste it into the owmApiKey variable value.

In order to invoke the lambda periodically we can use Amazon CloudWatch scheduling service. AWS Lambda console provides handy functionality to set up the invocation schedule:

Go to AWS Lambda console and click on the lambda.

Go to the "Triggers" tab and click on "Add trigger".

Select "CloudWatch Events - Schedule" as event source type.

Give a name for the rule and select a schedule expression, e.g. "rate(15 minutes)"

Make sure the "Enable trigger" checkbox is checked and click "Submit" button.
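The same schedule can be sketched with the AWS CLI; the rule name is arbitrary and the function ARN (including the account id) is a placeholder:

```shell
# Create a CloudWatch Events rule firing every 15 minutes
aws events put-rule \
    --name nucleo-fetch-weather-schedule \
    --schedule-expression "rate(15 minutes)"

# Point the rule at the lambda (ARN is a placeholder)
aws events put-targets \
    --rule nucleo-fetch-weather-schedule \
    --targets 'Id=1,Arn=arn:aws:lambda:us-east-1:123456789012:function:nucleoFetchWeather'

# Allow CloudWatch Events to invoke the function
aws lambda add-permission \
    --function-name nucleoFetchWeather \
    --statement-id cloudwatch-schedule \
    --action lambda:InvokeFunction \
    --principal events.amazonaws.com
```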

Generate Nucleo Data

The generateNucleoData lambda is an optional one. It emulates the Nucleo board activity by updating its shadow and generating markers. You can set up an API endpoint or invocations scheduler similar to previous lambdas.

Repeat steps from "getNucleoData" for API endpoint:

  1. Add "marker" parameter in "URL Query String Parameters".
  2. Copy and paste this JSON into the text area in "Body Mapping Templates":

    {
        "marker": $input.params('marker')
    }

Or repeat the steps from "nucleoFetchWeather" for a scheduler with a 1 minute rate.

NOTE: For intervals less than 1 minute, you could set up an EC2 instance and call the API endpoint from a custom script.

This lambda requires more privileges in order to publish to IoT data streams. Perform the following steps to grant it access:

Go to the IAM console and then to the "Roles" section:

  1. Select the role you generated for the lambdas
  2. Click "Create Role Policy", then "Custom Policy", then the "Select" button
  3. Give a name to the policy (for example, lambda-iot-publish-role) and copy and paste the following JSON into the "Policy Document" text area:

```
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "iot:*"
            ],
            "Resource": [
                "*"
            ]
        }
    ]
}
```

Amazon Cognito

We use Amazon Cognito to provide public read-only access to IoT data streams.

Configuration here is pretty simple. Open Cognito console, go to "Manage Federated Identities" and create new identity pool. Give it any name (for example, Nucleo Metrics) and set the "Enable access to unauthenticated identities" checkbox.
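The same pool can be sketched with the AWS CLI; the pool name is the example one used above:

```shell
# Create an identity pool that allows unauthenticated (public) identities
aws cognito-identity create-identity-pool \
    --identity-pool-name "Nucleo Metrics" \
    --allow-unauthenticated-identities
```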

Along with the pool an IAM role will be generated. This role will not grant access to our IoT topics by default. We need to extend it:

  1. In Cognito console go to the just created pool and click "Edit identity pool"

  2. Note the authenticated and unauthenticated role names. We will need them in the next step.

  3. Go to IAM console

  4. Go to Roles, find the roles from the previous step, and do the following for both:

  5. Click on the role

  6. Click "Create Role Policy"

  7. Click "Custom Policy" then "Select"

  8. Give it any name (for example, cognito-iot-publish-auth-role) and paste the following text into the "Policy Document" text area:

    {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Action": [
                    "iot:*"
                ],
                "Resource": [
                    "*"
                ]
            }
        ]
    }
    
  9. Click "Apply Policy"

Amazon S3

The web dashboard is a static web application which can be hosted on Amazon S3. Just create a bucket and configure it as described in this guide.
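As a sketch, the bucket setup can also be done with the AWS CLI. The bucket name below is a placeholder (S3 bucket names are globally unique), and `./dashboard` is an assumed path to the built dashboard files:

```shell
# Create the bucket (name is a placeholder; must be globally unique)
aws s3 mb s3://nucleo-dashboard-demo

# Enable static website hosting with index.html as the index document
aws s3 website s3://nucleo-dashboard-demo --index-document index.html

# Upload the built dashboard and make objects publicly readable
aws s3 sync ./dashboard s3://nucleo-dashboard-demo --acl public-read
```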