
Containerize your IoT application with AWS IoT Analytics

Overview

In an earlier blog post about IoT Analytics, we discussed how AWS IoT Analytics enables you to collect, visualize, process, query and store large amounts of time series data generated from your connected devices. In this blog post, I will show how you can use your custom application to interact with AWS IoT Analytics. Specifically, we will discuss –

  • How to bring your custom code to AWS IoT Analytics as a Docker image
  • How the Docker-based application can process batch data on a schedule

Scenario

We’ll use an example of a smart building in NYC that uses AWS IoT to become energy efficient. The building is equipped with sensor-enabled conference rooms, phone booths and offices. The telemetry data from various sensors across different floors & rooms is analyzed at regular intervals against the corporate meeting calendar data to identify whether the rooms are in use. If not, the lights and air conditioning in those rooms are turned off.

This is how the flow works –

  1. The telemetry data from various sensors in the building is published through a device gateway to an AWS IoT Core topic (for example: building/sensors) in the cloud. A sample publish call is sketched after this list.
  2. AWS IoT Core in turn uses the Rules engine to route the data to a data store in AWS IoT Analytics.
  3. The data in the data store is analyzed by a containerized application integrated with AWS IoT Analytics. In this blog, we will use it to determine how many rooms have the lights and AC turned on.
  4. The result set is then validated against the corporate calendar to determine whether any of the rooms are reserved for this time period or whether energy can be conserved by turning off the lights or AC. The output is stored in a landing zone for further actions.
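
For reference, a single telemetry record could be published to that topic with the AWS SDK as in the sketch below. The payload fields mirror the columns used by the SQL data set later in this post; the specific values and the use of boto3's iot-data client (rather than the device SDK on the gateway) are illustrative assumptions.

# Hypothetical sketch: publish one telemetry sample to the building/sensors topic.
# Payload fields mirror the columns queried later (floor_id, room_id, day, time,
# light, hvac); the values shown are made up for illustration.
import json
import boto3

# (You may need to pass your account-specific data endpoint via endpoint_url.)
iot_data = boto3.client("iot-data")

sample = {
    "floor_id": 3,
    "room_id": "conf-3a",
    "day": "2019-07-15",
    "time": "14:00",
    "light": 1,   # 1 = lights on
    "hvac": 1,    # 1 = air conditioning on
}

iot_data.publish(
    topic="building/sensors",   # topic referenced in step 1 above
    qos=1,
    payload=json.dumps(sample),
)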

Pre-Requisites

This blog assumes that the reader has completed the following steps prior to moving to the Solution section:

  • Create an SSH key pair to be able to log in to the EC2 instance
    An SSH key pair can be generated or imported in the AWS console under EC2 -> Key Pairs
  • Launch the CloudFormation template to provision the environment for this lab –
  • Check that the telemetry data is being published once the CloudFormation stack has been created successfully
    • Go to the IoT Core console and click Test in the left pane –
      • Subscription topic: workshop/telemetry
      • Click Subscribe to topic

Solution

Create SQL Data Set

A SQL data set is similar to a materialized view from a SQL database. We will create a SQL data set below that will store the telemetry data generated in the section above.

On the IoT Analytics console home page, in the left navigation pane, choose Analyze.

  • Click on Data sets → Create
  • Choose SQL Data Sets → Create SQL
  • Choose an ID for the SQL data set and select the data store:
    • ID → mydataset, Data Store Source → mydatastore, click Next
  • Paste the SQL below into the Query window, click Next
SELECT floor_id, room_id, day, time FROM mydatastore WHERE light = 1 AND hvac = 1
  • Keep the delta selection window as None (default), click Next
  • Schedule the data set to run every 15 minutes
    • Choose Minute of hour – 0, click Next
  • Keep the default retention for the data set (Indefinitely) and click Create Data set (an equivalent boto3 call is sketched after this list).
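
The same SQL data set can also be created through the API. The boto3 sketch below mirrors the console choices above; the cron expression is an assumed way to express the 15-minute schedule and should be adjusted if your selection differs.

# Minimal boto3 sketch of the console steps above (not the lab's provisioning code).
# Assumes an IAM identity with iotanalytics:CreateDataset permission.
import boto3

iota = boto3.client("iotanalytics")

iota.create_dataset(
    datasetName="mydataset",
    actions=[{
        "actionName": "mydataset_action",
        "queryAction": {
            "sqlQuery": (
                "SELECT floor_id, room_id, day, time "
                "FROM mydatastore WHERE light = 1 AND hvac = 1"
            )
        },
    }],
    # Assumed schedule: every 15 minutes, starting at minute 0 of the hour.
    triggers=[{"schedule": {"expression": "cron(0/15 * * * ? *)"}}],
    # Omitting retentionPeriod keeps data set contents indefinitely (the default).
)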

Query Data

Click on the Data Set you just created.

  • On the data set page, in the upper-right corner, choose Actions, and then choose Run now
  • It can take a few minutes for the data set to show results. Check for SUCCEEDED under the name of the data set in the upper left-hand corner. The Content section (left pane) contains the query results. The same steps can be driven programmatically, as sketched below.
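
A minimal boto3 sketch of that run-and-check cycle (not part of the lab scripts):

# Hedged sketch: trigger a data set run and poll until it completes.
import time
import boto3

iota = boto3.client("iotanalytics")

run = iota.create_dataset_content(datasetName="mydataset")
version_id = run["versionId"]

while True:
    content = iota.get_dataset_content(datasetName="mydataset", versionId=version_id)
    state = content["status"]["state"]          # CREATING, SUCCEEDED or FAILED
    if state != "CREATING":
        break
    time.sleep(15)

print(state)
for entry in content.get("entries", []):
    # Each entry carries a pre-signed URI from which the result can be downloaded.
    print(entry.get("entryName"), entry["dataURI"])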

Create Custom Container for Analysis

Please follow the instructions below to create the Docker image with the custom application code (a sketch of what the packaged analysis code might look like follows step 8) –

1. SSH to the EC2 instance (copy the command from the CloudFormation Outputs tab), navigate to the Docker directory, and upload the calendar file to the S3 bucket:

cd ~/docker-setup
aws s3 cp ./calendar.csv s3://<paste-s3-bucket-name-from-cloudformation-output> 

2. Build the docker image:

docker build -t container-app-ia .

3. You should see the new image in your local Docker image list. Verify it by running:

docker image ls | grep container-app-ia

4. Create a new repository in ECR:

aws ecr create-repository --repository-name container-app-ia

Please copy the repositoryUri from the output for use in steps 7 and 8.

5. Get the login command for your Docker environment:

aws ecr get-login --no-include-email

6. Copy the output and run it. The command should look something like:

docker login -u AWS -p <password> https://<your-aws-account-id>.dkr.ecr.amazonaws.com

7. Tag the image you created with the ECR repository URI:

docker tag container-app-ia:latest <<paste repositoryUri copied earlier>>:latest

8. Push the image to ECR:

docker push <<paste repositoryUri copied earlier>>
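
For context on what the image actually runs: the analysis code follows this general pattern: read the SQL data set results, pull calendar.csv from S3, join the two, and write output.csv for AWS IoT Analytics to pick up. The sketch below is illustrative only and is not the lab's actual application; the local file paths, the calendar's reserved column, and the pandas-based join are assumptions.

# Illustrative sketch of the kind of analysis code an image like container-app-ia
# could package. Paths, the calendar's 'reserved' column, and the join logic are
# assumptions; the actual application ships with the lab's ~/docker-setup directory.
import boto3
import pandas as pd

S3_BUCKET = "<your-s3-bucket>"     # supplied via the inputDataS3BucketName variable
CALENDAR_KEY = "calendar.csv"      # supplied via the inputDataS3Key variable

# Rows produced by the SQL data set (floor_id, room_id, day, time). How the data
# set content reaches the container is configured via the input variables in the
# next section; here we simply assume a local CSV.
telemetry = pd.read_csv("dataset.csv")

# Corporate calendar uploaded to S3 in step 1 above.
boto3.client("s3").download_file(S3_BUCKET, CALENDAR_KEY, "/tmp/calendar.csv")
calendar = pd.read_csv("/tmp/calendar.csv")

# A room whose lights and HVAC are on but that has no reservation can be turned off.
merged = telemetry.merge(calendar, on=["floor_id", "room_id", "day", "time"], how="left")
merged["action"] = merged["reserved"].fillna(0).map(
    lambda reserved: "keep_on" if reserved == 1 else "turn_off"
)

# Written out and exposed through the resulturi output variable.
merged.to_csv("output.csv", index=False)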

Create the Container Data Set

A container data set allows you to automatically run your analysis tools and generate results. It brings together a SQL data set as input, a Docker container with your analysis tools and needed library files, input and output variables, and an optional schedule trigger. The input and output variables tell the executable image where to get the data and store the results.

On the IoT Analytics console home page, in the left navigation pane, choose Analyze.

  • Click on Data Sets → Create
  • Choose Container Data Sets → Create Container
  • Choose a unique ID for the container data set → container_dataset, click Next
  • Choose the option → Link an existing data set’s query → Link
  • Select a trigger for your analysis → Choose mydataset → the Schedule will be populated automatically, click Next
  • Select from your ECR repository → Choose the repository container-app-ia
  • Select your image → Choose the image with the latest tag

  • Configure the input variables (as below) → Click Next:

    Name                   Type             Value
    datasetv               Content version  mydataset
    resulturi              Output file      output.csv
    inputDataS3BucketName  String           <<paste s3 bucket name from cloudformation output>>
    inputDataS3Key         String           calendar.csv
  • Select a Role → Choose the IAM Role → search & select iotAContainerRole
  • Configure the capacity for the container:
    • Compute Resource: 4 vCPUs and 16 GiB Memory
    • Volume size (GB): 1
  • Configure the retention of your results → keep the default (Indefinitely) and click Create Data set (a boto3 equivalent of these settings is sketched after this list)
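
For reference, the same container data set can be defined through the API. The boto3 sketch below mirrors the console choices above; the account ID, Region, image URI and role ARN are placeholders rather than values from this lab.

# Hedged boto3 sketch of the container data set configured above.
# <account-id>, <region> and the bucket name are placeholders to substitute.
import boto3

iota = boto3.client("iotanalytics")

iota.create_dataset(
    datasetName="container_dataset",
    actions=[{
        "actionName": "container_action",
        "containerAction": {
            "image": "<account-id>.dkr.ecr.<region>.amazonaws.com/container-app-ia:latest",
            "executionRoleArn": "arn:aws:iam::<account-id>:role/iotAContainerRole",
            "resourceConfiguration": {
                "computeType": "ACU_1",       # 4 vCPUs / 16 GiB, per the console choice
                "volumeSizeInGB": 1,
            },
            "variables": [
                {"name": "datasetv",
                 "datasetContentVersionValue": {"datasetName": "mydataset"}},
                {"name": "resulturi",
                 "outputFileUriValue": {"fileName": "output.csv"}},
                {"name": "inputDataS3BucketName",
                 "stringValue": "<your-s3-bucket>"},
                {"name": "inputDataS3Key", "stringValue": "calendar.csv"},
            ],
        },
    }],
    # Run whenever the SQL data set mydataset produces new content.
    triggers=[{"dataset": {"name": "mydataset"}}],
)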

Query & Validate Container Data Set

On the IoT Analytics console home page, in the left navigation pane, choose Analyze.

  • Select Data Set → container_dataset
  • On the data set page, in the upper-right corner, choose Actions, and then choose Run now
  • It can take a few minutes for the data set to show results. Check for SUCCEEDED under the name of the data set in the upper left-hand corner. Check that the output file exists under the Content tab (left pane) or in your S3 bucket, and download it.

The output data, once downloaded, should contain the processed result set.
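
If you would rather fetch the result from code than from the console, the pre-signed dataURI returned for the latest content can be downloaded directly. A minimal sketch (assuming the latest version is the one you want):

# Sketch: download the container data set result via its pre-signed dataURI.
import urllib.request
import boto3

iota = boto3.client("iotanalytics")
content = iota.get_dataset_content(datasetName="container_dataset", versionId="$LATEST")

for entry in content["entries"]:
    # Fall back to output.csv if the entry name is empty.
    filename = entry.get("entryName") or "output.csv"
    urllib.request.urlretrieve(entry["dataURI"], filename)
    print("downloaded", filename)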

Thus, you have completed the workflow to ingest telemetry data from your smart building, enrich the data using your custom Docker code to gain insight into the availability of rooms, and store the processed data in a landing zone for further actions such as turning off the lights and AC in the free rooms.
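
To act on that landing-zone file automatically, a hypothetical AWS Lambda handler triggered by the S3 upload could update the device shadows of the unreserved rooms, along the lines of the sketch below. The thing-naming convention and the action column are assumptions carried over from the earlier container sketch; the conclusion discusses this pattern further.

# Hypothetical Lambda sketch: on upload of output.csv, turn off light and HVAC
# for rooms flagged "turn_off" by updating each room's device shadow.
# Thing names and CSV columns are assumptions, not part of this lab.
import csv
import io
import json
import boto3

s3 = boto3.client("s3")
iot_data = boto3.client("iot-data")

def handler(event, context):
    record = event["Records"][0]["s3"]
    obj = s3.get_object(Bucket=record["bucket"]["name"], Key=record["object"]["key"])
    rows = csv.DictReader(io.StringIO(obj["Body"].read().decode("utf-8")))

    for row in rows:
        if row.get("action") != "turn_off":
            continue
        # Assumed naming convention for the things registered in AWS IoT Core.
        thing_name = f"room-{row['floor_id']}-{row['room_id']}"
        iot_data.update_thing_shadow(
            thingName=thing_name,
            payload=json.dumps({"state": {"desired": {"light": 0, "hvac": 0}}}),
        )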

Clean Up

Please follow the instructions below to clean up the resources created as part of this blog –

  • SSH to the EC2 instance & navigate to the clean-up directory:
cd ~/clean-up
./clean-up.sh
Enter name of the device > smart-building
Enter device type > sensors
Enter S3 bucket > <<paste your s3 bucket name>>
  • Navigate to the AWS Console -> Choose CloudFormation -> Select and delete the stack created earlier
  • Navigate to the AWS Console -> Choose ECS -> Select Repositories (left pane) -> Delete the ECR repository “container-app-ia” (a scripted equivalent of these console steps is sketched below)
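
The console clean-up steps above can also be scripted. A small boto3 sketch, assuming the data sets keep the names used in this post and that you substitute your own stack name:

# Optional sketch: scripted equivalent of the console clean-up steps above.
# The stack name is a placeholder; substitute the name you chose at launch.
import boto3

# The two data sets were created manually in the console, so remove them
# explicitly if clean-up.sh has not already done so.
iota = boto3.client("iotanalytics")
iota.delete_dataset(datasetName="container_dataset")
iota.delete_dataset(datasetName="mydataset")

boto3.client("ecr").delete_repository(repositoryName="container-app-ia", force=True)
boto3.client("cloudformation").delete_stack(StackName="<your-stack-name>")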

Troubleshooting

  • If the SQL data set execution fails, run the command below to check the error for the respective version –
    • SSH to the EC2 instance and run:
aws iotanalytics list-dataset-contents --dataset-name mydataset
  • If the container data set execution fails, you can check the logs in –
    • CloudWatch -> Log Groups → /aws/sagemaker/TrainingJobs
  • For any other issues, please refer here – Link

Conclusion

AWS IoT Analytics enabled you to use your custom code in a container to analyze, process and enrich sensor data with calendar data. You now have the output data files in Amazon S3, where you can perform analytics, visualization or device shadow updates. A common pattern is to trigger an AWS Lambda function once a file is uploaded to Amazon S3, as sketched earlier in the Query & Validate section. The Lambda function can analyze the file to identify the free rooms and update the device shadow for the respective devices registered in AWS IoT Core, which can turn off the lights and AC for those rooms in the building. I hope you found the information in this post helpful. Please feel free to leave questions or other feedback in the forum.
