
Push VPC Flow Logs to Splunk Using Kinesis Firehose

Before you begin, be sure to:

Set up a Splunk HTTP Event Collector (HEC) instance that is reachable, and have its endpoint and authentication token on hand.

Start creating the Data Firehose delivery stream

1.    Create your delivery stream. For Source, choose Direct PUT or other sources.

2.    Choose Next.

Configure record transformation with AWS Lambda

1.    Configure record transformation.
Note: Be sure to choose Enabled for Record transformation under Transform source records with AWS Lambda. This option is required because CloudWatch Logs delivers log data as compressed gzip payloads, which Amazon Kinesis must decompress before the records are usable.

2.    For Lambda function, select Create new.

3.    On the Choose Lambda blueprint pop-up window that appears, for Lambda blueprint, choose Kinesis Firehose CloudWatch Logs Processor.

4.    Select the new tab that opens in your browser to create the new Lambda function.
For Name, enter a name for the Lambda function.
For Role, select Create a custom role.

5.    Select the new tab that opens in your browser to create a new AWS Identity and Access Management (IAM) role.
For Role Name, be sure that the name is lambda_basic_execution.

6.    Choose Allow to create the role and return to the Lambda function configuration page.

7.    Choose Create function, and then wait for the function to be created.

8.    Increase the Timeout from the default 3 seconds to 1 minute to prevent the function from timing out. (A CLI equivalent is sketched after these steps.)

9.    Choose Save.
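If you prefer to script the timeout change from step 8, the AWS CLI can do the same thing. A minimal sketch; YOUR-FUNCTION-NAME is a placeholder for the name you entered in step 4:

    # Raise the transformation function's timeout to 60 seconds (default is 3).
    aws lambda update-function-configuration \
        --function-name YOUR-FUNCTION-NAME \
        --timeout 60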

Finish creating the Data Firehose delivery stream

1.    Open the Amazon Kinesis console.

2.    In the navigation pane, select Data Firehose.

3.    For your delivery stream, choose Lambda function.
Choose the name of your new AWS Lambda function from the drop-down menu.
For Destination, choose Splunk.
Enter the Splunk HEC details, including the Splunk HEC endpoint that you created before. The Splunk HEC endpoint must be terminated with a valid SSL certificate. Use the matching DNS hostname to connect to your HEC endpoint. The format for the cluster endpoint is https://YOUR-ENDPOINT.splunk.com:8088.
For Splunk endpoint type, choose Raw endpoint, and then enter the authentication token. (A quick curl check for the HEC endpoint is sketched after these steps.)

4.    Choose Next.

5.    (Optional) Create an S3 backup for failed events or all events by choosing an existing bucket or creating a new bucket. Be sure to configure S3-related settings such as buffer conditions, compression and encryption settings, and error logging options in the delivery stream wizard.

6.    Under IAM role, choose Create New.

7.    In the tab that opens, enter a Role name, and then choose Allow.

8.    Choose Next.

9.    Choose Create delivery stream.
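Before routing real traffic, you can confirm that the HEC endpoint and token from step 3 work by posting a test event to Splunk's raw collector endpoint with curl. A minimal sketch; the endpoint and token are placeholders:

    # Send a test event to the HEC raw endpoint.
    curl "https://YOUR-ENDPOINT.splunk.com:8088/services/collector/raw" \
        -H "Authorization: Splunk YOUR-HEC-TOKEN" \
        -d "hello from the Firehose setup"

A healthy collector answers with {"text":"Success","code":0}.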

Configure VPC flow logs

If you already have a VPC flow log that you want to use, skip to the next section.

1.    Open the CloudWatch console.

2.    In the navigation pane, choose Logs.

3.    For Actions, select Create log group.

4.    Enter a Log Group Name.

5.    Choose Create log group.

6.    Open the Amazon VPC console.

7.    In the navigation pane under Virtual Private Cloud, choose Your VPCs.

8.    In the content pane, select your VPC.

9.    Choose the Flow logs view.

10.   Choose Create flow log.
For Filter, select All.
For Destination log group, select the log group you just created.
For IAM role, select an IAM role that allows your VPC to publish logs to CloudWatch.
Note:
If you don't have an appropriate IAM role, choose Set Up Permissions under IAM role. Choose Create a new IAM role. Leave the default settings selected. Choose Allow to create and associate the role VPCFlowLogs with the destination log group.

11.   Choose Create to create your VPC flow log. (CLI equivalents for creating the log group and the flow log are sketched after this list.)

12.   Establish a real-time feed from the log group to your delivery stream.
For AWS Lambda instructions, see Accessing Amazon CloudWatch Logs for AWS Lambda.
For Amazon Elasticsearch Service (Amazon ES) instructions, see Streaming CloudWatch Logs Data to Amazon Elasticsearch Service.
For Kinesis Data Firehose, create a CloudWatch Logs subscription in the AWS Command Line Interface (AWS CLI) using the following instructions.
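The console steps above can also be scripted. A minimal AWS CLI sketch of steps 3 through 11; the log group name, VPC ID, and role ARN are placeholders:

    # Create the destination log group (steps 3-5).
    aws logs create-log-group --log-group-name VPCFlowLogs

    # Create the flow log for your VPC (steps 7-11). The role must allow the
    # VPC Flow Logs service to publish to CloudWatch Logs.
    aws ec2 create-flow-logs \
        --resource-type VPC \
        --resource-ids vpc-0123456789abcdef0 \
        --traffic-type ALL \
        --log-group-name VPCFlowLogs \
        --deliver-logs-permission-arn arn:aws:iam::YOUR-ACCOUNT-ID:role/VPCFlowLogs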

Create a CloudWatch Logs subscription

1.    Grant CloudWatch Logs permission to publish to your Kinesis Data Firehose stream by creating an IAM role with the correct permissions.

2.    Sign in to the AWS CLI.

3.    Create your trust policy (such as TrustPolicyforCWLToFireHose.json) using the following example JSON file. Be sure to replace YOUR-RESOURCE-REGION with your resource's AWS Region.
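The trust policy below is the standard one AWS documents for allowing CloudWatch Logs to assume a role:

    {
      "Statement": {
        "Effect": "Allow",
        "Principal": { "Service": "logs.YOUR-RESOURCE-REGION.amazonaws.com" },
        "Action": "sts:AssumeRole"
      }
    }

With the trust policy saved, the role and the subscription filter can be created from the AWS CLI. A sketch with placeholder names and ARNs; it assumes you also attach a permissions policy to the role that allows firehose:PutRecord on your delivery stream:

    # Create the role that CloudWatch Logs will assume.
    aws iam create-role \
        --role-name CWLtoKinesisFirehoseRole \
        --assume-role-policy-document file://TrustPolicyforCWLToFireHose.json

    # Subscribe the flow log group to the delivery stream.
    aws logs put-subscription-filter \
        --log-group-name "VPCFlowLogs" \
        --filter-name "Destination" \
        --filter-pattern "" \
        --destination-arn "arn:aws:firehose:YOUR-RESOURCE-REGION:YOUR-ACCOUNT-ID:deliverystream/YOUR-STREAM-NAME" \
        --role-arn "arn:aws:iam::YOUR-ACCOUNT-ID:role/CWLtoKinesisFirehoseRole"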
