Tutorial: Shipping AWS Kinesis Data Stream Logs to Logz.io


Kinesis is a managed, high-performance, large-capacity service for real-time processing of live streaming data. Prominent users include Netflix, Comcast, and Major League Baseball. It is designed to ingest data from multiple sources at the same time and to scale processing across EC2 instances. AWS Kinesis logs come from its Data Streams feature, one of the two main Kinesis services along with Kinesis Data Firehose (note that there are also Kinesis Data Analytics and Kinesis Video Streams). It is modeled after, and is designed to be an alternative to, Apache Kafka.

While Kafka is considered more durable (as an open-source application, its configuration is ultimately up to the developer), it also requires you to manage clusters manually, whereas Amazon Kinesis is a fully managed AWS service. With that in mind, Kinesis isn't for on-prem applications. So if you're looking to save time and personnel resources, and you've already gone all-in on the cloud (and AWS in particular), Kinesis might, and probably should, be a better option than Kafka.

As a fully managed service, Kinesis limits data retention: the default is 24 hours, with a configurable maximum of seven days. Amazon manages all uptime, and all data going through Data Streams gets automatic, built-in cross-replication.
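
To make that retention setting concrete, extending it past the default is a single API call. Here is a minimal boto3 sketch; the stream name my-stream is just an example:

import boto3

kinesis = boto3.client("kinesis")

# Raise retention from the 24-hour default to the 7-day maximum (168 hours).
kinesis.increase_stream_retention_period(
    StreamName="my-stream",   # example stream name
    RetentionPeriodHours=168,
)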

Producers send data to be ingested into AWS Kinesis Data Streams. Each stream is divided into shards, and each shard supports writes of up to 1 MB per second or 1,000 records per second. Output is then sent onward to Consumers.
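
To make the producer side concrete, here is a minimal boto3 sketch that writes one record to a stream; the stream name and partition key are just examples, and the partition key determines which shard the record lands on:

import json

import boto3

kinesis = boto3.client("kinesis")

response = kinesis.put_record(
    StreamName="my-stream",   # example stream name
    Data=json.dumps({"level": "INFO", "msg": "user signed in"}).encode("utf-8"),
    PartitionKey="user-42",   # routes the record to a shard
)
print(response["ShardId"], response["SequenceNumber"])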

Shipping AWS Kinesis Logs to Logz.io

1. Create a new Lambda function

This Lambda function will consume a Kinesis data stream and send the logs to Logz.io in bulk over HTTP.
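
The actual shipper code lives in the GitHub project you'll clone in step 2. Purely for orientation, the sketch below shows the general shape of such a handler, assuming the standard Logz.io bulk HTTP listener on port 8071 and one log line per Kinesis record (the listener host varies by region; listener.logz.io is used here as an example):

import base64
import os
import urllib.request

# TOKEN and TYPE come from the environment variables set in step 3.
LISTENER = "https://listener.logz.io:8071/?token={}&type={}".format(
    os.environ["TOKEN"], os.environ.get("TYPE", "logzio_kinesis_stream")
)

def lambda_handler(event, context):
    # Kinesis delivers each record's payload base64-encoded.
    lines = [
        base64.b64decode(record["kinesis"]["data"]).decode("utf-8")
        for record in event["Records"]
    ]
    # The bulk listener accepts newline-delimited log lines in a single POST.
    request = urllib.request.Request(
        LISTENER, data="\n".join(lines).encode("utf-8"), method="POST"
    )
    with urllib.request.urlopen(request) as response:
        return {"status": response.status, "shipped": len(lines)}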

Open the AWS Lambda Console and click Create Function. Choose Author from scratch, and use this information:

Name: We suggest adding the log type to the name (in this case, obviously, “Kinesis”), but any name is acceptable.

Runtime: Choose Python 3.7.

Role: Use a role that has AWSLambdaKinesisExecutionRole permissions (a scripted way to create one is sketched at the end of this step).

Click Create Function (bottom right corner of the page). After a few moments, you’ll see configuration options for your Lambda function. You’ll need this page later on, so keep it open.
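
If you prefer to script the role from the Role bullet above instead of creating it in the console, a minimal boto3 sketch might look like this; the role name is an example. The role must be assumable by Lambda and carry the AWSLambdaKinesisExecutionRole managed policy:

import json

import boto3

iam = boto3.client("iam")

# Trust policy letting the Lambda service assume the role.
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"Service": "lambda.amazonaws.com"},
        "Action": "sts:AssumeRole",
    }],
}

role = iam.create_role(
    RoleName="logzio-kinesis-shipper-role",  # example role name
    AssumeRolePolicyDocument=json.dumps(trust_policy),
)
iam.attach_role_policy(
    RoleName="logzio-kinesis-shipper-role",
    PolicyArn="arn:aws:iam::aws:policy/service-role/AWSLambdaKinesisExecutionRole",
)
print(role["Role"]["Arn"])  # reference this ARN when creating the function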

2. Zip the source files

Clone the Kinesis Stream Shipper – Lambda project from GitHub to your computer, and zip the Python files in the src/ folder.

git clone https://github.com/logzio/logzio_aws_serverless.git \
&& cd logzio_aws_serverless/python3/kinesis/ \
&& mkdir -p dist/python3/shipper; cp -r ../shipper/shipper.py dist/python3/shipper \
&& cp src/lambda_function.py dist \
&& cd dist/ \
&& zip logzio-kinesis lambda_function.py python3/shipper/*

You’ll upload logzio-kinesis.zip in the next step.

3. Upload the zip file and set environment variables

In the Function code section of Lambda, find the Code entry type list. Choose Upload a .ZIP file from this list.

Click Upload, and choose the zip file you created earlier (logzio-kinesis.zip).

In the Environment variables section, set your Logz.io account token, URL, and log type, and any other variables that you need to use.

TOKEN: <>
URL: https://<>:8071
TYPE: logzio_kinesis_stream
FORMAT: text # or json
COMPRESS: false

Notes: FORMAT can be text or json. Set COMPRESS to true if you want to compress logs before sending them.
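
If you'd rather do this step from a script than the console, a rough boto3 equivalent follows. The function name is an example, and the TOKEN and URL placeholders are the same ones listed above:

import boto3

lambda_client = boto3.client("lambda")

# Upload the zip built in step 2.
with open("logzio-kinesis.zip", "rb") as archive:
    lambda_client.update_function_code(
        FunctionName="logzio-kinesis-shipper",  # example function name
        ZipFile=archive.read(),
    )

# Set the environment variables listed above (placeholders left as-is).
lambda_client.update_function_configuration(
    FunctionName="logzio-kinesis-shipper",
    Environment={"Variables": {
        "TOKEN": "<>",
        "URL": "https://<>:8071",
        "TYPE": "logzio_kinesis_stream",
        "FORMAT": "text",
        "COMPRESS": "false",
    }},
)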

4. Configure the function

In Basic settings, we recommend 512 MB of memory and a timeout of 1:00 (one minute).
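
Scripted, these two settings are one more call against the same example function name from the previous sketch; note that Timeout is given in seconds:

import boto3

boto3.client("lambda").update_function_configuration(
    FunctionName="logzio-kinesis-shipper",  # example function name
    MemorySize=512,  # MB
    Timeout=60,      # seconds (1:00)
)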

5. Set the Kinesis event trigger

In the Add triggers list to the left of the Designer panel, choose Kinesis. Below the Designer, find the Configure triggers panel and choose the Kinesis stream you want your Lambda function to watch. Then click Add, and Save at the top of the page.
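
The console trigger above corresponds to an event source mapping; here is a boto3 sketch with a placeholder stream ARN and the example function name used earlier:

import boto3

boto3.client("lambda").create_event_source_mapping(
    EventSourceArn="arn:aws:kinesis:us-east-1:123456789012:stream/my-stream",  # placeholder ARN
    FunctionName="logzio-kinesis-shipper",  # example function name
    StartingPosition="LATEST",  # only read records written after the mapping exists
    BatchSize=100,              # records per invocation
)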

6. Check that your logs have arrived in Logz.io

Logs will not instantaneously appear in your Logz.io account (nor in the open-source version of ELK, we might add). After a few minutes, they should appear in Kibana.


