Analyze logs with Datadog using Amazon Kinesis Data Firehose HTTP endpoint delivery : idk.dev

July 31, 2020


Amazon Kinesis Data Firehose now provides an easy-to-configure and straightforward process for streaming data to a third-party service for analysis, including logs from AWS services. Due to the varying formats and high volume of this data, it’s a complex challenge to identify and correlate key event details and data points to fix issues and improve your applications’ performance. This new feature enables you to easily and quickly send logs from AWS services directly to Datadog to perform this analysis. Kinesis Data Firehose is fully managed by AWS, so you don’t need to maintain any additional infrastructure or forwarding configurations for streaming logs.

In this post, we describe the Kinesis Data Firehose and related Datadog integration. We also show you how to ingest logs from multiple AWS services into Datadog.

Kinesis Data Firehose delivers real-time streaming data to destinations like Amazon Simple Storage Service (Amazon S3), Amazon Redshift, or Amazon Elasticsearch Service (Amazon ES), and now supports delivering streaming data to Datadog. Kinesis Data Firehose provides built-in, fully managed error handling, transformation, conversion, aggregation, and compression functionality, so you don’t need to write applications to handle these complexities. These capabilities let you monitor and analyze data from any source and use it to deliver operational intelligence and improve business performance.

Datadog is an AWS Partner Network (APN) Advanced Technology Partner with AWS Competencies in DevOps, Migration, Containers, and Microsoft Workloads, and enables you to easily explore and analyze logs to gain deeper insights into the state of your applications and AWS infrastructure. You can analyze all your AWS service logs while storing only the ones you need, and generate metrics from aggregated logs to uncover and alert on trends in your AWS services.

How it works

To process and deliver AWS service logs to Datadog, you can implement the following architecture.

Using this solution, you can configure AWS Web Application Firewall (AWS WAF) or Amazon API Gateway to deliver log events directly to Kinesis Data Firehose. If you’re using another service that delivers logs to Amazon CloudWatch Logs, you can use CloudWatch Logs subscriptions to feed log events from CloudWatch Logs and have them delivered to a Firehose delivery stream. By configuring Kinesis Data Firehose with the Datadog API as a destination, you can deliver the logs to Datadog for further analysis. You can also back up events to an S3 bucket to help prevent data loss, either for all log events or only for those exceeding a specified retry duration. For more information about adjusting your backup based on the amount of log data you’re streaming, see Amazon Kinesis Data Firehose Quota.
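If you take the CloudWatch Logs route, the subscription can also be created programmatically. The following boto3 sketch assumes a hypothetical log group, delivery stream, and IAM role that allows CloudWatch Logs to write to Kinesis Data Firehose; substitute your own names and ARNs.

```python
# Sketch: subscribe a CloudWatch Logs log group to a Firehose delivery stream.
# The log group, delivery stream, and role below are placeholders, not defaults.
import boto3

logs = boto3.client("logs", region_name="us-east-1")

logs.put_subscription_filter(
    logGroupName="/aws/lambda/my-function",   # example log group to stream
    filterName="datadog-firehose",            # any descriptive filter name
    filterPattern="",                         # empty pattern forwards all events
    destinationArn="arn:aws:firehose:us-east-1:123456789012:deliverystream/datadog-logs",
    roleArn="arn:aws:iam::123456789012:role/CWLtoFirehoseRole",  # role CloudWatch Logs assumes to write to Firehose
)
```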

Creating your delivery stream

Your new delivery stream needs an API key to send logs to Datadog. You can find your key in the API section of your Datadog account.

If you don’t already have an account, visit the Datadog website to sign up for a free 14-day trial.
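Before wiring the key into a delivery stream, you can sanity-check it against Datadog’s key validation endpoint. This is an optional sketch; the datadoghq.com site shown here is an assumption, so use your account’s Datadog site if it differs.

```python
# Sketch: confirm a Datadog API key is valid before using it in the stream.
# The API key value and the Datadog site are placeholders.
import requests

DD_API_KEY = "<your-datadog-api-key>"

resp = requests.get(
    "https://api.datadoghq.com/api/v1/validate",
    headers={"DD-API-KEY": DD_API_KEY},
)
print(resp.status_code, resp.json())  # expect 200 and a "valid": true response
```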

To create your delivery stream, complete the following steps:

  1. On the Kinesis Data Firehose console, choose Create delivery stream.
  2. For Delivery stream name, enter a name.
  3. For Source, choose Direct PUT or other sources.
  4. Choose Next.

  5. On the Process records page, keep all settings at their default and choose Next.
  6. On the Choose a destination page, for Destination, choose Third-party partner.
  7. From the Third-party partner drop-down menu, choose Datadog.

  8. For HTTP endpoint URL, choose the desired HTTP endpoint based on your Region and Datadog account configuration.

For more information, see Logs Guides.

  9. For API key, enter your Datadog API key.

This allows your delivery stream to publish to the endpoint.

  10. For Content encoding, select GZIP.

  11. Accept the defaults for Retry duration.
  12. For S3 backup mode, select Failed data only.
  13. For S3 bucket, enter the S3 bucket for delivery of log events that exceeded the retry duration.

Alternatively, you can create a new bucket by choosing Create new.

  14. Choose Next.
  15. For Datadog buffer conditions, accept the default Datadog and Amazon S3 buffer conditions for your stream.

  16. In the IAM role section, configure permissions for your delivery stream by choosing Create or update IAM role.
  17. Choose Next.
  18. Review your settings and choose Create delivery stream.

Logs subscribed to your delivery stream are now available for analysis with Datadog.
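If you’d rather script the stream than click through the console, the steps above map onto a single CreateDeliveryStream call. The boto3 sketch below mirrors the console choices (Direct PUT source, Datadog HTTP endpoint, GZIP encoding, failed-data-only S3 backup); the stream name, ARNs, endpoint URL, and buffer values are illustrative placeholders, not service defaults.

```python
# Sketch: the delivery stream from the walkthrough, created with boto3 instead
# of the console. All names, ARNs, the endpoint URL, and buffer values are
# placeholders; choose the Datadog endpoint that matches your Region and site.
import boto3

firehose = boto3.client("firehose", region_name="us-east-1")

firehose.create_delivery_stream(
    DeliveryStreamName="datadog-logs",
    DeliveryStreamType="DirectPut",                   # "Direct PUT or other sources"
    HttpEndpointDestinationConfiguration={
        "EndpointConfiguration": {
            "Name": "Datadog",
            "Url": "https://aws-kinesis-http-intake.logs.datadoghq.com/v1/input",  # example endpoint
            "AccessKey": "<your-datadog-api-key>",    # the API key from step 9
        },
        "RequestConfiguration": {"ContentEncoding": "GZIP"},
        "RetryOptions": {"DurationInSeconds": 300},   # example retry duration
        "S3BackupMode": "FailedDataOnly",
        "BufferingHints": {"SizeInMBs": 4, "IntervalInSeconds": 60},  # example buffer conditions
        "RoleARN": "arn:aws:iam::123456789012:role/firehose-delivery-role",
        "S3Configuration": {
            "RoleARN": "arn:aws:iam::123456789012:role/firehose-delivery-role",
            "BucketARN": "arn:aws:s3:::my-firehose-backup-bucket",  # backup bucket for failed data
        },
    },
)
```

The endpoint URL and buffer values should match your Region, Datadog site, and log volume, and the key passed as AccessKey is the same Datadog API key you entered in the console walkthrough.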

Conclusion

Datadog allows you to monitor your servers, containers, databases, and third-party services to ensure high availability, optimize performance, and troubleshoot issues. With Kinesis Data Firehose HTTP endpoint delivery, AWS service log data is available for analysis quickly. This allows you to identify issues and performance bottlenecks with your applications by correlating logs from AWS services such as AWS CloudTrail, Amazon Relational Database Service (Amazon RDS), and your AWS Lambda functions with metrics and traces. By taking advantage of a fully managed AWS service, this delivery method provides high availability and scalability for your integration. For more information about configuring this integration directly via AWS CloudFormation, see Log collection.

 


About the Authors

Ilan Rabinovitch is Vice President of Product and Community at Datadog.

Kalyan Janaki is Senior Technical Account Manager with AWS. Kalyan enjoys working with customers and helping them migrate their workloads to the cloud. In his spare time, he tries to keep up with his 2-year-old.

Bryan Hopkins is a Senior Technical Account Manager with AWS. Bryan helps large enterprises build secure and scalable solutions on the AWS cloud. In his spare time, he enjoys reading classic science fiction and building quirky creations with his family.


