Using Amazon SQS Dead Letter Queues
Jason Fulghum recently posted a blog entry about using Amazon SQS dead letter queues with the AWS SDK for Java, and I thought his post would be interesting for .NET developers as well. Here is Jason’s post with the code replaced with the C# equivalent.
Amazon SQS recently introduced support for dead letter queues. This feature is an important tool to help your applications consume messages from SQS queues in a more resilient way.
Dead letter queues allow you to set a limit on the number of times a message in a queue is processed. Consider an application that consumes messages from a queue and does some sort of processing based on the message. A bug in your application may only be triggered by certain types of messages or when working with certain data in your application. If your application receives one of these messages, it won’t be able to successfully process it and remove it from the queue. Instead, your application will continue to try to process the message again and again. While this message is being continually retried, your queue is likely filling up with other messages, which your application is unable to process because it’s stuck repeatedly processing the bad message.
Amazon SQS dead letter queues enable you to configure your application so that if it can’t successfully process a problematic message and remove it from the queue, that message will be automatically removed from your queue and delivered to a different SQS queue that you’ve designated as a dead letter queue. Another part of your application can then periodically monitor the dead letter queue and alert you if it contains any messages, which you can debug separately.
Using Amazon SQS dead letter queues is easy. You just need to configure a RedrivePolicy on your queue to specify when messages are delivered to a dead letter queue and to which dead letter queue they should be delivered. You can use the AWS Management Console, or you can access the Amazon SQS API directly with the AWS SDK for .NET.
// First, we'll need an Amazon SQS client object.
IAmazonSQS sqs = new AmazonSQSClient(RegionEndpoint.USWest2);

// Create two new queues:
//  one main queue for our application messages
//  and another to use as our dead letter queue
string qUrl = sqs.CreateQueue(new CreateQueueRequest()
{
    QueueName = "MyApplicationQueue"
}).QueueUrl;

string dlqUrl = sqs.CreateQueue(new CreateQueueRequest()
{
    QueueName = "MyDeadLetterQueue"
}).QueueUrl;

// Next, we need to get the ARN (Amazon Resource Name) of our dead
// letter queue so we can configure our main queue to deliver messages to it.
IDictionary<string, string> attributes = sqs.GetQueueAttributes(new GetQueueAttributesRequest()
{
    QueueUrl = dlqUrl,
    AttributeNames = new List<string>() { "QueueArn" }
}).Attributes;
string dlqArn = attributes["QueueArn"];

// The last step is setting a RedrivePolicy on our main queue to configure
// it to deliver messages to our dead letter queue if they haven't been
// successfully processed after five attempts.
string redrivePolicy = string.Format(
    "{{\"maxReceiveCount\":\"{0}\", \"deadLetterTargetArn\":\"{1}\"}}",
    5, dlqArn);

sqs.SetQueueAttributes(new SetQueueAttributesRequest()
{
    QueueUrl = qUrl,
    Attributes = new Dictionary<string, string>()
    {
        { "RedrivePolicy", redrivePolicy }
    }
});
There’s also a new operation in the Amazon SQS API to help you identify which of your queues are set up to deliver messages to a specific dead letter queue. If you want to know what queues are sending messages to a dead letter queue, just use the IAmazonSQS.ListDeadLetterSourceQueues operation.
IList<string> sourceQueues = sqs.ListDeadLetterSourceQueues(
    new ListDeadLetterSourceQueuesRequest()
    {
        QueueUrl = dlqUrl
    }).QueueUrls;

Console.WriteLine("Source Queues Delivering to " + dlqUrl);
foreach (string queueUrl in sourceQueues)
{
    Console.WriteLine(" * " + queueUrl);
}
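As mentioned earlier, another part of your application can periodically monitor the dead letter queue and alert you when messages land there. A minimal sketch of such a monitor is shown below; it reuses the sqs client and dlqUrl from the snippets above, and the alerting step is just a placeholder for your own logic (email, a CloudWatch alarm, etc.).

```csharp
// Long-poll the dead letter queue for up to 20 seconds; any messages
// here are ones our application repeatedly failed to process.
List<Message> deadLetters = sqs.ReceiveMessage(new ReceiveMessageRequest()
{
    QueueUrl = dlqUrl,
    MaxNumberOfMessages = 10,
    WaitTimeSeconds = 20   // enables long polling
}).Messages;

foreach (Message message in deadLetters)
{
    // Placeholder: inspect message.Body and raise an alert here.
    Console.WriteLine("Dead-lettered message " + message.MessageId
        + ": " + message.Body);
}
```

Running this on a schedule (or in a loop) gives you a simple early warning that poison messages are accumulating, while leaving the messages in the dead letter queue for separate debugging.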
Dead letter queues are a great way to add more resiliency to your queue-based applications. Have you set up any dead letter queues in Amazon SQS yet?