Transactional Email Speed: A 30-Day Experiment

Has this ever happened to you?

I love aged Sumatran coffee. It’s often marketed as a Christmas blend, so it can be particularly hard to find in spring and summer. So when our resident coffee guru pointed out a particularly tasty version, I jumped on it. After ordering my pure caffeinated gold and closing out the order confirmation page, I realized that I wasn’t 100% sure I’d ordered the right grind. I went to check the confirmation email and saw… nothing. As I waited the ten seconds for it to show up, I had a thought: with transactional email, time literally is money. The faster that confirmation email arrives, the better the odds of the customer following through and completing their transaction.

Ten seconds can feel like a major delay in today’s digital age, and each additional second it takes to deliver an email increases the risk of a lost sale, abandoned account signup, or a confused customer placing a double order. How fast is your email?

Stacking Up The Competition: The Experiment

I’ve noticed I rarely wait for a transactional email that’s sent by GreenArrow to be delivered, so that got me thinking: is GreenArrow meaningfully faster than the competition in this area, or are there other factors at play? I decided to find out by running a 30-day test comparing the transactional email performance of GreenArrow to that of three of our competitors: Amazon SES, Mailgun, and SendGrid. Wait, how did we choose those three? Recent customer inquiries. The results were interesting.

Leveling the Playing Field

At a high level, the test was to inject a 65KB email into each ESP (Email Service Provider) using SMTP once an hour, then measure how long it took for each message to be delivered to its recipient. To make the test as fair as we could (i.e., level the playing field), we also took some extra steps:

  • If DKIM signing options were available, then we selected a 1024-bit key.
  • Network latency can be a significant factor for SMTP performance, so we selected a test server in a location that had as close to equal network latency to each provider as we could find. That ended up being an email server in the New York City area.
  • Our testing server was dedicated to running the performance tests and logging the results. It was otherwise idle.
  • We gathered data for 30 days, so 720 test emails were sent through each ESP. You can read the captivating details in the Testing Methodology section below.

“And The Winner Is…”

Want to see our raw data on all 720 test messages for each ESP? Check out this Google Spreadsheet. Want a basic run-down of what the data says? Keep reading. (I’ll be using statistical terms like mean, median, mode, and standard deviation. If you want a quick refresher course on these terms, this Khan Academy video is a good overview.)
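If you’d rather see the definitions in code than in a video, here’s a quick Ruby sketch of the four statistics, computed over a made-up toy sample of delivery times (the numbers below are illustrative, not from our data set):

```ruby
# Toy refresher on the four statistics used below. The sample values
# are made up for illustration; they are not from our test data.
def mean(xs)
  xs.sum(0.0) / xs.size
end

def median(xs)
  xs.sort[xs.size / 2]  # assumes an odd-length sample, for simplicity
end

def mode(xs)
  xs.tally.max_by { |_value, count| count }.first  # most common value
end

def stddev(xs)
  m = mean(xs)
  Math.sqrt(xs.sum(0.0) { |x| (x - m)**2 } / xs.size)
end

times = [0.3, 0.3, 0.4, 0.5, 12.0]  # one slow outlier

puts mean(times)    # ~2.7: the single outlier drags the mean way up
puts median(times)  # 0.4: robust to the outlier
puts mode(times)    # 0.3: the most common value
puts stddev(times)  # ~4.65: large, again because of the outlier
```

Note how one 12-second outlier in a five-element sample pulls the mean to ~2.7 seconds while the median stays at 0.4 — the same effect shows up in our real results below.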

The Mean

Let’s start with the metric that, in my opinion, is the most interesting – the average (mean) time that elapsed while the test server injected each message into the test ESP, then waited for it to come back. The lower the mean, the better:

[Chart: Average (mean) message delivery time, in seconds, for GreenArrow, Amazon SES, Mailgun, and SendGrid]

GreenArrow was the fastest by far in terms of mean delivery time. Amazon SES and Mailgun trailed by roughly a second and a half and two seconds, respectively. Surprisingly, SendGrid took an average of 49.71 seconds.

The Median

The relative rankings of each ESP are unchanged when we look at the median message delivery times. GreenArrow is still the fastest, and SendGrid is still the slowest. The numbers aren’t nearly as spread out as they were for the mean, though. This is due to some outliers weighing heavily on the mean figures. For example, the message that took the longest to deliver was sent through SendGrid and took 12,235.03 seconds, or about 3.4 hours, to be delivered. We dig into more details on these outliers in the “The Fast, The Slow, and The Missing” section.

Here are the median message delivery times for each ESP. The lower the median time, the better:

[Chart: Median message delivery time, in seconds, for GreenArrow, Amazon SES, Mailgun, and SendGrid]

The (Pie a la) Mode

Interestingly, there’s not much difference in mode. The slowest ESP’s mode was only 1.18 seconds longer than the fastest. The lower the mode, the better:

[Chart: Mode of message delivery time, in seconds, for GreenArrow, Amazon SES, Mailgun, and SendGrid]

The Standard Deviation

Next, let’s move onto the standard deviation. Standard deviation is a way of measuring how much individual measurements depart from the mean (average). In this context, the higher the standard deviation, the more variability there is in how long it takes to deliver emails. The lower the standard deviation, the better:

[Chart: Standard deviation of message delivery times, in seconds, for GreenArrow, Amazon SES, Mailgun, and SendGrid]

If we assume that the data distribution is approximately normal, then this means that for 68% of messages (those within 1 standard deviation):

  • GreenArrow takes 0.18 to 0.58 seconds to deliver each message.
  • Amazon SES takes 0.78 to 2.88 seconds to deliver each message.
  • Mailgun takes 0 to 6.59 seconds to deliver each message.
  • SendGrid takes 0 to 603.03 seconds to deliver each message.

Wait, what?! If you’re wondering how Mailgun’s and SendGrid’s best delivery times could be zero: they’re not. What’s happening here is that there are so many more outliers on the long end of the data distribution than on the short end that we’re not getting a traditional bell curve. We’ll show the outliers on both ends of the scale in the “10 Slowest Deliveries” and “10 Fastest Deliveries” sections.
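To make the clamping explicit, here’s a small sketch (in Ruby, like our test script) of how the one-sigma bands above fall out of a mean and standard deviation. The mean and standard deviation figures in the example are reconstructed arithmetically from the bands quoted above:

```ruby
# How the one-sigma bands above are derived: the band is
# [mean - stddev, mean + stddev], floored at zero because a delivery
# time can't be negative. A zero lower bound therefore signals
# stddev > mean, not an actual instant delivery.
def one_sigma_band(mean, stddev)
  lower = [mean - stddev, 0.0].max
  [lower.round(2), (mean + stddev).round(2)]
end

# Means/stddevs reconstructed from the bands quoted above.
p one_sigma_band(0.38, 0.20)  # GreenArrow => [0.18, 0.58]
p one_sigma_band(1.83, 1.05)  # Amazon SES => [0.78, 2.88]
```

For Mailgun and SendGrid, the standard deviation exceeds the mean, so the lower bound clamps to zero — which is exactly the skewed, outlier-heavy distribution described above.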

The Fast, The Slow, and The Missing

Of the 2,880 emails that we sent during this test, seven went missing. One of the missing messages was sent through Mailgun, and the other six were sent through Amazon.

What do these missing emails mean? One’s an incident, two’s a coincidence, and three’s a pattern. We’re not particularly concerned about the one email that went missing with Mailgun. That could have happened with any of the ESPs, GreenArrow included. Having 6 out of 720 emails disappear with Amazon SES is worrying, though. That’s 0.83% of the emails sent through Amazon.

All of the emails that Amazon SES failed to deliver were injected on February 28 between 6 and 11 pm, so perhaps they were having some technical issues. Unfortunately, at the time of this writing, Amazon only reports the last 2 weeks of SES stats, so by the time we started analyzing the data from our tests, the stats for February 28 on Amazon’s side had already been rotated out. The focus of this blog post is on performance, so I won’t dig into a comparison between Amazon and other ESPs on stats retention, but it was disappointing to see this 2-week limit. I would have expected to be able to go back much further – perhaps a year.

The missing emails were excluded from our other calculations, so if you run the math with the raw data and see your figures come out differently for Amazon, make sure you’re excluding those data points as well, rather than counting them as zeros.

The 10 Slowest Deliveries

Here are the 10 longest delivery times in the test results. They were all for emails relayed through SendGrid:

  1. 12,235.04 seconds (about 3 hours, 24 minutes)
  2. 8,639.33 seconds (about 2 hours, 24 minutes)
  3. 5,361.26 seconds (about 1 hour, 29 minutes)
  4. 2,034.59 seconds (about 34 minutes)
  5. 295.70 seconds
  6. 190.24 seconds
  7. 122.35 seconds
  8. 106.40 seconds
  9. 104.50 seconds
  10. 88.95 seconds

The 10 Fastest Deliveries

Here’s the amount of time required for the 10 fastest deliveries. These numbers are tightly packed, so we added an extra decimal place of precision to avoid listing 10 entries that were all rounded to either 0.30 or 0.31 seconds:

  1. 0.296 seconds
  2. 0.300 seconds
  3. 0.302 seconds
  4. 0.303 seconds
  5. 0.303 seconds
  6. 0.305 seconds
  7. 0.307 seconds
  8. 0.307 seconds
  9. 0.307 seconds
  10. 0.308 seconds

And as I expected, the 10 fastest deliveries were all for messages relayed through GreenArrow. Yeah!

Taking Off The Training Wheels

To make our test as fair as possible, we tried to use common default settings. We also steered away from options that were supported by some (but not all) of the ESPs tested.

But if we were aiming to optimize GreenArrow’s performance without worrying about an equal playing field, we would consider making the following changes:

  1. Use an HTTP Message Submission API, instead of SMTP. SMTP is a very chatty protocol, so by switching to HTTP, we would reduce the number of round trips that need to take place while injecting each message.
  2. Use QMQP or QMQP Streaming Protocol to inject messages. These protocols are also less chatty than SMTP.
  3. If SMTP is kept, then use IP address authentication instead of SMTP AUTH.
  4. If SMTP is kept, then use a library that supports SMTP Command Pipelining.
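As a sketch of why HTTP submission cuts round trips: an SMTP injection is a back-and-forth dialogue (greeting, EHLO, AUTH, MAIL FROM, RCPT TO, DATA, QUIT), while an HTTP API collapses submission into a single request/response. The endpoint and field names below are hypothetical, not GreenArrow’s actual API:

```ruby
require "json"

# Illustrative only: the endpoint and field names are hypothetical, not
# GreenArrow's actual HTTP API. The point is that one HTTP POST replaces
# SMTP's multi-round-trip dialogue (EHLO, AUTH, MAIL FROM, RCPT TO,
# DATA, QUIT).
def build_submission(from:, to:, subject:, body:)
  {
    "from"    => from,
    "to"      => [to],
    "subject" => subject,
    "body"    => body
  }.to_json
end

payload = build_submission(
  from:    "orders@example.com",
  to:      "customer@example.com",
  subject: "Order confirmation",
  body:    "Thanks for your order!"
)

# Sending would then be a single round trip, e.g. (hypothetical endpoint):
#   require "net/http"
#   Net::HTTP.post(URI("https://api.example.com/v1/send"), payload,
#                  "Content-Type" => "application/json")
puts payload
```

On a high-latency link, each SMTP round trip saved matters, which is why the protocol choice shows up directly in injection time.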

As fast as we’ve already proven GreenArrow to be, with these configurations in place it can perform much faster. We’re talking beyond-Warp-9 fast. You might even call it Ludicrous Speed.

Your Customers Want To Hear From You, Fast

Email should be fast. In transactional email sending, it pays to be fast. Sure, my coffee example was about a purchase confirmation, but what about a signup or shipping notification, a welcome email, or a password reset? At GreenArrow, we believe the more you can do to minimize the chance that customers will be left waiting, the happier those customers will be. And that’s what life is all about: happy customers. (And coffee, lots of coffee.)

Want to see first-hand what our email sending software can do for your business? Schedule a live demo today, and you, too, can learn how to deliver transactional emails to your customers in under a second. Ludicrous Speed…GO!!!

The Boring Appendix (For Those Interested…)

The testing server we used had the following latency measurements (in milliseconds) to each ESP:

[Chart: Minimum, mean, and max network latency, in milliseconds, from the test server to GreenArrow, Amazon SES, Mailgun, and SendGrid]

The above network latency figures were generated by using hping to connect to the port 587 SMTP submission service of each ESP.

Testing Methodology

Test emails were generated and injected into each ESP using a short Ruby script we wrote for this test. The script uses the Net::SMTP library, and the only differences in the code run for each ESP were ESP-specific SMTP settings, such as the SMTP server username and password.
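Here’s a minimal sketch of what such an injection script can look like. The server, credentials, and header name are placeholders, not our exact script; the tracking-header idea is what matters, since it lets the receiving side match each delivered message back to its injection time:

```ruby
require "net/smtp"
require "securerandom"
require "time"

# Sketch of the kind of injection script described above. The server,
# credentials, and header name are placeholders, not our exact setup.
# The X-Test-ID header lets the receiving side match each delivered
# message back to its injection time.
def build_test_message(from:, to:, size_bytes: 65_536)
  test_id = SecureRandom.uuid
  headers = <<~HEADERS
    From: #{from}
    To: #{to}
    Subject: delivery-speed-test
    X-Test-ID: #{test_id}
    Date: #{Time.now.rfc2822}
  HEADERS
  padding = "x" * [size_bytes - headers.bytesize, 0].max  # pad to ~65KB
  [test_id, headers + "\n" + padding]
end

test_id, message = build_test_message(from: "test@example.com",
                                      to:   "probe@example.com")
injected_at = Time.now
# Placeholder connection details; each ESP gets its own host/credentials:
#   Net::SMTP.start("smtp.example.com", 587, starttls: :always) do |smtp|
#     smtp.send_message(message, "test@example.com", "probe@example.com")
#   end
# Delivery time = (arrival time recorded on the receiving side) - injected_at.
```

A loop plus a cron entry to run it hourly per ESP would round out the sending side.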

The times the test emails came back from each ESP were recorded using a .qmail script that executed as each message was delivered.
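For the curious, here’s a sketch of the kind of logger a .qmail file can invoke — the header name and log path are illustrative, not our exact setup:

```ruby
# Sketch of a delivery-time logger that a .qmail file can invoke, e.g.:
#
#   |/usr/local/bin/log-delivery.rb
#
# qmail pipes the full message to the script on stdin as it's delivered.
# The X-Test-ID header name and log path are illustrative.
def extract_test_id(message)
  message[/^X-Test-ID:\s*(\S+)/, 1] or raise "no X-Test-ID header found"
end

def log_arrival(message, io, now: Time.now)
  # One line per delivery: the test ID plus the arrival time as a Unix
  # timestamp, ready to join against the injection-time log.
  io.puts "#{extract_test_id(message)} #{now.to_f}"
end

# Invoked by qmail, the script body would be:
#   log_arrival($stdin.read, File.open("/var/log/delivery-times.log", "a"))
```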

In these tests, we define the message delivery time as the time that elapsed from the moment that the test server first attempted to connect to the ESP to inject the email to the moment that the test server received that same email back from the ESP.

See? I told you it was boring. You should have stopped at the demo pitch above. That is, unless you’re like our team and actually like this sort of thing…

