The 5 Step Scientific Process that will Drastically Improve Your Outbound Sales Campaign

by Brandon Redlinger

The sales enablement and automation industry has completely taken off in the last 12-18 months, with an exponential increase in the number of companies that collect, sort, and deliver data to sales teams.

However, data without context is irrelevant, even potentially harmful. If you’re given information that happens to be outdated and try to leverage it in a cold email, the recipient will delete it faster than you can tweet “sorry.”

Landing that next sale is a challenge every company faces. When you finally discover a process for delivering the right message to the right person at the right time, you know you’ve struck gold. But there’s a fine balance that must be kept between painstakingly personalizing every email by hand and completely automating your outbound sales campaign.

Given the right tools and a proven scientific process for testing and improving your outbound campaigns, you can reach that balance quicker and strike gold. In this post, I’ll cover the 5 simple steps that will help you get there.

But before we get started, here’s what you’ll need:

A list of fresh, new leads. This list should include as much data about the lead as possible, like company, title, recent awards/recognition, competitors, industry association membership, etc. The more information the better.

At a bare minimum you will need 200 leads, and ideally 1,000, to reach statistically significant results. If you don’t have such a list, don’t worry – you can use a 3rd party lead sourcing/intelligence platform like Datanyze or ZoomInfo, purchase a new list, or outsource your lead generation.

An email platform to send emails and report results. My favorite is PersistIQ, though I may be a little biased ;) It’s not 100% necessary, but it will make your life 100X easier. Otherwise, you’ll need time, diligence and a high proficiency working with spreadsheets.

And that’s it! Now you’re ready to start launching, testing, and improving your campaigns. Let’s dive into the 5 step process.

5 Simple Steps to Better Outbound Sales Results

Step 1: Set Your Goals

For the best results, you’re going to need to be as methodical and scientific as possible.
In other words, you need to make your goals SMART – Specific, Measurable, Actionable, Realistic, and Timely.

For example, having a goal of “Increasing the performance of my outbound campaign” is not a SMART goal. Instead, “Increasing my open rate by 8% in my outbound campaign targeting CXOs within the next month” meets the SMART goal criteria.

The three most important metrics for outbound campaigns are open rates, response rates and close rates. Open rates for cold outbound sales emails are declining, but yours should be between 50-60 percent. Response rates are also declining, but yours should hover around 12-15 percent. Close rates should be about 1 percent, but the important thing to watch here is fluctuation. Since this number is small, and closing one or two more deals can dramatically affect your close rate, I’d also layer cycle time and deal size on top of this metric. For example, campaign A might have the exact same close rate as campaign B, but campaign A might close twice as fast and for twice as much money.
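To make that concrete, here’s a minimal sketch with purely hypothetical numbers showing how two campaigns with identical close rates can tell very different stories once cycle time and deal size are layered in:

```python
# Hypothetical numbers only: two campaigns with the same close rate
# can still differ sharply on cycle time and deal size.
campaigns = {
    "A": {"sent": 1000, "closed": 10, "avg_cycle_days": 21, "avg_deal_size": 10000},
    "B": {"sent": 1000, "closed": 10, "avg_cycle_days": 42, "avg_deal_size": 5000},
}

for name, c in campaigns.items():
    close_rate = c["closed"] / c["sent"]        # ~1 percent in both cases
    revenue = c["closed"] * c["avg_deal_size"]  # what close rate alone hides
    print(f"Campaign {name}: close rate {close_rate:.1%}, "
          f"avg cycle {c['avg_cycle_days']} days, revenue ${revenue:,}")
```

Both campaigns show a 1% close rate, but campaign A closes in half the time for twice the revenue – which is exactly why close rate alone can mislead you.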

Step 2: Define What You’re Testing

We’ll be using the concept of an A/B test, which compares two versions that differ in a single element in order to determine which performs better.

Once you’ve set your goal, you need to define what you’re going to test. We’ll make only one change to the entire campaign, launch that campaign and analyze the results. If you make more than one change, you won’t be able to attribute an increase or decrease in performance to a single factor. That’s multivariate testing, and it takes more time, a more sophisticated analytical system and many, many more leads.

To choose the appropriate element to test, start by first looking at your goal. If your goal is to increase open rates, for example, testing two different value propositions wouldn’t make sense. Instead, choose an element in the subject line to alter.

Here are a few examples of helpful elements to test:

In a subject line:

  • Custom variable vs. no variable
  • Question vs. statement
  • Long vs. short
  • Feature vs. benefit

In an email body:

  • Value proposition A vs. Value proposition B
  • Call-to-action A vs. call-to-action B
  • P.S. vs. no P.S.
  • Social proof vs. no social proof
  • Whitepaper vs. no whitepaper

In the spirit of taking a scientific approach, write down your hypothesis and your reasoning. Now go out and try to prove yourself wrong! This is the true scientific method.

Step 3: Create Your Campaign and Run Your Test

Now you’re ready to run your test. First, write the template that will serve as your “control” or original template. Then, copy that exact template and make your one change to it. This second template is called your “variant.”

Testing the subject line could look something like this:

Control: your original, longer subject line
Variant: a shortened version of the same subject line

In this example, we’re testing a long subject line vs. a short subject line.

If you need more ideas for templates and campaigns to test, check out our collection of 100+ sales email templates for some inspiration.
 
Now, set up one campaign in your outbound platform with the control template, and a separate campaign with your variant. Make sure you have all the {{variables}} filled in with the correct information. Randomly split your leads equally into two groups. Drop one half into your control campaign and the other half into the variant campaign.
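If you’re splitting the list yourself rather than letting your platform randomize it, a minimal Python sketch like the one below handles the 50/50 split (the file names leads.csv, control_leads.csv and variant_leads.csv are just placeholders for your own export):

```python
import csv
import random

# Load the lead list (assumes a CSV export with one lead per row).
with open("leads.csv", newline="") as f:
    leads = list(csv.DictReader(f))

random.shuffle(leads)             # randomize order so the split is unbiased
midpoint = len(leads) // 2
control_group = leads[:midpoint]  # these leads get the control template
variant_group = leads[midpoint:]  # these leads get the variant template

# Write each group back out so it can be imported into its own campaign.
for group, filename in [(control_group, "control_leads.csv"),
                        (variant_group, "variant_leads.csv")]:
    with open(filename, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=leads[0].keys())
        writer.writeheader()
        writer.writerows(group)
```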

If you’re hacking this together manually, the most important thing to remember is to keep everything between the two test groups as similar as possible: your audience, the time of the day your emails are sent, the day of the week your emails are sent, the body of the email, etc.

Now hit send!

Step 4: Collect, Analyze and Make Sense of the Results

The big questions when running A/B tests are “How long should I run it for?” and “How many emails are enough?” If you have a large list and can afford to test generously, then test until you can reach a statistically significant result.

Here’s an A/B testing statistical significance calculator to help. A good rule of thumb: if you can reach 90% statistical significance, you can be confident that your results are meaningful.
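If you’d rather sanity-check the numbers yourself, the standard way to test a difference in open rates is a two-proportion z-test. Here’s a minimal sketch, with made-up send and open counts:

```python
import math

def open_rate_significance(opens_a, sends_a, opens_b, sends_b):
    """Two-proportion z-test: is the difference in open rates real?"""
    p_a = opens_a / sends_a
    p_b = opens_b / sends_b
    # Pooled open rate under the null hypothesis of no difference
    p_pool = (opens_a + opens_b) / (sends_a + sends_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / sends_a + 1 / sends_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal distribution
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, 1 - p_value

# Example: 500 leads per group, control opened 260 times, variant 295 times
z, confidence = open_rate_significance(260, 500, 295, 500)
print(f"z = {z:.2f}, confidence = {confidence:.0%}")  # meaningful if >= 90%
```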

However, if you don’t have that luxury of a big list and ample time to reach statistical significance, you’ll have to make do with what you have and use your best judgement. The more tests you run, the quicker you’ll be able to get a feel for what’s working and what isn’t.

As a general rule, let your test run for one full week.

Let’s dive into the platform we used to launch our campaigns and look at the results.

[Screenshot: email campaign variant 1 results]

[Screenshot: email campaign variant 2 results]

Remember, our goal was to increase open rates, so we tested two different subject lines: a long subject line (the control) and a short subject line (the variant). The shorter subject line received 24 more opens, an 11% increase. From this test, we can conclude that a shorter subject line is better when targeting CXOs.

Step 5: Iterate and Refine

Keep testing and refining. If you’ve conclusively proven that short email subject lines are better than long ones, then you can start testing different short subject lines against each other. Or you may decide to use that proven shorter subject line and go on to test reply rates by changing the CTA in the body of the email. That’s how you’re really going to master the outbound sale – repeated trial and error.

I’d recommend retesting your findings every 6 months or so to counteract the fatigue and learned blindness that set in over time. As soon as other sales reps begin to catch on and realize what works, they’ll start practicing it themselves until it’s overused, trite and, effectively, rendered useless.

A big mistake sales reps commonly make is finding one subject line that works and only using that subject line. The B2B space evolves quickly and people catch on to what works. Then, when everyone is using the once powerful subject line, it inevitably decreases in effectiveness. It’s adapt or die, and that’s why I recommend every 6-9 months going back to the beginning and besting your original findings.

Conclusion

Improving the performance of your outbound campaigns takes a methodical approach to effect real change. Investing time in scientifically testing your outbound campaigns will pay off in the long run, but you have to start now.

Data is a powerful thing. Of all the activities one could do to ensure the success of a sales organization, identifying, tracking and understanding how your sales efforts are performing is paramount. Although analytics and reporting are increasingly becoming a central part of sales, increased access to data doesn’t always result in an increase in sales performance. Use what you’ve learned here to improve your outbound sales campaigns and close more deals.

Brandon Redlinger runs Growth Marketing at PersistIQ. PersistIQ helps reps deliver truly personalized outbound sales at scale. Automate the tedious tasks so you can get more replies and meetings with qualified leads. Follow Brandon at @Brandon_Lee_09 for the latest information on outbound sales.