E-book PDF courtesy of ioninteractive.com


Learn how to use landing page software in tandem with your current marketing automation solution for exceptional results.

Integrating LiveBall with your marketing automation solution can help you generate results unlike anything you could achieve with either tool alone. In this guide, you’ll learn how to leverage the Power of Two to create a wider variety of better experiences with less effort, more sophistication, greater insights, and streamlined testing.

___________________________________________________

50 Landing Page Best Practices

The Ultimate Guide to Effective Landing Pages

From templates to testing and everything in between, this jam-packed best practices manual contains 50 expert tips and strategies for building the best landing pages.

You’ll gain valuable insights such as how to integrate your landing experience with a post-conversion audience, and how to build audience-specific landing pages. In addition, we’re revealing a trove of testing strategies, including which elements are most important to test on your pages.

__________________________________________________________________________

Landing Page Toolkit

This workbook takes you step-by-step from creating landing pages to optimizing them for high conversion. Learn strategies for testing, segmenting, and messaging success. We’ve even included extra worksheets for future use. Read the toolkit and start building high-performance pages today.

______________________________________

About ion
Scalable, Agile Landing Page Optimization & Management

Marketers spend a lot to win clicks. We think every one of those clicks should lead to a great experience. Our cloud-based platform, LiveBall, is used by hundreds of global brands to create and optimize post-click web experiences. We also offer a wide range of strategic services for companies that need help planning, executing, and optimizing their post-click programs.

  • Born of Market Demand
  • Built for the Future
  • Designed to Deliver Results
  • Priced & Packaged to Grow with You
  • Immediate Deployment
  • Unlike Anything Else

______________________________

Main 1.888.466.4332
Intl +01.561.394.9484
Facsimile +01.561.394.9773
Twitter @ioninteractive
Email info@ioninteractive.com

Florida
ion interactive
200 East Palmetto Park Road
Suite 107
Boca Raton, FL 33432

Massachusetts
ion interactive
One Broadway, 14th Floor
Cambridge, MA 02142

Three Testing Strategies For Sophomore Conversion Testers | by Celine Roque


THE DAILY EGG

Hard-boiled conversion optimization and design advice

Change the color of your call-to-action button. Test a new headline. Swap this hero image for that hero image.

Been there, done that.

Try these three conversion testing methods for conversion testing sophomores.

#1 – Blow it up, start from scratch

Instead of pursuing split tests on specific site elements – such as the headline or call-to-action button – 37signals decided to test two fundamentally different versions of their Highrise homepage. The rationale behind this was that they needed to destroy their assumptions about what may or may not work.

Almost everything was different about each version.

Here is how the Original Design compared with the Person Design:

  • The Original Design had several smaller customer photos, while the Person Design had one large background photo of just a single customer.
  • There were several customer testimonials in the Original Design, while the Person Design only had a single quote from the featured customer.
  • Only a handful of benefits were listed on the Person Design, while several features and benefits were outlined in the Original Design.
  • The Person Design had only one prominent call-to-action: “See Highrise Plans and Pricing”. The Original Design had a navigation menu, as well as options to view more testimonials and features.

The result: the Person Design produced a 102.5% increase in paid sign-ups.

Key Takeaway: Rather than just testing small elements of your landing page, try to test radically different versions of the page.
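When you do run a radical-redesign test, you still need to confirm the lift is real. As a rough illustration, here is a standard two-proportion z-test in Python; the visitor and conversion counts below are hypothetical, not 37signals’ actual data:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """z-statistic for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis that both pages convert equally
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical traffic split evenly between the two designs
z = two_proportion_z(conv_a=100, n_a=5000, conv_b=205, n_b=5000)
print(round(z, 2))  # |z| > 1.96 means significant at the 95% level
```

A lift roughly doubling conversions on samples this size clears the 95% bar comfortably; with only a few hundred visitors per variant, the same relative lift might not.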

What to Do When Conversion Optimization Goes Bad

http://blog.kissmetrics.com

Conversion rate optimization isn’t always all kittens and rainbows. Sometimes you test things that you’re sure will send your conversions through the roof, but they end up going over like a lead balloon. Sales plummet, sign-ups slow to a trickle…

And you freeze.

The most important step you can take is to roll your site back to its pre-test glory. And you might be inclined to just keep it there because testing the wrong things (again) could send your traffic into a tailspin.

But before you swear off testing ever again, consider the following tips. Not only will you be able to recover more quickly, but you’ll also be able to create a testing and optimization plan that helps you pinpoint where your target audience is slipping through the cracks – and get them back.

Welcome to Testing.

You aren’t the first person who has gone through this – and you certainly won’t be the last. It’s completely natural to go into this with high expectations, but what you’re seeing is possibly a more down-to-earth result. Don’t look at this as if your idea is worthless or your site is ruined – this is why we test. Once you understand that this is a positive step forward, you can start thinking like a real conversion optimization scientist – crunching numbers and trying different changes to see what resonates with your unique audience.

Real data and hard numbers are preferable to gut feelings and instincts any day – especially when it comes to maximizing your sales and subscribers. So let’s get started. Read more: “What to Do When Conversion Optimization Goes Bad”
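One concrete piece of a recovery plan is knowing how long your next test needs to run. Here is a rough sketch of a standard two-proportion sample-size estimate; the baseline conversion rate and target lift below are hypothetical:

```python
import math

def sample_size_per_variant(base_rate, relative_lift, z_alpha=1.96, z_power=0.84):
    """Approximate visitors needed per variant to detect a given
    relative lift (two-sided 95% confidence, ~80% power)."""
    p1 = base_rate
    p2 = base_rate * (1 + relative_lift)
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return math.ceil((z_alpha + z_power) ** 2 * variance / (p2 - p1) ** 2)

# e.g. a 3% baseline conversion rate, hoping to detect a 20% relative lift
print(sample_size_per_variant(0.03, 0.20))
```

Note how quickly the required traffic shrinks as the detectable lift grows: the smaller the change you want to measure, the longer the test must run.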

Forrester: Consumers Will Not Only Buy, They’ll Help Create

If the general trend toward crowdsourcing is any clue, then we are all well aware of the value of the Internet masses. Having access to a loyal fan base can be like a fount of free ideas and labor. From translating Wikipedia and Facebook to beta testing Google Chrome, crowdsourcing is used all across the Web for a number of purposes and analyst firm Forrester is suggesting one more – co-creation.

According to a report released this week, U.S. consumers are willing “co-creators,” a fact that many companies have yet to take advantage of.

The report surveyed consumer product strategy professionals and consumers and found that “nearly half of all companies are not using social media to interact directly with their customers in order to influence product creation, design or strategy.” Beyond that, the report found that a majority of consumers were more than willing to lend a helping hand in creating the products they would eventually purchase.

Read more: “Forrester: Consumers Will Not Only Buy, They’ll Help Create”

Marketing Optimization Technology: Be careful of shooting yourself (and your test) in the foot

I had the pleasure of learning about an experiment devised by my colleague, Jon Powell, that illustrates why we must never assume we test in a vacuum, devoid of external factors that can skew our data (including factors we create ourselves).

If you’d like to learn more about this experiment in its entirety, you can hear it firsthand from Jon on the web clinic replay. SPOILER ALERT: If you choose to keep reading, be warned that I am now giving away the ending.

According to the testing platform Jon was using, the aggregate results came up inconclusive. None of the treatments outperformed the control with any significant difference. What was interesting, however, was that the data indicated a pretty large difference in performance for a couple of the treatments.

So after reanalyzing the data and adjusting the test duration to exclude the results from when an unintended (by our researchers, at least) promotional email had been sent out, Jon saw that each of the treatments significantly outperformed the control with conclusive validity. Read more: “Marketing Optimization Technology: Be careful of shooting yourself (and your test) in the foot”
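That cleanup step, excluding the window when external traffic contaminated the test, can be sketched in a few lines. The session log and dates below are invented for illustration; Jon’s actual analysis used his testing platform’s data:

```python
from datetime import datetime

# Hypothetical test log: each record is (timestamp, variant, converted)
log = [
    (datetime(2024, 5, 1, 10), "control",   False),
    (datetime(2024, 5, 1, 11), "treatment", True),
    (datetime(2024, 5, 2, 9),  "control",   True),   # promo-email window
    (datetime(2024, 5, 2, 10), "treatment", False),  # promo-email window
    (datetime(2024, 5, 3, 14), "control",   False),
    (datetime(2024, 5, 3, 15), "treatment", True),
]

# Exclude the window when the unintended promotional email drove
# atypical traffic, then recompute per-variant conversion rates.
promo_start = datetime(2024, 5, 2)
promo_end = datetime(2024, 5, 3)
clean = [r for r in log if not (promo_start <= r[0] < promo_end)]

def rate(records, variant):
    hits = [r for r in records if r[1] == variant]
    return sum(r[2] for r in hits) / len(hits)

print(rate(clean, "control"), rate(clean, "treatment"))
```

The point isn’t the mechanics, it’s the habit: when an external event overlaps your test window, segment it out and re-run the analysis before declaring a winner (or a dud).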

Multivariate Testing: Can you radically improve marketing ROI by increasing variables you test?

In response, one emerging MVT service model offers getting to a “lift” faster by using adaptive elimination of likely underperformers, in exchange for test results that provide limited information beyond identifying the winner. Such results are not as useful as their full-factorial brethren for designing subsequent tests, because adaptive elimination of treatments makes it difficult to extrapolate the psychological factors and consumer preferences responsible for the outcome. The business benefits, however, are more immediate.


As I was reading a few LinkedIn discussions about multivariate testing (MVT), I began to wonder if 2010 was going to be the year of multivariate.

1,000,000 monkeys can’t be wrong

Multivariate Testing (MVT) is starting to earn a place in the pantheon of buzzwords like cloud computing, service-oriented architecture, and synergy. But is a test the same thing as an experiment? While I am not a statistician (nor did I stay at the Holiday Inn last night), working at MarketingExperiments with the analytical likes of Bob Kemper (MBA) and Arturo Silva Nava (MBA) has helped me understand the value of a disciplined approach to experimental design.

What I see out there is that a little knowledge is indeed a dangerous thing. Good intentions behind powerful and relatively easy-to-use platforms like Omniture® Test&Target™ and Google® Website Optimizer™ have generated a misleading sense that as long as a multivariate test is large enough (several hundred or more combinations being tested), at least one of the combinations will outperform the control.

This notion has become the value proposition of a growing number of companies offering services around either the big-name MVT tools or their own (simpler, and therefore often easier to set up) alternatives. They are ostensibly betting on the technology rather than on a systematic approach to experimental design or any particular UI/UX (user interface/user experience) optimization theory.
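To see why combination counts explode so quickly, here is a small sketch of how a full-factorial test multiplies every variant of every element; the page elements and variants below are invented for illustration:

```python
from itertools import product

# Hypothetical page elements and their variants: a full-factorial
# MVT setup tests every combination of every element.
elements = {
    "headline": ["A", "B", "C"],
    "hero_image": ["photo", "illustration"],
    "cta_button": ["green", "orange", "red", "blue"],
    "form_length": ["short", "long"],
}

combinations = list(product(*elements.values()))
print(len(combinations))  # 3 * 2 * 4 * 2 = 48
```

Four modest elements already yield 48 page versions, each of which needs enough traffic to measure; add a few more elements and you are in the hundreds, which is exactly the regime the spray-and-pray shops operate in.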

Even though, as Bob has pointed out to me, it is reasonable that an MVT setup with a billion combinations may not yield a lift over the control, my contention is that the risk-weighted business cost of a dissatisfied customer is low. Therefore, little stops the burgeoning MVT shops from safely offering a “100% lift guarantee.” Just like the proverbial million monkeys with typewriters, somewhere among thousands of spray-and-pray treatments their MVT tests are expected to produce one that’s better than the rest.
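The monkeys-with-typewriters intuition is easy to make concrete: if each of N treatments independently has a 5% chance of appearing to beat the control through random noise alone, the probability that at least one does grows rapidly with N. A minimal sketch:

```python
def p_any_false_positive(n_treatments, alpha=0.05):
    """Probability that at least one of n independent comparisons
    'wins' by chance when each has false-positive rate alpha."""
    return 1 - (1 - alpha) ** n_treatments

for n in (1, 10, 100):
    print(n, round(p_any_false_positive(n), 3))
```

With a hundred treatments the chance of a spurious “winner” is near certainty, which is why an uncorrected large MVT test can almost always claim a lift, whether or not the underlying design insight is real.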

1 monkey with a stick

Read more: “Multivariate Testing: Can you radically improve marketing ROI by increasing variables you test?”