There Should Be A Test On This

The move toward digital began as a crawl, then a walk, and is now a full sprint at many organizations. The next step is finding out whether those digital strategies are accelerating in the right direction.

Testing is one means organizations can use to track current results, as well as whether and how improvements can be made. During the 2016 Bridge to Integrated Marketing Conference in National Harbor, Md., Brian Rogel, SEO and optimization lead for Beaconfire RED; Jen Boland, data analyst for Beaconfire RED; and Sara Hoffman, manager of web development for Avon Breast Cancer Crusade, presented tips for effective testing. Their recommendations included:

* Have a plan. Rogel recommended considering goals, listing problem areas and ranking them by cost-effectiveness. Avoid simply plugging in tests that other organizations have used. Just because a test worked for one entity does not mean it will be applicable or effective for your organization, Hoffman said.

* Strategize. Tests should generally run in full-week increments and last no less than two weeks. Be aware of day-to-day and seasonal factors that might skew results. Testing should not run longer than four weeks, however, as sample pollution can begin to have major effects on results at that point, Rogel said.

Sample pollution occurs when users clear their cookies and can thus fall into both the control and variant groups; between 30 and 40 percent of individuals clear their cookies monthly. Identify a target sample size and make sure that number can be reached during the testing period (a back-of-the-envelope calculation is sketched below). Create a hypothesis stating the anticipated problem, solution and results.
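
The required sample size can be estimated before launch from the baseline conversion rate and the smallest lift worth detecting. Below is a minimal sketch using the standard two-proportion sample-size formula; the 3 percent baseline rate and 4 percent target rate are hypothetical illustrations, not figures from the presentation:

```python
import math

from scipy.stats import norm

def sample_size_per_group(p_baseline, p_variant, alpha=0.05, power=0.80):
    """Approximate visitors needed per group for a two-proportion A/B test."""
    z_alpha = norm.ppf(1 - alpha / 2)  # two-sided significance threshold
    z_power = norm.ppf(power)          # desired statistical power
    variance = p_baseline * (1 - p_baseline) + p_variant * (1 - p_variant)
    effect = abs(p_variant - p_baseline)
    return math.ceil((z_alpha + z_power) ** 2 * variance / effect ** 2)

# Hypothetical: a 3 percent baseline conversion rate and a hoped-for lift to 4 percent.
n = sample_size_per_group(0.03, 0.04)
print(f"Need about {n:,} visitors per group")  # roughly 5,300 per group
```

If two to four weeks of typical traffic cannot supply that many visitors per group, the better move is to redesign the test around a larger detectable lift or a higher-traffic page rather than to extend it, given the sample-pollution concern above.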

* Monitor results closely. Hoffman recommended keeping a browser tab open throughout the day and setting calendar reminders to check results, so as to keep stakeholders up to date. Test broadly, Boland advised: by looking beyond the data immediately applicable to your goal, you can spot potential added benefits or risks. Rogel used the example of a test that changed the colors of two page buttons to better distinguish them.

The variant increased clicks on the two buttons by 9.7 percent and 4.9 percent and, by extension, increased registration clicks by 21 percent. Had the opposite occurred and registration clicks gone down, the organization would have had to weigh whether upping clicks on the buttons was worth a decrease in registrations. A quick significance check, sketched below, helps confirm that a lift of this kind is real before acting on it.
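
Whether a lift like those above reflects a real difference rather than day-to-day noise can be checked with a standard two-proportion z-test once the raw counts are in hand. The sketch below uses hypothetical visitor and registration counts chosen to approximate a 21 percent relative lift; the presentation reported only the percentages:

```python
from math import sqrt

from scipy.stats import norm

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)               # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * norm.sf(abs(z))                          # two-sided p-value
    return p_b - p_a, p_value

# Hypothetical counts: control converts 500 of 10,000 visitors (5.0 percent),
# the variant 605 of 10,000 (6.05 percent), a 21 percent relative lift.
lift, p = two_proportion_z_test(500, 10_000, 605, 10_000)
print(f"absolute lift: {lift:.4f}, p-value: {p:.4f}")  # p is about 0.001 here
```

Running the same check separately on the button clicks and the downstream registration clicks is what surfaces the trade-off Rogel described: a variant can win on its immediate metric while losing on the one that actually matters.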

* Share lessons throughout the organization. Rogel cited another example in which variant ad-copy language, “don’t miss out” versus “don’t miss,” produced a 34.4 percent difference in clicks for one client. Such knowledge could be carried over to other call-to-action messaging across the organization, from direct mail to email to social media, he said.