Which email design works best? Where should the offer banner be placed to generate more interest from target users? Comparing and analyzing the success and acceptance of different versions of marketing offers, mailings, or segments is the essence of today's marketing.

A popular saying goes, “comparison is the thief of joy,” but it does not hold true when comparing the results of different marketing strategies. In this case, comparison isn't a bad thing at all. When you compare different variables in marketing offers, marketing mails, your website, or your social media posts, you can uncover which one performs best. This type of comparison and analysis is best achieved with A/B testing, also called split testing.

HCL’s Unica suite has the capability to carry out comparative analysis across its different inbound and outbound products for:

  • Mailing comparison using A/B testing

To find the most effective email design, message content, or mailing subject in Unica Deliver.

  • Campaign effectiveness using Target vs. Control group

To check the success and effectiveness of campaigns in Unica Campaign by comparing target group and control group responses.

  • Offer version comparison using A/B testing.

To find out the most effective Offers or Offer versions in Interact.

This article examines each of the three methods above in detail and shows how each is used for comparative analysis.

Mailing comparison using A/B testing

Introduction

In Unica Deliver, users can configure and conduct A/B tests and automatically send the most effective version (the winner) of an HTML email message to the audience. By sending the mailing as an A/B test, recipient responses are used to determine the most effective email design and message content. When a user sends a mailing as an A/B test, the system sends different email messages, or different versions of the same message, to between two and five sample groups of recipients.
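The split into two to five sample groups can be sketched in plain Python (a minimal illustration only; Unica Deliver performs this assignment internally, and the function and parameter names here are hypothetical):

```python
import random

def split_into_test_groups(recipients, num_groups=3, sample_fraction=0.1, seed=42):
    """Randomly assign a sample of recipients to between two and five
    A/B test groups; the remainder receives the winning version later."""
    assert 2 <= num_groups <= 5
    rng = random.Random(seed)
    shuffled = recipients[:]
    rng.shuffle(shuffled)
    sample_size = int(len(shuffled) * sample_fraction)
    sample, remainder = shuffled[:sample_size], shuffled[sample_size:]
    # Round-robin assignment keeps the test groups evenly sized.
    groups = [sample[i::num_groups] for i in range(num_groups)]
    return groups, remainder

groups, rest = split_into_test_groups(list(range(1000)), num_groups=3)
print([len(g) for g in groups])  # three roughly equal sample groups
```

Once responses from the sample groups are evaluated, the winning version is sent to the remainder of the audience.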

In an A/B test, the user can compare how recipients respond to different messages, layouts, content variations, or subject lines.

Different criteria to compare mailings in A/B testing:

Users can compare results for the following criteria:

  1. Different email layouts.
  2. Different content elements within the same email layout (a combination of different layouts and different content).
  3. Different email subject lines.

Compare email layout in an A/B test:

For example, a user might conduct an A/B test to send a mailing that includes a link to open a new account or receive a special offer.

  • Create one email communication that puts the call to action at the top of the message.
  • Create another communication that places the link at the bottom of the message.
  • Add each document as a separate test in the A/B test configuration to determine the more effective design.

To compare the different versions of the email layout, select Maximum Unique Clicks as the evaluation criterion.

Compare content designs in an A/B test:

In some cases, users might want to determine the effect of changes in content rather than changes in the layout of an email message.

For example,

  • Add multiple versions of a sign-up button to a zone in the email communication. In the A/B test configuration, add each version of the button as a separate test.
  • Compare results between using a graphic or a human photograph in a call to action.
  • Compare colors and graphics for headlines or sign-up buttons.
  • See what happens when you include the word Free in a headline.
  • Compare images that show people of different ages.
  • Compare a shorter message to a longer message.
  • Try different typefaces.
  • Compare a single-column layout to a multi-column layout.

Select Maximum Unique Clicks as the evaluation criterion.

Compare subject lines in an A/B test:

The objective of the test is to determine which subject line variation compels the greatest number of recipients to open the email.

Example:

  • Users can create several email documents, each with a different email subject line, and add each document to a separate test split in an A/B test.
  • Users might instead create a single email document in the Document Composer that contains multiple email subject lines.

To find the most appealing variation of the subject line, select Maximum unique views or Minimum unique complaints as the evaluation criterion.

Criteria to evaluate response in A/B test to decide the winner

As part of planning an A/B test, users must define the criteria the system uses to determine the winner. A/B testing provides three criteria for evaluating responses.

  • Maximum unique views.
    Considers how many different recipients opened the email. For example, you might specify Maximum unique views as the evaluation criterion when comparing email subject lines. The variation that receives the most unique views wins the A/B test.
  • Minimum unique complaints.
    Considers how many different recipients reported the email as spam or asked to unsubscribe. The system records a complaint when it records an unsubscribe request or an ISP indication that the recipient marked the message as unwanted email or spam. For example, when comparing email subject lines, using Minimum unique complaints as the evaluation criterion can help identify subject line variations to avoid. The variation that receives the fewest unique complaints wins the A/B test.
  • Maximum unique clicks.
    Considers how many different recipients clicked at least one link in the email. For example, users might use Maximum unique clicks as the evaluation criterion to compare recipient responses to different email designs or link placements. The variation that receives the most unique clicks wins the A/B test.
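The winner selection described above can be sketched in plain Python (an illustration of the selection logic only; the function and field names are hypothetical, not Unica Deliver's API):

```python
def pick_winner(results, criterion):
    """Pick the winning A/B test variation for a given evaluation criterion.

    `results` maps variation name -> dict of unique response counts.
    """
    if criterion == "max_unique_views":
        return max(results, key=lambda v: results[v]["unique_views"])
    if criterion == "max_unique_clicks":
        return max(results, key=lambda v: results[v]["unique_clicks"])
    if criterion == "min_unique_complaints":
        return min(results, key=lambda v: results[v]["unique_complaints"])
    raise ValueError(f"unknown criterion: {criterion}")

results = {
    "subject_A": {"unique_views": 420, "unique_clicks": 75, "unique_complaints": 12},
    "subject_B": {"unique_views": 510, "unique_clicks": 60, "unique_complaints": 30},
}
print(pick_winner(results, "max_unique_views"))       # subject_B
print(pick_winner(results, "min_unique_complaints"))  # subject_A
```

Note how the two criteria can disagree: the subject line that attracts the most opens may also attract the most complaints, which is why the evaluation criterion must be chosen up front.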

Target and Control group analysis

Introduction:

Comparing the responses of target groups versus control groups is a powerful analysis technique for measuring a marketing campaign’s effectiveness. In Unica Campaign, groups containing audience IDs that you purposely exclude from being targeted by the offer(s), for analysis purposes, are called control cells.

Target groups (cells) contain the audience IDs that are targeted by the offer(s). A portion of the audience is purposely excluded from the marketing campaign, based on different sampling techniques, to ensure that they do not receive the offer. After the campaign runs, the marketer compares the activity of the control group against those who received the offer to determine the campaign's effectiveness.

Target vs. Control In Unica:

In Unica Campaign, targets and controls can be defined in the Target Cell Spreadsheet or in the MailList/CallList process boxes of flowcharts. The contact history tables are populated when the user runs these flowcharts in production mode. The contact history identifies the members of control cells and the offers that were withheld (not sent to controls).

Responses received from audience IDs in the target group and from audience IDs in the control cells are populated into the response history of Unica Campaign. This history helps analyze and compare responses from target versus control cells for lift and ROI calculations.

Inferred responses from control groups

All responses from members of control groups (which are always hold-out controls in Campaign) are inferred responses. Since members of a control group did not receive any communication or offers, they have no tracking codes to return, so their responses are tracked by matching the attribute of interest.

Control groups are always tracked using multiple attributions: every response from a member of a control group receives full credit.

Different reports are available in Unica Campaign to compare the responses of the control group against the responses of the target group, for example, Lift Over Control Group (the increase in response compared to the control group).

Analysis and Reports

Example 1:

Say there are 98 policyholders whose premium due date is 31st March. The company wants policyholders to renew their policies before the due date, so it offers a 5% discount to 78 policyholders (the target group) and withholds the offer from the remaining 20 policyholders (the control group). This makes it possible to compare the effectiveness of the 5% discount offer by checking the target group's response rate against the response rate of the control group, which received no offer.

Out of the 78 unique offer recipients, say 50 responded positively to the offer; in this case, the response rate is 64.10%. Out of the 20 customers to whom the offer was not given, say 13 still deposited their policy premium; in that case, the response rate is 61.90%. There is thus no significant difference between the two groups, which suggests that the 5% discount offer was not effective. In this case, the value of Lift Over Control Group is 3.55%, as shown in the below image.

Campaign performance comparison Report
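The lift figure in the example above follows from the usual definition, lift = (target response rate − control response rate) / control response rate. A small sketch (plain Python, using the rounded rates stated in the example):

```python
def lift_over_control(target_rate, control_rate):
    """Lift Over Control Group: relative increase of the target group's
    response rate over the control group's response rate."""
    return (target_rate - control_rate) / control_rate

target_rate = 0.6410   # 50 of 78 offer recipients responded
control_rate = 0.6190  # control-group response rate as stated in the example
print(f"lift = {lift_over_control(target_rate, control_rate):.2%}")  # lift = 3.55%
```

A lift near zero, as here, means the offer made little difference; a large positive lift would indicate that the offer itself drove the responses.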

Example 2:
A campaign might target a group of customers who do not have checking accounts with a checking account offer. Members of the control group are tracked to see whether they open a checking account within the same time period as the checking account offer.

A higher value for Lift Over Control Group indicates a more effective campaign. A single control group can be used as the control for multiple target groups; however, each target group can have only a single control group.

Offer or offer version comparison using A/B testing

In Interact, A/B testing is used for comparative analysis of the responses received for offers or offer versions, to determine which offer version is best accepted.
In an Interact treatment rule, all properties are held fixed while only one parameter varies; this parameter is a combination of the offer and its offer attributes.

In A/B testing of a rule, each variation of offer and/or offer attributes represents one branch with specified audience distribution.

This A/B testing is valid only within the specified effective and expiration dates. If the expiration date is null, the test is in effect indefinitely.

To enable A/B testing, select the Enabled checkbox on the A/B testing tab. This adds the base rule as the first branch in the branch table. Additional branches can be added by clicking Add Branch.

This first branch is the base branch; the branches added afterward are the comparative branches.

To analyze A/B testing results, a column named ABTestBranch can optionally be added to the UACI_CHStaging and UACI_DtlContactHist tables in the Campaign database. It records the name of the selected branch when A/B testing is engaged in arbitrating the relevant smart rules.

The information in this column, together with other treatment fields, enables users to create reports on the performance of A/B testing branches for individual smart rules.
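As a hypothetical sketch (plain Python rather than an actual Unica report query; the record shapes are assumptions), per-branch response rates could be derived from contact records that carry the ABTestBranch value:

```python
from collections import defaultdict

def branch_performance(contacts, responses):
    """Summarize per-branch response rates.

    `contacts`: list of (audience_id, ab_test_branch) from contact history
    `responses`: set of audience IDs that responded
    """
    contacted = defaultdict(set)
    for audience_id, branch in contacts:
        contacted[branch].add(audience_id)
    # Response rate = responders within the branch / contacts in the branch.
    return {
        branch: len(ids & responses) / len(ids)
        for branch, ids in contacted.items()
    }

contacts = [(1, "branchA"), (2, "branchA"), (3, "branchB"), (4, "branchB")]
responses = {1, 2, 3}
print(branch_performance(contacts, responses))  # {'branchA': 1.0, 'branchB': 0.5}
```

In practice this aggregation would be done with a report or SQL query joining contact and response history on the treatment fields, grouped by ABTestBranch.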

To increase marketing effectiveness and return on investment, marketers should engage in comparative analysis using different techniques. Different HCL Unica products use techniques such as A/B testing and Target vs. Control analysis for comparative analysis. This helps avoid contact fatigue and achieve higher response rates, greater acceptance, and greater customer satisfaction. To understand how this works in Unica, refer to the product guide.

You can reach out to us with any further queries, and we will be happy to help.

Further Reading

  • Unica Journey – Handling Dynamic Content using Decision and Engagement Split (Marketing & Commerce, January 22, 2021)
  • Real-Time Offer Personalization with Unica Interact (Marketing & Commerce, January 13, 2021)
  • Unica Discover is Cloud Native! (Marketing & Commerce, January 8, 2021)