Behavioral Data as a Complement to Mobile Survey Data in Measuring Effectiveness of Mobile Ad Campaign

Thao Duong, Ph.D. Senior Statistical Analyst, comScore ([email protected]) Steven Millman Vice President, Survey Research and Modeling Group, comScore ([email protected])

Paper presented at the CASRO Digital Research Conference San Antonio, March 11-12, 2014

RESEARCH OBJECTIVE

During the last few decades, great advances in technology have revolutionized the way survey research is done, moving from human-administered surveys to online surveys and computerized data analysis. With the rapid increase in smartphone and tablet usage over the past few years, surveys are entering a new era: they can now be taken via mobile devices. In addition to the strengths of traditional online surveys (global reach, flexibility, convenience, question diversity, ease of data entry and analysis), surveys can now be taken on the go and in real time. Beyond surveys, large-scale behavioral data replaces human responses with automatic tracking of online traffic, which can improve the integrity of the data and uncover valuable insights that would otherwise remain hidden.

However, each method of collecting data still has drawbacks. Surveys taken via mobile devices still see low response rates, impersonal or systematic responses, and a skewed representation of the internet population. Behavioral data may not capture all of the information needed to understand the full attributes of the online population. To improve our market research process, we combine the tracking of online activities with interviewing users. One benefit is that we can lower the effect of impersonal or systematic responses by adding information that does not require input from the respondents. In addition, behavioral data compensates for the short attention span of mobile users and their resulting inability to take long surveys on the device. As market researchers, we look for ways to improve our ability to measure online advertising effectiveness by combining different sources of data in order to increase the data's integrity.
Recently, we have implemented a method to record human-mobile interaction with an ad campaign while also giving users the opportunity to complete surveys on mobile devices. To demonstrate the methodology, comScore was commissioned by Vibrant and worked with them and the IAB on research around the Mobile Rising Star ad units in an experimental design study. We describe the experiment in detail in the following section. In this paper, we address the following issues:

1. Rationale for the experimental design
2. Methodology for collecting survey data via mobile devices
3. Methodology for recording human-mobile interaction
4. Integration of survey data and interaction behavioral data in the analysis

Experimental Design


In general, researchers agree that experimental design is the ideal method for learning the effect of a treatment versus a control. In our context, users exposed to a given type of ad are considered to be in the test group; otherwise, they are in the control group. In the market research environment, two important questions advertisers care about are how online ad campaigns contribute to building a brand image, and what the advertising team can do to maximize a campaign's contribution while maintaining a neutral customer experience online. Hence, experimental design is used to study the effect of a specific ad type on users' perception of a brand. This approach eliminates external factors and biases that may affect the results or increase the variation in the data, including cross-contamination, exposure to non-advertising influencers, and multiple exposures.

In this case study, we recruited users to complete a "lab" exercise while randomly exposing them to one of several mobile ad units (a standard mobile banner ad format, an adhesion banner expandable filmstrip, or an in-text trigger ad) or to content-only pages without any ad unit present, and then asked them to complete the brand lift survey via a survey link on the content URL. The ad units were either not yet in field or were designed specifically for this experiment. The control group (those who saw no ad unit) provides a baseline for studying ad effectiveness across the different types of ad units.

Survey Data Collection

In our experiment, we sought to learn which advertisement style is most effective among the available ad units, how it compares to a non-advertisement environment, and whether interaction with the ad improves customers' perception of the brand. To achieve these goals, we couple mobile survey research with interaction measurement.
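The random assignment described above can be sketched as follows. This is an illustrative sketch, not comScore's production recruitment system; the cell names and the uniform-assignment rule are assumptions for the example.

```python
import random

# Hypothetical experimental cells: the control group plus the three
# mobile ad units described in the text. Names are illustrative.
CELLS = ["control", "standard_banner", "adhesion_banner", "in_text_trigger"]

def assign_cell(rng: random.Random) -> str:
    """Assign one recruited participant to an experimental cell at random."""
    return rng.choice(CELLS)

rng = random.Random(42)  # fixed seed so the sketch is reproducible
assignments = [assign_cell(rng) for _ in range(1000)]
counts = {cell: assignments.count(cell) for cell in CELLS}
print(counts)
```

In practice, a quota-based scheme (stopping assignment to a cell once its quota of completes is reached) would replace the simple uniform draw shown here.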
comScore maintains an online panel whose members can be contacted via email and invited to participate in surveys. Specifically, we emailed panelists and asked them to switch to their mobile device for the next step of the study. During recruitment, users are directed to click on a URL within the instructions from their mobile device. This URL opens a website containing content relevant to the advertiser's campaign (for example, an article about snacking, when the experiment involved a cookie advertiser). If users tried to access the website from a non-mobile device, they were prompted to retry from a mobile device; upon a second attempt from a non-mobile device, the participant was screened out. This screening is possible because we can sniff the device user agent present in the browser, which serves as the hidden variable that drives the screening process.

Once users are verified as being on a mobile device, they are instructed to act as they normally would on the website and told that a button at the bottom of the page will activate when the survey is ready to begin. The survey includes a short set of brand impact measures and attitudinal questions pertaining to the specific ad unit. To achieve a balanced sample, we set a minimum quota of 200 completes in each group (control, standard ad format, adhesion banner, and in-text trigger). To avoid bias from the popularity of any one brand, the experiment was set up for three different brands in the food and technology industries, and to keep responses unbiased toward any particular brand, the questionnaires were kept very similar across brands.

Interaction Measurement

To measure interaction rates, we implemented a number of pixel calls on every engagement point for each of the mobile ad units. For instance, a pixel call is made when the user swipes or clicks the ad to move to the next level of content. The final pixel call is made when a participant clicks to close the ad unit or returns to the content on the website. Time stamps associated with each interaction were captured in the pixel calls and used to calculate interaction time: total interaction time begins with the pixel call for the first engagement and ends with the last engagement, when the user closes the ad or chooses the "back" option. Through this process, we built a data set of interaction metrics across the various mobile ad units in the study. An additional pixel captures a randomly generated ID that is also recorded in the survey data, which enables us to tie the two sources of data together at the participant level.
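The aggregation of pixel calls into interaction metrics, and the join to survey records via the random ID, can be sketched as below. This is a simplified illustration under assumed field names (`random_id`, timestamps in seconds, a `top2_opinion` survey field), not comScore's production pipeline.

```python
from collections import defaultdict

# Assumed pixel-call log: (random_id, timestamp of the engagement pixel).
pixel_calls = [
    ("abc123", 100.0), ("abc123", 104.5), ("abc123", 112.0),  # three engagements
    ("def456", 200.0),                                        # a single engagement
]
# Assumed survey records keyed by the same randomly generated ID.
survey_records = {"abc123": {"top2_opinion": 1}, "def456": {"top2_opinion": 0}}

# Group the timestamps by participant ID.
times = defaultdict(list)
for rid, ts in pixel_calls:
    times[rid].append(ts)

# Total interaction time runs from the first engagement pixel to the last;
# the join on the random ID ties behavior to survey answers.
merged = {}
for rid, stamps in times.items():
    merged[rid] = {
        "interaction_seconds": max(stamps) - min(stamps),
        "interacted": len(stamps) > 1,  # binary flag, as used later in the analysis
        **survey_records.get(rid, {}),
    }
print(merged)
```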
Findings

In this study, each outcome metric is on either a 5-point or a 7-point scale. For the analysis, we converted these scales to binary, treating the top 2 ratings as favorable responses to the question asked and all other ratings as not favorable. For instance:

What is your overall opinion of the brand after seeing the ad?


1. Very favorable
2. Somewhat favorable
3. Neutral
4. Somewhat unfavorable
5. Very unfavorable
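The top-2-box conversion can be written as a one-line rule: on the 5-point scale above, ratings 1 ("Very favorable") and 2 ("Somewhat favorable") map to 1, everything else to 0, and the same rule applies to the 7-point metrics. A minimal sketch:

```python
# Top-2-box binarization: ratings are coded 1 = most favorable, so the
# top 2 boxes are ratings 1 and 2 regardless of scale length.
def top2_box(rating: int, top_k: int = 2) -> int:
    return 1 if rating <= top_k else 0

# Illustrative responses on the 5-point scale above (not real study data).
responses = [1, 3, 2, 5, 2, 4, 1]
binary = [top2_box(r) for r in responses]
print(binary)  # [1, 0, 1, 0, 1, 0, 1]
```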

We present the results as point lift, the difference between the estimated top-2-box proportions for two different ad types, or between an ad type and the control. In this paper, we show only the results comparing the standard banner ad and the Mobile Rising Star ad. Preliminary analysis of the combined data (including all three brands) indicates that the Mobile Rising Star (MRS) ad units generate about the same effects as the standard banners on most measures. For recall of the ad message, the MRS ad actually generates a significantly smaller lift than the standard banner. However, this may not tell the entire story.

Table 1. Point lift: standard banner vs. MRS ad (adhesion banner and in-text branded keyword ad), Top 2 Boxes

Metric                                              Standard Banner    MRS ad    Point Lift
Impression of the brand                                   20%            24%         +4%
Recall the brand                                          54%            55%         +1%
Resonate emotionally with positive brand memories         33%            32%         -1%
Recall the ad message                                     53%            44%         -9%
Recommend the brand                                       64%            62%         -2%
Improved impression of content                            10%            12%         +2%
Ads are more engaging                                     21%            23%         +2%
Ads are more attention grabbing                           38%            41%         +3%
Complimented the content                                  31%            32%         +1%
Ads are more enjoyable                                    32%            30%         -2%
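The point-lift figures in the table above are differences between top-2-box proportions, reported in percentage points. A minimal sketch with made-up sample responses (the real study data are not reproduced here):

```python
# Point lift: difference between the top-2-box proportions of two groups,
# in percentage points.
def top2_proportion(binary_responses):
    return sum(binary_responses) / len(binary_responses)

def point_lift(test_responses, control_responses):
    return 100 * (top2_proportion(test_responses) - top2_proportion(control_responses))

# Illustrative binary (top-2-box) responses: 20% vs. 24% favorable.
standard = [1] * 20 + [0] * 80
mrs      = [1] * 24 + [0] * 76
print(f"{point_lift(mrs, standard):+.0f}%")  # prints +4%
```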

Since the interaction rate for the standard banner is very low (under 5%) compared to that for the MRS ad types (around 10%), we show the lifts after including interaction data for the MRS ad types only. Results are shown in Table 2. After touching or swiping an MRS ad, participants are taken to a full-page takeover of the ad. This full-page takeover results in significantly higher branding influence and a better advertising experience compared to standard mobile banner ads.


Table 2. Point lift: standard banner vs. interaction with MRS ad (adhesion banner and in-text branded keyword ad), Top 2 Boxes

Metric                                              Standard Banner    Interaction with MRS ad    Point Lift
Impression of the brand                                   20%                    38%                 +18%
Recall the brand                                          54%                    90%                 +36%
Resonate emotionally with positive brand memories         33%                    44%                 +11%
Recall the ad message                                     53%                    63%                 +10%
Recommend the brand                                       64%                    72%                  +8%
Improved impression of content                            10%                    21%                 +11%
Ads are more engaging                                     21%                    37%                 +16%
Ads are more attention grabbing                           38%                    56%                 +18%
Complimented the content                                  31%                    43%                 +12%
Ads are more enjoyable                                    32%                    42%                 +10%

Conclusions

In this paper, we have discussed a methodology for collecting interaction data for different ad types on mobile devices using additional pixel calls. This provides information beyond what a traditional online survey can deliver. Since surveys taken on mobile devices are usually short, we cannot capture all of the information we would like; by adding interaction data, we get a clearer picture and come closer to the truth in measuring the effect of ad types on mobile devices. Although we can measure the length of the interaction period, for simplicity we treated interaction as a binary variable in the analyses. We found that users typically only view standard banners but are more likely to interact with adhesion banners and in-text branded keyword ads. This interaction drives higher lifts, significant at the 90% confidence level.
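The 90%-level significance claim can be checked with a standard two-proportion z-test. The paper does not specify the exact test comScore used, so this is a hedged sketch; the counts below are illustrative, using the minimum quota of 200 completes per cell.

```python
import math

def two_prop_z(success_a, n_a, success_b, n_b):
    """Pooled two-proportion z-test; returns (z statistic, two-sided p-value)."""
    p_a, p_b = success_a / n_a, success_b / n_b
    p_pool = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # two-sided p-value from the standard normal CDF via math.erf
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Illustrative counts: 90% vs. 54% top-2-box with 200 completes per cell.
z, p = two_prop_z(180, 200, 108, 200)
print(round(z, 2), p < 0.10)  # a lift is significant at the 90% level when p < 0.10
```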

