Branded Content: Measuring success

The BCMA’s CEO Andrew Canter discusses measuring branded content success, and the content monitor and placement monitor tools developed in partnership with Ipsos MediaCT. This article was originally published in the Best of Branded Content Marketing: Volume I, curated by BCMA Global Council and UK Advisory Board member Justin Kirby as part of the BOBCM partnership with the BCMA:

Measuring a branded content campaign can be the market research equivalent of putting together IKEA furniture – simple in concept, surprisingly difficult in practice. And like that Aardvark wardrobe you have spent several hours trying to assemble, you can be left looking at lots of components without knowing how they relate to one another and unsure if the whole thing will actually fit together.

As anyone who has tried to evaluate the success of a branded entertainment campaign will tell you, there are numerous challenges to overcome.

The first is that what constitutes branded content is very diverse. It could be an event taking place over a few minutes, hours, days or months; it might be a 90-second online video, a 30-minute TV or radio show or a feature-length movie; it could exist in the virtual or the real world; it might require consumer participation/interaction or be entirely passive. Whatever it is, it is unlikely that consumers will regard it as advertising, and therefore we cannot talk to them about it in those terms.

Secondly, branded content is often part of a wider campaign, and isolating its success or effect can be difficult. In particular, what did it contribute compared to conventional advertising? How well did the marketing support for a piece of branded content perform relative to the actual content? It might be that the promise of the content was stronger than the actual delivery, and in some cases a campaign succeeds not because of the branded content itself, but because of the campaign elements supporting it.

Another issue is identifying what is working well in the branded content: whether a ‘less is more’ approach would have been more successful, or, conversely, whether the branded element was so subtle that many consumers missed it. This is one of the most common debates between brand owners and the makers of the content, with the former naturally tempted to over-acknowledge their own magnanimity in providing this great content.

These are just some of the more common challenges; you can be sure that each branded content campaign will bring several more of its own. When evaluating a campaign therefore, the first thing to recognise is that whatever approach you take, it is unlikely to be perfect. Like that piece of IKEA furniture, good enough is the aim, not design perfection.

In general, it is easier to address the challenges outlined above using an experimental design, rather than trying to track a campaign in-market. Unlike an in-market tracker, in an experimental design we are not seeking to find people who have seen a campaign in the real world. Instead, we are creating an environment in which they are being exposed to the campaign or elements of it in a very controlled way, and without them knowing that our interest is in the brand featured in the content (more on how we control this exposure in a moment). An experimental design therefore employs multiple test cells versus a control group to enable the research team to evaluate individual elements of the campaign, as well as the campaign as a whole.

As a simple example, we can evaluate the impact of both a piece of branded content and the marketing support for it using a four-cell design. ‘Test Cell 1’ is exposed to just the branded content, ‘Test Cell 2’ to just the marketing support and ‘Test Cell 3’ to both the branded content and the marketing support. The Control Cell does not see any element of the campaign. Using this approach, we can isolate the impact of the support from the actual branded content, evaluating them individually and in combination. An in-market tracking approach may struggle to identify enough consumers who were exposed to just one particular element of the campaign, making it very difficult to isolate and ultimately understand and optimise its effect.
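
To make the logic of the four-cell design concrete, here is a minimal sketch in Python. The cell names, the brand metric and all the figures are hypothetical, invented purely to illustrate how lift and synergy could be read from such a design; this is not contentmonitor output or methodology.

```python
# Share of respondents in each cell agreeing with a brand metric,
# e.g. "I would consider this brand" (hypothetical numbers).
cells = {
    "control": 0.30,               # saw no element of the campaign
    "content_only": 0.38,          # Test Cell 1: branded content only
    "support_only": 0.34,          # Test Cell 2: marketing support only
    "content_and_support": 0.45,   # Test Cell 3: both elements
}

baseline = cells["control"]
for cell, score in cells.items():
    if cell == "control":
        continue
    # Lift of each exposed cell over the unexposed control cell.
    print(f"{cell}: {score - baseline:+.0%} vs control")

# Because each element is also tested in isolation, we can check for
# synergy: is the combined lift greater than the sum of the parts?
synergy = (cells["content_and_support"] - baseline) - (
    (cells["content_only"] - baseline) + (cells["support_only"] - baseline)
)
print(f"synergy between elements: {synergy:+.0%}")
```

Run with these invented numbers, the combined cell shows a +15-point lift against +8 and +4 for the individual elements, i.e. a +3-point synergy: exactly the kind of reading an in-market tracker struggles to produce.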

[Figure: one element (e.g. just branded content) in a campaign requires one test cell and one control cell]

[Figure: two elements (e.g. branded content and product placement) require four cells]

Respondents in all cells answer the same survey, which covers a number of pre-determined brand metrics as well as feedback on the branded content and campaign. With this design it is essential that the sample size and profile of each cell – including the number of brand users per cell – are tightly controlled. By doing so, we can make the statement that, other things being equal, the impact of campaign element x is y for a brand.
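
As an illustration of the kind of comparison this tight control makes possible, the sketch below applies a standard two-proportion z-test to one hypothetical test cell versus the control. The sample sizes and scores are made up, and the article does not describe contentmonitor’s actual statistical machinery; this is simply one conventional way to check that a cell-versus-control difference is unlikely to be noise.

```python
from math import sqrt

def two_proportion_z(p1: float, p2: float, n1: int, n2: int) -> float:
    """z-statistic for the difference between two observed proportions."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Hypothetical: 200 matched respondents per cell; 45% agreement with a
# brand metric in the test cell vs 30% in the control cell.
z = two_proportion_z(0.45, 0.30, 200, 200)
print(f"z = {z:.2f}")  # |z| > 1.96 -> significant at the 95% level
```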

The obvious weakness of an experimental design is that it conveniently ignores the real-world problems of both the reach and the relative cost of the different campaign elements. After all, the fact that an in-market tracker might struggle to identify people who saw just the branded content and not the marketing support is itself a finding. Without a reality check on cost and reach, that experiential element which was experienced by the 100 sober people at Glastonbury looks like a real winner in an experimental design.

  • Depending on the size of your campaign and your research objectives, there are two parts to the analysis.
  • The first part evaluates how well the campaign has worked vs. the objectives and identifies strengths and weaknesses in the creative approach.
  • The second involves Pointlogic’s Commspoint Influence™ media planning system. Commspoint Influence takes the survey results and translates them into a response curve to capture diminishing returns. It also factors in costs, reach, frequency and other planning detail. In this part of the analysis we can look at various ‘what if’ scenarios to see how the campaign would have performed vs. each objective with a different spend and a different mix.
  • Both parts (if applicable) are combined into a single executive summary; moreover, you will also be able to specify your own analysis using the Commspoint Influence system.

The solution is to anchor the data to the real world using a system such as Pointlogic’s Commspoint Influence planning system. The survey data from this type of design measures the ‘power’ of individual and combined elements of a campaign. A system like Commspoint Influence then factors in cost, reach, halo effects and so on to convert it into meaningful data that a planner can use.

This combination of granular data on each campaign element (in isolation and in combination), integrated with media planning software, allows for a lot of ‘what if?’ analyses – useful when marketers seemingly have so many unanswered questions around the role and impact of branded content. For example, this design could help answer questions such as: what if the marketing support for the branded content component were doubled, tripled or halved? What if the rest of the campaign had made more use of online pre-rolls and social media at the expense of TV? What if the branded content had carried stronger branding? This approach also enables a closer look at reactions to the content to identify its strengths and weaknesses.
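
As a hedged illustration of how such ‘what if?’ questions can be explored, the sketch below runs a hypothetical support budget through a generic diminishing-returns response curve. Commspoint Influence’s actual model is proprietary and not described in this article; the curve shape, uplift ceiling and decay constant here are all invented for illustration.

```python
from math import exp

def response(spend: float, max_effect: float, k: float) -> float:
    """Saturating response curve: effect grows with spend but flattens out."""
    return max_effect * (1 - exp(-k * spend))

# Hypothetical: marketing support with a ceiling of 12 points of
# brand-metric uplift, with the decay constant k notionally fitted
# from the experimental survey data.
BASE_SPEND = 100_000
for multiplier in (0.5, 1.0, 2.0, 3.0):  # halved, as planned, doubled, tripled
    uplift = response(BASE_SPEND * multiplier, max_effect=12.0, k=0.00001)
    print(f"{multiplier:.1f}x spend -> {uplift:.1f} pts uplift")
```

With these made-up parameters, halving spend yields about 4.7 points of uplift, the planned budget about 7.6, and tripling it only about 11.4: the diminishing returns that make a naive ‘more spend is better’ reading misleading.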

As with any approach, the devil is in the detail. A crucial element of an experimental design is exposing consumers to the campaign (or elements of it) in a way that feels natural, and without ‘hot-housing’ it. The beauty of branded content is that the consumer will be inclined to believe the interest of the researcher is in the nature of the content itself and not the brand behind it, which helps with this ‘disguising’ process.

Consumers can be exposed to a campaign in a number of ways. It could form part of an online survey in which we recreate different media experiences, or it could make use of a media lab (or focus group facility) to which respondents are invited, individually, to watch TV, surf the web and so on. It does require some effort on the part of the research agency to stage-manage the exposure, but this is essential to producing accurate data.

The BCMA’s contentmonitor uses this approach and has helped numerous brand owners to quantify the success of their branded content campaigns.

“The IPA very much welcomes this BCMA initiative to bring greater transparency and accountability to the measurement and understanding of branded content.”

Lynne Robinson, Research Director, IPA

There are two parts to the BCMA ‘Monitors’: the front end is Ipsos MediaCT’s versatile and precise content evaluator; the back end is Pointlogic’s Commspoint Influence system, which is used by many of the leading media agencies. The Ipsos MediaCT component evaluates the power of the different elements of the campaign in meeting the objectives. The Pointlogic component takes this data and converts it into a media planning tool – looking at the power and reach of each element, as well as the synergy between elements.

The output answers the initial questions of ‘did my campaign work?’ and ‘which elements of it were most powerful?’, then goes on to reveal how the results could have been different with a different media spend and mix. Moreover, it enables you to change the importance of different objectives and see the implications for the media plan.

“We’re living in an age of accountability, and recent austerity has forced the pace. We’re also living in an age of media channel proliferation, which has led to stiff competition for advertising revenue. Channels seeking consideration by potential advertisers will have to underpin their claimed contributions or they will perish. The BCMA’s initiative puts branded content firmly in the frame.”

Bob Wootton, Director of Media & Advertising, ISBA

An integral part of the analysis is based on measuring the emotional and cognitive response to the branded content. There is much evidence supporting the importance of emotion as both gatekeeper and driver of decision making.*

Monitors give marketers an in-depth insight into:

  1. How your campaign is performing against key brand metrics, allowing you to determine its ROI.
  2. Which elements of the campaign are performing most strongly in meeting the campaign objectives.
  3. What you could do differently to support the campaign and optimise its ROI.

The BCMA contentmonitor includes the Cognitive and Emotive Power (CEP) test, devised by Dr Robert Heath, author of The Hidden Power of Advertising. The CEP test involves a forced exposure of the test branded content/ad, followed by a series of questions where the respondent rates the branded content/ad on 10 different elements. Using an algorithm, the responses are converted into a score for:

  • Emotive Power (strength of subconscious feeling)
  • Cognitive Power (strength of conscious thinking)

* Damasio (1994): rational decision making is ‘hard-wired’ to our emotions. Zajonc (1980) and Damasio (1999): the processing of emotions is independent of working memory (cognition) and does not require attention. Shiv & Fedorikhin (1999): emotion drives decision making when time is constrained, with people relying more on intuition. Watzlawick (1962): relationships are driven not by the rational message in advertising but by its emotional content; it’s not just what you say, but how you say it.

[Figure: ‘traditional’ advertising can find it hard to be both emotional and cognitive at the same time]

[Figure: branded content and traditional advertising can work together to produce a campaign that is both emotive and cognitive]

One such example is HSBC Private Banking’s partnership with CNBC. This involved short films, created specifically for the TV channel, featuring alternative investment opportunities in various markets. The on-air films were complemented by HSBC-branded web pages on the CNBC website, the branding taking the form of banners and pre-roll sponsorship credits.

The campaign was evaluated among 392 high-net-worth individuals (with £1m+ of liquid assets) in the UK (n=100), New York (n=117), Hong Kong (n=119) and Brazil (n=56). Two elements of the campaign were tested, both individually and in combination: a two-minute branded content film and the web pages.

The research found that the branded content on TV was effective in ‘sneaking in under the radar’. The information contained in the videos was pitched at the right level, as was the level of branding. The web pages had a strong synergy with the video and helped to dial up the branding of the campaign.

Overall, the combination of these two elements had a strong impact across a range of key brand metrics. The response from this difficult-to-please audience was similarly impressive – 76% said that it got their attention, while 67% agreed that they would like to see more of this type of advertising in the future.

HSBC were delighted with the results, and its Group Head of Marketing Insight & Planning said: “It’s usually very hard to determine the impact of a campaign retrospectively when the target audience, like ours, is so difficult to reach. What this research has given us is real insight, not only into whether or not our target audience liked the campaign, but also how different elements made them feel towards our brand, and which messages were coming across strongest.”

This article was originally published in the first edition of the Best of Branded Content Marketing (BOBCM) ebook series, and specifically thanked Ian Wright for his valuable contribution. Ian was then Executive Vice President, Corporate Development at Ipsos OTX MediaCT, and was formerly the Research Director of BCMA North America Inc. He is now at Tapestry Research, and more recently kindly participated in a BOBCM-facilitated round table discussion at Havas with Tim Foley of Pointlogic and others. Ian discussed Dr Heath’s CEP Test mentioned above, as well as new ways of measuring branded content using the latest technologies. You can read the write-up of the round table discussion here.

Full contentmonitor case studies are exclusively available to BCMA members, but there’s one available in the Products & Services section of this site where you can also find out more about the tool.