There is now a growing recognition that advertising can be more effective when it engages the viewer. This is most likely because it can shape perception, guide attention, influence memory retrieval, alter values and beliefs, communicate how we feel to others, and guide decision-making. The problem for market researchers is how to measure specific emotional responses accurately. Whilst it is relatively straightforward to ask respondents to explicitly state how they feel after watching or listening to a message, or to turn a dial clockwise or anticlockwise depending upon how they feel, the response obtained is likely to be a global measure of their current emotional state. Furthermore, such methods may become contaminated by participants’ expectations about the experiment, demand characteristics of the study, the need to give an impression to the researcher, and the real difficulty of introspecting accurately and locating one’s feelings.
An alternative approach might be to use physiological or biometric measures, but although such measures can provide strong indications of a person’s arousal, interpreting them requires additional measures. In addition, physiological measures require more expensive testing at central locations. Recent advances in implicit reaction time (IRT) testing offer a way to circumvent these problems, and the method has had considerable success in providing an objective measure of attitudes in consumer research.
Affective priming is an IRT that was developed by Fazio and colleagues to measure automatic (immediate and subconscious) evaluations by comparing reaction times to emotional material. In its original version, the task is to categorize a word as either positive or negative by pressing one key on a computer keyboard for positive words (fun, happy, etc.) and another key for negative words (dull, sad, etc.) as quickly as possible. In a second phase of the task, a word prime is presented before the target word. The prime itself is either a positive or negative word, such as joy or sad. When the prime and the target have the same valence (e.g., joy + happy) the time taken to detect the target (happy as a positive word) is faster than when the prime and the target have opposite valence (e.g., sad + happy). In the context of consumer behavior, therefore, if someone has positive feelings for a particular brand, such as MTV, then they will be quicker to detect a positive word when paired with MTV than a negative word when it is paired with MTV.
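The priming effect described above can be quantified as a simple difference in mean reaction times. The sketch below is illustrative only: the reaction times are invented, and the calculation is a generic congruent-minus-incongruent comparison rather than the exact analysis used by Fazio and colleagues.

```python
# Hypothetical sketch of quantifying an affective priming effect.
# All reaction times (in milliseconds) are invented for illustration.

def mean(xs):
    return sum(xs) / len(xs)

# RTs to the target word "happy" under the two prime conditions
congruent_rts = [512.0, 498.0, 505.0, 520.0]    # prime "joy" + target "happy"
incongruent_rts = [561.0, 548.0, 570.0, 555.0]  # prime "sad" + target "happy"

# A positive effect means congruent primes sped up categorization,
# i.e. prime and target share valence for this participant.
priming_effect = mean(incongruent_rts) - mean(congruent_rts)
print(f"Priming effect: {priming_effect:.1f} ms")
```

If a participant held positive feelings toward a brand used as the prime, the same computation would show faster responses to positive targets paired with that brand.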
Fulcher, Trufil, and Calvert have just published an article in the International Journal of Market Research describing the Implicit Affective Congruency Test or IMPACT. This test works on the same principle as affective priming, but with an audiovisual clip running in the background. This time the prime is not a word but the content of the audiovisual clip at the moment the target word is presented. Hence the audiovisual clip will influence the speed and accuracy of responding because it interferes with the task. Words that are congruent with how the clip makes the viewer feel (e.g., happy) are easier to react to, while words that are incongruent with the viewer’s feelings (e.g., sad) are harder to react to. This test provides a way to measure a viewer’s feelings towards a commercial or indeed any type of audiovisual content.
Here we demonstrate that IMPACT is able to measure specific emotional responses to two videos. The first video was chosen to elicit positive, happy feelings and consists of military personnel returning home to their loved ones (Positive Video). The second video is a Cruelty to Children Must Stop advert from the NSPCC and was chosen to elicit sadness (Negative Video). For the Positive Video, four different IMPACTs were used, each measuring one of four bipolar emotions: Happy vs Sad, Pleased vs Disgusted, Calm vs Afraid, Surprised vs Bored. The task in each case was for participants to categorize the valence of a word shown centrally on the screen, pressing one key for a positive word and another key for a negative word, as quickly as possible. For the Negative Video, we used one IMPACT: Happy vs Sad.
We followed the method outlined by Fazio for analyzing the results, and for each test a facilitation index (FI) was computed for every word presented and for every two seconds of each video. An FI greater than zero implies a positive emotional response and an FI less than zero implies a negative emotional response. For the Positive Video, words from the Happy category produced the strongest positive overall FI, followed by the Pleased category, and then by Calm, but the Surprised category was close to zero (indicating that neither surprise nor boredom was experienced overall). Moment-to-moment FIs for Happy vs Sad are plotted in Figure 1. The interpretation is that participants felt happy, pleased, and calm, but neither surprised nor bored, when watching this video.
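The article does not spell out the exact facilitation-index formula, so the sketch below assumes one common definition: baseline reaction time minus reaction time during the video, signed by the word's valence. The function name, sign convention, and numbers are illustrative assumptions, not the published method.

```python
# Hedged sketch of a facilitation index (FI), assuming the common
# "baseline RT minus primed RT" definition. Numbers are illustrative.

def facilitation_index(baseline_rt, video_rt, valence):
    """Return an FI for one target word.

    valence: +1 for a positive word, -1 for a negative word.
    FI > 0 suggests a positive emotional response, FI < 0 a negative one.
    """
    return valence * (baseline_rt - video_rt)

# A positive word categorized faster during the clip -> positive FI
print(facilitation_index(600.0, 540.0, +1))   # 60.0

# A negative word categorized faster during the clip -> negative FI
print(facilitation_index(600.0, 540.0, -1))   # -60.0
```

Averaging such per-word FIs within each two-second window would yield the kind of moment-to-moment trace plotted in Figures 1 and 2.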
The Negative Video elicited a negative emotional response overall. The trace in Figure 2 clearly shows that the emotional response in viewers is almost entirely negative throughout. There is a small positive peak around the middle of the commercial, which coincides with the caption “£2 a month” displayed over the NSPCC logo and a telephone number.
IMPACT can determine the emotional responses to audiovisual content on a scene-by-scene basis and is able to detect highly specific emotions and feelings. It is useful for evaluating the creativity and effectiveness of audiovisual content, such as television and web-based adverts. The test is an indirect measure of emotion and hence is not susceptible to effects of social desirability or insincere responding. It can be carried out online, which could save considerable resources, and is not a difficult task. This new method opens a new avenue of research for examining the effectiveness of advertising. It can detect the general emotions elicited by an advert as well as very specific feelings. These are useful things to know, since they inform how engaged the viewer is with the advert, and hence the brand.
This article was originally published in the Neuromarketing Yearbook 2017.