AI + IQ = Bonsai
What if an artificial intelligence could answer one of the most important questions in advertising effectiveness research at the push of a button: Which human needs does this one advertising medium, this one image a brand sends out, actually trigger? Sounds good? It is. And it already works.
Artificial intelligence determines the relevant emotions an image or video triggers - without surveying anyone, with just a few clicks, within a minute. Even the AI needs a few seconds to create an "emotional footprint" of the advertising medium by comparing it with other logos, packaging, posters, websites, social media photos, moving images and advertisements.
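To make the principle tangible, here is a minimal, purely illustrative sketch of how such a footprint could be assembled in general terms: the new visual is compared with reference visuals whose emotional profile is already known, and the similarities are aggregated into a profile. This is not the NeedScope method - the embeddings, segment labels and numbers below are invented for illustration only.

```python
# Hypothetical sketch of an "emotional footprint": compare a new visual with a
# reference library of visuals whose emotional positioning is known, then
# aggregate the similarity-weighted profiles. Not the NeedScope implementation;
# all data here is invented.
import numpy as np

EMOTION_SEGMENTS = ["red (extroversion)", "purple (superiority)", "blue (introversion)"]

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def emotional_footprint(new_visual: np.ndarray,
                        reference_visuals: np.ndarray,
                        reference_profiles: np.ndarray) -> np.ndarray:
    """Weight each reference visual's known emotion profile by its
    similarity to the new visual and normalise the result."""
    weights = np.array([cosine(new_visual, ref) for ref in reference_visuals])
    weights = np.clip(weights, 0, None)          # ignore dissimilar references
    footprint = weights @ reference_profiles
    return footprint / footprint.sum()

# Toy example: three reference visuals as 4-dimensional embeddings,
# each with a known distribution over the three emotion segments above.
refs = np.array([[0.9, 0.1, 0.0, 0.2],
                 [0.2, 0.8, 0.3, 0.1],
                 [0.1, 0.2, 0.9, 0.4]])
profiles = np.array([[0.8, 0.1, 0.1],
                     [0.2, 0.7, 0.1],
                     [0.1, 0.1, 0.8]])
new_ad = np.array([0.85, 0.2, 0.1, 0.15])        # embedding of the new ad

for segment, share in zip(EMOTION_SEGMENTS, emotional_footprint(new_ad, refs, profiles)):
    print(f"{segment}: {share:.0%}")
```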
Add human intelligence to the artificial intelligence and it becomes even more exciting - and takes only slightly longer: a brand strategist interprets the results and, within a day, delivers an initial assessment of whether a brand's imagery is emotionally coherent and "right".
If you want to go deeper and understand the background, you need more, of course. But the AI, which has been trained for ten years on one brand visual after another, is fun and useful even without a deep dive into the background. It inspires and surprises. By the way, the AI is part of the brand strategy tool NeedScope, which measures the emotions that decide whether a brand is loved and bought. The German licence for NeedScope - and thus also for the AI that decodes emotions - was taken over by Bonsai Research on 1 April 2021 (no joke!).
But what images does the automotive industry actually send out, and what emotions do they arouse? And what do perfume manufacturers do better? Thomas Hoch, brand strategist at Bonsai, will be happy to show you this week at p&a insights 2021 / the HORIZONT Advertising Effectiveness Summit in Frankfurt - or at another date (thomas.hoch@bonsai-research.com).
Emotional clarity strengthens sales
The background: Brands communicate on a wide range of platforms via a wide range of advertising media - they use shapes and colours, depict objects, people and animals. The clearer and more unambiguous the emotions a brand arouses, the better it sells. This is shown by more than 10,000 NeedScope projects in more than 100 countries.
But what can an AI that decodes the emotions triggered by (brand) images actually do? Here, too, four pictures say more than a thousand words - if you know how to interpret them. Roughly speaking, the different colour segments stand for different emotions (details available from Bonsai Strategy).
What can AI do? A sneak preview from the sneaker sector
A first look at the emotions triggered by Nike and Reebok with two ad motifs: For the Nike ad, the AI shows a clear emotional positioning in the red area - not an alarm signal here, but the colour for extroversion, for being "full of energy". A perfect implementation of the Nike slogan "Just do it", which no longer even appears in the ad (it doesn't have to).
The Reebok ad is quite different: Here, no such uniform emotional message comes across. The ad arouses a mix of emotions. The dominant blue stands for introversion, for objectivity and an analytical mindset (evoked by the straight, clearly structured rows of houses). Purple, in second place, stands for a feeling of superiority, for wanting the best, and so on.
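The difference between the two ads can also be summarised in one number: how far the dominant emotion segment leads over the runner-up. The sketch below is purely hypothetical - the threshold and the example shares are invented and only illustrate the contrast between a clear profile (as in the Nike example) and a mixed one (as in the Reebok example).

```python
# Hypothetical "emotional clarity" summary: a single dominant segment scores
# as "clear", a split profile as "mixed". All numbers are invented.
def emotional_clarity(footprint: dict[str, float]) -> tuple[str, float]:
    """Return the dominant segment and its lead over the runner-up."""
    ranked = sorted(footprint.items(), key=lambda kv: kv[1], reverse=True)
    (top_segment, top_share), (_, second_share) = ranked[0], ranked[1]
    return top_segment, top_share - second_share

clear_profile = {"red": 0.72, "purple": 0.15, "blue": 0.13}   # invented shares
mixed_profile = {"blue": 0.40, "purple": 0.35, "red": 0.25}   # invented shares

for name, fp in [("ad A", clear_profile), ("ad B", mixed_profile)]:
    segment, lead = emotional_clarity(fp)
    verdict = "clear" if lead > 0.30 else "mixed"
    print(f"{name}: dominant segment {segment}, lead {lead:.0%} -> {verdict}")
```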
Anyone who wants to delve deeper, see more examples or even put the AI to work on their own brand - "feeding" it their own brand story in pictures or films - is welcome to contact Thomas Hoch and the NeedScope team.