Machines can read human emotions based on our facial expressions, pulse, and heart rate. They will tell you whether you are sad, disgusted, or joyful, and, more importantly, they may one day give the same kind of feedback about your clients. Could reactions to offers, products, or customer service be measured instantly through automated interfaces?
There’s a famous study on emotions conducted by Paul Ekman. He traveled the world and studied several cultures, searching for emotions that are universally shared and recognised across societies. Using photographs, he showed that there are six emotions we recognise regardless of cultural differences.
Facial expressions of anger, disgust, fear, happiness, sadness, and surprise seem to be recognised everywhere, from European cities to the highlands of Papua New Guinea. Based on the facial patterns that encode these emotions, there are now interfaces that have learned to read, interpret, and even simulate human emotions.
Affective computing is a term coined by Rosalind Picard, an MIT professor and founder of startups such as Affectiva and Empatica. Generally, affective computing works in two directions: systems can read emotions from facial expressions, voice, skin temperature, and pulse; or they can simulate empathy, recognising an emotion and then responding in a way that acts upon the user.
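To illustrate the first direction, inferring an emotional state from physiological signals, here is a minimal sketch. The signal names, thresholds, and labels are invented for illustration only; real affective-computing systems such as Affectiva's use trained models over far richer inputs.

```python
# Hypothetical sketch: map two invented physiological signals to a
# coarse emotion label with hand-picked thresholds. A production
# system would use a trained classifier, not fixed rules.

def read_emotion(pulse_bpm: float, skin_temp_c: float) -> str:
    """Map toy sensor readings to a coarse emotional-state label."""
    if pulse_bpm > 110:
        # Elevated pulse suggests high arousal (e.g. fear or anger).
        return "aroused"
    if pulse_bpm < 60 and skin_temp_c < 32.0:
        # Low pulse and cool skin suggest low arousal (e.g. sadness).
        return "calm-negative"
    return "neutral"

print(read_emotion(120, 33.0))  # aroused
print(read_emotion(80, 33.5))   # neutral
```

In practice the interesting work lies in the classifier this sketch stands in for, and in fusing several signal streams rather than reading two numbers in isolation.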
Would that mean chatbots saying “I miss you” and actually meaning it? Interfaces can capture data on our behaviour and interpret it in product or design testing, but they are no closer to actually understanding our feelings or the reasoning behind them.
Affective Computing in E-Commerce
Sensors can capture data on your behaviour, read your facial expressions, and adapt a response based on that data. That’s the idea behind startups like Affectiva: they measure emotions in human behaviour, and the results can be used in product testing, website design evaluation, or e-commerce experience research.
But can affective computing influence the shopping experience and the way we measure our customers’ satisfaction? The offer, the advertising, or the store’s design could be tested by algorithms that track emotions. We could also imagine automated communication tools: for example, a chatbot response tailored to the customer’s emotional state would be much more effective.
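As a toy illustration of such emotion-aware communication, a chatbot could select its reply template based on a detected emotion. The labels and wording below are invented for the sketch; a real system would feed the output of an emotion classifier into this step.

```python
# Hypothetical sketch of an emotion-aware chatbot reply. The emotion
# labels and reply templates are invented for illustration.

REPLIES = {
    "anger":   "I'm sorry this has been frustrating. Let me fix it right away.",
    "sadness": "I'm sorry to hear that. Here is what we can do for you.",
    "joy":     "Great to hear! Is there anything else I can help with?",
}

def reply_for(emotion: str) -> str:
    # Fall back to a neutral greeting for unrecognised states.
    return REPLIES.get(emotion, "How can I help you today?")

print(reply_for("anger"))
```

Even this trivial lookup shows the design point: the hard problem is not choosing the reply but correctly detecting the emotion that indexes it.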
Will we see affective computing changing e-commerce any time soon? There are still some major obstacles to reading and classifying human emotions. First of all, we have more than just six emotions, as even Paul Ekman himself admits. Algorithms would have to learn far more about the complicated expressions we carry, including those that never show on our faces.
Secondly, would interfaces be capable of recognising causes? Did the product affect our emotions, or was it just the windy weather? It’s hard to create an isolated situation in which the causes of human affect are clear. The technical problems of applying affective computing also remain to be solved: we would have to agree on standard measurements, bring the technology to computer and mobile cameras, and know how to analyse the large volumes of data collected this way.
It’s still easier to run a survey asking “how did you like our product,” or hand out a rating questionnaire, than to run automated algorithms that measure emotional reactions to a purchase. But all of this can be solved as the technology develops.
It’s hard to isolate our reactions and draw a simple inference without knowing what caused our mood, and this seems to be the one difficulty that emotion-reading interfaces cannot easily get around. Affective e-commerce could be the future of online sales, but the technology still needs to mature and see wider adoption.
What are your thoughts on affective e-commerce? Is this the future of how AI interacts with customers? Tell us below or tweet us!