In our lives, whether physical or digital, we look for signals to better understand how people feel about things. From a smile flashed during a conversation to a happy-face emoji at the end of a written message, we look for clues about how people feel to bring clarity to the emotion behind a communication or interaction. We often use this information to decide what we should keep doing and what we should stop.
We recently released Sentiment Tagging, which lets customers apply intuitive “negative” and “positive” tags to the notes and clips they capture when reviewing completed sessions. By scanning these tags, customers can understand at a glance the sentiment of their insights and quickly filter when creating Highlight Reels.
Our journey to creating Sentiment Tagging started when we observed how our customers were manually tagging their notes. We noticed that many of the top tags customers created were about sentiment. The 10 most-used tags included (in relevant groupings):
Another observation: the tags were inconsistent. This is very human: in an effort to analyze responses quickly, a person may use synonyms for the same interpretation of sentiment, or may abbreviate or punctuate a word or phrase differently when creating certain tags. This inconsistency made the creation of Highlight Reels more difficult, since customers filter clips by tag.
So we set out to create a solution that would increase speed to insight by streamlining the creation of sentiment-based tags and improving visibility when scanning in the video player or during Highlight Reel creation. Since we wanted the solution to be as simple and easy to use as possible, we considered emojis, because they are such an intuitive proxy for emotion. In early conversations, customers agreed with this initial hypothesis.
Finally, when we started down this path, we also had to consider this future feature in the context of our then-current product, and we had to make some decisions to set the right foundation for it to have real impact. That is why, back in November of last year, we changed the heart icon that had previously existed in Notes & Clips to a star icon for marking notes as important. Doing this ensured that customers could clearly differentiate between tagging a note for its importance and tagging it for the sentiment expressed.
Sentiment Tagging was released along with our other Q1 2020 product releases. By tagging notes with the “negative” and “positive” emojis, we hope it becomes clear how your customers and users feel about your digital experiences, and easier for you to share those findings with others in your organization.
If you have positive feelings about this upgrade, you’ll be excited to hear that we have additional sentiment-based updates planned. In fact, we are using the data from Sentiment Tagging to train a machine learning model so that, in the future, we can introduce features that help automatically identify sentiment.
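To make the idea concrete, here is a minimal sketch of how customer-applied “positive”/“negative” tags could seed a text classifier. Everything below is illustrative: the toy notes, the labels, and the choice of a TF-IDF plus logistic regression pipeline are assumptions for the sake of the example, not a description of our actual model.

```python
# Hypothetical sketch: training a simple sentiment classifier on notes
# that customers have labeled via Sentiment Tagging. Data and model
# choice are illustrative only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy stand-ins for tagged session notes.
notes = [
    "I love how easy the checkout was",
    "This page is confusing and slow",
    "Great experience, found what I needed",
    "I got stuck and gave up on the form",
]
labels = ["positive", "negative", "positive", "negative"]

# Vectorize the text and fit a classifier in one pipeline.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(notes, labels)

# Predict sentiment for a new, unseen note.
prediction = model.predict(["The signup flow was great"])[0]
print(prediction)
```

With real volumes of tagged notes, a pipeline like this could suggest a sentiment tag automatically and let the reviewer confirm or correct it, which in turn produces more training data.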
And if you have feedback about Sentiment Tagging or would like information on participating in beta programs to get early access to future features, please reach out to your Customer Success Manager or contact us.