How artificial intelligence can change usability testing

Felipe Restrepo
4 min read

UX and artificial intelligence are now ready to help designers, product owners, and companies improve the way they gather information from their users.

Artificial intelligence is becoming more and more common in tech industry conversations. One of the questions I ask myself as a designer is "how will AI impact my work in 10 years?" My job mainly involves designing experiences based on artificial intelligence (chatbots, assistants, marketing ads, etc.), but what if we could use artificial intelligence, and more specifically machine learning, to design better user experiences?

Our first question here at Odaptos was “is it possible to automate design?” With that in mind, we jumped into the long and crucial task of gathering all the documentation on what artificial intelligence could offer. We ended up with a huge list of ideas, some quite crazy, but others perfectly doable, amazing us with the ocean of possibilities that stood before us. The next step was to look at what parts of the design process could be tweaked and how we could improve our own processes as user experience designers.

Our strong connection to human-centered design drove us to focus our attention on Empathy Mapping, a methodology created by Dave Gray that has gained a lot of traction in the agile community and is now common practice in product design.

What is Empathy Mapping?

An empathy map is a collaborative visualization articulating what we know about a particular type of user. It externalizes knowledge about users based on what they say, do, think, and feel, in order to (a) create a shared understanding of user needs and (b) aid decision-making.
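To make the four quadrants concrete, here is a minimal sketch of how one user's empathy-mapping notes might be represented in code. The class and field names are illustrative assumptions, not part of Dave Gray's method or any real tool:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class EmpathyMap:
    """Illustrative container for one user's empathy-mapping notes."""
    user: str
    says: List[str] = field(default_factory=list)    # direct quotes from the interview
    does: List[str] = field(default_factory=list)    # observed actions
    thinks: List[str] = field(default_factory=list)  # inferred thoughts
    feels: List[str] = field(default_factory=list)   # inferred emotions

# Example: notes captured during a usability interview
note = EmpathyMap(user="P1")
note.says.append("I can't find the export button")
note.feels.append("frustrated")
```

Structuring the notes this way (rather than as free-form post-its) is what later makes automated analysis possible.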

The process is based on the observation of the user and the attentive record of the user’s expressions, both verbal and non-verbal. The role of the team conducting empathy mapping interviews is to facilitate the manifestation of emotions from the user during the interviews.

The post-it nightmare

If you are a user experience designer like us, chances are that after an empathy mapping session you find yourself looking at a wall covered with a considerable number of post-its, asking yourself "where do I begin?" This became our problem to solve.

Typically, after a round of usability interviews, our teams end up with tons of notes in the form of spreadsheets, notebooks, or recordings. We then spend substantial time distilling those notes into comprehensive summaries, and one way of making this happen is to stick the most important elements on a board for the whole team to see.

Artificial Intelligence to the rescue!

User experience and artificial intelligence share a common objective: to predict human behavior and anticipate what will happen next. Language and emotional analysis is a common denominator in both disciplines, and it is at that touching point that we saw an opportunity to create something new. As we worked on it, it quickly became clear to us that the constant need for teams to gather emotions from users, combined with the long and tedious process of analyzing results after empathy mapping interviews with several users, formed the perfect need/problem pairing.

We identified four ways artificial intelligence can change the user experience interview:

Deep Learning. The way we can get the most out of what users say during an interview, and how we can give value to their expressions.

Gestural Data. How we will be able to determine, through user actions, the user's non-verbal state of mind.

Facial Recognition. The depth of insight we will gain when interpreting users' emotions and expressions.

Semantic Data. Users often give insight in the form of linguistic cues, expressing themselves in coded phrases that are later analyzed by UX designers to extract meaning. Using machine learning, we can build a deep understanding of users' intentions and make hidden information expressed by the users visible.

The combination of these technologies will allow the platform to extract metadata such as concepts, entities, keywords, categories, feelings, emotions, relationships, and semantic hints by understanding what users are expressing during interviews. It will then be possible to establish, in a more dynamic way, the relationships between different categories of data aimed at expressing user emotions, offering designers and product owners real insight into human emotions.
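To illustrate the idea of pulling keywords and emotion signals out of an interview transcript, here is a deliberately simplified sketch. The tiny emotion lexicon and stopword list are invented for illustration; a real system would rely on trained language and emotion models rather than word lookups:

```python
import re
from collections import Counter

# Toy emotion lexicon -- invented for illustration only; a production
# system would use trained language/emotion models instead.
EMOTION_LEXICON = {
    "love": "joy", "great": "joy", "hate": "anger",
    "confusing": "frustration", "lost": "frustration",
    "worried": "fear",
}
STOPWORDS = {"i", "the", "a", "is", "it", "and", "to", "was", "but", "this"}

def analyze_transcript(text):
    """Return rough keyword counts and emotion tags for one interview."""
    words = re.findall(r"[a-z']+", text.lower())
    keywords = Counter(
        w for w in words if w not in STOPWORDS and w not in EMOTION_LEXICON
    )
    emotions = Counter(EMOTION_LEXICON[w] for w in words if w in EMOTION_LEXICON)
    return keywords, emotions

keywords, emotions = analyze_transcript(
    "I love the dashboard but the checkout was confusing and I felt lost"
)
# emotions counts: frustration appears twice (confusing, lost), joy once (love)
```

Even this crude version hints at the payoff: instead of replaying recordings, the team gets an immediate, queryable summary of what each user talked about and how they felt while doing it.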

Conclusion

Artificial intelligence can speed up the process of emotional design in usability testing: meeting the constant need to gather information from what the user is expressing, without the time-consuming burden of listening to multiple interviews over and over again, is a must in design thinking. In addition, the ability to go beyond manual classification of data, combined with the collaborative capacities offered by digital environments, makes for a perfect combination of methodologies.