The Dark Side of Emotional Intelligence in UX

November 19, 2015
Emotional intelligence is a powerful skill, and the benefits of possessing it are clear: a better understanding of your colleagues, customers, friends and family, and yourself, which in a business context translates into brand loyalty and even sales. Emotional intelligence, or EQ, is discussed at length in the UX community, and for good reason. Thoughtful design isn’t just about making products that are attractive and usable; it’s about making an emotional connection with the user. A great user experience is one in which your product or service meets users where they are, which is why emotional intelligence has become such a hot topic.

The good side of emotional intelligence

There are plenty of companies dipping their toes in the waters of emotional intelligence. Facebook is considering adding an “empathy” button to enable users to respond to posts beyond the iconic “like” button. If they roll it out, you could express sympathy when a friend shares a sad story, rather than giving them the thumbs-up.

Twitter recently changed its favorite icon from a star to a heart, which they say is more in line with how users were feeling when using the app (despite a vocal response on Twitter to the contrary).

My favorite example though is project management tool Asana’s unicorn hack. When enabled, the program shoots a colorful unicorn across your screen when you complete a task. Checking items off your to-do list has never been so gratifying.


The dark side of emotional intelligence

But there’s a dark side to emotional intelligence, too—especially when it comes to UX. It may be hard to believe, but being emotionally intelligent can lead you down a more nefarious path if you’re not careful.

Recent research has revealed that there may be a link between emotional intelligence and manipulation. It turns out that individuals with high emotional intelligence are more likely to use that insight to manipulate others for their own purposes.

In other words, emotional intelligence doesn’t just make it easier to connect with your users—it also makes it easier to get what you want from them. So where’s the line between removing friction to create great experiences and manipulating users for your own purposes?

The difference is subtle. Here are a few questions you can ask yourself to make sure you’re using your emotional intelligence for good.

Is this a dark pattern?

We’ve discussed dark patterns before, and although virtually no one condones them, they’re everywhere in the digital world. Take a hard look at your product’s design and make sure it hasn’t turned to the dark side.

Will your users get what they want?

Part of creating a great user experience is enabling users to achieve their goals. And that means you might have to sacrifice, or at least accept, that you might not get what you want as a result. At least not right away. This is an important step in using your emotional intelligence for good. If you’re using an approach that serves your short-term needs but isn’t helping your user in the long run, it’s a good idea to take a step back and reconsider.

It’s nearly impossible to completely ignore metrics. But whatever your goals may be, try to always keep the relationship with your users as your prime motivator. While it may be tempting to persuade your users to take a particular action because it will improve your metrics, none of those measurements will mean much if those users are gone by next quarter. Use your emotional intelligence to understand what your users want and need, and tailor your goals to align with those wants and needs.

Pro tip: This is a great question to ask in team meetings when you’re discussing design decisions. It’s easy to get hyper-focused on your own goals, so asking the group how a decision will benefit users, and whether it will get in the way of their goals, can help bring the discussion back to user-centered territory.

Can this be used for evil?

You already know it’s helpful to put yourself in the shoes of your users, but sometimes it’s also a good idea to consider a more nefarious angle. How might a product or feature be abused? Is there any way users could be manipulated or put their privacy or trust at risk?

Take Peeple, an app that has yet to officially launch, for example. It was a cool idea: an app that allows people to rate anyone they know. The app was born out of a genuine desire to help others. Co-founder Nicole McCullough has said her inspiration came from being a mother and wanting to get to know her neighbors a bit better. But what resulted was an app that some have called “the internet’s most-hated app,” one that seemed to completely ignore the possible ramifications of giving John Smith free rein to rate his former business partner after their venture succumbed to bankruptcy.

By considering every which way your product or service could go south, you’ll see the potential dangers early on, and hopefully avoid taking a turn down the path to the dark side.

Most of the time, emotional intelligence is a good thing. Having empathy for your users will not only help you create a great experience but also establish trust and cultivate a long-term relationship. Keep these questions top of mind as you continue to improve your user experience, and you’ll avoid the dark side of emotional intelligence.