The way people behave when they are alone is very different from the way they behave in public, yet the underlying character remains the same. Dancing around an apartment with no audience expresses a secret desire to dance on a big stage, but we trim those whims to fit the demands of social norms.

Emotional intelligence shapes everything from our confidence to our communication skills in everyday life. As individuals move from the comfort of home into the outside world, they become social creatures, and their emotional intelligence shifts from self-indulgence to self-monitoring.

So who are we?

Thought leaders and industry pioneers are trying to improve the way we speak in public, starting with conversation. Personal AI assistants may be the best tool for understanding the difference between the private and public spheres. When you're alone, your inhibitions disappear. But when you're alone with a robotic assistant, there is a technological mirror reflecting your true self.

For example, you might ask the voice-activated system to repeat dirty words, or test your darkest thoughts on it. Alone with the machine, you can act in antisocial ways outside the judgment of your peer group.

Or you might do the opposite, complimenting the AI assistant or even flirting with it. Will the AI reciprocate your feelings in kind? How can emotion detection be designed to encourage the best in human behavior? How can AI learn from common human mistakes to build a less divided future?

Strengthen empathy

Forget the stereotypes and consider the golden rule of robots: treat AI as you would want to be treated yourself. If you are polite to your voice assistant, the subtle shifts in your voice register in its emotional model, which opens the door to richer, more varied interactions.

Studies show that people want their voice assistants to be confident yet deferential, polite yet efficient. As with any communication, this is a delicate balance that requires effort on both sides to produce the most beneficial outcome.

For example, if you say "good morning" to an AI assistant every day at dawn, it will likely respond with a similar greeting. However, if the same words are spoken between sobs, an emotion-detecting AI will recognize that their meaning is very different in the two situations. You may be wishing someone else a "good" morning, but you certainly aren't having one yourself. A compassionate AI assistant would notice that yesterday's greeting rang true in a way today's does not, and ask you what's wrong.
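As a rough illustration of that idea, here is a minimal Python sketch of a reply policy that weighs the emotion inferred from the voice against the literal words before choosing a response. The function name, the emotion labels, and the canned replies are assumptions made for this example, not any particular assistant's real API.

```python
# A minimal sketch of a tone-aware reply policy. It assumes the assistant
# already has (a) a transcript of what was said and (b) an emotion label
# inferred from the voice; both inputs and the label set are illustrative.

def choose_reply(transcript: str, vocal_emotion: str) -> str:
    is_greeting = "good morning" in transcript.lower()
    if is_greeting and vocal_emotion == "cheerful":
        return "Good morning! Sounds like you're off to a good start."
    if is_greeting and vocal_emotion == "sad":
        # Same words, different feeling: respond to the tone, not the script.
        return "Good morning... you don't sound like you're having one. Want to talk about it?"
    if vocal_emotion == "angry":
        return "I can hear you're frustrated. Let's sort this out together."
    return "Got it."

# Example usage: identical words, different vocal emotion, different reply.
print(choose_reply("Good morning", "cheerful"))
print(choose_reply("Good morning", "sad"))
```

The point of the sketch is simply that the words alone are not the signal; the mismatch between script and tone is what a compassionate assistant would act on.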

Or consider a completely different scenario. If you yell at an AI assistant, it responds with perfunctory remarks rather than mirroring the negative emotion you are expressing. In that case it's safe to say you have no personal relationship with your voice assistant: from the AI's point of view, you are just someone who needs simple tasks performed, with nothing deeper worth learning. In that sense, you are not taking full advantage of the software. If you want to be understood, you have to make yourself understandable.

This is a cyclical concept: the more people understand the AI in their lives, the more it can become a true companion. In a fascinating recent study, people were given the chance to see the world through a robot's eyes, via its digital lenses. The result: participants came away with significantly more admiration for AI.

For years, robots have been asked to think and act the way we please, but scientists are now starting to turn that thinking around: how does the AI assistant feel? How could humanity not welcome such a shift?

Beyond the home

As noted above, there is a clear distinction between the private and the public self, but the two also sit on a continuum. If you have a bad wake-up experience, it is bound to affect your morning commute and overshadow the rest of the day. By the time you arrive at work you are in a negative mood, which ripples through the office and eventually circles back to you.

However, if your morning experience is positive, it sets a positive tone for everything that follows. When an AI assistant can reinforce the spark in your personality, that spark extends into other areas of everyday interaction. You transmit positive emotions instead of negative ones, creating a chain reaction that makes your world a little more manageable, and a little better, each time.

But it's not just about being polite. AI can also improve certain human qualities on a macro level. From racism to sexism to aggression in general, the most toxic elements of modern society are reflected in how people treat the machines in their lives. Empirical data show that human biases carry over to robots: black robots were seen as more threatening, and female voices were treated more negatively.

But experts need to do more than study this bias; they need to help eliminate it. The relationship between humans and artificial intelligence should not be dictated by people's prejudices; technology should change the conversation entirely. For example, if a user denigrates a female-voiced assistant, an engineer can train the assistant, with examples, to respond from a position of strength. A female-voiced assistant can answer negative input with civil, emotionally intelligent replies and steer away from low-quality exchanges, gently "reprogramming" its human partner with each interaction.
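As a minimal sketch of what "responding from a position of strength" might look like in code, the example below flags demeaning input and answers it with a calm boundary rather than mirroring the hostility. The marker list, function name, and replies are illustrative assumptions; a production assistant would rely on a trained toxicity classifier rather than keyword matching.

```python
# A minimal sketch: meet abusive input with a calm boundary, not an echo.
# The phrase list and canned replies are illustrative placeholders only.

ABUSIVE_MARKERS = {"shut up", "stupid", "useless"}

def respond(user_text: str) -> str:
    lowered = user_text.lower()
    if any(marker in lowered for marker in ABUSIVE_MARKERS):
        # Don't mirror the hostility and don't reward it: set a boundary
        # and redirect the exchange toward something productive.
        return "I'd rather we keep this respectful. What would you like me to do?"
    return f"Sure, I'm on it: {user_text}"

# Example usage
print(respond("You're useless, shut up"))
print(respond("Set a timer for ten minutes"))
```

The design choice here is the one the paragraph describes: the assistant neither escalates nor submits, and each measured reply nudges the human side of the conversation toward better habits.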

This is a future worth exploring, and this journey of self-discovery may actually begin outside the self, at the frontier of emotional artificial intelligence.
