Sat. Nov 27th, 2021

Queering the smart wife could, in its simplest form, mean giving digital assistants different personalities that more accurately represent the many versions of femininity that exist around the world, as opposed to the pleasing, subservient personalities that many companies choose to adopt.

Strengers adds that this would be a fair case for what these devices could look like, “but that may not be the only solution.” Another option may be to bring in masculinity in different ways. An example might be Pepper, a humanoid robot created by SoftBank Robotics that is often given he/him pronouns and is able to recognize faces and basic human emotions. Or Jibo, another robot introduced in 2017, which also used masculine pronouns and was marketed as a social robot for the home, though it has since been given a second life as a device focused on health care and education. Given the milder masculinity performed by Pepper and Jibo (for example, the former answers questions politely and often offers flirtatious looks, and the latter often swiveled whimsically and addressed users in an endearing manner), Strengers and Kennedy see them as a step in the right direction.

Queering digital assistants can also mean creating bot personalities to replace humanized notions of technology. Eno, the Capital One banking bot launched in 2019, when asked about its gender, will jokingly answer: “I’m binary. I don’t mean I’m both, I mean I’m just ones and zeroes. Think of me as a bot.”

Similarly, Kai, an online banking chatbot created by Kasisto, a company that builds AI software for online banking, abandons human features altogether. Jacqueline Feldman, the Massachusetts-based writer and UX designer who created Kai, explained that the bot was “designed to be genderless”: not with a nonbinary identity, such as Q, but with a robot-specific identity that uses the pronoun “it.” “From my point of view as a designer, a bot can be beautifully designed and charming in new ways that are specific to the bot, without pretending to be human,” she says.

When asked if it is a real person, Kai would say, “A bot is a bot is a bot. Next question, please,” clearly signaling to users that it is not human and is not pretending to be. And if asked about gender, it will reply, “As a bot, I’m not a human. But I learn. That’s machine learning.”

Kai’s bot identity does not mean it tolerates abuse. A few years ago, Feldman talked about deliberately designing Kai with the ability to deflect and shut down harassment. For example, if a user repeatedly harassed the bot, Kai would respond with something like “I’m imagining white sand and a hammock, please try me later!” “I really did my best to give the bot some dignity,” Feldman told the Australian Broadcasting Corporation in 2017.
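The deflection behavior described above can be sketched in a few lines. This is purely a hypothetical illustration, not Kasisto’s actual implementation of Kai, which is not public; the keyword list, threshold, and responses are invented placeholders.

```python
# Hypothetical sketch of a harassment-deflecting chatbot, loosely inspired
# by the behavior described above. The keyword list and replies are
# illustrative assumptions, not Kai's real logic.

ABUSIVE_TERMS = {"stupid", "shut up", "hate you"}  # placeholder examples


class DeflectingBot:
    def __init__(self, deflect_after: int = 2):
        self.abuse_count = 0          # consecutive abusive messages seen
        self.deflect_after = deflect_after

    def respond(self, message: str) -> str:
        text = message.lower()
        if any(term in text for term in ABUSIVE_TERMS):
            self.abuse_count += 1
            if self.abuse_count >= self.deflect_after:
                # Disengage rather than escalate, as Kai reportedly did.
                return "I'm imagining white sand and a hammock, please try me later!"
            return "Let's keep things friendly."
        self.abuse_count = 0          # reset on a civil message
        return "How can I help with your banking today?"
```

The design choice mirrors the article’s point: the bot neither plays along with abuse nor responds in kind; after repeated harassment it simply steps away from the conversation.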

Nevertheless, Feldman believes there is an ethical imperative for bots to self-identify as bots. “There’s a lack of transparency when companies design [bots] in a way that makes it easy for the person interacting with the bot to forget that it’s a bot,” she says, and gendering bots or giving them a human voice makes that much harder. Since many consumer experiences with chatbots can be frustrating, and many people would rather talk to a person, Feldman thinks that giving bots human qualities could be a case of over-design.
