Do you yell at Alexa? Swear at Siri? Be honest—you’ve been there. And so have many of us. But why yell at what is essentially software? Gartner research vice president Frank Buytendijk suspects many people do, and in a recent blog post he wondered why.
Buytendijk cited numbers from research conducted by Dr. Sheryl Brahnam, professor of computer information systems at Missouri State University. Brahnam studied logs of conversations between humans and computer call agents and found that a significant number of these interactions go off the rails. In many of them, the human participant ended up speaking in a way that would be considered unacceptable if the other party were a person instead of a bot.
According to a post to the TechEmergence website last year, she found that in 10% to 50% of all of the “human-computer interactions” studied, the person abused, mocked, or was just plain mean to the computer assistant or call agent. Fortune reached out to Brahnam to see if there are newer numbers and will update this story as needed.
Anecdotal evidence suggests the same: people get exasperated with Apple (aapl) Siri, Amazon (amzn) Alexa, and Microsoft Cortana, and a good chunk of them go out of their way to try to irritate or stump their smart assistant. There are plenty of posts on the subject.
A recent example of a human-Siri interaction that didn’t work out as planned: This weekend at the beach, some friends were trying to figure out the name of Alicia Vikander’s new movie. Siri was summoned. She, er, it, couldn’t grok the actor’s name. Siri kept interpreting Alicia Vikander as Alicia Witt Condor and returned no results, prompting hilarity and commentary. At one point, someone said Siri was useless, to which she replied something along the lines of: “You have a right to feel that way.”
A few days later, I tried the query again and Siri still didn’t “get” the actor’s name, at least in the text translation of the question, but did return the right result: The Light Between Oceans. Bravo, Siri. (Alicia Vikander, ironically, also starred as the eerily humanoid robot in Ex Machina last year.)
As someone who has witnessed a balky laptop hurled out of a second-floor window, I understand the frustration with technology that isn’t helpful. But bad behavior toward Siri or Alexa or Cortana reveals more about a person’s character than about the technology itself. Perhaps the most blatant example of bad human-computer interaction was the Microsoft Tay chatbot episode, in which people taught Tay racist and sexist language that the bot then repeated in its responses.
People may find it fun to interact with something without consequences, but Buytendijk argues that there are consequences here after all. The abuse of a computerized personal assistant does say something about a person, he notes. “Isn’t good behavior self-evidently a reward in itself and part of building character?” he asks.
Verbal abuse toward a computer can also offend other people who witness it, and it can ingrain a not-very-pleasant tendency to fly off the handle in other situations. Which is why it’s so endearing to see parents of young children insist that those kids say please and thank you to Siri or Alexa.
In other words, do unto Siri as you would have Siri do unto you.