Language testing can hold back careers.
We recently worked with a pharmacist who came to us after failing the speaking and listening portion of the TOEFL 12 times. Passing the test was a necessary step for advancing in his career, and each failure held him back.
Tests are no longer scored by humans, but by AI.
As he described his repeated failures, he explained to us that the test is no longer administered and scored by humans. Instead, a computer listens to the student speak and judges whether the speech is intelligible and fluid.
The problem is that computers lack the sensitivity to parse the subtle, sometimes minuscule differences in pronunciation that essentially fluent non-native speakers (NNS) exhibit.
Most NNS haven’t quite mastered the TH sound, so words like “these” and “those” can sound, to a computer lacking intuition, like “tease” and “toes”. Sensing gibberish, the computer administers a failing grade.
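To make the mechanism concrete, here is a toy sketch in Python of how a recognizer that picks the closest dictionary word by phoneme distance can be flipped by a single mis-produced sound. This is only an illustration: the tiny lexicon, the ARPAbet phoneme strings, and the matching rule are all invented for the example, and this is not how TOEFL’s scorer or any commercial engine actually works.

```python
# Toy illustration (not a real recognizer): how one mis-produced phoneme
# can flip a decoder's best guess. Phonemes are written in ARPAbet.
# The lexicon and the "heard" inputs below are invented for this example.

LEXICON = {
    "these": ["DH", "IY", "Z"],
    "tease": ["T", "IY", "Z"],
    "those": ["DH", "OW", "Z"],
    "toes":  ["T", "OW", "Z"],
}

def edit_distance(a, b):
    """Classic Levenshtein distance over phoneme lists."""
    dp = [[i + j if i * j == 0 else 0 for j in range(len(b) + 1)]
          for i in range(len(a) + 1)]
    for i in range(1, len(a) + 1):
        for j in range(1, len(b) + 1):
            dp[i][j] = min(dp[i - 1][j] + 1,                          # deletion
                           dp[i][j - 1] + 1,                          # insertion
                           dp[i - 1][j - 1] + (a[i - 1] != b[j - 1]))  # substitution
    return dp[-1][-1]

def best_guess(heard):
    """Pick the lexicon word whose phonemes are closest to what was heard."""
    return min(LEXICON, key=lambda w: edit_distance(LEXICON[w], heard))

# A speaker who devoices TH produces a T-like sound instead of DH:
print(best_guess(["T", "IY", "Z"]))   # -> "tease", not "these"
print(best_guess(["DH", "IY", "Z"]))  # -> "these"
```

The point of the sketch is that the machine never weighs intent or context the way a human listener would; one phoneme’s distance is all it takes to change the verdict.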
Accents in the World of Siri and Alexa
After eight accent reduction sessions with us, the student passed the test on his very next try, his 13th. But his experience revealed to us how far artificial intelligence (AI) has intruded into questions of accent and conversational intelligibility.
This is the new world, where not only do your fellow humans (colleagues, acquaintances, potential employers) judge you on your communication skills; now machines do too, in the form of speech-to-text and speech-to-action devices.
In this new era, non-native English speakers must add a new item to their list of potential communication obstacles: digital brains that will never know how great, intelligent, or funny you are, only whether you can clearly distinguish “glass” from “grass” or “thought” from “taught”.
The new reality of AI has seeped into the accent modification world, as more and more people with accents find that they have trouble being understood by the very devices that work for them!
Amazon’s Alexa, Apple’s Siri, and Google Home all await your command, ready to manage everything from your grocery shopping and social calendar to closing the bedroom curtains and feeding the dog, as long as they understand you.
If you have an accent and ask Alexa to order “ink cartridges”, you might end up receiving “pink garbage bags”.
If you request Google Home to “play Opera”, and your initial “O” is not the “AH” it needs to be (according to American English), you may end up hearing a lecture delivered by Oprah.
If you ask Siri to play Beyoncé and you don’t land that initial “B”, you may end up listening to Vivaldi.
At home and at the office, your devices are listening to you, but if you have an accent, they may not be able to help you.
When a non-native speaker dictates to a speech-to-text application, the results can be quite funny (one we heard: “write and deliver” transcribed as “right in the liver”), but no one has yet collected data on the negative repercussions of a misunderstood command. One can imagine it might not always be so humorous. This is how those with accents are affected in the world of Siri and Alexa.
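For readers curious about what a dictation pipeline actually looks like, here is a minimal sketch using the open-source SpeechRecognition package for Python. It is an illustration under stated assumptions, not the internals of Alexa, Siri, or Google Home: the engine simply returns its single best text guess, with no knowledge of who is speaking, which is exactly where accent-driven surprises creep in.

```python
# Minimal dictation sketch using the third-party SpeechRecognition package
# (pip install SpeechRecognition pyaudio). The recognizer returns its single
# best text guess; it knows nothing about the speaker or their intent.
import speech_recognition as sr

recognizer = sr.Recognizer()

with sr.Microphone() as source:
    recognizer.adjust_for_ambient_noise(source)  # calibrate to room noise once
    print("Dictate now...")
    audio = recognizer.listen(source)

try:
    # Google's free web speech API; other engines (e.g. CMU Sphinx) are available.
    text = recognizer.recognize_google(audio)
    print("Heard:", text)  # "write and deliver"... or "right in the liver"
except sr.UnknownValueError:
    print("Speech was unintelligible to the engine.")
except sr.RequestError as err:
    print("Recognition service error:", err)
```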
Recently, The New York Times compared Alexa, Siri, and Google Home.
All three have flaws in understanding, and because of their different programming, their issues are not consistent with one another. These three major AI devices (along with others that will come along) will continue to trade the mantle of comprehension mastery as they mature and learn, gaining experience by simply listening, over time, to millions of accented voices.
Outside the U.S., the Chinese company Baidu is developing a “deep speech” algorithm intended to recognize over 200 Chinese sub-dialects, along with the wide range of English dialectal variations. Perhaps in the future, Baidu is “whom” we will address in our American homes!
In the meantime, these communication platforms have added a new challenge for accent reduction professionals.
Not only do we assist our students in polishing their American English accents for interactions with their fellow humans, but now we need to be sensitive to the digital colleague: the robot in the room with whom we all must also communicate.
So if you have an accent, fear not; we can help you be understood at school, in your office, and at home, by the very digital devices you hope to command in this world of Siri and Alexa.
Even if neither your spouse, nor your children, nor even your dog will listen to you at home, and if your human colleagues at work aren’t paying attention, at least there will be one entity who will (probably) understand you: Alexa!
Watch Alexa fail her Grammar 101 test!