Apple has moved quickly to address a software glitch in its iPhone dictation feature that caused the word “racist” to be transcribed as “Trump.”
The issue gained widespread attention after a series of TikTok videos demonstrated the phenomenon, sparking debate about potential political bias in technology.
Users reported that when using voice dictation, saying “racist” resulted in the text displaying “Trump” momentarily before correcting itself. Apple attributed this anomaly to “phonetic overlap” between the words.
A company spokesperson stated, “We are aware of an issue with the speech recognition model that powers Dictation and we are rolling out a fix today.”
The glitch wasn’t limited to the word “racist.” Other terms containing the “r” consonant, such as “rampage” and “rampant,” also triggered similar issues, lending credibility to Apple’s explanation of phonetic similarities causing the problem.
However, not everyone is convinced by this technical explanation. John Burkey, a former member of Apple’s Siri team, suggested the possibility of an internal prank. He noted that the issue appeared after a server update and questioned whether someone might have intentionally introduced the glitch into the data or code.
The timing of this incident is particularly noteworthy as it coincides with Apple’s announcement of a $500 billion investment in the United States over the next four years, a move seen as an effort to strengthen ties with the current administration. This investment includes plans to manufacture AI servers at a new facility in Houston.
Conservative commentators have seized upon the dictation glitch as evidence of alleged political bias within the tech industry. Similar controversies have arisen in the past, such as during the 2024 election when Amazon’s Alexa and Google search results were accused of political partiality.
Apple has faced challenges with its AI-powered features recently. Last month, the company disabled a key function of its Apple Intelligence system after it generated inaccurate news summaries.
In 2018, Siri faced backlash when it briefly displayed a nude image in response to a question about Donald Trump, an issue linked to Wikipedia edits.
As of now, Apple has not specified when the fix for the dictation bug will be fully deployed. The company says it is committed to resolving the issue promptly to maintain user trust and uphold the accuracy of its services.