Gender Bias in Machine Translation
Machine translation models are trained on huge corpora of sentence pairs, each sentence a translation of the other into a different language. However, there are nuances in language that often make it difficult to produce an accurate, direct translation from one language to another.
When translating from English into languages such as French or Spanish, some gender-neutral nouns must be translated into gender-specific nouns. For example, the word “friend” in “his friend is kind” is gender neutral in English. In Spanish, however, it is gender specific: either “amiga” (feminine) or “amigo” (masculine).
Another example is translation from Turkish to English. Turkish is an almost entirely gender-neutral language: the Turkish pronoun “o” can be translated into English as “he”, “she” or “it”. Google claims that 10% of its Turkish Translate queries are ambiguous and could be correctly translated into either gender.
In both of these examples, a phrase in one language can be correctly translated into another language in different ways depending on gender. Neither variant is more correct than the other, and a human given the same translation task would face the same ambiguity without further context. (The only difference is that the human might know to ask for more context, or else provide both translations.) It is therefore incorrect to assume that there is always a single correct translation for any given word, phrase or sentence.
It is now easy to understand why Google Translate was having issues with gender bias. If societal biases meant that historically more men than women became doctors, there would be more examples of male doctors than female doctors in the training data, which is simply an accurate historical record of that gender imbalance. The model learns from this data and so acquires a bias: doctors are more likely to be male.
Now, when faced with the task of producing a single translation of “o bir doktor” (“he/she is a doctor”) from Turkish to English, the model will assume “o” should be translated as “he”, because doctors in its training data are more likely to be male.
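The statistical effect behind this can be illustrated with a toy sketch. This is not how a real neural translation model works internally; the counts below are hypothetical stand-ins for the gendered co-occurrences a model would absorb from its training corpus, and the names are invented for illustration.

```python
# Hypothetical counts of English pronouns co-occurring with each
# profession in a training corpus. The imbalance mirrors the
# historical imbalance described above.
corpus_counts = {
    "doktor": {"he": 800, "she": 200},    # more male doctors in the data
    "hemşire": {"he": 100, "she": 900},   # "hemşire" = nurse; mostly female
}

def translate_pronoun(profession: str) -> str:
    """Pick the pronoun seen most often with this profession.

    A model forced to emit a single translation of the gender-neutral
    Turkish "o" effectively performs an argmax like this, reproducing
    whatever imbalance its training data contains.
    """
    counts = corpus_counts[profession]
    return max(counts, key=counts.get)

print(translate_pronoun("doktor"))    # prints "he"
print(translate_pronoun("hemşire"))   # prints "she"
```

The point of the sketch is that the model is not “wrong” about its data: it faithfully reflects the frequencies it was trained on, and that is precisely how an accurate record of a biased world produces biased single-answer translations.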
One might see how the opposite could occur for nurses.