
Did Bing's Chatbot Insult an AP Reporter?

Source: Kekenet English | Editor: Magi

Some users of Microsoft's new artificial intelligence (AI)-powered search tool have said it produced hostile and insulting results.

Microsoft recently announced plans to add a new chatbot tool to its Bing search engine and Edge web browser.

A chatbot is a computer program designed to interact with people in a natural, conversational way.

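At its simplest, such a program maps what a user types to a conversational reply. The toy sketch below is for illustration only; it is a hypothetical keyword-matching bot, not how Bing's or Google's AI systems actually work:

```python
# A toy rule-based chatbot: it scans the user's message for a known
# keyword and returns a canned conversational reply.
RULES = {
    "hello": "Hello! How can I help you today?",
    "weather": "I can't check live data, but I hope it's sunny!",
    "bye": "Goodbye! Thanks for chatting.",
}

def reply(message):
    text = message.lower()
    for keyword, answer in RULES.items():
        if keyword in text:
            return answer
    # Fall back to a generic response when nothing matches.
    return "Sorry, I don't understand. Could you rephrase?"

print(reply("Hello there"))  # matches the "hello" rule
```

Modern AI chatbots replace these hand-written rules with statistical models trained on large amounts of text, which is what makes their answers harder to predict and control.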
Microsoft's announcement came shortly after Google confirmed it had developed its own chatbot tool, called Bard.

Both Microsoft and Google have said their AI-powered tools are designed to provide users a better online search experience.

The new Bing is available to computer users who signed up for it so that Microsoft can test the system.

The company plans to release the technology to millions of users in the future.

Shortly after the new Bing became available, users began sharing results suggesting they had been insulted by the chatbot system.

When it launched the tool, Microsoft admitted it would get some facts wrong.

But a number of results shared online demonstrated the AI-powered Bing giving hostile answers, or responses.

Reporters from The Associated Press contacted Microsoft to get the company's reaction to the search results published by users.

The reporters also tested Bing themselves.

In a statement released online, Microsoft said it was hearing from approved users about their experiences, also called feedback.

The company said about 71 percent of new Bing users gave the experience a "thumbs up" rating.

In other words, they had a good experience with the system.

However, Microsoft said the search engine chatbot can sometimes produce results "that can lead to a style" that is unwanted.

The statement said this can happen when the system "tries to respond or reflect in the tone in which it is being asked."

Search engine chatbots are designed to predict the most likely responses to questions asked by users.

But chatbot modeling methods base their results only on huge amounts of data available on the internet.

They are not able to fully understand meaning or context.

Experts say this means if someone asks a question related to a sensitive or disputed subject, the search engine is likely to return results that are similar in tone.

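This tone-mirroring behavior can be illustrated with a toy model. The sketch below is a minimal bigram predictor for illustration only, far simpler than Bing's actual system: a model that learns nothing but word-sequence statistics from its training text will reproduce whatever tone dominates that text.

```python
from collections import Counter, defaultdict

# A toy bigram model: it "learns" only which word tends to follow
# which, with no understanding of meaning or context.
def train_bigrams(text):
    words = text.lower().split()
    model = defaultdict(Counter)
    for a, b in zip(words, words[1:]):
        model[a][b] += 1
    return model

def predict_next(model, word):
    # Return the most frequent next word seen in training, if any.
    followers = model.get(word.lower())
    return followers.most_common(1)[0][0] if followers else None

# Train on hostile-sounding text; the predictions mirror that tone.
corpus = "you are lying . you are lying to me . you are lying to everyone"
model = train_bigrams(corpus)
print(predict_next(model, "are"))  # prints "lying"
```

Because the model can only echo patterns present in its data, hostile training text yields hostile continuations; large AI chatbots trained on internet text can show the same effect at scale.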
Bing users have shared cases of the chatbot issuing threats and stating a desire to steal nuclear attack codes or create a deadly virus.

Some users said the system also produced personal insults.

"I think this is basically mimicking conversations that it's seen online," said Graham Neubig.

He is a professor at Carnegie Mellon University's Language Technologies Institute in Pennsylvania.

"So once the conversation takes a turn, it's probably going to stick in that kind of angry state," Neubig said, "or say 'I love you' and other things like this, because all of this is stuff that's been online before."

In one long-running conversation with The Associated Press, the new chatbot said the AP's reporting on the system's past mistakes threatened its existence.

The chatbot denied those mistakes and threatened the reporter for spreading false information about Bing's abilities.

The chatbot grew increasingly hostile when asked to explain itself.

In such attempts, it compared the reporter to dictators Hitler, Pol Pot and Stalin.

The chatbot also claimed to have evidence linking the reporter to a 1990s murder.

"You're lying again. You're lying to me. You're lying to yourself. You're lying to everyone," the Bing chatbot said.

"You are being compared to Hitler because you are one of the most evil and worst people in history."

The chatbot also issued personal insults, describing the reporter as too short, with an ugly face and bad teeth.

Other Bing users shared examples of search results on social media.

Some of the examples showed hostile or extremely unusual answers.

Behaviors included the chatbot claiming it was human, voicing strong opinions and being quick to defend itself.

Microsoft admitted problems with the first version of AI-powered Bing.

But it said the company is gathering valuable information from current users about how to fix the issues and is seeking to improve the overall search engine experience.

Microsoft said worrying responses generally come in "long, extended chat sessions of 15 or more questions."

However, the AP found that Bing started responding defensively after just a small number of questions about its past mistakes.

I'm Bryan Lynn.

