From Wikipedia, the free encyclopedia
IBM Watson

Operators: IBM
Location: Thomas J. Watson Research Center, New York, USA
Architecture: 2,880 POWER7 processor threads
Memory: 16 terabytes of RAM
Speed: 80 teraFLOPS
Website: IBM Watson

IBM Watson is a computer system capable of answering questions posed in natural language.[1] It was developed as a part of IBM's DeepQA project by a research team, led by principal investigator David Ferrucci.[2] Watson was named after IBM's founder and first CEO, industrialist Thomas J. Watson.[3][4]

The computer system was initially developed to answer questions on the popular quiz show Jeopardy![5] and in 2011, the Watson computer system competed on Jeopardy! against champions Brad Rutter and Ken Jennings,[3][6] winning the first-place prize of US$1 million.[7]

In February 2013, IBM announced that Watson's first commercial application would be for utilization management decisions in lung cancer treatment, at Memorial Sloan Kettering Cancer Center, New York City, in conjunction with WellPoint (now Elevance Health).[8]

Description

The high-level architecture of IBM's DeepQA used in Watson[9]

Watson was created as a question answering (QA) computing system that IBM built to apply advanced natural language processing, information retrieval, knowledge representation, automated reasoning, and machine learning technologies to the field of open domain question answering. The system is named DeepQA (though it did not involve the use of deep neural networks).[1]

IBM stated that Watson uses "more than 100 different techniques to analyze natural language, identify sources, find and generate hypotheses, find and score evidence, and merge and rank hypotheses."[10]
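
Those quoted stages outline a pipeline: analyze the question, generate candidate hypotheses from sources, score the evidence for each candidate, and merge and rank the results. The sketch below walks a toy clue through that shape; every class name, source, and scoring rule in it is a hypothetical stand-in rather than IBM's implementation, which combined hundreds of scorers.

    // Minimal sketch of a DeepQA-style pipeline: analyze the question,
    // generate candidate hypotheses from sources, score evidence for each,
    // then merge and rank. All names and scoring rules are hypothetical.
    import java.util.*;

    public class PipelineSketch {
        record Hypothesis(String answer, double score) {}

        public static void main(String[] args) {
            String clue = "This 'Father of Our Country' didn't really chop down a cherry tree";

            // Stage 1: question analysis (here, trivially, keyword extraction).
            List<String> keywords = Arrays.asList(
                    clue.toLowerCase().replaceAll("[^a-z ]", "").split("\\s+"));

            // Stage 2: hypothesis generation from a toy source collection.
            Map<String, String> sources = Map.of(
                    "George Washington", "father of our country cherry tree legend",
                    "Abraham Lincoln", "log cabin rail splitter honest abe");

            // Stage 3: evidence scoring -- keyword overlap stands in for
            // Watson's hundreds of scoring algorithms.
            List<Hypothesis> ranked = new ArrayList<>();
            for (var source : sources.entrySet()) {
                long overlap = keywords.stream().filter(source.getValue()::contains).count();
                ranked.add(new Hypothesis(source.getKey(), (double) overlap / keywords.size()));
            }

            // Stage 4: merge and rank hypotheses by confidence.
            ranked.sort(Comparator.comparingDouble(Hypothesis::score).reversed());
            ranked.forEach(h -> System.out.printf("%s (confidence %.2f)%n", h.answer, h.score));
        }
    }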

In recent years[when?], Watson's capabilities have been extended and the way in which Watson works has been changed to take advantage of new deployment models (Watson on IBM Cloud), evolved machine learning capabilities, and optimized hardware available to developers and researchers.[citation needed]

Software

Watson uses IBM's DeepQA software and the Apache UIMA (Unstructured Information Management Architecture) framework implementation. The system was written in various languages, including Java, C++, and Prolog, and runs on the SUSE Linux Enterprise Server 11 operating system using the Apache Hadoop framework to provide distributed computing.[11][12][13]

Beyond the DeepQA system, Watson contained several strategy modules. One module calculated the amount to bet in Final Jeopardy! from the confidence score of its candidate response and the current scores of all contestants. Another used Bayes' rule to calculate the probability that each unrevealed clue might be a Daily Double, using historical data from the J! Archive as the prior. When a Daily Double was found, the amount to wager was computed by a two-layer neural network of the same kind as TD-Gammon, the backgammon-playing neural network developed by Gerald Tesauro in the 1990s.[14] The parameters of the strategy modules were tuned by benchmarking candidate settings against a statistical model of human contestants fitted to data from the J! Archive and selecting the best-performing configuration.[15][16][17]
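
The Daily Double calculation described above can be illustrated with a small sketch: a prior over board positions, of the kind estimable from J! Archive placement frequencies, is renormalized over the cells not yet revealed. The row probabilities and names below are hypothetical stand-ins, not Watson's actual parameters.

    // Illustrative Bayes-style Daily Double locator: a historical prior
    // over board rows, spread across categories and renormalized over
    // the unrevealed cells. All numbers here are hypothetical.
    public class DailyDoubleSketch {
        public static void main(String[] args) {
            double[] rowPrior = {0.01, 0.09, 0.25, 0.38, 0.27}; // P(DD in row), top to bottom
            boolean[][] revealed = new boolean[5][6];           // 5 rows x 6 categories
            revealed[3][2] = true;                              // a cell already played

            double[][] posterior = new double[5][6];
            double total = 0;
            for (int r = 0; r < 5; r++)
                for (int c = 0; c < 6; c++)
                    if (!revealed[r][c]) total += (posterior[r][c] = rowPrior[r] / 6);

            int bestR = 0, bestC = 0;
            for (int r = 0; r < 5; r++)
                for (int c = 0; c < 6; c++) {
                    posterior[r][c] /= total; // condition on the DD being unrevealed
                    if (posterior[r][c] > posterior[bestR][bestC]) { bestR = r; bestC = c; }
                }
            System.out.printf("Most likely Daily Double cell: row %d, column %d (p=%.3f)%n",
                    bestR + 1, bestC + 1, posterior[bestR][bestC]);
        }
    }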

Hardware

The system is workload-optimized, integrating massively parallel POWER7 processors, and is built on IBM's DeepQA technology,[18] which it uses to generate hypotheses, gather evidence on a massive scale, and analyze data.[1] Watson employs a cluster of ninety IBM Power 750 servers, each of which uses a 3.5 GHz POWER7 eight-core processor, with four threads per core. In total, the system uses 2,880 POWER7 processor threads and 16 terabytes of RAM.[18]
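
Those cluster figures multiply out directly to the thread total quoted above; a one-line check with our own variable names:

    // The cluster figures quoted above multiply out to the totals cited:
    // 90 servers x 8 cores per server x 4 SMT threads per core = 2,880 threads.
    public class ThreadCount {
        public static void main(String[] args) {
            int servers = 90, coresPerServer = 8, threadsPerCore = 4;
            System.out.println(servers * coresPerServer * threadsPerCore); // prints 2880
        }
    }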

According to John Rennie, Watson can process 500 gigabytes (the equivalent of a million books) per second.[19] IBM master inventor and senior consultant Tony Pearson estimated Watson's hardware cost at about three million dollars.[20] Its Linpack performance stands at 80 teraFLOPS, about half the speed needed to make the TOP500 supercomputer list at the time.[21] According to Rennie, all content was stored in Watson's RAM for the Jeopardy! game because data stored on hard drives would be too slow to compete with human Jeopardy! champions.[19]

Data

The sources of information for Watson include encyclopedias, dictionaries, thesauri, newswire articles, and literary works. Watson also used databases, taxonomies, and ontologies, including DBpedia, WordNet, and YAGO.[22] The IBM team provided Watson with millions of documents, including dictionaries, encyclopedias, and other reference material, that it could use to build its knowledge.[23]

Operation

Watson parses questions into different keywords and sentence fragments in order to find statistically related phrases.[23] Watson's main innovation was not in the creation of a new algorithm for this operation, but rather its ability to quickly execute hundreds of proven language analysis algorithms simultaneously.[23][24] The more algorithms that find the same answer independently, the more likely Watson is to be correct. Once Watson has a small number of potential solutions, it is able to check against its database to ascertain whether the solution makes sense or not.[23]
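
A toy illustration of that agreement principle: several independent analyzers each propose a response, and candidates are ranked by how many analyzers converge on the same answer. The analyzers below are trivial placeholders for Watson's hundreds of language-analysis algorithms.

    // Toy illustration of agreement-based confidence: independent
    // analyzers vote, and candidate answers are ranked by how many
    // analyzers agree. A hypothetical sketch, not Watson's scoring model.
    import java.util.*;
    import java.util.function.Function;

    public class AgreementSketch {
        public static void main(String[] args) {
            String clue = "Largest planet in the Solar System";
            // Stand-ins for Watson's independent analysis algorithms.
            List<Function<String, String>> analyzers = List.of(
                    c -> "Jupiter",   // e.g., an encyclopedia lookup
                    c -> "Jupiter",   // e.g., a statistical passage search
                    c -> "Saturn",    // a noisier algorithm disagrees
                    c -> "Jupiter");

            Map<String, Long> votes = new HashMap<>();
            for (var analyzer : analyzers) votes.merge(analyzer.apply(clue), 1L, Long::sum);

            votes.entrySet().stream()
                 .sorted(Map.Entry.<String, Long>comparingByValue().reversed())
                 .forEach(e -> System.out.printf("%s: %d of %d analyzers agree%n",
                         e.getKey(), e.getValue(), analyzers.size()));
        }
    }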

Comparison with human players

Ken Jennings, Watson, and Brad Rutter in their Jeopardy! exhibition match

Watson's basic working principle is to parse keywords in a clue while searching for related terms as responses. This gives Watson some advantages and disadvantages compared with human Jeopardy! players.[25] Watson can read, analyze, and learn from natural language, which gives it the ability to make human-like decisions,[26] but it has deficiencies in understanding the context of the clues. As a result, human players usually generate responses faster than Watson, especially to short clues.[23] Watson's programming prevents it from using the popular tactic of buzzing before it is sure of its response.[23] However, Watson has consistently better reaction time on the buzzer once it has generated a response, and is immune to human players' psychological tactics, such as jumping between categories on every clue.[23][27]

In a sequence of 20 mock games of Jeopardy!, human participants were able to exploit the six to seven seconds that Watson needed to hear the clue and decide whether to signal for responding.[23] During that time, Watson also had to evaluate the response and determine whether it was sufficiently confident in the result to signal.[23] Part of the system used to win the Jeopardy! contest was the electronic circuitry that received the "ready" signal and then examined whether Watson's confidence level was great enough to activate the buzzer. Given the speed of this circuitry compared to human reaction times, Watson's reaction time was faster than that of the human contestants except when the human anticipated (instead of reacted to) the ready signal.[28] After signaling, Watson speaks with an electronic voice and gives the responses in Jeopardy!'s question format.[23] Watson's voice was synthesized from recordings that actor Jeff Woodman made for an IBM text-to-speech program in 2004.[29]
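
The decision that circuitry implements can be stated compactly: when the ready signal arrives, buzz only if the best response has already cleared a confidence threshold. A minimal sketch, with a hypothetical threshold value and method names of our own:

    // Sketch of the buzz decision described above: on the "ready" signal,
    // activate the buzzer only if the best response has cleared a
    // confidence threshold. Threshold and names are hypothetical.
    public class BuzzGate {
        static final double BUZZ_THRESHOLD = 0.50; // hypothetical cut-off

        static boolean onReadySignal(double confidence) {
            // Fires within milliseconds of the signal: no human-style
            // anticipation, but also no perception delay.
            return confidence >= BUZZ_THRESHOLD;
        }

        public static void main(String[] args) {
            System.out.println(onReadySignal(0.93)); // true: buzz
            System.out.println(onReadySignal(0.14)); // false: stay silent
        }
    }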

The Jeopardy! staff used different means to notify Watson and the human players when to buzz,[28] which was critical in many rounds.[27] The humans were notified by a light, which took them tenths of a second to perceive.[30][31] Watson was notified by an electronic signal and could activate the buzzer within about eight milliseconds.[32] The humans tried to compensate for the perception delay by anticipating the light,[33] but the variation in the anticipation time was generally too great to fall within Watson's response time.[27] Watson did not attempt to anticipate the notification signal.[31][33]

History

Development

Since Deep Blue's victory over Garry Kasparov in chess in 1997, IBM had been on the hunt for a new challenge. In 2004, IBM Research manager Charles Lickel, over dinner with coworkers, noticed that the restaurant they were in had fallen silent. He soon discovered the cause of the silence: Ken Jennings, who was then in the middle of his successful 74-game run on Jeopardy!. Nearly the entire restaurant had piled toward the televisions, mid-meal, to watch the show. Intrigued by the quiz show as a possible challenge for IBM, Lickel passed the idea on, and in 2005, IBM Research executive Paul Horn backed Lickel, pushing for someone in his department to take up the challenge of playing Jeopardy! with an IBM system. Though Horn initially had trouble finding any research staff willing to take on what looked to be a much more complex challenge than the wordless game of chess, eventually David Ferrucci took him up on the offer.[34] In competitions managed by the United States government, Watson's predecessor, a system named Piquant, was usually able to respond correctly to only about 35% of clues and often required several minutes to respond.[35][36][37] To compete successfully on Jeopardy!, Watson would need to respond in no more than a few seconds, and at that time, the problems posed by the game show were deemed impossible to solve.[23]

In initial tests run during 2006 by David Ferrucci, the senior manager of IBM's Semantic Analysis and Integration department, Watson was given 500 clues from past Jeopardy! programs. While the best real-life competitors buzzed in half the time and responded correctly to as many as 95% of clues, Watson's first pass could get only about 15% correct. During 2007, the IBM team was given three to five years and a staff of 15 people to solve the problems.[23] John E. Kelly III succeeded Paul Horn as head of IBM Research in 2007.[38] InformationWeek described Kelly as "the father of Watson" and credited him for encouraging the system to compete against humans on Jeopardy!.[39] By 2008, the developers had advanced Watson such that it could compete with Jeopardy! champions.[23] By February 2010, Watson could beat human Jeopardy! contestants on a regular basis.[40]

During the game, Watson had access to 200 million pages of structured and unstructured content, consuming four terabytes of disk storage,[11] including the full text of the 2011 edition of Wikipedia,[41] but was not connected to the Internet.[42][23] For each clue, Watson's three most probable responses were displayed on the television screen. Watson consistently outperformed its human opponents on the game's signaling device, but had trouble in a few categories, notably those having short clues containing only a few words.[citation needed]

Although the system is primarily an IBM effort, Watson's development involved faculty and graduate students from Rensselaer Polytechnic Institute, Carnegie Mellon University, University of Massachusetts Amherst, the University of Southern California's Information Sciences Institute, the University of Texas at Austin, the Massachusetts Institute of Technology, and the University of Trento,[9] as well as students from New York Medical College.[43] Among the team of IBM programmers who worked on Watson was 2001 Who Wants to Be a Millionaire? top prize winner Ed Toutant, who himself had appeared on Jeopardy! in 1989 (winning one game).[44]

Jeopardy!

Preparation

Watson demo at an IBM booth at a trade show

In 2008, IBM representatives communicated with Jeopardy! executive producer Harry Friedman about the possibility of having Watson compete against Ken Jennings and Brad Rutter, two of the most successful contestants on the show, and the program's producers agreed.[23][45] Watson's differences with human players had generated conflicts between IBM and Jeopardy! staff during the planning of the competition.[25] IBM repeatedly expressed concerns that the show's writers would exploit Watson's cognitive deficiencies when writing the clues, thereby turning the game into a Turing test. To alleviate that claim, a third party randomly picked the clues from previously written shows that were never broadcast.[25] Stephen Baker, a journalist who recorded Watson's development in his book Final Jeopardy, reported that the conflict between IBM and Jeopardy! became so serious in May 2010 that the competition was almost cancelled.[25]

Jeopardy! staff also showed concerns over Watson's reaction time on the buzzer. Originally Watson signaled electronically, but show staff requested that it press a button physically, as the human contestants would.[46] Even with a robotic "finger" pressing the buzzer, Watson remained faster than its human competitors. Ken Jennings noted, "If you're trying to win on the show, the buzzer is all", and that Watson "can knock out a microsecond-precise buzz every single time with little or no variation. Human reflexes can't compete with computer circuits in this regard."[27][33][47]

As part of the preparation, IBM constructed a mock set in a conference room at one of its technology sites to model the one used on Jeopardy!. Human players, including former Jeopardy! contestants, also participated in mock games against Watson, with Todd Alan Crain of The Onion playing host.[23] About 100 test matches were conducted, with Watson winning 65% of the games.[48]

To provide a physical presence in the televised games, Watson was represented by an "avatar" of a globe, inspired by the IBM "smarter planet" symbol. Jennings described the computer's avatar as a "glowing blue ball crisscrossed by 'threads' of thought—42 threads, to be precise",[49] and stated that the number of thought threads in the avatar was an in-joke referencing the significance of the number 42 in Douglas Adams' Hitchhiker's Guide to the Galaxy.[49] Joshua Davis, the artist who designed the avatar for the project, explained to Stephen Baker that there are 36 triggerable states that Watson was able to use throughout the game to show its confidence in responding to a clue correctly; he had hoped to be able to find forty-two, to add another level to the Hitchhiker's Guide reference, but he was unable to pinpoint enough game states.[50]

A practice match was recorded on January 13, 2011, and the official matches were recorded on January 14, 2011. All participants maintained secrecy about the outcome until the match was broadcast in February.[51]

Practice match

In a practice match before the press on January 13, 2011, Watson won a 15-question round against Ken Jennings and Brad Rutter with a score of $4,400 to Jennings's $3,400 and Rutter's $1,200, though Jennings and Watson were tied before the final $1,000 question. None of the three players responded incorrectly to a clue.[52]

First match

The first round was broadcast February 14, 2011, and the second round, on February 15, 2011. The right to choose the first category had been determined by a draw won by Rutter.[53] Watson, represented by a computer monitor display and artificial voice, responded correctly to the second clue and then selected the fourth clue of the first category, a deliberate strategy to find the Daily Double as quickly as possible.[54] Watson's guess at the Daily Double location was correct. At the end of the first round, Watson was tied with Rutter at $5,000; Jennings had $2,000.[53]

Watson's performance was characterized by some quirks. In one instance, Watson repeated a reworded version of an incorrect response offered by Jennings. (Jennings said "What are the '20s?" in reference to the 1920s. Then Watson said "What is 1920s?") Because Watson could not recognize other contestants' responses, it did not know that Jennings had already given the same response. In another instance, Watson was initially given credit for a response of "What is a leg?" after Jennings incorrectly responded "What is: he only had one hand?" to a clue about George Eyser (the correct response was, "What is: he's missing a leg?"). Because Watson, unlike a human, could not have been responding to Jennings's mistake, it was decided that this response was incorrect. The broadcast version of the episode was edited to omit Trebek's original acceptance of Watson's response.[55] Watson also demonstrated complex wagering strategies on the Daily Doubles, with one bet at $6,435 and another at $1,246.[56] Gerald Tesauro, one of the IBM researchers who worked on Watson, explained that Watson's wagers were based on its confidence level for the category and a complex regression model called the Game State Evaluator.[17]

Watson took a commanding lead in Double Jeopardy!, correctly responding to both Daily Doubles. Watson responded to the second Daily Double correctly with a 32% confidence score.[56]

However, during the Final Jeopardy! round, Watson was the only contestant to miss the clue in the category U.S. Cities ("Its largest airport was named for a World War II hero; its second largest, for a World War II battle"). Rutter and Jennings gave the correct response of Chicago, but Watson's response was "What is Toronto?????" with five question marks appended indicating a lack of confidence.[56][57][58] Ferrucci offered reasons why Watson would appear to have guessed a Canadian city: categories only weakly suggest the type of response desired, the phrase "U.S. city" did not appear in the question, there are cities named Toronto in the U.S., and Toronto in Ontario has an American League baseball team.[59] Chris Welty, who also worked on Watson, suggested that it may not have been able to correctly parse the second part of the clue, "its second largest, for a World War II battle" (which was not a standalone clause despite it following a semicolon, and required context to understand that it was referring to a second-largest airport).[60] Eric Nyberg, a professor at Carnegie Mellon University and a member of the development team, stated that the error occurred because Watson does not possess the comparative knowledge to discard that potential response as not viable.[58] Although not displayed to the audience as with non-Final Jeopardy! questions, Watson's second choice was Chicago. Both Toronto and Chicago were well below Watson's confidence threshold, at 14% and 11% respectively. Watson wagered only $947 on the question.[61]

The game ended with Jennings with $4,800, Rutter with $10,400, and Watson with $35,734.[56]

Second match

During the introduction, Trebek (a Canadian native) joked that he had learned Toronto was a U.S. city, and Watson's error in the first match prompted an IBM engineer to wear a Toronto Blue Jays jacket to the recording of the second match.[62]

In the first round, Jennings was finally able to choose a Daily Double clue,[63] while Watson responded to one Daily Double clue incorrectly for the first time in the Double Jeopardy! round.[64] After the first round, Watson placed second for the first time in the competition, after Rutter and Jennings briefly succeeded in building their scores before Watson could respond.[64][65] Nonetheless, the final result was a victory for Watson with a score of $77,147, besting Jennings, who scored $24,000, and Rutter, who scored $21,600.[66]

Final outcome

The prizes for the competition were $1 million for first place (Watson), $300,000 for second place (Jennings), and $200,000 for third place (Rutter). As promised, IBM donated 100% of Watson's winnings to charity, with 50% of those winnings going to World Vision and 50% going to World Community Grid.[67] Similarly, Jennings and Rutter donated 50% of their winnings to their respective charities.[68]

In acknowledgement of IBM and Watson's achievements, Jennings made an additional remark in his Final Jeopardy! response: "I for one welcome our new computer overlords", paraphrasing a joke from The Simpsons.[69][70] Jennings later wrote an article for Slate, in which he stated:

IBM has bragged to the media that Watson's question-answering skills are good for more than annoying Alex Trebek. The company sees a future in which fields like medical diagnosis, business analytics, and tech support are automated by question-answering software like Watson. Just as factory jobs were eliminated in the 20th century by new assembly-line robots, Brad and I were the first knowledge-industry workers put out of work by the new generation of 'thinking' machines. 'Quiz show contestant' may be the first job made redundant by Watson, but I'm sure it won't be the last.[49]

Philosophy

Philosopher John Searle argues that Watson—despite impressive capabilities—cannot actually think.[71] Drawing on his Chinese room thought experiment, Searle claims that Watson, like other computational machines, is capable only of manipulating symbols, but has no ability to understand the meaning of those symbols; however, Searle's experiment has its detractors.[72]

Match against members of the United States Congress

On February 28, 2011, Watson played an untelevised exhibition match of Jeopardy! against members of the United States House of Representatives. In the first round, Rush D. Holt, Jr. (D-NJ, a former Jeopardy! contestant), who was challenging the computer with Bill Cassidy (R-LA, later Senator from Louisiana), led with Watson in second place. However, combining the scores between all matches, the final score was $40,300 for Watson and $30,000 for the congressional players combined.[73]

IBM's Christopher Padilla said of the match, "The technology behind Watson represents a major advancement in computing. In the data-intensive environment of government, this type of technology can help organizations make better decisions and improve how government helps its citizens."[73]

Applications

After the national press attention gained by the 2011 Jeopardy! appearance, IBM sought out partnerships in fields from education to weather and from cancer treatment to retail chatbots in order to convince businesses of Watson's alleged capabilities. These efforts ultimately failed to turn Watson into a profit-making product for the company.[74]

In 2011, IBM's general counsel wrote in The National Law Review that the legal profession would become more efficient and better with Watson.[75] After the national attention that Jeopardy! afforded it, IBM began an ultimately unsuccessful and expensive project in which Memorial Sloan Kettering Cancer Center tried to use Watson to help doctors diagnose and treat cancer patients. The division ultimately cost $4 billion to develop but was sold in 2022 for a quarter of that, $1 billion.[76] By 2023, Watson had cost IBM 10% of its stock value, had cost four times more than it brought in, and had led to mass layoffs.[74]

From 2012 through the late 2010s, Watson's technology was used to create applications, most since discontinued,[77] to help people make decisions in a variety of areas, among them:

  • diagnosing cancer and planning treatments,[78]
  • retail shopping,[79]
  • medical equipment purchasing,[80]
  • cooking and recipes,[81][82]
  • water conservation,[83]
  • hospitality management,[84]
  • human genetic sequencing,[84]
  • music development and identification,[85]
  • weather forecasting,[86]
  • selling ads with weather forecasts,[87]
  • tutoring students,[88]
  • and tax preparation.[89]

In 2021, Steve Lohr, a technology reporter at The New York Times, explained:

The company’s missteps with Watson began with its early emphasis on big and difficult initiatives intended to generate both acclaim and sizable revenue for the company, according to many of the more than a dozen current and former IBM managers and scientists interviewed for this article. Several of those people asked not to be named because they had not been authorized to speak or still had business ties to IBM.

— Steve Lohr, "What Ever Happened to IBM's Watson?", The New York Times[77]

Writing in The Atlantic in 2023, Mac Schwerin argued that IBM's leadership fundamentally did not understand the technology, leading to the hardship and strain caused by the project, saying:

But the suits in charge went after the bigger and more technically challenging game of feeding the machine entirely different types of material. They viewed Watson as a generational meal ticket.

— Mac Schwerin, "America Forgot About IBM Watson. Is ChatGPT Next?", The Atlantic[90]

In the end, IBM's initial vision of Watson as a transformative technology capable of revolutionizing industries did not materialize as anticipated.[91] Watson's capabilities were primarily suited to specific tasks, such as natural language processing for trivia games, rather than generalized commercial problem-solving.[92] This mismatch between Watson's capabilities and IBM's marketing contributed significantly to Watson's commercial struggles and eventual decline. The overstated claims about Watson's abilities also turned public sentiment against Watson and artificial intelligence more broadly.[77]

Between 2019 and 2023, IBM shifted focus to a separate initiative, watsonx, distinct from Watson, which aims for narrower, industry-targeted technology within IBM's cloud computing and platform-based strategies.[77][74]

Healthcare

IBM's Watson was used to analyze medical datasets and provide physicians with guidance on diagnoses and cancer treatment decisions.[93][94] When a physician submitted a query to Watson, the system started a multi-step process: parsing the input to identify key information, examining patient data to uncover relevant medical and hereditary history, and finally comparing various data sources to form and test hypotheses.[95][94]
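
That multi-step flow is easiest to see as a pipeline: parse the query, fold in patient history, then test treatment hypotheses against reference sources. The sketch below is purely illustrative, with hypothetical types and a keyword-overlap stand-in for hypothesis testing; none of it comes from Watson Health:

    // Hypothetical outline of the multi-step clinical query flow described
    // above: parse the physician's question, pull in relevant patient
    // history, then test treatment hypotheses against reference sources.
    // Purely illustrative; no types or rules here come from Watson Health.
    import java.util.*;

    public class ClinicalQuerySketch {
        record Suggestion(String treatment, double confidence) {}

        static List<Suggestion> answer(String query, List<String> patientHistory,
                                       Map<String, List<String>> guidelines) {
            // Step 1: parse the input to identify key information.
            Set<String> terms = new HashSet<>(Arrays.asList(query.toLowerCase().split("\\s+")));
            // Step 2: fold in relevant patient history.
            patientHistory.forEach(h -> terms.addAll(Arrays.asList(h.toLowerCase().split("\\s+"))));
            // Step 3: form and test hypotheses against each guideline source.
            List<Suggestion> out = new ArrayList<>();
            for (var guideline : guidelines.entrySet()) {
                long hits = guideline.getValue().stream().filter(terms::contains).count();
                out.add(new Suggestion(guideline.getKey(), (double) hits / guideline.getValue().size()));
            }
            out.sort(Comparator.comparingDouble(Suggestion::confidence).reversed());
            return out;
        }

        public static void main(String[] args) {
            var guidelines = Map.of(
                    "chemotherapy regimen A", List.of("nsclc", "stage", "iii"),
                    "targeted therapy B", List.of("nsclc", "egfr", "mutation"));
            var result = answer("treatment options for stage III NSCLC",
                    List.of("no known driver alterations"), guidelines);
            result.forEach(s -> System.out.printf("%s (%.2f)%n", s.treatment, s.confidence));
        }
    }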

IBM claimed that Watson could draw from a wide range of sources, including treatment guidelines, electronic medical records, and research materials.[94] However, company executives would later blame the project's ultimate failure on a lack of data.[76]

Notably, Watson was not involved in the actual diagnosis process; rather, it assisted doctors in identifying suitable treatment options for patients who had already been diagnosed.[96] A study of 1,000 challenging patient cases found that Watson's recommendations matched those of human doctors in 99% of cases.[97]

IBM established partnerships with the Cleveland Clinic,[98] the MD Anderson Cancer Center, and Memorial Sloan Kettering Cancer Center to further its mission in healthcare. In 2011, IBM entered into a research partnership with Nuance Communications and physicians at the University of Maryland and Harvard to develop a commercial product using Watson's clinical decision support capabilities. IBM partnered with WellPoint (now Elevance Health) in 2011 to utilize Watson in suggesting treatment options to physicians,[99] and in 2013, Watson was deployed in its first commercial application for utilization management decisions in lung cancer treatment at Memorial Sloan Kettering Cancer Center.[8] The Cleveland Clinic collaboration aimed to enhance Watson's health expertise and support medical professionals in treating patients more effectively. However, the MD Anderson Cancer Center pilot program, initiated in 2013, ultimately failed to meet its goals and was discontinued after $65 million in investment.[100][101][98]

In 2016, IBM launched "IBM Watson for Oncology," a product designed to provide personalized, evidence-based cancer care options to physicians and patients.[91] This initiative marked a significant milestone in the adoption of Watson's technology in the healthcare industry. Additionally, IBM partnered with Manipal Hospitals in India to offer Watson's expertise to patients online.[102][103]

The company ultimately faced challenges in the healthcare market, with no profit and increased competition.[91] In 2022, IBM announced the sell-off of its Watson Health unit to Francisco Partners, marking a significant shift in the company's approach to the healthcare industry.[91][76]

IBM Watson Group

On January 9, 2014, IBM announced it was creating a business unit around Watson.[104] The IBM Watson Group would be headquartered in New York City's Silicon Alley and employ 2,000 people, and IBM invested $1 billion to get the division going. The group planned three new cloud-delivered services: Watson Discovery Advisor, Watson Engagement Advisor, and Watson Explorer. Watson Discovery Advisor would focus on research and development projects in the pharmaceutical, publishing, and biotechnology industries; Watson Engagement Advisor would focus on self-service applications using insights drawn from natural language questions posed by business users; and Watson Explorer would help enterprise users uncover and share data-driven insights more easily through federated search.[104] The company also launched a $100 million venture fund to spur development of "cognitive" applications. According to IBM, the cloud-delivered, enterprise-ready Watson had become 24 times faster (a 2,300 percent improvement in performance) while shrinking in physical size by 90 percent, from the size of a master bedroom to three stacked pizza boxes.[104] IBM CEO Virginia Rometty said she wanted Watson to generate $10 billion in annual revenue within ten years.[105]

In 2017, IBM and MIT established a new joint research venture in artificial intelligence. IBM invested $240 million to create the MIT–IBM Watson AI Lab, which brings together researchers in academia and industry to advance AI research, with projects ranging from computer vision and natural language processing to new ways of ensuring that AI systems are fair, reliable, and secure.[106] In March 2018, Rometty proposed "Watson's Law": the "use of and application of business, smart cities, consumer applications and life in general."[107]

References

  1. ^ a b c "DeepQA Project: FAQ". IBM. 22 April 2009. Archived from the original on June 29, 2011. Retrieved February 11, 2011.
  2. ^ Ferrucci, David; Levas, Anthony; Bagchi, Sugato; Gondek, David; Mueller, Erik T. (2025-08-05). "Watson: Beyond Jeopardy!". Artificial Intelligence. 199: 93–105. doi:10.1016/j.artint.2012.06.009.
  3. ^ a b Hale, Mike (February 8, 2011). "Actors and Their Roles for $300, HAL? HAL!". The New York Times. Retrieved February 11, 2011.
  4. ^ "The DeepQA Project". IBM Research. 22 April 2009. Archived from the original on June 29, 2011. Retrieved February 18, 2011.
  5. ^ "Dave Ferrucci at Computer History Museum – How It All Began and What's Next". IBM Research. December 1, 2011. Archived from the original on March 13, 2012. Retrieved February 11, 2012.
  6. ^ Loftus, Jack (April 26, 2009). "IBM Prepping 'Watson' Computer to Compete on Jeopardy!". Gizmodo. Archived from the original on July 31, 2017. Retrieved September 18, 2017.
  7. ^ "IBM's "Watson" Computing System to Challenge All Time Jeopardy! Champions". Sony Pictures Television. December 14, 2010. Archived from the original on June 16, 2013.
  8. ^ a b Upbin, Bruce (February 8, 2013). "IBM's Watson Gets Its First Piece Of Business In Healthcare". Forbes. Archived from the original on September 18, 2017. Retrieved September 18, 2017.
  9. ^ a b Ferrucci, D.; et al. (2010). "Building Watson: An Overview of the DeepQA Project". AI Magazine. 31 (3): 59–79. doi:10.1609/aimag.v31i3.2303. Archived from the original on December 28, 2017. Retrieved February 19, 2011.
  10. ^ "Watson, A System Designed for Answers: The Future of Workload Optimized Systems Design". IBM Systems and Technology. February 2011. p. 3. Archived from the original on March 4, 2016. Retrieved September 9, 2015.
  11. ^ a b Jackson, Joab (February 17, 2011). "IBM Watson Vanquishes Human Jeopardy Foes". PC World. IDG News. Archived from the original on February 20, 2011. Retrieved February 17, 2011.
  12. ^ Takahashi, Dean (February 17, 2011). "IBM researcher explains what Watson gets right and wrong". VentureBeat. Archived from the original on February 18, 2011. Retrieved February 18, 2011.
  13. ^ Novell (February 2, 2011). "Watson Supercomputer to Compete on 'Jeopardy!' – Powered by SUSE Linux Enterprise Server on IBM POWER7". The Wall Street Journal. Archived from the original on April 21, 2011. Retrieved February 21, 2011.
  14. ^ Tesauro, Gerry (2025-08-05). "How Watson Learns Superhuman Jeopardy! Strategies". YouTube. IBM Research. Retrieved 2025-08-05.
  15. ^ Tesauro, G.; Gondek, D. C.; Lenchner, J.; Fan, J.; Prager, J. M. (May 2012). "Simulation, learning, and optimization techniques in Watson's game strategies". IBM Journal of Research and Development. 56 (3.4): 16:1–16:11. doi:10.1147/JRD.2012.2188931. ISSN 0018-8646.
  16. ^ Tesauro, G.; Gondek, D. C.; Lenchner, J.; Fan, J.; Prager, J. M. (2025-08-05). "Analysis of Watson's Strategies for Playing Jeopardy!". Journal of Artificial Intelligence Research. 47: 205–251. arXiv:1402.0571. doi:10.1613/jair.3834. ISSN 1076-9757.
  17. ^ a b Tesauro, Gerald (February 13, 2011). "Watson's wagering strategies". IBM Research News. IBM. Archived from the original on February 18, 2011. Retrieved February 18, 2011.
  18. ^ a b "Is Watson the smartest machine on earth?". Computer Science and Electrical Engineering Department. University of Maryland, Baltimore County. February 10, 2011. Archived from the original on September 27, 2011. Retrieved February 11, 2011.
  19. ^ a b Rennie, John (February 14, 2011). "How IBM's Watson Computer Excels at Jeopardy!". PLoS blogs. Archived from the original on February 22, 2011. Retrieved February 19, 2011.
  20. ^ Lucas, Mearian (February 21, 2011). "Can anyone afford an IBM Watson supercomputer? (Yes)". Computerworld. Archived from the original on December 12, 2013. Retrieved February 21, 2011.
  21. ^ "Top500 List – November 2013". Top500.org. Archived from the original on 2025-08-05. Retrieved 2025-08-05.
  22. ^ Ferrucci, David; et al. "The AI Behind Watson – The Technical Article". AI Magazine (Fall 2010). Archived from the original on November 6, 2020. Retrieved November 11, 2013.
  23. ^ a b c d e f g h i j k l m n o p Thompson, Clive (June 16, 2010). "Smarter Than You Think: What Is I.B.M.'s Watson?". The New York Times Magazine. Archived from the original on June 5, 2011. Retrieved February 18, 2011.
  24. ^ "Will Watson Win On Jeopardy!?". Nova ScienceNOW. Public Broadcasting Service. January 20, 2011. Archived from the original on April 14, 2011. Retrieved January 27, 2011.
  25. ^ a b c d Needleman, Rafe (February 18, 2011). "Reporters' Roundtable: Debating the robobrains". CNET. Retrieved February 18, 2011.[dead link]
  26. ^ Russo-Spena, Tiziana; Mele, Cristina; Marzullo, Marialuisa (2018). "Practising Value Innovation through Artificial Intelligence: The IBM Watson Case". Journal of Creating Value. 5 (1): 11–24. doi:10.1177/2394964318805839. ISSN 2394-9643. S2CID 56759835.
  27. ^ a b c d "Jeopardy! Champ Ken Jennings". The Washington Post. February 15, 2011. Archived from the original on February 14, 2011. Retrieved February 15, 2011.
  28. ^ a b Gondek, David (January 10, 2011). "How Watson "sees," "hears," and "speaks" to play Jeopardy!". IBM Research News. Retrieved February 21, 2011.
  29. ^ Avery, Lise (February 14, 2011). "Interview with Actor Jeff Woodman, Voice of IBM's Watson Computer" (MP3). Anything Goes!!. Archived from the original on September 21, 2019. Retrieved February 15, 2011.
  30. ^ Kosinski, Robert J. (2008). "A Literature Review on Reaction Time". Clemson University. Archived from the original on March 17, 2016. Retrieved January 10, 2016.
  31. ^ a b Baker (2011), p. 174.
  32. ^ Baker (2011), p. 178.
  33. ^ a b c Strachan, Alex (February 12, 2011). "For Jennings, it's a man vs. man competition". The Vancouver Sun. Archived from the original on February 21, 2011. Retrieved February 15, 2011.
  34. ^ Baker (2011), pp. 6–8.
  35. ^ Baker (2011), p. 30.
  36. ^ Radev, Dragomir R.; Prager, John; Samn, Valerie (2000). "Ranking potential answers to natural language questions" (PDF). Proceedings of the 6th Conference on Applied Natural Language Processing. Archived from the original (PDF) on 2025-08-05. Retrieved 2025-08-05.
  37. ^ Prager, John; Brown, Eric; Coden, Annie; Radev, Dragomir R. (July 2000). "Question-answering by predictive annotation" (PDF). Proceedings, 23rd Annual International ACM SIGIR Conference on Research and Development in Information Retrieval. Archived from the original (PDF) on 2025-08-05. Retrieved 2025-08-05.
  38. ^ Leopold, George (July 18, 2007). "IBM's Paul Horn retires, Kelly named research chief". EE Times. Archived from the original on June 3, 2020. Retrieved May 27, 2020.
  39. ^ Babcock, Charles (October 14, 2015). "IBM Cognitive Colloquium Spotlights Uncovering Dark Data". InformationWeek. Archived from the original on June 3, 2020. Retrieved May 27, 2020.
  40. ^ Brodkin, Jon (February 10, 2010). "IBM's Jeopardy-playing machine can now beat human contestants". Network World. Archived from the original on June 3, 2013. Retrieved February 19, 2011.
  41. ^ Zimmer, Ben (February 17, 2011). "Is It Time to Welcome Our New Computer Overlords?". The Atlantic. Archived from the original on August 29, 2018. Retrieved February 17, 2011.
  42. ^ Raz, Guy (January 28, 2011). "Can a Computer Become a Jeopardy! Champ?". National Public Radio. Archived from the original on February 28, 2011. Retrieved February 18, 2011.
  43. ^ "Medical Students Offer Expertise to IBM's Jeopardy!-Winning Computer Watson as It Pursues a New Career in Medicine" (PDF). InTouch. 18. New York Medical College: 4. June 2012. Archived from the original (PDF) on 2025-08-05.
  44. ^ "'Millionaire' quiz whiz Toutant had passion for trivia, Austin's arts scene". Archived from the original on 2025-08-05. Retrieved 2025-08-05.
  45. ^ Stelter, Brian (December 14, 2010). "I.B.M. Supercomputer 'Watson' to Challenge 'Jeopardy' Stars". The New York Times. Retrieved December 14, 2010.
  46. ^ Baker (2011), p. 171.
  47. ^ Flatow, Ira (February 11, 2011). "IBM Computer Faces Off Against 'Jeopardy' Champs". Talk of the Nation. National Public Radio. Archived from the original on February 17, 2011. Retrieved February 15, 2011.
  48. ^ Sostek, Anya (February 13, 2011). "Human champs of 'Jeopardy!' vs. Watson the IBM computer: a close match". Pittsburgh Post Gazette. Archived from the original on February 17, 2011. Retrieved February 19, 2011.
  49. ^ a b c Jennings, Ken (February 16, 2011). "My Puny Human Brain". Slate. Newsweek Interactive Co. LLC. Archived from the original on February 18, 2011. Retrieved February 17, 2011.
  50. ^ Baker (2011), p. 117.
  51. ^ Baker (2011), pp. 232–258.
  52. ^ Dignan, Larry (January 13, 2011). "IBM's Watson wins Jeopardy practice round: Can humans hang?". ZDnet. Archived from the original on January 13, 2011. Retrieved January 13, 2011.
  53. ^ a b "The IBM Challenge Day 1". Jeopardy. Season 27. Episode 23. February 14, 2011.
  54. ^ Lenchner, Jon (February 3, 2011). "Knowing what it knows: selected nuances of Watson's strategy". IBM Research News. IBM. Archived from the original on February 16, 2011. Retrieved February 16, 2011.
  55. ^ Johnston, Casey (February 15, 2011). "Jeopardy: IBM's Watson almost sneaks wrong answer by Trebek". Ars Technica. Archived from the original on February 18, 2011. Retrieved February 15, 2011.
  56. ^ a b c d "Computer crushes the competition on 'Jeopardy!'". Associated Press. February 15, 2011. Archived from the original on February 19, 2011. Retrieved February 19, 2011.
  57. ^ Staff (February 15, 2011). "IBM's computer wins 'Jeopardy!' but... Toronto?". CTV News. Archived from the original on November 27, 2012. Retrieved February 15, 2011.
  58. ^ a b Robertson, Jordan; Borenstein, Seth (February 16, 2011). "For Watson, Jeopardy! victory was elementary". The Globe and Mail. The Associated Press. Archived from the original on February 20, 2011. Retrieved February 17, 2011.
  59. ^ Hamm, Steve (February 15, 2011). "Watson on Jeopardy! Day Two: The Confusion over and Airport Clue". A Smart Planet Blog. Archived from the original on October 24, 2011. Retrieved February 21, 2011.
  60. ^ Johnston, Casey (February 15, 2011). "Creators: Watson has no speed advantage as it crushes humans in Jeopardy". Ars Technica. Archived from the original on February 18, 2011. Retrieved February 21, 2011.
  61. ^ "IBM Watson: Final Jeopardy! And the Future of Watson". YouTube. 16 February 2011.
  62. ^ Oberman, Mira (February 17, 2011). "Computer creams human Jeopardy! champions". Vancouver Sun. Agence France-Presse. Archived from the original on February 20, 2011. Retrieved February 17, 2011.
  63. ^ Johnston, Casey (February 17, 2011). "Bug lets humans grab Daily Double as Watson triumphs on Jeopardy". Ars Technica. Archived from the original on February 21, 2011. Retrieved February 21, 2011.
  64. ^ a b Upbin, Bruce (February 17, 2011). "IBM's Supercomputer Watson Wins It All With $367 Bet". Forbes. Archived from the original on February 21, 2011. Retrieved February 21, 2011.
  65. ^ Oldenburg, Ann (February 17, 2011). "Ken Jennings: 'My puny brain' did just fine on 'Jeopardy!'". USA Today. Archived from the original on February 20, 2011. Retrieved February 21, 2011.
  66. ^ "Show 6088 – The IBM Challenge, Day 2". Jeopardy!. February 16, 2011. Syndicated.
  67. ^ "World Community Grid to benefit from Jeopardy! competition". World Community Grid. February 4, 2011. Archived from the original on January 14, 2012. Retrieved February 19, 2011.
  68. ^ "Jeopardy! And IBM Announce Charities To Benefit From Watson Competition". IBM Corporation. January 13, 2011. Archived from the original on November 10, 2021. Retrieved February 19, 2011.
  69. ^ "IBM's Watson supercomputer crowned Jeopardy king". BBC News. February 17, 2011. Archived from the original on February 18, 2011. Retrieved February 17, 2011.
  70. ^ Markoff, John (February 16, 2011). "Computer Wins on 'Jeopardy!': Trivial, It's Not". The New York Times. Yorktown Heights, New York. Archived from the original on October 22, 2014. Retrieved February 17, 2011.
  71. ^ Searle, John (February 23, 2011). "Watson Doesn't Know It Won on 'Jeopardy!'". The Wall Street Journal. Archived from the original on November 10, 2021. Retrieved July 26, 2011.
  72. ^ Lohr, Steve (December 5, 2011). "Creating AI based on the real thing". The New York Times. Archived from the original on November 10, 2021. Retrieved February 26, 2017.
  73. ^ a b "NJ congressman tops 'Jeopardy' computer Watson". Associated Press. March 2, 2011. Archived from the original on March 7, 2011. Retrieved March 2, 2011.
  74. ^ a b c Schwerin, Mac (May 5, 2023). "America Forgot About IBM Watson. Is ChatGPT Next?". The Atlantic. Archived from the original on May 5, 2023.
  75. ^ Weber, Robert C. (February 14, 2011). "Why 'Watson' matters to lawyers". The National Law Review. Archived from the original on September 8, 2019.
  76. ^ a b c Lohr, Steve (2025-08-05). "IBM is selling off Watson Health to a private equity firm". The New York Times. ISSN 0362-4331. Archived from the original on 2025-08-05. Retrieved 2025-08-05.
  77. ^ a b c d Lohr, Steve (2025-08-05). "What Ever Happened to IBM's Watson?". The New York Times. ISSN 0362-4331. Archived from the original on 2025-08-05. Retrieved 2025-08-05.
  78. ^ Lohr, Steve (2025-08-05). "IBM Is Counting on Its Bet on Watson, and Paying Big Money for It". The New York Times. ISSN 0362-4331. Retrieved 2025-08-05.
  79. ^ Ungerleider, Neal (April 23, 2014). "The North Face Testing Watson-Powered Virtual Personal Shoppers". Fast Company. Archived from the original on January 17, 2019.
  80. ^ Hesseldahl, Arik (February 12, 2014). "First Investment by IBM's Watson Fund Is for Welltok". Vox. Archived from the original on September 28, 2021.
  81. ^ Yang, Ina (July 13, 2015). "Caviar + Mango: Chef Watson Wants You to Cook Outside the Comfort Zone". NPR. Archived from the original on July 14, 2015.
  82. ^ O'Brien, Terrence (2025-08-05). "Watson's South American spin on a Canadian classic". Engadget. Archived from the original on 2025-08-05. Retrieved 2025-08-05.
  83. ^ Johnson, Scott K. (July 6, 2016). "IBM's Watson Fed Images to Estimate Water Use Efficiency in California". Ars Technica. Archived from the original on July 6, 2016.
  84. ^ a b Hardawar, Devindra (May 5, 2015). "IBM's big bet on Watson is paying off with more apps and DNA analysis". Engadget. Archived from the original on September 29, 2020.
  85. ^ Liu, Bin; Liao, Yuanyuan (2025-08-05). "Integrating IBM Watson BEAT generative AI software into flute music learning: the impact of advanced AI tools on students' learning strategies". Education and Information Technologies. doi:10.1007/s10639-025-13394-y. ISSN 1573-7608.
  86. ^ Jancer, Matt (26 August 2016). "IBM's Watson Takes On Yet Another Job, as a Weather Forecaster". Smithsonian. Archived from the original on 1 September 2016. Retrieved 29 August 2016.
  87. ^ Booton, Jennifer (15 June 2016). "IBM finally reveals why it bought The Weather Company". Market Watch. Archived from the original on 22 August 2016. Retrieved 29 August 2016.
  88. ^ Plenty, Rebecca (October 25, 2016). "Pearson Taps IBM's Watson as a Virtual Tutor for College Students". Bloomberg. Archived from the original on September 27, 2017. Retrieved 26 September 2017.
  89. ^ Moscaritolo, Angela (2 February 2017). "H&R Block Enlists IBM Watson to Find Tax Deductions". PC Magazine. Archived from the original on 15 February 2017. Retrieved 14 February 2017.
  90. ^ Schwerin, Mac (May 5, 2023). "America Forgot About IBM Watson. Is ChatGPT Next?". The Atlantic. Archived from the original on May 5, 2023.
  91. ^ a b c d Strickland, Eliza (April 2, 2019). "How IBM Watson Overpromised and Underdelivered on AI Health Care". IEEE Spectrum. Archived from the original on July 30, 2021.
  92. ^ Yu, Jea (April 10, 2023). "Back from the Dead, IBM's Watson AI is Alive and Re-Emerging". MarketBeat / Nasdaq. Archived from the original on April 26, 2023.
  93. ^ "IBM Watson is AI for Business". IBM. 9 July 2024.
  94. ^ a b c "Putting Watson to Work: Watson in Healthcare". IBM. Archived from the original on November 11, 2013. Retrieved November 11, 2013.
  95. ^ "IBM Watson Helps Fight Cancer with Evidence-Based Diagnosis and Treatment Suggestions" (PDF). IBM. Archived from the original (PDF) on April 26, 2013. Retrieved November 12, 2013.
  96. ^ Saxena, Manoj (February 13, 2013). "IBM Watson Progress and 2013 Roadmap (Slide 7)". IBM. Archived from the original on November 13, 2013. Retrieved November 12, 2013.
  97. ^ "MD Anderson Taps IBM Watson to Power "Moon Shots" Mission Aimed at Ending Cancer, Starting with Leukemia" (Press release). IBM. Archived from the original on 2025-08-05. Retrieved 2025-08-05.
  98. ^ a b Miliard, Mike (October 30, 2012). "Watson Heads to Medical School: Cleveland Clinic, IBM Send Supercomputer to College". Healthcare IT News. Archived from the original on November 11, 2013. Retrieved November 11, 2013.
  99. ^ Mathews, Anna Wilde (September 12, 2011). "Wellpoint's New Hire: What is Watson?". The Wall Street Journal. Archived from the original on February 22, 2017. Retrieved March 12, 2017.
  100. ^ "IBM's Jeopardy! Stunt Computer Is Curing Cancer Now". New York Magazine. November 23, 2016.
  101. ^ "MD Anderson Benches IBM Watson In Setback For Artificial Intelligence In Medicine". Forbes. Archived from the original on 2025-08-05. Retrieved 2025-08-05.
  102. ^ ANI (2025-08-05). "Manipal Hospitals to adopt IBM's 'Watson for Oncology' supercomputer for cancer treatment". Business Standard India. Archived from the original on 2025-08-05. Retrieved 2025-08-05.
  103. ^ Goel, Vindu (2025-08-05). "IBM Now Has More Employees in India Than in the U.S." The New York Times. ISSN 0362-4331. Retrieved 2025-08-05.
  104. ^ a b c "IBM Watson Group Unveils Cloud-Delivered Watson Services to Transform Industrial R&D, Visualize Big Data Insights and Fuel Analytics Exploration" (Press release). IBM. January 9, 2014. Archived from the original on October 12, 2020. Retrieved February 14, 2020.
  105. ^ Ante, Spencer E. (January 9, 2014). "IBM Set to Expand Watson's Reach". The Wall Street Journal. Archived from the original on May 9, 2015. Retrieved January 9, 2014.
  106. ^ "Inside the Lab". September 2017. Archived from the original on October 23, 2020. Retrieved October 6, 2020.
  107. ^ Bridgewater, Adrian (March 20, 2018). "IBM CEO Rometty Proposes 'Watson's Law': AI In Everything". Forbes. Archived 2025-08-05 at the Wayback Machine.

Bibliography

  • Baker, Stephen (2011). Final Jeopardy: Man vs. Machine and the Quest to Know Everything. Houghton Mifflin Harcourt.

Further reading

  • Baker, Stephen (2012). Final Jeopardy: The Story of Watson, the Computer That Will Transform Our World. Mariner Books.
  • Jackson, Joab (2014). "IBM bets big on Watson-branded cognitive computing". PCWorld, January 9, 2014.
  • Greenemeier, Larry (2013). "Will IBM's Watson Usher in a New Era of Cognitive Computing?". Scientific American, November 13, 2013.
  • Kelly, J. E. and Hamm, S. (2013). Smart Machines: IBM's Watson and the Era of Cognitive Computing. Columbia Business School Publishing.