[[File:Watson's avatar.jpg|thumb|400px|Watson's Avatar, inspired by the IBM "smarter planet" logo[1]]]

Watson is an artificially intelligent computer system capable of answering questions posed in natural language,[2] developed in IBM's DeepQA project by a research team led by principal investigator David Ferrucci. Watson was named after IBM's first CEO and industrialist Thomas J. Watson.[3][4] The computer system was specifically developed to answer questions on the quiz show Jeopardy![5] In 2011, Watson competed on Jeopardy! against former winners Brad Rutter and Ken Jennings.[3][6] Watson received the first place prize of $1 million.[7]

Watson had access to 200 million pages of structured and unstructured content consuming four terabytes of disk storage,[8] including the full text of Wikipedia,[9] but was not connected to the Internet during the game.[10][11] For each clue, Watson's three most probable responses were displayed on the television screen. Watson consistently outperformed its human opponents on the game's signaling device, but had trouble responding to a few categories, notably those having short clues containing only a few words.

In February 2013, IBM announced that the Watson software system's first commercial application would be for utilization management decisions in lung cancer treatment at Memorial Sloan–Kettering Cancer Center in conjunction with health insurance company WellPoint.[12] IBM Watson's business chief Manoj Saxena says that 90% of nurses in the field who use Watson now follow its guidance.[13]
==Description==
[[File:800px-IBM Watson.PNG|thumb|500px|Watson]]
[[File:DeepQA.svg.png|500px|thumb|The high-level architecture of IBM's DeepQA used in Watson]]
Watson is a question answering (QA) computing system that IBM built to apply advanced natural language processing, information retrieval, knowledge representation, automated reasoning, and machine learning technologies to the field of open domain question answering.[2]

The key difference between QA technology and document search is that document search takes a keyword query and returns a list of documents, ranked in order of relevance to the query (often based on popularity and page ranking), while QA technology takes a question expressed in natural language, seeks to understand it in much greater detail, and returns a precise answer to the question.[15]
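
The contrast in output shape can be made concrete with a deliberately simplified sketch (all names and the scoring below are invented for illustration, not IBM code): a search engine returns a ranked list of documents, while a QA system returns one answer with a confidence score.

<syntaxhighlight lang="java">
import java.util.*;

// Toy illustration only: contrasts the output shape of document search
// (a ranked document list) with question answering (one scored answer).
public class SearchVsQA {

    // Document search: rank documents by query-term overlap.
    static List<String> documentSearch(String query, List<String> docs) {
        List<String> ranked = new ArrayList<>(docs);
        ranked.sort(Comparator.comparingInt((String d) -> -termOverlap(query, d)));
        return ranked;                          // a ranked list, not an answer
    }

    // QA stand-in: pick the single best-supported snippet and score it.
    static Map.Entry<String, Double> answer(String question, List<String> docs) {
        String best = docs.get(0);
        double bestScore = -1;
        for (String d : docs) {
            double score = termOverlap(question, d) / (double) (d.length() + 1);
            if (score > bestScore) { bestScore = score; best = d; }
        }
        return Map.entry(best, bestScore);      // one answer plus a confidence
    }

    static int termOverlap(String a, String b) {
        Set<String> terms = new HashSet<>(Arrays.asList(a.toLowerCase().split("\\W+")));
        int n = 0;
        for (String t : b.toLowerCase().split("\\W+")) if (terms.contains(t)) n++;
        return n;
    }

    public static void main(String[] args) {
        List<String> docs = List.of(
            "Thomas J. Watson led IBM for four decades",
            "Watson is a question answering system built by IBM");
        System.out.println(documentSearch("who built watson", docs));
        System.out.println(answer("who built watson", docs));
    }
}
</syntaxhighlight>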

According to IBM, "more than 100 different techniques are used to analyze natural language, identify sources, find and generate hypotheses, find and score evidence, and merge and rank hypotheses."[16]

===Software===
Watson uses IBM's DeepQA software and the Apache UIMA (Unstructured Information Management Architecture) framework. The system was written in various languages, including Java, C++, and Prolog, and runs on the SUSE Linux Enterprise Server 11 operating system, using the Apache Hadoop framework to provide distributed computing.[8][17][18]
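
For readers unfamiliar with UIMA, the skeleton of a UIMA analysis component looks like the sketch below, which uses the framework's public JCas API. It is a minimal illustration, not code from Watson; the annotator logic here (marking capitalized tokens) is invented.

<syntaxhighlight lang="java">
import org.apache.uima.analysis_component.JCasAnnotator_ImplBase;
import org.apache.uima.analysis_engine.AnalysisEngineProcessException;
import org.apache.uima.jcas.JCas;
import org.apache.uima.jcas.tcas.Annotation;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Minimal UIMA annotator: marks capitalized tokens as generic Annotations.
// DeepQA chains many far richer annotators (parsers, entity recognizers,
// evidence scorers) over the same CAS data structure.
public class CapitalizedTermAnnotator extends JCasAnnotator_ImplBase {
    private static final Pattern TERM = Pattern.compile("\\b[A-Z][a-z]+\\b");

    @Override
    public void process(JCas jcas) throws AnalysisEngineProcessException {
        String text = jcas.getDocumentText();
        Matcher m = TERM.matcher(text);
        while (m.find()) {
            // Record the span in the CAS so downstream components can use it.
            new Annotation(jcas, m.start(), m.end()).addToIndexes();
        }
    }
}
</syntaxhighlight>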

===Hardware===
The system is workload-optimized, integrating massively parallel POWER7 processors, and is built on IBM's DeepQA technology,[19] which it uses to generate hypotheses, gather massive evidence, and analyze data.[2] Watson is composed of a cluster of ninety IBM Power 750 servers, each of which uses a 3.5 GHz POWER7 eight-core processor, with four threads per core. In total, the system has 2,880 POWER7 processor cores and 16 terabytes of RAM.[19]
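
The quoted totals imply four of those eight-core chips per server; that per-server chip count is inferred from the figures above (2,880 cores / 90 servers = 32 cores each), not a detail stated in the text.

<syntaxhighlight lang="java">
// Sanity check of the cluster figures quoted above. The four-chips-per-
// server count is an inference from 2,880 cores / 90 servers = 32 cores
// per server, not a figure stated in this article.
public class WatsonClusterMath {
    public static void main(String[] args) {
        int servers = 90;
        int coresPerChip = 8;
        int chipsPerServer = 4;                 // inferred, see note above
        int threadsPerCore = 4;
        int cores = servers * chipsPerServer * coresPerChip;
        int threads = cores * threadsPerCore;
        System.out.println(cores + " cores, " + threads + " hardware threads");
        // prints: 2880 cores, 11520 hardware threads
    }
}
</syntaxhighlight>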

According to John Rennie, Watson can process 500 gigabytes, the equivalent of a million books, per second.[20] IBM's master inventor and senior consultant Tony Pearson estimated Watson's hardware cost at about $3 million.[21] Its performance stands at 80 teraflops, which is not enough to place it on the Top 500 supercomputers list.[22] According to Rennie, the content was stored in Watson's RAM for the game because data stored on hard drives would have been too slow to access.[20]

===Data===
The sources of information for Watson include encyclopedias, dictionaries, thesauri, newswire articles, and literary works. Watson also used databases, taxonomies, and ontologies. Specifically, DBpedia, WordNet, and YAGO were used.[23]

The IBM team provided Watson with millions of documents, including dictionaries, encyclopedias, and other reference material that it could use to build its knowledge.[11] Although Watson was not connected to the Internet during the game,[24] it contained 200 million pages of structured and unstructured content consuming four terabytes of disk storage,[8] including the full text of Wikipedia.[9]

==Operation==
When playing Jeopardy!, all players must wait until host Alex Trebek reads each clue in its entirety, after which a light is lit as a "ready" signal; the first to activate their buzzer button wins the chance to respond.[11][26] Watson received the clues as electronic texts at the same moment they were made visible to the human players.[11] It would then parse the clues into different keywords and sentence fragments in order to find statistically related phrases.[11] Watson's main innovation was not in the creation of a new algorithm for this operation but rather its ability to quickly execute thousands of proven language analysis algorithms simultaneously to find the correct answer.[11][27] The more algorithms that find the same answer independently, the more likely Watson is to be correct.[11] Once Watson has a small number of potential solutions, it is able to check against its database to ascertain whether the solution makes sense.[11]

In a sequence of 20 mock games, human participants were able to use the six to seven seconds, on average, that Watson needed to hear the clue and decide whether to signal for responding.[11] During that time, Watson also has to evaluate the response and determine whether it is sufficiently confident in the result to signal.[11] Part of the system used to win the Jeopardy! contest was the electronic circuitry that receives the "ready" signal and then examines whether Watson's confidence level is great enough to activate the buzzer. Given the speed of this circuitry compared to the speed of human reaction times, Watson's reaction time was faster than the human contestants' except when the humans anticipated (instead of reacted to) the ready signal.[28] After signaling, Watson speaks with an electronic voice and gives the responses in Jeopardy!'s question format.[11] Watson's voice was synthesized from recordings that actor Jeff Woodman made for an IBM text-to-speech program in 2004.[29]
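
The "many independent algorithms" idea and the confidence gate on the buzzer can be sketched as follows. This is a toy illustration: the scorers, the merging rule, and the threshold are all invented, not DeepQA's actual machinery.

<syntaxhighlight lang="java">
import java.util.*;
import java.util.function.Function;

public class ConfidenceGate {
    // Each "scorer" stands in for one of the thousands of language-analysis
    // algorithms; a candidate gains confidence when scorers agree independently.
    static Map<String, Double> merge(List<Function<String, String>> scorers,
                                     String clue) {
        Map<String, Double> votes = new HashMap<>();
        for (Function<String, String> s : scorers) {
            votes.merge(s.apply(clue), 1.0, Double::sum);
        }
        double total = votes.values().stream().mapToDouble(Double::doubleValue).sum();
        votes.replaceAll((k, v) -> v / total);   // normalize votes to confidences
        return votes;
    }

    public static void main(String[] args) {
        List<Function<String, String>> scorers = List.of(
            clue -> "Toronto", clue -> "Chicago", clue -> "Chicago");
        Map<String, Double> confidences =
            merge(scorers, "U.S. CITIES: Its largest airport ...");
        double threshold = 0.5;                  // invented buzz threshold
        confidences.forEach((answer, c) -> System.out.printf(
            "%s (%.2f) -> %s%n", answer, c, c >= threshold ? "buzz" : "stay silent"));
    }
}
</syntaxhighlight>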

===Comparison with human players===
Watson's basic working principle is to parse keywords in a clue while searching for related terms as responses. This gives Watson some advantages and disadvantages compared with human Jeopardy! players.[30] Watson has deficiencies in understanding the contexts of the clues. As a result, human players usually generate responses faster than Watson, especially to short clues.[11] Watson's programming prevents it from using the popular tactic of buzzing before it is sure of its response.[11] Watson has consistently better reaction time on the buzzer once it has generated a response, and is immune to human players' psychological tactics.[11][31]

The Jeopardy! staff used different means to notify Watson and the human players when to buzz,[28] which was critical in many rounds.[31] The humans were notified by a light, which took them tenths of a second to perceive.[32][33] Watson was notified by an electronic signal and could activate the buzzer within about eight milliseconds.[34] The humans tried to compensate for the perception delay by anticipating the light,[35] but the variation in the anticipation time was generally too great to fall within Watson's response time.[31] Watson made no attempt to anticipate the notification signal.[33][35]
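
A small simulation makes the timing argument concrete. Watson's roughly 8 ms latency is taken from the text above; the spread of human anticipation error is an invented figure used only for illustration.

<syntaxhighlight lang="java">
import java.util.Random;

public class BuzzerRace {
    // Compare Watson's near-constant ~8 ms buzz latency with a human who
    // anticipates the light; the human's anticipation error (invented here
    // as a 120 ms standard deviation) usually swamps Watson's latency.
    public static void main(String[] args) {
        Random rng = new Random(42);
        int trials = 100_000, humanWins = 0;
        for (int i = 0; i < trials; i++) {
            double watson = 8.0;                        // ms after the signal
            double human = rng.nextGaussian() * 120.0;  // ms relative to signal
            // Buzzing early locks a player out briefly on the real show;
            // treat any negative (early) human buzz as a miss for simplicity.
            if (human >= 0 && human < watson) humanWins++;
        }
        System.out.printf("human wins %.1f%% of races%n",
                          100.0 * humanWins / trials);
    }
}
</syntaxhighlight>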

==History==

===Development===
Since Deep Blue's victory over Garry Kasparov in chess in 1997, IBM had been on the hunt for a new challenge. In 2004, IBM Research manager Charles Lickel, over dinner with coworkers, noticed that the restaurant they were in had fallen silent. He soon discovered the cause of this evening hiatus: Ken Jennings, who was then in the middle of his successful 74-game run on Jeopardy!. Nearly the entire restaurant had piled toward the televisions, mid-meal, to watch the phenomenon. Intrigued by the quiz show as a possible challenge for IBM, Lickel passed the idea on, and in 2005, IBM Research executive Paul Horn backed Lickel up, pushing for someone in his department to take up the challenge of playing Jeopardy! with an IBM system. Though he initially had trouble finding any research staff willing to take on what looked to be a much more complex challenge than the wordless game of chess, eventually David Ferrucci took him up on the offer.[36] In competitions managed by the United States government, Watson's predecessor, a system named Piquant, was usually able to respond correctly to only about 35% of clues and often required several minutes to respond.[37][38][39] To compete successfully on Jeopardy!, Watson would need to respond in no more than a few seconds, and at that time, the problems posed by the game show were deemed to be impossible to solve.[11]

In initial tests run during 2006 by David Ferrucci, the senior manager of IBM's Semantic Analysis and Integration department, Watson was given 500 clues from past Jeopardy! programs. While the best real-life competitors buzzed in half the time and responded correctly to as many as 95% of clues, Watson's first pass could get only about 15% correct. During 2007, the IBM team was given three to five years and a staff of 15 people to solve the problems.[11] By 2008, the developers had advanced Watson such that it could compete with Jeopardy! champions.[11] By February 2010, Watson could beat human Jeopardy! contestants on a regular basis.[40]

Although the system is primarily an IBM effort, Watson's development involved faculty and graduate students from Rensselaer Polytechnic Institute, Carnegie Mellon University, University of Massachusetts Amherst, the University of Southern California's Information Sciences Institute, the University of Texas at Austin, the Massachusetts Institute of Technology, and the University of Trento,[14] as well as students from New York Medical College.[41]

===Jeopardy!===
[[File:Watson Jeopardy.jpg|thumb|400px|Ken Jennings, Watson, and Brad Rutter in their ''Jeopardy!'' exhibition match]]

====Preparation====

In 2008, IBM representatives communicated with Jeopardy! executive producer Harry Friedman about the possibility of having Watson compete against Ken Jennings and Brad Rutter, two of the most successful contestants on the show, and the program's producers agreed.[11][42] Watson's differences with human players had generated conflicts between IBM and Jeopardy! staff during the planning of the competition.[30] IBM repeatedly expressed concerns that the show's writers would exploit Watson's cognitive deficiencies when writing the clues, thereby turning the game into a Turing test. To alleviate that concern, a third party randomly picked the clues from previously written shows that were never broadcast.[30] Jeopardy! staff also showed concerns over Watson's reaction time on the buzzer. Originally Watson signaled electronically, but show staff requested that it press a button physically, as the human contestants would.[43] Even with a robotic "finger" pressing the buzzer, Watson remained faster than its human competitors. Ken Jennings noted, "If you're trying to win on the show, the buzzer is all," and that Watson "can knock out a microsecond-precise buzz every single time with little or no variation. Human reflexes can't compete with computer circuits in this regard."[31][35][44] Stephen Baker, a journalist who recorded Watson's development in his book ''Final Jeopardy'', reported that the conflict between IBM and Jeopardy! became so serious in May 2010 that the competition was almost canceled.[30]

Watson learned from its mistakes. For example, during a practice round it was given the clue "This trusted friend was the first non-dairy powdered creamer," to which it replied, "What is milk?", mistaking the clue as asking for a dairy product. As part of the preparation, IBM constructed a mock set in a conference room at one of its technology sites to model the one used on Jeopardy!. Human players, including former Jeopardy! contestants, also participated in mock games against Watson, with Todd Alan Crain of The Onion playing host.[11] About 100 test matches were conducted, with Watson winning 65% of the games.[45]

To provide a physical presence in the televised games, Watson was represented by an "avatar" of a globe, inspired by the IBM "smarter planet" symbol. Jennings described the computer's avatar as a "glowing blue ball crisscrossed by 'threads' of thought—42 threads, to be precise,"[25] and stated that the number of thought threads in the avatar was an in-joke referencing the significance of the number 42 in Douglas Adams' ''The Hitchhiker's Guide to the Galaxy''.[25] Joshua Davis, the artist who designed the avatar for the project, explained to Stephen Baker that there are 36 triggerable states that Watson was able to use throughout the game to show its confidence in responding to a clue correctly; he had hoped to be able to find forty-two, to add another level to the Hitchhiker's Guide reference, but he was unable to pinpoint enough game states.[46]

A practice match was recorded on January 13, 2011, and the official matches were recorded on January 14, 2011. All participants maintained secrecy about the outcome until the match was broadcast in February.[47]

====Practice match====
In a practice match before the press on January 13, 2011, Watson won a 15-question round against Ken Jennings and Brad Rutter with a score of $4,400 to Jennings's $3,400 and Rutter's $1,200, though Jennings and Watson were tied before the final $1,000 question. None of the three players responded incorrectly to a clue.[48]

====First match====
The first round was broadcast February 14, 2011, and the second round, on February 15, 2011. The right to choose the first category had been determined by a draw won by Rutter.[49] Watson, represented by a computer monitor display and artificial voice, responded correctly to the second clue and then selected the fourth clue of the first category, a deliberate strategy to find the Daily Double as quickly as possible.[50] Watson's guess at the Daily Double location was correct. At the end of the first round, Watson was tied with Rutter at $5,000; Jennings had $2,000.[49]

Watson's performance was characterized by some quirks. In one instance, Watson repeated a reworded version of an incorrect response offered by Jennings (Jennings said "What are the '20s?" in reference to the 1920s. Then Watson said "What is 1920s?") Because Watson could not recognize other contestants' responses, it did not know that Jennings had already given the same response. In another instance, Watson was initially given credit for a response of "What is leg?" after Jennings incorrectly responded "What is: he only had one hand?" to a clue about George Eyser (The correct response was, "What is: he's missing a leg?"). Because Watson, unlike a human, could not have been responding to Jennings's mistake, it was decided that this response was incorrect. The broadcast version of the episode was edited to omit Trebek's original acceptance of Watson's response.[51] Watson also demonstrated complex wagering strategies on the Daily Doubles, with one bet at $6,435 and another at $1,246.[52] Gerald Tesauro, one of the IBM researchers who worked on Watson, explained that Watson's wagers were based on its confidence level for the category and a complex regression model called the Game State Evaluator.[53]
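
Tesauro's description suggests a wager that scales with confidence and game state. The following is an invented toy version showing only the shape of such a rule; the real Game State Evaluator was a learned regression model, and none of these coefficients are IBM's.

<syntaxhighlight lang="java">
public class DailyDoubleWager {
    // Toy stand-in for confidence-scaled wagering. The Daily Double cap is
    // simplified: a player may bet up to their score (or a small floor).
    static int wager(double categoryConfidence, int myScore, int leaderScore) {
        int maxBet = Math.max(myScore, 1000);          // simplified cap
        double aggression = categoryConfidence
                * (myScore >= leaderScore ? 0.4 : 0.8); // press harder when behind
        return (int) Math.round(maxBet * aggression);
    }

    public static void main(String[] args) {
        System.out.println(wager(0.85, 9000, 5000));   // comfortable lead: smaller bet
        System.out.println(wager(0.85, 5000, 9000));   // trailing: bigger bet
    }
}
</syntaxhighlight>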

Watson took a commanding lead in Double Jeopardy!, correctly responding to both Daily Doubles. Watson responded to the second Daily Double correctly with a 32% confidence score.[52]

Although it wagered only $947 on the clue, Watson was the only contestant to miss the Final Jeopardy! response in the category U.S. CITIES ("Its largest airport was named for a World War II hero; its second largest, for a World War II battle"). Rutter and Jennings gave the correct response of Chicago, but Watson's response was "What is Toronto?????"[52][54][55] Ferrucci offered reasons why Watson would appear to have guessed a Canadian city: categories only weakly suggest the type of response desired, the phrase "U.S. city" didn't appear in the question, there are cities named Toronto in the U.S., and Toronto in Ontario has an American League baseball team.[56] Dr. Chris Welty, who also worked on Watson, suggested that it may not have been able to correctly parse the second part of the clue, "its second largest, for a World War II battle" (which was not a standalone clause despite it following a semicolon, and required context to understand that it was referring to a second-largest airport).[57] Eric Nyberg, a professor at Carnegie Mellon University and a member of the development team, stated that the error occurred because Watson does not possess the comparative knowledge to discard that potential response as not viable.[55] Although not displayed to the audience as with non-Final Jeopardy! questions, Watson's second choice was Chicago. Both Toronto and Chicago were well below Watson's confidence threshold, at 14% and 11% respectively. (This lack of confidence was the reason for the multiple question marks in Watson's response.)

The game ended with Jennings at $4,800, Rutter at $10,400, and Watson at $35,734.[52]

====Second match====
During the introduction, Trebek (a Canadian native) joked that he had learned Toronto was a U.S. city, and Watson's error in the first match prompted an IBM engineer to wear a Toronto Blue Jays jacket to the recording of the second match.[58]

In the first round, Jennings was finally able to choose a Daily Double clue,[59] while Watson responded to one Daily Double clue incorrectly for the first time in the Double Jeopardy! round.[60] After the first round, Watson placed second for the first time in the competition, after Rutter and Jennings were briefly successful in increasing their dollar values before Watson could respond.[60][61] Nonetheless, the match ended in victory for Watson, which scored $77,147 to Jennings's $24,000 and Rutter's $21,600.[62]

====Final outcome====
The prizes for the competition were $1 million for first place (Watson), $300,000 for second place (Jennings), and $200,000 for third place (Rutter). As promised, IBM donated 100% of Watson's winnings to charity, with 50% of those winnings going to World Vision and 50% going to World Community Grid.[63] Similarly, Jennings and Rutter donated 50% of their winnings to their respective charities.[64]

In acknowledgment of IBM and Watson's achievements, Jennings made an additional remark in his Final Jeopardy! response: "I for one welcome our new computer overlords", echoing a similar memetic reference to the episode "Deep Space Homer" on The Simpsons, in which TV news presenter Kent Brockman speaks of welcoming "our new insect overlords".[65][66] Jennings later wrote an article for Slate, in which he stated "IBM has bragged to the media that Watson's question-answering skills are good for more than annoying Alex Trebek. The company sees a future in which fields like medical diagnosis, business analytics, and tech support are automated by question-answering software like Watson. Just as factory jobs were eliminated in the 20th century by new assembly-line robots, Brad and I were the first knowledge-industry workers put out of work by the new generation of 'thinking' machines. 'Quiz show contestant' may be the first job made redundant by Watson, but I'm sure it won't be the last."[25]

==Philosophy==
Philosopher John Searle argues that Watson—despite impressive capabilities—cannot actually think.[67] Drawing on his Chinese room thought experiment, Searle claims that Watson, like other computational machines, is capable only of manipulating symbols, but has no ability to understand the meaning of those symbols; however, Searle's experiment has its detractors.[68]

==Match against members of the United States Congress==
On February 28, 2011, Watson played an untelevised exhibition match of Jeopardy! against members of the United States House of Representatives. In the first round, Rush D. Holt, Jr. (D-NJ), a former Jeopardy! contestant who was challenging the computer along with Bill Cassidy (R-LA), led with Watson in second place. However, combining the scores across all matches, the final total was $40,300 for Watson and $30,000 for the congressional players combined.[69]

IBM's Christopher Padilla said of the match, "The technology behind Watson represents a major advancement in computing. In the data-intensive environment of government, this type of technology can help organizations make better decisions and improve how government helps its citizens."[69]

==Future applications==
[[File:800px-IBMWatson.jpg|thumb|500px|Watson demo at an IBM booth at a trade show]]
According to IBM, "The goal is to have computers start to interact in natural human terms across a range of applications and processes, understanding the questions that humans ask and providing answers that humans can understand and justify."[40] It has been suggested by Robert C. Weber, IBM's general counsel, that Watson may be used for legal research.[70] The company also intends to use Watson in other information-intensive fields, such as telecommunications, financial services, and government.[71]

Watson is based on commercially available IBM Power 750 servers that have been marketed since February 2010. IBM also intends to market the DeepQA software to large corporations, with a price in the millions of dollars, reflecting the $1 million needed to acquire a server that meets the minimum system requirement to operate Watson. IBM expects the price to decrease substantially within a decade as the technology improves.[11]

Commentator Rick Merritt said that "there's another really important reason why it is strategic for IBM to be seen very broadly by the American public as a company that can tackle tough computer problems. A big slice of [IBM's profit] comes from selling to the U.S. government some of the biggest, most expensive systems in the world."[72]

In 2013, it was reported that three companies were working with IBM to create apps embedded with Watson technology. Fluid is developing an app for the retailer The North Face, designed to provide advice to online shoppers. Welltok is developing an app designed to give people advice on ways to engage in activities to improve their health. MD Buyline is developing an app for the purpose of advising medical institutions on equipment procurement decisions.[73][74]

In November 2013, IBM announced it would make Watson's API available to software application providers, enabling them to build apps and services that are embedded with Watson's capabilities. To build out its base of partners who create applications on the Watson platform, IBM consults with a network of venture capital firms, which advise IBM on which of their portfolio companies may be a logical fit for what IBM calls the Watson Ecosystem. Thus far, roughly 800 organizations and individuals have signed up with IBM, with interest in creating applications that could use the Watson platform.[75]

On January 30, 2013, it was announced that Rensselaer Polytechnic Institute would receive a successor version of Watson, which would be housed at the Institute's technology park and be available to researchers and students.[76] By summer 2013, Rensselaer had become the first university to receive a Watson computer.[77]

On February 6, 2014, it was reported that IBM plans to invest $100 million in a 10-year initiative to use Watson and other IBM technologies to help countries in Africa address development problems, beginning with healthcare and education.[78]

On June 3, 2014, three new Watson Ecosystem partners were chosen from more than 400 business concepts submitted by teams spanning 18 industries from 43 countries. "These bright and enterprising organizations have discovered innovative ways to apply Watson that can deliver demonstrable business benefits," said Steve Gold, vice president, IBM Watson Group. The winners are Majestyk Apps with their adaptive educational platform, FANG (Friendly Anthropomorphic Networked Genome);[79] Red Ant with their retail sales trainer;[80] and GenieMD[81] with their medical recommendation service.[82]

On July 9, 2014, Genesys Telecommunications Laboratories announced plans to integrate Watson to improve their customer experience platform, citing the staggering volume of customer data to be analyzed.[83]

Watson has been integrated with databases, including that of Bon Appétit magazine, to power a recipe-generating platform.[84]

===Healthcare===
In healthcare, Watson's natural language, hypothesis generation, and evidence-based learning capabilities allow it to function as a clinical decision support system for use by medical professionals.[85] To aid physicians in the treatment of their patients, once a doctor has posed a query to the system describing symptoms and other related factors, Watson first parses the input to identify the most important pieces of information; then mines patient data to find facts relevant to the patient's medical and hereditary history; then examines available data sources to form and test hypotheses;[85] and finally provides a list of individualized, confidence-scored recommendations.[86] The sources of data that Watson uses for analysis can include treatment guidelines, electronic medical record data, notes from doctors and nurses, research materials, clinical studies, journal articles, and patient information.[85] Despite being developed and marketed as a "diagnosis and treatment advisor," Watson has never actually been involved in the medical diagnosis process, only in assisting with identifying treatment options for patients who have already been diagnosed.[87]
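
The four-stage flow just described can be sketched as a skeleton. Everything below (the class and method names, the stages as methods, the stub logic) is invented for illustration and is not IBM's implementation.

<syntaxhighlight lang="java">
import java.util.List;

public class ClinicalDecisionSupport {
    // Hypothetical shape of the pipeline described above: parse the query,
    // mine patient data, form and test hypotheses, return scored options.
    record Recommendation(String treatment, double confidence) {}

    static List<Recommendation> advise(String physicianQuery,
                                       List<String> patientRecords,
                                       List<String> evidenceSources) {
        List<String> keyFacts = parse(physicianQuery);           // stage 1
        List<String> history = mine(patientRecords, keyFacts);   // stage 2
        List<Recommendation> hypotheses =
            hypothesize(evidenceSources, keyFacts, history);     // stage 3
        hypotheses.sort((a, b) ->
            Double.compare(b.confidence(), a.confidence()));     // stage 4
        return hypotheses;
    }

    // Stubs standing in for the NLP and retrieval machinery described above.
    static List<String> parse(String q) { return List.of(q.split("\\s+")); }
    static List<String> mine(List<String> recs, List<String> facts) { return recs; }
    static List<Recommendation> hypothesize(List<String> ev, List<String> f,
                                            List<String> h) {
        return new java.util.ArrayList<>(List.of(
            new Recommendation("option A", 0.72),
            new Recommendation("option B", 0.31)));
    }
}
</syntaxhighlight>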

In February 2011, it was announced that IBM would be partnering with Nuance Communications for a research project to develop a commercial product during the next 18 to 24 months, designed to exploit Watson's clinical decision support capabilities. Physicians at Columbia University would help to identify critical issues in the practice of medicine where the system's technology may be able to contribute, and physicians at the University of Maryland would work to identify the best way that a technology like Watson could interact with medical practitioners to provide the maximum assistance.[88]

In September 2011, IBM and WellPoint, a major American healthcare solutions provider, announced a partnership to utilize Watson's data crunching capability to help suggest treatment options to doctors.[89] Then, in February 2013, IBM and WellPoint gave Watson its first commercial application, for utilization management decisions in lung cancer treatment at Memorial Sloan–Kettering Cancer Center.[12]

IBM announced a partnership with Cleveland Clinic in October 2012. The company has sent Watson to the Cleveland Clinic Lerner College of Medicine of Case Western Reserve University, where it will increase its health expertise and assist medical professionals in treating patients. The medical facility will utilize Watson's ability to store and process large quantities of information to help speed up and increase the accuracy of the treatment process. "Cleveland Clinic's collaboration with IBM is exciting because it offers us the opportunity to teach Watson to 'think' in ways that have the potential to make it a powerful tool in medicine," said C. Martin Harris, MD, chief information officer of Cleveland Clinic.[90]

On February 8, 2013, IBM announced that oncologists at the Maine Center for Cancer Medicine and Westmed Medical Group in New York have started to test the Watson supercomputer system in an effort to recommend treatment for lung cancer.[91]

==IBM Watson Group==
On January 9, 2014, IBM announced that it was creating a business unit around Watson, led by senior vice president Michael Rhodin.[92] IBM Watson Group will have headquarters in New York's Silicon Alley and will employ 2,000 people. IBM has invested $1 billion to get the division going. Watson Group will develop three new cloud-delivered services: Watson Discovery Advisor, Watson Analytics, and Watson Explorer. Watson Discovery Advisor will focus on research and development projects in the pharmaceutical, publishing, and biotechnology industries; Watson Analytics will focus on Big Data visualization and insights on the basis of natural language questions posed by business users; and Watson Explorer will focus on helping enterprise users uncover and share data-driven insights more easily.[92] The company is also launching a $100 million venture fund to spur application development for "cognitive" applications. According to IBM, the cloud-delivered enterprise-ready Watson has seen its speed increase 24-fold (a 2,300 percent improvement in performance), while its physical size shrank by 90 percent, from the size of a master bedroom to three stacked pizza boxes.[92] IBM CEO Virginia Rometty said she wants Watson to generate $10 billion in annual revenue within ten years.[93]

==Background of Cognitive Computing==
Cognitive computing (CC) makes a new class of problems computable. It addresses complex situations that are characterized by ambiguity and uncertainty; in other words, it handles human kinds of problems. In these dynamic, information-rich, and shifting situations, data tends to change frequently, and it is often conflicting. The goals of users evolve as they learn more and redefine their objectives. To respond to the fluid nature of users' understanding of their problems, the cognitive computing system offers a synthesis not just of information sources but of influences, contexts, and insights. To do this, systems often need to weigh conflicting evidence and suggest an answer that is "best" rather than "right".

Cognitive computing systems make context computable. They identify and extract context features such as hour, location, task, history, or profile to present an information set that is appropriate for an individual or for a dependent application engaged in a specific process at a specific time and place. They provide machine-aided serendipity by wading through massive collections of diverse information to find patterns and then apply those patterns to respond to the needs of the moment.
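
As a concrete, and entirely invented, illustration of extracting context features and using them to rank an information set:

<syntaxhighlight lang="java">
import java.time.LocalTime;
import java.util.List;
import java.util.stream.Collectors;

public class ContextFilter {
    // Hypothetical context features, as named in the paragraph above.
    record Context(LocalTime hour, String location, String task,
                   List<String> history) {}
    record Item(String title, String topic, String bestLocation) {}

    // Rank items by how well they match the user's current context.
    static List<Item> present(List<Item> items, Context ctx) {
        return items.stream()
            .sorted((a, b) -> Integer.compare(score(b, ctx), score(a, ctx)))
            .collect(Collectors.toList());
    }

    static int score(Item item, Context ctx) {
        int s = 0;
        if (item.bestLocation().equals(ctx.location())) s += 2; // where you are
        if (ctx.history().contains(item.topic()))        s += 1; // what you did
        if (item.topic().equals(ctx.task()))             s += 3; // what you're doing
        return s;
    }

    public static void main(String[] args) {
        Context ctx = new Context(LocalTime.of(9, 0), "hospital", "oncology",
                                  List.of("oncology", "radiology"));
        List<Item> items = List.of(
            new Item("Ward schedule", "logistics", "hospital"),
            new Item("New chemo guideline", "oncology", "hospital"));
        present(items, ctx).forEach(i -> System.out.println(i.title()));
    }
}
</syntaxhighlight>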

Cognitive computing systems redefine the nature of the relationship between people and their increasingly pervasive digital environment. They may play the role of assistant or coach for the user, and they may act virtually autonomously in many problem-solving situations. The boundaries of the processes and domains these systems will affect are still elastic and emergent. Their output may be prescriptive, suggestive, instructive, or simply entertaining. (See ''Smart Machines'',[1] or articles by Ferrucci or Denning listed below, for more information on these concepts.)

In order to achieve this new level of computing, cognitive systems must be:
*Adaptive. They must learn as information changes, and as goals and requirements evolve. They must resolve ambiguity and tolerate unpredictability. They must be engineered to feed on dynamic data in real time, or near real time.[2]
*Interactive. They must interact easily with users so that those users can define their needs comfortably. They may also interact with other processors, devices, and Cloud services, as well as with people.
*Iterative and stateful. They must aid in defining a problem by asking questions or finding additional source input if a problem statement is ambiguous or incomplete. They must "remember" previous interactions in a process and return information that is suitable for the specific application at that point in time.
*Contextual. They must understand, identify, and extract contextual elements such as meaning, syntax, time, location, appropriate domain, regulations, user's profile, process, task and goal. They may draw on multiple sources of information, including both structured and unstructured digital information, as well as sensory inputs (visual, gestural, auditory, or sensor-provided).

Cognitive systems differ from current computing applications in that they move beyond tabulating and calculating based on preconfigured rules and programs. Although they are capable of basic computing, they can also infer and even reason based on broad objectives.
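
To illustrate the "iterative and stateful" property in particular, here is a minimal invented sketch of an assistant that asks for clarification when a request is ambiguous and remembers prior turns:

<syntaxhighlight lang="java">
import java.util.ArrayDeque;
import java.util.Deque;

public class StatefulAssistant {
    // Invented sketch: ask for clarification when a request is ambiguous,
    // and remember the exchange so later answers can use earlier context.
    private final Deque<String> memory = new ArrayDeque<>();

    String handle(String request) {
        memory.push(request);                       // remember the interaction
        if (isAmbiguous(request)) {
            return "Can you narrow that down? (seen so far: " + memory + ")";
        }
        return "Answering '" + request + "' using " + memory.size()
                + " remembered turns of context.";
    }

    private boolean isAmbiguous(String request) {
        return request.split("\\s+").length < 3;    // toy ambiguity test
    }

    public static void main(String[] args) {
        StatefulAssistant a = new StatefulAssistant();
        System.out.println(a.handle("treatment options"));          // ambiguous
        System.out.println(a.handle("treatment options for stage II melanoma"));
    }
}
</syntaxhighlight>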

Beyond these principles, cognitive computing systems can be extended to include additional tools and technologies. They may integrate or leverage existing information systems and add domain or task-specific interfaces and tools as required.

Many of today's applications (e.g., search, ecommerce, eDiscovery) exhibit some of these features, but it is rare to find all of them fully integrated and interactive.

Cognitive systems will coexist with legacy systems into the indefinite future. Many cognitive systems will build upon today's IT resources. But the ambition and reach of cognitive computing is fundamentally different. Leaving the model of computer-as-appliance behind, it seeks to bring computing into a closer, fundamental partnership in human endeavors.

===Additional uses of the term===
The term cognitive computing has also been used to refer to new hardware and/or software that mimics the functioning of the human brain. In this sense, cognitive computing (CC) is a new type of computing with the goal of developing more accurate models of how the human brain/mind senses, reasons, and responds to stimulus. CC applications link data analysis and adaptive page displays (AUI) to adjust content for a particular type of audience. As such, CC hardware and applications strive to be more affective and more influential by design.

Like a human, a cognitive computing application learns by experience and/or instruction. The CC application learns and remembers how to adapt its content displays, by situation, to influence behavior. This means a CC application must have intent, memory, foreknowledge and cognitive reasoning for a domain of variable situations. These 'cognitive' functions are in addition to the more fixed page displays now found in most paging applications.

==External links==
* [http://www.predictiveanalyticstoday.com/ibm-watson-platform-cloud-applications/ IBM Watson platform in the Cloud and Applications]
* [http://www.ibmwatson.com/ Watson homepage]
* [http://www.research.ibm.com/deepqa/deepqa.shtml DeepQA homepage]
* [http://web.archive.org/web/20130616010509/http://www.jeopardy.com/news/ibm.php About Watson on Jeopardy.com]
* [http://video.pbs.org/video/1786674622 Smartest Machine on Earth (PBS NOVA documentary about the making of Watson)]
* [http://wayback.archive.org/web/20130603033306/http://www.ibm.com/systems/power/news/announcement/20100817_ad.html POWER7]
* [http://www.ibm.com/systems/power Power Systems]
* [http://www.nytimes.com/interactive/2010/06/16/magazine/watson-trivia-game.html The Watson Trivia Challenge]. ''The New York Times''. June 16, 2010.
* [http://twitter.com/#search?q=%23IBMWatson #IBMWatson] Twitter hashtag
* [http://wayback.archive.org/web/20120411155021/http://ieeexplore.ieee.org/xpl/tocresult.jsp?isnumber=6177717 This is Watson] - IBM Journal of Research and Development (published by the [[IEEE]])

===J! Archive===
* [http://www.j-archive.com/showgame.php?game_id=3575 ''Jeopardy!'' Show #6086 - Game 1, Part 1]
* [http://www.j-archive.com/showgame.php?game_id=3576 ''Jeopardy!'' Show #6087 - Game 1, Part 2]
* [http://www.j-archive.com/showgame.php?game_id=3577 ''Jeopardy!'' Show #6088 - Game 2]

===Videos===
* [http://video.pbs.org/video/1786674622 PBS NOVA documentary on the making of Watson]
* [http://www.youtube.com/playlist?list=PL4F1C783776E708A8 IBM Watson playlist], [https://www.youtube.com/playlist?list=PL3A7FC0CD1F1BB3D1 IBMLabs Watson playlist]

==Further Reading==
* Terdiman, Daniel (2008). ''[http://www.cnet.com/news/ibm-gets-darpa-cognitive-computing-contract/ IBM gets DARPA cognitive computing contract]'' CNET: November 19, 2008
* Terdiman, Daniel (2014). ''[http://www.cnet.com/news/ibms-truenorth-processor-mimics-the-human-brain/ IBM's TrueNorth processor mimics the human brain]'' CNET
* Reynolds, H. and Feldman, S. (2014). ''[http://www.kmworld.com/Articles/News/News-Analysis/Cognitive-computing-Beyond-the-hype-97685.aspx Cognitive Computing: Beyond the Hype]'' KMWorld
* ''[http://www.research.ibm.com/cognitive-computing/index.shtml#fbid=BrUXYNtK6-r What is cognitive computing?]'' IBM Research
* Mounier, G. (2014). ''[http://www.kmworld.com/Articles/News/News-Analysis/Cognitive-computing-Why-now-and-why-it-matters-98770.aspx Cognitive Computing: Why Now and Why it Matters to the Enterprise]'' KMWorld
* Jeopardy! IBM Watson day 3 (2011). Retrieved July 26, 2012, from http://www.youtube.com/watch?v=o6oS64Bpx0g
* UPI (2009). ''[http://www.upi.com/Science_News/2009/11/18/IBM-reports-cognitive-computing-advances/UPI-73871258566950/ IBM reports cognitive computing advances]'' United Press International, Inc.: November 18, 2009
* Knight, Shawn (2011). ''[http://www.techspot.com/news/45138-ibm-unveils-cognitive-computing-chips-that-mimic-human-brain.html IBM unveils cognitive computing chips that mimic human brain]'' TechSpot: August 18, 2011
* Hamill, Jasper (2013). ''[http://www.theregister.co.uk/2013/08/08/ibm_unveils_computer_architecture_based_upon_your_brain/ Cognitive computing: IBM unveils software for its brain-like SyNAPSE chips]'' The Register: August 8, 2013
* Kirk, Jeremy (2013). ''[http://www.pcworld.com/article/2051501/universities-join-ibm-in-cognitive-computing-research-project.html Universities, IBM join forces to build a brain-like computer]'' PCWorld: October 1, 2013
* Jackson, Joab (2014). ''[http://www.pcworld.com/article/2086520/ibm-bets-big-on-watsonbranded-cognitive-computing.html IBM bets big on Watson-branded cognitive computing]'' PCWorld: January 9, 2014
* Neumann, Alexander (2014). ''[http://www.heise.de/developer/meldung/Cognitive-Computing-IBM-lanciert-Programmierwettbewerb-zur-Watson-Technik-2126361.html IBM lanciert Programmierwettbewerb zur Watson-Technik]'' (in German: "IBM launches a programming competition for Watson technology") heise Developer: February 27, 2014
* Deanfelis, Stephen (2014). ''[http://innovationinsights.wired.com/insights/2014/04/will-2014-year-fall-love-cognitive-computing/ Will 2014 Be the Year You Fall in Love With Cognitive Computing?]'' Wired: April 21, 2014
* Baker, Stephen (2012). ''Final Jeopardy: The Story of Watson, the Computer That Will Transform Our World''. Mariner Books

==Bibliography==
* APA (2006). VandenBos, Gary R., ed. ''APA Dictionary of Psychology''. Washington, DC: American Psychological Association, p. 26.
* Balleine, B. W. (2005). Dietary Influences on Obesity: Environment, Behavior and Biology. ''[[Physiology & Behavior]]'', 86(5), pp. 717–730.
* Batson, C. D., Shaw, L. L., Oleson, K. C. (1992). Differentiating Affect, Mood and Emotion: Toward Functionally Based Conceptual Distinctions. ''Emotion''. Newbury Park, CA: Sage.
* Blechman, E. A. (1990). ''Moods, Affect, and Emotions''. Hillsdale, NJ: Lawrence Erlbaum Associates.
* Brewin, C. R. (1989). Cognitive Change Processes in Psychotherapy. ''Psychological Review'', 96(45), pp. 379–394.
* Damasio, A. (1994). ''Descartes' Error: Emotion, Reason, and the Human Brain''. Putnam Publishing.
* Denning, P. J. (2014). Surfing Toward the Future. ''Communications of the ACM'', 57(3), pp. 26–29. doi:10.1145/2566967.
* Feldman, Susan E. (2012). ''The Answer Machine''. Morgan & Claypool.
* Greenemeier, Larry (2013). Will IBM's Watson Usher in a New Era of Cognitive Computing? ''Scientific American'', November 13, 2013.
* Griffiths, P. E. (1997). ''What Emotions Really Are: The Problem of Psychological Categories''. Chicago: The University of Chicago Press.
* Lazarus, R. S. (1982). Thoughts on the Relations between Emotions and Cognition. ''American Psychologist'', 37(10), pp. 1019–1024.
* Lerner, J. S., and D. Keltner (2000). Beyond Valence: Toward a Model of Emotion-Specific Influences on Judgement and Choice. ''Cognition and Emotion'', 14(4), pp. 473–493.
* Kelly, J. E. and Hamm, S. (2013). ''Smart Machines: IBM's Watson and the Era of Cognitive Computing''. Columbia Business School Publishing.
* Nathanson, Donald L. (1992). ''Shame and Pride: Affect, Sex, and the Birth of the Self''. London: W. W. Norton.
* Quirin, M., Kazén, M., & Kuhl, J. (2009). When Nonsense Sounds Happy or Helpless: The Implicit Positive and Negative Affect Test (IPANAT). ''Journal of Personality and Social Psychology'', 97(3), pp. 500–516.
* Proudfoot, J., Guest, D., Carson, J., Dunn, G., & Gray, J. (1997). Effect of Cognitive-Behavioural Training on Job-Finding Among Long-Term Unemployed People. ''The Lancet'', 350(9071), pp. 99–100.
* Schucman, H., Thetford, C. (1975). ''A Course in Miracles''. New York: Viking Penguin.
* Shepard, R. N. (1984). Ecological Constraints on Internal Representation. ''Psychological Review'', 91, pp. 417–447.
* Shepard, R. N. (1994). Perceptual-Cognitive Universals as Reflections of the World. ''Psychonomic Bulletin & Review'', 1, pp. 2–28.
* Tolle, E. (1999). ''The Power of Now''. Vancouver: Namaste Publishing.
* Tolle, E. (2003). ''Stillness Speaks''. Vancouver: Namaste Publishing.
* Weiskrantz, L. (1997). ''Consciousness Lost and Found''. Oxford: Oxford University Press.
* Zajonc, R. B. (1980). Feelings and Thinking: Preferences Need No Inferences. ''American Psychologist'', 35(2), pp. 151–175.

==References==
1. IBM Watson: The Face of Watson on YouTube

2. "DeepQA Project: FAQ". IBM. Retrieved 2011-02-11.

3. Hale, Mike (2011-02-08). "Actors and Their Roles for $300, HAL? HAL!". The New York Times. Retrieved 2011-02-11.

4. "The DeepQA Project". Research.ibm.com. Retrieved 2011-02-18.

5. "IBM Research: Dave Ferrucci at Computer History Museum - How It All Began and What's Next". IBM Research. 2011-12-01. Retrieved 2012-02-11. "In 2007, when IBM executive Charles Lickel challenged Dave and his team to revolutionize Deep QA and put an IBM computer against Jeopardy!'s human champions, he was off to the races."

6. Loftus, Jack (2009-04-26). "IBM Prepping 'Watson' Computer to Compete on Jeopardy!". Gizmodo. Retrieved 2009-04-27.

7. IBM's "Watson" Computing System to Challenge All Time Greatest Jeopardy! Champions, Sony Pictures Television, 2010-12-14, archived from the original on 2013-06-06, retrieved 2013-11-11.

8. Jackson, Joab (2011-02-17), IBM Watson Vanquishes Human Jeopardy Foes, PC World, IDG News, retrieved 2011-02-17.

9. Zimmer, Ben (2011-02-17), Is It Time to Welcome Our New Computer Overlords?, The Atlantic, retrieved 2011-02-17.

10. Raz, Guy (2011-01-28), Can a Computer Become a Jeopardy! Champ?, National Public Radio, retrieved 2011-02-18.

11. Thompson, Clive (2010-06-16). "Smarter Than You Think: What Is I.B.M.'s Watson?". The New York Times Magazine. Retrieved 2011-02-18.

12. "IBM's Watson Gets Its First Piece Of Business In Healthcare". Forbes, February 8, 2013.

13. Upbin, Bruce (2013-02-08). "IBM's Watson Gets Its First Piece Of Business In Healthcare". Forbes. Retrieved March 10, 2013.

14. Ferrucci, D., et al. (2010), Building Watson: An Overview of the DeepQA Project, AI Magazine 31 (3), retrieved 2011-02-19.

15. "10 Things You Need to Know About the Technology Behind Watson". http://craigrhinehart.com/2011/01/17/10-things-you-need-to-know-about-the-technology-behind-watson/

16. "Watson, A System Designed for Answers: The Future of Workload Optimized Systems Design". IBM Systems and Technology. p. 3. Retrieved 2011-02-21.

17. Takahashi, Dean (2011-02-17), IBM researcher explains what Watson gets right and wrong, VentureBeat, retrieved 2011-02-18.

18. Novell (2011-02-02), Watson Supercomputer to Compete on 'Jeopardy!' -- Powered by SUSE Linux Enterprise Server on IBM POWER7, The Wall Street Journal, retrieved 2011-02-21.

19. Is Watson the smartest machine on earth?, Computer Science and Electrical Engineering Department, UMBC, 2011-02-10, retrieved 2011-02-11.

20. Rennie, John (2011-02-14), How IBM's Watson Computer Excels at Jeopardy!, PLoS blogs, retrieved 2011-02-19.

21. Mearian, Lucas (2011-02-21), Can anyone afford an IBM Watson supercomputer? (Yes), Computerworld, retrieved 2011-02-21.

22. [1]

23. "The AI Behind Watson - The Technical Article". AI Magazine. Fall 2010. Retrieved 2013-11-11.

24. IBM's 'Watson' to take on Jeopardy! champs, AFP, 2011-02-11, retrieved 2011-02-19.

25. Jennings, Ken (2011-02-16), My Puny Human Brain, Slate, Newsweek Interactive Co. LLC, retrieved 2011-02-17.

26. Libresco, Leah Anthony (2011-02-21), A Non-Trivial Advantage for Watson, The Huffington Post, retrieved 2011-02-21.

27. Will Watson Win On Jeopardy!?, Nova ScienceNOW (Public Broadcasting Service), 2011-01-20, retrieved 2011-01-27.

28. Gondek, David (2011-01-10), How Watson "sees," "hears," and "speaks" to play Jeopardy!, IBM Research blog (IBM), retrieved 2011-02-21.

29. Avery, Lise (2011-02-14), Actor Jeff Woodman, Voice of IBM's Watson Computer (MP3), Anything Goes!!, retrieved 2011-02-15 (interview of Jeff Woodman).

30. Needleman, Rafe (2011-02-18), Reporters' Roundtable: Debating the robobrains, CNET, retrieved 2011-02-18.

31. Jeopardy! Champ Ken Jennings, The Washington Post, 2011-02-15, retrieved 2011-02-15.

32. Kosinski, R. J. (2008). A literature review on reaction time, Clemson University.

33. Baker, Stephen (2011). Final Jeopardy: Man vs. Machine and the Quest to Know Everything. Boston, New York: Houghton Mifflin Harcourt. p. 174. ISBN 0-547-48316-3.

34. Baker, Stephen (2011). Final Jeopardy: Man vs. Machine and the Quest to Know Everything. Boston, New York: Houghton Mifflin Harcourt. p. 178. ISBN 0-547-48316-3.

35. Strachan, Alex (2011-02-12), For Jennings, it's a man vs. man competition, The Vancouver Sun, retrieved 2011-02-15.

36. Baker, Stephen (2011). Final Jeopardy: Man vs. Machine and the Quest to Know Everything. Boston, New York: Houghton Mifflin Harcourt. pp. 6–8. ISBN 0-547-48316-3.

37. Baker, Stephen (2011). Final Jeopardy: Man vs. Machine and the Quest to Know Everything. Boston, New York: Houghton Mifflin Harcourt. p. 30. ISBN 0-547-48316-3.

38. Radev, Dragomir R.; Prager, John; Samn, Valerie (2000). "Ranking potential answers to natural language questions". Proceedings of the 6th Conference on Applied Natural Language Processing.

39. Prager, John; Brown, Eric; Coden, Annie; Radev, Dragomir R. (July 2000). "Question-answering by predictive annotation". Proceedings, 23rd Annual International ACM SIGIR Conference on Research and Development in Information Retrieval.

40. Brodkin, Jon (2010-02-10), IBM's Jeopardy-playing machine can now beat human contestants, Network World, retrieved 2011-02-19.

41. "Medical Students Offer Expertise to IBM's Jeopardy!-Winning Computer Watson as It Pursues a New Career in Medicine". InTouch 18. New York Medical College. June 2012. p. 4.

42. Stelter, Brian (2010-12-14), I.B.M. Supercomputer 'Watson' to Challenge 'Jeopardy' Stars, The New York Times, retrieved 2010-12-14. "An I.B.M. supercomputer system named after the company's founder, Thomas J. Watson Sr., is almost ready for a televised test: a bout of questioning on the quiz show "Jeopardy." I.B.M. and the producers of "Jeopardy" will announce on Tuesday that the computer, "Watson," will face the two most successful players in "Jeopardy" history, Ken Jennings and Brad Rutter, in three episodes that will be broadcast Feb. 14–16, 2011."

43. Baker, Stephen (2011). Final Jeopardy: Man vs. Machine and the Quest to Know Everything. Boston, New York: Houghton Mifflin Harcourt. p. 171. ISBN 0-547-48316-3.

44. Flatow, Ira (2011-02-11), IBM Computer Faces Off Against 'Jeopardy' Champs, Talk of the Nation (National Public Radio), retrieved 2011-02-15.

45. Sostek, Anya (2011-02-13), Human champs of 'Jeopardy!' vs. Watson the IBM computer: a close match, Pittsburgh Post-Gazette, retrieved 2011-02-19.

46. Baker, Stephen (2011). Final Jeopardy: Man vs. Machine and the Quest to Know Everything. Boston, New York: Houghton Mifflin Harcourt. p. 117. ISBN 0-547-48316-3.

47. Baker, Stephen (2011). Final Jeopardy: Man vs. Machine and the Quest to Know Everything. Boston, New York: Houghton Mifflin Harcourt. pp. 232–258. ISBN 0-547-48316-3.

48. Dignan, Larry (2011-01-13), IBM's Watson wins Jeopardy practice round: Can humans hang?, ZDNet, retrieved 2011-01-13.

49. "The IBM Challenge Day 1". Jeopardy!. Season 27. Episode 23. 2011-02-14.

50. Lenchner, Jon (2011-02-03), Knowing what it knows: selected nuances of Watson's strategy, IBM Research blog (IBM), retrieved 2011-02-16.

51. Johnston, Casey (2011-02-15), Jeopardy: IBM's Watson almost sneaks wrong answer by Trebek, Ars Technica, retrieved 2011-02-15.

52. Computer crushes the competition on 'Jeopardy!', Associated Press, 2011-02-15, retrieved 2011-02-19.

53. Tesauro, Gerald (2011-02-13), IBM Research: Watson's wagering strategies, IBM Research blog (IBM), retrieved 2011-02-18.

54. Staff (2011-02-15), IBM's computer wins 'Jeopardy!' but... Toronto?, CTV.ca, retrieved 2011-02-15. "Watson, IBM's quiz-master computer with the strangely serene voice, beat the humans on "Jeopardy!" tonight. But it got the final question on U.S. cities wrong, answering: Toronto."

55. Robertson, Jordan; Borenstein, Seth (2011-02-16), For Watson, Jeopardy! victory was elementary, The Globe and Mail, The Associated Press, retrieved 2011-02-17. "A human would have considered Toronto and discarded it because it is a Canadian city, not a U.S. one, but that's not the type of comparative knowledge Watson has, Prof. Nyberg said."

56. Hamm, Steve (2011-02-15), Watson on Jeopardy! Day Two: The Confusion over an Airport Clue, A Smart Planet Blog, retrieved 2011-02-21.

57. Johnston, Casey (2011-02-15), Creators: Watson has no speed advantage as it crushes humans in Jeopardy, Ars Technica, retrieved 2011-02-21.

58. Oberman, Mira (2011-02-17), Computer creams human Jeopardy! champions, Vancouver Sun, Agence France-Presse, retrieved 2011-02-17. "But a Final Jeopardy flub prompted one IBM engineer to wear a Toronto Blue Jays jacket to the second day of taping and Trebek to joke that he'd learned Toronto was a U.S. city."

59. Johnston, Casey (2011-02-17), Bug lets humans grab Daily Double as Watson triumphs on Jeopardy, Ars Technica, retrieved 2011-02-21.

60. Upbin, Bruce (2011-02-17), IBM's Supercomputer Watson Wins It All With $367 Bet, Forbes, retrieved 2011-02-21.

61. Oldenburg, Ann (2011-02-17), Ken Jennings: 'My puny brain' did just fine on 'Jeopardy!', USA Today, retrieved 2011-02-21.

62. "Show 6088 - The IBM Challenge, Day 2". Jeopardy!. 2011-02-16. Syndicated.

63. World Community Grid to benefit from Jeopardy! competition, World Community Grid, 2011-02-04, retrieved 2011-02-19.

64. Jeopardy! And IBM Announce Charities To Benefit From Watson Competition, IBM Corporation, 2011-01-13, retrieved 2011-02-19.

65. IBM's Watson supercomputer crowned Jeopardy king, BBC News, 2011-02-17, retrieved 2011-02-17.

66. Markoff, John (2011-02-16), Computer Wins on 'Jeopardy!': Trivial, It's Not, Yorktown Heights, New York: The New York Times, retrieved 2011-02-17.

67. Searle, John (2011-02-23), Watson Doesn't Know It Won on 'Jeopardy!', The Wall Street Journal, retrieved 2011-07-26.

68. Lohr, Steve (2011-12-05). "Creating AI based on the real thing". The New York Times.

69. NJ congressman tops 'Jeopardy' computer Watson, Associated Press, 2011-03-02, retrieved 2011-03-02.

70. Weber, Robert C. (2011-02-14), Why 'Watson' matters to lawyers, The National Law Journal, retrieved 2011-02-18.

71. Nay, Chris (2011-09-06). "Putting Watson to work: Interview with GM of Watson Solutions Manoj Saxena". Smarter Planet Blog. IBM. Retrieved 2013-11-12.

72. Merritt, Rick (2011-02-14), IBM playing Jeopardy with tax dollars, EE Times, retrieved 2011-02-19.

73. https://www.internetretailer.com/2013/12/03/ibms-watson-computer-helps-shoppers-new-app

74. http://mobihealthnews.com/27414/with-watson-api-launch-ibm-turns-to-welltok-for-patients-md-buyline-for-docs/

75. http://www.forbes.com/sites/bruceupbin/2013/11/14/ibm-opens-up-watson-as-a-web-service/

76. "IBM's Watson to Join Research Team at Rensselaer | News & Events". News.rpi.edu. 2013-01-30. Retrieved 2013-10-01.

77. The Independent Sector: Cultural, Economic and Social Contributions of New York's 100+, Not-for-Profit Colleges and Universities, Commission on Independent Colleges and Universities, Summer 2013, p. 12, retrieved 2013-10-01.

78. http://www.reuters.com/article/2014/02/06/us-ibm-africa-idUSBREA1507H20140206

79. https://www.flickr.com/photos/ibm_media/14359875853/

80. https://www.flickr.com/photos/ibm_media/14153092500/in/photostream/

81. https://www.flickr.com/photos/ibm_media/14152996019/in/photostream/

82. http://www-03.ibm.com/press/us/en/pressrelease/44057.wss

83. "Genesys to Put IBM's Watson to Work".

84. "IBM's Watson Is Now A Cooking App With Infinite Recipes".

85. "Putting Watson to Work: Watson in Healthcare". IBM. Retrieved 2013-11-11.

86. IBM Watson Helps Fight Cancer with Evidence-Based Diagnosis and Treatment Suggestions, IBM, retrieved 2013-11-12.

87. Saxena, Manoj (2013-02-13). "IBM Watson Progress and 2013 Roadmap (Slide 7)". IBM. Retrieved 2013-11-12.

88. Wakeman, Nick (2011-02-17), IBM's Watson heads to medical school, Washington Technology, retrieved 2011-02-19.

89. Mathews, Anna Wilde (September 12, 2011). "Wellpoint's New Hire: What is Watson?". The Wall Street Journal.

90. Miliard, Mike (2012-10-30). "Watson Heads to Medical School: Cleveland Clinic, IBM Send Supercomputer to College". Healthcare IT News. Retrieved 2013-11-11.

91. Leske, Nikola (2013-02-09). "Doctors Seek Help on Cancer Treatment from IBM Supercomputer". Reuters. Retrieved 2013-11-11.

92. IBM Watson Group Unveils Cloud-Delivered Watson Services to Transform Industrial R&D, Visualize Big Data Insights and Fuel Analytics Exploration, retrieved January 18, 2014.

93. IBM set to expand Watson's reach, The Wall Street Journal, January 9, 2014, p. B2.

Latest revision as of 12:22, 22 December 2016

Watson's Avatar, inspired by the IBM "smarter planet" logo[1]

Watson is an artificially intelligent computer system capable of answering questions posed in natural language,[2] developed in IBM's DeepQA project by a research team led by principal investigator David Ferrucci. Watson was named after IBM's first CEO and industrialist Thomas J. Watson.[3][4] The computer system was specifically developed to answer questions on the quiz show Jeopardy![5] In 2011, Watson competed on Jeopardy! against former winners Brad Rutter and Ken Jennings.[3][6] Watson received the first place prize of $1 million.[7]

Watson had access to 200 million pages of structured and unstructured content consuming four terabytes of disk storage[8] including the full text of Wikipedia,[9] but was not connected to the Internet during the game.[10][11] For each clue, Watson's three most probable responses were displayed on the television screen. Watson consistently outperformed its human opponents on the game's signaling device, but had trouble responding to a few categories, notably those having short clues containing only a few words.

In February 2013, IBM announced that Watson software system's first commercial application would be for utilization management decisions in lung cancer treatment at Memorial Sloan–Kettering Cancer Center in conjunction with health insurance company WellPoint.[12] IBM Watson's business chief Manoj Saxena says that 90% of nurses in the field who use Watson now follow its guidance.[13]

Description

Watson
The high-level architecture of IBM's DeepQA used in Watson

Watson is a question answering (QA) computing system that IBM built to apply advanced natural language processing, information retrieval, knowledge representation, automated reasoning, and machine learning technologies to the field of open domain question answering.[2]

The key difference between QA technology and document search is that document search takes a keyword query and returns a list of documents, ranked in order of relevance to the query (often based on popularity and page ranking), while QA technology takes a question expressed in natural language, seeks to understand it in much greater detail, and returns a precise answer to the question.[15]

According to IBM, "more than 100 different techniques are used to analyze natural language, identify sources, find and generate hypotheses, find and score evidence, and merge and rank hypotheses."[16]

Software

Watson uses IBM's DeepQA software and the Apache UIMA (Unstructured Information Management Architecture) framework. The system was written in various languages, including Java, C++, and Prolog, and runs on the SUSE Linux Enterprise Server 11 operating system using Apache Hadoop framework to provide distributed computing.[8][17][18]

Hardware

The system is workload-optimized, integrating massively parallel POWER7 processors, and is built on IBM's DeepQA technology,[19] which it uses to generate hypotheses, gather evidence, and analyze data.[2] Watson is composed of a cluster of ninety IBM Power 750 servers, each of which uses a 3.5 GHz POWER7 eight-core processor, with four threads per core. In total, the system has 2,880 POWER7 processor cores and 16 terabytes of RAM.[19]
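
These figures can be cross-checked with simple arithmetic (a back-of-the-envelope sketch in Python, not an IBM specification): 2,880 cores across ninety servers implies 32 cores, i.e. four of the eight-core POWER7 chips, per server.

  # Back-of-the-envelope check of the published cluster figures.
  servers = 90
  total_cores = 2880
  cores_per_chip = 8
  threads_per_core = 4

  cores_per_server = total_cores // servers              # 32 cores per server
  chips_per_server = cores_per_server // cores_per_chip  # 4 POWER7 chips per server
  total_threads = total_cores * threads_per_core         # 11,520 hardware threads

  print(chips_per_server, total_threads)                 # -> 4 11520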

According to John Rennie, Watson can process 500 gigabytes, the equivalent of a million books, per second.[20] IBM's master inventor and senior consultant Tony Pearson estimated Watson's hardware cost at about $3 million.[21] Its performance stands at 80 teraFLOPS, which is not enough to place it on the TOP500 supercomputer list.[22] According to Rennie, the content was stored in Watson's RAM for the game because data stored on hard drives would be too slow to access.[20]
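
A rough calculation shows why the corpus had to live in RAM. Only the 500 GB/s figure and the four-terabyte corpus size come from the text; the disk bandwidth below is an illustrative assumption.

  # Illustrative scan-time comparison; the disk figure is an assumption, not from the source.
  corpus_gb = 4 * 1000            # four terabytes of content, in gigabytes
  ram_rate_gb_s = 500             # Rennie's quoted processing rate from RAM
  disk_rate_gb_s = 0.1            # assumed ~100 MB/s for a single spinning disk

  print(corpus_gb / ram_rate_gb_s)    # 8.0 seconds to sweep the corpus from RAM
  print(corpus_gb / disk_rate_gb_s)   # 40,000 seconds (~11 hours) from one disk

Even granting disk arrays far faster than the assumed figure, a response window of a few seconds per clue leaves no room for disk access.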

Data

The sources of information for Watson include encyclopedias, dictionaries, thesauri, newswire articles, and literary works. Watson also used databases, taxonomies, and ontologies. Specifically, DBPedia, WordNet, and Yago were used.[23]

The IBM team provided Watson with millions of documents, including dictionaries, encyclopedias, and other reference material that it could use to build its knowledge.[11] Although Watson was not connected to the Internet during the game,[24] it contained 200 million pages of structured and unstructured content consuming four terabytes of disk storage,[8] including the full text of Wikipedia.[9]
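
To give a concrete flavor of one of those lexical resources, the sketch below queries WordNet through the NLTK library. NLTK is our choice of interface for illustration; the source does not say how Watson accessed WordNet.

  # Minimal WordNet lookup via NLTK; run nltk.download('wordnet') once beforehand.
  from nltk.corpus import wordnet as wn

  # Walk the hypernym ("is-a") hierarchy, the kind of taxonomic relation
  # a QA system can use to check that a candidate answer has the right type.
  for synset in wn.synsets("airport"):
      print(synset.name(), "-", synset.definition())
      for hypernym in synset.hypernyms():
          print("  is-a:", hypernym.name())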

Operation

When playing Jeopardy!, all players must wait until host Alex Trebek reads each clue in its entirety, after which a light is lit as a "ready" signal; the first to activate their buzzer button wins the chance to respond.[11][26] Watson received the clues as electronic texts at the same moment they were made visible to the human players.[11] It would then parse the clues into different keywords and sentence fragments in order to find statistically related phrases.[11] Watson's main innovation was not in the creation of a new algorithm for this operation but rather its ability to quickly execute thousands of proven language analysis algorithms simultaneously to find the correct answer.[11][27] The more algorithms that find the same answer independently, the more likely Watson is to be correct.[11] Once Watson has a small number of potential solutions, it is able to check against its database to ascertain whether each solution makes sense.[11] In a sequence of 20 mock games, human participants were able to use the six to seven seconds that Watson, on average, needed to hear the clue and decide whether to signal a response.[11] During that time, Watson also has to evaluate the response and determine whether it is sufficiently confident in the result to signal.[11] Part of the system used to win the Jeopardy! contest was the electronic circuitry that received the "ready" signal and then examined whether Watson's confidence level was great enough to activate the buzzer. Given the speed of this circuitry compared to the speed of human reaction times, Watson's reaction time was faster than the human contestants' except when the humans anticipated (instead of reacted to) the ready signal.[28] After signaling, Watson speaks with an electronic voice and gives the responses in Jeopardy!'s question format.[11] Watson's voice was synthesized from recordings that actor Jeff Woodman made for an IBM text-to-speech program in 2004.[29]
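
The process described here is, in effect, an ensemble method: many independent analyzers propose candidate answers, agreement accumulates evidence, and the system signals only if the leading candidate clears a confidence threshold. The sketch below illustrates that control flow; the candidate scores and the threshold are invented, and this is not DeepQA's actual scoring model.

  # Toy ensemble voting, assuming each "algorithm" returns (candidate, score) pairs.
  from collections import defaultdict

  def decide_to_buzz(algorithm_outputs, threshold=0.5):
      """Merge candidate scores from many analyzers; buzz if the leader clears the threshold."""
      totals = defaultdict(float)
      for outputs in algorithm_outputs:          # one list of (candidate, score) per algorithm
          for candidate, score in outputs:
              totals[candidate] += score         # independent agreement accumulates evidence
      best, evidence = max(totals.items(), key=lambda kv: kv[1])
      confidence = evidence / sum(totals.values())   # normalize to a 0-1 confidence
      return (best, confidence) if confidence >= threshold else (None, confidence)

  outputs = [[("Chicago", 0.6), ("Toronto", 0.3)],
             [("Chicago", 0.5)],
             [("Toronto", 0.4)]]
  print(decide_to_buzz(outputs))   # -> ('Chicago', 0.611...), confident enough to buzz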

Comparison with human players

Watson's basic working principle is to parse keywords in a clue while searching for related terms as responses. This gives Watson some advantages and disadvantages compared with human Jeopardy! players.[30] Watson has deficiencies in understanding the contexts of the clues. As a result, human players usually generate responses faster than Watson, especially to short clues.[11] Watson's programming prevents it from using the popular tactic of buzzing before it is sure of its response.[11] Watson has consistently better reaction time on the buzzer once it has generated a response, and is immune to human players' psychological tactics.[11][31]

The Jeopardy! staff used different means to notify Watson and the human players when to buzz,[28] which was critical in many rounds.[31] The humans were notified by a light, which took them tenths of a second to perceive.[32][33] Watson was notified by an electronic signal and could activate the buzzer within about eight milliseconds.[34] The humans tried to compensate for the perception delay by anticipating the light,[35] but the variation in their anticipation time was generally too great to fall within Watson's response window.[31] Watson made no attempt to anticipate the notification signal.[33][35]
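
A small Monte Carlo simulation makes the buzzer asymmetry concrete. Only the roughly eight-millisecond figure for Watson comes from the text; the human timing distributions below are illustrative assumptions.

  # Monte Carlo of the buzzer race; human timing parameters are assumptions.
  import random

  WATSON_MS = 8                      # electronic signal to buzzer activation (from the text)

  def human_buzz_ms(anticipating):
      if anticipating:
          # Anticipation centers near the light but with high variance; buzzing early
          # locks a player out briefly, modeled here crudely as a retry penalty.
          t = random.gauss(0, 60)
          return t if t >= 0 else t + 250
      return random.gauss(200, 30)   # simple reaction to the light: ~tenths of a second

  trials = 100_000
  wins = sum(human_buzz_ms(anticipating=True) < WATSON_MS for _ in range(trials))
  print(f"human wins {100 * wins / trials:.1f}% of races when anticipating")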

History

Development

Since Deep Blue's victory over Garry Kasparov in chess in 1997, IBM had been on the hunt for a new challenge. In 2004, IBM Research manager Charles Lickel, over dinner with coworkers, noticed that the restaurant they were in had fallen silent. He soon discovered the cause of this evening hiatus: Ken Jennings, who was then in the middle of his successful 74-game run on Jeopardy!. Nearly the entire restaurant had piled toward the televisions, mid-meal, to watch the phenomenon. Intrigued by the quiz show as a possible challenge for IBM, Lickel passed the idea on, and in 2005, IBM Research executive Paul Horn backed Lickel up, pushing for someone in his department to take up the challenge of playing Jeopardy! with an IBM system. Though he initially had trouble finding any research staff willing to take on what looked to be a much more complex challenge than the wordless game of chess, eventually David Ferrucci took him up on the offer.[36] In competitions managed by the United States government, Watson's predecessor, a system named Piquant, was usually able to respond correctly to only about 35% of clues and often required several minutes to respond.[37][38][39] To compete successfully on Jeopardy!, Watson would need to respond in no more than a few seconds, and at that time, the problems posed by the game show were deemed to be impossible to solve.[11]

In initial tests run during 2006 by David Ferrucci, the senior manager of IBM's Semantic Analysis and Integration department, Watson was given 500 clues from past Jeopardy! programs. While the best real-life competitors buzzed in half the time and responded correctly to as many as 95% of clues, Watson's first pass could get only about 15% correct. During 2007, the IBM team was given three to five years and a staff of 15 people to solve the problems.[11] By 2008, the developers had advanced Watson such that it could compete with Jeopardy! champions.[11] By February 2010, Watson could beat human Jeopardy! contestants on a regular basis.[40]

Although the system is primarily an IBM effort, Watson's development involved faculty and graduate students from Rensselaer Polytechnic Institute, Carnegie Mellon University, University of Massachusetts Amherst, the University of Southern California's Information Sciences Institute, the University of Texas at Austin, the Massachusetts Institute of Technology, and the University of Trento,[14] as well as students from New York Medical College.[41]

Jeopardy!

Ken Jennings, Watson, and Brad Rutter in their Jeopardy! exhibition match

Preparation

In 2008, IBM representatives communicated with Jeopardy! executive producer Harry Friedman about the possibility of having Watson compete against Ken Jennings and Brad Rutter, two of the most successful contestants on the show, and the program's producers agreed.[11][42] Watson's differences with human players had generated conflicts between IBM and Jeopardy! staff during the planning of the competition.[30] IBM repeatedly expressed concerns that the show's writers would exploit Watson's cognitive deficiencies when writing the clues, thereby turning the game into a Turing test. To address that concern, a third party randomly picked the clues from previously written shows that were never broadcast.[30] Jeopardy! staff also raised concerns over Watson's reaction time on the buzzer. Originally Watson signaled electronically, but show staff requested that it press a button physically, as the human contestants would.[43] Even with a robotic "finger" pressing the buzzer, Watson remained faster than its human competitors. Ken Jennings noted, "If you're trying to win on the show, the buzzer is all," and that Watson "can knock out a microsecond-precise buzz every single time with little or no variation. Human reflexes can't compete with computer circuits in this regard."[31][35][44] Stephen Baker, a journalist who recorded Watson's development in his book "Final Jeopardy", reported that the conflict between IBM and Jeopardy! became so serious in May 2010 that the competition was almost canceled.[30] Watson also learned from its mistakes; for example, during a practice round it was given the clue "This trusted friend was the first non-dairy powdered creamer," to which it replied, "What is milk?", misreading the clue as asking for a dairy product. As part of the preparation, IBM constructed a mock set in a conference room at one of its technology sites to model the one used on Jeopardy! Human players, including former Jeopardy! contestants, also participated in mock games against Watson, with Todd Alan Crain of The Onion playing host.[11] About 100 test matches were conducted, with Watson winning 65% of the games.[45]

To provide a physical presence in the televised games, Watson was represented by an "avatar" of a globe, inspired by the IBM "smarter planet" symbol. Jennings described the computer's avatar as a "glowing blue ball crisscrossed by 'threads' of thought—42 threads, to be precise,"[25] and stated that the number of thought threads in the avatar was an in-joke referencing the significance of the number 42 in Douglas Adams' Hitchhiker's Guide to the Galaxy.[25] Joshua Davis, the artist who designed the avatar for the project, explained to Stephen Baker that there are 36 triggerable states that Watson could use throughout the game to show its confidence in responding to a clue correctly; he had hoped to find 42, to add another level to the Hitchhiker's Guide reference, but was unable to pinpoint enough game states.[46]

A practice match was recorded on January 13, 2011, and the official matches were recorded on January 14, 2011. All participants maintained secrecy about the outcome until the match was broadcast in February.[47]

Practice match

In a practice match before the press on January 13, 2011, Watson won a 15-question round against Ken Jennings and Brad Rutter with a score of $4,400 to Jennings's $3,400 and Rutter's $1,200, though Jennings and Watson were tied before the final $1,000 question. None of the three players responded incorrectly to a clue.[48]

First match

The first round was broadcast February 14, 2011, and the second round, on February 15, 2011. The right to choose the first category had been determined by a draw won by Rutter.[49] Watson, represented by a computer monitor display and artificial voice, responded correctly to the second clue and then selected the fourth clue of the first category, a deliberate strategy to find the Daily Double as quickly as possible.[50] Watson's guess at the Daily Double location was correct. At the end of the first round, Watson was tied with Rutter at $5,000; Jennings had $2,000.[49]

Watson's performance was characterized by some quirks. In one instance, Watson repeated a reworded version of an incorrect response offered by Jennings (Jennings said "What are the '20s?" in reference to the 1920s. Then Watson said "What is 1920s?") Because Watson could not recognize other contestants' responses, it did not know that Jennings had already given the same response. In another instance, Watson was initially given credit for a response of "What is leg?" after Jennings incorrectly responded "What is: he only had one hand?" to a clue about George Eyser (The correct response was, "What is: he's missing a leg?"). Because Watson, unlike a human, could not have been responding to Jennings's mistake, it was decided that this response was incorrect. The broadcast version of the episode was edited to omit Trebek's original acceptance of Watson's response.[51] Watson also demonstrated complex wagering strategies on the Daily Doubles, with one bet at $6,435 and another at $1,246.[52] Gerald Tesauro, one of the IBM researchers who worked on Watson, explained that Watson's wagers were based on its confidence level for the category and a complex regression model called the Game State Evaluator.[53]
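
The Game State Evaluator itself is unpublished, but the underlying idea of confidence-weighted wagering can be sketched with a toy expected-value rule. This is a simplification for illustration, not IBM's regression model.

  # Toy Daily Double wager: expected score after betting w with confidence p.
  def expected_score(score, wager, p_correct):
      return p_correct * (score + wager) + (1 - p_correct) * (score - wager)

  def best_wager(score, p_correct, step=1):
      # With a purely linear objective the optimum is all-or-nothing;
      # a real model also weighs game state and downside risk.
      candidates = range(5, score + 1, step)
      return max(candidates, key=lambda w: expected_score(score, w, p_correct))

  print(best_wager(10_000, 0.75))   # -> 10000: linear EV says bet everything when confident
  print(best_wager(10_000, 0.30))   # -> 5: minimum wager when confidence is low

Because a linear expected-value objective bets all or nothing, oddly precise amounts such as $6,435 presumably reflect the richer game-state and risk terms in IBM's actual model.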

Watson took a commanding lead in Double Jeopardy!, correctly responding to both Daily Doubles. Watson responded to the second Daily Double correctly with a 32% confidence score.[52]

Although it wagered only $947 on the clue, Watson was the only contestant to miss the Final Jeopardy! response in the category U.S. CITIES ("Its largest airport was named for a World War II hero; its second largest, for a World War II battle"). Rutter and Jennings gave the correct response of Chicago, but Watson's response was "What is Toronto?????"[52][54][55] Ferrucci offered reasons why Watson would appear to have guessed a Canadian city: categories only weakly suggest the type of response desired, the phrase "U.S. city" didn't appear in the question, there are cities named Toronto in the U.S., and Toronto in Ontario has an American League baseball team.[56] Dr. Chris Welty, who also worked on Watson, suggested that it may not have been able to correctly parse the second part of the clue, "its second largest, for a World War II battle" (which was not a standalone clause despite it following a semicolon, and required context to understand that it was referring to a second-largest airport).[57] Eric Nyberg, a professor at Carnegie Mellon University and a member of the development team, stated that the error occurred because Watson does not possess the comparative knowledge to discard that potential response as not viable.[55] Although not displayed to the audience as with non-Final Jeopardy! questions, Watson's second choice was Chicago. Both Toronto and Chicago were well below Watson's confidence threshold, at 14% and 11% respectively. (This lack of confidence was the reason for the multiple question marks in Watson's response.)

The game ended with Jennings with $4,800, Rutter with $10,400, and Watson with $35,734.[52]

Second match

During the introduction, Trebek (a Canadian native) joked that he had learned Toronto was a U.S. city, and Watson's error in the first match prompted an IBM engineer to wear a Toronto Blue Jays jacket to the recording of the second match.[58]

In the first round, Jennings was finally able to choose a Daily Double clue,[59] while Watson responded to one Daily Double clue incorrectly for the first time in the Double Jeopardy! round.[60] After the first round, Watson placed second for the first time in the competition, after Rutter and Jennings were briefly successful in increasing their dollar values before Watson could respond.[60][61] Nonetheless, the final result was a victory for Watson with a score of $77,147, besting Jennings, who scored $24,000, and Rutter, who scored $21,600.[62]

Final outcome

The prizes for the competition were $1 million for first place (Watson), $300,000 for second place (Jennings), and $200,000 for third place (Rutter). As promised, IBM donated 100% of Watson's winnings to charity, with 50% of those winnings going to World Vision and 50% going to World Community Grid.[63] Similarly, Jennings and Rutter donated 50% of their winnings to their respective charities.[64]

In acknowledgment of IBM and Watson's achievements, Jennings made an additional remark in his Final Jeopardy! response: "I for one welcome our new computer overlords", echoing a similar memetic reference to the episode "Deep Space Homer" on The Simpsons, in which TV news presenter Kent Brockman speaks of welcoming "our new insect overlords".[65][66] Jennings later wrote an article for Slate, in which he stated "IBM has bragged to the media that Watson's question-answering skills are good for more than annoying Alex Trebek. The company sees a future in which fields like medical diagnosis, business analytics, and tech support are automated by question-answering software like Watson. Just as factory jobs were eliminated in the 20th century by new assembly-line robots, Brad and I were the first knowledge-industry workers put out of work by the new generation of 'thinking' machines. 'Quiz show contestant' may be the first job made redundant by Watson, but I'm sure it won't be the last."[25]

Philosophy

Philosopher John Searle argues that Watson—despite impressive capabilities—cannot actually think.[67] Drawing on his Chinese room thought experiment, Searle claims that Watson, like other computational machines, is capable only of manipulating symbols, but has no ability to understand the meaning of those symbols; however, Searle's experiment has its detractors.[68]

Match against members of the United States Congress

On February 28, 2011, Watson played an untelevised exhibition match of Jeopardy! against members of the United States House of Representatives. In the first round, Rush D. Holt, Jr. (D-NJ, a former Jeopardy! contestant), who was challenging the computer along with Bill Cassidy (R-LA), led with Watson in second place. However, combining the scores across all matches, the final total was $40,300 for Watson and $30,000 for the congressional players combined.[69]

IBM's Christopher Padilla said of the match, "The technology behind Watson represents a major advancement in computing. In the data-intensive environment of government, this type of technology can help organizations make better decisions and improve how government helps its citizens."[69]

Future applications

Watson demo at an IBM booth at a trade show

According to IBM, "The goal is to have computers start to interact in natural human terms across a range of applications and processes, understanding the questions that humans ask and providing answers that humans can understand and justify."[40] It has been suggested by Robert C. Weber, IBM's general counsel, that Watson may be used for legal research.[70] The company also intends to use Watson in other information-intensive fields, such as telecommunications, financial services, and government.[71]

Watson is based on commercially available IBM Power 750 servers that have been marketed since February 2010. IBM also intends to market the DeepQA software to large corporations, with a price in the millions of dollars, reflecting the $1 million needed to acquire a server that meets the minimum system requirement to operate Watson. IBM expects the price to decrease substantially within a decade as the technology improves.[11]

Commentator Rick Merritt said that "there's another really important reason why it is strategic for IBM to be seen very broadly by the American public as a company that can tackle tough computer problems. A big slice of [IBM's profit] comes from selling to the U.S. government some of the biggest, most expensive systems in the world."[72]

In 2013, it was reported that three companies were working with IBM to create apps embedded with Watson technology. Fluid is developing an app for the retailer The North Face, designed to provide advice to online shoppers. Welltok is developing an app designed to give people advice on ways to engage in activities to improve their health. MD Buyline is developing an app for the purpose of advising medical institutions on equipment procurement decisions.[73][74]

In November 2013, IBM announced it would make Watson's API available to software application providers, enabling them to build apps and services embedded with Watson's capabilities. To build out its base of partners who create applications on the Watson platform, IBM consults with a network of venture capital firms, which advise IBM on which of their portfolio companies may be a logical fit for what IBM calls the Watson Ecosystem. Thus far, roughly 800 organizations and individuals have signed up with IBM, with interest in creating applications that could use the Watson platform.[75]
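
As an illustration of what embedding question answering behind a web API can look like, here is a generic REST sketch. The endpoint URL, path, and JSON fields are hypothetical placeholders, not the actual Watson Ecosystem interface.

  # Hypothetical question-answering REST call; endpoint and fields are invented for illustration.
  import requests

  API_URL = "https://api.example.com/watson/v1/question"   # placeholder, not a real IBM endpoint

  def ask(question: str, api_key: str) -> dict:
      response = requests.post(
          API_URL,
          headers={"Authorization": f"Bearer {api_key}"},
          json={"question": question, "items": 3},          # ask for top-3 candidate answers
          timeout=30,
      )
      response.raise_for_status()
      return response.json()   # e.g. {"answers": [{"text": ..., "confidence": ...}, ...]}

  # answers = ask("What is the largest airport in Illinois?", "MY_KEY")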

On January 30, 2013, it was announced that Rensselaer Polytechnic Institute would receive a successor version of Watson, which would be housed at the Institute's technology park and be available to researchers and students.[76] By summer 2013, Rensselaer had become the first university to receive a Watson computer.[77]

On February 6, 2014, it was reported that IBM plans to invest $100 million in a 10-year initiative to use Watson and other IBM technologies to help countries in Africa address development problems, beginning with healthcare and education.[78]

On June 3, 2014, three new Watson Ecosystem partners were chosen from more than 400 business concepts submitted by teams spanning 18 industries from 43 countries. "These bright and enterprising organizations have discovered innovative ways to apply Watson that can deliver demonstrable business benefits," said Steve Gold, vice president, IBM Watson Group. The winners are Majestyk Apps with their adaptive educational platform, FANG (Friendly Anthropomorphic Networked Genome);[79] Red Ant with their retail sales trainer;[80] and GenieMD[81] with their medical recommendation service.[82]

On July 9, 2014, Genesys Telecommunications Laboratories announced plans to integrate Watson to improve its customer experience platform, citing the staggering volume of customer data awaiting analysis.[83]

Watson has been integrated with recipe databases, including that of Bon Appétit magazine, to serve as a platform for recipe generation.[84]

Healthcare

In healthcare, Watson's natural language, hypothesis generation, and evidence-based learning capabilities allow it to function as a clinical decision support system for use by medical professionals.[85] To aid physicians in the treatment of their patients, once a doctor has posed a query to the system describing symptoms and other related factors, Watson first parses the input to identify the most important pieces of information; then mines patient data to find facts relevant to the patient's medical and hereditary history; then examines available data sources to form and test hypotheses;[85] and finally provides a list of individualized, confidence-scored recommendations.[86] The sources of data that Watson uses for analysis can include treatment guidelines, electronic medical record data, notes from doctors and nurses, research materials, clinical studies, journal articles, and patient information.[85] Despite being developed and marketed as a "diagnosis and treatment advisor," Watson has never actually been involved in the medical diagnosis process, only in assisting with identifying treatment options for patients who have already been diagnosed.[87]
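
The four steps above (parse the query, mine patient data, form and test hypotheses, rank recommendations) can be summarized structurally. Everything below, from the function names to the scoring, is a toy stand-in, not Watson's implementation.

  # Structural sketch of the four-stage flow above; data and scoring are toy stand-ins.
  def parse_query(query):                      # 1. extract key terms from the doctor's question
      return set(query.lower().replace(",", " ").split())

  def mine_patient_data(record, findings):     # 2. keep record facts relevant to the query
      return [fact for fact in record if findings & parse_query(fact)]

  def score_hypotheses(findings, guidelines):  # 3. test each candidate against "evidence"
      return {treatment: sum(term in findings for term in terms)
              for treatment, terms in guidelines.items()}

  def recommend(query, record, guidelines):    # 4. confidence-scored recommendations, best first
      findings = parse_query(query)
      for fact in mine_patient_data(record, findings):
          findings |= parse_query(fact)        # enrich the query with relevant history
      scores = score_hypotheses(findings, guidelines)
      return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

  guidelines = {"targeted therapy B": ["lung", "egfr", "mutation"],
                "chemo regimen A": ["lung", "stage", "iii"]}
  print(recommend("lung cancer, EGFR mutation positive",
                  ["history: EGFR mutation detected"], guidelines))
  # -> [('targeted therapy B', 3), ('chemo regimen A', 1)]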

In February 2011, it was announced that IBM would be partnering with Nuance Communications for a research project to develop a commercial product during the next 18 to 24 months, designed to exploit Watson's clinical decision support capabilities. Physicians at Columbia University would help to identify critical issues in the practice of medicine where the system's technology may be able to contribute, and physicians at the University of Maryland would work to identify the best way that a technology like Watson could interact with medical practitioners to provide the maximum assistance.[88]

In September 2011, IBM and WellPoint, a major American healthcare solutions provider, announced a partnership to utilize Watson's data crunching capability to help suggest treatment options to doctors.[89] Then, in February 2013, IBM and WellPoint gave Watson its first commercial application, for utilization management decisions in lung cancer treatment at Memorial Sloan–Kettering Cancer Center.[12]

IBM announced a partnership with Cleveland Clinic in October 2012. The company has sent Watson to the Cleveland Clinic Lerner College of Medicine of Case Western Reserve University, where it will increase its health expertise and assist medical professionals in treating patients. The medical facility will utilize Watson's ability to store and process large quantities of information to help speed up and increase the accuracy of the treatment process. "Cleveland Clinic's collaboration with IBM is exciting because it offers us the opportunity to teach Watson to 'think' in ways that have the potential to make it a powerful tool in medicine," said C. Martin Harris, MD, chief information officer of Cleveland Clinic.[90]

On February 8, 2013, IBM announced that oncologists at the Maine Center for Cancer Medicine and Westmed Medical Group in New York have started to test the Watson supercomputer system in an effort to recommend treatment for lung cancer.[91]

IBM Watson Group

On January 9, 2014, IBM announced it was creating a business unit around Watson, led by senior vice president Michael Rhodin.[92] IBM Watson Group will have headquarters in New York's Silicon Alley and will employ 2,000 people. IBM has invested $1 billion to get the division going. Watson Group will develop three new cloud-delivered services: Watson Discovery Advisor, Watson Analytics, and Watson Explorer. Watson Discovery Advisor will focus on research and development projects in the pharmaceutical, publishing, and biotechnology industries; Watson Analytics will focus on big data visualization and insights drawn from natural language questions posed by business users; and Watson Explorer will focus on helping enterprise users uncover and share data-driven insights more easily.[92] The company is also launching a $100 million venture fund to spur application development for "cognitive" applications. According to IBM, the cloud-delivered, enterprise-ready Watson has seen its speed increase 24-fold (a 2,300 percent improvement in performance), while its physical size has shrunk by 90 percent, from the size of a master bedroom to three stacked pizza boxes.[92] IBM CEO Virginia Rometty said she wants Watson to generate $10 billion in annual revenue within ten years.[93]

Background of Cognitive Computing

Cognitive computing (CC) makes a new class of problems computable. It addresses complex situations that are characterized by ambiguity and uncertainty; in other words, it handles human kinds of problems. In these dynamic, information-rich, and shifting situations, data tends to change frequently, and it is often conflicting. The goals of users evolve as they learn more and redefine their objectives. To respond to the fluid nature of users' understanding of their problems, the cognitive computing system offers a synthesis not just of information sources but of influences, contexts, and insights. To do this, systems often need to weigh conflicting evidence and suggest an answer that is "best" rather than "right".

Cognitive computing systems make context computable. They identify and extract context features such as hour, location, task, history or profile to present an information set that is appropriate for an individual or for a dependent application engaged in a specific process at a specific time and place. They provide machine-aided serendipity by wading through massive collections of diverse information to find patterns and then apply those patterns to respond to the needs of the moment.

Cognitive computing systems redefine the nature of the relationship between people and their increasingly pervasive digital environment. They may play the role of assistant or coach for the user, and they may act virtually autonomously in many problem-solving situations. The boundaries of the processes and domains these systems will affect are still elastic and emergent. Their output may be prescriptive, suggestive, instructive, or simply entertaining. (See Smart Machines,[1] or articles by Ferrucci or Denning listed below for more information on these concepts.)

In order to achieve this new level of computing, cognitive systems must be (see the sketch after this list):

  • Adaptive. They must learn as information changes, and as goals and requirements evolve. They must resolve ambiguity and tolerate unpredictability. They must be engineered to feed on dynamic data in real time, or near real time.[2]
  • Interactive. They must interact easily with users so that those users can define their needs comfortably. They may also interact with other processors, devices, and Cloud services, as well as with people.
  • Iterative and stateful. They must aid in defining a problem by asking questions or finding additional source input if a problem statement is ambiguous or incomplete. They must “remember” previous interactions in a process and return information that is suitable for the specific application at that point in time.
  • Contextual. They must understand, identify, and extract contextual elements such as meaning, syntax, time, location, appropriate domain, regulations, user’s profile, process, task and goal. They may draw on multiple sources of information, including both structured and unstructured digital information, as well as sensory inputs (visual, gestural, auditory, or sensor-provided).
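
Here is that sketch: a toy illustration of how the four properties might surface in code. The class, its fields, and its behavior are all invented for illustration, not drawn from any Watson product.

  # Toy stateful, contextual QA session illustrating the four properties above (all invented).
  from datetime import datetime

  class CognitiveSession:
      def __init__(self, user_profile):
          self.profile = user_profile          # Contextual: who is asking, and in what domain
          self.history = []                    # Iterative and stateful: remember the dialogue

      def context(self):
          # Contextual: extract features such as hour and the prior question
          return {"hour": datetime.now().hour,
                  "last_question": self.history[-1] if self.history else None,
                  "domain": self.profile.get("domain")}

      def ask(self, question):
          ctx = self.context()                 # context from the state accumulated so far
          self.history.append(question)        # state grows across the interaction
          if question.strip().endswith("there?"):   # Interactive: ask back when ambiguous
              return "Where is 'there'? Please restate the location."
          # Adaptive: a real system would update its models from new data at this point
          return f"(answer for {question!r} in domain {ctx['domain']})"

  session = CognitiveSession({"domain": "oncology"})
  print(session.ask("What trials are recruiting?"))
  print(session.ask("Any near there?"))        # ambiguous follow-up: the system asks back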

Cognitive systems differ from current computing applications in that they move beyond tabulating and calculating based on preconfigured rules and programs. Although they are capable of basic computing, they can also infer and even reason based on broad objectives.

Beyond these principles, cognitive computing systems can be extended to include additional tools and technologies. They may integrate or leverage existing information systems and add domain or task-specific interfaces and tools as required.

Many of today’s applications (e.g., search, ecommerce, eDiscovery) exhibit some of these features, but it is rare to find all of them fully integrated and interactive.

Cognitive systems will coexist with legacy systems into the indefinite future. Many cognitive systems will build upon today’s IT resources. But the ambition and reach of cognitive computing is fundamentally different. Leaving the model of computer-as-appliance behind, it seeks to bring computing into a closer, fundamental partnership in human endeavors.

Additional uses of the term

The term cognitive computing has also been used to refer to new hardware and/or software that mimics the functioning of the human brain. In this sense, cognitive computing (CC) is a new type of computing with the goal of more accurate models of how the human brain/mind senses, reasons, and responds to stimulus. CC applications link data analysis and adaptive page displays (AUI) to adjust content for a particular type of audience. As such, CC hardware and applications strive to be more affective and more influential by design.

Like a human, a cognitive computing application learns by experience and/or instruction. The CC application learns and remembers how to adapt its content displays, by situation, to influence behavior. This means a CC application must have intent, memory, foreknowledge and cognitive reasoning for a domain of variable situations. These 'cognitive' functions are in addition to the more fixed page displays now found in most paging applications.

External links

J! Archive

Videos

Further Reading

Bibliography

  • APA (2006). VandenBos, Gary R., ed. APA Dictionary of Psychology. Washington, DC: American Psychological Association, p. 26
  • Balliene, B. W. (2005). Dietary Influences on Obesity: Environment, Behavior and Biology. Physiology & Behavior, 86 (5), pp. 717–730
  • Batson, C.D., Shaw, L. L., Oleson, K. C. (1992). Differentiating Affect, Mood and Emotion: Toward Functionally based Conceptual Distinctions. Emotion. Newbury Park, CA: Sage
  • Blechman, E. A. (1990). Moods, Affect, and Emotions. Lawrence Erlbaum Associates: Hillsdale, NJ
  • Brewin, C. R. (1989). Cognitive Change Processes in Psychotherapy. Psychological Review, 96(45), pp. 379–394
  • Damasio, A., (1994). *Descartes' Error: Emotion, Reason, and the Human Brain, Putnam Publishing
  • Denning, P. J. (2014). Surfing Toward the Future. Communications of the ACM, Vol. 57 No. 3, pp. 26–29. doi:10.1145/2566967
  • Feldman, Susan E. (2012). The Answer Machine. Morgan & Claypool
  • Greenemeier, Larry. (2013). Will IBM's Watson Usher in a New Era of Cognitive Computing? Scientific American. Nov 13, 2013
  • Griffiths, P. E. (1997). What Emotions Really Are: The Problem of Psychological Categories. The University of Chicago Press: Chicago
  • Lazarus, R. S. (1982). Thoughts on the Relations between Emotions and Cognition. American Psychologist, 37(10), pp. 1019–1024
  • Lerner, J.S., and D. Keltner. (2000) Beyond valence: Toward a model of emotion-specific influences on judgement and choice. "Cognition and Emotion", 14(4), pp. 473–493
  • Kelly, J. E. and Hamm, S. (2013). Smart Machines: IBM's Watson and the Era of Cognitive Computing. Columbia Business School Publishing
  • Nathanson, Donald L. Shame and Pride: Affect, Sex, and the Birth of the Self. London: W.W. Norton, 1992
  • Quirin, M., Kazén, M., & Kuhl, J. (2009). When nonsense sounds happy or helpless: The Implicit Positive and Negative Affect Test (IPANAT). Journal of Personality and Social Psychology, 97(3), pp. 500–516
  • Proudfoot, J., Guest, D., Carson, J., Dunn, G., & Gray, J. (1997). Effect of cognitive-behavioural training on job-finding among long-term unemployed people. The Lancet, Volume 350, Issue 9071, pp. 99–100
  • Schucman, H., Thetford, C. (1975). A Course in Miracles. New York: Viking Penguin
  • Shepard, R. N. (1984). Ecological Constraints on Internal Representation. Psychological Review, 91, pp. 417–447
  • Shepard, R. N. (1994). Perceptual-cognitive Universals as Reflections of the World. Psychonomic Bulletin & Review, 1, pp. 2–28.
  • Tolle, E. (1999). The Power of Now. Vancouver: Namaste Publishing.
  • Tolle, E. (2003). Stillness Speaks. Vancouver: Namaste Publishing
  • Weiskrantz, L. (1997). Consciousness Lost and Found. Oxford: Oxford Univ. Press.
  • Zajonc, R. B. (1980). Feelings and Thinking: Preferences Need No Inferences. American Psychologist, 35(2), pp. 151–175

References

1. IBM Watson: The Face of Watson on YouTube

2. a b c "DeepQA Project: FAQ". IBM. Retrieved 2011-02-11.

3. a b Hale, Mike (2011-02-08). "Actors and Their Roles for $300, HAL? HAL!". The New York Times. Retrieved 2011-02-11.

4. "The DeepQA Project". Research.ibm.com. Retrieved 2011-02-18.

5. "IBM Research: Dave Ferrucci at Computer History Museum - How It All Began and What's Next". IBM Research. 2011-12-01. Retrieved 2012-02-11. "In 2007, when IBM executive Charles Lickel challenged Dave and his team to revolutionize Deep QA and put an IBM computer against Jeopardy!'s human champions, he was off to the races."

6. Loftus, Jack (2009-04-26). "IBM Prepping 'Watson' Computer to Compete on Jeopardy!". Gizmodo. Retrieved 2009-04-27.

7. IBM's "Watson" Computing System to Challenge All Time Greatest Jeopardy! Champions, Sony Pictures Television, 2010-12-14, archived from the original on 2013-06-06, retrieved 2013-11-11

8. a b c Jackson, Joab (2011-02-17), IBM Watson Vanquishes Human Jeopardy Foes, PC World, IDG News, retrieved 2011-02-17

9. a b Zimmer, Ben (2011-02-17), Is It Time to Welcome Our New Computer Overlords?, The Atlantic, retrieved 2011-02-17

10. Raz, Guy (2011-01-28), Can a Computer Become a Jeopardy! Champ?, National Public Radio, retrieved 2011-02-18

11. a b c d e f g h i j k l m n o p q r s t Thompson, Clive (2010-06-16). "Smarter Than You Think: What Is I.B.M.’s Watson?". The New York Times Magazine. Retrieved 2011-02-18.

12. a b "IBM's Watson Gets Its First Piece Of Business In Healthcare" Forbes, February 8, 2013

13. Upbin, Bruce (2013-02-08). "IBM's Watson Gets Its First Piece Of Business In Healthcare". Forbes. Retrieved March 10, 2013.

14. a b Ferrucci, D, et al. (2010), Building Watson: An Overview of the DeepQA Project, AI Magazine (AI Magazine) 31 (3), retrieved 2011-02-19

15. http://craigrhinehart.com/2011/01/17/10-things-you-need-to-know-about-the-technology-behind-watson/

16. "Watson, A System Designed for Answers: The Future of Workload Optimized Systems Design". IBM Systems and Technology. p. 3. Retrieved 2011-02-21.

17. Takahashi, Dean (2011-02-17), IBM researcher explains what Watson gets right and wrong, VentureBeat, retrieved 2011-02-18

18. Novell (2011-02-02), Watson Supercomputer to Compete on 'Jeopardy!' -- Powered by SUSE Linux Enterprise Server on IBM POWER7, The Wall Street Journal, retrieved 2011-02-21

19. a b Is Watson the smartest machine on earth?, Computer Science and Electrical Engineering Department, UMBC, 2011-02-10, retrieved 2011-02-11

20. a b Rennie, John (2011-02-14), How IBM's Watson Computer Excels at Jeopardy!, PLoS blogs, retrieved 2011-02-19

21. Lucas, Mearian (2011-02-21), Can anyone afford an IBM Watson supercomputer? (Yes), Computerworld, retrieved 2011-02-21

22. [1]

23. "The AI Behind Watson - The Technical Article". AI Magazine. Fall 2010. Retrieved 2013-11-11.

24. IBM's 'Watson' to take on Jeopardy! champs, AFP, 2011-02-11, retrieved 2011-02-19

25. a b c d Jennings, Ken (2011-02-16), My Puny Human Brain, Slate, Newsweek Interactive Co. LLC, retrieved 2011-02-17

26. Libresco, Leah Anthony (2011-02-21), A Non-Trivial Advantage for Watson, The Huffington Post, retrieved 2011-02-21

27. Will Watson Win On Jeopardy!?, Nova ScienceNOW (Public Broadcasting Service), 2011-01-20, retrieved 2011-01-27

28. a b Gondek, David (2011-01-10), How Watson "sees," "hears," and "speaks" to play Jeopardy!, IBM Research blog (IBM), retrieved 2011-02-21

29. Avery, Lise (2011-02-14), Actor Jeff Woodman, Voice of IBM's Watson Computer (MP3), Anything Goes!!, retrieved 2011-02-15 (interview of Jeff Woodman)

30. a b c d Needleman, Rafe (2011-02-18), Reporters' Roundtable: Debating the robobrains, CNET, retrieved 2011-02-18

31. a b c d Jeopardy! Champ Ken Jennings, The Washington Post, 2011-02-15, retrieved 2011-02-15

32. Kosinski, R. J. (2008). A literature review on reaction time, Clemson University.

33. a b Baker, Stephen (2011). Final Jeopardy: Man vs. Machine and the Quest to Know Everything. Boston, New York: Houghton Mifflin Harcourt. p. 174. ISBN 0-547-48316-3.

34. Baker, Stephen (2011). Final Jeopardy: Man vs. Machine and the Quest to Know Everything. Boston, New York: Houghton Mifflin Harcourt. p. 178. ISBN 0-547-48316-3.

35. a b c Alex Strachan (2011-02-12), For Jennings, it's a man vs. man competition, The Vancouver Sun, retrieved 2011-02-15

36. Baker, Stephen (2011). Final Jeopardy: Man vs. Machine and the Quest to Know Everything. Boston, New York: Houghton Mifflin Harcourt. pp. 6–8. ISBN 0-547-48316-3.

37. Baker, Stephen (2011). Final Jeopardy: Man vs. Machine and the Quest to Know Everything. Boston, New York: Houghton Mifflin Harcourt. p. 30. ISBN 0-547-48316-3.

38. Radev, Dragomir R.; Prager, John; Samn, Valerie (2000). "Ranking potential answers to natural language questions". "Proceedings of the 6th Conference on Applied Natural Language Processing".

39. Prager, John; Brown, Eric; Coden, Annie; Radev, Dragomir R. (July 2000). "Question-answering by predictive annotation". "Proceedings, 23rd Annual International ACM SIGIR Conference on Research and Development in Information Retrieval".

40. a b Brodkin, Jon (2010-02-10), IBM's Jeopardy-playing machine can now beat human contestants, Network World, retrieved 2011-02-19

41. "Medical Students Offer Expertise to IBM's Jeopardy!-Winning Computer Watson as It Pursues a New Career in Medicine". InTouch 18. New York Medical College. June 2012. p. 4.

42. Stelter, Brian (2010-12-14), I.B.M. Supercomputer 'Watson' to Challenge 'Jeopardy' Stars, The New York Times, retrieved 2010-12-14, "An I.B.M. supercomputer system named after the company's founder, Thomas J. Watson Sr., is almost ready for a televised test: a bout of questioning on the quiz show "Jeopardy." I.B.M. and the producers of "Jeopardy" will announce on Tuesday that the computer, "Watson," will face the two most successful players in "Jeopardy" history, Ken Jennings and Brad Rutter, in three episodes that will be broadcast Feb. 14–16, 2011."

43. Baker, Stephen (2011). Final Jeopardy: Man vs. Machine and the Quest to Know Everything. Boston, New York: Houghton Mifflin Harcourt. p. 171. ISBN 0-547-48316-3.

44. Flatow, Ira (2011-02-11), IBM Computer Faces Off Against 'Jeopardy' Champs, Talk of the Nation (National Public Radio), retrieved 2011-02-15

45. Sostek, Anya (2011-02-13), Human champs of 'Jeopardy!' vs. Watson the IBM computer: a close match, Pittsburgh Post Gazette, retrieved 2011-02-19

46. Baker, Stephen (2011). Final Jeopardy: Man vs. Machine and the Quest to Know Everything. Boston, New York: Houghton Mifflin Harcourt. p. 117. ISBN 0-547-48316-3.

47. Baker, pp. 232-258.

48. Dignan, Larry (2011-01-13), IBM's Watson wins Jeopardy practice round: Can humans hang?, ZDnet, retrieved 2011-01-13

49. a b "The IBM Challenge Day 1". Jeopardy. Season 27. Episode 23. 2011-02-14.

50. Lenchner, Jon (2011-02-03), Knowing what it knows: selected nuances of Watson's strategy, IBM Research blog (IBM), retrieved 2011-02-16

51. Johnston, Casey (2011-02-15), Jeopardy: IBM's Watson almost sneaks wrong answer by Trebek, Ars Technica, retrieved 2011-02-15

52. a b c d Computer crushes the competition on 'Jeopardy!', Associated Press, 2011-02-15, retrieved 2011-02-19

53. Tesauro, Gerald (2011-02-13), IBM Research: Watson's wagering strategies, IBM Research blog (IBM), retrieved 2011-02-18

54. Staff (2011-02-15), IBM's computer wins 'Jeopardy!' but... Toronto?, CTV.ca, retrieved 2011-02-15, "Watson, IBM's quiz-master computer with the strangely serene voice, beat the humans on "Jeopardy!" tonight. But it got the final question on U.S. cities wrong, answering: Toronto."

55. a b Robertson, Jordan; Borenstein, Seth (2011-02-16), For Watson, Jeopardy! victory was elementary, The Globe and Mail, The Associated Press, retrieved 2011-02-17, "A human would have considered Toronto and discarded it because it is a Canadian city, not a U.S. one, but that's not the type of comparative knowledge Watson has, Prof. Nyberg said."

56. Hamm, Steve (2011-02-15), Watson on Jeopardy! Day Two: The Confusion over an Airport Clue, A Smart Planet Blog, retrieved 2011-02-21

57. Johnston, Casey (2011-02-15), Creators: Watson has no speed advantage as it crushes humans in Jeopardy, Ars Technica, retrieved 2011-02-21

58. Oberman, Mira (2011-02-17), Computer creams human Jeopardy! champions, Vancouver Sun, Agence France-Presse, retrieved 2011-02-17, "But a Final Jeopardy flub prompted one IBM engineer to wear a Toronto Blue Jays jacket to the second day of taping and Trebek to joke that he'd learned Toronto was a U.S. city."

59. Johnston, Casey (2011-02-17), Bug lets humans grab Daily Double as Watson triumphs on Jeopardy, Ars Technica, retrieved 2011-02-21

60. a b Upbin, Bruce (2011-02-17), IBM's Supercomputer Watson Wins It All With $367 Bet, Forbes, retrieved 2011-02-21

61. Oldenburg, Ann (2011-02-17), Ken Jennings: 'My puny brain' did just fine on 'Jeopardy!', USA Today, retrieved 2011-02-21

62. "Show 6088 - The IBM Challenge, Day 2". Jeopardy!. 2011-02-16. Syndicated.

63. World Community Grid to benefit from Jeopardy! competition, World Community Grid, 2011-02-04, retrieved 2011-02-19

64. Jeopardy! And IBM Announce Charities To Benefit From Watson Competition, IBM Corporation, 2011-01-13, retrieved 2011-02-19

65. IBM's Watson supercomputer crowned Jeopardy king, BBC News, 2011-02-17, retrieved 2011-02-17

66. Markoff, John (2011-02-16), Computer Wins on ‘Jeopardy!’: Trivial, It's Not, Yorktown Heights, New York: The New York Times, retrieved 2011-02-17

67. Searle, John (2011-02-23), Watson Doesn't Know It Won on 'Jeopardy!', The Wall Street Journal, retrieved 2011-07-26

68. Lohr, Steve. "Creating AI based on the real thing", Dec 05, 2011 on The New York Times.

69. a b NJ congressman tops 'Jeopardy' computer Watson, Associated Press, 2011-03-02, retrieved 2011-03-02

70. Weber, Robert C. (2011-02-14), Why 'Watson' matters to lawyers, The National Law Journal, retrieved 2011-02-18

71. Nay, Chris (2011-09-06). "Putting Watson to work: Interview with GM of Watson Solutions Manoj Saxena". Smarter Planet Blog. IBM. Retrieved 2013-11-12.

72. Merritt, Rick (2011-02-14), IBM playing Jeopardy with tax dollars, EE Times, retrieved 2011-02-19

73. https://www.internetretailer.com/2013/12/03/ibms-watson-computer-helps-shoppers-new-app

74. http://mobihealthnews.com/27414/with-watson-api-launch-ibm-turns-to-welltok-for-patients-md-buyline-for-docs/

75. http://www.forbes.com/sites/bruceupbin/2013/11/14/ibm-opens-up-watson-as-a-web-service/

76. "IBM's Watson to Join Research Team at Rensselaer | News & Events". News.rpi.edu. 2013-01-30. Retrieved 2013-10-01.

77. The Independent Sector: Cultural, Economic and Social Contributions of New York's 100+, Not-for-Profit Colleges and Universities, Commission on Independent Colleges and Universities, Summer 2013, p. 12, retrieved 2013-10-01

78. http://www.reuters.com/article/2014/02/06/us-ibm-africa-idUSBREA1507H20140206

79. https://www.flickr.com/photos/ibm_media/14359875853/

80. https://www.flickr.com/photos/ibm_media/14153092500/in/photostream/

81. https://www.flickr.com/photos/ibm_media/14152996019/in/photostream/

82. http://www-03.ibm.com/press/us/en/pressrelease/44057.wss

83. "Genesys to Put IBM's Watson to Work".

84. "IBM’s Watson Is Now A Cooking App With Infinite Recipes".

85. a b c "Putting Watson to Work: Watson in Healthcare". IBM. Retrieved 2013-11-11.

86. IBM Watson Helps Fight Cancer with Evidence-Based Diagnosis and Treatment Suggestions, IBM, retrieved 2013-11-12

87. Saxena, Manoj (2013-02-13). "IBM Watson Progress and 2013 Roadmap (Slide 7)". IBM. Retrieved 2013-11-12.

88. Wakeman, Nick (2011-02-17), IBM's Watson heads to medical school, Washington Technology, retrieved 2011-02-19

89. Mathews, Anna Wilde (September 12, 2011). "Wellpoint's New Hire: What is Watson?". The Wall Street Journal.

90. Miliard, Mike (2012-10-30). "Watson Heads to Medical School: Cleveland Clinic, IBM Send Supercomputer to College". Healthcare IT News. Retrieved 2013-11-11.

91. Leske, Nikola (2013-02-09). "Doctors Seek Help on Cancer Treatment from IBM Supercomputer". Reuters. Retrieved 2013-11-11.

92. a b c IBM Watson Group Unveils Cloud-Delivered Watson Services to Transform Industrial R&D, Visualize Big Data Insights and Fuel Analytics Exploration, retrieved January 18, 2014

93. IBM set to expand Watson's reach, The Wall Street Journal, January 9, 2014, p.B2