Computing Reviews

Cover Quote: June 2000

Bayes’ theorem chains probabilities, maximizing the amount of learned information brought to bear on a problem, and is especially well suited to predicting the outcome of situations where a mass of conflicting or overlapping influences converge on the result.

...[I]magine a short-order cook working in a busy café. In a din of conversation, clattering plates, and street noise, the waiters dash by the kitchen counter, shouting orders... A Bayesian decision process would allow the beleaguered chef to send out as many correct orders as possible.

He has some prior knowledge: Few customers order the smothered duck feet, while the steak frites is a perennial favorite. The cook can use information like this to calculate the prior probability that a given dish was ordered... He also knows that one waiter tends to mispronounce any dish with a French name, that a bus roars past the restaurant every 10 minutes, and that the sizzling sounds from the skillets can obscure the difference between, say, ham and lamb. Taking these factors into account, he creates a complicated model of ways the orders might get confused. ... Bayes’ theorem allows the chef to chain the probabilities of all possible influences on what he hears to calculate the probability ... that the order he heard was for the dish the customer chose.
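The chef's reasoning can be sketched as a single application of Bayes' theorem. The dishes, priors, and likelihoods below are invented for illustration; they are not from the quoted article, but they follow its scenario of "ham" versus "lamb" heard over the din:

```python
# Priors: how often each dish is ordered (hypothetical numbers).
priors = {
    "ham": 0.30,
    "lamb": 0.10,
    "steak frites": 0.55,
    "smothered duck feet": 0.05,
}

# Likelihoods: P(chef hears "lamb" | dish actually ordered), modeling
# the noisy kitchen where "ham" and "lamb" sound alike (hypothetical).
likelihood_heard_lamb = {
    "ham": 0.40,
    "lamb": 0.90,
    "steak frites": 0.01,
    "smothered duck feet": 0.01,
}

# Bayes' theorem: P(dish | heard "lamb") is proportional to
# P(heard "lamb" | dish) * P(dish); normalize so posteriors sum to 1.
unnormalized = {d: likelihood_heard_lamb[d] * priors[d] for d in priors}
total = sum(unnormalized.values())
posterior = {d: p / total for d, p in unnormalized.items()}

# The chef sends out the dish with the highest posterior probability.
best_guess = max(posterior, key=posterior.get)
```

With these numbers the prior does real work: even though "lamb" matches the sound better, ham is ordered so much more often that the posterior favors ham, which is exactly the kind of correction the quote describes.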



- Steve Silberman
The Quest for Meaning, 2000