Contact: cdent@burningchrome.com
Berners-Lee, T., & Fischetti, M. (1999). Chapter 13: Machines and the Web. In _Weaving the Web: The original design and ultimate destiny of the World Wide Web by its inventor_. San Francisco: HarperSanFrancisco.

Discussion of the Semantic Web, a network of information sharing between machines based on the presence of metadata represented in RDF. The network is called semantic because an inference layer will allow entities on the network to reach a consensual understanding. Such understanding is supposed to enhance the generation of knowledge. (A toy sketch of the triples-plus-inference idea follows at the end of this note.)

-=-=-

There are many ways to criticize this chapter (Berners-Lee has a bad case of second-system syndrome; he's up on a soapbox, which is distorting his view somewhat; he equates, to some extent, traversal of a thesaurus with the acquisition of meaning; essentially he suggests that computers can categorize, and I don't think so: they classify), but who wants to? This is such a nice, pretty vision of the future that I'm inclined to support it despite the flaws.

One area where this discussion makes a big win is in its understanding of the power of brute force methods in granting computers some semblance of intelligence. This was recently (2001-11) discussed on the unrev-ii@yahoogroups.com mailing list: IBM has released some systems that do self-diagnosis and self-healing using techniques drawn from the Deep Blue chess system. IBM discovered that the best way to make Deep Blue smart was to give the computer as much information as possible, from which it could do pattern matching. They call this the brute force approach. Unrevvers liken this to the approach used with Cyc. The Semantic Web is the same: using the entire web as the brute information force.
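The chapter's core mechanism, as summarized above, is machine-readable metadata expressed as RDF triples plus an inference layer that derives new statements from them. Below is a minimal, hypothetical sketch of that idea using Python and the rdflib library; the ex: vocabulary, the ex:broader relation, and the example concepts are my own illustrative assumptions, not anything from the book, and the "inference" here is just a toy transitive-closure loop standing in for a real reasoner.

    # Toy sketch: RDF triples plus a trivial inference step.
    # Everything in the ex: namespace is hypothetical.
    from rdflib import Graph, Namespace

    EX = Namespace("http://example.org/vocab#")

    g = Graph()
    g.bind("ex", EX)

    # Metadata as RDF triples: a tiny, SKOS-like "broader concept" chain.
    g.add((EX.poodle, EX.broader, EX.dog))
    g.add((EX.dog, EX.broader, EX.mammal))

    # A toy "inference layer": treat ex:broader as transitive and keep
    # adding entailed triples until nothing new appears.
    changed = True
    while changed:
        changed = False
        for s, _, o in list(g.triples((None, EX.broader, None))):
            for _, _, o2 in list(g.triples((o, EX.broader, None))):
                if (s, EX.broader, o2) not in g:
                    g.add((s, EX.broader, o2))
                    changed = True

    # The machine now holds a statement nobody wrote down directly.
    print((EX.poodle, EX.broader, EX.mammal) in g)  # True

The point of the sketch is the same one the note pokes at: traversing relations like this is classification over explicit links, not understanding, but piled up at web scale it is exactly the brute information force the commentary describes.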