Shirky says most human thought isn't deduction from syllogisms; it's far fuzzier and more heuristic than that. Doctorow, for his part, points out plenty of good reasons why metadata will be crap in practice.
These ideas bear on programming languages too. There's an impedance mismatch between human concepts and the way software systems tend to classify things: we just don't think as rigidly as computers do. We get frustrated when software goes mindlessly clanking down the wrong path (happily cranking out copies of a virus and emailing them to all our friends, for example), but when we try to build smarts into it, we're frustrated by just how dumb those smarts turn out to be.
Human-computer interfaces, from GUIs to languages to development and testing environments, need to help our squishy minds navigate the bizarre tinkertoy syllogismatrons we're forced to build, at least while we frantically look for ways to teach software to think more like we do.