Great fiction and poetry almost always observe the particular in ways that reveal something beyond the particular. It matters that the doubloon that Captain Ahab nailed to the mast of a ship stinking of whale oil and death “was of purest, virgin gold, raked somewhere out of the heart of gorgeous hills, whence, east and west, over golden sands, the head-waters of many a Pactolus flows.” Hamlet’s telling us that Yorick was not just a merry jester but that “he hath borne me on his back a thousand times” gives us a vivid picture of their relationship. If, when we first met Elizabeth Bennet in Pride and Prejudice, she “had been obliged, by the scarcity of gentlemen, to sit down” not for two dances but for one or for three, she might have seemed full of herself or desperate.
Rehabilitation of the particular
Against the importance of the particulars has stood the construction of modern science. (It’s disturbing that these days I have to assure anyone that I am an enthusiastic supporter of real, evidence-based science, but given the times, such avowals are necessary.) It’s not a criticism of science to say that it pays great attention to particulars only in pursuit of the general principles and rules that can be abstracted from them. There is great power, truth, and beauty in this.
At the same time, I think we are beginning to see outside of literature a rehabilitation of the particular. We are acknowledging what we have always known: The particular so overwhelms our ability to apply general principles that those principles’ predictive power is more often theoretical than usable. Yet we have a history of treating the general principles as what’s really real, what’s truly true, for they stay the same as the particulars change. At least since Plato we’ve assumed that permanence is a sign of reality. That assumption may itself be changing.
If so, it is to some significant extent because we are getting used to the idea that our new technology, machine learning in particular, lets us make better predictions than ever by attending to particulars without insisting on reducing them to the generalities under which they can be clustered. Traditional computing upholds the accepted method of applying general rules to particulars: You provide the computer with a model of your domain, a generalized abstraction, and then feed in the data representing the particulars. The computer applies the rules and presents the output. With machine learning, you skip the generalized abstraction and just feed in the particulars. The system looks for relationships among them and builds its own model. That model is generalized in that you use it to make probabilistic predictions from the new data you give it, but machine learning models can be so intricate, with so many factors and relationships, that they represent a world composed of tiny interdependencies rather than one governed by a relative handful of principles. This focuses our attention on the particulars even when we are engaged in something as general as scientific prediction.
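To make the contrast concrete, here is a minimal sketch in Python. The hand-written rule, the toy rainfall and river-level numbers, and the choice of a simple scikit-learn linear model are all illustrative assumptions, not anything drawn from a real system: in the first approach we hand the computer our generalized abstraction; in the second we hand it only particulars and let it derive a model of its own.

```python
# A minimal, illustrative sketch of the two approaches described above.
# The "rule," the toy rainfall/river-level numbers, and the choice of a
# linear model are hypothetical assumptions for demonstration only.
from sklearn.linear_model import LinearRegression

# Traditional computing: we supply the generalized abstraction ourselves,
# as an explicit rule, and then apply it to a particular case.
def rule_based_prediction(hours_of_rain: float) -> float:
    # Hand-written model: assume each hour of rain raises the river 2 cm.
    return 2.0 * hours_of_rain

# Machine learning: we supply only the particulars (observations) and let
# the system find relationships among them and build its own model.
hours_observed = [[1.0], [2.0], [3.0], [5.0]]   # hours of rain
levels_observed = [2.1, 4.3, 5.8, 10.2]         # measured river rise, in cm

learned_model = LinearRegression().fit(hours_observed, levels_observed)

print(rule_based_prediction(4.0))           # prediction from the stated rule
print(learned_model.predict([[4.0]])[0])    # prediction from the learned model
```

A real machine learning model, of course, replaces the single coefficient learned here with thousands or millions of interdependent factors, which is exactly why its view of the world stays closer to the particulars than a handful of stated principles ever could.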