Based on the writing styles of Thomas Hardy, D.H. Lawrence and Herman Melville, physicists have developed a formula to detect the literary “fingerprints” of different authors.
New research describes a concept developed by a group of Swedish physicists at Umeå University. The “meta book” uses the frequency with which authors introduce new words in their work to discern distinct patterns in authors' written styles.
For more than 75 years, George Kingsley Zipf's maxim, based on a carefully selected compilation of American English called the Brown Corpus, suggested a universal pattern for the frequency of new words used by authors.
Zipf's law suggests that the frequency ranking of a word is inversely proportional to its occurrence. New research suggests, however, that the truth behind word frequency is less universal than Zipf asserted, and is linked more with the author's linguistic ability than with any overarching linguistic rule.
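Zipf's law can be checked on any text by ranking words by how often they occur and comparing each frequency with the inverse-rank prediction. The sketch below is an illustration only, not the researchers' method; the function name `zipf_check` and the toy sentence are assumptions for the example.

```python
from collections import Counter

def zipf_check(text):
    """Rank words by frequency and compare against Zipf's 1/rank prediction.

    Under Zipf's law, the word at rank r should occur roughly
    (frequency of the top word) / r times.
    """
    counts = Counter(text.lower().split())
    ranked = counts.most_common()          # [(word, freq), ...] most frequent first
    top_freq = ranked[0][1]
    return [(word, freq, top_freq / rank)
            for rank, (word, freq) in enumerate(ranked, start=1)]

# Toy example: "the" dominates, as a Zipfian distribution predicts.
sample = "the cat sat on the mat and the dog sat on the log"
for word, actual, predicted in zipf_check(sample)[:3]:
    print(word, actual, round(predicted, 1))
```

On real novel-length texts the fit is only approximate, which is exactly the deviation the Umeå study exploits.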
Researchers first found that the occurrence of new words in texts by Hardy, Lawrence and Melville did begin to drop off as each book grew longer, despite new settings and plot twists.
Their evidence also shows, however, that the rate at which unique words drop off varies between authors and, most significantly, is consistent across the entire works of any one of the three authors they analysed.
The statistical analysis was applied to entire novels, sections of novels, complete works and amalgamations of different works by the same author – all had unique word-frequency “fingerprints”.
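The kind of curve the researchers compared can be sketched by tracking how the count of distinct words grows through a running text; the slope of that curve flattens as the book lengthens, and its shape is what differs between authors. This is a minimal illustration under assumed names (`new_word_curve`), not the study's actual statistic.

```python
def new_word_curve(text):
    """Return, after each successive word, how many distinct words
    have appeared so far in the text."""
    seen = set()
    curve = []
    for word in text.lower().split():
        seen.add(word)
        curve.append(len(seen))
    return curve

# Early on nearly every word is new; later the curve flattens
# as the author's vocabulary is progressively exhausted.
print(new_word_curve("call me ishmael some years ago never mind how long"))
```

Comparing such curves from different sections of the same author's work is one simple way to see the consistency the study reports.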
By using the statistical patterns evident from their study, the researchers have pondered the idea of a meta book – a code for each author which could represent their entire work, completed or in the mental pipeline, says an Umeå University release.
“These findings lead us towards the meta book concept – the writing of a text can be described by a process where the author pulls a piece of text out of a large mother book (the meta book) and puts it down on paper,” write the study authors.