From the Tech Desk
The Bestseller Code
The big news in publishing industry technology this month is the release of The Bestseller Code: Anatomy of the Bestselling Novel. The book, written by Jodie Archer and Matthew L. Jockers and published on September 20th, looks at what transforms a book from simple words on a page into a global phenomenon. Archer, who used to be an acquisitions editor at Penguin UK, was inspired to look into this particular subject after Dan Brown's The Da Vinci Code became an expectations-smashing blockbuster in the middle of the last decade. The book was the kind of miracle that every acquisitions editor wants to buy and every publisher hopes to break to the public. But what made it that kind of book? Working with Jockers, a professor with an interest in text analysis, Archer set out to find the answer.
The result is The Bestseller Code. The book conveys the big data revelations that Archer and Jockers uncovered during their studies. To create the "code," the two fed thousands of books into a text analysis program. They were looking for commonalities shared by the blockbusters and bestsellers in their stack of some 20,000 contemporary novels. What did those titles have that the other books featured in their study didn't? What about them lit the world on fire? It couldn't all just be about extremely talented marketing teams, could it?
You'll have to read The Bestseller Code to discover all of the criteria that supposedly mark a book as a bestseller in waiting. Archer and Jockers came up with about 2,800 qualities that many of the bestselling books in the pack had in common. These qualities run quite the gamut, from ideal protagonists to preferred sentence structure and language use, and from fluidity of themes and conflicts to the ways in which readers want to see characters connect. For the latter category, "human closeness" seems to be in vogue; sex, contrary to popular belief, does not sell.
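For readers curious what this kind of commonality-hunting looks like under the hood, here is a deliberately toy sketch. It is not Archer and Jockers's actual program; the feature names, sample passages, and the two crude stylistic measures are invented purely for illustration of the general idea: extract measurable features from each text, then compare the averages across groups of books.

```python
# Toy illustration of text-analysis feature comparison.
# NOT the authors' method; features and sample texts are invented.
import re

def extract_features(text):
    """Return a dict of crude stylistic features for one passage of text."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[a-z']+", text.lower())
    return {
        # Shorter sentences are one (hypothetical) marker of "punchy" prose.
        "avg_sentence_len": len(words) / max(len(sentences), 1),
        # Share of contracted words, a rough proxy for informal voice.
        "contraction_rate": sum("'" in w for w in words) / max(len(words), 1),
    }

def group_average(texts, feature):
    """Average one feature across a group of texts (e.g. 'bestsellers')."""
    values = [extract_features(t)[feature] for t in texts]
    return sum(values) / len(values)

# Two made-up one-line "novels" standing in for entire manuscript groups.
bestsellers = ["She ran. He followed. The door slammed shut."]
others = ["It was, in the considered judgment of all assembled, a decision "
          "of remarkable and perhaps unprecedented complexity."]

print(group_average(bestsellers, "avg_sentence_len"))
print(group_average(others, "avg_sentence_len"))
```

The real study presumably measured thousands of far subtler signals across 20,000 full novels, but the basic shape — features in, group comparisons out — is the same.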
Archer and Jockers have funneled all of these big data revelations into an algorithm designed to take some of the guesswork out of finding bestsellers. Theoretically, acquiring editors could start feeding manuscripts through the algorithm today and it would tell them whether or not they were looking at the Harry Potter of tomorrow. (Though, let's be honest: there will never be another Harry Potter.) Alternatively, writers could use the algorithm to check their own works for blockbuster potential, or could internalize the 2,800 tenets of blockbuster novels and consciously implement them in their own works.
In terms of pure theoretical conversation, this all sounds fascinating. Looking at what characteristics seemingly unrelated books like The Da Vinci Code and The Hunger Games might have in common is a fun and thought-provoking exercise. Even beyond simple theory, it's easy to see how this "bestseller code" algorithm might help publishers find the hits (or at least the best bets) in their stacks of submissions. In turn, the code might help publishers make more money and use those funds to drive the industry into a more prosperous place. In short, this big data algorithm could be the powerful tool that the entire publishing industry has been waiting for.
Yes, more bestsellers and bigger profits sound great. But let your mind wander a few steps beyond money and this algorithm starts to look like one of the biggest threats that publishing has ever faced. Already, publishing acquisitions are driven by too many factors other than the caliber of the writing and the quality of the story. Everything from sales figures for comparable titles to the author's social media platform is taken into consideration by acquiring teams at today's publishing houses. Are we really ready to enter an era where we give up the human factor entirely and let an algorithm tell us which books are worthy of publication?
If you enjoy reading the same formulaic book over and over again, then the answer to this question might be yes. If you prefer bonding with different types of protagonists, supporting authors with unique voices and innovative writing styles, or reading a novel that surprises and beguiles you by playing against all the trends, then the answer is no. We need publishers to be willing to take a chance on novels that might not light the world on fire, but that might also say something unique about the human condition to a still-broad base of readers. An algorithm that only looks for the next Twilight or Gone Girl might find a few more bestsellers, but it will also erode whatever appetite for risk-taking publishers still have.
So read The Bestseller Code if you must. Certainly, Archer and Jockers have put a lot of work into arriving at their findings, and those findings do have value. But big data can't give you a book that changes your life and captivates your imagination: only a living, breathing human being can do that.
Craig Manning is currently studying English and Music at Western Michigan University. In addition to writing for IndependentPublisher.com, he maintains a pair of entertainment blogs, interns at the Traverse City Business News, and writes for Rockfreaks.net and his college newspaper. He welcomes comments or questions concerning his articles via email, at firstname.lastname@example.org.