This post explains some basic concepts of how search engines work and index your book metadata, and the differences between the Amazon and Google search engines.
Coming up with relevant and effective keywords is hard! Keeping them up-to-date and optimized for the number of sales your book is currently making is even harder. Here are common mistakes we see authors make when implementing their keyword strategy on Amazon:
1. Choosing keywords that are too broad
2. Not validating that a keyword is commonly used by customers
3. Choosing keywords without much traffic
4. Not monitoring keyword progress (checking search rank for a book)
5. Leaving keywords unchanged for a month or longer
6. Choosing keywords that are too competitive for their book
7. Repeating terms across keywords
8. Not aligning keyword strategy with external marketing activities (to capitalize on sales rank increases)
9. Not having a keyword strategy!
A human copyeditor is unlikely to be completely displaced by a machine, but a significant portion of common copyedits to manuscripts could be automated. A primitive tool to assist with copyediting already exists (AutoCrit), which suggests changes to text based on readability and other metrics. An advanced tool could be created to capture micro edits across multiple manuscripts, compare these edits, and then automatically apply the changes where confidence in the change is high.
Publishing houses are best placed to create these specialized copyediting knowledge bases. They could start by installing software on editors' machines to capture each line edit and log it to a central database. A copyediting rules engine would then analyze the before and after text changes using part-of-speech (POS) tagging to disambiguate word categories. After collecting enough examples of similar edits, a rule could be learned by the system and applied to similar occurrences in new text. These rules would be saved as templates that understand POS tags. A rules-based library already exists that could easily be adapted to support this system.
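The rule-learning idea above can be sketched in a few lines. This is a toy illustration, not the actual system described: the tiny POS lexicon, the example edits, and the function names are all invented, and a real engine would use a proper tagger (such as spaCy or NLTK) over a database of captured line edits.

```python
# Toy POS lexicon standing in for a real part-of-speech tagger.
TOY_POS = {
    "the": "DET", "a": "DET",
    "report": "NOUN", "memo": "NOUN", "alice": "NOUN", "bob": "NOUN",
    "was": "AUX",
    "written": "VERB", "reviewed": "VERB", "wrote": "VERB",
    "by": "ADP",
}

def tag(tokens):
    """Tag each token with its part of speech (toy dictionary lookup)."""
    return [(t, TOY_POS.get(t.lower(), "X")) for t in tokens]

def learn_template(before, after):
    """Generalize one captured before/after edit into a POS-level template."""
    before_pos = [pos for _, pos in tag(before.split())]
    after_pos = [pos for _, pos in tag(after.split())]
    return before_pos, after_pos

def matches(before_template, text):
    """Check whether a new sentence's POS sequence fits a learned template."""
    return [pos for _, pos in tag(text.split())] == before_template

# One captured edit: a passive-voice sentence rewritten as active voice.
template = learn_template("the report was written by alice",
                          "alice wrote the report")

# A different sentence with the same POS shape triggers the same rule.
print(matches(template[0], "the memo was reviewed by bob"))  # True
```

The key design point is that the system stores POS sequences rather than literal strings, so one observed edit generalizes to any sentence with the same grammatical shape.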
The new copyedit system will undoubtedly suggest suboptimal changes, or multiple text alternatives. In this scenario, a human would verify the change. The system would learn which changes were preferable, under which circumstances, until it has enough knowledge and confidence to apply edits automatically. The review process could be extended to include feedback from book reviewers, to rate the most effective changes.
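The human-in-the-loop verification step could be gated on a simple confidence measure: a rule is only applied automatically once reviewers have approved it often enough. The sketch below is an assumed design, with an invented threshold and rule identifier, to show the shape of such a gate.

```python
from collections import defaultdict

# Assumed cutoff; a real system would tune this against reviewer feedback.
APPROVAL_THRESHOLD = 5

class ReviewLog:
    """Tracks human verdicts on suggested edits, per rule."""

    def __init__(self):
        self.approvals = defaultdict(int)
        self.rejections = defaultdict(int)

    def record(self, rule_id, approved):
        """Log one human decision on a suggested change."""
        if approved:
            self.approvals[rule_id] += 1
        else:
            self.rejections[rule_id] += 1

    def auto_apply(self, rule_id):
        """Apply without review only once approvals clearly dominate."""
        a, r = self.approvals[rule_id], self.rejections[rule_id]
        return a >= APPROVAL_THRESHOLD and a > 2 * r

log = ReviewLog()
for _ in range(6):
    log.record("passive-to-active", approved=True)
log.record("passive-to-active", approved=False)
print(log.auto_apply("passive-to-active"))  # True under these counts
```

Reviewer ratings from published books could feed the same log, weighting approvals by how well the edited text performed.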
It's unlikely the system could turn good writing into great writing, but at the very least, it could learn enough Strunk-like style suggestions to improve poor writing via rule-based templates, for example 'use the active voice', 'omit needless words' and 'put statements in positive form'.
Kadaxis is pleased to announce the beta release of Slush Filter, a tool for literary agents and publishers. Slush Filter accepts fiction manuscripts of 40,000 words or more, in doc, docx, txt and ePub formats, and provides a machine-generated report in seconds. Each report contains:
- A recommendation on whether to review the manuscript (based on potential marketability)
- BISAC Codes
- Comp Titles
- Locations and character names
Please email [email protected] for an unlimited trial license.
If you were a fan of 50 Shades of Grey, you might be interested in these somewhat older, classic tales that our algorithms classified as Fiction / Erotica, purely by analyzing the content. All titles from Project Gutenberg are available as a free download to your eReader or to browse online. Check them out: