
Notorious Kindle Unlimited abuser has been booted from the bookstore

A few levels past the bestsellers and the sci-fi/romance/adventure titles on Kindle Unlimited, in the darkest corners of the Kindle Direct Publishing market, there are books made entirely of garbage, designed to earn scammers hundreds of dollars a day. One user, who called themselves Chance Carter, was one of the biggest abusers of the KDP system and, more important, made over $15 per book they uploaded to the system, over and over, for books that contained no real content.

Carter, according to the Digital Reader, would assemble large novels out of other books. The books, simple hack jobs written by Fiverr writers, ran hundreds of pages and featured, on the first page, a recommendation to flip to the last page to get a free giveaway. KDP pays authors both for paid downloads and for pages read, and it doesn't track reading speed, just the highest page number reached. As a result, Chance's "readers" were each instantly sending them about twenty dollars a read.

The way that the book-stuffing con works is that scammers stuff lots of extra content into an ebook before uploading it to Kindle Unlimited, and then trick readers into jumping to the end of the book.

Thanks to a flaw in the Kindle platform, namely that the platform knows your location in a book but not how many pages you have actually read, the scammers can get paid for a user having “read” a book in Kindle Unlimited by getting the user to jump to the last page.
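The flaw described above can be sketched in a few lines of code. This is purely an illustrative model, not Amazon's actual payout logic: the per-page rate and page counts below are hypothetical (the rate is chosen so a fully "read" stuffed book pays out roughly the $15-per-book figure mentioned earlier), and the function name `payout` is invented for this example.

```python
def payout(furthest_page_reached: int, rate_per_page: float = 0.005) -> float:
    """Toy model of a pages-read payout that only tracks the highest
    page location reached, not the pages actually read."""
    return furthest_page_reached * rate_per_page

# An honest reader who stops at page 50 of a 3,000-page stuffed book:
honest = payout(50)       # 50 * $0.005 = $0.25

# A reader lured by a page-one "free giveaway" link that jumps straight
# to the last page: the platform records all 3,000 pages as read.
scammed = payout(3000)    # 3000 * $0.005 = $15.00
```

Because the platform records only the furthest location, a single tap on the stuffer's giveaway link is worth as much as genuinely reading the whole book.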

So how are clever authors getting around Amazon's new review policies? By giving readers a chance to win some Tiffany jewelry of course. All you have to do is purchase & review Chance's new book on Amazon. This is so ridiculous pic.twitter.com/k3l9Z25jPa

— nikki🐕 (@ease_dropper) May 30, 2018

This sort of KDP scam is actually quite unusual. Amazon has worked to prevent scams like these from siphoning cash out of the KDP "pool" – a multi-million-dollar fund distributed among the best-performing KDP authors – but this one was so long-running and ingenious that it's not surprising it took so long to pull these books from the store. Interestingly, the flip-to-end trick doesn't quite work on newer Kindles but still does on older, non-updated ones, which keeps the scam lucrative.

Chance, for his part, offered free Tiffany jewelry if you flipped to the end of his books. This was, obviously, against KDP rules.

Carter and his books are gone, but book stuffers like him still exist. While it's not a crime per se, the practice muddies the Kindle ebook waters and floods the market with garbage content. Most of us wouldn't fall for these cynical tricks, but plenty of readers will, which makes the scheme a danger to readers and a boon to scammers.


