Friday, July 19, 2013

Deceit on the Internet: Shopping Reviews

We were intrigued by a comment from Anon on our July 15, 2013 entry, Chummie: Strong Deceit by Perversion. Anon referenced a blog, http://stopdeceit.blogspot.com/, titled Stop Deceit on the Internet. The author(s), writing under the name “Stop Deceit,” give a decent description of deceit on the internet and identify three basic sources of it:
1. Manufacturers’ Web Sites,
2. Product Blogs, and
3. Shopping Reviews.
We strongly recommend that readers look at http://stopdeceit.blogspot.com/.

On this blog, we have been systematically addressing deceit on the web sites of bedwetting alarm manufacturers. That left us curious about how much deceit there may be with respect to bedwetting alarms in the other two sources. Since we are following the bedwetting alarm industry, we decided to look at bedwetting-alarm-related deceit in product blogs and shopping reviews.


Shopping Reviews:

We chose Amazon.com for our basic analysis, as Amazon is undoubtedly the biggest shopping site (even for bedwetting alarms), so we could expect the largest number of buyer reviews. Furthermore, Amazon provides more information about each reviewer and that reviewer’s history, which let us analyze the reviewers more closely. To get a sufficiently large sample, we decided to look only at “wired” alarms currently being sold on Amazon, so that their reviews would be available to us, and to consider only alarms with more than fifty reviews. Even then, to limit the time spent on this task, we selected four alarms that we felt adequately cover the different styles and prices available.

Our intent was to get a plausible idea of the extent to which the reviews were unduly biased, and whether the bias ran in favor of or against a particular alarm. In other words, we were attempting to identify the extent to which reviews might have been provided by shills or touts of the manufacturer (which would unduly favor the item) or by competitors (which would unduly bad-mouth it). We then came up with a set of criteria to apply to each and every review of these products on Amazon. We do want to point out that the criteria are subjective and are not necessarily perfect at identifying touts or shills, but we feel that the reviews they flag are much more likely to be unduly biased than to be honest reviews by honest, actual buyers.
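To make the screening idea concrete, here is a minimal sketch, in Python, of how a set of criteria might be applied to a list of reviews. The specific heuristics shown (single-review accounts, an extreme star rating paired with a very short text, no verified purchase) and the two-flag threshold are illustrative assumptions for this sketch only, not the actual criteria we applied.

from dataclasses import dataclass

@dataclass
class Review:
    stars: int                    # 1-5 star rating
    text: str                     # body of the review
    reviewer_review_count: int    # how many reviews this reviewer has posted
    verified_purchase: bool       # whether Amazon marks it "Verified Purchase"

def looks_biased(review: Review) -> bool:
    # Flag a review as possibly biased if it trips two or more heuristics.
    flags = 0
    if review.reviewer_review_count <= 1:
        flags += 1    # single-purpose account
    if review.stars in (1, 5) and len(review.text.split()) < 20:
        flags += 1    # extreme rating with almost no substance
    if not review.verified_purchase:
        flags += 1    # no evidence of an actual purchase
    return flags >= 2

def biased_share(reviews):
    # Percentage of reviews flagged as possibly biased, to the nearest percent.
    flagged = sum(looks_biased(r) for r in reviews)
    return round(100 * flagged / len(reviews)) if reviews else 0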

The following table shows, for each product, the share of reviews we judged possibly biased, broken down into positive bias (perhaps shills for the manufacturer/seller) and negative bias (perhaps shills for competitors). All figures are percentages of that product’s total reviews, rounded to the nearest whole percent, so the positive and negative columns add up to the overall figure (a short sketch of this arithmetic follows the table):

Product      Possibly Biased      Positive Bias      Negative Bias
Product A          34%                 32%                 2%
Product B          31%                 25%                 6%
Product C          11%                  9%                 2%
Product D           2%                  2%                 0%
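For readers who want to see exactly how the three numbers in each row relate, here is a small sketch of the arithmetic. The review counts below are made-up placeholders chosen only to reproduce the shape of the Product A row; they are not our actual tallies.

def breakdown(total_reviews, positive_biased, negative_biased):
    # Each figure is a share of ALL reviews for the product, rounded to the
    # nearest whole percent, so positive + negative is the overall biased share.
    pct = lambda n: round(100 * n / total_reviews)
    return (pct(positive_biased + negative_biased),   # overall possibly biased
            pct(positive_biased),                     # positive bias
            pct(negative_biased))                     # negative bias

# Placeholder counts (NOT real data): 200 reviews, 64 flagged as positively
# biased, 4 as negatively biased.
print(breakdown(200, 64, 4))   # -> (34, 32, 2)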
 
We must confess that the high percentage of bias we judged to be present for Product A came as no surprise to us. We also see that the vast majority of those reviews are positively biased, perhaps unduly raising the "ratings" for Product A. We must assume that the manufacturer is either desperate or unconcerned about using deceitful reviews to raise the ratings of its product. So the “ratings” you see on Amazon for this product may well be skewed high and wrong.
 
Product B was the newest of the four products examined, and we were somewhat surprised that the percentage of possibly fake reviews was so high. The primary difference from Product A is that Product B has not flooded Amazon with possibly biased positive reviews to the same degree, so its proportion of positive to negative biased reviews is lower than Product A’s. Nevertheless, Product B also appears to be overindulging in planting biased reviews on Amazon.
 
Product C is an established product. Although the percentage of possibly fake reviews was lower, we still considered it to be high. The ratio of positive to negative biased reviews is about the same as for Product B.
 
Product D is also an established product. Here, our surprise was that the bias was so low compared to the other products. Without mentioning names, we must commend the manufacturer of Product D: we noticed little evidence of shills and touts in the reviews of this product.
 
In conclusion, we must agree with Stop Deceit: even the “best” of shopping sites, applying its own criteria for weeding out fake reviews, was unable to keep a substantial number of possibly fake reviews off Amazon.com. For Product A and Product B, we estimate that about one-third of the reviews could be fake, which is a huge fraction of the total. Consequently, do not take these reviews as “the honest truth.” Take them with a grain of salt, or even a shovelful.
