Startup Review Signal Mines Twitter for Reliable Web Hosting Reviews


In a TechCrunch post this morning, Klint Finley took a look at a new startup, Review Signal, which aims to clean up the “cesspool” of web hosting reviews with software that tracks discussion of a given hosting company on Twitter and shares the aggregated data as a percentage rating on a positive-to-negative scale.

According to TechCrunch, the service was started by Kevin Ohashi, a former moderator at the online forum Web Hosting Talk, who had first-hand experience with the disorganized, questionable, even unreliable state of online reviews of web hosting services.

The reliability of web hosting reviews has been in question for as long as the reviews themselves have existed online. Honest reviews (most of those posted on Web Hosting Talk are a good example) are generally first-hand accounts, very often motivated by bad experiences, while many formal web hosting review sites seem motivated by affiliate links or other profit considerations.

The overall sketchiness of web hosting reviews is an issue we’ve faced directly at the WHIR, and it’s the reason we’ve chosen not to offer anything like reviews on our site, or to recognize positive “reviews” in our reporting. A similar problem seems to exist in the world of web hosting “awards.”

Ohashi seeks to pull back the curtain on the methodology behind the reviews with Review Signal, a project that grew out of his master’s thesis on sentiment analysis. The service scans Twitter for posts about a given web host in order to build a picture of how satisfied its customers are.

Because sentiment analysis is an imprecise science with a pretty significant margin for error, Ohashi tells TechCrunch, the tool throws out those tweets that it can’t definitively identify as either positive or negative. The project also requires a lot of effort to fight against the ability of spammers to game the results, meaning Ohashi spends time looking over the tweets that are analyzed and updating the system to spot bad results.
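
As a rough illustration of that kind of filtering, here is a minimal sketch in Python. It is not Review Signal’s actual code; the keyword lists, names and decision rule are hypothetical stand-ins for whatever classifier the service really uses.

    import re

    # Hypothetical sketch: label tweets positive or negative, and throw out
    # anything that can't be definitively identified as either.
    POSITIVE_WORDS = {"great", "fast", "reliable", "love", "recommend"}
    NEGATIVE_WORDS = {"down", "slow", "awful", "hate", "outage"}

    def classify(tweet):
        """Return 'positive', 'negative', or None when sentiment is ambiguous."""
        words = set(re.findall(r"[a-z]+", tweet.lower()))
        pos = len(words & POSITIVE_WORDS)
        neg = len(words & NEGATIVE_WORDS)
        if pos > neg:
            return "positive"
        if neg > pos:
            return "negative"
        return None  # can't definitively identify: thrown out, not guessed at

    tweets = [
        "My host has been fast and reliable, would recommend",
        "Site is down again and support was awful",
        "Just moved my blog to a new host",  # no clear sentiment
    ]
    results = [(t, classify(t)) for t in tweets]
    kept = [(t, label) for t, label in results if label is not None]
    print(kept)  # the third tweet is discarded rather than miscounted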

While the results may be “honest,” in the sense that they aren’t being created or massaged by the site, they may correlate less with the quality of a particular hosting service than with the effectiveness of that company’s social media efforts. Judging by the overall ratings on the site, the ratings in general seem to be low, with hosting companies just outside the top 10 barely breaking the 50 percent positive mark. That is probably because most people use Twitter to complain about the services they consume more than to applaud them.

The Review Signal website includes a “how it works” section that explains in reasonable detail how positive tweets are distinguished from negative ones, and how tweets about a business’s services are distinguished from other, “irrelevant” tweets mentioning the business.

The site is monetized through affiliate links to the companies listed in its rankings, which would seem to raise the same reliability issues and conflicts of interest that plague other web hosting review sites. But Finley writes that Ohashi’s plan for proving the site’s reliability rests on transparency: Review Signal lets users view the tweets being evaluated by its systems. The fact that Linode, which doesn’t have an affiliate program, sits near the top of the rankings would also seem to support its claims of impartiality.

According to Finley’s TechCrunch post, Ohashi isn’t interested in developing the site into a brand monitoring service, but he does plan to expand into hosting-related services such as domains and email.

Talk back: Do you think there’s a need for an unbiased web hosting reviews site out there? Do you think the formula for Review Signal meets that need? Let us know in the comments.


3 Comments

  1. Liam, thank you for the thoughtful write-up about my startup. Mr. Finley addressed the issue of bias pretty fairly, I thought. Yes, there are some affiliate links on the site. Does that taint the overall rankings? I don't think so, and I am trying to convince users of that. Everyone is welcome to investigate each source and help improve our spam and sentiment analysis by reporting reviews. As for the rating algorithm, I would love to see what sort of ideas your readers have about the issue of bias in reviews, and any specific comments about Review Signal's implementation. I am always looking for feedback on how to improve.

    • Post author:

      Hey Kevin, thanks for commenting. I hope what I wrote doesn't come across as suggesting that the affiliate links taint the data you're presenting. I think affiliate links are actually probably the revenue-generating option least disruptive to your users' decision making (even in comparison to paid advertisements), and the assumption (by users) of bias around "web hosting reviews" in general is just something you have to overcome, which you obviously understand based on your comment. The thing I'd really like to see dissected a bit is what, specifically, the metric you produce by measuring the positivity of aggregated tweets describes. Is it "customer satisfaction," or something a little more abstract?

      • I agree with you on affiliate links. I don't think they are intrusive if used properly, but you can't let those links influence the rankings in any way.

        You ask a fantastic question about what it is that Review Signal's ratings actually capture. I think they fit somewhere between customer satisfaction and net promoter score. Our algorithm is fairly simple: #UniquePositive / #UniqueTotal, where #UniqueTotal = #UniquePositive + #UniqueNegative. If I had to put that in words, I think the closest I could get is 'average experience.' We're trying to measure something that's generally intangible (happiness? satisfaction?) by comparing how many people express positive opinions versus how many express negative ones.

        There is definitely some nuance, and not all positive and negative expressions are created equal. Another question is whether each person's opinion should count equally. For the time being, we're treating every opinion and every person the same, but we're not against exploring weighting systems if that makes the results better (it could play into spam detection as well).

        Ultimately, we have one goal: giving honest and accurate reviews to help people make more informed decisions. Today we took our first step down that (never-ending?) road, and we're almost certainly going to have to adjust to improve. I've tried to explain what we're capturing now, but I can't guarantee it's the best metric. I think it's a better metric than what is currently available; the market/internet/world will decide if that's true. If someone came up with a better algorithm that measured customer satisfaction/experience and it could be implemented, I would be all for implementing it on Review Signal.
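
        In code, the rating described above works out to something like the following illustrative sketch; the function name and the zero-mention fallback are assumptions, not Review Signal's implementation.

            # Rating from the comment above: unique positive mentions divided
            # by total unique classified mentions (positive + negative).
            def rating(unique_positive, unique_negative):
                unique_total = unique_positive + unique_negative
                if unique_total == 0:
                    return None  # no classified mentions, no rating (assumption)
                return unique_positive / unique_total

            # e.g. 140 unique positive and 110 unique negative tweets:
            print("{:.0%}".format(rating(140, 110)))  # prints 56%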
