Disclosure
Disclosure: I am compensated for the endorsements, reviews, rankings etc. presented here. Click here for details.
Support HostPeek
If you like HostPeek.com, please link to it, or recommend it to other people. Thank you!
While it actively monitored hosts, HostPeek.com was hosted on a high-quality VPS from LiquidWeb.
FAQs
Written by Daniel Lemnaru
1. What are the criteria for choosing the hosts to monitor?

Several factors are taken into account. One is the company's prominence. To estimate this, I usually use Google Trends and the Google AdWords Keyword Tool to see how often people search for a company's name relative to other hosting companies. There is also the list at Webhosting.info which, skewed as it might be, can help give a vague idea of who the biggest web hosts of the moment are. I use it as a starting point of sorts, but not for actual weighting. At this point you might be asking yourself why I take this approach. Well, I figured that if there were a popular vote where people could choose which hosts they wanted to read about, these would probably be among the ones getting the most votes.

The other factor may sound bad, but it's real and, from my point of view at least, fair. It takes effort to create and run a site. It also takes money, even more so for a website of this kind, where I actually pay for all these web hosting services in order to get the same service a regular customer does, unlike many (most?) other "independent" web hosting reviewers who only claim to test companies and really just rank them in a way that maximizes profit. Often they don't test them at all, and their so-called reviews are a mere copy-and-paste job from the host's website. In most cases, there's no proof that they ever tested the host. Anyway, back to HostPeek.com: I don't charge visitors for content, and I don't let the site be overrun with advertisements, but someone still has to pay the bill. I figure it should be the hosts themselves who take care of this, since they get benefits in the form of potential customers.
Because I can't force a host to pay me, they must be running a so-called "affiliate program", where I get a commission on sales that originate from my site (that is, people actually click a link on my site to reach the host's website). That gives me a chance to recoup my costs (the biggest of which is arguably my time) and hopefully make a profit too. Visitors get the chance to reach an informed decision, a host might get a customer, and I might get some cash. Everybody can win.

2. Can't you make some exceptions if we're talking about some really great hosts?

To put it bluntly, no. But not without reason. Experience has shown me that hosts give nothing back. I've promoted a number of good hosts for years, and in many cases I've never seen so much as a "Thank you!", although I have surely sent a good number of customers their way, as well as traffic worth thousands of dollars in advertising costs, had they paid for it. That isn't fair: not to me, and not to the few hosts that do have affiliate programs and have paid me back for my efforts. It isn't fair to let some get a free ride and have others pay for it. As such, on HostPeek, they will all be indirectly charged. Of course, for all that to work as planned, you will have to click my affiliate link to visit the host before ordering. Please-please-pleeease!

3. What makes HostPeek different from other hosting review sites?

The main difference is that it puts much more emphasis on measurements, and very little on customers' or "independent reviewer" (read: my) comments. Don't get me wrong, customer reviews are wonderful and can be of great help in finding good hosts, but they do have their imperfections. One is subjectivity. Reviews usually end up expressed in phrases like "my site loads really fast", "I've noticed no downtime" and "support is fast and knowledgeable". Without actual values behind those claims, you can't really compare one host to another.
Yes, "good enough for others" usually implies "good enough for me too", but not always. Many users don't monitor uptime, or monitor it with such infrequent checks that numerous short downtimes escape detection. Their "uptime is great" reports are therefore of limited value. Usually, "great uptime" is just their way of saying "I've noticed little or no downtime myself". That's better than the opposite ("I've noticed so much downtime I can't recall how many occurrences there were!"), but it's of little real use or accuracy if the person only visits their site once or twice a day.

Support quality descriptions tend to be even vaguer, which is why I will refrain from them. With the exception of response times, which could probably be reported accurately, support quality grading depends largely on the customer's level of knowledge. When you know little about a subject, anyone who knows more may sound like a genius. There are also "people skills" involved in support, which can skew perception significantly. More than once, a host has been perceived as cold or downright rude by some, while others praised it for its concise, to-the-point approach. Still, customer reviews are your best bet for judging support, and HostPeek.com will not go into much detail, if any, in this area. Do feel free to send me an email, though, if you want a personal opinion on a host's customer support as I've experienced it.

4. You mentioned some inherent imperfections in HostPeek's approach. Can you expand on that?

Yes, I can name and shame them, if that's what you want. Other imperfections are measurement related. The monitoring is done from one server, in one location (this does not apply to uptime monitoring, which is done by a specialized service using multiple servers in different locations around the globe). If it fails to connect to the monitored account, that will affect the records as you see them.
If the testing script fails to run or errors out, it will do the same. But that doesn't necessarily mean the monitored applications have failed as well. Networks can have local issues, and the monitoring script may respond in unexpected ways to changes in its local server environment, while the actual applications (read: customers' websites) remain unaffected and perfectly reachable from just about anywhere. Any downtime of the monitoring server itself will result in a generalized gap in the results charts, which should serve as a tell-tale sign that there were actually no issues on the monitored servers. Coupled with the separate, independent uptime monitoring I mentioned, you should be able to reach the right conclusion easily.

Support-related measurements are a missing feature, at least for now. Many in the industry consider that hosting is fast becoming a commodity, where the real difference is increasingly defined by the level and quality of customer support rather than package allocations, uptime, server performance and so on, but I see a lot of problems when it comes to measuring that accurately. That's probably the beauty of it, from a marketing point of view: you can never run out of arguments to sustain the idea that your company's customer support is better.

Another shortcoming has to do with stress testing: measuring how many visitors your site/application could handle in a given period of time without the system collapsing or the host taking action against your account (a suspension or even termination). I obviously can't do that, as it would get my account suspended, which would affect all the other reports. A possible future solution would be to get an additional account just to run such tests, although there would be significant moral issues attached. Is it right to deliberately risk affecting the uptime and performance of tens or hundreds of customers' sites? Does the end justify the means? Food for thought, I guess.
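The point above about infrequent checks missing short outages can be made concrete with a small simulation. This is purely my own illustration (the intervals, function names, and numbers are hypothetical, not anything HostPeek actually runs): a visitor who effectively "checks" their site once an hour will catch a 5-minute outage only about 1 time in 12.

```python
import math
import random

def detected(outage_start, outage_len, check_interval):
    """True if at least one periodic check (at t = 0, interval, 2*interval, ...)
    falls inside the outage window [outage_start, outage_start + outage_len)."""
    first_check = math.ceil(outage_start / check_interval) * check_interval
    return first_check < outage_start + outage_len

def detection_rate(outage_len, check_interval, trials=100_000, seed=42):
    """Fraction of randomly placed outages that the checks catch.
    Analytically this tends to min(1, outage_len / check_interval)."""
    rng = random.Random(seed)
    hits = sum(
        detected(rng.uniform(0, check_interval), outage_len, check_interval)
        for _ in range(trials)
    )
    return hits / trials

# A 5-minute outage vs. hourly checks is caught only ~8% of the time.
print(detection_rate(5, 60))   # roughly 0.083
```

This is why a dedicated monitor polling every minute and a customer who loads their site twice a day can honestly report very different "uptime" for the same host.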
Last Updated on Sunday, 07 August 2011 21:11