SpireTech Labs

Jun 2, 2012

At SpireTech, we spend a lot of time evaluating new technology. For every product or service we offer, there are probably ten that don’t make the cut. We thought it would be a good idea to start sharing what we’ve learned – the good and the bad. Even though this information might help our competitors, it’s also useful to the tech and business community at large – and especially to our clients. So, taking a cue from Google Labs, here we go…

First, we thought it might be helpful to share how we go about reviewing a new product. We start with the manufacturer’s website. Of course, the software has to solve a problem for us or our clients. If pricing is published, we’ll make sure the product is priced competitively – at a price our customers will actually pay. If pricing isn’t published and we have to “talk to someone”, the review will sometimes stop right there – unless the solution is particularly compelling. We’ll look at hardware, software, infrastructure, and other technical requirements. We review the manuals if they’re available. Sometimes we’ll call support, just to see how responsive they are. At this point, we’ve usually eliminated half of the contenders. If all of this passes muster, we’ll move on to a technical and business evaluation.

Usually we’re evaluating software, so if a downloadable trial is offered, we’ll install it into a test environment. Some products require a multi-server setup. SpireTech maintains a lab built on virtual servers, so our testing typically involves spinning up an appropriate set of virtual machines and installing the software under evaluation.
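For readers curious what that looks like in practice, here is a minimal sketch of the throwaway-environment idea. The post doesn’t name a specific virtualization platform, so this example assumes a libvirt-based lab with the standard virt-clone and virsh command-line tools; the image and VM names are hypothetical.

    # Illustrative only: clone a disposable test VM from a base image for an evaluation.
    # Assumes a libvirt-based lab with virt-clone/virsh installed; names are hypothetical.
    import subprocess

    BASE_VM = "eval-base"   # hypothetical "golden" lab image

    def create_test_vm(name):
        # Clone the base image and boot it so the trial software can be installed.
        subprocess.check_call(["virt-clone", "--original", BASE_VM, "--name", name, "--auto-clone"])
        subprocess.check_call(["virsh", "start", name])

    def destroy_test_vm(name):
        # Tear the environment down once the evaluation is finished.
        subprocess.call(["virsh", "destroy", name])   # ignore failure if already powered off
        subprocess.check_call(["virsh", "undefine", name, "--remove-all-storage"])

    create_test_vm("eval-demo-01")
    # ... install and exercise the trial software, then: destroy_test_vm("eval-demo-01")

The point is simply that each evaluation gets a clean, disposable environment, so a bad install can’t pollute the next test.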

Oftentimes, the sales promise doesn’t materialize into actual results. Sometimes an important feature turns out to be buggy or incomplete. Other times, the software isn’t intuitive or easy to configure, or has limitations that only come to light after extensive testing. I’m sure all of you have encountered that – particularly with new software.

We have to be particularly careful about negative reviews – unfortunately, in today’s litigious society, anything negative we say has to be based purely on fact (and not opinion). We’ll only write about limitations we’ve experienced firsthand, or have worked with technical support to try to resolve. Also, we probably won’t write about products that didn’t make the cut – rather, we’ll write about what *almost* made the cut. And we’ll especially write about the exceptional software that cleared every hurdle and ended up being a quality product we choose to offer.

So, without further ado – please proceed to our first evaluation: OnApp.