Methodology

How we research and score SaaS tools

Every SmartStack Guide review follows the same six-step process. No shortcuts, no vendor influence, no vague claims.

01

Tool selection

We prioritize tools with significant user bases, active development, and genuine relevance to small businesses with fewer than 50 employees. We don't review enterprise-only products or tools that require a sales call to get pricing.

02

Independent research

Every review starts with a free trial or public demo — not a vendor briefing. We test the actual signup flow, onboarding, core features, and support response time. No review is written from a press kit.

03

Pricing verification

We manually check public pricing pages and note exactly what each plan includes. If pricing is hidden behind a sales call, we say so. All prices reflect what a small business actually pays, not enterprise-negotiated rates.

04

Scoring criteria

Tools are scored on five criteria: ease of setup, feature depth for small teams, value for money, customer support quality, and integration ecosystem. Scores are not influenced by affiliate relationships; a tool we earn from can still receive a low score.
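As a rough sketch of how the five criteria roll up into one score: the equal weighting and 1-5 scale below are illustrative assumptions, not our published rubric.

```python
# Hypothetical scoring sketch: five criteria, each rated 1-5,
# combined with equal weights. Actual weighting is editorial.
CRITERIA = [
    "ease_of_setup",
    "feature_depth",
    "value_for_money",
    "support_quality",
    "integrations",
]

def overall_score(ratings: dict[str, float]) -> float:
    """Average the five criterion ratings into a single score, rounded to one decimal."""
    missing = set(CRITERIA) - ratings.keys()
    if missing:
        raise ValueError(f"missing criteria: {sorted(missing)}")
    return round(sum(ratings[c] for c in CRITERIA) / len(CRITERIA), 1)
```

Note that affiliate status never appears as an input: the score is a function of the criterion ratings alone.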

05

Honest cons

Every review includes real drawbacks. If a tool has a steep learning curve, a poor mobile app, or confusing billing, we say so. Vague cons like 'could be improved' are not acceptable; every con must be specific and falsifiable.

06

90-day freshness audits

An automated system flags reviews for re-verification every 90 days. We manually check pricing, plan changes, discontinued features, and new alternatives. The 'last verified' date on every review reflects this cycle.

What we don't do

  • Accept payment to change a review score or ranking position
  • Use AI to generate review content without manual fact-checking
  • Inflate scores for tools where we earn higher affiliate commissions
  • Publish a review without testing the product ourselves