
The Blog Tax: Why Search Engines Punish Useful Products


We’ve quietly accepted a strange reality on the web: if you build a genuinely useful product – a SaaS app, a tool, a directory, a marketplace – search engines will often ignore you unless you also bolt on a content machine.

Not because your tool isn’t valuable.

But because it doesn’t talk enough.

Modern search is still, in practice, a text-first system. If you don’t publish long, optimised articles that feed this system, you’re playing with a handicap.

This isn’t just annoying for founders. It’s a structural flaw in how we discover things online.


How SEO turned blogs into a toll booth

Ask almost any SEO how to get organic traffic for a SaaS, marketplace, directory, or e‑commerce brand and the advice is painfully consistent:

“You need a blog.”

The logic behind that advice is simple:

  • Search engines crawl and index text.

  • They need a lot of text to infer what your site is about, what queries it should rank for, and how authoritative it is.

  • Product pages, dashboards, and tools are often light on copy and heavy on interactions, data, or UI.

So if your core value is:

  • an uptime monitoring tool

  • an SEO reporting dashboard

  • a directory of niche suppliers

  • a marketplace with smart matching

…you’re still told to create:

  • “Top 10 tools for X in 2026” posts

  • “How to do X” guides

  • “Complete beginner’s guide to Y” articles

Not because this is always the most useful thing you could do for your users.

But because this is what search engines understand best.

The misalignment

On paper, a search engine’s job is to:

Find the most relevant and useful resource for a given query.

In practice, what often ranks is:

  • the best‑optimised article about a tool, not the tool itself;

  • the website with more long‑form content, not necessarily the better product;

  • whoever has invested more into content and SEO, not whoever actually solves the problem best.

That gap between “best resource” and “best SEO asset” is the blog tax. If you build something valuable but light on text, you either pay that tax – or stay invisible.


Why tools and systems lose to text

Think about the kinds of products that get under‑rewarded in this system:

  • A web app that does one thing extremely well with a clean, minimal UI.

  • A directory whose value is in the data and filtering, not the prose around it.

  • A piece of infrastructure or automation that integrates into someone’s workflow and “just works.”

From a user’s perspective, these are ideal. They are:

  • focused

  • fast

  • low on noise

From a search engine’s perspective, they are often:

  • thin on crawlable content

  • hard to classify

  • weak on traditional on‑page signals

Crawlers don’t “experience” the product the way a human user does. They mostly see:

  • how much text is on the page

  • how it’s structured (headings, paragraphs, lists)

  • what other sites link to it

  • what metadata you provide

If your standout feature is an interactive dashboard, a smart recommendation engine, or a slick workflow, none of that is easily expressible in plain HTML text without you writing about it.
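The reduction described above is easy to see in miniature. Here is a minimal sketch, in pure standard-library Python, of the "crawler's view" of a page: strip an HTML document down to roughly the signals listed above. The class name, the sample page, and the tool it describes are all invented for illustration; real crawlers are vastly more sophisticated.

```python
# A toy "crawler's view": reduce an HTML page to text volume,
# headings, links, and metadata. Note how an interactive dashboard
# (the empty <div id="app">) contributes nothing to any signal.
from html.parser import HTMLParser

class CrawlerView(HTMLParser):
    """Collects visible text length, headings, links, and meta tags."""
    def __init__(self):
        super().__init__()
        self.text_chars = 0
        self.headings = []
        self.links = []
        self.meta = {}
        self._in_heading = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag in ("h1", "h2", "h3"):
            self._in_heading = ""
        elif tag == "a" and "href" in attrs:
            self.links.append(attrs["href"])
        elif tag == "meta" and "name" in attrs:
            self.meta[attrs["name"]] = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag in ("h1", "h2", "h3") and self._in_heading is not None:
            self.headings.append(self._in_heading.strip())
            self._in_heading = None

    def handle_data(self, data):
        self.text_chars += len(data.strip())
        if self._in_heading is not None:
            self._in_heading += data

# A hypothetical tool's landing page: almost all of its value lives
# inside the app container, which is invisible to this reduction.
page = """
<html><head><meta name="description" content="Uptime monitoring."></head>
<body><h1>Acme Monitor</h1><a href="/pricing">Pricing</a>
<div id="app"><!-- the entire interactive dashboard renders here --></div>
</body></html>
"""
view = CrawlerView()
view.feed(page)
print(view.headings, view.links, view.meta, view.text_chars)
```

Nineteen characters of visible text is all the "content" this page has, however good the product behind the `#app` div might be.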

So you end up doing exactly that: writing about your product in longform, not because users need more words, but because search engines do.


This pattern isn’t new: meta keywords déjà vu

We’ve seen this play out before.

In the early days of SEO, meta keywords were a thing. You told search engines what your page was about using a simple tag. It didn’t take long for people to:

  • stuff every possible keyword in there

  • add irrelevant keywords to siphon traffic

  • use it as a cheat‑code rather than a description

The result was predictable:

  • the signal became noisy and unreliable

  • search engines started discounting it

  • eventually, major engines simply ignored meta keywords altogether

Any explicit, easy‑to‑manipulate ranking signal follows a similar arc:

  1. It’s introduced with good intentions.

  2. It’s discovered by SEOs.

  3. It’s exploited and overused.

  4. It’s discounted or abandoned.

Structured data and schema markup already show the early symptoms of this cycle. They are extremely useful in theory – a machine‑readable way to describe what’s on a page – but they’re also being:

  • used to inflate review stars

  • applied in contexts where they don’t really fit

  • turned into yet another surface for keyword games

The underlying problem isn’t any single feature. It’s the incentive structure: as long as clear, mechanical levers exist, people will pull them as hard as possible.
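Schema markup illustrates the point well, because it is entirely self-reported. A JSON-LD block is just a structured description the publisher embeds in the page; nothing enforces that it matches reality. An invented review-star snippet (the product, rating, and count are made up, but the `Product` and `AggregateRating` types are real schema.org vocabulary):

```html
<!-- JSON-LD structured data: machine-readable, but self-reported.
     Nothing verifies these numbers; the 4.9 stars are simply claimed. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.9",
    "reviewCount": "1283"
  }
}
</script>
```

A mechanical lever this explicit gets pulled, which is exactly the arc described above.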


The perverse incentive: write more, not build better

This dynamic leads to a strange economy of effort:

  • You can ship a brilliant tool that saves users hours every week.

  • But if you don’t also produce thousands of words of “strategic” content, search may never send anyone to see it.

So founders and teams do what the system rewards:

  • spin up content calendars

  • produce “SEO articles” that rehash the same advice everyone else has

  • write posts that exist primarily so some keyword has a place to live

Meanwhile, users searching for answers often end up:

  • wading through generic content

  • being told about tools instead of directly finding and using them

  • losing time on pages that are optimised for bots, not humans

The search engine technically did its job – it found a page that “matches” the query. But from a human perspective, the result feels… off. You asked for a solution and got an article about solutions.


What a better search system would look like

If we started from the user’s perspective instead of the crawler’s constraints, search would behave differently.

A better system would:

  • Rank tools by usefulness, not just the blogs that mention them.

  • Understand that a minimal product page can still represent the best possible answer to a query.

  • Use more than just text length and keyword usage as proxies for quality.

Concretely, that could mean leaning more on:

  • User behaviour: Do people who land on this tool actually stay, use it, and return?

  • Direct signals of utility: For web apps, things like repeat usage, task completion, or satisfaction (where measurable).

  • Richer understanding of the page: Using modern AI models to interpret layout, UI, and intent – not just count words and headings.

At that point, a small, focused uptime monitoring tool with a tight, honest landing page could outrank yet another “Top 37 uptime monitoring tools you need in 2026” article.

We’re seeing early hints of this direction with:

  • AI answers layered on top of web results

  • more emphasis on “helpful content” and less tolerance for obvious fluff

  • experiments in surfacing apps, tools, and answers more directly

But for now, the old incentives still dominate.


So what should founders do today?

If you’re building a SaaS, a directory, a marketplace, or a specialised tool, you’re stuck in a hybrid reality:

  • The system is flawed.

  • But you still need to operate within it.

A few pragmatic guidelines:

  1. Accept that some text is necessary.
    You don’t need to turn your product into a content farm, but your site needs enough descriptive, structured text that a search engine can understand what you do and who you help.

  2. Make content genuinely useful.
    If you’re going to write, write things that:

    • genuinely help your ideal users

    • explain your approach, trade‑offs, and constraints

    • show real examples, case studies, or data

Use content to bridge the gap between your product and the problems it solves, not just to chase keywords.

  3. Lean on other discovery channels.
    Search isn’t the only way to be found. For many tools, it may not even be the best channel at the start. Consider:

    • niche communities (forums, subreddits, Discords)

    • integrations and partnerships

    • directories, comparison sites, and marketplaces

    • targeted outreach or small pilot programs

These can get real users in front of your product long before Google “decides” you exist.

  4. Design your site for humans first, crawlers second.
    It’s still worth doing the basics right – titles, headings, internal links, structured data where appropriate – but not at the expense of clarity and usability for real people.

  5. Keep an eye on how search evolves.
    As AI‑driven search and answer engines mature, the bias toward walls of text will probably weaken. If and when that happens, lean products that are already great for users will be in a stronger position than content‑bloated ones.
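The on-page basics above (titles, headings, internal links) mostly live in a handful of tags, and "enough descriptive text" can be genuinely small. A minimal sketch for a hypothetical monitoring tool; every name and URL here is invented:

```html
<head>
  <!-- A clear title and description give crawlers enough text to
       classify the site without burying users in copy. -->
  <title>Acme Monitor – uptime monitoring for small teams</title>
  <meta name="description"
        content="Checks your sites every 30 seconds and alerts you by email or Slack.">
  <link rel="canonical" href="https://acmemonitor.example/">
</head>
<body>
  <!-- One real heading and a sentence of plain text beat a bare app container. -->
  <h1>Uptime monitoring without the dashboard sprawl</h1>
  <p>Acme Monitor pings your endpoints, records response times,
     and alerts the right person when something breaks.</p>
  <!-- Internal links serve humans as navigation and crawlers as crawl paths. -->
  <nav><a href="/pricing">Pricing</a> <a href="/docs">Docs</a></nav>
</body>
```

This is still paying a little of the blog tax, but it is the honest minimum: enough text to be understood, none of it written purely for bots.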


The blog tax is real, but it’s not the endgame

It’s frustrating – and a bit absurd – that in 2025 the easiest way to make a product discoverable by search is still to wrap it in longform text.

Many of the most valuable resources on the web are:

  • small, sharp tools

  • carefully curated directories

  • systems that automate something boring and painful

Those things should be first‑class citizens in search, not second‑class ones forced to drag a blog behind them just to be seen.

Until search engines catch up, most of us will keep paying the blog tax in one form or another. But it’s worth remembering that this is a limitation of the current system, not a law of nature.

The end goal shouldn’t be “who wrote the longest article about X.”
It should be: who actually solves the problem best.