Case Study

Information Entropy Is Eating Your Business

November 7, 2025


A First-Principles Guide to Knowledge Decay



Knowledge decay is the natural, inevitable process by which commercial information becomes less accurate over time - as products evolve, competitors adapt, customers churn, and markets shift. Like entropy in physics, knowledge decay doesn’t require any force to happen. It’s the default state. The only question is how fast it’s happening and whether anyone is measuring it.

Here’s something I borrowed from physics that I think applies directly to how companies manage information.

The second law of thermodynamics says that in a closed system, entropy - disorder - always increases. Things trend from order to chaos. Your desk gets messier. Your inbox gets more cluttered. Your codebase accumulates technical debt. Left alone, everything falls apart.

This isn’t a metaphor. It’s a precise description of what happens to commercial knowledge.

The day you publish a battlecard, it is maximally accurate. Every claim has been verified. Every competitive comparison reflects the current landscape. Every pricing reference is correct.

The next day, it’s slightly less accurate - because the world moved and the document didn’t. A competitor may have shipped something. A feature may have been renamed. A customer’s contract terms may have changed.

After a month, several claims are borderline. After a quarter, some are demonstrably wrong. After six months, the document is more archaeological artifact than operational tool.

This isn’t a failure of maintenance. It’s entropy. And entropy doesn’t stop because you assign someone to “keep things updated.” Entropy is a thermodynamic guarantee.

According to Crayon’s Competitive Intelligence Benchmark, a typical competitive battlecard contains 2-3 outdated claims within 90 days of creation and becomes “majority stale” - more than 50% of specific, testable claims inaccurate - within 6 months (Crayon, 2024).


The Decay Curve

Let me try to sketch the decay curve for commercial knowledge, based on what I’ve observed across dozens of B2B organizations.

Day 0: Publication. Accuracy is at peak - let’s call it 95% (not 100%, because even well-researched documents occasionally contain minor errors).

Day 30: Accuracy drops to ~85%. One or two claims have been overtaken by events: a competitor update, a product change, a new data point. Nobody has noticed because nobody is checking.

Day 60: Accuracy drops to ~75%. The competitive landscape has shifted enough that at least one major positioning claim needs revision. A customer referenced in the document may have renewed - or may have churned. The pricing, if it was included, may have been adjusted.

Day 90: Accuracy drops to ~65%. Three months of product evolution, competitive movement, and market dynamics have outrun the static document. A rep using this document is now operating with a one-in-three chance of citing something inaccurate on any given data point.

Day 180: Accuracy drops below 50%. The document is now more wrong than right, on a claim-by-claim basis. It is - objectively - worse than having no document at all, because a rep with no document hedges and says “let me check.” A rep with a stale document states wrong information confidently.

This is the knowledge decay curve. And its shape is not linear - it's exponential, because multiple independent facts are each decaying along their own timelines. If each claim independently stays accurate with some probability over a given period, the probability that every claim in the document is still accurate is the product of those probabilities - and a product of many factors below one shrinks exponentially as the document ages.
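The multiplicative argument can be made concrete with a toy model. The per-day survival rates and claim counts below are illustrative assumptions, not measured values; the point is only the shape of the curve.

```python
# Toy model: each claim independently stays accurate with a per-day
# survival probability. Document-level accuracy - the chance that ALL
# claims are still correct - is the product of the per-claim terms,
# so it decays exponentially with both claim count and time.

def claim_survival(daily_rate: float, days: int) -> float:
    """Probability that a single claim is still accurate after `days`."""
    return daily_rate ** days

def document_accuracy(daily_rates: list[float], days: int) -> float:
    """Probability that every claim in the document is still accurate."""
    prob = 1.0
    for rate in daily_rates:
        prob *= claim_survival(rate, days)
    return prob

# Hypothetical document: 5 volatile claims (99.5%/day survival) and
# 10 stable claims (99.9%/day survival).
rates = [0.995] * 5 + [0.999] * 10
for days in (0, 30, 90, 180):
    print(f"day {days:3d}: P(all claims accurate) = {document_accuracy(rates, days):.3f}")
```

Even with individually mild daily decay rates, the whole-document probability collapses within a quarter - which is why per-document accuracy falls faster than intuition about any single claim would suggest.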

IDC research shows that the average enterprise creates and updates 500,000+ pieces of content per year (IDC Digital Content Creation Study, 2024). The mathematical certainty of entropy means that a significant fraction of this content is stale at any given moment - and the fraction grows every day that maintenance is deferred.


Why Maintenance Doesn’t Work

The obvious response to knowledge decay is: maintain the documents. Review them monthly. Update them quarterly. Assign owners. Create processes.

This works in theory. In practice, it fails for three reasons.

Reason 1: The surface area is too large. A $50M ARR company with a mature GTM function produces and maintains hundreds of sales assets, dozens of AI tool configurations, a website with hundreds of pages, and a knowledge base with thousands of entries. The surface area of claims - individual factual assertions - is in the tens of thousands. No human process can review tens of thousands of claims on a monthly cadence.

Reason 2: The rate of change is accelerating. Products ship faster. Competitors pivot faster. Markets shift faster. Pricing adjusts more frequently. Customer logos churn faster. The rate at which truth changes has increased in every dimension - meaning the maintenance frequency required to keep up has increased proportionally. Monthly reviews that were adequate in 2020 are inadequate in 2026.

Reason 3: The dependencies are invisible. When you update one document, you may not realize that the same claim appears in fourteen other documents, three AI tool knowledge bases, two email templates, and the CEO’s keynote deck. Maintenance without dependency tracking is like fixing a leak in one pipe without realizing the same joint is used throughout the building.

Research from Gartner estimates that the average B2B company would need 2-3 FTEs dedicated solely to content accuracy maintenance to keep pace with knowledge decay - resources that virtually no company has allocated because the problem isn’t measured (Gartner Content Operations Survey, 2024).


The Entropy Tax

Let me quantify what unchecked knowledge decay costs.

The directly measurable costs:

  • Search waste: Reps spend 1.8 hours per day searching for and verifying information (IDC, 2024), a significant portion of which is attributable to the existence of multiple document versions at different stages of decay.
  • Content waste: 65% of marketing content goes unused (Highspot, 2024), primarily because reps can’t distinguish current content from decayed content.
  • AI inaccuracy: AI tools trained on decayed content propagate stale claims to 10,000-20,000 prospect interactions per month (Forrester, 2025).

The indirectly measurable costs:

  • Deal loss from inconsistency: Decayed claims in different documents result in inconsistent prospect experiences, contributing to the 40-60% of pipeline that ends in “no decision” (Gartner, 2024).
  • Extended ramp times: New hires can’t distinguish current knowledge from decayed knowledge, extending the average ramp time to 5.7 months (Gartner, 2025).
  • Compliance exposure: Under the EU AI Act, AI-generated claims based on decayed sources create regulatory liability.

The total entropy tax - the aggregate cost of knowledge decay across all these dimensions - runs $3-5M per year for a $50M ARR company. Not because anyone made a mistake. Because entropy is the default, and nobody is actively counteracting it.


Fighting Entropy

Physics tells us that entropy can be locally reversed - but only by investing energy. You can clean your desk. You can refactor your codebase. You can organize your closet. But it requires continuous, active effort. The moment you stop investing energy, entropy resumes.

The same is true for commercial knowledge. You can counteract knowledge decay - but only with continuous, active governance. Not quarterly reviews. Not annual audits. Continuous.

What does continuous governance look like? It requires:

Atomic claim management. Instead of managing documents, manage the discrete claims within documents. Each claim has its own decay clock - a timer that starts the moment the claim is verified and counts down toward the point at which re-verification is required. Some claims decay fast (competitive positioning: 30-60 days). Some decay slowly (company mission statement: 12+ months). The granularity matters because one-size-fits-all review cycles over-maintain stable claims and under-maintain volatile ones.
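A minimal sketch of what an atomic claim with its own decay clock might look like. The category names and re-verification windows are hypothetical, loosely based on the ranges above; this is not a real product's data model.

```python
from dataclasses import dataclass
from datetime import date, timedelta

# Hypothetical volatility categories and their re-verification windows.
REVERIFY_WINDOW_DAYS = {
    "competitive": 45,   # competitive positioning: 30-60 days
    "pricing": 60,
    "capability": 75,    # product capability claims: 60-90 days
    "mission": 365,      # slow-decaying claims: 12+ months
}

@dataclass
class Claim:
    text: str
    category: str
    verified_on: date

    @property
    def reverify_by(self) -> date:
        """The date this claim's decay clock runs out."""
        return self.verified_on + timedelta(days=REVERIFY_WINDOW_DAYS[self.category])

    def is_due(self, today: date) -> bool:
        """True once the claim needs re-verification."""
        return today >= self.reverify_by

claim = Claim("Competitor X lacks SSO", "competitive", date(2025, 11, 7))
print(claim.reverify_by)                 # 2025-12-22
print(claim.is_due(date(2026, 1, 15)))   # True
```

The granularity is the point: a volatile competitive claim and a stable mission claim verified on the same day expire on very different dates, which a document-level review cycle cannot express.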

Change-triggered updates. Instead of reviewing on a calendar schedule, trigger reviews when the underlying reality changes. When the product ships a new feature, every capability claim is automatically flagged for re-verification. When a competitor announces a product update, every competitive claim is flagged. When a customer churns, every reference to that customer is flagged. Change-triggered updates counteract entropy at the rate of change, not at the rate of a review calendar.
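A sketch of change-triggered flagging, assuming claims are tagged with a category and the entities they mention. The event names, rules, and claim records are illustrative, not a real API.

```python
# Claims tagged with a volatility category and the subjects they mention.
CLAIMS = [
    {"id": 1, "category": "capability", "subjects": {"our_product"}},
    {"id": 2, "category": "competitive", "subjects": {"CompetitorX"}},
    {"id": 3, "category": "customer", "subjects": {"AcmeCorp"}},
]

# Which claim categories each kind of change event invalidates.
EVENT_RULES = {
    "product_release": {"capability"},
    "competitor_update": {"competitive"},
    "customer_churn": {"customer"},
}

def flag_for_reverification(event: str, subject: str) -> list[int]:
    """Return ids of claims whose category matches the event and that
    mention the changed subject - the set to re-verify right now."""
    categories = EVENT_RULES.get(event, set())
    return [c["id"] for c in CLAIMS
            if c["category"] in categories and subject in c["subjects"]]

print(flag_for_reverification("competitor_update", "CompetitorX"))  # [2]
```

Reviews fire at the rate reality changes: a competitor announcement flags exactly the competitive claims that mention that competitor, and nothing else.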

Confidence scoring. Each claim carries a confidence score that declines over time - starting at the verified confidence level and decreasing toward zero on a schedule determined by the claim’s volatility category. At any moment, a CRO can see: “78% of our active commercial claims are above the confidence threshold. 22% need re-verification.” This makes entropy measurable, which makes it manageable.
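One way to sketch a declining confidence score. The linear decay schedule, the 0.6 threshold, and the sample windows are assumptions for illustration; a real system might use a different curve per volatility category.

```python
def confidence(verified_conf: float, days_since_verified: int,
               decay_window_days: int) -> float:
    """Confidence declines linearly from the verified level toward zero
    over the claim's volatility window, bottoming out at zero."""
    remaining = max(0.0, 1.0 - days_since_verified / decay_window_days)
    return verified_conf * remaining

def share_above_threshold(claims, threshold: float = 0.6) -> float:
    """Fraction of claims whose current confidence clears the threshold -
    the kind of single number a CRO dashboard would surface."""
    scores = [confidence(c["conf"], c["age_days"], c["window"]) for c in claims]
    return sum(s >= threshold for s in scores) / len(scores)

claims = [
    {"conf": 0.95, "age_days": 10, "window": 45},   # fresh competitive claim
    {"conf": 0.95, "age_days": 40, "window": 45},   # nearly expired
    {"conf": 0.90, "age_days": 60, "window": 365},  # stable mission claim
]
print(f"{share_above_threshold(claims):.0%} of claims above threshold")
```

The aggregate number is what makes entropy managrable at the executive level: it turns an invisible decay process into a single metric that can be tracked week over week.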

Propagation guarantees. When a claim is updated, every downstream artifact and AI system that carries the claim inherits the update automatically. This ensures that fighting entropy in the knowledge graph - the canonical source - automatically fights entropy in every document, template, and AI tool that draws from it.
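Propagation is essentially a reachability problem over a dependency graph. The artifact names below are hypothetical; the sketch assumes each claim or artifact records which downstream artifacts embed it (including artifacts that feed other artifacts, such as a battlecard ingested by an AI tool).

```python
from collections import deque

# Edges: source -> artifacts that embed or derive from it.
GRAPH = {
    "pricing_claim": {"pricing_page", "battlecard"},
    "battlecard": {"ai_sdr_kb"},   # the AI SDR tool ingests the battlecard
    "pricing_page": set(),
    "ai_sdr_kb": set(),
}

def propagate(source: str) -> set[str]:
    """Breadth-first walk from the updated claim: every reachable
    artifact must inherit the change, directly or transitively."""
    seen, queue = set(), deque([source])
    while queue:
        node = queue.popleft()
        for nxt in GRAPH.get(node, set()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

print(sorted(propagate("pricing_claim")))  # ['ai_sdr_kb', 'battlecard', 'pricing_page']
```

The transitive walk is what fixes the invisible-dependency failure described earlier: updating the pricing claim reaches the AI tool's knowledge base even though nothing links them directly.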

This is anti-entropy infrastructure. Not a process improvement. Not a better review cadence. An architectural system that continuously counteracts the natural decay of commercial knowledge.


The First Principle

Here’s the first principle, stated simply:

Commercial knowledge decays by default. It does not maintain itself. It cannot be maintained by periodic human intervention at scale. It must be governed by a system that detects, measures, and counteracts decay continuously - or it will inevitably trend toward a state where more claims are wrong than right.

This is not a failure of people. It’s a law of information physics. And the companies that understand this - that build anti-entropy systems rather than hoping that human processes will outrun thermodynamics - will have a fundamentally more accurate commercial presence than those that don’t.

The question isn’t whether your knowledge is decaying. It’s how fast. And whether you’re doing anything about it.


Frequently Asked Questions

What is knowledge decay in B2B sales?

Knowledge decay is the natural, inevitable process by which commercial information - product capabilities, pricing, competitive positioning, customer evidence, compliance claims - becomes less accurate over time as reality changes and documents don’t. Like entropy in physics, knowledge decay is the default state and requires no force to occur. A typical competitive battlecard becomes “majority stale” within 6 months (Crayon, 2024).

How fast does commercial knowledge become outdated?

The decay rate varies by claim type: competitive positioning decays within 30-60 days, product capability claims last 60-90 days, and pricing claims typically remain accurate for 45-90 days. Across all claim types, a typical document drops from ~95% accuracy at publication to below 50% accuracy at the six-month mark.

Can regular content reviews prevent knowledge decay?

At scale, no. The average B2B company has tens of thousands of individual claims across hundreds of documents, AI tool knowledge bases, and website pages. Gartner estimates that maintaining accuracy through periodic human review would require 2-3 dedicated FTEs - resources that virtually no company has allocated (Gartner, 2024). Calendar-based review cycles also under-maintain volatile claims while over-maintaining stable ones.

What is anti-entropy infrastructure for commercial knowledge?

Anti-entropy infrastructure is an architectural system that continuously counteracts knowledge decay through four mechanisms: (1) managing claims atomically with individual decay clocks, (2) triggering reviews when underlying reality changes rather than on calendar schedules, (3) providing confidence scores that decline over time to make decay measurable, and (4) propagating updates automatically to every downstream system when a claim is re-verified.