AI Technology
March 23, 2026

We Called It Context. Turns Out, It's Ontology.


And the difference matters more than ever.

Peter Carr published a piece recently that gave the JOURN3Y team pause for thought. Not because the ideas were new to us, but because he found the precise language for something we've been circling for the better part of two years.

The article is called Whatever-as-a-Service and the Bad Data Myth, and the core argument is this: the biggest barrier to AI inside most organisations has never been bad data. It's that they've never clearly described how relationships actually operate between their people, systems, approvals, and the work that flows between them.

That description has a name. An ontology.

We've been saying this — just differently

At JOURN3Y, we've tended to talk about "context" and "knowledge" rather than ontology. We tell clients that AI isn't failing because their data is dirty — it's failing because the AI doesn't understand the work. It doesn't know who owns what, who approves what, or how a request moves from person A to outcome B.

Peter's framing is sharper, and we're adopting it. Most organisations haven't modelled the things that exist in their world and how they relate. People. Teams. Approvals. Services. Documents. Work itself. That meaning is scattered across systems and workflows, but rarely surfaced as a coherent or legible structure.
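To make the idea concrete, here's a minimal sketch of what "modelling the things that exist and how they relate" can look like in code. Everything in it — the team names, the people, the relation labels — is an illustrative assumption, not a real client's model or any particular product's data format:

```python
# A toy organisational ontology: entities connected by typed,
# named relationships, stored as a simple in-memory graph.
from collections import defaultdict

class Ontology:
    def __init__(self):
        # edges maps (subject, relation) -> set of related objects
        self.edges = defaultdict(set)

    def add(self, subject, relation, obj):
        self.edges[(subject, relation)].add(obj)

    def query(self, subject, relation):
        return self.edges[(subject, relation)]

org = Ontology()
org.add("Ana", "member_of", "Finance")
org.add("Finance", "approves", "purchase_orders")
org.add("purchase_order_17", "instance_of", "purchase_orders")

# "Who approves purchase orders?" becomes a lookup, not a guess.
approvers = [subj for (subj, rel), objs in org.edges.items()
             if rel == "approves" and "purchase_orders" in objs]
print(approvers)  # ['Finance']
```

The point of the sketch isn't the data structure — it's that once ownership and approval are stated explicitly, a question an AI would otherwise have to guess at becomes a deterministic lookup.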

And here's the thing: AI doesn't create that problem. It just makes it impossible to ignore.

The "Whatever-as-a-Service" hangover

Peter's phrase Whatever-as-a-Service is one of the best descriptions we've heard of the last decade in enterprise technology: a decade-long application frat party. Sales teams adopted their platform. HR adopted theirs. Finance moved to cloud ERP. Collaboration tools multiplied. Each system brought its own definition of the world. A "customer" became an "account" became a "ratepayer" depending on which system you asked.

Humans quietly resolved these inconsistencies through experience and context — through the knowledge in their heads, not their systems. AI cannot do that. Not safely. Not reliably.

The result? AI that hallucinates, guesses, or confidently gives the wrong answer — not because it's a bad model, but because no one has ever told it how your organisation actually works.

Where Glean fits in

Peter makes a point in his article that we think deserves more attention. He writes that AI-native platforms like Glean — which JOURN3Y implements across Australia and New Zealand — construct knowledge graphs that map relationships between people, documents and projects across collaboration tools.

Critically, rather than imposing a new security model or demanding a data clean-up project first, Glean inherits the permissions already defined in your identity and collaboration platforms. If a document is accessible in its original system, it remains accessible in the AI environment.
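In code, the inherit-don't-reinvent idea looks something like the sketch below. This is not Glean's implementation or API — the document names, groups, and `visible_docs` function are all hypothetical — it simply illustrates filtering AI-visible results by the access lists the source systems already hold:

```python
# Hypothetical permission inheritance: instead of defining a new
# security model, the AI layer reuses the ACLs already defined in
# the source systems and filters results per user.
source_acl = {
    "payroll_report.xlsx": {"hr_team"},
    "brand_guidelines.pdf": {"hr_team", "sales_team", "eng_team"},
}
user_groups = {
    "maya": {"sales_team"},
    "raj": {"hr_team"},
}

def visible_docs(user):
    """Return only the documents this user could already open
    in the original system."""
    groups = user_groups.get(user, set())
    return [doc for doc, allowed in source_acl.items() if groups & allowed]

print(visible_docs("maya"))  # ['brand_guidelines.pdf']
print(visible_docs("raj"))   # ['payroll_report.xlsx', 'brand_guidelines.pdf']
```

The design choice matters: because access is resolved against the source systems' own permissions, there is no second security model to drift out of sync, and nothing becomes visible through the AI that wasn't visible before.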

This is practical ontology. Not a multi-year transformation. Not a big-bang platform replacement. It's surfacing the structure that already exists in your tools — the permissions, the relationships, the responsibilities — and making it legible to AI.

That's exactly where we're seeing Glean create real value in production. Not just as a search layer, but as a way to expose the hidden structure of how work actually gets done. When relationships and responsibilities are visible, work can move predictably and AI can act with confidence rather than guessing.

Structure matters more than cleanliness

This is the shift we ask every client to make. Stop waiting for the data clean-up project to finish before you start on AI. That project will never finish. It's also the wrong target.

The question isn't: is our data clean enough?

The question is: have we described how our organisation actually works?

If the answer is no — and for most organisations, it is — the path forward isn't more data hygiene. It's building the ontological structure that lets AI do real work: routing the right request to the right team, surfacing the right document to the right person, automating the right workflow with the right authority.

Organisations that address this will get far more from AI than those still chasing cleaner data.

A word on language

"Ontology" is a word that sounds abstract. Even companies with great ontological solutions rarely use it in executive company, as Peter notes. We're not here to push the jargon into every boardroom. But understanding the concept — the structured representation of the things that exist in your organisation and how they relate — is essential to making sense of why some AI implementations deliver and others stall.

If you've been frustrated by AI that can't seem to grasp your business, it's almost certainly an ontology problem. And unlike bad data, it's one you can actually solve.


Curious where the gaps are in your organisation's structure? Talk to the JOURN3Y team →

Read Peter Carr's full piece: Whatever-as-a-Service and the Bad Data Myth

Category:AI Technology
Tags:
#Ontology #EnterpriseAI #Glean #AIStrategy #KnowledgeGraph #WorkAI