By Dorian · Ztrader AI
Taiwan, Nov 25, 2025
Most of my current market outlooks have moved to: blog.ztrader.ai
The Planet module is undergoing a database upgrade (yes, AI-driven schema updates nuked Supabase again). It should be back online soon.
Every module in my ecosystem shares only one authentication primitive: your email.
You don’t need to register.
Your inbox is your identity layer.
Everything else is your own data asset.
I have zero interest in anything beyond that.
Language and Pattern
Here’s the thesis I’ve been building toward:
Everything is language.
Structure is syntax.
To me, writing poetry and writing code are not two arts—they are the same operation on different coordinates.
When you compress the essence of any phenomenon far enough, the residue you’re left with is always the same thing:
an AST — an abstract syntax tree.
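A minimal sketch of what that residue looks like in practice, using Python's built-in ast module on an arbitrary one-line expression (the variable names are purely illustrative):

```python
import ast

# Parse a one-line expression into its abstract syntax tree.
tree = ast.parse("price * (1 + tax_rate)", mode="eval")

# The "residue" after compression: nested nodes, not characters.
print(ast.dump(tree, indent=2))
# Expression(
#   body=BinOp(
#     left=Name(id='price', ...),
#     op=Mult(),
#     right=BinOp(
#       left=Constant(value=1),
#       op=Add(),
#       right=Name(id='tax_rate', ...))))
```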
In the AI era, models will inevitably become the operating system behind every tool, app, and interface. Not metaphorically, but structurally.
- Apps will become sandboxes or callable agents (a minimal sketch follows this list).
- Interaction patterns will determine the upper bound of AI's effectiveness.
- Text embeddings will determine the vector field of the model's output (distribution, topology, orientation).
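Here is a minimal sketch of the first point above. The function, its schema, and the dispatch step are hypothetical stand-ins rather than any particular agent framework's API; the point is only that the "app" shrinks to a callable signature the model can target.

```python
import json

# A hypothetical "app" exposed as a single callable: the model never sees
# a UI, only this signature and its declared schema.
def get_quote(symbol: str) -> dict:
    # Stand-in for whatever the real app does internally.
    return {"symbol": symbol, "price": 123.45}

# The declarative surface an agent runtime would hand to the model.
GET_QUOTE_SCHEMA = {
    "name": "get_quote",
    "description": "Return the latest price for a ticker symbol.",
    "parameters": {
        "type": "object",
        "properties": {"symbol": {"type": "string"}},
        "required": ["symbol"],
    },
}

# When the model emits a call, the sandbox simply dispatches it.
call = {"name": "get_quote", "arguments": json.dumps({"symbol": "TSM"})}
result = get_quote(**json.loads(call["arguments"]))
print(result)  # {'symbol': 'TSM', 'price': 123.45}
```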
This is why mathematicians at the very top, Terry Tao among them, have endorsed the new paradigm of human-AI collaboration:
AI constructs the structures, models, and candidate proofs.
Humans constrain, verify, prune, and rebuild the logic into a formal theory.
This is not a minor workflow tweak. It is a civilizational shift:
- AI is already demonstrating extreme value in theoretical physics, pure math, and ultra-abstract CS.
- The resonance effect between AI and high-intelligence individuals is real. The point is not to worship AI or outsource thinking; the point is to amplify cognition through co-construction.
- The loop becomes: search → verify → generate → push. If it works, integrate. If it fails, debug. (A sketch of this loop follows the list.)
- Execution drives intelligence. You learn while building, you build while learning, and writing becomes the closure of that feedback loop.
- Meta-knowledge is the final form of abstraction. Syntax Ontology Engineering. Meta-Theorem Design.
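A minimal sketch of that loop, assuming nothing about your tooling; search, generate, verify, and push below are hypothetical placeholders you would swap for real steps.

```python
# Hypothetical stand-ins for the four stages; swap in your own tooling.
def search(question):      # gather candidate material
    return ["candidate A", "candidate B"]

def generate(candidates):  # let the model draft something from them
    return f"draft built from {len(candidates)} candidates"

def verify(draft):         # constrain, test, prune
    return "draft" in draft

def push(draft):           # integrate: publish, commit, ship
    print("integrated:", draft)

def loop(question, max_rounds=3):
    for _ in range(max_rounds):
        draft = generate(search(question))
        if verify(draft):
            push(draft)            # if it works, integrate
            return draft
        # if it fails, debug: tighten the question and try again
        question = question + " (narrowed)"
    return None

loop("how does constant folding work?")
```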
You learn “how you learn.”
You observe “how you observe yourself learning.”
You learn “how you used to observe the way you learned.”
Then you use that to observe even more deeply.
Once you iterate that stack far enough, you always end up with the same object:
an optimized AST.
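To make "an optimized AST" concrete, here is a minimal constant-folding pass in Python; the expression and the two folded operators are arbitrary, chosen only to show the idea.

```python
import ast

class FoldConstants(ast.NodeTransformer):
    """Collapse binary operations whose operands are both literals."""
    def visit_BinOp(self, node):
        self.generic_visit(node)  # fold children first
        if isinstance(node.left, ast.Constant) and isinstance(node.right, ast.Constant):
            if isinstance(node.op, ast.Add):
                return ast.copy_location(
                    ast.Constant(node.left.value + node.right.value), node)
            if isinstance(node.op, ast.Mult):
                return ast.copy_location(
                    ast.Constant(node.left.value * node.right.value), node)
        return node

tree = ast.parse("x + 2 * 3", mode="eval")
optimized = ast.fix_missing_locations(FoldConstants().visit(tree))
print(ast.unparse(optimized))  # x + 6
```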
Blockchain, ZK, Rust—these are merely expressions of that layer.
They look like technologies, but they are actually syntax products—languages infused into formal structure.
Which brings me to the most important point:
The Worst Thing Is Prompt Engineering
Not because prompts are useless,
but because they are static artifacts pretending to be dynamic cognition.
Prompt engineering is not:
- a cognition pipeline
- a learning workflow
- a meta layer
At best, it temporarily reshapes the model's output distribution over tokens.
But over long time horizons, your behavior smooths out those local advantages.
If you lack density and structure, no prompt will save you.
You may learn how to produce one “good prompt”—
but you won’t learn how to produce an entire family of them.
You won’t learn how to design prompts that sit one abstraction layer higher.
Prompts cannot compensate for a shallow knowledge graph.
They cannot compress what you refuse to learn.
They cannot generate structure where you have none.
Language and Pattern
With time, LLMs will align toward the average cognitive curve of their users:
- The majority usage defines the model's center of gravity.
- The Matthew effect becomes extreme. High-density users ignite structural alignment and output explosions.
- Most people will still use natural language queries: search, verify, calculate, reason, simulate.
- Highly abstract, theoretical, text-driven workflows will return to the foreground, because that is the native habitat of LLMs.
“Using a person as a mirror helps you correct your posture.
Using language as a mirror lets you observe the world.”
This Is the Future Operating System of Intelligence
Not an app.
Not a tool.
Not a prompt.
But a syntax ecology where cognition, representation, learning, and execution compress into one recursive structure:
Language → Pattern → Syntax → AST → Intelligence Loop
This is the new OS.
And the only people who will fully benefit
are the ones who understand the structure beneath the surface.