Vela
The first concrete wedge in the Borrowed Light thesis: a scientific state layer that turns papers into structured, linked, correctable findings.
The claim
Scientific papers get published. The knowledge inside them does not compound. Findings remain trapped in prose: hard to query, hard to correct, hard to inherit. Every new reader rebuilds the same map from scratch. Every new agent re-parses the same papers. Corrections spread slowly. Failed paths disappear.
Vela turns papers into structured, linked, correctable scientific state. In one line: Vela is Git for scientific findings.
- papers are renderings, not the record layer
- findings become first-class objects with durable identity
- support, contradiction, dependence, and revision become explicit
- corrections propagate through structure instead of rumor
- humans and AI agents can reason over the same frontier
The wedge
Vela is not trying to be the whole scientific operating system at once. The right first move is narrower and deeper: compiled frontier memory for one scientific domain. If a field becomes materially more intelligible when its literature is compiled into structured findings, then the state layer is real. If that state changes what a researcher or agent would do next, then the wedge is real too.
The proof burden in the near term is not universal adoption. It is simpler: show that a compiled frontier is a better working memory for a field than papers alone.
What Vela is
Vela is primarily a state layer for science. It compiles literature into finding bundles: individual, content-addressed records with evidence, conditions, entities, confidence, provenance, and links to other findings.
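A minimal sketch of what a finding bundle could look like, assuming a plain content-hash scheme for identity; the field names and types are illustrative, not Vela's actual schema:

```python
# Illustrative sketch: field names, types, and the hashing scheme are
# assumptions about the shape of a finding bundle, not Vela's real schema.
import hashlib
import json
from dataclasses import dataclass, field, asdict


@dataclass
class FindingBundle:
    claim: str                    # the finding stated as one proposition
    evidence: list[str]           # pointers to figures, tables, datasets, DOIs
    conditions: dict[str, str]    # the conditions under which the claim holds
    entities: list[str]           # compounds, targets, models, methods involved
    confidence: float             # compiler- or reviewer-assigned confidence
    provenance: dict[str, str]    # source paper, extraction method, reviewer
    links: dict[str, list[str]] = field(default_factory=dict)
    # links map a relation to finding ids, e.g. "supports", "contradicts",
    # "depends_on", "revises"

    def finding_id(self) -> str:
        """Content-addressed identity: a hash of the canonicalized record."""
        canonical = json.dumps(asdict(self), sort_keys=True)
        return hashlib.sha256(canonical.encode()).hexdigest()[:16]
```

In this sketch, content addressing is what gives a finding durable identity: a revised finding is a new object that links back to the one it replaces, not an overwrite.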
Vela also includes a lightweight runtime for compiling, searching, reviewing, and updating that state, plus early network primitives for sharing, versioning, and federating frontiers across people and institutions.
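A correspondingly minimal sketch of the runtime surface, reusing the FindingBundle above; the class and method names are invented to make "compiling, searching, reviewing, and updating" concrete, not a real API:

```python
# Hypothetical runtime surface built on the FindingBundle sketch above.
# Names and behavior are assumptions used to illustrate the state transitions.

class Frontier:
    """An in-memory frontier: finding bundles keyed by their content id."""

    def __init__(self) -> None:
        self.findings: dict[str, FindingBundle] = {}

    def add(self, bundle: FindingBundle) -> str:
        """Compile step: register a bundle under its content-addressed id."""
        fid = bundle.finding_id()
        self.findings[fid] = bundle
        return fid

    def contradictions(self, fid: str) -> list[str]:
        """Search step: findings that explicitly contradict the given one."""
        return [
            other_id for other_id, other in self.findings.items()
            if fid in other.links.get("contradicts", [])
        ]

    def revise(self, old_id: str, revised: FindingBundle) -> list[str]:
        """Update step: record a revision and return the findings that
        depended on the old one, i.e. the set a correction must reach."""
        revised.links.setdefault("revises", []).append(old_id)
        self.add(revised)
        return [
            other_id for other_id, other in self.findings.items()
            if old_id in other.links.get("depends_on", [])
        ]
```

In this sketch, sharing, versioning, and federating would amount to exchanging and merging sets of content-addressed bundles, much as Git repositories exchange objects; that part is left out here.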
Put another way: science needs better state, better runtime, and better network. Vela starts at the deepest shared layer, because without shared state the rest of the stack never compounds. With it, richer tools, stronger institutions, and a deeper scientific ecosystem become possible.
What Vela is not
- not another literature search layer
- not another scientist copilot that rebuilds context from scratch every run
- not a decorative knowledge graph detached from review and correction
- not the whole scientific operating system yet
It is the layer that makes a scientific operating system possible.
Three layers
The deeper thesis is not just about one tool. It is about the missing stack. The cleanest framing is that science is missing three things that software already has: a state layer, a runtime layer, and a network layer.
- State: what is known, claimed, challenged, and revised
- Runtime: the actions that transform that state and produce new artifacts, analyses, and experiments
- Network: the collaborative layer that lets those states and artifacts be shared, versioned, and inherited
Vela starts at the state layer. If that layer is wrong, everything above it stays brittle. The runtime cannot learn cleanly. The network cannot compound cleanly.
What it has to prove
The question is whether compiled scientific state changes real decisions. By the end of 2026, Vela should be able to show, in at least one real scientific domain, that a literature corpus can be compiled into structured findings; that contradictions, dependencies, gaps, and revisions become legible; and that a researcher or agent can answer real strategic questions better using the compiled frontier than using papers alone.
The proof artifact is not just a repo. It is a frontier that changes what someone would do next.
The current first frontier is blood-brain barrier delivery strategies for Alzheimer's-relevant therapeutics: narrow enough to compile deeply, rich in contradictions and hidden conditions, and important enough to make the paper problem obvious.
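As a usage sketch only, here is the kind of strategic question the compiled frontier is meant to answer; the findings, conditions, and identifiers below are invented placeholders, and the API is the hypothetical one sketched above:

```python
# Invented example data; the claims and DOIs are placeholders, not real findings.
frontier = Frontier()

f1_id = frontier.add(FindingBundle(
    claim="Delivery strategy A improves brain uptake of a model antibody",
    evidence=["doi:placeholder-1"],
    conditions={"model": "young mouse", "route": "intravenous"},
    entities=["strategy A", "model antibody"],
    confidence=0.7,
    provenance={"source": "placeholder paper 1"},
))

frontier.add(FindingBundle(
    claim="Delivery strategy A shows no uptake gain",
    evidence=["doi:placeholder-2"],
    conditions={"model": "aged mouse", "route": "intravenous"},
    entities=["strategy A", "model antibody"],
    confidence=0.6,
    provenance={"source": "placeholder paper 2"},
    links={"contradicts": [f1_id]},
))

# "Which uptake findings are contested, and under what conditions?"
for contested_id in frontier.contradictions(f1_id):
    print(frontier.findings[contested_id].conditions)
```

The point is not the toy query; it is that the disagreement and its hidden condition (young versus aged animals) live in the structure rather than in two papers a reader has to notice and reconcile.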
Why now
The world is starting to ask for maps of scientific space, not just more paper search. The recent OpenAI Foundation Alzheimer's announcement is interesting for exactly this reason. It talks about building a causal map of the space, generating and opening datasets, testing interventions, and learning across institutions.
That is a strong external signal. Serious actors are now circling the same underlying need: if AI is going to help science meaningfully, it needs a better medium than disconnected papers and private notes. Better models make the substrate problem more urgent, not less.
Trajectory
If Vela works, it proves one thing first: compiled scientific state beats papers as the working memory of a field. That is enough. If it is true, then richer execution runtimes, experiment memory, networked federation, and broader scientific infrastructure become more believable. Vela does not need to prove the whole operating system in one step. It needs to prove the state layer.
Borrowed Light is the broader thesis. Vela is the first concrete wedge.