
@sightmap/core

The library API behind the sightmap CLI. For IDE plugins, MCP servers, custom build tooling, and anyone integrating sightmap into a workflow that isn’t shell scripts.

The library and CLI are shipped as a single npm package — @sightmap/sightmap. There is no separate @sightmap/core on npm. We split them in the docs because they serve different audiences: the CLI page is for curators and CI; this page is for tool builders.

```sh
pnpm add @sightmap/sightmap
```

```ts
import {
  parse,
  validate,
  merge,
  loadDirectory,
  match,
  explain,
  lint,
} from "@sightmap/sightmap";
```
| Function | Purpose |
| --- | --- |
| `parse(yaml)` | Parse one YAML document into a typed sightmap. Throws on invalid input. |
| `validate(input)` | Non-throwing validator. Returns a `Result` with diagnostics. |
| `merge(sightmaps)` | Smart-merge multiple sightmap documents into one. Preserves agent-authored fields; surfaces collisions as warnings. |
| `loadDirectory(path)` | Read every YAML file under `path`, validate each, and merge them. Returns a `Result` with the merged sightmap and any diagnostics. |
| `match(sightmap, { url })` | Resolve a URL: returns the matched view, applicable components, requests, and aggregated memory. |
| `explain(sightmap, query)` | Find every entry tied to a name or source path. |
| `lint(sightmap, { root })` | Quality checks beyond schema. Returns diagnostics. |

```ts
import { loadDirectory, match, explain, lint } from "@sightmap/sightmap";

const result = await loadDirectory(".sightmap");
if (!result.ok) {
  for (const d of result.diagnostics) {
    console.error(d.code, d.message, d.file);
  }
  process.exit(1);
}
const sightmap = result.value;

// Drive a URL like an agent would
const r = match(sightmap, { url: "/search" });
console.log(r.view?.name, r.components.length);

// Find entries by source path
const hits = explain(sightmap, "src/components/Foo.tsx");

// Quality checks
const issues = await lint(sightmap, { root: process.cwd() });
```

parse() throws on invalid input and is sugar for end-user code that knows the input is valid. Downstream tooling should consume validate(), which returns a non-throwing Result. Treating diagnostics as data — not exceptions — is what lets the CLI, the MCP server, and the plugin all share the same diagnostic shapes and codes.
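To make the "diagnostics as data" pattern concrete, here is a minimal sketch of how a downstream tool might consume a `Result`. The `Result` and `Diagnostic` types below are reconstructed from the shapes shown in the example above (`ok`, `value`, `diagnostics`) and are stand-ins for the types the package actually exports; `unwrap` is our illustrative helper, not part of the API:

```typescript
// Assumed shapes, mirroring this page's examples; the real types ship with the package.
type Diagnostic = {
  severity: "error" | "warning" | "info";
  code: string;
  message: string;
  file?: string;
};

type Result<T> =
  | { ok: true; value: T; diagnostics: Diagnostic[] }
  | { ok: false; diagnostics: Diagnostic[] };

// Consume a Result as data: report every diagnostic (warnings arrive even on
// success), then return the value on success or undefined on failure.
function unwrap<T>(
  result: Result<T>,
  onDiagnostic: (d: Diagnostic) => void,
): T | undefined {
  for (const d of result.diagnostics) onDiagnostic(d);
  return result.ok ? result.value : undefined;
}
```

The point of the pattern is that callers never need `try`/`catch`: failure is just a `Result` with `ok: false`, and the same reporting path handles warnings and errors alike.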

Every diagnostic is the same shape:

```ts
type Diagnostic = {
  severity: "error" | "warning" | "info";
  code: string;
  message: string;
  file?: string;
  path?: string; // JSON pointer within the file
  loc?: { line: number; column: number };
  source?: string;
};
```
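
Because the shape is stable, tooling can render diagnostics however it likes. A minimal sketch of a compiler-style one-line formatter (the `formatDiagnostic` helper is ours, not part of the API; the `Diagnostic` type is copied from above):

```typescript
type Diagnostic = {
  severity: "error" | "warning" | "info";
  code: string;
  message: string;
  file?: string;
  path?: string;
  loc?: { line: number; column: number };
  source?: string;
};

// Render one diagnostic in the familiar file:line:col severity[code] message form.
function formatDiagnostic(d: Diagnostic): string {
  const where = d.file
    ? d.loc
      ? `${d.file}:${d.loc.line}:${d.loc.column}`
      : d.file
    : "<input>";
  return `${where} ${d.severity}[${d.code}] ${d.message}`;
}
```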

Codes are stable, kebab-case, and exported as constants. A representative slice:

| Code | Severity | Source |
| --- | --- | --- |
| `parse-error` | error | `parse` / `loadDirectory` |
| `schema-validation-failed` | error | `validate` |
| `merge-collision-view` | warning | `merge` / `loadDirectory` |
| `duplicate-route` | warning | `lint` |
| `route-shadowing` | warning | `lint` |
| `unknown-source` | warning | `lint` |
| `selector-syntax` | warning | `lint` |

Renames require an SEP. Future SDK ports (Python, Go) MUST emit identical codes for the same conditions — this is what makes diagnostic-aware tooling portable across implementations. The full table is on GitHub.
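
Stable codes are what let tooling key policy off them. For instance, a CI integration might fail on every error and on a caller-chosen set of promoted warnings — a sketch with our own `shouldFail` helper (not part of the API):

```typescript
type Severity = "error" | "warning" | "info";
type Diagnostic = { severity: Severity; code: string; message: string };

// Fail on every error, plus any warning whose code the caller has promoted.
// Because codes are stable across versions (and, per the spec, across SDK
// ports), this policy is portable.
function shouldFail(diagnostics: Diagnostic[], promoted: Set<string>): boolean {
  return diagnostics.some(
    (d) =>
      d.severity === "error" ||
      (d.severity === "warning" && promoted.has(d.code)),
  );
}
```

A repo that cares about route hygiene could promote `route-shadowing` and `duplicate-route` while leaving `unknown-source` advisory.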

  • Building a CI script? Shell out to the CLI — simpler, fewer moving parts.
  • Building an IDE plugin or a custom MCP server? Use the library — diagnostics arrive as data, not parsed stdout.
  • Building a test harness? Either works. The library is faster (no process boundary); the CLI is easier to mock.

For the canonical schema semantics — fields, route glob syntax, scope rules — see the spec and the JSON Schema reference.