VectorLint is a command-line tool that evaluates and scores content using LLMs. It uses LLM-as-a-Judge to catch terminology, technical accuracy, and style issues that require contextual understanding.
Install globally from npm:
```shell
npm install -g vectorlint
```

Verify the installation:

```shell
vectorlint --help
```

Or run VectorLint without installing:

```shell
npx vectorlint path/to/article.md
```

Define rules as Markdown files with YAML frontmatter to enforce your specific content standards:
- Check SEO Optimization - Verify content follows SEO best practices
- Detect AI-Generated Content - Identify artificial writing patterns
- Verify Technical Accuracy - Catch outdated or incorrect technical information
- Ensure Tone & Voice Consistency - Match content to appropriate tone for your audience
If you can write a prompt for it, you can lint it with VectorLint.
👉 Learn how to create custom rules →
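For instance, a rule file pairs YAML frontmatter (metadata) with a plain-prose prompt. The frontmatter fields shown here are hypothetical — see the custom rules guide for the actual schema:

```markdown
---
# Hypothetical metadata fields, for illustration only
name: active-voice
description: Flag passive-voice constructions
severity: warning
---

Identify sentences written in the passive voice and suggest an
active-voice rewrite for each one.
```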
VectorLint scores your content using error density and a rubric-based system, enabling you to measure quality across documentation. This gives your team a shared understanding of which content needs attention and helps track improvements over time.
- Density-Based Scoring: For errors that can be counted, scores are calculated based on error density (errors per 100 words), making quality assessment fair across documents of any length.
- Rubric-Based Scoring: For more nuanced quality standards, like flow and completeness, scores are graded on a 1-4 rubric system and then normalized to a 1-10 scale.
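The two scoring modes can be sketched as follows. This is an illustrative TypeScript sketch, not VectorLint's implementation: the density-to-score mapping and its thresholds are assumptions, while the rubric normalization follows the 1-4 to 1-10 description above.

```typescript
// Density-based: errors per 100 words, mapped onto a 1-10 scale.
// The linear mapping and the 5-errors-per-100-words floor are
// illustrative assumptions, not VectorLint's actual curve.
function densityScore(errorCount: number, wordCount: number): number {
  const density = (errorCount / wordCount) * 100; // errors per 100 words
  const score = 10 - (density / 5) * 9; // 0 errors -> 10, >=5 per 100 -> 1
  return Math.max(1, Math.min(10, score));
}

// Rubric-based: a 1-4 grade normalized linearly onto a 1-10 scale.
function rubricScore(grade: number): number {
  return 1 + ((grade - 1) / 3) * 9; // 1 -> 1, 4 -> 10
}
```

Normalizing both modes onto the same 1-10 scale is what makes scores comparable across rules and across documents of different lengths.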
If you just want to check your content against a style guide:
```shell
vectorlint init --quick
```

This creates a VECTORLINT.md file where you can paste your style guide.
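For example, VECTORLINT.md might hold plain-prose rules like these (the content below is illustrative — paste your own guide):

```markdown
# Style Guide

- Use sentence case for headings.
- Prefer active voice.
- Spell out acronyms on first use.
```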
Note: You must set up your credentials in `~/.vectorlint/config.toml` (see Step 3) before running checks.
Then run:
```shell
vectorlint doc.md
```

For a comprehensive setup (custom rule packs, specific targets), run:

```shell
vectorlint init
```

This creates:

- VectorLint Config (`.vectorlint.ini`): Project-specific settings.
- App Config (`~/.vectorlint/config.toml`): LLM provider API keys.
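As a rough illustration, the project config is an INI file; the section and key names below are hypothetical, not the actual schema — see the full configuration reference:

```ini
; .vectorlint.ini (illustrative keys only)
[core]
targets = docs/**/*.md
```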
👉 Full configuration reference →
Open your global App Config (~/.vectorlint/config.toml) and uncomment the section for your preferred LLM provider (OpenAI, Anthropic, Gemini, or Azure).
```toml
[env]
LLM_PROVIDER = "openai"
OPENAI_API_KEY = "sk-..."
```

Note: You can also use a local `.env` file in your project, which takes precedence over the global config.
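For example, a project-local `.env` holding the same keys would override the global `config.toml`:

```
# .env in the project root (takes precedence over ~/.vectorlint/config.toml)
LLM_PROVIDER=openai
OPENAI_API_KEY=sk-...
```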
Run a check:
```shell
vectorlint doc.md
```

VectorLint ships with a built-in preset containing rules for AI pattern detection, directness, and more; the `init` command configures it automatically.
👉 Learn how to create custom rules →
We welcome your contributions! Whether it's adding new rules, fixing bugs, or improving documentation, please check out our Contributing Guidelines to get started.
- Creating Custom Rules - Write your own quality checks in Markdown
- Configuration Guide - Complete reference for `.vectorlint.ini`
