Help build local AI with no API token caps.
Founders + one-time chip-ins fund the next model and agent sprint.
Fund it →
v1.8.58 · notarized Mac DMG · community-funded

Unlimited local usage.
Cloud-tool energy. On your Mac.

Outlier is a Mac-native beta app for people who are tired of paying $200/month and still hitting usage limits. The goal is simple: cloud-style coding workflows, running locally by default, with no API token caps from Outlier. The app is early. The mission is not.

Apple Silicon · macOS 12+ · Free Nano + Lite · Pro unlocks every shipped tier · local by default · performance varies by hardware
Outlier is actively under development. “Unlimited” means no API token meter from Outlier for local models; speed, context, model fit, and battery life depend on your Mac.
Unlimited
Local usage
$20
Pro / month
$200
Founders lifetime
100%
Local by default
7
Model tiers
0
Cloud token bill

The pitch in one sentence

Cloud coding tools are incredible, but usage caps kill momentum. Outlier is the bet that the same style of coding platform can live on your Mac: agent plans, file edits, tests, memory, vision, research, and long runs without a meter.

Free: Nano + Lite for everyday local AI.
Pro: $20/month unlocks the full Outlier stack.
Founders: $200 once, lifetime Pro, funds the build.

What the money funds

Every dollar goes toward the gap between where Outlier is today and where it needs to be: stronger coding models, better agent loops, faster local inference, cleaner UX, and public build logs showing exactly how we compress, speed up, test, and ship.

Community-funded. Build-in-public. No cloud token tax.

Will it run on your Mac?

16 GB
MacBook Air
Vision 35B + Nano + Lite
32 GB
MacBook Pro M-series
+ Quick 26B, Compact 27B, Code 27B
64 GB
Mac Studio / MBP
All tiers except Plus
96+ GB
Mac Studio / Mac Pro
+ Plus 397B-A17B

Apple Silicon only (M1/M2/M3/M4). Intel Macs not supported.

The problem

You can pay $200/month and still run out.

That is the pain Outlier is built around. The best AI coding tools are powerful, but they are rented, rate-limited, and cloud-hosted. Outlier is building the local version: your Mac does the work, your code stays private, and usage does not stop because a server-side meter says so.

Cloud today

Usage caps break flow

Long coding sessions hit limits. The tool gets good right when you need it most, then cuts off. Local inference changes the economics: once the model is downloaded, every token runs on your hardware.

Cloud today

Your code leaves your machine

Private repos, customer data, personal documents, and business plans all move through someone else’s infrastructure. Outlier is local-first: the default path is your Mac, your files, your disk.

Cloud today

You rent the workflow forever

Stop paying, lose access. Outlier’s free tier remains useful forever, Pro is priced at $20/month, and Founders get lifetime Pro while funding the product into existence.

Simple pricing

Free to start. $20 Pro. Founders fund the mission.

The pricing has one job: keep local AI accessible while raising enough money to build a stronger offline coding-agent platform. Free gets Nano + Lite. Pro unlocks the full product. Founders pay once and help make it real.

Free

$0 forever

For everyone who wants local private AI without a subscription.

  • Outlier Nano
  • Outlier Lite
  • Local chat + sessions
  • No account required
  • No token bill
Download beta app

Founders

$200 once

Lifetime Pro for the people who help fund the push toward high-quality offline agentic AI.

  • Lifetime Pro unlock
  • Founder recognition + early access
  • Early builds and build logs
  • Directly funds model + agent work
  • Limited founder cohort
Become a Founder
Why we need the community

The app works. The ambition is bigger.

Outlier is not pretending it already matches the best cloud coding agents. The honest story is better: v1.8 is the base, the community funds the climb, and every major step gets shown in public — the compression work, the benchmarks, the failed runs, the speedups, the fixes.

Now · v1.8.58

Local Mac AI foundation

Signed, notarized Mac app. Streaming chat, local sessions, seven model tiers, model downloads, memory, project files, web research, vision support, and a Pro gate that unlocks the full stack.

Next · agent quality

Cloud-style coding workflow

Plan → diff → approve → patch → test → repair. The goal is not just a chat model that writes code; it is a local coding platform that can work across a repo without burning cloud usage.
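As a rough illustration of the shape of that loop (purely a sketch — every function name here is hypothetical, not Outlier's actual internals):

```python
# Toy sketch of a plan -> diff -> approve -> patch -> test -> repair loop.
# All function names are hypothetical stand-ins; Outlier's real agent
# internals are not public, so this shows the shape, not the API.

def run_agent_task(task, propose_patch, apply_patch, run_tests, approve,
                   max_repairs=3):
    """Drive one agent task through an approval-gated edit/test cycle."""
    patch = propose_patch(task)                    # plan + produce a diff
    if not approve(patch):                         # human reviews the diff
        return "rejected"
    apply_patch(patch)                             # patch the working tree
    for _ in range(max_repairs):
        if run_tests():                            # run the project's tests
            return "passed"
        patch = propose_patch(task, failing=True)  # repair: propose a fix
        apply_patch(patch)
    return "gave-up"                               # bounded, never loops forever
```

The key design point is the approval gate before any file is touched and a bounded repair budget, so a stuck model stops instead of burning cycles.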

Next · model quality

Better coding models

Every founder seat helps pay for the runs, evals, and infrastructure to close the quality gap. Stronger local coding, better reasoning, better long-context behavior, and honest benchmark reporting.

Ongoing · speed

Compression + inference work

The build logs will show how we shrink, route, cache, page, and speed up large open models so they become practical on consumer Macs instead of locked behind H100s.
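To make the paging idea concrete, here is a toy LRU cache for mixture-of-experts weights — an illustrative sketch only, with a hypothetical loader; Outlier's actual V9/V11 paging is far more involved:

```python
from collections import OrderedDict

class ExpertCache:
    """Toy LRU cache for MoE expert weights: keep at most `capacity`
    experts resident in RAM, evict the least-recently-used, and reload
    from disk on a miss. `load_expert` is a hypothetical loader."""

    def __init__(self, capacity, load_expert):
        self.capacity = capacity
        self.load_expert = load_expert
        self.resident = OrderedDict()   # expert_id -> weights, in LRU order
        self.loads = 0                  # disk loads (cache misses)

    def get(self, expert_id):
        if expert_id in self.resident:
            self.resident.move_to_end(expert_id)   # mark as recently used
            return self.resident[expert_id]
        weights = self.load_expert(expert_id)      # page in from disk
        self.loads += 1
        self.resident[expert_id] = weights
        if len(self.resident) > self.capacity:
            self.resident.popitem(last=False)      # evict LRU expert
        return weights
```

The trade this illustrates is the one behind the Plus tier's numbers: a smaller resident set means less RAM at the cost of more disk paging, and tuning that balance is exactly the kind of work the build logs will document.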

This is a community bet: enough people want unlimited local AI badly enough to fund the missing pieces together. If that is you, Founders is the cleanest way to help.

v1.8 model lineup

Free starts small. Pro unlocks the serious tiers.

The product is organized around simple expectations: Free is useful immediately. Pro unlocks the heavier models and advanced workflows. The numbers below are the grounded v1.8 shipping story.

Tier · Plan · Best for · Disk / RAM · Speed / note
Outlier Nano
Qwen3.5-4B · MLX 4-bit
Free · Fast iteration, lightweight chat, small Macs · 2.37 GB · 6 GB min RAM · 71.7 tok/s
Outlier Lite
Qwen3.5-9B · MLX 4-bit
Free · Daily local AI, writing, search, Q&A · 5.04 GB · 12 GB min RAM · 53.4 tok/s
Outlier Quick
Gemma-4-26B MoE
Pro · Thinking-mode reasoning, not a code substitute · 15.61 GB · 16 GB min RAM · 14.6 tok/s
Outlier Core
Qwen3.6-27B · text-only
Pro · Best default quality, reasoning, coding · 15.13 GB · 24 GB min RAM · HumanEval 0.866 · 20.7 tok/s
Outlier Code
Core weights + code config
Pro · Coding workflow, lower-temp code-tuned setup · 15.13 GB · 24 GB min RAM · Same verified base as Core
Outlier Plus
Qwen3.5-397B-A17B · V9 paged / V11 streaming
Pro · Frontier-adjacent local experiments on high-end Macs · 209 GB disk · 96 GB min, 128 GB recommended · V9 K=20: 1.59 tok/s @ 13.75 GB · V11 K=4 N=4 LRU=8: 3.28 tok/s @ 7.34 GB
Outlier Vision
Qwen3.6-35B-A3B · vision retained
Pro · Images, screenshots, OCR, multimodal reasoning · 19.0 GB · 24 GB min RAM (16 GB Air on V11 streaming) · V9 K=256: 16.31 tok/s @ 34.04 GB · V11 K=4 N=2 LRU=30 XPF=1: 15.96 tok/s, ~3.16 GB peak (multi-turn)

Core / Code / Plus / Vision are Pro-gated in the current v1.8 framing. Code uses the same weights as Core with code-specialized configuration. Quick is useful for reasoning, not positioned as a coding tier.

What ships now

A local AI workstation, not just a chat box.

v1.8 is the foundation: local chat, session history, model picker, model downloads, project context, memory, web research, agent tools with approval, test panels, vision, and a Mac-native app shell. Some edges are rough. That is why the build is funded in public.

Local

Chat + sessions

Streaming token output, persistent local history, rename/delete/pin, Markdown export, demo session on first run, cost display at $0.00, and local storage across restarts.

Agent

Tools with approval

File read/write, shell execution with approval, permission modes, plan review card, repair loop, audit log, path scoping, project map, symbols, dependencies, snapshots, and tests.

Research

Search + citations

Deep research mode, DuckDuckGo with Wikipedia fallback, source filters, export, follow-up, summary cards, trust badges, and inline citations with source excerpts.

Memory

Local memory

Persistent memory in SQLite, short/medium/long-term tiers, provenance tracking, review cards, conflict detection, decay, frequency, and MEMORY.md export.
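A minimal sketch of what a tiered SQLite memory store can look like — the schema, tier names, and functions below are illustrative assumptions, not Outlier's actual database layout:

```python
import sqlite3
import time

# Toy tiered memory store in SQLite. Schema and tier names are assumptions
# made for illustration; Outlier's real memory system is richer (review
# cards, conflict detection, MEMORY.md export, and more).

def open_memory(path=":memory:"):
    db = sqlite3.connect(path)
    db.execute("""CREATE TABLE IF NOT EXISTS memories (
        id INTEGER PRIMARY KEY,
        tier TEXT CHECK (tier IN ('short', 'medium', 'long')),
        content TEXT NOT NULL,
        source TEXT,                -- provenance: where the memory came from
        uses INTEGER DEFAULT 0,     -- frequency counter
        created REAL)""")
    return db

def remember(db, tier, content, source):
    db.execute("INSERT INTO memories (tier, content, source, created) "
               "VALUES (?, ?, ?, ?)", (tier, content, source, time.time()))

def recall(db, tier):
    """Fetch a tier's memories and bump their frequency counters."""
    rows = db.execute("SELECT id, content FROM memories WHERE tier = ?",
                      (tier,)).fetchall()
    db.executemany("UPDATE memories SET uses = uses + 1 WHERE id = ?",
                   [(r[0],) for r in rows])
    return [r[1] for r in rows]

def decay(db, tier, max_age_seconds):
    """Drop stale memories in a tier (a crude stand-in for decay logic)."""
    db.execute("DELETE FROM memories WHERE tier = ? AND created < ?",
               (tier, time.time() - max_age_seconds))
```

Everything stays in one local file on disk, which is the point: memory that survives restarts without ever leaving the Mac.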

Vision

Images + screenshots

Image upload and direct image queries through Outlier Vision. Best for OCR, screenshots, diagrams, and multimodal Q&A — not positioned as the coding model.

Mac

Signed + notarized

macOS arm64 DMG, Apple notarization accepted, Gatekeeper accepted, GitHub Releases distribution, and a Tauri updater pointed at the latest manifest.

The comparison that matters

Outlier vs. the monthly bill.

The cloud tools proved the workflow: coding agents, long-context research, file-aware assistants, and always-on help. The problem is the meter. Outlier is building the local version: Mac-native, private by default, and not capped by an Outlier API token allowance.

Cloud AI tools · Cloud coding agents · Outlier Free · Outlier Pro
Monthly cost: $20+ · Often much higher · $0 · $20
Usage model: Server-side limits · Usage windows / caps · No Outlier token meter · No Outlier token meter
Where inference runs: Provider cloud · Provider cloud · Your Mac · Your Mac
Privacy default: Remote request · Remote repo/context · Local by default · Local by default
Offline use: No · No · Yes, once models download · Yes, once models download
Current maturity: Very mature · Very mature · Useful beta · Ambitious beta
Goal: Hosted assistant · Hosted coding workflow · Local baseline · Cloud-style coding workflows, local

Important framing: Outlier is not claiming parity with the best cloud coding agents today. The beta is the foundation; Founders and Pro revenue fund the climb toward that experience locally.

Why local matters

Local AI can be lighter on the planet.

The environmental case is simple: cloud inference needs datacenters, networking, cooling, and always-growing GPU clusters. Local inference uses the Apple Silicon chip you already own. Outlier is a bet that more AI work should happen at the edge.

Edge

When a model runs locally, the query does not need a round trip through a remote GPU cluster.

0 cloud tokens

Outlier local models do not create an Outlier cloud inference bill or cloud token meter.

Mac

Apple Silicon unified memory is efficient for local inference compared with shipping every prompt to a server.

This is not a claim that every local query is automatically cleaner in every situation. Hardware, model size, electricity source, and usage pattern all matter. The point is directionally important: if a huge share of everyday AI inference can move from datacenters to efficient devices people already own, the load on cloud infrastructure can drop.

That is why compression, routing, quantization, paging, and model fit are not just engineering details. They are part of the product philosophy. A useful local model is not just cheaper for the user — it can also reduce unnecessary cloud dependence for everyday tasks.

The best environmental feature is not a green badge. It is a model that is good enough, small enough, and fast enough that people actually choose to run it locally.

Practical framing: local-first when possible, cloud only when needed. Outlier’s default path is the Mac.

Proof, not vibes

Numbers with a paper trail.

Provenance belongs front and center. Outlier should be ambitious, but the numbers still need to be honest: source file, command, sample size, standard error, date measured.

📁 Source files

Benchmarks should trace back to eval artifacts or sprint logs. If a number cannot point to a file, it should not become marketing copy.

⌨️ Exact commands

The harness, version, dtype, device, shot count, and sample size matter. Smoke tests stay smoke tests.

📊 Standard error

Small benchmark lifts are not magic. Outlier’s public story should separate strong absolute scores from statistically significant improvements.

🚫 No fake parity

The goal is cloud-style coding workflows running locally. The current beta is not yet equal to the best cloud coding agents, and the site should say that clearly.

🧪 Public build process

The community funds model runs, evals, compression, speed work, agent loops, and UX hardening. The process is part of the product.

⚡ One-month pace

Outlier went from idea to shipped beta in about a month. More runway means more focused cycles on model quality, agents, and polish.

Current site-safe headline numbers: v1.8.58 shipped as a notarized Mac DMG; Free includes Nano + Lite; Pro unlocks the shipped heavier tiers; local generation has no Outlier API token meter.

Research + build track

Where the next money goes.

Founders and chip-ins are not abstract support. They fund specific work: better coding models, better local speed, more reliable agents, better tests, and a public process that shows the wins and failures.

Shipped · v1.8.58

Mac-native local foundation

Notarized DMG, seven model tiers, local chat, sessions, downloads, memory, project context, research, vision, and Pro unlock.

● Live beta
Next · agent workflow

Plan → diff → test → repair

The product goal is a local coding-agent loop that can work across real projects without burning paid cloud usage.

● Funded by Pro + Founders
Next · model quality

Stronger local coding tiers

Better coding performance means disciplined evals, better prompts/configs, distillation experiments, and no inflated benchmark claims.

● Community-funded
Ongoing · speed

Compression + inference

Make bigger open models practical on consumer Macs through paging, caching, quantization, routing, and product-level hardware honesty.

Always improving

The pitch is not “trust us.” The pitch is: help fund the work, use the beta, report what breaks, and watch the process of making local AI better in public.

Fund the build

Help us build the platform we all want to use.

This is the call to action: buy Founders, subscribe to Pro, or chip in whatever you can. The money goes into making Outlier better — coding-agent quality, local model quality, speed, compression, and the public process behind all of it.

Outlier went from idea to a shipped, notarized Mac app in about a month. That included seven model tiers, local inference, agents, memory, vision, payments, packaging, and the first public build. Imagine what this can become with more runway and two more months of focused building. We will take anything you can give us. Thank you.

Best way to help

Founders

$200 · one time

Lifetime Pro. Early builds. Founder recognition. A direct vote for local AI with no API token caps. Founders revenue funds model runs, evals, agent work, and polish.

Become a Founder →
Pro
$20/mo
Unlock everything in Outlier now and fund the build monthly.
Subscribe →
Chip in
Any amount
Chip in anything you can give. One time. No subscription. Thank you — genuinely.
Chip in →
Custom support
Any amount
Want to sponsor a benchmark run, feature, or larger amount? Email Matt directly.
Email Matt →
Community

We need everybody’s help.

The community is not a side quest. It is how this gets built: users testing real workflows, reporting what breaks, funding the next sprint, and holding the benchmarks honest. The more people using it, the faster it gets better.

FAQ

Clear answers.

Is Outlier already as good as the best cloud coding agents?
No. That is the goal, not the current claim. v1.8 is a real local Mac app with shipped models and agent features, but the app and models still need work to reach the quality of the best cloud coding agents. Founders and Pro revenue fund that gap.
What does “unlimited tokens” mean?
Outlier does not meter local model usage or charge per token. Once a model is downloaded, generation runs on your Mac. Your practical limit is hardware, power, storage, model size, and time — not a server-side usage cap from Outlier.
What is free?
Nano and Lite. They are the lightweight local tiers for everyday use. Free is meant to be useful, not a fake trial.
What does Pro unlock?
Pro is $20/month and unlocks the full Outlier stack: Quick, Core, Code, Plus, Vision, and the advanced app capabilities gated behind Pro.
Why buy Founders?
Founders is $200 once for lifetime Pro and direct support of the mission. It funds the work required to make the coding-agent experience stronger, faster, and closer to the cloud tools people already love.
Does my data stay private?
Local model inference runs on your Mac. Chat history and memory are stored locally. Web search and external APIs are optional paths, not the default. If you turn on web search or bring an external API key, that specific request leaves your device by design.
What Mac do I need?
Apple Silicon Mac. Nano and Lite fit smaller machines; Core, Code, and Vision are best on 24 GB+ RAM; Plus needs significant disk space and is best for higher-end Macs.
Can I just donate?
Yes. Use the chip-in link for a one-time contribution of whatever you can give, or email Matt for a custom sponsorship, benchmark run, or feature sponsorship.
Build the offline future

Unlimited local usage. Private by default.

The cloud tools proved what the workflow should feel like. Now we build the version that runs locally, belongs to the user, and has no API token cap from Outlier mid-sprint.