
The owl that never blinks.
Compress. Dependencies distilled into pellets — 200ms reinstalls.
Watch. Golden signals across every service, from his perch.
Guide. A local-first AI agent in your terminal. No cloud. No keys.
Olly sees what you can't. Across every repo, every branch, every dependency.
What Olly brings to your stack
Every tool in the Nestr ecosystem is named after something an owl does, because that's exactly what Olly is: a wise, watchful guardian for your repositories.
Pellets — Olly digests your dependencies
Like a real owl, Olly compresses what he consumes into pellets — zstd-compressed dependency bundles cached across your monorepo. Architecture-aware, lockfile-driven, absurdly fast.
- 200ms reinstalls
- Up to 22x compression
- Any language, any package manager
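The mechanics behind lockfile-driven caching are easy to sketch. The snippet below is a hedged, portable approximation, not Pellets' real implementation: gzip stands in for zstd, and every path and filename is illustrative.

```shell
set -eu
workdir=$(mktemp -d)
cd "$workdir"

# Toy project: a lockfile plus an installed dependency tree.
mkdir -p node_modules/left-pad
echo '{"left-pad":"1.3.0"}' > package-lock.json
echo 'module.exports = s => s' > node_modules/left-pad/index.js

cache=.pellet-cache
mkdir -p "$cache"

# Cache key is a hash of the lockfile: identical lockfiles share one pellet.
key=$(sha256sum package-lock.json | cut -d' ' -f1)
pellet="$cache/$key.tar.gz"

if [ -f "$pellet" ]; then
  tar -xzf "$pellet"                 # hit: unpack instead of reinstalling
else
  tar -czf "$pellet" node_modules    # miss: install normally, then store
fi
echo "pellet: $pellet"
```

Because the key is derived from the lockfile alone, any branch or CI job with an identical lockfile gets a pure unpack, which is where sub-second reinstalls come from.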
The Nest — see everything Olly sees
A real-time dashboard where Olly reports back. Cache status, compression metrics, SLO tracking — the view from above, always up to date via WebSocket.
- Real-time updates
- Cache hit/miss analytics
- One-click invalidation
Perch — Olly's observation post
Every owl needs a perch. This one wires up Prometheus, Grafana, Jaeger, and ELK in a single Helm install. Golden signals, SLO tracking, and cost monitoring — Olly watches so you don't have to.
- One helm install
- 8 pre-built dashboards
- Trace-log correlation
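As a sketch of what a single-install stack can look like: the values file below is purely illustrative (these keys are not Perch's actual chart schema), written out in shell so the one-command install shape is visible.

```shell
# Hypothetical values file: every key below is illustrative,
# not Perch's real chart schema.
cat > perch-values.yaml <<'EOF'
prometheus:
  enabled: true          # metrics + golden signals
grafana:
  enabled: true          # pre-built dashboards
jaeger:
  enabled: true          # distributed tracing
elk:
  enabled: true          # logs, trace-log correlation
EOF

# Then a single install (commented out here; requires the chart):
# helm install perch ./helm -n monitoring --create-namespace -f perch-values.yaml
```

One values file toggling each component is the whole point: no hand-wiring exporters, no per-service Grafana setup.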
Olly — the owl himself, in your terminal
A local-first AI coding agent that runs entirely on your machine. Quantised open-weight models via llama.cpp. No cloud, no API keys, no telemetry. Olly's wisdom, your hardware.
- Fully offline
- Sandboxed execution
- Full audit trail
The whole owl. One platform.
Pellets compress. The Nest visualises. The Perch monitors. Olly codes alongside you. Each works standalone. Together, they're the monorepo guardian that doesn't exist yet.
- Open architecture
- Works with Nx, Turbo, Bazel
- Self-hostable
Simple, transparent pricing
Start free. Scale when you're ready. No surprises.
Free
For individual devs exploring Nestr on personal projects.
- 1 GB cache storage
- 50 pellets
- 30-day TTL
- 100 MB max pellet size
- Frontend architecture only
- Basic dashboard
- Community support
- CDN distribution
- Perch observability
- Priority Olly builds
Pro
For teams shipping production monorepos at scale.
- 10 GB cache storage
- 500 pellets
- 90-day TTL
- 500 MB max pellet size
- All architectures
- Full dashboard + metrics
- Email support (48h)
- Community Perch
- Priority Olly builds
- CDN distribution
Enterprise
For organisations that need dedicated infrastructure and SLAs.
- Unlimited cache storage
- Unlimited pellets
- Custom TTL
- Unlimited pellet size
- All architectures + custom
- Full dashboard + SSO + audit log
- Dedicated Slack + SLA
- Managed Perch
- Dedicated Olly support
- CDN distribution
$199 — Lifetime Pro access
Limited to the first 200.
One-time payment. All current and future Pro features, forever. No recurring fees.
Engineers who trust the owl
Early adopters on what it's like having Olly in their stack.
“We cut our CI install step from 4 minutes to under 10 seconds. Pellets just works — no config drama, no surprises.”
Sarah Chen
Staff Engineer @ ScaleOps
“Perch gave us golden signals across 30+ services in a single afternoon. We were manually wiring Prometheus for months before this.”
Marcus Rivera
Platform Lead @ Arcline
“Olly running fully local was the dealbreaker for us. No API keys, no data leaving the machine. Finally an AI tool our security team approves of.”
Jess Okafor
Founding Engineer @ Trelliswork
Where we're headed
A transparent look at what's coming next. Timelines are honest estimates, not marketing promises.
Next Month
Engine v0.9
Public beta with full CLI + API
Olly alpha
First downloadable builds
Landing page
You're looking at it
3 Months
Web dashboard beta
Public access to cache + metrics UI
Perch Helm 2.0
Simplified install, Loki integration
Pro tier launch
Paid plans with expanded limits
GitHub Actions plugin
CI/CD integration out of the box
6 Months
Engine v1.0
Stable release with S3 + HTTP registries
Enterprise tier
CDN distribution, SSO, audit logs
Olly multi-model
Swap models per task — code, docs, review
VS Code extension
Pellets + Olly right in the editor
Alex
Founder, Nestr
I went from designing spacecraft systems to shipping digital products. That sounds like a big leap, but the core problem is the same: managing complexity across distributed systems where failure isn't an option.
Nestr started because I got tired of watching CI pipelines burn minutes reinstalling the same packages, wiring up the same Prometheus exporters by hand, and pretending that “just use Docker” was a strategy. I wanted a wise, tireless guardian that could watch over every repo, compress what needed compressing, and alert me when something went sideways.
So I built Olly. Named the compressed bundles pellets because that's what owls produce. The monitoring stack became his perch. The whole platform is a nest. Turns out, owls make excellent metaphors for repository orchestration.
This isn't a VC-backed rocketship. It's one person and one owl, building something useful, in public, one commit at a time. If that resonates — I'd love to have you along for the ride.
High energy, always learning, constantly building.
Founder journal
Curious how this gets built? Real decisions, real trade-offs, no polish.
2026-03-31
Olly's volume problem — and the 3B model fix
Railway volume hit 97%. Swapped from 7B to 3B quantised model, added trace rotation, Docker layer hardening. Down to 18%.
ollama pull qwen2.5-coder:3b-instruct-q4_K_M
2026-03-22
Why we chose zstd over gzip for Pellets
Benchmarked zstd, gzip, lz4, and brotli across 50 real monorepo dependency trees. Zstd won on both ratio and speed at level 3.
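The shape of that benchmark is simple to reproduce. This is a portable sketch using gzip levels as stand-ins, since zstd, lz4, and brotli binaries aren't guaranteed on every machine; the synthetic tree and file names are made up for illustration.

```shell
set -eu
d=$(mktemp -d)
cd "$d"

# Synthetic "dependency tree": small, highly repetitive files,
# much like real package metadata.
mkdir pkg
for i in $(seq 1 200); do
  echo '{"name":"demo","version":"1.0.0","deps":{}}' > "pkg/f$i"
done
tar -cf tree.tar pkg
raw=$(wc -c < tree.tar)

# Measure ratio at a few levels; swap `gzip -$lvl` for `zstd -3`,
# `lz4`, or `brotli -q 5` where those binaries are installed.
for lvl in 1 6 9; do
  gzip -c "-$lvl" tree.tar > "tree.$lvl.gz"
  echo "gzip -$lvl: $raw -> $(wc -c < "tree.$lvl.gz") bytes"
done
```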
nestr pellet compress --level 3 --verify
2026-03-14
Wiring Perch: one Helm install, 8 dashboards
Got Prometheus, Grafana, Jaeger, and the ELK stack deploying from a single values.yaml. Golden signals dashboard came free.
helm install perch ./helm -n monitoring --create-namespace
Let Olly watch your repos
Join the waitlist to be first in line when Olly opens his nest. Early members get access to the lifetime Pro deal.
No spam. Unsubscribe anytime. We respect your inbox.
Frequently asked questions
Straight answers to the things you're probably wondering.
Building in public
We believe in transparency. Here's where the project stands right now.
We're building in public. Follow along →
Know someone who needs an owl?
Share Olly with your team or a friend drowning in monorepo chaos.