If you’re running workflows with Zapier or building internal tools on Retool, you’re paying per-execution and per-seat fees that compound fast as your team scales. Windmill is an open-source, self-hosted developer platform that replaces both — offering workflow automation and internal UI building from a single code-first platform.
In this guide, you’ll learn what Windmill is, how it compares to Zapier and Retool, and exactly how to deploy and use it on your own infrastructure.
Quick Comparison: Windmill vs Zapier vs Retool
| Feature | Windmill | Zapier | Retool |
|---|---|---|---|
| License | AGPLv3 (open-source) | Proprietary SaaS | Proprietary (self-hosted paid) |
| Pricing (self-hosted) | Free (Community Edition) | N/A | $10–$50/seat/month |
| Workflow Engine | ✅ Native (flows, schedules) | ✅ Native | ❌ (needs external tool) |
| Internal UI Builder | ✅ Native (frontend components) | ❌ | ✅ Native |
| Language Support | Python, TypeScript, Go, SQL, Bash, PowerShell | No-code only | JavaScript |
| Code-First | ✅ Scripts are version-controlled code | ❌ | ⚠️ JS snippets |
| Git Integration | ✅ Full (push/pull, branches) | ❌ | ⚠️ Limited |
| Self-Hosted | ✅ Docker / Kubernetes | ❌ | ✅ (Enterprise only) |
| AI Code Generation | ✅ Built-in AI script generation | ✅ (AI Zap creation) | ✅ (AI component gen) |
| Min RAM | 2 GB | N/A | 4 GB |
| Best For | Dev teams, SREs, data engineers | Non-technical users, marketing ops | Internal tools, admin panels |
Why Self-Host Your Automation Platform?
Running Zapier or Retool as a SaaS is convenient, but it comes with real trade-offs:
Cost at Scale
Zapier charges per task. A moderate workflow that processes 1,000 records and triggers 5 actions per record burns through 5,000 tasks per run. On a $50/month plan with 50,000 tasks, that's just 10 runs: daily batch processing exhausts the plan in about ten days. Retool charges per seat, so every developer, analyst, and support agent who needs access adds $10–$50/month to your bill.
Windmill’s Community Edition is free and self-hosted. You only pay for the infrastructure it runs on.
Data Privacy and Compliance
When your workflows process customer data, financial records, or health information, sending that data through third-party SaaS pipelines creates compliance overhead. Self-hosting keeps everything within your network boundary — no data leaves your servers, no vendor audit trail to manage, no SOC 2 questionnaires to fill out.
Code Ownership and Version Control
Zapier workflows live inside Zapier. Retool apps live inside Retool. Windmill scripts and flows live in Git. Every change is tracked, every version is restorable, and your automation logic is part of your codebase — not trapped in a SaaS platform’s proprietary format.
No Vendor Lock-In
Windmill scripts are plain Python, TypeScript, Go, or SQL. If you ever decide to stop using the platform, your scripts run anywhere. There’s no proprietary node format or visual editor dependency to decode.
What Is Windmill?
Windmill is an open-source developer platform that combines three capabilities into a single system:
Script Execution — Write scripts in Python, TypeScript, Go, SQL, Bash, or PowerShell and run them on a schedule, on a webhook trigger, or on demand.
Workflow Orchestration — Chain scripts together into flows with branching logic, loops, error handling, and parallel execution. Think of it as a code-first Airflow or Temporal.
Internal App Building — Generate user interfaces from scripts automatically, or build custom UIs with Windmill’s frontend components (tables, forms, charts, buttons). Think of it as a code-first Retool.
The platform was created by Ruben Fiszel and is developed by Windmill Labs. It’s used in production by engineering teams for internal tooling and automation.
Key Features
- Multi-language support — Python, TypeScript, Go, SQL, Bash, PowerShell, and PHP scripts
- Native Git sync — scripts stored in Git repositories with push/pull and branch support
- Built-in scheduler — cron-like scheduling with timezone support and concurrency controls
- Webhook triggers — expose any script as an HTTP endpoint
- Resource management — centrally manage database connections, API keys, and OAuth tokens
- Flow engine — DAG-based workflow execution with retries, timeouts, and error branches
- App builder — auto-generate UIs from scripts or build custom interfaces
- Execution queues — isolate workloads across different worker pools
- Audit logging — full execution history with inputs, outputs, and duration
- Role-based access control — fine-grained permissions for users and groups
Installing Windmill with Docker Compose
The fastest way to get Windmill running is with the official Docker Compose setup. This deploys Windmill with PostgreSQL and a single worker process.
Prerequisites
- Docker and Docker Compose installed
- At least 2 GB RAM available
- A domain name (optional, for HTTPS)
Step 1: Create the Compose File
Create a directory for Windmill and write the compose configuration:
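A minimal compose file, modeled on Windmill's official example (image tags and the database password here are assumptions — compare against the upstream `docker-compose.yml` before using in production):

```yaml
version: "3.8"

services:
  postgres:
    image: postgres:16
    restart: unless-stopped
    environment:
      POSTGRES_PASSWORD: changeme
      POSTGRES_DB: windmill
    volumes:
      - db_data:/var/lib/postgresql/data

  windmill-server:
    image: ghcr.io/windmill-labs/windmill:main
    restart: unless-stopped
    environment:
      - DATABASE_URL=postgres://postgres:changeme@postgres:5432/windmill
      - MODE=server
    ports:
      - "8000:8000"
    depends_on:
      - postgres

  windmill-worker:
    image: ghcr.io/windmill-labs/windmill:main
    restart: unless-stopped
    environment:
      - DATABASE_URL=postgres://postgres:changeme@postgres:5432/windmill
      - MODE=worker
      - WORKER_GROUP=default
    depends_on:
      - windmill-server

  # Lightweight worker for "native" jobs (HTTP fetches, simple queries)
  windmill-worker-native:
    image: ghcr.io/windmill-labs/windmill:main
    restart: unless-stopped
    environment:
      - DATABASE_URL=postgres://postgres:changeme@postgres:5432/windmill
      - MODE=worker
      - WORKER_GROUP=native
    depends_on:
      - windmill-server

volumes:
  db_data:
```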
Save this as docker-compose.yml.
Step 2: Start the Stack
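From the directory containing docker-compose.yml:

```shell
docker compose up -d

# Optionally watch the server come up
docker compose logs -f windmill-server
```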
Wait about 30 seconds for the database to initialize, then open http://localhost:8000 in your browser. The default credentials are:
- Username: `admin`
- Password: `changeme`
Change these immediately in the admin panel.
Step 3: Verify the Installation
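Check that all services are up:

```shell
docker compose ps
```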
You should see four containers running: postgres, windmill-server, windmill-worker, and windmill-worker-native.
Production Deployment with Reverse Proxy
For a production setup, add a reverse proxy and HTTPS termination. Here’s a Caddy configuration that handles TLS automatically:
Docker Compose (Production)
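A sketch of the addition (merge this into the compose file from the installation step; stop publishing port 8000 directly so all traffic goes through the proxy):

```yaml
  caddy:
    image: caddy:2
    restart: unless-stopped
    ports:
      - "80:80"
      - "443:443"
    volumes:
      - ./Caddyfile:/etc/caddy/Caddyfile:ro
      - caddy_data:/data
    depends_on:
      - windmill-server

volumes:
  caddy_data:
```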
Caddyfile
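And a matching Caddyfile (`windmill.example.com` is a placeholder for your own domain):

```
windmill.example.com {
    reverse_proxy windmill-server:8000
}
```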
This gives you automatic HTTPS with Let’s Encrypt. No certificate management needed.
Writing Your First Script
Windmill scripts are plain code files. Let’s create a Python script that queries a database and sends a report.
Step 1: Create a Script
From the Windmill UI, click Scripts → + Script → Python. Or use the CLI if you have Git sync configured.
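A sketch of such a script, here called report_daily_metrics (the `daily_metrics` table, column names, and Slack payload shape are illustrative assumptions). In Windmill, the exported `main` function defines the script's parameters:

```python
from typing import Any


def build_report(rows: list[tuple[str, int]], date: str) -> str:
    """Format query rows into a Slack-ready summary (pure helper, easy to test)."""
    lines = [f"*Daily metrics for {date}*"]
    for metric, value in rows:
        lines.append(f"- {metric}: {value}")
    return "\n".join(lines)


def main(
    date: str,
    db_host: str,
    db_name: str,
    slack_webhook: str,
    db_port: int = 5432,
) -> dict[str, Any]:
    # Imported inside main so the module stays importable without the drivers
    import psycopg2
    import requests

    conn = psycopg2.connect(host=db_host, port=db_port, dbname=db_name)
    with conn, conn.cursor() as cur:
        # Table and column names are placeholders; adapt the query to your schema
        cur.execute("SELECT metric, value FROM daily_metrics WHERE day = %s", (date,))
        rows = cur.fetchall()

    requests.post(slack_webhook, json={"text": build_report(rows, date)}, timeout=10)
    return {"date": date, "metrics_reported": len(rows)}
```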
Step 2: Set Up Database Resources
Instead of hardcoding credentials, use Windmill’s resource system:
- Go to Resources → + Resource → PostgreSQL
- Fill in your connection details (host, port, database, user, password)
- Name it `production_db`
- Windmill stores the credentials encrypted and injects them at runtime
Now modify the script to use the resource:
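In Windmill's Python convention, declaring a type alias named after the resource type (here `postgresql`) tells the platform to render a resource picker for that argument. The alias-to-dict pattern below follows Windmill's documented style, but verify the field names against the current docs:

```python
# Windmill maps this alias name to its "postgresql" resource type;
# at runtime the argument arrives as a plain dict of connection fields.
postgresql = dict


def main(database: postgresql, date: str, slack_webhook: str) -> dict:
    import psycopg2
    import requests

    conn = psycopg2.connect(
        host=database["host"],
        port=database.get("port", 5432),
        dbname=database["dbname"],
        user=database["user"],
        password=database["password"],
    )
    with conn, conn.cursor() as cur:
        # Query is a placeholder; adapt to your schema
        cur.execute("SELECT metric, value FROM daily_metrics WHERE day = %s", (date,))
        rows = cur.fetchall()

    text = "\n".join(f"- {m}: {v}" for m, v in rows)
    requests.post(slack_webhook, json={"text": text}, timeout=10)
    return {"metrics_reported": len(rows)}
```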
Windmill automatically passes the resource as a typed argument — no environment variables or secret management needed.
Building a Workflow (Flow)
Scripts become powerful when chained together. Windmill’s flow engine lets you connect scripts into DAGs with branching, loops, and error handling.
Here’s a real-world example: a data pipeline that extracts data, transforms it, loads it into a warehouse, and sends a notification.
Flow Definition (JSON/YAML via UI)
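A simplified sketch of the flow in Windmill's OpenFlow-style YAML (script paths are placeholders and some fields are abbreviated for readability; the flow editor generates the exact schema for you):

```yaml
summary: Daily ETL pipeline
value:
  modules:
    - id: extract
      value:
        type: script
        path: f/etl/extract_production_data
    - id: transform
      value:
        type: script
        path: f/etl/transform_records
        input_transforms:
          records:
            type: javascript
            expr: results.extract
    - id: load
      value:
        type: branchone
        branches:
          - summary: Load only if there is data
            expr: results.transform.length > 0
            modules:
              - id: load_warehouse
                value:
                  type: script
                  path: f/etl/load_to_warehouse
    - id: notify
      value:
        type: script
        path: f/etl/notify_slack
  failure_module:
    id: failure
    value:
      type: script
      path: f/etl/alert_on_failure
```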
What This Flow Does
- Extract — Runs a Python script to pull data from the production database
- Transform — Cleans, validates, and reshapes the data
- Conditional Load — Only loads to the warehouse if there’s data to process
- Notify — Sends a Slack message with the results
- Failure Handler — If anything fails, posts an error alert instead
You can add retry policies, timeouts, and parallel branches. The flow editor shows real-time execution status with color-coded node states.
Building an Internal App
Windmill can auto-generate a UI from any script. For more control, build custom interfaces with frontend components.
Auto-Generated UI
Every script automatically gets a web form based on its type hints. The report_daily_metrics script above generates a form with:
- A date picker (string type)
- A number input for `db_port` (int type with a default)
- Text fields for `db_host`, `db_name`, and `slack_webhook`
- A Run button that executes the script and displays the JSON result
No UI code needed.
Custom App Builder
For a dashboard, use Windmill’s app builder to combine multiple scripts and data sources:
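Apps are assembled visually in the editor, but conceptually a dashboard wires components to script runnables. A schematic sketch (not a literal export format; component names and script paths are illustrative):

```yaml
app: ops-dashboard
components:
  - type: table
    data: runnable f/dashboard/list_open_orders    # script returning rows
  - type: chart
    data: runnable f/dashboard/daily_revenue       # script returning a series
  - type: button
    label: Refresh
    on_click: rerun all runnables
```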
Windmill handles the rendering, data fetching, and component styling. You write the logic, the platform handles the UI plumbing.
Scaling Workers for Production
The default setup uses a single worker container. For production workloads, scale horizontally:
Multiple Worker Groups
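For example, add a dedicated worker service for heavy jobs to the compose file (a sketch; group name and replica count are assumptions):

```yaml
  windmill-worker-heavy:
    image: ghcr.io/windmill-labs/windmill:main
    restart: unless-stopped
    deploy:
      replicas: 3
    environment:
      - DATABASE_URL=postgres://postgres:changeme@postgres:5432/windmill
      - MODE=worker
      - WORKER_GROUP=heavy   # scripts assigned to "heavy" run only on these workers
    depends_on:
      - windmill-server
```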
Assign scripts to specific worker groups by setting the worker_group field in the script configuration. Heavy scripts run on isolated workers so they don’t block fast API calls.
Kubernetes Deployment
Windmill also supports Helm charts for Kubernetes:
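Using the community Helm chart (repository URL is from the windmill-helm-charts project; verify against its README):

```shell
helm repo add windmill https://windmill-labs.github.io/windmill-helm-charts/
helm repo update
helm install windmill windmill/windmill \
  --namespace windmill --create-namespace \
  -f values.yaml   # your overrides: domain, replicas, database, etc.
```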
This deploys with auto-scaling, health checks, and ingress routing out of the box.
Git Sync and Version Control
Windmill’s Git integration turns the platform into a proper development environment:
Enable Git Sync
- Go to Settings → Git Sync
- Connect a Git repository (GitHub, GitLab, Gitea, etc.)
- Set the sync direction (pull, push, or bidirectional)
- Choose a branch (e.g., `main` for production, `dev` for staging)
Workflow
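A typical local loop with the `wmill` CLI (the install command assumes the npm distribution of the CLI, and the workspace arguments are placeholders; adjust to your setup):

```shell
npm install -g windmill-cli      # provides the `wmill` command
wmill workspace add my_workspace my_workspace https://windmill.example.com

wmill sync pull                  # fetch current scripts/flows into the repo
# ...edit scripts locally in your editor...
git checkout -b feature/new-report
git add -A && git commit -m "Add daily report script"
git push origin feature/new-report   # open a PR as usual
wmill sync push                  # or let the merge trigger the sync
```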
Every script is a regular file in your repository. Windmill reads the file path as the script path, parses the function signature for arguments, and uses the file content as the script body. This means:
- Code review works through normal pull requests
- CI/CD can lint and test scripts before they reach Windmill
- Rollback is a `git revert` away
- Local development works in VS Code, Neovim, or any editor
Monitoring and Observability
Windmill provides built-in monitoring for all executions:
Execution Dashboard
The UI shows:
- Running scripts and flows with real-time status
- Execution history with duration, inputs, and outputs
- Failed executions with full error traces
- Schedule status and next run times
API Monitoring
Windmill exposes metrics at a `/metrics` endpoint in Prometheus format:
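For example, a Prometheus scrape job (target, port, and path are assumptions based on the compose setup above; the metrics endpoint may need to be enabled for your edition):

```yaml
scrape_configs:
  - job_name: windmill
    metrics_path: /metrics
    static_configs:
      - targets: ["windmill-server:8000"]
```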
Key metrics include:
- `windmill_script_executions_total` — total script runs
- `windmill_script_execution_duration_seconds` — execution time histogram
- `windmill_queue_depth` — pending jobs per worker group
- `windmill_worker_count` — active workers
Log Aggregation
Send Windmill logs to your existing log stack:
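For example, cap and structure the Docker JSON logs so a collector can tail them (a sketch; merge the `logging` block into each Windmill service in your compose file):

```yaml
  windmill-server:
    logging:
      driver: json-file
      options:
        max-size: "10m"
        max-file: "3"
```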
Then configure Promtail or Fluent Bit to scrape the container logs and ship them to Loki, Elasticsearch, or your preferred log backend.
When Windmill Is the Right Choice
✅ Use Windmill when:
- You have a technical team comfortable with Python, TypeScript, or Go
- You need both workflow automation and internal tools in one platform
- You want Git-based version control for all your automation logic
- You’re running scripts on a schedule or responding to webhooks
- You want to replace Zapier, Retool, or Airplane with a single self-hosted system
- You need fine-grained access control and audit logging
⚠️ Consider alternatives when:
- Your team is entirely non-technical — Zapier or n8n’s visual editor may be easier
- You need heavy IoT / MQTT integration — Node-RED is purpose-built for that
- You want a managed service with zero ops — the setup in this guide is entirely self-managed (Windmill Labs does offer a hosted cloud edition)
- You need complex data pipeline orchestration with backfill — Apache Airflow is more mature for that specific use case
Summary
Windmill consolidates three platforms — workflow automation, internal tools, and script execution — into a single open-source system you can run on your own infrastructure. The code-first approach means your automation logic lives in Git, your scripts are portable, and there’s no vendor lock-in.
For teams that are comfortable writing code and want to move away from per-execution SaaS pricing, Windmill is one of the most capable self-hosted automation platforms available in 2026. The combination of multi-language scripting, a DAG-based flow engine, auto-generated UIs, and full Git integration makes it a compelling foundation for any self-hosted automation stack.
Get started with the Docker Compose setup above, connect your database as a resource, and have your first script running within 10 minutes.
Frequently Asked Questions (FAQ)
Which one should I choose in 2026?
The best choice depends on your specific requirements:
- For beginners: Start with the simplest option that covers your core use case
- For production: Choose the solution with the most active community and documentation
- For teams: Look for collaboration features and user management
- For privacy: Prefer fully open-source, self-hosted options with no telemetry
Refer to the comparison table above for detailed feature breakdowns.
Can I migrate between these tools?
Most tools support data import/export. Always:
- Backup your current data
- Test the migration on a staging environment
- Check official migration guides in the documentation
Are there free versions available?
Windmill’s Community Edition is free and open-source (AGPLv3). Zapier and Retool are proprietary; both offer limited free tiers, with paid plans adding features, priority support, or higher limits.
How do I get started?
- Review the comparison table to identify your requirements
- Visit each tool’s official documentation
- Start with a Docker Compose setup for easy testing
- Join the community forums for troubleshooting