Open Source Reactive SQL Engine

Extract. Transform. Load.
One binary. Zero infrastructure.

Connect any API or database. Transform with composable SQL pipelines. Deliver results anywhere. Replace Flink, Fivetran, and ksqlDB with a single binary.

brew install litejoin/tap/litejoin
[Live pipeline diagram]
Extract:   Stripe CDC (postgres_cdc, 42/s) · Weather API (http_api, 12/s) · Users DB (postgres_cdc, 38/s)
Transform: order_enrichment (Join, 38/s) → revenue_5m (Window 5m, 12/eval)
Load:      SSE Out (sse, 38/s) · Postgres (postgres, 12/s) · Webhook (http, 38/s)

Three verbs. That's the whole product.

Connect sources, transform with SQL, deliver to sinks. The mental model is simple because the engine handles the hard parts.

01 — Extract

Connect your data sources

Paste a URL or connection string. LiteJoin polls APIs with intelligent diffing, streams CDC from Postgres, and consumes from Kafka.

REST API (poll + diff)
Postgres CDC (logical replication)
Kafka (consumer groups)
HTTP Webhook (POST ingest)
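The "poll + diff" pattern for REST sources — fetch a snapshot, emit only records that changed since the last poll — can be sketched in a few lines. This is an illustrative sketch, not LiteJoin's actual implementation; the `fetch` callable and record shape are assumptions.

```python
import hashlib
import json

def diff_poll(fetch, seen):
    """Poll an API snapshot and emit only records not seen before.

    fetch: callable returning a list of dict records (one API poll)
    seen:  set of content hashes from previous polls (mutated in place)
    """
    new_records = []
    for record in fetch():
        # Hash the canonical JSON form so reordered keys don't
        # register as changes.
        digest = hashlib.sha256(
            json.dumps(record, sort_keys=True).encode()
        ).hexdigest()
        if digest not in seen:
            seen.add(digest)
            new_records.append(record)
    return new_records

seen = set()
snapshot = [{"id": 1, "temp": 20}, {"id": 2, "temp": 25}]
print(len(diff_poll(lambda: snapshot, seen)))   # 2 — first poll emits all
snapshot = [{"id": 1, "temp": 21}, {"id": 2, "temp": 25}]
print(len(diff_poll(lambda: snapshot, seen)))   # 1 — only the changed record
```

Hashing the whole record means the source stays stateless about the upstream API: no cursor or updated-at field is required, only memory of what has already been emitted.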
02 — Transform

SQL on live streams

Join streams, aggregate with time windows, or write arbitrary SQL. Transformations are composable — chain them together, fan them out.

Joins (multi-stream)
Windows (tumbling / session)
Custom SQL (full SELECT)
Chaining (composable)
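A tumbling window assigns each event to exactly one fixed-size, non-overlapping time bucket. Since LiteJoin's SQL (`json_extract`) is SQLite-compatible, the bucketing logic can be sketched with embedded SQLite; the `charges` table shape and the integer-second timestamps are assumptions for illustration.

```python
import sqlite3

# Hypothetical event table; LiteJoin's actual stream schema may differ.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE charges (key TEXT, amount REAL, ts INTEGER)")
db.executemany(
    "INSERT INTO charges VALUES (?, ?, ?)",
    [
        ("ch_1", 10.0, 0),    # lands in window [0, 300)
        ("ch_2", 20.0, 120),  # lands in window [0, 300)
        ("ch_3", 5.0, 330),   # lands in window [300, 600)
    ],
)

# Tumbling 5-minute window: integer-divide the timestamp by the window
# size so every event falls into exactly one bucket.
rows = db.execute(
    """
    SELECT (ts / 300) * 300 AS window_start,
           SUM(amount)      AS revenue,
           COUNT(*)         AS n
    FROM charges
    GROUP BY window_start
    ORDER BY window_start
    """
).fetchall()
print(rows)  # [(0, 30.0, 2), (300, 5.0, 1)]
```

A session window differs only in how bucket boundaries are chosen: instead of fixed intervals, a new bucket starts whenever the gap since the previous event exceeds a timeout.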
03 — Load

Deliver results anywhere

Push transformed data to Postgres, fire webhooks, stream via SSE to your frontend. Delivery health and retry built in.

HTTP Webhook (POST)
SSE (real-time push)
Postgres (upsert)
SQLite (embedded)
Kafka (async producer)
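The upsert semantics of the Postgres and SQLite sinks — insert a new row, or overwrite the existing row for the same key — can be sketched with SQLite's `ON CONFLICT` clause (Postgres accepts the same syntax). The table shape is illustrative, not LiteJoin's actual sink schema.

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE revenue (plan TEXT PRIMARY KEY, amount REAL)")

def upsert(plan, amount):
    # Insert, or overwrite the row with the same primary key, so
    # replayed or updated records never create duplicates.
    db.execute(
        """
        INSERT INTO revenue (plan, amount) VALUES (?, ?)
        ON CONFLICT(plan) DO UPDATE SET amount = excluded.amount
        """,
        (plan, amount),
    )

upsert("pro", 100.0)
upsert("pro", 150.0)   # same key: the row is updated, not duplicated
print(db.execute("SELECT * FROM revenue").fetchall())  # [('pro', 150.0)]
```

Upserts make the sink idempotent, which is what lets delivery retries be safe: redelivering a record after a transient failure converges to the same table state.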

One source. Many pipelines.

LiteJoin transformations fan out and chain together. One Stripe CDC source can feed a join, a window, and a SQL filter simultaneously. No linear pipeline limitations.

Fan-out

One source feeds multiple transformations independently

Chaining

Output of one transformation becomes input to the next

Multi-input

Multiple sources feed into a single join transformation

Transformation — order_enrichment (Join)
SELECT
  c.key AS charge_id,
  json_extract(c.payload, '$.amount') / 100.0 AS amount,
  json_extract(c.payload, '$.currency') AS currency,
  json_extract(u.payload, '$.name') AS customer_name,
  json_extract(u.payload, '$.plan') AS plan_tier
FROM stripe_charges c
LEFT JOIN users u
  ON json_extract(c.payload, '$.customer') = u.key

-- Chains into: revenue_5m (Window 5m tumbling)
-- Fan-out to: high_value_filter (SQL, amount > 1000)
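Because `json_extract` is SQLite-compatible, the order_enrichment join above can be exercised against sample rows with Python's embedded SQLite. The `(key, payload)` table shapes are inferred from the query itself; the sample Stripe and user payloads are invented for illustration.

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE stripe_charges (key TEXT, payload TEXT)")
db.execute("CREATE TABLE users (key TEXT, payload TEXT)")
db.execute(
    "INSERT INTO stripe_charges VALUES ('ch_1', "
    "'{\"amount\": 4200, \"currency\": \"usd\", \"customer\": \"u_1\"}')"
)
db.execute(
    "INSERT INTO users VALUES ('u_1', '{\"name\": \"Ada\", \"plan\": \"pro\"}')"
)

# The join keys live inside JSON payloads, so the ON clause extracts
# the foreign key from the charge payload and matches it to users.key.
row = db.execute("""
    SELECT c.key                                        AS charge_id,
           json_extract(c.payload, '$.amount') / 100.0  AS amount,
           json_extract(c.payload, '$.currency')        AS currency,
           json_extract(u.payload, '$.name')            AS customer_name,
           json_extract(u.payload, '$.plan')            AS plan_tier
    FROM stripe_charges c
    LEFT JOIN users u
      ON json_extract(c.payload, '$.customer') = u.key
""").fetchone()
print(row)  # ('ch_1', 42.0, 'usd', 'Ada', 'pro')
```

The LEFT JOIN matters for streams: a charge whose customer record has not arrived yet still flows through, with NULL enrichment columns, instead of being dropped.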

Build pipelines from your editor

LiteJoin ships an MCP server that exposes every engine capability as a tool. Create sources, chain transformations, wire sinks — all through natural language in your IDE.

Describe an entire ETL pipeline in one sentence. The MCP server creates the sources, writes the SQL, chains the transformations, and starts the pipeline. Open Studio to watch it run.

Works with any MCP client:

Claude Code · Cursor · Windsurf · Claude Desktop

Claude Code
> Join Stripe charges with users, compute 5-min revenue by plan, alert via webhook when any plan drops 20%.
Creating source stripe_charges (postgres_cdc)
Creating source users (postgres_cdc)
Creating join order_enrichment ON $.customer = users.key
Chaining window revenue_5m (tumbling 5m) GROUP BY plan_tier
Chaining SQL drop_detector compare current vs previous window
Creating sink webhook_alert (http)
Pipeline running: 2 sources, 3 transforms, 1 sink
Open Studio to visualize the pipeline

See your data flow

The visual companion to your pipelines. Watch data stream through every transformation. Ships with the binary — no account, no signup.

Dataflow Canvas

Your entire pipeline as a left-to-right DAG. Sources, transformations, sinks. Edges animate with live throughput.

Live Data Inspection

Click any node to see live records streaming. Double-click a row to inspect the full JSON payload.

SQL at Every Step

Every transformation exposes its SQL. Edit inline, run it, see results immediately. Write it or let MCP write it for you.

Ambient Observability

Throughput, latency, error counts on every node. Edge thickness shows volume. No separate dashboard needed.

Fan-out & Chaining

The canvas makes composition visual. See fan-out relationships, inspect data at every stage of a multi-step pipeline.

MCP Companion

Build pipelines from Claude Code. Open Studio and the canvas updates live — the dashcam to your MCP workflow.

Ships with the binary. No account needed. Just run litejoin studio

You shouldn't need a platform team
for real-time data

Purpose-built for developers who want real-time ETL without enterprise operational weight.

                        LiteJoin            Kafka Streams       Apache Flink      Fivetran
Time to first pipeline  5 minutes           Hours               Days              Minutes
Infrastructure          Single binary       Kafka cluster       Flink + Kafka     SaaS
API to stream           Paste a URL         Kafka Connect       Custom source     Connectors
Transformations         SQL (composable)    Java DSL            Java / SQL        UI / dbt
Composability           Fan-out + chaining  Linear topologies   DAG (complex)     Linear
AI-native (MCP)         Built in            No                  No                No
Cost                    $0 open source      $$$$                $$$$              $$$$
Best for                App developers      Java backend teams  Data engineering  Non-technical

Connect to anything. Stream everywhere.

Sources and sinks — with more shipping continuously.

Sources

REST API
Postgres CDC
Kafka
HTTP
MySQL
MongoDB

Sinks

Webhook
SSE
Postgres
SQLite
Kafka

Heavy on connections.
Lite on everything else.

Start building pipelines in 5 minutes

One binary. One config file. Real-time ETL running on your machine before your coffee gets cold.