Function-level deep dive: from flutter build to byte-level diffing to patch application on a real phone. Every function, every file, every byte transformation.
📦 Phase 1: Build & Release
What happens when you run shorebird release --platforms android
Reads shorebird.yaml from the project root and extracts app_id. Resolves the Shorebird Flutter SDK revision to use, then calls shorebirdFlutter.installRevision(), which downloads the prebuilt Shorebird Flutter SDK (this SDK contains the forked engine with libupdater.a already linked).

Runs flutter build appbundle --release using the Shorebird Flutter fork. The Dart compiler (gen_snapshot) compiles your Dart code into AOT machine code:

Your Dart code → gen_snapshot → libapp.so (per architecture)

This produces 3 files for Android:
• libapp.so (arm) — ~3.5 MB
• libapp.so (aarch64) — ~3.1 MB
• libapp.so (x86_64) — ~3.2 MB

These files contain ALL your Dart code compiled to native machine instructions. They're the ONLY files that change between patches.
Inside the built bundle:
• libflutter.so — The Flutter engine (Shorebird's fork, with libupdater.a statically linked inside)
• libapp.so — Your Dart code (the AOT snapshot from Step 2)
• flutter_assets/shorebird.yaml — Config (app_id, base_url, channel, auto_update)

Key: libflutter.so never changes between patches. Only libapp.so changes. The engine + updater Rust library is baked in once.
1. POST /api/v1/apps/{appId}/releases → creates the release record (version, flutter_revision)
2. For each arch: POST /api/v1/apps/{appId}/releases/{id}/artifacts → uploads libapp.so + its SHA256 hash
3. PATCH /api/v1/apps/{appId}/releases/{id} → sets status to "active"

The original libapp.so files are stored on the server. They're needed later to compute the binary diff when you create a patch.

Users then install the app with libapp.so bundled inside their APK. The engine (libflutter.so) contains the Rust updater, which will check for patches on every app launch.
🔬 Phase 2: Diff & Patch Creation
What happens when you change code and run shorebird patch --platforms android
Downloads the original libapp.so (per arch) from your backend. This is the exact binary bundled inside the APK your users have; it's needed as the "base" for binary diffing.

Rebuilds your Dart code using the exact same Flutter revision as the original release. This is critical — the Dart compiler must match exactly. Produces a new libapp.so with your code changes.

The patch binary is then created in updater/patch/src/lib.rs → make_patch():

Step 1: Binary Diff (bidiff)
bidiff::simple_diff_with_params(&old_bytes, &new_bytes, &mut writer, &params)
This computes a byte-level binary diff between the old and new libapp.so. The bidiff algorithm finds matching byte sequences and encodes the differences as a series of copy + insert operations. Output: an uncompressed diff (maybe 200 KB–2 MB).

Step 2: Zstd Compression
ZstdCompressor::new().compress(&mut output, &mut diff_bytes)
The diff is piped through zstd compression (via a separate thread + pipe for parallelism). Output: a compressed .patch file.

Step 3: Hash
SHA256(new_libapp.so) — the hash of the final uncompressed patched file, NOT the compressed diff. This hash is uploaded to the server and later used by devices to verify the patch was applied correctly.

Upload:
1. POST /api/v1/apps/{id}/patches → creates the patch record (gets an auto-incremented patch number)
2. POST /api/v1/apps/{id}/patches/{id}/artifacts → uploads the compressed diff per arch + hash
3. POST /api/v1/apps/{id}/patches/promote → links the patch to the "stable" channel

Now any device checking /patches/check with this app_id + version + channel will receive this patch.
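The diff-then-compress pipeline can be sketched in miniature. This is a toy Python version, with zlib standing in for zstd and a naive block matcher standing in for bidiff's suffix sort; the point is the shape: encode the new file as COPY/INSERT ops against the old one, compress the op stream, and hash the patched output rather than the diff.

```python
import hashlib
import zlib

def make_toy_diff(old: bytes, new: bytes, block: int = 8) -> list:
    """Toy stand-in for bidiff: walk `new` in fixed blocks, emitting a
    COPY op when the block exists somewhere in `old` and an INSERT op
    otherwise. Real bidiff uses suffix sorting and also emits ADD
    (delta) ops for near-matches."""
    ops, i = [], 0
    while i < len(new):
        chunk = new[i:i + block]
        off = old.find(chunk)
        if off != -1:
            ops.append(("COPY", off, len(chunk)))
        else:
            ops.append(("INSERT", chunk))
        i += block
    return ops

def apply_toy_diff(old: bytes, ops: list) -> bytes:
    """Device-side reconstruction: replay the op stream against `old`."""
    out = bytearray()
    for op in ops:
        if op[0] == "COPY":
            _, off, length = op
            out += old[off:off + length]
        else:
            out += op[1]
    return bytes(out)

old = b"the quick brown fox jumps over the lazy dog " * 100  # "release" libapp.so
new = old.replace(b"lazy", b"idle")                          # a small Dart change

ops = make_toy_diff(old, new)
patched = apply_toy_diff(old, ops)
assert patched == new

# Like make_patch(): compress the diff, but hash the *patched output*,
# not the compressed diff (zlib stands in for zstd here).
compressed = zlib.compress(repr(ops).encode())
patch_hash = hashlib.sha256(patched).hexdigest()
```

Because most blocks of the new file still exist verbatim in the old one, the op stream is dominated by COPYs and compresses far below the size of the full binary.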
📱 Phase 3: Device Boot & Patch Loading
What happens inside the user's phone when they open the app
FlutterActivity starts → FlutterJNI.nativeInit() called → reads shorebird.yaml from flutter_assets/ → passes it to native C++ code.

Creates the shorebird_updater/ directory in the app's cache + storage paths. Builds an AppParameters struct with:
• release_version: "1.0.0+1"
• code_cache_dir: OS cache directory
• app_storage_dir: persistent storage
• original_libapp_paths: pointer to the bundled libapp.so path

Then calls → shorebird_init(&app_parameters, file_callbacks, yaml)
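As a sketch, the parameters handed to shorebird_init() can be modeled as a plain record. The field names follow the list above; the example paths are hypothetical, and the real struct is a Rust type populated over FFI from the C++ engine.

```python
from dataclasses import dataclass, field

@dataclass
class AppParameters:
    """Sketch mirroring the fields listed above. The real struct is a
    Rust type filled in over FFI; the example paths below are
    hypothetical."""
    release_version: str
    code_cache_dir: str
    app_storage_dir: str
    original_libapp_paths: list = field(default_factory=list)

params = AppParameters(
    release_version="1.0.0+1",
    code_cache_dir="/data/data/com.example.app/cache",
    app_storage_dir="/data/data/com.example.app/files",
    original_libapp_paths=["/data/app/com.example.app/lib/arm64/libapp.so"],
)
```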
1. Parses shorebird.yaml → extracts app_id, base_url, channel, auto_update
2. Sets the global UpdateConfig (once per process, guarded by OnceLock)
3. Loads state.json from the storage dir (or creates it fresh if corrupt/missing)
4. Crash recovery: calls handle_prior_boot_failure_if_necessary() — if currently_booting_patch is set (meaning the last boot crashed), marks that patch as Bad with reason crash_recovery
Next the engine asks the updater: "which libapp.so should I load?" The updater checks the PatchLifecycle state on disk:
• If a patch is installed and not bad → returns its path: /data/data/com.app/shorebird_updater/patches/3/dlc.vmcode
• If no patch is installed → returns NULL (use the original bundled libapp.so)
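A minimal sketch of that boot-time decision, assuming a simple map of patch number to status (the real updater reads a richer PatchLifecycle from disk):

```python
from pathlib import Path

def next_boot_patch_path(storage_dir: str, patches: dict):
    """Boot-time decision sketch: return the newest installed,
    not-known-bad patch's artifact path, or None to signal the engine
    to load the bundled libapp.so. `patches` maps number -> status."""
    for number in sorted(patches, reverse=True):
        if patches[number] == "installed":
            return Path(storage_dir) / "patches" / str(number) / "dlc.vmcode"
    return None  # NULL in the real C API: fall back to the original libapp.so

# A bad patch 3 falls back to the still-good patch 2:
assert next_boot_patch_path("/s", {2: "installed", 3: "bad"}).parent.name == "2"
# No installed patch at all: use the bundled base release.
assert next_boot_patch_path("/s", {3: "bad"}) is None
```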
// Android (non-interpreter mode):
settings.application_library_path.clear();
settings.application_library_path.emplace_back(active_path);

// iOS (interpreter mode):
settings.application_library_path.insert(
    settings.application_library_path.begin(), active_path);
The engine replaces (or prepends) the default libapp.so path with the patched file's path. The Dart VM then loads the patched file instead of the original. This is how the updated Dart code runs without reinstalling the app.
report_launch_start() runs once per process (guarded by a std::once_flag). It sets currently_booting_patch = next_boot_patch in the state file AND records a boot timestamp. If the process crashes before report_launch_success() is called, the next launch will detect that currently_booting_patch is still set → crash recovery triggers → the patch is marked bad.
If auto_update: true in shorebird.yaml, shorebird_start_update_thread() spawns a Rust background thread that runs the full update cycle (Phase 4). The app is already running with the current patch — the update downloads the NEXT patch for the NEXT boot.
By the time your Dart main() runs, it's already running the patched code.
🔄 Phase 4: Background Update Cycle
The Rust updater thread checking for and downloading a new patch
Before checking for updates, the thread reports any events queued from previous runs:

for event in state.copy_events(3) { send_patch_event(event, &config); }

Events like __patch_download__, __patch_install__, __patch_install_failure__.
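A sketch of the drain-and-send loop, assuming the queue is a plain list (the real state.copy_events reads queued events out of the persisted state):

```python
EVENTS_PER_BATCH = 3  # matches copy_events(3) above

def copy_events(queue: list, limit: int = EVENTS_PER_BATCH):
    """Sketch of state.copy_events(n): take up to n queued events,
    leaving the rest for the next report cycle."""
    return queue[:limit], queue[limit:]

queue = ["__patch_download__", "__patch_install__",
         "__patch_install_failure__", "__patch_download__"]
batch, queue = copy_events(queue)
# Each event in `batch` would then be sent via send_patch_event(event, config).
```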
Sends:
{ app_id, release_version, platform, arch, channel, client_id, patch_number }

Server responds with:
{
  "patch_available": true,
  "patch": { "number": 3, "download_url": "https://...", "hash": "sha256...", "hash_signature": null },
  "rolled_back_patch_numbers": [2]
}
For each patch number in rolled_back_patch_numbers: state.uninstall_patch(patch_number) — deletes the installed dlc.vmcode file and removes the patch from the lifecycle state. If the currently booted patch is in this list, the next boot will fall back to the previous good patch (or the base release).
• Skip(KnownBad): This patch was previously tried and failed → don't re-download
• Skip(AlreadyInstalled): Same patch number + same hash already installed → no-op
• Resume {offset}: Previous download was interrupted → resume from byte offset
• Download: Fresh download
• Complete: Already downloaded but not yet installed → skip to install
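The decision table above can be sketched as one function. The dict shape for the device's local knowledge of the patch is an assumption for illustration; the real decide_start works on the Rust lifecycle types.

```python
def decide_start(local: dict, server_hash: str):
    """Sketch of the download decision table above. `local` describes
    what the device already knows about this patch number."""
    if local.get("status") == "bad":
        return ("Skip", "KnownBad")          # never retry a known-bad patch
    if local.get("status") == "installed" and local.get("hash") == server_hash:
        return ("Skip", "AlreadyInstalled")  # same number + same hash: no-op
    if local.get("status") == "downloaded":
        return ("Complete",)                 # skip straight to install
    if local.get("downloaded_bytes", 0) > 0:
        return ("Resume", local["downloaded_bytes"])
    return ("Download",)                     # fresh download

assert decide_start({"status": "bad"}, "h1") == ("Skip", "KnownBad")
assert decide_start({"downloaded_bytes": 4096}, "h1") == ("Resume", 4096)
assert decide_start({}, "h1") == ("Download",)
```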
Downloads from patch.download_url using the ureq HTTP client. Supports HTTP Range headers for resume. Downloads to: {cache}/patches/{N}/download.bin

Verifies: if the server sent a Content-Length header, checks that the actual downloaded byte count matches. On mismatch → uninstall + error.
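Resume support boils down to asking how many bytes are already on disk and requesting the rest with a Range header. A sketch (the helper name is made up; ureq sends the header inside the real updater):

```python
import os
import tempfile

def range_header_for_resume(partial_path: str) -> dict:
    """If a partial download.bin exists, resume from its size with an
    HTTP Range header; otherwise request the whole file."""
    try:
        offset = os.path.getsize(partial_path)
    except OSError:
        offset = 0
    return {"Range": f"bytes={offset}-"} if offset else {}

# Simulate an interrupted download: 1024 bytes already on disk.
tmp = tempfile.NamedTemporaryFile(delete=False, suffix=".bin")
tmp.write(b"\x00" * 1024)
tmp.close()
headers = range_header_for_resume(tmp.name)
os.unlink(tmp.name)
```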
The inflate step lives in updater/library/src/updater.rs. It's the reverse of the diff process, running on the user's phone:
1. Validate: Check the first 4 bytes are the zstd magic (0x28, 0xB5, 0x2F, 0xFD)
2. Create pipe: An in-memory pipe connects the decompression thread to the patching thread
3. Thread 1 (decompression): ZstdDecompressor.copy(compressed_file → pipe_writer)
4. Thread 2 (patching): bipatch::Reader::new(pipe_reader, original_libapp_reader) — reads the decompressed diff + the original bundled libapp.so, outputs the new patched libapp.so
5. Write: Patched bytes → {storage}/patches/{N}/dlc.vmcode

Verify: computes SHA256(dlc.vmcode) and compares against patch.hash from the server.
• Match: Patch is valid. Transition to the Installed state.
• Mismatch: Mark the patch as Bad(InstallHashMismatch). This is usually caused by the developer building the patch with a different Dart compiler version than the release.

Install: sets next_boot_patch = patch_number in the lifecycle state. Queues a __patch_download__ event. The patch is now ready. On the next app launch, the boot-time lookup (Phase 3) will return this patch's path instead of the original.

Note: the dlc.vmcode file IS a complete libapp.so — not a diff. The device reconstructs the full binary from the diff + original. The engine loads it exactly like it would load the bundled version.
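The two-thread pipe structure can be sketched with an OS pipe. Here zlib stands in for zstd, and the patch step is trivialized to passing bytes through, since the point is the pipe/thread/hash-check shape rather than bipatch itself:

```python
import hashlib
import os
import threading
import zlib

def inflate(compressed: bytes, original: bytes, expected_hash: str) -> bytes:
    """Two-thread inflate sketch: one thread decompresses into an OS
    pipe; the reading side 'applies' the patch and verifies SHA256.
    zlib stands in for zstd, and the patch step just passes bytes
    through (the real code runs bipatch against `original`)."""
    read_fd, write_fd = os.pipe()

    def decompress():  # Thread 1: decompress into the pipe
        with os.fdopen(write_fd, "wb") as writer:
            writer.write(zlib.decompress(compressed))

    t = threading.Thread(target=decompress)
    t.start()
    with os.fdopen(read_fd, "rb") as reader:
        # Thread 2's role: bipatch::Reader(pipe_reader, original) -> patched bytes
        patched = reader.read()
    t.join()
    if hashlib.sha256(patched).hexdigest() != expected_hash:
        raise ValueError("InstallHashMismatch")  # would mark the patch Bad
    return patched

new_libapp = b"\x7fELF" + b"patched dart code " * 50
diff = zlib.compress(new_libapp)
out = inflate(diff, b"original bytes", hashlib.sha256(new_libapp).hexdigest())
assert out == new_libapp
```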
🔄 Phase 5: Rollback & Recovery
A. Automatic Crash Recovery (Client-Side)
🔴 Scenario: Patch 3 crashes the app
Boot 1: App starts → shorebird_init() → loads patch 3 → report_launch_start() writes currently_booting_patch=3, boot_started_at=timestamp → 💥 CRASH (process dies before report_launch_success())
Boot 2: App restarts → shorebird_init() → handle_prior_boot_failure_if_necessary() sees currently_booting_patch=3 is still set → calls record_boot_failure_for_patch(3) → marks patch 3 as Bad(CrashRecovery) → queues __patch_install_failure__ event with message "crash_recovery: patch 3 failed to boot" → next_boot_patch_path() returns patch 2 (or NULL for base) → App boots safely
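The two-boot sequence can be simulated with a small state machine. Method names mirror the calls above; persistence to state.json is elided:

```python
class UpdaterState:
    """Crash-recovery sketch: the marker set at launch start is cleared
    only by a successful boot."""
    def __init__(self):
        self.currently_booting_patch = None
        self.bad_patches = set()

    def report_launch_start(self, patch: int):
        self.currently_booting_patch = patch     # persisted before Dart runs

    def report_launch_success(self):
        self.currently_booting_patch = None      # clean boot clears the marker

    def handle_prior_boot_failure_if_necessary(self):
        patch = self.currently_booting_patch
        if patch is not None:                    # marker survived: last boot crashed
            self.bad_patches.add(patch)          # Bad(CrashRecovery)
            self.currently_booting_patch = None
            return patch
        return None

state = UpdaterState()
state.report_launch_start(3)   # Boot 1: patch 3 starts booting...
# ...process crashes here: report_launch_success() is never called.

failed = state.handle_prior_boot_failure_if_necessary()  # Boot 2: init
assert failed == 3 and 3 in state.bad_patches
```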
B. Server-Side Rollback
🟡 Developer rolls back patch 3 via CMS/CLI
Developer calls: POST /api/v1/apps/{id}/patches/3/rollback
Next device check: POST /patches/check response now includes "rolled_back_patch_numbers": [3]
On device: roll_back_patches_if_needed([3]) → state.uninstall_patch(3) → deletes patches/3/dlc.vmcode + removes from lifecycle state → next_boot_patch() now returns patch 2 (or base)
Next app launch: Engine loads patch 2 (or base release). Patch 3 is gone.
C. Un-Rollback (Re-enable)
🟢 Developer un-rolls-back patch 3
Developer calls: DELETE /api/v1/apps/{id}/patches/3/rollback
Next device check: Server now offers patch 3 again in /patches/check response. rolled_back_patch_numbers no longer includes 3.
On device: Normal update cycle — downloads patch 3 again, inflates, verifies, installs for next boot.
Caveat: If the device had marked patch 3 as Bad(CrashRecovery) locally, the decide_start() function will return Skip(KnownBad) and refuse to re-download. The device's local "known bad" state takes precedence over the server.
D. State Files on Device
/data/data/com.example.app/files/shorebird_updater/
├── state.json ← { client_id, release_version }
├── pointers.json ← { next_boot: 3, last_booted: 2 }
└── patches/
├── 2/
│ ├── state.json ← { status: "installed", hash: "...", size: 3100000 }
│ └── dlc.vmcode ← THE ACTUAL PATCHED libapp.so (3.1 MB)
└── 3/
├── state.json ← { status: "bad", reason: "crash_recovery" }
└── (dlc.vmcode deleted)
/data/data/com.example.app/cache/shorebird_updater/
└── patches/
└── 3/
└── download.bin ← Compressed diff (temp, deleted after install)
🔬 Phase 6: Byte-Level Internals
What is libapp.so?
An ELF shared library containing the Dart AOT snapshot. The snapshot is exposed through four symbols (_kDartVmSnapshotData, _kDartVmSnapshotInstructions, _kDartIsolateSnapshotData, _kDartIsolateSnapshotInstructions); the two that matter most for your app code are:
• _kDartIsolateSnapshotInstructions: Compiled machine code (ARM64/x86 instructions)
• _kDartIsolateSnapshotData: Dart heap data (objects, constants, strings)
When you change Dart code, both sections change. The instruction offsets shift, constant pools update, and new functions appear or existing ones change. But much of the file stays identical (standard library code, unchanged widgets, etc.).
The bidiff Algorithm
How it finds similarities
bidiff uses a suffix-sort-based algorithm (similar to bsdiff) to find matching byte sequences between old and new files. It produces a stream of operations:
• COPY(offset, length): "Copy length bytes from the old file starting at offset"
• INSERT(bytes): "Insert these new bytes that don't exist in the old file"
• ADD(delta): "Add these delta bytes to the copied bytes" (for near-matches)
For typical Dart code changes, 90-98% of libapp.so is identical. bidiff captures this efficiently.
Size Comparison (Real Numbers)
┌──────────────────────┬────────────┬──────────────────────┐
│ Artifact             │ Size       │ Notes                │
├──────────────────────┼────────────┼──────────────────────┤
│ original libapp.so   │ 3,146 KB   │ Full AOT snapshot    │
│ new libapp.so        │ 3,148 KB   │ After Dart changes   │
│ raw bidiff output    │ ~500 KB    │ Uncompressed diff    │
│ zstd compressed      │ 50-200 KB  │ What device downloads│
│ dlc.vmcode (output)  │ 3,148 KB   │ Full reconstructed   │
└──────────────────────┴────────────┴──────────────────────┘

Compression ratio: 3,148 KB → 150 KB = ~95% reduction
The Pipe-Based Inflate Architecture
DEVICE (Rust, two threads):

Thread 1 (decompress):
  download.bin (compressed) ──▶ zstd decompress ──▶ pipe_writer
                                                        │
                                                 in-memory pipe
                                                        │
Thread 2 (patch + write):                               ▼
  bundled libapp.so (original) ──▶ bipatch::Reader ◀── pipe_reader
                                         │
                                         ▼
                                   dlc.vmcode   ← full patched libapp.so,
                                   (output)       SHA256 verified
🚀 Future Improvements for Self-Hosted
📊 Staged Rollouts MEDIUM
Roll out patches to a percentage of users (1% → 10% → 50% → 100%). The /patches/check endpoint uses client_id to deterministically assign users to a rollout bucket: hash(client_id) % 100 < rollout_percentage. Add a rollout_percentage column to channel_patches table.
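A sketch of the deterministic bucketing, using SHA-256 so the bucket is stable across processes (Python's built-in hash() is randomized per process, so it would not work here):

```python
import hashlib

def in_rollout(client_id: str, rollout_percentage: int) -> bool:
    """Deterministic bucket assignment sketched in the text:
    hash(client_id) % 100 < rollout_percentage."""
    bucket = int(hashlib.sha256(client_id.encode()).hexdigest(), 16) % 100
    return bucket < rollout_percentage

assert in_rollout("device-a", 100) is True   # 100% rollout includes everyone
assert in_rollout("device-a", 0) is False    # 0% rollout includes no one
# The same client always lands in the same bucket across checks:
assert in_rollout("device-a", 50) == in_rollout("device-a", 50)
```

Raising rollout_percentage only ever adds clients to the rollout; a client admitted at 10% stays admitted at 50%.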
🔐 Code Signing EASY
The updater already supports patch_public_key in shorebird.yaml and hash_signature in the patch response. Generate an RSA keypair, sign SHA256(dlc.vmcode) with your private key during patch creation, store signature in hash_signature. Device verifies with embedded public key. Prevents server compromise from pushing malicious patches.
🌍 CDN for Patch Distribution EASY
Put a CDN (CloudFront, Cloudflare) in front of your /storage/ path. Patches are immutable (same URL = same content), so cache-forever headers work perfectly. Reduces latency from 200ms to 20ms globally. The download_url in /patches/check already points to a full URL — just make it a CDN URL.
📱 Forced Update Support MEDIUM
Add a force_update: true flag to /patches/check response. The Dart-side shorebird_code_push package checks this and shows a full-screen "Updating..." overlay, blocking the app until the patch downloads and installs. Requires a small Dart wrapper + backend flag per patch.
🔄 Delta Patches (Patch-to-Patch) HARD
Currently: diff is always against the original release libapp.so. Improvement: diff against the last patch's output instead. If a user has patch 2 installed and patch 3 is available, the diff between patch 2's output and patch 3's output is much smaller than original→3. Requires server-side diffing at promote time, per-patch-chain artifacts, and updater changes to select the right base.
📈 Real-time Analytics Dashboard MEDIUM
WebSocket-based live counter of downloads/installs per patch. Show a real-time map of where patches are being installed (from IP geolocation of /patches/events). Add aggregation tables (hourly/daily rollups) for fast historical queries. Add percentile tracking: "What % of users are on the latest patch?"
🧪 A/B Testing via Channels MEDIUM
Create channels like experiment_a and experiment_b. Assign users to channels deterministically based on client_id hash. Each channel gets a different patch (e.g., different UI layout). Measure install counts per channel. The channel infrastructure already exists — you just need the assignment logic and analytics comparison view.
⏱ Automatic Rollback on Error Rate HARD
Monitor the ratio of __patch_install_failure__ to __patch_install__ events per patch. If failure rate exceeds a threshold (e.g., >5% in 30 minutes), automatically insert into rolled_back_patches and alert the developer. This turns your self-hosted backend into a canary deployment system.
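The trigger condition can be sketched as a pure function over event counts; the threshold and minimum sample size are illustrative knobs:

```python
def should_auto_rollback(installs: int, failures: int,
                         threshold: float = 0.05, min_events: int = 20) -> bool:
    """Roll back when failures / (installs + failures) exceeds the
    threshold, but only once enough events have arrived for the ratio
    to be meaningful."""
    total = installs + failures
    if total < min_events:
        return False  # too little data to act on
    return failures / total > threshold

assert should_auto_rollback(installs=100, failures=10) is True   # ~9% > 5%
assert should_auto_rollback(installs=100, failures=2) is False   # ~2%
assert should_auto_rollback(installs=1, failures=1) is False     # too few events
```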
📦 Multi-App Orchestration EASY
If you have multiple apps (Univest Production, UAT, DEV), add a promotion pipeline: patch tested on DEV → promoted to UAT → promoted to Production. Each stage is a different app_id with its own channels. Add a CMS feature to "promote" a patch from one app's channel to another app.
🔒 Patch Expiry / Kill Switch EASY
Add an expires_at field to patches. After expiry, /patches/check stops serving it. For kill switch: a special "patch 0" that forces the device back to the base release. Implement as a patch with number=0 and no diff — the updater sees it's "installed" and returns NULL from next_boot_patch_path().
🗜 Brotli Compression Alternative MEDIUM
zstd is optimized for speed. Brotli achieves 10-20% better compression at the cost of slower compression (fine for server-side). This means smaller downloads. Requires forking the updater to add brotli support alongside zstd, using a magic-byte header to detect the format. Decompression speed is similar.
📋 Patch Notes in App EASY
Add a notes field to the /patches/check response. The Dart shorebird_code_push package exposes it. Developers show a "What's new" dialog after update. Already partially supported — just needs the backend to include patch.notes in the check response and a small Dart wrapper.