An AI edge computer that lives on a property closet shelf has no keyboard, no monitor, no SSH. Here is what replaces them.
Every top-ranked AI edge computer page sells TOPS, watts, and interface counts. None of them describes the install-day UX on the device itself, because the buyer those pages imagine has a dev laptop and SSH keys. On a Cyrano unit the installer is a maintenance tech with a ladder and a phone. This page is about the three things on the unit that make that install possible: a 6-phase state file in tmpfs, a 6-code chassis LED (L0 through L5), and a QR sticker that opens a captive portal.
See the install on a production unit
The gap on the spec-sheet SERP
Read the current top results for AI edge computer and the shape is the same on every page. TOPS numbers, a watts column, a memory column, a storage column, a photo of the box with labeled I/O ports. NVIDIA Jetson Orin, Advantech MIC and UNO, Dell NativeEdge, Lenovo ThinkEdge, Supermicro edge nodes. The content is accurate and the products are good. The content also assumes the buyer is an OEM, a systems integrator, or a robotics team who will mount the board, attach a debug cable, and provision with a dev laptop.
The install surface a property manager encounters is not that. It is a dusty shelf above an existing DVR, one HDMI cable, one ethernet cable, one power cable, and a chassis that never gets a keyboard plugged into it. There is no serial console. The dev laptop is not coming. The only human is someone whose job is painting hallways and unclogging toilets, and who is being asked to bring up an edge AI unit because someone at the regional office said it would take two minutes.
Every design choice in the Cyrano install UX falls out of that persona. That persona is the gap. That persona is this page.
Three surfaces on the box that replace the dev laptop
On a Jetson-class development board the install path is keyboard plus monitor plus SSH. On a Cyrano unit those three are stripped out and replaced with a tmpfs file, a chassis LED, and a QR sticker. Each one is narrow on purpose, because the tech is not a developer, and anything more ambitious would fail in the field.
/run/cyrano/boot_phase
A single ASCII word in tmpfs, rewritten by init as the unit advances through halt, power, link, model_load, detector_warm, active. Ephemeral on purpose: a power cut clears it and the next boot starts fresh.
Chassis LED (L0 through L5)
One multicolor LED on the front of the chassis, driven by /etc/cyrano/led.rules. The tech reads the named code over the phone. Support maps the code to a phase without logging in.
QR sticker on top of the unit
Printed at the factory. Carries the device_id, a one-time binding token, and the mDNS URL http://cy-<serial>.local/setup. A phone camera is the entire bind tool.
Captive portal
Lives on the unit itself at port 80. Takes email, property label, WiFi creds, and writes /var/lib/cyrano/provisioning/status.json. No installer app, no vendor login page.
Fallback setup AP
If the property network refuses DHCP or blocks mDNS, the unit hosts cy-setup-<serial>. The tech joins, the same portal opens, the same file gets written, the unit re-joins the real network.
The boot script that writes the phase file
The init script drives /run/cyrano/boot_phase forward. The file is one ASCII word. The LED daemon polls it every 200 ms. The captive portal reads it to show progress on the phone. Everything downstream in the install UX hangs off six echo lines in that script.
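The shipped init script is product firmware and is not reproduced on this page. The following is a minimal sketch of the six transitions, assuming a hypothetical `set_phase` helper; the fixed temp path stands in for `/run/cyrano/boot_phase` so the sketch runs anywhere.

```shell
#!/bin/sh
# Minimal sketch, not the shipped init script. In production the file is
# /run/cyrano/boot_phase on tmpfs; a fixed temp path stands in here.
PHASE_FILE="/tmp/cyrano_boot_phase_demo"

set_phase() {
    # One ASCII word per write; the LED daemon and the captive
    # portal both read this file verbatim.
    echo "$1" > "$PHASE_FILE"
}

set_phase halt            # written before anything else runs
set_phase power           # kernel up, rootfs mounted read-only
set_phase link            # DHCP lease held, portal on :80, mDNS advertising
set_phase model_load      # weights mmap from /var/lib/cyrano/models/
set_phase detector_warm   # first forward pass on the live HDMI frame
set_phase active          # cyranod in the steady-state capture loop
```

In the real flow each transition gates on the corresponding service exiting zero, which is why a failed warmup leaves the file, and the LED, parked at detector_warm rather than advancing to active.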
The LED rules the tech reads aloud
The LED daemon is declarative on purpose. A declarative config means the names the tech reads aloud on the phone are exactly the names in the file; the support rep does not need to translate colors into phases. L0 through L4 cover the boot phases, with L1 spanning both power and link; L5 is an overlay the drain worker turns on during a post-outage flush. Nothing else drives the LED.
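The real led.rules format is not published on this page; the syntax below is a hypothetical rendering that carries only the mapping the prose describes, with L1 covering both power and link and L5 as a drain overlay.

```shell
# Hypothetical syntax for /etc/cyrano/led.rules; field names are
# illustrative, and a temp path is used so the sketch runs anywhere.
cat > /tmp/led.rules.example <<'EOF'
L0  phase=halt             color=red    pattern=solid
L1  phase=power,link       color=amber  pattern=slow-pulse
L2  phase=model_load       color=amber  pattern=fast-pulse
L3  phase=detector_warm    color=blue   pattern=pulse
L4  phase=active           color=blue   pattern=solid
L5  overlay=outbox-drain   color=blue   pattern=double-pulse
EOF
```

The declarative shape is the point: the token the tech reads aloud ("L3") is a literal key in the file, so support resolves it to detector_warm by lookup, not by translating a color description.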
What the install UX replaces on the device
What the install actually looks like on the wire
Below is a slice of the boot log from a production 16-camera property, from the moment HDMI, ethernet, and power are all seated through the first active frame. Every step is visible. Nothing is a vendor black box.
“75 seconds from a cold plug-in on a property closet shelf to boot_phase=active, with no keyboard, no monitor, and no SSH session in the install path. The tech finished the QR bind on their phone during model_load, so the end-to-end install including the portal form landed in under 3 minutes.”
Cyrano field notes, 16-camera property install timing
One line of the provisioning file the portal writes
When the tech submits the bind form on their phone, the portal writes one compact JSON object to /var/lib/cyrano/provisioning/status.json on the unit itself. That file is on real disk, not tmpfs; a reboot mid-install resumes from the last written state. This is the record the on-device boot process reads to know the unit has been bound to an account.
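The exact schema is not published here; the one-line object below is a hypothetical example whose field names are inferred from the portal form described above (email, property label, bind state), reusing the example device_id cy-8F3A21 from the QR-sticker FAQ.

```shell
# Hypothetical schema for /var/lib/cyrano/provisioning/status.json;
# field names are illustrative, and a temp path is used here.
cat > /tmp/status.json.example <<'EOF'
{"device_id":"cy-8F3A21","bound":true,"owner_email":"manager@example.com","property_label":"Maple Court, Bldg B","last_phase_seen":"model_load"}
EOF
```

One compact object on one line keeps the resume logic trivial: a reboot mid-install reads the file back from disk and picks up where the bind left off instead of re-asking the tech.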
Across production property installs, the install UX on the unit itself has been the same three surfaces: boot_phase, the LED, and the QR sticker.
The six boot phases, in order, with the LED code per phase
/run/cyrano/boot_phase transitions on a cold plug-in
halt (LED L0, solid red)
Unit is off or has just lost power. boot_phase is rewritten to halt by the init script before anything else runs. The red LED tells the tech the unit is seated but not yet up.
power (LED L1, slow amber pulse)
SoC came up, kernel is running, rootfs is mounted read-only from /dev/mmcblk0p2. Typically reached around t+12s. The LED turning amber is the first visible sign of life on the chassis.
link (LED L1, slow amber pulse continues)
DHCP obtained a lease, captive portal is listening on port 80, mDNS is advertising cy-<serial>._http._tcp.local. The tech can now scan the QR sticker and open the bind form on their phone.
model_load (LED L2, fast amber pulse)
cyrano-load-models mmaps the detector weights from /var/lib/cyrano/models/detector.onnx and the layout classifier from layout-classifier-v3.onnx. The tech is usually filling in the portal form while this runs.
detector_warm (LED L3, blue pulse)
cyrano-warmup runs a first forward pass on the current HDMI frame. Any model or hardware error surfaces here as a non-zero exit and the LED stays on L3; nothing progresses to active if the warmup fails.
active (LED L4, blue solid)
cyranod is in the steady-state HDMI capture loop, producing frames at the configured cadence. boot_phase=active is the green light for the portal to show installation complete and for the cloud to mark the unit live.
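The consuming side of boot_phase, the LED daemon's 200 ms poll and the portal's progress view, can be sketched as a bounded poll loop. This is an illustrative helper, not shipped tooling; it seeds its own temp file so it runs anywhere.

```shell
#!/bin/sh
# Illustrative watcher, not shipped tooling: poll the phase file every
# 200 ms and report each transition until the unit reaches "active".
PHASE_FILE="/tmp/boot_phase_watch_demo"
echo detector_warm > "$PHASE_FILE"             # stand-in for a unit mid-boot
( sleep 0.5; echo active > "$PHASE_FILE" ) &   # stand-in for the init script

last=""
tries=0
while [ "$tries" -lt 50 ]; do      # 50 * 200 ms = 10 s timeout
    phase="$(cat "$PHASE_FILE" 2>/dev/null)"
    if [ -n "$phase" ] && [ "$phase" != "$last" ]; then
        echo "boot_phase=$phase"   # one line per transition
        last="$phase"
    fi
    [ "$phase" = "active" ] && break
    sleep 0.2
    tries=$((tries + 1))
done
wait
```

Because the file is a single word rewritten in place, the consumer needs no parser and no protocol, only a compare against the last value it saw.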
What is intentionally not in the install path
These omissions are the reason the install is 2 minutes instead of half a day. Each one is a common assumption on the spec-sheet SERP that this product deliberately does not make.
not required at install time:
- A dev laptop with SSH keys
- A monitor plugged into the chassis
- A USB keyboard plugged into the chassis
- A serial console cable
- A vendor installer app on the tech's phone
- Account credentials typed into the unit itself
- A firmware image downloaded and flashed on site
on the unit by default:
- /run/cyrano/boot_phase as a live progress signal
- Chassis LED driven by /etc/cyrano/led.rules
- QR sticker carrying device_id and a bind token
- mDNS responder advertising cy-<serial>.local
- Captive portal at http://cy-<serial>.local/setup
- Fallback setup AP named cy-setup-<serial>
- Persisted /var/lib/cyrano/provisioning/status.json
Spec-sheet framing vs. install-UX framing
The table below holds the questions an operator should actually ask a vendor. TOPS and watts still matter for sizing the silicon, but on a property where the installer is not a developer, these are the rows that decide whether the deployment lands in 2 minutes or gets abandoned halfway through.
| Feature | AI edge computer (SERP spec-sheet framing) | AI edge computer (Cyrano) |
|---|---|---|
| Buyer persona assumed | OEM, SI, robotics team with dev laptop | Property maintenance tech with a phone |
| Install tool the persona brings | SSH keys, a monitor, a USB keyboard | A phone camera and a ladder |
| On-device progress surface | Unspecified; usually requires a monitor | /run/cyrano/boot_phase plus L0–L5 chassis LED |
| Account binding path | CLI tool on a dev laptop | QR sticker on top of the chassis |
| WiFi onboarding on a new property | Static config or vendor installer app | Fallback AP cy-setup-<serial> plus captive portal |
| Support-call handoff surface | Screen share or SSH into the unit | Read the LED code (L2, L3, L4) aloud on the phone |
| Persistent provisioning record | Usually only in the cloud | /var/lib/cyrano/provisioning/status.json on disk |
The thing that is uncopyable
You can read the LED code aloud.
The LED on the front of the chassis shows L0 through L5. Those names are literal values in /etc/cyrano/led.rules that ship identically on every unit. A property manager can point their phone at the chassis, describe the pulse (blue solid, or fast amber pulse), and a support rep knows the exact phase the box is in without seeing it, without logging into it, and without asking the manager to plug anything into it. That is the whole diagnosis protocol. One LED. Six short codes. Zero vendor dashboards required to tell what state the unit is in.
Install-day conditions the unit is designed to handle on its own
Every one of these conditions collapses into a known branch of the boot script, a fallback in the captive portal, or an idempotent write to /var/lib/cyrano/provisioning/status.json. The install either advances or stalls in a named phase; it does not leave a tech guessing at a blank chassis.
When the spec-sheet framing is fine, and when this one matters
Worth being direct: if the deployment is a robotics lab, a factory cell, or an OEM bring-up bench, the TOPS-and-watts framing covers the buyer correctly. There is a dev laptop in the room. There is an SSH key. The install is run by someone who does this for a living.
For commercial property work, it does not. The edge AI unit is landing on a closet shelf at a Class B or C multifamily, a construction trailer, or a strip of storefronts. The tech on site is the tech who was available that afternoon. The unit has to explain itself in LED codes, bind itself with a QR sticker, and finish without a screen. That is the window this page is for, and that is the window boot_phase, the LED rules, and the captive portal were shaped to fit inside.
Watch a Cyrano AI edge computer come up on a real shelf
A 15-minute call. We cold-boot a unit, show /run/cyrano/boot_phase advance through the six states, and bind it with the QR sticker on a phone while the LED narrates.
Book a call →
AI edge computer: frequently asked questions
What does the current SERP for 'ai edge computer' actually describe?
Page one is a hardware catalog. NVIDIA Jetson Orin, Advantech MIC and UNO, Dell NativeEdge, Seeed reTerminal, Lenovo ThinkEdge, BrainChip Akida, and Supermicro edge nodes. Every result lists TOPS, thermal envelope in watts, memory, storage, PoE support, and a chart of I/O ports. That is correct but incomplete for a commercial property install. The catalog assumes the buyer is an OEM, a systems integrator, or a robotics team with a dev laptop and SSH keys. It does not describe the install UX on the device itself. A property maintenance tech with a ladder, a phone, and 2 minutes is not the persona those pages were written for.
What is different about an AI edge computer installed in a property closet?
Three things that the Jetson module datasheet cannot answer. The box sits on a dusty shelf above an already-running DVR. There is no keyboard or monitor plugged in and there never will be. And the human who installs it is a property manager, a leasing agent, or a maintenance tech, not a developer. Everything the installer needs has to surface on the device itself, through its own power LED, its own chassis, and a QR code the tech scans with their phone. SSH is not in the install path. A serial cable is not in the install path. A screen is not in the install path.
What specific on-device state tells the technician how far the boot has gotten?
A single ASCII word in the file /run/cyrano/boot_phase. The file is rewritten by init scripts as the unit passes through six states: halt, power, link, model_load, detector_warm, active. halt means the unit is off. power means the SoC came up, kernel booted, rootfs mounted. link means DHCP got a lease and the captive portal is listening on mDNS. model_load means the on-device weights under /var/lib/cyrano/models/ finished loading into memory. detector_warm means a first forward pass completed without error. active means the HDMI capture loop is producing frames at the configured cadence. The file is one line, no quotes, no JSON, so even a tech on a phone shell can read it.
How does the chassis LED strip map to those boot phases?
Six named codes on a single multicolor LED, driven by /etc/cyrano/led.rules. L0 is solid red, shown while boot_phase is halt or the unit just lost power. L1 is slow amber pulse, shown during power and link, while DHCP and mDNS are coming up. L2 is fast amber pulse, shown during model_load, when the weights file at /var/lib/cyrano/models/detector.onnx is mmap-ing into memory and the classifier is deserializing. L3 is blue pulse, shown during detector_warm, after the first forward pass lands. L4 is blue solid, shown during active, the steady-state running condition. The sixth code, L5 blue double-pulse, appears only while a drain worker is flushing the outbox after a WAN outage; a tech on the phone can describe it without knowing what the outbox is.
What is on the QR sticker on the top of the chassis, and what does scanning it actually do?
The sticker is printed at the factory and shipped with the unit. It contains three things: the device_id (for example cy-8F3A21), a one-time binding token (a short string scoped to the unit's serial), and an mDNS URL of the form http://cy-8F3A21.local/setup. Scanning the sticker with any phone camera opens the captive portal hosted by the unit itself on the local network. The portal takes the account owner's email, a WiFi SSID and password if the unit is on wireless, and the physical address of the property. Submit writes the binding to /var/lib/cyrano/provisioning/status.json on the unit and POSTs a bind event to the cloud once the uplink is available. No app, no installer account, no SSH.
What happens if the property has no DHCP or a weird VLAN that blocks mDNS?
The unit falls back to a self-hosted WiFi AP named cy-setup-<serial> with a default password printed on the same sticker. The tech joins that AP with their phone, the phone's auto-captive-portal detection opens http://cy-<serial>.local/setup against the unit's local address, and the flow is identical. Once the tech enters real WiFi credentials, the unit writes them to /var/lib/cyrano/provisioning/wifi.conf, drops the setup AP, joins the real network, and continues the boot_phase sequence from link. The whole fallback adds roughly 90 seconds to the install.
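The on-disk format of /var/lib/cyrano/provisioning/wifi.conf is not published; the sketch below assumes a wpa_supplicant-style stanza purely for illustration of what the portal would persist before dropping the setup AP.

```shell
# Hypothetical contents for /var/lib/cyrano/provisioning/wifi.conf;
# a wpa_supplicant-style stanza is assumed, and a temp path is used here.
cat > /tmp/wifi.conf.example <<'EOF'
network={
    ssid="PropertyNet"
    psk="example-passphrase"
}
EOF
```

Whatever the real format, the property it needs is the same as status.json: it lives on disk, so a power cut between "credentials entered" and "real network joined" does not send the tech back through the setup AP.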
What is the actual time distribution from plug-in to active on a Cyrano unit?
Measured on a production 16-camera property with a cold boot: boot_phase=power reached in roughly 12 seconds (SoC + kernel + rootfs). link in roughly 20 seconds (DHCP lease, captive portal listener up, mDNS advertised). model_load in roughly 45 seconds (detector and layout-classifier weights mmap). detector_warm in roughly 60 seconds (first forward pass on the current HDMI frame). active around 75 seconds. Under 2 minutes from a cold plug. The QR binding runs in parallel with model_load on the tech's phone, so the end-to-end install including the portal form is commonly under 3 minutes.
Why is the boot phase in /run and not /var?
Because /run is tmpfs. The boot_phase file is deliberately ephemeral; it reflects the current runtime, not the last known state. On a power cut it vanishes with the rest of /run and is rewritten by init on the next boot, beginning from halt. The persistent provisioning state, the account binding, the WiFi credentials, and the outbox all live under /var/lib/cyrano/ on real disk. Keeping the boot indicator in tmpfs eliminates an entire class of stale-state bug, for example a tech pulling power during model_load and the unit remembering a phase it is no longer in.
Can a property manager read the LED code aloud on the phone to a support rep?
Yes. The codes are intentionally short and named L0 through L5 so the manager does not have to describe a color pulse. Support asks 'what is the LED doing' and the manager answers 'L2', 'L3', or 'blue solid'. The support rep looks at /etc/cyrano/led.rules (it ships the same on every unit) and knows which phase the unit is in without plugging anything into it. On calls where the manager has never touched the box, the entire diagnosis conversation is often four sentences long because the LED plus the QR sticker plus the captive portal leave no ambiguity about what state the box is in.
What does this choice of install UX cost on the BOM?
A single multicolor LED on the chassis, a small QR sticker printed at the factory, and a ~40 KB mDNS responder plus a captive-portal page bundled into the firmware image. The LED plus sticker add cents per unit. The software bundle adds roughly 40 KB of code against the main image. What they save is the entire serial-cable, keyboard, monitor, SSH-key path that most industrial edge computers assume on first boot, which is the actual gate between a 2-minute install and a half-day truck roll.