The untethered workshop.

Even Insider

Welcome back to the Even Insider.

Something quiet has shifted in how work gets done, and most of us haven't fully reckoned with it yet. Work used to be a coordinate. First, it was the desk. Then it was the office. Later, the laptop became the place - but it was still a place you had to be. Even when "remote work" arrived, the geography only got softer; the posture stayed exactly the same. You sat down to work. You stood up to do everything else.

Then the agent arrived, and the math inverted.

AI agents - for code, for writing, for research, for creative work - can now run autonomously for minutes, sometimes hours. They no longer need you to type every single line. They need you to decide, to redirect, and to capture the things they do not yet know. The decisions take seconds. The capture takes a phrase. The heavy lifting in between is theirs to do.

This means that, for the first time in modern history, the primary constraint on getting work done is no longer compute. It is location. Your agent could easily keep working while you take a walk; you just couldn't watch it.

Until today.

We are introducing Terminal Mode - a new mode within Even OS designed to do exactly one thing: unchain the workshop from the desk.

Work in Motion: A new kind of workplace.

Building has always required two distinct postures.

There is the deep posture: sitting still, executing. Code gets typed, drafts get written, tickets get closed. But then there is the loose posture: moving, thinking, finding. It’s when the architecture finally clicks on a walk. It’s when the answer surfaces on the train. It’s the breakthrough that arrives while you're standing at the kitchen counter.

Historically, software has only ever respected the first posture. The IDE, the word processor, the design canvas - all of them assume you are sitting. Because of this, the "loose posture" has always been a waiting room. You held the idea in your head, hurried back to the desk, and tried to type it out before you forgot it.

Agents changed the execution, but they didn't change the posture. The agent stepped in as the executor, allowing the human to become the conductor. Yet, we still found ourselves tethered to our desks just to watch the agent work.

We realized that to truly leverage AI, the workshop had to move.

With Terminal Mode on the Even G2S, the agent’s "tail" - what it is doing right now, the question it is about to ask, the result it just delivered - lives quietly in your peripheral vision. You take a walk; it runs the build. You ride the elevator; it summarizes the documentation. You sit at the kitchen counter; it drafts the next paragraph.

Same work. Same output. Entirely different posture.

Designing for the Periphery: Three interactions we distilled.

We didn't want to build a desktop interface crammed into a pair of glasses. The interaction model for Terminal Mode wasn't so much designed as it was distilled - pared down to the absolute minimum required to steer an agent without breaking your stride.

Instead of clicks and cursors, we built around the natural rhythm of untethered work:

Glancing is catching up. When you want to check on the agent’s progress, you simply glance up for half a second. You read the bottom line of the tail, and you are current. That is the entire interaction.

Decisions take a second. When the agent reaches a crossroads - asking for permission to run a migration, for sign-off on a draft, or for approval of a research query - the prompt surfaces in bright green. You don't need to pull out a phone. A single tap on the Even R1 ring confirms; a double tap rejects. Your hand stays at your side, and your eyes stay on the world in front of you.

Speech is the new prompt. When a breakthrough arrives - a new direction for a refactor, a different angle for an article - you just say it. The clean, directional pickup of the four-microphone array we shipped in the Even G2 captures your voice, transcribes it in dim green, and commits it the moment you stop speaking. The agent runs with your new idea before you've taken three more steps.

Many Crafts, One Canvas: Why this is not a developer mode.

Because coding agents matured the fastest, development is the easiest example to point to. But Terminal Mode is not just a developer tool dressed up with a broader name. The untethered workshop applies anywhere an agent does the work and a human steers it.

For the writer, it means expanding an outline and approving paragraphs on the subway ride home. For the researcher, it’s kicking off a literature sweep before lunch, tapping the Even R1 to approve queries, and speaking a redirect - "narrow this to the last two years" - without ever opening a laptop. For the creator, it means reviewing generative mood-boards while waiting for coffee, glancing to skim, tapping to keep, and speaking to refine.

The mode is the same. The craft is yours.

What's Next?

Terminal Mode is the first mode in Even OS that was not built for everyday use. It was built for a specific moment in computing - the moment when the human is no longer the one doing the typing, and the question quietly becomes how lightly we can supervise the work.

It will not be the last mode we build for that question. Each future mode will start the same way: ask what the person's actual job is once the agent is doing the work, and design for that instead of inheriting whatever interface the desk decided was normal.

The next entry in this series goes one layer deeper. An agent that travels with you is only as useful as the context it carries - and today, almost every conversation still begins from zero. We will introduce the Even Knowledge Base: the layer that lets your agent stop starting over, and start building on what it already knows about you, your work, and the way you think.