Orchestrate.legal

Last mile liability: why the risk starts when AI output gets used

Governance should focus on the point where output becomes action, not only on generation.

governance · Working · 2026-05-02

This reflects current thinking and may change as the model develops.

The risk in legal AI often appears after the model has finished. A draft summary sitting in a workspace is one thing. A clause amendment sent to a client is another. The system needs to understand that difference.

Last mile liability is the risk that arises when output is relied on, sent, inserted into a live document or allowed to trigger action. The problem is not only a wrong generation. It is wrong use without evidence, review or authority.

Execution gates are the practical response: controls at the point of use, tied to destination and reliance, not only to the prompt that produced the text.
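To make the idea concrete, here is a minimal sketch of an execution gate, assuming hypothetical names (`OutputUse`, `execution_gate`, the destination strings) invented for illustration. The gate keys on destination and reliance, exactly as described: a draft sitting in a workspace passes freely, while output leaving the workspace or being acted on requires recorded human review.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical sketch of an execution gate: the check keys on where the
# output is going and how it will be relied on, not on the prompt that
# produced it.
@dataclass
class OutputUse:
    destination: str            # e.g. "workspace", "client_email", "live_document"
    relied_on: bool             # will someone act on this without further review?
    reviewed_by: Optional[str]  # name of the human reviewer, if any

def execution_gate(use: OutputUse) -> bool:
    """Permit low-stakes internal use; require evidence of human review
    before output leaves the workspace or triggers action."""
    internal_draft = use.destination == "workspace" and not use.relied_on
    if internal_draft:
        return True
    return use.reviewed_by is not None
```

Under this sketch, the same text is treated differently depending on use: the gate never inspects the generated words, only the destination and the reliance placed on them.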

Framing governance as “after the fact compliance” misses where professional accountability actually attaches.
