How to Write an AI Automation Brief That Actually Works

Most automation projects don't fail because of bad code. They fail because nobody wrote down what the system was actually supposed to do.

I've reviewed a lot of automation briefs at this point. Some are detailed enough to build from on day one. Others are three sentences long and leave the builder guessing at every decision. The difference in outcome is enormous, not just in cost, but in whether the thing you get back resembles what you needed.

This guide walks you through how to write an AI automation brief that gives a builder everything they need. If you're planning to commission automation work, whether with us or anyone else, this is worth reading before your first conversation.

What makes a good AI automation brief? A solid brief covers six things: the current process (mapped, not assumed), the trigger that starts the workflow, the inputs the system receives, the outputs it must produce, the exceptions for when things go wrong, and the success metric that tells you it's working. Cover all six and a builder can quote accurately, build confidently, and deliver something that actually fits your operations.



Why most automation projects fail before a single line of code is written

When a brief is vague, the builder fills in the gaps. That's not ideal, because they're filling those gaps with assumptions about your business, your data, your edge cases. Sometimes those assumptions are reasonable. Often they're not.

What I see repeatedly across AMPL client audits is a clear pattern. The businesses that have the most frustrating experiences with previous automation vendors are usually the ones who handed over the least information upfront. Not because they were being difficult. They just didn't know what to include.

The builder then produces something that works in the demo. It handles the clean, happy-path scenario perfectly. Then it hits real data: a form submission missing a field, a customer with two accounts, an enquiry that arrives on a bank holiday. And it breaks. Or worse, it silently does the wrong thing.

Rebuilding costs more than building it right the first time. Basically every time. A vague brief is an expensive brief, even if it looks cheaper going in.



What a good automation brief must contain

There's no single right format. But every brief that works covers these six things.



The current process, documented, not assumed

Before you can brief someone on what to automate, you need to write down what actually happens today. Not what's supposed to happen. What actually happens, including the workarounds and the manual steps people have quietly added over the years.

Walk through the process yourself. Better yet, ask the person who does it every day. You're looking for: who does what, in what order, using which tools, and how long each step takes.

This matters because automation doesn't replace a vague idea of a process. It replaces a specific sequence of actions. If you can't describe the sequence, the builder can't replicate it.



The trigger, what starts the workflow

Every automation has a starting point. Something happens, and then the system kicks off. That trigger needs to be explicit.

Is it a form submission? An email arriving in a specific inbox? A row added to a spreadsheet? A time of day? A webhook from another system? A sales stage changing in your CRM?

The more specific you are here, the better. "When a new enquiry comes in" is not a trigger. "When a new form submission arrives via the contact form on the /contact page, sending to enquiries@yourbusiness.com" is a trigger.
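That specificity translates almost directly into configuration. A minimal sketch of what a well-specified trigger gives the builder, compared with the vague version (the field names here are hypothetical, not any particular platform's schema):

```python
# Hypothetical trigger specifications. The vague one leaves every
# decision to the builder; the specific one can be built against.
VAGUE_TRIGGER = {"event": "new enquiry"}

SPECIFIC_TRIGGER = {
    "event": "form_submission",
    "source": "/contact",                       # exact page
    "platform": "Webflow",                      # where the form lives
    "delivery": "enquiries@yourbusiness.com",   # where submissions land
    "conditions": [],                           # e.g. business hours only
}

def is_buildable(trigger: dict) -> bool:
    """A trigger is buildable when event, source, and delivery are all explicit."""
    return all(trigger.get(k) for k in ("event", "source", "delivery"))
```

The point isn't the format; it's that every key in the specific version answers a question the builder would otherwise have to ask or assume.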



The inputs, what data enters the system

What information is available at the point the trigger fires? List every field, every data source, every piece of context the system will have access to.

If it's a form, paste the actual fields. If it's an email, describe the format. Is it structured or freeform? If it's pulling from a CRM, which fields? Are any of them sometimes empty?

Builders need to know what they're working with. Surprising them with a messy, inconsistent data source halfway through a build is one of the most common reasons timelines slip.
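One practical way to hand this over is a sample payload with the unreliable fields flagged. A hypothetical example for a contact form (in a real brief, paste an anonymised copy of an actual submission, including a messy one):

```python
# Hypothetical sample submission -- illustrative, not real customer data.
SAMPLE_SUBMISSION = {
    "name": "Jane Smith",
    "email": "jane@example.com",
    "phone": "",        # sometimes empty: phone is optional on the form
    "message": "Do you stock the X200 model?",
}

# Telling the builder which fields they can rely on is half the value.
ALWAYS_PRESENT = {"name", "email", "message"}
SOMETIMES_EMPTY = {"phone"}
```

Two examples, one clean and one inconsistent, will tell a builder more than a page of description.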



The outputs, what the system must produce

What does done look like? Be specific about what the system should create, send, update, or log when it runs successfully.

If it's sending an email, what should that email contain and where should it come from? If it's updating a CRM record, which fields change to what? If it's generating a document, what template does it follow?

Don't describe the outcome in general terms. Describe the actual thing. Better yet, show an example of one. "A summary of the enquiry with the customer's details, the product they asked about, and a suggested response" is more useful than "an email reply".



The exceptions, what happens when things go wrong

This is the section most briefs skip entirely. It's also the section that determines whether the system survives contact with real data.

Think through the ways the trigger might fire with incomplete or unexpected information. What if a required field is blank? What if the same person submits twice? What if the data format is different from usual? What if an upstream system is down?

You don't need to predict every edge case. That's partly what the build process is for. But listing the ones you know about means the builder can handle them intentionally rather than leaving them as silent failure points.

For each exception, say what you'd want the system to do. Flag it for human review? Send an alert? Skip it and log it? Attempt a fallback?
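Those condition-to-action decisions are exactly what the builder encodes. A minimal sketch of how they might look in the build (the function and field names are hypothetical; the point is that every known exception gets a deliberate route):

```python
import logging

logger = logging.getLogger("enquiry_flow")

def handle_submission(sub: dict, seen_emails: set) -> str:
    """Route a submission, handling known exceptions intentionally
    rather than leaving them as silent failure points."""
    if not sub.get("message"):
        logger.warning("Empty message field from %s", sub.get("email"))
        return "flag_for_review"      # required field blank -> human looks at it
    if sub.get("email") in seen_emails:
        logger.info("Duplicate submission from %s", sub["email"])
        return "skip_and_log"         # same person submitted twice
    seen_emails.add(sub["email"])
    return "process"                  # happy path
```

Every branch here came from a one-line decision in the brief. Without those lines, each branch is a guess.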



The success metric, how you'll know it's working

How will you measure whether this automation is doing its job? Not in vague terms. Specifically.

It might be: "This process currently takes our admin team four hours a week. After automation, it should take under 30 minutes of human review time." Or: "100% of enquiries receive an initial response within five minutes, up from our current average of four hours." Or: "Zero data entry errors in CRM records created through this flow."

A success metric does two things. It gives the builder a clear target. And it gives you a way to evaluate whether what you got is what you needed, before you've paid the final invoice.



A template you can use right now

Copy this and fill it in before your first conversation with any automation developer.

Process name: What do you call this process internally?

Current process: Step by step, what happens today? Who does each step? Which tools do they use? How long does it take?

Trigger: What event starts this workflow? Be as specific as possible, including the exact source, format, and any conditions.

Inputs: What data is available when the trigger fires? List every field and data source. Flag anything that's sometimes missing or inconsistent.

Outputs: What should the system produce? Describe each output specifically, and include an example or template where you have one.

Exceptions: What could go wrong or arrive in unexpected formats? For each exception, what should the system do?

Success metric: How will you measure whether this is working? Include current baseline and target.

Tools already in use: Which systems does this need to connect to? Include names and whether you have API access.

Volume: How often does this trigger fire, per day or per week? Is it seasonal or consistent?

Out of scope: Anything you explicitly don't want this system to touch?



Bad brief vs good brief: a side-by-side comparison

The gap between a bad brief and a good one often isn't the length. It's the specificity. Here's what that looks like in practice.



Bad: 'We want to automate our emails'

This is real. I've received versions of this brief. Sometimes it comes with a bit more context ("we want to automate our email responses to enquiries"), but not much more.

What's a builder supposed to do with this? Which emails? Arriving where? Containing what? Responded to how? With what content? Sent from which address? Logged where? Reviewed by whom?

Every one of those questions is a decision point in the build. If the brief doesn't answer them, the builder either asks (adding time and back-and-forth) or assumes (adding risk). Usually both.

The build that comes back from a brief like this will be generic. It'll handle the obvious scenario. It won't handle your business.



Good: 'When a new enquiry arrives via the contact form...'

Here's what a better version of the same request looks like:

"When a new form submission arrives via the /contact page on our website (Webflow, sends to enquiries@ourbusiness.com), extract the customer's name, email, phone, and the message field. Check our CRM (HubSpot) for an existing contact with that email. If a record exists, add a note and log the enquiry. If not, create a new contact. Send the customer an automated acknowledgement from enquiries@ourbusiness.com using the template attached. Post the enquiry to our New Enquiries Slack channel with the customer name and a link to the HubSpot record. If the message field is empty or the email looks invalid, flag it for manual review and don't create a record. Success: response sent within two minutes, 100% of valid enquiries logged in HubSpot with no manual data entry."
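That prose maps almost one-to-one onto a build. A rough sketch of the control flow it implies, to show how little is left to guesswork (the CRM, email, and Slack calls are stand-ins, not real HubSpot or Slack API code):

```python
import re

EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def process_enquiry(form: dict, crm: dict, actions: list) -> None:
    """Control flow implied by the brief. `crm` is a stand-in keyed by
    email; `actions` records the side effects a real build would perform."""
    email = form.get("email", "")
    message = form.get("message", "")

    # Exception path the brief spells out: flag it, don't create a record.
    if not message or not EMAIL_RE.match(email):
        actions.append("flag_for_manual_review")
        return

    # CRM lookup: note on an existing contact, or create a new one.
    if email in crm:
        actions.append("add_note_to_existing_contact")
    else:
        crm[email] = {"name": form.get("name"), "phone": form.get("phone")}
        actions.append("create_contact")

    actions.append("send_acknowledgement")   # templated auto-reply
    actions.append("post_to_slack")          # New Enquiries channel
```

Notice that every branch, including the failure branch, traces back to a sentence in the brief. That's what "can be built from" means.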

That brief can be built from. The builder knows the trigger, the data sources, the integrations, the outputs, the exceptions, and the success criteria. They can quote accurately and deliver something that fits.



How to identify and document edge cases before the build starts

Edge cases are the things that break automations. And most of them are predictable. You just have to think about them before the build rather than after.

A simple approach: take your process and ask "what's the weirdest thing that could turn up here?" Then ask again. And again.

If your trigger is an inbound email: what if it's a reply to a previous thread rather than a fresh enquiry? What if it's spam? What if it contains an attachment? What if it's in a different language?

If your trigger is a form submission: what if someone submits the same form twice? What if they use a different email address each time? What if the phone number field contains letters?
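Most of those questions can be checked mechanically once they're written down. A hedged sketch of the kind of pre-processing that falls out of the form-submission examples above (names are illustrative):

```python
import re

def find_edge_cases(form: dict, previous_emails: set) -> list:
    """Surface the known edge cases in a submission -- duplicates,
    letters in the phone field -- so they're handled by design."""
    issues = []
    if form.get("email") in previous_emails:
        issues.append("duplicate_submission")
    phone = form.get("phone", "")
    if phone and re.search(r"[A-Za-z]", phone):
        issues.append("letters_in_phone_field")
    return issues
```

A builder can only write checks like these for the cases the brief names. The rest stay invisible until production.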

You don't need to solve every edge case yourself. But you need to surface the ones you know about. The brief is your opportunity to say "we know this happens, here's how we'd like it handled" rather than discovering it six months into production when a real customer hits the scenario.

At AMPL, we go through this exercise with every client during the audit. It reliably surfaces three or four things the business hadn't thought to mention, and those things almost always affect the architecture of the build. Better to find them in a conversation than in a post-launch bug report.



What happens when you hand a good brief to a builder

The process changes completely.

A good brief means the first conversation is about refinement, not discovery. The builder isn't trying to understand your business from scratch. They're asking targeted questions about specific decisions. "You said flag invalid emails for manual review, does that mean a notification to a specific person, or a shared inbox?" That's a productive conversation. It moves fast.

Scoping becomes accurate. When a builder knows exactly what they're building, including the edge cases, they can give you a realistic cost and timeline. Vague briefs lead to vague quotes, which lead to scope creep and budget surprises later.

The build itself is faster. Fewer clarification rounds mid-build. Fewer "I assumed you meant X" moments. Fewer rebuilds.

And the QA process is cleaner. When the success metric is defined upfront, both sides know what passing looks like. Either the system handles 100% of valid enquiries in under two minutes or it doesn't. That clarity is valuable, for you and for the builder.

To be honest, the quality of the briefs we receive at AMPL correlates directly with how smoothly the project runs. Not because the team can't handle ambiguity (we can), but because no amount of skill on the builder's side fully compensates for missing information about the business on the client's side.

If you've got a process in mind and you're not sure how to scope it, that's exactly what our audit is designed for. We map the process, identify what's automatable, and produce a brief you can build from. Book a free audit at amplconsulting.ai.



FAQ: writing automation briefs



How long should an AI automation brief be?

Long enough to cover the six components: current process, trigger, inputs, outputs, exceptions, and success metric. For a simple single-workflow automation, that might be one page. For a multi-step system with several integrations, it might be four or five. Don't pad it out, but don't cut it short to save time. The time you invest in the brief comes back in a faster, cheaper build.



Do I need technical knowledge to write the brief?

No. You need to know your business and your process. The technical decisions are the builder's job. What tools you use, what data is involved, what the trigger looks like, those are business questions, not technical ones. If you're not sure which fields your CRM captures, find out before the briefing conversation. But you don't need to know how the API works.



What if I don't fully understand the process myself?

Document it before you brief it. Talk to the person who runs the process day to day. Shadow them for an hour. Record a screen share of them doing it. You can't brief an automation of a process you can't describe, and that's true regardless of who's building it. This is also something AMPL covers in the audit: we map the process with you before scoping the build.



Should I include examples of the data in my brief?

Yes, wherever you can. A sample form submission, an anonymised example email, a screenshot of a CRM record, these are worth more than a paragraph of description. They show the builder exactly what they're working with, including any formatting quirks or inconsistencies that a written description would miss.



What if I don't know what the success metric should be?

Start with time. How long does the process take today in total staff hours per week? That's your baseline. Then ask: what would good look like? Even a rough answer gives the builder something to target. If you genuinely have no idea, that's a signal the process needs more mapping before it's ready to be automated.



Can I write the brief after the first conversation with a builder?

The brief should come before, not after. The first conversation goes much better when the builder has already read something. It lets them ask the right questions rather than starting from zero. Writing even a rough draft and sharing it ahead of the call is better than arriving with nothing. Treat it as a working document that gets refined through the conversation.