
AI Training Week 1: Foundations and Prompt Basics (Script)
TL;DR
- Week 1 is about removing fear and shipping one tiny output per person — not theory.
- Skip "What is an LLM" lectures; spend that time on hands-on prompt drills with the employee's own work.
- End the week with a public artifact (Slack channel of "what I shipped today") so the program has gravity for week 2.
When a CEO of a 220-person logistics company asked me what week 1 of an AI rollout should actually cover, I told her: nothing fancy. Just get every employee to send one prompt that produces something they would have typed themselves.
Why week 1 decides whether the program survives
Most corporate AI programs die in the first ten days. Not because the tools are bad. Because the kickoff is a 90-minute slide deck about transformer architectures, followed by a vague "go try it." Adoption craters before the second cohort meeting.
Microsoft's own internal data on its 300,000-employee Copilot rollout showed usage dropped over 80% within three weeks when the training was front-loaded with theory and back-loaded with practice. BCG's 2025 AI Radar found that only about 25% of organizations deploying AI see meaningful value — and the gap is almost always on the people side. BCG's own 10-20-70 rule is the cleanest framing I've found: 10% of AI value comes from algorithms, 20% from infrastructure and data, and 70% from people and process. Week 1 is where you either build that 70% or quietly forfeit it.
Definition: Prompt — a written instruction sent to an AI model. The quality of the prompt is the single biggest predictor of output quality, more than the model choice.
What week 1 should actually contain
A 6-week program that works has a specific shape, and week 1 sets the rhythm for the remaining five. The structure I keep coming back to has three live touchpoints and one async assignment:
- Day 1 — 60-minute kickoff (live). Founder or sponsor opens. Not the L&D team. This signals the program is a business priority, not a training initiative.
- Day 2 — 90-minute hands-on lab (live). Each employee opens ChatGPT, Claude, or Copilot with their actual inbox or document open next to it. They prompt their own work, not a fictitious case study.
- Day 3-4 — async drills (30 min/day). Three short prompt drills tied to the employee's role.
- Day 5 — 45-minute show-and-tell (live). Each person posts one artifact. One. No exceptions.
The mistake I see most often: skipping day 5. Without a forcing function for a public artifact, week 1 evaporates by Monday.
Definition: AI Champion — a peer-level employee (1 per 15-20 staff) trained one cohort ahead, who runs labs and answers questions in their own department.
The kickoff script (60 minutes, copy/paste)
This is the script I give founders running week 1 themselves. Edit, but don't over-edit.
[0:00-5:00] Founder opens. "Why we're doing this. What I expect by week 6."
No slides. Just talking. Two minutes max on "the world is changing."
[5:00-15:00] Live demo: founder runs one prompt against a real company artifact
(a real email, a real spreadsheet, a real meeting transcript).
Show one good result. Show one bad result. Explain why.
[15:00-25:00] Augment, don't replace — the operating principle.
"AI doesn't take your job. It takes the boring 30% so you do
the interesting 70% better. If you feel replaced, you're
using it wrong — come find me."
[25:00-40:00] AI Champions introduced by name. Each champion stands up and
says which department they support and how to reach them.
[40:00-55:00] Hands-on: everyone opens their AI tool of choice and writes
one prompt that produces something useful for their next
real meeting today.
[55:00-60:00] Logistics: where to ask questions, where to post artifacts,
when day 2 lab starts.
Tool tip (Course for Business): The single biggest week-1 lever is the Augment, don't replace framing in the kickoff. If employees suspect AI is here to thin headcount, they will quietly sandbag the program — and your week-6 metrics will look like Microsoft's. Pair that framing with AI Champions at the one-per-15-20 ratio so every employee has a peer they can ask without feeling stupid in front of their manager. The 6-week program at https://course.aiadvisoryboard.me/business is built around exactly this kickoff.
The first prompt every employee should write
The drill that works on day 2:
Take an email you wrote last week that took you more than 10 minutes.
Paste the situation (not the email) into ChatGPT/Claude/Copilot.
Ask: "Draft a 4-sentence reply to [recipient role] that achieves
[specific outcome] and avoids [specific risk]."
Compare the AI draft to what you actually sent. Keep what's better.
That's it. No prompt-engineering theory. No "act as a senior strategist" cosplay. The compare-to-what-you-actually-sent step is what builds judgment, and judgment is the whole point of week 1.
Good vs bad week-1 prompts
Bad: "Write me an email to the supplier."
Good: "Draft a 4-sentence email to a supplier explaining we're delaying the PO by 2 weeks because of a customs hold, in a tone that protects the relationship and doesn't admit fault for the customs issue."
Bad: "Summarize this meeting."
Good: "Summarize this 45-min meeting transcript into 3 decisions, 3 open questions, and 2 owners with deadlines. Flag anything that contradicts what was said in the prior meeting [paste]."
The pattern: specify the audience, the structure, and the constraint.
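The audience/structure/constraint pattern can be sketched as a simple template. This is an illustrative helper, not part of the course material — the function name and field labels are assumptions, and the example values are taken from the supplier prompt above.

```python
def build_prompt(context: str, audience: str, structure: str, constraint: str) -> str:
    """Assemble a week-1-style prompt that names who the output is for,
    what shape it takes, and what it must avoid."""
    return (
        f"Context: {context}\n"
        f"Draft a reply for {audience}.\n"
        f"Structure: {structure}\n"
        f"Constraint: {constraint}"
    )

# Example: the supplier delay email from the Good/Bad comparison.
prompt = build_prompt(
    context="we are delaying the PO by 2 weeks because of a customs hold",
    audience="a supplier contact",
    structure="4 sentences, tone that protects the relationship",
    constraint="do not admit fault for the customs issue",
)
print(prompt)
```

The point of making the three slots explicit is that a vague prompt almost always turns out to be missing one of them.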
Team scan (what AI champions report after week 1)
- ~70-80% of employees produce at least one prompt by day 5 if there's a forcing function; under 30% if there isn't.
- The first roadblock is almost never the tool — it's "I don't know what to ask it about my work."
- Sales and customer-success teams adopt fastest; finance and legal slowest (and need a different week-2 track).
- The single most common week-1 use case is email drafting, followed by meeting summaries.
- Champions report that the biggest blocker is shame, not skill: people don't want to look stupid pasting a clumsy first prompt.
- About 1 in 5 employees will privately have already used ChatGPT for months; surface them early as informal champions.
- Departments where the manager visibly uses AI in week 1 hit double the adoption of departments where the manager doesn't.
- Asking "what did you ship today?" in the cohort Slack outperforms any LMS dashboard for keeping the cohort alive.
- Privacy fear is the second-largest blocker; address it on day 1, not day 5.
- The single best signal that week 1 worked is unprompted prompts being shared peer-to-peer by Friday.
Micro-case (what changes after 7-14 days)
A 180-employee professional-services firm I advised ran their week 1 in early 2026. Day 1 kickoff by the managing partner — 55 minutes, no slides. Day 2 lab with eight AI champions seeded across the practice areas. By day 5, 162 of 180 employees had posted one artifact. The most common output: rewriting standard client-update emails, which the senior associates had been spending around 90 minutes a week on. By day 14, the operations lead reported the cohort Slack had over 400 prompts shared — peer-to-peer, no one asked them to. Week 2 had momentum it didn't have to manufacture. Compare that to the previous year, when the same firm had bought a $40k LMS-based AI course; completion was under 20% and zero artifacts shipped.
Note on this case: This example is illustrative — based on typical patterns we observe with companies of 30-500 employees, not a single named client. Specific numbers are rounded approximations of common ranges, not guarantees.
Tool tip (Course for Business): Don't try to teach prompt engineering theory in week 1 — teach the Shoulder-to-Shoulder hot-seat drill instead. One employee shares their screen, types a real prompt against a real document, and the champion coaches in the moment. It's the fastest way to compress weeks of fumbling into one session, and it's the format the 6-week program at https://course.aiadvisoryboard.me/business runs every week. The transfer rate from observation to independent use is markedly higher than any video-based curriculum I've benchmarked.
FAQ
Should we let everyone use any AI tool, or pick one? Pick one for week 1. Optionality is paralysis at this stage. ChatGPT or Copilot for most SMBs; Claude if your team writes a lot of long-form. You can introduce alternatives in week 3 (see week 3 — tool deep-dive).
What if some employees have already been using ChatGPT for 6 months? They become unofficial champions on day 2. Ask them to share one prompt they actually use. Resist the temptation to make them sit through basics — assign them a peer-mentor role instead.
Do we need a privacy/legal briefing before week 1? A 5-minute one, yes. Tell employees what data NOT to paste (customer PII, financials, contracts under NDA) and which sanctioned tool to use. Save the deeper Responsible-AI conversation for week 5.
Is 6 weeks really necessary? Can we do a 3-day intensive? A 3-day intensive looks impressive on a calendar and produces almost no behavior change. The 5-hour-training threshold from BCG is a floor, not a target. Spaced practice across 6 weeks is what makes it stick. (We separately run a B2B operating-system product on the advisory side, but that's a different conversation.)
What's the one metric that proves week 1 worked? Number of employees who shipped one artifact by Friday, divided by total employees in the cohort. Anything over 70% means week 2 will fly. Under 50%, do a week-1.5 patch — don't move on.
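The metric and its thresholds come straight from the answer above; a minimal sketch, assuming the function name and verdict wording, which are illustrative:

```python
def week1_verdict(shipped: int, cohort_size: int) -> str:
    """Week-1 success metric: employees who shipped one artifact by Friday,
    divided by cohort size. Thresholds (70% / 50%) are from the FAQ above."""
    rate = shipped / cohort_size
    if rate > 0.70:
        return "proceed to week 2"
    if rate >= 0.50:
        return "borderline: tighten the forcing function"
    return "run a week-1.5 patch before moving on"

# Using the micro-case numbers: 162 of 180 shipped -> 90%.
print(week1_verdict(162, 180))  # proceed to week 2
```

Track the numerator in the cohort Slack, not an LMS dashboard — the artifact post is the count.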
Conclusion
Week 1 isn't about teaching AI. It's about removing the fear that's been quietly compounding in your organization since ChatGPT launched. Get every employee to ship one tiny artifact, name your champions in public, and use the Augment, don't replace framing as the operating principle. The next five weeks compound off this base.
Next step: book the founder-led kickoff on the calendar before you book anything else.
If you want every employee to ship their first AI automation in five days — book a 30-min call and we'll map your team's first week: https://course.aiadvisoryboard.me/business