A course is the longest-running creative project most independent educators ever ship. The first version takes months. The second version, six months later, is supposed to be better — informed by what students struggled with, what landed, and what fell flat. In practice, the revision usually feels like starting over because the student feedback and the design notes are scattered across email threads, survey exports, and a folder named "course feedback" that you never opened in time.
This guide is about putting the design work, the running drafts, and the student feedback into one vault so the next iteration is informed by the last one — without an evening of digging.
What course-creator notes are actually for
A course's notes do four jobs:
- Hold the design — outlines, lesson scripts, exercise designs, the rationale for each decision.
- Capture the build — slides, recordings, written materials, the artifacts students see.
- Track the cohort — who's in, where they are, what they're stuck on.
- Absorb the feedback — survey responses, support questions, comments, exit interviews.
The first three are familiar — every course creator has some version of them. The fourth is where most courses lose value over time. Feedback comes in. It gets read once. Then it gets buried. The next iteration doesn't get the lessons. A vault that holds the feedback alongside the design lets the lessons accumulate. For a deeper take on the design layer specifically — sequencing, dependencies, sources — see our curriculum design guide.
The curriculum page — design as a living document
A course needs a top-level curriculum page. Module structure, learning objectives per module, the throughline that connects them, the kind of student you're designing for, the prerequisite knowledge you assume.
That page is a living document. It's not a one-time deliverable. As you teach, you learn things about the curriculum's shape — module three is too long, module five depends on something module four doesn't actually establish, modules six and seven should swap order. Those discoveries belong on the curriculum page, dated, so the next revision starts informed.
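Dated entries can be as light as this. The dates and wording below are invented for illustration, not a prescribed format:

```markdown
## Structural notes

- 2025-03-14: Module three ran long in both sessions. Split candidate.
- 2025-03-28: Module five leans on a concept module four never establishes. Fix the dependency or reorder.
```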
Sub-pages under the curriculum hold each module's detailed design — lesson outlines, exercise specs, assessment criteria. Unlimited page nesting means the design can be as deep as it needs to be without becoming a tangle.
The agent reads across the whole structure. Ask: "Read the entire curriculum and flag any place where a lesson references a concept that hasn't been introduced yet." You get a structural review that's hard to do manually because it requires holding the whole arc in your head at once. The agent doesn't have that limit.
Lesson pages — script, slides, and the post-delivery notes
Each lesson lives on its own page. The page holds the lesson script or outline, links to slides or the recorded video, the exercises or assignments, and a notes section that fills in as you teach.
When you record video lessons, drop the audio or video file in. It transcribes with speaker diarization. The transcript lives next to the recording on the same page — searchable, quotable, easy to revise from.
After delivering a cohort, three lines per lesson: what landed, what confused students, what you'd change. Six months later when you're revising, those notes are the most useful thing in the vault. The agent can summarize across them: "Read every post-delivery note across the curriculum. What patterns recur? Which lessons have repeated complaints? Which ones consistently land?" You get a revision agenda informed by every cohort you've taught, not just the last one you remember.
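The per-lesson note needs no ceremony. A minimal template, with invented specifics standing in for your own:

```markdown
## Post-delivery notes, Spring cohort

- Landed: the live walkthrough; several students cited it in their exercises.
- Confused: the exercise brief. Multiple students asked what "done" looks like.
- Change: show a finished example before assigning the exercise.
```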
Student feedback — the substrate that should drive revision
Most courses collect more feedback than they use. Mid-course surveys, exit surveys, support tickets, comments in lesson discussions, async questions during cohort calls. The volume is fine; the integration is what's missing.
A "Feedback" section in the vault, with a sub-page per cohort, holds all of it. After each cohort, paste in the survey responses, the support themes, the exit interview transcripts. PDFs of survey exports auto-convert to markdown via docstrange so the agent can search them. Audio of exit interviews transcribes with speaker labels.
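Putting the pieces so far together, the vault might be laid out roughly like this. The names and nesting are illustrative, not a required structure:

```text
Course vault
├── Curriculum                  (living design doc, dated structural notes)
│   ├── Module 1
│   │   ├── Lesson 1.1          (script, slides, recording + transcript, post-delivery notes)
│   │   └── Lesson 1.2
│   └── Module 2
├── Feedback
│   ├── Spring cohort           (surveys, support themes, exit interview transcripts)
│   └── Fall cohort
└── Cohorts
    └── Spring cohort           (roster database, call recordings)
```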
The agent reads across the whole feedback corpus. "Read every cohort's exit feedback over the last year. What's the most common reason students reported they didn't finish? What's the most common reason students said they did?" You get a real signal grounded in actual student voices. That signal informs the next revision in a way a casual read never could.
For specific design questions: "Find every time students mentioned struggling with the module on data structures. Group by cohort. Pull the strongest specific complaints." Now you can see exactly where the lesson is breaking down and have actual quotes to design around.
Tracking the cohort while it's running
While a cohort is live, the operational layer matters. Who's in. Where they are. Who's behind. Who needs a check-in.
An inline database via the :::database::: directive on a cohort page handles this. Columns: name, email, current module, last activity, status (on track / behind / at risk / dropped). Update as the cohort runs. Filter by status to see who needs attention this week. Sort by last activity to spot the quietly disengaged.
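A sketch of what that could look like on the cohort page. The directive name comes from this guide; the table body underneath is a guess at the shape, with invented names and dates, so check the app's own docs for the exact syntax:

```markdown
:::database:::
| Name       | Email                 | Current module | Last activity | Status   |
|------------|-----------------------|----------------|---------------|----------|
| A. Learner | a.learner@example.com | Module 3       | 2025-04-02    | behind   |
| B. Learner | b.learner@example.com | Module 4       | 2025-04-09    | on track |
```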
The agent can give you the morning view. "For the spring cohort, list every student who's behind on the current module and hasn't engaged in the last seven days." Five names. Three of them get a personal email. The other two are on planned breaks. The disengagement that quietly kills cohort completion gets caught while it's still recoverable.
For office hours and group calls, record them when participants are comfortable with it. Drop the audio in. The transcript lets you go back later and pull the moment a student asked a particularly clarifying question — a question that probably belongs in the next iteration of the lesson. The general transcription mechanic is documented at AI meeting note taker with speaker labels.
Iteration: the version-to-version improvement loop
The reason a vault makes course design different over time is that the version-to-version improvement loop actually closes. You teach a cohort. Feedback accumulates on the cohort's feedback page. Post-delivery notes accumulate on the lesson pages. The curriculum page picks up the structural notes.
When you sit down to revise for the next cohort — three months out, say — open the curriculum page and ask the agent: "Based on the last cohort's feedback, the post-delivery notes, and any support-question themes, what are the top five revisions I should make to this curriculum? Group by module." You get a revision plan grounded in the actual evidence. You decide which revisions to take. You make them on the lesson pages directly. The next cohort gets the improved version.
For longer-running courses with three or four cohorts behind them: "Compare feedback patterns across the last three cohorts. What's gotten better? What's still showing up?" You can see whether your revisions worked, or whether the same complaint keeps coming back from a deeper cause you haven't addressed yet. That feedback loop is hard to maintain without the tools, and the course slowly drifts as a result. With them, the course gets sharper every cohort.
Designing exercises and assessments
Exercises are where most courses thin out. The lessons are tight; the exercises are afterthoughts, often because designing a good exercise requires understanding what students actually struggle with, and that understanding is downstream of teaching the course at least once.
After a cohort, ask the agent: "For each exercise in the curriculum, pull every student comment about that specific exercise. Was it too easy? Too hard? Unclear what was being asked?" You get an exercise-by-exercise diagnostic. The two exercises everyone struggled with get rewritten. The exercise that nobody mentioned might not be doing real work.
For assessment design, the same pattern. Pull every student response to a specific assessment, ask the agent to summarize what most students got right and what most missed. The pattern tells you whether the assessment is testing what you intended.
Marketing and student acquisition material
A course needs material to sell it — a sales page, a free preview, a sample lesson, a case study from a past student. Most of this material can be drafted from the curriculum and the feedback corpus. The drafting workflow that turns those into copy is in drafting emails, proposals, and newsletters inside your notes app.
"From the curriculum and the last cohort's strongest exit feedback, draft a sales page section that talks about what students will be able to do after completing the course. Use specific quotes from feedback where they support the claim." The draft is grounded in actual student outcomes, not aspirational marketing language. You edit. You publish. The sales page reflects the course as it actually is.
For testimonials, ask: "Pull every exit survey response that's positive enough to be a testimonial. List the student's name (if they consented to be quoted) and the strongest two sentences from their response." You get a shortlist. You confirm permissions. The testimonial section gets populated from real student voice.
A practice that holds across cohorts
The system only pays off if you use it across multiple cohorts. The habit is small: take notes during teaching, capture feedback after each cohort, do a thirty-minute revision pass before the next cohort starts. Over two or three cohorts, the curriculum gets noticeably sharper. Over five or six, you have a course that's genuinely been improved by the people who took it — instead of one that's still mostly the version you wrote on a Sunday.
The agent doesn't design the curriculum or write the lessons. It does the chores around the design — finding the patterns, surfacing the recurring complaints, drafting the sales material, summarizing the feedback. The substance and the teaching judgment stay with you. The administrative weight that usually keeps courses from being revised gets light enough that revision actually happens.
Try Docapybara free — sign up, drop in your last cohort's exit feedback and one of your lesson outlines, and ask the agent what it sees.