AI-Powered Session Review Workflow for Coaches
CLIENT
Careerfully
INDUSTRY
EdTech, AI, B2B SaaS
My Role
Product Designer
Team
3
Responsibilities
0-1 Feature Design
Timeline
3 weeks
Tools
Figma, Figma Make, Lovable, ChatGPT
What is Careerfully?
Careerfully is a coaching platform for business school students. It serves both coaches and students. Coaches manage their sessions and send follow-ups, and students track their progress.
Users
The primary users are coaching professionals who run recurring sessions with business school students and need to send structured follow-ups after every session. They are time-constrained, detail-oriented, and care deeply about the quality of what gets sent to their students.
Problem
Coaches had no structured way to manage their sessions: what was coming up, what was done, and what still needed a follow-up.
After every session, coaches were manually writing summaries and action items and sending them to students, time that AI could save. But automatically sending AI-generated content directly to students wasn't the right call either; coaches needed to review and approve what goes out.
DESIGN CHALLENGE
How might we give coaches a simple way to manage their sessions and pending follow-ups, while using AI to automate summaries and action items without removing their judgment from what gets sent to students?
Solution
I designed the Coach Session Dashboard and Session Detail pages, giving coaches a clear view of upcoming and past sessions, and a way to review, edit, and approve AI-generated summaries and action items before sending to students.

Prototype demo: coach session dashboard and session detail page
Qualitative Impact & Result
The feature launched successfully. Coaches said the new dashboard and AI enhancements would organize their sessions well, build their trust and confidence, and save them time on post-session admin work.
Coaches liked the dashboard view; having a single place to see all their upcoming and past sessions was a clear improvement over the previous workflow.
Drag-to-reorder action items landed well; coaches wanted control over the order and priority of what they sent students.
The ability to review, edit, and approve AI-generated summaries and action items was well received.
All the features rolled out with minimal technical friction.
What coaches actually needed
Through stakeholder conversations, it became clear that the problem wasn't just about saving time. Coaches needed:
A clear view of sessions and pending follow-ups for students
The ability to review AI-generated summaries and edit them before sending
Confidence that they were the ones approving what the students received
Action items that were accurate and editable, not just auto-populated
The Key Insight: Coaches didn't distrust the AI outright, but they needed to feel like the final decision was theirs.
Competitive Analysis
I looked at Fathom, Otter.ai, Attiko, and Zoom AI Companion, focusing on how much control each platform gives the session owner over AI-generated summaries and action items.
- Fathom, Otter.ai, and Attiko all show the summary and the transcript side by side. Clicking any word jumps directly to that point in the transcript and the video recording, so the coach can verify instantly without wondering if the AI made something up.
- Otter.ai also lets users edit the AI-generated summary and action items before sharing.
WHAT WE CARRIED FORWARD
The side-by-side transcript and summary view from Fathom and Attiko, and Otter.ai's feature to edit summaries and action items before sending.
Design Decisions
Design decisions split into two areas: session management through the dashboard, and coach control over AI-generated content through the review workflow.
Dashboard Design
As soon as the coaches land on the dashboard, they see their upcoming session front and center. In the Past Sessions tab, sessions that still need coaches' follow-up surface at the top.

Dashboard: upcoming sessions and pending follow-ups
Session Detail page
When a coach opens a past session, they see the session name, scheduled time, runtime, and logged hours at the top. Below that, two tabs, Summary and Transcript, give them access to the AI-generated content. Three specific design decisions came out of coach feedback:
01
Edit Logged hours
Coaches wanted the ability to edit logged hours because the actual time spent on a session — including prep or follow-up — is their call, not something the system should lock in automatically.
02
Drag action items
Rather than deleting AI suggestions that were off, coaches can drag the correct ones into the Approved Actions list and leave the rest behind.
03
Approve and send
Once the coach is satisfied with the summary and action items, a clear Approve and Send button confirms that they have reviewed everything and are ready to send it to the student. Nothing goes out without that explicit action.


Summary tab: editable inline
Drag to approve: AI suggestions stay visible
Design Handover & Stakeholder Alignment
Getting the design to a place where it was technically feasible and stakeholder-approved took several rounds of alignment that shaped our final design.
WITH DEVELOPER
During handover, the main conversation was around the editing behavior of summaries and action items on the session detail page. We decided to go with explicit Save and Cancel buttons rather than autosave: simpler to build and clearer for the coach. The rest of the design moved forward as designed.
STAKEHOLDER FEEDBACK
The feedback that came up was about the red color used for AI-suggested action items. Coaches found it alarming and couldn't tell whether the items were warnings to avoid or suggestions to consider. We changed it to black, which removed the confusion.
Learnings
This project taught me a lot as a designer.
1
Red can mean error. It can also mean important
When to use red is trickier than it seems. Both "error" and "urgent/important" use cases reach for red, and in this context, coaches read "AI-suggested Actions" as an error state, not an action prompt.
2
User needs shape the design direction
Drag-to-reorder action items and the ability to edit logged hours weren't in the original brief. They came directly from coaches. Listening to what users actually needed, and building it in, made the product meaningfully better.
Next Steps
The next phase is to design an interface where AI gives coaches feedback after their sessions end, surfacing what went well and what could be improved. We will explore how AI could be used to support coaches' growth.