Bianca Starling
Unyleya · 2018–2020

Teacher Operations & Learner Experience

Operations · UX Design · EdTech · Teacher Tools

Teacher experience and learner experience are more connected than most learning platforms admit. Fix one badly and you break the other.

This is the part of edtech product work that rarely makes it into a case study, because it’s not one decision with a clean outcome — it’s dozens of small decisions about process and interface that compound into something that either works or doesn’t. Nobody can point to the moment when teacher operations became smooth. They just notice, eventually, that things that used to take hours take minutes, and problems that used to surface in student complaints stop surfacing.

At Unyleya, the teacher experience and learner experience were both broken in ways that fed each other. Teachers navigating a confusing administrative portal made errors that showed up as learner problems. Learners hitting friction in their interface generated support tickets that consumed teacher time. The two systems weren’t just related — they were a loop, and the loop was running in the wrong direction.

What Teacher Operations Actually Means

When people say “teacher operations” in an online learning context, they often mean the visible part: uploading content, recording videos, participating in discussion forums. Those are real, but they’re the tip of the iceberg.

The operational work underneath includes:

- onboarding to the platform (which, if it takes more than a day, immediately creates a negative relationship between teacher and tool)
- submitting content for review and understanding where it is in the review process
- receiving feedback on rejected or revised content
- responding to student questions and escalations
- tracking basic metrics about how their courses are performing
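The case study doesn’t describe how Unyleya modeled the review pipeline, but the visibility problem in that list (knowing where a submission sits and what the teacher should do next) maps naturally onto a small state model. A minimal TypeScript sketch, with every name and transition hypothetical:

```typescript
// Hypothetical model of the review states a teacher needs visibility
// into; names and transitions are illustrative, not Unyleya's.
type ReviewStatus =
  | { kind: "draft" }
  | { kind: "submitted"; submittedAt: Date }
  | { kind: "in_review"; reviewer: string }
  | { kind: "changes_requested"; feedback: string }
  | { kind: "approved"; publishedAt: Date };

interface ContentSubmission {
  id: string;
  courseId: string;
  title: string;
  status: ReviewStatus;
}

// What a teacher-facing portal needs to answer at a glance:
// "where is my content, and what do I do next?"
function nextAction(s: ContentSubmission): string {
  switch (s.status.kind) {
    case "draft":             return "Finish and submit for review";
    case "submitted":         return "Waiting for a reviewer";
    case "in_review":         return `In review with ${s.status.reviewer}`;
    case "changes_requested": return `Revise: ${s.status.feedback}`;
    case "approved":          return "Published";
  }
}
```

The point of the discriminated union is that the interface can always render a definite next step rather than a pile of fields, which is the property the task-based redesign below relies on.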

At Unyleya, this work was being done through a portal that had been built for administrators first and teachers second — or possibly third. The interface was dense, form-heavy, and organized around the institution’s internal logic rather than the workflow a teacher actually moves through. Finding anything required knowing where to look, which meant new teachers spent their first weeks asking more experienced colleagues for help rather than doing actual teaching work.

🚧 Need more context: What were the specific pain points identified in teacher feedback or observation? Were there particular tasks that had the longest completion times or highest error rates?

The Redesign Approach

The redesign moved from a form-heavy model to a task-based one. The distinction matters. A form-heavy interface presents you with fields to fill. A task-based interface presents you with the next thing you need to do. For a teacher coming in to submit content for review, the difference is between “navigate to the content submission section, fill out the form, attach the file, submit” and “here’s your pending task: content submission — click to continue.”

This sounds incremental. In practice, it changed how teachers related to the portal. Instead of needing to know the structure of the system, they could just follow the task list. Cognitive load dropped. Error rates dropped. The questions that had been going to support started getting answered by the interface.
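The portal’s actual implementation isn’t described, so the following is only a sketch of what “task-based” implies as a data shape: the interface reads from a queue of pending tasks instead of exposing a map of forms. All names are hypothetical:

```typescript
// Illustrative only: a task-based portal surfaces "the next thing to do".
interface TeacherTask {
  id: string;
  kind: "submit_content" | "revise_content" | "answer_question";
  courseId: string;
  createdAt: Date;
  dueAt?: Date; // not every task has a deadline
}

// A form-heavy UI asks the teacher to know where to go.
// A task-based UI answers one question: what's next?
function nextTask(tasks: TeacherTask[]): TeacherTask | undefined {
  return [...tasks].sort((a, b) => {
    const ad = a.dueAt ? a.dueAt.getTime() : Infinity;
    const bd = b.dueAt ? b.dueAt.getTime() : Infinity;
    if (ad !== bd) return ad - bd;                         // earlier deadline first
    return a.createdAt.getTime() - b.createdAt.getTime(); // then oldest first
  })[0];
}
```

The design consequence is the one the paragraph above describes: the teacher never needs to hold the system’s structure in their head, because ordering and navigation are the interface’s job.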

🚧 Need more context: What was the specific before/after on teacher onboarding time? Were error rates tracked pre- and post-redesign? What platform was the teacher portal built on?

The Data Integration Question

Part of the operational visibility problem was that teacher activity data and learner outcome data lived in separate systems and didn’t talk to each other. A teacher could see how many students had enrolled in their course. They couldn’t easily see whether those students were completing modules, where they were dropping off, or how their assessment performance compared to other sections of the same course.

Connecting those data sources gave teachers information they could act on. A teacher who sees that 40% of students stop at module three can investigate module three. A teacher who has no visibility into completion patterns can only wait for support tickets.
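What the integration involved technically isn’t specified (see the note below), but the visibility it produced — per-module completion measured against enrollment — is a straightforward computation once the two data sources share student and course keys. A hedged sketch with hypothetical record shapes:

```typescript
// Hypothetical record shapes; the real systems and fields are unknown.
interface Enrollment { studentId: string; courseId: string; }
interface ModuleCompletion { studentId: string; courseId: string; module: number; }

// For one course: what fraction of enrolled students completed each module?
// A drop-off at module three shows up as a visible step down in this series.
function completionByModule(
  enrollments: Enrollment[],
  completions: ModuleCompletion[],
  courseId: string,
  moduleCount: number,
): number[] {
  const enrolled = enrollments.filter(e => e.courseId === courseId).length;
  const counts = new Array(moduleCount).fill(0);
  for (const c of completions) {
    if (c.courseId === courseId && c.module >= 1 && c.module <= moduleCount) {
      counts[c.module - 1] += 1;
    }
  }
  return counts.map(n => (enrolled === 0 ? 0 : n / enrolled));
}
```

Nothing here is sophisticated, which is the point: the barrier wasn’t analytical difficulty, it was that the two datasets lived in systems that didn’t share a join key in practice.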

🚧 Need more context: What specifically did the data integration involve technically? What data sources were connected and what did the resulting visibility look like for teachers? Were there specific operational decisions that changed as a result of better data?

The Learner Side

The learner interface redesign ran in parallel with the teacher operations work, and the two were designed to reflect each other. The Netflix-inspired student platform described in the course publishing case study was the front end. Behind it, the teacher operations improvements were what kept content quality high and support response times reasonable.

Learners experienced the improvement indirectly. Fewer errors in course content because the teacher submission process had better QA checkpoints. Faster responses to questions because teachers weren’t spending their time fighting the administrative interface. Cleaner course navigation because content was being submitted in consistent formats from the teacher side.
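The QA checkpoints are described only at this level of generality. As an illustration of the idea — catching format problems at submission time rather than in front of students — here is a hypothetical validation pass, with fields and rules invented for the sketch:

```typescript
// Illustrative submission check: surface problems before review,
// not after publication. All fields and thresholds are hypothetical.
interface SubmittedLesson {
  title: string;
  videoUrl?: string;
  transcript?: string;
  attachments: { name: string; sizeMB: number }[];
}

function qaIssues(lesson: SubmittedLesson): string[] {
  const issues: string[] = [];
  if (!lesson.title.trim()) issues.push("Missing lesson title");
  if (!lesson.videoUrl && !lesson.transcript)
    issues.push("Lesson needs a video or a transcript");
  for (const a of lesson.attachments) {
    if (a.sizeMB > 500) issues.push(`Attachment too large: ${a.name}`);
  }
  return issues; // empty array = passes the checkpoint
}
```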

🚧 Need more context: Were there specific learner engagement metrics that improved — completion rates, time in platform, satisfaction scores? Were any of these attributable specifically to the teacher operations improvements vs. the learner UI redesign?

What This Kind of Work Actually Is

Projects like this don’t have the clean narrative arc of “we built a new feature and X happened.” The outcomes are real but distributed — slightly better operational efficiency multiplied across many teachers multiplied across many students over time.

The reason this work matters for a portfolio isn’t the headline metric. It’s the approach: looking at a system with two broken ends, recognizing that fixing one in isolation would leave the loop running wrong, and building the two redesigns to reflect each other. That’s the judgment call. The implementation is the evidence that the call was right.

🚧 Need more context: What was the team structure for this work? Was there a dedicated design resource, or did the PM drive design decisions? What was the timeline and scope — were teacher operations and learner experience redesigned simultaneously or in sequence?
