JUN WANG

Case study

HiteVision Pie Product Line

Interactive Teaching Whiteboard 8.0 and Multi-screen Interaction 2.0

I joined this product line as a UX designer when HiteVision's education software was under pressure from more experience-led competitors. The problem was not simply that the software lacked features. The deeper issue was that many useful teaching capabilities were not being fully delivered in real classrooms because the interaction logic did not fit actual teacher behaviour. My role was to go into real teaching scenarios, identify why usage diverged across teacher groups, and turn those findings into clearer interaction, better feature delivery, and more usable classroom workflows.

Product line
Pie.EXE - Powerful, Interactive, Education
Main projects
Interactive Teaching Whiteboard 8.0 and Multi-screen Interaction 2.0
My role
UX designer with ownership across field research, interaction definition, prototyping, iteration, testing, and launch support
Methods
Contextual inquiry, stakeholder mapping, behavioural segmentation, interaction modelling, and data instrumentation
High-resolution overview for HiteVision Pie Product Line

Industry context and business pressure

Around 2010, competitors began catching up rapidly. While HiteVision still had scale and market presence, competing products were moving faster with more experience-first product thinking.

Under that pressure, the Pie product line was revised and restructured. My task was not just to polish screens, but to help diagnose why the product line was underperforming in actual use and where interaction changes could improve classroom adoption.

Business and industry background for the HiteVision Pie product line
The redesign work happened under direct competitive and business pressure, not only as a design refresh.

Interactive Teaching Whiteboard 8.0

Product description

  • Target users: teachers of all subjects in K12
  • Scenario: lesson preparation and teaching
  • Media: personal computer and HiteVision touch screen
  • Main content: 21 functional modules, including 4 new classroom activity types, 5 gadgets, and usage specifications for classroom tools and basic elements

Revision strategy

  • Greatly simplify the operating interface
  • Delete redundant operation paths
  • Merge one-way operation paths
  • Improve operating efficiency significantly

Team and responsibilities

Team structure for Whiteboard 8.0: 30 R&D, 1 visual designer, 6 PMs, 4 QA, and me as UX.

My responsibilities covered:

  • Participating in demand discussions, brainstorming, and function screening, and helping prioritize functions
  • Creating interaction prototypes and functional logic rules
  • Conveying functional design intent to UI designers
  • Working closely with R&D to refine implementation details
  • Synchronizing design revisions in real time to balance schedule and quality
  • Supporting testing with test cases in the agile process
  • Collaborating with product teams to collect and analyze feedback and guide rapid iteration

Responsibilities and iteration process for Whiteboard 8.0
Whiteboard 8.0 involved dense iteration, release preparation, internal testing, and follow-up design revisions.

Research and analysis

I worked from stakeholder maps, user behavior analysis, questionnaires, and scenario analysis to understand both the teaching environment and actual classroom operations. The most important part of this work was going back to real teaching contexts instead of treating all teachers as one user group.

What I found was that the issue was not only feature richness. Younger teachers and teachers in more developed regions were much more willing to explore interactive tools, while many teachers in less developed areas treated the large screen more like a projection surface. The same product therefore had very different levels of feature delivery depending on the teaching context.

  • The questionnaire collected 312 valid suggestions.
  • Issues included difficult operation, mode confusion, weak gestures, insertion problems, insufficient tools, inconvenient annotation and erasing, and weak sharing support.
  • The research problem was not only what features were missing, but why existing features were not being fully delivered across real classrooms.
  • Research covered both PC lesson-preparation scenarios and large touch-screen teaching scenarios.
Teacher segmentation and feature-delivery gap diagram for HiteVision Pie Product Line
The segmentation view makes the key research finding explicit: feature delivery varied sharply across teacher groups, so the design task was to lower adoption cost, not just add functions.

Teacher segmentation and field insight

The most important research move in this project was refusing to flatten all teachers into one user type. In practice, classroom behaviour differed significantly across teaching style, digital confidence, school conditions, and regional context. That meant the same feature set could look powerful in a demo and still underperform in daily teaching.

In more developed teaching environments, some teachers were willing to explore interactive activities, annotation tools, and richer classroom controls. In less developed environments, many teachers used the large screen more like a projection surface and stayed close to low-risk, familiar actions. This difference changed how I thought about interaction cost, gesture discoverability, and what “feature delivery” really meant.

Rather than asking only what new functions to add, I asked which existing capabilities were failing to cross the usability threshold in real classrooms. That framing led to more grounded decisions about simplification, gesture logic, classroom entry points, and multi-screen interaction support.

Observed classroom contrast

In one lower-resource classroom, I observed a highly committed senior teacher who cared deeply about student outcomes but used the large screen mostly like a projector. He relied on spoken explanation and personal teaching rhythm, while younger students had limited support from interactive visuals or manipulable tools. The product had useful features, but they were not crossing the usability threshold in that environment.

In more developed urban classrooms, younger teachers were much more willing to use geometry tools, annotation, interactive activities, and richer on-screen controls. Students responded faster, participation was more active, and the same product delivered more of its intended value. That contrast made the design challenge very clear: I was not just improving features, I was trying to improve feature delivery across very different teaching realities.

Design implication

This is why I prioritised lower interaction cost, clearer gesture rules, simpler classroom entry points, and better cross-screen coordination. The goal was to help more teachers use the product confidently in live teaching, not only to make advanced users even more powerful.

Segmentation lens

  • Teaching style and willingness to explore interaction-heavy features
  • Digital confidence and tolerance for learning new classroom tools
  • Regional and school-context differences affecting equipment use

Product implication

  • Reduce interaction cost for common classroom actions
  • Improve discoverability for valuable but underused features
  • Treat feature adoption as a design problem, not only a training problem

Information architecture and low-level interaction definition

Based on the research and analysis, I mapped the information and interaction paths for the basic behaviors of lesson preparation and teaching.

In lesson preparation, the redesign covered the file menu, toolbar, interface behavior, QR-code sharing of courseware, convenient access to cloud files, editing and insertion of interactive classroom questions, subject tools, property-panel linkage, auxiliary-line snapping, insertion of teaching resources, and usage specifications for elements.

At the low-level module layer, I also defined gesture operations and touch-feedback behavior in detail, proposed solutions for resolution adaptation and object adaptation, and mapped out the data instrumentation points.

These research findings directly changed the interaction rules I defined. For example, gesture behaviour had to become more predictable and easier to discover, because teachers under classroom pressure would not tolerate ambiguous touch feedback or multi-step tool logic. I also treated cross-screen and mobile-linked actions as a way to reduce podium-bound teaching behaviour and make interactive control feel more natural in the flow of a real lesson.
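To make "predictable gesture rules" concrete, here is a minimal sketch of the kind of declarative gesture rule table this thinking points toward. It is an illustration under assumed names, not the shipped implementation: the gestures, contexts, actions, and feedback strings are all hypothetical.

```typescript
// Hypothetical sketch: one declarative rule per (gesture, context) pair,
// so touch behaviour is deterministic and easy to document for teachers.

type Gesture = "tap" | "longPress" | "twoFingerPinch" | "twoFingerDrag";
type TouchContext = "canvas" | "selectedObject";

interface GestureRule {
  gesture: Gesture;
  context: TouchContext;
  action: string;   // exactly one unambiguous action
  feedback: string; // immediate, visible touch feedback
}

const rules: GestureRule[] = [
  { gesture: "tap",            context: "canvas",         action: "placeCursor",  feedback: "ripple at touch point" },
  { gesture: "longPress",      context: "canvas",         action: "openToolMenu", feedback: "growing ring" },
  { gesture: "twoFingerPinch", context: "selectedObject", action: "scaleObject",  feedback: "live outline" },
  { gesture: "twoFingerDrag",  context: "canvas",         action: "panCanvas",    feedback: "canvas follows fingers" },
];

function resolveGesture(gesture: Gesture, context: TouchContext): GestureRule | undefined {
  // Zero or one rule can match, so resolution never surprises the user.
  return rules.find(r => r.gesture === gesture && r.context === context);
}

console.log(resolveGesture("tap", "canvas")?.action); // "placeCursor"
```

The design point is that each gesture-context pair maps to exactly one action with immediate visible feedback, which is what makes gestures tolerable for a teacher under live classroom pressure.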

Data instrumentation

  • Track and record user behavior and actively detect problems
  • Quantify hot and cold functions to optimize high-frequency usage
  • Analyze user habits to guide follow-up interaction design
  • Connect with later big-data analysis plans and support deeper exploration of the product ecosystem
Data instrumentation design for Whiteboard 8.0
Interaction work was connected to behavior tracking and later product analysis, not only screen-level polish.
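As a concrete illustration of the instrumentation points above, the sketch below shows how usage events could be recorded and tallied into hot and cold features. This is a minimal sketch assuming a simple client-side logger; UsageEvent, track, and featureCounts are hypothetical names, and the real tracking pipeline was richer than this.

```typescript
// Hypothetical sketch of a usage event and a hot/cold feature tally.

interface UsageEvent {
  feature: string;                      // e.g. "annotation.erase"
  scenario: "preparation" | "teaching"; // lesson prep vs. live class
  timestamp: number;                    // ms since epoch
}

const log: UsageEvent[] = [];

// Record one feature use; a real product would batch and upload these.
function track(feature: string, scenario: UsageEvent["scenario"]): void {
  log.push({ feature, scenario, timestamp: Date.now() });
}

// Quantify hot and cold functions: high counts justify optimizing the
// high-frequency path; low counts flag features that need better discovery.
function featureCounts(events: UsageEvent[]): Map<string, number> {
  const counts = new Map<string, number>();
  for (const e of events) {
    counts.set(e.feature, (counts.get(e.feature) ?? 0) + 1);
  }
  return counts;
}

track("annotation.erase", "teaching");
track("annotation.erase", "teaching");
track("insert.geometryTool", "preparation");
console.log(featureCounts(log)); // Map { "annotation.erase" => 2, ... }
```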

Multi-screen Interaction 2.0

Multi-screen Interaction 2.0 was built around the idea of "walking among the students and interacting in the classroom": it let teachers leave the podium and move through the classroom while still controlling the large screen.

I officially took over this product in December 2017, and it launched the following March. Internal evaluation users included Gaosi Education and internal lecturers. In a questionnaire survey, 100% of the 75 teachers surveyed expressed a strong willingness to keep using the revised software.

Product description

  • Target users: teachers of all subjects in K12
  • Scenario: classroom teaching
  • Medium: personal mobile phone and HiteVision touch screen
  • Main content: classroom large-screen linkage through the teacher's mobile phone, image uploading, screen projection, computer courseware control, and classroom live broadcast

Revision strategy

  • Use QR-code binding to simplify the linkage-establishment process
  • Optimize common operations and improve efficiency
  • Deeply optimize the teacher's application experience

Iteration and launch support

Team structure for Multi-screen Interaction 2.0: 10 R&D, 1 visual designer, 2 PMs, 2 QA, and me as UX.

There were 28 internal versions and 5 major design iterations. I followed testing on each release and continuously raised user-experience questions, participated in demand discussions and prioritization, created prototypes and logic rules, aligned with UI and R&D, and collaborated with product teams on experience feedback and rapid iteration.

Responsibilities and iterations for Multi-screen Interaction 2.0
Iteration work included internal reviews, leadership reviews, repeated design updates, and launch preparation.

Key design cases

Case 1

Optimizing the opening method

Before the redesign, teachers had to open the software, tap "large-screen and small-screen interaction", tap "connect", choose between intelligent search and QR-code scanning, select a device, and only then connect successfully. After the redesign, the product ran an automatic intelligent search followed by device selection, which shortened the flow and reduced logic errors.

Before and after opening flow for Multi-screen Interaction 2.0
Connection setup was shortened and simplified.
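A minimal sketch of the shortened flow, assuming a local-network discovery service with QR-code binding kept as the fallback. Every identifier here is a hypothetical stub, included only to make the before/after difference concrete.

```typescript
// Hypothetical stubs: the real product's discovery and binding differ.

interface Device { id: string; name: string; }

// Stub for automatic intelligent search over the local network.
async function discoverDevices(timeoutMs: number): Promise<Device[]> {
  void timeoutMs; // a real probe would respect this timeout
  return [{ id: "screen-01", name: "Classroom 3A large screen" }];
}

// Stub for the QR-code binding fallback shown on the large screen.
async function scanQrCode(): Promise<Device> {
  return { id: "screen-qr", name: "QR-bound large screen" };
}

async function connect(device: Device): Promise<void> {
  console.log(`Connected to ${device.name}`);
}

// After the redesign: discovery starts automatically on launch, the teacher
// only picks a device, and QR binding covers the case where search fails.
async function openAndConnect(pick: (found: Device[]) => Device | null): Promise<void> {
  const found = await discoverDevices(3000);
  const device = (found.length > 0 ? pick(found) : null) ?? (await scanQrCode());
  await connect(device);
}

// Usage: auto-connect to the first discovered screen.
openAndConnect(found => found[0] ?? null).catch(console.error);
```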

Case 2

Functional module division

Before the redesign, functions lacked focus and priority: features built for procurement bidding were mixed with the functions teachers actually used, classroom scenarios were not considered comprehensively, and the main upload workflow was flawed. After the redesign, the four core functional modules teachers actually used were brought to the front, while lower-frequency functions were tucked into a toolbox with business goals still in mind.

Functional module division redesign for Multi-screen Interaction 2.0
Module grouping was redesigned around actual teacher usage.

Case 3

Optimizing uploaded images

Before the redesign, the page was visually concise, but annotation was poorly supported and it was impossible to switch smoothly between comparison mode and single-page mode. After the redesign, uploaded images were split into a single-page mode and a comparison mode, which better supported multiple teaching scenarios and improved classroom efficiency.

Uploaded image optimization for Multi-screen Interaction 2.0
Image handling was redesigned for comparison, annotation, and classroom use.

Case 4

Mobile phone screen projection

I added direct phone-screen projection: tapping the app on the mobile phone projected the phone's screen straight to the computer. The projection rotated freely between portrait and landscape, which made video playback easier, and the pen and eraser tools from the PC side were carried over so projected documents could be annotated conveniently.

Mobile phone screen projection design for Multi-screen Interaction 2.0
Projection and annotation were redesigned together to support actual teaching behavior.

These design cases mattered because they reduced the interaction threshold in live teaching. In the revised multi-screen interaction product, 100% of the 75 teachers surveyed said they were willing to keep using the software, which was a strong signal that the new flow felt usable enough to stay in the classroom rather than being tried once and abandoned.

Methods and product thinking

I treated this product line as a real behavioural adoption problem rather than as a feature-expansion exercise.

I used stakeholder maps, questionnaires, user-behaviour analysis, and classroom scenario research to understand how teaching contexts changed what users could actually adopt.

I used behavioural segmentation to avoid designing only for the most advanced teachers, because product value depended on whether more ordinary classroom users could cross the interaction threshold.

I connected research findings to interaction modelling, gesture rules, and feature-entry simplification so important capabilities could be used more naturally in live teaching situations.

I also linked interface work to instrumentation thinking, because understanding hot and cold features was necessary for improving later adoption and product evolution.

What this case proves

  • I can do field-grounded UX work in complex real-world environments instead of relying only on generic interface assumptions.
  • I can translate behavioural research and teacher segmentation into concrete interaction, gesture, and workflow decisions.
  • I can improve feature delivery in products where the core challenge is not missing functionality but weak usability in live use.
  • I can connect classroom research, interaction design, iteration, and data instrumentation into one longer-term product-improvement loop.

Failures and learning

This project taught me that feature richness can hide delivery failure. A product can look powerful on paper and still fail in the classroom if the interaction cost is too high for real teachers under real teaching pressure.

I learned to ask a better question: not “what else should we add?” but “why are valuable capabilities failing to cross the usability threshold for so many users?” That shift made me pay more attention to adoption, segmentation, discoverability, and behaviour in context.

It also made me more careful about designing for ordinary users, not only advanced users. Strong UX work is not just about making experts faster. It is about helping more people actually use the product well.

Public product access

These public links show the company context and the download pages for the product lines I worked on.