PALMs Authoring Tools

A web-based platform for educators to create and share perceptual and adaptive learning modules

Figure 1: A selection of experimental apps built with an internal version of the platform.


Advanced Distributed Learning,
US government
Target Audience
DoD Educators
Enabling educators to create adaptive training content

A responsive web-based authoring platform for building learning modules
Live on ADL’s warehouse
(Demo version available)

Domain Experts x 2
Project Manager x 1
Engineer x 4
Designer x 1 👋
My Role
End-to-end product design
Design interaction models
Code responsive templates
Design QA and usability testing
The timely, high-quality delivery of the product unlocked a significant revenue source for the company through multiple awarded contracts.

Figure 2: Introductory shots as a quick visual overview of the project.

Quick Overview

A Perceptual Adaptive Learning Module, or PALM, is a piece of software intended to address crucial aspects of learning that traditional methods of instruction have largely overlooked. PALMs were originally developed to focus on pattern recognition, but they are also well-suited and highly effective for any factual or procedural learning. To ensure this online instruction is readily available for DoD use, ADL commissioned Insight to build a platform that enables DoD educators to turn their course content into PALMs.

Problem Statement

Education programs in critical domains aim at mastery, for which didactic approaches are ineffective. Given the effectiveness of PALMs, how can we eliminate the technical barrier for educators to turn their course material into PALMs? How can we build a system around a proprietary technology, under the supervision of a high-profile client, to unlock a significant source of revenue?

Strategy, Planning, and Ideation

As PALM-makers, we knew how to build PALMs but didn't know how to make software that essentially replaces us. We spent a few months researching, studying, and strategizing to develop a roadmap. In this phase, I led group discussions and whiteboard exercises on the user-facing aspects of the platform. I spearheaded a reverse-engineering activity that led to various interaction models, user interfaces, and specialized procedures.

Figure 3: The preliminary components backlog (top left), the initial proposal (top right), the development schedule spanning years one and two (bottom left), and the features and activities saved for year three (bottom right).

Based on ideated concepts, which involved assumptions and hypotheses, I built a click-through wireframe system (over 50 screens). The goal was to have a visual guide for designing a cohesive, well-crafted experience for educators.

Figure 4: Various maps drawn to aid the decision-making and development processes. The map on the right reflects the latest iteration of the platform.

Figure 5: The systems modeling of The Main Editor, early iterations.

Then, through internal critique sessions, pilot educator interviews, and direct observation of test users, I noticed a few flaws in the system that led to revisions.

Example Hypotheses

✅ The building process should be top-down.
✅ Structural elements should come before attributes or relationships.

❌ We should have two creation paths: assisted and expert.

Prototyping & Building

Once I had brought our team and stakeholders on board by presenting my ideas through these wireframes, we entered the building phase. Building one component at a time, I spent most of my time in this phase creating interfaces in code. Once the responsive templates were ready, I worked closely with engineers to turn them into React components and hook them up to the back end. At the same time, to ensure the usability, feasibility, and effectiveness of the interfaces, I rigorously tested them across screen sizes, devices, and browsers.

I repeatedly asked our team (5), pilot educators (2), and complete novices (3) to use the components and provide feedback. About half of the sessions were unmoderated, to test unguided interactions with the interfaces; in the other half, users were primed with specific tasks (e.g., creating problems with different layouts, uploading assets, and making distractor choices that are inserted dynamically).

Once I felt confident about the results of my tests for each deliverable, I requested a client presentation to get their buy-in and asked them to work with the deliverable and provide feedback.

One of the most valuable meta findings of these efforts for me as a designer was that building specialized interfaces was much easier than making them superbly efficient.

Example Behavioral Insight
🔎 The CRUD pattern hints at underlying issues.
Users quickly start thinking about optimizing their methods to take fewer steps. We observed a recurring loop of create, read, update, delete (aka CRUD) applied to several components: PALM structural elements (modules, categories, and problems) and assets (templates, files, tags, and lists). My conclusion was that the interfaces were not efficient or flexible enough to preserve the user's preliminary efforts.

Example Usability Insight
🔎 Clicks compound.
Because of the cyclic nature of the creation process, seemingly negligible extra clicks compound when repeated, harming the UX significantly.

Example Usability Insight
🔎 Scrollbars slow down.
We need to minimize real estate use to avoid scrollbars as much as possible. We also need to preserve the scrollbar state when changing routes, to spare users redundant browsing (scrolling).
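The scroll-preservation idea can be sketched as a small route-keyed store. This is a minimal illustration only; the `saveScroll` and `restoreScroll` names are hypothetical and not the platform's actual API.

```typescript
// Hypothetical sketch: a route-keyed scroll-position store. On route change,
// the departing view saves its offset; the arriving view restores its own.
const scrollPositions = new Map<string, number>();

// Record the scroll offset before navigating away from a route.
function saveScroll(route: string, offset: number): void {
  scrollPositions.set(route, offset);
}

// Return the last known offset for a route, or 0 for a first visit.
function restoreScroll(route: string): number {
  return scrollPositions.get(route) ?? 0;
}
```

In a React app, the save call would typically run in a route-change handler or a cleanup effect, and the restore call when the destination view mounts.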

Example Nullified Hypothesis
🔎 Unfamiliar users need wizards.
Early on, we noticed people didn't know where to begin, so we entertained having specialized wizards to ease the learning curve. But after building a set of wizards and testing them with pilot users, we observed that even though our wizards were easy to work with, they were discouragingly slow when used for high volumes of learning materials. Inevitably, we abandoned that direction and shifted our focus to improving the main editor so much that the need for wizards would be eliminated.

Example Design Challenge
A Search Compatible Tree

Figure 6: The tree model for browsing PALM structure in its initial state features indentation, entity-type cueing symbols, and collapsibility. In search mode, the view changes to a flat list where search results feature their breadcrumb-like addresses, cueing where each entity belongs and how to navigate to that location. This view is helpful in cases where various problems from different categories share the same name.
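The search-mode flattening can be sketched roughly as a recursive walk that emits matching entities together with their breadcrumb addresses. The `TreeNode` and `SearchHit` types and the `searchTree` function are illustrative assumptions, not the platform's actual code.

```typescript
// Hypothetical sketch: flatten a PALM tree into a list of search hits,
// each carrying a breadcrumb-like address built from its ancestors.
interface TreeNode {
  name: string;
  children?: TreeNode[];
}

interface SearchHit {
  name: string;
  path: string; // e.g. "Module A / Category 1"
}

function searchTree(nodes: TreeNode[], query: string, trail: string[] = []): SearchHit[] {
  const hits: SearchHit[] = [];
  for (const node of nodes) {
    if (node.name.toLowerCase().includes(query.toLowerCase())) {
      hits.push({ name: node.name, path: trail.join(" / ") });
    }
    if (node.children) {
      // Recurse with this node appended to the breadcrumb trail.
      hits.push(...searchTree(node.children, query, [...trail, node.name]));
    }
  }
  return hits;
}
```

The breadcrumb path disambiguates same-named problems from different categories, as the figure describes.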

Example Design Challenge
An Efficient Interaction Model for Managing Images and Tags

Figure 7: The images tab is the entry point to the image management system. The tags tab is the entry point to the tags management system. Through a familiar presentation, the learning curve for utilizing them becomes shallow, and the effort for reaching them is equal.

Example Design Challenge
Templatization of The Creation Process

Figure 8: Defining problem attributes (e.g., prompt question, distractors, feedback text, etc.) in a template and assigning that template to a problem saves users from taking redundant steps in similar problems.
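The templatization idea reduces to merging template defaults with problem-specific values, with the latter taking precedence. This is a minimal sketch; all field and function names here are hypothetical, not the product's actual data model.

```typescript
// Hypothetical sketch: a template supplies shared problem attributes, and
// per-problem fields override or extend them.
interface ProblemAttributes {
  promptQuestion?: string;
  feedbackText?: string;
  timerSeconds?: number;
}

function createProblem(
  template: ProblemAttributes,
  overrides: ProblemAttributes
): ProblemAttributes {
  // Later spread properties win, so problem-specific values shadow the template's.
  return { ...template, ...overrides };
}
```

Authoring many similar problems then means defining the shared attributes once and supplying only what differs per problem.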

Example Design Challenge
Automating Distractor Insertion Using Lists

Figure 9: Each problem comes with correct answer choices and multiple distractors. Defining multiple distractors for numerous problems can be a cumbersome task. Instead, defining a list of distractors and using the list name in the distractor fields can automate the process. A behind-the-scenes mechanism excludes the correct answer at each fetch.
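The list-based distractor mechanism can be sketched as a fetch that filters out the correct answer before sampling from the named list. The names and data shapes below are assumptions for illustration, not the platform's actual implementation.

```typescript
// Hypothetical sketch: draw `count` distractors from a named list, excluding
// the problem's correct answer at each fetch.
function fetchDistractors(
  lists: Record<string, string[]>,
  listName: string,
  correctAnswer: string,
  count: number
): string[] {
  // Exclude the correct answer so it never appears as a distractor.
  const pool = (lists[listName] ?? []).filter((item) => item !== correctAnswer);
  // Fisher-Yates shuffle so repeated fetches vary, then take the first `count`.
  for (let i = pool.length - 1; i > 0; i--) {
    const j = Math.floor(Math.random() * (i + 1));
    [pool[i], pool[j]] = [pool[j], pool[i]];
  }
  return pool.slice(0, count);
}
```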

Example Design Challenge
Embedding Progress Reports Into Essential Screens to Avoid Additional Screens

Figure 10: Module launcher page features the progress and mastery metrics. Progress reports, triggered after each definable block of problems, allow users to take a break and reflect on their performance.


In this phase, we focused on increasing the components' quality by making them efficient and performant. Our further usability tests surfaced several edge cases that we had to address, and we also received several pieces of delayed (post-delivery) feedback. While working on those, I polished the UI and matured the product's visual language.

Figure 11: Various modals requesting user input for basic and advanced functionalities.


The timely, high-quality delivery of the product positioned Insight Learning Technology for significant revenue growth.

The PALMs Authoring Tools system is now available for government use on desktop and mobile devices. This powerful authoring system enables easy learning module creation and leverages perceptual and adaptive pedagogical methods. Content authors can upload images and text, create and prioritize questions, set question timers, etc.*

Throughout 2017 and 2018, the U.S. Marine Corps (USMC) conducted usability testing of the PALMs platform and its associated authoring tool. This work led to a more robust instrument evaluation to support USMC education and training requirements**.

* Perceptual and adaptive learning modules (PALMs). ADL Initiative. (n.d.). Retrieved January 12, 2022, from
** Freed, M., Folsom-Kovarik, J. T., & Schatz, S. (2017). More than the sum of their parts: case study and general approach for integrating learning applications. In Proceedings of the 2017 Modeling and Simulation Conference.
