Ninety.io Knowledge Portal
A redesign of the Knowledge Sharing Tool in the Ninety.io app with a friendlier and more intuitive interface.
Project Date: 2023-2024
Role: Lead Product Designer
Team: Jenifer Warburton - Group Product Manager, Melissa - Product Designer, Christopher Bartlett - Engineer, Gary Burns - Engineer
Ninety.io lacked a centralized solution for teams to share internal knowledge, onboarding content, and coaching resources. I led the design and execution of the Knowledge Sharing Tool (KST), an extensible learning platform built to help organizations document, assign, and track learning materials.
We built the tool from scratch and launched it through alpha, beta, and GA, continuously evolving it based on user feedback.
Problem Overview
Before KST, teams relied on fragmented tools like Slack, Google Docs, and email to share critical knowledge. This made it nearly impossible to:
Ensure people had access to the right learning content
Track engagement or completion
Curate reusable knowledge artifacts across teams
We needed a centralized tool that could:
Organize content clearly across roles and teams
Track completion and assignment
Scale with growing organizations
Research & Discovery
I ran interviews with internal teams, coaches, and clients to understand knowledge-sharing pain points. Key insights included:
Managers struggled to onboard team members consistently
Coaches wanted to assign learning material to clients post-session
Users needed the ability to filter and organize content relevant to their role
This informed our MVP acceptance criteria and design strategy.
I also created user journey maps for both Admin and Learner archetypes, helping visualize their motivations, friction points, and success paths. These flows clarified product requirements and gave us a shared lens for evaluating UX tradeoffs.
Flow Mapping
I mapped out the full content creation and assignment journey, from Knowledge Share through Rock and To-Do creation. This clarified dependencies across tools and roles and uncovered opportunities to streamline the user path and reduce redundant steps.
User Testing
We conducted multiple rounds of moderated tests focused on:
CTA clarity (single dropdown vs. contextual)
Drag-and-drop functionality
Language and hierarchy (Folder > Subject > Topic)
I iterated on the UI to reduce friction in content creation and reordering, and validated whether learners understood progress indicators.
Key Design Decisions
Dual Modes (Admin vs. Learner): Gave creators robust editing tools while keeping the learner view clean and focused
Progress Indicators: Visual cues at the topic and subtopic level to show % completed
Collaboration Model: Only users with creator roles could edit shared Collections, reducing accidental changes
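The progress indicators roll completion up the Folder > Subject > Topic hierarchy. A minimal sketch of how that rollup might work, using hypothetical class and field names rather than the actual Ninety.io data model:

```python
from dataclasses import dataclass, field

# Illustrative model of the Folder > Subject > Topic hierarchy.
# All names here are assumptions, not the real Ninety.io schema.

@dataclass
class Topic:
    title: str
    completed: bool = False  # learner has finished this topic

@dataclass
class Subject:
    title: str
    topics: list = field(default_factory=list)

    def percent_complete(self) -> int:
        """Share of this subject's topics the learner has completed."""
        if not self.topics:
            return 0
        done = sum(1 for t in self.topics if t.completed)
        return round(100 * done / len(self.topics))

@dataclass
class Folder:
    title: str
    subjects: list = field(default_factory=list)

    def percent_complete(self) -> int:
        """Roll up completion across every topic in every subject."""
        topics = [t for s in self.subjects for t in s.topics]
        if not topics:
            return 0
        done = sum(1 for t in topics if t.completed)
        return round(100 * done / len(topics))
```

With a folder holding two subjects of two topics each and one topic finished, the first subject reports 50% and the folder reports 25%, which is the kind of per-level cue the learner view surfaces.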
Results
Within the first 90 days, we exceeded all target KPIs, validating both the need and the usability of the tool:
📈 40,000+ Monthly Page Views: Demonstrated strong and consistent user engagement from launch.
👥 10% Daily Active User Penetration: A clear sign of tool discovery and habitual use across our user base.
🏢 40% Account-Level Adoption: 40% of all Ninety accounts had at least one active Knowledge Portal user each month.
📚 1,000+ Collections Created: Averaging 300+ new collections per month, showing sustained adoption and content creation at scale.
What I learned
Balance flexibility and structure: Giving admins too many open-ended tools led to messy hierarchies—preset templates helped.
Progress tracking drives behavior: Simply adding % completion shifted usage patterns and helped managers ensure accountability.
Scoping MVP tightly matters: We postponed "content analytics" and "draft modes" until we validated real-world use.
Iterating on Results: Adding an AI Companion to Improve Adoption
After launch, our initial results showed that while the Knowledge Portal provided clear structure and value, many users still felt blocked at the very first step: actually creating a new Collection. Our adoption data and follow-up user interviews made it clear that people understood why the Portal was useful, but they didn't know how to get started.
To address this friction, I led the design of Maz, an AI-powered companion that guides users step by step through creating their first Collection. Maz uses a conversational flow to break the process into approachable questions, helping users go from a blank page to a fully built Collection in under 10 minutes.
By directly targeting this barrier, Maz improved first-session success rates and increased overall engagement with the Knowledge Portal. This follow-up intervention demonstrated our commitment to continuous user-centered design and our willingness to evolve the experience based on real feedback and behavioral data.