Heptabase is a visual thinking tool designed for learning and researching complex topics. By 2025, the product had achieved strong retention and entered its next growth stage: integrating an AI Agent and AI Browser to help users digest information, build structure, and deepen understanding more efficiently.

The business goal was to boost MRR growth and long-term retention, especially Week-24 usage. With competitors like Notion and Obsidian focusing on general note-taking, Heptabase aimed to differentiate through AI-enhanced research workflows.

This project explores how to design a smooth, context-aware AI experience that truly supports real learning and research tasks for learners, students, and knowledge workers.
To design Heptabase’s next-generation AI thinking system—Auto Structure and AI Reviewer—that helps users reveal structure, gain insights, and improve their thinking through intuitive, collaborative AI interactions that fit naturally into their existing workflow.
Role
Design Lead
UX/UI & Research
Facilitator
Design
UX flows & IA
UI design
Junior designer * 3
UI designer * 2
Winner of the Best Insight Award
Selected as a Top 8 finalist among 16 teams
After receiving the topic—How can we design an AI agent with a smooth, context-aware experience that helps users learn and handle complex research tasks?—we began with a competitive analysis to build up the background and context of the project and product. Next, to understand the real problems users face, we ran user interviews to identify the key opportunities and entry points for our design.
Competitive analysis:
From our competitive analysis, we found that although NotebookLM excels in citation and source retrieval, it is limited in guiding users through structured, project-oriented workflows. Notion and Obsidian, on the other hand, offer strong control over databases and content structure, but their linear editing environments—or the high cost of maintaining non-linear structures—often restrict intuitive thinking.
In contrast, Heptabase provides a unique visual knowledge map that supports spatial thinking and intuitive organization. This revealed a clear opportunity in the market:
Most tools focus primarily on knowledge storage, while Heptabase’s true strength lies in helping users build thinking structures. We believe that combining NotebookLM’s strengths in citation and source retrieval with Heptabase’s visualized knowledge maps would enable AI to act as a thinking partner—one that can surface connections, remove mental bottlenecks, and proactively help users construct meaningful structures. This represents a competitive advantage that is difficult for others to replicate.
Team Collaboration Trade-off
I let go of “everyone must do the full UX process” → shifted to “assign tasks based on strengths.”
Trade-off
Ideally, all designers join research and UI work, but this wasn’t realistic for junior and UI-focused teammates.
Key Decisions
✔ I led UX strategy, the research framework, interviews, and decision-making.
Junior designers handled briefs and summaries.
UI designers focused on high-quality visuals.
They joined interviews as notetakers to learn without slowing progress.
Why
1. Maintain stable quality within a tight two-month schedule.
2. Reduce IA/flow decision risks.
3. Enable juniors to contribute where they are strongest and learn effectively.
Research Objective: Understand how users interact with information when handling complex tasks.
User Interview
I was responsible for creating the recruitment survey and developing the user interview questions. I also served as the lead interviewer for all six user interviews.
After sending out the recruitment survey, we received 45 sign-ups within the first week and a total of 67 by the second week. From the responses, I identified potential participants who were likely to engage in complex tasks—selecting three Heptabase users and three non-users. I then conducted screening calls with each person to confirm their suitability for the study. Before the official interviews, I also ran a pilot session to test and refine the interview questions. Based on the pilot and the actual interviews, I continually adjusted the interview guide to ensure clarity and depth.
My phone-screening criteria were based on the research objective: I excluded industry professionals (designers, researchers, or people frequently interviewed for work) and prioritized participants who regularly handle complex tasks and are willing and able to share their thinking.



After completing the user interviews, I facilitated a workshop where we organized the findings using an Affinity Diagram. We then synthesized the insights and determined the product direction through a voting process.
Afterward, we formulated several HMW questions and voted to select the final one.
HMW help users uncover structure and blind spots in their thinking while organizing complex information—without disrupting their natural workflow?
With the final HMW as our direction, I hosted a workshop and applied the 6–3–5 Brainwriting method to generate potential solutions.
After that, we narrowed down the ideas through voting. To ensure our resources were invested in the right direction—and to validate whether users actually liked these ideas—I recruited five Heptabase users for concept validation sessions, ensuring our ideas aligned with real user needs. Working with two UI designers, I presented 5–10 ideas to users and created wireframes that visually communicated the concepts from our brainstorming sessions.
Qualitative Feedback from Concept Validation
😊 Positive Feedback
Users saw clear value in AI-assisted organization:
Summary Insight:
AI support at the right moment significantly improves whiteboard organization, structural thinking, and research/planning efficiency.
🤔 Concerns & Frictions
Users also raised doubts about trust and control:
Summary Insight:
If AI misinterprets user intent, it may generate irrelevant structures, create extra work, and disrupt the user’s cognitive flow.
Given our time and resource constraints, we prioritized the design direction and chose the two strongest features to keep refining based on user feedback. I had initially proposed both concepts.
One challenge I encountered was resolving differing opinions within the design team. I relied on user feedback to identify which solution approach worked best and used it to guide our decision-making.
Concept 1 : AI Reviewer



I was responsible for designing the UI/UX for the AI Reviewer.
The AI Reviewer stays quietly by the user’s side until needed. Rather than giving direct answers, it offers different perspectives from multiple AI reviewers to help users break through mental blocks and think more clearly.
It behaves like a coach who can understand the topic you’re organizing on the whiteboard, provide precise suggestions with verifiable references, point out gaps or contradictions, and guide you forward in your thinking process.
Concept 2 : Auto Structure



I was responsible for designing the UX for Auto Structure.
When the whiteboard becomes crowded with cards, connections, and screenshots, Auto Structure reads the existing content and proposes possible structures and relationships, quickly transforming scattered information into a clear, organized hierarchy.
Each suggestion comes with sources and reasoning, and users can choose to accept or ignore them — maintaining full control at all times.
During the design process, we were unsure whether the current UI/UX would be intuitive for users, so I proposed usability testing with five Heptabase users. However, due to scheduling conflicts that week, many users were unavailable, and we opened the sessions to two additional participants who were product designers, treating their feedback only as supplementary reference. Before the sessions, we clarified that we were not asking for their professional design opinions; we wanted them to use the product purely as Heptabase users and share any issues or difficulties they encountered.
I was responsible for recruiting participants, designing the usability test tasks, leading the interviews, and synthesizing the feedback from all five users.
Iteration
AI Reviewer
I partnered with two UI designers on the iteration and took ownership of the AI Reviewer feature.
The large amount of information, mixed categories, and repetitive presentation created a core problem: when too much content is displayed at once, users feel overwhelmed, which hurts readability and comprehension.
Prototype
Auto Structure

Building Alignment Through an Icebreaker Activity
Before the project officially kicked off, I proactively organized a structured icebreaker activity to help the team build psychological safety, rapport, and a shared language. My goal was to ensure everyone felt comfortable collaborating, expressing ideas openly, and understanding each other’s working styles—so that the project could progress smoothly from day one.
Through guided mini-tasks, teammates quickly connected with one another, which later made discussions more efficient and lowered communication friction.
Team Feedback — Positive Outcomes From the Icebreaker Activity
These comments reflect more than appreciation for the activity itself—they show the type of team environment I aim to create: open, collaborative, trusting, and aligned.
Team Feedback — Positive Outcomes From the Project
I’m grateful to have worked with such a supportive group. Each teammate brought energy and ownership to the project, and I deeply appreciate the trust they placed in me as a design lead.
1. Real user behavior matters more than verbal descriptions — observe, don’t assume
What users say and what they actually do often differ. Throughout the research, I prioritized observing real interactions over relying on verbal explanations. This helped the team quickly focus on the true pain points.

2. As a Design Lead: leverage each team member’s strengths and allocate resources effectively
In cross-functional collaboration, I learned the importance of clarifying each teammate’s skill boundaries early on. By assigning suitable workloads and providing targeted support, I ensured the team could consistently deliver high-quality outcomes.

3. Usability testing isn’t about perfection — it’s about improving every round
Each small testing cycle should reveal new insights and reduce uncertainty. These learnings fuel iteration and help the team identify where to focus next — an essential process for maturing a design quickly.

4. Onboarding design: reduce cognitive load and let users feel the value immediately
A new feature’s first-time experience should be lightweight. Avoid forcing users to process heavy default content or fill in extra information. With limited time, I applied the “show, don’t tell” principle and used real user input for onboarding validation. Inspired by Shape of AI, I adopted the flow: user’s own content → instant analysis → immediate feedback, so that the value emerges naturally without explanation.

5. Reduce complex task demonstrations — use “task breakpoints” to guide the interview
If a test requires users to switch between multiple scenarios, they quickly become overwhelmed. I learned to observe natural reactions during the task and add essential context only when needed. This approach captures more authentic behavior and clearer feedback.

6. Key realization: great products come from deeply understanding human nature
Competition isn’t about chasing features — it’s about studying users’ real context and motivations. Only by truly understanding their habits, concerns, expectations, and workflows can a design achieve irreplaceable value.