Project Overview

Heptabase Website - 2025.10 — 2025.12

Product background

Heptabase is a visual thinking tool designed for learning and researching complex topics. By 2025, the product had achieved strong retention and entered its next growth stage: integrating an AI Agent and AI Browser to help users more efficiently digest information, build structure, and deepen understanding.

The business goal was to boost MRR growth and long-term retention, especially Week-24 usage. With competitors like Notion and Obsidian focusing on general note-taking, Heptabase aimed to differentiate through AI-enhanced research workflows.

This project explores how to design a smooth, context-aware AI experience that truly supports real learning and research tasks for learners, students, and knowledge workers.

01 | Team Goals

To design Heptabase’s next-generation AI thinking system—Auto Structure and AI Reviewer—that helps users reveal structure, gain insights, and improve their thinking through intuitive, collaborative AI interactions that fit naturally into their existing workflow.

02 | My Role & Deliverables

Role
Design Lead
UX/UI & Research

Facilitator
Design
UX flows & IA
UI design

03 | Project Challenges

  • Only 2 months to deliver research, testing, and full UX/UI
  • 3 junior designers, 2 with mainly UI background → heavy guidance needed
  • Required strong scoping, decision-making, and leadership to deliver
  • Limited engineering clarity → design under ambiguity

04 | Team Members

Junior designer * 3

UI designer * 2

05 | Outcome

Winner of the Best Insight Award

Selected as a Top 8 finalist among 16 teams

Research & Testing

I conducted...

6 User Interviews

5 PoC Tests

5 Usability Tests

Facilitation

I hosted...

1 Icebreaker Activity

8 Workshops

User Feedback

“The Auto-Structure feature is really helpful when I’m working on academic papers.” -Interviewee D

“I really like the AI Reviewer because it stimulates my thinking. It feels like someone beside you throwing out ideas. And if even one of those suggestions is useful, then this feature is absolutely valuable.”
-Interviewee Miss Chang

“Auto-Structure is like helping me reorganize the big picture. We actually need something like this, because our brains get tired and can’t think of everything. That’s exactly the advantage of AI.” -Interviewee Mr. Wu

User Research

After receiving the topic—How can we design an AI agent with a smooth, context-aware experience that helps users learn and handle complex research tasks?—we began with a competitive analysis to build background and context on the project and product. Next, we wanted to understand the real problems users face, so we ran user interviews to identify the key opportunities and entry points for our design.

Competitive analysis:

From our competitive analysis, we found that although NotebookLM excels in citation and source retrieval, it is limited in guiding users through structured, project-oriented workflows. Notion and Obsidian, on the other hand, offer strong control over databases and content structure, but their linear editing environments—or the high cost of maintaining non-linear structures—often restrict intuitive thinking.

In contrast, Heptabase provides a unique visual knowledge map that supports spatial thinking and intuitive organization. This revealed a clear opportunity in the market:
Most tools focus primarily on knowledge storage, while Heptabase’s true strength lies in helping users build thinking structures. We believe that combining NotebookLM’s strengths in citation and source retrieval with Heptabase’s visualized knowledge maps would enable AI to act as a thinking partner—one that can surface connections, remove mental bottlenecks, and proactively help users construct meaningful structures. This represents a competitive advantage that is difficult for others to replicate.

Team Collaboration Trade-off
I let go of “everyone must do the full UX process” → shifted to “assign tasks based on strengths.”
Trade-off
Ideally, all designers join research and UI work, but this wasn’t realistic for junior and UI-focused teammates.
Key Decisions
✔ I led UX strategy, research framework, interviews, and decision-making. Junior designers handled briefs and summaries. UI designers focused on high-quality visuals. They joined interviews as notetakers to learn without slowing progress.
Why
1. Maintain stable quality within a tight two-month schedule.
2. Reduce IA/flow decision risks.
3. Enable juniors to contribute where they are strongest and learn effectively.

Research Objective: Understand how users interact with information when handling complex tasks.

User Interview

I was responsible for creating the recruitment survey and developing the user interview questions. I also served as the lead interviewer for all six user interviews.

After sending out the recruitment survey, we received 45 sign-ups within the first week and a total of 67 by the second week. From the responses, I identified potential participants who were likely to engage in complex tasks—selecting three Heptabase users and three non-users. I then conducted screening calls with each person to confirm their suitability for the study. Before the official interviews, I also ran a pilot session to test and refine the interview questions. Based on the pilot and the actual interviews, I continually adjusted the interview guide to ensure clarity and depth.

My phone-screening criteria were based on the research objective: I excluded industry professionals (designers, researchers, or people frequently interviewed for work) and prioritized participants who regularly handle complex tasks and are willing and able to share their thinking.

Solutions

To address these problems, we generated several HMW (How Might We) questions and voted on one as our design direction. Then I hosted a workshop and applied the 6–3–5 Brainwriting method to generate potential solutions.

  • HMW help users uncover structure and blind spots in their thinking while organizing complex information—without disrupting their natural workflow?

After that, we narrowed down the ideas through voting. To ensure our resources were invested in the right direction—and to validate whether users actually liked these ideas—we recruited five Heptabase users to test the proof of concept.

I recruited five Heptabase users to participate in concept validation sessions, ensuring our ideas aligned with real user needs. Working with two UI designers, I presented 5–10 ideas from our brainstorming sessions to users and created wireframes that visually communicated each concept.

Qualitative Feedback from Concept Validation

😊 Positive Feedback

Users saw clear value in AI-assisted organization:

  • AI helps efficiently organize and structure large amounts of whiteboard content.
  • Automatically provides structure, categories, and connections users can directly use.
  • Especially helpful for research, planning, and case analysis.
  • Multi-version suggestions and keyword extraction were considered useful.
  • AI Tag speeds up categorization and saves time.

Summary Insight:

AI support at the right moment significantly improves whiteboard organization, structural thinking, and research/planning efficiency.

🤔 Concerns & Frictions

Users also raised doubts about trust and control:

  • AI’s categories/structure may not align with user thinking.
  • Misidentified key points may increase revision costs.
  • Uncertainty about whether AI truly understands relationships between cards.
  • Some disliked the Section display format.
  • Concerns about data sources and Tag accuracy.
  • AI suggestions may sometimes interrupt their flow of thought.

Summary Insight:

If AI misinterprets user intent, it may generate irrelevant structures, create extra work, and disrupt the user’s cognitive flow.

Given our time and resource constraints, we prioritized the design direction and chose the two strongest features to continue refining according to user feedback. These two concepts were initially proposed by me.

One challenge I encountered along the way was resolving differing opinions within the design team, so I relied on user feedback to understand which solution approach worked best and used it to guide our decision-making.

Concept 1 : AI Reviewer

I was responsible for designing the UX/UI for the AI Reviewer.

The “AI Reviewer” stays quietly by the user’s side when needed. Rather than giving direct answers, it offers different perspectives from multiple AI reviewers to help users break through mental blocks and think more clearly.
It behaves like a coach who can understand the topic you’re organizing on the whiteboard, provide precise suggestions with verifiable references, point out gaps or contradictions, and guide you forward in your thinking process.

Concept 2 : Auto Structure


I was responsible for designing the UX for Auto Structure.

When the whiteboard becomes crowded with cards, connections, and screenshots, Auto Structure reads the existing content and proposes possible structures and relationships. It quickly transforms scattered information into a clear, organized hierarchy.
Each suggestion comes with sources and reasoning, and users can choose to accept or ignore them — maintaining full control at all times.

Usability Test

During the design process, we were unsure whether the current UI/UX would be intuitive for users. To validate this, I proposed running usability testing with five Heptabase users. However, due to scheduling conflicts that week, many users were unavailable, so we opened the sessions to two additional participants who were product designers. Their feedback was treated only as supplementary reference. Before the sessions, we clarified that we were not asking for their professional design opinions, but rather wanted them to use the product purely as Heptabase users and share any issues or difficulties they encountered.

I was responsible for recruiting participants, designing the usability test tasks, leading the interviews, and synthesizing the feedback from all five users.

Iteration

AI Reviewer

I partnered with two UI designers on the iteration and took ownership of the AI Reviewer feature.

The large amount of information, mixed categories, and repetitive presentation led to:

  • Users being unable to quickly grasp the key points
  • The tooltip overpowering the visual hierarchy

When too much content is displayed at once, users feel overwhelmed, which negatively affects readability and comprehension.


Prototype

Auto Structure


Team Impact

Building Alignment Through an Icebreaker Activity

Before the project officially kicked off, I proactively organized a structured icebreaker activity to help the team build psychological safety, rapport, and a shared language. My goal was to ensure everyone felt comfortable collaborating, expressing ideas openly, and understanding each other’s working styles—so that the project could progress smoothly from day one.
Through guided mini-tasks, teammates quickly connected with one another, which later made discussions more efficient and lowered communication friction.

Team Feedback — Positive Outcomes From the Icebreaker Activity

“This is the first time I’ve joined such a well-structured and fun icebreaker. We definitely chose the right team lead!” — Lynn

“Josie really has leadership presence. We truly picked the right leader. I didn’t know FigJam could be used like this!” — 72

These comments reflect more than appreciation for the activity itself—they show the type of team environment I aim to create: open, collaborative, trusting, and aligned.

Team Feedback — Positive Outcomes From the Project

“Thank you, Josie, for your dedicated involvement in the project and for providing thoughtful feedback on the brief and proposal." — Lynn

“Thanks for your strategic thinking during the interview planning stage and your strong execution in validating assumptions." — Cindy

“Thanks to Josie for carrying the project." — Michael

I’m grateful to have worked with such a supportive group. Each teammate brought energy and ownership to the project, and I deeply appreciate the trust they placed in me as a design lead.

Takeaway

1. Real user behavior matters more than verbal descriptions — observe, don’t assume. What users say and what they actually do often differ. Throughout the research, I prioritized observing real interactions over relying on verbal explanations. This helped the team quickly focus on the true pain points.

2. As a Design Lead: leverage each team member’s strengths and allocate resources effectively. In cross-functional collaboration, I learned the importance of clarifying each teammate’s skill boundaries early on. By assigning suitable workloads and providing targeted support, I ensured the team could consistently deliver high-quality outcomes.

3. Usability testing isn’t about perfection — it’s about improving every round. Each small testing cycle should reveal new insights and reduce uncertainty. These learnings fuel iteration and help the team identify where to focus next — an essential process for maturing a design quickly.

4. Onboarding design: reduce cognitive load and let users feel the value immediately. A new feature’s first-time experience should be lightweight. Avoid forcing users to process heavy default content or fill in extra information. With limited time, I applied the “show, don’t tell” principle and used real user input for onboarding validation. Inspired by Shape of AI, I adopted the flow: user’s own content → instant analysis → immediate feedback, so that the value naturally emerges without explanation.

5. Reduce complex task demonstrations — use “task breakpoints” to guide the interview. If a test requires users to switch between multiple scenarios, they quickly become overwhelmed. I learned to observe natural reactions during the task, and only add essential context when needed. This approach captures more authentic behavior and clearer feedback.

6. Key realization: great products come from deeply understanding human nature. Competition isn’t about chasing features — it’s about studying users’ real context and motivations. Only by truly understanding their habits, concerns, expectations, and workflows can a design achieve irreplaceable value.


Feel free to contact me!