PI Review Improvements

Product design, Onboarding

Streamlining the Personal Information Review flow in H&R Block's DIY product to improve user satisfaction and reduce drop-off rates in Onboarding

How might we increase the speed and ease of the personal information (PI) review flow to improve customer satisfaction with DIY filing?

Two major goals for the H&R Block DIY Online product in Tax Season 2025 were to reduce drop-off rates during onboarding and to increase customer satisfaction in the overall product. As an Onboarding squad, we set out to discover how we might contribute to those outcomes.

Our hypothesis was simple: if we created a more efficient, personalized, and value-rich review flow (specifically for prior-year H&R Block users or switchers who upload a 1040), users would feel more invested in our DIY experience. We also felt deeper client investment would, in turn, foster greater confidence in our product, drive conversions, and ultimately boost revenue.

The team

Our cross-functional team included me as the Product Designer, a Content Writer, a Product Manager, a Lead Developer, a Developer, and a UX Researcher.

As the Product Designer, I took the creative lead on the project. I collaborated closely with the content writer during discovery, then led detailed design and prepared the designs for testing. I also facilitated communication across teams, ensuring that user feedback translated seamlessly into actionable design solutions.

Research & Discovery

To begin, we focused on discovery work and low-fidelity prototyping to validate our hypothesis. Using FigJam, the content writer and I mapped out potential user flows, exploring two key directions:
1. A single-page review.
2. Bite-sized groupings of information.

Given the technical constraints and complexity of user data, we tested three low-fidelity concepts alongside a control flow representing our current experience. We conducted moderated user tests using one-sheeters, gathering insights on user preferences and mental models when reviewing information.

These are the four one-sheeters we reviewed with testing participants. Testers saw each of the four, with every one-sheeter followed by a full-size view of its wireframe.

From these tests, we learned that users preferred the bite-sized grouping approach for its clarity and ease. The testing gave us a clearer direction and led us to sort specific insights into "include," "explore," and "avoid" categories.
From there, we refined our approach, creating a clear "happy path" flow using stickies in FigJam and reviewing it with our Product Manager and Lead Developer before moving into low-fidelity designs.

Validation & Prototyping

Once the low-fi designs based on the "bite-sized chunks" idea were complete, we collaborated with our UX researcher to test three variations tailored to different filing scenarios:

  1. A simple flow for a returning user.

  2. A flow for a returning user with multiple updates.

  3. A flow for a new user importing a 1040.

We determined our objectives:

  • Assess ease, accuracy, and user confidence in the PI review flow.

  • Gauge perceptions of the flow’s length and the clarity of its content.

  • Ensure users understood how to edit or update information.

And our researcher determined the methodology:

  • 5 moderated interviews with DIY filers currently using a competitor’s product.

  • Participants tested all three flows and provided feedback.

When testing the low-fi wireframes, we made sure to show prototypes of the entire experience so we could understand how users felt the new screens fit into the whole.

From our tests, we learned that users found the flows quick, simple, and easy to understand. The bite-sized groupings made sense, though adjustments were needed, like reordering the "student status" and "dependents" questions. TurboTax users appreciated our shorter flows and clear, actionable edits, and participants overall praised the intuitive grouping of information and the clear UI for editing or adding details. They felt the "needs review" text and icon for missing or incorrect data stood out, making corrections straightforward.

Detailed design

Because this style of summary review would be new to the product, I decided to audit the entire DIY product and design a new card component for the design system. Less complex summaries already existed elsewhere in the product, but in outdated UI and UX, so I wanted to create something that would be useful throughout the product. This would ensure that our design system remained holistic and our product stayed visually consistent.

I was drawn to a card because of the skeuomorphic connotation of saving someone’s information in a Rolodex. We had discussed early on that we wanted users to feel like we made good use of the data they gave us; that we listened and really knew them now that they’d either filed with us or introduced themselves to us via their 1040.

A card also allowed for clear, visually "chunked" content for pages where multiple groups of data could be presented at once (like the dependents screen).

Once I had audited the system and designed a powerhouse card component, the rest of the designs came together quickly based on the wireframes and existing screen templates in the design system.

I designed the Summary Card component to function as a review card, a summary table, a list, a house for long-form text, and a multi-instance gateway (MIG) for categories like income, credits, and deductions. It was designed for desktop and mobile and contains nested sub-components in Figma to make it versatile and powerful.
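The component itself lives in Figma as a design-system asset, but to make its flexibility concrete, here is a rough sketch of how its modes might be typed if the card were built in a React/TypeScript codebase. Everything below is an illustrative assumption on my part, not the actual production implementation; the names, props, and class names are hypothetical.

```tsx
// Hypothetical sketch only: how the Summary Card's modes could be typed in a
// React/TypeScript codebase. All names are illustrative assumptions.
import React from "react";

// Each card body adapts to the kind of content being summarized.
type SummaryCardVariant =
  | { kind: "review"; fields: { label: string; value: string; needsReview?: boolean }[] }
  | { kind: "table"; rows: { label: string; value: string }[] }
  | { kind: "list"; items: string[] }
  | { kind: "longText"; text: string }
  // Multi-instance gateway (MIG): entry point into repeatable categories
  // such as income, credits, and deductions.
  | { kind: "mig"; category: string; instanceCount: number };

interface SummaryCardProps {
  title: string;
  variant: SummaryCardVariant;
  onEdit?: () => void; // surfaces the edit affordance testers relied on
}

export function SummaryCard({ title, variant, onEdit }: SummaryCardProps) {
  return (
    <section className="summary-card">
      <header>
        <h3>{title}</h3>
        {onEdit && <button onClick={onEdit}>Edit</button>}
      </header>

      {variant.kind === "review" && (
        <dl>
          {variant.fields.map((field) => (
            <div key={field.label}>
              <dt>{field.label}</dt>
              <dd>
                {field.value}
                {field.needsReview && <span className="needs-review">Needs review</span>}
              </dd>
            </div>
          ))}
        </dl>
      )}

      {variant.kind === "list" && (
        <ul>
          {variant.items.map((item) => (
            <li key={item}>{item}</li>
          ))}
        </ul>
      )}

      {/* Bodies for "table", "longText", and "mig" would follow the same pattern. */}
    </section>
  );
}
```

Modeling the body as a set of distinct modes keeps each use of the card explicit about which kind of content it is showing, which parallels how the Figma component relies on nested sub-components and variant properties.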

The QA process

To ensure a smooth launch, we implemented twice-weekly check-ins with developers, product managers, designers, content writers, and QA. We also had a Teams chat for communication between meetings. This process allowed us to address potential issues early and make rapid fixes. By proactively accounting for deviations from the happy path during the design phase, we encountered no major surprises during QA.

We focused on the details, verifying that margins, text sizes, padding, and functionality aligned with the design specifications.

Final Design

The final product is a clear, concise, and visually appealing PI review flow that is now live in the H&R Block Online product.

50%

reduction in number of screens

(for a returning H&R Block client filing Single)

150+

data points auto-imported

(for returning H&R Block clients and switchers who upload a 1040)

Key Learnings

We’re eagerly awaiting final conversion and fallout data at the close of the 2025 tax season. Early user feedback highlights the value of prioritizing clarity, simplicity, and personalization.

For now, I personally took away several soft learnings from this project:

  • Proactively seeking areas for improvement can be as valuable as waiting for customer feedback.

  • User feedback during low-fi testing is invaluable for validating designs before committing significant time and development resources.

  • Proactive communication and collaboration between teams streamline the QA process and reduce the risk of surprises during implementation.

This project showcased the power of thoughtful UX design in transforming a critical user journey, and I’m proud to have contributed to a solution that empowers users and enhances their confidence in our DIY product.

Team

Haley Mitchell
Designer

Lynn Reddick
Product Manager

Brian Gilton
Content Designer

Bryan Stephens
Lead Developer

Spencer Palmeter
Developer

Erin Nyquist
UX Researcher

My contributions

  • Spearheaded UX discovery in FigJam

  • Built one-sheeters and prototypes for user testing

  • Created handoff files with testing goals for research partner

  • Designed all high-fidelity screens in compliance with the existing design system

  • Introduced a new card component to the design system for use in this and other flows