Project snapshot

• My role:

UX Research

• Team:

Research, Product, Privacy & Trust, Data Ethics, Brand Strategy

• Timeline:

Apr - Jun 2024

• Platforms:

Google Search 

• Tools:

Dovetail, Google Workspace, Miro, Behavioral Mapping

Context

Google Search was exploring ways to give users more control over their Personally Identifiable Information (PII) across the open web, as privacy concerns continued to rise globally. The growing complexity of data regulations, cultural sensitivities, and varying levels of trust in tech companies made it critical to understand user expectations before moving into product definition.

I partnered with Product, Privacy & Trust, Data Ethics, and Brand Strategy to conduct a study in four diverse markets (US, UK, India, Brazil). The goal was to identify emotional triggers, adoption barriers, and cultural nuances that could guide product feasibility, UX priorities, and communication strategies.

The challenge

Managing personal information online is emotionally charged and deeply contextual. While users express a strong desire for control over their PII, trust in the companies offering that control is often fragile, especially for large platforms like Google. The challenge was to decode how users balance convenience, safety, and skepticism when deciding who should have access to their most sensitive data.

Through qualitative interviews across markets, we uncovered core adoption barriers: brand trust concerns, lack of clarity around data sources, fear of unintended consequences, and uncertainty about what "removal" really means. At the same time, clear opportunities emerged to position PII management as empowerment rather than deletion, offering visibility, guidance, and reassurance.

We framed our design inquiry around:

How might we...

• Help users feel informed and in control of their PII without overwhelming them with complexity?
• Build trust in Google’s ability to manage sensitive data while being transparent about limitations?

Research & Discovery

The research phase focused on understanding how users frame control and vulnerability when it comes to their personal information online. We conducted 24 in-depth interviews across the four markets to capture not just attitudes toward privacy, but also underlying expectations about how such a tool should behave. Each market presented distinct cultural, legal, and emotional dynamics that challenged the team to avoid a one-size-fits-all solution.

Beyond interviews, we applied behavioral friction mapping to identify points of hesitation, decision fatigue, and confidence gaps that might emerge throughout the product journey. Collaboration with legal, privacy, and brand stakeholders ensured we could translate these signals into design constraints early on.
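
To ground the method, here is a minimal sketch of how a single friction-mapping entry could be structured; the field names, market codes, and severity scale are illustrative assumptions, not the study's actual schema.

```typescript
// Minimal sketch of one behavioral friction-mapping entry.
// All names and values below are illustrative assumptions, not the study's schema.
type FrictionType = "hesitation" | "decision-fatigue" | "confidence-gap";

interface FrictionPoint {
  journeyStep: string;               // where in the product journey the friction occurs
  market: "US" | "UK" | "IN" | "BR"; // study markets
  friction: FrictionType;
  evidence: string;                  // interview quote or observed behavior
  severity: 1 | 2 | 3;               // relative weight used for prioritization
}

const example: FrictionPoint = {
  journeyStep: "confirming a removal request",
  market: "BR",
  friction: "confidence-gap",
  evidence: "Participant was unsure whether 'removal' deletes data at the source.",
  severity: 3,
};
```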

Strategy & Ideation

Translating the research into strategy required reframing the product from a "data deletion tool" into a guided visibility and control experience. The mental model we prioritized positioned PII management as an ongoing process rather than a one-time fix, emphasizing transparency, flexibility, and emotional safety.

We structured the information architecture around progressive disclosure (a data-model sketch follows the list):

• Starting with high-level PII categories (e.g., name, phone, address, financial info)
• Surfacing where data might live across external websites
• Allowing users to review, confirm, or request action per data type

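The sketch below translates that three-level structure into a simple data model. The category names, action labels, and field names are assumptions made for illustration; this is not a real Google API.

```typescript
// Illustrative data model for the progressive-disclosure structure above.
// Category and action names are assumptions for the sketch, not a real API.
type PiiCategory = "name" | "phone" | "address" | "financial-info";
type PiiAction = "review" | "confirm" | "request-action";

interface ExternalSource {
  site: string;               // external website where the data may appear
  dataTypes: PiiCategory[];   // which PII types were surfaced there
}

interface PiiEntry {
  category: PiiCategory;          // level 1: high-level category
  sources: ExternalSource[];      // level 2: where the data might live
  availableActions: PiiAction[];  // level 3: what the user can do per data type
}

// Example: expanding the "phone" category reveals sources, then actions.
const phoneEntry: PiiEntry = {
  category: "phone",
  sources: [{ site: "example-directory.com", dataTypes: ["phone", "address"] }],
  availableActions: ["review", "request-action"],
};
```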

Early wireflows explored different entry points within Google Search and Account Settings, balancing accessibility with sensitivity. We tested multiple frameworks for framing action language (replacing terms like "delete" or "erase" with "review," "manage," and "visibility options") to better match user comfort levels.
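
As a rough illustration of those language frameworks, the mapping below pairs removal-centric labels with the softer alternatives named above; the specific pairings and the helper function are hypothetical, not the study's final recommendation.

```typescript
// Hypothetical mapping of removal-centric labels to the tested alternatives.
// The exact pairings are illustrative; the study compared framings, not this table.
const actionLanguage: Record<string, string> = {
  delete: "review",
  erase: "manage",
  remove: "visibility options",
};

// Return the softer label when one exists, otherwise keep the original.
function reframe(label: string): string {
  return actionLanguage[label.toLowerCase()] ?? label;
}

console.log(reframe("Delete")); // "review"
```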

Throughout ideation, we collaborated closely with privacy, legal, trust & safety, and UX teams to ensure the strategy aligned with both regulatory requirements and user expectations around agency and reassurance.

Design & Execution

Rather than moving into wireframes or visual design, this project focused entirely on delivering research-driven input to inform future design and product decisions. Our work resulted in a comprehensive insights deck that synthesized behavioral findings into actionable design principles, adoption triggers, risk considerations, and UX opportunities.

The deliverable included:

• Behavioral frameworks for user trust and control models.
• Decision-making drivers mapped across emotional and functional dimensions.
• Communication principles to inform messaging strategy.
• Early experience concepts outlining data categorization, progressive disclosure, and user guidance structures.
• Cross-market considerations for regulatory and cultural sensitivities.

Testing & Iteration

As this study sat in the upstream research phase, iteration focused on refining adoption scenarios, message framing, and user pathways through ongoing internal feedback loops. Instead of evaluating screens, we iterated on the frameworks, value propositions, and user narratives as new market-specific insights emerged.

Key adjustments included:
• Refining language recommendations to balance legal accuracy with user reassurance.
• Reframing certain triggers around visibility and monitoring instead of data removal, based on user feedback indicating discomfort with absolute deletion claims.
• Expanding archetype definitions to capture subtle variations in privacy attitudes within markets.

Outcomes & Impact

The research provided Google teams with foundational behavioral frameworks to guide the future development of PII management features within Search. The study directly informed:

• Product feasibility discussions by mapping emotional adoption barriers and trust triggers.
• Communication strategy recommendations to align messaging with user mental models.
• Legal and privacy team alignment on how to frame user control without creating false expectations.
• Prioritization of design explorations for data categorization, transparency layers, and progressive disclosure models.

Learnings

Working on sensitive topics like privacy and PII management challenged me to translate highly emotional user concerns into structured behavioral frameworks that could inform both product and legal decisions. This experience reinforced the importance of mapping not only adoption intent but also the underlying mental models users hold about control, safety, and transparency, especially when dealing with trust-critical experiences.

What I would do differently:

In future studies, I would incorporate earlier participatory design sessions with users to prototype communication language and control metaphors, enabling faster iteration around how privacy expectations are framed before product definitions are locked.

What’s next for the product:

The research continues to inform privacy-centered design explorations across Google Search and related Trust & Safety initiatives, helping shape how future features present sensitive information controls while balancing user agency, transparency, and legal frameworks.
