Google · Gemini Product Sentiment Analysis · Enterprise UX · 2025 · Shipped ✅

Designing a Product Sentiment
Analysis Experience Built on
Gemini's Multimodal Capabilities

Surfacing consumer signals across YouTube, web, and internal data — built to make Gemini's multimodal reasoning legible to enterprise decision-makers

Duration
2 Weeks
My Role
Lead UX/UI Designer · Front-End Dev · PM
Team
Pitchhub (Cognizant + Google) · Google Cloud Gen AI Team
Status
✅ Shipped
Product Sentiment Analysis — Project Sentio
Overview

The Brief

Project Sentio is a product sentiment analysis tool built on Gemini. Given a product, it ingests YouTube video reviews, web mentions, and internal documents in a single query — and returns a synthesized sentiment score, top positive and negative signals, and the source evidence behind each finding. The experience makes visible what Gemini is doing: processing genuinely different data modalities and reasoning across them to produce a unified business-ready output.
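The single-query output described above — a synthesized score, top signals, and the evidence behind each — can be pictured as one structured record. Below is a minimal sketch in Python; every class, field name, and value is a hypothetical illustration, not the actual Sentio or Gemini schema, which this case study does not expose.

```python
from dataclasses import dataclass, field

@dataclass
class Evidence:
    source_type: str   # one of the demo's modalities: "youtube" | "web" | "internal"
    reference: str     # e.g. a video ID or document name (hypothetical)
    quote: str         # verbatim excerpt supporting the signal

@dataclass
class Signal:
    label: str                 # e.g. "battery life"
    polarity: str              # "positive" | "negative"
    evidence: list[Evidence] = field(default_factory=list)

@dataclass
class SentimentReport:
    product: str
    sentiment_score: float     # 0.0 (fully negative) .. 1.0 (fully positive)
    signals: list[Signal] = field(default_factory=list)

    def top(self, polarity: str, n: int = 3) -> list[str]:
        """Top-N signal labels of one polarity, most-evidenced first."""
        matching = [s for s in self.signals if s.polarity == polarity]
        matching.sort(key=lambda s: len(s.evidence), reverse=True)
        return [s.label for s in matching[:n]]

# Illustrative report for a made-up product query.
report = SentimentReport(
    product="Pixel Buds",
    sentiment_score=0.72,
    signals=[
        Signal("sound quality", "positive",
               [Evidence("youtube", "review-video-123", "the sound is crisp")]),
        Signal("case hinge", "negative",
               [Evidence("web", "forum-thread-9", "hinge feels flimsy")]),
    ],
)
print(report.top("positive"))  # ['sound quality']
```

The key design property the sketch captures is traceability: each signal carries its own evidence list, so nothing on the dashboard is an unsourced claim.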

This was built as a working conference demo for an exclusive Google event attended by 200+ C-suite executives from global companies. The context mattered: these weren't developers. They were decision-makers evaluating whether to commit their organizations to Gemini. The design problem wasn't usability — it was making multimodal AI comprehensible and commercially credible to people who had the authority to say yes.

I led the project end-to-end — information architecture, high-fidelity Figma prototypes, and front-end implementation in FlutterFlow — collaborating with a four-person Google Cloud Gen AI engineering team. We shipped in two weeks.

Design Challenge

How might we design an experience that makes Gemini's multimodal advantage immediately legible — and commercially compelling — to enterprise decision-makers who have the power to adopt it?

01
Make Multimodality Legible
Make Gemini's ability to reason across YouTube, web, and internal data immediately understandable — without requiring any technical explanation.
02
Build Commercial Credibility
Help executives picture this capability inside their own organizations — not as a research demo, but as something they could plausibly adopt and deploy.
03
Turn Insight into Business Value
Translate AI output into tangible business artifacts executives could immediately see value in — closing the gap between "impressive capability" and "I want this."
04
Ship Without Compromising the Story
Design within the hard constraints of what the engineering team could ship in a two-week sprint — without letting technical limits undermine the multimodal narrative we were trying to tell.
Solution

From Search to Insight
to Value Generation

Feature 01 — Homepage

Grounding the Use Case in Real Products

The homepage surfaces an animated carousel of real product images before any analysis runs. This immediately anchors the experience: Sentio is not analyzing abstract data — it is analyzing what real people are saying about specific products. Grounding the use case visually helped executives understand the problem being solved before they encountered the AI output.

Feature 02 — Tabbed Source Navigation

Making Each Data Source Visible

Results are organized into dedicated tabs for YouTube, Web, and Internal sources — making Gemini's multimodal processing explicit rather than implied. Executives aren't told the model uses multiple sources; they can see each source's contribution separately. This turns what could have been a black-box output into a legible three-part story: what YouTube reviewers are saying, what the web is saying, and what internal data shows.
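The three-tab structure amounts to a simple partition of findings by source type. The sketch below illustrates that grouping only; the finding records and field names are assumptions, not the demo's actual data model.

```python
from collections import defaultdict

# Hypothetical findings; "source" values mirror the demo's three tabs.
findings = [
    {"source": "youtube",  "text": "Reviewers praise the camera."},
    {"source": "web",      "text": "Forum posts complain about battery drain."},
    {"source": "internal", "text": "Support tickets mention pairing issues."},
    {"source": "youtube",  "text": "Unboxing videos highlight build quality."},
]

def group_by_tab(findings):
    """Partition findings into the YouTube / Web / Internal tabs."""
    tabs = defaultdict(list)
    for f in findings:
        tabs[f["source"]].append(f["text"])
    return dict(tabs)

tabs = group_by_tab(findings)
print(sorted(tabs))          # ['internal', 'web', 'youtube']
print(len(tabs["youtube"]))  # 2
```

Keeping each modality's findings in its own bucket is what lets a presenter walk the audience through the sources one at a time instead of showing one undifferentiated feed.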

Feature 03 — At-a-Glance Insights Dashboard

Readable Output, Traceable Signal

A distilled dashboard presents the overall sentiment score, top positive features (green), and top negative features (red) as scannable pills and tags. The design prioritizes two things: immediate readability without data expertise, and traceability — every signal can be traced back to the specific sources it came from, giving executives confidence that the analysis is grounded in real evidence rather than opaque inference.

Feature 04 — Granular Source Drill-Down

From Overview to Specifics

A modal interface lets presenters dive into individual sources — surfacing specific summaries, sentiment tags, and verbatim quotes. Executives can see exactly where the AI's conclusions come from, building trust in the analysis.

Feature 05 — Insight-to-Action Generation

From Sentiment to Business Action

The final step closes the loop: Sentio uses the sentiment findings as structured input and generates polished campaign images in real time. The intent was to show that product sentiment analysis doesn't end with a report — it can feed directly into business decisions. Curated prompts provided a low-friction starting point; the ability to input custom queries let executives test it against their own products and mental models.

The Process

Designing the Story,
Not Just the Screen

Design Iteration
01

Making Multimodality Visible Through Structure

Problem

The initial design consolidated all sentiment results into a single feed, which obscured the most important thing we needed to show: that Gemini was reasoning across genuinely distinct data modalities, not just one large dataset.

Solution

A tabbed approach organized results by source (YouTube, Web, Internal), allowing the presenter to deliberately walk the audience through each data type as a structured, three-act narrative.

Impact

Transformed a cluttered feed into a structured story: each tab became a distinct beat, making Gemini's multimodal reasoning observable rather than invisible.

Design Decision
02

Simplifying Data Visualization

Challenge

Initial high-fidelity visualizations — complex charts and graphs — exceeded what the engineering team could build in two weeks. The model also only output a limited set of data points: a sentiment score, top positive features, and top negative features.

Redesign

Focused entirely on actual model outputs. Designed a large, prominent sentiment score alongside two columns of color-coded pills (green for positive, red for negative) with icons — no chart expertise needed to read.

Outcome

A dashboard that was both feasible to build and more direct for the audience. The engineering constraint actually improved the design — a lesson in letting constraints clarify intent.

Design Handoff
03

Bridging Design & Development with FlutterFlow

Instead of a traditional design-then-handoff cycle, I built the front-end directly in FlutterFlow — a low-code platform that enabled immediate, tight feedback loops between design intent and engineering implementation.

When the back-end team surfaced data limitations mid-sprint, I could adjust the UI in real time — compressing days of back-and-forth into hours. A parallel Figma prototype also served as a reliable offline backup for the conference presentation, ensuring we were never at the mercy of conference Wi-Fi.

Impact

How It Landed

The tabbed source structure gave executives a clear mental model of what multimodal AI analysis actually means in practice — without a single line of technical explanation.
👥
Active engagement: Multiple C-suite attendees engaged directly with the product during the live presentation — asking specific questions about which sources contributed to which findings, and how the sentiment signals were derived. For a non-technical audience, that level of interrogation is a meaningful signal of comprehension.
💼
Enterprise adoption conversations: Several executives approached the Google team after the presentation specifically to discuss how Gemini's multimodal capability could apply within their own organizations. The demo succeeded not just as a showcase, but as a credible proof of commercial relevance — exactly the outcome it was designed to enable.
🏆
Internal recognition: The demo was highly praised by the CEO's office and management for its clarity and storytelling effectiveness — a meaningful signal that the strategic framing worked and the constraints we navigated didn't show.
Reflection

Learnings & Takeaways

What I learned
  • Designing for comprehension is different from designing for users. This project taught me that enterprise demo design is fundamentally about making a capability legible to someone with buying power, not just making a product usable. The design questions aren't "is this easy to use?" — they're "does this make the right thing believable?"
  • Constraints made the story sharper. The engineering limitations that forced me to simplify data visualization actually improved the design. Stripping out complex charts and foregrounding raw model outputs made Gemini's multimodal capability more visible, not less impressive.
  • Adaptable tooling unlocks speed. Being willing to build in FlutterFlow rather than hand off to developers compressed what could have been a week-long cycle into same-day iteration. The right tool for the context matters enormously.
If I had more time
  • Think through productization. Sentio worked well as a conference demo, but the more interesting design question is: what would it take to turn this into something product teams could actually use day-to-day? That means thinking about saved analyses, comparison views, integration with existing product feedback workflows, and how the output maps to real product decisions — not just a live presentation.
  • Broader stakeholder research. The experience was shaped entirely around a C-suite presentation context. The full decision chain for enterprise AI adoption also includes product managers, data analysts, and IT leads — each with different questions about trustworthiness, accuracy, and integration. Research across those audiences would surface a richer set of design requirements.