User Research Methodology

Beyond The Metrics:
Real Traders, Real Stories

My portfolio shows quantitative outcomes — fewer steps, faster compliance updates, lower KYC drop-off. But behind every metric is a human story. Here are the traders who shaped my design decisions.

My Research Philosophy

Metrics tell you what happened. Users tell you why.
I combine quantitative analytics (Hotjar heatmaps, GA4 funnels) with qualitative research (interviews, usability testing, contextual inquiry) to understand both behavior and motivation.

Moderated Interviews

45-60 min sessions with traders (novice to expert). I ask about their mental models, workflow contexts, and emotional responses to risk.

Sample Size: 32 traders over 18 months

Usability Testing

Task-based testing with think-aloud protocol. I measure time-on-task, error rates, and subjective satisfaction (SUS scores).

Sample Size: 15 traders per major feature

Behavioral Analytics

Hotjar session recordings, heatmaps, and funnel analysis to identify where users struggle—then interviews to understand why.

Sample Size: 147 session recordings analyzed

Meet The Traders

These are composite personas based on 32 real interviews. Names changed for privacy.

👩‍💼 Samantha Dumas

Role: Novice Retail Trader
Age: 28 | Location: Melbourne, AU
Background: Marketing Manager, no finance background

Forex Beginner · Mobile-First · Risk-Averse
"I felt like I needed a finance degree just to understand what 'leverage' meant. Every platform assumed I already knew everything. I just wanted to trade EUR/USD without feeling stupid."

Samantha's Journey: From Overwhelmed to Confident

  • 😰 1. Account Setup: "What's KYC? Why do they need my passport?"
  • 😵 2. First Trade: "I clicked 'Buy' but nothing happened. Did it work?"
  • 😟 3. Risk Management: "How much can I lose? I don't understand 'margin call'"
  • 🙂 4. Proficiency: "Now I get it. I can actually do this!"

Key Pain Points
  • Platform assumed trading literacy—no onboarding tooltips
  • Jargon everywhere ("pips", "spread", "stop loss") without explanations
  • Unclear feedback after placing orders—"Did it execute?"
  • Fear of losing money because risk wasn't visualized
My Design Solutions
  • Onboarding tooltips: Contextual glossary for every financial term
  • Progress indicators: "Order Submitted → Executed → Confirmed" stepper
  • Risk visualization: "You can lose up to $X" shown BEFORE trade execution
  • Plain-language disclaimers: Replaced legal jargon with clear warnings
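The "You can lose up to $X" pre-trade preview boils down to a small pip-based calculation. A minimal sketch; the function name and the $10-per-pip standard-lot assumption are mine for illustration, not the production LogixTrader logic:

```python
def max_loss_preview(lot_size: float, entry: float, stop_loss: float,
                     pip_value_per_lot: float = 10.0) -> float:
    """Worst-case loss if the stop-loss fills, in account currency.

    Assumes a USD-quoted pair where one pip on a standard lot is ~$10;
    real platforms derive pip value from contract size and quote currency.
    """
    pips_at_risk = abs(entry - stop_loss) / 0.0001  # one pip = 0.0001 for most FX pairs
    return round(pips_at_risk * pip_value_per_lot * lot_size, 2)

# A 1-lot EUR/USD buy at 1.0850 with a stop at 1.0800 risks 50 pips ≈ $500
print(max_loss_preview(1.0, 1.0850, 1.0800))  # → 500.0
```

Surfacing this number before the confirm button, rather than after, is what turns an abstract "margin call" fear into a concrete, bounded figure.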
  • 64.63% faster order placement (8.2s → 2.9s) †
  • SUS score of 85 post-redesign, up from 52 ‡
  • 40% reduction in order-status support tickets
Usability Testing Methodology — These Are Real Measurements

These metrics are not persona narrative constructs. They come from a controlled usability study conducted on both the legacy and redesigned LogixTrader order placement flow. Persona context is used to situate the findings, but the numbers are independently measured.

† Task Timing (8.2s → 2.9s)
  • Protocol: Moderated think-aloud, same 15 participants tested both flows sequentially
  • Task: "Place a market order for 1 lot EUR/USD as you normally would"
  • Timing: Manual stopwatch + screen recording (dual-verified, ±0.2s human reaction variance)
  • Start/End: Click 'New Order' → order confirmation modal appears
  • Limitation: Legacy flow tested first (no counterbalancing); learning effect may understate improvement. Lab environment ≠ live trading conditions. n=15 is appropriate for qualitative usability insight, not statistical validation.
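The 64.63% headline figure is simply the relative reduction in mean task time, which can be checked directly:

```python
def pct_improvement(before_s: float, after_s: float) -> float:
    """Relative reduction in task time, expressed as a percentage."""
    return round((before_s - after_s) / before_s * 100, 2)

print(pct_improvement(8.2, 2.9))  # → 64.63
```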
‡ SUS Score (52 → 85)
  • Instrument: Standard 10-item System Usability Scale (Brooke, 1996)
  • Participants: Same n=15 cohort (5 novice, 7 intermediate, 3 expert traders). Pre-score collected after legacy flow; post-score after redesigned flow in same session.
  • Benchmarks: SUS 52 = "Poor / D grade" (below 68 acceptability threshold). SUS 85 = "Excellent / A grade" (industry top quartile per Bangor et al., 2009).
  • Limitation: Potential order bias (tested legacy first). No washout period. Full methodology + session recordings available under NDA.
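The SUS figures follow Brooke's standard scoring procedure, which is worth making concrete. A minimal implementation of the published formula (odd-numbered items score positively, even-numbered items negatively, scaled to 0–100):

```python
def sus_score(responses: list[int]) -> float:
    """Compute a System Usability Scale score (Brooke, 1996).

    `responses` are the 10 item ratings, each 1-5, in questionnaire order.
    Odd-numbered items are positively worded (score - 1); even-numbered
    items are negatively worded (5 - score). The sum is scaled to 0-100.
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS needs exactly 10 ratings in the range 1-5")
    total = sum(r - 1 if i % 2 == 0 else 5 - r  # i is 0-based, so even i = odd item
                for i, r in enumerate(responses))
    return total * 2.5

# All-neutral answers land at 50; uniformly best-case answers reach 100.
print(sus_score([3] * 10))    # → 50.0
print(sus_score([5, 1] * 5))  # → 100.0
```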
👨‍💻 Michael Garnier

Role: Intermediate Day Trader
Age: 34 | Location: Singapore
Background: 3 years trading experience, follows market signals

Technical Analysis · Desktop Power User · Data-Driven
"I need to make decisions in seconds. If your platform makes me click 5 times to see my P&L, I'm losing money while I wait. Every millisecond counts when markets move fast."

Michael's Daily Workflow

Typical Trading Day (6am - 2pm)
  • 6:00am: Opens Finlogix → scans 50+ market signals across 12 currency pairs
  • 7:30am: Identifies 3 high-probability setups → sets price alerts
  • 9:00am: Alert triggered → opens LogixTrader → executes trade in <3 seconds
  • 9:15am: Monitors open positions across 3 charts simultaneously
  • 12:00pm: Closes profitable trades → reviews P&L attribution in TradingCup
Workflow Friction Points
  • Market data scattered across 3 different platforms—context switching kills speed
  • No keyboard shortcuts—forced to use mouse for every action
  • Chart customization resets between sessions—had to reconfigure daily
  • Risk metrics hidden in dropdown menus—couldn't see exposure at a glance
My Design Solutions
  • Unified data dashboard: Finlogix aggregates signals + charts + news in one view
  • Keyboard-first execution: F9 = Buy, F10 = Sell, Ctrl+R = Close All (Bloomberg-inspired)
  • Persistent workspace state: Chart configs, indicator settings saved per user
  • Real-time risk dashboard: P&L, margin usage, open positions always visible (no dropdowns)
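Persistent workspace state can be as simple as serializing the user's configuration between sessions. A minimal sketch; the field names and JSON layout are illustrative assumptions, not the actual Finlogix schema:

```python
import json
from pathlib import Path

# Hypothetical workspace shape: chart configs, layout, and hotkey bindings.
DEFAULT_WORKSPACE = {
    "charts": [{"pair": "EUR/USD", "interval": "5m", "indicators": ["EMA20", "RSI"]}],
    "layout": "three-pane",
    "hotkeys": {"buy": "F9", "sell": "F10", "close_all": "Ctrl+R"},
}

def save_workspace(state: dict, path: Path) -> None:
    """Persist the session's configuration on exit (or on every change)."""
    path.write_text(json.dumps(state, indent=2))

def load_workspace(path: Path) -> dict:
    """Restore the last session's state, falling back to sane defaults."""
    if path.exists():
        return json.loads(path.read_text())
    return DEFAULT_WORKSPACE
```

The design point is not the serialization itself but the contract: a trader like Michael should never pay a reconfiguration tax at market open.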
Measured Impact (Finlogix Redesign)
  • 40% faster market analysis (usability testing, n=15)
  • 67% fewer "Data not found" support tickets on Finlogix (internal tracking)
Institutional Research

Beyond Retail: Institutional Stakeholders

The ACY Connect institutional B2B platform required a fundamentally different research methodology. Retail traders express frustration emotionally. Institutional stakeholders express it in SLAs, latency requirements, and regulatory audit clauses.

👔 James Liang

Role: Relationship Manager — Prime Brokerage
Age: 41 | Location: Hong Kong
Background: 12 years institutional sales, manages 8 prime brokerage accounts ($10M–$500M daily flow)

Prime Brokerage · Institutional Sales · Account Oversight
"My clients don't call me when a trade executes. They call me when it doesn't. I need to see every open order, every FIX session status, every credential expiry — before my clients see it first. Your platform is my early warning system."

James's Accountability Model

What Was Failing
  • FIX session downtime discovered by client before RM — trust erosion
  • No unified view: order status, credit limits, and connectivity on 3 separate screens
  • Credential renewal: manual email chain, 5-day lead time, no self-service
  • Compliance queries required IT ticket — 48-hour SLA, institutional clients expect minutes
Design Outcomes (ACY Connect)
  • Unified dashboard: FIX session health, credit exposure, order volume — single view
  • Proactive alerts: Session latency spike → push notification before client sees it
  • Self-service credentials: RM generates API keys, resets passwords without IT ticket
  • Audit trail exports: 1-click compliance reports formatted for ASIC/SFC review
Research Method: Contextual Inquiry + Shadowing

Shadowed 3 RMs during live trading hours (9am–12pm HKT) across 4 sessions. Observed real screen workflows — not simulated tasks. Key insight: RMs have zero tolerance for latency in information retrieval because any delay is felt by their institutional clients as service failure. Traditional usability testing (task-based, moderated) was insufficient — the research required being present when the stress was real.

💻 Ravi Mehta

Role: Quant Developer / Systems Integrator
Age: 33 | Location: Singapore
Background: Python/C++, connects hedge fund OMS to broker infrastructure via FIX 4.4

FIX 4.4 Protocol · API Integration · Low-Latency Systems
"I don't use your UI — I use your API. But when something breaks at 3am during Tokyo open, I need documentation that tells me exactly which FIX tag is causing the OrdStatus rejection. Ambiguous docs cost me hours. Clear docs cost me minutes."

Ravi's Integration Workflow

FIX Session Lifecycle — Where Design Decisions Live
  • 🔑 Credential setup: CompID, SenderID, password — RM self-service portal
  • 🔌 Session init: Logon (MsgType=A), heartbeat interval, sequence reset
  • 📋 Order flow: Tag 150 ExecType → Tag 39 OrdStatus state machine
  • 📊 Reconciliation: EOD position match, reject code audit trail export
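The Tag 150 → Tag 39 relationship at the order-flow step can be sketched as a small lookup. This is a simplified illustration covering a few common FIX 4.4 codes, not the full specification's state machine:

```python
# A subset of FIX 4.4 ExecType (tag 150) codes, labeled for readability.
EXEC_TYPE = {"0": "New", "4": "Canceled", "5": "Replaced", "8": "Rejected", "F": "Trade"}

def ord_status(exec_type: str, leaves_qty: float, cum_qty: float) -> str:
    """Derive OrdStatus (tag 39) from an execution report — simplified sketch."""
    if exec_type == "8":
        return "Rejected"
    if exec_type == "4":
        return "Canceled"
    if exec_type == "5":
        return "Replaced"
    if exec_type == "F":  # a fill: partial while any quantity remains open
        return "Filled" if leaves_qty == 0 else "Partially Filled"
    if exec_type == "0":
        return "New"
    raise ValueError(f"unhandled ExecType {exec_type!r}")

print(ord_status("F", leaves_qty=0.4, cum_qty=0.6))  # → Partially Filled
print(ord_status("F", leaves_qty=0.0, cum_qty=1.0))  # → Filled
```

This mapping is exactly what the interactive spec's state-machine diagram visualizes; Ravi's complaint was that he had to reverse-engineer it from rejected messages.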
Developer Experience Failures
  • FIX tag documentation: PDF, no search, no code examples
  • Error codes undocumented — Ravi discovered meaning by trial and error
  • Test environment shared with other clients — caused sequence number conflicts
  • Credential rotation required email to ops team — 3-day SLA in pre-production
Developer Portal Outcomes
  • Interactive FIX spec: Searchable tag reference, OrdStatus state-machine diagram
  • Error code glossary: Every reject code mapped to plain-English cause + resolution
  • Isolated sandbox: Dedicated test environment per client, no sequence conflicts
  • Self-service key rotation: RM portal generates/rotates credentials without ops ticket
Research Method: Developer Interview + API Journey Mapping

Conducted 5 semi-structured interviews with quant developers and systems integrators at institutional clients (hedge funds, family offices, algo trading desks). Unlike retail research, the primary artifact was not a journey map but an API integration audit — walking through every step of the FIX session lifecycle and recording where developer time was lost. Key finding: documentation quality had greater impact on integration time than API design itself.

Retail vs. Institutional Research: A Methodology Contrast

Retail Traders
  • Moderated usability testing, think-aloud protocol
  • Emotional state mapping (frustration, confusion, confidence)
  • Quantitative SUS scoring + task completion rates
  • Hotjar heatmaps for behavioral validation
  • n=15 per feature is appropriate for insight generation
Institutional Stakeholders
  • Contextual inquiry + live workflow shadowing
  • SLA and latency requirements as design specifications
  • API journey audits — integration time as the UX metric
  • Compliance clause analysis as user requirement input
  • n=5 deeply is more signal than n=50 superficially

Institutional developer research — deeper coverage in the ACY Connect case study

The Ravi Mehta persona above is a composite. Full institutional developer research — FIX session lifecycle mapping, API credential UX, IP whitelist flow analysis, and integration time as a design metric — is documented in depth in the ACY Connect case study →

My Research Process

How I translate user insights into design decisions

1. Problem Discovery

Start with analytics anomalies: Do Hotjar heatmaps show users clicking non-clickable elements? Does funnel analysis show a 40% drop-off at order confirmation? That's where I dig deeper.

Example: I noticed 40% of traders abandoning the order flow at the "Risk Disclosure" step. Analytics told me where the problem was, but not why.
2. Qualitative Research

Recruit 15 users matching the demographic → run moderated usability testing with think-aloud protocol. I record sessions, ask follow-up questions, and observe emotional responses (frustration, confusion, delight).

Finding: 12 out of 15 traders said "I don't understand why you need my tax ID for a demo account". Legal's compliance text was scaring users away.
3. Design Iteration

Synthesize findings → create 2-3 design variations → A/B test with real users. I measure quantitative impact (time-on-task, completion rate) AND qualitative satisfaction (SUS scores, user quotes).

Solution: Redesigned the risk disclosure using progressive disclosure: minimal text by default, with a "Learn Why" link to the full legal text. Completion rate: 60% → 92%.

Want to See More Research Stories?

This page shows 4 composite personas — 2 retail traders, 2 institutional stakeholders — from 32+ interviews across both user segments. Full research database, journey maps, and session recordings available under NDA.