Conversion Hotel 2018, Texel, NL

A 90-minute hands-on workshop on the principles of good usability, functionality, and accessibility testing.

Key Themes:

Device Experience Optimization

  • Device Experience Model:
    Split mobile and desktop audiences carefully by device type, OS, browser, and even model, especially on Android, where screen resolution is an unreliable way to tell devices apart.
  • Segmenting Funnels by Device:
    Funnel analysis by device helps isolate problems: segment by mobile type, browser, country, channel, visitor type, and key journey steps (see the segmentation sketch after this list).
  • Examples:
    • 13 bugs found impacting all Android devices.
    • Fixing 4 bugs in Chrome doubled yield over Internet Explorer.
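
As a rough illustration of the segmentation step, here is a minimal pandas sketch. The file name and column names (device_category, os, browser, funnel_step numbered 1..N, sessions) are hypothetical stand-ins for whatever your analytics tool exports.

```python
import pandas as pd

# Hypothetical export: one row per (device, OS, browser, funnel step)
# with the number of sessions that reached that step.
df = pd.read_csv("funnel_export.csv")

# Sessions per funnel step for each device/OS/browser combination;
# funnel_step is assumed to be numbered 1..N so the columns sort in order.
steps = (
    df.groupby(["device_category", "os", "browser", "funnel_step"])["sessions"]
      .sum()
      .unstack("funnel_step")
)

# Step-over-step conversion: sessions at each step divided by the step before.
conversion = steps.div(steps.shift(axis=1)).iloc[:, 1:]

# Segments with the weakest final-step conversion are the first candidates
# for a manual check on a real device.
print(conversion.sort_values(conversion.columns[-1]).head(10))
```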

Lean Analytics and Device Testing

  • Data-Driven Testing Approach:
    • Start with a custom device report and spreadsheet templates.
    • Prioritise mobile, desktop, and tablet devices based on real traffic data (see the coverage sketch after this list).
  • Performance Matters:
    • Mobile page speed must be tested under real-world conditions, not lab settings.
    • Slow mobile experiences cost conversions heavily.
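
A minimal sketch of that prioritisation step, assuming a hypothetical device report with device_model and sessions columns: rank devices by traffic and test on the smallest set of real devices that covers roughly 80% of sessions.

```python
import pandas as pd

report = pd.read_csv("device_report.csv")  # hypothetical export

ranked = report.sort_values("sessions", ascending=False).reset_index(drop=True)
ranked["share"] = ranked["sessions"] / ranked["sessions"].sum()
ranked["cumulative"] = ranked["share"].cumsum()

# The must-test set: the fewest devices that still cover ~80% of traffic.
# Everything below the cut-off is lower priority.
must_test = ranked[ranked["cumulative"] <= 0.80]
print(must_test[["device_model", "sessions", "cumulative"]])
```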

QA Best Practices

  • Manual testing is critical:
    It uncovers emotional, environmental, and psychological issues that automated tests miss.
  • Real devices > Simulators:
    Always use real devices for proper testing — second-hand devices are fine.
  • Things to Look For:
    • Not just bugs, but accessibility, usability, standards compliance, and WTF moments.
    • Test edge cases deliberately, e.g., lots of items in the cart or errors in forms (see the test sketch after this list).
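
A minimal sketch of deliberate edge-case testing with pytest. The Cart class and its 99-item limit are invented stand-ins; in practice you would drive the real checkout (for example via Selenium or Playwright) with the same extreme inputs.

```python
import pytest

class Cart:
    MAX_ITEMS = 99  # hypothetical business rule

    def __init__(self):
        self.items = []

    def add(self, sku, qty=1):
        if qty < 1:
            raise ValueError("quantity must be positive")
        if len(self.items) + qty > self.MAX_ITEMS:
            raise ValueError("cart is full")
        self.items.extend([sku] * qty)

@pytest.mark.parametrize("qty", [1, 98, 99])   # boundaries that should work
def test_cart_accepts_up_to_limit(qty):
    cart = Cart()
    cart.add("SKU-1", qty)
    assert len(cart.items) == qty

@pytest.mark.parametrize("qty", [0, -1, 100])  # boundaries that should fail loudly
def test_cart_rejects_invalid_quantities(qty):
    with pytest.raises(ValueError):
        Cart().add("SKU-1", qty)
```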

Accessibility & Usability

  • Accessibility benefits everyone — and ignoring it cuts out a massive audience.
  • Key accessibility basics include page titles, alt text, form labels, keyboard navigation, and good contrast (a minimal audit sketch follows this list).
  • Usability is about how easy and pleasant the interaction feels, not just how it looks.
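
A minimal sketch of an automated first pass over those basics, using requests and BeautifulSoup. A script like this catches the mechanical issues (titles, alt text, labels); keyboard navigation and contrast still need a manual pass.

```python
import requests
from bs4 import BeautifulSoup

def audit(url):
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    issues = []

    # Every page needs a meaningful <title>.
    if not soup.title or not soup.title.get_text(strip=True):
        issues.append("missing or empty <title>")

    # Every image needs an alt attribute (empty alt is fine for decoration).
    for img in soup.find_all("img"):
        if img.get("alt") is None:
            issues.append(f"<img src={img.get('src')!r}> has no alt attribute")

    # Every visible form field needs a label or an aria-label.
    labelled = {lab.get("for") for lab in soup.find_all("label")} - {None}
    for field in soup.find_all(["input", "select", "textarea"]):
        if field.get("type") in ("hidden", "submit", "button"):
            continue
        if field.get("id") not in labelled and not field.get("aria-label"):
            issues.append(f"form field {field.get('name')!r} has no label")

    return issues

for problem in audit("https://example.com"):  # hypothetical target URL
    print(problem)
```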

Streaming & Prioritisation

  • Streaming (or Bucketing):
    Gather all found issues into a master backlog, then cluster by theme (privacy, payment, returns, confusion, etc.); see the bucketing sketch after this list.
  • Prioritisation Methods:
    • Traditional points-based systems like PIE and ICE often fail due to bias.
    • Binary yes/no models (like PXL) work better: quicker, simpler, and harder to game (see the scoring sketch after this list).
  • Validation:
    Always validate prioritisation by measuring real outcomes (impact vs. prediction) and adjust the system if needed.
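
A minimal sketch of the bucketing step; the backlog entries and themes are invented examples.

```python
from collections import defaultdict

# Master backlog: every finding from QA, analytics, and user research.
backlog = [
    {"id": 1, "note": "no returns info on product page", "theme": "returns"},
    {"id": 2, "note": "card field rejects spaces",       "theme": "payment"},
    {"id": 3, "note": "cookie banner hides the CTA",     "theme": "privacy"},
    {"id": 4, "note": "two prices shown for one item",   "theme": "confusion"},
]

# Cluster by theme; the biggest buckets usually point at what to fix first.
buckets = defaultdict(list)
for issue in backlog:
    buckets[issue["theme"]].append(issue)

for theme, issues in sorted(buckets.items(), key=lambda kv: -len(kv[1])):
    print(f"{theme}: {len(issues)} issue(s)")
```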
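
And a minimal sketch of a binary scoring model in the spirit of PXL. The questions below are illustrative, not the official PXL list; the point is that every answer is a plain yes/no, so the score is just a sum with no 1-10 scale to game.

```python
QUESTIONS = [
    "Is the change above the fold?",
    "Is it noticeable within 5 seconds?",
    "Does it add or remove an element?",
    "Is it backed by qualitative or quantitative data?",
    "Is it easy to implement (under ~1 day)?",
]

def score(answers):
    """answers: question -> True/False; an unanswered question counts as no."""
    return sum(int(answers.get(q, False)) for q in QUESTIONS)

idea = {q: True for q in QUESTIONS}
idea["Is it easy to implement (under ~1 day)?"] = False
print(score(idea))  # -> 4; higher scores go to the top of the test queue
```

Recording each idea's score next to the measured outcome of its test is the simplest validation loop: if high scorers do not win more often than low scorers, the questions need adjusting.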

Key Takeaways

  • Device testing and QA must be data-driven, continuous, and user-centered.
  • Accessibility and performance should be part of your CRO toolkit, not an afterthought.
  • Stream your findings into action, and prioritise based on clear, measurable value.
  • Always validate your prioritisation process — what matters is whether real results match your predictions.