Google [MAY '16 – CURRENT]

Senior UX Engineer

Previously: UX Engineer, Design

Overview

What is a UX Engineer? See UX Engineering at Google. Similar titles are Design Technologist, Prototype Engineer, Prototyper and Creative Engineer.

Many of my public Android Open Source Project (AOSP) commits are viewable on googlesource.com.

Public projects
Other responsibilities
  • Front-end eng: custom views, touch processing and research validation, complex layouts, smooth motion
  • ML: human interfaces for machine learning, probabilistic algorithms and heuristics
  • Enabling prototyping: tool dev, researching hacks+exploits, reverse engineering, OS modification
  • Tech stacks: my work has used embedded software, hardware, BLE advertising + GATT servers, wifi direct, mDNS, Bluetooth A2DP & HFP profiles, orientation sensor and heuristics, sensor hub

Sharesheet in Android Q


Coordinated efforts across 13 people, polishing final design and building UI

  • Led the final design polish through implementation. To build a user-tested, red-line-checked, prod-ready UI, I coordinated efforts with 3 designers, 5 engineers, 4 researchers and a PM.
  • Led project through internal design reviews, aligning efforts with the design team. Created an IxD spec with necessary design nuance for final implementation.
  • Worked with research to design and provide prototype support for 4 studies on usability and developer perspective
  • Leveraged prototype learnings to advocate for technical direction on specific components.

Brought the new Sharesheet to launch in Android Q

  • Implemented final polish details in Android source, found visual bugs and guided changes toward pixel perfection. My work includes:
    • Icon loading logic with permissions cutout requiring API review [code], [code]
    • Perceptual improvements to loading time via tweaking of entrance animation [code]
    • Direct Share loading-animation changes that I designed, plus layout updates [code]
    • Identifying a low-level Skia rendering problem
    • Custom manual testing app to validate machine-learning shortcut suggestions

Took point on developer outreach and documentation for Sharesheet

  • Spoke at Google I/O about Android Q Sharesheet changes, designed slide content
  • Completely rewrote Android sharing documentation to add missing clarity and detail

Highlights

Wifi QR code sharing in Android Q


  • Explored mechanisms to improve person-to-person wifi sharing. Work included ecosystem support for code scanning, Settings app entry points and QR code scanner design.
  • Seeded a deck and prototypes with the Android Settings team on improving wifi sharing through QR codes, along with a mechanism to reveal wifi passwords.
  • Advocacy and persistence convinced the Settings team to implement and ship this for Q
  • Consulted with the Settings team designers and engineers to bring the idea to life; much of my original proposal was implemented as-is
  • Launched with positive press by CNET and Android Police
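The QR codes involved boil down to encoding the network credentials in the de-facto "WIFI:" payload format that most scanner apps understand. A minimal illustrative sketch (field order and escaping follow the common ZXing convention; this is not the actual Settings implementation):

```java
// Sketch of building a wifi-sharing QR payload in the de-facto "WIFI:"
// format (ZXing convention). Illustrative only — not the Android Settings code.
public class WifiQr {
    // Backslash-escape the characters the format reserves: \ ; , " :
    static String esc(String s) {
        return s.replaceAll("([\\\\;,\":])", "\\\\$1");
    }

    public static String payload(String ssid, String password, String security) {
        return "WIFI:S:" + esc(ssid) + ";T:" + security + ";P:" + esc(password) + ";;";
    }
}
```

Scanning such a payload lets the receiving phone join the network without the password ever being typed.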

Research+design of gestural navigation params in Android Q


My signal processing and gesture design background led me to be deeply involved with the design of the touch gestures used by the new system gesture navigation introduced in Android Q.

  • Built an app to collect fine-grained touch event sequences from the screen edge, leveraging a hack to move the nav bar off-screen; logged data included MotionEvent details and display-implied stats
  • Worked with research to design and build a gesture testing sequence into the app, including between-gesture distractor tests. Tested on a large pool of participants.
  • Aggregated and analyzed touch sequence data to support design ideas and derive heuristic parameters that create natural, data-based gestures. Gestures accounted for:
    • Left/right handedness
    • Strain from continuous usage, reach
    • Various phone form-factors
    • User hand sizes and finger span, grip and positioning
  • Gesture navigation launched in Android Q leveraging the collected stats, see the Android developer blog for more
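Deriving a heuristic parameter from aggregated touch data can be sketched as a percentile computation — for example, choosing a screen-edge inset wide enough to cover most observed back-gesture start positions. A hypothetical sketch only; the shipped heuristics and their parameters are more involved and not public:

```java
import java.util.Arrays;

// Hypothetical sketch: derive a screen-edge inset (px) that covers a chosen
// fraction of observed back-gesture start positions. Illustrates fitting a
// parameter to aggregated touch data, not the actual shipped heuristic.
public class EdgeInset {
    public static float insetForCoverage(float[] startXs, double coverage) {
        float[] sorted = startXs.clone();
        Arrays.sort(sorted);
        // Index of the smallest inset that covers `coverage` of the samples.
        int idx = (int) Math.ceil(coverage * sorted.length) - 1;
        idx = Math.max(0, Math.min(idx, sorted.length - 1));
        return sorted[idx];
    }
}
```

The same approach extends to per-handedness or per-form-factor parameters by segmenting the samples before computing the percentile.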

Rotation Locked Mode in Android P

I identified the pain point, researched prior art, prototyped solutions, formed a team, owned the interaction + visual + motion design and final production code, and brought the feature to launch in Android P.


Identified pain points, developed solutions, iterated on UX, sold the idea

  • On my own initiative, researched prior work and pain points with auto-rotate on mobile platforms, proposed an initial solution, shared a summary deck with Android leadership
  • Iterated on form factor and interaction to make a feasible draft solution
  • Built a functional, livable prototype with logging, shared internally to gather interest, users and stats
  • Commissioned a study with the prototype to understand usage, iterated on results

Formed an advisory board + team, brought feature to greenlight

  • Gathered interested parties to form an informal UX advisory and sounding board including many from Android UX leadership
  • Got PM support and architecture + review support from System UI
  • Pivoted on the design and supporting prototypes to find a better fit in Android ecosystem
  • Proposed a complete solution to Android leadership; the design was greenlighted for P

Created design assets, built production code, launched in Android Pie

  • Worked with designers to explore iconography and motion, I designed the final icon + motion assets
  • Built, tested, and integrated final production code including complex changes to Window Manager logic and System UI (some associated AOSP commits: 1, 2, 3, 4)
  • With PM, collaborated with several teams to launch, a few examples:
    • Sensors team to resolve power issues
    • Security and logs to turn on logging support
    • Developer advocacy to lock down developer and partner communication, example
    • Several Google apps teams to advise on rotation UX and aid with rotation bugs
    • Launcher to fill the recents button in the navbar with contextual elements

Ergonomic, media default volume in Android P


I identified pain points, developed early solutions, sold the need for change, iterated on the UX with the P volume working group, created robust take-home prototypes, bridged complex design decisions across Audio System, Bluetooth, System UI and Chromecast, and launched the changes in Android P.

Identified pain points, developed solutions, sold the need for change

  • On my own initiative researched + audited platforms and code to surface volume and Bluetooth audio pain points, shared a summary deck to gather interest
  • Reverse engineered a mechanism to proportionally mirror media volume level across all streams. Used this to build a livable prototype demonstrating the idea.
  • Planted the seed for the need of volume improvements by sharing the above prototype and selling the idea person-to-person. This helped elevate volume UI and audio core interactions as a focus for Android P.
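The proportional mirroring reduces to mapping a volume index from one stream's range onto another's while preserving the fraction of the range. A minimal sketch (the range values here are illustrative; on Android the per-stream maximums come from the real `AudioManager#getStreamMaxVolume(int)` API):

```java
// Minimal sketch of proportionally mirroring a volume index from one
// stream's index range onto another's — "keep the volume at the same
// fraction across streams". Range values are illustrative; on Android
// they come from AudioManager#getStreamMaxVolume(int).
public class VolumeMirror {
    public static int mapIndex(int index, int fromMax, int toMax) {
        // Preserve the fraction of the range, rounding to the nearest step.
        return (int) Math.round((double) index / fromMax * toMax);
    }
}
```

For example, index 7 of 15 on one stream maps to index 12 of 25 on another, keeping the perceived level roughly constant.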

Iterated on the UX, created multiple robust prototypes

  • Iterated with Android UX to design and explore visual and interaction changes.
  • Created prototypes used in a study to quantitatively evaluate proposed volume models. Architected specialized logging of volume use with complex interactions like media playback and audio device dis/connection. As a testament to the improved experience and prototype quality, many participants chose to continue using the prototype at the study’s end.
  • Proposed, designed and prototyped many interactions, including the final ringer mode toggle, intro ripple and vol up + power gesture
  • Proposed, fleshed out initial eng/design details, provided guidance on multi-device interaction and Chromecast integration. Helped drive decisions on complex issues.

Android P Sandbox Google I/O ‘18 Lead

I created the final display content, designed demos + built an associated notification demo app (pictured below), set up the exhibit, trained others on materials and led the press walkthrough for the Android P System UI Sandbox at Google I/O 2018.



Fast Pair

Easy headphone pairing via BLE and Nearby

Apple removed the 3.5mm headphone jack in the iPhone 7, setting a precedent for Android OEMs to follow. Mobile use of headphones and speakers of all forms may one day be wireless-only. Today’s pairing process, however, is painful. I explored how to make pairing better on Android and bootstrapped a project across Nearby and Android Bluetooth to bring these changes to life.


Identified pain point, validated need, developed solutions

  • On my own initiative researched + audited Bluetooth pairing across platforms
  • Ran a desktop study to observe pain points in the pairing process
  • Built a functional prototype to demonstrate a first pass at better pairing UX. Demo used an Estimote with a custom Eddystone broadcast. On Android, BLE ad detection is then used to bootstrap an A2DP Bluetooth classic connection where the pairing acceptance dialog is suppressed (by aborting an ordered system intent). Distance between devices is ensured by using low ad TX power.
  • Built a robust prototype by electrically modifying headphones to power an nRF51 BLE advertising dongle. Wrote firmware for the nRF51 that implemented the proposed technical solutions. Created and shared a demo video reel.
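The low-TX-power trick works because received signal strength falls off with distance: with a weak advertisement, the beacon is only heard (or only clears an RSSI threshold) when the devices are close. A log-distance path-loss sketch; the exponent n (~2 in free space) and the 1.5 m cutoff are assumed illustrative values, not figures from the project:

```java
// Sketch of a log-distance path-loss estimate used to gate "nearby"
// detection from a BLE advertisement's RSSI. txPowerAt1m is the
// calibrated RSSI at 1 m; the path-loss exponent n (~2 in free space)
// and the cutoff below are assumptions for illustration.
public class BleProximity {
    public static double estimateDistanceMeters(int rssi, int txPowerAt1m, double n) {
        return Math.pow(10.0, (txPowerAt1m - rssi) / (10.0 * n));
    }

    public static boolean isNearby(int rssi, int txPowerAt1m) {
        return estimateDistanceMeters(rssi, txPowerAt1m, 2.0) <= 1.5; // ~arm's length
    }
}
```

In practice per-device RSSI calibration is needed, which is why signal normalization across devices (below) mattered.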

Iterated on the UX, created multiple robust prototypes, fleshed out spec, launched

  • Fleshed out and iterated on interactions, targeting the major problems along with the corresponding technical requirements.
  • Worked with Bluetooth and Nearby to flesh out a future-proof manufacturer specification based on future features and OEM feedback.
  • Demoed prototypes at two major internal events.
  • Owned the vision, research, interaction, visual and initial technical design proposals. Got interaction greenlight from Android UX.
  • Designed and validated BLE signal normalization techniques across devices.
  • Teamed up with Nearby team to put this in the upcoming release of Google Play Services to coincide with the Android Oreo release.

Dismiss on capacitive, resistive, rotary headunits

Built 50+ prototypes to bring home card dismiss to Android Auto Projected, across its various input methods: capacitive touch, resistive touch and rotary. Prototypes explored interaction, visual and motion options. Ran a study and iterated with feedback to get buy-in from stakeholders in UX, PM and Eng on interaction basics. Once roadmapped, gave feedback to UX team during IxD and VisD iterations. Aided with bug fixes in final implementation.


A few insights I developed through prototyping that guided the final implementation:

  • Long press is the best available trigger to start dismiss with rotary and resistive touch
  • Don’t provide an undo mechanism or secondary signals; too complex while driving
  • To improve dismiss discoverability on rotary and resistive, use vivid motion to enter dismiss mode and distinct aesthetics to differentiate it
  • Motion to expose dismiss mode should not translate the card, or resistive users will feel it moving under their finger
  • Avoid the zig-zag rotary highlight pattern when possible for a better first experience; users will figure out the pattern after a few uses anyway
Highlights
  • Android Police report on dismissing cards in Android Auto

    There’s a new version of Android Auto rolling out and this one is definitely worth the download. Version 2.2 brings one of the most requested features to the Auto interface since it was launched: Notifications can now be swiped away from the overview screen.

Patents and Applications

tags: industry software design data-science statistics matlab UX Android web coffee-script