Improving reachability for the search experience on large screen phones

  • #framer
  • #android
  • #search

Search UX is hard to reach on large phones

As smartphones have grown larger, the area one can reach with one-handed use has shrunk. What's more, as phones become bezel-less, top-aligned content gets pushed even further up, making reachability even worse.

One area especially harmed by the inability to comfortably interact with the top of the screen is the search bar and its autocomplete results.

Android Oreo screenshots with reach areas superimposed.

Not only is the Google search button in the danger zone for one-handed use, but the app drawer's search bar is too. For a company like Google, this is not only a bad experience for end users but could also mean a reduction in search revenue.

Exploring the current solution space

This problem has not gone unnoticed. Apple, Samsung, Motorola, LG, and Google have all devised ways to address the inability to comfortably interact with the top of the screen.

Apple’s Reachability

On large-screen iOS devices, lightly double-tapping the home button slides the screen within range. The double tap is hard to discover and, frankly, easy to forget exists. It requires the user to wait for the animation to complete before interacting, and it isn't fluid to perform while the phone is in use.

Samsung’s (and others) screen shrink

Honey, I shrunk the screen! Samsung, Motorola, and others allow a temporary screen shrink that puts the screen within range. The biggest issue is how difficult the touch targets become to reliably hit once the UI shrinks: buttons designed with specific target sizes in mind become significantly smaller.

Pixel’s fingerprint swipe

The Pixel allows a simple downward gesture on the fingerprint sensor to bring down the notification shade - an action normally triggered from the top of the screen. Although it works well for its intended use, sensor limitations and potential ergonomic issues mean no more gestures can be added.

Pixel 2’s bottom aligned search bar

The Pixel 2 is really the first to address reachability specifically for search UX by placing the Google search bar at the bottom of its launcher. However, once activated, autocomplete results still appear at the top and remain uncomfortable to reach.

Although each solution is interesting and improves the experience in certain cases, none of them addresses the entire end-to-end search experience. I believe something better exists, and I'm going to prototype ideas to explore the space.

Defining success

Unlike the solutions above, I want my prototypes to feel fluid rather than like entering a different mode of use. They should be memorable and easy to activate from within the search experience. To provide some guiding principles, I've defined the following key success criteria that each solution should fulfill.

  • The design should feel fluid, like an extension of what the user is already doing

  • The design should be discoverable and memorable enough to become learnable

  • The design should accommodate the entire search experience including selection of autocomplete results

Prototypes

Now that I understand the problem, have researched existing solutions, and have defined success criteria, I can begin prototyping. I usually start with sketches and then move into high fidelity.

After some sketching, designing, and a lot of hacking in Framer, I've put together three solutions that explore the space.

1. Flip results upside down

Always start with the simplest solution. The idea with this prototype is to flip the results to be bottom-aligned, similar to the Pixel's search bar, and reverse the sort order so the most relevant autocomplete results are sorted bottom to top.

[Prototype demo: flipped, bottom-aligned results]
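The layout logic behind this idea is simple to sketch. The function below is a hypothetical illustration (not the actual Framer prototype code): given results ranked most-relevant-first and an assumed fixed row height, it positions the best match in the lowest row, closest to the thumb.

```javascript
// Illustrative sketch only: lay out ranked autocomplete results
// bottom-aligned, so results[0] (the most relevant) sits nearest the thumb.
function layoutBottomAligned(results, containerHeight, rowHeight) {
  return results.map((text, rank) => ({
    text,
    // rank 0 occupies the bottom row; each higher rank stacks one row up.
    y: containerHeight - (rank + 1) * rowHeight,
  }));
}
```

The only change from a conventional list is the direction of stacking; the ranking itself is untouched.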

Pros

  • Easy to reach most relevant autocomplete results

  • Easily discoverable

  • No complicated gestures to learn

Cons

  • Reversed ordering breaks the familiar top-to-bottom relevance model and could confuse users about which results are most relevant

TODO

  • A visual affordance or indicator to show which results are more popular than others could help solve confusion around ordering.

2. Pull down to select

A possible hypothesis as to why the most relevant autocomplete results are shown right under the search bar is how keyboard arrow keys work with search. Traditionally on desktops, it was always one or a few "down" key presses to get to the most relevant results.

To replicate something similar, the idea for this prototype is a gesture performed on the keyboard - which is already present while typing the search term - that scrolls through each list item.

How to use

A single swipe down on the blue indicator selects items based on how far the user drags. To avoid conflicts with swipe-typing keyboards, the gesture could be restricted to the blue bar for a single-finger drag, or to the entire keyboard for a two-finger drag.
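The core of this interaction is mapping drag distance to a list row. Here is a minimal sketch of one way that mapping could work, assuming a fixed row height; the function name and parameters are illustrative, not from the actual prototype. A full swipe clamps to the last step, which corresponds to the top, most relevant result.

```javascript
// Illustrative sketch only: map a downward drag distance (in px) on the
// keyboard to a selection step through the autocomplete list.
function indexForDrag(dragY, rowHeight, resultCount) {
  if (dragY <= 0) return -1;                 // no downward drag, no selection
  const step = Math.floor(dragY / rowHeight); // each rowHeight px advances one row
  return Math.min(step, resultCount - 1);     // full swipe clamps to the last row
}
```

Clamping is what makes the "full swipe without precision" behavior work: overshooting can never select past the most relevant result.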

Pros

  • Gesture feels fluid since the user is already interacting with the keyboard

  • A full swipe all the way down selects the top and most relevant result without needing to be precise

Cons

  • Undiscoverable

  • Swiping down on the keyboard doesn't feel intuitive, nor does it pair with any existing mental model or similar gesture

  • Even if the gesture is restricted to the blue bar or to a two-finger drag, it could still accidentally trigger swipe typing

TODO

  • Add indicators and animations for accidental activations to reduce the jarring experience of triggering it during swipe typing

3. Swipe up for virtual cursor

Rather than restricting the design to list selection, perhaps a better approach is something that can reach any target at the top of the screen. It could be designed to help search specifically while remaining versatile for other uses.

The idea for this prototype is to offer a virtual cursor, offset from the user's touch, that gives them extra reach. Think of it as a reaching stick for phones.

How to use

Swipe up from the bottom of the screen around the home button and hold. You'll see a black touch area and a virtual cursor. Wherever you let go, a virtual tap is performed.
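The geometry of the offset cursor is straightforward to sketch. The function below is a hypothetical illustration, assuming a fixed vertical reach offset and a simple clamp to screen bounds; none of these names come from the actual prototype.

```javascript
// Illustrative sketch only: place a virtual cursor a fixed distance above
// the finger, clamped to the screen, so the thumb gains extra reach.
function virtualCursor(touch, reach, screen) {
  return {
    // keep the cursor horizontally within the screen
    x: Math.min(Math.max(touch.x, 0), screen.width),
    // the cursor floats `reach` px above the finger, never past the top edge
    y: Math.max(touch.y - reach, 0),
  };
}
```

Because only the cursor is offset and the UI itself never moves or scales, hit targets keep their original sizes, unlike the screen-shrink approaches.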

Pros

  • Gesture feels fluid and could be activated quickly

  • Can be used for more than just search lists and autocomplete

  • Unlike the screen shrink solutions by others, this keeps the hit target area the same

Cons

  • Undiscoverable

  • Limited to taps only; doesn't support swipe or drag gestures

  • Requires some precision to aim the cursor where you'd like it

TODO

  • Try locking the x position of the virtual cursor to the x position of the swipe up from bottom to help aim and precision.

  • Add further gestures to allow for swiping/dragging.

Conclusions

None of my proposed solutions met the bar; none satisfied all three success criteria. In particular, discoverability was a major issue, and none of the solutions were memorable enough to become habitual.

The third solution, the virtual cursor, was the most interesting, as it was the most versatile and could serve multiple purposes. I'd like to explore it further.

I did, however, learn a lot about both the problem and solution space and I enjoyed prototyping these to build my knowledge base.