Check-in instructions

Rethinking what and how we collect from partners, informed by data and driven by empathy.

User Research, User Interface Design, Prototype Development, A/B Experimentation

At Booking.com I mainly worked in the alternative accommodations unit, which provides homestays, apartments and similar holiday homes.

During my time as a UX Designer I worked on the pre-stay experience, making it easy for guests to prepare for their stay.

Why

One of the team's objectives is to minimise the risk of guests not being able to enter the property. In practice, this means helping partners reduce their operational workload and improve the offline guest experience.

While looking into what kind of information partners were providing us about the check-in process, we landed on a fundamental question: is the information we're collecting from partners reflective of reality and of what guests really need?

What

To answer this question, we started off by analysing the free text provided by partners and identified the following patterns (a rough sketch of this classification follows the list):

  • Text that was not immediately usable (examples we found included "Dance" and "just open the door")
  • A relatively large number of partners wanted guests to contact them directly for instructions
  • Some partners entered an alternative way to check in (but the reasons varied widely)
  • Input written in different languages, or in multiple languages at once
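To make these patterns concrete, here is a minimal, hypothetical sketch of how such free text could be bucketed automatically. The keyword lists, thresholds and labels are assumptions for illustration; they are not the actual analysis we ran.

```python
# Hypothetical sketch: bucketing partner free text into the patterns listed above.
# Keyword lists and thresholds are illustrative assumptions, not the real analysis.
import re
from collections import Counter

CONTACT_HINTS = re.compile(r"\b(call|phone|whatsapp|text|contact)\b", re.IGNORECASE)
ALTERNATIVE_HINTS = re.compile(r"\b(lockbox|key ?box|code|neighbou?r|reception)\b", re.IGNORECASE)

def classify(instruction: str) -> str:
    words = instruction.strip().split()
    if len(words) <= 4:                            # e.g. "Dance", "just open the door"
        return "not immediately usable"
    if CONTACT_HINTS.search(instruction):
        return "asks guest to contact partner directly"
    if ALTERNATIVE_HINTS.search(instruction):
        return "describes an alternative check-in"
    return "other / needs manual review"

samples = [
    "Dance",
    "just open the door",
    "Please call me 30 minutes before you arrive",
    "The key is in the lockbox next to the door; I'll send the code on arrival day",
]
print(Counter(classify(s) for s in samples))
```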

To dig a little deeper, we also interviewed six partners to understand how they used the current check-in instructions form and how they usually shared check-in information with guests.

We then combined those insights with guest feedback from a survey and developed our first iteration. We tested that prototype with 10 property owners/renters on usertesting.com.

An early design of the key collection form. The key collection hours and shared entrance are mentioned separately.

An iteration of the design of the key collection form. A more flexible and scalable alternative check-in method has been added.

Insights

The first round of learnings helped us identify two main issues:

  • Copy: it was not always clear or inclusive of all property types. For example, the term 'shared entrance' was not clear to everyone, nor relevant for campsites.
  • Flexibility: the form was not flexible enough to adapt to all the use cases partners had. For example, some partners told us that late check-ins are usually fine, but early check-ins depend on availability (such as an earlier guest leaving or the room still being cleaned).

These findings led to a second round of design changes and user validation aimed at tackling both issues.

The second round of usability tests showed that the copy could still be improved and that the form was still too rigid. For example, "private entrance" (an iteration of "shared entrance") caused some confusion among participants, who struggled to figure out what it referred to. We also learned that participants wanted to be able to enter more detailed instructions.

Our third and final iteration consolidated all these learnings by allowing partners to enter more details via free text, so they can give more context (like what guests need to do). We also added an alternative check-in method, allowing for more flexibility.
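For illustration, here is a rough sketch of how the final form's input could be modelled as data. The field names and the set of check-in methods are assumptions based on the flows described above, not the actual schema.

```python
# Hypothetical data model for the final key collection form.
# Field names and method kinds are assumptions, not Booking.com's actual schema.
from dataclasses import dataclass
from typing import Optional

@dataclass
class CheckInMethod:
    kind: str                           # e.g. "reception", "lockbox", "host meets guest"
    hours_from: Optional[str] = None    # e.g. "15:00"; None means no time restriction
    hours_until: Optional[str] = None   # e.g. "22:00"
    details: str = ""                   # free text for extra context ("ring the blue bell twice")

@dataclass
class KeyCollectionInstructions:
    primary: CheckInMethod
    # "Plan B": what guests can do when the primary instructions don't work
    alternative: Optional[CheckInMethod] = None

instructions = KeyCollectionInstructions(
    primary=CheckInMethod("reception", "15:00", "22:00", "Reception is just past the main entrance"),
    alternative=CheckInMethod("lockbox", details="The code is sent on the day of arrival"),
)
```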

Final key collection form

The final key collection form as we A/B tested with partners.
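As a side note on the A/B test itself, the sketch below shows one common way to check whether a difference between two form variants is statistically meaningful. The metric (share of partners submitting usable instructions) and the numbers are invented for illustration; they are not the experiment's results.

```python
# Hedged sketch of a two-proportion z-test for comparing two form variants.
# The metric and sample numbers below are made up for illustration only.
import math

def two_proportion_ztest(success_a: int, n_a: int, success_b: int, n_b: int) -> float:
    """Return the two-sided p-value for the difference between two conversion rates."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

# Hypothetical numbers: old form vs. new key collection form
p_value = two_proportion_ztest(success_a=420, n_a=1000, success_b=465, n_b=1000)
print(f"p-value: {p_value:.4f}")
```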

Learnings

Copy plays a major role in the success of the solution

From the usertesting.com sessions we saw that simply changing the words used helped partners better understand what was expected of them. Always make sure to involve your fellow wordsmith from the conception and research phases; it does make a difference!

Focus on the right questions

In our case, we saw from the research data that there are different reasons why partners need to provide alternative instructions, e.g. arrival outside check-in hours, lunch breaks, and so on.

When moving on to the design phase, we kept asking ourselves: how can we accommodate, in this dropdown, all the reasons why a partner needs to provide an alternative method of check-in?

And then we realised we were focusing on the wrong question. What mattered most to the end user was not why the partner isn't there, but what guests can do if the default instructions don't work. In other words: "What's plan B?"

Always design for the novice

We used data and empathy to drive our designs, but user tests showed us that what is easy for us to comprehend might not be for others. Default to creating experiences that work for our most novice and casual partners.

"The goal of this process isn't perfection - it's improvement."

Every time we looked at the prototype or got feedback, we would end up adding more and more things. At one point, we had to remind ourselves that we were not aiming to create the perfect final product (one that would be set in stone). We aimed to improve with every step we took, and only when we were confident we could move the needle enough did we move forward and build the product.

Thanks to

Shang Hsin Tsai (Product Manager) planned and prioritised strategically and helped me write this text;

Yan Jin (Project Marketing Manager) recruited partners, analysed the data and worked on adoption;

Marie-Anne Leuty (Copywriter) gave us a hand with the copy;

Soma Ray (User Researcher) and Kate Dieter (Partner Researcher) helped us out with the Usertesting.com scripts;

Shreyam Sinha (Back-end Developer) designed the database and back-end;

Chienwen Chen (Full-Stack Developer) worked on the front end;

Jonathan de Jong (UX Designer) designed prototypes, did research and facilitated user tests.