Sentiment survey

Privacy review is a complex process with multiple moving parts: a product manager or product owner sets up a project; the privacy program manager identifies the risks the project faces and the solutions that should be implemented to mitigate them; and an engineer or non-technical employee then submits evidence that the risks have been addressed. Technical auditors then evaluate this evidence, which might require the product team to make some changes before the project is launched.

It was crucial for us to collect feedback from our users end to end. Privacy review was a fairly new process at Meta, so we had to keep testing and refining it.

One of the biggest complaints we kept hearing from technical and non-technical folks alike was that the evidence submission part of the privacy review process in particular was taking too much time and holding up their projects. So I joined forces with the product designer and the UX researcher to design a simple survey, one that wouldn't add yet another step to an already complicated and time-consuming process, and to figure out the best way to show it to the user without making them feel like they had to jump through multiple hoops just to submit their project for privacy review.

Initial brainstorm with the PM, product designer and engineers.

Based on the UX researcher's learnings from extensive cognitive testing, I developed and refined a simple survey and got it approved by all the stakeholders, including the PM, Legal, Engineering, and other leaders. I also made sure the survey was consistent with other surveys throughout the entire privacy process (not limited to privacy review). This turned out to be the easiest part.

The UX researcher and I created and refined this survey based on extensive cognitive testing.

The biggest constraint was response quality: we needed the survey to get as many responses as possible, and we needed those responses to be unbiased. Earlier (well before my time at Meta), when the team implemented a similar survey and made it optional, they noticed that only those who were completely dissatisfied with the process left any feedback.

As we started figuring out the logistics of showing this survey to the user, the product designer, engineers, and I quickly ran into some snags. To make the survey mandatory, we might need the user to take it right before submitting the project for review. They would also need to be told that their project was not yet submitted for review, at least not until they completed the survey. And we needed to make sure they completed both the survey and the submission so that their work was not lost.

Here are some options that we explored in our brainstorm:

Why this wouldn't work: A variety of reasons! 

We needed the user to go through the evidence submission process first and give feedback based on their experience with it. Without running these automated verifiers, the process wouldn't be complete. That means users' responses to the survey are only valid if they finish it after the evidence submission process is well and truly complete; that is, after the verifier runs.

Why this wouldn't work: We needed feedback specifically on the evidence submission process. Placing the survey elsewhere would have muddied our results.

Why this wouldn't work: A misleading button makes for horrible UX. There's also the risk that users think they're done with the submission as soon as they click the button.

After a lot of playing around with the different pieces, we all realized that the solution had been staring us in the face all along.

What if we ran the verifier first and gave the user the results of the automatic verification? Then we'd move them to a modal containing the survey, into which we'd incorporate the acknowledgment step.

The user would run the automated verifier first.

The user would get the results of the automated verification. They'd review their submission or proceed based on the results.

The user would then fill out the survey and complete the acknowledgment in the modal before finishing the evidence submission process.
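To make the ordering concrete, here is a minimal sketch of the reordered flow. All of the names here (`runVerifier`, `reviewResults`, `showSurveyModal`, `submitEvidence`) are hypothetical stand-ins, not the actual internal tooling; the point is only the sequencing, with verification completing first and the survey-plus-acknowledgment modal gating the final submission so the user's work is never lost.

```typescript
type VerifierResult = { passed: boolean; issues: string[] };

const steps: string[] = []; // records the order in which each stage ran

// 1. The automated verifier runs first, so feedback reflects the full experience.
function runVerifier(): VerifierResult {
  steps.push("verify");
  return { passed: true, issues: [] };
}

// 2. The user sees the verification results and either fixes issues or proceeds.
function reviewResults(result: VerifierResult): boolean {
  steps.push("review");
  return result.passed;
}

// 3. The survey and the acknowledgment step live in the same modal.
function showSurveyModal(): { rating: number; acknowledged: boolean } {
  steps.push("survey");
  return { rating: 4, acknowledged: true };
}

// 4. Evidence submission happens only after survey + acknowledgment.
function submitEvidence(): void {
  steps.push("submit");
}

function completeSubmission(): string[] {
  const result = runVerifier();
  if (!reviewResults(result)) return steps; // user goes back to fix issues
  const survey = showSurveyModal();
  if (survey.acknowledged) submitEvidence();
  return steps;
}
```

Gating `submitEvidence` on the acknowledgment inside the modal is what makes the survey effectively mandatory without a misleading button: the user can't finish submission without passing through it, and nothing they've done is discarded along the way.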

We also incorporated a few more thoughtful details into this workflow.