Sentiment survey
Privacy review is a complex process with multiple moving parts: a product manager or product owner sets up a project; the privacy program manager identifies the risks the project faces and the mitigations that should be implemented; and an engineer or non-technical employee then submits evidence that those risks have been addressed. Technical auditors evaluate this evidence, which might result in the product team making some changes, and then the project is launched.
Privacy review was a fairly new process at Meta, so we had to keep refining and testing it. It was crucial for us to collect feedback from our users end to end.
One of the biggest complaints we kept hearing from technical and non-technical folks alike was that the evidence submission part of the privacy review process, in particular, was taking too much time and holding up their projects. So I joined forces with the product designer and the UX researcher to design a simple survey (one that wouldn't add yet another step to an already complicated and time-consuming process) and to figure out the best way to show it to users without making them feel like they had to jump through multiple hoops just to submit their project for privacy review.
Initial brainstorm with the PM, product designer and engineers.
Based on the UX researcher's learnings from extensive cognitive testing, I developed and refined a simple survey and got it approved by all the stakeholders, including the PM, Legal, Engineering, and other leaders. I also made sure that the survey was consistent with other surveys throughout the whole privacy process (not limited to privacy review). This turned out to be the easiest part.
The UX researcher and I created and refined this survey based on extensive cognitive testing.
The biggest constraint was that we needed this survey to get as many responses as possible, and those responses needed to be unbiased. Well before my time at Meta, the team had implemented a similar survey, made it optional, and noticed that only those who were completely dissatisfied with the process left any feedback.
As we started figuring out the logistics of showing this survey to the user, the product designer, the engineers, and I quickly ran into some snags. To make the survey mandatory, we would need the user to take it right before submitting their project for review, and we would need to make it clear that their project would not be submitted until they completed the survey. We also had to make sure they both completed the survey and submitted their project, so that their work was not lost.
Here are some options that we explored in our brainstorm:
Incorporate the survey into the evidence submission process. The last page of the process would show the survey, after which they could complete the evidence submission workflow.
Why this wouldn't work: A variety of reasons!
The evidence submission process had an automated verification step as a first line of defense. This was a newly developed step that we were still refining. These automated verifiers would first evaluate the evidence quickly to weed out low-quality evidence (think things that can be automatically checked: format of dates, duplicate rows, mismatched data, etc.).
We needed the user to go through this part first and give feedback based on their experience with it. Without running these automated verifiers, the process wouldn't be complete. That means the users' responses to the survey would only be valid if they gave them after the evidence submission process was well and truly complete; that is, after the verifier ran.
There's also an acknowledgment step that the user has to complete, where they enter their name to "sign off" on the evidence submission. The survey would show up before this acknowledgment, which would be a confusing experience. (We also very briefly considered including a note in the acknowledgment step that the submission process was almost complete. Why this wouldn't work: many of the reasons mentioned above. Also, it would reduce the whole thing to a trivial detail that the user could easily miss.)
Bring up the survey on a different interface. The privacy review was a comprehensive process involving multiple product surfaces, so throw the survey into another step, on another product.
Why this wouldn't work: We needed feedback specifically on the evidence submission process. The survey being placed elsewhere would muddy our results.
At the end of the evidence submission process, make the "Submit" button perform a different function. Lead the user to the survey before they submit and then force them to go through the survey.
Why this wouldn't work: Misleading button = horrible UX. There's also the risk that the user thinks they're done with submission when they click the button.
After a lot of playing around with the different pieces, we all realized that the solution had been staring us right in the face all along.
What if we ran the verifier first and gave the user the results of the automated verification? Then we could move them to a modal with the survey, incorporating the acknowledgment step into that same modal.
The user would run the automated verifier first.
The user would get the results of the automated verification. They'd review their submission or proceed based on the results.
The user would then fill out the survey and then finish the acknowledgment in the modal before completing the evidence submission process.
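The three steps above amount to simple gating logic: verify first, let the user revise or proceed, and only complete the submission once both the survey and the acknowledgment are done. Here is a minimal sketch of that flow; all the function names, check names, and return values are my own illustrative assumptions, not Meta's actual implementation:

```python
# Hypothetical sketch of the final evidence-submission flow.
# All names and checks here are illustrative assumptions.

def run_automated_verifier(evidence):
    """First line of defense: cheap automated checks on the evidence
    (date formats, duplicate rows, mismatched data, etc.)."""
    issues = []
    if not evidence.get("date_format_ok", True):
        issues.append("bad date format")
    if evidence.get("has_duplicate_rows", False):
        issues.append("duplicate rows")
    return issues

def submit_evidence(evidence, survey_answers, acknowledgment_name):
    # Step 1: run the automated verifier before anything else.
    issues = run_automated_verifier(evidence)

    # Step 2: surface the results; the user revises or proceeds.
    if issues:
        return {"status": "needs_revision", "issues": issues}

    # Step 3: the survey and the acknowledgment live in one modal,
    # so the submission only completes once both are done.
    if survey_answers is None or not acknowledgment_name:
        return {"status": "incomplete"}

    return {"status": "submitted"}
```

The key design point is that the survey sits between verification and sign-off, so feedback reflects the full evidence-submission experience without any risk of the user's work being lost.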
We also incorporated a few more thoughtful details into this workflow.
If the user had already submitted a survey in the last X days, they could skip this flow and go straight to acknowledgment and submission.
If the user was completing multiple evidence submission processes for multiple components in their project, they only had to go through the survey once.
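These two skip rules boil down to a small eligibility check. A hedged sketch follows; the 30-day cooldown stands in for the unspecified "X days," and all names are my own illustrative assumptions:

```python
from datetime import datetime, timedelta

# Hypothetical sketch of the survey skip rules. The 30-day cooldown
# and all names are illustrative assumptions, not Meta's actual values.
SURVEY_COOLDOWN = timedelta(days=30)

def should_show_survey(user_last_survey_at, project_already_surveyed, now=None):
    """Return True if the user should see the survey before acknowledgment.

    user_last_survey_at: when this user last completed the survey (or None)
    project_already_surveyed: True if the survey was already completed for
        another evidence submission in the same project
    """
    now = now or datetime.now()

    # One survey per project, even across multiple evidence submissions.
    if project_already_surveyed:
        return False

    # Skip users who answered a survey within the cooldown window.
    if user_last_survey_at is not None and now - user_last_survey_at < SURVEY_COOLDOWN:
        return False

    return True
```

Users who skip the survey under either rule go straight to acknowledgment and submission.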