Evidence submission

Highlights

Simplified complex concepts and processes for users

Significantly improved efficiency and reduced time to completion

Refined and evolved the process and content based on feedback

When the internal privacy processes were first built, every user went through the same evidence submission workflow. Whether they were launching a new feature for Instagram or a marketing campaign on Facebook, they had to answer the same complicated questions. This was a problem: the questions were highly technical, geared entirely toward an engineering audience. Users like marketers and project managers were unable to submit evidence and couldn't proceed with their non-technical projects. The qualitative UX researcher and the data scientist on the team presented substantial evidence showing that this one-size-fits-all process was stopping a significant portion of Meta employees from moving fast.

Non-technical verification was a new process we created so that non-engineering users like marketers and project managers could still submit evidence for work that handled user data but wasn't a new product or feature launch (e.g., marketing campaigns).

I worked closely with the UX researcher, data scientist, product designer, product manager, and engineers to create a new set of questions that let non-engineers provide sufficient evidence. These questions then went through rigorous testing and workshopping with our cross-functional partners, including privacy program managers, who helped engineers and non-engineering users embed privacy into their products, and tech auditors, who ensured that the evidence was up to par.

The new workflow had to meet three requirements: it should ask only for evidence the user could actually provide (not something impossible to prove, or something they had no knowledge of or didn't know how to obtain); the user had to be able to understand and supply what was being asked; and the tech auditor had to receive everything necessary to approve the submission without back-and-forth that delayed the launch. Privacy review is a critical part of launching anything at Meta, so we needed this workflow to be simple and approachable so that users didn't feel intimidated or put off.

The first step was to clearly define what "non-technical" meant. Only when the user checked all the right boxes and fully understood the nature of their project were they allowed to go through that workflow.

With the researcher and data scientist, I went through previous non-technical submissions, many of which had been rejected multiple times for insufficient evidence. We carefully studied what kind of data the users were asked to provide and what they were ultimately able to provide to unblock the privacy review process.

I also spoke to multiple non-technical users who had gone through the privacy review process and gathered their input on where they felt blocked or confused. I watched them work through the process and took notes on how different users approached it.

We also took some time to understand what privacy assessors from government bodies usually looked for when they audited Meta's privacy practices. This helped us make sure that there were no gaps when we built the new set of questions.

With all this context, I was able to adapt the technical workflow for a non-technical audience, with significantly simpler questions and little to no privacy-speak. Privacy is an inherently complex area, and one of the possible pitfalls is resorting to jargon. Not all users were well-versed in the nuances of privacy, and it was crucial to write everything in simple language.

First draft of the new questions: a document with multiple-choice questions for evidence submission. Simple, but inadequate.

The first draft reduced the time non-technical users took to submit evidence and get a response from technical audit by 85%. However, while users were able to answer all the questions without feeling blocked, their submissions continued to be flagged for insufficient evidence. The biggest lesson learned: there's such a thing as "too simple."

I gathered feedback from test users to refine the questions.

Second draft of the questions, with changes highlighted: more education, more clarity, and more context.

As I kept testing, learning, and evolving the questions, I continued to prioritize education, clarity, and context.

This new draft significantly reduced the number of submissions flagged for insufficient evidence.