Todd is the CEO and Co-Founder of Okta, where he creates, communicates and implements the overall vision and strategy for the company.
Before Covid-19, profit was the driving force behind digital privacy infringements. Advertisers held almost all of the power and reaped the benefits, while consumers were left questioning how to gain control over their digital identities. Now, we’ve hit a crucial crossroads. Our data has the potential to track disease and save lives, which I believe far outweighs our need for total privacy. But where do we draw the line?
Take Apple and Google’s partnership to enable contact tracing via smartphones as an example. The cause is noble, and both have proactively taken steps to protect the privacy of participants. Still, it’s unclear how these companies will be held accountable. U.S. senators are seeking to answer that question with a new contact-tracing privacy bill, which is a reasonable short-term fix. However, we also need a broader law now to ensure consumer privacy both during Covid-19 and after it subsides.
This rare moment of bipartisan agreement should be used to pass legislation that both protects consumers’ right to privacy and empowers businesses to use data to solve the world’s biggest problems. It’s not a zero-sum game, but lawmakers must recognize some important distinctions to get it right.
Don’t conflate security and privacy.
We tend to conflate privacy with security, but often, the two can be at odds with each other. Typically, privacy protections restrict data intake, while security tools require data to provide proper protection. If we’re not thoughtful in our approach to balancing security and privacy in legislation, we could inhibit companies from using data to ensure strong security.
Effective legislation needs to identify and carve out essential security data use cases and enable them. Consider banks: They collect personal information about purchasing habits, which then allows them to alert customers of fraudulent charges. While this could be seen as an invasion of privacy, it’s critical to protecting their customers’ finances.
Not all data is created equal.
When it comes to risk, the type of data gathered and how an organization uses it matters. Instead of taking a “one-size-fits-all” approach, the U.S. should embrace a risk-based model to determine how different types of data can be used and protected.
For example, location data is high-risk, especially when it connects to an individual — we can all imagine scenarios for what could go wrong. However, if processed with privacy in mind, location data can be used in incredibly beneficial ways, like tracking the spread of Covid-19 or informing relief efforts following a natural disaster.
That’s why we need to be thoughtful about privacy, which means limiting nonessential data collection, requiring highly secure storage and keeping data anonymous unless absolutely necessary. For data to be considered low risk and warrant a lower bar for collection, it needs to be anonymized or otherwise be nonpersonal data that can’t be traced back to an individual.
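To make the distinction concrete, here is a minimal sketch of two techniques often used to lower data risk: pseudonymization (replacing a direct identifier with a salted hash) and threshold-based aggregation (publishing only coarse counts that can’t be traced to an individual). The function names, zone labels, and the threshold of five are illustrative assumptions, not prescribed by any law discussed here.

```python
import hashlib
import secrets
from collections import Counter

# Illustrative sketch only; names and thresholds are assumptions.

SALT = secrets.token_bytes(16)  # kept secret; discarding it breaks linkability

def pseudonymize(user_id: str) -> str:
    """Replace a direct identifier with a salted hash.
    Still re-identifiable by anyone holding the salt, so this
    alone does not make data 'anonymous' in the legal sense."""
    return hashlib.sha256(SALT + user_id.encode()).hexdigest()[:16]

def aggregate_locations(records, k=5):
    """Reduce location pings to per-area counts and suppress any
    area with fewer than k people (a k-anonymity-style threshold),
    so no published count points back to one individual."""
    counts = Counter(area for _, area in records)
    return {area: n for area, n in counts.items() if n >= k}

# Six people in zone-A, one person in zone-B: the lone zone-B
# record is suppressed rather than exposed.
records = [(f"user{i}", "zone-A") for i in range(6)] + [("user9", "zone-B")]
print(aggregate_locations(records, k=5))  # {'zone-A': 6}
```

The key point the sketch makes is that pseudonymized data alone would not meet the “can’t be traced back to an individual” bar described above, while suppressed aggregate counts plausibly could.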
Be clear about where the data responsibility buck stops.
Companies with a direct line of communication with consumers should have different privacy requirements than those that simply manage data on another company’s behalf. “Data controllers” are typically consumer-facing companies like Airbnb or PayPal. They are responsible for interfacing with users to make sure they understand what data is collected, how it’s used and how they can adjust their data-sharing settings. “Data processors” are typically business-to-business companies that help data controllers run some aspect of their business. They are responsible for processing data as permitted under their contract with the data controller.
Why is this distinction so important? Under the California Consumer Privacy Act (CCPA), an airline (the controller) is required to report on data it has collected about a specific user. That makes sense, but think about its 100-plus vendors (processors) that manage its user experience, rewards program, email marketing and so on, and that also access that anonymized data. Having them communicate directly with end users would create an unnecessary, burdensome web of regulation for businesses and ultimately confuse consumers. Enacting a shared responsibility model is critical to creating a streamlined approach to who owns what.
Don’t create laws that contradict enacted legislation.
Consumer privacy laws like the CCPA and the General Data Protection Regulation (GDPR) have already required companies to upend the actions they take to protect consumer data. To avoid the confusion and difficulty of complying with yet another law, I believe any new legislation must remain interoperable with existing laws or effectively replace them.
Making it straightforward to comply is especially important for small businesses. Small businesses increasingly seek customers through digital experiences that extend beyond state lines. A patchwork system of state laws can lead to expensive compliance costs for small teams, whereas federal privacy legislation could greatly simplify things. With small businesses across the country already struggling enough amid the impact of Covid-19, creating a law that will empower them, not burden them, becomes even more critical.
This moment illustrates just how powerful the data that makes up our digital identities can be. Collectively, that data can save lives and prevent the spread of disease, yet the word “data” still elicits fear in many of us. It doesn’t have to be that way, but getting privacy right at the federal level will require recognition of how broad, deep and applicable our digital identities are in this continually connected world.