
Organizational Placement of Privacy

Question for the community: where should a Chief Privacy Officer (or, more broadly, the privacy function) report?  Some alternatives include:

  1. Counsel’s office: Since privacy is a legal matter, it stands to reason that compliance would benefit from being embedded with the general counsel.  On the other hand, counsel is often positioned as a separate function to demonstrate objectivity and independence from operations.  Moreover, since lawyers are trained to look at situations through a legal-risk lens, they are sometimes less able to “get to YES” and truly embed privacy in operations.  Operations folks may view their Legal colleagues as providing “sign-off,” and that perception might extend to privacy compliance.
  2. Risk Management & Compliance: Again, the alignment has some logic, since privacy provides a set of requirements that are overlaid on operational processes, and one should manage the risk of non-compliance.  However, as with assigning privacy to the Counsel’s office, Risk and Compliance are often organizationally separate to maintain objectivity and independence.  As a result, there will likely be challenges in embedding privacy into operational processes to achieve Privacy/Data Protection by Design.
  3. Office of the Chief Data Officer: The CDO is tasked with understanding the full breadth of data for purposes of deriving value and helping the organization leverage data in existing and new initiatives.  As a result of developing and maintaining the inventory of an organization’s data, the CDO is in a natural position to assess the applicability of privacy requirements and embed them in business processes.  The challenges include that the CDO may be perceived as having a conflict of interest by owning privacy compliance alongside data-leverage goals (in much the same way that a CIO has a conflict of interest in owning the CISO function).  Another challenge is that CDOs don’t always own all data in the organization, instead focusing on the data to be leveraged or monetized.  This leaves key gaps – such as employee data.
  4. Office of the CIO or CISO: The CISO is tasked with protecting data and is often looked to when there are data incidents.  The CISO already runs operational processes for embedding security requirements and for monitoring and responding to issues, so adding privacy requirements would seem like a logical extension.  Moreover, the CIO and CISO are well versed in implementing tools and extensions, which will be required for an effective program.  Privacy professionals will be quick to point out that privacy requirements extend well beyond security, and compliance requires a different level of understanding of the nature of data and how it’s used; a privacy breach may exist where no “traditional” security breach has occurred.  Moreover, privacy requirements apply to information and processes across an organization – not just those within scope of the CIO.  You could have an entire privacy awareness curriculum that never mentions technology, instead focusing on how people handle information.
  5. Operations (COO): Having privacy report to the COO can make sense, depending on the organization.  Whereas privacy has been around for many years, the passage of landmark privacy legislation – with significant consequences for non-compliance – has very quickly elevated its importance in organizations, making it a Board-level or C-suite priority in some cases.  Having it report to the COO gives it prominence and positions it as aligned with the entire company.  This helps enable the implementation of privacy processes as embedded components of business processes.  If done right, the result is a less disruptive but more effective program.  The downside is that unless the organization is a very data-focused company, privacy may get lost among the COO’s other priorities, and may become the target of political struggles.

To be sure, any of these models can work if provided with the appropriate leadership, support and oversight.  Moreover, the culture of the company and the nature of its business can also influence the appropriate structure.

Privacy is at a crossroads.  On the one hand, the emerging interest and concern from consumers (and therefore legislators) puts pressure on companies to acknowledge their responsibility to handle personal information properly.  On the other hand, since privacy has been around for a while and is conceptually familiar to executives, is there a level of privacy fatigue being felt?  As a result, are companies less motivated to address the risks, instead adopting a wait-and-see attitude?


How effective are privacy programs?

Background:

In September 2019, a group of 100 data leaders from respected New York financial institutions was asked whether they had heard of the General Data Protection Regulation (GDPR – the far-reaching European law governing how EU citizens’ personal information is handled around the world); 5 hands went up.  When asked the follow-up question – how many had heard of the California Consumer Privacy Act (CCPA) – 2 hands went down.

On December 26, 2019, CNN published a story explaining why consumers were suddenly receiving so many privacy notices.  The article summarized the CCPA and explained, at a high level, the events that led legislators to pass the law.

Over the summer, a small group of CFOs was interviewed; they felt that GDPR is a mess, that readiness was a waste of money, and that compliance is being addressed by “someone else”.

Problem statement: 

Companies want to increase the degree to which they store and process personal information, but in an effort to protect the rights of individuals, lawmakers are seeking to reduce the number and severity of incidents by imposing regulations.

Companies are making big investments in initiatives to take advantage of the transformative potential of data.  This covers an incredible array of opportunities, from simply using data and analytics to enrich their products and services, all the way to inventing algorithms to mimic human thinking to improve the lives of millions.  

These initiatives all have one thing in common: they depend on high-quality data.  Vast amounts of it.  Increasingly pertaining to people.  Companies are building systems that pull together and combine data from a myriad of sources – internal and external.

Breaches are happening – bigger and more impactful.  In 2019, records containing personal data were being stolen at a rate of over 15,000,000 per day.  The consequences to organizations are significant – financial and reputational.  Regulators are stepping up their actions, conducting investigations, and imposing fines.  Companies are having to pivot to correct issues and address new requirements reactively, because many have failed to implement a data management framework that can efficiently adapt to regulatory changes.

Many companies don’t have a prominent leader assigned responsibility for privacy – a Chief Privacy Officer (CPO) or equivalent.  Privacy is managed by legal or compliance groups as an adjunct to operations.  As a result, the people doing the day-to-day business of the company are not aware of their privacy responsibilities.  So is it any wonder that companies are mishandling personal data?

It’s time to act

More to the point, it has been “time to act” for a while, but the regulatory requirements around data privacy are not going to get simpler, and companies should consider implementing an operational framework, with appropriate tools, that enables them to adopt new requirements in a time- and cost-effective manner.

An effective program to enable the business to use data while also managing risk and ensuring compliance must reflect three interlocking components: Privacy, Data Governance and Risk Management.  Together, they can protect an organization while serving as a catalyst that accelerates it forward.

Privacy

Most companies have a Privacy compliance program.  However, the informal poll referenced above revealed that privacy compliance is not embedded in data programs.  This gap is very significant, since provisions of the laws speak very specifically to the plans data scientists are pursuing.  The result is that certain initiatives will have to slow down or be re-tooled.

And it’s not just data science teams who are dangerously disconnected.  Data science is probably a key area where data is being handled outside the boundaries set by the regulations (kept and processed for purposes beyond why it was collected, for example), but breaches are mostly tied to weak controls on the operational side of companies – ranging from how and where data is tracked and stored, to how it is processed or disclosed for business purposes.

“Privacy by design” has eluded organizations since it was first envisioned in 1995, in part because it is frequently promoted by an under-resourced parallel organization trying to apply one-size-fits-all techniques.  It doesn’t have to be like this.  Privacy programs can be structured to bridge to data users in a foundational sense, where privacy obligations are taken into account throughout project and operations lifecycles.  Risk goes down.

Addressing the challenge begins by assessing the current state of the privacy program against a privacy template or framework, such as the latest draft NIST Privacy Framework, and creating a gap analysis.  The framework is useful because it breaks down the objectives of a privacy program in a way that aligns with both regulations and the way organizations use data.  To be fair, the full Framework can be overwhelming for many companies – especially those not familiar with the NIST Security Framework, on which the Privacy Framework is based.  But this can be addressed by first distilling the NIST framework down to a more manageable version that still preserves the key elements.
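
To make this concrete, here is a minimal, illustrative sketch of what a distilled gap analysis might look like.  The five function names come from the NIST Privacy Framework; the 0–5 maturity scale and the scores themselves are hypothetical placeholders, not an assessment methodology.

```python
# Minimal gap-analysis sketch against a distilled NIST Privacy Framework.
# The five function names come from the Framework; the 0-5 maturity scale
# and the scores below are illustrative placeholders, not a methodology.

assessment = {
    "Identify-P":    {"current": 2, "target": 4},   # data inventory, risk assessment
    "Govern-P":      {"current": 3, "target": 4},   # policies, roles, awareness
    "Control-P":     {"current": 1, "target": 4},   # data subject requests, minimization
    "Communicate-P": {"current": 2, "target": 3},   # notices, transparency
    "Protect-P":     {"current": 3, "target": 4},   # safeguards, shared with security
}

def gap_report(scores):
    """Return functions ordered by largest maturity gap (target minus current)."""
    gaps = {name: s["target"] - s["current"] for name, s in scores.items()}
    return sorted(gaps.items(), key=lambda kv: kv[1], reverse=True)

for function, gap in gap_report(assessment):
    print(f"{function:<15} gap = {gap}")
```

Sorting by the largest gap gives a simple, defensible starting point for prioritizing remediation discussions.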

The gap analysis forms the basis for discussing how to enhance existing privacy efforts to achieve compliance in a deliberate, sustainable, pragmatic way.  If done right, it can be scaled – whether down to a small privacy team of, say, 2-3, or up to a full enterprise-level team.  This also allows a more focused approach to address specific pain points, including:

  • Compliance with GDPR or CCPA, which might range from early-stage assistance to specific process solutions (e.g., data subject access requests, data inventory upkeep, privacy-by-design, training and awareness, etc.)
  • Consideration for placement of the program, to integrate into company culture; companies are struggling with where to assign privacy, if not in Legal, and it’s landing with the CISO, who often needs help getting ramped up
  • Operationalizing Privacy, making the program resilient and sustainable, incorporating activities such as: 
    • Strategic oversight and stewardship, including obtaining executive and Board support
    • Monitoring for legislative changes, 
    • Updating and implementing policy,
    • Risk assessment, 
    • Process and control documentation and testing, 
    • Integration with business and IT change management, 
    • Incident management, escalation and resolution, 
    • Vendor management, and 
    • Contract review.

Data Management

Data programs are a high priority for CEOs – over 95% believe that leveraging data is key to continued success and to defending against external disruption.  Yet Gartner concludes that 85% of data projects fail.  How is this possible?  Oftentimes, data initiatives are launched without implementing basic management and governance techniques.  Objectives are not defined at the outset; C-levels and the Board aren’t clear about what they are asking for, and may not understand the path to get there – or the cost.

Introducing data management and governance discipline to create the data equivalent of “scientific method” can dramatically reduce risk and increase the chance of success.  Many companies – especially those in regulated industries – have records management programs that can be adapted to provide a management framework for data to be leveraged for monetization or through analytics or AI initiatives.  

The value proposition is to implement sufficient management and governance activities to:

  • Provide transparency and accountability into the program, including ethics and legality,
  • Ensure that data is handled in a way that doesn’t violate compliance obligations, whether contractual or regulatory,
  • Provide shared-service capabilities, including inventory, procurement, tracking and disposition (a minimal sketch of an inventory record follows this list),
  • Create logical interfaces and touch-points into privacy, security, internal audit, compliance and legal programs,
  • Close the gap between CEO expectations and the practical success rate of data projects, and
  • Expose the relative value and sensitivity of data to enable proper risk and threat management, in collaboration with others, such as a Chief Information Security Officer.
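
As an illustration of the inventory capability mentioned above, here is a minimal sketch of what a shared data-inventory record might capture.  The field names and example values are hypothetical and would be tailored to an organization’s own data taxonomy.

```python
# Illustrative shape of a shared data-inventory record; field names are
# hypothetical and would be adapted to the organization's taxonomy.
from dataclasses import dataclass, field

@dataclass
class DataInventoryRecord:
    dataset: str                 # logical name of the data set or feed
    owner: str                   # accountable business owner / steward
    source: str                  # internal system or external provider
    contains_pii: bool           # drives privacy obligations and request handling
    sensitivity: str             # e.g., "public", "internal", "confidential"
    purposes: list = field(default_factory=list)  # approved processing purposes
    legal_basis: str = ""        # contractual, consent, legitimate interest, etc.
    retention: str = ""          # disposition rule, e.g., "7 years after contract end"

# Example entry: an external marketing feed containing personal data
record = DataInventoryRecord(
    dataset="prospect_marketing_feed",
    owner="Marketing Operations",
    source="third-party data broker",
    contains_pii=True,
    sensitivity="confidential",
    purposes=["campaign targeting"],
    legal_basis="legitimate interest",
    retention="delete 24 months after last contact",
)
```

Even a handful of fields like these creates the touch-points into privacy, security and audit programs described above.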

Information Risk Management

In a metaphorical sense, data programs are taking the jewels out of the safe and passing them around.  Handling high-value assets by definition increases the risk of theft or breach, compared to keeping them locked up.  But they must be handled in order to derive value.  Many companies have built information risk or IT risk management capabilities over the last several years; the question is how well they are tied into data initiatives or aligned with the way data is used.  Given that 15,000,000 records are breached every day, one might suggest “not very”.

In the context of the increased use of data for market-facing benefit, information-related risk needs to be assessed in a more focused way.  As a discipline, IT RM has created a good foundation; however, it frequently aligns with core IT processes like strategy, architecture, change management and security, and not with data.

Information risk management can provide a critical interface between a data leverage program and a privacy/compliance program.  The techniques used to assess information risk result in key insights into the nature, relative value, uses and threats to information.  This helps direct risk-mitigation resources to align with the risk.  Specifically, it helps to recognize whether risk can be mitigated through, say, security controls, or whether the employee community needs tools that better align with their jobs (obviating the need for them to find their own solutions to business problems), or whether increasing awareness can help people make better judgements.  

Companies should consider identifying, categorizing and managing risk by looking at initiatives through an information lens – as opposed to a technology lens.  This changes the dialog with business stakeholders, which increases their understanding and appreciation of what could go wrong, what is acceptable residual risk, and the steps needed to bridge the gap.  

As indicated, IT RM in the marketplace has achieved a level of maturity, and there exist opportunities to adjust the scope and approach to more effectively identify and manage information-related IT risks, which, arguably, can help manage overall financial, regulatory and brand exposure for companies.

Summary

Companies are increasing their use of data at a tremendous rate – and they should.  The opportunities to gain competitive benefit are exploding.  But the risk and consequences of missteps are growing as well.  By implementing data governance and integrating risk management and compliance in a pragmatic way, organizations can continue to explore the ways data leverage can provide benefits, while taking proportional measures against events that can impede progress.  


Does Privacy Need Disrupting?

Executive Summary

When it comes to the use of data in a business context, there are a few absolute truths: (1) businesses will continue to gather and process more and more information about people to meet their goals.  (2) We will continue to see larger and more far-reaching data events involving personal information.  And (3) regulators will continue to respond with increasingly complex requirements around the handling of personal information.

This paper reflects on the trajectory data use is taking within the business environment, and explores some of the challenges the privacy profession faces in trying to keep pace.  The combination points to the inevitability of catastrophic data incidents.

But as in so many other industries, modern technology may hold the answer to managing the risk.  This paper goes on to discuss how, through the measured deployment of disruptive technologies, the privacy profession may find a way to support the acceleration of data use in the business, while managing risk and pursuing compliance.

Background

The thing about black swans is that they are both predictable and unpredictable – you know they are going to happen, you just can’t anticipate when or what form they will take.  In the period of one week in December, over 600 million records containing PII (Personally Identifiable Information) were breached.  For perspective, that’s more than every man, woman and child living in the US, UK, Canada, Australia and Russia combined.

With the increasing volume of PII being collected and processed by organizations around the world, it was inevitable that something like this was going to happen.  Moreover, it will happen again – and bigger – from triggers and vulnerabilities on which the risk community is not focusing. And no global organization wants to be named in a headline that talks about hundreds of millions of records being compromised.  

About data

We live in an age where information is emerging as a truly leverageable resource for companies around the world, enabled by the incredible pace of change in technology and analytics capabilities.  The opportunities to improve customer experience are growing exponentially.  To be sure, customers now measure their own satisfaction – and loyalty – based on capabilities offered by service providers that were not even possible a few short years ago.  And companies are doubling down on investment to outpace their competition, or in many cases – in the face of disruptive startups – to ensure their very survival.

Much of the data at the heart of the most promising innovations is in some way tied to individuals — whether traditional PII or PHI or new data around people’s movements, tastes and behaviors, spun off from IoT sensors, new analytics technologies or apps used by individual consumers where they are knowingly or inadvertently contributing data.

We also see that as some companies push the boundaries, or in the aftermath of high-profile data incidents, lawmakers are reacting by implementing far-reaching legislation to protect the rights of individuals.  Complying with these laws is both a challenge and an imperative for all organizations – especially forward-looking global organizations – as they navigate uncharted waters and as regulations emanating from different jurisdictions overlap and conflict.

Given the pace and trajectory of developments in technology and data, and the scale and frequency of data events, it’s reasonable to predict that there will be more breaches in the future – both larger in scale and more impactful.   Moreover, the increasing number and complexity of regulatory requirements – many triggered in the aftermath of data breaches – will place increasing burden on businesses, increasing internal tension between those developing new and innovative products and services, and those tasked with managing risk and ensuring compliance.  Finally, the potential ramifications of a breach, including the very significant fines, lost business or damage to the brand, can have lasting negative consequences to any organization.

Risk and privacy activity today

Today, risk management and privacy are heavily manual.  Risk management and privacy groups are relatively compartmentalized, often viewed as necessary but as imposing layers of bureaucracy, and addressed late in the process, after the business requirements are met; risk and privacy requirements are often viewed as disruptive and costly.

Whereas “Privacy by Design” seems like an obvious enabler, and has been a holy grail of sorts, passionately embraced by privacy practitioners, it’s often downplayed (or ignored) by business development groups.

The basic process around risk and privacy includes the following (a minimal sketch of one step, the PIA trigger check in item 4, follows the list):

  1. Privacy Policies that reflect requirements — whether legal, contractual, ethical, professional or industry parameters.  This establishes the inward- and outward-facing posture and serves as the foundation that drives every meaningful aspect of the program.
  2. Process documentation: business processes that handle PII are documented and analyzed to identify risk and to ensure that controls mitigate the risk and align with policy requirements.
  3. Data and application inventories: as a supplement to process documentation, knowing what data is on hand and what applications process it helps ensure that appropriate controls are in place.
  4. Trigger points within processes – IT or business processes – around changes or data events requiring action; certain activities, such as developing or changing an application that stores or processes PII, should trigger a Privacy Impact Assessment (PIA) to determine what risks exist and what controls are needed.
  5. Consultations and approvals, where subject-matter experts (SMEs) respond to inquiries and use research and professional judgment to provide recommendations.
  6. Risk assessments take place periodically to determine what’s changed and whether controls are aligned with risks to PII.
  7. Controls are tested periodically to ensure they are functioning as intended.
  8. Control weaknesses or failures are documented in findings reports requiring action by control owners.
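
To make the trigger-point idea concrete, here is a minimal sketch of an automated PIA trigger check that could sit inside change management.  The ticket fields and trigger rules are illustrative assumptions, not a prescribed rule set.

```python
# Minimal sketch of a trigger check (item 4): decide whether a proposed
# IT/business change should trigger a Privacy Impact Assessment (PIA).
# The ticket fields and the trigger rules are illustrative assumptions.

PIA_TRIGGERS = (
    "new_pii_collected",        # new categories of personal data
    "new_processing_purpose",   # existing data used for a new purpose
    "new_third_party_sharing",  # disclosure to a new vendor or partner
    "cross_border_transfer",    # data moves to a new jurisdiction
)

def requires_pia(change_ticket: dict) -> bool:
    """Return True if any privacy-relevant attribute is present on the change."""
    return any(change_ticket.get(flag, False) for flag in PIA_TRIGGERS)

# Example: a change that adds a new analytics vendor to an existing PII feed
ticket = {"summary": "Add vendor X to marketing pipeline",
          "new_third_party_sharing": True}
if requires_pia(ticket):
    print("Route to the privacy team for a Privacy Impact Assessment")
```

In practice, the flags would be derived from the change ticket itself or captured through a short intake questionnaire.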

The process is largely manual

The key point in providing this list is to highlight that all of these activities are manually intensive and are at best supplemented or enabled by tools such as GRC applications.  And while the enabling tools and applications help, these processes are only linearly scalable – meaning increases in the number of in-scope processes and applications require a proportional increase in resources — people — to accomplish the risk and compliance activity.  Moreover, while the most effective privacy programs distribute the activity across business constituents, and can gain some leverage and economies of scale, the costs still grow roughly linearly.

Most organizations face challenges in trying to increase their bench of privacy SMEs, since these roles require in-depth understanding of the organization as well as privacy expertise, and their judgment must be consistent from one SME to the next.  So maintaining consistent quality in the advice provided by SMEs is a risk and a challenge in itself.

So, in summary: the technology, data, business and regulatory environment is evolving rapidly, getting more complex, and becoming more critical to the continuing success of the organization.  Traditional privacy risk and compliance practices are heavily manual, reactive, burdensome and difficult to scale.  In combination, it’s clear that costly and damaging issues will continue to arise, and the tension between executing business strategy, managing risk and maintaining compliance will become even more pronounced.

What is changing…

In order to become better embedded and get ahead of business developments that leverage data, the privacy function needs to understand how the business plans to gather, manipulate and store PII, and overlay the risk and compliance requirements for its treatment and handling – which should result in certain adjustments to the business strategy.  

The privacy team has to understand all aspects of information risk management (leveraging an auditor’s playbook) to judge sufficiency of control, and be able to interface with the business, IT, IT security, legal, audit and compliance stakeholders, as well as with regulators.  

An important dimension of this is to have a framework for accepting residual risk.  This framework has to resist the “group-think” temptation to be either blinded by competitive pressure or the promise of fantastic profits, or lured into “risk elimination” mode.  Instead, it should allow for the analysis of risk and the mitigating effect of controls, and provide a transparent mechanism for accepting residual risk that escalates upward through leadership, depending on the overall risk/benefit balance.

But as discussed above, data “events” are bound to happen — whether breaches, losses or abuses — and privacy professionals too often are reactive.

Fundamental and disruptive change – leveraging Artificial Intelligence

Business, technology and data science will continue to accelerate, events will happen and regulations will come into effect.  The result is an increasing tension between opposing forces, where the resistant compliance side of the equation will almost always lose.

It’s time to take a fresh look at the model.  Increasingly, companies are recognizing the disruptive effect that data and analytics (including AI) will have on their business – the very action that increases the risk of privacy events discussed in this paper.

Privacy compliance can benefit from disruption.  

Ultimately, many aspects of privacy compliance will benefit from the disruptive use of AI and cognitive algorithms.   Given that privacy compliance combines documentation, analysis and judgment, there are opportunities to design and train algorithms to assist analysis, which will increase the timeliness and reach of the program.

Approach

First and foremost is the recognition that intelligent automation and leveraging AI is a journey – not a destination – and benefit is gained incrementally.  Focus begins on the more basic and mechanical aspects of the program, freeing analyst time for more sophisticated and complicated issues.

The privacy activities are then broken into categories, which helps drive priorities:

  • Routine daily tasks that need to be monitored for compliance, and where certain events trigger action;
  • Changes involving new applications, data, business ventures or data use cases; and
  • New requirements, such as new regulations, risk factors or data use restrictions.

As the process matures, more aspects of the program can be automated, leading to a state where increasingly sophisticated tasks are processed automatically and the SME is engaged at certain thresholds where, say, more judgment or a specific approval is needed.  If properly implemented, the algorithms are trained methodically (“crawl, walk, run”) and their decisions are logged to ensure consistency.
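
As an illustration of that threshold idea, here is a minimal sketch of how automated handling and SME escalation might be routed and logged.  The scoring rules and the 0.6 cut-off are illustrative assumptions, not a calibrated risk model.

```python
# Minimal sketch of threshold-based routing: low-risk items are handled
# automatically, higher-risk items are escalated to a privacy SME, and every
# decision is logged for consistency. The scoring and threshold are illustrative.
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("privacy_triage")

SME_THRESHOLD = 0.6  # hypothetical cut-off above which a human SME must review

def risk_score(item: dict) -> float:
    """Placeholder scoring: in practice this would be a trained model or rule set."""
    score = 0.0
    if item.get("contains_pii"):
        score += 0.4
    if item.get("new_purpose"):
        score += 0.3
    if item.get("external_sharing"):
        score += 0.3
    return min(score, 1.0)

def triage(item: dict) -> str:
    score = risk_score(item)
    decision = "escalate_to_sme" if score >= SME_THRESHOLD else "auto_approve"
    log.info("item=%s score=%.2f decision=%s", item.get("id"), score, decision)
    return decision

# Example: a routine internal request vs. a new external data share
print(triage({"id": "req-001", "contains_pii": True}))                             # auto_approve
print(triage({"id": "req-002", "contains_pii": True, "external_sharing": True}))   # escalate_to_sme
```

The value is less in the scoring itself than in the logged, consistent decisions, which speak directly to the SME-consistency challenge described earlier.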

Example activities that are candidates for automation:

  1. Process review against policy – using an algorithm to determine whether a proposed process might violate a privacy policy.
  2. Access monitoring – data stores containing information pertaining to people can be monitored for access, and AI can analyze access patterns for anomalies and trigger responses.
  3. Data access requests – routine operational transactions, such as requesting access to certain data, can be vetted and handled through Intelligent Automation.
  4. Transaction monitoring – AI sensors can be tuned to monitor a wide range of structured and unstructured transactions, guarding against inadvertent use of private information.
  5. Privacy event analysis/DLP – Data Loss Prevention (DLP) sensors can capture thousands of potential events on a daily basis.  AI can be used to risk-assess the events based on a variety of rules, and flag those exceeding a predetermined risk threshold for further investigation.
  6. Control analysis and testing – privacy programs often include a periodic testing cycle.  AI can be used to evaluate the results of testing to assess severity.
  7. Data discovery and inventory – All organizations have large volumes of unstructured data stored (and often forgotten) on network file servers.  AI can be used to traverse the file stores and build metadata tables around the data, and can be tuned to identify sensitive data, helping to ensure compliance (a minimal sketch of this follows the list).
  8. Data pseudonymization – AI can be used to implement pseudonymization techniques on a large scale, and can test whether the data can be re-identified.
  9. Contract review – oftentimes, additional data handling terms specific to large clients are embedded in contracts.  AI can be used to extract those terms and correlate them to specific data in the environment to help comply with the client’s requirements.
  10. Regulation review – AI can be used to highlight applicable sections of regulation based on ingested company policy documentation, which accelerates implementing compliance activity.
  11. Risk analysis – Algorithms can be trained to detect data use cases that are in conflict with policy.
  12. Residual risk assessment – Quantifying residual risk is very important for determining whether risks are sufficiently mitigated to meet corporate risk appetite, and whether a value proposition is still valid.  AI can help with the determination.
  13. Customer inquiries – Intelligent automation can be used to handle customer inquiries about where their data is held, and requests for erasure or transfer.  This can be extremely burdensome for companies with large numbers of individual customers.
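
To ground item 7, here is a minimal sketch of a data discovery scan: walking a file share, building a metadata table, and flagging files that appear to contain personal data.  Simple regular expressions stand in for the AI-based classification described above; the mount point and patterns are illustrative assumptions.

```python
# Minimal sketch of item 7 (data discovery and inventory): walk a file share,
# build a metadata table, and flag files that appear to contain sensitive data.
# Regex patterns stand in for the ML/AI classification described in the list;
# the path and patterns are illustrative assumptions.
import csv
import re
from pathlib import Path

PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "us_ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def scan_share(root: str, report_path: str = "data_inventory.csv") -> None:
    """Build a metadata table for text files under `root`, noting likely PII hits."""
    with open(report_path, "w", newline="") as out:
        writer = csv.writer(out)
        writer.writerow(["path", "size_bytes", "sensitive_hits"])
        for path in Path(root).rglob("*.txt"):
            text = path.read_text(errors="ignore")
            hits = [name for name, rx in PATTERNS.items() if rx.search(text)]
            writer.writerow([str(path), path.stat().st_size, ";".join(hits)])

if __name__ == "__main__":
    scan_share("/mnt/file_share")  # hypothetical mount point
```

A classifier trained on labeled documents could replace the regex layer; the surrounding inventory-building scaffolding would stay the same.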

Benefits

All of these use cases are within the capabilities of existing technology, and the decision to pursue any combination depends on specific circumstances.  The overriding point, however, is that they pave the way toward much greater flexibility and scalability for a privacy program that is coming under increasing pressure to perform.  The benefits are:

  • Greater flexibility
  • More scalability and leverage of resources
  • Lower risk of non-compliance
  • Less impact and burden to the business
  • Managed cost

Risks

At a high level, the risk is that the tools fail to detect or prevent an unauthorized use or disclosure of information pertaining to individuals.  This can happen because the algorithms don’t work as intended or are not properly implemented.  These are project and operational risks and should be managed through normal risk management processes.

But keeping in mind the current state and the trajectory business is on, the reality is that leveraging Intelligent Automation and Artificial Intelligence makes sense.  It’s going to happen.

Conclusion

When it comes to the use of data in a business context, there are a few absolute truths: (1) businesses will continue to gather and process more and more information about people to meet their goals.  (2) We will continue to see larger and more far-reaching data events involving personal information.  And (3) regulators will continue to respond with increasingly complex requirements around the handling of personal information.

Many industries are being disrupted by the creative and innovative use of data.  The privacy profession — increasingly in the spotlight, yet dependent on manual processes — is quickly becoming a good candidate for reinvention.  People will benefit, as reinvention will open avenues for businesses to provide new products and services designed to make their lives better, while at the same time lowering the risk to them of participating.