The Risk in Adopting Templated Data Protection Policies

The New York Department of Financial Services fined a crypto company $1.2 million for not conducting proper information security risk assessments and for using templated policies without customizing them to fit the company.

In a matter of seconds, generative artificial intelligence (AI) can draft a reasonably passable information security policy for an organization. The policy will look much like the thousands of publicly available policies found on the Internet. It will likely contain the generally accepted elements of a mature information security program: risk assessment, a categorical listing of primary security controls, a section on incident response, and further sections on employee obligations and training.

The policy will likely also contain a placeholder, such as “[Company Name],” where the organization should, at minimum, customize the policy to reflect that it has been evaluated, approved, and adopted by the organization. There is risk in such ease of access to prefabricated policies, however.

Even before the Google transformer paper was published in 2017, giving rise to the recent explosion in large-language-model AI solutions, data protection policies were widely available on the Internet and just as widely cribbed by organizations looking for a good example from which to start, or in certain cases, end.

A quick web search results in numerous examples of companies’ adopted website privacy policies using unresolved placeholders such as “[Company Name]” or “[link].” Internally, certain organizations may have adopted data protection policies that reference, for example, Health Insurance Portability and Accountability Act (HIPAA) compliance, when the organization has nothing to do with healthcare.

On May 1, 2023, the New York Superintendent of Financial Services, Adrienne Harris, approved a consent order between her Department of Financial Services (DFS) and bitFlyer USA, Inc. (bitFlyer), a cryptocurrency exchange licensed with DFS under 23 N.Y.C.R.R. Part 200, the New York “Virtual Currency Regulation.”

The bitFlyer consent order shows, inter alia, the risk in adopting templated policies, as well as the risk that a current or former subsidiary runs in getting its data protection services from a parent, including a parent located in another jurisdiction.

Under the consent order, bitFlyer was required to pay a civil monetary penalty of $1.2 million and agreed not to seek any tax deduction or credit, nor to seek indemnification or reimbursement, including via insurance, in relation to the penalty.

The consent order and penalty arose in part from bitFlyer’s adoption of templated data protection policies without adequately tailoring those policies to bitFlyer’s specific data protection needs.

The consent order is grounded in DFS’s Cybersecurity Regulation, 23 N.Y.C.R.R. Part 500 (Part 500), a first-in-the-nation example of a state regulatory agency creating cybersecurity requirements for its regulated industry or industries without a direct legislative mandate. Part 500 was adopted effective March 1, 2017, and requires any organization that operates or is required to operate under a license or other similar permit under New York banking, insurance, or financial services laws to adopt and implement a comprehensive information security program to protect nonpublic information (NPI). NPI is defined, in turn, as including “all electronic information that is not publicly available information and is … business-related information of a covered entity, the tampering with which, or unauthorized disclosure, access, or use of which, would cause a material adverse impact to the business, operations, or security of the covered entity.” See 23 N.Y.C.R.R. § 500.1(g)(1).

In this regard, Part 500 is an outlier, as its defined focus concerns data not only relating to identifiable individuals, but also any information the compromise of which could create a material adverse impact on the organization.

By contrast, the New York State SHIELD Act, found at N.Y. Gen. Bus. Law § 899-bb, and the federal Gramm-Leach-Bliley Act (GLBA), 15 U.S.C. §§ 6801-6809, 6821-6827, and its supporting regulations focus instead on personal data relating to customers and employees of an organization. Neither regime sets a standard for the protection, for example, of mission-critical information not relating to an identifiable person.

Because of this, Part 500 stands apart from other regulatory regimes, and a financial institution subject to Part 500 would be ill-served relying on form-based or templated policies drafted, for example, for another financial institution regulated under GLBA.

As recounted in the consent order, however, bitFlyer’s apparent shortcoming was more mundane: failing to replace standard placeholders like “[Company Name]” with “bitFlyer” so as to at least indicate that the relevant policies had been customized to fit bitFlyer’s specific data protection needs.

As recounted in the consent order, “certain documents [comprising bitFlyer’s information security program] were clearly templates, one referring to bitFlyer USA as ‘ABC Company,’ and another referring to bitFlyer USA as ‘the Covered Entity.’” See In the Matter of bitFlyer USA, Inc., New York State Department of Financial Services, ¶ 16 (May 1, 2023).
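The kind of customization failure the consent order describes is mechanically easy to detect. As a purely illustrative sketch (the pattern list and function names below are this author’s assumptions, not anything DFS requires), a few lines of Python can flag common unresolved placeholders in a policy document before it is adopted:

```python
import re

# Illustrative markers drawn from the examples noted in the consent order
# and in publicly posted policies ("[Company Name]", "[link]", "ABC Company",
# "the Covered Entity"). The list is a sample, not an exhaustive standard.
PLACEHOLDER_PATTERNS = [
    r"\[[^\]\n]{1,40}\]",        # bracketed fill-ins such as [Company Name] or [link]
    r"\bABC Company\b",
    r"\bthe Covered Entity\b",
]

def find_placeholders(policy_text: str) -> list[str]:
    """Return any suspected unresolved template placeholders in a policy."""
    hits = []
    for pattern in PLACEHOLDER_PATTERNS:
        hits.extend(re.findall(pattern, policy_text))
    return hits

if __name__ == "__main__":
    sample = "This policy applies to all employees of [Company Name]."
    print(find_placeholders(sample))  # flags the unresolved placeholder
```

A scan of this kind catches only the mechanical symptom; it does not substitute for the substantive tailoring of the policy to the organization’s actual risks that the consent order demands.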

The consent order also noted that bitFlyer, which at one time had a Japanese parent and was still receiving support services from that entity, had adopted policies that “were English translations of Japanese originals” with some portions being “poorly translated, while others (such as graphs) were not translated at all.”

This issue of translation, although seemingly limited to an organization with international ties such as bitFlyer, has ramifications for many organizations, especially given the patchwork of data protection laws globally and in the United States specifically.

An organization subject to Part 500 with employees in New York, for example, must comply not only with the specific requirements of Part 500, but also with the differing requirements of the SHIELD Act and potentially a host of other requirements, such as those found in neighboring states like Massachusetts, which has its own set of cybersecurity rules under 201 C.M.R. 17.00.

Notably, although the SHIELD Act attempts to limit this complexity by including a “compliant regulated entity” caveat for organizations subject to and compliant with Part 500, for example, that caveat applies only to the extent the organization remains compliant. See N.Y. Gen. Bus. Law § 899-bb(1)(a)(iii); (2)(b)(i) (Part 500 caveat for a “compliant regulated entity”).

After a security incident, however, regulatory hindsight is often negative, finding gaps in compliance where the organization may not have identified gaps previously. Hence, the “compliant regulated entity” caveat under the SHIELD Act is good only until it isn’t, and organizations are often best served by complying with all potentially applicable standards, regardless of whether such a caveat may apply.

It is this complexity, however, that leads organizations to look for and sometimes adopt templated policies, as seen with bitFlyer.

In the first instance, those in an organization tasked with developing data protection policies, such as the IT or information security team, are often not trained in legal analysis. IT and information security teams are also often not trained in translating one set of legal requirements into another legal paradigm, resulting instead in conglomerate policies that mix mismatched terminology from various regulatory regimes. And adding to these pressures, many organizations lack the expertise to critique the policies proposed by the IT or information security team, resulting in a pronounced lack of policy quality control, as cited by DFS in the bitFlyer consent order.

Against this backdrop, proper regulatory translation is key: speaking the data protection language of the organization’s regulators, while acknowledging that a different regulatory lexicon may be required, depending on the circumstances.

An organization whose policies document the “technical and organizational measures” taken to protect company data clearly either has a Europe-centric focus, because that terminology originates in the European Union’s General Data Protection Regulation (GDPR), or has “borrowed” its relevant policy from an organization that does.

Understanding how an organization’s regulators express and interpret their specific data protection requirements is both difficult, in light of the increasing complexity among data protection regimes, and more important than ever, especially in light of the bitFlyer consent order.

Adding to this complexity, DFS-regulated entities are awaiting proposed amendments to Part 500, expected to formally take effect later this year, which will add to the breadth of an already detailed regulatory regime.

One of the proposed Part 500 amendments, for example, will require organizations to test their incident response plans on a yearly basis, a process that should immediately bring to light any gaps in a templated incident response plan not properly tailored to a particular organization’s needs. Because they are widely available on the Internet, incident response plans are a common locus for the kind of customization error noted in the bitFlyer consent order.

Given this, organizations of all shapes and sizes—not only those regulated by DFS—should inquire as to the origins of their data protection policies and assess whether those policies have been appropriately customized to fit the organization’s needs. And, as DFS noted in the consent order, such introspection is not static. Instead, it must be repeated as needed to ensure proper alignment between organizational risks, regulatory requirements, and the organization’s information security program.

Until AI can automate this process for us, such evaluation and adjustment of information security policies will remain a hallmark of any mature information security program and involve significant organizational—and human—engagement to remain effective.


F. Paul Greene is a partner and chair of the privacy and data security practice group at Harter Secrest & Emery. He can be reached at fgreene@hselaw.com.



From: New York Law Journal