Take care of your spreadsheets: tips to improve your data governance before an incident


Even the most secure organizations fall victim to cyberattacks. To prepare, organizations should identify and secure or remove certain forms of high-risk data on their systems.

Cybercriminals have been taking advantage of organizations’ loose information governance practices since the rise of so-called “double extortion” in 2020, in which threat actors encrypt key systems, steal data, and demand a ransom in exchange for decryption keys and deletion of the stolen data. In this article, we explain how ungoverned data collections can complicate an incident response and provide practical tips for improving data management.


Cyber-threat actors now often steal data and hold it for ransom, offering to delete it only upon payment of a ransom commensurate with the sensitivity of the data.

Although database theft does occur, it is more common for attackers to harvest sensitive files from network file shares. Too often, an organization’s file share serves as a digital “junk drawer,” storing random files that don’t fit neatly into a records retention scheme. It is pay dirt for the threat actor. Sophisticated cybercriminals will use automated means to collect files in bulk from various network locations – endpoints, file shares, and email accounts, for example. They will then present unstructured (or “flattened”) lists of tens or hundreds of thousands of files to organizations in an effort to prevent proper valuation of what was taken, and use time pressure to extort a higher payment.
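To see why flattened lists frustrate valuation, consider how a responder might triage one. A minimal sketch (the file paths and risk categories below are hypothetical): group the listing by extension and pull out the file types, such as spreadsheets and exports, that usually carry the bulk of the personal information.

```python
from collections import Counter
from pathlib import PurePosixPath

# Hypothetical flattened file listing of the kind a threat actor presents
listing = [
    "/shares/finance/payroll_2023.xlsx",
    "/shares/hr/benefits_export.csv",
    "/shares/it/migration/users_dump.csv",
    "/shares/marketing/logo_final.png",
    "/users/jsmith/Desktop/clients_backup.xlsx",
]

# Tally files by extension; spreadsheets and database exports usually
# hold the bulk of the personal information, so review them first.
by_ext = Counter(PurePosixPath(p).suffix.lower() for p in listing)
high_risk = [p for p in listing
             if PurePosixPath(p).suffix.lower() in {".xlsx", ".csv"}]

print(by_ext)
print(high_risk)
```

Even this crude grouping shows the problem: with only file names to go on, a responder cannot tell which spreadsheet holds twenty records and which holds twenty thousand, which is exactly the uncertainty the threat actor exploits.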

Whether or not threat actors dump these large data caches on the dark web, the response can be daunting, as notification to those affected usually must follow. It is often necessary to send all or most of the data cache for eDiscovery processing to identify all personal information exposed and to whom it belongs.

eDiscovery has become a major incident response cost, and victimized organizations have a strong interest in limiting its reach. The number of individuals affected drives notification and a host of related costs, and is also a key factor in exposure to legal action. Frankly, an incident with a small group of affected individuals is a much less attractive class action target.

Formal records, scheduled and classified according to their sensitivity, are not the problem. Because they are under governance, official records tend to be appropriately secured, including through encryption. It is the handling of file copies, particularly “ephemeral” or “loose” ungoverned copies, that tends to inflate the size of affected populations.

Consider an export file created for a project to migrate data from one system to another. It contains information about 20,000 people, yet it sits unencrypted on a file share or a single employee’s workstation. Files like this – and spreadsheets in particular – accumulate on a network and can double the population of individuals affected by a network compromise.

What organizations need to do

Organizations should recognize this particular threat and take steps to reduce their ransomware blast radius. In other words, they must assume that the data on their networks is at risk of being stolen and take steps to minimize the potential impact of the theft.

This can be done through a range of technical means, including implementing segmentation and privilege minimization. However, we are focusing here on document and information governance, and we make two suggestions:

  1. Organizations should implement a workable policy to govern user behavior. Many organizations have clean desk and convenience copy rules in place to govern physical copies. The rule we are considering is the electronic equivalent. We say “workable” because any rule that governs the use of sensitive convenience copies has the potential to hamper productivity. A good rule will treat risk reasonably, take advantage of available technology, and be acceptable to users.

  2. Organizations should consider performing periodic network scans. This means scanning the network to find problematic files before a malicious actor does. These scans serve the dual purpose of identifying problematic files and verifying compliance with data policies. Tools and services are available, and as with most security solutions, solid implementation is essential and no single tool is likely to do all the work. With proper investment and implementation, however, an organization can eliminate the low-hanging fruit and mitigate a significant portion of its risk.
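At its simplest, such a scan walks the file shares and flags files whose contents match personal-information patterns. The sketch below illustrates the idea only; the file extensions, patterns, and share path are assumptions, and commercial data-discovery tools handle many more formats, validate matches, and manage false positives.

```python
import re
from pathlib import Path

# Hypothetical personal-information patterns; real tools use many more,
# plus validation (e.g., checksum tests on account numbers).
PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "sin_like": re.compile(r"\b\d{3}[- ]\d{3}[- ]\d{3}\b"),  # e.g., 123-456-789
}

def scan_tree(root, exts=(".csv", ".txt", ".log")):
    """Walk a directory tree and report text files matching any pattern."""
    findings = []
    for path in Path(root).rglob("*"):
        if path.is_file() and path.suffix.lower() in exts:
            try:
                text = path.read_text(errors="ignore")
            except OSError:
                continue  # unreadable file; skip rather than fail the scan
            hits = sorted(name for name, rx in PATTERNS.items()
                          if rx.search(text))
            if hits:
                findings.append((str(path), hits))
    return findings
```

Run periodically against the shares (for example, `scan_tree("/shares")`), a routine like this surfaces the loose export files and spreadsheets discussed above so they can be encrypted, relocated, or deleted under policy.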

Takeaways

Ransomware regularly involves data theft. The threat actor will look for sensitive data wherever it resides. Organizations that have policies and controls to protect their most sensitive records may still have ungoverned loose data on their systems. To avoid a costly eDiscovery exercise and surprisingly large notification obligations, organizations should adopt digital clean desk and convenience copy policies and engage in regular network scanning.


The content of this article is intended to provide a general guide on the subject. Specialist advice should be sought regarding your particular situation.

