When I used to work as a systems analyst and programmer, it was common practice to use production data for test and development purposes. In fact, if production data was not used, the test was deemed non-comprehensive and therefore seen as inadequately tested for actual business purposes. However, in the past decade this practice has been re-evaluated -- and rightly so, considering the increasing number of privacy breaches involving personally identifiable information (PII).
Windows application development and test environments are inherently less secure than production because data in them is typically exposed to a wider variety of potential insider threats: in-house testing staff, outside contractors and consultants, partners and offshore development shops.
Many incidents by insiders are a result of malicious intent, but a large number of them are caused by developers who are unaware of basic security needs or by accidents and negligence. No matter the reason, insider incidents can be very costly to both the company and the individuals involved.
Consider one of the many incidents in which a privacy breach resulted from an "inside job." In June 2006, a programmer hired by Sentry Insurance Co. in Wisconsin to create an application was imprisoned and fined $519,859 for attempting to sell more than 111,000 individuals' Social Security numbers and other PII to an undercover U.S. Secret Service agent. The programmer had been given the data for development and testing, and had sold it to others before being caught.
But not all insider threats are malicious; some result from unawareness and mistakes. I once attended a product demonstration for an e-discovery company. As it turned out, the data used in the demo was actual employee and customer PII. Following the presentation, the presenters told me they hadn't thought about the privacy or legal issues involved in using this data.
Insiders pose a great threat to sensitive information
Multiple studies confirm that insiders cause the majority of information security incidents and privacy breaches that occur.
A December 2008 study conducted by IBM's ISS X-Force research team reported a 30% increase in network and Web-based security incidents caused by insiders during the last half of 2008. Researchers attributed much of this increase to economic uncertainty, and the current U.S. economic situation provides even more motivation for malicious activities.
Cyber-Ark Software confirmed this upturn in insider-caused incidents in its December 2008 study. Results showed that 56% of "office workers" who responded were worried about job loss, and 58% of U.S. workers said they had "already downloaded competitive corporate data and plan to use the information as a negotiation tool to secure their next post." Seventy-one percent of respondents indicated that if they were laid off, they would take company data with them to their next employer, including "customer and contact databases, with plans and proposals, product information and access/password codes."
Data protection laws that protect personally identifiable information
Most data protection laws place significant and specific restrictions on how organizations can use PII. For example, any organization that collects PII from citizens in the 27 European Union countries must abide by Data Protection Directive 95/46/EC as well as any additional restrictions established in each member country.
The restrictions in most data protection laws outside the U.S. are based upon eight privacy principles. Of those eight, here are the principles that directly apply to restricting the use of personal information for the purposes of testing applications, systems, and any other type of IT test activities:
- Principle 1: Fair and lawful processing. Individuals must be notified as to why their PII is collected and how it will be used, but these notices rarely state that the personal information will be used for testing purposes. Because using PII for testing or development outside the stated purposes violates data protection laws, there is usually a legal obligation to mask or de-identify PII for testing whenever possible.
- Principle 3: Excessive data. This principle requires organizations to use the minimum amount of data necessary for the processing purpose. So, even when using PII for testing can be justified, the quantity must be limited. Using a subset of PII is more likely to comply with data protection requirements than using an entire database.
- Principle 7: Security. Organizations must use appropriate security measures to protect PII. What's appropriate depends on factors such as the organization's size and the type of PII used. Generally, larger organizations are expected to use more security. Similarly, the more sensitive the data and the more databases involved, the greater the security measures organizations are expected to implement. These measures include required training and awareness activities for personnel with PII access. And when an outsourced company is used, most data protection laws require contracts to specify the security requirements the vendor must follow, including training.
- Principle 8: Cross-border data transfer. Organizations cannot send PII across country borders unless the transfer is for legitimate business purposes and the destination country is deemed safe, or there is a specific cross-border agreement in place between the organization and the applicable countries. The EU does not consider the U.S. to be a safe country.
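The data-minimization idea in Principle 3 is straightforward to put into practice: rather than copying an entire production table into a test environment, draw a small, reproducible sample. The sketch below illustrates this, assuming hypothetical row and function names; the sample size and seed are illustrative, not prescribed by any law.

```python
import random

def sample_subset(rows, k, seed=42):
    """Draw a reproducible k-row sample so test environments hold only
    the minimum data needed, rather than the full production database.
    The seed makes test runs repeatable; k caps exposure."""
    rng = random.Random(seed)
    return rng.sample(rows, min(k, len(rows)))

# Hypothetical production extract: 10,000 customer rows.
customers = [{"customer_id": i} for i in range(10_000)]

# Only 100 rows ever reach the test environment.
test_rows = sample_subset(customers, 100)
print(len(test_rows))  # 100
```

Sampling limits how much PII is at risk, but the sampled rows are still real personal data, so masking or de-identification (discussed above) is still required.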
In a perfect world, Windows developers would never have to use PII for testing purposes. However, since using PII for testing is sometimes unavoidable, Windows managers should take the proper steps to ensure their shop complies with their applicable data protection laws. De-identifying or masking PII, limiting the amount used and ensuring appropriate contracts and safeguards are in place will help them achieve compliance and in turn help them avoid potential dire financial and legal consequences.
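To make the de-identification advice concrete, here is a minimal sketch of masking a customer record before it reaches a test environment. The field names, masking key and record layout are hypothetical; a real implementation would follow the organization's own data classification and key-management policies.

```python
import hashlib
import hmac

# Hypothetical secret kept outside the test environment; shown inline
# only for illustration.
MASKING_KEY = b"replace-with-a-managed-secret"

def pseudonymize(value: str) -> str:
    """Replace an identifier with a keyed hash so records stay linkable
    across tables without exposing the original value."""
    return hmac.new(MASKING_KEY, value.encode("utf-8"), hashlib.sha256).hexdigest()[:16]

def mask_ssn(ssn: str) -> str:
    """Keep only the last four digits, a common masking convention."""
    digits = [c for c in ssn if c.isdigit()]
    return "***-**-" + "".join(digits[-4:])

def de_identify(record: dict) -> dict:
    """Return a copy of the record suitable for test/dev use: direct
    identifiers removed, quasi-identifiers masked or generalized."""
    return {
        "customer_id": pseudonymize(record["customer_id"]),
        "name": "Test User",                # direct identifier replaced
        "ssn": mask_ssn(record["ssn"]),     # partial masking
        "birth_year": record["dob"][:4],    # generalized from full DOB
    }

sample = {"customer_id": "C-1001", "name": "Jane Doe",
          "ssn": "123-45-6789", "dob": "1970-06-15"}
print(de_identify(sample))
```

Keyed hashing preserves referential integrity across tables (the same customer ID always maps to the same token), which is usually what makes masked data usable for testing in the first place.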
ABOUT THE AUTHOR
Rebecca Herold, CISSP, CISA, CISM, CIPP, FLMI, has more than 17 years of experience in IT, information security, privacy and compliance and is the owner and principal of Rebecca Herold LLC. She is an adjunct professor for the Norwich University Master of Science in Information Assurance program and is writing her 11th book. Her articles can be found at www.privacyguidance.com and www.realtime-itcompliance.com.
This was first published in February 2009