“Our perspective has always been that if models were not transparent, auditable, and explainable then risk could not be effectively determined" - Steve Roemerman, CEO
Dallas, TX (PRWEB) February 16, 2017
A growing consensus is beginning to define acceptable practices and legal risk for users of big data and analytics. In particular, two notable developments are posing significant legal risks for careless uses of big data.
First, a respected American technology policy group has issued a set of principles which may help U.S. plaintiffs who feel they have been harmed by Big Data and AI. The Association for Computing Machinery (ACM) US Public Policy Council recently issued a statement on “Principles for Algorithmic Transparency and Accountability.” The statement set out seven principles for the use of algorithmic models in business and government:
1. Awareness: … of the possible biases… and the potential harm that biases can cause to individuals and society.
2. Access and redress: …adoption of mechanisms that enable questioning and redress…
3. Accountability: Institutions should be held responsible for decisions made by (their) algorithms…
4. Explanation: …produce explanations regarding both the procedures …and the specific decisions…
5. Data Provenance
6. Auditability: Models, algorithms, data, and decisions should be recorded so that they can be audited…
7. Validation and Testing: Institutions should use rigorous methods to validate…
“Lone Star views the Principles issued by ACM as fundamentally sound and important to the evolution and growth of the analytics market,” said Lone Star’s CEO Steve Roemerman. “Our perspective has always been that if models were not transparent, auditable, and explainable then risk could not be effectively determined. This is why we built all of these capabilities into our TruNavigator and AnalyticsOS analytics platforms.”
In addition to the Principles from ACM, the EU General Data Protection Regulation (GDPR) imposes significant penalties for breaches of EU citizens’ data privacy. Organizations running afoul of GDPR can be fined up to 4% of annual global turnover or €20 Million (whichever is greater). The regulations are some of the most important changes in data privacy regulation in 20 years, and they will take effect next year.
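The upper fine tier described above reduces to a simple maximum of two quantities. The sketch below is purely illustrative (not legal advice); the function name and the turnover figure in the example are assumptions for demonstration:

```python
# Illustrative sketch (not legal advice): the GDPR's upper fine tier is
# the greater of EUR 20 million or 4% of annual global turnover.

def max_gdpr_fine(annual_global_turnover_eur: float) -> float:
    """Return the ceiling of the GDPR's top fine tier, in euros."""
    return max(20_000_000.0, 0.04 * annual_global_turnover_eur)

# For a company with EUR 1 billion in turnover, 4% (EUR 40M) exceeds
# the EUR 20M floor:
print(max_gdpr_fine(1_000_000_000))  # 40000000.0
```

Note that for any organization with turnover below €500 million, the €20 million floor is the binding figure.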
The aim of the GDPR is to protect all EU citizens from “privacy and data breaches in an increasingly data-driven world.” Under the new regulations, personal data of all types “must be processed lawfully, fairly, and in a manner transparent to the data subject.” The regulations control how data is collected and greatly restrict how it may be used. Holding data for analysis alone will be a potential source of legal risk under the new rules.
Big Data advocates who contend they should be able to discover new relationships and patterns in consumer data will have to be careful how they explore the unknown, since regulators demand that data be used for “specified, explicit purposes and only those purposes.”
“Lone Star has always advocated the use of redacted and abstracted data,” said John Volpi, CTO of Lone Star Analysis. “We have always advocated for targeted use of machine learning, and avoiding brute force Big Data. As society comes to grips with the risks of big data done badly, we should not be surprised to see more regulations, like GDPR and more guidelines like those issued by the ACM.”
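To illustrate what “redacted and abstracted” data can mean in practice, the following is a generic sketch, not Lone Star’s actual method: direct identifiers (the field names here are hypothetical) are dropped, and a precise value is coarsened into a band before analysis:

```python
# Hypothetical illustration of redacted and abstracted records: direct
# identifiers are removed, and a quasi-identifier (exact age) is
# abstracted into a 10-year band. Field names are assumptions.

def redact_and_abstract(record: dict) -> dict:
    """Drop direct identifiers and coarsen the 'age' field into a band."""
    redacted = {k: v for k, v in record.items()
                if k not in {"name", "email", "ssn"}}
    if "age" in redacted:
        # Abstract an exact age into a 10-year band, e.g. 37 -> "30-39".
        decade = (redacted["age"] // 10) * 10
        redacted["age"] = f"{decade}-{decade + 9}"
    return redacted

print(redact_and_abstract(
    {"name": "A. Person", "email": "a@example.com",
     "age": 37, "zip": "75201"}))
# {'age': '30-39', 'zip': '75201'}
```

The design point is that analysis on the abstracted record can still find patterns (by age band and region) without ever holding the identifying fields, which reduces exposure under rules like GDPR.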
About Lone Star Analysis
Lone Star Analysis provides powerful solutions that improve operations. We serve industrial markets, aerospace & defense, oil & gas, transportation & logistics, and the public sector.
Our analytics products and technology-enabled services are proven; we deliver the right answer for your operational needs. We are committed to generating improved operational and financial performance through accurate and actionable answers to our clients’ most critical business challenges. Our reputation is built on creating lasting value for our clients.
Headquartered in Dallas, Texas, Lone Star is found on the web at http://www.Lone-Star.com.