Blooma Blog

Industry Adaptation and Privacy in the New Era of AI

Written by blooma | Nov 27, 2024 12:05:07 AM
Philomena Lamoureux, Head of AI

How data security is changing everything.

In a 2018 collaboration between BCG and MIT (Ransbotham et al., 2018), researchers found that when it comes to artificial intelligence, companies and organizations can be classified into four groups: Pioneers (18%), Innovators (33%), Experimenters (16%), and Passives (34%).

Pioneers are enterprises with an extensive understanding of AI tools and concepts, and embrace AI in significant ways. Innovators have a good understanding, but still display little actual application of AI in their business. Experimenters are using AI for their business, but without seeking an in-depth understanding of the AI methods. Finally, Passives lack both in-depth understanding and application of AI technology.

Interestingly, all four groups agree that AI will change their business model in the next few years. This means that sooner rather than later, AI applications will penetrate the entire corporate landscape. This will be important as AI becomes a pillar of competitiveness, allowing companies to get things done faster and more accurately and to reduce time spent on less desirable work.


In the past, laws around data privacy hampered the evolution of AI applications in industries that handle private data, such as sensitive or personal information; typical examples are the healthcare and financial sectors. In recent years, however, emerging technologies have enabled the secure use of private data to build and use AI. These private AI technologies are expanding both the number and the variety of novel AI applications and transforming the AI landscape. Thus, Pioneers, Innovators, and Experimenters in the financial sector have started to use AI on their private data.

Private AI technology allows a company to use data to train high-performing models without exposing or sharing confidential data. Although private AI is a cutting-edge field, a few notable methods already exist, including differential privacy, homomorphic encryption, federated learning, and data anonymization. Federated learning, for example, is a state-of-the-art method used by Google and the healthcare industry (full disclosure: Blooma uses it too). It makes use of client data without the client ever having to share that data with anyone.

In a typical setup, a copy of the model is deployed to each client location and trained on the data there. After training, each model is sent back to a central location, where it is merged with the models from the other clients. The result is a model that has learned from data across all clients without the data ever being gathered in one central place. Federated learning, done right, ensures that no raw data is exchanged between clients; it allows a company to learn from confidential data while keeping that data secure and private.
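The merge step described above can be sketched with federated averaging (the FedAvg scheme): each client trains locally, and only model parameters travel to the server, where they are combined as a weighted average. The toy example below uses linear regression on synthetic data; the function names, learning rate, and round counts are illustrative assumptions, not Blooma's actual implementation.

```python
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    """Train a linear model on one client's data; the raw data never leaves the client."""
    w = weights.copy()
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)  # gradient of mean squared error
        w -= lr * grad
    return w

def federated_average(client_models, client_sizes):
    """Merge client models into one global model, weighted by dataset size (FedAvg)."""
    total = sum(client_sizes)
    return sum((n / total) * w for w, n in zip(client_models, client_sizes))

# Two hypothetical clients holding private datasets drawn from the same distribution.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
clients = []
for n in (50, 100):
    X = rng.normal(size=(n, 2))
    y = X @ true_w + rng.normal(scale=0.1, size=n)
    clients.append((X, y))

# Communication rounds: only parameter vectors cross the network, never X or y.
global_w = np.zeros(2)
for _ in range(20):
    local_models = [local_update(global_w, X, y) for X, y in clients]
    global_w = federated_average(local_models, [len(y) for _, y in clients])
```

After a few rounds, `global_w` approaches the underlying coefficients even though neither client ever revealed its dataset to the server or to the other client.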

Meanwhile, the introduction of AI into the corporate landscape has led to changes in legislation that continue to develop. The General Data Protection Regulation (GDPR) and the California Consumer Privacy Act (CCPA) are two statutes that have recently come into force to promote and regulate data privacy and security in the European Union and California, respectively. Additionally, Brazil passed the General Data Protection Law, which came into effect in February 2020. According to Richard Koch, Managing Editor of GDPR.EU, the laws that have begun to emerge share some common principles, including the importance of defining personal data and certain fundamental rights for data subjects (2018).

Private AI allows us to obey those laws while capitalizing on private data. Whether you consider yourself or your business closer to the Pioneer or the Passive end of the spectrum, your data is already part of this changing landscape, and security will continue to drive the evolution of private AI technology. Despite the many differences between the statutes mentioned above, we can only assume this is just the beginning of a movement toward more legislation governing data privacy.

Philomena Lamoureux

Philomena Lamoureux, Head of AI at Blooma, is a published PhD specialized in ML, NLP and privacy-preserving AI solutions.