India's new privacy law - the Digital Personal Data Protection Act, 2023 (DPDP / the Act) - is landmark legislation created to regulate data collection. Even though it does not deal with how companies use data to out-compete their rivals, its focus on data collection shapes how competition plays out and how consumers are affected by it.
Here are the 7 Competition Problems that emerge from India’s soon-to-be-implemented privacy law.
The Consent Notice Problem:
Once the Act comes into force, a Data Fiduciary (the company that wants to collect data) can collect personal data only after requesting the person's consent and obtaining it.
The Act dictates that a ‘request for consent’ should be made in the form of a notice, and it must contain the following:
the personal data sought and the purpose for which it will be used; how a person can exercise their rights; and how a person can complain to the Data Protection Board (hereinafter, the Board).
If a Data Fiduciary was already collecting data on or before the day the Act comes into force, it must make a new request for consent "as soon as possible".
Because no time limit is set within which existing companies must provide this request-for-consent notice, companies / Data Fiduciaries can keep collecting data without issuing a new request for consent, under the pretence that doing so is not yet possible.
This becomes even more problematic for companies with significant market power and lax privacy policies - they can continue to collect excessive amounts of data, keep consumers uninformed about the breadth of the data collected and the specific purposes it is used for, build a protective moat around their market position with that data, and raise barriers to entry for any new competitor.
Consequently, this creates an imbalance in competition because it affects the ability of new entrants to penetrate the market while an established player collects excessive data without accountability.
The Consent Problem:
The Act requires the consent provided by the Data Principal to be free, specific, informed, unconditional and unambiguous, and to be given with a clear affirmative action (not a passive one). It also states that a request for consent has to be made by the Data Fiduciary once the Act is implemented.
It nowhere mentions or defines 'free' and 'informed'. In an era of network effects, where users join a platform because their friends are already on it, and in an era of skipping straight to checking the box next to "I agree to the terms & conditions", the concepts of 'free' and 'informed' are themselves questionable.
It also does not discuss what needs to happen if the Data Fiduciary updates the privacy policy after giving the initial Request for consent notice.
It requires a clear affirmative action only the first time, not when the policy is updated later. Because passive acceptance (where users accept the terms and conditions through continued use) is allowed when the policy is updated, consumers, more than businesses, stand to lose - it enables businesses to put up a benevolent front at first and then update the privacy policy to keep collecting excessive and unreasonable amounts of information about the user.
Passive acceptance is not a new concept in digital markets, and neither is a business collecting data on its users and using it to increase their resistance to switching to other products.
The Purpose of Collection Problem:
A Data Fiduciary is a person who determines the purpose & means of processing personal data, either by themselves or “in conjunction with others”.
The Act may have intended the words "in conjunction" to mean marketing agencies or business developers that assist a business with its development. But since it nowhere explains what "in conjunction with others" means, an extremely worrying eventuality becomes plausible.
What if a business decides its privacy policy in conjunction with its competitors?
In a world where competition law is yet to fully recognise data abuse as an anti-competitive practice, DPDP presents a whole new possibility of a cartel in terms of data collection.
Businesses situated at the same level (basically, competitors) could agree to collect the same types of data, in the same excessive and invasive amounts, that can influence users in whatever way they wish. Consumers, at the receiving end, would have no option but to agree to the terms decided by the cartel. All of a sudden, everyone in the industry faces a situation where their data is collected for purposes not clearly specified - and it is the same across every competitor.
This is most likely in industries with few players, such as advertising services, social media/communication services, web browsers, operating systems, e-commerce services or online payment services.
The Children’s Data Problem:
A Data Fiduciary can collect data from a child (defined by the Act as a person less than 18 years old) after obtaining verifiable consent from the parent or legal guardian. The Act imposes two conditions on the Data Fiduciary that collects such data - it should not process/use the data in a way that is 'likely' to cause a 'detrimental' effect on the 'well-being' of the child, and it should not undertake tracking or behavioural monitoring of children or serve targeted advertisements to children.
While these are certainly good and welcome conditions, they create a BIG jurisdiction issue between the Competition Commission of India (CCI) and the Board.
This is because the question - whether a company's conduct while competing, using the data and other resources it possesses, is unfair to the market - falls squarely within the ambit of competition authorities worldwide.
DPDP, however, takes it upon itself to determine whether a particular activity committed by a company, using the data it collected, is detrimental to the welfare of a child. The problem with this idea is that an authority other than a competition authority decides that an activity, undertaken by the company in the course of competing, is detrimental to consumers. All of this is compounded by the fact that DPDP offers no criteria to determine which kinds of activities are detrimental and which are not.
DPDP opened Pandora’s box by doing this, and 4 BIG economic and legal issues will likely arise.
First, the way the Data Protection Board analyses whether a specific activity is detrimental may not be the same way the CCI assesses the harm that very activity causes children/consumers. And because the same activity is assessed differently, every company is thrown into limbo as to whether it is committing abuse, detrimentally affecting children, or both! What does a business do in this situation?
Second, any activity that is considered abuse by the CCI is sure to be detrimental to the children. Doesn’t that result in separate cases in front of the CCI & the Board for the same activity? Doesn’t it mean double jeopardy?
Third, any activity considered detrimental by the Board may or may not be regarded as an abuse by the CCI when it applies the principles of competition law to that activity, and such activity may well have a very valid business justification. How do we reconcile this dilemma?
Fourth, suppose the Board determines a particular activity to be detrimental to children, i.e. people who are less than 18 years old - what if the company undertakes the same activity towards 19-year-olds?
Does the activity cease to be detrimental the moment someone crosses 18?
Because competition authorities, while assessing abuse, can look beyond the factor of age to determine the harm caused by companies, shouldn't they be the ones to consider this issue? Are we in a situation where users over 18 are assessed through competition law and users under 18 through DPDP? Doesn't that defeat the purpose of competition law, to a certain extent?
What DPDP proposes vis-à-vis this issue has the potential to change the understanding of abuse and the role of the Competition Authority - CCI.
Even though the intent behind the law is to make sure no Data Fiduciary uses the data it collects in a way detrimental to children, because competition law is concerned with the same issue, a challenge to the CCI's jurisdiction inevitably arises.
The middle ground for reading these two laws together is the market power that a company/Data Fiduciary holds. If the case is against a dominant company with significant market power, the CCI would have the final say; if not, the Board. However, until the CCI fully formalises the data abuses that can occur (by first implementing the Digital Competition Act) and until there is clarity on the issues mentioned above, this problem will not go anywhere.
The Consent Withdrawal Problem:
The Act states that Data Fiduciaries must make withdrawing consent as easy as it was for Data Principals to give consent for data collection.
The name of the game in business is consumer retention. It is in a business's interest to make things as appealing as possible so that consumers do NOT stop using its services.
That's why chocolates and candy are kept next to the check-out counter. Imagine entering and exiting a store and not seeing anything appealing, even on the way out - the consumer experience is genuinely affected.
In other words, it is in the interest of the business to make it a bit difficult for the consumer to leave the ecosystem. That is not, inherently, a problem - that's just how businesses work. No one wants to lose a consumer, and that is the business justification for why things are the way they are.
Certainly, if users face a system where withdrawing their consent / getting out of the system is practically impossible, then yes, that kind of system is inherently problematic. But whether a particular system is difficult to get out of - such that consumers cannot choose the goods and services others provide - is a competition issue.
To put it in competition law terms: is the company increasing switching costs for consumers, and is that barrier to switching artificial (erected by the company)? These are two questions that fall within the scope of competition law.
But because the Board will also be empowered to assess this type of case, a question needs to be asked: given how easy it is for a consumer to provide consent when signing up for a service - literally checking a box - can a case be brought against any Data Fiduciary that does not offer the exact same ease in withdrawing consent?
Read objectively, the provision does seem to allow a consumer to bring a case against a Data Fiduciary for NOT offering the same ease while withdrawing consent.
Because it is the Board, and not the competition authority, that assesses such a case, a bigger question arises: will the Board consider the business justification (retaining consumers) while deciding the case? If it doesn't, the result is straightforward - a penalty of up to 50 Crores for violating the provisions of the Act.
The Data Deletion Problem:
DPDP requires the consent provided by the person (the Data Principal, in the Act's terms) to be free, unconditional, informed, specific and unambiguous. It also says that the Data Principal can withdraw that consent at any time. Once the Data Principal withdraws consent, the Data Fiduciary must delete the data within a "reasonable time".
Barring situations where some other law determines how long specific data can be held by a business, it falls to the Board to decide what amounts to a "reasonable time". And this impacts competition significantly.
Take, for example, dating apps and other apps that often record a unique device identifier (UID) for the user's device when they first use the service - and that often fail to remove it from their databases even after the user deletes their profile / withdraws consent.
Suppose the user re-installs the app or re-creates their profile on the platform: because the device is already registered in the database (and the UID information was never deleted), the app can see how much time has passed between the deletion of the profile and its re-creation, and use that to gauge how desperate/needy a particular person is to meet someone.
This, in turn, allows them to intentionally suppress the visibility of that user’s profile and “nudge” them to take a premium subscription to get more visibility.
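To make the mechanism concrete, here is a minimal, hypothetical Python sketch of the logic described above. Every name in it (the uid_registry store, the 30-day threshold, the on_profile_created function) is an assumption made purely for illustration, not any real app's implementation; it simply shows how a UID retained after profile deletion lets a service infer the gap between deletion and re-creation and act on it.

```python
from datetime import datetime, timedelta

# Hypothetical store the app never cleans up: device UID -> when the old profile was deleted.
uid_registry = {
    "device-abc-123": datetime(2024, 1, 10),
}

def on_profile_created(device_uid: str, now: datetime) -> dict:
    """Decide how to treat a newly created profile using retained UID data (illustrative only)."""
    deleted_at = uid_registry.get(device_uid)
    if deleted_at is None:
        # Device never seen before: treat the user normally.
        return {"visibility": "normal", "premium_nudge": False}

    gap = now - deleted_at
    if gap < timedelta(days=30):  # assumed threshold
        # A quick return is read as eagerness: suppress visibility and push the premium upsell.
        return {"visibility": "suppressed", "premium_nudge": True}
    return {"visibility": "normal", "premium_nudge": False}

print(on_profile_created("device-abc-123", datetime(2024, 1, 20)))
# {'visibility': 'suppressed', 'premium_nudge': True}
```

If the UID entry had to be deleted within a fixed time after consent was withdrawn, the lookup would return nothing and this kind of profiling would simply not be possible.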
This is a business practice already in play, and this abusive conduct - where users are not freely choosing premium services but are being made to choose them - could be curbed if there were a fixed time limit within which the user's entire data had to be removed from the company's servers.
But alas, that is not to be.
The Significant Data Fiduciary Problem:
DPDP recognises the market situation where certain companies collect large amounts of user data. It brought in the idea of Significant Data Fiduciaries (SDF) to require these companies to be more responsible with their data collection and processing.
The Central Government will determine and recognise SDFs based on factors like the volume and sensitivity of the data processed, the risk to electoral democracy, the risk to the rights of Data Principals, the security of the State and public order. Once recognised, SDFs must appoint a data auditor and a Data Protection Officer and undertake a Data Protection Impact Assessment.
Keeping aside the inherent problems that emanate from this (like the fact that a data auditor is not defined anywhere in the Act), it presents a concern for consumers from a competition perspective.
A few months ago, the Committee on Digital Competition Law submitted a draft of the Digital Competition Act (discussed here) detailing how companies that provide digital services should compete. In it, the Committee put forth criteria to recognise significantly powerful companies with huge consumer and business bases as Systemically Significant Digital Enterprises (SSDEs). This is done to make sure these companies act fairly, transparently and in a non-discriminatory manner.
The factors relied upon for determining whether a company is an SSDE include the volume of transactions the company does, the advantage it derives from the volume of data it collects, its total revenue, and the total number of users of its services (more than 1 Crore consumers and 10,000 businesses).
If the proposed Digital Competition Act becomes law and is implemented after DPDP, there is a predicament that cannot be ignored.
A company which has significant business operations and a significant consumer base, and which collects massive amounts of data, could be considered an SSDE per the Digital Competition Act. Still, it may not be notified as a Significant Data Fiduciary in the eyes of DPDP.
Because of that, even though it deals with a large volume of data, it does not need to appoint a Data Auditor or a Data Protection Officer, nor is it required to undertake a Data Protection Impact Assessment.
This is even more complicated because an SSDE is recognised by the CCI, whereas an SDF is only identified by the Central Government.
It’s a rare sight for a digital company that collects data to be significant in terms of its operations and NOT to be significant in terms of the data it collects…
Conclusion:
There's no denying that India's new privacy law is much-needed legislation, considering the kinds of data abuses and breaches that have happened and the sheer number of victims. But it is not ideal, at least not when viewed from the perspective of competition and competition law. One thing is for sure: when the new privacy law is implemented in India, competition, and the consumers inside those markets, will be 'detrimentally' affected (pun intended).