Private companies cannot harm you, in the legal sense of the term, when they merely collect information about you as part of the customer relationship. That remains true even when the collection exceeds the platform’s terms of use. This is not to say private companies are in the clear if they wantonly disregard their own terms of service or engage in mass data collection programs, but the simple collection and retention of data is not a recognizable legal harm.
Under the Constitution of the United States, federal court jurisdiction is limited to “cases” and “controversies.” The Supreme Court, on multiple occasions, has interpreted Article III, Section 2 of the Constitution as requiring the plaintiffs to have suffered a “concrete” injury or an “injury-in-fact.”
A concrete injury is one that is neither “conjectural” nor “hypothetical.” A court must be able to redress the injury through a favorable decision.
Some injuries, or violations of standards, have long been recognized throughout American and English jurisprudence. They include injuries to the person, such as assault and battery; injuries to property, such as trespass; and encroachments upon rights, such as nuisance claims.
Other injuries are created by statute. When a plaintiff brings an action for a Constitutional civil rights violation, for example, he or she likely does so under 42 U.S.C. § 1983.
Legislatures may not simply legislate an injury into existence, though. Federal courts will still require that a violation of the statutory right present an actual case or controversy.
[pullquote]The mere collection and retention of data cannot result in a legally recognizable harm.[/pullquote]The data platforms collect is either generated by users or about users. On the one hand, people upload photos to Facebook, Instagram, Google, and other platforms. They post status updates and share stories. By the nature of the transaction, individuals generally want to share that information.
On the other hand, companies can glean a great deal of information about individuals. They know the general area in which an individual lives based on his or her IP address. They may know what the individual looks like based on photos uploaded to the platform.
Individuals may not like that platforms have figured out how to monetize data, but that lack of “liking” is far from a legal injury.
A recent court case from Chicago supports the proposition that data collection and retention is not a legal injury. In Rivera v. Google, Inc., a federal district court granted Google’s motion for summary judgment. After analyzing the claims, the court concluded that the private plaintiffs’ allegations fell short of the Constitution’s concrete injury requirement.
Notably, the court did not reject the plaintiffs’ claims outright. In its analysis, it noted that had the plaintiffs coupled their claims with evidence that something more occurred, Google may very well have been on the hook. In so doing, the court admirably shifted the discussion from one of privacy to one of consumer protection, even noting that several state legislatures have pointed to the risk of identity theft to support their biometric privacy laws.
The calls for “privacy” regulations stem from fear and a belief that companies should not be able to profit from the information people generate. Those advocating for privacy regulations have tapped into people’s fear of the unknown, fear of change, and fear of being watched. There seems to be a distrust of companies that are so popular and so in tune with what consumers want. That distrust rests in part on perceived political power, in part on the philosophy that companies so large must have abused their consumers, and in part on a belief that targeted advertising is “villainous.”
Privacy advocates fail to thoroughly analyze the legal harm question. They rely on people feeling violated when technology platforms figure out a way to monetize data. Feeling offended is not an injury. And it is certainly not an injury any court will recognize.
Policymakers should avoid any temptation to interfere with the relationship between companies and private individuals. The ability to exchange some amount of data for a product or service enables even the poorest among us to access valuable products such as email, calendars, directions, and the ability to connect with friends.
This is not to say the government has no role whatsoever, though. While the collection and retention of data alone may not be enough to satisfy injury requirements, there are other considerations. If a platform fails to safeguard personal information within standards of reasonableness and that information is accessed by bad actors, the platform should be liable, both to its customers and to the appropriate state entity.
Companies make promises in exchange for an individual’s business. What happens when the company violates that promise? Most states have consumer protection laws barring fraudulent and deceptive trade practices. If a company violates a promise, both the customer and the state likely have causes of action under these provisions.
Relatedly, if a company makes a promise that it later changes or violates, a customer may have a cause of action either for misrepresentation or, worse for the company, for breach of the implied covenant of good faith and fair dealing.
The mere collection and retention of data is not wrong. Companies are not evil for figuring out a way to make a profit while providing products and services to consumers. Policymakers should look to prevent actual harms to consumers by relying on and enhancing existing consumer protection statutes.