Taming the Algorithmic Onslaught Facing Consumers

Tabrez Ebrahim

Algorithms have transformed merchant-consumer interactions.

Algorithms in the realms of artificial intelligence (AI), machine learning, and big data are pervading society in new ways with both beneficial and detrimental outcomes, writes California Western Professor Tabrez Ebrahim in a scholarly paper entitled Algorithms in Business, Merchant-Consumer Interactions, & Regulation.

The paper, presented by Professor Ebrahim at West Virginia Law Review’s Artificial Intelligence and the Law Symposium in February 2021 and later published in the West Virginia Law Review, discusses the phenomenon of algorithmic decision-making in business and the need for a regulatory response to the use of algorithms in the consumer-merchant context.

“Consumers should be worried about this,” says Professor Ebrahim. “This falls into the realm of consumer protection law, and left unchecked, algorithms hand over too much power to merchants.”

News headlines have raised numerous concerns about algorithms in criminal sentencing, dermatology, and government decision-making. In the business context, however, a massive problem in its own right is how merchants' algorithms shape their interactions with consumers, writes Professor Ebrahim.

In this algorithmic era, consumers find themselves in an unbalanced relationship with merchants and can become victims of discrimination.

Discrimination, whether intentional or unintentional, arises when algorithms apply profiling methods that categorize consumers in ways that treat certain groups (such as those defined by race, religion, gender, or national origin) differently or deny them access to products and services. For example, if some consumers receive preferential prices or special offers because of their membership in a particular group while other consumers are denied the same benefit, that disparity could amount to discrimination.

In the U.S., the Federal Trade Commission (FTC) protects consumers from digital exploitation in the commercial context. However, writes Professor Ebrahim, the FTC’s ability to police and prevent abusive practices is limited by its narrow statutory authority, minimal available resources, and lack of rulemaking authority. In addition, the FTC’s power to enforce data privacy policies is constrained by the absence of omnibus privacy and data security legislation in the United States; as a result, the agency has limited means of acting against merchants that engage in unfair treatment and deceptive trade practices.

The limits of the FTC’s ability to protect consumers from exploitation necessitate new legislation to prevent abuse and manipulation. Consumers would also benefit if the FTC expanded the scope of its enforcement against informational harms.

Achieving the societal goal of finding and implementing a regulatory approach to maintain consumer sovereignty in this algorithmic era is no easy task, and Professor Ebrahim suggests some possible regulatory responses.

Regulatory efforts should start by determining the proper scope of protection and the appropriate legal obligations. That may entail sector-specific classifications, identifying the types of algorithms to which rules apply, or greater clarity on the legal effect of algorithmic decision-making used by merchants. The breadth of such responses, however, is expansive. For these reasons, a proposal that follows the basic idea of letting the market decide, but with a unique angle, presents a more suitable legal and policy response.

A self-regulation proposal in the form of a responsible algorithm code could serve as a new paradigm for a regulatory model. This proposal would bring together a panel of business experts to develop a responsible algorithm code: a set of minimum industry standards or guidelines that would incorporate and balance the views of various commercial sectors, merchants, and consumer associations.

Longstanding laws and principles that have protected consumers are challenged by the algorithmic methodologies merchants now employ, concludes Professor Ebrahim. As merchants use algorithms to strengthen their economic and social power over consumers, new theoretically driven concerns about information asymmetry, bias, and discrimination have arisen.

The core tenets of consumer protection law come under tension when algorithms exploit or deliberately manipulate consumers. Moreover, if left unaddressed by regulation, algorithms have undesired ethical, social, and economic effects on the merchant-consumer interaction.

The United States and the European Union have taken some initial steps toward legislation and governance. However, these actions have only scratched the surface and will need to be bolstered by industry measures, like those proposed in Professor Ebrahim's paper, to have any meaningful regulatory effect.

Read Professor Ebrahim’s complete paper, Algorithms in Business, Merchant-Consumer Interactions, & Regulation, here.