Europe Might Wrap the Tech Industry in Even More Red Tape

Following a seemingly endless parade of tech regulations—from the General Data Protection Regulation (GDPR) in 2016 to the Digital Services Act (DSA) and Digital Markets Act (DMA) in 2022 to the Artificial Intelligence (AI) Act in 2024—the European Commission published a “fitness check” of EU consumer law on October 3, 2024, assessing whether existing regulations can address concerns surrounding “digital fairness.” Whether this report will lead to a Digital Fairness Act remains to be seen. Regardless, Europe needs to pump the brakes on new regulations until it can properly assess the long-term effects of the GDPR, DSA, DMA, and AI Act.

While the United States has led the world in technological innovation—though it faces stiff competition from China—Europe has taken a different road, leading the world in technological regulation, a much easier task. This is a symptom of Europe’s precautionary approach to innovation, driven by fears—mostly unfounded—of technology’s potential negative impact on society. Proponents of the European approach call it “values-based” and compare it favorably to the U.S. approach, which has historically prioritized innovation, progress, and growth. Those values have suffered under Europe’s precautionary regime, according to the Commission’s own European Competitiveness Report, released on September 9, 2024.

A Digital Fairness Act would be the next step in the EU’s regulatory slog. Judging by the Commission’s report, such an act would target “problematic practices” surrounding personalized pricing, personalized advertising, influencer marketing, product scalping, dropshipping, AI chatbots, subscriptions, contracts, loot boxes, virtual items, and more. The report frequently references “dark patterns,” or design features allegedly used to influence user behavior in ways that profit an online service but harm users or run contrary to their intent.

The focus on dark patterns is particularly problematic, as the term’s vague definition could apply to any number of common practices, many of which cause no actual harm to consumers. It is not inherently problematic for online services to use design to influence consumers; this is a common practice much older than the Internet. Toy companies design bright, colorful products to appeal to children. Stores place small, low-cost items near the register to encourage customers waiting in line to add more to their carts. Brands stress the limited nature of a sale to encourage consumers to buy while the price is low. The entire marketing industry exists to influence consumers. Lumping genuinely harmful practices, such as raising the final price of a product during checkout above what was advertised, in with simple marketing tactics is disingenuous and unlikely to lead to effective regulation.

The European Union has already wrapped the tech industry in more than enough red tape. Rushing to introduce further regulations while the DSA, DMA, and AI Act are in their infancy—and even the GDPR remains relatively new—would be premature. To avoid unnecessary overlap, Europe should wait until it becomes clear how these recent regulations affect consumer welfare.

* Ash Johnson is a senior policy manager at ITIF, specializing in Internet policy issues including privacy, security, platform regulation, e-government, and accessibility for people with disabilities. Her insights appear in numerous prominent media outlets such as TIME, The Washington Post, NPR, BBC, and Bloomberg.

Previously, Johnson worked at Software.org: the BSA Foundation, focusing on diversity in the tech industry and STEM education. She holds a master’s degree in security policy from The George Washington University and a bachelor’s degree in sociology from Brigham Young University.

Source: ITIF