{"id":6840,"date":"2025-07-15T22:49:06","date_gmt":"2025-07-15T22:49:06","guid":{"rendered":"https:\/\/uang69.id\/?p=6840"},"modified":"2025-07-15T22:49:07","modified_gmt":"2025-07-15T22:49:07","slug":"lina-khans-ftc-gears-up-to-go-after-algorithmic-black-boxes-governing-workers-lives-but-it-might-be-running-out-of-time","status":"publish","type":"post","link":"https:\/\/uang69.id\/?p=6840","title":{"rendered":"Lina Khan\u2019s FTC Gears Up to Go After Algorithmic Black Boxes Governing Workers\u2019 Lives, but It Might Be Running Out of Time"},"content":{"rendered":"<div>\n<p>The Department of Justice just scored a landmark victory against Google over its monopoly behavior. It\u2019s a feather in the cap of the DOJ antitrust division and Lina Khan\u2019s Federal Trade Commission (FTC), which have had a pretty incredible four-year run. We\u2019ll see if it continues under the next administration. JD Vance continues to back Khan for whatever that is worth, while there have been some not-so-great signs on that front from Team Kamala, where Harris\u2019s closest advisor is Uber general counsel and former Obama official Tony West. Tim Walz, too, has a record of friendly ties with Uber to the detriment of workers in the state of Minnesota.<\/p>\n<p>Just to quickly review some of the efforts of the DOJ and FTC that are really pretty earth-shattering considering they\u2019re coming after 40-plus years of moving in the opposite direction:<\/p>\n<p>They are now going after surveillance pricing. The DOJ is reportedly readying a civil case against RealPage, the private equity-owned corporation that creates software programs for property management. 
The company is accused of \u201cselling software that enables landlords to illegally share confidential pricing information in order to collude on setting rents,\u201d and the DOJ is also working on a complaint focused on landlords\u2019 exchange of vacancy rate information, which helped to restrict supply.<\/p>\n<p>Maybe most importantly, the DOJ reversed enforcement policies on information sharing put in place by the Clinton administration. Between 1993 and 2011, the DOJ Antitrust Division issued a trio of policy statements (two during the Clinton administration and one under Obama) regarding the sharing of information in the healthcare industry. These rules provided wiggle room around the Sherman Antitrust Act, which \u201csets forth the basic antitrust prohibition against contracts, combinations, and conspiracies in restraint of trade or commerce.\u201d<\/p>\n<p>And it wasn\u2019t just in healthcare. The rules were interpreted to apply to all industries. To say it has been a disaster would be an understatement. Companies increasingly turned to data firms offering software that \u201cexchanges information\u201d at lightning speed with competitors in order to keep wages low and prices high \u2013 effectively creating national cartels.<\/p>\n<p>In a 2023 speech announcing the withdrawal, Principal Deputy Assistant Attorney General Doha Mekki explained that the development of technological tools such as data aggregation, machine learning, and pricing algorithms has increased the competitive value of historic information. In other words, it\u2019s now (and has been for a number of years) way too easy for companies to use these Clinton-era \u201csafety zones\u201d to fix wages and prices:<\/p>\n<p>An overly formalistic approach to information exchange risks permitting \u2013 or even endorsing \u2013 frameworks that may lead to higher prices, suppressed wages, or stifled innovation. 
A softening of competition through tacit coordination, facilitated by information sharing, distorts free market competition in the process.<\/p>\n<p>Notwithstanding the serious risks that are associated with unlawful information exchanges, some of the Division\u2019s older guidance documents set out so-called \u201csafety zones\u201d for information exchanges \u2013 i.e. circumstances under which the Division would exercise its prosecutorial discretion not to challenge companies that exchanged competitively-sensitive information. The safety zones were written at a time when information was shared in manila envelopes and through fax machines. Today, data is shared, analyzed, and used in ways that would be unrecognizable decades ago. We must account for these changes as we consider how best to enforce the antitrust laws.<\/p>\n<p style=\"text-align: center;\">***<\/p>\n<p>We\u2019ve seen the efforts and some major wins on antitrust and consumer protections. So what about the wages Mekki mentions? The DOJ has gone after no-poach deals and wage-fixing in recent years with limited success.<\/p>\n<p>In November, the DOJ moved to dismiss one of its last no-poach criminal cases after failing to secure a conviction in three other no-poach or wage-fixing cases brought to trial over the previous two years.<\/p>\n<p>In 2022, the DOJ fined a group of major poultry producers $84.8 million over a long-running conspiracy to exchange information about wages and benefits for poultry processing plant workers and collaborate with their competitors on compensation decisions. More significant than the measly $84.8 million, the settlement ordered an end to the exchange of compensation information, banned the data firm (and its president) from information-sharing in any industry, and prohibited deceptive conduct towards chicken growers that lowers their compensation. 
Neither the poultry groups nor the data consulting firm admitted liability.<\/p>\n<p>Comments by FTC and DOJ officials in recent months also hint that they are still looking at going after wage-fixing cartels, as well as single companies using algorithms to exploit workers.<\/p>\n<p>FTC officials are talking about opening up the algorithmic \u201cblack boxes\u201d that increasingly control workers\u2019 wages and all other aspects of their labor. While they became infamous through \u201cgig\u201d companies like Uber, they are now used by companies across all sectors of the economy.<\/p>\n<p>Today I\u2019d like to look at one such legal theory making the case for not just opening up the Uber et al. black box but smashing it altogether.<\/p>\n<p>\u201cAlgorithmic wage discrimination\u201d is the term Veena Dubal, a professor of law at the University of California, Irvine, uses to describe the way outfits like Uber \u2013 and increasingly companies across the economy \u2013 set wages and control workers. The term also hints at her argument to ban the practice. More from Dubal\u2019s \u201cOn Algorithmic Wage Discrimination,\u201d published in November in the Columbia Law Review:<\/p>\n<p>\u201cAlgorithmic wage discrimination\u201d refers to a practice in which individual workers are paid different hourly wages\u2014calculated with ever-changing formulas using granular data on location, individual behavior, demand, supply, or other factors\u2014for broadly similar work. As a wage-pricing technique, algorithmic wage discrimination encompasses not only digitalized payment for completed work but, critically, digitalized decisions to allocate work, which are significant determinants of hourly wages and levers of firm control. 
These methods of wage discrimination have been made possible through dramatic changes in cloud computing and machine learning technologies in the last decade.<\/p>\n<p>These automated systems record and quantify workers\u2019 movement or activities, their personal habits and attributes, and even sensitive biometric information about their stress and health levels.<\/p>\n<p>Employers then feed amassed datasets on workers\u2019 lives into machine learning systems to make hiring determinations, to influence behavior, to increase worker productivity, to intuit potential workplace problems (including worker organizing)\u2026<\/p>\n<p>Maybe not on the same level as Khan\u2019s 2017 article \u201cAmazon\u2019s Antitrust Paradox,\u201d which reframed antitrust, but Dubal\u2019s piece, by zeroing in on the discrimination aspect of these employer algorithms, is an attempt to bring the issue under the legal umbrella of existing laws. Specifically, she argues that since \u201cthe on-demand workforces that are remunerated through algorithmic wage discrimination are primarily made up of immigrants and racial minority workers, these harmful economic impacts are also necessarily racialized.\u201d<\/p>\n<p>For Dubal, however, critiques that focus on secrecy and lack of consent miss the point. 
It is not, primarily, the secrecy or lack of consent that results in low and unpredictable wages; it is the \u201cextractive logics of well-financed firms in these digitalized practices and workers\u2019 comparatively small institutional power that cause both individual and workforce harms.\u201d<\/p>\n<p>While some workers have sought to use existing law to learn what data are extracted from their labor and how the algorithms govern their pay, Dubal argues that data-transparency reform approaches \u201ccannot by themselves address the social and economic harms.\u201d<\/p>\n<p>The secrecy must be overcome, but the information gleaned must be used in pursuit of a blanket ban, argues Dubal, because algorithmic wage discrimination runs afoul of both longstanding precedent on fairness in wage setting and the spirit of equal pay for equal work laws.<\/p>\n<p>If I\u2019m reading this right, a successful wage-discrimination argument would lead to the outright ban favored by Dubal on the use of algorithms that control workers\u2019 wages, activities, etc., because they are, by their very nature, discriminatory.<\/p>\n<p>That would be a step beyond many other current efforts to \u201creform\u201d the algorithm, make the black box more transparent, compensate workers for their data, and so forth. It would also be a death knell for so many of the most exploitative \u201cinnovative\u201d companies in the US.<\/p>\n<p>This argument also brings Dubal in for heavy criticism, as it is a major threat to US oligarchs and their courtesans. Here\u2019s Forbes attacking her last year using many of the same arguments that are used against Khan.<\/p>\n<p>Worker attempts to go after companies like Uber for violations have been difficult due to a lack of knowledge of what exactly their algorithms are doing. 
It is not dissimilar to lawsuits challenging illegal government surveillance, which are made all but impossible by the requirement that plaintiffs prove the government surveilled them. Because such surveillance is conducted entirely in secret, there\u2019s virtually no way to obtain proof. Companies like Uber have frequently and successfully argued that \u201cthe safety and security of their platform may be compromised if the logic of such data processing is disclosed to their workers.\u201d<\/p>\n<p>Even in cases where the companies have released the data, they have released little information about the algorithms informing their wage systems. The FTC, however, would have the authority to pry open the black boxes \u2014 as it is doing now in its investigation into surveillance pricing, with orders to eight companies to hand over information.<\/p>\n<p>\u201cOn Algorithmic Wage Discrimination\u201d is well worth a read, but here is a quick breakdown.<\/p>\n<p>How does it differ from traditional forms of variable pay?<\/p>\n<p>\u2026algorithmic wage discrimination\u2014whether practiced through Amazon\u2019s \u201cbonuses\u201d and scorecards or Uber\u2019s work allocation systems, dynamic pricing, and wage incentives\u2014arises from (and may function akin to) the practice of \u201cprice discrimination,\u201d in which individual consumers are charged as much as a firm determines they may be willing to pay.<\/p>\n<p>As a labor management practice, algorithmic wage discrimination allows firms to personalize and differentiate wages for workers in ways unknown to them, paying them to behave in ways that the firm desires, perhaps for as little as the system determines that the workers may be willing to accept.<\/p>\n<p>Given the information asymmetry between workers and firms, companies can calculate the exact wage rates necessary to incentivize desired behaviors, while workers can only guess how firms determine their wages.<\/p>\n<p>Isn\u2019t 
that illegal?<\/p>\n<p>Although the United States\u2013based system of work is largely regulated through contracts and strongly defers to the managerial prerogative, two restrictions on wages have emerged from social and labor movements: minimum-wage laws and antidiscrimination laws. Respectively, these laws set a price floor for the purchase of labor relative to time and prohibit identity-based discrimination in the terms, conditions, and privileges of employment, requiring firms to provide equal pay for equal work. Both sets of wage laws can be understood as forming a core moral foundation for most work regulation in the United States. In turn, certain ideals of fairness have become embedded in cultural and legal expectations about work.<\/p>\n<p>[Laws] which specifically legalize algorithmic wage discrimination for certain firms compare with and destabilize more than a century of legal and social norms around fair pay.<\/p>\n<p>What does it mean for workers? It\u2019s not just that such compensation systems make it difficult for them to predict and ascertain their hourly wages. 
It also affects \u201cworkers\u2019 on-the-job meaning making and their moral interpretations of their wage experiences.\u201d More:<\/p>\n<p>Though many drivers are attracted to on-demand work because they long to be free from the rigid scheduling structures of the Fordist work model, they still largely conceptualize their labor through the lens of that model\u2019s payment structure: the hourly wage. Workers find that, in contrast to more standard wage dynamics, being directed by and paid through an app involves opacity, deception, and manipulation. Those who are most economically dependent on income from on-demand work frequently describe their experience of algorithmic wage discrimination through the lens of gambling. As a normative matter, this Article contends that workers laboring for firms (especially large, well-financed ones like Uber, Lyft, and Amazon) should not be subject to the kind of risk and uncertainty associated with gambling as a condition of their work. In addition to the salient constraints on autonomy and threats to privacy that accompany the rise of on-the-job data collection, algorithmic wage discrimination poses significant problems for worker mobility, worker security, and worker collectivity, both on the job and outside of it.<\/p>\n<p>Is such a model coming for other industries?<\/p>\n<p>So long as this practice does not run afoul of minimum-wage or antidiscrimination laws, nothing in the laws of work makes this form of digitalized variable pay illegal. As Professor Zephyr Teachout argues, \u201cUber drivers\u2019 experiences should be understood not as a unique feature of contract work, but as a preview of a new form of wage setting for large employers . . . .\u201d The core motivations of labor platform firms to adopt algorithmic wage discrimination\u2014labor control and wage uncertainty\u2014apply to many other forms of work. 
Indeed, extant evidence suggests that algorithmic wage discrimination has already seeped into the healthcare and engineering sectors, impacting how porters, nurses, and nurse practitioners are paid. If left unaddressed, the practice will continue to be normalized in other employment sectors, including retail, restaurant, and computer science, producing new cultural norms around compensation for low-wage work.<\/p>\n<p>Here\u2019s The Register detailing how it\u2019s being used by Target, FedEx, and UPS, and how it\u2019s increasingly appearing in white-collar jobs:<\/p>\n<p>One example is Shipt, a delivery service acquired in 2017 by retailer Target. As recounted by Dana Calacci, assistant professor of human-centered AI at Penn State\u2019s College of Information Sciences and Technology, the service in 2020 introduced an algorithmic payment system that left workers uncertain about their wages.<\/p>\n<p>\u201cThe company claimed this new approach was fairer to workers and that it better matched the pay to the labor required for an order,\u201d explained Calacci. \u201cMany workers, however, just saw their paychecks dwindling. And since Shipt didn\u2019t release detailed information about the algorithm, it was essentially a black box that the workers couldn\u2019t see inside.\u201d<\/p>\n<p>\u2026 \u201cFedEx and UPS drivers continue to deal with the integration of AI-driven algorithms into their operations, which affects their pay among other things,\u201d said [Wilneida Negr\u00f3n, director of policy and research at labor advocacy group Coworker.org]. 
\u201cThe new UPS contract does not only increase wages, but give[s] workers a bigger say in new technologies introduced and their impact.\u201d<\/p>\n<p>The situation is slightly different, said Negr\u00f3n, in the banking and finance industries, where workers have objected to the use of algorithmic performance and productivity measurements that indirectly affect compensation and promotion.<\/p>\n<p>\u201cWe\u2019ve heard this from Wells Fargo and HSBC workers,\u201d said Negr\u00f3n. \u201cSo, two dynamics here: the direct and indirect ways that algorithmic systems can impact wages is a growing problem that is slowly affecting white collar industries as well.\u201d<\/p>\n<p>One doesn\u2019t have to think too hard about the dangers these types of practices introduce to the workplace \u2014 for laborers and consumers. As Dubal points out:<\/p>\n<p>Hospitals\u2026have begun using apps to allocate tasks based on increasingly sophisticated calculations of how workers move through space and time. Whether the task is done efficiently in a certain time frame can impact a worker\u2019s bonus. It\u2019s not surge pricing per se, but a more complex form of control that often incentivizes the wrong things. \u201cIt might not be the nurse that\u2019s really good at inserting an IV into a small vein that is the one that\u2019s assigned that task,\u201d Dubal said. \u201cInstead, it\u2019s the nurse that\u2019s closest to it, or the nurse that\u2019s been doing them the fastest, even if she\u2019s sloppy and doesn\u2019t do all the necessary sanitation procedures.\u201d<\/p>\n<p>What to do? Dubal proposes a simple solution:<\/p>\n<p>\u2026 a statutory or regulatory nonwaivable ban on algorithmic wage discrimination, including, but not limited to, a ban on compensation through digitalized piece pay. 
This would effectively not only put an end to the gamblification of work and the uncertainty of hourly wages but also disincentivize certain forms of data extraction and retention that may harm low-wage workers down the road, addressing the urgent privacy concerns that others have raised.<\/p>\n<p>\u2026At the federal level, the Robinson\u2013Patman Act bans sellers from charging competing buyers different prices for the same \u201ccommodity\u201d or discriminating in the provision of \u201callowances\u201d\u2014like compensation for advertising and other services. The FTC currently maintains that this kind of price discrimination \u201cmay give favored customers an edge in the market that has nothing to do with their superior efficiency.\u201d<\/p>\n<p>Though price discrimination is generally lawful, and the Supreme Court\u2019s interpretation of the Robinson\u2013Patman Act suggests it may not apply to services like those provided by many on-demand companies, the idea that there is a \u201ccompetitive injury\u201d endemic to the practice of charging different buyers a different amount for the same product clearly parallels the legally enshrined moral expectations about work and wages\u2026<\/p>\n<p>If, as on-demand companies assume, workers are consumers of their technology and not employees, we may understand digitalized variable pay in the on-demand economy as violating the spirit of the Robinson\u2013Patman Act.<\/p>\n<p>While Khan\u2019s FTC, which is charged with protecting American consumers, is increasingly coming after these non-employer employers, it has not yet gone the route recommended by Dubal. 
Here is a little bit of what the FTC has been doing, however. A 2022 policy statement on gig work from the FTC reads:<\/p>\n<p>As the only federal agency dedicated to enforcing consumer protection and competition laws in broad sectors of the economy, the FTC examines unlawful business practices and harms to market participants holistically, complementing the efforts of other enforcement agencies with jurisdiction in this space. This integrated approach to investigating unfair, deceptive, and anticompetitive conduct is especially appropriate for the gig economy, where law violations often have cross-cutting causes and effects\u2026And the manifold protections enforced by the Commission do not turn on how gig companies choose to classify working consumers.<\/p>\n<p>The FTC has gone after companies with smaller, targeted actions, but Benjamin Wiseman, Associate Director of the FTC\u2019s Division of Privacy and Identity Protection, speaking at the Harvard Journal of Law &amp; Technology earlier this year, said that larger actions are coming on the labor-black box front:<\/p>\n<p>\u2026the Commission is also taking steps to ensure that the FTC has the resources and expertise to address harms workers face from surveillance tools. We are doing this in two ways. First, the Commission is forging relationships with partner agencies in federal government with expertise in the labor market. In the past two years, the Commission has entered into memoranda of understanding with both the National Labor Relations Board and the Department of Labor, recognizing our shared interest in protecting workers and, among other things, addressing the impact of algorithmic decision-making in the workplace. Second, the Commission is increasing its in-house capacity to investigate and analyze new technologies.<\/p>\n<p>We\u2019ll see if Khan and company get the chance to carry their work over to a Trump or Kamala administration. 
While the former might be a bit of a wild card, it looks like the writing\u2019s on the wall for Khan and the Jonathan Kanter-led antitrust division under the latter.<\/p>\n<p>While Uber is a ringleader in the use of the exploitative black box and its practices are diametrically opposed to Khan\u2019s mission at the FTC, it enjoys a significant presence on Team Kamala. The company \u2014 probably better described as a venture capital-funded project to cement serfdom in the 21st century \u2014 also has a longstanding close relationship with Obama World, which helped orchestrate the crowning of Kamala. And now the plutocrats are showering Kamala with cash and insisting that Khan must go.<\/p>\n<p>It sure would be fitting if after a career of supporting the police state, debt peonage, mass infection during a pandemic, and war, Joe Biden is pushed aside and one of the few decent things the man ever did goes with him: his appointing Khan and letting her do her job.<\/p>\n<\/div>\n<p><br \/>\n<br \/><a href=\"https:\/\/www.nakedcapitalism.com\/2024\/08\/lina-khans-ftc-gears-up-to-go-after-algorithmic-black-boxes-governing-workers-lives-but-it-might-be-running-out-of-time.html\" target=\"_blank\" rel=\"noopener\">Source link <\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>The Department of Justice just scored a landmark victory against Google over its monopoly behavior. It\u2019s a feather in the cap of the DOJ antitrust division and Lina Khan\u2019s Federal Trade Commission (FTC), which have had a pretty incredible four-year run. We\u2019ll see if it continues under the next administration. 
JD Vance continues to back [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":491,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"tdm_status":"","tdm_grid_status":"","footnotes":""},"categories":[35],"tags":[],"class_list":["post-6840","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-berita-internasional"],"_links":{"self":[{"href":"https:\/\/uang69.id\/index.php?rest_route=\/wp\/v2\/posts\/6840","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/uang69.id\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/uang69.id\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/uang69.id\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/uang69.id\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=6840"}],"version-history":[{"count":1,"href":"https:\/\/uang69.id\/index.php?rest_route=\/wp\/v2\/posts\/6840\/revisions"}],"predecessor-version":[{"id":10554,"href":"https:\/\/uang69.id\/index.php?rest_route=\/wp\/v2\/posts\/6840\/revisions\/10554"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/uang69.id\/index.php?rest_route=\/wp\/v2\/media\/491"}],"wp:attachment":[{"href":"https:\/\/uang69.id\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=6840"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/uang69.id\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=6840"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/uang69.id\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=6840"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}