Lawsuit filed to prevent AI-generated scores from blocking apartment rentals

Three hundred and twenty-four. That was the score Mary Louis was given by an AI-powered tenant screening tool. In its 11-page report, the software, SafeRent, did not explain how the score was calculated or how various factors were weighed. There was no mention of what the score actually meant. The management company simply saw Louis’ number and decided it was too low. In the box next to the result, the report said “Score Recommended: DECLINE.”

Louis, who works as a security guard, had applied for an apartment in a suburb of eastern Massachusetts. When she toured the unit, the management company said it saw no problem with her application being accepted. Although she had poor credit and credit card debt, she had an excellent recommendation from her landlord of 17 years, who said she paid her rent on time. She also planned to use a voucher for low-income renters, which would guarantee the management company at least a portion of her monthly rent from the government. Her son, who was also named on the voucher, had a high credit score, suggesting he could act as a backstop against missed payments.

But in May 2021, more than two months after she applied for the apartment, the management company sent Louis an email informing her that the computer program had rejected her application. Applications needed a score of at least 443 to be accepted. There was no further explanation and no way to appeal the decision.


“Mary, we regret to inform you that your housing application has been denied due to a third-party service we use to screen all prospective housing applicants,” the email said. “Unfortunately, the service’s SafeRent tenant score was lower than what our tenant standards allow.”

Tenants file suit

Louis ended up renting a more expensive apartment, where management did not grade her with an algorithm. But she learned that her experience with SafeRent was not unique. She is one of more than 400 Black and Hispanic tenants in Massachusetts using housing vouchers who said their rental applications were rejected because of their SafeRent scores.

In 2022, they banded together to sue SafeRent under the Fair Housing Act, alleging that it discriminated against them. Louis and the other named plaintiff, Monica Douglas, claimed the company’s algorithm unfairly scored Black and Hispanic renters who use housing vouchers lower than white applicants. They argued that the software weighed irrelevant account information (credit scores, non-housing-related debt) in assessing whether someone would be a good tenant, while failing to take into account that a housing voucher would guarantee part of the rent. Research shows that Black and Hispanic renters tend to have lower credit scores and are more likely to use housing vouchers than white applicants.

“It was a waste of time waiting to be turned down,” Louis said. “I knew my credit was bad, but the AI doesn’t know my behavior. It knew I was late on my credit card payments, but it didn’t know that I pay my rent.”

Two years have passed since the group first sued SafeRent. Louis, one of the two named plaintiffs, said she has moved on with her life and largely forgotten about the lawsuit. But her action could protect other renters in the same housing program, known as Section 8 vouchers, from losing homes because of algorithmically determined scores.

SafeRent settled with Louis and Douglas. In addition to paying $2.3 million, the company agreed to stop using its scoring system, or making any kind of recommendation, for prospective tenants who use housing vouchers for five years. Although SafeRent did not legally admit wrongdoing, it is unusual for a tech company to accept changes to its core product as part of a settlement; a more common outcome of such agreements is a purely financial settlement.

“While SafeRent continues to believe the SRS scores comply with all applicable laws, litigation is time-consuming and costly,” company spokesperson Yazmin Lopez said in a statement. “It became increasingly clear that defending the SRS score in this case would divert time and resources SafeRent can better use to fulfill its core mission of giving housing providers the tools they need to screen applicants.”

New AI landlord

Tenant screening systems like SafeRent are often used as a way to “avoid” direct interaction with prospective tenants and to shift responsibility for refusals to a computer system, said Todd Kaplan, one of the attorneys representing Louis and the other plaintiffs in the lawsuit.

The property management company told Louis that it denied her based solely on the software, but according to the SafeRent report, it was the management company that set the threshold score an application needed to be accepted.

Still, even for those involved in the application process, how the algorithm works is opaque. The property manager who showed Louis the apartment said he didn’t know why her application had been rejected.

“They’re inputting a lot of information, and SafeRent is coming up with its own scoring system,” Kaplan said. “That makes it hard to predict how SafeRent will score you. Not only applicants but even landlords don’t know the details of SafeRent scores.”

Under Louis’ settlement with SafeRent, approved Nov. 20, the company can no longer use a scoring system or recommend accepting or rejecting tenants who use housing vouchers. If the company devises a new scoring system, it is required to have it independently validated by a third-party fair housing organization.

“By removing the thumbs up and down, tenants can really say, ‘I’m a great tenant,'” Kaplan said. “It allows for more personal decisions.”


AI extends to fundamental parts of life

One study found that nearly all of the 92 million people in the U.S. who are considered low-income have some fundamental part of their lives — employment, housing, health care, education, or government assistance — subject to AI decision-making. That finding comes from a new report on the harms of AI by Kevin De Liban, a lawyer who represented low-income people with the Legal Aid Society and founder of a new AI justice organization called TechTonic Justice.

De Liban began investigating these systems in 2016, when he was consulted by patients who had suddenly lost access to state-funded in-home care after automated decision-making, with human input pared back, cut their hours. In one case, the state’s Medicaid dispensation depended on the program determining that the patient had no problems with her foot — because it had been amputated.

“Seeing that, we realized we shouldn’t defer to [AI systems] as some kind of supremely rational way of making decisions,” De Liban said. He said these systems rest on assumptions of “junk statistical science” that produce what he calls “absurdities.”

In 2018, after De Liban sued the Arkansas Department of Human Services on behalf of these patients over its decision-making process, the state legislature ruled that the agency could no longer automate determinations of patients’ home care allotments. It was an early victory in the fight against the harms of algorithmic decision-making, though such systems remain in use across the country in other arenas, such as employment.

Despite flaws, there are few regulations to curb AI adoption.

There are few laws restricting the use of AI, especially in critical decisions that can affect a person’s quality of life, and people harmed by automated decisions have few means of recourse.

A study by Consumer Reports released in July found that a majority of Americans are “uncomfortable with the use of AI and algorithmic decision-making technologies in key life moments related to housing, employment, and health care.” Respondents said they were concerned about not knowing what information AI systems use to assess them.


Unlike in Louis’ case, people are often not informed when algorithms make decisions about their lives, making those decisions difficult to appeal or contest.

“The existing laws we have in place may be helpful, but they can only get you so much,” De Liban said. “Market forces don’t work when it comes to poor people. All the incentives are basically to create worse technology, and there’s no incentive for companies to create better options for low-income people.”

Federal regulators under President Joe Biden’s administration made several attempts to keep up with the rapidly evolving AI industry. The president issued an executive order that included a framework intended, in part, to address national security and discrimination-related risks in AI systems. But Donald Trump has vowed to roll back those efforts and cut regulations, including Biden’s executive order on AI.

So lawsuits like Louis’ may become an even more important tool for holding AI accountable. The litigation has already attracted the interest of the U.S. Department of Justice and the Department of Housing and Urban Development, both of which deal with discriminatory housing policies that affect protected classes.

“To the extent that this is a landmark case, it has the potential to provide a roadmap for how to consider these cases and to encourage others to bring them,” Kaplan said.

Still, De Liban said, it will be difficult to hold these companies accountable without regulation. Because litigation is time-consuming and expensive, companies may find workarounds or build similar products for people not covered by class action settlements. “You can’t bring these types of cases every day,” he said.

Source: www.theguardian.com

Second antitrust lawsuit filed against Google in the U.S. for online advertising | Technology

The second antitrust trial between Google and the U.S. Department of Justice commenced on September 9, with a federal judge in Virginia listening to opening arguments regarding whether the tech giant unlawfully monopolized the digital advertising sector. This trial carries significant implications for the tech industry, online publishers, and Google’s primary revenue stream.

This much-anticipated trial represents the second major U.S. antitrust case against Google, following a recent landmark ruling that found the company had illegally monopolized the online search market. Unlike in the previous case, the Justice Department is seeking specific remedies that would compel Google to divest parts of its business and sell off some of its advertising technology.


The Department of Justice’s second lawsuit, submitted in January 2023, targets Google’s Ads initiative, focusing on the company’s acquisition and utilization of digital advertising technology. The case revolves around Google’s role as an intermediary for website operators seeking to monetize through advertising, enabling them to sell ad space on their sites and connecting advertisers with potential customers, with Google retaining a significant portion of the ad revenue.

The Department of Justice argues that Google’s control over various aspects of digital advertising results from strategic acquisitions, culminating in a monopoly over the industry. The case delves into Google’s acquisitions of DoubleClick, Invite Media, and AdMeld, which allegedly granted the company dominance over both supply and demand in online advertising and intermediary exchange points.

During the trial, the Justice Department alleges that Google’s actions constitute anti-competitive behavior through exclusionary practices and acquisitions, leading to an illegal monopoly. Google’s defense maintains that its business model aligns with industry practices and that the Justice Department’s allegations stem from outdated perceptions of the digital advertising landscape.

Source: www.theguardian.com

Lawsuit filed against Grindr in London for exposing users’ HIV status to advertising firms

Grindr is potentially facing lawsuits from numerous users who allege that the dating app shared extremely confidential personal data with advertising firms, including disclosing their HIV status in some instances.

Law firm Austin Hayes is preparing to sue the app’s American owners in London’s High Court, claiming a breach of UK data protection laws.

The firm asserts that thousands of Grindr users in the UK had their information misused. They state that 670 individuals have already signed the claim, with “thousands more” showing interest in joining.

Grindr has stated it will vigorously respond to these allegations, which it says are based on an inaccurate characterization of its past policies.

Established in 2009 to facilitate interactions among gay men, Grindr is currently the largest dating app worldwide for gay, bisexual, transgender, and queer individuals, boasting millions of users.

The lawsuit against Grindr in the High Court centers on claims of personal data sharing with two advertising companies. It also suggests that these companies may have further sold the data to other entities.

New users may not be eligible to take part, as the claims against Grindr primarily cover the period before April 3, 2018, and between May 25, 2018, and April 7, 2020. Grindr updated its consent process in April 2020.

Los Angeles-headquartered Grindr ceased passing on users’ HIV status to third parties in April 2018 following a report by Norwegian researchers uncovering data sharing with two firms. In 2021, Norway’s data protection authority imposed a NOK 65 million fine on Grindr for violating data protection laws.

Grindr appealed the Norwegian decision.

The Norwegian ruling did not specifically address the alleged sharing of users’ HIV status, but it recognized that anyone registered on Grindr is likely to belong to the gay or bisexual community, which makes even that data sensitive.

Chaya Hanumanjee, managing director at Austin Hayes leading the case, remarked, “Our clients suffer greatly when their highly sensitive data is shared without consent, leading to fear, embarrassment, and anxiety.”


“Grindr owes it to those impacted by the data breach to compensate them, and to ensure that all users can safely use the app without fear of their data being shared with third parties,” Hanumanjee added.

The law firm believes that affected users might be entitled to significant damages but did not disclose details.

A spokesperson from Grindr stated, “We prioritize safeguarding your data and adhering to all relevant privacy regulations, including in the UK. Our global privacy program demonstrates our commitment to privacy, and we will vigorously address this claim.”

Source: www.theguardian.com

Google faces $2.3 billion lawsuit from European media groups over digital advertising losses

Google, a subsidiary of Alphabet Inc., is facing a 2.1 billion euros ($2.3 billion) lawsuit from 32 media groups, such as Axel Springer and Schibsted. The media groups are alleging losses due to Google’s practices in digital advertising.


The lawsuit comes as antitrust regulators tighten their grip on Google’s advertising practices. It was brought by publishers from European countries including Austria, Belgium, and Bulgaria, who accuse Google of creating a less competitive market through its illegal conduct.

The media companies’ lawyers, Geradin Partners and Steck, stated that the losses incurred by the publishers could have been avoided if Google hadn’t abused its dominant position. This could have led to higher advertising revenues for the media companies and lower fees for ad tech services, ultimately benefiting Europe’s media landscape.

The lawsuit is supported by previous actions taken against Google, such as the French competition authority’s fine in 2021 and the European Commission’s complaint last year. Analysts predict that Google may need to adjust its practices and pricing due to increased regulatory scrutiny.

A spokesperson for Google dismissed the lawsuit as “speculative and opportunistic,” emphasizing the company’s collaboration with European publishers to enhance their advertising tools.

Despite Google’s disagreements with antitrust violations, publishers worldwide have expressed concerns about Big Tech’s dominance in advertising and the subsequent decline in their revenue share. Google remains the leading digital advertising platform globally.

The group of media companies chose to file the lawsuit in Dutch courts, citing the country’s reputation for handling antitrust claims effectively in Europe. Companies like Krone, DPG Media, TV2 Danmark A/S, and others are part of the collective seeking legal action against Google.

Source: www.theguardian.com

Temu’s most recent lawsuit against Shein is absolutely wild

Earlier this week, The Exchange argued that the conflict between PDD and Shein was worth watching. PDD is the Chinese company that owns the well-known Pinduoduo e-commerce business and Temu, a discount online retailer that has boomed in the U.S. market in recent years.


Exchange explores startups, markets, and money.

Read it every morning on TechCrunch+ or get The Exchange newsletter every Saturday.


Our post was timely. Two days after we explored the rivalry, Temu filed a lawsuit against Shein. This isn’t the first time the companies have done something like this: earlier this year they filed suits against each other, only to have those lawsuits dismissed in October. Now, Temu has filed a new complaint alleging a series of misconduct by Shein.

How did a new lawsuit come about after the companies dropped their dueling suits? According to the complaint, since the first lawsuit was dismissed, Temu has found that Shein’s anticompetitive conduct has not only persisted but intensified. At this point, it may help to recall that Temu’s parent company, PDD, recently surpassed Alibaba in market capitalization, and that Shein wants to list in the United States.

A Shein spokesperson told TechCrunch+ that the company “believe[s] this lawsuit is without merit and we will vigorously defend” against it.

What does Temu say Shein is doing?

The claims in the lawsuit are wide-ranging. Some restate Temu’s view that Shein has filed countless “questionable copyright infringement lawsuits” against the company; the complaint also claims that Shein has issued “a plethora of malicious DMCA takedown notices” against its rival.

But that’s just the beginning. Temu also claims that Shein used its “monopoly power in the U.S. ultra-fast-fashion market” to abuse suppliers, entering into “exclusive trade agreements” with ultra-fast-fashion suppliers through which, Temu alleges, supplier property was wrongfully confiscated. These agreements, the complaint says, prevent suppliers from listing and selling similar products on Temu or other retail platforms.

Source: techcrunch.com