
Google’s Reputation Abuse Policy Is a Band-Aid for a Bullet Wound


Google’s updated site reputation abuse policy attempts to solve a growing problem in search: large authority sites using the power of their domain to rank content they do not own or create.

While this policy is a step in the right direction, it does not address the underlying systemic problems with Google’s algorithm that allow this abuse to flourish.

Understanding Google’s Policy on Site Reputation Abuse

Google introduced its site reputation abuse policy in March 2024, but the announcement was overshadowed by a major core update rolled out the same month.

As a result, what should have been a defining moment in the fight against search manipulation was relegated to a footnote.

At its core, the policy targets large, authoritative websites that use the power of their domain to rank content they did not create.

It is designed to prevent these entities from acting as “hosts” for third-party content simply to exploit search engine rankings.

Google's Site Reputation Policy

A prime example would be a reputable business website with a “coupons” section filled entirely with third-party content.

Google recently expanded the scope of the policy to cover even more scenarios.

In its updated guidelines, Google emphasizes that it reviews cases involving “varying degrees of first-party involvement,” citing examples such as:

  • White-label partnerships.
  • Licensing agreements.
  • Partial ownership.
  • Other complex business models.

This makes clear that Google isn’t just fighting programmatic abuse of third-party content.

The policy now aims to limit broad partnerships between reputable sites and third-party content creators.

These often involve deeply integrated collaborations, where external organizations create content specifically to leverage the hosting site’s domain authority for higher rankings.

Dig deeper: Hosting third-party content – what Google says and the reality

Parasite SEO is a bigger problem than ever

These partnerships have become a major problem for Google.

One of the most influential SEO investigations this year was Lars Lofgren’s article, “Forbes Marketplace: The Parasite SEO Company Trying to Devour Its Host.”

The article focuses on Forbes Advisor’s parasite SEO program, run in partnership with Marketplace.co, and details the significant traffic and revenue the partnership generates.

Lofgren estimates that Forbes Advisor alone makes about $236 million a year from this strategy.

As Lofgren says:

Lars Lofgren on Forbes Marketplace

This highlights a systemic problem with Google search.

Forbes Advisor is just one example of the parasite SEO programs Lofgren examines. If you want to go deeper, read his articles on other sites running similar programs.

LinkedIn is another prime example. Over the past few years, users have increasingly used LinkedIn’s user-generated content platform to capitalize on its powerful domain authority, pushing their content to the top of search results.

For example, as of this writing, the top result for “Healthcare SEO” comes not from a dedicated expert site, but from a LinkedIn Pulse article.

LinkedIn Pulse article on Healthcare SEO

If you dig into LinkedIn’s query data, you’ll see it ranking for a range of queries spanning business topics, adult themes, personal loans, and more.

Obviously LinkedIn isn’t the best source for all of these things, right?

The rise of programs designed to manipulate search results in this way is likely what prompted Google to introduce the site reputation abuse policy.



The bigger problem

This brings me to why the policy alone is not enough. The core problem is that these sites should never be ranking in the first place.

Google’s algorithm is simply not strong enough to consistently prevent this kind of abuse.

Instead, the policy acts as a fallback—something Google can use to address egregious cases after they’ve already caused damage.

This reactive approach becomes an endless game of whack-a-mole that is nearly impossible to win.

What’s worse, Google can’t catch every instance of this behavior, especially at smaller scale.

Time and time again, I have seen large sites rank for topics outside of their core business – simply because they are large sites.

Here is an example to illustrate my point. Progressive has a blog, Life Lanes, which primarily covers topics related to its core business: insurance, driving tips, traffic rules, and so on.

However, one of their blog posts ranks #4 for the search term “side effects in puppies after vaccination,” ahead of real experts like the American Veterinary Medical Association.

Google search results for “side effects in puppies after vaccination”

The result in position 1? Rover.com, a technology company that helps pet owners find pet sitters. It’s not a medical expert either, but it leverages its strong domain.

Top Google search results for “side effects in puppies after vaccination”

I’m not suggesting that Progressive is doing anything nefarious. Most likely, this is a one-off, off-topic post.

The bigger problem, however, is that Progressive could easily turn its Life Lanes blog into a parasite SEO program if it wanted to.

With minimal effort, it already ranks for a medical query – an area where E-E-A-T is supposed to restrict competition to genuine experts.

The only way to stop this right now is to detect it and enforce the site reputation abuse policy – and that could take years.

At best, this policy serves as a short-term fix and a warning to other sites attempting similar abuse.

However, it cannot solve the broader problem of large authority sites consistently outperforming real experts.

What’s going on with Google’s algorithms?

The Site Reputation Abuse Policy is a temporary solution to a much larger systemic problem plaguing Google.

Algorithmically, Google needs to be better equipped to rank real experts in a given field and filter out sites that are not reputable.

One of the most common theories is that Google is increasing the weight it places on brand authority.

According to a recent Moz study, winners of the helpful content update were more likely to have stronger “brand authority” than “domain authority.”

Average domain authority and brand authority of the analyzed sites

Essentially, the more branded searches a site receives, the more likely it was to come out a winner in the latest updates.

This makes sense, since Google tends to favor recognizable brands (people search for “Nike,” not a generic “sneakers” site) for relevant queries.

However, large brands such as Forbes, CNN, The Wall Street Journal and Progressive also receive enormous amounts of branded search.

If Google places too much weight on this signal, it creates opportunities for large sites to either intentionally exploit or unintentionally capitalize on the power of their domain or branded search.
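To make that failure mode concrete, here is a toy sketch. It is entirely my own construction – not Moz’s Brand Authority formula and not a confirmed Google signal – treating “brand strength” as the share of a site’s search demand that comes from branded queries:

```python
# Toy proxy only: not Moz's Brand Authority formula and not a confirmed
# Google signal. It treats "brand strength" as the share of a site's
# search demand that comes from branded (navigational) queries.

def branded_share(branded_searches: int, total_searches: int) -> float:
    """Fraction of a site's search demand that is branded."""
    return branded_searches / total_searches

# Hypothetical monthly numbers, invented for illustration.
big_general_brand = branded_share(8_000_000, 10_000_000)  # a Forbes-sized site
niche_expert_site = branded_share(20_000, 100_000)        # a dedicated vet site

print(f"Big general brand: {big_general_brand:.0%}")  # 80%
print(f"Niche expert site: {niche_expert_site:.0%}")  # 20%

# If rankings lean heavily on a signal like this, the big brand "wins"
# even on topics where the niche site has far deeper expertise.
```

Under a signal like this, a huge general-interest brand outscores a dedicated expert site even on topics it knows nothing about.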

This system does not reward true experts in a particular field.

Currently, the site reputation abuse policy is Google’s only tool to address these issues when its algorithm fails.

While there is no easy solution, it seems logical for Google to focus more on genuine topical authority in its algorithm going forward.

When we look at the Google Search API leak, we see that Google may use different variables to determine the topical focus of a site.

For example, the siteEmbedding attribute implies that Google can generate an embedding that categorizes your entire site.

I’m particularly interested in the siteFocusScore variable.

According to the leak, this is “a number that represents how focused a site is on a single topic.”

siteFocusScore variable

If a site starts to spread its attention across too many topics, could this score be a trigger indicating something more is at play?
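Here is a minimal sketch of how a score like this could behave. The attribute names siteEmbedding and siteFocusScore come from the leak, but the computation – averaging per-page embeddings into a site centroid and measuring how tightly pages cluster around it – is purely my assumption for illustration:

```python
import numpy as np

def site_focus_score(page_embeddings: np.ndarray) -> float:
    """Illustrative only: one way a "how focused is this site on a
    single topic" number could be computed. The leaked attribute's
    real formula is unknown; this averaging approach is an assumption.

    page_embeddings: shape (num_pages, dims), one vector per page.
    Returns roughly 0..1, where higher means more topically focused.
    """
    # A siteEmbedding-style centroid: the mean of all page embeddings.
    centroid = page_embeddings.mean(axis=0)

    # Cosine similarity of each page to that centroid.
    norms = np.linalg.norm(page_embeddings, axis=1) * np.linalg.norm(centroid)
    similarities = (page_embeddings @ centroid) / norms

    # A tightly clustered (single-topic) site scores near 1.0;
    # a site that covers everything scores noticeably lower.
    return float(similarities.mean())

# Toy data: three "insurance" pages, then the same site plus one off-topic pet post.
focused_site = np.array([[0.90, 0.10], [0.85, 0.15], [0.95, 0.05]])
mixed_site = np.vstack([focused_site, [[0.10, 0.90]]])

print(site_focus_score(focused_site))  # ~1.0
print(site_focus_score(mixed_site))    # drops once off-topic content appears
```

In a scheme like this, a Progressive-style blog post about puppy vaccinations would pull the site’s focus score down – exactly the kind of drift a ranking system could watch for.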

Moving Forward

I don’t think the site reputation abuse policy is a bad thing.

At the very least, it serves as a much-needed warning to the web, as the threat of serious consequences may deter the most egregious abuses.

In the short term, however, the policy reads as an admission that Google has no algorithmic solution to the problem.

Since the problem can’t be caught algorithmically, Google needs a way to threaten manual action when necessary.

However, I’m optimistic that Google will figure this out in the long term and that search quality will improve in the coming years.

Contributing writers are invited to create content for Search Engine Land and are selected for their expertise and contributions to the search community. Our authors work under editorial supervision, and materials are checked for quality and relevance to our readers. The opinions they express are their own.