Cloaking Can Never Be A White Hat Practice

Cloaking is a black hat SEO technique in which different content is shown to search engines and human visitors. It’s an attempt to manipulate rankings and is against search engine guidelines.

A website can use cloaking to give search engines the illusion that it contains different content than it actually does.

Visitors experience an interactive, visually rich website that may, for example, contain little text and rely heavily on images and multimedia.

Because search engines cannot interpret those graphical and multimedia elements, they are served a different version of the page at the same URL.

That cloaked version is loaded with search-engine-optimized text and keywords. Search engines do not allow this kind of keyword cloaking.

Search engines penalize cloaking: if they detect it, they can permanently remove the website from their index, so it no longer appears in search results.

How is cloaking done?

  • User-agent cloaking
  • IP-based cloaking
  • JavaScript cloaking
  • HTTP_REFERER cloaking
  • HTTP Accept-Language header cloaking

  1. User-agent cloaking

A user-agent is a program (a software agent) that acts on the user’s behalf.

A web browser is a familiar example: when it requests a page, it sends a string to the server identifying which user-agent is making the request. In this form of cloaking, the server inspects that string and, if it recognizes the user-agent as a search engine crawler, serves the cloaked content instead of the page visitors see.
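
As a rough illustration only, here is a minimal sketch of such a check using Flask; the route, the page contents, and the crude substring test against a crawler token list are assumptions, not a recommended implementation.

```python
# Minimal user-agent cloaking sketch (illustrative only; assumes Flask is installed).
from flask import Flask, request

app = Flask(__name__)

# Hypothetical list of crawler tokens the server looks for.
CRAWLER_TOKENS = ("googlebot", "bingbot", "duckduckbot")

@app.route("/")
def index():
    user_agent = (request.headers.get("User-Agent") or "").lower()
    if any(token in user_agent for token in CRAWLER_TOKENS):
        # Keyword-rich text served only to crawlers -- exactly the behaviour
        # search engines classify and penalize as cloaking.
        return "<h1>Keyword-optimized copy that only crawlers see</h1>"
    # Media-heavy, low-text page served to human visitors.
    return "<video src='intro.mp4' autoplay></video>"
```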

  2. IP-based cloaking

The IP address of each visitor depends on their location and internet service provider. With this method, the server identifies requests by IP address: crawlers are served the page that earns high traffic and good search engine rankings, while human visitors are redirected to the page the site owner actually wants them to see. Your hosting company’s control panel can provide reverse DNS records, which help map IP addresses to crawler hostnames, so you can create .htaccess rules that redirect those requests. This is the most commonly used form of cloaking.
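
For illustration, here is a sketch of the IP-identification step in Python: a reverse DNS lookup plus a forward confirmation decides whether a request comes from a known crawler network. The host-name suffixes and the helper name are assumptions; a real setup would typically encode the resulting rules in .htaccess instead.

```python
# Sketch: classify a visitor IP as a search engine crawler via reverse DNS.
# Host suffixes below are illustrative; verify against each engine's documentation.
import socket

CRAWLER_HOST_SUFFIXES = (".googlebot.com", ".google.com", ".search.msn.com")

def is_crawler_ip(ip_address: str) -> bool:
    try:
        hostname, _, _ = socket.gethostbyaddr(ip_address)      # reverse DNS lookup
        forward_ips = socket.gethostbyname_ex(hostname)[2]     # forward confirmation
    except (socket.herror, socket.gaierror):
        return False
    return hostname.endswith(CRAWLER_HOST_SUFFIXES) and ip_address in forward_ips

# A cloaking setup branches on this result (or on .htaccess rules generated
# from the same data) to decide which version of the page to serve.
```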

  3. JavaScript cloaking

With this method, users whose browsers have JavaScript enabled are shown one version of the site, while agents without JavaScript (such as search engine crawlers) see a different version.
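
A minimal sketch of the idea, again served from a hypothetical Flask route: the static HTML carries the text a non-JavaScript crawler indexes, while the inline script rewrites the page for real browsers. All names and content are placeholders.

```python
# JavaScript cloaking sketch: crawlers without JS index the static text,
# browsers execute the script and see a different page. Illustrative only.
from flask import Flask

app = Flask(__name__)

PAGE = """
<html>
  <body>
    <div id="content">
      <!-- A crawler that does not run JavaScript indexes this text. -->
      <h1>Keyword-rich copy intended for search engines</h1>
    </div>
    <script>
      // A JavaScript-enabled browser replaces it with the visual version.
      document.getElementById("content").innerHTML =
        "<video src='promo.mp4' autoplay></video>";
    </script>
  </body>
</html>
"""

@app.route("/")
def page():
    return PAGE
```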

  4. HTTP_REFERER cloaking

With this method, the server examines the HTTP_REFERER header of each request and, based on its value, serves either the cloaked or the uncloaked version of the website (a combined sketch of this and the next method follows below).

  5. HTTP Accept-Language header cloaking

Here the choice of which version to serve is based on the HTTP Accept-Language header: if the header matches the signature a search engine crawler sends, the cloaked version of the website is returned.
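
Both header-based methods reduce to the same server-side branch. The combined sketch below is illustrative only; the "crawler" signals it uses (a missing Referer and a missing Accept-Language header) are assumptions, not a definitive fingerprint.

```python
# Combined sketch of the HTTP_REFERER and Accept-Language checks described above.
from flask import Flask, request

app = Flask(__name__)

@app.route("/")
def index():
    referer = request.headers.get("Referer")
    accept_language = request.headers.get("Accept-Language")

    # Many crawlers fetch pages directly (no Referer) and omit the
    # Accept-Language header, so a cloaker treats such requests differently.
    looks_like_crawler = referer is None or accept_language is None

    if looks_like_crawler:
        return "<h1>Cloaked, keyword-optimized version</h1>"
    return "<h1>Regular version shown to human visitors</h1>"
```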

How can cloaking be implemented in SEO?

Let us look at a few common ways cloaking is implemented in practice:

  • Text that is invisible or hidden

If you add text in the same color as the background, it is invisible to human visitors but is still read and indexed by crawlers.

  • Websites that use Flash

SEO guidelines discourage Flash, but in some cases it cannot be avoided. Rather than rebuilding the entire site in plain HTML, site owners create content-rich HTML pages for search engines while delivering the Flash version to visitors.

  • A text-rich HTML website

Good SEO calls for as high a text-to-HTML ratio as possible; in other words, your pages should contain more visible text (content) than HTML markup. Short articles produce a very low text-to-HTML ratio, so some people cloak their websites to appear to meet this guideline instead of redesigning or rewriting them.
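
To make the ratio concrete, here is a small, self-contained sketch that approximates a page’s text-to-HTML ratio by stripping tags. The regexes and the example page are simplifications compared with what SEO tools actually measure.

```python
# Rough text-to-HTML ratio: visible-text length divided by total HTML length.
# Real SEO tools use full HTML parsers; this regex version is a simplification.
import re

def text_to_html_ratio(html: str) -> float:
    no_scripts = re.sub(r"(?is)<(script|style)\b.*?</\1>", "", html)  # drop script/style blocks
    text_only = re.sub(r"(?s)<[^>]+>", "", no_scripts)                # drop remaining tags
    text_only = re.sub(r"\s+", " ", text_only).strip()                # collapse whitespace
    return len(text_only) / max(len(html), 1)

page = "<html><head><title>Post</title></head><body><p>Short article.</p></body></html>"
print(f"text-to-HTML ratio: {text_to_html_ratio(page):.0%}")  # short pages score low
```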

  • JavaScript replacement

JavaScript can also be used to serve the textual content contained inside a Flash or other multimedia element to agents without JavaScript support, while visitors whose browsers run JavaScript see the multimedia version.

Is there such a thing as White Hat Cloaking?

A commonly asked question is whether there is such a thing as white hat cloaking.

According to Matt Cutts:

“White hat cloaking is incompatible with Google’s business model. White-hat cloaking has never been an exception. You should never believe someone who says otherwise.”

He also said that if a site identifies Googlebot by its user-agent or IP address and treats it differently, Google considers that cloaking, and the site may be punished.

That answers our question: Google’s webmaster guidelines leave no room for ‘white hat cloaking’. Don’t be fooled by anyone who suggests you try it.

How does Google penalize cloaking?

Search engines continually update their algorithms, and if they discover that you use cloaking, your site may be permanently banned from their index. BMW, for example, was “blacklisted” by Google in February 2006 for breaking its guidelines.

Ad cloaking

Ad cloaking is a sophisticated means of camouflaging malicious advertisements within programmatic advertising.

When scammers realize that screening efforts are under way, they hide their malicious activity so that security tools scanning the ad tag cannot detect it. Cloaked attacks are engineered to pass the scan at the ad-tag level, before the impression is generated, giving the scanner a false reading.

There is a consistent pattern over time: cloaked attacks target environments with real end users and stay dormant in environments without them. “Non-user” environments include search engines and advertising tracking or scanning software. A cloaker uses a variety of signals, such as IP addresses, browsers, and devices, to identify these artificial, non-user environments.

By hiding their real URLs inside lines of code, scammers often evade layers of manual and automated quality assurance; sometimes their code even mimics a legitimate publisher’s or company’s URL. As long as basic scanner tools judge the fraudulent code to be legitimate, it passes through and is delivered where real users can see and click it.

How Malvertisers Bypass Ad Scanning

Cloaking affects both publishers and advertisers, depending on the fraudster’s strategy and end goal. A cloaker who wants to steal ad spend from legitimate buyers may build a fake website imitating a premium publisher and conceal the actual page URL within it.

Without safeguards, an ad platform will treat this site as legitimate and send it quality ads that no one will ever see. Because the platform effectively conflates the genuine publisher’s website with the counterfeit one, the genuine publisher’s viewability drops, and so do its CPMs, since the platform believes the publisher has more inventory than it really does.

The methods are much the same when bad actors target a publisher with cloaking. The fraudsters appear to have built a legitimate ad creative with an accompanying landing page (for example, a rental-car ad), and this is what ad scanners “see” when they examine the ad tag. The actual URLs of the real creative and landing page are cloaked inside the code.

As soon as the publisher loads the ad, the fake creative is swapped out for low-quality, often sensational creative (for example, an ad featuring celebrities in crisis). The landing page is swapped as well, so visitors who click the ad end up on a counterfeit site where they are targeted with malware, phishing, or other scams, a distinct risk in its own right.

Cloaking Signals

Here are some signs that publishers and advertisers should treat as red flags for serious ad quality issues:

  • CTR increases on display ads

Display ad CTRs are typically low industry-wide. Average figures vary by source, but under 0.1% is generally expected for display ads today. A sudden increase in your CTR may indicate that a clickbait ad campaign has hit your site.

  • Diminished time on site, session duration, or revenue, or a rising visitor bounce rate

A negative change in any of these metrics can translate into lost monetization. Publishers who first notice a loss of monetization should use analytics to track down the sources of the poor performance.

  • Viewability and CPM rates have fallen

If viewability or CPMs suddenly drop, the publisher’s buy-side partners may have been hit by a cloaked attack: ad platforms that buy the counterfeit inventory divert advertisers’ spend away from the real publisher’s site to the counterfeit one. Publishers that spot such dips early should communicate clearly with their demand partners.

  • In-banner video (IBV) appearing on the site

In-banner video is a long-standing industry issue that degrades the user experience and hurts premium advertisers in particular.

Demand partners should be made aware of any IBV and asked what security and quality assurance measures are in place.

Detecting cloaked ads

Because cloaked ads only reveal their true nature after the final scan, the supply chain needs to be checked at multiple points. A variety of scanning solutions are available, but not all of them can scan every creative, and scanners that only look at a sample may miss real risks.

Cloaking happens in real time: the ad creative is switched out at the last micro-moment, as the page and ad content render, so scanning technology does not detect the change. Real-time blocking, however, can catch the ad at the instant the cloaked creative reveals itself and before the page content loads.

Summing it up

If your site has been hacked by SEO spammers, it’s important to act quickly and get it cleaned. Blacklisting typically happens only after a site has been compromised for a considerable period, so time is of the essence.

Search engines will not display a site that has been blacklisted. A blacklisted site will experience a loss of revenue, traffic, and reputation.

Getting a professional to do the cleanup will ensure that the site is protected and cleaned properly.

