Our spam policies help protect users and improve the quality of search results. To be eligible to appear in Google web search results (web pages, images, videos, news content or other material that Google finds from across the web), content shouldn't violate Google Search's overall policies or the spam policies listed on this page. These policies apply to all web search results, including those from Google's own properties.
We detect policy-violating content and behaviors both through automated systems and, as needed, human review that can result in a manual action. Sites that violate our policies may rank lower in results or not appear in results at all.
If you believe that a site is violating Google's spam policies, let us know by filing a search quality user report. We're focused on developing scalable and automated solutions to problems, and we'll use these reports to further improve our spam detection systems.
Our policies cover common forms of spam, but Google may act against any type of spam we detect.
Cloaking
Cloaking refers to the practice of presenting different content to users and search engines with the intent to manipulate search rankings and mislead users. Examples of cloaking include:
- Showing a page about travel destinations to search engines while showing a page about discount drugs to users
- Inserting text or keywords into a page only when the user agent that is requesting the page is a search engine, not a human visitor
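One rough way to check your own site for user-agent cloaking is to fetch the same URL as a crawler and as a browser and compare the responses. The sketch below is a heuristic only, not how Google detects cloaking; the user-agent strings and the 0.7 similarity threshold are assumptions, and dynamic elements like ads or timestamps will add noise, so a flagged page warrants manual review rather than an automatic conclusion.

```python
import difflib
import urllib.request

# Assumed user-agent strings for comparison purposes.
GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")
BROWSER_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"

def fetch(url: str, user_agent: str) -> str:
    """Fetch a URL while presenting the given User-Agent header."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.read().decode("utf-8", errors="replace")

def similarity(a: str, b: str) -> float:
    """Ratio from 0.0 to 1.0 of how alike two responses are."""
    return difflib.SequenceMatcher(None, a, b).ratio()

def looks_cloaked(url: str, threshold: float = 0.7) -> bool:
    """Heuristic: flag a URL whose crawler-facing and browser-facing
    responses differ substantially. The threshold is arbitrary."""
    bot_html = fetch(url, GOOGLEBOT_UA)
    user_html = fetch(url, BROWSER_UA)
    return similarity(bot_html, user_html) < threshold
```

A hacked site serving spam only to crawlers would typically score a very low similarity here, while ordinary personalization scores close to 1.0.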
If a site is hacked, it's not uncommon for the hacker to use cloaking to make the hack harder for the site owner to detect. Read more about fixing hacked sites and avoiding being hacked.
If you operate a paywall or a content-gating mechanism, we don't consider this to be cloaking as long as Google can see the full content of what's behind the paywall, just like any person who has access to the gated material, and as long as you follow our Flexible Sampling general guidance.
Doorways
Doorways are sites or pages created to rank for specific, similar search queries. They lead users to intermediate pages that are not as useful as the final destination. Examples of doorways include:
- Having multiple websites with slight variations to the URL and home page to maximize their reach for any specific query
- Having multiple domain names or pages targeted at specific regions or cities that funnel users to one page
- Pages generated to funnel visitors into the actual usable or relevant portion of your site(s)
- Substantially similar pages that are closer to search results than a clearly defined, browseable hierarchy
Hacked content
Hacked content is any content placed on a site without permission, due to vulnerabilities in a site's security. Hacked content gives poor search results to our users and can potentially install malicious content on their machines. Examples of hacking include:
- Page injection: Sometimes, due to security flaws, hackers are able to add new pages to your site that contain spammy or malicious content. These pages are often meant to manipulate search engines or to attempt phishing. Your existing pages might not show signs of hacking, but these newly-created pages could harm your site's visitors or your site's performance in search results.
- Content injection: Hackers might also try to subtly manipulate existing pages on your site. Their goal is to add content to your site that search engines can see but which may be harder for you and your users to spot. This can involve adding hidden links or hidden text to a page by using CSS or HTML, or it can involve more complex changes like cloaking.
- Redirects: Hackers might inject malicious code into your website that redirects some users to harmful or spammy pages. The kind of redirect sometimes depends on the referrer, user agent, or device. For example, clicking a URL in Google Search results could redirect you to a suspicious page, but there is no redirect when you visit the same URL directly from a browser.
Here are our tips on fixing hacked sites and avoiding being hacked.
Hidden text and links
Hidden text or links is the act of placing content on a page solely to manipulate search engines, in a way that is not easily viewable by human visitors. Examples of hidden text or links that violate our policies:
- Using white text on a white background
- Hiding text behind an image
- Using CSS to position text off-screen
- Setting the font size or opacity to 0
- Hiding a link by only linking one small character (for example, a hyphen in the middle of a paragraph)
Many web design elements today dynamically show and hide content to improve user experience; these elements don't violate our policies:
- Accordion or tabbed content that toggle between hiding and showing additional content
- Slideshow or slider that cycles between several images or text paragraphs
- Tooltip or similar text that displays additional content when users hover over or interact with an element
- Text that's only accessible to screen readers and is intended to improve the experience for those using screen readers
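The violating styles listed above often show up as inline CSS, which makes a first-pass audit of your own pages scriptable. The sketch below is a rough, intentionally incomplete heuristic I'm adding for illustration: it only inspects raw markup, so real detection would also need to render the page and resolve external stylesheets, and matches here are prompts for review, not proof of a violation.

```python
import re

# Inline-style patterns commonly associated with hidden text.
# This list is illustrative, not exhaustive: legitimate designs
# (e.g. animations that start at opacity 0) can match too.
HIDDEN_STYLE_PATTERNS = [
    r"font-size\s*:\s*0(?!\.)",       # zero-size text
    r"opacity\s*:\s*0(?!\.)",         # fully transparent text
    r"text-indent\s*:\s*-\d{4,}px",   # text pushed far off-screen
    r"left\s*:\s*-\d{4,}px",          # absolutely positioned off-screen
]

def suspicious_styles(html: str) -> list:
    """Return the hidden-text patterns found in the markup."""
    return [p for p in HIDDEN_STYLE_PATTERNS if re.search(p, html)]
```

Pages using accordions, tabs, or screen-reader-only text would not match these patterns, which mirrors the policy distinction above: the issue is content hidden solely from humans, not content toggled for usability.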
Keyword stuffing
Keyword stuffing refers to the practice of filling a web page with keywords or numbers in an attempt to manipulate rankings in Google Search results. Often these keywords appear in a list or group, unnaturally, or out of context. Examples of keyword stuffing include:
- Lists of phone numbers without substantial added value
- Blocks of text that list cities and regions that a web page is trying to rank for
- Repeating the same words or phrases so often that it sounds unnatural. For example:
Unlimited app store credit. There are so many sites that claim to offer app store credit for $0 but they're all fake and always mess up with users looking for unlimited app store credits. You can get limitless credits for app store right here on this website. Visit our unlimited app store credit page and get it today!
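Repetition like the passage above can be quantified with a simple phrase-density check. The function below is a toy heuristic I'm adding for illustration; it is not how Google measures keyword stuffing, and there is no single "safe" density threshold.

```python
import re

def keyword_density(text: str, phrase: str) -> float:
    """Fraction of the words in `text` accounted for by
    occurrences of `phrase` (case-insensitive, whole-word).
    A toy heuristic for spotting unnatural repetition."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    # Count non-overlapping-agnostic matches of the phrase window.
    hits = sum(
        1 for i in range(len(words) - n + 1)
        if words[i:i + n] == phrase_words
    )
    return hits * n / len(words)
```

Running this on the "unlimited app store credit" passage above for the phrase "app store credit" returns a conspicuously high value; natural prose about the same topic scores far lower.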
Link spam
Google uses links as an important factor in determining the relevancy of web pages. Any links that are intended to manipulate rankings in Google Search results may be considered link spam. This includes any behavior that manipulates links to your site or outgoing links from your site. The following are examples of link spam:
- Buying or selling links for ranking purposes. This includes:
- Exchanging money for links, or posts that contain links
- Exchanging goods or services for links
- Sending someone a product in exchange for them writing about it and including a link
- Excessive link exchanges ("Link to me and I'll link to you") or partner pages exclusively for the sake of cross-linking
- Using automated programs or services to create links to your site
- Requiring a link as part of a Terms of Service, contract, or similar arrangement without allowing a third-party content owner the choice of qualifying the outbound link
- Text advertisements or text links that don't block ranking credit
- Advertorials or native advertising where payment is received for articles that include links that pass ranking credit, or links with optimized anchor text in articles, guest posts, or press releases distributed on other sites. For example:
There are many wedding rings on the market. If you want to have a wedding, you will have to pick the best ring. You will also need to buy flowers and a wedding dress.
- Low-quality directory or bookmark site links
- Keyword-rich, hidden, or low-quality links embedded in widgets that are distributed across various sites
- Widely distributed links in the footers or templates of various sites
- Forum comments with optimized links in the post or signature, for example:
Thanks, that's great info!
paul's pizza san diego pizza best pizza san diego
Google does understand that buying and selling links is a normal part of the economy of the web for advertising and sponsorship purposes. It's not a violation of our policies to have such links as long as they are qualified with a rel="nofollow" or rel="sponsored" attribute value on the <a> tag.
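One way to audit your own pages for paid or untrusted links that lack a qualifying rel value is to scan the anchor tags, which Python's built-in html.parser can do without third-party dependencies. This is a sketch I'm adding for illustration; it checks for nofollow, sponsored, and ugc (the value used for user-generated links), and a real audit would also need to distinguish paid links from ordinary editorial ones, which no parser can do on its own.

```python
from html.parser import HTMLParser

# rel values that qualify a link so it doesn't pass ranking credit.
QUALIFIED = {"nofollow", "sponsored", "ugc"}

class PaidLinkAuditor(HTMLParser):
    """Collect <a href> links whose rel attribute contains none of
    the qualifying values (nofollow, sponsored, ugc)."""
    def __init__(self):
        super().__init__()
        self.unqualified = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        # rel may hold multiple space-separated values, or be absent.
        rel = set((attrs.get("rel") or "").lower().split())
        if "href" in attrs and not rel & QUALIFIED:
            self.unqualified.append(attrs["href"])

def unqualified_links(html: str) -> list:
    auditor = PaidLinkAuditor()
    auditor.feed(html)
    return auditor.unqualified
```

Links this reports are not automatically spam; they are candidates to review, since only paid, sponsored, or untrusted links need the qualifying attribute.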
Machine-generated traffic
Machine-generated traffic consumes resources and interferes with our ability to best serve users. Examples of automated traffic include:
- Sending automated queries to Google
- Scraping results for rank-checking purposes or other types of automated access to Google Search conducted without express permission
Such activities violate our spam policies and the Google Terms of Service.
Malware and malicious behaviors
Google checks websites to see whether they host malware or unwanted software that negatively affects the user experience.
Malware is any software or mobile application specifically designed to harm a computer, a mobile device, the software it's running, or its users. Malware exhibits malicious behavior that can include installing software without user consent and installing harmful software such as viruses. Site owners sometimes don't realize that their downloadable files are considered malware, so these binaries might be hosted inadvertently.
Unwanted software is an executable file or mobile application that engages in behavior that is deceptive, unexpected, or that negatively affects the user's browsing or computing experience. Examples include software that switches your homepage or other browser settings to ones you don't want, or apps that leak private and personal information without proper disclosure.
Site owners should make sure they don't violate the Unwanted Software Policy and follow our guidelines.
Misleading functionality
Site owners should create websites with high quality content and useful functionality that benefits users. However, some site owners intend to manipulate search rankings by intentionally creating sites with misleading functionality and services that trick users into thinking they can access content or services that, in reality, they cannot. Examples of misleading functionality include:
- A site with a fake generator that claims to provide app store credit but doesn't actually provide the credit
- A site that claims to provide certain functionality (for example, PDF merge, countdown timer, online dictionary service), but intentionally leads users to deceptive ads rather than providing the claimed services
Scraped content
Some site owners base their sites around content taken ("scraped") from other, often more reputable sites. Scraped content, even from high quality sources, without additional useful services or content provided by your site may not provide added value to users. It may also constitute copyright infringement. A site may also be demoted if a significant number of valid legal removal requests have been received. Examples of abusive scraping include:
- Sites that copy and republish content from other sites without adding any original content or value, or even citing the original source
- Sites that copy content from other sites, modify it only slightly (for example, by substituting synonyms or using automated techniques), and republish it
- Sites that reproduce content feeds from other sites without providing some type of unique benefit to the user
- Sites dedicated to embedding or compiling content, such as videos, images, or other media from other sites, without substantial added value to the user
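Copied and lightly rewritten text of the kind described above can be detected with classic near-duplicate techniques such as word shingling. The sketch below is a minimal illustration of that general approach, not Google's actual scraped-content detection; the shingle size of 3 is an arbitrary assumption.

```python
def shingles(text: str, k: int = 3) -> set:
    """The set of k-word shingles (overlapping word windows)
    of a text, lowercased."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a: str, b: str) -> float:
    """Jaccard similarity of two texts' shingle sets.
    Values near 1.0 suggest one text is a copy, or a light
    synonym-swapped rewrite, of the other."""
    sa, sb = shingles(a), shingles(b)
    if not sa and not sb:
        return 1.0
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)
```

Simple synonym substitution breaks only the shingles containing the swapped word, so copied passages still share most of their shingles with the original, which is exactly why automated rewording rarely makes scraped content look original.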
Sneaky redirects
Redirecting is the act of sending a visitor to a different URL than the one they initially requested. Sneaky redirecting is doing this maliciously in order to either show users and search engines different content or show users unexpected content that does not fulfill their original needs. Examples of sneaky redirects include:
- Showing search engines one type of content while redirecting users to something significantly different
- Showing desktop users a normal page while redirecting mobile users to a completely different spam domain
While sneaky redirection is a type of spam, there are many legitimate, non-spam reasons to redirect one URL to another. Examples of legitimate redirects include:
- Moving your site to a new address
- Consolidating several pages into one
- Redirecting users to an internal page once they are logged in
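The distinguishing property of a legitimate redirect, such as a moved page, is that it treats every visitor the same. A minimal sketch of that idea, with hypothetical paths, is a plain mapping that decides the response from the requested path alone:

```python
# Hypothetical old -> new paths after a site restructuring.
MOVED = {
    "/old-pricing": "/pricing",
    "/blog/2019/launch": "/blog/launch",
}

def redirect_for(path: str):
    """Return (status, location) for a request path.

    A 301 signals a permanent move, so search engines can
    consolidate signals on the destination URL. Crucially, the
    answer depends only on the path, never on the user agent,
    referrer, or device, which is what keeps an ordinary site
    move out of sneaky-redirect territory.
    """
    if path in MOVED:
        return 301, MOVED[path]
    return 200, None
```

By contrast, the sneaky patterns above branch on who is asking (search engine vs. user, desktop vs. mobile), which is the deception this policy targets.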
When evaluating whether a redirect is sneaky, consider whether the redirect is intended to deceive users or search engines. Learn more about how to appropriately employ redirects on your site.
Spammy automatically-generated content
Automatically generated (or "auto-generated") content is content that's been generated programmatically without producing anything original or adding sufficient value; instead, it's been generated for the primary purpose of manipulating search rankings and not helping users. Examples of spammy auto-generated content include:
- Text that makes no sense to the reader but contains search keywords
- Text translated by an automated tool without human review or curation before publishing
- Text generated through automated processes without regard for quality or user experience
- Text generated using automated synonymizing, paraphrasing, or obfuscation techniques
- Text generated from scraping feeds or search results
- Stitching or combining content from different web pages without adding sufficient value
If you're hosting such content on your site, you can use these methods to exclude it from Search.
Thin affiliate pages
Thin affiliate pages are pages with product affiliate links on which the product descriptions and reviews are copied directly from the original merchant without any original content or added value.
Affiliate pages can be considered thin if they are part of a program that distributes its content across a network of affiliates without providing additional value. These sites often appear to be cookie-cutter sites or templates with the same or similar content replicated within the same site or across multiple domains or languages. If a search results page returned several of these sites with the same copied content, the duplicated thin affiliate pages would create a frustrating user experience.
Not every site that participates in an affiliate program is a thin affiliate. Good affiliate sites add value by offering meaningful content or features. Examples of good affiliate pages include offering additional information about price, original product reviews, rigorous testing and ratings, navigation of products or categories, and product comparisons.
User-generated spam
User-generated spam is spammy content added to a site by users through a channel intended for user content. Often site owners are unaware of the spammy content. Examples of spammy user-generated content include:
- Spammy accounts on hosting services that anyone can register for
- Spammy posts on forum threads
- Comment spam on blogs
- Spammy files uploaded to file hosting platforms
Here are several tips on how to prevent abuse of your site's public areas. Here are our tips on fixing hacked sites and avoiding being hacked.
Other behaviors that can lead to demotion or removal
Copyright removals
When we receive a high volume of valid copyright removal requests involving a given site, we are able to use that as a quality signal and demote other content from the site in our results. This way, if there is other infringing content, users are less likely to encounter it versus the original content. We apply similar demotion signals to other classes of complaints, including complaints about counterfeit goods and court-ordered removals.
Online harassment removals
Google has policies that allow the removal of certain types of content if it violates our policies involving personal information, such as non-consensual explicit images, doxxing content, or content hosted by sites with exploitative removal practices.
If we process a high volume of these removals involving a particular site, we use that as a quality signal and demote other content from the site in our results. We also look to see if the same pattern of behavior is happening with other sites in relation to people's names and, if so, apply demotions to content on those sites.
Once someone has requested a removal from one site with predatory practices, we will automatically apply ranking protections to help prevent content from other similar low quality sites from appearing in Google Search results for people's names.
Scam and fraud
Scam and fraud come in many forms, including but not limited to impersonating an official business or service through imposter sites, intentionally displaying false information about a business or service, or otherwise attracting users to a site on false pretenses. Using automated systems, Google seeks to identify pages with scammy or fraudulent content and prevent them from showing up in Google Search results. Examples of online scams and fraud include:
- Impersonating a well-known business or service provider to trick users into paying money to the wrong party
- Creating deceptive sites pretending to provide official customer support on behalf of a legitimate business or provide fake contact information of such business