Misinterpretations and myths about search engines

Myths about search engines and how they operate have emerged and evolved over the years, and they can cause confusion for SEO beginners. In this post, the most common misconceptions about search engine optimisation will be corrected and explained thoroughly.

Search engine submission 

When SEO began in the 1990s, submission forms were part of the optimisation process: webmasters would tag their sites and pages with keyword information and submit them to the search engines. After the submission form was completed, a robot would crawl the site and add its pages to the engine's index. Because submissions were so often spam, the process didn't scale well, and since 2001 search engine submission has not been required; today it is practically pointless. The popular search engines all publicly state that they do not use submission forms; instead, the best practice is to earn links from other sites.

Submission forms do still exist, but they are merely remnants of this earlier practice and play no useful role in modern SEO. If an SEO approaches you offering search engine submission services to get your site crawled, be wary: these services will not give you anywhere near enough link power to rank competitively with other websites.



Meta tags

Meta tags (in particular the meta keywords tag) used to be an important part of the SEO process: you would list the keywords you wanted your site to rank for, and when users searched for those terms, your page could appear in the results. Unfortunately this tag was spammed so heavily that it was eventually dropped by all the popular search engines as a ranking signal. Today, title tags and meta description tags remain a crucial part of high-quality SEO, and the meta robots tag is an important tool for controlling crawler access.
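To make the distinction concrete, here is a minimal, hypothetical HTML head (the site name and wording are invented for illustration) showing the tags that still matter alongside the obsolete meta keywords tag:

```html
<head>
  <!-- Still crucial: the title tag is a strong relevancy and click-through signal -->
  <title>Hand-Tied Bouquets | Example Florist</title>

  <!-- Not a ranking signal, but often shown as the snippet in search results -->
  <meta name="description" content="Fresh hand-tied bouquets delivered next day across the UK.">

  <!-- Important for controlling crawler access: index this page, but don't follow its links -->
  <meta name="robots" content="index, nofollow">

  <!-- Obsolete: ignored by the major engines as a ranking signal -->
  <meta name="keywords" content="flowers, bouquets, cheap flowers">
</head>
```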

Keyword stuffing

One common myth in SEO concerns keyword density: the idea that the number of times a given word appears on a page, divided by the total number of words on that page, is used by the search engines in their relevancy and ranking calculations. Even though this myth has been rejected time and again, the claim that keyword density is a key SEO metric persists. It isn't: simply use keywords that are relevant to the topic, and use them intelligently.
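For illustration, the density metric the myth refers to is trivial to compute; this minimal sketch (the function name and sample text are invented for the example) shows how crude it is, and why stuffing a page to raise it produces unreadable text rather than rankings:

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Occurrences of `keyword` divided by the total word count of `text`."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

# A keyword-stuffed page reads badly yet scores a "high" density:
page = "Cheap flowers here: cheap flowers, cheap flowers delivered daily."
print(round(keyword_density(page, "flowers"), 2))  # 3 of 9 words
```

Two entirely different pages can share the same density, which is exactly why no modern engine treats the number as a meaningful relevancy signal.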

Paid search helps bolster organic results 

One of the most common SEO conspiracy theories is that spending money on search engine advertising improves your organic rankings. The popular search engines have all denied this: organic results are ranked purely on relevance to the user's search. You cannot pay your way to the top of the organic rankings; an advertiser can spend millions of pounds on adverts without it affecting organic position at all.


Search engine spam

Search will always attract spam: many websites are designed specifically to abuse ranking algorithms and climb search engine results, a practice that has been growing since the mid-1990s. The engines' algorithms have advanced to the point where creating effective spam is far more difficult, and it is now largely pointless for two main reasons:

  1. It is not worth the time or effort: fighting spam is one of the search engines' main aims, and users simply hate it. Google prides itself on its algorithm updates of the last ten years and their ability to detect and remove spam efficiently. Spam still occurs today, but it is much harder to achieve, requires far more effort, and the long-term pay-off is practically nothing. So why not create a website that people and search engines will find useful for a long time instead?
  2. Smarter engines: search engines have vastly improved at identifying and fighting spam, making it almost impossible for spam to rank highly in their results. Google's Panda update introduced incredibly sophisticated algorithms, still being refined today, to fight spam and other low-value pages; the engines' main goal will always be to deliver relevant results that satisfy the user.

Search engines today have improved so dramatically that they can weed out websites that breach their guidelines. For example, the well-known flower company Interflora was completely removed from Google's rankings, a 'manual penalty', which is usually imposed for spammy content that tries to manipulate search results. The key lesson here is that trying to manipulate the system will not work in your favour: it will end in some form of manual penalty, with your site wiped from the rankings and an extremely difficult road back to a high position.


Page-level spam analysis  

Spam analysis is carried out by search engines across whole websites and individual pages. Here are some of the manipulative practices they look for at the URL level:

Keyword stuffing: one of the most common spamming techniques, this involves loading a webpage with keywords to try to manipulate its ranking in search results. Search engine algorithms now scan pages for keyword stuffing to stop spammy websites from climbing to the top of the rankings.

Manipulative linking: another very popular technique, in which spammers acquire links artificially to exploit the engines' use of link popularity in their ranking algorithms and inflate the visibility of a site. Unlike keyword stuffing, manipulative linking is harder to combat because it exists in so many forms. One example is reciprocal link exchange programs, where sites create link pages pointing back and forth at one another to boost link popularity; fortunately, the engines can spot these schemes very quickly and devalue them.

Then there are link schemes, where low-value websites are built purely as link sources to artificially inflate popularity. The engines try to combat these link farms by detecting connections between site registrations, link overlap and other signals typical of link scheme techniques. There are also paid links: those who want higher rankings without working for them simply pay for links instead. The engines are still trying to stop this, but it persists because it provides value to both buyers and sellers. Lastly, there are low-quality directory links, a constant source of manipulation: a large number of pay-for-placement web directories pass themselves off as legitimate. Search engine algorithms have been developed to battle these and several other manipulative linking techniques.


One of the search engines' simple guidelines is to show their crawlers the same text you would show a human visitor. In other words, you shouldn't hide text in your site's HTML that a normal visitor can't see. Breaking this guideline is known as 'cloaking', and the engines fight it to prevent offending pages and sites from ranking highly in the results. That said, cloaking can be done for both positive and negative reasons, which is why the engines occasionally let some instances pass when they contribute to a positive user experience.


Low-value pages

Search engines also analyse low-value pages. Even though these aren't always spam, the engines determine whether a website's content is original and valuable to users. Commonly filtered page types include duplicate content, thin affiliate content, and dynamically generated pages that provide little worth. Algorithms are now in place to weed out these low-value pages before they clutter search results and dissatisfy users.

Domain level spam analysis 

As well as scanning individual pages for spam, search engines can apply the same analysis to entire domains and subdomains.

Linking practices: just as search engines monitor the quantity and quality of links pointing at an individual page, they can do the same for whole domains. If a website uses manipulative techniques, this triggers the engines' algorithms, which can devalue the website or even issue a manual penalty, wiping the entire site from the rankings.

Trustworthiness: trust is a factor that is gained over time. Websites that have earned trust are treated differently from websites that haven't, and are taken more seriously. Trust is built largely through inbound links: if your website is linked to by high-quality, well-ranking websites, the engines are more likely to take it seriously. When you publish original content that users find valuable and that grows in popularity, the engines treat this as a positive signal and will rank your site more highly for relevant queries.

Content value: whole websites can be assessed in part on their originality and the experience they provide to users. Rather than filtering thousands of duplicate pages one by one, the engines can stop entire sites from spamming search results. If your website duplicates information from sites like Wikipedia, it is less likely to rank highly, because the engines simply don't want the same content repeated in their indexes. The engines can also evaluate their results by watching user behaviour: if a user clicks on a result and quickly hits back, this implies dissatisfaction with the content, and the engines can serve different results for the next similar query.


Signs that your site has been penalised

Sometimes it can be difficult to tell whether your site actually has a penalty, because the search engines' algorithms themselves change. So before panicking that your website has been penalised, consider whether you've changed anything on your site that could have dropped its rankings, and then work through the following checks:

Ruling out

  • Errors: Check whether there are any issues on your site that could be blocking crawling.
  • Changes: Check whether you have changed how the search engines view your content (such as internal link structure changes, content moves, or on-page changes).
  • Copied content: Many modern websites struggle to rank highly because a large proportion of their content is duplicated from elsewhere.
  • Similarity: Analyse other websites that share elements with yours, such as backlink profiles, and see whether they have lost rankings too. If so, it is more likely that the search engines have updated their algorithms, since link valuation and importance can change, ultimately causing rankings to fall.

Follow this process:

  • Check whether your site is still indexed: If it isn't, the site has most likely been banned. Once you've removed the spam, you'll need to file a re-inclusion request to get back into the search engines' rankings.
  • Check whether your site still ranks for its own domain name: If it doesn't, you probably have a penalty for manipulative activity such as keyword stuffing, cloaking or manipulative linking. To get back into the rankings, fix every potential problem with your site, for example by removing risky outbound links and on-page problems, and then fill out a re-inclusion request form.
  • Search for five or six terms or phrases used in your title tag and see whether your website appears in the first 10-20 results: If not, your links have most likely been stripped of their value. To fix this, submit a re-inclusion request through Webmaster Central, being apologetic and showing that you have cleaned up your bad links. Even if the request is accepted, it will take a long time, and natural links, to rank highly again.
  • If none of these apply to your website, then good news: you don't have a penalty, you've just lost some rankings. To recover, it's essential to build high-quality content and earn natural links.

This process won't fit every situation, but it applies to a wide range of sites suffering ordinary ranking drops.


How to get penalties lifted

Re-inclusion requests through Webmaster Central are usually a long process and are often unsuccessful; most of the time the engines won't even tell you what went wrong with your site or why it was issued a penalty. Nevertheless, here is the process you should carry out when your site has been penalised:

  1. Register with the engine's Webmaster Tools service: This establishes a basic layer of trust between your site and the search engine's teams.
  2. Review the data in your Webmaster Tools account: Check for things such as spam alert notifications, broken pages and crawl errors. Problems that are initially perceived as a spam penalty often turn out to be accessibility issues.
  3. Send the re-inclusion request through the Webmaster Tools service rather than the public form: This signals a greater level of trust and creates a higher chance of getting a reply.
  4. Full disclosure: Being honest in your re-inclusion request is crucial to getting consideration. Own up to everything you have done, whether that is spamming, the links you have acquired, or how you got them. The engines want this information so they can better prevent it in future; if you withhold it, they will consider you dishonest and reject the re-inclusion.
  5. Fix everything you can: Before submitting the request, remove anything on your site that could be perceived as negative, such as bad links; if any manipulation is in place, remove it immediately.
  6. Wait: Responses often take weeks; re-inclusion is a very long process.
  7. Go direct: If you run a large brand on the web, re-inclusion can sometimes be faster if you speak to an engine representative in person, for example at a conference. The value of being re-included quickly can be worth the price of admission.

Remember that the re-inclusion request is a privilege, not a right: the engines are under no obligation to lift a penalty and can legally exclude any site or page. So in future, avoid SEO tactics you're unsure about, or you could end up in a difficult position with the search engines.
