On-Page Factors

1. Sitemap

A site map (or sitemap) is a listing of the pages of a website. Sitemaps are used throughout the planning of a website by its designers. They take two main forms: human-visible listings, typically hierarchical, of the pages on a site, and structured listings intended for web crawlers such as search engines.

XML Sitemap:

The XML (Extensible Markup Language) sitemap protocol is intended specifically for search engine spiders. At its root, an XML sitemap is a document that captures all the behind-the-scenes activity on a website: not just the site's main URL, but all the URLs on the site along with their associated metadata. This can include when a URL was last updated, how important it is, how frequently it changes, the URL's relation to the rest of the site, and so on.
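A minimal XML sitemap following the sitemaps.org protocol looks like this (the URL and values are illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- Location of the page -->
    <loc>https://www.example.com/</loc>
    <!-- When the URL was last updated -->
    <lastmod>2021-01-01</lastmod>
    <!-- How frequently the page is likely to change -->
    <changefreq>monthly</changefreq>
    <!-- Priority relative to other URLs on the same site (0.0 to 1.0) -->
    <priority>0.8</priority>
  </url>
</urlset>
```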

HTML Sitemap:

An HTML (Hypertext Markup Language) sitemap is simply a general outline of the site: just the pages and information a visitor would care about. If you are on a site looking for the shopping cart or the "Contact Us" page and can't find it, you can go to the sitemap and easily find it there. While this is geared towards the visitor, it can also help your search engine ranking, because it makes your site easy to use from the site visitor's point of view.
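As a sketch, an HTML sitemap is just an ordinary page of links (the page names here are illustrative):

```html
<!-- A simple HTML sitemap: a plain list of links a visitor can browse -->
<ul>
  <li><a href="/">Home</a></li>
  <li><a href="/products">Products</a></li>
  <li><a href="/cart">Shopping Cart</a></li>
  <li><a href="/contact">Contact Us</a></li>
</ul>
```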

Tool:

Sitemap Generator.

To use the Sitemap Generator tool, just click on the "Sitemap" button.

2. Canonical Tag

The canonical tag is a page-level meta tag placed in the HTML header of a webpage. It tells search engines which URL is the canonical version of the page being displayed. Its purpose is to keep duplicate content out of the search engine index while consolidating your page's strength into one "canonical" page.

Example:

Multiple URLs pointing to identical content cause duplicate-content issues. For example, the following three URLs serve identical content; to resolve this issue we use the canonical tag.

  • wseofocus.com/index
  • https://www.wseofocus.com
  • wseofocus.com

So, in this case, we write the following code in the head section so that the crawler understands these different URLs all point to the same page.
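The tag itself (reconstructed here using the standard canonical syntax, since the original snippet was shown as an image) is placed in the `<head>` of each duplicate page:

```html
<link rel="canonical" href="https://www.wseofocus.com/" />
```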


This finalizes the URL, so the Google crawler understands that the actual website address is "https://www.wseofocus.com".

3. Redirection:

A redirect is a way to send both users and search engines to a different URL from the one they originally requested. Below are descriptions of some of the commonly used types of redirects.

301 Redirect:

A 301 (Permanent Redirect) should be used to clearly indicate that a page has moved permanently and to avoid confusing search engines.

302 Redirect:

A 302 (Temporary Redirect) temporarily redirects one URL to another.
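As an illustrative sketch on an Apache server (assuming mod_alias is enabled; the paths are hypothetical), both kinds of redirect can be declared in an .htaccess file:

```apache
# 301: the page has moved permanently
Redirect 301 /old-page https://www.example.com/new-page

# 302: the page is only temporarily elsewhere
Redirect 302 /summer-sale https://www.example.com/current-offers
```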

Tools:

  • 301 Redirect Checker Tool

4. SSL Certificate (Secure Sockets Layer)

An SSL certificate is also used to boost the ranking of your site. SSL stands for "Secure Sockets Layer". It is the industry standard for web security technology and is recognized worldwide. As the name suggests, SSL establishes an encrypted connection between the web server and the browser.

Why do you need it?

The answer is quite simple. If your site takes data from the user or sends the user sensitive information, then your website should implement SSL. With this benchmark in mind, most of the websites on the internet should implement SSL, because most websites nowadays have login functionality or at least collect user email addresses for newsletter subscriptions.
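Once an SSL certificate is installed, a common companion step is to redirect all HTTP traffic to HTTPS. A minimal sketch for Apache, assuming mod_rewrite is available:

```apache
RewriteEngine On
# If the request did not arrive over HTTPS...
RewriteCond %{HTTPS} off
# ...permanently redirect it to the same host and path over HTTPS
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```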

5. Domain Name in SEO

What is a Domain?

Domain names are the unique, human-readable Internet addresses of websites. They are made up of three parts: a top-level domain (sometimes called an extension or domain suffix), a domain name, and an optional subdomain. The combination of just the domain name and the top-level domain is called a "root domain." The "http://" is part of a page's URL but not its domain name, and is known as the "protocol."
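Using a hypothetical address, the parts break down as follows:

```text
https://    blog.       example        .com
protocol    subdomain   domain name    top-level domain

root domain: example.com
```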

6. SEO Analysis & Website Speed Analysis Tools:

  • Pingdom Tool
  • GTmetrix
  • Google PageSpeed Insights
  • WooRank
  • SEO Optimizer
  • SEO Site Checkup
  • OnPage.org

7. Robots.txt File:

The robots exclusion standard, also known as the robots exclusion protocol or simply robots.txt, is a standard used by websites to communicate with web crawlers and other web robots. The standard specifies how to inform the robot which areas of the website should not be processed or scanned.

For Example:

User-agent: *

Disallow: /wp-admin/

Allow: /wp-admin/admin-ajax.php

Sitemap: https://wseofocus.com/sitemap.xml


8. Structured Schema Markup

Schema markup, found at Schema.org, is a form of microdata. Once added to a web page, schema markup creates an enhanced description (commonly known as a rich snippet), which appears in search results.

Top search engines – including Google, Yahoo, Bing, and Yandex – first began collaborating to create Schema.org in 2011.

Schema markup is especially significant in the era of Hummingbird and RankBrain. How a search engine interprets the context of a query determines the quality of a search result.
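As a minimal sketch (the organization name and URL are illustrative), schema markup is commonly added as a JSON-LD block in the page's `<head>`:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Co",
  "url": "https://www.example.com/"
}
</script>
```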