October 26, 2022

Alina Hasnaș

Onpage SEO Guide – how to do on-page optimization well?

On-page optimization is the set of tasks needed to meet search engine requirements and reach top positions in search rankings. Check out this Onpage SEO Guide for the best website optimization tips and see real results fast!

Onpage SEO Guide – the key on-page elements to optimize:

    1. Meta tags
    2. Image Alt Tags
    3. Structured Data - Schema.org
    4. Breadcrumb Navigation
    5. Sitemap.xml
    6. Robots.txt

Meta tags (the title and meta description) are keyword-optimized text snippets that serve as a kind of summary of a page. Generally, meta tags contain information that helps search engines crawl and index your website/web page better.

A useful tool for checking or generating Meta Tags: https://metatags.io/.
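For illustration, here is a minimal sketch of how the two meta tags might look in a page's <head> (the text is a placeholder; write your own keyword-optimized title and description):

<head>
  <title>Blueberry Pancake Recipe – Ready in 20 Minutes</title>
  <meta name="description" content="A simple blueberry pancake recipe with step-by-step instructions, ready in about 20 minutes.">
</head>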

The Alt attribute of images is, in turn, a way to tell search engines what an image is about, so that it can be catalogued correctly in Google's index.

Example code for the Alt Attribute:

<img src="pancakes.png" alt="Stack of blueberry pancakes with powdered sugar">

Schema.org: using Schema.org markup makes a product page eligible for rich results (rich snippets), which can display the name, description, price, image, etc. Schema.org thus lets you attract more potential buyers while they are searching for keywords on Google.

Structured Data is information organized in the form of code snippets that help search engines better understand the content of your website. It can also appear as rich results on the search engine results page (i.e., in the SERPs). This, in turn, helps increase the CTR (click-through rate). Structured Data can be implemented as distinct types of markup on the web page using one of the 3 major formats that Google understands.

Here are the formats structured data can take:

  • JSON-LD - can be implemented in a single block of code, without changing the rest of the page's HTML;
  • HTML Microdata - markup added directly to the HTML elements of the page. This format is based on a set of attributes that mark up elements and their values individually. The main disadvantage of Microdata is that each content entity or attribute must be individually marked up in the HTML body of the page.
  • RDFa (Resource Description Framework in Attributes) - an HTML5 extension that can likewise be used to mark up content, such as articles, as structured data.
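As an illustration, here is a minimal JSON-LD sketch for a product page (the product name, image URL, and price are placeholders, not values from this guide):

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Blueberry Pancake Mix",
  "image": "https://site.md/images/pancake-mix.png",
  "description": "Ready-to-use pancake mix with dried blueberries.",
  "offers": {
    "@type": "Offer",
    "price": "49.90",
    "priceCurrency": "MDL",
    "availability": "https://schema.org/InStock"
  }
}
</script>

You can validate such markup with Google's Rich Results Test before publishing it.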

Breadcrumb Navigation is a navigation aid that shows users where they are within a website's hierarchy and allows them to return to the previous level or to the homepage with a single click.

Where is Breadcrumb used? We use breadcrumb navigation for large websites and websites whose pages are arranged hierarchically. Breadcrumbs are great for eCommerce sites where a wide variety of products are grouped into logical categories and subcategories. On mobile devices, breadcrumbs appear above the title of a search result, in place of the URL. Breadcrumb trails can vary in depth, for example (a markup sketch follows the list below):

- a trail of 4 steps leading back to the Main Page (the site's homepage);
- a trail of 2 steps leading back to the Main Page.
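For search engines, the breadcrumb trail can also be described with Schema.org's BreadcrumbList type. Here is a minimal JSON-LD sketch for a two-step trail (the names and URLs are placeholders):

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Home", "item": "https://site.md/" },
    { "@type": "ListItem", "position": 2, "name": "Pancake Mixes", "item": "https://site.md/pancake-mixes/" }
  ]
}
</script>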

Sitemap.xml is a document that helps Google and other search engines better understand the content structure of your website. A sitemap is a protocol that allows a webmaster to tell search engines exactly which site URLs are available for crawling (indexing). This .xml file makes site navigation easier both for users and for the Google robot (Googlebot), whose task is to access the site and index its content. So the sitemap.xml file is the file that collects all the important URLs of your site. After creating sitemap.xml, you need to submit it in GSC (Google Search Console).

What information does a sitemap provide to the search engine?

A sitemap.xml tells search engines, for each URL on your site, when it was last updated, how often it changes, and how it relates to other pages on your site.
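Here is a minimal sitemap.xml sketch with two URLs (the addresses and dates are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://site.md/</loc>
    <lastmod>2022-10-26</lastmod>
    <changefreq>weekly</changefreq>
  </url>
  <url>
    <loc>https://site.md/pancake-mixes/</loc>
    <lastmod>2022-10-20</lastmod>
    <changefreq>monthly</changefreq>
  </url>
</urlset>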

The robots.txt file, by contrast, is used to exclude certain content. More specifically, it tells search engine bots which files may be crawled and which may not.

Robots.txt is a plain text file containing a few special directives, which are used to communicate with search engines like Google, Bing, Yahoo, etc.

The syntax of a robots.txt file is very simple:

User-agent: *
Disallow: /admin
Disallow: /*?route=checkout/
Disallow: /*?route=account/
Disallow: /*?route=product/search
Allow: /

Sitemap: https://site.md/sitemap.xml

Example no. 2:

User-agent: *
Disallow: /admin/
Allow: /

Sitemap: https://site.md/sitemap.xml
