
SEO elements that need to be verified after launching a new site

Sencuart

#1
An essential step in launching a site, and one that is almost always overlooked, is SEO verification. Below is a checklist of elements that, once verified, make the transition to a new site much smoother.

Choosing a domain: the decision here is not the domain name itself, but whether to serve the site with or without the www prefix. Once you've made your choice, check that redirects from the non-preferred version to the preferred one work correctly and that they use 301 (permanent) redirects.
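As a rough illustration, the redirect check can be automated with Python's standard library. The hostnames and helper names below are illustrative assumptions, not part of any particular tool:

```python
# Sketch: verify that the non-preferred hostname answers with a 301
# pointing at the preferred version. Hostnames here are placeholders.
import http.client

def is_permanent_redirect(status, location, expected_prefix):
    """Pure check: status must be 301 and the Location header must
    point at the preferred version of the site."""
    return (status == 301
            and location is not None
            and location.startswith(expected_prefix))

def check_redirect(host, path="/", expected_prefix="https://www.example.com"):
    """Issue a HEAD request without following redirects and inspect
    the status code and Location header."""
    conn = http.client.HTTPSConnection(host, timeout=10)
    try:
        conn.request("HEAD", path)
        resp = conn.getresponse()
        return is_permanent_redirect(resp.status,
                                     resp.getheader("Location"),
                                     expected_prefix)
    finally:
        conn.close()
```

Running `check_redirect("example.com")` against the bare domain should return True only when the 301 to the www version is in place.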

Robots.txt: this is an absolutely essential element of any site, being the first file crawlers request before any indexing session. Its absence can in many cases block indexing; because of this, an empty robots.txt file is more useful than a missing one. If the file is present, it must:
  • Contain a link to the site's XML sitemap;
  • Block only directories or URLs that should not be indexed;
  • Not be the same robots.txt from the development environment that prevents the site from being indexed.
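A minimal robots.txt satisfying the points above might look like this (the blocked paths and the sitemap URL are illustrative):

```text
# Allow crawling of the whole site except a few private sections.
User-agent: *
Disallow: /admin/
Disallow: /cart/

# Link to the XML sitemap so crawlers can find it.
Sitemap: https://www.example.com/sitemap.xml
```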
Meta Tag Robots: Some platforms (like WordPress) have a setting that prevents site indexing while it is in the development environment. It shows up in the page source as the following line: <META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW">. If present, it must be removed.
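One way to check a page for this blocking tag is to scan its source with the standard library's HTML parser. This is a sketch with illustrative names, not an official tool:

```python
# Sketch: detect a robots meta tag that blocks indexing in a page's
# HTML source, using only the standard library.
from html.parser import HTMLParser

class RobotsMetaFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.blocking = False

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)  # attribute names arrive lowercased
        if attrs.get("name", "").lower() == "robots":
            content = attrs.get("content", "").lower()
            if "noindex" in content or "nofollow" in content:
                self.blocking = True

def page_blocks_indexing(html):
    """Return True if the page carries a noindex/nofollow robots tag."""
    finder = RobotsMetaFinder()
    finder.feed(html)
    return finder.blocking
```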

Test Indexing: To easily identify errors that can occur on large sites, you can crawl the site with both Xenu Link Sleuth (to identify broken links) and Screaming Frog. Screaming Frog, although a paid tool, can crawl up to 500 URLs free of charge, which is enough to get a general idea of the errors present. Errors that can occur:
  • Wrong links (pointing to the development domain);
  • Non-existent pages (404s);
  • Missing or duplicated <title> tags and meta descriptions.
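The per-page part of these checks can be sketched offline: extract the links and the <title> from a page's HTML and flag links that still point at a development domain. Class and domain names below are illustrative assumptions:

```python
# Sketch of the per-page checks a crawler performs: find links still
# pointing at the dev domain and detect a missing <title>.
from html.parser import HTMLParser

class PageAuditor(HTMLParser):
    def __init__(self, dev_domain):
        super().__init__()
        self.dev_domain = dev_domain
        self.dev_links = []      # links that still point at the dev site
        self.title = None
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href", "")
            if self.dev_domain in href:
                self.dev_links.append(href)
        elif tag == "title":
            self._in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title = (self.title or "") + data

def audit_page(html, dev_domain="dev.example.com"):
    """Return a small report dict for one page's HTML source."""
    auditor = PageAuditor(dev_domain)
    auditor.feed(html)
    return {"dev_links": auditor.dev_links,
            "title_missing": not auditor.title}
```

A real crawl would run this over every fetched URL and also record HTTP 404 responses; duplicate titles can then be found by grouping the collected titles.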
XML Sitemap: This is essential for transmitting information about the structure and size of the site to search engines. Pages should have distinct priorities depending on their position in the site, as well as appropriate change frequencies. An invalid XML sitemap does not result in penalties, but it will be ignored.
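A minimal sitemap with distinct priorities and change frequencies might look like this (the URLs are illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Higher priority and change frequency for the homepage,
     lower for category and product pages. -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <changefreq>daily</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://www.example.com/category/shoes</loc>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
  <url>
    <loc>https://www.example.com/product/running-shoe</loc>
    <changefreq>monthly</changefreq>
    <priority>0.5</priority>
  </url>
</urlset>
```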

Google Analytics: From the first hours after launch, the site must have Google Analytics installed and functional. Verify this both by looking for the Google Analytics code in the page source and by using the real-time view. Make several multi-page test visits and check that they are reported correctly. If it is an ecommerce site, also place a few test orders to check that the products, the cart, and its value are recorded correctly.
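The source-code check can be partially automated with a regular expression that looks for an Analytics tracking ID. The pattern below is a heuristic sketch covering classic "UA-" IDs and newer "G-" IDs, not an official validation:

```python
# Sketch: look for Google Analytics tracking IDs in a page's source.
# Matches Universal Analytics ("UA-xxxxx-x") and GA4 ("G-XXXXXXXXXX")
# style identifiers; this is a heuristic, not an official check.
import re

GA_ID_PATTERN = re.compile(r"\b(UA-\d{4,10}-\d{1,4}|G-[A-Z0-9]{6,12})\b")

def find_ga_ids(html_source):
    """Return all Analytics-looking IDs found in the page source."""
    return GA_ID_PATTERN.findall(html_source)
```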

Webmaster Tools: Immediately after launch, the site must be registered in Google Search Console. All variations (with and without www) must be added and the preferred version chosen. The next steps are:
  • Adding the XML sitemap and submitting it for processing;
  • Speeding up indexing via the "Fetch as Google" feature for the homepage and main pages and submitting them to Google;
  • Daily monitoring of reported errors for at least a week (a month is recommended).
Page Speed: At least 3 pages must be checked: the homepage, a category page, and a product/article page. This allows low scores to be addressed quickly by activating site or server features such as caching or compression.
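These checks can be run through Google's public PageSpeed Insights API (v5). The helper below only builds the request URLs; the page URLs are illustrative:

```python
# Sketch: build PageSpeed Insights API (v5) request URLs for the three
# page types worth checking. Fetching and parsing the JSON response
# is left out; the page URLs below are placeholders.
from urllib.parse import urlencode

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def build_psi_url(page_url, strategy="mobile"):
    """Return the API URL that scores page_url for the given strategy
    ("mobile" or "desktop")."""
    return PSI_ENDPOINT + "?" + urlencode({"url": page_url,
                                           "strategy": strategy})

pages = ["https://www.example.com/",
         "https://www.example.com/category/shoes",
         "https://www.example.com/product/running-shoe"]
requests_to_make = [build_psi_url(p) for p in pages]
```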