An essential step when launching a site, and one that is almost always ignored, is the SEO check. Below is a list of elements that, once verified, make the transition to a new site much easier.
Choosing a domain: this is not about the domain name itself, but about choosing between the version with and without www. Once you have made your choice, check that redirects to the preferred version work correctly and that they use a 301 (permanent) redirect; a quick check is sketched below.
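A minimal sketch of that redirect check, assuming the third-party `requests` library and using `example.com` / `www.example.com` as placeholder hosts:

```python
# Verify that the non-preferred host answers with a 301 pointing at the
# preferred version. Hosts and paths below are placeholders.
import requests

PREFERRED_PREFIX = "https://www.example.com"
URLS_TO_CHECK = ["http://example.com/", "http://example.com/some-page/"]

for url in URLS_TO_CHECK:
    # Do not follow the redirect, so the original status code stays visible.
    response = requests.get(url, allow_redirects=False, timeout=10)
    status = response.status_code
    location = response.headers.get("Location", "")
    ok = status == 301 and location.startswith(PREFERRED_PREFIX)
    print(f"{url} -> {status} {location} {'OK' if ok else 'CHECK THIS'}")
```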
Robots.txt: an absolutely essential element of any site, being the first file requested by crawlers before any indexing session. Its absence can in many cases interfere with indexing, which is why an empty robots.txt file is more useful than a missing one. If the file is present, it must (see the sketch after this list):
- Contain a link to the site's XML sitemap;
- Block only directories or URLs that should not be indexed;
- Not be the robots.txt carried over from the development environment, which blocks indexing of the entire site.
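A minimal sketch of those three checks using only the Python standard library; `example.com` is a placeholder domain:

```python
# Check that robots.txt exists, references the XML sitemap, and does not
# block the whole site the way a development version typically would.
import urllib.request
import urllib.robotparser

ROBOTS_URL = "https://www.example.com/robots.txt"
HOMEPAGE = "https://www.example.com/"

# 1. The file should exist (an empty 200 response is still acceptable).
with urllib.request.urlopen(ROBOTS_URL, timeout=10) as resp:
    status = resp.status
    body = resp.read().decode("utf-8", errors="replace")
print("robots.txt fetched, HTTP", status)

# 2. It should reference the XML sitemap.
has_sitemap = any(line.lower().startswith("sitemap:") for line in body.splitlines())
print("Sitemap directive present:", has_sitemap)

# 3. It should not be the development version that blocks everything:
#    the homepage must remain crawlable for a generic agent and for Googlebot.
parser = urllib.robotparser.RobotFileParser()
parser.parse(body.splitlines())
for agent in ("*", "Googlebot"):
    print(f"Homepage allowed for {agent}:", parser.can_fetch(agent, HOMEPAGE))
```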
Test indexing: To quickly identify errors that can occur on large sites, crawl the site with both Xenu Link Sleuth (to find broken links) and Screaming Frog. Screaming Frog, although a paid tool, can crawl up to 500 URLs free of charge, which is enough to get a general picture of the errors. Errors that can occur (a minimal crawl check is sketched after this list):
- Wrong links (pointing to the development domain);
- Non-existent pages;
- Missing <title> tags or duplicate meta descriptions.
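A minimal sketch of those checks for a handful of known URLs, assuming the third-party `requests` and `beautifulsoup4` packages; the URLs and the `dev.example.com` development host are placeholders:

```python
# Spot-check a few pages for broken responses, links to the development
# domain, missing <title> tags and duplicate meta descriptions.
from collections import defaultdict

import requests
from bs4 import BeautifulSoup

URLS = [
    "https://www.example.com/",
    "https://www.example.com/category/",
    "https://www.example.com/old-page/",
]

descriptions = defaultdict(list)
for url in URLS:
    resp = requests.get(url, timeout=10)
    if resp.status_code != 200:
        print(f"{url}: HTTP {resp.status_code}")
        continue
    soup = BeautifulSoup(resp.text, "html.parser")
    # Missing or empty <title>
    if soup.title is None or not soup.title.get_text(strip=True):
        print(f"{url}: missing <title>")
    # Collect meta descriptions so duplicates can be reported afterwards
    meta = soup.find("meta", attrs={"name": "description"})
    if meta and meta.get("content"):
        descriptions[meta["content"].strip()].append(url)
    # Links still pointing at the development domain
    for a in soup.find_all("a", href=True):
        if "dev.example.com" in a["href"]:
            print(f"{url}: link to development domain -> {a['href']}")

for text, pages in descriptions.items():
    if len(pages) > 1:
        print("Duplicate meta description on:", ", ".join(pages))
```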
Google Analytics: From the first hours after launch, the site must have Google Analytics installed and working. Check both by looking for the Google Analytics code in the page source and by using the Real-Time reports. Make several multi-page test visits and verify that they are reported correctly. If it is an ecommerce site, also place a few test orders to check that the products, the cart and its value are recorded correctly.
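A minimal sketch of the source-code check, assuming the `requests` library; the URLs are placeholders and the patterns should be adapted to the tracking snippet actually installed (analytics.js, gtag.js, a UA-style property ID, etc.):

```python
# Confirm the Google Analytics snippet appears in the HTML of a few pages.
import re

import requests

GA_PATTERNS = [
    r"www\.google-analytics\.com/analytics\.js",  # classic analytics.js loader
    r"googletagmanager\.com/gtag/js",             # gtag.js loader
    r"UA-\d{4,10}-\d{1,4}",                       # UA-style property ID (placeholder format)
]

for url in ["https://www.example.com/", "https://www.example.com/category/"]:
    html = requests.get(url, timeout=10).text
    found = [pattern for pattern in GA_PATTERNS if re.search(pattern, html)]
    print(url, "->", found if found else "NO ANALYTICS CODE FOUND")
```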
Webmaster Tools: Immediately after launch, the site must be registered in Google Search Console. All variations (with and without www) must be added and the preferred version chosen. The next steps are:
- Adding the XML sitemap and submitting it for processing (a sketch follows this list);
- Speeding up indexing via the "Fetch as Google" feature for the homepage and the main pages, and submitting them to Google;
- Daily monitoring of the reported errors for at least a week (one month is recommended).
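Besides submitting the sitemap in Search Console, Google's classic sitemap "ping" endpoint could also be used to notify it of the sitemap URL. A minimal sketch, with a placeholder sitemap URL (note that Google has since deprecated this endpoint, so Search Console remains the authoritative submission path):

```python
# Notify Google of the sitemap location via the legacy ping endpoint.
import urllib.parse
import urllib.request

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder
ping = "https://www.google.com/ping?sitemap=" + urllib.parse.quote(SITEMAP_URL, safe="")

with urllib.request.urlopen(ping, timeout=10) as resp:
    print("Ping response:", resp.status)
```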