SEO Best Practices - At The Domain Level

Look beyond just keywords for search engine optimization. Let's find out what we can do at the domain level.

Posted by John Daters on October 18, 2015

This document seeks to outline SEO-critical elements to consider when building a new webpage or website. The document is broken into two sections, “Domain-Level Elements” and “On-Page Elements.” Within “On-Page Elements,” SEO tactics are grouped as they relate to a page, with items found in the <head> and items found in the <body> of a page.

Weighting has not been applied to these elements, as there is no true measure of importance for each item as a standalone entity. Instead, SEO should be seen as a holistic strategy in which many individual tactics add up to a complete and effective whole.

Domain-Level Elements

Robots

Robots files are plain-text (.txt) documents that reside at the root of a domain. These files give bots direction on what to crawl and what not to crawl. A robots file can address all bots or call out specific ones. “Do not crawl” directives can be specific to the URL or folder level, up to and including the entire web root.

Additionally, robots documents can be used to indicate the location of the sitemap document.
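To illustrate, here is a minimal, hypothetical robots.txt; the folder paths and the bot name are invented for the example:

    # Apply to all bots
    User-agent: *
    Disallow: /private/
    Disallow: /staging/

    # Block one specific (hypothetical) crawler from the entire site
    User-agent: BadBot
    Disallow: /

    # Indicate the sitemap location
    Sitemap: http://www.example.com/sitemap.xml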

A robots file is not a required element, but it is useful as a single resource for disallowing pages and folders from being crawled. An on-page equivalent is the meta robots tag, which can also call out specific bots or all bots and give them a noindex directive. The difference is that a bot must visit the page before it sees the noindex rule; with a robots file, compliant bots will not attempt to visit the disallowed URL at all.
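For comparison, a sketch of the on-page robots tag, which would sit in the <head> of an individual page:

    <!-- Direct all bots not to index this page -->
    <meta name="robots" content="noindex">

    <!-- Or call out a specific bot, e.g., Google's crawler -->
    <meta name="googlebot" content="noindex">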

It should be noted that an improperly configured robots file can de-index an entire site. Implemented incorrectly, a robots file can accidentally disallow entire folders (including the root of the domain), rendering the entire site non-indexable by bots.

Furthermore, bots are not obligated to follow a robots.txt file. Unscrupulous bot programs can ignore the file and crawl and index the site whenever they like.

Note: The robots file must be included in the root (www.example.com/robots.txt) for it to be read by bots.

Find more information about robots from Google.

Sitemap

Sitemaps are XML files that outline website URLs for search engines to access easily. Elements of a sitemap include the location of each URL, the priority of the page, the frequency of updates, and images included on the page (among other elements).
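As a sketch, a single sitemap entry illustrating these elements might look like the following (the URL, date, and image path are invented; the image markup uses Google's sitemap-image extension):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
            xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
      <url>
        <loc>http://www.example.com/loans/home-loans.jsp</loc>
        <lastmod>2015-10-01</lastmod>
        <changefreq>monthly</changefreq>
        <priority>0.8</priority>
        <image:image>
          <image:loc>http://www.example.com/images/home-loans.jpg</image:loc>
        </image:image>
      </url>
    </urlset>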

Sitemaps are ideally located at the root of a website (www.example.com/sitemap.xml); however, unlike robots files, they can be located anywhere on the domain.

Sitemaps are not a required element, but they are highly recommended for large sites. Google and other search engines are under no obligation to crawl and index every URL listed in a sitemap, but they can use it as a map to identify candidate pages for crawling. Additionally, while it is general practice to include priority and update frequency, Google may no longer factor in these elements. There is no harm in including them, and they should remain until the industry confirms they are being ignored.

Once built and hosted, sitemap files should be submitted directly to search engines. Bing/Yahoo and Google allow webmasters to submit sitemaps directly via Webmaster Tools (Bing/Yahoo) and Search Console (Google).
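Google has also supported pinging a sitemap location over HTTP as an alternative to the console UI; a sketch, assuming the sitemap URL is your own:

    curl "http://www.google.com/ping?sitemap=http://www.example.com/sitemap.xml"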

Get more information on sitemaps from Google.

URL Structure/Hierarchy

In DNS and Google Search Console, ensure the account is set to manage URL variations, including http, https, www, and non-www versions of the site. There should be a uniform method of delivering URLs so as not to create duplicate content across what search engines see as distinct websites. Additionally, it’s important to ensure that subfolders resolve properly; for example, www.example.com and www.example.com/index.jsp should both resolve to the same page, not two different URL/URI versions. Choose one format and keep it consistent.

SEO “juice” is not shared between these variants; instead, it is split. It’s important to keep the SEO “juice” within a single canonical version rather than split it across domain variations.
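One common way to consolidate these variants is a server-level 301 redirect. A minimal sketch using Apache .htaccess rules (this assumes mod_rewrite is enabled; syntax differs on other servers):

    RewriteEngine On

    # Send non-www requests to the www version with a 301
    RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
    RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

    # Send direct requests for index.jsp to the folder root
    # (the THE_REQUEST condition avoids looping on internal DirectoryIndex rewrites)
    RewriteCond %{THE_REQUEST} \s/+index\.jsp[\s?] [NC]
    RewriteRule ^index\.jsp$ / [R=301,L]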

Additionally, URLs should contain keywords in folder structures. For example, a page dedicated to home loans within a parent web folder of loans should follow a format similar to www.example.com/loans/home-loans.jsp. In this way, the URL is an indicator of page content. If the site instead follows the example www.example.com/L/HL.jsp, there are no indicators within the URL to give search engines insight into the value or purpose of the page.

When a URL contains multiple words, separate them with hyphens, not underscores or spaces. This is true for HTML pages and for any other file types, including PDF, JPG, and others.

More info: http://googlewebmastercentral.blogspot.com/2010/04/to-slash-or-not-to-slash.html

Mobile Friendly

Since the April 21, 2015 Google mobile search algorithm update, mobile-friendly pages are more likely to outrank non-mobile-friendly pages in mobile search. “Mobile friendly” refers to a site that is developed and designed specifically for smaller screen sizes. This is usually accomplished in one of two ways: responsive design or a separate mobile-only website (usually served based on detection of the user’s device or screen size). Both are effective, though Google has stated it prefers responsive design.
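As a sketch, the responsive approach typically pairs the viewport meta tag with CSS media queries (the class name and breakpoint here are invented for the example):

    <!-- In the <head>: size the layout to the device width -->
    <meta name="viewport" content="width=device-width, initial-scale=1">

    <style>
      /* Side-by-side columns on larger screens... */
      .columns { display: flex; }

      /* ...stacked vertically on small screens */
      @media (max-width: 600px) {
        .columns { display: block; }
      }
    </style>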

It’s important to note that Google ranks pages, not sites, as mobile friendly. A page can therefore be mobile-optimized while the site as a whole remains traditional desktop in delivery; the page in question will still get the “mobile friendly” annotation within the SERP and have the algorithmic elements applied to it.

HTTPS/SSL

Google stated in 2014 that it prefers sites with active and effective SSL certificates. There appears to be a small correlation between HTTPS and rankings, but it is not as significant as previously thought.

As with mobile-friendliness, Google assesses the security of a site at the page level, not the site level. Any HTTPS pages will get the possible algorithmic boost, but they will not necessarily affect overall site SERP placement, nor will any non-secure pages negatively affect it.

More info: http://searchengineland.com/explainer-googles-new-ssl-https-ranking-factor-works-200492

Age of Domain/URL

Age of domain isn’t necessarily a direct correlate of SERP rankings; however, older domains/URLs do tend to accumulate a number of good SEO factors, including indexed content, visitation numbers, and backlinks, among others. All of these elements relate to organic performance.

Additionally, while a domain or URL can change and be redirected via a 301 to the new version, not 100% of the SEO link “juice” will transfer to the new page. Therefore, it is usually recommended that a domain/URL retain its original structure instead of being recreated, whenever possible, unless the cost/benefit of such a change is justified from an SEO standpoint.
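When a URL must change despite this, a 301 is the standard way to pass along as much of that equity as possible. A minimal sketch using Apache’s mod_alias (the paths are invented for the example):

    # Permanently redirect the retired URL to its replacement
    Redirect 301 /loans/old-home-loans.jsp http://www.example.com/loans/home-loans.jsp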