SEO Guide – Part 2 : Aspects of Structural SEO and Website SEO Audits
Structural ‘defects’ will affect your ranking with Google and the other search engines.
In the course of a Structural SEO Audit the website architecture is scrutinized for deficiencies, along with the web server configuration and the numerous protocols your website needs in order to interact with the rest of the internet. A structural website audit will also identify whether you are using the prime locations to tell the search engines what your area of expertise is, but without drilling down into your selection of keywords – that is covered in Part 3 : Content and Keyword Focus.
DNS : errors in the DNS configuration can cause delays in the initial ‘handshake’, and the domain could be ranked lower by the search engines. The DNS records should be free from errors and the domain registered to a contactable email address.
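As a quick sanity check you can list the main records yourself. A minimal sketch, assuming a machine with the dig utility installed and example.com standing in for your own domain:

    dig +noall +answer example.com A     # the IP address the domain resolves to
    dig +noall +answer example.com NS    # the authoritative nameservers
    dig +noall +answer example.com MX    # the mail servers

Dedicated DNS report tools go further and will also flag slow responses and inconsistencies between the records.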
IP address : companies that are serious about maximizing their position in the search engine results pages should invest the $2/mo needed for a private IP address. The authority of a domain can be affected by the status of its IP neighbors, and on a shared hosting plan those neighbors might number 2,000. Virtual Private Servers (VPS) and dedicated hosting servers have their own IP address; shared hosting shares the IP address with the neighbors unless you pay the extra for a private address.
PCI Compliance : if https with an SSL certificate is used for e-commerce, there are many configurations that need to comply with the Payment Card Industry standards. Running on https also opens the door to HTTP/2, which reduces the number of requests to the server and so can speed up page delivery.
Domain Redirects : most sites have issues with content duplication, and with the diluted benefit of back-links, due to the parallel existence of different versions of the website address: www. and non-www. ; http and https. Although this affects every website, Google and the others do not automatically combine these different versions of the domain, so someone must manually add a few lines of code to achieve it. This is normally done by writing rules in the .htaccess file at the site file root on the server, but it can also be achieved with PHP scripts (if your site uses PHP). The .htaccess file can be used for many other directives too.
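A minimal sketch of the kind of rules involved, assuming an Apache server with mod_rewrite enabled and example.com standing in for your own domain, placed in the .htaccess file at the site root:

    RewriteEngine On
    # Send any http request to the https version
    RewriteCond %{HTTPS} off
    RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
    # Send the non-www version to the www version
    RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
    RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]

With rules like these the four possible versions of every address collapse into one, so the search engines see a single canonical domain and every back-link counts towards it.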
Sitemaps : we need our pages indexed by the search engines, so we should provide .xml sitemap(s) to make that as easy as possible. There can be numerous sitemaps to help the search engines navigate : pages, posts, portfolio items, images, pdf attachments, products, categories, tags, news articles … and others.
Producing sitemaps is not enough – they should be submitted to Google and Bing Webmaster Tools, and their location should be included in the robots.txt file at the site file root on the server.
Many sites now have separate page, post, category and tag sitemaps generated by plugins. Far fewer also have an image sitemap, or a product sitemap for e-commerce. Image search can produce a significant number of visits for some sites, as long as the images are indexed by the search engine.
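For illustration, a sitemap index file lets you point the search engines at all of the individual sitemaps from one place. A sketch with hypothetical file names under example.com:

    <?xml version="1.0" encoding="UTF-8"?>
    <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <sitemap><loc>https://www.example.com/page-sitemap.xml</loc></sitemap>
      <sitemap><loc>https://www.example.com/post-sitemap.xml</loc></sitemap>
      <sitemap><loc>https://www.example.com/image-sitemap.xml</loc></sitemap>
      <sitemap><loc>https://www.example.com/product-sitemap.xml</loc></sitemap>
    </sitemapindex>

It is the address of this index file that gets submitted to the Webmaster Tools and listed in robots.txt.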
News agencies can apply for recognition at the Google News Center and then produce and submit news sitemaps. There are numerous guidelines to comply with. Google has also revived the meta keyword tag for news agencies. It is not as open as before, but up to ten keywords can be chosen from a limited list for inclusion in the tag.
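For orientation only, an entry in a news sitemap looks roughly like this. The URLs, dates and publication name are hypothetical, the urlset has to declare the xmlns:news namespace, and the current Google News guidelines should always be checked for the exact required tags:

    <url>
      <loc>https://www.example.com/news/sample-article/</loc>
      <news:news>
        <news:publication>
          <news:name>Example News</news:name>
          <news:language>en</news:language>
        </news:publication>
        <news:publication_date>2017-01-15</news:publication_date>
        <news:title>Sample Article Title</news:title>
        <news:keywords>keyword one, keyword two</news:keywords>
      </news:news>
    </url>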
Robots : we need a robots.txt file that tells search engine spiders and bots which directories, pages or file types should be left out of the indexing of the site, and that gives the location of the sitemaps. The current trend is to leave this extremely open and to only exclude test and draft pages and directories. Previously it was normal to place all sorts of restrictions, and to request a 10 second crawl delay to reduce the strain on server resources caused by the spiders and bots. Many websites now have too many restrictions and are not indexed correctly. Whereas it used to be normal to exclude parts of the admin and theme or template files, Google now wants to render the visual page as part of its indexing, for which it needs access to the .js and .css files. Google and Bing apply their own crawl delays automatically now, and some of the other search engines don't obey the directives anyway. Many sites still have their uploads and images folders excluded, which means they stand a greatly reduced chance of appearing in the image search results.
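A minimal, open robots.txt along those lines might look like this. It is only a sketch, the paths are hypothetical and the uploads folder shown assumes a WordPress-style site, so adjust it to your own structure:

    User-agent: *
    Disallow: /drafts/              # keep unfinished pages out of the index
    Disallow: /test/
    Allow: /wp-content/uploads/     # leave images crawlable for image search
    Sitemap: https://www.example.com/sitemap_index.xml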
Online Reputation : some websites end up on lists of high-risk sites, due to previously carrying viruses, misrepresenting their activities, sending spam e-mail campaigns, or because they are seen to be part of a link farm (low-quality sites linking to each other to boost the authority of a website that will make them money). Without checking Webmaster Tools you may never know that your domain has been penalized; if it happens you simply don't appear as high in the search engine results as you otherwise would.
Page Speed : Google dislikes slow web pages, as it is always trying to improve the internet's user experience. Factors that can make a website slow include: errors in DNS, redirection loops, missing linked pages, broken internal links, an under-powered hosting server, large photo and image file sizes, and excessive plugins and code overhead. Pages will also be slower if they are not using pagespeed technologies (browser caching, server-side caching, file compression, minifying scripts). Another focus is to defer scripts and images not used "above the fold", so that the part of the page visible at first is completed earlier and the rest, visible on scrolling, is given a lower priority. This speeds up the initial page render in the user's experience, though not the total page load time. My recommendations often include upgrading the hosting account: saving $10/mo on cheap hosting is a false economy that affects not only the quality of the visitor's experience, but also your position in the search engine results.
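As an example of the pagespeed directives mentioned above, here is a sketch assuming an Apache server with mod_expires and mod_deflate available, placed in the .htaccess file:

    # Browser caching : let browsers keep static files instead of re-requesting them
    <IfModule mod_expires.c>
      ExpiresActive On
      ExpiresByType image/jpeg "access plus 1 month"
      ExpiresByType image/png "access plus 1 month"
      ExpiresByType text/css "access plus 1 week"
      ExpiresByType application/javascript "access plus 1 week"
    </IfModule>
    # File compression : gzip the text-based resources before sending them
    <IfModule mod_deflate.c>
      AddOutputFilterByType DEFLATE text/html text/css application/javascript
    </IfModule>

The right lifetimes depend on how often your files change; the figures above are only placeholders.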
Many websites employ CDN (Content Delivery Network) servers. With these, the code scripts, html and image files are delivered from more than one server, which increases the number of parallel downloads.
There are also new technologies for quicker page loading:
- AMP (Accelerated Mobile Pages) is a new markup from Google that delivers pages to mobile devices roughly four times as quickly. It allows only limited use of Javascript and delivers a simple page, but Google knows that pages with this markup will be quick, so they are prioritized in the search results.
- HTTP/2 is a new protocol that can request and deliver a larger number of resources in parallel on each visit to the server. This reduces the number of round trips needed to have all the page elements delivered. The website has to be using secure https with an SSL certificate first.
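For illustration, on an Apache server (version 2.4.17 or later, with mod_http2 loaded and the SSL certificate already configured) enabling HTTP/2 can be as simple as one directive in the server or virtual host configuration. This is a sketch; other web servers have their own equivalent settings:

    # Prefer HTTP/2, falling back to HTTP/1.1 for older clients
    Protocols h2 http/1.1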
Page errors : 404 errors (page not found) need to be fixed. Google regards these in a poor light as they make for a terrible user experience. This includes our own pages, and also checking that external pages we link to have not disappeared.
Page 301 redirects : if we ever change the url of a page, or move the page to a different folder (which also changes the url), then we need to check that every reference to that page is changed as well. There is a further problem: that page might be linked to by posts on your own social media, in blog and forum comments, in YouTube video descriptions, or by external websites written by others. For this reason we need to write a "301 redirect" any time a url is changed. These changes can happen when restructuring the site for a more logical hierarchy, or when re-focusing a page or post onto target keywords we have identified.
When a website is redesigned the urls of most pages will change, and many pages may be thrown out and replaced with fresh content. We have now just lost all of our back-links, which are a very important factor in assessing the rank of your domain and pages, and which affect whether or not you appear in the organic search engine results. This is a very common scenario after a website redesign. Google soon notices that back-links on other sites no longer reach their destination, and it penalizes your domain. I have seen websites drop from page 1 for their prime keywords down to position 120 at the bottom of page 12. Nearly all of the clicks go to the first few positions in a search engine result, and nobody travels 12 pages deep, so the website traffic is drastically reduced overnight. Recovery from this kind of fall can take years. It is a very common occurrence, as there are a lot of designers who are not well-rounded web professionals and are unaware of the basic requirements of good performance in search engines.
All of this is easily avoided by writing 301 redirects for any pages that change their address, redirecting to the new location, or, for pages that are eliminated, redirecting to a page on a similar topic (or anywhere within the site if there is no suitable target). These are normally written into the .htaccess file. If the list of redirects becomes very large it is more efficient to write them into the httpd.conf file of the web server, which is loaded once when the server starts, rather than being re-read for each page delivery.
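In its simplest form a 301 redirect in the .htaccess file is one line per moved page. The addresses below are hypothetical:

    # Old address on the left, new destination on the right
    Redirect 301 /old-services-page/ https://www.example.com/services/
    Redirect 301 /2015/summer-offer/ https://www.example.com/offers/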
I have been asked to get creative with redirect rules on occasion. For example, on a website that was being discontinued and replaced with a new business name and domain name: product category pages redirected to the corresponding category pages on the new site, except for certain products that needed redirecting directly to their Amazon product sales pages, and all other pages redirected to the new home page.
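A rough sketch of that kind of rule set, using Apache mod_rewrite in the .htaccess of the old site, with hypothetical paths, domains and product IDs:

    RewriteEngine On
    # A product that now sells through Amazon goes straight to its sales page
    RewriteRule ^products/blue-widget/?$ https://www.amazon.com/dp/EXAMPLEID [R=301,L]
    # Product categories map across to the equivalent categories on the new domain
    RewriteRule ^product-category/(.*)$ https://www.newdomain.com/product-category/$1 [R=301,L]
    # Everything else falls back to the new home page
    RewriteRule ^.*$ https://www.newdomain.com/ [R=301,L]

The order matters: the most specific rules go first, and the catch-all at the end picks up everything that has not already been redirected.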
Broken Links : as with 404 page errors, there is no excuse other than laziness for broken links, because they are within your website and totally in your control. These are things such as images that don't appear on your pages because the address is wrong or because the resource no longer exists. Again, Google frowns upon these for giving a poor user experience.
Although this article is written in an expanded list form, I hope that it has given you a better overview of the elements of Structural SEO and of SEO Audits.
If you found the information valuable please share it for others to read via the social share buttons below. I use my social media channels as a big ‘scrap book’ of articles that I might want to refer to later. Follow my Web 4 Panama social media accounts (links in page header and footer) and you will gain access to the best news, tech tips and tutorials for all aspects of digital design, digital marketing, website development, SEO and analytics.
The SEO Guide : the full series