Quality Points – How Do You Know The Next Google Update Won’t Crash Your SEO?


So how do you know the next Google update will not crash your SEO rankings?
Answer: You don't. But I can give you a hint: users and Google are both going after the same thing – user experience, quality, and logic.
One of my favorite quotes comes from Wayne Gretzky, who said something like: "I don't skate after the hockey puck – I skate to where the puck will be."
I quote another article: “Every year there are optimization techniques that become generally accepted best practices. However – before an idea becomes accepted – it sits in this intermediate state where enough evidence of its value in increasing rankings is lacking. Many of these items are treated as signals of quality.”
“So what is a signal of quality? It’s some aspect of a web site whose existence indicates that the site is less likely to be engaged in search engine black hat techniques. Enough signals should add up to provide a boost to a site when it is compared to other sites that don’t provide the same signals. These signals aren’t proof – of course – just another piece of the puzzle.”
“Certain Signals of quality exist everywhere and in many cases we subconsciously respond to them. For instance – if you’re looking for a good restaurant and you’re particularly nervous about cleanliness you’ll look for signals that the kitchen is clean. There’s no way for you to know the true state of the kitchen – but you can infer it by answering questions like is the table clean – are the glasses spotless – and is the waiter’s uniform professional looking. Restaurant staff that take care of these things are signaling to you that they are also taking care of things you can’t see.”
Here is a list of many of the quality points I would look for:
Readable and Clear Navigation Hierarchy and Links:
Make a site with a clear hierarchy and text links. Every page should be reachable from at least one static text link.
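As a rough illustration (the page names and URLs are made up), a plain text-link menu that both users and crawlers can follow might look like this:

  <nav>
    <a href="/">Home</a>
    <a href="/services/">Services</a>
    <a href="/services/web-design/">Web Design</a>
    <a href="/blog/">Blog</a>
    <a href="/contact/">Contact</a>
  </nav>

Every page on the site should be reachable through plain links like these, without depending on JavaScript menus or image-only buttons.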
Mobile Friendly:
Can your site be viewed easily on a mobile phone? Does it scale to small screens? There are ways to strip down pages for mobile devices. I believe mobile-friendly sites are the wave of the future – so start working on this now. But make sure you do not duplicate your content by creating a separate mobile copy of the site. The best approach is to make your existing site mobile friendly or to replace it with one that is.
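As a minimal sketch (not a complete mobile strategy, and the class name is just an example), making an existing page scale to phones usually starts with a viewport tag and a media query:

  <meta name="viewport" content="width=device-width, initial-scale=1">
  <style>
    .sidebar { width: 300px; float: right; }
    @media (max-width: 600px) {
      /* On narrow screens, drop the fixed width so the sidebar stacks under the content */
      .sidebar { width: 100%; float: none; }
    }
  </style>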
Site Speed:
If you think about it, the faster your site is, the more users can do on it, the faster they can move through it, and the more productive your site will be. Slow sites can drive customers crazy (especially customers or clients with slow connections). Speed up your site in any way possible so users can move around it easily.
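A couple of the simpler speed wins can be shown right in the markup – for example (the script and image paths here are hypothetical), deferring non-critical JavaScript and giving images explicit dimensions so the browser doesn't have to reflow the page while it loads:

  <script src="/js/analytics.js" defer></script>
  <img src="/images/team.jpg" alt="Our team" width="640" height="360" loading="lazy">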
Short URLs:
Search engines are moving more and more toward shorter, cleaner URLs (with no appended parameters or special characters in them).
Uniformity of URLs:
Search engines have no way to know that an uppercase URL and a lowercase URL are meant to be the same page – /Page and /page can be crawled as two separate, duplicate URLs. Pick one convention (usually all lowercase) and use it consistently.
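One way to hedge against this (a sketch, using a made-up URL) is to pick one casing, redirect the other variant to it, and declare the preferred version with a canonical tag so search engines know which copy should get the credit:

  <link rel="canonical" href="https://www.example.com/blue-widgets/">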
Sitemap to Users:
Offer a site map to your users with links that point to the important parts of your site. If the site map has an extremely large number of links – you may want to break the site map into multiple pages.
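A user-facing site map can be as simple as a categorized list of plain links (the sections and URLs below are placeholders):

  <h2>Services</h2>
  <ul>
    <li><a href="/services/web-design/">Web Design</a></li>
    <li><a href="/services/seo/">SEO</a></li>
  </ul>
  <h2>Articles</h2>
  <ul>
    <li><a href="/blog/site-speed/">Site Speed</a></li>
    <li><a href="/blog/mobile-friendly/">Mobile Friendly</a></li>
  </ul>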
Fewer Links on Each Page:
Keep the links on a given page to a reasonable number (under 200 is good – around 100 is ideal). Consider moving extra links behind JavaScript or linking to them from higher up in the site's hierarchy.
Relevancy – Don't Talk About Irrelevant Topics on Your Site:
More information is better – but irrelevant information is worse. Create a useful, information-rich site, and write pages that clearly and accurately describe your content. Separate different subjects into different folders – if you are a rehab center writing about celebrities, keep that content in a "Celebrity-Rehab" category folder off the main content of your site. Don't mix "Cats" and "Dogs" links on the same page. Separate your site by subject.
Keywords and Top Searches in the Title, H1, and Domain Name if Possible:
Think about the words users would type to find your pages – and make sure that your site actually includes those words within it.
Try To Use Text Links And Always Include Info On Each Picture:
Try to use text instead of images to display important names, content, or links. The Google crawler doesn't recognize text contained in images. If you must use images for textual content, consider using the "ALT" attribute to include a few words of descriptive text.
Accurate Titles and ALT Attributes:
Make sure that your <title> elements and ALT attributes are descriptive and accurate.
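For example (the business name and wording are invented), compare a vague title and ALT text with descriptive, accurate versions:

  <!-- Vague -->
  <title>Home</title>
  <img src="logo.png" alt="image1">

  <!-- Descriptive and accurate -->
  <title>Acme Plumbing – 24-Hour Emergency Plumber in Denver</title>
  <img src="logo.png" alt="Acme Plumbing logo">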
The Broken Links and Broken HTML:
Check for broken links and correct HTML.
Try to keep every page you want indexed as a static page:
If you decide to use dynamic pages (i.e. – the URL contains a “?” character) – be aware that not every search engine spider crawls dynamic pages as well as static pages. It helps to keep the parameters short and the number of them few.
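To make that concrete (both URLs are hypothetical), compare a parameter-heavy dynamic URL with a short, static-looking one that points to the same content:

  Dynamic, many parameters: https://www.example.com/catalog.php?category=12&item=4588&ref=home&sessionid=8f3a21
  Static-looking: https://www.example.com/catalog/blue-widgets/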
The Domain Age:
A really old domain is one that existed before anyone cared about SEO. If it existed before anyone thought about gaming the search engines then it’s less likely that it is currently trying to game them. Note that domain age is said to be reset whenever there is a change of ownership so a 10 year old domain that just changed hands last month isn’t going to provide as strong a signal as it did before it changed owners.
The Shared IP Addresses:
If an IP has multiple web sites associated with it – then it can be inferred that the web site owner isn’t paying much for the hosting service. Spammers often choose this route to keep their costs low and hence a dedicated IP signals that the owner is truly interested in a long-term – successful web presence.
The Code to Text Ratio:
Sites that contain 100 KB of HTML code with only 2 KB of content may be signaling a lack of sophistication and perhaps a lack of interest in doing what's right for the user (i.e. creating pages that load quickly and feel responsive). Since search engines want to keep their users coming back, they want to send them to sites that will be well received and therefore make for a good search experience.
Note that Rand Fishkin of SEOMoz quotes Vanessa Fox of Google and suggests that code is ignored by Google, implying that this ratio doesn't play any role at all.
All-CSS vs. Tables:
There is a lot of debate about the advantages of CSS when it comes to SEO. For me – there are two signals here. The first is that a redesign from tables to CSS is picked up as a site-wide investment in the site. A site that is maintained and updated sends a signal that someone cares about it and therefore is worth a look by the search engines. The second signal is that CSS can improve the code to text ratio (see previous item).
The Valid HTML / XHTML:
The W3C makes it easy to validate a web page and ensure that it conforms to standards. Since valid web pages almost never occur without a conscious effort to make them error-free – having such pages is a signal that there is someone behind the site that is being careful with their efforts.
The Existence of a Robots.txt File:
This file – which sits in the root folder of a web site – provides instructions to search engines about what they should and shouldn't crawl. Without it, search engines are left to assume that all content is fair game. Thus, one could argue that if the file exists and explicitly permits search engines to crawl the site, then – all other things being equal – the site that gave permission should beat out a site that didn't.
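A minimal, permissive robots.txt (the sitemap URL is a placeholder) looks like this – the empty Disallow line tells crawlers that nothing is off limits:

  User-agent: *
  Disallow:
  Sitemap: https://www.example.com/sitemap.xml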
The Numerous Site Wide and Irrelevant Links:
Corporate sites are often the worst when it comes to site-wide links. Politics and the "too many cooks in the kitchen" syndrome often result in header and footer links that point to every division and subsidiary, regardless of whether those other sites are related from a content perspective. The existence of these links implies that the site's users aren't important enough to trump corporate egos. Conversely, the absence of such links can signal that the user is king – a philosophy that Google encourages.
The Bounce Rate:
It is well known that Google Analytics shows how long people stay on your site and how many visitors bounce. Google has not officially stated that this is a deciding factor in rankings (imagine what would happen if they had – people would game it, with machines sitting on pages for hundreds of hours just to push the numbers up). Still, I believe it plays a large part in your site's SEO performance and gives a clear picture of whether the people who reach your site find it relevant enough to stay. Anything you can do to improve this and increase time on site is a good idea.
Can you think of any other important Quality Points?

Feedback: What are your thoughts?

