Google Penguin Update – 10 Website Optimization Technical Tips. Here are ten great tips on how to beat Google Penguin updates as well as an awesome infographic… (Keep reading)
Do me a favor and check out my other article on Google Quality Points. I warned you in last year's article that you are always at risk unless you use logic, but I personally hate people who say "I told you so." So here are my ten ways to recover from Penguin and avoid future penalties:
Google Penguin Update – 1-2-3-4-5-6-7-8-9 and 10!
I expect there to be numerous updates on this one, much like Google's Panda updates. Based on what we know about Panda, this should be an incremental algorithm change that is refreshed over time.
Here are my great tips on how to beat the “Penguin”:
- Remove Paid Text Links Using Exact-Match Anchor Text: You must avoid sponsored or paid links that use exact-match keyword anchor text.
- Remove Comment Spam: Another easy footprint for Google to spot! Avoid using automated link building tools.
- Remove Guest Posts on Questionable Sites: Most "private" blog networks that let you upload a guest post or article have already been identified by Google.
- Remove Article Marketing Sites: Links from article marketing sites have been classified as unnatural links by Google.
- Avoid Links from Dangerous Sites: Sites that have been flagged for malware, numerous pop-ups, link farms or other spammy issues are another indicator causing sites to lose rankings in the Google SERPs.
- Focus on Social Media Links: Social media links have increased over 25% in estimated value to search engines this year alone. This is because search engines know they come from real people – for the most part.
- Focus on Relevant Directory Links: Directories are a great way to get links, especially if they are relevant to your website. How do you know which directories to pursue? Simple: just type "[your field] directories" into Google and you will get a long list of good directories to get listed in.
- Focus on Press Releases – Adding a photo or an image increases views of the release by 14%.
- Focus on sharable content on your Press Release: Adding a video gives you a 28% increase in views.
- Make Press Releases Shareable: Add a Video, Infographic, Image, and a Download to Each: Go all the way and put in a photo, a video, a graphic, and a download, and you'll see a 78% jump in the number of views!
Google Penguin Update: – How To Avoid Future Penalties
So how do you know the next Google update will not crash your SEO rankings?
Answer: You don't. But I can give you a hint: Google is rewarding user experience, quality, and logic.
One of my favorite quotes: Wayne Gretzky said something like: "I don't skate to where the puck is – I skate to where the puck WILL BE."
I quote another article: "Every year there are optimization techniques that become generally accepted best practices. However – before an idea becomes accepted – it sits in an intermediate state where there is not yet enough evidence of its value in increasing rankings. Many of these items are treated as signals of quality."
“So what is a signal of quality? It’s some aspect of a web site whose existence indicates that the site is less likely to be engaged in search engine black hat techniques. Enough signals should add up to provide a boost to a site when it is compared to other sites that don’t provide the same signals. These signals aren’t proof – of course – just another piece of the puzzle.”
“Certain Signals of quality exist everywhere & in many cases we subconsciously respond to them. For instance – if you’re looking for a good restaurant and you’re particularly nervous about cleanliness you’ll look for signals that the kitchen is clean. There’s no way for you to know the true state of the kitchen – but you can infer it by answering questions like is the table clean – are the glasses spotless – & is the waiter’s uniform professional looking. Restaurant staff that take care of these things are signaling to you that they are also taking care of things you can’t see.”
Here is a list of many of the quality things I would look for:
Numerous Site-Wide and Irrelevant Links (Now a Target for Penguin):
Corporate sites are often the worst when it comes to site wide links. Politics and the “too many cooks in the kitchen” syndrome often result in header and footer links that point to every division and subsidiary regardless of whether these other sites are related from a content-perspective. The existence of these links implies that the user to the web site isn’t important enough to trump corporate egos. And conversely – the absence of such links can signal that the user is king which is a philosophy that Google encourages.
Readable and Clear Navigation Hierarchy and Links:
Make a site with a clear hierarchy and text links. Every page should be reachable from at least one static text link.
Mobile-Friendly Design:
Can your site be seen easily on a mobile phone? Is your site scalable? There are ways to strip down pages for mobile devices. I believe mobile-friendly sites are the wave of the future, so start working on it now. But make sure you do not duplicate your content by creating a separate mobile site. The best way is to make your existing site mobile friendly or replace it with one that is.
Site Speed:
If you think about it, the faster your site is, the more users can do on it, the faster they can navigate through it, and the more productive your site will be. Slow sites can drive customers crazy (especially customers or clients with slow connections). In any way possible, speed up your site so users can move around it easily.
Short and Clean URLs:
Search engines are moving more and more towards shorter, cleaner URLs (no appended parameters or special characters in them).
Uniformity of URLs:
Search engines treat uppercase and lowercase URLs as separate pages, so /About-Us and /about-us can split your link equity and create duplicate content. Pick one convention (lowercase is the usual choice) and redirect the other variants to it.
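To enforce a single-case convention in practice, here is a minimal sketch using Python's standard urllib. The normalize_url helper is hypothetical, not a named tool; it lowercases the scheme, host, and path while leaving the query string alone, since query values can be case-sensitive:

```python
from urllib.parse import urlsplit, urlunsplit

def normalize_url(url: str) -> str:
    """Lowercase the scheme, host, and path so /About-Us and
    /about-us resolve to one canonical URL. The query string is
    left untouched because its values may be case-sensitive."""
    parts = urlsplit(url)
    return urlunsplit((
        parts.scheme.lower(),
        parts.netloc.lower(),
        parts.path.lower(),
        parts.query,
        parts.fragment,
    ))

print(normalize_url("HTTP://Example.com/About-Us?Ref=Home"))
# http://example.com/about-us?Ref=Home
```

On the server side, the same rule would be applied as a 301 redirect so only the canonical form gets indexed.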
Sitemap to Users:
Offer a site map to your users with links that point to the important parts of your site. If the site map has an extremely large number of links – you may want to break the site map into multiple pages.
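A basic XML sitemap can also be generated with a few lines of code. This sketch assumes the sitemaps.org protocol format; the build_sitemap helper is hypothetical, and large sites should split the output into multiple files:

```python
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Emit a minimal XML sitemap (sitemaps.org protocol) for a
    list of absolute URLs. The protocol caps each file at 50,000
    URLs, so very large sites need several files plus an index."""
    entries = "\n".join(
        f"  <url><loc>{escape(u)}</loc></url>" for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>"
    )

print(build_sitemap(["https://example.com/", "https://example.com/contact"]))
```

Note this is the XML sitemap submitted to search engines; the human-readable HTML site map the paragraph above describes is a separate page.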
Lower the amount of links on each page:
Keep the links on a given page to a reasonable number (under 200 is good; around 100 is ideal). Consider loading additional links with JavaScript or moving them higher up in your linking strategy.
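A quick way to audit this is to count the anchors on a page. Here is a sketch using Python's standard html.parser; the LinkCounter and count_links names are mine, not an established tool:

```python
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Count <a> tags that actually carry an href attribute, so a
    page can be flagged when it exceeds your link budget."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        if tag == "a" and any(name == "href" for name, _ in attrs):
            self.count += 1

def count_links(html: str) -> int:
    counter = LinkCounter()
    counter.feed(html)
    return counter.count

page = '<p><a href="/a">A</a> <a href="/b">B</a> <a name="x">no href</a></p>'
print(count_links(page))
# 2
```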
Relevancy – Don’t talk about irrelevant topics on your site:
More information is better, but irrelevant information is worse. Create a useful, information-rich site, and write pages that clearly and accurately describe your content. Separate different subjects into different folders. If you are a rehab center and you are talking about celebrities, keep it in a "Celebrity-Rehab" category folder off the main content of your site. Don't mix "cats" and "dogs" links on the same page. Separate your site by subject.
Keywords and Top Searches in the Title, H1, and Domain Name (If Possible):
Think about the words users would type to find your pages – and make sure that your site actually includes those words within it.
Try To Use Text Links & always Include Info On Each Picture:
Try to use text instead of images to display important names – content – or links. The Google crawler doesn’t recognize text contained in images. If you must use images for textual content – consider using the “ALT” attribute to include a few words of descriptive text.
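You can audit a page for images missing that descriptive text the same way. A sketch with Python's standard html.parser; the helper names are hypothetical:

```python
from html.parser import HTMLParser

class MissingAltFinder(HTMLParser):
    """Collect the src of every <img> that lacks a non-empty alt
    attribute, so the text the crawler sees can be filled in."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        d = dict(attrs)
        if not d.get("alt"):
            self.missing.append(d.get("src", "?"))

def images_missing_alt(html: str):
    finder = MissingAltFinder()
    finder.feed(html)
    return finder.missing

print(images_missing_alt('<img src="logo.png"><img src="team.jpg" alt="Our team">'))
# ['logo.png']
```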
Accurate Titles and Alt Text:
Make sure that your title elements and alt attributes are descriptive and accurate.
The Broken Links and Broken HTML:
Check for broken links and correct HTML.
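Broken-link checks are easy to automate. In this sketch the HTTP status lookup is injected as a function so the logic can be demonstrated offline (in production you might wrap urllib.request.urlopen); all helper names are mine:

```python
from html.parser import HTMLParser

class HrefCollector(HTMLParser):
    """Gather every non-empty href from the page's <a> tags."""
    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.hrefs.append(value)

def broken_links(html: str, get_status):
    """Return every href whose HTTP status is 400 or above.
    get_status is injected so the check is testable without a
    network connection."""
    collector = HrefCollector()
    collector.feed(html)
    return [h for h in collector.hrefs if get_status(h) >= 400]

# Offline demo with a stubbed status lookup:
statuses = {"/ok": 200, "/gone": 404}
page = '<a href="/ok">fine</a> <a href="/gone">dead</a>'
print(broken_links(page, lambda url: statuses.get(url, 200)))
# ['/gone']
```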
Try to keep every page you want indexed as a static page:
If you decide to use dynamic pages (i.e. – the URL contains a “?” character) – be aware that not every search engine spider crawls dynamic pages as well as static pages. It helps to keep the parameters short and the number of them few.
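To illustrate, here is a hedged sketch of rewriting a dynamic URL into a static-looking path. The cat and id parameter names and the /category/item layout are hypothetical examples, not a prescribed scheme:

```python
from urllib.parse import urlsplit, parse_qs

def to_static_path(url: str) -> str:
    """Map a dynamic URL like /product.php?cat=shoes&id=42 to a
    crawl-friendly static path /shoes/42. The parameter names
    here are illustrative; a real site would do this mapping in
    its router or rewrite rules."""
    parts = urlsplit(url)
    params = parse_qs(parts.query)
    cat = params.get("cat", ["misc"])[0]
    item = params.get("id", ["0"])[0]
    return f"/{cat}/{item}"

print(to_static_path("https://example.com/product.php?cat=shoes&id=42"))
# /shoes/42
```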
The Domain Age:
A really old domain is one that existed before anyone cared about SEO. If it existed before anyone thought about gaming the search engines then it’s less likely that it is currently trying to game them. Note that domain age is said to be reset whenever there is a change of ownership so a 10 year old domain that just changed hands last month isn’t going to provide as strong a signal as it did before it changed owners.
The Shared IP Addresses:
If an IP has multiple web sites associated with it – then it can be inferred that the web site owner isn’t paying much for the hosting service. Spammers often choose this route to keep their costs low and hence a dedicated IP signals that the owner is truly interested in a long-term – successful web presence.
The Code to Text Ratio:
Some Sites that contain 100 KB of HTML code with only 2 KB of content are possibly signaling a lack of sophistication and perhaps a lack of interest in doing what’s right for the user – (i.e. creating pages that load quickly and feel responsive). Since search engines want to keep their users coming back – they want to send them to sites that are going to be well-received and therefore considered a good search experience.
Note that Rand Fishkin of SEOMoz quotes Vanessa Fox of Google and suggests that code is ignored by Google implying that this ratio doesn’t play any role at all.
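Whether or not Google uses it, the ratio is easy to measure yourself. A rough sketch with Python's standard html.parser (a more careful check would also exclude script and style content; the helper names are mine):

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Accumulate the text nodes of a page, discarding markup."""
    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_data(self, data):
        self.chunks.append(data)

def text_ratio(html: str) -> float:
    """Rough content-to-markup ratio: visible text length divided
    by total page length. A page of 100 KB of markup wrapping
    2 KB of content would score around 0.02."""
    extractor = TextExtractor()
    extractor.feed(html)
    text = "".join(extractor.chunks).strip()
    return len(text) / max(len(html), 1)

print(round(text_ratio("<div><p>Hello world</p></div>"), 2))
# 0.38
```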
All CSS vs. Tables:
There is a lot of debate about the advantages of CSS when it comes to SEO. For me – there are two signals here. The first is that a redesign from tables to CSS is picked up as a site-wide investment in the site. A site that is maintained & updated sends a signal that someone cares about it and therefore is worth a look by the search engines. The second signal is that CSS can improve the code to text ratio (see previous item).
The Valid HTML / XHTML:
The W3C makes it easy to validate a web page & ensure that it conforms to standards. Since valid web pages almost never occur without a conscious effort to make them error-free – having such pages is a signal that there is someone behind the site that is being careful with their efforts.
The Existence of a Robots.txt File:
This file, which sits in the root folder of a web site, provides instructions to search engine crawlers about what they should and shouldn't crawl. Without it, search engines are left to assume that all content is fair game. Thus, one could argue that if the file exists and explicitly permits search engines to crawl the site, then, all other things being equal, the site that gave permission should beat out a site that didn't.
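You can verify what your robots.txt actually permits with Python's standard urllib.robotparser. The rules below are a hypothetical file that disallows an /admin/ area and allows everything else:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt contents: block an admin area,
# explicitly allow everything else.
rules = [
    "User-agent: *",
    "Disallow: /admin/",
    "Allow: /",
]

parser = RobotFileParser()
parser.parse(rules)  # feed the rules directly instead of fetching a URL

print(parser.can_fetch("Googlebot", "https://example.com/products"))     # True
print(parser.can_fetch("Googlebot", "https://example.com/admin/login"))  # False
```

Running this against your live file (via set_url and read) is a cheap sanity check that a stray Disallow isn't hiding pages you want indexed.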
The Bounce Rate:
It is well known that Google Webmaster Tools shows how long people stay on your site and how many of them bounce. Google has not officially stated that this is a deciding factor in rankings (imagine what would happen if it had: people would game the metric, with machines parked on pages for hours just to push the numbers up). Still, I believe it plays a large part in your site's SEO performance and gives a clear picture of how long relevant visitors stay on your site. Anything you can do to improve this and increase time on site is a good idea.
Can you think of any other important Quality Points?