Some SEO Tips I learnt from Matt Cutts
Are you looking for SEO tips from Matt Cutts, the SEO guru for many webmasters and bloggers? Matt Cutts is an engineer at Google and works with the webspam team in Google Search. Find out some great SEO tips, which are my interpretation of the original advice from Matt Cutts.
Many webmasters and bloggers are now afraid of Panda, the set of Google Search algorithm changes aimed at demoting low-quality sites and promoting good ones. These changes, which give low priority to bad sites and penalize bad SEO practices, are collectively called the "Google Panda Update". Google Panda introduced several changes to Google Search in the last few months that eliminated many bad sites from the search results, with the aim of providing better quality results to users.
I was going through all the tips and videos from Matt Cutts to make sure we do not violate any of the webmaster guidelines, even accidentally. I have gathered the most important points from his videos and blogs and have presented them here along with my views on them.

SEO tips, suggestions and answers from Matt Cutts
These suggestions and tips are my interpretation of various blogs and videos from Matt Cutts and do not represent exact suggestions from Matt Cutts.
Can we have more than one H1 tag in an article?
According to Matt Cutts, we can have more than one H1 in a page if there is a need and if it makes sense. In short, if you have a large article with several sections, you may use more than one H1 tag. However, it is not a good strategy to use several H1 tags in a page, since that tag is meant for the primary title of an article. I suggest you use a single H1 tag and multiple H2 or H3 tags.
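For example, a long article would typically use a single H1 for the main title and H2/H3 tags for the sections (a generic sketch; the headings here are made up):

```html
<!-- One H1 for the primary title of the article -->
<h1>Some SEO Tips I Learnt from Matt Cutts</h1>

<!-- H2 for each major section -->
<h2>Heading tags</h2>
<p>...</p>

<h2>URL structure</h2>
<!-- H3 for sub-sections within a section -->
<h3>Hyphens vs underscores</h3>
<p>...</p>
```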
Should we use hyphen or underscore in URLs?
Matt Cutts clearly recommends hyphens (dashes) instead of underscores in URLs, since Google treats a hyphen as a word separator but does not treat an underscore the same way.
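To illustrate (example.com is a placeholder domain):

```html
<!-- Hyphens: Google reads the path as the separate words "seo", "tips", "matt", "cutts" -->
<a href="https://example.com/seo-tips-matt-cutts">SEO tips</a>

<!-- Underscores: the path tends to be read as a single token, "seo_tips_matt_cutts" -->
<a href="https://example.com/seo_tips_matt_cutts">SEO tips</a>
```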
Does the position of keywords in URL matter?
Matt says that the position of keywords in the URL matters a little bit, but he recommends not depending on it too much.
In my opinion, eventually people will start abusing this, and then Google will change it to avoid the abuse. Avoid spammy keywords in URLs anyway, and don't fill URLs with unrelated keywords.
How to get pages indexed quicker?
The best solution Google gives to get pages indexed quicker is to get more links to the page.
However, this is old advice. In the past few months, Google has changed the algorithm a lot, and bad links can actually harm your site. So, avoid acquiring a lot of links just to rank better or get indexed faster.
The best way to get pages indexed by Google quicker is to make Google believe that you have valuable content, so that it is worth checking your site often. If your site is updated often with new content, eventually Google will learn that it needs to crawl more often to pick up the fresh content. If your site has no new posts regularly, Google will reduce its crawl rate.
Can we publish hundreds of new pages all together or should we post one by one over a period of time?
Matt Cutts sees no issue in releasing a few hundred high-quality new articles at the same time. However, when it comes to hundreds of thousands of new pages, Google could suspect that it is auto-generated content, especially if there are signals that suggest auto-generated or spammy content. If you have original content and have met all the webmaster guidelines, go ahead and publish all of your pages at the same time.
Does Google inform sites when they are penalized?
Google sometimes does, to inform good sites about their mistakes so they can correct themselves. However, it is not done always, to avoid bad sites figuring out how far they can go with bad ideas before they are caught.
Is it better to have keywords in the path or URL?
From a search engine perspective, it makes no big difference. However, from a usability point of view, you may want to organize content into sub-folders and hence integrate keywords into the path itself, rather than making one long URL filled with all the keywords.
For example, consider the following URLs:
/articles/electronics/televisions/plasma-tv/reviews.aspx
/articles/electronics-televisions-plasma-tv-reviews.aspx
The first URL looks better since it shows the posts are organized into a good hierarchy.
Does Google favor BOLD tag or STRONG tag?
I was actually surprised to hear this from Matt Cutts, but he said Google actually favors BOLD a little bit over STRONG, though the difference is very slight. (In October 2013, Matt Cutts confirmed Google no longer gives any weightage to the <b> tag over the <strong> tag.)
Is it worth spending time to write meta tags for keywords and description?
Matt said clearly not to waste your time writing the keywords meta tag. However, the meta description tag is very important, and it is used in a wide range of places.
In my view, even though Google does not use the keywords meta tag, there may be other services and sites that use this tag to identify the primary focus of your site. So, it does no harm to use the keywords meta tag.
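As a sketch, the two tags look like this in the page head (the description and keywords here are made-up examples):

```html
<head>
  <!-- Used by Google for snippets in search results: worth writing well -->
  <meta name="description" content="SEO tips and suggestions interpreted from Matt Cutts' videos and blogs.">

  <!-- Ignored by Google's ranking, but harmless and possibly used by other services -->
  <meta name="keywords" content="SEO, Matt Cutts, Google Panda">
</head>
```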
Is it good to include site name in page titles?
It is okay to have a branded title for the home page, which includes the site name in the title. But do not include the site name in the titles of all pages; limit it to the home page only.
Example:
Techulator: Find technology articles on gadgets, SEO, Windows 8, Android and more
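In HTML, that advice amounts to something like this (the inner-page title is an invented example):

```html
<!-- Home page: branded title including the site name -->
<title>Techulator: Find technology articles on gadgets, SEO, Windows 8, Android and more</title>

<!-- Inner pages: descriptive title without the site name -->
<title>Some SEO Tips I Learnt from Matt Cutts</title>
```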
Is it a bad idea to have hundreds of sites on the same IP address?
Probably it is! Matt Cutts said in a video that if someone has that many sites, there is something to suspect. Probably no one has enough valuable content to serve so many sites, and that could raise a red flag at Google. If you are an average webmaster with a few sites on the same IP address, there is nothing to worry about.
Reference: http://www.youtube.com/watch?v=XpJacspWz4Y
Reading between the lines, as long as the sites deal with different niches and content, it is okay. But if you try to cheat Google by having several sites that deal with the same or similar topics, you will probably get into trouble.
Is there a way to exclude certain words or part of a page from Google index?
The question was in the context of having certain words like "Print", "Leave a Comment" etc., which are repeated several times on a page and are not relevant to the content of the page. Matt Cutts mentioned there is no need to worry about keyword spamming or keyword density for such words, and Google can detect the usage of such repeated words as long as they are not meant for spam. It has been Google's policy to automatically detect which words are redundant and which words or parts of the page to ignore.
Hello:
I have to say that I think you have one of the best Panda articles available. It's to the point and it makes sense.
Thank you!
I do have one question though. I always place two links at the bottom of my articles. One is directed to my home page, and it includes my site name in the link.
I also include a link to the top of the page and I include the main keyword for that page in that link. Do you think these are safe practices?
Thanks again for the great post!