Directories are a quick and dirty way to get a few backlinks to your website. Their importance is going down, and I suspect the next Google update will limit their effectiveness even more. I'll probably still submit to directories, especially the ones with higher pagerank (giving them more "authority" in the Google algorithm). Unless the directory is doing something extremely egregious to violate Google guidelines, the extra link can't hurt. I even get a small amount of traffic from directories.
But bidding directories? I've never used them, and with Google's promise to crack down on the buying and selling of links, bidding directories have become completely useless. There isn't a more open or blatant violation of the new prohibition against buying and selling links. It's fairly easy to identify them, and Google may go as far as completely banning them from the index.
Webmasters, however, are still out there bidding for high placement in these directories, even with all the warning signs. Why risk having Google label you as part of a "bad neighborhood" for a single link that doesn't provide any traffic and soon won't even send any link juice? If you're a webmaster and you're still purchasing links in bidding directories, you are a SUCKER. And an idiot. And your clients should fire you.
Occasionally, I will be posting webmaster tips. These are bits and pieces of knowledge that I've picked up myself over the years, and are by no means the "correct" way to do things. If you disagree with any of my top tens, let me know. :)
When I build a new site, I place a link to its home page on one of my established sites (for example, my web design site). This might technically fall into a grey area of the Google guidelines, but it can often get a new site indexed in less than 24 hours (at least the home page). This also works for Yahoo, but seems to be less effective for MSN.
Put the site into your list of sites on the Google webmaster tools page. Then, add a sitemap and verify the site. I don't know if this will speed up the process, but it at least gives you a centralized area to manage the site.
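If you've never built one by hand, a sitemap is just an XML file following the sitemaps.org protocol. Here's a minimal sketch for a hypothetical two-page site (the example.com URLs and dates are placeholders, not a real site):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want Google to know about -->
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2007-10-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/about.html</loc>
    <changefreq>monthly</changefreq>
  </url>
</urlset>
```

Only `<loc>` is required; `<lastmod>`, `<changefreq>`, and `<priority>` are optional hints. Save it as sitemap.xml in your site root and submit it through the webmaster tools.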
Validate your code with the W3C validator at validator.w3.org. If there are problems with your HTML, search engine bots might have a hard time reading and indexing your pages.
Build a linking structure in such a way that all pages are within 2-3 clicks of each other. This makes it easier for the user to navigate and the search engines will also be able to find their way around.
Follow the Google Webmaster Guidelines when building the site. Nothing sucks worse than a site that's banned before it can even get off the ground.
Choose your keywords carefully and don't "keyword stuff". If you have more than 15 or 20 keywords, you're probably stuffing. Some search engines don't care. Google does, so don't get penalized for that.
Make sure each page has a separate and distinct DESCRIPTION in your meta tags. The description should accurately summarize the page's content, and you should work your keywords into it in a natural way.
Make sure that the keywords in your KEYWORDS meta tag are different on each page, and that they adequately describe the page's main topics.
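To put tips 7 and 8 together, here's what the head of a hypothetical page might look like (the topic and wording are made up for illustration; every page on the site would get its own version):

```html
<head>
  <title>Chocolate Chip Cookie Recipes</title>
  <!-- Unique per page: a natural one-sentence summary with keywords worked in -->
  <meta name="description" content="Five chocolate chip cookie recipes, from classic chewy to crispy thin, with baking times and tips.">
  <!-- Unique per page: a short list describing this page's main topics -->
  <meta name="keywords" content="chocolate chip cookies, cookie recipes, baking tips">
</head>
```

Copy-pasting the same description and keywords onto every page defeats the purpose of both tags.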
The most important is #1. Google puts a premium on links, and will index sites with good inbound links far faster than those without. If you notice, I didn't mention actually submitting the URL to Google. It's a waste of time - don't do it!
According to SEOCompany.com, it has been 117 days since the last pagerank update. The longest amount of time between updates has been 122 days. A lot of webmasters are crossing their fingers that the update is coming soon.
My advice? Focus on your search results - Focus on your search results - Focus on your search results! After all, a site with high pagerank is worthless if you don't have the results you need.
Among the webmaster community, the Google PR update is awaited breathlessly. I don't stress it too much - my SERPs are more important. But, it's still a benchmark I like to have, especially if I have SEO clients who expect an update by a specific date.
Google has constraints that I can't even fathom, but a set update schedule would do wonders for some of the animosity they face from the webmaster crowd (we love the traffic, we hate the Goops we have to jump through).
I've long wondered exactly how effective a sitemap is in getting a site fully indexed. Like the other Goops (Google hoops) I jump through, I do it without thinking. 90% of the sites that I build are tracked through the Google webmaster tools and have been verified and sitemapped. I've never known exactly why I create a sitemap other than Google claims it makes a site more "Google friendly".
"Google uses your Sitemap to learn about the structure of your site and to increase our coverage of your webpages."
I haven't been able to quantify Google's claim, so I put it to a test. Here is how I set it up:
I created three sites with unique content
Each of the three sites had 6 pages
The navigation structure was identical on each page - I used a "link bar" at the top and bottom of each page that linked to all pages in the website.
The three sites had different topics. I originally wanted to create three identical sites with different URLs, but the chance that two of the sites would be seen as duplicate (and penalized) held me back. Instead, I went for three separate but equally innocuous topics about mundane tasks.
Each site had 1 image per page and between 200 and 300 words
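For the curious, the "link bar" on each page was nothing fancy - just a row of plain text links repeated at the top and bottom of every page. A sketch, with hypothetical filenames standing in for the real pages:

```html
<!-- Repeated at the top and bottom of all 6 pages, so every page links to every other -->
<div class="linkbar">
  <a href="index.html">Home</a> |
  <a href="page2.html">Page 2</a> |
  <a href="page3.html">Page 3</a> |
  <a href="page4.html">Page 4</a> |
  <a href="page5.html">Page 5</a> |
  <a href="page6.html">Page 6</a>
</div>
```

The point was to keep the crawl paths identical across all three sites so navigation wasn't a variable.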
I set up the three sites through my Google webmaster tools in three different ways:
The first site (Site A) was entered into the webmaster tools, verified, and had a complete sitemap submitted describing its structure
The second site (Site B) was entered into the webmaster tools and verified, but I refrained from submitting a sitemap
The third site (Site C) wasn't even entered into the webmaster tools. Google had to find this third website naturally
To give each a chance to be found, I created three footer links on one of my higher ranking sites. The links were all on the same page and used "click here" as the anchor text (so that Google didn't try to weigh a site's value higher or lower because of fouled up linking text). I checked for the indexing of the site by using Google's site: operator in the search engine. I checked at least once per hour, except when I slept. I wanted to find out which site became fully indexed the fastest.
The results of my research were surprising to say the least.
The first site to be fully indexed by Google was Site B - the site listed in the webmaster tools and verified but not sitemapped. It took about 24 hours to be fully indexed.
The second site to be fully indexed was the site without any listing in the webmaster tools (Site C). Google found it through a natural link and indexed it completely. Oddly, this was the last site to have its first page indexed, but all of its pages were indexed at the same time. Site C took almost 6 days to become fully indexed.
The last site to become fully indexed was Site A - the website loaded into the webmaster tools, verified, and sitemapped. It was also the first site added to the tools and verified. It only took several hours (less than 8, but I can't be sure because I was sleeping) for the home page to get indexed, but more than 1 week for the entire site to be indexed.
I know that my little experiment was hardly scientific, but it's still surprising how it turned out. Does Google give more credence to sites it finds naturally? (I can't prove it, but I suspect it does) Does the sitemap help in hurrying up the indexing process? (probably not) I will continue to create sitemaps for my websites, but it would be nice to have more information about exactly why Google puts so much emphasis on creating sitemaps (often a time consuming process with large sites).
This is one of my largest complaints when it comes to the Google search engine. Every couple months, the rules of the game change, and search engine optimizers are left guessing what we should or shouldn't do to stay within Google guidelines.
I read Matt Cutts' blog, but he sometimes seems as clueless as the rest of us. I find great information at his site, but it's hard to decipher what the "real" rules are. Sure, Google has a page of guidelines, but it doesn't answer many of the questions I have. For example, if I purchase a banner ad and the webmaster doesn't put a NOFOLLOW attribute on the link, will I be penalized for an issue caused by someone else? If I have organic reciprocal links on several of my websites, will I be penalized?
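For anyone unfamiliar with what Google actually asks for on paid links: the NOFOLLOW in question is a rel="nofollow" attribute on the anchor tag. A sketch of a compliant paid banner (the URL and image file are placeholders):

```html
<!-- Paid banner ad: rel="nofollow" tells Google not to pass link juice -->
<a href="http://www.example.com/" rel="nofollow">
  <img src="sponsor-banner.gif" alt="Sponsor banner" width="468" height="60">
</a>
```

My question stands, though: when the webmaster selling the ad leaves that attribute off, it's the buyer who has no way to force the fix.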
What makes it worse is that there isn't a human being who can answer these questions. I often feel like a blindfolded NFL kicker, and Google keeps moving the goalposts. They may or may not give me a clue about the new location, but me, myself, and I is the one penalized if I miss the kick.
Okay, enough of the football analogies. My suggestion: a more comprehensive set of guidelines for webmasters, and a place to post anonymous questions that can be answered by an expert at Google who has the knowledge to let us know yes or no (not necessarily the why). I'll be blogging about this a lot in future posts.
My name is Joe, and while some may find it ironic that I'm using Blogger, a Google owned company, to voice some of my complaints and concerns about Google, I'm hoping that a constructive blog discussing issues from a webmaster's point of view can be used by Google as a place to get real feedback.