Wednesday, September 12, 2007
I took a long weekend, and when I checked my rankings on Monday night (the 10th), I had dropped to #82. Tuesday the 11th I was at #88, and today (Wednesday the 12th) I'm at #85. I seem to have been hit with some sort of nasty -50 penalty, even though I've played it pretty straight and jumped through all the Goops (Google hoops).
I'm patiently waiting it out - I have 5 or 6 other great keywords that I rank highly for that haven't dropped in the SERPs, so the penalty seems to apply only to the one keyword: web design. Maybe too many backlinks too quickly?
Or was there an algo update this weekend that changed the rankings of certain keywords? The keywords I rank highly for have all had strong SERPs for a long period of time. Perhaps it was an algo change that has something to do with how long backlinks have been in place?
Friday, August 24, 2007
Among the webmaster community, the Google PR update is breathlessly awaited. I don't stress about it too much - my SERPs are more important. But it's still a benchmark I like to have, especially if I have SEO clients who expect an update by a specific date.
Google has constraints that I can't even fathom, but a set update schedule would do wonders for some of the animosity they face from the webmaster crowd (we love the traffic, we hate the Goops we have to jump through).
So hey Google, update us already. :)
Thursday, August 23, 2007
I've long wondered exactly how effective a sitemap is in getting a site fully indexed. Like the other Goops (Google hoops) I jump through, I do it without thinking. 90% of the sites that I build are tracked through the Google webmaster tools and have been verified and sitemapped. I've never known exactly why I create a sitemap other than Google's claim that it makes a site more "Google friendly".
According to Google's documentation:
"Google uses your Sitemap to learn about the structure of your site and to increase our coverage of your webpages."
I haven't been able to quantify Google's claim, so I put it to a test. Here is how I set it up:
- I created three sites with unique content
- Each of the three sites had 6 pages
- The navigation structure was identical on each page - I used a "link bar" at the top and bottom of each page that linked to all pages in the website.
- The three sites had different topics. I originally wanted to create three identical sites with different URLs, but the chance that two of the sites would be seen as duplicate (and penalized) held me back. Instead, I went for three separate but equally innocuous topics about mundane tasks.
- Each site had 1 image per page and between 200 and 300 words per page
- The first site (Site A) was entered into the webmaster tools, verified, and had a complete sitemap submitted describing its structure
- The second site (Site B) was entered into the webmaster tools and verified, but I refrained from submitting a sitemap
- The third site (Site C) wasn't even entered into the webmaster tools. Google had to find this third website naturally
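For reference, the sitemap submitted for Site A followed the standard sitemaps.org XML format. A minimal sketch of what such a file looks like (the URL and dates here are hypothetical placeholders, not the actual test site):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page; each test site had 6 pages -->
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2007-08-15</lastmod>
    <changefreq>monthly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/page2.html</loc>
    <lastmod>2007-08-15</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.5</priority>
  </url>
  <!-- ...remaining pages follow the same pattern -->
</urlset>
```

Only the `<loc>` tag is required for each URL; `<lastmod>`, `<changefreq>`, and `<priority>` are optional hints that Google may or may not use.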
The results of my research were surprising to say the least.
- The first site to be fully indexed by Google was Site B - the site listed in the webmaster tools and verified but not sitemapped. It took about 24 hours to be fully indexed.
- The second site to be fully indexed was the site without any listing in the webmaster tools (Site C). Google found it through a natural link and indexed it completely. Oddly, this was the last site to have its first page indexed, but all of its pages were indexed at the same time. Site C took almost 6 days to become fully indexed.
- The last site to become fully indexed was Site A - the website loaded into the webmaster tools, verified, and sitemapped. It was also the first site added to the tools and verified. It only took several hours (less than 8, but I can't be sure because I was sleeping) for the home page to get indexed, but more than 1 week for the entire site to be indexed.