
Thursday, February 28, 2008

iGoogle Gadgets for Webmaster Tools

Update: The described feature is no longer available.

When you plan to do something, are you a minimalist, or are you prepared for every potential scenario? For example, would you hike out into the Alaskan wilderness during inclement weather with only a wool overcoat and a sandwich in your pocket - like the naturalist John Muir (and you thought Steve McQueen was tough)?

Or are you the type of person who, even on a day hike, brings a few changes of clothes, 3 dehydrated meals, a couple of kitchen appliances, a power inverter, and a foot-powered generator, because, well, you never know when the urge will arise to make toast?

The Webmaster Tools team strives to serve all types of webmasters, from the minimalist to those who use every tool they can find. If you're reading this blog, you've probably had the opportunity to use the current version of Webmaster Tools, which offers nearly every feature short of the kitchen sink. Now there's something for those of you who would prefer to access only the features of Webmaster Tools that you need: we've just released Webmaster Tools Gadgets for iGoogle.

Here's the simple process to start using these Gadgets right away. (Note: this assumes you've already got a Webmaster Tools account and have verified at least one site.)

1. Visit Webmaster Tools and, from the dashboard, select any site that you've verified.
2. Click on the Tools section.
3. Click on the Gadgets sub-section.
4. Click on the big "Add an iGoogle Webmaster Tools homepage" button.
5. Click the "Add to Google" button on the confirmation page that follows to add the new tab to iGoogle.
6. Now you're in iGoogle, where you should see your new Google Webmaster Tools tab with a number of Gadgets. Enjoy!

You'll notice that each Gadget has a drop-down menu at the top that lets you choose any of the sites you've verified, so you can see that Gadget's information for whichever site you select. A few of the Gadgets that we're currently offering are:

Crawl errors - Does Googlebot encounter issues when crawling your site?

Top search queries - What are people searching for to find your site?

External links - What websites are linking to yours?

We plan to add more Gadgets in the future and improve their quality, so if there's a feature you'd really like to see that isn't included in the Gadgets currently available, let us know. As you can see, it's a cinch to get started.

It looks like rain clouds are forming over here in Seattle, so I'm off for a hike.

Wednesday, February 27, 2008

Cross-submissions via robots.txt on Sitemaps.org


Last spring, the Sitemaps protocol was expanded to include the autodiscovery of Sitemaps using robots.txt to let us and other search engines supporting the protocol know about your Sitemaps. We subsequently also announced support for Sitemap cross-submissions using Google Webmaster Tools, making it possible to submit Sitemaps for multiple hosts on a single dedicated host. So it was only a matter of time before we took the next logical step of marrying the two and allowing Sitemap cross-submissions using robots.txt. And today we're doing just that.

We're making it easier for webmasters to place Sitemaps for multiple hosts on a single host and then let us know by including the location of these Sitemaps in the appropriate robots.txt files.

How would this work? Say for example you want to submit a Sitemap for each of the two hosts you own, www.example.com and host2.google.com. For simplicity's sake, you may want to host the Sitemaps on one of the hosts, www.example.com. For example, if you have a Content Management System (CMS), it might be easier for you to change your robots.txt files than to change content in a directory.

You can now take advantage of cross-submission support via robots.txt by letting us know the location of each host's Sitemap:

a) The robots.txt for www.example.com would include:
Sitemap: http://www.example.com/sitemap-www-example.xml

b) And similarly, the robots.txt for host2.google.com would include:
Sitemap: http://www.example.com/sitemap-host2-google.xml

By indicating in each individual host's robots.txt file where that host's Sitemap lives, you are in essence proving that you own the host for which you are specifying the Sitemap. And by hosting all of the Sitemaps on a single host, you make your Sitemaps simpler to manage.
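
As an illustration, here's a minimal sketch in Python of how a crawler supporting the protocol might autodiscover the Sitemap locations declared in each host's robots.txt. It reuses the hypothetical hosts from the example above; treat it as a sketch of the idea, not as how any particular search engine implements it:

import urllib.request

def discover_sitemaps(host):
    """Fetch a host's robots.txt and return any Sitemap locations it declares."""
    url = "http://%s/robots.txt" % host
    with urllib.request.urlopen(url) as response:
        robots_txt = response.read().decode("utf-8", errors="replace")
    sitemaps = []
    for line in robots_txt.splitlines():
        # The Sitemap directive takes a full URL; match the field name case-insensitively.
        if line.strip().lower().startswith("sitemap:"):
            sitemaps.append(line.split(":", 1)[1].strip())
    return sitemaps

# Each host declares its own Sitemap, even though both files live on www.example.com.
for host in ("www.example.com", "host2.google.com"):
    print(host, "->", discover_sitemaps(host))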

We are making this announcement today on Sitemaps.org as a joint effort. To see what our colleagues have to say, you can also check out the blog posts published by Yahoo! and Microsoft.

Tuesday, February 26, 2008

Leap day hackathon for Google Gadgets, Maps, and more



If you've got JavaScript skills and you'd like to implement such things as Google Gadgets or Maps on your site, bring your laptops and come hang out with us in Mountain View.

This Friday, my team (Google Developer Programs) is hosting a hackathon to get you started with our JavaScript APIs. There will be plenty of our engineers around to answer questions. We'll start with short introductions of the APIs and then break into groups for coding and camaraderie. There'll be food, and prizes too.

The featured JavaScript APIs include Google Gadgets and Google Maps, among others.

When: Friday, February 29 - two sessions (you're welcome to attend both)
  • 2-5:30 PM
  • 6-10 PM
Where: The Googleplex
Building 40
1600 Amphitheatre Pkwy
Mountain View, CA 94043
Room: Seville Tech Talk, 2nd floor

See our map for parking locations and where to check in. (Soon, you too will be making maps like this! :)

Just say yes and RSVP!

And no worries if you're busy this Friday; future hackathons will feature other APIs and more languages. Check out the Developer Events Calendar for future listings. Hope to see you soon.

Tuesday, February 12, 2008

7 must-read Webmaster Central blog posts

Our search quality and Webmaster Central teams love helping webmasters solve problems. But since we can't be in all places at all times answering all questions, we also try hard to show you how to help yourself. We put a lot of work into providing documentation and blog posts to answer your questions and guide you through the data and tools we provide, and we're constantly looking for ways to improve the visibility of that information.

While I always encourage people to search our Help Center and blog for answers, there are a few articles in particular to which I'm constantly referring people. Some are recent and some are buried in years' worth of archives, but each is worth a read:

  1. Googlebot can't access my website
    Web hosters seem to be getting more aggressive about blocking spam bots and aggressive crawlers from their servers, which is generally a good thing; however, sometimes they also block Googlebot without knowing it. If you or your hoster are "allowing" Googlebot through by whitelisting Googlebot IP addresses, you may still be blocking some of our IPs without knowing it (since our full IP list isn't public, for reasons explained in the post). To be sure you're allowing Googlebot access to your site, use the method in this blog post to verify whether a crawler is Googlebot (see the first sketch after this list).
  2. URL blocked by robots.txt
    Sometimes the web crawl section of Webmaster Tools reports a URL as "blocked by robots.txt", but your robots.txt file doesn't seem to block crawling of that URL. Check out this list of troubleshooting tips, especially the part about redirects. This thread from our Help Group also explains why you may see discrepancies between our web crawl error reports and our robots.txt analysis tool. (A quick way to sanity-check your rules yourself is shown in the second sketch after this list.)
  3. Why was my URL removal request denied?
    (Okay, I'm cheating a little: this one is a Help Center article and not a blog post.) In order to remove a URL from Google search results you need to first put something in place that will prevent Googlebot from simply picking that URL up again the next time it crawls your site. This may be a 404 (or 410) status code, a noindex meta tag, or a robots.txt file, depending on what type of removal request you're submitting. Follow the directions in this article and you should be good to go.
  4. Flash best practices
    Flash continues to be a hot topic for webmasters interested in making visually complex content accessible to search engines. In this post, Bergy, our resident Flash expert, outlines best practices for working with Flash.
  5. The supplemental index
    The "supplemental index" was a big topic of conversation in 2007, and it seems some webmasters are still worried about it. Instead of worrying, point your browser to this post on how we now search our entire index for every query.
  6. Duplicate content
    Duplicate content—another perennial concern of webmasters. This post talks in detail about duplicate content caused by URL parameters, and also references Adam's previous post on deftly dealing with duplicate content, which gives lots of good suggestions on how to avoid or mitigate problems caused by duplicate content.
  7. Sitemaps FAQs
    This post answers the most frequent questions we get about Sitemaps. And I'm not just saying it's great because I posted it. :-)
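
As a companion to item 1, here's a minimal sketch, in Python, of the verification method that post describes: do a reverse DNS lookup on the IP address from your logs, check that the resulting hostname is in the googlebot.com domain, then do a forward DNS lookup on that hostname and confirm it resolves back to the same IP. The IP address below is only a placeholder; substitute one from your own access logs.

import socket

def is_googlebot(ip):
    """Check a crawler's IP with a reverse DNS lookup plus a forward confirmation."""
    try:
        # Step 1: reverse lookup; the hostname should be in googlebot.com (or google.com).
        hostname, _, _ = socket.gethostbyaddr(ip)
        if not hostname.endswith((".googlebot.com", ".google.com")):
            return False
        # Step 2: forward lookup; the hostname must resolve back to the original IP.
        _, _, addresses = socket.gethostbyname_ex(hostname)
        return ip in addresses
    except (socket.herror, socket.gaierror):
        return False

print(is_googlebot("66.249.66.1"))  # placeholder IP; use one from your logs

Because this check relies on DNS lookups you perform yourself rather than on a published IP list, it keeps working even as Googlebot's address ranges change.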

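And for item 2, when a crawl error report and your own reading of robots.txt disagree, it can help to ask a standard parser how it interprets your rules. This sketch uses Python's built-in urllib.robotparser with a hypothetical URL; note that this parser doesn't implement every extension Googlebot understands, so treat it as a rough cross-check rather than the final word:

from urllib import robotparser

# Point the parser at your site's robots.txt (hypothetical URL for illustration).
rp = robotparser.RobotFileParser()
rp.set_url("http://www.example.com/robots.txt")
rp.read()  # fetches and parses the file

# Ask whether a crawler identifying itself as Googlebot may fetch a given URL.
print(rp.can_fetch("Googlebot", "http://www.example.com/some/page.html"))
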
Sometimes, knowing how to find existing information is the biggest barrier to getting a question answered. So try searching our blog, Help Center and Help Group next time you have a question, and please let us know if you can't find a piece of information that you think should be there!