
Showing posts with label events. Show all posts

Friday, September 14, 2012

Answering the top questions from government webmasters

Webmaster level: Beginner - Intermediate

Government sites, from city to state to federal agencies, are extremely important to Google Search. For one thing, governments have a lot of content — and government websites are often the canonical source of information that’s important to citizens. Around 20 percent of Google searches are for local information, and local governments are experts in their communities.

That’s why I’ve spoken at the National Association of Government Webmasters (NAGW) national conference for the past few years. It’s always interesting speaking to webmasters about search, but the people running government websites have particular concerns and questions. Since some questions come up frequently I thought I’d share this FAQ for government websites.

Question 1: How do I fix an incorrect phone number or address in search results or Google Maps?

Although managing their agency’s site is plenty of work, government webmasters are often called upon to fix problems found elsewhere on the web too. By far the most common question I’ve taken is about fixing addresses and phone numbers in search results. In this case, government site owners really can do it themselves, by claiming their Google+ Local listing. Incorrect or missing phone numbers, addresses, and other information can be fixed by claiming the listing.

Most locations in Google Maps have a Google+ Local listing — businesses, offices, parks, landmarks, etc. I like to use the San Francisco Main Library as an example: it has contact info, detailed information like its opening hours, user reviews, and fun extras like photos. When we think users are searching for libraries in San Francisco, we may display a map and a listing so they can find the library as quickly as possible.

If you work for a government agency and want to claim a listing, we recommend using a shared Google Account with an email address at your .gov domain if possible. Usually, ownership of the page is confirmed via a phone call or post card.

Question 2: I’ve claimed the listing for our office, but I have 43 different city parks to claim in Google Maps, and none of them have phones or mailboxes. How do I claim them?

Use the bulk uploader! If you have 10 or more listings / addresses to claim at the same time, you can upload a specially-formatted spreadsheet. Go to www.google.com/places/, click the "Get started now" button, and then look for the "bulk upload" link.

If you run into any issues, use the Verification Troubleshooter.

Question 3: We're moving from a .gov domain to a new .com domain. How should we move the site?

We have a Help Center article with more details, but the basic process involves the following steps:
  • Make sure you have both the old and new domain verified in the same Webmaster Tools account.
  • Use a 301 redirect on all pages to tell search engines your site has moved permanently.
    • Don't do a single redirect from all pages to your new home page — this gives a bad user experience.
    • If there's no 1:1 match between pages on your old site and your new site (a 1:1 match is recommended), redirect each old page to a new page with similar content.
    • If you can't do redirects, consider cross-domain canonical links.
  • Make sure to check if the new location is crawlable by Googlebot using the Fetch as Google feature in Webmaster Tools.
  • Use the Change of Address tool in Webmaster Tools to notify Google of your site's move.
  • Have a look at the Links to Your Site in Webmaster Tools and inform the important sites that link to your content about your new location.
  • We recommend not implementing other major changes at the same time, like large-scale content, URL structure, or navigational updates.
  • To help Google pick up new URLs faster, use the Fetch as Google tool to ask Google to crawl your new site, and submit a Sitemap listing the URLs on your new site.
  • To prevent confusion, it's best to retain control of your old site’s domain and keep redirects in place for as long as possible — at least 180 days.
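The 301 redirects described above can be sketched as an Apache .htaccess rule on the old server. This is a minimal example, not the only way to do it, and the domain names are placeholders:

```apache
# Hypothetical example: permanently redirect every path on the old
# .gov domain to the same path on the new .com domain.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?example\.gov$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```

Because the rule preserves the path (`$1`), each old URL maps to its counterpart on the new domain rather than funneling everything to the home page.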
What if you’re moving just part of the site? This question came up too — for example, a city might move its "Tourism and Visitor Info" section to its own domain.

In that case, many of the same steps apply: verify both sites in Webmaster Tools, use 301 redirects, clean up old links, etc. In this case you don't need to use the Change of Address form in Webmaster Tools since only part of your site is moving. If for some reason you’ll have some of the same content on both sites, you may want to include a cross-domain canonical link pointing to the preferred domain.
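A cross-domain canonical link is a single tag in the page's head. In this sketch (URLs are placeholders), the duplicate copy on the old domain points to the preferred copy on the new one:

```html
<!-- On the old page that still serves duplicate content,
     tell search engines which copy is preferred. -->
<link rel="canonical" href="https://www.example.com/visitor-info/" />
```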

Question 4: We've done a ton of work to create unique titles and descriptions for pages. How do we get Google to pick them up?

First off, that's great! Better titles and descriptions help users decide to click through to get the information they need on your page. The government webmasters I’ve spoken with care a lot about the content and organization of their sites, and work hard to provide informative text for users.

Google's generation of page titles and descriptions (or "snippets") is completely automated and takes into account both the content of a page as well as references to it that appear on the web. Changes are picked up as we recrawl your site. But you can do two things to let us know about URLs that have changed:
  • Submit an updated XML Sitemap so we know about all of the pages on your site.
  • In Webmaster Tools, use the Fetch as Google feature on a URL you’ve updated. Then you can choose to submit it to the index.
    • You can choose to submit all of the linked pages as well — if you’ve updated an entire section of your site, you might want to submit the main page or an index page for that section to let us know about a broad collection of URLs.
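An XML Sitemap for this purpose is a short, well-defined file. A minimal sketch (the URLs and dates are placeholders) looks like:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.gov/services/</loc>
    <lastmod>2012-09-10</lastmod>
  </url>
  <url>
    <loc>https://www.example.gov/contact/</loc>
    <lastmod>2012-09-12</lastmod>
  </url>
</urlset>
```

Updating `<lastmod>` on changed pages gives crawlers a hint about which URLs to revisit first.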

Question 5: How do I get into the YouTube government partner program?

For this question, I have bad news, good news, and then even better news. The bad news: the government partner program has been discontinued. The good news: most of the features of the program are now available to your regular YouTube account. For example, you can now upload videos longer than 10 minutes.

Did I say I had even better news? YouTube has added a lot of functionality useful for governments in the past year.

I hope this FAQ has been helpful, but I’m sure I haven’t covered everything government webmasters want to know. I highly recommend our Webmaster Academy, where you can learn all about making your site search-engine friendly. If you have a specific question, please feel free to add a question in the comments or visit our really helpful Webmaster Central Forum.

Wednesday, October 5, 2011

Webmaster forums' Top Contributors rock

Webmaster level: All

The TC Summit was a blast! As we wrote in our announcement post, we recently invited more than 250 Top Contributors from all over the world to California to thank them for being so awesome and to give them the opportunity to meet some of our forum guides, engineers and product managers in person.

Our colleagues Adrianne and Brenna already published a recap post on the Official Google Blog. As for us, the search folks at Google, there's not much left to say except that we enjoyed the event and meeting Top Contributors in real life, many of them for the first time. We got the feeling you guys had a great time, too. Let’s quote a few of the folks who make a huge difference on a daily basis:

Sasch Mayer on Google+ (Webmaster TC in English):

"For a number of reasons this event does hold a special place for me, and always will. It's not because I was one of comparatively few people to be invited for a Jolly at the ‘Plex, but because this trip offered the world's TCs a unique opportunity to finally meet each other in person."

Herbert Sulzer, a.k.a. Luzie on Google+ (Webmaster TC in English, German and Spanish):

“Hehehe! Fun, fun fun, this was all fun :D Huhhh”

Aygul Zagidullina on Google+ (Web Search TC in English):

“It was a truly fantastic, amazing, and unforgettable experience meeting so many other TCs across product forums and having the chance to talk to and hear from so many Googlers across so many products!”

Of course we did receive lots of constructive feedback, too. Transparency and communication were on top of the list, and we're looking into increasing our outreach efforts via Webmaster Tools, so stay tuned! By the way, if you haven’t done so yet, please remember to use the forwarding option in the Webmaster Tools Message Center to get the messages straight to your email inbox. In the meantime please keep an eye on our Webmaster Central Blog, and of course keep on contributing to discussions in the Google Webmaster Forum.

On behalf of all Google guides who participated in the 2011 Summit we want to thank you. You guys rock! :)


That’s right, TCs & Google Guides came from all over the world to convene in California.


TCs & Google Guides from Webmaster Central and Search forums after one of the sessions.


After a day packed with presentations and breakout sessions...


...we did what we actually came for...


...enjoyed a party, celebrated and had a great time together.

Wednesday, September 7, 2011

Recognizing Top Contributors in Google's Help Forums

The communities around Google products and services have been growing tremendously over the last couple of years. It is inspiring and motivating for us to see how many users like you contribute to Google Forums. For some time, we've been thinking of ways to thank our Top Contributors -- our most passionate, helpful, friendly, and active users. These TCs have demonstrated incredible commitment to our communities and continue to share their profound knowledge by answering user questions within the forums.

TCs from all over the world will attend our first global summit in California.

We decided to give the online world a break for a moment and meet in real life to celebrate our past success and work on future endeavours. Google Forum Guides, Googlers that participate in the forums, and Top Contributors will convene for the first global Top Contributor Summit on September 13th and 14th in Santa Clara and Mountain View, California. During the Google-organized two-day event, Top Contributors will meet guides, engineers and product managers in order to get to know each other, provide feedback and share new ideas. We’ll be sharing some of the insights and takeaways after the event too, so stay tuned. And if you would like to follow the events online, look out for the #TCsummit tag on Twitter and our updates on Google+.

Tuesday, June 14, 2011

Webinar: Implementing the +1 Button

Webmaster Level: All

A few weeks ago, we launched the +1 button for your site, allowing visitors to recommend your content on Google search directly from your site. As people see recommendations from their friends and contacts beneath your search results, you could see more, better qualified traffic from Google.

But how do you make sure this experience is user friendly? Where should you position the +1 button? How do you make sure the correct URL is getting +1’d?

On Tuesday, June 21 at 3pm ET, please join Timothy Jordan, Google Developer Advocate, to learn about how to best implement the +1 button on your site. He’ll be talking through the technical implementation details as well as best practices to ensure the button has maximum impact. During the webinar, we’ll review the topics below:
  • Getting started
  • Best practices
  • Advanced options
  • Measurement
  • And, we’ll save time for Q&A
If you would like to attend, please register here. To download the code for your site, visit our +1 button tool on Google Webmaster Central.

Monday, March 15, 2010

Sharing advice from our site clinic

Webmaster Level: All

Members of the Google Search Quality Team have participated in site clinic panels on a number of occasions. We receive a lot of positive feedback from these events and we've been thinking of ways to expand our efforts to reach even more webmasters. We decided to organize a small, free of charge pilot site clinic at Google in Dublin, and opened the invitation to webmasters from the neighborhood. The response we received was overwhelming and exceeded our expectations.


Meet the Googlers who hosted the site clinic: Anu Ilomäki, Alfredo Pulvirenti, Adel Saoud, Fili Wiese, Kaspar Szymanski and Uli Lutz.

It was fantastic to see the large turnout and we would like to share the slides presented as well as the takeaways.

These are some questions we came across, along with the advice shared:
  1. I have 3 blogs with the same content, is that a problem?

    If the content is identical, it's likely only one of the blogs will rank for it. Also, with your effort scattered like this, chances are your incoming links will be distributed across the different blogs instead of pointing to one source. Therefore you're running the risk of both users and search engines not knowing which of your blogs is the definitive source. You can mitigate that by redirecting to the preferred version or by using the cross-domain canonical to point to one source.

  2. Should I believe SEO agencies that promise to make my site rank first in Google in a few months and with a precise number of links?

    No one can make that promise; therefore the short answer is no, you should not. However, we have some great tips on how to find a trustworthy SEO in our Help Center.

  3. There are keywords that are relevant for my website, but they're inappropriate to be shown in the content e.g. because they could be misunderstood, slang or offensive. How can I show the relevance to Google?

    Depending on the topic of your site and expectations of the target group, you might consider actually using these keywords in a positive way, e.g. explaining their meaning and showing your users you're an authority on the subject. However if the words are plain abusive and completely inappropriate for your website, it's rather questionable whether the traffic resulting from these search queries is interesting for your website anyway.

  4. Would you advise to use the rewrite URL function?

    Some users may like seeing descriptive URLs in the search results. However, it's quite hard to correctly create and maintain rewrites that change dynamic URLs to static-looking URLs. That's why, generally speaking, we don't recommend rewriting them. If you still want to give it a try, please be sure to remove unnecessary parameters while maintaining a dynamic-looking URL and have a close look at our blog post on this topic. And if you don't, keep in mind that we might still make your URLs look readable in our search results no matter how weird they actually are.

  5. If I used the geo-targeting tool for Ireland, is Northern Ireland included?

    Google Webmaster Tools geo-targeting works on a country basis, which means that Northern Ireland would not be targeted if the setting was Republic of Ireland. One possible solution is to create a separate site or part of a website for Northern Ireland and to geo-target this site to the United Kingdom in Webmaster Tools.

  6. Is there any preference between TLDs like .com and .info in ranking?

    No, there is none. Our focus is on the content of the site.

  7. I have a website on a dot SO (.so) domain name with content meant for the Republic of Ireland. Will this hurt my rankings in the Irish search results?

    .so is the Internet country code top-level domain for Somalia, so the domain extension itself is one signal that doesn't point to your desired destination. But we look at a larger number of factors when ranking your website, and the extension of the domain name is just one of them. Your website can still rank in the Irish search results if you have topic-specific content. However, keep in mind that it may take our algorithms a little longer to fully understand where best to serve your website in our search results.
We would like to thank all participants for their time and effort. It was a pleasure to help you and we hope that it was beneficial for you, too. For any remaining questions, please don't hesitate to join the community on our GWHF.

Tuesday, February 26, 2008

Leap day hackathon for Google Gadgets, Maps, and more



If you've got JavaScript skills and you'd like to implement such things as Google Gadgets or Maps on your site, bring your laptops and come hang out with us in Mountain View.

This Friday, my team (Google Developer Programs) is hosting a hackathon to get you started with our JavaScript APIs. There will be plenty of our engineers around to answer questions. We'll start with short introductions of the APIs and then break into groups for coding and camaraderie. There'll be food, and prizes too.

The featured JavaScript APIs:

When: Friday, February 29 - two sessions (you're welcome to attend both)
  • 2-5:30 PM
  • 6-10 PM
Where: The Googleplex
Building 40
1600 Amphitheatre Pkwy
Mountain View, CA 94043
Room: Seville Tech Talk, 2nd floor

See our map for parking locations and where to check in. (Soon, you too, will be making maps like this! :)

Just say yes and RSVP!

And no worries if you're busy this Friday; future hackathons will feature other APIs and more languages. Check out the Developer Events Calendar for future listings. Hope to see you soon.

Tuesday, January 22, 2008

Feeling lucky at PubCon

Last month, several of us with Webmaster Central hit the "good times" jackpot at PubCon Vegas 2007. We realize not all of you could join us, so instead of returning home with fuzzy dice for everyone, we've got souvenir conference notes.

Listening to the Q&A, I was pleased to hear the major search engines agreeing on best practices for many webmaster issues. In fact, the presentations in the duplicate content session were mostly, well, duplicate. When I wasn't sitting in on one of the many valuable sessions, I was chatting with webmasters either at the Google booth, or at Google's "Meet the Engineers" event. It was exciting to hear from so many different webmasters, and to help them with Google-related issues. Here are a few things that were on the minds of webmasters, along with our responses:

Site Verification Files and Meta Tags
Several webmasters asked, "Is it necessary to keep the verification meta tag or HTML file in place to remain a verified owner in Webmaster Tools?" The answer is yes, you should keep your verification file or meta tag live to maintain your status as a verified owner. These verification codes are used to control who has access to the owner-specific tools for your site in Webmaster Tools. To ensure that only current owners of a site are verified, we periodically re-check that the verification code is in place, and if it is not, your verified status for that site will be revoked. While we're on the topic:

Site Verification Best Practices
  • If you have multiple people working on your site with Webmaster Tools, it's a good idea to have each person verify the site with his or her own account, rather than using a shared login. That way, as people come and go, you can control the access appropriately by adding or removing verification files or meta tags for each account.
  • You may want to keep a list of these verification codes and which owner they are connected to, so you can easily control access later. If you lose track, you can always use the "Manage site verification" option in Webmaster Tools, which allows you to force all site owners to reverify their accounts.
Subdomains vs. Subdirectories
What's the difference between using subdomains and subdirectories? When it comes to Google, there aren't major differences between the two, so when you're making that decision, do what works for you and your visitors. Following PubCon, our very own Matt Cutts outlined many of the key issues in a post on his personal blog. In addition to those considerations, if you use Webmaster Tools (which we hope you do!), keep in mind that you'll automatically be verified for deeper subdirectories of any sites you've verified, but subdomains need to be verified separately.

Underscores vs. Dashes
Webmasters asked about the difference between how Google interprets underscores and dashes in URLs. In general, we break words on punctuation, so if you use punctuation as separators, you're providing Google a useful signal for parsing your URLs. Currently, dashes in URLs are consistently treated as separators while underscores are not. Keep in mind our technology is constantly improving, so this distinction between underscores and dashes may decrease over time. Even without punctuation, there's a good chance we'll be able to figure out that bigleopard.html is about a "big leopard" and not a "bigle opard." While using separators is a good practice, it's likely unnecessary to place a high priority on changing your existing URLs just to convert underscores to dashes.

Keywords in URLs
We were also asked if it is useful to have relevant keywords in URLs. It's always a good idea to be descriptive across your site, with titles, ALT attributes, and yes, even URLs, as they can be useful signals for users and search engines. This can be especially true with image files, which otherwise may not have any text for a search engine to consider. Imagine you've taken a picture of your cat asleep on the sofa. Your digital camera will likely name it something like IMG_2937.jpg. Not exactly the most descriptive name. So unless your cat really looks like an IMG_2937, consider changing the filename to something more relevant, like adorable-kitten.jpg. And, if you have a post about your favorite cat names, it's much easier to guess that a URL ending in my-favorite-cat-names would be the relevant page, rather than a URL ending in postid=8652. For more information regarding issues with how Google understands your content, check out our new content analysis feature in Webmaster Tools, as well as our post on the URL suggestions feature of the new Google Toolbar.

Moving to a new IP address
We got a question about changing a site's IP address, and provided a few steps you can take as a webmaster to make sure things go smoothly. Here's what you can do:
  1. Change the TTL (Time To Live) value of your DNS configuration to something short, like five minutes (300 seconds). This will tell web browsers to re-check the IP address for your site every five minutes.
  2. Copy your content to the new hosting environment, and make sure it is live on the new IP address.
  3. Change your DNS settings so your hostname points to the new IP address.
  4. Check your logs to see when Googlebot starts crawling your site on the new IP address. To make sure it's really Googlebot who's visiting, you can verify Googlebot by following these instructions. You can then log into Webmaster Tools and monitor any crawl errors. Once Googlebot is happily crawling on the new IP address, you should be all set as far as Google is concerned.
  5. To make sure everyone got the message of your move, you may want to keep an eye out for visits to your old IP address before shutting it down.
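The Googlebot check in step 4 can be sketched in Python. The documented approach is a reverse DNS lookup on the visiting IP, a suffix check on the resulting hostname, and then a forward lookup to confirm the IP matches; `verify_googlebot` below performs real network I/O, so treat it as a sketch rather than production code:

```python
import socket

# Hostname suffixes Google documents for Googlebot's reverse DNS.
GOOGLE_SUFFIXES = (".googlebot.com", ".google.com")

def is_google_host(hostname):
    """Pure check: does a reverse-DNS hostname belong to Google?
    Tolerates a trailing dot from some resolvers."""
    return hostname.rstrip(".").endswith(GOOGLE_SUFFIXES)

def verify_googlebot(ip):
    """Reverse DNS on the IP, then forward DNS on the hostname;
    both must agree before trusting the visitor (does network I/O)."""
    hostname = socket.gethostbyaddr(ip)[0]
    if not is_google_host(hostname):
        return False
    return ip in socket.gethostbyname_ex(hostname)[2]
```

The suffix check alone is not enough — `googlebot.com.attacker.example` must fail it, and the forward lookup guards against spoofed reverse DNS records.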
Proxies
A few webmasters were concerned that proxy services are being indexed with copies of their content. While it's often possible to find duplicate copies of your content in our results if you look hard enough, the original source is most likely going to be ranked higher than a proxy copy. However, if you find this not to be the case, please drop us some URLs in the Webmaster Help Group. There are many Googlers including myself who monitor this group and escalate issues appropriately.

It was great talking with webmasters at the conference -- we hope those of you unable to join us found this post useful. If you want to continue to talk shop with me, other Googlers, and your fellow webmasters, join the follow-up conversation in the Webmaster Help Group.

Update: Additional PubCon notes from Jonathan Simon are available in our discussion group.

Monday, November 19, 2007

Bringing the conference to you



We're fortunate to meet many of you at conferences, where we can chat about web search and Webmaster Tools. We receive a lot of good feedback at these events: insight into the questions you're asking and issues you're facing. However, as several of our Webmaster Help Group friends have pointed out, not everyone can afford the time or expense of a conference; and many of you live in regions where webmaster-related conferences are rare.

So, we're bringing the conference to you.

We've posted notes in our Help Group from conferences we recently attended:
Next month, Jonathan and Wysz will post their notes from PubCon, while Bergy and I will cover SES Chicago.

If you can make it to one of these, we'd love to meet you face to face, but if you can't, we hope you find our jottings useful.

Wednesday, October 31, 2007

Happy Halloween to our spooktacular webmasters!



With apologies to Vic Mizzy, we've written a short verse to the tune of the "Addams Family" theme (please use your imagination):

We may be hobbyists or just geeky,
Building websites and acting cheeky,
Javascript redirects we won't make sneaky,
Our webmaster fam-i-ly!

Happy Halloween everyone! Feel free to join the discussion and share your Halloween stories and costumes.


Magnum P.I., Punk Rocker, Rubik's Cube, Mr. T., and Rainbow Brite
a.k.a. Several members of our Webmaster Tools team: Dennis Geels, Jonathan Simon, Sean Harding, Nish Thakkar, and Amanda Camp


Panda and Lolcat
Or just Evan Tang and Matt Cutts?


7 Indexing Engineers and 1 Burrito


Cheese Wysz, Internet Repairman, Community Chest, Internet Pirate (don't tell the RIAA)
Helpful members of the Webmaster Help Group: Wysz, MattD, Nathan Johns (nathanj) , and Bergy


Count++
Webspam Engineer Shashi Thakur (in the same outfit he wore to Searchnomics)


Hawaiian Surfer Dude and Firefox
Members of Webmaster Central's communications team: Reid Yokoyama and Mariya Moeva


Napoleon Dynamite and Raiderfan
Shyam Jayaraman (speaking at SES Chicago, hopefully doing the dance) and me

Wednesday, June 13, 2007

Duplicate content summit at SMX Advanced

Last week, I participated in the duplicate content summit at SMX Advanced. I couldn't resist the opportunity to show how Buffy is applicable to the everyday search marketing world, but mostly I was there to get input from you on the duplicate content issues you face and to brainstorm how search engines can help.

A few months ago, Adam wrote a great post on dealing with duplicate content. The most important things to know about duplicate content are:
  • Google wants to serve up unique results and does a great job of picking a version of your content to show if your site includes duplication. If you don't want to worry about sorting through duplication on your site, you can let us worry about it instead.
  • Duplicate content doesn't cause your site to be penalized. If duplicate pages are detected, one version will be returned in the search results to ensure variety for searchers.
  • Duplicate content doesn't cause your site to be placed in the supplemental index. Duplication may indirectly influence this however, if links to your pages are split among the various versions, causing lower per-page PageRank.
At the summit at SMX Advanced, we asked what duplicate content issues were most worrisome. Those in the audience were concerned about scraper sites, syndication, and internal duplication. We discussed lots of potential solutions to these issues and we'll definitely consider these options along with others as we continue to evolve our toolset. Here's the list of some of the potential solutions we discussed so that those of you who couldn't attend can get in on the conversation.

Specifying the preferred version of a URL in the site's Sitemap file
One thing we discussed was the possibility of specifying the preferred version of a URL in a Sitemap file, with the suggestion that if we encountered multiple URLs that point to the same content, we could consolidate links to that page and could index the preferred version.

Providing a method for indicating parameters that should be stripped from a URL during indexing
We discussed providing this either in an interface such as Webmaster Tools or in the site's robots.txt file. For instance, if a URL contains session IDs, the webmaster could indicate the parameter that carries the session ID, which would help search engines index the clean version of the URL and consolidate links to it. The audience leaned towards an addition to robots.txt for this.
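The effect of stripping such a parameter can be sketched in Python with the standard library; `sid` here is a hypothetical session-ID parameter name:

```python
from urllib.parse import urlparse, urlunparse, parse_qsl, urlencode

def strip_params(url, params_to_strip):
    """Drop the named query parameters (e.g. session IDs) so
    duplicate URL variants collapse to one clean, indexable URL."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k not in params_to_strip]
    return urlunparse(parts._replace(query=urlencode(kept)))

# A hypothetical URL carrying a session ID:
clean = strip_params("http://example.com/page?sid=abc123&item=5", {"sid"})
```

Here `clean` keeps the meaningful `item` parameter while the session ID is gone, so every visitor's variant of the page maps to the same URL.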

Providing a way to authenticate ownership of content
This would provide search engines with extra information to help ensure we index the original version of an article, rather than a scraped or syndicated version. Note that we do a pretty good job of this now and not many people in the audience mentioned this to be a primary issue. However, the audience was interested in a way of authenticating content as an extra protection. Some suggested using the page with the earliest date, but creation dates aren't always reliable. Someone also suggested allowing site owners to register content, although that could raise issues as well, as non-savvy site owners wouldn't know to register content and someone else could take the content and register it instead. We currently rely on a number of factors such as the site's authority and the number of links to the page. If you syndicate content, we suggest that you ask the sites who are using your content to block their version with a robots.txt file as part of the syndication arrangement to help ensure your version is served in results.

Making a duplicate content report available for site owners
There was great support for the idea of a duplicate content report that would list pages within a site that search engines see as duplicate, as well as pages that are seen as duplicates of pages on other sites. In addition, we discussed the possibility of adding an alert system to this report so site owners could be notified via email or RSS of new duplication issues (particularly external duplication).

Working with blogging software and content management systems to address duplicate content issues
Some duplicate content issues within a site are due to how the software powering the site structures URLs. For instance, a blog may have the same content on the home page, a permalink page, a category page, and an archive page. We are definitely open to talking with software makers about the best way to provide easy solutions for content creators.

In addition to discussing potential solutions to duplicate content issues, the audience had a few questions.

Q: If I nofollow a substantial number of my internal links to reduce duplicate content issues, will this raise a red flag with the search engines?
The number of nofollow links on a site won't raise any red flags, but that is probably not the best method of blocking the search engines from crawling duplicate pages, as other sites may link to those pages. A better method may be to block pages you don't want crawled with a robots.txt file.
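Blocking such pages in robots.txt is a two-line rule. In this sketch, `/print/` is a hypothetical directory of printer-friendly duplicates:

```text
# robots.txt on the site root: keep all crawlers out of the
# printer-friendly duplicates instead of sprinkling nofollow links.
User-agent: *
Disallow: /print/
```

Unlike nofollow, this works even when other sites link directly to the duplicate pages.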

Q: Are the search engines continuing the Sitemaps alliance?
We launched sitemaps.org in November of last year and have continued to meet regularly since then. In April, we added the ability for you to let us know about your Sitemap in your robots.txt file. We plan to continue to work together on initiatives such as this to make the lives of webmasters easier.

Q: Many pages on my site primarily consist of graphs. Although the graphs are different on each page, how can I ensure that search engines don't see these pages as duplicate since they don't read images?
To ensure that search engines see these pages as unique, include unique text on each page (for instance, a different title, caption, and description for each graph) and include unique alt text for each image. (For instance, rather than use alt="graph", use something like alt="graph that shows Willow's evil trending over time".)

Q: I've syndicated my content to many affiliates and now some of those sites are ranking for this content rather than my site. What can I do?
If you've freely distributed your content, you may need to enhance and expand the content on your site to make it unique.

Q: As a searcher, I want to see duplicates in search results. Can you add this as an option?
We've found that most searchers prefer not to see duplicate results. One audience member commented that she may not want information from a particular site and would like other choices; but in that case, other sites will likely not have identical information and will therefore show up in the results. Bear in mind that you can add the "&filter=0" parameter to the end of a Google web search URL to see additional results which might be similar.

I've brought all the issues and potential solutions that we discussed at the summit back to my team and others within Google, and we'll continue to work on providing the best search results and expanding our partnership with you, the webmaster. If you have additional thoughts, we'd love to hear about them!

Thursday, May 31, 2007

Plumbing the web



Today is Google Developer Day! We're hosting events for developers in ten cities around the world, as you can read about from Matt Cutts and on our Google Blog. Jonathan Simon and Maile Ohye, whom you have seen on this blog, at conferences, and in our discussion forum, are currently hanging out at the event in San Jose.

I've been at the Beijing event, where I gave a keynote about "Plumbing the Web -- APIs and Infrastructures" for 600 Chinese web developers. I talked about a couple of my favorite topics, Sitemaps and Webmaster Tools, and some of the motivations behind them. Then I talked a bit about consumer APIs and some of our backend infrastructures to support our platform.

Check out the video of my keynote on YouTube or see some of the other videos from the events around the globe.

Wednesday, May 23, 2007

Why we attend conferences


I've always loved traveling. Okay, not the flights so much, especially given that I typically travel coach (yes, even for work trips). But getting to learn interesting cultural tidbits, enjoy regional cuisines, and meet new people... it all definitely makes my life richer. Even the little things -- linguistic differences ("How are you going?" in Sydney) and just walking around (pass on the left in the UK!) -- can be fascinating.

So I shouldn't be surprised when my friends tease me about my traveling as a representative of Google's Search Quality team: "Must be really rough!" However, being an active part of conferences actually isn't all glamour and relaxation.

Here's a glimpse of the reality:

  • Sometimes (though thankfully rarely) I get metaphorically used as a human punching bag.
  • There's no pause button on my corp and personal e-mail accounts. Days at conferences = LOTS of email to catch up on!
  • And on a related note, what's with the no-wifi nonsense?! I have Verizon broadband [sic] for my laptop now, but still... ack!
  • Attending conferences requires an enormous amount of extra time overall. I stubbornly seem to create presentations fresh for each conference, I collaborate with other Googler speakers on their presentations (and vice versa), and I end up with a ton of additional (valuable but time-intense) work from info I glean at the conferences. Based on this and the e-mail reason noted above, I've noticed that each day of conference = five days of combined prep + analysis + implementation.

But here's why I still really like going to conferences:

  • I learn a bunch from other speakers. When folks from other search engines or various experts speak, I often think -- hey, that's useful information, or that's a particularly thoughtful way of explaining stuff. I'm still pretty new to the conference-speaking circuit, so every bit I soak up helps!
  • SEO and webmaster folks are typically rather fun people. :-)
  • Though I don't always make time for this, it's certainly neat getting to spend some time exploring various cities. Okay, so San Jose doesn't count (it's right next to Google), but I can't wait to check out Toronto (and, likely via a few personal days beforehand, Montreal).
  • I learn a great deal from webmasters I chat with. I'm able to go back to my colleagues here and say - hey, this is how our algorithm changes or our guidelines are being perceived, these are challenges we didn't anticipate from our tools, and so on. And it's not just about search; I've gotten thoughtful earfuls about Gmail, Calendar, and practically everything else about Google, and I do my best to relay this feedback to my colleagues in other departments.
  • Lastly, seeing someone in person provides a very helpful new perspective on what they're meaning to communicate online. It's easy to misread text on a page, especially when there's no immediate opportunity to follow up with questions. But in person, issues get cleared up on both sides, and that's good for everyone.

Thankfully, it's not just me who's presenting to and chatting with webmasters from Google -- I'd be exhausted, and you'd get quite bored of me. Our conference-going is genuinely a team effort: through this month and June, you'll find Google Search Quality and Webmaster Central folks at these conferences:

Search Engine Strategies - Xiamen, China - May 25, 26-30
  • Jianfei Zhu (Senior Software Engineer): Get a Lesson from Spamming

Search Engine Strategies - Milan, Italy - May 29-30
  • Brian White (Technical Program Manager)
  • Luisella Mazza (Search Quality Analyst)
  • Stefano Bezze (Search Quality Associate)

American Marketing Association Hot Topics Series - New York, NY - May 25
  • Maile Ohye (Senior Developer Support Engineer): Search Engine Marketing

Google Developer Day - San Jose, CA (was originally set for Mountain View) - May 31
  • Jonathan Simon (Webmaster Trends Analyst)
  • Maile Ohye (Senior Developer Support Engineer)

Search Marketing Expo Advanced - Seattle, WA - June 4-5
  • Matt Cutts (Software Engineer): You&A, Personalized Search and Penalty Box
  • Vanessa Fox (Product Manager, Webmaster Central): Duplicate Content

Search Engine Strategies - Toronto, Canada - June 12-13
  • Adam Lasnik: Search Engine Friendly Design and The Worst SEO Myths, Don'ts, and Scams

Searchnomics - Santa Clara, CA - June 27
  • Shashi Thakur (Software Engineer): Search Engine Friendly Design
  • Greg Grothaus (Software Engineer): Search & Dynamic Web Sites and SEO for Web 2.0

* * *

We look forward to seeing many of you in person! But even if you can't or don't want to go to one of the conferences we attend, we welcome your questions, comments, or even just a friendly introduction in our Webmaster Help Group.

Take care, and enjoy your summer, wherever your online or offline travels may take you!

Thursday, May 17, 2007

Taking advantage of universal search


Yesterday, at Searchology, we unveiled exciting changes in our search results. With universal search, we've begun blending results from more than just the web in order to provide the most relevant and useful results possible. In addition to web pages, for instance, the search results may include video, news, images, maps, and books. Over time, we'll continue to enhance this blending so that searchers can get the exact information they need right from the search results.

This is great news for the searcher, but what does it mean for you, the webmaster? It's great news for you as well. Many people do their searches from web search and aren't aware of our many other tools to search for images, news, videos, maps, and books. Since more of those results may now be returned in web search, if you have content that is returned in these other searches, more potential visitors may see your results.

Want to make sure you're taking full advantage of universal search? Here are some tips:

Google News results
If your site includes news content, you can submit your site for inclusion in Google News. Once your site is included, you can let us know about your latest articles by submitting a News Sitemap. (Note: News Sitemaps are currently available for English sites only.)
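A minimal News Sitemap entry looks roughly like the sketch below. The URL and dates are invented, and the exact required tags may differ, so check the News Sitemap documentation before submitting:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:news="http://www.google.com/schemas/sitemap-news/0.9">
  <url>
    <loc>http://www.example.com/articles/2007/05/local-news-story.html</loc>
    <news:news>
      <news:publication_date>2007-05-17</news:publication_date>
      <news:keywords>local, government</news:keywords>
    </news:news>
  </url>
</urlset>
```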

News Archive results
If you have historical news content (available for free or by subscription), you can submit it for inclusion in News Archive Search.

Image results
If your site includes images, you can opt in to enhanced Image search in webmaster tools, which will enable us to gather additional metadata about your images using our Image Labeler. This helps us return your images for the most relevant queries. Also ensure that you are fully taking advantage of the images on your site.

Local results
If your site is for a business in a particular geographic location, you can provide information to us using our Local Business Center. By providing this information, you can help us provide the best, locally relevant results to searchers both in web search and on Google Maps.

Video results
If you have video content, you can host it on Google Video, YouTube, or a number of other video hosting providers. If the video is a relevant result for the query, searchers can play the video directly from the search results page (for Google Video and YouTube) or can view a thumbnail of the video then click over to the player for other hosting providers. You can easily upload videos to Google Video or to YouTube.

Our goal with universal search is to provide the most relevant and useful results, so for those of you who want to connect to visitors via search, our best advice remains the same: create valuable, unique content that is exactly what searchers are looking for.

Friday, May 11, 2007

Musings on Down Under



Earlier this year, a bunch of Googlers (Maile, Peeyush, Dan, Adam and I) bunged ourselves across the equator and headed to Sydney, so we could show our users and webmasters that just because you're "down under" doesn't mean you're under our radar. We had a great time getting to know folks at our Sydney office, and an even greater time meeting and chatting with all the people attending Search Summit and Search Engine Room. What makes those 12-hour flights worthwhile is getting the chance to inform and be informed about the issues important to the webmaster community.

One of the questions we heard quite frequently: Should we as webmasters/SEOs/SEMs/users be worried about personalized search?

Our answer: a resounding NO! Personalized search takes each user's search behavior, and subtly tunes the search results to better match their interests over time. For a user, this means that even if you're a lone entomologist in a sea of sports fans, you'll always get the results most relevant to you for the query "cricket". For the webmaster, it allows niche markets that collide on the same search terms to disambiguate themselves based on individual user preferences, and this really presents a tremendous opportunity for visibility. Also, to put things in perspective, search engines have been moving towards some degree of personalization for years; for example, providing country/language specific results is already a form of personalization, just at a coarser granularity. Making it more fine-grained is the logical next step, and helps level the playing field for smaller niche websites which now have a chance to rank well for users that want their content the most.

Another question that popped up a lot: I'm moving my site from domain X to Y. How do I make sure all my hard-earned reputation carries over?

Here are the important bits to think about:
  • For each page on domain X, have it 301-redirect to the corresponding page on Y. (How? Typically through .htaccess, but check with your hosting provider).
  • You might want to stagger the move, and redirect sub-sections of your site over time. This gives you the chance to keep an eye on the effects, and also gives search engines' crawl/indexing pipelines time to cover the space of redirected URLs.
  • http://www.google.com/webmasters is your friend. Keep an eye on it during the transition to make sure that the redirects are having the effect you want.
  • Give it time. How quickly the transition is reflected in the results depends on how quickly we recrawl your site and see those redirects, which depends on a lot of factors including the current reputation of your site's pages.
  • Don't forget to update your Sitemap. (You are using Sitemaps, aren't you?)
  • If possible, don't substantially change the content of your pages at the same time you make the move. Otherwise, it will be difficult to tell if ranking changes are due to the change of content or incorrectly implemented redirects.
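On Apache, the 301 redirects from the first bullet can be sketched in .htaccess roughly as follows. The domain names are placeholders, and the exact syntax varies by host, so check with your hosting provider:

```
# Permanently redirect every page on the old domain
# to the same path on the new domain
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?old-example\.com$ [NC]
RewriteRule ^(.*)$ http://www.new-example.com/$1 [R=301,L]
```

Because the rule preserves the path, each old URL maps to its corresponding new page rather than dumping everything on the new home page.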
Before we sign off, we wanted to shout-out to a couple of the folks at the Sydney office: Lars (one of the original Google Maps guys) gets accolades from all of us jetlagged migrants for donating his awesome Italian espresso machine to the office. And Deepak, thanks for all your tips on what to see and do around Sydney.

Tuesday, April 24, 2007

Come out to SMX Advanced in Seattle and party with Webmaster Central

Our team at Webmaster Central is always looking for ways to communicate with you, the webmaster community. We do this by providing tools that tell you more about your site and let you give us input about it, talking with you in our discussion forums, reading what you have to say across the blogs and forums on the web, blogging here, and talking to you in person at conferences. We can't talk to as many of you in person as we can reach through other means, such as this blog, but we find meeting face-to-face invaluable.

So, we're very excited about an upcoming conference in our hometown, Seattle -- SMX Advanced, June 4-5. Since it's nearby, many from our team can attend and we're hoping to hear more about what you like and what you'd like to see us do in the coming year. We're participating in two summits at this conference. Summits are a great way to find out exactly what issues you're facing and explore ways we can solve them together. We can weigh the alternatives and make sure we understand the obstacles from your perspective. The recent robots.txt summit was a great opportunity for all the search engines to get together and brainstorm with you, the webmaster. We came away from that with lots of great ideas and a better understanding of what you're looking for most with the evolution of robots.txt. We hope to do the same with the two summits at SMX Advanced.

At the Duplicate Content Summit, I'd love to talk to you about the types of situations you're facing with your site. Are you most concerned about syndicating your content? Using dynamic URLs with changing parameters? Providing content for sites in multiple countries? For each issue, we'll talk about ways we can tackle them. What solutions can we offer that will work best for you? I'm very excited about what we can accomplish at this summit, although I'm not quite as excited about the 9am start time. Fortunately, our party isn't the night before.

At the Penalty Box Summit, Matt Cutts will be on hand to talk to you about all the latest with our guidelines and reinclusion procedures. And he'll want to hear from you. What concerns do you have about the guidelines? How can we better communicate violations to you? Unfortunately, our party is the night before this session, but I'm sure there will be lots of coffee on hand.

And speaking of the party... since conference attendees are coming all the way to Seattle, we thought we should throw one. The Google Seattle/Kirkland office and the Webmaster Central team are hosting SMX After Dark: Google Dance NW on Monday night. We want to say thanks to you for this great partnership, as well as give you the chance to learn more about what we've been up to. We'll have food, drinks, games (Pacman and Dance Dance Revolution anyone?), and music. Talk to the Webmaster Central engineers, as well as engineers from our other Kirkland/Seattle product teams, such as Talk, Video, and Maps. We may even have a dunk tank! Who would you most like to try your hand at dunking?

Thursday, April 19, 2007

Estuvimos presentes en Madrid

El pasado 8 y 9 de Marzo asistimos al congreso OJOBuscador en Madrid. Este evento fue muy interesante para nosotros dado que nos dio la oportunidad de escuchar a los ponentes de distintos motores de búsqueda y dialogar con los webmasters acerca de sus principales inquietudes relacionadas con el posicionamiento en buscadores en español. Uno de los puntos que se mencionaron con frecuencia, tanto en las sesiones de posicionamiento como en las charlas informales, fue la desventaja de competir en el mercado SEO donde varias empresas utilizan métodos solapados que van en contra de las directrices oficiales de Google.

Entre las técnicas que hemos observado están las de generar dominios satélite o crear infinidades de páginas irrelevantes con el único objetivo de ganar tráfico en búsquedas que no están necesariamente relacionadas con el contenido del sitio. Otro fenómeno que hemos observado es la continua aparición de dominios que sólo tienen contenido procedente de afiliados sin aportar valor único o relevante.

Vamos a ser más severos con las técnicas previamente mencionadas, porque en Google consideramos que es muy importante no defraudar a los usuarios. Por otra parte, consideramos que la responsabilidad última de los contenidos de un sitio pertenece al webmaster, quien debe velar por su calidad y verificar que sus páginas tengan como finalidad primera satisfacer a los usuarios.

Con el propósito de mejorar la comunicación con la comunidad de webmasters, queremos anunciar que Google a partir de ahora estará presente y participará activamente en el Foro de Google para webmasters.

We were in Madrid

On March 8th and 9th we attended the OJOBuscador conference in Madrid. The event was very interesting for us since we had the chance to listen to presentations from the main search engines and to discuss with webmasters their main concerns regarding search engine positioning in Spanish. One key point mentioned frequently, both in the SEO sessions and in informal chats, was the disadvantage of working in an SEO market where several companies use sneaky methods that go against the official Webmaster Guidelines.

There are some techniques such as generating satellite domains or creating thousands of irrelevant pages with the sole purpose of gaining traffic in search queries that are not always related to sites' content. Another phenomenon we have observed is the steady influx of domains which only have content from affiliate sites without adding any unique value or relevance.

We are going to be stricter about the techniques described above, because at Google we consider it very important not to deceive users. On the other hand, we believe the ultimate responsibility for the contents of a website belongs to the webmaster, who should watch over the site's quality and verify that its pages are made first and foremost for users.

Aiming to enhance communication with the webmaster community, we would like to announce that going forward Google will participate and monitor the Spanish Webmaster Discussion Forum.

Friday, April 6, 2007

Drop by and see us at SES NY

If you're planning to attend the Search Engine Strategies conference next week in New York, be sure to come by and say hi! A whole bunch of us from the Webmaster Central team will be there, looking to talk to you, get your feedback, and answer your questions. Be sure to join us for lunch on Tuesday, April 10th, where we'll spend an hour answering any question you may have. And then come by our other sessions, or find us in the expo hall or the bar.

Tuesday, April 10

11:00am - 12:30pm

Ads in a Quality Score World
Nick Fox, Group Business Product Manager, Ads Quality

12:45pm - 1:45pm

Lunch Q&A with Google Webmaster Central
Vanessa Fox, Product Manager, Webmaster Central
Trevor Foucher, Software Engineer
Jonathan Simon, Webmaster Trends Analyst
Maile Ohye, Sitemaps Developer Support Engineer
Nikhil Gore, Test Engineer
Amy Lanfear, Technical Writer
Susan Mowska, International Test Engineer
Evan Roseman, Software Engineer

Wednesday, April 11

10:30am - 12:00pm

Web Analytics & Measuring Success
Brett Crosby, Product Marketing Manager, Google Analytics

Sitemaps & URL Submission
Maile Ohye, Sitemaps Developer Support Engineer

1:30pm - 2:45pm

Duplicate Content & Multiple Site Issues
Vanessa Fox, Product Manager, Webmaster Central

Meet the Search Ad Networks
Brian Schmidt, Online Sales and Operations Manager

3:15pm - 4:30pm

Earning Money from Contextual Ads
Gavin Bishop, GBS Sales Manager, AdSense

4:45pm - 6:00pm

Landing Page Testing & Tuning
Tom Leung, Product Manager, Google Website Optimizer

robots.txt Summit
Dan Crow, Product Manager

Thursday, April 12

9:00am - 10:15am

Meet the Crawlers
Evan Roseman, Software Engineer

Search Arbitrage Issues
Nick Fox, Group Business Product Manager, Ads Quality

11:00am - 12:15pm

Images & Search Engines
Vanessa Fox, Product Manager, Webmaster Central

4:00pm - 5:15pm

Auditing Paid Listings & Click Fraud Issues
Shuman Ghosemajumder, Business Product Manager, Trust and Safety

Friday, April 13

12:30pm - 1:45pm

Search Engine Q&A on Links
Evan Roseman, Software Engineer

CSS, Ajax, Web 2.0 and Search Engines
Dan Crow, Product Manager

Friday, March 30, 2007

BlogHer 2007: Building your audience

Last week, I spoke at BlogHer Business about search engine optimization issues. I presented with Elise Bauer, who talked about the power of community in blogging. She made great points about the linking patterns of blogs. Link out to sites that would be relevant and useful for your readers. Comment on blogs that you like to continue the conversation and provide a link back to your blog. Write useful content that other bloggers will want to link to. Blogging connects readers and writers and creates real communities where valuable content can be exchanged. I talked more generally about search and a few things you might consider when developing your site and blog.

Why is search important for a business?
With search, your potential customers are telling you exactly what they are looking for. Search can be a powerful tool to help you deliver content that is relevant and useful and meets your customers' needs. For instance, do keyword research to find out the most common types of searches that are relevant to your brand. Does your audience most often search for "houses for sale" or "real estate"? Check your referrer logs to see what searches are bringing visitors to your site (you can find a list of the most common searches that return your site in the results from the Query stats page of webmaster tools). Does your site include valuable content for those searches? A blog is a great way to add this content. You can write unique, targeted articles that provide exactly what the searcher wanted.

How do search engines index sites?
The first step in the indexing process is discovery. A search engine has to know the pages exist. Search engines generally learn about pages from following links, and this process works great. If you have new pages, ensure relevant sites link to them, and provide links to them from within your site. For instance, if you have a blog for your business, you could provide a link from your main site to the latest blog post. You can also let search engines know about the pages of your site by submitting a Sitemap file. Google, Yahoo!, and Microsoft all support the Sitemaps protocol and if you have a blog, it couldn't be easier! Simply submit your blog's RSS feed. Each time you update your blog and your RSS feed is updated, the search engines can extract the URL of the latest post. This ensures search engines know about the updates right away.
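Besides submitting the feed through webmaster tools, one low-effort way to point search engines at your blog's feed is the Sitemap line in robots.txt (the feed URL below is a placeholder; use whatever RSS or Atom URL your blogging software publishes):

```
Sitemap: http://www.example.com/blog/rss.xml
```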

Once a search engine knows about the pages, it has to be able to access those pages. You can use the crawl errors reports in webmaster tools to see if we're having any trouble crawling your site. These reports show you exactly what pages we couldn't crawl, when we tried to crawl them, and what the error was.

Once we access the pages, we extract the content. You want to make sure that what your page is about is represented by text. What does the page look like with Javascript, Flash, and images turned off in the browser? Use ALT text and descriptive filenames for images. For instance, if your company name is in a graphic, the ALT text should be the company name rather than "logo". Put text in HTML rather than in Flash or images. This not only helps search engines index your content, but also makes your site more accessible to visitors with mobile browsers, screen readers, or older browsers.
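For instance, the logo markup might look like the sketch below (the company name and filenames are invented):

```html
<!-- Weak: "logo" says nothing about the company -->
<img src="header.gif" alt="logo">

<!-- Better: descriptive filename and ALT text carry the company name -->
<img src="acme-widgets-logo.gif" alt="Acme Widgets">
```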

What is your site about?
Does each page have unique title and meta description tags that describe the content? Are the words that visitors search for represented in your content? Do a search of your pages for the queries you expect searchers to do most often and make sure that those words do indeed appear in your site. Which of the following tells visitors and search engines what your site is about?

Option 1
If you're plagued by the cliffs of insanity or the pits of despair, sign up for one of our online classes! Learn the meaning of the word inconceivable. Find out the secret to true love overcoming death. Become skilled in hiding your identity with only a mask. And once you graduate, you'll get a peanut. We mean it.

Option 2
See our class schedule here. We provide extensive instruction and valuable gifts upon graduation.

When you link to other pages in your site, ensure that the anchor text (the text used for the link) is descriptive of those pages. For instance, you might link to your products page with the text "Inigo Montoya's sword collection" or "Buttercup's dresses" rather than "products page" or the ever-popular "click here".
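In markup, the difference is just the link text (the URL here is invented):

```html
<!-- Vague anchor text: tells visitors and search engines nothing -->
<a href="/products.html">click here</a>

<!-- Descriptive anchor text: says what the linked page is about -->
<a href="/products.html">Inigo Montoya's sword collection</a>
```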

Why are links important?
Links are important for a number of reasons. They are a key way to drive traffic to your site: visitors of other sites can learn about your site through links to it, and you can use links to other sites to provide valuable information to your own visitors. And just as links let visitors know about your site, they also let search engines know about it. The anchor text describes what your site is about, and the number of relevant links to your pages is an indicator of how popular and useful those pages are. (You can find a list of the links to your site and the most common anchor text used in those links in webmaster tools.)

A blog is a great way to build links, because it enables you to create new content on a regular basis. The more useful content you have, the greater the chances someone else will find that content valuable to their readers and link to it. Several people at the BlogHer session asked about linking out to other sites. Won't this cause your readers to abandon your site? Won't this cause you to "leak out" your PageRank? No, and no. Readers will appreciate that you are letting them know about resources they might be interested in and will remember you as a valuable source of information (and keep coming back for more!). And PageRank isn't a set of scales, where incoming links are weighted against outgoing ones and cancel each other out. Links are content, just as your words are. You want your site to be as useful to your readers as possible, and providing relevant links is a way, just as writing content is, to do that.

The key is compelling content
Google's main goal is to provide the most useful and relevant search results possible. That's the key thing to keep in mind as you look at optimizing your site. How can you make your site the most useful and relevant result for the queries you care about? This won't just help you in the search results, which after all, are just the means to the end. What you are really interested in is keeping your visitors happy and coming back. And creating compelling and useful content is the best way to do that.