Wednesday, August 31, 2011

Reorganizing internal vs. external backlinks

Webmaster level: All

Today we’re making a change to the way we categorize link data in Webmaster Tools. As you know, Webmaster Tools lists links pointing to your site in two separate categories: links coming from other sites, and links from within your site. Today’s update won’t change your total number of links, but will hopefully present your backlinks in a way that more closely aligns with your idea of which links are actually from your site vs. from other sites.

You can manage many different types of sites in Webmaster Tools: a plain domain name (example.com), a subdomain (www.example.com or cats.example.com), or a domain with a subfolder path (www.example.com/cats/ or www.example.com/users/catlover/). Previously, only links that started with your site’s exact URL would be categorized as internal links: so if you entered www.example.com/users/catlover/ as your site, links from www.example.com/users/catlover/profile.html would be categorized as internal, but links from www.example.com/users/ or www.example.com would be categorized as external links. This also meant that if you entered www.example.com as your site, links from example.com would be considered external because they don’t start with the same URL as your site (they don’t contain www).

Most people think of example.com and www.example.com as the same site these days, so we’re updating our categorization to match: if you add either example.com or www.example.com as a site, links from both the www and non-www versions of the domain will now be categorized as internal links. We’ve also extended this idea to other subdomains, since many people who own a domain also own its subdomains—so links from cats.example.com or pets.example.com will also be categorized as internal links for www.example.com.

Links for www.google.com:

Previously categorized as...
  External links: www.example.com/, www.example.org/stuff.html, scholar.google.com/, sketchup.google.com/, google.com/
  Internal links: www.google.com/, www.google.com/stuff.html, www.google.com/support/webmasters/

Now categorized as...
  External links: www.example.com/, www.example.org/stuff.html
  Internal links: scholar.google.com/, sketchup.google.com/, google.com/, www.google.com/, www.google.com/stuff.html, www.google.com/support/webmasters/

If you own a site that’s on a subdomain (such as googlewebmastercentral.blogspot.com) or in a subfolder (www.google.com/support/webmasters/) and don’t own the root domain, you’ll still only see links from URLs starting with that subdomain or subfolder in your internal links, and all others will be categorized as external links. We’ve made a few backend changes so that these numbers should be even more accurate for you.

Note that if you own a root domain like example.com or www.example.com, your number of external links may appear to go down with this change; this is because, as described above, some of the URLs we were previously classifying as external links will have moved into the internal links report. Your total number of links (internal + external) should not be affected by this change.

As always, drop us a comment or join our Webmaster Help Forum if you have questions!

Thursday, August 25, 2011

Google News now crawling with Googlebot

Webmaster Level: Intermediate

(Cross-posted on the Google News Blog)

Google News recently updated our infrastructure to crawl with Google’s primary user-agent, Googlebot. What does this mean? Very little to most publishers. Any news organizations that wish to opt out of Google News can continue to do so: Google News will still respect the robots.txt entry for Googlebot-News, our former user-agent, if it is more restrictive than the robots.txt entry for Googlebot.

Our Help Center provides detailed guidance on using the robots exclusion protocol for Google News, and publishers can contact the Google News Support Team if they have any questions, but we wanted to first clarify the following:
  • Although you’ll now only see the Googlebot user-agent in your site’s logs, there’s no need to worry: the appearance of Googlebot instead of Googlebot-News is independent of our inclusion policies. (You can always check whether your site is included in Google News by searching with the “site:” operator. For instance, enter “site:yournewssite.com” in the search field for Google News, and if you see results then we are currently indexing your news site.)

  • Your analytics tool will still be able to differentiate user traffic coming to your website from Google Search and traffic coming from Google News, so you should see no changes there. The main difference is that you will no longer see occasional automated visits to your site from the Googlebot-News crawler.

  • If you’re currently respecting our guidelines for Googlebot, you will not need to make any code changes to your site. Sites that have implemented subscriptions using a metered model or that have implemented First Click Free will not experience any changes. For sites that require registration, payment, or login prior to reading any full article, Google News will only be able to crawl and index the title and snippet that you show all users who visit your page. Our Webmaster Guidelines provide additional information about “cloaking” (i.e., showing a bot a different version than what users experience). Learn more about Google News and subscription publishers in this Help Center article.

  • Rest assured, your Sitemap will still be crawled. This change does not affect how we crawl News Sitemaps. If you are a News publisher who hasn’t yet set up a News Sitemap and are interested in getting started, please follow this link; a sample News Sitemap also appears after this list.

  • If you wish to opt out of Google News but remain in Google Search, you can simply disallow Googlebot-News and allow Googlebot, as in the sketch just below this list. For more information on how to do this, consult our Help Center.
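
For illustration, here’s a minimal robots.txt sketch of that opt-out (your site’s rules may need to differ, so treat this as a starting point rather than a recipe):

# Keep the Google News crawler out of the whole site.
User-agent: Googlebot-News
Disallow: /

# Leave web search crawling unrestricted (an empty Disallow permits everything).
User-agent: Googlebot
Disallow: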


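And if you’re setting up the News Sitemap mentioned above, a minimal file looks roughly like this (illustrative URLs and values only; the full set of required and optional tags is documented in our Help Center):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:news="http://www.google.com/schemas/sitemap-news/0.9">
  <url>
    <loc>http://www.example.com/articles/example-story.html</loc>
    <news:news>
      <news:publication>
        <news:name>Example News</news:name>
        <news:language>en</news:language>
      </news:publication>
      <news:publication_date>2011-08-25</news:publication_date>
      <news:title>Example story title</news:title>
    </news:news>
  </url>
</urlset>
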
As with any website, from time to time we need to make updates to our infrastructure. At the same time, we want to continue to provide as much control as possible to news websites. We hope we’ve answered any questions you might have about this update. If you have additional questions, please check out our Help Center.

Wednesday, August 24, 2011

Making the most of improvements to the +1 button

Webmaster level: All

For the past few months, you might have used +1 buttons to help visitors recommend your content on Google Search and on their Google Profiles. We’ve just announced a few changes that make +1 even more useful.

First, the +1 button now lets visitors share links to your pages on Google+. If someone wants to start a conversation about your content, it’s easy for them to do so. Second, you can use +Snippets to customize the name, image, and description that appear when your content is shared. Finally, new inline annotations help increase engagement by showing users a friend’s recommendation right on your page.

Here are a couple of tips to help you take full advantage of these improvements:

+Snippets
With sharing on Google+, the +1 button opens up your site to a valuable new source of traffic. +Snippets let you put your best face forward by customizing exactly what appears when your content is shared.

For example, if you’re running a movie review site, you might want visitors to share posts containing the title, movie poster, and a brief synopsis:



You may already be using this markup to build rich annotations for your pages on Google Search. If not, marking up your pages is simple. Just add the correct schema.org attributes to the data already present on your pages. You’ll set a name, image, and description in your code:

<body itemscope itemtype="http://schema.org/Article">
  <!-- name: the title shown when the page is shared -->
  <h1 itemprop="name">This is the article name</h1>
  <!-- image: the thumbnail shown alongside the snippet -->
  <img itemprop="image" src="thumbnail.jpg" />
  <!-- description: the text shown below the title -->
  <p itemprop="description">This is the description of the article.</p>
</body>
Example code containing each of the +Snippet attributes


For more details on alternate markup types, please see our technical documentation.

Inline annotations
Now, when a person visits a page that someone they know has +1’d, they can see a name and face reminding them to pay special attention to your content. Here’s how it looks:


Inline annotations let people see which of their friends +1’d your content


To add inline annotations, you need to update your +1 button code. Visit the configuration tool, select ‘inline’ from the ‘Annotation’ menu, and grab a new snippet of code.
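
As a rough illustration, the snippet the configuration tool generates looks something like this (the tool’s exact output may differ, so prefer what it gives you):

<!-- Load the +1 button script once per page. -->
<script type="text/javascript" src="https://apis.google.com/js/plusone.js"></script>
<!-- Render a +1 button with inline annotations. -->
<g:plusone annotation="inline"></g:plusone>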

Both sharing from +1 and inline annotations are rolling out fully over the next few days. To test these improvements right now, join our Platform Preview group.

Update later the same day, August 24, 2011: If you have any thoughts or feedback you'd like to share, continue the conversation on Google+.

Tuesday, August 23, 2011

Help us improve Google Search

Webmaster level: Advanced

Yes, we're looking for help improving Google search—but this time we're not asking you to submit more spam reports. Although we still appreciate receiving quality spam reports, today we've got a different opportunity for you to improve Google Search: how about YOU join our team and do the webspam fighting yourself?

Interested? Here's what we're looking for: open-minded academic graduates willing to work in a multinational environment in our Dublin office. Looking at a site's source code should not scare you. You should be excited about search engines and the Internet. It’s also essential that you share our aversion to webspam and the drive to make high-quality content accessible. PlayStation or foosball skills are a plus.


This is an actual work environment photo taken at the Dublin Google office.


If you’d like to know more about the positions available, here’s the full list of requirements and responsibilities. Interested candidates can email the recruiter directly.

Thursday, August 18, 2011

A new rich snippets format for music

Webmaster level: All

Since we introduced Rich Snippets back in 2009, we’ve created rich snippet formats for a variety of different content types, such as Events, People and Reviews, to show users relevant information about the content they can find on a site. Today, we announced the launch of rich snippets for music. With this new feature, site owners can mark up their pages using the newly created music markup spec on schema.org, and search results for that site may start displaying song information in the snippet so that users know that there are songs or samples there for them to listen to:



Several initial partners have implemented the music markup on their sites, including MySpace, Rhapsody and ReverbNation. As with other rich snippet formats, implementing the markup does not guarantee that your site will be displayed with the UI shown above on a given search; a variety of factors affect whether a particular rich snippet type will appear in our search results. However, having correct markup is a prerequisite for music rich snippets to ever be displayed. You can use the rich snippets testing tool to test the markup of your page and see an illustration of what the search result would look like with music rich snippets.
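
For a rough idea of what the markup can look like, here’s an illustrative sketch using schema.org’s music types (the property names and values below are examples only; the schema.org spec is the authoritative reference):

<body itemscope itemtype="http://schema.org/MusicAlbum">
  <h1 itemprop="name">Example album</h1>
  <!-- The artist, as a nested MusicGroup item. -->
  <div itemprop="byArtist" itemscope itemtype="http://schema.org/MusicGroup">
    <span itemprop="name">Example artist</span>
  </div>
  <!-- Each song, as a nested MusicRecording item. -->
  <div itemprop="track" itemscope itemtype="http://schema.org/MusicRecording">
    <span itemprop="name">Example song</span>
    <a itemprop="url" href="http://www.example.com/song1">Listen</a>
  </div>
</body>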

For now, music rich snippets will display song information when users search for artists, album names or song names. We’ll continue working both on expanding our existing rich snippets formats and on creating new ones, so keep watching for updates about new types of content that you can surface for users right in your site’s snippets.

If you have any questions about this new rich snippets format, you can head over to our Webmaster Forum and ask. You can also check out our page on rich snippets in our Webmaster Tools Help Center, which includes an article specifically about music rich snippets.

Tuesday, August 16, 2011

Introducing new and improved sitelinks

Webmaster level: All



This week we launched an update to sitelinks to improve the organization and quality of our search results. Sitelinks are the two columns of links that appear under some search results and ads, helping users navigate deeper into the site. Sitelinks haven’t changed fundamentally: they’re still generated and ranked algorithmically based on the link structure of your site, and they’ll only appear if useful for a particular query.

Sitelinks before today’s changes


Here’s how we’ve improved sitelinks with today’s launch:

  • Visibility. The links have been boosted to full-sized text and augmented with a green URL and a one-line text snippet, much like regular search results. This increases the prominence of both the individual sitelinks and the top site overall, making them easier to find.


  • Flexibility. Until now, each site had a fixed list of sitelinks that would either all appear or not appear; there was no query-specific ranking of the links. With today’s launch, sitelink selection and ranking can change from query to query, allowing more optimized results. In addition, the maximum number of sitelinks that can appear for a site has been raised from eight to 12, and the number shown also varies by query.


  • Clarity. Previously, pages from your site could either appear in the sitelinks, in the regular results, or both. Now we’re making the separation between the top domain and other domains a bit clearer. If sitelinks appear for the top result, then the rest of the results below them will be from other domains. One exception to this is if the top result for a query is a subpart of a domain. For instance, the query [the met exhibitions] has www.metmuseum.org/special/ as the top result, and its sitelinks are all from within the www.metmuseum.org/special section of the site. However, the rest of the results may be from other parts of the metmuseum.org domain, like store.metmuseum.org or blog.metmuseum.org/alexandermcqueen/about.


  • Quality. These user-visible changes are accompanied by quality improvements behind the scenes. The core improvement is that we’ve combined the signals we use for sitelinks generation and ranking -- like the link structure of your site -- with our more traditional ranking system, creating a better, unified algorithm. From a ranking perspective, there’s really no separation between “regular” results and sitelinks anymore.


Sitelinks after today’s changes


These changes are also reflected in Webmaster Tools, where you can manage the sitelinks that appear for your site. You can now suggest a demotion to a sitelink if it’s inappropriate or incorrect, and the algorithms will take these demotions into account when showing and ranking the links (although removal is not guaranteed). Since sitelinks can vary over time and by query, it no longer makes sense to select from a set list of links -- now, you can suggest a demotion of any URL for any parent page. Up to 100 demotions will be allowed per site. Finally, all current sitelink blocks in Webmaster Tools will automatically be converted to the demotions system. More information can be found in our Webmaster Tools Help Center.



It’s also worth mentioning a few things that haven’t changed. One-line sitelinks (a single row of links that can appear on multiple results) and sitelinks on ads aren’t affected. Existing best practices for the link structure of your site are still relevant today, both for generating good-quality sitelinks and for making navigation easier for your visitors. And, as always, you can raise any questions or comments in our Webmaster Help Forum.



Friday, August 12, 2011

High-quality sites algorithm launched in additional languages

(Cross-posted on the Inside Search blog)

Webmaster level: All

For many months, we’ve been focused on trying to return high-quality sites to users. Earlier this year, we rolled out our “Panda” change for searches in English around the world. Today we’re continuing that effort by rolling out our algorithmic search improvements in additional languages. Our scientific evaluation data show that this change improves our search quality across the board, and the response to Panda from users has been very positive.

For most languages, this change typically impacts 6-9% of queries to a degree that a user might notice. This is distinctly lower than the initial launch of Panda, which affected almost 12% of English queries to a noticeable degree. We are launching this change for all languages except Chinese, Japanese, and Korean, where we continue to test improvements.

For sites that are affected by this algorithmic change, we have a post providing guidance on how Google searches for high-quality sites. We also have webmaster forums in many languages for publishers who wish to give additional feedback and get advice. We’ll continue working to do the right thing for our users and serve them the best results we can.

Tuesday, August 9, 2011

New webmaster tutorial videos

Webmaster level: All

Over the past couple of years, we’ve released over 375 videos on our YouTube channel, with the majority of them answering direct questions from webmasters. Today, we’re starting to release a freshly baked batch of videos, and you might notice that some of these are a little different. Don’t worry, they still have Matt Cutts in a variety of colored shirts. Instead of only focusing on quick answers to specific questions, we’ve created some longer videos which cover important webmaster-related topics. For example, if you were wondering what the limits are for 301 redirects at Google, we now have a single video for that:



Thanks to everyone who submitted questions for this round. You can be the first to hear about the new videos as they’re released by subscribing to our channel or following us on Twitter.

Monday, August 8, 2011

A new, improved form for reporting webspam

Webmaster level: All


Everyone on the web knows how frustrating it is to perform a search and find websites gaming the search results. These websites can be considered webspam: sites that violate Google’s Webmaster Guidelines and try to trick Google into ranking them highly. Here at Google, we work hard to keep these sites out of your search results, but if you still see them, you can notify us by using our webspam report form. We’ve just rolled out a new, improved webspam report form, so it’s now easier than ever to help us maintain the quality of our search results. Let’s take a look at some of the new form’s features:


Option to report various search issues
Some search results, such as sites with malware or phishing, are not necessarily webspam but still degrade the search experience. We’ve noticed that our users sometimes report these other issues using our webspam report form, causing a delay between when a user reports the issue and when the appropriate team at Google handles it. The new form’s interstitial page allows you to report these other search issues directly to the correct teams so that they can address your concerns in a timely manner.


Simplified form with informative links
To improve the readability of the form, we’ve made the text more concise, and we’ve integrated helpful links into the form’s instructions. Now, the ability to look up our Webmaster Guidelines, get advice on writing actionable form comments, and block sites from your personalized search results is just one click away.


Thank you page with personalization options
Some of our most valuable information comes from our users, and we appreciate the webspam reports you submit to us. The thank you page explains what happens once we’ve received your webspam report. If you want to report more webspam, there’s a link back to the form page and instructions on how to report webspam more efficiently with the Chrome Webspam Report Extension. We also provide information on how you can immediately block the site you’ve reported from your personalized search results, for example, by managing blocked sites in your Google Account.


At Google, we strive to provide the highest quality, most relevant search results, so we take your webspam reports very seriously. We hope our new form makes the experience of reporting webspam as painless as possible (and if it doesn’t, feel free to let us know in the comments).


Wednesday, August 3, 2011

Submit URLs to Google with Fetch as Googlebot

Webmaster Level: All

The Fetch as Googlebot feature in Webmaster Tools now provides a way to submit new and updated URLs to Google for indexing. After you fetch a URL as Googlebot, if the fetch is successful, you’ll now see the option to submit that URL to our index. When you submit a URL in this way, Googlebot will crawl the URL, usually within a day. We’ll then consider it for inclusion in our index. Note that we don’t guarantee that every URL submitted in this way will be indexed; we’ll still use our regular processes—the same ones we use on URLs discovered in any other way—to evaluate whether a URL belongs in our index.

This new functionality may help you in several situations: if you’ve just launched a new site, or added some key new pages, you can ask Googlebot to find and crawl them immediately rather than waiting for us to discover them naturally. You can also submit URLs that are already indexed in order to refresh them, say if you’ve updated some key content for the event you’re hosting this weekend and want to make sure we see it in time. It could also help if you’ve accidentally published information that you didn’t mean to, and want to update our cached version after you’ve removed the information from your site.

How to submit a URL
First, use Diagnostics > Fetch as Googlebot to fetch the URL you want to submit to Google. If the URL is successfully fetched, you’ll see a new “Submit to index” link appear next to the fetched URL.
Once you click “Submit to index” you’ll see a dialog box that allows you to choose whether you want to submit only the one URL, or that URL and all its linked pages.
When submitting individual URLs, we have a maximum limit of 50 submissions per week; when submitting URLs with all linked pages, the limit is 10 submissions per month. You can see how many submissions you have left on the Fetch as Googlebot page. Any URL submitted should point to content that would be suitable for Google Web Search, so if you're trying to submit images or videos you should use Sitemaps instead.

Submit URLs to Google without verifying
In conjunction with this update to Fetch as Googlebot, we've also updated the public "Add your URL to Google" form. It's now the Crawl URL form. It has the same quota limits for submitting pages to the index as the Fetch as Googlebot feature but doesn't require verifying ownership of the site in question, so you can submit any URLs that you want crawled and indexed.

Note that Googlebot is already pretty good about finding and crawling new content in a timely fashion, so don’t feel obligated to use this tool for every change or update on your site. But if you’ve got a URL whose crawling or indexing you want to speed up, consider submitting it using the Crawl URL form or the updated Fetch as Googlebot feature in Webmaster Tools. Feel free to comment here or visit our Webmaster Help Forum if you have more detailed questions.