Tuesday, October 16, 2012

A new tool to disavow links

Webmaster level: Advanced

Today we’re introducing a tool that enables you to disavow links to your site. If you’ve been notified of a manual spam action based on “unnatural links” pointing to your site, this tool can help you address the issue. If you haven’t gotten this notification, this tool generally isn’t something you need to worry about.

First, a quick refresher. Links are one of the most well-known signals we use to order search results. By looking at the links between pages, we can get a sense of which pages are reputable and important, and thus more likely to be relevant to our users. This is the basis of PageRank, which is one of more than 200 signals we rely on to determine rankings. Since PageRank is so well-known, it’s also a target for spammers, and we fight linkspam constantly with algorithms and by taking manual action.

If you’ve ever been caught up in linkspam, you may have seen a message in Webmaster Tools about “unnatural links” pointing to your site. We send you this message when we see evidence of paid links, link exchanges, or other link schemes that violate our quality guidelines. If you get this message, we recommend that you remove from the web as many spammy or low-quality links to your site as possible. This is the best approach because it addresses the problem at the root. By removing the bad links directly, you’re helping to prevent Google (and other search engines) from taking action again in the future. You’re also helping to protect your site’s image, since people will no longer find spammy links pointing to your site on the web and jump to conclusions about your website or business.

If you’ve done as much as you can to remove the problematic links, and there are still some links you just can’t seem to get down, that’s a good time to visit our new Disavow links page. When you arrive, you’ll first select your site.


You’ll then be prompted to upload a file containing the links you want to disavow.


The format is straightforward. All you need is a plain text file with one URL per line. An excerpt of a valid file might look like the following:

# Contacted owner of spamdomain1.com on 7/1/2012 to

# ask for link removal but got no response
domain:spamdomain1.com
# Owner of spamdomain2.com removed most links, but missed these
http://www.spamdomain2.com/contentA.html
http://www.spamdomain2.com/contentB.html
http://www.spamdomain2.com/contentC.html

In this example, lines that begin with a pound sign (#) are considered comments and Google ignores them. The “domain:” keyword indicates that you’d like to disavow links from all pages on a particular site (in this case, “spamdomain1.com”). You can also request to disavow links on specific pages (in this case, three individual pages on spamdomain2.com). We currently support one disavowal file per site and the file is shared among site owners in Webmaster Tools. If you want to update the file, you’ll need to download the existing file, modify it, and upload the new one. The file size limit is 2MB.
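If you build your disavow file with a script, a quick sanity check before uploading can catch formatting mistakes. Here's a minimal sketch in Python (an unofficial example, not a Google tool; it only mirrors the format rules described above, including the 2MB limit):

# Unofficial sanity check for a disavow file, mirroring the format rules above.
import os, sys

MAX_BYTES = 2 * 1024 * 1024  # the 2MB upload limit mentioned above

def check(path):
    if os.path.getsize(path) > MAX_BYTES:
        print("File is larger than 2MB")
    with open(path, encoding="utf-8") as f:
        for lineno, raw in enumerate(f, 1):
            line = raw.strip()
            if not line or line.startswith("#"):
                continue  # blank lines and comments are ignored
            if line.startswith("domain:"):
                continue  # disavows every link from that site
            if not (line.startswith("http://") or line.startswith("https://")):
                print("Line %d doesn't look like a URL or a domain: entry: %s" % (lineno, line))

if __name__ == "__main__":
    check(sys.argv[1])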

One great place to start looking for bad links is the “Links to Your Site” feature in Webmaster Tools. From the homepage, select the site you want, navigate to Traffic > Links to Your Site > Who links the most > More, then click one of the download buttons. This file lists pages that link to your site. If you click “Download latest links,” you’ll see dates as well. This can be a great place to start your investigation, but be sure you don’t upload the entire list of links to your site -- you don’t want to disavow all your links!

To learn more about the feature, check out our Help Center, and we’d welcome your comments and questions in our forum. You’ll also find a video about the tool and a quick Q&A below.





Wednesday, October 10, 2012

Make the web faster with mod_pagespeed, now out of Beta



If your page is on the web, speed matters. For developers and webmasters, making your page faster shouldn’t be a hassle, which is why we introduced mod_pagespeed in 2010. Since then the development team has been working to improve the functionality, quality and performance of this open-source Apache module that automatically optimizes web pages and their resources. Now, after almost two years and eighteen releases, we are announcing that we are taking off the Beta label.

We’re committed to working with the open-source community to continue evolving mod_pagespeed, including more, better and smarter optimizations and support for other web servers. Over 120,000 sites are already using mod_pagespeed to improve the performance of their web pages using the latest techniques and trends in optimization. The product is used worldwide by individual sites, and is also offered by hosting providers, such as DreamHost, Go Daddy and content delivery networks like EdgeCast. With the move out of beta we hope that even more sites will soon benefit from the web performance improvements offered through mod_pagespeed.
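If you've installed mod_pagespeed and want a quick way to confirm it is active, you can look for the response header the module adds to pages it serves. This is an unofficial sketch: http://www.example.com/ is a placeholder for your own site, and the exact header value can vary between releases.

# Unofficial check: look for the response header that mod_pagespeed adds to pages it serves.
from urllib.request import urlopen

def pagespeed_version(url):
    with urlopen(url) as resp:
        # mod_pagespeed typically reports its version in an X-Mod-Pagespeed header
        return resp.headers.get("X-Mod-Pagespeed")

if __name__ == "__main__":
    print(pagespeed_version("http://www.example.com/") or "header not found")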

mod_pagespeed is a key part of our goal to help make the web faster for everyone. Users prefer faster sites and we have seen that faster pages lead to higher user engagement, conversions, and retention. In fact, page speed is one of the signals in search ranking and ad quality scores. Besides evangelizing for speed, we offer tools and technologies to help measure, quantify, and improve performance, such as Site Speed Reports in Google Analytics, PageSpeed Insights, and PageSpeed Optimization products. In fact, both mod_pagespeed and PageSpeed Service are based on our open-source PageSpeed Optimization Libraries project, and are important ways in which we help websites take advantage of the latest performance best practices.



To learn more about mod_pagespeed and how to incorporate it in your site, watch our recent Google Developers Live session or visit the mod_pagespeed product page.

Tuesday, October 9, 2012

What Makes Content Search Engine Friendly

In this section, we will look at what search engine friendly content actually is.  Although you may think it is a matter of stuffing your pages with keywords and phrases, it is not; in fact, doing so can cause your site to drop in the search results.  Keep in mind that you need to write copy that works not only for crawlers but also for human readers.  What is the point of having a highly ranked website that none of your visitors can understand?  It is important that your site is easy to use for your visitors as well as for crawlers.

First of all, you need to answer a few basic questions about your site and its keywords.

How to Determine the Keyword Choices for Your Site

Making a web page attractive to search engines is a key factor in its success.  One of the main ways to rank well with search engines is to optimize the visible keywords and phrases on your pages.   To be effective in your keyword strategy, take the following two steps.

Keyword Selection
You will need to work out what your pages are offering, and decide which terms your prospective visitors might use to search for those pages.  You can then build your keywords and phrases around those terms.

Monday, October 8, 2012

Effective Keyword Usage for SEO

Keep in mind that search engine promotion does not need to be complex.  Make sure that each and every page of your website is treated as a unique entity and handled appropriately where SEO is concerned.  Below are some recommendations that should help you achieve your desired SEO results.

Get the Search Phrases Right
It is essential that the terms you choose meet your marketing requirements, so do not select terms that are too generic.  Using more specific terms will result in a better position for your website.  It is also essential to select terms that are relevant to your website. For example, you will find that “optimizing search engines” and “search engine optimization” produce completely different rankings.

What Is the Primary Link Structure within a Site?

This is probably the most obvious, yet one of the most neglected, aspects of SEO.  A sound link structure makes sure that crawlers can actually discover (crawl) all of your website's pages.  If they cannot find them, those pages will not get crawled, which means they will not get indexed, and no amount of SEO will help.
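As one practical check, you can do a small breadth-first crawl of your own site, starting from the home page, and compare what it finds against the pages you expect to be indexed. The sketch below is only an illustration using the Python standard library; http://www.example.com/ is a placeholder, and a real crawler should also respect robots.txt and rate limits.

# Rough sketch: breadth-first crawl of internal links, starting from the home page.
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen

class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start, limit=200):
    host = urlparse(start).netloc
    seen, queue = set(), deque([start])
    while queue and len(seen) < limit:
        url = queue.popleft()
        if url in seen:
            continue
        seen.add(url)
        try:
            with urlopen(url, timeout=10) as resp:
                html = resp.read().decode("utf-8", "replace")
        except Exception:
            continue  # a page that can't be fetched can't be crawled by search engines either
        collector = LinkCollector()
        collector.feed(html)
        for href in collector.links:
            absolute = urljoin(url, href).split("#")[0]
            if urlparse(absolute).netloc == host:  # stay on the same site
                queue.append(absolute)
    return seen

if __name__ == "__main__":
    pages = crawl("http://www.example.com/")  # placeholder start URL
    print("Pages reachable from the home page:", len(pages))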

Below are some points to take note of regarding the link structure of your website.

Why Search Engine Optimization?


SEO (search engine optimization), as described earlier, is a sub-field of search engine marketing.

Unfortunately, with SEO there are no shortcuts, and if you are looking for a way to get quick and easy results, this is not it.  Instead, you will need to put in some effort, especially with regard to the actual content of your website.

You will also need a lot of patience, as results do not happen overnight.  So if you are looking to improve your website's position in the search engines, consider this early on when deciding to use SEO.

However, the following points are essential:

Buy-Back Policy


A buy-back policy is a stipulation in a company's terms that protects its distributors in case they quit the business and want a refund. Normally this applies more to offline companies.

A strong company usually offers refund guarantees for its products even after a product has been partly used, or even completely used up. This kind of satisfaction guarantee normally obliges the company to absorb the risk, because there will always be plenty of quitters in network marketing, as well as people who want to rip the company off by trying the product for free.

In most countries, companies offer either a 10-day cooling-off period, or a six-month to one-year period during which they will refund 90% to 100% of the product price (whether the product is used or barely used, depending on the company), as required by national regulations or by the Federal Trade Commission.

Payment Transparency




A company's compensation plan must be as clear as possible and published in a simple way that can be understood by experts and beginners alike. Payment transparency means that the portion of money returned to distributors is laid out in detail.

Here is an example:

Point Value to Money Calculation

Point value, or PV, in most network marketing usually refers to a monetary value that translates into how much money you will earn after purchasing a product from the company. An illustration would look like this:

SCENARIO 1

Let’s say a product costs $100 and each dollar spent on the product gives you 1 PV:
I get paid on 100 PV for purchasing $100 worth of products, and if I qualify for a 10% override, I earn $10.
Hence the dollar-to-PV ratio is $1 : 1 PV – a dollar-for-dollar comparison.

SCENARIO 2
Sometimes the same product price will only give you 0.5 PV for each $1 spent.
In this case I get paid on 50 PV for purchasing $100. If I qualify for a 10% override, I earn only $5.
Hence the dollar-to-PV ratio is now $2 : 1 PV – you do not earn as much as at 1 : 1, because you have to spend more to generate the same number of points.
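To make the arithmetic explicit, here is the same calculation as a few lines of Python. This is purely illustrative; the PV ratio and override percentage are inputs that differ from company to company.

# Illustration of the PV-to-money arithmetic from the two scenarios above.
def earnings(spend_dollars, pv_per_dollar, override_rate):
    pv = spend_dollars * pv_per_dollar   # points earned on the purchase
    return pv * override_rate            # payout on those points

# Scenario 1: $1 spent = 1 PV, 10% override on 100 PV -> $10
print(earnings(100, 1.0, 0.10))   # 10.0

# Scenario 2: $1 spent = 0.5 PV, 10% override on 50 PV -> $5
print(earnings(100, 0.5, 0.10))   # 5.0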

Understanding Basic Plan Mechanics

Now that we are clear on how important it is to understand compensation plans, let us learn more about their basic mechanics. This will be covered over the course of a few guides.

Over the next few pages, I will outline the standard plan mechanics and summarize their advantages based on the following topics:

1.    Which network building techniques are the best?
2.    How well does the plan synergize with the product?
3.    If I start a business with this model, what are its strengths and weaknesses?

Clearing Up Common Misconceptions

Let’s clear up some common misconceptions regarding compensation plans, both for the distributor and for the company.

“The comp plan is not important; all I need to do is sponsor people”
A good plan will make or break a company. Poor earnings survivability among distributors is one of the main causes of attrition. Would you be as excited as when you first started if you were not making money?

Factors like payout, maintenance requirements, and even the joining fee are important. If a company has fairly high maintenance requirements and its distributors are not earning money, they will drop out even faster.

“I will only become a serious builder once I find the best compensation plan”
There are people who do not work hard enough in their business, and then blame the plan for their failure.

Those Who Fail to Plan, Plan to Fail

If you are familiar with the network marketing industry, it will come as no surprise that the majority of new distributors or small business owners spend a great deal of time and energy on network marketing training, because they simply do not understand how the compensation or marketing plan in their company works!

Every year, network marketing companies and teams spend huge sums on compensation plan training, because many people simply find today's plans too complicated, or do not take network marketing seriously enough to be well prepared.

Isn’t this alarming? Consider this:

Wednesday, October 3, 2012

Rich snippets guidelines

Webmaster level: All


Traditional, text-only, search result snippets aim to summarize the content of a page in our search results. Rich snippets (shown above) allow webmasters to help us provide even better summaries using structured data markup that they can add to their pages. Today we're introducing a set of guidelines to help you implement high quality structured data markup for rich snippets.

Once you've correctly added structured data markup to your site, rich snippets are generated algorithmically based on that markup. If the markup on a page offers an accurate description of the page's content, is up-to-date, and is visible and easily discoverable on your page and by users, our algorithms are more likely to decide to show a rich snippet in Google’s search results.

Alternatively, if the rich snippets markup on a page is spammy, misleading, or otherwise abusive, our algorithms are much more likely to ignore the markup and render a text-only snippet. Keep in mind that, while rich snippets are generated algorithmically, we do reserve the right to take manual action (e.g., disable rich snippets for a specific site) in cases where we see actions that hurt the experience for our users.

To illustrate these guidelines with some examples:
  • If your page is about a band, make sure you mark up concerts being performed by that band, not by related bands or bands in the same town.
  • If you sell products through your site, make sure reviews on each page are about that page's product and not the store itself.
  • If your site provides song lyrics, make sure reviews are about the quality of the lyrics, not the quality of the song itself.
In addition to the general rich snippets quality guidelines we're publishing today, you'll find usage guidelines for specific types of rich snippets in our Help Center. As always, if you have any questions or feedback, please tell us in the Webmaster Help Forum.

Tuesday, October 2, 2012

Google Webmaster Guidelines updated

Webmaster level: All

Today we’re happy to announce an updated version of our Webmaster Quality Guidelines. Both our basic quality guidelines and many of our more specific articles (like those on links schemes or hidden text) have been reorganized and expanded to provide you with more information about how to create quality websites for both users and Google.

The main message of our quality guidelines hasn’t changed: Focus on the user. However, we’ve added more guidance and examples of behavior that you should avoid in order to keep your site in good standing with Google’s search results. We’ve also added a set of quality and technical guidelines for rich snippets, as structured markup is becoming increasingly popular.

We hope these updated guidelines will give you a better understanding of how to create and maintain Google-friendly websites.

Monday, October 1, 2012

Keeping you informed of critical website issues

Webmaster level: All

Having a healthy and well-performing website is important, both to you as the webmaster and to your users. When we discover critical issues with a website, Webmaster Tools will now let you know by automatically sending an email with more information.

We’ll only notify you about issues that we think have significant impact on your site’s health or search performance and which have clear actions that you can take to address the issue. For example, we’ll email you if we detect malware on your site or see a significant increase in errors while crawling your site.

For most sites these kinds of issues will occur rarely. If your site does happen to have an issue, we cap the number of emails we send over a certain period of time to avoid flooding your inbox.  If you don’t want to receive any email from Webmaster Tools you can change your email delivery preferences.

We hope that you find this change a useful way to stay up-to-date on critical and important issues regarding your site’s health. If you have any questions, please let us know via our Webmaster Help Forum.

Thursday, September 20, 2012

Structured Data Testing Tool

Webmaster level: All

Today we’re excited to share the launch of a shiny new version of the rich snippet testing tool, now called the structured data testing tool. The major improvements are:
  • We’ve improved how we display rich snippets in the testing tool to better match how they appear in search results.
  • The brand new visual design makes it clearer what structured data we can extract from the page, and how that may be shown in our search results.
  • The tool is now available in languages other than English to help webmasters from around the world build structured-data-enabled websites.
Here’s what it looks like:
The new structured data testing tool works with all supported rich snippets and authorship markup, including applications, products, recipes, reviews, and others.

Try it yourself and, as always, if you have any questions or feedback, please tell us in the Webmaster Help Forum.

Written by Yong Zhu on behalf of the rich snippets testing tool team



Friday, September 14, 2012

Answering the top questions from government webmasters

Webmaster level: Beginner - Intermediate

Government sites, from city to state to federal agencies, are extremely important to Google Search. For one thing, governments have a lot of content — and government websites are often the canonical source of information that’s important to citizens. Around 20 percent of Google searches are for local information, and local governments are experts in their communities.

That’s why I’ve spoken at the National Association of Government Webmasters (NAGW) national conference for the past few years. It’s always interesting speaking to webmasters about search, but the people running government websites have particular concerns and questions. Since some questions come up frequently I thought I’d share this FAQ for government websites.

Question 1: How do I fix an incorrect phone number or address in search results or Google Maps?

Although managing their agency’s site is plenty of work, government webmasters are often called upon to fix problems found elsewhere on the web too. By far the most common question I’ve taken is about fixing addresses and phone numbers in search results. In this case, government site owners really can do it themselves, by claiming their Google+ Local listing. Incorrect or missing phone numbers, addresses, and other information can be fixed by claiming the listing.

Most locations in Google Maps have a Google+ Local listing — businesses, offices, parks, landmarks, etc. I like to use the San Francisco Main Library as an example: it has contact info, detailed information like the hours they’re open, user reviews and fun extras like photos. When we think users are searching for libraries in San Francisco, we may display a map and a listing so they can find the library as quickly as possible.

If you work for a government agency and want to claim a listing, we recommend using a shared Google Account with an email address at your .gov domain if possible. Usually, ownership of the page is confirmed via a phone call or post card.

Question 2: I’ve claimed the listing for our office, but I have 43 different city parks to claim in Google Maps, and none of them have phones or mailboxes. How do I claim them?

Use the bulk uploader! If you have 10 or more listings / addresses to claim at the same time, you can upload a specially-formatted spreadsheet. Go to www.google.com/places/, click the "Get started now" button, and then look for the "bulk upload" link.

If you run into any issues, use the Verification Troubleshooter.

Question 3: We're moving from a .gov domain to a new .com domain. How should we move the site?

We have a Help Center article with more details, but the basic process involves the following steps:
  • Make sure you have both the old and new domain verified in the same Webmaster Tools account.
  • Use a 301 redirect on all pages to tell search engines your site has moved permanently.
    • Don't do a single redirect from all pages to your new home page — this gives a bad user experience.
    • If there's no 1:1 match between pages on your old site and your new site (recommended), try to redirect to a new page with similar content.
    • If you can't do redirects, consider cross-domain canonical links.
  • Make sure to check if the new location is crawlable by Googlebot using the Fetch as Google feature in Webmaster Tools.
  • Use the Change of Address tool in Webmaster Tools to notify Google of your site's move.
  • Have a look at the Links to Your Site in Webmaster Tools and inform the important sites that link to your content about your new location.
  • We recommend not implementing other major changes at the same time, like large-scale content, URL structure, or navigational updates.
  • To help Google pick up new URLs faster, use the Fetch as Google tool to ask Google to crawl your new site, and submit a Sitemap listing the URLs on your new site.
  • To prevent confusion, it's best to retain control of your old site’s domain and keep redirects in place for as long as possible — at least 180 days.
What if you’re moving just part of the site? This question came up too — for example, a city might move its "Tourism and Visitor Info" section to its own domain.

In that case, many of the same steps apply: verify both sites in Webmaster Tools, use 301 redirects, clean up old links, etc. You don't need to use the Change of Address form in Webmaster Tools, since only part of your site is moving. If for some reason you’ll have some of the same content on both sites, you may want to include a cross-domain canonical link pointing to the preferred domain.
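When you've set up the redirects for a move like this, it's worth spot-checking that the old URLs really return 301s pointing at the right new pages. Here's a small, unofficial sketch; the URL pairs below are placeholders for your own old-to-new mapping.

# Unofficial spot check: do old URLs 301-redirect to the expected new URLs?
from http.client import HTTPConnection
from urllib.parse import urlparse

def check_redirect(old_url, expected_new_url):
    parts = urlparse(old_url)
    conn = HTTPConnection(parts.netloc, timeout=10)
    conn.request("HEAD", parts.path or "/")
    resp = conn.getresponse()
    status, location = resp.status, resp.getheader("Location")
    conn.close()
    ok = (status == 301 and location == expected_new_url)
    print("%s -> %s %s %s" % (old_url, status, location, "OK" if ok else "CHECK"))

if __name__ == "__main__":
    # Placeholder mapping of old .gov pages to new .com pages
    mapping = {
        "http://www.example.gov/tourism/": "http://www.example.com/tourism/",
        "http://www.example.gov/visitors.html": "http://www.example.com/visitors.html",
    }
    for old, new in mapping.items():
        check_redirect(old, new)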

Question 4: We've done a ton of work to create unique titles and descriptions for pages. How do we get Google to pick them up?

First off, that's great! Better titles and descriptions help users decide to click through to get the information they need on your page. The government webmasters I’ve spoken with care a lot about the content and organization of their sites, and work hard to provide informative text for users.

Google's generation of page titles and descriptions (or "snippets") is completely automated and takes into account both the content of a page as well as references to it that appear on the web. Changes are picked up as we recrawl your site. But you can do two things to let us know about URLs that have changed:
  • Submit an updated XML Sitemap so we know about all of the pages on your site.
  • In Webmaster Tools, use the Fetch as Google feature on a URL you’ve updated. Then you can choose to submit it to the index.
    • You can choose to submit all of the linked pages as well — if you’ve updated an entire section of your site, you might want to submit the main page or an index page for that section to let us know about a broad collection of URLs.

Question 5: How do I get into the YouTube government partner program?

For this question, I have bad news, good news, and then even better news. On the one hand, the government partner program has been discontinued. But don’t worry, because most of the features of the program are now available to your regular YouTube account. For example, you can now upload videos longer than 10 minutes.

Did I say I had even better news? YouTube has added a lot of functionality useful for governments in the past year.

I hope this FAQ has been helpful, but I’m sure I haven’t covered everything government webmasters want to know. I highly recommend our Webmaster Academy, where you can learn all about making your site search-engine friendly. If you have a specific question, please feel free to add a question in the comments or visit our really helpful Webmaster Central Forum.

Wednesday, August 29, 2012

Site Errors Breakdown

Webmaster level: All

Today we’re announcing more detailed Site Error information in Webmaster Tools. This information is useful when looking for the source of your Site Errors. For example, if your site suffers from server connectivity problems, your server may simply be misconfigured; then again, it could also be completely unavailable!  Since each Site Error (DNS, Server Connectivity, and Robots.txt Fetch) is comprised of several unique issues, we’ve broken down each category into more specific errors to provide you with a better analysis of your site’s health.

Site Errors will display statistics for each of your site-wide crawl errors from the past 90 days.  In addition, it will show the failure rates for any category-specific errors that have been affecting your site.
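If you'd like to reproduce rough versions of these checks yourself, the three Site Error categories map to things you can test directly: DNS resolution, a connection to your server, and a fetch of robots.txt. Here's a small, unofficial sketch; www.example.com is a placeholder for your own hostname.

# Rough, unofficial version of the three Site Error checks: DNS, connectivity, robots.txt.
import socket
from urllib.request import urlopen
from urllib.error import URLError

HOST = "www.example.com"  # placeholder hostname

def check_dns(host):
    try:
        return socket.gethostbyname(host)
    except socket.gaierror as e:
        return "DNS error: %s" % e

def check_connect(host, port=80):
    try:
        with socket.create_connection((host, port), timeout=10):
            return "connected"
    except OSError as e:
        return "connection error: %s" % e

def check_robots(host):
    try:
        with urlopen("http://%s/robots.txt" % host, timeout=10) as resp:
            return "HTTP %d" % resp.status
    except URLError as e:
        return "fetch error: %s" % e

if __name__ == "__main__":
    print("DNS:", check_dns(HOST))
    print("Server connectivity:", check_connect(HOST))
    print("robots.txt fetch:", check_robots(HOST))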




If you’re not sure what a particular error means, you can read a short description of it by hovering over its entry in the legend.  You can find more detailed information by following the “More info” link in the tooltip.


We hope that these changes will make Site Errors even more informative and helpful in keeping your site in tip-top shape.  If you have any questions or suggestions, please let us know through the Webmaster Tools Help Forum.

Written by Cesar Cuenca and Tiffany Wang, Webmaster Tools Interns

Monday, August 20, 2012

Search Queries Alerts in Webmaster Tools

Webmaster level: All

We know many of you check Webmaster Tools daily (thank you!), but not everybody has the time to monitor the health of their site 24/7. It can be time consuming to analyze all the data and identify the most important issues. To make it a little bit easier we’ve been incorporating alerts into Webmaster Tools. We process the data for your site and try to detect the events that could be most interesting for you. Recently we rolled out alerts for Crawl Errors and today we’re introducing  alerts for Search Queries data.

The Search Queries feature in Webmaster Tools shows, among other things, impressions and clicks for your top pages over time. For most sites, these numbers follow regular patterns, so when sudden spikes or drops occur, it can make sense to look into what caused them. Some changes are due to differing demand for your content, other times they may be due to technical issues that need to be resolved, such as broken redirects. For example, a steady stream of clicks which suddenly drops to zero is probably worth investigating.
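The alerts themselves come from Google's internal processing, but the underlying idea is easy to illustrate: compare each day's clicks against a recent average and flag large drops. Here's an illustrative sketch of that idea only; it is not Google's alerting algorithm, and the window and threshold are arbitrary.

# Illustrative only: flag days where clicks fall far below the trailing average.
def flag_drops(daily_clicks, window=7, drop_ratio=0.5):
    alerts = []
    for i in range(window, len(daily_clicks)):
        recent = daily_clicks[i - window:i]
        average = sum(recent) / window
        if average > 0 and daily_clicks[i] < average * drop_ratio:
            alerts.append((i, daily_clicks[i], average))
    return alerts

clicks = [120, 115, 130, 125, 118, 122, 127, 119, 0, 1]  # made-up data with a sudden drop
for day, value, avg in flag_drops(clicks):
    print("Day %d: %d clicks vs. recent average %.1f" % (day, value, avg))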

The alerts look like this:




We’re still working on the sensitivity threshold of the messages and welcome your feedback in our help forums. We hope the new alerts will be useful. Don’t forget to sign up for email forwarding to receive them in your inbox.

Posted by the Tech Lead, Webmaster Tools

Tuesday, August 14, 2012

Configuring URL Parameters in Webmaster Tools

Webmaster Level: Intermediate to Advanced

We recently filmed a video (with slides available) to provide more information about the URL Parameters feature in Webmaster Tools. The URL Parameters feature is designed for webmasters who want to help Google crawl their site more efficiently, and who manage a site with -- you guessed it -- URL parameters! To be eligible for this feature, the URL parameters must be configured in key/value pairs like item=swedish-fish or category=gummy-candy in the URL http://www.example.com/product.php?item=swedish-fish&category=gummy-candy.


Guidance for common cases when configuring URL Parameters. Music in the background masks the ongoing pounding of my neighbor’s construction!

URL Parameter settings are powerful. By telling us how your parameters behave and the recommended action for Googlebot, you can improve your site’s crawl efficiency. On the other hand, if configured incorrectly, you may accidentally recommend that Google ignore important pages, resulting in those pages no longer being available in search results. (There's an example provided in our improved Help Center article.) So please take care when adjusting URL Parameters settings, and be sure that the actions you recommend for Googlebot make sense across your entire site.
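To see why these settings matter, consider how easily a handful of parameters can multiply the number of URLs pointing at the same content. The sketch below illustrates the kind of URL collapsing that a correct configuration allows, using the example URL above; it is only an illustration (the ignored parameter names are made up), not how Googlebot actually works.

# Illustrative sketch: collapse URLs that differ only in parameters that don't change content.
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# Suppose we've indicated that "sessionid" and "sort" don't change page content.
IGNORED_PARAMS = {"sessionid", "sort"}

def normalize(url):
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in IGNORED_PARAMS]
    kept.sort()  # parameter order doesn't change content either
    return urlunparse(parts._replace(query=urlencode(kept)))

urls = [
    "http://www.example.com/product.php?item=swedish-fish&category=gummy-candy",
    "http://www.example.com/product.php?category=gummy-candy&item=swedish-fish&sessionid=123",
    "http://www.example.com/product.php?item=swedish-fish&category=gummy-candy&sort=price",
]
print(len({normalize(u) for u in urls}), "unique page(s) among", len(urls), "URLs")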

Thursday, August 9, 2012

Website testing & Google search

Webmaster level: Advanced

We’ve gotten several questions recently about whether website testing—such as A/B or multivariate testing—affects a site’s performance in search results. We’re glad you’re asking, because we’re glad you’re testing! A/B and multivariate testing are great ways of making sure that what you’re offering really appeals to your users.

Before we dig into the implications for search, a brief primer:
Website testing is when you try out different versions of your website (or a part of your website), and collect data about how users react to each version. You use software to track which version causes users to do-what-you-want-them-to-do most often: which one results in the most purchases, or the most email signups, or whatever you’re testing for. After the test is finished you can update your website to use the “winner” of the test—the most effective content.

A/B testing is when you run a test by creating multiple versions of a page, each with its own URL. When users try to access the original URL, you redirect some of them to each of the variation URLs and then compare users’ behaviour to see which page is most effective.

Multivariate testing is when you use software to change different parts of your website on the fly. You can test changes to multiple parts of a page—say, the heading, a photo, and the ‘Add to Cart’ button—and the software will show variations of each of these sections to users in different combinations and then statistically analyze which variations are the most effective. Only one URL is involved; the variations are inserted dynamically on the page.

So how does this affect what Googlebot sees on your site? Will serving different content variants change how your site ranks? Below are some guidelines for running an effective test with minimal impact on your site’s search performance.
  • No cloaking.
    Cloaking—showing one set of content to humans, and a different set to Googlebot—is against our Webmaster Guidelines, whether you’re running a test or not. Make sure that you’re not deciding whether to serve the test, or which content variant to serve, based on user-agent. An example of this would be always serving the original content when you see the user-agent “Googlebot.” Remember that infringing our Guidelines can get your site demoted or removed from Google search results—probably not the desired outcome of your test.
  • Use rel=“canonical”.
    If you’re running an A/B test with multiple URLs, you can use the rel=“canonical” link attribute on all of your alternate URLs to indicate that the original URL is the preferred version. We recommend using rel=“canonical” rather than a noindex meta tag because it more closely matches your intent in this situation. Let’s say you were testing variations of your homepage; you don’t want search engines to not index your homepage, you just want them to understand that all the test URLs are close duplicates or variations on the original URL and should be grouped as such, with the original URL as the canonical. Using noindex rather than rel=“canonical” in such a situation can sometimes have unexpected effects (e.g., if for some reason we choose one of the variant URLs as the canonical, the “original” URL might also get dropped from the index since it would get treated as a duplicate).
  • Use 302s, not 301s.
    If you’re running an A/B test that redirects users from the original URL to a variation URL, use a 302 (temporary) redirect, not a 301 (permanent) redirect. This tells search engines that this redirect is temporary—it will only be in place as long as you’re running the experiment—and that they should keep the original URL in their index rather than replacing it with the target of the redirect (the test page). JavaScript-based redirects are also fine.
  • Only run the experiment as long as necessary.
    The amount of time required for a reliable test will vary depending on factors like your conversion rates, and how much traffic your website gets; a good testing tool should tell you when you’ve gathered enough data to draw a reliable conclusion. Once you’ve concluded the test, you should update your site with the desired content variation(s) and remove all elements of the test as soon as possible, such as alternate URLs or testing scripts and markup. If we discover a site running an experiment for an unnecessarily long time, we may interpret this as an attempt to deceive search engines and take action accordingly. This is especially true if you’re serving one content variant to a large percentage of your users.
The recommendations above should result in your tests having little or no impact on your site in search results. However, depending on what types of content you’re testing, it may not even matter much if Googlebot crawls or indexes some of your content variations while you’re testing. Small changes, such as the size, color, or placement of a button or image, or the text of your “call to action” (“Add to cart” vs. “Buy now!”), can have a surprising impact on users’ interactions with your webpage, but will often have little or no impact on that page’s search result snippet or ranking. In addition, if we crawl your site often enough to detect and index your experiment, we’ll probably index the eventual updates you make to your site fairly quickly after you’ve concluded the experiment.
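To make the rel=“canonical” and 302 recommendations above concrete, here is a minimal sketch of an A/B test server. It assumes the Flask framework, made-up URLs, and a simple 50/50 split; it is an illustration of the guidelines, not a recommendation of any particular testing tool.

# Minimal A/B test sketch (hypothetical setup): 302 redirect to the variant,
# and a rel="canonical" on the variant pointing back to the original URL.
import random
from flask import Flask, redirect

app = Flask(__name__)

@app.route("/signup")
def original():
    if random.random() < 0.5:                          # send roughly half of users to the variant
        return redirect("/signup-variant", code=302)   # temporary redirect, not a 301
    return "<html><body><h1>Sign up</h1></body></html>"

@app.route("/signup-variant")
def variant():
    return ('<html><head>'
            '<link rel="canonical" href="http://www.example.com/signup"/>'
            '</head><body><h1>Join us today</h1></body></html>')

if __name__ == "__main__":
    app.run()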

To learn more about website testing, check out these articles on Content Experiments, our free testing tool in Google Analytics. You can also ask questions about website testing in the Analytics Help Forum, or about search impact in the Webmaster Help Forum.

Wednesday, August 1, 2012

Domain verification using CNAME records

Webmaster Level: all

In order to use Google services like Webmaster Tools and Google Apps you must verify that you own the site or domain. One way you can do this is by creating a DNS TXT record to prove your ownership of the domain. Now you can also use DNS CNAME records to verify ownership of your domains. This is a new domain verification option for users that are not able to create DNS TXT records for their domains.

For example, if you own the domain example.com, you can verify your ownership of the domain by creating a DNS CNAME record as follows.

  1. Add the domain example.com to your account either in Webmaster Tools or directly on the Verification Home page.

  2. Select the Domain Name Provider method of verification, then select your domain name provider that manages your DNS records or "Other" if your provider is not on this list.

  3. Based on your selection you may either see the instructions to set a CNAME record or see a link to the option Add a CNAME record. Follow the instructions to add the specified CNAME record to your domain’s DNS configuration.

  4. Click the Verify button.

When you click Verify, Google will check for the CNAME record and if everything works you will be added as a verified owner of the domain. Using this method automatically verifies you as the owner of all websites on this domain. For example, when you verify your ownership of example.com, you are automatically verified as an owner of www.example.com as well as subdomains such as blog.example.com.

Sometimes DNS records take a while to make their way across the Internet. If we don't find the record immediately, we'll check for it periodically and when we find the record we'll make you a verified owner. To maintain your verification status don’t remove the record, even after verification succeeds.
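If you'd like to confirm that the record has propagated before clicking Verify, you can look it up yourself. Here's a sketch using the third-party dnspython library; the record name and target below are placeholders, since the real values are specific to your account and are shown in the verification instructions.

# Sketch: check that a verification CNAME record is visible in DNS.
# Requires the third-party dnspython package (pip install dnspython).
import dns.resolver

RECORD_NAME = "google12345.example.com"                    # placeholder; use the name from the instructions
EXPECTED_TARGET = "gv-abcdef123456.dv.googlehosted.com."   # placeholder target

try:
    answers = dns.resolver.resolve(RECORD_NAME, "CNAME")
    for rdata in answers:
        target = rdata.target.to_text()
        print("CNAME found:", target, "(matches)" if target == EXPECTED_TARGET else "(unexpected target)")
except (dns.resolver.NXDOMAIN, dns.resolver.NoAnswer):
    print("Record not found yet; DNS changes can take a while to propagate.")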

If you don’t have access to your DNS configuration at your domain name provider you can continue to use any of the other verification methods, such as the HTML file, the meta tag or Google Analytics tag in order to verify that you own a site.

If you have any questions please let us know via our Webmaster Help forum.

Tuesday, July 31, 2012

Introducing the Structured Data Dashboard

Webmaster level: All

Structured data is becoming an increasingly important part of the web ecosystem. Google makes use of structured data in a number of ways including rich snippets which allow websites to highlight specific types of content in search results. Websites participate by marking up their content using industry-standard formats and schemas.

To provide webmasters with greater visibility into the structured data that Google knows about for their website, we’re introducing today a new feature in Webmaster Tools - the Structured Data Dashboard. The Structured Data Dashboard has three views: site, item type and page-level.

Site-level view
At the top level, the Structured Data Dashboard, which is under Optimization, aggregates this data (by root item type and vocabulary schema).  Root item type means an item that is not an attribute of another on the same page.  For example, the site below has about 2 million Schema.Org annotations for Books (“http://schema.org/Book”)


Itemtype-level view
It also provides per-page details for each item type, as seen below:


Google parses and stores a fixed number of pages for each site and item type. They are stored in decreasing order by the time in which they were crawled. We also keep all their structured data markup. For certain item types we also provide specialized preview columns as seen in this example below (e.g. “Name” is specific to schema.org Product).


The default sort order facilitates inspection of the most recently added structured data.

Page-level view
Last but not least, we have a details page showing all attributes of every item type on the given page (as well as a link to the Rich Snippet testing tool for the page in question).


Webmasters can use the Structured Data Dashboard to verify that Google is picking up new markup, as well as to detect problems with existing markup, for example by monitoring potential changes in instance counts during site redesigns.

Friday, July 27, 2012

New notifications about inbound links

Webmaster level: Advanced

Lots of site owners use our webmaster console to see how their site is doing in Google. Last week we began sending new messages to sites with a pattern of unnatural links pointing to them, and I wanted to give more context about these new messages.

Original Link Messages 

First, let's talk about the original link messages that we've been sending out for months. When we see unnatural links pointing to a site, there are different ways we can respond. In many severe cases, we reduce our trust in the entire site. For example, that can happen when we believe a site has been engaging in a pretty widespread pattern of link spam over a long period of time. If your site is notified for these unnatural links, we recommend removing as many of the spammy or low-quality links as you possibly can and then submitting a reconsideration request for your site.

In a few situations, we have heard about directories or blog networks that won't take links down. If a website tries to charge you to put links up and to take links down, feel free to let us know about that, either in your reconsideration request or by mentioning it on our webmaster forum or in a separate spam report. We have taken action on several such sites, because they often turn out to be doing link spamming themselves.

New Link Messages 

In less severe cases, we sometimes target specific spammy or artificial links created as part of a link scheme and distrust only those links, rather than taking action on a site’s overall ranking. The new messages make it clear that we are taking "targeted action on the unnatural links instead of your site as a whole." The new messages also lack the yellow exclamation mark that other messages have, which tries to convey that we're addressing a situation that is not as severe as the previous "we are losing trust in your entire site" messages.

How serious are these new link messages? 

These new messages are worth your attention. Fundamentally, they mean we're distrusting some links to your site. We often take this action when we see a site that is mostly good but might have some spammy or artificial links pointing to it (widgetbait, paid links, blog spam, guestbook spam, excessive article directory submissions, excessive link exchanges, other types of linkspam, etc.). So while the site's overall rankings might not drop directly, the site might not be able to rank as well for some phrases. I wouldn't classify these messages as purely advisory, as something to be ignored, or as something that only goes to innocent sites.

On the other hand, I don't want site owners to panic. We do use this message some of the time for innocent sites where people are pointing hacked anchor text to their site to try to make them rank for queries like [buy viagra].

Example scenario: widget links 

A fair number of site owners emailed me after receiving one of the new messages, and I think it might be helpful if I paraphrased some of their situations to give you an idea of what it might mean if you get one of these messages.

The first example is widget links. An otherwise white-hat site emailed me about the message. Here's what I wrote back, with the identifying details removed:

"Looking into the very specific action that we took, I think we did the right thing. Take URL1 and URL2 for example. These pages are using your EXAMPLE1 widgets, but the pages include keyword-rich anchortext pointing to your site's url. One widget has the link ANCHORTEXT1 and the other has ANCHORTEXT2. 


If you do a search for [widgetbait matt cutts] you'll find tons of stories where I discourage people from putting keyword-rich anchortext into their widgets; see http://www.stonetemple.com/articles/interview-matt-cutts-061608.shtml for example. So this message is a way to tell you that not only are those links in your widget not working, they're probably keeping that page from ranking for the phrases that you're using." 

Example scenario: paid links 

The next example is paid links. I wrote this email to someone:

"I wouldn't recommend that Company X ignore this message. For example, check out SPAMMY_BLOG_POST_URL. That's a link from a very spammy website, and it calls into question the linkbuilding techniques that Company X has been using (we also saw a bunch of links due to widgets). These sorts of links are not helping Company X, and it would be worth their time to review how and why they started gathering links like this." 

I also wrote to another link building SEO who got this message pointing out that the SEO was getting links from a directory that appeared to offer only paid links that pass PageRank, and so we weren't trusting links like that.

Here's a final example of paid links. I emailed about one company's situation as follows:

"Company Y is getting this message because we see a long record of buying paid links that pass PageRank. In particular, we see a lot of low-quality 'sponsored posts' with keyword-rich anchortext where the links pass PageRank. The net effect is that we distrust a lot of links to this site. Here are a couple examples: URL1 and URL2. Bear in mind that we have more examples of these paid posts, but these two examples give a flavor of the sort of thing that should really be resolved. My recommendation would be to get these sort of paid posts taken down, and then Company Y could submit a reconsideration request. Otherwise, we'll continue to distrust quite a few links to the site." 

Example scenario: reputation management 

In some cases we're ignoring links to a site where the site itself didn't violate our guidelines. A good example of that is reputation management. We had two groups write in; one was a large news website, while the other was a not-for-profit publisher. Both had gotten the new link message. In one case, it appeared that a "reputation management" firm was using spammy links to try to push up positive articles on the news site, and we were ignoring those links to the news site. In the other case, someone was trying to manipulate the search results for a person's name by buying links on a well-known paid text link ad network. Likewise, we were just ignoring those specific links, and the not-for-profit publisher didn't need to take any action.

What should I do if I get the new link message? 

We recently launched the ability to download backlinks to your site sorted by date. If you get this new link message, you may want to check your most recent links to spot anything unusual going on. If you discover that someone in your company has been doing widgetbait, paid links, or serious linkspam, it's worth cleaning that up and submitting a reconsideration request. We're also looking at some ways to provide more concrete examples to make these messages more actionable and to help narrow down where to look when you get one.
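One practical way to dig through the downloaded links file is to group it by linking domain and see which domains account for the most links. Here's a rough sketch that assumes a CSV export with the linking URL in the first column; the exact layout of the download may differ, so adjust accordingly.

# Rough sketch: group a downloaded links file by linking domain to spot unusual patterns.
# Assumes a CSV with the linking URL in the first column; adjust if your export differs.
import csv
from collections import Counter
from urllib.parse import urlparse

def domains_by_count(path):
    counts = Counter()
    with open(path, newline="", encoding="utf-8") as f:
        reader = csv.reader(f)
        next(reader, None)  # skip the header row if present
        for row in reader:
            if row:
                counts[urlparse(row[0]).netloc] += 1
    return counts

if __name__ == "__main__":
    for domain, count in domains_by_count("latest_links.csv").most_common(20):
        print(count, domain)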

Just to give you some context, less than 20,000 domains received these new messages—that's less than one-tenth the number of messages we send in a typical month—and that's only because we sent out messages retroactively to any site where we had distrusted some of the sites' backlinks. Going forward, based on our current level of action, on average only about 10 sites a day will receive this message. 

Summing up 

I hope this post and some of the examples above will help to convey the nuances of this new message. If you get one of these new messages, it's not a cause for panic, but neither should you completely ignore it. The message says that the current incident isn't affecting our opinion of the entire website, but it is affecting our opinion of some links to the website, and the site might not rank as well for some phrases as a result.

This message reflects an issue of moderate severity, and we're trying to find the right way to alert people that their site may have a potential issue (and it's worth some investigation) without overly stressing out site owners either. But we wanted to take this extra step toward more transparency now so that we can let site owners know when they might want to take a closer look at their current links.

Tuesday, July 24, 2012

Behold Google index secrets, revealed!

Webmaster level: All

Since Googlebot was born, webmasters around the world have been asking one question: Google, oh, Google, are my pages in the index? Now is the time to answer that question using the new Index Status feature in Webmaster Tools. Whether one or one million, Index Status will show you how many pages from your site have been included in Google’s index.

Index Status is under the Health menu. After clicking on it you’ll see a graph like the following:





It shows how many pages are currently indexed. The legend shows the latest count and the graph shows up to one year of data.

If you see a steadily increasing number of indexed pages, congratulations! This should be enough to confirm that new content on your site is being discovered, crawled and indexed by Google.

However, some of you may find issues that require looking a little bit deeper. That’s why we added an Advanced tab to the feature. You can access it by clicking on the button at the top, and it will look like this:





The advanced section will show not only totals of indexed pages, but also the cumulative number of pages crawled, the number of pages that we know about which are not crawled because they are blocked by robots.txt, and also the number of pages that were not selected for inclusion in our results.

Notice that the counts are always totals. So, for example, if on June 17th the count for indexed pages is 92, that means that there are a total of 92 pages indexed at this point in time, not that 92 pages were added to the index on that day only. In particular for sites with a long history, the count of pages crawled may be very big in comparison with the number of pages indexed.

All this data can be used to identify and debug a variety of indexing-related problems. For example, if some of your content doesn’t appear any more on Google and you notice that the graph of pages indexed has a sudden drop, that may be an indication that you introduced a site-wide error when using meta=”noindex” and now Google isn’t including your content in search results.

Another example: if you change the URL structure of your site and don’t follow our recommendations for moving your site, you may see a jump in the count of “Not selected”. Fixing the redirects or rel=”canonical” tags should help get better indexing coverage.

We hope that Index Status will bring more transparency into Google’s index selection process and help you identify and fix indexing problems with your sites. And if you have questions, don’t hesitate to ask in our Help Forum.

Posted by the Webmaster Tools Team