Best practices to help Google find, crawl, and index your site
Following these guidelines will help Google find, index, and rank your site. Even if you choose not to implement any of these suggestions, we strongly encourage you to pay very close attention to the "Quality Guidelines," which outline some of the illicit practices that may lead to a site being removed entirely from the Google index or otherwise impacted by an algorithmic or manual spam action. If a site has been affected by a spam action, it may no longer show up in results on Google.com or on any of Google's partner sites.
• Design and content guidelines
• Technical guidelines
• Quality guidelines
When your site is ready:
• Submit it to Google at http://www.google.com/submityourcontent/.
• Submit a Sitemap using Google Webmaster Tools. Google uses your Sitemap to learn about the structure of your site and to increase our coverage of your web pages (a minimal Sitemap-generation sketch follows this list).
• Make sure all the sites that should know about your pages are aware your site is online.
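If you do not already have a Sitemap file, the short Python sketch below shows one way to produce a minimal one with the standard library. The page URLs and output file name are placeholders; a real site would list its own pages and then submit the resulting file through Webmaster Tools.

```python
# Minimal sketch: write a Sitemap XML file for a few pages.
# The URLs below are placeholders for your own pages.
import xml.etree.ElementTree as ET

PAGE_URLS = [
    "http://www.example.com/",
    "http://www.example.com/about.html",
    "http://www.example.com/products/index.html",
]

urlset = ET.Element(
    "urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
)
for page in PAGE_URLS:
    entry = ET.SubElement(urlset, "url")
    ET.SubElement(entry, "loc").text = page

# Writes sitemap.xml in the current directory; upload it to your site
# and submit its URL in Google Webmaster Tools.
ET.ElementTree(urlset).write(
    "sitemap.xml", encoding="utf-8", xml_declaration=True
)
```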
Design and content guidelines
• Make a site with a clear hierarchy and text links. Every page should be reachable from at least one static text link.
• Offer a site map to your users with links that point to the important parts of your site. If the site map has an extremely large number of links, you may want to break the site map into multiple pages.
• Keep the links on a given page to a reasonable number.
• Create a useful, information-rich site, and write pages that clearly and accurately describe your content.
• Try to use text instead of images to display important names, content, or links. The Google crawler doesn't recognize text contained in images. If you must use images for textual content, consider using the "ALT" attribute to include a few words of descriptive text (a quick ALT-text check is sketched after this list).
• Make sure that your <title> elements and ALT attributes are descriptive and accurate.
• Check for broken links and correct HTML (a simple link-check sketch also follows this list).
• If you decide to use dynamic pages (i.e., the URL contains a "?" character), be aware that not every search engine spider crawls dynamic pages as well as static pages. It helps to keep the parameters short and the number of them few.
• Review our recommended best practices for images, video and rich snippets.
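As a rough illustration of the ALT-text advice above, the sketch below scans a saved HTML page for img tags whose alt attribute is missing or empty, using only Python's standard library. The file name is a placeholder for one of your own pages.

```python
# Sketch: report <img> tags with no ALT text in a saved HTML page.
# "index.html" is a placeholder for one of your own pages.
from html.parser import HTMLParser

class MissingAltChecker(HTMLParser):
    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        attributes = dict(attrs)
        # Flag images whose alt attribute is absent or empty.
        if not attributes.get("alt"):
            print("Image without ALT text:", attributes.get("src", "(no src)"))

with open("index.html", encoding="utf-8") as page:
    MissingAltChecker().feed(page.read())
```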
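For the broken-link check, a minimal sketch along these lines can help; the URLs are placeholders, and a real check would walk the links extracted from your own pages.

```python
# Sketch: issue a HEAD request to each link and flag anything that
# does not come back with a success status. URLs are placeholders.
import urllib.error
import urllib.request

LINKS_TO_CHECK = [
    "http://www.example.com/",
    "http://www.example.com/old-page.html",
]

for link in LINKS_TO_CHECK:
    request = urllib.request.Request(link, method="HEAD")
    try:
        with urllib.request.urlopen(request, timeout=10) as response:
            print(link, "->", response.status)
    except urllib.error.HTTPError as err:   # 4xx/5xx answers
        print(link, "-> broken:", err.code)
    except urllib.error.URLError as err:    # DNS failure, timeout, etc.
        print(link, "-> unreachable:", err.reason)
```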
Technical guidelines
• Use a text browser such as Lynx to examine your site, because most search engine spiders see your site much as Lynx would. If fancy features such as JavaScript, cookies, session IDs, frames, DHTML, or Flash keep you from seeing your entire site in a text browser, then search engine spiders may have trouble crawling your site.
• Allow search bots to crawl your sites without session IDs or arguments that track their path through the site. These techniques are useful for tracking individual user behavior, but the access pattern of bots is entirely different. Using these techniques may result in incomplete indexing of your site, as bots may not be able to eliminate URLs that look different but actually point to the same page.
• Make sure your web server supports the If-Modified-Since HTTP header. This feature allows your web server to tell Google whether your content has changed since we last crawled your site. Supporting this feature saves you bandwidth and overhead (a quick way to test it is sketched at the end of this section).
• Make use of the robots.txt file on your web server. This file tells crawlers which directories can or cannot be crawled. Make sure it's current for your site so that you don't accidentally block the Googlebot crawler. Visit http://code.google.com/web/controlcrawlindex/docs/faq.html to learn how to instruct robots when they visit your site. You can test your robots.txt file to make sure you're using it correctly with the robots.txt analysis tool available in Google Webmaster Tools (a standard-library check is also sketched at the end of this section).
• Make reasonable efforts to ensure that advertisements do not affect search engine rankings. For example, Google's AdSense ads and DoubleClick links are blocked from being crawled by a robots.txt file.
• If your company buys a content management system, make sure that the system creates pages and links that search engines can crawl.
• Use robots.txt to prevent crawling of search results pages or other auto-generated pages that don't add much value for users coming from search engines.
• Test your site to make sure that it appears correctly in different browsers.
• Monitor your site's performance and optimize load times. Google's goal is to provide users with the most relevant results and a great user experience. Fast sites increase user satisfaction and improve the overall quality of the web (especially for those users with slow Internet connections), and we hope that as webmasters improve their sites, the overall speed of the web will improve.
Google strongly recommends that all webmasters regularly monitor site performance using Page Speed, YSlow, WebPagetest, or other tools. For more information, tools, and resources, see Let's Make The Web Faster. In addition, the Site Performance tool in Webmaster Tools shows the speed of your website as experienced by users around the world. A rough fetch-timing sketch appears below.
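To check whether a server honours If-Modified-Since, a quick conditional GET is enough: an unchanged page should come back as 304 Not Modified with no body, which is exactly what saves bandwidth for crawlers. The host and date below are placeholders.

```python
# Sketch: send a conditional GET and look at the status code.
# A 304 reply means the server supports If-Modified-Since.
import http.client

conn = http.client.HTTPConnection("www.example.com", timeout=10)
conn.request(
    "GET", "/",
    headers={"If-Modified-Since": "Sat, 01 Jan 2011 00:00:00 GMT"},
)
response = conn.getresponse()
print(response.status, response.reason)   # e.g. "304 Not Modified"
conn.close()
```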
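To confirm that your robots.txt says what you think it says, the standard library's robot parser can be pointed at the live file; the robots.txt analysis tool in Webmaster Tools remains the authoritative check. The host and paths below are placeholders.

```python
# Sketch: ask urllib.robotparser whether Googlebot may fetch a few URLs
# according to your live robots.txt. Host and paths are placeholders.
import urllib.robotparser

parser = urllib.robotparser.RobotFileParser()
parser.set_url("http://www.example.com/robots.txt")
parser.read()   # fetch and parse the live robots.txt

for path in ("/", "/search?q=widgets", "/private/report.html"):
    allowed = parser.can_fetch("Googlebot", "http://www.example.com" + path)
    print(path, "->", "crawlable" if allowed else "blocked")
```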
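Dedicated tools such as Page Speed or WebPagetest give a much fuller picture, but even a rough timing of a single fetch, as sketched below, can flag an obviously slow page. The URL is a placeholder, and the measurement covers only server and network time, not rendering.

```python
# Rough sketch: time how long one page takes to download in full.
import time
import urllib.request

start = time.monotonic()
with urllib.request.urlopen("http://www.example.com/", timeout=30) as response:
    body = response.read()
elapsed = time.monotonic() - start

print(f"Fetched {len(body)} bytes in {elapsed:.2f} seconds")
```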