Updating our technical Webmaster Guidelines
Friday, November 21, 2014
Posted by Pierre Far, Webmaster Trends Analyst
We recently announced that our indexing system has been rendering web pages more like a typical modern browser, with CSS and JavaScript turned on. In light of this change, we're updating one of the requirements in our technical Webmaster Guidelines.
For optimal rendering and indexing, our new guideline specifies that you should allow Googlebot access to the JavaScript, CSS, and image files that your pages use. Disallowing crawling of JavaScript or CSS files in your site's robots.txt directly harms how well our algorithms render and index your content, and can result in suboptimal rankings.
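As an illustration, here is a minimal robots.txt sketch; the directory paths are hypothetical placeholders for your own site's layout, not paths the guideline prescribes. The point is simply that the resources your pages render with must stay crawlable:

```
# Hypothetical example: keep rendering resources crawlable for Googlebot.
User-agent: Googlebot
Allow: /assets/js/
Allow: /assets/css/
Allow: /images/

# Avoid blanket rules like the following, which would block the very
# files Googlebot needs in order to render your pages:
# Disallow: /assets/
```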
Updated advice for optimal indexing
Historically, Google's indexing systems resembled old text-only browsers, such as Lynx, and that's what our Webmaster Guidelines said. Now that indexing is based on page rendering, it's no longer accurate to think of our indexing systems as a text-only browser; a more accurate approximation is a modern web browser. With that new perspective, keep the following in mind:
- Just like modern browsers, our rendering engine might not support all of the technologies a page uses. Make sure your web design adheres to the principles of progressive enhancement, as this helps our systems (and a wider range of browsers) see usable content and basic functionality when certain web design features are not yet supported (see the sketch after this list).
- Pages that render quickly not only help users get to your content more easily, but also make indexing of those pages more efficient. We recommend following the best practices for page performance optimization, specifically:
  - Eliminate unnecessary downloads.
  - Optimize the serving of your CSS and JavaScript files by concatenating (merging) separate files, minifying the concatenated files, and configuring your web server to serve them compressed (usually gzip compression); a server configuration sketch also follows this list.
- Make sure your server can handle the additional load of serving JavaScript and CSS files to Googlebot.
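To make the progressive-enhancement point concrete, here is a minimal hypothetical sketch; the markup, element IDs, and /archive endpoint are illustrative, not part of the guideline. The core content is plain HTML that any client (or crawler) can read, and JavaScript only layers an enhancement on top:

```html
<!-- Hypothetical example: core content works without JavaScript. -->
<article id="news">
  <h1>Latest updates</h1>
  <p>The full article text is served in the HTML itself.</p>
  <a href="/archive" id="archive-link">Browse the archive</a>
</article>

<script>
  // Enhancement layer: if scripting is available, upgrade the plain link
  // to load the archive in place instead of navigating away. Clients that
  // don't run this script still get working content via the href above.
  document.getElementById('archive-link').addEventListener('click', function (e) {
    e.preventDefault();
    fetch('/archive')                       // assumes a same-origin endpoint
      .then(function (r) { return r.text(); })
      .then(function (html) {
        document.getElementById('news').innerHTML = html;
      });
  });
</script>
```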
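And for the serving-optimization sub-point, a sketch assuming an nginx server (Apache's mod_deflate can achieve the same effect); the paths and bundle names are illustrative:

```nginx
# Hypothetical nginx snippet: serve CSS and JavaScript compressed.
http {
    gzip on;                                     # enable gzip compression
    gzip_types text/css application/javascript;  # compress stylesheets and scripts

    server {
        # Serve the concatenated, minified bundles built at deploy time,
        # e.g. site.min.css and site.min.js (names are illustrative).
        location /assets/ {
            expires 7d;   # let browsers and crawlers cache the bundles
        }
    }
}
```

The design intent is the one the list describes: fewer, smaller, cacheable responses reduce both user-perceived load time and the crawl cost of rendering your pages.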
Testing and troubleshooting
In conjunction with the launch of our rendering-based indexing, we also updated the Fetch and Render as Google feature in Webmaster Tools so that webmasters can see how our systems render their pages. With it, you'll be able to identify a number of indexing issues: improper robots.txt restrictions, redirects that Googlebot cannot follow, and more.

And, as always, if you have any comments or questions, please ask in our Webmaster Help forum.