Updating our technical Webmaster Guidelines
Friday, November 21, 2014
Posted by Pierre Far, Webmaster Trends Analyst
We recently announced that our indexing system now renders web pages much like a typical modern browser, with CSS and JavaScript turned on. Today, in light of that announcement, we're updating one of our technical Webmaster Guidelines.
For optimal rendering and indexing, the new guideline specifies that you should allow Googlebot access to the JavaScript, CSS, and image files that your pages use; doing so gives your site the best possible rendering and indexing. Disallowing crawling of JavaScript or CSS files in your site's robots.txt directly harms how well our algorithms render and index your content, and can result in suboptimal rankings.
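As a minimal illustration of what the guideline asks for, a robots.txt along the following lines keeps the asset files a page depends on crawlable. The /assets/ and /private/ paths are hypothetical placeholders, not from the original post; adjust them to your own site layout.

```
# Hypothetical example: /assets/ and /private/ are placeholder paths.
User-agent: Googlebot
# Keep the CSS, JavaScript, and image files that pages use crawlable.
Allow: /assets/css/
Allow: /assets/js/
Allow: /assets/images/
# A rule such as "Disallow: /assets/js/" would block rendering resources
# and should be avoided.
Disallow: /private/
```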
Updated advice for optimal indexing
Historically, Google's indexing systems worked much like old text-only browsers such as Lynx, and that is how our Webmaster Guidelines described them. Now that indexing is based on rendered pages, it is no longer accurate to think of our indexing systems as a text-only browser; a more accurate approximation is a modern web browser. With that new perspective, keep the following in mind:
- Just like modern browsers, our rendering engine might not support every technology a page uses. Make sure your web design follows the principles of progressive enhancement, so that our systems (and a wide range of browsers) can still reach usable content and basic functionality when certain web design features are not yet supported. (A minimal sketch of this pattern follows the list.)
- Pages that render quickly not only get your content to users faster, they also make indexing of those pages more efficient. We recommend following the best practices for page performance optimization, in particular:
  - Eliminate unnecessary downloads.
  - Optimize the serving of your CSS and JavaScript files by concatenating (merging) separate files, minifying the combined files, and configuring your web server to serve them compressed (usually with gzip). A sample server configuration also follows the list.
- Make sure your server can handle the additional load of serving JavaScript and CSS files to Googlebot.
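To make the progressive enhancement point concrete, here is a minimal sketch; the markup, IDs, and URLs are hypothetical and not from the original post. The baseline HTML carries real content and a working pagination link, and JavaScript only upgrades the experience when it is available, so crawlers and older browsers still get usable content.

```
<!-- Hypothetical example: the content and the link below work without JavaScript. -->
<div id="product-list">
  <ul>
    <li>Product A</li>
    <li>Product B</li>
  </ul>
  <a id="more-link" href="/products?page=2">More products (page 2)</a>
</div>
<script>
  // Enhancement layer: if the browser supports fetch, load the next page
  // in place instead of navigating. Otherwise the plain link still works.
  var link = document.getElementById('more-link');
  if (link && window.fetch) {
    link.addEventListener('click', function (event) {
      event.preventDefault();
      // Assumes the server can return an HTML fragment for this URL.
      fetch(link.href + '&fragment=1')
        .then(function (response) { return response.text(); })
        .then(function (html) {
          document.getElementById('product-list').insertAdjacentHTML('beforeend', html);
        });
    });
  }
</script>
```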
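As one way to act on the serving advice above (compressed, cacheable CSS and JavaScript), here is a small sketch for nginx. nginx is just one common web server, and the /assets/ path, file types, and cache lifetime are illustrative assumptions rather than part of the guideline.

```
# Hypothetical nginx snippet: compress text assets and cache static bundles.
gzip on;
gzip_types text/css application/javascript image/svg+xml;
gzip_min_length 1024;   # skip compressing very small responses

location /assets/ {
    # A long cache lifetime is safe for versioned, concatenated, minified bundles.
    expires 30d;
    add_header Cache-Control "public";
}
```

Equivalent settings exist for other servers, for example mod_deflate and mod_expires on Apache.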
Testing and troubleshooting
In conjunction with the launch of rendering-based indexing, we also updated the Fetch and Render as Google feature in Webmaster Tools, so webmasters can see how our systems render their pages. With it, you can identify a number of indexing issues, such as improper robots.txt restrictions and redirects that Googlebot cannot follow.