Reduce the Google crawl rate
Google's crawler infrastructure has sophisticated algorithms to determine the optimal crawl rate
for a site. Our goal is to crawl as many pages from your site as we can on each visit without
overwhelming your server. In some cases, Google's crawling of your site might be placing a
critical load on your infrastructure, or causing unwanted costs during an outage. To alleviate
this, you may decide to reduce the number of requests made by Google's crawlers.
Understand the cause of a sharp increase in crawling
A sharp increase in crawling may be caused by inefficiencies in your site's structure or by
other issues with your site. Based on the reports we've received in the past, the most common
causes are:

- Inefficient configuration of URLs on the site, which is typically caused by specific site functionality:
  - Faceted navigation or other sorting and filtering functionality of the site (see the sketch after this list)
  - A calendar with a large number of URLs for specific dates
- A Dynamic Search Ad target
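To see why faceted navigation can inflate the crawlable URL space so quickly, here is a minimal
sketch; the facet names and values are hypothetical. Every combination of filter parameters
produces a distinct URL, so the number of crawlable URLs grows multiplicatively with each facet.

```python
from itertools import product

# Hypothetical filter facets on a single product listing page.
facets = {
    "color": ["red", "blue", "green"],
    "size": ["s", "m", "l", "xl"],
    "sort": ["price", "rating", "newest"],
    "page": [str(n) for n in range(1, 11)],
}

# Each combination of parameter values is a distinct crawlable URL,
# so the URL space multiplies: 3 * 4 * 3 * 10 = 360 URLs for one page.
urls = [
    "/products?" + "&".join(f"{k}={v}" for k, v in zip(facets, combo))
    for combo in product(*facets.values())
]
print(len(urls))  # 360
```

A handful of facets on a few thousand listing pages can easily produce millions of crawlable
URLs, which is why this pattern dominates reports of sharp crawling increases.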
We strongly recommend that you check with your hosting company and look at your server's recent
access logs to understand the source of the traffic, and to see whether it fits the
aforementioned common causes of a sharp increase in crawling. Then, check our guides about
managing crawling of faceted navigation URLs
and
optimizing crawling efficiency.
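As a starting point for that log review, here is a minimal sketch, assuming a combined-format
access log named access.log (both the file name and the regular expression are assumptions to
adapt to your server's log layout). It counts how many distinct URLs Googlebot requested under
each base path; a very large count under a single path is a typical signature of faceted
navigation or filter parameters. Note that matching the User-Agent string alone does not verify
that a request really came from Googlebot.

```python
import re
from collections import defaultdict

# Matches the request line and the trailing quoted user-agent field of a
# combined-format access log entry.
LINE = re.compile(
    r'"(?:GET|POST|HEAD) (?P<url>\S+) HTTP/[^"]*"'
    r'.*"(?P<agent>[^"]*)"\s*$'
)

# base path -> set of distinct URLs (including query strings) Googlebot requested
distinct = defaultdict(set)
with open("access.log") as log:
    for line in log:
        m = LINE.search(line)
        if m and "Googlebot" in m.group("agent"):
            url = m.group("url")
            distinct[url.split("?")[0]].add(url)

# Paths with the most distinct crawled URLs are the likeliest culprits.
for path, urls in sorted(distinct.items(), key=lambda kv: -len(kv[1]))[:20]:
    print(f"{len(urls):8d} distinct URLs  {path}")
```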
Urgently reduce crawler traffic (for emergencies)
Warning: When considering reducing Google's crawl rate, keep in mind that this has broad
effects. For Search, Googlebot will discover fewer new pages, existing pages will be refreshed
less frequently (for example, prices and product availability may take longer to be reflected in
Search), and removed pages may stay in the index longer. For Google Ads, your campaigns may be
cancelled or paused, and your ads may not serve.

If you need to urgently reduce the crawl rate for a short period of time (for example, a couple
of hours, or 1-2 days), return a `500`, `503`, or `429` HTTP response status code instead of
`200` to the crawl requests. Google's crawling infrastructure reduces your site's crawl rate
when it encounters a significant number of URLs with `500`, `503`, or `429` HTTP response status
codes (for example, if you
disabled your website).
The reduced crawl rate affects the whole hostname of your site (for example,
subdomain.example.com), covering both the crawling of the URLs that return errors and the URLs
that return content. Once the number of these errors is reduced, the crawl rate automatically
starts increasing again.

Warning: We don't recommend doing this for a long period of time (meaning, longer than 1-2
days), as it may have a negative effect on how your site appears in Google products. For
example, in the case of Search, if Googlebot observes these status codes on the same URL for
multiple days, the URL may be dropped from Google's index.
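To make the error-serving approach concrete, here is a minimal sketch of an emergency throttle
using Python's standard library. The port and the plain User-Agent substring check are
simplifications for illustration, not Google's recommended way to verify Googlebot: crawler
requests receive a `503` with a `Retry-After` hint, while regular visitors still get content.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class ThrottleHandler(BaseHTTPRequestHandler):
    """Answers Google crawler requests with 503 so the crawl rate drops,
    while serving normal content to everyone else."""

    def do_GET(self):
        user_agent = self.headers.get("User-Agent", "")
        if "Googlebot" in user_agent:
            # A 503 (or 500/429) signals Google's crawl infrastructure to back off.
            self.send_response(503)
            # Retry-After hints how many seconds the crawler should wait.
            self.send_header("Retry-After", "3600")
            self.end_headers()
            return
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(b"normal content\n")

if __name__ == "__main__":
    HTTPServer(("", 8080), ThrottleHandler).serve_forever()
```

In practice this logic usually belongs in your web server or CDN configuration rather than in
application code, but the status codes and the Retry-After header work the same way there.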
Exceptional requests to reduce the crawl rate
If serving errors to Google's crawlers is not feasible on your infrastructure,
file a special request
to report a problem with an unusually high crawl rate, mentioning the optimal rate for your site
in your request. You cannot request an increase in the crawl rate, and it may take several days
for the request to be evaluated and fulfilled.
[null,null,["Terakhir diperbarui pada 2025-08-28 UTC."],[[["\u003cp\u003eGoogle automatically adjusts crawl rate to avoid overloading your server, but you can reduce it further if needed.\u003c/p\u003e\n"],["\u003cp\u003eTemporarily reducing crawl rate can be achieved by returning 500, 503, or 429 HTTP response codes, but this impacts content freshness and discovery.\u003c/p\u003e\n"],["\u003cp\u003eFor longer-term crawl rate reduction, file a special request with Google; however, increasing the rate isn't possible.\u003c/p\u003e\n"],["\u003cp\u003eBefore reducing crawl rate, consider optimizing your website structure for better crawling efficiency as this might resolve the issue.\u003c/p\u003e\n"],["\u003cp\u003eExtended use of error codes to control crawling may lead to URLs being dropped from Google's index, so it's crucial to use this method cautiously.\u003c/p\u003e\n"]]],["Google's crawlers may need to be slowed if they overload a site. Common causes for increased crawling include inefficient site structure, like faceted navigation. For urgent reductions, return `500`, `503`, or `429` HTTP status codes to crawler requests; this will lower the crawl rate, but can negatively affect site indexing if done for too long. Alternatively, if returning errors isn't viable, submit a special request specifying an optimal crawl rate. Note: reducing the crawl rate will result in slower updates of existing pages.\n"],null,["# Reduce Google Crawl Rate | Google Search Central\n\nReduce the Google crawl rate\n============================\n\n\nGoogle's crawler infrastructure has sophisticated algorithms to determine the optimal crawl rate\nfor a site. Our goal is to crawl as many pages from your site as we can on each visit without\noverwhelming your server. In some cases, Google's crawling of your site might be causing a\ncritical load on your infrastructure, or cause unwanted costs during an outage. To alleviate this,\nyou may decide to reduce the number of requests made by Google's crawlers.\n\nUnderstand the cause of the sharp increase in crawling\n------------------------------------------------------\n\n\nSharp increase in crawling may be caused by inefficiencies in your site's structure or issues with\nyour site otherwise. Based on the reports we've received in the past, the most common causes are:\n\n- Inefficient configuration of URLs on the site, which is typically casued by a specific functionality of the site:\n - Faceted navigation or other sorting and filtering functionality of the site\n - A calendar with a lot of URLs for specific dates\n- [A Dynamic Search Ad target](/search/docs/crawling-indexing/large-site-managing-crawl-budget#adsbot)\n\n\nWe strongly recommend that you check with your hosting company and look at recent access logs of\nyour server to understand the source of the traffic, and see if it fits in the aformentioned\ncommon causes of the sharp increase in crawling. Then, check our guides about\n[managing crawling of faceted navigation URLs](/search/docs/crawling-indexing/crawling-managing-faceted-navigation)\nand\n[optimizing crawling efficiency](/search/docs/crawling-indexing/large-site-managing-crawl-budget#improve_crawl_efficiency).\n\nUrgently reduce crawler traffic (for emergencies)\n-------------------------------------------------\n\n| **Warning**: When considering reducing the Google's crawl rate, keep in mind that this will have broad effects. 
For Search, Googlebot will discover fewer new pages, and existing pages will be refreshed less frequently (for example, prices and product availability may take longer to be reflected in Search), and removed pages may stay in the index longer. For Google Ads, your campaigns may be cancelled or paused, and your ads may not serve.\n\n\nIf you need to urgently reduce the crawl rate for short period of time (for example, a couple\nof hours, or 1-2 days), then return `500`, `503`, or `429` HTTP\nresponse status code instead of `200` to the crawl requests. Google's crawling\ninfrastructure reduces your site's crawling rate when it encounters a significant number of URLs\nwith `500`, `503`, or `429` HTTP response status codes (for\nexample, if you\n[disabled your website](/search/docs/crawling-indexing/pause-online-business)).\nThe reduced crawl rate affects the whole hostname of your site (for example,\n`subdomain.example.com`), both the crawling of the URLs that return errors, as well as\nthe URLs that return content. Once the number of these errors is reduced, the crawl rate will\nautomatically start increasing again.\n| **Warning**: We don't recommend that you do this for a long period of time (meaning, longer than 1-2 days) as it may have a negative effect on how your site appears in Google products. For example, in case of Search, if Googlebot observes these status codes on the same URL for multiple days, the URL may be dropped from Google's index.\n\nExceptional requests to reduce crawl rate\n-----------------------------------------\n\n\nIf serving errors to Google's crawlers is not feasible on your infrastructure,\n[file a special request](https://search.google.com/search-console/googlebot-report)\nto report a problem with unusually high crawl rate, mentioning the optimal rate for your site in\nyour request. You cannot request an increase in crawl rate, and it may take several days for the\nrequest to be evaluated and fulfilled."]]