Reduce the Google crawl rate

Google's crawler infrastructure uses sophisticated algorithms to determine the optimal crawl rate for a site. Google's goal is to crawl as many pages from your site as possible on each visit without overwhelming your server. In some cases, Google's crawling of your site might put a critical load on your infrastructure, or cause unwanted costs during an outage. To alleviate this, you may decide to reduce the number of requests made by Google's crawlers.
Understand the cause of the sharp increase in crawling
A sharp increase in crawling may be caused by inefficiencies in your site's structure or by other issues with your site. Based on the reports we've received in the past, the most common causes are:
- Inefficient configuration of URLs on the site, which is typically caused by specific functionality of the site (see the sketch after this list):
  - Faceted navigation or other sorting and filtering functionality of the site
  - A calendar with a lot of URLs for specific dates
- A Dynamic Search Ads target
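To see why faceted navigation in particular can inflate the crawlable URL space, here is a small illustrative sketch. The facet names and value counts are entirely made up for the example; the point is only that every combination of filter parameters is a distinct crawlable URL.

```python
# Hypothetical illustration: how faceted navigation multiplies crawlable URLs.
from itertools import product

facets = {
    "color": ["red", "blue", "green", "black"],
    "size": ["s", "m", "l", "xl"],
    "sort": ["price_asc", "price_desc", "newest"],
    "page": [str(n) for n in range(1, 21)],  # 20 pagination pages
}

# Each combination of facet values maps to its own URL, so a single
# category page fans out into hundreds of crawlable variants.
combinations = list(product(*facets.values()))
print(len(combinations))  # 4 * 4 * 3 * 20 = 960 URLs for one category
```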
We strongly recommend that you check with your hosting company and look at your server's recent access logs to understand the source of the traffic, and to see whether it matches the aforementioned common causes of a sharp increase in crawling. Then, check our guides about managing crawling of faceted navigation URLs and optimizing crawl efficiency.
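As an illustration of that log review, here is a minimal Python sketch that counts Googlebot requests per URL path in a combined-format access log, which helps surface URL patterns (such as faceted navigation) that attract most of the crawling. The log path and the regular expression are assumptions; adapt both to your server's actual log location and format.

```python
# A minimal sketch, assuming a combined-format access log at the
# hypothetical path below.
import re
from collections import Counter

LOG_PATH = "/var/log/nginx/access.log"  # hypothetical path

# Combined log format:
# IP identd user [time] "METHOD path HTTP/x" status size "referer" "user-agent"
LINE_RE = re.compile(
    r'\S+ \S+ \S+ \[[^\]]+\] '
    r'"(?:GET|POST|HEAD) (?P<path>\S+)[^"]*" '
    r'\d{3} \S+ "[^"]*" "(?P<ua>[^"]*)"'
)

hits = Counter()
with open(LOG_PATH) as log:
    for line in log:
        m = LINE_RE.match(line)
        if m and "Googlebot" in m.group("ua"):
            # Group by path without the query string to expose which URL
            # patterns receive the bulk of the crawl traffic.
            hits[m.group("path").split("?")[0]] += 1

for path, count in hits.most_common(10):
    print(f"{count:8d}  {path}")
```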
Urgently reduce crawler traffic (for emergencies)
Warning: When considering reducing Google's crawl rate, keep in mind that this will have broad effects. For Search, Googlebot will discover fewer new pages, existing pages will be refreshed less frequently (for example, prices and product availability may take longer to be reflected in Search), and removed pages may stay in the index longer. For Google Ads, your campaigns may be cancelled or paused, and your ads may not serve.

If you need to urgently reduce the crawl rate for a short period of time (for example, a couple of hours, or 1-2 days), return a 500, 503, or 429 HTTP response status code instead of 200 to the crawl requests. Google's crawling infrastructure reduces your site's crawl rate when it encounters a significant number of URLs with 500, 503, or 429 HTTP response status codes (for example, if you disabled your website).
The reduced crawl rate affects the whole hostname of your site (for example, subdomain.example.com), both the crawling of the URLs that return errors and the URLs that return content. Once the number of these errors is reduced, the crawl rate automatically starts increasing again.

Warning: We don't recommend doing this for a long period of time (meaning, longer than 1-2 days), as it may have a negative effect on how your site appears in Google products. For example, in the case of Search, if Googlebot observes these status codes on the same URL for multiple days, the URL may be dropped from Google's index.
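If you operate the application server yourself, the emergency measure above can be implemented as a small request filter. Below is a minimal sketch using Flask, which is an assumption; any framework or reverse proxy works the same way. Matching on the User-Agent string alone is a simplification for brevity; in practice you would also verify that the request really comes from Googlebot (for example, via a reverse DNS lookup), and the Retry-After value is an arbitrary example.

```python
# A minimal Flask sketch: answer Googlebot requests with 503 instead of 200
# during a crawl emergency, while serving regular visitors normally.
from flask import Flask, request

app = Flask(__name__)
CRAWL_EMERGENCY = True  # hypothetical flag you would toggle during the incident

@app.before_request
def throttle_google_crawlers():
    ua = request.headers.get("User-Agent", "")
    if CRAWL_EMERGENCY and "Googlebot" in ua:
        # 503 signals a temporary condition; Retry-After hints when the
        # crawler may try again (value in seconds, chosen arbitrarily here).
        return "Service temporarily unavailable", 503, {"Retry-After": "3600"}

@app.route("/")
def index():
    return "Normal content for regular visitors"
```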
Exceptional requests to reduce the crawl rate
If serving errors to Google's crawlers is not feasible on your infrastructure, file a special request (https://search.google.com/search-console/googlebot-report) to report a problem with an unusually high crawl rate, mentioning the optimal rate for your site in your request. You cannot request an increase in crawl rate, and it may take several days for the request to be evaluated and fulfilled.
[null,null,["최종 업데이트: 2025-09-01(UTC)"],[[["\u003cp\u003eGoogle automatically adjusts crawl rate to avoid overloading your server, but you can reduce it further if needed.\u003c/p\u003e\n"],["\u003cp\u003eTemporarily reducing crawl rate can be achieved by returning 500, 503, or 429 HTTP response codes, but this impacts content freshness and discovery.\u003c/p\u003e\n"],["\u003cp\u003eFor longer-term crawl rate reduction, file a special request with Google; however, increasing the rate isn't possible.\u003c/p\u003e\n"],["\u003cp\u003eBefore reducing crawl rate, consider optimizing your website structure for better crawling efficiency as this might resolve the issue.\u003c/p\u003e\n"],["\u003cp\u003eExtended use of error codes to control crawling may lead to URLs being dropped from Google's index, so it's crucial to use this method cautiously.\u003c/p\u003e\n"]]],["Google's crawlers may need to be slowed if they overload a site. Common causes for increased crawling include inefficient site structure, like faceted navigation. For urgent reductions, return `500`, `503`, or `429` HTTP status codes to crawler requests; this will lower the crawl rate, but can negatively affect site indexing if done for too long. Alternatively, if returning errors isn't viable, submit a special request specifying an optimal crawl rate. Note: reducing the crawl rate will result in slower updates of existing pages.\n"],null,["# Reduce Google Crawl Rate | Google Search Central\n\nReduce the Google crawl rate\n============================\n\n\nGoogle's crawler infrastructure has sophisticated algorithms to determine the optimal crawl rate\nfor a site. Our goal is to crawl as many pages from your site as we can on each visit without\noverwhelming your server. In some cases, Google's crawling of your site might be causing a\ncritical load on your infrastructure, or cause unwanted costs during an outage. To alleviate this,\nyou may decide to reduce the number of requests made by Google's crawlers.\n\nUnderstand the cause of the sharp increase in crawling\n------------------------------------------------------\n\n\nSharp increase in crawling may be caused by inefficiencies in your site's structure or issues with\nyour site otherwise. Based on the reports we've received in the past, the most common causes are:\n\n- Inefficient configuration of URLs on the site, which is typically casued by a specific functionality of the site:\n - Faceted navigation or other sorting and filtering functionality of the site\n - A calendar with a lot of URLs for specific dates\n- [A Dynamic Search Ad target](/search/docs/crawling-indexing/large-site-managing-crawl-budget#adsbot)\n\n\nWe strongly recommend that you check with your hosting company and look at recent access logs of\nyour server to understand the source of the traffic, and see if it fits in the aformentioned\ncommon causes of the sharp increase in crawling. Then, check our guides about\n[managing crawling of faceted navigation URLs](/search/docs/crawling-indexing/crawling-managing-faceted-navigation)\nand\n[optimizing crawling efficiency](/search/docs/crawling-indexing/large-site-managing-crawl-budget#improve_crawl_efficiency).\n\nUrgently reduce crawler traffic (for emergencies)\n-------------------------------------------------\n\n| **Warning**: When considering reducing the Google's crawl rate, keep in mind that this will have broad effects. 
For Search, Googlebot will discover fewer new pages, and existing pages will be refreshed less frequently (for example, prices and product availability may take longer to be reflected in Search), and removed pages may stay in the index longer. For Google Ads, your campaigns may be cancelled or paused, and your ads may not serve.\n\n\nIf you need to urgently reduce the crawl rate for short period of time (for example, a couple\nof hours, or 1-2 days), then return `500`, `503`, or `429` HTTP\nresponse status code instead of `200` to the crawl requests. Google's crawling\ninfrastructure reduces your site's crawling rate when it encounters a significant number of URLs\nwith `500`, `503`, or `429` HTTP response status codes (for\nexample, if you\n[disabled your website](/search/docs/crawling-indexing/pause-online-business)).\nThe reduced crawl rate affects the whole hostname of your site (for example,\n`subdomain.example.com`), both the crawling of the URLs that return errors, as well as\nthe URLs that return content. Once the number of these errors is reduced, the crawl rate will\nautomatically start increasing again.\n| **Warning**: We don't recommend that you do this for a long period of time (meaning, longer than 1-2 days) as it may have a negative effect on how your site appears in Google products. For example, in case of Search, if Googlebot observes these status codes on the same URL for multiple days, the URL may be dropped from Google's index.\n\nExceptional requests to reduce crawl rate\n-----------------------------------------\n\n\nIf serving errors to Google's crawlers is not feasible on your infrastructure,\n[file a special request](https://search.google.com/search-console/googlebot-report)\nto report a problem with unusually high crawl rate, mentioning the optimal rate for your site in\nyour request. You cannot request an increase in crawl rate, and it may take several days for the\nrequest to be evaluated and fulfilled."]]