Step 1: Locate the uncrawlable URL
Uncrawlable pages can affect your Dynamic Search Ads in three ways:
- If you select URL-Equals as your targeting type and your URL is uncrawlable, you'll find the status “Disapproved: Page cannot be crawled” on your dynamic ad target.
- If you use page feeds and a URL in your feed is uncrawlable, you'll find the status “Disapproved: Page cannot be crawled” on the row for that URL in the shared library.
- If you use any other targeting mechanism and a URL can't be crawled, Google Ads won't know about it, so no error message will be displayed. If a URL you're attempting to target isn't getting traffic, you can troubleshoot by first targeting it through URL-Equals or page feeds. The URL will either get traffic or you'll get an error explaining why it isn't.
Step 2: Make sure that Google AdsBot can crawl the landing page
- Copy the URL and paste it in a web browser's address bar.
- Add /robots.txt to the end of the domain name, then press Enter. For example, if your landing page is http://example.com/folder1/folder2/, check which robots are allowed to crawl the domain by visiting the URL http://example.com/robots.txt.
- Check for the line User-agent: AdsBot-Google.
  - If it's not there, work with the web developer to add it.
  - If the page is blank, work with the web developer to add a robots.txt file to the domain.
  - If User-agent: AdsBot-Google is there, make sure it isn't followed by a line that says Disallow: /, and that AdsBot-Google isn't restricted by another line in the file. Learn more about robots.txt files and Google crawlers.
- If the landing page is on a subdomain, check whether the subdomain has a separate robots.txt file, for example https://subdomain.domain.com/robots.txt. Make sure this robots.txt also allows User-agent: AdsBot-Google.
- Open the source of the URL. In Chrome, you can do this by navigating to view-source:THE_URL. For example, if your landing page is http://example.com/folder1/folder2/, you can check which robots are allowed to crawl the page by navigating to view-source:http://example.com/folder1/folder2/.
- Check for the string “AdsBot-Google”.
  - If it's not there, this isn't the source of the issue.
  - If the string appears in a tag like <meta name="AdsBot-Google" content="noindex" />, work with your web developer to remove it. Learn more about meta tags that Google understands.
- Test your landing page with a crawler simulation tool. Use Google Search Console's URL Inspection tool to view how Google crawls your page. This can help identify if the page is accessible and if it returns a 200 OK status code. This proactively checks for the crawling issues that lead to disapprovals.
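As a supplement to the manual checks above, a robots.txt file can also be tested programmatically. The sketch below uses Python's standard-library `urllib.robotparser` to check whether a given robots.txt would permit AdsBot-Google to fetch a URL; the example.com URLs are placeholders. Note one simplification: the real AdsBot-Google ignores a blanket `User-agent: *` block and only obeys rules that name it explicitly, whereas the standard parser applies generic rules as well, so treat a "blocked" result here as a prompt to inspect the file by hand.

```python
# Sketch: check whether a robots.txt file allows AdsBot-Google to fetch a URL.
# Uses only the Python standard library; URLs below are placeholders.
from urllib import robotparser


def adsbot_allowed(robots_txt: str, url: str) -> bool:
    """Return True if the given robots.txt text permits AdsBot-Google to crawl url."""
    parser = robotparser.RobotFileParser()
    parser.parse(robots_txt.splitlines())  # parse the file contents directly
    return parser.can_fetch("AdsBot-Google", url)


# A file that names AdsBot-Google and allows everything:
open_rules = "User-agent: AdsBot-Google\nAllow: /"
# A file that names AdsBot-Google and blocks the whole site:
blocked_rules = "User-agent: AdsBot-Google\nDisallow: /"

print(adsbot_allowed(open_rules, "http://example.com/folder1/folder2/"))
print(adsbot_allowed(blocked_rules, "http://example.com/folder1/folder2/"))
```

In practice you would fetch the live file first (for example with `urllib.request`) and pass its text to this function, checking both the domain's robots.txt and any subdomain's separate robots.txt as described above.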
Other reasons that a landing page can't be crawled
If the Google AdsBot is allowed to crawl the landing page (User-agent: AdsBot-Google is already included in the site's robots.txt file), there may be another reason the page can't be crawled:
- The landing page isn't loading or returns an error. If you open the page in a web browser and see an HTTP error (for example, 404 Not Found, 403 Forbidden, 500 Internal Server Error, or a DNS error), your website's server is preventing access. Even if the page loads for you, it may be blocked for Google's crawlers. Contact the site's web developer to ensure the landing page is accessible to Google AdsBot and returns a 200 OK status code. They should check server logs and configurations to ensure Google's crawlers aren't being blocked by firewalls or other security settings (such as Cloudflare).
- The landing page requires sign-in. All ad landing pages must be publicly accessible. Use another landing page or work with the web developer to remove the sign-in requirement from the page.
- The landing page has too many forwards or redirects. Landing pages must have fewer than 10 redirects. Work with your web developer to reduce the number of redirects.
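The redirect limit above can be illustrated with a short sketch. This is not Google's actual crawler logic; the `fetch` callback is a hypothetical hook that returns a `(status_code, redirect_location)` pair, so the hop-counting logic can be shown and tested without a live server. In real use, `fetch` would wrap `urllib.request` or another HTTP client with redirect-following disabled.

```python
# Sketch: follow a redirect chain and report the final status plus hop count.
# `fetch` is a hypothetical hook: given a URL, it returns (status, location).
from typing import Callable, Optional, Tuple

MAX_REDIRECTS = 10  # landing pages must use fewer than 10 redirects


def resolve(url: str,
            fetch: Callable[[str], Tuple[int, Optional[str]]]) -> Tuple[int, int]:
    """Return (final_status, redirect_count), raising if the chain is too long."""
    hops = 0
    while True:
        status, location = fetch(url)
        if status in (301, 302, 303, 307, 308) and location:
            hops += 1
            if hops >= MAX_REDIRECTS:
                raise RuntimeError(
                    f"{hops} redirects reached; the page would be rejected")
            url = location  # follow the redirect and try again
            continue
        return status, hops


# Example with a fake two-hop chain (placeholder URLs):
chain = {
    "http://a.example/": (301, "http://b.example/"),
    "http://b.example/": (302, "http://c.example/"),
    "http://c.example/": (200, None),
}
print(resolve("http://a.example/", lambda u: chain[u]))
```

A chain that loops, or that needs 10 or more hops to settle, raises instead of returning, mirroring the disapproval you would see in Google Ads.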