In today’s article, I will give you all the information about fixing the crawling issue on Blogger posts. All the possible methods will be shared here, along with the code and files required to fix this error. Note that this guide is for websites hosted on Blogger; the method does not work for WordPress-hosted websites.
What is Crawling?
Crawling is the process of search engine bots, such as Googlebot, visiting a website to find out what is on its pages. It helps search engines understand the content and index it in their databases.
What is search engine indexing?
Indexing is the process by which search engines organize the information they have crawled so they can respond quickly to search queries. They store the crawled data on their servers, organize it for different queries, and show only the relevant results.
Search engine ranking
After analyzing the indexed data, search engines order the relevant or similar websites by priority and display them in the SERPs (search engine result pages).
Ranking depends on many different factors, such as relevance, PageRank, website authority, and backlinks. Google uses more than 200 ranking factors to show accurate results.
How to fix crawling & indexing issues in Blogger?
Fixing the crawling & indexing issue is a bit technical and depends on a lot of different factors. Here, I will explain some of the settings to avoid these problems and get this error fixed easily.
1: Privacy Setting
Navigate to the Blogger dashboard and click on the Settings tab. Now find the Privacy option and turn on the setting “Visible to search engines”. If this is turned off, search engines will stop indexing your web pages.
2: Crawlers and indexing settings
After enabling the Privacy setting, scroll down to the “Crawlers and indexing” settings and turn on the custom robots.txt option. Then add a robots.txt file in the format below, replacing example.com with your own website URL.
User-agent: *
Disallow: /search
Disallow: /category/
Disallow: /tag/
Allow: /
Sitemap: https://www.example.com/atom.xml?redirect=false&start-index=1&max-results=500

If your website has more than 500 pages, you can add multiple sitemaps. Just add an extra line after the code above:

Sitemap: https://www.example.com/atom.xml?redirect=false&start-index=501&max-results=500

In this way, you can cover 1,000 pages in your Blogger XML sitemap and solve the indexing issue. If you have more than 1,000 posts, add another line with start-index=1001.
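For instance, a hypothetical blog at www.example.com with around 1,200 posts would end up with a robots.txt like this (the domain and the number of Sitemap lines are placeholders; adjust the start-index values to match your own post count):

User-agent: *
Disallow: /search
Disallow: /category/
Disallow: /tag/
Allow: /
Sitemap: https://www.example.com/atom.xml?redirect=false&start-index=1&max-results=500
Sitemap: https://www.example.com/atom.xml?redirect=false&start-index=501&max-results=500
Sitemap: https://www.example.com/atom.xml?redirect=false&start-index=1001&max-results=500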
3: Enable Custom Header Tags
Now navigate to the “custom robots header tags” option and set the three groups of header tags mentioned below (an example of the resulting meta tags follows the list).
- Open Home page tags, select the “all” and “noodp” options, and save.
- Open Archive and search page tags, select the “noindex” and “noodp” options, and save.
- In Post and page tags, select the “all” and “noodp” options, and save.
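Once saved, Blogger adds robots meta tags to the head of the generated pages. As a rough sketch of what these settings translate to (the exact markup Blogger emits may differ slightly):

<!-- homepage, posts, and pages: allow indexing -->
<meta name="robots" content="all, noodp">
<!-- archive and search pages: keep them out of the index -->
<meta name="robots" content="noindex, noodp">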
4: Submit sitemap in Search Console
Now you have to submit the sitemap in Google Search Console. If you don’t have an account, create one and verify your domain in it.
Now click on the Sitemaps option in Search Console, enter the sitemap URL in the following format, and click Submit.

https://www.example.com/sitemap.xml

Your sitemap is now submitted to Google Search Console. You can also submit your sitemap in Bing Webmaster Tools or link your Search Console account with it.
After submitting the sitemap, your website will be crawled automatically by search engine bots and indexed in Google search results.
Even so, some websites still face crawling issues. This sometimes happens due to crawl budget limitations or redirect issues.
You can ease the crawl budget issue by updating your website regularly and posting articles frequently.
You can also manually submit a blog URL in the URL Inspection tool of Search Console. Just paste the newly published article link and click the Request Indexing button.
After that, Google will prioritize your page and crawl it within a short time.

How to avoid indexing and crawling issues?
There are various methods you can implement to avoid indexing and crawling issues on your website. These methods help search engines index your pages faster.
- Post articles frequently and update your old articles from time to time.
- Focus on interlinking your articles, as it helps search engines discover new pages easily.
- Share the article on social media to get some initial traffic to that page.
- Fix broken internal links.
- Fix redirect loops (they happen when two pages redirect to each other).
- Improve page loading speed.
- Fix duplicate page issues.
- Use an HTML sitemap in Blogger (a minimal sketch follows this list).
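To illustrate the last point, an HTML sitemap is just a regular Blogger page that lists links to your posts, so crawlers and readers can discover them from a single place. A minimal hand-written sketch is shown below; the post URLs are placeholders you would replace with your own (many Blogger users instead use a widget script that builds this list automatically from the blog feed):

<!-- HTML sitemap: a plain list of links to every post -->
<ul>
  <li><a href="https://www.example.com/2022/01/first-post.html">First post</a></li>
  <li><a href="https://www.example.com/2022/02/second-post.html">Second post</a></li>
  <li><a href="https://www.example.com/2022/03/third-post.html">Third post</a></li>
</ul>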
Conclusion
If you have read this article carefully, you now know how to fix crawling and indexing issues on your Blogger website. Just check the robots.txt file and meta tags properly and follow the best practices shown above. If you are still facing any crawling or indexing issues, let me know in the comment section.