5. Create a Robots.txt file

Quite simply, your site's robots.txt file tells search engine crawlers which pages and files they can and cannot request from your site.
Most commonly, it is used to keep crawlers out of certain sections of your site, but it should not be used as a way to de-index a web page and prevent it from appearing on Google.
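For illustration, a robots.txt file that keeps crawlers out of one section of a site might look like the sketch below; the /wp-admin/ path is a hypothetical placeholder, so substitute whatever directories you actually want to block.

```
# Rules apply to all crawlers
User-agent: *

# Block crawling of an admin area (hypothetical path)
Disallow: /wp-admin/
```

Keep in mind that a Disallow rule only stops compliant crawlers from fetching those URLs; a blocked page can still appear in Google's results if other sites link to it, which is why a noindex directive, not robots.txt, is the right tool for de-indexing.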
You can find your site's robots.txt file at the root of your domain, i.e. yourdomain.com/robots.txt. Visit that URL to check whether you already have one in place. If not, you will need to create one, even if you do not currently need to prevent any of your web pages from being crawled.
Several WordPress SEO plugins allow users to create and edit their own robots.txt file, but if you use a different CMS you may need to create the file manually in a text editor and upload it to your domain's root directory. A minimal starter file is sketched below.
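If you have nothing to block, a minimal "allow everything" file is enough to start with. In this sketch the Sitemap line is optional, and yourdomain.com is a hypothetical address:

```
# Rules apply to all crawlers
User-agent: *

# An empty Disallow value blocks nothing
Disallow:

# Optional: point crawlers at your XML sitemap (hypothetical URL)
Sitemap: https://yourdomain.com/sitemap.xml
```

Save the file as robots.txt (all lowercase) and upload it so it resolves at yourdomain.com/robots.txt.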
6. Check Search Console for manual actions
In rare cases, you may find that your site has been negatively impacted by a manual action.
Manual actions are typically caused by a clear attempt to violate or manipulate Google's Webmaster Guidelines. This includes user-generated spam, structured data issues, unnatural links (both to and from your site), thin content, hidden text, and even what is called pure spam.
Most sites are not affected by manual action and never will be.
That said, you can check for them in the Manual actions report in Google Search Console.
You will be notified if your site has been affected by a manual action, but if you are working on a new project or taking over a site, this should always be one of the first things you check.