Robots.txt

A text file stored on a website's server that contains basic rules for the indexing robots that "crawl" the site. This file lets you explicitly allow or disallow crawler bots from accessing certain files and folders, which helps keep your indexed pages limited to only those you want to appear.
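As a rough sketch, a minimal robots.txt might look like the following (the /admin/ folder is purely illustrative):

    # Rules below apply to all crawlers
    User-agent: *
    # Ask crawlers not to crawl the /admin/ folder
    Disallow: /admin/
    # Everything else may be crawled
    Allow: /

The file is placed at the root of the domain (for example, example.com/robots.txt), and well-behaved crawlers check it before fetching other pages.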


Articles in which this term is used