A robots.txt file lets you give directives to search engine crawlers: it can list paths that should not be crawled and declare the address of the sitemap. The file is plain text, so on Windows it can be created with Notepad or any other text editor.
If the file contains the following lines, crawlers such as Google or Bing will not crawl the /admin/ path (the administration area on many sites):
User-agent: *
Disallow: /admin/
The line below shows how to declare the address of a sitemap file:
Sitemap: https://www.acilsoru.com/sitemap.xml
The same approach can be applied to other sites.
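To verify that rules like the ones above behave as intended, Python's standard library includes urllib.robotparser. The sketch below parses a robots.txt body directly (no network request), using the same rules shown in this answer; the URLs are only illustrative examples:

```python
from urllib.robotparser import RobotFileParser

# The same rules as in the example above, as a plain string.
rules = """User-agent: *
Disallow: /admin/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# The /admin/ path is blocked for all user agents...
print(parser.can_fetch("*", "https://www.acilsoru.com/admin/"))      # False
# ...while ordinary pages remain crawlable.
print(parser.can_fetch("*", "https://www.acilsoru.com/index.html"))  # True
```

In a real deployment you would call parser.set_url("https://example.com/robots.txt") followed by parser.read() to fetch the live file instead of parsing a string.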