Create robots.txt

Intergraph Smart Materials Web Installation (10.4)

The robots.txt file can be used to restrict which areas of the site search engines are allowed to crawl.

During the Tomcat installation, you can create a robots.txt file that instructs web crawlers which parts of the site they are not allowed to access. Keep in mind that robots.txt is only advisory; a crawler can choose to ignore it.

For example, if web crawlers should be prohibited from accessing all pages of the site, the robots.txt file should have this content:

User-agent: *
Disallow: /
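
If instead only a particular area should be excluded from crawling while the rest of the site remains accessible, a rule like the following can be used (the /admin/ path here is purely illustrative; substitute the path you want to exclude):

User-agent: *
Disallow: /admin/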

After creating the robots.txt file, place it in the ROOT web application directory of Tomcat:

<Tomcat installation directory>\webapps\ROOT
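
As a convenience, the file can also be created with a short script. The sketch below is one possible approach; the Tomcat installation path shown is an assumption based on a default Tomcat 9 installation on Windows, so adjust tomcat_root to your environment:

from pathlib import Path

# Assumed default Tomcat installation directory; adjust to your environment.
tomcat_root = Path(r"C:\Program Files\Apache Software Foundation\Tomcat 9.0\webapps\ROOT")

# Content that blocks crawlers from the entire site, as described above.
robots_txt = "User-agent: *\nDisallow: /\n"

# Write the file into Tomcat's ROOT web application directory.
(tomcat_root / "robots.txt").write_text(robots_txt, encoding="utf-8")
print("robots.txt written to", tomcat_root / "robots.txt")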

To check that the file is being served, open the robots.txt URL of your installation in a browser, for example http://<server>:<port>/robots.txt; the contents of the file should be displayed.
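
The check can also be scripted by requesting the file over HTTP. This is a minimal sketch assuming Tomcat is listening on the default port 8080 on the local machine; adjust the URL to your server and port:

import urllib.request

# Assumed host and port; adjust to your Tomcat configuration.
url = "http://localhost:8080/robots.txt"

# Fetch robots.txt and print its contents; an HTTP error here
# indicates the file is not being served from the ROOT directory.
with urllib.request.urlopen(url) as response:
    print(response.read().decode("utf-8"))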
