Overview

Angular does not generate a robots.txt file by default. The steps below create one, include it in the build output, and verify that it is served from the root of your deployed site.

1. Create robots.txt

Open a plain-text editor such as Notepad. Save a new file named "robots.txt" in the /src folder of your project and add these lines:

User-agent: *
Disallow:

This is the most basic robots.txt: it allows all web crawlers access to every page on your website. For more restrictive rules, see the Options section below.
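Note the semantics of the empty Disallow value: leaving it blank permits everything, while a single slash blocks everything. For contrast, this variant would tell all compliant crawlers to stay off every page:

User-agent: *
Disallow: /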


2. Modify angular.json

Open angular.json and add this line to the assets array, found under projects > your-project-name > architect > build > options:

"src/robots.txt"


3. Test

Build and deploy your project. Then go to a robots.txt checker, such as technicalseo.com/tools/robots-txt/, and test your URL. You can also confirm the file deployed by browsing directly to /robots.txt on your domain.


Options

Exclude Pages

Prevent bots from crawling certain pages. Disallow rules match by path prefix, so /user also covers /user/profile and anything else beneath it:

User-agent: *
Disallow: /user
Disallow: /contact

Exclude Bots

Prevent specific bots from crawling your site at all by giving each one its own group. Keep in mind that robots.txt is advisory: well-behaved crawlers honor it, but it is not access control.

User-agent: SemrushBot
Disallow: /

User-agent: AhrefsBot
Disallow: /

User-agent: Amazonbot
Disallow: /
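
If you combine both options in one file, note that a crawler obeys only the most specific group that matches its user agent, so the per-bot groups sit alongside the catch-all group. A sketch of a combined file, using the example paths and bot names from above:

User-agent: SemrushBot
Disallow: /

User-agent: AhrefsBot
Disallow: /

User-agent: Amazonbot
Disallow: /

User-agent: *
Disallow: /user
Disallow: /contact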
          





Next

Add a sitemap
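
Once you have one, robots.txt can advertise it with a Sitemap directive; a minimal sketch, assuming the sitemap is published at your site root (swap in your own domain):

Sitemap: https://www.example.com/sitemap.xml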


Reference

Google Search Central: Introduction to robots.txt