Vinutha, before scraping a website it is necessary to check whether the website permits users to perform web scraping.
This can be checked with the paths_allowed() function in the robotstxt package. paths_allowed() reads the site's robots.txt file and returns TRUE or FALSE depending on whether the website permits the given path to be scraped.
For example, you can run this check against the Edureka website.
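A minimal sketch of that check, assuming the domain www.edureka.co and the path "/community/" purely for illustration:

```r
# Install once if needed: install.packages("robotstxt")
library(robotstxt)

# Check whether robots.txt for www.edureka.co allows scraping the
# hypothetical path "/community/"; returns TRUE or FALSE.
paths_allowed(
  paths  = "/community/",
  domain = "www.edureka.co"
)
```

If the call returns TRUE, the site's robots.txt does not disallow that path for your bot; if FALSE, the path is disallowed.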
Technically, paths that return FALSE are not supposed to be scraped. Users can still scrape pages that are not permitted, but doing so violates the site's robots.txt policy.