This is a shell function that creates a robots.txt file with a crawl delay of 10 seconds for all user-agents. The function first checks whether a robots.txt file already exists in the document root of the provided domain. If it does, the function prints a message saying so and returns. Otherwise, it creates the file with the desired content, sets its ownership to the domain's user, and prints a message indicating that the file was created.


[root@cloudvpsserver public_html]# nk_create_robots_txt nkern.net
/home/nkern/public_html/robots.txt created.

[root@cloudvpsserver public_html]# nk_create_robots_txt nkern.net
/home/nkern/public_html/robots.txt already exists.


nk_create_robots_txt() {
# Exit and print message if no domain name is provided.
if [ "$1" = "" ]; then
    echo "You must provide a domain name"
    return 1
fi
# Populate variables.
# domain was provided by the user.
# docroot is the result of running nk_docroot on $domain.
# user is the result of running nk_user on $domain.
# robots_file is the result of adding "robots.txt" to the end of the docroot.
domain="$1"
docroot="$(nk_docroot "$domain")"
user="$(nk_user "$domain")"
robots_file="$docroot/robots.txt"
# If there's a file called robots.txt in docroot, print out that it already exists and exit.
if [ -f "$robots_file" ]; then
    echo "$robots_file already exists."
    return 0
fi
# Otherwise, if a robots.txt file does not exist, create one with a crawl delay of 10 for all user-agents.
printf "User-agent: *\nCrawl-delay: 10\n" > "$robots_file"
# Update the ownership of the robots.txt file to be the user.
chown "$user":"$user" "$robots_file"
# Print out that the robots.txt file was created.
echo "$robots_file created."
}
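The file the function writes is just a two-line wildcard rule. A minimal standalone sketch of the creation step, using a temporary directory in place of a real docroot (so it runs without nk_docroot or a live domain):

```shell
# Sketch: reproduce the file-creation step in a throwaway directory.
robots_file="$(mktemp -d)/robots.txt"
# Same content the function writes: a wildcard user-agent with a 10-second crawl delay.
printf "User-agent: *\nCrawl-delay: 10\n" > "$robots_file"
cat "$robots_file"
```

On a real domain the path would instead come from nk_docroot, and a chown to the domain's user would follow, as in the function above.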

Author: Nichole Kernreicht

Created: 2023-04-09 Sun 21:35