Shell script to check URLs

Question

I am trying to write a shell script that checks whether a particular directory exists under a given domain name. For example, for the web site www.example.com I want to check whether the page www.example.com/testpage exists. I have a text file containing a list of directory names, one per line, each of which should take the place of testpage. I want the script to take each of these directory names, append it to the URL www.example.com, and validate it. How can I begin writing my script?

Added by: Fiona Flatley

Answer

Let's examine an example.

File relative_urls.list:

> cat relative_urls.list 
/users/449/oli
/users/449
/help/badges
/help/badges/185/curious
/unanswered
/questions/tagged/12.04
/questions/tagged/boot
/questions/tagged/oracle
/questions/tagged/internet_explorer
/questions/tagged/outlook

We'd like to check whether these documents are available on the site http://askubuntu.com.

Possible solution:

> while read -r i ; do curl --head -s "http://askubuntu.com$i" | grep -q '^HTTP.*200' && echo "OK '$i'" || echo "fail '$i'" ; done < relative_urls.list

And the result is:

OK '/users/449/oli'
fail '/users/449'
OK '/help/badges'
OK '/help/badges/185/curious'
OK '/unanswered'
OK '/questions/tagged/12.04'
OK '/questions/tagged/boot'
OK '/questions/tagged/oracle'
fail '/questions/tagged/internet_explorer'
fail '/questions/tagged/outlook'

If you prefer another definition of URL availability, you can change the grep command to something that better matches your conditions.
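For example, if redirects should also count as available, the status match could accept any 2xx or 3xx code. One possible variant (a sketch, not the only way to do it):

> while read -r i ; do curl --head -s "http://askubuntu.com$i" | grep -qE '^HTTP[^ ]* [23][0-9][0-9]' && echo "OK '$i'" || echo "fail '$i'" ; done < relative_urls.list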

The important thing is that curl only shows whether a URL is served by the web server. There is no reliable way to determine, via HTTP, whether a directory exists on the web server's file system.
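Adapted to the original question, a minimal sketch could look like the following (dirs.txt and http://www.example.com are placeholders for your own file and domain). It reads the numeric status code with curl's --write-out option instead of grepping the header line:

> cat check_urls.sh
#!/bin/sh
# Placeholders: replace dirs.txt and the base URL with your own values.
# Entries in dirs.txt are assumed to be names like "testpage" without a leading slash.
base="http://www.example.com"
while read -r dir ; do
    # --head sends a HEAD request; --write-out '%{http_code}' prints only the status code
    status=$(curl --head -s -o /dev/null -w '%{http_code}' "$base/$dir")
    if [ "$status" = "200" ] ; then
        echo "OK '$dir'"
    else
        echo "fail '$dir' (HTTP $status)"
    fi
done < dirs.txt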

Added by: Aglae Fadel
