Let’s say you have a list of URLs whose HTTP status you need to check quickly. This recently happened to me when I had to confirm that a series of pages were redirecting properly. You don’t need anything fancy, and you certainly don’t want to paste each URL into a browser by hand.
Using cURL, a command-line tool for fetching URLs, is the quickest solution. It can retrieve content and metadata from any URL, and by modifying the command with flags, we can get exactly the information we need.
To start, you’ll need a file of URLs, one per line. Open your favorite terminal program. At the prompt, copy and paste the following command, replacing filename.csv with the name of the file containing your URLs:
curl -L -o /dev/null -s -w "%{url_effective}\n%{http_code}\n" $(cat filename.csv)
Here’s how to understand the different flags in the above command:
curl is the command name.
-L tells curl to follow any redirects.
-o /dev/null sends the response body to nowhere, since we’re only interested in the response code.
-s suppresses the progress bar.
-w "%{url_effective}\n%{http_code}\n" outputs the final URL (after any redirects) on one line, then the response code on the next line.
$(cat filename.csv) uses each line from filename.csv as an argument for the command.
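For each transfer, that -w format prints two lines: the final URL, then the status code. Against a single placeholder URL, for example, the output looks something like this:

https://example.com/
200

One caveat worth knowing: curl pairs each -o with a single URL, so when the file contains several URLs, the response bodies after the first may land on stdout instead of /dev/null. If page markup shows up mixed into your results, running curl once per URL avoids it; one way, assuming the same filename.csv, is xargs:

xargs -n 1 curl -L -o /dev/null -s -w "%{url_effective}\n%{http_code}\n" < filename.csv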
If you find yourself reaching for this command often, consider setting up an alias in your shell profile, or save it as a shell script; a sketch of one follows. Then you’ll have the command ready to go whenever you need it.
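Here is a minimal sketch of such a script; the name check-urls.sh and the line-by-line structure are my own, not from the original command. It invokes curl once per URL, so the -o /dev/null caveat above never comes up:

#!/bin/sh
# check-urls.sh: print each URL's final location and HTTP status code.
# Usage: sh check-urls.sh filename.csv
while IFS= read -r url; do
  [ -n "$url" ] || continue  # skip blank lines in the input file
  curl -L -o /dev/null -s -w "%{url_effective}\n%{http_code}\n" "$url"
done < "${1:?usage: $0 urlfile}"

Quoting "$url" is deliberate: an unquoted expansion like $(cat filename.csv) is subject to the shell’s glob expansion, which can trip over URLs containing characters such as ? or *.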
I hope this quick tip eliminates the menial, repetitive task of checking URLs manually – exactly the kind of thing computers are good at.