urllib2 URLError: Reading Server Response Codes (Python)
I have a list of URLs. I'd like to see the server response code of each one to find out if it's broken. I can read server errors (500) and broken links (404) okay, but the code breaks once a non-website is read (e.g. "notawebsite_broken.com"). I've searched around and not found an answer... I hope you can help. Here's the code:

```python
import urllib2

# list of urls; the third url is not a website
urls = ["http://www.google.com",
        "http://www.ebay.com/broken-link",
        "http://notawebsite_broken"]

# empty list to store the output
response_codes = []

# for each url, get the server response code and save it to response_codes
for url in urls:
    try:
        connection = urllib2.urlopen(url)
        response_codes.append(connection.getcode())
        connection.close()
        print url, ' - ', connection.getcode()
    except urllib2.HTTPError, e:
        response_codes.append(e.getcode())
        print url, ' - ', e.getcode()

print response_codes
```

This gives an output of...

http://www.google.com
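For context, a non-resolving hostname raises `urllib2.URLError` rather than `urllib2.HTTPError`, so a second `except` clause is one way to keep the loop alive. Below is a minimal sketch of that idea written for Python 3, where `urllib2` was split into `urllib.request` and `urllib.error`; the function name `check_url` and the `None` return value for unreachable hosts are my own choices, not part of the question:

```python
import urllib.request
import urllib.error

def check_url(url, timeout=5):
    """Return the HTTP status code for url, or None if the host is unreachable.

    check_url and the None sentinel are illustrative choices, not a standard API.
    """
    try:
        conn = urllib.request.urlopen(url, timeout=timeout)
        code = conn.getcode()
        conn.close()
        return code
    except urllib.error.HTTPError as e:
        # the server responded, but with an error status (404, 500, ...)
        return e.code
    except urllib.error.URLError:
        # no server response at all: DNS failure, refused connection, timeout
        return None
```

Note that `HTTPError` is a subclass of `URLError`, so the more specific `except` must come first, as it does here.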