[Python] Socket Timeouts in urllib2

Posted in Programming


Things have changed quite a lot since this post was originally written in 2016. Currently, I would recommend switching over to making asynchronous calls via aiohttp.
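For the record, a minimal sketch of what that aiohttp approach might look like today. The helper name `fetch_json` and the 60-second total timeout are my own placeholders, not code from the original script:

```python
import aiohttp


async def fetch_json(url):
    # aiohttp takes a ClientTimeout object rather than a bare number;
    # 'total' caps the whole request, connect through read.
    timeout = aiohttp.ClientTimeout(total=60)
    async with aiohttp.ClientSession(timeout=timeout) as session:
        async with session.get(url) as resp:
            resp.raise_for_status()
            return await resp.json()
```

A timed-out request raises `asyncio.TimeoutError`, so the silent-failure problem below doesn't come up in the same way.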

One of my scripts that makes an API call has been failing silently lately. It appears that the connection is timing out, but I am not catching that particular error. So, I fixed that.

@@ -8,6 +8,7 @@
 import os
 import random
 import shelve
+import socket
 import sys
 import time
 import urllib2
@@ -225,11 +226,15 @@

     queries['eve-kill'] += 1
     try:
-        data = urllib2.urlopen(request)
+        data = urllib2.urlopen(request, timeout=60)
     except urllib2.HTTPError, e:
         print('url: {}'.format(url))
         print('error: {}'.format(e))
         sys.exit(1)
+    except socket.timeout, e:
+        print('url: {}'.format(url))
+        print('error: {}'.format(e))
+        sys.exit(1)

     j = json.load(data)

I set the timeout particularly long, at 60 seconds. Since I made the change, I do not think it has ever actually taken that long.
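As a side note, instead of passing `timeout=60` on every call, the standard library can set a process-wide default that applies to any socket created without an explicit timeout, including the ones urllib2 opens under the hood. A minimal sketch:

```python
import socket

# Applies to every socket created afterwards without its own timeout,
# so urllib2.urlopen(request) picks it up with no per-call argument.
socket.setdefaulttimeout(60)

assert socket.getdefaulttimeout() == 60
```

The per-call `timeout=` argument still wins over the global default, so the two approaches compose fine.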

Also, I need to refactor this and add my own exception class, since I seem to be doing the same thing in most of my exception handlers.
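One way that refactor could look, sketched here in Python 3 (where urllib2 became `urllib.request`); the `FetchError` class and `fetch` helper are hypothetical names, not from the original script:

```python
import socket
import urllib.error
import urllib.request


class FetchError(Exception):
    """Hypothetical wrapper so callers handle one exception type."""


def fetch(url, timeout=60):
    try:
        return urllib.request.urlopen(url, timeout=timeout)
    except (urllib.error.URLError, socket.timeout) as e:
        # One place to report the URL and underlying error, instead of
        # repeating the same print/exit block at every call site.
        raise FetchError('url: {}, error: {}'.format(url, e)) from e
```

Callers then need a single `except FetchError:` clause rather than separate handlers for HTTP errors and socket timeouts.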
