
python3.2  3.2.2
urllib.request Namespace Reference

Classes

class  Request
class  OpenerDirector
class  BaseHandler
class  HTTPErrorProcessor
class  HTTPDefaultErrorHandler
class  HTTPRedirectHandler
class  ProxyHandler
class  HTTPPasswordMgr
class  HTTPPasswordMgrWithDefaultRealm
class  AbstractBasicAuthHandler
class  HTTPBasicAuthHandler
class  ProxyBasicAuthHandler
class  AbstractDigestAuthHandler
class  HTTPDigestAuthHandler
class  ProxyDigestAuthHandler
class  AbstractHTTPHandler
class  HTTPHandler
class  HTTPSHandler
class  HTTPCookieProcessor
class  UnknownHandler
class  FileHandler
class  FTPHandler
class  CacheFTPHandler
class  URLopener
class  FancyURLopener
class  ftpwrapper

Functions

def urlopen
def install_opener
def urlretrieve
def urlcleanup
def request_host
def build_opener
def _parse_proxy
def randombytes
def parse_keqv_list
def parse_http_list
def _safe_gethostbyname
def url2pathname
def pathname2url
def localhost
def thishost
def ftperrors
def noheaders
def getproxies_environment
def proxy_bypass_environment
def _proxy_bypass_macosx_sysconf
def proxy_bypass_macosx_sysconf
def getproxies_macosx_sysconf
def proxy_bypass
def getproxies
def getproxies_registry
def proxy_bypass_registry

Variables

 _have_ssl = False
string __version__ = sys.version[:3]
 _opener = None
 _urlopener = None
 _cut_port_re = re.compile(r":\d+$", re.ASCII)
int MAXFTPCACHE = 10
dictionary ftpcache = {}
 _localhost = None
 _thishost = None
 _ftperrors = None
 _noheaders = None
 getproxies = getproxies_environment
 proxy_bypass = proxy_bypass_environment

Detailed Description

An extensible library for opening URLs using a variety of protocols.

The simplest way to use this module is to call the urlopen function,
which accepts a string containing a URL or a Request object (described
below).  It opens the URL and returns the result as a file-like
object; the returned object has some extra methods described below.

The OpenerDirector manages a collection of Handler objects that do
all the actual work.  Each Handler implements a particular protocol or
option.  The OpenerDirector is a composite object that invokes the
Handlers needed to open the requested URL.  For example, the
HTTPHandler performs HTTP GET and POST requests and deals with
non-error returns.  The HTTPRedirectHandler automatically deals with
HTTP 301, 302, 303 and 307 redirect errors, and the HTTPDigestAuthHandler
deals with digest authentication.

urlopen(url, data=None) -- Basic usage is the same as the original
urllib: pass the URL and, optionally, data to POST to an HTTP URL, and
get a file-like object back.  One difference is that you can also pass
a Request instance instead of a URL.  Raises a URLError (a subclass of
IOError); for HTTP errors, raises an HTTPError, which can also be
treated as a valid response.
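
As a minimal sketch of that basic call (the URL is only illustrative):

import urllib.request

f = urllib.request.urlopen('http://www.python.org/')
print(f.geturl())     # final URL, after any redirects
print(f.info())       # response headers
print(f.getcode())    # HTTP status code, e.g. 200
page = f.read()       # body as bytes
f.close()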

build_opener -- Function that creates a new OpenerDirector instance.
It installs the default handlers and accepts one or more Handlers as
arguments, either instances or Handler classes that it will
instantiate.  If one of the arguments is a subclass of a default
handler, that argument is installed instead of the default.

install_opener -- Installs a new opener as the default opener.

objects of interest:

OpenerDirector -- Sets up the User Agent as the Python-urllib client and manages
the Handler classes, while dealing with requests and responses.

Request -- An object that encapsulates the state of a request.  The
state can be as simple as the URL.  It can also include extra HTTP
headers, e.g. a User-Agent.

BaseHandler --

internals:
BaseHandler and parent
_call_chain conventions

Example usage:

import urllib.request

# set up authentication info
authinfo = urllib.request.HTTPBasicAuthHandler()
authinfo.add_password(realm='PDQ Application',
              uri='https://mahler:8092/site-updates.py',
              user='klem',
              passwd='geheim$parole')

proxy_support = urllib.request.ProxyHandler({"http" : "http://ahad-haam:3128"})

# build a new opener that adds authentication and caching FTP handlers
opener = urllib.request.build_opener(proxy_support, authinfo,
                             urllib.request.CacheFTPHandler)

# install it
urllib.request.install_opener(opener)

f = urllib.request.urlopen('http://www.python.org/')

Function Documentation

def urllib.request._parse_proxy (   proxy) [private]
Return (scheme, user, password, host/port) given a URL or an authority.

If a URL is supplied, it must have an authority (host:port) component.
According to RFC 3986, having an authority component means the URL must
have two slashes after the scheme:

>>> _parse_proxy('file:/ftp.example.com/')
Traceback (most recent call last):
ValueError: proxy URL with no authority: 'file:/ftp.example.com/'

The first three items of the returned tuple may be None.

Examples of authority parsing:

>>> _parse_proxy('proxy.example.com')
(None, None, None, 'proxy.example.com')
>>> _parse_proxy('proxy.example.com:3128')
(None, None, None, 'proxy.example.com:3128')

The authority component may optionally include userinfo (assumed to be
username:password):

>>> _parse_proxy('joe:password@proxy.example.com')
(None, 'joe', 'password', 'proxy.example.com')
>>> _parse_proxy('joe:password@proxy.example.com:3128')
(None, 'joe', 'password', 'proxy.example.com:3128')

Same examples, but with URLs instead:

>>> _parse_proxy('http://proxy.example.com/')
('http', None, None, 'proxy.example.com')
>>> _parse_proxy('http://proxy.example.com:3128/')
('http', None, None, 'proxy.example.com:3128')
>>> _parse_proxy('http://joe:password@proxy.example.com/')
('http', 'joe', 'password', 'proxy.example.com')
>>> _parse_proxy('http://joe:password@proxy.example.com:3128')
('http', 'joe', 'password', 'proxy.example.com:3128')

Everything after the authority is ignored:

>>> _parse_proxy('ftp://joe:password@proxy.example.com/rubbish:3128')
('ftp', 'joe', 'password', 'proxy.example.com')

Test for no trailing '/' case:

>>> _parse_proxy('http://joe:password@proxy.example.com')
('http', 'joe', 'password', 'proxy.example.com')

Definition at line 602 of file request.py.

00602 
00603 def _parse_proxy(proxy):
00604     """Return (scheme, user, password, host/port) given a URL or an authority.
00605 
00606     If a URL is supplied, it must have an authority (host:port) component.
00607     According to RFC 3986, having an authority component means the URL must
00608     have two slashes after the scheme:
00609 
00610     >>> _parse_proxy('file:/ftp.example.com/')
00611     Traceback (most recent call last):
00612     ValueError: proxy URL with no authority: 'file:/ftp.example.com/'
00613 
00614     The first three items of the returned tuple may be None.
00615 
00616     Examples of authority parsing:
00617 
00618     >>> _parse_proxy('proxy.example.com')
00619     (None, None, None, 'proxy.example.com')
00620     >>> _parse_proxy('proxy.example.com:3128')
00621     (None, None, None, 'proxy.example.com:3128')
00622 
00623     The authority component may optionally include userinfo (assumed to be
00624     username:password):
00625 
00626     >>> _parse_proxy('joe:password@proxy.example.com')
00627     (None, 'joe', 'password', 'proxy.example.com')
00628     >>> _parse_proxy('joe:password@proxy.example.com:3128')
00629     (None, 'joe', 'password', 'proxy.example.com:3128')
00630 
00631     Same examples, but with URLs instead:
00632 
00633     >>> _parse_proxy('http://proxy.example.com/')
00634     ('http', None, None, 'proxy.example.com')
00635     >>> _parse_proxy('http://proxy.example.com:3128/')
00636     ('http', None, None, 'proxy.example.com:3128')
00637     >>> _parse_proxy('http://joe:password@proxy.example.com/')
00638     ('http', 'joe', 'password', 'proxy.example.com')
00639     >>> _parse_proxy('http://joe:password@proxy.example.com:3128')
00640     ('http', 'joe', 'password', 'proxy.example.com:3128')
00641 
00642     Everything after the authority is ignored:
00643 
00644     >>> _parse_proxy('ftp://joe:password@proxy.example.com/rubbish:3128')
00645     ('ftp', 'joe', 'password', 'proxy.example.com')
00646 
00647     Test for no trailing '/' case:
00648 
00649     >>> _parse_proxy('http://joe:password@proxy.example.com')
00650     ('http', 'joe', 'password', 'proxy.example.com')
00651 
00652     """
00653     scheme, r_scheme = splittype(proxy)
00654     if not r_scheme.startswith("/"):
00655         # authority
00656         scheme = None
00657         authority = proxy
00658     else:
00659         # URL
00660         if not r_scheme.startswith("//"):
00661             raise ValueError("proxy URL with no authority: %r" % proxy)
00662         # We have an authority, so for RFC 3986-compliant URLs (by ss 3.
00663         # and 3.3.), path is empty or starts with '/'
00664         end = r_scheme.find("/", 2)
00665         if end == -1:
00666             end = None
00667         authority = r_scheme[2:end]
00668     userinfo, hostport = splituser(authority)
00669     if userinfo is not None:
00670         user, password = splitpasswd(userinfo)
00671     else:
00672         user = password = None
00673     return scheme, user, password, hostport

def urllib.request._proxy_bypass_macosx_sysconf (   host,
  proxy_settings 
) [private]
Return True iff this host shouldn't be accessed using a proxy

This function uses the MacOSX framework SystemConfiguration
to fetch the proxy information.

proxy_settings come from _scproxy._get_proxy_settings or get mocked ie:
{ 'exclude_simple': bool,
  'exceptions': ['foo.bar', '*.bar.com', '127.0.0.1', '10.1', '10.0/16']
}

Definition at line 2278 of file request.py.

02278 
02279 def _proxy_bypass_macosx_sysconf(host, proxy_settings):
02280     """
02281     Return True iff this host shouldn't be accessed using a proxy
02282 
02283     This function uses the MacOSX framework SystemConfiguration
02284     to fetch the proxy information.
02285 
02286     proxy_settings come from _scproxy._get_proxy_settings or get mocked ie:
02287     { 'exclude_simple': bool,
02288       'exceptions': ['foo.bar', '*.bar.com', '127.0.0.1', '10.1', '10.0/16']
02289     }
02290     """
02291     import re
02292     import socket
02293     from fnmatch import fnmatch
02294 
02295     hostonly, port = splitport(host)
02296 
02297     def ip2num(ipAddr):
02298         parts = ipAddr.split('.')
02299         parts = list(map(int, parts))
02300         if len(parts) != 4:
02301             parts = (parts + [0, 0, 0, 0])[:4]
02302         return (parts[0] << 24) | (parts[1] << 16) | (parts[2] << 8) | parts[3]
02303 
02304     # Check for simple host names:
02305     if '.' not in host:
02306         if proxy_settings['exclude_simple']:
02307             return True
02308 
02309     hostIP = None
02310 
02311     for value in proxy_settings.get('exceptions', ()):
02312         # Items in the list are strings like these: *.local, 169.254/16
02313         if not value: continue
02314 
02315         m = re.match(r"(\d+(?:\.\d+)*)(/\d+)?", value)
02316         if m is not None:
02317             if hostIP is None:
02318                 try:
02319                     hostIP = socket.gethostbyname(hostonly)
02320                     hostIP = ip2num(hostIP)
02321                 except socket.error:
02322                     continue
02323 
02324             base = ip2num(m.group(1))
02325             mask = m.group(2)
02326             if mask is None:
02327                 mask = 8 * (m.group(1).count('.') + 1)
02328             else:
02329                 mask = int(mask[1:])
02330             mask = 32 - mask
02331 
02332             if (hostIP >> mask) == (base >> mask):
02333                 return True
02334 
02335         elif fnmatch(host, value):
02336             return True
02337 
02338     return False
02339 
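
Because the settings dictionary is passed in directly, the matching logic
can be exercised on any platform; a sketch with made-up settings:

from urllib.request import _proxy_bypass_macosx_sysconf

settings = {'exclude_simple': True,
            'exceptions': ['*.local', 'www.example.com', '169.254/16']}

# Bare host name: bypassed because exclude_simple is set.
print(_proxy_bypass_macosx_sysconf('testhost', settings))            # True
# Literal entry matched by fnmatch.
print(_proxy_bypass_macosx_sysconf('www.example.com', settings))     # True
# Address inside the 169.254/16 range (the port is stripped first).
print(_proxy_bypass_macosx_sysconf('169.254.10.20:8080', settings))  # True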

def urllib.request._safe_gethostbyname (   host) [private]

Definition at line 1304 of file request.py.

01304 
01305 def _safe_gethostbyname(host):
01306     try:
01307         return socket.gethostbyname(host)
01308     except socket.gaierror:
01309         return None

def urllib.request.build_opener (   handlers)
Create an opener object from a list of handlers.

The opener will use several default handlers, including support
for HTTP, FTP and when applicable HTTPS.

If any of the handlers passed as arguments are subclasses of the
default handlers, the default handlers will not be used.

Definition at line 419 of file request.py.

00419 
00420 def build_opener(*handlers):
00421     """Create an opener object from a list of handlers.
00422 
00423     The opener will use several default handlers, including support
00424     for HTTP, FTP and when applicable HTTPS.
00425 
00426     If any of the handlers passed as arguments are subclasses of the
00427     default handlers, the default handlers will not be used.
00428     """
00429     def isclass(obj):
00430         return isinstance(obj, type) or hasattr(obj, "__bases__")
00431 
00432     opener = OpenerDirector()
00433     default_classes = [ProxyHandler, UnknownHandler, HTTPHandler,
00434                        HTTPDefaultErrorHandler, HTTPRedirectHandler,
00435                        FTPHandler, FileHandler, HTTPErrorProcessor]
00436     if hasattr(http.client, "HTTPSConnection"):
00437         default_classes.append(HTTPSHandler)
00438     skip = set()
00439     for klass in default_classes:
00440         for check in handlers:
00441             if isclass(check):
00442                 if issubclass(check, klass):
00443                     skip.add(klass)
00444             elif isinstance(check, klass):
00445                 skip.add(klass)
00446     for klass in skip:
00447         default_classes.remove(klass)
00448 
00449     for klass in default_classes:
00450         opener.add_handler(klass())
00451 
00452     for h in handlers:
00453         if isclass(h):
00454             h = h()
00455         opener.add_handler(h)
00456     return opener
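
A small sketch of the override rule: a handler passed in that subclasses one
of the defaults replaces that default (the names below are made up).

import urllib.request

class LoggingHTTPHandler(urllib.request.HTTPHandler):
    # Hypothetical handler that reports each request before delegating.
    def http_open(self, req):
        print("opening", req.full_url)
        return super().http_open(req)

# Because LoggingHTTPHandler is a subclass of HTTPHandler, the stock
# HTTPHandler is skipped and this one handles http:// URLs instead.
opener = urllib.request.build_opener(LoggingHTTPHandler)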

def urllib.request.ftperrors ()
Return the set of errors raised by the FTP class.

Definition at line 2123 of file request.py.

02123 
02124 def ftperrors():
02125     """Return the set of errors raised by the FTP class."""
02126     global _ftperrors
02127     if _ftperrors is None:
02128         import ftplib
02129         _ftperrors = ftplib.all_errors
02130     return _ftperrors

def urllib.request.getproxies ()
Return a dictionary of scheme -> proxy server URL mappings.

Returns settings gathered from the environment, if specified,
or the registry.

Definition at line 2363 of file request.py.

02363 
02364     def getproxies():
02365         return getproxies_environment() or getproxies_macosx_sysconf()
02366 

def urllib.request.getproxies_environment ()
Return a dictionary of scheme -> proxy server URL mappings.

Scan the environment for variables named <scheme>_proxy;
this seems to be the standard convention.  If you need a
different way, you can pass a proxies dictionary to the
[Fancy]URLopener constructor.

Definition at line 2239 of file request.py.

02239 
02240 def getproxies_environment():
02241     """Return a dictionary of scheme -> proxy server URL mappings.
02242 
02243     Scan the environment for variables named <scheme>_proxy;
02244     this seems to be the standard convention.  If you need a
02245     different way, you can pass a proxies dictionary to the
02246     [Fancy]URLopener constructor.
02247 
02248     """
02249     proxies = {}
02250     for name, value in os.environ.items():
02251         name = name.lower()
02252         if value and name[-6:] == '_proxy':
02253             proxies[name[:-6]] = value
02254     return proxies
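
A sketch of what the scan produces, using made-up environment values:

import os
import urllib.request

os.environ['http_proxy'] = 'http://proxy.example.com:3128'   # illustrative
os.environ['ftp_proxy'] = 'http://proxy.example.com:3128'    # illustrative

# Keys are the lowercased scheme names with the '_proxy' suffix stripped
# (any other *_proxy variables already set show up here as well).
print(urllib.request.getproxies_environment())
# {'http': 'http://proxy.example.com:3128', 'ftp': 'http://proxy.example.com:3128'}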

def urllib.request.getproxies_macosx_sysconf ()
Return a dictionary of scheme -> proxy server URL mappings.

This function uses the MacOSX framework SystemConfiguration
to fetch the proxy information.

Definition at line 2347 of file request.py.

02347 
02348     def getproxies_macosx_sysconf():
02349         """Return a dictionary of scheme -> proxy server URL mappings.
02350 
02351         This function uses the MacOSX framework SystemConfiguration
02352         to fetch the proxy information.
02353         """
02354         return _get_proxies()
02355 
02356 

def urllib.request.getproxies_registry ()
Return a dictionary of scheme -> proxy server URL mappings.

Win32 uses the registry to store proxies.

Definition at line 2368 of file request.py.

02368 
02369     def getproxies_registry():
02370         """Return a dictionary of scheme -> proxy server URL mappings.
02371 
02372         Win32 uses the registry to store proxies.
02373 
02374         """
02375         proxies = {}
02376         try:
02377             import winreg
02378         except ImportError:
02379             # Std module, so should be around - but you never know!
02380             return proxies
02381         try:
02382             internetSettings = winreg.OpenKey(winreg.HKEY_CURRENT_USER,
02383                 r'Software\Microsoft\Windows\CurrentVersion\Internet Settings')
02384             proxyEnable = winreg.QueryValueEx(internetSettings,
02385                                                'ProxyEnable')[0]
02386             if proxyEnable:
02387                 # Returned as Unicode but problems if not converted to ASCII
02388                 proxyServer = str(winreg.QueryValueEx(internetSettings,
02389                                                        'ProxyServer')[0])
02390                 if '=' in proxyServer:
02391                     # Per-protocol settings
02392                     for p in proxyServer.split(';'):
02393                         protocol, address = p.split('=', 1)
02394                         # See if address has a type:// prefix
02395                         import re
02396                         if not re.match('^([^/:]+)://', address):
02397                             address = '%s://%s' % (protocol, address)
02398                         proxies[protocol] = address
02399                 else:
02400                     # Use one setting for all protocols
02401                     if proxyServer[:5] == 'http:':
02402                         proxies['http'] = proxyServer
02403                     else:
02404                         proxies['http'] = 'http://%s' % proxyServer
02405                         proxies['https'] = 'https://%s' % proxyServer
02406                         proxies['ftp'] = 'ftp://%s' % proxyServer
02407             internetSettings.Close()
02408         except (WindowsError, ValueError, TypeError):
02409             # Either registry key not found etc, or the value in an
02410             # unexpected format.
02411             # proxies already set up to be empty so nothing to do
02412             pass
02413         return proxies

def urllib.request.install_opener (   opener)

Definition at line 140 of file request.py.

00140 
00141 def install_opener(opener):
00142     global _opener
00143     _opener = opener
00144 
# TODO(jhylton): Make this work with the same global opener.

def urllib.request.localhost ()
Return the IP address of the magic hostname 'localhost'.

Definition at line 2107 of file request.py.

02107 
02108 def localhost():
02109     """Return the IP address of the magic hostname 'localhost'."""
02110     global _localhost
02111     if _localhost is None:
02112         _localhost = socket.gethostbyname('localhost')
02113     return _localhost

def urllib.request.noheaders ()
Return an empty email Message object.

Definition at line 2132 of file request.py.

02132 
02133 def noheaders():
02134     """Return an empty email Message object."""
02135     global _noheaders
02136     if _noheaders is None:
02137         _noheaders = email.message_from_string("")
02138     return _noheaders
02139 
02140 
02141 # Utility classes

def urllib.request.parse_http_list (   s)
Parse lists as described by RFC 2068 Section 2.

In particular, parse comma-separated lists where the elements of
the list may include quoted-strings.  A quoted-string could
contain a comma.  A non-quoted string could have quotes in the
middle.  Neither commas nor quotes count if they are escaped.
Only double-quotes count, not single-quotes.

Definition at line 1209 of file request.py.

01209 
01210 def parse_http_list(s):
01211     """Parse lists as described by RFC 2068 Section 2.
01212 
01213     In particular, parse comma-separated lists where the elements of
01214     the list may include quoted-strings.  A quoted-string could
01215     contain a comma.  A non-quoted string could have quotes in the
01216     middle.  Neither commas nor quotes count if they are escaped.
01217     Only double-quotes count, not single-quotes.
01218     """
01219     res = []
01220     part = ''
01221 
01222     escape = quote = False
01223     for cur in s:
01224         if escape:
01225             part += cur
01226             escape = False
01227             continue
01228         if quote:
01229             if cur == '\\':
01230                 escape = True
01231                 continue
01232             elif cur == '"':
01233                 quote = False
01234             part += cur
01235             continue
01236 
01237         if cur == ',':
01238             res.append(part)
01239             part = ''
01240             continue
01241 
01242         if cur == '"':
01243             quote = True
01244 
01245         part += cur
01246 
01247     # append last part
01248     if part:
01249         res.append(part)
01250 
01251     return [part.strip() for part in res]
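
A short sketch with a made-up header value of the kind this is used on
(e.g. a Digest authentication challenge):

from urllib.request import parse_http_list

challenge = 'realm="PDQ Application", nonce="abc,123", qop="auth"'
print(parse_http_list(challenge))
# ['realm="PDQ Application"', 'nonce="abc,123"', 'qop="auth"']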

def urllib.request.parse_keqv_list (   l)
Parse list of key=value strings where keys are not duplicated.

Definition at line 1199 of file request.py.

01199 
01200 def parse_keqv_list(l):
01201     """Parse list of key=value strings where keys are not duplicated."""
01202     parsed = {}
01203     for elt in l:
01204         k, v = elt.split('=', 1)
01205         if v[0] == '"' and v[-1] == '"':
01206             v = v[1:-1]
01207         parsed[k] = v
01208     return parsed
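
A sketch feeding it the kind of list parse_http_list produces (same made-up
challenge as above); surrounding double quotes are stripped from the values:

from urllib.request import parse_http_list, parse_keqv_list

items = parse_http_list('realm="PDQ Application", nonce="abc,123", qop="auth"')
print(parse_keqv_list(items))
# {'realm': 'PDQ Application', 'nonce': 'abc,123', 'qop': 'auth'}
# (dictionary ordering may differ)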

def urllib.request.pathname2url (   pathname)
OS-specific conversion from a file system path to a relative URL
of the 'file' scheme; not recommended for general use.

Definition at line 1435 of file request.py.

01435 
01436     def pathname2url(pathname):
01437         """OS-specific conversion from a file system path to a relative URL
01438         of the 'file' scheme; not recommended for general use."""
01439         return quote(pathname)
01440 
01441 # This really consists of two pieces:
01442 # (1) a class which handles opening of all sorts of URLs
01443 #     (plus assorted utilities etc.)
01444 # (2) a set of functions for parsing URLs
01445 # XXX Should these be separated out into different modules?
01446 
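
On POSIX systems these are the quote/unquote-based conversions shown above
(Windows uses the nturl2path variants instead); a round trip with a made-up
path:

import urllib.request

path = '/tmp/some dir/report #1.txt'
url_path = urllib.request.pathname2url(path)
print(url_path)                               # /tmp/some%20dir/report%20%231.txt
print(urllib.request.url2pathname(url_path))  # /tmp/some dir/report #1.txt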

def urllib.request.proxy_bypass (   host)
Test whether proxies should be bypassed for a particular host.

If proxy settings are found in the environment, the no_proxy rules are
applied; otherwise the macOS system configuration is consulted.

Definition at line 2357 of file request.py.

02357 
02358     def proxy_bypass(host):
02359         if getproxies_environment():
02360             return proxy_bypass_environment(host)
02361         else:
02362             return proxy_bypass_macosx_sysconf(host)

def urllib.request.proxy_bypass_environment (   host)
Test if proxies should not be used for a particular host.

Checks the environment for a variable named no_proxy, which should
be a list of DNS suffixes separated by commas, or '*' for all hosts.

Definition at line 2255 of file request.py.

02255 
02256 def proxy_bypass_environment(host):
02257     """Test if proxies should not be used for a particular host.
02258 
02259     Checks the environment for a variable named no_proxy, which should
02260     be a list of DNS suffixes separated by commas, or '*' for all hosts.
02261     """
02262     no_proxy = os.environ.get('no_proxy', '') or os.environ.get('NO_PROXY', '')
02263     # '*' is special case for always bypass
02264     if no_proxy == '*':
02265         return 1
02266     # strip port off host
02267     hostonly, port = splitport(host)
02268     # check if the host ends with any of the DNS suffixes
02269     no_proxy_list = [proxy.strip() for proxy in no_proxy.split(',')]
02270     for name in no_proxy_list:
02271         if name and (hostonly.endswith(name) or host.endswith(name)):
02272             return 1
02273     # otherwise, don't bypass
02274     return 0
02275 
02276 
02277 # This code tests an OSX specific data structure but is testable on all
# platforms
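
A sketch with a made-up no_proxy value; matching is done on host-name
suffixes, and a bare '*' bypasses the proxy for every host:

import os
import urllib.request

os.environ['no_proxy'] = 'localhost, .example.com'                 # illustrative

print(urllib.request.proxy_bypass_environment('localhost:8080'))   # 1
print(urllib.request.proxy_bypass_environment('www.example.com'))  # 1
print(urllib.request.proxy_bypass_environment('python.org'))       # 0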

def urllib.request.proxy_bypass_macosx_sysconf (   host)

Definition at line 2343 of file request.py.

02343 
02344     def proxy_bypass_macosx_sysconf(host):
02345         proxy_settings = _get_proxy_settings()
02346         return _proxy_bypass_macosx_sysconf(host, proxy_settings)

def urllib.request.proxy_bypass_registry (   host)

Definition at line 2423 of file request.py.

02423 
02424     def proxy_bypass_registry(host):
02425         try:
02426             import winreg
02427             import re
02428         except ImportError:
02429             # Std modules, so should be around - but you never know!
02430             return 0
02431         try:
02432             internetSettings = winreg.OpenKey(winreg.HKEY_CURRENT_USER,
02433                 r'Software\Microsoft\Windows\CurrentVersion\Internet Settings')
02434             proxyEnable = winreg.QueryValueEx(internetSettings,
02435                                                'ProxyEnable')[0]
02436             proxyOverride = str(winreg.QueryValueEx(internetSettings,
02437                                                      'ProxyOverride')[0])
02438             # ^^^^ Returned as Unicode but problems if not converted to ASCII
02439         except WindowsError:
02440             return 0
02441         if not proxyEnable or not proxyOverride:
02442             return 0
02443         # try to make a host list from name and IP address.
02444         rawHost, port = splitport(host)
02445         host = [rawHost]
02446         try:
02447             addr = socket.gethostbyname(rawHost)
02448             if addr != rawHost:
02449                 host.append(addr)
02450         except socket.error:
02451             pass
02452         try:
02453             fqdn = socket.getfqdn(rawHost)
02454             if fqdn != rawHost:
02455                 host.append(fqdn)
02456         except socket.error:
02457             pass
02458         # make a check value list from the registry entry: replace the
02459         # '<local>' string by the localhost entry and the corresponding
02460         # canonical entry.
02461         proxyOverride = proxyOverride.split(';')
02462         # now check if we match one of the registry values.
02463         for test in proxyOverride:
02464             if test == '<local>':
02465                 if '.' not in rawHost:
02466                     return 1
02467             test = test.replace(".", r"\.")     # mask dots
02468             test = test.replace("*", r".*")     # change glob sequence
02469             test = test.replace("?", r".")      # change glob char
02470             for val in host:
02471                 # print "%s <--> %s" %( test, val )
02472                 if re.match(test, val, re.I):
02473                     return 1
02474         return 0

def urllib.request.randombytes (   n)
Return n random bytes.

Definition at line 878 of file request.py.

00878 
00879 def randombytes(n):
00880     """Return n random bytes."""
00881     return os.urandom(n)

def urllib.request.request_host (   request)
Return request-host, as defined by RFC 2965.

Variation from RFC: returned value is lowercased, for convenient
comparison.

Definition at line 161 of file request.py.

00161 
00162 def request_host(request):
00163     """Return request-host, as defined by RFC 2965.
00164 
00165     Variation from RFC: returned value is lowercased, for convenient
00166     comparison.
00167 
00168     """
00169     url = request.full_url
00170     host = urlparse(url)[1]
00171     if host == "":
00172         host = request.get_header("Host", "")
00173 
00174     # remove port, if present
00175     host = _cut_port_re.sub("", host, 1)
00176     return host.lower()
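
A sketch with a made-up URL, showing the port being stripped and the name
lowercased:

import urllib.request

req = urllib.request.Request('http://www.EXAMPLE.com:8080/index.html')
print(urllib.request.request_host(req))   # www.example.com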

def urllib.request.thishost ()
Return the IP addresses of the current host.

Definition at line 2115 of file request.py.

02115 
02116 def thishost():
02117     """Return the IP addresses of the current host."""
02118     global _thishost
02119     if _thishost is None:
02120         _thishost = tuple(socket.gethostbyname_ex(socket.gethostname())[2])
02121     return _thishost

def urllib.request.url2pathname (   pathname)
OS-specific conversion from a relative URL of the 'file' scheme
to a file system path; not recommended for general use.

Definition at line 1430 of file request.py.

01430 
01431     def url2pathname(pathname):
01432         """OS-specific conversion from a relative URL of the 'file' scheme
01433         to a file system path; not recommended for general use."""
01434         return unquote(pathname)

def urllib.request.urlcleanup ()

Definition at line 152 of file request.py.

00152 
00153 def urlcleanup():
00154     if _urlopener:
00155         _urlopener.cleanup()
00156     global _opener
00157     if _opener:
00158         _opener = None
00159 
# copied from cookielib.py

def urllib.request.urlopen (   url,
  data = None,
  timeout = socket._GLOBAL_DEFAULT_TIMEOUT,
  cafile = None,
  capath = None 
)

Definition at line 119 of file request.py.

00119 def urlopen(url, data=None, timeout=socket._GLOBAL_DEFAULT_TIMEOUT,
00120             *, cafile=None, capath=None):
00121     global _opener
00122     if cafile or capath:
00123         if not _have_ssl:
00124             raise ValueError('SSL support not available')
00125         context = ssl.SSLContext(ssl.PROTOCOL_SSLv23)
00126         context.options |= ssl.OP_NO_SSLv2
00127         if cafile or capath:
00128             context.verify_mode = ssl.CERT_REQUIRED
00129             context.load_verify_locations(cafile, capath)
00130             check_hostname = True
00131         else:
00132             check_hostname = False
00133         https_handler = HTTPSHandler(context=context, check_hostname=check_hostname)
00134         opener = build_opener(https_handler)
00135     elif _opener is None:
00136         _opener = opener = build_opener()
00137     else:
00138         opener = _opener
00139     return opener.open(url, data, timeout)
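
A sketch of the certificate-checking path (the URL and file name are
illustrative; cafile must point at a PEM bundle of trusted CA certificates,
and the ssl module must be available):

import urllib.request

f = urllib.request.urlopen('https://www.python.org/',
                           cafile='/etc/ssl/certs/ca-certificates.crt')
print(f.getcode())
f.close()

Note that when cafile or capath is given, a one-off opener with a verifying
HTTPSHandler is built instead of using the installed default opener.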

def urllib.request.urlretrieve (   url,
  filename = None,
  reporthook = None,
  data = None 
)

Definition at line 146 of file request.py.

00146 
00147 def urlretrieve(url, filename=None, reporthook=None, data=None):
00148     global _urlopener
00149     if not _urlopener:
00150         _urlopener = FancyURLopener()
00151     return _urlopener.retrieve(url, filename, reporthook, data)
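
A sketch with a made-up URL and file name; the hook is called as
reporthook(block_count, block_size, total_size), with total_size set to -1
when the server does not report a length:

import urllib.request

def report(block_count, block_size, total_size):
    print(block_count * block_size, 'of', total_size, 'bytes')

filename, headers = urllib.request.urlretrieve('http://www.python.org/',
                                               '/tmp/python-home.html',
                                               reporthook=report)
print(filename, headers.get('Content-Type'))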


Variable Documentation

urllib.request.__version__ = sys.version[:3]

Definition at line 115 of file request.py.

urllib.request._cut_port_re = re.compile(r":\d+$", re.ASCII)

Definition at line 160 of file request.py.

urllib.request._ftperrors = None

Definition at line 2122 of file request.py.

urllib.request._have_ssl = False

Definition at line 110 of file request.py.

urllib.request._localhost = None

Definition at line 2106 of file request.py.

urllib.request._noheaders = None

Definition at line 2131 of file request.py.

urllib.request._opener = None

Definition at line 117 of file request.py.

urllib.request._thishost = None

Definition at line 2114 of file request.py.

urllib.request._urlopener = None

Definition at line 145 of file request.py.

dictionary urllib.request.ftpcache = {}

Definition at line 1447 of file request.py.

urllib.request.getproxies = getproxies_environment

Return a dictionary of scheme -> proxy server URL mappings.

Returns settings gathered from the environment, if specified,
or the registry.

Definition at line 2489 of file request.py.

int urllib.request.MAXFTPCACHE = 10

Definition at line 1424 of file request.py.

urllib.request.proxy_bypass = proxy_bypass_environment

Test if proxies should not be used for a particular host.

Definition at line 2490 of file request.py.