Commit

Merge branch 'master' into async-auth
florimondmanca committed Sep 4, 2020
2 parents 4aa3c55 + 42c6686 commit a795e32
Showing 17 changed files with 224 additions and 216 deletions.
1 change: 0 additions & 1 deletion README.md
@@ -122,7 +122,6 @@ The HTTPX project relies on these excellent libraries:
* `rfc3986` - URL parsing & normalization.
* `idna` - Internationalized domain name support.
* `sniffio` - Async library autodetection.
* `urllib3` - Support for the `httpx.URLLib3Transport` class. *(Optional)*
* `brotlipy` - Decoding for "brotli" compressed responses. *(Optional)*

A huge amount of credit is due to `requests` for the API layout that
13 changes: 8 additions & 5 deletions docs/advanced.md
@@ -809,6 +809,8 @@ HTTPX's `Client` also accepts a `transport` argument. This argument allows you
to provide a custom Transport object that will be used to perform the actual
sending of the requests.

### Usage

For some advanced configuration you might need to instantiate a transport
class directly, and pass it to the client instance. The `httpcore` package
provides a `local_address` configuration that is only available via this
@@ -850,18 +852,19 @@ do not include any default values for configuring aspects such as the
connection pooling details, so you'll need to provide more explicit
configuration when using this API.
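
As a concrete but hedged illustration of that point, a minimal sketch of passing an explicitly configured `httpcore` pool to the client could look like the following. The `SyncConnectionPool` keyword arguments shown here are assumptions about the httpcore version in use at the time of this commit; verify them against the httpcore release you have installed.

```python
import httpcore
import httpx

# Sketch only: keyword arguments may differ between httpcore releases.
transport = httpcore.SyncConnectionPool(
    local_address="0.0.0.0",  # connect over IPv4 only
)
client = httpx.Client(transport=transport)
response = client.get("https://example.org")
```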

HTTPX also currently ships with a transport that uses the excellent
[`urllib3` library](https://urllib3.readthedocs.io/en/latest/), which can be
used with the sync `Client`...
### urllib3 transport

This [public gist](https://gist.github.com/florimondmanca/d56764d78d748eb9f73165da388e546e) provides a transport that uses the excellent [`urllib3` library](https://urllib3.readthedocs.io/en/latest/), and can be used with the sync `Client`...

```pycon
>>> import httpx
>>> client = httpx.Client(transport=httpx.URLLib3Transport())
>>> from urllib3_transport import URLLib3Transport
>>> client = httpx.Client(transport=URLLib3Transport())
>>> client.get("https://example.org")
<Response [200 OK]>
```

Note that you'll need to install the `urllib3` package to use `URLLib3Transport`.
### Writing custom transports

A transport instance must implement the Transport API defined by
[`httpcore`](https://www.encode.io/httpcore/api/). You
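
For orientation, a rough sketch of such a custom transport is shown below. It assumes the httpcore sync Transport API of this era, in which `request()` receives `(method, url, headers, stream, timeout)` and returns an `(http_version, status_code, reason_phrase, headers, stream)` tuple; that signature has changed across httpcore releases, so treat the example as illustrative rather than authoritative.

```python
import httpcore
import httpx


class LoggingTransport(httpcore.SyncHTTPTransport):
    """Sketch: delegate to a connection pool and log each outgoing request."""

    def __init__(self) -> None:
        # Assumed httpcore API; a real implementation would forward pool
        # configuration (SSL context, limits, ...) explicitly.
        self._pool = httpcore.SyncConnectionPool()

    def request(self, method, url, headers=None, stream=None, timeout=None):
        # At this layer `url` is a (scheme, host, port, path) tuple of bytes.
        print("Sending request:", method, url)
        return self._pool.request(
            method, url, headers=headers, stream=stream, timeout=timeout
        )


client = httpx.Client(transport=LoggingTransport())
```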
6 changes: 6 additions & 0 deletions docs/compatibility.md
@@ -83,3 +83,9 @@ Besides, `httpx.Request()` does not support the `auth`, `timeout`, `allow_redire
## Mocking

If you need to mock HTTPX the same way that test utilities like `responses` and `requests-mock` do for `requests`, see [RESPX](https://github.com/lundberg/respx).

## Networking layer

`requests` defers most of its HTTP networking code to the excellent [`urllib3` library](https://urllib3.readthedocs.io/en/latest/).

HTTPX, on the other hand, uses [HTTPCore](https://github.com/encode/httpcore) as its core HTTP networking layer, which is a separate project from `urllib3`.
1 change: 0 additions & 1 deletion docs/index.md
@@ -114,7 +114,6 @@ The HTTPX project relies on these excellent libraries:
* `rfc3986` - URL parsing & normalization.
* `idna` - Internationalized domain name support.
* `sniffio` - Async library autodetection.
* `urllib3` - Support for the `httpx.URLLib3Transport` class. *(Optional)*
* `brotlipy` - Decoding for "brotli" compressed responses. *(Optional)*

A huge amount of credit is due to `requests` for the API layout that
10 changes: 10 additions & 0 deletions docs/third-party-packages.md
@@ -23,3 +23,13 @@ An asynchronous GitHub API library. Includes [HTTPX support](https://gidgethub.r
[GitHub](https://github.com/lundberg/respx) - [Documentation](https://lundberg.github.io/respx/)

A utility for mocking out the Python HTTPX library.

## Gists

<!-- NOTE: this list is in alphabetical order. -->

### urllib3-transport

[GitHub](https://gist.github.com/florimondmanca/d56764d78d748eb9f73165da388e546e)

This public gist provides an example implementation of a [custom transport](/advanced#custom-transports) built on top of the battle-tested [`urllib3`](https://urllib3.readthedocs.io) library.
3 changes: 0 additions & 3 deletions httpx/__init__.py
@@ -38,7 +38,6 @@
from ._models import URL, Cookies, Headers, QueryParams, Request, Response
from ._status_codes import StatusCode, codes
from ._transports.asgi import ASGITransport
from ._transports.urllib3 import URLLib3ProxyTransport, URLLib3Transport
from ._transports.wsgi import WSGITransport

__all__ = [
@@ -101,8 +100,6 @@
"TransportError",
"UnsupportedProtocol",
"URL",
"URLLib3ProxyTransport",
"URLLib3Transport",
"WriteError",
"WriteTimeout",
"WSGITransport",
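
The user-visible effect of dropping these exports is that the urllib3 transport classes are no longer reachable from the top-level package; a quick sketch of what that means on this branch:

```python
import httpx

# The bundled urllib3 transports were removed from the public API by this
# commit; the gist-based transport referenced in docs/advanced.md replaces them.
print(hasattr(httpx, "URLLib3Transport"))       # False
print(hasattr(httpx, "URLLib3ProxyTransport"))  # False
```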
96 changes: 73 additions & 23 deletions httpx/_client.py
@@ -1,9 +1,11 @@
import functools
import typing
import warnings
from types import TracebackType

import httpcore

from .__version__ import __version__
from ._auth import Auth, BasicAuth, FunctionAuth
from ._config import (
DEFAULT_LIMITS,
@@ -17,6 +19,7 @@
create_ssl_context,
)
from ._content_streams import ContentStream
from ._decoders import SUPPORTED_DECODERS
from ._exceptions import (
HTTPCORE_EXC_MAP,
InvalidURL,
@@ -54,6 +57,10 @@
logger = get_logger(__name__)

KEEPALIVE_EXPIRY = 5.0
USER_AGENT = f"python-httpx/{__version__}"
ACCEPT_ENCODING = ", ".join(
[key for key in SUPPORTED_DECODERS.keys() if key != "identity"]
)


class BaseClient:
@@ -73,12 +80,20 @@ def __init__(

self._auth = self._build_auth(auth)
self._params = QueryParams(params)
self._headers = Headers(headers)
self.headers = Headers(headers)
self._cookies = Cookies(cookies)
self._timeout = Timeout(timeout)
self.max_redirects = max_redirects
self._trust_env = trust_env
self._netrc = NetRCInfo()
self._is_closed = True

@property
def is_closed(self) -> bool:
"""
Check if the client is closed.
"""
return self._is_closed

@property
def trust_env(self) -> bool:
@@ -152,7 +167,16 @@ def headers(self) -> Headers:

@headers.setter
def headers(self, headers: HeaderTypes) -> None:
self._headers = Headers(headers)
client_headers = Headers(
{
b"Accept": b"*/*",
b"Accept-Encoding": ACCEPT_ENCODING.encode("ascii"),
b"Connection": b"keep-alive",
b"User-Agent": USER_AGENT.encode("ascii"),
}
)
client_headers.update(headers)
self._headers = client_headers

@property
def cookies(self) -> Cookies:
@@ -290,11 +314,9 @@ def _merge_headers(
Merge a headers argument together with any headers on the client,
to create the headers used for the outgoing request.
"""
if headers or self.headers:
merged_headers = Headers(self.headers)
merged_headers.update(headers)
return merged_headers
return headers
merged_headers = Headers(self.headers)
merged_headers.update(headers)
return merged_headers

def _merge_queryparams(
self, params: QueryParamTypes = None
@@ -696,6 +718,8 @@ def send(
[0]: /advanced/#request-instances
"""
self._is_closed = False

timeout = self.timeout if isinstance(timeout, UnsetType) else Timeout(timeout)

auth = self._build_request_auth(request, auth)
@@ -1026,16 +1050,20 @@ def close(self) -> None:
"""
Close transport and proxies.
"""
self._transport.close()
for proxy in self._proxies.values():
if proxy is not None:
proxy.close()
if not self.is_closed:
self._is_closed = True

self._transport.close()
for proxy in self._proxies.values():
if proxy is not None:
proxy.close()

def __enter__(self) -> "Client":
self._transport.__enter__()
for proxy in self._proxies.values():
if proxy is not None:
proxy.__enter__()
self._is_closed = False
return self

def __exit__(
Expand All @@ -1044,10 +1072,16 @@ def __exit__(
exc_value: BaseException = None,
traceback: TracebackType = None,
) -> None:
self._transport.__exit__(exc_type, exc_value, traceback)
for proxy in self._proxies.values():
if proxy is not None:
proxy.__exit__(exc_type, exc_value, traceback)
if not self.is_closed:
self._is_closed = True

self._transport.__exit__(exc_type, exc_value, traceback)
for proxy in self._proxies.values():
if proxy is not None:
proxy.__exit__(exc_type, exc_value, traceback)

def __del__(self) -> None:
self.close()


class AsyncClient(BaseClient):
Expand Down Expand Up @@ -1302,6 +1336,8 @@ async def send(
[0]: /advanced/#request-instances
"""
self._is_closed = False

timeout = self.timeout if isinstance(timeout, UnsetType) else Timeout(timeout)

auth = self._build_request_auth(request, auth)
@@ -1634,16 +1670,20 @@ async def aclose(self) -> None:
"""
Close transport and proxies.
"""
await self._transport.aclose()
for proxy in self._proxies.values():
if proxy is not None:
await proxy.aclose()
if not self.is_closed:
self._is_closed = True

await self._transport.aclose()
for proxy in self._proxies.values():
if proxy is not None:
await proxy.aclose()

async def __aenter__(self) -> "AsyncClient":
await self._transport.__aenter__()
for proxy in self._proxies.values():
if proxy is not None:
await proxy.__aenter__()
self._is_closed = False
return self

async def __aexit__(
Expand All @@ -1652,10 +1692,20 @@ async def __aexit__(
exc_value: BaseException = None,
traceback: TracebackType = None,
) -> None:
await self._transport.__aexit__(exc_type, exc_value, traceback)
for proxy in self._proxies.values():
if proxy is not None:
await proxy.__aexit__(exc_type, exc_value, traceback)
if not self.is_closed:
self._is_closed = True
await self._transport.__aexit__(exc_type, exc_value, traceback)
for proxy in self._proxies.values():
if proxy is not None:
await proxy.__aexit__(exc_type, exc_value, traceback)

def __del__(self) -> None:
if not self.is_closed:
warnings.warn(
f"Unclosed {self!r}. "
"See https://www.python-httpx.org/async/#opening-and-closing-clients "
"for details."
)


class StreamContextManager:
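
Taken together, the `_client.py` changes mean a client now tracks whether it is open or closed and owns the default request headers. A small sketch of the resulting behaviour, as implied by the diff above (the header values come from the new `headers` setter; the exact `User-Agent` string depends on the installed version):

```python
import httpx

client = httpx.Client()
print(client.is_closed)              # True: nothing sent, no context entered yet
print(client.headers["accept"])      # "*/*" now lives on the client...
print(client.headers["user-agent"])  # ...e.g. "python-httpx/<version>"

with httpx.Client() as client:
    print(client.is_closed)          # False while the context manager is open
print(client.is_closed)              # True again after __exit__
```

On the async side, an `AsyncClient` that is garbage-collected without `await client.aclose()` now emits the warning added in `__del__` rather than going unnoticed.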
2 changes: 1 addition & 1 deletion httpx/_exceptions.py
@@ -55,7 +55,7 @@ class HTTPError(Exception):
response = httpx.get("https://www.example.com")
response.raise_for_status()
except httpx.HTTPError as exc:
print(f"HTTP Exception for {exc.request.url} - {exc.message}")
print(f"HTTP Exception for {exc.request.url} - {exc}")
```
"""

41 changes: 11 additions & 30 deletions httpx/_models.py
@@ -14,7 +14,6 @@
import rfc3986
import rfc3986.exceptions

from .__version__ import __version__
from ._content_streams import ByteStream, ContentStream, encode
from ._decoders import (
SUPPORTED_DECODERS,
@@ -72,8 +71,12 @@ def __init__(self, url: URLTypes = "", params: QueryParamTypes = None) -> None:
# We don't want to normalize relative URLs, since doing so
# removes any leading `../` portion.
self._uri_reference = self._uri_reference.normalize()
else:
elif isinstance(url, URL):
self._uri_reference = url._uri_reference
else:
raise TypeError(
f"Invalid type for url. Expected str or httpx.URL, got {type(url)}"
)

# Add any query parameters, merging with any in the URL if needed.
if params:
@@ -103,13 +106,11 @@ def userinfo(self) -> str:

@property
def username(self) -> str:
userinfo = self._uri_reference.userinfo or ""
return unquote(userinfo.partition(":")[0])
return unquote(self.userinfo.partition(":")[0])

@property
def password(self) -> str:
userinfo = self._uri_reference.userinfo or ""
return unquote(userinfo.partition(":")[2])
return unquote(self.userinfo.partition(":")[2])

@property
def host(self) -> str:
@@ -580,12 +581,6 @@ def getlist(self, key: str, split_commas: bool = False) -> typing.List[str]:
return self.get_list(key, split_commas=split_commas)


USER_AGENT = f"python-httpx/{__version__}"
ACCEPT_ENCODING = ", ".join(
[key for key in SUPPORTED_DECODERS.keys() if key != "identity"]
)


class Request:
def __init__(
self,
@@ -627,26 +622,12 @@ def prepare(self) -> None:
has_content_length = (
"content-length" in self.headers or "transfer-encoding" in self.headers
)
has_user_agent = "user-agent" in self.headers
has_accept = "accept" in self.headers
has_accept_encoding = "accept-encoding" in self.headers
has_connection = "connection" in self.headers

if not has_host:
url = self.url
if url.userinfo:
url = url.copy_with(username=None, password=None)
auto_headers.append((b"host", url.authority.encode("ascii")))

if not has_host and self.url.authority:
host = self.url.copy_with(username=None, password=None).authority
auto_headers.append((b"host", host.encode("ascii")))
if not has_content_length and self.method in ("POST", "PUT", "PATCH"):
auto_headers.append((b"content-length", b"0"))
if not has_user_agent:
auto_headers.append((b"user-agent", USER_AGENT.encode("ascii")))
if not has_accept:
auto_headers.append((b"accept", b"*/*"))
if not has_accept_encoding:
auto_headers.append((b"accept-encoding", ACCEPT_ENCODING.encode()))
if not has_connection:
auto_headers.append((b"connection", b"keep-alive"))

self.headers = Headers(auto_headers + self.headers.raw)

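
One user-visible consequence of the `_models.py` changes is stricter `URL` construction; a short sketch, with the error message taken from the diff above:

```python
import httpx

httpx.URL("https://example.org")             # str is accepted
httpx.URL(httpx.URL("https://example.org"))  # an existing httpx.URL is accepted

try:
    httpx.URL(123)  # anything else now fails fast...
except TypeError as exc:
    # ..."Invalid type for url. Expected str or httpx.URL, got <class 'int'>"
    print(exc)
```

The other notable change here is that `Request.prepare()` no longer injects `User-Agent`, `Accept`, `Accept-Encoding`, or `Connection` headers; those defaults moved to the client-level `headers` setter shown earlier in this diff.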