Version 0.15.0 #1301

Merged
merged 10 commits on Sep 22, 2020
42 changes: 42 additions & 0 deletions CHANGELOG.md
@@ -4,6 +4,48 @@ All notable changes to this project will be documented in this file.

The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/).

## 0.15.0 (September 22nd, 2020)

### Added

* Added support for event hooks. (Pull #1246)
* Added support for authentication flows which require either sync or async I/O. (Pull #1217)
* Added support for monitoring download progress with `response.num_bytes_downloaded`, as sketched after this list. (Pull #1268)
* Added `Request(content=...)` for byte content, instead of overloading `Request(data=...)`. (Pull #1266)
* Added support for all URL components as parameter names when using `url.copy_with(...)`. (Pull #1285)
* Neater split between automatically populated headers on `Request` instances, vs default `client.headers`. (Pull #1248)
* Unclosed `AsyncClient` instances will now raise warnings if garbage collected. (Pull #1197)
* Support `Response(content=..., text=..., html=..., json=...)` for creating usable response instances in code. (Pull #1265, #1297)
* Support instantiating requests from the low-level transport API. (Pull #1293)
* Raise errors on invalid URL types. (Pull #1259)
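
As a rough illustration of a couple of the additions above, the sketch below combines a request event hook with the new download-progress counter. It assumes a plain `httpx.Client`; the URL and the hook body are placeholders rather than anything from this release's test suite.

```python
import httpx

def log_request(request):
    # Event hooks are called with the request (or response) being processed.
    print(f"Requesting: {request.method} {request.url}")

client = httpx.Client(event_hooks={"request": [log_request]})

# Monitor download progress while streaming the response body.
with client.stream("GET", "https://www.example.org/large-download") as response:
    total = int(response.headers.get("Content-Length", 0))
    for chunk in response.iter_bytes():
        print(f"Downloaded {response.num_bytes_downloaded} of {total} bytes")
```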

### Changed

* Cleaned up expected behaviour for URL escaping. `url.path` is now URL escaped (see the sketch after this list). (Pull #1285)
* Cleaned up expected behaviour for bytes vs str in URL components. `url.userinfo` and `url.query` are not URL escaped, and so return bytes. (Pull #1285)
* Drop `url.authority` property in favour of `url.netloc`, since "authority" was semantically incorrect. (Pull #1285)
* Drop `url.full_path` property in favour of `url.raw_path`, for better consistency with other parts of the API. (Pull #1285)
* No longer use the `chardet` library for auto-detecting charsets, instead defaulting to a simpler approach when no charset is specified. (Pull #1269)
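
The sketch below summarises the revised URL component semantics described above. The literal values shown in the comments are illustrative assumptions, not verified output.

```python
import httpx

url = httpx.URL("https://jo:secret@example.org/some path?search=query")

url.path      # str, URL escaped, e.g. "/some%20path"
url.query     # bytes, not URL escaped, e.g. b"search=query"
url.userinfo  # bytes, not URL escaped, e.g. b"jo:secret"
url.raw_path  # replaces the previous `url.full_path`
url.netloc    # replaces the previous `url.authority`
```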

### Fixed

* Swapped ordering of redirects and authentication flow. (Pull #1267)
* `.netrc` lookups should use host, not host+port. (Pull #1298)

### Removed

* The `URLLib3Transport` class no longer exists. We've published it instead as an example of [a custom transport class](https://gist.github.com/florimondmanca/d56764d78d748eb9f73165da388e546e). (Pull #1182)
* Drop `request.timer` attribute, which was being used internally to set `response.elapsed`. (Pull #1249)
* Drop `response.decoder` attribute, which was being used internally. (Pull #1276)
* `Request.prepare()` is now a private method. (Pull #1284)
* The `Headers.getlist()` method had previously been deprecated in favour of `Headers.get_list()`. It is now fully removed.
* The `QueryParams.getlist()` method had previously been deprecated in favour of `QueryParams.get_list()`. It is now fully removed.
* The `URL.is_ssl` property had previously been deprecated in favour of `URL.scheme == "https"`. It is now fully removed.
* The `httpx.PoolLimits` class had previously been deprecated in favour of `httpx.Limits`. It is now fully removed.
* The `max_keepalive` setting had previously been deprecated in favour of the more explicit `max_keepalive_connections`. It is now fully removed.
* The verbose `httpx.Timeout(5.0, connect_timeout=60.0)` style had previously been deprecated in favour of `httpx.Timeout(5.0, connect=60.0)`. It is now fully removed.
* Support for instantiating a timeout config missing some defaults, such as `httpx.Timeout(connect=60.0)`, had previously been deprecated in favour of enforcing a more explicit style, such as `httpx.Timeout(5.0, connect=60.0)`. This is now strictly enforced (see the sketch after this list).
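
A minimal sketch of the now strictly enforced configuration style, replacing the removed forms listed above. The specific values are arbitrary.

```python
import httpx

# A default must be supplied whenever only some timeouts are overridden.
timeout = httpx.Timeout(5.0, connect=60.0)

# Omitting the default now raises a ValueError rather than a warning:
# httpx.Timeout(connect=60.0)

# The keep-alive setting is now named explicitly on httpx.Limits.
limits = httpx.Limits(max_connections=100, max_keepalive_connections=20)

client = httpx.Client(timeout=timeout)
```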

## 0.14.3 (September 2nd, 2020)

### Added
3 changes: 1 addition & 2 deletions httpx/__init__.py
@@ -2,7 +2,7 @@
from ._api import delete, get, head, options, patch, post, put, request, stream
from ._auth import Auth, BasicAuth, DigestAuth
from ._client import AsyncClient, Client
from ._config import Limits, PoolLimits, Proxy, Timeout, create_ssl_context
from ._config import Limits, Proxy, Timeout, create_ssl_context
from ._exceptions import (
CloseError,
ConnectError,
@@ -70,7 +70,6 @@
"NotRedirectResponse",
"options",
"patch",
"PoolLimits",
"PoolTimeout",
"post",
"ProtocolError",
2 changes: 1 addition & 1 deletion httpx/__version__.py
@@ -1,3 +1,3 @@
__title__ = "httpx"
__description__ = "A next generation HTTP client, for Python 3."
__version__ = "0.14.3"
__version__ = "0.15.0"
24 changes: 6 additions & 18 deletions httpx/_client.py
@@ -850,18 +850,12 @@ def _send_single_request(self, request: Request, timeout: Timeout) -> Response:
timer.sync_start()

with map_exceptions(HTTPCORE_EXC_MAP, request=request):
(
http_version,
status_code,
reason_phrase,
headers,
stream,
) = transport.request(
(status_code, headers, stream, ext) = transport.request(
request.method.encode(),
request.url.raw,
headers=request.headers.raw,
stream=request.stream, # type: ignore
timeout=timeout.as_dict(),
ext={"timeout": timeout.as_dict()},
)

def on_close(response: Response) -> None:
@@ -871,9 +865,9 @@ def on_close(response: Response) -> None:

response = Response(
status_code,
http_version=http_version.decode("ascii"),
headers=headers,
stream=stream, # type: ignore
ext=ext,
request=request,
on_close=on_close,
)
@@ -1501,18 +1495,12 @@ async def _send_single_request(
await timer.async_start()

with map_exceptions(HTTPCORE_EXC_MAP, request=request):
(
http_version,
status_code,
reason_phrase,
headers,
stream,
) = await transport.request(
(status_code, headers, stream, ext,) = await transport.arequest(
request.method.encode(),
request.url.raw,
headers=request.headers.raw,
stream=request.stream, # type: ignore
timeout=timeout.as_dict(),
ext={"timeout": timeout.as_dict()},
)

async def on_close(response: Response) -> None:
@@ -1522,9 +1510,9 @@ async def on_close(response: Response) -> None:

response = Response(
status_code,
http_version=http_version.decode("ascii"),
headers=headers,
stream=stream, # type: ignore
ext=ext,
request=request,
on_close=on_close,
)
65 changes: 3 additions & 62 deletions httpx/_config.py
@@ -1,15 +1,14 @@
import os
import ssl
import typing
import warnings
from base64 import b64encode
from pathlib import Path

import certifi

from ._models import URL, Headers
from ._types import CertTypes, HeaderTypes, TimeoutTypes, URLTypes, VerifyTypes
from ._utils import get_ca_bundle_from_env, get_logger, warn_deprecated
from ._utils import get_ca_bundle_from_env, get_logger

DEFAULT_CIPHERS = ":".join(
[
@@ -212,44 +211,7 @@ def __init__(
read: typing.Union[None, float, UnsetType] = UNSET,
write: typing.Union[None, float, UnsetType] = UNSET,
pool: typing.Union[None, float, UnsetType] = UNSET,
# Deprecated aliases.
connect_timeout: typing.Union[None, float, UnsetType] = UNSET,
read_timeout: typing.Union[None, float, UnsetType] = UNSET,
write_timeout: typing.Union[None, float, UnsetType] = UNSET,
pool_timeout: typing.Union[None, float, UnsetType] = UNSET,
):
if not isinstance(connect_timeout, UnsetType):
warn_deprecated(
"httpx.Timeout(..., connect_timeout=...) is deprecated and will "
"raise errors in a future version. "
"Use httpx.Timeout(..., connect=...) instead."
)
connect = connect_timeout

if not isinstance(read_timeout, UnsetType):
warn_deprecated(
"httpx.Timeout(..., read_timeout=...) is deprecated and will "
"raise errors in a future version. "
"Use httpx.Timeout(..., write=...) instead."
)
read = read_timeout

if not isinstance(write_timeout, UnsetType):
warn_deprecated(
"httpx.Timeout(..., write_timeout=...) is deprecated and will "
"raise errors in a future version. "
"Use httpx.Timeout(..., write=...) instead."
)
write = write_timeout

if not isinstance(pool_timeout, UnsetType):
warn_deprecated(
"httpx.Timeout(..., pool_timeout=...) is deprecated and will "
"raise errors in a future version. "
"Use httpx.Timeout(..., pool=...) instead."
)
pool = pool_timeout

if isinstance(timeout, Timeout):
# Passed as a single explicit Timeout.
assert connect is UNSET
@@ -278,13 +240,10 @@ def __init__(
self.pool = pool
else:
if isinstance(timeout, UnsetType):
warnings.warn(
raise ValueError(
"httpx.Timeout must either include a default, or set all "
"four parameters explicitly. Omitting the default argument "
"is deprecated and will raise errors in a future version.",
DeprecationWarning,
"four parameters explicitly."
)
timeout = None
self.connect = timeout if isinstance(connect, UnsetType) else connect
self.read = timeout if isinstance(read, UnsetType) else read
self.write = timeout if isinstance(write, UnsetType) else write
@@ -335,16 +294,7 @@ def __init__(
*,
max_connections: int = None,
max_keepalive_connections: int = None,
# Deprecated parameter naming, in favour of more explicit version:
max_keepalive: int = None,
):
if max_keepalive is not None:
warnings.warn(
"'max_keepalive' is deprecated. Use 'max_keepalive_connections'.",
DeprecationWarning,
)
max_keepalive_connections = max_keepalive

self.max_connections = max_connections
self.max_keepalive_connections = max_keepalive_connections

@@ -363,15 +313,6 @@ def __repr__(self) -> str:
)


class PoolLimits(Limits):
def __init__(self, **kwargs: typing.Any) -> None:
warn_deprecated(
"httpx.PoolLimits(...) is deprecated and will raise errors in the future. "
"Use httpx.Limits(...) instead."
)
super().__init__(**kwargs)


class Proxy:
def __init__(
self, url: URLTypes, *, headers: HeaderTypes = None, mode: str = "DEFAULT"
29 changes: 7 additions & 22 deletions httpx/_models.py
@@ -5,7 +5,6 @@
import json as jsonlib
import typing
import urllib.request
import warnings
from collections.abc import MutableMapping
from http.cookiejar import Cookie, CookieJar
from urllib.parse import parse_qsl, quote, unquote, urlencode
@@ -272,12 +271,6 @@ def raw(self) -> RawURL:
self.raw_path,
)

@property
def is_ssl(self) -> bool:
message = 'URL.is_ssl() is pending deprecation. Use url.scheme == "https"'
warnings.warn(message, DeprecationWarning)
return self.scheme == "https"

@property
def is_absolute_url(self) -> bool:
"""
@@ -525,13 +518,6 @@ def __repr__(self) -> str:
query_string = str(self)
return f"{class_name}({query_string!r})"

def getlist(self, key: typing.Any) -> typing.List[str]:
message = (
"QueryParams.getlist() is pending deprecation. Use QueryParams.get_list()"
)
warnings.warn(message, DeprecationWarning)
return self.get_list(key)


class Headers(typing.MutableMapping[str, str]):
"""
@@ -757,11 +743,6 @@ def __repr__(self) -> str:
return f"{class_name}({as_dict!r}{encoding_str})"
return f"{class_name}({as_list!r}{encoding_str})"

def getlist(self, key: str, split_commas: bool = False) -> typing.List[str]:
message = "Headers.getlist() is pending deprecation. Use Headers.get_list()"
warnings.warn(message, DeprecationWarning)
return self.get_list(key, split_commas=split_commas)


class Request:
def __init__(
@@ -883,19 +864,19 @@ def __init__(
html: str = None,
json: typing.Any = None,
stream: ByteStream = None,
http_version: str = None,
request: Request = None,
ext: dict = None,
history: typing.List["Response"] = None,
on_close: typing.Callable = None,
):
self.status_code = status_code
self.http_version = http_version
self.headers = Headers(headers)

self._request: typing.Optional[Request] = request

self.call_next: typing.Optional[typing.Callable] = None

self.ext = {} if ext is None else ext
self.history = [] if history is None else list(history)
self._on_close = on_close

@@ -964,9 +945,13 @@ def request(self) -> Request:
def request(self, value: Request) -> None:
self._request = value

@property
def http_version(self) -> str:
return self.ext.get("http_version", "HTTP/1.1")

@property
def reason_phrase(self) -> str:
return codes.get_reason_phrase(self.status_code)
return self.ext.get("reason", codes.get_reason_phrase(self.status_code))

@property
def url(self) -> typing.Optional[URL]:
11 changes: 6 additions & 5 deletions httpx/_transports/asgi.py
@@ -1,4 +1,4 @@
from typing import TYPE_CHECKING, Callable, List, Mapping, Optional, Tuple, Union
from typing import TYPE_CHECKING, Callable, List, Optional, Tuple, Union

import httpcore
import sniffio
@@ -67,14 +67,14 @@ def __init__(
self.root_path = root_path
self.client = client

async def request(
async def arequest(
self,
method: bytes,
url: Tuple[bytes, bytes, Optional[int], bytes],
headers: List[Tuple[bytes, bytes]] = None,
stream: httpcore.AsyncByteStream = None,
timeout: Mapping[str, Optional[float]] = None,
) -> Tuple[bytes, int, bytes, List[Tuple[bytes, bytes]], httpcore.AsyncByteStream]:
ext: dict = None,
) -> Tuple[int, List[Tuple[bytes, bytes]], httpcore.AsyncByteStream, dict]:
headers = [] if headers is None else headers
stream = httpcore.PlainByteStream(content=b"") if stream is None else stream

@@ -154,5 +154,6 @@ async def send(message: dict) -> None:
assert response_headers is not None

stream = httpcore.PlainByteStream(content=b"".join(body_parts))
ext = {}

return (b"HTTP/1.1", status_code, b"", response_headers, stream)
return (status_code, response_headers, stream, ext)
11 changes: 4 additions & 7 deletions httpx/_transports/wsgi.py
@@ -64,13 +64,9 @@ def request(
url: typing.Tuple[bytes, bytes, typing.Optional[int], bytes],
headers: typing.List[typing.Tuple[bytes, bytes]] = None,
stream: httpcore.SyncByteStream = None,
timeout: typing.Mapping[str, typing.Optional[float]] = None,
ext: dict = None,
) -> typing.Tuple[
bytes,
int,
bytes,
typing.List[typing.Tuple[bytes, bytes]],
httpcore.SyncByteStream,
int, typing.List[typing.Tuple[bytes, bytes]], httpcore.SyncByteStream, dict
]:
headers = [] if headers is None else headers
stream = httpcore.PlainByteStream(content=b"") if stream is None else stream
@@ -127,5 +123,6 @@ def start_response(
for key, value in seen_response_headers
]
stream = httpcore.IteratorByteStream(iterator=result)
ext = {}

return (b"HTTP/1.1", status_code, b"", headers, stream)
return (status_code, headers, stream, ext)
2 changes: 1 addition & 1 deletion setup.py
@@ -58,7 +58,7 @@ def get_packages(package):
"certifi",
"sniffio",
"rfc3986[idna2008]>=1.3,<2",
"httpcore==0.10.*",
"httpcore==0.11.*",
],
extras_require={
"http2": "h2==3.*",