Replies: 15 comments 61 replies
-
Hello, I've applied the changes listed, updated Pylance, and it seems to be working! I am very happy about this. If I understand correctly, this behavior can be inherited by other wrappers in the same way, by applying (…). In any case, thanks a lot!
-
Thanks, this looks promising. I wonder if for Pydantic it would be more appropriate for constructor arguments to be of type (…).
-
I have tried this out on a reasonably well-sized internal codebase that's currently using attrs. One pain point for us would be the support for converters. For a pseudo-contrived example:

```python
import attr
import enum
from typing import Union

class Foos(enum.Enum):
    foo = "foo"
    bar = "bar"

def parse_foos(value: Union[str, Foos]) -> Foos:
    return Foos(value)

@attr.define
class Data:
    a: Foos = attr.field(converter=parse_foos)
```
Could you comment on the technical feasibility of expanding this spec to include support for converters on fields, perhaps with the following semantics?
I can imagine that this could be reasonably complex, particularly if generics were supported in the converter signatures. I think this would explicitly not be an ask for the kinds of special-case support for the attrs-specific features in the existing mypy plugin, as upstream authors could adapt converter functions to expose the accepted input types.
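To make the request concrete, here is a stdlib-only sketch of the runtime behavior a type checker would need to model (no attrs dependency; `field_with_converter` and `simple_define` are hypothetical stand-ins for `attr.field(converter=...)` and `@attr.define`): the converter widens the type accepted by `__init__`, while the attribute keeps the narrow annotated type.

```python
from typing import Any, Callable, Dict

class _Field:
    def __init__(self, converter: Callable[[Any], Any]) -> None:
        self.converter = converter

def field_with_converter(converter: Callable[[Any], Any]) -> Any:
    # Hypothetical stand-in for attr.field(converter=...)
    return _Field(converter)

def simple_define(cls: type) -> type:
    # Hypothetical stand-in for @attr.define: synthesize an __init__
    # that runs each field's converter before storing the value.
    fields: Dict[str, _Field] = {
        name: val for name, val in vars(cls).items() if isinstance(val, _Field)
    }
    def __init__(self: Any, **kwargs: Any) -> None:
        for name, value in kwargs.items():
            conv = fields.get(name)
            setattr(self, name, conv.converter(value) if conv else value)
    cls.__init__ = __init__  # type: ignore[assignment]
    return cls

@simple_define
class Data:
    a: int = field_with_converter(int)

d = Data(a="3")   # __init__ accepts str because the converter does
assert d.a == 3   # the stored attribute has the annotated type
```

The ask in the comment above is essentially for the checker to type the synthesized `__init__` parameter from the converter's input annotation rather than from the field annotation.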
-
One minor gap is the current spec's ability to represent the (…). I believe this can be easily implemented in the current specification by providing an additional (…).
-
This is awesome! 🤩🎉 Thanks for doing this @erictraut! I think this would improve a lot of use cases for many libraries (including new ones) that could adopt it. I already made a PR to pydantic here: pydantic/pydantic#2721

As a side note, I know there are edge cases for each library, like converters, automatic data conversion, etc. But I think this would already be a huge improvement to the current state of things. For example, I would like to have support for (…).

In any case, with all the corner cases I could imagine would appear for each specific library, this is already amazing, and a great improvement in the developer experience. 🚀
-
Hi @erictraut, I wanted to send this a while ago! First of all, thanks for doing this; it is a great addition! I've started working on adding support for it in Strawberry, a GraphQL library that is inspired by dataclasses. I wanted to know if there are plans for decorating functions and making them modify the (…). For example, with dataclasses:

```python
@dataclass
class PyRight:
    is_awesome: bool = dataclasses.field(init=False, default=True)
```

and in strawberry we have something similar:

```python
def get_is_awesome() -> bool:
    return True

@strawberry.type
class PyRight:
    is_awesome: bool = strawberry.field(resolver=get_is_awesome)
```

Every time there's a resolver, the field behaves as if `init=False`. Do you think this will be supported in the future? The PR I'm working on is here: strawberry-graphql/strawberry#922
-
Hi @erictraut, thanks for bringing up this suggestion! It looks like a great addition. I think the decorator would be a nice way of "enforcing" metadata (in whatever form) to be attached to classes. In an ideal world, I would imagine something like the following:

```python
class Model(Protocol[T]):
    __metadata__: Dict[str, Any]
    __message_type__: Type[T]

    def to_dict(self) -> Dict[str, Any]:
        ...

    @classmethod
    def from_dict(cls: Type[T], data: Dict[str, Any]) -> T:
        ...

    def to_message(self) -> T:
        ...

# The return type could be either explicitly defined in the
# __dataclass_transform__ decorator, or maybe directly inferred
# from the annotation of the decorated function.
@__dataclass_transform__(return_type=Model[Message])
def model(message_type: Type[Message], **kwargs) -> Callable[[Type[Any]], Type[Model[Message]]]:
    def wrapper(cls: Type[Any]) -> Type[Model[Message]]:
        cls.__message_type__ = message_type
        cls.to_dict = obj_to_dict
        cls.from_dict = obj_from_dict
        cls = dataclass(cls)
        return cls
    return wrapper

# Some other module
@model(EmployeeMessage, metadata_key="value")
class Employee:
    id: int
    name: str

Employee(1, "John Doe").to_dict()
Employee.from_dict({"id": 2, "name": "Jane Doe"})
```

With the current implementation, the methods fail the type check using pylance. For a passing type check, all decorated classes need to inherit from the Model protocol. Or would the better solution be to completely ditch the decorator approach and move the metadata to a class attribute (or a nested (…))?
-
Would this proposal also work with ORM-style classes, such as sqlalchemy? One thing I could see being problematic is that in sqlalchemy each field behaves differently depending on whether it's accessed on an instance or on the class. In this case: `Foo.a -> Column` while `Foo().a -> int`. Otherwise I think Foo is more or less a pseudo-declarative class, in that it synthesizes an `__init__(self, a: Optional[int] = None)` method. However, I'm not enough of an expert in the subtleties of sqlalchemy to know whether it would break down in some edge cases... There's a related discussion here: sqlalchemy/sqlalchemy2-stubs#170
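The class-vs-instance split described above comes from the descriptor protocol. A minimal sketch (this `Column` is a toy stand-in, not actual SQLAlchemy code) shows why a single annotation can't capture both access paths:

```python
from typing import Any, Optional

class Column:
    """Toy data descriptor: class access returns the Column object
    itself, instance access returns the stored value."""
    def __set_name__(self, owner: type, name: str) -> None:
        self.name = name

    def __get__(self, obj: Optional[object], objtype: Optional[type] = None) -> Any:
        if obj is None:
            return self                       # Foo.a   -> Column
        return obj.__dict__.get(self.name)    # Foo().a -> stored value

    def __set__(self, obj: object, value: Any) -> None:
        obj.__dict__[self.name] = value

class Foo:
    a = Column()

assert isinstance(Foo.a, Column)  # class-level access
f = Foo()
f.a = 42
assert f.a == 42                  # instance-level access
```

This dual behavior is exactly what `Mapped[...]`-style descriptor stubs exist to express, and it is orthogonal to what `__dataclass_transform__` synthesizes (the `__init__` signature).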
-
Looks like you're talking about "dataclasses with imperative table" here.
There is also a second way: "dataclasses with declarative table".
Are there significant issues in supporting them with pyright?
https://docs.sqlalchemy.org/en/14/orm/mapping_styles.html#example-two-dataclasses-with-declarative-table
…On Tue, Sep 7, 2021 at 6:11 AM layday ***@***.***> wrote:
SQLModel is great if you are a Pydantic user, but not everybody is, and
not everybody would want to add another layer of abstraction on top of
SQLAlchemy and its ORM. To type an ORM class, currently, so that it works
with Pyright, you have to define the table, annotate the attributes keeping
them in sync with table's cols, *and* write your own (typing-only)
constructor, where all arguments are optional. This all ends up looking
something like this, sans relationships:
```python
@mapper_registry.mapped
class Foo:
    __table__ = Table(
        'foo',
        mapper_registry.metadata,
        Column('a', String, primary_key=True),
        Column('b', TZDateTime, nullable=False, server_default=func.now()),
        Column('c', String, primary_key=True),
        Column('d', String, primary_key=True),
    )

    a: Mapped[str]
    b: Mapped[datetime]
    c: Mapped[str]
    d: Mapped[str]

    if TYPE_CHECKING:
        def __init__(
            self,
            *,
            a: str = ...,
            b: datetime = ...,
            c: str = ...,
            d: str = ...,
        ) -> None:
            ...
```
Unsightly and error-prone, and it makes me wonder what the future of ORMs
is. That is, if change should be sought from the typing side of things, or
from the ORMs themselves.
-
I've created a decorator that wraps pydantic's (…): https://gist.github.com/SandyChapman/ffd40a2e46754f1341a0135ab2cb7202
-
I have a related issue with Pyright 1.1.195 and Python 3.8. I defined a decorator (…). This all works well (including default values), but neither the signature nor the hover doc are showing up – even though (…). My best guess is that Pyright doesn't know that (…).
-
Regarding aliases: would it be possible to support the use of either the defined field name or the alias? pydantic uses a config option of `allow_population_by_field_name` for this. For my use case I would like to define, consume, and (optionally) populate the fields using snake case (standardized across the library), but when I convert the model object into a dict (using `.dict(by_alias=True)`) I want the aliased names:

```python
from pydantic import BaseModel, Field

class ExampleModel(BaseModel, allow_population_by_field_name=True):
    field_one: str = Field(..., alias="FieldOne")
    field_two: str = Field(..., alias="FieldTwo")

# pass type check
assert ExampleModel(field_one="foo", field_two="bar").dict(by_alias=True) == {
    "FieldOne": "foo",
    "FieldTwo": "bar",
}
assert ExampleModel(FieldOne="foo", FieldTwo="bar").dict(by_alias=True) == {
    "FieldOne": "foo",
    "FieldTwo": "bar",
}

# fail type check
assert ExampleModel(FieldOne="foo", field_two="bar").dict(by_alias=True) == {
    "FieldOne": "foo",
    "FieldTwo": "bar",
}
```
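The accept-either-name behavior being requested can be sketched without pydantic. The helpers below (`normalize_kwargs` and `to_dict_by_alias` are hypothetical names, not pydantic API) show the runtime mapping a checker would have to mirror: both the field name and its alias resolve to the same attribute.

```python
from typing import Any, Dict

# field name -> alias, as declared via Field(..., alias=...)
ALIASES = {"field_one": "FieldOne", "field_two": "FieldTwo"}

def normalize_kwargs(kwargs: Dict[str, Any]) -> Dict[str, Any]:
    # Accept either the snake_case field name or its alias,
    # mirroring allow_population_by_field_name=True.
    alias_to_field = {alias: field for field, alias in ALIASES.items()}
    return {alias_to_field.get(key, key): value for key, value in kwargs.items()}

def to_dict_by_alias(values: Dict[str, Any]) -> Dict[str, Any]:
    # Mirror .dict(by_alias=True): emit alias keys.
    return {ALIASES.get(field, field): v for field, v in values.items()}

# Mixed field-name and alias input normalizes to one canonical form:
out = to_dict_by_alias(normalize_kwargs({"field_one": "foo", "FieldTwo": "bar"}))
assert out == {"FieldOne": "foo", "FieldTwo": "bar"}
```

For the type checker, supporting this would mean synthesizing an overloaded or doubled parameter list per aliased field, which is presumably why the spec currently picks one name.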
-
I see that the current implementation in pyright also automatically respects `InitVar`:

```python
from dataclasses import InitVar, dataclass
from typing import Any, Callable, Tuple, Type, TypeVar, Union

_T = TypeVar("_T")

def __dataclass_transform__(
    *,
    eq_default: bool = True,
    order_default: bool = False,
    kw_only_default: bool = False,
    field_descriptors: Tuple[Union[type, Callable[..., Any]], ...] = (),
) -> Callable[[_T], _T]:
    return lambda a: a

@__dataclass_transform__()
def dtc() -> Callable[[Type[_T]], Type[_T]]:
    return dataclass()

@dtc()
class CustomerModel:
    id: InitVar[int]
    name: str

    def __post_init__(self) -> None:  # pyright (in this case rightly) complains
        # self.name = self.name + str(id)
        pass

if __name__ == "__main__":
    c = CustomerModel(3, "hi")  # fails at runtime because __post_init__ is wrong
    print(c)
```

In this case, this check happens to be correct, but in other cases it probably isn't, right? For example, it seems attrs treats it differently:

```python
@attr.define
class A:
    b: InitVar[int]
    c: str

a = A(b=3, c="foo")
print(a.b)
```

gives (…).
-
How does (…)?
-
I see that the dataclass_transforms spec has been removed. Has this feature been dropped? Linking #607 since I came here trying to determine how extensible Pyright is.
-
We've received feedback from pyright and pylance users that they would like to see better support for `attrs`, `pydantic`, `django`, `edgedb`, and other libraries that provide semantics similar to `dataclass`. Custom mypy plugins exist for all of these libraries, but these plugins are very specific to mypy and cannot be used in pyright or other Python type checkers or linters.

I've been working on a proposal that aims to provide support for this entire class of libraries in a standardized manner. If successful, this specification can be ratified as a PEP and adopted by the full suite of type checking and linting tools in the Python ecosystem.

I've posted a draft of this spec here, and I welcome feedback on it.

I've also implemented the code to support the current spec. It's included in pylance version 2021.4.2, which was released earlier today. It is also included in pyright 1.1.134. If you want to give it a try with either `attrs` or `pydantic`, please follow the instructions toward the end of the spec.

Note that this spec, and the corresponding implementation, are still in early form and subject to change.
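For readers skimming the thread, the basic shape of the mechanism (using the `__dataclass_transform__` declaration that appears verbatim in a comment above) is a marker decorator that is an identity function at runtime but tells a conforming checker to treat the decorated callable as dataclass-like. A minimal sketch, with `create_model` as a hypothetical library decorator:

```python
import dataclasses
from typing import Any, Callable, Tuple, Type, TypeVar, Union

_T = TypeVar("_T")

def __dataclass_transform__(
    *,
    eq_default: bool = True,
    order_default: bool = False,
    kw_only_default: bool = False,
    field_descriptors: Tuple[Union[type, Callable[..., Any]], ...] = (),
) -> Callable[[_T], _T]:
    # Runtime no-op; type checkers that understand the spec key off the name.
    return lambda a: a

@__dataclass_transform__()
def create_model(cls: Type[_T]) -> Type[_T]:
    # A library's class decorator. A conforming checker synthesizes
    # __init__, __eq__, etc. for classes decorated with it; here we
    # delegate to stdlib dataclasses so the sketch also runs.
    return dataclasses.dataclass(cls)

@create_model
class Customer:
    id: int
    name: str

c = Customer(id=1, name="Ada")
assert (c.id, c.name) == (1, "Ada")
```

The comments above then explore the edges of this model: converters, aliases, resolvers, descriptors, and `InitVar` handling.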