Support for pyspark.sql.Column in dataclasses-json #423
bhupesh-simpledatalabs started this conversation in General
Replies: 1 comment
Hi @bhupesh-simpledatalabs, this question is not really related to the library's functionality, so I'm going to move it to discussions instead. As to "am I doing something wrong", I'd say yes: you can't have a type that is tied to a distributed collection in a file format that is designed for single-node storage.
Hello,
In my dataclasses I am also using the Column type from pyspark.sql. Is it possible to extend dataclasses-json to define JSON encoding and decoding for Column field types as well?
My encoding and decoding functions will look like the below.
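The author's actual snippet is not preserved in this extract. As a hedged illustration only (the `FakeColumn` stand-in, and the idea of serializing a column via a SQL expression string, are assumptions, not the author's code or pyspark's API), such an encoder/decoder pair might look like:

```python
import json

class FakeColumn:
    """Stand-in for pyspark.sql.Column, holding only a SQL expression string."""
    def __init__(self, expr: str):
        self.expr = expr

def column_encoder(col: FakeColumn) -> str:
    # Encode the column as its SQL expression text (an assumption; a real
    # pyspark Column does not expose its expression this directly).
    return col.expr

def column_decoder(raw: str) -> FakeColumn:
    # Rebuild the column from the stored text; with real pyspark this would
    # be something like pyspark.sql.functions.expr(raw).
    return FakeColumn(raw)

encoded = json.dumps(column_encoder(FakeColumn("a + b")))
print(column_decoder(json.loads(encoded)).expr)  # → a + b
```

Note that with a real `Column` the round-trip is lossy at best: a Column is bound to a SparkSession and a DataFrame plan, which is exactly the maintainer's point below about distributed collections versus single-node storage formats.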
I am extending the Column class in Python to add encoder and decoder methods, so that to_json and from_json can be called on Column fields. I also tried specifying these encoder and decoder methods in my dataclass via the below.
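That snippet is also not preserved. For illustration, the general pattern of attaching a per-field encoder through dataclass field metadata can be sketched with the standard library alone (the `encode_expr` helper and the hand-rolled `to_json` walker are hypothetical; with dataclasses-json itself the wiring is `field(metadata=config(encoder=..., decoder=...))` from `dataclasses_json`):

```python
import json
from dataclasses import dataclass, field, fields

def encode_expr(value) -> str:
    # Hypothetical encoder: store a column-like value as plain text.
    return str(value)

@dataclass
class ColumnExpression:
    name: str
    expr: str = field(metadata={"encoder": encode_expr})

def to_json(obj) -> str:
    # Walk the dataclass fields, applying any per-field encoder found in the
    # field metadata before dumping to JSON.
    out = {}
    for f in fields(obj):
        encoder = f.metadata.get("encoder", lambda v: v)
        out[f.name] = encoder(getattr(obj, f.name))
    return json.dumps(out)

print(to_json(ColumnExpression("c1", "a + b")))
```

The decoder side would mirror this: a per-field `decoder` in the metadata applied while reading the parsed JSON back into constructor arguments.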
I am getting the below error when I try to call to_json() on an instance of my own ColumnExpression dataclass.
May I know if I am doing anything wrong here?