When writing a dataframe to a SQL database, the copy activity in ADF fails because the maximum number of characters is exceeded:

`Operation on target Copy all_boards to SQL failed: Failure happened on 'Sink' side. ErrorCode=SqlBulkCopyInvalidColumnLength,...`

This happened with a column containing a string version of a list of dicts. The maximum length in Python was 1870 characters, so the schema definition in the database became `varchar(1870)`. In the CSV the maximum value length was also 1870, both when checking it as a txt file and after loading it back into Python.

However, ADF still raises this error during the copy activity. The error does not occur if we manually set the `text_length` parameter of `df_to_azure()` to 1873.
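A minimal sketch of the workaround, assuming `df_to_azure()` accepts the `text_length` keyword as described above; the dataframe contents, table name, schema, and the `+ 3` padding are illustrative, not the actual pipeline values:

```python
import pandas as pd
from df_to_azure import df_to_azure

# Hypothetical column holding stringified lists of dicts; in the reported
# case the longest value was 1870 characters.
df = pd.DataFrame({"board_items": ['[{"id": 1, "name": "example"}]']})

# Determine the longest string in the problematic column.
max_len = int(df["board_items"].str.len().max())

# Passing a slightly larger text_length (1873 instead of 1870 in the report)
# avoids the SqlBulkCopyInvalidColumnLength error on the ADF sink side.
df_to_azure(
    df=df,
    tablename="all_boards",
    schema="dbo",
    method="create",
    text_length=max_len + 3,  # manual padding as a workaround
)
```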