
Large dataframe cannot be inserted into the ClickHouse table #243

Closed

SergeRomankevich opened this issue Aug 24, 2021 · 1 comment

SergeRomankevich commented Aug 24, 2021

Describe the bug
I am creating a ClickHouse table:

CREATE TABLE Data_BI.Pt (DataVersion String, DeletionMark UInt8, Description String, post_Key String, dog_Key String, doc String, doc_Type String, store_Key String, date_doc Date, Predefined UInt8, PredefinedDataName String) ENGINE = MergeTree PARTITION BY toYYYYMM(date_doc) ORDER BY (post_Key, dog_Key, store_Key) SETTINGS index_granularity = 8192

Inserting the data from the `partion 2009.json` file (306 MB) into the table is super fast. Inserting `partion 2013.json` (874 MB) fails: Python raises no error, the ClickHouse log shows no errors, and ClickHouse reports 0 rows inserted. What is the reason?

```python
import time

import pandas as pd
from clickhouse_driver.client import Client

s = time.time()


def readJSON(ye):
    s0 = time.time()
    f = pd.read_json('/tmp/Part/partion ' + str(ye) + '.json')
    dd = pd.DataFrame.from_records(f['value'])
    # Convert the document date column ('ДатаДокумента') to days since the Unix epoch
    dd['ДатаДокумента'] = pd.to_datetime(dd['ДатаДокумента'], infer_datetime_format=True)
    st = pd.to_datetime('1970-01-01')
    dd['ДатаДокумента'] = (dd['ДатаДокумента'] - st).dt.days
    s1 = time.time()
    s11 = s1 - s0
    print(f'Frame for year {ye} built in {round(s11 // 60)} min {round(s11 % 60)} s.')
    return dd


def insert_srv_etl(df0, ye):
    # ch = Client('srv-etl', settings={'use_numpy': True})
    ch = Client('srv-etl', settings={'use_numpy': True, 'max_insert_block_size': 524288})
    strok = ch.insert_dataframe('INSERT INTO Data_BI.Pt VALUES', df0)
    ch.disconnect()
    print(f'Inserted {strok} rows for year {ye}')


df = readJSON('2013')
insert_srv_etl(df, '2013')
```
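
As a hedged workaround sketch (not from the original report): assuming the silent failure is related to how a very large dataframe is split into insert blocks, the dataframe can be sliced on the client side and inserted chunk by chunk, so each `insert_dataframe` call handles a modest number of rows. `insert_in_chunks` and the chunk size of 100 000 are illustrative choices, not part of the driver's API:

```python
from clickhouse_driver.client import Client


def insert_in_chunks(df, chunk_size=100_000):
    """Insert a large dataframe slice by slice (chunk_size is an arbitrary illustrative value)."""
    ch = Client('srv-etl', settings={'use_numpy': True})
    total = 0
    try:
        # Send one modest slice per INSERT instead of a single huge block
        for start in range(0, len(df), chunk_size):
            chunk = df.iloc[start:start + chunk_size]
            total += ch.insert_dataframe('INSERT INTO Data_BI.Pt VALUES', chunk)
    finally:
        ch.disconnect()
    return total


# Usage with the readJSON() helper from the snippet above:
# print(insert_in_chunks(readJSON('2013')))
```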

Versions

  • pandas 1.3.1
  • clickhouse-driver 0.2.1
  • ClickHouse server version 21.7.2.7 (official build).
  • Python 3.7.3
xzkostyan added a commit that referenced this issue on Sep 19, 2021: "First element was empty and no data was inserted"

xzkostyan added a commit that referenced this issue on Sep 20, 2021: "First element was empty and no data was inserted"
@xzkostyan (Member) commented:

Fix was merged into the master branch.
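
A hedged follow-up sketch, not part of the thread: until a release containing the fix is published, the driver can presumably be installed straight from the repository's master branch (for example via pip's git support), and the installed versions can be sanity-checked before retrying the insert. The `__version__` attributes used below are assumed to be available:

```python
# Quick sanity check of the installed versions before retrying the failing insert.
# Assumes both packages expose __version__ (an assumption for clickhouse_driver).
import clickhouse_driver
import pandas as pd

print('clickhouse-driver', clickhouse_driver.__version__)
print('pandas', pd.__version__)
```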
