cannot INSERT INTO Stream. #2376
I ran into the exact same problem, where the schema is exactly the same but the insert fails:

Incompatible schema between results and sink. Result schema is [CREDENTIAL_ID : VARCHAR, first_name : VARCHAR, last_name : VARCHAR, access_card_id_binary : VARCHAR, is_active : BOOLEAN, effective_date : VARCHAR, expiry_date : VARCHAR, access_levels : ARRAY], but the sink schema is [CREDENTIAL_ID : VARCHAR, first_name : VARCHAR, last_name : VARCHAR, access_card_id_binary : VARCHAR, is_active : BOOLEAN, effective_date : VARCHAR, expiry_date : VARCHAR, access_levels : ARRAY].
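One possibility worth ruling out (my speculation, not something confirmed in this thread): the error message prints access_levels as just ARRAY, without its element type, so the two schemas can look identical in the message while differing underneath. A minimal sketch of how that could happen, using hypothetical stream and topic names:

```sql
-- Hypothetical repro: both columns print as "access_levels : ARRAY" in the
-- error message, but the element types differ, so the schemas do not match.
CREATE STREAM creds_source (access_levels ARRAY<VARCHAR>)
    WITH (kafka_topic='creds_source', value_format='JSON');

CREATE STREAM creds_sink (access_levels ARRAY<INTEGER>)
    WITH (kafka_topic='creds_sink', value_format='JSON');

-- Expected to fail with "Incompatible schema between results and sink".
INSERT INTO creds_sink SELECT * FROM creds_source;
```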
Humm... this is worrying. Are you able to provide steps to recreate, please, i.e. the CREATE statements and the INSERT INTO statement?
I tried adding a test case:

```json
{
  "name": "complex schema",
  "statements": [
    "CREATE STREAM SOURCE (IPROBE_ID BIGINT, ALPROTO_ID BIGINT, USER_PORT BIGINT, METHOD BIGINT, USER_AGENT STRING, REQ_CONTENT_TYPE STRING, HOST STRING, REFERER STRING, VER BIGINT, USER_IP BIGINT, PATH STRING, URL_QUERY STRING, STATUS BIGINT, RESP_CONTENT_LENGTH BIGINT, REQ_CONTENT_LENGTH BIGINT, COOKIE STRING, SERVER STRING, RESP_CTYPE STRING, REQ_TIME BIGINT, RESP_TIME BIGINT, DATA_STATUS BIGINT, POLICYID STRING, REQ_HOUR BIGINT, REGION STRING, APP_ID BIGINT, APP_IP BIGINT, DATA_RESOURCE_URL STRING, DATA_RESOURCE_ID BIGINT, COUNTRY STRING, LEVEL INTEGER, MSH STRING) WITH (kafka_topic='source', value_format='JSON');",
    "CREATE STREAM SINK (IPROBE_ID BIGINT, ALPROTO_ID BIGINT, USER_PORT BIGINT, METHOD BIGINT, USER_AGENT STRING, REQ_CONTENT_TYPE STRING, HOST STRING, REFERER STRING, VER BIGINT, USER_IP BIGINT, PATH STRING, URL_QUERY STRING, STATUS BIGINT, RESP_CONTENT_LENGTH BIGINT, REQ_CONTENT_LENGTH BIGINT, COOKIE STRING, SERVER STRING, RESP_CTYPE STRING, REQ_TIME BIGINT, RESP_TIME BIGINT, DATA_STATUS BIGINT, POLICYID STRING, REQ_HOUR BIGINT, REGION STRING, APP_ID BIGINT, APP_IP BIGINT, DATA_RESOURCE_URL STRING, DATA_RESOURCE_ID BIGINT, COUNTRY STRING, LEVEL INTEGER, MSH STRING) WITH (kafka_topic='sink', value_format='JSON');",
    "INSERT INTO SINK SELECT * FROM SOURCE;"
  ],
  "inputs": [
    {"topic": "source", "key": 0, "value": null, "timestamp": 0}
  ],
  "outputs": [
    {"topic": "sink", "key": 0, "value": null, "timestamp": 0}
  ]
}
```

This test did not fail with the error you encountered, so I'm unable to recreate it. It may be some quirk of how your source and sink streams are being created, hence the request for steps to recreate so I can look into this.

Thanks, Andy
I ran into a similar problem: selecting a subset of values and inserting into another stream that has the same schema.

```sql
insert into sub_stream select NAME, LASTNAME, CREATED, LOCATION from main_stream emit changes;
```

Caused by: io.confluent.ksql.api.client.exception.KsqlClientException: Received 400 response from server: Incompatible schema between results and sink. Error code: 40001
Hi @aurabhi , the behavior you reported is expected: ksqlDB requires the schema of the query result and the schema of the sink to match exactly.
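To illustrate what an exact match looks like, here is a minimal sketch; the column types are assumptions on my part, since the thread does not show sub_stream's actual definition:

```sql
-- Hypothetical sink whose schema matches the projection below exactly:
-- same column names, same order, same types (types assumed here).
CREATE STREAM sub_stream (
    NAME     VARCHAR,
    LASTNAME VARCHAR,
    CREATED  BIGINT,
    LOCATION VARCHAR
) WITH (kafka_topic='sub_stream', value_format='JSON');

INSERT INTO sub_stream
    SELECT NAME, LASTNAME, CREATED, LOCATION
    FROM main_stream
    EMIT CHANGES;
```

If sub_stream was created with extra columns, a different column order, or different types, the INSERT INTO above is rejected with the 40001 error shown earlier.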
I'm going to close this ticket for now, as there is a workaround in your case and the original question has not seen activity in more than a year. Feel free to reopen if this has not addressed your question. Thanks.
Detailed error log: (screenshot omitted)

Schema of stream_merge_data_pipelinedb: (screenshot omitted)

Schema of stream_tran_to_pipeline: (screenshot omitted)

The columns and types of streams stream_merge_data_pipelinedb and stream_tran_to_pipeline are identical, so why can't I insert?
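For anyone hitting the same wall: a suggestion from me (not something proposed in this thread) is to compare the two definitions with DESCRIBE EXTENDED, since subtle differences do not always show up in the error output:

```sql
-- Print the full definition of each stream, including serialization format
-- and key details, to spot differences the error message does not surface.
DESCRIBE EXTENDED stream_merge_data_pipelinedb;
DESCRIBE EXTENDED stream_tran_to_pipeline;
```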