
Error while processing multi-segment file - Following segment redefines not found, Please check fields exist #269

Closed
eapframework opened this issue Mar 27, 2020 · 5 comments
Labels
question Further information is requested

Comments


eapframework commented Mar 27, 2020

I am trying to process a multi-segment file. Please find the copybook attached.

copybook-flap.txt

Code below:

val df = spark.read
  .format("cobol")
  .option("copybook", "copybook-flap.txt")
  .option("pedantic", "false")
  .option("segment_field", "FLAP-MTHD-OVER-RIDE-NR")
  .option("redefine-segment-id-map:0", "FLAP_RECORD.FLAP-ITEM.FLAP-MTHD-OVER-RIDE.FLAP-MTHDS.REDEFINE-STR1 => 1")
  .option("redefine-segment-id-map:1", "FLAP_RECORD.FLAP-ITEM.FLAP-MTHD-OVER-RIDE.FLAP-MTHDS.REDEFINE-STR2 => 2")
  .option("redefine-segment-id-map:2", "FLAP_RECORD.FLAP-ITEM.FLAP-MTHD-OVER-RIDE.FLAP-MTHDS.REDEFINE-STR3 => 3")
  .option("redefine-segment-id-map:3", "FLAP_RECORD.FLAP-ITEM.FLAP-MTHD-OVER-RIDE.FLAP-MTHDS.REDEFINE-STR4 => 4")
  .load("mcy_flap1_Dec19.dat")

I gave the full path to the redefined fields, e.g. FLAP_RECORD.FLAP-ITEM.FLAP-MTHD-OVER-RIDE.FLAP-MTHDS.REDEFINE-STR1, but I am still getting this error:

Following segment redefines not found, Please check fields exist and are redefines/redefined by.

Please help!

@eapframework eapframework added the question Further information is requested label Mar 27, 2020

yruslan commented Mar 27, 2020

Thanks for your request. Will take a look.


yruslan commented Mar 30, 2020

Sorry for the delay. Will get to this soon. At first glance it looks like it is related to the depth of nesting of segment redefines. If that is the case, it is a bug and we will fix it. Will let you know more soon.

@eapframework (Author)

Thanks for the update. I am also working on resolving the issue, with no luck so far. I will update this thread if I manage to resolve it.

@eapframework (Author)

Hi yruslan, I was able to resolve the issue by changing the depth of nesting of the segment redefines and clearing the cached files in my Spark cluster. I have a question: what happens if the segment_field (FLAP-MTHD-OVER-RIDE-NR) has the value 0? Can such records be skipped without allocating bytes to any segment?
Thanks!


yruslan commented Apr 1, 2020

I'm glad you've found a workaround. But I'm going to reopen the issue so we can understand why there is a limitation on the depth of segment redefines, and whether we can remove it.

If the value of the segment id is not in the list of segment redefine mappings, all segment-specific fields will be empty (null) in the dataset, but the record itself won't be skipped.
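A sketch of how such records could be dropped after loading, if they are unwanted. This is an illustration, not part of the maintainer's answer: the copybook, field paths, and file name are taken from the snippet earlier in the thread, and the post-load filter is an ordinary Spark DataFrame operation, not a spark-cobol feature:

```scala
import org.apache.spark.sql.functions.col

// Same copybook and segment options as in the original snippet.
// Records whose FLAP-MTHD-OVER-RIDE-NR value does not appear in any
// redefine-segment-id-map option are still loaded, but with all
// segment-specific fields set to null.
val df = spark.read
  .format("cobol")
  .option("copybook", "copybook-flap.txt")
  .option("pedantic", "false")
  .option("segment_field", "FLAP-MTHD-OVER-RIDE-NR")
  .option("redefine-segment-id-map:0", "FLAP_RECORD.FLAP-ITEM.FLAP-MTHD-OVER-RIDE.FLAP-MTHDS.REDEFINE-STR1 => 1")
  .option("redefine-segment-id-map:1", "FLAP_RECORD.FLAP-ITEM.FLAP-MTHD-OVER-RIDE.FLAP-MTHDS.REDEFINE-STR2 => 2")
  .load("mcy_flap1_Dec19.dat")

// Keep only records that matched a known segment id, e.g. dropping
// records where the segment field is 0:
val knownSegments = df.filter(col("FLAP-MTHD-OVER-RIDE-NR").isin("1", "2"))
```

Whether the segment field is read as a string or a number depends on its PIC clause in the copybook, so the `isin` values may need adjusting accordingly.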
