Add support for COMP-3 numbers without the sign nibble #701
Could you add the 'debug' option and send the HEX value of the field that is incorrectly decoded, and I'll take a look.
Sorry for so many questions, but we have been trying for a long time.
Yes, Cobrix supports packed decimal data. 'debug' is not supposed to change anything, it just creates debug columns. I'm asking you to send an example of HEX values that Cobrix didn't convert properly.
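For reference, a minimal sketch of enabling the debug columns mentioned above. The option name 'debug' is taken from this exchange; the paths and field names are placeholders:

df = (spark.read.format("za.co.absa.cobrix.spark.cobol.source")
      .option("copybook", "dbfs:/path/to/copybook.txt")
      .option("debug", "true")  # adds a <field>_debug column with the raw field bytes as HEX
      .load("dbfs:/path/to/datafile"))

df.select("field1", "field1_debug").show(truncate=False)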
Transaction code is coming as 601, while in the mainframe I see the value as 791. The field type is PIC S9(3)V COMP-3. Another is a date field, PIC S9(7)V COMP-3, which is coming as null in the dataframe, while it should actually come as 240802 in the mainframe. The Cobrix version installed is spark_cobol_2_12_2_7_3_bundle.jar. We have Scala version 2.12 in Databricks.
E.g. when field1 = 601, what is field1_debug?
Got it, thanks.
Makes sense. Yes, I think we can add support for COMP-3 numbers without the sign nibble.
These are the specs for …
Transaction code is defined as PIC S9(3)V COMP-3. The value I see in the mainframe is 791, while in the dataframe it is coming as 601. This is also coming through incorrect :-( . Thanks for the quick reply. The date is defined as below.
Thanks for the field definition. We can add support for COMP-3 numbers without a sign nibble. Just keep in mind that this definition … implies 9 digits, while …
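For context on the digit counts, a plain-Python sketch of the standard packed-decimal size rule (not Cobrix code): two digits per byte, plus one nibble for the sign when it is present.

def comp3_size_bytes(digits, signed=True):
    # two digits fit in each byte; a signed field needs one extra nibble for the sign
    nibbles = digits + (1 if signed else 0)
    return (nibbles + 1) // 2

print(comp3_size_bytes(7))                 # 4 bytes for PIC S9(7)V COMP-3
print(comp3_size_bytes(8, signed=False))   # the same 4 bytes hold 8 digits when there is no sign nibble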
:( |
Checked - parsing of …
When I tried this keeping the field in the copybook unchanged, i.e. PIC S9(3)V COMP-3, in debug it was coming as 601C. After changing the field from COMP-3 to COMP-3U, …
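To illustrate what the sign nibble changes, a plain-Python sketch of packed-decimal decoding (not Cobrix internals): in signed COMP-3 the last nibble is the sign (C/F for plus, D for minus), while in the unsigned variant every nibble is a digit.

def decode_comp3(raw, signed=True):
    # split each byte into two 4-bit nibbles
    nibbles = []
    for b in raw:
        nibbles.append(b >> 4)
        nibbles.append(b & 0x0F)
    sign = 1
    if signed:
        sign_nibble = nibbles.pop()
        if sign_nibble < 0x0A:
            raise ValueError("missing sign nibble")
        sign = -1 if sign_nibble in (0x0B, 0x0D) else 1
    if any(n > 9 for n in nibbles):
        raise ValueError("non-digit nibble in packed data")
    value = 0
    for n in nibbles:
        value = value * 10 + n
    return sign * value

print(decode_comp3(bytes([0x60, 0x1C])))                # 601, matching the 601C debug value above
print(decode_comp3(bytes([0x07, 0x91]), signed=False))  # 791 packed into 2 bytes without a sign nibble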
Maybe you can do something like df.select("failure_field1", "failure_field1_debug").show(false) and send the table here, for each field that is failing for you.
I will double check the 601 with the user, in case he is sending me wrong snapshots. I can't upload the table due to data privacy; here are the values. I printed the first 10, and all are coming as below: Trans_code, Acct_open_dt, Tran_date, Trans_time. Tran date and time are together under a level field called trans-date-time, if it makes any difference.
Looks good. The only issue left then is …
Other than the wrong values above: Value: 00829, Debug: 203030383239
I realized from this example that your data is ASCII, not EBCDIC. The EBCDIC encoding for 0082918 is F0F0F8F2F9F1F8.
It is straightforward. One character is 1 byte.
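A quick way to see the difference in plain Python, using cp037 as one common EBCDIC code page:

text = "0082918"
print(text.encode("ascii").hex().upper())  # 30303832393138 - ASCII digits are 0x30..0x39
print(text.encode("cp037").hex().upper())  # F0F0F8F2F9F1F8 - EBCDIC digits are 0xF0..0xF9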
We are using Cobrix to convert the mainframe EBCDIC file. Below are the problematic data fields:
XXX-TRANSACTION-AMOUNT PIC S9(15)V99 COMP-3
We are not able to convert the fields correctly. I suspect we are running into issues due to the sign field, and the values are coming as NULL.
All the other fields are coming through correctly.
cobolDataframe = (
    spark.read.format("za.co.absa.cobrix.spark.cobol.source")
    .option("copybook", "dbfs:/FileStore/Optis Test/copybook.txt")  # copybook describing the record layout
    .option("record_format", "D")
    .option("is_rdw_big_endian", "true")
    .option("rdw_adjustment", -4)  # adjustment applied to the record length taken from the header
    .load("dbfs:/FileStore/Optis Test/inputfile.txt")
)
Thanks for the help.
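If the debug columns are enabled (via the 'debug' option discussed earlier), a hypothetical way to inspect the raw bytes of the failing field, assuming the dashes in XXX-TRANSACTION-AMOUNT map to underscores in the column name:

cobolDataframe.select("XXX_TRANSACTION_AMOUNT", "XXX_TRANSACTION_AMOUNT_debug").show(10, False)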