
exception with plc_read using pylogix 1.0.5 #257

Closed
ChristopheLaurent opened this issue Nov 28, 2024 · 17 comments
Comments

@ChristopheLaurent

Type of issue

  • Bug

Description of issue

I get an exception when reading 583 tags in a batch.
This error occurs with the latest version, 1.0.5. Going back to my previous version, 0.8.13, it works perfectly well.

Stacktrace

groot-svc | Exception reading 192.168.22.10: unpack_from requires a buffer of at least 49 bytes for unpacking 1 bytes at offset 48 (actual buffer size is 24)
groot-svc | File "/opt/everest/dev-ts-groot/groot_lib.py", line 194, in plc_read
groot-svc | results = comm.Read(batch)
groot-svc | File "/usr/local/lib/python3.10/site-packages/pylogix/eip.py", line 126, in Read
groot-svc | return self._batch_read(tag)
groot-svc | File "/usr/local/lib/python3.10/site-packages/pylogix/eip.py", line 376, in _batch_read
groot-svc | self._get_unknown_types(tags)
groot-svc | File "/usr/local/lib/python3.10/site-packages/pylogix/eip.py", line 1392, in _get_unknown_types
groot-svc | self._multi_read(unk_tags)
groot-svc | File "/usr/local/lib/python3.10/site-packages/pylogix/eip.py", line 428, in _multi_read
groot-svc | status, ret_data = self.conn.send(request)
groot-svc | File "/usr/local/lib/python3.10/site-packages/pylogix/lgx_comm.py", line 69, in send
groot-svc | return self._get_bytes(eip_header, connected)
groot-svc | File "/usr/local/lib/python3.10/site-packages/pylogix/lgx_comm.py", line 177, in _get_bytes
groot-svc | status = unpack_from('<B', ret_data, 48)[0]
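For context on the traceback: struct.unpack_from raises when asked to read past the end of a buffer, which is what happens here when lgx_comm receives a 24-byte reply but expects a status byte at offset 48. A minimal standalone reproduction of that failure mode (the buffer is a stand-in, not a real EIP reply):

```python
import struct

# The PLC returned a short (24-byte) reply, but the EIP layer tries to
# read the status byte at offset 48, as in lgx_comm._get_bytes:
ret_data = bytes(24)  # stand-in for the short reply
try:
    status = struct.unpack_from('<B', ret_data, 48)[0]
except struct.error as exc:
    print(f"struct.error: {exc}")
```

So the exception is a symptom: the interesting question is why the reply was only 24 bytes (an error reply) in the first place.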

Versions

  • pylogix: 1.0.5
  • plc model: ControlLogix
  • python: 3.10
  • OS: Linux alpine
@dmroeder
Owner

Thanks, I'll check this out as soon as I can

@dmroeder
Owner

dmroeder commented Dec 7, 2024

@ChristopheLaurent, I have a solution for this that is almost complete.

@dmroeder dmroeder added the bug label Dec 7, 2024
@dmroeder
Owner

dmroeder commented Dec 9, 2024

@ChristopheLaurent, if you can try the branch bugfix/large-list-read, that would be helpful.

@ChristopheLaurent
Author

ChristopheLaurent commented Dec 9, 2024 via email

@dmroeder
Owner

dmroeder commented Dec 9, 2024

Could I convince you to capture it with wireshark and email the capture to me?

@ammonwk

ammonwk commented Dec 9, 2024

I'm experiencing the same issue.

Grabbing 5 tags sequentially, my program outputs:

Attempting to connect to PLC at 10.1.18.132...
Successfully connected to PLC!
PLC Time: 2024-12-09 12:52:56.359930

Reading tag structure...

=== Available Tags ===
Found 2951 tags

Tag: Program:Preinspection          Type:                 Value: N/A Array: No
Tag: Program:Preinspection.PreInspect Type: VisionInspection Value: b'\x01\x00\x00\x00"\ Array: No
Tag: Program:Preinspection.Preinspect_Fail_Full_Light_Off Type: TIMER           Value: b'\xf8=\r\x00\xfa\x0 Array: No
Tag: Program:Preinspection.Preinspect_Fail_Full_Light_On Type: TIMER           Value: b'{s*\x00\xfa\x00\x0 Array: No
Tag: Program:Preinspection.PreinspectGoodPartsCounter Type: COUNTER         Value: 0 Array: No
done
Time elapsed: 0.03s

But trying to grab them all in a batch using comm.Read(tag_names_to_read) fails with: ERROR: unpack_from requires a buffer of at least 2 bytes for unpacking 2 bytes at offset 0 (actual buffer size is 0)

I subclassed PLC to output some debug info, here's my subclass:

class DebugPLC(PLC):
    def _parse_multi_read(self, data, tags):
        """
        Extract the values from the multi-service message reply
        """
        print("Entering _parse_multi_read")
        print(f"Raw data received: {data!r}")  # Print raw data for inspection

        data = data[46:]
        print(f"Data after removing header (46 bytes): {data!r}")

        try:
            service = unpack_from("<H", data, 0)[0]
            print(f"Service code: {service}")

            status, ext_status = unpack_from("<BB", data, 2)
            print(f"Status: {status}, Extended Status: {ext_status}")

            service_count = unpack_from("<H", data, 4)[0]
            print(f"Service count: {service_count}")

            offsets = [
                unpack_from("<H", data, i * 2 + 6)[0] for i in range(service_count)
            ]
            print(f"Offsets: {offsets}")

            # Adjust offsets to be relative to data start (after header)
            data_start = (
                4 + service_count * 2 + 2
            )  # 4 (header) + service_count * 2 (offsets) + 2 (service count)
            offsets = [offset - data_start for offset in offsets]
            print(f"Adjusted offsets (relative to data start): {offsets}")

            data = data[data_start:]
            print(f"Data after offset adjustment: {data!r}")

            reply = []
            current_offset = 0
            for idx, tag_info in enumerate(tags):
                print(f"\nProcessing tag: {tag_info[0]}")
                tag_name, base_tag, index = parse_tag_name(tag_info[0])
                element_count = tag_info[1]

                # Determine segment end based on next offset or end of data
                if idx + 1 < len(offsets):
                    segment_end = offsets[idx + 1]
                else:
                    segment_end = len(data)

                segment = data[current_offset:segment_end]
                print(f"Segment for tag: {segment!r}")
                current_offset = segment_end

                segment_service = unpack_from("<H", segment, 0)[0]
                print(f"Segment service: {segment_service}")
                segment_status = unpack_from("<B", segment, 2)[0]
                print(f"Segment status: {segment_status}")

                if segment_status == 6:
                    print("Segment status is 6 (partial), attempting to read more data")
                    segment_data_type = unpack_from("<B", segment, 4)[0]
                    ioi = self._build_ioi(tag_name, segment_data_type)
                    self.Offset = len(segment) - 8
                    request = self._add_partial_read_service(ioi, element_count)
                    segment_status, ret_data = self.conn.send(request)
                    print(f"Partial read status: {segment_status}, data: {ret_data!r}")
                    segment += ret_data[54:]

                if segment_status == 0:
                    print("Segment status is 0 (success)")
                    segment_data_type = unpack_from("<B", segment, 4)[0]
                    print(f"Segment data type: {segment_data_type}")
                    self.KnownTags[base_tag] = (segment_data_type, 0)

                    if segment_data_type == 0xA0:  # String
                        print("Parsing string value")
                        value = self._parse_string_value(segment)
                    elif segment_data_type == 0xC4:  # DINT (for counters)
                        print("Parsing DINT value")
                        value = self._parse_dint_values(segment)
                    elif segment_data_type == 0xD3 or bit_of_word(
                        tag_name
                    ):  # BOOL Array or Bit of Word
                        print("Parsing BOOL array or bit of word")
                        values = self._parse_bool_values(segment, segment_data_type)
                        value = self._words_to_bits(tag_name, values, element_count)
                    else:
                        print("Parsing other value type")
                        value = self._parse_other_values(segment, segment_data_type)

                else:
                    print(
                        f"Segment status is not 0, setting value to None. Status: {segment_status}"
                    )
                    value = None

                # value shouldn't be a list if there is only one
                if isinstance(value, list) and len(value) == 1:
                    value = value[0]

                response = [tag_name, value, segment_status]
                reply.append(response)

            print("\nExiting _parse_multi_read")
            return reply
        except Exception as e:
            print(f"ERROR in _parse_multi_read: {e}")
            raise

And here's its output:

Attempting to connect to PLC at 10.1.18.132...
Successfully connected to PLC!
PLC Time: 2024-12-09 12:54:03.176039

Reading tag structure...

=== Available Tags ===
Found 2951 tags

Attempting to read the tags ['Program:Preinspection', 'Program:Preinspection.PreInspect', 'Program:Preinspection.Preinspect_Fail_Full_Light_Off', 'Program:Preinspection.Preinspect_Fail_Full_Light_On', 'Program:Preinspection.PreinspectGoodPartsCounter'] in 
a batch
Entering _parse_multi_read
Raw data received: b'p\x00.\x02\x97\x00!\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\xa1\x00\x04\x00D\xad\x00\x00\xb1\x00\x1a\x02\x07\x01\x8a\x00\x1e\x00\x05\x00\x0c\x00\x10\x00\xd8\x01\xec\x01\x00\x02\xcc\x00\x0f\x00\xcc\x00\x00\x00\xa0\x02\x95\x85\x01\x00\x00\x00"\xe3\x01\x00\xe8\x03\x00\x00\x00\x00\x00\x00\xef\xdf\x01\x00\xe8\x03\x00\x00\x00\x00\x00\x00\xc7\x01\x00\x00\x01\x00\x00\x00\x1e\x00\x00\x00\x01\x00\x00\x00%\xa3\x01\x00\xe8\x03\x00\x00\x00\x00\x00\x00%\xa3\x01\x00\xe8\x03\x00\x00\x00\x00\x00\x00\x87\x01\x00\x00\x01\x00\x00\x00&\x00\x00\x00\x00\x00\x000\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\xa2FW\x00\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\xb9\x80+\xc0`\xea\x00\x00\xdd\x11\x00\x00\x00\x00\x00\x00\xc0\xf1\xff\x7f\x00\x00\x00\x00\xb1\xcb(\x00\xc0\xf1\xff\x7f\x961"\x00\xe7{+ \xc0\xf1\xff\x7f\xc0\xf1\xff\x7f\xe6{+\x00\x90\x01\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00T\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x1f\x00\x00\x00Vt\xff\xff\x1a\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00d\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00Dy+\x00\xfa\x00\x00\x00\x00\x00\x00\x00\xedM+\x00k\x03\x00\x00\x00\x00\x00\x00Ey+\x00\xa4\x06\x00\x00\x00\x00\x00\x00\xc1T+\x00A\n\x00\x00\x00\x00\x00\x00\xb9\x80+\xa0,\x01\x00\x00,\x01\x00\x00\xb9\x80+\xc0\xf4\x01\x00\x00J\x00\x00\x00\x0f\\+\x00\xf4\x01\x00\x00\x00\x00\x00\x00yy+\x002\x00\x00\x00\x00\x00\x00\x00\xbbz+\x00\x88\x13\x00\x00\x00\x00\x00\x00\xb9\x80+\xa0\xc8\x00\x00\x00\xc8\x00\x00\x00\xb9\x80+\xc0\x98:\x00\x00s\x07\x00\x00\x01\x00\x00\x00\x01\x00\x00\x00\x01\x00\x00\x00\x01\x00\x00\x00\x98:\x00\x00\xc1\x00\x0c\x13\xcc\x00\x00\x00\xa0\x02\x83\x0f\xf8=\r\x00\xfa\x00\x00\x
00\x00\x00\x00\x00\xcc\x00\x00\x00\xa0\x02\x83\x0f\x0f\\+\x00\xfa\x00\x00\x00\x00\x00\x00\x00\xcc\x00\x00\x00\xa0\x02\x82\x0f\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'  
Data after removing header (46 bytes): b'\x8a\x00\x1e\x00\x05\x00\x0c\x00\x10\x00\xd8\x01\xec\x01\x00\x02\xcc\x00\x0f\x00\xcc\x00\x00\x00\xa0\x02\x95\x85\x01\x00\x00\x00"\xe3\x01\x00\xe8\x03\x00\x00\x00\x00\x00\x00\xef\xdf\x01\x00\xe8\x03\x00\x00\x00\x00\x00\x00\xc7\x01\x00\x00\x01\x00\x00\x00\x1e\x00\x00\x00\x01\x00\x00\x00%\xa3\x01\x00\xe8\x03\x00\x00\x00\x00\x00\x00%\xa3\x01\x00\xe8\x03\x00\x00\x00\x00\x00\x00\x87\x01\x00\x00\x01\x00\x00\x00&\x00\x00\x00\x00\x00\x000\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\xa2FW\x00\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\xb9\x80+\xc0`\xea\x00\x00\xdd\x11\x00\x00\x00\x00\x00\x00\xc0\xf1\xff\x7f\x00\x00\x00\x00\xb1\xcb(\x00\xc0\xf1\xff\x7f\x961"\x00\xe7{+ \xc0\xf1\xff\x7f\xc0\xf1\xff\x7f\xe6{+\x00\x90\x01\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00T\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x1f\x00\x00\x00Vt\xff\xff\x1a\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00d\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00Dy+\x00\xfa\x00\x00\x00\x00\x00\x00\x00\xedM+\x00k\x03\x00\x00\x00\x00\x00\x00Ey+\x00\xa4\x06\x00\x00\x00\x00\x00\x00\xc1T+\x00A\n\x00\x00\x00\x00\x00\x00\xb9\x80+\xa0,\x01\x00\x00,\x01\x00\x00\xb9\x80+\xc0\xf4\x01\x00\x00J\x00\x00\x00\x0f\\+\x00\xf4\x01\x00\x00\x00\x00\x00\x00yy+\x002\x00\x00\x00\x00\x00\x00\x00\xbbz+\x00\x88\x13\x00\x00\x00\x00\x00\x00\xb9\x80+\xa0\xc8\x00\x00\x00\xc8\x00\x00\x00\xb9\x80+\xc0\x98:\x00\x00s\x07\x00\x00\x01\x00\x00\x00\x01\x00\x00\x00\x01\x00\x00\x00\x01\x00\x00\x00\x98:\x00\x00\xc1\x00\x0c\x13\xcc\x00\x00\x00\xa0\x02\x83\x0f\xf8=\r\x00\xfa\x00\x00\x00\x00\x00\x00\x00\xcc\x00\x00\x00\xa0\x02\x83\x0f\x0f\\+\x00\xfa\x00\x00\x00\x00\x00\x00\x00\xcc\x00\x00\x00\xa0\x02\x82\x0f\x00\x00\x00\x00\x00\x00\x0
0\x00\x00\x00\x00\x00'
Service code: 138
Status: 30, Extended Status: 0
Service count: 5
Offsets: [12, 16, 472, 492, 512]
Adjusted offsets (relative to data start): [-4, 0, 456, 476, 496]
Data after offset adjustment: b'\xcc\x00\x0f\x00\xcc\x00\x00\x00\xa0\x02\x95\x85\x01\x00\x00\x00"\xe3\x01\x00\xe8\x03\x00\x00\x00\x00\x00\x00\xef\xdf\x01\x00\xe8\x03\x00\x00\x00\x00\x00\x00\xc7\x01\x00\x00\x01\x00\x00\x00\x1e\x00\x00\x00\x01\x00\x00\x00%\xa3\x01\x00\xe8\x03\x00\x00\x00\x00\x00\x00%\xa3\x01\x00\xe8\x03\x00\x00\x00\x00\x00\x00\x87\x01\x00\x00\x01\x00\x00\x00&\x00\x00\x00\x00\x00\x000\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\xa2FW\x00\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\xb9\x80+\xc0`\xea\x00\x00\xdd\x11\x00\x00\x00\x00\x00\x00\xc0\xf1\xff\x7f\x00\x00\x00\x00\xb1\xcb(\x00\xc0\xf1\xff\x7f\x961"\x00\xe7{+ \xc0\xf1\xff\x7f\xc0\xf1\xff\x7f\xe6{+\x00\x90\x01\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00T\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x1f\x00\x00\x00Vt\xff\xff\x1a\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00d\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00Dy+\x00\xfa\x00\x00\x00\x00\x00\x00\x00\xedM+\x00k\x03\x00\x00\x00\x00\x00\x00Ey+\x00\xa4\x06\x00\x00\x00\x00\x00\x00\xc1T+\x00A\n\x00\x00\x00\x00\x00\x00\xb9\x80+\xa0,\x01\x00\x00,\x01\x00\x00\xb9\x80+\xc0\xf4\x01\x00\x00J\x00\x00\x00\x0f\\+\x00\xf4\x01\x00\x00\x00\x00\x00\x00yy+\x002\x00\x00\x00\x00\x00\x00\x00\xbbz+\x00\x88\x13\x00\x00\x00\x00\x00\x00\xb9\x80+\xa0\xc8\x00\x00\x00\xc8\x00\x00\x00\xb9\x80+\xc0\x98:\x00\x00s\x07\x00\x00\x01\x00\x00\x00\x01\x00\x00\x00\x01\x00\x00\x00\x01\x00\x00\x00\x98:\x00\x00\xc1\x00\x0c\x13\xcc\x00\x00\x00\xa0\x02\x83\x0f\xf8=\r\x00\xfa\x00\x00\x00\x00\x00\x00\x00\xcc\x00\x00\x00\xa0\x02\x83\x0f\x0f\\+\x00\xfa\x00\x00\x00\x00\x00\x00\x00\xcc\x00\x00\x00\xa0\x02\x82\x0f\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00'

Processing tag: Program:Preinspection
Segment for tag: b''
ERROR in _parse_multi_read: unpack_from requires a buffer of at least 2 bytes for unpacking 2 bytes at offset 0 (actual buffer size is 0)
ERROR: unpack_from requires a buffer of at least 2 bytes for unpacking 2 bytes at offset 0 (actual buffer size is 0)
done
Time elapsed: 0.02s
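The negative adjusted offset (-4) in the output above hints at where the parsing goes wrong: in a CIP Multiple Service Packet reply, each offset is measured from the start of the reply-count word (absolute offset 4, after the 2-byte service code and 2-byte status), not from the end of the offset table. A minimal sketch of slicing a reply that way, using synthetic bytes rather than pylogix's actual parser:

```python
import struct

def split_multi_service_reply(data):
    """Split a CIP Multiple Service Packet reply (EIP header already
    stripped) into one byte segment per embedded service reply.

    Layout: service code (H), status (BB), reply count (H at offset 4),
    then `count` offset words, each measured from the reply-count field.
    """
    count = struct.unpack_from('<H', data, 4)[0]
    base = 4  # offsets are relative to the start of the reply-count word
    offsets = [struct.unpack_from('<H', data, 6 + 2 * i)[0] for i in range(count)]
    bounds = [base + o for o in offsets] + [len(data)]
    return [data[bounds[i]:bounds[i + 1]] for i in range(count)]
```

With the numbers logged above, the first offset (12) would point at absolute offset 16, which is exactly where the first embedded reply (bytes cc 00) begins in the dump.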

Thanks for working on this! And let me know if I can help you with any more information on diagnosing the problem.

@dmroeder
Owner

dmroeder commented Dec 9, 2024

Hmm, I can see this being an issue when including types that are not the basic data types (STRUCTs).

@ChristopheLaurent, do you also have tags that are structures/UDTs? I'll need to do some more testing...
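For readers unfamiliar with the wire format: an atomic tag's read reply carries a 2-byte type code, while a structure is flagged with 0x02A0 followed by a 2-byte structure handle (visible as a0 02 95 85 in the dump above), so a struct's value starts two bytes later. A hedged sketch of that distinction; field layout per the CIP read-reply format, not pylogix's actual code:

```python
import struct

STRUCT_TYPE = 0x02A0  # marker meaning "the value is a structure"

def parse_reply_type(segment):
    """Return (data_type, struct_handle, value_offset) for one read-reply
    segment laid out as: service (H), status (BB), then the type field.

    Atomic types: 2-byte type code, value begins at offset 6.
    Structures: 0x02A0 + 2-byte handle, value begins at offset 8.
    Missing that extra word is the kind of struct-size error at play here.
    """
    data_type = struct.unpack_from('<H', segment, 4)[0]
    if data_type == STRUCT_TYPE:
        handle = struct.unpack_from('<H', segment, 6)[0]
        return (data_type, handle, 8)
    return (data_type, None, 6)
```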

@ammonwk

ammonwk commented Dec 9, 2024

I could email you my WireShark capture, if that would be helpful. Where should I send it to?

@dmroeder
Owner

dmroeder commented Dec 9, 2024

My user name at gmail

@ChristopheLaurent
Author

ChristopheLaurent commented Dec 10, 2024

I think the root cause of my issue is not _parse_multi_read but _multi_read. I modified _multi_read as follows:

    def _multi_read(self, tags):
        """
        Read tags using multi-service messaging
        """
        # generate a list of service requests
        print(f"Entering _multi_read for {len(tags)} tags")
        ret_services = self._generate_read_service_list(tags)

        # format tags to match services
        new_tags = []
        count = 0
        for s in (ret_services):
            temp = []
            for v in s:
                temp.append(tags[count])
                count += 1
            new_tags.append(temp)

        response = []
        for i, services in enumerate(ret_services):
            header = self._build_multi_service_header()
            tag_count = pack("<H", len(new_tags[i]))
            # # calculate the offsets
            current_offset = len(new_tags[i]) * 2 + 2
            offsets = pack("<H", current_offset)
            print (f'service #{i}: {len(new_tags[i])} tags, initial offset = {current_offset}')
            for j in range(len(services)-1):
                current_offset += len(services[j])
                offsets += pack("<H", current_offset)
            print (f'service #{i}: current offset = {current_offset}')

            segments = b''.join(s for s in services)
            request = header + tag_count + offsets + segments
            print (f'service #{i}: request = {request!r}')

            status, ret_data = self.conn.send(request)

            # return error if no data is returned
            if not ret_data:
                print ('No returned data !')
                return [[t, None, status] for t in new_tags[i]]

            print (f'received {len(ret_data)} data')
            response.extend(self._parse_multi_read(ret_data, new_tags[i]))

        return response

The resulting trace shows that the first call to _multi_read works fine:

groot-svc        | Entering _multi_read for 703 tags
groot-svc        | service #0: 20 tags, initial offset = 42
groot-svc        | service #0: current offset = 480
groot-svc        | service #0: request = b'\n\x02 \x02$\x01\x14\x00*\x00B\x00Z\x00p\x00\x88\x00\xa0\x00\xb6\x00\xcc\x00\xe2\x00\xf8\x00\x0e\x01$\x01:\x01P\x01f\x01|\x01\x92\x01\xac\x01\xc4\x01\xe0\x01L\n\x91\x11AFPD ... _ALARM\x00\x01\x00'
groot-svc        | received 236 data
groot-svc        | service #1: 20 tags, initial offset = 42
groot-svc        | service #1: current offset = 592
groot-svc        | service #1: request = b'\n\x02 \x02$\x01\x14\x00*\x00L\x00d\x00\x84\x00\x9e\x00\xbc\x00\xd8\x00\xf6\x00\x14\x01,\x01H\x01f\x01\x88\x01\xa0\x01\xc0\x01\xe0\x01\xf8\x01\x18\x022\x02P\x02L\x0f\x91\x1cAWCM_ ... _ALARM\x01\x00'
groot-svc        | received 241 data
groot-svc        | service #2: 20 tags, initial offset = 42
groot-svc        | service #2: current offset = 596
...
groot-svc        | service #34: 20 tags, initial offset = 42
groot-svc        | service #34: current offset = 482
groot-svc        | service #34: request = b'\n\x02 \x02$\x01\x14\x00*\x00D\x00P\x00d\x00\x80\x00\x94\x00\xae\x00\xc6\x00\xde\x00\xf8\x00\x12\x01.\x01:\x01V\x01r\x01\x86\x01\xa6\x01\xb8\x01\xd0\x01\xe2\x01L\x0b\x91\x14PROCESS ..._next\x00\x01\x00'
groot-svc        | received 271 data
groot-svc        | service #35: 3 tags, initial offset = 8
groot-svc        | service #35: current offset = 44
groot-svc        | service #35: request = b'\n\x02 \x02$\x01\x03\x00\x08\x00\x16\x00,\x00L\x05\x91\x08TEST_INT\x01\x00L\t\x91\x0fWater_Level_Set\x00\x01\x00L\x06\x91\nlocal_DINT\x01\x00'
groot-svc        | received 86 data

Then the second call looks completely different, with the first service generating a huge request:

groot-svc        | Entering _multi_read for 703 tags
groot-svc        | service #0: 400 tags, initial offset = 802
groot-svc        | service #0: current offset = 11832
groot-svc        | service #0: request = b'\n\x02 \x02$\x01\x90\x01"\x03:\x03R\x03h\x03\x80\x03\x98\x03\xae\x03\xc4\x03\xda\x03\xf0\x03\x06\x04\x1c\x042\x04H\x04^\x04t\x04\x8a\x04\xa4\x04\xbc\x04\xd8\x04\xf6\x04\x18\x050\x05P\x05j\x05\x88\x05\xa4\x05\xc2\x05\xe0\x05\xf8\x05\x14\x062\x06T\x06l\x06\x8c\x06\xac\x06\xc4\x06\xe4\x06\xfe\x06\x1c\x078\x07V\x07t\x07\x90\x07\xa6\x07\xc2\x07\xe2\x07\xfa\x07\x16\x084\x08V\x08v\x08\x8e\x08\xae\x08\xca\x08\xe8\x08\x06\t"\t@\tb\t\x82\t\x9e\t\xbc\t\xda\t\xf6\t\x14\n6\nV\nr\n\x90\n\xae\n\xca\n\xe8\n\n\x0b*\x0bF\x0bd\x0b\x82\x0b\x9e\x0b\xba\x0b\xd6\x0b\xf2\x0b\n\x0c"\x0c:\x0cR\x0cj\x0c\x82\x0c\x9a\x0c\xb2\x0c\xca\x0c\xe2\x0c\x08\r2\rZ\r|\r\xa8\r\xd4\r\xec\r\x04\x0e\x1c\x0e4\x0eL\x0ed\x0e\x8a\x0e\xb4\x0e\xdc\x0e\xfe\x0e*\x0fV\x0fn\x0f\x86\x0f\x9e\x0f\xb6\x0f\xce\x0f\xe6\x0f\xfe\x0f\x16\x10.\x10F\x10^\x10v\x10\x8e\x10\xa6\x10\xcc\x10\xf6\x10\x1e\x11@\x11l\x11\x98\x11\xb0\x11\xc8\x11\xe0\x11\xf8\x11\x10\x12(\x12@\x12X\x12p\x12\x88\x12\xa0\x12\xb8\x12\xd0\x12\xe8\x12\x00\x13\x18\x130\x13H\x13n\x13\x98\x13\xc0\x13\xe2\x13\x0e\x14:\x14R\x14j\x14\x82\x14\x9a\x14\xb2\x14\xca\x14\xe2\x14\xfa\x14\x12\x15*\x15B\x15Z\x15r\x15\x8a\x15\xa2\x15\xba\x15\xd2\x15\xea\x15\x02\x16\x1a\x162\x16J\x16b\x16z\x16\x92\x16\xaa\x16\xd0\x16\xfa\x16"\x17D\x17p\x17\x9c\x17\xb4\x17\xcc\x17\xe4\x17\xfc\x17\x14\x18,\x18D\x18\\\x18x\x18\x96\x18\xb8\x18\xd8\x18\xfe\x18 \x19<\x19Z\x19|\x19\x9c\x19\xc2\x19\xe4 ... _TIME\x00\x01\x00'
groot-svc        | Exception reading 703 tags from 192.168.22.10:  unpack_from requires a buffer of at least 49 bytes for unpacking 1 bytes at offset 48 (actual buffer size is 24)
groot-svc        |   File "/opt/everest/dev-ts-groot/groot_lib.py", line 159, in plc_read
groot-svc        |     results = comm.Read(batch)
groot-svc        |   File "/opt/everest/dev-ts-groot/pylogix/eip.py", line 126, in Read
groot-svc        |     return self._batch_read(tag)
groot-svc        |   File "/opt/everest/dev-ts-groot/pylogix/eip.py", line 393, in _batch_read
groot-svc        |     responses += [Response(tag, value, status) for tag, value, status in self._multi_read(current_requests)]
groot-svc        |   File "/opt/everest/dev-ts-groot/pylogix/eip.py", line 432, in _multi_read
groot-svc        |     status, ret_data = self.conn.send(request)
groot-svc        |   File "/opt/everest/dev-ts-groot/pylogix/lgx_comm.py", line 69, in send
groot-svc        |     return self._get_bytes(eip_header, connected)
groot-svc        |   File "/opt/everest/dev-ts-groot/pylogix/lgx_comm.py", line 177, in _get_bytes
groot-svc        |     status = unpack_from('<B', ret_data, 48)[0]

Eventually I get the same exception as before. How can service #0 have 400 tags in the second call to _multi_read, instead of 20 as in the first call? Possibly @ammonwk had a related but different issue?
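If the regression is in how tags get grouped into multi-service requests, a fix would presumably chunk by encoded request size rather than trusting a fixed tag count. A rough, hypothetical illustration of size-based grouping; group_services, the 508-byte limit, and the segment list are all assumptions for the sketch, not pylogix's API:

```python
CONN_SIZE = 508  # illustrative payload limit for one multi-service request

def group_services(segments, limit=CONN_SIZE):
    """Group encoded per-tag read services so each multi-service request
    (count word + offset table + segments) stays under `limit` bytes."""
    groups, current, size = [], [], 2  # 2 bytes for the service-count word
    for seg in segments:
        needed = len(seg) + 2          # segment bytes + its offset word
        if current and size + needed > limit:
            groups.append(current)     # this request is full, start a new one
            current, size = [], 2
        current.append(seg)
        size += needed
    if current:
        groups.append(current)
    return groups
```

Grouping this way, a batch of oversized (struct-bearing) segments can never balloon one request to 400 tags, because the byte budget, not the tag count, closes each group.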

@dmroeder
Owner

I'm working on a fix. The issue has to do with not taking into account the size of structs when trying to unpack the multi-message request. Apologies for the issue, I'll have it fixed as soon as I can.

@ChristopheLaurent
Author

> I'm working on a fix. The issue has to do with not taking into account the size of structs when trying to unpack the multi-message request. Apologies for the issue, I'll have it fixed as soon as I can.

No worries, this is not an emergency for us; we can stay on version 0.8 for the time being. Furthermore, I won't be able to test your fix before the 18th of December. Thanks a lot for your support!

@dmroeder
Owner

@ammonwk, are you able to run the latest changes on the bugfix branch?

@ammonwk

ammonwk commented Dec 17, 2024 via email

@dmroeder
Owner

Thanks for testing. How many bytes are the larger data types?

@ChristopheLaurent
Author

I downloaded the branch large-list-read and I confirm that your fix works well in my case. I could read all my tags without any issues. Thanks a lot for your support. Please let me know which version will contain this fix.

@dmroeder
Owner

I believe both issues are resolved in the branch. @ammonwk are you able to try it?
