opcua-asyncio
[ua.uatypes.String] used baseclass
Most types in ua.uatypes, for example CharArray and Float, are derived from a matching Python datatype. But this isn't the case for ua.uatypes.String. In #609 the typedef was changed from class String(str): to class String:.
Not sure why this was done and whether the original reason still applies (@oroulet ?). If I change it back and run pytest, no new errors are introduced. It even fixes an existing issue in the tests:
./tests/test_common.py::test_alias[client] Failed: [undefined]AttributeError: 'String' object has no attribute 'encode'
Not having str as a base class makes using the Python typing system hard:
my_string: ua.String = ""  # invalid according to typing
my_string: ua.String = ua.String()  # quite unusable, returns an empty ua.String object
What about changing it back to class String(str):?
I think the problem is that in OPC UA a string is not a Python string: a ua.String can be Null (None), empty ('') or any str content. So if we are transparent, then a ua.String is like Optional[str]. The problem is that servers/clients interpret None and empty differently, so we can't just convert a Null to empty without consequences. I don't know how we can describe that in a Python class.
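A minimal sketch of that Optional[str] view (variable names are just for illustration):

from typing import Optional

null_string: Optional[str] = None    # UA Null string
empty_string: Optional[str] = ""     # UA empty (zero-length) string
value_string: Optional[str] = "abc"  # UA string with content

# None and '' compare unequal in Python, so converting Null to empty
# (or back) is observable by the communication peer.
assert null_string != empty_string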
@schroeder- are you aware of an existing test where this behaviour is demonstrated/tested?
The current definition class String: also can't hold these values. And in the places where a ua.String is expected, plain str is currently used, for example in the struct dataclasses. So the current implementation also doesn't seem entirely strict.
There is a test for it, but it doesn't use a struct, so no linter will find this. https://github.com/FreeOpcUa/opcua-asyncio/blob/602249d9a02dadfa6e1e82bd900a2064854d50c1/tests/test_common.py#L481-L491
Here is the decoder/encoder: https://github.com/FreeOpcUa/opcua-asyncio/blob/602249d9a02dadfa6e1e82bd900a2064854d50c1/asyncua/ua/ua_binary.py#L64-L76
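For context, the wire format behind that encoder/decoder is a length-prefixed UTF-8 byte sequence where a length of -1 encodes Null (OPC UA Part 6). A minimal sketch, not the library's actual code:

import struct

def pack_string(value):
    # OPC UA binary encoding: 4-byte little-endian signed length prefix;
    # a length of -1 encodes a Null string, 0 an empty string.
    if value is None:
        return struct.pack("<i", -1)
    data = value.encode("utf-8")
    return struct.pack("<i", len(data)) + data

def unpack_string(data):
    # Inverse: -1 decodes to None, 0 to '', otherwise UTF-8 content.
    (length,) = struct.unpack("<i", data[:4])
    if length == -1:
        return None
    return data[4:4 + length].decode("utf-8")

assert unpack_string(pack_string(None)) is None
assert unpack_string(pack_string("")) == ""
assert unpack_string(pack_string("abc")) == "abc"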
Created a test for the None/''/'value' behaviour below. @schroeder- can you review whether the test below is correct?
Aside from correct Python typing, the results for class String: and class String(str): are the same. So it looks like using class String(str): will not break anything.
The nodeset:
<UADataType NodeId="ns=1;i=6101" BrowseName="4:MyCrazyStruct">
    <DisplayName>MyCrazyStruct</DisplayName>
    <Description>My Description</Description>
    <Definition Name="MyCrazyStruct">
        <Field Name="MyUInt32" DataType="i=7" ValueRank="0" ArrayDimensions="1" />
        <Field Name="MyString" DataType="String" />
        <Field IsOptional="true" Name="MyOptionalString" DataType="String" />
    </Definition>
    <References>
        <Reference ReferenceType="HasSubtype" IsForward="false">i=22</Reference>
        <Reference ReferenceType="HasEncoding">ns=1;i=6102</Reference>
    </References>
</UADataType>
The test itself:
from asyncua import ua
from asyncua.ua.ua_binary import extensionobject_from_binary, extensionobject_to_binary

async def test_custom_struct_with_string_import(opc):
    await opc.opc.import_xml("tests/custom_struct_string.xml")
    await opc.opc.load_data_type_definitions()

    # Round trip with content in all string fields
    my_struct_src: ua.MyCrazyStruct = ua.MyCrazyStruct(MyUInt32=32, MyString="ABC", MyOptionalString="123")
    data = extensionobject_to_binary(my_struct_src)
    my_struct_dst = extensionobject_from_binary(ua.utils.Buffer(data))
    assert my_struct_dst.MyString == "ABC"
    assert my_struct_dst.MyOptionalString == "123"

    # Round trip with empty strings
    my_struct_src = ua.MyCrazyStruct(MyUInt32=32, MyString="", MyOptionalString="")
    data = extensionobject_to_binary(my_struct_src)
    my_struct_dst = extensionobject_from_binary(ua.utils.Buffer(data))
    assert my_struct_dst.MyString == ""
    assert my_struct_dst.MyOptionalString == ""

    # Round trip with Null (None) strings
    my_struct_src = ua.MyCrazyStruct(MyUInt32=32, MyString=None, MyOptionalString=None)
    data = extensionobject_to_binary(my_struct_src)
    my_struct_dst = extensionobject_from_binary(ua.utils.Buffer(data))
    assert my_struct_dst.MyString is None
    assert my_struct_dst.MyOptionalString is None
No need to create another class, just use one of the auto-generated ua classes:
async def test_custom_struct_with_string_import(opc):
    my_struct_src: ua.IdentityMappingRuleType = ua.IdentityMappingRuleType()
    data = extensionobject_to_binary(my_struct_src)
    my_struct_dst = extensionobject_from_binary(ua.utils.Buffer(data))
    assert my_struct_src.Criteria is None
Also, optional fields have nothing to do with strings.
If it is only about typing, then neither class String: nor class String(str): gives us anything, because either None or str is broken in typing. The same issue with String also applies to ua.Boolean.
The ua.String type is only one of the fixes; changes to get_default_value are also made. IMHO this gives a more natural behaviour for strings, and also for optional fields in structs in general:
# Able to use the default constructor for a valid string:
assert ua.String() == ''  # < Error: with the default implementation this is a TypeError
assert ua.String('') == ''  # < Error: with the default implementation this is a TypeError
assert ua.String('123') == '123'  # < Error: with the default implementation this is a TypeError

my_struct = ua_mystruct.MyCrazyStruct(MyUInt32=[ua.UInt32(32)])
# Mandatory field; expect a valid default value, not None:
assert my_struct.MyString == ''  # < Error: with the default implementation this is an AssertionError, value is None
# Optional field; expect an empty field in the form of None:
assert my_struct.MyOptionalString is None  # < Error: with the default implementation this is an AssertionError, value is a ua.String object

# Expect a type warning about None for a mandatory field (requires a generated
# extension library, instead of on-the-fly generation with load_data_type_definitions):
ua_mystruct.MyCrazyStruct(MyUInt32=[ua.UInt32(32)], MyString=None)
The standard load_data_type_definitions generates code for mandatory string fields like:
MyString: ua.String = None  # < issue: mandatory field
This is invalid typing-wise (I don't want a None for mandatory fields). Because get_default_value is changed, the following is now generated:
MyString: ua.String = ua.String()
And optional fields still have Optional[...].
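Put together, the generated code for the example struct would then look roughly like this (a hypothetical, simplified sketch, not the exact generator output):

from dataclasses import dataclass, field
from typing import List, Optional
from asyncua import ua

@dataclass
class MyCrazyStruct:
    # sketch of generated code, assuming the changed get_default_value
    MyUInt32: List[ua.UInt32] = field(default_factory=list)
    MyString: ua.String = ua.String()  # mandatory field: valid empty default instead of None
    MyOptionalString: Optional[ua.String] = None  # optional field: None means absent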
I get your view, which is a Python-coloured view. In reality you communicate with servers/clients that use other SDKs, which distinguish between a null string and an empty string, and can handle null and empty differently. So I don't know if we should lose the null/empty distinction. I have no problem with the change for now, but it could lead to interoperability issues in the future.
My main language is C/C++ and I'm very aware of the binary mapping of ua.String. But the standard, Part 6, section 5.1.9, is very clear about this part:
5.1.9 Null, Empty and Zero-Length Arrays
The terms null, empty and zero-length are used to describe array values (Strings are arrays of characters and ByteStrings are arrays of Bytes for purposes of this discussion). A null array has no value. A zero-length or empty array is an array with 0 elements. Some DataEncodings will allow the distinction to be preserved on the wire, however, not all DevelopmentPlatforms will be able to preserve the distinction. For this reason, null, empty and zero-length arrays are semantically the same for all DataEncodings. Decoders shall be able to handle all variations supported by the DataEncoding, however, decoders are not required to preserve the distinction. When testing for equality, applications shall treat null and empty arrays as equal. When a DevelopmentPlatform supports the distinction, writing and reading back an array value may result in a null array becoming an empty array or vice versa.
The way I interpret it, I don't think we will run into interoperability issues.
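That equality rule could be captured with a small helper (a sketch; ua_string_equal is a hypothetical name, not part of the library):

def ua_string_equal(a, b):
    # Per Part 6, 5.1.9: applications shall treat null (None) and
    # empty ('') strings as equal when testing for equality.
    return (a or "") == (b or "")

assert ua_string_equal(None, "")
assert ua_string_equal("abc", "abc")
assert not ua_string_equal(None, "abc")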
I am OK with the change. But maybe we wait for @oroulet, because he changed the code some time ago.
I do not remember why I made these changes. Usually I try to write tests, so hopefully if you change something and it does not break any tests, it should be fine... And even better if you add more tests.
(My comment from a few days ago no longer shows up for me here, so here I go again.) In Python 3.10, writing unions as type aliases with | was added (PEP 604). So it could be written as
String = str | None
Would this be a possibility? Is it feasible to raise the required Python version to 3.10? Other types (Float, ...) could then also be expressed as a type alias.
It's not possible, because it would not change anything: the reason this issue was started is that you can't just write ua.String() or ua.String('ABC'). Also, we use the type information for serialization and deserialization, and this alias would not work without a massive rewrite.
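A quick illustration of the first point (assuming Python 3.10+):

String = str | None  # PEP 604 union alias

s: String = "ABC"  # fine as a type annotation

try:
    String("ABC")  # but a union type cannot be instantiated
except TypeError as exc:
    print(exc)  # e.g. "'types.UnionType' object is not callable"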
I think we could go forward with @bitkeeper's idea: interpret None as an empty string and use this:

class String(str):
    pass
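A short demonstration of the resulting behaviour, assuming the subclass definition above:

s = String("ABC")
assert s == "ABC"          # compares equal to plain str values
assert isinstance(s, str)  # usable anywhere a str is expected
assert String() == ""      # the default constructor yields a valid empty string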
Implemented in #1224, closing the issue.