### Describe the bug
With a deserializer built using the `WithAttemptingUnquotedStringTypeDeserialization` option and generic `Dictionary<object, object>` deserialization, hexadecimal values can inappropriately be converted into negative values.
### To Reproduce
The issue can be seen clearly in this .NET Fiddle: https://dotnetfiddle.net/Y5PPJq (embedded at the bottom here as well).
The simplest reproduction is to use a value between 0x8000 and 0xFFFF (the range from 2^15 to 2^16 - 1). These values convert successfully via `Convert.ToInt16()`, but the conversion interprets the high-order bit as a sign bit, so the result comes back negative. The same thing happens at `Int32.MaxValue + 1`.
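For illustration, this is the documented two's-complement behavior of the `Convert` methods when given a non-decimal base; a standalone snippet, not YamlDotNet code:

```csharp
using System;

// With fromBase 16, Convert.ToInt16 accepts the 0x prefix but treats the
// high-order bit as a sign bit, so values in 0x8000-0xFFFF come back negative:
short a = Convert.ToInt16("0x8000", 16); // -32768, not 32768
short b = Convert.ToInt16("0xFFFF", 16); // -1, not 65535
Console.WriteLine($"{a} {b}");
```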
### Resolution
I would be glad to contribute a fix, but I don't want to spend time creating a pull request if it's not in keeping with your intent. I feel the appropriate solution is simply to use `UInt64` rather than trying to use the smallest possible type: this handles the most common cases and is unambiguous, because YAML hex numbers have no concept of how wide they are.
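A minimal sketch of that proposal (the helper name `ParseHexScalar` is mine, not anything in the library):

```csharp
using System;

// Sketch of the proposed fix: always parse hex scalars into the widest
// unsigned type instead of the smallest signed type that appears to fit.
static ulong ParseHexScalar(string scalar) =>
    Convert.ToUInt64(scalar, 16); // the 0x prefix is accepted for base 16

Console.WriteLine(ParseHexScalar("0xFFFF"));     // 65535 (not -1)
Console.WriteLine(ParseHexScalar("0x80000000")); // 2147483648 (not negative)
```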
### Workaround
In my use case I've worked around the issue by writing an `INodeDeserializer` explicitly for hexadecimal values and using the `.WithNodeDeserializer` method to register it in place of (wrapping) the `ScalarNodeDeserializer`; a sketch of the approach follows.
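A sketch of such a workaround, with my own class name, registered ahead of the built-in scalar handling rather than wrapping it; the `INodeDeserializer` signature shown here matches older YamlDotNet releases (newer versions add a parameter):

```csharp
using System;
using System.Collections.Generic;
using System.Globalization;
using YamlDotNet.Core;
using YamlDotNet.Core.Events;
using YamlDotNet.Serialization;
using YamlDotNet.Serialization.NodeDeserializers;

// Claims plain (unquoted) scalars that start with "0x" and parses them as
// UInt64; everything else falls through to the remaining deserializers.
public sealed class HexScalarNodeDeserializer : INodeDeserializer
{
    public bool Deserialize(IParser parser, Type expectedType,
        Func<IParser, Type, object> nestedObjectDeserializer, out object value)
    {
        value = null;
        if (expectedType == typeof(object)
            && parser.Current is Scalar scalar
            && scalar.Style == ScalarStyle.Plain
            && scalar.Value.StartsWith("0x", StringComparison.OrdinalIgnoreCase))
        {
            value = ulong.Parse(scalar.Value.Substring(2), NumberStyles.HexNumber);
            parser.MoveNext(); // consume the scalar event we just handled
            return true;
        }
        return false;
    }
}

public static class Program
{
    public static void Main()
    {
        var deserializer = new DeserializerBuilder()
            .WithAttemptingUnquotedStringTypeDeserialization()
            .WithNodeDeserializer(new HexScalarNodeDeserializer(),
                s => s.Before<ScalarNodeDeserializer>()) // run before the built-in handling
            .Build();

        var dict = deserializer.Deserialize<Dictionary<object, object>>("value: 0xFFFF");
        Console.WriteLine(dict["value"]); // 65535
    }
}
```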
### Extra Bonus Bug
The octal implementation is entirely incorrect as well, but for a different reason: `Convert.ToInt16` (and the other `Convert` methods) won't convert a string with the `0o` prefix; the input has to be bare digits matching the regular expression `[0-7]+`. The regular expression for octal is also copied straight from the hex code, which is a third issue, but one entirely hidden by the complete failure to parse anything carrying the `0o` prefix.
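Again for illustration, standalone `Convert` behavior rather than library code:

```csharp
using System;

Console.WriteLine(Convert.ToInt16("0x1FF", 16)); // 511: the 0x prefix is allowed for base 16
Console.WriteLine(Convert.ToInt16("777", 8));    // 511: base-8 input must be bare [0-7] digits

try
{
    Convert.ToInt16("0o777", 8); // there is no equivalent 0o prefix support for base 8
}
catch (FormatException)
{
    Console.WriteLine("0o777 is not parsable with fromBase 8");
}
```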
### Code example showing the bug
Fiddle code also pasted here for posterity:
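A minimal sketch in the Fiddle's spirit, rather than the verbatim listing (that is at the link above); it assumes only YamlDotNet's `DeserializerBuilder` and the option named earlier:

```csharp
using System;
using System.Collections.Generic;
using YamlDotNet.Serialization;

public static class Repro
{
    public static void Main()
    {
        var deserializer = new DeserializerBuilder()
            .WithAttemptingUnquotedStringTypeDeserialization()
            .Build();

        // Each of these plain scalars should deserialize as a positive number.
        const string yaml = "ok: 0x7FFF\nflipped: 0xFFFF\nwide: 0x80000000";
        var dict = deserializer.Deserialize<Dictionary<object, object>>(yaml);

        Console.WriteLine(dict["ok"]);      // 32767, as expected
        Console.WriteLine(dict["flipped"]); // bug: a negative value instead of 65535
        Console.WriteLine(dict["wide"]);    // bug: a negative value instead of 2147483648
    }
}
```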