Option to disallow duplicate JSON properties #115856
Conversation
Tagging subscribers to this area: @dotnet/area-system-text-json, @gregsdennis
Honestly, I don't believe this is possible with the current design of `Utf8JsonReader`:

```csharp
Utf8JsonReader checkpoint = reader;
var value = JsonSerializer.Deserialize<T>(ref reader);
reader = checkpoint; // resets the reader state
```

For this reason, I think we need to make it strictly a deserialization-level concern.
I think we want to make this a fail-fast operation. Specifically for
I suspect you might need to implement this on a case-by-case basis. For known mutable dictionary types you can rely on the result itself to track duplicates, but for immutable or frozen dictionaries (which IIRC have construction methods with overwrite semantics) you probably need to enlist an auxiliary dictionary type to do so (it might help if you could make this a pooled object that can be reused across serializations).
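The auxiliary-tracking idea could look roughly like this (a sketch, not this PR's implementation; `ImmutableDictionary` builder APIs are real BCL members, but the helper shape is hypothetical):

```csharp
using System.Collections.Generic;
using System.Collections.Immutable;
using System.Text.Json;

static class DedupSketch
{
    // Sketch only: the immutable builder's indexer has overwrite semantics,
    // so an auxiliary HashSet is needed to detect keys already seen.
    public static ImmutableDictionary<string, int> BuildChecked(
        IEnumerable<KeyValuePair<string, int>> parsedPairs)
    {
        var builder = ImmutableDictionary.CreateBuilder<string, int>();
        var seen = new HashSet<string>(); // auxiliary duplicate tracker

        foreach (KeyValuePair<string, int> pair in parsedPairs)
        {
            if (!seen.Add(pair.Key))
            {
                throw new JsonException($"Duplicate property '{pair.Key}'.");
            }
            builder[pair.Key] = pair.Value; // would silently overwrite without the check
        }

        return builder.ToImmutable();
    }
}
```

Pooling the `HashSet` across deserializations, as suggested above, would avoid the per-object allocation.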
```csharp
}
else
{
    if (jObject.ContainsKey(propertyName))
```
Is it possible we could avoid duplicate lookups here using a TryAdd-like operation?
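A `TryAdd`-style operation would fold the lookup and the insert into a single hash probe, e.g. (a sketch against a plain `Dictionary`, standing in for the converter's backing object):

```csharp
using System.Collections.Generic;
using System.Text.Json;

static class TryAddSketch
{
    // Double lookup: ContainsKey probes the table, then Add probes it again.
    public static void AddCheckedTwice(
        Dictionary<string, object?> jObject, string name, object? value)
    {
        if (jObject.ContainsKey(name))
            throw new JsonException($"Duplicate property '{name}'.");
        jObject.Add(name, value);
    }

    // Single lookup: TryAdd probes once and reports whether the key was new.
    public static void AddCheckedOnce(
        Dictionary<string, object?> jObject, string name, object? value)
    {
        if (!jObject.TryAdd(name, value))
            throw new JsonException($"Duplicate property '{name}'.");
    }
}
```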
Still in PR: #111229
...tem/Text/Json/Serialization/Converters/Object/ObjectWithParameterizedConstructorConverter.cs
src/libraries/System.Text.Json/src/System/Text/Json/Serialization/ReadStackFrame.cs
```csharp
// False means the property is not set (has not yet occurred in the payload).
// The length of the BitArray equals the number of non-extension properties.
// Every JsonPropertyInfo has a PropertyIndex property that maps to an index in this BitArray.
public BitArray? AssignedProperties;
```
One optimization we left out when adding required-property validation is avoiding allocations for a small number of properties. You could do that by implementing a `ValueBitArray` struct that uses an int to track objects with fewer than 32 properties and falls back to allocating a regular `BitArray` for anything bigger. Not required for this PR, but nice to have nonetheless.
Perhaps a simple solution like this could be good enough to significantly reduce allocations:

```csharp
struct OptimizedBitArray
{
    private ulong _fastBits;      // inline storage for up to 64 properties
    private BitArray? _slowArray; // allocated only for larger objects

    public OptimizedBitArray(int numberOfProperties)
    {
        if (numberOfProperties > 64)
        {
            _slowArray = new BitArray(numberOfProperties);
        }
    }

    public void SetBit(int index)
    {
        if (_slowArray != null)
        {
            _slowArray[index] = true;
        }
        else
        {
            _fastBits |= 1UL << index;
        }
    }

    public bool GetBit(int index) =>
        _slowArray != null ? _slowArray[index] : (_fastBits & (1UL << index)) != 0;
}
```
But honestly, I'd check whether this actually makes any difference in an end-to-end scenario.
```csharp
[InlineData("""{ "1": 0   , "1": 1    }""")]
[InlineData("""{ "1": null, "1": null }""")]
[InlineData("""{ "1": "a" , "1": null }""")]
[InlineData("""{ "1": null, "1": "b"  }""")]
```
Perhaps I missed them, but I think there are a couple of test cases that would be nice to see:
- a recursive structure with duplicate properties in one of the inner objects, both positive and negative test cases
- two C# properties with the same name but different casing
- are duplicates allowed inside a dictionary? Regardless of the answer, there should be a test case
- arrays: both a duplicated entry whose value is an array, and an item in the array that itself has duplicated entries (it might also be worth adding a test case showing we don't false-trigger)
- if there is a per-type setting on JsonTypeInfo, I'd imagine this should be exercised (I think there is not, but I have only skimmed through the PR)
- for arrays, another interesting case is to create a pre-populated list, add properties, and see whether it appends when the property is encountered for the first time. Is that expected with this setting on? (I'd imagine that if someone doesn't like the dups they might want to erase the content first; it's a bit of the user shooting themselves in the foot, so I'm not sure about doing anything special, but it might be worth defining expectations.)
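The dictionary case from the list above might be exercised along these lines (a sketch; `AllowsDuplicateProperties` is this PR's draft option name and may change before shipping):

```csharp
using System.Collections.Generic;
using System.Text.Json;
using Xunit;

public class DuplicatePropertyTests
{
    [Theory]
    [InlineData("""{ "a": 1, "a": 2 }""")]
    [InlineData("""{ "a": 1, "b": 2, "a": 3 }""")]
    public void Dictionary_WithDuplicateKeys_Throws(string json)
    {
        // Option name taken from this PR's draft API; the shipped name may differ.
        var options = new JsonSerializerOptions { AllowsDuplicateProperties = false };

        Assert.Throws<JsonException>(() =>
            JsonSerializer.Deserialize<Dictionary<string, int>>(json, options));
    }
}
```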
Adds an option to disallow duplicate JSON properties. There are some open questions:

- Are duplicate properties an error in the JSON payload itself, or only errors during deserialization? In other words, if `AllowsDuplicateProperties=false`, should we validate this in the `Utf8JsonReader` before giving it to converters, or should the converters dedup themselves? This PR makes the change in the converters, but we might want to consider moving it down to the reader. Pros/cons for each:
- For `JsonDocument` deserialization, do we want to deserialize everything first and then dedup, or dedup while parsing? The former makes the duplicate tracking a little easier, since the number of properties for an object would be known; the latter allows failing fast.
- The dictionary converter is not yet implemented. It will also need to decide when to dedup, as mentioned in the point above (note that dictionary keys do not have to be strings, so we need an additional data structure to do the dedup).
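The "deserialize everything first, then dedup" direction for `JsonDocument` could look roughly like a post-parse validation pass over the element tree (a sketch, not this PR's implementation; `JsonDocument` preserves duplicate properties, so they are still observable at this point):

```csharp
using System.Collections.Generic;
using System.Text.Json;

static class JsonDocumentDedup
{
    // Walk the already-parsed tree and fail on the first duplicated
    // property name found in any object, at any nesting depth.
    public static void ThrowIfDuplicateProperties(JsonElement element)
    {
        switch (element.ValueKind)
        {
            case JsonValueKind.Object:
                var seen = new HashSet<string>();
                foreach (JsonProperty property in element.EnumerateObject())
                {
                    if (!seen.Add(property.Name))
                        throw new JsonException($"Duplicate property '{property.Name}'.");
                    ThrowIfDuplicateProperties(property.Value); // recurse into nested values
                }
                break;

            case JsonValueKind.Array:
                foreach (JsonElement item in element.EnumerateArray())
                    ThrowIfDuplicateProperties(item);
                break;
        }
    }
}
```

The obvious trade-off against dedup-while-parsing is that this pass cannot fail fast and touches the whole document a second time.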
/cc @eiriktsarpalis @jozkee
Contributes to #108521