
Option to disallow duplicate JSON properties #115856


Draft · wants to merge 5 commits into main

Conversation

PranavSenthilnathan (Member)

Adds an option to disallow duplicate JSON properties. There are some open questions:

  • Are duplicate properties an error in the JSON payload itself, or only an error during deserialization? In other words, if AllowDuplicateProperties = false, should we validate this in the Utf8JsonReader before handing tokens to converters, or should the converters dedup themselves? This PR makes the change in the converters, but we might want to consider moving it down to the reader. Pros/cons of each:

    • Utf8JsonReader dedup: all converters get deduplication for free. All JSON that has duplicate properties will be rejected.
    • Converter dedup: types like JsonObject already have dictionaries internally, so they can do the dedup more efficiently. There are probably workarounds to improve perf though (e.g. opting out of the reader dedup behavior when TokenType.StartArray is seen).
  • For JsonDocument deserialization, do we want to deserialize everything first and then dedup, or dedup while parsing? The former makes the duplicate tracking a little easier since the number of properties for an object would be known. The latter allows failing fast.

  • The dictionary converter is not yet implemented. It will also need to decide when to dedup, as mentioned in the point above (note that dictionary keys do not have to be strings, so we need an additional data structure to do the dedup).

/cc @eiriktsarpalis @jozkee

Contributes to #108521
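
For illustration, the intended usage would look roughly like the sketch below. This is not code from the PR: the property name `AllowDuplicateProperties` and the exact failure mode (a `JsonException`) follow the proposal in #108521 and may change as the open questions above are settled.

```csharp
using System.Collections.Generic;
using System.Text.Json;

var options = new JsonSerializerOptions
{
    AllowDuplicateProperties = false // hypothetical option name; the final API shape is part of this PR
};

string json = """{ "Name": "a", "Name": "b" }""";
try
{
    // Today this silently applies last-value-wins; with the option disabled it should fail instead.
    var result = JsonSerializer.Deserialize<Dictionary<string, string>>(json, options);
}
catch (JsonException)
{
    // duplicate "Name" property detected
}
```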

Contributor

Tagging subscribers to this area: @dotnet/area-system-text-json, @gregsdennis
See info in area-owners.md if you want to be subscribed.

eiriktsarpalis (Member)

> should we validate this in the Utf8JsonReader before giving it to converters or should the converters dedup themselves?

Honestly, I don't believe this is possible with the current design of Utf8JsonReader. The reader would need to track arbitrarily many properties in all objects it is currently nested within. This would almost certainly necessitate allocating from the heap for sufficiently complex data. In turn, though, this would break the checkpointing semantics that Utf8JsonReader currently enjoys and that many converters depend on:

```csharp
Utf8JsonReader checkpoint = reader;                     // struct copy captures the full reader state
var value = JsonSerializer.Deserialize<T>(ref reader);  // may advance or fault partway through
reader = checkpoint;                                    // resets the reader state
```

For this reason, I think we need to make it strictly a deserialization-level concern.

> For JsonDocument deserialization, do we want to deserialize everything first and then dedup, or dedup while parsing?

I think we want to make this a fail-fast operation. Specifically for JsonDocument, not doing so makes it easy to bypass validation altogether, since you can always forward the underlying JSON using JsonElement.GetRawText().
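
To make the bypass concern concrete, a minimal sketch (illustrative only):

```csharp
using System.Text.Json;

// If JsonDocument.Parse accepted the duplicates, the unvalidated payload could be
// forwarded verbatim, sidestepping any deduplication applied later:
using JsonDocument doc = JsonDocument.Parse("""{ "a": 1, "a": 2 }""");
string forwarded = doc.RootElement.GetRawText(); // still contains both "a" properties
```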

> It will also need to decide when to dedup as mentioned in the point above (note the dictionary keys do not have to be string, so we need an additional data structure to do the dedup).

I suspect you might need to implement this on a case-by-case basis. For known mutable dictionary types you can rely on the result itself to track duplicates, but for immutable or frozen dictionaries (which IIRC have constructor methods with overwrite semantics) you probably need to enlist an auxiliary dictionary type to do so (it might help if you could make this a pooled object that can be reused across serializations).
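
As a rough illustration of the two cases (not the PR's implementation; the helper names are made up): a mutable dictionary can reject the duplicate itself via TryAdd, while an immutable builder with overwrite semantics needs a side set of seen keys.

```csharp
using System.Collections.Generic;
using System.Collections.Immutable;
using System.Text.Json;

static class DictionaryDedupSketch
{
    // Mutable case: the target dictionary doubles as the duplicate tracker.
    public static void AddChecked<TKey, TValue>(Dictionary<TKey, TValue> target, TKey key, TValue value)
        where TKey : notnull
    {
        if (!target.TryAdd(key, value))
            throw new JsonException($"Duplicate key '{key}'.");
    }

    // Immutable case: buffer entries and track keys in an auxiliary set
    // (which could be pooled and reused across serializations).
    public static ImmutableDictionary<TKey, TValue> BuildChecked<TKey, TValue>(
        IEnumerable<KeyValuePair<TKey, TValue>> entries)
        where TKey : notnull
    {
        var seen = new HashSet<TKey>();
        var builder = ImmutableDictionary.CreateBuilder<TKey, TValue>();
        foreach (KeyValuePair<TKey, TValue> entry in entries)
        {
            if (!seen.Add(entry.Key))
                throw new JsonException($"Duplicate key '{entry.Key}'.");
            builder[entry.Key] = entry.Value; // indexer has overwrite semantics, hence the side set
        }
        return builder.ToImmutable();
    }
}
```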

```csharp
}
else
{
    if (jObject.ContainsKey(propertyName))
```
Member

Is it possible we could avoid duplicate lookups here using a TryAdd-like operation?

Member Author

Still in PR: #111229
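
In other words, once a first-class TryAdd lands, the check above could collapse to a single lookup, roughly like this (a sketch reusing the locals from the snippet above; it assumes a JsonObject.TryAdd(string, JsonNode?) that returns false when the name already exists, and the PR's actual helper/exception may differ):

```csharp
if (!jObject.TryAdd(propertyName, value))
{
    // Single lookup instead of ContainsKey followed by Add/the indexer.
    throw new JsonException($"Duplicate property '{propertyName}'.");
}
```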

```csharp
// False means the property has not been set (i.e. not yet encountered in the payload).
// The length of the BitArray equals the number of non-extension properties.
// Every JsonPropertyInfo has a PropertyIndex property that maps to an index in this BitArray.
public BitArray? AssignedProperties;
```
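
Roughly, a converter would consult this while binding a property, along these lines (member names are approximations of the idea, not necessarily the PR's exact internals):

```csharp
int index = jsonPropertyInfo.PropertyIndex;
if (state.Current.AssignedProperties is BitArray assigned)
{
    if (assigned[index])
        throw new JsonException($"Duplicate property '{jsonPropertyInfo.Name}'.");

    assigned[index] = true;
}
```
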
eiriktsarpalis (Member) · May 22, 2025

One optimization we left out when adding required-property validation is avoiding allocations for a small number of properties. You could do that by implementing a ValueBitArray struct that uses an int to track objects with fewer than 32 properties and falls back to allocating a regular BitArray for anything bigger. Not required for this PR, but nice to have nonetheless.

krwq (Member) · May 22, 2025

Perhaps a simple solution like this could be good enough to significantly reduce allocations:

```csharp
using System.Collections;

struct OptimizedBitArray
{
    private ulong _fastBits;        // inline storage covers objects with up to 64 properties
    private BitArray? _slowArray;   // heap-allocated fallback for larger objects

    public OptimizedBitArray(int numberOfProperties)
    {
        _fastBits = 0;
        _slowArray = numberOfProperties > 64 ? new BitArray(numberOfProperties) : null;
    }

    public void SetBit(int index)
    {
        if (_slowArray != null)
        {
            _slowArray[index] = true;
        }
        else
        {
            _fastBits |= 1UL << index;
        }
    }

    public bool IsBitSet(int index) =>
        _slowArray != null ? _slowArray[index] : (_fastBits & (1UL << index)) != 0;

    // etc.
}
```
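
The upside of the struct-based approach is that objects with at most 64 properties (the common case) would not allocate anything extra per deserialization; only larger objects would pay for the BitArray.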

Member

But honestly, I'd check whether this actually makes any difference in an E2E scenario.

```csharp
[InlineData("""{ "1": 0 , "1": 1 }""")]
[InlineData("""{ "1": null, "1": null }""")]
[InlineData("""{ "1": "a" , "1": null }""")]
[InlineData("""{ "1": null, "1": "b" }""")]
```
krwq (Member) · May 22, 2025

Perhaps I missed them, but I think there are a couple of test cases that would be nice to see (a sketch of the first one follows the list):

  • a recursive structure with duplicate properties in one of the inner objects, with both positive and negative test cases
  • two C# properties with the same name but different casing
  • are duplicates allowed inside a dictionary? Regardless of the answer, there should be a test case
  • arrays: both a duplicated property whose value is an array, and an item inside the array that itself has duplicated properties (it might also be worth adding a test case showing we don't trigger false positives)
  • if there is a per-type setting on JsonTypeInfo, I'd imagine it should be exercised (I think there isn't, but I have only skimmed through the PR)
  • for arrays, another interesting case is deserializing into a pre-populated list and checking whether it appends when the property is encountered for the first time. Is that expected with this setting on? (I'd imagine that if someone doesn't like the dups they might want to erase the content first. It's a bit of the user shooting themselves in the foot, so I'm not sure about doing anything special, but it might be worth defining expectations.)
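
A sketch of the first suggested case, in the same InlineData style as the existing tests (the exact test shape in the PR may differ):

```csharp
// Duplicate only in the nested object; should be rejected when duplicates are disallowed.
[InlineData("""{ "outer": { "inner": 1, "inner": 2 } }""")]
// Same property name in sibling objects is not a duplicate; should still succeed.
[InlineData("""{ "a": { "x": 1 }, "b": { "x": 2 } }""")]
```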
