I encountered the need to compare some generated data with test data, where those data sets include dynamically generated id values nested at different levels. So the goal was to test the objects for equality while ignoring/omitting those `id` keys.
E.g.

```js
// test for equality but ignore the values of id
{
  id: 1234,
  a: 'foo',
  b: {
    id: 45678,
    c: 'bar'
  }
}
```
I created my own, much simpler `deepEql` function, which works in my case, but I would love to see the possibility to specify which keys to exclude in the deep-eql API, which would obviously be much more reliable than my solution.
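The helper itself isn't reproduced in the thread as rendered here; purely as an illustration of that approach (recursively skipping `id` keys during the comparison), a hypothetical version might look like this:

```js
// Hypothetical sketch, not the author's actual code: a hand-rolled deep
// comparison that simply skips `id` keys at every level of nesting.
function simpleDeepEql(a, b) {
  if (a === b) return true;
  if (a === null || b === null || typeof a !== 'object' || typeof b !== 'object') {
    return false;
  }
  const keysA = Object.keys(a).filter((key) => key !== 'id');
  const keysB = Object.keys(b).filter((key) => key !== 'id');
  if (keysA.length !== keysB.length) return false;
  return keysA.every((key) => keysB.includes(key) && simpleDeepEql(a[key], b[key]));
}

simpleDeepEql(
  { id: 1234, a: 'foo', b: { id: 45678, c: 'bar' } },
  { id: 1,    a: 'foo', b: { id: 2,     c: 'bar' } }
); // => true: only the ignored `id` values differ
```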
With the current API you can pass in a custom comparator function which gets called on every value, so you can filter values out as you see fit. You could write a custom comparator that deletes the keys like your code does, while still relying on the rest of this library's internals.

I think the comparator gives the API enough flexibility to allow for the functionality you're after.
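For example, here is a sketch along those lines, assuming the `comparator` option receives each pair of operands and can return `null` to defer to deep-eql's default behavior (the `ignoreIdComparator` name and `isObjectWithId` helper are illustrative, not part of the library):

```js
const deepEqual = require('deep-eql');

// Illustrative helper: does this value look like an object carrying an `id` key?
function isObjectWithId(value) {
  return value !== null && typeof value === 'object' &&
    Object.prototype.hasOwnProperty.call(value, 'id');
}

// Custom comparator: when both sides carry an `id`, compare copies with the
// `id` stripped; otherwise return null so deep-eql applies its default rules
// (its traversal will call this comparator again on every nested value).
function ignoreIdComparator(leftHandOperand, rightHandOperand) {
  if (isObjectWithId(leftHandOperand) && isObjectWithId(rightHandOperand)) {
    const { id: ignoredLeft, ...left } = leftHandOperand;
    const { id: ignoredRight, ...right } = rightHandOperand;
    return deepEqual(left, right, { comparator: ignoreIdComparator });
  }
  return null;
}

deepEqual(
  { id: 1234, a: 'foo', b: { id: 45678, c: 'bar' } },
  { id: 9999, a: 'foo', b: { id: 1,     c: 'bar' } },
  { comparator: ignoreIdComparator }
); // => true: the `id` values differ but are ignored at every level
```

Returning `null` for everything except objects that actually carry an `id` key keeps the comparator from recursing endlessly on its own stripped copies, while the library's normal traversal still reaches nested objects.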
> You could write a custom comparator that deletes the keys like your code does, while still relying on the rest of this library's internals.

Could you elaborate on this, please? So basically the custom comparator would be a function which accepts the leftHandOperand and rightHandOperand as parameters and performs the comparison?