adt.Feature is a uint32 and is used to represent all labels, including
index selectors. Since four bits are reserved for the label type, that
leaves 28 bits for an index integer. This means that index selectors in
CUE should work equally well on 32-bit platforms as on 64-bit platforms,
given that the underlying type is uint32 and only 28-bit index integers
are used.

However, int64 precision is used before creating an adt.Feature for the
sake of giving better errors to the user, such as the following when an
index like `[3T]` goes over the 28-bit maximum adt.MaxIndex:

    int label out of range (3000000000000 not >=0 and <= 268435454)

When creating an index selector from a CUE basic literal we were
converting the literal's int64 value to int too early, meaning that we
lost precision and mangled the value in the range check:

    int label out of range (2112827392 not >=0 and <= 268435454)

Tweak cue.Index so that it accepts int64 for full precision on top of
the existing int support for ease of use. Thanks to type inference, this
change to generics should not break the vast majority of users making
direct calls, but any users who are broken can swap `cue.Index` for
`cue.Index[int]`.

We can soon start testing this by adding a CI step on amd64 like:

    GOARCH=386 go test -short ./...

Updates #3540.

Signed-off-by: Daniel Martí <[email protected]>
Change-Id: I9ee67d66128d7a91c7c198865f107c26b496deaf
Reviewed-on: https://review.gerrithub.io/c/cue-lang/cue/+/1203192
TryBot-Result: CUEcueckoo <[email protected]>
Unity-Result: CUE porcuepine <[email protected]>
Reviewed-by: Roger Peppe <[email protected]>
- Loading branch information
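The generics change described above can be sketched with a standalone function. This `Index` is a hypothetical stand-in, not the real `cue.Index` signature; it only illustrates how a union constraint over `int | int64` keeps direct calls working via type inference while allowing explicit instantiation:

```go
package main

import "fmt"

// Index is a hypothetical sketch of a function generic over int and
// int64, mirroring the API change described in the commit message.
func Index[T interface{ int | int64 }](i T) int64 {
	return int64(i)
}

func main() {
	// Direct calls keep working thanks to type inference:
	fmt.Println(Index(3))

	// int64 arguments now retain full precision:
	fmt.Println(Index(int64(3000000000000)))

	// Callers broken by the change can instantiate explicitly,
	// analogous to swapping cue.Index for cue.Index[int]:
	fmt.Println(Index[int](3))
}
```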