
Using .catch causes input type to become unknown #2852

Closed
bradleymcleish opened this issue Oct 10, 2023 · 10 comments
Comments

@bradleymcleish

When using a .catch block, the TypeScript input type of the schema becomes unknown.

Without catch

const schema = z.number().min(1).max(255)
type SchemaType = z.input<typeof schema>

SchemaType is number

With catch

const schema = z
  .number()
  .min(1)
  .max(255)
  .catch((err) => {
    return 1
  })

type SchemaType = z.input<typeof schema>

SchemaType is unknown

This seems to happen for all types: string, number, boolean, etc.

@rotu
Contributor

rotu commented Nov 12, 2023

I think the behavior you observe is intended - for example, if the input is "foo", the output will be 1, the value provided by the catch callback.

What's your expectation? That catch only applies to min and max? I think this is expressed in zod as:

const schema = z
  .number()
  .transform(x => (1 <= x && x <= 255) ? x : 1)

@vtgn

vtgn commented Mar 15, 2024

Hello !
I've just noticed the same unexpected behavior in my project with this kind of code:

capacity: z
            .number()
            .min(0)
            .max(10)
            .catch(() => {
                throw new MyCustomException(
                    "My custom exception message"
                );
            })

The inferred input type is capacity?: unknown, which is not what I would have expected in such a case. I clearly specified that I want a number, so the catch method shouldn't change it. I think the catch method should never change the expected input type.

As a workaround, I just cast the capacity field's schema like this:

capacity: z
            .number()
            .min(0)
            .max(10)
            .catch(() => {
                throw new MyCustomException(
                    "My custom exception message"
                );
            }) as z.ZodType<number>

Regards.

@vtgn

vtgn commented Mar 15, 2024

I think the behavior you observe is intended - for example, if the input is "foo", the output will be 1, the value provided by the catch callback.

What's your expectation? That catch only applies to min and max? I think this is expressed in zod as:

const schema = z
  .number()
  .transform(x => (1 <= x && x <= 255) ? x : 1)

Hello,

But it is obvious from his schema that he wants a number value, so it is not logical for the catch method to overwrite that and accept any type, including undefined.
This is not the expected behavior for this kind of schema.

Regards.

@rotu
Contributor

rotu commented Mar 15, 2024

But it is obvious from his schema that he wants a number value, so it is not logical for the catch method to overwrite that and accept any type, including undefined. This is not the expected behavior for this kind of schema.

No, the schema will output a number, but will gladly accept a non-numeric input and replace it with 1.

That catch, intentionally or not, does broaden the types this schema can handle. Maybe you mean to use .refine?

@vtgn

vtgn commented Mar 15, 2024

No, the schema will output a number, but will gladly accept a non-numeric input and replace it with 1.

You can't know what the developers will do inside the catch. And I'm talking about the input type, not the output.

Using catch doesn't mean we want to accept anything, yet that is precisely what it currently does. So what is the point of specifying constraints like number() before catch() if the catch overwrites the types given by all the constraints before it?

The input type is supposed to indicate what type the schema expects, not what the schema's constraints can handle, especially for a method that just catches all exceptions and is not supposed to add constraints on the expected type.

In my example, I don't care whether the developer knows the schema can handle anything; I just want them to know from the input type that a number is expected. That's all. Using catch shouldn't change that requirement.

If you find the current behavior normal, then nothing before the catch affects the input type, so it would be sufficient to use a bare catch with nothing before it and implement all the type logic inside it, for exactly the same result. :/

Imagine you write an API method that requires a parameter object, and you want to generate API documentation specifying the expected parameter format. The type used for the parameter is the input type of your zod schema, and the first thing your method does is parse and validate the parameter. Do you find it useful to read that every field of the expected parameter object is optional with an unknown type? :/

@rotu
Contributor

rotu commented Mar 16, 2024

Using catch doesn't mean we want to accept anything, yet that is precisely what it currently does. So what is the point of specifying constraints like number() before catch() if the catch overwrites the types given by all the constraints before it?

Maybe that's not what you want your code to say, but it's what the .catch method means per #1537 (for better or worse). catch clobbers the input types before it but not the output types.

Perhaps the useTypedParser extension #1748 (comment) is what you're looking for. Or, perhaps less immediately useful, the proposal for z.satisfies #2473.

The input type is supposed to indicate what type the schema expects, not what the schema's constraints can handle

Citation? I think the design principle of "parse, don't validate" implies advertised input types should be exhaustive (even if a tad broad at times).

Do you find useful to read that in the expected parameter object every field is partial with unknown type? :/

No, I think the best design is to not try to handle malformed input (though that might be untenable).

@colinhacks
Owner

As @rotu says, this is how .catch() works. The inferred input type represents the set of inputs that can reasonably be expected to pass validation. When you use .catch(), any input will pass.

Imagine you write an API method that requires a parameter object, and you want to generate API documentation specifying the expected parameter format. The type used for the parameter is the input type of your zod schema, and the first thing your method does is parse and validate the parameter. Do you find it useful to read that every field of the expected parameter object is optional with an unknown type? :/

I almost see what @vtgn is saying and there are other cases where Zod's philosophy on input types has thrown people off (see the discussion here #1492). If you want the inferred input type to be string, then don't use .catch(). Use z.string().pipe(yourParamParser).

@graup

graup commented Jun 12, 2024

This all makes sense, but how about the use case when you have an enum that instead of failing you want to just set to undefined? Something like z.enum(['a', 'b']).optional().catch(undefined). What I want is the input to be string and the output to be 'a' | 'b' | undefined. Is there a way to do this? Basically a way to say "optional + treat errors as undefined".

The best I've come up with is z.string().pipe(z.enum(['a', 'b']).optional().catch(undefined)).optional(), does this make sense?

@rotu
Contributor

rotu commented Jun 12, 2024

This all makes sense, but how about the use case when you have an enum that instead of failing you want to just set to undefined? Something like z.enum(['a', 'b']).optional().catch(undefined). What I want is the input to be string and the output to be 'a' | 'b' | undefined. Is there a way to do this? Basically a way to say "optional + treat errors as undefined".

The best I've come up with is z.string().pipe(z.enum(['a', 'b']).optional().catch(undefined)).optional(), does this make sense?

That seems a convoluted use of catch. I think z.union or .or is probably more suited:

z.enum(['a', 'b', 'c']).or(z.unknown().transform(() => undefined))

Still, there are two issues that you might want to handle:

  1. This doesn't provide a type warning in TypeScript. (You can partly fix that by replacing z.unknown() with z.string() or z.string().optional() to constrain the input type.)
  2. The fallback to undefined is silent. Unfortunately, I don't think the zod API has a good way to emit non-error warnings.

@HernanFdez

HernanFdez commented Sep 4, 2024

TL;DR

import { z, type ZodSchema } from "zod";

export const catched = <T extends ZodSchema>(
  schema: T,
  fallback: z.infer<T>,
) => schema.catch(fallback) as unknown as z.ZodDefault<z.ZodOptional<T>>;

I came across this very same unexpected behaviour when using zod schemas to specify @tanstack/router validateSearch parameters in route definitions.

I may understand the reasoning and philosophy behind why .catch() should behave the way it does.
But I still think the requested alternative behaviour is definitely quite often the desirable one.
I would really like it to be available built-in, not necessarily replacing the one that we have, but just as an alternative.

Anyway, good news is, it can be fairly easily implemented as the above one-liner.
Then, any given const schema = z.object({ foo: z.string() }) can be wrapped with this utility

const catchedSchema = catched(schema, {foo: "qux"})

resulting in a .catch()-like output, but without affecting the inferred input type.
And critically, the automagic power of TS generics makes sure what's passed as fallback makes sense.

That's it! Happy coding!
