ValueError: # of input channels is44 not divisible by groups 1 #2341
Comments
Why is this unnecessary? Can you construct a minimal example to reproduce this issue? Also …
Notice the stack trace: 44 is perfectly divisible by 1, but …

Just tried …
Are you using flexible shape when you call coremltools.convert?
Ah yes, I forgot to mention: this happens when switching the model's input from a fixed shape to a flexible shape (a range on the batch dimension), and it exports fine without the flexible shape.
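For readers following along, here is a minimal sketch of the two conversion variants being compared. The Conv2d stand-in, the input name, and the shapes are assumptions (the real model is proprietary); the 1 to 8 batch range is taken from the verification loop later in the thread.

import torch
import coremltools as ct

# Stand-in model; the reporter's actual model is not available.
model = torch.nn.Conv2d(3, 8, kernel_size=3).eval()
traced_model = torch.jit.trace(model, torch.rand(1, 3, 1024, 1024))

# Fixed input shape: per the thread, this variant exports fine.
fixed_input = ct.TensorType(name="input", shape=(1, 3, 1024, 1024))

# Flexible input shape: batch dimension allowed to range over 1..8; this is
# the variant that triggers the ValueError for the reporter's model.
flexible_input = ct.TensorType(
    name="input",
    shape=(ct.RangeDim(lower_bound=1, upper_bound=8), 3, 1024, 1024),
)

mlmodel = ct.convert(traced_model, inputs=[flexible_input])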
Regardless of what it means, …
Have you verified that your traced PyTorch model actually works with all input shapes in that range? It looks like you're trying to make your converted model accept batched input. If so, have you considered not using flexible shapes and just using the built-in batch predictions? For details, see the example in the API docs.
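For reference, this is roughly what that suggestion could look like from Python, assuming the coremltools version in use lets MLModel.predict take a list of input dictionaries for batch prediction (check the batch-prediction example in the API docs the maintainer points to); the model path and input name are hypothetical.

import numpy as np
import coremltools as ct

mlmodel = ct.models.MLModel("Model.mlpackage")  # hypothetical path

# One dictionary per sample; if batch prediction is supported, predict()
# returns a list of output dictionaries, one per input dictionary.
batch = [
    {"input": np.random.rand(1, 3, 1024, 1024).astype(np.float32)}
    for _ in range(8)
]
outputs = mlmodel.predict(batch)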
Yet, it seems like the traced model does handle those shapes. The following worked just fine:

import torch
import tqdm

for i in tqdm.tqdm(range(1, 9)):
    y = traced_model(torch.rand(i, 3, 1024, 1024))
Correct, though the requirement comes from some internal product team. I forwarded your feedback to them. On this note, I also commented out the two checks in question and the model exported fine. Still waiting to hear back from the product team to see whether it actually works, though.
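Rather than deleting the checks outright, a symbol-aware guard along these lines is presumably the kind of behavior the report is asking for. This is a hypothetical sketch, not the actual coremltools source; it assumes the is_symbolic helper from coremltools' MIL type utilities.

from coremltools.converters.mil.mil.types.symbolic import is_symbolic

def check_channels_divisible_by_groups(C_in, groups):
    # When C_in comes from a flexible input shape it is a Symbol (printed as
    # e.g. "is44"), so its concrete value is unknown at conversion time and
    # the divisibility check should be skipped or deferred rather than raised.
    if is_symbolic(C_in):
        return
    if C_in % groups != 0:
        raise ValueError(
            f"# of input channels {C_in} not divisible by groups {groups}"
        )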
Are you also able to get correct predictions from that exported Core ML model?
I was wondering about that as well and am waiting for the internal product team to investigate and report back. |
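One quick way to sanity-check that, once the export succeeds, is to compare the traced PyTorch output against the Core ML prediction on the same input. A sketch, with a hypothetical model path and hypothetical input/output names:

import numpy as np
import torch
import coremltools as ct

x = torch.rand(1, 3, 1024, 1024)
torch_out = traced_model(x).detach().numpy()  # traced_model as in the loop above

mlmodel = ct.models.MLModel("Model.mlpackage")                # hypothetical path
coreml_out = mlmodel.predict({"input": x.numpy()})["output"]  # hypothetical names

# The maximum difference should be small (allow some slack for float16 compute).
print(np.abs(torch_out - np.asarray(coreml_out)).max())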
Description

coremltools.convert will unnecessarily raise an exception if C_in is a Symbol here.

Stack Trace
To Reproduce
I am afraid that I can't share the model details here since it is proprietary, my apologies.
System

coremltools version: 7.2
PyTorch version: 2.1.2