Thanks for reporting, @Andrew-Crosby. In an ideal world, how would you want this to be supported? The options that occur to me are:
1. POST the binary as a simple blob (i.e. with `Content-Type: application/octet-stream` instead of `application/json`); a client-side sketch follows this item.

   - For: simplest to implement in the model and client(s)
   - Against: the API does not directly reflect the model's input schema
   - Against: precludes support for models with multiple inputs
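For concreteness, here is a minimal client-side sketch of option 1. The `/invocations` path matches MLflow's scoring server, but octet-stream handling there is the proposal itself, not current behavior; the file name and port are made up.

```python
import requests

# Option 1 (hypothetical): send the raw bytes as the request body,
# with no JSON wrapper. Only a single binary input can be expressed.
with open("image.png", "rb") as f:
    payload = f.read()

resp = requests.post(
    "http://localhost:5000/invocations",
    data=payload,
    headers={"Content-Type": "application/octet-stream"},
)
print(resp.text)
```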
2. Assume binary data columns/fields will be sent base64-encoded; a client-side sketch follows this item.

   - For: retains JSON as the API and `Content-Type`, and so supports multiple inputs
   - Against: the API does not directly reflect the model's input schema
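A sketch of what a client request could look like under option 2, assuming the server transparently decodes base64 fields. The split-orientation envelope and the `image`/`threshold` column names are illustrative; the exact JSON format depends on the MLflow version.

```python
import base64
import json

import requests

# Option 2 (hypothetical): binary fields travel as base64 strings inside
# the usual JSON payload, so models with multiple inputs still work.
with open("image.png", "rb") as f:
    encoded = base64.b64encode(f.read()).decode("ascii")

payload = {"columns": ["image", "threshold"], "data": [[encoded, 0.5]]}

resp = requests.post(
    "http://localhost:5000/invocations",
    data=json.dumps(payload),
    headers={"Content-Type": "application/json"},
)
print(resp.text)
```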
3. Explicitly state that binary is not supported, and suggest that the MLflow model be modified to decode a base64-encoded column; a model-side sketch follows this item.

   - For: retains JSON as the API and `Content-Type`, and supports multiple inputs
   - Against: forces the MLflow model to address a concern specific to real-time inference up front
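Under option 3 the decoding burden moves into the model itself. Here is a minimal sketch using `mlflow.pyfunc.PythonModel`; the wrapped `inner_model` and the `image` column name are placeholders, not part of any existing API.

```python
import base64

import mlflow.pyfunc
import pandas as pd


class Base64DecodingModel(mlflow.pyfunc.PythonModel):
    """Wraps a model so base64-encoded columns are decoded at inference time."""

    def __init__(self, inner_model):
        # inner_model is a placeholder for whatever model actually
        # consumes raw bytes.
        self.inner_model = inner_model

    def predict(self, context, model_input: pd.DataFrame):
        decoded = model_input.copy()
        # Clients send the "image" column as a base64 string; restore
        # the original bytes before delegating to the real model.
        decoded["image"] = decoded["image"].map(base64.b64decode)
        return self.inner_model.predict(decoded)
```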
JSON does not natively support byte data, so I don't think it's possible to deploy a model with a binary input field.
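This is easy to confirm with the standard library: Python's `json` encoder rejects raw bytes outright, which is why some encoding convention (such as base64) is unavoidable.

```python
import json

# json has no byte type; serializing raw bytes fails immediately.
try:
    json.dumps({"image": b"\x89PNG"})
except TypeError as err:
    print(err)  # Object of type bytes is not JSON serializable
```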