Unable to support response description with the same output schema and different response content-types #204
Sorry, I could not reproduce your issue; please provide a more detailed description.
|
So, when I use this:
And then I open swagger-ui, I see:
|
Things I've tried:
or even:
I just want to be able to declare the same schema for different content-types. The only mitigation I found is this:
but this forces me to register all schemas manually, and it feels like I am doing the same thing in two different ways.
It would be much more intuitive to support this directly, with the schemas auto-registered.
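The snippets referenced above did not survive the page export. For illustration only (an editorial sketch, not necessarily the author's actual workaround), one manual shape this kind of mitigation can take is to generate the schema with Pydantic and inline it by hand under each content type, which is the "same thing in two different ways" being described:

```python
from http import HTTPStatus

from pydantic import BaseModel


class InferenceOutput(BaseModel):  # illustrative flat model
    items: list[str] = []


# The Pydantic-generated schema, duplicated by hand under each content type.
inline_schema = InferenceOutput.model_json_schema()

responses = {
    HTTPStatus.OK: {
        "description": "Inference output",
        "content": {
            "application/json": {"schema": inline_schema},
            "application/bson": {"schema": inline_schema},
        },
    },
}
```

Inlining like this only stays manageable for flat models; anything with nested models reintroduces the registration problem discussed further down.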
|
It's strange, I still can't quite grasp your question. Can you provide a minimal .py file? My environment: flask-openapi3 version 4.0.3
|
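For readers following along, a minimal self-contained file of the kind being requested might look like the sketch below (editorial; the model, path, and handler names are illustrative, and the `$ref` is precisely the part for which nothing gets auto-registered):

```python
from http import HTTPStatus

from flask_openapi3 import OpenAPI
from pydantic import BaseModel

app = OpenAPI(__name__)


class SAM2InferenceOutput(BaseModel):
    items: list[str] = []


@app.post(
    "/inference",
    responses={
        HTTPStatus.OK: {
            "description": "Inference output",
            "content": {
                # Same component schema referenced for two content types.
                "application/json": {"schema": {"$ref": "#/components/schemas/SAM2InferenceOutput"}},
                "application/bson": {"schema": {"$ref": "#/components/schemas/SAM2InferenceOutput"}},
            },
        },
    },
)
def create_inference():
    return {"items": []}


if __name__ == "__main__":
    app.run(debug=True)
```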
```python
@api.post(
    "/inference",
    responses={
        HTTPStatus.OK: {
            "description": "Inference output",
            "content": {
                # Content type
                "application/json": {"schema": {"$ref": "#/components/schemas/SAM2InferenceOutput"}},
                "application/bson": {"schema": {"$ref": "#/components/schemas/SAM2InferenceOutput"}},
            },
        },
    },
)
```
Only this code is correct; the other code is wrong.
|
Re-opened. Actually, that code is not right either, because one has to deal with this manually.
Now, because the schemas have inner references, the manual approach gets even more painful. These are my models:
```python
from enum import Enum
from typing import Generic, TypeVar

import numpy as np
from pydantic import AnyUrl, BaseModel, Field

T = TypeVar("T")


class Response(BaseModel, Generic[T]):
    data: T
    message: str | None = Field(None, description="Exception Information")
    errors: list[str] | None = Field(None, description="List of error messages")

    class Config:
        use_enum_values = True
        arbitrary_types_allowed = True


class Project(BaseModel):
    Version: str
    Environment: str


class InputSourceEnum(str, Enum):
    bytes = "bytes"
    url = "url"
    s3 = "s3"
    local = "local"


class InputTypeEnum(str, Enum):
    image = "image"
    video = "video"


class SAM2PromptTypeEnum(str, Enum):
    point = "point"
    rectangle = "rectangle"


class SAM2ModelEnum(str, Enum):
    sam2_1_hiera_base_plus = "sam2.1_hiera_base_plus"
    sam2_1_hiera_large = "sam2.1_hiera_large"
    sam2_1_hiera_small = "sam2.1_hiera_small"
    sam2_1_hiera_tiny = "sam2.1_hiera_tiny"


class SAM2InferencePrompt(BaseModel):
    type: SAM2PromptTypeEnum = Field(..., description="Type of the input (either 'point' or 'rectangle')")
    label: int = Field(0, description="Label associated with the input (default: 0)", ge=0)
    data: list[int] = Field(..., description="List of integer data points", min_items=1)

    class Config:
        use_enum_values = True


class SAM2InferenceInput(BaseModel):
    source: InputSourceEnum = Field("url", description="Input source, either 'url' or 'bytes'")
    type: InputTypeEnum = Field("image", description="Input type, either 'image' or 'video'")
    model: SAM2ModelEnum = Field("sam2.1_hiera_base_plus", description="Model type")
    prompt: list[SAM2InferencePrompt] = Field(..., description="List of inference prompts")
    frame: int | None = Field(None, description="Frame index for video input")
    data: AnyUrl | bytes | np.ndarray | None = Field(
        None,
        description=(
            "URI string, or for 'bytes' source with 'application/bson' input, a list of bytes "
            "representing the image or video content"
        ),
    )

    class Config:
        use_enum_values = True
        arbitrary_types_allowed = True


class SAM2InferenceOutputItem(BaseModel):
    counts: str = Field(..., description="Counts in RLE format")
    size: list[int] = Field(..., description="Size of the image")


class SAM2InferenceOutput(BaseModel):
    items: list[SAM2InferenceOutputItem] = Field([], description="List of inference results")

    class Config:
        title = "SAM2InferenceOutput"
        use_enum_values = True
        arbitrary_types_allowed = True
```
|
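To make the inner-reference point above concrete, here is a small editorial sketch (not from the issue) using Pydantic v2's `model_json_schema`: the nested `SAM2InferenceOutputItem` schema lands under `$defs`, so hand-writing a single `$ref` to `SAM2InferenceOutput` is not enough on its own; every nested model would also have to be copied into `components/schemas`.

```python
from pydantic import BaseModel, Field


class SAM2InferenceOutputItem(BaseModel):
    counts: str = Field(..., description="Counts in RLE format")
    size: list[int] = Field(..., description="Size of the image")


class SAM2InferenceOutput(BaseModel):
    items: list[SAM2InferenceOutputItem] = Field([], description="List of inference results")


# Render refs the way OpenAPI components expect them.
schema = SAM2InferenceOutput.model_json_schema(
    ref_template="#/components/schemas/{model}"
)
print(list(schema["$defs"]))
# ['SAM2InferenceOutputItem'] -- this nested schema must also be placed
# under components/schemas, or the $ref will not resolve in Swagger UI.
```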
I have created a PR that simplifies all this, be kind 🥇, and thank you for such a helpful project. See #207. Basically, you can now do this:
```python
responses={
    HTTPStatus.OK: {
        "description": "Inference output",
        "content": {
            "application/json": SAM2InferenceOutput,
            "application/bson": SAM2InferenceOutput,
        },
    },
}
```
and not only that, also this:
```python
responses={
    HTTPStatus.OK: SAM2InferenceOutput,
}
```
And Swagger UI is aware of both.
|
Sorry, I finally understand what you mean. You don't want to manually add the model schema to the components. You can do it like this:
```python
class SAM2InferenceOutput(BaseModel):
    items: list[str]

    model_config = {
        "openapi_extra": {
            "content": {
                "application/json": {"schema": {"$ref": "#/components/schemas/SAM2InferenceOutput"}},
                "application/bson": {"schema": {"$ref": "#/components/schemas/SAM2InferenceOutput"}},
            }
        }
    }


@api.post(
    "/inference",
    responses={
        HTTPStatus.OK: SAM2InferenceOutput
    },
)
@tracer.capture_method
def create_sam2_inference(body: SAM2InferenceInput) -> Response:
    ...
```
|
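As an editorial aside, one way to see which schemas actually end up registered is to fetch the generated spec and inspect `components/schemas`. This is a sketch only: it assumes `api` is the `flask_openapi3.OpenAPI` application object (not a blueprint) and that the spec is served at flask-openapi3's default `/openapi/openapi.json` path.

```python
# Sketch: list the schema names registered in the generated OpenAPI document.
with api.test_client() as client:
    spec = client.get("/openapi/openapi.json").get_json()
    print(sorted(spec.get("components", {}).get("schemas", {})))
```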
Thanks, but that above still won't work, as the
|
Environment:
My attempt:
But the generated JSON for the Swagger schema has validation errors.
I even tried this, as it seemed natural and less verbose, but nothing gets registered.
But it crashes.