The majority of the examples I have seen online deal with @api taking an input and returning an output. I was wondering if I could wrap my own feature in a GET request, apart from the model prediction POST API, within the same service.

```python
from bentoml import BentoService, api
```

Thank you
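For context, a minimal sketch of the 0.13-style service pattern being described, where every @api method is exposed as a POST endpoint; the class, artifact, and field names below are hypothetical, not from the original post:

```python
# Hypothetical 0.13-style service; class, artifact, and field names are made up.
from bentoml import BentoService, api, artifacts, env
from bentoml.adapters import JsonInput
from bentoml.frameworks.sklearn import SklearnModelArtifact

@env(infer_pip_packages=True)
@artifacts([SklearnModelArtifact("model")])
class MyPredictionService(BentoService):
    # In 0.13 every @api method becomes a POST route on the API server;
    # there is no built-in hook for adding a separate custom GET route.
    @api(input=JsonInput(), batch=False)
    def predict(self, parsed_json):
        return self.artifacts.model.predict([parsed_json["features"]])
```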
Hi @poryee - a custom GET endpoint is unfortunately not supported in the 0.13 version. The new 1.0 release is around the corner; in BentoML 1.0 you can customize the API server with any WSGI/ASGI web framework, such as FastAPI or Flask, and define custom HTTP endpoints in addition to the default endpoints provided by BentoML. You can find some sample code here: https://github.com/bentoml/BentoML/tree/7d50d7e7ab45b411607ba9ec3431425d72d7f522/bentoml/_internal/server

Mounting a Flask App to Service

Use Service#mount_wsgi_app to mount a Flask app to a Service. This works for any WSGI-based Python web application.

```python
# hello.py
import bentoml
from bentoml.io import JSON
from flask import Flask
flask_app = Flask("sample_wsgi_app")
@flask_app.route("/hello")
def hello_world():
return "<p>Hello, World!</p>"
svc = bentoml.Service("bento-server-test")
svc.mount_wsgi_app(flask_app)
@svc.api(input=JSON(), output=JSON())
def predict(input_json):
    return {'input_received': input_json, 'foo': 'bar'}
app = svc.asgi_app
```

Mounting a FastAPI app to Service

Use Service#mount_asgi_app to mount a FastAPI app to a Service. This works for any ASGI-based Python web application.

```python
import bentoml
from bentoml.io import JSON
from fastapi import FastAPI
fastapi_app = FastAPI()
@fastapi_app.get("/hello")
def hello():
return {"Hello": "World"}
svc = bentoml.Service("hello")
svc.mount_asgi_app(fastapi_app)
@svc.api(input=JSON(), output=JSON())
def predict(input_json):
    return {'input_received': input_json, 'foo': 'bar'}
app = svc.asgi_app
```
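To make the distinction concrete, here is a rough sketch of how the two kinds of endpoints might be called once the service is running; the port and root mount path are assumptions based on BentoML 1.0 defaults and may differ in your setup:

```python
# Hypothetical client-side check, assuming the FastAPI example above is saved
# as service.py and served with `bentoml serve service:svc` (default port 3000).
import requests

# Custom GET endpoint, served by the mounted FastAPI app
print(requests.get("http://127.0.0.1:3000/hello").json())

# Default inference endpoint generated from @svc.api (POST only)
print(requests.post("http://127.0.0.1:3000/predict", json={"foo": "bar"}).json())
```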