Merge branch 'master' into nerf_integration
Lothiraldan committed Jan 22, 2024
2 parents d1b3e98 + 73bdb54 commit 235ef19
Showing 11 changed files with 408 additions and 814 deletions.
4 changes: 3 additions & 1 deletion .github/workflows/test-examples.yml
@@ -21,6 +21,7 @@ jobs:
- integrations/model-evaluation/gradio/notebooks/Gradio_and_Comet.ipynb
- integrations/model-evaluation/gradio/notebooks/Logging_Model_Inferences_with_Comet_and_Gradio.ipynb
- integrations/model-optimization/ray-tune/notebooks/Comet_and_Ray.ipynb
- integrations/model-training/fastai/notebooks/fastai_hello_world.ipynb
- integrations/model-training/hugging_face/notebooks/Comet_with_Hugging_Face_Trainer.ipynb
- integrations/model-training/keras/notebooks/Comet_with_Keras.ipynb
- integrations/model-training/lightgbm/notebooks/Comet_and_LightGBM.ipynb
@@ -99,6 +100,8 @@ jobs:
os: [ubuntu-latest]
python-version: ["3.10"]
example:
- {script: "integrations/model-optimization/optuna/optuna-hello-world/optuna-hello-world.py", arg: ""}
- {script: "integrations/model-training/fastai/fastai-hello-world/fastai_hello_world.py", arg: ""}
- {script: "integrations/model-training/hugging_face/transformers-distilbert-fine-tuning/transformers-distilbert-fine-tuning.py", arg: ""}
- {script: "integrations/model-training/keras/keras-mnist-dnn/keras-mnist-dnn.py", arg: ""}
- {script: "integrations/model-training/mlflow/mlflow-hello-world/mlflow-hello-world.py", arg: "run"}
@@ -112,7 +115,6 @@
- {script: "integrations/workflow-orchestration/metaflow/metaflow-hello-world/helloworld.py", arg: "run"}
- {script: "integrations/workflow-orchestration/metaflow/metaflow-model-evaluation/metaflow-model-evaluation.py", arg: "run --max-workers 1 --n_samples 100"}
- {script: "integrations/workflow-orchestration/metaflow/metaflow-regression/metaflow-regression-example.py", arg: "run"}
- {script: "integrations/model-optimization/optuna/optuna-hello-world/optuna-hello-world.py", arg: ""}
env:
SCRIPT_TO_TEST: ${{ matrix.example.script }}
steps:
1 change: 1 addition & 0 deletions fastai/README.md
@@ -0,0 +1 @@
All of the fastai examples have been moved here: https://github.com/comet-ml/comet-examples/tree/master/integrations/model-training/fastai/.
44 changes: 0 additions & 44 deletions fastai/train-example.py

This file was deleted.

36 changes: 17 additions & 19 deletions integrations/llm/langchain/notebooks/Comet_with_Langchain.ipynb
@@ -47,7 +47,7 @@
"metadata": {},
"outputs": [],
"source": [
"%pip install -U comet_llm \"langchain>=0.0.346\" openai google-search-results numexpr"
"%pip install -U comet_llm \"langchain>=0.0.346\" openai numexpr"
]
},
{
@@ -86,7 +86,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"You will need an [OpenAI API Key](https://platform.openai.com/account/api-keys) and a [SerpAPI API Key](https://serpapi.com/dashboard) to run the following examples"
"You will need an [OpenAI API Key](https://platform.openai.com/account/api-keys) to run the following examples"
]
},
{
@@ -98,8 +98,7 @@
"import os\n",
"\n",
"os.environ[\"OPENAI_API_KEY\"] = \"...\"\n",
"# os.environ[\"OPENAI_ORGANIZATION\"] = \"...\"\n",
"os.environ[\"SERPAPI_API_KEY\"] = \"...\""
"# os.environ[\"OPENAI_ORGANIZATION\"] = \"...\""
]
},
{
@@ -121,9 +120,11 @@
"from langchain.llms import OpenAI\n",
"\n",
"comet_callback = CometTracer()\n",
"llm = OpenAI(temperature=0.9, callbacks=[comet_callback], verbose=True)\n",
"\n",
"llm_result = llm.generate([\"Tell me a joke\", \"Tell me a poem\", \"Tell me a fact\"] * 3)\n",
"llm = OpenAI(temperature=0.9, verbose=True)\n",
"llm_result = llm.generate(\n",
" [\"Tell me a joke\", \"Tell me a poem\", \"Tell me a fact\"] * 3,\n",
" callbacks=[comet_callback],\n",
")\n",
"print(\"LLM result\", llm_result)\n",
"comet_callback.flush()"
]
@@ -149,18 +150,16 @@
"comet_callback = CometTracer()\n",
"callbacks = [comet_callback]\n",
"\n",
"llm = OpenAI(temperature=0.9, callbacks=callbacks, verbose=True)\n",
"llm = OpenAI(temperature=0.9, verbose=True)\n",
"\n",
"template = \"\"\"You are a playwright. Given the title of play, it is your job to write a synopsis for that title.\n",
"Title: {title}\n",
"Playwright: This is a synopsis for the above play:\"\"\"\n",
"prompt_template = PromptTemplate(input_variables=[\"title\"], template=template)\n",
"synopsis_chain = LLMChain(\n",
" llm=llm, prompt=prompt_template, callbacks=callbacks, verbose=True\n",
")\n",
"synopsis_chain = LLMChain(llm=llm, prompt=prompt_template, verbose=True)\n",
"\n",
"test_prompts = [{\"title\": \"Documentary about Bigfoot in Paris\"}]\n",
"print(synopsis_chain.apply(test_prompts))\n",
"print(synopsis_chain.apply(test_prompts, callbacks=callbacks))\n",
"comet_callback.flush()"
]
},
@@ -177,25 +176,24 @@
"metadata": {},
"outputs": [],
"source": [
"from langchain.agents import initialize_agent, load_tools\n",
"from langchain.agents import AgentType, initialize_agent, load_tools\n",
"from langchain.callbacks.tracers.comet import CometTracer\n",
"from langchain.llms import OpenAI\n",
"\n",
"comet_callback = CometTracer()\n",
"callbacks = [comet_callback]\n",
"\n",
"llm = OpenAI(temperature=0.9, callbacks=callbacks, verbose=True)\n",
"llm = OpenAI(temperature=0.9, verbose=True)\n",
"\n",
"tools = load_tools([\"serpapi\", \"llm-math\"], llm=llm, callbacks=callbacks, verbose=True)\n",
"tools = load_tools([\"llm-math\"], llm=llm, verbose=True)\n",
"agent = initialize_agent(\n",
" tools,\n",
" llm,\n",
" agent=\"zero-shot-react-description\",\n",
" callbacks=callbacks,\n",
" AgentType.ZERO_SHOT_REACT_DESCRIPTION,\n",
" verbose=True,\n",
")\n",
"agent.run(\n",
" \"Who is Leo DiCaprio's girlfriend? What is her current age raised to the 0.43 power?\",\n",
" \"What is 2 raised to .123243 power?\",\n",
" callbacks=callbacks,\n",
")\n",
"comet_callback.flush()"
@@ -218,7 +216,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.12"
"version": "3.11.3"
}
},
"nbformat": 4,
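
For reference, a minimal runnable sketch of the per-call callback pattern used in this notebook (an illustrative condensation, assuming `langchain>=0.0.346`, `comet_llm` configured for Comet, and an OpenAI key):

```python
import os

from langchain.callbacks.tracers.comet import CometTracer
from langchain.llms import OpenAI

os.environ["OPENAI_API_KEY"] = "..."  # replace with your key

# The tracer is passed per call instead of to the LLM constructor
comet_callback = CometTracer()
llm = OpenAI(temperature=0.9, verbose=True)

llm_result = llm.generate(
    ["Tell me a joke", "Tell me a fact"],
    callbacks=[comet_callback],
)
print("LLM result", llm_result)

# Flush so every trace reaches Comet before the process exits
comet_callback.flush()
```
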
@@ -37,11 +37,12 @@
"cell_type": "code",
"execution_count": null,
"metadata": {
"id": "c-Dq42vt_Vbr"
"id": "c-Dq42vt_Vbr",
"scrolled": true
},
"outputs": [],
"source": [
"%pip install comet_ml torch transformers gradio shap"
"%pip install comet_ml torch transformers \"gradio>=4.0\" shap"
]
},
{
@@ -134,9 +135,9 @@
" workspace=workspace, project_name=project_name\n",
" )\n",
" experiment.log_other(\"Created from\", \"gradio-inference\")\n",
" \n",
"\n",
" message = f\"Started Experiment: [{experiment.name}]({experiment.url})\"\n",
" \n",
"\n",
" return (experiment, message)\n",
"\n",
" except Exception as e:\n",
@@ -167,7 +168,7 @@
" input_text = gr.Textbox(label=\"Input Text\", lines=5, interactive=True)\n",
" submit_btn = gr.Button(\"Submit\")\n",
"\n",
" output = gr.HTML(interactive=True)\n",
" output = gr.HTML()\n",
"\n",
" start_experiment_btn.click(\n",
" start_experiment, outputs=[experiment, experiment_status]\n",
@@ -220,9 +221,9 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.9.1"
"version": "3.10.12"
}
},
"nbformat": 4,
"nbformat_minor": 1
"nbformat_minor": 4
}
@@ -103,15 +103,15 @@
"import numpy as np\n",
"\n",
"import ray\n",
"from ray import tune, air\n",
"from ray import tune, air, train\n",
"from ray.air.integrations.comet import CometLoggerCallback\n",
"from ray.air.config import ScalingConfig\n",
"\n",
"\n",
"def train_function(config):\n",
" for i in range(30):\n",
" loss = config[\"mean\"] + config[\"sd\"] * np.random.randn()\n",
" tune.report(loss=loss)\n",
" train.report({\"loss\": loss})\n",
"\n",
"\n",
"def tune_function():\n",
@@ -240,7 +240,7 @@
"}\n",
"\n",
"\n",
"stop = {\"training_iteration\": 5}\n",
"stop = {\"training_iteration\": 10}\n",
"\n",
"tuner = tune.Tuner(\n",
" \"PPO\",\n",
@@ -260,13 +260,6 @@
"\n",
"check_learning_achieved(results, 100)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {
@@ -289,7 +282,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.11.3"
"version": "3.10.12"
}
},
"nbformat": 4,
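
For reference, a minimal sketch of the newer `train.report` pattern with the Comet logger for Ray Tune. The project name and search-space values below are illustrative, and it assumes Ray 2.x with `ray[air]` and `comet_ml` installed:

```python
import numpy as np

from ray import air, train, tune
from ray.air.integrations.comet import CometLoggerCallback


def train_function(config):
    # Report one "loss" value per iteration via the newer train.report API
    for _ in range(30):
        loss = config["mean"] + config["sd"] * np.random.randn()
        train.report({"loss": loss})


tuner = tune.Tuner(
    train_function,
    # Illustrative search space over the two config keys used above
    param_space={"mean": tune.grid_search([1, 2, 3]), "sd": tune.uniform(0.2, 0.8)},
    run_config=air.RunConfig(
        callbacks=[CometLoggerCallback(project_name="comet-ray-tune-example")]
    ),
)
tuner.fit()
```
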
30 changes: 30 additions & 0 deletions integrations/model-training/fastai/fastai-hello-world/README.md
@@ -0,0 +1,30 @@
# FastAI integration with Comet.ml

[fastai](https://github.com/fastai/fastai) is a deep learning library which provides practitioners with high-level components that can quickly and easily provide state-of-the-art results in standard deep learning domains, and provides researchers with low-level components that can be mixed and matched to build new approaches.

Instrument fastai with Comet to start managing experiments, create dataset versions and track hyperparameters for faster and easier reproducibility and collaboration.
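
The pattern is small: import `comet_ml`, create an `Experiment` before training, and end it when training finishes. A condensed sketch is below (it uses `ImageDataLoaders.from_folder` for brevity; the full `fastai_hello_world.py` in this folder builds the same pipeline explicitly with `Datasets`):

```python
import comet_ml
from fastai.vision.all import (
    ImageDataLoaders,
    URLs,
    error_rate,
    resnet18,
    untar_data,
    vision_learner,
)

# Create the Comet experiment before training starts
comet_ml.init(project_name="comet-examples-fastai-hello-world")
experiment = comet_ml.Experiment()

# MNIST Tiny keeps the run short; any fastai DataLoaders works
path = untar_data(URLs.MNIST_TINY)
dls = ImageDataLoaders.from_folder(path)

learn = vision_learner(dls, resnet18, pretrained=True, metrics=error_rate)
learn.fit_one_cycle(1)

experiment.end()
```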

## Documentation

For more information on using and configuring the fastai integration, see: [https://www.comet.com/docs/v2/integrations/ml-frameworks/fastai/](https://www.comet.com/docs/v2/integrations/ml-frameworks/fastai/?utm_source=comet-examples&utm_medium=referral&utm_campaign=github_repo_2023&utm_content=fastai)

## See it

Take a look at this [public Comet Project](https://www.comet.com/examples/comet-examples-fastai-hello-world/view/new/panels?utm_source=comet-examples&utm_medium=referral&utm_campaign=github_repo_2023&utm_content=fastai).

## Setup

Install dependencies

```bash
python -m pip install -r requirements.txt
```

## Run the example

This example fine-tunes a pretrained resnet18 model on the MNIST Tiny dataset for 5 epochs:

```bash
python fastai_hello_world.py
```
@@ -0,0 +1,39 @@
# coding: utf-8
import comet_ml

from fastai.vision.all import (
Categorize,
Datasets,
GrandparentSplitter,
IntToFloatTensor,
PILImageBW,
ToTensor,
URLs,
error_rate,
get_image_files,
parent_label,
resnet18,
untar_data,
vision_learner,
)

EPOCHS = 5

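# Configure Comet and create the experiment for this run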
comet_ml.init(project_name="comet-examples-fastai-hello-world")
experiment = comet_ml.Experiment()

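# Download the MNIST Tiny sample dataset and build the fastai DataLoaders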
path = untar_data(URLs.MNIST_TINY)

items = get_image_files(path)
tds = Datasets(
items,
[PILImageBW.create, [parent_label, Categorize()]],
splits=GrandparentSplitter()(items),
)
dls = tds.dataloaders(after_item=[ToTensor(), IntToFloatTensor()])

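# Fine-tune a pretrained resnet18, tracking the error rate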
learn = vision_learner(dls, resnet18, pretrained=True, metrics=error_rate)

learn.fit_one_cycle(EPOCHS)

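# End the experiment so all logged data is uploaded to Comet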
experiment.end()
@@ -0,0 +1,2 @@
comet_ml
fastai