Commit 61e0b2c: Merge branch 'main' into carmocca/remove-per-file-clis
carmocca authored May 8, 2024
2 parents 4830506 + 3dabe80
Showing 3 changed files with 50 additions and 1 deletion.
39 changes: 39 additions & 0 deletions .github/workflows/publish.yaml
@@ -0,0 +1,39 @@
# To create a release, create a tag and push it to GitHub:
# git tag -a "v0.0.1-beta" -m "beta version testing"
# git push --tags
# https://dev.to/iamtekson/publish-package-to-pypi-and-release-new-version-using-github-actions-108k
name: Publish LitGPT to PyPI

on:
  push:
    tags:
      - "v*"

jobs:
  build-n-publish:
    name: Build and publish to PyPI
    runs-on: ubuntu-latest
    environment:
      name: pypi
      url: https://pypi.org/p/litgpt
    permissions:
      id-token: write

    steps:
      - name: Checkout source
        uses: actions/checkout@v3

      - name: Set up Python
        uses: actions/setup-python@v4
        with:
          python-version: "3.x"

      - name: Build source and wheel distributions
        run: |
          python -m pip install --upgrade build twine
          python -m build
          twine check --strict dist/*

      - name: Publish distribution to PyPI
        uses: pypa/gh-action-pypi-publish@release/v1
        with:
          user: __token__
          password: ${{ secrets.PYPI_API_TOKEN }}
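As the comments at the top note, pushing a tag that matches `v*` is what triggers this workflow. The filter is a glob-style pattern; as a rough sketch (using Python's `fnmatch`, which behaves like GitHub's simple `v*` tag glob for these cases), here is which tags would fire the publish job:

```python
from fnmatch import fnmatch

# Tags starting with "v" match the workflow's `v*` filter and trigger
# the publish job; any other ref name is ignored.
cases = {
    "v0.0.1-beta": True,   # the example tag from the comment above
    "v1.2.3": True,
    "main": False,
    "release-1.0": False,
}
for tag, should_trigger in cases.items():
    assert fnmatch(tag, "v*") == should_trigger
```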
2 changes: 1 addition & 1 deletion pyproject.toml
@@ -37,7 +37,7 @@ all = [
"sentencepiece>=0.2.0", # llama-based models
"tokenizers>=0.15.2", # pythia, falcon, redpajama
"requests>=2.31.0", # litgpt.data
"litdata>=0.2.2", # litgpt.data
"litdata>=0.2.2,<0.2.6", # litgpt.data
"litserve>=0.1.0", # litgpt.deploy
"zstandard>=0.22.0", # litgpt.data.prepare_slimpajama.py
"pandas>=1.9.0", # litgpt.data.prepare_starcoder.py
10 changes: 10 additions & 0 deletions tests/test_lora.py
@@ -758,3 +758,13 @@ def test_lora_model_fsdp_init():
    model = fabric.setup(model)
    y = model(x)
    assert y.shape == torch.Size([2, 8, 512])

    # verify that all the parameters, buffers and other attributes aren't on `meta` device
    for m in model.modules():
        for p_name, parameter in m.named_parameters():
            assert not parameter.is_meta, f"Parameter `{p_name}` isn't materialized."
        for b_name, buffer in m._buffers.items():
            assert not buffer.is_meta, f"Buffer `{b_name}` isn't materialized."
        for attr_name, attr_value in m.__dict__.items():
            if isinstance(attr_value, torch.Tensor):
                assert not attr_value.is_meta, f"Attribute `{attr_name}` isn't materialized."
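The new assertions guard against tensors left on PyTorch's `meta` device after FSDP initialization. A standalone sketch of what `is_meta` reports (assuming a recent PyTorch with the `torch.device("meta")` context manager; variable names are illustrative):

```python
import torch

# Tensors on the "meta" device carry shape and dtype but no storage;
# a parameter still there after setup was never materialized.
real = torch.zeros(2, 3)
shell = torch.empty(2, 3, device="meta")
assert not real.is_meta
assert shell.is_meta

# Modules built under the meta device start with every parameter on it,
# which is exactly the state the test above asserts against.
with torch.device("meta"):
    layer = torch.nn.Linear(8, 8)
assert all(p.is_meta for p in layer.parameters())
```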
