
Allow Shortcutting Min-max Observer #887

Open · kylesayrs wants to merge 8 commits into main

Conversation

kylesayrs (Collaborator) commented on Nov 1, 2024

Purpose

Changes

  • Renamed MovingAverageMinMaxObserver -> MinMaxObserver, since a moving average is not required to use it
  • Shortcut the averaging logic by checking self.averaging_constant == 1.0 (see the sketch after this list)
  • Updated docstrings, etc.
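
A minimal sketch of the shortcut, assuming the observer keeps running min_val/max_val attributes updated via an exponential moving average (the class body below is illustrative, not the repo's exact implementation):

```python
import torch


class MinMaxObserver:
    """Tracks the min/max of observed tensors, optionally smoothed
    with an exponential moving average (sketch, not the repo code)."""

    def __init__(self, averaging_constant: float = 1.0):
        self.averaging_constant = averaging_constant
        self.min_val = None
        self.max_val = None

    def update(self, observed: torch.Tensor) -> None:
        min_val = torch.amin(observed)
        max_val = torch.amax(observed)

        if self.min_val is None or self.averaging_constant == 1.0:
            # Shortcut: with an averaging constant of 1.0 the moving
            # average reduces exactly to the latest min/max, so skip it
            self.min_val = min_val
            self.max_val = max_val
        else:
            # Exponential moving average of the observed range
            c = self.averaging_constant
            self.min_val = self.min_val + c * (min_val - self.min_val)
            self.max_val = self.max_val + c * (max_val - self.max_val)
```

With an averaging_constant of 1.0 this behaves as a plain min-max observer, which is why the MovingAverage prefix was dropped from the name.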

Testing

  • Ran examples/quantization_w4a16/llama3_example.py to completion

github-actions bot commented on Nov 1, 2024

👋 Hi! Thank you for contributing to llm-compressor. Please add the ready label when the PR is ready for review.

kylesayrs marked this pull request as ready for review on November 4, 2024 at 21:46
kylesayrs self-assigned this on Nov 4, 2024
```diff
 observer = Observer.load_from_registry(
-    observer, quantization_args=quantization_args
+    quantization_args.observer, quantization_args=quantization_args
```
A collaborator commented:
I think we should be consistent in how we're fetching the observer: either use the get_observer method, or remove it and fetch the observer directly as you're doing here.
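
For context, a sketch of the two fetching styles under discussion; the import path, the in-scope quantization_args object, and the assumption that get_observer returns the configured observer name are illustrative and may differ from the repo:

```python
from llmcompressor.observers import Observer  # import path assumed

# Style 1: go through the helper method on the args object
# (get_observer is assumed to return the configured observer name)
observer = Observer.load_from_registry(
    quantization_args.get_observer(), quantization_args=quantization_args
)

# Style 2 (this PR): read the field off quantization_args directly
observer = Observer.load_from_registry(
    quantization_args.observer, quantization_args=quantization_args
)
```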

kylesayrs (Collaborator, Author) replied:
Personally, I'm in favor of removing get_observer now that the observer refactor work is done.

#939
