feat:delayed_padatious_training #29
Conversation
Walkthrough

The changes in this pull request focus on enhancing the Padatious training workflow: training can now be deferred and triggered externally via a bus event, is guarded by a lock for thread safety, and emits additional debug logging throughout.
Actionable comments posted: 1
🧹 Outside diff range and nitpick comments (2)
ovos_padatious/opm.py (2)
195-195: Remove unnecessary `f` prefix in log message

The log message on line 195 does not contain any placeholders, so the `f` prefix is unnecessary. Apply this diff to remove the extraneous `f`:

```diff
- LOG.debug(f"Nothing new to train for padatious")
+ LOG.debug("Nothing new to train for padatious")
```

🧰 Tools
🪛 Ruff (0.8.0)
195-195: f-string without any placeholders — Remove extraneous `f` prefix (F541)
206-206: Remove unnecessary `f` prefix in log message

The log message on line 206 does not require an `f` prefix as it contains no placeholders. Apply this diff to correct the log statement:

```diff
- LOG.debug(f"Training complete for padatious!")
+ LOG.debug("Training complete for padatious!")
```

🧰 Tools
🪛 Ruff (0.8.0)
206-206: f-string without any placeholders — Remove extraneous `f` prefix (F541)
📜 Review details

Configuration used: CodeRabbit UI
Review profile: CHILL

📒 Files selected for processing (1)

- ovos_padatious/opm.py (5 hunks)
🧰 Additional context used

🪛 Ruff (0.8.0)

ovos_padatious/opm.py
- 195-195: f-string without any placeholders — Remove extraneous `f` prefix (F541)
- 206-206: f-string without any placeholders — Remove extraneous `f` prefix (F541)
🔇 Additional comments (4)

ovos_padatious/opm.py (4)

18-18: Importing `RLock` for thread safety

The addition of `RLock` from the `threading` module is appropriate for ensuring thread-safe operations during training.

93-93: Initializing `self.lock` with `RLock`

Initializing `self.lock = RLock()` ensures that access to shared resources is thread-safe within the `PadatiousPipeline` class.

124-124: Registering the `'padatious:train'` event handler

Adding the event handler for `'padatious:train'` allows the system to trigger training externally, which enhances flexibility.

200-208: Ensuring thread-safe training with `self.lock`

Wrapping the training loop with `self.lock` ensures that the training process is thread-safe, preventing potential race conditions.

🧰 Tools
🪛 Ruff (0.8.0)
206-206: f-string without any placeholders — Remove extraneous `f` prefix (F541)
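As a concrete illustration of the pattern these comments describe, here is a minimal, self-contained sketch of a reentrant lock serializing training. The class and method names are illustrative stand-ins, not the actual `ovos_padatious/opm.py` code:

```python
from threading import RLock, Thread


class TrainerSketch:
    """Illustrative stand-in for PadatiousPipeline's locking behavior."""

    def __init__(self):
        # RLock is reentrant: the thread holding it may acquire it again,
        # so train() can call locked helpers without deadlocking.
        self.lock = RLock()

    def train(self, message=None):
        # Concurrent triggers (skill loads, a 'padatious:train' bus event)
        # serialize here instead of racing on shared model state.
        with self.lock:
            self._train_one()

    def _train_one(self):
        with self.lock:  # re-acquiring the same RLock is safe
            pass  # real code would fit a single intent model here


if __name__ == "__main__":
    trainer = TrainerSketch()
    threads = [Thread(target=trainer.train) for _ in range(4)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
```

A plain `Lock` would deadlock on the nested acquisition above, which is why the review singles out `RLock` as the appropriate choice.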
Actionable comments posted: 1
🧹 Outside diff range and nitpick comments (1)
ovos_padatious/training_manager.py (1)
Line range hint 1-150: Architecture feedback on training optimization

The changes effectively support delayed training by:
- Adding comprehensive logging throughout the training pipeline
- Maintaining backward compatibility while deprecating single-threaded parameters
- Preserving core training logic while improving visibility
Consider adding structured logging with additional metadata (e.g., training duration, model size) to facilitate monitoring and debugging in production environments; a sketch follows the tooling notes below.
🧰 Tools
🪛 Ruff (0.8.0)
93-93: Do not use bare `except` (E722)
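One hedged way to act on that structured-logging suggestion; the logger name and JSON field names are assumptions for illustration, not existing `training_manager.py` code:

```python
import json
import logging
import time

LOG = logging.getLogger("padatious.training")


def train_with_metrics(intent_name: str, train_fn) -> None:
    """Run train_fn and emit a machine-parseable record of the run."""
    start = time.monotonic()
    train_fn()
    LOG.debug(json.dumps({
        "event": "intent_trained",
        "intent": intent_name,
        "duration_s": round(time.monotonic() - start, 3),
    }))


if __name__ == "__main__":
    logging.basicConfig(level=logging.DEBUG)
    train_with_metrics("hello.intent", lambda: time.sleep(0.1))
```

Emitting one JSON object per training run makes duration (and, if added, model size) trivial to aggregate in log-based monitoring, which plain-text debug lines do not.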
📜 Review details

Configuration used: CodeRabbit UI
Review profile: CHILL

📒 Files selected for processing (2)

- ovos_padatious/simple_intent.py (4 hunks)
- ovos_padatious/training_manager.py (2 hunks)
🧰 Additional context used

🪛 Ruff (0.8.0)

ovos_padatious/training_manager.py
- 93-93: Do not use bare `except` (E722)
🔇 Additional comments (3)

ovos_padatious/simple_intent.py (2)

29-29: LGTM: Modern Python class definition

The removal of explicit `object` inheritance aligns with Python 3 conventions.

72-74: LGTM: Enhanced training visibility with debug logging

The addition of sample counting and debug logging provides valuable insights into the training process:

- Tracks positive and negative training samples
- Logs training start and completion

Also applies to: 121-121, 128-128
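For flavor, a rough sketch of this kind of training visibility; the wording and function shape are illustrative, not the exact `simple_intent.py` implementation:

```python
import logging

LOG = logging.getLogger("padatious")


def train(name: str, pos_samples: list, neg_samples: list) -> None:
    # count samples up front so the start-of-training log is informative
    LOG.debug(f"training '{name}' with {len(pos_samples)} positive "
              f"and {len(neg_samples)} negative samples")
    # ... fit the model on the counted samples ...
    LOG.debug(f"training finished for '{name}'")
```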
ovos_padatious/training_manager.py (1)

76-76: LGTM: Improved debugging visibility

The added debug logging statements provide clear visibility into the intent loading and training process:

- Loading from cache
- Queuing for training

Also applies to: 91-91, 97-97
Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>
```diff
@@ -264,7 +270,9 @@ def _register_object(self, message, object_name, register_func):

         register_func(name, samples)

-        self.train(message)
+        self.finished_initial_train = False
+        if self.config.get("instant_train", True):
+            self.train(message)
```
for backwards compat, but we might want to consider changing this default value in the next breaking-change release; this impacts thread safety and should likely be False
@coderabbitai open an issue
add a config option to train after skills are loaded, not on every skill registration (see the sketch below)
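A hedged sketch of what that could look like: defer training until a readiness event fires instead of retraining on every skill registration. The `mycroft.ready` event name, the `instant_train` default of `False`, and the stub bus are assumptions for illustration, not the merged behavior:

```python
class _StubBus:
    """Tiny stand-in for the OVOS messagebus, for demonstration only."""

    def __init__(self):
        self._handlers = {}

    def on(self, event, handler):
        self._handlers.setdefault(event, []).append(handler)

    def emit(self, event):
        for handler in self._handlers.get(event, []):
            handler()


class DelayedTrainerSketch:
    def __init__(self, bus, config: dict):
        self.bus = bus  # assumed to expose .on(event, handler)
        self.config = config
        self.finished_initial_train = False
        # one deferred training pass once all skills have loaded
        self.bus.on("mycroft.ready", self.train)

    def register_intent(self, name, samples, message=None):
        # store the samples, but only retrain immediately if asked to
        self.finished_initial_train = False
        if self.config.get("instant_train", False):  # proposed default
            self.train(message)

    def train(self, message=None):
        # retrain whatever changed since the last pass
        self.finished_initial_train = True


if __name__ == "__main__":
    bus = _StubBus()
    trainer = DelayedTrainerSketch(bus, {"instant_train": False})
    trainer.register_intent("hello.intent", ["hi", "hello"])
    bus.emit("mycroft.ready")  # triggers the single deferred training pass
    print(trainer.finished_initial_train)  # True
```

With `instant_train` defaulting to `False`, N skill loads cost one training pass instead of N, and the lock-guarded `train()` is only contended once at startup.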
Summary by CodeRabbit

New Features
- Training can be deferred and triggered externally via the 'padatious:train' event, controlled by an `instant_train` config option.

Bug Fixes

Deprecations
- Marked the `padatious_config` property as deprecated, with a warning on direct access.