Is your feature request related to a problem? Please describe.
I'm always frustrated when I try to fine-tune the model with my own dataset. I'm currently using Moirai-R models for a feasibility study as part of a demand forecasting enhancement project. I find it challenging to configure parameters such as prediction length (PDT), context length (CTX), patch size (PSZ), batch size (BSZ), and test set length (TEST), and to set the 'offset', 'eval_length', 'prediction_lengths', 'context_lengths', and 'patch_sizes' fields in the YAML files under uni2ts/cli/conf/finetune/val_data. Each of these parameters has a significant impact on model performance, and optimizing them is both difficult and time-consuming. The problem is compounded by the small size of my dataset, which leaves little data for training and testing; with so few samples, tuning often ends in suboptimal performance and less robust predictions.
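For reference, the val_data YAML I am referring to has roughly this shape (a sketch based on the example configs in the repository; the dataset name and all numeric values below are placeholders for illustration, not recommended settings):

```yaml
# Illustrative sketch of a conf/finetune/val_data entry.
# Dataset name and all numbers are placeholders, not recommendations.
_target_: uni2ts.data.builder.ConcatDatasetBuilder
_args_:
  _target_: uni2ts.data.builder.simple.generate_eval_builders
  dataset: my_dataset_eval        # hypothetical dataset name
  offset: 1000                    # train/validation split point
  eval_length: 200                # length of the validation window
  prediction_lengths: [24, 48]    # candidate PDT values
  context_lengths: [200, 500]     # candidate CTX values
  patch_sizes: [16, 32]           # candidate PSZ values
```

What I am missing is guidance on how to choose these numbers, for example how 'offset' and 'eval_length' should relate to the total series length when the dataset is small.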
Describe the solution you'd like
I wonder if it would be possible to provide a comprehensive parameter configuration guideline tailored to dataset characteristics such as size. I would like documentation that walks users through setting these parameters for their specific datasets, recommending clearly defined parameter ranges and YAML settings. Specific setups could be provided for different scenarios, such as small datasets, large datasets, or applications requiring highly robust predictions. For example, the documentation might include guidance on reducing overfitting with small datasets or improving generalization with limited data points. A clear explanation of how each parameter affects the model's performance and predictions would also be helpful.
Describe alternatives you've considered
As an alternative, I tried adjusting the parameters manually through trial and error. This proved highly time-consuming and inefficient, requiring many iterations to identify good configurations, especially with a small dataset. I also considered grid search or random search to explore parameter combinations systematically. While these methods can find better configurations, they are computationally expensive and not ideal given my limited data. Finally, I followed the instructions in the README file of the GitHub repository to fine-tune a pre-trained model on my custom dataset, creating the data configuration files and defining hyperparameter ranges as part of the fine-tuning process. However, choosing appropriate ranges was difficult, and the resulting performance did not meet my expectations.
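To make the search alternative concrete, this is the kind of random search I experimented with (a minimal sketch; the `evaluate` function is a placeholder that would need to be replaced by an actual fine-tune-and-validate run, and the candidate ranges are made up for illustration):

```python
import itertools
import random

# Candidate ranges (illustrative values, not official recommendations).
prediction_lengths = [24, 48, 96]
context_lengths = [200, 500, 1000]
patch_sizes = [16, 32, 64]

def evaluate(pdt, ctx, psz):
    """Placeholder for a real fine-tune-and-validate run.

    In practice this would fine-tune the model with the given
    (PDT, CTX, PSZ) and return a validation loss. A dummy score
    is used here so the sketch runs end to end.
    """
    return abs(ctx / pdt - 10) + psz / 100

# Full grid, then a random subset as the search budget.
grid = list(itertools.product(prediction_lengths, context_lengths, patch_sizes))
random.seed(0)
trials = random.sample(grid, k=min(8, len(grid)))

best_cfg = min(trials, key=lambda cfg: evaluate(*cfg))
print("best (PDT, CTX, PSZ):", best_cfg)
```

Even with a sketch like this, each trial requires a full fine-tuning run, which is exactly why the approach is impractical for me and why documented starting ranges would help.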
Additional context
There are no additional details or screenshots related to this feature request.