This came up on a call. There was concern about a scenario where users run SuperpixelClassification on a folder of images, generate superpixels, and do some active learning, then add new images to the folder and rerun SuperpixelClassification.
Rerunning SuperpixelClassification will generate superpixels and features for the newly added images, but what if the parameters used to generate those features differ from those used in the initial run? What are the implications of this situation? Does it break training? Is there something we can do to prevent this or warn users?
Having different parameters shouldn't break training, but it will make predictions less consistent. Adding new images with different parameters and then labelling anything in those new images could degrade performance on the original images until enough additional labelling is done. I think we currently save the parameters used as a comment on the annotation for the initial epoch generated by the algorithm. A rerun should be able to read that comment and use the same parameters. We could expose this information in another location if that would be more useful.
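For reference, a minimal sketch of how a rerun (or a pre-run check) might read those saved parameters via girder_client and warn when they differ. This assumes the parameters are serialized as JSON in the annotation's description field; the storage location, key names, and the `warn_if_params_differ` helper are illustrative assumptions, not the confirmed format:

```python
import json

import girder_client

gc = girder_client.GirderClient(apiUrl='https://example.com/api/v1')
gc.authenticate(apiKey='...')


def get_saved_superpixel_params(item_id):
    """Return the parameter dict recorded with the initial epoch, or None.

    Assumes the parameters were stored as a JSON string in the description
    of one of the item's annotations (hypothetical storage format).
    """
    for record in gc.get('annotation', parameters={'itemId': item_id}):
        annotation = gc.get('annotation/%s' % record['_id'])
        desc = annotation['annotation'].get('description', '')
        try:
            return json.loads(desc)
        except ValueError:
            continue
    return None


def warn_if_params_differ(item_id, new_params):
    """Warn before regenerating features with different parameters."""
    saved = get_saved_superpixel_params(item_id)
    if saved is not None and saved != new_params:
        print('Warning: feature parameters differ from the initial run: '
              '%r vs %r' % (saved, new_params))
```

A check like this could run before feature generation on newly added images, either to warn the user or to default to the previously recorded parameters.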