This project compares the performance of three deep learning models on the WikiArt dataset:
- ResNet: Manually implemented.
- Xception: Pre-trained on ImageNet.
- DenseNet: Pre-trained on ImageNet.
Each model undergoes training, fine-tuning, and evaluation, with detailed results presented in Jupyter notebooks.
- Custom implementation of ResNet.
- Fine-tuning of pre-trained Xception and DenseNet models on the WikiArt dataset.
- Comparative analysis of training performance and final evaluation metrics.
- Visualization of results for insights into model performance.
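The fine-tuning workflow above follows the usual transfer-learning pattern: load an ImageNet-pretrained backbone, freeze it, and train a new classification head on WikiArt. A minimal sketch, assuming a Keras/TensorFlow setup and using DenseNet121 as the backbone (the exact architecture variant and the number of WikiArt classes here are assumptions, not taken from the notebooks):

```python
import tensorflow as tf

NUM_CLASSES = 27  # assumed number of WikiArt style classes; adjust to the dataset

# Load DenseNet121 pretrained on ImageNet, without its classification head.
base = tf.keras.applications.DenseNet121(
    weights="imagenet", include_top=False, input_shape=(224, 224, 3)
)
base.trainable = False  # freeze pretrained layers for the initial training phase

# Attach a new classification head for WikiArt.
model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dropout(0.3),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(
    optimizer=tf.keras.optimizers.Adam(1e-3),
    loss="sparse_categorical_crossentropy",
    metrics=["accuracy"],
)
```

After the head converges, a second phase typically unfreezes some or all of the base (`base.trainable = True`) and continues training with a much lower learning rate.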
Ensure you have the following:
- A Kaggle account.
- Internet access to upload and run notebooks on Kaggle.
Download the WikiArt Dataset and upload it to your Kaggle account's dataset section. Ensure that the dataset is accessible to the notebooks.
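On Kaggle, attached datasets are mounted read-only under `/kaggle/input`; the exact subfolder name depends on the slug you gave the upload. A quick sanity check from a notebook cell:

```python
import os

# Datasets attached via the notebook sidebar appear under /kaggle/input.
# The WikiArt folder name (slug) depends on how you named the upload.
wikiart_root = "/kaggle/input"
if os.path.isdir(wikiart_root):
    print(os.listdir(wikiart_root))  # the WikiArt folder should be listed here
else:
    print("Not running on Kaggle; attach the dataset in the notebook sidebar.")
```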
- Download the notebooks from the `notebooks/` directory:
  - `resnet-50.ipynb`
  - `xception-art-classifier.ipynb`
  - `densenet-art-classifier.ipynb`
- Import the notebooks to your Kaggle account:
- Go to Kaggle Notebooks.
- Create a new notebook or import an existing one.
- Upload the desired notebook file.
- Run the notebook:
- Ensure the WikiArt dataset is properly linked.
- Execute the cells to train, fine-tune, and evaluate the models.
Detailed results for each model, including training curves, validation metrics, and evaluation scores, can be generated by running the notebooks. Highlights include:
- Accuracy and loss plots.
- Confusion matrices.
- Comparative performance tables.
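A confusion matrix simply tallies (true class, predicted class) pairs, so each row shows how one true class was distributed across predictions. An illustrative NumPy sketch of that tally (the notebooks may instead use a library helper such as scikit-learn's):

```python
import numpy as np

def confusion_matrix(y_true, y_pred, n_classes):
    """Count (true, predicted) label pairs into an n_classes x n_classes grid."""
    cm = np.zeros((n_classes, n_classes), dtype=int)
    for t, p in zip(y_true, y_pred):
        cm[t, p] += 1  # row = true class, column = predicted class
    return cm

# Toy labels for three classes (not real WikiArt results)
y_true = [0, 0, 1, 2, 2, 2]
y_pred = [0, 1, 1, 2, 2, 0]
cm = confusion_matrix(y_true, y_pred, 3)
accuracy = np.trace(cm) / cm.sum()  # diagonal counts are correct predictions: 4/6
```

Off-diagonal cells reveal which art styles the model confuses with one another, which the accuracy number alone hides.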
Contributions are welcome! Please create an issue or submit a pull request if you have suggestions for improvements or new features.
This project is licensed under the MIT License. See the LICENSE file for more details.