Releases: Nikronic/niklib
v0.2.0
What's Changed
- Fix `readthedocs` not rendering docs because of missing core packages by adding requirements based on CPU hardware by @Nikronic in #9
- fix missing links, indentation and formatting by @Nikronic in #11
- Update dvc intro by @Nikronic in #12
- transition to toml based setup by @Nikronic in #13
- use predefined types (e.g. `Field`) instead of general `validator` from `pydantic` by @Nikronic in #14
- add github action workflows for building docs by @Nikronic in #15
- remove versioning as part of the library by @Nikronic in #16
- fix using library version file instead of setup one (`VERSION`) by @Nikronic in #18
- bump version to `v0.2.0` by @Nikronic in #17
Full Changelog: v0.1.0-alpha...v0.2.0
v0.1.0-alpha
What's Changed
- Fix logger hardcode by @Nikronic in #1
- add new logs and improved levels by @Nikronic in #3
- add conda virtual env guide and yml setup file by @Nikronic in #4
- read the docs integration by @Nikronic in #5
- add tutorials on how to build docs via sphinx by @Nikronic in #6
- Improving docs by @Nikronic in #7
- bump version to `v0.1.0-alpha` by @Nikronic in #8
New Contributors
Full Changelog: v0.0.1-alpha...v0.1.0-alpha
Initial release
Summary:
- add pip package setup: you can use `pip install -e .` for dev and `python setup.py sdist bdist_wheel` for releasing the package
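For context, a minimal `setup.py` of the kind those commands expect might look like the sketch below; the metadata here is illustrative and not copied from the repository:

```python
# setup.py -- illustrative sketch only, not the repository's actual file
from setuptools import find_packages, setup

setup(
    name="niklib",            # package name as published; remaining fields are assumptions
    version="0.0.1a0",        # placeholder version
    packages=find_packages(),
    install_requires=[],      # real dependencies live in the repo's requirements files
)
```

With such a file in place, `pip install -e .` installs the package in editable mode for development, and `python setup.py sdist bdist_wheel` builds the source and wheel distributions for release.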
`api` package
- `api.apps`: for running server apps (such as fastapi apps)
- `api.database`: database needed for API endpoints via sqlalchemy
- `api.models`: payload and response models for all endpoints, validated via pydantic
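As a rough illustration of that split (the class, field, and route names below are hypothetical, not niklib's actual API):

```python
# Illustrative sketch only: a pydantic payload/response pair and a fastapi app.
from fastapi import FastAPI
from pydantic import BaseModel, Field


class PredictionRequest(BaseModel):
    """Payload model validated by pydantic (the role of ``api.models``)."""
    age: int = Field(..., ge=0, description="applicant age in years")


class PredictionResponse(BaseModel):
    """Response model returned by an endpoint."""
    score: float


app = FastAPI()  # the kind of server app hosted under ``api.apps``


@app.post("/predict", response_model=PredictionResponse)
def predict(payload: PredictionRequest) -> PredictionResponse:
    # dummy logic; a real endpoint would talk to the database / model layer
    return PredictionResponse(score=float(payload.age) / 100.0)
```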
`configs` package
- `configs.core`: parsing different configs as JSON, used generally by all internal and 3rd-party modules
- `configs.data`: some sample data for usage demo
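A minimal sketch of the kind of JSON-config parsing `configs.core` is responsible for; the helper name and file path below are made up for illustration:

```python
# Hypothetical helper for parsing a JSON config usable by internal and 3rd-party modules.
import json
from pathlib import Path
from typing import Any, Dict


def load_config(path: Path) -> Dict[str, Any]:
    """Read a JSON config file into a plain dict."""
    with open(path, "r", encoding="utf-8") as f:
        return json.load(f)


# e.g. using bundled sample data for a demo run (the path is illustrative):
# cfg = load_config(Path("configs/data/sample.json"))
```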
`data` package
- `data.constant`: constants for domain-specific knowledge. Constant lists, dicts, and of course custom `Enum`s are used to properly represent const data and domain-specific info that needs to be shared across the entire library (see the sketch after this list)
- `data.functional`: contains some general functions that do nothing domain specific, hence can be used everywhere! Just some annoying things that I don't want to write again
- `data.logic`: how to apply different logics to data to extend, summarize, and transform it, via general or domain-specific knowledge
- `data.pdf`: methods for reading PDFs for conversion to tabular data. For now, Adobe XFA PDFs are supported
- `data.preprocessor`: all sorts of transforms that can be used for all sorts of data (not many methods yet); for now, handling files in general is covered
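For example, domain-specific constants in `data.constant` can be expressed roughly like this; the names below are invented for illustration:

```python
# Hypothetical constants of the style hosted in ``data.constant``.
from enum import Enum


class MaritalStatus(Enum):
    """A custom Enum so the same domain values are shared across the whole library."""
    SINGLE = "single"
    MARRIED = "married"
    DIVORCED = "divorced"


# plain constant containers serve the same purpose for simpler cases
CURRENCY_SYMBOLS = {"USD": "$", "EUR": "€"}  # illustrative constant dict
```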
`utils` package
- `utils.loggers`: a general `Logger` class that redirects all `logging` into the desired stream (internal and 3rd-party) and adds an interface for saving artifacts in the `mlflow` artifact directory. Also has reset methods so the same logger can be reused for multiple experiments without further configuration, in an isolated manner (each experiment has its own numbered dirs and files); a minimal sketch of the idea follows this list
- `utils.visualization`: some helper methods that might be used for any visualization. No main plot here; you create the main plot first, then call these methods to modify it (such as changing the font size for only a single plot)
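The core idea behind `utils.loggers` can be sketched as follows; this is not the library's actual `Logger` API, just the pattern of redirecting `logging` output and storing it as an `mlflow` artifact:

```python
# Sketch of the idea only: route all ``logging`` records to one file,
# then attach that file to the current mlflow run's artifact directory.
import logging

import mlflow

handler = logging.FileHandler("experiment.log")
logging.getLogger().addHandler(handler)      # root logger catches 3rd-party logs too
logging.getLogger().setLevel(logging.INFO)

logging.getLogger("my_module").info("training started")

with mlflow.start_run():
    mlflow.log_artifact("experiment.log")    # saved into the run's artifact directory
```

Because the handler sits on the root logger, records from 3rd-party libraries end up in the same file, which is then attached to the run's artifacts.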
`models` package
- `models.estimators`: hosts wrappers around 3rd-party model libs and also all models defined as estimators. Note that only model definitions should be here, no training or evaluation scripts
- `models.estimators.networks`: hosts only neural network estimators (since they are large and common and might need many tweaks)
- `models.evaluators`: hosts config files, evaluation scripts, and custom or wrapped 3rd-party metrics (this is where metrics and evaluation for `models.estimators` can be hosted)
- `models.evaluators.metrics`: wrappers around 3rd-party metrics (sklearn metrics) or custom metrics for all sorts of `estimators` and `trainers`
- `models.preprocessors`: hosts preprocessing only needed for direct model usage. For instance, no feature engineering here, only things similar to sklearn transformations (one-hot) or categorization that are model specific. Also, users can create wrappers for popular libraries such as PyTorch and sklearn if needed
- `models.preprocessors.core`: contains methods and classes that can be used across all model preprocessors. Given that sklearn duck typing is so popular, most of the methods target it (a duck-typed transformer sketch follows this list)
- `models.preprocessors.helpers`: helpers for all modules of preprocessors. E.g. for now it contains a preview for transformations done in `models.preprocessors.core`
- `models.trainers`: contains configs and scripts for training (similar to `models/preprocessors/evaluators`)
- `models.trainers.aml_flaml`: a wrapper around the `flaml` AutoML framework. Since `flaml` is a low-code library and contains the training script itself, for now this module only has extensions for built-in methods. E.g. `flaml` uses a wrapper around sklearn models, so there is `find_estimator` to extract the underlying model; similarly, `log_mlflow_model` exists to log the underlying sklearn-based model as an `mlflow` model, since there is no mlflow flavor that matches `flaml` (but sklearn is supported). See the flaml sketch after this list
- `models.preprocessors.weights`: hosts methods for saving, loading, and customizing weights. Note that usually all libs and frameworks have a built-in method for saving and loading weights, but if any customization or simplification is needed, it should be hosted here
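To make the sklearn duck-typing point for `models.preprocessors.core` concrete, a transformer in that style might look like the hypothetical sketch below (the class and column handling are invented for illustration):

```python
# Hypothetical sklearn duck-typed transformer, the style ``models.preprocessors.core`` targets.
import pandas as pd
from sklearn.base import BaseEstimator, TransformerMixin


class ColumnOneHot(BaseEstimator, TransformerMixin):
    """Model-specific one-hot encoding of a single categorical column."""

    def __init__(self, column: str):
        self.column = column

    def fit(self, X: pd.DataFrame, y=None):
        # learn the categories once so transform stays consistent across splits
        self.categories_ = sorted(X[self.column].dropna().unique())
        return self

    def transform(self, X: pd.DataFrame) -> pd.DataFrame:
        out = X.copy()
        for cat in self.categories_:
            out[f"{self.column}_{cat}"] = (out[self.column] == cat).astype(int)
        return out.drop(columns=[self.column])
```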
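And for `models.trainers.aml_flaml`, the sketch below shows the underlying idea that helpers like `find_estimator` and `log_mlflow_model` build on; it uses plain `flaml` and `mlflow` calls and is not the module's actual implementation:

```python
# Sketch of the idea: flaml wraps sklearn models, so the sklearn estimator can be
# extracted and logged with mlflow's sklearn flavor (there is no flaml flavor).
import mlflow
import mlflow.sklearn
from flaml import AutoML
from sklearn.datasets import load_iris

X, y = load_iris(return_X_y=True)

automl = AutoML()
automl.fit(X, y, task="classification", time_budget=10)

# flaml's best model is a wrapper; the sklearn estimator sits one level down
sk_estimator = automl.model.estimator

with mlflow.start_run():
    mlflow.sklearn.log_model(sk_estimator, "model")
```

Since `automl.model` is flaml's wrapper around an sklearn-style estimator, pulling out `.estimator` yields an object that mlflow's sklearn flavor can log directly.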