[Decimal] precision fixed (#75)
* [Decimal] precision fixed

* [Decimal Issue] more specs

* [Decimal Issue] lint

* [Decimal Issue] verbose lint

* [Decimal Issue] src code source root folder

* [Decimal Issue] config update
ome9ax authored Sep 2, 2022
1 parent 48e4993 commit 6b56426
Showing 11 changed files with 196 additions and 59 deletions.
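At its core, the commit drops the target's own float-to-Decimal conversion of every record and instead adjusts the Decimal precision implied by each stream's schema once, before the jsonschema validator is built. A minimal sketch of that flow as I read the diff, assuming the `jsonschema` and `adjust-precision-for-schema` packages are installed; the schema and record below are made up for illustration and do not come from the repository:

```python
from jsonschema import Draft4Validator, FormatChecker
from adjust_precision_for_schema import adjust_decimal_precision_for_schema

# Toy stream schema (hypothetical, not from the repository).
schema = {
    "type": "object",
    "properties": {
        "id": {"type": "integer"},
        "amount": {"type": "number"},
    },
}

# As I understand the package, this inspects the schema (e.g. any fine-grained
# 'multipleOf' constraints) and raises the Decimal context precision
# accordingly; it replaces the removed float_to_decimal() walk over every
# record. This toy schema carries no such constraint, so the call is benign.
adjust_decimal_precision_for_schema(schema)

# Records are now validated as-is, with no per-record float-to-Decimal pass.
validator = Draft4Validator(schema, format_checker=FormatChecker())
validator.validate({"id": 1, "amount": 1.25})
```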
9 changes: 5 additions & 4 deletions .github/workflows/python-package.yml
@@ -68,7 +68,7 @@ jobs:
python -m venv venv || virtualenv venv
. venv/bin/activate
pip install --upgrade pip # setuptools
pip install .[test,lint,dist]
pip install tox
- name: Get pip cache dir
id: pip-cache
@@ -87,11 +87,11 @@
. venv/bin/activate
# stop the build if there are Python syntax errors or undefined names
# exit-zero treats all errors as warnings. The GitHub editor is 255 chars wide
flake8 --show-source --statistics --count
TOX_PARALLEL_NO_SPINNER=1 tox -e lint
- name: Test
run: |
. venv/bin/activate
pytest
TOX_PARALLEL_NO_SPINNER=1 tox -e py
- name: Upload coverage test results to Codecov
uses: codecov/codecov-action@v2
if: |
@@ -109,7 +109,8 @@ jobs:
- name: Build distribution package
run: |
. venv/bin/activate
python setup.py sdist bdist_wheel
pip install build
python -m build
ls -l dist
- name: Publish distribution package to TestPyPI
if: github.event_name == 'push' && startsWith(github.ref, 'refs/tags')
6 changes: 3 additions & 3 deletions README.md
@@ -1,9 +1,9 @@
# target-s3-jsonl

![GitHub - License](https://img.shields.io/github/license/ome9ax/target-s3-jsonl)
[![GitHub - License](https://img.shields.io/github/license/ome9ax/target-s3-jsonl)](https://github.com/ome9ax/target-s3-jsonl/blob/main/LICENSE)
[![Python package builder](https://github.com/ome9ax/target-s3-jsonl/workflows/Python%20package/badge.svg)](https://github.com/ome9ax/target-s3-jsonl/actions)
[![codecov](https://codecov.io/gh/ome9ax/target-s3-jsonl/branch/main/graph/badge.svg?token=KV0cn4jKs2)](https://codecov.io/gh/ome9ax/target-s3-jsonl)
[![PyPI - Python Version](https://img.shields.io/pypi/pyversions/target-s3-jsonl.svg)](https://pypi.org/project/target-s3-jsonl/)
[![PyPI - Python Version](https://img.shields.io/pypi/pyversions/target-s3-jsonl.svg)](https://pypi.org/project/target-s3-jsonl)
[![PyPI version](https://badge.fury.io/py/target-s3-jsonl.svg)](https://badge.fury.io/py/target-s3-jsonl)
[![PyPi project installs](https://img.shields.io/pypi/dm/target-s3-jsonl.svg?maxAge=2592000&label=installs&color=%2327B1FF)](https://pypistats.org/packages/target-s3-jsonl)

@@ -12,7 +12,7 @@ following the [Singer spec](https://github.com/singer-io/getting-started/blob/ma

## How to use it

`target-s3-jsonl` is a [Singer](https://singer.io) Target which intends to work with regular [Singer](https://singer.io) Taps. It takes the output of the tap and exports it as [JSON Lines](http://jsonlines.org/) files.
`target-s3-jsonl` is a [Singer](https://singer.io) Target which intends to work with regular [Singer](https://singer.io) Taps. It takes the output of the tap and exports it as [JSON Lines](http://jsonlines.org/) files into an AWS S3 bucket.

## Install

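The README description above covers the data flow at a high level: a Singer tap writes JSON messages to stdout, and the target reads them from stdin and writes the record payloads out as JSON Lines files in S3. A hypothetical example of such a message stream, built in Python (the stream and field names are made up, not taken from the repository):

```python
import json

# A hypothetical Singer message stream as a tap would emit it; each line is
# one JSON object, and the target writes the RECORD payloads out as JSON Lines.
messages = [
    {"type": "SCHEMA", "stream": "users",
     "schema": {"type": "object",
                "properties": {"id": {"type": "integer"},
                               "name": {"type": "string"}}},
     "key_properties": ["id"]},
    {"type": "RECORD", "stream": "users", "record": {"id": 1, "name": "Ada"}},
    {"type": "STATE", "value": {"bookmarks": {"users": {"id": 1}}}},
]

# Printed to stdout, these lines would be piped into the target, for example:
#   python tap.py | target-s3-jsonl --config config.json   (illustrative only)
for message in messages:
    print(json.dumps(message))
```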
46 changes: 45 additions & 1 deletion pyproject.toml
@@ -1,6 +1,50 @@
[build-system]
requires = [
"setuptools>=42",
"setuptools",
"wheel"
]
build-backend = "setuptools.build_meta"

[tool.mypy]
show_error_context = true
ignore_missing_imports = true

files = "."

exclude = [
"setup.py",
"docs.*",
"docs.conf.*",
"build",
"venv"
]

# 'strict = true' is equivalent to the following:
check_untyped_defs = true
disallow_incomplete_defs = true
disallow_subclassing_any = true
disallow_untyped_decorators = true
disallow_untyped_defs = true
warn_redundant_casts = true
warn_unused_configs = true
warn_unused_ignores = true
disallow_untyped_calls = true
no_implicit_reexport = true
strict_equality = true

# The following need to have changes made to be able to enable them:
# disallow_any_generics = true
# no_implicit_optional = true
# warn_return_any = true

[[tool.mypy.overrides]] # Overrides for currently untyped modules
module = [
"target_s3_jsonl.*"
]

[[tool.mypy.overrides]] # Overrides for currently untyped modules
ignore_errors = true

module = [
"tests.*"
]
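The new `[tool.mypy]` table turns on most of mypy's strict checks for the package, with `tests.*` excluded through the second override. As a rough illustration of what flags such as `disallow_untyped_defs` and `check_untyped_defs` demand, any function in `target_s3_jsonl` now needs full annotations; the helper below is hypothetical, not taken from the package:

```python
from typing import Any, Dict


# Hypothetical helper, for illustration only. Under 'disallow_untyped_defs'
# a bare 'def build_key(stream, config):' would be rejected; parameters and
# the return type must be annotated.
def build_key(stream: str, config: Dict[str, Any]) -> str:
    prefix: str = config.get("s3_key_prefix", "")
    return f"{prefix}{stream}.jsonl"
```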
140 changes: 130 additions & 10 deletions setup.cfg
@@ -16,51 +16,171 @@ classifiers =
Programming Language :: Python :: 3.8
Programming Language :: Python :: 3.9
Programming Language :: Python :: 3.10
project_urls =
# Documentation = https://ome9ax.github.io/target-s3-jsonl
Releases = https://github.com/ome9ax/target-s3-jsonl/releases
Changelog = https://github.com/ome9ax/target-s3-jsonl/blob/main/CHANGELOG.rst
Issue Tracker = https://github.com/ome9ax/target-s3-jsonl/issues

[options]
packages = find:
py_modules = target_s3_jsonl
package_dir =
= src
# py_modules = target_s3_jsonl
python_requires = >=3.8
# install_requires = file: requirements.txt
# install_requires =
# jsonschema==4.9.1
# boto3==1.24.52
# backoff==2.1.2
include_package_data = True
platforms = any

[options.entry_points]
console_scripts =
target-s3-jsonl = target_s3_jsonl:main

[options.package_data]
target_s3_jsonl = logging.conf

[options.packages.find]
where = src
exclude =
tests

[options.extras_require]
test =
pytest-cov
moto[s3,sts]
lint = flake8
dist = wheel

[options.packages.find]
exclude =
tests

[options.entry_points]
console_scripts =
target-s3-jsonl = target_s3_jsonl:main
static = mypy
dist = build
deploy = twine

[tool:pytest]
addopts = -v --cov=target_s3_jsonl --cov-fail-under 95 --cov-report annotate --cov-report xml --cov-report term --cov-report html:htmlcov --doctest-modules
testpaths = tests
asyncio_mode = auto

[coverage:run]
branch = True
omit =
./setup.py
tests/.*
venv/*

[coverage:report]
show_missing = True
skip_covered = False
exclude_lines =
if __name__ == .__main__.:

[flake8]
count = True
show-source = True
statistics = True
extend-exclude = venv
build
ignore = C901
max-line-length = 160
max-complexity = 10

[build_sphinx]
builder = html
warning-is-error = true
# keep-going = true
project = 'Target S3 Jsonl'
version = attr: target_s3_jsonl.__version__
release = attr: target_s3_jsonl.__version__
source-dir = 'docs'

[tox:tox]
passenv = TOXENV TOX_* CI_* GITLAB_*
# requires = tox-pipenv
envlist = py{38,39,310}
# labels =
# test = py{39,310,pi}
# static = flake8, mypy
# envlist = .virtualenvs/target-s3-jsonl
isolated_build = True
# skipsdist = false
# parallel_show_output=True

# requires = tox-pip-extensions
# tox_pip_extensions_ext_venv_update = true

[testenv]
usedevelop = True
extras = test
commands = pytest {posargs}
# publish: python setup.py sdist upload --sign -r pypi
# publish: echo Publish that

[testenv:lint]
usedevelop = True
skip_install = true
deps = flake8
commands = flake8 {posargs}

[testenv:static]
usedevelop = True
skip_install = true
deps = mypy
commands = mypy {posargs}

[testenv:coverage]
usedevelop = True
passenv = CODECOV_TOKEN CI_*
skip_install = true
deps = codecov
# allowlist_externals = gpg
# install_command = echo Install codecov {packages}
# curl https://keybase.io/codecovsecurity/pgp_keys.asc | gpg --no-default-keyring --keyring trustedkeys.gpg --import # One-time step
# curl -Os https://uploader.codecov.io/latest/linux/codecov
# curl -Os https://uploader.codecov.io/latest/linux/codecov.SHA256SUM
# curl -Os https://uploader.codecov.io/latest/linux/codecov.SHA256SUM.sig
# gpgv codecov.SHA256SUM.sig codecov.SHA256SUM
# shasum -a 256 -c codecov.SHA256SUM
# chmod +x ./codecov
commands =
codecov \
--file "{toxinidir}/coverage.xml" \
--name "codecov-$CI_PROJECT_NAME" \
--branch "$CI_COMMIT_BRANCH" \
--commit "$CI_COMMIT_SHA" \
--tag "$CI_COMMIT_TAG" \
--flags "unittests" {posargs} || echo 'Codecov upload failed'

[testenv:docs]
# https://packaging-guide.openastronomy.org/en/latest/docs.html
# Init
# sphinx-quickstart docs; cd docs
# edit index.rst >>> add modules
# sphinx-apidoc -o docs .
# sphinx-apidoc -o /source/_modules src
# sphinx-build docs docs/_build/html -W -j auto --color -b html
description = Invoke sphinx-build to build the HTML docs
usedevelop = True
extras = docs
# commands_pre = sphinx-build docs/source "{toxworkdir}/docs_out" -d "{toxworkdir}/docs_doctree" -b doctest {posargs:-E}
# commands = sphinx-build docs docs/_build/html -W -j auto --color -Ea -b html {posargs}
commands = sphinx-build docs/source "{toxworkdir}/docs_out" -d "{toxworkdir}/docs_doctree" -W -j auto --color -b html {posargs}
commands_post = python -c 'import pathlib; print("documentation available under file://\{0\}".format(pathlib.Path(r"{toxworkdir}") / "docs_out" / "index.html"))'
# sphinx-build docs/source "{toxworkdir}/docs_out" -d "{toxworkdir}/docs_doctree" -b linkcheck {posargs:-E}

[testenv:dist]
deps = build
commands = python -m build

[testenv:deploy]
usedevelop = True
skip_install = true
# depends = dist
passenv = TWINE_*
deps =
build
twine
commands_pre =
python -m build
twine check dist/*
commands = twine upload --skip-existing {posargs} dist/*
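Among the `[options]` changes, the package now lives under a `src/` layout (`package_dir` points at `src` and packages are discovered below it), while the `console_scripts` entry point keeps mapping the `target-s3-jsonl` command to `target_s3_jsonl:main`. Roughly, the wrapper the installer generates for that entry point behaves like this sketch (the real wrapper is produced by pip/setuptools, not checked into the repository):

```python
# Sketch of what the generated 'target-s3-jsonl' command does for the
# entry point 'target-s3-jsonl = target_s3_jsonl:main'; not the actual
# generated file.
import sys

from target_s3_jsonl import main

if __name__ == "__main__":
    sys.exit(main())
```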
1 change: 1 addition & 0 deletions setup.py
@@ -4,6 +4,7 @@

setup(
install_requires=[
'adjust-precision-for-schema',
'jsonschema==4.14.0',
'boto3==1.24.62',
'backoff==2.1.2'
26 changes: 7 additions & 19 deletions target_s3_jsonl/__init__.py → src/target_s3_jsonl/__init__.py
@@ -1,6 +1,6 @@
#!/usr/bin/env python3

__version__ = '1.2.1'
__version__ = '1.2.2'

import argparse
import gzip
@@ -13,7 +13,7 @@
from uuid import uuid4

from jsonschema import Draft4Validator, FormatChecker
from decimal import Decimal
from adjust_precision_for_schema import adjust_decimal_precision_for_schema

from target_s3_jsonl import s3
from target_s3_jsonl.logger import get_logger
@@ -82,18 +82,6 @@ def emit_state(state):
sys.stdout.flush()


def float_to_decimal(value):
'''Walk the given data structure and turn all instances of float into
double.'''
if isinstance(value, float):
return Decimal(str(value))
if isinstance(value, list):
return [float_to_decimal(child) for child in value]
if isinstance(value, dict):
return {k: float_to_decimal(v) for k, v in value.items()}
return value


def get_target_key(stream, config, timestamp=None, prefix=None):
'''Creates and returns an S3 key for the stream'''

@@ -149,7 +137,7 @@ def persist_lines(messages, config, save_records=save_jsonl_file):

record_to_load = o['record']
# NOTE: Validate record
validators[stream].validate(float_to_decimal(record_to_load))
validators[stream].validate(record_to_load)

if config.get('add_metadata_columns'):
record_to_load = add_metadata_values_to_record(o, {}, now)
@@ -170,11 +158,11 @@ def persist_lines(messages, config, save_records=save_jsonl_file):
if 'stream' not in o:
raise Exception("Line is missing required key 'stream': {}".format(message))
stream = o['stream']
schemas[stream] = o['schema']

if config.get('add_metadata_columns'):
schemas[stream] = add_metadata_columns_to_schema(o)
else:
schemas[stream] = float_to_decimal(o['schema'])
schemas[stream] = add_metadata_columns_to_schema(o) if config.get('add_metadata_columns') else o

adjust_decimal_precision_for_schema(schemas[stream])

# NOTE: prevent exception *** jsonschema.exceptions.UnknownType: Unknown type 'SCHEMA' for validator.
# 'type' is a key word for jsonschema validator which is different from `{'type': 'SCHEMA'}` as the message type.
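The `UnknownType` comment at the end of the hunk is worth spelling out: in a Singer SCHEMA message the top-level `type` key is the message type, while inside the `schema` payload `type` is a JSON Schema keyword, so only the payload may be handed to the validator. A hypothetical message showing the two:

```python
# Hypothetical Singer SCHEMA message; stream and field names are made up.
message = {
    "type": "SCHEMA",                # Singer message envelope type
    "stream": "users",
    "schema": {                      # the JSON Schema used for validation
        "type": "object",            # jsonschema 'type' keyword
        "properties": {"id": {"type": "integer"}},
    },
    "key_properties": ["id"],
}

# Passing the whole message to jsonschema would make it read "SCHEMA" as a
# JSON type name (the UnknownType error the source comment refers to); only
# message["schema"] should reach Draft4Validator.
```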
File renamed without changes.
File renamed without changes.
File renamed without changes.