feat: add decimal argument support to round function #713

Open · wants to merge 1 commit into base: main
Conversation

andrew-coleman (Contributor):

The round function has a number of variants to support different numeric types. This commit adds support for rounding decimals. This is required for the spark module.

Signed-off-by: Andrew Coleman <[email protected]>
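
For context (not part of the commit), a minimal Python sketch of the semantics being added, using the standard decimal module; the name `round_decimal` and the exact result handling are illustrative assumptions, not the Substrait definition:

```python
from decimal import Decimal, ROUND_HALF_EVEN

def round_decimal(value: Decimal, s: int, rounding=ROUND_HALF_EVEN) -> Decimal:
    """Illustrative only: round `value` to `s` decimal places.

    The actual Substrait semantics (result type, rounding options) are
    defined in the YAML extension, not here.
    """
    # Decimal.quantize rounds to the exponent of the supplied quantum,
    # e.g. Decimal("0.01") means two digits after the decimal point.
    return value.quantize(Decimal(1).scaleb(-s), rounding=rounding)

print(round_decimal(Decimal("3.14159"), 2))   # 3.14
print(round_decimal(Decimal("1234.5"), -2))   # 1.2E+3, i.e. 1200, rounded to the hundreds
```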
and this value cannot be exactly represented, this specifies how
to round it.

- TIE_TO_EVEN: round to nearest value; if exactly halfway, tie
Member:

Can this happen with decimal representations? I'd argue all of the floating point handling stuff here does not apply.

andrew-coleman (Contributor, Author):

Why would these not apply? For example, the input value 2.5 could be represented exactly in a decimal type, but rounding it to the nearest integer would result in 2 if the rounding mode is TIE_TO_EVEN or 3 if the mode is TIE_AWAY_FROM_ZERO.
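
For illustration (Python's decimal module, not the Substrait spec itself), the two tie-breaking modes do diverge on a value that a decimal type represents exactly:

```python
from decimal import Decimal, ROUND_HALF_EVEN, ROUND_HALF_UP

x = Decimal("2.5")   # exactly representable as a decimal

# TIE_TO_EVEN: ties go to the nearest even digit -> 2
print(x.quantize(Decimal("1"), rounding=ROUND_HALF_EVEN))  # 2

# TIE_AWAY_FROM_ZERO: ties go away from zero -> 3 (Python calls this ROUND_HALF_UP)
print(x.quantize(Decimal("1"), rounding=ROUND_HALF_UP))    # 3
```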

@@ -268,3 +268,43 @@ scalar_functions:
AWAY_FROM_ZERO, TIE_DOWN, TIE_UP, TIE_TOWARDS_ZERO, TIE_TO_ODD ]
nullability: DECLARED_OUTPUT
return: fp64?
- args:
Member:

All other decimal functionality has been placed in _decimal.yaml files. Not sure if we want to have just this one function in a file by itself, though.


When `s` is a negative number, the rounding is
performed to the left side of the decimal point
as specified by `s`.
Member:

Does this operation affect the scale? We should probably clarify that here.

andrew-coleman (Contributor, Author):

I guess this function could return a different decimal type (i.e. reduce the precision and scale parameters), but I was working on the assumption that it would just return a different value of the same type. I'm not sure if that is what you are asking.
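
To make the two readings concrete, here is a sketch using Python's decimal module (purely illustrative; the quantize calls stand in for whatever type derivation the YAML would specify):

```python
from decimal import Decimal, ROUND_HALF_EVEN

x = Decimal("123.456")   # think decimal<6, 3>, rounded with s = 1

# Reading 1: the output type is unchanged (scale stays 3), only the value changes.
same_type = x.quantize(Decimal("0.1"), rounding=ROUND_HALF_EVEN).quantize(Decimal("0.001"))
print(same_type)   # 123.500 -> still three fractional digits

# Reading 2: the output scale follows s, so the declared precision/scale shrink.
new_type = x.quantize(Decimal("0.1"), rounding=ROUND_HALF_EVEN)
print(new_type)    # 123.5 -> one fractional digit
```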

jacques-n (Contributor), Oct 16, 2024:

Let's not assume. Let's add some expected behaviors here. Once we get tests inside core, we can transplant those into test cases. And if it is Spark's behavior we're trying to match, we should probably just put this in a spark function file (or name it spark_round here). Decimal behavior is often quite different between systems.
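
One way to pin down the Spark behavior in question (a hedged PySpark sketch, assuming a local Spark session is available; not part of this PR) is to round a DecimalType column and inspect the schema Spark assigns to the result:

```python
from decimal import Decimal

from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import DecimalType, StructField, StructType

spark = SparkSession.builder.master("local[1]").getOrCreate()

schema = StructType([StructField("x", DecimalType(6, 3))])
df = spark.createDataFrame([(Decimal("123.456"),)], schema)

# Whatever precision/scale shows up here is the behavior a spark_round
# (or a spark-specific YAML entry) would need to document and test.
rounded = df.select(F.round("x", 1).alias("r"))
rounded.printSchema()
rounded.show()
```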
