fix: Only delegate to DataFusion cast when we know that it is compatible with Spark #461
Conversation
// TODO need to add tests to see if we really do support all
// timestamp to timestamp conversions
I filed #467 for adding timestamp to timestamp tests
This is ready for review now @viirya @parthchandra @kazuyukitanimura @huaxingao
let array = match &from_type {
    DataType::Dictionary(key_type, value_type)
        if key_type.as_ref() == &DataType::Int32
            && (value_type.as_ref() == &DataType::Utf8
                || value_type.as_ref() == &DataType::LargeUtf8) =>
    {
        cast_with_options(&array, value_type.as_ref(), &CAST_OPTIONS)?
    }
    _ => array,
};
We were previously unpacking dictionary-encoded string arrays only for string to int and string to date. I just moved the unpacking earlier so that we don't have to handle it specifically for each individual cast from string.
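The idea of unpacking dictionary-encoded strings before dispatch can be modeled without Arrow. This is a simplified sketch under assumptions: the `DataType` enum and `unpack_dictionary` function below are hypothetical stand-ins, not the real arrow-rs types. It shows how a dictionary of Int32 keys over string values collapses to its value type once, so later match arms never need a dictionary-specific case:

```rust
// Simplified stand-in for arrow's DataType; only the variants needed here.
#[derive(Debug, Clone, PartialEq)]
enum DataType {
    Int32,
    Utf8,
    LargeUtf8,
    Dictionary(Box<DataType>, Box<DataType>),
}

/// Resolve the type a cast should actually dispatch on: a dictionary of
/// Int32 keys over (Large)Utf8 values is unpacked to its value type.
fn unpack_dictionary(from_type: &DataType) -> DataType {
    match from_type {
        DataType::Dictionary(key_type, value_type)
            if **key_type == DataType::Int32
                && (**value_type == DataType::Utf8
                    || **value_type == DataType::LargeUtf8) =>
        {
            (**value_type).clone()
        }
        other => other.clone(),
    }
}

fn main() {
    let dict = DataType::Dictionary(Box::new(DataType::Int32), Box::new(DataType::Utf8));
    // The dictionary case dispatches as plain Utf8; other types pass through.
    assert_eq!(unpack_dictionary(&dict), DataType::Utf8);
    assert_eq!(unpack_dictionary(&DataType::Int32), DataType::Int32);
}
```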
A few questions
| DataType::Int64
| DataType::Float32
| DataType::Float64
| DataType::Utf8
So right now, there is no Int8 to Decimal128 cast supported, it looks like?
The current code says that DataFusion is compatible with Spark for all int types -> decimal:

DataType::Int8 | DataType::Int16 | DataType::Int32 | DataType::Int64 => matches!(
    to_type,
    DataType::Boolean
    ...
    | DataType::Decimal128(_, _)

However, this is actually not correct, since DataFusion does not perform overflow checks for int32 and int64 -> decimal and is therefore not compatible with Spark. I will look at removing those.
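The overflow concern can be made concrete with a small standalone sketch. The function below is hypothetical (not Comet or DataFusion code): a Spark-compatible int -> decimal cast must scale the value and then verify it fits in the target precision, whereas an unchecked multiply would silently wrap:

```rust
/// Hypothetical sketch of a checked cast from i64 to the unscaled i128
/// value of a Decimal128(precision, scale). Returns None on overflow,
/// which a Spark-compatible cast turns into null (or an error in ANSI
/// mode) instead of producing a wrapped, corrupted value.
fn int_to_decimal128(v: i64, precision: u8, scale: i8) -> Option<i128> {
    // Scale the value: 123 at scale 2 becomes unscaled 12300.
    let scaled = (v as i128).checked_mul(10_i128.checked_pow(scale as u32)?)?;
    // The unscaled value must fit in `precision` decimal digits.
    let max = 10_i128.checked_pow(precision as u32)?;
    if scaled > -max && scaled < max {
        Some(scaled)
    } else {
        None
    }
}

fn main() {
    // 123 with scale 2 -> unscaled 12300, which fits in precision 5.
    assert_eq!(int_to_decimal128(123, 5, 2), Some(12300));
    // i64::MAX does not fit in Decimal128(5, 2): overflow is detected.
    assert_eq!(int_to_decimal128(i64::MAX, 5, 2), None);
}
```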
Removing that case causes a test failure:
- scalar subquery *** FAILED *** (8 seconds, 253 milliseconds)
Cause: java.util.concurrent.ExecutionException: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 410.0 failed 1 times, most recent failure: Lost task 0.0 in stage 410.0 (TID 1286) (192.168.64.23 executor driver): org.apache.comet.CometNativeException: Execution error: Comet Internal Error: Native cast invoked for unsupported cast from Int32 to Decimal128(38, 10)
This test relies on a cast that we do not yet support and enables COMET_CAST_ALLOW_INCOMPATIBLE to allow it. I will revert the last change and add a comment about this.
DataType::Float32 | DataType::Float64 => matches!(
    to_type,
    DataType::Boolean
        | DataType::Int8
        | DataType::Int16
        | DataType::Int32
        | DataType::Int64
        | DataType::Float32
        | DataType::Float64
For Float32/64 to Int8/16/32/64, I saw that spark_cast_nonintegral_numeric_to_integral covers them above. Is this for the case self.eval_mode == EvalMode::Try?
Yes, that is correct.
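The distinction being confirmed here can be sketched in isolation. Everything below is a hypothetical simplified model (invented names, not Comet's actual API): a float -> int cast can be handled by a Spark-specific kernel for the normal path, but try_cast must additionally turn range errors into null rather than raising:

```rust
/// Simplified model of two Spark cast evaluation modes (Legacy omitted).
#[derive(Debug, PartialEq)]
enum EvalMode {
    Ansi,
    Try,
}

/// Hypothetical Spark-style cast of f64 to i32: truncate toward zero,
/// then either error (Ansi) or yield null (Try) when out of range.
fn cast_f64_to_i32(v: f64, mode: EvalMode) -> Result<Option<i32>, String> {
    let t = v.trunc();
    if t >= i32::MIN as f64 && t <= i32::MAX as f64 {
        Ok(Some(t as i32))
    } else {
        match mode {
            // try_cast swallows the range error and produces null.
            EvalMode::Try => Ok(None),
            EvalMode::Ansi => Err(format!("cannot cast {v} to INT")),
        }
    }
}

fn main() {
    assert_eq!(cast_f64_to_i32(3.9, EvalMode::Ansi), Ok(Some(3)));
    assert_eq!(cast_f64_to_i32(1e12, EvalMode::Try), Ok(None));
    assert!(cast_f64_to_i32(1e12, EvalMode::Ansi).is_err());
}
```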
// DataFusion only supports binary data containing valid UTF-8 strings
matches!(to_type, DataType::Utf8)
}
_ => false,
Casting to a narrower type, like Int64 to Int32, is not supported when self.eval_mode == EvalMode::Try?
Casting from Int64 to Int32 for Try is covered here:
DataType::Int8 | DataType::Int16 | DataType::Int32 | DataType::Int64 => matches!(
    to_type,
    DataType::Boolean
        | DataType::Int8
        | DataType::Int16
        | DataType::Int32
        | DataType::Int64
        | DataType::Float32
        | DataType::Float64
        | DataType::Decimal128(_, _)
        | DataType::Utf8
),
…ble with Spark (apache#461)

* only delegate to DataFusion cast when we know that it is compatible with Spark
* add more supported casts
* improve support for dictionary-encoded string arrays
* clippy
* fix merge conflict
* fix a regression
* fix a regression
* fix a regression
* fix regression
* fix regression
* fix regression
* remove TODO comment now that issue has been filed
* remove cast int32/int64 -> decimal from datafusion compatible list
* Revert "remove cast int32/int64 -> decimal from datafusion compatible list" (reverts commit 340e000)
* add comment

(cherry picked from commit 79431f8)
Which issue does this PR close?
N/A
Rationale for this change

We have a catchall block in cast.rs that delegates to DataFusion for any cast that we don't have a specific match arm for. This is dangerous because it means we sometimes inadvertently delegate to DataFusion for casts where DataFusion is not compatible with Spark, which can lead to data corruption and hard-to-debug issues such as #383 (comment).

What changes are included in this PR?

This PR introduces specific checks so that we only delegate to DataFusion for specific casts, and changes the catchall to return an error instead.

I also improved the handling of dictionary-encoded string arrays so that they are unpacked early on; this would have prevented the issue in #383 (comment).
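The shape of this change can be sketched with a simplified standalone model. The types and functions below are hypothetical stand-ins (the real arrow `DataType` and Comet's dispatch are richer): casts with a dedicated Spark-specific arm are handled directly, casts on the known-compatible allow-list are delegated, and everything else now fails fast instead of silently falling through to DataFusion:

```rust
// Hypothetical, heavily simplified stand-in for arrow's DataType.
#[derive(Debug, Clone, PartialEq)]
enum DataType {
    Int32,
    Int64,
    Utf8,
    Decimal128(u8, i8),
}

/// Allow-list of casts where DataFusion's semantics are known to match
/// Spark's (illustrative subset only).
fn is_datafusion_spark_compatible(from: &DataType, to: &DataType) -> bool {
    matches!(
        (from, to),
        (DataType::Int32 | DataType::Int64, DataType::Int32 | DataType::Int64)
    )
}

fn cast(from: &DataType, to: &DataType) -> Result<&'static str, String> {
    if matches!((from, to), (DataType::Utf8, DataType::Int32)) {
        Ok("handled by a Spark-specific cast kernel")
    } else if is_datafusion_spark_compatible(from, to) {
        Ok("delegated to DataFusion's cast_with_options")
    } else {
        // Previously a catchall delegated here; unsupported casts now error.
        Err(format!(
            "Native cast invoked for unsupported cast from {from:?} to {to:?}"
        ))
    }
}

fn main() {
    assert!(cast(&DataType::Int64, &DataType::Int32).is_ok());
    assert!(cast(&DataType::Utf8, &DataType::Int32).is_ok());
    // No allow-list entry and no dedicated arm: fail fast.
    assert!(cast(&DataType::Int32, &DataType::Decimal128(38, 10)).is_err());
}
```

The error string mirrors the "Native cast invoked for unsupported cast" failure quoted earlier in the conversation, which is exactly how an unlisted cast now surfaces.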
How are these changes tested?
Existing tests