feat: Add support for RLike #469

Closed · wants to merge 17 commits
8 changes: 8 additions & 0 deletions common/src/main/scala/org/apache/comet/CometConf.scala
@@ -401,9 +401,17 @@ object CometConf extends ShimCometConf {
.booleanConf
.createWithDefault(false)

val COMET_REGEXP_ALLOW_INCOMPATIBLE: ConfigEntry[Boolean] =
conf("spark.comet.regexp.allowIncompatible")
.doc("Comet is not currently fully compatible with Spark for all regular expressions. " +
"Set this config to true to allow them anyway using Rust's regular expression engine. " +
"See compatibility guide for more information.")
.booleanConf
.createWithDefault(false)

4 changes: 4 additions & 0 deletions core/Cargo.toml
@@ -136,3 +136,7 @@ harness = false
[[bench]]
name = "shuffle_writer"
harness = false

[[bench]]
name = "regexp"
harness = false
75 changes: 75 additions & 0 deletions core/benches/regexp.rs
@@ -0,0 +1,75 @@
// Licensed to the Apache Software Foundation (ASF) under one
// or more contributor license agreements. See the NOTICE file
// distributed with this work for additional information
// regarding copyright ownership. The ASF licenses this file
// to you under the Apache License, Version 2.0 (the
// "License"); you may not use this file except in compliance
// with the License. You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing,
// software distributed under the License is distributed on an
// "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
// KIND, either express or implied. See the License for the
// specific language governing permissions and limitations
// under the License.


use arrow::datatypes::Int32Type;
use arrow::error::ArrowError;
use arrow_array::builder::{StringBuilder, StringDictionaryBuilder};
use arrow_array::RecordBatch;
use arrow_schema::{DataType, Field, Schema};
use comet::execution::datafusion::expressions::regexp::RLike;
use criterion::{criterion_group, criterion_main, Criterion};
use datafusion::common::ScalarValue;
use datafusion_physical_expr::expressions::{Column, LikeExpr, Literal};
use datafusion_physical_expr::PhysicalExpr;
use std::sync::Arc;

fn criterion_benchmark(c: &mut Criterion) {
let batch = create_utf8_batch().unwrap();
let child_expr = Arc::new(Column::new("foo", 0));
let pattern_expr = Arc::new(Literal::new(ScalarValue::Utf8(Some("5[0-9]5".to_string()))));
let rlike = RLike::new(child_expr.clone(), pattern_expr.clone());
let df_rlike = LikeExpr::new(false, false, child_expr, pattern_expr);

let mut group = c.benchmark_group("regexp");
group.bench_function("regexp_comet_rlike", |b| {
b.iter(|| rlike.evaluate(&batch).unwrap());
});
group.bench_function("regexp_datafusion_rlike", |b| {
b.iter(|| df_rlike.evaluate(&batch).unwrap());
});
}

fn create_utf8_batch() -> Result<RecordBatch, ArrowError> {
let schema = Arc::new(Schema::new(vec![
Field::new("a", DataType::Utf8, true),
Field::new("b", DataType::Dictionary(Box::new(DataType::Int32), Box::new(DataType::Utf8)), true)
]));
let mut string_builder = StringBuilder::new();
let mut string_dict_builder = StringDictionaryBuilder::<Int32Type>::new();
for i in 0..1000 {
if i % 10 == 0 {
string_builder.append_null();
string_dict_builder.append_null();
} else {
string_builder.append_value(format!("{}", i));
string_dict_builder.append_value(format!("{}", i));
}
}
let string_array = string_builder.finish();
let string_dict_array = string_dict_builder.finish();
RecordBatch::try_new(schema, vec![Arc::new(string_array), Arc::new(string_dict_array)])
}

fn config() -> Criterion {
Criterion::default()
}

criterion_group! {
name = benches;
config = config();
targets = criterion_benchmark
}
criterion_main!(benches);
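
With the `[[bench]]` target added to `core/Cargo.toml` above, this comparison can presumably be run from the `core` directory with `cargo bench --bench regexp`.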
1 change: 1 addition & 0 deletions core/src/execution/datafusion/expressions/mod.rs
@@ -34,6 +34,7 @@ pub mod bloom_filter_might_contain;
pub mod correlation;
pub mod covariance;
pub mod negative;
pub mod regexp;
pub mod stats;
pub mod stddev;
pub mod strings;
205 changes: 205 additions & 0 deletions core/src/execution/datafusion/expressions/regexp.rs
@@ -0,0 +1,205 @@
/*
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/

use crate::{errors::CometError, execution::datafusion::expressions::utils::down_cast_any_ref};
use arrow_array::{builder::BooleanBuilder, Array, RecordBatch, StringArray};
use arrow_schema::{DataType, Schema};
use datafusion::logical_expr::ColumnarValue;
use datafusion_common::ScalarValue;
use datafusion_physical_expr::PhysicalExpr;
use regex::Regex;
use std::{
any::Any,
fmt::{Display, Formatter},
hash::Hasher,
sync::Arc,
};

#[derive(Debug, Hash)]
pub struct RLike {
child: Arc<dyn PhysicalExpr>,
pattern: Arc<dyn PhysicalExpr>,
}

impl RLike {
pub fn new(child: Arc<dyn PhysicalExpr>, pattern: Arc<dyn PhysicalExpr>) -> Self {
Self { child, pattern }
}
}

impl Display for RLike {
fn fmt(&self, f: &mut Formatter<'_>) -> std::fmt::Result {
write!(
f,
"RLike [child: {}, pattern: {}] ",
self.child, self.pattern
)
}
}

impl PartialEq<dyn Any> for RLike {
fn eq(&self, other: &dyn Any) -> bool {
down_cast_any_ref(other)
.downcast_ref::<Self>()
.map(|x| self.child.eq(&x.child) && self.pattern.eq(&x.pattern))
.unwrap_or(false)
}
}

impl PhysicalExpr for RLike {
fn as_any(&self) -> &dyn Any {
self
}

fn data_type(&self, _input_schema: &Schema) -> datafusion_common::Result<DataType> {
Ok(DataType::Boolean)
}

fn nullable(&self, input_schema: &Schema) -> datafusion_common::Result<bool> {
self.child.nullable(input_schema)
}

fn evaluate(&self, batch: &RecordBatch) -> datafusion_common::Result<ColumnarValue> {
if let ColumnarValue::Array(v) = self.child.evaluate(batch)? {
if let ColumnarValue::Scalar(ScalarValue::Utf8(Some(pattern))) =
self.pattern.evaluate(batch)?
{
// TODO cache Regex across invocations of evaluate() or create it in constructor
match Regex::new(&pattern) {
Ok(re) => {
let inputs = v
.as_any()
.downcast_ref::<StringArray>()
.expect("string array");
let mut builder = BooleanBuilder::with_capacity(inputs.len());
if inputs.is_nullable() {
for i in 0..inputs.len() {
if inputs.is_null(i) {
builder.append_null();
} else {
builder.append_value(re.is_match(inputs.value(i)));
}
}
} else {
for i in 0..inputs.len() {
builder.append_value(re.is_match(inputs.value(i)));
}
}
Ok(ColumnarValue::Array(Arc::new(builder.finish())))
}
Err(e) => Err(CometError::Internal(format!(
"Failed to compile regular expression: {e:?}"
))
.into()),
}
} else {
Err(
CometError::Internal("Only scalar regex patterns are supported".to_string())
.into(),
)
}
} else {
// this should be unreachable because Spark will evaluate regex expressions against
// literal strings as part of query planning
Err(CometError::Internal("Only columnar inputs are supported".to_string()).into())

> Contributor: Is it possible to encounter dictionary type?
>
> Member (Author): Good point. Yes, it probably is.

}
}

fn children(&self) -> Vec<&Arc<dyn PhysicalExpr>> {
vec![&self.child, &self.pattern]
}

fn with_new_children(
self: Arc<Self>,
children: Vec<Arc<dyn PhysicalExpr>>,
) -> datafusion_common::Result<Arc<dyn PhysicalExpr>> {
assert!(children.len() == 2);
Ok(Arc::new(RLike::new(
children[0].clone(),
children[1].clone(),
)))
}

fn dyn_hash(&self, state: &mut dyn Hasher) {
use std::hash::Hash;
let mut s = state;
self.hash(&mut s);
}
}

#[cfg(test)]
mod test {
use std::sync::Arc;
use arrow_array::builder::{StringBuilder, StringDictionaryBuilder};
use arrow_array::types::Int32Type;
use arrow_array::{Array, BooleanArray, RecordBatch};
use arrow_schema::{ArrowError, DataType, Field, Schema};
use datafusion_common::{DataFusionError, ScalarValue};
use datafusion_expr::ColumnarValue;
use datafusion_physical_expr::expressions::{Column, Literal};
use super::*;

#[test]
fn test_string_input() -> Result<(), DataFusionError> {
do_test(0, "5[0-9]5", 10)
}

#[test]
fn test_dict_encoded_string_input() -> Result<(), DataFusionError> {
do_test(1, "5[0-9]5", 10)
}

fn do_test(column: usize, pattern: &str, expected_count: usize) -> Result<(), DataFusionError> {
let batch = create_utf8_batch()?;
let child_expr = Arc::new(Column::new("foo", column));
let pattern_expr = Arc::new(Literal::new(ScalarValue::Utf8(Some(pattern.to_string()))));
let rlike = RLike::new(child_expr, pattern_expr);
if let ColumnarValue::Array(array) = rlike.evaluate(&batch).unwrap() {
let array = array.as_any().downcast_ref::<BooleanArray>().expect("boolean array");
assert_eq!(expected_count, array.true_count());
} else {
unreachable!()
}
Ok(())
}

fn create_utf8_batch() -> Result<RecordBatch, ArrowError> {
let schema = Arc::new(Schema::new(vec![
Field::new("a", DataType::Utf8, true),
Field::new("b", DataType::Dictionary(Box::new(DataType::Int32), Box::new(DataType::Utf8)), true)
]));
let mut string_builder = StringBuilder::new();
let mut string_dict_builder = StringDictionaryBuilder::<Int32Type>::new();
for i in 0..1000 {
if i % 10 == 0 {
string_builder.append_null();
string_dict_builder.append_null();
} else {
string_builder.append_value(format!("{}", i));
string_dict_builder.append_value(format!("{}", i));
}
}
let string_array = string_builder.finish();
let string_dict_array = string_dict_builder.finish();
RecordBatch::try_new(schema, vec![Arc::new(string_array), Arc::new(string_dict_array)])
}

}
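
On the `TODO` in `evaluate()` above: a minimal sketch (an editor's illustration under assumptions, not this PR's code) of compiling the pattern once in a fallible constructor. Since `Regex` implements neither `Hash` nor `PartialEq`, the original pattern string is kept so those traits and `Display` can be implemented manually over it:

```rust
use std::sync::Arc;

use datafusion_physical_expr::PhysicalExpr;
use regex::Regex;

pub struct RLike {
    child: Arc<dyn PhysicalExpr>,
    // Kept for Display, Hash, and equality; Regex implements none of these.
    pattern_str: String,
    // Compiled once here rather than on every evaluate() call.
    pattern: Regex,
}

impl RLike {
    pub fn try_new(child: Arc<dyn PhysicalExpr>, pattern: &str) -> Result<Self, String> {
        Ok(Self {
            child,
            pattern_str: pattern.to_string(),
            pattern: Regex::new(pattern)
                .map_err(|e| format!("Failed to compile pattern {pattern}: {e}"))?,
        })
    }
}
```
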
2 changes: 0 additions & 2 deletions core/src/execution/datafusion/expressions/strings.rs
@@ -143,8 +143,6 @@ make_predicate_function!(EndsWith, ends_with_dyn, ends_with_utf8_scalar_dyn);

make_predicate_function!(Contains, contains_dyn, contains_utf8_scalar_dyn);

// make_predicate_function!(RLike, rlike_dyn, rlike_utf8_scalar_dyn);

#[derive(Debug, Hash)]
pub struct SubstringExec {
pub child: Arc<dyn PhysicalExpr>,
7 changes: 7 additions & 0 deletions core/src/execution/datafusion/planner.rs
@@ -71,6 +71,7 @@ use crate::{
covariance::Covariance,
if_expr::IfExpr,
negative,
regexp::RLike,
scalar_funcs::create_comet_physical_fun,
stats::StatsType,
stddev::Stddev,
@@ -435,6 +436,12 @@ impl PhysicalPlanner {

Ok(Arc::new(Like::new(left, right)))
}
ExprStruct::Rlike(expr) => {
let left = self.create_expr(expr.left.as_ref().unwrap(), input_schema.clone())?;
let right = self.create_expr(expr.right.as_ref().unwrap(), input_schema)?;

Ok(Arc::new(RLike::new(left, right)))
}
ExprStruct::CheckOverflow(expr) => {
let child = self.create_expr(expr.child.as_ref().unwrap(), input_schema)?;
let data_type = to_arrow_datatype(expr.datatype.as_ref().unwrap());
10 changes: 5 additions & 5 deletions core/src/execution/proto/expr.proto
@@ -54,7 +54,7 @@ message Expr {
StartsWith startsWith = 27;
EndsWith endsWith = 28;
Contains contains = 29;
// RLike rlike = 30;
RLike rlike = 30;
ScalarFunc scalarFunc = 31;
EqualNullSafe eqNullSafe = 32;
NotEqualNullSafe neqNullSafe = 33;
@@ -374,10 +374,10 @@ message Like {
Expr right = 2;
}

// message RLike {
// Expr left = 1;
// Expr right = 2;
// }
message RLike {
Expr left = 1;
Expr right = 2;
}

message StartsWith {
Expr left = 1;
7 changes: 7 additions & 0 deletions docs/source/user-guide/compatibility.md
@@ -32,6 +32,13 @@ be used in production.

There is an [epic](https://github.com/apache/datafusion-comet/issues/313) where we are tracking the work to fully implement ANSI support.

## Regular Expressions

Comet uses the [regex](https://crates.io/crates/regex) crate to evaluate regular expressions, and this is expected to
produce different results from Java's regular expression engine in some cases. The crate also lacks support for
features such as backreferences. For these reasons, regular expression support is disabled by default and can be
enabled by setting `spark.comet.regexp.allowIncompatible=true`.
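
As a concrete editorial illustration (not from this PR) of one such gap, the backreference limitation can be seen directly in the regex crate, which rejects at compile time a pattern that Java's `java.util.regex` accepts:

```rust
use regex::Regex;

fn main() {
    // `(\w+)\s+\1` uses a backreference to match a repeated word; it is
    // valid in Java's regex dialect but fails to compile here.
    match Regex::new(r"(\w+)\s+\1") {
        Ok(_) => println!("compiled"),
        Err(e) => println!("rejected: {e}"),
    }
}
```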

## Cast

Cast operations in Comet fall into three levels of support:
1 change: 1 addition & 0 deletions docs/source/user-guide/configs.md
@@ -43,6 +43,7 @@ Comet provides the following configuration settings.
| spark.comet.memory.overhead.min | Minimum amount of additional memory to be allocated per executor process for Comet, in MiB. | 402653184b |
| spark.comet.nativeLoadRequired | Whether to require Comet native library to load successfully when Comet is enabled. If not, Comet will silently fallback to Spark when it fails to load the native lib. Otherwise, an error will be thrown and the Spark job will be aborted. | false |
| spark.comet.parquet.enable.directBuffer | Whether to use Java direct byte buffer when reading Parquet. By default, this is false | false |
| spark.comet.regexp.allowIncompatible | Comet is not currently fully compatible with Spark for all regular expressions. Set this config to true to allow them anyway using Rust's regular expression engine. See compatibility guide for more information. | false |
| spark.comet.rowToColumnar.supportedOperatorList | A comma-separated list of row-based operators that will be converted to columnar format when 'spark.comet.rowToColumnar.enabled' is true | Range,InMemoryTableScan |
| spark.comet.scan.enabled | Whether to enable Comet scan. When this is turned on, Spark will use Comet to read Parquet data source. Note that to enable native vectorized execution, both this config and 'spark.comet.exec.enabled' need to be enabled. By default, this config is true. | true |
| spark.comet.scan.preFetch.enabled | Whether to enable pre-fetching feature of CometScan. By default is disabled. | false |