Feat: support bit_count function #1602

Merged (17 commits) on May 30, 2025
Changes from 3 commits
12 changes: 8 additions & 4 deletions native/core/src/execution/planner.rs
@@ -103,10 +103,10 @@ use datafusion_comet_proto::{
spark_partitioning::{partitioning::PartitioningStruct, Partitioning as SparkPartitioning},
};
use datafusion_comet_spark_expr::{
ArrayInsert, Avg, AvgDecimal, BitwiseNotExpr, Cast, CheckOverflow, Contains, Correlation,
Covariance, CreateNamedStruct, DateTruncExpr, EndsWith, GetArrayStructFields, GetStructField,
HourExpr, IfExpr, Like, ListExtract, MinuteExpr, NormalizeNaNAndZero, RLike, SecondExpr,
SparkCastOptions, StartsWith, Stddev, StringSpaceExpr, SubstringExpr, SumDecimal,
ArrayInsert, Avg, AvgDecimal, BitwiseCountExpr, BitwiseNotExpr, Cast, CheckOverflow, Contains,
Correlation, Covariance, CreateNamedStruct, DateTruncExpr, EndsWith, GetArrayStructFields,
GetStructField, HourExpr, IfExpr, Like, ListExtract, MinuteExpr, NormalizeNaNAndZero, RLike,
SecondExpr, SparkCastOptions, StartsWith, Stddev, StringSpaceExpr, SubstringExpr, SumDecimal,
TimestampTruncExpr, ToJson, UnboundColumn, Variance,
};
use itertools::Itertools;
@@ -616,6 +616,10 @@ impl PhysicalPlanner {
let op = DataFusionOperator::BitwiseShiftLeft;
Ok(Arc::new(BinaryExpr::new(left, op, right)))
}
ExprStruct::BitwiseCount(expr) => {
let child = self.create_expr(expr.child.as_ref().unwrap(), input_schema)?;
Ok(Arc::new(BitwiseCountExpr::new(child)))
}
// https://github.com/apache/datafusion-comet/issues/666
// ExprStruct::Abs(expr) => {
// let child = self.create_expr(expr.child.as_ref().unwrap(), Arc::clone(&input_schema))?;
1 change: 1 addition & 0 deletions native/proto/src/proto/expr.proto
@@ -84,6 +84,7 @@ message Expr {
GetArrayStructFields get_array_struct_fields = 57;
ArrayInsert array_insert = 58;
MathExpr integral_divide = 59;
UnaryExpr bitwiseCount = 60;
}
}

172 changes: 172 additions & 0 deletions native/spark-expr/src/bitwise_funcs/bitwise_count.rs
@@ -0,0 +1,172 @@
// Licensed to the Apache Software Foundation (ASF) under one
// or more contributor license agreements. See the NOTICE file
// distributed with this work for additional information
// regarding copyright ownership. The ASF licenses this file
// to you under the Apache License, Version 2.0 (the
// "License"); you may not use this file except in compliance
// with the License. You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing,
// software distributed under the License is distributed on an
// "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
// KIND, either express or implied. See the License for the
// specific language governing permissions and limitations
// under the License.

use arrow::{
array::*,
datatypes::{DataType, Schema},
record_batch::RecordBatch,
};
use datafusion::common::Result;
use datafusion::physical_expr::PhysicalExpr;
use datafusion::{error::DataFusionError, logical_expr::ColumnarValue};
use std::hash::Hash;
use std::{any::Any, sync::Arc};

macro_rules! compute_op {
($OPERAND:expr, $DT:ident, $TY:ty) => {{
let operand = $OPERAND
.as_any()
.downcast_ref::<$DT>()
.expect("compute_op failed to downcast array");
Review comment (Contributor): Perhaps report $DT as well in case of failure?

Reply (Author): fixed

let result: $DT = operand
.iter()
.map(|x| x.map(|y| bit_count(y.into()) as $TY))
.collect();

Ok(Arc::new(result))
}};
}

/// BitwiseCount expression
#[derive(Debug, Eq)]
pub struct BitwiseCountExpr {
/// Input expression
arg: Arc<dyn PhysicalExpr>,
}

impl Hash for BitwiseCountExpr {
fn hash<H: std::hash::Hasher>(&self, state: &mut H) {
self.arg.hash(state);
}
}

impl PartialEq for BitwiseCountExpr {
fn eq(&self, other: &Self) -> bool {
self.arg.eq(&other.arg)
}
}

impl BitwiseCountExpr {
/// Create new bitwise count expression
pub fn new(arg: Arc<dyn PhysicalExpr>) -> Self {
Self { arg }
}

/// Get the input expression
pub fn arg(&self) -> &Arc<dyn PhysicalExpr> {
&self.arg
}
}

impl std::fmt::Display for BitwiseCountExpr {
fn fmt(&self, f: &mut std::fmt::Formatter) -> std::fmt::Result {
write!(f, "bit_count({})", self.arg)
}
}

impl PhysicalExpr for BitwiseCountExpr {
/// Return a reference to Any that can be used for downcasting
fn as_any(&self) -> &dyn Any {
self
}

fn data_type(&self, input_schema: &Schema) -> Result<DataType> {
self.arg.data_type(input_schema)
}

fn nullable(&self, input_schema: &Schema) -> Result<bool> {
self.arg.nullable(input_schema)
}

fn evaluate(&self, batch: &RecordBatch) -> Result<ColumnarValue> {
let arg = self.arg.evaluate(batch)?;
match arg {
ColumnarValue::Array(array) => {
Review comment (Contributor): Any possibility that we get a dictionary?

Reply (Author): It seems impossible to me, but if you have a case study on how this can be tested, I'm ready to do it.

Reply (Contributor): You will need many repeated values in order to create dictionary values. If you use something like https://github.com/apache/datafusion-comet/blob/main/spark/src/test/scala/org/apache/spark/sql/CometTestBase.scala#L940 you should be able to test it.

let result: Result<ArrayRef> = match array.data_type() {
DataType::Int8 | DataType::Boolean => compute_op!(array, Int8Array, i8),
DataType::Int16 => compute_op!(array, Int16Array, i16),
DataType::Int32 => compute_op!(array, Int32Array, i32),
DataType::Int64 => compute_op!(array, Int64Array, i64),
_ => Err(DataFusionError::Execution(format!(
"bit_count({:?}) can't be evaluated because the expression's type is {:?}, not a signed integer or boolean",
self.arg,
array.data_type(),
))),
};
result.map(ColumnarValue::Array)
}
ColumnarValue::Scalar(_) => Err(DataFusionError::Internal(
"shouldn't go to bitwise count scalar path".to_string(),
)),
}
}

fn children(&self) -> Vec<&Arc<dyn PhysicalExpr>> {
vec![&self.arg]
}

fn with_new_children(
self: Arc<Self>,
children: Vec<Arc<dyn PhysicalExpr>>,
) -> Result<Arc<dyn PhysicalExpr>> {
Ok(Arc::new(BitwiseCountExpr::new(Arc::clone(&children[0]))))
}
}

pub fn bitwise_count(arg: Arc<dyn PhysicalExpr>) -> Result<Arc<dyn PhysicalExpr>> {
Ok(Arc::new(BitwiseCountExpr::new(arg)))
}

// Rust port of Apache Spark's bitCount for LongType: a standard 64-bit population count.
fn bit_count(i: i64) -> i32 {
let mut u = i as u64;
u = u - ((u >> 1) & 0x5555555555555555);
u = (u & 0x3333333333333333) + ((u >> 2) & 0x3333333333333333);
u = (u + (u >> 4)) & 0x0f0f0f0f0f0f0f0f;
u = u + (u >> 8);
u = u + (u >> 16);
u = u + (u >> 32);
(u as i32) & 0x7f
}
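The hand-rolled popcount above is the classic parallel bit-counting reduction; as a sanity check it should agree with Rust's built-in `count_ones` on every i64. A standalone sketch (not part of the PR) that cross-checks the two:

```rust
// Standard 64-bit population count, same reduction as the PR's bit_count.
fn bit_count(i: i64) -> i32 {
    let mut u = i as u64;
    u -= (u >> 1) & 0x5555555555555555; // fold pairs of bits
    u = (u & 0x3333333333333333) + ((u >> 2) & 0x3333333333333333); // fold nibbles
    u = (u + (u >> 4)) & 0x0f0f0f0f0f0f0f0f; // per-byte counts
    u += u >> 8;
    u += u >> 16;
    u += u >> 32;
    (u as i32) & 0x7f // max possible count is 64, so 7 bits suffice
}

fn main() {
    // Cross-check against the intrinsic popcount, including sign-bit cases.
    for v in [0, 1, 12345, 89, -3456, -1, i64::MAX, i64::MIN] {
        assert_eq!(bit_count(v), v.count_ones() as i32);
    }
    assert_eq!(bit_count(-3456), 54); // matches the unit test below
    println!("bit_count agrees with count_ones");
}
```

Negative inputs count the bits of the two's-complement representation, which is why `bit_count(-3456)` is 54 rather than an error.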

#[cfg(test)]
mod tests {
use arrow::datatypes::*;
use datafusion::common::{cast::as_int32_array, Result};
use datafusion::physical_expr::expressions::col;

use super::*;

#[test]
fn bitwise_count_op() -> Result<()> {
let schema = Schema::new(vec![Field::new("field", DataType::Int32, true)]);

let expr = bitwise_count(col("field", &schema)?)?;

let input = Int32Array::from(vec![Some(1), None, Some(12345), Some(89), Some(-3456)]);
let expected = &Int32Array::from(vec![Some(1), None, Some(6), Some(4), Some(54)]);

let batch = RecordBatch::try_new(Arc::new(schema.clone()), vec![Arc::new(input)])?;

let result = expr.evaluate(&batch)?.into_array(batch.num_rows())?;
let result = as_int32_array(&result).expect("failed to downcast to Int32Array");
assert_eq!(result, expected);

Ok(())
}
}
2 changes: 2 additions & 0 deletions native/spark-expr/src/bitwise_funcs/mod.rs
@@ -15,6 +15,8 @@
// specific language governing permissions and limitations
// under the License.

mod bitwise_count;
mod bitwise_not;

pub use bitwise_count::{bitwise_count, BitwiseCountExpr};
pub use bitwise_not::{bitwise_not, BitwiseNotExpr};
@@ -1652,6 +1652,14 @@ object QueryPlanSerde extends Logging with ShimQueryPlanSerde with CometExprShim
binding,
(builder, binaryExpr) => builder.setBitwiseXor(binaryExpr))

case BitwiseCount(child) =>
createUnaryExpr(
expr,
child,
inputs,
binding,
(builder, unaryExpr) => builder.setBitwiseCount(unaryExpr))

case ShiftRight(left, right) =>
// DataFusion bitwise shift right expression requires
// same data type between left and right side
23 changes: 23 additions & 0 deletions spark/src/test/scala/org/apache/comet/CometExpressionSuite.scala
@@ -90,6 +90,29 @@ class CometExpressionSuite extends CometTestBase with AdaptiveSparkPlanHelper {
}
}

test("bitwise_count") {
Seq(false, true).foreach { dictionary =>
withSQLConf("parquet.enable.dictionary" -> dictionary.toString) {
val table = "bitwise_count_test"
withTable(table) {
sql(s"create table $table(col1 long, col2 int, col3 short, col4 byte) using parquet")
sql(s"insert into $table values(1111, 2222, 17, 7)")
Review comment (Contributor): Do you mind adding random number cases?

Reply (Author): Yep. Added tests with random data.

Review comment (Contributor): Can we also add a test with a Parquet file not created by Spark (see makeParquetFileAllTypes) and see if this reports correct results with unsigned int columns?

Reply (Contributor): CometTestBase.makeParquetFileAllTypes has all the integer types you may want to test.

sql(
s"insert into $table values(${Long.MaxValue}, ${Int.MaxValue}, ${Short.MaxValue}, ${Byte.MaxValue})")
sql(
s"insert into $table values(${Long.MinValue}, ${Int.MinValue}, ${Short.MinValue}, ${Byte.MinValue})")

checkSparkAnswerAndOperator(sql(s"SELECT bit_count(col1) FROM $table"))
checkSparkAnswerAndOperator(sql(s"SELECT bit_count(col2) FROM $table"))
checkSparkAnswerAndOperator(sql(s"SELECT bit_count(col3) FROM $table"))
checkSparkAnswerAndOperator(sql(s"SELECT bit_count(col4) FROM $table"))
checkSparkAnswerAndOperator(sql(s"SELECT bit_count(true) FROM $table"))
checkSparkAnswerAndOperator(sql(s"SELECT bit_count(false) FROM $table"))
}
}
}
}

test("bitwise shift with different left/right types") {
Seq(false, true).foreach { dictionary =>
withSQLConf("parquet.enable.dictionary" -> dictionary.toString) {