Update plan stability results
viirya committed Dec 16, 2024
1 parent 68e02e0 commit 6e35d24
Showing 810 changed files with 3,687 additions and 3,687 deletions.
@@ -1,5 +1,5 @@
== Physical Plan ==
* ColumnarToRow (40)
* CometColumnarToRow (40)
+- CometTakeOrderedAndProject (39)
+- CometProject (38)
+- CometBroadcastHashJoin (37)
@@ -224,14 +224,14 @@ Arguments: [c_customer_id#27], [c_customer_id#27]
Input [1]: [c_customer_id#27]
Arguments: TakeOrderedAndProject(limit=100, orderBy=[c_customer_id#27 ASC NULLS FIRST], output=[c_customer_id#27]), [c_customer_id#27], 100, [c_customer_id#27 ASC NULLS FIRST], [c_customer_id#27]

(40) ColumnarToRow [codegen id : 1]
(40) CometColumnarToRow [codegen id : 1]
Input [1]: [c_customer_id#27]

===== Subqueries =====

Subquery:1 Hosting operator id = 1 Hosting Expression = sr_returned_date_sk#4 IN dynamicpruning#5
BroadcastExchange (45)
+- * ColumnarToRow (44)
+- * CometColumnarToRow (44)
+- CometProject (43)
+- CometFilter (42)
+- CometScan parquet spark_catalog.default.date_dim (41)
@@ -252,7 +252,7 @@ Condition : ((isnotnull(d_year#7) AND (d_year#7 = 2000)) AND isnotnull(d_date_sk
Input [2]: [d_date_sk#6, d_year#7]
Arguments: [d_date_sk#6], [d_date_sk#6]

(44) ColumnarToRow [codegen id : 1]
(44) CometColumnarToRow [codegen id : 1]
Input [1]: [d_date_sk#6]

(45) BroadcastExchange
@@ -1,5 +1,5 @@
WholeStageCodegen (1)
ColumnarToRow
CometColumnarToRow
InputAdapter
CometTakeOrderedAndProject [c_customer_id]
CometProject [c_customer_id]
@@ -19,7 +19,7 @@ WholeStageCodegen (1)
SubqueryBroadcast [d_date_sk] #1
BroadcastExchange #2
WholeStageCodegen (1)
ColumnarToRow
CometColumnarToRow
InputAdapter
CometProject [d_date_sk]
CometFilter [d_date_sk,d_year]
@@ -11,7 +11,7 @@ TakeOrderedAndProject (45)
: : +- * Filter (27)
: : +- * BroadcastHashJoin ExistenceJoin(exists#1) BuildRight (26)
: : :- * BroadcastHashJoin ExistenceJoin(exists#2) BuildRight (19)
: : : :- * ColumnarToRow (12)
: : : :- * CometColumnarToRow (12)
: : : : +- CometBroadcastHashJoin (11)
: : : : :- CometFilter (2)
: : : : : +- CometScan parquet spark_catalog.default.customer (1)
@@ -24,24 +24,24 @@ TakeOrderedAndProject (45)
: : : : +- CometFilter (5)
: : : : +- CometScan parquet spark_catalog.default.date_dim (4)
: : : +- BroadcastExchange (18)
: : : +- * ColumnarToRow (17)
: : : +- * CometColumnarToRow (17)
: : : +- CometProject (16)
: : : +- CometBroadcastHashJoin (15)
: : : :- CometScan parquet spark_catalog.default.web_sales (13)
: : : +- ReusedExchange (14)
: : +- BroadcastExchange (25)
: : +- * ColumnarToRow (24)
: : +- * CometColumnarToRow (24)
: : +- CometProject (23)
: : +- CometBroadcastHashJoin (22)
: : :- CometScan parquet spark_catalog.default.catalog_sales (20)
: : +- ReusedExchange (21)
: +- BroadcastExchange (33)
: +- * ColumnarToRow (32)
: +- * CometColumnarToRow (32)
: +- CometProject (31)
: +- CometFilter (30)
: +- CometScan parquet spark_catalog.default.customer_address (29)
+- BroadcastExchange (39)
+- * ColumnarToRow (38)
+- * CometColumnarToRow (38)
+- CometFilter (37)
+- CometScan parquet spark_catalog.default.customer_demographics (36)

@@ -101,7 +101,7 @@ Left output [3]: [c_customer_sk#3, c_current_cdemo_sk#4, c_current_addr_sk#5]
Right output [1]: [ss_customer_sk#6]
Arguments: [c_customer_sk#3], [ss_customer_sk#6], LeftSemi, BuildRight

(12) ColumnarToRow [codegen id : 5]
(12) CometColumnarToRow [codegen id : 5]
Input [3]: [c_customer_sk#3, c_current_cdemo_sk#4, c_current_addr_sk#5]

(13) CometScan parquet spark_catalog.default.web_sales
@@ -123,7 +123,7 @@ Arguments: [ws_sold_date_sk#13], [d_date_sk#15], Inner, BuildRight
Input [3]: [ws_bill_customer_sk#12, ws_sold_date_sk#13, d_date_sk#15]
Arguments: [ws_bill_customer_sk#12], [ws_bill_customer_sk#12]

(17) ColumnarToRow [codegen id : 1]
(17) CometColumnarToRow [codegen id : 1]
Input [1]: [ws_bill_customer_sk#12]

(18) BroadcastExchange
@@ -155,7 +155,7 @@ Arguments: [cs_sold_date_sk#17], [d_date_sk#19], Inner, BuildRight
Input [3]: [cs_ship_customer_sk#16, cs_sold_date_sk#17, d_date_sk#19]
Arguments: [cs_ship_customer_sk#16], [cs_ship_customer_sk#16]

(24) ColumnarToRow [codegen id : 2]
(24) CometColumnarToRow [codegen id : 2]
Input [1]: [cs_ship_customer_sk#16]

(25) BroadcastExchange
@@ -191,7 +191,7 @@ Condition : (ca_county#21 IN (Rush County,Toole County,Jefferson County,Dona Ana
Input [2]: [ca_address_sk#20, ca_county#21]
Arguments: [ca_address_sk#20], [ca_address_sk#20]

(32) ColumnarToRow [codegen id : 3]
(32) CometColumnarToRow [codegen id : 3]
Input [1]: [ca_address_sk#20]

(33) BroadcastExchange
@@ -219,7 +219,7 @@ ReadSchema: struct<cd_demo_sk:int,cd_gender:string,cd_marital_status:string,cd_e
Input [9]: [cd_demo_sk#22, cd_gender#23, cd_marital_status#24, cd_education_status#25, cd_purchase_estimate#26, cd_credit_rating#27, cd_dep_count#28, cd_dep_employed_count#29, cd_dep_college_count#30]
Condition : isnotnull(cd_demo_sk#22)

(38) ColumnarToRow [codegen id : 4]
(38) CometColumnarToRow [codegen id : 4]
Input [9]: [cd_demo_sk#22, cd_gender#23, cd_marital_status#24, cd_education_status#25, cd_purchase_estimate#26, cd_credit_rating#27, cd_dep_count#28, cd_dep_employed_count#29, cd_dep_college_count#30]

(39) BroadcastExchange
@@ -262,7 +262,7 @@ Arguments: 100, [cd_gender#23 ASC NULLS FIRST, cd_marital_status#24 ASC NULLS FI

Subquery:1 Hosting operator id = 3 Hosting Expression = ss_sold_date_sk#7 IN dynamicpruning#8
BroadcastExchange (50)
+- * ColumnarToRow (49)
+- * CometColumnarToRow (49)
+- CometProject (48)
+- CometFilter (47)
+- CometScan parquet spark_catalog.default.date_dim (46)
@@ -283,7 +283,7 @@ Condition : (((((isnotnull(d_year#10) AND isnotnull(d_moy#11)) AND (d_year#10 =
Input [3]: [d_date_sk#9, d_year#10, d_moy#11]
Arguments: [d_date_sk#9], [d_date_sk#9]

(49) ColumnarToRow [codegen id : 1]
(49) CometColumnarToRow [codegen id : 1]
Input [1]: [d_date_sk#9]

(50) BroadcastExchange
@@ -13,7 +13,7 @@ TakeOrderedAndProject [cd_gender,cd_marital_status,cd_education_status,cd_purcha
Filter [exists,exists]
BroadcastHashJoin [c_customer_sk,cs_ship_customer_sk]
BroadcastHashJoin [c_customer_sk,ws_bill_customer_sk]
ColumnarToRow
CometColumnarToRow
InputAdapter
CometBroadcastHashJoin [c_customer_sk,c_current_cdemo_sk,c_current_addr_sk,ss_customer_sk]
CometFilter [c_customer_sk,c_current_cdemo_sk,c_current_addr_sk]
@@ -25,7 +25,7 @@ TakeOrderedAndProject [cd_gender,cd_marital_status,cd_education_status,cd_purcha
SubqueryBroadcast [d_date_sk] #1
BroadcastExchange #3
WholeStageCodegen (1)
ColumnarToRow
CometColumnarToRow
InputAdapter
CometProject [d_date_sk]
CometFilter [d_date_sk,d_year,d_moy]
@@ -37,7 +37,7 @@ TakeOrderedAndProject [cd_gender,cd_marital_status,cd_education_status,cd_purcha
InputAdapter
BroadcastExchange #5
WholeStageCodegen (1)
ColumnarToRow
CometColumnarToRow
InputAdapter
CometProject [ws_bill_customer_sk]
CometBroadcastHashJoin [ws_bill_customer_sk,ws_sold_date_sk,d_date_sk]
@@ -47,7 +47,7 @@ TakeOrderedAndProject [cd_gender,cd_marital_status,cd_education_status,cd_purcha
InputAdapter
BroadcastExchange #6
WholeStageCodegen (2)
ColumnarToRow
CometColumnarToRow
InputAdapter
CometProject [cs_ship_customer_sk]
CometBroadcastHashJoin [cs_ship_customer_sk,cs_sold_date_sk,d_date_sk]
@@ -57,15 +57,15 @@ TakeOrderedAndProject [cd_gender,cd_marital_status,cd_education_status,cd_purcha
InputAdapter
BroadcastExchange #7
WholeStageCodegen (3)
ColumnarToRow
CometColumnarToRow
InputAdapter
CometProject [ca_address_sk]
CometFilter [ca_address_sk,ca_county]
CometScan parquet spark_catalog.default.customer_address [ca_address_sk,ca_county]
InputAdapter
BroadcastExchange #8
WholeStageCodegen (4)
ColumnarToRow
CometColumnarToRow
InputAdapter
CometFilter [cd_demo_sk,cd_gender,cd_marital_status,cd_education_status,cd_purchase_estimate,cd_credit_rating,cd_dep_count,cd_dep_employed_count,cd_dep_college_count]
CometScan parquet spark_catalog.default.customer_demographics [cd_demo_sk,cd_gender,cd_marital_status,cd_education_status,cd_purchase_estimate,cd_credit_rating,cd_dep_count,cd_dep_employed_count,cd_dep_college_count]
@@ -1,5 +1,5 @@
== Physical Plan ==
* ColumnarToRow (69)
* CometColumnarToRow (69)
+- CometTakeOrderedAndProject (68)
+- CometProject (67)
+- CometBroadcastHashJoin (66)
@@ -393,14 +393,14 @@ Arguments: [customer_preferred_cust_flag#36], [customer_preferred_cust_flag#36]
Input [1]: [customer_preferred_cust_flag#36]
Arguments: TakeOrderedAndProject(limit=100, orderBy=[customer_preferred_cust_flag#36 ASC NULLS FIRST], output=[customer_preferred_cust_flag#36]), [customer_preferred_cust_flag#36], 100, [customer_preferred_cust_flag#36 ASC NULLS FIRST], [customer_preferred_cust_flag#36]

(69) ColumnarToRow [codegen id : 1]
(69) CometColumnarToRow [codegen id : 1]
Input [1]: [customer_preferred_cust_flag#36]

===== Subqueries =====

Subquery:1 Hosting operator id = 3 Hosting Expression = ss_sold_date_sk#12 IN dynamicpruning#13
BroadcastExchange (73)
+- * ColumnarToRow (72)
+- * CometColumnarToRow (72)
+- CometFilter (71)
+- CometScan parquet spark_catalog.default.date_dim (70)

@@ -416,7 +416,7 @@ ReadSchema: struct<d_date_sk:int,d_year:int>
Input [2]: [d_date_sk#14, d_year#15]
Condition : ((isnotnull(d_year#15) AND (d_year#15 = 2001)) AND isnotnull(d_date_sk#14))

(72) ColumnarToRow [codegen id : 1]
(72) CometColumnarToRow [codegen id : 1]
Input [2]: [d_date_sk#14, d_year#15]

(73) BroadcastExchange
@@ -425,7 +425,7 @@ Arguments: HashedRelationBroadcastMode(List(cast(input[0, int, false] as bigint)

Subquery:2 Hosting operator id = 19 Hosting Expression = ss_sold_date_sk#30 IN dynamicpruning#31
BroadcastExchange (77)
+- * ColumnarToRow (76)
+- * CometColumnarToRow (76)
+- CometFilter (75)
+- CometScan parquet spark_catalog.default.date_dim (74)

@@ -441,7 +441,7 @@ ReadSchema: struct<d_date_sk:int,d_year:int>
Input [2]: [d_date_sk#32, d_year#33]
Condition : ((isnotnull(d_year#33) AND (d_year#33 = 2002)) AND isnotnull(d_date_sk#32))

(76) ColumnarToRow [codegen id : 1]
(76) CometColumnarToRow [codegen id : 1]
Input [2]: [d_date_sk#32, d_year#33]

(77) BroadcastExchange
@@ -1,5 +1,5 @@
WholeStageCodegen (1)
ColumnarToRow
CometColumnarToRow
InputAdapter
CometTakeOrderedAndProject [customer_preferred_cust_flag]
CometProject [customer_preferred_cust_flag]
@@ -24,7 +24,7 @@ WholeStageCodegen (1)
SubqueryBroadcast [d_date_sk] #1
BroadcastExchange #3
WholeStageCodegen (1)
ColumnarToRow
CometColumnarToRow
InputAdapter
CometFilter [d_date_sk,d_year]
CometScan parquet spark_catalog.default.date_dim [d_date_sk,d_year]
@@ -47,7 +47,7 @@ WholeStageCodegen (1)
SubqueryBroadcast [d_date_sk] #2
BroadcastExchange #8
WholeStageCodegen (1)
ColumnarToRow
CometColumnarToRow
InputAdapter
CometFilter [d_date_sk,d_year]
CometScan parquet spark_catalog.default.date_dim [d_date_sk,d_year]
@@ -2,7 +2,7 @@
TakeOrderedAndProject (22)
+- * Project (21)
+- Window (20)
+- * ColumnarToRow (19)
+- * CometColumnarToRow (19)
+- CometSort (18)
+- CometExchange (17)
+- CometHashAggregate (16)
@@ -109,7 +109,7 @@ Arguments: hashpartitioning(i_class#9, 5), ENSURE_REQUIREMENTS, CometNativeShuff
Input [7]: [i_item_desc#7, i_category#10, i_class#9, i_current_price#8, itemrevenue#14, _w0#15, i_item_id#6]
Arguments: [i_item_desc#7, i_category#10, i_class#9, i_current_price#8, itemrevenue#14, _w0#15, i_item_id#6], [i_class#9 ASC NULLS FIRST]

(19) ColumnarToRow [codegen id : 1]
(19) CometColumnarToRow [codegen id : 1]
Input [7]: [i_item_desc#7, i_category#10, i_class#9, i_current_price#8, itemrevenue#14, _w0#15, i_item_id#6]

(20) Window
@@ -128,7 +128,7 @@ Arguments: 100, [i_category#10 ASC NULLS FIRST, i_class#9 ASC NULLS FIRST, i_ite

Subquery:1 Hosting operator id = 1 Hosting Expression = ws_sold_date_sk#3 IN dynamicpruning#4
BroadcastExchange (27)
+- * ColumnarToRow (26)
+- * CometColumnarToRow (26)
+- CometProject (25)
+- CometFilter (24)
+- CometScan parquet spark_catalog.default.date_dim (23)
@@ -149,7 +149,7 @@ Condition : (((isnotnull(d_date#12) AND (d_date#12 >= 1999-02-22)) AND (d_date#1
Input [2]: [d_date_sk#11, d_date#12]
Arguments: [d_date_sk#11], [d_date_sk#11]

(26) ColumnarToRow [codegen id : 1]
(26) CometColumnarToRow [codegen id : 1]
Input [1]: [d_date_sk#11]

(27) BroadcastExchange
@@ -4,7 +4,7 @@ TakeOrderedAndProject [i_category,i_class,i_item_id,i_item_desc,revenueratio,i_c
InputAdapter
Window [_w0,i_class]
WholeStageCodegen (1)
ColumnarToRow
CometColumnarToRow
InputAdapter
CometSort [i_item_desc,i_category,i_class,i_current_price,itemrevenue,_w0,i_item_id]
CometExchange [i_class] #1
@@ -20,7 +20,7 @@ TakeOrderedAndProject [i_category,i_class,i_item_id,i_item_desc,revenueratio,i_c
SubqueryBroadcast [d_date_sk] #1
BroadcastExchange #3
WholeStageCodegen (1)
ColumnarToRow
CometColumnarToRow
InputAdapter
CometProject [d_date_sk]
CometFilter [d_date_sk,d_date]
@@ -1,5 +1,5 @@
== Physical Plan ==
* ColumnarToRow (33)
* CometColumnarToRow (33)
+- CometHashAggregate (32)
+- CometExchange (31)
+- CometHashAggregate (30)
@@ -188,14 +188,14 @@ Input [7]: [sum#23, count#24, sum#25, count#26, sum#27, count#28, sum#29]
Keys: []
Functions [4]: [avg(ss_quantity#5), avg(UnscaledValue(ss_ext_sales_price#7)), avg(UnscaledValue(ss_ext_wholesale_cost#8)), sum(UnscaledValue(ss_ext_wholesale_cost#8))]

(33) ColumnarToRow [codegen id : 1]
(33) CometColumnarToRow [codegen id : 1]
Input [4]: [avg(ss_quantity)#30, avg(ss_ext_sales_price)#31, avg(ss_ext_wholesale_cost)#32, sum(ss_ext_wholesale_cost)#33]

===== Subqueries =====

Subquery:1 Hosting operator id = 1 Hosting Expression = ss_sold_date_sk#10 IN dynamicpruning#11
BroadcastExchange (38)
+- * ColumnarToRow (37)
+- * CometColumnarToRow (37)
+- CometProject (36)
+- CometFilter (35)
+- CometScan parquet spark_catalog.default.date_dim (34)
@@ -216,7 +216,7 @@ Condition : ((isnotnull(d_year#17) AND (d_year#17 = 2001)) AND isnotnull(d_date_
Input [2]: [d_date_sk#16, d_year#17]
Arguments: [d_date_sk#16], [d_date_sk#16]

(37) ColumnarToRow [codegen id : 1]
(37) CometColumnarToRow [codegen id : 1]
Input [1]: [d_date_sk#16]

(38) BroadcastExchange
@@ -1,5 +1,5 @@
WholeStageCodegen (1)
ColumnarToRow
CometColumnarToRow
InputAdapter
CometHashAggregate [avg(ss_quantity),avg(ss_ext_sales_price),avg(ss_ext_wholesale_cost),sum(ss_ext_wholesale_cost),sum,count,sum,count,sum,count,sum,avg(ss_quantity),avg(UnscaledValue(ss_ext_sales_price)),avg(UnscaledValue(ss_ext_wholesale_cost)),sum(UnscaledValue(ss_ext_wholesale_cost))]
CometExchange #1
@@ -19,7 +19,7 @@ WholeStageCodegen (1)
SubqueryBroadcast [d_date_sk] #1
BroadcastExchange #2
WholeStageCodegen (1)
ColumnarToRow
CometColumnarToRow
InputAdapter
CometProject [d_date_sk]
CometFilter [d_date_sk,d_year]