
[BUG] Spark-4.0 build failure due to update in package name to org.apache.spark.sql.classic #12062

Open
nartal1 opened this issue Feb 4, 2025 · 2 comments
Assignees
Labels
bug Something isn't working build Related to CI / CD or cleanly building Spark 4.0+ Spark 4.0+ issues

Comments

@nartal1
Collaborator

nartal1 commented Feb 4, 2025

Describe the bug
The Spark-4.0 build is failing with the errors below:

Error: ] /home/runner/work/spark-rapids/spark-rapids/sql-plugin-api/src/main/scala/com/nvidia/spark/rapids/SQLExecPlugin.scala:19: object Strategy is not a member of package org.apache.spark.sql
Error: ] /home/runner/work/spark-rapids/spark-rapids/sql-plugin-api/src/main/scala/com/nvidia/spark/rapids/SQLExecPlugin.scala:27: not found: type Strategy
Error: ] /home/runner/work/spark-rapids/spark-rapids/sql-plugin-api/src/main/scala/com/nvidia/spark/rapids/ShimLoader.scala:30: object Strategy is not a member of package org.apache.spark.sql
Error: ] /home/runner/work/spark-rapids/spark-rapids/sql-plugin-api/src/main/scala/com/nvidia/spark/rapids/ShimLoader.scala:354: not found: type Strategy
Error: [ERROR] four errors found

This is due to the removal of the `Strategy` type alias in this PR - https://github.com/apache/spark/pull/49713/files#diff-0c293b10c7128f8352624e7c432c341955eab05b26c9ddcb28e2dfba0aa5eafaL44
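For context, `org.apache.spark.sql.Strategy` was an alias defined in the `sql` package object, so one possible fix is a version-specific shim that re-exports the alias. The sketch below is illustrative only (the object name and source-set layout are assumptions, not the actual spark-rapids structure), assuming `SparkStrategy` remains available under `org.apache.spark.sql.execution` in Spark 4.0:

```scala
// Hypothetical spark400 shim: re-expose the Strategy alias that was
// removed from the org.apache.spark.sql package object in Spark 4.0.
package com.nvidia.spark.rapids.shims

object StrategyShim {
  // Assumption: SparkStrategy (the class the old alias pointed at) still
  // lives in org.apache.spark.sql.execution in Spark 4.0.
  type Strategy = org.apache.spark.sql.execution.SparkStrategy
}
```

Pre-4.0 shims could define the same alias as `org.apache.spark.sql.Strategy`, so common code imports `Strategy` from the shim package on every Spark version.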

@nartal1 nartal1 added ? - Needs Triage Need team to review and classify bug Something isn't working build Related to CI / CD or cleanly building Spark 4.0+ Spark 4.0+ issues labels Feb 4, 2025
@nartal1 nartal1 self-assigned this Feb 4, 2025
@mattahrens mattahrens removed the ? - Needs Triage Need team to review and classify label Feb 4, 2025
@nartal1 nartal1 changed the title [BUG] Spark-4.0 build failure due to removal of type Strategy. [BUG] Spark-4.0 build failure due to update in package name to org.apache.spark.sql.classic Feb 5, 2025
@nartal1
Collaborator Author

nartal1 commented Feb 5, 2025

There are more failures in the sql-plugin module:

[ERROR] [Error] /home/test/spark-rapids-2502/spark-rapids/sql-plugin/src/main/scala/com/nvidia/spark/rapids/StrategyRules.scala:21: object Strategy is not a member of package org.apache.spark.sql
[ERROR] [Error] /home/test/spark-rapids-2502/spark-rapids/sql-plugin/src/main/scala/com/nvidia/spark/rapids/StrategyRules.scala:30: not found: type Strategy
[ERROR] [Error] /home/test/spark-rapids-2502/spark-rapids/sql-plugin/src/main/scala/com/nvidia/spark/rapids/StrategyRules.scala:32: not found: type Strategy
[ERROR] [Error] /home/test/spark-rapids-2502/spark-rapids/sql-plugin/src/main/scala/com/nvidia/spark/rapids/delta/DeltaProvider.scala:21: object Strategy is not a member of package org.apache.spark.sql
[ERROR] [Error] /home/test/spark-rapids-2502/spark-rapids/sql-plugin/src/main/scala/com/nvidia/spark/rapids/delta/DeltaProvider.scala:44: not found: type Strategy
[ERROR] [Error] /home/test/spark-rapids-2502/spark-rapids/sql-plugin/src/main/scala/com/nvidia/spark/rapids/StrategyRules.scala:43: value nonEmpty is not a member of Nothing
[ERROR] [Error] /home/test/spark-rapids-2502/spark-rapids/sql-plugin/src/main/scala/com/nvidia/spark/rapids/StrategyRules.scala:32: private val strategies in class StrategyRules is never used
Applicable -Wconf / @nowarn filters for this fatal warning: msg=<part of the message>, cat=unused-privates, site=com.nvidia.spark.rapids.StrategyRules.strategies
[ERROR] [Error] /home/test/spark-rapids-2502/spark-rapids/sql-plugin/src/main/scala/com/nvidia/spark/rapids/delta/DeltaProvider.scala:100: not found: type Strategy
[ERROR] [Error] /home/test/spark-rapids-2502/spark-rapids/sql-plugin/src/main/scala/org/apache/spark/sql/rapids/GpuInsertIntoHadoopFsRelationCommand.scala:209: type mismatch;
 found   : SparkSession (in org.apache.spark.sql) 
 required: SparkSession (in org.apache.spark.sql.classic) 
[ERROR] [Error] /home/test/spark-rapids-2502/spark-rapids/sql-plugin/src/main/scala/org/apache/spark/sql/rapids/execution/InternalColumnarRddConverter.scala:664: value sqlContext is not a member of org.apache.spark.sql.DataFrame
[ERROR] [Error] /home/test/spark-rapids-2502/spark-rapids/sql-plugin/src/main/scala/org/apache/spark/sql/rapids/execution/InternalColumnarRddConverter.scala:668: value sqlContext is not a member of org.apache.spark.sql.DataFrame
[ERROR] [Error] /home/test/spark-rapids-2502/spark-rapids/sql-plugin/src/main/scala/org/apache/spark/sql/rapids/execution/InternalColumnarRddConverter.scala:718: value sqlContext is not a member of org.apache.spark.sql.DataFrame
[ERROR] [Error] /home/test/spark-rapids-2502/spark-rapids/sql-plugin/src/main/scala/org/apache/spark/sql/rapids/execution/TrampolineUtil.scala:104: value cleanupAnyExistingSession is not a member of object org.apache.spark.sql.SparkSession
[ERROR] [Error] /home/test/spark-rapids-2502/spark-rapids/sql-plugin/src/main/spark320/scala/org/apache/spark/sql/rapids/shims/Spark32XShimsUtils.scala:55: value leafNodeDefaultParallelism is not a member of org.apache.spark.sql.SparkSession
[ERROR] [Error] /home/test/spark-rapids-2502/spark-rapids/sql-plugin/src/main/spark332db/scala/com/nvidia/spark/rapids/shims/GpuInsertIntoHiveTable.scala:143: type mismatch;
 found   : SparkSession (in org.apache.spark.sql) 
 required: SparkSession (in org.apache.spark.sql.classic) 
[ERROR] [Error] /home/test/spark-rapids-2502/spark-rapids/sql-plugin/src/main/spark332db/scala/org/apache/spark/sql/rapids/GpuFileFormatWriter.scala:179: type mismatch;
 found   : SparkSession (in org.apache.spark.sql) 
 required: SparkSession (in org.apache.spark.sql.classic) 
[ERROR] [Error] /home/test/spark-rapids-2502/spark-rapids/sql-plugin/src/main/spark332db/scala/org/apache/spark/sql/rapids/shims/GpuCreateDataSourceTableAsSelectCommandShims.scala:113: type mismatch;
 found   : SparkSession (in org.apache.spark.sql) 
 required: SparkSession (in org.apache.spark.sql.classic) 
[ERROR] [Error] /home/test/spark-rapids-2502/spark-rapids/sql-plugin/src/main/spark332db/scala/org/apache/spark/sql/rapids/shims/GpuDataSource.scala:90: type mismatch;
 found   : org.apache.spark.sql.catalyst.TableIdentifier
 required: String
[ERROR] [Error] /home/test/spark-rapids-2502/spark-rapids/sql-plugin/src/main/spark400/scala/org/apache/spark/sql/hive/rapids/shims/CommandUtilsShim.scala:30: type mismatch;
 found   : SparkSession (in org.apache.spark.sql) 
 required: SparkSession (in org.apache.spark.sql.classic) 
[ERROR] [Error] /home/test/spark-rapids-2502/spark-rapids/sql-plugin/src/main/spark400/scala/org/apache/spark/sql/nvidia/DFUDFShims.scala:24: object ExpressionUtils is not a member of package org.apache.spark.sql.internal
[ERROR] [Error] /home/test/spark-rapids-2502/spark-rapids/sql-plugin/src/main/spark400/scala/org/apache/spark/sql/nvidia/DFUDFShims.scala:27: not found: value expression
[ERROR] [Error] /home/test/spark-rapids-2502/spark-rapids/sql-plugin/src/main/spark400/scala/org/apache/spark/sql/nvidia/DFUDFShims.scala:28: not found: value column
[ERROR] 22 errors found
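Most of the `type mismatch` errors above come from Spark 4.0 splitting the public `org.apache.spark.sql.SparkSession` API from the concrete implementation in `org.apache.spark.sql.classic`. A common shim pattern for this (names here are hypothetical, not the actual spark-rapids fix) is a per-version converter, a minimal sketch:

```scala
// Hypothetical spark400 shim: internal Spark 4.0 APIs require the concrete
// classic.SparkSession, while plugin code holds the public API type.
package com.nvidia.spark.rapids.shims

import org.apache.spark.sql.{SparkSession => ApiSparkSession}
import org.apache.spark.sql.classic.{SparkSession => ClassicSparkSession}

object SparkSessionShim {
  // Downcast the API session to the classic implementation expected by
  // internal Spark 4.0 methods. Assumption: at runtime the session is
  // always the classic implementation, so the cast is safe.
  def toClassic(spark: ApiSparkSession): ClassicSparkSession =
    spark.asInstanceOf[ClassicSparkSession]
}
```

On Spark versions before 4.0, the equivalent shim method would simply be the identity function, since there is only one `SparkSession` type.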

@nartal1
Collaborator Author

nartal1 commented Feb 6, 2025

Stopped building Spark-4.0.0-SNAPSHOT in #12068. We should re-enable it once the build issue is fixed.
