
[VL] Can't run spark.plugins=io.glutenproject.GlutenPlugin with release jar #6185

Open
ArnavBalyan opened this issue Jun 23, 2024 · 2 comments
Labels: bug (Something isn't working), triage

Comments

Backend

VL (Velox)

Bug description

bin/spark-shell \
  --conf spark.plugins=io.glutenproject.GlutenPlugin \
  --conf spark.memory.offHeap.enabled=true \
  --conf spark.memory.offHeap.size=20g \
  --conf spark.shuffle.manager=org.apache.spark.shuffle.sort.ColumnarShuffleManager \
  --jars /home/user/gluten-velox-bundle-spark3.3_2.12-1.1.0.jar

Output:

24/06/23 05:27:23 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
OpenJDK 64-Bit Server VM warning: You have loaded library /tmp/gluten-78747346-ee81-43e9-80b1-ccc58bfa49bf/jni/5c51cec7-2a1d-4dd0-a8f9-406f8dc104ed/gluten-4920012601861087064/libvelox.so which might have disabled stack guard. The VM will try to fix the stack guard now.
It's highly recommended that you fix the library with 'execstack -c <libfile>', or link it with '-z noexecstack'.
#
# A fatal error has been detected by the Java Runtime Environment:
#
#  SIGILL (0x4) at pc=0x00007f0f88d19753, pid=769118, tid=0x00007f100970a700
#
# JRE version: OpenJDK Runtime Environment (8.0_412-b08) (build 1.8.0_412-b08)
# Java VM: OpenJDK 64-Bit Server VM (25.412-b08 mixed mode linux-amd64 compressed oops)
# Problematic frame:
# C  [libgluten.so+0x30c753]  gluten::Runtime::registerFactory(std::string const&, std::function<gluten::Runtime* (std::unordered_map<std::string, std::string, std::hash<std::string>, std::equal_to<std::string>, std::allocator<std::pair<std::string const, std::string> > > const&)>)+0x23
#
# Failed to write core dump. Core dumps have been disabled. To enable core dumping, try "ulimit -c unlimited" before starting Java again
#
# An error report file with more information is saved as:
# /home/user/spark/hs_err_pid769118.log
#
# If you would like to submit a bug report, please visit:
#   https://github.com/adoptium/adoptium-support/issues
# The crash happened outside the Java Virtual Machine in native code.
# See problematic frame for where to report the bug.
#
bin/spark-shell: line 47: 769118 Aborted                 "${SPARK_HOME}"/bin/spark-submit --class org.apache.spark.repl.Main --name "Spark shell" "$@"
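A note not in the original report: a SIGILL in a native library such as libgluten.so often means the prebuilt binary was compiled with CPU instruction-set extensions the host does not support. The sketch below checks a CPU flags string for such extensions; the specific extensions (avx2, bmi2) are illustrative assumptions, not confirmed build requirements of this jar, and the sample flags line stands in for the host's real `/proc/cpuinfo` output so the check is reproducible.

```shell
# Sketch: test whether assumed instruction-set extensions appear in the
# CPU flags. On a real Linux host the flags would come from:
#   flags=$(grep -m1 '^flags' /proc/cpuinfo)
# A sample flags line is used here instead, so the output is deterministic.
flags="fpu sse4_2 avx avx2 bmi1 bmi2"

# avx2 and bmi2 are hypothetical requirements chosen for illustration.
for ext in avx2 bmi2; do
  case " $flags " in
    *" $ext "*) echo "$ext: supported" ;;
    *)          echo "$ext: MISSING" ;;
  esac
done
```

If an extension reports MISSING on the crashing host, building Gluten from source on that machine (rather than using the prebuilt release jar) is one way to rule this cause in or out.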

Spark version

Spark-3.3.x

Spark configurations

No response

System information

No response

Relevant logs

No response

ArnavBalyan added the bug (Something isn't working) and triage labels on Jun 23, 2024.
ArnavBalyan (Author) commented:
Using the release available here: https://github.com/apache/incubator-gluten/releases

ArnavBalyan (Author) commented Jun 23, 2024:

Also reported here: #5327 and #6088
