Hadoop - Bad Substitution Error on PHD 3.0.1 when using Spark

Article ID: KB0082653

Products: Spotfire Data Science
Versions: 6.x

Description

PHD 3.0.1 is connected as a Hadoop data source, and Spark jobs cannot be run.

Issue/Introduction

Bad Substitution Error on PHD 3.0.1 when using Spark

Resolution

Customers with a PHD 3.0.1 cluster who cannot run Spark jobs may see a "bad substitution" error on the JobHistory page:
 
16/04/08 14:17:12 INFO yarn.ExecutorRunnable: Preparing Local resource
16/04/08 14:17:12 INFO yarn.YarnAllocator: Completed container container_e03_1460150062360_0001_01_000002 (state: COMPLETE, exit status: 1)
16/04/08 14:17:12 INFO yarn.YarnAllocator: Container marked as failed: container_e03_1460150062360_0001_01_000002. Exit status: 1. Diagnostics: Exception from container-launch.
Container id: container_e03_1460150062360_0001_01_000002
Exit code: 1
Exception message: /data/hadoop/yarn/local/usercache/hdfs/appcache/application_1460150062360_0001/container_e03_1460150062360_0001_01_000002/launch_container.sh: line 26: $PWD:$PWD/__spark__.jar:$HADOOP_CONF_DIR:/usr/phd/current/hadoop-client/*:/usr/phd/current/hadoop-client/lib/*:/usr/phd/current/hadoop-hdfs-client/*:/usr/phd/current/hadoop-hdfs-client/lib/*:/usr/phd/current/hadoop-yarn-client/*:/usr/phd/current/hadoop-yarn-client/lib/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/lib/*:$PWD/mr-framework/hadoop/share/hadoop/common/*:$PWD/mr-framework/hadoop/share/hadoop/common/lib/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/lib/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/lib/*:$PWD/mr-framework/hadoop/share/hadoop/tools/lib/*:/usr/phd/${hdp.version}/hadoop/lib/hadoop-lzo-0.6.0.${hdp.version}.jar:/etc/hadoop/conf/secure: bad substitution
 
Stack trace: ExitCodeException exitCode=1: /data/hadoop/yarn/local/usercache/hdfs/appcache/application_1460150062360_0001/container_e03_1460150062360_0001_01_000002/launch_container.sh: line 26: $PWD:$PWD/__spark__.jar:$HADOOP_CONF_DIR:/usr/phd/current/hadoop-client/*:/usr/phd/current/hadoop-client/lib/*:/usr/phd/current/hadoop-hdfs-client/*:/usr/phd/current/hadoop-hdfs-client/lib/*:/usr/phd/current/hadoop-yarn-client/*:/usr/phd/current/hadoop-yarn-client/lib/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/lib/*:$PWD/mr-framework/hadoop/share/hadoop/common/*:$PWD/mr-framework/hadoop/share/hadoop/common/lib/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/lib/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/lib/*:$PWD/mr-framework/hadoop/share/hadoop/tools/lib/*:/usr/phd/${hdp.version}/hadoop/lib/hadoop-lzo-0.6.0.${hdp.version}.jar:/etc/hadoop/conf/secure: bad substitution
 
    at org.apache.hadoop.util.Shell.runCommand(Shell.java:538)
    at org.apache.hadoop.util.Shell.run(Shell.java:455)
    at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:715)
    at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:212)
    at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:302)
    at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:82)
    at java.util.concurrent.FutureTask.run(FutureTask.java:262)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
 
 
Container exited with a non-zero exit code 1

The "bad substitution" error appears at the very end of the classpath line because the ${hdp.version} placeholder is never expanded, and the resulting path does not exist on the node:
/usr/phd/${hdp.version}/hadoop/lib/hadoop-lzo-0.6.0.${hdp.version}.jar
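The error itself can be reproduced outside of Hadoop: Bash does not allow a dot in a parameter name, so any literal ${hdp.version} left unexpanded in launch_container.sh aborts the script with exactly this message. A minimal sketch (the paths checked are illustrative):

```shell
# Bash rejects ${hdp.version} because "." is not valid in a parameter
# name, producing the same "bad substitution" error seen in the logs.
bash -c 'echo /usr/phd/${hdp.version}/hadoop/lib' 2>&1 | grep "bad substitution"

# Optionally confirm on the affected node that no hadoop-lzo jar is
# present under any installed version directory:
ls /usr/phd/*/hadoop/lib/hadoop-lzo-* 2>/dev/null || echo "hadoop-lzo jar not found"
```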

You can remove the offending path from mapred-site.xml in the Ambari UI, under the MapReduce2 service. Open the "Advanced mapred-site" section and edit the mapreduce.application.classpath property:

[screenshot_1]

mapreduce.application.classpath (original):

$PWD/mr-framework/hadoop/share/hadoop/mapreduce/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/lib/*:$PWD/mr-framework/hadoop/share/hadoop/common/*:$PWD/mr-framework/hadoop/share/hadoop/common/lib/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/lib/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/lib/*:$PWD/mr-framework/hadoop/share/hadoop/tools/lib/*:/usr/phd/${hdp.version}/hadoop/lib/hadoop-lzo-0.6.0.${hdp.version}.jar:/etc/hadoop/conf/secure

Delete the trailing /usr/phd/${hdp.version}/hadoop/lib/hadoop-lzo-0.6.0.${hdp.version}.jar segment (together with its leading colon) from the value above, then restart the relevant services.
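For reference, the corrected value is simply the original string with the final hadoop-lzo segment removed:

mapreduce.application.classpath (after removal):

$PWD/mr-framework/hadoop/share/hadoop/mapreduce/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/lib/*:$PWD/mr-framework/hadoop/share/hadoop/common/*:$PWD/mr-framework/hadoop/share/hadoop/common/lib/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/lib/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/lib/*:$PWD/mr-framework/hadoop/share/hadoop/tools/lib/*:/etc/hadoop/conf/secure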
