jar - Unable to submit Spring Boot Java application to Spark cluster


I have developed a Spring Boot web application that uses Apache Spark to query data from different data sources (such as Oracle). At the beginning, I had planned to run the application without submitting it via the spark-submit script, but it looks like I cannot connect to the master of the cluster without submitting a jar. I have generated an uber jar that includes all the dependencies and sub-projects I am using, but it seems Spark does not like Spring Boot applications. When I try to submit the app, Spark shows the following error:
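For context, a typical submission of an uber jar looks like the sketch below; the master URL and jar path are assumptions, and the main class name is inferred from the stack trace:

```shell
# Hedged sketch: submitting the shaded jar to a standalone Spark master
# (host/port and jar path are placeholders, not taken from the question)
./bin/spark-submit \
  --class ch.dlx.QubidaOracleConnectorApplication \
  --master spark://master-host:7077 \
  target/qubida-oracle-connector-0.0.1-SNAPSHOT.jar
```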

    Exception in thread "main" java.lang.IllegalArgumentException: LoggerFactory is not a Logback LoggerContext but Logback is on the classpath. Either remove Logback or the competing implementation (class org.slf4j.impl.Log4jLoggerFactory loaded from file:/home/rojasmi1/spark/spark-1.4.0/assembly/target/scala-2.10/spark-assembly-1.4.0-hadoop2.2.0.jar). If you are using WebLogic you will need to add 'org.slf4j' to prefer-application-packages in WEB-INF/weblogic.xml: Object of class [org.slf4j.impl.Log4jLoggerFactory] must be an instance of class ch.qos.logback.classic.LoggerContext
        at org.springframework.util.Assert.isInstanceOf(Assert.java:339)
        at org.springframework.boot.logging.logback.LogbackLoggingSystem.getLoggerContext(LogbackLoggingSystem.java:151)
        at org.springframework.boot.logging.logback.LogbackLoggingSystem.getLogger(LogbackLoggingSystem.java:143)
        at org.springframework.boot.logging.logback.LogbackLoggingSystem.beforeInitialize(LogbackLoggingSystem.java:89)
        at org.springframework.boot.logging.LoggingApplicationListener.onApplicationStartedEvent(LoggingApplicationListener.java:152)
        at org.springframework.boot.logging.LoggingApplicationListener.onApplicationEvent(LoggingApplicationListener.java:139)
        at org.springframework.context.event.SimpleApplicationEventMulticaster.invokeListener(SimpleApplicationEventMulticaster.java:151)
        at org.springframework.context.event.SimpleApplicationEventMulticaster.multicastEvent(SimpleApplicationEventMulticaster.java:128)
        at org.springframework.boot.context.event.EventPublishingRunListener.publishEvent(EventPublishingRunListener.java:100)
        at org.springframework.boot.context.event.EventPublishingRunListener.started(EventPublishingRunListener.java:54)
        at org.springframework.boot.SpringApplication.run(SpringApplication.java:277)
        at org.springframework.boot.SpringApplication.run(SpringApplication.java:957)
        at org.springframework.boot.SpringApplication.run(SpringApplication.java:946)
        at ch.dlx.QubidaOracleConnectorApplication.main(QubidaOracleConnectorApplication.java:12)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:497)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:664)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:169)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:192)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

    Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties

I have tried to exclude the slf4j-log4j12 dependency in the POM file, but I am still getting the same error.

The POM file contains the following configuration:

    <?xml version="1.0" encoding="UTF-8"?>
    <project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
        xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
        <modelVersion>4.0.0</modelVersion>

        <groupId>ch.dlx</groupId>
        <artifactId>qubida-oracle-connector</artifactId>
        <version>0.0.1-SNAPSHOT</version>

        <name>qubida-oracle-connector</name>
        <description></description>

        <properties>
            <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
            <java.version>1.8</java.version>
        </properties>

        <dependencyManagement>
            <dependencies>
                <dependency>
                    <groupId>org.springframework.boot</groupId>
                    <artifactId>spring-boot-dependencies</artifactId>
                    <version>1.2.5.RELEASE</version>
                    <type>pom</type>
                    <scope>import</scope>
                </dependency>
            </dependencies>
        </dependencyManagement>

        <dependencies>
            <dependency>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-starter-web</artifactId>
                <exclusions>
                    <exclusion>
                        <groupId>org.slf4j</groupId>
                        <artifactId>log4j-over-slf4j</artifactId>
                    </exclusion>
                </exclusions>
            </dependency>

            <dependency>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-starter-tomcat</artifactId>
            </dependency>

            <dependency>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-starter-test</artifactId>
                <scope>test</scope>
            </dependency>

            <!-- Spark -->

            <dependency>
                <groupId>org.apache.spark</groupId>
                <artifactId>spark-core_2.11</artifactId>
                <version>1.4.0</version>
                <scope>provided</scope>
                <exclusions>
                    <exclusion>
                        <groupId>org.slf4j</groupId>
                        <artifactId>slf4j-log4j12</artifactId>
                    </exclusion>
                </exclusions>
            </dependency>

            <dependency>
                <groupId>org.apache.spark</groupId>
                <artifactId>spark-sql_2.11</artifactId>
                <version>1.4.0</version>
                <scope>provided</scope>
            </dependency>

            <dependency>
                <groupId>org.mongodb</groupId>
                <artifactId>mongo-hadoop-core</artifactId>
                <version>1.3.0</version>
                <exclusions>
                    <exclusion>
                        <groupId>org.slf4j</groupId>
                        <artifactId>log4j-over-slf4j</artifactId>
                    </exclusion>
                </exclusions>
            </dependency>

            <!-- DB drivers -->

            <dependency>
                <groupId>com.oracle</groupId>
                <artifactId>ojdbc14</artifactId>
                <version>10.2.0.4.0</version>
            </dependency>

        </dependencies>

        <build>
            <plugins>
                <plugin>
                    <groupId>org.apache.maven.plugins</groupId>
                    <artifactId>maven-shade-plugin</artifactId>
                    <configuration>
                        <createDependencyReducedPom>false</createDependencyReducedPom>
                        <keepDependenciesWithProvidedScope>true</keepDependenciesWithProvidedScope>
                        <artifactSet>
                            <excludes>
                                <exclude>org.slf4j</exclude>
                            </excludes>
                        </artifactSet>
                    </configuration>
                    <executions>
                        <execution>
                            <phase>package</phase>
                            <goals>
                                <goal>shade</goal>
                            </goals>
                        </execution>
                    </executions>
                </plugin>
            </plugins>
        </build>
    </project>
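As an aside on the shade configuration above: shading a Spring Boot application often breaks at runtime because identically named `META-INF` resources from different jars overwrite each other. The maven-shade-plugin's standard resource transformers can merge them instead; a hedged sketch (the transformer classes are standard shade-plugin API, while the main class name is inferred from the stack trace) that would go inside the plugin's `<configuration>` element:

```xml
<!-- Sketch: merge Spring's META-INF resources instead of letting them clobber
     each other, and set the manifest's Main-Class for java -jar / spark-submit -->
<transformers>
    <transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
        <resource>META-INF/spring.handlers</resource>
    </transformer>
    <transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
        <resource>META-INF/spring.schemas</resource>
    </transformer>
    <transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
        <resource>META-INF/spring.factories</resource>
    </transformer>
    <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
        <mainClass>ch.dlx.QubidaOracleConnectorApplication</mainClass>
    </transformer>
</transformers>
```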

Is there a way to submit a Spring Boot application to the cluster? Should I use another type of project, taking into account that I need to expose a RESTful API? Is there a way to connect to the Spark cluster without submitting the .jar?

Thanks in advance for your help.

I had a similar issue. To solve it, try removing the Spring Boot logging with the following exclusion:

    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-web</artifactId>
        <exclusions>
            <exclusion>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-starter-logging</artifactId>
            </exclusion>
        </exclusions>
    </dependency>

If you still get an error while initializing the servlet:

    java.lang.NoSuchMethodError: javax.servlet.ServletContext.getVirtualServerName()Ljava/lang/String;

then try using the 1.2.1.RELEASE version of the starter parent, since this is caused by the servlet-api version used by the Spark cluster.
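Since the POM in the question imports the Spring Boot BOM rather than inheriting from the starter parent, the equivalent downgrade would look roughly like this sketch (assuming you keep the BOM-import style shown above):

```xml
<!-- Sketch: pin Spring Boot to 1.2.1.RELEASE via the BOM import, whose
     managed servlet-api version matches older Spark clusters -->
<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-dependencies</artifactId>
            <version>1.2.1.RELEASE</version>
            <type>pom</type>
            <scope>import</scope>
        </dependency>
    </dependencies>
</dependencyManagement>
```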

