Asked 2017-02-20 · 200 views

After installing the Eclipse Scala plugin and the Eclipse Maven plugin for Scala: "Error: Could not find or load main class"

I am new to Scala, so I first made sure the environment works by testing a Scala Hello World project. It worked as expected.

But I am having trouble running a project checked out from my company's repository. No matter what I do (clean, build, clean install via Maven, etc.) I get "Error: Could not find or load main class com.company.team.spark.sqlutil.testQuery", even when trying to run a tiny Hello World program inside the project. My hunch is that Eclipse cannot create the class files for this project because of a POM issue, but even after several attempts I still cannot figure it out. Please help me work this out.

Version: Eclipse Luna Release (4.4.0), Build ID: 20140612-0600

Scala - 2.10.6

Scala code - testQuery.scala

package com.company.team.spark.sqlutil 

object testQuery { 
    def main(args: Array[String]): Unit = { 
    print ("Hello") 
    } 
} 
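The error message quotes the fully qualified name (package plus object) that the JVM launcher was asked to run. As a hedged sanity check, a minimal sketch using a hypothetical `MainClassCheck` helper (plain Scala, no Spark) can probe at runtime whether that name resolves to a compiled class on the classpath:

```scala
// Hypothetical helper (not part of the project): checks whether a Scala
// object's fully qualified name resolves to a compiled class, which is
// exactly what "Could not find or load main class" complains about.
object MainClassCheck {
  def isOnClasspath(fqn: String): Boolean =
    // A Scala object compiles to a class named <fqn>$ holding the singleton.
    try { Class.forName(fqn + "$"); true }
    catch { case _: ClassNotFoundException => false }

  def main(args: Array[String]): Unit = {
    val fqn = "com.company.team.spark.sqlutil.testQuery"
    if (isOnClasspath(fqn)) println(s"$fqn is on the classpath")
    else println(s"$fqn is NOT on the classpath (the condition Eclipse reports)")
  }
}
```

If the check fails for a class you just wrote, the compiler never produced the .class file, which points at the build configuration rather than the code itself.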

Below is the POM I am using.

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" 
    xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd"> 
    <modelVersion>4.0.0</modelVersion> 

    <groupId>com.company.team.spark</groupId> 
    <artifactId>HomeSpark</artifactId> 
    <version>0.0.1-SNAPSHOT</version> 
    <packaging>jar</packaging> 

    <name>HomeSpark</name> 
    <url>http://maven.apache.org</url> 

    <properties> 
     <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding> 
     <lib.dir>${project.basedir}\lib\</lib.dir> 
    </properties> 

    <dependencies> 
    <!-- <dependency> 
      <groupId>org.apache.hadoop</groupId> 
      <artifactId>hadoop-core</artifactId> 
      <version>1.2.1</version> 
     </dependency> 
     <dependency> 
      <groupId>junit</groupId> 
      <artifactId>junit</artifactId> 
      <version>3.8.1</version> 
      <scope>system</scope> 
      <systemPath>${lib.dir}junit-3.8.1.jar</systemPath> 
     </dependency> 

     <dependency> 
      <groupId>org.apache.spark</groupId> 
      <artifactId>spark-core_2.10</artifactId> 
      <version>2.1.0</version> 
      <scope>system</scope> 
      <systemPath>${lib.dir}spark-core_2.10-2.1.0.jar</systemPath> 
     </dependency> 

     <dependency> 
      <groupId>org.apache.spark</groupId> 
      <artifactId>spark-sql_2.10</artifactId> 
      <version>2.1.0</version> 
      <scope>system</scope> 
      <systemPath>${lib.dir}spark-sql_2.10-2.1.0.jar</systemPath> 
     </dependency> 

     <dependency> 
      <groupId>com.databricks</groupId> 
      <artifactId>spark-csv_2.10</artifactId> 
      <version>1.5.0</version> 
      <scope>system</scope> 
      <systemPath>${lib.dir}spark-csv_2.10-1.5.0.jar</systemPath> 
     </dependency> --> 

    <dependency> 
     <groupId>junit</groupId> 
     <artifactId>junit</artifactId> 
     <version>3.8.1</version> 
     <scope>test</scope> 
    </dependency> 

    <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-core_2.10 --> 
<dependency> 
    <groupId>org.apache.spark</groupId> 
    <artifactId>spark-core_2.10</artifactId> 
    <version>2.1.0</version> 
</dependency> 

    <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-sql_2.10 --> 
<dependency> 
    <groupId>org.apache.spark</groupId> 
    <artifactId>spark-sql_2.10</artifactId> 
    <version>2.1.0</version> 
</dependency>  

<!-- https://mvnrepository.com/artifact/com.databricks/spark-csv_2.10 --> 
<dependency> 
    <groupId>com.databricks</groupId> 
    <artifactId>spark-csv_2.10</artifactId> 
    <version>1.5.0</version> 
</dependency> 

<dependency> 
      <groupId>com.amazonaws</groupId> 
      <artifactId>aws-java-sdk</artifactId> 
      <version>1.9.2</version> 
     </dependency> 

    </dependencies> 



<build> 

     <sourceDirectory>${project.basedir}/src/main/scala</sourceDirectory> 

     <testOutputDirectory>${project.build.directory}/test-classes</testOutputDirectory> 

     <plugins><plugin> 

    <groupId>net.alchim31.maven</groupId> 
    <artifactId>scala-maven-plugin</artifactId> 
    <version>3.1.3</version> 
    <executions> 
     <execution> 
      <goals> 
       <goal>compile</goal> 
       <goal>testCompile</goal> 
      </goals> 
     </execution> 
    </executions> 
</plugin></plugins> 

</build>  
</project> 

Image of the project structure (link)


Is this when running from Eclipse? If you have Maven installed, try `mvn clean compile` from the command line; it will download the dependencies for you, and see whether it works. – prayagupd
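The suggestion above can be tried as follows (a sketch; it assumes `mvn` is on the PATH and is run from the project root containing pom.xml):

```shell
# Build from the command line so Maven itself resolves the declared
# dependencies (downloading them into ~/.m2/repository) and compiles
# the sources, independent of any Eclipse configuration.
mvn clean compile
```

If this succeeds while Eclipse still fails, the POM is fine and the problem lies in the IDE integration.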


@prayagupd I tried compiling through Eclipse [Run As > Maven Build], and it downloaded the dependencies. But I still get the same error when trying to run the Hello World under /src/test/scala. The code is similar to `object testQuery { def main(args: Array[String]): Unit = { print("Hello") } }`. My apologies for leaving out the part that this is a test class. However, your suggestion did work once I moved it into the src/main/scala folder, and separately when I removed the **: Unit =** part. Thanks! – stormfield


@prayagupd Any suggestions for getting the test class to work? – stormfield

Answers


I was able to resolve the issue after switching to the Scala IDE integration via the Scala IDE plugin in Eclipse.

I also changed the pom.xml to the following:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd"> 
    <modelVersion>4.0.0</modelVersion> 
    <groupId>com.company.fuison</groupId> 
    <artifactId>SomeCloud</artifactId> 
    <version>0.0.1-SNAPSHOT</version> 
    <name>${project.artifactId}</name> 
    <description>My wonderfull scala app</description> 
    <inceptionYear>2015</inceptionYear> 
    <licenses> 
    <license> 
     <name>My License</name> 
     <url>http://....</url> 
     <distribution>repo</distribution> 
    </license> 
    </licenses> 

    <properties> 
    <maven.compiler.source>1.6</maven.compiler.source> 
    <maven.compiler.target>1.6</maven.compiler.target> 
    <encoding>UTF-8</encoding> 
    <scala.version>2.11.5</scala.version> 
    <scala.compat.version>2.11</scala.compat.version> 
    </properties> 

    <dependencies> 
    <dependency> 
     <groupId>org.scala-lang</groupId> 
     <artifactId>scala-library</artifactId> 
     <version>${scala.version}</version> 
    </dependency> 

    <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-core_2.11 --> 
<dependency> 
    <groupId>org.apache.spark</groupId> 
    <artifactId>spark-core_2.11</artifactId> 
    <version>2.0.0</version> 
</dependency> 

    <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-core_2.11 --> 
<dependency> 
    <groupId>org.apache.spark</groupId> 
    <artifactId>spark-sql_2.11</artifactId> 
    <version>2.0.0</version> 
</dependency> 


    <dependency> 
    <groupId>com.amazonaws</groupId> 
    <artifactId>aws-java-sdk</artifactId> 
    <version>1.9.2</version> 
    </dependency> 

<!-- https://mvnrepository.com/artifact/com.databricks/spark-csv_2.11 --> 
<dependency> 
    <groupId>com.databricks</groupId> 
    <artifactId>spark-csv_2.11</artifactId> 
    <version>1.5.0</version> 
</dependency> 
    <!-- Test --> 
    <dependency> 
     <groupId>junit</groupId> 
     <artifactId>junit</artifactId> 
     <version>4.12</version> 
     <scope>test</scope> 
    </dependency> 
    <dependency> 
     <groupId>org.specs2</groupId> 
     <artifactId>specs2-core_${scala.compat.version}</artifactId> 
     <version>2.4.16</version> 
     <scope>test</scope> 
    </dependency> 
    <dependency> 
     <groupId>org.scalatest</groupId> 
     <artifactId>scalatest_${scala.compat.version}</artifactId> 
     <version>2.2.4</version> 
     <scope>test</scope> 
    </dependency> 
    <dependency> 
     <groupId>org.specs2</groupId> 
     <artifactId>specs2-junit_${scala.compat.version}</artifactId> 
     <version>2.4.16</version> 
     <scope>test</scope> 
    </dependency> 

    <dependency> 
     <groupId>org.apache.logging.log4j</groupId> 
     <artifactId>log4j-api-scala_2.11</artifactId> 
     <version>2.8.1</version> 
    </dependency> 

    <dependency> 
     <groupId>org.apache.logging.log4j</groupId> 
     <artifactId>log4j-core</artifactId> 
     <version>2.8.1</version> 
    </dependency> 



    <!-- https://mvnrepository.com/artifact/org.scala-lang/scala-library 
    <dependency> 
     <groupId>org.scala-lang</groupId> 
     <artifactId>scala-library</artifactId> 
     <version>2.12.1</version> 
    </dependency> 
    --> 
    <!-- https://mvnrepository.com/artifact/com.typesafe.scala-logging/scala-logging_2.11 --> 
    <dependency> 
     <groupId>com.typesafe.scala-logging</groupId> 
     <artifactId>scala-logging_2.11</artifactId> 
     <version>3.5.0</version> 
    </dependency> 


    </dependencies> 

    <build> 
    <resources> 
      <resource> 
       <directory>${project.basedir}/config/log4j</directory> 
      </resource> 
     </resources> 

    <sourceDirectory>src/main/scala</sourceDirectory> 
    <testSourceDirectory>src/test/scala</testSourceDirectory> 
    <plugins> 
     <plugin> 
     <!-- see http://davidb.github.com/scala-maven-plugin --> 
     <groupId>net.alchim31.maven</groupId> 
     <artifactId>scala-maven-plugin</artifactId> 
     <version>3.2.0</version> 
     <executions> 
      <execution> 
      <goals> 
       <goal>compile</goal> 
       <goal>testCompile</goal> 
      </goals> 
      <configuration> 
       <args> 

       <arg>-dependencyfile</arg> 
       <arg>${project.build.directory}/.scala_dependencies</arg> 
       </args> 
      </configuration> 
      </execution> 
     </executions> 
     </plugin> 
     <plugin> 
     <groupId>org.apache.maven.plugins</groupId> 
     <artifactId>maven-surefire-plugin</artifactId> 
     <version>2.18.1</version> 
     <configuration> 
      <useFile>false</useFile> 
      <disableXmlReport>true</disableXmlReport> 
      <!-- If you have classpath issue like NoDefClassError,... --> 
      <!-- useManifestOnlyJar>false</useManifestOnlyJar --> 
      <includes> 
      <include>**/*Test.*</include> 
      <include>**/*Suite.*</include> 
      </includes> 
     </configuration> 
     </plugin> 
    </plugins> 
    </build> 
</project> 

Build with `compile` so the scala-maven-plugin generates the required classes. You may be using `clean install`, which first deletes the generated .class files from /bin, so Eclipse can no longer find or load the main class.
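In other words (a sketch, assuming the standard Maven layout where compiled classes land in target/classes rather than /bin):

```shell
# `clean` deletes target/ and with it every generated .class file:
mvn clean

# `compile` asks the scala-maven-plugin to regenerate them:
mvn compile

# The main object can then be run straight from the compiled output;
# the fully qualified name must match what Eclipse was trying to load:
scala -classpath target/classes com.company.team.spark.sqlutil.testQuery
```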
