ZEPPELIN-1599 Remove support on some old versions of Spark.
### What is this PR for?
Removes support for old versions of Spark, including testing and building against them.

### What type of PR is it?
[Feature]

### Todos
* [x] - Remove old Spark entries from `.travis.yml`
* [x] - Remove old Spark profiles from `pom.xml`
* [x] - Remove related docs

### What is the Jira issue?
* https://issues.apache.org/jira/browse/ZEPPELIN-1599

### How should this be tested?
No new tests. Verify that the Travis build matrix has been simplified.

### Screenshots (if appropriate)

### Questions:
* Do the license files need an update? No
* Are there breaking changes for older versions? Yes, Spark 1.1 through 1.3 can no longer be used
* Does this need documentation? Yes, some docs should be removed

Removed some Maven profiles for old versions of Spark

Author: Jongyoul Lee <[email protected]>

Closes apache#1578 from jongyoul/ZEPPELIN-1599 and squashes the following commits:

acf514f [Jongyoul Lee] Fixed the script so it no longer recognizes old versions
4bc11d6 [Jongyoul Lee] Added some docs for the deprecation on support for old versions of Spark
207502d [Jongyoul Lee] Removed some tests for old versions of Spark; removed some profiles concerning old versions of Spark
jongyoul authored and minahlee committed Nov 4, 2016
1 parent 8dde8fb commit c5ab10d
Showing 5 changed files with 3 additions and 49 deletions.
12 changes: 0 additions & 12 deletions .travis.yml
@@ -58,18 +58,6 @@ matrix:
- jdk: "oraclejdk7"
env: SCALA_VER="2.10" SPARK_VER="1.4.1" HADOOP_VER="2.3" PROFILE="-Pspark-1.4 -Pr -Phadoop-2.3 -Ppyspark -Psparkr" BUILD_FLAG="package -DskipTests -DskipRat" TEST_FLAG="verify -DskipRat" TEST_PROJECTS="-pl zeppelin-interpreter,zeppelin-zengine,zeppelin-server,zeppelin-display,spark-dependencies,spark,r -Dtest=org.apache.zeppelin.rest.*Test,org.apache.zeppelin.spark.* -DfailIfNoTests=false"

# Test spark module for 1.3.1
- jdk: "oraclejdk7"
env: SCALA_VER="2.10" SPARK_VER="1.3.1" HADOOP_VER="2.3" PROFILE="-Pspark-1.3 -Phadoop-2.3 -Ppyspark" BUILD_FLAG="package -DskipTests -DskipRat" TEST_FLAG="verify -DskipRat" TEST_PROJECTS="-pl zeppelin-interpreter,zeppelin-zengine,zeppelin-server,zeppelin-display,spark-dependencies,spark -Dtest=org.apache.zeppelin.rest.*Test,org.apache.zeppelin.spark.* -DfailIfNoTests=false"

# Test spark module for 1.2.2
- jdk: "oraclejdk7"
env: SCALA_VER="2.10" SPARK_VER="1.2.2" HADOOP_VER="2.3" PROFILE="-Pspark-1.2 -Phadoop-2.3 -Ppyspark" BUILD_FLAG="package -DskipTests -DskipRat" TEST_FLAG="verify -DskipRat" TEST_PROJECTS="-pl zeppelin-interpreter,zeppelin-zengine,zeppelin-server,zeppelin-display,spark-dependencies,spark -Dtest=org.apache.zeppelin.rest.*Test,org.apache.zeppelin.spark.* -DfailIfNoTests=false"

# Test spark module for 1.1.1
- jdk: "oraclejdk7"
env: SCALA_VER="2.10" SPARK_VER="1.1.1" HADOOP_VER="2.3" PROFILE="-Pspark-1.1 -Phadoop-2.3 -Ppyspark" BUILD_FLAG="package -DskipTests -DskipRat" TEST_FLAG="verify -DskipRat" TEST_PROJECTS="-pl zeppelin-interpreter,zeppelin-zengine,zeppelin-server,zeppelin-display,spark-dependencies,spark -Dtest=org.apache.zeppelin.rest.*Test,org.apache.zeppelin.spark.* -DfailIfNoTests=false"

# Test selenium with spark module for 1.6.1
- jdk: "oraclejdk7"
env: TEST_SELENIUM="true" SCALA_VER="2.10" SPARK_VER="1.6.1" HADOOP_VER="2.3" PROFILE="-Pspark-1.6 -Phadoop-2.3 -Ppyspark -Pexamples" BUILD_FLAG="package -DskipTests -DskipRat" TEST_FLAG="verify -DskipRat" TEST_PROJECTS="-pl zeppelin-interpreter,zeppelin-zengine,zeppelin-server,zeppelin-display,spark-dependencies,spark -Dtest=org.apache.zeppelin.AbstractFunctionalSuite -DfailIfNoTests=false"
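Each Travis matrix entry above drives the build through environment variables (`PROFILE`, `BUILD_FLAG`, `TEST_FLAG`, `TEST_PROJECTS`). As a rough sketch of how those variables might be stitched into the two Maven invocations, assuming a simple wrapper (the helper names and exact argument order here are illustrative, not Zeppelin's real CI script):

```shell
#!/usr/bin/env bash
# Hypothetical helpers composing the Maven build and test commands from the
# matrix variables defined in .travis.yml. The variable names match the CI
# config; how they are combined is an assumption for illustration.
compose_build_cmd() {
  local profile="$1" build_flag="$2"
  echo "mvn ${build_flag} ${profile} -B"
}

compose_test_cmd() {
  local profile="$1" test_flag="$2" test_projects="$3"
  echo "mvn ${test_flag} ${profile} ${test_projects} -B"
}

# Example with values from the remaining spark-1.4 matrix entry:
compose_build_cmd "-Pspark-1.4 -Phadoop-2.3 -Ppyspark" "package -DskipTests -DskipRat"
```

Dropping the 1.1–1.3 entries removes three full build/test cycles from the matrix, which is the Travis simplification the PR description asks reviewers to check.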
5 changes: 1 addition & 4 deletions README.md
@@ -128,9 +128,6 @@ Available profiles are
-Pspark-1.6
-Pspark-1.5
-Pspark-1.4
-Pspark-1.3
-Pspark-1.2
-Pspark-1.1
-Pcassandra-spark-1.5
-Pcassandra-spark-1.4
-Pcassandra-spark-1.3
@@ -192,7 +189,7 @@ enable 3rd party vendor repository (cloudera)

##### `-Pmapr[version]` (optional)

For the MapR Hadoop Distribution, these profiles will handle the Hadoop version. As MapR allows different versions of Spark to be installed, you should specify which version of Spark is installed on the cluster by adding a Spark profile (`-Pspark-1.2`, `-Pspark-1.3`, etc.) as needed.
For the MapR Hadoop Distribution, these profiles will handle the Hadoop version. As MapR allows different versions of Spark to be installed, you should specify which version of Spark is installed on the cluster by adding a Spark profile (`-Pspark-1.6`, `-Pspark-2.0`, etc.) as needed.
The correct Maven artifacts can be found for every version of MapR at http://doc.mapr.com

Available profiles are
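With the `-Pspark-1.1` through `-Pspark-1.3` profiles gone from the README, a build must now pick one of the remaining profiles. A small sketch of the version-to-profile mapping this commit enforces, mirroring the `SPARK_VER`/`PROFILE` pairing in `.travis.yml` (the function itself is hypothetical and not part of Zeppelin's build tooling):

```shell
#!/usr/bin/env bash
# Hypothetical mapping of a Spark version string to the matching Maven
# profile flag. Versions 1.1.x-1.3.x are rejected, reflecting ZEPPELIN-1599.
spark_profile() {
  case "$1" in
    1.[123].*) echo "unsupported" ;;  # dropped by this commit
    1.4.*)     echo "-Pspark-1.4" ;;
    1.5.*)     echo "-Pspark-1.5" ;;
    1.6.*)     echo "-Pspark-1.6" ;;
    *)         echo "unsupported" ;;
  esac
}

spark_profile "1.6.1"   # -Pspark-1.6
```

Anyone still pinned to Spark 1.3 or earlier would need to stay on an older Zeppelin release rather than build from this commit onward.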
1 change: 1 addition & 0 deletions docs/install/upgrade.md
@@ -52,3 +52,4 @@ So, copying `notebook` and `conf` directory should be enough.
- From 0.7, we don't use `ZEPPELIN_JAVA_OPTS` as default value of `ZEPPELIN_INTP_JAVA_OPTS` and also the same for `ZEPPELIN_MEM`/`ZEPPELIN_INTP_MEM`. If users want to configure the jvm opts of the interpreter process, please set `ZEPPELIN_INTP_JAVA_OPTS` and `ZEPPELIN_INTP_MEM` explicitly. If you don't set `ZEPPELIN_INTP_MEM`, Zeppelin will set it to `-Xms1024m -Xmx1024m -XX:MaxPermSize=512m` by default.
- Mapping from `%jdbc(prefix)` to `%prefix` is no longer available. Instead, you can use %[interpreter alias] with multiple interpreter settings on the GUI.
- Usage of `ZEPPELIN_PORT` is not supported in ssl mode. Instead use `ZEPPELIN_SSL_PORT` to configure the ssl port. Value from `ZEPPELIN_PORT` is used only when `ZEPPELIN_SSL` is set to `false`.
- Support for Spark 1.1.x through 1.3.x is deprecated.
32 changes: 0 additions & 32 deletions spark/pom.xml
@@ -450,38 +450,6 @@
</build>

<profiles>
<profile>
<id>spark-1.1</id>
<dependencies>

</dependencies>
<properties>
<spark.version>1.1.1</spark.version>
<akka.version>2.2.3-shaded-protobuf</akka.version>
</properties>
</profile>

<profile>
<id>spark-1.2</id>
<dependencies>
</dependencies>
<properties>
<spark.version>1.2.1</spark.version>
</properties>
</profile>

<profile>
<id>spark-1.3</id>

<properties>
<spark.version>1.3.1</spark.version>
</properties>

<dependencies>
</dependencies>

</profile>

<profile>
<id>spark-1.4</id>
<properties>
2 changes: 1 addition & 1 deletion testing/downloadSpark.sh
@@ -76,7 +76,7 @@ if [[ ! -d "${SPARK_HOME}" ]]; then
echo "${SPARK_CACHE} does not have ${SPARK_ARCHIVE} downloading ..."

# download archive if not cached
if [[ "${SPARK_VERSION}" = "1.1.1" || "${SPARK_VERSION}" = "1.2.2" || "${SPARK_VERSION}" = "1.3.1" || "${SPARK_VERSION}" = "1.4.1" ]]; then
if [[ "${SPARK_VERSION}" = "1.4.1" ]]; then
echo "${SPARK_VERSION} being downloaded from archives"
# old spark versions (prior to 1.5.2) are only available on the archives
STARTTIME=`date +%s`
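The change to `downloadSpark.sh` collapses the archive check: with 1.1.1, 1.2.2, and 1.3.1 gone, only 1.4.1 still predates the 1.5.2 cutoff and must come from the Apache archives. A sketch of the resulting download-location choice (the mirror URL is a placeholder assumption; only the archive/mirror split reflects the script):

```shell
#!/usr/bin/env bash
# Sketch of the simplified location choice after this change: 1.4.1 is the
# only remaining version that must be fetched from the Apache archives;
# everything newer can use a regular mirror. The mirror URL below is a
# stand-in, not the script's literal value.
spark_download_base() {
  local version="$1"
  if [[ "${version}" == "1.4.1" ]]; then
    echo "https://archive.apache.org/dist/spark"
  else
    echo "https://mirror.example.org/spark"
  fi
}

spark_download_base "1.4.1"   # https://archive.apache.org/dist/spark
```

Compared with the old four-way `||` chain, a single equality test is easier to read and will disappear entirely once 1.4.x support is dropped too.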
