
Installing Spark

Installing Spark is simple!

1. Move to the oracle user's home directory

(base) [oracle@centos ~]$ cd

2. Download the installation archive

(base) [oracle@centos ~]$ wget https://archive.apache.org/dist/spark/spark-2.0.2/spark-2.0.2-bin-hadoop2.7.tgz

3. Extract the archive

(base) [oracle@centos ~]$ tar xvzf spark-2.0.2-bin-hadoop2.7.tgz

4. Rename the extracted directory to spark

(base) [oracle@centos ~]$ mv spark-2.0.2-bin-hadoop2.7 spark

5. Open .bash_profile and add the following export line at the bottom

(base) [oracle@centos ~]$ vi .bash_profile

export PATH=$PATH:/home/oracle/spark/bin
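Optionally, you can also set SPARK_HOME in the same file so other tools can locate the installation (a common convention, not something this guide requires):

export SPARK_HOME=/home/oracle/spark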

6. Source .bash_profile

(base) [oracle@centos ~]$ source .bash_profile
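To confirm the updated PATH took effect, check where spark-shell resolves; given the steps above, it should print /home/oracle/spark/bin/spark-shell:

(base) [oracle@centos ~]$ which spark-shell
/home/oracle/spark/bin/spark-shell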

7. Launch the Spark shell right away!

(base) [oracle@centos ~]$ spark-shell

Then startup output like this appears!

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel).
21/01/09 08:54:43 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
21/01/09 08:54:43 WARN Utils: Your hostname, localhost.localdomain resolves to a loopback address: 127.0.0.1; using 10.0.2.15 instead (on interface enp0s3)
21/01/09 08:54:43 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
21/01/09 08:54:45 WARN SparkContext: Use an existing SparkContext, some configuration may not take effect.
Spark context Web UI available at http://10.0.2.15:4040
Spark context available as 'sc' (master = local[*], app id = local-1610200485296).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.0.2
      /_/
         
Using Scala version 2.11.8 (OpenJDK 64-Bit Server VM, Java 1.8.0_262)
Type in expressions to have them evaluated.
Type :help for more information.
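
As the banner says, the shell already provides a SparkContext as 'sc' and a SparkSession as 'spark'. A minimal sanity check you can type at the scala> prompt (a quick sketch; the res values are simply what these expressions evaluate to):

scala> sc.setLogLevel("WARN")   // adjust logging, as the startup message suggests

scala> sc.parallelize(1 to 100).sum()   // build an RDD of 1..100 and sum it
res0: Double = 5050.0

scala> spark.range(5).count()   // a tiny Dataset created via the SparkSession
res1: Long = 5

scala> :quit

If both expressions return results, the installation is working end to end; :quit exits the shell.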