
Download spark 2.1.0

#kylin.engine.spark-conf.spark.executor.extraJavaOptions=-Dhdp.version=current

For running on the Hortonworks platform, you need to specify "hdp.version" as a Java option for the Yarn containers, so please uncomment the last three lines in kylin.properties.

Besides, in order to avoid repeatedly uploading the Spark jars to Yarn, you can manually do that once and then configure the jar's HDFS location. Please note, the HDFS location needs to be a fully qualified name.

After Kylin is started, access the Kylin web GUI, edit the "kylin_sales" cube, and in the "Advanced Setting" page change the "Cube Engine" from "MapReduce" to "Spark".

Click "Next" to the "Configuration Overwrites" page, then click "+Property" to add the property "kylin.engine.spark.rdd-partition-cut-mb" with the value "500" (reasons below).
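
For reference, here is a sketch of how those lines might look in $KYLIN_HOME/conf/kylin.properties once uncommented for HDP, plus an illustrative jar location; the namenode host, port and jar name below are placeholders I'm assuming, not values from this post:

kylin.engine.spark-conf.spark.driver.extraJavaOptions=-Dhdp.version=current
kylin.engine.spark-conf.spark.yarn.am.extraJavaOptions=-Dhdp.version=current
kylin.engine.spark-conf.spark.executor.extraJavaOptions=-Dhdp.version=current
# fully qualified HDFS location of the pre-uploaded Spark jars (placeholder host and path)
kylin.engine.spark-conf.spark.yarn.archive=hdfs://namenode:8020/kylin/spark/spark-libs.jar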


#kylin.engine.spark-conf.spark.yarn.am.extraJavaOptions=-Dhdp.version=current
#kylin.engine.spark-conf.spark.driver.extraJavaOptions=-Dhdp.version=current
kylin.engine.spark-conf.spark.history.fs.logDirectory=hdfs\:///kylin/spark-history
kylin.engine.spark-conf.spark.eventLog.dir=hdfs\:///kylin/spark-history

Below is the default configuration, which is also the minimal config for a sandbox (1 executor with 1GB memory); in a normal cluster you usually need many more executors, each with at least 4GB memory and 2 cores. These properties will be extracted and applied when Kylin submits the Spark job. For example, if you configure "kylin.engine.spark-conf.spark.executor.memory=4G", Kylin will use "--conf spark.executor.memory=4G" as a parameter when executing "spark-submit". Before you run Spark cubing, it is suggested to take a look at these configurations and customize them according to your cluster.
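
For orientation, the default sandbox block referred to above typically looks roughly like this in kylin.properties for Kylin 2.x; this is a sketch based on the usual shipped defaults, so double-check it against your own file:

kylin.engine.spark-conf.spark.master=yarn
kylin.engine.spark-conf.spark.submit.deployMode=cluster
kylin.engine.spark-conf.spark.yarn.queue=default
kylin.engine.spark-conf.spark.executor.memory=1G
kylin.engine.spark-conf.spark.executor.instances=1
kylin.engine.spark-conf.spark.eventLog.enabled=true
kylin.engine.spark-conf.spark.eventLog.dir=hdfs\:///kylin/spark-history
kylin.engine.spark-conf.spark.history.fs.logDirectory=hdfs\:///kylin/spark-history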


Kylin embeds a Spark binary (v2.1.0) in $KYLIN_HOME/spark; all the Spark configurations can be managed in $KYLIN_HOME/conf/kylin.properties with the prefix "kylin.engine.spark-conf.".
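
If you want to double-check which Spark version is actually bundled, a quick sanity check (assuming the standard Spark layout under $KYLIN_HOME/spark) is:

$KYLIN_HOME/spark/bin/spark-submit --version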


If this property isn't set, Kylin will use the directory that "hive-site.xml" is located in; since that folder may have no "hbase-site.xml", you will get an HBase/ZK connection error in Spark.

kylin.env.hadoop-conf-dir=/usr/local/apache-kylin-2.1.0-bin-hbase1x/hadoop-conf
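
One way to prepare such a folder, sketched for an HDP-style layout (the /etc/... client config locations are an assumption on my part, not something stated in this post), is to create the directory and link in the configs that Spark needs, including hbase-site.xml:

mkdir -p /usr/local/apache-kylin-2.1.0-bin-hbase1x/hadoop-conf
ln -s /etc/hadoop/conf/core-site.xml /usr/local/apache-kylin-2.1.0-bin-hbase1x/hadoop-conf/
ln -s /etc/hadoop/conf/hdfs-site.xml /usr/local/apache-kylin-2.1.0-bin-hbase1x/hadoop-conf/
ln -s /etc/hadoop/conf/yarn-site.xml /usr/local/apache-kylin-2.1.0-bin-hbase1x/hadoop-conf/
ln -s /etc/hive/conf/hive-site.xml /usr/local/apache-kylin-2.1.0-bin-hbase1x/hadoop-conf/
ln -s /etc/hbase/conf/hbase-site.xml /usr/local/apache-kylin-2.1.0-bin-hbase1x/hadoop-conf/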











