error: identifier expected but integer literal found.


I got the error above while creating an array to build an RDD from an existing RDD. Can anyone tell me how to resolve this?

asked Jul 3 in Apache Spark by NItin

edited Jul 4 by Gitika

1 answer to this question.


Hi,

You can resolve this error with a simple change. Instead of that command, run the line below:

scala> var a1 = Array(1,2,3,4,5,6,7,8,9,10)

With this command you will get the expected array output.
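As a rough follow-up sketch (assuming the default SparkContext sc that the spark-shell provides), you could then turn that array into an RDD and derive a new RDD from the existing one, which is what the question was aiming at:

scala> val a1 = Array(1,2,3,4,5,6,7,8,9,10)
scala> val rdd1 = sc.parallelize(a1)   // build an RDD from the array
scala> val rdd2 = rdd1.map(_ * 2)      // new RDD derived from the existing RDD
scala> rdd2.collect()                  // Array(2, 4, 6, 8, 10, 12, 14, 16, 18, 20)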

answered Jul 4 by Gitika
