At what point does Scala use default arguments?

Can anyone tell me at what point Scala uses default arguments?
Jul 25, 2019 in Apache Spark by Dhiraj

1 answer to this question.


Hi,

If we omit an argument in a function call, Scala uses the default value we provided for it. For example:

scala> def func(a: Int, b: Int = 7): Unit = {
     |   println(a * b)
     | }
func: (a: Int, b: Int)Unit

scala> func(2, 5)
10

scala> func(2)
14

Note that parameters with default values should come after all non-default parameters. If a default parameter appears before a required one, the caller cannot omit it positionally and must use named arguments instead.
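A minimal sketch of both cases (my own example, not from the answer above; the names `area` and `scaled` are hypothetical):

```scala
object DefaultArgsDemo {
  // Default parameter last: plain positional calls work.
  def area(width: Int, height: Int = 10): Int = width * height

  // Default parameter first: the caller must name the required argument
  // to skip the default, e.g. scaled(value = 5).
  def scaled(factor: Int = 2, value: Int): Int = factor * value

  def main(args: Array[String]): Unit = {
    println(area(3, 4))        // both arguments given: 12
    println(area(3))           // height defaults to 10: 30
    println(scaled(value = 5)) // factor defaults to 2: 10
  }
}
```

Calling `scaled(5)` would not compile, since a positional `5` binds to `factor` and leaves `value` missing; this is why defaults are usually placed last.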

answered Jul 25, 2019 by Gitika
• 25,440 points
