Companion objects in Scala

Please explain what a companion object is. I'd appreciate it if you could share an example too.
Feb 23 in Apache Spark by Sharan

1 answer to this question.


When a singleton object has the same name as a class, it is called that class's companion object. A companion object must be defined in the same source file as the class. Here is an example:

class Main {

    def sayHelloWorld(): Unit = {
        println("Hello World")
    }

}

object Main {

    def sayHi(): Unit = {
        println("Hi!")
    }

}

With this definition, you can both instantiate Main and call sayHelloWorld() on the instance, and call sayHi() on the companion object directly, like this:

val aMain: Main = new Main()

aMain.sayHelloWorld()

Main.sayHi()
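
For a bit more context, here is a minimal sketch of another common use of companion objects (the Person class and its methods below are just illustrative, not part of the question): the companion can access the private members of its class, so it is a natural place for factory methods such as apply:

class Person private (val name: String) {
    private val greeting = s"Hello, $name"
}

object Person {
    // The companion object can access Person's private constructor and private fields
    def apply(name: String): Person = new Person(name)

    def greet(p: Person): Unit = println(p.greeting)
}

val p = Person("Sharan")   // invokes Person.apply, even though the constructor is private
Person.greet(p)            // prints "Hello, Sharan"

Because new Person(...) is private, code outside this file has to go through the companion object, which is a common way to control how instances of a class are created.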
answered Feb 23 by Uma
