Difference between Spark ML & Spark MLlib package

0 votes
While implementing the KMeans clustering algorithm, I noticed that there are two machine learning packages in Spark, i.e. ML and MLlib.

Can anyone help me understand the difference between the two packages?
Jul 4, 2018 in Apache Spark by code799
309 views

1 answer to this question.

0 votes
org.apache.spark.mllib is the older, RDD-based Spark machine learning API, while org.apache.spark.ml is the newer, DataFrame-based API that also supports ML Pipelines.

As of Spark 2.0, spark.mllib is in maintenance mode: it receives only bug fixes, and it is expected to be removed in a future major release, so new code should use spark.ml.
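
To see the difference in practice, here is a minimal sketch that clusters the same toy data with both APIs. The object name, column names, toy data and local[*] master are illustrative assumptions, not from the original question; the point is that spark.ml works on a DataFrame with a vector "features" column, while spark.mllib works on an RDD of its own Vector type.

import org.apache.spark.sql.SparkSession
import org.apache.spark.ml.clustering.{KMeans => MLKMeans}
import org.apache.spark.ml.feature.VectorAssembler
import org.apache.spark.mllib.clustering.{KMeans => MLlibKMeans}
import org.apache.spark.mllib.linalg.Vectors

object KMeansBothAPIs {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("KMeansBothAPIs")
      .master("local[*]")          // assumption: local run for illustration
      .getOrCreate()
    import spark.implicits._

    // Toy data with two obvious clusters (illustrative only).
    val df = Seq((0.0, 0.0), (1.0, 1.0), (8.0, 9.0), (9.0, 8.0)).toDF("x", "y")

    // org.apache.spark.ml: DataFrame-based API, composes with Pipelines.
    val features = new VectorAssembler()
      .setInputCols(Array("x", "y"))
      .setOutputCol("features")
      .transform(df)
    val mlModel = new MLKMeans().setK(2).setSeed(1L).fit(features)
    mlModel.transform(features).show()            // adds a "prediction" column

    // org.apache.spark.mllib: RDD-based API, now in maintenance mode.
    val rdd = df.rdd.map(row => Vectors.dense(row.getDouble(0), row.getDouble(1)))
    val mllibModel = MLlibKMeans.train(rdd, 2, 20) // k = 2, maxIterations = 20
    mllibModel.clusterCenters.foreach(println)

    spark.stop()
  }
}

A practical consequence of the difference: the fitted spark.ml estimator can be dropped straight into an ML Pipeline alongside feature transformers, which the RDD-based spark.mllib API does not support.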
answered Jul 5, 2018 by Shubham
• 13,290 points
