Difference between the Spark ML and Spark MLlib packages

0 votes
While implementing the KMeans clustering algorithm, I noticed that Spark has two machine learning packages, i.e., ML and MLlib.

Can anyone help me understand the difference between the two packages?
Jul 4, 2018 in Apache Spark by code799
1,005 views

1 answer to this question.

0 votes
org.apache.spark.mllib is the original, RDD-based API, while org.apache.spark.ml is the newer, DataFrame-based API.

As of Spark 2.0, the RDD-based spark.mllib API is in maintenance mode: it still works and receives bug fixes, but new features go into spark.ml, and the RDD-based API is expected to be removed in a future major release. For new code, prefer spark.ml, whose Pipeline, Transformer, and Estimator abstractions work directly on DataFrames.
answered Jul 5, 2018 by Shubham
• 13,480 points
