How should I prepare for the CCA 175 exam?

+4 votes
I want to get the Cloudera certification for their Hadoop and Spark Developer exam, i.e. CCA 175.

I have roughly 2 months of time to prepare.

Any thoughts on how to approach this?
May 10, 2018 in Career Counselling by Data_Nerd
• 2,390 points
4,011 views

3 answers to this question.

0 votes

Edureka has one of the most detailed and comprehensive courses on Apache Spark and Hadoop online. But before going for any online training, go through the following to get a basic grasp of the technology and its fundamentals.

To learn Spark and Hadoop, you need to start with the basics, i.e. Big Data and the emergence of Hadoop.

Moving forward, you need to focus on the main reason Hadoop became popular: HDFS (Hadoop Distributed File System).

Then take a deep dive into the Hadoop ecosystem and learn the various tools inside it along with their functionalities, so that you can create a solution tailored to your requirements.

The main components of HDFS are NameNode and DataNode.

NameNode

It is the master daemon that maintains and manages the DataNodes (slave nodes). It records the metadata of all the files stored in the cluster, e.g. location of blocks stored, the size of the files, permissions, hierarchy, etc. It records each and every change that takes place to the file system metadata.

For example, if a file is deleted in HDFS, the NameNode will immediately record this in the EditLog. It regularly receives a Heartbeat and a block report from all the DataNodes in the cluster to ensure that the DataNodes are live. It keeps a record of all the blocks in HDFS and in which nodes these blocks are stored.
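The metadata-plus-edit-log idea can be sketched in a few lines of plain Python. This is a toy illustration of the bookkeeping described above, not Hadoop code; the class and method names are made up for the example.

```python
# Toy sketch (not Hadoop code): how a NameNode-style service tracks file
# metadata and records every change in an edit log.
class ToyNameNode:
    def __init__(self):
        self.metadata = {}   # file path -> {"blocks": [...], "size": ...}
        self.edit_log = []   # ordered record of every metadata change

    def create_file(self, path, blocks, size):
        self.metadata[path] = {"blocks": blocks, "size": size}
        self.edit_log.append(("CREATE", path))

    def delete_file(self, path):
        # The deletion is recorded immediately, just as HDFS
        # records deletions in the EditLog.
        del self.metadata[path]
        self.edit_log.append(("DELETE", path))

nn = ToyNameNode()
nn.create_file("/data/file1.txt", blocks=["blk_1", "blk_2"], size=256)
nn.delete_file("/data/file1.txt")
print(nn.edit_log)  # [('CREATE', '/data/file1.txt'), ('DELETE', '/data/file1.txt')]
```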

DataNode

These are slave daemons which run on each slave machine. The actual data is stored on the DataNodes. They are responsible for serving read and write requests from the clients, and for creating, deleting, and replicating blocks based on the decisions taken by the NameNode.

For processing, we use YARN (Yet Another Resource Negotiator). The components of YARN are the ResourceManager and the NodeManager.

ResourceManager

It is a cluster-level component (one per cluster) and runs on the master machine. It manages resources and schedules applications running on top of YARN.

NodeManager

It is a node-level component (one per node) and runs on each slave machine. It is responsible for managing containers and monitoring resource utilization in each container. It also keeps track of node health and log management, and continuously communicates with the ResourceManager to remain up to date.

So, you can perform parallel processing on HDFS using MapReduce.
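The MapReduce pattern itself is easy to see in miniature. Below is a conceptual word-count sketch in plain Python; in real MapReduce the map and reduce phases run in parallel across the cluster's DataNodes, but the shape of the computation is the same.

```python
# Conceptual sketch of the MapReduce word-count pattern in plain Python.
from collections import defaultdict

def map_phase(lines):
    # Map: emit a (word, 1) pair for every word in every input line
    for line in lines:
        for word in line.split():
            yield (word, 1)

def reduce_phase(pairs):
    # Shuffle + reduce: sum the counts for each word
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

lines = ["big data big cluster", "big data"]
print(reduce_phase(map_phase(lines)))  # {'big': 3, 'data': 2, 'cluster': 1}
```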

Next come the concepts of Pig, Hive, and HBase.

Moving on to Spark, you need to learn Scala, as the Spark shell runs Scala by default.

  • Scala is a general-purpose programming language, designed to express common programming patterns in a concise, elegant, and type-safe way.
  • It supports both object-oriented and functional programming styles, thus helping programmers be more productive.

Further on, you need to learn about RDDs, which are the basic building blocks of any Spark program.

  • An RDD (Resilient Distributed Dataset) is a distributed memory abstraction that lets programmers perform in-memory computations on large clusters in a fault-tolerant manner.
  • RDDs are read-only collections of objects partitioned across a set of machines that can be rebuilt if a partition is lost.
  • RDDs can be created from multiple data sources, e.g. a Scala collection, the local file system, Hadoop, Amazon S3, an HBase table, etc.
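The chained, functional style of RDD transformations (map, filter, reduce) can be previewed without a cluster. Here is a sketch using plain Python lists as a stand-in for an RDD; the comments show where the equivalent Spark calls would go (Spark also offers this API from Python via PySpark).

```python
# RDD-style chained transformations, sketched with a plain Python list
# standing in for a distributed dataset.
data = list(range(1, 11))

# In a real Spark session: rdd = sc.parallelize(data)
squares = map(lambda x: x * x, data)           # like rdd.map(...)
evens = filter(lambda x: x % 2 == 0, squares)  # like rdd.filter(...)
total = sum(evens)                             # like rdd.reduce(lambda a, b: a + b)
print(total)  # 220
```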

Spark SQL is another main component of Spark, and is very important for processing structured data in an SQL-style format.
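The SQL-over-structured-data style is sketched below using Python's built-in sqlite3 as a stand-in, since a Spark session is not available here. In Spark SQL you would register a DataFrame as a temporary view and run the same kind of query through spark.sql(...); the table and column names are made up for the example.

```python
# SQL-style querying of structured data, sketched with sqlite3 as a
# stand-in for a Spark SQL session.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employees (name TEXT, dept TEXT, salary INT)")
conn.executemany("INSERT INTO employees VALUES (?, ?, ?)",
                 [("asha", "eng", 90), ("ben", "eng", 80), ("cara", "hr", 70)])

# The same SQL would run via spark.sql(...) against a registered view
rows = conn.execute(
    "SELECT dept, AVG(salary) FROM employees GROUP BY dept ORDER BY dept"
).fetchall()
print(rows)  # [('eng', 85.0), ('hr', 70.0)]
```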

Next comes the Machine Learning library of Spark, i.e. MLlib, and how it is used to run various ML algorithms through Spark (e.g. regression and k-means clustering).
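To see what k-means actually does before reaching for MLlib, here is one iteration of the algorithm on one-dimensional points in plain Python: assign each point to its nearest centroid, then move each centroid to the mean of its assigned points. MLlib parallelizes exactly this loop across the cluster.

```python
# One iteration of k-means on 1-D points, in plain Python.
def kmeans_step(points, centroids):
    clusters = {c: [] for c in centroids}
    for p in points:
        nearest = min(centroids, key=lambda c: abs(p - c))
        clusters[nearest].append(p)
    # New centroid = mean of the points assigned to it
    return [sum(ps) / len(ps) for c, ps in clusters.items() if ps]

points = [1.0, 2.0, 3.0, 10.0, 11.0, 12.0]
centroids = [2.0, 11.0]
print(kmeans_step(points, centroids))  # [2.0, 11.0] -- already converged
```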

Flume also plays an important role in ingesting streaming data, as does Kafka.

Spark itself can also process streaming data, which is done through Spark Streaming using DStreams (discretized streams).
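The core DStream idea is that a live stream is chopped into micro-batches, and an ordinary batch computation runs on each one. That can be simulated with a plain Python generator standing in for the stream; the function name here is made up for the sketch.

```python
# Micro-batching, the idea behind DStreams, simulated in plain Python.
def micro_batches(stream, batch_size):
    batch = []
    for event in stream:
        batch.append(event)
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:
        yield batch  # final partial batch

events = [3, 1, 4, 1, 5, 9, 2, 6]
# Each micro-batch is then processed like an ordinary (small) batch job:
sums = [sum(b) for b in micro_batches(events, batch_size=3)]
print(sums)  # [8, 15, 8]
```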

Edureka’s Apache Spark and Scala Certification training offers a detailed course specifically designed for the CCA 175 exam, covering all the topics mentioned above.

Edureka also provides a good list of Spark videos. I would recommend going through the Edureka Spark playlist as well as the Spark tutorial.

There are a lot of Hadoop Videos too.

Hope this helps.

answered May 10, 2018 by kurt_cobain
• 9,350 points
0 votes

CCA 175 is a very important exam for people who want to excel in Hadoop and Spark. Since you have limited time and the exam carries a lot of weight, you should go for Edureka's course on Hadoop and Spark. They have covered everything, and their instructors are well versed in the topics, with ample examples for better understanding.

answered Jun 26, 2018 by zombie
• 3,790 points
+2 votes

If you have to prepare for the Cloudera CCA175 exam and need help, DumpsStar is a good platform for you. You can pass the Cloudera exam easily by studying the Cloudera CCA175 exam dumps offered by DumpsStar, which are verified by Cloudera specialists. Its self-assessment tool evaluates your performance and points out weak areas. DumpsStar is a good website for online preparation material for the Cloudera CCA175 exam.

You can find related material for the CCA175 exam on DumpsStar that will help you clear your Cloudera CCA175 exam. DumpsStar is a good source for the available online exam material. You can easily get the Cloudera CCA175 exam dumps and pass your CCA175 exam comfortably. I recommend first taking a look at DumpsStar. This resource will help you understand the topics and the actual exam format, and where to focus your energy. DumpsStar's study material for the CCA175 exam has made things much easier.

answered Mar 14, 2019 by erichamm
• 180 points
