How to write a Kafka connector to integrate with the Facebook API?


I am trying to write a Kafka connector to fetch data from Facebook. My problems are:

1. How do I fetch data through the Facebook API without exceeding the rate limit Facebook imposes? The connector should call the Facebook API at a fixed time interval so that the number of hits stays within the limit.

2. Each user hits the API with their own access token, so users can't share the same topic partition. How do I handle this scenario? Do we have to create one partition per user?
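(On question 2: a common alternative to one partition per user is to key each record by user ID. Kafka's default partitioner sends records with the same key to the same partition, so one user's events stay ordered without dedicating a partition to each user. A minimal sketch of that idea, assuming the `partition_for` helper below, which is illustrative only — Kafka's Java client actually uses murmur2, not MD5:)

```python
import hashlib

def partition_for(user_id: str, num_partitions: int) -> int:
    """Mimic Kafka's key-based partitioning: hash the key, mod the
    partition count. Same key -> same partition, every time."""
    digest = hashlib.md5(user_id.encode("utf-8")).digest()
    return int.from_bytes(digest[:4], "big") % num_partitions

# The same user always maps to the same partition:
assert partition_for("user-42", 6) == partition_for("user-42", 6)
```

In a real connector you would simply set the record key to the user ID and let Kafka's producer do this for you.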

I read a few guides and blogs to understand Kafka connect and write a connector.

Confluent: https://docs.confluent.io/current/connect/index.html

Kafka documentation: https://kafka.apache.org/documentation/#connect

Conceptually, they gave me an idea of what Kafka Connect is, how it works, and which classes matter when writing a connector. But I am still confused about how to practically write and run one. I tried to find a step-by-step development guide but couldn't.

Could you suggest any tutorial or PDF with a detailed, step-by-step guide to writing and running a Kafka connector?
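(On question 1: in a Kafka Connect source connector, the framework calls your `SourceTask.poll()` repeatedly, so rate limiting usually just means spacing those calls out. The shape of that loop, as a self-contained Python sketch — real connectors are Java, and `fetch` here stands in for a hypothetical Facebook API call:)

```python
import time

def throttled_poll(fetch, interval_seconds, iterations):
    """Source-task-style loop: fetch one batch per iteration, then wait
    before the next call so the API hit rate stays under the quota."""
    records = []
    for _ in range(iterations):
        records.extend(fetch())
        time.sleep(interval_seconds)
    return records
```

With, say, `interval_seconds=60`, the API is hit at most once per minute regardless of how often the framework wants data; in Java you would put the same sleep (or a timestamp check) inside `poll()`.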

Apr 29 in Apache Kafka by Neeraj
• 120 points

retagged Apr 29 by Omkar

1 answer to this question.


If you know the API rate limit Facebook enforces, you can cap the number of objects returned per request by adding a .limit() argument to any field or edge. For example:

curl -i -X GET \
  "https://graph.facebook.com/{your-user-id}
    ?fields=feed.limit(3)
    &access_token={your-access-token}"

The .limit(3) caps each page of results at three objects. Replace 3 with a value that keeps your request volume within the limit Facebook has set, and I think it should work.
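The same request can be built programmatically inside the connector. A sketch, assuming a hypothetical helper `feed_request_url` that mirrors the curl call above (it is not part of any Facebook SDK):

```python
from urllib.parse import urlencode

GRAPH_URL = "https://graph.facebook.com"

def feed_request_url(user_id: str, access_token: str, page_size: int) -> str:
    """Build the Graph API feed request above, with feed.limit(page_size)
    controlling how many objects come back per page."""
    params = urlencode({
        "fields": f"feed.limit({page_size})",
        "access_token": access_token,
    })
    return f"{GRAPH_URL}/{user_id}?{params}"
```

Each task could combine this with a poll interval so that both the page size and the call frequency stay within the quota.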

answered Apr 29 by Kumar
