How to convert a JSON file with values in single quotes to quoteless values?

0 votes
I have a JSON object with values like this:

"Territory": "'CA'",

I need to get rid of the single quotes (' ') around CA.

After processing, I am moving the file into an S3 bucket.

I have already established the connection. I need help with the PySpark code to run on the cluster that can make this change.
Oct 4 in Apache Spark by anonymous
• 150 points
107 views

1 answer to this question.

0 votes

You can do this by disabling the writer's quoting: set the quote character to the null character so the writer never wraps values in double quotes. Try the following:

dataframe.write \
    .option("header", True) \
    .option("quote", "\u0000") \
    .csv("Path to csv file")

What this does is change the "quote" character to the null character (Unicode \u0000), so values are written without surrounding quotes.
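Note that if the single quotes are literally part of the data, as in "'CA'", turning off CSV quoting alone won't remove them; you would also need to strip them from the values before writing. A minimal sketch of that cleanup in plain Python (no Spark required) — the record and key below are taken from the question, everything else is illustrative:

```python
import json
import re

# Sample record with a value wrapped in literal single quotes,
# as in the question.
record = {"Territory": "'CA'"}

def strip_quotes(value):
    """Remove one pair of surrounding single quotes, if present."""
    if isinstance(value, str):
        return re.sub(r"^'(.*)'$", r"\1", value)
    return value

cleaned = {key: strip_quotes(val) for key, val in record.items()}
print(json.dumps(cleaned))  # {"Territory": "CA"}
```

In PySpark, the equivalent per-column cleanup would use regexp_replace from pyspark.sql.functions with the same pattern (replacement backreference written as $1 in Spark's Java-regex syntax) before calling dataframe.write.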

answered Oct 4 by Jisha

Related Questions In Apache Spark

+1 vote
1 answer

How to convert JSON file to AVRO file and vice versa

Try including the package while starting the ...READ MORE

answered Aug 26 in Apache Spark by Karan
122 views
0 votes
1 answer

How can I write a text file in HDFS not from an RDD, in Spark program?

Yes, you can go ahead and write ...READ MORE

answered May 29, 2018 in Apache Spark by Shubham
• 13,310 points
1,595 views
0 votes
1 answer

How to convert rdd object to dataframe in spark

SqlContext has a number of createDataFrame methods ...READ MORE

answered May 30, 2018 in Apache Spark by nitinrawat895
• 10,730 points
1,592 views
+13 votes
2 answers

Git management technique when there are multiple customers and need multiple customization?

Consider this - In 'extended' Git-Flow, (Git-Multi-Flow, ...READ MORE

answered Mar 26, 2018 in DevOps & Agile by DragonLord999
• 8,380 points
218 views
0 votes
1 answer

How to remove the elements with a key present in any other RDD?

Hey, You can use the subtractByKey () function to ...READ MORE

answered Jul 22 in Apache Spark by Gitika
• 25,360 points
127 views
0 votes
1 answer

How to read a data from text file in Spark?

Hey, You can try this: from pyspark import SparkContext SparkContext.stop(sc) sc ...READ MORE

answered Aug 6 in Apache Spark by Gitika
• 25,360 points
344 views