I want to see my public key by running `cat <path>` in Git Bash, but it says "No such file or directory".

0 votes
The error is:

$ cat /c/Users/krishankant sharma/.ssh/id_rsa.pub
cat: /c/Users/krishankant: No such file or directory
cat: sharma/.ssh/id_rsa.pub: No such file or directory
May 23 in Apache Spark by kk
• 120 points
73 views

1 answer to this question.

0 votes

Hey, @KK,

The fix may be as simple as enclosing the path in double quotes. Because the path contains an unquoted space, the shell splits it into two separate arguments; you can see this in the error output, where `cat` tries to open `/c/Users/krishankant` and `sharma/.ssh/id_rsa.pub` as two different files. Quoting the whole path keeps it as a single argument:

$ cat "/c/Users/krishankant sharma/.ssh/id_rsa.pub"

I hope this will help you.
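For anyone who wants to try this out, here is a small self-contained sketch of the two usual ways to handle a space in a path: double quotes and a backslash escape. It uses a made-up `/tmp/demo user` directory and placeholder key text, not the asker's actual path or key.

```shell
# Create a hypothetical directory whose name contains a space.
mkdir -p "/tmp/demo user/.ssh"
echo "ssh-rsa AAAAB3-example demo-key" > "/tmp/demo user/.ssh/id_rsa.pub"

# Option 1: double quotes keep the whole path as one argument.
cat "/tmp/demo user/.ssh/id_rsa.pub"

# Option 2: escape the space with a backslash instead.
cat /tmp/demo\ user/.ssh/id_rsa.pub
```

Both commands print the same file; without the quotes or the backslash, the shell would split the path at the space and `cat` would report two missing files, exactly as in the error above.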

answered May 26 by Gitika
• 31,430 points
