I want to see my public key by running the `cat <path>` command in Git Bash, but it says "No such file or directory".

0 votes
The error is:

$ cat /c/Users/krishankant sharma/.ssh/id_rsa.pub
cat: /c/Users/krishankant: No such file or directory
cat: sharma/.ssh/id_rsa.pub: No such file or directory
May 23 in Apache Spark by kk
• 120 points
114 views

1 answer to this question.

0 votes

Hey, @KK,

The fix may be as simple as enclosing the path in double quotes. The space in the directory name makes the shell split the path into two arguments, which is why `cat` reports two separate "No such file or directory" errors.

$ cat "/c/Users/krishankant sharma/.ssh/id_rsa.pub"
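A minimal sketch of why quoting matters, using a hypothetical path under /tmp with a space in it (the directory name and key content below are made up for the demonstration):

```shell
# Create a directory whose name contains a space, plus a demo key file.
mkdir -p "/tmp/krishankant sharma/.ssh"
echo "ssh-rsa AAAAexample demo-key" > "/tmp/krishankant sharma/.ssh/id_rsa.pub"

# Unquoted: the shell splits the path at the space, so cat receives
# two arguments and fails on both.
cat /tmp/krishankant sharma/.ssh/id_rsa.pub 2>/dev/null || echo "unquoted: fails"

# Quoted: the whole path is passed as a single argument.
cat "/tmp/krishankant sharma/.ssh/id_rsa.pub"

# Escaping the space with a backslash works as well.
cat /tmp/krishankant\ sharma/.ssh/id_rsa.pub
```

Either double quotes or backslash-escaping the space keeps the path intact; both behave the same way in Git Bash as in any POSIX shell.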

I hope this helps.

answered May 26 by Gitika
• 41,360 points
