I have configured a Hadoop cluster on my local system. I want to know the number of blocks a file occupies in HDFS. How can I do that?
You can use the `hadoop fsck` command to view the blocks for a specific file:
$ hadoop fsck /path/to/file -files -blocks
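Running the command requires a live cluster, but you can pull the block count out of the report with standard text tools. As a minimal sketch, the snippet below parses a sample `fsck` report (the output shown is illustrative; the exact layout of the summary line may vary between Hadoop versions, so verify the field positions on yours):

```shell
# Illustrative fsck output for a ~256 MB file with a 128 MB block size
# (sample text only -- on a real cluster, capture it with:
#   fsck_output=$(hadoop fsck /path/to/file -files -blocks) )
fsck_output='/path/to/file 268435456 bytes, 2 block(s):  OK
0. BP-1234:blk_1073741825_1001 len=134217728 repl=3
1. BP-1234:blk_1073741826_1002 len=134217728 repl=3
Total blocks (validated):	2 (avg. block size 134217728 B)'

# Extract the count from the "Total blocks" summary line
# (field 4 on that line, assuming the format above)
blocks=$(printf '%s\n' "$fsck_output" | awk '/Total blocks/ {print $4}')
echo "$blocks"
```

Here the file spans two blocks because its size (256 MB) is twice the default 128 MB block size; in general, a file of size S with block size B occupies ceil(S / B) blocks.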