compression - How do I read Snappy compressed files on HDFS without using Hadoop? -
I'm storing files on HDFS in Snappy compression format. I'd like to be able to examine these files on my local Linux file system to make sure that the Hadoop process that created them has performed correctly.
When I copy them locally and attempt to decompress them with the Google standard library, it tells me that the file is missing the Snappy identifier. When I try to work around that by inserting a Snappy identifier, it messes up the checksum.
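For reference, the failing attempt looks roughly like the sketch below. The paths and file names are hypothetical placeholders, and it assumes the python-snappy command-line tool is installed; the point is that Hadoop's Snappy codec does not write the framing-format stream identifier that the standalone tool expects:

hadoop fs -copyToLocal /user/me/output/part-00000.snappy .
# fails with an error about a missing snappy (stream) identifier
python -m snappy -d part-00000.snappy part-00000.txt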
What can I use to read these files without having to write a separate Hadoop program or pass them through Hive?
I found out that you can use the following command to read the contents of a Snappy-compressed file on HDFS:
hadoop fs -text filename
If the intent is to download the file in text format for additional examination and processing, the output of that command can be piped to a file on the local system. You can also use head to view the first few lines of the file.
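For example (the HDFS and local paths below are hypothetical placeholders):

# decompress to a local text file for further processing
hadoop fs -text /user/me/output/part-00000.snappy > /tmp/part-00000.txt
head /tmp/part-00000.txt

# or just peek at the first few lines directly
hadoop fs -text /user/me/output/part-00000.snappy | head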