1. Running the pi class from hadoop-mapreduce-examples-2.8.1.jar
$ hadoop jar /usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.8.1.jar pi 10 1000
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/local/hadoop-2.8.1/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/hive-0.8.1/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Number of Maps  = 10
Samples per Map = 1000
Wrote input for Map #0
Wrote input for Map #1
Wrote input for Map #2
Wrote input for Map #3
Wrote input for Map #4
Wrote input for Map #5
Wrote input for Map #6
Wrote input for Map #7
Wrote input for Map #8
Wrote input for Map #9
Starting Job
17/10/13 12:44:58 INFO client.RMProxy: Connecting to ResourceManager at master/192.168.10.30:8032
17/10/13 12:44:59 INFO input.FileInputFormat: Total input files to process : 10
17/10/13 12:44:59 INFO mapreduce.JobSubmitter: number of splits:10
17/10/13 12:44:59 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1507865821865_0001
17/10/13 12:45:00 INFO impl.YarnClientImpl: Submitted application application_1507865821865_0001
17/10/13 12:45:00 INFO mapreduce.Job: The url to track the job: http://master:8088/proxy/application_1507865821865_0001/
17/10/13 12:45:00 INFO mapreduce.Job: Running job: job_1507865821865_0001
17/10/13 12:45:08 INFO mapreduce.Job: Job job_1507865821865_0001 running in uber mode : false
17/10/13 12:45:08 INFO mapreduce.Job:  map 0% reduce 0%
17/10/13 12:45:14 INFO mapreduce.Job:  map 20% reduce 0%
17/10/13 12:45:15 INFO mapreduce.Job:  map 100% reduce 0%
17/10/13 12:45:18 INFO mapreduce.Job:  map 100% reduce 100%
17/10/13 12:45:19 INFO mapreduce.Job: Job job_1507865821865_0001 completed successfully
17/10/13 12:45:19 INFO mapreduce.Job: Counters: 49
    File System Counters
        FILE: Number of bytes read=226
        FILE: Number of bytes written=1507099
        FILE: Number of read operations=0
        FILE: Number of large read operations=0
        FILE: Number of write operations=0
        HDFS: Number of bytes read=2670
        HDFS: Number of bytes written=215
        HDFS: Number of read operations=43
        HDFS: Number of large read operations=0
        HDFS: Number of write operations=3
    Job Counters
        Launched map tasks=10
        Launched reduce tasks=1
        Data-local map tasks=10
        Total time spent by all maps in occupied slots (ms)=48025
        Total time spent by all reduces in occupied slots (ms)=2513
        Total time spent by all map tasks (ms)=48025
        Total time spent by all reduce tasks (ms)=2513
        Total vcore-milliseconds taken by all map tasks=48025
        Total vcore-milliseconds taken by all reduce tasks=2513
        Total megabyte-milliseconds taken by all map tasks=49177600
        Total megabyte-milliseconds taken by all reduce tasks=2573312
    Map-Reduce Framework
        Map input records=10
        Map output records=20
        Map output bytes=180
        Map output materialized bytes=280
        Input split bytes=1490
        Combine input records=0
        Combine output records=0
        Reduce input groups=2
        Reduce shuffle bytes=280
        Reduce input records=20
        Reduce output records=0
        Spilled Records=40
        Shuffled Maps =10
        Failed Shuffles=0
        Merged Map outputs=10
        GC time elapsed (ms)=1297
        CPU time spent (ms)=5270
        Physical memory (bytes) snapshot=3010437120
        Virtual memory (bytes) snapshot=21824602112
        Total committed heap usage (bytes)=2163736576
    Shuffle Errors
        BAD_ID=0
        CONNECTION=0
        IO_ERROR=0
        WRONG_LENGTH=0
        WRONG_MAP=0
        WRONG_REDUCE=0
    File Input Format Counters
        Bytes Read=1180
    File Output Format Counters
        Bytes Written=97
Job Finished in 21.033 seconds
Estimated value of Pi is 3.14080000000000000000
* The test uses the example jar that ships with the Hadoop installation. The two arguments to pi (10 and 1000) are the number of map tasks and the number of samples per map, as echoed at the top of the output ("Number of Maps = 10", "Samples per Map = 1000").
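For reference, the pi example estimates π by sampling: each map task generates points in the unit square, counts how many land inside the quarter circle, and a single reduce task aggregates the counts (the Hadoop example itself uses a quasi-Monte Carlo method for the sampling). The standalone Java sketch below only illustrates this arithmetic on one machine with plain pseudo-random sampling; it is not the distributed implementation, and the class name PiEstimate is made up for this illustration.

import java.util.Random;

public class PiEstimate {
    public static void main(String[] args) {
        // Same total sample count as "pi 10 1000": 10 maps x 1000 samples each.
        long samples = 10 * 1000L;
        long inside = 0;
        Random random = new Random();

        for (long i = 0; i < samples; i++) {
            // Pick a random point in the unit square [0, 1) x [0, 1).
            double x = random.nextDouble();
            double y = random.nextDouble();
            // Count points that fall inside the quarter circle of radius 1.
            if (x * x + y * y <= 1.0) {
                inside++;
            }
        }

        // Ratio of areas: quarter circle / unit square = pi / 4.
        double pi = 4.0 * inside / samples;
        System.out.println("Estimated value of Pi is " + pi);
    }
}

With only 10,000 samples the estimate is coarse, which is why the job above reports 3.1408 rather than a closer approximation; increasing the samples-per-map argument tightens the estimate at the cost of longer map tasks.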