1. Check the current IP address (ifconfig)

$ ifconfig
ens160    Link encap:Ethernet  HWaddr 12:0c:29:3a:id:ab
          inet addr:192.168.10.30  Bcast:192.168.10.255  Mask:255.255.255.0
          UP BROADCAST RUNNING MULTICAST  MTU:1500  Metric:1
          RX packets:593671 errors:0 dropped:0 overruns:0 frame:0
          TX packets:297404 errors:0 dropped:0 overruns:0 carrier:0
          collisions:0 txqueuelen:1000
          RX bytes:460588158 (460.5 MB)  TX bytes:112335278 (112.3 MB)
 
lo        Link encap:Local Loopback
          inet addr:127.0.0.1  Mask:255.0.0.0
          UP LOOPBACK RUNNING  MTU:65536  Metric:1
          RX packets:91186 errors:0 dropped:0 overruns:0 frame:0
          TX packets:91186 errors:0 dropped:0 overruns:0 carrier:0
          collisions:0 txqueuelen:1
          RX bytes:12822719 (12.8 MB)  TX bytes:12822719 (12.8 MB)
 

* The values shown above are arbitrary example values.
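
On newer Ubuntu releases ifconfig is no longer installed by default (it comes from the net-tools package), so the same information is often read with the iproute2 ip tool instead. A minimal equivalent, assuming the interface name ens160 from the output above:

$ ip addr show ens160
$ ip route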



2. Edit the interfaces file

$ sudo vi /etc/network/interfaces
 


2-1. interfaces before modification

# This file describes the network interfaces available on your system
# and how to activate them. For more information, see interfaces(5).
 
source /etc/network/interfaces.d/*
 
# The loopback network interface
auto lo
iface lo inet loopback
 
# The primary network interface
auto ens160
iface ens160 inet dhcp
 


2-2. interfaces after modification

# This file describes the network interfaces available on your system
# and how to activate them. For more information, see interfaces(5).
 
source /etc/network/interfaces.d/*
 
# The loopback network interface
auto lo
iface lo inet loopback
 
# The primary network interface
auto ens160
iface ens160 inet static
        address 192.168.10.30
        netmask 255.255.255.0
        broadcast 192.168.10.255
        gateway 192.168.10.1
        network 192.168.10.0
        dns-nameservers 8.8.8.8
 

* Modify the file as shown above and save it.
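
To apply the new static address without rebooting, the interface can be bounced with the ifupdown tools. This is only a sketch, assuming the classic ifupdown setup that reads /etc/network/interfaces and the interface name ens160 used above; if you are connected over SSH, keep both commands on one line so the session comes back after the address is reassigned:

$ sudo ifdown ens160 && sudo ifup ens160
$ ip addr show ens160     # confirm 192.168.10.30/24 is now assigned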



3. Restart networking.service

$ sudo systemctl restart networking.service
 
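
If the restart succeeded, the service should report active and the configured gateway should answer. A quick check using the gateway address set above:

$ systemctl status networking.service
$ ping -c 3 192.168.10.1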



4. Reboot the system

$ sudo reboot now
 
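
After the reboot it is worth confirming that the static settings survived and that the dns-nameservers entry was picked up (on Ubuntu 16.04 the resolvconf package writes it into /etc/resolv.conf). A short check, assuming the values configured above:

$ ip addr show ens160        # 192.168.10.30/24 on ens160
$ ip route                   # default via 192.168.10.1 dev ens160
$ cat /etc/resolv.conf       # should list nameserver 8.8.8.8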





Hadoop fully distributed mode

 



* Hadoop installation

-> If Hadoop is not installed, click "Hadoop installation" above.


* MySQL installation

-> If MySQL is not installed, click "MySQL installation" above.



1. Connect to the MySQL server.

$ mysql -u root -p
Enter password:
Welcome to the MySQL monitor.  Commands end with ; or \g.
Your MySQL connection id is 10
Server version: 5.7.19-0ubuntu0.16.04.(Ubuntu)
 
Copyright (c) 2000, 2017, Oracle and/or its affiliates. All rights reserved.
 
Oracle is a registered trademark of Oracle Corporation and/or its
affiliates. Other names may be trademarks of their respective
owners.
 
Type 'help;' or '\h' for help. Type '\c' to clear the current input statement.
 
mysql> 
 



2. Create a user.

mysql> CREATE USER 'hive'@'%' IDENTIFIED BY 'pass';

* Replace pass with the password for this account.
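
The '%' host part lets this account log in from any machine, which is convenient when Hive runs on a different node than MySQL. If the metastore and MySQL live on the same host, a more restrictive variant is possible (a sketch; adjust the host to your topology):

mysql> CREATE USER 'hive'@'localhost' IDENTIFIED BY 'pass';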



3. Grant privileges to the created user.

mysql> GRANT ALL ON *.* TO 'hive'@'%' IDENTIFIED BY 'pass';

* Replace pass with the password for this account.
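
GRANT ALL ON *.* gives the account full access to every database on the server. For a dedicated metastore account it is usually enough to grant privileges on the hive database only; also note that the IDENTIFIED BY clause inside GRANT is deprecated in MySQL 5.7 and removed in 8.0, where the password is set by CREATE USER instead. A narrower sketch:

mysql> GRANT ALL PRIVILEGES ON hive.* TO 'hive'@'%';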



4. Run the following so that the granted privileges take effect.

mysql> flush privileges;
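
You can verify that the privileges were applied to the account created above:

mysql> SHOW GRANTS FOR 'hive'@'%';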



5. Restart the MySQL server

$ sudo /etc/init.d/mysql restart
[ ok ] Restarting mysql (via systemctl): mysql.service.
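
As the output hints, the init.d script simply forwards to systemd on Ubuntu 16.04, so the restart can also be done (and checked) with systemctl directly:

$ sudo systemctl restart mysql
$ sudo systemctl status mysql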



6. Connect to MySQL with the newly created account.

$ mysql -u hive -p
Enter password:
Welcome to the MySQL monitor.  Commands end with ; or \g.
Your MySQL connection id is 13
Server version: 5.7.19-0ubuntu0.16.04.(Ubuntu)
 
Copyright (c) 2000, 2017, Oracle and/or its affiliates. All rights reserved.
 
Oracle is a registered trademark of Oracle Corporation and/or its
affiliates. Other names may be trademarks of their respective
owners.
 
Type 'help;' or '\h' for help. Type '\c' to clear the current input statement.
 



7. Create the hive database.

mysql> create database hive;
Query OK, 1 row affected (0.02 sec)
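
Before pointing Hive's metastore at this database, you can confirm it exists from the hive account:

mysql> show databases;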





1. Download Hive.

$ wget http://apache.mirror.cdnetworks.com/hive/hive-2.1.1/apache-hive-2.1.1-bin.tar.gz
--2017-10-13 18:06:32--  http://apache.mirror.cdnetworks.com/hive/hive-2.1.1/apache-hive-2.1.1-bin.tar.gz
Resolving apache.mirror.cdnetworks.com (apache.mirror.cdnetworks.com)... 14.0.101.165
Connecting to apache.mirror.cdnetworks.com (apache.mirror.cdnetworks.com)|14.0.101.165|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 149756462 (143M) [application/x-gzip]
Saving to: ‘apache-hive-2.1.1-bin.tar.gz’
 
apache-hive-2.1.1-bin.t  44%[===========>                 ]  63.87M   125KB/s    in 39s
 
2017-10-13 18:07:10 (1.66 MB/s) - Connection closed at byte 66975472. Retrying.
 
--2017-10-13 18:07:11--  (try: 2)  http://apache.mirror.cdnetworks.com/hive/hive-2.1.1/apache-hive-2.1.1-bin.tar.gz
Connecting to apache.mirror.cdnetworks.com (apache.mirror.cdnetworks.com)|14.0.101.165|:80... connected.
HTTP request sent, awaiting response... 206 Partial Content
Length: 149756462 (143M), 82780990 (79M) remaining [application/x-gzip]
Saving to: ‘apache-hive-2.1.1-bin.tar.gz’
 
apache-hive-2.1.1-bin.t 100%[++++++++++++================>] 142.82M  10.4MB/s    in 7.6s
 
2017-10-13 18:07:19 (10.4 MB/s) - ‘apache-hive-2.1.1-bin.tar.gz’ saved [149756462/149756462]
 
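
The transcript shows the mirror dropping the connection at about 44% and wget retrying with a ranged request on its own. If a download stops completely, it can be resumed from the partial file with the -c option (same URL as above):

$ wget -c http://apache.mirror.cdnetworks.com/hive/hive-2.1.1/apache-hive-2.1.1-bin.tar.gz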



2. Extract the downloaded archive.

$ tar -xvzf ~/apache-hive-2.1.1-bin.tar.gz
apache-hive-2.1.1-bin/LICENSE
apache-hive-2.1.1-bin/NOTICE
apache-hive-2.1.1-bin/README.txt
apache-hive-2.1.1-bin/RELEASE_NOTES.txt
apache-hive-2.1.1-bin/examples/files/2000_cols_data.csv
apache-hive-2.1.1-bin/examples/files/agg_01-p1.txt
apache-hive-2.1.1-bin/examples/files/agg_01-p2.txt
apache-hive-2.1.1-bin/examples/files/agg_01-p3.txt
apache-hive-2.1.1-bin/examples/files/alltypes.txt
apache-hive-2.1.1-bin/examples/files/alltypes2.txt
apache-hive-2.1.1-bin/examples/files/apache.access.2.log
apache-hive-2.1.1-bin/examples/files/apache.access.log
apache-hive-2.1.1-bin/examples/files/archive_corrupt.rc
apache-hive-2.1.1-bin/examples/files/array_table.txt
apache-hive-2.1.1-bin/examples/files/avro_charvarchar.txt
apache-hive-2.1.1-bin/examples/files/avro_date.txt
apache-hive-2.1.1-bin/examples/files/avro_timestamp.txt
apache-hive-2.1.1-bin/examples/files/AvroPrimitiveInList.parquet
apache-hive-2.1.1-bin/examples/files/AvroSingleFieldGroupInList.parquet
apache-hive-2.1.1-bin/examples/files/binary.txt
apache-hive-2.1.1-bin/examples/files/bool.txt
apache-hive-2.1.1-bin/examples/files/bool_literal.txt
apache-hive-2.1.1-bin/examples/files/cbo_t1.txt
apache-hive-2.1.1-bin/examples/files/cbo_t2.txt
apache-hive-2.1.1-bin/examples/files/cbo_t3.txt
apache-hive-2.1.1-bin/examples/files/cbo_t4.txt
apache-hive-2.1.1-bin/examples/files/cbo_t5.txt
apache-hive-2.1.1-bin/examples/files/cbo_t6.txt
apache-hive-2.1.1-bin/examples/files/char_varchar_udf.txt
apache-hive-2.1.1-bin/examples/files/complex.seq
apache-hive-2.1.1-bin/examples/files/covar_tab.txt
apache-hive-2.1.1-bin/examples/files/create_nested_type.txt
apache-hive-2.1.1-bin/examples/files/csv.txt
apache-hive-2.1.1-bin/examples/files/ct_events_clean.txt
apache-hive-2.1.1-bin/examples/files/customer_address.txt
apache-hive-2.1.1-bin/examples/files/customers.txt
apache-hive-2.1.1-bin/examples/files/data_with_escape.txt
apache-hive-2.1.1-bin/examples/files/datatypes.txt
apache-hive-2.1.1-bin/examples/files/dec.avro
apache-hive-2.1.1-bin/examples/files/dec.parq
apache-hive-2.1.1-bin/examples/files/dec.txt
apache-hive-2.1.1-bin/examples/files/dec_comp.txt
apache-hive-2.1.1-bin/examples/files/dec_old.avro
apache-hive-2.1.1-bin/examples/files/decimal.txt
apache-hive-2.1.1-bin/examples/files/decimal_10_0.txt
apache-hive-2.1.1-bin/examples/files/decimal_1_1.txt
apache-hive-2.1.1-bin/examples/files/dept.txt
apache-hive-2.1.1-bin/examples/files/dim-data.txt
apache-hive-2.1.1-bin/examples/files/dim_shops.txt
apache-hive-2.1.1-bin/examples/files/doctors.avro
apache-hive-2.1.1-bin/examples/files/docurl.txt
apache-hive-2.1.1-bin/examples/files/double.txt
apache-hive-2.1.1-bin/examples/files/dynamic_partition_insert.txt
apache-hive-2.1.1-bin/examples/files/dynpart_test.txt
apache-hive-2.1.1-bin/examples/files/dynpartdata1.txt
apache-hive-2.1.1-bin/examples/files/dynpartdata2.txt
apache-hive-2.1.1-bin/examples/files/emp.txt
apache-hive-2.1.1-bin/examples/files/emp2.txt
apache-hive-2.1.1-bin/examples/files/employee.dat
apache-hive-2.1.1-bin/examples/files/employee2.dat
apache-hive-2.1.1-bin/examples/files/employee_part.txt
apache-hive-2.1.1-bin/examples/files/empty1.txt
apache-hive-2.1.1-bin/examples/files/empty2.txt
apache-hive-2.1.1-bin/examples/files/encoding-utf8.txt
apache-hive-2.1.1-bin/examples/files/encoding_iso-8859-1.txt
apache-hive-2.1.1-bin/examples/files/episodes.avro
apache-hive-2.1.1-bin/examples/files/escape_crlf.txt
apache-hive-2.1.1-bin/examples/files/escapetest.txt
apache-hive-2.1.1-bin/examples/files/extrapolate_stats_full.txt
apache-hive-2.1.1-bin/examples/files/extrapolate_stats_partial.txt
apache-hive-2.1.1-bin/examples/files/extrapolate_stats_partial_ndv.txt
apache-hive-2.1.1-bin/examples/files/fact-data.txt
apache-hive-2.1.1-bin/examples/files/flights_join.txt
apache-hive-2.1.1-bin/examples/files/flights_tiny.txt
apache-hive-2.1.1-bin/examples/files/flights_tiny.txt.1
apache-hive-2.1.1-bin/examples/files/futurama_episodes.avro
apache-hive-2.1.1-bin/examples/files/grad.avsc
apache-hive-2.1.1-bin/examples/files/groupby_groupingid.txt
apache-hive-2.1.1-bin/examples/files/grouping_sets.txt
apache-hive-2.1.1-bin/examples/files/grouping_sets1.txt
apache-hive-2.1.1-bin/examples/files/grouping_sets2.txt
apache-hive-2.1.1-bin/examples/files/hive_626_bar.txt
apache-hive-2.1.1-bin/examples/files/hive_626_count.txt
apache-hive-2.1.1-bin/examples/files/hive_626_foo.txt
apache-hive-2.1.1-bin/examples/files/HiveGroup.parquet
apache-hive-2.1.1-bin/examples/files/HiveRequiredGroupInList.parquet
apache-hive-2.1.1-bin/examples/files/in1.txt
apache-hive-2.1.1-bin/examples/files/in2.txt
apache-hive-2.1.1-bin/examples/files/in3.txt
apache-hive-2.1.1-bin/examples/files/in4.txt
apache-hive-2.1.1-bin/examples/files/in5.txt
apache-hive-2.1.1-bin/examples/files/in6.txt
apache-hive-2.1.1-bin/examples/files/in7.txt
apache-hive-2.1.1-bin/examples/files/in8.txt
apache-hive-2.1.1-bin/examples/files/in9.txt
apache-hive-2.1.1-bin/examples/files/in_file.dat
apache-hive-2.1.1-bin/examples/files/infer_const_type.txt
apache-hive-2.1.1-bin/examples/files/input.txt
apache-hive-2.1.1-bin/examples/files/int.txt
apache-hive-2.1.1-bin/examples/files/json.txt
apache-hive-2.1.1-bin/examples/files/keystore.jks
apache-hive-2.1.1-bin/examples/files/keystore_exampledotcom.jks
apache-hive-2.1.1-bin/examples/files/kv1.seq
apache-hive-2.1.1-bin/examples/files/kv1.string-sorted.txt
apache-hive-2.1.1-bin/examples/files/kv1.txt
apache-hive-2.1.1-bin/examples/files/kv1.val.sorted.txt
apache-hive-2.1.1-bin/examples/files/kv10.txt
apache-hive-2.1.1-bin/examples/files/kv1_broken.seq
apache-hive-2.1.1-bin/examples/files/kv1_cb.txt
apache-hive-2.1.1-bin/examples/files/kv1_cc.txt
apache-hive-2.1.1-bin/examples/files/kv1kv2.cogroup.txt
apache-hive-2.1.1-bin/examples/files/kv2.txt
apache-hive-2.1.1-bin/examples/files/kv3.txt
apache-hive-2.1.1-bin/examples/files/kv4.txt
apache-hive-2.1.1-bin/examples/files/kv5.txt
apache-hive-2.1.1-bin/examples/files/kv6.txt
apache-hive-2.1.1-bin/examples/files/kv7.txt
apache-hive-2.1.1-bin/examples/files/kv8.txt
apache-hive-2.1.1-bin/examples/files/kv9.txt
apache-hive-2.1.1-bin/examples/files/leftsemijoin_mr_t1.txt
apache-hive-2.1.1-bin/examples/files/leftsemijoin_mr_t2.txt
apache-hive-2.1.1-bin/examples/files/lineitem.txt
apache-hive-2.1.1-bin/examples/files/loc.txt
apache-hive-2.1.1-bin/examples/files/location.txt
apache-hive-2.1.1-bin/examples/files/lt100.sorted.txt
apache-hive-2.1.1-bin/examples/files/lt100.txt
apache-hive-2.1.1-bin/examples/files/lt100.txt.deflate
apache-hive-2.1.1-bin/examples/files/map_null_schema.avro
apache-hive-2.1.1-bin/examples/files/map_null_val.avro
apache-hive-2.1.1-bin/examples/files/map_table.txt
apache-hive-2.1.1-bin/examples/files/mapNull.txt
apache-hive-2.1.1-bin/examples/files/MultiFieldGroupInList.parquet
apache-hive-2.1.1-bin/examples/files/nested_complex.txt
apache-hive-2.1.1-bin/examples/files/nested_orders.txt
apache-hive-2.1.1-bin/examples/files/nestedcomplex_additional.txt
apache-hive-2.1.1-bin/examples/files/NestedMap.parquet
apache-hive-2.1.1-bin/examples/files/NewOptionalGroupInList.parquet
apache-hive-2.1.1-bin/examples/files/NewRequiredGroupInList.parquet
apache-hive-2.1.1-bin/examples/files/non_ascii_tbl.txt
apache-hive-2.1.1-bin/examples/files/null.txt
apache-hive-2.1.1-bin/examples/files/nullfile.txt
apache-hive-2.1.1-bin/examples/files/nulls.txt
apache-hive-2.1.1-bin/examples/files/opencsv-data.txt
apache-hive-2.1.1-bin/examples/files/orc_create.txt
apache-hive-2.1.1-bin/examples/files/orc_create_people.txt
apache-hive-2.1.1-bin/examples/files/orc_split_elim.orc
apache-hive-2.1.1-bin/examples/files/orders.txt
apache-hive-2.1.1-bin/examples/files/parquet_array_null_element.txt
apache-hive-2.1.1-bin/examples/files/parquet_columnar.txt
apache-hive-2.1.1-bin/examples/files/parquet_create.txt
apache-hive-2.1.1-bin/examples/files/parquet_external_time.parq
apache-hive-2.1.1-bin/examples/files/parquet_partitioned.txt
apache-hive-2.1.1-bin/examples/files/parquet_type_promotion.txt
apache-hive-2.1.1-bin/examples/files/parquet_types.txt
apache-hive-2.1.1-bin/examples/files/part.rc
apache-hive-2.1.1-bin/examples/files/part.seq
apache-hive-2.1.1-bin/examples/files/part_tiny.txt
apache-hive-2.1.1-bin/examples/files/person age.txt
apache-hive-2.1.1-bin/examples/files/person+age.txt
apache-hive-2.1.1-bin/examples/files/posexplode_data.txt
apache-hive-2.1.1-bin/examples/files/primitive_type_arrays.txt
apache-hive-2.1.1-bin/examples/files/ProxyAuth.res
apache-hive-2.1.1-bin/examples/files/pw17.txt
apache-hive-2.1.1-bin/examples/files/regex-path-2015-12-10_03.txt
apache-hive-2.1.1-bin/examples/files/regex-path-201512-10_03.txt
apache-hive-2.1.1-bin/examples/files/regex-path-2015121003.txt
apache-hive-2.1.1-bin/examples/files/sales.txt
apache-hive-2.1.1-bin/examples/files/same_type1_a.txt
apache-hive-2.1.1-bin/examples/files/same_type1_b.txt
apache-hive-2.1.1-bin/examples/files/same_type1_c.txt
apache-hive-2.1.1-bin/examples/files/sample-queryplan-in-history.txt
apache-hive-2.1.1-bin/examples/files/sample-queryplan.txt
apache-hive-2.1.1-bin/examples/files/sample.json
apache-hive-2.1.1-bin/examples/files/sample2.json
apache-hive-2.1.1-bin/examples/files/service_request_clean.txt
apache-hive-2.1.1-bin/examples/files/SingleFieldGroupInList.parquet
apache-hive-2.1.1-bin/examples/files/small_csv.csv
apache-hive-2.1.1-bin/examples/files/smallsrcsortbucket1outof4.txt
apache-hive-2.1.1-bin/examples/files/smallsrcsortbucket2outof4.txt
apache-hive-2.1.1-bin/examples/files/smallsrcsortbucket3outof4.txt
apache-hive-2.1.1-bin/examples/files/smallsrcsortbucket4outof4.txt
apache-hive-2.1.1-bin/examples/files/smb_bucket_input.rc
apache-hive-2.1.1-bin/examples/files/smb_bucket_input.txt
apache-hive-2.1.1-bin/examples/files/smbbucket_1.rc
apache-hive-2.1.1-bin/examples/files/smbbucket_1.txt
apache-hive-2.1.1-bin/examples/files/smbbucket_2.rc
apache-hive-2.1.1-bin/examples/files/smbbucket_2.txt
apache-hive-2.1.1-bin/examples/files/smbbucket_3.rc
apache-hive-2.1.1-bin/examples/files/smbbucket_3.txt
apache-hive-2.1.1-bin/examples/files/smbdata.txt
apache-hive-2.1.1-bin/examples/files/SortCol1Col2.txt
apache-hive-2.1.1-bin/examples/files/SortCol2Col1.txt
apache-hive-2.1.1-bin/examples/files/SortDescCol1Col2.txt
apache-hive-2.1.1-bin/examples/files/SortDescCol2Col1.txt
apache-hive-2.1.1-bin/examples/files/sortdp.txt
apache-hive-2.1.1-bin/examples/files/sour1.txt
apache-hive-2.1.1-bin/examples/files/sour2.txt
apache-hive-2.1.1-bin/examples/files/source.txt
apache-hive-2.1.1-bin/examples/files/srcbucket0.txt
apache-hive-2.1.1-bin/examples/files/srcbucket1.txt
apache-hive-2.1.1-bin/examples/files/srcbucket20.txt
apache-hive-2.1.1-bin/examples/files/srcbucket21.txt
apache-hive-2.1.1-bin/examples/files/srcbucket22.txt
apache-hive-2.1.1-bin/examples/files/srcbucket23.txt
apache-hive-2.1.1-bin/examples/files/srcsortbucket1outof4.txt
apache-hive-2.1.1-bin/examples/files/srcsortbucket2outof4.txt
apache-hive-2.1.1-bin/examples/files/srcsortbucket3outof4.txt
apache-hive-2.1.1-bin/examples/files/srcsortbucket4outof4.txt
apache-hive-2.1.1-bin/examples/files/store.txt
apache-hive-2.1.1-bin/examples/files/store_sales.txt
apache-hive-2.1.1-bin/examples/files/string.txt
apache-hive-2.1.1-bin/examples/files/StringMapOfOptionalIntArray.parquet
apache-hive-2.1.1-bin/examples/files/struct1_a.txt
apache-hive-2.1.1-bin/examples/files/struct1_b.txt
apache-hive-2.1.1-bin/examples/files/struct1_c.txt
apache-hive-2.1.1-bin/examples/files/struct2_a.txt
apache-hive-2.1.1-bin/examples/files/struct2_b.txt
apache-hive-2.1.1-bin/examples/files/struct2_c.txt
apache-hive-2.1.1-bin/examples/files/struct2_d.txt
apache-hive-2.1.1-bin/examples/files/struct3_a.txt
apache-hive-2.1.1-bin/examples/files/struct3_b.txt
apache-hive-2.1.1-bin/examples/files/struct3_c.txt
apache-hive-2.1.1-bin/examples/files/struct4_a.txt
apache-hive-2.1.1-bin/examples/files/struct4_b.txt
apache-hive-2.1.1-bin/examples/files/struct4_c.txt
apache-hive-2.1.1-bin/examples/files/symlink-with-regex.txt
apache-hive-2.1.1-bin/examples/files/symlink1.txt
apache-hive-2.1.1-bin/examples/files/symlink2.txt
apache-hive-2.1.1-bin/examples/files/T1.txt
apache-hive-2.1.1-bin/examples/files/T2.txt
apache-hive-2.1.1-bin/examples/files/T3.txt
apache-hive-2.1.1-bin/examples/files/tbl.txt
apache-hive-2.1.1-bin/examples/files/test.dat
apache-hive-2.1.1-bin/examples/files/test1.txt
apache-hive-2.1.1-bin/examples/files/test2.dat
apache-hive-2.1.1-bin/examples/files/text-en.txt
apache-hive-2.1.1-bin/examples/files/things.txt
apache-hive-2.1.1-bin/examples/files/things2.txt
apache-hive-2.1.1-bin/examples/files/ThriftPrimitiveInList.parquet
apache-hive-2.1.1-bin/examples/files/ThriftSingleFieldGroupInList.parquet
apache-hive-2.1.1-bin/examples/files/timestamps.txt
apache-hive-2.1.1-bin/examples/files/tiny_a.txt
apache-hive-2.1.1-bin/examples/files/tiny_b.txt
apache-hive-2.1.1-bin/examples/files/tjoin1.txt
apache-hive-2.1.1-bin/examples/files/tjoin2.txt
apache-hive-2.1.1-bin/examples/files/truststore.jks
apache-hive-2.1.1-bin/examples/files/ts_formats.txt
apache-hive-2.1.1-bin/examples/files/tsformat.json
apache-hive-2.1.1-bin/examples/files/type_evolution.avro
apache-hive-2.1.1-bin/examples/files/UnannotatedListOfGroups.parquet
apache-hive-2.1.1-bin/examples/files/UnannotatedListOfPrimitives.parquet
apache-hive-2.1.1-bin/examples/files/union_input.txt
apache-hive-2.1.1-bin/examples/files/union_non_nullable.txt
apache-hive-2.1.1-bin/examples/files/union_nullable.txt
apache-hive-2.1.1-bin/examples/files/unique_1.txt
apache-hive-2.1.1-bin/examples/files/unique_2.txt
apache-hive-2.1.1-bin/examples/files/UserVisits.dat
apache-hive-2.1.1-bin/examples/files/v1.txt
apache-hive-2.1.1-bin/examples/files/v2.txt
apache-hive-2.1.1-bin/examples/files/vc1.txt
apache-hive-2.1.1-bin/examples/files/windowing_distinct.txt
apache-hive-2.1.1-bin/examples/files/x.txt
apache-hive-2.1.1-bin/examples/files/y.txt
apache-hive-2.1.1-bin/examples/files/z.txt
apache-hive-2.1.1-bin/examples/queries/case_sensitivity.q
apache-hive-2.1.1-bin/examples/queries/cast1.q
apache-hive-2.1.1-bin/examples/queries/groupby1.q
apache-hive-2.1.1-bin/examples/queries/groupby2.q
apache-hive-2.1.1-bin/examples/queries/groupby3.q
apache-hive-2.1.1-bin/examples/queries/groupby4.q
apache-hive-2.1.1-bin/examples/queries/groupby5.q
apache-hive-2.1.1-bin/examples/queries/groupby6.q
apache-hive-2.1.1-bin/examples/queries/input1.q
apache-hive-2.1.1-bin/examples/queries/input2.q
apache-hive-2.1.1-bin/examples/queries/input20.q
apache-hive-2.1.1-bin/examples/queries/input3.q
apache-hive-2.1.1-bin/examples/queries/input4.q
apache-hive-2.1.1-bin/examples/queries/input5.q
apache-hive-2.1.1-bin/examples/queries/input6.q
apache-hive-2.1.1-bin/examples/queries/input7.q
apache-hive-2.1.1-bin/examples/queries/input8.q
apache-hive-2.1.1-bin/examples/queries/input9.q
apache-hive-2.1.1-bin/examples/queries/input_part1.q
apache-hive-2.1.1-bin/examples/queries/input_testsequencefile.q
apache-hive-2.1.1-bin/examples/queries/input_testxpath.q
apache-hive-2.1.1-bin/examples/queries/input_testxpath2.q
apache-hive-2.1.1-bin/examples/queries/join1.q
apache-hive-2.1.1-bin/examples/queries/join2.q
apache-hive-2.1.1-bin/examples/queries/join3.q
apache-hive-2.1.1-bin/examples/queries/join4.q
apache-hive-2.1.1-bin/examples/queries/join5.q
apache-hive-2.1.1-bin/examples/queries/join6.q
apache-hive-2.1.1-bin/examples/queries/join7.q
apache-hive-2.1.1-bin/examples/queries/join8.q
apache-hive-2.1.1-bin/examples/queries/sample1.q
apache-hive-2.1.1-bin/examples/queries/sample2.q
apache-hive-2.1.1-bin/examples/queries/sample3.q
apache-hive-2.1.1-bin/examples/queries/sample4.q
apache-hive-2.1.1-bin/examples/queries/sample5.q
apache-hive-2.1.1-bin/examples/queries/sample6.q
apache-hive-2.1.1-bin/examples/queries/sample7.q
apache-hive-2.1.1-bin/examples/queries/subq.q
apache-hive-2.1.1-bin/examples/queries/udf1.q
apache-hive-2.1.1-bin/examples/queries/udf4.q
apache-hive-2.1.1-bin/examples/queries/udf6.q
apache-hive-2.1.1-bin/examples/queries/udf_case.q
apache-hive-2.1.1-bin/examples/queries/udf_when.q
apache-hive-2.1.1-bin/examples/queries/union.q
apache-hive-2.1.1-bin/bin/ext/util/
apache-hive-2.1.1-bin/bin/beeline
apache-hive-2.1.1-bin/bin/beeline.cmd
apache-hive-2.1.1-bin/bin/ext/beeline.sh
apache-hive-2.1.1-bin/bin/ext/cleardanglingscratchdir.cmd
apache-hive-2.1.1-bin/bin/ext/cleardanglingscratchdir.sh
apache-hive-2.1.1-bin/bin/ext/cli.cmd
apache-hive-2.1.1-bin/bin/ext/cli.sh
apache-hive-2.1.1-bin/bin/ext/debug.cmd
apache-hive-2.1.1-bin/bin/ext/debug.sh
apache-hive-2.1.1-bin/bin/ext/hbaseimport.cmd
apache-hive-2.1.1-bin/bin/ext/hbaseimport.sh
apache-hive-2.1.1-bin/bin/ext/hbaseschematool.sh
apache-hive-2.1.1-bin/bin/ext/help.cmd
apache-hive-2.1.1-bin/bin/ext/help.sh
apache-hive-2.1.1-bin/bin/ext/hiveburninclient.sh
apache-hive-2.1.1-bin/bin/ext/hiveserver2.cmd
apache-hive-2.1.1-bin/bin/ext/hiveserver2.sh
apache-hive-2.1.1-bin/bin/ext/hplsql.sh
apache-hive-2.1.1-bin/bin/ext/hwi.cmd
apache-hive-2.1.1-bin/bin/ext/hwi.sh
apache-hive-2.1.1-bin/bin/ext/jar.cmd
apache-hive-2.1.1-bin/bin/ext/jar.sh
apache-hive-2.1.1-bin/bin/ext/lineage.cmd
apache-hive-2.1.1-bin/bin/ext/lineage.sh
apache-hive-2.1.1-bin/bin/ext/llap.sh
apache-hive-2.1.1-bin/bin/ext/llapdump.sh
apache-hive-2.1.1-bin/bin/ext/llapstatus.sh
apache-hive-2.1.1-bin/bin/ext/metastore.cmd
apache-hive-2.1.1-bin/bin/ext/metastore.sh
apache-hive-2.1.1-bin/bin/ext/metatool.sh
apache-hive-2.1.1-bin/bin/ext/orcfiledump.cmd
apache-hive-2.1.1-bin/bin/ext/orcfiledump.sh
apache-hive-2.1.1-bin/bin/ext/rcfilecat.cmd
apache-hive-2.1.1-bin/bin/ext/rcfilecat.sh
apache-hive-2.1.1-bin/bin/ext/schemaTool.cmd
apache-hive-2.1.1-bin/bin/ext/schemaTool.sh
apache-hive-2.1.1-bin/bin/ext/util/execHiveCmd.cmd
apache-hive-2.1.1-bin/bin/ext/util/execHiveCmd.sh
apache-hive-2.1.1-bin/bin/ext/version.sh
apache-hive-2.1.1-bin/bin/hive
apache-hive-2.1.1-bin/bin/hive-config.cmd
apache-hive-2.1.1-bin/bin/hive-config.sh
apache-hive-2.1.1-bin/bin/hive.cmd
apache-hive-2.1.1-bin/bin/hiveserver2
apache-hive-2.1.1-bin/bin/hplsql
apache-hive-2.1.1-bin/bin/hplsql.cmd
apache-hive-2.1.1-bin/bin/metatool
apache-hive-2.1.1-bin/bin/schematool
apache-hive-2.1.1-bin/scripts/llap/bin/llap-daemon-env.sh
apache-hive-2.1.1-bin/scripts/llap/bin/llapDaemon.sh
apache-hive-2.1.1-bin/scripts/llap/bin/runLlapDaemon.sh
apache-hive-2.1.1-bin/scripts/llap/sql/serviceCheckScript.sql
apache-hive-2.1.1-bin/scripts/llap/slider/llap.py
apache-hive-2.1.1-bin/scripts/llap/slider/package.py
apache-hive-2.1.1-bin/scripts/llap/slider/params.py
apache-hive-2.1.1-bin/scripts/llap/slider/templates.py
apache-hive-2.1.1-bin/scripts/metastore/upgrade/derby/
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mssql/
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/
apache-hive-2.1.1-bin/scripts/metastore/upgrade/oracle/
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/
apache-hive-2.1.1-bin/scripts/metastore/upgrade/derby/001-HIVE-972.derby.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/derby/002-HIVE-1068.derby.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/derby/003-HIVE-675.derby.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/derby/004-HIVE-1364.derby.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/derby/005-HIVE-417.derby.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/derby/006-HIVE-1823.derby.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/derby/007-HIVE-78.derby.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/derby/008-HIVE-2246.derby.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/derby/008-REVERT-HIVE-2246.derby.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/derby/009-HIVE-2215.derby.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/derby/010-HIVE-3072.derby.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/derby/011-HIVE-3649.derby.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/derby/012-HIVE-1362.derby.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/derby/013-HIVE-3255.derby.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/derby/014-HIVE-3764.derby.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/derby/016-HIVE-6386.derby.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/derby/017-HIVE-6458.derby.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/derby/018-HIVE-6757.derby.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/derby/019-HIVE-7784.derby.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/derby/020-HIVE-9296.derby.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/derby/021-HIVE-11970.derby.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/derby/022-HIVE-11107.derby.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/derby/023-HIVE-12807.derby.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/derby/024-HIVE-12814.derby.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/derby/025-HIVE-12816.derby.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/derby/026-HIVE-12818.derby.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/derby/027-HIVE-12819.derby.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/derby/028-HIVE-12821.derby.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/derby/029-HIVE-12822.derby.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/derby/030-HIVE-12823.derby.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/derby/031-HIVE-12831.derby.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/derby/032-HIVE-12832.derby.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/derby/034-HIVE-13076.derby.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/derby/035-HIVE-13395.derby.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/derby/036-HIVE-13354.derby.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/derby/hive-schema-0.10.0.derby.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/derby/hive-schema-0.11.0.derby.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/derby/hive-schema-0.12.0.derby.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/derby/hive-schema-0.13.0.derby.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/derby/hive-schema-0.14.0.derby.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/derby/hive-schema-0.3.0.derby.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/derby/hive-schema-0.4.0.derby.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/derby/hive-schema-0.4.1.derby.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/derby/hive-schema-0.5.0.derby.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/derby/hive-schema-0.6.0.derby.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/derby/hive-schema-0.7.0.derby.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/derby/hive-schema-0.8.0.derby.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/derby/hive-schema-0.9.0.derby.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/derby/hive-schema-1.1.0.derby.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/derby/hive-schema-1.2.0.derby.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/derby/hive-schema-1.3.0.derby.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/derby/hive-schema-2.0.0.derby.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/derby/hive-schema-2.1.0.derby.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/derby/hive-txn-schema-0.13.0.derby.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/derby/hive-txn-schema-0.14.0.derby.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/derby/hive-txn-schema-1.3.0.derby.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/derby/hive-txn-schema-2.0.0.derby.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/derby/hive-txn-schema-2.1.0.derby.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/derby/README
apache-hive-2.1.1-bin/scripts/metastore/upgrade/derby/upgrade-0.10.0-to-0.11.0.derby.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/derby/upgrade-0.11.0-to-0.12.0.derby.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/derby/upgrade-0.12.0-to-0.13.0.derby.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/derby/upgrade-0.13.0-to-0.14.0.derby.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/derby/upgrade-0.14.0-to-1.1.0.derby.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/derby/upgrade-0.5.0-to-0.6.0.derby.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/derby/upgrade-0.6.0-to-0.7.0.derby.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/derby/upgrade-0.7.0-to-0.8.0.derby.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/derby/upgrade-0.8.0-to-0.9.0.derby.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/derby/upgrade-0.9.0-to-0.10.0.derby.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/derby/upgrade-1.1.0-to-1.2.0.derby.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/derby/upgrade-1.2.0-to-1.3.0.derby.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/derby/upgrade-1.2.0-to-2.0.0.derby.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/derby/upgrade-2.0.0-to-2.1.0.derby.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/derby/upgrade.order.derby
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mssql/001-HIVE-6862.mssql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mssql/002-HIVE-7784.mssql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mssql/003-HIVE-8239.mssql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mssql/004-HIVE-8550.mssql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mssql/005-HIVE-9296.mssql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mssql/006-HIVE-9456.mssql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mssql/007-HIVE-11970.mssql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mssql/008-HIVE-12807.mssql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mssql/009-HIVE-12814.mssql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mssql/010-HIVE-12816.mssql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mssql/011-HIVE-12818.mssql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mssql/012-HIVE-12819.mssql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mssql/013-HIVE-12821.mssql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mssql/014-HIVE-12822.mssql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mssql/015-HIVE-12823.mssql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mssql/016-HIVE-12831.mssql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mssql/017-HIVE-12832.mssql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mssql/019-HIVE-13076.mssql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mssql/020-HIVE-13395.mssql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mssql/021-HIVE-13354.mssql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mssql/hive-schema-0.11.0.mssql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mssql/hive-schema-0.12.0.mssql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mssql/hive-schema-0.13.0.mssql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mssql/hive-schema-0.14.0.mssql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mssql/hive-schema-1.1.0.mssql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mssql/hive-schema-1.2.0.mssql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mssql/hive-schema-1.3.0.mssql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mssql/hive-schema-2.0.0.mssql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mssql/hive-schema-2.1.0.mssql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mssql/hive-txn-schema-0.13.0.mssql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mssql/hive-txn-schema-0.14.0.mssql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mssql/pre-0-upgrade-0.12.0-to-0.13.0.mssql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mssql/pre-0-upgrade-0.13.0-to-0.14.0.mssql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mssql/pre-1-upgrade-0.12.0-to-0.13.0.mssql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mssql/pre-1-upgrade-0.13.0-to-0.14.0.mssql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mssql/README
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mssql/upgrade-0.12.0-to-0.13.0.mssql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mssql/upgrade-0.13.0-to-0.14.0.mssql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mssql/upgrade-0.14.0-to-1.1.0.mssql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mssql/upgrade-1.1.0-to-1.2.0.mssql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mssql/upgrade-1.2.0-to-1.3.0.mssql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mssql/upgrade-1.2.0-to-2.0.0.mssql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mssql/upgrade-2.0.0-to-2.1.0.mssql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mssql/upgrade.order.mssql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/001-HIVE-972.mysql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/002-HIVE-1068.mysql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/003-HIVE-675.mysql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/004-HIVE-1364.mysql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/005-HIVE-417.mysql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/006-HIVE-1823.mysql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/007-HIVE-78.mysql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/008-HIVE-2246.mysql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/009-HIVE-2215.mysql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/010-HIVE-3072.mysql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/011-HIVE-3649.mysql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/012-HIVE-1362.mysql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/013-HIVE-3255.mysql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/014-HIVE-3764.mysql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/016-HIVE-6386.mysql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/017-HIVE-6458.mysql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/018-HIVE-6757.mysql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/019-HIVE-7784.mysql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/020-HIVE-9296.mysql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/021-HIVE-7018.mysql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/022-HIVE-11970.mysql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/023-HIVE-12807.mysql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/024-HIVE-12814.mysql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/025-HIVE-12816.mysql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/026-HIVE-12818.mysql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/027-HIVE-12819.mysql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/028-HIVE-12821.mysql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/029-HIVE-12822.mysql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/030-HIVE-12823.mysql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/031-HIVE-12831.mysql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/032-HIVE-12832.mysql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/034-HIVE-13076.mysql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/035-HIVE-13395.mysql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/036-HIVE-13354.mysql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/hive-schema-0.10.0.mysql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/hive-schema-0.11.0.mysql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/hive-schema-0.12.0.mysql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/hive-schema-0.13.0.mysql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/hive-schema-0.14.0.mysql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/hive-schema-0.3.0.mysql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/hive-schema-0.4.0.mysql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/hive-schema-0.4.1.mysql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/hive-schema-0.5.0.mysql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/hive-schema-0.6.0.mysql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/hive-schema-0.7.0.mysql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/hive-schema-0.8.0.mysql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/hive-schema-0.9.0.mysql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/hive-schema-1.1.0.mysql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/hive-schema-1.2.0.mysql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/hive-schema-1.3.0.mysql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/hive-schema-2.0.0.mysql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/hive-schema-2.1.0.mysql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/hive-txn-schema-0.13.0.mysql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/hive-txn-schema-0.14.0.mysql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/hive-txn-schema-1.3.0.mysql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/hive-txn-schema-2.0.0.mysql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/hive-txn-schema-2.1.0.mysql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/README
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/upgrade-0.10.0-to-0.11.0.mysql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/upgrade-0.11.0-to-0.12.0.mysql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/upgrade-0.12.0-to-0.13.0.mysql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/upgrade-0.13.0-to-0.14.0.mysql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/upgrade-0.14.0-to-1.1.0.mysql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/upgrade-0.5.0-to-0.6.0.mysql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/upgrade-0.6.0-to-0.7.0.mysql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/upgrade-0.7.0-to-0.8.0.mysql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/upgrade-0.8.0-to-0.9.0.mysql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/upgrade-0.9.0-to-0.10.0.mysql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/upgrade-1.1.0-to-1.2.0.mysql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/upgrade-1.2.0-to-1.3.0.mysql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/upgrade-1.2.0-to-2.0.0.mysql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/upgrade-2.0.0-to-2.1.0.mysql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/mysql/upgrade.order.mysql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/oracle/010-HIVE-3072.oracle.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/oracle/011-HIVE-3649.oracle.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/oracle/012-HIVE-1362.oracle.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/oracle/013-HIVE-3255.oracle.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/oracle/014-HIVE-3764.oracle.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/oracle/016-HIVE-6386.oracle.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/oracle/017-HIVE-6458.oracle.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/oracle/018-HIVE-6757.oracle.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/oracle/019-HIVE-7118.oracle.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/oracle/020-HIVE-7784.oracle.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/oracle/021-HIVE-9296.oracle.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/oracle/022-HIVE-11970.oracle.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/oracle/023-HIVE-12807.oracle.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/oracle/024-HIVE-12814.oracle.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/oracle/025-HIVE-12816.oracle.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/oracle/026-HIVE-12818.oracle.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/oracle/027-HIVE-12819.oracle.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/oracle/028-HIVE-12821.oracle.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/oracle/029-HIVE-12822.oracle.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/oracle/030-HIVE-12823.oracle.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/oracle/031-HIVE-12381.oracle.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/oracle/032-HIVE-12832.oracle.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/oracle/034-HIVE-13076.oracle.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/oracle/035-HIVE-13395.oracle.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/oracle/036-HIVE-13354.oracle.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/oracle/hive-schema-0.10.0.oracle.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/oracle/hive-schema-0.11.0.oracle.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/oracle/hive-schema-0.12.0.oracle.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/oracle/hive-schema-0.13.0.oracle.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/oracle/hive-schema-0.14.0.oracle.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/oracle/hive-schema-0.9.0.oracle.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/oracle/hive-schema-1.1.0.oracle.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/oracle/hive-schema-1.2.0.oracle.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/oracle/hive-schema-1.3.0.oracle.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/oracle/hive-schema-2.0.0.oracle.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/oracle/hive-schema-2.1.0.oracle.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/oracle/hive-txn-schema-0.13.0.oracle.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/oracle/hive-txn-schema-0.14.0.oracle.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/oracle/hive-txn-schema-1.3.0.oracle.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/oracle/hive-txn-schema-2.0.0.oracle.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/oracle/hive-txn-schema-2.1.0.oracle.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/oracle/pre-0-upgrade-0.13.0-to-0.14.0.oracle.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/oracle/upgrade-0.10.0-to-0.11.0.mysql.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/oracle/upgrade-0.10.0-to-0.11.0.oracle.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/oracle/upgrade-0.11.0-to-0.12.0.oracle.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/oracle/upgrade-0.12.0-to-0.13.0.oracle.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/oracle/upgrade-0.13.0-to-0.14.0.oracle.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/oracle/upgrade-0.14.0-to-1.1.0.oracle.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/oracle/upgrade-0.9.0-to-0.10.0.oracle.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/oracle/upgrade-1.1.0-to-1.2.0.oracle.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/oracle/upgrade-1.2.0-to-1.3.0.oracle.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/oracle/upgrade-1.2.0-to-2.0.0.oracle.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/oracle/upgrade-2.0.0-to-2.1.0.oracle.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/oracle/upgrade.order.oracle
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/001-HIVE-972.postgres.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/002-HIVE-1068.postgres.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/003-HIVE-675.postgres.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/004-HIVE-1364.postgres.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/005-HIVE-417.postgres.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/006-HIVE-1823.postgres.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/007-HIVE-78.postgres.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/008-HIVE-2246.postgres.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/008-REVERT-HIVE-2246.postgres.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/009-HIVE-2215.postgres.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/010-HIVE-3072.postgres.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/011-HIVE-3649.postgres.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/012-HIVE-1362.postgres.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/013-HIVE-3255.postgres.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/014-HIVE-3764.postgres.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/016-HIVE-6386.postgres.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/017-HIVE-6458.postgres.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/018-HIVE-6757.postgres.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/019-HIVE-7784.postgres.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/020-HIVE-9296.postgres.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/021-HIVE-11970.postgres.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/022-HIVE-12807.postgres.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/023-HIVE-12814.postgres.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/024-HIVE-12816.postgres.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/025-HIVE-12818.postgres.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/026-HIVE-12819.postgres.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/027-HIVE-12821.postgres.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/028-HIVE-12822.postgres.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/029-HIVE-12823.postgres.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/030-HIVE-12831.postgres.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/031-HIVE-12832.postgres.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/033-HIVE-13076.postgres.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/034-HIVE-13395.postgres.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/035-HIVE-13354.postgres.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/hive-schema-0.10.0.postgres.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/hive-schema-0.11.0.postgres.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/hive-schema-0.12.0.postgres.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/hive-schema-0.13.0.postgres.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/hive-schema-0.14.0.postgres.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/hive-schema-0.3.0.postgres.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/hive-schema-0.4.0.postgres.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/hive-schema-0.4.1.postgres.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/hive-schema-0.5.0.postgres.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/hive-schema-0.6.0.postgres.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/hive-schema-0.7.0.postgres.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/hive-schema-0.8.0.postgres.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/hive-schema-0.9.0.postgres.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/hive-schema-1.1.0.postgres.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/hive-schema-1.2.0.postgres.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/hive-schema-1.3.0.postgres.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/hive-schema-2.0.0.postgres.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/hive-schema-2.1.0.postgres.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/hive-txn-schema-0.13.0.postgres.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/hive-txn-schema-0.14.0.postgres.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/hive-txn-schema-1.3.0.postgres.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/hive-txn-schema-2.0.0.postgres.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/hive-txn-schema-2.1.0.postgres.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/pre-0-upgrade-0.12.0-to-0.13.0.postgres.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/pre-0-upgrade-0.13.0-to-0.14.0.postgres.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/README
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/upgrade-0.10.0-to-0.11.0.postgres.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/upgrade-0.11.0-to-0.12.0.postgres.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/upgrade-0.12.0-to-0.13.0.postgres.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/upgrade-0.13.0-to-0.14.0.postgres.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/upgrade-0.14.0-to-1.1.0.postgres.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/upgrade-0.5.0-to-0.6.0.postgres.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/upgrade-0.6.0-to-0.7.0.postgres.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/upgrade-0.7.0-to-0.8.0.postgres.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/upgrade-0.8.0-to-0.9.0.postgres.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/upgrade-0.9.0-to-0.10.0.postgres.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/upgrade-1.1.0-to-1.2.0.postgres.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/upgrade-1.2.0-to-1.3.0.postgres.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/upgrade-1.2.0-to-2.0.0.postgres.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/upgrade-2.0.0-to-2.1.0.postgres.sql
apache-hive-2.1.1-bin/scripts/metastore/upgrade/postgres/upgrade.order.postgres
apache-hive-2.1.1-bin/conf/hive-default.xml.template
apache-hive-2.1.1-bin/conf/hive-env.sh.template
apache-hive-2.1.1-bin/conf/ivysettings.xml
apache-hive-2.1.1-bin/lib/php/ext/
apache-hive-2.1.1-bin/lib/php/ext/thrift_protocol/
apache-hive-2.1.1-bin/lib/php/ext/thrift_protocol/tags/
apache-hive-2.1.1-bin/lib/php/ext/thrift_protocol/tags/1.0.0/
apache-hive-2.1.1-bin/lib/php/packages/
apache-hive-2.1.1-bin/lib/php/packages/fb303/
apache-hive-2.1.1-bin/lib/php/protocol/
apache-hive-2.1.1-bin/lib/php/transport/
apache-hive-2.1.1-bin/lib/php/autoload.php
apache-hive-2.1.1-bin/lib/php/ext/thrift_protocol/config.m4
apache-hive-2.1.1-bin/lib/php/ext/thrift_protocol/php_thrift_protocol.cpp
apache-hive-2.1.1-bin/lib/php/ext/thrift_protocol/php_thrift_protocol.h
apache-hive-2.1.1-bin/lib/php/ext/thrift_protocol/tags/1.0.0/config.m4
apache-hive-2.1.1-bin/lib/php/ext/thrift_protocol/tags/1.0.0/php_thrift_protocol.cpp
apache-hive-2.1.1-bin/lib/php/ext/thrift_protocol/tags/1.0.0/php_thrift_protocol.h
apache-hive-2.1.1-bin/lib/php/packages/fb303/FacebookService.php
apache-hive-2.1.1-bin/lib/php/packages/fb303/fb303_types.php
apache-hive-2.1.1-bin/lib/php/protocol/TBinaryProtocol.php
apache-hive-2.1.1-bin/lib/php/protocol/TProtocol.php
apache-hive-2.1.1-bin/lib/php/Thrift.php
apache-hive-2.1.1-bin/lib/php/transport/TBufferedTransport.php
apache-hive-2.1.1-bin/lib/php/transport/TFramedTransport.php
apache-hive-2.1.1-bin/lib/php/transport/THttpClient.php
apache-hive-2.1.1-bin/lib/php/transport/TMemoryBuffer.php
apache-hive-2.1.1-bin/lib/php/transport/TNullTransport.php
apache-hive-2.1.1-bin/lib/php/transport/TPhpStream.php
apache-hive-2.1.1-bin/lib/php/transport/TSocket.php
apache-hive-2.1.1-bin/lib/php/transport/TSocketPool.php
apache-hive-2.1.1-bin/lib/php/transport/TTransport.php
apache-hive-2.1.1-bin/lib/php/packages/serde/org/
apache-hive-2.1.1-bin/lib/php/packages/serde/org/apache/
apache-hive-2.1.1-bin/lib/php/packages/serde/org/apache/hadoop/
apache-hive-2.1.1-bin/lib/php/packages/serde/org/apache/hadoop/hive/
apache-hive-2.1.1-bin/lib/php/packages/serde/org/apache/hadoop/hive/serde/
apache-hive-2.1.1-bin/lib/php/packages/serde/org/apache/hadoop/hive/serde/Types.php
apache-hive-2.1.1-bin/lib/php/packages/serde/Types.php
apache-hive-2.1.1-bin/lib/php/packages/hive_metastore/metastore/
apache-hive-2.1.1-bin/lib/php/packages/hive_metastore/metastore/ThriftHiveMetastore.php
apache-hive-2.1.1-bin/lib/php/packages/hive_metastore/metastore/Types.php
apache-hive-2.1.1-bin/lib/php/packages/queryplan/Types.php
apache-hive-2.1.1-bin/lib/py/fb303/
apache-hive-2.1.1-bin/lib/py/fb303_scripts/
apache-hive-2.1.1-bin/lib/py/thrift/
apache-hive-2.1.1-bin/lib/py/thrift/protocol/
apache-hive-2.1.1-bin/lib/py/thrift/reflection/
apache-hive-2.1.1-bin/lib/py/thrift/reflection/limited/
apache-hive-2.1.1-bin/lib/py/thrift/server/
apache-hive-2.1.1-bin/lib/py/thrift/transport/
apache-hive-2.1.1-bin/lib/py/fb303/__init__.py
apache-hive-2.1.1-bin/lib/py/fb303/constants.py
apache-hive-2.1.1-bin/lib/py/fb303/FacebookBase.py
apache-hive-2.1.1-bin/lib/py/fb303/FacebookService-remote
apache-hive-2.1.1-bin/lib/py/fb303/FacebookService.py
apache-hive-2.1.1-bin/lib/py/fb303/ttypes.py
apache-hive-2.1.1-bin/lib/py/fb303_scripts/__init__.py
apache-hive-2.1.1-bin/lib/py/fb303_scripts/fb303_simple_mgmt.py
apache-hive-2.1.1-bin/lib/py/thrift/__init__.py
apache-hive-2.1.1-bin/lib/py/thrift/protocol/__init__.py
apache-hive-2.1.1-bin/lib/py/thrift/protocol/fastbinary.c
apache-hive-2.1.1-bin/lib/py/thrift/protocol/TBinaryProtocol.py
apache-hive-2.1.1-bin/lib/py/thrift/protocol/TProtocol.py
apache-hive-2.1.1-bin/lib/py/thrift/reflection/__init__.py
apache-hive-2.1.1-bin/lib/py/thrift/reflection/limited/__init__.py
apache-hive-2.1.1-bin/lib/py/thrift/reflection/limited/constants.py
apache-hive-2.1.1-bin/lib/py/thrift/reflection/limited/ttypes.py
apache-hive-2.1.1-bin/lib/py/thrift/server/__init__.py
apache-hive-2.1.1-bin/lib/py/thrift/server/THttpServer.py
apache-hive-2.1.1-bin/lib/py/thrift/server/TNonblockingServer.py
apache-hive-2.1.1-bin/lib/py/thrift/server/TServer.py
apache-hive-2.1.1-bin/lib/py/thrift/Thrift.py
apache-hive-2.1.1-bin/lib/py/thrift/transport/__init__.py
apache-hive-2.1.1-bin/lib/py/thrift/transport/THttpClient.py
apache-hive-2.1.1-bin/lib/py/thrift/transport/TSocket.py
apache-hive-2.1.1-bin/lib/py/thrift/transport/TTransport.py
apache-hive-2.1.1-bin/lib/py/thrift/transport/TTwisted.py
apache-hive-2.1.1-bin/lib/py/thrift/TSCons.py
apache-hive-2.1.1-bin/lib/py/hive_serde/__init__.py
apache-hive-2.1.1-bin/lib/py/hive_serde/constants.py
apache-hive-2.1.1-bin/lib/py/hive_serde/ttypes.py
apache-hive-2.1.1-bin/lib/py/hive_metastore/__init__.py
apache-hive-2.1.1-bin/lib/py/hive_metastore/constants.py
apache-hive-2.1.1-bin/lib/py/hive_metastore/ThriftHiveMetastore-remote
apache-hive-2.1.1-bin/lib/py/hive_metastore/ThriftHiveMetastore.py
apache-hive-2.1.1-bin/lib/py/hive_metastore/ttypes.py
apache-hive-2.1.1-bin/lib/py/queryplan/__init__.py
apache-hive-2.1.1-bin/lib/py/queryplan/constants.py
apache-hive-2.1.1-bin/lib/py/queryplan/ttypes.py
apache-hive-2.1.1-bin/hcatalog/bin/common.sh
apache-hive-2.1.1-bin/hcatalog/bin/hcat
apache-hive-2.1.1-bin/hcatalog/bin/hcat.py
apache-hive-2.1.1-bin/hcatalog/bin/hcatcfg.py
apache-hive-2.1.1-bin/hcatalog/bin/templeton.cmd
apache-hive-2.1.1-bin/hcatalog/etc/hcatalog/jndi.properties
apache-hive-2.1.1-bin/hcatalog/etc/hcatalog/proto-hive-site.xml
apache-hive-2.1.1-bin/hcatalog/etc/webhcat/webhcat-default.xml
apache-hive-2.1.1-bin/hcatalog/etc/webhcat/webhcat-log4j2.properties
apache-hive-2.1.1-bin/hcatalog/libexec/hcat-config.sh
apache-hive-2.1.1-bin/hcatalog/sbin/hcat_server.py
apache-hive-2.1.1-bin/hcatalog/sbin/hcat_server.sh
apache-hive-2.1.1-bin/hcatalog/sbin/hcatcfg.py
apache-hive-2.1.1-bin/hcatalog/sbin/update-hcatalog-env.sh
apache-hive-2.1.1-bin/hcatalog/sbin/webhcat_config.sh
apache-hive-2.1.1-bin/hcatalog/sbin/webhcat_server.sh
apache-hive-2.1.1-bin/conf/hive-log4j2.properties.template
apache-hive-2.1.1-bin/conf/hive-exec-log4j2.properties.template
apache-hive-2.1.1-bin/conf/beeline-log4j2.properties.template
apache-hive-2.1.1-bin/conf/llap-daemon-log4j2.properties.template
apache-hive-2.1.1-bin/conf/llap-cli-log4j2.properties.template
apache-hive-2.1.1-bin/conf/parquet-logging.properties
apache-hive-2.1.1-bin/hcatalog/share/doc/hcatalog/README.txt
apache-hive-2.1.1-bin/lib/hive-common-2.1.1.jar
apache-hive-2.1.1-bin/lib/hive-shims-2.1.1.jar
apache-hive-2.1.1-bin/lib/hive-shims-common-2.1.1.jar
apache-hive-2.1.1-bin/lib/log4j-slf4j-impl-2.4.1.jar
apache-hive-2.1.1-bin/lib/log4j-api-2.4.1.jar
apache-hive-2.1.1-bin/lib/guava-14.0.1.jar
apache-hive-2.1.1-bin/lib/commons-lang-2.6.jar
apache-hive-2.1.1-bin/lib/libthrift-0.9.3.jar
apache-hive-2.1.1-bin/lib/httpclient-4.4.jar
apache-hive-2.1.1-bin/lib/httpcore-4.4.jar
apache-hive-2.1.1-bin/lib/commons-logging-1.2.jar
apache-hive-2.1.1-bin/lib/commons-codec-1.4.jar
apache-hive-2.1.1-bin/lib/curator-framework-2.6.0.jar
apache-hive-2.1.1-bin/lib/curator-client-2.6.0.jar
apache-hive-2.1.1-bin/lib/zookeeper-3.4.6.jar
apache-hive-2.1.1-bin/lib/jline-2.12.jar
apache-hive-2.1.1-bin/lib/netty-3.7.0.Final.jar
apache-hive-2.1.1-bin/lib/hive-shims-0.23-2.1.1.jar
apache-hive-2.1.1-bin/lib/guice-servlet-3.0.jar
apache-hive-2.1.1-bin/lib/guice-3.0.jar
apache-hive-2.1.1-bin/lib/javax.inject-1.jar
apache-hive-2.1.1-bin/lib/aopalliance-1.0.jar
apache-hive-2.1.1-bin/lib/protobuf-java-2.5.0.jar
apache-hive-2.1.1-bin/lib/commons-io-2.4.jar
apache-hive-2.1.1-bin/lib/activation-1.1.jar
apache-hive-2.1.1-bin/lib/jackson-jaxrs-1.9.2.jar
apache-hive-2.1.1-bin/lib/jackson-xc-1.9.2.jar
apache-hive-2.1.1-bin/lib/jersey-server-1.14.jar
apache-hive-2.1.1-bin/lib/asm-3.1.jar
apache-hive-2.1.1-bin/lib/commons-compress-1.9.jar
apache-hive-2.1.1-bin/lib/jetty-util-6.1.26.jar
apache-hive-2.1.1-bin/lib/jersey-client-1.9.jar
apache-hive-2.1.1-bin/lib/commons-cli-1.2.jar
apache-hive-2.1.1-bin/lib/commons-collections-3.2.2.jar
apache-hive-2.1.1-bin/lib/commons-httpclient-3.0.1.jar
apache-hive-2.1.1-bin/lib/junit-4.11.jar
apache-hive-2.1.1-bin/lib/hamcrest-core-1.3.jar
apache-hive-2.1.1-bin/lib/jetty-6.1.26.jar
apache-hive-2.1.1-bin/lib/hive-shims-scheduler-2.1.1.jar
apache-hive-2.1.1-bin/lib/hive-storage-api-2.1.1.jar
apache-hive-2.1.1-bin/lib/hive-orc-2.1.1.jar
apache-hive-2.1.1-bin/lib/jasper-compiler-5.5.23.jar
apache-hive-2.1.1-bin/lib/jasper-runtime-5.5.23.jar
apache-hive-2.1.1-bin/lib/commons-el-1.0.jar
apache-hive-2.1.1-bin/lib/gson-2.2.4.jar
apache-hive-2.1.1-bin/lib/curator-recipes-2.6.0.jar
apache-hive-2.1.1-bin/lib/jsr305-3.0.0.jar
apache-hive-2.1.1-bin/lib/snappy-0.2.jar
apache-hive-2.1.1-bin/lib/jetty-all-7.6.0.v20120127.jar
apache-hive-2.1.1-bin/lib/geronimo-jta_1.1_spec-1.1.1.jar
apache-hive-2.1.1-bin/lib/mail-1.4.1.jar
apache-hive-2.1.1-bin/lib/geronimo-jaspic_1.0_spec-1.0.jar
apache-hive-2.1.1-bin/lib/geronimo-annotation_1.0_spec-1.1.1.jar
apache-hive-2.1.1-bin/lib/asm-commons-3.1.jar
apache-hive-2.1.1-bin/lib/asm-tree-3.1.jar
apache-hive-2.1.1-bin/lib/javax.servlet-3.0.0.v201112011016.jar
apache-hive-2.1.1-bin/lib/joda-time-2.5.jar
apache-hive-2.1.1-bin/lib/log4j-1.2-api-2.4.1.jar
apache-hive-2.1.1-bin/lib/log4j-core-2.4.1.jar
apache-hive-2.1.1-bin/lib/log4j-web-2.4.1.jar
apache-hive-2.1.1-bin/lib/ant-1.9.1.jar
apache-hive-2.1.1-bin/lib/ant-launcher-1.9.1.jar
apache-hive-2.1.1-bin/lib/json-20090211.jar
apache-hive-2.1.1-bin/lib/metrics-core-3.1.0.jar
apache-hive-2.1.1-bin/lib/metrics-jvm-3.1.0.jar
apache-hive-2.1.1-bin/lib/metrics-json-3.1.0.jar
apache-hive-2.1.1-bin/lib/jackson-databind-2.4.2.jar
apache-hive-2.1.1-bin/lib/jackson-annotations-2.4.0.jar
apache-hive-2.1.1-bin/lib/jackson-core-2.4.2.jar
apache-hive-2.1.1-bin/lib/dropwizard-metrics-hadoop-metrics2-reporter-0.1.2.jar
apache-hive-2.1.1-bin/lib/hive-serde-2.1.1.jar
apache-hive-2.1.1-bin/lib/hive-service-rpc-2.1.1.jar
apache-hive-2.1.1-bin/lib/jsp-api-2.0.jar
apache-hive-2.1.1-bin/lib/servlet-api-2.4.jar
apache-hive-2.1.1-bin/lib/ant-1.6.5.jar
apache-hive-2.1.1-bin/lib/libfb303-0.9.3.jar
apache-hive-2.1.1-bin/lib/avro-1.7.7.jar
apache-hive-2.1.1-bin/lib/paranamer-2.3.jar
apache-hive-2.1.1-bin/lib/snappy-java-1.0.5.jar
apache-hive-2.1.1-bin/lib/opencsv-2.3.jar
apache-hive-2.1.1-bin/lib/parquet-hadoop-bundle-1.8.1.jar
apache-hive-2.1.1-bin/lib/hive-metastore-2.1.1.jar
apache-hive-2.1.1-bin/lib/javolution-5.5.1.jar
apache-hive-2.1.1-bin/lib/hbase-client-1.1.1.jar
apache-hive-2.1.1-bin/lib/hbase-annotations-1.1.1.jar
apache-hive-2.1.1-bin/lib/findbugs-annotations-1.3.9-1.jar
apache-hive-2.1.1-bin/lib/hbase-common-1.1.1.jar
apache-hive-2.1.1-bin/lib/hbase-protocol-1.1.1.jar
apache-hive-2.1.1-bin/lib/htrace-core-3.1.0-incubating.jar
apache-hive-2.1.1-bin/lib/netty-all-4.0.23.Final.jar
apache-hive-2.1.1-bin/lib/jcodings-1.0.8.jar
apache-hive-2.1.1-bin/lib/joni-2.1.2.jar
apache-hive-2.1.1-bin/lib/bonecp-0.8.0.RELEASE.jar
apache-hive-2.1.1-bin/lib/derby-10.10.2.0.jar
apache-hive-2.1.1-bin/lib/datanucleus-api-jdo-4.2.1.jar
apache-hive-2.1.1-bin/lib/datanucleus-core-4.1.6.jar
apache-hive-2.1.1-bin/lib/datanucleus-rdbms-4.1.7.jar
apache-hive-2.1.1-bin/lib/commons-pool-1.5.4.jar
apache-hive-2.1.1-bin/lib/commons-dbcp-1.4.jar
apache-hive-2.1.1-bin/lib/jdo-api-3.0.1.jar
apache-hive-2.1.1-bin/lib/jta-1.1.jar
apache-hive-2.1.1-bin/lib/javax.jdo-3.2.0-m3.jar
apache-hive-2.1.1-bin/lib/transaction-api-1.1.jar
apache-hive-2.1.1-bin/lib/antlr-runtime-3.4.jar
apache-hive-2.1.1-bin/lib/stringtemplate-3.2.1.jar
apache-hive-2.1.1-bin/lib/antlr-2.7.7.jar
apache-hive-2.1.1-bin/lib/tephra-api-0.6.0.jar
apache-hive-2.1.1-bin/lib/tephra-core-0.6.0.jar
apache-hive-2.1.1-bin/lib/guice-assistedinject-3.0.jar
apache-hive-2.1.1-bin/lib/fastutil-6.5.6.jar
apache-hive-2.1.1-bin/lib/twill-common-0.6.0-incubating.jar
apache-hive-2.1.1-bin/lib/twill-core-0.6.0-incubating.jar
apache-hive-2.1.1-bin/lib/twill-api-0.6.0-incubating.jar
apache-hive-2.1.1-bin/lib/twill-discovery-api-0.6.0-incubating.jar
apache-hive-2.1.1-bin/lib/twill-zookeeper-0.6.0-incubating.jar
apache-hive-2.1.1-bin/lib/twill-discovery-core-0.6.0-incubating.jar
apache-hive-2.1.1-bin/lib/tephra-hbase-compat-1.0-0.6.0.jar
apache-hive-2.1.1-bin/lib/hive-testutils-2.1.1.jar
apache-hive-2.1.1-bin/lib/tempus-fugit-1.1.jar
apache-hive-2.1.1-bin/lib/hive-exec-2.1.1.jar
apache-hive-2.1.1-bin/lib/hive-ant-2.1.1.jar
apache-hive-2.1.1-bin/lib/velocity-1.5.jar
apache-hive-2.1.1-bin/lib/hive-llap-tez-2.1.1.jar
apache-hive-2.1.1-bin/lib/hive-llap-client-2.1.1.jar
apache-hive-2.1.1-bin/lib/hive-llap-common-2.1.1.jar
apache-hive-2.1.1-bin/lib/commons-lang3-3.1.jar
apache-hive-2.1.1-bin/lib/ST4-4.0.4.jar
apache-hive-2.1.1-bin/lib/ivy-2.4.0.jar
apache-hive-2.1.1-bin/lib/apache-curator-2.6.0.pom
apache-hive-2.1.1-bin/lib/groovy-all-2.4.4.jar
apache-hive-2.1.1-bin/lib/calcite-core-1.6.0.jar
apache-hive-2.1.1-bin/lib/calcite-avatica-1.6.0.jar
apache-hive-2.1.1-bin/lib/calcite-linq4j-1.6.0.jar
apache-hive-2.1.1-bin/lib/eigenbase-properties-1.1.5.jar
apache-hive-2.1.1-bin/lib/janino-2.7.6.jar
apache-hive-2.1.1-bin/lib/commons-compiler-2.7.6.jar
apache-hive-2.1.1-bin/lib/pentaho-aggdesigner-algorithm-5.1.5-jhyde.jar
apache-hive-2.1.1-bin/lib/stax-api-1.0.1.jar
apache-hive-2.1.1-bin/lib/hive-service-2.1.1.jar
apache-hive-2.1.1-bin/lib/hive-llap-server-2.1.1.jar
apache-hive-2.1.1-bin/lib/slider-core-0.90.2-incubating.jar
apache-hive-2.1.1-bin/lib/jcommander-1.32.jar
apache-hive-2.1.1-bin/lib/jsp-api-2.1.jar
apache-hive-2.1.1-bin/lib/hbase-hadoop2-compat-1.1.1.jar
apache-hive-2.1.1-bin/lib/hbase-hadoop-compat-1.1.1.jar
apache-hive-2.1.1-bin/lib/commons-math-2.2.jar
apache-hive-2.1.1-bin/lib/metrics-core-2.2.0.jar
apache-hive-2.1.1-bin/lib/hbase-server-1.1.1.jar
apache-hive-2.1.1-bin/lib/hbase-procedure-1.1.1.jar
apache-hive-2.1.1-bin/lib/hbase-common-1.1.1-tests.jar
apache-hive-2.1.1-bin/lib/hbase-prefix-tree-1.1.1.jar
apache-hive-2.1.1-bin/lib/jetty-sslengine-6.1.26.jar
apache-hive-2.1.1-bin/lib/jsp-2.1-6.1.14.jar
apache-hive-2.1.1-bin/lib/jsp-api-2.1-6.1.14.jar
apache-hive-2.1.1-bin/lib/servlet-api-2.5-6.1.14.jar
apache-hive-2.1.1-bin/lib/jamon-runtime-2.3.1.jar
apache-hive-2.1.1-bin/lib/disruptor-3.3.0.jar
apache-hive-2.1.1-bin/lib/jpam-1.1.jar
apache-hive-2.1.1-bin/lib/hive-jdbc-2.1.1.jar
apache-hive-2.1.1-bin/lib/hive-beeline-2.1.1.jar
apache-hive-2.1.1-bin/lib/super-csv-2.2.0.jar
apache-hive-2.1.1-bin/lib/hive-cli-2.1.1.jar
apache-hive-2.1.1-bin/lib/hive-contrib-2.1.1.jar
apache-hive-2.1.1-bin/lib/hive-hbase-handler-2.1.1.jar
apache-hive-2.1.1-bin/lib/hbase-hadoop2-compat-1.1.1-tests.jar
apache-hive-2.1.1-bin/lib/hive-hwi-2.1.1.jar
apache-hive-2.1.1-bin/lib/jetty-all-server-7.6.0.v20120127.jar
apache-hive-2.1.1-bin/lib/hive-accumulo-handler-2.1.1.jar
apache-hive-2.1.1-bin/lib/accumulo-core-1.6.0.jar
apache-hive-2.1.1-bin/lib/accumulo-fate-1.6.0.jar
apache-hive-2.1.1-bin/lib/accumulo-start-1.6.0.jar
apache-hive-2.1.1-bin/lib/commons-vfs2-2.0.jar
apache-hive-2.1.1-bin/lib/maven-scm-api-1.4.jar
apache-hive-2.1.1-bin/lib/plexus-utils-1.5.6.jar
apache-hive-2.1.1-bin/lib/maven-scm-provider-svnexe-1.4.jar
apache-hive-2.1.1-bin/lib/maven-scm-provider-svn-commons-1.4.jar
apache-hive-2.1.1-bin/lib/regexp-1.3.jar
apache-hive-2.1.1-bin/lib/accumulo-trace-1.6.0.jar
apache-hive-2.1.1-bin/lib/hive-llap-ext-client-2.1.1.jar
apache-hive-2.1.1-bin/lib/hive-hplsql-2.1.1.jar
apache-hive-2.1.1-bin/lib/antlr4-runtime-4.5.jar
apache-hive-2.1.1-bin/lib/org.abego.treelayout.core-1.0.1.jar
apache-hive-2.1.1-bin/jdbc/hive-jdbc-2.1.1-standalone.jar
apache-hive-2.1.1-bin/hcatalog/share/hcatalog/hive-hcatalog-streaming-2.1.1.jar
apache-hive-2.1.1-bin/hcatalog/share/hcatalog/hive-hcatalog-core-2.1.1.jar
apache-hive-2.1.1-bin/hcatalog/share/hcatalog/hive-hcatalog-pig-adapter-2.1.1.jar
apache-hive-2.1.1-bin/hcatalog/share/hcatalog/hive-hcatalog-server-extensions-2.1.1.jar
apache-hive-2.1.1-bin/hcatalog/share/webhcat/svr/lib/jersey-json-1.14.jar
apache-hive-2.1.1-bin/hcatalog/share/webhcat/svr/lib/jettison-1.1.jar
apache-hive-2.1.1-bin/hcatalog/share/webhcat/svr/lib/jaxb-impl-2.2.3-1.jar
apache-hive-2.1.1-bin/hcatalog/share/webhcat/svr/lib/jackson-core-asl-1.9.2.jar
apache-hive-2.1.1-bin/hcatalog/share/webhcat/svr/lib/jersey-core-1.14.jar
apache-hive-2.1.1-bin/hcatalog/share/webhcat/svr/lib/hive-webhcat-2.1.1.jar
apache-hive-2.1.1-bin/hcatalog/share/webhcat/svr/lib/jersey-servlet-1.14.jar
apache-hive-2.1.1-bin/hcatalog/share/webhcat/svr/lib/wadl-resourcedoc-doclet-1.4.jar
apache-hive-2.1.1-bin/hcatalog/share/webhcat/svr/lib/commons-exec-1.1.jar
apache-hive-2.1.1-bin/hcatalog/share/webhcat/svr/lib/jul-to-slf4j-1.7.10.jar
apache-hive-2.1.1-bin/hcatalog/share/webhcat/java-client/hive-webhcat-java-client-2.1.1.jar
 
cs



Steps 3 and 4 are optional!


3. Move the extracted directory.

$ sudo mv ~/apache-hive-2.1.1-bin /usr/local/
cs



4. Create a symbolic link.

$ sudo ln -sf /usr/local/apache-hive-2.1.1-bin/ /usr/local/hive
cs



5. Register the Hive home path in the /etc/profile file.
5-1. Edit the profile file.

$ sudo vi /etc/profile
cs


5-2. Add the following at the bottom of the profile file and save.

export HIVE_HOME=/usr/local/hive
export PATH=$PATH:$HIVE_HOME/bin
cs


5-3. Apply the changes.

$ source /etc/profile
cs



6. Register the Hive home path in the .bashrc file.
6-1. Edit the .bashrc file.

$ sudo vi ~/.bashrc
cs

* The .bashrc file is under the home directory of the account you are logged in as.


6-2. Add the following at the bottom of the .bashrc file and save.
# Hive PATH
export HIVE_HOME=/usr/local/hive
# Add Hadoop and Hive bin/ directories to PATH
export PATH=$PATH:$HADOOP_HOME/bin:$JAVA_HOME/bin:$HADOOP_HOME/sbin:$HIVE_HOME/bin
cs



6-3. Apply the changes.

$ source ~/.bashrc
cs
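To confirm that the new environment variables were picked up by the current shell, a quick check such as the following can be used (a minimal sketch; the values shown are what this guide's paths should produce):

$ echo $HIVE_HOME
/usr/local/hive
$ which hive
/usr/local/hive/bin/hive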



7. Create directories in HDFS.

7-1. /tmp

$ hdfs dfs -mkdir /tmp
cs

7-2. /user/hive/warehouse
$ hdfs dfs -mkdir /user/hive/warehouse
cs

* If the parent directories /user and /user/hive do not exist, create them first before proceeding (see the sketch below).
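For reference, the parent directories can also be created in a single step with the -p option, which creates any missing intermediate directories:

$ hdfs dfs -mkdir -p /tmp
$ hdfs dfs -mkdir -p /user/hive/warehouse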

8. Grant access permissions on the created directories.

8-1. /tmp

$ hdfs dfs -chmod g+w /tmp
cs


8-2. /user/hive/warehouse

$ hdfs dfs -chmod g+w /user/hive/warehouse
cs
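g+w adds group write permission, which Hive needs in order to create database and table directories under the warehouse path. Assuming the commands above succeeded, the result can be checked as follows; the permission column for /tmp and /user/hive/warehouse should now include group write (e.g. drwxrwxr-x):

$ hdfs dfs -ls /
$ hdfs dfs -ls /user/hive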



9. Download the MySQL JDBC driver.

$ wget http://repo.maven.apache.org/maven2/mysql/mysql-connector-java/6.0.6/mysql-connector-java-6.0.6.jar
--2017-10-13 18:20:57--  http://repo.maven.apache.org/maven2/mysql/mysql-connector-java/6.0.6/mysql-connector-java-6.0.6.jar
Resolving repo.maven.apache.org (repo.maven.apache.org)... 151.101.40.215
Connecting to repo.maven.apache.org (repo.maven.apache.org)|151.101.40.215|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 2001778 (1.9M) [application/java-archive]
Saving to: ‘mysql-connector-java-6.0.6.jar’
 
mysql-connector-java-6. 100%[============================>]   1.91M  30.0KB/s    in 3m 22s
 
2017-10-13 18:24:20 (9.70 KB/s) - ‘mysql-connector-java-6.0.6.jar’ saved [2001778/2001778]
 
cs



10. Copy or move the mysql-connector-java-6.0.6.jar file into the hive/lib directory.

$ cp ~/mysql-connector-java-6.0.6.jar /usr/local/hive/lib/
cs

* /usr/local/hive is the symbolic link that points to apache-hive-2.1.1-bin.
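A quick way to confirm that the connector ended up where Hive will look for it (assuming the file name downloaded in step 9):

$ ls -l /usr/local/hive/lib/ | grep mysql-connector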



11. Copy the hive-env.sh.template file in the hive/conf directory to hive-env.sh.

$ cp /usr/local/hive/conf/hive-env.sh.template /usr/local/hive/conf/hive-env.sh
cs



12. Edit the hive-env.sh file.

$ vi /usr/local/hive/conf/hive-env.sh
cs



13. Add the following to the hive-env.sh file and save.

HADOOP_HOME=/usr/local/hadoop
cs

* Check your Hadoop home path before entering this value.
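If you prefer not to edit the file interactively, the same line can be appended from the shell and then checked; this sketch assumes Hadoop is installed at /usr/local/hadoop as elsewhere in this guide:

$ echo 'HADOOP_HOME=/usr/local/hadoop' >> /usr/local/hive/conf/hive-env.sh
$ grep HADOOP_HOME /usr/local/hive/conf/hive-env.sh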



14. Copy the hive-log4j2.properties.template file to hive-log4j2.properties.

$ cp /usr/local/hive/conf/hive-log4j2.properties.template /usr/local/hive/conf/hive-log4j2.properties
cs



15. Copy the hive-default.xml.template file to hive-site.xml.

$ cp /usr/local/hive/conf/hive-default.xml.template /usr/local/hive/conf/hive-site.xml
cs



16. Edit the hive-site.xml file.

$ vi /usr/local/hive/conf/hive-site.xml
cs

* When you open the file for editing, you will see configuration values spread over a very large number of lines.

* To display line numbers in vi, type : and then enter set nu.

* To delete unneeded lines, typing 123dd removes 123 lines starting at the cursor position.

<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?><!--
   Licensed to the Apache Software Foundation (ASF) under one or more
   contributor license agreements.  See the NOTICE file distributed with
   this work for additional information regarding copyright ownership.
   The ASF licenses this file to You under the Apache License, Version 2.0
   (the "License"); you may not use this file except in compliance with
   the License.  You may obtain a copy of the License at
       http://www.apache.org/licenses/LICENSE-2.0
   Unless required by applicable law or agreed to in writing, software
   distributed under the License is distributed on an "AS IS" BASIS,
   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
   See the License for the specific language governing permissions and
   limitations under the License.
-->
<configuration>
  <!-- WARNING!!! This file is auto generated for documentation purposes ONLY! -->
  <!-- WARNING!!! Any changes you make to this file will be ignored by Hive.   -->
  <!-- WARNING!!! You must make your changes in hive-site.xml instead.         -->
  <!-- Hive Execution Parameters -->
</configuration>
 
cs


17. Modify hive-site.xml so that it contains the following.

<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<configuration>
<property>
        <name>hive.metastore.warehouse.dir</name>
        <value>/user/hive/warehouse</value>
</property>
<property>
        <name>javax.jdo.option.ConnectionURL</name>
        <value>jdbc:mysql://localhost:3306/hive?createDatabaseIfNotExist=true</value>
</property>
<property>
        <name>javax.jdo.option.ConnectionDriverName</name>
        <value>com.mysql.jdbc.Driver</value>
</property>
<property>
        <name>javax.jdo.option.ConnectionUserName</name>
        <value>hive</value>
</property>
<property>
        <name>javax.jdo.option.ConnectionPassword</name>
        <value>pass</value>
</property>
</configuration>
 
cs

* Replace hive with the account ID you created.

* Replace pass with that account's password.
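Before moving on, it is worth confirming that the connection settings above actually work by logging in to MySQL with the same account (a minimal check, assuming the hive account created earlier and MySQL listening on localhost:3306):

$ mysql -u hive -p -h 127.0.0.1 -P 3306 -e 'SELECT VERSION();'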



18. Edit the hadoop-env.sh file under the Hadoop directory so that Hadoop can pick up Hive's lib and conf directories.

$ vi /usr/local/hadoop/etc/hadoop/hadoop-env.sh
cs



19. Add the following at the bottom of the hadoop-env.sh file.

export HADOOP_CLASSPATH=$HIVE_HOME/conf:$HIVE_HOME/lib
for f in ${HIVE_HOME}/lib/*.jar; do
   HADOOP_CLASSPATH=${HADOOP_CLASSPATH}:$f;
done
cs
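The loop above appends every jar under $HIVE_HOME/lib to Hadoop's classpath. Whether the Hive jars are actually picked up can be checked with the hadoop classpath command (an optional sanity check, not required for the installation):

$ hadoop classpath | tr ':' '\n' | grep hive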



20. Move to the /usr/local/hive/scripts/metastore/upgrade/mysql/ directory.

$ cd /usr/local/hive/scripts/metastore/upgrade/mysql/
cs



21. Create the schema in the hive database. Choose one of the two methods below and proceed.

21-1. Method 1: run the schema script through the mysql client

$ mysql -u hive -p hive < /usr/local/hive/scripts/metastore/upgrade/mysql/hive-schema-2.1.0.mysql.sql
Enter password:
 
cs

* Before creating the schema, make sure the schema script matches the installed Hive version.


21-2. Method 2: run /usr/local/hive/bin/schematool

$ /usr/local/hive/bin/schematool -initSchema -dbType mysql -verbose
cs

* This method is possible only after hive-site.xml has been configured.
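Whichever method you choose, the result can be verified afterwards: schematool can report the schema version it finds in the metastore, and the metastore tables can be listed directly in MySQL (a sketch, assuming the hive database and account from the earlier steps):

$ /usr/local/hive/bin/schematool -dbType mysql -info
$ mysql -u hive -p -e 'USE hive; SHOW TABLES;'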



22. Run the Hive metastore in the background.

$ ./bin/hive --service metastore &
[1] 8095
hadoop-user@master:/usr/local/hive$ Starting Hive Metastore Server
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/local/apache-hive-2.1.1-bin/lib/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/hadoop-2.8.1/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
 
cs
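By default the metastore Thrift service listens on port 9083, so a rough health check looks like this (the metastore typically shows up as a RunJar process in jps; ss -nlpt works as well if netstat is not installed):

$ jps
$ netstat -nlpt 2>/dev/null | grep 9083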



23. Connect to Hive (launch the Hive CLI).

$ /usr/local/hive/bin/hive
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/local/apache-hive-2.1.1-bin/lib/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/hadoop-2.8.1/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
 
Logging initialized using configuration in file:/usr/local/apache-hive-2.1.1-bin/conf/hive-log4j2.properties Async: true
Hive-on-MR is deprecated in Hive 2 and may not be available in the future versions. Consider using a different execution engine (i.e. spark, tez) or using Hive 1.X releases.
 
cs

* "HIVE_HOME = /usr/local/apache-hive-2.1.1-bin" 와 같습니다.

* hadoop 관련 서비스가 먼저 실행되어 있어야 합니다.



24. Verify that Hive is working correctly

24-1. databases

hive> show databases;
OK
default
Time taken: 3.065 seconds
 
cs


24-2. tables

hive> show tables;
OK
Time taken: 0.018 seconds
 
cs
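As an additional smoke test, a throwaway table can be created to confirm that its directory appears under the warehouse path configured in hive-site.xml (the table name test below is arbitrary):

hive> CREATE TABLE test (id INT);
hive> SHOW TABLES;
hive> quit;

$ hdfs dfs -ls /user/hive/warehouse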




1. Create an input directory in HDFS

$ hdfs dfs -mkdir /input
 
cs



2. Store a test file in HDFS

2-1. Sample data

$ ll /usr/local/hadoop
total 168
drwxrwxr-x 12 hadoop-user hadoop  4096 Oct 11 11:19 ./
drwxr-xr-x 14 root        root    4096 Oct 10 16:47 ../
drwxrwxr-x  2 hadoop-user hadoop  4096 Jun  2 15:24 bin/
drwxrwxr-x  3 hadoop-user hadoop  4096 Jun  2 15:24 etc/
drwxrwxr-x  2 hadoop-user hadoop  4096 Jun  2 15:24 include/
drwxrwxr-x  3 hadoop-user hadoop  4096 Jun  2 15:24 lib/
drwxrwxr-x  2 hadoop-user hadoop  4096 Jun  2 15:24 libexec/
-rw-rw-r--  1 hadoop-user hadoop 99253 Jun  2 15:24 LICENSE.txt
drwxr-xr-x  3 hadoop-user hadoop  4096 Oct 13 12:37 logs/
-rw-------  1 hadoop-user hadoop     0 Oct 11 11:19 nohup.out
-rw-rw-r--  1 hadoop-user hadoop 15915 Jun  2 15:24 NOTICE.txt
-rw-r--r--  1 hadoop-user hadoop  1366 Jun  2 15:24 README.txt
drwxrwxr-x  3 hadoop-user hadoop  4096 Sep 27 13:03 sbin/
drwxrwxr-x  4 hadoop-user hadoop  4096 Jun  2 15:24 share/
drwxr-xr-x  3 hadoop-user hadoop  4096 Oct 10 13:42 tmp/
drwxr-xr-x  3 hadoop-user hadoop  4096 Sep 25 11:10 yarn_data/
 
cs

* The README.txt file inside the Hadoop directory will be used as the sample data.


2-2. Copy the sample data to HDFS

$ hdfs dfs -put /usr/local/hadoop/README.txt /input
 
cs


2-2-1. Verify that the sample data was copied

$ hdfs dfs -ls /input/
Found 1 items
-rw-r--r--   3 hadoop-user supergroup       1366 2017-10-13 13:01 /input/README.txt
 
cs

* Seeing the README.txt file in HDFS means the data preparation for the test is complete.



3. Run wordcount

$ hadoop jar /usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.8.1.jar wordcount /input/README.txt /output
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/local/hadoop-2.8.1/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/hive-0.8.1/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
17/10/13 13:15:45 INFO client.RMProxy: Connecting to ResourceManager at master/192.168.10.30:8032
17/10/13 13:15:46 INFO input.FileInputFormat: Total input files to process : 1
17/10/13 13:15:46 INFO mapreduce.JobSubmitter: number of splits:1
17/10/13 13:15:47 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1507865821865_0002
17/10/13 13:15:47 INFO impl.YarnClientImpl: Submitted application application_1507865821865_0002
17/10/13 13:15:47 INFO mapreduce.Job: The url to track the job: http://master:8088/proxy/application_1507865821865_0002/
17/10/13 13:15:47 INFO mapreduce.Job: Running job: job_1507865821865_0002
17/10/13 13:15:54 INFO mapreduce.Job: Job job_1507865821865_0002 running in uber mode : false
17/10/13 13:15:54 INFO mapreduce.Job:  map 0% reduce 0%
17/10/13 13:15:58 INFO mapreduce.Job:  map 100% reduce 0%
17/10/13 13:16:03 INFO mapreduce.Job:  map 100% reduce 100%
17/10/13 13:16:03 INFO mapreduce.Job: Job job_1507865821865_0002 completed successfully
17/10/13 13:16:03 INFO mapreduce.Job: Counters: 49
        File System Counters
                FILE: Number of bytes read=1836
                FILE: Number of bytes written=276889
                FILE: Number of read operations=0
                FILE: Number of large read operations=0
                FILE: Number of write operations=0
                HDFS: Number of bytes read=1466
                HDFS: Number of bytes written=1306
                HDFS: Number of read operations=6
                HDFS: Number of large read operations=0
                HDFS: Number of write operations=2
        Job Counters
                Launched map tasks=1
                Launched reduce tasks=1
                Data-local map tasks=1
                Total time spent by all maps in occupied slots (ms)=2318
                Total time spent by all reduces in occupied slots (ms)=2343
                Total time spent by all map tasks (ms)=2318
                Total time spent by all reduce tasks (ms)=2343
                Total vcore-milliseconds taken by all map tasks=2318
                Total vcore-milliseconds taken by all reduce tasks=2343
                Total megabyte-milliseconds taken by all map tasks=2373632
                Total megabyte-milliseconds taken by all reduce tasks=2399232
        Map-Reduce Framework
                Map input records=31
                Map output records=179
                Map output bytes=2055
                Map output materialized bytes=1836
                Input split bytes=100
                Combine input records=179
                Combine output records=131
                Reduce input groups=131
                Reduce shuffle bytes=1836
                Reduce input records=131
                Reduce output records=131
                Spilled Records=262
                Shuffled Maps =1
                Failed Shuffles=0
                Merged Map outputs=1
                GC time elapsed (ms)=168
                CPU time spent (ms)=1250
                Physical memory (bytes) snapshot=465510400
                Virtual memory (bytes) snapshot=3973697536
                Total committed heap usage (bytes)=354942976
        Shuffle Errors
                BAD_ID=0
                CONNECTION=0
                IO_ERROR=0
                WRONG_LENGTH=0
                WRONG_MAP=0
                WRONG_REDUCE=0
        File Input Format Counters
                Bytes Read=1366
        File Output Format Counters
                Bytes Written=1306
 
cs

* This is the result of running wordcount from the hadoop-mapreduce-examples-2.8.1.jar file that ships with the Hadoop installation.

* The output is written to the /output directory specified on the command line.

* Note: the output directory does not need to be created in advance; it is created automatically at run time (but see the caveat below about re-running).
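One caveat when re-running the example: the job fails if the output directory already exists, so remove it first:

$ hdfs dfs -rm -r /output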



4. Check the output files after running wordcount

$ hdfs dfs -ls /output
Found 2 items
-rw-r--r--   3 hadoop-user supergroup          0 2017-10-13 13:16 /output/_SUCCESS
-rw-r--r--   3 hadoop-user supergroup       1306 2017-10-13 13:16 /output/part-r-00000
 
cs

* Check, as shown above, that the part-r-00000 file was created in the /output directory.



5. Check the output contents

$ hdfs dfs -cat /output/part-r-00000
(BIS),  1
(ECCN)  1
(TSU)   1
(see    1
5D002.C.1,      1
740.13) 1
<http://www.wassenaar.org/>     1
Administration  1
Apache  1
BEFORE  1
BIS     1
Bureau  1
Commerce,       1
Commodity       1
Control 1
Core    1
Department      1
ENC     1
Exception       1
Export  2
For     1
Foundation      1
Government      1
Hadoop  1
Hadoop, 1
Industry        1
Jetty   1
License 1
Number  1
Regulations,    1
SSL     1
Section 1
Security        1
See     1
Software        2
Technology      1
The     4
This    1
U.S.    1
Unrestricted    1
about   1
algorithms.     1
and     6
and/or  1
another 1
any     1
as      1
asymmetric      1
at:     2
both    1
by      1
check   1
classified      1
code    1
code.   1
concerning      1
country 1
country's       1
country,        1
cryptographic   3
currently       1
details 1
distribution    2
eligible        1
encryption      3
exception       1
export  1
following       1
for     3
form    1
from    1
functions       1
has     1
have    1
http://hadoop.apache.org/core/  1
http://wiki.apache.org/hadoop/  1
if      1
import, 2
in      1
included        1
includes        2
information     2
information.    1
is      1
it      1
latest  1
laws,   1
libraries       1
makes   1
manner  1
may     1
more    2
mortbay.org.    1
object  1
of      5
on      2
or      2
our     2
performing      1
permitted.      1
please  2
policies        1
possession,     2
project 1
provides        1
re-export       2
regulations     1
reside  1
restrictions    1
security        1
see     1
software        2
software,       2
software.       2
software:       1
source  1
the     8
this    3
to      2
under   1
use,    2
uses    1
using   2
visit   1
website 1
which   2
wiki,   1
with    1
written 1
you     1
your    1
 
cs

* Judging from the result above, there was not as much data as expected.




1. Run the pi class from the hadoop-mapreduce-examples-2.8.1.jar file

$ hadoop jar /usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.8.1.jar pi 10 1000
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/local/hadoop-2.8.1/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/hive-0.8.1/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Number of Maps  = 10
Samples per Map = 1000
Wrote input for Map #0
Wrote input for Map #1
Wrote input for Map #2
Wrote input for Map #3
Wrote input for Map #4
Wrote input for Map #5
Wrote input for Map #6
Wrote input for Map #7
Wrote input for Map #8
Wrote input for Map #9
Starting Job
17/10/13 12:44:58 INFO client.RMProxy: Connecting to ResourceManager at master/192.168.10.30:8032
17/10/13 12:44:59 INFO input.FileInputFormat: Total input files to process : 10
17/10/13 12:44:59 INFO mapreduce.JobSubmitter: number of splits:10
17/10/13 12:44:59 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1507865821865_0001
17/10/13 12:45:00 INFO impl.YarnClientImpl: Submitted application application_1507865821865_0001
17/10/13 12:45:00 INFO mapreduce.Job: The url to track the job: http://master:8088/proxy/application_1507865821865_0001/
17/10/13 12:45:00 INFO mapreduce.Job: Running job: job_1507865821865_0001
17/10/13 12:45:08 INFO mapreduce.Job: Job job_1507865821865_0001 running in uber mode : false
17/10/13 12:45:08 INFO mapreduce.Job:  map 0% reduce 0%
17/10/13 12:45:14 INFO mapreduce.Job:  map 20% reduce 0%
17/10/13 12:45:15 INFO mapreduce.Job:  map 100% reduce 0%
17/10/13 12:45:18 INFO mapreduce.Job:  map 100% reduce 100%
17/10/13 12:45:19 INFO mapreduce.Job: Job job_1507865821865_0001 completed successfully
17/10/13 12:45:19 INFO mapreduce.Job: Counters: 49
        File System Counters
                FILE: Number of bytes read=226
                FILE: Number of bytes written=1507099
                FILE: Number of read operations=0
                FILE: Number of large read operations=0
                FILE: Number of write operations=0
                HDFS: Number of bytes read=2670
                HDFS: Number of bytes written=215
                HDFS: Number of read operations=43
                HDFS: Number of large read operations=0
                HDFS: Number of write operations=3
        Job Counters
                Launched map tasks=10
                Launched reduce tasks=1
                Data-local map tasks=10
                Total time spent by all maps in occupied slots (ms)=48025
                Total time spent by all reduces in occupied slots (ms)=2513
                Total time spent by all map tasks (ms)=48025
                Total time spent by all reduce tasks (ms)=2513
                Total vcore-milliseconds taken by all map tasks=48025
                Total vcore-milliseconds taken by all reduce tasks=2513
                Total megabyte-milliseconds taken by all map tasks=49177600
                Total megabyte-milliseconds taken by all reduce tasks=2573312
        Map-Reduce Framework
                Map input records=10
                Map output records=20
                Map output bytes=180
                Map output materialized bytes=280
                Input split bytes=1490
                Combine input records=0
                Combine output records=0
                Reduce input groups=2
                Reduce shuffle bytes=280
                Reduce input records=20
                Reduce output records=0
                Spilled Records=40
                Shuffled Maps =10
                Failed Shuffles=0
                Merged Map outputs=10
                GC time elapsed (ms)=1297
                CPU time spent (ms)=5270
                Physical memory (bytes) snapshot=3010437120
                Virtual memory (bytes) snapshot=21824602112
                Total committed heap usage (bytes)=2163736576
        Shuffle Errors
                BAD_ID=0
                CONNECTION=0
                IO_ERROR=0
                WRONG_LENGTH=0
                WRONG_MAP=0
                WRONG_REDUCE=0
        File Input Format Counters
                Bytes Read=1180
        File Output Format Counters
                Bytes Written=97
Job Finished in 21.033 seconds
Estimated value of Pi is 3.14080000000000000000
 
cs

* The test uses a sample program shipped with the Hadoop installation (see the note below on listing the other examples).
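Besides pi and wordcount, the examples jar contains several other sample programs; running it without an argument prints the list of valid program names, which is a convenient way to see what else is available:

$ hadoop jar /usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.8.1.jar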



※ When formatting the name node

If a warning such as 'Unable to determine address of the host-falling back to "localhost" address' appears, the output looks like the following:

$ hdfs namenode -format
17/10/13 09:00:07 INFO namenode.NameNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting NameNode
STARTUP_MSG:   user = hadoop-user
STARTUP_MSG:   host = java.net.UnknownHostException: ubuntu: ubuntu: Name or service not known
STARTUP_MSG:   args = [-format]
STARTUP_MSG:   version = 2.8.1
STARTUP_MSG:   classpath = /usr/local/hadoop/etc/hadoop:/usr/local/hadoop/share/hadoop/common/lib/hadoop-annotations-2.8.1.jar:/usr/local/hadoop/share/hadoop/common/lib/jets3t-0.9.0.jar:/usr/local/hadoop/share/hadoop/common/lib/xmlenc-0.52.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-lang-2.6.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-collections-3.2.2.jar:/usr/local/hadoop/share/hadoop/common/lib/jaxb-api-2.2.2.jar:/usr/local/hadoop/share/hadoop/common/lib/json-smart-1.1.1.jar:/usr/local/hadoop/share/hadoop/common/lib/servlet-api-2.5.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-configuration-1.6.jar:/usr/local/hadoop/share/hadoop/common/lib/avro-1.7.4.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/common/lib/api-asn1-api-1.0.0-M20.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-math3-3.1.1.jar:/usr/local/hadoop/share/hadoop/common/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-json-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/jsr305-3.0.0.jar:/usr/local/hadoop/share/hadoop/common/lib/activation-1.1.jar:/usr/local/hadoop/share/hadoop/common/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/common/lib/paranamer-2.3.jar:/usr/local/hadoop/share/hadoop/common/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/common/lib/api-util-1.0.0-M20.jar:/usr/local/hadoop/share/hadoop/common/lib/htrace-core4-4.0.1-incubating.jar:/usr/local/hadoop/share/hadoop/common/lib/jcip-annotations-1.0.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-recipes-2.7.1.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-xc-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/stax-api-1.0-2.jar:/usr/local/hadoop/share/hadoop/common/lib/snappy-java-1.0.4.1.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-jaxrs-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/nimbus-jose-jwt-3.9.jar:/usr/local/hadoop/share/hadoop/common/lib/jetty-6.1.26.jar:/usr/local/hadoop/share/hadoop/common/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/common/lib/jettison-1.1.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-digester-1.8.jar:/usr/local/hadoop/share/hadoop/common/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar:/usr/local/hadoop/share/hadoop/common/lib/gson-2.2.4.jar:/usr/local/hadoop/share/hadoop/common/lib/jaxb-impl-2.2.3-1.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-client-2.7.1.jar:/usr/local/hadoop/share/hadoop/common/lib/httpcore-4.4.4.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/jsp-api-2.1.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-beanutils-1.7.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-beanutils-core-1.8.0.jar:/usr/local/hadoop/share/hadoop/common/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-net-3.1.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-framework-2.7.1.jar:/usr/local/hadoop/share/hadoop/common/lib/httpclient-4.5.2.jar:/usr/local/hadoop/share/hadoop/common/lib/jsch-0.1.51.jar:/usr/local/hadoop/share/hadoop/common/lib/mockito-all-1.8.5.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/common/lib/ap
acheds-i18n-2.0.0-M15.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-codec-1.4.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/common/lib/jetty-sslengine-6.1.26.jar:/usr/local/hadoop/share/hadoop/common/lib/hadoop-auth-2.8.1.jar:/usr/local/hadoop/share/hadoop/common/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/local/hadoop/share/hadoop/common/lib/slf4j-api-1.7.10.jar:/usr/local/hadoop/share/hadoop/common/lib/zookeeper-3.4.6.jar:/usr/local/hadoop/share/hadoop/common/lib/junit-4.11.jar:/usr/local/hadoop/share/hadoop/common/lib/hamcrest-core-1.3.jar:/usr/local/hadoop/share/hadoop/common/lib/java-xmlbuilder-0.4.jar:/usr/local/hadoop/share/hadoop/common/hadoop-nfs-2.8.1.jar:/usr/local/hadoop/share/hadoop/common/hadoop-common-2.8.1-tests.jar:/usr/local/hadoop/share/hadoop/common/hadoop-common-2.8.1.jar:/usr/local/hadoop/share/hadoop/hdfs:/usr/local/hadoop/share/hadoop/hdfs/lib/xmlenc-0.52.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-lang-2.6.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/servlet-api-2.5.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xml-apis-1.3.04.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jsr305-3.0.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/htrace-core4-4.0.1-incubating.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/okio-1.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jetty-6.1.26.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/okhttp-2.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/netty-all-4.0.23.Final.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/leveldbjni-all-1.8.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-daemon-1.0.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/hadoop-hdfs-client-2.8.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xercesImpl-2.9.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-codec-1.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-native-client-2.8.1-tests.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-native-client-2.8.1.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-2.8.1.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-nfs-2.8.1.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-client-2.8.1.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-client-2.8.1-tests.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-2.8.1-tests.jar:/usr/local/hadoop/share/hadoop/yarn/lib/aopalliance-1.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-lang-2.6.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-collections-3.2.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jaxb-api-2.2.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/servlet-api-2.5.jar:/usr/local/ha
doop/share/hadoop/yarn/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/yarn/lib/curator-test-2.7.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-json-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jsr305-3.0.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/zookeeper-3.4.6-tests.jar:/usr/local/hadoop/share/hadoop/yarn/lib/activation-1.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/guice-3.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-xc-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/stax-api-1.0-2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/javassist-3.18.1-GA.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jetty-6.1.26.jar:/usr/local/hadoop/share/hadoop/yarn/lib/guice-servlet-3.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jettison-1.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/leveldbjni-all-1.8.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/curator-client-2.7.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-math-2.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/javax.inject-1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-guice-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-client-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/fst-2.24.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-codec-1.4.jar:/usr/local/hadoop/share/hadoop/yarn/lib/objenesis-2.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/yarn/lib/zookeeper-3.4.6.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-api-2.8.1.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-tests-2.8.1.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-common-2.8.1.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-registry-2.8.1.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-sharedcachemanager-2.8.1.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-web-proxy-2.8.1.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-client-2.8.1.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.8.1.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-common-2.8.1.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-nodemanager-2.8.1.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-applications-distributedshell-2.8.1.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-timeline-pluginstorage-2.8.1.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-resourcemanager-2.8.1.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-applicationhistoryservice-2.8.1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/hadoop-annotations-2.8.1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/aopalliance-1.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jackson
-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/avro-1.7.4.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/paranamer-2.3.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/guice-3.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/guice-servlet-3.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/leveldbjni-all-1.8.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/javax.inject-1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jersey-guice-1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/junit-4.11.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/hamcrest-core-1.3.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-app-2.8.1.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-common-2.8.1.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.8.1-tests.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-plugins-2.8.1.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-core-2.8.1.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-shuffle-2.8.1.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.8.1.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-2.8.1.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.8.1.jar:/usr/local/hadoop/contrib/capacity-scheduler/*.jar
STARTUP_MSG:   build = https://git-wip-us.apache.org/repos/asf/hadoop.git -r 20fe5304904fc2f5a18053c389e43cd26f7a70fe; compiled by 'vinodkv' on 2017-06-02T06:14Z
STARTUP_MSG:   java = 1.8.0_144
************************************************************/
17/10/13 09:00:07 INFO namenode.NameNode: registered UNIX signal handlers for [TERM, HUP, INT]
17/10/13 09:00:07 INFO namenode.NameNode: createNameNode [-format]
Formatting using clusterid: CID-2b6a74d9-2461-487d-acdd-22184f6da5bd
17/10/13 09:00:07 INFO namenode.FSEditLog: Edit logging is async:false
17/10/13 09:00:07 INFO namenode.FSNamesystem: KeyProvider: null
17/10/13 09:00:07 INFO namenode.FSNamesystem: fsLock is fair: true
17/10/13 09:00:07 INFO namenode.FSNamesystem: Detailed lock hold time metrics enabled: false
17/10/13 09:00:08 INFO blockmanagement.DatanodeManager: dfs.block.invalidate.limit=1000
17/10/13 09:00:08 INFO blockmanagement.DatanodeManager: dfs.namenode.datanode.registration.ip-hostname-check=true
17/10/13 09:00:08 INFO blockmanagement.BlockManager: dfs.namenode.startup.delay.block.deletion.sec is set to 000:00:00:00.000
17/10/13 09:00:08 INFO blockmanagement.BlockManager: The block deletion will start around 2017 Oct 13 09:00:08
17/10/13 09:00:08 INFO util.GSet: Computing capacity for map BlocksMap
17/10/13 09:00:08 INFO util.GSet: VM type       = 64-bit
17/10/13 09:00:08 INFO util.GSet: 2.0% max memory 889 MB = 17.8 MB
17/10/13 09:00:08 INFO util.GSet: capacity      = 2^21 = 2097152 entries
17/10/13 09:00:08 INFO blockmanagement.BlockManager: dfs.block.access.token.enable=false
17/10/13 09:00:08 INFO blockmanagement.BlockManager: defaultReplication         = 1
17/10/13 09:00:08 INFO blockmanagement.BlockManager: maxReplication             = 512
17/10/13 09:00:08 INFO blockmanagement.BlockManager: minReplication             = 1
17/10/13 09:00:08 INFO blockmanagement.BlockManager: maxReplicationStreams      = 2
17/10/13 09:00:08 INFO blockmanagement.BlockManager: replicationRecheckInterval = 3000
17/10/13 09:00:08 INFO blockmanagement.BlockManager: encryptDataTransfer        = false
17/10/13 09:00:08 INFO blockmanagement.BlockManager: maxNumBlocksToLog          = 1000
17/10/13 09:00:08 INFO namenode.FSNamesystem: fsOwner             = hadoop-user (auth:SIMPLE)
17/10/13 09:00:08 INFO namenode.FSNamesystem: supergroup          = supergroup
17/10/13 09:00:08 INFO namenode.FSNamesystem: isPermissionEnabled = true
17/10/13 09:00:08 INFO namenode.FSNamesystem: HA Enabled: false
17/10/13 09:00:08 INFO namenode.FSNamesystem: Append Enabled: true
17/10/13 09:00:08 INFO util.GSet: Computing capacity for map INodeMap
17/10/13 09:00:08 INFO util.GSet: VM type       = 64-bit
17/10/13 09:00:08 INFO util.GSet: 1.0% max memory 889 MB = 8.9 MB
17/10/13 09:00:08 INFO util.GSet: capacity      = 2^20 = 1048576 entries
17/10/13 09:00:08 INFO namenode.FSDirectory: ACLs enabled? false
17/10/13 09:00:08 INFO namenode.FSDirectory: XAttrs enabled? true
17/10/13 09:00:08 INFO namenode.NameNode: Caching file names occurring more than 10 times
17/10/13 09:00:08 INFO util.GSet: Computing capacity for map cachedBlocks
17/10/13 09:00:08 INFO util.GSet: VM type       = 64-bit
17/10/13 09:00:08 INFO util.GSet: 0.25% max memory 889 MB = 2.2 MB
17/10/13 09:00:08 INFO util.GSet: capacity      = 2^18 = 262144 entries
17/10/13 09:00:08 INFO namenode.FSNamesystem: dfs.namenode.safemode.threshold-pct = 0.9990000128746033
17/10/13 09:00:08 INFO namenode.FSNamesystem: dfs.namenode.safemode.min.datanodes = 0
17/10/13 09:00:08 INFO namenode.FSNamesystem: dfs.namenode.safemode.extension     = 30000
17/10/13 09:00:08 INFO metrics.TopMetrics: NNTop conf: dfs.namenode.top.window.num.buckets = 10
17/10/13 09:00:08 INFO metrics.TopMetrics: NNTop conf: dfs.namenode.top.num.users = 10
17/10/13 09:00:08 INFO metrics.TopMetrics: NNTop conf: dfs.namenode.top.windows.minutes = 1,5,25
17/10/13 09:00:08 INFO namenode.FSNamesystem: Retry cache on namenode is enabled
17/10/13 09:00:08 INFO namenode.FSNamesystem: Retry cache will use 0.03 of total heap and retry cache entry expiry time is 600000 millis
17/10/13 09:00:08 INFO util.GSet: Computing capacity for map NameNodeRetryCache
17/10/13 09:00:08 INFO util.GSet: VM type       = 64-bit
17/10/13 09:00:08 INFO util.GSet: 0.029999999329447746% max memory 889 MB = 273.1 KB
17/10/13 09:00:08 INFO util.GSet: capacity      = 2^15 = 32768 entries
17/10/13 09:00:28 WARN net.DNS: Unable to determine local hostname -falling back to "localhost"
java.net.UnknownHostException: ubuntu: ubuntu: Name or service not known
        at java.net.InetAddress.getLocalHost(InetAddress.java:1505)
        at org.apache.hadoop.net.DNS.resolveLocalHostname(DNS.java:284)
        at org.apache.hadoop.net.DNS.<clinit>(DNS.java:61)
        at org.apache.hadoop.hdfs.server.namenode.NNStorage.newBlockPoolID(NNStorage.java:991)
        at org.apache.hadoop.hdfs.server.namenode.NNStorage.newNamespaceInfo(NNStorage.java:600)
        at org.apache.hadoop.hdfs.server.namenode.FSImage.format(FSImage.java:152)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1102)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1544)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1671)
Caused by: java.net.UnknownHostException: ubuntu: Name or service not known
        at java.net.Inet4AddressImpl.lookupAllHostAddr(Native Method)
        at java.net.InetAddress$2.lookupAllHostAddr(InetAddress.java:928)
        at java.net.InetAddress.getAddressesFromNameService(InetAddress.java:1323)
        at java.net.InetAddress.getLocalHost(InetAddress.java:1500)
        ... 8 more
17/10/13 09:00:48 WARN net.DNS: Unable to determine address of the host-falling back to "localhost" address
java.net.UnknownHostException: ubuntu: ubuntu: Name or service not known
        at java.net.InetAddress.getLocalHost(InetAddress.java:1505)
        at org.apache.hadoop.net.DNS.resolveLocalHostIPAddress(DNS.java:307)
        at org.apache.hadoop.net.DNS.<clinit>(DNS.java:62)
        at org.apache.hadoop.hdfs.server.namenode.NNStorage.newBlockPoolID(NNStorage.java:991)
        at org.apache.hadoop.hdfs.server.namenode.NNStorage.newNamespaceInfo(NNStorage.java:600)
        at org.apache.hadoop.hdfs.server.namenode.FSImage.format(FSImage.java:152)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1102)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1544)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1671)
Caused by: java.net.UnknownHostException: ubuntu: Name or service not known
        at java.net.Inet4AddressImpl.lookupAllHostAddr(Native Method)
        at java.net.InetAddress$2.lookupAllHostAddr(InetAddress.java:928)
        at java.net.InetAddress.getAddressesFromNameService(InetAddress.java:1323)
        at java.net.InetAddress.getLocalHost(InetAddress.java:1500)
        ... 8 more
17/10/13 09:00:48 INFO namenode.FSImage: Allocated new BlockPoolId: BP-966861352-127.0.0.1-1507852848831
17/10/13 09:00:48 INFO common.Storage: Storage directory /tmp/hadoop-hadoop-user/dfs/name has been successfully formatted.
17/10/13 09:00:48 INFO namenode.FSImageFormatProtobuf: Saving image file /tmp/hadoop-hadoop-user/dfs/name/current/fsimage.ckpt_0000000000000000000 using no compression
17/10/13 09:00:48 INFO namenode.FSImageFormatProtobuf: Image file /tmp/hadoop-hadoop-user/dfs/name/current/fsimage.ckpt_0000000000000000000 of size 327 bytes saved in 0 seconds.
17/10/13 09:00:48 INFO namenode.NNStorageRetentionManager: Going to retain 1 images with txid >= 0
17/10/13 09:00:48 INFO util.ExitUtil: Exiting with status 0
17/10/13 09:00:48 INFO namenode.NameNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at java.net.UnknownHostException: ubuntu: ubuntu: Name or service not known
************************************************************/
 
cs

* The symptom appears from line 64 of the log above onward.



* The cause was that the host name in the /etc/hosts file was set to localhost; changing it to a different host name fixes the problem.

1
2
3
4
5
6
7
8
9
10
127.0.0.1       localhost
#127.0.1.1      ubuntu.localdomain      ubuntu
 
# The following lines are desirable for IPv6 capable hosts
::1     localhost ip6-localhost ip6-loopback
ff02::1 ip6-allnodes
ff02::2 ip6-allrouters
 
192.168.30.128 localhost
 
cs
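
* After fixing /etc/hosts, a quick way to confirm that the local host name resolves again (a minimal check, run on the machine that reported the error):

$ hostname
$ getent hosts "$(hostname)"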






Cluster layout

 Role           | IP             | Host Name  | Steps to skip
----------------+----------------+------------+--------------
 name node      | 192.168.10.101 | master     | 13
 secondary node | 192.168.10.102 | secondary  | 12, 13
 data node      | 192.168.10.103 | datanode-a | 12
 data node      | 192.168.10.104 | datanode-b | 12
 data node      | 192.168.10.105 | datanode-c | 12

* Some of the work differs from node to node, so pay close attention to the steps each node should skip!


1. Update the package index

1
2
3
4
5
6
7
8
$ sudo apt-get update
Hit:http://us.archive.ubuntu.com/ubuntu xenial InRelease
Get:http://security.ubuntu.com/ubuntu xenial-security InRelease [102 kB]
Get:http://us.archive.ubuntu.com/ubuntu xenial-updates InRelease [102 kB]
Get:http://us.archive.ubuntu.com/ubuntu xenial-backports InRelease [102 kB]
Fetched 306 kB in 1s (154 kB/s)
Reading package lists... Done
 
cs



2. Upgrade the installed packages

1
2
3
4
5
6
7
8
9
10
11
12
13
14
15
16
17
18
19
20
21
22
23
24
25
26
27
28
29
30
31
32
33
34
35
36
37
38
39
40
41
42
43
44
45
46
47
48
49
50
51
52
53
54
55
56
57
58
59
60
61
62
63
64
65
66
67
68
69
70
71
72
73
74
75
76
77
78
79
80
81
82
83
84
85
86
87
88
89
90
91
92
93
94
95
96
97
98
99
100
101
102
103
104
105
106
107
108
109
110
111
112
113
114
115
116
117
118
119
120
121
122
123
124
125
126
127
128
129
130
131
132
133
134
135
136
137
138
139
140
141
142
143
144
145
146
147
148
149
150
151
152
153
154
155
156
157
158
159
160
161
162
163
164
165
166
167
168
169
170
171
172
173
174
175
176
177
178
179
180
181
182
183
184
185
186
187
188
189
190
191
192
193
194
195
196
197
198
199
200
201
202
203
204
205
206
207
208
209
210
211
212
213
214
215
216
217
218
219
220
221
222
223
224
225
226
227
228
229
230
231
232
233
234
235
236
237
238
239
240
241
242
243
244
245
246
247
248
249
250
251
252
253
254
255
256
257
258
259
260
261
262
263
264
265
266
267
268
269
270
271
272
273
274
275
276
277
278
279
280
281
282
283
284
285
286
287
288
289
290
291
292
293
294
295
296
297
298
$ sudo apt-get upgrade
Reading package lists... Done
Building dependency tree
Reading state information... Done
Calculating upgrade... Done
The following packages have been kept back:
  linux-generic linux-headers-generic linux-image-generic
The following packages will be upgraded:
  apparmor bind9-host ca-certificates cryptsetup cryptsetup-bin curl dnsmasq-base dnsutils
  gcc-5-base git git-man grub-legacy-ec2 libapparmor-perl libapparmor1 libbind9-140
  libcryptsetup4 libcurl3-gnutls libdns-export162 libdns162 libidn11 libisc-export160
  libisc160 libisccc140 libisccfg140 liblwres141 libmspack0 libplymouth4 libpython3.5
  libpython3.5-minimal libpython3.5-stdlib libstdc++libxml2 linux-firmware logrotate lxd
  lxd-client mdadm plymouth plymouth-theme-ubuntu-text python3-update-manager python3.5
  python3.5-minimal snapd tcpdump unattended-upgrades update-manager-core
  update-notifier-common vlan xfsprogs
49 upgraded, newly installed, to remove and not upgraded.
Need to get 71.MB of archives.
After this operation, 9,276 kB of additional disk space will be used.
Do you want to continue? [Y/n] Y
Get:http://us.archive.ubuntu.com/ubuntu xenial-updates/main amd64 libpython3.5 amd64 3.5.2-2ubuntu0~16.04.3 [1,360 kB]
Get:http://us.archive.ubuntu.com/ubuntu xenial-updates/main amd64 python3.5 amd64 3.5.2-2ubuntu0~16.04.3 [165 kB]
Get:http://us.archive.ubuntu.com/ubuntu xenial-updates/main amd64 libpython3.5-stdlib amd64 3.5.2-2ubuntu0~16.04.3 [2,132 kB]
Get:http://us.archive.ubuntu.com/ubuntu xenial-updates/main amd64 python3.5-minimal amd64 3.5.2-2ubuntu0~16.04.3 [1,596 kB]
Get:http://us.archive.ubuntu.com/ubuntu xenial-updates/main amd64 libpython3.5-minimal amd64 3.5.2-2ubuntu0~16.04.3 [524 kB]
Get:http://us.archive.ubuntu.com/ubuntu xenial-updates/main amd64 libidn11 amd64 1.32-3ubuntu1.2 [46.5 kB]
Get:http://us.archive.ubuntu.com/ubuntu xenial-updates/main amd64 dnsmasq-base amd64 2.75-1ubuntu0.16.04.3 [295 kB]
Get:http://us.archive.ubuntu.com/ubuntu xenial-updates/main amd64 lxd amd64 2.0.10-0ubuntu1~16.04.2 [3,416 kB]
Get:http://us.archive.ubuntu.com/ubuntu xenial-updates/main amd64 lxd-client amd64 2.0.10-0ubuntu1~16.04.2 [1,857 kB]
Get:10 http://us.archive.ubuntu.com/ubuntu xenial-updates/main amd64 update-manager-core all 1:16.04.9 [5,330 B]
Get:11 http://us.archive.ubuntu.com/ubuntu xenial-updates/main amd64 python3-update-manager all 1:16.04.9 [31.9 kB]
Get:12 http://us.archive.ubuntu.com/ubuntu xenial-updates/main amd64 update-notifier-common all 3.168.5 [165 kB]
Get:13 http://us.archive.ubuntu.com/ubuntu xenial-updates/main amd64 libapparmor1 amd64 2.10.95-0ubuntu2.7 [31.2 kB]
Get:14 http://us.archive.ubuntu.com/ubuntu xenial-updates/main amd64 libcryptsetup4 amd64 2:1.6.6-5ubuntu2.1 [73.3 kB]
Get:15 http://us.archive.ubuntu.com/ubuntu xenial-updates/main amd64 gcc-5-base amd64 5.4.0-6ubuntu1~16.04.5 [17.1 kB]
Get:16 http://us.archive.ubuntu.com/ubuntu xenial-updates/main amd64 libstdc++6 amd64 5.4.0-6ubuntu1~16.04.5 [393 kB]
Get:17 http://us.archive.ubuntu.com/ubuntu xenial-updates/main amd64 libisc-export160 amd64 1:9.10.3.dfsg.P4-8ubuntu1.8 [153 kB]
Get:18 http://us.archive.ubuntu.com/ubuntu xenial-updates/main amd64 libdns-export162 amd64 1:9.10.3.dfsg.P4-8ubuntu1.8 [666 kB]
Get:19 http://us.archive.ubuntu.com/ubuntu xenial-updates/main amd64 logrotate amd64 3.8.7-2ubuntu2.16.04.2 [37.7 kB]
Get:20 http://us.archive.ubuntu.com/ubuntu xenial-updates/main amd64 libapparmor-perl amd64 2.10.95-0ubuntu2.7 [31.6 kB]
Get:21 http://us.archive.ubuntu.com/ubuntu xenial-updates/main amd64 apparmor amd64 2.10.95-0ubuntu2.7 [450 kB]
Get:22 http://us.archive.ubuntu.com/ubuntu xenial-updates/main amd64 libxml2 amd64 2.9.3+dfsg1-1ubuntu0.3 [697 kB]
Get:23 http://us.archive.ubuntu.com/ubuntu xenial-updates/main amd64 bind9-host amd64 1:9.10.3.dfsg.P4-8ubuntu1.8 [38.4 kB]
Get:24 http://us.archive.ubuntu.com/ubuntu xenial-updates/main amd64 dnsutils amd64 1:9.10.3.dfsg.P4-8ubuntu1.8 [89.0 kB]
Get:25 http://us.archive.ubuntu.com/ubuntu xenial-updates/main amd64 libisc160 amd64 1:9.10.3.dfsg.P4-8ubuntu1.8 [215 kB]
Get:26 http://us.archive.ubuntu.com/ubuntu xenial-updates/main amd64 libdns162 amd64 1:9.10.3.dfsg.P4-8ubuntu1.8 [882 kB]
Get:27 http://us.archive.ubuntu.com/ubuntu xenial-updates/main amd64 libisccc140 amd64 1:9.10.3.dfsg.P4-8ubuntu1.8 [16.3 kB]
Get:28 http://us.archive.ubuntu.com/ubuntu xenial-updates/main amd64 libisccfg140 amd64 1:9.10.3.dfsg.P4-8ubuntu1.8 [40.4 kB]
Get:29 http://us.archive.ubuntu.com/ubuntu xenial-updates/main amd64 liblwres141 amd64 1:9.10.3.dfsg.P4-8ubuntu1.8 [33.7 kB]
Get:30 http://us.archive.ubuntu.com/ubuntu xenial-updates/main amd64 libbind9-140 amd64 1:9.10.3.dfsg.P4-8ubuntu1.8 [23.6 kB]
Get:31 http://us.archive.ubuntu.com/ubuntu xenial-updates/main amd64 ca-certificates all 20170717~16.04.1 [168 kB]
Get:32 http://us.archive.ubuntu.com/ubuntu xenial-updates/main amd64 curl amd64 7.47.0-1ubuntu2.3 [138 kB]
Get:33 http://us.archive.ubuntu.com/ubuntu xenial-updates/main amd64 libcurl3-gnutls amd64 7.47.0-1ubuntu2.3 [185 kB]
Get:34 http://us.archive.ubuntu.com/ubuntu xenial-updates/main amd64 libplymouth4 amd64 0.9.2-3ubuntu13.2 [85.1 kB]
Get:35 http://us.archive.ubuntu.com/ubuntu xenial-updates/main amd64 plymouth amd64 0.9.2-3ubuntu13.2 [107 kB]
Get:36 http://us.archive.ubuntu.com/ubuntu xenial-updates/main amd64 plymouth-theme-ubuntu-text amd64 0.9.2-3ubuntu13.2 [9,094 B]
Get:37 http://us.archive.ubuntu.com/ubuntu xenial-updates/main amd64 tcpdump amd64 4.9.2-0ubuntu0.16.04.1 [387 kB]
Get:38 http://us.archive.ubuntu.com/ubuntu xenial-updates/main amd64 cryptsetup-bin amd64 2:1.6.6-5ubuntu2.1 [61.8 kB]
Get:39 http://us.archive.ubuntu.com/ubuntu xenial-updates/main amd64 cryptsetup amd64 2:1.6.6-5ubuntu2.1 [123 kB]
Get:40 http://us.archive.ubuntu.com/ubuntu xenial-updates/main amd64 git-man all 1:2.7.4-0ubuntu1.3 [736 kB]
Get:41 http://us.archive.ubuntu.com/ubuntu xenial-updates/main amd64 git amd64 1:2.7.4-0ubuntu1.3 [3,102 kB]
Get:42 http://us.archive.ubuntu.com/ubuntu xenial-updates/main amd64 libmspack0 amd64 0.5-1ubuntu0.16.04.1 [37.0 kB]
Get:43 http://us.archive.ubuntu.com/ubuntu xenial-updates/main amd64 linux-firmware all 1.157.12 [38.8 MB]
Get:44 http://us.archive.ubuntu.com/ubuntu xenial-updates/main amd64 mdadm amd64 3.3-2ubuntu7.4 [394 kB]
Get:45 http://us.archive.ubuntu.com/ubuntu xenial-updates/main amd64 snapd amd64 2.27.5 [10.7 MB]
Get:46 http://us.archive.ubuntu.com/ubuntu xenial-updates/main amd64 unattended-upgrades all 0.90ubuntu0.8 [32.4 kB]
Get:47 http://us.archive.ubuntu.com/ubuntu xenial-updates/main amd64 xfsprogs amd64 4.3.0+nmu1ubuntu1.1 [597 kB]
Get:48 http://us.archive.ubuntu.com/ubuntu xenial-updates/main amd64 grub-legacy-ec2 all 0.7.9-233-ge586fe35-0ubuntu1~16.04.2 [28.2 kB]
Get:49 http://us.archive.ubuntu.com/ubuntu xenial-updates/main amd64 vlan amd64 1.9-3.2ubuntu1.16.04.4 [30.6 kB]
Fetched 71.MB in 53s (1,332 kB/s)
Extracting templates from packages: 100%
Preconfiguring packages ...
(Reading database ... 59734 files and directories currently installed.)
Preparing to unpack .../libpython3.5_3.5.2-2ubuntu0~16.04.3_amd64.deb ...
Unpacking libpython3.5:amd64 (3.5.2-2ubuntu0~16.04.3) over (3.5.2-2ubuntu0~16.04.1) ...
Preparing to unpack .../python3.5_3.5.2-2ubuntu0~16.04.3_amd64.deb ...
Unpacking python3.(3.5.2-2ubuntu0~16.04.3) over (3.5.2-2ubuntu0~16.04.1) ...
Preparing to unpack .../libpython3.5-stdlib_3.5.2-2ubuntu0~16.04.3_amd64.deb ...
Unpacking libpython3.5-stdlib:amd64 (3.5.2-2ubuntu0~16.04.3) over (3.5.2-2ubuntu0~16.04.1) ...
Preparing to unpack .../python3.5-minimal_3.5.2-2ubuntu0~16.04.3_amd64.deb ...
Unpacking python3.5-minimal (3.5.2-2ubuntu0~16.04.3) over (3.5.2-2ubuntu0~16.04.1) ...
Preparing to unpack .../libpython3.5-minimal_3.5.2-2ubuntu0~16.04.3_amd64.deb ...
Unpacking libpython3.5-minimal:amd64 (3.5.2-2ubuntu0~16.04.3) over (3.5.2-2ubuntu0~16.04.1) ...
Preparing to unpack .../libidn11_1.32-3ubuntu1.2_amd64.deb ...
Unpacking libidn11:amd64 (1.32-3ubuntu1.2) over (1.32-3ubuntu1.1) ...
Preparing to unpack .../dnsmasq-base_2.75-1ubuntu0.16.04.3_amd64.deb ...
Unpacking dnsmasq-base (2.75-1ubuntu0.16.04.3) over (2.75-1ubuntu0.16.04.2) ...
Preparing to unpack .../lxd_2.0.10-0ubuntu1~16.04.2_amd64.deb ...
Warning: Stopping lxd.service, but it can still be activated by:
  lxd.socket
Unpacking lxd (2.0.10-0ubuntu1~16.04.2) over (2.0.10-0ubuntu1~16.04.1) ...
Warning: Stopping lxd.service, but it can still be activated by:
  lxd.socket
Preparing to unpack .../lxd-client_2.0.10-0ubuntu1~16.04.2_amd64.deb ...
Unpacking lxd-client (2.0.10-0ubuntu1~16.04.2) over (2.0.10-0ubuntu1~16.04.1) ...
Preparing to unpack .../update-manager-core_1%3a16.04.9_all.deb ...
Unpacking update-manager-core (1:16.04.9) over (1:16.04.7) ...
Preparing to unpack .../python3-update-manager_1%3a16.04.9_all.deb ...
Unpacking python3-update-manager (1:16.04.9) over (1:16.04.7) ...
Preparing to unpack .../update-notifier-common_3.168.5_all.deb ...
Unpacking update-notifier-common (3.168.5) over (3.168.4) ...
Preparing to unpack .../libapparmor1_2.10.95-0ubuntu2.7_amd64.deb ...
Unpacking libapparmor1:amd64 (2.10.95-0ubuntu2.7) over (2.10.95-0ubuntu2.6) ...
Processing triggers for libc-bin (2.23-0ubuntu9) ...
Processing triggers for man-db (2.7.5-1) ...
Processing triggers for mime-support (3.59ubuntu1) ...
Processing triggers for ureadahead (0.100.0-19) ...
Setting up libapparmor1:amd64 (2.10.95-0ubuntu2.7) ...
Processing triggers for systemd (229-4ubuntu19) ...
Processing triggers for dbus (1.10.6-1ubuntu3.3) ...
Processing triggers for libc-bin (2.23-0ubuntu9) ...
(Reading database ... 59735 files and directories currently installed.)
Preparing to unpack .../libcryptsetup4_2%3a1.6.6-5ubuntu2.1_amd64.deb ...
Unpacking libcryptsetup4:amd64 (2:1.6.6-5ubuntu2.1) over (2:1.6.6-5ubuntu2) ...
Processing triggers for libc-bin (2.23-0ubuntu9) ...
Setting up libcryptsetup4:amd64 (2:1.6.6-5ubuntu2.1) ...
Processing triggers for libc-bin (2.23-0ubuntu9) ...
(Reading database ... 59735 files and directories currently installed.)
Preparing to unpack .../gcc-5-base_5.4.0-6ubuntu1~16.04.5_amd64.deb ...
Unpacking gcc-5-base:amd64 (5.4.0-6ubuntu1~16.04.5) over (5.4.0-6ubuntu1~16.04.4) ...
Setting up gcc-5-base:amd64 (5.4.0-6ubuntu1~16.04.5) ...
(Reading database ... 59735 files and directories currently installed.)
Preparing to unpack .../libstdc++6_5.4.0-6ubuntu1~16.04.5_amd64.deb ...
Unpacking libstdc++6:amd64 (5.4.0-6ubuntu1~16.04.5) over (5.4.0-6ubuntu1~16.04.4) ...
Processing triggers for libc-bin (2.23-0ubuntu9) ...
Setting up libstdc++6:amd64 (5.4.0-6ubuntu1~16.04.5) ...
Processing triggers for libc-bin (2.23-0ubuntu9) ...
(Reading database ... 59735 files and directories currently installed.)
Preparing to unpack .../libisc-export160_1%3a9.10.3.dfsg.P4-8ubuntu1.8_amd64.deb ...
Unpacking libisc-export160 (1:9.10.3.dfsg.P4-8ubuntu1.8) over (1:9.10.3.dfsg.P4-8ubuntu1.7) ...
Preparing to unpack .../libdns-export162_1%3a9.10.3.dfsg.P4-8ubuntu1.8_amd64.deb ...
Unpacking libdns-export162 (1:9.10.3.dfsg.P4-8ubuntu1.8) over (1:9.10.3.dfsg.P4-8ubuntu1.7) ...
Preparing to unpack .../logrotate_3.8.7-2ubuntu2.16.04.2_amd64.deb ...
Unpacking logrotate (3.8.7-2ubuntu2.16.04.2) over (3.8.7-2ubuntu2.16.04.1) ...
Preparing to unpack .../libapparmor-perl_2.10.95-0ubuntu2.7_amd64.deb ...
Unpacking libapparmor-perl (2.10.95-0ubuntu2.7) over (2.10.95-0ubuntu2.6) ...
Preparing to unpack .../apparmor_2.10.95-0ubuntu2.7_amd64.deb ...
Unpacking apparmor (2.10.95-0ubuntu2.7) over (2.10.95-0ubuntu2.6) ...
Preparing to unpack .../libxml2_2.9.3+dfsg1-1ubuntu0.3_amd64.deb ...
Unpacking libxml2:amd64 (2.9.3+dfsg1-1ubuntu0.3) over (2.9.3+dfsg1-1ubuntu0.2) ...
Preparing to unpack .../bind9-host_1%3a9.10.3.dfsg.P4-8ubuntu1.8_amd64.deb ...
Unpacking bind9-host (1:9.10.3.dfsg.P4-8ubuntu1.8) over (1:9.10.3.dfsg.P4-8ubuntu1.7) ...
Preparing to unpack .../dnsutils_1%3a9.10.3.dfsg.P4-8ubuntu1.8_amd64.deb ...
Unpacking dnsutils (1:9.10.3.dfsg.P4-8ubuntu1.8) over (1:9.10.3.dfsg.P4-8ubuntu1.7) ...
Preparing to unpack .../libisc160_1%3a9.10.3.dfsg.P4-8ubuntu1.8_amd64.deb ...
Unpacking libisc160:amd64 (1:9.10.3.dfsg.P4-8ubuntu1.8) over (1:9.10.3.dfsg.P4-8ubuntu1.7) ...
Preparing to unpack .../libdns162_1%3a9.10.3.dfsg.P4-8ubuntu1.8_amd64.deb ...
Unpacking libdns162:amd64 (1:9.10.3.dfsg.P4-8ubuntu1.8) over (1:9.10.3.dfsg.P4-8ubuntu1.7) ...
Preparing to unpack .../libisccc140_1%3a9.10.3.dfsg.P4-8ubuntu1.8_amd64.deb ...
Unpacking libisccc140:amd64 (1:9.10.3.dfsg.P4-8ubuntu1.8) over (1:9.10.3.dfsg.P4-8ubuntu1.7) ...
Preparing to unpack .../libisccfg140_1%3a9.10.3.dfsg.P4-8ubuntu1.8_amd64.deb ...
Unpacking libisccfg140:amd64 (1:9.10.3.dfsg.P4-8ubuntu1.8) over (1:9.10.3.dfsg.P4-8ubuntu1.7) ...
Preparing to unpack .../liblwres141_1%3a9.10.3.dfsg.P4-8ubuntu1.8_amd64.deb ...
Unpacking liblwres141:amd64 (1:9.10.3.dfsg.P4-8ubuntu1.8) over (1:9.10.3.dfsg.P4-8ubuntu1.7) ...
Preparing to unpack .../libbind9-140_1%3a9.10.3.dfsg.P4-8ubuntu1.8_amd64.deb ...
Unpacking libbind9-140:amd64 (1:9.10.3.dfsg.P4-8ubuntu1.8) over (1:9.10.3.dfsg.P4-8ubuntu1.7) ...
Preparing to unpack .../ca-certificates_20170717~16.04.1_all.deb ...
Unpacking ca-certificates (20170717~16.04.1) over (20160104ubuntu1) ...
Preparing to unpack .../curl_7.47.0-1ubuntu2.3_amd64.deb ...
Unpacking curl (7.47.0-1ubuntu2.3) over (7.47.0-1ubuntu2.2) ...
Preparing to unpack .../libcurl3-gnutls_7.47.0-1ubuntu2.3_amd64.deb ...
Unpacking libcurl3-gnutls:amd64 (7.47.0-1ubuntu2.3) over (7.47.0-1ubuntu2.2) ...
Preparing to unpack .../libplymouth4_0.9.2-3ubuntu13.2_amd64.deb ...
Unpacking libplymouth4:amd64 (0.9.2-3ubuntu13.2) over (0.9.2-3ubuntu13.1) ...
Preparing to unpack .../plymouth_0.9.2-3ubuntu13.2_amd64.deb ...
Unpacking plymouth (0.9.2-3ubuntu13.2) over (0.9.2-3ubuntu13.1) ...
Preparing to unpack .../plymouth-theme-ubuntu-text_0.9.2-3ubuntu13.2_amd64.deb ...
Unpacking plymouth-theme-ubuntu-text (0.9.2-3ubuntu13.2) over (0.9.2-3ubuntu13.1) ...
Preparing to unpack .../tcpdump_4.9.2-0ubuntu0.16.04.1_amd64.deb ...
Unpacking tcpdump (4.9.2-0ubuntu0.16.04.1) over (4.9.0-1ubuntu1~ubuntu16.04.1) ...
Preparing to unpack .../cryptsetup-bin_2%3a1.6.6-5ubuntu2.1_amd64.deb ...
Unpacking cryptsetup-bin (2:1.6.6-5ubuntu2.1) over (2:1.6.6-5ubuntu2) ...
Preparing to unpack .../cryptsetup_2%3a1.6.6-5ubuntu2.1_amd64.deb ...
Unpacking cryptsetup (2:1.6.6-5ubuntu2.1) over (2:1.6.6-5ubuntu2) ...
Preparing to unpack .../git-man_1%3a2.7.4-0ubuntu1.3_all.deb ...
Unpacking git-man (1:2.7.4-0ubuntu1.3) over (1:2.7.4-0ubuntu1.1) ...
Preparing to unpack .../git_1%3a2.7.4-0ubuntu1.3_amd64.deb ...
Unpacking git (1:2.7.4-0ubuntu1.3) over (1:2.7.4-0ubuntu1.1) ...
Preparing to unpack .../libmspack0_0.5-1ubuntu0.16.04.1_amd64.deb ...
Unpacking libmspack0:amd64 (0.5-1ubuntu0.16.04.1) over (0.5-1) ...
Preparing to unpack .../linux-firmware_1.157.12_all.deb ...
Unpacking linux-firmware (1.157.12) over (1.157.11) ...
Preparing to unpack .../mdadm_3.3-2ubuntu7.4_amd64.deb ...
Unpacking mdadm (3.3-2ubuntu7.4) over (3.3-2ubuntu7.2) ...
Preparing to unpack .../snapd_2.27.5_amd64.deb ...
Warning: Stopping snapd.service, but it can still be activated by:
  snapd.socket
Unpacking snapd (2.27.5) over (2.25) ...
Preparing to unpack .../unattended-upgrades_0.90ubuntu0.8_all.deb ...
Unpacking unattended-upgrades (0.90ubuntu0.8) over (0.90ubuntu0.7) ...
Preparing to unpack .../xfsprogs_4.3.0+nmu1ubuntu1.1_amd64.deb ...
Unpacking xfsprogs (4.3.0+nmu1ubuntu1.1) over (4.3.0+nmu1ubuntu1) ...
Preparing to unpack .../grub-legacy-ec2_0.7.9-233-ge586fe35-0ubuntu1~16.04.2_all.deb ...
Leaving 'diversion of /usr/sbin/grub-set-default to /usr/sbin/grub-set-default.real by grub-legacy-ec2'
Unpacking grub-legacy-ec2 (0.7.9-233-ge586fe35-0ubuntu1~16.04.2) over (0.7.9-153-g16a7302f-0ubuntu1~16.04.2) ...
Preparing to unpack .../vlan_1.9-3.2ubuntu1.16.04.4_amd64.deb ...
Unpacking vlan (1.9-3.2ubuntu1.16.04.4) over (1.9-3.2ubuntu1.16.04.3) ...
Processing triggers for libc-bin (2.23-0ubuntu9) ...
Processing triggers for man-db (2.7.5-1) ...
Processing triggers for ureadahead (0.100.0-19) ...
Processing triggers for systemd (229-4ubuntu19) ...
Setting up libpython3.5-minimal:amd64 (3.5.2-2ubuntu0~16.04.3) ...
Setting up libpython3.5-stdlib:amd64 (3.5.2-2ubuntu0~16.04.3) ...
Setting up libpython3.5:amd64 (3.5.2-2ubuntu0~16.04.3) ...
Setting up python3.5-minimal (3.5.2-2ubuntu0~16.04.3) ...
Setting up python3.(3.5.2-2ubuntu0~16.04.3) ...
Setting up libidn11:amd64 (1.32-3ubuntu1.2) ...
Setting up dnsmasq-base (2.75-1ubuntu0.16.04.3) ...
Setting up lxd-client (2.0.10-0ubuntu1~16.04.2) ...
Setting up lxd (2.0.10-0ubuntu1~16.04.2) ...
Setting up python3-update-manager (1:16.04.9) ...
Setting up update-manager-core (1:16.04.9) ...
Setting up update-notifier-common (3.168.5) ...
Setting up libisc-export160 (1:9.10.3.dfsg.P4-8ubuntu1.8) ...
Setting up libdns-export162 (1:9.10.3.dfsg.P4-8ubuntu1.8) ...
Setting up logrotate (3.8.7-2ubuntu2.16.04.2) ...
Setting up libapparmor-perl (2.10.95-0ubuntu2.7) ...
Setting up apparmor (2.10.95-0ubuntu2.7) ...
update-rc.d: warning: start and stop actions are no longer supported; falling back to defaults
Skipping profile in /etc/apparmor.d/disable: usr.sbin.rsyslogd
Setting up libxml2:amd64 (2.9.3+dfsg1-1ubuntu0.3) ...
Setting up libisc160:amd64 (1:9.10.3.dfsg.P4-8ubuntu1.8) ...
Setting up libdns162:amd64 (1:9.10.3.dfsg.P4-8ubuntu1.8) ...
Setting up libisccc140:amd64 (1:9.10.3.dfsg.P4-8ubuntu1.8) ...
Setting up libisccfg140:amd64 (1:9.10.3.dfsg.P4-8ubuntu1.8) ...
Setting up libbind9-140:amd64 (1:9.10.3.dfsg.P4-8ubuntu1.8) ...
Setting up liblwres141:amd64 (1:9.10.3.dfsg.P4-8ubuntu1.8) ...
Setting up bind9-host (1:9.10.3.dfsg.P4-8ubuntu1.8) ...
Setting up dnsutils (1:9.10.3.dfsg.P4-8ubuntu1.8) ...
Setting up ca-certificates (20170717~16.04.1) ...
Setting up libcurl3-gnutls:amd64 (7.47.0-1ubuntu2.3) ...
Setting up curl (7.47.0-1ubuntu2.3) ...
Setting up libplymouth4:amd64 (0.9.2-3ubuntu13.2) ...
Setting up plymouth (0.9.2-3ubuntu13.2) ...
update-initramfs: deferring update (trigger activated)
update-rc.d: warning: start and stop actions are no longer supported; falling back to defaults
update-rc.d: warning: start and stop actions are no longer supported; falling back to defaults
Setting up plymouth-theme-ubuntu-text (0.9.2-3ubuntu13.2) ...
update-initramfs: deferring update (trigger activated)
Setting up tcpdump (4.9.2-0ubuntu0.16.04.1) ...
Installing new version of config file /etc/apparmor.d/usr.sbin.tcpdump ...
Setting up cryptsetup-bin (2:1.6.6-5ubuntu2.1) ...
Setting up cryptsetup (2:1.6.6-5ubuntu2.1) ...
update-initramfs: deferring update (trigger activated)
update-rc.d: warning: start and stop actions are no longer supported; falling back to defaults
update-rc.d: warning: start and stop actions are no longer supported; falling back to defaults
Setting up git-man (1:2.7.4-0ubuntu1.3) ...
Setting up git (1:2.7.4-0ubuntu1.3) ...
Setting up libmspack0:amd64 (0.5-1ubuntu0.16.04.1) ...
Setting up linux-firmware (1.157.12) ...
update-initramfs: Generating /boot/initrd.img-4.4.0-87-generic
W: mdadm: /etc/mdadm/mdadm.conf defines no arrays.
Setting up mdadm (3.3-2ubuntu7.4) ...
update-initramfs: deferring update (trigger activated)
Generating grub configuration file ...
File descriptor (pipe:[36247]) leaked on vgs invocation. Parent PID 17133/usr/sbin/grub-probe
File descriptor (pipe:[36247]) leaked on vgs invocation. Parent PID 17133/usr/sbin/grub-probe
File descriptor (pipe:[36247]) leaked on vgs invocation. Parent PID 17146/usr/sbin/grub-probe
File descriptor (pipe:[36247]) leaked on vgs invocation. Parent PID 17146/usr/sbin/grub-probe
File descriptor (pipe:[36247]) leaked on vgs invocation. Parent PID 17159/usr/sbin/grub-probe
File descriptor (pipe:[36247]) leaked on vgs invocation. Parent PID 17159/usr/sbin/grub-probe
File descriptor (pipe:[36247]) leaked on vgs invocation. Parent PID 17172/usr/sbin/grub-probe
File descriptor (pipe:[36247]) leaked on vgs invocation. Parent PID 17172/usr/sbin/grub-probe
File descriptor (pipe:[36247]) leaked on vgs invocation. Parent PID 17231/usr/sbin/grub-probe
File descriptor (pipe:[36247]) leaked on vgs invocation. Parent PID 17231/usr/sbin/grub-probe
Found linux image: /boot/vmlinuz-4.4.0-87-generic
Found initrd image: /boot/initrd.img-4.4.0-87-generic
File descriptor (pipe:[36247]) leaked on vgs invocation. Parent PID 17464/usr/sbin/grub-probe
File descriptor (pipe:[36247]) leaked on vgs invocation. Parent PID 17464/usr/sbin/grub-probe
File descriptor (pipe:[36247]) leaked on lvs invocation. Parent PID 17580/bin/sh
done
update-rc.d: warning: start and stop actions are no longer supported; falling back to defaults
Setting up snapd (2.27.5) ...
Installing new version of config file /etc/apparmor.d/usr.lib.snapd.snap-confine.real ...
Setting up unattended-upgrades (0.90ubuntu0.8) ...
Setting up xfsprogs (4.3.0+nmu1ubuntu1.1) ...
update-initramfs: deferring update (trigger activated)
Setting up grub-legacy-ec2 (0.7.9-233-ge586fe35-0ubuntu1~16.04.2) ...
Searching for GRUB installation directory ... found: /boot/grub
Searching for default file ... found: /boot/grub/default
Testing for an existing GRUB menu.lst file ... found: /boot/grub/menu.lst
Searching for splash image ... none found, skipping ...
Found kernel: /vmlinuz-4.4.0-87-generic
Found kernel: /vmlinuz-4.4.0-87-generic
Updating /boot/grub/menu.lst ... done
 
Setting up vlan (1.9-3.2ubuntu1.16.04.4) ...
Installing new version of config file /etc/network/if-up.d/ip ...
Processing triggers for libc-bin (2.23-0ubuntu9) ...
Processing triggers for ca-certificates (20170717~16.04.1) ...
Updating certificates in /etc/ssl/certs...
17 added, 42 removed; done.
Running hooks in /etc/ca-certificates/update.d...
done.
Processing triggers for initramfs-tools (0.122ubuntu8.8) ...
update-initramfs: Generating /boot/initrd.img-4.4.0-87-generic
W: mdadm: /etc/mdadm/mdadm.conf defines no arrays.
 
cs

* During the upgrade you are asked "Do you want to continue?"; enter Y to continue.



3. Remove packages starting with openjdk and their configuration files before installing Java

1
2
3
4
5
6
7
8
9
10
11
12
13
14
15
16
17
18
19
20
21
22
23
24
25
26
27
28
29
30
31
32
33
34
35
36
37
38
39
40
41
42
43
44
45
46
47
48
49
50
51
52
53
54
55
56
57
58
$ sudo apt purge openjdk*
Reading package lists... Done
Building dependency tree
Reading state information... Done
Note, selecting 'openjdk-9-jre-headless' for glob 'openjdk*'
Note, selecting 'openjdk-8-jdk' for glob 'openjdk*'
Note, selecting 'openjdk-8-jre' for glob 'openjdk*'
Note, selecting 'openjdk-6-jdk' for glob 'openjdk*'
Note, selecting 'openjdk-6-jre' for glob 'openjdk*'
Note, selecting 'openjdk-9-demo' for glob 'openjdk*'
Note, selecting 'openjdk-6-jre-headless' for glob 'openjdk*'
Note, selecting 'openjdk-8-demo' for glob 'openjdk*'
Note, selecting 'openjdk-8-jre-dcevm' for glob 'openjdk*'
Note, selecting 'openjdk-8-jdk-headless' for glob 'openjdk*'
Note, selecting 'openjdk-9-dbg' for glob 'openjdk*'
Note, selecting 'openjdk-7-jre-headless' for glob 'openjdk*'
Note, selecting 'openjdk-9-doc' for glob 'openjdk*'
Note, selecting 'openjdk-8-jre-zero' for glob 'openjdk*'
Note, selecting 'openjdk-8-source' for glob 'openjdk*'
Note, selecting 'openjdk-jre' for glob 'openjdk*'
Note, selecting 'openjdk-9-jdk' for glob 'openjdk*'
Note, selecting 'openjdk-9-jre' for glob 'openjdk*'
Note, selecting 'openjdk-7-jdk' for glob 'openjdk*'
Note, selecting 'openjdk-7-jre' for glob 'openjdk*'
Note, selecting 'openjdk-9-jdk-headless' for glob 'openjdk*'
Note, selecting 'openjdk-8-jre-headless' for glob 'openjdk*'
Note, selecting 'openjdk-9-source' for glob 'openjdk*'
Note, selecting 'openjdk-8-jre-jamvm' for glob 'openjdk*'
Note, selecting 'openjdk-8-dbg' for glob 'openjdk*'
Note, selecting 'openjdk-8-doc' for glob 'openjdk*'
Package 'openjdk-6-jdk' is not installed, so not removed
Package 'openjdk-7-jre-headless' is not installed, so not removed
Package 'openjdk-6-jre-headless' is not installed, so not removed
Package 'openjdk-7-jre' is not installed, so not removed
Package 'openjdk-6-jre' is not installed, so not removed
Package 'openjdk-7-jdk' is not installed, so not removed
Package 'openjdk-jre' is not installed, so not removed
Package 'openjdk-8-jre-dcevm' is not installed, so not removed
Package 'openjdk-9-dbg' is not installed, so not removed
Package 'openjdk-9-demo' is not installed, so not removed
Package 'openjdk-9-doc' is not installed, so not removed
Package 'openjdk-9-jdk' is not installed, so not removed
Package 'openjdk-9-jdk-headless' is not installed, so not removed
Package 'openjdk-9-jre' is not installed, so not removed
Package 'openjdk-9-jre-headless' is not installed, so not removed
Package 'openjdk-9-source' is not installed, so not removed
Package 'openjdk-8-dbg' is not installed, so not removed
Package 'openjdk-8-demo' is not installed, so not removed
Package 'openjdk-8-doc' is not installed, so not removed
Package 'openjdk-8-jdk' is not installed, so not removed
Package 'openjdk-8-jdk-headless' is not installed, so not removed
Package 'openjdk-8-jre' is not installed, so not removed
Package 'openjdk-8-jre-headless' is not installed, so not removed
Package 'openjdk-8-jre-jamvm' is not installed, so not removed
Package 'openjdk-8-source' is not installed, so not removed
Package 'openjdk-8-jre-zero' is not installed, so not removed
upgraded, newly installed, to remove and not upgraded.
 
cs

* Caution

 -> Unlike remove, the "apt purge" command also deletes the related configuration files, so be careful when using it later on.



4. Add a personal package archive (PPA)

1
2
3
4
5
6
7
8
9
10
11
$ sudo add-apt-repository -y ppa:webupd8team/java
gpg: keyring `/tmp/tmpy82g2jco/secring.gpg' created
gpg: keyring `/tmp/tmpy82g2jco/pubring.gpg' created
gpg: requesting key EEA14886 from hkp server keyserver.ubuntu.com
gpg: /tmp/tmpy82g2jco/trustdb.gpg: trustdb created
gpg: key EEA14886: public key "Launchpad VLC" imported
gpg: no ultimately trusted keys found
gpg: Total number processed: 1
gpg:               imported:  (RSA: 1)
OK
 
cs



5. Update the package index again

1
2
3
4
5
6
7
8
9
10
11
12
13
14
15
$ sudo apt update
Hit:http://us.archive.ubuntu.com/ubuntu xenial InRelease
Get:http://security.ubuntu.com/ubuntu xenial-security InRelease [102 kB]
Get:http://ppa.launchpad.net/webupd8team/java/ubuntu xenial InRelease [17.5 kB]
Get:http://us.archive.ubuntu.com/ubuntu xenial-updates InRelease [102 kB]
Get:http://us.archive.ubuntu.com/ubuntu xenial-backports InRelease [102 kB]
Get:http://ppa.launchpad.net/webupd8team/java/ubuntu xenial/main amd64 Packages [2,912 B]
Get:http://ppa.launchpad.net/webupd8team/java/ubuntu xenial/main i386 Packages [2,460 B]
Get:http://ppa.launchpad.net/webupd8team/java/ubuntu xenial/main Translation-en [1,260 B]
Fetched 331 kB in 2s (116 kB/s)
Reading package lists... Done
Building dependency tree
Reading state information... Done
packages can be upgraded. Run 'apt list --upgradable' to see them.
 
cs



6. Install Java

1
2
3
4
5
6
7
8
9
10
11
12
13
14
15
16
17
18
19
20
21
22
23
24
25
26
27
28
29
30
31
32
33
34
35
36
37
38
39
40
41
42
43
44
45
46
47
48
49
50
51
52
53
54
55
56
57
58
59
60
61
62
63
64
65
66
67
68
69
70
71
72
73
74
75
76
77
78
79
80
81
82
83
84
85
86
87
88
89
90
91
92
93
94
95
96
97
98
99
100
101
102
103
104
105
106
107
108
109
110
111
112
113
114
115
116
117
118
119
120
121
122
123
124
125
126
127
128
129
130
131
132
133
134
135
136
137
138
139
140
141
142
143
144
145
146
147
148
149
150
151
152
153
154
155
156
157
158
159
160
161
162
163
164
165
166
167
168
169
170
171
172
173
174
175
176
177
178
179
180
181
182
183
184
185
186
187
188
189
190
191
192
193
194
195
196
197
198
199
200
201
202
203
204
205
206
207
208
209
210
211
212
213
214
215
216
217
218
219
220
221
222
223
224
225
226
227
228
229
230
231
232
233
234
235
236
237
$ sudo apt install -y oracle-java8-installer
Reading package lists... Done
Building dependency tree
Reading state information... Done
The following additional packages will be installed:
  binutils gsfonts gsfonts-x11 java-common libfontenc1 libxfont1 oracle-java8-set-default
  x11-common xfonts-encodings xfonts-utils
Suggested packages:
  binutils-doc binfmt-support visualvm ttf-baekmuk | ttf-unfonts | ttf-unfonts-core
  ttf-kochi-gothic | ttf-sazanami-gothic ttf-kochi-mincho | ttf-sazanami-mincho
  ttf-arphic-uming firefox | firefox-| iceweasel | mozilla-firefox | iceape-browser
  | mozilla-browser | epiphany-gecko | epiphany-webkit | epiphany-browser | galeon
  | midbrowser | moblin-web-browser | xulrunner | xulrunner-1.| konqueror
  | chromium-browser | midori | google-chrome
The following NEW packages will be installed:
  binutils gsfonts gsfonts-x11 java-common libfontenc1 libxfont1 oracle-java8-installer
  oracle-java8-set-default x11-common xfonts-encodings xfonts-utils
upgraded, 11 newly installed, to remove and not upgraded.
Need to get 6,519 kB of archives.
After this operation, 20.MB of additional disk space will be used.
Get:http://ppa.launchpad.net/webupd8team/java/ubuntu xenial/main amd64 oracle-java8-installer all 8u144-1~webupd8~0 [32.9 kB]
Get:http://ppa.launchpad.net/webupd8team/java/ubuntu xenial/main amd64 oracle-java8-set-default all 8u144-1~webupd8~0 [6,738 B]
Get:http://us.archive.ubuntu.com/ubuntu xenial-updates/main amd64 binutils amd64 2.26.1-1ubuntu1~16.04.5 [2,311 kB]
Get:http://us.archive.ubuntu.com/ubuntu xenial/main amd64 java-common all 0.56ubuntu2 [7,742 B]
Get:http://us.archive.ubuntu.com/ubuntu xenial/main amd64 gsfonts all 1:8.11+urwcyr1.0.7~pre44-4.2ubuntu1 [3,374 kB]
Get:http://us.archive.ubuntu.com/ubuntu xenial/main amd64 libfontenc1 amd64 1:1.1.3-1 [13.9 kB]
Get:http://us.archive.ubuntu.com/ubuntu xenial-updates/main amd64 libxfont1 amd64 1:1.5.1-1ubuntu0.16.04.3 [95.1 kB]
Get:http://us.archive.ubuntu.com/ubuntu xenial/main amd64 x11-common all 1:7.7+13ubuntu3 [22.4 kB]
Get:http://us.archive.ubuntu.com/ubuntu xenial/main amd64 xfonts-encodings all 1:1.0.4-2 [573 kB]
Get:10 http://us.archive.ubuntu.com/ubuntu xenial-updates/main amd64 xfonts-utils amd64 1:7.7+3ubuntu0.16.04.2 [74.6 kB]
Get:11 http://us.archive.ubuntu.com/ubuntu xenial/universe amd64 gsfonts-x11 all 0.24 [7,314 B]
Fetched 6,519 kB in 34s (191 kB/s)
Preconfiguring packages ...
Selecting previously unselected package binutils.
(Reading database ... 59721 files and directories currently installed.)
Preparing to unpack .../binutils_2.26.1-1ubuntu1~16.04.5_amd64.deb ...
Unpacking binutils (2.26.1-1ubuntu1~16.04.5) ...
Selecting previously unselected package java-common.
Preparing to unpack .../java-common_0.56ubuntu2_all.deb ...
Unpacking java-common (0.56ubuntu2) ...
Processing triggers for libc-bin (2.23-0ubuntu9) ...
Processing triggers for man-db (2.7.5-1) ...
Setting up binutils (2.26.1-1ubuntu1~16.04.5) ...
Processing triggers for libc-bin (2.23-0ubuntu9) ...
Selecting previously unselected package oracle-java8-installer.
(Reading database ... 59937 files and directories currently installed.)
Preparing to unpack .../oracle-java8-installer_8u144-1~webupd8~0_all.deb ...
Unpacking oracle-java8-installer (8u144-1~webupd8~0) ...
Processing triggers for shared-mime-info (1.5-2ubuntu0.1) ...............................]
Processing triggers for mime-support (3.59ubuntu1) ...
Setting up java-common (0.56ubuntu2) ...
Setting up oracle-java8-installer (8u144-1~webupd8~0) ...................................]
No /var/cache/oracle-jdk8-installer/wgetrc file found....................................]
Creating /var/cache/oracle-jdk8-installer/wgetrc and
using default oracle-java8-installer wgetrc settings for it.
Downloading Oracle Java 8...
--2017-10-12 11:00:36--  http://download.oracle.com/otn-pub/java/jdk/8u144-b01/090f390dda5b47b9b721c7dfaa008135/jdk-8u144-linux-x64.tar.gz
Resolving download.oracle.com (download.oracle.com)... 121.254.136.49121.254.136.58
Connecting to download.oracle.com (download.oracle.com)|121.254.136.49|:80... connected.
HTTP request sent, awaiting response... 302 Moved Temporarily
Location: https://edelivery.oracle.com/otn-pub/java/jdk/8u144-b01/090f390dda5b47b9b721c7dfaa008135/jdk-8u144-linux-x64.tar.gz [following]
--2017-10-12 11:00:36--  https://edelivery.oracle.com/otn-pub/java/jdk/8u144-b01/090f390dda5b47b9b721c7dfaa008135/jdk-8u144-linux-x64.tar.gz
Resolving edelivery.oracle.com (edelivery.oracle.com)... 104.75.42.492600:1417:e:28b::2d3e, 2600:1417:e:28e::2d3e
Connecting to edelivery.oracle.com (edelivery.oracle.com)|104.75.42.49|:443... connected.
HTTP request sent, awaiting response... 302 Moved Temporarily
Location: http://download.oracle.com/otn-pub/java/jdk/8u144-b01/090f390dda5b47b9b721c7dfaa008135/jdk-8u144-linux-x64.tar.gz?AuthParam=1507773757_0e6167adc159ce7d572399a0af8ead7b [following]
--2017-10-12 11:00:37--  http://download.oracle.com/otn-pub/java/jdk/8u144-b01/090f390dda5b47b9b721c7dfaa008135/jdk-8u144-linux-x64.tar.gz?AuthParam=1507773757_0e6167adc159ce7d572399a0af8ead7b
Connecting to download.oracle.com (download.oracle.com)|121.254.136.49|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 185515842 (177M) [application/x-gzip]
Saving to: ‘jdk-8u144-linux-x64.tar.gz’
 
     0K ........ ........ ........ ........ ........ ........  110.9M 16s
  3072K ........ ........ ........ ........ ........ ........  311.3M 15s
  6144K ........ ........ ........ ........ ........ ........  59.95M 16s
  9216K ........ ........ ........ ........ ........ ........  611.3M 15s
 12288K ........ ........ ........ ........ ........ ........  811.3M 15s
 15360K ........ ........ ........ ........ ........ ........ 1011.3M 14s
 18432K ........ ........ ........ ........ ........ ........ 1111.3M 14s
 21504K ........ ........ ........ ........ ........ ........ 1311.3M 14s
 24576K ........ ........ ........ ........ ........ ........ 1511.3M 14s
 27648K ........ ........ ........ ........ ........ ........ 1611.0M 13s
 30720K ........ ........ ........ ........ ........ ........ 1811.0M 13s
 33792K ........ ........ ........ ........ ........ ........ 2011.3M 13s
 36864K ........ ........ ........ ........ ........ ........ 2210.9M 12s
 39936K ........ ........ ........ ........ ........ ........ 2311.3M 12s
 43008K ........ ........ ........ ........ ........ ........ 2510.5M 12s
 46080K ........ ........ ........ ........ ........ ........ 2710.0M 12s
 49152K ........ ........ ........ ........ ........ ........ 2811.3M 11s
 52224K ........ ........ ........ ........ ........ ........ 304.47M 12s
 55296K ........ ........ ........ ........ ........ ........ 3211.3M 12s
 58368K ........ ........ ........ ........ ........ ........ 3311.3M 11s
 61440K ........ ........ ........ ........ ........ ........ 3511.3M 11s
 64512K ........ ........ ........ ........ ........ ........ 3711.3M 11s
 67584K ........ ........ ........ ........ ........ ........ 399.56M 10s
 70656K ........ ........ ........ ........ ........ ........ 408.83M 10s
 73728K ........ ........ ........ ........ ........ ........ 4210.5M 10s
 76800K ........ ........ ........ ........ ........ ........ 4410.0M 10s
 79872K ........ ........ ........ ........ ........ ........ 459.65M 9s
 82944K ........ ........ ........ ........ ........ ........ 478.52M 9s
 86016K ........ ........ ........ ........ ........ ........ 497.87M 9s
 89088K ........ ........ ........ ........ ........ ........ 5010.5M 9s
 92160K ........ ........ ........ ........ ........ ........ 5210.6M 8s
 95232K ........ ........ ........ ........ ........ ........ 5411.2M 8s
 98304K ........ ........ ........ ........ ........ ........ 5511.1M 8s
101376K ........ ........ ........ ........ ........ ........ 5711.3M 7s
104448K ........ ........ ........ ........ ........ ........ 5911.3M 7s
107520K ........ ........ ........ ........ ........ ........ 6111.3M 7s
110592K ........ ........ ........ ........ ........ ........ 6211.3M 6s
113664K ........ ........ ........ ........ ........ ........ 6411.3M 6s
116736K ........ ........ ........ ........ ........ ........ 6611.0M 6s
119808K ........ ........ ........ ........ ........ ........ 6711.2M 6s
122880K ........ ........ ........ ........ ........ ........ 6910.7M 5s
125952K ........ ........ ........ ........ ........ ........ 7111.3M 5s
129024K ........ ........ ........ ........ ........ ........ 724.86M 5s
132096K ........ ........ ........ ........ ........ ........ 7411.3M 4s
135168K ........ ........ ........ ........ ........ ........ 7611.2M 4s
138240K ........ ........ ........ ........ ........ ........ 7811.3M 4s
141312K ........ ........ ........ ........ ........ ........ 7911.3M 4s
144384K ........ ........ ........ ........ ........ ........ 8111.3M 3s
147456K ........ ........ ........ ........ ........ ........ 8311.2M 3s
150528K ........ ........ ........ ........ ........ ........ 8411.3M 3s
153600K ........ ........ ........ ........ ........ ........ 8610.8M 2s
156672K ........ ........ ........ ........ ........ ........ 8811.3M 2s
159744K ........ ........ ........ ........ ........ ........ 898.77M 2s
162816K ........ ........ ........ ........ ........ ........ 9111.3M 1s
165888K ........ ........ ........ ........ ........ ........ 939.29M 1s
168960K ........ ........ ........ ........ ........ ........ 9410.2M 1s
172032K ........ ........ ........ ........ ........ ........ 9611.2M 1s
175104K ........ ........ ........ ........ ........ ........ 9811.3M 0s
178176K ........ ........ ........ ........ ........ ......  10011.3M=17s
 
2017-10-12 11:00:54 (10.MB/s) - ‘jdk-8u144-linux-x64.tar.gz’ saved [185515842/185515842]
 
Download done.
Removing outdated cached downloads...
update-alternatives: error: no alternatives for java
update-alternatives: using /usr/lib/jvm/java-8-oracle/jre/bin/ControlPanel to provide /usr/bin/ControlPanel (ControlPanel) in auto mode
update-alternatives: using /usr/lib/jvm/java-8-oracle/jre/bin/java to provide /usr/bin/java (java) in auto mode
update-alternatives: using /usr/lib/jvm/java-8-oracle/jre/bin/javaws to provide /usr/bin/javaws (javaws) in auto mode
update-alternatives: using /usr/lib/jvm/java-8-oracle/jre/bin/jcontrol to provide /usr/bin/jcontrol (jcontrol) in auto mode
update-alternatives: using /usr/lib/jvm/java-8-oracle/jre/bin/jjs to provide /usr/bin/jjs (jjs) in auto mode
update-alternatives: using /usr/lib/jvm/java-8-oracle/jre/bin/keytool to provide /usr/bin/keytool (keytool) in auto mode
update-alternatives: using /usr/lib/jvm/java-8-oracle/jre/bin/orbd to provide /usr/bin/orbd (orbd) in auto mode
update-alternatives: using /usr/lib/jvm/java-8-oracle/jre/bin/pack200 to provide /usr/bin/pack200 (pack200) in auto mode
update-alternatives: using /usr/lib/jvm/java-8-oracle/jre/bin/policytool to provide /usr/bin/policytool (policytool) in auto mode
update-alternatives: using /usr/lib/jvm/java-8-oracle/jre/bin/rmid to provide /usr/bin/rmid (rmid) in auto mode
update-alternatives: using /usr/lib/jvm/java-8-oracle/jre/bin/rmiregistry to provide /usr/bin/rmiregistry (rmiregistry) in auto mode
update-alternatives: using /usr/lib/jvm/java-8-oracle/jre/bin/servertool to provide /usr/bin/servertool (servertool) in auto mode
update-alternatives: using /usr/lib/jvm/java-8-oracle/jre/bin/tnameserv to provide /usr/bin/tnameserv (tnameserv) in auto mode
update-alternatives: using /usr/lib/jvm/java-8-oracle/jre/bin/unpack200 to provide /usr/bin/unpack200 (unpack200) in auto mode
update-alternatives: using /usr/lib/jvm/java-8-oracle/jre/lib/jexec to provide /usr/bin/jexec (jexec) in auto mode
update-alternatives: using /usr/lib/jvm/java-8-oracle/bin/appletviewer to provide /usr/bin/appletviewer (appletviewer) in auto mode
update-alternatives: using /usr/lib/jvm/java-8-oracle/bin/extcheck to provide /usr/bin/extcheck (extcheck) in auto mode
update-alternatives: using /usr/lib/jvm/java-8-oracle/bin/idlj to provide /usr/bin/idlj (idlj) in auto mode
update-alternatives: using /usr/lib/jvm/java-8-oracle/bin/jar to provide /usr/bin/jar (jar) in auto mode
update-alternatives: using /usr/lib/jvm/java-8-oracle/bin/jarsigner to provide /usr/bin/jarsigner (jarsigner) in auto mode
update-alternatives: using /usr/lib/jvm/java-8-oracle/bin/javac to provide /usr/bin/javac (javac) in auto mode
update-alternatives: using /usr/lib/jvm/java-8-oracle/bin/javadoc to provide /usr/bin/javadoc (javadoc) in auto mode
update-alternatives: using /usr/lib/jvm/java-8-oracle/bin/javafxpackager to provide /usr/bin/javafxpackager (javafxpackager) in auto mode
update-alternatives: using /usr/lib/jvm/java-8-oracle/bin/javah to provide /usr/bin/javah (javah) in auto mode
update-alternatives: using /usr/lib/jvm/java-8-oracle/bin/javap to provide /usr/bin/javap (javap) in auto mode
update-alternatives: using /usr/lib/jvm/java-8-oracle/bin/javapackager to provide /usr/bin/javapackager (javapackager) in auto mode
update-alternatives: using /usr/lib/jvm/java-8-oracle/bin/jcmd to provide /usr/bin/jcmd (jcmd) in auto mode
update-alternatives: using /usr/lib/jvm/java-8-oracle/bin/jconsole to provide /usr/bin/jconsole (jconsole) in auto mode
update-alternatives: using /usr/lib/jvm/java-8-oracle/bin/jdb to provide /usr/bin/jdb (jdb) in auto mode
update-alternatives: using /usr/lib/jvm/java-8-oracle/bin/jdeps to provide /usr/bin/jdeps (jdeps) in auto mode
update-alternatives: using /usr/lib/jvm/java-8-oracle/bin/jhat to provide /usr/bin/jhat (jhat) in auto mode
update-alternatives: using /usr/lib/jvm/java-8-oracle/bin/jinfo to provide /usr/bin/jinfo (jinfo) in auto mode
update-alternatives: using /usr/lib/jvm/java-8-oracle/bin/jmap to provide /usr/bin/jmap (jmap) in auto mode
update-alternatives: using /usr/lib/jvm/java-8-oracle/bin/jmc to provide /usr/bin/jmc (jmc) in auto mode
update-alternatives: using /usr/lib/jvm/java-8-oracle/bin/jps to provide /usr/bin/jps (jps) in auto mode
update-alternatives: using /usr/lib/jvm/java-8-oracle/bin/jrunscript to provide /usr/bin/jrunscript (jrunscript) in auto mode
update-alternatives: using /usr/lib/jvm/java-8-oracle/bin/jsadebugd to provide /usr/bin/jsadebugd (jsadebugd) in auto mode
update-alternatives: using /usr/lib/jvm/java-8-oracle/bin/jstack to provide /usr/bin/jstack (jstack) in auto mode
update-alternatives: using /usr/lib/jvm/java-8-oracle/bin/jstat to provide /usr/bin/jstat (jstat) in auto mode
update-alternatives: using /usr/lib/jvm/java-8-oracle/bin/jstatd to provide /usr/bin/jstatd (jstatd) in auto mode
update-alternatives: using /usr/lib/jvm/java-8-oracle/bin/jvisualvm to provide /usr/bin/jvisualvm (jvisualvm) in auto mode
update-alternatives: using /usr/lib/jvm/java-8-oracle/bin/native2ascii to provide /usr/bin/native2ascii (native2ascii) in auto mode
update-alternatives: using /usr/lib/jvm/java-8-oracle/bin/rmic to provide /usr/bin/rmic (rmic) in auto mode
update-alternatives: using /usr/lib/jvm/java-8-oracle/bin/schemagen to provide /usr/bin/schemagen (schemagen) in auto mode
update-alternatives: using /usr/lib/jvm/java-8-oracle/bin/serialver to provide /usr/bin/serialver (serialver) in auto mode
update-alternatives: using /usr/lib/jvm/java-8-oracle/bin/wsgen to provide /usr/bin/wsgen (wsgen) in auto mode
update-alternatives: using /usr/lib/jvm/java-8-oracle/bin/wsimport to provide /usr/bin/wsimport (wsimport) in auto mode
update-alternatives: using /usr/lib/jvm/java-8-oracle/bin/xjc to provide /usr/bin/xjc (xjc) in auto mode
update-alternatives: using /usr/lib/jvm/java-8-oracle/jre/lib/amd64/libnpjp2.so to provide /usr/lib/mozilla/plugins/libjavaplugin.so (mozilla-javaplugin.so) in auto mode
Oracle JDK installed
 
#####Important########
To set Oracle JDK8 as default, install the "oracle-java8-set-default" package.
E.g.: sudo apt install oracle-java8-set-default
On Ubuntu systems, oracle-java8-set-default is most probably installed
automatically with this package.
######################
 
Selecting previously unselected package oracle-java8-set-default.........................]
(Reading database ... 59978 files and directories currently installed.)
Preparing to unpack .../oracle-java8-set-default_8u144-1~webupd8~0_all.deb ...
Unpacking oracle-java8-set-default (8u144-1~webupd8~0) ..................................]
Selecting previously unselected package gsfonts..........................................]
Preparing to unpack .../gsfonts_1%3a8.11+urwcyr1.0.7~pre44-4.2ubuntu1_all.deb ...
Unpacking gsfonts (1:8.11+urwcyr1.0.7~pre44-4.2ubuntu1) .................................]
Selecting previously unselected package libfontenc1:amd64................................]
Preparing to unpack .../libfontenc1_1%3a1.1.3-1_amd64.deb ...
Unpacking libfontenc1:amd64 (1:1.1.3-1) ...###...........................................]
Selecting previously unselected package libxfont1:amd64..................................]
Preparing to unpack .../libxfont1_1%3a1.5.1-1ubuntu0.16.04.3_amd64.deb ...
Unpacking libxfont1:amd64 (1:1.5.1-1ubuntu0.16.04.3) ....................................]
Selecting previously unselected package x11-common.##....................................]
Preparing to unpack .../x11-common_1%3a7.7+13ubuntu3_all.deb ...
Unpacking x11-common (1:7.7+13ubuntu3) ...############...................................]
Selecting previously unselected package xfonts-encodings.................................]
Preparing to unpack .../xfonts-encodings_1%3a1.0.4-2_all.deb ...
Unpacking xfonts-encodings (1:1.0.4-2) ...################...............................]
Selecting previously unselected package xfonts-utils.#######.............................]
Preparing to unpack .../xfonts-utils_1%3a7.7+3ubuntu0.16.04.2_amd64.deb ...
Unpacking xfonts-utils (1:7.7+3ubuntu0.16.04.2) ...###########...........................]
Selecting previously unselected package gsfonts-x11.############.........................]
Preparing to unpack .../gsfonts-x11_0.24_all.deb ...
Unpacking gsfonts-x11 (0.24) ...#################################........................]
Processing triggers for libc-bin (2.23-0ubuntu9) ...################.....................]
Processing triggers for systemd (229-4ubuntu19) ...
Processing triggers for ureadahead (0.100.0-19) ...
Processing triggers for man-db (2.7.5-1) ...
Setting up oracle-java8-set-default (8u144-1~webupd8~0) ...
Setting up gsfonts (1:8.11+urwcyr1.0.7~pre44-4.2ubuntu1) ...##########...................]
Setting up libfontenc1:amd64 (1:1.1.3-1) ...#############################................]
Setting up libxfont1:amd64 (1:1.5.1-1ubuntu0.16.04.3) ...###################.............]
Setting up x11-common (1:7.7+13ubuntu3) ...###################################...........]
update-rc.d: warning: start and stop actions are no longer supported; falling back to defaults
Setting up xfonts-encodings (1:1.0.4-2) ...######################################........]
Setting up xfonts-utils (1:7.7+3ubuntu0.16.04.2) ...###############################......]
Setting up gsfonts-x11 (0.24) ...#####################################################...]
Processing triggers for libc-bin (2.23-0ubuntu9) ...####################################.]
Processing triggers for systemd (229-4ubuntu19) ...
Processing triggers for ureadahead (0.100.0-19) ...
 
cs

* During the installation, license agreement dialogs appear as shown below.


6-1. License agreement


6-2. License agreement

* Select Yes as shown above and move on to the next screen.
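
* If you install Java on several nodes, you can accept the Oracle license in advance so the dialog is not shown (a commonly used preseed for this PPA's installer; treat the question name as an assumption and verify it against your installer version):

$ echo "oracle-java8-installer shared/accepted-oracle-license-v1-1 select true" | sudo debconf-set-selections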



7. Check the Java version

1
2
3
4
5
$ java -version
java version "1.8.0_144"
Java(TM) SE Runtime Environment (build 1.8.0_144-b01)
Java HotSpot(TM) 64-Bit Server VM (build 25.144-b01, mixed mode)
 
cs



8. Add the Java environment variables

* Note

 : To have the environment variables and related defaults set for you by a command, run the following.

1
2
sudo apt install oracle-java8-set-default
 
cs

8-1. Check the Java path

1
2
3
4
5
6
7
8
9
10
$ sudo update-alternatives --config java
There is 1 choice for the alternative java (providing /usr/bin/java).
 
  Selection    Path                                     Priority   Status
------------------------------------------------------------
  0            /usr/lib/jvm/java-8-oracle/jre/bin/java   1081      auto mode
* 1            /usr/lib/jvm/java-8-oracle/jre/bin/java   1081      manual mode
 
Press <enter> to keep the current choice[*], or type selection number:
 
cs

* Press <enter> to keep the currently selected setting.

* The JDK is installed under "/usr/lib/jvm/java-8-oracle".
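
* Another way to confirm the JDK location is to resolve the java binary that is on the PATH (the path below assumes the oracle-java8 package installed above):

$ readlink -f "$(which java)"
/usr/lib/jvm/java-8-oracle/jre/bin/java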


8-2. Register the Java environment variable in the environment file

8-2-1. Edit the environment file

1
$ sudo nano /etc/environment
cs

* You may use vi instead if you prefer.


8-2-2. Add the Java environment variable

1
2
3
PATH="/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/gam$
JAVA_HOME="/usr/lib/jvm/java-8-oracle"
cs

* Append the JAVA_HOME="/usr/lib/jvm/java-8-oracle" line at the bottom as shown above and save.


8-2-3. Apply the changes made to the environment file

1
$ source /etc/environment
cs
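
* To confirm the variable is visible in the current shell, print it (new login sessions pick it up automatically):

$ echo $JAVA_HOME
/usr/lib/jvm/java-8-oracle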


8-3. Register the Java environment variables in the profile file

8-3-1. Edit the profile file

1
$ sudo nano /etc/profile
cs


8-3-2. Add the Java environment variables

1
2
3
4
5
6
7
8
# /etc/profile: system-wide .profile file for the Bourne shell (sh(1))
# and Bourne compatible shells (bash(1), ksh(1), ash(1), ...).
 
...
 
export JAVA_HOME=/usr/lib/jvm/java-8-oracle
export PATH=$PATH:$JAVA_HOME/bin
 
cs

* Append the export JAVA_HOME and export PATH lines at the bottom as shown above and save.


8-3-3. Apply the changes made to the profile file

1
$ source /etc/profile
cs



9. Register the nodes in the hosts file

9-1. Edit the hosts file

1
$ sudo nano /etc/hosts
cs


9-2. Register each node in the hosts file

1
2
3
4
5
6
7
8
9
10
11
12
13
14
127.0.0.1       localhost
#127.0.1.1      ubuntu.localdomain      ubuntu
 
# The following lines are desirable for IPv6 capable hosts
::1     localhost ip6-localhost ip6-loopback
ff02::1 ip6-allnodes
ff02::2 ip6-allrouters
 
192.168.10.101 master
192.168.10.102 secondary
192.168.10.103 datanode-a
192.168.10.104 datanode-b
192.168.10.105 datanode-c
 
cs

* Comment out the 127.0.1.1 entry.

* Enter the IP address and host name of each node, then save.
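
* To confirm that every host name added above resolves, you can look each one up (a simple check using the host names from this guide):

$ for h in master secondary datanode-a datanode-b datanode-c; do getent hosts "$h"; done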



10. sysctl.conf

10-1. Edit the sysctl.conf file

1
$ sudo nano /etc/sysctl.conf
cs


10-2. Add the following to the sysctl.conf file and save

1
2
3
4
5
6
7
8
9
10
11
12
13
#
# /etc/sysctl.conf - Configuration file for setting system variables
# See /etc/sysctl.d/ for additional system variables.
# See sysctl.conf (5) for information.
#
 
...
 
# disable ipv6
net.ipv6.conf.all.disable_ipv6 = 1
net.ipv6.conf.default.disable_ipv6 = 1
net.ipv6.conf.lo.disable_ipv6 = 1
 
cs

* Add the 4 lines shown above and save.

* After saving the changes, reboot the server.
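
* If you want to apply the setting without waiting for the reboot in the next step, you can reload sysctl and check the result (the value should print 1 once IPv6 is disabled):

$ sudo sysctl -p
$ cat /proc/sys/net/ipv6/conf/all/disable_ipv6
1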


10-3. Reboot the server

1
$ sudo reboot
cs



11. Add the hadoop group and user account

11-1. Add the hadoop group

1
2
3
4
$ sudo addgroup hadoop
Adding group `hadoop' (GID 1001) ...
Done.
 
cs


11-2. Add the hadoop user

1
2
3
4
5
6
7
8
9
10
11
12
13
14
15
16
17
$ sudo adduser --ingroup hadoop hadoop-user
Adding user `hadoop-user' ...
Adding new user `hadoop-user' (1001) with group `hadoop' ...
Creating home directory `/home/hadoop-user' ...
Copying files from `/etc/skel' ...
Enter new UNIX password:
Retype new UNIX password:
passwd: password updated successfully
Changing the user information for hadoop-user
Enter the new value, or press ENTER for the default
        Full Name []:
        Room Number []:
        Work Phone []:
        Home Phone []:
        Other []:
Is the information correct? [Y/n] Y
 
cs

* The --ingroup option lets the new user be placed in an existing group of your choice. By default, system users are placed in the nogroup group.

* When adding a new user, you are prompted for a password and the user's details.

* Once you confirm that the entered information is correct, the user is created.
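
* You can verify the new account and its group membership as follows (the IDs shown assume the UID/GID 1001 from the output above):

$ id hadoop-user
uid=1001(hadoop-user) gid=1001(hadoop) groups=1001(hadoop)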



12. Configure ssh

 -> For a detailed explanation of the ssh configuration, refer to the link above.

12-1. ssh configuration in brief

1
2
3
4
5
6
7
8
9
10
11
12
$ ssh-keygen -t rsa -P ""
$ cat $HOME/.ssh/id_rsa.pub >> $HOME/.ssh/authorized_keys
 
$ ssh-copy-id -i ~/.ssh/id_rsa.pub hadoop-user@datanode-a
$ ssh-copy-id -i ~/.ssh/id_rsa.pub hadoop-user@datanode-b
$ ssh-copy-id -i ~/.ssh/id_rsa.pub hadoop-user@datanode-c
 
$ ssh secondary
$ ssh datanode-a
$ ssh datanode-b
$ ssh datanode-c
 
cs

* You must be able to connect from the name node to every node without any problems (if the secondary name node is a separate machine, copy the key to it with ssh-copy-id as well).
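
* A quick way to test this from the name node: run a command on every node over ssh; each line should print the remote host name without asking for a password (a minimal sketch using the host names from this guide):

$ for h in secondary datanode-a datanode-b datanode-c; do ssh hadoop-user@"$h" hostname; done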



13. Create the hadoop storage directories on the Data Nodes and set their permissions

13-1. Create the hadoop storage directories

1
2
3
4
$ sudo mkdir -p /data01/datanode
$ sudo mkdir -p /data02/datanode
$ sudo mkdir -p /data03/datanode
 
cs

* The -p option of mkdir also creates the parent directories when they do not exist.


13-2. Verify the hadoop storage directories

1
2
3
4
5
6
7
8
9
10
11
12
13
14
15
16
17
18
19
20
21
22
23
24
$ ll / |grep data*
drwxr-xr-x   3 root root  4096 Oct 12 15:22 data01/
drwxr-xr-x   3 root root  4096 Oct 12 15:24 data02/
drwxr-xr-x   3 root root  4096 Oct 12 15:25 data03/
 
$ ll /data*
/data01:
total 12
drwxr-xr-x  3 root root 4096 Oct 12 15:22 ./
drwxr-xr-x 26 root root 4096 Oct 12 15:25 ../
drwxr-xr-x  2 root root 4096 Oct 12 15:22 datanode/
 
/data02:
total 12
drwxr-xr-x  3 root root 4096 Oct 12 15:24 ./
drwxr-xr-x 26 root root 4096 Oct 12 15:25 ../
drwxr-xr-x  2 root root 4096 Oct 12 15:24 datanode/
 
/data03:
total 12
drwxr-xr-x  3 root root 4096 Oct 12 15:25 ./
drwxr-xr-x 26 root root 4096 Oct 12 15:25 ../
drwxr-xr-x  2 root root 4096 Oct 12 15:25 datanode/
 
cs

* Because the owner of the hadoop storage directories is not hadoop-user, the owner and group are changed in the next step.


13-3. Change the owner and group of the hadoop storage directories

1
2
3
$ sudo chown -R hadoop-user:hadoop /data01
$ sudo chown -R hadoop-user:hadoop /data02
$ sudo chown -R hadoop-user:hadoop /data03
cs


13-4. Verify the owner and group of the hadoop storage directories

1
2
3
4
5
6
7
8
9
10
11
12
13
14
15
16
17
18
19
20
21
22
23
24
$ ll / |grep data*
drwxr-xr-x   3 hadoop-user hadoop  4096 Oct 12 15:22 data01/
drwxr-xr-x   3 hadoop-user hadoop  4096 Oct 12 15:24 data02/
drwxr-xr-x   3 hadoop-user hadoop  4096 Oct 12 15:25 data03/
 
$ ll /data*
/data01:
total 12
drwxr-xr-x  3 hadoop-user hadoop 4096 Oct 12 15:22 ./
drwxr-xr-x 26 root        root   4096 Oct 12 15:25 ../
drwxr-xr-x  2 hadoop-user hadoop 4096 Oct 12 15:22 datanode/
 
/data02:
total 12
drwxr-xr-x  3 hadoop-user hadoop 4096 Oct 12 15:24 ./
drwxr-xr-x 26 root        root   4096 Oct 12 15:25 ../
drwxr-xr-x  2 hadoop-user hadoop 4096 Oct 12 15:24 datanode/
 
/data03:
total 12
drwxr-xr-x  3 hadoop-user hadoop 4096 Oct 12 15:25 ./
drwxr-xr-x 26 root        root   4096 Oct 12 15:25 ../
drwxr-xr-x  2 hadoop-user hadoop 4096 Oct 12 15:25 datanode/
 
cs



14. Download hadoop

1
2
3
4
5
6
7
8
9
10
11
12
$ wget http://apache.mirror.cdnetworks.com/hadoop/common/hadoop-2.8.1/hadoop-2.8.1.tar.gz
--2017-10-12 15:46:05--  http://apache.mirror.cdnetworks.com/hadoop/common/hadoop-2.8.1/hadoop-2.8.1.tar.gz
Resolving apache.mirror.cdnetworks.com (apache.mirror.cdnetworks.com)... 14.0.101.165
Connecting to apache.mirror.cdnetworks.com (apache.mirror.cdnetworks.com)|14.0.101.165|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 424555111 (405M) [application/x-gzip]
Saving to: ‘hadoop-2.8.1.tar.gz’
 
hadoop-2.8.1.tar.gz     100%[============================>] 404.89M  5.97MB/s    in 2m 18s
 
2017-10-12 15:48:23 (2.94 MB/s) - ‘hadoop-2.8.1.tar.gz’ saved [424555111/424555111]
 
cs



15. Extracting the hadoop archive

1
2
$ tar -xvzf ~/hadoop-2.8.1.tar.gz
 
cs




※ For step 16 and onward, it is best to switch to the hadoop owner account before proceeding.

1
2
3
$ su - hadoop-user
Password:
 
cs


※ If no additional configuration has been done since creating the hadoop-user account, it cannot use the sudo command; refer to the link below.

 -> How to gain root privileges with the sudo command
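
On Ubuntu, one common way to grant this (the linked post may describe another method) is to add the account to the sudo group from an account that already has sudo rights:

1
2
$ sudo usermod -aG sudo hadoop-user
 
cs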



16. Moving the ~/hadoop-2.8.1 directory to /usr/local/

1
2
$ sudo mv ~/hadoop-2.8.1 /usr/local/
 
cs



17. Verifying the hadoop directory path

1
2
3
$ ll /usr/local/ |grep hadoop-2.8.1
drwxrwxr-x  9 user user      4096 Jun  2 15:24 hadoop-2.8.1/
 
cs



18. Creating the /usr/local/hadoop link to the /usr/local/hadoop-2.8.1 directory

1
2
$ sudo ln -sf /usr/local/hadoop-2.8.1 /usr/local/hadoop
 
cs



19. Verifying the hadoop link

1
2
3
4
$ ll /usr/local/ |grep hadoop
lrwxrwxrwx  1 root root        23 Oct 12 17:28 hadoop -> /usr/local/hadoop-2.8.1/
drwxrwxr-x  9 user user      4096 Jun  2 15:24 hadoop-2.8.1/
 
cs

* Because the owner and group of the directory above are not the hadoop user and group, they are changed in the next step.



20. Changing the owner and group of the hadoop directory

1
2
$ sudo chown -R hadoop-user:hadoop /usr/local/hadoop-2.8.1
 
cs



21. Verifying the owner and group of the hadoop directory

1
2
3
4
$ ll /usr/local/ |grep hadoop
lrwxrwxrwx  1 root        root          23 Oct 12 17:28 hadoop -> /usr/local/hadoop-2.8.1/
drwxrwxr-x  9 hadoop-user hadoop      4096 Jun  2 15:24 hadoop-2.8.1/
 
cs



22. Creating the masters file

1
2
$ echo secondary > /usr/local/hadoop/etc/hadoop/masters
 
cs

* If there is a secondary name node, enter its hostname in the masters file.
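
The file can be checked with cat; given the echo above, it should contain only the secondary node's hostname:

1
2
3
$ cat /usr/local/hadoop/etc/hadoop/masters
secondary
 
cs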



23. Editing the slaves file

23-1. Initial state

1
2
$ cat /usr/local/hadoop/etc/hadoop/slaves
 
cs


23-2. Modifying the settings

23-2-1. Editing slaves

1
2
$ vi /usr/local/hadoop/etc/hadoop/slaves
 
cs


23-2-2. Edit the settings and save

1
2
3
4
datanode-a
datanode-b
datanode-c
 
cs

* Delete the existing localhost entry, enter the data node hostnames one per line as shown above, and save.
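
The result can be verified with cat:

1
2
3
4
5
$ cat /usr/local/hadoop/etc/hadoop/slaves
datanode-a
datanode-b
datanode-c
 
cs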


24. Adding the java environment variables

24-1. Registering the java environment variables in the .bashrc file

24-1-1. Editing the .bashrc file

1
2
$ nano ~/.bashrc
 
cs


24-1-2. Adding the java environment variables

1
2
3
4
5
6
7
8
9
10
11
12
13
14
15
16
17
18
19
20
21
22
# ~/.bashrc: executed by bash(1) for non-login shells.
# see /usr/share/doc/bash/examples/startup-files (in the package bash-doc)
# for examples
 
...
 
# Set Hadoop-related environment variables
export HADOOP_PREFIX=/usr/local/hadoop
export HADOOP_HOME=/usr/local/hadoop
export HADOOP_MAPRED_HOME=${HADOOP_HOME}
export HADOOP_COMMON_HOME=${HADOOP_HOME}
export HADOOP_HDFS_HOME=${HADOOP_HOME}
export HADOOP_YARN_HOME=${HADOOP_HOME}
export HADOOP_CONF_DIR=${HADOOP_HOME}/etc/hadoop
# Native path
export HADOOP_COMMON_LIB_NATIVE_DIR=${HADOOP_PREFIX}/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_PREFIX/lib/native"
# Java path
export JAVA_HOME=/usr/lib/jvm/java-8-oracle
# Add Hadoop bin/ directory to PATH
export PATH=$PATH:$HADOOP_HOME/bin:$JAVA_HOME/bin:$HADOOP_HOME/sbin
 
cs

* Add lines 7-21 shown above to the bottom of the file and save.


24-1-3. Applying the edited .bashrc file

1
$ source ~/.bashrc
cs
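
To confirm that the variables are now active in the current shell, a quick check such as the following can be used (hadoop version only resolves once the PATH above is in effect):

1
2
3
$ echo $HADOOP_HOME
$ hadoop version
 
cs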



25. Configuring hadoop

25-1. Editing the core-site.xml file

1
2
$ vi /usr/local/hadoop/etc/hadoop/core-site.xml
 
cs


25-1-1. Adding settings to the core-site.xml file

25-1-2. Initial state

1
2
3
4
5
6
7
8
9
10
11
12
13
14
15
16
17
18
19
20
21
<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!--
  Licensed under the Apache License, Version 2.0 (the "License");
  you may not use this file except in compliance with the License.
  You may obtain a copy of the License at
 
    http://www.apache.org/licenses/LICENSE-2.0
 
  Unless required by applicable law or agreed to in writing, software
  distributed under the License is distributed on an "AS IS" BASIS,
  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
  See the License for the specific language governing permissions and
  limitations under the License. See accompanying LICENSE file.
-->
 
<!-- Put site-specific property overrides in this file. -->
 
<configuration>
</configuration>
 
cs

* Add the settings between <configuration> and </configuration>.


25-1-3. Adding the settings

1
2
3
4
5
6
7
8
9
<property>
        <name>fs.default.name</name>
        <value>hdfs://master:8020</value>
</property>
<property>
        <name>hadoop.tmp.dir</name>
        <value>/usr/local/hadoop/tmp</value>
</property>
 
cs

* Add the content above and save.

* master is the name node's hostname; replace it with the hostname used in your environment.
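
Note that fs.default.name is the older key name; recent Hadoop releases map it to fs.defaultFS. Once the cluster is up, the effective value can be checked with (a sketch):

1
2
$ hdfs getconf -confKey fs.defaultFS
 
cs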


25-2. Editing the hdfs-site.xml file

1
2
$ vi /usr/local/hadoop/etc/hadoop/hdfs-site.xml
 
cs


25-2-1. Adding settings to the hdfs-site.xml file

25-2-2. Initial state

1
2
3
4
5
6
7
8
9
10
11
12
13
14
15
16
17
18
19
20
21
22
<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!--
  Licensed under the Apache License, Version 2.0 (the "License");
  you may not use this file except in compliance with the License.
  You may obtain a copy of the License at
 
    http://www.apache.org/licenses/LICENSE-2.0
 
  Unless required by applicable law or agreed to in writing, software
  distributed under the License is distributed on an "AS IS" BASIS,
  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
  See the License for the specific language governing permissions and
  limitations under the License. See accompanying LICENSE file.
-->
 
<!-- Put site-specific property overrides in this file. -->
 
<configuration>
 
</configuration>
 
cs

* Add the settings between <configuration> and </configuration>.


25-2-3. Adding the settings

1
2
3
4
5
6
7
8
9
10
11
12
13
14
15
16
17
18
19
20
21
22
23
24
25
26
27
28
29
30
31
<property>
        <name>dfs.replication</name>
        <value>3</value>
</property>
<property>
        <name>dfs.http.address</name>
        <value>master:50070</value>
</property>
<property>
        <name>dfs.namenode.name.dir</name>
        <value>/usr/local/hadoop/yarn_data/hdfs/namenode</value>
</property>
<property>
        <name>dfs.namenode.secondary.http-address</name>
        <value>secondary:50090</value>
</property>
<!--
<property>
        <name>fs.default.name</name>
        <value>hdfs://master:8020</value>
</property>
-->
<property>
        <name>dfs.hosts</name>
        <value>/usr/local/hadoop/etc/hadoop/slaves</value>
</property>
<property>
        <name>dfs.datanode.data.dir</name>
        <value>/data01/datanode,/data02/datanode,/data03/datanode</value>
</property>
 
cs

* Add the content above and save.

* Create the directory referenced by "/usr/local/hadoop/yarn_data/hdfs/namenode".

1
2
$ mkdir -p /usr/local/hadoop/yarn_data/hdfs/namenode
 
cs


25-3. Creating the mapred-site.xml file

1
2
$ cp /usr/local/hadoop/etc/hadoop/mapred-site.xml.template /usr/local/hadoop/etc/hadoop/mapred-site.xml
 
cs

* Because mapred-site.xml does not exist initially, copy the mapred-site.xml.template file to mapred-site.xml.
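
Then open the copied file for editing, in the same way as the other configuration files:

1
2
$ vi /usr/local/hadoop/etc/hadoop/mapred-site.xml
 
cs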


25-3-1. Adding settings to the mapred-site.xml file

25-3-2. Initial state

1
2
3
4
5
6
7
8
9
10
11
12
13
14
15
16
17
18
19
20
21
22
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!--
  Licensed under the Apache License, Version 2.0 (the "License");
  you may not use this file except in compliance with the License.
  You may obtain a copy of the License at
 
    http://www.apache.org/licenses/LICENSE-2.0
 
  Unless required by applicable law or agreed to in writing, software
  distributed under the License is distributed on an "AS IS" BASIS,
  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
  See the License for the specific language governing permissions and
  limitations under the License. See accompanying LICENSE file.
-->
 
<!-- Put site-specific property overrides in this file. -->
 
<configuration>
 
</configuration>
 
cs

* Add the settings between <configuration> and </configuration>.


25-3-3. Adding the settings

1
2
3
4
5
6
7
8
9
10
11
12
13
14
15
16
17
<property>
        <name>mapreduce.framework.name</name>
        <value>yarn</value>
</property>
<property>
        <name>mapreduce.jobtracker.address</name>
        <value>master:8021</value>
</property>
<property>
        <name>mapred.job.tracker</name>
        <value>master:9001</value>
</property>
<property>
        <name>mapred.job.tracker.http.address</name>
        <value>master:50030</value>
</property>
 
cs

* Add the content above and save.


25-4. Editing the yarn-site.xml file

1
2
$ vi /usr/local/hadoop/etc/hadoop/yarn-site.xml
 
cs


25-4-1. Adding settings to the yarn-site.xml file

25-4-2. Initial state

1
2
3
4
5
6
7
8
9
10
11
12
13
14
15
16
17
18
19
20
<?xml version="1.0"?>
<!--
  Licensed under the Apache License, Version 2.0 (the "License");
  you may not use this file except in compliance with the License.
  You may obtain a copy of the License at
 
    http://www.apache.org/licenses/LICENSE-2.0
 
  Unless required by applicable law or agreed to in writing, software
  distributed under the License is distributed on an "AS IS" BASIS,
  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
  See the License for the specific language governing permissions and
  limitations under the License. See accompanying LICENSE file.
-->
<configuration>
 
<!-- Site specific YARN configuration properties -->
 
</configuration>
 
cs

* Add the settings between <configuration> and </configuration>.


25-4-3. Adding the settings

1
2
3
4
5
6
7
8
9
10
11
12
13
14
15
16
17
18
19
20
21
22
23
24
25
26
27
28
29
30
31
32
33
<configuration>
 
<!-- Site specific YARN configuration properties -->
<property>
        <name>yarn.nodemanager.aux-services</name>
        <value>mapreduce_shuffle</value>
</property>
<property>
        <name>yarn.nodemanager.aux-services.mapreduce_shuffle.class</name>
        <value>org.apache.hadoop.mapred.ShuffleHandler</value>
</property>
<property>
        <name>yarn.resourcemanager.hostname</name>
        <value>master</value>
</property>
<property>
        <name>yarn.resourcemanager.scheduler.address</name>
        <value>master:8030</value>
</property>
<property>
        <name>yarn.resourcemanager.resource-tracker.address</name>
        <value>master:8031</value>
</property>
<property>
        <name>yarn.resourcemanager.address</name>
        <value>master:8032</value>
</property>
<property>
        <name>yarn.resourcemanager.admin.address</name>
        <value>master:8041</value>
</property>
</configuration>
 
cs

* Add the content above and save.



After the installation and configuration above have been completed on every node, carry out the steps below on the name node.
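
If Hadoop is installed at the same path on every node, one way to propagate the edited configuration files from the name node is scp (a sketch that relies on the ssh setup from step 12):

1
2
3
4
$ for host in secondary datanode-a datanode-b datanode-c; do
>     scp /usr/local/hadoop/etc/hadoop/{*-site.xml,masters,slaves} "$host":/usr/local/hadoop/etc/hadoop/
> done
 
cs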


26. Formatting hdfs

1
2
3
4
5
6
7
8
9
10
11
12
13
14
15
16
17
18
19
20
21
22
23
24
25
26
27
28
29
30
31
32
33
34
35
36
37
38
39
40
41
42
43
44
45
46
47
48
49
50
51
52
53
54
55
56
57
58
59
60
61
62
63
64
65
66
67
68
69
70
71
72
73
74
75
76
77
78
79
80
81
82
83
84
$ hdfs namenode -format
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/local/hadoop-2.8.1/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/hive-0.8.1/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
17/10/13 09:17:17 INFO namenode.NameNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting NameNode
STARTUP_MSG:   user = hadoop-user
STARTUP_MSG:   host = master/192.168.10.101
STARTUP_MSG:   args = [-format]
STARTUP_MSG:   version = 2.8.1
STARTUP_MSG:   classpath = /usr/local/hadoop/etc/hadoop:/usr/local/hadoop/share/hadoop/common/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-math3-3.1.1.jar:/usr/local/hadoop/share/hadoop/common/lib/httpcore-4.4.4.jar:/usr/local/hadoop/share/hadoop/common/lib/jets3t-0.9.0.jar:/usr/local/hadoop/share/hadoop/common/lib/xmlenc-0.52.jar:/usr/local/hadoop/share/hadoop/common/lib/hamcrest-core-1.3.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-beanutils-1.7.0.jar:/usr/local/hadoop/share/hadoop/common/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/snappy-java-1.0.4.1.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/common/lib/activation-1.1.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-client-2.7.1.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-recipes-2.7.1.jar:/usr/local/hadoop/share/hadoop/common/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/common/lib/htrace-core4-4.0.1-incubating.jar:/usr/local/hadoop/share/hadoop/common/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-collections-3.2.2.jar:/usr/local/hadoop/share/hadoop/common/lib/api-asn1-api-1.0.0-M20.jar:/usr/local/hadoop/share/hadoop/common/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-digester-1.8.jar:/usr/local/hadoop/share/hadoop/common/lib/gson-2.2.4.jar:/usr/local/hadoop/share/hadoop/common/lib/stax-api-1.0-2.jar:/usr/local/hadoop/share/hadoop/common/lib/mockito-all-1.8.5.jar:/usr/local/hadoop/share/hadoop/common/lib/jcip-annotations-1.0.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/api-util-1.0.0-M20.jar:/usr/local/hadoop/share/hadoop/common/lib/json-smart-1.1.1.jar:/usr/local/hadoop/share/hadoop/common/lib/jaxb-impl-2.2.3-1.jar:/usr/local/hadoop/share/hadoop/common/lib/jsp-api-2.1.jar:/usr/local/hadoop/share/hadoop/common/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-codec-1.4.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-beanutils-core-1.8.0.jar:/usr/local/hadoop/share/hadoop/common/lib/jersey-json-1.9.jar:/usr/local/hadoop/share/hadoop/common/lib/jsr305-3.0.0.jar:/usr/local/hadoop/share/hadoop/common/lib/java-xmlbuilder-0.4.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-xc-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/common/lib/zookeeper-3.4.6.jar:/usr/local/hadoop/share/hadoop/common/lib/nimbus-jose-jwt-3.9.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/common/lib/paranamer-2.3.jar:/usr/local/hadoop/share/hadoop/common/lib/apacheds-i18n-2.0.0-M15.jar:/usr/local/hadoop/share/hadoop/common/lib/curator-framework-2.7.1.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-net-3.1.jar:/usr/local/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar:/usr/local/hadoop/share/hadoop/common/lib/hadoop-auth-2.8.1.jar:/usr/local/hadoop/share/hadoop/common/lib/slf4j-api-1.7.10.jar:/usr/local/hadoop/share/hadoop/common/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/common/lib/jsch-0.1.51.jar:/usr/local/hadoop/share/hadoop/common/lib/servlet-api-2.5.jar:/usr/local/had
oop/share/hadoop/common/lib/junit-4.11.jar:/usr/local/hadoop/share/hadoop/common/lib/jetty-6.1.26.jar:/usr/local/hadoop/share/hadoop/common/lib/jettison-1.1.jar:/usr/local/hadoop/share/hadoop/common/lib/jackson-jaxrs-1.9.13.jar:/usr/local/hadoop/share/hadoop/common/lib/jaxb-api-2.2.2.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-lang-2.6.jar:/usr/local/hadoop/share/hadoop/common/lib/hadoop-annotations-2.8.1.jar:/usr/local/hadoop/share/hadoop/common/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/common/lib/avro-1.7.4.jar:/usr/local/hadoop/share/hadoop/common/lib/jetty-sslengine-6.1.26.jar:/usr/local/hadoop/share/hadoop/common/lib/commons-configuration-1.6.jar:/usr/local/hadoop/share/hadoop/common/lib/httpclient-4.5.2.jar:/usr/local/hadoop/share/hadoop/common/hadoop-nfs-2.8.1.jar:/usr/local/hadoop/share/hadoop/common/hadoop-common-2.8.1.jar:/usr/local/hadoop/share/hadoop/common/hadoop-common-2.8.1-tests.jar:/usr/local/hadoop/share/hadoop/hdfs:/usr/local/hadoop/share/hadoop/hdfs/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xmlenc-0.52.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-daemon-1.0.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/okio-1.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/htrace-core4-4.0.1-incubating.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/leveldbjni-all-1.8.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xml-apis-1.3.04.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-codec-1.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jsr305-3.0.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/okhttp-2.4.0.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/netty-all-4.0.23.Final.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/hadoop-hdfs-client-2.8.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/servlet-api-2.5.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/jetty-6.1.26.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/xercesImpl-2.9.1.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/commons-lang-2.6.jar:/usr/local/hadoop/share/hadoop/hdfs/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-nfs-2.8.1.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-native-client-2.8.1-tests.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-2.8.1.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-client-2.8.1.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-native-client-2.8.1.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-2.8.1-tests.jar:/usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-client-2.8.1-tests.jar:/usr/local/hadoop/share/hadoop/yarn/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/guice-servlet-3.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/javassist-3.18.1-GA.jar:/usr/local/hadoop/share/hadoop/yarn/lib/fst-2.24.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-math-2.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoo
p/yarn/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/javax.inject-1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/activation-1.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/curator-client-2.7.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-guice-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jetty-util-6.1.26.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-collections-3.2.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/objenesis-2.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/yarn/lib/stax-api-1.0-2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/leveldbjni-all-1.8.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-logging-1.1.3.jar:/usr/local/hadoop/share/hadoop/yarn/lib/aopalliance-1.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jaxb-impl-2.2.3-1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-codec-1.4.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-json-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jsr305-3.0.0.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-xc-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/yarn/lib/zookeeper-3.4.6.jar:/usr/local/hadoop/share/hadoop/yarn/lib/curator-test-2.7.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-cli-1.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/guava-11.0.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/servlet-api-2.5.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jetty-6.1.26.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jettison-1.1.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jackson-jaxrs-1.9.13.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jaxb-api-2.2.2.jar:/usr/local/hadoop/share/hadoop/yarn/lib/commons-lang-2.6.jar:/usr/local/hadoop/share/hadoop/yarn/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/yarn/lib/zookeeper-3.4.6-tests.jar:/usr/local/hadoop/share/hadoop/yarn/lib/jersey-client-1.9.jar:/usr/local/hadoop/share/hadoop/yarn/lib/guice-3.0.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-api-2.8.1.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-tests-2.8.1.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-sharedcachemanager-2.8.1.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-common-2.8.1.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-resourcemanager-2.8.1.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-applications-unmanaged-am-launcher-2.8.1.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-client-2.8.1.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-timeline-pluginstorage-2.8.1.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-nodemanager-2.8.1.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-web-proxy-2.8.1.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-applications-distributedshell-2.8.1.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-applicationhistoryservice-2.8.1.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-registry-2.8.1.jar:/usr/local/hadoop/share/hadoop/yarn/hadoop-yarn-server-common-2.8.1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/protobuf-java-2.5.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/guice-servlet-3.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/hamcrest-c
ore-1.3.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jersey-core-1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/snappy-java-1.0.4.1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/javax.inject-1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/commons-compress-1.4.1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jersey-guice-1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/xz-1.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/netty-3.6.2.Final.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/leveldbjni-all-1.8.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/aopalliance-1.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jersey-server-1.9.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/jackson-core-asl-1.9.13.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/asm-3.2.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/commons-io-2.4.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/paranamer-2.3.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/junit-4.11.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/hadoop-annotations-2.8.1.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/log4j-1.2.17.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/avro-1.7.4.jar:/usr/local/hadoop/share/hadoop/mapreduce/lib/guice-3.0.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-plugins-2.8.1.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-core-2.8.1.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.8.1.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-2.8.1-tests.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-2.8.1.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-shuffle-2.8.1.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.8.1.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-app-2.8.1.jar:/usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-common-2.8.1.jar:/usr/local/hive/conf:/usr/local/hive/lib:/usr/local/hive/lib/ant-contrib-1.0b3.jar:/usr/local/hive/lib/antlr-2.7.7.jar:/usr/local/hive/lib/antlr-3.0.1.jar:/usr/local/hive/lib/antlr-runtime-3.0.1.jar:/usr/local/hive/lib/asm-3.1.jar:/usr/local/hive/lib/commons-cli-1.2.jar:/usr/local/hive/lib/commons-codec-1.3.jar:/usr/local/hive/lib/commons-collections-3.2.1.jar:/usr/local/hive/lib/commons-dbcp-1.4.jar:/usr/local/hive/lib/commons-lang-2.4.jar:/usr/local/hive/lib/commons-logging-1.0.4.jar:/usr/local/hive/lib/commons-logging-api-1.0.4.jar:/usr/local/hive/lib/commons-pool-1.5.4.jar:/usr/local/hive/lib/datanucleus-connectionpool-2.0.3.jar:/usr/local/hive/lib/datanucleus-core-2.0.3.jar:/usr/local/hive/lib/datanucleus-enhancer-2.0.3.jar:/usr/local/hive/lib/datanucleus-rdbms-2.0.3.jar:/usr/local/hive/lib/derby-10.4.2.0.jar:/usr/local/hive/lib/guava-r06.jar:/usr/local/hive/lib/hbase-0.89.0-SNAPSHOT.jar:/usr/local/hive/lib/hbase-0.89.0-SNAPSHOT-tests.jar:/usr/local/hive/lib/hive-anttasks-0.8.1.jar:/usr/local/hive/lib/hive-builtins-0.8.1.jar:/usr/local/hive/lib/hive-cli-0.8.1.jar:/usr/local/hive/lib/hive-common-0.8.1.jar:/usr/local/hive/lib/hive-contrib-0.8.1.jar:/usr/local/hive/lib/hive_contrib.jar:/usr/local/hive/lib/hive-exec-0.8.1.jar:/usr/local/hive/lib/hive-hbase-handler-0.8.1.jar:/usr/local/hive/lib/hive-hwi-0.8.1.jar:/usr/local/hive/lib/hive-jdbc-0.8.1.jar:/usr/local/hive/lib/hive-metastore-0.8.1.jar:/usr/local/hive/lib/hive-pdk-0.8.1.jar:/usr/local/hive/lib/hive-serde-0.8.1.
jar:/usr/local/hive/lib/hive-service-0.8.1.jar:/usr/local/hive/lib/hive-shims-0.8.1.jar:/usr/local/hive/lib/javaewah-0.3.jar:/usr/local/hive/lib/jdo2-api-2.3-ec.jar:/usr/local/hive/lib/jline-0.9.94.jar:/usr/local/hive/lib/json-20090211.jar:/usr/local/hive/lib/junit-4.10.jar:/usr/local/hive/lib/libfb303-0.7.0.jar:/usr/local/hive/lib/libfb303.jar:/usr/local/hive/lib/libthrift-0.7.0.jar:/usr/local/hive/lib/libthrift.jar:/usr/local/hive/lib/log4j-1.2.15.jar:/usr/local/hive/lib/log4j-1.2.16.jar:/usr/local/hive/lib/mockito-all-1.8.2.jar:/usr/local/hive/lib/mysql-connector-java-5.1.9.jar:/usr/local/hive/lib/slf4j-api-1.6.1.jar:/usr/local/hive/lib/slf4j-log4j12-1.6.1.jar:/usr/local/hive/lib/stringtemplate-3.1-b1.jar:/usr/local/hive/lib/velocity-1.5.jar:/usr/local/hive/lib/zookeeper-3.3.1.jar
STARTUP_MSG:   build = https://git-wip-us.apache.org/repos/asf/hadoop.git -r 20fe5304904fc2f5a18053c389e43cd26f7a70fe; compiled by 'vinodkv' on 2017-06-02T06:14Z
STARTUP_MSG:   java = 1.8.0_144
************************************************************/
17/10/13 09:17:17 INFO namenode.NameNode: registered UNIX signal handlers for [TERM, HUP, INT]
17/10/13 09:17:17 INFO namenode.NameNode: createNameNode [-format]
17/10/13 09:17:18 WARN common.Util: Path /usr/local/hadoop/yarn_data/hdfs/namenode should be specified as a URI in configuration files. Please update hdfs configuration.
17/10/13 09:17:18 WARN common.Util: Path /usr/local/hadoop/yarn_data/hdfs/namenode should be specified as a URI in configuration files. Please update hdfs configuration.
Formatting using clusterid: CID-fc9f843f-928a-4c9c-bd10-e2dd4c567d79
17/10/13 09:17:18 INFO namenode.FSEditLog: Edit logging is async:false
17/10/13 09:17:18 INFO namenode.FSNamesystem: KeyProvider: null
17/10/13 09:17:18 INFO namenode.FSNamesystem: fsLock is fair: true
17/10/13 09:17:18 INFO namenode.FSNamesystem: Detailed lock hold time metrics enabled: false
17/10/13 09:17:18 INFO util.HostsFileReader: Adding a node "datanode-a" to the list of included hosts from /usr/local/hadoop/etc/hadoop/slaves
17/10/13 09:17:18 INFO util.HostsFileReader: Adding a node "datanode-b" to the list of included hosts from /usr/local/hadoop/etc/hadoop/slaves
17/10/13 09:17:18 INFO util.HostsFileReader: Adding a node "datanode-c" to the list of included hosts from /usr/local/hadoop/etc/hadoop/slaves
17/10/13 09:17:18 INFO blockmanagement.DatanodeManager: dfs.block.invalidate.limit=1000
17/10/13 09:17:18 INFO blockmanagement.DatanodeManager: dfs.namenode.datanode.registration.ip-hostname-check=true
17/10/13 09:17:18 INFO blockmanagement.BlockManager: dfs.namenode.startup.delay.block.deletion.sec is set to 000:00:00:00.000
17/10/13 09:17:18 INFO blockmanagement.BlockManager: The block deletion will start around 2017 Oct 13 09:17:18
17/10/13 09:17:18 INFO util.GSet: Computing capacity for map BlocksMap
17/10/13 09:17:18 INFO util.GSet: VM type       = 64-bit
17/10/13 09:17:18 INFO util.GSet: 2.0% max memory 889 MB = 17.8 MB
17/10/13 09:17:18 INFO util.GSet: capacity      = 2^21 = 2097152 entries
17/10/13 09:17:18 INFO blockmanagement.BlockManager: dfs.block.access.token.enable=false
17/10/13 09:17:18 INFO blockmanagement.BlockManager: defaultReplication         = 3
17/10/13 09:17:18 INFO blockmanagement.BlockManager: maxReplication             = 512
17/10/13 09:17:18 INFO blockmanagement.BlockManager: minReplication             = 1
17/10/13 09:17:18 INFO blockmanagement.BlockManager: maxReplicationStreams      = 2
17/10/13 09:17:18 INFO blockmanagement.BlockManager: replicationRecheckInterval = 3000
17/10/13 09:17:18 INFO blockmanagement.BlockManager: encryptDataTransfer        = false
17/10/13 09:17:18 INFO blockmanagement.BlockManager: maxNumBlocksToLog          = 1000
17/10/13 09:17:18 INFO namenode.FSNamesystem: fsOwner             = hadoop-user (auth:SIMPLE)
17/10/13 09:17:18 INFO namenode.FSNamesystem: supergroup          = supergroup
17/10/13 09:17:18 INFO namenode.FSNamesystem: isPermissionEnabled = true
17/10/13 09:17:18 INFO namenode.FSNamesystem: HA Enabled: false
17/10/13 09:17:18 INFO namenode.FSNamesystem: Append Enabled: true
17/10/13 09:17:18 INFO util.GSet: Computing capacity for map INodeMap
17/10/13 09:17:18 INFO util.GSet: VM type       = 64-bit
17/10/13 09:17:18 INFO util.GSet: 1.0% max memory 889 MB = 8.9 MB
17/10/13 09:17:18 INFO util.GSet: capacity      = 2^20 = 1048576 entries
17/10/13 09:17:18 INFO namenode.FSDirectory: ACLs enabled? false
17/10/13 09:17:18 INFO namenode.FSDirectory: XAttrs enabled? true
17/10/13 09:17:18 INFO namenode.NameNode: Caching file names occurring more than 10 times
17/10/13 09:17:18 INFO util.GSet: Computing capacity for map cachedBlocks
17/10/13 09:17:18 INFO util.GSet: VM type       = 64-bit
17/10/13 09:17:18 INFO util.GSet: 0.25% max memory 889 MB = 2.2 MB
17/10/13 09:17:18 INFO util.GSet: capacity      = 2^18 = 262144 entries
17/10/13 09:17:18 INFO namenode.FSNamesystem: dfs.namenode.safemode.threshold-pct = 0.9990000128746033
17/10/13 09:17:18 INFO namenode.FSNamesystem: dfs.namenode.safemode.min.datanodes = 0
17/10/13 09:17:18 INFO namenode.FSNamesystem: dfs.namenode.safemode.extension     = 30000
17/10/13 09:17:18 INFO metrics.TopMetrics: NNTop conf: dfs.namenode.top.window.num.buckets = 10
17/10/13 09:17:18 INFO metrics.TopMetrics: NNTop conf: dfs.namenode.top.num.users = 10
17/10/13 09:17:18 INFO metrics.TopMetrics: NNTop conf: dfs.namenode.top.windows.minutes = 1,5,25
17/10/13 09:17:18 INFO namenode.FSNamesystem: Retry cache on namenode is enabled
17/10/13 09:17:18 INFO namenode.FSNamesystem: Retry cache will use 0.03 of total heap and retry cache entry expiry time is 600000 millis
17/10/13 09:17:18 INFO util.GSet: Computing capacity for map NameNodeRetryCache
17/10/13 09:17:18 INFO util.GSet: VM type       = 64-bit
17/10/13 09:17:18 INFO util.GSet: 0.029999999329447746% max memory 889 MB = 273.1 KB
17/10/13 09:17:18 INFO util.GSet: capacity      = 2^15 = 32768 entries
17/10/13 09:17:21 INFO namenode.FSImage: Allocated new BlockPoolId: BP-1747180547-192.168.10.101-1507853841326
17/10/13 09:17:21 INFO common.Storage: Storage directory /usr/local/hadoop-2.8.1/yarn_data/hdfs/namenode has been successfully formatted.
17/10/13 09:17:21 INFO namenode.FSImageFormatProtobuf: Saving image file /usr/local/hadoop-2.8.1/yarn_data/hdfs/namenode/current/fsimage.ckpt_0000000000000000000 using no compression
17/10/13 09:17:21 INFO namenode.FSImageFormatProtobuf: Image file /usr/local/hadoop-2.8.1/yarn_data/hdfs/namenode/current/fsimage.ckpt_0000000000000000000 of size 328 bytes saved in 0 seconds.
17/10/13 09:17:21 INFO namenode.NNStorageRetentionManager: Going to retain 1 images with txid >= 0
17/10/13 09:17:21 INFO util.ExitUtil: Exiting with status 0
17/10/13 09:17:21 INFO namenode.NameNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at master/192.168.10.101
************************************************************/
 
cs

* Because the name node stores the metadata, format hdfs with caution once data has been stored.


27. Starting the hadoop services

27-1. Startup method 1

1
2
3
$ /usr/local/hadoop/sbin/start-dfs.sh
$ /usr/local/hadoop/sbin/start-yarn.sh
 
cs


27-2. Startup method 2

1
2
$ /usr/local/hadoop/sbin/start-all.sh
 
cs

* This runs the two commands above in one go, but it is not the recommended method.
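
For reference, the matching shutdown scripts stop the daemons in the same way:

1
2
3
$ /usr/local/hadoop/sbin/stop-dfs.sh
$ /usr/local/hadoop/sbin/stop-yarn.sh
 
cs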


27-3. Output when starting the services

27-3-1. start-dfs.sh

1
2
3
4
5
6
7
8
9
10
11
12
13
14
15
16
17
$ /usr/local/hadoop/sbin/start-dfs.sh
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/local/hadoop-2.8.1/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Starting namenodes on [master]
master: starting namenode, logging to /usr/local/hadoop-2.8.1/logs/hadoop-hadoop-user-namenode-master.out
datanode-b: starting datanode, logging to /usr/local/hadoop-2.8.1/logs/hadoop-hadoop-user-datanode-datanode-b.out
datanode-a: starting datanode, logging to /usr/local/hadoop-2.8.1/logs/hadoop-hadoop-user-datanode-datanode-a.out
datanode-c: starting datanode, logging to /usr/local/hadoop-2.8.1/logs/hadoop-hadoop-user-datanode-datanode-c.out
Starting secondary namenodes [secondary]
secondary: starting secondarynamenode, logging to /usr/local/hadoop-2.8.1/logs/hadoop-hadoop-user-secondarynamenode-secondary.out
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/local/hadoop-2.8.1/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
 
cs


27-3-2. start-yarn.sh

1
2
$ /usr/local/hadoop/sbin/start-yarn.sh
 
cs


28. Checking that the services are running

28-1. jps or jps -m

1
2
3
4
5
6
$ jps
23185 ResourceManager
5508 NodeManager
22861 NameNode
23485 Jps
 
cs

* Occasionally the NodeManager and ResourceManager do not come up; in that case start the services individually as shown below.

1
2
3
$ /usr/local/hadoop/sbin/yarn-daemon.sh start nodemanager
$ /usr/local/hadoop/sbin/yarn-daemon.sh start resourcemanager
 
cs



29-1. Verifying the services on each node

29-1-1. name node

1
2
3
4
5
6
$ jps
23185 ResourceManager
5508 NodeManager
23549 Jps
22861 NameNode
 
cs


29-1-2. secondary node

1
2
3
4
$ jps
13976 Jps
13547 SecondaryNameNode
 
cs


29-1-3. datanode-a

1
2
3
4
5
$ jps
2050 Jps
1847 NodeManager
1722 DataNode
 
cs


29-1-4. datanode-b

1
2
3
4
5
$ jps
2038 Jps
1835 NodeManager
1711 DataNode
 
cs


29-1-5. datanode-c

1
2
3
4
5
$ jps
1732 DataNode
1897 NodeManager
2061 Jps
 
cs

* Check that the corresponding services are running without problems on each node.
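
From the name node you can also confirm that all three data nodes have registered with HDFS; the report lists the live datanodes and their capacity:

1
2
$ hdfs dfsadmin -report
 
cs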


29-2. Monitoring via the Web UI

29-2-1. namenode Web UI

* Access the address that was configured for dfs.http.address in 25-2-3.


29-2-2. resource manager Web UI

* The Web UI for monitoring the resource manager service is reached on port 8088.
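
With the hostnames used in this guide, the two Web UIs would be reached at addresses like the following (assuming the master hostname resolves from the machine running the browser):

1
2
3
http://master:50070     # namenode Web UI (dfs.http.address)
http://master:8088      # resource manager Web UI
 
cs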




1. Installing vsftpd

1
2
3
4
5
6
7
8
9
10
11
12
13
14
15
16
17
18
19
20
21
22
23
24
25
26
27
28
29
30
31
32
33
34
35
36
37
38
39
$ sudo apt-get install vsftpd
sudo: unable to resolve host ubuntu
Reading package lists... Done
Building dependency tree
Reading state information... Done
The following additional packages will be installed:
  ssl-cert
Suggested packages:
  openssl-blacklist
The following NEW packages will be installed:
  ssl-cert vsftpd
0 upgraded, 2 newly installed, 0 to remove and 3 not upgraded.
Need to get 132 kB of archives.
After this operation, 398 kB of additional disk space will be used.
Do you want to continue? [Y/n] Y
Get:1 http://us.archive.ubuntu.com/ubuntu xenial/main amd64 ssl-cert all 1.0.37 [16.9 kB]
Get:2 http://us.archive.ubuntu.com/ubuntu xenial/main amd64 vsftpd amd64 3.0.3-3ubuntu2 [115 kB]
Fetched 132 kB in 6s (21.0 kB/s)
Preconfiguring packages ...
Selecting previously unselected package ssl-cert.
(Reading database ... 60283 files and directories currently installed.)
Preparing to unpack .../ssl-cert_1.0.37_all.deb ...
Unpacking ssl-cert (1.0.37) ...
Selecting previously unselected package vsftpd.
Preparing to unpack .../vsftpd_3.0.3-3ubuntu2_amd64.deb ...
Unpacking vsftpd (3.0.3-3ubuntu2) ...
Processing triggers for man-db (2.7.5-1) ...
Processing triggers for systemd (229-4ubuntu19) ...
Processing triggers for ureadahead (0.100.0-19) ...
Setting up ssl-cert (1.0.37) ...
hostname: Name or service not known
make-ssl-cert: Could not get FQDN, using "ubuntu".
make-ssl-cert: You may want to fix your /etc/hosts and/or DNS setup and run
make-ssl-cert: make-ssl-cert generate-default-snakeoil --force-overwrite
make-ssl-cert: again.
Setting up vsftpd (3.0.3-3ubuntu2) ...
Processing triggers for systemd (229-4ubuntu19) ...
Processing triggers for ureadahead (0.100.0-19) ...
 
cs

* During installation you are asked whether to continue; press Y to proceed.
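
The prompt can also be skipped by passing -y to apt-get:

1
2
$ sudo apt-get install -y vsftpd
 
cs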



2. Editing the vsftpd.conf file

1
2
$ sudo vi /etc/vsftpd.conf
 
cs



3. Modifying vsftpd.conf

1
2
3
4
5
6
7
8
9
10
11
12
13
14
15
# Example config file /etc/vsftpd.conf
 
...
 
# Allow anonymous FTP? (Disabled by default).
anonymous_enable=NO
 
# Uncomment this to enable any form of FTP write command.
write_enable=YES
 
# You may override where the log file goes if you like. The default is shown
# below.
xferlog_file=/var/log/vsftpd.log
 
 
cs

* Uncomment the lines shown above and save.

* Comment or uncomment each option as needed.

* anonymous_enable : whether anonymous users may log in

* write_enable : whether FTP write commands are allowed

* xferlog_file : the default log file for vsftpd
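
At any point, the current state of the service can be checked with systemctl as a quick sanity check:

1
2
$ sudo systemctl status vsftpd
 
cs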



4. Restarting the ftp server after modifying vsftpd.conf

1
2
3
4
$ sudo /etc/init.d/vsftpd restart
sudo: unable to resolve host ubuntu
[ ok ] Restarting vsftpd (via systemctl): vsftpd.service.
 
cs



5. Verifying the connection with an ftp client

* The ftp client used above is FileZilla.

* Any other ftp client may be used for this check.
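
For a quick check without a GUI client, the command-line ftp client can also be used; the address below is a placeholder for your own server:

1
2
$ ftp <server-address>
 
cs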





Ubuntu 16.04.3 LTS (GNU/Linux 4.4.0-87-generic x86_64)


1
2
3
4
5
$ sudo passwd root
Enter new UNIX password:
Retype new UNIX password:
passwd: password updated successfully
 
cs

* "Enter new UNIX password" 부분에 새롭게 설정할 비밀번호를 입력합니다.

* "Retype new UNIX password" 부분에 새롭게 설정한 비밀번호를 한 번 더 입력합니다.



