Testing Hadoop 2.7.1
Published: 2019-06-26


Three CentOS 7 machines (named master-CentOS7, slave1-CentOS7 and slave2-CentOS7), each with 2 GB of RAM (I had little choice; I only just swapped in new memory sticks).

The cluster setup itself was covered in an earlier post.

Counting words with wordcount

On master-CentOS7 (with the Hadoop cluster already running):

[root@master ~]# cd /usr/local/hadoop/
[root@master hadoop]# bin/hdfs dfs -mkdir /test001
[root@master hadoop]# bin/hdfs dfs -ls /
Found 3 items
drwxr-xr-x   - root supergroup          0 2016-09-01 19:41 /test001
drwx------   - root supergroup          0 2016-08-29 20:26 /tmp
drwxr-xr-x   - root supergroup          0 2016-08-29 20:26 /user
[root@master hadoop]# ls
bin  etc      lib      LICENSE.txt  NOTICE.txt  sbin   tmp
dfs  include  libexec  logs         README.txt  share
[root@master hadoop]# wc -l LICENSE.txt
289 LICENSE.txt
[root@master hadoop]# du -sh !$
du -sh LICENSE.txt
16K     LICENSE.txt
[root@master hadoop]# bin/hdfs dfs -copyFromLocal ./LICENSE.txt /test001
[root@master hadoop]# bin/hdfs dfs -ls /test001
Found 1 items
-rw-r--r--   2 root supergroup      15429 2016-09-01 19:46 /test001/LICENSE.txt
[root@master hadoop]# bin/hadoop jar ./share/hadoop/mapreduce/hadoop-mapreduce-examples-2.7.1.jar wordcount /test001/LICENSE.txt /test001/
[root@master hadoop]# echo $?
255

The non-zero exit status (255) shows the command failed:

org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory hdfs://192.168.1.182:9000/test001 already exists
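MapReduce's FileOutputFormat refuses to write into an output directory that already exists, so a job cannot silently overwrite earlier results. Here the output path was given as /test001/, which was created above. Either remove the old output first (for example with bin/hdfs dfs -rm -r) or, as below, point the job at a subdirectory that does not exist yet.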

So change the command to write to a new output directory:

[root@master hadoop]# bin/hadoop jar ./share/hadoop/mapreduce/hadoop-mapreduce-examples-2.7.1.jar wordcount /test001/LICENSE.txt /test001/wordcount
16/09/02 17:09:35 INFO client.RMProxy: Connecting to ResourceManager at /192.168.1.182:8032
16/09/02 17:09:36 INFO input.FileInputFormat: Total input paths to process : 1
16/09/02 17:09:36 INFO mapreduce.JobSubmitter: number of splits:1
16/09/02 17:09:37 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1472804584592_0003
16/09/02 17:09:37 INFO impl.YarnClientImpl: Submitted application application_1472804584592_0003
16/09/02 17:09:37 INFO mapreduce.Job: The url to track the job: http://master:8088/proxy/application_1472804584592_0003/
16/09/02 17:09:37 INFO mapreduce.Job: Running job: job_1472804584592_0003
16/09/02 17:09:46 INFO mapreduce.Job: Job job_1472804584592_0003 running in uber mode : false
16/09/02 17:09:46 INFO mapreduce.Job:  map 0% reduce 0%
16/09/02 17:09:55 INFO mapreduce.Job:  map 100% reduce 0%
16/09/02 17:10:04 INFO mapreduce.Job:  map 100% reduce 100%
16/09/02 17:10:05 INFO mapreduce.Job: Job job_1472804584592_0003 completed successfully
16/09/02 17:10:05 INFO mapreduce.Job: Counters: 49
        File System Counters
                FILE: Number of bytes read=10992
                FILE: Number of bytes written=252973
                FILE: Number of read operations=0
                FILE: Number of large read operations=0
                FILE: Number of write operations=0
                HDFS: Number of bytes read=15539
                HDFS: Number of bytes written=8006
                HDFS: Number of read operations=6
                HDFS: Number of large read operations=0
                HDFS: Number of write operations=2
        Job Counters
                Launched map tasks=1
                Launched reduce tasks=1
                Data-local map tasks=1
                Total time spent by all maps in occupied slots (ms)=6493
                Total time spent by all reduces in occupied slots (ms)=6714
                Total time spent by all map tasks (ms)=6493
                Total time spent by all reduce tasks (ms)=6714
                Total vcore-seconds taken by all map tasks=6493
                Total vcore-seconds taken by all reduce tasks=6714
                Total megabyte-seconds taken by all map tasks=6648832
                Total megabyte-seconds taken by all reduce tasks=6875136
        Map-Reduce Framework
                Map input records=289
                Map output records=2157
                Map output bytes=22735
                Map output materialized bytes=10992
                Input split bytes=110
                Combine input records=2157
                Combine output records=755
                Reduce input groups=755
                Reduce shuffle bytes=10992
                Reduce input records=755
                Reduce output records=755
                Spilled Records=1510
                Shuffled Maps =1
                Failed Shuffles=0
                Merged Map outputs=1
                GC time elapsed (ms)=146
                CPU time spent (ms)=2360
                Physical memory (bytes) snapshot=312647680
                Virtual memory (bytes) snapshot=1717682176
                Total committed heap usage (bytes)=163123200
        Shuffle Errors
                BAD_ID=0
                CONNECTION=0
                IO_ERROR=0
                WRONG_LENGTH=0
                WRONG_MAP=0
                WRONG_REDUCE=0
        File Input Format Counters
                Bytes Read=15429
        File Output Format Counters
                Bytes Written=8006
[root@master hadoop]# echo $?
0
[root@master hadoop]# bin/hdfs dfs -ls /test001/
Found 2 items
-rw-r--r--   2 root supergroup      15429 2016-09-01 19:46 /test001/LICENSE.txt
drwxr-xr-x   - root supergroup          0 2016-09-02 17:10 /test001/wordcount
[root@master hadoop]# bin/hdfs dfs -ls /test001/wordcount
Found 2 items
-rw-r--r--   2 root supergroup          0 2016-09-02 17:10 /test001/wordcount/_SUCCESS
-rw-r--r--   2 root supergroup       8006 2016-09-02 17:10 /test001/wordcount/part-r-00000
[root@master hadoop]# bin/hdfs dfs -cat /test001/wordcount/part-r-00000
"AS     4
"Contribution"  1
"Contributor"   1
"Derivative     1
"Legal  1
"License"       1
"License");     1
"Licensor"      1
"NOTICE"        1
"Not    1
"Object"        1
"Source"        1
"Work"  1
"You"   1
"Your") 1
"[]"    1
"control"       1
"printed        1
"submitted"     1
(the listing continues in the same word/count format for the remaining entries, 755 lines in total)
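What the example jar runs here is the classic WordCount job: a mapper splits each input line into tokens and emits (token, 1), and a combiner/reducer sums the counts, producing one "word<TAB>count" line per distinct token. The sketch below follows the standard Hadoop MapReduce tutorial rather than the exact source shipped inside hadoop-mapreduce-examples-2.7.1.jar, so treat the class names (WordCountSketch, TokenizerMapper, IntSumReducer) as illustrative.

// Minimal WordCount sketch for Hadoop 2.x (assumed equivalent to the bundled example).
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCountSketch {

    public static class TokenizerMapper
            extends Mapper<Object, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        public void map(Object key, Text value, Context context)
                throws IOException, InterruptedException {
            // Split each line on whitespace and emit (token, 1).
            StringTokenizer itr = new StringTokenizer(value.toString());
            while (itr.hasMoreTokens()) {
                word.set(itr.nextToken());
                context.write(word, ONE);
            }
        }
    }

    public static class IntSumReducer
            extends Reducer<Text, IntWritable, Text, IntWritable> {
        private final IntWritable result = new IntWritable();

        @Override
        public void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            // Sum all the 1s emitted for this word.
            int sum = 0;
            for (IntWritable val : values) {
                sum += val.get();
            }
            result.set(sum);
            context.write(key, result);   // one "word<TAB>count" line in part-r-00000
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job job = Job.getInstance(conf, "word count");
        job.setJarByClass(WordCountSketch.class);
        job.setMapperClass(TokenizerMapper.class);
        job.setCombinerClass(IntSumReducer.class);   // pre-aggregate on the map side
        job.setReducerClass(IntSumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));    // e.g. /test001/LICENSE.txt
        FileOutputFormat.setOutputPath(job, new Path(args[1]));  // must not exist yet
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}

The counters above line up with this picture: the 289 input lines produced Map output records=2157 tokens, the combiner collapsed them to Combine output records=755 distinct words, and the reducer wrote those 755 lines into part-r-00000.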

Running the PI example

On master-CentOS7 (with the Hadoop cluster already running):

[root@master ~]# cd /usr/local/hadoop/
[root@master hadoop]# bin/hadoop jar ./share/hadoop/mapreduce/hadoop-mapreduce-examples-2.7.1.jar pi 100 100
Number of Maps  = 100
Samples per Map = 100
Wrote input for Map #0
Wrote input for Map #1
Wrote input for Map #2
(the same "Wrote input for Map #N" line repeats through Map #99)
Starting Job
16/09/02 16:40:43 INFO client.RMProxy: Connecting to ResourceManager at /192.168.1.182:8032
16/09/02 16:40:44 INFO input.FileInputFormat: Total input paths to process : 100
16/09/02 16:40:45 INFO mapreduce.JobSubmitter: number of splits:100
16/09/02 16:40:45 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1472804584592_0002
16/09/02 16:40:45 INFO impl.YarnClientImpl: Submitted application application_1472804584592_0002
16/09/02 16:40:46 INFO mapreduce.Job: The url to track the job: http://master:8088/proxy/application_1472804584592_0002/
16/09/02 16:40:46 INFO mapreduce.Job: Running job: job_1472804584592_0002
16/09/02 16:40:55 INFO mapreduce.Job: Job job_1472804584592_0002 running in uber mode : false
16/09/02 16:40:55 INFO mapreduce.Job:  map 0% reduce 0%
16/09/02 16:41:16 INFO mapreduce.Job:  map 2% reduce 0%
(progress lines omitted; the map phase climbs steadily and the reduce phase starts copying at around 53% map)
16/09/02 16:53:02 INFO mapreduce.Job:  map 100% reduce 33%
16/09/02 16:53:03 INFO mapreduce.Job:  map 100% reduce 100%
16/09/02 16:53:04 INFO mapreduce.Job: Job job_1472804584592_0002 completed successfully
16/09/02 16:53:04 INFO mapreduce.Job: Counters: 49
        File System Counters
                FILE: Number of bytes read=2206
                FILE: Number of bytes written=11703871
                FILE: Number of read operations=0
                FILE: Number of large read operations=0
                FILE: Number of write operations=0
                HDFS: Number of bytes read=26890
                HDFS: Number of bytes written=215
                HDFS: Number of read operations=403
                HDFS: Number of large read operations=0
                HDFS: Number of write operations=3
        Job Counters
                Launched map tasks=100
                Launched reduce tasks=1
                Data-local map tasks=100
                Total time spent by all maps in occupied slots (ms)=921440
                Total time spent by all reduces in occupied slots (ms)=376555
                Total time spent by all map tasks (ms)=921440
                Total time spent by all reduce tasks (ms)=376555
                Total vcore-seconds taken by all map tasks=921440
                Total vcore-seconds taken by all reduce tasks=376555
                Total megabyte-seconds taken by all map tasks=943554560
                Total megabyte-seconds taken by all reduce tasks=385592320
        Map-Reduce Framework
                Map input records=100
                Map output records=200
                Map output bytes=1800
                Map output materialized bytes=2800
                Input split bytes=15090
                Combine input records=0
                Combine output records=0
                Reduce input groups=2
                Reduce shuffle bytes=2800
                Reduce input records=200
                Reduce output records=0
                Spilled Records=400
                Shuffled Maps =100
                Failed Shuffles=0
                Merged Map outputs=100
                GC time elapsed (ms)=12309
                CPU time spent (ms)=75150
                Physical memory (bytes) snapshot=20894449664
                Virtual memory (bytes) snapshot=86424981504
                Total committed heap usage (bytes)=13431619584
        Shuffle Errors
                BAD_ID=0
                CONNECTION=0
                IO_ERROR=0
                WRONG_LENGTH=0
                WRONG_MAP=0
                WRONG_REDUCE=0
        File Input Format Counters
                Bytes Read=11800
        File Output Format Counters
                Bytes Written=97
Job Finished in 741.887 seconds
Estimated value of Pi is 3.14080000000000000000
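The estimate of 3.1408 comes from sampling points in the unit square and counting how many fall inside the inscribed quarter circle; with 100 maps x 100 samples the job uses only 10,000 points, which is why the result is good to only about two decimal places. The bundled example distributes the sampling across the 100 map tasks and, as far as I know, uses a quasi-random (Halton) sequence rather than plain random points; the single-process sketch below (class name PiEstimateSketch and the fixed seed are my own choices) only illustrates the arithmetic.

// Single-process illustration of the pi estimation idea, not the distributed example itself.
import java.util.Random;

public class PiEstimateSketch {
    public static void main(String[] args) {
        int maps = 100;            // corresponds to the first argument of "pi 100 100"
        int samplesPerMap = 100;   // corresponds to the second argument
        long total = (long) maps * samplesPerMap;
        long inside = 0;

        Random rnd = new Random(42);
        for (long i = 0; i < total; i++) {
            // Draw a point in the unit square and test whether it lies
            // inside the quarter circle of radius 1.
            double x = rnd.nextDouble();
            double y = rnd.nextDouble();
            if (x * x + y * y <= 1.0) {
                inside++;
            }
        }

        // Quarter-circle area / square area = pi/4, so pi ~ 4 * (inside / total).
        double pi = 4.0 * inside / total;
        System.out.printf("Estimated value of Pi with %d samples: %.8f%n", total, pi);
    }
}

Increasing either argument tightens the estimate, but every extra map task also has to be scheduled by YARN, which is why this run took 741.887 seconds on 2 GB nodes even though the actual computation is tiny.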

  • If you get the error copyFromLocal: Cannot create directory /123/. Name node is in safe mode.

    This is because the NameNode is in safe mode (stopping the Hadoop cluster and then starting it again can also put it into safe mode; in that case it is worth turning safe mode off manually).

    • Fix:
      cd /usr/local/hadoop
      bin/hdfs dfsadmin -safemode leave
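    To check first whether safe mode really is the cause, bin/hdfs dfsadmin -safemode get prints the current state. After a normal restart the NameNode also leaves safe mode on its own once enough DataNode block reports have come in, so simply waiting a short while is often enough.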

Reposted from: https://www.cnblogs.com/Genesis2018/p/8304728.html
