<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	>
<channel>
	<title>Comments on "Hadoop-2.2.0 Cluster Installation and Configuration in Practice"</title>
	<atom:link href="http://shiyanjun.cn/archives/561.html/feed" rel="self" type="application/rss+xml" />
	<link>http://shiyanjun.cn/archives/561.html</link>
	<description>The beauty of simplicity: simplicity is rare, so enjoy its simple elegance.</description>
	<lastBuildDate>Wed, 19 Feb 2025 08:08:30 +0000</lastBuildDate>
		<sy:updatePeriod>hourly</sy:updatePeriod>
		<sy:updateFrequency>1</sy:updateFrequency>
	<generator>http://wordpress.org/?v=3.9.2</generator>
	<item>
		<title>By: yayun</title>
		<link>http://shiyanjun.cn/archives/561.html#comment-57452</link>
		<dc:creator><![CDATA[yayun]]></dc:creator>
		<pubDate>Wed, 25 May 2016 12:27:50 +0000</pubDate>
		<guid isPermaLink="false">http://shiyanjun.cn/?p=561#comment-57452</guid>
		<description><![CDATA[Has this problem been solved?]]></description>
		<content:encoded><![CDATA[<p>Has this problem been solved?</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: cinty</title>
		<link>http://shiyanjun.cn/archives/561.html#comment-56474</link>
		<dc:creator><![CDATA[cinty]]></dc:creator>
		<pubDate>Mon, 16 Nov 2015 07:31:17 +0000</pubDate>
		<guid isPermaLink="false">http://shiyanjun.cn/?p=561#comment-56474</guid>
		<description><![CDATA[Thanks, with your configuration my Hive now runs correctly.]]></description>
		<content:encoded><![CDATA[<p>Thanks, with your configuration my Hive now runs correctly.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: roger.han</title>
		<link>http://shiyanjun.cn/archives/561.html#comment-22740</link>
		<dc:creator><![CDATA[roger.han]]></dc:creator>
		<pubDate>Fri, 22 Aug 2014 11:32:36 +0000</pubDate>
		<guid isPermaLink="false">http://shiyanjun.cn/?p=561#comment-22740</guid>
		<description><![CDATA[I ran into this problem too.]]></description>
		<content:encoded><![CDATA[<p>I ran into this problem too.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: kimy</title>
		<link>http://shiyanjun.cn/archives/561.html#comment-19023</link>
		<dc:creator><![CDATA[kimy]]></dc:creator>
		<pubDate>Sun, 20 Jul 2014 05:02:33 +0000</pubDate>
		<guid isPermaLink="false">http://shiyanjun.cn/?p=561#comment-19023</guid>
		<description><![CDATA[There is one error: core-site.xml does not have a dfs.replication property; it belongs in hdfs-site.xml. See http://hadoop.apache.org/docs/r2.2.0/hadoop-project-dist/hadoop-hdfs/hdfs-default.xml for details. Thanks for your article, which solved a small problem of mine!]]></description>
		<content:encoded><![CDATA[<p>There is one error: core-site.xml does not have a dfs.replication property; it belongs in hdfs-site.xml. See http://hadoop.apache.org/docs/r2.2.0/hadoop-project-dist/hadoop-hdfs/hdfs-default.xml for details. Thanks for your article, which solved a small problem of mine!</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: Yanjun</title>
		<link>http://shiyanjun.cn/archives/561.html#comment-19004</link>
		<dc:creator><![CDATA[Yanjun]]></dc:creator>
		<pubDate>Sat, 05 Jul 2014 03:14:08 +0000</pubDate>
		<guid isPermaLink="false">http://shiyanjun.cn/?p=561#comment-19004</guid>
		<description><![CDATA[That is fine: when the configuration is loaded and parsed, the two configuration files are effectively merged.]]></description>
		<content:encoded><![CDATA[<p>That is fine: when the configuration is loaded and parsed, the two configuration files are effectively merged.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: acciidec</title>
		<link>http://shiyanjun.cn/archives/561.html#comment-19003</link>
		<dc:creator><![CDATA[acciidec]]></dc:creator>
		<pubDate>Fri, 04 Jul 2014 12:37:10 +0000</pubDate>
		<guid isPermaLink="false">http://shiyanjun.cn/?p=561#comment-19003</guid>
		<description><![CDATA[
                dfs.replication
                3
 
Shouldn't this be configured in hdfs-site.xml? Is it also fine to put it in core-site.xml?]]></description>
		<content:encoded><![CDATA[<p>                dfs.replication<br />
                3</p>
<p>Shouldn't this be configured in hdfs-site.xml? Is it also fine to put it in core-site.xml?</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: Yanjun</title>
		<link>http://shiyanjun.cn/archives/561.html#comment-19002</link>
		<dc:creator><![CDATA[Yanjun]]></dc:creator>
		<pubDate>Wed, 02 Jul 2014 15:27:19 +0000</pubDate>
		<guid isPermaLink="false">http://shiyanjun.cn/?p=561#comment-19002</guid>
		<description><![CDATA[This bit of information really isn't enough to diagnose the problem. It is usually a configuration issue; check the RM or NM logs to see exactly where things went wrong.]]></description>
		<content:encoded><![CDATA[<p>This bit of information really isn't enough to diagnose the problem. It is usually a configuration issue; check the RM or NM logs to see exactly where things went wrong.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: w_aryan</title>
		<link>http://shiyanjun.cn/archives/561.html#comment-19000</link>
		<dc:creator><![CDATA[w_aryan]]></dc:creator>
		<pubDate>Tue, 01 Jul 2014 12:41:07 +0000</pubDate>
		<guid isPermaLink="false">http://shiyanjun.cn/?p=561#comment-19000</guid>
		<description><![CDATA[When running this, the exception below was thrown. Have you run into it?
hadoop jar /home/hadoop/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.2.0.jar wordcount /data/wordcount /output/wordcount

14/07/01 20:19:11 INFO mapreduce.Job: Job job_1404273935577_0002 failed with state FAILED due to: Application application_1404273935577_0002 failed 2 times due to AM Container for appattempt_1404273935577_0002_000002 exited with  exitCode: 1 due to: Exception from container-launch:
org.apache.hadoop.util.Shell$ExitCodeException:
        at org.apache.hadoop.util.Shell.runCommand(Shell.java:464)
        at org.apache.hadoop.util.Shell.run(Shell.java:379)
        at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:589)
        at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:195)
        at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:283)
        at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:79)
        at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
        at java.util.concurrent.FutureTask.run(FutureTask.java:166)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:722)]]></description>
		<content:encoded><![CDATA[<p>When running this, the exception below was thrown. Have you run into it?<br />
hadoop jar /home/hadoop/hadoop-2.2.0/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.2.0.jar wordcount /data/wordcount /output/wordcount</p>
<p>14/07/01 20:19:11 INFO mapreduce.Job: Job job_1404273935577_0002 failed with state FAILED due to: Application application_1404273935577_0002 failed 2 times due to AM Container for appattempt_1404273935577_0002_000002 exited with  exitCode: 1 due to: Exception from container-launch:<br />
org.apache.hadoop.util.Shell$ExitCodeException:<br />
        at org.apache.hadoop.util.Shell.runCommand(Shell.java:464)<br />
        at org.apache.hadoop.util.Shell.run(Shell.java:379)<br />
        at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:589)<br />
        at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:195)<br />
        at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:283)<br />
        at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:79)<br />
        at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)<br />
        at java.util.concurrent.FutureTask.run(FutureTask.java:166)<br />
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)<br />
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)<br />
        at java.lang.Thread.run(Thread.java:722)</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: Yanjun</title>
		<link>http://shiyanjun.cn/archives/561.html#comment-18890</link>
		<dc:creator><![CDATA[Yanjun]]></dc:creator>
		<pubDate>Tue, 03 Jun 2014 15:33:37 +0000</pubDate>
		<guid isPermaLink="false">http://shiyanjun.cn/?p=561#comment-18890</guid>
		<description><![CDATA[This message does not affect any Hadoop functionality. The Hadoop native libraries are a performance optimization that reimplements some Hadoop components directly in C; consider using them only if you need that, i.e. Hadoop is not reaching the performance you expect. For a detailed explanation, see: http://hadoop.apache.org/docs/r2.2.0/hadoop-project-dist/hadoop-common/NativeLibraries.html.]]></description>
		<content:encoded><![CDATA[<p>This message does not affect any Hadoop functionality. The Hadoop native libraries are a performance optimization that reimplements some Hadoop components directly in C; consider using them only if you need that, i.e. Hadoop is not reaching the performance you expect. For a detailed explanation, see: http://hadoop.apache.org/docs/r2.2.0/hadoop-project-dist/hadoop-common/NativeLibraries.html.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: seaer</title>
		<link>http://shiyanjun.cn/archives/561.html#comment-18795</link>
		<dc:creator><![CDATA[seaer]]></dc:creator>
		<pubDate>Tue, 13 May 2014 03:09:31 +0000</pubDate>
		<guid isPermaLink="false">http://shiyanjun.cn/?p=561#comment-18795</guid>
		<description><![CDATA[So this message does not affect Hadoop's operation?]]></description>
		<content:encoded><![CDATA[<p>So this message does not affect Hadoop's operation?</p>
]]></content:encoded>
	</item>
</channel>
</rss>
