Installing Hadoop and Running the Bundled Sample Programs
1. Runtime environment: Hyper-V, hosting 1 MasterNode and 4 SlaveNodes with IP addresses 192.168.1.200 through 192.168.1.204.
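Hadoop resolves cluster members by hostname, so a minimal sketch of what /etc/hosts on every node might look like for this layout (the hostnames are assumptions; the post only gives the IP addresses):

    # /etc/hosts on every node (hostnames are assumed, not from the post)
    192.168.1.200  master
    192.168.1.201  slave1
    192.168.1.202  slave2
    192.168.1.203  slave3
    192.168.1.204  slave4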
2. Operating system: Red Hat Enterprise Linux 6.2, Basic Server installation.
SSH is installed by default. For the Java environment, install a fresh Sun JDK, edit the /etc/profile file to add the lines below, and set up passwordless SSH login (search online for the details).
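A typical set of /etc/profile additions for a Sun JDK RPM install might look like the following; the JDK path is an assumption, not the post's original lines:

    # appended to /etc/profile (assumed JDK install path)
    export JAVA_HOME=/usr/java/default
    export PATH=$JAVA_HOME/bin:$PATH
    export CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar

And one common way to enable passwordless SSH from the master to every slave:

    # generate a key pair without a passphrase, then push the public key to each node
    ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa
    ssh-copy-id hadoop@192.168.1.201   # repeat for each SlaveNode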
3. Install Hadoop: create a new user hadoop, set its home directory to /home/hadoop, and grant it sudo privileges.
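A minimal sketch of the user setup, assuming the stock RHEL tools:

    # create the user with /home/hadoop as its home directory
    useradd -m -d /home/hadoop hadoop
    passwd hadoop
    # grant sudo by adding a line through visudo:
    #   hadoop  ALL=(ALL)  ALL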
4. Format HDFS (hadoop namenode -format).
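On the Hadoop 1.x line this post appears to use, formatting and starting the cluster looks like the following; run it as the hadoop user from the Hadoop install directory:

    # format the NameNode (destroys any existing HDFS metadata)
    bin/hadoop namenode -format
    # start the HDFS and MapReduce daemons across the cluster
    bin/start-all.sh
    # verify: jps should show NameNode/JobTracker on the master,
    # DataNode/TaskTracker on the slaves
    jps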
5. Run a sample job.
1) Create an input directory under /home/hadoop, then add two text files, f1 and f2.
- Create the directories /tmp and /tmp/input on HDFS (the full command sequence is sketched after this list).
2) Upload the files to HDFS: ./hadoop fs -put /home/hadoop/input/* /tmp/input
- Run the job; the results were as follows:
(screenshots of the execution output)
Finally, clean up the output directory.
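A sketch of the full sequence, run from the Hadoop bin directory and assuming a Hadoop 1.x layout and that the sample is the bundled wordcount example (the post does not name the job, so the jar name and the job are assumptions):

    # create the target directories on HDFS
    ./hadoop fs -mkdir /tmp
    ./hadoop fs -mkdir /tmp/input
    # upload the two local files
    ./hadoop fs -put /home/hadoop/input/* /tmp/input
    # run a bundled example (wordcount is assumed; the jar name varies by release)
    ./hadoop jar ../hadoop-examples-*.jar wordcount /tmp/input /tmp/output
    # inspect the result
    ./hadoop fs -cat /tmp/output/*
    # finally, clean up the output directory
    ./hadoop fs -rmr /tmp/output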