<>Environment Setup
* Place the Hadoop dependency package in a directory whose path contains no Chinese characters and no spaces.
* Configure the environment variables: under system variables, create a new variable HADOOP_HOME pointing at the dependency package's directory, then append %HADOOP_HOME%\bin to Path.
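Once the variables are set, the JVM the client runs in should inherit them. A minimal sanity check using only the JDK (the `winutils.exe` path is an assumption — it is the Windows helper binary typically shipped in the dependency package's `bin` directory; adjust if yours differs):

```java
public class HadoopHomeCheck {
    public static void main(String[] args) {
        // Read the variable configured in the system environment
        String home = System.getenv("HADOOP_HOME");
        if (home == null) {
            System.out.println("HADOOP_HOME is not set");
        } else {
            System.out.println("HADOOP_HOME = " + home);
            // On Windows, the Hadoop client looks for native helpers here
            System.out.println("winutils expected at: " + home + "\\bin\\winutils.exe");
        }
    }
}
```

Remember to restart IDEA (or any open terminal) after changing environment variables, or the new values will not be visible to the process.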
<>Create a Maven Project in IDEA
* Dependencies in pom.xml:

```xml
<dependencies>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-client</artifactId>
        <version>2.7.3</version>
    </dependency>
    <dependency>
        <groupId>junit</groupId>
        <artifactId>junit</artifactId>
        <version>4.12</version>
    </dependency>
    <dependency>
        <groupId>org.slf4j</groupId>
        <artifactId>slf4j-log4j12</artifactId>
        <version>1.7.10</version>
    </dependency>
</dependencies>
```
* Create a log4j.properties file under the resources directory:

```properties
log4j.rootLogger=INFO, stdout
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d %p [%c] - %m%n
log4j.appender.logfile=org.apache.log4j.FileAppender
log4j.appender.logfile.File=target/spring.log
log4j.appender.logfile.layout=org.apache.log4j.PatternLayout
log4j.appender.logfile.layout.ConversionPattern=%d %p [%c] - %m%n
```
* Create a test class in the java directory that creates a directory on HDFS:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.junit.Test;

import java.io.IOException;
import java.net.URI;
import java.net.URISyntaxException;

public class HdfsClient {

    @Test
    public void testMkdir() throws URISyntaxException, IOException, InterruptedException {
        // Address of the cluster's NameNode
        URI uri = new URI("hdfs://hadoop01:8020");
        // Create a configuration object
        Configuration configuration = new Configuration();
        // The user to operate as (must match a user on the server)
        String user = "root"; // the system root user
        // Obtain the client object
        FileSystem fs = FileSystem.get(uri, configuration, user);
        // Create a directory
        fs.mkdirs(new Path("/sanguo/nihao"));
        // Release resources
        fs.close();
    }
}
```
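The URI handed to FileSystem.get above identifies the NameNode's RPC endpoint: the scheme selects the HDFS implementation, and host:port must match the cluster's NameNode RPC address (hadoop01:8020 is this tutorial's example cluster, not a universal default). A plain-JDK sketch of how the client sees the pieces, with no Hadoop dependency needed:

```java
import java.net.URI;

public class UriCheck {
    public static void main(String[] args) {
        // Same connection string as in HdfsClient above
        URI uri = URI.create("hdfs://hadoop01:8020");
        System.out.println(uri.getScheme()); // hdfs  -> picks the HDFS FileSystem
        System.out.println(uri.getHost());   // hadoop01 -> NameNode hostname
        System.out.println(uri.getPort());   // 8020 -> NameNode RPC port
    }
}
```

If the connection hangs or is refused, comparing these parts against the cluster's configured NameNode address is a quick first check.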