Cloudera HUE Big Data Visualization Analysis

Downloading the release

CDH release archive: http://archive-primary.cloudera.com/cdh5/cdh/5/

From this page, download the hue-3.9.0-cdh5.5.0 package.

Here is the package after downloading.

Extract the archive.
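
A minimal sketch of the download and extract step, assuming the hue-3.9.0-cdh5.5.0 tarball from the CDH archive listed above (the exact filename is an assumption; use whatever the archive actually names it):

# fetch the HUE tarball from the CDH5 archive (filename assumed)
wget http://archive-primary.cloudera.com/cdh5/cdh/5/hue-3.9.0-cdh5.5.0.tar.gz
# unpack it; this creates a hue-3.9.0-cdh5.5.0 directory
tar -zxvf hue-3.9.0-cdh5.5.0.tar.gz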

Install the required system packages:

yum install ant asciidoc cyrus-sasl-devel cyrus-sasl-gssapi gcc gcc-c++ krb5-devel libtidy libxml2-devel libxslt-devel openldap-devel python-devel sqlite-devel openssl-devel mysql-devel gmp-devel

The next step may take a while, at least three to five minutes, so please be patient.
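
Per the HUE manual, the build is driven by make apps from the extracted source directory; a sketch, assuming the tarball was unpacked to hue-3.9.0-cdh5.5.0:

cd hue-3.9.0-cdh5.5.0
# compile HUE and all of its apps into the build/ directory; this is the slow step
make apps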

The build has now completed successfully!

You can see that the build directory has been generated.

Now open the hue.ini configuration file with Notepad.

Copy the configuration snippet straight over from the official documentation.

Official documentation for reference: http://archive.cloudera.com/cdh5/cdh/5/hue-3.9.0-cdh5.5.0/manual.html
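
For reference, the piece being copied over is essentially the basic [desktop] block described in the manual; a minimal sketch, where the host, time zone and secret key are assumptions to be replaced with your own values:

[desktop]
  # random string used to secure session cookies; replace with your own long random value
  secret_key=replace-me-with-a-long-random-string
  # address and port the HUE web server listens on
  http_host=bigdata-pro03.kfk.com
  http_port=8888
  # time zone shown in the web UI
  time_zone=Asia/Shanghai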

Next, start the HUE service.
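
Per the HUE manual, the server is started with the supervisor script produced by the build, run from the HUE source directory:

# start the HUE web server (serves the UI on port 8888 by default)
build/env/bin/supervisor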

You can see the service has started and printed a long list of service information, which we can ignore.

http://bigdata-pro03.kfk.com:8888

Open this address to see the web UI.

Note: on first login, be sure to read the on-screen prompt carefully (the first account you create becomes the HUE superuser).

Here I use username: kfk, password: kfk.

Click to create the account.

We are now in the HUE interface.

Now go back to the hue.ini file to set up the HDFS integration.
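
The relevant piece of hue.ini is the [[hdfs_clusters]] block; a sketch, where the NameNode host, port and Hadoop configuration path are assumptions that must match your own core-site.xml and cluster layout:

[hadoop]
  [[hdfs_clusters]]
    [[[default]]]
      # must match fs.defaultFS in core-site.xml (host and port are assumptions)
      fs_defaultfs=hdfs://bigdata-pro01.kfk.com:9000
      # WebHDFS endpoint; requires dfs.webhdfs.enabled=true (configured below)
      webhdfs_url=http://bigdata-pro01.kfk.com:50070/webhdfs/v1
      # local directory containing the Hadoop *-site.xml files (path is an assumption)
      hadoop_conf_dir=/opt/modules/hadoop/etc/hadoop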

Add the following to Hadoop's hdfs-site.xml:

<property>
  <name>dfs.webhdfs.enabled</name>
  <value>true</value>
</property>

And add the following to Hadoop's core-site.xml:

<property>
  <name>hadoop.proxyuser.hue.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.hue.groups</name>
  <value>*</value>
</property>

Once the configuration is done, distribute the configuration files to node 1 and node 2.
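
A sketch of the distribution step, assuming node 1 and node 2 are bigdata-pro01.kfk.com and bigdata-pro02.kfk.com, that the kfk user is used on every machine, and that Hadoop lives at the same $HADOOP_HOME path on all nodes:

# copy the modified configuration files to the other two nodes
scp $HADOOP_HOME/etc/hadoop/{hdfs-site.xml,core-site.xml} kfk@bigdata-pro01.kfk.com:$HADOOP_HOME/etc/hadoop/
scp $HADOOP_HOME/etc/hadoop/{hdfs-site.xml,core-site.xml} kfk@bigdata-pro02.kfk.com:$HADOOP_HOME/etc/hadoop/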

After distributing them, restart the Hadoop services.

Start HUE as well.

Go back into the web UI.

Now the HDFS directories are visible.

You can click into these files and view the data.

First, check yarn-site.xml to find the ResourceManager addresses.

Then configure the corresponding section in hue.ini.
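
The section being edited here is presumably [[yarn_clusters]]; a sketch, where the ResourceManager and history server hosts are assumptions that should match the addresses found in yarn-site.xml above:

[hadoop]
  [[yarn_clusters]]
    [[[default]]]
      # ResourceManager address (host is an assumption; match yarn-site.xml)
      resourcemanager_host=bigdata-pro01.kfk.com
      resourcemanager_port=8032
      # allow HUE to submit jobs to this cluster
      submit_to=True
      # ResourceManager web/REST endpoints
      resourcemanager_api_url=http://bigdata-pro01.kfk.com:8088
      proxy_api_url=http://bigdata-pro01.kfk.com:8088
      # MapReduce JobHistory server REST endpoint
      history_server_api_url=http://bigdata-pro01.kfk.com:19888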

Continue configuring hue.ini, this time for Hive.
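
This round of editing is presumably the [beeswax] section, which points HUE at HiveServer2; a sketch, with the host and the configuration path as assumptions:

[beeswax]
  # host and port where HiveServer2 runs (we start it below); 10000 is the default port
  hive_server_host=bigdata-pro03.kfk.com
  hive_server_port=10000
  # local directory containing hive-site.xml (replace with your own path)
  hive_conf_dir=/path/to/hive/conf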

First, start HiveServer2.

Then start HUE.

You can see the tables we created earlier in Hive.

We try to run a query, but it fails with an error.

The error details can be seen in the Xshell session:

[kfk@bigdata-pro03 hive-0.13.-cdh5.3.0]$ bin/hiveserver2
Starting HiveServer2
OK
OK
OK
OK
OK
OK
OK
OK
OK
OK
OK
OK
NoViableAltException(@[:: ( ( KW_AS )? alias= Identifier )?])
at org.antlr.runtime.DFA.noViableAlt(DFA.java:)
at org.antlr.runtime.DFA.predict(DFA.java:)
at org.apache.hadoop.hive.ql.parse.HiveParser_FromClauseParser.tableSource(HiveParser_FromClauseParser.java:)
at org.apache.hadoop.hive.ql.parse.HiveParser_FromClauseParser.fromSource(HiveParser_FromClauseParser.java:)
at org.apache.hadoop.hive.ql.parse.HiveParser_FromClauseParser.joinSource(HiveParser_FromClauseParser.java:)
at org.apache.hadoop.hive.ql.parse.HiveParser_FromClauseParser.fromClause(HiveParser_FromClauseParser.java:)
at org.apache.hadoop.hive.ql.parse.HiveParser.fromClause(HiveParser.java:)
at org.apache.hadoop.hive.ql.parse.HiveParser.singleSelectStatement(HiveParser.java:)
at org.apache.hadoop.hive.ql.parse.HiveParser.selectStatement(HiveParser.java:)
at org.apache.hadoop.hive.ql.parse.HiveParser.regularBody(HiveParser.java:)
at org.apache.hadoop.hive.ql.parse.HiveParser.queryStatementExpressionBody(HiveParser.java:)
at org.apache.hadoop.hive.ql.parse.HiveParser.queryStatementExpression(HiveParser.java:)
at org.apache.hadoop.hive.ql.parse.HiveParser.execStatement(HiveParser.java:)
at org.apache.hadoop.hive.ql.parse.HiveParser.statement(HiveParser.java:)
at org.apache.hadoop.hive.ql.parse.ParseDriver.parse(ParseDriver.java:)
at org.apache.hadoop.hive.ql.parse.ParseDriver.parse(ParseDriver.java:)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:)
at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:)
at org.apache.hadoop.hive.ql.Driver.compileAndRespond(Driver.java:)
at org.apache.hive.service.cli.operation.SQLOperation.prepare(SQLOperation.java:)
at org.apache.hive.service.cli.operation.SQLOperation.run(SQLOperation.java:)
at org.apache.hive.service.cli.session.HiveSessionImpl.runOperationWithLogCapture(HiveSessionImpl.java:)
at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementInternal(HiveSessionImpl.java:)
at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementAsync(HiveSessionImpl.java:)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:)
at java.lang.reflect.Method.invoke(Method.java:)
at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:)
at org.apache.hive.service.cli.session.HiveSessionProxy.access$(HiveSessionProxy.java:)
at org.apache.hive.service.cli.session.HiveSessionProxy$.run(HiveSessionProxy.java:)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:)
at org.apache.hadoop.hive.shims.HadoopShimsSecure.doAs(HadoopShimsSecure.java:)
at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:)
at com.sun.proxy.$Proxy16.executeStatementAsync(Unknown Source)
at org.apache.hive.service.cli.CLIService.executeStatementAsync(CLIService.java:)
at org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement(ThriftCLIService.java:)
at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:)
at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:)
at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:)
at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:)
at org.apache.hive.service.auth.TSetIpAddressProcessor.process(TSetIpAddressProcessor.java:)
at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:)
at java.lang.Thread.run(Thread.java:)
FAILED: ParseException line : cannot recognize input near 'limt' '' '<EOF>' in table source
OK
OK
(The same NoViableAltException stack trace and ParseException were printed again for the second query attempt; omitted here.)
OK

This is because our HBase was not started. (Note also that the ParseException in the log above is triggered by a typo in the query: 'limt' instead of 'limit'.)

Configure hue.ini for HBase.
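
A sketch of the [hbase] section; the HBase Browser in HUE talks to the HBase Thrift server, so the cluster entry below (Thrift listens on port 9090 by default) and the configuration path are assumptions to adjust for your cluster:

[hbase]
  # list of (name|host:port) pairs for HBase Thrift servers (host is an assumption)
  hbase_clusters=(Cluster|bigdata-pro01.kfk.com:9090)
  # local directory containing hbase-site.xml (replace with your own path)
  hbase_conf_dir=/path/to/hbase/conf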

Restart HUE.

Reopen the web UI.

Run the query again.

Start HBase.
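
A sketch of starting HBase from the HBase installation directory; note that HUE's HBase Browser also needs the HBase Thrift server to be running:

# start the HBase cluster (HMaster and RegionServers)
bin/start-hbase.sh
# start the Thrift server that HUE's HBase Browser connects to (port 9090 by default)
bin/hbase-daemon.sh start thrift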

Due to compatibility issues with HUE 3.9, we switch to HUE 3.7 below; its configuration is the same as for 3.9.

However, because of problems in my environment, the web UI would not open with version 3.7. I suspect my Hive and HBase versions are too old, so I had to go back to version 3.9. Apologies for that; I would recommend using HBase and Hive versions 1.0 or above.

If you run into the problem of the Hive command line not printing any logs, you can try the approach below.
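
As a general suggestion (it may differ from the author's exact method), the Hive CLI can be made to print its logs to the console by overriding the root logger at startup:

# run the Hive CLI with its log output sent to the console
bin/hive --hiveconf hive.root.logger=INFO,console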
