Using Scala's implicit conversions

1. Implicit values: when a method that declares an implicit parameter is called without an argument, the compiler searches the enclosing scope for an implicit value of the matching type and passes it in automatically.
package test

object TestImplicit1 {
  def main(args: Array[String]): Unit = {
    val stu = new Student()
    stu.print(20)            // explicit argument: prints 20
    implicit val n: Int = 22
    stu.print                // no argument: the implicit n (22) is passed in
  }
}
package test

class Student {
  def print(implicit info: Int) = {
    println("Student age: " + info)
  }
}
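One behavior worth knowing before moving on: the compiler must find exactly one implicit value of the required type in scope. A tiny counter-example (the values a and b are hypothetical, not from the original post):

implicit val a: Int = 1
implicit val b: Int = 2
// stu.print   // does not compile: "ambiguous implicit values"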
2. Implicit methods: when an argument of the wrong type is passed, the compiler looks for an in-scope implicit method that converts it to the expected type and inserts that call automatically.
package test

object TestImplicit1 {
  def main(args: Array[String ...
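The listing above is cut off in the source. As a stand-in, here is a minimal self-contained sketch of the same idea; the Double-to-Int conversion and all names (ImplicitMethodDemo, double2Int, printAge) are illustrative assumptions, not the original code:

package test

import scala.language.implicitConversions

object ImplicitMethodDemo {
  // Implicit conversion method: lets a Double be passed where an Int is expected.
  implicit def double2Int(d: Double): Int = d.toInt

  def printAge(age: Int): Unit = println("Student age: " + age)

  def main(args: Array[String]): Unit = {
    printAge(21.5) // the compiler rewrites this to printAge(double2Int(21.5))
  }
}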
import requests
from lxml import etree
import csv

class DaXue():
    def __init__(self):
        self.headers = {
            'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/89.0.4389.90 Safari/537.36 FS'
        }
        self.links = []
        self.datas = []

    def get_html(self, url):
        # headers must be passed by keyword; as a positional argument,
        # requests.get would treat the dict as query parameters
        resp = requests.get(url, headers=self.headers)
        ...
Project result screenshot

Project directory structure:
project
    app
        static
            js
                echarts.js
        templates
            index.html
        __init__.py
        config.py
        extentions.py
        models.py
        views.py
    manage.py
Project code: index.html
...
Installing Hive can fail with the following error:

[ERROR] Terminal initialization failed; falling back to unsupported
java.lang.IncompatibleClassChangeError: Found class jline.Terminal, but interface was expected
        at jline.TerminalFactory.create(TerminalFactory.java:101)
        at jline.TerminalFactory.get(TerminalFactory.java:158)
        at jline.consol ...

The fix is to delete hadoop-2.7.6/share/hadoop/yarn/jline-2.12.jar, that is, remove every jar whose name starts with jline (usually there is just one):

rm -rf hadoop-2.6.0/share/hadoop/yarn/lib/jline-2.12.jar
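A commonly cited alternative for this jline classpath conflict (my addition, not from the original post) is to make the newer jline shipped with Hive take precedence instead of deleting jars, by setting this before starting Hive:

export HADOOP_USER_CLASSPATH_FIRST=true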
When starting Hadoop, or when running applications on top of it, a configuration WARN may appear. The cause is that one of the configuration files is missing a <property></property> wrapper; check the *-site.xml files you edited while installing Hadoop. In my case, yarn-site.xml was missing the <property></property> tags (a well-formed example follows below).

After adding them, the WARN disappears.
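For reference, a well-formed entry in yarn-site.xml looks like this; the property shown (yarn.nodemanager.aux-services) is just a common example, not necessarily the one missing in your file:

<configuration>
    <property>
        <name>yarn.nodemanager.aux-services</name>
        <value>mapreduce_shuffle</value>
    </property>
</configuration>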
When working with HDFS you sometimes need to count lines in files and check file sizes.

1. Download multiple files from an HDFS directory:

hadoop fs -get /<hdfs_dir> <local_dir>

2. Count the lines of multiple files:

hadoop fs -cat /<file_prefix>* | wc -l

3. Check file sizes:

hadoop fs -count /<file_prefix>*

To check a single file, just give the exact file path instead of a glob. A programmatic equivalent is sketched after this list.
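If you need the same numbers from code instead of the shell, here is a minimal sketch using Hadoop's FileSystem API from Scala; the path /data is a placeholder, and it assumes the Hadoop client jars plus your cluster's core-site.xml/hdfs-site.xml are on the classpath:

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, Path}

object HdfsStats {
  def main(args: Array[String]): Unit = {
    // Reads core-site.xml / hdfs-site.xml from the classpath.
    val fs = FileSystem.get(new Configuration())
    // The same numbers `hadoop fs -count` prints: dirs, files, content size.
    val summary = fs.getContentSummary(new Path("/data"))
    println(s"dirs=${summary.getDirectoryCount} files=${summary.getFileCount} bytes=${summary.getLength}")
  }
}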
04:38:16,711 [main] INFO org.apache.hadoop.conf.Configuration.deprecation - session.id is deprecated. Instead, use dfs.metrics.session-id
04:38:16,712 [main] INFO org.apache.hadoop.metrics.jvm.JvmMetrics - Initializing JVM Metrics with processName=JobTracker, sessionId=
04:38:17,407 [main] WARN org.apache.hadoop.mapreduce.JobResourceUploader - Hadoop command-line option parsing not perfor ...
mkdir: Permission denied: user=root, access=WRITE, inode="/":hdfs:supergroup:drwxr-xr-x
This error means the user running the command has no write permission on the target HDFS path (here, / is owned by hdfs:supergroup with mode drwxr-xr-x, so root cannot write to it). There are three ways to fix it:

1. Run the command as the hdfs superuser:

sudo -u hdfs hadoop fs -mkdir /aaa

2. Switch to the hdfs user with su hdfs and run the command from there.

3. Change the permissions on the file or directory itself (this also helps when a plain read such as -cat fails):

sudo -u hdfs hadoop fs -chmod 777 /aaa
When ECharts reports this error, it is usually because the option object is missing the yAxis entry (cartesian charts require yAxis even if it is just an empty yAxis: {}) or because a property name is misspelled. The same applies to xAxis.