
Pitfalls Encountered While Installing Scrapy on Fedora 24

webcreazy
Published 2016/07/17 19:58 · 1,759 words

First, install the development packages the build needs: python-devel and redhat-rpm-config for compiling C extensions, openssl-devel for cryptography, and libxslt-devel (which pulls in libxml2-devel as a dependency) for lxml:

[root@bogon upir]# dnf install libxslt-devel redhat-rpm-config python-devel openssl-devel

Then install Scrapy with pip:

[root@bogon upir]# pip install Scrapy
Collecting Scrapy
  Downloading Scrapy-1.1.1-py2.py3-none-any.whl (295kB)
    100% |████████████████████████████████| 296kB 40kB/s 
Collecting lxml (from Scrapy)
  Downloading lxml-3.6.0.tar.gz (3.7MB)
    100% |████████████████████████████████| 3.7MB 47kB/s 
Collecting w3lib>=1.14.2 (from Scrapy)
  Downloading w3lib-1.14.3-py2.py3-none-any.whl
Collecting service-identity (from Scrapy)
  Downloading service_identity-16.0.0-py2.py3-none-any.whl
Collecting pyOpenSSL (from Scrapy)
  Downloading pyOpenSSL-16.0.0-py2.py3-none-any.whl (45kB)
    100% |████████████████████████████████| 51kB 40kB/s 
Collecting six>=1.5.2 (from Scrapy)
  Downloading six-1.10.0-py2.py3-none-any.whl
Collecting parsel>=0.9.3 (from Scrapy)
  Downloading parsel-1.0.2-py2.py3-none-any.whl
Collecting PyDispatcher>=2.0.5 (from Scrapy)
  Downloading PyDispatcher-2.0.5.tar.gz
Collecting queuelib (from Scrapy)
  Downloading queuelib-1.4.2-py2.py3-none-any.whl
Collecting cssselect>=0.9 (from Scrapy)
  Downloading cssselect-0.9.2-py2.py3-none-any.whl
Collecting Twisted>=10.0.0 (from Scrapy)
  Downloading Twisted-16.3.0.tar.bz2 (2.9MB)
    100% |████████████████████████████████| 2.9MB 33kB/s 
Collecting pyasn1-modules (from service-identity->Scrapy)
  Downloading pyasn1_modules-0.0.8-py2.py3-none-any.whl
Collecting pyasn1 (from service-identity->Scrapy)
  Downloading pyasn1-0.1.9-py2.py3-none-any.whl
Collecting attrs (from service-identity->Scrapy)
  Downloading attrs-16.0.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from pyOpenSSL->Scrapy)
  Downloading cryptography-1.4.tar.gz (399kB)
    100% |████████████████████████████████| 409kB 48kB/s 
Collecting zope.interface>=3.6.0 (from Twisted>=10.0.0->Scrapy)
  Downloading zope.interface-4.2.0.tar.gz (146kB)
    100% |████████████████████████████████| 153kB 56kB/s 
Collecting idna>=2.0 (from cryptography>=1.3->pyOpenSSL->Scrapy)
  Downloading idna-2.1-py2.py3-none-any.whl (54kB)
    100% |████████████████████████████████| 61kB 60kB/s 
Requirement already satisfied (use --upgrade to upgrade): setuptools>=11.3 in /usr/lib/python2.7/site-packages (from cryptography>=1.3->pyOpenSSL->Scrapy)
Collecting enum34 (from cryptography>=1.3->pyOpenSSL->Scrapy)
  Downloading enum34-1.1.6-py2-none-any.whl
Collecting ipaddress (from cryptography>=1.3->pyOpenSSL->Scrapy)
  Downloading ipaddress-1.0.16-py27-none-any.whl
Collecting cffi>=1.4.1 (from cryptography>=1.3->pyOpenSSL->Scrapy)
  Downloading cffi-1.7.0-cp27-cp27mu-manylinux1_x86_64.whl (383kB)
    100% |████████████████████████████████| 389kB 58kB/s 
Collecting pycparser (from cffi>=1.4.1->cryptography>=1.3->pyOpenSSL->Scrapy)
  Downloading pycparser-2.14.tar.gz (223kB)
    100% |████████████████████████████████| 225kB 57kB/s 
Installing collected packages: lxml, six, w3lib, pyasn1, pyasn1-modules, idna, enum34, ipaddress, pycparser, cffi, cryptography, pyOpenSSL, attrs, service-identity, cssselect, parsel, PyDispatcher, queuelib, zope.interface, Twisted, Scrapy
  Running setup.py install for lxml ... error
    Complete output from command /usr/bin/python -u -c "import setuptools, tokenize;__file__='/tmp/pip-build-7xYR1h/lxml/setup.py';exec(compile(getattr(tokenize, 'open', open)(__file__).read().replace('\r\n', '\n'), __file__, 'exec'))" install --record /tmp/pip-h5ATXl-record/install-record.txt --single-version-externally-managed --compile:
    Building lxml version 3.6.0.
    Building without Cython.
    ERROR: /bin/sh: xslt-config: command not found
    
    ** make sure the development packages of libxml2 and libxslt are installed **
    
    Using build configuration of libxslt
    running install
    running build
    running build_py
    creating build
    creating build/lib.linux-x86_64-2.7
    creating build/lib.linux-x86_64-2.7/lxml
    copying src/lxml/doctestcompare.py -> build/lib.linux-x86_64-2.7/lxml
    copying src/lxml/__init__.py -> build/lib.linux-x86_64-2.7/lxml
    copying src/lxml/_elementpath.py -> build/lib.linux-x86_64-2.7/lxml
    copying src/lxml/sax.py -> build/lib.linux-x86_64-2.7/lxml
    copying src/lxml/ElementInclude.py -> build/lib.linux-x86_64-2.7/lxml
    copying src/lxml/usedoctest.py -> build/lib.linux-x86_64-2.7/lxml
    copying src/lxml/pyclasslookup.py -> build/lib.linux-x86_64-2.7/lxml
    copying src/lxml/cssselect.py -> build/lib.linux-x86_64-2.7/lxml
    copying src/lxml/builder.py -> build/lib.linux-x86_64-2.7/lxml
    creating build/lib.linux-x86_64-2.7/lxml/includes
    copying src/lxml/includes/__init__.py -> build/lib.linux-x86_64-2.7/lxml/includes
    creating build/lib.linux-x86_64-2.7/lxml/html
    copying src/lxml/html/diff.py -> build/lib.linux-x86_64-2.7/lxml/html
    copying src/lxml/html/defs.py -> build/lib.linux-x86_64-2.7/lxml/html
    copying src/lxml/html/__init__.py -> build/lib.linux-x86_64-2.7/lxml/html
    copying src/lxml/html/ElementSoup.py -> build/lib.linux-x86_64-2.7/lxml/html
    copying src/lxml/html/soupparser.py -> build/lib.linux-x86_64-2.7/lxml/html
    copying src/lxml/html/_setmixin.py -> build/lib.linux-x86_64-2.7/lxml/html
    copying src/lxml/html/usedoctest.py -> build/lib.linux-x86_64-2.7/lxml/html
    copying src/lxml/html/formfill.py -> build/lib.linux-x86_64-2.7/lxml/html
    copying src/lxml/html/clean.py -> build/lib.linux-x86_64-2.7/lxml/html
    copying src/lxml/html/_diffcommand.py -> build/lib.linux-x86_64-2.7/lxml/html
    copying src/lxml/html/html5parser.py -> build/lib.linux-x86_64-2.7/lxml/html
    copying src/lxml/html/builder.py -> build/lib.linux-x86_64-2.7/lxml/html
    copying src/lxml/html/_html5builder.py -> build/lib.linux-x86_64-2.7/lxml/html
    creating build/lib.linux-x86_64-2.7/lxml/isoschematron
    copying src/lxml/isoschematron/__init__.py -> build/lib.linux-x86_64-2.7/lxml/isoschematron
    copying src/lxml/lxml.etree.h -> build/lib.linux-x86_64-2.7/lxml
    copying src/lxml/lxml.etree_api.h -> build/lib.linux-x86_64-2.7/lxml
    copying src/lxml/includes/xinclude.pxd -> build/lib.linux-x86_64-2.7/lxml/includes
    copying src/lxml/includes/uri.pxd -> build/lib.linux-x86_64-2.7/lxml/includes
    copying src/lxml/includes/xmlschema.pxd -> build/lib.linux-x86_64-2.7/lxml/includes
    copying src/lxml/includes/htmlparser.pxd -> build/lib.linux-x86_64-2.7/lxml/includes
    copying src/lxml/includes/xslt.pxd -> build/lib.linux-x86_64-2.7/lxml/includes
    copying src/lxml/includes/tree.pxd -> build/lib.linux-x86_64-2.7/lxml/includes
    copying src/lxml/includes/xmlparser.pxd -> build/lib.linux-x86_64-2.7/lxml/includes
    copying src/lxml/includes/xpath.pxd -> build/lib.linux-x86_64-2.7/lxml/includes
    copying src/lxml/includes/schematron.pxd -> build/lib.linux-x86_64-2.7/lxml/includes
    copying src/lxml/includes/config.pxd -> build/lib.linux-x86_64-2.7/lxml/includes
    copying src/lxml/includes/dtdvalid.pxd -> build/lib.linux-x86_64-2.7/lxml/includes
    copying src/lxml/includes/c14n.pxd -> build/lib.linux-x86_64-2.7/lxml/includes
    copying src/lxml/includes/etreepublic.pxd -> build/lib.linux-x86_64-2.7/lxml/includes
    copying src/lxml/includes/xmlerror.pxd -> build/lib.linux-x86_64-2.7/lxml/includes
    copying src/lxml/includes/relaxng.pxd -> build/lib.linux-x86_64-2.7/lxml/includes
    copying src/lxml/includes/etree_defs.h -> build/lib.linux-x86_64-2.7/lxml/includes
    copying src/lxml/includes/lxml-version.h -> build/lib.linux-x86_64-2.7/lxml/includes
    creating build/lib.linux-x86_64-2.7/lxml/isoschematron/resources
    creating build/lib.linux-x86_64-2.7/lxml/isoschematron/resources/rng
    copying src/lxml/isoschematron/resources/rng/iso-schematron.rng -> build/lib.linux-x86_64-2.7/lxml/isoschematron/resources/rng
    creating build/lib.linux-x86_64-2.7/lxml/isoschematron/resources/xsl
    copying src/lxml/isoschematron/resources/xsl/RNG2Schtrn.xsl -> build/lib.linux-x86_64-2.7/lxml/isoschematron/resources/xsl
    copying src/lxml/isoschematron/resources/xsl/XSD2Schtrn.xsl -> build/lib.linux-x86_64-2.7/lxml/isoschematron/resources/xsl
    creating build/lib.linux-x86_64-2.7/lxml/isoschematron/resources/xsl/iso-schematron-xslt1
    copying src/lxml/isoschematron/resources/xsl/iso-schematron-xslt1/iso_schematron_skeleton_for_xslt1.xsl -> build/lib.linux-x86_64-2.7/lxml/isoschematron/resources/xsl/iso-schematron-xslt1
    copying src/lxml/isoschematron/resources/xsl/iso-schematron-xslt1/iso_abstract_expand.xsl -> build/lib.linux-x86_64-2.7/lxml/isoschematron/resources/xsl/iso-schematron-xslt1
    copying src/lxml/isoschematron/resources/xsl/iso-schematron-xslt1/iso_svrl_for_xslt1.xsl -> build/lib.linux-x86_64-2.7/lxml/isoschematron/resources/xsl/iso-schematron-xslt1
    copying src/lxml/isoschematron/resources/xsl/iso-schematron-xslt1/iso_schematron_message.xsl -> build/lib.linux-x86_64-2.7/lxml/isoschematron/resources/xsl/iso-schematron-xslt1
    copying src/lxml/isoschematron/resources/xsl/iso-schematron-xslt1/iso_dsdl_include.xsl -> build/lib.linux-x86_64-2.7/lxml/isoschematron/resources/xsl/iso-schematron-xslt1
    copying src/lxml/isoschematron/resources/xsl/iso-schematron-xslt1/readme.txt -> build/lib.linux-x86_64-2.7/lxml/isoschematron/resources/xsl/iso-schematron-xslt1
    running build_ext
    building 'lxml.etree' extension
    creating build/temp.linux-x86_64-2.7
    creating build/temp.linux-x86_64-2.7/src
    creating build/temp.linux-x86_64-2.7/src/lxml
    gcc -pthread -fno-strict-aliasing -O2 -g -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector-strong --param=ssp-buffer-size=4 -grecord-gcc-switches -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -m64 -mtune=generic -D_GNU_SOURCE -fPIC -fwrapv -DNDEBUG -O2 -g -pipe -Wall -Werror=format-security -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector-strong --param=ssp-buffer-size=4 -grecord-gcc-switches -specs=/usr/lib/rpm/redhat/redhat-hardened-cc1 -m64 -mtune=generic -D_GNU_SOURCE -fPIC -fwrapv -fPIC -Isrc/lxml/includes -I/usr/include/python2.7 -c src/lxml/lxml.etree.c -o build/temp.linux-x86_64-2.7/src/lxml/lxml.etree.o -w
    gcc: error: /usr/lib/rpm/redhat/redhat-hardened-cc1: No such file or directory
    Compile failed: command 'gcc' failed with exit status 1
    creating tmp
    cc -I/usr/include/libxml2 -c /tmp/xmlXPathInitxsOje_.c -o tmp/xmlXPathInitxsOje_.o
    /tmp/xmlXPathInitxsOje_.c:1:26: fatal error: libxml/xpath.h: No such file or directory
     #include "libxml/xpath.h"
                              ^
    compilation terminated.
    *********************************************************************************
    Could not find function xmlCheckVersion in library libxml2. Is libxml2 installed?
    *********************************************************************************
    error: command 'gcc' failed with exit status 1
    
    ----------------------------------------
Command "/usr/bin/python -u -c "import setuptools, tokenize;__file__='/tmp/pip-build-7xYR1h/lxml/setup.py';exec(compile(getattr(tokenize, 'open', open)(__file__).read().replace('\r\n', '\n'), __file__, 'exec'))" install --record /tmp/pip-h5ATXl-record/install-record.txt --single-version-externally-managed --compile" failed with error code 1 in /tmp/pip-build-7xYR1h/lxml/
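Each error in the log above maps to a missing system package: xslt-config comes from libxslt-devel, the libxml/xpath.h header from libxml2-devel, and the redhat-hardened-cc1 spec file from redhat-rpm-config. Before retrying pip, a quick sanity check can confirm the prerequisites are in place. This is a minimal sketch; the file paths are the ones that appear in the error log and assume a standard Fedora layout:

```python
import os
import shutil

# Each prerequisite lxml's build needs, with the Fedora package that provides it.
checks = [
    ("xslt-config (libxslt-devel)",
     shutil.which("xslt-config") is not None),
    ("libxml/xpath.h (libxml2-devel)",
     os.path.exists("/usr/include/libxml2/libxml/xpath.h")),
    ("redhat-hardened-cc1 (redhat-rpm-config)",
     os.path.exists("/usr/lib/rpm/redhat/redhat-hardened-cc1")),
]
for name, ok in checks:
    print(("OK  " if ok else "MISS") + " " + name)
```

If any line prints MISS, install the corresponding package with dnf before running pip again.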

The log above is the failure you get on a fresh system: lxml's C extension build needs xslt-config (provided by libxslt-devel), the libxml2 headers (libxml2-devel, pulled in by libxslt-devel), and gcc's hardened spec file (provided by redhat-rpm-config). With the dnf command from the top of the post run first, the install completes:

[root@bogon upir]# pip install Scrapy
Collecting Scrapy
  Using cached Scrapy-1.1.1-py2.py3-none-any.whl
Requirement already satisfied (use --upgrade to upgrade): lxml in /usr/lib64/python2.7/site-packages (from Scrapy)
Requirement already satisfied (use --upgrade to upgrade): w3lib>=1.14.2 in /usr/lib/python2.7/site-packages (from Scrapy)
Collecting service-identity (from Scrapy)
  Using cached service_identity-16.0.0-py2.py3-none-any.whl
Collecting pyOpenSSL (from Scrapy)
  Using cached pyOpenSSL-16.0.0-py2.py3-none-any.whl
Collecting parsel>=0.9.3 (from Scrapy)
  Using cached parsel-1.0.2-py2.py3-none-any.whl
Collecting PyDispatcher>=2.0.5 (from Scrapy)
  Using cached PyDispatcher-2.0.5.tar.gz
Requirement already satisfied (use --upgrade to upgrade): six>=1.5.2 in /usr/lib/python2.7/site-packages (from Scrapy)
Collecting queuelib (from Scrapy)
  Using cached queuelib-1.4.2-py2.py3-none-any.whl
Collecting cssselect>=0.9 (from Scrapy)
  Using cached cssselect-0.9.2-py2.py3-none-any.whl
Collecting Twisted>=10.0.0 (from Scrapy)
  Using cached Twisted-16.3.0.tar.bz2
Requirement already satisfied (use --upgrade to upgrade): pyasn1-modules in /usr/lib/python2.7/site-packages (from service-identity->Scrapy)
Requirement already satisfied (use --upgrade to upgrade): pyasn1 in /usr/lib/python2.7/site-packages (from service-identity->Scrapy)
Collecting attrs (from service-identity->Scrapy)
  Using cached attrs-16.0.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from pyOpenSSL->Scrapy)
  Using cached cryptography-1.4.tar.gz
Collecting zope.interface>=3.6.0 (from Twisted>=10.0.0->Scrapy)
  Using cached zope.interface-4.2.0.tar.gz
Requirement already satisfied (use --upgrade to upgrade): idna>=2.0 in /usr/lib/python2.7/site-packages (from cryptography>=1.3->pyOpenSSL->Scrapy)
Requirement already satisfied (use --upgrade to upgrade): setuptools>=11.3 in /usr/lib/python2.7/site-packages (from cryptography>=1.3->pyOpenSSL->Scrapy)
Requirement already satisfied (use --upgrade to upgrade): enum34 in /usr/lib/python2.7/site-packages (from cryptography>=1.3->pyOpenSSL->Scrapy)
Requirement already satisfied (use --upgrade to upgrade): ipaddress in /usr/lib/python2.7/site-packages (from cryptography>=1.3->pyOpenSSL->Scrapy)
Requirement already satisfied (use --upgrade to upgrade): cffi>=1.4.1 in /usr/lib64/python2.7/site-packages (from cryptography>=1.3->pyOpenSSL->Scrapy)
Requirement already satisfied (use --upgrade to upgrade): pycparser in /usr/lib/python2.7/site-packages (from cffi>=1.4.1->cryptography>=1.3->pyOpenSSL->Scrapy)
Installing collected packages: cryptography, pyOpenSSL, attrs, service-identity, cssselect, parsel, PyDispatcher, queuelib, zope.interface, Twisted, Scrapy
  Running setup.py install for cryptography ... done
  Running setup.py install for PyDispatcher ... done
  Running setup.py install for zope.interface ... done
  Running setup.py install for Twisted ... done
Successfully installed PyDispatcher-2.0.5 Scrapy-1.1.1 Twisted-16.3.0 attrs-16.0.0 cryptography-1.4 cssselect-0.9.2 parsel-1.0.2 pyOpenSSL-16.0.0 queuelib-1.4.2 service-identity-16.0.0 zope.interface-4.2.0
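With everything installed, a quick smoke test confirms the key modules are importable. A minimal sketch; note that the names below are the import names, which differ from the PyPI package names (pyOpenSSL imports as OpenSSL, Twisted as twisted):

```python
import importlib.util

# Check that the packages pip just installed can actually be found.
for mod in ("scrapy", "lxml", "twisted", "OpenSSL"):
    found = importlib.util.find_spec(mod) is not None
    print(mod, "->", "importable" if found else "missing")
```

If Scrapy's command-line tool is on your PATH, `scrapy version -v` prints the same information along with the versions of its main dependencies.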

© All rights reserved by the author.
Comments (2)

ding3my:
Bro, I had to log in just to give you an upvote; this problem tortured me for half a day.

webcreazy (author), replying to ding3my:
Glad it's solved; it tormented me for quite a while too.