Exporting and importing Elasticsearch data with the elasticdump tool

scgaopan · Published 07/21 14:58
  • Install elasticdump on CentOS:
yum install elasticdump
(elasticdump is distributed as an npm package; if your yum repositories do not carry it, it can also be installed with `npm install -g elasticdump` after installing Node.js.)
  • After installation, check the built-in help:
[root@i-vvxxxxswtw5ne ~]# elasticdump --help
elasticdump: Import and export tools for elasticsearch
version: 2.2.0

Usage: elasticdump --input SOURCE --output DESTINATION [OPTIONS]

--input
                    Source location (required)
--input-index
                    Source index and type
                    (default: all, example: index/type)
--output
                    Destination location (required)
--output-index
                    Destination index and type
                    (default: all, example: index/type)
--limit
                    How many objects to move in batch per operation
                    limit is approximate for file streams
                    (default: 100)
--debug
                    Display the elasticsearch commands being used
                    (default: false)
--type
                    What are we exporting?
                    (default: data, options: [data, mapping])
--delete
                    Delete documents one-by-one from the input as they are
                    moved.  Will not delete the source index
                    (default: false)
--searchBody
                    Perform a partial extract based on search results
                    (when ES is the input,
                    (default: '{"query": { "match_all": {} } }'))
--sourceOnly
                    Output only the json contained within the document _source
                    Normal: {"_index":"","_type":"","_id":"", "_source":{SOURCE}}
                    sourceOnly: {SOURCE}
                    (default: false)
--all
                    Load/store documents from ALL indexes
                    (default: false)
--ignore-errors
                    Will continue the read/write loop on write error
                    (default: false)
--scrollTime
                    How long the nodes will hold the scroll context open
                    for the requested search.
                    (default: 10m)
--maxSockets
                    How many simultaneous HTTP requests can we make?
                    (default:
                      5 [node <= v0.10.x] /
                      Infinity [node >= v0.11.x] )
--timeout
                    Integer containing the number of milliseconds to wait for
                    a request to respond before aborting the request. Passed
                    directly to the request library. Mostly used when you don't
                    care too much if you lose some data when importing
                    but rather have speed.
--offset
                    Integer containing the number of rows you wish to skip
                    ahead from the input transport.  When importing a large
                    index, things can go wrong, be it connectivity, crashes,
                    someone forgetting to `screen`, etc.  This allows you
                    to start the dump again from the last known line written
                    (as logged by the `offset` in the output).  Please be
                    advised that since no sorting is specified when the
                    dump is initially created, there's no real way to
                    guarantee that the skipped rows have already been
                    written/parsed.  This is more of an option for when
                    you want to get most data as possible in the index
                    without concern for losing some rows in the process,
                    similar to the `timeout` option.
--inputTransport
                    Provide a custom js file to use as the input transport
--outputTransport
                    Provide a custom js file to use as the output transport
--toLog
                    When using a custom outputTransport, should log lines
                    be appended to the output stream?
                    (default: true, except for `$`)
--help
                    This page

Examples:

# Copy an index from production to staging with mappings:
elasticdump \
  --input=http://production.es.com:9200/my_index \
  --output=http://staging.es.com:9200/my_index \
  --type=mapping
elasticdump \
  --input=http://production.es.com:9200/my_index \
  --output=http://staging.es.com:9200/my_index \
  --type=data

# Backup index data to a file:
elasticdump \
  --input=http://production.es.com:9200/my_index \
  --output=/data/my_index_mapping.json \
  --type=mapping
elasticdump \
  --input=http://production.es.com:9200/my_index \
  --output=/data/my_index.json \
  --type=data

# Back up an index to a gzip file using stdout:
elasticdump \
  --input=http://production.es.com:9200/my_index \
  --output=$ \
  | gzip > /data/my_index.json.gz

# Backup the results of a query to a file
elasticdump \
  --input=http://production.es.com:9200/my_index \
  --output=query.json \
  --searchBody '{"query":{"term":{"username": "admin"}}}'
Learn more @ https://github.com/taskrabbit/elasticsearch-dump
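Once the query passed to --searchBody gets more complex than the one-liner above, hand-written shell quoting becomes error-prone. A minimal sketch, assuming Python is available on the box, that builds the body with json.dumps and assembles the command line (the host and index names are the placeholders from the example above, not live endpoints):

```python
import json
import shlex

# Build the search body as a Python dict and serialize it once,
# avoiding hand-written quoting mistakes in the shell.
query = {"query": {"term": {"username": "admin"}}}
body = json.dumps(query)

# Assemble the elasticdump command line (placeholder host/index).
cmd = [
    "elasticdump",
    "--input=http://production.es.com:9200/my_index",
    "--output=query.json",
    "--searchBody", body,
]
print(" ".join(shlex.quote(part) for part in cmd))
```

The printed line can be pasted into a shell, or the `cmd` list passed directly to `subprocess.run`, which skips shell quoting entirely.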
  • Copying data from one cluster to another: both --input and --output are index URLs
    [root@i-vvwdddtw5ne ~]# elasticdump --input=http://192.192.16.50:9200/elasticsearch_sapdata --output=http://192.192.16.30:9200/elasticsearch_sapdata --type=data
    Sun, 21 Jul 2019 06:44:18 GMT | starting dump
    Sun, 21 Jul 2019 06:44:18 GMT | Error Emitted => {"error":{"root_cause":[{"type":"parsing_exception","reason":"The field [fields] is no longer supported, please use [stored_fields] to retrieve stored fields or _source filtering if the field is not stored","line":1,"col":36}],"type":"parsing_exception","reason":"The field [fields] is no longer supported, please use [stored_fields] to retrieve stored fields or _source filtering if the field is not stored","line":1,"col":36},"status":400}
    Sun, 21 Jul 2019 06:44:18 GMT | Total Writes: 0
    Sun, 21 Jul 2019 06:44:18 GMT | dump ended with error (get phase) => Error: {"error":{"root_cause":[{"type":"parsing_exception","reason":"The field [fields] is no longer supported, please use [stored_fields] to retrieve stored fields or _source filtering if the field is not stored","line":1,"col":36}],"type":"parsing_exception","reason":"The field [fields] is no longer supported, please use [stored_fields] to retrieve stored fields or _source filtering if the field is not stored","line":1,"col":36},"status":400}

      Workaround: pass an explicit search body so the request does not use the removed [fields] parameter — add  --searchBody '{"query":{"match_all": {}}}'

# Both input and output point at index URLs
[root@i-vvwtw5ne ~]# elasticdump --input=http://192.192.16.50:9200/elasticsearch_sapdata --output=http://192.192.16.30:9200/elasticsearch_sapdata --type=data   --searchBody '{"query":{"match_all": {}}}'
Sun, 21 Jul 2019 06:49:57 GMT | starting dump
Sun, 21 Jul 2019 06:49:57 GMT | got 100 objects from source elasticsearch (offset: 0)
Sun, 21 Jul 2019 06:49:57 GMT | sent 100 objects to destination elasticsearch, wrote 100
Sun, 21 Jul 2019 06:49:57 GMT | got 100 objects from source elasticsearch (offset: 100)
Sun, 21 Jul 2019 06:49:57 GMT | sent 100 objects to destination elasticsearch, wrote 100
Sun, 21 Jul 2019 06:49:57 GMT | got 100 objects from source elasticsearch (offset: 200)
Sun, 21 Jul 2019 06:49:57 GMT | sent 100 objects to destination elasticsearch, wrote 100
Sun, 21 Jul 2019 06:49:57 GMT | got 100 objects from source elasticsearch (offset: 300)
Sun, 21 Jul 2019 06:49:57 GMT | sent 100 objects to destination elasticsearch, wrote 100
Sun, 21 Jul 2019 06:49:57 GMT | got 100 objects from source elasticsearch (offset: 400)
Sun, 21 Jul 2019 06:49:57 GMT | sent 100 objects to destination elasticsearch, wrote 100
Sun, 21 Jul 2019 06:49:57 GMT | got 100 objects from source elasticsearch (offset: 500)
Sun, 21 Jul 2019 06:49:58 GMT | sent 100 objects to destination elasticsearch, wrote 100
Sun, 21 Jul 2019 06:49:58 GMT | got 100 objects from source elasticsearch (offset: 600)
Sun, 21 Jul 2019 06:49:58 GMT | sent 100 objects to destination elasticsearch, wrote 100
Sun, 21 Jul 2019 06:49:58 GMT | got 100 objects from source elasticsearch (offset: 700)
Sun, 21 Jul 2019 06:49:58 GMT | sent 100 objects to destination elasticsearch, wrote 100
Sun, 21 Jul 2019 06:49:58 GMT | got 100 objects from source elasticsearch (offset: 800)
Sun, 21 Jul 2019 06:49:58 GMT | sent 100 objects to destination elasticsearch, wrote 100
Sun, 21 Jul 2019 06:49:58 GMT | got 100 objects from source elasticsearch (offset: 900)
Sun, 21 Jul 2019 06:49:58 GMT | sent 100 objects to destination elasticsearch, wrote 100
Sun, 21 Jul 2019 06:49:58 GMT | got 100 objects from source elasticsearch (offset: 1000)
Sun, 21 Jul 2019 06:49:58 GMT | sent 100 objects to destination elasticsearch, wrote 100
Sun, 21 Jul 2019 06:49:58 GMT | got 100 objects from source elasticsearch (offset: 1100)
Sun, 21 Jul 2019 06:49:58 GMT | sent 100 objects to destination elasticsearch, wrote 100
Sun, 21 Jul 2019 06:49:58 GMT | got 100 objects from source elasticsearch (offset: 1200)
Sun, 21 Jul 2019 06:49:59 GMT | sent 100 objects to destination elasticsearch, wrote 100
Sun, 21 Jul 2019 06:49:59 GMT | got 87 objects from source elasticsearch (offset: 1300)
Sun, 21 Jul 2019 06:49:59 GMT | sent 87 objects to destination elasticsearch, wrote 87
Sun, 21 Jul 2019 06:49:59 GMT | got 0 objects from source elasticsearch (offset: 1387)
Sun, 21 Jul 2019 06:49:59 GMT | Total Writes: 1387
Sun, 21 Jul 2019 06:49:59 GMT | dump complete
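The offsets in the log above follow directly from --limit: with the default batch size of 100 and 1387 documents in the source index, elasticdump fetches at offsets 0, 100, ..., 1300, gets a final short batch of 87, and stops when a fetch returns 0 objects. A quick sketch of that arithmetic:

```python
# Reproduce the batch offsets seen in the dump log above.
total_docs = 1387   # documents in the source index
limit = 100         # elasticdump's default --limit

offset = 0
batches = []
while True:
    got = min(limit, total_docs - offset)
    batches.append((offset, got))
    if got == 0:
        break
    offset += got

print(batches[:3])   # the first few full fetches
print(batches[-2:])  # the short final batch and the empty terminator
```

This is also why a resumed run can pass --offset from the last logged value: the next fetch simply starts where the previous batch left off (with the caveat from the --offset help text that unsorted dumps give no hard guarantee about which rows were already written).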
  • Importing a file into a cluster: input is the file, output is the destination index URL
[root@i-vvwtw5ne ~]# elasticdump --input=gaopan.json --output=http://192.192.16.30:9200/elasticsearch_sapdata --type=data --searchBody '{"query":{"match_all": {}}}'
Sun, 21 Jul 2019 06:53:36 GMT | starting dump
Sun, 21 Jul 2019 06:53:36 GMT | got 100 objects from source file (offset: 0)
Sun, 21 Jul 2019 06:53:36 GMT | sent 100 objects to destination elasticsearch, wrote 100
Sun, 21 Jul 2019 06:53:37 GMT | got 137 objects from source file (offset: 100)
Sun, 21 Jul 2019 06:53:37 GMT | sent 137 objects to destination elasticsearch, wrote 137
Sun, 21 Jul 2019 06:53:37 GMT | got 141 objects from source file (offset: 237)
Sun, 21 Jul 2019 06:53:37 GMT | sent 141 objects to destination elasticsearch, wrote 141
Sun, 21 Jul 2019 06:53:37 GMT | got 132 objects from source file (offset: 378)
Sun, 21 Jul 2019 06:53:37 GMT | sent 132 objects to destination elasticsearch, wrote 132
Sun, 21 Jul 2019 06:53:37 GMT | got 143 objects from source file (offset: 510)
Sun, 21 Jul 2019 06:53:37 GMT | sent 143 objects to destination elasticsearch, wrote 143
Sun, 21 Jul 2019 06:53:37 GMT | got 132 objects from source file (offset: 653)
Sun, 21 Jul 2019 06:53:37 GMT | sent 132 objects to destination elasticsearch, wrote 132
Sun, 21 Jul 2019 06:53:37 GMT | got 140 objects from source file (offset: 785)
Sun, 21 Jul 2019 06:53:38 GMT | sent 140 objects to destination elasticsearch, wrote 140
Sun, 21 Jul 2019 06:53:38 GMT | got 131 objects from source file (offset: 925)
Sun, 21 Jul 2019 06:53:38 GMT | sent 131 objects to destination elasticsearch, wrote 131
Sun, 21 Jul 2019 06:53:38 GMT | got 143 objects from source file (offset: 1056)
Sun, 21 Jul 2019 06:53:38 GMT | sent 143 objects to destination elasticsearch, wrote 143
Sun, 21 Jul 2019 06:53:38 GMT | got 132 objects from source file (offset: 1199)
Sun, 21 Jul 2019 06:53:38 GMT | sent 132 objects to destination elasticsearch, wrote 132
Sun, 21 Jul 2019 06:53:38 GMT | got 56 objects from source file (offset: 1331)
Sun, 21 Jul 2019 06:53:38 GMT | sent 56 objects to destination elasticsearch, wrote 56
Sun, 21 Jul 2019 06:53:38 GMT | got 0 objects from source file (offset: 1387)
Sun, 21 Jul 2019 06:53:38 GMT | Total Writes: 1387
Sun, 21 Jul 2019 06:53:38 GMT | dump complete
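The input file for the file-to-cluster direction is newline-delimited JSON: one document envelope per line, in the same {"_index", "_type", "_id", "_source"} shape that elasticdump itself writes when exporting. A sketch that generates such a file (the index name matches the example; the field names are made up for illustration):

```python
import json

# Write a newline-delimited JSON file in the envelope shape that
# elasticdump exports: one full document object per line.
docs = [
    {"_index": "elasticsearch_sapdata", "_type": "_doc", "_id": str(i),
     "_source": {"username": f"user{i}", "active": True}}
    for i in range(3)
]

with open("gaopan_sample.json", "w") as f:
    for doc in docs:
        f.write(json.dumps(doc) + "\n")
```

A file produced this way can be fed to --input just like gaopan.json in the command above.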
  • Exporting data to a JSON file: input is the source index URL, output is the file path
    [root@i-vvwtw5ne ~]# elasticdump --input=http://192.192.16.30:9200/elasticsearch_sapdata --output=gaopan2.json --type=data    --searchBody '{"query":{"match_all": {}}}' 
    Sun, 21 Jul 2019 06:55:57 GMT | starting dump
    Sun, 21 Jul 2019 06:55:57 GMT | got 100 objects from source elasticsearch (offset: 0)
    Sun, 21 Jul 2019 06:55:57 GMT | sent 100 objects to destination file, wrote 100
    Sun, 21 Jul 2019 06:55:57 GMT | got 100 objects from source elasticsearch (offset: 100)
    Sun, 21 Jul 2019 06:55:57 GMT | sent 100 objects to destination file, wrote 100
    Sun, 21 Jul 2019 06:55:57 GMT | got 100 objects from source elasticsearch (offset: 200)
    Sun, 21 Jul 2019 06:55:57 GMT | sent 100 objects to destination file, wrote 100
    Sun, 21 Jul 2019 06:55:57 GMT | got 100 objects from source elasticsearch (offset: 300)
    Sun, 21 Jul 2019 06:55:57 GMT | sent 100 objects to destination file, wrote 100
    Sun, 21 Jul 2019 06:55:57 GMT | got 100 objects from source elasticsearch (offset: 400)
    Sun, 21 Jul 2019 06:55:57 GMT | sent 100 objects to destination file, wrote 100
    Sun, 21 Jul 2019 06:55:57 GMT | got 100 objects from source elasticsearch (offset: 500)
    Sun, 21 Jul 2019 06:55:57 GMT | sent 100 objects to destination file, wrote 100
    Sun, 21 Jul 2019 06:55:57 GMT | got 100 objects from source elasticsearch (offset: 600)
    Sun, 21 Jul 2019 06:55:57 GMT | sent 100 objects to destination file, wrote 100
    Sun, 21 Jul 2019 06:55:57 GMT | got 100 objects from source elasticsearch (offset: 700)
    Sun, 21 Jul 2019 06:55:57 GMT | sent 100 objects to destination file, wrote 100
    Sun, 21 Jul 2019 06:55:57 GMT | got 100 objects from source elasticsearch (offset: 800)
    Sun, 21 Jul 2019 06:55:57 GMT | sent 100 objects to destination file, wrote 100
    Sun, 21 Jul 2019 06:55:57 GMT | got 100 objects from source elasticsearch (offset: 900)
    Sun, 21 Jul 2019 06:55:57 GMT | sent 100 objects to destination file, wrote 100
    Sun, 21 Jul 2019 06:55:57 GMT | got 100 objects from source elasticsearch (offset: 1000)
    Sun, 21 Jul 2019 06:55:57 GMT | sent 100 objects to destination file, wrote 100
    Sun, 21 Jul 2019 06:55:57 GMT | got 100 objects from source elasticsearch (offset: 1100)
    Sun, 21 Jul 2019 06:55:57 GMT | sent 100 objects to destination file, wrote 100
    Sun, 21 Jul 2019 06:55:57 GMT | got 100 objects from source elasticsearch (offset: 1200)
    Sun, 21 Jul 2019 06:55:57 GMT | sent 100 objects to destination file, wrote 100
    Sun, 21 Jul 2019 06:55:57 GMT | got 87 objects from source elasticsearch (offset: 1300)
    Sun, 21 Jul 2019 06:55:57 GMT | sent 87 objects to destination file, wrote 87
    Sun, 21 Jul 2019 06:55:57 GMT | got 0 objects from source elasticsearch (offset: 1387)
    Sun, 21 Jul 2019 06:55:57 GMT | Total Writes: 1387
    Sun, 21 Jul 2019 06:55:57 GMT | dump complete
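The exported gaopan2.json can be post-processed line by line: each line is one full document envelope, and the payload lives under _source (which is exactly what --sourceOnly would emit on its own). A minimal reader sketch, assuming the file path from the example above:

```python
import json

def read_sources(path):
    """Yield the _source payload of each document line in an
    elasticdump NDJSON export (one JSON object per line)."""
    with open(path) as f:
        for line in f:
            line = line.strip()
            if line:
                yield json.loads(line)["_source"]

# Usage against the export from the command above:
# sources = list(read_sources("gaopan2.json"))
# print(len(sources))  # should match the logged Total Writes
```

Streaming line by line keeps memory flat even for large dumps, since the whole file is never loaded at once.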