Related to http://stackoverflow.com/questions/37007206/export-data-from-elasticsearch-to-csv-using-logstash
input {
  # Read all documents from Elasticsearch matching the given query
  elasticsearch {
    hosts => "localhost"
    index => "index_name"
    query => '{"query":{"regexp":{"not_analyzed_field":".*"}}}'
  }
}

output {
  csv {
    fields => ["_id", "_type", "field"]
    path => "/home/sergey/export.csv"
  }
}
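To run the export, save the pipeline to a file (the name export.conf below is just an example) and start Logstash with it:

  bin/logstash -f export.conf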
If I want to update the Elasticsearch version, how can I keep the indexes?
Thanks
Hello, how can I export all the indexes at once?
You can use a wildcard pattern for the index, like index => "*-suffix".
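A minimal sketch of that idea, assuming you want every index and a match-all query (the hosts, field names, and output path are placeholders):

  input {
    elasticsearch {
      hosts => "localhost"
      index => "*"                            # wildcard matches all indices
      query => '{"query":{"match_all":{}}}'
      docinfo => true                         # expose _index, _type, _id under [@metadata]
    }
  }

  output {
    csv {
      fields => ["field"]
      path => "/home/sergey/export_all.csv"
    }
  }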