The following are some of the situations that developers simply love to hate.
The first 90 percent of the code accounts for the first 90 percent of the development time. The remaining 10 percent of the code accounts for the other 90 percent of the development time. — Tom Cargill, Bell Labs, quoted in Jon Bentley's 'Programming Pearls' column
Some of these situations we get used to; others, we never do.
Good Side:
Bad Side:
Ugly Side:
This is where MongoES can help. MongoES is a pure Python 3 migration tool that moves documents from Elasticsearch indices into MongoDB collections.
It is robust by design: no queues or message brokers are involved, which means there are no memory spikes or system freezes.
This is possible because MongoES tags documents in the source Elasticsearch index before migrating them; these tags serve as checkpoints during the migration.
Unless documents are explicitly tagged, the _id fields of Elasticsearch documents are auto-generated alphanumeric strings. Such _id values are effectively unusable, since queries and aggregations cannot be run against them.
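The checkpoint idea can be sketched in plain Python. This is a hypothetical illustration, not MongoES's actual code: the field name mongoes_migrated and the function below are assumptions, but they show how tagging lets a re-run skip documents that were already transferred.

```python
# Hypothetical sketch of a tagging checkpoint; the field name
# "mongoes_migrated" is an assumption, not MongoES's actual field.

def tag_documents(docs, tag_field="mongoes_migrated"):
    """Stamp each Elasticsearch hit with a checkpoint tag.

    Hits that already carry the tag are skipped, so a re-run of the
    migration resumes from where the previous run stopped.
    """
    pending = []
    for doc in docs:
        source = doc["_source"]
        if not source.get(tag_field):
            source[tag_field] = True
            pending.append(doc)
    return pending

hits = [
    {"_id": "aZk3x", "_source": {"name": "spz1"}},
    {"_id": "bQw9y", "_source": {"name": "spz2", "mongoes_migrated": True}},
]
print(len(tag_documents(hits)))  # 1 — only the untagged hit is returned
```

Running it again on the same hits returns an empty list, which is exactly the checkpoint behavior: already-tagged documents are never migrated twice.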
MongoES - How to
Edit the mongoes.json file according to your requirements.
{
    "EXTRACTION":
    {
        "HOST": "localhost",
        "INDEX": "lorem_ipsum",
        "DBENGINE": "elasticsearch",
        "PORT": 9200
    },
    "COMMIT":
    {
        "HOST": "localhost",
        "DATABASE": "plasmodium_proteinbase",
        "COLLECTION": "mongoes",
        "DBENGINE": "mongo",
        "USER": "",
        "PASSWORD": "",
        "PORT": 27017
    }
}
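As a rough illustration of how such a config might be consumed, the sketch below parses a mongoes.json-shaped document and builds connection strings. The key names follow the sample config above (with MongoDB's default port 27017), but the helper functions are assumptions, not MongoES's actual API.

```python
import json

# Hypothetical loader for a mongoes.json-style config; the URI helpers
# below are illustrative, not part of MongoES.
CONFIG = json.loads("""
{
  "EXTRACTION": {"HOST": "localhost", "INDEX": "lorem_ipsum",
                 "DBENGINE": "elasticsearch", "PORT": 9200},
  "COMMIT": {"HOST": "localhost", "DATABASE": "plasmodium_proteinbase",
             "COLLECTION": "mongoes", "DBENGINE": "mongo",
             "USER": "", "PASSWORD": "", "PORT": 27017}
}
""")

def es_url(cfg):
    """Build the Elasticsearch index URL from the EXTRACTION section."""
    return f"http://{cfg['HOST']}:{cfg['PORT']}/{cfg['INDEX']}"

def mongo_uri(cfg):
    """Build a MongoDB URI from the COMMIT section, with optional auth."""
    auth = f"{cfg['USER']}:{cfg['PASSWORD']}@" if cfg["USER"] else ""
    return f"mongodb://{auth}{cfg['HOST']}:{cfg['PORT']}/{cfg['DATABASE']}"

print(es_url(CONFIG["EXTRACTION"]))  # http://localhost:9200/lorem_ipsum
print(mongo_uri(CONFIG["COMMIT"]))   # mongodb://localhost:27017/plasmodium_proteinbase
```

Keeping credentials in the config (empty here) means the same file works for both authenticated and unauthenticated MongoDB deployments.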
Make sure that both the Elasticsearch and MongoDB services are up and running, and fire up the migration by keying in:
> python3 __init__.py
Sit back and relax; we've got you covered. By default, the migration transfers 1,000 documents per batch.
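The batching behavior can be sketched with a small helper. The 1,000-document batch size matches the default mentioned above; everything else (the function name, the generator source) is an assumption for illustration.

```python
from itertools import islice

def batches(iterable, size=1000):
    """Yield lists of at most `size` documents from any iterable."""
    it = iter(iterable)
    while True:
        chunk = list(islice(it, size))
        if not chunk:
            return
        yield chunk

# In a real run, each chunk would come from an Elasticsearch scroll and
# be written to MongoDB with something like collection.insert_many(chunk).
docs = ({"doc": i} for i in range(2500))
sizes = [len(b) for b in batches(docs)]
print(sizes)  # [1000, 1000, 500]
```

Fixed-size batches are what keep memory flat: only one chunk of documents is held in memory at a time, regardless of how large the source index is.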