node.js - Creating an online search engine for a large XML database (10GB - 1TB of data)


I have been using Node.js to create a website that can search the Google patent grant database, which provides its data in XML format. I've been using MongoDB for the user database, but I've been told that it is difficult to build a fast search engine with MongoDB once the data set grows large. What database technology/software should I use in conjunction with Node.js to create an efficient search engine? Is it a bad idea to have two different database technologies running one website, e.g. MongoDB and PostgreSQL? I also found a technology called Norch on GitHub: https://github.com/fergiemcdowall/norch . Would that technology be helpful here?

You are going to have a hard time matching or beating Lucene at text search with either Postgres or MongoDB. Solr or Elasticsearch are better options (they both use Lucene).
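For illustration, here is a minimal sketch of what a full-text search against Elasticsearch could look like from Node. The index name `patents`, the field names, and the helper `buildPatentQuery` are assumptions for this example, not part of the original answer; the commented-out usage shows the official `@elastic/elasticsearch` client, which requires a running Elasticsearch node.

```javascript
// Build an Elasticsearch full-text query body for patent grants.
// Index name "patents" and the field names are hypothetical.
function buildPatentQuery(terms, from = 0, size = 10) {
  return {
    from,
    size,
    query: {
      multi_match: {
        query: terms,
        // Boost matches in the title over abstract/claims.
        fields: ['title^2', 'abstract', 'claims'],
      },
    },
  };
}

// Usage with the official client (needs a running Elasticsearch server):
// const { Client } = require('@elastic/elasticsearch');
// const client = new Client({ node: 'http://localhost:9200' });
// const result = await client.search({ index: 'patents', body: buildPatentQuery('solar cell') });
```

The query body is plain JSON, so it can be built and unit-tested without touching the cluster.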

That being said, people still commonly store their data in another store and keep a separate search index, implementing some kind of synchronization between the search index and the data repository.
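One simple synchronization shape is a periodic one-way push: fetch rows changed since the last run and bulk-upsert them into the index. The adapters `fetchChangedSince` and `indexBulk` below are hypothetical; you would implement them against your database driver and search client.

```javascript
// One-way sync sketch: push documents changed since lastSyncTime
// from the data repository (e.g. Postgres) into the search index.
// fetchChangedSince(ts) -> Promise<Array<doc>>   (hypothetical adapter)
// indexBulk(docs)       -> Promise<void>         (hypothetical adapter)
async function syncChanges(lastSyncTime, fetchChangedSince, indexBulk) {
  const changed = await fetchChangedSince(lastSyncTime);
  if (changed.length > 0) {
    await indexBulk(changed); // bulk-upsert into the search index
  }
  return changed.length; // number of documents synchronized
}
```

In practice you would run this on a timer or from a job queue, persist the last sync timestamp, and also handle deletions (e.g. via a tombstone column in the repository).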

Edit based on comment:

An example combination is Solr and Postgres: Solr as the search engine and Postgres as the data repository. You can use the DataImportHandler to pull data from Postgres into Solr.
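As a sketch, a DataImportHandler `data-config.xml` for this setup might look like the following. The database URL, credentials, table, and field names are placeholders for illustration only.

```xml
<dataConfig>
  <!-- JDBC connection to the Postgres data repository (values are placeholders) -->
  <dataSource type="JdbcDataSource"
              driver="org.postgresql.Driver"
              url="jdbc:postgresql://localhost:5432/patents"
              user="solr"
              password="CHANGEME"/>
  <document>
    <!-- Pull each patent row and map columns to Solr fields -->
    <entity name="patent" query="SELECT id, title, abstract FROM patents">
      <field column="id" name="id"/>
      <field column="title" name="title"/>
      <field column="abstract" name="abstract"/>
    </entity>
  </document>
</dataConfig>
```

A full import is then triggered against the handler endpoint (e.g. `/dataimport?command=full-import`), with delta imports available for incremental updates.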

