normalize-json-data
v1.0.9
Use a mapper defined by yourself to get normalized data.
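To illustrate the idea of mapper-based normalization, here is a minimal sketch. This is not the package's actual API; the `normalize` function, the mapper shape (target field → dot-separated source path), and the sample data are all assumptions made for illustration.

```javascript
// Hypothetical sketch: normalize raw data with a user-defined mapper.
// Each mapper entry maps an output field name to a dot-separated path
// into the raw input object. NOT the package's real API.
function normalize(data, mapper) {
  const result = {};
  for (const [target, path] of Object.entries(mapper)) {
    // Walk the dot-separated path into the raw object; missing
    // intermediate keys yield undefined instead of throwing.
    result[target] = path.split('.').reduce((obj, key) => obj && obj[key], data);
  }
  return result;
}

const raw = { user: { full_name: 'Alice', meta: { age: 30 } } };
const mapper = { name: 'user.full_name', age: 'user.meta.age' };
console.log(normalize(raw, mapper)); // { name: 'Alice', age: 30 }
```

A flat output object like this is what you would then save or index downstream.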
The crawler uses Redis for queueing queries, MongoDB to store the necessary data, and Elasticsearch to index the final data. The crawler needs a source and a token to start.

First, migrate data from MongoDB to Redis by running the script:

node ./dist/script/migrate/migrate_source_from_mongo_2_redis.js

After that, run the pm2.yml file to start all scripts.
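For reference, a pm2.yml ecosystem file for the scripts mentioned in this README might look like the sketch below. The app names and restart settings are assumptions; only the script paths come from the commands shown here, and your actual pm2.yml may differ.

```yaml
# Hypothetical pm2.yml sketch (app names and options are assumptions)
apps:
  - name: fetch-post
    script: dist/scripts/facebook/fetch_post.js
    autorestart: true
  - name: fetch-comment
    script: dist/scripts/facebook/fetch_comment.js
    autorestart: true
```

Start everything with `pm2 start pm2.yml` and inspect the processes with `pm2 ls`.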
To crawl a specific post's comments:

node dist/scripts/facebook/fetch_comment.js -t page -i 1557782724478208_2395830717340067 -c 0

To crawl a specific source:

node dist/scripts/facebook/fetch_post.js -t page -s 1557782724478208 -c 0
############## News/Review/Ecom
Use case for the pattern: