Node.js: Optimisations for parsing a large JSON file
What if you have 14GB of JSON data and you want to read it in Node.js and perform a filtering operation? How would you do it?

## Type of the data

For the Backend DB architecture project, a friend and I generated 15,930,001 posts, for a total of 14GB of data. Each post has the following type:

```typescript
export class Post {
    public client: number;        // ID of the Client
    public channel: number;       // ID of the Channel
    public id: number;
    public created_date: number;
    public text: string;
    public post_type: "TEXT" | "IMAGE" | "VIDEO" | "LINK";
    public labels: string[];      // max 10
    public insights: number[];    // array of 100 integers
}
```

## Baseline

A good benchmark for the minimum time needed to scan the file is the `wc` utility. Counting "\n" characters takes about 12.2 seconds:

```
$ time wc -l ../posts.json
 15930000 ../posts.json
wc -l ../posts.json  8.23s user 3.32s system 94% cpu 12.191 total
```

## Filtering for...
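One natural way to approach the filtering step is to stream the file rather than load all 14GB into memory at once. The sketch below is only an illustration, not the project's actual code: it assumes the file is newline-delimited (one JSON-encoded post per line, which the `wc -l` count of 15,930,000 suggests) and uses a hypothetical `isRecent` predicate, built on Node's built-in `fs` and `readline` modules.

```typescript
import * as fs from "fs";
import * as readline from "readline";

// Hypothetical predicate: keep posts created in the last 24 hours.
// The real filtering criterion is whatever the use case requires.
function isRecent(post: { created_date: number }): boolean {
    return post.created_date > Date.now() - 24 * 60 * 60 * 1000;
}

async function filterPosts(path: string): Promise<number> {
    const rl = readline.createInterface({
        input: fs.createReadStream(path),
        crlfDelay: Infinity, // treat \r\n as a single line break
    });

    let matches = 0;
    for await (const line of rl) {
        // Assumes one JSON object per line (newline-delimited JSON).
        const post = JSON.parse(line);
        if (isRecent(post)) matches++;
    }
    return matches;
}

filterPosts("../posts.json").then((n) => console.log(`${n} matching posts`));
```

If `posts.json` is instead one large JSON array, each line would first need its array syntax (brackets, trailing commas) stripped before `JSON.parse`, or a streaming JSON parser would be needed.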