## Read CSV file using fast-csv

As usual, we need to import the necessary modules into the js file; in this case they are fs & fast-csv:

```js
const fs = require("fs");
const csv = require("fast-csv");
```

Then we create a ReadStream from the csv file using the fs.createReadStream() function:

```js
let stream = fs.createReadStream("bezkoder.csv");
```

This ReadStream object will 'pipe' a CsvParserStream object generated by the fast-csv parse() function. While creating the CsvParserStream, we listen for 2 events:

– on('data') is triggered each time a record is parsed, so we receive the record (data) in the handler function.
– on('end') is triggered after the parsing is done, once we have all the records. Hence, we will save the data to PostgreSQL in this handler function.

Now we have the csv file bezkoder.csv with content that looks like:

```
id,name,description,createdAt
1,Node.js,JavaScript runtime environment,
2,Vue.js,JavaScript Framework for building UI,
3,Angular.js,Platform for building mobile & desktop web app,
```

## Save CSV data to PostgreSQL

What we need is a table in the PostgreSQL database named category, so we run the script below:

```sql
CREATE TABLE category(
  id SERIAL PRIMARY KEY,
  name VARCHAR(255),
  description VARCHAR(255),
  created_at DATE
);
```

After the step Reading CSV file, we have all rows (except the header) pushed into the csvData array. Now we're going to use the pg module to connect to the PostgreSQL database and save them. Remember that this code is written inside the 'end' event handler function, and that we first create a new connection pool to the database.

Each row is inserted with a parameterized query:

```js
"INSERT INTO category (id, name, description, created_at) VALUES ($1, $2, $3, $4)"
```

and the result is logged:

```js
console.log("inserted " + res.rowCount + " row:", row);
```

In the code above, we iterate over the csvData array; each row is saved to PostgreSQL using a client from the pg pool. The done() function is used to release the client when the process finishes.