downstreamFormat - Last in array to contain no comma? #333
Comments
The same for me ^^^

@Keyang The same for me too ^^^
Maybe if someone needs a quick solution until it's fixed:

```js
import { Transform } from 'stream';

const csv = csvtojson(csvOptions);

// you can use one more stream to transform csv stream output
const transform = new Transform({
  transform(chunk, encoding, callback) {
    let string = chunk.toString('utf-8');
    if (['[\n', ']\n'].includes(string)) {
      if (string === '[\n') {
        this.theFirstEntity = true;
      }
      return callback(null, string);
    }
    string = string.replace(/,$/gm, '');
    if (this.theFirstEntity) {
      this.theFirstEntity = false;
    } else {
      string = ',' + string;
    }
    callback(null, string);
  },
});

readableStream.pipe(csv).pipe(transform).pipe(fs.createWriteStream('output.json'));

/*
output should be the following:
[
{"Name": "Bob"}
,{"Name": "Sarah"}
,{"Name": "James"}
]
*/
```
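The logic of that transform can also be pulled out into a plain function, which makes the comma handling easy to check without wiring up any streams (a sketch; `fixChunk` is a name I've made up, and the `state` object stands in for the Transform's `this`):

```javascript
// Same idea as the Transform above, as a pure function: pass '[' / ']'
// chunks through, strip the trailing comma emitted after every row, and
// re-add it as a *leading* comma on every row except the first, so the
// last row never ends with a comma.
function fixChunk(string, state) {
  if (string === '[\n' || string === ']\n') {
    if (string === '[\n') state.first = true;
    return string;
  }
  string = string.replace(/,$/gm, '');
  if (state.first) {
    state.first = false;
  } else {
    string = ',' + string;
  }
  return string;
}

// Feeding it the kind of chunks described in this thread:
const state = {};
const chunks = ['[\n', '{"Name": "Bob"},', '{"Name": "Sarah"},', ']\n'];
console.log(chunks.map((c) => fixChunk(c, state)).join(''));
// now valid JSON: the last row has no trailing comma
```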
@Keyang any updates on this?
I had more luck with this:

```js
const { Transform } = require('stream');
const csvtojson = require("csvtojson");

const lineToArray = new Transform({
  transform(chunk, encoding, cb) {
    // add [ to very front
    // add , between rows
    // remove crlf from row
    this.push((this.isNotAtFirstRow ? ',' : '[') + chunk.toString('utf-8').slice(0, -1));
    this.isNotAtFirstRow = true;
    cb();
  },
  flush(cb) {
    // add ] to very end or [] if no rows
    const isEmpty = (!this.isNotAtFirstRow);
    this.push(isEmpty ? '[]' : ']');
    cb();
  }
});

readableStream
  .pipe(csvtojson({
    checkType: true,
    downstreamFormat: 'line'
  }))
  .pipe(lineToArray)
  .pipe(writableStream);
```

Also #389

Update: Modified to reflect comments on -1 #333 (comment)
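For anyone who wants to sanity-check the bracket and comma placement without installing csvtojson, the lineToArray logic boils down to this pure function (a sketch; `joinJsonLines` is my own name, and it assumes each chunk is one JSON row ending in a single `\n`, as with `downstreamFormat: 'line'`):

```javascript
// Mirrors the lineToArray Transform: '[' before the first row,
// ',' before every later row, trailing newline sliced off each row,
// and ']' at the end ('[]' if there were no rows at all).
function joinJsonLines(lines) {
  if (lines.length === 0) return '[]';
  return lines
    .map((line, i) => (i === 0 ? '[' : ',') + line.slice(0, -1))
    .join('') + ']';
}

console.log(joinJsonLines(['{"Name":"Bob"}\n', '{"Name":"Sarah"}\n']));
// [{"Name":"Bob"},{"Name":"Sarah"}]
```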
Is someone still maintaining this? I'm having the same exact issue.
same issue |
any update on this?
hi @ushakov-ruslan, did you try my example?
yes @oliverfoster `parse(){`
You're not using my example. I also couldn't get @ushakov-ruslan's example to work, which is why I left mine. Perhaps the code changed in between? Mine explicitly switches to the line stream so that each write to the stream is a single JSON item or CSV row. It makes everything much simpler to handle.
will this work for `downstreamFormat: 'line'`? Also, I want to have an array of objects.

`downstreamFormat: 'line'`, yes. My transform is called lineToArray.
```js
readStream
  .pipe(csv({
    downstreamFormat: 'line',
    checkType: true
  }))
  .pipe(lineToArray)
  .pipe(stream);
```

Are you debugging?
You're also not calling resolve on your promise.

I am calling resolve on header to get all the header values.
Can you remove all the header stuff and just test it as I wrote it three messages ago? The line parser is not the array parser. Get the array of objects first, then work out how to change the headers.
Your input file path isn't right, the CSV isn't formatted correctly, you're not importing the library properly, or you're dropping errors in a try/catch block around your call to parse, something along those lines. Are you using a debugger? If you're getting an empty file, the last line I know is running is the fs.createWriteStream.
`readStream.pipe(stream);` Does this make a copy of the file?
yes, correct.
What version of node are you using? I want to test it myself. It definitely looks like an issue with the way I'm using streams.
Node Version:
Works for me.

Any luck @ravibadoni?

Great, thanks @olivermartinfoster

Awesome, that's good!! 👍 😅

Any news? The transformer is not working for my case.

What's your case? No news yet.

it worked for me if I replace
@oliverfoster's solution #333 (comment) worked for me, with this small change:

```diff
- this.push((this.isNotAtFirstRow ? ',' : '[') + chunk.toString('utf-8').slice(0, -2));
+ this.push((this.isNotAtFirstRow ? ',' : '[') + chunk.toString('utf-8').slice(0, -1));
```
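The `-2` versus `-1` difference comes down to line endings: a row ending in `\r\n` (CRLF) needs two characters removed, one ending in `\n` (LF) needs only one. Stripping the line ending with a regex avoids hard-coding either offset (a sketch; `stripLineEnding` is my own name):

```javascript
// Removes a trailing LF or CRLF, whichever the row happens to have,
// so the transform works regardless of how the CSV was saved.
function stripLineEnding(row) {
  return row.replace(/\r?\n$/, '');
}

// Could then replace the slice() call in the transform:
//   this.push((this.isNotAtFirstRow ? ',' : '[') + stripLineEnding(chunk.toString('utf-8')));
```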
@petermikitsh the change to -1 helped me correctly parse a CSV file to JSON in the proper format, thank you both! @oliverfoster

I've updated my comment to the
```js
import csv from 'csvtojson';
// https://github.com/eugeneware/replacestream

async function convertToJson(inputcsvfilename, outputjsonfilename) {
}
```
Hi there, I'm trying to read a CSV file and stream its output data to another file as a JSON array.
But the last JSON object added to the array has an extra comma at the end, creating an invalid array/JSON.
As such:
How could I make sure it doesn't add the comma on the last entry?
Cheers