I want to create a new collection and add thousands of documents (each ~1-2 KB) to it. I already have the data in JSON, so I thought this would be easy.
I understand that a batch can hold 500 writes at a time, so I wrote the following code to break the data into chunks of 500. For testing purposes I am running it with chunks of 20, and my test JSON has 72 objects.
But I keep getting the following error:
[email protected]:148
throw new Error('Cannot modify a WriteBatch that has been committed.');
^
Error: Cannot modify a WriteBatch that has been committed.
My code is as follows:
var fs = require('fs')  // db is assumed to be an already initialized Firestore instance (setup omitted)

var dataObj = JSON.parse(fs.readFileSync('./bigt.json'))
var tmpdd = dataObj.slice(0, 72)
var batch = db.batch();
console.log(tmpdd.length)

let tc = tmpdd.length
let lc = 0
let upperLimit = 20, dd = null

while (lc <= tc) {
    dd = tmpdd.slice(lc, upperLimit)
    console.log(lc, upperLimit)
    dd.map(
        o => batch.set(db.collection('nseStocks').doc(o.Date + o.variable), o)
    )
    batch.commit().then(function () {
        console.log('Written to firestore', lc, lc + upperLimit)
    })
    .catch(
        (err) => console.log('Fail', err)
    )
    lc = upperLimit
    upperLimit = upperLimit + 20
}
Also, it's weird that the batch doesn't seem to be committed on every iteration of the loop. Ideally I would let Firestore generate the document IDs, but apparently a batch does not have an add() function.
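From the docs it looks like calling doc() with no argument returns a reference with an auto-generated ID, so I'm wondering if I should be creating a fresh batch inside the loop and committing that instead. Something like this untested sketch (importAll is just a name I made up, and 500 is the documented per-batch write limit):

// Untested sketch: a fresh WriteBatch per chunk, committed before the next chunk starts.
// db.collection(...).doc() with no argument should generate the document ID for me.
async function importAll(items) {
    const chunkSize = 500  // Firestore's per-batch write limit
    for (let start = 0; start < items.length; start += chunkSize) {
        const chunk = items.slice(start, start + chunkSize)
        const batch = db.batch()  // new batch for this chunk only
        chunk.forEach(o => batch.set(db.collection('nseStocks').doc(), o))
        await batch.commit()  // finish this chunk before moving on
        console.log('Written to firestore', start, start + chunk.length)
    }
}

I would then call it as importAll(tmpdd).catch(err => console.log('Fail', err)). Is that roughly the right idea?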
I have also tried adding the documents one at a time in a loop instead of doing batch writes, but that gives me a timeout error after adding a few documents, and of course it isn't practical for a large number of documents.
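For reference, the kind of per-document loop I mean is roughly this (simplified):

// Roughly the non-batch loop I tried: add() auto-generates the document ID,
// but firing thousands of these writes at once is presumably what times out.
tmpdd.forEach(o => {
    db.collection('nseStocks').add(o)
        .then(ref => console.log('Added', ref.id))
        .catch(err => console.log('Fail', err))
})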
As you can probably tell, I am very new to Firestore; it's my second day playing with it.
Please let me know if there are any obvious mistakes or better ways of achieving this seemingly simple task.
Thanks