Solution:
Please read the Best Practices guide to learn how to use batch operations. This should be much faster:
```javascript
function getData() {
  const ss = SpreadsheetApp.getActiveSpreadsheet();
  const sheet = ss.getSheetByName('URL2');
  // Read one batch of URLs; adjust this range for each run (A1:A2000, then A2001:A4000, ...).
  // filter(String) drops empty cells so we never call fetch on a blank URL.
  const urls = sheet.getRange('A1:A2000').getValues().flat().filter(String);
  const data = urls.map(url => {
    const response = UrlFetchApp.fetch(url, {muteHttpExceptions: true}).getContentText();
    return [response.includes('cast</tt></a> <tt>area') ? 1 : 0];
  });
  // First empty row in column B, so each run appends after the previous batch's results.
  const brow = sheet.getRange('B1:B').getValues().filter(String).length + 1;
  sheet.getRange(brow, 2, data.length, data[0].length).setValues(data);
}
```
Important info:
Note that consumer accounts have a daily limit of 20,000 URL Fetch calls, while business (Google Workspace) accounts get 100,000. Since your scripts are limited to 6 minutes of runtime, you are on a consumer account, so you can fetch at most 20,000 URLs per day. If you have already fetched some URLs today, part of that quota is consumed, so keep this in mind when planning your batches.
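To estimate how many runs and days a full job needs under that quota, a quick calculation helps. The numbers below are purely illustrative, not taken from your sheet:

```javascript
// Hypothetical figures: adjust totalUrls to the length of your column A.
const totalUrls = 20000;   // URLs still to check
const dailyQuota = 20000;  // consumer UrlFetchApp quota per day
const batchSize = 2000;    // rows fetched per script run

const runsNeeded = Math.ceil(totalUrls / batchSize);   // script executions required
const daysNeeded = Math.ceil(totalUrls / dailyQuota);  // days to stay within quota

console.log(runsNeeded, daysNeeded); // 10 1
```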
Please Note!
Run your code in batches: first A1:A2000, then A2001:A4000, and so on. That way the script is not terminated in the middle of the process, which would leave nothing written back to the sheet. I adjusted the code so each run appends its results after the last entry in column B.
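If you would rather not edit the range by hand before each run, a small helper (hypothetical, not part of the original answer) can derive the next batch's A1 notation from the first empty row in column B:

```javascript
// Hypothetical helper: given the first empty row in column B and a batch
// size, return the A1 notation of the next batch of URLs to fetch.
function nextBatchRange(firstEmptyRow, batchSize) {
  const end = firstEmptyRow + batchSize - 1;
  return 'A' + firstEmptyRow + ':A' + end;
}

// Inside getData() you could then write:
// const urls = sheet.getRange(nextBatchRange(brow, 2000)).getValues().flat();

console.log(nextBatchRange(1, 2000));    // "A1:A2000"
console.log(nextBatchRange(2001, 2000)); // "A2001:A4000"
```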
Also, UrlFetchApp.fetch is a slow method, so the script will still take some time. From a GAS (or JavaScript) point of view, however, the code is as efficient as it can be.
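If speed matters, UrlFetchApp.fetchAll issues the requests in parallel and is usually much faster than calling fetch in a loop. A sketch (each request still counts against the daily quota; getDataParallel is a hypothetical name, and fetchAll only runs inside Apps Script):

```javascript
// Build an array of request objects for UrlFetchApp.fetchAll.
// This part is plain JavaScript; the fetchAll call below is GAS-only.
function buildRequests(urls) {
  return urls.map(url => ({url: url, muteHttpExceptions: true}));
}

// Fetch a batch of URLs in parallel and map each response to [1] or [0],
// mirroring the includes() check from the answer's code.
function getDataParallel(urls) {
  const responses = UrlFetchApp.fetchAll(buildRequests(urls));
  return responses.map(r =>
    [r.getContentText().includes('cast</tt></a> <tt>area') ? 1 : 0]
  );
}
```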