Firebase pubsub Cloud Scheduler error
Errors reported in the console when using Cloud Scheduler:
1. Process exited with code 16
2. Firestore: Error: 4 DEADLINE_EXCEEDED: Deadline exceeded
I have recently been experiencing regular errors with my Firebase Cloud Functions, specifically a scheduled function that is triggered every 30 minutes to update data in Firestore.
My Code looks like this:
```
exports.updateRiverLevelsData = functions
.runWith({ timeoutSeconds: 540, memory: "1GB" })
.pubsub.schedule("every 30 minutes")
.onRun(async (context) => {
const scotlandResponse = await axios.get(
"https://www.riverapp.net/affiliate/api/get_stations?providerId=559ba5cd84821f077403c4b8"
);
const irelandResponse = await axios.get(
"https://www.riverapp.net/affiliate/api/get_stations?providerId=580e616584821f5ed78e9ee7"
);
const walesResponse = await axios.get(
"https://www.riverapp.net/affiliate/api/get_stations?providerId=5e4c4c5650ac9c863f897ca3"
);
const englandResponse = await axios.get(
"https://www.riverapp.net/affiliate/api/get_stations?providerId=54aaf0a7e4b01337250b209d"
);
// axios has already resolved the response body; no extra await is needed
const scotland = scotlandResponse.data;
const ireland = irelandResponse.data;
const wales = walesResponse.data;
const england = englandResponse.data;
return await Promise.all([
insertLevels(england, "england"),
insertLevels(scotland, "scotland"),
insertLevels(ireland, "ireland"),
insertLevels(wales, "wales"),
]);
});
```
```
const insertLevels = async (country: any, label: string) => {
const db = admin.firestore();
interface ISomeObject {
[key: string]: any;
}
const legends: ISomeObject = {};
const gaugeLegends: ISomeObject = {};
try {
for (let i = 0; i < country.length; i++) {
let refRiver = country[i].riverName;
if (refRiver.includes("/")) {
refRiver = refRiver.replace("/", "-");
}
if (!legends[refRiver]) {
legends[refRiver] = {
gauges: [
{
gaugename: country[i].gaugeName,
lastupdate: new Date(country[i].latestLevelTime).toISOString(),
levelseries: [`${country[i].latestLevelValue}`],
leveltrend: "",
pin: false,
rivername: refRiver,
temptrend: "",
timeseries: [new Date().toISOString()],
tempseries: [`${country[i].latestTemperatureValue}`],
},
],
};
} else {
const temp = legends[refRiver];
const newObj = {
gaugename: country[i].gaugeName,
lastupdate: new Date(country[i].latestLevelTime).toISOString(),
levelseries: [`${country[i].latestLevelValue}`],
leveltrend: "",
pin: false,
rivername: refRiver,
temptrend: "",
timeseries: [new Date().toISOString()],
tempseries: [`${country[i].latestTemperatureValue}`],
};
temp.gauges.push(newObj);
legends[refRiver] = temp;
}
}
country.forEach((item: { gaugeName: string | number }) => {
if (!gaugeLegends[item.gaugeName]) {
gaugeLegends[item.gaugeName] = item;
}
});
const date = new Date();
// e.g. "November 2009"; note that toLocaleString output is locale dependent
const dateKey = date.toLocaleString("default", {
month: "long",
year: "numeric",
});
const refCollection = db.collection("riverStoreLevels").doc(dateKey);
const checkData = await refCollection.collection(label).limit(1).get();
if (checkData.size) {
const snapshot = await refCollection.collection(label).get();
const data = snapshot.docs.map((doc) => {
return {
id: doc.id,
...doc.data(),
};
});
data.forEach((item: any) => {
let riverName = item.id;
if (riverName.includes("/")) {
riverName = riverName.replace("/", "-");
}
const gauges = item.gauges;
if (legends[riverName]) {
for (let i = 0; i < gauges.length; i++) {
// find gauges present in both the new and the old data
const oldEachGauge = gauges[i];
const newEachGauge = gaugeLegends[oldEachGauge.gaugename];
if (newEachGauge) {
const levelseries = oldEachGauge.levelseries;
const tempseries = oldEachGauge.tempseries;
const timeseries = oldEachGauge.timeseries;
let temptrend = "";
let leveltrend = "";
const latestriverupdate = new Date(
newEachGauge.latestLevelTime
).toISOString();
if (
parseFloat(levelseries[levelseries.length - 1]) <
parseFloat(newEachGauge.latestLevelValue)
) {
leveltrend = "raising";
} else if (
parseFloat(levelseries[levelseries.length - 1]) >
parseFloat(newEachGauge.latestLevelValue)
) {
leveltrend = "falling";
} else {
leveltrend = "static";
}
if (
parseFloat(tempseries[tempseries.length - 1]) <
parseFloat(newEachGauge.latestTemperatureValue)
) {
temptrend = "raising";
} else if (
parseFloat(tempseries[tempseries.length - 1]) >
parseFloat(newEachGauge.latestTemperatureValue)
) {
temptrend = "falling";
} else {
temptrend = "static";
}
levelseries.push(newEachGauge.latestLevelValue.toString());
tempseries.push(newEachGauge.latestTemperatureValue.toString());
timeseries.push(new Date().toISOString());
oldEachGauge.leveltrend = leveltrend;
oldEachGauge.temptrend = temptrend;
oldEachGauge.lastupdate = latestriverupdate;
}
}
legends[item.id] = { gauges: item.gauges };
}
});
const batchArray: any[] = [];
batchArray.push(db.batch());
let operationCounter = 0;
let batchIndex = 0;
for (const [key, value] of Object.entries(legends)) {
let refRiver = key;
if (refRiver.includes("/")) {
refRiver = refRiver.replace("/", "-");
}
const docRef = refCollection.collection(label).doc(refRiver);
batchArray[batchIndex].set(docRef, value); // batch.set is synchronous; no await needed
operationCounter++;
if (operationCounter === 499) {
batchArray.push(db.batch());
batchIndex++;
operationCounter = 0;
}
}
// commit every batch and actually await the commits
return Promise.all(batchArray.map((batch) => batch.commit()));
} else {
console.log("Document Not Exist");
const batchArray: any[] = [];
batchArray.push(db.batch());
let operationCounter = 0;
let batchIndex = 0;
for (const [key, value] of Object.entries(legends)) {
let refRiver = key;
if (refRiver.includes("/")) {
refRiver = refRiver.replace("/", "-");
}
const docRef = refCollection.collection(label).doc(refRiver);
batchArray[batchIndex].set(docRef, value); // batch.set is synchronous; no await needed
operationCounter++;
if (operationCounter === 499) {
batchArray.push(db.batch());
batchIndex++;
operationCounter = 0;
}
}
// commit every batch and actually await the commits
return Promise.all(batchArray.map((batch) => batch.commit()));
}
} catch (err) {
const e = (err as Error).message;
functions.logger.error(e);
return;
}
};
```
The function basically just updates the `gauges` array field on every document in each collection.
Database structure: riverStoreLevels (collection) -> month/year key (document) -> England (collection) -> test (document) -> test.data().gauges (field that gets updated).
I'm very frustrated: I've been getting this error for a long time and have even asked on some forums with no real answer. I could really use some help.
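For reference, the batching in the code above exists because a Firestore `WriteBatch` is limited to 500 operations per commit, so the entries have to be split into groups before committing. The chunking idea boils down to something like this standalone sketch (`chunkEntries` is a hypothetical helper, not part of the function above):

```typescript
// Standalone sketch of the batch-chunking idea: Firestore allows at most
// 500 writes per batch, so entries are split into groups of 499 before
// each group would be committed as its own batch.
function chunkEntries<T>(entries: T[], size = 499): T[][] {
  const chunks: T[][] = [];
  for (let i = 0; i < entries.length; i += size) {
    chunks.push(entries.slice(i, i + size));
  }
  return chunks;
}

// Demo with a small chunk size so the grouping is visible:
console.log(chunkEntries([1, 2, 3, 4, 5], 2));
```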
I couldn't figure out how to label this issue, so I've labeled it for a human to triage. Hang tight.
Is it possible that the function is running beyond the 540-second limit you specified?
Other than that guess, I don't think I have enough information to help. Can you share some log output, or add timing instrumentation to help narrow down the issue you are seeing?
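For example (purely a sketch; `timed` is a made-up helper, not a Firebase API), wrapping each awaited step in a timing helper would show which part is eating the 540-second budget:

```typescript
// Hypothetical timing wrapper: logs how long any awaited step takes.
async function timed<T>(label: string, work: Promise<T>): Promise<T> {
  const start = Date.now();
  try {
    return await work;
  } finally {
    console.log(`${label} took ${Date.now() - start}ms`);
  }
}

// Inside onRun you would wrap each potentially slow step, e.g.:
//   const scotlandResponse = await timed("fetch scotland", axios.get(url));
//   await timed("insert england", insertLevels(england, "england"));

// Quick self-contained demo with a stubbed async step:
(async () => {
  const value = await timed("demo step", Promise.resolve(42));
  console.log("result:", value);
})();
```

If the four HTTP fetches turn out to dominate, firing them concurrently with `Promise.all` instead of sequentially would also cut the wall-clock time.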
Hey @hiromijorge. We need more information to resolve this issue but there hasn't been an update in 7 weekdays. I'm marking the issue as stale and if there are no new updates in the next 3 days I will close it automatically.
If you have more information that will help us get to the bottom of this, just add a comment!
Since there haven't been any recent updates here, I am going to close this issue.
@hiromijorge if you're still experiencing this problem and want to continue the discussion just leave a comment here and we are happy to re-open this.