I have a use case where I want to run potentially long-running queries via Cloudflare Workers. These seem to work great for the simple, quick stuff, but I'm thinking about queries that could take a couple of minutes. For example, a quick query:
```ts
// Assuming node-postgres ("pg") here; the original snippet doesn't show the import.
import { Client } from "pg";

export default {
  async fetch(
    request: Request,
    env: Env,
    ctx: ExecutionContext
  ): Promise<Response> {
    const client = new Client(env.DB_URL);
    await client.connect();
    const result = await client.query("SELECT * FROM data");
    const resp = new Response(JSON.stringify(result.rows), {
      headers: { "Content-Type": "application/json" },
    });
    ctx.waitUntil(client.end());
    return resp;
  },
};
```
I don't really want to be sitting around waiting for the Worker to run a long query, so I'm considering something where the Worker issues the query and puts the results into a cache somewhere that I can poll (so maybe this is just a JavaScript problem?).
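To make that idea concrete, here is a minimal sketch of the fire-and-poll pattern, under assumptions of my own: the `RESULTS` KV binding, the `/jobs` routes, and the job-id scheme are all illustrative rather than an existing API, and the usual Worker limits still apply to work kicked off with `ctx.waitUntil`:

```ts
import { Client } from "pg";

export interface Env {
  DB_URL: string;
  RESULTS: KVNamespace; // hypothetical KV binding for cached query results
}

export default {
  async fetch(request: Request, env: Env, ctx: ExecutionContext): Promise<Response> {
    const url = new URL(request.url);

    // POST /jobs: start the query in the background and return a job id immediately.
    if (request.method === "POST" && url.pathname === "/jobs") {
      const jobId = crypto.randomUUID();
      ctx.waitUntil(runQuery(env, jobId)); // keeps running after the response is sent
      return new Response(JSON.stringify({ jobId }), {
        status: 202,
        headers: { "Content-Type": "application/json" },
      });
    }

    // GET /jobs/<id>: poll for a cached result.
    if (request.method === "GET" && url.pathname.startsWith("/jobs/")) {
      const jobId = url.pathname.split("/").pop()!;
      const cached = await env.RESULTS.get(jobId);
      return cached
        ? new Response(cached, { headers: { "Content-Type": "application/json" } })
        : new Response("pending", { status: 404 });
    }

    return new Response("not found", { status: 404 });
  },
};

async function runQuery(env: Env, jobId: string): Promise<void> {
  const client = new Client(env.DB_URL);
  await client.connect();
  try {
    const result = await client.query("SELECT * FROM data");
    // Cache the rows so a later request can pick them up.
    await env.RESULTS.put(jobId, JSON.stringify(result.rows), { expirationTtl: 3600 });
  } finally {
    await client.end();
  }
}
```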
However, this is just an idea that I've come up with. Are there any more 'best practice' ways of solving this?
One solution could be to use Cloudflare Queues. Send a job to a `long-running-jobs` queue that triggers a consumer Worker (the long-running job). When the long-running job finishes, it sends the results to another queue, `long-running-results` (it could send to multiple queues if needed), which triggers a Worker that does something with the resulting data. If an error occurs, send it to `long-running-errors`.
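A hedged sketch of that Queues setup is below. The binding names (`JOBS`, `RESULTS`, `ERRORS`) are wrangler configuration I'm assuming here, mapped to the `long-running-jobs` / `long-running-results` / `long-running-errors` queues; the payload shape is illustrative:

```ts
import { Client } from "pg";

interface Env {
  DB_URL: string;
  JOBS: Queue;    // producer binding for long-running-jobs
  RESULTS: Queue; // producer binding for long-running-results
  ERRORS: Queue;  // producer binding for long-running-errors
}

export default {
  // HTTP entry point: enqueue the job and return straight away.
  async fetch(request: Request, env: Env): Promise<Response> {
    const jobId = crypto.randomUUID();
    await env.JOBS.send({ jobId, sql: "SELECT * FROM data" });
    return new Response(JSON.stringify({ jobId }), { status: 202 });
  },

  // Consumer for long-running-jobs: runs the slow query, then forwards
  // results (or errors) to the other queues.
  async queue(batch: MessageBatch<{ jobId: string; sql: string }>, env: Env): Promise<void> {
    for (const message of batch.messages) {
      const { jobId, sql } = message.body;
      const client = new Client(env.DB_URL);
      try {
        await client.connect();
        const result = await client.query(sql);
        await env.RESULTS.send({ jobId, rows: result.rows });
        message.ack();
      } catch (err) {
        await env.ERRORS.send({ jobId, error: String(err) });
        message.retry(); // or message.ack() if you don't want redelivery
      } finally {
        await client.end();
      }
    }
  },
};
```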
You could extend that solution with job status tracking: Queued, Running, Failed, Completed. That could be built using a database (Cloudflare D1 - in public beta), a key-value store (Cloudflare Workers KV) or a strongly consistent store (Cloudflare Durable Objects). Remember that there are limits on the CPU time for Workers: 30 s for HTTP requests and 15 min for Queue consumers.
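For the Queued/Running/Failed/Completed tracking mentioned above, a minimal sketch using Workers KV could look like this (the `JOB_STATUS` binding and the `/status/` route are assumptions, not an existing API):

```ts
interface Env {
  JOB_STATUS: KVNamespace; // hypothetical KV binding holding one state per job id
}

type JobState = "Queued" | "Running" | "Failed" | "Completed";

// The producer would call this with "Queued", and the queue consumer would
// flip it to "Running", then "Completed" or "Failed" as the job progresses.
async function setStatus(env: Env, jobId: string, state: JobState): Promise<void> {
  await env.JOB_STATUS.put(jobId, state);
}

// A small endpoint to poll the current state of a job.
export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    const jobId = new URL(request.url).pathname.replace("/status/", "");
    const state = (await env.JOB_STATUS.get(jobId)) ?? "unknown";
    return new Response(JSON.stringify({ jobId, state }), {
      headers: { "Content-Type": "application/json" },
    });
  },
};
```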