I'm looking for a package to run background tasks in Django.
I came across django-background-tasks, but it has compatibility issues with the latest Django version since the package appears to be no longer maintained.
I am calling a run_job method from views.py that runs simulations via the MATLAB Engine API for Python:
import matlab.engine

def run_job(self):
    # Start a MATLAB engine session and run the simulation; this call
    # blocks until eng.main() returns.
    eng = matlab.engine.start_matlab()
    eng.addpath(self.utils_dir)
    eng.addpath(self.inp_dir)
    eng.cd(self.t_dir, nargout=0)
    eng.main([self.data_path], nargout=0)
After a file is uploaded in the frontend, this function is called in views.py in the backend. The problem is that the view currently waits for the simulation to complete, and the job terminates if the user switches to other tabs. I would therefore like to know how to run the simulation in the background, so that there is no interruption when the user switches to other tabs of the web page in the frontend.
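For reference, the call site in views.py currently looks roughly like this (a simplified sketch; the view name, the form handling, and the SimulationRun helper that holds the MATLAB paths are placeholders, not code shown above):

# views.py -- simplified sketch of the current synchronous call
from django.shortcuts import render
from .models import file_upload

def upload_view(request):
    if request.method == "POST":
        upload = file_upload.objects.create(
            uploader=request.user,
            file_name=request.FILES["file"].name,
        )
        job = SimulationRun(upload)  # hypothetical helper holding utils_dir, inp_dir, t_dir, data_path
        job.run_job()                # blocks this request until the MATLAB simulation finishes
    return render(request, "upload.html")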
Suggestions on other packages that I could use to run tasks in the background will be of great help.
EDIT:
In models.py, I have the following class:
# Create your models here.
class file_upload(models.Model):
    uploader = models.ForeignKey(User, on_delete=models.CASCADE)
    ids = models.AutoField(primary_key=True)
    added_on = models.DateTimeField(auto_now_add=True, null=True)
    file_name = models.CharField(max_length=255)
    # registration
    verification_token = models.CharField(max_length=255, null=True, blank=True, default="")
    # running job
    # running_job = models.CharField(max_length=255)
    # finished jobs
    # finished_jobs = models.CharField(max_length=255)
    # task dict {task_name, task_status}
    task_info = models.TextField(null=True, blank=True)
I was trying to add a task_info field to this class to store information about finished and running tasks.
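The intent was to keep a small task dict serialized in that field, roughly like this (a sketch; the task name, status values, and primary key are just examples):

import json

# mark a task as running on a given upload
upload = file_upload.objects.get(ids=1)
upload.task_info = json.dumps({"main_simulation": "running"})
upload.save(update_fields=["task_info"])

# later, when the job finishes
info = json.loads(upload.task_info or "{}")
info["main_simulation"] = "finished"
upload.task_info = json.dumps(info)
upload.save(update_fields=["task_info"])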
I'm not sure whether the management command should be added to models.py.
You might write a management command that, say, takes a RunRequest instance and generates a RunResults instance when it finishes. The view could then create a RunRequest instance (with a link to the User instance!) and spawn a process completely separate from the Django server to actually do the processing. Having done this, it could immediately return to the user, perhaps by redirecting to a MyRunJobs view which polls the user's RunRequests and RunResults and displays status, provides appropriate links to retrieve or display the results for the ones that have completed, and (maybe) shows an estimated time to completion for the ones that haven't.

Management commands provide the necessary environment for code to access Django models and instances from code that isn't running in a view, but from a "command line" (which in this case might be used only by the process-spawning code).
Vague and sketchy I know, but detail would require knowing more about your environment, the matlab processing, etc.
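Still, a minimal sketch of that shape, assuming an app called yourapp, models named RunRequest and RunResults, and a command named run_simulation (all of these names are placeholders):

# yourapp/management/commands/run_simulation.py -- sketch; app, model and field names are assumptions
from django.core.management.base import BaseCommand
from yourapp.models import RunRequest, RunResults

class Command(BaseCommand):
    help = "Run the MATLAB simulation for a single RunRequest"

    def add_arguments(self, parser):
        parser.add_argument("request_id", type=int)

    def handle(self, *args, **options):
        run_request = RunRequest.objects.get(pk=options["request_id"])
        # ... start the MATLAB engine and run the job here, much like run_job() above ...
        RunResults.objects.create(request=run_request, status="finished")


# views.py -- sketch; create the RunRequest, spawn a separate process, return at once
import subprocess
import sys

from django.shortcuts import redirect

def start_job(request):
    run_request = RunRequest.objects.create(user=request.user)
    # Assumes manage.py is in the working directory of the web server process.
    subprocess.Popen([sys.executable, "manage.py", "run_simulation", str(run_request.pk)])
    return redirect("my-run-jobs")

The view returns as soon as the process is spawned; the MyRunJobs view can then read the user's RunRequests and RunResults rows to show progress.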
Alternatively there's a very comprehensive (and complex) package for running stuff asynchronously, called Celery. It supports Django. I have virtually no experience with it because I decided it was a sledgehammer, and my problem a thin-shelled nut.
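For completeness, the basic Celery shape is a task module plus a .delay() call from the view. A minimal sketch, assuming a configured Celery app and broker (the task and argument names are placeholders):

# yourapp/tasks.py -- sketch; requires a configured Celery app and message broker
import matlab.engine
from celery import shared_task

@shared_task
def run_simulation_task(utils_dir, inp_dir, t_dir, data_path):
    # Runs in a Celery worker process, not in the web request.
    eng = matlab.engine.start_matlab()
    eng.addpath(utils_dir)
    eng.addpath(inp_dir)
    eng.cd(t_dir, nargout=0)
    eng.main([data_path], nargout=0)
    eng.quit()

# In the view: enqueue the task and return immediately.
# run_simulation_task.delay(utils_dir, inp_dir, t_dir, data_path)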