Controlling the number of celery tasks?

Hi,

Our staging VM that runs all of the containers (we run with tutor local) just OOM’ed.

While investigating what was taking so much memory, I noticed that the bulk of it was consumed by a large number of celery worker processes.

On our production VM, we have 53 such processes, each taking between 0.3% and 0.6% of memory, so this adds up: roughly 16–32% of RAM in total.

I highly doubt that this many celery processes are needed. We are at a pre-production stage, and we have almost no users, only early testers.

Is there a way to control that, in order to limit how much memory the VM needs?

You can control the number of child processes Celery spawns with worker_concurrency, and the memory with worker_max_memory_per_child.
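For illustration, these two settings have equivalent standard Celery CLI flags (the app name `myapp` is a placeholder; a real deployment would pass its own app and broker):

```shell
# --concurrency: number of child processes the worker forks
# --max-memory-per-child: resident memory (in KiB) a child may use
#   before it is recycled and replaced with a fresh one
celery -A myapp worker --concurrency=1 --max-memory-per-child=12000
```

Note that --max-memory-per-child is expressed in kibibytes, so 12000 here is about 12 MiB per child.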

Hmm, is there a way to control that through Tutor?

Probably through a plugin that adds Django settings like CELERY_WORKER_CONCURRENCY, or by using the LMS_WORKER_COMMAND and CMS_WORKER_COMMAND filters.

You could make a plugin (I didn’t test this):

# myplugin.py
from tutor import hooks

# Django settings appended to both LMS and CMS via the
# "openedx-common-settings" patch.
CELERY_SETTINGS = """
CELERY_WORKER_CONCURRENCY = 1
# Value is in KiB: 12288 KiB = 12 MiB per child process.
CELERY_WORKER_MAX_MEMORY_PER_CHILD = 12288
"""

hooks.Filters.ENV_PATCHES.add_items(
    [
        ("openedx-common-settings", CELERY_SETTINGS),
    ]
)
# Extra arguments appended to the LMS worker's celery command.
hooks.Filters.LMS_WORKER_COMMAND.add_items(
    [
        "--concurrency=2",
    ]
)
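For completeness, here is a sketch of how a single-file plugin like this is typically installed (the commands are from the Tutor CLI; `myplugin.py` is the hypothetical module above, and the exact restart command may vary with your Tutor version):

```shell
# Find Tutor's plugin directory and drop the module there
tutor plugins printroot
cp myplugin.py "$(tutor plugins printroot)/"

# Enable the plugin, regenerate the environment, and restart services
tutor plugins enable myplugin
tutor config save
tutor local restart
```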