How to implement Redis instead of memcached

There is a delay that can last several hours between the moment a course is edited in Studio and the moment the change is reflected in the LMS, so we would like to use Redis instead of memcached to try to solve this issue.

The current workaround is restarting memcached (on devstack or Tutor, respectively):

docker restart edx.devstack.memcached
tutor local restart memcached

Open edX uses memcached via Django, so it should be a matter of adding a new container for Redis, making sure you have the appropriate driver for it, and then changing the CACHES entry in your Django configuration to point to it. That said, it seems unlikely that a publishing delay would be caused by memcached. Can you describe in more detail what's happening (setup, Open edX version, error logs, etc.)?
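For reference, a minimal sketch of what that CACHES change could look like. The hostname `redis` and the database index are assumptions for illustration; Django 4.0+ ships this Redis backend natively, while older versions need a third-party package such as django-redis:

```python
# Hypothetical Django settings fragment: point the default cache at Redis.
# Assumes a Redis container reachable at hostname "redis" on the default port.
# Requires the redis client library (pip install redis) and Django >= 4.0.
CACHES = {
    "default": {
        "BACKEND": "django.core.cache.backends.redis.RedisCache",
        "LOCATION": "redis://redis:6379/1",  # db index 1 is an arbitrary choice
    }
}
```

Exact container names and how this setting is injected will differ between devstack and Tutor, so treat this as a shape to aim for rather than a drop-in config.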

Thank you.

While I gather the information you asked for, here is a link to another person experiencing the same issue:

BTW, when you see the delay in changes, are you looking at the course outline or the courseware directly? There are some things that are built from data triggered by publish (e.g. the course title on the learner dashboard, the mobile view, the course outline) and might not be updating the cached value properly, causing that stale value to remain until the cached value times out. If it's Block Structures related (e.g. the course outline or mobile view), that could be up to 24 hours.

But if this is the problem you’re seeing, going to the courseware content directly in the LMS should show the changes immediately, as that pulls directly from the modulestore (i.e. the same thing Studio is looking at) and not from data that is compiled after publish into Block Structures.
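To illustrate the general effect being described, here is a toy cache-aside store in Python. This is not Open edX's actual Block Structures code, just a sketch of the symptom: once a value is cached, later changes to the source stay invisible until the TTL expires, no matter which backend holds the cache.

```python
import time

class TTLCache:
    """Toy cache-aside store: a cached value is served until its TTL
    expires, even if the underlying source has changed in the meantime.
    Illustrative only; not the real edx-platform implementation."""

    def __init__(self):
        self._store = {}  # key -> (value, expiry timestamp)

    def get_or_set(self, key, compute, ttl):
        now = time.monotonic()
        entry = self._store.get(key)
        if entry is not None and entry[1] > now:
            return entry[0]           # fresh by TTL, possibly stale by content
        value = compute()             # recompute from the source of truth
        self._store[key] = (value, now + ttl)
        return value

# The outline is "published" once, cached, then edited:
source = {"outline": "v1"}
cache = TTLCache()
first = cache.get_or_set("outline", lambda: source["outline"], ttl=86400)
source["outline"] = "v2"              # author publishes a change
second = cache.get_or_set("outline", lambda: source["outline"], ttl=86400)
# second is still "v1": the old value is served until the TTL lapses.
```

Swapping memcached for Redis changes where `self._store` lives, not this logic, which is why the stale-outline symptom would survive the migration.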

Once you are in the courseware everything seems fine; the problem is in the outline, which would be this URL:
/courses/course-v1:demo+demo+demo/course/

Yeah, that definitely sounds like a Block Structures issue and will not be fixed by switching your caching backend. Knowing the version of Open edX you’re running and any relevant log data generated when you publish would be helpful.