Hi all, I’m looking for a way to implement a robots.txt file so Google doesn’t crawl my dev sites. I’ve found some short answers elsewhere, but they weren’t very helpful. How are others achieving this?
Hello @Rick_Sauer and welcome to Open edX community!
I’ve written a blog post that covers exactly this, which you can find here:
How to add a robots.txt file in Open edX
I hope this will help!
Thanks for the welcome!
The article is great and I can see where the file goes; the step I’m not 100% sure about is:
Go to the edx-platform/lms/urls.py file and add the following code
Currently running Tutor version 16.1.4
the step I’m not 100% sure about is: Go to the edx-platform/lms/urls.py file and add the following code
This blog post was written in the pre-Tutor era.
You can either edit your lms/urls.py file directly, or create an Open edX plugin and add your URL pattern to the LMS through it. You can find the plugin creation guide here.
The code is pretty simple: it just adds a URL pattern for your robots.txt file so that the Django project can locate and render it.
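For anyone reading along, here is a minimal sketch of what such a URL pattern in lms/urls.py could look like. This is an illustration, not the exact code from the blog post: the view name `robots_txt` is an assumption, and the response body shown is the standard "disallow everything" robots.txt you'd typically want on a dev site.

```python
# Hypothetical sketch for lms/urls.py -- names are illustrative, not from the blog post.
from django.http import HttpResponse
from django.urls import path


def robots_txt(request):
    # A dev site usually wants to block all crawlers entirely.
    return HttpResponse("User-agent: *\nDisallow: /\n", content_type="text/plain")


urlpatterns = [
    # Crawlers like Googlebot request this path at the site root.
    path("robots.txt", robots_txt, name="robots_txt"),
]
```

If you'd rather keep the rules in a file instead of hardcoding the string, Django's `TemplateView` with `content_type="text/plain"` is a common alternative.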