Building a Custom XBlock for Multi-File, Multi-Language Coding Assessments (HackerRank-style) using Monaco + Judge0

Hi everyone,

I’m currently exploring the development of a custom XBlock that lets learners practice mini-project-style coding assessments directly within Open edX. The idea is to go beyond single-file/single-language inputs and create an experience similar to HackerRank or HackerEarth, where:

  • Learners can write and manage multiple code files per problem
  • The editor supports multiple languages (Python, C++, Java, etc.)
  • Real-time code execution and evaluation are powered by Judge0
  • The interface is built using Monaco Editor (VS Code-like UX)
  • All of this is kept open-source and extensible by the community

What I’ve explored so far:

  • Looked at xblock-ai-evaluation as a reference
  • Monaco is embeddable, but multi-file support and syncing with the backend are tricky
  • Judge0 seems to work well for code execution across languages, but file structuring and per-language runtime handling need careful design (rough submission sketch below)
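
To make the Judge0 point concrete, here’s roughly what my scratch experiments look like. It’s a sketch, not production code: it assumes a self-hosted Judge0 CE instance, and that I’ve read the multi-file docs correctly (language id 89, “Multi-file program”, with a base64-encoded zip of the sources plus a `run` script in `additional_files`). Corrections welcome:

```python
import base64
import io
import zipfile

import requests

JUDGE0_URL = "http://localhost:2358"  # assumption: self-hosted Judge0 CE
MULTI_FILE_LANG = 89                  # "Multi-file program" in Judge0 CE


def submit_multi_file(files, run_script, stdin=""):
    """Zip the learner's files plus a `run` script and submit them to Judge0.

    `files` maps filename -> contents. In multi-file mode Judge0 executes the
    `run` script (and an optional `compile` script) from the submitted zip.
    """
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w") as zf:
        for name, content in files.items():
            zf.writestr(name, content)
        zf.writestr("run", run_script)  # e.g. "#!/bin/bash\npython3 main.py"

    payload = {
        "language_id": MULTI_FILE_LANG,
        "additional_files": base64.b64encode(buf.getvalue()).decode(),
        "stdin": base64.b64encode(stdin.encode()).decode(),
    }
    # wait=true is handy for prototyping; the docs recommend polling the
    # returned submission token asynchronously in production.
    resp = requests.post(
        f"{JUDGE0_URL}/submissions",
        params={"base64_encoded": "true", "wait": "true"},
        json=payload,
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()  # stdout/stderr come back base64-encoded here


# Example: two Python files, one importing the other.
result = submit_multi_file(
    {
        "main.py": "import helper\nprint(helper.add(2, 3))",
        "helper.py": "def add(a, b):\n    return a + b\n",
    },
    run_script="#!/bin/bash\npython3 main.py",
)
print(result["status"], result.get("stdout"))
```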

What I’m trying to build:
An XBlock that can:

  • Let instructors define problems requiring more than one source file
  • Provide starter templates in multiple files (rough data-model sketch after this list)
  • Let students submit code in the language of their choice
  • Automatically evaluate and provide feedback (via test cases or expected output)
  • Be as portable and reusable as possible for the wider edX community
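
For the storage side, the rough data model I have in mind uses the standard XBlock fields API; the field names here are hypothetical, and the shape is just what seems natural for a filename-keyed file tree:

```python
from xblock.core import XBlock
from xblock.fields import Dict, Scope, String


class MultiFileCodeXBlock(XBlock):
    """Sketch: per-problem starter files plus a per-student working copy."""

    display_name = String(default="Coding Assessment", scope=Scope.settings)

    # Instructor-authored starter templates, keyed by language then filename,
    # e.g. {"python": {"main.py": "...", "helper.py": "..."}}.
    starter_files = Dict(default={}, scope=Scope.content)

    # The learner's working copy: filename -> contents.
    student_files = Dict(default={}, scope=Scope.user_state)

    # Language the learner picked for this attempt.
    language = String(default="python", scope=Scope.user_state)

    @XBlock.json_handler
    def save_files(self, data, suffix=""):
        """AJAX endpoint the Monaco frontend would call to persist its tabs."""
        self.student_files = data.get("files", {})
        self.language = data.get("language", self.language)
        return {"saved": True, "file_count": len(self.student_files)}
```

On the frontend, my current thinking is one Monaco model per file (monaco.editor.createModel), swapping the active model on tab click and serializing all models back through save_files on save/run. Very keen to hear if anyone has a better pattern.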

What I need help with:

  • Has anyone already attempted something similar?
  • Are there other XBlocks I should look into for inspiration (like codeeditor, problem-builder, or xblock-jsinput)?
  • Guidance on handling:
    • Backend design for storing and submitting multiple files
    • Frontend integration of Monaco in multi-tab/file mode
    • Secure, scalable integration with Judge0 (possibly with Docker isolation? see the rate-limit sketch below)
  • Contributors or collaborators who might be interested in helping shape this as an open-source plugin
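
On the Judge0 security point specifically, my instinct is to keep Judge0 on a private network and only reach it from a server-side XBlock handler, with a per-user rate limit in front so the browser can never hammer the executor directly. A naive sketch of that limit, assuming the Django cache that edx-platform already provides (the key name and window are made up):

```python
import time

from django.core.cache import cache  # available inside edx-platform

RATE_LIMIT_SECONDS = 10  # at most one run per learner per 10s; made-up number


def allow_run(user_id):
    """Return True if this learner may trigger another Judge0 run right now.

    cache.add() only sets the key when it is absent, so on atomic backends
    (memcached/Redis) concurrent requests from one learner race safely.
    """
    return cache.add(f"judge0-rate:{user_id}", time.time(), RATE_LIMIT_SECONDS)
```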

Would love any thoughts, suggestions, or pointers from the community. If someone has experience with building similar tooling or has time to brainstorm, I’d greatly appreciate a conversation.

Thanks in advance!

— Yahya Adhikari

(Open Source Enthusiast | Educator | Developer)

Hi Yahya, we’ve built something similar at Edly before; it leverages Jupyter to support such assessments.
Here is the Jupyter XBlock: https://github.com/overhangio/jupyter-xblock (it integrates JupyterHub notebooks with Open edX).

Hi Tahir, thanks a lot for pointing me to the jupyter-xblock repo. I checked it out and it looks useful. I do have a question about the resources side, though: I’m planning assessments for around 500 students concurrently (in batches). Would running them with a Jupyter-based approach be too resource-intensive at that scale?

Specifically, I’m worried about things like per-user kernel memory/CPU, container churn, disk I/O, and how to safely isolate and rate-limit test runs. If you have experience with deployment patterns that handle this scale, I’d really appreciate any practical advice or the rough sizing/tradeoffs you’ve seen work in production.
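
For context, my own back-of-envelope (the per-kernel numbers are guesses, please correct them): at roughly 512 MB and 0.5 vCPU per idle Jupyter kernel, 500 concurrent learners would be on the order of 250 GB RAM and 250 vCPUs before any test run spikes; batching into groups of 100 would cut that to roughly 50 GB / 50 vCPUs per batch. I’d love to know whether those per-kernel figures are even in the right ballpark.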

Thanks again for the pointer. Happy to continue this thread or hop on a quick call to brainstorm if you’re free.