Proposal: Multi-Question Problem Authoring in Open edX

Hi Educators,

I wanted to share an improvement we’re proposing for the Open edX platform. Currently, the problem editor only supports creating single-question problems, which can limit how educators assess complex learning outcomes.

We’re proposing to enhance the editor so that instructors can create multi-question problems within a single block, each with its own type and linked scoring, but submitted together as one cohesive unit.

Here’s a link to the full proposal in Confluence: Multi-question problem editor proposal

We’d love to hear your feedback:

  • Do you have any insights about how you currently manage these scenarios, and would you like to see this feature developed?

Your input will help us refine and prioritize this proposal. Thanks in advance for sharing your experiences and thoughts!


@egordon - wanted to flag this to you and your team!

I’ve run into a version of what you are trying to solve, I think.

In the old editor, I used to nest several questions into one block. It was mostly for aesthetic reasons, so that there was one "submit" button for several questions.
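For anyone unfamiliar with the pattern, this is roughly what that nesting looks like in OLX, the XML format problems are authored in. This is just an illustrative sketch (the question content is made up, and element details can vary by release): a single <problem> block holding two response elements, which learners answer together and submit with one button.

```xml
<!-- One problem block containing two questions. Learners answer both
     and submit once; analytics sees the whole block as one problem. -->
<problem>
  <!-- Question 1: multiple choice -->
  <multiplechoiceresponse>
    <label>Which planet is closest to the Sun?</label>
    <choicegroup type="MultipleChoice">
      <choice correct="true">Mercury</choice>
      <choice correct="false">Venus</choice>
    </choicegroup>
  </multiplechoiceresponse>

  <!-- Question 2: numerical input, graded in the same submission -->
  <numericalresponse answer="8">
    <label>What is 3 + 5?</label>
    <formulaequationinput/>
  </numericalresponse>
</problem>
```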

But I largely stopped that because of the effect on analytics. Analytics treated one block as one question, so I couldn’t get granular about each question when I grouped them all together like this.

So my feedback would be: if you do move forward with this, please try to have analytics treat each question separately.

Thanks!


+1 on this. @Chelsea_Rathbun is aware of the open analytics issues/questions on this proposal. Thanks for flagging!

I rewatched the Educator Working Group meeting recording and consolidated the feedback on the Multi-Question Problem proposal. Sharing here for tracking:

  • Immediate feedback: How will feedback work in multi-question problems? If there is a single submit button for a group of questions, showing feedback only after the single submit can harm the student experience by removing immediate feedback. Mixed behavior within one block (some questions show feedback, others do not) is confusing. This needs closer analysis.

  • Analytics (Aspects): Aspects should support per-question analytics for multi-question problems; today it aggregates data at the block level only.

  • Configuration scope: To avoid overload, prefer block/quiz-level settings (e.g., show answer, feedback timing, scoring) and limit per-question options. If granular per-question behavior is needed, authors can fall back to the one-question-per-problem option.

  • Rescore/Reset: Clarify how rescore and reset will behave for multi-question blocks. If applied to the block, they may affect all questions at once, which is not always desired.

Hi everyone,

This feature is really valuable, and we also see a strong need for it on our side. Grouping multiple related questions into a single problem block would greatly improve the learner experience and give course teams more flexibility.

Is there an existing GitHub PR, or any development work already started for this?
If so, could someone please share the link?

We’d be happy to follow the progress or even contribute in any way we can.

Thanks again for pushing this forward!


Hello!

No development has started on this yet, though we are planning to put more design work into it to get it closer to something that can be picked up for development.

There may be ways development could get started sooner, but I think we first want full, confirmed product proposal acceptance, which we are working through under the new process over the coming weeks/months.