This is the retrospective thread for the most recent Core Contributor (CC) sprint, from July 22nd to August 5th.
Raw retrospective report
Note that the raw results of the checkins are now available in Listaflow, and visible at any time (in reports → compare):
NB: @Ali @Cassie Several people have mentioned that this isn’t easy to find – including me; I had to look for a bit before finding the raw results in the “compare” tab. The overview & trends views contain some information, but currently only a small proportion of the questions produce useful information on the “Trends” page.
Overview of the check-ins
Since there weren’t any blockers or improvement suggestions this sprint, I’ll focus instead on giving an overview of the work reported last sprint.
Despite this being one of the slowest core contributor sprints, due to August vacations and people being sick, we still had 68h contributed, split between the following main projects:
Product PR review workflow improvements (Natalia, Sarina, Xavier) → There are ongoing discussions on how to improve product reviews in OSPRs, especially in terms of ensuring prompt reviews, and incorporating the changes to the Open edX project from the 2U/tCRIL split & the core contributor program.
“Report” shows the “overview”, “trends”, and “compare” views that are specific to this run of the “Sprint Retrospective & Planning” template. (P.S. The user can also access this view by clicking the “Reports” item in the main navigation, but they would then need to adjust the filters to find the report for this specific run.)
The updates described above should make it easier for users to find the various pages of the run report, as well as other users’ responses.
However, the updates don’t solve the issue of the “Trends” page, and how only a small percentage of the question results are useful. @antoviaque Am I right in thinking that it’s specifically the results from free-form text questions that are not meaningful?
@Ali Thank you for this preview – those look like great ways to make things clearer, with clearer paths between the different types of “views” for a run (my response, all responses, report).
Yes – since most of the questions on these retrospective questionnaires are free-form, the report has very little usable information. That means going to the “Compare” tab, which shows a busy table with multiple runs as well as empty entries – so you have to dig through the page to find the information. In that regard, the “All responses” tab on the runs seems like a good improvement.
Do you think that would make the tables easier to scan?
P.S. Sometime in the future (when time allows) I’ll ask a few Core Contributors to do a mini user test where they record themselves whilst going through the Sprint Retrospective user flow. That should give us some valuable feedback to work with.
@ali_hugo So far I’ve mostly needed to look at one report at a time, so yes, the second one would currently be more useful to me. I can see how it could be useful to compare over time though – maybe they could be different views, like different tabs of the run (“All responses” vs “Previous responses”?), or part of the filters/options on the “All responses” tab?
I’ll need to think this through because the Reports page (accessed from the main menu) and the run report currently work in a slightly different way (in fact, the “All Responses” page isn’t even accessible from the main Reports page at the moment). I have a feeling there’s a way to better standardise the two views of the Reports.
I’m going to schedule some time in a future sprint to take another look at this. Thanks for bringing these UX niggles to my attention!