@arbrandes I’ve followed your suggestion and reviewed the Maple Retrospective
For each issue identified in the Maple release process, the notes below give the idea to improve, how Nutmeg and Olive fared, and the plan for Palm.
Issue: Work tends to be done at the last minute, close to meetup dates and release dates.
Idea to improve: Start working on the next release now.
Nutmeg: The testing period was divided into two (2) phases and announced in Discuss: a 5-week testing phase from 25 April 2022 to 1 June 2022, and a 1-week bug triaging phase from 2 June 2022 up to the official launch of Nutmeg on 9 June 2022. We had completed 90% of test cases by the end of the first phase. Within 48 hours of the Nutmeg launch, the test completion status had jumped to 94.8% (excluding Tutor upgrades as test cases), with 220/228 manual test cases performed. This was a good performance, but there was still some last-minute bug fixing.
Olive: The testing period was about 1 week longer for Olive than for Nutmeg. Olive testing was divided into three (3) phases and announced in Discuss. TEST SPRINT 1 of 3 ran for 4 weeks, from Friday 21 October 2022 to Friday 18 November. TEST SPRINT 2 of 3 ran for 2 weeks, from Friday 18 November to Friday 2 December. TEST SPRINT 3 of 3 ran for 1 week, from Friday 2 December to Friday 9 December (the Olive release date). Unexpectedly, Pierre and Fayyaz steamed ahead, and within 1 week of testing, 90.3% of all test cases had been completed: after only one week, 206 out of 228 test cases had already been tackled, putting us way ahead of schedule. We rested easy after that. This gave us more time for bug triaging, which we actually left quite late; so much so that we delayed the launch of Olive by a few days in order to add some last-minute fixes, which was a good idea in hindsight. For Palm, I shall focus more on coordinating GitHub issues from the very beginning, as soon as they arrive. This should quell any possible delays.
Plan for Palm: The test cases for Olive were not 100% accurate because I did not take the new MFEs into consideration. For Palm, the test cases will be updated so there’s no confusion. @ARM also mentioned he may help by suggesting some automated testing ideas. Usually we steer clear of this, but surely some automated testing would be of value, where it makes sense. Moodle does some automated testing, for example.
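As an aside, the completion figures quoted above (e.g. 206 of 228 test cases ≈ 90.3%) are simple to derive and track with a small helper. A minimal sketch, assuming a hypothetical list of per-case statuses (the status names here are illustrative, not from any actual test plan):

```python
# Minimal sketch: summarize manual test-case completion for a release test sprint.
# The status values ("passed", "failed", "not run") are hypothetical examples.

def completion_rate(statuses):
    """Return (completed, total, percent), counting any executed case as completed."""
    total = len(statuses)
    completed = sum(1 for s in statuses if s != "not run")
    percent = round(100 * completed / total, 1) if total else 0.0
    return completed, total, percent

# Example roughly matching the Olive sprint 1 figures: 206 of 228 cases executed.
statuses = ["passed"] * 200 + ["failed"] * 6 + ["not run"] * 22
print(completion_rate(statuses))  # → (206, 228, 90.4)
```

(The helper rounds to one decimal, so 206/228 prints as 90.4 rather than the truncated 90.3 quoted above.)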
Issue: As a result of the above, there is not much happening asynchronously, and progress is very sporadic, which leads to intense crunch time.
Idea to improve: Introduce an issue reviewer role. Any BTR member can be an issue reviewer; the goal is to have someone who keeps track of a certain number of issues and makes sure people provide updates outside of the meetup.
Nutmeg: I don’t recall who performed this role, as I was not monitoring GitHub. Sofiane was likely involved.
Olive: Adolfo encouraged me to tag priority issues (#OliveReleaseBlocker, or something to that effect) so they could be crushed on time. That worked well, as Maria, Ghassan, and others came in hot to fix everything for us!
Plan for Palm: I will work closely with Jorge for Palm.
Issue: There is not much coordination in the testing process.
Idea to improve: Deploy a testing instance and engage with the community to help with testing.
Nutmeg: Régis deployed a test instance for the community, and I performed the role of release testing coordinator. An identifiable roadblock, from my perspective, was that I only coordinated test cases and did not coordinate any bug triaging in GitHub directly.
Olive: Régis again deployed a test instance. This time, with help and suggestions from Adolfo and others, I also coordinated the bug triaging in GitHub. This was a marked improvement from a coordination perspective. However, I did not post as many updates during testing as I did with Nutmeg, because I was busy that month, plus the vast majority of testing was completed within the first week! For Palm, I shall post regular updates, regardless of how much progress is made.
Plan for Palm: Coordination should work like a well-oiled machine now.
We did a good job with testing Olive; I’m impressed with the team effort. We have a large test team and some superstars who make it all work like clockwork. I’d like to start reviewing the Palm test cases much earlier this time, which means I need to coordinate with @regis to kindly ask him to set up the Palm test instance much earlier than we did for Nutmeg and Olive.
I think my main retro takeaway for Olive is that it’s hard to do a retro this long after the release. For Palm, let’s start a document when we start the process (in April) for the group to share retro notes as we go.
We kept to our schedule (slightly modified to account for holidays).
We managed to include new MFEs despite significant gaps in their development (missing required features, missing documentation, general missing information).
Let’s review the calendar for Palm now, and choose dates that avoid holidays.
I regret there was a bug fix made in November in master that I didn’t advocate for adding to Olive. In retrospect, I wish I had done two things:
- Added a test to the testing plan as soon as I realized this was a bug in master (probably should still do this, for Palm)
- Opened an issue in BTR
Let’s review the possible slate of new MFEs for Palm now and coordinate on the basics:
- Review our requirements list: are the requirements correct, complete, and clear?
- Review MFE candidates for Palm and identify requirement gaps
It’s not just MFEs that are confusing. In producing the release notes, I tried to review all the “feature” commits in all the Olive repos. There are many repos whose purpose I couldn’t figure out. I know a lot of effort is going into determining which repos need to be in the openedx org; it would be helpful to review and update repo READMEs as well.