Core Contributor Sprint Retro - June 10th - June 24th

Here is the retrospective for the most recent Core Contributor (CC) sprint:

Calls/offers for help

This is what the core team needs help on now. Note that anyone can help – you don’t need to be a Core Contributor (yet!) for that:

Is there anything where you could use help from others? Or that you would like to collaborate on?

We opened a proposal in the BTR wg board: Proposed backport for nutmeg Open edX Filters last batch · Issue #187 · openedx/build-test-release-wg · GitHub We'd appreciate help in reviewing the request. Maria Grimaldi
Same as last week I’d like to find some time with Kyle McCormick to ask some questions about his TestX. I have not reached out to him yet! Oops! Dean Jay Mathew
I’d love for someone who is currently hosting large-scale instances on k8s, or is planning to, to look at the purpose of drydock and give me their opinion Felipe Montoya

What went well?

What went well these last two weeks / sprint?

We have a small team focused on OEP-55 and are making progress. Edward Edward
We finally finished the hooks framework pending implementations! Maria Grimaldi
Régis has been doing great work in Slack and the forums Dean Jay Mathew
The new time scheduling for the meeting Ilaria Botti
I got a lot done on both Listaflow and Paragon :slight_smile: Ali Hugo
Got great help/input on Nutmeg blog posts from Peter - thanks Peter! Tutor release notes are awesome and very helpful, great job Régis & team! Sarina Canelake

Improvements and Blockers?

Did you experience any blockers or have suggestions for improvements?

I’m seeing private links and bare-bones descriptions in a lot of recent commits & PRs. At tCRIL we’re discussing ways to improve things, but we’re also open to ideas. The existing PR template in edx-platform seems to work well for those who are committed to following it, but it’s simply ignored by others. The conventional commit linter generally does its job, but the standard it enforces is very lenient. Kyle McCormick

Hours contributed

The total reported hours contributed for the last sprint: 199 hours

The totals of the hours reported in the last eleven (11) sprints:

Name Total Hours Contributed in the last 11 sprints
Feanil Patel 216
Sarina Canelake 166
Andrés González 147
Piotr Surowiec 127
Xavier Antoviaque 105.15
Ali Hugo 93.82
Kyle McCormick 90
Zia Fazal 88
Gabriel D’Amours 87
Ghassan Maslamani 80
Felipe Montoya 77
Maria Grimaldi 76
Peter Pinch 75
Edward Edward 70
Braden MacDonald 52
Dean Jay Mathew 45
Adolfo Brandes 40
David Ormsbee 40
Juan Camilo Montoya 38
Pierre Mailhot 36
Jillian Vogel 35
Igor Degtiarov 32
Matjaz Gregoric 29
Giovanni Cimolin 28
Kshitij Sobti 23
Sofiane Bebert 20
Jhony Avella 16
Usman Khalid 13
Nizar Mahmoud 10
Steffania Trabucchi 10
Omar Al-Ithawi 7.75
Esteban Etcheverry 7
Nicole Kessler 7
JayRam Nai 4
Ilaria Botti 3
Chintan Joshi 1
Stefania Trabucchi 0

Check-ins

Central points mentioned include scheduling and planning for the July TOC, as well as giving your comments on Xavier’s post about the process for selecting TOC members. All the raw details and extra information can be found here:


@Felipe Do you mean this drydock? I looked at its documentation but honestly couldn’t understand what it’s supposed to do. It seems like an alternative to Terraform; is that right? If so I can’t see the appeal.

@kmccormick Thanks for working on it. This is an important issue to me and many others. Personally, I would focus on the reviewers who are approving these PRs. If the criteria for merging a PR include having a description or following the template, and reviewers are merging PRs without checking that, then it’s likely they are not checking other things expected of a reviewer.

If someone is a “repeat offender” at approving/merging PRs that don’t have any descriptions or are missing other important steps of the code review process, I think it would be totally reasonable to reduce their review privileges for a while (e.g. their review is no longer sufficient for a merge on its own, and someone else also has to review, until they’ve re-established a consistent pattern of thorough reviews).


Hey @braden thanks a lot for looking at this.

I was referring to this drydock. It is a rough first attempt at a repo structured to allow many different large-scale installation references to live in one place.

@kmccormick @braden We discussed this topic during the contributors meetup, and there was a consensus that this should be addressed. We ran a quick show of hands, and everyone involved in code reviews agreed.

The approach you’re suggesting @braden seems good to me – we need to be able to trust that reviewers follow the project’s decisions and guidelines for what gets merged. We could also make the linter or automation stricter, but that often comes with side effects: more false positives, and it could probably be circumvented anyway. For something as obvious as a missing PR description, the responsibility clearly falls on the reviewer not to let it pass.
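For illustration, tightening the automation could look something like the commitlint configuration below. This is a hypothetical sketch only – the project may use a different linting tool entirely, and which rules (if any) to enforce would need discussion first:

```javascript
// commitlint.config.js — hypothetical sketch of stricter commit rules.
// Assumes commitlint with the conventional-commits preset; the project's
// actual linter and preferred thresholds may differ.
module.exports = {
  extends: ['@commitlint/config-conventional'],
  rules: {
    // Require a commit body, so commits carry an explanation,
    // not just a subject line.
    'body-empty': [2, 'never'],
    'body-min-length': [2, 'always', 20],
    // Keep subject lines short and consistently formatted.
    'header-max-length': [2, 'always', 72],
    'subject-case': [2, 'always', 'lower-case'],
  },
};
```

As noted above, though, stricter rules like a minimum body length can be trivially circumvented with filler text, so the reviewer remains the real line of defense.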

It’s definitely important to start by discussing - I’m guessing most of the reviewers who don’t enforce this don’t know that this is a requirement of the project, nor why it is important for the project. Communicating and explaining would be important imho, and I’m sure most reviewers would understand and agree to it.

At the same time, since the project is moving from a single actor to a community, it will be important to draw clear limits around what the project accepts – and to tie continued elevated rights within the project to respecting the project’s rules. (And that is actually a good incentive to participate in the elaboration of those rules.)


@mgmdi Thanks for posting the proposal! I tried to have a look, but I didn’t see an explanation or rationale for the backport? If you could explain a bit more in the PR why the backport is necessary, that would likely help with the review.

I’m not certain what the exact policy is currently for backports in releases, but I think we have mostly kept it to critical bugs or security issues? The drawback of backporting is that it changes the stable release, making it less stable :slight_smile: We did sometimes backport fixes or even features when there was a strong interest or need for them in the community. Maybe it would be worth using the opportunity to define a rule for deciding how to accept or refuse this type of backport request? Maybe using a similar review process as for other PRs, but with a higher minimum number of :+1: from core contributor reviewers? (5?)

I totally agree with @braden and @antoviaque’s comments above.
But a more general question worth answering, given that public PRs have private links:

Why are there private discussions, documentation, refs, etc. that affect public PRs/repos in the first place?

If, for example, we impose a rule against adding private links in PRs/commits, the links might be removed while the referenced material stays private. I just want to avoid that scenario.


I didn’t see an explanation or rationale for the backport? If you could explain a bit more in the PR why the backport is necessary, that would likely help with the review.

For context, this is a backport that was discussed before the Nutmeg release. The filters functionality for the Hooks extension was not going to land in time for Nutmeg, so we agreed that this is a feature we’re happy to backport.

But you are correct, it’s probably worth adding it to the issue for future reference.

I’m not certain what the exact policy is currently for backports in releases

The BTR stance on backporting to releases is that, as long as it doesn’t require data migrations, we will consider it on a case-by-case basis.


Hello! You’re absolutely right, I missed the rationale and a bit more context. I’ll modify the issue for future reference.


FYI @e0d has posted a first PR to begin addressing this: