Same as last week, I’d like to find some time with Kyle McCormick to ask some questions about his TestX. I have not reached out to him yet! Oops!
Dean Jay Mathew
I’d love for someone who is currently hosting large-scale instances on k8s (or is planning to) to look at the purpose of drydock and give me their opinion.
What went well?
What went well these last two weeks / sprint?
We have a small team focused on OEP-55 and are making progress.
We finally finished the hooks framework pending implementations!
Régis has been doing great work in Slack and the forums
Dean Jay Mathew
The new time scheduling for the meeting
I got a lot done on both Listaflow and Paragon
Got great help/input on Nutmeg blog posts from Peter - thanks Peter! Tutor release notes are awesome and very helpful, great job Régis & team!
Improvements and Blockers?
Did you experience any blockers or have suggestions for improvements?
I’m seeing private links and bare-bones descriptions in a lot of recent commits & PRs. At tCRIL we’re discussing ways to improve things, but we’re also open to ideas. The existing PR template in edx-platform seems to work well for those who are committed to following it, but it’s simply ignored by others. The conventional commit linter generally does its job, but the standard it enforces is very lenient.
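For illustration, a stricter commit-message check could look something like the sketch below. This is a hypothetical standalone checker, not the actual linter configuration used in edx-platform — the type list, subject-length bounds, and body requirement are all assumptions for the example:

```python
import re

# Hypothetical, stricter commit-message check (a sketch; the actual
# linter and rules used in edx-platform may differ).
TYPES = ("build", "chore", "docs", "feat", "fix", "perf", "refactor", "test")
HEADER_RE = re.compile(r"^(%s)(\([\w.-]+\))?: .{10,72}$" % "|".join(TYPES))

def check_commit_message(message: str) -> list:
    """Return a list of problems; an empty list means the message passes."""
    lines = message.splitlines()
    problems = []
    # Header: "<type>(<scope>): <subject>", with a reasonably long subject.
    if not lines or not HEADER_RE.match(lines[0]):
        problems.append(
            "header must be '<type>(<scope>): <subject>' with a 10-72 char subject"
        )
    # Body: require at least one non-blank line after the header,
    # so the commit explains *why* the change is needed.
    body = [l for l in lines[2:] if l.strip()]
    if not body:
        problems.append("commit body is required: explain why the change is needed")
    return problems
```

A check along these lines would reject a bare `fixed stuff` commit while accepting one with a typed header and an explanatory body.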
The total reported hours contributed for the last sprint: 199 hours
The totals of the hours reported in the last eleven (11) sprints:
@Felipe Do you mean this drydock? I looked at its documentation but honestly couldn’t understand what it’s supposed to do. It seems like an alternative to Terraform; is that right? If so I can’t see the appeal.
@kmccormick Thanks for working on it. This is an important issue to me and many others. Personally I would focus on the reviewers who are approving these PRs. If the criteria for merging a PR include having a description or following the template, and reviewers are merging those PRs without checking that, then it’s likely they are not checking other things which are expected of a reviewer.
If someone is a “repeat offender” with approving/merging PRs that don’t have any descriptions or are missing any other important steps of the code review process, I think it would be totally reasonable for them to have their review privileges reduced for a while (e.g. their review is no longer sufficient for merge on its own, and someone else also has to review, until they’ve shown a consistent pattern again).
@kmccormick@braden We discussed this topic during the contributors meetup, and there was a consensus that this should be addressed. We ran a quick show of hands, and everyone involved in code reviews agreed.
The approach you’re suggesting @braden seems good to me – we need to be able to trust that reviewers follow the project’s decisions and guidelines for what gets merged. We could also make the linter or automation stricter, but that often comes with side effects (more false positives, and it could probably be circumvented). On something obvious like the lack of a proper PR description, the responsibility is clearly on the reviewer not to let it pass.
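To make the automation side concrete: a CI job could refuse to go green when the PR description is empty or is just the untouched template. The sketch below is purely hypothetical — the threshold and the filtering rules are assumptions, not an existing edx-platform check:

```python
def description_ok(body: str, min_chars: int = 50) -> bool:
    """Return True if the PR body looks like a real description.

    min_chars is an assumed threshold, not an existing project rule.
    """
    meaningful = []
    for line in body.splitlines():
        stripped = line.strip()
        # Skip blank lines, lines that are unedited HTML comments from the
        # PR template, and bare template headings.
        if not stripped or stripped.startswith(("<!--", "#")):
            continue
        meaningful.append(stripped)
    return len(" ".join(meaningful)) >= min_chars
```

A CI step could feed the PR body (e.g. from the event payload) into this function and fail the check when it returns False, which shifts part of the enforcement off individual reviewers.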
It’s definitely important to start by discussing - I’m guessing most of the reviewers who don’t enforce this don’t know that this is a requirement of the project, nor why it is important for the project. Communicating and explaining would be important imho, and I’m sure most reviewers would understand and agree to it.
At the same time, since the project direction is moving from a single actor to a community, it will be important to be able to draw clear limits to what is accepted by the project – and it would be important to tie continued elevated rights within the project to respecting the project’s rules. (And it is actually a good incentive to participate in the elaboration of those rules.)
@mgmdi Thanks for posting the proposal! I tried to have a look, but I didn’t see an explanation or rationale for the backport. If you could explain a bit more in the PR why the backport is necessary, that would likely help with the review.
I’m not certain what the exact policy currently is for backports in releases, but I think we have mostly kept them to critical bugs or security issues? The drawback of backporting is that it changes the stable release, making it less stable. We did sometimes backport fixes or even features when there was a strong interest or need for them in the community. Maybe it would be worth using this opportunity to define a rule for how to accept or refuse this type of backport request? Maybe using a similar review process as for other PRs, but with a higher minimum number of reviews from core contributors? (5?)
I didn’t see an explanation or rationale for the backport? If you could explain a bit more in the PR why the backport is necessary, that would likely help with the review.
For context, this is a backport that was discussed before Nutmeg release. The filters functionality for the Hooks extension was not going to land in time for Nutmeg so we agreed that this is a feature that we’re happy to backport.
But you are correct, it’s probably worth adding it to the issue for future reference.
I’m not certain what the exact policy is currently for backports in releases
The BTR stance on backporting to a release is that as long as it doesn’t require data migrations, we will consider it on a case-by-case basis.