Coming from “Prioritizing volunteer contributions in free software development”, the Wikimedia Foundation allowed me to spend time earlier in 2016 on research about code review (CR). The theses and bullet points below draw on assorted literature and comments from numerous people.
While the results might also be interesting for other free and open source software projects, they might not apply to your project for various reasons.
At Wikimedia we would like to review and merge better code faster, especially patches submitted by volunteers. Code review should be a tool, not an obstacle.
Benefits of code review are knowledge transfer, increased team awareness, and finding alternative solutions. Good debates help to reach a higher standard of coding and drive quality.[A1]
I see three dimensions of potential influential factors and potential actions (that often cannot be cleanly separated):
In general, “among the factors we studied, non-technical (organizational and personal) ones are better predictors” (meaning: possible factors that might affect the outcome and interval of the code review process) “compared to traditional metrics such as patch size or component, and bug priority.”[S1]
An unstructured review approach potentially demotivates first patch contributors, but fast and structured feedback is crucial for keeping them engaged.
Set up and document a multi-phase, structured patch review process for reviewers: Three steps proposed by Sarah Sharp for maintainers / reviewers[A2], quoting:
Not enough skillful or available reviewers and potential lack of confident reviewers[W1]? Not enough reviewers with rights to actually merge into the codebase?
Lack of repository owners / maintainers, or under-resourced or unclear responsibilities when everyone expects someone else to review. (For the MediaWiki core code repository specifically, see related tasks T115852 and T1287.)
“Changes failing to capture a reviewer’s interest remain unreviewed”[S3] due to self-selecting process of reviewers, or everybody expects another person in the team to review. “When everyone is responsible for something, nobody is responsible”[W4].
Hard for new contributors to identify and add good reviewers.
“choice of reviewers plays an important role on reviewing time. More active reviewers provide faster responses” but “no correlation between the amount of reviewed patches on the reviewer positivity”.[S1]
Due to unhelpful reviewer comments, contributors spend time on creating many revisions/iterations before successful merge.
Prioritization / weak review culture: is there more pressure to write new code than to review contributed patches? Code review “application is inconsistent and enforcement uneven.”[W8]
Workload of existing reviewers; too many items on their list already.
Reviewer’s Queue Length: “the shorter the queue, the more likely the reviewer is to do a thorough review and respond quickly” and the longer the more likely it takes longer but “better chance of getting in” (due to more sloppy review?)[S1].
Due to poor quality of contributors’ patches, reviewers spend time on reviewing many revisions/iterations before a successful merge. This might make reviewers ignore a patch instead of reviewing it again and again and giving yet another negative CR-1 review.
Likelihood of patch acceptance depends on developer experience and patch maturity; review time is impacted by submission time, number of code areas affected, number of suggested reviewers, and developer experience.[S7]
Hard to realize how (in)active a repository is for a potential contributor.
Changesets are rarely picked up by other developers[WB]. After merging, “it is very difficult to revert it or to get original developers to help fix some broken aspect of a merged change”[WB], which reflects on the follow-up fixing culture.
Hard to find existing “related” patches in a certain code area when working on your own patch in that area, or when reviewing several patches in the same code area. (Hence there might also be some potential rebase/merge conflicts[WB] to avoid if possible.)
Lack of synchronization between developer teams: team A stuck because team B doesn’t review their patches?
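For the “related patches” problem above, one partial aid is Gerrit’s search syntax, which can list open changes touching a given code area. A sketch (the project name and file path here are made-up examples, not actual Wikimedia paths):

```
status:open project:mediawiki/core file:^includes/api/.*
```

Running such a query before or while reviewing a patch in that area surfaces other open changes that might conflict or overlap.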
Please comment on which important factors that you have experienced are missing!
No idea if this is useful to anyone but it was an interesting exercise.
By default I have disabled storing cookies in my main web browser. I have a custom list of specific web sites that I allow to set cookies. (Whether that makes any sense regarding all the other data your browser sends which might create a unique fingerprint anyway is a different question up to your personal judgement/opinion on “privacy” and not the topic here.)
Ideally that whitelist would only include web sites that use my data in a way that I can agree with. In reality, services exist that could either be considered convenient (like Facebook; if you want to use their services you could use a private browser session every time and reenter your password, or use a separate browser to isolate Facebook’s cross-site cookie pollution) or services that your employer or customers use or expect for whatever reasons.
Google Hangouts video calls and Google Hangouts text chats (which became proprietary after Google dropped XMPP) are used by some of my co-workers.
I have been wondering for a while which specific Google sites to allow setting cookies in order to be able to use these services but could not find information on the web. Google lists a bunch of domains but that list seems neither specific nor complete.
Going for trial and error, I removed any Google cookies (which might require more than a simple string search, due to sites such as accounts.youtube.com), removed any potential rules allowing Google cookies, set my browser to not allow any cookies, and then saw how far I could get, working around random error messages and getting logged out immediately after having logged in.
I ended up allowing the sites accounts.google.com, client-channel.google.com, clients[1-6].google.com, hangouts.google.com, people-pa.clients[1-6].google.com, plus.google.com, talkgadget.google.com to set cookies. Some of these were trickier to find but your web browser’s developer tools allow you to check which sites want to set cookies.
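If you go through the same trial-and-error exercise, the developer-tools output tends to accumulate duplicates and leading-dot variants of the same host. A minimal sketch (the `build_allowlist` helper is hypothetical, written just for this post) of turning those observations into a clean allowlist:

```python
def build_allowlist(observed_hosts):
    """Strip leading dots, lowercase, deduplicate, and sort cookie hosts
    as observed in the browser's developer tools."""
    return sorted({host.lstrip(".").lower() for host in observed_hosts})

# Example observations, including a leading-dot and a mixed-case duplicate:
observed = [
    ".accounts.google.com",
    "accounts.google.com",
    "Hangouts.google.com",
    "client-channel.google.com",
    "talkgadget.google.com",
]

print(build_allowlist(observed))
# → ['accounts.google.com', 'client-channel.google.com',
#    'hangouts.google.com', 'talkgadget.google.com']
```

The resulting list is what then goes into the browser’s per-site cookie rules.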
And now back to actual work.
At Wikimedia, for the past few months I’ve been on and off rewriting our on-wiki technical Gerrit/Git/Code Review documentation.
That included improving the onboarding steps like setting up Git and Gerrit (related task; 135 edits), the contribution guidelines and expectations for patch authors (related task; 28 edits), and to some extent the guidelines for patch reviewers (related task; 23 edits).
Among the potential next steps there is agreeing on a more structured, standardized approach for reviewing code contributions. That will require engineering and development to lead efforts to have teams follow those guidelines, to establish a routine of going through unreviewed patches, and other potential iterative improvements.
I’m not a person who carries around a laptop, and I don’t use mobile phones much. The more text/comments to tackle (or separate pages covering related topics), the more I prefer working on paper. (That’s also how I started high-level planning the GNOME Evolution user docs rewrite.)
It might be archaic, but paper allows me to get an overview of several pages/documents at the same time. (I could probably also buy more or bigger screens?) I can mark and connect sections that are related and should not be in four different places (like troubleshooting-related information or operating-system-specific instructions). Plus, trying to be accountable and transparent, I end up performing lots of small atomic changes with proper change summary messages, so I can cross out sections on paper that are done on the wiki.
Paper especially works for me when thinking about topics that still require finding an approach. So I end up in the park or in a pub.
In a future blog post I’m going to cover what I’ve learned about aspects and issues of code review.
Filthy attempts on the unconference session scheduling whiteboard by so-called “friends” trying to trick me into literally ‘something’.
They won’t succeed.
In late March 2016, I attended some Wikimedia gatherings in the Middle East: The WikiArabia conference in Amman (Jordan), a Technical Meetup in Ramallah (Palestinian territories), and the Wikimedia Hackathon in Jerusalem (Israel).
I gave an introduction to the many technical areas in Wikimedia anyone can contribute to. I also gave an introduction to Phabricator, the project management suite used (mostly for technical aspects) by the Wikimedia community, which allows managing and following progress in projects and collaborating with developers.
As I love discussing society and politics, I was initially not sure how open and blunt conversations would be. But on the first evening I was already sitting together with folks from Tunisia, Egypt, and Saudi Arabia who were comparing the situations in their home countries. People also allowed me to learn a little bit about what daily life is like in Iraq or Saudi Arabia.
After a short trip to Petra, we spent an entire day getting to and crossing the border between Jordan and the West Bank. Considering the mere distance, that feels ridiculous. It definitely makes you appreciate open borders.
Afterwards, we were very lucky that Maysara (one of our hosts) took the time and his car to drive us around in the West Bank to visit a bunch of spots, pass settlements, look at walls, or wonder which streets to take (sometimes a checkpoint with a soldier pointing a machine gun at you helps you make decisions).
At some point, Maysara simplified it in a single quoted sentence: For Israelis it’s fear. For Palestinians it’s humiliation.
On the last day I visited the Yad Vashem Holocaust memorial with some co-workers (thanks to Moriel for organizing it). It’s obviously an activity you cannot “look forward to”. I am still impressed by our guide who explained and summarized history extremely well.
The architecture of Yad Vashem makes you go through several rooms on the left and right of the passageway in chronological order, and our guide mentioned several times that you “cannot yet see what is coming a few rooms (meaning: a few years) later”. The question “Why did Jewish citizens not flee?” got answered with “Where would you try to escape to, if even outside of ghettos and concentration camps everybody is hostile?” That explained very well the self-understanding behind founding a state for Jews.
I am incredibly thankful to the many great people I could meet and who shared their points of view on the social and political situation, always in a pretty reflective and respectful way despite all the surrounding frustration.
And whatever my question was to locals, the answer pretty much always was “It’s more complicated than you thought.”
In a society where the path of welfare could be expressed as “walk → motorbike → car”, I received some grins when admitting I had never ridden a motorbike before. In Indian traffic, I’d call that quite an experience for a tourist like me.
As usual, it’s wonderful to finally meet folks in person who you’ve only spoken to online beforehand, and to hang out with old friends. (I sound like a broken record here. I am sorry I could not see everybody. I’ll be back.)
Sometimes, when some individual, group, or institution publishes, releases, or leaks a cruel video of this planet’s conflicts, media decide not to show it, or to show only an (edited) screenshot.
I usually end up trying to find the uncut video on the internet because I naively believe I can better realize how cruel things are when I force myself to watch it.
So I constantly feel the need to defend my behavior against accusations of voyeurism and dehumanization: “Can’t you imagine the rest anyway? Haven’t you played video games and watched enough fictitious movies?”
A few months ago I bought the photo book “War Porn” by Christoph Bangert (a photo journalist in war regions). The photos are somewhere between disturbing and disgusting.
Bangert also covers the aspect of self-censorship as a photographer or publisher, hence for some of the pages you have to decide yourself whether to break the perforation to see the image included on that page.
The book’s introduction puts it into simple words that I could not find myself:
“What’s the point of showing these things? We know that wars and disasters are horrible events. But are we really aware of just HOW horrible they are? Yes? Why are we so shocked by these pictures, then?”
Google Code-in 2015 is over. As a co-admin and mentor for Wikimedia (one of the 14 organizations who took part and provided mentors and tasks) I can say it’s been crazy as usual. :)
To list some of the students’ achievements:
Numerous GCI participants also blogged about their GCI experience with Wikimedia:
The Grand Prize winners and finalists will be announced on February 8th.
Congratulations to our many students and 35 mentors for fixing 461 tasks, and thank you for your hard work and your contributions to free software and free knowledge.
See you around on IRC, mailing lists, Phabricator tasks, and Gerrit changesets!
Publishing my usual list of awkward and never complete pop music preferences of 2015.
In this year’s edition of Google Code-in, students can choose from tasks provided by the following organizations / projects: Apertium, Copyleft Games, Drupal, FOSSASIA, Haiku, KDE, MetaBrainz, OpenMRS, RTEMS, SCoRe, Sugar Labs, Systers, Ubuntu, and Wikimedia.
If you are a 13–17 year old pre-university student interested in getting involved in free and open source software development, check out Google Code-in.
It is a great opportunity to learn about distributed software projects, to find out which areas of software development you are interested in, to gather some hands-on experience, to contribute to “real” projects out there used by millions of people, and to make new friends all over the world.