Sustainable growth and visibility

I have some thoughts on what might be needed on the "project management" front of the Rust project to maintain and increase its development speed in a sustainable way, welcoming new contributors and avoiding burnout or undue burden on individuals:

https://estebank.github.io/sustainable-growth-and-visibility.html

I think that having metrics for the state of the project (as opposed to the code) is necessary: Rust is as big as ever and continues to grow, and Brooks's law means that without proactively exercising "project management" tasks we'll hit a limit on how many projects we can tackle and how many people we can onboard.

I see this as a way for (if I'm allowed to riff off Carl Sagan) "the Rust community to understand itself": a way to get a high-level view of the project, an early warning system for when things are going badly, and one benchmark for whether our efforts are having an impact.

Please let me know your thoughts!

Hi Esteban,

I'm very much in favor of a custom tool for the Rust project to track such metrics.

It's a notion I see all over the place that a generic, one-size-fits-all view of "project tracking" doesn't exist: the tool should follow the focus of the organisation.

So, my very quick reaction to the tool is: great!

FWIW, I avoid the word "health" for such investigations: a project may very much be "healthy" even if some of its corners are dirty, while the common intuition around "health" draws from medicine and allows a lot less leeway.

My biggest piece of feedback, though, is that such tools are sharp and need to be used with care.

It may help if I give some context from previous Rust leadership experience so that it isn't lost. Early on in the Rust project, we had access to a tool called "Cauldron". Looking at it exposed a number of problems with that kind of tool use:

  1. We were already aware of most of the (non-)problems it showed

  2. We were aware of crucial problems that the tool did not show

  3. Across all the metrics of the tool (and other metrics we looked at), we found phenomena that were hard to explain, such as a sudden increase in the crate growth rate

  4. The tool doesn't show things that don't exist yet (at least to our awareness) but are desirable

This led to the joke at one all-hands that we need a "crystal ball working group".

(1) is conceptually the easiest problem to deal with: in the case of non-problems, the metric should be silenced and only shown if there's a relevant change. The problem here, drawing on my background as a search and dashboard consultant, is that agreeing on what is "relevant" and defining it in software is a laborious process.
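
To make that concrete, here is a minimal sketch of one way to encode "only surface a metric on a relevant change". Everything in it (the `Metric` type, the 10% threshold, the example numbers) is a made-up illustration, not part of Cauldron or any existing tool:

```rust
// Hypothetical sketch: a metric that stays silent unless its value
// deviates from an agreed baseline by a "relevant" amount.
struct Metric {
    name: String,
    baseline: f64,            // agreed-upon "normal" value (an assumption)
    relevance_threshold: f64, // relative change that counts as relevant
}

impl Metric {
    /// Returns `Some` only when the observation deviates from the
    /// baseline by more than the threshold; otherwise the metric
    /// stays off the dashboard.
    fn report(&self, observed: f64) -> Option<String> {
        let change = (observed - self.baseline) / self.baseline;
        if change.abs() > self.relevance_threshold {
            Some(format!(
                "{}: {:+.1}% vs baseline ({} -> {})",
                self.name,
                change * 100.0,
                self.baseline,
                observed
            ))
        } else {
            None
        }
    }
}

fn main() {
    // Both the metric and the 10% threshold are placeholders; agreeing
    // on these numbers is the laborious part.
    let stale_prs = Metric {
        name: "PRs open longer than 30 days".into(),
        baseline: 120.0,
        relevance_threshold: 0.10,
    };
    if let Some(alert) = stale_prs.report(150.0) {
        println!("{alert}");
    }
}
```

The hard part is precisely the `relevance_threshold` field: picking that number, per metric, is an organisational negotiation rather than a programming task.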

(2) is similar: if the problem is tangible, a metric should be designed and added. The same problem as in (1) applies.

Both of these combined point to the problem that such a tool needs to be easily changeable and flexible (incidentally, one of the reasons why many companies still rely on spreadsheets). A meta-metric for the use of such a tool is how often the gathered metrics change (or are even removed); a toy illustration follows below. This also maps to an effect called "dashboard fatigue": an ever-extended dashboard with all green lamps quickly loses relevance. The same goes for a dashboard where a metric is tracked that is currently not being improved. How to deal with this is very much bound to the organisation and its daily state.
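
As a toy illustration of that meta-metric (all names and snapshots invented for the example), one could diff the set of defined metrics between dashboard snapshots:

```rust
// Hypothetical sketch: given snapshots of which metrics the dashboard
// defined at different points in time, count how many were added or
// removed between snapshots.
use std::collections::HashSet;

fn churn(before: &HashSet<&str>, after: &HashSet<&str>) -> usize {
    // Symmetric difference = metrics added + metrics removed.
    before.symmetric_difference(after).count()
}

fn main() {
    let q1: HashSet<&str> =
        ["open_prs", "triage_backlog", "crate_growth"].into_iter().collect();
    let q2: HashSet<&str> =
        ["open_prs", "review_latency"].into_iter().collect();

    // High churn suggests the tool is being actively re-fit to the
    // organisation; zero churn for a long time may signal fatigue.
    println!("metrics changed between snapshots: {}", churn(&q1, &q2));
}
```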

(3) is also interesting: it needs a lot of experience and research, potentially talking to people. Solutions here are surprisingly simple, though: often a collection of notes on interesting things we have seen, plus a group of people interested in investigating, is enough. The solution here may literally be found over a drink in a bar.

(4) is something to be keenly aware of, as it shows the boundaries of such a tool. And those boundaries are large. That means that users of such a tool need to be strongly incentivised to turn away from it regularly and use other methods of research (surveys, user interviews, industry research).

The tl;dr may sound trivial, but usage of such tools needs a lot of education, and we need to find a way to provide that education to users. It also needs a lot of work and brain cycles. This is not at all an objection or a warning: if we can achieve that, we give users a valuable skill and boost the organisation.

Florian

I'm worried that this problem needs qualitative, not quantitative, analysis.

For rustc itself there may be enough contributors to track trends, but for smaller projects I'm afraid the data is going to be sparse. For example, see the skew of contributions to the cc crate: it's basically Alex plus people submitting one-off fixes.
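
As a rough, hypothetical sketch (not an existing tool), one could pipe `git shortlog -s -n` into a small program that reports how much of the history the top contributor owns:

```rust
// Hypothetical sketch: pipe in the output of `git shortlog -s -n`
// (lines like "  123\tAlex Crichton") and report how skewed the
// commit distribution is.
use std::io::{self, BufRead};

fn main() {
    let stdin = io::stdin();
    let counts: Vec<u64> = stdin
        .lock()
        .lines()
        .filter_map(|line| {
            // First whitespace-separated field is the commit count.
            line.ok()?.split_whitespace().next()?.parse().ok()
        })
        .collect();

    let total: u64 = counts.iter().sum();
    if total == 0 {
        eprintln!("no commits parsed");
        return;
    }
    // `git shortlog -n` sorts by commit count, so the first entry is
    // the most prolific contributor.
    let top_share = counts[0] as f64 / total as f64;
    println!(
        "{} contributors; top contributor authored {:.0}% of {} commits",
        counts.len(),
        top_share * 100.0,
        total
    );
}
```

Running something like `git shortlog -s -n | cargo run --quiet` in a checkout of cc would presumably show one contributor owning the vast majority of commits, which is exactly the kind of sparse, skewed data that makes trend-tracking unreliable.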
