A common theme from the case studies in my last few posts is that for a data team embedded in a biotech organization, the hardest part of the job often isn’t designing the tools (models, pipelines, apps, etc.) but driving the changes in how the tools will be used. Bad software is a symptom, not a cause.
This might mean introducing processes to formally capture experiment metadata during the planning phase, or to use a predictive model instead of heuristics to set experiment parameters, among countless other examples. I think of these changes as creating the space into which the tool will fit.
Most of us on the data side of the house already like to work iteratively, with a feedback cycle. But when that iteration applies only to the tools, and not to the process, we end up scrambling to create the space at the very end.
Iterating on creating the space in parallel with creating the tool is a very different way of thinking and working. In many ways it’s more difficult than technical iteration on tooling, which is both why many of us don’t do it, and why we need to work harder to do better.
Making space for better tools
This is a powerful diagram!
It shows tooling as an intermediation between the Mental Model and the Process. It also illustrates the advantage of right-sizing a tool: too large, and it may never be successfully deployed; an oversized tool also wastes dev resources that might be better utilized elsewhere. I especially like that the Process and the Model can be directly connected even without the new tool, which might indicate that a tool often faces competition from legacy, outsourced, or manual methods.