In my post last week, I claimed that you can break a digital twin of the lab into two halves, the model and the implementation, both of which are hard (but in different ways). In this post, I want to start discussing the model, since it has to come before the implementation. I’ll break it down further into three kinds of models: the data model, the decision model, and the workflow model. This week, I’ll explore the data model, which is the static picture of how you organize your information. It’s sometimes called a schema or an ontology. Either way, it represents the lab and beyond as structured data.
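To make "the lab as structured data" concrete, here is a minimal sketch of what a data model might look like in code. The entity names (`Sample`, `Experiment`) and fields are hypothetical, not drawn from any particular LIMS: the point is only that each kind of record gets typed fields and explicit links, rather than free-text notebook entries.

```python
from dataclasses import dataclass, field
from datetime import date

# A hypothetical two-entity lab schema. Each dataclass is one entity
# in the data model; typed fields make the structure explicit.

@dataclass
class Sample:
    sample_id: str
    material: str
    received: date

@dataclass
class Experiment:
    experiment_id: str
    protocol: str
    # References to Sample.sample_id values: the relationship between
    # entities is part of the schema, not buried in prose.
    sample_ids: list[str] = field(default_factory=list)

s = Sample(sample_id="S-001", material="HeLa lysate", received=date(2024, 1, 15))
e = Experiment(experiment_id="E-042", protocol="western blot",
               sample_ids=[s.sample_id])
```

However you encode it (dataclasses, SQL tables, an ontology language), the value is the same: the relationships are queryable instead of implied.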
Or...don't use an ELN like that, where you end up hacking around the constrained data model and duplicating data everywhere. Use a proper platform and none of this is an issue at all. Better yet, combine a proper platform with proper search tools, and again you can get the data in a workable way with zero SQL, zero code, and any user can do it. Such a solution exists :)
I love this article. Coming from travel tech before starting Scispot.com, I refer to this as a data dictionary.
Every ELN/LIMS should eventually support a configurable data dictionary. I have seen many companies start with implementation without knowing their data dictionary - obviously it evolves as your workflows and results grow.
However, having a clear ontology map, and understanding how it impacts your downstream computational workflows, pays off.