Data Process Flow Mapping Part 2: “Don’ts”
Data process flow maps serve as visual roadmaps, shedding light on the journey data takes from creation to disposal. These visual aids offer invaluable perspectives into interdependencies within pharmaceutical processes, allowing for the detection of bottlenecks and data integrity risks. Ultimately, they present an opportunity for efficiency enhancement, informed decision-making, and risk mitigation, and they help establish a culture of ongoing improvement. Notably, the approach is interdisciplinary by nature: collaborative teamwork is an essential component of all activities surrounding data process flow mapping.
However, there are common pitfalls during the mapping process. These include the risk of becoming entangled in excessive detail, which can hinder clarity and usability. Furthermore, separating data from process can obscure critical connections; it is important to consider both aspects (process steps and data) together from the start. Additionally, rigid tools that impede flexibility can obstruct the creation of adaptable data flow maps.
In the first blog article of this series, we focused on the “Dos” (Data Process Flow Mapping Part 1: “Dos”). In this second part, we comment on common pitfalls and difficulties that could hinder successful data process flow mapping. By formulating four “Don’ts”, we want to raise awareness of these hindrances and show ways to overcome them.
Come along with us as we delve into the potential of data process flow maps once more, sharing pitfalls to consider and, ideally, to prevent from the start.
1. Don’t get down a rabbit hole.
Navigating the complexities of data mapping requires vigilance against the common pitfall of getting lost in detail. It should always be possible to trace each element back to the “end-to-end” data flow (e.g., in a manufacturing setup, from the sensor up to the MES/eBR and the Data Historian). Delving too deeply into hardware, software, people, or procedures can sidetrack the process and obscure the primary goal. To prevent this, a dedicated Data Integrity expert team should moderate the mapping endeavor (see point 1 in part 1, Get collaborative!). Their expertise ensures that discussions remain aligned with the map’s objectives and the data itself. By maintaining focus on the bigger picture, this moderation prevents unnecessary tangents and contributes to the precision, efficiency, and success of the data mapping initiative.
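The end-to-end traceability idea above can be made concrete with a small sketch. The following Python snippet (all system names such as `plc` or `mes_ebr` are illustrative assumptions, not a real system model) represents a data flow as a directed graph and checks that every mapped element can still be traced back to the originating sensor; any “orphan” node signals that the map has drifted into detail disconnected from the end-to-end flow.

```python
# Minimal sketch: model an end-to-end data flow as a directed graph and
# verify that every node is traceable back to the originating sensor.
# Node names are hypothetical examples for a manufacturing setup.
from collections import deque

# Each key is a node; the list holds the nodes its data flows to.
flow = {
    "sensor": ["plc"],
    "plc": ["mes_ebr", "data_historian"],
    "mes_ebr": ["batch_record"],
    "data_historian": [],
    "batch_record": [],
}

def reachable_from(graph, start):
    """Return the set of nodes reachable from `start` via the data flow."""
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        for nxt in graph.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

# Any mapped element NOT reachable from the sensor is an "orphan":
# a sign the map has lost the end-to-end thread.
orphans = set(flow) - reachable_from(flow, "sensor")
print(sorted(orphans))  # an empty list means the end-to-end trace is intact
```

A check like this mirrors what the moderating expert team does by eye: every detail added to the map must still connect back to the data’s end-to-end journey.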
2. Don’t get stuck with rigid tools.
When starting out, data mapping is aided by tools like templates, decision trees, and pilot projects. A potential hurdle arises when these aids turn into constraints. Overly rigid structures can hinder adaptability to unique complexities within the pharmaceutical industry. While they offer initial guidance, they should not become inflexible roadblocks. Flexibility is essential for accommodating dynamic processes, diverse data flows, and an agile work environment. Utilized correctly, these tools offer a springboard for mapping, but their limitations must be recognized to avoid tunnel vision. Balancing structure with adaptability ensures that data mapping remains responsive and effective in addressing the evolving landscape of the pharmaceutical domain.
3. Don’t make the maps hard to access.
Data mapping’s effectiveness hinges on accessibility. A critical mistake to avoid is consigning maps to obscurity, detached from procedures. Instead, integration is vital: maps should be seamlessly woven into workflows, accessible, and democratized. Information silos restrain their potential. By pulling insights from diverse sources and aligning them with procedures, data mapping becomes a ubiquitous tool, empowering users as well as decision-makers at every level. Democratization fuels collaboration, enhances understanding, and nurtures informed choices. Moreover, it makes it possible to subsequently use the maps for identifying and mitigating data integrity risks. In the pharmaceutical realm, this transparency strengthens compliance and risk management and fosters a culture of proactive, data-driven decisions.
4. Don’t separate data and process maps.
Data Process Mapping is a detailed exercise aimed at uncovering connections between data sets and specific process steps. To begin, the relevant steps that comprise the chosen process are listed. Subsequently, the relevant data sets are associated with them. Analyzing where the data resides within the data life cycle (e.g., according to GAMP principles) also facilitates the identification of potential data integrity risks or gaps. Centralized storage and the need for data accessibility by various processes are key components of this analysis, especially in an era where data sharing and availability are integral to Pharma 4.0 objectives. This could involve communication interfaces that efficiently channel multiple data sets through a single system, optimizing data flow. This concept is based on the FAIR principles for data governance, according to which data should be Findable, Accessible, Interoperable, and Reusable.
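The exercise described above, listing process steps, associating data sets, and analyzing where data resides, can be sketched in a few lines of code. The following Python snippet is a hypothetical illustration (step names such as `weigh` or `granulate` and storage locations are invented for the example, not from a real process); it combines process and data in one structure and flags simple integrity risks such as redundant storage or data that is consumed but never created.

```python
# Hypothetical sketch of a combined data/process map: each process step
# lists the data sets it creates or consumes, and a separate dictionary
# records where each data set is stored. All names are illustrative.

process_map = [
    {"step": "weigh",     "creates": ["weight_record"],  "consumes": []},
    {"step": "granulate", "creates": ["process_params"], "consumes": ["weight_record"]},
    {"step": "review",    "creates": [],                 "consumes": ["process_params", "weight_record"]},
]

storage = {
    "weight_record": ["balance_printout", "mes_ebr"],  # stored twice: a risk
    "process_params": ["data_historian"],
}

def find_risks(process_map, storage):
    """Flag simple data integrity risks: data consumed but never created,
    created but never consumed, or stored in more than one location."""
    created = {d for s in process_map for d in s["creates"]}
    consumed = {d for s in process_map for d in s["consumes"]}
    return {
        "never_created": sorted(consumed - created),
        "never_consumed": sorted(created - consumed),
        "redundant_storage": sorted(d for d, locs in storage.items() if len(locs) > 1),
    }

print(find_risks(process_map, storage))
```

Even a toy structure like this makes the point of the “Don’t”: only when data sets and process steps live in the same map can gaps and redundancies between them be detected systematically.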
In the context of data process flow mapping, collaboration is essential to identify and address risks like incomplete data life cycles and redundant storage. Visualizing data architecture alongside processes fosters insight. Processes and their associated data should therefore not be separated, as keeping them together helps to spot data integrity issues across paper-based and digital systems. Early detection allows for effective remediation planning, ultimately enhancing quality and positively impacting product quality and patient safety.
5. Don’t limit the scope.
A comprehensive overview is extremely valuable to avoid overlooking risks, especially where external suppliers or multiple parties come into play. In our article on third party management, we discuss common data integrity points of failure, e.g., loss of connection data and metadata (audit trail). In these relationships, where experts with GxP responsibility are contracted, extra “data integrity care” is required to ensure that ‘out of sight’ does not become ‘out of mind’.
This holistic perspective on data processes not only safeguards internal operations but also extends its protective umbrella to encompass the critical contributions of external entities like CROs and CLOs, ensuring a robust data integrity framework across the board. Situations with external partners involve handover or collection of critical data. A well-defined data governance approach, incorporating the flow of data to the process steps, enhances the oversight before, during and after operation when managing external relationships.
Compiling data process flow maps is not a walk in the park. However, fostering the right environment, where collaboration and critical thinking can thrive, is an effort that will pay off in future quality enhancement. It can be challenging to manage the trade-off between procedural detail and oversight in data integrity risk identification: not getting lost in complexity and system details while keeping the data life cycle in sight is no simple task. Therefore, finding the right level of detail (point 1, Don’t get down a rabbit hole) and combining information on data and process flow (point 4, Don’t separate data and process maps) can require several rounds of drafting and rework. Flexibility and teamwork are crucial to creating data process flow maps successfully.
Ultimately, data process mapping not only enhances data integrity but also yields profound insights into the integration of data and its central role in pharmaceutical operations. As an outcome, process transparency, efficiency, and risk management are augmented. These strategies will, in the end, enhance quality, with a continued focus on both product quality and patient safety.
Which “Don’ts” have surprised you the most? Do not hesitate to contact us to get your data process flow mapping started!