Harnessing the Power of Data in IBP Transformation
Harnessing accrued and ever-increasing data remains one of the biggest challenges for companies embarking on IBP transformations, and it is also the most important aspect to master for successful execution. Here we delve into the factors that make mastering data so difficult and provide examples and guidance on how to overcome them.
Characteristics
Following are the major characteristics of the data powering IBP.
Accurate: IBP data is used to generate forecasts and aid decision making, so it needs to be precise and reliable. There is no ceiling to the damage that inaccurate data can cause.
Up to date: For the same reason, data needs to be real-time or near real-time, reflecting the current state of the business and its operations. This makes the decision-making process more agile and can also reduce communication touchpoints.
Complete: IBP data is essential to central and holistic decision making. Therefore, it needs to encompass all relevant aspects of the business, including but not limited to financial, operational, supply-chain, procurement, and market-related information.
Granular: Operational data specific to the supply chain has strict hierarchies, especially as it relates to products and customers. Hence, the granularity relevant to a given user can vary from high-level summaries to detailed transactional data (a minimal sketch of checking these characteristics programmatically follows this list).
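To make these characteristics concrete, the sketch below shows one way the first three might be checked on an incoming feed. It is a minimal illustration only, assuming pandas and hypothetical column names and thresholds rather than any particular schema.

```python
# A minimal sketch of data-quality checks covering accuracy, freshness, and
# completeness. Column names, thresholds, and the input frame are illustrative
# assumptions, not a prescribed schema.
import pandas as pd

def check_ibp_feed(df: pd.DataFrame, max_age_hours: int = 24) -> dict:
    """Return simple pass/fail signals for the characteristics described above."""
    now = pd.Timestamp.now(tz="UTC")
    return {
        # Accurate: no missing or negative shipment quantities.
        "accurate": bool((df["shipped_qty"].notna() & (df["shipped_qty"] >= 0)).all()),
        # Up to date: the newest record falls within the agreed refresh window
        # (assumes load_ts is stored as a UTC-aware timestamp).
        "up_to_date": bool(now - df["load_ts"].max() <= pd.Timedelta(hours=max_age_hours)),
        # Complete: every required attribute is populated.
        "complete": bool(df[["customer_id", "product_id", "period"]].notna().all().all()),
    }
```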
The data transformation must holistically encompass the above-mentioned aspects to truly gain insight from the information and provide tangible benefits in the decision-making process. Outlined below are the key steps an organization must take for this data transformation.
Preparation
Having relevant data for the right users and processes is a crucial precursor to enabling any kind of digital transformation. Most legacy organizations have multiple systems and data sources, along with processes built on top of those systems that often depend on one another to normalize the data. This frequently leads to situations where the same metric is defined as a different number depending on who is looking. As such, sufficient effort has to be focused on:
Disentangling these processes
Formalizing the source contracts and the collection and refresh cadence of data (a minimal sketch of such a contract follows this list)
Building out the relevant infrastructure and data pipelines
Building out the people, systems, and processes to manage the pipelines
Continuously monitoring and enhancing the pipelines for robustness; data will continue to grow, so the technology must evolve alongside it
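To illustrate the source-contract item above, here is a minimal sketch of recording such a contract as code so that the system of record, owner, refresh cadence, and expected schema are explicit rather than implied by downstream processes. The field names and the example feed are hypothetical.

```python
# A minimal sketch of formalizing a source contract in code. All names and
# values below are illustrative assumptions.
from dataclasses import dataclass

@dataclass(frozen=True)
class SourceContract:
    name: str                  # logical dataset name used by the pipelines
    source_system: str         # system of record for this feed
    owner: str                 # team accountable for the feed
    refresh_cadence: str       # agreed refresh cadence
    required_columns: tuple    # columns the pipeline refuses to run without

SHIPMENTS_FEED = SourceContract(
    name="actualized_shipments",
    source_system="ERP",
    owner="supply-chain-data",
    refresh_cadence="daily at 02:00 UTC",
    required_columns=("order_id", "product_id", "ship_date", "shipped_qty"),
)
```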
This is perhaps the most time-intensive undertaking of all those mentioned, so it is important to approach it in a staggered manner that delivers quicker results while maintaining cohesion between the different services.
For a large retail client, we upgraded their ERP system, built an enterprise data platform, and developed downstream data layers that provided a more unified view of the organization's data. This enhanced visibility not only improved operational transparency but also led to a significant increase in data-driven use cases. As a result, the client was able to make more informed decisions and optimize various business processes, driving greater efficiency across the organization.
Scoping
The most crucial aspect of reaching a consensus on numbers is to first clearly define the right sources, transformations, and units of the data being utilized for a given process. This is an oft-neglected part of a build that quickly comes to the fore when problems in measurement materialize. It is important to formally acknowledge the data scopes through written records and advertise them broadly so that everyone sings from the same hymn sheet; a sketch of such a written record appears after the list of hazards below.
Following are a few of the potential hazards in data that one may encounter:
Shipments may be tracked as actualized (with returns and unfulfilled orders removed) or as invoiced.
There may be multiple product ID standards.
Shipments may have multiple units of measure.
The same data resides in two different sources: orders are placed in SAP, but actualized shipments are tracked in Kinaxis.
Data categorization happens in multiple systems: customer orders are recorded in SAP, but the product hierarchy is created in Kinaxis.
Forecasting is done for only a subset of customers (say, retail).
Data is not refreshed at the cadence that forecasting requires.
SMEs hold incorrect definitions of data attributes and their usage.
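As noted above, a written, machine-readable record of the agreed scope heads off most of these hazards. The sketch below is one hypothetical shape for such a record; every value in it (systems, units, filters, cadence) is an illustrative assumption, not a prescribed standard.

```python
# A minimal sketch of a written, machine-readable data scope for one metric.
# Every value below is an illustrative assumption meant to show what should be
# pinned down, not a prescribed schema.
SHIPMENTS_SCOPE = {
    "metric": "actualized_shipments",
    "source_of_truth": "Kinaxis",          # orders placed in SAP, actuals tracked in Kinaxis
    "definition": "shipped quantity net of returns and unfulfilled orders",
    "unit_of_measure": "cases",            # convert other units before loading
    "product_id_standard": "global_gtin",  # one agreed ID standard across systems
    "customer_scope": "retail accounts only",
    "refresh_cadence": "daily, before the 06:00 planning run",
}
```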
Data Model & Platform Design
Once the scope is finalized, we can commence work on the IBP-specific data setup. This means taking the enterprise data and deliberately designing the pipelines and storage architecture that allow for fit-for-purpose actions on the data by downstream IBP tasks and applications.
The data platform is a layer of fabric connecting multiple sources rather than a single component. Since the underlying storage solutions, and their number, change with the advent of new technology, you want an abstraction that protects the application code from those changes. This can be a custom interface or an existing utility for a given language; for example, Python has fsspec, an interface for interacting with different storage solutions.
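As a minimal sketch of such an abstraction, the helper below resolves a logical dataset name to a physical location and opens it through fsspec, so swapping storage backends only touches the mapping. The dataset names and paths are hypothetical, and pandas with a Parquet engine is assumed.

```python
# A minimal sketch of shielding application code from the storage backend with
# fsspec. The application asks for a logical dataset; only this module knows
# whether it lives on local disk or in object storage. Paths are assumptions.
import fsspec
import pandas as pd

# Logical name -> physical location; changing backends means editing only this map.
DATASETS = {
    "shipments": "s3://ibp-data/curated/shipments.parquet",
    "orders": "file:///data/ibp/orders.parquet",
}

def load_dataset(name: str) -> pd.DataFrame:
    """Open the dataset through fsspec regardless of where it is stored."""
    with fsspec.open(DATASETS[name], mode="rb") as f:
        return pd.read_parquet(f)
```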
The data model refers to the structure of, and relationships between, the different data attributes that drive IBP. As described above, this data comes from disparate operational and functional sources, so the relationships between them have to be formed based on how the data is used and where it is shown.
For a large B2B client, we created an internal planning application that required separating data structures based on two distinct patterns of viewing the data: high-granularity charts and graphs that required a high level of detail, and low-granularity ones that showed trends over a longer time horizon.
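A minimal sketch of that kind of separation, assuming pandas and hypothetical column names: the detailed table serves drill-down views, while a pre-aggregated table serves long-horizon trend charts.

```python
# A minimal sketch of pre-aggregating detailed transactions into a trend table.
# Column names are illustrative assumptions.
import pandas as pd

def build_trend_table(detail: pd.DataFrame) -> pd.DataFrame:
    """Collapse transaction-level rows into monthly totals per product family."""
    monthly = detail.assign(month=detail["ship_date"].dt.to_period("M"))
    return (
        monthly.groupby(["product_family", "month"], as_index=False)["shipped_qty"].sum()
    )
```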
Governance
Data Governance refers to the framework and practices that organizations implement to ensure that their data is managed effectively, securely, and in compliance with relevant regulations and policies. When it comes to IBP applications, it encompasses the following:
Role-Based Access Control (RBAC)
All interactions on the applications are attributed to clearly defined user personas, and each persona can be assigned to one or more users. These personas govern the rights attached to a given UI component, namely read, edit, delete, and so on. In this manner, access controls are enforced through the persona or role. Usually, RBAC is integrated with an organization's IT-maintained directory service, the most common being Microsoft Active Directory.
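A minimal sketch of how persona-based rights might be expressed inside such an application follows. The personas, components, and rights are hypothetical, and the persona-to-user assignment is assumed to come from the directory service.

```python
# A minimal sketch of persona-based rights on UI components. The persona-to-user
# assignment is assumed to come from the organization's directory service (e.g.
# via group membership); personas and component names are illustrative.
PERSONA_RIGHTS = {
    "demand_planner": {"forecast_grid": {"read", "edit"}, "scenario_panel": {"read"}},
    "executive_viewer": {"forecast_grid": {"read"}, "scenario_panel": {"read"}},
}

def has_right(persona: str, component: str, action: str) -> bool:
    """Return True if the persona holds the requested right on the component."""
    return action in PERSONA_RIGHTS.get(persona, {}).get(component, set())

# Usage: has_right("demand_planner", "forecast_grid", "edit") -> True
```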
Attribute-Based Access Control (ABAC)
While the personas govern how one interacts with the application, data controls affect what information one sees. A user is assigned a specific set of attributes that act as filters and are passed down from the UI. These are applied at the source level (database, cache, static files), and the filtered information is then returned to the application layer. Implementations vary, but for off-the-shelf solutions this is usually enforced at the application level itself.
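The sketch below illustrates one way attribute filters could be pushed down to the source as query predicates. The table, columns, and placeholder style (as used by sqlite3) are assumptions for illustration, not a description of any particular product's implementation.

```python
# A minimal sketch of translating a user's data-control attributes into query
# predicates so that only permitted rows ever leave the database. The table,
# columns, and '?' placeholder style (e.g. sqlite3) are illustrative assumptions.
def build_filtered_query(user_attrs: dict) -> tuple:
    """Restrict the shipments query to the regions assigned to the user."""
    regions = user_attrs.get("regions", [])
    if not regions:
        return "SELECT * FROM shipments WHERE 1 = 0", []  # no attributes -> no rows
    placeholders = ",".join("?" for _ in regions)
    return f"SELECT * FROM shipments WHERE region IN ({placeholders})", regions

# Usage: rows = conn.execute(*build_filtered_query({"regions": ["EMEA", "APAC"]})).fetchall()
```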
Data Lineage
Data lineage refers to the ability to track and comprehend the journey of data starting from its origin and ending at its destination. It involves capturing and documenting information about where the data originated, how it underwent transformations and the various steps it went through as it moved across systems, applications and processes.
Organizations benefit from data lineage as it provides an understanding of how data is created, manipulated, and utilized within their systems. It helps address questions such as the following (a sketch of a lineage record that could capture the answers appears after the list):
Where did the data originate from?
Who accessed or modified it?
Where was it stored?
How was it used or analyzed?
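Here is a minimal sketch of the kind of lineage record that could answer these questions, assuming one event is emitted per pipeline step. The fields are hypothetical and this is not a description of the client solution described next.

```python
# A minimal sketch of a lineage event emitted at each pipeline step. The fields
# are illustrative assumptions about one possible design.
from dataclasses import dataclass, asdict
from datetime import datetime, timezone
import json

@dataclass
class LineageEvent:
    dataset: str         # what data moved
    source: str          # where it originated
    destination: str     # where it was stored
    transformation: str  # how it was used or changed
    actor: str           # who or what accessed or modified it
    occurred_at: str     # when the step ran (UTC timestamp)

def record_step(dataset: str, source: str, destination: str,
                transformation: str, actor: str) -> str:
    """Serialize a lineage event as JSON so it can be collected and visualized later."""
    event = LineageEvent(dataset, source, destination, transformation, actor,
                         datetime.now(timezone.utc).isoformat())
    return json.dumps(asdict(event))
```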
For one client, we built a custom, automated data tracer from the ground up to track data lineage across various sources, both on-premises and in the cloud. This tracer captured and stored metadata related to the flow of data, tailored to the specific storage types (such as Parquet files or relational databases). This solution allowed us to document the entire journey of the data, making it possible to visualize the flow across systems in different formats and at varying levels of detail. This visibility was critical in ensuring data accuracy, transparency, and consistency throughout the organization's IBP processes.
In conclusion, harnessing data for Integrated Business Planning (IBP) transformation is no small feat. From ensuring data accuracy and granularity to managing multiple sources and aligning stakeholders, the path to success requires thoughtful planning and execution. By addressing these challenges—through processes and technology—organizations can unlock the full potential of their data.