Everything about Agile methodology revolves around the key values of rapid development, adaptability, and continuous improvement. But how many times can we, as project managers, point to specific instances where Agile principles saved our project from going completely off the rails? Most of the time, Sprint retrospectives introduce small changes and incremental value: minor course corrections, or a check that what is being built still matches the product vision. And as teams mature and roadmaps develop, living up to stakeholders’ expectations becomes almost a given, since individual working relationships have been established.
Every once in a while, especially early in new projects, Agile flexes its muscles by demonstrating that adhering to its key principles can conquer ambiguous goals and absorb major course corrections, preventing significant lost time and wasted resources. Sharing these experiences is important because it shows the value of Agile processes (especially daily stand-ups, the practice of delivering a viable product increment per Sprint, and Sprint demos) at exactly the moments we are tempted to skip or skim through them. This is one such case from my personal experience as a Product Owner on a BI platform development team.
Context
Business Intelligence (BI) platforms are like most software products in that they require the same mix of cross-functional individuals and skills to create a viable product. In this case, the team consisted of three software engineers, one engineering lead, one technical project/Scrum manager, and myself as the Product Owner representing multiple end-user stakeholders. In addition, one team member filled the role of data analyst, with expertise in ETL and data warehousing, a role specific to data and BI platform development, helping translate business requirements into functional specifications for the engineering team.
Our goal was to create an analytical cube to track and analyze the performance of a transactional engine used to register and house customer account information for the procurement and payment of software products. The transactional engine was itself new, still being built and iterated on as we started our project.
This, of course, is where the ambiguity began. How and where would we collect the data from this system before any data existed? What would be the defining unit of transaction? How would the data best represent performance? There was no shortage of questions, but the deadlines were tight. Analytics needed to be live no more than two weeks after the full transactional system went live, so any failures or pain points could be quickly identified and corrected before they took root as poor customer satisfaction.
We started by planning a roadmap and two-week sprints, with monthly production deployments, incorporating all the normal Agile/Scrum milestones into each sprint. We used the information available to us to determine the best course of action. Test data from the transactional engine tied individual transaction units to customer agreements, not customer accounts, and supplemental information began to form a picture of the appropriate data relationships, laying out the framework of the relational databases we would build.
Two sprints in, we prepared for our first production deployment on schedule. Further out, our plan had us completing the product to coincide with the transactional engine's go-live date. We entered UAT testing confident and satisfied. That is, until we started to run use-case scenarios against our UAT prototype.
Action
In the UAT demo, the Scrum manager, the data analyst, and I quickly realized it was nearly impossible to determine the volume of customers signing up in a given time period, a pretty basic measure needed to track and report on our program's performance. The reason? Customers could register and be approved to transact under multiple agreement types. Our data model didn't define customers uniquely; instead, it keyed on the agreement values. The product was designed to requirements, but the requirements did not account for translating the data into units that were valuable for business evaluation. When we analyzed the test data, we failed to recognize this, and so did not capture the requirements correctly.
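To make the defect concrete, here is a minimal sketch; the records and field names are hypothetical, not our actual schema. In an agreement-grained model, counting rows overstates sign-up volume whenever one customer holds multiple agreement types; the business measure needs a count of distinct customers.

```python
# Hypothetical records from an agreement-grained model: one row per
# customer agreement, not per customer (illustrative data only).
signups = [
    {"customer_id": "C1", "agreement_type": "procurement", "signed": "2020-01-05"},
    {"customer_id": "C1", "agreement_type": "payment", "signed": "2020-01-05"},
    {"customer_id": "C2", "agreement_type": "procurement", "signed": "2020-01-09"},
]

# Naive measure on the agreement grain: counts customer C1 twice.
agreement_count = len(signups)

# The measure the business actually needed: distinct customers in the period.
customer_count = len({row["customer_id"] for row in signups})

print(agreement_count)  # 3 -- inflated "customer" volume
print(customer_count)   # 2 -- true number of customers signing up
```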
We had to redesign our data model, and do it quickly, as the deadline was fast approaching. Since we were in UAT testing, sign-off was needed to release a viable product to production. As our first action, we agreed to proceed with the flawed data model for the first release; though not ideal, it would meet immediate needs and would not be retained in the end result. At the next daily stand-up, we suspended our normal activities to explain to the entire team what had happened. Agile training kicked in here: everyone showed up ready to take on the new challenge rather than dwelling on what had gone wrong.
Our third sprint needed to be planned, and we decided to dedicate half the team to designing the new model while the other half designed the next phase of the project. This could work because the next phase required acquiring additional data for our model, and integration into the data model would not start until the new design was ready. Our next planning meeting looked much more like a Joint Application Development (JAD) session: we were all creating the new design together, rather than the PO and Scrum manager dictating it.
With valuable input from the engineers, we were able to leverage parts of the old data model and translate them into the new one. This made it easier to right-size the work for our remaining sprints, and we could use our retrospective to discuss the tradeoffs that approach would entail. Every stand-up that sprint was interactive and engaging, with information shared and decisions made every day. The team members working on the next data acquisition benefited as well: by understanding the new model up front, they could make the coming integration phase seamless.
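In spirit, the translation from the old model to the new one looked something like the sketch below, again with hypothetical names and data: collapse the agreement-grained records into a customer dimension with one row per unique customer, and keep agreements as a separate fact that references it, so customer-level measures resolve correctly.

```python
# Hypothetical translation of old agreement-grained records into a customer
# dimension plus an agreement fact (names and data are illustrative).
signups = [
    {"customer_id": "C1", "agreement_type": "procurement", "signed": "2020-01-05"},
    {"customer_id": "C1", "agreement_type": "payment", "signed": "2020-01-05"},
    {"customer_id": "C2", "agreement_type": "procurement", "signed": "2020-01-09"},
]

# Customer dimension: one row per unique customer, keyed on customer_id,
# keeping the earliest sign-up as the registration date.
customers = {}
for row in signups:
    cust = customers.setdefault(
        row["customer_id"],
        {"customer_id": row["customer_id"], "registered": row["signed"]},
    )
    cust["registered"] = min(cust["registered"], row["signed"])

# Agreement fact: unchanged grain, now a foreign key into the dimension.
agreements = [
    {"customer_id": r["customer_id"], "agreement_type": r["agreement_type"]}
    for r in signups
]

print(len(customers))   # 2 unique customers
print(len(agreements))  # 3 agreements across those customers
```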
By the third sprint's demo session, we had a completed design and the framework of a basic, far more user-friendly data model running in our test environment. It would work with the new data acquisition and was robust enough to handle changes in how the transactional engine worked. That was a plus, knowing the system was likely to change in its infancy.
Result
We were able to develop and deploy the completed BI product 10 days after the transactional engine went live, well within the two-week window we were given at the start. A good thing, too, because the business intelligence revealed multiple pain points with the transactional engine that were corrected within the first three months of activity, before they became widespread issues (because the engine was being developed under an Agile SDLC too!).
Because everyone on this Scrum team had previous work experience and training in Agile, when our major defect was found, I noticed three things:
1. Everyone contributed positive, value-adding insights because we had been using Agile tools both before and after the issue was discovered. There was no finger-pointing at whoever caused the problem, and no time wasted reiterating goals or expectations, or on excessive root-cause analysis of the issue. Everyone leveraged their role and cross-functional knowledge to acknowledge the problem, overcome it, and advance on the right path forward.
2. We found and fixed the problem very early in development, instead of having it surface in a comprehensive testing window near the end of the project, when all the additional data and features built on, and dependent on, the core data model would have made it a far bigger issue with far more rework.
3. There was very little impact on meeting the project deadline. Better still, the learning gained from this “fast failure” drove productive, rapid adaptation toward building the “right thing”.