It ain’t always all technical!

I was called in to help out a client for a day while I was still ramping up. It would take a full day away from my studies – which were considerable – but this was my first client engagement at the company, and I didn’t want to let the opportunity go. The task was to review ‘scripts’: the client’s data migration plan was going awry, and their go-live had already been pushed out a couple of times. First thing in the morning, I was introduced on the call. The team – and this wasn’t a surprise – was full of experienced Salesforce implementation professionals with years and years of experience behind them. I had certainly been nervous the night before, running through the what-ifs – ‘what if I come across as inexperienced’, ‘what if I can’t troubleshoot the issue’, ‘what if my short exposure to Salesforce turns out to be inadequate’, and so on. In the first half hour, I learned what the issue was. The client wanted a second pair of eyes on the scripts to catch anything and everything that could impede the go-live, which was about two weeks away. I was asked to lay out a plan for the day ahead. I offered to first sit with the relevant people on a call to hear the full story before coming up with a plan.

In the meeting with two members of the team from the Systems Integrator – a well-established consulting firm in the Salesforce space – I learned many things. Key among these was the fact that there were three legacy systems from which they needed to pull data and transfer it over to Salesforce. They had been given a data mapping document by the client and had two experts helping them understand the queries. Then there was the set of issues they were facing. I offered to look at these and found they were not categorized – in essence, there was no triaging process to prioritize them. Their means of testing the data migration was to tally the number of records in the source system against the destination. In the report the team had produced, many of the percentages were way off from 100%, some running into the thousands. Yes – numbers indicating more data was loaded than existed in the source. There was no possible way to explain those figures. I was countered with the argument that source and destination counts won’t always tally, but these discrepancies went well beyond anything acceptable, such as 1–2%. The report, and the way the team was measuring success for the migration, was totally unreliable.
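The kind of count-based check the team was running can be sketched in a few lines. This is a hypothetical illustration only – the object names, counts, and 2% tolerance are my assumptions, not figures from the actual project – but it shows how a per-object reconciliation with a tolerance threshold would have immediately flagged the impossible over-100% loads:

```python
# Hypothetical sketch: reconcile per-object record counts between a
# legacy source system and the migration target, flagging anything
# outside a tolerance. All names and numbers are illustrative.

TOLERANCE = 0.02  # accept up to a 2% discrepancy

source_counts = {"Account": 120_000, "Contact": 450_000, "Case": 80_000}
target_counts = {"Account": 119_400, "Contact": 910_000, "Case": 80_100}

def reconcile(source, target, tolerance=TOLERANCE):
    """Return (object, source_count, target_count, pct_loaded, ok) rows."""
    rows = []
    for obj, src in source.items():
        tgt = target.get(obj, 0)
        pct = (tgt / src * 100) if src else 0.0
        ok = src > 0 and abs(tgt - src) / src <= tolerance
        rows.append((obj, src, tgt, pct, ok))
    return rows

for obj, src, tgt, pct, ok in reconcile(source_counts, target_counts):
    status = "OK" if ok else "INVESTIGATE"
    print(f"{obj:10} {src:>9,} -> {tgt:>9,} ({pct:6.1f}%) {status}")
```

In this made-up run, Contact comes out at over 200% – the same symptom the team’s report showed – which should be treated as a hard stop to investigate duplicates in the load, not explained away as normal drift.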

Given the approach the team had taken to test the data migration, it was evident that they weren’t looking at the whole picture. The data migration was one piece of the whole puzzle. The big objective of the go-live was to enable the business to start using Salesforce. That meant other work streams should have been running in parallel, such as configuration, building any custom Visualforce pages, or integration with other systems. With each data migration run, the team should also have been testing the application itself – and there was the disconnect. I later learned their offshore QA team was doing the application testing, but testing the data migration and testing the application were not coordinated, or at least that is how it appeared to me.

The other glaring piece that was missing was traceability of the requirements to testing, and vice versa. Unless the team could demonstrate to the business that, with each test, the requirements of the project were being met, there was no way the project was going to be signed off. When one is knee-deep in developing an application, the disconnect from the actual requirements often grows unchecked. This is where project management needs to keep a close eye on what is required versus what is being delivered. It is such a simple thing – tying the results back to the original ask – but teams lose sight of it unless someone is monitoring that linkage. If you leave the delivery of a large project to just the technical team, it is in all probability going to be a resounding failure. Effective project management is a core part of a successful delivery, however much of an overhead it might appear to a tech genius. The bigger the project, the greater the necessity of oversight.
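Even a lightweight traceability matrix would have surfaced the gap. The sketch below is entirely hypothetical – the requirement IDs, test case IDs, and results are invented for illustration – but it shows the simple linkage the paragraph above describes: every requirement mapped to the tests that cover it, with uncovered or failing requirements flagged before anyone claims the project is ready for sign-off:

```python
# Hypothetical sketch: a minimal requirements-to-test traceability check.
# All requirement and test-case identifiers here are made up.

requirements = {
    "REQ-001": "Migrate all active Accounts",
    "REQ-002": "Preserve Case ownership on migration",
    "REQ-003": "Link migrated Contacts to their Accounts",
}

# Each test case lists the requirements it covers and its latest result.
test_results = {
    "TC-10": {"covers": ["REQ-001"], "passed": True},
    "TC-11": {"covers": ["REQ-002"], "passed": False},
}

def coverage_report(reqs, tests):
    """Map each requirement to PASSING, FAILING, or NOT COVERED."""
    covered = {}
    for tc, info in tests.items():
        for req in info["covers"]:
            covered.setdefault(req, []).append(info["passed"])
    report = {}
    for req in reqs:
        runs = covered.get(req, [])
        if not runs:
            report[req] = "NOT COVERED"
        elif all(runs):
            report[req] = "PASSING"
        else:
            report[req] = "FAILING"
    return report

for req, status in coverage_report(requirements, test_results).items():
    print(req, status)
```

The point is not the tooling – a spreadsheet does the same job – but that someone owns the linkage and reviews it on every test cycle.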
