There are two legacy requirement management tools from the '90s in use. They are expensive and their functionality overlaps. The client has a candidate to replace both tools and needs an evaluation to determine whether the replacement is feasible. If the answer is yes, a long and lucrative replacement project will follow.
Teams have been using the two tools for a very long time. Each team has developed its own way of working, complete with complex custom scripts and calculated fields. A multitude of integrations moves data around between a web of tools, yet somehow the data is never quite in sync. Still, everyone likes their tooling, and so far the only compelling argument for change is cost reduction. The only way to win them over is to deliver a better tool, one that makes their work easier.
An initial evaluation of a handful of tools has already been completed, narrowing the field to one that we must study in detail. We need to show how it can be configured to deliver the same functionality and how the existing data can be migrated.
Our role is to conduct the evaluation with a group of business representatives and prove that the tool can meet the required use cases. The vendor has assigned us a highly capable technical expert. This would be a big sale for them, so there is a lot of pressure on him to deliver.
The lack of a common way of working is a major obstacle. The data is very different from team to team, and some level of harmonization would be needed. The teams have little interest in changing their current ways of working: siloed and scattered as it all is, it is under their control. While sitting down with the vendor's technical consultant, it quickly emerges that the replacement tool may not be up to the job. It belongs to a previous generation of tools, suffers performance issues at scale, and has an architecture that limits its usability in one of the key business cases. These factors combined are enough to call it a no-go.
Sometimes success is failing fast. If it quickly becomes clear that something will not work, there is no point in dragging it out. We wrapped this project up early and saved the customer money. During the project we focused on learning by doing, demonstration instead of PowerPoint, and developing capabilities, such as data export from the legacy tools, that could be reused in the future. There was very little waste.
With any tool evaluation it is important to get away from the sales guys as soon as possible and talk to reliable technical experts who know and understand the product. There is nothing wrong with tool consolidation, but for this particular project it would have been very difficult to realize a reasonable return on investment based solely on licence cost reduction. The ways of working also needed to be fixed to fully realize the benefits. A tool change without a hard look at processes is a missed opportunity.