Voted Best Answer
Jan 04, 2016 - 03:42 AM
There are a number of issues here.
The contractual requirement to deploy ILMT is essentially a legacy of the days when SAM was immature in comparison to the technical reporting requirements of sub-capacity usage. There was a need for discovery tools that could query the monitored servers in sufficient depth, and not many could do that; additionally, the back-end calculations needed to generate an accurate sub-capacity IBM licence position were beyond many tools.
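To give a feel for what those back-end calculations involve, here is a minimal sketch of a sub-capacity PVU aggregation. The server records and PVU ratings are hypothetical, not IBM's official tables; the core idea is that each virtualised server is charged for the lower of the virtual cores allocated to the IBM workload and the physical cores of the host, multiplied by the per-core PVU rating.

```python
# Hedged sketch of a sub-capacity PVU calculation.
# Assumptions (not from the original post): each server record carries
# virtual cores assigned to the IBM workload, physical host cores, and
# a per-core PVU rating. Real ILMT calculations handle far more detail
# (processor tables, clusters, capping rules) than this illustration.

def subcap_pvus(servers):
    """Total PVUs across an estate: per server, chargeable cores are
    capped at the physical core count (the sub-capacity rule)."""
    total = 0
    for s in servers:
        chargeable_cores = min(s["virtual_cores"], s["physical_cores"])
        total += chargeable_cores * s["pvu_per_core"]
    return total

# Hypothetical estate: one lightly virtualised host, one over-allocated host.
estate = [
    {"virtual_cores": 4,  "physical_cores": 16, "pvu_per_core": 70},
    {"virtual_cores": 24, "physical_cores": 16, "pvu_per_core": 100},
]

print(subcap_pvus(estate))  # 4*70 + 16*100 = 1880
```

Note how the second host is capped at its 16 physical cores even though 24 virtual cores are assigned; getting that capping logic right across a large, changing estate is exactly where early tools fell short.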
However, IBM’s ILMT has a reputation for being both clunky and a pain to maintain; at SAMGarde we have, for a number of years, been successfully challenging IBM to allow customers to use a well-deployed SAM tool to report on sub-capacity usage instead of ILMT. For customers this means they require only one tool to manage their SAM estate (important in many outsourced environments, as each additional application adds additional cost), and for IBM it improves customer relationships. This was formalised when Flexera became the first IBM-certified SAM tool, but other tools are now being accepted. What is important is that this conversation happens as part of a broader commercial discussion. This is one of the reasons why IBM is probably the most complex publisher to negotiate with.
With regard to incorrect data: yes, we have seen incorrect data from ILMT, in the same way we have seen incorrect data with ALL SAM tools. The principle of “Garbage In / Garbage Out” is as true now as it was back in the 1970s. The important thing is to have a verification process in place.
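A simple form that verification can take is cross-checking the tool's inventory against an independent source, such as a CMDB export. The host names below are hypothetical; the point is that unmatched records on either side flag the "garbage in" cases worth investigating before any numbers go to IBM.

```python
# Hedged sketch of an inventory cross-check (hypothetical host names).
# ILMT/SAM-tool inventory versus an independent CMDB export: differences
# in either direction are candidates for investigation, not conclusions.

ilmt_hosts = {"app01", "app02", "db01"}
cmdb_hosts = {"app01", "db01", "db02"}

# In the CMDB but not reported by the tool: a possible coverage gap.
missing_from_ilmt = cmdb_hosts - ilmt_hosts

# Reported by the tool but unknown to the CMDB: possibly stale or rogue data.
unknown_to_cmdb = ilmt_hosts - cmdb_hosts

print(sorted(missing_from_ilmt))  # ['db02']
print(sorted(unknown_to_cmdb))    # ['app02']
```

In practice the comparison would run over full discovery exports and also reconcile core counts and product versions, but even this set-difference check catches the most common discrepancies early.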
As for IBM’s response to inaccurate data, that depends on how one handles the engagement. Where discrepancies are identified, they should be fully documented to allow a better understanding of both the operational and commercial impacts; then plan your engagement with IBM so that you are clear on what your customer needs, wants, and is prepared to concede.
In these instances, we have ALWAYS found IBM to be willing to both listen & act.