As fraud examiners concerned with the ongoing health of fraud management and response systems, we find ourselves constantly looking at the integrity of the data that is truly the lifeblood of today's client organizations. We're constantly evaluating the network of anti-fraud controls we hope will keep uncontrolled, random data vulnerabilities to a minimum. Every piece of critical information that gets mishandled or falls through the cracks, every transaction that doesn't get recorded, every anti-fraud policy or procedure that's misapplied has some effect on the client's overall fraud management picture.
When it comes to managing its client, financial, and payment data, almost every organization has a Pauline. Pauline is the person everyone goes to for answers about the data, and about the state of the system(s) that process it, that no one else in her unit ever seems to have. That's because Pauline is an exceptional employee with years of detailed, hands-on experience in daily financial system operations and maintenance. Pauline is also an example of the extraordinary degree to which many organizations now depend on a small handful of key employees. The Great Recession, when enterprises leaned on retaining the experienced employees they already had rather than on traditional hiring and cross-training practices, only exacerbated an existing and still-growing trend. The real threat that the Paulines of the corporate data world pose to the fraud management system is not so much that they will commit fraud themselves (although that's an ever-present possibility) but that they will retire or take another job out of state, taking their vital knowledge of the company's systems and data with them.
The day after Pauline's retirement party, and to an increasing degree thereafter, it will dawn on her unit's management how much valuable information it has lost about the true state of its data and financial processing system(s), and how much system-critical documentation was carried around nowhere but in Pauline's head. The point is that, for some organizations, reliance on a few key employees for day-to-day, operational knowledge of their data goes well beyond what's appropriate and constitutes an unacceptable level of risk to their fraud prevention system. Today's newspapers and the internet are full of stories about data breaches, only reinforcing the importance of vulnerable data, and of its documentation, to the ongoing operational viability of our client organizations.
Anyone who has investigated frauds involving large-scale financial systems (insurance claims, bank records, client payment information) is painfully aware that when the composition of data changes (field definitions or content), surprisingly little of that change-related information is ever formally documented. Most of it is stored in the heads of a few key employees, and those key employees aren't necessarily the ones involved in everyday, routine data management projects. There's always a significant level of detail that goes undocumented, left out or left to chance, and it falls to the analyst of the data, whether an auditor, a management scientist, a fraud examiner, or another assurance professional, to find the anomalies and question them. The anomalies might take the form of missing data, changes in field definitions, or changes in field content; the possibilities are endless. Without proper, formal documentation, the immediate or future significance of these anomalies for the fraud management system, and for the overall fraud risk assessment process itself, becomes almost impossible to determine.
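To make that concrete, here is a minimal sketch of the kind of screen an examiner might run against a client extract, assuming the data has been loaded into a pandas DataFrame and that the expected field layout and documented code lists come from whatever documentation does exist. The names used here (the `expected_fields` set, the `documented_codes` dictionary, the 5 percent null threshold) are hypothetical placeholders, not a prescribed method.

```python
# A minimal anomaly screen over a client data extract, assuming a pandas
# DataFrame `df` and a documented field layout. All names and thresholds
# here are illustrative placeholders.
import pandas as pd

def screen_extract(df: pd.DataFrame, expected_fields: set,
                   documented_codes: dict) -> dict:
    """Flag the three anomaly types discussed above: missing data,
    field (schema) changes, and shifts in field content."""
    findings = {}

    # 1. Missing data: fields with an unusually high share of nulls.
    null_rates = df.isna().mean()
    findings["high_null_fields"] = null_rates[null_rates > 0.05].to_dict()

    # 2. Field definition changes: columns added or dropped relative
    #    to the last documented layout.
    actual = set(df.columns)
    findings["undocumented_fields"] = sorted(actual - expected_fields)
    findings["missing_fields"] = sorted(expected_fields - actual)

    # 3. Content changes: values appearing in coded fields that the
    #    documentation does not list (e.g., a new status code).
    unexpected = {}
    for field, known_codes in documented_codes.items():
        if field in df.columns:
            observed = set(df[field].dropna().unique())
            new_codes = observed - set(known_codes)
            if new_codes:
                unexpected[field] = sorted(new_codes)
    findings["undocumented_codes"] = unexpected
    return findings
```

A screen like this only surfaces the questions; answering them still depends on documentation, or on whoever still remembers the change.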
If our auditor or fraud examiner, operating under today's typical budget and time constraints, is not very thorough and fails even to find some of these anomalies, they may never be addressed. How many times as an analyst have you tried to explain something about the financial system that just doesn't look right (like apparently duplicate transactions), only to be told, "Oh, yeah. Pauline made that change back in February before she retired; we don't have too many details on it"? In other words, undocumented changes to transactions and data, the details of which now exist only in Pauline's head. When a data-driven system is built on incomplete information, the system can be said to have failed in its role as a component of overall fraud management. The cycle of incomplete information gets propagated into future decisions, and the cost of the missing or inadequately explained data can be high. What can't be seen can't be managed, or even explained.
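Those "apparently duplicate transactions" are the kind of thing a few lines of analysis can at least surface, even when only Pauline could explain them. A minimal sketch, assuming a pandas DataFrame of payment records; the key columns (payee_id, amount, payment_date) are hypothetical and would depend on the client's actual layout:

```python
# Flag apparent duplicate payments for follow-up; column names are
# hypothetical placeholders for whatever the client system uses.
import pandas as pd

def flag_apparent_duplicates(payments: pd.DataFrame) -> pd.DataFrame:
    """Return rows sharing the same payee, amount, and date, which
    deserve an explanation before being written off as routine."""
    key = ["payee_id", "amount", "payment_date"]
    dupes = payments[payments.duplicated(subset=key, keep=False)]
    # Sort so each suspect group reads as a block in the working papers.
    return dupes.sort_values(key)
```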
It's truly humbling for any practitioner to see how much critical financial information resides in the fading (or absent) memories of past and present key employees. As fraud examiners, we should foster a culture among our clients that supports concurrent, transaction-related documentation and consistent knowledge sharing for all systems, but especially for changes to critical financial systems. One added benefit of this approach, which I pointed out to one of my clients not long ago, is that it frees up a key employee's time for more productive fraud control projects instead of constantly serving as the encyclopedia for the rest of the operational staff.