Tag Archives: fraud analytics

Detailed Planning is the Key to Successful Fraud Auditing

CoffeeMugIt’s difficult to believe that it’s already October! The 2014 Chapter year has flown by and has been extremely good for all of us with twenty new members nationwide and three very successful live Chapter training events!

Something Dr. Doug Ziegenfuss said about fraud audit planning during our Ethics 2014 session for CPAs and Fraud Examiners struck me as worthy of a post. Considering the various techniques internal audit organizations use to fight fraud, Doug singled out specialty audit software like ACL as critical to the success of analytics-based programs and, indeed, to the success of the modern control assurance enterprise as a whole. But just having software isn't in itself enough. It's true that audit analytics can quickly examine large files and flag the digital markers of potentially fraudulent activity, helping auditors of all kinds work more effectively and efficiently. But any tool is only as effective as the planning for its use allows it to be.

As we fraud examiners are painfully aware and the news media daily attest, evidence of ongoing fraud often resides deep in an organization's data. Unfortunately, these schemes often go undetected for months or even years, draining ever more revenue from the organization. Dr. Doug's point is that this is often the case because of superficial audit planning. Development of a risk-based audit program is no easy task, but there is no alternative if the auditing effort is to be more than a light dusting of the control structure. Risk-based programs with an emphasis on analytics are built on descriptive system narratives, detailed workflow diagrams, and risk assessments; hard work on the front end but, if comprehensive, not as difficult to update and maintain over the long run as many auditors think.

As we’ve said in post after post on this blog, leveraging fraud identification technology should always be directed at the solving of the business problem of fraud revelation, control and eradication rather than at acquiring technology for technologies sake.  The effort requires a clear assessment of the entire audit life cycle of the organization to find ways to use technology to enable a reasonable level of measurable efficiency.  It’s up to the chief audit executive (CAE) and the audit committee (if there is one) to ensure that audit analytics are leveraged to achieve well defined goals built on a solid foundation of key risk measures.

Once data analytic targets are established, the specific analytic technology audit management has selected can be used to extract, scrub, and analyze data for a variety of anomalies and fraud scenarios. Any chosen analytic solution should provide independent access to source data, minimizing the need for the organization's IT department to intervene while protecting network integrity. As a key component of fraud audit strategy, the independent assurance effort should strive to include in each of its audits enough analytics-based tests to pinpoint such anomalies as segregation-of-duties conflicts, transactions modified to avoid approval or authorization, funds leakage, inappropriate payments, and a whole host of frauds involving abuse of corporate assets.
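As an illustration of what a few such analytics-based tests can look like, here is a minimal sketch in Python with pandas; the invoice fields, approval threshold, and data are purely hypothetical and stand in for whatever extract the chosen analytic tool pulls from the client's systems:

```python
import pandas as pd

# Hypothetical invoice extract; column names and values are illustrative assumptions.
invoices = pd.DataFrame({
    "invoice_id": [1001, 1002, 1003, 1004],
    "vendor": ["Acme", "Bolt Co", "Acme", "Delta"],
    "amount": [4_990.00, 12_500.00, 4_995.00, 800.00],
    "submitted": pd.to_datetime(
        ["2014-09-06", "2014-09-10", "2014-09-13", "2014-09-11"]),
    "vendor_address": ["PO Box 112", "14 Main St", "PO Box 112", "9 Elm Ave"],
})

APPROVAL_LIMIT = 5_000  # assumed authorization threshold

# Test 1: invoices submitted on a weekend (Saturday=5, Sunday=6).
weekend = invoices[invoices["submitted"].dt.dayofweek >= 5]

# Test 2: amounts just under the approval limit -- a classic marker of
# transactions structured to avoid approval or authorization.
near_limit = invoices[
    invoices["amount"].between(0.95 * APPROVAL_LIMIT, APPROVAL_LIMIT,
                               inclusive="left")]

# Test 3: vendor payments directed to post office boxes.
po_box = invoices[invoices["vendor_address"].str.contains(
    r"^P\.?O\.?\s*Box", case=False, regex=True)]
```

In a real engagement each test would run against the full population rather than a sample, which is precisely the advantage the analytic approach offers.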

The fact that every organization has unique data issues is another reason for instituting a program of long-range audit planning. As every auditor knows, a data series cannot be validated in a vacuum; it must be tied to another series to ensure its accuracy. Organizational data idiosyncrasies and patterns mean that data validation is crucial to the success of the audit analysis effort. This is where a Certified Fraud Examiner's (CFE's) experience with fraud audit analysis and audit technology becomes especially valuable to the organization's analytics program. Once the audit team has documented the nuances of the organization's data, an experienced CFE can assist the team in developing a fraud-indicator approach that weights audit test results based on their propensity for fraud. Transactions or vendors flagged in multiple tests, for example, rank as a higher review priority than a lower-risk anomaly that appears only once, such as an invoice submitted on a weekend or a vendor payment directed to a post office box.
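The weighting idea can be sketched in a few lines of plain Python; the test names, weights, and transaction IDs below are illustrative assumptions, not a prescribed scoring model:

```python
# Assumed per-test weights reflecting each test's propensity for fraud.
weights = {
    "duplicate_payment": 5,   # strong indicator on its own
    "missing_approval": 4,
    "po_box_vendor": 2,
    "weekend_invoice": 1,     # weak indicator when it appears alone
}

# Hypothetical results: one (transaction, test) pair per flag raised.
hits = [
    ("TX-100", "duplicate_payment"),
    ("TX-100", "missing_approval"),
    ("TX-100", "po_box_vendor"),
    ("TX-200", "weekend_invoice"),
]

# Sum the weights of every test that flagged each transaction.
scores = {}
for tx, test in hits:
    scores[tx] = scores.get(tx, 0) + weights[test]

# Transactions flagged by multiple higher-weight tests rise to the top
# of the review queue; single weak anomalies sink to the bottom.
review_queue = sorted(scores, key=scores.get, reverse=True)
```

Here TX-100, flagged by three tests, scores 11 and outranks TX-200, whose lone weekend-invoice flag scores 1.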

Detailed audit planning of analytics-supported reviews is the key to success for every organization eager to strengthen internal controls in the modern distributed computing environment. Fraud audit analytics minimizes sampling risk and promotes efficient, highly focused audit practices. Only if properly planned for can such anti-fraud solutions provide full-population visibility and the power to uncover small anomalies in a virtual ocean of data, casting a wider net to fight deeply buried instances of fraud, waste, and abuse more effectively.

Data Warehouses, OLAP & the Fraud Examiner


Since two of the topics to be discussed in our April 2014 Introduction to Fraud Examination seminar in Richmond next week will be client data warehousing and online analytical processing (OLAP), I thought I'd write a short post briefly introducing both concepts as practice tools for fraud examiners.

A fraud examiner laying out an investigation might well ask, "What's a data warehouse and how can it be of use to me in building the case at hand?" The short answer is that a client's data warehouse holds formatted, managed client data stored outside the client's operational information systems. The originating idea behind the warehouse concept is that data stored to answer all sorts of analytical questions about the business can be accessed most effectively when it is separated from the enterprise's operational systems. What's the point of separating operational data from analytical data? Most audit practitioners today can remember that, not too long ago, our clients archived inactive data onto tapes and ran whatever analytical reports they chose against those tapes, primarily to lessen the day-to-day performance impact on their important operational systems.

In today’s far more complex data handing environment, the reason for the separation is that there’s just far more data of  different types to be analyzed, all available at once and requiring processing at different  frequencies and levels,  for a seemingly ever expanding roster of purposes. The last decade has demonstrated that the data warehouse concept operates most successfully for those organizations which can combine data from multiple business processes such as marketing, sales and production systems into an easily updated and maintained location (such as the cloud), accessible by all authorized user stakeholders, both internal and external to the organization. Source applications feed the warehouse incrementally and map the transfer trace allowing fraud examiners and other control assurance professionals to perform transaction cross-referencing and data filtering; the fraud examiner profiling the data flows related to a fraud scenario can generate case related queries  for a given week, month, quarter or a year and (this is most important) compare financial transaction data flows based the on-going historical status (old and updated) of the same and related applications.

An important point for fraud examiners to be aware of is that, often, access to low-end data analysis tools with simple query capabilities may be all that's required to assist in constructing even relatively complex fraud cases. For examiners who choose to broaden their practice to handle still more complex investigations, powerful multi-dimensional tools are increasingly available. One of these is online analytical processing (OLAP), based on the relational database concept embodied by most information systems database applications today.

Think of a cloud-based data warehouse overlaid by a complex, very large spreadsheet (the OLAP application) allowing the examiner to perform queries, searches, pivots, calculations, and a vast array of other data manipulations across multiple dimensional pages. Imagine being able to flip a complex database on its side and examine all of the data from that different perspective, or to highlight an individual data element and then drill down to examine and trace the foundational data that went into creating that item of interest.

Today’s increasingly cloud based OLAP systems support multi-dimensional conceptual (what-if) views of the underlying data allowing for calculations and modeling to be applied across multiple dimensions, through hierarchies, and across database elements.  Amazingly, advanced tools are presently available to allow OLAP-based analysis across eight to ten different dimensions.  Of special interest to investigators is the OLAP’s ability to perform detailed analysis of trends and fluctuations in transactional data while laying bare the supporting information rolling up to the trend or fluctuation.

Fraud examiners and other assurance professionals should be generally aware that various software products on the market today can be used to perform OLAP functions. As we'll cover in the seminar, client organizations usually implement OLAP in a multi-user client/server mode with the aim of offering authorized users rapid responses to queries, regardless of the size and complexity of the internal or cloud-based underlying warehouse. OLAP can help fraud examiners and other users summarize targeted client information for investigation through comparative, personalized viewing as well as through analysis of historical and projected data in various what-if data model scenarios.

Continuous Auditing versus Continuous Monitoring in Fraud Prevention Programs

The efficacy of modern fraud prevention programs has been vastly improved by advances in data mining, analytics, and the near-ubiquitous cloud-based storage and availability of client transactional data. These advances, however, have been accompanied by some confusion among fraud prevention professionals about incorporating the new tools into an effective, risk-based prevention program. Three common sources of confusion usually arise during the implementation of analytically supported fraud prevention schemes. First is the confusion between the continuous monitoring of transactions (made possible by data mining and analytics coupled with enterprise risk management approaches for identifying high-risk business processes) and continuous auditing for fraud. Second is the need to understand the role of continuous auditing for fraud in high-risk business processes as a meta control (i.e., a control of controls). Third is the concern about separation of duties (i.e., who will do what when actual instances of suspected fraud are identified by the process).

The continuous, analytically based monitoring of high-risk business processes found to be especially vulnerable to pre-identified fraud scenarios is a dynamic process: the fraud examiner/auditor can turn analytical procedures on and off by reconfiguring tests based on which fraud scenarios and accompanying risk levels s/he feels are presently most active as threats. By continuously monitoring particular, configurable high-risk items, continuous testing for the presence of likely fraud scenarios constitutes a wholly new control level, acting as a meta control. For example, a bank's analytically based loan transaction system can detect the presence of a suspected component of a fraud scenario and, under pre-specified circumstances, issue an alarm to the bank manager's supervisor as loans to a given customer exceed pre-authorized levels. This fraud prevention measure thus increases the number of configurable controls (e.g., choosing whether and when to issue an alarm) by going past simple continuous monitoring all the way to continuous auditing/testing and subsequent management alert.
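A minimal sketch of such a continuous-audit alarm in Python; the pre-authorized limit, customer IDs, and escalation target are all assumptions made up for illustration, not a real bank's configuration:

```python
# Assumed per-customer loan ceiling for this customer class.
PRE_AUTHORIZED_LIMIT = 250_000

def check_loan_exposure(loans_by_customer, limit=PRE_AUTHORIZED_LIMIT):
    """Return alert records for customers whose total loans exceed the limit."""
    alerts = []
    for customer, loans in loans_by_customer.items():
        total = sum(loans)
        if total > limit:
            # In production this record would drive a notification to the
            # branch manager's supervisor, as in the bank example above.
            alerts.append({"customer": customer, "exposure": total,
                           "escalate_to": "manager_supervisor"})
    return alerts

# Hypothetical nightly run over current loan balances.
alerts = check_loan_exposure({"C-1": [100_000, 200_000], "C-2": [50_000]})
```

The alarm itself is configurable: changing the limit, or the condition under which an alert record is emitted, reconfigures the meta control without touching the underlying loan system.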

Implementing this type of approach to fraud prevention generally means taking the following steps:

—identify the client's high-risk business processes for scenario testing. The choice of high-risk business processes should be integrated into the annual fraud prevention plan and the enterprise risk management (ERM) annual review. This exercise should also be integrated with other compliance plans (for example, with the internal audit annual plan, if there is one).

—identify rules to guide the analytically based fraud scenario testing; these rules need to be programmed, repeated frequently, and reconfigured when needed. As an example, a financial institution might define a critical component of a given fraud scenario and, in response, monitor all checking accounts nightly by extracting files that meet the criterion of having a debit balance 20 percent larger than the loan threshold for a certain type of customer.

—determine the frequency of testing for the critical fraud scenarios and related business processes. The chosen frequency has to depend on the natural rhythm of the subject business process, including the timing of computer and business activities and the availability to the client of fraud examiners and auditors experienced with the underlying fraud scenario.

—perform cost-benefit analysis; only the highest-risk business processes vulnerable to a given frequently occurring fraud scenario should be continuously tested. Once the threat is determined to have subsided (perhaps through the application or tightening of prevention controls), shut the continuous testing down as no longer cost effective.

—put mechanisms in place to communicate positive testing results to business owners; the communication must be independent, objective, and consistent. All parties who will address elements of the suspected fraud, and whose role requires taking some pre-defined action under the identified fraud scenario, must be informed.
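The nightly extraction rule mentioned in the steps above might look something like this sketch; the loan threshold, the 20 percent trigger, and the account fields are assumptions standing in for whatever the institution has actually defined:

```python
# Assumed loan threshold for this customer class, and the rule's trigger:
# a debit balance 20 percent larger than the threshold.
LOAN_THRESHOLD = 10_000
TRIGGER = 1.20 * LOAN_THRESHOLD

# Hypothetical nightly extract of checking account balances.
accounts = [
    {"account": "A-1", "debit_balance": 12_500},
    {"account": "A-2", "debit_balance": 9_000},
    {"account": "A-3", "debit_balance": 12_000},
]

# Nightly job: extract the accounts meeting the rule for examiner review.
flagged = [a for a in accounts if a["debit_balance"] > TRIGGER]
```

Because the rule is just data and a comparison, reconfiguring it as threats shift (a different threshold, a different percentage, a different customer class) is a matter of changing parameters rather than rebuilding the test.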

The evolution of fraud prevention programs to incorporate analytically based fraud evaluation and examination testing on a continuous or near-continuous basis is a giant step for the fraud examination and auditing professions. This evolution will take time, substantial attention from senior management, and additional costs and resources as continuous fraud auditing activities are implemented and extended; these efforts will have a lasting effect on the future of both professions.