Tag Archives: data analytics

Bye-Bye Money

Miranda was responsible for preparing personnel files for new hires, approving wages, verifying time cards, and distributing payroll checks. She “hired” fictitious employees, faked their records, and ordered checks through the payroll system. She deposited some checks in several personal bank accounts and cashed others, endorsing all of them with the names of the fictitious employees and her own. Her company’s payroll function created a large paper trail of transactions, including individual earnings records, W-2 tax forms, payroll deductions for taxes and insurance, and Form 941 payroll tax reports. She mailed all the W-2 forms to the same post office box.

Miranda stole $160,000 by creating some “ghosts,” usually 3 to 5 out of 112 people on the payroll and paying them an average of $650 per week for three years. Sometimes the ghosts quit and were later replaced by others. But she stole “only” about 2 percent of the payroll funds during the period.

A tip from a fellow employee received by the company hotline resulted in the engagement of Tom Hudson, CFE. Tom’s objective was to obtain evidence of the existence and validity of payroll transactions on the control premise that different people should be responsible for hiring (preparing personnel files), approving wages, and distributing payroll checks. “Thinking like a crook” led Tom to readily see that Miranda could put people on the payroll and obtain their checks just as the hotline caller alleged. In his test of controls Tom audited for transaction authorization and validity. In this case random sampling was less likely to work because of the small number of alleged ghosts. So, Tom looked for the obvious. He selected several weeks’ check blocks, accounted for numerical sequence (to see whether any checks had been removed), and examined canceled checks for two endorsements.
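
Tom’s accounting for numerical sequence lends itself to a simple gap test. The sketch below (in Python, with hypothetical data and a hypothetical function name) flags check numbers absent from an otherwise continuous block:

```python
def find_missing_checks(check_numbers):
    """Return check numbers missing from an otherwise continuous sequence."""
    present = set(check_numbers)
    return sorted(n for n in range(min(present), max(present) + 1)
                  if n not in present)

# Hypothetical block of issued checks with two removed from the sequence
issued = [1001, 1002, 1003, 1005, 1006, 1008, 1009, 1010]
print(find_missing_checks(issued))  # → [1004, 1007]
```

Any gap in the sequence points to a check that should be traced to a voided-check file or explained by the client.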

Tom reasoned that there may be no “balance” to audit for existence/occurrence, other than the accumulated total of payroll transactions, and that the total might not appear out of line with history because the tipster had indicated that the fraud was small in relation to total payroll and had been going on for years. He decided to conduct a surprise payroll distribution, then follow up by examining prior canceled checks for the missing employees and scanning personnel files for common addresses.
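
The scan of personnel files for common addresses reduces to a grouping pass over employee records. A minimal sketch, with illustrative names and addresses (note the normalization, since ghosts’ addresses rarely match character for character):

```python
from collections import defaultdict

def common_addresses(employees):
    """Group employee names by normalized mailing address; keep addresses shared by 2+."""
    groups = defaultdict(list)
    for name, address in employees:
        groups[address.strip().lower()].append(name)
    return {addr: names for addr, names in groups.items() if len(names) > 1}

staff = [
    ("A. Real", "12 Oak St"),
    ("B. Ghost", "PO Box 99"),
    ("C. Ghost", "po box 99 "),   # same box, different formatting
    ("D. Real", "47 Elm Ave"),
]
print(common_addresses(staff))  # → {'po box 99': ['B. Ghost', 'C. Ghost']}
```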

Both the surprise distribution and the scan for common addresses quickly provided the names of 2 or 3 exceptions. Both led to prior canceled checks (which Miranda had not removed and the bank reconciler had not noticed), which carried Miranda’s own name as endorser. Confronted, she confessed.

The major risks in any payroll business cycle are:

• Paying fictitious “employees” (invalid transactions, employees do not exist);

• Overpaying for time or production (inaccurate transactions, improper valuation);

• Incorrect accounting for costs and expenses (incorrect classification, improper or inconsistent presentation and disclosure).

The assessment of payroll system control risk normally takes on added importance because most companies have fairly elaborate and well-controlled personnel and payroll functions. The transactions in this cycle are numerous during the year yet result in relatively small balance sheet amounts at year-end. Therefore, in most routine outside-auditor engagements, the review of controls, tests of controls, and audit of transaction details constitute the major portion of the evidence gathered for these accounts. On most annual audits, the substantive procedures devoted to auditing the payroll-related account balances are very limited, a condition that increases fraud risk.

Control procedures for proper segregation of responsibilities should be in place and operating. Proper segregation involves authorization (personnel department hiring and firing, pay rate and deduction authorizations) by persons who do not have payroll preparation, paycheck distribution, or reconciliation duties. Payroll distribution (custody) is in the hands of persons who do not authorize employees’ pay rates or time, nor prepare the payroll checks. Recordkeeping is performed by payroll and cost accounting personnel who do not make authorizations or distribute pay. Combinations of two or more of the duties of authorization, payroll preparation and recordkeeping, and payroll distribution in one person, one office, or one computerized system may open the door for errors and frauds. In addition, the control system should provide for detail control checking activities.  For example: (1) periodic comparison of the payroll register to the personnel department files to check hiring authorizations and for terminated employees not deleted, (2) periodic rechecking of wage rate and deduction authorizations, (3) reconciliation of time and production paid to cost accounting calculations, (4) quarterly reconciliation of YTD earnings records with tax returns, and (5) payroll bank account reconciliation.
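Detail control check (1), comparing the payroll register to the personnel department files, is at heart a two-way set comparison. A minimal sketch with hypothetical employee IDs:

```python
def reconcile_payroll(payroll_ids, personnel_ids):
    """Compare IDs on the payroll register against active personnel files."""
    payroll, personnel = set(payroll_ids), set(personnel_ids)
    return {
        "paid_but_not_on_file": sorted(payroll - personnel),   # possible ghosts
        "on_file_but_not_paid": sorted(personnel - payroll),   # terminations not processed?
    }

result = reconcile_payroll([101, 102, 103, 999], [101, 102, 103, 104])
print(result["paid_but_not_on_file"])  # → [999]
```

Each exception in either direction warrants follow-up: an unmatched payroll ID may be a ghost, while an unmatched personnel record may signal a terminated employee not yet deleted.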

Payroll can amount to 40 percent or more of an organization’s total annual expenditures. Payroll taxes, Social Security, Medicare, pensions, and health insurance can add several percentage points in variable costs on top of wages. So, for every payroll dollar saved through forensic identification, bonus savings arise automatically from the on-top costs calculated on base wages. Different industries will exhibit different payroll risk profiles. For example, firms whose culture involves salaried employees who work longer hours may have a lower risk of payroll fraud and may not warrant a full forensic approach. Organizations may present greater opportunity for payroll fraud if their workforce patterns entail night shift work, variable shifts or hours, 24/7 on-call coverage, and employees who are mobile, unsupervised, or work across multiple locations. Payroll-related risks include over-claimed allowances, overuse of extra pay for weekend or public holiday work, fictitious overtime, vacation and sick leave taken but not deducted from leave balances, continued payment of employees who have left the organization, ghost employees arising from poor segregation of duties, the vulnerability of data output sent to the bank for electronic payment, and roster dysfunction. Yet the personnel assigned to administer the complexities of payroll are often qualified more by experience than by formal finance, legal, or systems training, creating a competency bias in how payroll is managed. On top of that, payroll is normally shrouded in secrecy because of the inherently private nature of employee and executive pay. Underpayment errors are more likely to be corrected, because the affected employees complain; overpayment errors are less likely to be discovered, because overpaid employees stay silent. These systemic biases further increase the risk of unnoticed payroll error and fraud.

Payroll data analysis can reveal individuals or entire teams who are unusually well-remunerated because team supervisors turn a blind eye to payroll malpractice, as well as low-remunerated personnel who represent excellent value to the organization. For example, it can identify the night shift worker who is paid extra for weekend or holiday work plus overtime while actually working only half the contracted hours, or workers who claim higher duty or tool allowances to which they are not entitled. In addition to providing management with new insights into payroll behaviors, which may in turn become part of ongoing management reporting, the total payroll cost distribution analysis can point forensic accountants toward urgent payroll control improvements.
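One simple form of payroll cost distribution analysis flags individuals whose pay deviates sharply from the team mean. The sketch below uses a basic z-score test; the data and the two-standard-deviation threshold are illustrative assumptions, not a prescribed standard:

```python
import statistics

def pay_outliers(pay_by_employee, threshold=2.0):
    """Flag employees whose pay sits >= threshold standard deviations from the mean."""
    amounts = list(pay_by_employee.values())
    mean = statistics.mean(amounts)
    sd = statistics.pstdev(amounts)
    return [emp for emp, amt in pay_by_employee.items()
            if sd and abs(amt - mean) / sd >= threshold]

weekly_pay = {"E01": 900, "E02": 920, "E03": 940, "E04": 910, "E05": 930,
              "E06": 905, "E07": 935, "E08": 915, "E09": 2400}
print(pay_outliers(weekly_pay))  # → ['E09']
```

An outlier is not proof of fraud; it is a pointer telling the examiner where to look first.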

The detail inside payroll and personnel databases can reveal hidden information to the forensic examiner. Who are the highest earners of overtime pay and why? Which employees gained the most from weekend and public holiday pay? Who consistently starts late? Finishes early? Who has the most sick leave? Although most employees may perform a fair day’s work, the forensic analysis may point to those who work less, sometimes considerably less, than the time for which they are paid. Joined-up query combinations to search payroll and human resources data can generate powerful insights into the organization’s worst and best outliers, which may be overlooked by the data custodians. An example of a query combination would be: employees with high sick leave + high overtime + low performance appraisal scores + negative disciplinary records. Or, reviewers could invert those factors to find the unrecognized exemplary performers.
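The query combination described above can be sketched as a filter over joined payroll and HR records; the field names and thresholds below are hypothetical:

```python
def combined_query(records, min_sick=80, min_overtime=200, max_appraisal=2):
    """High sick leave + high overtime + low appraisal + negative disciplinary record."""
    return [r["id"] for r in records
            if r["sick_hours"] > min_sick
            and r["overtime_hours"] > min_overtime
            and r["appraisal"] <= max_appraisal
            and r["disciplinary_actions"] > 0]

records = [
    {"id": "E14", "sick_hours": 96, "overtime_hours": 310, "appraisal": 1, "disciplinary_actions": 2},
    {"id": "E27", "sick_hours": 24, "overtime_hours": 350, "appraisal": 4, "disciplinary_actions": 0},
    {"id": "E33", "sick_hours": 90, "overtime_hours": 150, "appraisal": 2, "disciplinary_actions": 1},
]
print(combined_query(records))  # → ['E14']
```

Inverting the comparisons (low sick leave, high appraisal, clean record) yields the complementary list of unrecognized exemplary performers.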

Where predication suggests fraud concerns about identified employees, CFEs can add value by triangulating time sheet claims against external data sources such as site access biometric data, company cell phone logs, phone number caller identification, GPS data, company email, Internet usage, company motor fleet vehicle tolls, and vehicle refueling data, most of which contain useful date and time-of-day parameters.  The data buried within these databases can reveal employee behavior, including what they were doing, where they were, and who they were interacting with throughout the work day.
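Triangulating time sheet claims against site access data can be sketched as a day-by-day comparison of hours claimed versus hours actually badged on site; the data and one-hour tolerance below are illustrative:

```python
def timesheet_gaps(claimed, badged, tolerance=1.0):
    """Days where hours claimed exceed badge-derived hours on site by more than tolerance."""
    return {day: (claimed[day], badged.get(day, 0.0))
            for day in claimed
            if claimed[day] - badged.get(day, 0.0) > tolerance}

claimed = {"2017-05-01": 10.0, "2017-05-02": 8.0, "2017-05-03": 9.5}
badged = {"2017-05-01": 4.5, "2017-05-02": 7.8}  # no badge record at all on 05-03
print(timesheet_gaps(claimed, badged))
```

The same comparison generalizes to any of the external sources listed above (cell phone logs, GPS, vehicle tolls) once each is reduced to date and time-of-day parameters.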

Common findings include:

–Employees who leave work wrongfully during their shift;
–Employees who work fewer hours and take sick time during the week to shift the workload to weekends and public holidays to maximize pay;
–Employees who use company property excessively for personal purposes during working hours;
–Employees who visit vacation destinations while on sick leave;
–Employees who take leave but whose managers do not log the paperwork, thereby not deducting leave taken and overstating leave balances;
–Employees who moonlight in businesses on the side during normal working hours, sometimes using the organization’s equipment to do so.

Well-researched and documented forensic accounting fieldwork can support management action against those who may have defrauded the organization or work teams that may be taking inappropriate advantage of the payroll system. Simultaneously, CFEs and forensic accountants, working proactively, can partner with management to recover historic costs, quantify future savings, reduce reputational and political risk, improve the organization’s anti-fraud policies, and boost the productivity and morale of employees who knew of wrongdoing but felt powerless to stop it.

The Who, the What, the When

CFEs and forensic accountants are seekers. We spend our days searching for the most relevant information about our client-requested investigations from an ever-growing and increasingly tangled data sphere and trying to make sense of it. Somewhere hidden in our client’s computers, networks, databases, and spreadsheets are signs of the alleged fraud, accompanying control weaknesses and unforeseen risks, as well as possible opportunities for improvement. And the more data the client organization has, the harder all this is to find. Although most computer-assisted forensic audit tests focus on the numeric data contained within structured sources, such as financial and transactional databases, unstructured or text-based data, such as e-mail, documents, and Web-based content, represents an estimated 80 percent of enterprise data within the typical medium to large-sized organization. When assessing written communications or correspondence about fraud related events, CFEs often find themselves limited to reading large volumes of data, with few automated tools to help synthesize, summarize, and cluster key information points to aid the investigation.

Text analytics is a relatively new investigative tool for CFEs in actual practice, although some report having used it extensively for at least the last five years. According to the ACFE, the software itself stems from a combination of developments in our sister fields of litigation support and electronic discovery, from counterterrorism and surveillance technology, from customer relationship management, and from research into the life sciences, specifically artificial intelligence. So, the application of text analytics in data review and criminal investigations dates to the mid-1990s.

Generally, CFEs increasingly use text analytics to examine three main elements of investigative data: the who, the what, and the when.

The Who: According to many recent studies, well over half of businesspeople prefer e-mail to the telephone. Most fraud related business transactions or events, then, will likely have at least some e-mail communication associated with them. Unlike telephone messages, e-mail contains rich metadata, information stored about the data, such as its author, origin, version, and date accessed, and can be documented easily. For example, to monitor who is communicating with whom in a targeted sales department, and conceivably to identify whether any alleged relationships therein might signal anomalous activity, a forensic accountant might wish to analyze metadata in the “to,” “from,” “cc,” or “bcc” fields in departmental e-mails. Many technologies for parsing e-mail with text analytics capabilities are available on the market today, some stemming from civil investigations and related electronic discovery software. These technologies are similar to the social network diagrams used in law enforcement or in counterterrorism efforts.
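
Monitoring who is communicating with whom from the “to,” “from,” “cc,” and “bcc” fields reduces to counting sender-recipient pairs. A minimal sketch over already-parsed metadata (the field layout and names are assumptions for illustration):

```python
from collections import Counter

def correspondent_pairs(messages):
    """Count (sender, recipient) pairs across the to/cc/bcc metadata fields."""
    pairs = Counter()
    for msg in messages:
        for field in ("to", "cc", "bcc"):
            for recipient in msg.get(field, []):
                pairs[(msg["from"], recipient)] += 1
    return pairs

mail = [
    {"from": "miranda", "to": ["payroll"], "cc": ["hr"]},
    {"from": "miranda", "to": ["payroll"]},
]
print(correspondent_pairs(mail).most_common(1))  # → [(('miranda', 'payroll'), 2)]
```

The resulting pair counts are exactly the edge weights of the social network diagrams mentioned above.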

The What: The ever-present ambiguity inherent in human language presents significant challenges to the forensic investigator trying to understand the circumstances and actions surrounding the text-based aspects of a fraud allegation. This difficulty is compounded by the tendency of people within organizations to invent their own words or to communicate in code. Language ambiguity can be illustrated by examining the word “shred.” A simple keyword search on the word might return not only documents that contain text about shredding a document, but also those where two sports fans are having a conversation about “shredding the defense,” or even e-mails between spouses about eating Chinese “shredded pork” for dinner. Hence, e-mail research analytics seeks to group similar documents according to their semantic context so that documents about shredding as concealment or related to covering up an action would be grouped separately from casual e-mails about sports or dinner, thus markedly reducing the volume of e-mail requiring closer manual review. Concept-based analysis goes beyond traditional search technology by enabling users to group documents according to a statistical inference about the co-occurrence of similar words. In effect, text analytics software allows documents to describe themselves and group themselves by context, as in the shred example. Because text analytics examines document sets and identifies relationships between documents according to their context, it can produce far more relevant results than traditional simple keyword searches.
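
Grouping by word co-occurrence can be illustrated with a toy similarity measure. Commercial text analytics software is far more sophisticated, but the sketch below shows how the two “shred” e-mails separate from the sports e-mail on shared vocabulary alone:

```python
import math
from collections import Counter

def cosine(a, b):
    """Cosine similarity between two word-count vectors."""
    dot = sum(a[w] * b[w] for w in a if w in b)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def group_by_context(docs, threshold=0.3):
    """Greedily assign each document to the first group whose seed it resembles."""
    vectors = [Counter(doc.lower().split()) for doc in docs]
    groups = []  # list of (seed_vector, member_indices)
    for i, vec in enumerate(vectors):
        for seed, members in groups:
            if cosine(seed, vec) >= threshold:
                members.append(i)
                break
        else:
            groups.append((vec, [i]))
    return [members for _, members in groups]

docs = [
    "please shred the audit documents before friday",
    "shred those invoice documents tonight",
    "their defense got shredded in the second half",
]
print(group_by_context(docs))  # → [[0, 1], [2]]
```

The shared terms “shred” and “documents” pull the first two messages together, while the sports message shares almost nothing with them and falls into its own group.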

Using text analytics before filtering with keywords can be a powerful strategy for quickly understanding the content of a large corpus of unstructured, text-based data, and for determining what is relevant to the search. After viewing concepts at an elevated level, subsequent keyword selection becomes more effective by enabling users to better understand the possible code words or company-specific jargon. They can develop the keywords based on actual content, instead of guessing relevant terms, words, or phrases up front.

The When: In striving to understand the time frames in which key events took place, CFEs often need to not only identify the chronological order of documents (e.g., sorted by or limited to dates), but also link related communication threads, such as e-mails, so that similar threads and communications can be identified and plotted over time. A thread comprises a set of messages connected by various relationships; each message consists of either a first message or a reply to or forwarding of some other message in the set. Messages within a thread are connected by relationships that identify notable events, such as a reply vs. a forward, or changes in correspondents. Quite often, e-mails accumulate long threads with similar subject headings, authors, and message content over time. These threads ultimately may lead to a decision, such as approval to proceed with a project or to take some other action. The approval may be critical to understanding business events that led up to a particular journal entry. Seeing those threads mapped over time can be a powerful tool when trying to understand the business logic of a complex financial transaction.
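Thread reconstruction from already-parsed message metadata can be sketched by walking each message’s reply reference back to its root; the field names below are assumptions for illustration:

```python
def build_threads(messages):
    """Group messages into threads by walking In-Reply-To references back to the root."""
    by_id = {m["id"]: m for m in messages}

    def root(m):
        while m.get("in_reply_to") in by_id:
            m = by_id[m["in_reply_to"]]
        return m["id"]

    threads = {}
    for m in sorted(messages, key=lambda m: m["date"]):  # chronological order
        threads.setdefault(root(m), []).append(m["id"])
    return threads

mail = [
    {"id": "m1", "in_reply_to": None, "date": "2017-01-02"},
    {"id": "m2", "in_reply_to": "m1", "date": "2017-01-03"},
    {"id": "m4", "in_reply_to": None, "date": "2017-01-04"},
    {"id": "m3", "in_reply_to": "m2", "date": "2017-01-05"},
]
print(build_threads(mail))  # → {'m1': ['m1', 'm2', 'm3'], 'm4': ['m4']}
```

Each thread, already in date order, can then be plotted on a timeline against the journal entries or business events under examination.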

In the context of fraud risk, text analytics can be particularly effective when threads and keyword hits are examined with a view to the familiar fraud triangle: the premise that all three components (incentive/pressure, opportunity, and rationalization) are present when fraud exists. This fraud-triangle-based analysis can be applied in a variety of business contexts where increases in the frequency of certain keywords related to incentive/pressure, opportunity, and rationalization can indicate an increased level of fraud risk.
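
A rudimentary version of this keyword-frequency analysis simply counts hits per leg of the triangle. The keyword lists below are illustrative only, not a vetted lexicon:

```python
TRIANGLE_KEYWORDS = {
    "incentive/pressure": {"deadline", "quota", "target", "numbers"},
    "opportunity": {"override", "password", "access", "nobody"},
    "rationalization": {"deserve", "owed", "borrow", "temporary"},
}

def triangle_hits(text):
    """Count keyword hits per leg of the fraud triangle in a body of text."""
    words = [w.strip(".,;:!?") for w in text.lower().split()]
    return {leg: sum(w in keywords for w in words)
            for leg, keywords in TRIANGLE_KEYWORDS.items()}

note = "Nobody checks this override, and I deserve it; the deadline broke me."
print(triangle_hits(note))
```

A rise over time in hits across all three legs, rather than any single keyword match, is what signals elevated fraud risk worth investigating.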

Some caveats are in order.  Considering the overwhelming amount of text-based data within any modern enterprise, assurance professionals could never hope to analyze all of it; nor should they. The exercise would prove expensive and provide little value. Just as an external auditor would not reprocess or validate every sales transaction in a sales journal, he or she would not need to look at every related e-mail from every employee. Instead, any professional auditor would take a risk-based approach, identifying areas to test based on a sample of data or on an enterprise risk assessment. For text analytics work, the reviewer may choose data from five or ten individuals to sample from a high-risk department or from a newly acquired business unit. And no matter how sophisticated the search and information retrieval tools used, there is no guarantee that all relevant or high-risk documents will be identified in large data collections. Moreover, different search methods may produce differing results, subject to a measure of statistical variation inherent in probability searches of any type. Just as a statistical sample of accounts receivable or accounts payable in the general ledger may not identify fraud, analytics reviews are similarly limited.

Text analytics can be a powerful fraud examination tool when integrated with traditional forensic data-gathering and analysis techniques such as interviews, independent research, and existing investigative tests involving structured, transactional data. For example, an anomaly identified in the general ledger related to the purchase of certain capital assets may prompt the examiner to review e-mail communication traffic among the key individuals involved, providing context around the circumstances and timing of events before the entry date. Furthermore, the forensic accountant may conduct interviews or perform additional independent research that may support or conflict with his or her investigative hypothesis. Integrating all three of these components to gain a complete picture of the fraud event can yield valuable information. While text analytics should never replace the traditional rules-based analysis techniques that focus on the client’s financial accounting systems, it is equally important to consider the communications surrounding key events, which are typically found in unstructured data rather than in the financial systems.

RVACFES May 2017 Event Sold-Out!

On May 17th and 18th the Central Virginia ACFE Chapter and our partners, the Virginia State Police and the Association of Certified Fraud Examiners (ACFE), were joined by an overflow crowd of audit and assurance professionals for the ACFE’s training course ‘Conducting Internal Investigations’. The sold-out May 2017 seminar was the ninth that our Chapter has hosted over the years with the Virginia State Police utilizing a distinguished list of certified ACFE instructor-practitioners.

Our internationally acclaimed instructor for the May seminar was Gerard Zack, CFE, CPA, CIA, CCEP. Gerry has provided fraud prevention and investigation, forensic accounting, and internal and external audit services for more than 30 years. He has worked with commercial businesses, not-for-profit organizations, and government agencies throughout North America and Europe. Prior to starting his own practice in 1990, Gerry was an audit manager with a large international public accounting firm. As founder and president of Zack, P.C., he has led numerous fraud investigations and designed customized fraud risk management programs for a diverse client base. Through Zack, P.C., he also provides outsourced internal audit services, compliance and ethics programs, enterprise risk management, fraud risk assessments, and internal control consulting services.

Gerry is a Certified Fraud Examiner (CFE) and Certified Public Accountant (CPA) and has focused most of his career on audit and fraud-related services. Gerry serves on the faculty of the Association of Certified Fraud Examiners (ACFE) and is the 2009 recipient of the ACFE’s James Baker Speaker of the Year Award. He is also a Certified Internal Auditor (CIA) and a Certified Compliance and Ethics Professional (CCEP).

Gerry is the author of Financial Statement Fraud: Strategies for Detection and Investigation (published 2013 by John Wiley & Sons), Fair Value Accounting Fraud: New Global Risks and Detection Techniques (2009 by John Wiley & Sons), and Fraud and Abuse in Nonprofit Organizations: A Guide to Prevention and Detection (2003 by John Wiley & Sons). He is also the author of numerous articles on fraud and teaches seminars on fraud prevention and detection for businesses, government agencies, and nonprofit organizations. He has provided customized internal staff training on specialized auditing issues, including fraud detection in audits, for more than 50 CPA firms.

Gerry is also the founder of the Nonprofit Resource Center, through which he provides antifraud training and consulting and online financial management tools specifically geared toward the unique internal control and financial management needs of nonprofit organizations. Gerry earned his M.B.A at Loyola University in Maryland and his B.S.B.A at Shippensburg University of Pennsylvania.

To some degree, organizations of every size, in every industry, and in every city, experience internal fraud. No entity is immune. Furthermore, any member of an organization can carry out fraud, whether it is committed by the newest customer service employee or by an experienced and highly respected member of upper management. The fundamental reason for this is that fraud is a human problem, not an accounting problem. As long as organizations are employing individuals to perform business functions, the risk of fraud exists.

While some organizations aggressively adopt strong zero tolerance anti-fraud policies, others simply view fraud as a cost of doing business. Despite varying views on the prevalence of, or susceptibility to, fraud within a given organization, all must be prepared to conduct a thorough internal investigation once fraud is suspected. Our ‘Conducting Internal Investigations’ event was structured around the process of investigating any suspected fraud from inception to final disposition and beyond.

What constitutes an act that warrants an examination can vary from one organization to another and from jurisdiction to jurisdiction. It is often resolved based on a definition of fraud adopted by an employer or by a government agency. There are numerous definitions of fraud, but a popular example comes from the joint ACFE-COSO publication, Fraud Risk Management Guide:

Fraud is any intentional act or omission designed to deceive others, resulting in the victim suffering a loss and/or the perpetrator achieving a gain.

However, many law enforcement agencies have developed their own definitions, which might be more appropriate for organizations operating in their jurisdictions. Consequently, fraud examiners should determine the appropriate legal definition in the jurisdiction in which the suspected offense was committed.

Fraud examination is a methodology for resolving fraud allegations from inception to disposition. More specifically, fraud examination involves:

–Assisting in the detection and prevention of fraud;
–Initiating the internal investigation;
–Obtaining evidence and taking statements;
–Writing reports;
–Testifying to findings.

A well-run internal investigation can enhance a company’s overall well-being and can help detect the source of lost funds, identify responsible parties, and recover losses. It can also provide a defense to legal charges by terminated or disgruntled employees. But perhaps most importantly, an internal investigation can signal to every company employee that the company will not tolerate fraud.

Our two-day seminar agenda included Gerry’s in depth look at the following topics:

–Assessment of the risk of fraud within an organization and responding when it is identified;
–Detection and investigation of internal frauds with the use of data analytics;
–The collection of documents and electronic evidence needed during an investigation;
–The performance of effective information gathering and admission seeking interviews;
–The wide variety of legal and regulatory concerns related to internal investigations.

Gerry did his usual tremendous job in preparing the professionals in attendance to deal with every step in an internal fraud investigation, from receiving the initial allegation to testifying as a witness. The participants learned to lead an internal investigation with accuracy and confidence by gaining knowledge about topics such as the relevant legal aspects impacting internal investigations, the use of computers and analytics during the investigation, collecting and analyzing internal and external information, interviewing witnesses, and writing effective reports.

Analytics & Fraud Prevention

During our Chapter’s live training event last year, ‘Investigating on the Internet’, our speaker Liseli Pennings, pointed out that, according to the ACFE’s 2014 Report to the Nations on Occupational Fraud and Abuse, organizations that have proactive, internet oriented, data analytics in place have a 60 percent lower median loss because of fraud, roughly $100,000 lower per incident, than organizations that don’t use such technology. Further, the report went on, use of proactive data analytics cuts the median duration of a fraud in half, from 24 months to 12 months.

This is important news for CFEs who are daily confronting more sophisticated frauds and criminals who are increasingly cyber-based. It means that integrating more mature forensic data analytics capabilities into a fraud prevention and compliance monitoring program can improve risk assessment, detect potential misconduct earlier, and enhance investigative field work. Moreover, forensic data analytics is a key component of effective fraud risk management as described in The Committee of Sponsoring Organizations of the Treadway Commission’s most recent Fraud Risk Management Guide, issued in 2016, particularly around the areas of fraud risk assessment, prevention, and detection. It also means that, according to Pennings, fraud prevention and detection is an ideal big data-related organizational initiative. With the growing speed at which they generate data, specifically around their financial reporting and sales business processes, our larger CFE client organizations need ways to prioritize risks and better synthesize information using big data technologies, enhanced visualizations, and statistical approaches to supplement traditional rules-based investigative techniques supported by spreadsheet or database applications.

But with this analytics and fraud prevention integration opportunity comes a caution. As always, before jumping into any specific technology or advanced analytics technique, it’s crucial to first ask the right risk or control-related questions to ensure the analytics will produce meaningful output for the business objective or risk being addressed. What business processes pose a high fraud risk? High-risk business processes include the sales (order-to-cash) cycle and payment (procure-to-pay) cycle, as well as payroll, accounting reserves, travel and entertainment, and inventory processes. Which high-risk accounts within the business process could reveal unusual account pairings, such as a debit to depreciation and an offsetting credit to a payable, or accounts with vague or open-ended “catch-all” descriptions such as “miscellaneous,” “administrative,” or blank account names? Who recorded or authorized the transaction? Posting analysis or approver reports could help detect unauthorized postings or inappropriate segregation of duties by looking at the number of payments by name, minimum or maximum amounts, sum totals, or statistical outliers. When did transactions take place? Analyzing transaction activities over time could identify spikes or dips in activity such as before and after period ends or weekend, holiday, or off-hours activities. Where does the CFE see geographic risks, based on previous events, the economic climate, cyber threats, recent growth, or perceived corruption? Further segmentation can be achieved by business units within regions and by the accounting systems on which the data resides.
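
The “when did transactions take place” question can be sketched as a filter for weekend or off-hours postings; the journal entry data and the business-hours window below are hypothetical:

```python
from datetime import datetime

def off_hours_postings(entries, open_hour=8, close_hour=18):
    """Flag journal entries posted on weekends or outside business hours."""
    flagged = []
    for entry_id, timestamp in entries:
        t = datetime.fromisoformat(timestamp)
        if t.weekday() >= 5 or not (open_hour <= t.hour < close_hour):
            flagged.append(entry_id)
    return flagged

entries = [
    ("JE-1001", "2017-03-06T10:30:00"),  # Monday, business hours
    ("JE-1002", "2017-03-11T14:00:00"),  # Saturday
    ("JE-1003", "2017-03-08T23:15:00"),  # Wednesday, late night
]
print(off_hours_postings(entries))  # → ['JE-1002', 'JE-1003']
```

Aggregating such flags by poster or by period end quickly surfaces the spikes and dips the questions above are meant to expose.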

The benefits of implementing a forensic data analytics program must be weighed against challenges such as obtaining the right tools or professional expertise, combining data (both internal and external) across multiple systems, and the overall quality of the analytics output. To mitigate these challenges and build a successful program, the CFE should consider that the priority of the initial project matters. Because the first project often is used as a pilot for success, it’s important that the project address meaningful business or audit risks that are tangible and visible to client management. Further, this initial project should be reasonably attainable, with minimal dollar investment and actionable results. It’s best to select a first project that has big demand, has data that resides in easily accessible sources, with a compelling, measurable return on investment. Areas such as insider threat, anti-fraud, anti-corruption, or third-party relationships make for good initial projects.

In the health care insurance industry where I worked for many years, one of the key goals of forensic data analytics is to increase the detection rate of health care provider billing non-compliance, while reducing the risk of false positives. From a capabilities perspective, organizations need to embrace both structured and unstructured data sources that consider the use of data visualization, text mining, and statistical analysis tools. Since the CFE will usually be working as a member of a team, the team should demonstrate the first success story, then leverage and communicate that success model widely throughout the organization. Results should be validated before successes are communicated to the broader organization. For best results and sustainability of the program, the fraud prevention team should be a multidisciplinary one that includes IT, business users, and functional specialists, such as management scientists, who design the analytics associated with the organization’s day-to-day operations and align them with the objectives of the fraud prevention program. It helps to communicate across multiple departments to update key stakeholders on the program’s progress under a defined governance regime. The team shouldn’t just report noncompliance; it should seek to improve the business by providing actionable results.

The forensic data analytics functional specialists should not operate in a vacuum; every project needs one or more business champions who coordinate with IT and the business process owners. Keep the analytics simple and intuitive, and don’t pack so much information into one report that it becomes hard to understand. Finally, invest time in automation, not manual refreshes, to make the analytics process sustainable and repeatable. The best trends, patterns, or anomalies often emerge when multiple months of vendor, customer, or employee data are analyzed over time, not just in the aggregate. Also, keep in mind that enterprise-wide deployment takes time. While quick projects may take four to six weeks, integrating the entire program can easily take one to two years or more. Programs need to be refreshed as new risks emerge and business activities change, and staff need ongoing training, collaboration, and up-to-date technologies.

Research findings by the ACFE and others are providing more and more evidence of the benefits of integrating advanced forensic data analytics techniques into fraud prevention and detection programs. By helping increase their client organization’s maturity in this area, CFEs can assist in delivering a robust fraud prevention program that is highly focused on preventing and detecting fraud risks.

Cybersecurity – Is There a Role for Fraud Examiners?

At a cybersecurity fraud prevention conference I attended recently in California, one of the featured speakers addressed the difference between information security and cybersecurity and the complexity of assessing the fraud preparedness controls specifically directed against cyber fraud.  It seems the main difficulty is the lack of a standard to serve as the basis of a fraud examiner’s or auditor’s risk review. The National Institute of Standards and Technology’s (NIST) framework has become a de facto standard despite the fact that it’s more than a little light on specific details.  Though it’s not a standard, there really is nothing else at present against which to measure cybersecurity.  Moreover, the technology that must be the subject of a cybersecurity risk assessment is poorly understood and is mutating rapidly.  CFEs, and everyone else in the assurance community, are hard pressed to keep up.

To my way of thinking, a good place to start in all this confusion is for the practicing fraud examiner to consider the fundamental difference between information security and cybersecurity, the differing nature of the threat itself.   There is simply a distinction between protecting information against misuse of all sorts (information security) and an attack by a government, a terrorist group, or a criminal enterprise that has immense resources of expertise, personnel and time, all directed at subverting one individual organization (cybersecurity).  You can protect your car with a lock and insurance but those are not the tools of choice if you see a gang of thieves armed with bricks approaching your car at a stoplight. This distinction is at the very core of assessing an organization’s preparations for addressing the risk of cyberattacks and for defending itself against them.

As is true in so many investigations, the cybersecurity element of the fraud risk assessment process begins with the objectives of the review, which leads immediately on to the questions one chooses to ask. If an auditor only wants to know “Are we secure against cyberattacks?” then the answer should be up on a billboard in letters fifty feet high: No organization should ever consider itself safe against cyber attackers. They are too powerful and pervasive for any complacency. If major television networks can be stricken, if the largest banks can be hit, if governments are not immune, then the CFE’s client organization is not secure either.  Still, all anti-fraud reviewers can ask subtle and meaningful questions of client management, specifically focused on the data and software at risk of an attack. A fraud risk assessment process specific to cybersecurity might delve into the internals of database management systems and system software, requiring the considerable skills of a CFE supported by one or more tech-savvy consultants s/he has engaged to form the assessment team. Or it might call for just asking simple questions and applying basic arithmetic.

If the fraud examiner’s concern is the theft of valuable information, the simple corrective is to make the data valueless, which is usually achieved through encryption. The CFE’s question might be, “Of all your data, what percentage is encrypted?” If the answer is 100 percent, the follow-up question is whether the data are always encrypted—at rest, in transit and in use. If it cannot be shown that all data are secured all of the time, the next step is to determine what is not protected and under what circumstances. The assessment finding would consist of a flat statement of the amount of unencrypted data susceptible to theft and a recitation of the potential value to an attacker in stealing each category of unprotected data. The readers of this blog know that data must be decrypted in order to be used and so would be quick to point out that “universal” encryption in use is, ultimately, a futile dream. There are vendors who think otherwise, but let’s accept the fact that data will, at some time, be exposed within a computer’s memory. Is that a fault attributable to the data or to the memory and to the programs running in it? Experts say it’s the latter. In-memory attacks are fairly devious, but the solutions are not. Rebooting gets rid of them, and antimalware programs that scan memory can find them. So a CFE can ask, “How often is each system rebooted?” and “Does your anti-malware software scan memory?”

To the extent that software used for attacks is embedded in the programs themselves, the problem lies in a failure of malware protection or of change management. A CFE need not worry this point; according to my California presenter, many auditors (and security professionals) have wrestled with this problem and not solved it either. All a CFE needs to ask is whether anyone would be able to know whether a program had been subverted. An audit of the change management process would often provide a bounty of findings, but would not answer the reviewer’s question. The solution lies in having a version of a program known to be free from flaws (such as newly released code) and an audit trail of known changes. It’s probably beyond the talents of a typical CFE to generate a hash total using a program as data and then to apply the known changes in order to see if the version running in production matches a recalculated hash total. But it is not beyond the skills of the IT experts the CFE can add to her team, or of the in-house IM staff responsible for keeping their employer’s programs safe. A CFE fraud risk reviewer need only find out if anyone is performing such a check. If not, the CFE can simply conclude and report to the client that no one knows for sure whether the client’s programs have been penetrated or not.
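For readers curious what such a hash-total check looks like in practice, here is a minimal sketch using SHA-256 from Python’s standard hashlib module. The file paths in the commented usage are invented for illustration; a real check would compare the vetted release binary against the copy actually running in production:

```python
import hashlib

def file_hash(path):
    """SHA-256 fingerprint of a program file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

# Illustrative only (hypothetical paths): compare the vetted release
# against the production copy; a mismatch means the binary has changed
# outside the known, audited change trail.
# known_good = file_hash("/releases/payroll-v2.3/payroll.bin")
# in_production = file_hash("/prod/bin/payroll.bin")
# if known_good != in_production:
#     print("production binary does not match the vetted release")
```

The CFE’s question is not how to run this, but whether anyone in the client organization runs anything like it at all.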

Finally, a CFE might want to find out if the environment in which data are processed is even capable of being secured. Ancient software running on hardware or operating systems that have passed their end of life are probably not reliable in that regard. Here again, the CFE need only obtain lists and count. How many programs have not been maintained for, say, five years or more? Which operating systems that are no longer supported are still in use? How much equipment in the data center is more than 10 years old? All this is only a little arithmetic and common sense, not rocket science.
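The “lists and a little arithmetic” described above can be done in a few lines. This sketch assumes a hypothetical asset register; the field names, dates, and thresholds (five years unmaintained, ten years old) are taken from the questions in the paragraph, everything else is invented for illustration:

```python
from datetime import date

# Hypothetical asset register entries; all values are illustrative.
assets = [
    {"name": "AP batch system", "last_maintained": date(2012, 3, 1),
     "os_supported": False, "installed": date(2005, 6, 1)},
    {"name": "claims portal",   "last_maintained": date(2023, 9, 15),
     "os_supported": True,  "installed": date(2019, 1, 10)},
    {"name": "file server",     "last_maintained": date(2016, 5, 20),
     "os_supported": False, "installed": date(2010, 2, 2)},
]

as_of = date(2024, 1, 1)  # fixed "as of" date so the review is repeatable

stale = [a["name"] for a in assets
         if (as_of - a["last_maintained"]).days > 5 * 365]
unsupported = [a["name"] for a in assets if not a["os_supported"]]
aged = [a["name"] for a in assets
        if (as_of - a["installed"]).days > 10 * 365]

print("not maintained in 5+ years:", stale)
print("running unsupported OS:   ", unsupported)
print("hardware 10+ years old:   ", aged)
```

Three lists, three counts, and the environment’s securability question is answered with common sense rather than rocket science.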

In conclusion, frauds associated with weakened or absent cybersecurity systems are not likely to become a less important feature of the corporate landscape over time. Instead, they are poised to become an increasingly important aspect of doing business for those who create automated applications and solutions, and for those who attempt to safeguard them on the front end and for those who investigate and prosecute crimes against them on the back end. While the ramifications of every cyber fraud prevention decision are broad and diverse, a few basic good practices can be defined which the CFE, the fraud expert, can help any client management implement:

  • Know your fraud risk and what it should be;
  • Be educated in management science and computer technology. Ensure that your education includes basic fraud prevention techniques and associated prevention controls;
  • Know your existing cyber fraud prevention decision model, including the shortcomings of those aspects of the model in current use and develop a schedule to address them;
  • Know your frauds. Understand the common fraud scenarios targeting your industry so that you can act swiftly when confronted with one of them.

We can conclude that the issues involving cybersecurity are many and complex, but that CFEs are equipped to bring much needed, fraud related experience to any management’s table as part of the team confronting them.

Dr. Fraudster & the Billing Anomaly Continuum

This month’s member’s lecture on Medicare and Medicaid Fraud triggered a couple of Chapter member requests for more specifics about how health care fraud detection analytics work in actual practice.

It’s a truism within the specialty of health care billing data analytics that the harder you work on the front end, the more successful you’ll be in producing information that generates productive results on the back end.  Indeed, in the output of health care analytics applications, fraud examiners and health care auditors now have a new set of increasingly powerful tools to use in the audit and investigation of all types of fraud generally and of health care fraud specifically; I’m referring, of course, to analytically supported analysis of what’s called the billing anomaly continuum.

The use of the anomaly continuum in the general investigative process starts with the initial process of detection, proceeds to investigation and mitigation and then (depending on the severity of the case) can lead to the follow-on phases of prevention, response and recovery.   We’ll only discuss the first three phases here as most relevant for the fraud examination process and leave the prevention, response and recovery phases for a later post.

Detection is the discovery of clues within the data.  The process involves taking individual data segments related to the whole health care process (from the initial provision of care by the health care provider all the way to the billing and payment for that care by the insurance provider) and blending them into one data source for seamless analysis.  Any anomalies in the data can then be noted.  The output is then evaluated for either response or for follow-up investigation.  It is these identified anomalies that will go on at the end of the present investigative process to feed the detection database for future analysis.

As an example of an actual Medicare case, let’s say we have a health care provider whom we’ll call Dr. Fraudster, some of whose billing data reveals a higher than average percentage of complicated (and costly) patient visits. It also seems that Dr. Fraudster apparently generated some of these billings while travelling outside the country.  There were also referred patient visits to chiropractors, acupuncturists, massage therapists, nutritionists and personal trainers at a local gym whose services were also billed under Dr. Fraudster’s tax ID number as well as under standard MD Current Procedural Terminology (CPT) visit codes.  In addition, a Dr. Outlander, an unlicensed doctor, served as a staff physician on Dr. Fraudster’s staff and was billed at $5 an hour.  Besides Outlander, a Dr. Absent was noted as billing out of Dr. Fraudster’s clinic even though he was no longer associated with the clinic.

First off, in the initial detection phase, it seems Dr. Fraudster’s high-volume activity flagged an edit function that tracks an above-average practice growth rate without the addition of new staff on the claim form.  Another anomalous activity picked up was the appearance of wellness services presented as illness-based services.  The billed provision of services while travelling is also certainly anomalous.
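Edits like the ones that flagged Dr. Fraudster are, at bottom, simple rules evaluated over the blended claims data. As a sketch only, with an invented record layout (field names and the two example claims are my illustrative assumptions):

```python
# Hypothetical blended claim records; fields and values are invented.
claims = [
    {"provider": "Dr. F", "date": "2016-03-02",
     "service": "office visit, high complexity",
     "wellness_billed_as_illness": False, "provider_in_country": False},
    {"provider": "Dr. F", "date": "2016-03-09",
     "service": "nutrition counseling",
     "wellness_billed_as_illness": True, "provider_in_country": True},
]

def flag_claim(claim):
    """Return the detection-phase anomalies raised by a single claim."""
    flags = []
    if not claim["provider_in_country"]:
        flags.append("billed while provider was out of the country")
    if claim["wellness_billed_as_illness"]:
        flags.append("wellness service presented as illness-based")
    return flags

for c in claims:
    for f in flag_claim(c):
        print(c["provider"], c["date"], "->", f)
```

Each anomaly the rules raise then becomes a lead for the investigation phase, and, once resolved, a new record in the detection database.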

The following investigation phase involves ascertaining whether various activities or statements are true.  In Dr. Fraudster’s case, evidence to collect regarding his on-staff associate, Dr. Outlander, may include confirmation of license status, if any; educational training, clinic marketing materials and payroll records.  The high percentage of complicated visits and the foreign travel issues need to be broken down and each activity analyzed separately in full detail.  If Dr. Fraudster truly has a high complication patient population, most likely these patients would be receiving some type of prescription regime.  The lack of a diagnosis requirement with associated prescriptions in this case limited the scope of the real-life investigation.  Was Dr. Fraudster prescribing medications with no basis?  If he uses an unlicensed doctor on his staff, presents wellness services as illness related services, and sees himself (perhaps) as a caring doctor getting reluctant insurance companies to pay for alternative health treatments, what other alternative treatment might he be providing with prescribed medications?  Also, Dr. Fraudster had to know that the bills submitted during his foreign travels were false.  Statistical analysis, in addition to clinical analysis of the medical records by actual provider and travel records, would provide a strong argument that the doctor had intent to misrepresent his claims.

The mitigation phase typically builds on issues noted within the detection and investigation phases.  Mitigation is the process of reducing or making a certain set of circumstances less severe.  In the case of Dr. Fraudster, mitigation occurred in the form of prosecution.  Dr. Fraudster was convicted of false claims and removed from the Medicare network as a licensed physician, thereby preventing further harm and loss.  Other applicable issues that came forward at trial were evidence of substandard care and medical unbelievability patterns (CPT codes billed that made no sense except to inflate the billing).  What made this case even more complicated was tracking down Dr. Fraudster’s assets.  Ultimately, the real-life Dr. Fraudster did receive a criminal conviction; civil lawsuits were initiated, and he ultimately lost his license.

From an analytics point of view, mitigation does not stop at the point of conviction of the perpetrator.  The findings regarding all individual anomalies identified in the case should be followed up with adjustment of the insurance company’s administrative adjudication and edit procedures (Medicare was the third party claims payer in this case).  What this means is that feedback from every fraud case should be fed back into the analytics system.  Incorporating the patterns of Dr. Fraudster’s fraud into the Medicare Fraud Prevention Model will help to prevent or minimize future similar occurrences, help find currently on-going similar schemes elsewhere with other providers and reduce the time it takes to discover these other schemes.  A complete mitigation process also feeds detection by reducing the amount of investigative time required to make the existence of a fraud known.

As practicing fraud examiners, we are provided by the ACFE with an examination methodology quite powerful in its ability to extend and support all three phases of the health care fraud anomaly identification process presented above.  There are essentially three tools available to the fraud examiner in every health care fraud examination, all of which can significantly extend the value of the overall analytics based health care fraud investigative process.  The first is interviewing – the process of obtaining relevant information about the matter from those with knowledge of it.  The second is supporting documents – the examiner is skilled at examining financial statements, books and records.   The examiner also knows the legal ramifications of the evidence and how to maintain the chain of custody over documents.  The third is observation – the examiner is often placed in a position where s/he can observe behavior, search for displays of wealth and, in some instances, even observe specific offenses.

Dovetailing the work of the fraud examiner with that of the healthcare analytics team is a win for both parties to any healthcare fraud investigation and represents a considerable strengthening of the entire long term healthcare fraud mitigation process.

Where the Money Is

One of the followers of our Central Virginia Chapter’s group on LinkedIn is a bank auditor heavily engaged in his organization’s analytics based fraud control program.  He was kind enough to share some of his thoughts regarding his organization’s sophisticated anti-fraud data modelling program as material for this blog post.

Our LinkedIn connection reports that, in his opinion, getting fraud data accurately captured, categorized, and stored is the first, vitally important challenge to using data-driven technology to combat fraud losses. This might seem relatively easy to those not directly involved in the process but, experience quickly reveals that having fraud related data stored reliably over a long period of time and in a readily accessible format represents a significant challenge requiring a systematic approach at all levels of any organization serious about the effective application of analytically supported fraud management. The idea of any single piece of data being of potential importance to addressing a problem is a relatively new concept in the history of banking and of most other types of financial enterprises.

Accumulating accurate data starts with an overall vision of how the multiple steps in the process connect to affect the outcome. It’s important for every member of the fraud control team to understand how important each pre-defined step in the process is to capturing the information correctly — from the person who is responsible for risk management in the organization to the people who run the fraud analytics program to the person who designs the data layout to the person who enters the data. Even a customer service analyst or a fraud analyst not marking a certain type of transaction correctly as fraud can have an on-going impact on developing an accurate fraud control system. It really helps to establish rigorous processes of data entry on the front end and to explain to all players exactly why those specific processes are in place. Process without communication and communication without process are both unlikely to produce desirable results. In order to understand the importance of recording fraud information correctly, it’s important for management to communicate to all some general understanding about how a data-driven detection system (whether it’s based on simple rules or on sophisticated models) is developed.

Our connection goes on to say that even after an organization has implemented a fraud detection system that is based on sophisticated techniques and that can execute effectively in real time, it’s important for the operational staff to use the output recommendations of the system effectively. There are three ways that fraud management can improve results within even a highly sophisticated system like that of our LinkedIn connection.

The first strategy is never to allow operational staff to second-guess a sophisticated model at will. Very often, a model score of 900 (let’s say this is an indicator of very high fraud risk), when combined with some decision keys and sometimes on its own, can perform extremely well as a fraud predictor. It’s good practice to use the scores at this high risk range generated by a tested model as is and not allow individual analysts to adjust it further. This policy will have to be completely understood and controlled at the operational level. Using a well-developed fraud score as is without watering it down is one of the most important operational strategies for the long term success of any model. Application of this rule also makes it simpler to identify instances of model scoring failure by rendering them free of any subsequent analyst adjustments.
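The “use the score as is” policy described above is easiest to enforce in the decision layer itself rather than leaving it to analyst discretion. The 900 cutoff comes from the example in the text; the function name, the 600 grey-zone boundary, and the routing labels are my invented illustrations of how such a rule might look:

```python
HIGH_RISK_CUTOFF = 900  # from the example above; calibration is model-specific

def route_transaction(score, analyst_override=None):
    """Route a transaction by model score.

    Above the high-risk cutoff, analyst overrides are deliberately
    ignored so the model score is never watered down."""
    if score >= HIGH_RISK_CUTOFF:
        return "decline_and_review"
    if analyst_override is not None:
        return analyst_override  # judgment allowed only in the grey zone
    return "approve" if score < 600 else "manual_review"

# An analyst trying to approve a 950-score transaction is overruled:
print(route_transaction(950, analyst_override="approve"))  # decline_and_review
print(route_transaction(700))                              # manual_review
```

Because no manual adjustment can touch the high-risk band, any scoring failure there is attributable to the model alone, which is exactly the point made in the text.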

Second, fraud analysts will have to be trained to use the scores and the reason codes (reason codes explain why the score is indicative of risk) effectively in operations. Typically, this is done by writing some rules in operations that incorporate the scores and reason codes as decision keys. In the fraud management world, these rules are generally referred to as strategies. It’s extremely important to ensure strategies are applied uniformly by all fraud analysts. It’s also essential to closely monitor how the fraud analysts are operating using the scores and strategies.

Third, it’s very important to train the analysts to mark transactions that are confirmed or reported to be fraudulent by the organization’s customers accurately in their data store.

All three of these strategies may seem very straightforward to accomplish, but in practical terms, they are not that easy without a lot of planning, time, and energy. A superior fraud detection system can be rendered almost useless if it is not used correctly. It is extremely important to allow the right level of employee to exercise the right level of judgment.  Again, individual fraud analysts should not be allowed to second-guess the efficacy of a fraud score that is the result of a sophisticated model. Similarly, planners of operations should take into account all practical limitations while coming up with fraud strategies (fraud scenarios). Ensuring that all of this gets done the right way with the right emphasis ultimately leads the organization to good, effective fraud management.

At the heart of any fraud detection system is a rule or a model that attempts to detect a behavior that has been observed repeatedly in various frequencies in the past and classifies it as fraud or non-fraud with a certain rank ordering. We would like to figure out this behavior scenario in advance and stop it in its tracks. What we observe from historical data and our experience needs to be converted to some sort of a rule that can be systematically applied to the data real-time in the future. We expect that these rules or models will improve our chance of detecting aberrations in behavior and help us distinguish between genuine customers and fraudsters in a timely manner. The goal is to stop the bleeding of cash from the account and to accomplish that as close to the start of the fraud episode as we can. If banks can accurately identify early indicators of on-going fraud, significant losses can be avoided.

In statistical terms, what we define as a fraud scenario would be the dependent variable or the variable we are trying to predict (or detect) using a model. We would try to use a few independent variables (as many of the variables used in the model tend to have some dependency on each other in real life) to detect fraud. Fundamentally, at this stage we are trying to model the fraud scenario using these independent variables. Typically, a model attempts to detect fraud as opposed to predict fraud. We are not trying to say that fraud is likely to happen on this entity in the future; rather, we are trying to determine whether fraud is likely happening at the present moment, and the goal of the fraud model is to identify this as close to the time that the fraud starts as possible.

In credit risk management, we try to predict if there will likely be serious delinquency or default risk in the future, based on the behavior exhibited in the entity today. With respect to detecting fraud, during the model-building process, not having accurate fraud data is akin to not knowing what the target is in a shooting range. If a model or rule is built on data that is only 75 percent accurate, it is going to cause the model’s accuracy and effectiveness to be suspect as well. There are two sides to this problem.  Suppose we mark 25 percent of the fraudulent transactions inaccurately as non-fraud or good transactions. Not only are we missing out on learning from a significant portion of fraudulent behavior, by misclassifying it as non-fraud, the misclassification leads to the model assuming the behavior is actually good behavior. Hence, misclassification of data affects both sides of the equation. Accurate fraud data is fundamental to addressing the fraud problem effectively.
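A toy simulation makes the misclassification point concrete. Everything here is invented for illustration: the “model” is just a midpoint threshold between the average good and average fraud amounts, and the transaction values are made up. Mislabeling one of four frauds (25 percent) as good both shrinks the fraud sample we learn from and teaches the model that fraudulent behavior is normal, so the learned threshold drifts and a real fraud slips under it:

```python
def learn_threshold(good, fraud):
    """Toy model: midpoint between mean 'good' and mean 'fraud' amounts."""
    mean = lambda xs: sum(xs) / len(xs)
    return (mean(good) + mean(fraud)) / 2

good_amts  = [10, 20, 30, 40]      # true good transactions (invented)
fraud_amts = [70, 100, 110, 130]   # true fraud transactions (invented)

# Clean labels: every fraud is correctly marked as fraud.
clean_thr = learn_threshold(good_amts, fraud_amts)

# Noisy labels: one of four frauds (25%) is mismarked as good.
noisy_thr = learn_threshold(good_amts + [100], [70, 110, 130])

caught = lambda thr: [a for a in fraud_amts if a > thr]
print(f"clean threshold {clean_thr:.1f}: catches {len(caught(clean_thr))}/4 frauds")
print(f"noisy threshold {noisy_thr:.1f}: catches {len(caught(noisy_thr))}/4 frauds")
```

With clean labels the threshold catches all four frauds; with 25 percent of frauds mislabeled, the threshold rises and the smallest fraud goes undetected, which is the two-sided damage described in the paragraph above.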

So, in summary, collecting accurate fraud data is not the responsibility of just one set of people in any organization. The entire mind-set of the organization should be geared around collecting, preserving, and using this valuable resource effectively. Interestingly, our LinkedIn connection concludes, the fraud data challenges faced by a number of other industries are very similar to those faced by financial institutions such as his own. Banks are probably further along in fraud management and can provide a number of pointers to other industries, but fundamentally, the problem is the same everywhere. Hence, a number of techniques he details in this post are applicable to a number of industries, even though most of his experience is bank based. As fraud examiners and forensic accountants, we will no doubt witness the impact of the application of analytically based fraud risk management by an ever multiplying number of client industrial types.

Informed Analytics

by Michael Bret Hood,
21st Century Learning & Consulting,
LLC, University of Virginia, Retired FBI

I recently had a conversation with an old friend who is an accounting professor at a large southern university.

We were discussing my impending retirement and the surprising difficulty I am having in finding a corporate fraud investigation position. One of the things we discussed was the recent trend to hire mathematicians and statisticians as directors of fraud detection and risk control programs. Knowing that I could be biased, I asked the professor if he had seen the same thing. He replied that he had and then uttered, “What a foolish mistake!”

While neither of us harbors any ill will toward the community of mathematicians and statisticians (they probably are a lot smarter and way more technologically gifted than us), fraud detection and fraud prevention are so much more than the numbers and related informational sub‐sets. Sun Tzu in The Art of War said, “Know your enemy and know yourself and you can fight a hundred battles without disaster.” What every fraud contains that data analytics can never account for is the human behavior element. Unless the analytical process directly involves someone with the expertise of knowing how fraudsters operate, as well as someone who understands victimology, significant weaknesses will almost always be introduced into the analysis. Matt Asay, in his InformationWeek article ‘8 Reasons Big Data Projects Fail’, understands this inherent flaw: “Too many organizations hire data scientists who might be math and programming geniuses but who lack the most important component: domain knowledge.”

The current perception is that data analytics causes the fraudulent patterns in organizations to just suddenly become exposed. Unfortunately, the pattern algorithms and programs created by data scientists are not magic elixirs. Just like the old gold miners did, someone has to sort through the data to ensure the patterns are both relevant and valid. In his article ‘What Data Can’t Do’, author David Brooks says the following, “Data is always constructed to someone’s predispositions and values. The end result looks disinterested, but in reality, there are value choices all the way through, from construction to interpretation.” The old adage suggesting inferior input equals inferior output certainly applies with equal force to data analytics today.

Data analytics has certainly had its successes. Credit card companies have been able to stem losses based on intricate and real‐time analysis of current trends, which unaided human reviewers would certainly be unable to produce manually. In other cases, data analytics have failed. “Google Flu Trends unsuccessfully predicted flu outbreaks in 2013 when it ended up forecasting twice as many cases of flu as the Centers for Disease Control and Prevention reported.”  Data analytics as applied by even the best data scientists can’t always quantify the human element in their computations. No one should say that data analytics are not useful tools to be leveraged in any fraud investigation. In fact, they are most beneficial. However, implementing data analytics does not place an impenetrable anti-fraud fortress around your data and/or your money. Sometimes it takes a combination of data analytics and experienced professionals to produce the best results.

In one business, data analytics were deployed using Benford’s Law in such a way that an insider‐led tax refund fraud scheme was uncovered, saving the company millions of dollars. This data set, however, would never have been chosen for analysis were it not for forensic accountants who noticed a variance in the numbers they sampled. Fraud is a crime that always includes an unmistakable human element represented in the actions and reactions of both the perpetrator and the victim. Data analytics, although extremely useful, will never be able to take into account the full dynamic range of emotions and decision making of which human beings are capable. Businesses have started to realize this problem as evidenced by a recent Gartner survey where the author claims, “Big data was so yesterday; it’s all about algorithms today.”
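For readers unfamiliar with the technique, a Benford first-digit test of the kind mentioned above can be sketched in a few lines. The refund amounts below are invented to show a suspicious pattern, and a real engagement would apply a proper goodness-of-fit test (such as chi-square) over a much larger sample:

```python
import math
from collections import Counter

# Benford's Law: the expected share of amounts whose first digit is d.
benford = {d: math.log10(1 + 1 / d) for d in range(1, 10)}

def first_digit(amount):
    """First significant digit of a positive amount."""
    return int(f"{amount:e}"[0])  # scientific notation starts with that digit

def benford_deviation(amounts):
    """Observed minus expected first-digit frequency, per digit."""
    counts = Counter(first_digit(a) for a in amounts)
    n = len(amounts)
    return {d: counts.get(d, 0) / n - benford[d] for d in range(1, 10)}

# Invented refund amounts, skewed toward a fabricated '9xx' pattern.
refunds = [912.40, 954.10, 987.65, 990.00, 123.45, 208.30, 150.00, 934.20]
dev = benford_deviation(refunds)
suspect = max(dev, key=dev.get)
print(f"digit {suspect} is over-represented by {dev[suspect]:.0%}")
```

Under Benford’s Law only about 4.6 percent of genuine amounts should begin with a 9, so a large positive deviation on that digit is exactly the kind of variance a forensic accountant would flag for follow-up — the analytics locate the pattern, but the human decides whether it matters.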

While it may cost organizations a little more in salary to engage the services of experienced fraud investigators such as myself, the resultant ROI is far superior to the cost of the investment.

You Can’t Prevent What You Can’t See

The long, rainy Central Virginia Fourth of July weekend gave me a chance to review the ACFE’s latest Report to the Nations, and I was struck by what the report had to say about proactive data analytics as an element of internal control, especially as applicable to small business fraud prevention.

We’re all familiar with the data analytics performed by larger businesses, of which proactive data analytic tests form only a part.  This type of analysis is accomplished with the use of sophisticated software applications that comb through massive volumes of data to determine weak spots in the control system. By analyzing data in this manner, large companies can prevent fraud from happening or detect an ongoing fraud scheme. The Report to the Nations reveals, among other things, that of the anti-fraud controls analyzed, proactive data monitoring and analysis appears to be the most effective at limiting the duration and cost of fraud schemes. By performing proactive data analysis, companies detected fraud schemes sooner, limiting the total potential loss. Data analysis is not a new concept, but, as we all know, with the increasing number of electronic transactions due to advances in technology, analyzing large volumes of data has become ever more complex and costly to implement and manage.

Companies of all sizes are accountable not only to shareholders but to lenders and government regulators.  Although small businesses are not as highly regulated by the government since they are typically not publicly financed, small business leaders share the same fiduciary duty as large businesses: to protect company assets. Since, according to the ACFE, the average company loses 5% of revenue to fraud, it stands to reason that preventing losses due to fraud could increase profitability by 5%. When viewed in this light, many small businesses would benefit from taking a second look at implementing stronger fraud prevention controls.  The ACFE also reports that small businesses tend to be victims of fraud more frequently than large businesses because small businesses have limited financial and human resources. In terms of fraud prevention and detection, having fewer resources overall translates into having fewer resources dedicated to strong internal controls. The Report also states that small businesses (less than 100 employees) experience significantly larger losses percentage-wise than larger businesses (greater than 100 employees). Since small businesses do not have the resources to dedicate to fraud prevention and detection, they’re not able to detect fraud schemes as quickly, prolonging the scheme and increasing the losses to the company.

The ACFE goes on to tell us that certain controls are anti-fraud by nature and can prevent and detect fraud, including conducting an external audit of a set of financial statements, maintaining an internal audit department, having an independent audit committee, management review of all financial statements, providing a hotline to company employees, implementing a company code of conduct and anti-fraud policy, and practicing proactive data monitoring. While most of these controls are common for large companies, small businesses have difficulty implementing some of them, again because of their limited financial and human resources.

What jumped out at me from the ACFE’s Report was that only 15% of businesses with fewer than 100 employees currently perform proactive data analysis, while 41.9% of businesses with more than 100 employees do. This is a sign that many small businesses could be doing at least a basic level of data analysis, but aren’t. The largest costs associated with data analysis are software costs and the employee time needed to perform the analysis. With respect to employee resources, data analysis is a control that can be performed by a variety of employees: a financial analyst, an accountant, an external consultant, a controller, or even the CFO. The level of data analysis should always be structured to fit within the cost structure of the company. While larger companies may be able to assign a full-time analyst to handle these responsibilities, smaller companies may only be able to allocate a portion of an employee’s time to the task. Given these realities, smaller businesses need to look for basic data analysis techniques that can be easily implemented.

The most basic data analysis techniques are taught in introductory accounting courses and aren’t particularly complex: vertical analysis, horizontal analysis, liquidity ratios, and profitability ratios. Large public companies are required to prepare these types of calculations for their filings with the Securities and Exchange Commission. For small businesses, these ratios and analyses can be calculated by using two of the basic financial statements produced by any accounting software:  the income statement and the balance sheet. By comparing the results of these calculations to prior periods or to industry peers, significant variances can point to areas where fraudulent transactions may have occurred. This type of data analysis can be performed in a tabular format and the results used to create visual aids. Charts and graphs are a great way for a small business analyst to visualize variances and trends for management.
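To make the idea concrete, here is a minimal sketch of those calculations in a few lines of Python. All of the statement figures below are invented for illustration, not drawn from any real business:

```python
# Basic fraud-screening analytics from two financial statements.
# All figures are hypothetical sample data.

def horizontal_analysis(current, prior):
    """Period-over-period percentage change for each line item."""
    return {item: (current[item] - prior[item]) / prior[item] * 100
            for item in current}

def vertical_analysis(income_stmt, base="revenue"):
    """Each line item as a percentage of revenue (common-size statement)."""
    return {item: amount / income_stmt[base] * 100
            for item, amount in income_stmt.items()}

income_prior   = {"revenue": 500_000, "payroll": 180_000, "net_income": 60_000}
income_current = {"revenue": 510_000, "payroll": 215_000, "net_income": 48_000}
balance_sheet  = {"current_assets": 120_000, "current_liabilities": 80_000}

changes     = horizontal_analysis(income_current, income_prior)
common_size = vertical_analysis(income_current)

# Liquidity and profitability ratios from the balance sheet and income statement.
current_ratio = balance_sheet["current_assets"] / balance_sheet["current_liabilities"]
profit_margin = income_current["net_income"] / income_current["revenue"]

for item, pct in changes.items():
    print(f"{item}: {pct:+.1f}% vs. prior year")
print(f"current ratio: {current_ratio:.2f}, profit margin: {profit_margin:.1%}")
```

In this made-up example, payroll grew roughly 19% while revenue grew only 2%; a variance of that kind, compared against prior periods or industry peers, is exactly the sort of anomaly that merits a closer look.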

I like to point out to small business clients that all of the above calculations can be performed with Microsoft Excel and Microsoft Access. These are off-the-shelf tools that any analyst can use to perform even analytical calculations of great complexity. The availability of computing power in Excel and Access, and the relatively easy access to audit tools known as Computer Assisted Audit Techniques (CAATs), have accelerated the analytical review process generally. Combined with access to the accounting server and its related applications and to the general ledger, CAATs are very powerful tools indeed.
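A typical CAAT-style test is simple enough to express in a few lines of any tool. The sketch below uses Python rather than Excel or Access, and the payroll records are invented: it flags employees who share a bank account or mailing address, a classic indicator of ghost employees on the payroll:

```python
# Illustrative CAAT-style test: flag payroll records sharing the same bank
# account or mailing address. Employee data below is invented for illustration.
from collections import defaultdict

payroll = [
    {"name": "A. Smith",   "account": "1001", "address": "12 Oak St"},
    {"name": "B. Jones",   "account": "1002", "address": "44 Elm Ave"},
    {"name": "C. Ghost",   "account": "1003", "address": "PO Box 99"},
    {"name": "D. Phantom", "account": "1003", "address": "PO Box 99"},
]

def flag_duplicates(records, field):
    """Return groups of employee names sharing the same value for `field`."""
    groups = defaultdict(list)
    for rec in records:
        groups[rec[field]].append(rec["name"])
    return {value: names for value, names in groups.items() if len(names) > 1}

print(flag_duplicates(payroll, "account"))  # shared bank accounts
print(flag_duplicates(payroll, "address"))  # shared mailing addresses
```

The same grouping test runs in Excel as a pivot table or in Access as a GROUP BY query with a HAVING COUNT(*) > 1 clause; the logic, not the tool, is what matters.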

The next step would be to consider using more advanced data analysis programs. Microsoft Excel has many features for performing data analysis, and it is probably already installed on many computers within small enterprises. CFEs might suggest that their clients add the Audit Command Language (ACL) Add-In to client Excel installations to add yet another layer of advanced analysis that will help make data analytics more effective and efficient. When a small business reaches a level of profitability where it can incorporate a more advanced data analysis program, it can add a more robust tool such as IDEA or ACL Analytics. Improving controls by adding a specialized software program will require financial resources to acquire it and to train employees. It will also require the dedication of time from employees serving in the role of internal examiners for fraud, like internal auditors and financial personnel. Professional organizations such as the ACFE and AICPA have dedicated their time and efforts to ensuring that companies of all sizes are aware of the threats of fraud in the workplace. One suggestion I might make to these professional organizations would be to work with the developers of accounting software and of proactive data analysis tools to incorporate data analysis reports into their standard products. If a small business had the ability to run an anti-fraud report as part of its monthly management review of financial statements, without having to program the report, it would save a significant amount of company resources and improve the fraud prevention program overall.
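As one illustration of what such a canned anti-fraud report could contain, here is a first-digit (Benford’s Law) screen of transaction amounts. Benford’s Law is my own illustrative choice here, not a test named above, and the amounts are made up; in real ledgers, first digits tend to follow a predictable logarithmic distribution, and sharp departures can flag fabricated numbers:

```python
# Sketch of one test an automated anti-fraud report could run against a
# general ledger: a Benford's Law first-digit screen. Amounts are invented.
from collections import Counter
import math

def first_digit_distribution(amounts):
    """Observed frequency of leading digits 1-9 across nonzero amounts."""
    digits = [int(str(abs(a)).lstrip("0.")[0]) for a in amounts if a]
    counts = Counter(digits)
    total = len(digits)
    return {d: counts.get(d, 0) / total for d in range(1, 10)}

def benford_expected(d):
    """Benford's Law expected frequency for leading digit d."""
    return math.log10(1 + 1 / d)

amounts = [1203.45, 187.20, 1450.00, 922.10, 130.75, 118.60, 2750.00, 164.99]
observed = first_digit_distribution(amounts)
for d in range(1, 10):
    print(f"digit {d}: observed {observed[d]:.2%}, expected {benford_expected(d):.2%}")
```

A production report would of course run this over thousands of ledger entries and apply a statistical test before flagging anything; the point is that the logic is simple enough to ship as a standard menu item in accounting software.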

To sum up, according to Joseph T. Wells, founder of the ACFE, “data analytics have never been more important or useful to a fraud examiner. There are more places for fraud to hide, and more opportunities for fraudsters to conceal it.” Clearly, there are many resources available today that allow small businesses of almost any size to implement proactive data analysis tools. With the significant advances in technology, exciting new anti-fraud solutions appear on the horizon almost daily; the only thing standing between them and our clients is the decision to pick them up and use them.

The Auditor and the Fraud Examiner

Our Chapter averages about three new members a month, a majority of whom are drawn from the pool of relatively recent college graduates in accounting or finance, most of whom possess an interest in fraud examination and have a number of courses in auditing under their belts.  From the comments I get, it seems that our new members are often struck by the apparent similarities between fraud examination and auditing imparted by their formal training, and yet hazy about the differences between the two in actual practice.

Unlike financial auditing, with its financial statement focus, fraud examination involves resolving fraud allegations from inception to disposition. Fraud examination methodology requires that all fraud allegations be handled in a uniform, legal fashion and be resolved on a timely basis. Assuming there is sufficient reason (predication) to conduct a fraud examination, specific examination steps usually are employed. At each step of the fraud examination process, the evidence obtained and the effectiveness of the fraud theory approach are continually assessed and re-assessed. Further, the fraud examination methodology gathers evidence from the general to the specific. As such, the suspect (subject) of the inquiry typically would be interviewed last, only after the fraud examiner has obtained enough general and specific information to address the allegations adequately.  However, just like a financial statement audit, a fraud investigation consists of a multitude of steps necessary to resolve allegations of fraud: interviewing witnesses, assembling evidence, writing reports, and dealing with prosecutors and the courts. Because of the legal ramifications of the fraud examiner’s actions, the rights of all individuals must be observed throughout. Additionally, fraud examinations must be conducted only with adequate cause or predication.

Predication is the totality of circumstances that would lead a reasonable, professionally trained, and prudent individual to believe a fraud has occurred, is occurring, or will occur. Predication is the basis upon which an examination is commenced. Unlike financial audits, a fraud examination should never be conducted without proper predication. Each fraud examination begins with the prospect that the case will end in litigation. To solve a fraud without complete and perfect evidence, the examiner must make certain assumptions. This is not unlike the scientist who postulates a theory based on observation and then tests it. In the case of a complex fraud, fraud theory is almost indispensable. Fraud theory begins with a hypothesis, based on the known facts, of what might have occurred. Then that hypothesis, or key assumption, is tested to determine whether it’s provable.

The fraud theory approach involves the following steps, in the order of their occurrence:

  • Analyze available data.
  • Create a hypothesis.
  • Test the hypothesis.
  • Refine and amend the hypothesis.
  • Accept or reject the hypothesis based on the evidence.

With that said, fraud examinations incorporate many auditing techniques; however, the primary differences between an audit and a fraud investigation are the scope, methodology, and reporting. It’s also true that many of the fraud examiners in our Chapter (as in every ACFE Chapter) have an accounting background. Indeed, some of our members are employed primarily in the audit function of their organizations. Although fraud examination and auditing are related, they are not the same discipline. So how do they differ?  First, there’s the question of timing.  Financial audits are conducted on a regular recurring basis while fraud examinations are non-recurring; they’re conducted only with sufficient predication.

The scope of a financial audit is general, an examination of financial data as a whole, while a fraud examination is conducted to resolve specific allegations.

An audit is generally conducted for the purpose of expressing an opinion on the financial statements or related information.  The fraud examination’s goal is to determine whether fraud has occurred, is occurring, or will occur, and to determine who is responsible.

The external audit process is non-adversarial in nature. Fraud examinations, because they involve efforts to affix blame, are adversarial in nature.

Audits are conducted primarily by examining financial data. Fraud examinations are conducted by (1) document examination; (2) review of outside data, such as public records; and (3) interviews.

Auditors are required to approach audits with professional skepticism. Fraud examiners approach the resolution of a fraud by attempting to establish sufficient proof to support or refute an allegation of fraud.

As a general rule during a financial fraud investigation, documents and data should be examined before interviews are conducted. Documents typically provide circumstantial evidence rather than direct evidence. Circumstantial evidence is all proof, other than direct admission, of wrongdoing by the suspect or a co-conspirator.  In collecting evidence, it’s important to remember that every fraud examination may result in litigation or prosecution. Although documents can either help or harm a case, they generally do not make the case; witnesses do. However, physical evidence can make or break the witnesses. Examiners should ensure that the evidence is credible, relevant, and material when used to support allegations of fraud.

From the moment evidence is received, its chain of custody must be maintained for it to be accepted by the court. This means that a record must be made when the item is received or when it leaves the care, custody, or control of the fraud examiner. This is best handled by a memorandum of interview with the custodian of the records when the evidence is received.

Fraud examiners are not expected to be forensic document experts; however, they should possess knowledge superior to that of a lay person.

In fraud investigations, examiners discover facts and assemble evidence. Confirmation is typically accomplished by interviews. Interviewing witnesses and conspirators is an information-gathering tool critical to the detection of fraud. Interviews in financial statement fraud cases are different from those in most other cases because the suspect being interviewed might also be the boss.

In conclusion, auditing procedures are indeed often used in a financial statement fraud examination. Auditing procedures are the acts or steps performed by an auditor in conducting the review. According to the third standard of fieldwork of generally accepted auditing standards, “The auditor must obtain sufficient appropriate audit evidence by performing audit procedures to afford a reasonable basis for an opinion regarding the financial statements under audit.”  Common auditing procedures routinely used during fraud examination, as during financial statement examination, are confirmations, physical examination, observation, inquiry, scanning, inspection, vouching, tracing, re-performance, re-computation, analytical procedures, and data mining; these are all vital tools in the arsenal of fraud examiners and financial assurance professionals alike.
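Several of these procedures, scanning and re-performance in particular, reduce to very small computations. As one hypothetical illustration, accounting for the numerical sequence of a check register, as an examiner might do when testing whether checks have been removed, can be expressed like this (the check numbers are invented):

```python
# Illustrative "scanning" procedure: account for numerical sequence in a
# check register and flag gaps that could indicate removed checks.
# Check numbers are hypothetical.
def sequence_gaps(check_numbers):
    """Return check numbers missing from an otherwise continuous sequence."""
    present = set(check_numbers)
    return [n for n in range(min(present), max(present) + 1) if n not in present]

register = [5001, 5002, 5003, 5005, 5006, 5008]
print(sequence_gaps(register))  # → [5004, 5007]
```

Each flagged number is not proof of anything by itself; it is simply a lead the examiner then runs down with inspection, vouching, and interviews.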