This chapter will explain a methodology for using fraud data analytics in the search for fraud scenarios in financial statement accounts. Chapters 14 and 15 will explain how to use the methodology for revenue and journal entry testing.
This chapter will focus on uncovering fraud scenarios that are recorded in the general ledger or omitted from the general ledger either in a source journal or through a manual journal entry. This chapter will not focus on standards of care required by the professional auditing standards to uncover material misstatement through an intentional error. It is assumed the reader has knowledge of GAAS and will incorporate the standards into the data analytics plan.
I would encourage the reader to study the known financial statement frauds that occurred to improve their personal knowledge of how management has misstated financial statements. The knowledge will assist the auditor in the data interpretation aspect of the methodology. It is always easier to see a fraud scenario the second time versus the first time. Through the study of previous financial statement fraud cases, the auditor will learn the concept of fraud predictability or the logic of building the fraud analytics plan.
In one publicly traded company, revenue was materially overstated through a series of small-dollar general journal entries that in the aggregate caused a material misstatement. Financial statement history has therefore taught the profession that fraud can be recorded in one large transaction or in a series of transactions that in the aggregate cause a misstatement of a financial statement account. In another publicly traded company, management intentionally failed to write off damaged or lost assets. Once again, history has taught the profession that the absence of a write-off journal entry would have highlighted this fraud scenario. Remember the famous phrase: Knowledge is power.
The methodology for fraud in financial accounts is the same as described in the first five chapters but needs to be adapted for the nuances of fraud in financial statements versus asset misappropriation or corruption. The fraud scenario statement has the same four components. However, the concept of a material overstatement or understatement must be included in the fraud scenario. The fraud action statement needs to be adapted for how, where, and when the fraudulent transaction is recorded in the general ledger.
The auditor is responsible for detecting misstatements, whether caused by error or fraud in the financial statements that are material and would impact an investor's decision to purchase the stock or for a bank to provide a loan to a company. The auditor is not responsible for uncovering all fraud but fraud that is material. Therefore, the fraud data analytics plan should incorporate the concept of fraud and materiality.
Lastly, because fraud in financial statements requires the intent to conceal the fraud, the element of concealment needs to be considered in building the fraud data analytics plan. Remember from our discussion in Chapters 2 and 3 that concealment is separate from the fraud scenario. Concealment also has three levels of sophistication.
Within this chapter, there are two overall approaches for searching for fraud in financial statements. First, search for the fraud data profile of the recorded fraud scenario. Second, search for the predictable concealment strategies associated with the fraud scenario. Before we start the discussion, we need to explain a few terms associated with financial statement misstatement.
An error is a mistake in the recognition, presentation, or disclosure in the financial statements. The error could be caused by a mistake in applying GAAP, an internal control weakness resulting in improper recognition, or an error in judgment or assumptions. In one sense, the list is endless. However, an error does not occur with the intent to misstate the financial statements. Therefore, there is no desire to conceal the error at the time the error occurs. However, the error may result in a material misstatement of the financial statements. Fraud data analytics can assist the auditor in searching for an error by searching for the fraud data profile of the transaction. However, fraud data analytics will not detect the misstatement through the search for the predictable concealment strategies because there was no intent to conceal the truth.
To illustrate, let's assume expenses were understated at year-end. The search for 2016-dated vendor invoices in the 2017 purchases journal would detect the error. If the error was intentional, the 2016 invoices would be recorded in the journal after the opinion date or with a 2017 date rather than the correct date.
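This cut-off test can be sketched as a simple data interrogation routine. The journal extract, vendor codes, and amounts below are hypothetical; the logic simply flags prior-year-dated invoices posted in the current-year purchases journal:

```python
from datetime import date

# Hypothetical extract of the 2017 purchases journal: each row carries the
# vendor invoice date as stated on the invoice itself.
purchases_2017 = [
    {"vendor": "V100", "invoice_date": date(2017, 1, 5),   "amount": 12_000},
    {"vendor": "V205", "invoice_date": date(2016, 12, 28), "amount": 48_500},
    {"vendor": "V318", "invoice_date": date(2016, 12, 30), "amount": 9_750},
]

# Flag prior-year-dated invoices recorded in the current-year journal --
# candidates for an understatement of 2016 expenses.
flagged = [row for row in purchases_2017 if row["invoice_date"].year < 2017]
```

If the concealment was a false 2017 invoice date, this routine alone would not catch it; a second test matching invoice dates to receiving or service dates would be needed.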
The concept goes by many names: income smoothing, cookie jar reserves, creative accounting, and aggressive accounting. By whatever name, earnings management is a strategy used by management to achieve a predefined earnings target. It usually occurs through changes in accounting practices or through aggressive estimates for accruals or adjustments. It can either result in understatement in the current year to assist in achieving next year's earnings or result in overstatement in the current year through aggressive accounting practices.
Earnings management is sometimes called the gray area between errors causing misstatement and intent by management causing misstatement of the financial statements. It may not be a clear violation of GAAP, but it does distort the true financial picture of the company or cause a material misstatement. The planning reports may highlight areas of misstatement, but fraud data analytics is not really intended to search for the type of misstatement caused by earnings management through the improper application of GAAP.
Financial statement fraud is the process of intentionally misleading the reader of the financial statements. It is the deliberate misrepresentation, misstatement, or omission of financial data to provide the impression that the organization is financially sound.
The key aspect in determining whether the misstatement is fraudulent is establishing the intent of management. Is there clear and convincing evidence that management intended to misstate the financial statements, or was the misstatement caused by an unintentional error? The job of the fraud data analytics auditor is to identify the fraud scenario in the general ledger. The job of the fraud auditor is to offer an opinion on whether the scenario occurred with the intent to conceal the fraud scenario and whether the fraud scenario is material to the financial statements.
The answer is simple. An error is a mistake, whereas fraud is an intentional act that management conceals from the auditor. For this reason, using the predictable concealment strategies is an effective method for searching for misstatement through fraudulent efforts.
The starting point for building the fraud data analytics plan is for the fraud auditor to understand the inherent fraud scheme concept, and how to write a fraud scenario for the inherent fraud scheme.
The inherent fraud scheme concept and the methodology for writing a fraud scenario described in earlier chapters are the same for financial statement fraud scenarios. What changes is the addition of the financial account impacted by the fraud scenario, the type of misstatement, and how to write the fraud action statement. In Chapters 14 and 15, respectively, we will describe the nuances of writing fraud scenarios for revenue and journal entries.
Let's review the components of a fraud scenario for financial statement fraud:
The following provides examples of how to write a financial statement fraud scenario using the prescribed format:
All three fraud scenarios involve overstatement of revenue by the controller. However, the important point is to understand that the fraud data analytics for the three scenarios are very different. The approach for each scenario is as follows:
The preceding three scenarios demonstrate the concept that the fraud data analytics must be designed for a specific fraud scenario.
Creating the fraud scenario requires the fraud auditor to be able to identify the fraud action statement in order to build the fraud interrogation routine. To repeat, a fraud action for financial statement fraud has the following components:
Building the fraud action statement starts with creating the list of permutations by changing the variables and then changing the generic permutations to be account‐specific. Once the permutation analysis is understood, the fraud auditor then links the specific technique to the person committing the fraud scenario.
Starting with the concept of using generic permutation analysis to create fraud action statements:
Remember, the fraud auditor does not know which fraud scenario the controller is going to commit, but the fraud auditor does know all the fraud scenarios the controller can commit. By starting with the generic fraud action statement and applying the generic statement to the specific account, the fraud auditor is assured regarding the completeness of the fraud scenario listing. Using the generic fraud action statement, we will identify the applicable fraud scenarios to be included in our fraud data analytics plan. Let's assume the balance sheet has an asset entitled capitalized advertising expenditures.
The first step is now completed. The fraud auditor has built the foundation of the fraud data analytics plan by identifying the permutations of the fraud scenario. GAAS will require the fraud auditor to assess the likelihood of the scenarios. Assuming the likelihood assessment has labeled the fraud scenario as an identified fraud risk for purposes of testing, the fraud auditor knows the answers to the where, when, and how questions in order to build the data interrogation routine. The astute auditor understands that the preceding list of scenarios can be easily converted to each asset account by adapting the fraud action statement for the financial account.
The second step is to incorporate the GAAP implications into the fraud scenario. As a matter of style, the GAAP assertion can be written into the fraud scenario or documented as part of the overall plan. The problem with incorporating GAAP into the statement is that the fraud scenario may become too wordy.
The GAAS standards, as described in SAS #106 (Audit Evidence), discuss the concept of the risk of material misstatement at the financial statement and assertion levels. The fraud data analytics plan must link to the assertions described in the standard. The following illustrates how the fraud action statement links to the assertion statements for classes of transactions and events:
Answering the fraud action statement questions of where, when, and how will help the auditor achieve the goals of SAS #106.
Effective financial statement analysis and interpretation begins with an understanding of the types of questions a fraud auditor needs to ask concerning where, when, and how the financial statements may be misstated. The profession abounds with ratios and variance analysis, both vertical and horizontal. Fraud data analytics is another tool for the auditor to understand the data. One difference between the traditional tools and fraud data analytics is that the fraud auditor is searching for an intentional error that is material and embedded in the transactional history of a source journal or a general journal entry. A second difference is that fraud data analytics looks at all transactions versus a random sample of transactions. Fraud data analytics should complement the existing tools rather than replace them.
Understanding the data starts with understanding what transactions created the account balance. The use of disaggregated analysis of an account balance is a powerful tool for the fraud auditor to determine what journal created the account balance. The tool allows the auditor to understand whether the account was created through a source journal or a general journal entry. This knowledge tells the auditor from a materiality factor whether it is more predictable that the scenario is embedded in the source journal or general journal entry.
The trial balance is the starting point of understanding where, when, and how the financial statements are misstated through fraud scenarios. The first report should be an analysis of each general ledger account as to whether the balance was created through a source journal or through a journal entry. The analysis should provide a net summary of the debits and the credits that compose each financial account through the source journals and the general journal entries.
The purpose of the report is to understand how the account balance was created. All account balances are created through a source journal or a general journal entry. By understanding the composition and the dollar value of the recorded transactions, the fraud auditor can start to develop the fraud data analytics plan. The analysis does require the fraud auditor to understand whether the account balance should be created from a source journal or through general journal entries.
For a simple illustration, if 95 percent of the revenue total is created from a source journal and 5 percent of total revenue is created from general journal entries, then the most likely location for a material overstatement is in the source journal. If the source journal is the primary source of the balance, then the most likely fraud scenarios are associated with either false revenue or improper recognition of real revenue. The next round of reports would be designed to determine the likelihood of false revenue or improper recognition.
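The disaggregated analysis behind this 95/5 illustration might be sketched as follows. The ledger extract is hypothetical; the assumption is simply that each posting to the revenue account carries a tag identifying the journal that created it:

```python
# Hypothetical general ledger detail for the revenue account: each posting
# is tagged with its originating journal ("SJ" = sales source journal,
# "GJ" = general journal entry).
revenue_postings = [
    {"journal": "SJ", "amount": 4_750_000},
    {"journal": "SJ", "amount": 4_300_000},
    {"journal": "GJ", "amount": 250_000},
    {"journal": "GJ", "amount": 225_000},
]

# Net the postings by journal type, then express each as a percentage of
# the account balance to see where a material misstatement is most likely.
totals = {}
for row in revenue_postings:
    totals[row["journal"]] = totals.get(row["journal"], 0) + row["amount"]

grand_total = sum(totals.values())
composition = {j: round(100 * amt / grand_total, 1) for j, amt in totals.items()}
```

In this sketch the source journal creates 95 percent of the balance, so the next round of reports would interrogate the source journal rather than the journal entries.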
For key financial statement accounts supported by a sub‐ledger, the use of disaggregated analysis would recreate the control account balance by each account in the sub‐ledger. The second level of disaggregated analysis is to determine whether the sub‐account was created by source journal or general journal entries. The purpose of the analysis is to determine where the material entries are located in the sub‐ledger. Is the source journal creating the account balance or a general journal entry?
To illustrate the concept, using the work-in-process inventory account, let's assume the disaggregated analysis indicates that a general journal entry is creating 60 percent of the balance for three accounts in the sub-ledgers. The next step would be a disaggregated analysis of the journal entries themselves. In another example, we will assume that the analysis indicates that 100 percent of the account balance for one account in the sub-ledger was created from the accounts payable source journal. Furthermore, the disaggregated analysis of the accounts payable transactions reveals that one vendor invoice from a real vendor for $500,000 was recorded on December 31. Should this be suspicious?
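The $500,000 December 31 invoice in the example can be isolated with a simple filter. The postings, dates, and threshold below are hypothetical assumptions; in practice the dollar cutoff would come from the materiality analysis in the audit plan:

```python
from datetime import date

# Hypothetical accounts payable postings to one sub-ledger account.
ap_postings = [
    {"vendor": "Acme Supply", "post_date": date(2016, 7, 14),  "amount": 42_000},
    {"vendor": "Acme Supply", "post_date": date(2016, 12, 31), "amount": 500_000},
]

YEAR_END = date(2016, 12, 31)
THRESHOLD = 250_000  # illustrative materiality-based cutoff

# Flag large-dollar postings recorded on the last day of the year --
# the combination of amount and posting date is the clue.
suspicious = [p for p in ap_postings
              if p["post_date"] == YEAR_END and p["amount"] >= THRESHOLD]
```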
The second tool is the data summarization feature of existing software to summarize journal activity based on a specific criterion:
The understanding-the-data concept is the same concept described in the planning reports section of earlier chapters. The purpose of this stage of fraud data analytics is to improve the predictability of the planning stage of the audit by highlighting those accounts or transactions that have the highest predictability of being associated with a fraud scenario.
In Chapter 4, we discussed the steps of building a fraud data analytics plan. The same steps apply for financial statements; however, due to the unique nature of financial statements, we need to answer additional questions to build a comprehensive plan to detect fraud scenarios in the financial accounts.
The planning stage requires the auditor to brainstorm by discussing the susceptibility of a financial account in having a material error arising from either an error or fraud. From the fraud perspective, the discussion should include the fraud scenarios that could impact a significant account balance and how the fraud scenario would be concealed in the account balance, or, said another way, the red flags of the scenario or red flags of the concealment technique.
The starting point is to identify which accounts individually or in the aggregate are deemed to be a significant account balance. For assets, are we focusing on inventory, accounts receivable, or capitalized expenses? The brainstorming should identify the financial account and discuss the type of scenarios that would be predictable for the account and the company. The focus of the discussion is on the fraud action statement as to how the type of error would occur and be concealed.
The six primary techniques used to overstate an asset are:
The four primary techniques used to understate an asset are:
The four primary techniques used to overstate an expense are:
The four primary techniques used to understate an expense are:
The second aspect of the brainstorming should be the predictable concealment techniques used to conceal the fraud scenario and what is the fraud data profile of the concealment technique. To illustrate, if management is overstating inventory, what are the concealment techniques, and what is the fraud data profile?
In one famous inventory fraud scheme, management tricked the auditors by transferring inventory from one physical location to the physical location where the auditor was performing test counts. Therefore, the fraud data analytics would search for inventory transfers at year‐end (the fraud action) or search for the reversal of the transfer after year‐end (fraud concealment). The next step is to incorporate the GAAP implications.
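This transfer-and-reversal search might be sketched as follows. The transfer log, location codes, and ten-day window are hypothetical assumptions; the routine pairs each transfer into the count location near year-end with an offsetting transfer out shortly after year-end:

```python
from datetime import date, timedelta

# Hypothetical inventory transfer log: (item, from_location, to_location, date).
transfers = [
    ("SKU-9", "WH-2", "WH-1", date(2016, 12, 29)),  # into the count location
    ("SKU-9", "WH-1", "WH-2", date(2017, 1, 6)),    # reversed after year-end
    ("SKU-4", "WH-3", "WH-1", date(2016, 6, 2)),
]

YEAR_END = date(2016, 12, 31)
WINDOW = timedelta(days=10)  # illustrative look-back / look-forward window

# Fraud action: transfers arriving shortly before year-end.
year_end_in = [t for t in transfers
               if YEAR_END - WINDOW <= t[3] <= YEAR_END]

# Fraud concealment: the mirror-image transfer shortly after year-end.
reversals = [t for t in transfers
             for o in year_end_in
             if t[0] == o[0] and t[1] == o[2] and t[2] == o[1]
             and YEAR_END < t[3] <= YEAR_END + WINDOW]
```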
The accounting policies identify how a transaction should be recorded to comply with GAAP. The auditor needs to ensure that the accounting practices comply with GAAP. However, from a fraud data analytics perspective, the company's accounting policies will identify how a transaction should be recorded. Therefore, an anomaly can be described as a transaction recorded contrary to the company's accounting guidelines.
The process starts with the fraud risk assessment prepared as part of the planning process. One of the goals of the fraud risk assessment is the process of identifying those fraud risks that could have a material impact on the financial statement. For purposes of this book, we will refer to the identified fraud risk concept as a fraud scenario. From a GAAS perspective, a fraud scenario is called an identified fraud risk.
The second step of the process is the planning reports used in the understanding of the data step. If the planning reports do not suggest material amounts associated with the fraud data profile, then more than likely, that scenario is not occurring.
The correct overall strategy is dependent on many variables, ranging from the nature of the industry to how the general ledger is maintained. The correct answer to the question is determined on a scenario‐by‐scenario basis. This is why understanding how to write a fraud scenario becomes the basis for developing the fraud audit plan for searching for a material error in the financial statement. In practice, you will learn that one fraud data analytics plan will link to several fraud scenarios. It is the completeness of the thought process that is critical to the exercise of judgment in building the fraud audit approach for financial accounts.
Your initial use of this process will seem overly bureaucratic; however, eventually the process will become just a way of thinking. The thought process is important to understand what your data analytics plan is designed to detect and what the data analytics plan will not detect. The following steps are necessary to build the data interrogation routines:
GAAS provides guidance on which accounts should be the focus of the audit. We will assume for purposes of illustration that the GAAS analysis has indicated that inventory is a significant account balance. The fraud scenario for illustration is:
As the auditor understands, one fraud scenario typically impacts more than one financial account. The beauty of the double-entry system is that the fraud scenario can be detected through two accounts. Yes, it is a matter of style, and yes, a matter of double-entry accounting, but ultimately the fraud data analytics is searching for either the fraud scenario or the fraud scenario's concealment strategy.
The direction of testing is a critical question that must be considered in the planning stage. The fraud risk factors or conditions of fraud used in the planning stage of the audit should provide the auditor with clues as to the direction of testing. The primary purpose of the direction of testing question is to determine if the fraud data analytics is searching for overstatement or understatement. From a simple perspective, if the fraud auditor is performing a valid test on the wrong year of data, then the fraud data analytics will not uncover the fraud scenario.
The direction of misstatement will also provide a clue as to which year to look for the misstatement. From a fraud data analytics perspective, there are three years that should be considered in building the fraud data analytics plan. Assuming the year of audit is the calendar year ending December 31, 2016, the following describes the direction of testing for over‐ or understatement:
To further illustrate the concept, we will assume a year‐end of December 31, 2016, fiscal year. The year under audit, 2016, is a collection of all transactions recorded in the general ledger through either a source journal or general journal entries. Said another way, all of the fraud scenarios committed by management are either recorded in the general ledger or not recorded in the general ledger (i.e., the general ledger reflects the absence of a transaction that should have been recorded). The real question for the fraud auditor is to determine in which year to search for the fraud scenario causing the over‐ or understatement. The examples will assume the primary focus is the balance sheet accounts while understanding the impact on the other side of the entry:
Using the previous fraud scenario:
The fraud data analytics would search for the absence of a journal entry in the 2016 year or through disaggregated analysis of the 2016 inventory subledger.
The overstatement or understatement question is an important one for building the fraud data analytics plan. Without considering the question, the fraud data analytics does not know which year of data to use in the search for the fraud scenario.
GAAS requires the auditor to consider the nature, timing, and extent of the audit procedures. The correlation of the opinion date and the year‐end has a direct impact on the timing of the fraud data analytics.
The next consideration is the proximity of the opinion date to the financial statement date or the timing of the fraud data analytics. As the opinion date moves closer to the financial statement date, the ability to search 2017 for overstatement or understatement in the 2016 financial statement accounts becomes difficult. Therefore, performing the fraud data analytics during the midyear time period or through the use of retrospective analysis becomes a more viable approach. The when and where questions will depend on the opinion date proximity to the financial statement date and which financial statement account we are interrogating.
Let's first consider the when question using two assumptions:
Using the previous fraud scenario and assuming a January 31, 2017, opinion date:
The opinion date does not provide sufficient time to use 2017 data; therefore, the analysis must focus on the 2016 ledger. The procedure most likely will need to be performed earlier in the year.
Financial statements are prepared from the general ledger unless management is creating top-sided journal entries. General ledgers are a summary of source journals and manual journal entries. Therefore, if a fraud scenario is lurking in the general ledger, the fraud scenario had to be recorded through a source journal or through a journal entry. The exception to the preceding statement is the transaction or journal entry that is not recorded.
The answer to this question tells the fraud auditor which journal to search for the fraud scenario. That is why understanding the data question, previously discussed, is so critical.
Using the previous scenario:
In this scenario, it is the absence of the journal entry that is causing the overstatement, whereas the concealment technique is lurking in the subledger, caused by a false movement in either the inventory journal or the sales journal.
The fraud data analytics searches for either the recorded scenario or how the fraud scenario is concealed in the general ledger. The decision is made on a scenario‐by‐scenario basis on an account‐by‐account basis. The following guidelines are useful in determining which approach:
Using the previous fraud scenario:
For this scenario, the overstatement occurs due to the failure to record a transaction. Therefore, the fraud data analytics will search for evidence on how obsolescence is concealed in the subledgers.
The decision is both a matter of style and based on the data interpretation of the planning reports. The first planning report is the disaggregated analysis of each general ledger account, which is intended to describe what journal created the account balance or what journals created the change from the beginning balance to the ending balance. The review of this report should provide some direction in answering the question.
The answer to the question will depend on the stage of the audit as to testing internal controls or substantive testing of account balances. The answer will also depend on the fraud scenario and the associated concealment strategy. Using the previous fraud scenario:
For this scenario, there is no transaction recorded. Therefore, the summary of journal entries by account balance would reveal that no write‐downs have occurred. The second analysis would use disaggregated analysis of the subledger.
The disaggregated approach would compare beginning inventory to ending inventory, searching for inventory lines that have zero movement or minimal movement. A similar approach would use the sales journal and summarize by product number. The exclusion theory would exclude all product lines having significant sales movement. The report would then compare sales by product line to the inventory movement report. The sample selection is based on inventory line items with no movement and no sales. This assumes that management has not attempted to conceal the lack of inventory movement.
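The exclusion-theory report described above might be sketched as follows, assuming hypothetical movement and sales extracts summarized by product number; the thresholds are illustrative, not prescribed:

```python
# Hypothetical sub-ledger extracts for the year under audit: quantity
# movement per inventory line and unit sales per product.
inventory_movement = {"P-100": 0, "P-200": 1_450, "P-300": 3}
sales_by_product  = {"P-100": 0, "P-200": 12_000, "P-300": 0}

MOVEMENT_FLOOR = 5  # illustrative threshold for "minimal movement"

# Exclusion theory: drop product lines with significant movement or sales;
# what remains is the sample of candidate obsolete inventory.
sample = [item for item, moved in inventory_movement.items()
          if moved <= MOVEMENT_FLOOR and sales_by_product.get(item, 0) == 0]
```

As the text notes, the routine assumes management has not concealed the lack of movement; a false transfer or false sale would defeat this specific test.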
As described in earlier chapters, fraud concealment is different from the fraud scenario. Fraud concealment theory in financial statements is the same concept as described for asset misappropriation or corruption fraud schemes. The difference is that the method of concealment is recorded in the general ledger in the 2015, 2016, or 2017 year. Second, history has taught us that each financial account has predictable concealment strategies.
Concealment is an integral part of hiding the fraud scenario from the fraud auditor. The general concealment may occur through how the transaction is recorded, how the transaction is reversed, when the transaction is recorded, or in which section of the financial statement the transaction is recorded. The specific concealment strategies are those actions taken by management to falsify the financial statement assertions. An integral part of the concealment is the documentation supporting the transaction. However, the documentation aspect is not critical to the fraud data analytics plan but is critical to developing the fraud audit program.
The second consideration to identifying the concealment strategies is to understand the accounting principles associated with the fraud scenario and the financial account. Using each accounting principle for the financial account, the fraud auditor would brainstorm how management would provide the illusion of compliance with the accounting principles.
Consistent with Chapter 3, there are two types of concealment strategies. There are general concealment strategies and the concealment strategies that directly correlate to creating the illusion that the fraud scenario complies with GAAP or complies with the company's accounting practices.
How is the transaction recorded?
How is the transaction reversed?
When is the transaction recorded?
In which section of the financial statement is the transaction recorded?
What is the concealment strategy which creates the illusion of compliance with GAAP?
The discussion starts with the specific fraud scenario and focuses on the fraud action statement and the accounting practices for proper recognition of the transaction. The goal of management is to create the illusion of compliance with GAAP and record the transaction in a way to hide the truth from the fraud auditor.
To illustrate the concept, we will explain the techniques to conceal obsolete inventory from the fraud auditor. In the example, the company accounting practice does not recognize a reserve for obsolescence and uses the direct write‐off method at time of obsolescence. The accounting practice is to review inventory usage reports to determine which inventory items should be written off. We will further assume the accounting policy is correct.
Using the previous fraud scenario:
The following concealment techniques allow the controller to hide the fraud scenario from the fraud auditor:
Financial statements are overstated or understated by a single intentional error or a series of intentional errors in the aggregate. The fraud error is the fraud scenario. Many high‐profile financial statement frauds involve either one major fraud scenario or many small fraud scenarios. The fraud scenario may have been committed by one large transaction in one account or many smaller transactions in one account that in the aggregate are material.
The concept of a large error needs to be defined through a general ledger account(s) associated with the materiality concept for a financial statement or financial account. Using inventory, the trial balance may indicate that there are many accounts associated with the amount of inventory reported on the financial statement.
The large error may be hidden in raw materials, work in process, or finished goods, or spread throughout all the inventory accounts. The fraud data analytics plan must link the concept of a large error associated with a fraud scenario to a series of general ledger accounts. The fraud data analytics will create a homogeneous data set of large-dollar transactions as defined through the materiality concept. For a large-dollar transaction, the clues are the amount of the transaction and the posting date of the transaction.
The concept of a small error has the same linkage to the general ledger accounts associated with the material number on the financial statement. However, in the small error scheme, there are many transactions posted. The fraud data analytics will create a homogeneous data set of small‐dollar transactions, as defined through the materiality concept. The first report is a summary report linked to the inventory line item providing aggregate dollar value, record count, largest dollar, smallest dollar, and average dollar. For a series of small transactions the clue is the high frequency and the high aggregate dollar value.
The reason for the two data files correlates back to Chapter 4 with the use of the inclusion and exclusion theory regarding shrinking the size of the haystack. Spotting an error in a small homogeneous data set is easier than spotting an error in a large collection of data that has limited commonality. In this question, the commonality of data is the dollar amount of the transaction.
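The stratification into the two homogeneous data files might be sketched as follows, using hypothetical postings and an illustrative materiality amount. The summary for the small-dollar file mirrors the report fields described above (record count, aggregate dollar value, largest, smallest, average):

```python
# Hypothetical inventory journal lines: (inventory line item, amount).
postings = [
    ("RM-01", 480_000), ("RM-01", 12_000), ("RM-01", 9_500),
    ("WIP-02", 15_000), ("WIP-02", 14_200), ("WIP-02", 13_900),
]

MATERIALITY = 400_000  # illustrative materiality amount for the account

# Two homogeneous data sets: large-dollar items for specific identification,
# small-dollar items for frequency and aggregate-value analysis.
large = [p for p in postings if p[1] >= MATERIALITY]
small = [p for p in postings if p[1] < MATERIALITY]

# Summary report for the small-dollar file, by inventory line item.
summary = {}
for item, amt in small:
    s = summary.setdefault(item, {"count": 0, "total": 0, "largest": 0,
                                  "smallest": float("inf")})
    s["count"] += 1
    s["total"] += amt
    s["largest"] = max(s["largest"], amt)
    s["smallest"] = min(s["smallest"], amt)
for s in summary.values():
    s["average"] = s["total"] / s["count"]
```

A high record count paired with a high aggregate value in the small-dollar summary is the clue to a series-of-small-errors scheme.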
Using the previous fraud scenario with a minor change to include damaged inventory:
The sample selection should use the materiality amount as the focus of the analysis. Large transactions are those transactions that exceed the materiality amount for the account balance or are within a percentage of the materiality amount. The small transactions are the remaining transactions. The large transactions will use the specific identification strategy, whereas the small transactions would start with using a summary of transactions to search for materiality.
The right data-mining strategy is based on both the fraud data analytics strategy and the answers to all the preceding questions. As a reminder:
Using the previous fraud scenario, what is the right fraud data analytics strategy?
Specific identification is the starting point to search for the fraud scenario, the concealment strategy, and the inclusion and exclusion theory. The sample selection would either use the materiality factor of the specific identification strategy or use data interpretation based on the auditor's understanding of the industry and company.
The fraud data analytics plan must be linked to the audit program. Integrating fraud testing into the audit plan requires not only a sample plan designed to search for fraud but also testing techniques designed to pierce the concealment strategy associated with the fraud scenario.
The purpose of this chapter was to describe a methodology for creating a fraud data analytics plan for financial accounts. The next two chapters will describe how to apply the methodology to the revenue source journal and general journal entries. The fraud auditor will need to adapt the methodology to their industry and their company. Issues such as nature, timing, and extent of audit procedures will impact the success of fraud data analytics in searching for the material error. However, those issues are under the control of the auditor.
The good news is that financial statement fraud is in the general ledger. The bad news is that financial statement fraud is in the general ledger. The sheer number of transactions and journal entries recorded in a general ledger makes it imperative that auditors use fraud data analytics. Generally speaking, the fraud auditor has access to all the data necessary to locate financial statement fraud scenarios, whereas in asset misappropriation schemes and corruption schemes the fraud auditor does not have all the necessary data or information to formulate a conclusion.