In this technologically driven age, forensic accountants use a wide variety of tools to mine relevant data from the digital mountain.
William Wilkinson reports
The FSA’s prosecution of banker Christian Littlewood and his wife marked the regulator’s sixth successful case against insider dealers.
In a briefing about the Littlewood prosecution Margaret Cole, director of enforcement at the FSA, referred to the two-year investigation as a “long, hard slog through reams of evidence”, flagging the need for investigators to avoid getting bogged down in a mass of information and detail while painstakingly sifting through the evidence.
The FSA amassed 1,700GB of computerised material, 43,000 hard-copy pages and 10 years’ worth of banking and trading records across 150 bank accounts and 18 trading accounts.
Such investigations are becoming increasingly common for the UK’s regulatory bodies, and while any lawyer involved in inter-company litigation will be aware of the vast amount of data that a forensic investigation uncovers, questions are often asked about the processes involved in the uncovering stage. So how does one go about capturing and reviewing such a wealth of information in a forensically sound way?
During complex investigations, be they enforcement actions by the FSA, Office of Fair Trading inquiries or inter-company litigation, the devil is in the detail. But those devilish details can now be stored almost anywhere from BlackBerry servers to USB memory sticks.
Take, for example, a multiple-fraud investigation – the brief can be as broad as ‘secure everything’. The challenge is uncovering the smoking gun once the data has been secured. Moreover, the methods for securing electronic data to court-tested standards are almost as varied as the systems from which that information comes.
Where data is stored on one or more sources, such as laptops, desktops or USB storage devices, the use of traditional write-blocking hardware is still prevalent. This allows an investigator to examine the content of a hard disk without risking changing it. Once a system has been deemed potentially relevant, a clone – a bit-for-bit forensic image – is acquired, allowing the investigator to analyse the data without touching the original system again.
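In practice, the soundness of a clone is demonstrated by comparing cryptographic digests of the original and the image. The sketch below illustrates that verification step only; the function names are illustrative, not those of any particular forensic tool.

```python
import hashlib

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Compute the SHA-256 digest of a file, reading in chunks so
    arbitrarily large disk images fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_clone(original: str, image: str) -> bool:
    """A clone is only evidentially useful if its digest matches the
    digest taken from the original source."""
    return sha256_of(original) == sha256_of(image)
```

If the two digests match, the investigator can work entirely from the image, knowing any later tampering would change its hash.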
Where the investigator encounters data stored on files or email servers, the appropriate collection methodology is determined by factors such as where the server is based (geographically and jurisdictionally), its size (both physical and in terms of data storage), whether it can be shut down, and whether it is backed up regularly. Depending on these things, the server may either be imaged in its entirety or a small subset of data secured live using specialist tools to maintain evidential provenance. Occasionally, the previous night’s back-up tape will suffice.
Mobile phones, smartphones and other more unusual electronic units such as GPS devices or printers may be dealt with using specialised equipment. They can also be sub-contracted to an organisation that specialises in such devices. All these sources of electronic data can produce information equating to millions of files. Printing it all out would require vast tracts of forest and an equally vast number of reviewer-hours to digest.
Fortunately, computers also provide the key to sifting through all this data to find the relevant material. Many types of files are common across computers – Windows operating system files, for example – and a database of the digital fingerprints of these files can be used to rapidly exclude them. An investigator can use specialised software that can read electronic files to create an index of words contained therein.
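This exclusion step works by comparing each file’s hash against a reference set of known, standard files (real investigations use large published sets such as NIST’s NSRL). A minimal sketch, with a hypothetical one-entry reference set:

```python
import hashlib

# Hypothetical mini reference set of "known file" fingerprints; in
# practice this would be a large database such as the NSRL hash set.
KNOWN_HASHES = {
    hashlib.sha256(b"standard OS file contents").hexdigest(),
}

def is_known_file(data: bytes) -> bool:
    """True if the file's fingerprint matches the known-file set."""
    return hashlib.sha256(data).hexdigest() in KNOWN_HASHES

def filter_unknown(files: dict) -> dict:
    """Drop operating-system and other standard files, keeping only
    material that might be relevant to the investigation."""
    return {name: data for name, data in files.items()
            if not is_known_file(data)}
```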
For files whose text cannot be read electronically, more specialised software can be used that examines each file to see if it can recognise words. This optical character-recognition (OCR) technology can also be applied to paper-based records to create searchable electronic versions.
If there have been several sources of data – such as a server and some of its back-up tapes – there could be many files that are the same in each source of data. An investigator can use software tools that can de-duplicate data, leaving only one copy of each file to review.
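De-duplication typically rests on the same fingerprinting idea: files with identical content produce identical hashes, so only one copy per hash need be reviewed. A minimal sketch, assuming files are supplied as a mapping from source path to raw bytes:

```python
import hashlib

def deduplicate(files: dict) -> dict:
    """Collapse duplicate files from multiple sources (e.g. a server
    and its back-up tapes) to a single copy each.

    Returns a mapping from content hash to the first path seen with
    that content."""
    seen = {}
    for path, data in files.items():
        digest = hashlib.sha256(data).hexdigest()
        seen.setdefault(digest, path)  # keep only the first occurrence
    return seen
```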
Investigators can also now run keyword searches across virtually all electronic data, identifying potentially relevant files. Depending on the words chosen, this may still bring back tens of thousands of files.
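A keyword search over an indexed collection can be sketched with an inverted index – a mapping from each word to the documents containing it – which is broadly how such tools make searches across millions of files fast. The document names and keywords below are illustrative only:

```python
import re
from collections import defaultdict

def build_index(documents: dict) -> dict:
    """Build an inverted index: word -> set of document names."""
    index = defaultdict(set)
    for name, text in documents.items():
        for word in re.findall(r"[a-z0-9']+", text.lower()):
            index[word].add(name)
    return index

def keyword_search(index: dict, keywords: list) -> set:
    """Return documents containing ANY of the keywords
    (case-insensitive)."""
    hits = set()
    for kw in keywords:
        hits |= index.get(kw.lower(), set())
    return hits
```

As the article notes, the quality of the result depends heavily on the words chosen: broad terms can still return tens of thousands of files.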
If there are relatively few files, a single reviewer might be able to manage looking at all the results, but in larger or more complex cases there will be far too many files for one person to read – and potentially several issues under investigation.
This is the stage at which a dedicated review system comes into its own: a technical solution whereby electronic data is centralised in a suitable IT environment and multiple reviewers can access the system simultaneously. Teams of investigators or reviewers can read the electronic files and break them down into various categories, giving files tags such as ‘responsive’, ‘legally privileged’, ‘irrelevant to issue’ and so on.
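At its core, such a review platform maintains a shared set of tags against each file, which reviewers add to and filter by. A toy sketch of that data structure (the class and tag names are illustrative, not those of any commercial review product):

```python
from collections import defaultdict

class ReviewPlatform:
    """Toy model of a shared review system: reviewers apply tags to
    files, and the case team filters files by tag."""

    def __init__(self):
        self.tags = defaultdict(set)  # file name -> set of tag labels

    def tag(self, filename: str, label: str) -> None:
        self.tags[filename].add(label)

    def files_with(self, label: str) -> set:
        return {f for f, labels in self.tags.items() if label in labels}
```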
An investigator can then look through these filtered and narrowed search results for the document that may be the smoking gun; at the very least, the results may point to other areas of investigation or sources of data.
William Wilkinson leads the technology forensic team in BDO’s technology advisory services unit