Tag Archives: data sharing

Part Two: Major Investigation Analytics – Big Data and Smart Data

Posted by Douglas Wood, Editor.

As regular readers of this blog know, I spend a great deal of time writing about the use of technology in the fight against crime – financial and otherwise. In Part One of this series, I overviewed the concept of Major Investigation Analytics and Investigative Case Management.

I also overviewed the major providers of this software technology – Palantir Technologies, Case Closed Software, and Visallo. The latter two recently became strategic partners, in fact.

The major case for major case management (pun intended) was driven home at a recent crime and investigation conference in New York. Full disclosure: I attended the conference for educational purposes as part of my role at Crime Tech Weekly. Throughout the three-day conference, speaker after speaker talked about making sense of data. I think if I’d heard the term ‘big data’ one more time, I’d have gone insane. Nevertheless, that was the topic du jour as you can imagine, and the 3 V’s of big data – volume, variety, and velocity – remain a front-and-center topic for the vendor community serving the investigation market.

According to one report, 96% of everything we do in life – personal or at work – generates data. That statement probably best sums up how big ‘big data’ is.  Unfortunately,  there was very little discussion about how big data can help investigate major crimes. There was a lot of talk about analytics, for sure, but there was a noticeable lack of ‘meat on the bone’ when it came to major investigation analytics.

Nobody has ever yelled out “Help, I’ve been attacked. Someone call the big data!” That’s because big data doesn’t, in and of itself, do anything. Once you can turn ‘big data’ into ‘smart data’, however, you have an opportunity to investigate and adjudicate crime. To me, smart data (in the context of investigations) comes down to an investigator’s ability to:

  1. Quickly triage a threat (or case) using only those bits of data that are most immediately relevant
  2. Understand the larger scope of the crime through experience and crime analytics, and
  3. Manage that case through intelligence-led analytics and investigative case management, data sharing, link exploration, text analytics, and so on.
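The triage idea in step 1 can be sketched in a few lines of code. This is a hypothetical illustration only – the record fields, keywords, and scoring rule are all invented here, not taken from any vendor’s actual product:

```python
# Hypothetical sketch: triage case records by immediate relevance to a threat.
# Field names, keywords, and the scoring rule are invented for illustration.

def triage(records, keywords, top_n=5):
    """Score each record by how many threat keywords it mentions,
    and return only the most immediately relevant ones."""
    def score(record):
        text = " ".join(str(v).lower() for v in record.values())
        return sum(1 for kw in keywords if kw.lower() in text)

    ranked = sorted(records, key=score, reverse=True)
    return [r for r in ranked[:top_n] if score(r) > 0]

tips = [
    {"id": 1, "note": "white van seen near the bank on Main St"},
    {"id": 2, "note": "unrelated parking complaint"},
    {"id": 3, "note": "suspect fled in a white van"},
]
relevant = triage(tips, ["white van", "bank"])  # tips 1 and 3 surface first
```

Real triage engines weigh far more than keyword hits, of course, but the principle is the same: surface the few bits of data that matter now, and park the rest.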

Connecting the dots, as they say. From an investigation perspective, however, connecting dots can be daunting. In the children’s game, there is a defined starting point and a set of rules.  We simply need to follow the instructions and the puzzle is solved. Not so in the world of the investigator. The ‘dots’ are not as easy to find. It can be like looking for a needle in a haystack, but the needle is actually broken into pieces and spread across ten haystacks.

Big data brings those haystacks together, but only smart data finds the needles… and therein lies the true value of major investigation analytics.

Major Investigation Analytics – No longer M.I.A. (Part One)

Posted by Douglas Wood, Editor.  http://www.linkedin.com/in/dougwood

Long before the terrorist strikes of 9/11 created a massive demand for risk and investigation technologies, there was the case of Paul Bernardo.

Paul Kenneth Bernardo was suspected of more than a dozen brutal sexual assaults in Scarborough, Canada, within the jurisdiction of the Ontario Provincial Police. As his attacks grew in frequency they also grew in brutality, to the point of several murders. Then, just as police were closing in, the attacks suddenly stopped. That is when the Ontario police knew they had a problem. Because their suspect was not in jail, they knew he had either died or fled to a location outside their jurisdiction to commit his crimes.

The events following Bernardo’s disappearance in Toronto and his eventual capture in St. Catharines would ultimately lead to an intense 1995 investigation into police practices throughout the Province of Ontario, Canada. The investigation, headed by the late Justice Archie Campbell, revealed glaring weaknesses in investigation management and information sharing between police districts.

Campbell studied the court and police documents for four months and then produced a scathing report that documented systemic jurisdictional turf wars among the police forces in Toronto and the surrounding regions investigating a string of nearly 20 brutal rapes in the Scarborough area of Toronto and the murders of two teenaged girls in the St. Catharines area. He concluded that the investigation “was a mess from beginning to end.”

Campbell went on to conclude that there was an “astounding and dangerous lack of co-operation between police forces” and a litany of errors, miscalculations and disputes. Among the Justice’s findings was a key recommendation that an investigative case management system was needed to:

  1. Record, organize, manage, analyze and follow up all investigative data
  2. Ensure all relevant information sources are applied to the investigation
  3. Recognize at an early stage any linked or associated incidents
  4. “Trigger” alerts to users of commonalities between incidents
  5. Embody an investigative methodology incorporating standardized procedures

Hundreds of vendors aligned to provide this newly mandated technology, and eventually a vendor was tasked with making it real with the Ontario Major Case Management (MCM) program. With that, a major leap in the evolution of investigation analytics had begun. Today, the market leaders include IBM i2, Case Closed Software, Palantir Technologies, and Visallo.

Recently, the Ottawa Citizen newspaper published an in-depth article on the Ontario MCM system. I recommend reading it.

Investigation analytics and major case management

The components of major investigation analytics include: Threat Triage, Crime & Fraud Analytics, and Intelligence-Led Investigative Case Management. Ontario’s MCM is an innovative approach to solving crimes and dealing with complex incidents using these components. All of Ontario’s police services use this major investigation analytics tool to investigate serious crimes – homicides, sexual assaults and abductions. It combines specialized police training and investigation techniques with specialized software systems. The software manages the vast amounts of information involved in investigations of serious crimes.

Major investigation analytics helps solve major cases by:

  1. Providing an efficient way to keep track of, sort and analyze huge amounts of information about a crime:  notes, witness statements, door-to-door leads, names, locations, vehicles and phone numbers are examples of the types of information police collect
  2. Streamlining investigations
  3. Making it possible for police to see connections between cases so they can reduce the risk that serial offenders will avoid being caught
  4. Preventing crime and reducing the number of potential victims by catching offenders sooner.

See Part Two of this series here.

To 314(b) or not to 314(b)?

Posted by Douglas Wood, Editor.  http://www.linkedin.com/in/dougwood

FinCEN today (November 1, 2013) released a fact sheet regarding data sharing between financial institutions under Section 314(b) of the USA PATRIOT Act.

314(b) provides financial institutions with the ability to share information with one another, under a safe harbor that offers protections from liability, in order to better identify and report potential money laundering or terrorist activities.  314(b) information sharing is a voluntary program, and FinCEN has always encouraged its use.

A few years ago, I spent considerable time looking at the overall 314(b) program. I interviewed dozens of Chief Compliance Officers (CCOs) and AML/fraud experts. I found that, despite the benefits to financial institutions – reduction of fraud losses, more complete SAR filings, shedding light on financial trails, etc. – the program was not particularly well utilized. The system, for all its good intentions, is very manual.

Imagine you are a 314(b) officer at a financial institution. Your job is to facilitate data sharing amongst the community. So, much of your time is spent interacting with your CCO on which specific cases should be shared, and with whom. When you get that information, you open up your financial crimes investigation tools and begin contacting your counterparts across the U.S., asking them “Hey, do you know anything about Douglas Wood?” You’re calling the other officers completely blind, with no idea whatsoever whether they know Doug. In the meantime, your voicemail inbox is being flooded with calls from other institutions asking if you know a bunch of other people (or entities).

Finding the institutions that know Douglas Wood is a lot like looking for a needle in a haystack… except you don’t know which haystacks to look in. The system too often grinds to a halt, despite some excellent work being done by 314(b) officers across the country. There has to be a better way, and some have proposed a data contribution system where financial institutions upload their bad guy data into one large third-party haystack, making the needle a little easier to find. As an advocate for the use of technology in the fight against financial crimes, I hope that model finds some success. The problem, of course, is that banks are loath to put their data in the hands of a third party. Also, it’s typically up to each individual bank to decide if and when to upload its data to be intermingled with other institutions’ data. Far too often, that data is not entirely reliable and not particularly current.

There is a better way. Several years ago, working with some tech-savvy employees, I envisioned a member-based 314(b) program where each institution maintained total control of their data. The model does not require individual banks to contribute their data for inter-mingling.  All ‘bad guy’ data sits and remains securely behind the banks’ respective firewalls. When an individual bank sends out a request to find out who, if anyone, may have information about a suspicious entity, the request is systematically sent out to all members using a secure network such as SWIFT, for example. That electronic search returns to the querying bank only a risk score which indicates the likelihood that another member is investigating the same entity.

No personally identifiable information (PII) is ever shared, yet the search is productive. The enquiring bank now knows that the person of interest was found in the bad guy data from other participating institutions. With this information in hand, the respective 314(b) officers can move their voicemail exchanges from “Have you ever heard of Douglas Wood” to “We’re both investigating Douglas Wood… let’s do it together.” The time-consuming, manual efforts are dramatically reduced and more bad guys are put away.
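A minimal sketch of that query model might look like the following. This is my own illustration of the concept, not an actual implementation: the `Member` class, the salted-hash tokens, and the scoring rule are all invented, and a real deployment would use proper cryptographic matching over a secure network such as SWIFT:

```python
import hashlib

def entity_token(name, dob):
    """Pseudonymize an entity: only a hash ever leaves the bank, never PII."""
    return hashlib.sha256(f"{name.lower()}|{dob}".encode()).hexdigest()

class Member:
    """A participating institution; its bad-guy list stays behind its firewall."""
    def __init__(self, watchlist):
        self._tokens = {entity_token(n, d) for n, d in watchlist}

    def answer_query(self, token):
        # Returns only a yes/no match indicator, never the underlying data.
        return token in self._tokens

def risk_score(members, name, dob):
    """Fraction of members whose own investigations also hit this entity."""
    token = entity_token(name, dob)
    hits = sum(m.answer_query(token) for m in members)
    return hits / len(members)

members = [
    Member([("Douglas Wood", "1970-01-01")]),
    Member([("Douglas Wood", "1970-01-01"), ("Jane Doe", "1980-05-05")]),
    Member([("John Smith", "1990-09-09")]),
]
score = risk_score(members, "Douglas Wood", "1970-01-01")  # 2 of 3 members match
```

The querying bank learns only that two-thirds of the membership has the same person of interest in its bad guy data – enough to know which phone calls are worth making, without any PII changing hands.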

So if the question is to 314(b) or not to 314(b), perhaps the answer lies in data privacy compliant technology.

Premium Fraud – Piano Tuners and Window Washers?

Posted by Douglas Wood, Editor.

I came across a news article earlier this week regarding a business owner convicted of fraudulently avoiding worker’s compensation premiums. The link to that news article is below.

It brought to mind some fascinating work I was involved with a few years ago to help a state run Worker’s Compensation Bureau more effectively detect this kind of fraud.  Their biggest concern was recovering monies owed by companies who illegally misrepresent themselves for the purpose of reducing or avoiding the payment of premiums. Here’s how these scams work…

Intentional Misclassification: A crooked business claims that employees work safer jobs than they really do. Perhaps a high-rise window washer is falsely classified as a piano tuner. Much lower premiums, obviously.

Employee Misrepresentation: A business says it has fewer employees or a lower payroll than it actually does.

Coverage Avoidance/Experience Modification: A business simply doesn’t buy the required insurance, hoping state officials won’t notice. If the state learns of the avoidance, the company will simply close, then re-emerge as a ‘new’ company in order to avoid the payments.

So the state bureau I worked with needed to understand when, for example, a ‘piano tuner’ was requesting a permit for high-rise window washing. Red flag, right? Or when five separate claims were filed by employees of a company that stated it had only three employees. Another red flag.

Oh, and what about a new company registrant whose owners, address, telephone number, and line of business are all suspiciously similar to those of a recently closed business that owed thousands of dollars in back premiums? BIG red flag.

The state itself had all of the data it needed to detect this fraud. The problem, as is often the case, is that the data sat in different jurisdictions. Working with our client, we helped those other jurisdictions – Business Registrations, Building Permits, Tax Departments, etc. – understand the value of sharing that data. That’s the key to this success story – data sharing. Without it, problems are much more difficult to solve.

Ultimately, we delivered a system that included business rules, anomaly detection, and social network analysis.  It provided the bureau with the ability to flag those anomalies using their existing data infrastructure and fraud alert output from those other state agencies.
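The business-rules side of such a system can be sketched as a few red-flag checks. This is a hypothetical illustration with invented field names and thresholds, not the bureau’s actual rules:

```python
# Hypothetical red-flag rules for premium fraud; all fields are invented.

def misclassification_flag(policy, permit):
    """A 'piano tuner' requesting a high-rise window-washing permit."""
    risky = {"window washing", "roofing", "demolition"}
    return policy["job_class"] not in risky and permit["activity"] in risky

def headcount_flag(policy, claims):
    """More distinct claimants than the employer says it employs."""
    claimants = {c["employee_id"] for c in claims}
    return len(claimants) > policy["reported_employees"]

def phoenix_flag(new_reg, closed_biz):
    """'New' registrant suspiciously similar to a closed, delinquent business."""
    fields = ("owner", "address", "phone", "line_of_business")
    shared = sum(new_reg[f] == closed_biz[f] for f in fields)
    return closed_biz["premiums_owed"] > 0 and shared >= 3

policy = {"job_class": "piano tuning", "reported_employees": 3}
permit = {"activity": "window washing"}
claims = [{"employee_id": i} for i in range(5)]
new_reg = {"owner": "A. Smith", "address": "1 Main St",
           "phone": "555-0100", "line_of_business": "construction"}
closed_biz = {"owner": "A. Smith", "address": "1 Main St",
              "phone": "555-0100", "line_of_business": "roofing",
              "premiums_owed": 12000}
flags = [misclassification_flag(policy, permit),
         headcount_flag(policy, claims),
         phoenix_flag(new_reg, closed_biz)]
```

Each rule only works because the permit, claims, and registration data come from different agencies – which is exactly why the data sharing mattered more than the code.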

With the tools in place to trigger those red flags, the agency immediately began recovering lost premiums, prosecuting offenders, and ultimately adding much needed revenue to the state coffers.

Fraudsters who choose to commit financial crimes are always coming up with new scams. Those of us committed to delivering true technology innovations through data sharing are starting to put a real dent in their chosen profession, though.

Maybe they can tune pianos instead. Do they need a building permit for that?

http://www.workerscompensation.com/compnewsnetwork/mobile/news/17511-investigation-leads-to-conviction-of-ca-business-owner-for-insurance-fraud.html

Posted by Douglas G. Wood. Click on ABOUT for more information and follow Financial Crimes Weekly on Twitter @FightFinCrime