Evaluation of the Criminal Investigations Program: Section 3

Evaluation findings: Resource utilization

Human resources

Finding 9: The number of cases opened per Investigator varies by region and depends on the types of cases selected for investigation. Larger and more complex cases take more time and human resources to investigate, resulting in fewer cases per Investigator in some regions.

To determine whether regions have sufficient resources (i.e., the right number of investigators), the evaluation examined the number of leads received from 2016 to 2017 through 2020 to 2021 in comparison to the average number of investigators working in each Region. However, this analysis is somewhat limited, as the number of leads received may not fully reflect the level of riskFootnote 32 and volume of work per case in a given region.

As can be seen in table 7, the Program had an average of 207 investigators available per year to conduct investigations. The number of investigators allocated to each region does not appear to be linked to the number of leads received each year. As a result, the number of leads per investigator varies widely by region.

Table 7: Number of leads received per year, investigators available, and leads per investigator, by region (5-year average from 2016 to 2017 through 2020 to 2021)
Region | Leads received per year (average) | Investigators available per year (average) | Leads per investigator per year (highest to lowest)
Prairie | 526 | 24 | 21.9
Southern Ontario | 283 | 23 | 12.3
Northern Ontario | 146 | 12 | 11.9
Atlantic | 159 | 14 | 11.2
Pacific | 353 | 39 | 9.3
Greater Toronto Area | 428 | 48 | 8.9
Quebec | 267 | 46 | 5.9
Total | 2,156 | 207 | 10.5

Source: Leads from Criminal Investigations Information Management System (CIIMS) data provided by the Program. Number of investigators is from CAS minus the count of Digital forensic investigators (DFIs) provided by the Program.

Currently, regions select the number of cases to open based, in part, on the number of investigators availableFootnote 33. This constrains certain regions, since their number of investigators may not reflect the current volume of leads or level of risk. There is therefore a risk that, in some regions, leads are not investigated because of a lack of resources rather than the merits of the leads themselves.

As illustrated in table 7, the Prairie Region has the highest number of leads per Investigator at 21.9, in contrast to 5.9 leads per Investigator in the Quebec Region, which could indicate a shortage of investigative resources in the Prairie Region. While this analysis has limitations, the wide variance between regions suggests a potential need to review how resources are allocated so that they reflect current operational realities.

An examination of CIIMS and HR data showed that the average number of cases opened per Investigator also varies by region (refer to table 8), from 4.5 cases per Investigator in the Northern Ontario Region to 1.3 cases per Investigator in the Greater Toronto Area Region. The number of cases opened is not, on its own, an indicator of workload. This variation can result from the different ways regions choose to manage their resources: some regions focus on Major and Complex cases and open relatively fewer cases as a result, while others focus more on Port Prosecution cases and open relatively more. The selection of Port Prosecution cases may depart from Program guidance to focus the majority of activities on Major and Complex cases for acceptable reasons; however, data on the level of effort or hours worked per case would be needed to determine the extent to which this is problematic.

Table 8: Cases opened per Investigator by region (average per year from 2016 to 2017 through 2020 to 2021)
Region | Cases opened per investigator (highest to lowest)
Northern Ontario | 4.5
Southern Ontario | 3.6
Atlantic | 3.4
Pacific | 2.4
Prairie | 2.2
Quebec | 1.5
Greater Toronto Area | 1.3
Total | 2.2

Source: Cases from CIIMS data provided by the Program. Number of investigators is from CAS minus the count of DFIs provided by the Program.
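
The per-investigator figures in tables 7 and 8 are simple period averages: the average yearly number of leads received (or cases opened) divided by the average number of investigators available. The minimal sketch below illustrates that calculation; the region names, counts, and data structure are hypothetical placeholders for illustration only, not CIIMS or CAS extracts.

```python
# Illustrative sketch only: shows the averaging behind the "leads per investigator"
# and "cases opened per investigator" columns in tables 7 and 8. All region names
# and figures below are hypothetical placeholders, not CIIMS or CAS data.

# Per-region, per-fiscal-year counts: (leads received, investigators available, cases opened)
yearly_counts = {
    "Region A": [(500, 24, 50), (540, 23, 56), (520, 25, 52), (515, 24, 55), (530, 24, 51)],
    "Region B": [(270, 46, 70), (260, 45, 68), (275, 47, 72), (265, 46, 69), (268, 46, 71)],
}

for region, years in yearly_counts.items():
    avg_leads = sum(leads for leads, _, _ in years) / len(years)
    avg_investigators = sum(inv for _, inv, _ in years) / len(years)
    avg_cases = sum(cases for _, _, cases in years) / len(years)
    print(
        f"{region}: {avg_leads / avg_investigators:.1f} leads per investigator, "
        f"{avg_cases / avg_investigators:.1f} cases opened per investigator"
    )
```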

Availability of tools and resources

Finding 10: The lack of Major Case Management (MCM) software, limited use of MCM principles, low training completion rates, and a lack of administrative support for investigators have reduced the efficiency of case management and resource utilization.

Regional managers interviewed noted that the lack of training and of MCM software has led to cases taking longer to complete than they otherwise would. Stakeholders suggested that adopting MCM software would reduce the amount of time spent preparing evidence for disclosure, as it would reduce the need to manually track and organize the evidence collected. In addition, the Public Prosecution Service of Canada (PPSC) explained during the 2022 CBSA-PPSC Symposium that applying MCM principles to case management would improve investigation planning and the management of the volume of evidence collected, leading to more efficient use of investigation resources. Improved training completion could also allow new investigators to get up to speed more quickly and improve the quality of investigation planning, both of which could further improve efficiency.

"No administrative support to handle certain tasks in the investigation that should not be done by investigators but rather by clerks."

Regional interviewee

All regional managers interviewed said they do not have enough administrative support staff given their volume of work, which has led to investigators doing administrative work, reducing the time available for their core responsibilities and impacting their efficiency. This sentiment was also expressed by investigators in four of the regions through the survey. The lack of administrative support staff is further exacerbated by the lack of MCM software, which increases the amount of clerical work needed to prepare for disclosure. Currently, only the Atlantic, Greater Toronto Area, Pacific, and Quebec regions have investigative support staffFootnote 34 to assist investigators. The Program also indicated a need for analysts to provide tactical analytical support so that investigations can advance more efficiently and effectively. While Program Headquarters (HQ) staff were looking into the use of intelligence analysts to support investigations, this work was still in the exploration phase when this evaluation was completedFootnote 35.

Stakeholders expressed concern that the lack of tools and resources noted above has potentially led to cases taking too long to resolve. For example, in interviews with the CBSA and the PPSC, concerns were raised about cases possibly being closed due to the Supreme Court of Canada's Jordan decision, which limits the period between when a person is charged and when a decision is rendered to 18 months for provincial court trials and 30 months for superior court trialsFootnote 36. This means that if resolution of a case takes too long after charges are laid, proceedings may be "stayed" (a temporary or permanent stop to a trial). Given that investigators can continue to support the PPSC in collecting evidence or preparing disclosure after charges are laid, their timeliness can affect whether a case is stayed under the Jordan decision. While CIIMS data showed zero cases marked as not successful due to the Jordan decision, one region mentioned having knowledge of at least "a couple" of cases in their region that were unsuccessful for this reason.

To validate stakeholder perceptions that cases may be taking too long to resolve, the evaluation analyzed available CIIMS data on the length of concluded cases (those for which the court process has been completed). Although there were limitations to this analysis (refer to Appendix D: Evaluation limitations), the overall trend supported interviewee perceptions. Data showed that from 2016 to 2017 through 2020 to 2021, 1,069 cases were concluded. While most cases were resolved within a year, 43% of complex cases took two or more years to conclude. This is notable because complex cases can have a greater impact on the individual investigated and present a higher risk to the safety, security, and prosperity of Canada. In interviews and during the 2022 CBSA-PPSC Symposium, the PPSC also noted that large and lengthy cases can have diminishing returns, given the large volume of evidence that must be collected and prepared for disclosure and the risk that witnesses become unavailable over time.
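
As an illustration of the kind of timeliness monitoring described above, the sketch below flags concluded cases whose time from charge to conclusion exceeds the Jordan presumptive ceilings and reports the share of cases taking two or more years to conclude. The field names, the duration definition, and the sample records are assumptions for illustration only; they do not reflect the CIIMS schema or actual case data.

```python
# Illustrative sketch only: flags concluded cases that exceed the Jordan presumptive
# ceilings (18 months for provincial court trials, 30 months for superior court trials)
# and reports the share of cases taking two or more years to conclude. Records and
# field layout below are hypothetical placeholders, not CIIMS data.
from datetime import date

JORDAN_CEILING_MONTHS = {"provincial": 18, "superior": 30}

def months_between(start: date, end: date) -> int:
    """Approximate whole months between two dates."""
    return (end.year - start.year) * 12 + (end.month - start.month)

# (case id, court level, date charges laid, date case concluded)
concluded_cases = [
    ("C-001", "provincial", date(2018, 4, 1), date(2019, 7, 15)),
    ("C-002", "superior", date(2017, 1, 10), date(2020, 3, 2)),
    ("C-003", "provincial", date(2019, 6, 5), date(2020, 1, 20)),
]

over_two_years = 0
for case_id, level, charged, concluded in concluded_cases:
    months = months_between(charged, concluded)
    if months >= 24:
        over_two_years += 1
    if months > JORDAN_CEILING_MONTHS[level]:
        print(f"{case_id}: {months} months from charge to conclusion exceeds the {level} ceiling")

print(f"{over_two_years / len(concluded_cases):.0%} of concluded cases took two or more years")
```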

Gender-based analysis plus

Finding 11: Due to limited data availability, it was not possible to perform an in-depth gender-based analysis plus (GBA Plus) or to fully assess how the Program's activities affect diverse groups across different identity factors.

The Canadian Gender Budgeting Act (2018) requires the President of the Treasury Board to make available to the public, once a year, an analysis of the impacts of Government of Canada programs in terms of gender and diversityFootnote 37. The CBSA and other federal organizations provide information on program impacts to the Treasury Board of Canada Secretariat (TBS) via the GBA Plus Supplementary Information Tables (SIT) contained in Departmental Results Reports.

The Program reported through the 2020 to 2021 GBA Plus SIT that "the analysis of GBA Plus impacts is currently limited by system capabilities to collect and report on GBA Plus dataFootnote 38." This limitation was also noted in interviews conducted for this evaluation.

Limited GBA Plus data constrains the Program's ability to gain insight into how its criminal investigations affect diverse groups across different identity factors. Information on criminal investigation cases is stored and managed in the CIIMS. Identity factors such as "gender," "age," and "perceived race" can be found for some cases, but these are not mandatory fields. For example, from 2016 to 2017 through 2020 to 2021 (a minimal completeness check is sketched after this list):

  • "gender" information was available in 83% of cases
  • "age" information was available in 79% of cases
  • "perceived race" was available in 38% of cases

Gender and age information originates mostly from identity documents submitted at the border, which do not include data on race or ethnicity. In some instances, CBSA staff also record their perception of a traveller's race for search and arrest purposesFootnote 39. Perceived race data, as recorded by CBSA officers, is biased and is not a valid or reliable measure for assessing the impacts of the Program on diverse groups of individuals in terms of their race. Asking an officer to determine another person's race based on their own assessment carries its own risks, even if the intention is to fill data gaps and ensure that people are not discriminated against.

Data that is not currently collected but could be relevant for Program GBA Plus analysis includes income, self-declared race and ethnicity, and disability status. Having access to adequate data, to the extent appropriate and feasible for the Program, would support the CBSA in meeting TBS requirements related to the Canadian Gender Budgeting Act (2018).

The evaluation conducted an analysis of how the Program might be impacting a given group disproportionately. Because the available race data is limited and recorded only as "perceived race," this report does not present the specific results of that analysis, as they would not accurately reflect the impact of the Program on diverse groups.

More accurate data on the race and ethnicity of individuals involved in criminal investigation cases is needed to determine the impact of the Program on diverse groups at different decision points in the investigative process. Cross-analysis of GBA Plus data with data on case characteristics could also provide information on how the Program affects diverse groups of people.

Aside from the need to meet TBS GBA Plus reporting requirements, there is evidence to suggest that the Program also requires at least some race/ethnicity data to make operational decisions. For instance, the CBSA Prosecution Policy lists multiple factors to be considered when deciding to investigate a case, as not every instance of non-compliance is expected to merit a criminal investigation. One of the factors Program staff are required to consider is the public interest, which involves taking into consideration the offender's circumstances and background, including past victimization and systemic factors experienced by Indigenous personsFootnote 40. Currently, the Program does not have data available to ensure that case selection takes such a factor into consideration, as perceived race is not an adequate or reliable indicator.

Implementation of 2015 evaluation recommendations

Finding 12: The recommendations from the 2015 Evaluation of the Criminal Investigations Program were implemented; however, more work is needed for the Program to benefit from them fully.

The 2015 Evaluation of the Criminal Investigations Program made four recommendations covering three themes: performance measurement, obtaining information from other government departments (OGDs), and optimizing the delivery of digital forensic investigation services. This evaluation assessed the extent to which these recommendations improved Program performance.

Performance measurement

The 2015 evaluation recommended that the program:

  • develop a performance measurement framework that links key indicators (for each case) across each stage of the investigation process
    • this is to include criteria for the various decision points
  • implement regular monitoring of Program performance against intended outcomes and determine if activities are aligned with Program objectives, including national priorities

The current evaluation determined that the Program made good progress in this area, but additional adjustments are required. A performance measurement framework and key performance indicators (KPIs) were developed in response to these recommendations, and indicators are regularly monitored by the Program. CIIMS now includes fields for additional decision points and the rationale for decisions, as recommended. In addition, the Program has introduced a Quality Assurance Review process to improve data quality, which will improve the Program's ability to use data for decision-making. However, as noted in Section 3.3 and in Appendix D, there is further room to improve data in this area by making the reason for coding a case "not successful" a mandatory field in CIIMS.

There is also room to improve the current performance measurement framework to better measure the Program's performance against intended outcomes, notably the alignment of cases with CBSA priorities, the quality of evidence collected, and the utilization of resources against current workload. Ways to improve performance measurement in these areas were noted in this evaluation report in Sections 3.1, 3.3, and 4.1.

Some regional managers also expressed a desire for the performance measurement framework to capture other aspects of performance, such as the decision to close a case due to lack of evidence, successful judicial authorizations and search warrants, and the completion of court briefs. In their view, the outcome of cases is also influenced by factors outside the CBSA's control (e.g., PPSC and court decisions), so indicators of this kind might be better suited to assessing investigation quality.

Obtaining information from other government departments

The 2015 Evaluation recommended that the program identify and mitigate barriers to, and monitor progress in, obtaining evidence from OGDs using the investigative body designation (IBD).

The current evaluation also found good progress in implementing this recommendation. Some regional managers and the majority of criminal investigators surveyed through the current evaluation expressed satisfaction with the current process for engaging with OGDs to request information. In interviews, five regions noted continued difficulties in obtaining information and evidence from OGDs, while two regions reported that the ability to obtain information from OGDs had improved over the last five years. Delays in receiving a response from OGDs were noted as a main challenge through the survey. Nevertheless, 63% of criminal investigators surveyed were highly or somewhat satisfied with the overall process of requesting information from OGDs, and 61% were satisfied with the timeliness of OGD responses.

In interviews, some regional managers noted difficulty in identifying who to contact in OGDs to request information; meanwhile, 48% of criminal investigators surveyed were dissatisfied with the clarity of procedures and steps required to request evidence from OGDs.

In response to the 2015 recommendation, the Program developed an IBD course that explains what the IBD is, the scope of IBD authorities, and how to use the designation to request information. Completion of this online-only course was 91% for investigators who had been in their position for three to five years, but only 53% for those with fewer than two years and 28% for those with more than five years in the position. Increasing uptake of the IBD course may address some of the concerns expressed by managers and investigators; implementing the recommendation in Section 3.4 on improving training completion overall would support this.

Optimizing delivery of digital forensic investigation services

The 2015 evaluation recommended that the Program develop options to deliver the digital forensic investigation service and to optimize the alignment of existing resources to support evolving demands, and that the Branch implement and monitor the selected options.

Some progress has been made in this area, but certain action items are outstanding. In response to this recommendation, the Program completed an internal review of the Digital Forensics Unit (DFU) that sits within the Criminal Investigations Division at HQ and identified five action items (refer to table 9)Footnote 41. Two items have been completed, one is in progress, and the remaining two are either not completed or of unknown status. According to interviewees, the internal review has reduced the previous backlog for DFIs. Regional managers interviewed felt well supported by the DFU in terms of getting the tools and training required for DFIs to do their jobs. About 93% of criminal investigators surveyed also reported being highly or somewhat satisfied with their overall engagement with DFIs in the regions.

Table 9: Status of action items identified in the DFU internal review
Action item | Status
Create a digital forensic investigations policy | Completed
Create standard operating procedures | Completed
Create a performance measurement framework for the DFU | Not completed
Establish funding for the DFU | Unknown*
Review classification and staffing for DFIs | In progress

*Note: The current evaluation received contradictory information on the status of this action item.

In interviews, three regions noted challenges attracting and retaining qualified DFI staff to keep up with the volume of work. The DFU also noted this challenge. Currently, DFIs must be recruited internally from existing Investigator positions, which limits the pool from which to hire DFIs. The DFU is looking into alternatives for recruiting DFIs or IT support positions.

An analysis of performance and DFI human resources data showed varying levels of DFI support available to criminal investigators in each region (refer to table 10). A higher number of investigators per DFI indicates that a region has less support available from its DFI positions. Overtime worked per DFI can also provide insight into whether a region has the right level of DFI support. For example, DFIs in the Quebec Region support the most criminal investigators per DFI and work the second-highest amount of overtime, potentially indicating a need for more resources in that region.

Table 10: Average number of DFIs and investigators per region, from 2016 to 2017 through 2020 to 2021
Region | DFIs | Investigators | Investigators per DFI (highest to lowest) | Overtime (FTEs)
Quebec | 4.4 | 46.2 | 11.5 | 0.18
Southern Ontario | 2.0 | 23.0 | 11.5 | 0.07
Greater Toronto Area | 5.2 | 47.8 | 9.7 | 0.27
Prairie | 2.6 | 24.2 | 9.5 | 0.11
Northern Ontario | 1.6 | 12.2 | 8.6 | 0.10
Pacific | 7.0 | 39.2 | 5.6 | 0.17
Atlantic | 3.0 | 14.2 | 4.7 | 0.05
Total | 27.8 | 206.8 | 8.2 | 0.95

Source: Number of investigators is from CAS minus the count of DFIs provided by the Program. Overtime from Costing Analytical Model (CAM).
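
As an illustration, the sketch below computes an investigators-per-DFI ratio and expresses DFI overtime hours as an FTE equivalent, similar in spirit to table 10. The regions, figures, and the assumed 1,950 working hours per FTE-year are placeholders for illustration only; they are not CAS or CAM values, and the Program's FTE conversion may differ.

```python
# Illustrative sketch only: derives an "investigators per DFI" ratio and converts DFI
# overtime hours into an FTE equivalent. All figures and the hours-per-FTE assumption
# below are placeholders, not CAS or CAM data.
ASSUMED_HOURS_PER_FTE_YEAR = 1950  # assumption for illustration; the Program's conversion may differ

# (region, average DFIs, average investigators, DFI overtime hours per year)
dfi_support = [
    ("Region A", 4.0, 45.0, 350.0),
    ("Region B", 3.0, 15.0, 100.0),
]

for region, dfis, investigators, overtime_hours in dfi_support:
    investigators_per_dfi = investigators / dfis
    overtime_fte = overtime_hours / ASSUMED_HOURS_PER_FTE_YEAR
    print(f"{region}: {investigators_per_dfi:.1f} investigators per DFI, overtime of about {overtime_fte:.2f} FTE")
```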

There may be an opportunity to share DFI resources across regions to a greater extent. However, this is currently challenging due to technological limitations making it difficult to share large files electronically. [Redacted]

Conclusion and recommendations

The program has made good progress since the 2015 evaluation and is positively supporting the CBSA's public safety and economic prosperity objectives. The Program is doing so by conducting investigations and working with the PPSC to hold individuals and entities criminally responsible for willfully contravening border legislation and threatening the safety, security and prosperity of Canadians. The Program also contributes to upholding Canada's border legislation through deterrence, when criminal proceedings (from charge to sentencing) are publicized.

The Program is achieving positive results, as demonstrated by the PPSC's high acceptance rate of referrals for prosecution and the high conviction rate for cases prosecuted in court. However, the quality of investigations (i.e., the extent to which the evidence collected is of the highest standard required) could not be fully assessed using current performance indicators and data.

There is evidence to suggest areas for improvement in performance measurement related to the quality of investigations conducted, the alignment of cases with priorities, and the reasons why investigations were opened but closed before referral to the PPSC.

There is also an opportunity for investigators to better understand how to leverage PPSC expertise to improve investigation quality and efficiency.

Finally, operational improvements, particularly around training and availability of tools and resources, and greater alignment of case selection with CBSA enforcement priorities, could improve the efficient use of resources.

As a result of the findings and conclusions described above, the evaluation recommends the following:

Recommendation 1: Addressing the root causes of low training completion rates

The Vice President of the Intelligence and Enforcement Branch should work with the Vice President of the Human Resources Branch to assess the issues behind the low completion rate of criminal investigator training and develop a work plan to address the gap.

Recommendation 2: Performance measurement, including in relation to case selection and gender-based analysis plus factors

The Vice President of the Intelligence and Enforcement Branch should update the Criminal Investigations Program's Performance Measurement Framework (PMF) to improve oversight and reporting on case selection, the quality of all investigations, and Program resource utilization and expenditures, and should review opportunities for the Program to gather reliable information on its potential impacts on diverse groups of people based on relevant GBA Plus identity factors.

Recommendation 3: Program HQ oversight of resource allocation and coordination of regional information sharing

With a view to improving efficiency and maturing its functional management role, and over and above its ongoing work towards securing an appropriate major case management tool, the Vice President of the Intelligence and Enforcement Branch should seek to better understand regional resource allocation and associated Program performance, and should provide a forum for regions to exchange approaches to regional case selection, expertise, best practices, and challenges.