Document Type

Thesis

Degree Name

Master of Science (MSc)

Department

Mathematics

Faculty/School

Faculty of Science

First Advisor

Dr. Cristina Stoica

Advisor Role

Supervisor

Second Advisor

Dr. Xu (Sunny) Wang

Advisor Role

Co-Supervisor

Abstract

It is often assumed that natural phenomena occur randomly over time. However, careful analysis reveals that these events typically form series or sequences that exhibit distinctive temporal patterns. Such patterns are not exclusive to nature; they also appear in human activities, often studied under the concept of bursty human dynamics. Statistical methods for analyzing bursty human dynamics capture not only overall trends or seasonality but also how past events influence future ones, making the analysis more realistic and the results more closely aligned with reality. Bursty human dynamics can be studied at two levels: the individual level and the societal level. This research focuses on individual-level activities, with a particular emphasis on crime analysis.

The Major Crime Indicator (MCI) dataset is used for the crime analysis. It is an open dataset provided by the Toronto Police Service via its Public Safety Data Portal. The dataset includes crime records from January 2001 to June 2024, comprising 396,735 observations across 25 variables. It documents five major crime types that occurred in the Greater Toronto Area: Assault, Break and Enter, Auto Theft, Robbery, and Theft Over.

The relationship between past and future events can take various forms: additive, multiplicative, linear, or non-linear. Numerous methods have been developed to capture these relationships and have been applied successfully across various fields. These methods can be categorized into traditional statistical approaches, machine learning models, and deep learning models. However, most of them are unable to capture dependencies among events. Classic self-exciting point processes are commonly used to develop more realistic models that account for exogenous and endogenous factors, but these models exhibit several drawbacks: they ignore non-linear dependencies, cannot represent inhibition effects among events, and are restricted to parametric forms.
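To make the self-excitation idea concrete, here is a minimal sketch of the conditional intensity of a classic univariate Hawkes process with an exponential kernel. The parameter values (`mu`, `alpha`, `beta`) are illustrative only and are not taken from the thesis:

```python
import math

def hawkes_intensity(t, event_times, mu=0.5, alpha=0.8, beta=1.2):
    """Conditional intensity of a classic self-exciting Hawkes process:
    lambda(t) = mu + sum over t_i < t of alpha * exp(-beta * (t - t_i)).
    mu is the exogenous baseline rate; each past event adds an
    exponentially decaying endogenous contribution."""
    return mu + sum(alpha * math.exp(-beta * (t - ti))
                    for ti in event_times if ti < t)

# With no history the intensity equals the baseline; each recent event
# raises it, and the excitation decays back toward the baseline over time.
print(hawkes_intensity(1.0, []))       # baseline only
print(hawkes_intensity(1.0, [0.9]))    # just after an event: elevated
print(hawkes_intensity(5.0, [0.9]))    # long after the event: near baseline
```

Because `alpha` enters only additively and positively, this classic form cannot express inhibition (a past event lowering the future rate), which is one of the limitations discussed above.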
Machine learning models that do not account for temporal properties are unreliable for temporal data, as they fail to capture sequential patterns. Although deep learning models can capture non-linear or multiplicative effects, standard architectures neither model self-excitation among events nor capture inhibition effects. Hence, to overcome these limitations, this research explores a more sophisticated approach: embedding a non-linear self-excitation effect in deep learning models.

The proposed Convolutional Neural LSTM Hawkes Model takes advantage of two powerful deep learning structures, a Convolutional Neural Network (CNN) and Long Short-Term Memory (LSTM), as well as classic Hawkes processes. To include the endogenous factor, it expands the standard LSTM architecture. The CNN helps extract important information and patterns from the data, and the feed-forward neural network is replaced by a modified version of the LSTM. Triggering effects are modeled in the LSTM by embedding a self-excitation gate. Unlike a typical LSTM cell, the proposed cell has two forget gates and two input gates, which evaluate the baseline and excitation terms by maintaining two separate cell states. An additional excitation gate computes the decay rate of the excitation effect, on which the updated hidden state at each step depends. Using the tanh activation function expands the range of triggering effects over R, permitting both excitation and inhibition.

This research aims to predict future crime occurrences more accurately and to capture complex patterns. It contributes to the field of crime forecasting by introducing a powerful deep learning approach that benefits from both statistical insights and the adaptability of neural networks.
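The abstract does not give the cell equations, so the following is only a speculative sketch of how an LSTM step with two forget gates, two input gates, and an extra excitation gate controlling a decay rate might look. All weight names, dimensions, and the exact decay formula are assumptions for illustration, not the thesis's definition:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def hawkes_lstm_cell(x, h, c_base, c_exc, params, dt):
    """One step of a hypothetical self-exciting LSTM cell, sketched from the
    abstract's description: separate baseline and excitation cell states,
    each with its own forget/input gate pair, plus an excitation gate that
    sets a positive decay rate for the excitation effect."""
    z = np.concatenate([x, h])
    f_b = sigmoid(params["Wfb"] @ z + params["bfb"])  # forget gate, baseline state
    i_b = sigmoid(params["Wib"] @ z + params["bib"])  # input gate, baseline state
    f_e = sigmoid(params["Wfe"] @ z + params["bfe"])  # forget gate, excitation state
    i_e = sigmoid(params["Wie"] @ z + params["bie"])  # input gate, excitation state
    g   = np.tanh(params["Wg"] @ z + params["bg"])    # tanh candidate: values over all of R
    c_base = f_b * c_base + i_b * g                   # baseline cell state
    c_exc  = f_e * c_exc + i_e * g                    # excitation cell state
    delta  = np.exp(params["Wd"] @ z + params["bd"])  # excitation gate: positive decay rate
    # Excitation decays toward the baseline state as elapsed time dt grows.
    c_t = c_base + (c_exc - c_base) * np.exp(-delta * dt)
    o = sigmoid(params["Wo"] @ z + params["bo"])      # output gate
    h_new = o * np.tanh(c_t)                          # updated hidden state
    return h_new, c_base, c_exc

# Tiny demo with random weights (dimensions are illustrative).
rng = np.random.default_rng(0)
nx, nh = 4, 8
params = {k: rng.normal(0, 0.1, (nh, nx + nh))
          for k in ["Wfb", "Wib", "Wfe", "Wie", "Wg", "Wd", "Wo"]}
params.update({k: np.zeros(nh)
               for k in ["bfb", "bib", "bfe", "bie", "bg", "bd", "bo"]})
h, cb, ce = hawkes_lstm_cell(np.ones(nx), np.zeros(nh),
                             np.zeros(nh), np.zeros(nh), params, dt=0.5)
```

The tanh candidate can be negative, so the same mechanism can lower the intensity of future events, which is one way a model of this kind could capture inhibition as well as excitation.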

Convocation Year

2025

Convocation Season

Fall

Available for download on Sunday, March 01, 2026
