## Upcoming Talks

### March 8, 2018

#### Speaker: Massoud Ataei (York U)

An N-th order tensor is a multidimensional array; formally, it is an element of the tensor product of N vector spaces over a common field. In this talk, I will first define the outer, inner, Hadamard and n-mode products of tensors and discuss their basic properties. Then the Canonical Polyadic Decomposition (CPD) will be presented, in which a tensor is decomposed as a sum of several rank-1 tensors. As an illustration, I will further provide my recent results on applications of the CPD to S&P 500 data analysis.
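As a quick illustration of the CPD form, here is a minimal NumPy sketch that builds a third-order tensor as a sum of rank-1 outer products and then applies a 1-mode product. The dimensions, rank, and random factors are illustrative choices, not anything from the talk:

```python
import numpy as np

# Build a 3rd-order tensor as a sum of R rank-1 terms (the CPD form):
# T = sum_r a_r (outer) b_r (outer) c_r, with illustrative random factors.
rng = np.random.default_rng(0)
R, I, J, K = 2, 4, 5, 6
A = rng.standard_normal((I, R))
B = rng.standard_normal((J, R))
C = rng.standard_normal((K, R))

T = np.zeros((I, J, K))
for r in range(R):
    # einsum forms the outer product of the three factor columns
    T += np.einsum('i,j,k->ijk', A[:, r], B[:, r], C[:, r])

# The 1-mode product with a matrix M contracts the first index of T:
M = rng.standard_normal((3, I))
T1 = np.einsum('pi,ijk->pjk', M, T)
print(T.shape, T1.shape)  # (4, 5, 6) (3, 5, 6)
```

By construction the mode-1 unfolding of `T` has rank at most `R`, which is exactly what the CPD encodes.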

## Past Talks

### March 1, 2018

#### Title: New bounds for $\psi(x;q,a)$

Let $a,q$ be relatively prime integers. Then consider $$\psi(x;q,a)=\sum_{\substack{n\le x\\ n\equiv a(\bmod q)}}\Lambda(n).$$ We discuss explicit bounds for $\psi(x;q,a)$, which provide an extension and an improvement over the bounds given by Ramaré and Rumely in 1996. This article introduces three novel pieces to the argument.
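To make the object concrete, here is a naive brute-force sketch that evaluates $\psi(x;q,a)$ directly from the definition for small $x$; the helper names `lam` and `psi` are mine, and this is illustrative code only, far from the explicit-bounds machinery of the talk:

```python
from math import log

def lam(n):
    """von Mangoldt function: log p if n is a prime power p^k, else 0."""
    if n < 2:
        return 0.0
    for p in range(2, n + 1):
        if n % p == 0:
            # strip all factors of the smallest prime p;
            # n is a prime power iff nothing else remains
            m = n
            while m % p == 0:
                m //= p
            return log(p) if m == 1 else 0.0
    return 0.0

def psi(x, q, a):
    """Sum Lambda(n) over n <= x with n = a (mod q)."""
    return sum(lam(n) for n in range(1, int(x) + 1) if n % q == a % q)

print(round(psi(100, 4, 1), 3))
```

For instance, $\psi(10;4,1)=\Lambda(5)+\Lambda(9)=\log 5+\log 3=\log 15$, which the sketch reproduces.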

### February 15, 2018

#### Title: The Laplace transform of the lognormal distribution

Some integral transforms of the lognormal distribution, such as the Laplace and Fourier transforms, have no known closed form. Several approximations and numerical methods for computing the Laplace transform of the lognormal distribution (LTLD) have been proposed in the literature. The majority of these methods are only valid for complex arguments with nonnegative real part (at best).

In this talk we will explore the analytic continuation of the LTLD to $\mathbb{C}\setminus(-\infty,0]$. Two integral expressions for the analytic continuation will be presented. There is a well-known expression for the characteristic function in the literature that is incorrect; we will see why it is incorrect and provide the correct expression. An integral expression of the LTLD will be exploited to obtain asymptotic series approximations. We will discuss the computation of the LTLD by way of numerical integration and series approximations.

If time permits, we will discuss applications such as computing the density of a sum of independent lognormals, or computing the density of the Thorin measure of a lognormal distribution.
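For readers who want to experiment, here is a naive quadrature sketch of the LTLD for real $\theta\ge 0$ (the easy region; the analytic continuation discussed in the talk goes well beyond this). The parameters are illustrative defaults:

```python
import numpy as np
from scipy.integrate import quad

def ltld(theta, mu=0.0, sigma=1.0):
    """Naive quadrature for L(theta) = E[exp(-theta X)], X ~ Lognormal(mu, sigma).
    Only intended for real theta >= 0; illustrative code, not the talk's method."""
    def integrand(x):
        pdf = np.exp(-(np.log(x) - mu) ** 2 / (2 * sigma ** 2)) \
              / (x * sigma * np.sqrt(2 * np.pi))
        return np.exp(-theta * x) * pdf
    val, _ = quad(integrand, 0, np.inf)
    return val

print(ltld(0.0))  # should be 1: the total probability mass
print(ltld(1.0))
```

Direct quadrature like this degrades for large $|\theta|$ or complex arguments, which is precisely why the asymptotic and continuation techniques in the talk are needed.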

### February 8, 2018

#### Title: Regular Toroidal Polytopes and Hypertopes

Toroidal polytopes can be built by wrapping a portion of a Euclidean tessellation onto a torus, identifying points that differ by chosen translation vectors. Hypertopes are generalizations of polytopes that no longer require the vertices, edges, and so on to be ordered by rank. Hypertopes can also "tile" Euclidean space, so we need only determine which translation vectors generate toroidal hypertopes.

### February 1, 2018

#### Title: Card Shuffling and Hopf Algebras

Riffle shuffling is one of the most popular techniques to randomize a deck of cards. One immediate question follows: how many times do I have to shuffle so that my deck is random? I will talk about this question using the language of Hopf algebras.
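The standard mathematical model of riffle shuffling is the Gilbert-Shannon-Reeds (GSR) model, which can be simulated in a few lines; the sketch below is a generic GSR simulation under that model's assumptions, not code from the talk. The famous rule of thumb it illustrates is that about seven riffles mix a 52-card deck well:

```python
import random

def gsr_riffle(deck):
    """One Gilbert-Shannon-Reeds riffle: cut at a Binomial(n, 1/2) point,
    then drop cards from each packet with probability proportional to the
    packet's current size."""
    n = len(deck)
    cut = sum(random.random() < 0.5 for _ in range(n))
    left, right = deck[:cut], deck[cut:]
    out, i, j = [], 0, 0
    while i < len(left) or j < len(right):
        nl, nr = len(left) - i, len(right) - j
        if random.random() < nl / (nl + nr):
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out

random.seed(1)
deck = list(range(52))
for _ in range(7):  # ~7 riffles bring a 52-card deck close to uniform
    deck = gsr_riffle(deck)
print(deck[:5])
```

The Hopf-algebraic viewpoint in the talk explains why repeated riffles compose so neatly: a GSR shuffle corresponds to a coproduct-then-product operation on a suitable algebra.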

### January 25, 2018

#### Title: Honey Bees and Equations of Sociality

It’s no secret that honey bees are dying. The causes of this phenomenon are multifaceted and depend on complex interactions between the environment, pathogens and the structure of honey bee colonies. Mathematical models help to discern how stresses on a honey bee colony may cause declines in the population, and can suggest avenues for conservation and re-population efforts. In this talk, I will painstakingly develop a base model for honey bee colony dynamics and show how such a model can help answer some open questions in biology. From there, we will venture into more current research in the field.
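To give a flavour of such base models, here is a toy two-compartment sketch: hive bees mature into foragers, and brood eclosion saturates with the total workforce. The structure and every parameter value below are illustrative assumptions of mine, not the model developed in the talk:

```python
# Toy colony model (illustrative only): hive bees H mature into foragers F
# at rate r; foragers die at rate m; daily eclosion saturates with N = H + F.
L_max, w = 2000.0, 5000.0   # max daily eclosion, saturation constant (hypothetical)
r, m = 0.25, 0.24           # recruitment and forager mortality rates (hypothetical)

H, F = 10000.0, 5000.0      # initial hive bees and foragers
dt = 0.1
for _ in range(int(200 / dt)):   # forward-Euler integration over 200 days
    N = H + F
    eclosion = L_max * N / (w + N)
    dH = eclosion - r * H
    dF = r * H - m * F
    H += dt * dH
    F += dt * dF
print(round(H), round(F))  # colony composition near equilibrium
```

Even in this toy version, raising the forager mortality `m` past a threshold drives the equilibrium to zero, which is the kind of collapse mechanism colony models are built to probe.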

### January 18, 2018

#### Winter Term Kick-off Problem Session

Welcome back to the Winter Term, and the Winter 2018 Series of the Left to the Reader Seminar! To start things off this term, we will have an open problem session again, where audience members will share the problems they are working on, or are interested in working on. Each person will take five minutes to share their current interests to a general mathematical audience, and explain some of the tools most commonly used in their field. Afterward, we will fill up the time slots of the coming term. Hope to see you all there!

### November 30, 2017

#### Title: BlockChain Madness! An introduction to the world of Bitcoin and decentralized networks

Blockchain is a decentralized transaction and data management technology first developed for the Bitcoin cryptocurrency in 2008. The technology is based on a distributed public ledger in which no single central authority owns or controls the ledger, transactions are immutable and transparent, and the so-called double-spend problem is eliminated. For these reasons, blockchain technology is expected to revolutionize industry and commerce and drive economic change on a global scale. It has the potential to empower people in developing countries with a recognized identity or asset ownership, provide financial independence, and avert financial crises.

Bitcoin, the decentralized peer-to-peer digital currency, is the most popular application of blockchain technology. The digital currency itself is highly controversial, but the underlying blockchain technology has worked flawlessly and found a wide range of applications in business, politics, health, and society at large. For example, NASDAQ, in partnership with Chain, is working on share trading using blockchain; Verisart is using blockchain to verify art prices and encode copyrights of artwork; and ShoCard encodes and stores personal identity information on the blockchain. In this talk, I will explore blockchain technology as a significant source of disruptive innovations and the possible paradigm shift to a democratic scalable digital economy. If time permits, I will provide an overview of the mathematical foundation of blockchain development, specifically, finite fields and elliptic curve cryptography.
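To illustrate the core ledger idea, here is a toy hash chain in a few lines of Python. It is deliberately simplified (no proof of work, no network, a fixed timestamp for determinism) and is my own sketch, not anything from the talk:

```python
import hashlib
import json

def make_block(data, prev_hash):
    """Toy block: each block commits to its predecessor's hash, so altering
    any historical block invalidates every block after it."""
    block = {"data": data, "prev_hash": prev_hash, "timestamp": 0}
    payload = json.dumps(block, sort_keys=True).encode()
    block["hash"] = hashlib.sha256(payload).hexdigest()
    return block

genesis = make_block("genesis", "0" * 64)
b1 = make_block("Alice pays Bob 1 BTC", genesis["hash"])
b2 = make_block("Bob pays Carol 0.5 BTC", b1["hash"])

# Tampering with b1's data changes its hash, breaking the link stored in b2:
tampered = make_block("Alice pays Bob 2 BTC", genesis["hash"])
print(b2["prev_hash"] == b1["hash"], tampered["hash"] == b1["hash"])  # True False
```

Real blockchains add consensus (proof of work or proof of stake) and elliptic-curve signatures on top of exactly this chained-hash skeleton.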

### November 23, 2017

#### Title: The limiting distribution of composite likelihood ratio test under non-standard conditions

Full likelihood estimation is a well-known, traditional method for parameter estimation. However, correlation and high dimensionality in data can make computing the maximum likelihood very intensive, even prohibitive. Composite likelihood estimation was introduced as an alternative that uses sub-densities instead of the joint density, making the computation more feasible. I therefore focus on finding the limiting distribution of the composite likelihood ratio test, which has a more complicated form than its full likelihood counterpart.

### November 16, 2017

#### Title: Modelling WNV epidemics in Emilia-Romagna

West Nile Virus (WNV) was first identified in Italy in 1998, and has appeared continuously since 2008, with a total of 173 neurological human cases between 2008 and 2015; it has also become endemic in Canada, especially in the Ontario region. Since 2009, the Emilia-Romagna region has run a systematic program of trapping and testing mosquitoes and corvids (known to be among the most competent bird species for WNV). Data collected through this program have been analysed with a mathematical model in order to understand the main drivers of the observed dynamics. Our results showed that including a seasonal shift in mosquito feeding behaviour (rather than keeping it constant) makes model outputs much more consistent with the observed data.

In this talk, I would first like to discuss some key facts about West Nile Virus and introduce the notion of compartmental models in disease modelling. In the second part, I would like to show parts of our results, including the ODE model for the mosquito-corvid dynamics and some Markov Chain Monte Carlo (MCMC) methods used to estimate the required parameters.
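As a flavour of the MCMC step, here is a minimal random-walk Metropolis sampler applied to a toy one-dimensional posterior. This is generic illustrative code under a made-up Gaussian target, not the speakers' implementation or model:

```python
import numpy as np

def metropolis(log_post, x0, n_samples=5000, step=0.5, seed=0):
    """Minimal random-walk Metropolis sampler: propose a Gaussian step,
    accept with probability min(1, posterior ratio)."""
    rng = np.random.default_rng(seed)
    x, lp = x0, log_post(x0)
    samples = []
    for _ in range(n_samples):
        prop = x + step * rng.standard_normal()
        lp_prop = log_post(prop)
        if np.log(rng.random()) < lp_prop - lp:
            x, lp = prop, lp_prop
        samples.append(x)
    return np.array(samples)

# Toy target: a N(1.5, 0.2^2) "posterior" for a transmission-rate-like parameter
draws = metropolis(lambda b: -((b - 1.5) ** 2) / (2 * 0.2 ** 2), x0=1.0)
print(round(draws[1000:].mean(), 2))  # posterior mean estimate, near 1.5
```

In practice the log-posterior would come from solving the mosquito-corvid ODEs and comparing the output with the trapping data, but the accept/reject core is the same.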

### November 9, 2017

#### Title: A Combinatorial counting construction on $\omega_{1}$

For this talk I will discuss an introductory overview of a recent work with Fulgencio Lopez.

We show that adding at least $\omega_2$ Cohen reals adds a capturing construction scheme. We study the weaker notion of an $n$-capturing construction scheme, and show it is consistent to have an $n$-capturing construction scheme but no $(n+1)$-capturing construction scheme. We also study the relation of $n$-capturing to the $m$-Knaster hierarchy, and show that $\text{MA}_{\omega_1}(K_m)$ and $n$-capturing are independent when $n\le m$, and incompatible when $n>m$.

### November 2, 2017

#### Nonlinear statistical filtering for noise removal in radar target tracking

The common usage of the word "filter" refers to a device that removes unwanted components from a mixture, such as a cigarette filter reducing the number of fine particles inhaled with tobacco smoke. Similarly, in signal processing, a filter refers to a process that removes unwanted components from a signal. Often, a signal processing filter removes a range of frequencies from a signal: it "filters out" unwanted frequency components. A statistically defined filter, however, filters out noise from a noisy signal. The first statistically defined filter to be described was the Wiener filter, developed by Norbert Wiener during the 1940s. It paved the way for other statistically defined filters to be introduced, including the Kalman filter, which is the focus of this talk. I will start with a brief description of the Wiener filter, then describe and show an implementation of the (linear) Kalman filter. Next, I will show three nonlinear filters based on the Kalman filter: the extended Kalman filter, the second order nonlinear filter, and the Monte Carlo simulation filter. Finally, I will show results of a simulation study comparing these filters applied to a nonlinear radar target tracking problem.
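To make the predict/update cycle concrete, here is a minimal linear Kalman filter tracking a 1-D constant-velocity target from noisy position measurements. All matrices and noise levels below are illustrative choices of mine, not the settings used in the talk:

```python
import numpy as np

dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])  # state transition: [position, velocity]
H = np.array([[1.0, 0.0]])             # we observe position only
Q = 0.01 * np.eye(2)                   # process noise covariance (assumed)
R = np.array([[1.0]])                  # measurement noise covariance (assumed)

x = np.array([0.0, 1.0])               # initial state estimate
P = np.eye(2)                          # initial estimate covariance

rng = np.random.default_rng(0)
true_pos = 0.0
for t in range(20):
    true_pos += 1.0                    # target moves at unit speed
    z = true_pos + rng.normal(0, 1.0)  # noisy position measurement
    # Predict step
    x = F @ x
    P = F @ P @ F.T + Q
    # Update step
    y = z - H @ x                      # innovation
    S = H @ P @ H.T + R                # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P

print(np.round(x, 2))  # filtered [position, velocity] estimate
```

The nonlinear variants in the talk replace the fixed matrices `F` and `H` with linearizations (extended KF), higher-order expansions, or simulation-based approximations, while keeping this same two-step structure.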

### October 19, 2017

#### Explicit results on Chebyshev functions: A prime counting adventure

Consider the function $\pi(x)=\#\{n\le x : n \text{ is prime}\}$. Legendre conjectured that $\pi(x)\sim \frac{x}{\log x}$. Nearly 100 years later, in 1896, Hadamard and de la Vallée Poussin proved an equivalent statement by considering the weighted prime counting function $\psi(x)$. In 1962, Rosser and Schoenfeld gave a method to explicitly estimate the error term in the approximation of $\psi(x)$. This result relies on information concerning the non-trivial zeros of the Riemann zeta function $\zeta(s)$, and subsequent numerical improvements to this information also translated into improved estimates for $\psi(x)$.
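Legendre's approximation is easy to check numerically. Here is a small sieve-based sketch (illustrative code, not part of the talk) that compares $\pi(x)$ with $x/\log x$:

```python
from math import log

def pi(x):
    """Count primes up to x with a simple sieve of Eratosthenes."""
    sieve = [True] * (x + 1)
    sieve[0:2] = [False, False]
    for p in range(2, int(x ** 0.5) + 1):
        if sieve[p]:
            sieve[p * p::p] = [False] * len(sieve[p * p::p])
    return sum(sieve)

for x in (10**3, 10**4, 10**5):
    count = pi(x)
    # the ratio pi(x) / (x / log x) tends to 1, but quite slowly
    print(x, count, round(count / (x / log(x)), 4))
```

The slow convergence of the ratio is exactly why explicit error terms for $\psi(x)$, rather than asymptotics alone, are the subject of the talk.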

In this talk, we present various new explicit methods such as introducing some smooth weights and establishing some zero density estimates for the Riemann zeta function. Additionally, we will discuss results using these new techniques for finding primes in short intervals and the distribution of primes in arithmetic progressions.

### October 12, 2017

#### Problems on D-spaces, Menger spaces and the star versions of the Menger property

A topological space $X$ is a $D$-space if for every neighborhood assignment $\{N(x):x \in X\}$ (that is, $N(x)$ is an open neighborhood of $x$ for each $x \in X$), there is a closed discrete subset $D$ of $X$ such that $\{N(x):x \in D\}$ is a cover of $X$. A topological space $X$ is Menger if for each sequence $\{\mathcal{U}_n:n\in\omega\}$ of open covers of $X$, there is a sequence $\{\mathcal{V}_n:n\in\omega\}$ such that for each $n\in\omega$, $\mathcal{V}_n$ is a finite subset of $\mathcal{U}_n$ and $\{\bigcup\mathcal{V}_n:n\in\omega\}$ is an open cover of $X$. Todd Eisworth states: "There are certainly some mathematical questions that arouse the curiosity of almost anyone who comes in contact with them, questions that tempt with the simplicity of their formulation, tantalize with promises of an elegant solution if only one can look at the problem in just the right way, and taunt with the number of excellent mathematicians who have examined the question in the past and failed to solve it. The theory of $D$-spaces is replete with such questions."

I would like to discuss with you some problems related to $D$-spaces, Menger spaces, and the star versions of the Menger property.

### October 5, 2017

#### Title: Artificial Intelligence and Analysis of Non-stationary Spatial-temporal Data

In this talk, I will first give a brief introduction to the area of Artificial Intelligence (AI), discussing the main differences between Strong AI and Weak AI and some of the mathematical as well as philosophical theories formulated to describe each of these subareas. I will then review advances in Machine Learning research and the current state of the art in Deep Learning, Statistical Learning and Tensor Decomposition-based Learning methods. The rest of the talk will focus on the main challenges and difficulties confronted when analyzing non-stationary spatial-temporal data with machine learning approaches. As a real-world example of such complex data, I will demonstrate electroencephalogram recordings of patients with major depressive disorder, on which I will share some of my recent research results. Our proposed framework for analyzing these data draws on techniques developed in statistics, operations research, machine learning and big data.

### September 28, 2017

#### Change-point detection for noisy non-stationary biological signals

Experimentally and clinically collected time series data are often contaminated with significant confounding noise, creating short, non-stationary time series. This noise, due to natural variability and measurement error, poses a challenge to conventional change-point detection methods. We proposed a novel, real-time change-point detection method for effectively extracting important time points in non-stationary, noisy time series. We validated our approach on three simulated time series, as well as on a physiological data set from simulated labour experiments in fetal sheep. Our methodology allows, for the first time, the detection of fetal acidemia from changes in the fetus's heart rate variability, rather than by traditional invasive methods. We believe our method is a first step towards effective, non-invasive real-time monitoring during labour from signals that can be easily collected.
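As background, one classical baseline for detecting a single mean shift is the offline CUSUM statistic. The sketch below is that standard textbook baseline on synthetic data, emphatically not the novel method proposed in the talk:

```python
import numpy as np

def cusum_changepoint(x):
    """Classic offline CUSUM for a single mean shift: the estimated change
    point is where the cumulative sum of deviations from the overall mean
    attains its largest magnitude."""
    x = np.asarray(x, dtype=float)
    s = np.cumsum(x - x.mean())
    return int(np.argmax(np.abs(s)))

# Synthetic series: mean 0 for 100 samples, then mean 2 for 100 samples
rng = np.random.default_rng(2)
signal = np.concatenate([rng.normal(0, 1, 100), rng.normal(2, 1, 100)])
print(cusum_changepoint(signal))  # near the true change at index 100
```

Baselines like this assume a stationary series with one clean shift; the short, noisy, non-stationary signals described in the abstract are exactly where they break down.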

### September 21, 2017

#### Kick-off Problem Session

This week we are having audience members introduce a problem they are interested in studying. Each person will take only 5 minutes to give the problem statement and some of the tools used. This session will act as a preview for what may appear in the upcoming weeks. Afterward we will try to fill up the time slots with volunteer speakers.