### November 30, 2017

#### Speaker: Affan Akhter (York U)

#### Title: BlockChain Madness! An introduction to the world of Bitcoin and decentralized networks

Blockchain is a decentralized transaction and data management technology first developed for the Bitcoin cryptocurrency in 2008. The technology is based on a distributed public ledger: the ledger is not owned or controlled by any central authority, transactions are immutable and transparent, and the design eliminates the so-called double-spend problem. For these reasons, blockchain technology is expected to revolutionize industry and commerce and drive economic change on a global scale. It has the potential to give people in developing countries recognized identity and asset ownership, provide financial independence, and avert financial crises.
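The immutability of the ledger comes from chaining blocks by cryptographic hash: each block commits to the hash of its predecessor, so altering any past transaction invalidates every later link. A minimal sketch in Python (illustrative only, not Bitcoin's actual block format):

```python
import hashlib
import json

def block_hash(block):
    # Hash the block's canonical JSON encoding.
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def make_block(prev_hash, transactions):
    return {"prev_hash": prev_hash, "transactions": transactions}

# Build a three-block chain; each block stores its predecessor's hash.
genesis = make_block("0" * 64, ["alice->bob:5"])
b1 = make_block(block_hash(genesis), ["bob->carol:2"])
b2 = make_block(block_hash(b1), ["carol->dave:1"])
assert b2["prev_hash"] == block_hash(b1)

# Tamper with an early transaction: the next link no longer verifies,
# so the whole suffix of the chain would have to be recomputed.
genesis["transactions"][0] = "alice->bob:500"
assert b1["prev_hash"] != block_hash(genesis)
```

Real blockchains add proof-of-work and digital signatures on top of this linking, which is what makes recomputing a tampered suffix economically infeasible.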

Bitcoin, the decentralized peer-to-peer digital currency, is the most popular application of blockchain technology. The digital currency itself is highly controversial, but the underlying blockchain technology has worked flawlessly and has found a wide range of applications in business, politics, health, and society at large. For example, NASDAQ, in partnership with Chain, is working on share trading using blockchain; Verisart is using blockchain to verify art prices and encode copyrights of artwork; and ShoCard encodes and stores personal identity information on the blockchain. In this talk, I will explore blockchain technology as a significant source of disruptive innovations and the possible paradigm shift to a democratic, scalable digital economy. If time permits, I will give an overview of the mathematical foundations of blockchain development, specifically finite fields and elliptic-curve cryptography.

### November 23, 2017

#### Speaker: Mahdis Azadbakhsh (York U)

#### Title: The limiting distribution of composite likelihood ratio test under non-standard conditions

Full likelihood estimation is a well-known, traditional method for parameter estimation. However, correlation and high dimensionality in data can make computing the maximum likelihood intensive or even prohibitive. Composite likelihood estimation was introduced as an alternative to the full likelihood: by using sub-densities instead of the joint density, it makes the computation far more feasible. I therefore focus on finding the distribution of a composite likelihood version of a hypothesis test, which has a more complicated form than its full likelihood counterpart.
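As a toy illustration of the composite likelihood idea (not the test statistic studied in the talk), the sketch below forms an independence composite log-likelihood for correlated Gaussian data by summing marginal log-densities and ignoring the joint covariance; maximizing it still recovers the mean, even though the dependence structure is deliberately misspecified:

```python
import numpy as np

rng = np.random.default_rng(0)

# Correlated data: a shared random effect induces dependence across components.
n, d = 200, 5
shared = rng.normal(size=(n, 1))
x = 2.0 + shared + rng.normal(size=(n, d))  # true mean 2.0, marginal variance 2

def composite_loglik(mu, data, sigma2=2.0):
    # Independence composite log-likelihood: sum of marginal N(mu, sigma2)
    # log-densities over all observations, constants dropped.
    return -0.5 * np.sum((data - mu) ** 2) / sigma2

# Maximize over a grid of candidate means.
grid = np.linspace(0.0, 4.0, 4001)
cl = [composite_loglik(mu, x) for mu in grid]
mu_hat = grid[int(np.argmax(cl))]

# For this model the composite MLE coincides with the overall sample mean.
print(abs(mu_hat - x.mean()) < 1e-3)
```

The price of ignoring dependence shows up not in the point estimate but in its variance, which is why the limiting distribution of composite likelihood ratio statistics differs from the full likelihood chi-squared case.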

### November 16, 2017

#### Speaker: Marco Tosato (York U)

#### Title: Modelling WNV epidemics in Emilia-Romagna

West Nile Virus (WNV) was first identified in Italy in 1998, and has circulated more continuously since 2008, with a total of 173 neurological human cases between 2008 and 2015; it has also become endemic in Canada, especially in Ontario. Since 2009, the Emilia-Romagna region has run a systematic program of trapping and testing mosquitoes and corvids (known to be among the most competent bird species for WNV). Data collected through this program have been analysed with a mathematical model in order to understand the main drivers of the observed dynamics. Our results show that including a seasonal shift in mosquito feeding behaviour (rather than keeping it constant) makes model outputs much more consistent with the observed data.

In this talk, I would first like to discuss some of the key facts related to West Nile Virus and introduce the notion of compartmental models in disease modelling. In the second part, I will show parts of our results, including the ODE model for the mosquito-corvid dynamics and some Markov chain Monte Carlo (MCMC) methods used to estimate the required parameters.
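To fix ideas, a compartmental model splits the population into classes and tracks flows between them with ODEs. Below is a minimal SIR sketch integrated by forward Euler; this is illustrative only, not the talk's mosquito-corvid WNV model (which has more compartments and a seasonally varying feeding rate), and the parameter values are assumed:

```python
# Forward-Euler integration of a basic SIR compartmental model.
beta, gamma = 0.3, 0.1    # transmission and recovery rates (assumed values)
S, I, R = 0.99, 0.01, 0.0 # fractions: susceptible, infectious, recovered
dt, steps = 0.1, 2000     # time step and horizon (T = 200)

for _ in range(steps):
    dS = -beta * S * I          # susceptibles become infected
    dI = beta * S * I - gamma * I  # infections minus recoveries
    dR = gamma * I              # infectious individuals recover
    S, I, R = S + dt * dS, I + dt * dI, R + dt * dR

# The flows cancel, so total population is conserved up to rounding.
print(round(S + I + R, 6))
```

The WNV model in the talk couples compartments like this across two species (mosquitoes and corvids), with transmission terms governed by the mosquito biting rate.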

### November 9, 2017

#### Speaker: Damjan Kalajdzievski (York U)

#### Title: A Combinatorial counting construction on $\omega_{1}$

In this talk I will give an introductory overview of recent work with Fulgencio Lopez.

We show that adding at least $\omega_2$ Cohen reals adds a capturing construction scheme. We study the weaker notion of an $n$-capturing construction scheme, and show it is consistent to have an $n$-capturing construction scheme but no $(n+1)$-capturing construction scheme. We also study the relation of $n$-capturing with the $m$-Knaster hierarchy, and show that $\text{MA}_{\omega_1}(K_m)$ and $n$-capturing are independent when $n$ is at most $m$, and incompatible if $n>m$.

### November 2, 2017

#### Speaker: Andrea Zubac (York U)

#### Title: Nonlinear statistical filtering for noise removal in radar target tracking

The common usage of the word "filter" refers to a device that removes unwanted components from a mixture, such as a cigarette filter reducing the number of fine particles inhaled with tobacco smoke. Similarly, in signal processing, a filter refers to a process that removes unwanted components from a signal, often a range of frequencies; it "filters out" unwanted frequency components. A statistically defined filter, by contrast, filters noise out of a noisy signal. The first statistically defined filter was the Wiener filter, developed by Norbert Wiener during the 1940s. It paved the way for other statistically defined filters, including the Kalman filter, which is the focus of this talk. I will start with a brief description of the Wiener filter, then describe and show an implementation of the (linear) Kalman filter. Next, I will present three nonlinear filters based on the Kalman filter: the extended Kalman filter, the second-order nonlinear filter, and the Monte Carlo simulation filter. Finally, I will show results of a simulation study comparing these filters on a nonlinear radar target tracking problem.
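The linear Kalman filter alternates a predict step (uncertainty grows by the process noise) and an update step (prediction and measurement are blended via the Kalman gain). The scalar sketch below, with assumed noise variances `q` and `r` and a random-walk state, is a minimal illustration rather than the talk's radar model, where the state is a vector of position and velocity:

```python
import numpy as np

rng = np.random.default_rng(1)

# Scalar linear Kalman filter: estimate a slowly drifting state
# from noisy measurements.
q, r = 1e-4, 0.25        # process and measurement noise variances (assumed)
truth, est, p = 0.0, 0.0, 1.0
errors_raw, errors_filt = [], []

for _ in range(500):
    truth += rng.normal(scale=q ** 0.5)     # true state drifts
    z = truth + rng.normal(scale=r ** 0.5)  # noisy measurement
    # Predict: estimate unchanged, uncertainty grows by process noise.
    p += q
    # Update: blend prediction and measurement via the Kalman gain.
    k = p / (p + r)
    est += k * (z - est)
    p *= 1 - k
    errors_raw.append((z - truth) ** 2)
    errors_filt.append((est - truth) ** 2)

# Filtering should reduce the mean squared error well below the raw
# measurement noise level.
print(np.mean(errors_filt) < np.mean(errors_raw))
```

The nonlinear filters in the talk arise when the state dynamics or the measurement map (e.g. radar range and bearing) are nonlinear, so the gain must be computed from a linearization or by simulation.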

### October 19, 2017

#### Speaker: Allysa Lumley (York U)

#### Title: Explicit results on Chebyshev functions: A prime counting adventure

Consider the function $\pi(x)=\#\{n\le x : n \text{ is prime}\}$. Legendre conjectured that $\pi(x)\sim \frac{x}{\log x}$. Nearly 100 years later, in 1896, Hadamard and de la Vallée Poussin proved an equivalent statement by considering the weighted prime counting function $\psi(x)$. In 1962, Rosser and Schoenfeld gave a method to explicitly estimate the error term in the approximation of $\psi(x)$. This result relies on information concerning the non-trivial zeros of the Riemann zeta function $\zeta(s)$, and subsequent numerical improvements to this information have also translated into improved estimates for $\psi(x)$.
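The asymptotic $\pi(x)\sim x/\log x$ is easy to probe numerically. A quick sieve of Eratosthenes (a standard computation, included only for illustration) counts primes up to $10^6$ and compares with $x/\log x$; the ratio approaches 1 quite slowly, which is part of why explicit error terms matter:

```python
import math

def primes_up_to(n):
    # Sieve of Eratosthenes: mark composites, keep the survivors.
    sieve = bytearray([1]) * (n + 1)
    sieve[0:2] = b"\x00\x00"
    for p in range(2, int(n ** 0.5) + 1):
        if sieve[p]:
            sieve[p * p :: p] = bytearray(len(sieve[p * p :: p]))
    return [i for i in range(n + 1) if sieve[i]]

x = 10 ** 6
pi_x = len(primes_up_to(x))
print(pi_x, round(x / math.log(x), 1))  # 78498 vs ~72382.4
```

Even at $x=10^6$ the relative error of $x/\log x$ is around 8%; the refined approximations behind $\psi(x)$ estimates close this gap much faster.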

In this talk, we present various new explicit methods such as introducing some smooth weights and establishing some zero density estimates for the Riemann zeta function. Additionally, we will discuss results using these new techniques for finding primes in short intervals and the distribution of primes in arithmetic progressions.

### October 12, 2017

#### Speaker: Sergio Garcia Balan (York U)

#### Title: Problems on D-spaces, Menger spaces and the star versions of the Menger property

A topological space $X$ is a $D$-*space* if for every neighborhood assignment $\{N(x):x \in X\}$ (that is, $N(x)$ is an open neighborhood of $x$ for each $x \in X$), there is a closed discrete subset $D$ of $X$ such that $\{N(x):x \in D\}$ is a cover of $X$. A topological space $X$ is *Menger* if for each sequence $\{\mathcal{U}_n:n\in\omega\}$ of open covers of $X$, there is a sequence $\{\mathcal{V}_n:n\in\omega\}$ such that for each $n\in\omega$, $\mathcal{V}_n$ is a finite subset of $\mathcal{U}_n$ and $\{\bigcup\mathcal{V}_n:n\in\omega\}$ is an open cover of $X$. Todd Eisworth states: "There are certainly some mathematical questions that arouse the curiosity of almost anyone who comes in contact with them, questions that tempt with the simplicity of their formulation, tantalize with promises of an elegant solution if only one can look at the problem in just the right way, and taunt with the number of excellent mathematicians who have examined the question in the past and failed to solve it. The theory of $D$-spaces is replete with such questions."

I would like to discuss with you some problems related to $D$-spaces, Menger spaces, and the star versions of the Menger property.

### October 5, 2017

#### Speaker: Masoud Ataei (York U)

#### Title: Artificial Intelligence and Analysis of Non-stationary Spatial-temporal Data

In this talk, I will first give a brief introduction to the area of Artificial Intelligence (AI), discussing the main differences between Strong AI and Weak AI and some of the mathematical and philosophical theories formulated to describe each subarea. I will then review advances in Machine Learning research and the current state of the art in Deep Learning, Statistical Learning, and Tensor Decomposition-based learning methods. The rest of the talk will focus on the main challenges encountered when analyzing non-stationary spatial-temporal data with machine learning approaches. As a real-world example of such complex data, I will present electroencephalogram recordings of patients with major depressive disorder, on which I will share some of my recent research results. Our proposed framework for analyzing these data draws on techniques developed in statistics, operations research, machine learning, and big data.

### September 28, 2017

#### Speaker: Nathan Gold (York U)

#### Title: Change-point detection for noisy non-stationary biological signals

Experimentally and clinically collected time series data are often contaminated with significant confounding noise, creating short, non-stationary time series. This noise, due to natural variability and measurement error, poses a challenge to conventional change point detection methods. We propose a novel, real-time change point detection method for effectively extracting important time points in non-stationary, noisy time series. We validated our approach on three simulated time series, as well as on a physiological data set from simulated labour experiments in fetal sheep. Our methodology allows, for the first time, the detection of fetal acidemia from changes in the fetus's heart rate variability rather than through traditional invasive methods. We believe our method is a first step towards effective, non-invasive real-time monitoring during labour using signals that can be easily collected.
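As a point of reference for what real-time change point detection involves, here is the classical one-sided CUSUM detector, a standard baseline and not the novel method proposed in the talk; the drift and threshold values below are assumed tuning parameters:

```python
import numpy as np

rng = np.random.default_rng(2)

# Signal whose mean shifts at t = 300.
signal = np.concatenate([rng.normal(0.0, 1.0, 300),
                         rng.normal(1.5, 1.0, 300)])

# One-sided CUSUM: accumulate excess over a drift allowance and flag
# a change when the cumulative sum crosses a threshold.
drift, threshold = 0.5, 10.0  # tuning parameters (assumed values)
s, alarm = 0.0, None
for t, x in enumerate(signal):
    s = max(0.0, s + x - drift)
    if s > threshold:
        alarm = t
        break

# With these settings the alarm typically fires shortly after t = 300.
print(alarm)
```

CUSUM assumes a stationary pre-change distribution, which is exactly what short, noisy, non-stationary biological signals violate; handling that regime is the motivation for the method in the talk.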