Archive 2017

December 11, 2017: Chen Zhou (Erasmus University Rotterdam)

When: Monday December 11th, 15:05
Where: TU Delft, Faculty EWI, Mekelweg 4, Snijderzaal (LB01.010)

Trends in extreme value indices

We consider extreme value analysis for independent but non-identically distributed observations. In particular, the observations do not share the same extreme value index. This situation is related to, but differs from, heteroscedastic extremes in Einmahl et al. (2016). Compared to heteroscedastic extremes, our model allows for a broader class in which the tails of the probability distributions of different observations are of different order; in other words, the distributions may differ far more than in the heteroscedastic setting. Assuming continuously changing extreme value indices, we provide a non-parametric estimate of the functional extreme value index. Besides estimating the extreme value index locally, we also provide a global estimator for the trend and establish its joint asymptotic properties. The global asymptotic result can be used for testing a pre-specified parametric trend in the extreme value indices; in particular, it can be applied to test whether the extreme value index remains constant across all observations.

December 11, 2017: Jan Beirlant (KU Leuven)

When: Monday December 11th, 16:00
Where: TU Delft, Faculty EWI, Mekelweg 4, Snijderzaal (LB01.010)

Bias reduced estimation of the extreme value index

A lot of attention has been paid to bias reduced estimation of the extreme value index in case of heavy-tailed distributions. In this talk we present some proposals for all max-domains of attraction. A first method is based on ridge regression for generalized quantiles. Secondly we discuss the use of Bernstein polynomials for estimating the bias in the Peaks over Threshold method.

December 4, 2017: Pierre Monmarché (Université Pierre et Marie Curie)

When: Monday December 4th, 16:00
Where: TU Delft, Faculty EWI, Mekelweg 4, Lecture hall D@ta.

Sampling with kinetic processes

Given a target probability measure mu, an MCMC algorithm relies on an ergodic Markov process with invariant measure mu. Many such processes exist and, in order to choose the best one, one should understand the speed at which they converge to equilibrium. We will motivate the use of kinetic processes, and present some results on two different dynamics: the kinetic Langevin process, which is a hypoelliptic diffusion, and velocity jump processes, which are piecewise deterministic processes.
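As an illustration of the first dynamics, here is a minimal Euler-Maruyama sketch of the kinetic (underdamped) Langevin process for a standard Gaussian target. The step size, friction parameter and run length are assumed values chosen for the example, and this naive discretisation is only meant to convey the structure of the process, not any scheme discussed in the talk.

```python
import math
import random

def kinetic_langevin(grad_U, gamma=1.0, dt=0.01, n_steps=200_000, seed=42):
    """Euler-Maruyama discretisation of the kinetic Langevin dynamics
        dx = v dt,   dv = (-grad U(x) - gamma v) dt + sqrt(2 gamma) dW.
    Its invariant measure has x-marginal proportional to exp(-U(x))."""
    rng = random.Random(seed)
    x, v = 0.0, 0.0
    samples = []
    for i in range(n_steps):
        v += (-grad_U(x) - gamma * v) * dt + math.sqrt(2 * gamma * dt) * rng.gauss(0, 1)
        x += v * dt
        if i >= n_steps // 2:       # discard the first half as burn-in
            samples.append(x)
    return samples

# Target mu = N(0,1), i.e. U(x) = x^2 / 2 and grad U(x) = x.
xs = kinetic_langevin(lambda x: x)
mean = sum(xs) / len(xs)
var = sum((x - mean) ** 2 for x in xs) / len(xs)
```

After burn-in, the empirical mean and variance of the x-samples should be close to 0 and 1, up to discretisation bias and Monte Carlo error.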

November 27, 2017: Ludolf Meester (TU Delft)

When: Monday November 27th, 16:00
Where: TU Delft, Faculty EWI, Mekelweg 4, Lecture hall D@ta.

Exponential convergence of adaptive importance sampling algorithms for Markov chains 

These algorithms originate in the field of particle transport analysis, but the structure of the problems is quite general: a Markov chain is run and a "reward" is earned per transition; this continues until the process hits a "graveyard set." The quantity of interest is the expected total reward. In the original problem the reward is energy dissipated, but other problems also fit in: rare event simulation in various settings (the reward is 1 for the transition into the graveyard and 0 otherwise), and finding the largest eigenvalue of a nonnegative matrix.

A recent paper answers the following question: for which Markov chain problems of this kind can a so-called filtered estimator be found, in combination with a Markov importance measure under which this estimator has variance zero? Adaptive importance sampling algorithms aim to approach this zero-variance measure on the fly, and two special cases were already known for which this works: the resulting sequence of estimates converges at an exponential rate. For a while I thought that finding a general convergence proof would be impossible, but in recent months I have made some progress. In the talk I will describe the proof, including the part where the conditions are not as weak as I would like---maybe you have an idea....

November 20, 2017: Michael Vogt (University of Bonn)

When: Monday November 20th, 16:00
Where: TU Delft, Faculty EWI, Mekelweg 4, Lecture hall D@ta.

Multiscale Clustering of Nonparametric Regression Curves

We study a longitudinal data model with nonparametric regression functions that may vary across the observed subjects. In a wide range of applications, it is natural to assume that not every subject has a completely different regression function. We may rather suppose that the observed subjects can be grouped into a small number of classes whose members share the same regression curve. We develop a bandwidth-free clustering method to estimate the unknown group structure from the data. More specifically, we construct estimators of the unknown classes and their unknown number which are free of classical bandwidth or smoothing parameters. In the talk, we analyze the statistical properties of the proposed estimation method and illustrate it by an application to temperature anomaly data. 

November 13, 2017:  Stochastics Meeting Lunteren

November 6, 2017:  Nan van Geloven (Leiden University Medical Center)

When: Monday November 6th, 16:00
Where: TU Delft, Faculty EWI, Mekelweg 4, Lecture hall D@ta.

Univariate frailty models for the evaluation of treatments

Doctors often have to choose between starting treatment immediately or first introducing a wait-and-see period during which a patient might recover without the treatment. In this presentation I show that the effect of such a treatment delay period on the time to recovery depends on the heterogeneity between patients’ recovery chances. I study this effect using univariate frailty models, assuming different distributions for the frailty. In a frailty model with constant (i.e, exponential) baseline hazard and a proportional treatment effect that is common over patients, a treatment delay period hardly compromises cumulative recovery rates if the population is heterogeneous. In a homogeneous population however, cumulative recovery rates are directly compromised by treatment delay.

Estimating the effect of treatment delay from data can be done in several ways. We show that the conventional Cox proportional hazard model overestimates the effect of treatment delay. Including a frailty term in the model could improve the estimation, but frailties are generally hard to estimate in univariate survival data. I present alternative approaches accommodating the effect of heterogeneity on treatment delay using treatment by time interaction terms. Estimation results are presented both through simulations and in two motivating applications evaluating the effect of delaying fertility treatments on time-to-pregnancy in couples with unexplained subfertility.

October 30, 2017: Eric-Jan Wagenmakers (University of Amsterdam)

When: Monday October 30th, 16:00
Where: TU Delft, Faculty EWI, Mekelweg 4, Lecture hall Chip.

The Why and How of Testing a Point-Null Hypothesis Within a Bayesian Framework 

In the first part of this presentation I will describe the
history of the Bayesian point-null hypothesis test as developed by
Harold Jeffreys. The conclusion --which appears to have been largely
forgotten-- is that in order to have any confidence in the existence
of an invariance or a general law, one needs to assign it a separate
prior probability. I will contrast Jeffreys's methodology with a
beguiling alternative, which is to assess the extent to which the
posterior distribution overlaps with the point of interest.

October 23, 2017: Aernout van Enter (University of Groningen)

When: Monday October 23rd, 16:00
Where: TU Delft, Faculty EWI, Mekelweg 4, Lecture hall G.

One-sided versus two-sided points of view

Finite-state, discrete-time Markov chains coincide with Markov fields on $\mathbb Z$, (which are nearest-neighbour Gibbs measures in one dimension). That is, the one-sided Markov property and the two-sided Markov property are equivalent.

We discuss to what extent this remains true if we try to weaken the Markov property to the almost Markov property, which is a form of continuity of conditional probabilities. The generalization of the one-sided Markov measures leads to the so-called "g-measures" (aka "chains with complete connection", "uniform martingales",..), whereas the two-sided generalization leads to the class of Gibbs or DLR measures, as studied in statistical mechanics. It was known before that there exist g-measures which are not Gibbs measures. It is shown here that neither class includes the other.

We consider this issue in particular for the example of long-range Dyson model Gibbs measures.

(Work with R.Bissacot, E.Endo and A. Le Ny)

October 18, 2017: Erik Broman (Chalmers University of Technology and University of Gothenburg)

When: Wednesday October 18th, 13:45
Where: TU Delft, Faculty EWI, Mekelweg 4, Room D@ta.

Covering a subset of R^d by Poissonian random sets

The problem of covering a set A by a collection of random sets dates back to Dvoretzky in 1954. Since then, a host of papers have been written on the subject. In this talk we shall review some of this history and discuss two directions in which progress has recently been made. In the first case we consider a statistically scale invariant collection of subsets of R^d, chosen at random according to a Poisson process of intensity lambda. The complement of the union of these sets is then a random fractal that we denote by C. Such random fractals have been studied in many contexts, but here we are interested in the critical value of lambda at which the set C becomes almost surely empty (so that R^d is completely covered). Such problems were earlier studied and solved in one dimension; here we shall present recent progress which solves the problem in all dimensions. This part is based on joint work with J. Jonasson and J. Tykesson.

In the second direction we consider a dynamic version of coverings. For instance, the set A could be a box of side length n, onto which balls rain from the sky at unit rate. One then asks for the time at which A is covered. Together with F. Mussini I have recently studied a variant in which the balls are replaced by bi-infinite cylinders. This makes the problem fundamentally different, as one no longer has independence between well-separated regions, so new methods and techniques must be used. Our main result is that we find the correct asymptotics for the cover time as the set A grows.
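A one-dimensional toy version of this dynamic covering is easy to simulate directly. The sketch below is an illustration with assumed details (intervals on a line segment, not the cylinder model of the talk): unit-rate arrivals of fixed-length intervals rain onto [0, n], and we record the first time the segment is fully covered.

```python
import random

def cover_time_1d(n, radius=1.0, seed=7):
    """Intervals of half-length `radius` rain on [0, n] at unit rate per unit
    length; return the time at which [0, n] is first completely covered.
    Centres land uniformly on [-radius, n + radius] so the edges can be hit."""
    rng = random.Random(seed)
    t, uncovered = 0.0, [(0.0, float(n))]
    while uncovered:
        t += rng.expovariate(n + 2 * radius)   # waiting time to next arrival
        c = rng.uniform(-radius, n + radius)   # centre of the new interval
        lo, hi = c - radius, c + radius
        remaining = []
        for a, b in uncovered:                 # subtract [lo, hi] from each gap
            if lo > a:
                remaining.append((a, min(b, lo)))
            if hi < b:
                remaining.append((max(a, hi), b))
        uncovered = remaining
    return t

T = cover_time_1d(50)
```

Since each point is covered at rate 2·radius, the cover time of [0, n] grows logarithmically in n in this toy model.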

October 16, 2017: Carlo Lancia (Leiden University Medical Center)

When: Monday October 16th, 16:00
Where: TU Delft, Faculty EWI, Mekelweg 4, Lecture hall G.

Modelling inbound air traffic with pre-scheduled random arrivals: analytical and applied results

Pre-scheduled random arrivals (PSRA) are obtained by superimposing i.i.d. random delays on a deterministic stream of customers. This point process is well suited to modelling vehicular and logistics streams, where structured arrivals are inherently subject to random fluctuations. Yet the use of PSRA for modelling air-traffic demand is scarce, and Poisson processes are preferred because of their mathematical tractability.
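The construction is concrete enough to simulate: customer k, scheduled at time k·tau, arrives at k·tau plus an i.i.d. delay. The sketch below uses exponential delays (the EDA special case) with assumed parameter values, and checks that the window counts are underdispersed relative to a Poisson process, which is the signature of the capacity constraints discussed in the abstract.

```python
import random

def psra_arrivals(n, tau=1.0, mean_delay=2.0, seed=0):
    """Pre-scheduled random arrivals: customer k is scheduled at k*tau and
    actually arrives at k*tau + D_k, with i.i.d. exponential delays D_k
    (the exponentially delayed arrivals, EDA, special case)."""
    rng = random.Random(seed)
    return sorted(k * tau + rng.expovariate(1.0 / mean_delay) for k in range(n))

arrivals = psra_arrivals(20_000)
counts = [0] * 20_000
for t in arrivals:
    idx = int(t)
    if idx < len(counts):
        counts[idx] += 1
counts = counts[10:-10]        # drop edge effects at both ends
m = sum(counts) / len(counts)
v = sum((c - m) ** 2 for c in counts) / len(counts)
# For a Poisson process v = m; for PSRA v < m (underdispersion), because the
# count in a window is a sum of independent Bernoulli indicators, one per customer.
```

With tau = 1 the mean count per unit window is about 1, while the variance is strictly smaller, unlike the Poisson case.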

Using data from some important European airports, I will construct a PSRA process and show that it describes the inbound demand well. Further, I will show that this PSRA can capture air-space capacity constraints, which are observed as a negative autocorrelation between arrivals in two consecutive time windows.

Next, I will move to a stochastic operations research setting and study a single-server queue with deterministic service time. Such a model is motivated by the landing operations of a large hub, like London Heathrow. I will consider the special case of a PSRA with exponential delays, called exponentially delayed arrivals (EDA). In Kendall's notation the queue system is then EDA/D/1.

I will show how to model EDA/D/1 as a bivariate Markov chain in the quarter plane. This chain has a unique equilibrium distribution, which can be found by solving a bivariate functional equation for the generating function of the stationary state. The functional equation is hard to solve in general, but it admits an easy solution on a subspace of the complex bi-plane. Using that solution, I will derive asymptotic bounds for the equilibrium distribution of EDA/D/1 and propose an easy yet efficient approximation scheme.

October 9, 2017: Paulo Serra (TU Eindhoven)

When: Monday October 9th, 16:00
Where: TU Delft, Faculty EWI, Mekelweg 4, Lecture hall G.

Bayesian preventive maintenance

Motivated by industrial practice, we model the condition of an asset using a stochastic process. We assume that as soon as the condition of the asset exceeds a predefined safety threshold, the asset is shut down and costly corrective reparations must ensue. The condition of the asset is regularly inspected (at pre-determined moments, with a fixed inspection cost), and the goal is to decide, based on the history of the inspections and taking into account all costs, whether preventive maintenance (which is less costly) should be performed.
We consider the class of so-called control limit (threshold-type) policies: maintenance is performed if a certain threshold (depending on the collected data, preventive maintenance and corrective repair costs) is exceeded, at which point the asset is restored to a "good-as-new" condition. We work with a loss function that corresponds to the costs of each action (repair, maintenance, inspection). The model parameters are endowed with a prior, and the threshold is chosen so as to minimise the Bayesian expected loss.
The asymptotic distribution of the model parameters, of the duration of each maintenance cycle, and of the proposed threshold is also discussed.
I also present some numerical results, and address issues relating to the optimality of the approach.
This is joint work with Stella Kapodistria (Eindhoven University of Technology).

October 2, 2017: Lorenzo Federico (TU Eindhoven)

When: Monday October 2nd, 16:00
Where: TU Delft, Faculty EWI, Mekelweg 4, Lecture hall G.


Percolation on finite graphs is known to exhibit a phase transition similar to that of the Erdos-Renyi random graph in the presence of sufficiently weak geometry. We focus on the Hamming graph $H(d,n)$ (the cartesian product of $d$ complete graphs on $n$ vertices each) when $d$ is fixed and $n \to \infty$. We identify the critical point $p^{(d)}_c$ at which this phase transition happens and we analyse the structure of the largest connected components at criticality. We prove that the scaling limit of component sizes is identical to that of critical Erdos-Renyi components, while the number of surplus edges is much higher. These results are obtained by coupling percolation to the trace of branching random walks on the Hamming graph.

Based on joint work with Remco van der Hofstad, Frank den Hollander and Tim Hulshof.

September 25, 2017: Alisa Kirichenko (University of Amsterdam)

When: Monday September 25th, 16:00
Where: TU Delft, Faculty EWI, Mekelweg 4, Lecture hall G.

Function estimation on a large graph using Bayesian Laplacian regularization

We consider a Bayesian approach to estimating a smooth function in the context of regression or classification problems on large graphs. We present a mathematical framework that allows one to study the performance of nonparametric function estimation methods on large graphs, together with minimax convergence rates for these problems within the framework. We show how asymptotically optimal Bayesian regularization can be achieved under an asymptotic shape assumption on the underlying graph and a smoothness condition on the target function, both formulated in terms of the graph Laplacian. The priors we study are randomly scaled Gaussians with precision operators involving the Laplacian of the graph.

September 18, 2017: Conrado Freitas Paulo Da Costa (Leiden University)

When: Monday September 18th, 16:00
Where: TU Delft, Faculty EWI, Mekelweg 4, Lecture hall G.

From particle systems to reaction-diffusion equations

In this talk I will present how to use birth and death chains on a graph to derive solutions of a family of non-linear finite-dimensional SDEs of reaction-diffusion type. In one dimension, this family corresponds to the fluctuations around a stable point of an ODE, and can be seen as arising either from tailor-made particle systems or from scaling limits of a canonical particle system. For multi-dimensional SDEs, the approach of tailor-making the particle systems is more general and allows for a broader family of SDEs. The derivation of these solutions, in the spirit of the Stroock-Varadhan martingale methods, is based on the equivalence between weak solutions of SDEs and solutions of martingale problems.


September 11, 2017: William Yoo (Leiden University)

When: Monday September 11th, 16:00
Where: TU Delft, Faculty EWI, Mekelweg 4, Lecture hall G.

Bayes Lepski’s method and credible bands through volume of tubular neighborhoods

For a class of priors based on random series basis expansions, we develop a Bayesian Lepski's method to estimate the unknown regression function. In this approach, the series truncation point is determined by a stopping rule that balances the posterior mean bias and the posterior standard deviation. Armed with this mechanism, we discuss an interesting method to construct Bayesian credible bands, where this statistical task is reformulated into a problem in geometry, and the band's radius is calculated by finding the volume of certain tubular neighborhoods embedded in a unit sphere. We discuss two special cases involving B-splines and wavelets, and touch upon some interesting consequences such as the uncertainty principle and self-similarity.

July 4, 2017: Emilio Cirillo (Rome 1)

When: Tuesday July 4th, 12:45
Where: TU Delft, Faculty EWI, Mekelweg 4, Room Pi.

Particle-based modelling of flows through obstacles

The presence of obstacles modifies the way in which particles diffuse. In cells it is observed that the mean-square displacement of biomolecules scales as a power law with exponent smaller than one. This behavior, called anomalous diffusion, is due to the presence of macromolecules playing the role of obstacles. We discuss the effect of fixed macroscopic obstacles on the time needed by particles to cross a strip, and we consider both a diffusive and a ballistic regime. We find that in some regimes this residence time is not monotonic with respect to the size and the location of the obstacles. We discuss our results for particles performing random walks on a two-dimensional strip, considering also the effect of an exclusion rule. Results obtained in collaboration with A. Ciallella (Rome), O. Krehel (Eindhoven), A. Muntean (Karlstad), R. van Santen (Eindhoven), and A. Sengar (Eindhoven) will be discussed.

June 27, 2017: Christoph Hofer-Temmel (NLDA)

When: Tuesday June 27th, 12:45 
Where: TU Delft, Faculty EWI, Mekelweg 4, Room D@ta.

Disagreement percolation for marked Gibbs point processes

Disagreement percolation is a technique to control the differing boundary conditions in a Gibbs specification by a simpler percolation model. In the high temperature regime, the percolation model does not percolate, which implies the uniqueness of the Gibbs measure. If the percolation has exponentially decaying connection probabilities, then exponential decay of correlations for the Gibbs measure follows, too. We extend this technique from the discrete case and bounded-range-interaction simple Gibbs point processes to finite-range-interaction marked Gibbs point processes and general Boolean models. A core building block is a dependent thinning from a Poisson point process to a dominated Gibbs point process within a finite volume, where the thinning probability is related to the derivative of the free energy of the Gibbs point process.

June 20, 2017: Matthias Gorny (Université Paris-Sud)

When: Tuesday June 20th, 12:45 
Where: TU Delft, Faculty EWI, Mekelweg 4, Room D@ta.

A Curie-Weiss model of self-organized criticality

In their famous 1987 article, Per Bak, Chao Tang and Kurt Wiesenfeld showed that certain complex systems, composed of a large number of dynamically interacting elements, are naturally attracted by critical points, without any external intervention. This phenomenon, called self-organized criticality (SOC), can be observed empirically or simulated on a computer in various models. However the mathematical analysis of these models turns out to be extremely difficult. Even models whose definition seems simple, such as the models describing the dynamics of a sandpile, are not well understood mathematically. In my presentation, I will introduce a model of SOC which is built by modifying the generalized Ising Curie-Weiss model. I will present a fluctuation theorem which proves that this model indeed exhibits SOC: the sum $S_{n}$ of the random variables behaves as in the typical critical generalized Ising Curie-Weiss model, i.e., the fluctuations are of order $n^{3/4}$ and the limiting law is $C \exp(-\lambda x^{4})\,dx$ where $C$ and $\lambda$ are suitable positive constants. Finally I will introduce associated dynamic models of SOC.

June 13, 2017: No Seminar

June 9, 2017: Rob Ross (TU Delft)

When: Friday June 9th, 12:45 
Where: TBA

Reliability in High Voltage networks – Effective asset management of a strategic infrastructure

The transmission electrical network is the backbone of the electrical grid. It connects large scale power generation to the regional distribution electrical networks and to large customers. Of growing importance are also the interconnections between neighbouring countries and between countries through submarine cable systems.
TenneT is the transmission utility in the Netherlands and a large part of Germany. With a security of supply of 99.9999% and 41 million end-users, the challenge is how to perform effective asset management, i.e. how to warrant and make optimal use of the many thousands of objects that together shape the grid. On the one hand, billions of euros are invested in the development of grids that embrace sustainable energy. On the other hand, a considerable part of the grid is over 30 years of age and still functioning well, but the challenge is to detect in time the need for inspection, refurbishment and replacement. Too early replacement is a waste of public money, but too late replacement may lead to large damage. Asset management aims at doing the right thing at the right time against minimum costs. The underlying evaluation and decision-making are based on expertise and optimized with statistics.
This colloquium will focus on the various issues that asset strategists of electrical power grids face, and on the methods that are in place or under development in pursuit of maintaining a high reliability and availability of the electric power supply.

June 6, 2017: Yining Chen (LSE)

When: Tuesday June 6th, 12:45 
Where: TU Delft, Faculty EWI, Mekelweg 4, Room D@ta.

Detecting multiple local extrema via wild binary segmentation

We consider the univariate nonparametric regression problem, where given n observations, the goal is to detect the number and locations of multiple local maxima and minima in the curve. We propose a new approach that combines the ideas of wild binary segmentation (Fryzlewicz, 2014) and mode estimation using isotone regression. We show that our procedure consistently estimates the number of local extrema, and is minimax optimal (up to a logarithmic factor) in estimating the locations of these points. Moreover, we show that the computational complexity of our method is near-optimal (i.e., up to a logarithmic factor, of order n). Finally, we discuss how our approach could be extended to detect other interesting features, such as inflection points.

May 30, 2017: Stéphanie van der Pas (LUMC and Leiden University)

When: Tuesday May 30th, 12:45
Where: TU Delft, Faculty EWI, Mekelweg 4, Room D@ta.

Bayesian community detection

In the stochastic block model, nodes in a graph are partitioned into classes ('communities') and it is assumed that the probability of the presence of an edge between two nodes solely depends on their class labels. We are interested in recovering the class labels, and employ the Bayesian posterior mode for this purpose. We present results on weak consistency (where the fraction of misclassified nodes converges to zero) and strong consistency (where the number of misclassified nodes converges to zero) of the posterior mode, in the 'dense' regime where the probability of an edge occurring between two nodes remains bounded away from zero, and in the 'sparse' regime where this probability does go to zero as the number of nodes increases.
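The generative model itself is simple to write down; the sketch below (with assumed parameter values, illustrating the dense two-community case rather than any estimator from the talk) samples an adjacency matrix in which the edge probability depends only on the class labels, and checks the within- and between-community edge densities.

```python
import random

def sample_sbm(labels, p_in, p_out, seed=1):
    """Adjacency matrix of a stochastic block model: an edge between nodes i
    and j is present with probability p_in if labels match, else p_out."""
    rng = random.Random(seed)
    n = len(labels)
    A = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            p = p_in if labels[i] == labels[j] else p_out
            if rng.random() < p:
                A[i][j] = A[j][i] = 1
    return A

labels = [0] * 50 + [1] * 50
A = sample_sbm(labels, p_in=0.8, p_out=0.2)

# Empirical edge densities within and between the two communities
within = sum(A[i][j] for i in range(50) for j in range(i + 1, 50))
within += sum(A[i][j] for i in range(50, 100) for j in range(i + 1, 100))
between = sum(A[i][j] for i in range(50) for j in range(50, 100))
d_in = within / 2450.0    # 2 * C(50, 2) within-community pairs
d_out = between / 2500.0  # 50 * 50 between-community pairs
```

Recovering the labels from A alone (e.g. via the posterior mode, as in the talk) is the hard direction; the simulation only illustrates the forward model.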

May 23, 2017: Jere Koskela (TU Berlin)

When: Tuesday May 23rd, 12:45
Where: TU Delft, Faculty EWI, Mekelweg 4, Room D@ta.

Consistency results for Bayesian nonparametric inference from processes with jumps

Consistency can informally be thought of as the ability to infer the true data-generating model from a sufficiently large amount of data, and has been regarded as a minimal condition for good inference procedures for several decades. It is also notoriously difficult to verify in the Bayesian nonparametric setting. In recent years, positive results have been established for discretely observed diffusions under restrictive but verifiable conditions on the prior. I will present an introduction to Bayesian nonparametric inference and posterior consistency, and show how these results for diffusions can be generalised to jump-diffusions under an additional identifiability assumption. Similar arguments will also be shown to yield posterior consistency for a separate class of processes called Lambda-Fleming-Viot processes: inhomogeneous, compactly supported compound Poisson processes arising as models of allele frequencies in population genetics. Identifiability can also be verified rather than assumed for Lambda-Fleming-Viot processes, which results in a tractable set of conditions for posterior consistency that is satisfied, e.g., by the popular Dirichlet process mixture model prior.

May 16, 2017: Yong Wang (The University of Auckland)

When: Tuesday May 16th, 12:45
Where: TU Delft, Faculty EWI, Mekelweg 4, Room D@ta.

Mixture-based Nonparametric Density Estimation

In this talk, I will describe a general framework for nonparametric density estimation that uses nonparametric or semiparametric mixture distributions. Similar to kernel-based estimation, the proposed approach uses bandwidth to control the density smoothness, but each density estimate for a fixed bandwidth is determined by likelihood maximization, with bandwidth selection carried out as model selection. This leads to much simpler models than the kernel ones, yet with higher accuracy.
Results of simulation studies and real-world data in both the univariate and the multivariate situation will be given, all suggesting that these mixture-based estimators outperform the kernel-based ones.
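The idea can be sketched in a few lines. The example below is a minimal illustration under stated assumptions, not the speaker's exact estimator: Gaussian components are centred at the data points with common standard deviation equal to the bandwidth, and the mixing weights are fitted by EM (likelihood maximisation), in contrast to the fixed uniform weights of a kernel estimator.

```python
import math
import random

def mixture_density_fit(data, bandwidth, n_iter=30):
    """Fit mixing weights of a Gaussian mixture with components centred at the
    data points and common sd = bandwidth, via EM for the mixture likelihood.
    Returns the fitted density as a callable."""
    n = len(data)
    w = [1.0 / n] * n

    def comp(x, mu):
        z = (x - mu) / bandwidth
        return math.exp(-0.5 * z * z) / (bandwidth * math.sqrt(2 * math.pi))

    for _ in range(n_iter):
        new_w = [0.0] * n
        for x in data:
            dens = [w[k] * comp(x, data[k]) for k in range(n)]
            s = sum(dens)
            for k in range(n):
                new_w[k] += dens[k] / s   # E-step: responsibilities
        w = [v / n for v in new_w]        # M-step: update mixing weights

    return lambda x: sum(w[k] * comp(x, data[k]) for k in range(n))

rng = random.Random(3)
data = [rng.gauss(0, 1) for _ in range(100)]
f = mixture_density_fit(data, bandwidth=0.5)
```

For N(0,1) data the fitted density at 0 should be near the true value 0.399; in practice EM prunes many component weights toward zero, giving a simpler model than the kernel estimate with the same bandwidth.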

May 9, 2017: Hakan Güldas (Leiden University)

When: Tuesday May 9th, 12:45
Where: TU Delft, Faculty EWI, Mekelweg 4, Room D@ta.

Random walks on dynamic configuration models

In this talk, I will first introduce the dynamic configuration model, which is a dynamic random graph model in discrete time. Then I will go into the details of our results on mixing times of simple random walks on the dynamic configuration model. The key property of the dynamic configuration model is that the degrees of the vertices do not change over time. Thanks to this property, the notion of a stationary distribution for the random walk on the graph makes sense, and mixing occurs although the random walk itself is not Markovian. The results I will give identify the behaviour of mixing times in terms of the proportion of edges that changes at every step of the graph dynamics when the number of vertices is large.
These results are based on joint work with Luca Avena, Remco van der Hofstad and Frank den Hollander.
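The degree-preserving dynamics can be illustrated by random edge swaps. This is a minimal sketch with assumed details (the talk's exact rewiring mechanism may differ): at each step two edges are picked uniformly at random and their endpoints exchanged, so every vertex keeps its degree.

```python
import random

def rewire_step(edges, rng):
    """One step of a degree-preserving graph dynamics: pick two edges
    uniformly at random and swap one endpoint between them. The degree
    sequence is invariant under this move."""
    i, j = rng.sample(range(len(edges)), 2)
    (a, b), (c, d) = edges[i], edges[j]
    edges[i], edges[j] = (a, d), (c, b)

def degrees(edges, n):
    deg = [0] * n
    for a, b in edges:
        deg[a] += 1
        deg[b] += 1
    return deg

rng = random.Random(5)
n = 20
# A random multigraph on n vertices (self-loops and multi-edges allowed,
# as in the configuration model).
edges = [(rng.randrange(n), rng.randrange(n)) for _ in range(40)]
before = degrees(edges, n)
for _ in range(1000):
    rewire_step(edges, rng)
after = degrees(edges, n)
```

However many steps are taken, `before` and `after` coincide, which is the invariance that makes the walk's stationary distribution well defined.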

May 2, 2017: Moritz Schauer (Leiden University)

When: Tuesday May 2nd, 12:45
Where: TU Delft, Faculty EWI, Mekelweg 4, Room D@ta.

Stochastic monotonicity of Markov processes - A generator approach

We consider stochastic orders on random variables which can be defined in terms of expectations of test functions. Notable examples are the standard stochastic order induced by the increasing functions or the convex order induced by convex functions, capturing the size and spread of random variables. In general, we consider cones of test functions characterized by Φ f ≥ 0 for some linear operator Φ.
Of particular interest are stochastically monotone Markov processes, which preserve stochastic order properties in time. The semigroup {S(t): t ≥ 0} of a monotone Markov process, defined by S(t) f(x) = E[ f(X(t)) | X(0) = x ], maps these cones into themselves.
We introduce a new functional analytic technique based on the generator A of the semi-group of a Markov process X(t) and its resolvent to study the property of stochastic monotonicity. We show that the existence of an operator B with positive resolvent such that Φ A - B Φ is a positive operator for a large enough class of functions implies stochastic monotonicity. This establishes a technique for proving stochastic monotonicity and preservation of order for Markov processes that can be applied in a wide range of settings including various orders for diffusion processes with or without boundary conditions and orders for discrete interacting particle systems.
Joint work with Richard C. Kraaij (Ruhr-University of Bochum)

April 25, 2017: Maaneli Derakhshani (Utrecht University)

When: Tuesday April 25th, 12:45
Where: TU Delft, Faculty EWI, Mekelweg 4, Room D@ta.

Hints Toward A Stochastic Hidden-Variables Foundation For Quantum Mechanics

It is well-known that standard quantum theory is plagued by conceptual and technical problems, most notably the quantum measurement problem. The quantum measurement problem indicates that standard quantum theory (whether in non-relativistic or relativistic or quantum-gravitational form) cannot be a fundamental theory of the physical world, and must be replaced by a measurement-problem-free theory of quantum phenomena. Among the viable alternatives to standard quantum theory are nonlocal contextual 'hidden-variable' theories. In this talk, it will be shown that there are tantalizing meta-theoretical hints that the Schroedinger equation and Born-rule interpretation of the wavefunction in standard quantum mechanics have deeper foundations in some nonlocal contextual theory of stochastic hidden-variables. This will be shown by drawing surprising and little-known correspondences between the mathematical structures of Schroedinger's equation and quantum expectation values of physical observables, on the one hand, and the mathematical structures of (1) classical statistical mechanics in the Hamilton-Jacobi representation, (2) the Einstein-Smoluchowski theory of classical Brownian motion, and (3) de Broglie's famous model of a clock particle guided by phase waves, on the other. Finally, it will be suggested that Nelson's stochastic mechanics, and a recent generalization of it proposed by us, constitute the anticipated theory of stochastic hidden-variables.

April 18, 2017: Easter Break

April 11, 2017: Dutch Math Congress

April 4, 2017: Ronald Meester (VU Amsterdam)

When: Tuesday April 4th, 12:45
Where: TU Delft, Faculty EWI, Mekelweg 4, Room D@ta.

Why the law of total probability is sometimes not desirable, and how the theory of belief functions helps to take care of this

We discuss some examples of betting situations in which the law of total probability fails. Since this law follows from the axioms of Kolmogorov and the definition of conditional probability, it follows that a more general theory is necessary. I will formulate more flexible axioms which turn out to characterize belief functions, a well known generalisation of probability measures. Within this theory, conditional belief functions can be defined in various ways, corresponding, roughly, to conditioning on either a necessary truth or a contingent truth. As such, the classical theory is extended and refined at the same time. I will argue that when probability is interpreted epistemically, one should always use belief functions rather than Kolmogorov probability. 

This is joint work with Timber Kerkvliet.
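The belief-function calculus mentioned in the abstract can be illustrated with a minimal Dempster-Shafer-style sketch; the frame of discernment, the mass assignment, and the function names are illustrative assumptions, not the speaker's construction. A belief function assigns each event the total mass of the focal sets it contains, and additivity can fail: bel(A) + bel(A^c) may be strictly less than 1.

```python
def belief(mass, event):
    """bel(A): total mass of all focal sets contained in A."""
    return sum(m for focal, m in mass.items() if focal <= frozenset(event))

def plausibility(mass, event):
    """pl(A): total mass of all focal sets intersecting A."""
    return sum(m for focal, m in mass.items() if focal & frozenset(event))

# Frame of discernment {a, b, c}: half the mass pins down 'a',
# the other half expresses total ignorance over the whole frame.
mass = {frozenset("a"): 0.5, frozenset("abc"): 0.5}

print(belief(mass, "a"))        # 0.5
print(belief(mass, "bc"))       # 0 -- so bel(A) + bel(A^c) < 1
print(plausibility(mass, "a"))  # 1.0
```

The gap between belief and plausibility of the same event quantifies the ignorance that a single Kolmogorov probability measure cannot express.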

March 28, 2017: Gourab Ray (Cambridge University)

When: Tuesday March 28th, 12:45
Where: TU Delft, Faculty EWI, Mekelweg 4, Room D@ta.

Universality of fluctuation in the dimer model

The dimer model is a very popular model in statistical physics because of its exact solvability properties. I will try to convince you that the fluctuation in the dimer model is universal in the sense that it is more or less independent of the underlying graph and of the topology in which the graph is embedded, and is given by a form of the Gaussian free field.
Joint work with Nathanael Berestycki and Benoit Laslier.

March 21, 2017: Andrew Duncan (University of Sussex)

When: Tuesday March 21st, 12:45
Where: TU Delft, Faculty EWI, Mekelweg 4, Room D@ta.

Measuring Sample Quality with Diffusions

To improve the efficiency of Monte Carlo estimators, practitioners are turning to biased Markov chain Monte Carlo procedures that trade off asymptotic exactness for computational speed. While a reduction in variance due to more rapid sampling can outweigh the bias introduced, the inexactness creates new challenges for parameter selection. In particular, standard measures of sample quality, such as effective sample size, do not account for asymptotic bias. To address these challenges, we introduce a new computable quality measure based on Stein's method that quantifies the maximum discrepancy between sample and target expectations over a large class of test functions. We demonstrate this tool by comparing exact, biased, and deterministic sample sequences and illustrate applications to hyperparameter selection, convergence rate assessment, and quantifying bias-variance tradeoffs in posterior inference.
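To give a flavour of the Stein-based quality measure described above, here is a sketch of a kernel Stein discrepancy for a one-dimensional standard normal target. The RBF kernel, the bandwidth, and the sample sizes are illustrative assumptions, not the authors' exact construction; the point is that the discrepancy is computable from the sample and the target's score function alone, and it penalises asymptotic bias that effective sample size would miss.

```python
import numpy as np

def ksd_gaussian_target(x, h=1.0):
    """V-statistic kernel Stein discrepancy of sample x against N(0,1),
    using an RBF kernel with bandwidth h. Larger value = worse sample."""
    x = np.asarray(x, dtype=float)
    d = x[:, None] - x[None, :]           # pairwise differences x_i - x_j
    k = np.exp(-d**2 / (2 * h**2))        # RBF kernel matrix
    s = -x                                # score of N(0,1): d/dx log p(x) = -x
    # Stein kernel u(x,y) = s(x)s(y)k + s(x) dk/dy + s(y) dk/dx + d2k/dxdy
    u = (s[:, None] * s[None, :] * k
         + s[:, None] * (d / h**2) * k
         + s[None, :] * (-d / h**2) * k
         + (1 / h**2 - d**2 / h**4) * k)
    return np.sqrt(u.mean())

rng = np.random.default_rng(0)
good = rng.standard_normal(500)          # exact sample from the target
bad = rng.standard_normal(500) + 1.0     # biased sample (shifted mean)
print(ksd_gaussian_target(good) < ksd_gaussian_target(bad))  # True
```

The shifted sample has a larger discrepancy even though both chains of draws would report similar effective sample sizes, which is exactly the failure mode of standard diagnostics that motivates the talk.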

March 14, 2017: Kolyan Ray (Leiden University)

When: Tuesday March 14th, 12:45
Where: TU Delft, Faculty EWI, Mekelweg 4, Room D@ta.

Asymptotic equivalence between density estimation and Gaussian white noise revisited

Asymptotic equivalence between two statistical models means that they have the same asymptotic properties with respect to all decision problems with bounded loss. A key result by Nussbaum states that nonparametric density estimation is asymptotically equivalent to a suitable Gaussian shift model, provided that the densities are smooth enough and uniformly bounded away from zero.
We study the case when the latter assumption does not hold and the density can be arbitrarily small. We further derive the optimal Le Cam distance between these models, which quantifies how close they are. As an application, we also consider Poisson intensity estimation with low count data.
This is joint work with Johannes Schmidt-Hieber.

March 7, 2017: Frank van der Meulen (TU Delft)

When: Tuesday March 7th, 12:45
Where: TU Delft, Faculty EWI, Mekelweg 4, Room D@ta.

Bayesian estimation for hypo-elliptic diffusions

Suppose X is a discretely observed diffusion process and we wish to sample from the posterior distribution of parameters appearing in either the drift coefficient or the diffusion coefficient. As the likelihood is intractable a common approach is to derive an MCMC algorithm where the missing diffusion paths in between the observations are augmented to the state space. This requires efficient sampling of diffusion bridges. In recent years some results have appeared in the "uniformly elliptic" case, which is characterised by nondegeneracy of the covariance matrix of the noise. The "hypo-elliptic" case refers to the situation where the covariance matrix of the noise is degenerate and where observations are only made of variables that are not directly forced by white noise. As far as I am aware, not much is known about how to sample bridges in this case.
In this talk I will share some recent ideas on extending earlier results with Harry van Zanten (UvA) and Moritz Schauer (Leiden), derived under the assumption of uniform ellipticity, to this setting.
Joint work with Harry van Zanten (UvA), Moritz Schauer (Leiden) and Omiros Papaspiliopoulos (Universitat Pompeu Fabra).
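To make the hypo-elliptic setting concrete, the following sketch simulates a simple integrated diffusion in which only the smooth coordinate would be observed; the linear drift, the parameter values, and the Euler-Maruyama discretisation are illustrative assumptions, not the speaker's model.

```python
import numpy as np

def simulate_hypoelliptic(theta=1.0, sigma=0.5, T=10.0, n=10_000, seed=0):
    """Euler-Maruyama scheme for the hypo-elliptic pair
        dX_t = V_t dt                        (smooth coordinate, no direct noise)
        dV_t = -theta * V_t dt + sigma dW_t  (rough coordinate)
    Only X would be observed; noise reaches it indirectly through V,
    so the joint diffusion matrix diag(0, sigma^2) is degenerate."""
    dt = T / n
    rng = np.random.default_rng(seed)
    x = np.zeros(n + 1)
    v = np.zeros(n + 1)
    dw = rng.standard_normal(n) * np.sqrt(dt)
    for i in range(n):
        x[i + 1] = x[i] + v[i] * dt
        v[i + 1] = v[i] - theta * v[i] * dt + sigma * dw[i]
    return x, v

x, v = simulate_hypoelliptic()
```

Conditioning such a pair on discrete observations of X alone is precisely the bridge-sampling problem the talk addresses: standard elliptic bridge constructions break down because the noise does not act on X directly.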

February 28, 2017: No Seminar

February 21, 2017: Pasquale Cirillo (TU Delft) - Cancelled

When: Tuesday February 21st, 12:45
Where: TU Delft, Faculty EWI, Mekelweg 4, Room D@ta.

Interacting Urn Systems and a Financial Application

February 9 (Extra Thursday!!!), 2017: Gareth Roberts (University of Warwick)

When: Thursday February 9th, 12:45
Where: TU Delft, Faculty EWI, Mekelweg 4, Room D@ta.

Towards not being afraid of the big bad data set

February 7, 2017: Nick Wormald (Monash University)

When: Tuesday February 7th, 12:45
Where: TU Delft, Faculty EWI, Mekelweg 4, Room D@ta.

A natural infection model

Suppose that individuals are randomly placed points in space according to a Poisson process, and have two states, infected or healthy. Any infected individual passes the infection to any other individual at distance d according to a Poisson process whose rate f(d) decreases with d. Any infected individual heals at rate 1. Initially, one individual is infected. An epidemic is said to occur when the infection lasts forever. We investigate conditions on f under which the probability of an epidemic is nonzero. This is joint work with Josep Diaz and Xavier Perez Gimenez.
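The dynamics described above can be sketched with a Gillespie-style simulation on a finite box; the infection profile f(d) = lam * exp(-d), the box size, the intensity, and the time horizon are all illustrative assumptions, not choices from the talk.

```python
import numpy as np

def simulate_epidemic(intensity=1.0, L=6.0, lam=2.0, t_max=20.0, seed=1):
    """Continuous-time simulation of the infection model on a Poisson
    point process in [0, L]^2: infected -> healthy at rate 1, and an
    infected point at distance d infects a healthy one at rate
    f(d) = lam * exp(-d).  Returns (time, #infected) after each event."""
    rng = np.random.default_rng(seed)
    n = rng.poisson(intensity * L * L)
    pts = rng.uniform(0, L, size=(n, 2))
    d = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
    f = lam * np.exp(-d)                 # pairwise infection rates f(d)
    np.fill_diagonal(f, 0.0)
    infected = np.zeros(n, dtype=bool)
    infected[0] = True                   # one initially infected individual
    t, history = 0.0, [(0.0, 1)]
    while infected.any() and t < t_max:
        heal_rate = infected.sum()       # each infected heals at rate 1
        inf_rate = f[infected][:, ~infected].sum()
        total = heal_rate + inf_rate
        t += rng.exponential(1.0 / total)
        if rng.random() < heal_rate / total:      # healing event
            infected[rng.choice(np.flatnonzero(infected))] = False
        else:                                     # infection event
            rates = f[infected][:, ~infected].sum(axis=0)
            j = rng.choice(np.flatnonzero(~infected), p=rates / rates.sum())
            infected[j] = True
        history.append((t, int(infected.sum())))
    return history

history = simulate_epidemic()
```

On the infinite Poisson process an "epidemic" means this count never hits zero; in a finite box one can only watch whether the infection dies out quickly or survives until the time horizon, which is the finite-volume shadow of the question the talk studies.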

January 31, 2017: Guido Bacciagaluppi (Utrecht University)

When: Tuesday January 31st, 12:45
Where: TU Delft, Faculty EWI, Mekelweg 4, Room D@ta.

Quantum probability and contextuality

In this talk, I shall introduce the generalised theory of probability that arises naturally in quantum mechanics, emphasising its understanding in terms of 'contextuality', and discussing whether and in what sense modelling such phenomena indeed requires going beyond Kolmogorovian probability. 

January 24, 2017: Cancelled

January 17, 2017: Arnaud Le Ny (Université Paris-Est Marne-la-Vallée)

When: Tuesday January 17th, 12:45
Where: TU Delft, Faculty EWI, Mekelweg 4, Room D@ta.

(Very) Persistent Random Walks

In this talk, we shall describe recent works [1] and (maybe) [2] in which we investigate asymptotic properties of one-dimensional very persistent random walks (PRW). PRWs are correlated walks whose increments are, in contrast to those of simple random walks, not i.i.d. but rather dependent in a Markov (finite-order) way. They have been widely studied since the middle of the last century under different names, such as Goldstein-Kac, correlated, or persistent walks. Due to the extra memory induced by the increments, these random walks are no longer Markov processes. By very persistent we mean here a model in which even the increments are not Markov, but rather Variable Length Markov Chains whose conditional laws depend directly on the time already spent in the given direction. Equivalently, we are given two independent sequences of i.i.d. persistence times, in a general, possibly non-summable framework that extends previous work of Mauldin et al. on directionally reinforced random walks [3]. Using an extension of Erickson's criteria [4], we provide a general classification of recurrence vs. transience in terms of drift or tail properties depending on the initial laws, and we also identify different regimes in the scaling limits for persistence times lying in the basin of attraction of stable laws.
This is a joint work with P. Cénac (Dijon), B. de Loynes (Rennes) and Y. Offret (Dijon).

[1] P. Cénac, A. Le Ny, B. de Loynes, Y. Offret. Persistent Random Walks I: Recurrence vs. Transience. J. Theor. Probab. 29, 2016/17.

[2] P. Cénac, A. Le Ny, B. de Loynes, Y. Offret. Persistent Random Walks II: Functional Limit Theorems. Preprint.

[3] R. D. Mauldin, M. Monticino, H. von Weizsäcker. Directionally Reinforced Random Walks. Adv. in Math. 117, no. 2: 239–252, 1996.
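The construction via two independent sequences of i.i.d. persistence times can be sketched as follows: the walk moves +1 for one persistence time, then -1 for the next, and so on. The Pareto law with exponent 0.8 (infinite mean, i.e. the non-summable regime) and the number of runs are illustrative assumptions, not the laws studied in the papers.

```python
import numpy as np

def persistent_walk(n_runs=200, alpha=0.8, seed=0):
    """One-dimensional very persistent random walk built from two
    independent i.i.d. sequences of persistence times: +1 steps for
    tau_up[0] units of time, then -1 steps for tau_down[0], and so on.
    alpha < 1 gives infinite-mean (non-summable) persistence times."""
    rng = np.random.default_rng(seed)
    tau_up = np.ceil(rng.pareto(alpha, n_runs)).astype(int)    # +1 run lengths
    tau_down = np.ceil(rng.pareto(alpha, n_runs)).astype(int)  # -1 run lengths
    steps = np.concatenate([np.concatenate([np.ones(u), -np.ones(d)])
                            for u, d in zip(tau_up, tau_down)])
    return np.cumsum(steps)

walk = persistent_walk()
```

Because each increment repeats the previous one for a whole (possibly very heavy-tailed) persistence time, the position process is not Markov, and the tails of tau_up and tau_down are exactly what the recurrence/transience classification and the stable scaling limits in the talk are expressed in.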