Maximum Likelihood Estimation for the Poisson-Binomial Distribution
via the EM Algorithm with Environmental Applications
Dimitris Karlis and Evdokia Xekalaki

The Poisson-Binomial distribution arises as the convolution of a Poisson with a Binomial distribution. The complicated form of its probability function has limited the applicability of the distribution in practical problems. In this note we fully describe ML estimation for the parameters of the Poisson-Binomial distribution. The EM algorithm is applied, exploiting the latent structure in the derivation of the distribution as a convolution of two non-observable random variables. Extensions of the algorithm to cover finite mixtures of this distribution are given. These algorithms can be very helpful in estimating the parameters of discrete-valued time series, in particular the INAR(1) and SINAR(1) processes. The developed algorithm is applied to simulated data. Environmental applications are also considered.
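The E-step of such an algorithm is tractable because, writing X = Y + Z with Y ~ Poisson(λ) and Z ~ Binomial(n, p), the conditional distribution of the latent Z given X = x has finite support. A minimal sketch of one such EM scheme follows; it is not the authors' implementation, n is taken as known, and the starting values and iteration count are arbitrary choices:

```python
import math

def pb_em(data, n, n_iter=200):
    """EM for X = Y + Z, Y ~ Poisson(lam), Z ~ Binomial(n, p), with n known.

    Treats (Y, Z) as the latent decomposition of each observed count.
    """
    lam, p = 1.0, 0.5  # crude starting values
    for _ in range(n_iter):
        ez_sum = 0.0  # running sum of E[Z | X = x] over the sample
        for x in data:
            # posterior weights of Z = k given X = x (unnormalised)
            ks = range(min(n, x) + 1)
            w = [math.comb(n, k) * p**k * (1 - p)**(n - k)
                 * lam**(x - k) / math.factorial(x - k) for k in ks]
            tot = sum(w)
            ez_sum += sum(k * wk for k, wk in zip(ks, w)) / tot
        m = len(data)
        ez = ez_sum / m
        p = ez / n                 # M-step for the binomial component
        lam = max(sum(data) / m - ez, 1e-8)  # M-step for the Poisson component
    return lam, p
```

Note that the M-step preserves the sample mean: lam + n*p equals the mean of the data at every iteration, so only the variance structure separates the two components.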

DIMITRIS KARLIS
Department of Statistics, AUEB
Athens, Greece
 
 


The Application of Space-Time Methods for the Description
and the Analysis of Greek Earthquake Catalogues
Athina Karvounaraki and John Panaretos

In this paper space-time shock point processes are used to model the occurrences of earthquakes over time and space in the Aegean and the surrounding area. This area is considered one of the most seismically active regions of the world, and the investigation and modelling of its seismicity is therefore of great interest. Methods such as kernel smoothing are used for the non-parametric estimation of the intensity function. Effects of spatial concentrations that persist in time at the same locations (clustering) are also taken into consideration.
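A minimal version of such a kernel intensity estimate, with a Gaussian kernel and no edge correction, can be sketched as follows (the bandwidth h and coordinates are illustrative choices, not values from the paper):

```python
import math

def kernel_intensity(points, x, y, h):
    """Gaussian kernel estimate of a planar point-process intensity at (x, y):

        lambda_hat(x, y) = (1 / h^2) * sum_i K((x - x_i)/h, (y - y_i)/h),

    with K the standard bivariate normal density and no edge correction.
    """
    s = 0.0
    for xi, yi in points:
        u, v = (x - xi) / h, (y - yi) / h
        s += math.exp(-0.5 * (u * u + v * v)) / (2.0 * math.pi)
    return s / (h * h)
```

Evaluating this on a grid over the study region gives the smoothed seismicity map; persistent spatial clusters show up as peaks that recur when the estimate is computed over successive time windows.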

ATHINA KARVOUNARAKI
Department of Statistics, AUEB
Athens, Greece

Multinomial Probit Model
M. Linardakis and P. Dellaportas

New Bayesian perspectives for the analysis of stated preference data using the multinomial probit model are presented. The resulting model formulations give rise to the so-called multivariate probit model that emerges from a series of ranking responses in a set of hypothetical scenarios. Our methodological contributions consist of the following. First, we propose a Gibbs sampler which ensures identified parameters for the covariance matrix of the underlying utility vectors. Second, we enhance the multivariate probit model with the embodiment of a utility threshold parameter which deals realistically with ranking responses, transitivity of indifference among alternatives, or ties. A further improvement upon the above model is the inclusion of a hierarchical step which models the unit-specific utility thresholds as exchangeably distributed. Finally, we permit the use of heavy-tailed distributions for the stochastic error term. The implementation tool adopted is MCMC. The use of the proposed methodology is illustrated by a real data application from a stated preference experiment about three main transportation modes in the city of Athens.
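The data-augmentation step at the heart of such a Gibbs sampler redraws latent utilities from truncated normals consistent with the observed ranking. The sketch below illustrates only that one step, with independent unit-variance utilities rather than the correlated utilities, thresholds, and heavy-tailed errors of the model above; all names and defaults are illustrative:

```python
import math
import random

def trunc_norm(mu, lo, hi):
    """N(mu, 1) truncated to (lo, hi), by simple rejection (fine for wide intervals)."""
    while True:
        z = random.gauss(mu, 1.0)
        if lo < z < hi:
            return z

def gibbs_utilities(mu, ranking, n_sweeps=20):
    """Redraw latent utilities U_j ~ N(mu[j], 1) subject to the ordering
    implied by `ranking` (most preferred item first)."""
    # feasible start: strictly decreasing utilities in ranking order
    u = {item: float(len(ranking) - pos) for pos, item in enumerate(ranking)}
    for _ in range(n_sweeps):
        for pos, item in enumerate(ranking):
            hi = u[ranking[pos - 1]] if pos > 0 else math.inf
            lo = u[ranking[pos + 1]] if pos + 1 < len(ranking) else -math.inf
            u[item] = trunc_norm(mu[item], lo, hi)
    return u
```

In the full sampler this block alternates with draws of the regression parameters, the (identified) covariance matrix, and the unit-specific thresholds.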

M. LINARDAKIS
Department of Statistics, AUEB
Athens, Greece
 
 

Risk Processes with Delayed Claims Perturbed by Diffusion
G.D. Makatis and M.A. Zazanis

We consider risk processes with delayed claims perturbed by diffusion and obtain asymptotic expressions for the ruin probability using both change of measure arguments and a direct renewal-theoretic approach. Refined asymptotics are obtained in the second case via the use of Laplace transforms.

G.D. MAKATIS
Department of Statistics, AUEB
Athens, Greece
 
 

Control Charts for the Lognormal Distribution
Petros Maravelakis, John Panaretos, and Stelios Psarakis

Control Charts are the main tools of Statistical Process Control. They are used for deciding whether a process is statistically stable or not. Much theory and many applications have been developed for the Gaussian (Normal) distribution in this area. However, real data sets usually involve nonnormal processes, to which this theory does not apply.
In the present paper, we focus attention on the lognormal distribution, which can be considered a special nonnormal case. In particular, we present the Shewhart Control Charts developed to date under this distributional assumption, together with a new Control Chart based on CUSUM theory.
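One common device, assumed here purely for illustration and not taken from the paper, is to chart on the log scale, where lognormal data become normal; Shewhart limits are then transformed back, and the CUSUM recursion is run on the logged observations:

```python
import math

def lognormal_limits(samples, k=3.0):
    """Shewhart-style limits for individual lognormal observations:
    normal-theory k-sigma limits on the log scale, mapped back with exp."""
    logs = [math.log(x) for x in samples]
    n = len(logs)
    mu = sum(logs) / n
    sd = math.sqrt(sum((v - mu) ** 2 for v in logs) / (n - 1))
    return math.exp(mu - k * sd), math.exp(mu + k * sd)

def cusum_upper(values, target, slack):
    """Upper one-sided CUSUM path S_t = max(0, S_{t-1} + x_t - target - slack)."""
    s, path = 0.0, []
    for v in values:
        s = max(0.0, s + v - target - slack)
        path.append(s)
    return path
```

A signal is raised when the CUSUM path crosses a decision interval h chosen for the desired in-control run length.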

PETROS MARAVELAKIS
Department of Statistics, AUEB
Athens, Greece
 
 

Methods of Expanding Abridged Life Tables: Method Evaluation and Comparisons
Vangelis Panousis and Anastasia Kostaki

The problem of estimating the age-specific mortality pattern from data given in age intervals has been extensively discussed in the demographic, biostatistical, as well as the actuarial literature. The main reasons for providing data in an abridged form are related to the phenomenon of "age heaping" caused by age misstatements in data registration, and also to the unstable mortality probability estimates provided by insufficiently large samples.
In this study we review, evaluate and compare the various methods that have appeared in the literature for expanding an abridged life table to a complete one. We describe a parametric method, which requires the use of a parametric model; in our applications the 8-parameter Heligman–Pollard formula is used. A non-parametric method is also considered, which relates the target abridged life table to an existing complete one. We also describe an old method presented by Reed, which is applied and described in detail by V. Valaoras (1984), and a relatively new one, requiring five-year age groups, presented by J. Pollard (1989). The rest of the methods presented are applications of an interpolation formula to the survivor function lx of the abridged life table. A conventional interpolation technique is the application of a six-point Lagrangean interpolation; a set of other six-point interpolation formulas is also presented. Spline interpolation is a case of an osculatory interpolation technique which has lately received great attention. The performance of the various methods is evaluated by applying each one of them to several life tables of different populations (e.g. Sweden, France, Italy, New Zealand, Germany, etc.).
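The interpolation-based expansions can be illustrated with the generic Lagrange formula: applied to six pivotal ages and the corresponding abridged lx values, it yields the six-point variant mentioned above. The ages and survivor values below are made-up illustrations, not data from the study:

```python
def lagrange(xs, ys, x):
    """Evaluate the Lagrange interpolating polynomial through (xs[i], ys[i]) at x."""
    total = 0.0
    for i, xi in enumerate(xs):
        term = ys[i]
        for j, xj in enumerate(xs):
            if j != i:
                term *= (x - xj) / (xi - xj)
        total += term
    return total

# expanding l_x at a single year from six five-year pivotal ages (illustrative values)
pivot_ages = [0, 5, 10, 15, 20, 25]
pivot_lx = [100000, 98500, 98300, 98100, 97800, 97400]
l_7 = lagrange(pivot_ages, pivot_lx, 7)  # interpolated survivors at age 7
```

In practice the six pivotal ages slide along the table so that each single-year age is interpolated from nearby groups; spline methods replace the single high-degree polynomial with piecewise low-degree ones to avoid oscillation.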

VANGELIS PANOUSIS
Department of Statistics, AUEB
Athens, Greece
 
 

Stochastic Modelling of Rain Rate Processes Through a Diffusion Framework
Dimitra Pinotsi and Harry Pavlopoulos

The instantaneous intensity of rainfall at a point in space is called rain rate. Given an instantaneous map or snapshot of a rain field over a fixed geographic region one can obtain its average intensity, which is the spatial average rain rate of the field at that very instant. Sample functions of such averages are considered to be realizations of stochastic processes referred to as spatially averaged rain rate processes.
Previous statistical research shows that the probability distribution of rain rate, conditional on rain, is unimodal and highly skewed to the right. In the present work we examine whether this is in agreement with real rainfall data, and we also provide a stochastic model of rain rate processes.
The mathematical framework used is the theory of one-dimensional diffusion processes. Non-parametric regression methods were used in order to guide the choice of specific parametric curves needed to model the drift and diffusion coefficients of a diffusion process model. Suggestions are given on the form of the model sought. Finally, we check whether the scaling properties, which are characteristic of spatially averaged rain rate data, also hold under the probability density function obtained through the diffusion model approach.
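The non-parametric step can be sketched with Nadaraya–Watson estimates of the drift and squared diffusion coefficient from a discretely sampled path; the Gaussian kernel and bandwidth h are illustrative choices, and this is only a schematic version of such a procedure:

```python
import math

def nw_drift_diffusion(path, dt, x, h):
    """Kernel estimates of drift b(x) and squared diffusion a(x) of a
    one-dimensional diffusion, from observations path[0], path[1], ...
    taken dt time units apart:

        b_hat(x) = sum_t K_h(X_t - x) (X_{t+1} - X_t) / (dt * sum_t K_h(X_t - x)),

    and a_hat(x) uses squared increments in place of increments.
    """
    num_b = num_a = den = 0.0
    for t in range(len(path) - 1):
        w = math.exp(-0.5 * ((path[t] - x) / h) ** 2)  # Gaussian kernel weight
        inc = path[t + 1] - path[t]
        num_b += w * inc
        num_a += w * inc * inc
        den += w
    return num_b / (den * dt), num_a / (den * dt)
```

Plotting these estimates against x is what guides the choice of parametric curves for the drift and diffusion coefficients.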

DIMITRA PINOTSI
Department of Statistics, AUEB
Athens, Greece
 
 

Application of Integer Valued Stochastic Processes to Modelling and Prediction
of Fractional Area where Rain Intensity Exceeds a Given Threshold Level
Dimitris Podiotis and Harry Pavlopoulos

Integer-valued auto-regressive stochastic processes INAR(1) and SINAR(1), with innovations which follow a simple or randomly mixed Poisson marginal distribution, are used to model time series of the fractional area of a fixed region where rain intensity exceeds a given threshold level. Fitting is accomplished by maximum likelihood estimation of the parameters involved, via the EM-algorithm, and the performance of predictions made by such fitted models is addressed. Furthermore, in the stationary case, it is investigated to what extent such models are consistent with scaling properties of the mean and variance of fractional area. In particular, scaling properties are considered both with respect to the spatial scale of the region (for fixed threshold level of intensity), and also with respect to the threshold scale (for fixed size of the region).
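The INAR(1) recursion X_t = α ∘ X_{t-1} + ε_t, with ∘ denoting binomial thinning, can be simulated directly. The sketch below uses plain Poisson innovations, the simplest member of the mixed-Poisson family mentioned above; parameter values in the usage are arbitrary:

```python
import math
import random

def binomial_thin(x, alpha):
    """Binomial thinning: alpha ∘ x = number of successes in x Bernoulli(alpha) trials."""
    return sum(random.random() < alpha for _ in range(x))

def poisson_draw(lam):
    """Poisson(lam) variate by inversion of the CDF."""
    u, k = random.random(), 0
    p = math.exp(-lam)
    s = p
    while u > s:
        k += 1
        p *= lam / k
        s += p
    return k

def simulate_inar1(alpha, lam, n, x0=0):
    """Simulate X_t = alpha ∘ X_{t-1} + e_t, e_t ~ Poisson(lam), for t = 1..n."""
    xs = [x0]
    for _ in range(n):
        xs.append(binomial_thin(xs[-1], alpha) + poisson_draw(lam))
    return xs
```

In the stationary case the marginal mean is λ/(1 − α), which is the quantity whose scaling in the region size and threshold level is examined in the abstract.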

DIMITRIS PODIOTIS
Department of Statistics, AUEB
Athens, Greece
 
 

Extreme Value Theory in Reinsurance
Ioannis Stamoulis and Nikos Fragos

The study of extremes, whether from a purely probabilistic or from a practical statistical viewpoint, has grown steadily over the last few years. Any statistical problem sensitive to extreme observations has to be approached by appropriate methods. Tail modeling and estimation of long return periods and large quantiles constitute important and controversial examples. Many traces of the theory of extreme value statistics can be found in the broad field of non-life insurance. While insurance mathematics is a natural prime domain of application of extreme value theory, actuaries, unlike engineers, have not yet fully developed and explored the statistical methodology of extremes. Theoretically, an insurance company can always safeguard itself against portfolio contamination caused by claims that should be considered extreme rather than average: indeed, thanks to a reinsurance contract the claim distribution is truncated to the right from the viewpoint of the ceding company. As such, the whole area of reinsurance is probably one of the most important fields of application of extreme value theory. Hence good estimates for the tails of loss severity distributions are essential for pricing or positioning high-excess and stop-loss layers in reinsurance. We describe parametric curve-fitting methods for modeling extreme historical losses. These methods revolve around the generalized Pareto distribution and are supported by extreme value theory. We summarize relevant theoretical results and examples and provide an extensive application to data on large fire insurance losses of a Greek insurance company.
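The peaks-over-threshold step behind such GPD fits can be sketched with simple method-of-moments estimates of the shape ξ and scale β; maximum likelihood is what is normally used in practice, and the threshold and loss figures in the test are illustrative:

```python
def gpd_moment_fit(losses, threshold):
    """Method-of-moments GPD fit to the excesses over `threshold`:
    with m and s2 the excess mean and variance,
        xi   = (1 - m^2/s2) / 2,
        beta = m (1 + m^2/s2) / 2.
    """
    exc = [x - threshold for x in losses if x > threshold]
    n = len(exc)
    m = sum(exc) / n
    s2 = sum((e - m) ** 2 for e in exc) / (n - 1)
    xi = 0.5 * (1.0 - m * m / s2)
    beta = 0.5 * m * (1.0 + m * m / s2)
    return xi, beta

def gpd_tail_quantile(threshold, xi, beta, p_exceed, p):
    """High quantile x_p from the fitted tail (xi != 0), where
    p_exceed estimates P(X > threshold), e.g. the exceedance fraction."""
    return threshold + (beta / xi) * ((p_exceed / (1.0 - p)) ** xi - 1.0)
```

Quantiles such as x_0.99 from the fitted tail are exactly what is needed for positioning high-excess and stop-loss layers.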

IOANNIS STAMOULIS
Department of Statistics, AUEB
Athens, Greece
 
 

Approximations of Risk Processes with Time-Varying Poisson Claims
Y. D. Zaphiropoulos and M.A. Zazanis

We consider risk processes when claims occur according to a time-varying Poisson process. The performance of diffusion approximations for the ruin probability in the light-tailed case, and approximations based on stable processes for the heavy-tailed case, is examined both analytically and experimentally.
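In the light-tailed case the diffusion approximation replaces the risk reserve by a Brownian motion matching its drift and variance rate. With premium rate c, claim arrival rate λ (here a constant standing in for a time average of the time-varying rate), and first and second claim-size moments m1 and m2, it gives the familiar exponential form sketched below; this is the classical approximation, not the paper's refined versions:

```python
import math

def diffusion_ruin_approx(u, c, lam, m1, m2):
    """Classical diffusion approximation of the infinite-horizon ruin
    probability for initial reserve u: the reserve is approximated by a
    Brownian motion with drift c - lam*m1 and variance rate lam*m2, so

        psi(u) ~= exp(-2 (c - lam*m1) u / (lam*m2)),

    valid for positive safety loading c - lam*m1 > 0.
    """
    drift = c - lam * m1
    var = lam * m2
    return math.exp(-2.0 * drift * u / var)
```

In the heavy-tailed case this exponential decay fails, which is why stable-process approximations are used there instead.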

Y. D. ZAPHIROPOULOS
Department of Statistics, AUEB
Athens, Greece
 
 
