Institutional Repository
-
- Proton structure from multiparticle contribution to elastic pp-scattering at 7 TeV
- The parton content of the proton receives a strong peripheral contribution at 1 fm from multiparticle dynamics, as revealed by the overlap function in the unitarity condition for elastic pp-scattering at 7 TeV., Sponsorship: IIT College of Science, High Energy Physics Division of Argonne National Laboratory
-
- Diffraction at the LHC
- Proton-proton collisions at the LHC can be classified as elastic, non-diffractive, and diffractive. In this paper we discuss measurements of these processes at various LHC experiments. We report on the total proton-proton cross-section measurements, on the analysis of diffractive events, and on the pseudorapidity distribution in inelastic events., Sponsorship: IIT College of Science, High Energy Physics Division of Argonne National Laboratory
-
- Study of the helix model
- The quantum properties of a helix-shaped QCD string are studied in the context of the semi-classical Lund fragmentation model. It is shown how simple quantization rules combined with causality considerations in the string fragmentation process describe the mass hierarchy of light pseudoscalar mesons. The quantized helix string model predicts observable quantum effects related to the threshold behaviour of the intrinsic transverse momentum of hadrons, and of the minimal transverse momentum difference between adjacent hadrons., Sponsorship: IIT College of Science, High Energy Physics Division of Argonne National Laboratory
-
- Measurement of the charged kaon correlations at small relative momentum in the SELEX experiment
- We report the first measurement of charged kaon Bose–Einstein correlations produced by 600 GeV/c Σ−, π− and 540 GeV/c p beams in the SELEX experiment. The SELEX (E781) experiment at Fermilab is a fixed-target three-stage magnetic spectrometer designed to study hadroproduction with high acceptance for forward interactions (xF ≥ 0.1). One-dimensional charged kaon correlation functions were obtained for all three beams and three pair transverse momentum ranges. The femtoscopic parameters for the radii and correlation strength of the kaon source were extracted. The fit results show that the emission-source radii decrease as the pair transverse momentum increases., Sponsorship: IIT College of Science, High Energy Physics Division of Argonne National Laboratory
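The extraction of a source radius and correlation strength from a one-dimensional correlation function is commonly done with a Goldhaber-type Gaussian parametrization, C(q) = N(1 + λ exp(−q²R²)). As a minimal sketch of such a fit (on synthetic data — the parametrization is standard in femtoscopy, but the specific numbers and function names here are illustrative, not SELEX's):

```python
import numpy as np
from scipy.optimize import curve_fit

# Goldhaber-type parametrization of the one-dimensional Bose-Einstein
# correlation function: C(q) = N * (1 + lambda * exp(-q^2 R^2)),
# where R is the emission-source radius and lambda the correlation strength.
def correlation(q, norm, lam, radius):
    return norm * (1.0 + lam * np.exp(-(q * radius) ** 2))

# Synthetic "measured" correlation function, for illustration only.
rng = np.random.default_rng(0)
q = np.linspace(0.01, 0.5, 50)             # relative momentum q, GeV/c
true = correlation(q, 1.0, 0.4, 5.0)       # R = 5 GeV^-1, i.e. ~1 fm
data = true + rng.normal(0.0, 0.005, q.size)

popt, pcov = curve_fit(correlation, q, data, p0=[1.0, 0.5, 4.0])
norm_fit, lam_fit, r_fit = popt
print(f"lambda = {lam_fit:.2f}, R = {r_fit:.2f} GeV^-1")
```

Repeating the fit in bins of pair transverse momentum would exhibit the decrease of the fitted radius reported in the abstract.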
-
- Searching for W' bosons at LHC with single top production
- One of the strengths of the LHC is its capacity for the discovery of new physics. Predicted by many BSM theories, the W′ boson is an ideal search target for constraining those models. One mode in particular has relatively low background: single top quark production mediated by a W′ boson. For W′ masses below 1500 GeV, all the decay products of the top quark are visible, and the strongest channel is the top's decay into an electron or muon, with the associated neutrino and a bottom jet. As the W′ mass increases, the decay products of the highly boosted top and bottom quarks from the W′ appear as fat jets; boosted top tagging algorithms abound, and we propose a boosted bottom tag to set an exclusion limit of 2750 GeV for standard-model-like couplings with existing 8 TeV data., Sponsorship: IIT College of Science, High Energy Physics Division of Argonne National Laboratory
-
- Large x physics: recent results and future plans
- The valence region is exceedingly important in hadron physics since this region not only defines a hadron but also is an excellent discriminator of nucleon structure models. Present uncertainties in light quark distribution functions at high x could also impact high energy physics. Here we will describe a new generation of experiments at Jefferson Lab that is aimed at the high x region of the nucleon. It is noted that the proposed Electron Ion Collider could explore the high x regime., Sponsorship: IIT College of Science, High Energy Physics Division of Argonne National Laboratory
-
- HERAFitter - an open source QCD fit framework
- The parton distribution functions (PDFs) describe the parton content of the proton. They cannot be calculated from first principles and must be determined empirically by fitting experimental observables to quantum chromodynamics (QCD) predictions. The HERAFitter project aims to provide a framework for QCD analyses of proton structure at leading order (LO), next-to-leading order (NLO) and next-to-next-to-leading order (NNLO) in perturbative QCD. The framework includes various modules and interfaces enabling a large number of theoretical and methodological options. It also allows one to study the impact of new experimental data on the PDFs from ep, pp and pp̅ scattering processes. The rapid development of the project involves active communication with theorists and experimentalists as well as close collaboration with the main PDF fitting groups. Full information about the project, downloads and documentation can be found at http://herafitter.org., Sponsorship: IIT College of Science, High Energy Physics Division of Argonne National Laboratory
-
- Improved isolation of the p-p underlying event based on minimum-bias trigger-associated hadron correlations
- Some aspects of hadron production in p-p collisions remain unresolved, including the low-hadron-momentum structure of high-parton-energy dijets, separation of triggered dijets from the underlying event (UE), the systematics of multiple parton interactions, and possible systematic underestimation of dijet contributions to high-energy nuclear collisions. In this study we apply a minimum-bias trigger-associated (TA) correlation analysis to p-p collisions. We extract a hard component from TA correlations that can be compared with measured jet fragment systematics derived from e+e− collisions. The kinematic limits on jet fragment production may be determined. The same method may be extended to A-A collisions, where the role of minimum-bias jets in spectra and correlations is strongly contested., Sponsorship: IIT College of Science, High Energy Physics Division of Argonne National Laboratory
-
- Exact tests to compare contingency tables under quasi-independence and quasi-symmetry
- In this work we define log-linear models to compare several square contingency tables under the quasi-independence or the quasi-symmetry model, and the relevant Markov bases are theoretically characterized. Through Markov bases, an exact test to evaluate whether two or more tables fit a common model is introduced. Two real-data examples illustrate the use of these models in different fields of application.
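The Markov-basis exact-test machinery can be illustrated on the simplest case, the ordinary independence model, where the basic +1/−1 moves on 2×2 minors form a Markov basis (the quasi-independence and quasi-symmetry bases characterized in the paper require additional moves). A minimal Metropolis sketch with illustrative data:

```python
import numpy as np
from math import lgamma

rng = np.random.default_rng(1)

def chi2_stat(t):
    """Pearson chi-square against the independence model."""
    exp = np.outer(t.sum(1), t.sum(0)) / t.sum()
    return ((t - exp) ** 2 / exp).sum()

def log_hypergeom(t):
    """Log of the conditional (hypergeometric) probability, up to a constant."""
    return -sum(lgamma(x + 1) for x in t.flat)

def propose(t):
    """+1/-1 swap on a random 2x2 minor; preserves both margins."""
    i, j = rng.choice(t.shape[0], 2, replace=False)
    k, l = rng.choice(t.shape[1], 2, replace=False)
    new = t.copy()
    new[i, k] += 1; new[j, l] += 1
    new[i, l] -= 1; new[j, k] -= 1
    return new

observed = np.array([[10, 2, 3], [4, 12, 5], [3, 6, 11]])  # illustrative table
stat_obs = chi2_stat(observed)
table, hits, n_steps = observed.copy(), 0, 20000
for _ in range(n_steps):
    cand = propose(table)
    if (cand >= 0).all() and np.log(rng.random()) < log_hypergeom(cand) - log_hypergeom(table):
        table = cand
    hits += chi2_stat(table) >= stat_obs
p_value = hits / n_steps
print(f"approximate exact p-value: {p_value:.3f}")
```

The Markov basis guarantees that these moves connect every table with the observed margins, so the chain's long-run fraction of tables at least as extreme as the observed one estimates the conditional p value.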
-
- Inference for Ordinal Log-Linear Models Based on Algebraic Statistics
- Tools of algebraic statistics combined with MCMC algorithms have been used in contingency table analysis for model selection and model fit testing of log-linear models. However, this approach has not been considered so far for association models, which are special log-linear models for tables with ordinal classification variables. The simplest association model for two-way tables, the uniform (U) association model, has just one parameter more than the independence model and is applicable when both classification variables are ordinal. Less parsimonious are the row (R) and column (C) effect association models, appropriate when at least one of the classification variables is ordinal. Association models have been extended to multidimensional contingency tables as well. Here, we adapt algebraic methods to the analysis of association models and investigate their eligibility, focusing mainly on two-way tables. They are implemented in the statistical software R and illustrated on real data tables. Finally, the algebraic model fit and selection procedure is assessed and compared to the asymptotic approach through a simulation study.
-
- Maximum Likelihood Estimation of the Latent Class Model through Model Boundary Decomposition
- The Expectation-Maximization (EM) algorithm is routinely used for maximum likelihood estimation in latent class analysis. However, the EM algorithm comes with no guarantee of reaching the global optimum. We study the geometry of the latent class model in order to understand the behavior of the maximum likelihood estimator. In particular, we characterize the boundary stratification of the binary latent class model with a binary hidden variable. For small models, such as for three binary observed variables, we show that this stratification allows exact computation of the maximum likelihood estimator. In this case we use simulations to study the maximum likelihood estimation attraction basins of the various strata and the performance of the EM algorithm. Our theoretical study is complemented with a careful analysis of the EM fixed point ideal, which provides an alternative method of studying the boundary stratification and maximizing the likelihood function. In particular, we compute the minimal primes of this ideal in the case of a binary latent class model with a binary or ternary hidden random variable.
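The EM iteration studied here can be sketched for the smallest case in the abstract: a binary hidden class and three binary observed variables. This is a generic textbook EM for a mixture of product-Bernoulli distributions, not the paper's boundary-decomposition method; the data and variable names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)
# The 8 cells of the 2x2x2 observed table, as 0/1 triples.
cells = np.array([(a, b, c) for a in (0, 1) for b in (0, 1) for c in (0, 1)])

def cell_lik(theta):
    """Likelihood of each of the 8 cells under each of the 2 hidden classes."""
    lik = np.ones((8, 2))
    for h in range(2):
        for j in range(3):
            lik[:, h] *= np.where(cells[:, j] == 1, theta[h, j], 1 - theta[h, j])
    return lik

def em_latent_class(counts, n_iter=300):
    freq = counts.reshape(-1).astype(float)
    n = freq.sum()
    pi = np.array([0.6, 0.4])                    # mixing weights P(H = h)
    theta = rng.uniform(0.3, 0.7, size=(2, 3))   # P(X_j = 1 | H = h)
    for _ in range(n_iter):
        # E-step: responsibility of each hidden class for each cell.
        joint = cell_lik(theta) * pi
        resp = joint / joint.sum(axis=1, keepdims=True)
        # M-step: frequency-weighted updates of pi and the Bernoulli parameters.
        w = resp * freq[:, None]
        pi = w.sum(0) / n
        theta = (w.T @ cells) / w.sum(0)[:, None]
    loglik = (freq * np.log(cell_lik(theta) @ pi)).sum()
    return pi, theta, loglik

counts = np.array([20, 5, 6, 4, 3, 7, 6, 25]).reshape(2, 2, 2)  # illustrative data
pi, theta, loglik = em_latent_class(counts)
print("class weights:", np.round(pi, 3))
```

EM converges only to a stationary point of the likelihood; restarting from several random `theta` initializations and keeping the best `loglik` is the usual workaround for the lack of global guarantees noted above.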
-
- A Euclidean norm based criterion to assess robots' 2D path-following performance
- A current need in the robotics field is the definition of methodologies for quantitatively evaluating the results of experiments. This paper contributes to this by defining a new criterion for assessing path-following tasks in the planar case, that is, for evaluating the performance of robots that are required to follow a desired reference path. This criterion comes from the study of the local differential geometry of the problem. New conditions for deciding whether or not the zero locus of a given polynomial intersects the neighbourhood of a point are defined. Based on this, new algorithms are presented and tested on both simulated data and experiments conducted at sea employing an Unmanned Surface Vehicle.
-
- On multivariable cumulant polynomial sequences with applications
- A new family of polynomials, called the cumulant polynomial sequence, and its extension to the multivariate case are introduced, relying on a purely symbolic combinatorial method. The coefficients are cumulants, but depending on what is plugged into the indeterminates, moment sequences can be recovered as well. The main tool is a formal generalization of random sums, in which a not necessarily integer-valued multivariate random index is considered. Applications are given in parameter estimation, Lévy processes and random matrices and, more generally, in problems involving multivariate functions. The connection between exponential models and multivariable Sheffer polynomial sequences offers a different viewpoint on employing the method. Some open problems end the paper.
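The cumulant-moment relation underlying these polynomial sequences can be illustrated in the classical univariate case via the recursion κₙ = mₙ − Σₖ₌₁ⁿ⁻¹ C(n−1, k−1) κₖ mₙ₋ₖ; the paper's symbolic method generalizes this to the multivariate setting. A minimal sketch (the function name is illustrative):

```python
from math import comb

# Classical univariate cumulant-moment recursion:
#   kappa_n = m_n - sum_{k=1}^{n-1} C(n-1, k-1) * kappa_k * m_{n-k}
def cumulants(moments):
    """moments[n] = E[X^n] with moments[0] = 1; returns kappa_1..kappa_N."""
    kappa = [0] * len(moments)
    for n in range(1, len(moments)):
        kappa[n] = moments[n] - sum(
            comb(n - 1, k - 1) * kappa[k] * moments[n - k] for k in range(1, n)
        )
    return kappa[1:]

# Moments of a standard normal: 1, 0, 1, 0, 3 -> cumulants 0, 1, 0, 0
print(cumulants([1, 0, 1, 0, 3]))
```

That all cumulants beyond the second vanish is the well-known characterization of the Gaussian, recovered here directly from the recursion.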
-
- Mixtures and products in two graphical models
- We compare two statistical models of three binary random variables. One is a mixture model and the other is a product of mixtures model called a restricted Boltzmann machine. Although the two models look different in their parametrizations, we show that they represent the same set of distributions on the interior of the probability simplex, and are equal up to closure. We give a semi-algebraic description of the model in terms of six binomial inequalities and obtain closed-form expressions for the maximum likelihood estimates. We briefly discuss extensions to larger models.
-
- The geometry of Sloppiness
- The use of mathematical models in the sciences often requires the estimation of unknown parameter values from data. Sloppiness provides information about the uncertainty of this task. In this paper, we develop a precise mathematical foundation for sloppiness and rigorously define its key concepts, such as `model manifold', in relation to concepts of structural identifiability. We redefine sloppiness conceptually as a comparison between the premetric on parameter space induced by measurement noise and a reference metric. This opens up the possibility of alternative quantifications of sloppiness, beyond the standard use of the Fisher Information Matrix, which assumes that parameter space is equipped with the usual Euclidean metric and that the measurement error is infinitesimal. Applications include parametric statistical models, explicit time-dependent models, and ordinary differential equation models.
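The standard Fisher Information Matrix diagnostic that this paper generalizes can be sketched on a classic sloppy example, a sum of exponentials with unknown decay rates: a wide spread in FIM eigenvalues signals directions in parameter space that the data barely constrain. The model and numbers below are illustrative.

```python
import numpy as np

# Model y(t) = exp(-k1 t) + exp(-k2 t) observed with Gaussian noise sigma;
# for such models the FIM is J^T J / sigma^2, where J is the Jacobian of
# the model output with respect to the parameters (k1, k2).
t = np.linspace(0.1, 3.0, 30)
k1, k2, sigma = 1.0, 2.0, 0.1

J = np.column_stack([-t * np.exp(-k1 * t), -t * np.exp(-k2 * t)])
fim = J.T @ J / sigma**2
eigs = np.sort(np.linalg.eigvalsh(fim))[::-1]
print("FIM eigenvalue ratio:", eigs[0] / eigs[1])
```

The large eigenvalue ratio reflects that the two decay rates are nearly interchangeable over this time window, i.e. the "sloppy" direction; the premetric viewpoint in the paper quantifies the same phenomenon without assuming infinitesimal noise.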
-
- Ideal-Theoretic Strategies for Asymptotic Approximation of Marginal Likelihood Integrals
- The accurate asymptotic evaluation of marginal likelihood integrals is a fundamental problem in Bayesian statistics. Following the approach introduced by Watanabe, we translate this into a problem of computational algebraic geometry, namely, to determine the real log canonical threshold of a polynomial ideal, and we present effective methods for solving this problem. Our results are based on resolution of singularities. They apply to parametric models where the Kullback-Leibler distance is upper and lower bounded by scalar multiples of some sum of squared real analytic functions. Such models include finite state discrete models.
-
- Markov bases for two-way change-point models of ladder determinantal tables
- To evaluate the goodness-of-fit of a statistical model to given data, calculating a conditional p value by a Markov chain Monte Carlo method is an effective approach. For this purpose, a Markov basis plays an important role because it guarantees the connectivity of the chain, which is needed for unbiasedness of the estimation, and has therefore been investigated in various settings such as incomplete tables or subtable sum constraints. In this paper, we consider the two-way change-point model for the ladder determinantal table, which is an extension of these two previous works, i.e., the work on incomplete tables by Aoki and Takemura (2005, J. Stat. Comput. Simulat.) and on subtable sum constraints by Hara, Takemura and Yoshida (2010, J. Pure Appl. Algebra). Our main result is based on the theory of Gröbner bases for the distributive lattice. We give a numerical example for actual data.
-
- Mode Poset Probability Polytopes
- A mode of a probability distribution is an elementary event that has more probability mass than each of its direct neighbors, with respect to some vicinity structure on the set of elementary events. The mode inequalities cut out a polytope from the simplex of probability distributions. Related to this is the concept of strong modes. A strong mode is an elementary event that has more probability mass than all its direct neighbors together. The set of probability distributions with a given set of strong modes is again a polytope. We study the vertices, the facets, and the volume of such polytopes depending on the sets of (strong) modes and the vicinity structures.
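The defining inequalities of these polytopes are easy to check directly. A minimal sketch, taking the vicinity structure to be the path graph on five elementary events (an illustrative choice):

```python
# Mode inequalities on a path graph with 5 elementary events:
# a mode beats each neighbor separately; a strong mode beats them together.
p = [0.05, 0.30, 0.28, 0.02, 0.35]
neighbors = {i: [j for j in (i - 1, i + 1) if 0 <= j < len(p)] for i in range(len(p))}

def is_mode(p, i):
    """More probability mass than each direct neighbor separately."""
    return all(p[i] > p[j] for j in neighbors[i])

def is_strong_mode(p, i):
    """More probability mass than all direct neighbors together."""
    return p[i] > sum(p[j] for j in neighbors[i])

modes = [i for i in range(len(p)) if is_mode(p, i)]
strong = [i for i in range(len(p)) if is_strong_mode(p, i)]
print("modes:", modes, "strong modes:", strong)  # modes: [1, 4] strong modes: [4]
```

Event 1 is a mode (it beats 0.05 and 0.28 separately) but not a strong mode (0.30 < 0.05 + 0.28), illustrating why the strong-mode polytope sits strictly inside the mode polytope. Since all the inequalities are linear in p, each set of (strong) modes indeed cuts a polytope out of the simplex.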
-
- On Exchangeability in Network Models
- We derive representation theorems for exchangeable distributions on finite and infinite graphs using elementary arguments based on geometric and graph-theoretic concepts. Our results elucidate some of the key differences, and their implications, between statistical network models that are finitely exchangeable and models that define a consistent sequence of probability distributions on graphs of increasing size.
-
- Cubature Rules and Expected Value of Some Complex Functions
- The expected value of some complex-valued random vectors is computed by means of the indicator function of a designed experiment, as known in algebraic statistics. The general theory is set up and results are obtained for finite discrete random vectors and the Gaussian random vector. The precision space of some cubature rules/designed experiments is determined.
