Overhead plan for The Snow Cruiser, designed and checked by Dr. Thomas C. Poulter, drawn by C.E. Hammett, and traced by E.J. Loutzenheiser. The Snow Cruiser was designed by Armour Research Foundation Scientific Director Thomas C. Poulter, constructed by ARF, and taken on Richard Byrd's third Antarctic expedition.
Plan for The Snow Cruiser, designed and checked by Dr. Thomas C. Poulter, drawn by C.E. Hammett, and traced by E.J. Loutzenheiser. The Snow Cruiser was designed by Armour Research Foundation Scientific Director Thomas C. Poulter, constructed by ARF, and taken on Richard Byrd's third Antarctic expedition.
Evening course catalog for the Institute of Design, 1950-1951. In addition to a descriptive listing of evening classes and a class schedule, the catalog includes the aims and history of the Institute of Design, a faculty roster, and other pertinent information about the enrollment process.
This paper closely examines HMMs in which all the hidden random variables are binary. Its main contributions are (1) a birational parametrization for every such HMM, with an explicit inverse for recovering the hidden parameters in terms of observables, (2) a semialgebraic model membership test for every such HMM, and (3) minimal defining equations for the 4-node fully binary model, comprising 21 quadrics and 29 cubics, which were computed using Gröbner bases in the cumulant coordinates of Sturmfels and Zwiernik. The new model parameters in (1) are rationally identifiable in the sense of Sullivant, Garcia-Puente, and Spielvogel, and each model's Zariski closure is therefore a rational projective variety of dimension 5. Gröbner basis computations for the model and its graph are found to be considerably faster using these parameters. In the case of two hidden states, item (2) supersedes a previous algorithm of Schönhuth which is only generically defined, and the defining equations (3) yield new invariants for HMMs of all lengths ≥ 4. Such invariants have been used successfully in model selection problems in phylogenetics, and one can hope for similar applications in the case of HMMs.
The motivation for this paper is the geometric approach to statistical learning of Bayesian network (BN) structures. We review three vector encodings of BN structures. The first one has been used by Jaakkola et al. [9] and also by Cussens [4]; the other two use special integral vectors formerly introduced, called imsets [18, 20]. The topic is the comparison of outer polyhedral approximations of the corresponding polytopes. We show how to transform the inequalities suggested by Jaakkola et al. [9] into the framework of imsets. The result of our comparison is the observation that the implicit polyhedral approximation of the standard imset polytope suggested in [21] gives a tighter approximation than the (transformed) explicit polyhedral approximation from [9]. As a consequence, we confirm a conjecture from [21] that the above-mentioned implicit polyhedral approximation of the standard imset polytope is an LP relaxation of that polytope. In the end, we review recent attempts to apply the methods of integer programming to learning BN structures and discuss the task of finding a suitable explicit LP relaxation in the imset-based approach.
We consider the real log canonical threshold for the learning model in Bayesian estimation. This threshold corresponds to a learning coefficient of generalization error in Bayesian estimation, which serves to measure learning efficiency in hierarchical learning models [30, 31, 33]. In this paper, we clarify the ideal which gives the log canonical threshold of the restricted Boltzmann machine and consider the learning coefficients of this model.