Search results
(281 - 300 of 324)
Pages
- Title
- MODELING THE ORIGIN AND INCEPTION OF FATIGUE FAILURE IN METALS THROUGH A NON-LINEAR MECHANICAL SPRING SYSTEM
- Creator
- Divani, Abbas
- Date
- 2019
- Description
-
Fatigue failure of metals is one aspect of material behavior that has not been explained through direct calculation or mathematical modeling. This research undertakes the development of an innovative model for predicting the fatigue behavior of metals. The model is based on the behavior of a system of a large number of parallel elements, each composed of two springs, a string, and a mass block, subjected to cyclically varying or random load cycles. Failures among the elements occur at random and can be used to simulate fatigue damage and fatigue behavior. Initial studies of this model have produced promising results. This research is intended to fully develop and implement the model, including development of a method for calibrating the model parameters using the common mechanical properties of steel, extension of the model to incorporate the hysteresis behavior of steel under cyclic loading, and development of constant fatigue life diagrams such as the Goodman diagram.
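The parallel-element idea in this abstract lends itself to a short illustration. The sketch below is a hypothetical fiber-bundle-style reduction, not the author's two-spring/string/mass formulation: each element draws a random fatigue life, and accumulated damage is the fraction of elements that have failed after a given number of cycles. The lognormal life scatter and all parameter values are assumptions.

```python
import math
import random

def fatigue_life_samples(n_elements, median_life=1e5, scatter=0.5, seed=0):
    # Hypothetical: each parallel element gets a lognormally scattered
    # fatigue life (cycles to failure) around a median value.
    rng = random.Random(seed)
    return [median_life * math.exp(scatter * rng.gauss(0, 1))
            for _ in range(n_elements)]

def damage_fraction(lives, n_cycles):
    # Damage proxy: fraction of elements already failed after n_cycles.
    return sum(life <= n_cycles for life in lives) / len(lives)

lives = fatigue_life_samples(1000)
# damage grows monotonically with applied cycles, mimicking fatigue accumulation
early, late = damage_fraction(lives, 1e4), damage_fraction(lives, 1e6)
```

Calibrating the median life and scatter against steel's common mechanical properties, as the abstract proposes, would replace these placeholder values.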
- Title
- WIM BASED LIVE LOAD FACTORS FOR CONSISTENT ILLINOIS BRIDGE RELIABILITY
- Creator
- Chi, Jingya
- Date
- 2019
- Description
-
The Load and Resistance Factor Rating (LRFR) approach was developed in the early 2000s. The live-load factors were calibrated at that time so that bridges rated by the LRFR approach could achieve a uniform structural reliability. However, the first calibration of the live-load factors was intended for application across the entire nation, without considering state-specific traffic conditions and truck restrictions. In addition, the calibration was carried out using limited data collected from the weigh stations of Ontario, Canada, in the 1970s. The need to develop a practice that is consistent with the current LRFR approach while accounting for state-specific live-load effects motivated this study. We study the weigh-in-motion (WIM) data that have been collected by the Departments of Transportation of several states (i.e., Michigan, New York, Minnesota, California, Illinois, Oregon, Kentucky, and Pennsylvania). These data contain approximately four years (2013 to 2017) of continuously recorded trucks. They provide information about truck weight and configuration, as well as truck traffic patterns. In this research, we focus on the data collected by the Illinois DOT to calibrate live-load factors for the Illinois LRFR highway bridge evaluation. We first propose and verify a simulation method to statistically restore the missing trucks in the second lanes of the Illinois data. Based on the concept of relative calibration, we propose three sets of live-load factors for the Illinois legal, routine permit, and special permit load ratings. We then conduct a sensitivity analysis on overweight trucks. Finally, we study the effect of law enforcement on the calibration.
- Title
- MULTI-OBJECTIVE OPTIMIZATION FOR LEED - NEW CONSTRUCTION USING GENETIC ALGORITHMS
- Creator
- ALOTHAIMEEN, IBRAHEEM
- Date
- 2018
- Description
-
In the U.S., the building sector is responsible for 73% of electricity usage, 38% of CO2 emissions, and 13.6% of potable water consumption. These data indicate that the construction industry negatively impacts the global environment and natural resources. The concept of “sustainability” was introduced to set guidelines for the construction industry to limit its negative environmental impact. To promote sustainability in the construction industry, many organizations have introduced guidelines and rating systems for buildings. One of these rating systems is Leadership in Energy and Environmental Design (LEED), the most globally acknowledged system. Although LEED excels at reducing the negative environmental impacts and energy consumption of buildings, the high early-phase costs associated with implementing and pursuing LEED certification deter some project owners from entering the process. Therefore, an approach that optimizes multiple objectives is needed to balance these objectives in sustainable projects. In this study, a multi-objective optimization framework using the Non-dominated Sorting Genetic Algorithm-II (NSGA-II) is proposed to find the optimal solution in terms of life-cycle cost and sustainability for a new construction project pursuing LEED v4 BD+C certification. A BIM project of a 3-floor educational building was selected as a case study and is used to verify the efficiency and soundness of the proposed model. The results show that the method does indeed lead to optimal solutions.
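NSGA-II's core step, non-dominated sorting, can be sketched briefly. The snippet below extracts the first Pareto front from hypothetical (cost, negated-sustainability) pairs; the objective values and design alternatives are made up for illustration, and real NSGA-II adds crowding distance, selection, crossover, and mutation on top of this step.

```python
def dominates(a, b):
    # a dominates b (minimization): no worse in every objective,
    # strictly better in at least one.
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    # First front of non-dominated sorting: points no other point dominates.
    return [p for p in points if not any(dominates(q, p) for q in points if q != p)]

# hypothetical (life-cycle cost, -sustainability score) design alternatives
designs = [(10, -40), (12, -55), (9, -30), (12, -50), (15, -60)]
front = pareto_front(designs)  # (12, -50) is dominated by (12, -55)
```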
- Title
- NUMERICAL STUDY OF MICRO, MESO, AND MACRO-MECHANICAL BEHAVIOR OF COHESIONLESS GRANULAR MATERIALS USING 3D DEM ROLLING/TWISTING RESISTANCE MODELS
- Creator
- Goudarzi, Nima
- Date
- 2018
- Description
-
It has been frequently demonstrated that the mechanical behavior of cohesionless granular materials, including sand and gravel, is significantly influenced by their morphological features, including shape and surface texture. Therefore, the primary objective of this thesis is to take a more critical look at the micro-, meso-, and macro-mechanical behaviors of cohesionless granular materials in response to effective modeling of grain morphology, and to establish a practical yet straightforward causal relationship between micro-scale modeling and macro-scale soil behavior. To precisely investigate the effects of morphology on macroscopic behavior, a particle-based microscopic approach using the Discrete Element Method (DEM) was employed. A novel 3D micro-mechanical contact model, based on moment transfer law (MTL) theory and incorporating both rolling and twisting resistances, was selected to describe the inter-granular behavior between cohesionless particles. Several triaxial and direct shear tests were run to characterize the links, if any, between micro-scale features and macroscopic soil responses. Results from these tests were analyzed at both the peak and critical states. Through the development of a comprehensive calibration methodology and the finding of a reasonable match between numerical and experimental results, it was found that even in the ideal case of perfectly spherical grains, it is still possible to effectively model the presence and effects of influential micro-scale morphological features without directly modeling geometrical complexities, which bring challenging issues such as limited computational resources and nearly unresolvable difficulties in tracing the evolution of the modeled morphology during loading.
- Title
- ESTIMATES OF FINE AND ULTRAFINE PARTICLE REMOVAL EFFICIENCY FOR RESIDENTIAL HVAC FILTERS USING IN-SITU SIZE-RESOLVED EFFICIENCY MEASUREMENTS
- Creator
- Zeng, Yicheng
- Date
- 2018
- Description
-
Central heating, ventilating, and air-conditioning (HVAC) filters are commonly evaluated for their size-resolved particle removal efficiency (for particles 0.3 to 10 µm in diameter) by challenging them with a test aerosol in a laboratory setting. However, the aerosol measurement and reporting classifications most commonly used in regulatory monitoring and building measurements are integral measures of mass-based concentrations (e.g., PM2.5, the mass concentration of particles smaller than 2.5 µm) or total number concentrations (e.g., total UFPs, ultrafine particles smaller than 100 nm). Because filter test standards have not traditionally considered these measures, building owners, occupants, and other key personnel cannot make informed decisions on HVAC filtration for these classifications. Moreover, because the removal efficiency for integral measures of total mass and number concentrations is also a function of the underlying particle size distribution that challenges the filter, one must consider the varied sources and size distributions of aerosols that filters encounter in real building applications. This work has two objectives: (1) to measure the in-situ size-resolved particle removal efficiency of a large number of commercially available residential HVAC filters, and (2) to use those size-resolved efficiency data to estimate integral measures of PM2.5 and total UFP removal efficiency for the same filters in typical residential indoor settings, based on a literature survey of measured indoor particle size distributions. Particle concentration measurements were made upstream and downstream of a wide range of commercially available filters installed in a central air handling unit in an unoccupied residential apartment unit. A literature review was conducted to gather a variety of indoor particle size distributions (PSDs) from across the world, and tri-modal lognormal distributions were fit to each of them. Finally, the particle removal efficiency of each filter for integral measures of indoor UFPs and PM2.5 was calculated for each indoor PSD. In-situ size-resolved measurements indicate that filters with similar rating values but from different manufacturers can have very different removal efficiencies for integral measures of PM2.5 and total UFPs, and that the assumed indoor PSD can greatly affect estimates of removal efficiency.
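The mapping from size-resolved efficiency to an integral PM2.5 efficiency described here is essentially a PSD-weighted average. The sketch below assumes a single lognormal mode standing in for the indoor mass distribution and a made-up efficiency curve; the thesis fits tri-modal distributions to measured PSDs, and the multi-mode mass weighting is omitted here for brevity.

```python
import math

def lognormal_pdf_ln(dp, gmd, gsd):
    # Lognormal mode evaluated in ln-diameter space
    # (gmd: geometric mean diameter, gsd: geometric standard deviation).
    s = math.log(gsd)
    z = (math.log(dp) - math.log(gmd)) / s
    return math.exp(-0.5 * z * z) / (s * math.sqrt(2 * math.pi))

def integral_efficiency(eta, gmd, gsd, dmin=0.01, dmax=2.5, n=400):
    # PSD-weighted integral removal efficiency over [dmin, dmax] µm:
    # trapezoidal integration of eta(dp) against one lognormal mode.
    lo, hi = math.log(dmin), math.log(dmax)
    h = (hi - lo) / n
    num = den = 0.0
    for i in range(n + 1):
        x = lo + i * h
        w = (0.5 if i in (0, n) else 1.0) * lognormal_pdf_ln(math.exp(x), gmd, gsd)
        num += w * eta(math.exp(x))
        den += w
    return num / den

# hypothetical size-resolved efficiency: weakest in the 0.1-1 µm range
eta = lambda dp: 0.2 if 0.1 <= dp <= 1.0 else 0.6
pm25_eff = integral_efficiency(eta, gmd=0.3, gsd=2.0)
```

Swapping in a measured efficiency curve and the fitted tri-modal PSD parameters would reproduce the kind of PM2.5 estimate the abstract describes.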
- Title
- House museums In Chicago: a re-examination of motives, origins, and transformations of the institutions
- Creator
- Whittaker, Daniel Joseph
- Date
- 2018
- Description
-
A house museum is a former residence converted into a publicly accessible structure that preserves an identity of its original domestic history. These houses shelter a wide variety of institutions with a diverse range of imperatives and services. With a focus on Chicago house museums, this dissertation seeks an overarching pattern underlying this conversion and reuse of residential buildings. It focuses on six house museums in Chicago: the Palmer Castle, the Harding Castle, the Clarke House, the Glessner House, the Madlener House, and the Robie House. The Palmer and Harding Castles ceased to exist as house museums and are no longer standing. Conventional archival research conducted during the initial phases yielded historiographies that corroborate as well as contradict popular stories about the process by which the houses were preserved, salvaged, and converted. Key primary-source research includes interviews with persons involved in, and observant of, the motivations and forces in play upon these six case studies; texts of the interviews are included in appendices. The dissertation reveals how select individuals (acting variously as architects, historians, concerned citizens, and leaders of institutions) influenced the creation of the six house museums. It contains a chronicle and an evaluation of the values which informed and influenced the house museum condition in Chicago, in an environment which largely pre-dated the historic building preservation movement in America. The case studies show that the persons and parties involved in saving various houses for reuse did not generally execute definitive plans, in full, with a clear ultimate goal. Instead, in all cases, individuals and small groups of people fought an array of idiosyncratic battles, often yielding short-term victories. Economic pressures, political conditions, and societal values evolve, ushering in new opportunities and new dangers for nascent institutions inhabiting former residences. As each generation of directors, curators, and governing boards matures and moves on, the goals and objectives which influenced the reuse of their house museums change. The very notion of attaining some sort of permanent stasis has been found, through this research, to be elusive. Dynamism in both the people and the institution reusing these house museums can yield positive outcomes, ensuring preservation of the institution of the house museum.
- Title
- Applying Statistical Methods to Air Quality and Asthma Data in Chicago Homes
- Creator
- Abromitis, Kari
- Date
- 2020
- Description
-
This thesis investigates one year's worth of indoor and outdoor air pollution data collected from Chicago-area homes in relation to results from monthly asthma surveys taken by the residents of those homes. This examination involves the processing and calibration of this large dataset, basic statistical analysis, and analysis of asthma as related to variation in air pollution and to air pollution from nearby transportation sources. The data were collected as part of the Breathe Easy project, which was funded by HUD and involved IIT and Elevate Energy, a Chicago-based economic development organization that promotes building equality through climate action. The majority of the data processing and analysis was performed using Python and is intended for continued use during the ongoing Breathe Easy project. The basic statistical analysis of these data led to initial investigations of how the day-to-day variability of pollutants triggered asthma severity and symptoms. Limited relationships were observed between asthma symptoms and pollutant variability, which was found to be less important than overall pollutant levels. A second investigation examined how proximity to transportation, including Metra trains, freight trains, elevated rail trains, highways, bus lines, and busy roads, affected indoor and outdoor pollution levels at each home, as well as concurrent asthma outcomes. Consistent with previous research, some relationship was found between transportation proximity, particularly for highways and Metra trains, and emitted pollution that affects the health of nearby residents. In addition, homes with greater air infiltration (via envelope airtightness measurements) had elevated levels of particulate matter, the pollutant most associated with transportation proximity. This thesis provides a basis for further investigations in this ongoing project and for similar asthma and air quality relationship studies.
- Title
- Public Event Identification Traffic Data Using Machine Learning Approach
- Creator
- Yang, Hanyi
- Date
- 2020
- Description
-
This study developed a shock wave diagram-based deep learning model (SW-DLM) to predict the occurrence of public events in real time according to their impacts on nearby highway traffic. Specifically, using point traffic volume data as a boundary condition, shock wave analysis is first conducted to understand the impacts and features of a public event on a nearby highway-ramp intersection. Next, the SWG algorithm is developed to efficiently generate and expand shock wave diagrams in real time according to the data collection rate. Built upon that, this study contributes a novel approach that encodes a shock wave diagram with an optimal grid of pixels, balancing resolution and computational load. Using features extracted from the encoded time-series shock wave diagrams as inputs, a deep learning approach, the long short-term memory (LSTM) model, is applied to predict the occurrence of a public event. Numerical experiments based on field data demonstrate that using encoded shock wave diagrams rather than point traffic data can significantly improve the accuracy of the deep learning model in predicting the occurrence of a public event. The SW-DLM shows satisfactory prediction performance on average as well as on individual days, with or without interference from traffic accidents near the venue of a public event. Implementing this approach in real-time traffic provision tools such as GPS will alert travelers en route to ongoing events in a transportation network and help them make smart trip plans and avoid traffic congestion. Moreover, it promotes smart city development by providing a strong capability to monitor the transportation system and conduct real-time traffic management intelligently.
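The pixel-grid encoding of a shock wave diagram can be illustrated with a minimal rasterizer. Everything here is a hypothetical reduction: the thesis's SWG algorithm builds the diagrams from traffic data and tunes the grid size to balance resolution and computation, whereas this sketch just samples straight-line wave segments onto a fixed grid.

```python
def encode_diagram(segments, t_max, x_max, nt=32, nx=32):
    # Rasterize shock wave segments ((t0, x0), (t1, x1)) in the
    # time-space plane onto an nt x nx grid of 0/1 pixels -- a fixed-size
    # encoding that a downstream LSTM could consume.
    grid = [[0] * nx for _ in range(nt)]
    for (t0, x0), (t1, x1) in segments:
        steps = max(nt, nx)
        for k in range(steps + 1):
            t = t0 + (t1 - t0) * k / steps
            x = x0 + (x1 - x0) * k / steps
            i = min(int(t / t_max * nt), nt - 1)
            j = min(int(x / x_max * nx), nx - 1)
            grid[i][j] = 1  # mark the pixel the wave passes through
    return grid

# one hypothetical shock wave crossing the whole diagram
grid = encode_diagram([((0, 0), (32, 32))], t_max=32, x_max=32)
```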
- Title
- Parameterization of vertical dispersion coefficient (σz) near roadway: vehicle wake, density and types
- Creator
- Yu, Yu-Ting
- Date
- 2020
- Description
-
Pollutants emitted by motor vehicles are a major public concern in modern society, since pollutants (e.g., ultrafine particles and CO2) have been related to adverse health effects and climate change. Currently, air quality dispersion models (e.g., CALINE4 and AERMOD) are widely used to predict concentrations near roadways. However, neither model accounts for vehicle variability (vehicle types), causing inaccuracy in predicting pollutant concentrations near roadways. To better understand the dynamics of the dispersion process, it is important to evaluate the variation of the vertical dispersion coefficient (σz) for different vehicle types (e.g., LDV and HDV). A “wake area model” is developed based on “effective wake area” and vehicle density to reveal the vehicle-related dynamics of the dispersion process. Results from this thesis demonstrate that the near-roadway σz calculated from field measurements increases as vehicle density increases, and that HDVs generate higher values of σz compared to LDVs. σz increases from 1.7 m to 4 m as LDV density increases from 0.005 m-1 to 0.18 m-1, while σz starts at 2 m and reaches up to 6 m as HDV density increases from 0.001 m-1 to 0.01 m-1. The significant differences in σz are due to vehicle density and vehicle type. The effective wake area contributed by one HDV (397 m2) is much larger than that contributed by one LDV (13 m2), which indicates that one HDV contributes roughly 31 times more than one LDV in terms of near-roadway σz. The significant differences in “effective wake area” are due to vehicle wake structure and vehicle dimensions. Compared against the field results, both CALINE4 and AERMOD failed to capture the variation of σz as vehicle density increases for LDVs and HDVs. CALINE4 gives σz near 2.8 m for LDVs and 2.4 m for a vehicle fleet (a mixture of LDVs and HDVs); AERMOD gives σz near 1.2 m and 1.3 m, respectively. Because neither air quality dispersion model correctly predicts σz near roadways, a new equation that parameterizes near-roadway σz needs to be developed to better predict σz under real-world conditions. For a given road segment, the parameterization of σz results from the dimensions of the “effective wake area” for different vehicle types and the number of “effective wake areas” for a vehicle fleet. Comparison of measured and parameterized near-roadway σz shows good agreement (slope = 0.99, R2 = 0.85), with a range between 1 and 4 m for LDVs and 2 to 6 m for HDVs. Comparison of measured and simulated CO2 also shows good agreement (slope = 1, R2 = 0.90), with simulated values ranging from 0.01 g m-3 to 0.12 g m-3 and measured values from 0.01 g m-3 to 0.14 g m-3. For UFPs, the results likewise show good agreement (slope = 1.14, R2 = 0.71), with simulated values ranging from 600 pt cm-3 to 11,000 pt cm-3 and measured values from 600 pt cm-3 to 17,000 pt cm-3. The “wake area model” provides a new vehicle-related perspective on σz and bridges vehicle wake behavior to σz. The results from this study will be critical for improving predictions of pollutant concentrations from near-roadway dispersion.
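The per-vehicle numbers quoted in this abstract can be combined into a simple fleet-level quantity. The sketch below only sums effective wake areas weighted by vehicle density; how that total maps onto σz is the thesis's parameterization and is not reproduced here, so the function and its name are illustrative.

```python
# effective wake areas from the abstract (m^2 per vehicle)
A_LDV, A_HDV = 13.0, 397.0

def fleet_wake_area(ldv_density, hdv_density):
    # Total effective wake area per metre of road for a mixed fleet,
    # given vehicle densities in vehicles per metre (illustrative only).
    return ldv_density * A_LDV + hdv_density * A_HDV

ratio = A_HDV / A_LDV          # ~30.5, consistent with the ~31x claim
mixed = fleet_wake_area(0.10, 0.01)
```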
- Title
- ADVANCING OPEN-SOURCE TOOLS FOR INDOOR ENVIRONMENTAL MONITORING AND BUILDING SYSTEMS CONTROLS USING WIRELESS SENSOR NETWORKS
- Creator
- Ali, Akram Syed
- Date
- 2021
- Description
-
Incorporating data monitoring and visualization tools in buildings can provide a glimpse into their energy use, thermal performance, daily operation, and maintenance requirements. However, buildings have traditionally been monitored using hardware and software that are expensive, proprietary, and often limited in ease of use and flexibility. Many existing buildings remain unmonitored or poorly monitored, leaving many opportunities for energy savings and improved indoor environmental conditions unaddressed. To this end, the goal of this research is to develop and demonstrate an open-source hardware and software platform, called Elemental, for monitoring and controlling the performance of buildings and their systems. It is designed to provide users real-time data on indoor environmental quality, energy usage, heating, ventilating, and air-conditioning (HVAC) operation, and other factors, and to enable easy development of building controls. It combines: (i) custom low-power printed circuit boards (PCBs) with RF transceivers for wireless sensors, control nodes, and a USB gateway, (ii) a Raspberry Pi with custom firmware acting as a backhaul, and (iii) custom software applications that manage data storage and device configuration and provide an interface for querying and visualizing the data in real time. The platform is built around the idea of a private, secure, and open technology for the built environment. Among its many applications, the platform allows occupants to investigate anomalies in energy usage, environmental quality, and thermal performance. It also includes multiple frontends to view and analyze building activity data, which can be used directly in building controls. This work describes the development of the hardware and software used in the Elemental platform, along with three distinct applications: (1) deployment in a research lab for long-term data collection and automated analysis, (2) use as a full-home energy and environmental monitoring solution, and (3) room-level building heating system automation through the development and deployment of a custom radiator control. Through these applications, this work demonstrates that the platform enables easy and virtually unlimited datalogging, monitoring, and analysis of real-time sensor data with low setup costs. Low-power sensor nodes placed in abundance in a building can also provide precise and immediate fault detection, allowing equipment to be tuned for more efficient operation and faster maintenance during the lifetime of the building.
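As a concrete illustration of the sensor-to-backhaul flow, the sketch below serializes one node's readings into a JSON record that a gateway might forward for storage. The field names and the use of JSON are assumptions; the abstract does not specify Elemental's actual message format.

```python
import json
import time

def sensor_payload(node_id, readings):
    # Package one wireless node's readings with an id and timestamp.
    # Field names ("node", "ts", "readings") are hypothetical.
    return json.dumps({
        "node": node_id,
        "ts": int(time.time()),
        "readings": readings,
    })

msg = sensor_payload("lab-07", {"temp_c": 21.4, "co2_ppm": 640})
```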
- Title
- OPTIMUM WEIGHT STIFFNESS STRUCTURAL DESIGN
- Creator
- Barnett, Ralph L.
- Date
- 2021
- Description
-
My adventures with flexible structures began on the IIT campus with an extracurricular undergraduate project to design an “Open House Exhibit” for the Civil Engineering Department. I chose to display a reinforced concrete diving board together with a prestressed concrete diving board. Visitors enthusiastically pounced on the reinforced concrete structure, whose rigid response disappointed one and all. Their indignation was transferred to the prestressed cantilever, which thrust them upward from six to ten feet into the air. This unexpected response from a diving board became so dangerous that the exhibit was unceremoniously closed. I still have the display sign, “More Bounce to the Ounce.” While still an undergraduate, I secured a part-time job at Armour Research Foundation, where I responded to a bid request from Rock Island Arsenal to design the 26-foot Honest John rocket launcher rail at minimum weight. This tactical weapon was transported by helicopter. I basked in the fantasy that I was Leonardo da Vinci without his artistic proclivity. Rocket launchers that droop during operation are similar in concept to a circular firing squad. So began my research into minimum weight beams based on deflection rather than strength. I searched for the shoulders of giants; I found them in the form of mathematicians, not structural engineers. I achieved a 26.5% weight savings in the 1126-pound rail by optimizing the geometry. When I developed an optimum prestressed and segmented Kentanium cermet rail, the weight savings became 89%. The right material provides a bigger bang for the buck. When my journey into optimum design began, I was armed only with analysis tools: strength, stability, and stiffness. This thesis begins with an outline of my present toolbox, which contains eight design concepts: 1. Establish the geometry; 2. Select a material from a finite number of candidates; 3. Prestress and prestrain; 4. Statistical screening (proof testing); 5. Manipulation of boundary conditions; 6. Energized systems; 7. Counterweights; 8. Self-healing and self-reinforcing. Four of these are used throughout this review, which focuses on stiffness. Beginning with beams, deflection control examples are described where prestraining and prestressing techniques are used to produce both a zero-deflection beam and a method for pushing with a chain. The calculus of variations made it possible to establish optimum tapers for the flanges and webs of I-beams that minimize beam weight for a specified deflection or, because of reciprocity, minimize beam deflection for a specified beam weight. An anomaly is encountered that enables one to achieve an upward, downward, or zero deflection with a set of beams of vanishing weight. In addition, special circumstances are defined where a uniform strength design is identical to the minimum weight design based on a specified deflection. Closed-form solutions are obtained for a variety of loading scenarios. One problem is presented for self-weight that leads to a nonlinear integral equation. The optimum stiffness-weight design of trusses is undertaken, where the area distribution of the truss members is optimized using Lagrange's method of undetermined multipliers. Once again, we obtain a degenerate case where upward, downward, and zero deflection conditions can be met with an infinite set of trusses of vanishing weight. We photograph a simply supported truss under a downward load that leads to an upward deflection at one of the joints. Special loading conditions are identified that lead to uniform stress designs that are identical to the minimum weight designs based on deflections. This study provides a segue into the world of minimum weight strength design of trusses. The resulting Maxwell and Michell trusses sometimes display the optimum distribution of bar areas from the point of view of stiffness. Many practitioners are under the mistaken impression that Michell structures, when they exist, provide the optimum truss profile for stiffness. Unfortunately, the optimum array of truss joints based on deflection does not exist. For both trusses and beams, the optimum distribution of mass is shown to be necessary and sufficient; the sufficiency is established using well-known inequalities. The role of stiffness in the design of columns is explored in our final chapter. This cringe-worthy history of column analysis begins our study as a warning to practitioners who use analysis as their basis for design, and especially optimum design. Conventional elastic and inelastic buckling theories provide little insight into the design of columns. The fundamentals of minimum weight column design are presented to show the power of design theory in contrast to analysis. Both prismatic and tapered columns are studied, with one surprising result: the optimum taper gives rise to a uniform bending stress (without axial stresses). It was fun to see that in 1773 Lagrange made a mistake in the calculus of variations that led to an incorrect solution for the optimum tapered column. It took 78 years before Clausen obtained the correct solution. The problem has been revisited by William Prager and again by the author, who used dynamic programming. Of course, we all got the same result, which is a dreadful solid circular tapered column that is heavier than any ordinary waterpipe. The best of a class is not necessarily the best possible design. Under the heading “Intuition is a good servant but a bad master,” we introduce the notions of tension members that buckle, columns constructed from spherical beads, optimum rigging of crane booms, and deflection reversal of beam-columns. In several places we observe that the weight of optimum columns is proportional to P^α, where P is the axial load and α is less than unity. We fail to tell the reader that this implies that minimum weight columns require putting all your eggs in one basket; one column under load P is lighter than two columns each under load P/2. On the other hand, we expose the solid circular column as the least efficient shape among all regular polygons; the equilateral triangle is the best. Indeed, there is a family of rectangles that are superior to the circular cross-section. Finally, the author's prestressed tubular column is introduced, which is pressurized to eliminate local buckling. Euler buckling can always be eliminated with a thin-wall section of sufficient width without a weight penalty. The weight of the balloon-like member is proportional to (PL), which implies that at last we have a compressive member that meets the requirement of a Michell structure. Bundling of pressurized gas columns is possible without a weight penalty. Further, the column is insensitive to most imperfections. It is the lightest known column for small structural indices (P/L^2). When coupled with circulating cryogenic liquid as a prestressing system, a limiting column has a vanishing weight.
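The claim that optimum column weight scales as P^α with α < 1 can be checked in the simplest case. For a prismatic solid circular Euler column of length L, density ρ, and modulus E (a standard textbook result, not the author's optimized taper):

```latex
P = \frac{\pi^2 E I}{L^2}, \qquad I = \frac{A^2}{4\pi}
\;\Longrightarrow\; A = 2L\sqrt{\frac{P}{\pi E}},
\qquad W = \rho A L = \frac{2\rho L^2}{\sqrt{\pi E}}\,\sqrt{P}.
```

Here α = 1/2, so two columns each carrying P/2 weigh √2 times as much as one column carrying P, which is precisely the "all your eggs in one basket" observation in the abstract.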
- Title
- UTILITY OF WATERSHED MODELS: IMPROVING TMDL DEVELOPMENT THROUGH A MARGIN OF SAFETY ESTIMATION AND UNCERTAINTY COMMUNICATION
- Creator
- Nunoo, Robert
- Date
- 2020
- Description
-
Watershed models are used to represent the physical, chemical, and biological mechanisms that determine the fate and transport of pollutants...
Show moreWatershed models are used to represent the physical, chemical, and biological mechanisms that determine the fate and transport of pollutants in waterbodies (Daniel 2011). These models, in general, are used for exploratory, planning, and regulatory purposes (Harmel et al. 2014). Watershed models have numerous applications; one such use is the development of total maximum daily load (TMDL). TMDL is the amount of pollution a waterbody can receive without becoming impaired. Because of the challenge of uncertainty associated with models and the TMDL development process, the United States Clean Water Act Section 303 (d)(1)(c) requires that a margin of safety (MOS) be specified to account for uncertainty in TMDLs. The question of how MOS is estimated in TMDL was identified as a problem by the National Research Council (NRC 2001). Since the identification of the problem about two decades ago, there have been very few inventories or audits of approved TMDL studies. This study describes a natural language processing and machine learning aided review of the MOS in approved TMDLs from 2002 to 2016. The study determined whether the MOS values incorporated followed a pattern and examined whether there exist a relationship between MOS values and some ecological conditions. Relatively few TMDLs were based on some form of calculation to estimate explicit MOS values; these TMDLs constituted only 16% of the reviewed sample. The remaining 84% used conventional values, but few of those studies provided reasons for their selected values. A statistical assessment of those MOS values revealed that the MOS depended on States (location of waterbody), USEPA regions, waterbody type, designated water use, TMDL model used, and dataavailability. The findings indicate that few TMDL developers are following the National Research Council’s suggestions of using a rigorous uncertainty estimation approach for rational choices for the MOS. 
An adaptive approach based on Bayes-discrepancy was proposed for estimating the MOS for a TMDL. The approach is based on a Bayesian hierarchical framework for estimating the uncertainty associated with watershed models. With this approach, TMDL developers can communicate the uncertainty in their watershed model results. The approach was applied to a Ferson Creek model of the Fox River watershed to assess variability and uncertainty in the model results and to estimate possible MOS values for two monitoring stations in the watershed. Results suggest that an MOS of 0.04 mg/L could lead to a 0.1 probability of violating the water quality standard for an underpredicting model. The Bayes-discrepancy estimation method will enable TMDL developers and watershed managers to strike a balance between implementation options and water quality concerns.
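The kind of risk-based MOS selection described above can be sketched as follows; this is a minimal illustration assuming a normally distributed model discrepancy, not the thesis's Bayesian hierarchical implementation, and all numbers are invented.

```python
import numpy as np

# Hypothetical illustration (not the thesis code): choosing an explicit MOS
# from samples of model discrepancy so that the probability of violating a
# water-quality standard stays below a target level.
rng = np.random.default_rng(42)

# Assumed samples of the discrepancy between the watershed model's predicted
# concentration and the true one (mg/L); positive means under-prediction.
discrepancy = rng.normal(loc=0.0, scale=0.02, size=10_000)

def mos_for_risk(discrepancy, target_prob):
    """Smallest MOS such that P(true value exceeds prediction + MOS) <= target."""
    # The (1 - target_prob) quantile of the discrepancy covers all but the
    # target fraction of under-predictions.
    return float(np.quantile(discrepancy, 1.0 - target_prob))

mos = mos_for_risk(discrepancy, target_prob=0.1)
violation_prob = float(np.mean(discrepancy > mos))
print(round(mos, 3), round(violation_prob, 2))
```

With these assumed numbers, the MOS lands near the 90th percentile of the discrepancy, leaving roughly a 0.1 chance of a standard violation — the same trade-off the abstract describes for its 0.04 mg/L example.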
- Title
- Framework For Cloud-Based BIM Governance
- Creator
- Mehraj, Isma
- Date
- 2020
- Description
-
Due to the rapid adoption of building information modeling (BIM) in the architecture, engineering, and construction (AEC) industry, every building can be visualized and interpreted even before its foundation touches the ground. BIM methods are expanding and have entered mainstream use, which requires immediate consideration. BIM remains difficult to operate, mostly due to the enormous amount of data involved, which often leads to improper data management. The objective of this study is to formulate a cloud-based BIM governance framework with a focus on the practical issues of its implementation in construction organizations. A framework was developed to study Data Management, Team Collaboration, Data Organization, and Legal Assurance as major constructs. It is expected that the constructs will provide a benchmark for BIM cloud governance implementation for BIM/VDC engineers to follow. The incorporation of this framework into BIM practices would produce new opportunities for the AEC community to work in collaboration and increase efficiency in data sharing. A survey of a wide spectrum of BIM/VDC practitioners from major construction organizations in the United States was conducted to explore and find evidence of the strength of the constructs. We anticipate that this framework will provide a basis for assessing and recognizing the pivotal driving factors for practical and effective BIM implementation.
- Title
- IMPROVING KNOWLEDGE OF MICROBIAL DYNAMICS ON BUILDING MATERIALS UNDER HIGH MOISTURE CONDITIONS
- Creator
- ZHAO, DAN
- Date
- 2020
- Description
-
Most buildings experience some kind of high-moisture event throughout their life cycles, often resulting from water leaks or migration of water vapor through the enclosure. Dampness and moisture in buildings lead to fungal growth and are associated with adverse human health outcomes. Although the dynamics of fungal growth on building materials have been investigated for decades, few studies have integrated modern chemical or microbiological analytical methods (e.g., DNA sequencing, qPCR, etc.) to understand microbial dynamics on materials held at high-humidity conditions. Moreover, most mold growth prediction models remain relatively simplistic and rely solely on empirical data for visible mold growth. To bridge some of these gaps, this research aims to improve understanding of microbial growth and community dynamics on building materials under high moisture conditions and to improve our ability to predict microbial growth and community dynamics under a variety of conditions. Five distinct but overlapping research objectives are used to achieve these goals: (1) evaluating the growth of microorganisms on wetted building materials and identifying relationships between specific microbial taxa, metabolites, and environmental variables; (2) identifying inherent material chemistry drivers of fungal growth susceptibility and their relation to microbial community structure; (3) exploring how fluctuating moisture exposures impact bacterial and fungal growth and dynamics on building materials; (4) investigating microbial interactions using isolated communities on a single material; and (5) evaluating and improving existing mathematical mold growth models.
- Title
- INDUSTRIALIZED BUILDING CONSTRUCTION MODELS FOR TORNADO AFTERMATH RECOVERY
- Creator
- Alves de Carvalho, Augusto
- Date
- 2019
- Description
-
Some researchers have reported that disasters are expanding in scale and frequency. Humanity occupies more land today than forty years ago, so existing communities face higher chances of being affected by disasters. Consequently, the number of natural disasters and the losses they cause have increased over time. Recent research indicates that the construction of new houses takes the majority of the recovery time; in the aftermath of the Joplin tornado, for example, the development of new houses took the longest part of the recovery (D. J. Smith & Sutter, 2013). The disaster industry sees housing and shelter as a product procured on a necessity basis: the product -- tents, inter-shelters, trailers, permanent dwellings, or any property to rent -- has to be ready whenever required. Therefore, after calculating the construction capacity in tornado regions, a methodology is proposed to compare four robust industrialized building construction alternatives that keep components, modules, and pieces in stock. Comparing them will indicate which format is more appropriate for a profitable company, or even a public entity, to respond to and recover from a disaster faster.
- Title
- A Risk-Based Approach in Selecting Design Natural Hazard Loads for Temporary Structures
- Creator
- Saleh, Eman
- Date
- 2020
- Description
-
Temporary structures are systems used for short-period applications, such as maintenance and retrofit work or staged performances. Examples include tents, scaffolding, and other facilities with a short service life. Structures under construction and structures under serviceability conditions also fall into this category, because their service period is limited to the duration of construction, which is much shorter than the lifetime of the finished structure. One of the main characteristics of these structures is their high vulnerability to instability. Most specifications rarely cover temporary structures. Designing them to appropriate dead and live load levels poses no issue, since using the same load factors as for permanent structures is a reasonable choice. However, an issue arises when choosing appropriate wind, seismic, and snow load levels, where using the same exposure levels as for permanent structures may not be an economically desirable decision. Thus, for temporary structures, it makes sense to reduce the extreme-event design loads in proportion to the intended design life. It is therefore imperative to have specific rules or guidelines addressing the design aspects and reliability of these structures, especially against lateral loads such as wind and earthquakes. This study aims at establishing several decision-making processes that could help contractors, designers, and erectors of temporary works decide upon safety factors and/or return periods for environmental loads, with emphasis on the wind load. These decision-making processes can be used in temporary projects (e.g., bridge erection) to establish a design criterion based on the nature of the project.
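The idea of reducing extreme-event design loads in proportion to the intended design life can be illustrated with standard return-period arithmetic; the 700-year map value and 50-year and 1-year exposure windows below are assumed round numbers for illustration, not figures from the study.

```python
# Illustrative sketch (assumed numbers, not the thesis method): relating a
# short design life to an "equivalent" return period that keeps the
# probability of exceedance over the exposure window unchanged.
def exceedance_prob(return_period_yr, exposure_yr):
    """P(at least one exceedance) for an annual-maximum event with return period T."""
    return 1.0 - (1.0 - 1.0 / return_period_yr) ** exposure_yr

def equivalent_return_period(target_prob, exposure_yr):
    """Return period whose exceedance probability over exposure_yr equals target_prob."""
    return 1.0 / (1.0 - (1.0 - target_prob) ** (1.0 / exposure_yr))

# A permanent structure with a 50-year life designed to a 700-year wind map:
p50 = exceedance_prob(700, 50)               # probability of exceedance in 50 years
# A 1-year temporary structure can accept the same risk with a much shorter T:
T1 = equivalent_return_period(p50, 1)
print(round(p50, 3), round(T1, 1))
```

Under these assumptions the 50-year exposure carries about a 7% exceedance probability, and a one-year temporary structure reaches the same risk level with a return period of roughly 14-15 years, which is the economic rationale the abstract describes.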
The study shows that the optimal decision-making process depends on the willingness to take advantage of wind locality characteristics (e.g., a seasonality factor) in a certain construction period or region, the type of information available to the decision-maker (i.e., precise or imprecise), the risk associated with the constructed facility or the temporary structure itself, the potential for recourse actions, and the decision-maker's attitude toward the trade-off between losses and gains under uncertainty. The suggested decision-making processes are (1) a Bayesian decision process; (2) a fuzzy decision process; (3) a two-stage stochastic programming solution; and (4) case-based decision theory. Several practical examples are presented in this thesis to show how different situations may require different decision-making processes to reach the optimal decision. The design of temporary structures can be altered in response to a forecasted hurricane; thus, we propose a three-stage stochastic programming solution to decide upon their optimal wind design load. In addition, we extend hurricane catastrophe models for application to temporary structures. This enables contractors to forecast hurricane losses as a basis for estimating adequate catastrophe cover, such as insurance premiums and reinsurance, for temporary structures. This scheme is then illustrated in an example of deciding the required temporary bracing for a steel frame under construction during the hurricane season. To prevent temporary structures from collapse, it is important to investigate the performance quality of previous projects and remove any causes of potential mishaps. This can be done by continuous monitoring of different projects and investigation of accidents, if any, to help prevent future failures. In this study, we use an audit evidence scheme based on commonly available evidence theories used in the construction industry.
In one such theory, the evidence is represented via a tree structure, in which the propagation is toward the variables that represent the project as a whole and the separate work packages within the project. For simplicity, we consider only the binary case of variables, i.e., whether or not a project conforms to the quality standards. The formulation of the auditing tool relates belief functions to the assertion of the quality assurance and quality control (QA/QC) measures and provides formulas for human error risk. These formulas provide plausibilities of human error in a belief-function format. An auditor may use the belief values to assess the degree of quality performance and to identify the sources of problems in temporary work. We further illustrate the method in a practical application evaluating shoring/reshoring operations for estimating the construction risk in a multistory concrete structure. This evaluation may be used to decide on an appropriate time for formwork removal, the shoring/reshoring schedule, casting cycles, the post-tensioning sequence, and the required number of reshoring levels given the available evidence. To reduce the computational burden, we describe the shoring/reshoring system as a multi-state system (MSS), where the universal generating function (UGF) is used to estimate the reliability of the system.
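The universal generating function mentioned above attaches a probability to each discrete performance level of a component and composes components with a polynomial-style product; the sketch below uses invented shore capacities and probabilities, not data from the thesis.

```python
from itertools import product
from collections import defaultdict

# Minimal sketch of the UGF technique for a multi-state system; the
# component states and probabilities are hypothetical illustrations.
def combine(u1, u2, op):
    """Compose two u-functions {performance: probability} under operator op."""
    out = defaultdict(float)
    for (g1, p1), (g2, p2) in product(u1.items(), u2.items()):
        out[op(g1, g2)] += p1 * p2
    return dict(out)

# Each shore carries 0, 50, or 100 kN with some probability (assumed values).
shore = {0: 0.05, 50: 0.15, 100: 0.80}
# Two shores acting in parallel: their capacities add.
system = combine(shore, shore, op=lambda a, b: a + b)
# Reliability = probability the system capacity meets a 100 kN demand.
reliability = sum(p for g, p in system.items() if g >= 100)
print(round(reliability, 4))
```

The same `combine` call can be chained over many shores and reshoring levels, which is what makes the UGF attractive for cutting the computational burden relative to enumerating all state combinations directly.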
- Title
- Frictional behavior of bronze-graphite composite as sliding element in the base isolation system
- Creator
- You, Da
- Date
- 2021
- Description
-
Among the many calamities around the world, earthquakes are one of the most dangerous, threatening the safety of both people and structures. Almost every year, earthquakes cause extensive property losses and casualties. To mitigate the adverse effects of earthquakes, the base isolation system was proposed by earlier researchers, and with the contributions of many of them, several seismic isolation systems have been developed. Many structures have now been fitted with seismic isolation to resist seismic energy and vibration. Seismic isolation systems perform well during earthquakes and help reduce the casualties and property losses that earthquakes induce. Two main types of bearings are used in seismic isolation systems: elastomeric bearings and sliding bearings. The mechanism by which a seismic isolation system reduces the influence of an earthquake and the horizontal acceleration of the structure is to elongate the natural period of the structure. For sliding bearings, the simplest way to increase the period is to reduce the friction coefficient between the two sliding elements. Conventionally, two stainless steel plates are used in a pure flat sliding bearing. This study uses a bronze-graphite composite in the sliding bearing to decrease the friction coefficient. The test results suggest that the bronze-graphite composite has a lower friction coefficient, with the graphite acting as a lubricant. The friction coefficient of the bronze-graphite plate is in the range of 0.12 to 0.23 under loads of 160 kg to 800 kg. With a higher ratio of graphite to bronze at the sliding surface, the reduction in friction coefficient is more pronounced. The friction coefficient also changes as the load increases: it decreases at first and then starts to increase beyond a certain applied load.
Finally, it is reasonable to use the bronze-graphite composite in a low-rise structure with relatively low weight. Because the loads applied in the tests were not high, the results may not hold for tall or heavy structures. Taken together, the use of new materials with similar properties in seismic isolation systems can help improve earthquake resistance, and this warrants further research in the field.
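The friction coefficients reported above are, at their core, Coulomb's ratio of friction force to normal load; a minimal sketch of reducing sliding-test measurements to that ratio, with invented force values rather than the thesis data:

```python
# Hypothetical illustration of reducing sliding-test measurements to a
# Coulomb friction coefficient; the forces below are invented examples.
G = 9.81  # gravitational acceleration, m/s^2

# (applied mass in kg, measured sliding friction force in N) -- assumed data
tests = [(160, 220.0), (400, 490.0), (800, 1100.0)]

# Coulomb friction: mu = F_friction / F_normal, with F_normal = m * g
mus = [force / (mass * G) for mass, force in tests]
for (mass, _), mu in zip(tests, mus):
    print(f"{mass} kg -> mu = {mu:.2f}")
```

With these assumed measurements the ratio dips at the intermediate load and rises again at the highest one, mirroring the decrease-then-increase trend the abstract reports.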
- Title
- The Peter Principle and Career Development in Construction Management
- Creator
- Bolisetty, Lakshmi Satya Bavya
- Date
- 2023
- Description
-
The construction industry, like any other industry, may benefit from updating its working environment by adopting advances in technology, updating the qualifications of its workforce, and ensuring that the right professionals are employed at each level. Having effective career development programs, filling project management roles with competent professionals, and promoting competent professionals ensure successful project delivery. Promotion typically rewards candidates for their performance and motivates them toward achieving higher success. Promoting the wrong person, or having an incompetent person at any management level, has detrimental effects on a project's success. However, according to the "Peter Principle" (Peter, 1969), incumbents in a hierarchy tend to rise to "a level of respective incompetence"; that is, they are promoted based on their success in their current role rather than their ability to perform in the new one. Through a review of the literature and a critical analysis of the existing research, this study examines the strengths and weaknesses of the Peter Principle and its relevance to professionals employed by construction companies and construction management firms in contemporary management practice. It also explores the reasons why an incumbent may be stuck in a position for a long time. The findings of this research suggest that while the Peter Principle may have some limitations, it remains an important concept for managers to consider when making decisions about employee promotion and development. The study concludes that while the Peter Principle may have a significant effect on promotion decisions, there are also external circumstances unique to each individual that may affect their performance.
- Title
- Design and Fabrication of Battery-Operated Radiator Control (BORC) Utilizing 3D Printing Strategies
- Creator
- Riley, Christopher W.
- Date
- 2023
- Description
-
This work serves as a continuation of prior research focused on the "development and evaluation of an automatic steam radiator control system for retrofitting legacy heating systems in existing buildings" (Syed Ali et al., 2020) by describing and testing the mechanical components of the developed controller in full detail. Other aspects of radiator efficiency are also explored. Primarily, this work elaborates on the importance of material selection and mechanical properties in the design process. It also proposes proactive solutions for building energy recovery by monitoring the initial setup and focusing on details such as cardinal direction, thermal breaks, etc. These legacy systems are generally problematic when attempting to calculate energy efficiency, as the majority of radiator controls tend to be manual. Though there are comparable products on the European market, they cater to hot water systems rather than steam, and in some instances require an internet bridge for operation (Tahersima et al., 2010). Since this is an extension of our earlier project, I will refer to it as the Battery-Operated Radiator Control (BORC) and to the previous version as BERG's Automated Radiator Control (ARC).
- Title
- The Effect of Time Step on HSPF Model Performance
- Creator
- Rubinstein, Benjamin J.
- Date
- 2023
- Description
-
Hydrological modeling is a mature and well-researched field; however, because most climate data are collected at hourly or greater time intervals, there is very little research on the effect of using high-resolution data as model inputs. A Python tool for downloading high-resolution five-minute-interval data from the Oklahoma Mesonet was created, and the PyHSPF Python package was used to generate, calibrate, and validate HSPF models using five-minute, one-hour, and daily time steps. Flow errors, R², and Nash-Sutcliffe efficiency for the simulated outflows, as well as resource usage, were compared for each model. The hourly and five-minute models performed similarly well, while the daily model performed significantly worse. The results of this work could prove useful for policy makers and researchers looking to update or create new climate data collection protocols, and the tools used can be applied to many kinds of future research.
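The Nash-Sutcliffe efficiency and R² used to compare the time-step models can be sketched as follows; the short flow series below are synthetic stand-ins, not Mesonet-driven HSPF outputs.

```python
import numpy as np

# Sketch of the goodness-of-fit measures mentioned above, on invented data.
def nash_sutcliffe(observed, simulated):
    """NSE = 1 - SSE / variance of observations; 1 is perfect, < 0 is worse than the mean."""
    observed, simulated = np.asarray(observed), np.asarray(simulated)
    sse = np.sum((observed - simulated) ** 2)
    sst = np.sum((observed - observed.mean()) ** 2)
    return 1.0 - sse / sst

obs = np.array([1.0, 3.0, 2.0, 5.0, 4.0])   # "observed" flows (synthetic)
sim = np.array([0.8, 2.9, 2.4, 4.6, 4.1])   # "simulated" flows (synthetic)

nse = nash_sutcliffe(obs, sim)
r = np.corrcoef(obs, sim)[0, 1]
print(round(nse, 3), round(r**2, 3))
```

Note that R² only measures linear association, while NSE also penalizes bias and scale errors, which is why model-comparison studies typically report both.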