Search results
(9,521 - 9,540 of 9,950)
Pages
- Title
- Distribution-aware Visual Semantic Understanding
- Creator
- Chen, Ying
- Date
- 2021
- Description
- Understanding visual semantics, including change detection and semantic segmentation, is an essential task in many computer vision and image processing applications. Examples of visual semantics understanding in images include land cover monitoring, urban expansion evaluation, autonomous driving, and scene understanding. The goal is to locate and recognize appropriate pixel-wise semantic labels in images. Classical computer vision algorithms involve sophisticated semi-heuristic pre-processing steps and potentially manual interaction. In this thesis, I propose and evaluate end-to-end deep neural approaches for processing images that achieve better performance than existing approaches. Supervised semantic segmentation has been widely studied and has achieved great success with deep learning. However, existing deep learning methods typically suffer from generalization issues: a well-trained model may not work well on unseen samples from a different dataset. This is due to a distribution change, or domain shift, between the training and test sets that can degrade performance. Providing more labeled samples covering many possible variations can further improve the generalization of models, but acquiring labeled data is typically time-consuming, labor-intensive, and requires domain knowledge. To tackle this label scarcity bottleneck for supervised learning, we propose to apply unsupervised domain adaptation, semi-supervised learning, and semi-supervised domain adaptation to neural semantic segmentation. The motivation behind unsupervised domain adaptation for semantic segmentation is to transfer learned knowledge from one or more source domains with sufficient labeled samples to a different but relevant target domain where labeled data is sparse or non-existent. The adaptation algorithm learns a common representation space in which the distributions over the source and target domains are matched. In this way, we expect a classifier working well on the source domain to generalize well to the target domain. More specifically, we try to learn class-aware source-target domain distribution differences, and transfer the knowledge learned from labeled synthetic data in the source domain to the unlabeled real data in the target domain. Different from domain adaptation, semi-supervised semantic segmentation aims at utilizing a large amount of unlabeled data to improve a semantic classifier trained on a small amount of labeled data from the same distribution. Specifically, supervised semantic segmentation is trained together with an unsupervised model by applying perturbations to encoded states of the network instead of the input, or by using mask-based data augmentation techniques to encourage consistent predictions over mixed samples. In this way, learned representations, which capture many kinds of unseen variations in unlabeled data, benefit the supervised semantic classifier. We propose a mask-based data augmentation semi-supervised learning network to utilize structure information from a variety of unlabeled examples to improve learning on a limited number of labeled examples. Both unsupervised domain adaptation (UDA), with full source supervision but without target supervision, and semi-supervised learning (SSL), with partial supervision, have been shown to address the generalization problem to some extent. While such methods are effective at aligning different feature distributions, their inability to efficiently exploit unlabeled data leads to an intra-domain discrepancy in the target domain, where the target domain is separated into two unaligned sub-distributions of source-aligned and target-aligned data. That is, enforcing partial alignment between fully labeled source data and a few labeled target data does not guarantee that the remaining unlabeled target samples will be aligned with source feature clusters, thus leaving them unaligned. Hence, I propose methods that incorporate the advantages of both UDA and SSL, termed semi-supervised domain adaptation (SSDA), with the goal of aligning cross-domain features as well as addressing the intra-domain discrepancy within the target domain. I propose a simple yet effective semi-supervised domain adaptation approach that uses a two-step domain adaptation addressing both cross-domain and intra-domain shifts.
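A minimal sketch of the mask-based consistency idea described in this abstract, assuming a generic segmentation network that returns per-pixel class logits and CutMix-style rectangular masks; the function, box size, and training setup are illustrative assumptions, not the thesis's implementation.

```python
import torch
import torch.nn.functional as F

def cutmix_consistency_loss(model, x_a, x_b):
    """Encourage predictions on a mixed image to match the mix of predictions.

    model: any segmentation net mapping (N, C, H, W) images to (N, K, H, W) logits.
    x_a, x_b: two unlabeled image batches of shape (N, C, H, W).
    """
    n, _, h, w = x_a.shape
    # Random rectangular mask: 1 inside the pasted box, 0 elsewhere (half-size box here).
    mask = torch.zeros(n, 1, h, w, device=x_a.device)
    bh, bw = h // 2, w // 2
    top = torch.randint(0, h - bh + 1, (1,)).item()
    left = torch.randint(0, w - bw + 1, (1,)).item()
    mask[:, :, top:top + bh, left:left + bw] = 1.0

    with torch.no_grad():  # pseudo-targets from the current model
        p_a = torch.softmax(model(x_a), dim=1)
        p_b = torch.softmax(model(x_b), dim=1)

    x_mix = mask * x_b + (1.0 - mask) * x_a          # mixed input
    p_mix_target = mask * p_b + (1.0 - mask) * p_a   # mixed pseudo-targets
    logp_mix = F.log_softmax(model(x_mix), dim=1)

    # Cross-entropy between the prediction on the mixed image and the mixed pseudo-targets.
    return -(p_mix_target * logp_mix).sum(dim=1).mean()
```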
- Title
- Factors Influencing the Level of Detection of Testing Listeria monocytogenes in Ice Cream
- Creator
- Chen, Bairu
- Date
- 2022
- Description
- Increasing evidence has shown that having a sensitive detection method for Listeria monocytogenes in food products is critical for public health as well as industrial economics. L. monocytogenes was associated with foodborne illness outbreaks linked to ice cream in the United States from 2010 to 2015, with another recent outbreak under investigation. The FDA Bacteriological Analytical Manual (BAM) method is commonly used for L. monocytogenes detection. However, the performance characteristics of the chromogenic methods (MOX, RLM, and R&F agars) remain to be elucidated. This study investigated the factorial effect on the Level of Detection (LOD), an essential element of the International Organization for Standardization (ISO) approach for qualitative method validation. To examine the LOD of L. monocytogenes in ice cream, fractionally contaminated samples were prepared with ice cream obtained from the 2015 outbreak and enumerated using the FDA BAM Most Probable Number (MPN) method for Listeria. The effect of test portion size was determined by comparing 10 g and 25 g portions using the BAM method with chromogenic agars (MOX, RLM, and R&F). The ISO single-lab validation requirement was followed for the factorial effect study, including four factors: sample size (10 g and 25 g), ice cream type (commercially available regular vanilla ice cream and vanilla ice cream with low fat and no added sugar), re-freezing process (with and without re-freezing), and thawing process (slow thaw and fast thaw). LOD and relative LOD (RLOD) were computed using MiBiVal software to compare the sensitivity of the three chromogenic agars and the different factors. For all detection experiments, presumptive colonies were identified using the API Listeria kit. The 2015 naturally contaminated ice cream was enumerated and showed an average contamination level of 2.15 MPN/g. At fractional levels of 0.25 MPN/10 g and 0.75 MPN/10 g, the positive rates of L. monocytogenes detected from 10 g and 25 g sample portions were consistent with the theoretical positive rates. The RLOD values for the reference method (MOX) and the alternative methods (RLM, R&F) were above 1 for both portion sizes, suggesting that MOX was slightly more sensitive than RLM and R&F. The factorial effect study indicated that the four factors had no significant influence on the LOD of L. monocytogenes detection at the fractional contamination levels. However, the 25 g test portion provided more consistent results among the chromogenic media than the 10 g portion. Fat content was shown to have an effect on L. monocytogenes detection in a large test portion. The information from this study will be useful for improving the reproducibility of qualitative detection methods and can also inform data analysis standards such as ISO 16140 in method validation studies.
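A small sketch of the Poisson relationship commonly used to compute theoretical positive rates from an MPN contamination level and a test portion size; the exact statistical treatment in the thesis (and in MiBiVal) may differ, so this is only an illustration.

```python
import math

def theoretical_positive_rate(mpn_per_gram: float, portion_g: float) -> float:
    """P(test portion contains >= 1 cell) under a Poisson contamination model."""
    return 1.0 - math.exp(-mpn_per_gram * portion_g)

# Example: the fractional levels cited above, expressed per gram.
for level_per_10g in (0.25, 0.75):
    m = level_per_10g / 10.0          # MPN per gram
    for portion in (10.0, 25.0):      # test portion sizes in grams
        p = theoretical_positive_rate(m, portion)
        print(f"{level_per_10g} MPN/10g, {portion:>4.0f} g portion: "
              f"expected positive rate {p:.2f}")
```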
- Title
- Habitat 2022+
- Creator
- Cho, San Lae Lae
- Date
- 2021
- Description
- Globalization and urbanization have led a vast portion of the population to shift from rural and suburban areas to urban centers. According to the United Nations' (UN) projection, around 2.5 billion more people will live in cities by 2050. To accommodate such a population, cities can grow horizontally or vertically. If cities were to grow horizontally, this population would need to commute mostly by car, which has a very high carbon footprint. By building vertically, the population's commute becomes more efficient if the vertical structure is built as a city within a city. Although they have existed for 500 years, tall buildings and skyscrapers have been topics of controversy for their excessive use of resources and their neglect of environmental and contextual problems. To mitigate this sustainability issue, mass timber has recently been introduced into the American market as a construction material. According to the Council on Tall Buildings and Urban Habitat (CTBUH), mass timber is the building material for a carbon-neutral future. Another means is to grow plants around buildings, regrowing greenery with the hope of taking a step toward mitigating climate change. In addition, responding to the argument that tall buildings lack the neighborhood-like qualities that, for instance, a Chicago block of housing may give, more and more tall buildings around the world have tried to implement lush spaces along the building perimeter. These spaces with plants and trees mimicking nature can exist as both indoor and outdoor spaces. While neighborhoods have parks, these spaces, or sky gardens, will substitute as modified parks for skyscrapers. With psychological benefits tied to spaces linked to nature in high-rises, such spaces should be integrated into all high-rise residential buildings. Today, especially since the Covid-19 pandemic, many people work from home. Meanwhile, it is human nature to want to be surrounded by plants and nature, whether spending time under a tree or lying on the grass. To mitigate spending excessive hours indoors, spaces linked to nature have become crucial in providing a sense of peace and reconnecting people to nature. However, architecture that integrates nature currently results in spaces only for the wealthy. Furthermore, architecture in the US is largely commercially driven, with developers aiming for the most cost-effective, most efficient, and largest floor areas. For nature to be more available in the living spaces of the middle class and, ideally, the working class, city planners and city development authorities need to get involved. For instance, the city could allocate some of the tax it collects to build more green architecture. With tax-funded assistance from the city, high-quality green architecture would become available and much more affordable. Another option for making living more affordable is co-living. Despite the Covid-19 pandemic, co-living remains relatively high in demand in Chicago, according to the Chicago Tribune. For younger generations such as the Millennials and those following, it is predicted that, because of student loan debt, these generations are less likely to be financially well off after graduation. Hence, they are less likely to establish a family as soon as previous generations did, as most people tend to start a family when they feel financially stable. In this design thesis, a prototype addressing some of these issues will be tested. This is just one option among many, and other nuances remain to be explored.
- Title
- RE-DEFINING LABOR HOUSING COMMUNITIES IN DUBAI
- Creator
- Chavadi, Sanjana Mrutyunjaya
- Date
- 2022
- Description
- From the moment they are duped into believing in a utopian lifestyle, to their day-to-day working and living conditions once they arrive in the country, to the denial of their rights for years after they are stuck in the country, the migration of workers from other countries into the Emirates is a neglected humanitarian issue. The United Arab Emirates and several other GCC countries have a heinous habit of recruiting illiterate foreigners into their country and abusing them as if they were modern-day slaves. Many nations and individuals exercise legal control over immigrants by applying the Kafala sponsorship system and ignoring international norms and rights established by organizations like the International Labour Organization. While the Kafala system is only one of many aspects of the whole problem, this thesis seeks to identify the influence of architecture on the lives of the South Asian labor force who work behind the beautiful façade of the United Arab Emirates by reconstructing the labor community. These colossal structures are built and shown off while the individuals who construct them are ignored and hidden. Low-income immigrant laborers bear the brunt of a hard life-work imbalance with the intention of providing assistance to their families in their home nations. The labor force of Dubai becomes the major target of the terrorists, isolated from Dubai's pleasure amenities, socially detached from the rest of the population, and separated from their creations. By creating a landmark and developing a sense of social coherence, integration, and empowerment among the labor community, this initiative intends to bridge the divide and commemorate the labor community. This project will propose a new sort of community and housing typology that embraces the ideals of modularity, adaptability, and community as a solution to the immigrant worker dilemma.
- Title
- Hedge Fund Replication With Deep Neural Networks And Generative Adversarial Networks
- Creator
- Chatterji, Devin Mathew
- Date
- 2022
- Description
- Hedge fund replication is a means of allowing investors to achieve hedge fund-like returns, which are usually only available to institutions. Hedge funds in total have over $3 trillion in assets under management (AUM). More traditional money managers would like to offer hedge fund-like returns to retail investors by replicating hedge fund performance. There are two primary challenges with existing hedge fund replication methods: difficulty capturing the nonlinear and dynamic exposures of hedge funds with respect to the factors, and difficulty identifying the right factors that reflect those exposures. It has been shown in previous research that deep neural networks (DNNs) outperform other linear and machine learning models in financial applications. This is due to the ability of DNNs to model complex relationships, such as non-linearities and interaction effects, between input features without over-fitting. My research investigates DNNs and generative adversarial networks (GANs) in order to address the challenges of factor-based hedge fund replication. Neither of these methods has previously been applied to the hedge fund replication problem. My research contributes to the literature by showing that the use of DNNs and GANs addresses the existing challenges in hedge fund replication and improves on results in the literature.
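A minimal sketch of factor-based replication with a small feed-forward network, assuming factor returns X of shape (T, F) and fund/index returns y of shape (T,) as NumPy arrays; the architecture, loss, and training settings are illustrative assumptions, not those used in the thesis.

```python
import torch
import torch.nn as nn

def fit_replication_dnn(X, y, epochs=500, lr=1e-3):
    """Map factor returns (T, F) to fund returns (T,) with a small MLP."""
    X_t = torch.tensor(X, dtype=torch.float32)
    y_t = torch.tensor(y, dtype=torch.float32).unsqueeze(1)
    model = nn.Sequential(
        nn.Linear(X_t.shape[1], 32), nn.ReLU(),
        nn.Linear(32, 16), nn.ReLU(),
        nn.Linear(16, 1),
    )
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(X_t), y_t)   # tracking-error proxy
        loss.backward()
        opt.step()
    return model  # model(X_new) gives the replicating "clone" returns
```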
- Title
- Improving Utility and Efficiency for Privacy Preserving Data Analysis
- Creator
- Liu, Bingyu
- Date
- 2022
- Description
- In recent decades, smart cities have been incorporating Internet-of-Things (IoT) infrastructures to improve citizens' quality of life by leveraging information and data. Huge amounts of data are extracted and generated from devices (e.g., mobile applications, GPS navigation systems, and urban traffic cameras) and from city sectors such as Intelligent Transportation Systems (ITS), resource allocation, utilities, crime detection, hospitals, and other community services. This dissertation aims to systematically study data analysis in IoT systems along two main aspects: utility and efficiency. First, with ITS as a representative IoT system in the smart city, I present work on privacy preservation for trajectory data, achieved with a differential privacy technique and a novel sanitization framework. Moreover, I have studied the resource allocation problem with two different approaches, cryptographic computation and hardware enclaves, considering utility and efficiency accordingly. For the cryptographic approach, I utilize Secure Multi-party Computation (SMC) to achieve a privacy-aware divisible double auction without a mediator. I also propose a hardware-based solution, a Trusted Execution Environment (TEE), for performance improvement, while integrity and confidentiality are also guaranteed. The proposed hybridized TEE-Blockchain system is designed for securely executing smart contracts. Finally, I have studied cryptographic video DNN inference for smart city surveillance, which privately performs inference on videos (e.g., action recognition and video classification) over 3D spatial-temporal features with the pre-trained C3D and I3D DNN models at high performance. This dissertation proposes privacy-preserving frameworks and mechanisms that can be applied efficiently to real-world IoT systems.
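A minimal sketch of the Laplace mechanism that underlies differentially private release of aggregate trajectory statistics (e.g., per-cell visit counts); the sanitization framework in the dissertation is more elaborate, so the function and parameters below are only illustrative.

```python
import numpy as np

def laplace_mechanism(counts, epsilon, sensitivity=1.0):
    """Release counts with epsilon-differential privacy.

    counts: array of true aggregate counts (e.g., visits per grid cell).
    sensitivity: max change in any count from adding/removing one trajectory
                 (1.0 if each trajectory contributes to at most one cell).
    """
    scale = sensitivity / epsilon
    noise = np.random.laplace(loc=0.0, scale=scale, size=np.shape(counts))
    return np.asarray(counts, dtype=float) + noise

# Example: privatize visit counts over 5 road segments with epsilon = 0.5.
true_counts = np.array([120, 45, 300, 12, 78])
print(laplace_mechanism(true_counts, epsilon=0.5))
```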
- Title
- A BIM-BASED LIFE CYCLE ASSESSMENT TOOL OF EMBODIED ENERGY AND ENVIRONMENTAL IMPACTS OF REINFORCED CONCRETE TALL BUILDINGS
- Creator
- Ma, Lijian
- Date
- 2022
- Description
- Today, 55 percent of the world's population lives in urban areas, a share expected to increase to 68 percent by 2050. In cities, high-rise buildings, as symbols of the modern cityscape, dominate the skylines, but data demonstrating their embodied energy and environmental impacts are scarce compared to low- or mid-rise buildings. Reducing the embodied energy and environmental impacts of buildings is critical, as about 42 percent of primary energy use and 39 percent of global greenhouse gas (GHG) emissions come from the building sector. Nevertheless, the embodied energy and environmental impacts of tall buildings remain an overlooked area. This doctoral research aims to investigate the effects of tall buildings on embodied energy and environmental impacts by using process-based life cycle assessment (LCA) methodology within a Building Information Modelling (BIM) environment, which provides the construction industry a platform to incorporate sustainability information in architectural design. This doctoral research is carried out through a literature review on the embodied energy of high-rise buildings; current LCA methods for buildings are also discussed in the literature review. It then develops a framework for BIM-based assessment of the embodied energy and environmental impacts of tall buildings. To achieve that, a case study of a tall reinforced concrete building is carried out, using ISO 14040 and 14044 guidelines with an available database, Revit, and the Tally application in Revit. The author concentrates on the embodied energy and environmental impacts of reinforced concrete tall buildings. Finally, the association between design and construction variables and embodied energy and environmental impacts is explored. This research will lead to significant contributions. A comprehensive study on the embodied energy and environmental impacts of high-rise buildings will address a major gap in the LCA literature. Researchers and environmental consultants can use the results of this research to create design tools to evaluate the environmental impacts of high-rise buildings. Also, architects can use the results to develop insight into the environmental performance of tall buildings in the early design stage. Architects and engineers can also use the results to optimize tall building design for low embodied energy and environmental impacts. Finally, the results of this research will enable architects, engineers, planners, and policymakers to develop more sustainable built environments.
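A small sketch of the basic quantity-times-coefficient calculation that sits behind a process-based embodied energy and impact assessment of a structural bill of materials; the quantities and coefficients below are placeholders, not values from the thesis, Tally, or any specific database.

```python
# Hypothetical material take-off (from a BIM model) and unit coefficients.
# quantity in kg; embodied energy in MJ/kg; GWP in kg CO2e/kg (placeholders).
materials = {
    #            quantity_kg, MJ_per_kg, kgCO2e_per_kg
    "concrete": (5.0e6,       1.1,       0.15),
    "rebar":    (4.0e5,       20.0,      1.90),
}

total_energy = sum(q * e for q, e, _ in materials.values())   # MJ
total_gwp = sum(q * g for q, _, g in materials.values())      # kg CO2e

print(f"Embodied energy: {total_energy / 1e6:.1f} TJ")
print(f"Global warming potential: {total_gwp / 1e3:.0f} t CO2e")
```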
- Title
- Active Load Control in a Synchronized and Democratized (SYNDEM) Smart Grid
- Creator
- Lv, Zijun
- Date
- 2021
- Description
- The smart grid is envisioned to take advantage of modern information and communication technologies to achieve a more intelligent grid that facilitates the integration of renewable resources, the integration of all types of energy storage, and two-way communication between the consumer and the utility so that end users can actively participate. The Synchronized and Democratized (SYNDEM) smart grid is regarded as the next-generation smart grid. The objective of the SYNDEM smart grid is for all active players in a grid, large or small, conventional or renewable, supplying or consuming, to be able to equally and laterally regulate the grid in a synchronous manner to enhance the stability, reliability, and resiliency of future power systems. In a SYNDEM smart grid, power electronic converters are controlled to behave as conventional synchronous machines. Such converters are called virtual synchronous machines (VSMs). Following the SYNDEM structure, this thesis mainly focuses on developing the VSM technology for automatic grid regulation at the demand side. The major aim is to achieve active or intelligent loads that can flexibly and automatically take part in grid regulation. Moreover, the active load is expected to have grid regulation behavior similar to that of other active players in the grid, e.g., renewable generation. To achieve this, a droop-controlled rectifier is proposed that acts as a general interface between a load and the grid. The rectifier is controlled as a VSM so that a load equipped with such a rectifier can take part in grid regulation continuously like a traditional synchronous machine. Such a rectifier has a built-in storage port in addition to the normal AC and DC ports. The flexibility required by the AC port to support the grid is provided by the storage port. The DC-bus voltage of the storage port is able to fluctuate within a wide range to exchange energy with the grid. In order to make better use of the energy in the storage port (the DC-bus capacitor) of a rectifier and increase the support time to the grid, an adaptive droop mechanism is proposed. Under this mechanism, the rectifier can automatically change the power it consumes according to grid voltage variations as well as its potential to provide grid support. To achieve this, a flexibility coefficient is introduced to indicate the power flexibility level of the DC-bus capacitor. This flexibility coefficient is then embedded into the universal droop controller (UDC) to make it adaptive. Hence, the adaptive droop controller has a changing droop coefficient corresponding to the power flexibility of a rectifier, so it can wisely take advantage of the energy stored in its DC-bus capacitor to support the grid. This droop controller can also be applied to the connection between two SYNDEM smart grids. To achieve this, a grid bridge (GB) that enables autonomous and equal regulation between two SYNDEM grids is proposed. The real power transferred through a GB has a linear relationship with the voltage deviation between the two connected micro-grids. The micro-grid with a higher voltage will automatically provide power to the lower one. Moreover, the power direction of a GB is bidirectional and determined by the grid voltage difference, which makes the two micro-grids equal to each other. The GB is physically a back-to-back converter. In order to achieve autonomous and equal regulation, both sides of the back-to-back converter are controlled by droop controllers with the same droop coefficients. The VSM control technology is also developed to control modular multilevel converters (MMCs) for high-voltage applications. Like the active loads introduced above, the MMCs can take part in grid regulation according to the designed droop mechanism. In order to eliminate the circulating current that exists in MMCs, proportional-resonant (PR) controllers are adopted to inject second-order harmonics into the MMCs to suppress the second-order circulating current. The dynamics, implementation, and operation of the VSM-like MMC are introduced and analyzed. In particular, how the VSM control algorithm works with the circulating current control in MMCs is presented. An IIT SYNDEM Smart Grid Testbed is built with the aim of achieving a minimal realization of the SYNDEM system. Extensive experiments are conducted on the system to show the operational scenarios when the proposed active loads are integrated into the system. There are in total eight nodes in the IIT SYNDEM testbed, which contains two utility grids, one AC load, one DC load, two solar farms, and two wind farms. All the nodes are connected to a local grid through VSMs, so that they can take part in local grid regulation in similar ways. The IIT SYNDEM Smart Grid Testbed is described in detail, and experimental results are provided to show the dynamic and steady-state performance of the IIT SYNDEM smart grid.
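A minimal numeric sketch of voltage-droop power regulation with an adaptive coefficient scaled by a flexibility factor, in the spirit of the adaptive droop mechanism described above; the actual universal droop controller and VSM dynamics in the thesis are considerably richer, so the names and the simple law below are illustrative assumptions.

```python
def adaptive_droop_power(v_grid, v_nominal, p_rated, soc, k_base=2.0):
    """Illustrative droop law: consume less power when the grid voltage sags.

    v_grid, v_nominal: measured and nominal AC voltage (per unit or volts).
    p_rated: rated load power (W).
    soc: state of charge of the DC-bus storage port, in [0, 1]; used here as a
         simple "flexibility coefficient" that scales the droop gain.
    """
    flexibility = max(0.0, min(1.0, soc))      # clamp to [0, 1]
    k = k_base * flexibility                   # adaptive droop coefficient
    deviation = (v_grid - v_nominal) / v_nominal
    # Positive deviation (voltage high) -> consume more; negative -> consume less.
    p_cmd = p_rated * (1.0 + k * deviation)
    return max(0.0, p_cmd)

# Example: a 0.95 p.u. grid voltage with a half-charged DC-bus capacitor.
print(adaptive_droop_power(v_grid=0.95, v_nominal=1.0, p_rated=1000.0, soc=0.5))
```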
- Title
- Modeling, Analysis and Computation of Tumor Growth
- Creator
- Lu, Min-Jhe
- Date
- 2022
- Description
- In this thesis we investigate the modeling, analysis, and computation of tumor growth. The sharp interface model we consider aims to understand how two key factors, (1) the mechanical interaction between tumor cells and their surroundings and (2) the biochemical reactions in the microenvironment of tumor cells, influence the dynamics of tumor growth. From this general model we give its energy formulation and solve it numerically using boundary integral methods and the small-scale decomposition under three different scenarios. The first application is the two-phase Stokes model, in which tumor cells and the extracellular matrix are both assumed to behave like viscous fluids. We compared the effect of membrane elasticity on the tumor interface with that of curvature weakening and found that the latter promotes the development of branching patterns. The second application is the two-phase nutrient model under complex far-field geometries, which represent heterogeneous vascular distribution. Our nonlinear simulations reveal that vascular heterogeneity plays an important role in the development of morphological instabilities that range from fingering and chain-like morphologies to compact, plate-like shapes in two dimensions. The third application concerns the effects of angiogenesis, chemotaxis, and the control of necrosis. Our nonlinear simulations reveal the stabilizing effects of angiogenesis and the destabilizing effects of chemotaxis and necrosis in the development of tumor morphological instabilities when the necrotic core is fixed. We also perform a bifurcation analysis for this model. Finally, as future work, we propose new models through the Energetic Variational Approach (EnVarA) to shed light on the modeling issues.
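For readers unfamiliar with this class of models, a generic quasi-steady sharp-interface nutrient formulation (in the spirit of Greenspan/Cristini-Lowengrub type models) is sketched below; the symbols and boundary conditions are a standard textbook illustration, not the exact model analyzed in the thesis.

```latex
\begin{aligned}
\nabla^{2}\sigma &= \lambda\,\sigma && \text{in } \Omega(t) \quad \text{(quasi-steady nutrient diffusion with uptake)},\\
\nabla^{2}p &= -\,(\mu\,\sigma - A) && \text{in } \Omega(t) \quad \text{(Darcy flow with nutrient-driven proliferation and apoptosis)},\\
\sigma &= 1,\qquad p = \gamma\,\kappa && \text{on } \partial\Omega(t) \quad \text{(nutrient supply; Young-Laplace condition with curvature } \kappa\text{)},\\
V &= -\,\mathbf{n}\cdot\nabla p && \text{on } \partial\Omega(t) \quad \text{(normal velocity of the tumor boundary)}.
\end{aligned}
```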
- Title
- GLOBAL ESTIMATION AND ANALYSIS OF IONOSPHERIC DRIVERS WITH A DATA ASSIMILATION ALGORITHM
- Creator
- López Rubio, Aurora
- Date
- 2022
- Description
- This dissertation studies a data assimilation algorithm that estimates the drivers of the ionosphere-thermosphere (IT) region of the Earth. The algorithm, EMPIRE (Estimating Model Parameters from Ionospheric Reverse Engineering), can estimate two main drivers of ionospheric behavior, neutral winds and electric potential, by ingesting mainly ionospheric densities obtained through Global Navigation Satellite System (GNSS) measurements. Additionally, the algorithm can ingest Fabry-Perot interferometer (FPI) neutral wind measurements. The contributions include 1) vector spherical harmonic basis functions for neutral wind estimation, 2) quantification of the representation error of the algorithm's estimates, 3) analysis of Nighttime Ionospheric Localized density Enhancement (NILE) events, and 4) ingestion of global ICON (Ionospheric Connection Explorer) neutral wind measurements. The IT region of the atmosphere is characterized by a large concentration of free ions and electrons, electromagnetic radiation, and Earth's magnetic field. The behavior of the region is dominated by solar activity, which ionizes particles in the region, forming ionospheric plasma and determining its density. Unusual solar activity or any atmospheric disturbance affects the distribution of the ionospheric plasma and the behavior of the IT region. The redistribution of ionospheric density impacts widely used technologies such as telecommunications and satellite navigation, so it is increasingly important to study the IT system response. The IT behavior can be characterized by what drives its changes. Two drivers that play a key role, and the ones we focus on in this dissertation, are the electric potential, which directly affects the charged ions in the system, and neutral winds, the velocity of the neutral particles that form the thermosphere. To quantify these drivers, measurements and climate models are available. Measurements are limited, as the IT region is vast and covers the entire globe. Climate models can provide information over the whole region, but they are usually not as reliable during unusual solar activity or disturbances. In this dissertation we use a data assimilation algorithm, EMPIRE, that combines both sources of data, measurements and models, to estimate the IT drivers, neutral winds and electric potential. EMPIRE ingests measurements of the plasma density rate and models the physics of the region with the ion continuity equation. The drivers are represented with basis functions, and their coefficients are estimated by fitting the expansions with a Kalman filter. In previous work with the algorithm, the neutral winds were expanded using power series basis functions for each component of the vector. The first contribution of the dissertation is to use a vector spherical harmonic expansion to describe the winds, allowing a continuous expansion around the globe and self-consistent vector components. Previously, EMPIRE estimated corrections to the drivers with respect to climate model values; in this work, EMPIRE is also modified to estimate the drivers directly. Then, the representation error, which is the discrepancy between the true physics and the discrete model that represents the physics in EMPIRE, is studied and quantified. Next, EMPIRE is used to analyze two NILE events, using the global estimation of both winds, from the first contribution, and the electric potential, derived in previous work. Finally, the global estimation of winds allows us to implement the ingestion of ICON global winds in EMPIRE, in addition to the plasma density rate measurements.
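A compact sketch of the linear Kalman filter predict/update cycle used to fit basis-function coefficients from measurements, as in the driver-estimation step described above; the state and measurement models here are generic placeholders, not EMPIRE's actual formulation.

```python
import numpy as np

def kalman_update(x, P, z, H, R, Q=None, F=None):
    """One predict/update cycle for coefficients x with covariance P.

    x: (n,) basis-function coefficient estimate
    P: (n, n) estimate covariance
    z: (m,) measurements (e.g., plasma density rate residuals)
    H: (m, n) maps coefficients to predicted measurements (basis evaluation)
    R: (m, m) measurement noise covariance
    F, Q: optional state transition and process noise (identity / zero if None)
    """
    n = x.size
    F = np.eye(n) if F is None else F
    Q = np.zeros((n, n)) if Q is None else Q
    # Predict
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)          # Kalman gain
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(n) - K @ H) @ P_pred
    return x_new, P_new
```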
- Title
- Machine learning applications to video surveillance camera placement and medical imaging quality assessment
- Creator
- Lorente Gomez, Iris
- Date
- 2022
- Description
- In this work, we used machine learning techniques and data analysis to approach two applications. The first one, in collaboration with the Chicago Police Department (CPD), involves analyzing and quantifying the effect that the installation of cameras had on crime, and developing a predictive model with the goal of optimizing video surveillance camera locations in the streets. While video surveillance has become increasingly prevalent in policing, its intended effect on crime prevention has not been comprehensively studied in major US cities. In this study, we retrospectively analyzed the crime activity in the vicinity of 2,021 surveillance cameras installed between 2005 and 2016 in the city of Chicago. Using Difference-in-Differences (DiD) analysis, we examined the daily crime counts that occurred within the fields of view of these cameras over a 12-month period, both before and after the cameras were installed. We also investigated their potential effect on crime displacement and diffusion by examining the crime activity in a buffer zone (up to 900 ft) extending from the cameras. The results show that, collectively, there was an 18.6% reduction in crime counts within the direct viewsheds of all of the study cameras (excluding District 01, where the Loop, Chicago's business center, is located). In addition, we adapted the methodology to quantify the effect of individual cameras. The quantified effect on crime is the prediction target of our two-stage machine learning algorithm, which aims to estimate the effect that installing a video camera in a given location will have on crime. In the first stage, we trained a classifier to predict whether installing a video camera in a given location will result in a statistically significant decrease in crime. If so, the data goes through a regression model trained to estimate the quantified effect on crime that the camera installation will have. Finally, we propose two strategies, using our two-stage predictive model, to find the optimal locations for camera installations given a budget. Our proposed strategies result in a larger decrease in crime than a baseline strategy based on choosing the locations with the highest crime density. The second application of this thesis belongs to the field of model observers for medical imaging quality assessment. With the advance of medical imaging devices and technology, there is a need to evaluate and validate new image reconstruction algorithms. Image quality is traditionally evaluated using numerical figures of merit that indicate similarity between the reconstruction and the original. In medical imaging, a good reconstruction strategy should be one that helps the radiologist perform a correct diagnosis. For this reason, medical imaging reconstruction strategies should be evaluated with a task-based approach by measuring human diagnostic accuracy. Model observers (MOs) are algorithms capable of acting as human surrogates to evaluate reconstruction strategies, significantly reducing the time and cost of organizing sessions with expert radiologists. In this work, we develop a methodology to estimate a deep learning based model observer for a defect localization task using a synthetic dataset that simulates images with statistical properties similar to trans-axial sections of X-ray computed tomography (CT). In addition, we explore how the models access diagnostic information from the images using psychophysical methods that have previously been employed to analyze how humans extract the information. Our models are independently trained for five different humans and are able to generalize to images with background noise statistics that were not seen during the model training stage. In addition, our results indicate that the diagnostic information extracted by the models matches that extracted by the humans.
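A minimal sketch of the two-by-two difference-in-differences estimate implied by the camera analysis above (camera viewshed vs. control/buffer area, before vs. after installation); it ignores the count-model, covariate, and clustering details a real analysis would need, and the column names are assumptions.

```python
import pandas as pd

def did_estimate(df: pd.DataFrame) -> float:
    """Difference-in-differences on mean daily crime counts.

    Expects columns: 'treated' (1 = camera viewshed, 0 = control area),
    'post' (1 = after installation, 0 = before), 'crimes' (daily count).
    """
    means = df.groupby(["treated", "post"])["crimes"].mean()
    change_treated = means.loc[(1, 1)] - means.loc[(1, 0)]
    change_control = means.loc[(0, 1)] - means.loc[(0, 0)]
    return change_treated - change_control  # DiD effect on daily counts
```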
- Title
- EXAMINING PERFORMANCE DEGRADATION OF LI-ION BATTERIES WITH SILICON-BASED ANODE AND POSSIBLE SOLUTIONS TO IMPROVE THE SILICON ANODE BEHAVIOR
- Creator
- Luo, Mei
- Date
- 2022
- Description
- Si has been investigated as a promising alternative to conventional graphite because of its high specific capacity and wide operating voltage; however, technical challenges related to volume change in the silicon anode have hampered its practical application. In this work, the effects of silicon volume change on electrochemical performance have been studied in NMC532/Si full cells. First, different area-specific capacity ratios of the negative to positive electrode (N:P ratio) were investigated using three-electrode cells. With individual electrode potentials monitored by a reference electrode, different depths of lithiation/delithiation at the anode and cathode were found to play an important role in cell performance; the cell with the higher N:P ratio displays superior electrochemical performance due to its smaller silicon volume change. Further, calendar-life aging and cycle-life aging of NMC532/Si cells were compared, with their electrode potentials monitored using a reference electrode. The larger capacity decay and impedance growth observed in cycle-life aged cells illustrates the important effect of silicon volume change; significant capacity decay of the calendar-life aged cell was observed as well, revealing the essential role of the chemical effects of ongoing side reactions at the Si anode. Specially designed silicon, along with different protocols and electrolyte additives, was investigated to address the intrinsic challenges of Si anodes for lithium-ion batteries.
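A tiny sketch of the areal N:P capacity-ratio calculation referenced above; the loadings and specific capacities are placeholder numbers, not values from the cells studied in this work.

```python
def np_ratio(anode_loading_mg_cm2, anode_mAh_g,
             cathode_loading_mg_cm2, cathode_mAh_g):
    """Areal capacity ratio of negative to positive electrode (N:P)."""
    anode_areal = anode_loading_mg_cm2 / 1000.0 * anode_mAh_g      # mAh/cm^2
    cathode_areal = cathode_loading_mg_cm2 / 1000.0 * cathode_mAh_g
    return anode_areal / cathode_areal

# Placeholder example: a Si-dominant anode paired with an NMC532 cathode.
print(round(np_ratio(2.0, 2000.0, 18.0, 180.0), 2))  # N:P > 1
```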
- Title
- Mothers’ Vaccination Decision: The Relation Between Science Skepticism, Social Networks, Vaccination Beliefs, and Fear of ASD
- Creator
- Lockwood, Maria Izabel Kugelmas Guarita
- Date
- 2021
- Description
- Vaccines are instrumental in stopping the spread of disease, yet some parents choose to not vaccinate their children. Despite scientific evidence that childhood vaccines are safe, there is an increasing number of children in the United States and the United Kingdom who are not getting vaccinated. The current study investigates different factors that may be associated with mothers’ decision to vaccinate their children. This study examines the relations between skepticism in science, vaccination beliefs, fear of having a child with Autism Spectrum Disorder (ASD), social network recommendations, and maternal decision to vaccinate. Participants included 293 expectant mothers in the United States and the United Kingdom. Results indicated that mothers who are pro-vaccine and mothers who are vaccine-hesitant have different score profiles across scales that measure skepticism in science, vaccination beliefs, and fear of having a child with ASD. Specifically, we found that relative to mothers who are vaccine-hesitant, mothers who are pro-vaccine: (1) indicated less skepticism in science; (2) had fewer anti-vaccination beliefs; (3) did not statistically differ on their fear of having a child with ASD; and (4) had a smaller percentage of their social network advocating against vaccination. Thus, the study adds to the research literature, as it illustrates that mothers who are vaccine-hesitant differ from mothers who are pro-vaccine on key factors.
- Title
- Data-Driven Methods for Soft Robot Control and Turbulent Flow Models
- Creator
- Lopez, Esteban Fernando
- Date
- 2022
- Description
- The world today has seen an exponential increase in its usage of computers for communication and measurement. Thanks to recent technologies, we are now able to collect more data than ever before. This has ushered in a new age of data-driven methods which can describe systems and behaviors with increasing accuracy. Whereas before we relied on the expertise of a few professionals with domain-specific knowledge developed over years of rigorous study, we are now able to rely on collected data to reveal patterns, develop novel ideas, and offer solutions to the world’s engineering problems. No domain is safe. Within the engineering realm, data-driven methods have seen vast usage in the areas of control and system identification. In this thesis we explore two areas of data-driven methods, namely reinforcement learning and data-driven causality. Reinforcement learning is a method by which an agent learns to select actions and behaviors that result in increasing reward. This method was applied to a soft-robotic concept called the JAMoEBA to solve various tasks of interest in the robotics community, specifically tunnel navigation, obstacle field navigation, and object manipulation. A validation study was conducted to show the complications that arise when applying reinforcement learning to such a complex system. Nevertheless, it was shown that reinforcement learning is capable of solving three key tasks (static tunnel navigation, obstacle field navigation, and object manipulation) using specific simulation and learning hyperparameters. Data-driven causality encompasses a range of metrics and methods that attempt to uncover causal relationships between variables in a system. Several information-theoretic causal metrics were developed and applied to a nine-mode turbulent flow data set representing the Moehlis model. It was shown that careful consideration of the method used was required to identify significant causal relationships. Causal relationships were shown to converge over several hundred realizations of the turbulent model. Furthermore, these results match the expected causal relationships given known information about self-sustaining processes in turbulence, validating the method’s ability to identify causal relationships in turbulence.
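A small sketch of one common information-theoretic causality metric, a histogram-based transfer entropy between two discretized signals; the thesis develops several metrics and more careful estimators, so this estimator and its binning choice are illustrative assumptions only.

```python
import numpy as np

def transfer_entropy(x, y, bins=4):
    """Estimate TE from x to y: does x's past help predict y's next value?

    x, y: 1-D arrays of equal length, discretized into `bins` equal-width bins.
    TE = sum p(y_next, y_now, x_now) * log2[ p(y_next | y_now, x_now) / p(y_next | y_now) ].
    """
    xd = np.digitize(x, np.histogram_bin_edges(x, bins)[1:-1])
    yd = np.digitize(y, np.histogram_bin_edges(y, bins)[1:-1])
    y_next, y_now, x_now = yd[1:], yd[:-1], xd[:-1]

    def prob(*cols):
        joint, counts = np.unique(np.stack(cols), axis=1, return_counts=True)
        return {tuple(k): c / counts.sum() for k, c in zip(joint.T, counts)}

    p_xyz = prob(y_next, y_now, x_now)
    p_yz = prob(y_now, x_now)
    p_yy = prob(y_next, y_now)
    p_y = prob(y_now)

    te = 0.0
    for (yn, yc, xc), p in p_xyz.items():
        te += p * np.log2((p / p_yz[(yc, xc)]) / (p_yy[(yn, yc)] / p_y[(yc,)]))
    return te
```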
- Title
- ARCHITECTURE FOR COLLABORATIVE CREATIVITY - SPACE WE-Q: SPACE INTELLIGENCE EMPOWERING CREATIVE WE CULTURE IN LEARNING-DRIVEN ENVIRONMENTS
- Creator
- Mor-Avi, Anat
- Date
- 2020
- Description
- Changes in societal culture, along with research on how we learn, challenge current architectural solutions. Education’s shifting paradigms align with these changes and move teaching strategies from teacher-centered to learner-centered, and from formal and passive to informal and active modes. Another shift emphasizes collaboration and participatory creativity, which evolve the idea of the “collective,” or “We” versus “I” scenarios. In addition, studies show that creativity flourishes in specific contradictory performances. Supporting these reported changes, new knowledge, and paradigm shifts, this research studied how an active, adaptive architectural design approach might emerge into the learning and creative processes. Evidence indicates that “design and space do matter,” particularly in learning- and working-driven domains. Empirical research has been weak in addressing this understanding relative to the architectural solutions, affordances, behaviors, and emotions that promote collaborative creativity. This research aimed to investigate patterns of architectural affordances believed to impact and empower collaborative cultures and behaviors in learning environments (“WE CULTURE”), specifically motions and emotions. A mixed-methods research design was conducted, using two techniques: (a) a content analysis of awarded learning and working environments, and (b) a post-occupancy evaluation using ethnographic techniques to study the Kaplan Innovation Institute at the Illinois Institute of Technology in Chicago, Illinois, USA. In an effort to provide an applied design study, a visual pattern language related to cultures of learning, environment behavior, and emotions was developed. The pattern language is the platform for designing intelligent spaces, SPACE WE-Q, that promote collaborative behaviors and creativity through adaptive and behavior-based systems of active affordances. SPACE WE-Q offers a planned adaptive system for unplanned creative processes that emerges into learning, suggesting a new relationship between architecture and education, between architects and users, and between users and space.
- Title
- Promoting Healthy Lifestyle Behaviors for African Americans with Serious Mental Illness and Weight Concerns
- Creator
- Nieweglowski, Katherine
- Date
- 2022
- Description
- People with serious mental illness face greater rates of chronic illness and obesity compared to those without mental illness. These rates are disproportionately higher for those who are part of racially minoritized groups. For example, African Americans are more likely to be obese compared to their white counterparts. This study sought to test a diet and exercise program, developed through community-based participatory research, called “Behaviors for Healthy Lifestyles” (BHL) for African Americans with serious mental illness and weight concerns. The impact of this program, also combined with peer health navigation (PHN), was tested on various physical and mental health outcomes. Participants were randomly assigned to either integrated-care treatment as usual (IC-TAU), BHL, or BHL+PHN. Data were collected at baseline and at 4-month, 8-month, and 12-month follow-ups for outcomes measuring general health, bodily pain, physical functioning, emotional well-being, depression, recovery, quality of life, weight efficacy, and emotional eating. Monthly data collection was also conducted on the frequency of healthy lifestyle behaviors related to diet and physical activity. Findings from group-by-trial analyses of variance on these outcomes did not show any significant impact. Implications for testing diet and exercise interventions combined with PHN for this population are discussed, along with future research considerations related to increasing attendance and participation for greater health improvements.
- Title
- Development and evaluation of high resolution MRI templates and labels of the MIITRA atlas
- Creator
- Niaz, Mohammad Rakeen
- Date
- 2022
- Description
- A digital human brain atlas consisting of MRI-based multi-modal templates and semantic labels delineating brain regions is commonly used as a reference for spatial normalization in a wide range of neuroimaging studies. Magnetic resonance imaging (MRI) studies of the aging brain are of significant interest in recent times for exploring the role of brain characteristics associated with cognitive function. The introduction of advanced image reconstruction techniques and the recent trend toward MRI acquisitions at submillimeter in-plane resolution have made high spatial resolution MRI data on older adults more readily available. An atlas with a comprehensive set of high-resolution templates representative of the older adult brain and detailed labels accurately mapping brain regions can increase the sensitivity and specificity of such neuroimaging studies. Additionally, most neuroimaging studies can benefit from a high-resolution atlas with templates in which fine brain structures are resolved and the transition between different tissues can be more accurately defined. However, such an atlas is not publicly available for older adults. Hence, the goal of this thesis is to develop a comprehensive, high-resolution digital human brain atlas for older adults, termed the Multi-channel Illinois Institute of Technology and Rush University Aging (MIITRA) atlas. This dissertation aims a) to develop a new technique based on the principles of super-resolution for the construction of high-resolution structural and diffusion tensor templates, and to evaluate the templates for use in studies on older adults, b) to construct and evaluate high-resolution structural and diffusion tensor templates for the MIITRA atlas using the method developed in (a) and MRI data collected on 400 non-demented older adults, c) to investigate and develop a technique for the construction of high-resolution labels and evaluate the performance of gray matter labels constructed with this technique in segmenting the gray matter of older adults, and d) to develop and evaluate a comprehensive set of high-resolution labels for the MIITRA atlas using the technique developed in (c) and data on 400 non-demented older adults. Based on these aims, the thesis is structured as follows. First, this thesis presents a novel approach for the construction of a high-resolution T1-weighted structural template based on the principles of super-resolution. This method introduced a forward mapping technique to minimize signal interpolation and a weighted averaging method to account for residual misregistration. The new template was shown to resolve finer brain structures compared to a lower-resolution template constructed using the same data. Systematic comparison of this new template to several other standardized templates of different resolutions demonstrated that it a) exhibited high image sharpness, b) was free of image artifacts, c) allowed for high spatial normalization accuracy and detection of smaller inter-group morphometric differences compared to other standardized templates, and d) was highly representative of the older adult brain. This approach was further modified for the construction of a high spatial resolution diffusion tensor imaging (DTI) template. The new DTI template is the first high spatial resolution population-based DTI template of the older adult brain; it exhibits high image quality and sharpness, is free of artifacts, resolves fine white matter structures, and provides higher spatial normalization accuracy for older adult DTI data compared to other available DTI templates. Second, the aforementioned techniques were utilized in the development of high-resolution T1-weighted and DTI templates and tissue probability maps for the MIITRA atlas, using high-quality MRI data from a diverse community cohort of 400 non-demented older adults. Third, a novel approach for generating high-resolution gray matter labels is presented that involves a) utilization of the super-resolution technique to ensure sharp delineation of structures and b) a multi-atlas-based correction technique to reduce errors due to misregistration. High-resolution gray matter labels were constructed using this approach. When used for regional segmentation of the gray matter of older adults, the new gray matter labels showed high overlap, high geometric correlation, and low dissimilarity with manually edited reference labels, demonstrating high agreement between the new labels and the manually edited FreeSurfer labels. Finally, this thesis presents the development of a comprehensive array of gyral-based, cytoarchitecture-based, and functional connectivity-based gray matter labels in MIITRA space utilizing the aforementioned techniques. These labels will enhance the functionality of the MIITRA atlas and its interoperability with the source atlases.
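A highly simplified sketch of forward mapping with weighted averaging onto a fine grid, the general idea behind the template-construction step described above; real template building involves nonlinear registration, interpolation kernels, and quality weighting that this toy 1-D example omits, and all names here are illustrative.

```python
import numpy as np

def forward_map_average(samples, positions, weights, grid_size):
    """Accumulate irregularly positioned samples onto a fine 1-D grid.

    samples:   intensity values from many co-registered lower-resolution images
    positions: their (sub-voxel) positions on the high-res grid, in [0, grid_size)
    weights:   per-sample confidence (e.g., registration quality)
    """
    num = np.zeros(grid_size)
    den = np.zeros(grid_size)
    idx = np.clip(np.round(positions).astype(int), 0, grid_size - 1)
    np.add.at(num, idx, weights * samples)   # forward-map: push values to the grid
    np.add.at(den, idx, weights)
    with np.errstate(invalid="ignore", divide="ignore"):
        template = np.where(den > 0, num / den, 0.0)  # weighted average per voxel
    return template
```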
- Title
- The Detection of Emerging Pathogenic Arcobacter Species In Poultry and Poultry By-Products
- Creator
- Nguyen, Paul
- Date
- 2022
- Description
- Arcobacter species are emerging foodborne pathogens associated with human gastrointestinal illness. Typical reported symptoms of Arcobacter infection include diarrhea, abdominal cramps, nausea, vomiting, and, in severe cases, bacteremia. Consumption of contaminated food and water is the most common transmission route leading to human infection. When consumed, pathogenic Arcobacter spp. pass through the stomach and establish themselves in the host intestinal tract, where they cause gastroenteritis. Currently, there is no standard isolation method to detect pathogenic Arcobacter spp. from food and environmental sample matrices. The research detailed in this thesis describes the development of the Nguyen-Restaino-Juárez (NRJ) Arcobacter detection system, comprised of a selective enrichment broth and a chromogenic agar plate used to isolate three pathogenic species: Arcobacter butzleri, Arcobacter cryaerophilus, and Arcobacter skirrowii. Results revealed that NRJ yielded 97.8% inclusivity and 100.0% exclusivity when evaluated against select bacterial strains found in foods. Our research group internally validated the novel chromogenic detection system by comparing its efficacy against the modified Houf reference method (HB). Method-performance evaluations determined that the NRJ method was significantly more sensitive and specific than modified HB when isolating the three Arcobacter species from ground chicken samples. Furthermore, 16S amplicon sequencing data identified that greater than 97% of bacterial isolates recovered using the NRJ detection system were Arcobacter species. This thesis presents the development and validation of a new gold-standard method for isolating these emerging pathogens in food, clinical, and environmental samples.
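A tiny sketch of how inclusivity and exclusivity percentages like those above are typically computed from a panel of target and non-target strains; the panel sizes below are placeholders chosen only to reproduce the reported percentages, and the thesis follows a formal validation protocol this does not capture.

```python
def inclusivity_exclusivity(target_detected, target_total,
                            nontarget_negative, nontarget_total):
    """Inclusivity: % of target Arcobacter strains detected.
    Exclusivity: % of non-target strains correctly not detected."""
    inclusivity = 100.0 * target_detected / target_total
    exclusivity = 100.0 * nontarget_negative / nontarget_total
    return inclusivity, exclusivity

# Placeholder panel sizes (not the thesis's actual strain counts).
print(inclusivity_exclusivity(44, 45, 30, 30))  # -> (97.8, 100.0)
```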
- Title
- Synthesis and Photophysical Characterization of Novel Aromatic Triplet Dyes for Photodynamic Therapy Applications
- Creator
- Morgan, Jayla A
- Date
- 2022
- Description
- Photodynamic therapy is a biomedical approach to treating specific types of cancerous tumor cells and harmful bacteria. The core principle of photodynamic therapy involves the use of a photosensitizer, an agent capable of transforming molecular, triplet-state oxygen into a reactive oxygen species upon excitation with near-infrared (NIR) light. The reactive oxygen species has been demonstrated to cause apoptosis in harmful cells without damaging cancer-free cells. The effectiveness of photodynamic therapy is highly dependent upon the identity of the photosensitizer; a powerful and efficient photosensitizer should be non-toxic, exhibit high light absorption, and produce large amounts of the reactive oxygen species. A novel chromophore, bis-iodo-dipyrrolonaphthyridinedione, was demonstrated to have all the vital characteristics of an ideal photosensitizer, but produced low amounts of the reactive oxygen species of interest due to the chemical instability of a carbon-halogen bond present in the molecule. Various subsequent halogenations (bis-bromo and bis-chloro), completed in order to remedy this instability, revealed specific regioselectivity with respect to the dipyrrolonaphthyridinedione parent, arising from substituent effects of the substrate and electronic effects of the reagents of interest, along with the overall photophysical characterization of the molecules.
- Title
- Non-Hermitian Phononics
- Creator
- Mokhtari, Amir Ashkan
- Date
- 2021
- Description
- Non-Hermitian and open systems are those that interact with their environment through flows of energy, particles, and information. These systems show rich physical behaviors, such as unidirectional wave reflection, enhanced transmission, and enhanced sensitivity to external perturbations compared to Hermitian systems. To study non-Hermitian and open systems, we first present key concepts and the required mathematical tools, such as the theory of linear operators, linear algebra, biorthogonality, and exceptional points. We then consider the operator properties of various phononic eigenvalue problems. The aim is to answer some fundamental questions about the eigenvalues and eigenvectors of phononic operators. These include questions about the potentially real or complex nature of the eigenvalues, whether the eigenvectors form a complete basis, what the right orthogonality relationships are, and how to create a complete basis when none may exist at the outset. In doing so, we present a unified understanding of the properties of the phononic eigenvalues and eigenvectors that would emerge from any numerical method employed to compute such quantities. Next, we apply these theories of phononic operators to the problem of scattering of in-plane waves at an interface between a homogeneous medium and a layered composite. This problem is an example of a non-self-adjoint operator with biorthogonal eigenvectors and a complex spectrum. Since the problem is non-self-adjoint, degeneracies in the spectrum generally represent a coalescing of both the eigenvalues and the eigenvectors (exceptional points). These degeneracies appear in both the complex and real domains of the wavevector. After calculating the eigenvalues and eigenvectors, we calculate the scattered fields through a novel application of the Betti-Rayleigh reciprocity theorem. Several numerical examples showing rich scattering phenomena are presented. We also prove that energy flux conservation is a restatement of the biorthogonality relationship of the non-self-adjoint operators. Finally, we discuss open elastodynamics as a subset of non-Hermitian systems. A basic concept in open systems is the effective Hamiltonian: a Hamiltonian that acts in the space of a reduced set of degrees of freedom of a system and describes only a part of the eigenvalue spectrum of the total Hamiltonian. We present the Feshbach projection operator formalism, traditionally used for calculating effective Hamiltonians of subsystems in quantum systems, in the context of mechanical wave propagation problems. The formalism allows for the direct formal representation of effective Hamiltonians of finite systems that interact with their environment. This results in a smaller set of equations that isolates the dynamics of the system from the rest of the larger problem, which is usually of infinite size. We then present the procedure to calculate the Green's function of the effective Hamiltonian. Finally, we solve the scattering problem in 1D discrete systems using the Green's function method.
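A small numerical illustration of an exceptional point, where both the eigenvalues and the eigenvectors of a non-Hermitian matrix coalesce; the 2x2 gain/loss model below is a textbook example, not a phononic operator from the thesis.

```python
import numpy as np

# PT-symmetric two-mode model: H = [[i*g, k], [k, -i*g]] with coupling k.
# Eigenvalues are +/- sqrt(k**2 - g**2): real for g < k, complex for g > k,
# and they coalesce (together with their eigenvectors) at the exceptional point g = k.
k = 1.0
for g in (0.5, 1.0, 1.5):
    H = np.array([[1j * g, k], [k, -1j * g]])
    vals, vecs = np.linalg.eig(H)
    overlap = abs(np.vdot(vecs[:, 0], vecs[:, 1]))  # -> 1 when the eigenvectors merge
    print(f"g = {g:3.1f}  eigenvalues = {np.round(vals, 3)}  "
          f"|<v1|v2>| = {overlap:.3f}")
```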