Search results
(281 - 300 of 4,545)
Pages
- Title
- MINIMIZING SALMONELLA CONTAMINATION IN SPROUTS BY CONTROLLING THE GERMINATION TEMPERATURE
- Creator
- Zhang, Hanshuai
- Date
- 2013, 2013-12
- Description
-
Since 1990, contaminated sprouts have been linked to at least 46 outbreaks and over 2,500 cases of illness in the US [13]. Unlike other ready-to-eat produce, sprouts pose a particular concern because the conditions that promote germination of their seeds also facilitate the growth of pathogens [6]. To address sprouts' propensity for microbial contamination, the U.S. Food and Drug Administration (FDA) has recommended that seeds destined for sprout production be disinfected with chemical sanitizers such as 20,000 ppm of calcium hypochlorite, Ca(OCl)2 [29]. However, this disinfectant often cannot completely eliminate pathogens that may be present in seeds [4, 23]; in such cases, the surviving bacteria can re-grow to significant numbers during germination and cause severe illness upon consumption [45]. Therefore, controlling the germination conditions to prevent the proliferation of pathogens is a crucial step in the overall approach to reducing microbial hazards in finished sprouts. This study examined the effects of temperature on the proliferation of Salmonella during germination, and how this temperature effect is influenced by factors such as pathogen load, seed lot, and the presence or absence of chemical treatment with Ca(OCl)2. Alfalfa seeds artificially inoculated with ~3 log CFU/g of Salmonella were used as the contaminated seeds. They were mixed at different levels (0.01, 0.1, 1.0, or 10.0% by weight) with 200 g of non-contaminated seeds and then allowed to germinate in glass jars for 3 or 5 days at 10, 20, or 30°C. The same experiment was repeated with spiking seeds that were treated with 20,000 ppm Ca(OCl)2 for 15 min prior to sprouting. Sprout samples were taken from each jar daily and analyzed for the level of Salmonella by either plating on XLD plates or the three-tube most probable number method as described in the FDA BAM.
The level of Salmonella increased during sprouting at all three temperatures and reached its highest level at 48 h. Sprouting at 10°C yielded the lowest number of Salmonella when all other factors were controlled. At all spiking levels (the percentage of seeds contaminated before sprouting), the level of Salmonella increased during sprouting and, regardless of spiking level, reached similar final levels of approximately 5 log CFU/g at 20°C and 7 log CFU/g at 30°C. At the 1.0% spiking level, the level of Salmonella increased by approximately 1.5, 4.0, and 6.0 log CFU/g in sprouts grown at 10, 20, and 30°C, respectively. Differences in background microflora levels between seed lots did not appear to affect Salmonella proliferation during sprouting. Treatment with 20,000 ppm free chlorine in some cases lowered Salmonella to undetectable levels, while in other cases it caused an approximately 3-log reduction in the Salmonella count on seeds. The surviving cells could still proliferate during sprouting, although with a delay and at a much slower rate, and did not reach the maximal level by 48 h of sprouting; thus, Ca(OCl)2 did not prevent the re-growth of Salmonella during germination. In conclusion, these results showed that sprouting temperature does affect Salmonella proliferation. We recommend lowering the sprouting temperature in conjunction with chemical treatment of seeds prior to sprouting to reduce microbial hazards in sprouts.
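The growth figures above are reported on a log10 CFU/g scale. As a minimal illustrative sketch (the numbers below are examples chosen to match the reported ranges, not the thesis data), the log increase between two plate counts can be computed as:

```python
import math

def log_increase(initial_cfu_per_g, final_cfu_per_g):
    """Log10 increase in bacterial load between two plate counts (CFU/g)."""
    return math.log10(final_cfu_per_g) - math.log10(initial_cfu_per_g)

# Growth from ~3 log CFU/g (1e3) to ~7 log CFU/g (1e7) is a 4-log increase,
# within the range of the 20-30°C sprouting results reported above.
print(round(log_increase(1e3, 1e7), 1))  # → 4.0
```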
M.S. in Food Safety and Technology, December 2013
- Title
- LARGE-SCALE SIMULATION OF ELECTRIC POWER SYSTEMS FOR WIND
- Creator
- Wei, Tian
- Date
- 2011-08, 2011-07
- Description
-
The utilization of wind energy offers great socioeconomic benefits through reductions in power plant emissions and the supply of zero-cost energy; however, large-scale wind energy integration could introduce inevitable challenges to regional transmission systems and hourly system operations. This thesis addresses congestion identification, simulation, and analysis of large-scale electric power systems in different scenarios, large-scale wind energy integration, and related transmission expansion issues. A methodology based on security-constrained unit commitment (SCUC) is applied to analyze transmission congestion in the Eastern Interconnection of the United States. The identified congestions are visualized along with Geographical Information System (GIS) data and compared with the results of the National Electric Transmission Congestion Study (NETCS) published by the U.S. Department of Energy in 2006. The study also provides locational marginal price (LMP) information for the Eastern Interconnection, which is not available in the NETCS report. This thesis implements a comprehensive simulation and scenario analysis of the Illinois electric power system for the year 2011. Possible scenarios representing electrical load sensitivities to economic growth, fuel price variations, and the impact of carbon cost are studied. This thesis also presents hourly simulation results for large-scale wind energy integration in the Eastern Interconnection of the United States. An hourly unit commitment is applied to simulate the economics of wind energy integration in the year 2030. The energy portfolio for supplying the hourly load in 2030 is developed based on wind integration levels. The sensitivities of fuel price, wind energy quantity, load forecast, carbon cost, and load management to the proposed 2030 wind integration are studied.
This thesis identifies transmission congestions and expands the existing transmission system in the Eastern Interconnection of the United States for accommodating a large-scale integration of wind energy. Violated transmission flows which would cause the infeasibility of hourly SCUC are identified. An iterative transmission expansion analysis is implemented to identify the minimum required additions to the Eastern Interconnection for mitigating hourly transmission congestions.
Ph.D. in Electrical Engineering, July 2011
- Title
- DIRECT DIFFEOMORPHIC REPARAMETERIZATION FOR CORRESPONDENCE OPTIMIZATION IN STATISTICAL SHAPE MODELING
- Creator
- Li, Kang
- Date
- 2015, 2015-05
- Description
-
This dissertation proposes an efficient optimization approach for obtaining shape correspondence across a group of objects for statistical shape modeling. With each shape represented in a B-spline based parametric form, the correspondence across the shape population is cast as the problem of seeking a reparameterization for each shape so that a quality measure of the resulting shape correspondence across the group is optimized. The quality measure is the description length of the covariance matrix of the shape population, with landmarks sampled on each shape. The movement of landmarks on each B-spline shape is controlled by the reparameterization of that shape. The reparameterization itself is also represented with B-splines, and the B-spline coefficients are used as optimization parameters. We have developed formulations for ensuring the bijectivity of the reparameterization. A gradient-based optimization approach is developed, including techniques such as constraint aggregation and adjoint sensitivity, for efficient, direct diffeomorphic reparameterization of landmarks to improve the group-wise shape correspondence. Numerical experiments on both synthetic and real 2D and 3D data sets demonstrate the efficiency and effectiveness of the proposed approach.
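The bijectivity requirement mentioned above can be illustrated with a simplified 1D analogue (my own sketch, not the dissertation's B-spline formulation): a reparameterization of [0, 1] stays invertible if it is constructed to be strictly increasing with fixed endpoints, for example by exponentiating free parameters and normalizing their cumulative sum.

```python
import numpy as np

def monotone_reparam(free_params):
    """Build a strictly increasing map [0,1] -> [0,1] from unconstrained
    parameters: exp() makes every step positive, and normalizing the
    cumulative sum pins the endpoints at 0 and 1. Monotonicity of the
    sampled map is a 1D analogue of reparameterization bijectivity."""
    steps = np.exp(np.asarray(free_params, dtype=float))   # strictly positive
    cum = np.concatenate(([0.0], np.cumsum(steps)))
    return cum / cum[-1]                                   # endpoints 0 and 1

u = monotone_reparam([-0.5, 1.2, 0.3, -2.0])
print(bool(np.all(np.diff(u) > 0)), u[0], u[-1])  # → True 0.0 1.0
```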
Ph.D. in Mechanical and Aerospace Engineering, May 2015
- Title
- Estimation of Future Glucose Concentrations with Subject-Specific Recursive Linear Models
- Creator
- Eren-oruklu, Meriyan, Cinar, Ali, Quinn, Lauretta, Smith, Donald
- Date
- 2009-04
- Publisher
- MARY ANN LIEBERT INC
- Description
-
Background: Estimation of future glucose concentrations is a crucial task for diabetes management. Predicted glucose values can be used for early hypoglycemic/hyperglycemic alarms or for adjustment of insulin injections or insulin infusion rates of manual or automated pumps. Continuous glucose monitoring (CGM) technologies provide glucose readings at a high frequency and consequently detailed insight into the subject's glucose variations. The objective of this research is to develop reliable subject-specific glucose prediction models using CGM data. Methods: Two separate patient databases collected under hospitalized (disturbance-free) and normal daily life conditions are used for validation of the proposed glucose prediction algorithm. Both databases consist of glucose concentration data collected at 5-min intervals using a CGM device. Using time-series analysis, low-order linear models are developed from patients' own CGM data. The time-series models are integrated with recursive identification and change detection methods, which enables dynamic adaptation of the model to inter-/intra-subject variability and glycemic disturbances. Prediction performance is evaluated in terms of glucose prediction error and Clarke Error Grid analysis (CG-EGA). Results: Prediction errors are significantly reduced with recursive identification of the models, and predictions are further improved with the inclusion of a parameter change detection method. CG-EGA shows accurate readings of 90% or more. Conclusions: A subject-specific glucose prediction strategy has been developed. Adding a change detection method to the recursive algorithm improves the prediction accuracy. The proposed modeling algorithm, with its small number of parameters, is a good candidate for installation in portable devices for early hypoglycemic/hyperglycemic alarms and for closing the glucose regulation loop with an insulin pump.
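The recursive identification step described above can be sketched with a generic recursive least squares (RLS) update for a low-order autoregressive model. The model order, forgetting factor, and test series below are illustrative assumptions, not the values or data used in the paper:

```python
import numpy as np

def rls_ar_predict(glucose, order=2, lam=0.98):
    """One-step-ahead prediction of a glucose series with a recursively
    identified AR(order) model, using exponential forgetting factor lam
    so the parameters adapt to recent dynamics."""
    theta = np.zeros(order)           # AR coefficients, updated recursively
    P = np.eye(order) * 1000.0        # inverse-covariance-like matrix
    preds = []
    for t in range(order, len(glucose)):
        phi = np.asarray(glucose[t - order:t][::-1], dtype=float)  # regressors
        preds.append(float(phi @ theta))          # predict before updating
        err = glucose[t] - phi @ theta            # prediction error
        k = P @ phi / (lam + phi @ P @ phi)       # gain vector
        theta = theta + k * err                   # parameter update
        P = (P - np.outer(k, phi @ P)) / lam      # covariance update
    return preds

# On a slowly rising noise-free series the recursive model tracks the trend,
# so the final one-step error is small.
series = [100 + 0.5 * t for t in range(60)]   # mg/dL at 5-min intervals
preds = rls_ar_predict(series)
print(abs(preds[-1] - series[-1]) < 2.0)
```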
- Title
- THE RELATIONSHIPS OF BIOLOGICAL, PSYCHOLOGICAL, AND SOCIAL FACTORS TO PERCEIVED SEIZURE SEVERITY IN INDIVIDUALS WITH TEMPORAL LOBE EPILEPSY
- Creator
- Fong, Wing Man
- Date
- 2012-12-17, 2012-12
- Description
-
Epilepsy is a chronic neurological condition that can cause negative consequences in all areas of life for people with the condition. Research evidence has led the goal of treatment to gradually shift from solely reducing the frequency of seizures to also decreasing individuals' perception of seizure severity. While several studies have suggested that psychological and social factors, in addition to epilepsy-related biological factors, can also affect how seizure severity is perceived, this area has received much less attention. This study therefore examined individuals' perception of seizure severity using a biopsychosocial framework, which takes into account biological, psychological, and social factors, to develop a comprehensive understanding of the perception of seizure severity. Fifty-seven individuals with a primary diagnosis of localization-related temporal lobe epilepsy, recruited from a neurology clinic at a tertiary medical center, participated in this study. They were interviewed for demographic information and were assessed using the Quality of Life in Epilepsy-89, Beck Depression Inventory, Stressful Life Events Scale, and Liverpool Seizure Severity Scale. A multiple regression analysis indicated that the number of antiepileptic drugs, perceived memory function, the number of stressful life events, and social support were all significant predictors of perceived seizure severity. The current findings support a biopsychosocial approach to understanding the perception of seizure severity among people with epilepsy. Clinical implications of these findings are discussed.
M.S. in Psychology, December 2012
- Title
- DEPTH MAP PROCESSING FOR MULTI-VIEW VIDEO PLUS DEPTH
- Creator
- Vijayanagar, Krisha Rao
- Date
- 2014, 2014-05
- Description
-
The world of multimedia and visual entertainment has grown in leaps and bounds in the past decade, with 3-D television being one of the biggest technologies. Amongst the several formats proposed for representing 3-D content, the multi-view video plus depth (MVD) format has gained a lot of interest in the past few years. MVD requires that each view of a particular scene be accompanied by a per-pixel depth map. This introduces new problems for compression and transmission of MVD content because a depth map has different characteristics from a color image. Keeping the MVD format and depth map characteristics in mind, we address three major problems that affect the MVD format: (1) depth map refinement; (2) depth map compression; and (3) novel view synthesis using the depth map at the decoder side. To refine a depth map, we propose a multi-resolution anisotropic diffusion algorithm that is optimized to run in real time, ensuring that the encoder does not suffer from additional latency. Next, we propose two unique solutions for compressing depth maps. We first propose a solution based on the Layered Depth Video (LDV) concept, using a rate-distortion optimized quadtree decomposition of the LDV with a novel two-mode block truncation code with improved prediction. We also propose a compression solution using compressive sensing (CS) concepts by creating a hybrid rate-optimized CS codec. This codec achieves two goals: firstly, block classification to ensure lower decoder complexity, and secondly, rate-distortion optimization of the measurement rate for each block that is to be compressively sensed. We then look at the view synthesis component of the MVD tool-chain, which is a time-sensitive process. Keeping decoding latency in mind, we propose a lookup-table based approach to the 3-D warping process with a simplified hole-filling algorithm that is not only competitive quality-wise with other schemes but also several times faster.
It is hoped that the presented techniques can be used successfully to create MVD architectures for applications that need low-complexity encoding solutions.
PH.D in Electrical Engineering, May 2014
- Title
- COMPUTER MODELING OF BREAST LESIONS AND STUDIES OF ANALYZER-BASED X-RAY IMAGING
- Creator
- Garcia, Luis De Sisternes
- Date
- 2011-11, 2011-12
- Description
-
Phase-contrast x-ray imaging is an emerging technique that promises to yield highly sensitive medical images of soft tissue, which is difficult to observe via conventional radiography given its low X-ray attenuation differences. One of these phase-contrast techniques, known as analyzer-based imaging, has demonstrated that highly detailed breast tissue images can be obtained using synchrotron radiation. However, synchrotron facilities are impractical for clinical use. This thesis presents studies and exposure considerations toward the application of analyzer-based imaging in a clinical environment, particularly in the context of breast imaging. It also introduces a computational breast lesion model that generates randomized three-dimensional phantoms which realistically follow the characteristics observed in real lesions. Moving analyzer-based imaging to clinical application requires consideration of photon noise, inherent in the use of a photon-limited conventional source. We summarize the statistical properties, in the presence of photon noise, of two popular analyzer-based imaging techniques known as diffraction-enhanced imaging (DEI) and multiple-image radiography (MIR). The statistics for MIR have not been previously derived and are introduced in this thesis. Comparison of the resulting statistical predictions with results obtained by Monte Carlo simulation validated the analysis. An expression for the maximum-likelihood (ML) solution for analyzer-based imaging is presented as a way of minimizing the effects of photon noise in the reconstruction of the object's absorption, refraction, and ultra-small-angle scattering properties, and more practical maximum-likelihood expectation-maximization (ML-EM) and maximum-a-posteriori expectation-maximization (MAP-EM) solutions are also introduced.
The behavior of the ML-EM and MAP-EM solutions was compared to the results produced by the five best-known analyzer-based reconstruction methods using computer simulations. The ML-EM and MAP-EM reconstructions proved closer to the theoretical values, as they do not suffer from the known limitations and approximations introduced by the other techniques. We introduce the development and evaluation of a new computational breast lesion phantom model that can simulate either masses or microcalcifications. The proposed tool allows the generation of a large number of randomized three-dimensional breast lesion simulations following desired characteristics normally used to describe breast lesions in clinical practice. The initial motivation for the development of this new phantom model was to enable the proposed evaluations of analyzer-based imaging. However, the model became a major focus of this thesis because it improves significantly upon those found in the previous literature. The proposed lesion model can be used for evaluation studies across different breast imaging techniques, as well as for training purposes, so it is our hope that it will become an important resource for the broader mammography research community. As part of the lesion modeling research, we also introduce methods to computationally modify experimental mammography and analyzer-based images of breast tissue so that they realistically present the generated tumor simulations embedded within their parenchyma. The realism of the simulated lesion images was evaluated by comparing 83 real tumor cases observed in mammograms with 83 constructed hybrid images in which simulated tumors matching the characteristics observed in the real cases were embedded, with healthy tissue acting as background. As a quantitative comparison, extracted features describing tumor shape and density showed no statistically significant differences between real and simulated tumors.
A known computational tumor classification technique, based on tumor shape observed in mammography, was implemented and showed no significant performance differences between real and simulated cases, and its performance correlated well with previously published results on real tumors. To measure realism for use in human observer studies, we conducted a reader study in which five experienced radiologists were asked to judge whether each of the 166 images was real or simulated by assigning a score on a 7-point scale. The results were analyzed in a multiple-reader multiple-case statistical framework. The conclusion of the study was that the readers' accuracy in assessing whether the lesions were real or simulated was not significantly better than random chance. This thesis also incorporates a reader study to evaluate the degree to which photon-limited analyzer-based images may be effective for visualization of breast cancer features. Our motivation was to establish the x-ray intensity that would be required to make these methods feasible, the purpose being to guide parameter selection for the future design of imaging hardware. We conducted a series of observer studies that quantify the performance of analyzer-based refraction images at different noise levels for the task of identifying subtle details present in breast tumors that are relevant to clinical diagnosis. The cases shown to the readers consisted of hybrid images in which simulated lesions of known characteristics were computationally embedded in real analyzer-based breast background images. The original phase-contrast data were obtained using synchrotron radiation and later modified to simulate the noise and blurring effects produced by a photon-limited source with a 300 μm aperture size, similar to those used in a laboratory environment.
Results showed that the analyzer-based imaging techniques statistically outperformed conventional mammography for the given task with an average of just 128 recorded photons per pixel in background image regions.
Ph.D. in Electrical Engineering, December 2011
- Title
- FEASIBILITY OF INDICATOR MICROORGANISMS IN ASSESSING THE EFFICACY OF SANITIZERS IN FRESH PRODUCE WASHING
- Creator
- Zhong, Zeyan
- Date
- 2016, 2016-05
- Description
-
Sanitizers have been widely used to control cross-contamination during fresh produce washing processes, and it is important to identify different approaches for monitoring the efficacy of sanitizer washing. In this study, our objective was to determine the feasibility of using indicator microorganisms to assess the efficacy of chlorine and peroxyacetic acid (PAA) in fresh-cut produce washing. In inactivation experiments, the survival of inoculated indicator microorganisms [total viable count (TVC), Enterobacteriaceae, and total coliforms] and E. coli O157:H7 was determined after exposure to different concentrations of chlorine or PAA in water with an added organic load (TOC ~200 mg/l). In small-scale washing experiments, bacterial transfer from lettuce leaves inoculated with indicators or E. coli O157:H7 to wash water containing chlorine or PAA, with or without organic matter, was evaluated. In the inactivation study, higher log reductions were observed for Enterobacteriaceae and total coliforms than for E. coli O157:H7 after chlorine treatment at an initial concentration of 30 ppm. In the bacterial transfer study [one inoculated leaf (~5 log CFU) in 30 ml of wash water], without additional organic matter, approximately 3.5 log total CFU of all bacterial groups was transferred into the wash water. Less than 1.5 log total CFU was detected for Enterobacteriaceae, total coliforms, and E. coli O157:H7 in 1 ppm chlorine washing. With introduced organic matter, similar microbial loads of Enterobacteriaceae, total coliforms, and E. coli O157:H7 were detected on washed leaves and in wash water at an initial chlorine concentration of 1 ppm; when the initial chlorine level increased to 10 ppm, less than 2 log total CFU of Enterobacteriaceae and total coliforms was detected, while about 3 log total CFU of E. coli O157:H7 transferred to the wash water, suggesting that chlorine at an initial concentration of 10 ppm could not prevent cross-contamination in the presence of an organic load.
The indicators showed different levels of survival and transfer than E. coli O157:H7 in this study and were therefore unreliable for effectively monitoring the behavior of the pathogen during chlorine washing in fresh produce processes. Enterobacteriaceae and total coliforms were more vulnerable to 40 ppm PAA treatment than E. coli O157:H7 in the PAA inactivation experiments. No survival, except for TVC, was observed after 80 ppm PAA inactivation. In the bacterial transfer study, 40 ppm PAA in wash water was unable to prevent cross-contamination of indicators and E. coli O157:H7 from leaves to water. Further investigation is needed to compare the bacterial transfer levels of E. coli O157:H7 and the indicators at higher PAA levels.
M.S. in Food Safety and Technology, May 2016
- Title
- Nutrition and Diabetes: Pathophysiology and Management
- Creator
- Opara, Emmanuel C.
- Date
- 2006
- Publisher
- CRC, Taylor & Francis
- Title
- COOPERATIVE BATCH SCHEDULING FOR HPC SYSTEMS
- Creator
- Yang, Xu
- Date
- 2017, 2017-05
- Description
-
The batch scheduler is an important piece of system software serving as the interface between users and HPC systems. Users submit their jobs via a batch scheduling portal, and the batch scheduler makes a scheduling decision for each job based on its request for system resources and on system availability. Jobs submitted to HPC systems are usually parallel applications whose lifecycle consists of multiple running phases, such as computation, communication, and input/output. Thus, the running of such parallel applications can involve various system resources, such as power, network bandwidth, I/O bandwidth, and storage, most of which are shared among concurrently running jobs. However, today's batch schedulers do not take the contention and interference between jobs over these resources into consideration when making scheduling decisions, which has been identified as one of the major culprits for both system and application performance variability. In this work, we propose a cooperative batch scheduling framework for HPC systems. The motivation of our work is to take into account important factors about jobs and the system, such as job power, job communication characteristics, and network topology, when making orchestrated scheduling decisions, in order to reduce the contention between concurrently running jobs and to alleviate the performance variability. Our contributions are the design and implementation of several coordinated scheduling models and algorithms for addressing chronic issues in HPC systems. The proposed models and algorithms have been evaluated by means of simulation, using workload traces and application communication traces collected from production HPC systems. Preliminary experimental results show that our models and algorithms can effectively improve application and overall system performance, reduce HPC facilities' operating costs, and alleviate the performance variability caused by job interference.
Ph.D. in Computer Science, May 2017
- Title
- APPLICABILITY OF THE INTERVAL ANALYSIS METHOD TO RELIABILITY ASSESSMENT OF STRUCTURES
- Creator
- Taha, Raguez A.
- Date
- 2012-11-26, 2012-12
- Description
-
This study focuses on the applicability of the interval method to the reliability analysis of structures. The study includes using traditional probability methods in several applications and comparing the results with those of interval analysis. Accordingly, the interval arithmetic rules are applied in several examples, and the results are compared with those from traditional probability theory. Specifically, the procedure for using interval methods was applied to developing the probability of failure for a simple structural member and for a link in a network system. Through three simple examples, the research shows that the interval method offers results comparable with those from the traditional first-order reliability method. The interval-based procedure, however, may offer advantages, especially in cases where the uncertainties in the various parameters describing the reliability of a system are large. In the presence of large uncertainties, the traditional methods of reliability analysis, based on a first-order approximation, may lose their accuracy. In these cases, the interval method may offer a simple approach with more accurate results.
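The interval comparison described above can be made concrete with a toy limit-state evaluation g = R − S for a structural member. This is a minimal sketch; the interval class and the numeric bounds below are illustrative, not taken from the thesis:

```python
class Interval:
    """Minimal closed-interval arithmetic on [lo, hi]."""
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        # [a,b] + [c,d] = [a+c, b+d]
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __sub__(self, other):
        # [a,b] - [c,d] = [a-d, b-c]
        return Interval(self.lo - other.hi, self.hi - other.lo)

    def __repr__(self):
        return f"[{self.lo}, {self.hi}]"

# Limit state g = R - S, with resistance R and load effect S each known
# only within bounds (illustrative values in kN).
R = Interval(180.0, 220.0)
S = Interval(120.0, 170.0)
g = R - S
print(g)  # → [10.0, 100.0]
# g.lo > 0: the member remains safe over the entire uncertainty range.
```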
M.S. in Civil Engineering, December 2012
- Title
- STUDY OF VITREOSCILLA HEMOGLOBIN VARIANTS PRODUCED BY RANDOM MUTAGENESIS
- Creator
- Lin, Xiaodan
- Date
- 2015, 2015-05
- Description
-
This study is focused on comparing the wild type and mutated versions of the Vitreoscilla hemoglobin gene (vgb). The purpose is to find out whether any of the vgb mutations provides an advantage in cell growth rate or in the expression level of the Vitreoscilla hemoglobin protein (VHb). A negative control, Escherichia coli DH5α (E. coli DH5α) bearing no pUC plasmid, as well as seven E. coli DH5α strains bearing different pUC-based plasmids, were tested in the experiments. Among these were one vector-only negative control (pUC18), one wild type positive control (pUC8:16, which carries wild type vgb), and five different pUC-borne vgb mutants (pUC-vgb M1, M2, M3, M4 and pUC18-vgb M3). To compare cell growth rates among these strains, the growth rate assay was carried out under three different conditions: (1) Luria-Bertani (LB) medium, aerobic conditions; (2) Terrific Broth (TB) medium, low oxygen conditions; and (3) TB medium, microaerobic conditions. In addition, the carbon monoxide (CO) difference spectra assay was conducted to measure functional VHb protein expression levels for the strains grown under aerobic conditions. In contrast to the results obtained by our Australian collaborators, our growth rate assay and CO difference spectra assay showed no growth advantage or higher expression level of functional VHb protein due to any of the vgb mutations. For further study of the vgb mutants, four different recombinant plasmids were constructed by cloning three types of mutated vgb (vgb M1, M3 and M4), as well as wild type vgb, into the prokaryotic expression vector pUC8 with ampicillin (Amp) resistance. After transformation into competent E. coli DH5α cells, these resulting strains, as well as the plasmid-free negative control (E. coli DH5α) and the vector-only negative control (E. coli DH5α bearing plasmid pUC8), were tested by the CO difference spectra assay. Except for strain E. coli DH5α [pUC8-vgb M3], which showed a slight increase in the VHb expression level, the strains bearing the other mutated vgbs did not demonstrate any elevation in VHb protein expression level compared to the positive control containing wild type vgb.
M.S. in Biology, May 2015
- Title
- Design of Kranti Home and School (Mumbai) for Sexually Trafficked Girls (sequence unknown), IPRO 343 - Project Plan
- Creator
- Arastu, Insiyah, Choe, Jonathan, Davis, Elena, Enriquez, Sheena, Kimball, Mark, Kleehammer, Christopher, Klimek, Joseph, Marcheva, Boryana, Martin, Amy, Patel, Hamida, Patel, Avni, Semerdjiev, Ivo, Sturrup, Terez
- Date
- 2009-12
- Description
-
To develop and design a home and center of education for girls who have been previously trafficked.
Project Plan for IPRO 343: Design of Kranti Home and School (Mumbai) for Sexually Trafficked Girls for the fall 2009 semester
- Title
- FUNCTIONAL ANALYSIS OF POTENTIAL PHOSPHORYLATION SITES IN BAXΔ2 UNIQUE OLIGOPEPTIDE
- Creator
- Tsai, Yu-tseng
- Date
- 2014, 2014-07
- Description
-
The tumor suppressor gene Bax plays a critical role in tumor progression through regulating cell apoptosis. Mutations in the BAX gene often silence its expression and abolish its pro-death ability. However, a unique Bax isoform, BaxΔ2, was recently discovered in these Bax-mutated cancer cells. The BaxΔ2 isoform shows higher pro-apoptotic activity than Baxα. Unlike the parental Baxα, BaxΔ2 does not target mitochondria and instead forms aggregates in the cytosol. A unique 10-amino-acid peptide in the N-terminus of the BaxΔ2 protein possibly functions as a special signal, and two serines in this region are predicted to be potential phosphorylation sites that regulate the protein's activity. To test this hypothesis, we mutated both serines (SS) into non-phosphorylatable alanines (AA) by site-directed mutagenesis. Both BaxΔ2 wild type (BaxΔ2-SS) and the mutant (BaxΔ2-AA) were tagged with GFP, which allowed us to monitor protein expression and cellular localization in live cells. We found that the distribution patterns of BaxΔ2-AA and BaxΔ2-SS were similar, both appearing as aggregates in the cytosol. The BaxΔ2-AA mutant also possessed pro-apoptotic activity similar to that of BaxΔ2-SS wild type. These results suggest that the two serines in the BaxΔ2 unique oligopeptide might not play a critical role in BaxΔ2 localization and pro-death activity under the current ectopic expression conditions. Further study is needed to better understand the contribution of phosphorylation to the unique behavior of BaxΔ2.
M.S. in Biology, July 2014
- Title
- The utilization of sunflower stalks in paper manufacture
- Creator
- Zavertnik, J.
- Date
- 2009, 1914
- Publisher
- Armour Institute of Technology
- Description
-
http://www.archive.org/details/utilizationofsun00zave
Thesis (B.S.)--Armour Institute of Technology
- Title
- POINT CLOUD FUSION BETWEEN AERIAL AND VEHICLE LIDAR
- Creator
- Guangyao, Ma
- Date
- 2015, 2015-05
- Description
-
Because of the increasing demand for precision in 3-D mapping, LiDAR is now used to build more accurate maps. Although great progress has been made in this area, some problems remain. The problem addressed in this thesis is that we have two point sources, aerial LiDAR data (points collected from an airplane) and vehicle LiDAR data (points collected from a vehicle), which have different densities and cannot be merged well. This process, fusion, is similar to registration; the difference is that the points to be merged are generated by different devices and share only a few point pairs in the same region. For example, aerial LiDAR data has a higher point density on roofs and the ground but a lower density on walls, while vehicle LiDAR data has many points on walls and the ground. Minimizing the difference between these two point sets is beneficial, since doing so is necessary for modeling, registration, and other downstream tasks. Therefore, this thesis aims to minimize the difference between these two data sources, a procedure of fusion. The main idea is to read the LiDAR data into a point-cloud data structure, resample the two clouds to similar densities, and select several corresponding special region pairs (which we call chunks, e.g., the median strip and the road boundaries) with sufficient interesting points to perform fusion. Interesting points are points with one or more special features among all points. The algorithm used to implement the fusion is ICP (the Iterative Closest Point algorithm). Unlike registration of point clouds, research in the fusion area is rare, so the existing algorithms are not well suited to this project. Because the original ICP algorithm does not work well here, some new algorithms were derived during this research: both the update equation and the objective function are modified.
In this thesis, PCL (Point Cloud Library) is mainly used to implement the basic functions, such as finding the nearest points and sampling the point cloud; the Eigen library is used to write the core functions (e.g., the modified Iterative Closest Point algorithm); the libLAS library is used for I/O operations; and MeshLab is used to visualize the point cloud after modification.
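For orientation, the classic point-to-point ICP loop that the thesis modifies can be sketched as follows. This is a minimal NumPy/SciPy illustration of standard ICP (nearest-neighbor correspondences plus a Kabsch/SVD rigid fit), not the thesis's modified update equation or objective function, and the function names here are the author's own, not PCL's:

```python
import numpy as np
from scipy.spatial import cKDTree

def best_rigid_transform(src, dst):
    """Kabsch algorithm: least-squares rotation R and translation t
    mapping corresponding points src -> dst (both N x 3)."""
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    U, _, Vt = np.linalg.svd(src_c.T @ dst_c)  # 3x3 cross-covariance
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t

def icp(source, target, iters=50, tol=1e-8):
    """Point-to-point ICP: iteratively align `source` onto `target`."""
    src = source.copy()
    tree = cKDTree(target)                     # fast nearest-neighbor lookup
    prev_err = np.inf
    for _ in range(iters):
        _, idx = tree.query(src)               # correspondences by proximity
        R, t = best_rigid_transform(src, target[idx])
        src = src @ R.T + t                    # apply the rigid update
        err = np.mean(np.linalg.norm(target[idx] - src, axis=1))
        if abs(prev_err - err) < tol:          # converged
            break
        prev_err = err
    return src
```

As the abstract notes, this baseline assumes the two clouds have comparable density and many true point pairs; the aerial/vehicle fusion setting violates both assumptions, which is why the thesis resamples densities, restricts matching to selected chunks, and modifies the update equation and objective function.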
M.S. in Computer Science, May 2015
- Title
- SPEECH INTELLIGIBILITY AND ACCENTS IN SPEECH-MEDIATED INTERFACES: RESULTS AND RECOMMENDATIONS
- Creator
- Lawrence, Halcyon M.
- Date
- 2013, 2013-07
- Description
-
There continues to be significant growth in the development and use of speech-mediated devices and technology products; however, there is no evidence that non-native English speech is used in these devices, despite the fact that English is now spoken by more non-native speakers than native speakers worldwide. This relative absence of non-native English speech in devices may be due in part to the costs associated with localizing speech devices, but it may also be attributable to the fact that not enough is known about user performance with accented speech in speech-mediated environments. In the absence of targeted research, developers may be relying on existing studies that focus on perception (impression) of accented speech as a basis for decision-making. However, perception paints only part of the picture when it comes to understanding how and why people perform in certain ways and in certain environments. Three studies were conducted to answer the following questions: (1) What are the acoustic-phonetic characteristics of negatively and positively perceived accented speech, and how are these characteristics related to markers of intelligible speech? (2) How do participants perform on different types of accented-speech tasks? (3) What is the relationship between user perception of accented speech and user performance in response to accented speech? and (4) How do participants perform on accented-speech tasks of varying complexity? Arising from this research, six recommendations are made for the use of accented speech in speech-mediated devices. The findings of this study also raise questions about inherent linguistic stereotypes that impact both our perceptions and our choices about the accents we want to hear on our speech devices; a discussion of whether and how these stereotypes can be altered and measured is included. Future research should examine the role of experienced non-native talkers in speech devices.
Results of study one demonstrated that some experienced non-native talkers were positively perceived by raters and may be good candidates for talkers in speech devices. Such a study would explicitly establish whether listeners consistently make native vs. non-native distinctions in their preferences or whether a prestige continuum emerges.
Ph.D. in Technical Communication, July 2013
- Title
- Learning the Lingo? Gender, Prestige and Linguistic Adaptation in Review Communities
- Creator
- Hemphill, Libby, Otterbacher, Jahna
- Date
- 2011-11-19, 2012-02
- Publisher
- ACM Press
- Description
-
Women and men communicate differently in both face-to-face and computer-mediated environments. We study linguistic patterns considered gendered in reviews contributed to the Internet Movie Database. IMDb has been described as a male-majority community, in which females contribute fewer reviews and enjoy less prestige than males. Analyzing reviews posted by prolific males and females, we hypothesize that females adjust their communication styles to be in sync with their male counterparts. We find evidence that while certain characteristics of “female language” persevere over time (e.g., frequent use of pronouns), others (e.g., hedging) decrease with time. Surprisingly, we also find that males often increase their use of “female” features. Our results indicate that, even when they resemble men's reviews linguistically, women's reviews still enjoy less prestige and smaller audiences.
- Title
- INVESTIGATION OF NOROVIRUS CROSS CONTAMINATION DURING FOOD SERVICE PROCEDURES USED IN PREPARATION OF FRESH PRODUCE
- Creator
- Suriyanarayanan, Annamalai
- Date
- 2011-11, 2011-12
- Description
-
Human norovirus (HuNoV) is considered an important cause of foodborne outbreaks, often attributed to the preparation of fresh produce by infected food handlers. In this investigation, methods for recovering murine norovirus (MNV-1), a surrogate for HuNoV, from food-preparation surfaces were optimized, and MNV-1 cross-contamination between various surfaces common in a food service setting was studied. Fifty microliters of MNV-1 was inoculated onto demarcated 1 x 1 inch squares of a polypropylene cutting board, a stainless steel knife, and spigots. After drying, MNV-1 was recovered from each surface using either a cotton swab, composite tissue, or sterile sponge in combination with different eluents: tissue culture growth medium, 3% beef extract, glycine buffer (50 mM glycine, 1% beef extract), stripping solution (0.04% K2HPO4, 1.01% Na2HPO4, 0.1% Triton X-100), and Earle's Balanced Salt Solution (EBSS). The eluent/recovery-tool combinations that recovered the highest percentage of MNV-1 from the cutting board were stripping solution/sponge (20%) and growth medium/swab (20%). The greatest recovery from the knife blade was achieved with the growth medium/composite tissue combination (43%), while recovery from spigots was greatest using the stripping solution/sponge (28%) and growth medium/sponge (27%) combinations. In the second phase of this investigation, human volunteers performed various tasks in order to quantify MNV-1 cross-contamination between various surfaces, including bare hands, fresh-cut lettuce, and spigots. The percentage of MNV-1 transfer from hands to spigots varied from 0.06% to 3.59%, from spigots to hands from 10% to 90.4%, and from lettuce to hands from 0.30% to 4.33%. The results of this investigation can be used to develop a model describing the transfer pattern of HuNoV between surfaces common in retail food service, and to develop educational materials for food service workers.
M.S. in Science, Food Safety, and Technology, December 2011
- Title
- Bucktown Artist's Live/Work Cooperative
- Creator
- Collier, T Scott
- Date
- 2011-05, 2011-05
- Description
-
By creating affordable live-work environments and a cooperative production facility, the project will reestablish the Bucktown neighborhood as a strong and viable artist community in the city of Chicago.