ERIC Educational Resources Information Center
Howard, Steven J.; Melhuish, Edward
2017-01-01
Several methods of assessing executive function (EF), self-regulation, language development, and social development in young children have been developed over previous decades. Yet new technologies make available methods of assessment not previously considered. In resolving conceptual and pragmatic limitations of existing tools, the Early Years…
Comparison of Past, Present, and Future Volume Estimation Methods for Tennessee
Stanley J. Zarnoch; Alexander Clark; Ray A. Souter
2003-01-01
Forest Inventory and Analysis 1999 survey data for Tennessee were used to compare stem-volume estimates obtained using a previous method, the current method, and newly developed taper models that will be used in the future. Compared to the current method, individual tree volumes were consistently underestimated with the previous method, especially for the hardwoods....
An Investigation of Agility Issues in Scrum Teams Using Agility Indicators
NASA Astrophysics Data System (ADS)
Pikkarainen, Minna; Wang, Xiaofeng
Agile software development methods have emerged and become increasingly popular in recent years; yet the issues encountered by software development teams that strive to achieve agility using agile methods have yet to be explored systematically. Building on a previous study that established a set of indicators of agility, this study investigates what issues are manifested in software development teams using agile methods. It focuses particularly on Scrum teams. In other words, the goal of the chapter is to evaluate Scrum teams using agility indicators and thereby to further validate the previously presented agility indicators with additional cases. A multiple case study research method is employed. The findings of the study reveal that teams using Scrum do not necessarily achieve agility in terms of team autonomy, sharing, stability, and embraced uncertainty. Possible reasons include a previous organizational plan-driven culture, resistance towards the Scrum roles, and changing resources.
USDA-ARS?s Scientific Manuscript database
An easy, rapid, and inexpensive method was developed to measure total, soluble, and insoluble starch in products at the factory and refinery, using microwave-assisted neutralization chemistry. The method was optimized using the previously developed USDA Starch Research method as a reference. Optimal...
RESEARCH ASSOCIATED WITH THE DEVELOPMENT OF EPA METHOD 552.2
The work presented in this paper entails the development of a method for haloacetic acid (HAA) analysis, Environmental Protection Agency (EPA) Method 552.2, that improves the safety and efficiency of previous methods and incorporates three additional trihalogenated acetic acids: b...
Solution and reasoning reuse in space planning and scheduling applications
NASA Technical Reports Server (NTRS)
Verfaillie, Gerard; Schiex, Thomas
1994-01-01
In the space domain, as in other domains, CSP (Constraint Satisfaction Problem) techniques are increasingly used to represent and solve planning and scheduling problems. But these techniques have been developed to solve CSPs that are composed of fixed sets of variables and constraints, whereas many planning and scheduling problems are dynamic. It is therefore important to develop methods that allow a new solution to be rapidly found, as close as possible to the previous one, when some variables or constraints are added or removed. After presenting some existing approaches, this paper proposes a simple and efficient method developed on the basis of the dynamic backtracking algorithm. This method allows a previous solution and its reasoning to be reused in the framework of a CSP that is close to the previous one. Some experimental results on general random CSPs and on operation scheduling problems for remote sensing satellites are given.
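As an illustration of the solution-reuse idea described above, here is a minimal Python sketch, not the authors' dynamic-backtracking algorithm: a plain backtracking search that tries each variable's value from the previous solution first, so the new solution stays close to the old one when the problem changes slightly. The constraint representation and all names are illustrative assumptions.

```python
# Minimal sketch of solution reuse in CSP search, assuming constraints are
# callables that return True for any consistent *partial* assignment.
# Plain chronological backtracking with a value-ordering heuristic, not the
# dynamic-backtracking method developed in the paper.

def solve(variables, domains, constraints, previous=None, assignment=None):
    assignment = assignment if assignment is not None else {}
    if len(assignment) == len(variables):
        return dict(assignment)
    var = next(v for v in variables if v not in assignment)
    values = list(domains[var])
    if previous and previous.get(var) in values:
        # Try the previous solution's value first so the repaired solution
        # stays as close as possible to the old one.
        values.remove(previous[var])
        values.insert(0, previous[var])
    for value in values:
        assignment[var] = value
        if all(c(assignment) for c in constraints):
            result = solve(variables, domains, constraints, previous, assignment)
            if result is not None:
                return result
        del assignment[var]
    return None
```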
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yagnik, Gargey B.
The main goal of the presented research is the development of nanoparticle-based matrix-assisted laser desorption/ionization mass spectrometry (MALDI-MS). This dissertation includes the application of previously developed data acquisition methods, the development of novel sample preparation methods, the application and comparison of novel nanoparticle matrices, and a comparison of two nanoparticle matrix application methods for MALDI-MS and MALDI-MS imaging.
Comparison of Methods for Determining Boundary Layer Edge Conditions for Transition Correlations
NASA Technical Reports Server (NTRS)
Liechty, Derek S.; Berry, Scott A.; Hollis, Brian R.; Horvath, Thomas J.
2003-01-01
Data previously obtained for the X-33 in the NASA Langley Research Center 20-Inch Mach 6 Air Tunnel have been reanalyzed to compare methods for determining boundary layer edge conditions for use in transition correlations. The experimental results were previously obtained utilizing the phosphor thermography technique to monitor the status of the boundary layer downstream of discrete roughness elements via global heat transfer images of the X-33 windward surface. A boundary layer transition correlation was previously developed for this data set using boundary layer edge conditions calculated with an inviscid/integral boundary layer approach. An algorithm was written in the present study to extract boundary layer edge quantities from higher fidelity viscous computational fluid dynamic solutions to develop transition correlations that account for viscous effects on vehicles of arbitrary complexity. The boundary layer transition correlation developed for the X-33 from the viscous solutions is compared to the previous boundary layer transition correlations. It is shown that the boundary layer edge conditions calculated using an inviscid/integral boundary layer approach are significantly different from those extracted from viscous computational fluid dynamic solutions. The present results demonstrate the differences obtained in correlating transition data using different computational methods.
NASA Astrophysics Data System (ADS)
Leiserson, Mark D. M.; Tatar, Diana; Cowen, Lenore J.; Hescott, Benjamin J.
A new method based on a mathematically natural local search framework for max cut is developed to uncover functionally coherent module and BPM motifs in high-throughput genetic interaction data. Unlike previous methods which also consider physical protein-protein interaction data, our method utilizes genetic interaction data only; this becomes increasingly important as high-throughput genetic interaction data is becoming available in settings where less is known about physical interaction data. We compare modules and BPMs obtained to previous methods and across different datasets. Despite needing no physical interaction information, the BPMs produced by our method are competitive with previous methods. Biological findings include a suggested global role for the prefoldin complex and a SWR subcomplex in pathway buffering in the budding yeast interactome.
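The local-search kernel for max cut that underlies such module-finding methods can be sketched in a few lines of Python. This is only the generic single-flip heuristic on a weighted genetic-interaction graph (nonnegative weights assumed), not the authors' full BPM-extraction pipeline; all names are illustrative.

```python
import random

def local_search_max_cut(adj, seed=0):
    """Single-flip local search for weighted max cut.

    adj: dict u -> {v: weight} (symmetric, nonnegative weights, so each
    improving flip raises the total cut weight and the loop terminates).
    Returns a dict node -> 0/1 giving a locally optimal 2-partition; the
    two sides are candidate 'between-pathway' gene sets.
    """
    rng = random.Random(seed)
    side = {v: rng.choice((0, 1)) for v in adj}
    improved = True
    while improved:
        improved = False
        for v in adj:
            same = sum(w for u, w in adj[v].items() if side[u] == side[v])
            other = sum(w for u, w in adj[v].items() if side[u] != side[v])
            if same > other:   # flipping v moves more weight into the cut
                side[v] = 1 - side[v]
                improved = True
    return side
```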
Farer, Leslie J; Hayes, John M
2005-01-01
A new method has been developed for the determination of emamectin benzoate in fish feed. The method uses a wet extraction, cleanup by solid-phase extraction, and quantitation and separation by liquid chromatography (LC). In this paper, we compare the performance of this method with that of a previously reported LC assay for the determination of emamectin benzoate in fish feed. Although similar to the previous method, the new procedure uses a different sample pretreatment, wet extraction, and quantitation method. The performance of the new method was compared with that of the previously reported method by analyses of 22 medicated feed samples from various commercial sources. A comparison of the results presented here reveals slightly lower assay values obtained with the new method. Although a paired sample t-test indicates the difference in results is significant, this difference is within the method precision of either procedure.
The application of contraction theory to an iterative formulation of electromagnetic scattering
NASA Technical Reports Server (NTRS)
Brand, J. C.; Kauffman, J. F.
1985-01-01
Contraction theory is applied to an iterative formulation of electromagnetic scattering from periodic structures and a computational method for insuring convergence is developed. A short history of spectral (or k-space) formulation is presented with an emphasis on application to periodic surfaces. To insure a convergent solution of the iterative equation, a process called the contraction corrector method is developed. Convergence properties of previously presented iterative solutions to one-dimensional problems are examined utilizing contraction theory and the general conditions for achieving a convergent solution are explored. The contraction corrector method is then applied to several scattering problems including an infinite grating of thin wires with the solution data compared to previous works.
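To make the contraction idea concrete: an iterative scattering equation x = T(x) converges when the map is a contraction, and a relaxation parameter can be tuned to enforce that. The Python sketch below is illustrative only, not the paper's contraction corrector formulation: it estimates the contraction constant from successive residuals and damps the update when the iteration stops contracting.

```python
import numpy as np

def relaxed_fixed_point(T, x0, alpha=0.5, tol=1e-8, max_iter=500):
    """Iterate x <- (1 - alpha)*x + alpha*T(x), adapting alpha so the map
    behaves as a contraction (ratio of successive residual norms < 1)."""
    x = np.asarray(x0, dtype=float)
    prev_res = None
    for _ in range(max_iter):
        x_new = (1.0 - alpha) * x + alpha * T(x)
        res = np.linalg.norm(x_new - x)
        if prev_res is not None and prev_res > 0 and res / prev_res >= 1.0:
            alpha *= 0.5   # restore contraction by damping the update
        if res < tol:
            return x_new
        x, prev_res = x_new, res
    raise RuntimeError("iteration failed to contract to tolerance")
```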
Matsumoto, Hirotaka; Kiryu, Hisanori
2016-06-08
Single-cell technologies make it possible to quantify the comprehensive states of individual cells, and have the power to shed light on cellular differentiation in particular. Although several methods have been developed to fully analyze single-cell expression data, there is still room for improvement in the analysis of differentiation. In this paper, we propose a novel method, SCOUP, to elucidate the differentiation process. Unlike previous dimension-reduction-based approaches, SCOUP describes the dynamics of gene expression throughout differentiation directly, including the degree of differentiation of a cell (in pseudo-time) and cell fate. SCOUP is superior to previous methods with respect to pseudo-time estimation, especially for single-cell RNA-seq. SCOUP also estimates cell lineage more accurately than previous methods, especially for cells at an early stage of bifurcation. In addition, SCOUP can be applied to various downstream analyses. As an example, we propose a novel correlation calculation method for elucidating regulatory relationships among genes. We apply this method to single-cell RNA-seq data and detect a candidate key regulator for differentiation, as well as clusters in a correlation network that are not detected with conventional correlation analysis. We develop a stochastic process-based method, SCOUP, to analyze single-cell expression data throughout differentiation. SCOUP can estimate pseudo-time and cell lineage more accurately than previous methods. We also propose a novel correlation calculation method based on SCOUP. SCOUP is a promising approach for further single-cell analysis and is available at https://github.com/hmatsu1226/SCOUP.
Fast Construction of Near Parsimonious Hybridization Networks for Multiple Phylogenetic Trees.
Mirzaei, Sajad; Wu, Yufeng
2016-01-01
Hybridization networks represent plausible evolutionary histories of species that are affected by reticulate evolutionary processes. An established computational problem on hybridization networks is constructing the most parsimonious hybridization network such that each of the given phylogenetic trees (called gene trees) is "displayed" in the network. There have been several previous approaches to this NP-hard problem, including an exact method and several heuristics. However, the exact method is only applicable to a limited range of data, and heuristic methods can be less accurate and sometimes slow. In this paper, we develop a new algorithm for constructing near parsimonious networks for multiple binary gene trees. This method is more efficient for large numbers of gene trees than previous heuristics. It also produces more parsimonious results than a previous method on many simulated datasets as well as a real biological dataset. We also show that our method produces topologically more accurate networks for many datasets.
NASA Technical Reports Server (NTRS)
Craig, R. R., Jr.
1985-01-01
A component mode synthesis method for damped structures was developed and modal test methods were explored which could be employed to determine the relevant parameters required by the component mode synthesis method. Research was conducted on the following topics: (1) Development of a generalized time-domain component mode synthesis technique for damped systems; (2) Development of a frequency-domain component mode synthesis method for damped systems; and (3) Development of a system identification algorithm applicable to general damped systems. Abstracts are presented of the major publications which have been previously issued on these topics.
A Rapid Dialysis Method for Analysis of Artificial Sweeteners in Foods (2nd Report).
Tahara, Shoichi; Yamamoto, Sumiyo; Yamajima, Yukiko; Miyakawa, Hiroyuki; Uematsu, Yoko; Monma, Kimio
2017-01-01
Following the previous report, a rapid dialysis method was developed for the extraction and purification of four artificial sweeteners, namely, sodium saccharin (Sa), acesulfame potassium (AK), aspartame (APM), and dulcin (Du), which are present in various foods. The method was evaluated by the addition of 0.02 g/kg of these sweeteners to a cookie sample, in the same manner as in the previous report. Revisions from the previous method were: reduction of the total dialysis volume from 200 to 100 mL, change of tube length from 55 to 50 cm, change of dialysate from 0.01 mol/L hydrochloric acid solution containing 10% sodium chloride to 30% methanol solution, and change of dialysis conditions from ambient temperature with occasional shaking to 50°C with shaking at 160 rpm. As a result of these revisions, the recovery reached 99.3-103.8% with one-hour dialysis. The recovery yields obtained were comparable to those of the previous method with four-hour dialysis.
CFD Analysis of the SBXC Glider Airframe
2016-06-01
...based mathematically on finite element methods. To validate and verify the methodology developed, a mathematical comparison was made with the previous research data...greater than 15 m/s. Subject terms: finite element method, computational fluid dynamics, Y Plus, mesh element quality, aerodynamic data...
Skeletal Mechanism Generation of Surrogate Jet Fuels for Aeropropulsion Modeling
NASA Astrophysics Data System (ADS)
Sung, Chih-Jen; Niemeyer, Kyle E.
2010-05-01
A novel implementation for the skeletal reduction of large detailed reaction mechanisms using the directed relation graph with error propagation and sensitivity analysis (DRGEPSA) is developed and presented with skeletal reductions of two important hydrocarbon components, n-heptane and n-decane, relevant to surrogate jet fuel development. DRGEPSA integrates two previously developed methods, directed relation graph-aided sensitivity analysis (DRGASA) and directed relation graph with error propagation (DRGEP), by first applying DRGEP to efficiently remove many unimportant species prior to sensitivity analysis to further remove unimportant species, producing an optimally small skeletal mechanism for a given error limit. It is illustrated that the combination of the DRGEP and DRGASA methods allows the DRGEPSA approach to overcome the weaknesses of each previous method, specifically that DRGEP cannot identify all unimportant species and that DRGASA shields unimportant species from removal.
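The graph stage of DRGEP can be illustrated compactly. Under the usual definition, the overall interaction coefficient of species B with respect to a target A is the maximum over all graph paths of the product of direct interaction coefficients, which a max-product variant of Dijkstra's algorithm computes. The sketch below assumes coefficients in [0, 1] and omits the error-propagation bookkeeping and the sensitivity-analysis stage; names are illustrative.

```python
import heapq

def drgep_overall_coefficients(direct, target):
    """Max-product Dijkstra for DRGEP-style overall interaction coefficients.

    direct: dict u -> {v: r_uv} with direct coefficients r_uv in [0, 1].
    Returns R[v] = max over paths from target to v of the product of r's.
    Species whose R stays below a cutoff for every target are candidates
    for removal (screened afterwards by sensitivity analysis in DRGEPSA).
    """
    R = {target: 1.0}
    heap = [(-1.0, target)]
    while heap:
        neg_r, u = heapq.heappop(heap)
        r_u = -neg_r
        if r_u < R.get(u, 0.0):
            continue                      # stale heap entry
        for v, r_uv in direct.get(u, {}).items():
            r_v = r_u * r_uv
            if r_v > R.get(v, 0.0):
                R[v] = r_v
                heapq.heappush(heap, (-r_v, v))
    return R
```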
SOLID-FUEL HOUSEHOLD COOK STOVES: CHARACTERIZATION OF PERFORMANCE AND EMISSIONS
Previous studies have shown that some fuel-efficient solid-fuel cook stoves have had worse pollutant emissions of PICs (products of incomplete combustion) than traditional cooking methods. Better stoves have been developed to reduce emissions, but test results have not previously...
Won, Jonghun; Lee, Gyu Rie; Park, Hahnbeom; Seok, Chaok
2018-06-07
The second extracellular loops (ECL2s) of G-protein-coupled receptors (GPCRs) are often involved in GPCR functions, and their structures have important implications in drug discovery. However, structure prediction of ECL2 is difficult because of its long length and the structural diversity among different GPCRs. In this study, a new ECL2 conformational sampling method involving both template-based and ab initio sampling was developed. Inspired by the observation of similar ECL2 structures of closely related GPCRs, a template-based sampling method employing loop structure templates selected from the structure database was developed. A new metric for evaluating similarity of the target loop to templates was introduced for template selection. An ab initio loop sampling method was also developed to treat cases without highly similar templates. The ab initio method is based on the previously developed fragment assembly and loop closure method. A new sampling component that takes advantage of secondary structure prediction was added. In addition, a conserved disulfide bridge restraining ECL2 conformation was predicted and analytically incorporated into sampling, reducing the effective dimension of the conformational search space. The sampling method was combined with an existing energy function for comparison with previously reported loop structure prediction methods, and the benchmark test demonstrated outstanding performance.
NASA Technical Reports Server (NTRS)
Stahara, S. S.; Elliott, J. P.; Spreiter, J. R.
1983-01-01
An investigation was conducted to continue the development of perturbation procedures and associated computational codes for rapidly determining approximations to nonlinear flow solutions, with the purpose of establishing a method for minimizing computational requirements associated with parametric design studies of transonic flows in turbomachines. The results reported here concern the extension of the previously developed successful method for single parameter perturbations to simultaneous multiple-parameter perturbations, and the preliminary application of the multiple-parameter procedure in combination with an optimization method to blade design/optimization problem. In order to provide as severe a test as possible of the method, attention is focused in particular on transonic flows which are highly supercritical. Flows past both isolated blades and compressor cascades, involving simultaneous changes in both flow and geometric parameters, are considered. Comparisons with the corresponding exact nonlinear solutions display remarkable accuracy and range of validity, in direct correspondence with previous results for single-parameter perturbations.
Application of Raman spectroscopy for cervical dysplasia diagnosis
Kanter, Elizabeth M.; Vargis, Elizabeth; Majumder, Shovan; Keller, Matthew D.; Woeste, Emily; Rao, Gautam G.; Mahadevan-Jansen, Anita
2014-01-01
Cervical cancer is the second most common malignancy among women worldwide, with over 490,000 cases diagnosed and 274,000 deaths each year. Although current screening methods have dramatically reduced cervical cancer incidence and mortality in developed countries, a "See and Treat" method would be preferred, especially in developing countries. Results from our previous work have suggested that Raman spectroscopy can be used to detect cervical precancers; however, with a classification accuracy of 88%, it was not clinically applicable. In this paper, we describe how incorporating a woman's hormonal status, particularly the point in menstrual cycle and menopausal state, into our previously developed classification algorithm improves the accuracy of our method to 94%. The results of this paper bring Raman spectroscopy one step closer to being utilized in a clinical setting to diagnose cervical dysplasia. [Figure: posterior probabilities of class membership, as determined by MRDF-SMLR, for patients regardless of menopausal status and for pre-menopausal patients only.] PMID:19343687
Self-calibrating models for dynamic monitoring and diagnosis
NASA Technical Reports Server (NTRS)
Kuipers, Benjamin
1994-01-01
The present goal in qualitative reasoning is to develop methods for automatically building qualitative and semiquantitative models of dynamic systems and to use them for monitoring and fault diagnosis. The qualitative approach to modeling provides a guarantee of coverage while our semiquantitative methods support convergence toward a numerical model as observations are accumulated. We have developed and applied methods for automatic creation of qualitative models, developed two methods for obtaining tractable results on problems that were previously intractable for qualitative simulation, and developed more powerful methods for learning semiquantitative models from observations and deriving semiquantitative predictions from them. With these advances, qualitative reasoning comes significantly closer to realizing its aims as a practical engineering method.
Relationship between Defect Size and Fatigue Life Distributions in Al-7 Pct Si-Mg Alloy Castings
NASA Astrophysics Data System (ADS)
Tiryakioğlu, Murat
2009-07-01
A new method for predicting the variability in fatigue life of castings was developed by combining the size distribution for the fatigue-initiating defects and a fatigue life model based on the Paris-Erdoğan law for crack propagation. Two datasets for the fatigue-initiating defects in Al-7 pct Si-Mg alloy castings, reported previously in the literature, were used to demonstrate that (1) the sizes of fatigue-initiating defects follow the Gumbel distribution; (2) the crack propagation model developed previously provides respectable fits to experimental data; and (3) the method developed in the present study expresses the variability in both datasets almost as well as the lognormal distribution and better than the Weibull distribution.
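The two ingredients combine naturally in a Monte Carlo sketch: sample initial defect sizes from a Gumbel (largest extreme value) distribution and push each through the closed-form Paris-Erdoğan life integral. The Python below uses made-up parameter values purely for illustration; it is not the paper's calibrated model.

```python
import numpy as np

def fatigue_life(a_i, a_c, C, m, dsig, Y=0.65):
    """Cycles to grow a crack from a_i to a_c under da/dN = C*(dK)^m with
    dK = Y*dsig*sqrt(pi*a); closed form valid for m != 2, consistent units."""
    denom = C * (Y * dsig * np.sqrt(np.pi)) ** m * (m / 2.0 - 1.0)
    return (a_i ** (1.0 - m / 2.0) - a_c ** (1.0 - m / 2.0)) / denom

def gumbel_defects(mu, beta, size, seed=0):
    """Sample defect sizes from the Gumbel CDF F(a) = exp(-exp(-(a - mu)/beta))."""
    u = np.random.default_rng(seed).uniform(size=size)
    return mu - beta * np.log(-np.log(u))

# Illustrative (made-up) parameters: propagate the defect-size distribution
# through the life model to obtain a fatigue-life distribution.
a0 = gumbel_defects(mu=120e-6, beta=40e-6, size=10_000)        # metres
N_f = fatigue_life(a0, a_c=5e-3, C=1e-11, m=4.0, dsig=100.0)   # cycles
```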
Nonholonomic Hamiltonian Method for Meso-macroscale Simulations of Reacting Shocks
NASA Astrophysics Data System (ADS)
Fahrenthold, Eric; Lee, Sangyup
2015-06-01
The seamless integration of macroscale, mesoscale, and molecular scale models of reacting shock physics has been hindered by dramatic differences in the model formulation techniques normally used at different scales. In recent research the authors have developed the first unified discrete Hamiltonian approach to multiscale simulation of reacting shock physics. Unlike previous work, the formulation employs reacting thermomechanical Hamiltonian formulations at all scales, including the continuum. Unlike previous work, the formulation employs a nonholonomic modeling approach to systematically couple the models developed at all scales. Example applications of the method show meso-macroscale shock-to-detonation simulations in nitromethane and RDX. Research supported by the Defense Threat Reduction Agency.
Validity in Mixed Methods Research in Education: The Application of Habermas' Critical Theory
ERIC Educational Resources Information Center
Long, Haiying
2017-01-01
The mixed methods approach has developed into the third methodological movement in educational research. Validity in mixed methods research, though an important issue, has not been examined as extensively as that of quantitative and qualitative research. Additionally, the previous discussions of validity in mixed methods research focus on research…
Extending the solvent-free MALDI sample preparation method.
Hanton, Scott D; Parees, David M
2005-01-01
Matrix-assisted laser desorption/ionization (MALDI) mass spectrometry is an important technique to characterize many different materials, including synthetic polymers. MALDI mass spectral data can be used to determine the polymer average molecular weights, repeat units, and end groups. One of the key issues in traditional MALDI sample preparation is making good solutions of the analyte and the matrix. Solvent-free sample preparation methods have been developed to address these issues. Previous results of solvent-free or dry-prepared samples show some advantages over traditional wet sample preparation methods. Although the published solvent-free sample preparation methods produced excellent mass spectra, we found the method to be very time-consuming, with significant tool cleaning that presents a significant possibility of cross contamination. To address these issues, we developed an extension of the solvent-free method that replaces the mortar-and-pestle grinding with ball milling the sample in a glass vial with two small steel balls. This new method generates mass spectra of quality equal to the previous methods, but has significant advantages in productivity, eliminates cross contamination, and is applicable to liquid and soft or waxy analytes.
Modeling, implementation, and validation of arterial travel time reliability.
DOT National Transportation Integrated Search
2013-11-01
Previous research funded by the Florida Department of Transportation (FDOT) developed a method for estimating travel time reliability for arterials. This method was not initially implemented or validated using field data. This project evaluated and r...
On finite element methods for the Helmholtz equation
NASA Technical Reports Server (NTRS)
Aziz, A. K.; Werschulz, A. G.
1979-01-01
The numerical solution of the Helmholtz equation is considered via finite element methods. A two-stage method which gives the same accuracy in the computed gradient as in the computed solution is discussed. Error estimates for the method using a newly developed proof are given, and the computational considerations which show this method to be computationally superior to previous methods are presented.
Evolution of the SCS curve number method and its applications to continuous runoff simulation
USDA-ARS?s Scientific Manuscript database
The Natural Resources Conservation Service (NRCS) [previously Soil Conservation Service (SCS)] developed the SCS runoff curve-number (CN) method for estimating direct runoff from storm rainfall. The NRCS uses the CN method for designing structures and for evaluating their effectiveness. Structural...
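For reference, the standard curve-number relations the abstract refers to are (in inch units, with the customary initial abstraction ratio of 0.2):

```latex
S = \frac{1000}{\mathrm{CN}} - 10, \qquad I_a = 0.2\,S, \qquad
Q =
\begin{cases}
\dfrac{(P - I_a)^2}{P - I_a + S}, & P > I_a,\\[6pt]
0, & P \le I_a,
\end{cases}
```

where P is storm rainfall, Q direct runoff, and S the potential maximum retention.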
43 CFR 11.64 - Injury determination phase-testing and sampling methods.
Code of Federal Regulations, 2012 CFR
2012-10-01
.... In developing these objectives, the availability of information from response actions relating to the ... computer code (if any), test cases proving the code works, and any alteration of previously documented code made to adapt the code ...
43 CFR 11.64 - Injury determination phase-testing and sampling methods.
Code of Federal Regulations, 2014 CFR
2014-10-01
.... In developing these objectives, the availability of information from response actions relating to the ... computer code (if any), test cases proving the code works, and any alteration of previously documented code made to adapt the code ...
43 CFR 11.64 - Injury determination phase-testing and sampling methods.
Code of Federal Regulations, 2013 CFR
2013-10-01
.... In developing these objectives, the availability of information from response actions relating to the ... computer code (if any), test cases proving the code works, and any alteration of previously documented code made to adapt the code ...
NASA Technical Reports Server (NTRS)
Rosenfeld, Moshe
1990-01-01
The development, validation and application of a fractional step solution method of the time-dependent incompressible Navier-Stokes equations in generalized coordinate systems are discussed. A solution method that combines a finite-volume discretization with a novel choice of the dependent variables and a fractional step splitting to obtain accurate solutions in arbitrary geometries was previously developed for fixed grids. In the present research effort, this solution method is extended to include more general situations, including cases with moving grids. The numerical techniques are enhanced to gain efficiency and generality.
Development of a model for predicting NASA/MSFC program success
NASA Technical Reports Server (NTRS)
Riggs, Jeffrey; Miller, Tracy; Finley, Rosemary
1990-01-01
Research conducted during the execution of a previous contract (NAS8-36955/0039) firmly established the feasibility of developing a tool to aid decision makers in predicting the potential success of proposed projects. The final report from that investigation contains an outline of the method to be applied in developing this Project Success Predictor Model. As a follow-on to the previous study, this report describes in detail the development of this model and includes full explanation of the data-gathering techniques used to poll expert opinion. The report includes the presentation of the model code itself.
Information from the previously approved extended abstract: A standardized area source measurement method based on mobile tracer correlation was used for methane emissions assessment in 52 field deployments...
Mark J. Ducey; Jeffrey H. Gove; Harry T. Valentine
2008-01-01
Perpendicular distance sampling (PDS) is a fast probability-proportional-to-size method for inventory of downed wood. However, previous development of PDS had limited the method to estimating only one variable (such as volume per hectare, or surface area per hectare) at a time. Here, we develop a general design-unbiased estimator for PDS. We then show how that...
On fixed-area plot sampling for downed coarse woody debris
Jeffrey H. Gove; Paul C. Van Deusen
2011-01-01
The use of fixed-area plots for sampling down coarse woody debris is reviewed. A set of clearly defined protocols for two previously described methods is established and a new method, which we call the 'sausage' method, is developed. All methods (protocols) are shown to be unbiased for volume estimation, but not necessarily for estimation of population...
Polidori, David; Rowley, Clarence
2014-07-22
The indocyanine green dilution method is one of the methods available to estimate plasma volume, although some researchers have questioned the accuracy of this method. We developed a new, physiologically based mathematical model of indocyanine green kinetics that more accurately represents indocyanine green kinetics during the first few minutes postinjection than what is assumed when using the traditional mono-exponential back-extrapolation method. The mathematical model is used to develop an optimal back-extrapolation method for estimating plasma volume based on simulated indocyanine green kinetics obtained from the physiological model. Results from a clinical study using the indocyanine green dilution method in 36 subjects with type 2 diabetes indicate that the estimated plasma volumes are considerably lower when using the traditional back-extrapolation method than when using the proposed back-extrapolation method (mean (standard deviation) plasma volume = 26.8 (5.4) mL/kg for the traditional method vs 35.1 (7.0) mL/kg for the proposed method). The results obtained using the proposed method are more consistent with previously reported plasma volume values. Based on the more physiological representation of indocyanine green kinetics and greater consistency with previously reported plasma volume values, the new back-extrapolation method is proposed for use when estimating plasma volume using the indocyanine green dilution method.
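For context, the traditional mono-exponential back-extrapolation the paper revises can be sketched in a few lines of Python. The fitting window and all names below are illustrative choices, not the paper's optimized method.

```python
import numpy as np

def plasma_volume_backextrap(t_min, conc_mg_per_L, dose_mg, window=(2.0, 5.0)):
    """Fit log-concentration over an early time window, extrapolate to t = 0,
    and estimate plasma volume as PV = dose / C0 (litres)."""
    t = np.asarray(t_min, dtype=float)
    c = np.asarray(conc_mg_per_L, dtype=float)
    mask = (t >= window[0]) & (t <= window[1])
    slope, intercept = np.polyfit(t[mask], np.log(c[mask]), 1)
    c0 = np.exp(intercept)   # back-extrapolated concentration at t = 0
    return dose_mg / c0
```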
Probe Heating Method for the Analysis of Solid Samples Using a Portable Mass Spectrometer
Kumano, Shun; Sugiyama, Masuyuki; Yamada, Masuyoshi; Nishimura, Kazushige; Hasegawa, Hideki; Morokuma, Hidetoshi; Inoue, Hiroyuki; Hashimoto, Yuichiro
2015-01-01
We previously reported on the development of a portable mass spectrometer for the onsite screening of illicit drugs, but our previous sampling system could only be used for liquid samples. In this study, we report on an attempt to develop a probe heating method that also permits solid samples to be analyzed using a portable mass spectrometer. An aluminum rod is used as the sampling probe. The powdered sample is affixed to the sampling probe or a droplet of sample solution is placed on the tip of the probe and dried. The probe is then placed on a heater to vaporize the sample. The vapor is then introduced into the portable mass spectrometer and analyzed. With the heater temperature set to 130°C, the developed system detected 1 ng of methamphetamine, 1 ng of amphetamine, 3 ng of 3,4-methylenedioxymethamphetamine, 1 ng of 3,4-methylenedioxyamphetamine, and 0.3 ng of cocaine. Even from mixtures consisting of clove powder and methamphetamine powder, methamphetamine ions were detected by tandem mass spectrometry. The developed probe heating method provides a simple method for the analysis of solid samples. A portable mass spectrometer incorporating this method would thus be useful for the onsite screening of illicit drugs. PMID:26819909
3D temporal subtraction on multislice CT images using nonlinear warping technique
NASA Astrophysics Data System (ADS)
Ishida, Takayuki; Katsuragawa, Shigehiko; Kawashita, Ikuo; Kim, Hyounseop; Itai, Yoshinori; Awai, Kazuo; Li, Qiang; Doi, Kunio
2007-03-01
The detection of very subtle lesions and/or lesions overlapped with vessels on CT images is a time-consuming and difficult task for radiologists. In this study, we have developed a 3D temporal subtraction method to enhance interval changes between previous and current multislice CT images based on a nonlinear image warping technique. Our method provides a subtraction CT image which is obtained by subtraction of a previous CT image from a current CT image. Reduction of misregistration artifacts is important in the temporal subtraction method. Therefore, our computerized method includes global and local image matching techniques for accurate registration of current and previous CT images. For global image matching, we selected the corresponding previous section image for each current section image by using 2D cross-correlation between a blurred low-resolution current CT image and a blurred previous CT image. For local image matching, we applied the 3D template matching technique with translation and rotation of volumes of interest (VOIs) which were selected in the current and the previous CT images. The local shift vector for each VOI pair was determined when the cross-correlation value became the maximum in the 3D template matching. The local shift vectors at all voxels were determined by interpolation of shift vectors of VOIs, and then the previous CT image was nonlinearly warped according to the shift vector for each voxel. Finally, the warped previous CT image was subtracted from the current CT image. The 3D temporal subtraction method was applied to 19 clinical cases. The normal background structures such as vessels, ribs, and heart were removed without large misregistration artifacts. Thus, interval changes due to lung diseases were clearly enhanced as white shadows on subtraction CT images.
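The local-matching step described above rests on maximizing cross-correlation over candidate shifts. A heavily simplified Python sketch follows: integer shifts only (via np.roll, which wraps at the borders), no rotation, and no interpolation of the dense shift field, so it illustrates only the scoring, not the paper's full registration.

```python
import numpy as np

def normalized_cross_correlation(a, b):
    a = (a - a.mean()) / (a.std() + 1e-12)
    b = (b - b.mean()) / (b.std() + 1e-12)
    return float((a * b).mean())

def best_local_shift(voi_current, voi_previous, search=3):
    """Exhaustively score integer 3D shifts of the previous-image VOI and
    return the shift maximizing NCC against the current-image VOI."""
    best_score, best_shift = -2.0, (0, 0, 0)
    for dz in range(-search, search + 1):
        for dy in range(-search, search + 1):
            for dx in range(-search, search + 1):
                shifted = np.roll(voi_previous, (dz, dy, dx), axis=(0, 1, 2))
                score = normalized_cross_correlation(voi_current, shifted)
                if score > best_score:
                    best_score, best_shift = score, (dz, dy, dx)
    return best_shift, best_score
```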
Functional Techniques for Data Analysis
NASA Technical Reports Server (NTRS)
Tomlinson, John R.
1997-01-01
This dissertation develops a new general method of solving Prony's problem. Two special cases of this new method have been developed previously. They are the Matrix Pencil and the Osculatory Interpolation. The dissertation shows that they are instances of a more general solution type which allows a wide-ranging class of linear functionals to be used in the solution of the problem. This class provides a continuum of functionals which provide new methods that can be used to solve Prony's problem.
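As background, the classical solution of Prony's problem, the baseline that the dissertation's functional framework generalizes, can be sketched as linear prediction followed by root finding and a linear least-squares solve. The Python below is the textbook method, not the new general method developed in the dissertation.

```python
import numpy as np

def prony(x, p):
    """Fit x[n] ~ sum_k c_k * z_k**n (n = 0..N-1) for p exponential modes."""
    x = np.asarray(x, dtype=complex)
    N = len(x)
    # Linear prediction: x[n] = -(a_1 x[n-1] + ... + a_p x[n-p]) for n >= p
    A = np.column_stack([x[p - j:N - j] for j in range(1, p + 1)])
    a = np.linalg.lstsq(A, -x[p:], rcond=None)[0]
    z = np.roots(np.concatenate(([1.0 + 0j], a)))   # modes z_k
    V = np.vander(z, N, increasing=True).T          # V[n, k] = z_k**n
    c = np.linalg.lstsq(V, x, rcond=None)[0]        # amplitudes c_k
    return z, c
```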
Space Suit Joint Torque Measurement Method Validation
NASA Technical Reports Server (NTRS)
Valish, Dana; Eversley, Karina
2012-01-01
In 2009 and early 2010, a test method was developed and performed to quantify the torque required to manipulate joints in several existing operational and prototype space suits. This was done in an effort to develop joint torque requirements appropriate for a new Constellation Program space suit system. The same test method was levied on the Constellation space suit contractors to verify that their suit design met the requirements. However, because the original test was set up and conducted by a single test operator there was some question as to whether this method was repeatable enough to be considered a standard verification method for Constellation or other future development programs. In order to validate the method itself, a representative subset of the previous test was repeated, using the same information that would be available to space suit contractors, but set up and conducted by someone not familiar with the previous test. The resultant data was compared using graphical and statistical analysis; the results indicated a significant variance in values reported for a subset of the re-tested joints. Potential variables that could have affected the data were identified and a third round of testing was conducted in an attempt to eliminate and/or quantify the effects of these variables. The results of the third test effort will be used to determine whether or not the proposed joint torque methodology can be applied to future space suit development contracts.
Development in Children with Achondroplasia: A Prospective Clinical Cohort Study
ERIC Educational Resources Information Center
Ireland, Penelope J.; Donaghey, Samantha; McGill, James; Zankl, Andreas; Ware, Robert S.; Pacey, Verity; Ault, Jenny; Savarirayan, Ravi; Sillence, David; Thompson, Elizabeth; Townshend, Sharron; Johnston, Leanne M.
2012-01-01
Aim: Achondroplasia is characterized by delays in the development of communication and motor skills. While previously reported developmental profiles exist across gross motor, fine motor, feeding, and communication skills, there has been no prospective study of development across multiple areas simultaneously. Method: This Australasian…
Optimal Stratification of Item Pools in a-Stratified Computerized Adaptive Testing.
ERIC Educational Resources Information Center
Chang, Hua-Hua; van der Linden, Wim J.
2003-01-01
Developed a method based on 0-1 linear programming to stratify an item pool optimally for use in alpha-stratified adaptive testing. Applied the method to a previous item pool from the computerized adaptive test of the Graduate Record Examinations. Results show the new method performs well in practical situations. (SLD)
PE Metrics: Background, Testing Theory, and Methods
ERIC Educational Resources Information Center
Zhu, Weimo; Rink, Judy; Placek, Judith H.; Graber, Kim C.; Fox, Connie; Fisette, Jennifer L.; Dyson, Ben; Park, Youngsik; Avery, Marybell; Franck, Marian; Raynes, De
2011-01-01
New testing theories, concepts, and psychometric methods (e.g., item response theory, test equating, and item bank) developed during the past several decades have many advantages over previous theories and methods. In spite of their introduction to the field, they have not been fully accepted by physical educators. Further, the manner in which…
Discovery of rare mutations in populations: TILLING by sequencing
USDA-ARS?s Scientific Manuscript database
Discovery of rare mutations in populations requires methods for processing and analyzing in parallel many individuals. Previous TILLING methods employed enzymatic or physical discrimination of heteroduplexed from homoduplexed target DNA. We used mutant populations of rice and wheat to develop a meth...
NASA Technical Reports Server (NTRS)
Gracey, William
1948-01-01
A simplified compound-pendulum method for the experimental determination of the moments of inertia of airplanes about the x and y axes is described. The method is developed as a modification of the standard pendulum method reported previously in NACA report, NACA-467. A brief review of the older method is included to form a basis for discussion of the simplified method. (author)
Li, Qike; Schissler, A Grant; Gardeux, Vincent; Achour, Ikbel; Kenost, Colleen; Berghout, Joanne; Li, Haiquan; Zhang, Hao Helen; Lussier, Yves A
2017-05-24
Transcriptome analytic tools are commonly used across patient cohorts to develop drugs and predict clinical outcomes. However, as precision medicine pursues more accurate and individualized treatment decisions, these methods are not designed to address single-patient transcriptome analyses. We previously developed and validated the N-of-1-pathways framework using two methods, Wilcoxon and Mahalanobis Distance (MD), for personal transcriptome analysis derived from a pair of samples of a single patient. Although both methods uncover concordantly dysregulated pathways, they are not designed to detect dysregulated pathways with up- and down-regulated genes (bidirectional dysregulation) that are ubiquitous in biological systems. We developed N-of-1-pathways MixEnrich, a mixture model followed by a gene set enrichment test, to uncover bidirectional and concordantly dysregulated pathways one patient at a time. We assess its accuracy in a comprehensive simulation study and in an RNA-Seq data analysis of head and neck squamous cell carcinomas (HNSCCs). In the presence of bidirectionally dysregulated genes in the pathway or in the presence of high background noise, MixEnrich substantially outperforms previous single-subject transcriptome analysis methods, both in the simulation study and the HNSCC data analysis (ROC curves; higher true-positive rates; lower false-positive rates). Bidirectional and concordant dysregulated pathways uncovered by MixEnrich in each patient largely overlapped with the quasi-gold standard compared to other single-subject and cohort-based transcriptome analyses. The greater performance of MixEnrich presents an advantage over previous methods to meet the promise of providing accurate personal transcriptome analysis to support precision medicine at point of care.
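A minimal sketch of the MixEnrich idea, classifying genes as dysregulated from the magnitude of expression change regardless of direction and then testing pathway enrichment, is shown below in Python. It substitutes an off-the-shelf Gaussian mixture and a hypergeometric test for the paper's exact formulation, so treat it as illustrative only.

```python
import numpy as np
from scipy.stats import hypergeom
from sklearn.mixture import GaussianMixture

def mixenrich_like(log_fc, pathway_idx):
    """Two-component mixture on |log fold change| labels genes as dysregulated
    (up- and down-regulated genes both count); a hypergeometric test then asks
    whether the pathway is enriched for dysregulated genes."""
    z = np.abs(np.asarray(log_fc, dtype=float)).reshape(-1, 1)
    gm = GaussianMixture(n_components=2, random_state=0).fit(z)
    high = int(np.argmax(gm.means_.ravel()))       # high-|logFC| component
    dysregulated = gm.predict(z) == high
    M = len(log_fc)                                # genes measured
    n = int(dysregulated.sum())                    # dysregulated genes
    N = len(pathway_idx)                           # pathway size
    k = int(dysregulated[np.asarray(pathway_idx)].sum())
    return hypergeom.sf(k - 1, M, n, N)            # P(X >= k)
```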
Howard, Steven J.; Melhuish, Edward
2016-01-01
Several methods of assessing executive function (EF), self-regulation, language development, and social development in young children have been developed over previous decades. Yet new technologies make available methods of assessment not previously considered. In resolving conceptual and pragmatic limitations of existing tools, the Early Years Toolbox (EYT) offers substantial advantages for early assessment of language, EF, self-regulation, and social development. In the current study, results of our large-scale administration of this toolbox to 1,764 preschool and early primary school students indicated very good reliability, convergent validity with existing measures, and developmental sensitivity. Results were also suggestive of better capture of children’s emerging abilities relative to comparison measures. Preliminary norms are presented, showing a clear developmental trajectory across half-year age groups. The accessibility of the EYT, as well as its advantages over existing measures, offers considerably enhanced opportunities for objective measurement of young children’s abilities to enable research and educational applications. PMID:28503022
Developments in mycotoxin analysis: an update for 2014-2015
USDA-ARS?s Scientific Manuscript database
This review summarizes developments in the determination of mycotoxins over a period between mid-2014 and mid-2015. In tradition with previous articles of this series, analytical methods to determine aflatoxins, Alternaria toxins, ergot alkaloids, fumonisins, ochratoxins, patulin, trichothecenes, an...
Bhat; Bergstrom; Teasley; Bowker; Cordell
1998-01-01
This paper describes a framework for estimating the economic value of outdoor recreation across different ecoregions. Ten ecoregions in the continental United States were defined based on similarly functioning ecosystem characteristics. The individual travel cost method was employed to estimate recreation demand functions for activities such as motor boating and waterskiing, developed and primitive camping, coldwater fishing, sightseeing and pleasure driving, and big game hunting for each ecoregion. While our ecoregional approach differs conceptually from previous work, our results appear consistent with previous travel cost method valuation studies. Key words: Recreation; Ecoregion; Travel cost method; Truncated Poisson model
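The individual travel cost method typically fits a zero-truncated count model, since only visitors (one or more trips) appear in on-site samples. Below is a hedged Python sketch of the zero-truncated Poisson log-likelihood and the standard per-trip consumer surplus formula; variable names are illustrative and the paper's ecoregional demand specification is not reproduced.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln

def ztp_negloglik(beta, X, y):
    """Zero-truncated Poisson: P(y | y > 0) = exp(-lam) lam^y / (y! (1 - exp(-lam)))
    with lam = exp(X @ beta). X: (n, k) regressors incl. travel cost; y: trips >= 1."""
    lam = np.exp(X @ beta)
    ll = -lam + y * np.log(lam) - gammaln(y + 1) - np.log1p(-np.exp(-lam))
    return -ll.sum()

# Fit: res = minimize(ztp_negloglik, np.zeros(X.shape[1]), args=(X, y), method="BFGS")
# With a travel-cost coefficient beta_tc < 0, consumer surplus per trip is -1/beta_tc,
# the standard welfare measure reported in travel cost studies.
```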
Methodology for dynamic biaxial tension testing of pregnant uterine tissue.
Manoogian, Sarah; McNally, Craig; Calloway, Britt; Duma, Stefan
2007-01-01
Placental abruption accounts for 50% to 70% of fetal losses in motor vehicle crashes. Since automobile crashes are the leading cause of traumatic fetal injury mortality in the United States, research of this injury mechanism is important. Before research can adequately evaluate current and future restraint designs, a detailed model of the pregnant uterine tissues is necessary. The purpose of this study is to develop a methodology for testing the pregnant uterus in biaxial tension at a rate normally seen in a motor vehicle crash. Since the majority of previous biaxial work has established methods for quasi-static testing, this paper combines previous research and new methods to develop a custom-designed system to strain the tissue at a dynamic rate. Load cells and optical markers are used for calculating stress-strain curves of the perpendicular loading axes. Results for this methodology show images of a loaded tissue specimen and a finite element verification of the optical strain measurement. The biaxial test system dynamically pulls the tissue to failure with synchronous motion of four tissue grips that are rigidly coupled to the tissue specimen. The test device models in situ loading conditions of the pregnant uterus and overcomes previous limitations of biaxial testing. A non-contact method of measuring strains combined with data reduction to resolve the stresses in two directions provides the information necessary to develop a three-dimensional constitutive model of the material. Moreover, future research can apply this method to other soft tissues with similar in situ loading conditions.
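The optical-marker strain computation mentioned above can be illustrated with a least-squares estimate of the in-plane deformation gradient followed by the Green-Lagrange strain. This is a generic sketch under the assumption of homogeneous deformation between markers, not the authors' exact data reduction.

```python
import numpy as np

def green_lagrange_from_markers(X_ref, x_def):
    """X_ref, x_def: (n, 2) marker coordinates before/after deformation.
    Fits x ~ F X in the least-squares sense about the centroids, then
    returns F and E = (F^T F - I)/2; E[0,0] and E[1,1] are the strains
    along the two loading axes of the biaxial test."""
    Xc = X_ref - X_ref.mean(axis=0)
    xc = x_def - x_def.mean(axis=0)
    F = np.linalg.lstsq(Xc, xc, rcond=None)[0].T   # deformation gradient
    E = 0.5 * (F.T @ F - np.eye(2))                # Green-Lagrange strain
    return F, E
```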
NASA Technical Reports Server (NTRS)
Brand, J. C.
1985-01-01
Contraction theory is applied to an iterative formulation of electromagnetic scattering from periodic structures and a computational method for insuring convergence is developed. A short history of spectral (or k-space) formulation is presented with an emphasis on application to periodic surfaces. The mathematical background for formulating an iterative equation is covered using straightforward single variable examples including an extension to vector spaces. To insure a convergent solution of the iterative equation, a process called the contraction corrector method is developed. Convergence properties of previously presented iterative solutions to one-dimensional problems are examined utilizing contraction theory and the general conditions for achieving a convergent solution are explored. The contraction corrector method is then applied to several scattering problems including an infinite grating of thin wires with the solution data compared to previous works.
Development of direct-inverse 3-D methods for applied transonic aerodynamic wing design and analysis
NASA Technical Reports Server (NTRS)
Carlson, Leland A.
1989-01-01
An inverse wing design method was developed around an existing transonic wing analysis code. The original analysis code, TAWFIVE, has as its core the numerical potential flow solver, FLO30, developed by Jameson and Caughey. Features of the analysis code include a finite-volume formulation; wing and fuselage fitted, curvilinear grid mesh; and a viscous boundary layer correction that also accounts for viscous wake thickness and curvature. The development of the inverse methods as an extension of previous methods existing for design in Cartesian coordinates is presented. Results are shown for inviscid wing design cases in super-critical flow regimes. The test cases selected also demonstrate the versatility of the design method in designing an entire wing or discontinuous sections of a wing.
Fabrication of large area woodpile structure in polymer
NASA Astrophysics Data System (ADS)
Gupta, Jaya Prakash; Dutta, Neilanjan; Yao, Peng; Sharkawy, Ahmed S.; Prather, Dennis W.
2009-02-01
A fabrication process for three-dimensional woodpile photonic crystals based on multilayer photolithography with the commercially available photoresist SU8 has been demonstrated. A 6-layer, 2 mm × 2 mm woodpile has been fabricated. Different factors that influence the spin thickness on multiple resist applications have been studied. The fabrication method used removes the problem of intermixing and is more repeatable and robust than the multilayer fabrication techniques for three-dimensional photonic crystal structures that have been previously reported. Each layer is developed before the next layer of photoresist is spun, instead of developing the whole structure in the final step as in other multilayer processes. The desired thickness for each layer is achieved by calibration of the spin speed and use of different photoresist compositions. Deep-UV exposure confinement has been the defining parameter in this process. Layer uniformity for every layer is independent of the previously developed layers and depends on the photoresist planarizing capability, spin parameters, and baking conditions. The intermixing problem, which results from uncrosslinked photoresist left in the previous layers, is completely removed in this process because the previous layers are fully developed, avoiding any intermixing between the newly spun and previous layers. This process also gives the freedom to redo every spin any number of times without affecting the previously made structure, which is not possible in other multilayer processes where intermediate developing is not performed.
Signal-Processing Algorithm Development for the ACLAIM Sensor
NASA Technical Reports Server (NTRS)
vonLaven, Scott
1995-01-01
Methods for further minimizing the risk by making use of previous lidar observations were investigated. EOFs are likely to play an important role in these methods, and a procedure for extracting EOFs from data has been implemented. The new processing methods involving EOFs could range from extrapolation, as discussed, to more complicated statistical procedures for maintaining low unstart risk.
Developing and validating a nutrition knowledge questionnaire: key methods and considerations.
Trakman, Gina Louise; Forsyth, Adrienne; Hoye, Russell; Belski, Regina
2017-10-01
To outline key statistical considerations and detailed methodologies for the development and evaluation of a valid and reliable nutrition knowledge questionnaire. Literature on questionnaire development in a range of fields was reviewed, and a set of evidence-based guidelines specific to the creation of a nutrition knowledge questionnaire has been developed. The recommendations describe key qualitative methods and statistical considerations, and include relevant examples from previous papers and existing nutrition knowledge questionnaires. Where details have been omitted for the sake of brevity, the reader has been directed to suitable references. We recommend an eight-step methodology for nutrition knowledge questionnaire development as follows: (i) definition of the construct and development of a test plan; (ii) generation of the item pool; (iii) choice of the scoring system and response format; (iv) assessment of content validity; (v) assessment of face validity; (vi) purification of the scale using item analysis, including item characteristics, difficulty and discrimination; (vii) evaluation of the scale including its factor structure and internal reliability, or Rasch analysis, including assessment of dimensionality and internal reliability; and (viii) gathering of data to re-examine the questionnaire's properties, assess temporal stability and confirm construct validity. Several of these methods have previously been overlooked. The measurement of nutrition knowledge is an important consideration for individuals working in the nutrition field. Improved methods in the development of nutrition knowledge questionnaires, such as the use of factor analysis or Rasch analysis, will enable more confidence in reported measures of nutrition knowledge.
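Step (vi) above is easy to make concrete. The Python sketch below computes the two classical item statistics named there, difficulty as proportion correct and discrimination as the corrected item-total correlation, for a 0/1-scored response matrix; the thresholds in the closing comment are common rules of thumb, not the authors' prescriptions.

```python
import numpy as np

def item_analysis(responses):
    """responses: (n_respondents, n_items) array of 0/1 item scores."""
    R = np.asarray(responses, dtype=float)
    difficulty = R.mean(axis=0)                 # proportion correct per item
    total = R.sum(axis=1)
    discrimination = np.array([                 # corrected item-total correlation
        np.corrcoef(R[:, j], total - R[:, j])[0, 1]
        for j in range(R.shape[1])
    ])
    return difficulty, discrimination

# Common screening heuristics: flag items with difficulty outside ~0.2-0.9 or
# corrected item-total correlation below ~0.2 for review before the
# reliability / factor-analysis step.
```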
Transport Test Problems for Hybrid Methods Development
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shaver, Mark W.; Miller, Erin A.; Wittman, Richard S.
2011-12-28
This report presents 9 test problems to guide testing and development of hybrid calculations for the ADVANTG code at ORNL. These test cases can be used for comparing different types of radiation transport calculations, as well as for guiding the development of variance reduction methods. Cases are drawn primarily from existing or previous calculations with a preference for cases which include experimental data, or otherwise have results with a high level of confidence, are non-sensitive, and represent problem sets of interest to NA-22.
Development of Novel Noninvasive Methods of Stress Assessment in Baleen Whales
2013-09-30
adrenal hormone (aldosterone) that has not been adequately studied in baleen whales. Respiratory sampling is a novel method of physiological ... physiological stress levels of free-swimming cetaceans (Amaral 2010, ONR 2010, Hunt et al. 2013a). We have previously demonstrated that respiratory vapor ... hormones have not yet been tested in either feces or blow, particularly aldosterone. Our aim in this project is to further develop both techniques
Explicit criteria for prioritization of cataract surgery
Ma Quintana, José; Escobar, Antonio; Bilbao, Amaia
2006-01-01
Background Consensus techniques have been used previously to create explicit criteria to prioritize cataract extraction; however, the appropriateness of the intervention was not included explicitly in previous studies. We developed a prioritization tool for cataract extraction according to the RAND method. Methods Criteria were developed using a modified Delphi panel judgment process. A panel of 11 ophthalmologists was assembled. Ratings were analyzed regarding the level of agreement among panelists. We studied the effect of all variables on the final panel score using general linear and logistic regression models. Priority scoring systems were developed by means of optimal scaling and general linear models. The explicit criteria developed were summarized by means of regression tree analysis. Results Eight variables were considered to create the indications. Of the 310 indications that the panel evaluated, 22.6% were considered high priority, 52.3% intermediate priority, and 25.2% low priority. Agreement was reached for 31.9% of the indications and disagreement for 0.3%. Logistic regression and general linear models showed that the preoperative visual acuity of the cataractous eye, visual function, and anticipated postoperative visual acuity were the most influential variables. Alternative, simple scoring systems were obtained by optimal scaling and general linear models, in which the previous variables were also the most important. The decision tree also shows the importance of these variables and of the appropriateness of the intervention. Conclusion Our results showed acceptable validity for the tool as an evaluation and management instrument for prioritizing cataract extraction. It also provides simple algorithms for use in clinical practice. PMID:16512893
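The abstract does not report the panel's final weights, which the paper derives via optimal scaling and general linear models. Purely for illustration, a linear priority score over the three most influential variables might look like the sketch below; the weights and ordinal codings are invented, not the published tool.

```python
def cataract_priority_score(va_cataract_eye, visual_function, anticipated_va):
    """Illustrative linear priority score (hypothetical weights).

    Inputs are ordinal categories coded 0 (best) .. 3 (worst) for the
    three variables the panel found most influential. The weights here
    are invented; the published tool derives its weights from optimal
    scaling and general linear models.
    """
    weights = {"va": 4.0, "function": 3.0, "anticipated": 2.0}
    return (weights["va"] * va_cataract_eye
            + weights["function"] * visual_function
            + weights["anticipated"] * anticipated_va)
    # higher score = higher surgical priority
```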
Screening Tools to Estimate Mold Burdens in Homes
Objective: The objective of this study was to develop screening tools that could be used to estimate the mold burden in a home which would indicate whether more detailed testing might be useful. Methods: Previously, in the American Healthy Home Survey, a DNA-based method of an...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ning, S. A.; Hayman, G.; Damiani, R.
Blade element momentum methods, though conceptually simple, are highly useful for analyzing wind turbine aerodynamics and are widely used in many design and analysis applications. A new version of AeroDyn is being developed to take advantage of new robust solution methodologies, conform to a new modularization framework for the National Renewable Energy Laboratory's FAST, utilize advanced skewed-wake analysis methods, fix limitations of previous implementations, and enable modeling of highly flexible and nonstraight blades. This paper reviews blade element momentum theory and several of the options available for analyzing skewed inflow. AeroDyn implementation details are described for the benefit of users and developers. These new options are compared to solutions from the previous version of AeroDyn and to experimental data. Finally, recommendations are given on how one might select from the various available solution approaches.
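For readers unfamiliar with the underlying theory, a blade element momentum solve can be reduced to a single residual equation in the local inflow angle, which a bracketing root finder handles robustly. The sketch below is a simplified illustration only (no hub/tip losses or high-induction corrections, zero twist and pitch), not the AeroDyn implementation.

```python
import numpy as np
from scipy.optimize import brentq

def bem_residual(phi, sigma_p, lam_r, cl_fn, cd_fn):
    """One-equation BEM residual in the inflow angle phi (radians).

    sigma_p: local solidity, lam_r: local tip-speed ratio,
    cl_fn/cd_fn: lift/drag coefficient vs. angle of attack.
    """
    alpha = phi  # zero twist and pitch assumed for this sketch
    cn = cl_fn(alpha) * np.cos(phi) + cd_fn(alpha) * np.sin(phi)
    ct = cl_fn(alpha) * np.sin(phi) - cd_fn(alpha) * np.cos(phi)
    kappa = sigma_p * cn / (4.0 * np.sin(phi) ** 2)
    kappa_p = sigma_p * ct / (4.0 * np.sin(phi) * np.cos(phi))
    a = kappa / (1.0 + kappa)          # axial induction
    ap = kappa_p / (1.0 - kappa_p)     # tangential induction
    return np.sin(phi) / (1.0 - a) - np.cos(phi) / (lam_r * (1.0 + ap))

# Bracketing phi in (0, pi/2) gives a robust solve for attached flow,
# here with a flat-plate-like lift curve and constant drag:
phi_star = brentq(bem_residual, 1e-6, np.pi / 2 - 1e-6,
                  args=(0.05, 6.0, lambda a: 2 * np.pi * a, lambda a: 0.01))
```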
Humidity compensation of bad-smell sensing system using a detector tube and a built-in camera
NASA Astrophysics Data System (ADS)
Hirano, Hiroyuki; Nakamoto, Takamichi
2011-09-01
We developed a low-cost sensing system robust against humidity change for detecting and estimating the concentration of bad smells, such as hydrogen sulfide and ammonia. In a previous study, we developed an automated measurement system for a gas detector tube using a built-in camera instead of the conventional manual inspection of the tube. The concentration detectable by the developed system ranges from a few tens of ppb to a few tens of ppm. However, we previously found that the estimated concentration depends not only on the actual concentration but also on humidity. Here, we established a method to correct for the influence of humidity by creating a regression function whose inputs are discoloration rate and humidity. We studied two methods (backpropagation and a radial basis function network) to obtain the regression function and evaluated them. Consequently, the system successfully estimated the concentration at a practical level even when humidity changes.
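A regression of concentration on discoloration rate and humidity can be prototyped quickly with an off-the-shelf neural network. The sketch below uses scikit-learn's MLPRegressor as a stand-in for the paper's backpropagation network (scikit-learn has no built-in RBF network); the training values are placeholders, not the authors' measurements.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Inputs: detector-tube discoloration rate and relative humidity (%).
# Target: known gas concentration in ppm. Values are placeholders.
X = np.array([[0.12, 30.0], [0.15, 60.0], [0.30, 30.0], [0.34, 60.0]])
y = np.array([0.5, 0.5, 2.0, 2.0])

# Small multilayer perceptron trained by backpropagation.
model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
model.fit(X, y)
print(model.predict([[0.22, 45.0]]))  # humidity-corrected concentration estimate
```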
Birth month affects lifetime disease risk: a phenome-wide method
Boland, Mary Regina; Shahn, Zachary; Madigan, David; Hripcsak, George; Tatonetti, Nicholas P
2015-01-01
Objective An individual’s birth month has a significant impact on the diseases they develop during their lifetime. Previous studies reveal relationships between birth month and several diseases including atherothrombosis, asthma, attention deficit hyperactivity disorder, and myopia, leaving most diseases completely unexplored. This retrospective population study systematically explores the relationship between seasonal affects at birth and lifetime disease risk for 1688 conditions. Methods We developed a hypothesis-free method that minimizes publication and disease selection biases by systematically investigating disease-birth month patterns across all conditions. Our dataset includes 1 749 400 individuals with records at New York-Presbyterian/Columbia University Medical Center born between 1900 and 2000 inclusive. We modeled associations between birth month and 1688 diseases using logistic regression. Significance was tested using a chi-squared test with multiplicity correction. Results We found 55 diseases that were significantly dependent on birth month. Of these 19 were previously reported in the literature (P < .001), 20 were for conditions with close relationships to those reported, and 16 were previously unreported. We found distinct incidence patterns across disease categories. Conclusions Lifetime disease risk is affected by birth month. Seasonally dependent early developmental mechanisms may play a role in increasing lifetime risk of disease. PMID:26041386
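A scan of this kind can be sketched as a per-disease likelihood-ratio test of a 12-level birth-month factor, followed by multiplicity correction. The code below is an illustrative reconstruction, not the authors' code: the column names are assumptions, and Benjamini-Hochberg FDR stands in for whatever multiplicity correction the paper applied.

```python
import numpy as np
import pandas as pd
from scipy.stats import chi2
import statsmodels.api as sm
from statsmodels.stats.multitest import multipletests

def birth_month_scan(df, disease_cols):
    """For each disease, test whether birth month (a 12-level factor)
    improves a logistic model via a chi-squared likelihood-ratio test,
    then correct for multiplicity across all diseases.

    df must contain a 'birth_month' column (1-12) and one 0/1 column
    per disease; these names are assumptions for illustration.
    """
    months = pd.get_dummies(df["birth_month"], prefix="m", drop_first=True)
    X_full = sm.add_constant(months.astype(float))
    pvals = []
    for col in disease_cols:
        y = df[col].astype(float)
        full = sm.Logit(y, X_full).fit(disp=0)
        null = sm.Logit(y, np.ones((len(y), 1))).fit(disp=0)
        lr = 2 * (full.llf - null.llf)  # likelihood-ratio statistic
        pvals.append(chi2.sf(lr, df=X_full.shape[1] - 1))
    reject, p_adj, _, _ = multipletests(pvals, method="fdr_bh")
    return pd.DataFrame({"disease": disease_cols, "p": pvals,
                         "p_adj": p_adj, "significant": reject})
```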
Latex Micro-balloon Pumping in Centrifugal Microfluidic Platforms
Aeinehvand, Mohammad Mahdi; Ibrahim, Fatimah; Al-Faqheri, Wisam; Thio, Tzer Hwai Gilbert; Kazemzadeh, Amin; Wadi harun, Sulaiman; Madou, Marc
2014-01-01
Centrifugal microfluidic platforms have emerged as point-of-care diagnostic tools. However, the unidirectional nature of the centrifugal force limits the available space for multi-stepped processes on a single microfluidics disc. To overcome this limitation, a passive pneumatic pumping method actuated at high rotational speeds has been previously proposed to pump liquid against the centrifugal force. In this paper, a novel micro-balloon pumping method that relies on elastic energy stored in a latex membrane is introduced. It operates at low rotational speeds and pumps a larger volume of liquid towards the centre of the disc. Two different micro-balloon pumping designs have been developed to study the pump performance and capacity at a range of rotational frequencies from 0 to 1500 rpm. The behaviour of the micro-balloon pump on the centrifugal microfluidic platforms has been theoretically analysed and compared with the experimental data. The experimental data shows that, the developed pumping method dramatically decreases the required rotational speed to pump liquid compared to the previously developed pneumatic pumping methods. It also shows that within a range of rotational speed, desirable volume of liquid can be stored and pumped by adjusting the size of the micro-balloon. PMID:24441792
2012-01-01
Background While progress has been made to develop automatic segmentation techniques for mitochondria, there remains a need for more accurate and robust techniques to delineate mitochondria in serial blockface scanning electron microscopic data. Previously developed texture based methods are limited for solving this problem because texture alone is often not sufficient to identify mitochondria. This paper presents a new three-step method, the Cytoseg process, for automated segmentation of mitochondria contained in 3D electron microscopic volumes generated through serial block face scanning electron microscopic imaging. The method consists of three steps. The first is a random forest patch classification step operating directly on 2D image patches. The second step consists of contour-pair classification. At the final step, we introduce a method to automatically seed a level set operation with output from previous steps. Results We report accuracy of the Cytoseg process on three types of tissue and compare it to a previous method based on Radon-Like Features. At step 1, we show that the patch classifier identifies mitochondria texture but creates many false positive pixels. At step 2, our contour processing step produces contours and then filters them with a second classification step, helping to improve overall accuracy. We show that our final level set operation, which is automatically seeded with output from previous steps, helps to smooth the results. Overall, our results show that use of contour pair classification and level set operations improve segmentation accuracy beyond patch classification alone. We show that the Cytoseg process performs well compared to another modern technique based on Radon-Like Features. Conclusions We demonstrated that texture based methods for mitochondria segmentation can be enhanced with multiple steps that form an image processing pipeline. While we used a random-forest based patch classifier to recognize texture, it would be possible to replace this with other texture identifiers, and we plan to explore this in future work. PMID:22321695
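As a rough illustration of the first step, a patch classifier can be trained on labeled 2D slices with a random forest. This sketch is not the published Cytoseg code: it covers only step 1, assumes images and ground-truth masks supplied as NumPy arrays, and omits the contour-pair classification and seeded level set of steps 2 and 3.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def train_patch_classifier(images, masks, patch=9, n_samples=20000, seed=0):
    """Step 1 of a Cytoseg-style pipeline (illustrative): classify small
    2D image patches as mitochondria vs. background with a random forest.

    images, masks: lists of equally sized 2D arrays; masks are 0/1 labels.
    """
    rng = np.random.default_rng(seed)
    half = patch // 2
    X, y = [], []
    for img, msk in zip(images, masks):
        rows = rng.integers(half, img.shape[0] - half, n_samples // len(images))
        cols = rng.integers(half, img.shape[1] - half, n_samples // len(images))
        for r, c in zip(rows, cols):
            X.append(img[r - half:r + half + 1, c - half:c + half + 1].ravel())
            y.append(msk[r, c])
    clf = RandomForestClassifier(n_estimators=100, random_state=seed)
    clf.fit(np.array(X), np.array(y))
    return clf  # predict_proba on new patches gives a mitochondria map
```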
Spectra library assisted de novo peptide sequencing for HCD and ETD spectra pairs.
Yan, Yan; Zhang, Kaizhong
2016-12-23
De novo peptide sequencing via tandem mass spectrometry (MS/MS) has been developed rapidly in recent years. With the use of spectra pairs from the same peptide under different fragmentation modes, performance of de novo sequencing is greatly improved. Currently, with large amount of spectra sequenced everyday, spectra libraries containing tens of thousands of annotated experimental MS/MS spectra become available. These libraries provide information of the spectra properties, thus have the potential to be used with de novo sequencing to improve its performance. In this study, an improved de novo sequencing method assisted with spectra library is proposed. It uses spectra libraries as training datasets and introduces significant scores of the features used in our previous de novo sequencing method for HCD and ETD spectra pairs. Two pairs of HCD and ETD spectral datasets were used to test the performance of the proposed method and our previous method. The results show that this proposed method achieves better sequencing accuracy with higher ranked correct sequences and less computational time. This paper proposed an advanced de novo sequencing method for HCD and ETD spectra pair and used information from spectra libraries and significant improved previous similar methods.
NASA Astrophysics Data System (ADS)
Solovjov, Vladimir P.; Webb, Brent W.; Andre, Frederic
2018-07-01
Following previous theoretical development based on the assumption of a rank correlated spectrum, the Rank Correlated Full Spectrum k-distribution (RC-FSK) method is proposed. The method proves advantageous in modeling radiation transfer in high temperature gases in non-uniform media in two important ways. First, and perhaps most importantly, the method requires no specification of a reference gas thermodynamic state. Second, the spectral construction of the RC-FSK model is simpler than original correlated FSK models, requiring only two cumulative k-distributions. Further, although not exhaustive, example problems presented here suggest that the method may also yield improved accuracy relative to prior methods, and may exhibit less sensitivity to the blackbody source temperature used in the model predictions. This paper outlines the theoretical development of the RC-FSK method, comparing the spectral construction with prior correlated spectrum FSK method formulations. Further the RC-FSK model's relationship to the Rank Correlated Spectral Line Weighted-sum-of-gray-gases (RC-SLW) model is defined. The work presents predictions using the Rank Correlated FSK method and previous FSK methods in three different example problems. Line-by-line benchmark predictions are used to assess the accuracy.
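For context, the central object in any FSK-type method is the cumulative k-distribution. In standard notation (assumed here; the paper's own notation may differ), it can be written as:

```latex
g(k) = \frac{1}{I_b(T)} \int_0^{\infty} I_{b\eta}(T)\, H\!\left(k - k_\eta\right) \mathrm{d}\eta
```

where H is the Heaviside step function, I_{bη} the spectral blackbody intensity, and k_η the spectral absorption coefficient. The practical appeal of the RC-FSK construction described above is that only two such cumulative distributions are required and no reference thermodynamic state must be chosen.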
GStream: Improving SNP and CNV Coverage on Genome-Wide Association Studies
Alonso, Arnald; Marsal, Sara; Tortosa, Raül; Canela-Xandri, Oriol; Julià, Antonio
2013-01-01
We present GStream, a method that combines genome-wide SNP and CNV genotyping in the Illumina microarray platform with unprecedented accuracy. This new method outperforms previous well-established SNP genotyping software. More importantly, the CNV calling algorithm of GStream dramatically improves the results obtained by previous state-of-the-art methods and yields an accuracy that is close to that obtained by purely CNV-oriented technologies like Comparative Genomic Hybridization (CGH). We demonstrate the superior performance of GStream using microarray data generated from HapMap samples. Using the reference CNV calls generated by the 1000 Genomes Project (1KGP) and well-known studies on whole genome CNV characterization based either on CGH or genotyping microarray technologies, we show that GStream can increase the number of reliably detected variants up to 25% compared to previously developed methods. Furthermore, the increased genome coverage provided by GStream allows the discovery of CNVs in close linkage disequilibrium with SNPs, previously associated with disease risk in published Genome-Wide Association Studies (GWAS). These results could provide important insights into the biological mechanism underlying the detected disease risk association. With GStream, large-scale GWAS will not only benefit from the combined genotyping of SNPs and CNVs at an unprecedented accuracy, but will also take advantage of the computational efficiency of the method. PMID:23844243
ERIC Educational Resources Information Center
Losinski, Mickey; Cuenca-Carlino, Yojanna; Zablocki, Mark; Teagarden, James
2014-01-01
Two previous reviews have indicated that self-regulated strategy instruction (SRSD) is an evidence-based practice that can improve the writing skills of students with emotional and behavioral disorders. The purpose of this meta-analysis is to extend the findings and analytic methods of previous reviews by examining published studies regarding…
Developments in mycotoxin analysis: an update for 2013 – 2014
USDA-ARS?s Scientific Manuscript database
This review highlights developments in the determination of mycotoxins over a period between mid-2013 and mid-2014. It continues in the format of the previous articles of this series, emphasising on analytical methods to determine aflatoxins, Alternaria toxins, ergot alkaloids, fumonisins, ochratoxi...
USDA-ARS?s Scientific Manuscript database
A high-throughput qualitative screening and identification method for 9 aminoglycosides of regulatory interest has been developed, validated, and implemented for bovine kidney, liver, and muscle tissues. The method involves extraction at previously validated conditions, cleanup using disposable pip...
Wheat mill stream properties for discrete element method modeling
USDA-ARS?s Scientific Manuscript database
A discrete phase approach based on individual wheat kernel characteristics is needed to overcome the limitations of previous statistical models and accurately predict the milling behavior of wheat. As a first step to develop a discrete element method (DEM) model for the wheat milling process, this s...
Two computational methods are proposed for estimation of the emission rate of volatile organic compounds (VOCs) from solvent-based indoor coating materials based on the knowledge of product formulation. The first method utilizes two previously developed mass transfer models with ...
Takabatake, Reona; Masubuchi, Tomoko; Futo, Satoshi; Minegishi, Yasutaka; Noguchi, Akio; Kondo, Kazunari; Teshima, Reiko; Kurashima, Takeyo; Mano, Junichi; Kitta, Kazumi
2016-01-01
A novel real-time PCR-based analytical method was developed for the event-specific quantification of a genetically modified (GM) maize, 3272. We first attempted to obtain genomic DNA from this maize using a DNeasy Plant Maxi kit and a DNeasy Plant Mini kit, which have been widely utilized in our previous studies, but DNA extraction yields from 3272 were markedly lower than those from non-GM maize seeds. However, lowering of DNA extraction yields was not observed with GM quicker or Genomic-tip 20/G. We chose GM quicker for evaluation of the quantitative method. We prepared a standard plasmid for 3272 quantification. The conversion factor (Cf), which is required to calculate the amount of a genetically modified organism (GMO), was experimentally determined for two real-time PCR instruments, the Applied Biosystems 7900HT (ABI 7900) and the Applied Biosystems 7500 (ABI 7500). The determined Cf values were 0.60 and 0.59 for the ABI 7900 and the ABI 7500, respectively. To evaluate the developed method, a blind test was conducted as part of an interlaboratory study. The trueness and precision were evaluated as the bias and the reproducibility relative standard deviation (RSDr), respectively. The determined values were similar to those in our previous validation studies. The limit of quantitation for the method was estimated to be 0.5% or less, and we concluded that the developed method is suitable and practical for the detection and quantification of 3272.
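The role of the conversion factor can be illustrated with a short calculation. The formula below follows the usual structure of event-specific qPCR quantification methods of this kind, where the GM amount is the event-to-endogenous copy ratio divided by Cf and expressed as a percentage; treat it as a hedged sketch rather than the validated procedure itself.

```python
def gmo_amount_percent(event_copies, endogenous_copies, cf):
    """Estimate GM content (%) from real-time PCR copy-number estimates.

    Sketch of the standard structure of event-specific quantification:
    the event/endogenous copy ratio is normalized by the conversion
    factor Cf (the same ratio measured in 100% GM material).
    """
    return (event_copies / endogenous_copies) / cf * 100.0

# Example with the paper's Cf for the ABI 7900 (copy numbers invented):
print(gmo_amount_percent(event_copies=1.2e3, endogenous_copies=4.0e5, cf=0.60))
```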
DOE Office of Scientific and Technical Information (OSTI.GOV)
Subekti, M.; Center for Development of Reactor Safety Technology, National Nuclear Energy Agency of Indonesia, Puspiptek Complex BO.80, Serpong-Tangerang, 15340; Ohno, T.
2006-07-01
The neuro-expert approach was utilized in previous monitoring-system research on Pressurized Water Reactors (PWRs). That research improved the monitoring system by combining the neuro-expert approach, conventional noise analysis, and modified neural networks for extended capability. The parallel application of these methods required a distributed computer-network architecture to perform real-time tasks. The present research aimed to improve the previous monitoring system, which could detect sensor degradation, and to demonstrate monitoring in the High Temperature Engineering Test Reactor (HTTR). The monitoring system under development, based on methods tested using data from an online PWR simulator as well as from RSG-GAS (a 30 MW research reactor in Indonesia), will be applied in HTTR for more complex monitoring. (authors)
NASA Astrophysics Data System (ADS)
Ji, Yang; Chen, Hong; Tang, Hongwu
2017-06-01
A highly accurate wide-angle scheme, based on the generalized multistep scheme in the propagation direction, is developed for the finite-difference beam propagation method (FD-BPM). Compared with the previously presented method, simulations show that our method yields a more accurate solution and permits a much larger step size.
NASA Astrophysics Data System (ADS)
Saracco, Ginette; Moreau, Frédérique; Mathé, Pierre-Etienne; Hermitte, Daniel; Michel, Jean-Marie
2007-10-01
We have previously developed a method for characterizing and localizing `homogeneous' buried sources, from the measure of potential anomalies at a fixed height above ground (magnetic, electric and gravity). This method is based on potential theory and uses the properties of the Poisson kernel (real by definition) and the continuous wavelet theory. Here, we relax the assumption on sources and introduce a method that we call the `multiscale tomography'. Our approach is based on the harmonic extension of the observed magnetic field to produce a complex source by use of a complex Poisson kernel solution of the Laplace equation for complex potential field. A phase and modulus are defined. We show that the phase provides additional information on the total magnetic inclination and the structure of sources, while the modulus allows us to characterize its spatial location, depth and `effective degree'. This method is compared to the `complex dipolar tomography', extension of the Patella method that we previously developed. We applied both methods and a classical electrical resistivity tomography to detect and localize buried archaeological structures like antique ovens from magnetic measurements on the Fox-Amphoux site (France). The estimates are then compared with the results of excavations.
A study on the application of topic models to motif finding algorithms.
Basha Gutierrez, Josep; Nakai, Kenta
2016-12-22
Topic models are statistical algorithms that try to discover the structure of a set of documents according to the abstract topics contained in them. Here we apply this approach to the discovery of the structure of the transcription factor binding sites (TFBS) contained in a set of biological sequences, a fundamental problem in molecular biology research for the understanding of transcriptional regulation. We present two methods that make use of topic models for motif finding. First, we developed an algorithm in which a set of biological sequences is treated as a collection of text documents, and the k-mers contained in them as words, to then build a correlated topic model (CTM) and iteratively reduce its perplexity. Second, we used the perplexity measurement of CTMs to improve our previous algorithm based on a genetic algorithm and several statistical coefficients. The algorithms were tested with 56 datasets from four different species and compared to 14 other methods by the use of several coefficients at both the nucleotide and site levels. The results of our first approach showed a performance comparable to the other methods studied, especially at the site level and in sensitivity scores, in which it scored better than any of the 14 existing tools. In the case of our previous algorithm, the new approach with the addition of the perplexity measurement clearly outperformed all of the other methods in sensitivity, at both the nucleotide and site levels, and in overall performance at the site level. The statistics obtained show that the performance of a motif finding method based on the use of a CTM is satisfying enough to conclude that the application of topic models is a valid method for developing motif finding algorithms. Moreover, the addition of topic models to a previously developed method dramatically increased its performance, suggesting that this combined algorithm can be a useful tool for successfully predicting motifs in different kinds of sets of DNA sequences.
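To make the document/word analogy concrete, the sketch below fits a topic model to overlapping k-mers. It uses scikit-learn's LDA rather than the correlated topic model the paper employs, and it omits the perplexity-iteration and motif-recovery steps, so it illustrates the representation rather than the published algorithm.

```python
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer

def kmer_topics(sequences, k=6, n_topics=10, seed=0):
    """Treat each DNA sequence as a 'document' of overlapping k-mer
    'words' and fit a topic model (LDA here as a stand-in for CTM).
    """
    docs = [" ".join(s[i:i + k] for i in range(len(s) - k + 1))
            for s in sequences]
    vec = CountVectorizer(token_pattern=r"[ACGT]+", lowercase=False)
    X = vec.fit_transform(docs)
    lda = LatentDirichletAllocation(n_components=n_topics, random_state=seed)
    lda.fit(X)
    return lda, vec  # lda.components_ scores each k-mer per topic

# Usage: topics with strongly weighted, mutually similar k-mers are
# natural motif candidates for downstream refinement.
```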
NASA Technical Reports Server (NTRS)
Gramoll, K. C.; Dillard, D. A.; Brinson, H. F.
1989-01-01
In response to the tremendous growth in the development of advanced materials, such as fiber-reinforced plastic (FRP) composite materials, a new numerical method is developed to analyze and predict the time-dependent properties of these materials. Basic concepts in viscoelasticity, laminated composites, and previous viscoelastic numerical methods are presented. A stable numerical method, called the nonlinear differential equation method (NDEM), is developed to calculate the in-plane stresses and strains over any time period for a general laminate constructed from nonlinear viscoelastic orthotropic plies. The method is implemented in an in-plane stress analysis computer program, called VCAP, to demonstrate its usefulness and to verify its accuracy. A number of actual experimental test results performed on Kevlar/epoxy composite laminates are compared to predictions calculated from the numerical method.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yokoyama, Yoko; Shimizu, Akira; Okada, Etsuko
Highlights: We developed a new method to rapidly identify the COL1A1-PDGFB fusion in DFSP. A new PCR method using a single primer pair detected the COL1A1-PDGFB fusion in DFSP. This is the first report of DFSP with a novel COL1A1 breakpoint in exon 5. -- Abstract: The detection of fusion transcripts of the collagen type 1 alpha 1 (COL1A1) and platelet-derived growth factor-BB (PDGFB) genes by genetic analysis has been recognized as a reliable and valuable molecular tool for the diagnosis of dermatofibrosarcoma protuberans (DFSP). To detect the COL1A1-PDGFB fusion, most previous reports performed reverse transcription polymerase chain reaction (RT-PCR) using multiplex forward primers from COL1A1. However, this approach poses technical difficulties with respect to the handling of multiple primers and reagents. The objective of this study was to establish a rapid, easy, and efficient one-step PCR method using only a single primer pair to detect the fusion transcripts of COL1A1 and PDGFB in DFSP. To validate the new method, we compared the results of RT-PCR in five patients with DFSP between the previous method using multiplex primers and our established one-step RT-PCR using a single primer pair. In all cases of DFSP, the COL1A1-PDGFB fusion was detected by both the previous method and the newly established one-step PCR. Importantly, we detected a novel COL1A1 breakpoint in exon 5. The newly developed method is valuable for rapidly identifying COL1A1-PDGFB fusion transcripts in DFSP.
Recent advances in peanut breeding and genetics
USDA-ARS?s Scientific Manuscript database
Most previous advances in peanut cultivar development have been made using conventional breeding methods for self-pollinated crops. Peanut has lagged behind many other crops on use of molecular genetic technology for cultivar development in part due to lack of investment, but also because of low le...
NASA Technical Reports Server (NTRS)
Stahara, S. S.
1984-01-01
An investigation was carried out to complete the preliminary development of a combined perturbation/optimization procedure and associated computational code for designing optimized blade-to-blade profiles of turbomachinery blades. The overall purpose of the procedures developed is to demonstrate a rapid nonlinear perturbation method for minimizing the computational requirements associated with parametric design studies of turbomachinery flows. The method combines the multiple parameter nonlinear perturbation method, successfully developed in previous phases of this study, with the NASA TSONIC blade-to-blade turbomachinery flow solver and the COPES-CONMIN optimization procedure into a user's code for designing optimized blade-to-blade surface profiles of turbomachinery blades. Results of several design applications and a documented version of the code together with a user's manual are provided.
How to Deal with Emotional Abuse and Neglect--Further Development of a Conceptual Framework (FRAMEA)
ERIC Educational Resources Information Center
Glaser, Danya
2011-01-01
Objective: To develop further the understanding of emotional abuse and neglect. Methods: Building on previous work, this paper describes the further development of a conceptual framework for the recognition and management of emotional abuse and neglect. Training in this framework is currently being evaluated. The paper also briefly reviews more…
ERIC Educational Resources Information Center
Rescorla, Leslie; Nyame, Josephine; Dias, Pedro
2016-01-01
Purpose: Our objective was to replicate previous crosslinguistic findings by comparing Portuguese and U.S. children with respect to (a) effects of language, gender, and age on vocabulary size; (b) lexical composition; and (c) late talking. Method: We used the Language Development Survey (LDS; Rescorla, 1989) with children (18-35 months) learning…
The red supergiant population in the Perseus arm
NASA Astrophysics Data System (ADS)
Dorda, R.; Negueruela, I.; González-Fernández, C.
2018-04-01
We present a new catalogue of cool supergiants in a section of the Perseus arm, most of which had not been previously identified. To generate it, we have used a set of well-defined photometric criteria to select a large number of candidates (637) that were later observed at intermediate resolution in the infrared calcium triplet spectral range, using a long-slit spectrograph. To separate red supergiants from luminous red giants, we used a statistical method, developed in previous works and improved in the present paper. We present a method to assign probabilities of being a red supergiant to a given spectrum and use the properties of a population to generate clean samples, without contamination from lower luminosity stars. We compare our identification with a classification done using classical criteria and discuss their respective efficiencies and contaminations as identification methods. We confirm that our method is as efficient at finding supergiants as the best classical methods, but with a far lower contamination by red giants than any other method. The result is a catalogue with 197 cool supergiants, 191 of which did not appear in previous lists of red supergiants. This is the largest coherent catalogue of cool supergiants in the Galaxy.
NASA Technical Reports Server (NTRS)
Holt, Maurice
1998-01-01
Contributions to the Method of Characteristics in Three Dimensions, which previously received incomplete recognition, are reviewed. They mostly follow from a fundamental paper by Rusanov which led to several developments in Russia, described by Chushkin.
NASA Technical Reports Server (NTRS)
Jefferys, W. H.
1981-01-01
A least squares method proposed previously for solving a general class of problems is expanded in two ways. First, covariance matrices related to the solution are calculated and their interpretation is given. Second, improved methods of solving the normal equations related to those of Marquardt (1963) and Fletcher and Powell (1963) are developed for this approach. These methods may converge in cases where Newton's method diverges or converges slowly.
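For reference, Marquardt-type modifications stabilize the normal equations with a damping term. In standard notation (assumed here; the paper's own formulation may differ), the update step δ solves:

```latex
\left( J^{\top} J + \lambda\, \operatorname{diag}\!\left(J^{\top} J\right) \right) \delta = J^{\top} r
```

where J is the Jacobian of the residual vector r, and the damping parameter λ is raised when a step fails and lowered when it succeeds, interpolating between Gauss-Newton and scaled gradient descent. This is one standard way such methods converge in cases where Newton's method diverges.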
A new method for the prediction of combustion instability
NASA Astrophysics Data System (ADS)
Flanagan, Steven Meville
This dissertation presents a new approach to the prediction of combustion instability in solid rocket motors. Previous attempts at developing computational tools to solve this problem have been largely unsuccessful, showing very poor agreement with experimental results and having little or no predictive capability. This is due primarily to deficiencies in the linear stability theory upon which these efforts have been based. Recent advances in linear instability theory by Flandro have demonstrated the importance of including unsteady rotational effects, previously considered negligible. Previous versions of the theory also neglected corrections to the unsteady flow field of first order in the mean flow Mach number. This research explores the stability implications of extending the solution to include these corrections. Also, the corrected linear stability theory, based upon a rotational unsteady flow field extended to first order in the mean flow Mach number, has been implemented in two computer programs developed for the Macintosh platform. A quasi-one-dimensional version of the program is based upon an approximate solution to the cavity acoustics problem. The three-dimensional program applies Green's Function Discretization (GFD) to the solution for the acoustic mode shapes and frequency. GFD is a recently developed numerical method for finding fully three-dimensional solutions for this class of problems. The analysis of complex motor geometries, previously a tedious and time-consuming task, has also been greatly simplified through the development of a drawing package designed specifically to facilitate the specification of typical motor geometries. The combination of the drawing package, improved acoustic solutions, and new analysis results in a tool which is capable of producing more accurate and meaningful predictions than have been possible in the past.
Method and Excel VBA Algorithm for Modeling Master Recession Curve Using Trigonometry Approach.
Posavec, Kristijan; Giacopetti, Marco; Materazzi, Marco; Birk, Steffen
2017-11-01
A new method was developed and implemented into an Excel Visual Basic for Applications (VBA) algorithm utilizing trigonometry laws in an innovative way to overlap recession segments of time series and create master recession curves (MRCs). Based on a trigonometry approach, the algorithm horizontally translates succeeding recession segments of the time series, placing their vertex, that is, the highest recorded value of each recession segment, directly onto the appropriate connection line defined by measurement points of a preceding recession segment. The new method and algorithm continue the development of methods and algorithms for the generation of MRCs, where the first published method was based on a multiple linear/nonlinear regression model approach (Posavec et al. 2006). The newly developed trigonometry-based method was tested on real case-study examples and compared with the previously published multiple linear/nonlinear regression model-based method. The results show that in some cases, that is, for some time series, the trigonometry-based method creates narrower overlaps of the recession segments, resulting in higher coefficients of determination R², while in other cases the multiple linear/nonlinear regression model-based method remains superior. The Excel VBA algorithm for modeling MRCs using the trigonometry approach is implemented into a spreadsheet tool (MRCTools v3.0, written by and available from Kristijan Posavec, Zagreb, Croatia) containing the previously published VBA algorithms for MRC generation and separation. All algorithms within MRCTools v3.0 are open access and available free of charge, supporting the idea of running science on available, open, and free-of-charge software. © 2017, National Ground Water Association.
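The geometric idea, translating each succeeding segment horizontally until its vertex lies on the curve assembled so far, can be sketched compactly outside VBA. The Python below is an illustrative reconstruction of the concept, with linear interpolation standing in for the connection line; it is not the MRCTools code.

```python
import numpy as np

def build_mrc(segments):
    """Assemble a master recession curve by horizontally translating
    each recession segment so its first (highest) value lands on the
    curve assembled so far.

    segments: list of (t, h) arrays, each h strictly decreasing,
    ordered from highest starting value to lowest.
    """
    t0, h0 = segments[0]
    mrc_t, mrc_h = list(t0 - t0[0]), list(h0)
    for t, h in segments[1:]:
        # time on the current MRC where the new segment's vertex fits
        # (np.interp needs an increasing x-grid, hence the reversal)
        t_shift = np.interp(h[0], mrc_h[::-1], mrc_t[::-1])
        mrc_t.extend(t - t[0] + t_shift)
        mrc_h.extend(h)
    order = np.argsort(mrc_t)
    return np.array(mrc_t)[order], np.array(mrc_h)[order]
```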
Development of Practical Methods for Assessing the Chronic Toxicity of Effluents
This is a short introductory essay, invited as front matter to the journal "Environmental Toxicology and Chemistry". It was invited to provide an overview of the development of a previous article by the first author (when he was an EPA employee) which is being recognized as one ...
Innovation 101: Promoting Undergraduate Innovation through a Two-Day Boot Camp
ERIC Educational Resources Information Center
West, Richard E.; Tateishi, Isaku; Wright, Geoffrey A.; Fonoimoana, Melia
2012-01-01
Over the years, many training methods for creativity and innovation have been developed. Despite these programs and research, further improvement is necessary, particularly in schools of technology and engineering education, where previous efforts have focused on developing solutions to defined problems, not in identifying and defining the…
Grinding and cooking dry-mill germ to optimize aqueous enzymatic oil extraction
USDA-ARS?s Scientific Manuscript database
The many recent dry grind plants that convert corn to ethanol are potential sources of substantial amounts of corn oil. This report describes an aqueous enzymatic extraction (AEE) method to separate oil from dry-mill corn germ (DMG). The method is an extension of AEE previously developed for wet...
das Neves Costa, Fernanda; Hubert, Jane; Borie, Nicolas; Kotland, Alexis; Hewitson, Peter; Ignatova, Svetlana; Renault, Jean-Hugues
2017-03-03
Countercurrent chromatography (CCC) and centrifugal partition chromatography (CPC) are support-free liquid-liquid chromatography techniques sharing the same basic principles and features. Method transfer has previously been demonstrated for both techniques but never from one to another. This study aimed to show such a feasibility using fractionation of Schinus terebinthifolius berries dichloromethane extract as a case study. Heptane-ethyl acetate-methanol-water (6:1:6:1, v/v/v/v) was used as the solvent system, with masticadienonic and 3β-masticadienolic acids as target compounds. The optimized separation methodology previously described in Parts I and II was scaled up from an analytical hydrodynamic CCC column (17.4 mL) to preparative hydrostatic CPC instruments (250 mL and 303 mL) as part of method development. Flow rate and sample loading were further optimized on CPC. Mobile phase linear velocity is suggested as a transfer-invariant parameter if the CPC column contains a sufficient number of partition cells. Copyright © 2017 Elsevier B.V. All rights reserved.
NASA Technical Reports Server (NTRS)
Lutz, O
1940-01-01
Using a previously developed method, the boundary processes of four-stroke-cycle engines are set up. The results deviate considerably from those obtained under the assumption that the velocity fluctuation is proportional to the cylinder piston motion. The deviation is less at the positions of resonance frequencies. By the method developed, the effect of the resonance vibrations on the volumetric efficiency can be demonstrated.
Koski, Antti; Tossavainen, Timo; Juhola, Martti
2004-01-01
Electrocardiogram (ECG) signals are the most prominent biomedical signal type used in clinical medicine. Their compression is important and widely researched in the medical informatics community. In the previous literature, compression efficacy has been investigated only in terms of how much known or newly developed methods reduced the storage required by compressed forms of the original ECG signals. Sometimes statistical signal evaluations based on, for example, root mean square error were studied. In previous research we developed a refined method for signal compression and tested it jointly with several known techniques on other biomedical signals. Our method of so-called successive approximation quantization used with wavelets was one of the most successful in those tests. In this paper, we studied to what extent these lossy compression methods altered the values of medical parameters (medical information) computed from the signals. Since the methods are lossy, some information is lost due to compression when a high enough compression ratio is reached. We found that ECG signals sampled at 400 Hz could be compressed to one fourth of their original storage space while the values of their medical parameters changed by less than 5%, which indicates reliable results.
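A minimal wavelet-domain compression experiment of the kind described can be set up with PyWavelets. The sketch below substitutes simple hard thresholding for the paper's successive approximation quantization, so it illustrates the workflow (compress, reconstruct, re-measure clinical parameters) rather than the exact method.

```python
import numpy as np
import pywt

def compress_ecg(signal, wavelet="db4", level=5, keep=0.25):
    """Lossy wavelet compression sketch: keep only the largest fraction
    `keep` of detail coefficients (hard thresholding stands in for the
    paper's successive approximation quantization).
    """
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    flat = np.concatenate([np.abs(c) for c in coeffs[1:]])
    thresh = np.quantile(flat, 1.0 - keep)
    kept = [coeffs[0]] + [pywt.threshold(c, thresh, mode="hard")
                          for c in coeffs[1:]]
    # trim: waverec can return one extra sample for odd lengths
    return pywt.waverec(kept, wavelet)[:len(signal)]

# E.g., a 400 Hz ECG trace reduced to roughly a quarter of its detail
# coefficients, after which medical parameters can be re-measured and
# compared against those from the original signal.
```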
A gas-kinetic BGK scheme for the compressible Navier-Stokes equations
NASA Technical Reports Server (NTRS)
Xu, Kun
2000-01-01
This paper presents an improved gas-kinetic scheme based on the Bhatnagar-Gross-Krook (BGK) model for the compressible Navier-Stokes equations. The current method extends the previous gas-kinetic Navier-Stokes solver developed by Xu and Prendergast by implementing a general nonequilibrium state to represent the gas distribution function at the beginning of each time step. As a result, the requirement in the previous scheme, such as the particle collision time being less than the time step for the validity of the BGK Navier-Stokes solution, is removed. Therefore, the applicable regime of the current method is much enlarged and the Navier-Stokes solution can be obtained accurately regardless of the ratio between the collision time and the time step. The gas-kinetic Navier-Stokes solver developed by Chou and Baganoff is the limiting case of the current method, and it is valid only under such a limiting condition. Also, in this paper, the appropriate implementation of boundary condition for the kinetic scheme, different kinetic limiting cases, and the Prandtl number fix are presented. The connection among artificial dissipative central schemes, Godunov-type schemes, and the gas-kinetic BGK method is discussed. Many numerical tests are included to validate the current method.
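For context, the BGK model replaces the full Boltzmann collision integral with relaxation toward an equilibrium state; in standard notation (assumed here):

```latex
\frac{\partial f}{\partial t} + \mathbf{u}\cdot\nabla_{\mathbf{x}} f = \frac{g - f}{\tau}
```

where f is the gas distribution function, g the local equilibrium (Maxwellian) state, and τ the particle collision time. The improvement described above removes the requirement that τ be small relative to the time step Δt, so the scheme remains valid across the full range of τ/Δt.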
NASA Technical Reports Server (NTRS)
Trimble, Jay Phillip
2013-01-01
This is based on a previous talk on agile development. Methods for delivering software on a short cycle are described, including interactions with the customer, the effect on the team, and how to be more effective, streamlined, and efficient.
Pressure algorithm for elliptic flow calculations with the PDF method
NASA Technical Reports Server (NTRS)
Anand, M. S.; Pope, S. B.; Mongia, H. C.
1991-01-01
An algorithm to determine the mean pressure field for elliptic flow calculations with the probability density function (PDF) method is developed and applied. The PDF method is a most promising approach for the computation of turbulent reacting flows. Previous computations of elliptic flows with the method were in conjunction with conventional finite volume based calculations that provided the mean pressure field. The algorithm developed and described here permits the mean pressure field to be determined within the PDF calculations. The PDF method incorporating the pressure algorithm is applied to the flow past a backward-facing step. The results are in good agreement with data for the reattachment length, mean velocities, and turbulence quantities including triple correlations.
Eyal-Altman, Noah; Last, Mark; Rubin, Eitan
2017-01-17
Numerous publications attempt to predict cancer survival outcome from gene expression data using machine-learning methods. A direct comparison of these works is challenging for the following reasons: (1) inconsistent measures used to evaluate the performance of different models, and (2) incomplete specification of critical stages in the process of knowledge discovery. There is a need for a platform that would allow researchers to replicate previous works and to test the impact of changes in the knowledge discovery process on the accuracy of the induced models. We developed the PCM-SABRE platform, which supports the entire knowledge discovery process for cancer outcome analysis. PCM-SABRE was developed using KNIME. By using PCM-SABRE to reproduce the results of previously published works on breast cancer survival, we define a baseline for evaluating future attempts to predict cancer outcome with machine learning. We used PCM-SABRE to replicate previous work describing predictive models of breast cancer recurrence and tested the performance of all possible combinations of the feature selection methods and data mining algorithms used in either of the works. We reconstructed the work of Chou et al., observing similar trends: superior performance of the Probabilistic Neural Network (PNN) and logistic regression (LR) algorithms, and an inconclusive impact of feature pre-selection with the decision tree algorithm on subsequent analysis. PCM-SABRE is a software tool that provides an intuitive environment for rapid development of predictive models in cancer precision medicine.
Peterson, L W; Hardin, M; Nitsch, M J
1995-05-01
Primary care physicians can be instrumental in the initial identification of potential sexual, emotional, and physical abuse of children. We reviewed the use of children's artwork as a method of communicating individual and family functioning. A quantitative method of analyzing children's artwork provides more reliability and validity than some methods used previously. A new scoring system was developed that uses individual human figure drawings and kinetic family drawings. This scoring system was based on research with 842 children (341 positively identified as sexually molested, 252 positively not sexually molested but having emotional or behavioral problems, and 249 "normal" public school children). This system is more comprehensive than previous systems of assessment of potential abuse.
Takahashi, Daisuke; Inomata, Tatsuji; Fukui, Tatsuya
2017-06-26
We previously reported an efficient peptide synthesis method, AJIPHASE®, that comprises repeated reactions and isolations by precipitation. This method utilizes an anchor molecule with long-chain alkyl groups as a protecting group for the C-terminus. To further improve this method, we developed a one-pot synthesis of a peptide sequence wherein the synthetic intermediates were isolated by solvent extraction instead of precipitation. A branched-chain anchor molecule was used in the new process, significantly enhancing the solubility of long peptides and the operational efficiency compared with the previous method, which employed precipitation for isolation and a straight-chain aliphatic group. Another prerequisite for this solvent-extraction-based strategy was the use of thiomalic acid and DBU for Fmoc deprotection, which facilitates the removal of byproducts, such as the fulvene adduct. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
Development of new methodologies for evaluating the energy performance of new commercial buildings
NASA Astrophysics Data System (ADS)
Song, Suwon
The concept of Measurement and Verification (M&V) of a new building continues to become more important because efficient design alone is often not sufficient to deliver an efficient building. Simulation models that are calibrated to measured data can be used to evaluate the energy performance of new buildings if they are compared to energy baselines such as similar buildings, energy codes, and design standards. Unfortunately, there is a lack of detailed M&V methods and analysis methods to measure energy savings from new buildings that would have hypothetical energy baselines. Therefore, this study developed and demonstrated several new methodologies for evaluating the energy performance of new commercial buildings using a case-study building in Austin, Texas. First, three new M&V methods were developed to enhance the previous generic M&V framework for new buildings, including: (1) The development of a method to synthesize weather-normalized cooling energy use from a correlation of Motor Control Center (MCC) electricity use when chilled water use is unavailable, (2) The development of an improved method to analyze measured solar transmittance against incidence angle for sample glazing using different solar sensor types, including Eppley PSP and Li-Cor sensors, and (3) The development of an improved method to analyze chiller efficiency and operation at part-load conditions. Second, three new calibration methods were developed and analyzed, including: (1) A new percentile analysis added to the previous signature method for use with a DOE-2 calibration, (2) A new analysis to account for undocumented exhaust air in DOE-2 calibration, and (3) An analysis of the impact of synthesized direct normal solar radiation using the Erbs correlation on DOE-2 simulation. Third, an analysis of the actual energy savings compared to three different energy baselines was performed, including: (1) Energy Use Index (EUI) comparisons with sub-metered data, (2) New comparisons against Standards 90.1-1989 and 90.1-2001, and (3) A new evaluation of the performance of selected Energy Conservation Design Measures (ECDMs). Finally, potential energy savings were also simulated from selected improvements, including: minimum supply air flow, undocumented exhaust air, and daylighting.
An analytical method to predict efficiency of aircraft gearboxes
NASA Technical Reports Server (NTRS)
Anderson, N. E.; Loewenthal, S. H.; Black, J. D.
1984-01-01
A spur gear efficiency prediction method previously developed by the authors was extended to include power loss of planetary gearsets. A friction coefficient model was developed for MIL-L-7808 oil based on disc machine data. This combined with the recent capability of predicting losses in spur gears of nonstandard proportions allows the calculation of power loss for complete aircraft gearboxes that utilize spur gears. The method was applied to the T56/501 turboprop gearbox and compared with measured test data. Bearing losses were calculated with large scale computer programs. Breakdowns of the gearbox losses point out areas for possible improvement.
[Treatment of gamma-hydroxybutyrate withdrawal].
Strand, Niels August Willer; Petersen, Tonny Studsgaard; Nielsen, Lars Martin; Boegevig, Soren
2017-12-11
Gamma-hydroxybutyrate (GHB) is a drug of abuse, for which physical addiction develops quickly. GHB withdrawal can develop into a life-threatening condition and has previously been treated mainly with benzodiazepines. These have not always proven effective, leading to long hospitalizations in intensive care units. Based on successful Dutch treatment results for using GHB to treat GHB withdrawal symptoms, we propose to implement a similar method in Denmark. The method requires an interdisciplinary effort for which The Danish Poison Information Centre should be consulted for expertise.
The Development of a Decision Support System for Mobile Learning: A Case Study in Taiwan
ERIC Educational Resources Information Center
Chiu, Po-Sheng; Huang, Yueh-Min
2016-01-01
While mobile learning (m-learning) has considerable potential, most of previous strategies for developing this new approach to education were analysed using the knowledge, experience and judgement of individuals, with the support of statistical software. Although these methods provide systematic steps for the implementation of m-learning…
ERIC Educational Resources Information Center
Mursu, Anja; Luukkonen, Irmeli; Toivanen, Marika; Korpela, Mikko
2007-01-01
Introduction: The purpose of information systems is to facilitate work activities: here we consider how Activity Theory can be applied in information systems development. Method. The requirements for an analytical model for emancipatory, work-oriented information systems research and practice are specified. Previous research work in Activity…
Teacher Participation in Online Professional Development: Exploring Academic Year Classroom Impacts
ERIC Educational Resources Information Center
Opfer, Thomas
2017-01-01
The purpose of this mixed methods case study research was to investigate the reasons teachers chose online professional development (OPD) focusing on technology integration and how this OPD impacted teachers' classroom practices over a six month period. Previous research identified that OPD provides flexibility beyond what traditional face-to-face…
The goals of this study were to develop and validate a Rapid Assessment Method (RAM) for assessing the condition of coastal wetlands in New England, USA. Eighty-one coastal wetland sites were assessed; nested within these were ten reference sites which were previously assessed us...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gao Yajun
A previously established Hauser-Ernst-type extended double-complex linear system is slightly modified and used to develop an inverse scattering method for the stationary axisymmetric general symplectic gravity model. The reduction procedures in this inverse scattering method are found to be fairly simple, which makes the method straightforward to apply and effective. As an application, a concrete family of soliton double solutions for the considered theory is obtained.
FLARE STARS—A FAVORABLE OBJECT FOR STUDYING MECHANISMS OF NONTHERMAL ASTROPHYSICAL PHENOMENA
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oks, E.; Gershberg, R. E.
2016-03-01
We present a spectroscopic method for diagnosing low-frequency electrostatic plasma turbulence (LEPT) in the plasmas of flare stars. This method had been previously developed by one of us and successfully applied to diagnosing the LEPT in solar flares. In distinction to our previous applications of the method, here we use the latest advances in the theory of the Stark broadening of hydrogen spectral lines. By analyzing observed emission Balmer lines, we show that it is very likely that the LEPT was developed in several flares of AD Leo, as well as in one flare of EV Lac. We found the LEPT (though of different field strengths) both in the explosive/impulsive phase and at the phase of the maximum, as well as at the gradual phase of the stellar flares. While for solar flares our method allows diagnosing the LEPT only in the most powerful flares, for the flare stars it seems that the method allows revealing the LEPT in practically every flare. It should be important to obtain new and better spectrograms of stellar flares, allowing their analysis by the method outlined in the present paper. This can be the most favorable way to a detailed understanding of the nature of nonthermal astrophysical phenomena.
Enzymatic modification of schizophyllan
USDA-ARS?s Scientific Manuscript database
An enzymatic method was developed for the progressive modification of the polysaccharide schizophyllan. Fungal strains Hypocrea nigricans NRRL 62555, Penicillium crustosum NRRL 62558, and Penicillium simplicissimum NRRL 62550 were previously identified as novel sources of ß-endoglucanase with specif...
ERIC Educational Resources Information Center
Kilburn, Daniel; Nind, Melanie; Wiles, Rose
2014-01-01
In light of calls to improve the capacity for social science research within UK higher education, this article explores the possibilities for an emerging pedagogy for research methods. A lack of pedagogical culture in this field has been identified by previous studies. In response, we examine pedagogical literature surrounding approaches for…
The exact solution of the monoenergetic transport equation for critical cylinders
NASA Technical Reports Server (NTRS)
Westfall, R. M.; Metcalf, D. R.
1972-01-01
An analytic solution for the critical, monoenergetic, bare, infinite cylinder is presented. The solution is obtained by modifying a previous development based on a neutron density transform and Case's singular eigenfunction method. Numerical results for critical radii and the neutron density as a function of position are included and compared with the results of other methods.
Genetics-based methods for detection of Salmonella spp. in foods.
Mozola, Mark A
2006-01-01
Genetic methods are now at the forefront of foodborne pathogen testing. The sensitivity, specificity, and inclusivity advantages offered by deoxyribonucleic acid (DNA) probe technology have driven an intense effort in methods development over the past 20 years. DNA probe-based methods for Salmonella spp. and other pathogens have progressed from time-consuming procedures involving the use of radioisotopes to simple, high throughput, automated assays. The analytical sensitivity of nucleic acid amplification technology has facilitated a reduction in analysis time by allowing enriched samples to be tested for previously undetectable quantities of analyte. This article will trace the evolution of the development of genetic methods for detection of Salmonella in foods, review the basic assay formats and their advantages and limitations, and discuss method performance characteristics and considerations for selection of methods.
Leal, Julie Ehret; Thompson, Amy N; Brzezinski, Walter A
2010-01-01
To evaluate public awareness of pharmaceuticals in drinking water and to develop educational efforts to promote awareness in our community. A review of the literature was conducted to gain a full perspective of the current issue. Questionnaires, interviews, and website feedback were used to assess awareness of the problem and the most commonly used medication disposal methods. In addition, educational flyers were created to disseminate information to the public. The questionnaires were completed by a total of 96 respondents. Of respondents employed in health care, 72% had previous knowledge of pharmaceutical medications being found in our local (Charleston, SC) water supply, and of respondents not employed in health care, 54% had previous knowledge. For those with previous knowledge, 7% disposed of medications in the toilet or sink, 38% used the trash, and 36% used multiple methods. Of respondents indicating no previous knowledge, 3% disposed of medications in the toilet or sink, 35% used the trash, and 42% used multiple methods. Public awareness of pharmaceuticals in drinking water and educational efforts focusing on proper disposal of medications are essential in helping to reduce drinking water contamination.
NASA Technical Reports Server (NTRS)
Coen, Peter G.
1991-01-01
A new computer technique for the analysis of transport aircraft sonic boom signature characteristics was developed. This new technique, based on linear theory methods, combines the previously separate equivalent area and F function development with a signature propagation method using a single geometry description. The new technique was implemented in a stand-alone computer program and was incorporated into an aircraft performance analysis program. Through these implementations, both configuration designers and performance analysts are given new capabilities to rapidly analyze an aircraft's sonic boom characteristics throughout the flight envelope.
A strategy for evaluating pathway analysis methods.
Yu, Chenggang; Woo, Hyung Jun; Yu, Xueping; Oyama, Tatsuya; Wallqvist, Anders; Reifman, Jaques
2017-10-13
Researchers have previously developed a multitude of methods designed to identify biological pathways associated with specific clinical or experimental conditions of interest, with the aim of facilitating biological interpretation of high-throughput data. Before practically applying such pathway analysis (PA) methods, we must first evaluate their performance and reliability, using datasets where the pathways perturbed by the conditions of interest have been well characterized in advance. However, such 'ground truths' (or gold standards) are often unavailable. Furthermore, previous evaluation strategies that have focused on defining 'true answers' are unable to systematically and objectively assess PA methods under a wide range of conditions. In this work, we propose a novel strategy for evaluating PA methods independently of any gold standard, either established or assumed. The strategy involves the use of two mutually complementary metrics, recall and discrimination. Recall measures the consistency between the perturbed pathways identified by applying a particular analysis method to an original large dataset and those identified by applying the same method to a sub-dataset of the original dataset. In contrast, discrimination measures specificity: the degree to which the perturbed pathways identified by applying a particular method to a dataset from one experiment differ from those identified by applying the same method to a dataset from a different experiment. We used these metrics and 24 datasets to evaluate six widely used PA methods. The results highlighted the common challenge in reliably identifying significant pathways from small datasets. Importantly, we confirmed the effectiveness of our proposed dual-metric strategy by showing that previous comparative studies corroborate the performance evaluations of the six methods obtained by our strategy. Unlike any previously proposed strategy for evaluating the performance of PA methods, our dual-metric strategy does not rely on any ground truth, either established or assumed, of the pathways perturbed by a specific clinical or experimental condition. As such, our strategy allows researchers to systematically and objectively evaluate pathway analysis methods by employing any number of datasets for a variety of conditions.
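As a rough illustration of the two metrics, the sketch below scores a method from the significant-pathway sets it returns. The Jaccard-style overlap and all names are illustrative assumptions; the paper defines its own formulations of recall and discrimination.

```python
# Minimal sketch of the dual-metric idea (recall and discrimination)
# using a Jaccard-style set overlap; an assumption, not the paper's formula.

def overlap(a, b):
    """Fraction of pathways shared between two significant-pathway sets."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def recall(full_hits, subset_hits):
    # Consistency between pathways found on the full dataset and on a
    # sub-dataset of it: high recall means the method is stable.
    return overlap(full_hits, subset_hits)

def discrimination(hits_exp1, hits_exp2):
    # Specificity across unrelated experiments: pathways flagged for
    # different conditions should differ, so low overlap is better.
    return 1.0 - overlap(hits_exp1, hits_exp2)

# Toy usage with made-up pathway names
full = ["apoptosis", "p53 signaling", "cell cycle"]
sub = ["apoptosis", "cell cycle"]
other = ["olfactory transduction"]
print(recall(full, sub))            # ~0.67: fairly consistent
print(discrimination(full, other))  # 1.0: fully distinct conditions
```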
2009-06-01
Kirkpatrick's (1976) hierarchy of training evaluation was applied to examine three levels of CRM development within each aviation community. Applying methods and techniques used in previous Navy CRM assessments and CRM evaluation research, this thesis provided an updated evaluation of the Naval CRM program to fill...
Tapia-Lewin, Sebastián; Vergara, Karina; De La Barra, Christian; Godoy, Natalio; Castilla, Juan Carlos; Gelcich, Stefan
2017-10-01
Artisanal fishery activities support the livelihoods of millions of people worldwide, particularly in developing countries. Within these fisheries, distal global drivers can promote switching between alternative target resources. These drivers can promote the rapid development of new, unregulated and previously unexploited fisheries that pose a threat to the sustainability of ecosystems. In this paper, we describe a new artisanal shore gathering activity that targets a previously unexploited resource: the sandhopper (Orchestoidea tuberculata). The activity is driven by aquarium trade demand for food. We used mixed methods to describe the activity, assessed basic socio-economic incentives, and estimated Catches per Unit Effort. Results show that the sandhopper plays an important role for the livelihoods of shore gatherers engaged in the activity. Gatherers have adapted and developed two main extraction methods with different degrees of investment and extraction rates. Furthermore, gatherers have developed local knowledge regarding the ecology and management of the resource. Results show that economic incentives can motivate a rapid expansion of this unregulated activity. Future research gaps and management options to address the development of this fishery are discussed in light of these findings.
Occupied Volume Integrity Testing : Elastic Test Results and Analyses
DOT National Transportation Integrated Search
2011-09-21
Federal Railroad Administration (FRA) and the Volpe Center have been conducting research into developing an alternative method of demonstrating occupied volume integrity (OVI) through a combination of testing and analysis. Previous works have been pu...
Projected 1981 exposure estimates using iterative proportional fitting
DOT National Transportation Integrated Search
1985-10-01
1981 VMT estimates categorized by eight driver, vehicle, and environmental variables are produced. These 1981 estimates are produced using analytical methods developed in a previous report. The estimates are based on 1977 NPTS data (the latest ...
Development of a Nonequilibrium Radiative Heating Prediction Method for Coupled Flowfield Solutions
NASA Technical Reports Server (NTRS)
Hartung, Lin C.
1991-01-01
A method for predicting radiative heating and coupling effects in nonequilibrium flow-fields has been developed. The method resolves atomic lines with a minimum number of spectral points, and treats molecular radiation using the smeared band approximation. To further minimize computational time, the calculation is performed on an optimized spectrum, which is computed for each flow condition to enhance spectral resolution. Additional time savings are obtained by performing the radiation calculation on a subgrid optimally selected for accuracy. Representative results from the new method are compared to previous work to demonstrate that the speedup does not cause a loss of accuracy and is sufficient to make coupled solutions practical. The method is found to be a useful tool for studies of nonequilibrium flows.
NASA Technical Reports Server (NTRS)
Henderson, R. G.; Thomas, G. S.; Nalepka, R. F.
1975-01-01
Methods of performing signature extension, using LANDSAT-1 data, are explored. The emphasis is on improving the performance and cost-effectiveness of large-area wheat surveys. Two methods were developed: ASC and MASC. Two methods previously used with aircraft data, Ratio and RADIFF, were adapted to and tested on LANDSAT-1 data. An investigation into the sources and nature of between-scene data variations was included. Initial investigations into the selection of training fields without in situ ground truth were undertaken.
The Development of a Rapid Prototyping Environment
1989-12-01
constraints is a very complex, time-consuming and costly task. This situation can be improved by the use of adequate development methods and powerful support... was an essential factor in our decision. The previous development of CAPS tools utilized and assumed the availability of a Sun Workstation. There... the development of a production system. An early decision was made to accept dependence upon the best locally available resources. Portability...
1999-01-01
contaminating the surface. Research efforts to develop an improved sampling method have previously been limited to deposits made from solutions of explosives...explosive per fingerprint calculated in this way has too much variation to allow determination of sampling efficiency or to use this method to prepare...crystals is put into suspension, the actual amount is determined by usual methods including high-performance liquid chromatography (HPLC), gas
Rakesh Minocha; Stephanie Long
2004-01-01
The objective of the present study was to develop a rapid HPLC method for simultaneous separation and quantitation of dansylated amino acids and common polyamines in the same matrix for analyzing forest tree tissues and cell cultures. The major modifications incorporated into this method as compared to previously published HPLC methods for separation of only dansyl...
Meshless Local Petrov-Galerkin Method for Bending Problems
NASA Technical Reports Server (NTRS)
Phillips, Dawn R.; Raju, Ivatury S.
2002-01-01
Recent literature shows extensive research work on meshless or element-free methods as alternatives to the versatile Finite Element Method. One such meshless method is the Meshless Local Petrov-Galerkin (MLPG) method. In this report, the method is developed for bending of beams - C1 problems. A generalized moving least squares (GMLS) interpolation is used to construct the trial functions, and spline and power weight functions are used as the test functions. The method is applied to problems for which exact solutions are available to evaluate its effectiveness. The accuracy of the method is demonstrated for problems with load discontinuities and continuous beam problems. A Petrov-Galerkin implementation of the method is shown to greatly reduce computational time and effort and is thus preferable over the previously developed Galerkin approach. The MLPG method for beam problems yields very accurate deflections and slopes and continuous moment and shear forces without the need for elaborate post-processing techniques.
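To make the interpolation step concrete, here is a minimal one-dimensional moving least squares (MLS) sketch with a linear basis and Gaussian weight, the kind of construction MLPG-type trial functions are built from. The node layout, weight radius, and basis order are illustrative assumptions; the report itself uses a generalized MLS (GMLS) that also incorporates slope information.

```python
import numpy as np

# Minimal 1-D moving least squares (MLS) approximation: a locally
# weighted linear fit evaluated at a point. A sketch under stated
# assumptions, not the report's GMLS implementation.

def mls_value(x_eval, nodes, values, radius=0.3):
    p = lambda x: np.array([1.0, x])               # linear basis
    w = np.exp(-((nodes - x_eval) / radius) ** 2)  # Gaussian weights
    A = sum(wi * np.outer(p(xi), p(xi)) for wi, xi in zip(w, nodes))
    b = sum(wi * p(xi) * vi for wi, xi, vi in zip(w, nodes, values))
    coeff = np.linalg.solve(A, b)                  # local weighted fit
    return p(x_eval) @ coeff

nodes = np.linspace(0.0, 1.0, 11)
values = nodes ** 2                 # sample a smooth field
print(mls_value(0.45, nodes, values))  # roughly 0.45**2 (linear basis smooths curvature)
```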
General object-oriented software development
NASA Technical Reports Server (NTRS)
Seidewitz, Edwin V.; Stark, Mike
1986-01-01
Object-oriented design techniques are gaining increasing popularity for use with the Ada programming language. A general approach to object-oriented design is presented which synthesizes the principles of previous object-oriented methods into the overall software life-cycle, providing transitions from specification to design and from design to code. It therefore provides the basis for a general object-oriented development methodology.
ERIC Educational Resources Information Center
Drewett, R. F.; Corbett, S. S.; Wright, C. M.
2006-01-01
Background: Previous studies suggest that failure to thrive in infancy may be associated with adverse sequelae in childhood. Although cognitive abilities have been extensively investigated, little systematic research is available on other aspects of development. Methods: Eighty-nine children who failed to thrive as infants and 91 controls were…
ERIC Educational Resources Information Center
Wetzel, Angela Payne
2011-01-01
Previous systematic reviews indicate a lack of reporting of reliability and validity evidence in subsets of the medical education literature. Psychology and general education reviews of factor analysis also indicate gaps between current and best practices; yet, a comprehensive review of exploratory factor analysis in instrument development across…
ERIC Educational Resources Information Center
McClain, Arianna D.; Hekler, Eric B.; Gardner, Christopher D.
2013-01-01
Background: Previous research from the fields of computer science and engineering highlight the importance of an iterative design process (IDP) to create more creative and effective solutions. Objective: This study describes IDP as a new method for developing health behavior interventions and evaluates the effectiveness of a dining hall--based…
ERIC Educational Resources Information Center
Hong, Guanglei; Yu, Bing
2008-01-01
This study examines the effects of kindergarten retention on children's social-emotional development in the early, middle, and late elementary years. Previous studies have generated mixed results partly due to some major methodological challenges, including selection bias, measurement error, and divergent perceptions of multiple respondents in…
Developing Emotional Literacy through Individual Dance Movement Therapy: A Pilot Study
ERIC Educational Resources Information Center
Meekums, Bonnie
2008-01-01
This paper reports a pragmatic mixed methods pilot study of teacher perceptions regarding a school-based Dance Movement therapy (DMT) service for six children aged four to seven in a North of England primary school. No previous studies have systematically evaluated DMT in terms of the development of Emotional Literacy (EL), though theoretical…
Liu, Wenying; Yeh, Yi-Chun; Lipner, Justin; Xie, Jingwei; Sung, Hsing-Wen; Thomopoulos, Stavros; Xia, Younan
2011-01-01
A new method was developed to coat hydroxyapatite (HAp) onto electrospun poly(lactic-co-glycolic acid) (PLGA) nanofibers for tendon-to-bone insertion site repair applications. Prior to mineralization, chitosan and heparin were covalently immobilized onto the surface of the fibers to accelerate the nucleation of bone-like HAp crystals. Uniform coatings of HAp were obtained by immersing the nanofiber scaffolds into a modified 10 times concentrated simulated body fluid (m10SBF) for different periods of time. The new method resulted in thicker and denser coatings of mineral on the fibers compared to previously reported methods. Scanning electron microscopy measurements confirmed the formation of nanoscale HAp particles on the fibers. Mechanical property assessment demonstrated higher stiffness with respect to previous coating methods. A combination of the nanoscale fibrous structure and bone-like mineral coating could mimic the structure, composition, and function of mineralized tissues. PMID:21710996
NASA Astrophysics Data System (ADS)
Takahashi, Hiroki; Hasegawa, Hideyuki; Kanai, Hiroshi
2013-07-01
To facilitate analysis and eliminate operator dependence in estimating myocardial function in echocardiography, we previously developed a method for automated identification of the heart wall. However, misclassified regions remain because the magnitude-squared coherence (MSC) function of echo signals, one of the features used in the previous method, is sensitive to clutter components such as multiple reflections and off-axis echoes from external tissue or the nearby myocardium. The objective of the present study is to improve the performance of automated identification of the heart wall. For this purpose, we propose a method to suppress the effect of the clutter components on the MSC of echo signals by applying an adaptive moving target indicator (MTI) filter to the echo signals. In vivo experimental results showed that the misclassified regions were significantly reduced using the proposed method in the longitudinal-axis view of the heart.
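For readers unfamiliar with the feature, the magnitude-squared coherence between two signals can be estimated with standard tools; the sketch below uses SciPy's Welch-based estimator on toy signals. The sampling rate, segment length, and signals are illustrative assumptions, not the echo data of the study.

```python
import numpy as np
from scipy.signal import coherence

# Magnitude-squared coherence (MSC) between two noisy signals sharing
# a common component; a toy stand-in for the echo-signal feature.
fs = 10e3
t = np.arange(0, 0.1, 1 / fs)
rng = np.random.default_rng(0)
common = np.sin(2 * np.pi * 500 * t)            # shared (wall-like) component
x = common + 0.3 * rng.standard_normal(t.size)
y = common + 0.3 * rng.standard_normal(t.size)  # second signal, same shared part

f, msc = coherence(x, y, fs=fs, nperseg=256)    # Welch-based MSC estimate
print(f[np.argmax(msc)])                        # peaks near 500 Hz
```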
Hurst, William J; Stanley, Bruce; Glinski, Jan A; Davey, Matthew; Payne, Mark J; Stuart, David A
2009-10-15
This report describes the characterization of a series of commercially available procyanidin standards ranging from dimers DP = 2 to decamers DP = 10 for the determination of procyanidins from cocoa and chocolate. Using a combination of HPLC with fluorescence detection and MALDI-TOF mass spectrometry, the purity of each standard was determined, and these data were used to determine relative response factors. These response factors were compared with response factors obtained from published methods. Data comparing the procyanidin analysis of a commercially available US dark chocolate calculated using each of the calibration methods indicate divergent results and demonstrate that previous methods may significantly underreport the procyanidins in cocoa-containing products. These results have far-reaching implications because the previous calibration methods have been used to develop data for a variety of scientific reports, including food databases and clinical studies.
Development of a new flux splitting scheme
NASA Technical Reports Server (NTRS)
Liou, Meng-Sing; Steffen, Christopher J., Jr.
1991-01-01
The use of a new splitting scheme, the advection upstream splitting method, for model aerodynamic problems where Van Leer and Roe schemes had failed previously is discussed. The present scheme is based on splitting in which the convective and pressure terms are separated and treated differently depending on the underlying physical conditions. The present method is found to be both simple and accurate.
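A minimal one-dimensional sketch of the AUSM idea, splitting the flux into a Mach-number-upwinded convective part and a separately split pressure part, follows. The splitting polynomials below are the commonly published AUSM forms and the sample states are illustrative; this is a sketch, not the authors' code.

```python
import numpy as np

# Minimal 1-D AUSM interface flux for the Euler equations, in the
# split convective + pressure form the abstract describes. gamma and
# the sample states are illustrative assumptions.
GAMMA = 1.4

def split_mach(M):
    """Van-Leer-type Mach splittings M+ and M- used by AUSM."""
    if abs(M) <= 1.0:
        return 0.25 * (M + 1.0) ** 2, -0.25 * (M - 1.0) ** 2
    return 0.5 * (M + abs(M)), 0.5 * (M - abs(M))

def split_pressure(M):
    """Pressure splittings p+ and p- (the (1/4)(M±1)^2(2∓M) variant)."""
    if abs(M) <= 1.0:
        return 0.25 * (M + 1) ** 2 * (2 - M), 0.25 * (M - 1) ** 2 * (2 + M)
    return float(M > 0), float(M < 0)

def ausm_flux(rho_L, u_L, p_L, rho_R, u_R, p_R):
    a_L = np.sqrt(GAMMA * p_L / rho_L)  # speeds of sound
    a_R = np.sqrt(GAMMA * p_R / rho_R)
    H_L = GAMMA / (GAMMA - 1) * p_L / rho_L + 0.5 * u_L ** 2  # total enthalpies
    H_R = GAMMA / (GAMMA - 1) * p_R / rho_R + 0.5 * u_R ** 2
    M_half = split_mach(u_L / a_L)[0] + split_mach(u_R / a_R)[1]
    # Convective part: fully upwinded according to the interface Mach number.
    phi = (np.array([rho_L * a_L, rho_L * a_L * u_L, rho_L * a_L * H_L])
           if M_half >= 0 else
           np.array([rho_R * a_R, rho_R * a_R * u_R, rho_R * a_R * H_R]))
    # Pressure part: treated separately, entering only the momentum equation.
    p_half = split_pressure(u_L / a_L)[0] * p_L + split_pressure(u_R / a_R)[1] * p_R
    return M_half * phi + np.array([0.0, p_half, 0.0])

print(ausm_flux(1.0, 0.0, 1.0, 0.125, 0.0, 0.1))  # Sod-like interface states
```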
Hard-Rock Stability Analysis for Span Design in Entry-Type Excavations with Learning Classifiers
García-Gonzalo, Esperanza; Fernández-Muñiz, Zulima; García Nieto, Paulino José; Bernardo Sánchez, Antonio; Menéndez Fernández, Marta
2016-01-01
The mining industry relies heavily on empirical analysis for design and prediction. An empirical design method, called the critical span graph, was developed specifically for rock stability analysis in entry-type excavations, based on an extensive case-history database of cut and fill mining in Canada. This empirical span design chart plots the critical span against rock mass rating for the observed case histories and has been accepted by many mining operations for the initial span design of cut and fill stopes. Different types of analysis have been used to classify the observed cases into stable, potentially unstable and unstable groups. The main purpose of this paper is to present a new method for defining rock stability areas of the critical span graph, which applies machine learning classifiers (support vector machine and extreme learning machine). The results show a reasonable correlation with previous guidelines. These machine learning methods are good tools for developing empirical methods, since they make no assumptions about the regression function. With this software, it is easy to add new field observations to a previous database, improving prediction output with the addition of data that consider the local conditions for each mine. PMID:28773653
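As an illustration of the classification step, the sketch below trains a support vector machine on (span, rock mass rating) pairs labelled stable, potentially unstable, or unstable. The case data are fabricated for demonstration; the paper trains on the Canadian cut-and-fill case-history database and also uses an extreme learning machine.

```python
import numpy as np
from sklearn.svm import SVC

# Toy illustration of learning stability regions of the critical span
# graph from labelled case histories; the data below are fabricated.
X = np.array([[3, 70], [5, 75], [10, 80],   # stable
              [8, 55], [12, 60],            # potentially unstable
              [15, 40], [20, 45]])          # unstable
y = np.array([0, 0, 0, 1, 1, 2, 2])         # 0/1/2 = stable/potential/unstable

clf = SVC(kernel="rbf", gamma="scale").fit(X, y)
print(clf.predict([[9, 65]]))               # query a new (span, RMR) design point
```

New field observations can simply be appended to X and y and the classifier refit, mirroring the paper's point that local mine data can be folded into the database to improve predictions.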
Morison, Gordon; Boreham, Philip
2018-01-01
Electromagnetic Interference (EMI) is a technique for capturing Partial Discharge (PD) signals in High-Voltage (HV) power plant apparatus. EMI signals can be non-stationary, which makes their analysis difficult, particularly for pattern recognition applications. This paper elaborates upon a previously developed software condition-monitoring model for improved classification of EMI events based on time-frequency signal decomposition and entropy features. The idea of the proposed method is to map multiple discharge source signals captured by EMI and labelled by experts, including PD, from the time domain to a feature space, which aids in the interpretation of subsequent fault information. Here, instead of using only one permutation entropy measure, a more robust measure, called Dispersion Entropy (DE), is added to the feature vector. Multi-Class Support Vector Machine (MCSVM) methods are utilized for classification of the different discharge sources. Results show an improved classification accuracy compared to previously proposed methods, supporting the development of an expert-knowledge-based intelligent system. Since the method is demonstrated to be successful with real field data, it offers the benefit of possible real-world application for EMI condition monitoring. PMID:29385030
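A minimal implementation of dispersion entropy, the feature added to the vector here, is sketched below following the commonly published definition of DE. The class count, embedding dimension, and test signals are illustrative assumptions.

```python
import numpy as np
from collections import Counter
from scipy.stats import norm

# Minimal dispersion entropy (DE) following the standard definition:
# map samples to c classes via the normal CDF, count length-m
# dispersion patterns, and take normalized Shannon entropy.
def dispersion_entropy(x, c=6, m=2, delay=1):
    x = np.asarray(x, dtype=float)
    y = norm.cdf(x, loc=x.mean(), scale=x.std())           # map to (0, 1)
    z = np.clip(np.round(c * y + 0.5).astype(int), 1, c)   # classes 1..c
    patterns = Counter(tuple(z[i:i + m * delay:delay])
                       for i in range(len(z) - (m - 1) * delay))
    p = np.array(list(patterns.values()), dtype=float)
    p /= p.sum()
    return -np.sum(p * np.log(p)) / np.log(c ** m)         # normalized to [0, 1]

rng = np.random.default_rng(1)
print(dispersion_entropy(rng.standard_normal(2000)))               # near 1 for white noise
print(dispersion_entropy(np.sin(np.linspace(0, 20 * np.pi, 2000))))  # lower for a regular signal
```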
Neville, David C A; Coquard, Virginie; Priestman, David A; te Vruchte, Danielle J M; Sillence, Daniel J; Dwek, Raymond A; Platt, Frances M; Butters, Terry D
2004-08-15
Interest in cellular glycosphingolipid (GSL) function has necessitated the development of a rapid and sensitive method to both analyze and characterize the full complement of structures present in various cells and tissues. An optimized method to characterize oligosaccharides released from glycosphingolipids following ceramide glycanase digestion has been developed. The procedure uses the fluorescent compound anthranilic acid (2-aminobenzoic acid; 2-AA) to label oligosaccharides prior to analysis using normal-phase high-performance liquid chromatography. The labeling procedure is rapid, selective, and easy to perform and is based on the published method of Anumula and Dhume [Glycobiology 8 (1998) 685], originally used to analyze N-linked oligosaccharides. It is less time consuming than a previously published 2-aminobenzamide labeling method [Anal. Biochem. 298 (2001) 207] for analyzing GSL-derived oligosaccharides, as the fluorescent labeling is performed on the enzyme reaction mixture. The purification of 2-AA-labeled products has been improved to ensure recovery of oligosaccharides containing one to four monosaccharide units, which was not previously possible using the Anumula and Dhume post-derivatization purification procedure. This new approach may also be used to analyze both N- and O-linked oligosaccharides.
NASA Astrophysics Data System (ADS)
Sumi, C.
Previously, we developed three displacement vector measurement methods, i.e., the multidimensional cross-spectrum phase gradient method (MCSPGM), the multidimensional autocorrelation method (MAM), and the multidimensional Doppler method (MDM). To increase the accuracy and stability of lateral and elevational displacement measurements, we also developed spatially variant, displacement-component-dependent regularization. In particular, the regularization of only the lateral/elevational displacements is advantageous for the laterally unmodulated case. The demonstrated measurements of displacement vector distributions in experiments using an inhomogeneous shear modulus agar phantom confirm that displacement-component-dependent regularization enables more stable shear modulus reconstruction. In this report, we also review our lateral modulation methods, which use parabolic functions, Hanning windows, and Gaussian functions in the apodization function, and an optimized apodization function that realizes the designed point spread function (PSF). The modulations significantly increase the accuracy of the strain tensor measurement and shear modulus reconstruction (demonstrated using an agar phantom).
NASA Technical Reports Server (NTRS)
Duyar, A.; Guo, T.-H.; Merrill, W.; Musgrave, J.
1992-01-01
In a previous study (Guo, Merrill and Duyar, 1990), a conceptual development of a fault detection and diagnosis system for actuation faults of the space shuttle main engine was reported. This study, a continuation of the previous work, implements the developed fault detection and diagnosis scheme for real-time actuation fault diagnosis of the space shuttle main engine. The scheme will be used as an integral part of an intelligent control system demonstration experiment at NASA Lewis. The diagnosis system utilizes a model-based method with real-time identification and hypothesis testing for actuation, sensor, and performance degradation faults.
High correlations between MRI brain volume measurements based on NeuroQuant® and FreeSurfer.
Ross, David E; Ochs, Alfred L; Tate, David F; Tokac, Umit; Seabaugh, John; Abildskov, Tracy J; Bigler, Erin D
2018-05-30
NeuroQuant ® (NQ) and FreeSurfer (FS) are commonly used computer-automated programs for measuring MRI brain volume. Previously they were reported to have high intermethod reliabilities but often large intermethod effect size differences. We hypothesized that linear transformations could be used to reduce the large effect sizes. This study was an extension of our previously reported study. We performed NQ and FS brain volume measurements on 60 subjects (including normal controls, patients with traumatic brain injury, and patients with Alzheimer's disease). We used two statistical approaches in parallel to develop methods for transforming FS volumes into NQ volumes: traditional linear regression, and Bayesian linear regression. For both methods, we used regression analyses to develop linear transformations of the FS volumes to make them more similar to the NQ volumes. The FS-to-NQ transformations based on traditional linear regression resulted in effect sizes which were small to moderate. The transformations based on Bayesian linear regression resulted in all effect sizes being trivially small. To our knowledge, this is the first report describing a method for transforming FS to NQ data so as to achieve high reliability and low effect size differences. Machine learning methods like Bayesian regression may be more useful than traditional methods. Copyright © 2018 Elsevier B.V. All rights reserved.
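A sketch of the transformation step under stated assumptions: fit a per-region linear map from FS to NQ volumes with both a traditional and a Bayesian linear regression, then apply it to new FS values. The synthetic volumes below stand in for the study's real regional measurements from 60 subjects.

```python
import numpy as np
from sklearn.linear_model import LinearRegression, BayesianRidge

# Illustrative FS-to-NQ transformation: both regressions learn a
# linear map so transformed FS volumes can stand in for NQ volumes.
rng = np.random.default_rng(0)
fs = rng.normal(4.0, 0.5, 60).reshape(-1, 1)            # FreeSurfer volumes (mL), synthetic
nq = 0.9 * fs.ravel() + 0.3 + rng.normal(0, 0.05, 60)   # NeuroQuant analog, synthetic

ols = LinearRegression().fit(fs, nq)    # traditional linear regression
bayes = BayesianRidge().fit(fs, nq)     # Bayesian linear regression

fs_new = np.array([[3.8]])
print(ols.predict(fs_new), bayes.predict(fs_new))  # FS value mapped to the NQ scale
```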
Comparison of methods for measuring travel time at Florida freeways and arterials.
DOT National Transportation Integrated Search
2014-07-01
Travel time is an important performance measure used to assess the traffic operational quality of various types of highway : facilities. Previous research funded by the Florida Department of Transportation (FDOT) on travel time reliability developed,...
MEASUREMENT METHOD FOR VOLATILE METABOLIC BIOMARKERS IN EXHALED BREATH CONDENSATE
EPA is developing biomarker methodology to interpret spot biological measurements and their linkage to previous environmental pollutant exposures for individuals. This work explores the use of a promising biological medium, exhaled breath condensate (EBC), which contains trapped...
A second-order accurate immersed boundary-lattice Boltzmann method for particle-laden flows
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhou, Qiang; Fan, Liang-Shih, E-mail: fan.1@osu.edu
A new immersed boundary-lattice Boltzmann method (IB-LBM) is presented for fully resolved simulations of incompressible viscous flows laden with rigid particles. The immersed boundary method (IBM) recently developed by Breugem (2012) [19] is adopted in the present method, including the retraction technique, the multi-direct forcing method and the direct accounting of the inertia of the fluid contained within the particles. The present IB-LBM is, however, formulated with a further improvement: the implementation of high-order Runge–Kutta schemes in the coupled fluid–particle interaction. The major challenge in implementing high-order Runge–Kutta schemes in the LBM is that flow information such as density and velocity cannot be directly obtained at a fractional time step, since the LBM only provides the flow information at integer time steps. This challenge is overcome in the present IB-LBM by extrapolating the flow field around particles from the known flow field at the previous integer time step. The newly calculated fluid–particle interactions from the previous fractional time steps of the current integer time step are also accounted for in the extrapolation. The IB-LBM with high-order Runge–Kutta schemes developed in this study is validated by several benchmark applications. It is demonstrated, for the first time, that the IB-LBM has the capacity to resolve the translational and rotational motion of particles with second-order accuracy. The optimal retraction distances for spheres and tubes that help the method achieve second-order accuracy are found to be around 0.30 and −0.47 times the lattice spacing, respectively. Simulations of the Stokes flow through a simple cubic lattice of rotating spheres indicate that the lift force produced by the Magnus effect can be very significant relative to the magnitude of the drag force at practical rotating speeds. This finding may lead to more comprehensive studies of the effect of particle rotation on fluid–solid drag laws. It is also demonstrated that, when the third-order or the fourth-order Runge–Kutta scheme is used, the numerical stability of the present IB-LBM is better than that of all methods in the literature, including previous IB-LBMs and methods combining the IBM with traditional incompressible Navier–Stokes solvers. Highlights: • The IBM is embedded in the LBM using Runge–Kutta time schemes. • The effectiveness of the present IB-LBM is validated by benchmark applications. • For the first time, the IB-LBM achieves second-order accuracy. • The numerical stability of the present IB-LBM is better than that of previous methods.
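The extrapolation idea can be stated in a couple of lines; the linear form below is an illustrative assumption, not the paper's exact construction (which also folds in newly computed fluid–particle interactions from earlier fractional steps).

```python
import numpy as np

# Sketch of the fractional-step extrapolation described above: an RK
# stage at t + c*dt needs flow data the LBM only provides at integer
# steps, so it is estimated from the last two integer-step fields.
def field_at_fraction(u_prev, u_curr, c):
    """Estimate the field at t + c*dt from fields at t - dt and t."""
    return u_curr + c * (u_curr - u_prev)

u_prev = np.array([1.0, 2.0])   # field at t - dt
u_curr = np.array([1.1, 2.2])   # field at t
print(field_at_fraction(u_prev, u_curr, 0.5))  # estimate at t + dt/2
```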
Previous experience in manned space flight: A survey of human factors lessons learned
NASA Technical Reports Server (NTRS)
Chandlee, George O.; Woolford, Barbara
1993-01-01
Previous experience in manned space flight programs can be used to compile a data base of human factors lessons learned for the purpose of developing aids in the future design of inhabited spacecraft. The objectives are to gather information available from relevant sources, to develop a taxonomy of human factors data, and to produce a data base that can be used in the future for those people involved in the design of manned spacecraft operations. A study is currently underway at the Johnson Space Center with the objective of compiling, classifying, and summarizing relevant human factors data bearing on the lessons learned from previous manned space flights. The research reported defines sources of data, methods for collection, and proposes a classification for human factors data that may be a model for other human factors disciplines.
NASA Technical Reports Server (NTRS)
Halford, G. R.
1983-01-01
The presentation focuses primarily on progress made at NASA Lewis Research Center in understanding the phenomenological processes of high-temperature fatigue of metals for the purpose of calculating the lives of turbine engine hot-section components. Improved understanding resulted in the development of accurate and physically correct life prediction methods, such as Strainrange Partitioning for calculating creep-fatigue interactions and the Double Linear Damage Rule for predicting potentially severe interactions between high- and low-cycle fatigue. Examples of other life prediction methods are also discussed. Previously announced in STAR as A83-12159.
Duque-Ramos, Astrid; Boeker, Martin; Jansen, Ludger; Schulz, Stefan; Iniesta, Miguela; Fernández-Breis, Jesualdo Tomás
2014-01-01
Objective: To (1) evaluate the GoodOD guideline for ontology development by applying the OQuaRE evaluation method and metrics to the ontology artefacts that were produced by students in a randomized controlled trial, and (2) informally compare the OQuaRE evaluation method with gold standard and competency questions based evaluation methods, respectively. Background: In recent decades, many methods for ontology construction and ontology evaluation have been proposed. However, none of them has become a standard and there is no empirical evidence of comparative evaluation of such methods. This paper brings together GoodOD and OQuaRE. GoodOD is a guideline for developing robust ontologies. It was previously evaluated in a randomized controlled trial employing metrics based on gold standard ontologies and competency questions as outcome parameters. OQuaRE is a method for ontology quality evaluation which adapts the SQuaRE standard for software product quality to ontologies and has been successfully used for evaluating the quality of ontologies. Methods: In this paper, we evaluate the effect of training in ontology construction based on the GoodOD guideline within the OQuaRE quality evaluation framework and compare the results with those obtained in the previous studies based on the same data. Results: Our results show a significant effect of the GoodOD training on the quality of the developed ontologies by topic: (a) a highly significant effect was detected in three topics from the analysis of the ontologies of untrained and trained students; (b) both positive and negative training effects with respect to the gold standard were found for five topics. Conclusion: The GoodOD guideline had a significant effect on the quality of the ontologies developed. Our results show that GoodOD ontologies can be effectively evaluated using OQuaRE and that OQuaRE is able to provide additional useful information about the quality of the GoodOD ontologies. PMID:25148262
Rajaraman, Prathish K; Manteuffel, T A; Belohlavek, M; Heys, Jeffrey J
2017-01-01
A new approach has been developed for combining and enhancing the results from an existing computational fluid dynamics model with experimental data using the weighted least-squares finite element method (WLSFEM). Development of the approach was motivated by the existence of both limited experimental blood velocity in the left ventricle and inexact numerical models of the same flow. Limitations of the experimental data include measurement noise and having data only along a two-dimensional plane. Most numerical modeling approaches do not provide the flexibility to assimilate noisy experimental data. We previously developed an approach that could assimilate experimental data into the process of numerically solving the Navier-Stokes equations, but the approach was limited because it required the use of specific finite element methods for solving all model equations and did not support alternative numerical approximation methods. The new approach presented here allows virtually any numerical method to be used for approximately solving the Navier-Stokes equations, and then the WLSFEM is used to combine the experimental data with the numerical solution of the model equations in a final step. The approach dynamically adjusts the influence of the experimental data on the numerical solution so that more accurate data are more closely matched by the final solution and less accurate data are not closely matched. The new approach is demonstrated on different test problems and provides significantly reduced computational costs compared with many previous methods for data assimilation. Copyright © 2016 John Wiley & Sons, Ltd.
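A toy version of the final weighted least-squares step, using a one-dimensional Poisson problem as a stand-in for the Navier-Stokes model: model-equation rows and weighted data rows are stacked into one least-squares system, so the weight controls how strongly the solution must match the measurements. All names and values are illustrative assumptions.

```python
import numpy as np

# Weighted least-squares blending of model equations and noisy point
# data, on a 1-D Poisson surrogate: -u'' = 1, u(0) = u(1) = 0.
n = 21
x = np.linspace(0, 1, n)
h = x[1] - x[0]

# Model-equation rows (finite differences) as least-squares residuals.
A = np.zeros((n, n))
b = np.ones(n)
for i in range(1, n - 1):
    A[i, i - 1:i + 2] = np.array([-1, 2, -1]) / h ** 2
A[0, 0] = A[-1, -1] = 1.0 / h ** 2   # boundary rows, scaled like the interior
b[0] = b[-1] = 0.0

# Noisy measurements at a few nodes, appended with a weight w that
# controls how strongly the solution must match them.
idx = np.array([5, 10, 15])
u_true = 0.5 * x * (1 - x)
d = u_true[idx] + 0.01 * np.random.default_rng(2).standard_normal(3)
w = 50.0
D = np.zeros((3, n))
D[np.arange(3), idx] = w

u = np.linalg.lstsq(np.vstack([A, D]), np.concatenate([b, w * d]), rcond=None)[0]
print(np.abs(u - u_true).max())      # small: model and data are reconciled
```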
Mata-Cantero, Lydia; Lafuente, Maria J; Sanz, Laura; Rodriguez, Manuel S
2014-03-21
The establishment of methods for the in vitro continuous culture of Plasmodium falciparum is essential for gaining knowledge of its biology and for the development of new treatments. Previously, several techniques have been used to synchronize, enrich and concentrate P. falciparum, although obtaining cultures with high parasitaemia continues to be a challenging process. Current methods produce high parasitaemia levels of synchronized P. falciparum cultures by frequent changes of culture medium or by reducing the haematocrit. However, these methods are time consuming and sometimes lead to the loss of synchrony. A procedure is described that combines Percoll and sorbitol treatments, the use of magnetic columns, and the optimization of in vitro culture conditions to reach high parasitaemia levels in synchronized Plasmodium falciparum cultures. A new procedure has been established using P. falciparum 3D7, combining previously reported methodologies to achieve in vitro parasite cultures that reach parasitaemia of up to 40% at any intra-erythrocytic stage. High parasitaemia levels are obtained only one day after magnetic column purification without compromising parasite viability or synchrony. The described procedure yields large-scale synchronized parasite cultures at high parasitaemia with less manipulation than other previously described methods.
Finite Element Analysis of Poroelastic Composites Undergoing Thermal and Gas Diffusion
NASA Technical Reports Server (NTRS)
Salamon, N. J. (Principal Investigator); Sullivan, Roy M.; Lee, Sunpyo
1995-01-01
A theory for time-dependent thermal and gas diffusion in mechanically time-rate-independent anisotropic poroelastic composites has been developed. This theory advances previous work by the latter two authors by providing for critical transverse shear through a three-dimensional axisymmetric formulation and using it in a new hypothesis for determining the Biot fluid pressure-solid stress coupling factor. The derived governing equations couple material deformation with temperature and internal pore pressure, and couple gas diffusion and heat transfer more strongly than the previous theory. Hence the theory accounts for the interactions between conductive heat transfer in the porous body and convective heat carried by the mass flux through the pores. The Bubnov-Galerkin finite element method is applied to the governing equations to transform them into a semidiscrete finite element system. A numerical procedure is developed to solve the coupled equations in the space and time domains. The method is used to simulate two high-temperature tests involving thermal-chemical decomposition of carbon-phenolic composites. In comparison with measured data, the results are accurate. Moreover, unlike previous work, they are consistent with two measurements in a restrained thermal growth test for a single set of poroelastic parameters.
Differential Privacy Preserving in Big Data Analytics for Connected Health.
Lin, Chi; Song, Zihao; Song, Houbing; Zhou, Yanhong; Wang, Yi; Wu, Guowei
2016-04-01
In Body Area Networks (BANs), the big data collected by wearable sensors usually contain sensitive information, which must be appropriately protected. Previous methods have neglected the privacy protection issue, leading to privacy exposure. In this paper, a differential privacy protection scheme for big data in body sensor networks is developed. Compared with previous methods, this scheme provides privacy protection with higher availability and reliability. We introduce the concept of dynamic noise thresholds, which makes our scheme more suitable for processing big data. Experimental results demonstrate that, even when the attacker has full background knowledge, the proposed scheme can still provide enough interference to big sensitive data so as to preserve privacy.
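A minimal Laplace-mechanism sketch of the underlying differential privacy idea is shown below. The epsilon choice standing in for a "dynamic noise threshold" is an illustrative assumption, not the paper's scheme.

```python
import numpy as np

# Minimal Laplace mechanism for a private mean over sensor readings;
# a sketch of differential privacy, not the paper's algorithm.
def laplace_private_mean(values, epsilon, lo, hi):
    values = np.clip(values, lo, hi)        # bound each record's influence
    sensitivity = (hi - lo) / len(values)   # sensitivity of the mean query
    noise = np.random.default_rng().laplace(0.0, sensitivity / epsilon)
    return values.mean() + noise

heart_rate = np.array([72, 75, 80, 66, 90, 85], dtype=float)
# A "dynamic threshold" might spend less privacy budget (larger epsilon)
# on large batches, where each record is already better hidden (assumption).
eps = 0.5 if len(heart_rate) < 100 else 1.0
print(laplace_private_mean(heart_rate, eps, lo=40, hi=180))
```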
Fukunaga, Kenji; Ichitani, Katsuyuki; Taura, Satoru; Sato, Muneharu; Kawase, Makoto
2005-02-01
We determined the sequence of ribosomal DNA (rDNA) intergenic spacer (IGS) of foxtail millet isolated in our previous study, and identified subrepeats in the polymorphic region. We also developed a PCR-based method for identifying rDNA types based on sequence information and assessed 153 accessions of foxtail millet. Results were congruent with our previous works. This study provides new findings regarding the geographical distribution of rDNA variants. This new method facilitates analyses of numerous foxtail millet accessions. It is helpful for typing of foxtail millet germplasms and elucidating the evolution of this millet.
Tugwell, Peter; Pottie, Kevin; Welch, Vivian; Ueffing, Erin; Chambers, Andrea; Feightner, John
2011-01-01
Background: This article describes the evidence review and guideline development method developed for the Clinical Preventive Guidelines for Immigrants and Refugees in Canada by the Canadian Collaboration for Immigrant and Refugee Health Guideline Committee. Methods: The Appraisal of Guidelines for Research and Evaluation (AGREE) best-practice framework was combined with the recently developed Grading of Recommendations Assessment, Development and Evaluation (GRADE) approach to produce evidence-based clinical guidelines for immigrants and refugees in Canada. Results: A systematic approach was designed to produce the evidence reviews and apply the GRADE approach, including building on evidence from previous systematic reviews, searching for and comparing evidence between general and specific immigrant populations, and applying the GRADE criteria for making recommendations. This method was used for priority health conditions that had been selected by practitioners caring for immigrants and refugees in Canada. Interpretation: This article outlines the 14-step method that was defined to standardize the guideline development process for each priority health condition. PMID:20573711
Combining Review Text Content and Reviewer-Item Rating Matrix to Predict Review Rating
Wang, Bingkun; Huang, Yongfeng; Li, Xing
2016-01-01
E-commerce is developing rapidly, and learning to take good advantage of the myriad reviews from online customers has become crucial to success, which calls for increasingly accurate sentiment classification of these reviews. Finer-grained review rating prediction is therefore preferred over rough binary sentiment classification. There are two main types of methods for review rating prediction. The first includes methods based on review text content, which focus almost exclusively on textual content and seldom relate to the reviewers and items remarked upon in other relevant reviews. The second contains methods based on collaborative filtering, which extract information from previous records in the reviewer-item rating matrix but ignore review textual content. Here we propose a framework for review rating prediction that effectively combines the two, and we further propose three specific methods under this framework. Experiments on two movie review datasets demonstrate that our review rating prediction framework performs better than previous methods. PMID:26880879
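A toy sketch of the combination idea, blending a text-based score with a baseline collaborative-filtering estimate from the reviewer-item matrix; the lexicon, blend weight, and toy matrix are illustrative assumptions, not the paper's three methods.

```python
import numpy as np

# Blend a text-based rating prediction with a collaborative-filtering
# prediction; all components are illustrative stand-ins.
def text_prediction(review_text):
    # Hypothetical sentiment lexicon, scaled onto a 1-5 rating.
    lexicon = {"great": 2, "good": 1, "bad": -1, "awful": -2}
    score = sum(lexicon.get(w, 0) for w in review_text.lower().split())
    return float(np.clip(3 + score, 1, 5))

def cf_prediction(ratings, user, item):
    # Baseline CF: global mean plus user and item biases from known entries.
    known = ~np.isnan(ratings)
    mu = ratings[known].mean()
    bu = np.nanmean(ratings[user]) - mu
    bi = np.nanmean(ratings[:, item]) - mu
    return float(np.clip(mu + bu + bi, 1, 5))

R = np.array([[5, 4, np.nan], [3, np.nan, 2], [np.nan, 5, 4]])
alpha = 0.6   # weight on the text signal (assumption)
pred = alpha * text_prediction("great movie, good cast") + \
       (1 - alpha) * cf_prediction(R, user=0, item=2)
print(round(pred, 2))   # combined rating prediction
```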
Fusing Symbolic and Numerical Diagnostic Computations
NASA Technical Reports Server (NTRS)
James, Mark
2007-01-01
X-2000 Anomaly Detection Language denotes a developmental computing language, and the software that establishes and utilizes the language, for fusing two diagnostic computer programs, one implementing a numerical analysis method and the other a symbolic analysis method, into a unified event-based decision analysis software system for real-time detection of events (e.g., failures) in a spacecraft, aircraft, or other complex engineering system. The numerical analysis method is performed by beacon-based exception analysis for multi-missions (BEAM), which has been discussed in several previous NASA Tech Briefs articles. The symbolic analysis method is, more specifically, an artificial-intelligence method of the knowledge-based, inference-engine type, and its implementation is exemplified by the Spacecraft Health Inference Engine (SHINE) software. The goal in developing the capability to fuse numerical and symbolic diagnostic components is to increase the depth of analysis beyond that previously attainable, thereby increasing the degree of confidence in the computed results. In practical terms, the sought improvement is to enable detection of all or most events, with no or few false alarms.
Shintani, H
1985-05-31
Studies were made of the analytical conditions required for indirect photometric ion chromatography using ultraviolet photometric detection (UV method) for the determination of serum cations following a previously developed serum pre-treatment. The sensitivities of the conductivity detection (CD) and UV methods and the amounts of serum cations determined by both methods were compared. Attempts to improve the sensitivity of the conventional UV method are reported. It was found that the mobile phase previously reported by Small and Miller showed no quantitative response when more than 4 mM copper(II) sulphate pentahydrate was used. As a result, there was no significant difference in the amounts of serum cations shown by the CD and UV methods. However, by adding 0.5-5 mM cobalt(II) sulphate heptahydrate, nickel(II) sulphate hexahydrate, zinc(II) sulphate heptahydrate or cobalt(II) diammonium sulphate hexahydrate to 0.5-1.5 mM copper(II) sulphate pentahydrate, higher sensitivity and a quantitative response were attained.
GMDD: a database of GMO detection methods.
Dong, Wei; Yang, Litao; Shen, Kailin; Kim, Banghyun; Kleter, Gijs A; Marvin, Hans J P; Guo, Rong; Liang, Wanqi; Zhang, Dabing
2008-06-04
Since more than one hundred genetically modified organism (GMO) events have been developed and approved for commercialization globally, GMO analysis methods are essential for the enforcement of GMO labelling regulations. Protein and nucleic acid-based detection techniques have been developed and utilized for GMO identification and quantification. However, information supporting the harmonization and standardization of GMO analysis methods at the global level is needed. The GMO Detection Method Database (GMDD) collects almost all previously developed and reported GMO detection methods, grouped by strategy (screen-, gene-, construct-, and event-specific), and provides a user-friendly search service for detection methods by GMO event name, exogenous gene, protein information, etc. In this database, users can obtain the sequences of exogenous integration, which will facilitate PCR primer and probe design. Information on endogenous genes, certified reference materials, reference molecules, and the validation status of developed methods is also included. Furthermore, registered users can submit new detection methods and sequences to this database, and newly submitted information is released soon after being checked. GMDD contains comprehensive information on GMO detection methods. The database will make GMO analysis much easier.
NASA Astrophysics Data System (ADS)
Burger, Joanna; Gochfeld, Michael; Bunn, Amoret; Downs, Janelle; Jeitner, Christian; Pittfield, Taryn; Salisbury, Jennifer; Kosson, David
2017-03-01
An assessment of the potential risks to ecological resources from remediation activities or other perturbations should involve a quantitative evaluation of resources on the remediation site and in the surrounding environment. We developed a risk methodology to rapidly evaluate potential impact on ecological resources for the U.S. Department of Energy's Hanford Site in southcentral Washington State. We describe the application of the risk evaluation for two case studies to illustrate its applicability. The ecological assessment involves examining previous sources of information for the site, defining different resource levels from 0 to 5. We also developed a risk rating scale from non-discernable to very high. Field assessment is the critical step to determine resource levels or to determine if current conditions are the same as previously evaluated. We provide a rapid assessment method for current ecological conditions that can be compared to previous site-specific data, or that can be used to assess resource value on other sites where ecological information is not generally available. The method is applicable to other Department of Energy sites, where its development may involve a range of state regulators, resource trustees, Tribes and other stakeholders. Achieving consistency across Department of Energy sites for valuation of ecological resources on remediation sites will assure Congress and the public that funds and personnel are being deployed appropriately.
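Purely as an illustration of how the two scales might be combined, the sketch below maps a 0-5 resource level and a 0-5 disturbance severity onto the non-discernable-to-very-high risk scale. The lookup rule is an assumption for demonstration; the method itself derives ratings from field assessment and site-specific data.

```python
# Hypothetical mapping from resource level and disturbance severity to
# the paper's risk scale; the averaging rule is an illustrative assumption.
RISK_SCALE = ["non-discernable", "very low", "low",
              "moderate", "high", "very high"]

def risk_rating(resource_level: int, disturbance_severity: int) -> str:
    """resource_level 0-5, disturbance_severity 0-5 -> risk category."""
    score = min(5, round(0.5 * (resource_level + disturbance_severity)))
    return RISK_SCALE[score]

print(risk_rating(resource_level=4, disturbance_severity=5))  # "high"
```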
Snapp-Childs, Winona; Fath, Aaron J; Watson, Carol A; Flatters, Ian; Mon-Williams, Mark; Bingham, Geoffrey P
2015-10-01
Many children have difficulty producing movements well enough to improve in perceptuo-motor learning. We have developed a training method that supports active movement generation to allow improvement in a 3D tracing task requiring good compliance control. We previously tested 7-8 year old children, who exhibited poor performance and performance differences before training. After training, performance was significantly improved and performance differences were eliminated. According to the Dynamic Systems Theory of development, appropriate support can enable younger children to acquire the ability to perform like older children. In the present study, we compared 7-8 and 10-12 year old school children and predicted that younger children would show reduced performance that was nonetheless amenable to training. Indeed, the pre-training performance of the 7-8 year olds was worse than that of the 10-12 year olds, but post-training performance was equally good for both groups. This was similar to previous results found using this training method with children with developmental coordination disorder (DCD) and age-matched typically developing children. We also found in a previous study of 7-8 year old school children that training in the 3D tracing task transferred to a 2D drawing task. We now found similar transfer for the 10-12 year olds. Copyright © 2015 Elsevier B.V. All rights reserved.
Assessing Stream Channel Stability at Bridges in Physiographic Regions
DOT National Transportation Integrated Search
2006-07-01
The objective of this study was to expand and improve a rapid channel stability assessment method developed previously by Johnson et al. to include additional factors, such as major physiographic units across the United States, a greater range of ban...
Optical Metamaterials: Design, Characterization and Applications
ERIC Educational Resources Information Center
Chaturvedi, Pratik
2009-01-01
Artificially engineered metamaterials have emerged with properties and functionalities previously unattainable in natural materials. The scientific breakthroughs made in this new class of electromagnetic materials are closely linked with progress in developing physics-driven design, novel fabrication and characterization methods. The intricate…
Yamashita, Kunihiko; Shinoda, Shinsuke; Hagiwara, Saori; Itagaki, Hiroshi
2015-04-01
To date, there has been no well-established local lymph node assay (LLNA) that includes an elicitation phase. We therefore developed, and previously reported, a modified local lymph node assay with an elicitation phase (LLNA:DAE) to discriminate true skin sensitizers from chemicals that give borderline positive results. To establish the LLNA:DAE method as a useful stand-alone testing method, we investigated the complete procedure using hexyl cinnamic aldehyde (HCA), isoeugenol, and 2,4-dinitrochlorobenzene (DNCB) as test compounds. We defined the LLNA:DAE procedure as follows: in the dose-finding test, four concentrations of chemical are applied to the dorsum of the right ear on days 1, 2, and 3, and to the dorsum of both ears on day 10; ear thickness and skin irritation score are measured on days 1, 3, 5, 10, and 12; and local lymph nodes are excised and weighed on day 12. The test dose for the primary LLNA:DAE study was selected as the dose that gave the highest left ear lymph node weight in the dose-finding study, or the lowest dose that produced a left ear lymph node of over 4 mg. This procedure was validated using nine different chemicals. Furthermore, a qualitative relationship was observed between the degree of elicitation response in the left ear lymph node and the skin sensitizing potency of the 32 chemicals tested in this study and the previous study. These results indicate that the LLNA:DAE method is the first LLNA method able to evaluate skin sensitizing potential and potency through the elicitation response.
ERIC Educational Resources Information Center
Conroy, Susan; Pariante, Carmine M.; Marks, Maureen N.; Davies, Helen A.; Farrelly, Simone; Schacht, Robin; Moran, Paul
2012-01-01
Objective: No previous longitudinal study has examined the impact of comorbid maternal personality disorder (PD) and depression on child development. We set out to examine whether maternal PD and depression assessed at 2 months post partum would be independently associated with adverse developmental outcomes at 18 months of age. Method: Women were…
Shortcuts to adiabaticity using flow fields
NASA Astrophysics Data System (ADS)
Patra, Ayoti; Jarzynski, Christopher
2017-12-01
A shortcut to adiabaticity is a recipe for generating adiabatic evolution at an arbitrary pace. Shortcuts have been developed for quantum, classical and (most recently) stochastic dynamics. A shortcut might involve a counterdiabatic (CD) Hamiltonian that causes a system to follow the adiabatic evolution at all times, or it might utilize a fast-forward (FF) potential, which returns the system to the adiabatic path at the end of the process. We develop a general framework for constructing shortcuts to adiabaticity from flow fields that describe the desired adiabatic evolution. Our approach encompasses quantum, classical and stochastic dynamics, and provides surprisingly compact expressions for both CD Hamiltonians and FF potentials. We illustrate our method with numerical simulations of a model system, and we compare our shortcuts with previously obtained results. We also consider the semiclassical connections between our quantum and classical shortcuts. Our method, like the FF approach developed by previous authors, is susceptible to singularities when applied to excited states of quantum systems; we propose a simple, intuitive criterion for determining whether these singularities will arise, for a given excited state.
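For context, the standard counterdiabatic construction (which flow-field approaches of this kind reproduce in the quantum case) adds to the bare Hamiltonian a term built from its instantaneous eigenstates $|n(t)\rangle$:

$$\hat H_{\mathrm{CD}}(t) = \hat H_0(t) + i\hbar \sum_n \Big(|\partial_t n\rangle\langle n| - \langle n|\partial_t n\rangle\,|n\rangle\langle n|\Big)$$

This is the textbook form rather than the paper's flow-field expressions, which the abstract reports to be surprisingly compact.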
Barkat, K; Ahmad, M; Minhas, M U; Malik, M Z; Sohail, M
2014-07-01
The objective of this study was to develop an accurate and reproducible HPLC method for the determination of piracetam in human plasma and to evaluate the pharmacokinetic parameters of 800 mg piracetam. A simple, rapid, accurate, precise and sensitive high-pressure liquid chromatography method was developed and subsequently validated for the determination of piracetam. This study presents the results of a randomized, single-dose, single-period study in 18 healthy male volunteers to assess the pharmacokinetic parameters of 800 mg piracetam tablets. Various pharmacokinetic parameters were determined from plasma and found to be in good agreement with previously reported values. The data were analyzed using Kinetica® version 4.4 according to a non-compartmental model of pharmacokinetic analysis; compared with previous studies, no significant differences were found for the tested product. The major pharmacokinetic parameters for piracetam were as follows: t1/2 was (4.40 ± 0.179) h; Tmax was (2.33 ± 0.105) h; Cmax was (14.53 ± 0.282) µg/mL; AUC(0-∞) was (59.19 ± 4.402) µg·h/mL; AUMC(0-∞) was (367.23 ± 38.96) µg·h²/mL; Ke was (0.16 ± 0.006) h⁻¹; MRT was (5.80 ± 0.227) h; Vd was (96.36 ± 8.917) L. A rapid, accurate and precise high-pressure liquid chromatography method was developed and validated before the study. It is concluded that this method is very useful for the analysis of pharmacokinetic parameters in human plasma, supports assessment of the safety and efficacy of piracetam, and can be used effectively in medical practice. © Georg Thieme Verlag KG Stuttgart · New York.
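The non-compartmental quantities reported above follow from standard definitions. The sketch below is a minimal illustration, not the Kinetica® implementation; the concentration-time profile and dose are hypothetical.

```python
import numpy as np

def nca(t, c, dose_mg):
    """Non-compartmental PK analysis of a plasma concentration-time profile.

    t: sampling times (h); c: concentrations (ug/mL); dose_mg: administered dose.
    Returns the usual NCA parameters."""
    cmax = c.max()
    tmax = t[np.argmax(c)]
    # Terminal elimination rate constant Ke from a log-linear fit
    # to the last declining points (here: the last 3 samples).
    slope, _ = np.polyfit(t[-3:], np.log(c[-3:]), 1)
    ke = -slope                     # 1/h
    t_half = np.log(2) / ke         # h
    # AUC and AUMC by the trapezoidal rule, extrapolated to infinity.
    auc = np.trapz(c, t) + c[-1] / ke
    aumc = np.trapz(c * t, t) + c[-1] * t[-1] / ke + c[-1] / ke**2
    mrt = aumc / auc                # h
    vd = dose_mg / (ke * auc)       # apparent volume, L (ug/mL == mg/L)
    return {"Cmax": cmax, "Tmax": tmax, "Ke": ke, "t1/2": t_half,
            "AUC0-inf": auc, "AUMC0-inf": aumc, "MRT": mrt, "Vd": vd}

# Hypothetical profile after an 800 mg oral dose.
t = np.array([0.5, 1.0, 2.0, 3.0, 4.0, 6.0, 8.0, 12.0])
c = np.array([6.2, 10.1, 14.0, 13.2, 11.0, 7.9, 5.6, 2.9])
print(nca(t, c, 800.0))
```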
Development of an ELA-DRA gene typing method based on pyrosequencing technology.
Díaz, S; Echeverría, M G; It, V; Posik, D M; Rogberg-Muñoz, A; Pena, N L; Peral-García, P; Vega-Pla, J L; Giovambattista, G
2008-11-01
The polymorphism of the equine lymphocyte antigen (ELA) class II DRA gene had previously been detected by polymerase chain reaction-single-strand conformational polymorphism (PCR-SSCP) and reference strand-mediated conformation analysis. These methodologies allowed the identification of 11 ELA-DRA exon 2 sequences, three of which are widely distributed among domestic horse breeds. Herein, we describe the development of a pyrosequencing-based method applicable to ELA-DRA typing, by screening samples from eight different horse breeds previously typed by PCR-SSCP. This sequence-based method would be useful in high-throughput genotyping of major histocompatibility complex genes in horses and other animal species, making this system attractive as a rapid screening method for animal genotyping of immune-related genes.
Irons, Trevor P.; Hobza, Christopher M.; Steele, Gregory V.; Abraham, Jared D.; Cannia, James C.; Woodward, Duane D.
2012-01-01
Surface nuclear magnetic resonance, a noninvasive geophysical method, measures a signal directly related to the amount of water in the subsurface. This allows for low-cost quantitative estimates of hydraulic parameters. In practice, however, additional factors influence the signal, complicating interpretation. The U.S. Geological Survey, in cooperation with the Central Platte Natural Resources District, evaluated whether hydraulic parameters derived from surface nuclear magnetic resonance data could provide valuable input into groundwater models used for evaluating water-management practices. Two calibration sites in Dawson County, Nebraska, were chosen based on previous detailed hydrogeologic and geophysical investigations. At both sites, surface nuclear magnetic resonance data were collected, and derived parameters were compared with results from four constant-discharge aquifer tests previously conducted at those same sites. Additionally, borehole electromagnetic-induction flowmeter data were analyzed as a less-expensive surrogate for traditional aquifer tests. Building on recent work, a novel surface nuclear magnetic resonance modeling and inversion method was developed that incorporates electrical conductivity and effects due to magnetic-field inhomogeneities, both of which can have a substantial impact on the data. After comparing surface nuclear magnetic resonance inversions at the two calibration sites, the nuclear magnetic-resonance-derived parameters were compared with previously performed aquifer tests in the Central Platte Natural Resources District. This comparison served as a blind test for the developed method. The nuclear magnetic-resonance-derived aquifer parameters were in agreement with results of aquifer tests where the environmental noise allowed data collection and the aquifer test zones overlapped with the surface nuclear magnetic resonance testing. In some cases, the previously performed aquifer tests were not designed fully to characterize the aquifer, and the surface nuclear magnetic resonance was able to provide missing data. In favorable locations, surface nuclear magnetic resonance is able to provide valuable noninvasive information about aquifer parameters and should be a useful tool for groundwater managers in Nebraska.
Progressive retry for software error recovery in distributed systems
NASA Technical Reports Server (NTRS)
Wang, Yi-Min; Huang, Yennun; Fuchs, W. K.
1993-01-01
In this paper, we describe a method of execution retry for bypassing software errors based on checkpointing, rollback, message reordering and replaying. We demonstrate how rollback techniques, previously developed for transient hardware failure recovery, can also be used to recover from software faults by exploiting message reordering to bypass software errors. Our approach intentionally increases the degree of nondeterminism and the scope of rollback when a previous retry fails. Examples from our experience with telecommunications software systems illustrate the benefits of the scheme.
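A toy sketch of the escalation idea follows: each retry level rolls back to a checkpoint and injects more nondeterminism into message replay. The checkpoint and message model here is a minimal single-process stand-in, not the authors' telecommunications implementation; true level-3 behaviour (widening the rollback scope across processes) is only indicated in a comment.

```python
import random

def progressive_retry(handler, checkpoint, message_log, max_level=3):
    """Escalating software-error recovery by rollback and message replay.

    handler(state, msg) mutates state and may raise on a software error;
    checkpoint is the saved state dict; message_log is the replay buffer."""
    for level in range(1, max_level + 1):
        state = dict(checkpoint)          # roll back to the checkpoint
        msgs = list(message_log)
        if level >= 2:
            random.shuffle(msgs)          # level 2: reorder replayed messages
        # Level 3 would additionally widen the rollback scope to other
        # processes (omitted in this single-process toy).
        try:
            for msg in msgs:
                handler(state, msg)
            return state                  # the retry bypassed the error
        except Exception:
            continue                      # escalate to the next level
    raise RuntimeError("software error persists after progressive retry")
```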
Duque-Ramos, Astrid; Boeker, Martin; Jansen, Ludger; Schulz, Stefan; Iniesta, Miguela; Fernández-Breis, Jesualdo Tomás
2014-01-01
To (1) evaluate the GoodOD guideline for ontology development by applying the OQuaRE evaluation method and metrics to the ontology artefacts produced by students in a randomized controlled trial, and (2) informally compare the OQuaRE evaluation method with gold-standard and competency-question-based evaluation methods, respectively. In recent decades many methods for ontology construction and ontology evaluation have been proposed, but none of them has become a standard and there is no empirical evidence from comparative evaluation of such methods. This paper brings together GoodOD and OQuaRE. GoodOD is a guideline for developing robust ontologies. It was previously evaluated in a randomized controlled trial employing metrics based on gold standard ontologies and competency questions as outcome parameters. OQuaRE is a method for ontology quality evaluation which adapts the SQuaRE standard for software product quality to ontologies and has been successfully used for evaluating the quality of ontologies. In this paper, we evaluate the effect of training in ontology construction based on the GoodOD guideline within the OQuaRE quality evaluation framework and compare the results with those obtained in the previous studies based on the same data. Our results show a significant effect of the GoodOD training on the quality of the developed ontologies by topic: (a) a highly significant effect was detected in three topics from the analysis of the ontologies of untrained and trained students; (b) both positive and negative training effects with respect to the gold standard were found for five topics. The GoodOD guideline had a significant effect on the quality of the ontologies developed. Our results show that GoodOD ontologies can be effectively evaluated using OQuaRE and that OQuaRE is able to provide additional useful information about the quality of the GoodOD ontologies.
Computer-aided detection (CAD) of breast cancer on full field digital and screening film mammograms
NASA Astrophysics Data System (ADS)
Sun, Xuejun; Qian, Wei; Song, Xiaoshan; Qian, Yuyan; Song, Dansheng; Clark, Robert A.
2003-05-01
Full-field digital mammography (FFDM) is a new breast imaging modality with the potential to detect more breast cancers, or to detect them at smaller sizes and earlier stages, compared with screening film mammography (SFM). However, its performance needs verification, and it poses new problems for the development of CAD methods for breast cancer detection and diagnosis. In this study, the performance of CAD systems was evaluated on FFDM and SFM, respectively. First, an adaptive CAD system employing a series of advanced modules was developed for FFDM. Second, a standardization approach was developed to make the CAD system independent of the characteristics of the digitizer or imaging modality. The CAD systems developed previously for SFM and developed in this study for FFDM were evaluated on FFDM and SFM images without and with standardization, respectively, to examine the performance improvement of the new system. Computerized free-response receiver operating characteristic (FROC) analysis was adopted as the evaluation method. Compared with the previous system, the CAD system developed in this study demonstrated significant performance improvements. However, the comparison also showed that the performance of the final CAD system is not significantly different on FFDM and on SFM after standardization. Further study is needed to assess CAD system performance on the FFDM and SFM modalities.
Fast sweeping methods for hyperbolic systems of conservation laws at steady state II
NASA Astrophysics Data System (ADS)
Engquist, Björn; Froese, Brittany D.; Tsai, Yen-Hsi Richard
2015-04-01
The idea of using fast sweeping methods for solving stationary systems of conservation laws has previously been proposed for efficiently computing solutions with sharp shocks. We further develop these methods to allow for a more challenging class of problems including problems with sonic points, shocks originating in the interior of the domain, rarefaction waves, and two-dimensional systems. We show that fast sweeping methods can produce higher-order accuracy. Computational results validate the claims of accuracy, sharp shock curves, and optimal computational efficiency.
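Fast sweeping itself is easiest to see on the eikonal equation |∇u| = f. The sketch below is a generic illustration of Gauss-Seidel sweeps with alternating orderings, not the authors' conservation-law scheme; the grid, source placement, and sweep count are assumptions for illustration.

```python
import numpy as np

def fast_sweep_eikonal(f, h, n_sweeps=8):
    """Solve |grad u| = f on a square grid by the fast sweeping method.

    f: (n, n) array of slowness values; h: grid spacing.
    The source is fixed at the centre node (u = 0 there)."""
    n = f.shape[0]
    big = 1e10
    u = np.full((n, n), big)
    src = n // 2
    u[src, src] = 0.0
    for sweep in range(n_sweeps):
        # Alternate the four Gauss-Seidel orderings so information
        # propagates along every characteristic direction.
        xr = range(n) if sweep % 2 == 0 else range(n - 1, -1, -1)
        yr = range(n) if (sweep // 2) % 2 == 0 else range(n - 1, -1, -1)
        for i in xr:
            for j in yr:
                if i == src and j == src:
                    continue
                a = min(u[i - 1, j] if i > 0 else big,
                        u[i + 1, j] if i < n - 1 else big)
                b = min(u[i, j - 1] if j > 0 else big,
                        u[i, j + 1] if j < n - 1 else big)
                fh = f[i, j] * h
                if abs(a - b) >= fh:          # upwind update, one neighbour
                    unew = min(a, b) + fh
                else:                          # upwind update, both neighbours
                    unew = 0.5 * (a + b + np.sqrt(2 * fh**2 - (a - b)**2))
                u[i, j] = min(u[i, j], unew)   # monotone (causal) update
    return u

u = fast_sweep_eikonal(np.ones((51, 51)), h=1.0)
print(u[25, 35])   # distance from the centre source: ~10
```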
Learning process mapping heuristics under stochastic sampling overheads
NASA Technical Reports Server (NTRS)
Ieumwananonthachai, Arthur; Wah, Benjamin W.
1991-01-01
A statistical method was developed previously for improving process mapping heuristics. The method systematically explores the space of possible heuristics under a specified time constraint. Its goal is to obtain the best possible heuristics while trading off the solution quality of the process mapping heuristics against their execution time. The statistical selection method is extended here to take into consideration the variations in the amount of time used to evaluate heuristics on a problem instance. The improvement in performance under this more realistic assumption is presented, along with some methods that alleviate the additional complexity.
The method of lines in three dimensional fracture mechanics
NASA Technical Reports Server (NTRS)
Gyekenyesi, J.; Berke, L.
1980-01-01
A review of recent developments in the calculation of design parameters for fracture mechanics by the method of lines (MOL) is presented. Three-dimensional elastic and elasto-plastic formulations are examined, and results from previous and current research activities are reported. The application of MOL to the appropriate partial differential equations of equilibrium leads to coupled sets of simultaneous ordinary differential equations. Solutions of these equations are obtained by the Peano-Baker method and by the method of recurrence relations. The advantages and limitations of both solution methods from the computational standpoint are summarized.
NASA Technical Reports Server (NTRS)
Ryan, Robert S.; Townsend, John S.
1993-01-01
The prospective improvement of probabilistic methods for space program analysis/design entails the further development of theories, codes, and tools which match specific areas of application, the drawing of lessons from previous uses of probability and statistics data bases, the enlargement of data bases (especially in the field of structural failures), and the education of engineers and managers on the advantages of these methods. An evaluation is presently made of the current limitations of probabilistic engineering methods. Recommendations are made for specific applications.
Rapid calculation of genomic evaluations for new animals
USDA-ARS?s Scientific Manuscript database
A method was developed to calculate preliminary genomic evaluations daily or weekly before the release of official monthly evaluations by processing only newly genotyped animals using estimates of SNP effects from the previous official evaluation. To minimize computing time, reliabilities and genomi...
Transcriptome assembly and digital gene expression atlas of the rainbow trout
USDA-ARS?s Scientific Manuscript database
Background: Transcriptome analysis is a preferred method for gene discovery, marker development and gene expression profiling in non-model organisms. Previously, we sequenced a transcriptome reference using Sanger-based and 454-pyrosequencing, however, a transcriptome assembly is still incomplete an...
Generic Sensor Modeling Using Pulse Method
NASA Technical Reports Server (NTRS)
Helder, Dennis L.; Choi, Taeyoung
2005-01-01
Recent development of high spatial resolution satellites such as IKONOS, QuickBird and OrbView enables observation of the Earth's surface with sub-meter resolution. Compared to the 30-meter resolution of Landsat 5 TM, the amount of information in the output image is dramatically increased. In this era of high spatial resolution, the estimation of the spatial quality of images is gaining attention. Historically, the Modulation Transfer Function (MTF) concept has been used to estimate an imaging system's spatial quality. Various methods, sometimes classified by target shape, were developed in laboratory environments utilizing sinusoidal inputs, periodic bar patterns and narrow slits. On-orbit sensor MTF estimation was performed on 30-meter GSD Landsat 4 Thematic Mapper (TM) data using a bridge target as a pulse input. Because of a high resolution sensor's small Ground Sampling Distance (GSD), reasonably sized man-made edge, pulse, and impulse targets can be deployed on a uniform grassy area, with accurate control of ground targets using tarps and convex mirrors. All the previous work cited calculated MTF without testing the MTF estimator's performance. In a previous report, a numerical generic sensor model was developed to simulate and improve the performance of on-orbit MTF estimation techniques. Results from that sensor modeling report that have been incorporated into standard MTF estimation work include Fermi edge detection and the newly developed 4th-order modified Savitzky-Golay (MSG) interpolation technique. Noise sensitivity was studied by performing simulations with known noise sources and a sensor model, and extensive investigation was done to characterize multi-resolution ground noise. Finally, angle simulation was tested using synthetic pulse targets with angles from 2 to 15 degrees, several brightness levels, and different noise levels from both the ground targets and the imaging system. As a continuation of that research using the developed sensor model, this report is dedicated to characterizing MTF estimation via the pulse input method using Fermi edge detection and the 4th-order MSG interpolation method. The relationship between pulse width and the MTF value at Nyquist was studied, including error detection and correction schemes, and pulse target angle sensitivity was studied using synthetic targets angled from 2 to 12 degrees. From the ground and system noise simulations, a minimum SNR value is suggested for a stable MTF value at Nyquist for the pulse method. A target width error detection and adjustment technique based on a smooth transition of the MTF profile is presented, which is specifically applicable only to the pulse method with 3-pixel-wide targets.
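In pulse-method MTF estimation, the system transfer function is the frequency-domain ratio of the measured pulse response to the ideal pulse. The sketch below is a minimal illustration of that ratio on a synthetic 1-D profile; the Gaussian blur standing in for the sensor PSF and the 4x oversampling are assumptions, not the report's sensor model.

```python
import numpy as np

# Synthetic, 4x-oversampled 1-D scene: a 3-pixel-wide pulse target.
oversample = 4
n = 256
x = np.arange(n)
ideal = ((x >= 120) & (x < 120 + 3 * oversample)).astype(float)

# Stand-in sensor: Gaussian PSF blur (assumed, for illustration only).
sigma = 2.0
psf = np.exp(-0.5 * (np.arange(-16, 17) / sigma) ** 2)
psf /= psf.sum()
measured = np.convolve(ideal, psf, mode="same")

# MTF = |FFT(measured pulse)| / |FFT(ideal pulse)|, valid where the
# ideal pulse spectrum is nonzero (away from the sinc nulls).
freq = np.fft.rfftfreq(n, d=1.0 / oversample)   # cycles per native pixel
F_meas = np.abs(np.fft.rfft(measured))
F_ideal = np.abs(np.fft.rfft(ideal))
valid = F_ideal > 1e-3 * F_ideal.max()
mtf = np.where(valid, F_meas / np.where(valid, F_ideal, 1.0), np.nan)
mtf /= mtf[0]                                    # normalize to 1 at DC

# Report the value at the native Nyquist frequency (0.5 cycles/pixel).
nyq = np.argmin(np.abs(freq - 0.5))
print(f"MTF at Nyquist ~ {mtf[nyq]:.3f}")
```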
Entropic methods applied to the inverse problem in magnetoencephalography
NASA Astrophysics Data System (ADS)
Lapalme, Ervig
2005-07-01
This thesis is devoted to biomagnetic source localization using magnetoencephalography. This problem is known to have an infinite number of solutions, so methods are required to take anatomical and functional information on the solution into account. The work presented in this thesis uses the maximum entropy on the mean method, which originates from statistical mechanics and information theory, to constrain the solution. The thesis is divided into two main parts of three chapters each. The first part reviews the magnetoencephalographic inverse problem: the theory needed to understand its context and the hypotheses used to simplify the problem. The last chapter of this first part presents the maximum entropy on the mean method, explaining its origins and how it is applied to our problem. The second part is the original work of this thesis, presenting three articles: one already published and two submitted for publication. In the first article, a biomagnetic source model is developed and applied in a theoretical context, demonstrating the efficiency of the method. In the second article, we go one step further towards a realistic model of cerebral activation: the main priors are estimated from the magnetoencephalographic data, and the method proved very efficient in realistic simulations. In the third article, the previous method is extended to deal with time signals, thus exploiting the excellent time resolution offered by magnetoencephalography. Compared with our previous work, the temporal method is applied to real magnetoencephalographic data from a somatotopy experiment, and the results agree with previous physiological knowledge about this kind of cognitive process.
Convergence of methods for coupling of microscopic and mesoscopic reaction-diffusion simulations
NASA Astrophysics Data System (ADS)
Flegg, Mark B.; Hellander, Stefan; Erban, Radek
2015-05-01
In this paper, three multiscale methods for coupling of mesoscopic (compartment-based) and microscopic (molecular-based) stochastic reaction-diffusion simulations are investigated. Two of the three methods that are discussed in detail have been previously reported in the literature: the two-regime method (TRM) and the compartment-placement method (CPM). The third method, introduced and analysed in this paper, is called the ghost cell method (GCM), since it works by constructing a "ghost cell" in which molecules can disappear and jump into the compartment-based simulation. A comparison of the sources of error is presented. The convergence properties of this error are studied as the time step Δt (for updating the molecular-based part of the model) approaches zero. It is found that the error behaviour depends on another fundamental computational parameter h, the compartment size in the mesoscopic part of the model. Two important limiting cases, which appear in applications, are considered: (i) Δt → 0 with h fixed; and (ii) Δt → 0 and h → 0 such that √Δt/h is fixed. The error of the previously developed approaches (the TRM and CPM) converges to zero only in the limiting case (ii), but not in case (i). It is shown that the error of the GCM converges in the limiting case (i). Thus the GCM is superior to previous coupling techniques if the mesoscopic description is much coarser than the microscopic part of the model.
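As a rough illustration of the two descriptions being coupled (not the TRM, CPM, or GCM themselves), the sketch below contrasts a microscopic Brownian update with time step Δt against mesoscopic compartment jumps with rate D/h², the two discretization parameters whose joint limit governs the error above. The geometry and population sizes are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 1.0       # diffusion coefficient
dt = 1e-4     # microscopic time step (the Delta t of the error analysis)
h = 0.1       # mesoscopic compartment size (the h of the error analysis)

def brownian_step(x):
    """Microscopic (molecular-based) description: Brownian update."""
    return x + np.sqrt(2.0 * D * dt) * rng.standard_normal()

def compartment_step(counts):
    """Mesoscopic (compartment-based) description: diffusive jumps between
    neighbouring compartments with propensity D / h**2 per molecule and
    direction (tau-leaping style; valid while D * dt / h**2 << 1)."""
    rate = D / h**2
    new = counts.copy()
    for i, n_i in enumerate(counts):
        for d in (-1, 1):
            j = i + d
            if 0 <= j < len(counts):
                n_jump = min(rng.poisson(n_i * rate * dt), n_i)
                new[i] -= n_jump
                new[j] += n_jump
    return new

x = 0.0                                   # one microscopic molecule
counts = np.zeros(10, dtype=int)
counts[5] = 1000                          # mesoscopic population
for _ in range(1000):
    x = brownian_step(x)
    counts = compartment_step(counts)
print(x, counts)
```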
CREPT-MCNP code for efficiency calibration of HPGe detectors with the representative point method.
Saegusa, Jun
2008-01-01
The representative point method for the efficiency calibration of volume samples has been proposed previously. To implement the method smoothly, a calculation code named CREPT-MCNP has been developed. The code estimates the position of the representative point, which is intrinsic to each shape of volume sample. Self-absorption correction factors are also given to correct the efficiencies measured at the representative point with a standard point source. Features of the CREPT-MCNP code are presented.
Application of 3-signal coherence to core noise transmission
NASA Technical Reports Server (NTRS)
Krejsa, E. A.
1983-01-01
A method for determining transfer functions across turbofan engine components and from the engine to the far-field is developed. The method is based on the three-signal coherence technique used previously to obtain far-field core noise levels. This method eliminates the bias error in transfer function measurements due to contamination of measured pressures by nonpropagating pressure fluctuations. Measured transfer functions from the engine to the far-field, across the tailpipe, and across the turbine are presented for three turbofan engines.
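The three-signal technique estimates the power of a common (core noise) source from the three pairwise cross-spectra, in which uncorrelated contamination cancels. The sketch below is a minimal illustration of that estimate on synthetic signals; the signal model, gains, and scipy usage are assumptions, not the paper's engine data.

```python
import numpy as np
from scipy.signal import csd

rng = np.random.default_rng(1)
fs, n = 1024, 2**16
s = rng.standard_normal(n)                   # common "core noise" source
x1 = 1.0 * s + 2.0 * rng.standard_normal(n)  # e.g. far-field microphone
x2 = 0.8 * s + 2.0 * rng.standard_normal(n)  # e.g. tailpipe sensor
x3 = 0.5 * s + 2.0 * rng.standard_normal(n)  # e.g. combustor sensor

kw = dict(fs=fs, nperseg=1024)
f, g12 = csd(x1, x2, **kw)
_, g13 = csd(x1, x3, **kw)
_, g23 = csd(x2, x3, **kw)

# Three-signal estimate of the coherent source autospectrum in x1:
# with mutually uncorrelated noises, |G12 G13 / G23| = |a1|^2 * S_ss,
# so the nonpropagating (uncorrelated) contamination drops out.
s11_coherent = np.abs(g12 * g13 / g23)
print(s11_coherent.mean())
```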
Holmes, Robert R.; Dunn, Chad J.
1996-01-01
A simplified method to estimate total streambed scour was developed for application to bridges in the State of Illinois. Scour envelope curves, developed as empirical relations between calculated total scour and bridge-site characteristics for 213 State highway bridges in Illinois, are used in the method to estimate the 500-year flood scour. These 213 bridges, geographically distributed throughout Illinois, had previously been evaluated for streambed scour with the conventional hydraulic and scour-analysis methods recommended by the Federal Highway Administration. The bridge characteristics necessary for application of the simplified bridge scour-analysis method can be obtained from an office review of bridge plans, examination of topographic maps, and a reconnaissance-level site inspection. The estimates computed with the simplified method generally resulted in a larger value of 500-year flood total streambed scour than the more detailed conventional method. The simplified method was successfully verified with a separate data set of 106 State highway bridges, geographically distributed throughout Illinois, and 15 county highway bridges.
Generic Safety Requirements for Developing Safe Insulin Pump Software
Zhang, Yi; Jetley, Raoul; Jones, Paul L; Ray, Arnab
2011-01-01
Background The authors previously introduced a highly abstract generic insulin infusion pump (GIIP) model that identified common features and hazards shared by most insulin pumps on the market. The aim of this article is to extend our previous work on the GIIP model by articulating safety requirements that address the identified GIIP hazards. These safety requirements can be validated by manufacturers, and may ultimately serve as a safety reference for insulin pump software. Together, these two publications can serve as a basis for discussing insulin pump safety in the diabetes community. Methods In our previous work, we established a generic insulin pump architecture that abstracts functions common to many insulin pumps currently on the market and near-future pump designs. We then carried out a preliminary hazard analysis based on this architecture that included consultations with many domain experts. Further consultation with domain experts resulted in the safety requirements used in the modeling work presented in this article. Results Generic safety requirements for the GIIP model are presented, as appropriate, in parameterized format to accommodate clinical practices or specific insulin pump criteria important to safe device performance. Conclusions We believe that there is considerable value in having the diabetes, academic, and manufacturing communities consider and discuss these generic safety requirements. We hope that the communities will extend and revise them, make them more representative and comprehensive, experiment with them, and use them as a means for assessing the safety of insulin pump software designs. One potential use of these requirements is to integrate them into model-based engineering (MBE) software development methods. We believe, based on our experiences, that implementing safety requirements using MBE methods holds promise in reducing design/implementation flaws in insulin pump development and evolutionary processes, therefore improving overall safety of insulin pump software. PMID:22226258
NASA Technical Reports Server (NTRS)
Stanley, A. G.; Gauthier, M. K.
1977-01-01
A successful diagnostic technique was developed using a scanning electron microscope (SEM) as a precision tool to determine ionization effects in integrated circuits. Previous SEM methods irradiated the entire semiconductor chip or major areas; such large-area exposure methods do not reveal exactly which components are sensitive to radiation. To locate these sensitive components, a new method was developed that consists of successively irradiating selected components on the device chip with equal doses of electrons [10⁶ rad (Si)] while the whole device is held under representative bias conditions. A suitable device parameter was measured in situ after each successive irradiation with the beam off.
NASA Astrophysics Data System (ADS)
Manjanaik, N.; Parameshachari, B. D.; Hanumanthappa, S. N.; Banu, Reshma
2017-08-01
The intra prediction process of the H.264 video coding standard is used to code the first (intra) frame of a video, and achieves good coding efficiency compared with previous video coding standards. Intra frame coding reduces spatial pixel redundancy within the current frame, reduces computational complexity, and provides better rate-distortion performance. To code an intra frame, the standard uses the Rate Distortion Optimization (RDO) method. This method increases computational complexity, increases bit rate, and reduces picture quality, so it is difficult to implement in real-time applications; many researchers have therefore developed fast mode decision algorithms for intra frame coding. Previous work on intra frame coding in the H.264 standard using fast mode decision intra prediction algorithms, based on various techniques, yielded increased bit rate and degraded picture quality (PSNR) at different quantization parameters. Many earlier fast mode decision approaches achieved only a reduction of computational complexity (saving encoding time), at the cost of increased bit rate and loss of picture quality. To avoid the increase in bit rate and the loss of picture quality, a better approach was developed. This paper develops a Gaussian-pulse approach for intra frame coding using the diagonal down-left intra prediction mode to achieve higher coding efficiency in terms of PSNR and bit rate. In the proposed method, a Gaussian pulse is multiplied with each 4x4 block of frequency-domain coefficients of the 4x4 sub-macroblocks of the current frame before quantization. Multiplying each 4x4 integer-transformed coefficient block by a Gaussian pulse scales the information of the coefficients in a reversible manner: the frequency samples are modified in a known and controllable way without intermixing of coefficients, which prevents the picture from being badly degraded at higher values of the quantization parameter. The proposed work was implemented using MATLAB and the JM 18.6 reference software. The performance parameters PSNR, bit rate, and compression of the intra frame were measured for YUV video sequences in QCIF resolution under different values of the quantization parameter, with the Gaussian pulse applied to the diagonal down-left intra prediction mode. The simulation results of the proposed algorithm are tabulated and compared with a previous algorithm (the method of Tian et al.). The proposed algorithm reduced the bit rate by 30.98% on average and maintained consistent picture quality for QCIF sequences compared with the method of Tian et al.
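A minimal sketch of the coefficient-scaling idea follows, using scipy's orthonormal DCT on a 4x4 block as a stand-in for the H.264 4x4 integer transform; the Gaussian width, quantization step, and sample block are assumptions for illustration, not values from the paper.

```python
import numpy as np
from scipy.fft import dctn, idctn

def gaussian_weight(size=4, sigma=2.0):
    """2-D Gaussian pulse over the transform-coefficient grid."""
    u = np.arange(size)
    g = np.exp(-0.5 * (u / sigma) ** 2)
    return np.outer(g, g)

def code_block(block, q_step=10.0, sigma=2.0):
    """Scale 4x4 frequency coefficients by a Gaussian before quantization,
    then invert the scaling after dequantization (reversible by design)."""
    w = gaussian_weight(block.shape[0], sigma)
    coef = dctn(block, norm="ortho")      # stand-in for the integer transform
    q = np.round(coef * w / q_step)       # Gaussian weighting + quantization
    rec_coef = (q * q_step) / w           # dequantize, undo the weighting
    return idctn(rec_coef, norm="ortho")

block = np.array([[52, 55, 61, 66],
                  [70, 61, 64, 73],
                  [63, 59, 55, 90],
                  [67, 61, 68, 104]], dtype=float)
rec = code_block(block)
print(np.abs(rec - block).max())   # reconstruction error after quantization
```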
Improving consensus contact prediction via server correlation reduction.
Gao, Xin; Bu, Dongbo; Xu, Jinbo; Li, Ming
2009-05-06
Protein inter-residue contacts play a crucial role in the determination and prediction of protein structures. Previous studies on contact prediction indicate that although template-based consensus methods outperform sequence-based methods on targets with typical templates, such consensus methods perform poorly on new-fold targets. However, we found that even for new-fold targets, the models generated by threading programs can contain many true contacts; the challenge is how to identify them. In this paper, we develop an integer linear programming model for consensus contact prediction. In contrast to the simple majority voting method, which assumes that all the individual servers are equally important and independent, the newly developed method evaluates their correlation by maximum likelihood estimation and extracts independent latent servers from them by principal component analysis. An integer linear programming method is then applied to assign a weight to each latent server so as to maximize the difference between true contacts and false ones. The proposed method is tested on the CASP7 data set. If the top L/5 predicted contacts are evaluated, where L is the protein size, the average accuracy is 73%, which is much higher than that of any previously reported study. Moreover, if only the 15 new-fold CASP7 targets are considered, our method achieves an average accuracy of 37%, which is much better than that of the majority voting method, SVM-LOMETS, SVM-SEQ, and SAM-T06; these methods demonstrate average accuracies of 13.0%, 10.8%, 25.8% and 21.2%, respectively. Reducing server correlation and optimally combining independent latent servers thus show a significant improvement over traditional consensus methods. This approach can hopefully provide a powerful tool for protein structure refinement and prediction.
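A rough sketch of the decorrelate-then-weight pipeline follows, with binary contact votes, PCA via eigendecomposition, and a simple least-squares weighting standing in for the paper's maximum-likelihood estimation and integer linear program (which are not reproduced here); all data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(2)

# Votes: rows = candidate residue pairs, cols = servers (1 = predicted contact).
n_pairs, n_servers = 500, 8
true_contact = rng.random(n_pairs) < 0.3
shared = rng.random(n_pairs) < 0.2        # shared bias -> correlated servers
votes = np.column_stack([
    (true_contact | shared) ^ (rng.random(n_pairs) < 0.1)
    for _ in range(n_servers)
]).astype(float)

# Extract independent "latent servers": PCA on the centred vote matrix.
centred = votes - votes.mean(axis=0)
cov = np.cov(centred, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
latent = centred @ eigvecs

# Weight the latent servers (least squares against known contacts on a
# training split; the paper's ILP maximizes a different objective).
train, test = slice(0, 250), slice(250, None)
w, *_ = np.linalg.lstsq(latent[train], true_contact[train].astype(float),
                        rcond=None)
score = latent @ w

# Evaluate precision of the top-ranked pairs on the held-out half.
top = np.argsort(score[test])[::-1][:50]
print("precision of top picks:", true_contact[test][top].mean())
```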
Nguyen, Dat Tien; Pham, Tuyen Danh; Baek, Na Rae; Park, Kang Ryoung
2018-01-01
Although face recognition systems have wide application, they are vulnerable to presentation attack samples (fake samples). Therefore, a presentation attack detection (PAD) method is required to enhance the security level of face recognition systems. Most of the previously proposed PAD methods for face recognition systems have focused on using handcrafted image features, which are designed by expert knowledge of designers, such as Gabor filter, local binary pattern (LBP), local ternary pattern (LTP), and histogram of oriented gradients (HOG). As a result, the extracted features reflect limited aspects of the problem, yielding a detection accuracy that is low and varies with the characteristics of presentation attack face images. The deep learning method has been developed in the computer vision research community, which is proven to be suitable for automatically training a feature extractor that can be used to enhance the ability of handcrafted features. To overcome the limitations of previously proposed PAD methods, we propose a new PAD method that uses a combination of deep and handcrafted features extracted from the images by visible-light camera sensor. Our proposed method uses the convolutional neural network (CNN) method to extract deep image features and the multi-level local binary pattern (MLBP) method to extract skin detail features from face images to discriminate the real and presentation attack face images. By combining the two types of image features, we form a new type of image features, called hybrid features, which has stronger discrimination ability than single image features. Finally, we use the support vector machine (SVM) method to classify the image features into real or presentation attack class. Our experimental results indicate that our proposed method outperforms previous PAD methods by yielding the smallest error rates on the same image databases. PMID:29495417
Qualitative PCR method for Roundup Ready soybean: interlaboratory study.
Kodama, Takashi; Kasahara, Masaki; Minegishi, Yasutaka; Futo, Satoshi; Sawada, Chihiro; Watai, Masatoshi; Akiyama, Hiroshi; Teshima, Reiko; Kurosawa, Yasunori; Furui, Satoshi; Hino, Akihiro; Kitta, Kazumi
2011-01-01
Quantitative and qualitative methods based on PCR have been developed for genetically modified organisms (GMOs). Interlaboratory studies were previously conducted for GMO quantitative methods; in this study, an interlaboratory study was conducted for a qualitative method for a GM soybean, Roundup Ready soy (RR soy), with the primer pairs designed for the previously studied quantitative method for RR soy. Fourteen laboratories in Japan participated. Each participant extracted DNA from 1.0 g each of soy samples containing 0, 0.05, and 0.10% RR soy, and performed PCR with primer pairs for an internal control gene (Le1) and for RR soy, followed by agarose gel electrophoresis. The PCR product amplified in this system for Le1 was detected in all samples. The sensitivity, specificity, and false-negative and false-positive rates of the method were obtained from the results of RR soy detection. False-negative rates at the 0.05 and 0.10% RR soy levels were 6.0 and 2.3%, respectively, revealing that the LOD of the method was somewhat below 0.10%. The current study demonstrated that the qualitative method is practical for monitoring the labeling system for GM soy in kernel lots.
NASA Technical Reports Server (NTRS)
Moore, E. N.; Altick, P. L.
1972-01-01
The research performed is briefly reviewed. A simple method was developed for the calculation of continuum states of atoms when autoionization is present. The method was employed to give the first theoretical cross sections for beryllium and magnesium; the results indicate that the threshold values used previously were sometimes seriously in error. These threshold values have potential applications in astrophysical abundance estimates.
NASA Astrophysics Data System (ADS)
Dietrich, Dietmar; Fodor, Georg; Zucker, Gerhard; Bruckner, Dietmar
The approach to developing models described in the following chapters breaks with some of the previously used approaches in Artificial Intelligence. It is the first attempt to use methods from psychoanalysis, organized in a strictly top-down design method, in order to take an important step towards the creation of intelligent systems. Hence, the vision and the research hypothesis are described at the outset and will, we hope, prove to provide sufficient grounds for this approach.
Solution of electromagnetic scattering problems using time domain techniques
NASA Technical Reports Server (NTRS)
Britt, Charles L.
1989-01-01
New methods are developed to calculate the electromagnetic diffraction or scattering characteristics of objects of arbitrary material and shape. The methods extend the efforts of previous researchers in the use of finite-difference and pulse response techniques. Examples are given of the scattering from infinite conducting and nonconducting cylinders, open channel, sphere, cone, cone sphere, coated disk, open boxes, and open and closed finite cylinders with axially incident waves.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, T; Lin, H; Xu, X
Purpose: To develop a nuclear medicine dosimetry module for the GPU-based Monte Carlo code ARCHER. Methods: We have developed a nuclear medicine dosimetry module for the fast Monte Carlo code ARCHER. The coupled electron-photon Monte Carlo transport kernel included in ARCHER is built upon the Dose Planning Method code (DPM). The developed module manages the radioactive decay simulation by consecutively tracking several types of radiation on a per-disintegration basis using the statistical sampling method. Optimization techniques such as persistent threads and prefetching are studied and implemented. The developed module is verified against the VIDA code, which is based on the Geant4 toolkit and has previously been verified against OLINDA/EXM. A voxelized geometry is used in the preliminary test: a sphere made of ICRP soft tissue is surrounded by a box filled with water. A uniform activity distribution of I-131 is assumed in the sphere. Results: The self-absorption dose factors (mGy/(MBq·s)) of the sphere with varying diameters are calculated by ARCHER and VIDA, respectively. ARCHER's results are in agreement with VIDA's, which were obtained from a previous publication. VIDA takes hours of CPU time to finish the computation, while ARCHER takes 4.31 seconds for the 12.4-cm uniform activity sphere case. For a fairer CPU-GPU comparison, more effort will be made to eliminate the algorithmic differences. Conclusion: The coupled electron-photon Monte Carlo code ARCHER has been extended to radioactive decay simulation for nuclear medicine dosimetry. The developed code exhibits good performance in our preliminary test. The GPU-based Monte Carlo code is developed with grant support from the National Institute of Biomedical Imaging and Bioengineering through an R01 grant (R01EB015478).
ERIC Educational Resources Information Center
Prado, Elizabeth L.; Abbeddou, Souheila; Adu-Afarwuah, Seth; Arimond, Mary; Ashorn, Per; Ashorn, Ulla; Bendabenda, Jaden; Brown, Kenneth H.; Hess, Sonja Y.; Kortekangas, Emma; Lartey, Anna; Maleta, Kenneth; Oaks, Brietta M.; Ocansey, Eugenia; Okronipa, Harriet; Ouédraogo, Jean Bosco; Pulakka, Anna; Somé, Jérôme W.; Stewart, Christine P.; Stewart, Robert C.; Vosti, Stephen A.; Yakes Jimenez, Elizabeth; Dewey, Kathryn G.
2017-01-01
Background: Previous reviews have identified 44 risk factors for poor early child development (ECD) in low- and middle-income countries. Further understanding of their relative influence and pathways is needed to inform the design of interventions targeting ECD. Methods: We conducted path analyses of factors associated with 18-month language and…
Reilly, T.E.; Frimpter, M.H.; LeBlanc, D.R.; Goodman, A.S.
1987-01-01
Sharp interface methods have been used successfully to describe the physics of upconing. A finite-element model is developed to simulate a sharp interface for determining the steady-state position of the interface and the maximum permissible well discharges. The model is compared with the previously published electric-analog model results of Bennett and others (1968). -from Authors
Potential implementation of light steel housing system for affordable housing project in Malaysia
NASA Astrophysics Data System (ADS)
Saikah, M.; Kasim, N.; Zainal, R.; Sarpin, N.; Rahim, M. H. I. A.
2017-11-01
A persistent gap between housing demand and housing supply in Malaysia has increased housing prices, with consequences for homeownership. One way to reduce housing prices is to increase the supply of affordable housing more quickly, but the construction sector faces difficulties in delivering the expected numbers using conventional methods and the current industrialised building system (IBS), owing to issues of high project cost, time and labour. The light steel housing (LSH) system, another type of IBS, can therefore be utilised in housing construction projects, replacing the conventional method currently used in the construction of affordable housing. The objectives of this study are to identify the potential of LSH and the factors influencing implementation of the system. As an initial stage, previous studies of LSH implementation in developed and developing countries are reviewed and analysed with respect to the advantages and disadvantages of LSH and the factors that influence its implementation. Based on the literature review, the potential and influencing factors of the LSH system are expected to be defined. The findings are meaningful for framing and enhancing the housing construction methods of affordable housing projects in Malaysia.
Rapid quantification of neutral lipids and triglycerides during zebrafish embryogenesis.
Yoganantharjah, Prusothman; Byreddy, Avinesh R; Fraher, Daniel; Puri, Munish; Gibert, Yann
2017-01-01
The zebrafish is a useful vertebrate model to study lipid metabolism. Oil Red-O (ORO) staining of zebrafish embryos, though sufficient for visualizing the localization of triglycerides, was previously inadequate for quantifying neutral lipid abundance. For metabolic studies, it is crucial to be able to quantify lipids during embryogenesis, yet no cost-effective, rapid and reliable method existed to quantify the deposition of neutral lipids and triglycerides. Thin layer chromatography (TLC), gas chromatography and mass spectrometry can be used to measure lipid levels accurately, but are time consuming and costly. Hence, we developed a rapid and reliable method to quantify neutral lipids and triglycerides. Zebrafish embryos were exposed to Rimonabant (Rimo) or WIN 55,212-2 mesylate (WIN), compounds previously shown to modify lipid content during zebrafish embryogenesis. Following this, ORO stain was extracted from both the zebrafish body and the yolk sac, and optical density was measured to give an indication of neutral lipid and triglyceride accumulation. Embryos treated with 0.3 µM WIN showed increased lipid accumulation, whereas 3 µM Rimo caused a decrease in lipid accumulation during embryogenesis. TLC was performed on zebrafish bodies to validate the developed method. In addition, BODIPY free fatty acids were injected into zebrafish embryos to confirm quantification of changes in lipid content in the embryo. Previously, ORO was limited to qualitative assessment; now it can be used as a quantitative tool to directly determine changes in the levels of neutral lipids and triglycerides.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
The final report for the project is presented in five volumes. This volume, Detailed Methodology Review, presents a discussion of the methods considered and used to estimate the impacts of Outer Continental Shelf (OCS) oil and gas development on coastal recreation in California. The purpose is to provide the Minerals Management Service with data and methods to improve their ability to analyze the socio-economic impacts of OCS development. Chapter II provides a review of previous attempts to evaluate the effects of OCS development and of oil spills on coastal recreation. The review also discusses the strengths and weaknesses of different approaches and presents the rationale for the methodology selection made. Chapter III presents a detailed discussion of the methods actually used in the study. The volume contains the bibliography for the entire study.
Improving Upon String Methods for Transition State Discovery.
Chaffey-Millar, Hugh; Nikodem, Astrid; Matveev, Alexei V; Krüger, Sven; Rösch, Notker
2012-02-14
Transition state discovery via application of string methods has been researched on two fronts. The first front involves development of a new string method, named the Searching String method, while the second one aims at estimating transition states from a discretized reaction path. The Searching String method has been benchmarked against a number of previously existing string methods and the Nudged Elastic Band method. The developed methods have led to a reduction in the number of gradient calls required to optimize a transition state, as compared to existing methods. The Searching String method reported here places new beads on a reaction pathway at the midpoint between existing beads, such that the resolution of the path discretization in the region containing the transition state grows exponentially with the number of beads. This approach leads to favorable convergence behavior and generates more accurate estimates of transition states from which convergence to the final transition states occurs more readily. Several techniques for generating improved estimates of transition states from a converged string or nudged elastic band have been developed and benchmarked on 13 chemical test cases. Optimization approaches for string methods, and pitfalls therein, are discussed.
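The midpoint-insertion idea can be sketched in a few lines: starting from the endpoints, repeatedly bisect the segment bracketing the current energy maximum, so the discretization refines exponentially around the barrier. The 1-D toy potential and the bisection heuristic below are assumptions for illustration, not the published algorithm.

```python
import numpy as np

def energy(x):
    """Toy 1-D potential with a barrier near x ~ 0.6 (illustration only)."""
    return np.sin(3.0 * x) * np.exp(-((x - 0.6) ** 2))

def searching_string_1d(x_react, x_prod, n_beads=9):
    """Grow a string by inserting beads at midpoints of the segment
    currently bracketing the highest-energy bead."""
    beads = [x_react, x_prod]
    while len(beads) < n_beads:
        beads.sort()
        k = int(np.argmax([energy(b) for b in beads]))
        if k == 0:                              # maximum at an endpoint
            beads.append(0.5 * (beads[0] + beads[1]))
        elif k == len(beads) - 1:
            beads.append(0.5 * (beads[-2] + beads[-1]))
        else:
            # Bisect toward the higher-energy neighbour of the maximum.
            left, right = beads[k - 1], beads[k + 1]
            side = left if energy(left) > energy(right) else right
            beads.append(0.5 * (beads[k] + side))
    beads.sort()
    return beads

beads = searching_string_1d(-1.0, 2.0)
print("TS estimate near:", beads[int(np.argmax([energy(b) for b in beads]))])
```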
MOLECULAR SIZE EXCLUSION BY SOIL ORGANIC MATERIALS ESTIMATED FROM THEIR SWELLING IN ORGANIC SOLVENTS
A published method previously developed to measure the swelling characteristics of powdered coal samples has been adapted for swelling measurements on various peat, pollen, chitin, and cellulose samples. The swelling of these macromolecular materials is the volumetric manifestatio...
Bridge-in-a-Backpack(TM). Task 2 : reduction of costs through design modifications and optimization.
DOT National Transportation Integrated Search
2011-09-01
The cost-effective use of FRP composites in infrastructure requires the efficient use of the composite materials in the design. Previous work during the development phase and demonstration phase illustrated the need to refine the design methods f...
New targets for expedient detection of viruses within shellfish
USDA-ARS?s Scientific Manuscript database
Previously our laboratory developed an expedient method for extraction of viral RNA from food-borne virus-contaminated bivalve shellfish, termed the GPTT protocol. This protocol uses either whole shellfish or dissected digestive diverticula. The four-step protocol utilizes a high-pH glycine or...
Perennial plant establishment and productivity can be influenced by previous annual crops
USDA-ARS?s Scientific Manuscript database
Developing efficient, economical methods of perennial mixture establishment is needed for grazing and conservation purposes. Study objectives were to evaluate different perennial monocultures and mixtures planted into spring wheat (Triticum aestivum L.), corn (Zea mays L.), soybean (Glycine max L. ...
NONINVASIVE DETERMINATION OF RESPIRATORY OZONE ABSORPTION: THE BOLUS-RESPONSE METHOD
Dr. James Ultman and colleagues used a fast-responding ozone measurement system, which they had developed with previous HEI support, to noninvasively measure the absorption of inhaled ozone in different regions of the respiratory tract of healthy adult men. While the subjec...
NASA Technical Reports Server (NTRS)
Tan, P. W.; Raju, I. S.; Shivakumar, K. N.; Newman, J. C., Jr.
1990-01-01
A re-evaluation of the 3-D finite-element models and methods used to analyze surface cracks at stress concentrations is presented. Previous finite-element models used by Raju and Newman for surface and corner cracks at holes were shown to have ill-shaped elements at the intersection of the hole and crack boundaries. Improved models, without these ill-shaped elements, were developed for a surface crack at a circular hole and at a semi-circular edge notch. Stress-intensity factors were calculated by both the nodal-force and virtual-crack-closure methods. Comparisons between the previously developed stress-intensity factor equations and the results from the improved models agreed well, except for configurations with large notch-radius-to-plate-thickness ratios. Stress-intensity factors for a semi-elliptical surface crack located at the center of a semi-circular edge notch in a plate subjected to remote tensile loading were calculated using the improved models.
Battery Test Manual For Plug-In Hybrid Electric Vehicles
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jeffrey R. Belt
2010-09-01
This battery test procedure manual was prepared for the United States Department of Energy (DOE), Office of Energy Efficiency and Renewable Energy (EERE), Vehicle Technologies Program. It is based on technical targets established for energy storage development projects aimed at meeting system-level DOE goals for Plug-in Hybrid Electric Vehicles (PHEV). The specific procedures defined in this manual support the performance and life characterization of advanced battery devices under development for PHEVs. However, it does share some methods described in the previously published battery test manual for power-assist hybrid electric vehicles. Due to the complexity of some of the procedures and supporting analysis, a revision including some modifications and clarifications of these procedures is expected. As in previous battery and capacitor test manuals, this version of the manual defines testing methods for full-size battery systems, along with provisions for scaling these tests for modules, cells or other subscale level devices.
Pediatric intensive care unit admission tool: a colorful approach.
Biddle, Amy
2007-12-01
This article discusses the development, implementation, and utilization of our institution's Pediatric Intensive Care Unit (PICU) Color-Coded Admission Status Tool. Rather than the historical method of identifying a maximum number of staffed beds, a tool was developed to color-code the PICU's admission status. Previous methods had been ineffective and led to confusion between the PICU leadership team and the administration. The tool includes the previously missing components of staffing and acuity, which are essential in determining admission capability. The tool has three colored levels: green indicates open for admissions; yellow indicates an admission alert, based on the number of available beds or because staffing is not equal to the projected patient numbers or required acuity; and red indicates admissions on hold, because only one trauma or arrest bed is available or staffing is not equal to the projected acuity. Yellow and red designations require specific actions and the medical director's approval. The tool has been highly successful and has significantly impacted nursing by including nurse staffing as an essential component in determining bed availability.
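The three-level rule described above is, at bottom, a small decision function. The sketch below is a toy encoding; the argument names and thresholds are illustrative assumptions, not the institution's actual criteria.

```python
def picu_admission_status(open_beds, projected_admissions, staffing_ok,
                          acuity_staffing_ok, trauma_arrest_beds_free):
    """Toy color-coded admission status rule (illustrative thresholds)."""
    if trauma_arrest_beds_free <= 1 or not acuity_staffing_ok:
        return "red"      # admissions on hold
    if open_beds < projected_admissions or not staffing_ok:
        return "yellow"   # admission alert
    return "green"        # open for admissions

print(picu_admission_status(open_beds=3, projected_admissions=2,
                            staffing_ok=True, acuity_staffing_ok=True,
                            trauma_arrest_beds_free=2))   # -> green
```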
Kjelstrom, L.C.
1995-01-01
Many individual springs and groups of springs discharge water from volcanic rocks that form the north canyon wall of the Snake River between Milner Dam and King Hill. Previous estimates of annual mean discharge from these springs have been used to understand the hydrology of the eastern part of the Snake River Plain. Four methods that were used in previous studies or developed to estimate annual mean discharge since 1902 were (1) water-budget analysis of the Snake River; (2) correlation of water-budget estimates with discharge from 10 index springs; (3) determination of the combined discharge from individual springs or groups of springs by using annual discharge measurements of 8 springs, gaging-station records of 4 springs and 3 sites on the Malad River, and regression equations developed from 5 of the measured springs; and (4) a single regression equation that correlates gaging-station records of 2 springs with historical water-budget estimates. Comparisons made among the four methods of estimating annual mean spring discharges from 1951 to 1959 and 1963 to 1980 indicated that differences were about equivalent to a measurement error of 2 to 3 percent. The method that best demonstrates the response of annual mean spring discharge to changes in ground-water recharge and discharge is method 3, which combines the measurements and regression estimates of discharge from individual springs.
Wright, Adam; Laxmisan, Archana; Ottosen, Madelene J; McCoy, Jacob A; Butten, David; Sittig, Dean F
2012-01-01
Objective We describe a novel, crowdsourcing method for generating a knowledge base of problem–medication pairs that takes advantage of manually asserted links between medications and problems. Methods Through iterative review, we developed metrics to estimate the appropriateness of manually entered problem–medication links for inclusion in a knowledge base that can be used to infer previously unasserted links between problems and medications. Results Clinicians manually linked 231 223 medications (55.30% of prescribed medications) to problems within the electronic health record, generating 41 203 distinct problem–medication pairs, although not all were accurate. We developed methods to evaluate the accuracy of the pairs, and after limiting the pairs to those meeting an estimated 95% appropriateness threshold, 11 166 pairs remained. The pairs in the knowledge base accounted for 183 127 total links asserted (76.47% of all links). Retrospective application of the knowledge base linked 68 316 medications not previously linked by a clinician to an indicated problem (36.53% of unlinked medications). Expert review of the combined knowledge base, including inferred and manually linked problem–medication pairs, found a sensitivity of 65.8% and a specificity of 97.9%. Conclusion Crowdsourcing is an effective, inexpensive method for generating a knowledge base of problem–medication pairs that is automatically mapped to local terminologies, up-to-date, and reflective of local prescribing practices and trends. PMID:22582202
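The core computation, turning raw clinician-entered links into a filtered knowledge base, can be sketched as counting how often each problem-medication pair is linked and keeping pairs whose estimated appropriateness clears a 95% bar. The Wilson lower bound and the co-occurrence counts below are assumptions for illustration, not necessarily the authors' metric or data.

```python
from collections import Counter
from math import sqrt

# Hypothetical raw counts: how often each (medication, problem) pair was
# explicitly linked, and how often the pair co-occurred at prescribing time.
linked = Counter({("metformin", "type 2 diabetes"): 4950,
                  ("lisinopril", "hypertension"): 390,
                  ("lisinopril", "type 2 diabetes"): 12})
co_occurred = Counter({("metformin", "type 2 diabetes"): 5000,
                       ("lisinopril", "hypertension"): 400,
                       ("lisinopril", "type 2 diabetes"): 300})

def wilson_lower(k, n, z=1.96):
    """Lower bound of the Wilson score interval for a proportion k/n."""
    if n == 0:
        return 0.0
    p = k / n
    denom = 1 + z * z / n
    centre = p + z * z / (2 * n)
    margin = z * sqrt(p * (1 - p) / n + z * z / (4 * n * n))
    return (centre - margin) / denom

# Keep pairs whose appropriateness estimate clears the 95% threshold.
knowledge_base = {
    pair for pair, n_links in linked.items()
    if wilson_lower(n_links, co_occurred[pair]) >= 0.95
}
print(knowledge_base)   # the two strongly supported pairs survive
```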
Fractal analysis of GPS time series for early detection of disastrous seismic events
NASA Astrophysics Data System (ADS)
Filatov, Denis M.; Lyubushin, Alexey A.
2017-03-01
A new method of fractal analysis of time series for estimating the chaoticity of behaviour of open stochastic dynamical systems is developed. The method is a modification of the conventional detrended fluctuation analysis (DFA) technique. We start from analysing both methods from the physical point of view and demonstrate the difference between them which results in a higher accuracy of the new method compared to the conventional DFA. Then, applying the developed method to estimate the measure of chaoticity of a real dynamical system - the Earth's crust, we reveal that the latter exhibits two distinct mechanisms of transition to a critical state: while the first mechanism has already been known due to numerous studies of other dynamical systems, the second one is new and has not previously been described. Using GPS time series, we demonstrate efficiency of the developed method in identification of critical states of the Earth's crust. Finally we employ the method to solve a practically important task: we show how the developed measure of chaoticity can be used for early detection of disastrous seismic events and provide a detailed discussion of the numerical results, which are shown to be consistent with outcomes of other researches on the topic.
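The conventional DFA baseline that the new method modifies can be sketched compactly: integrate the series, detrend it in windows of varying size, and read the scaling exponent from the log-log slope of the fluctuation function. This is a generic DFA sketch, not the authors' modification.

```python
import numpy as np

def dfa(x, scales):
    """Conventional detrended fluctuation analysis.

    Returns the fluctuation function F(s) for each window size s;
    the slope of log F vs log s is the scaling exponent alpha."""
    y = np.cumsum(x - np.mean(x))          # integrated (profile) series
    F = []
    for s in scales:
        n_win = len(y) // s
        rms = []
        for k in range(n_win):
            seg = y[k * s:(k + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, 1), t)  # linear detrend
            rms.append(np.mean((seg - trend) ** 2))
        F.append(np.sqrt(np.mean(rms)))
    return np.array(F)

rng = np.random.default_rng(3)
x = rng.standard_normal(4096)              # white noise: alpha ~ 0.5
scales = np.array([16, 32, 64, 128, 256])
F = dfa(x, scales)
alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
print(f"alpha ~ {alpha:.2f}")
```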
Diagnostic accuracy of different caries risk assessment methods. A systematic review.
Senneby, Anna; Mejàre, Ingegerd; Sahlin, Nils-Eric; Svensäter, Gunnel; Rohlin, Madeleine
2015-12-01
To evaluate the accuracy of different methods used to identify individuals with increased risk of developing dental coronal caries. Studies on the following methods were included: previous caries experience, tests using microbiota, buffering capacity, salivary flow rate, oral hygiene, dietary habits and sociodemographic variables. QUADAS-2 was used to assess risk of bias. Sensitivity, specificity, predictive values, and likelihood ratios (LR) were calculated. Quality of evidence based on ≥3 studies of a method was rated according to GRADE. PubMed, Cochrane Library, Web of Science and reference lists of included publications were searched up to January 2015. From 5776 identified articles, 18 were included. Assessment of study quality identified methodological limitations concerning study design, test technology and reporting. No study presented low risk of bias in all domains. Three or more studies were found only for previous caries experience and salivary mutans streptococci, and the quality of evidence for these methods was low. Evidence regarding other methods was lacking. For previous caries experience, sensitivity ranged between 0.21 and 0.94 and specificity between 0.20 and 1. Tests using salivary mutans streptococci resulted in low sensitivity and high specificity. For children with primary teeth at baseline, the pooled LR for a positive test was 3 for previous caries experience and 4 for salivary mutans streptococci, given a threshold of ≥10^5 CFU/ml. Evidence on the validity of the analysed methods used for caries risk assessment is limited. As methodological quality was low, there is a need to improve study design. Low validity of the analysed methods may lead to patients with increased risk not being identified, whereas some are falsely identified as being at risk. As caries risk assessment guides individualized decisions on interventions and intervals for patient recall, improved performance based on best evidence is greatly needed. Copyright © 2015 Elsevier Ltd. All rights reserved.
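For reference, the likelihood ratios reported above follow from sensitivity and specificity in the standard way (definitions only; the worked numbers below are illustrative, not taken from the review):

\[
LR^{+} = \frac{\text{sensitivity}}{1 - \text{specificity}}, \qquad
LR^{-} = \frac{1 - \text{sensitivity}}{\text{specificity}}
\]

For example, a test with sensitivity 0.80 and specificity 0.80 would give LR+ = 0.80/0.20 = 4, the magnitude reported here for salivary mutans streptococci in children with primary teeth.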
Sun, Ye; Tao, Jing; Zhang, Geoff G Z; Yu, Lian
2010-09-01
A previous method for measuring solubilities of crystalline drugs in polymers has been improved to enable longer equilibration and used to survey the solubilities of indomethacin (IMC) and nifedipine (NIF) in two homo-polymers [polyvinyl pyrrolidone (PVP) and polyvinyl acetate (PVAc)] and their co-polymer (PVP/VA). These data are important for understanding the stability of amorphous drug-polymer dispersions, a strategy actively explored for delivering poorly soluble drugs. Measuring solubilities in polymers is difficult because their high viscosities impede the attainment of solubility equilibrium. In this method, a drug-polymer mixture prepared by cryo-milling is annealed at different temperatures and analyzed by differential scanning calorimetry to determine whether undissolved crystals remain, and thus the upper and lower bounds of the equilibrium solution temperature. The new annealing method yielded results consistent with those obtained with the previous scanning method at relatively high temperatures, but slightly revised the previous results at lower temperatures. It also lowered the temperature of measurement, bringing it closer to the glass transition temperature. For D-mannitol and IMC dissolving in PVP, the polymer's molecular weight has little effect on the weight-based solubility. For IMC and NIF, the dissolving powers of the polymers follow the order PVP > PVP/VA > PVAc. In each polymer studied, NIF is less soluble than IMC. The activities of IMC and NIF dissolved in various polymers are reasonably well fitted by the Flory-Huggins model, yielding the relevant drug-polymer interaction parameters. The new annealing method yields more accurate data than the previous scanning method when solubility equilibrium is slow to achieve. In practice, these two methods can be combined for efficiency. The measured solubilities are not readily anticipated, which underscores the importance of accurate experimental data for developing predictive models.
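The Flory-Huggins fit mentioned above takes the standard form for a drug dissolved in a polymer (a sketch of the conventional model, with symbols as commonly defined rather than reproduced from the paper):

\[
\ln a_{d} = \ln \phi_{d} + \left(1 - \frac{1}{m}\right)\phi_{p} + \chi\,\phi_{p}^{2}
\]

where \(a_d\) is the drug activity, \(\phi_d\) and \(\phi_p\) are the drug and polymer volume fractions, \(m\) is the ratio of polymer to drug molar volume, and \(\chi\) is the drug-polymer interaction parameter obtained from the fit.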
Karpušenkaitė, Aistė; Ruzgas, Tomas; Denafas, Gintaras
2018-05-01
The aim of the study was to create a hybrid forecasting method that could produce higher accuracy forecasts than previously used 'pure' time series methods. These methods had already been tested with total automotive waste, hazardous automotive waste, and total medical waste generation, but demonstrated error rates of at least 6% in different cases, and efforts were made to decrease these further. The newly developed hybrid models used a random start generation method to incorporate the advantages of different time series methods, which helped to increase the accuracy of forecasts by 3%-4% in the hazardous automotive waste and total medical waste generation cases; the new model did not increase the accuracy of total automotive waste generation forecasts. The developed models' ability to produce short- and mid-term forecasts was tested across several prediction horizons.
Jurling, Alden S; Fienup, James R
2014-03-01
Extending previous work by Thurman on wavefront sensing for segmented-aperture systems, we developed an algorithm for estimating segment tips and tilts from multiple point spread functions in different defocused planes. We also developed methods for overcoming two common modes of stagnation in nonlinear optimization-based phase retrieval algorithms for segmented systems. We showed that, when used together, these methods largely solve the capture range problem in focus-diverse phase retrieval for segmented systems with large tips and tilts. Monte Carlo simulations produced a success rate better than 98% for the combined approach.
Innovative signal processing for Johnson Noise thermometry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ezell, N. Dianne Bull; Britton, Jr, Charles L.; Roberts, Michael
This report summarizes the newly developed algorithm that subtracts electromagnetic interference (EMI). EMI performance is very important to this measurement because any interference, in the form of pickup from external signal sources such as fluorescent lighting ballasts, motors, etc., can skew the measurement. Two methods of removing EMI were developed and tested at various locations. This report also summarizes the testing performed at facilities outside Oak Ridge National Laboratory using both EMI removal techniques. The first EMI removal technique was reviewed in previous milestone reports; this report therefore details the second method.
Stress analysis of ribbon parachutes
NASA Technical Reports Server (NTRS)
Reynolds, D. T.; Mullins, W. M.
1975-01-01
An analytical method has been developed for determining the internal load distribution for ribbon parachutes subjected to known riser and aerodynamic forces. Finite elements with non-linear elastic properties represent the parachute structure. This method is an extension of the analysis previously developed by the authors and implemented in the digital computer program CANO. The present analysis accounts for the effect of vertical ribbons in the solution for canopy shape and stress distribution. Parametric results are presented which relate the canopy stress distribution to such factors as vertical ribbon strength, number of gores, and gore shape in a ribbon parachute.
Effective Diagnosis of Alzheimer's Disease by Means of Association Rules
NASA Astrophysics Data System (ADS)
Chaves, R.; Ramírez, J.; Górriz, J. M.; López, M.; Salas-Gonzalez, D.; Illán, I.; Segovia, F.; Padilla, P.
In this paper we present a novel classification method of SPECT images for the early diagnosis of Alzheimer's disease (AD). The proposed method is based on Association Rules (ARs), aiming to discover interesting associations between attributes contained in the database. The system first uses voxel-as-features (VAF) and Activation Estimation (AE) to find three-dimensional activated brain regions of interest (ROIs) for each patient. These ROIs then act as inputs for mining ARs between activated blocks for controls, with a specified minimum support and minimum confidence. ARs are mined in supervised mode, using information previously extracted from the most discriminant rules to centre interest on the relevant brain areas, reducing the computational requirements of the system. Finally, the classification process is performed based on the number of previously mined rules verified by each subject, yielding up to 95.87% classification accuracy, thus outperforming recently developed methods for AD diagnosis.
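A minimal sketch of the pairwise rule-mining step with minimum support and confidence (Python; the encoding of each control subject as a set of activated block identifiers, and the threshold values, are illustrative assumptions):

```python
from itertools import combinations

def mine_pair_rules(transactions, min_support=0.6, min_confidence=0.8):
    """transactions: list of sets of activated ROI block ids, one per control."""
    n = len(transactions)
    items = sorted(set().union(*transactions))
    single = {i: sum(i in t for t in transactions) / n for i in items}
    rules = []
    for a, b in combinations(items, 2):
        support = sum(a in t and b in t for t in transactions) / n
        if support < min_support:
            continue
        for ante, cons in ((a, b), (b, a)):
            confidence = support / single[ante]   # P(cons | ante)
            if confidence >= min_confidence:
                rules.append((ante, cons, support, confidence))
    return rules
```

Classification then amounts to counting how many of the mined control rules a new subject's activation pattern satisfies.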
Active semi-supervised learning method with hybrid deep belief networks.
Zhou, Shusen; Chen, Qingcai; Wang, Xiaolong
2014-01-01
In this paper, we develop a novel semi-supervised learning algorithm called active hybrid deep belief networks (AHD) to address the semi-supervised sentiment classification problem with deep learning. First, we construct the first several hidden layers using restricted Boltzmann machines (RBM), which can reduce the dimension and abstract the information of the reviews quickly. Second, we construct the subsequent hidden layers using convolutional restricted Boltzmann machines (CRBM), which can abstract the information of reviews effectively. Third, the constructed deep architecture is fine-tuned by gradient-descent-based supervised learning with an exponential loss function. Finally, an active learning method is combined with the proposed deep architecture. We ran several experiments on five sentiment classification datasets, and show that AHD is competitive with previous semi-supervised learning algorithms. Experiments are also conducted to verify the effectiveness of the proposed method with different numbers of labeled and unlabeled reviews.
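A minimal sketch of the contrastive-divergence (CD-1) update used to pre-train a single RBM layer of such an architecture (Python/NumPy; layer sizes, the learning rate, and binary units are illustrative assumptions, not the authors' configuration):

```python
import numpy as np

rng = np.random.default_rng(0)

class RBM:
    def __init__(self, n_vis, n_hid, lr=0.1):
        self.W = 0.01 * rng.standard_normal((n_vis, n_hid))
        self.a = np.zeros(n_vis)   # visible bias
        self.b = np.zeros(n_hid)   # hidden bias
        self.lr = lr

    @staticmethod
    def _sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def cd1(self, v0):
        """One CD-1 step on a batch v0 of shape (batch, n_vis)."""
        ph0 = self._sigmoid(v0 @ self.W + self.b)
        h0 = (rng.random(ph0.shape) < ph0).astype(float)   # sample hidden units
        pv1 = self._sigmoid(h0 @ self.W.T + self.a)        # reconstruct visibles
        ph1 = self._sigmoid(pv1 @ self.W + self.b)
        m = len(v0)
        self.W += self.lr * (v0.T @ ph0 - pv1.T @ ph1) / m
        self.a += self.lr * (v0 - pv1).mean(axis=0)
        self.b += self.lr * (ph0 - ph1).mean(axis=0)
```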
Analysis of modal behavior at frequency cross-over
NASA Astrophysics Data System (ADS)
Costa, Robert N., Jr.
1994-11-01
The existence of the mode crossing condition is detected and analyzed in the Active Control of Space Structures Model 4 (ACOSS4). The condition is studied for its contribution to the inability of previous algorithms to successfully optimize the structure and converge to a feasible solution. A new algorithm is developed to detect and correct for mode crossings. The existence of the mode crossing condition is verified in ACOSS4 and found not to have appreciably affected the solution. The structure is then successfully optimized using new analytic methods based on modal expansion. An unrelated error in the optimization algorithm previously used is verified and corrected, thereby equipping the optimization algorithm with a second analytic method for eigenvector differentiation based on Nelson's Method. The second structure is the Control of Flexible Structures (COFS). The COFS structure is successfully reproduced and an initial eigenanalysis completed.
The application of quadratic optimal cooperative control synthesis to a CH-47 helicopter
NASA Technical Reports Server (NTRS)
Townsend, Barbara K.
1987-01-01
A control-system design method, quadratic optimal cooperative control synthesis (CCS), is applied to the design of a stability and control augmentation system (SCAS). The CCS design method is different from other design methods in that it does not require detailed a priori design criteria, but instead relies on an explicit optimal pilot-model to create desired performance. The design method, which was developed previously for fixed-wing aircraft, is simplified and modified for application to a Boeing CH-47 helicopter. Two SCAS designs are developed using the CCS design methodology. The resulting CCS designs are then compared with designs obtained using classical/frequency-domain methods and linear quadratic regulator (LQR) theory in a piloted fixed-base simulation. Results indicate that the CCS method, with slight modifications, can be used to produce controller designs which compare favorably with the frequency-domain approach.
Method for in situ carbon deposition measurement for solid oxide fuel cells
NASA Astrophysics Data System (ADS)
Kuhn, J.; Kesler, O.
2014-01-01
Previous methods to measure carbon deposition in solid oxide fuel cell (SOFC) anodes do not permit simultaneous electrochemical measurements. Electrochemical measurements supplemented with carbon deposition quantities create the opportunity to further understand how carbon affects SOFC performance and electrochemical impedance spectra (EIS). In this work, a method for measuring carbon in situ, termed here the quantification of gasified carbon (QGC), was developed. Thermogravimetric analysis (TGA) experiments showed that carbon with a 100 h residence time in the SOFC was >99.8% gasified. Comparison of carbon mass measurements between the TGA and QGC shows good agreement. In situ measurements of carbon deposition in SOFCs at varying molar steam/carbon ratios were performed to further validate the QGC method, and suppression of carbon deposition with increasing steam concentration was observed, in agreement with previous studies. The technique can be used to investigate in situ carbon deposition and gasification behavior simultaneously with electrochemical measurements for a variety of fuels and operating conditions, such as determining conditions under which incipient carbon deposition is reversible.
NASA Astrophysics Data System (ADS)
Nouizi, F.; Erkol, H.; Luk, A.; Marks, M.; Unlu, M. B.; Gulsen, G.
2016-10-01
We previously introduced photo-magnetic imaging (PMI), an imaging technique that illuminates the medium under investigation with near-infrared light and measures the induced temperature increase using magnetic resonance thermometry (MRT). Using a multiphysics solver combining photon migration and heat diffusion, PMI models the spatiotemporal distribution of temperature variation and recovers high resolution optical absorption images from these temperature maps. In this paper, we present a new fast non-iterative reconstruction algorithm for PMI. This new algorithm uses analytic methods during the resolution of the forward problem and the assembly of the sensitivity matrix. We validate our new analytic-based algorithm against the first-generation finite element method (FEM) based reconstruction algorithm previously developed by our team. The validation is performed using first synthetic data and afterwards real MRT-measured temperature maps. Our new method accelerates the reconstruction process 30-fold when compared to a single iteration of the FEM-based algorithm.
Zhang, Jian; Suo, Yan; Liu, Min; Xu, Xun
2018-06-01
Proliferative diabetic retinopathy (PDR) is one of the most common complications of diabetes and can lead to blindness. Proteomic studies have provided insight into the pathogenesis of PDR, and a series of PDR-related genes has been identified but is far from fully characterized because the experimental methods are expensive and time consuming. In our previous study, we successfully identified 35 candidate PDR-related genes through the shortest-path algorithm. In the current study, we developed a computational method using the random walk with restart (RWR) algorithm and the protein-protein interaction (PPI) network to identify potential PDR-related genes. After possible genes were obtained by the RWR algorithm, a three-stage filtration strategy, comprising a permutation test, an interaction test and an enrichment test, was applied to exclude potential false positives caused by the structure of the PPI network, weak interaction strength, and limited similarity of gene ontology (GO) terms and biological pathways. As a result, 36 candidate genes were discovered by the method, different from the 35 genes reported in our previous study. A literature review showed that 21 of these 36 genes are supported by previous experiments. These findings suggest the robustness and complementary effects of both our efforts using different computational methods, thus providing an alternative method to study PDR pathogenesis. Copyright © 2017 Elsevier B.V. All rights reserved.
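A minimal sketch of the RWR step on the PPI network (Python/NumPy; the restart probability, convergence tolerance, and seeding on the previously identified PDR genes are illustrative assumptions):

```python
import numpy as np

def random_walk_with_restart(A, seeds, r=0.7, tol=1e-8):
    """A: symmetric PPI adjacency matrix; seeds: indices of known PDR genes."""
    W = A / A.sum(axis=0, keepdims=True)   # column-normalize (assumes no isolated nodes)
    p0 = np.zeros(A.shape[0])
    p0[seeds] = 1.0 / len(seeds)           # restart distribution on seed genes
    p = p0.copy()
    while True:
        p_next = (1 - r) * (W @ p) + r * p0
        if np.abs(p_next - p).sum() < tol:
            return p_next                   # steady-state affinity to the seeds
        p = p_next
```

Genes ranked highly by the steady-state probability, excluding the seeds themselves, become the candidates passed to the three-stage filtration.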
Prediction and analysis of beta-turns in proteins by support vector machine.
Pham, Tho Hoan; Satou, Kenji; Ho, Tu Bao
2003-01-01
Tight turns have long been recognized as one of the three important features of proteins, after the alpha-helix and beta-sheet. Tight turns play an important role in globular proteins from both the structural and functional points of view. More than 90% of tight turns are beta-turns. Analysis and prediction of beta-turns in particular, and tight turns in general, are very useful for the design of new molecules such as drugs, pesticides, and antigens. In this paper, we introduce a support vector machine (SVM) approach to the prediction and analysis of beta-turns. We have investigated two aspects of applying SVMs to the prediction and analysis of beta-turns. First, we developed a new SVM method, called BTSVM, which predicts the beta-turns of a protein from its sequence. The prediction results on a dataset of 426 non-homologous protein chains, using the sevenfold cross-validation technique, showed that our method is superior to previous methods. Second, we analyzed how amino acid positions support (or prevent) the formation of beta-turns based on the "multivariable" classification model of a linear SVM. This model is more general than those of previous statistical methods. Our analysis results are more comprehensive and easier to use than previously published analysis results.
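A minimal sketch of the window-based encoding typically used for residue-level SVM prediction of this kind (Python with scikit-learn; the window length, one-hot encoding, and RBF kernel are illustrative assumptions rather than the BTSVM implementation):

```python
import numpy as np
from sklearn.svm import SVC

AA = "ACDEFGHIKLMNPQRSTVWY"

def encode_windows(seq, labels, w=9):
    """One-hot encode a sliding window of w residues around each position."""
    half = w // 2
    X, y = [], []
    for i in range(half, len(seq) - half):
        vec = np.zeros(w * len(AA))
        for j, aa in enumerate(seq[i - half:i + half + 1]):
            if aa in AA:
                vec[j * len(AA) + AA.index(aa)] = 1.0
        X.append(vec)
        y.append(labels[i])   # 1 if residue i lies in a beta-turn, else 0
    return np.array(X), np.array(y)

# hypothetical usage on training sequences:
# clf = SVC(kernel="rbf").fit(X_train, y_train)
```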
NASA Astrophysics Data System (ADS)
Fujinami, Taku; Kigami, Hiroshi; Unno, Noriyuki; Taniguchi, Jun; Satake, Shin-ichi
2018-06-01
Total internal reflection fluorescence microscopy (TIRFM) is a promising method for measuring fluid flow close to a wall with nanoscale resolution in a process that is termed "multilayer nanoparticle image velocimetry" (MnPIV). TIRFM uses evanescent light that is generated on a substrate (typically a glass slide) by total internal reflection of light. Many researchers have previously studied x-y-z (3D) flows of water close to flat glass slides using MnPIV. On the other hand, fluid flow close to a structured surface is also important. To measure flows of water near micro-patterns, we previously developed an MnPIV technique that uses a refractive-index-matching method. In the previous study, the micro-pattern was made of a thermoplastic material with a refractive index that closely matches that of water. In this study, ultraviolet nanoimprint lithography was used for fabricating the appropriate micro-patterns because this technique can fabricate a pattern with a high resolution. As a result, we succeeded in performing MnPIV in water with a circular hole array pattern made by ultraviolet nanoimprint using a refractive-index-matching method. We believe that this technique will be helpful in elucidating fluid flows around microstructures.
NASA Astrophysics Data System (ADS)
Modegi, Toshio
We are developing audio watermarking techniques that enable extraction of embedded data by cell phones. For that, we have to embed data onto frequency ranges where our auditory response is prominent, so data embedding will cause considerable audible noise. Previously we proposed applying a two-channel stereo playback feature, in which noise generated by a data-embedded left-channel signal is reduced by the other, right-channel signal. However, this proposal has the practical problem of restricting the location of the extracting terminal. In this paper, we propose synthesizing the noise-reducing right-channel signal with the left-channel signal, reducing noise completely by inducing an auditory stream segregation phenomenon in users. This newly proposed method makes a separate noise-reducing right-channel signal unnecessary and supports monaural playback operations. Moreover, we propose a wide-band embedding method causing dual auditory stream segregation phenomena, which enables data embedding across the whole public telephone frequency range and stable extraction with 3G mobile phones. With these proposals, extraction precision becomes higher than with the previously proposed method, whereas the quality degradation of the embedded signals becomes smaller. In this paper we present an overview of our newly proposed method and experimental results compared with those of the previously proposed method.
Conjugate-gradient optimization method for orbital-free density functional calculations.
Jiang, Hong; Yang, Weitao
2004-08-01
Orbital-free density functional theory as an extension of traditional Thomas-Fermi theory has attracted a lot of interest in the past decade because of developments in both more accurate kinetic energy functionals and highly efficient numerical methodology. In this paper, we developed a conjugate-gradient method for the numerical solution of spin-dependent extended Thomas-Fermi equation by incorporating techniques previously used in Kohn-Sham calculations. The key ingredient of the method is an approximate line-search scheme and a collective treatment of two spin densities in the case of spin-dependent extended Thomas-Fermi problem. Test calculations for a quartic two-dimensional quantum dot system and a three-dimensional sodium cluster Na216 with a local pseudopotential demonstrate that the method is accurate and efficient. (c) 2004 American Institute of Physics.
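As a generic point of reference, a Fletcher-Reeves nonlinear conjugate-gradient loop with a simple backtracking line search looks like this (Python/NumPy; a textbook sketch, not the paper's spin-dependent implementation or its approximate line-search scheme):

```python
import numpy as np

def nonlinear_cg(f, grad, x0, tol=1e-6, max_iter=500):
    """Minimize f via Fletcher-Reeves CG with a backtracking line search."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        t = 1.0
        while f(x + t * d) > f(x) + 1e-4 * t * (g @ d) and t > 1e-12:
            t *= 0.5                            # backtrack until sufficient decrease
        x = x + t * d
        g_new = grad(x)
        if np.linalg.norm(g_new) < tol:
            break
        beta = (g_new @ g_new) / (g @ g)        # Fletcher-Reeves coefficient
        d = -g_new + beta * d
        g = g_new
    return x
```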
GMDD: a database of GMO detection methods
Dong, Wei; Yang, Litao; Shen, Kailin; Kim, Banghyun; Kleter, Gijs A; Marvin, Hans JP; Guo, Rong; Liang, Wanqi; Zhang, Dabing
2008-01-01
Background Since more than one hundred genetically modified organism (GMO) events have been developed and approved for commercialization worldwide, GMO analysis methods are essential for the enforcement of GMO labelling regulations. Protein and nucleic acid-based detection techniques have been developed and utilized for GMO identification and quantification. However, information for the harmonization and standardization of GMO analysis methods at the global level is needed. Results The GMO Detection method Database (GMDD) has collected almost all previously developed and reported GMO detection methods, which have been grouped by different strategies (screen-, gene-, construct-, and event-specific), and also provides a user-friendly search service for the detection methods by GMO event name, exogenous gene, protein information, etc. In this database, users can obtain the sequences of exogenous integration, which will facilitate PCR primer and probe design. The information on endogenous genes, certified reference materials, reference molecules, and the validation status of developed methods is also included in this database. Furthermore, registered users can submit new detection methods and sequences to this database, and the newly submitted information will be released soon after being checked. Conclusion GMDD contains comprehensive information on GMO detection methods. The database will make GMO analysis much easier. PMID:18522755
Report Briefs: Publications of the Energy Division, Oak Ridge National Laboratory, 1999
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moser, C.I.
The Bureau of Labor Statistics (BLS) is responsible for collecting data to estimate price indices such as the Consumer Price Index (CPI). BLS accomplishes this task by sending field staff to places of business to price actual products. The field staff are given product checklists to help them determine whether the products found are comparable to products priced the previous month. Prices for noncomparable products are not included in the current month's price index calculations. A serious problem facing BLS is developing product checklists for dynamic product areas, new industries, and the service sector. It is difficult to keep checklists up to date and quite often simply to develop checklists for service industry products. Some people estimate that more than 50% of U.S. economic activity is not accounted for in the CPI. The objective is to provide the results of tests on a method for helping BLS staff build new product checklists quickly and efficiently. The domain chosen for studying the method was the telecommunications industry. The method developed by ORNL is based on behavioral science and knowledge-engineering principles. The method has ten steps, which include developing a sample of domain experts, asking experts to list products in the domain, culling the list of products to a manageable number, asking experts to group the remaining products, identifying product clusters using multidimensional scaling and cluster analysis, asking experts to compare pairs of products within clusters, and, finally, developing checklists with the comparison data. The method performed as expected. Several prototype checklists for products in the telecommunications domain were developed, including checklists for paging services, digital cell phones, web browsers, routers, and LAN modems. It was particularly difficult, however, to find experts to participate in the project. Attending a professional meeting and contacting experts from the conference's mailing list proved to be the best approach for this domain. The method has performed well in two domains: the telecommunications industry, as demonstrated in this project, and the PC software industry, as demonstrated in a previous project. It is recommended that the method be further tested in additional service industries, such as the nursing home industry. In addition, further attention needs to be devoted to developing procedures for the method to improve its cost and time efficiency. For example, if automated methods were used to collect information from the experts and if the experts could be assembled at one time, it could be possible to create prototype checklists in one day.
Birth month affects lifetime disease risk: a phenome-wide method.
Boland, Mary Regina; Shahn, Zachary; Madigan, David; Hripcsak, George; Tatonetti, Nicholas P
2015-09-01
An individual's birth month has a significant impact on the diseases they develop during their lifetime. Previous studies reveal relationships between birth month and several diseases, including atherothrombosis, asthma, attention deficit hyperactivity disorder, and myopia, leaving most diseases completely unexplored. This retrospective population study systematically explores the relationship between seasonal effects at birth and lifetime disease risk for 1688 conditions. We developed a hypothesis-free method that minimizes publication and disease selection biases by systematically investigating disease-birth month patterns across all conditions. Our dataset includes 1 749 400 individuals with records at New York-Presbyterian/Columbia University Medical Center born between 1900 and 2000 inclusive. We modeled associations between birth month and 1688 diseases using logistic regression. Significance was tested using a chi-squared test with multiplicity correction. We found 55 diseases that were significantly dependent on birth month. Of these, 19 were previously reported in the literature (P < .001), 20 were for conditions with close relationships to those reported, and 16 were previously unreported. We found distinct incidence patterns across disease categories. Lifetime disease risk is affected by birth month. Seasonally dependent early developmental mechanisms may play a role in increasing lifetime risk of disease. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association.
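A minimal sketch of the per-condition association test (Python with statsmodels and SciPy; the likelihood-ratio formulation and the Bonferroni correction are illustrative assumptions about how such a chi-squared test could be set up):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from scipy.stats import chi2

def birth_month_pvalue(has_disease, birth_month):
    """LR chi-squared test of a logistic model with month dummies vs. the null."""
    X = pd.get_dummies(birth_month, prefix="m", drop_first=True).astype(float)
    X = sm.add_constant(X)
    full = sm.Logit(has_disease, X).fit(disp=0)
    null = sm.Logit(has_disease, np.ones((len(has_disease), 1))).fit(disp=0)
    lr_stat = 2 * (full.llf - null.llf)
    return chi2.sf(lr_stat, df=X.shape[1] - 1)   # 11 df for 12 months

# multiplicity: e.g., Bonferroni across 1688 conditions -> alpha = 0.05 / 1688
```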
Kim, Yun Hak; Jeong, Dae Cheon; Pak, Kyoungjune; Goh, Tae Sik; Lee, Chi-Seung; Han, Myoung-Eun; Kim, Ji-Young; Liangwen, Liu; Kim, Chi Dae; Jang, Jeon Yeob; Cha, Wonjae; Oh, Sae-Ock
2017-09-29
Accurate prediction of prognosis is critical for therapeutic decisions regarding cancer patients. Many previously developed prognostic scoring systems have limitations in reflecting recent progress in the field of cancer biology such as microarray, next-generation sequencing, and signaling pathways. To develop a new prognostic scoring system for cancer patients, we used mRNA expression and clinical data in various independent breast cancer cohorts (n=1214) from the Molecular Taxonomy of Breast Cancer International Consortium (METABRIC) and Gene Expression Omnibus (GEO). A new prognostic score that reflects gene network inherent in genomic big data was calculated using Network-Regularized high-dimensional Cox-regression (Net-score). We compared its discriminatory power with those of two previously used statistical methods: stepwise variable selection via univariate Cox regression (Uni-score) and Cox regression via Elastic net (Enet-score). The Net scoring system showed better discriminatory power in prediction of disease-specific survival (DSS) than other statistical methods (p=0 in METABRIC training cohort, p=0.000331, 4.58e-06 in two METABRIC validation cohorts) when accuracy was examined by log-rank test. Notably, comparison of C-index and AUC values in receiver operating characteristic analysis at 5 years showed fewer differences between training and validation cohorts with the Net scoring system than other statistical methods, suggesting minimal overfitting. The Net-based scoring system also successfully predicted prognosis in various independent GEO cohorts with high discriminatory power. In conclusion, the Net-based scoring system showed better discriminative power than previous statistical methods in prognostic prediction for breast cancer patients. This new system will mark a new era in prognosis prediction for cancer patients.
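For context, the elastic-net Cox comparator (the Enet-score) can be sketched as follows (Python with the lifelines library, an assumption; the network-regularized Net-score additionally encodes the gene network in the penalty and is not reproduced here):

```python
from lifelines import CoxPHFitter

def fit_enet_cox(df):
    """df: one row per patient; gene-expression columns plus 'time' and
    'dss_event' (disease-specific survival) -- hypothetical column names."""
    cph = CoxPHFitter(penalizer=0.1, l1_ratio=0.5)   # elastic-net penalty
    cph.fit(df, duration_col="time", event_col="dss_event")
    return cph, cph.concordance_index_               # training C-index
```

Discriminatory power on a held-out cohort is then judged by the C-index of the fitted linear predictor, which is where the overfitting comparison above comes in.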
NASA Astrophysics Data System (ADS)
Guo, Yanhui; Zhou, Chuan; Chan, Heang-Ping; Wei, Jun; Chughtai, Aamer; Sundaram, Baskaran; Hadjiiski, Lubomir M.; Patel, Smita; Kazerooni, Ella A.
2013-04-01
A 3D multiscale intensity homogeneity transformation (MIHT) method was developed to reduce false positives (FPs) in our previously developed CAD system for pulmonary embolism (PE) detection. In MIHT, the voxel intensity of a PE candidate region was transformed to an intensity homogeneity value (IHV) with respect to the local median intensity. The IHVs were calculated in multiscales (MIHVs) to measure the intensity homogeneity, taking into account vessels of different sizes and different degrees of occlusion. Seven new features including the entropy, gradient, and moments that characterized the intensity distributions of the candidate regions were derived from the MIHVs and combined with the previously designed features that described the shape and intensity of PE candidates for the training of a linear classifier to reduce the FPs. 59 CTPA PE cases were collected from our patient files (UM set) with IRB approval and 69 cases from the PIOPED II data set with access permission. 595 and 800 PEs were identified as reference standard by experienced thoracic radiologists in the UM and PIOPED set, respectively. FROC analysis was used for performance evaluation. Compared with our previous CAD system, at a test sensitivity of 80%, the new method reduced the FP rate from 18.9 to 14.1/scan for the PIOPED set when the classifier was trained with the UM set and from 22.6 to 16.0/scan vice versa. The improvement was statistically significant (p<0.05) by JAFROC analysis. This study demonstrated that the MIHT method is effective in reducing FPs and improving the performance of the CAD system.
Oldenburg, J; Goudemand, J; Valentino, L; Richards, M; Luu, H; Kriukov, A; Gajek, H; Spotts, G; Ewenstein, B
2010-11-01
Postauthorization safety surveillance of factor VIII (FVIII) concentrates is essential for assessing rare adverse event incidence. We determined safety and efficacy of ADVATE [antihaemophilic factor (recombinant), plasma/albumin-free method, (rAHF-PFM)] during routine clinical practice. Subjects with differing haemophilia A severities and medical histories were monitored during 12 months of prophylactic and/or on-demand therapy. Among 408 evaluable subjects, 386 (95%) received excellent/good efficacy ratings for all on-demand assessments; the corresponding number for subjects with previous FVIII inhibitors was 36/41 (88%). Among 276 evaluable subjects receiving prophylaxis continuously in the study, 255 (92%) had excellent/good ratings for all prophylactic assessments; the corresponding number for subjects with previous FVIII inhibitors was 41/46 (89%). Efficacy of surgical prophylaxis was excellent/good in 16/16 evaluable procedures. Among previously treated patients (PTPs) with >50 exposure days (EDs) and FVIII≤2%, three (0.75%) developed low-titre inhibitors. Two of these subjects had a positive inhibitor history; thus, the incidence of de novo inhibitor formation in PTPs with FVIII≤2% and no inhibitor history was 1/348 (0.29%; 95% CI, 0.01-1.59%). A PTP with moderate haemophilia developed a low-titre inhibitor. High-titre inhibitors were reported in a PTP with mild disease (following surgery), a previously untreated patient (PUP) with moderate disease (following surgery) and a PUP with severe disease. The favourable benefit/risk profile of rAHF-PFM previously documented in prospective clinical trials has been extended to include a broader range of haemophilia patients, many of whom would have been ineligible for registration studies. © 2010 Blackwell Publishing Ltd.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burger, Joanna; Gochfeld, Michael; Bunn, Amoret
An assessment of the potential risks to ecological resources from remediation activities or other perturbations should involve a quantitative evaluation of resources on the remediation site and in the surrounding environment. We developed a risk methodology to rapidly evaluate potential impacts on ecological resources for the U.S. Department of Energy's Hanford Site in southcentral Washington State. We describe the application of the risk evaluation for two case studies to illustrate its applicability. The ecological assessment involves examining previous sources of information for the site and defining different resource levels from 0 to 5. We also developed a risk rating scale from nondiscernible to very high. Field assessment is the critical step to determine resource levels or to determine whether current conditions are the same as previously evaluated. We provide a rapid assessment method for current ecological conditions that can be compared to previous site-specific data, or that can be used to assess resource value on other sites where ecological information is not generally available. The method is applicable to other Department of Energy sites, where its development may involve a range of state regulators, resource trustees, Tribes and other stakeholders. Achieving consistency across Department of Energy sites for valuation of ecological resources on remediation sites will assure Congress and the public that funds and personnel are being deployed appropriately.
A Roadmap for the Development of Alternative (Non-Animal) Methods for Systemic Toxicity Testing
Systemic toxicity testing forms the cornerstone for the safety evaluation of substances. Pressures to move from traditional animal models to novel technologies arise from various concerns, including: the need to evaluate large numbers of previously untested chemicals and new prod...
An Action Learning Method for Increased Innovation Capability in Organisations
ERIC Educational Resources Information Center
Olsson, Annika; Wadell, Carl; Odenrick, Per; Norell Bergendahl, Margareta
2010-01-01
Product innovation in highly complex and technological areas, such as medical technology, puts high requirements on the innovation capability of an organisation. Previous research and publications have highlighted organisational issues and learning matters as important and necessary for the development of innovation capability. Action learning…
DOT National Transportation Integrated Search
2009-08-01
Asphalt mixtures designed using modern conventional methods, whether Marshall or Superpave methodologies, fail to address the cracking performance of these mixtures. Research previously conducted at the University of Florida for the Florida Departmen...
Exploiting MeSH indexing in MEDLINE to generate a data set for word sense disambiguation.
Jimeno-Yepes, Antonio J; McInnes, Bridget T; Aronson, Alan R
2011-06-02
Evaluation of Word Sense Disambiguation (WSD) methods in the biomedical domain is difficult because the available resources are either too small or too focused on specific types of entities (e.g. diseases or genes). We present a method that can be used to automatically develop a WSD test collection using the Unified Medical Language System (UMLS) Metathesaurus and the manual MeSH indexing of MEDLINE. We demonstrate the use of this method by developing such a data set, called MSH WSD. In our method, the Metathesaurus is first screened to identify ambiguous terms whose possible senses consist of two or more MeSH headings. We then use each ambiguous term and its corresponding MeSH headings to extract MEDLINE citations where the term and only one of the MeSH headings co-occur. The term found in the MEDLINE citation is automatically assigned the UMLS Concept Unique Identifier (CUI) linked to that MeSH heading. We compare the characteristics of the MSH WSD data set to the previously existing NLM WSD data set. The resulting MSH WSD data set consists of 106 ambiguous abbreviations, 88 ambiguous terms and 9 that are a combination of both, for a total of 203 ambiguous entities. For each ambiguous term/abbreviation, the data set contains a maximum of 100 instances per sense obtained from MEDLINE. We evaluated the reliability of the MSH WSD data set using existing knowledge-based methods and compared their performance to the results previously obtained by these algorithms on the pre-existing NLM WSD data set. We show that the knowledge-based methods achieve different results but keep their relative performance, except for the Journal Descriptor Indexing (JDI) method, whose performance is below the other methods. The MSH WSD data set allows the evaluation of WSD algorithms in the biomedical domain. Compared to previously existing data sets, MSH WSD contains a larger number of biomedical terms/abbreviations and covers the largest set of UMLS Semantic Types. Furthermore, the MSH WSD data set has been generated automatically, reusing already existing annotations, and can therefore be regenerated from subsequent UMLS versions.
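A minimal sketch of the instance-extraction rule described above (Python; the data structures for senses and citations are hypothetical simplifications of the UMLS and MEDLINE inputs):

```python
def build_instances(term, senses, citations, cap=100):
    """senses: {cui: mesh_heading}; citations: list of (text, mesh_headings)."""
    out = {cui: [] for cui in senses}
    for text, mesh in citations:
        if term.lower() not in text.lower():
            continue
        hits = [cui for cui, heading in senses.items() if heading in mesh]
        if len(hits) == 1 and len(out[hits[0]]) < cap:   # exactly one sense present
            out[hits[0]].append(text)                    # label citation with that CUI
    return out
```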
NASA Technical Reports Server (NTRS)
Trimble, Jay Phillip
2014-01-01
The Resource Prospector Mission seeks to rove the lunar surface with an in-situ resource utilization payload in search of volatiles at a polar region. The mission operations system (MOS) will need to perform the short-duration mission while taking advantage of the near real time control that the short one-way light time to the Moon provides. To maximize our use of limited resources for the design and development of the MOS we are utilizing agile and lean methods derived from our previous experience with applying these methods to software. By using methods such as "say it then sim it" we will spend less time in meetings and more time focused on the one outcome that counts - the effective utilization of our assets on the Moon to meet mission objectives.
Assessing Discriminative Performance at External Validation of Clinical Prediction Models
Nieboer, Daan; van der Ploeg, Tjeerd; Steyerberg, Ewout W.
2016-01-01
Introduction External validation studies are essential to study the generalizability of prediction models. Recently a permutation test, focusing on discrimination as quantified by the c-statistic, was proposed to judge whether a prediction model is transportable to a new setting. We aimed to evaluate this test and compare it to previously proposed procedures for judging any changes in c-statistic from the development to the external validation setting. Methods We compared the use of the permutation test to the use of benchmark values of the c-statistic following from a previously proposed framework to judge transportability of a prediction model. In a simulation study we developed a prediction model with logistic regression on a development set and validated it in the validation set. We concentrated on two scenarios: 1) the case-mix was more heterogeneous and predictor effects were weaker in the validation set compared to the development set, and 2) the case-mix was less heterogeneous in the validation set and predictor effects were identical in the validation and development set. Furthermore we illustrated the methods in a case study using 15 datasets of patients suffering from traumatic brain injury. Results The permutation test indicated that the validation and development set were homogeneous in scenario 1 (in almost all simulated samples) and heterogeneous in scenario 2 (in 17%-39% of simulated samples). Previously proposed benchmark values of the c-statistic and the standard deviation of the linear predictors correctly pointed at the more heterogeneous case-mix in scenario 1 and the less heterogeneous case-mix in scenario 2. Conclusion The recently proposed permutation test may provide misleading results when externally validating prediction models in the presence of case-mix differences between the development and validation population. To correctly interpret the c-statistic found at external validation it is crucial to disentangle case-mix differences from incorrect regression coefficients. PMID:26881753
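A minimal sketch of comparing discrimination between the development and validation settings (Python with scikit-learn; data names are hypothetical, and the linear-predictor spread is the case-mix heterogeneity benchmark referred to above):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

def external_validation_c(X_dev, y_dev, X_val, y_val):
    model = LogisticRegression().fit(X_dev, y_dev)
    c_dev = roc_auc_score(y_dev, model.predict_proba(X_dev)[:, 1])
    c_val = roc_auc_score(y_val, model.predict_proba(X_val)[:, 1])
    lp_sd = np.std(X_val @ model.coef_.ravel())   # spread of the linear predictor
    return c_dev, c_val, lp_sd
```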
A Model-Based Approach for Identifying Signatures of Ancient Balancing Selection in Genetic Data
DeGiorgio, Michael; Lohmueller, Kirk E.; Nielsen, Rasmus
2014-01-01
While much effort has focused on detecting positive and negative directional selection in the human genome, relatively little work has been devoted to balancing selection. This lack of attention is likely due to the paucity of sophisticated methods for identifying sites under balancing selection. Here we develop two composite likelihood ratio tests for detecting balancing selection. Using simulations, we show that these methods outperform competing methods under a variety of assumptions and demographic models. We apply the new methods to whole-genome human data, and find a number of previously-identified loci with strong evidence of balancing selection, including several HLA genes. Additionally, we find evidence for many novel candidates, the strongest of which is FANK1, an imprinted gene that suppresses apoptosis, is expressed during meiosis in males, and displays marginal signs of segregation distortion. We hypothesize that balancing selection acts on this locus to stabilize the segregation distortion and negative fitness effects of the distorter allele. Thus, our methods are able to reproduce many previously-hypothesized signals of balancing selection, as well as discover novel interesting candidates. PMID:25144706
The study was designed to develop a method for sterilizing vitamin-fortified commercial diets for feeding germ-free and defined-flora rodents that would (a) require less time and (b) not destroy vitamins. Feed contaminated with Bacillus stearothermophilus and B. subtilis was sterilized in high-prevacuum autoclaves. The use of high-prevacuum autoclaves makes possible a reduction in the time required to sterilize autoclavable, commercial diets. Previous methods have required...
We have previously developed a statistical method to identify gene sets enriched with condition-specific genetic dependencies. The method constructs gene dependency networks from bootstrapped samples in one condition and computes the divergence between distributions of network likelihood scores from different conditions. It was shown to be capable of sensitive and specific identification of pathways with phenotype-specific dysregulation, i.e., rewiring of dependencies between genes in different conditions.
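One way such a divergence between score distributions could be computed is sketched below (Python/NumPy; the histogram binning and the Jensen-Shannon form are illustrative assumptions, since the summary above does not specify the divergence used):

```python
import numpy as np

def js_divergence(scores_a, scores_b, bins=50):
    """Jensen-Shannon divergence between two sets of network likelihood scores."""
    lo = min(scores_a.min(), scores_b.min())
    hi = max(scores_a.max(), scores_b.max())
    p, _ = np.histogram(scores_a, bins=bins, range=(lo, hi))
    q, _ = np.histogram(scores_b, bins=bins, range=(lo, hi))
    p = (p + 1e-12) / p.sum()   # smooth, then normalize to probabilities
    q = (q + 1e-12) / q.sum()
    m = 0.5 * (p + q)
    kl = lambda u, v: float(np.sum(u * np.log(u / v)))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)
```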
Identification of open quantum systems from observable time traces
Zhang, Jun; Sarovar, Mohan
2015-05-27
Estimating the parameters that dictate the dynamics of a quantum system is an important task for quantum information processing and quantum metrology, as well as fundamental physics. In our paper we develop a method for parameter estimation for Markovian open quantum systems using a temporal record of measurements on the system. Furthermore, the method is based on system realization theory and is a generalization of our previous work on identification of Hamiltonian parameters.
Exercise and Bone Density: Meta-Analysis
2003-10-01
...were estimated using previously developed methods... included in our analysis. Thus, for example, if BMD was also assessed in women performing... from 12.6% in the placebo group to 9.0% in the alendronate... unpublished work is inappropriate because it has not gone through the peer review process... taken that could enhance BMD, cigarette smoking... Olkin I. Statistical Methods for Meta-Analysis. San Diego, CA: Academic Press; 1985.
Optical Sensors and Methods for Underwater 3D Reconstruction
Massot-Campos, Miquel; Oliver-Codina, Gabriel
2015-01-01
This paper presents a survey on optical sensors and methods for 3D reconstruction in underwater environments. The techniques to obtain range data have been listed and explained, together with the different sensor hardware that makes them possible. The literature has been reviewed, and a classification has been proposed for the existing solutions. New developments, commercial solutions and previous reviews in this topic have also been gathered and considered. PMID:26694389
A Comprehension Based Analysis of Autoflight System Interfaces
NASA Technical Reports Server (NTRS)
Palmer, Everett (Technical Monitor); Polson, Peter G.
2003-01-01
This cooperative agreement supported Dr. Peter Polson's participation in two interrelated research programs. The first was the development of the Situation-Goal-Behavior (SGB) Model, which is both a formal description of an avionics system's logic and behavior and a representation of a system that can be understood by avionics designers, pilots, and training developers. The second was the development of a usability inspection method based on an approximate model, RAFIV, of pilot interactions with the Flight Management System (FMS). The main purpose of this report is to integrate the two models and provide a context in order to better characterize the accomplishments of this research program. A major focus of both the previous and this Cooperative Agreement was the development of usability evaluation methods that can be effectively utilized during all phases of the design, development, and certification process of modern avionics systems. The current efforts to validate these methods have involved showing that they generate useful analyses of known operational and training problems with the current generation of avionics systems in modern commercial airliners. This report is organized into seven sections. Following the overview, the second section describes the Situation-Goal-Behavior model and its applications. The next section summarizes the foundations of the RAFIV model and describes the model in some detail. The contents of both these sections are derived from previous reports referenced in footnotes. The fourth section integrates these two models into a complete design evaluation and training development framework. The fifth section contains conclusions and possible future directions for research. References are in Section 6. Section 7 contains the titles and abstracts of the papers describing in more detail the results of this research program.
Chen, Chenglong; Ni, Jiangqun; Shen, Zhaoyi; Shi, Yun Qing
2017-06-01
Geometric transformations, such as resizing and rotation, are almost always needed when two or more images are spliced together to create convincing image forgeries. In recent years, researchers have developed many digital forensic techniques to identify these operations. Most previous works in this area focus on the analysis of images that have undergone single geometric transformations, e.g., resizing or rotation. In several recent works, researchers have addressed yet another practical and realistic situation: successive geometric transformations, e.g., repeated resizing, resizing-rotation, rotation-resizing, and repeated rotation. We will also concentrate on this topic in this paper. Specifically, we present an in-depth analysis in the frequency domain of the second-order statistics of the geometrically transformed images. We give an exact formulation of how the parameters of the first and second geometric transformations influence the appearance of periodic artifacts. The expected positions of characteristic resampling peaks are analytically derived. The theory developed here helps to address the gap left by previous works on this topic and is useful for image security and authentication, in particular, the forensics of geometric transformations in digital images. As an application of the developed theory, we present an effective method that allows one to distinguish between the aforementioned four different processing chains. The proposed method can further estimate all the geometric transformation parameters. This may provide useful clues for image forgery detection.
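A minimal sketch of making these periodic artifacts visible via the spectrum of a linear-prediction residual (Python with NumPy/SciPy; the fixed predictor kernel follows the widely used Popescu-Farid style detector and stands in for the second-order statistics analysis, whose exact formulation is in the paper):

```python
import numpy as np
from scipy import ndimage

def resampling_spectrum(img):
    """Residual of a fixed local linear predictor, then its 2D spectrum."""
    k = np.array([[-0.25, 0.50, -0.25],
                  [ 0.50, 0.00,  0.50],
                  [-0.25, 0.50, -0.25]])
    resid = img.astype(float) - ndimage.convolve(img.astype(float), k,
                                                 mode="reflect")
    spec = np.abs(np.fft.fftshift(np.fft.fft2(resid)))
    return spec   # resampling shows up as characteristic off-center peaks
```

The positions of the spectral peaks depend on the resampling parameters, which is what allows the four processing chains above to be told apart and their parameters estimated.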
Weisser, Johan J; Hansen, Martin; Björklund, Erland; Sonne, Christian; Dietz, Rune; Styrishave, Bjarne
2016-04-01
This paper presents the development and evaluation of a methodology for the extraction, clean-up and analysis of three key corticosteroids (aldosterone, cortisol and corticosterone) in polar bear hair. Such a methodology can be used to monitor stress biomarkers in polar bears and may serve as a useful tool for long-term and retrospective information. We developed a combined pressurized liquid extraction (PLE)-solid phase extraction (SPE) procedure for corticosteroid extraction and clean-up, followed by high pressure liquid chromatography tandem mass spectrometry (HPLC-MS/MS) analysis. This procedure allows for the simultaneous determination of multiple steroids, in contrast to previous polar bear studies based on ELISA techniques. Absolute method recoveries were 81%, 75% and 60% for cortisol, corticosterone and aldosterone, respectively. We applied the developed method to a hair sample pooled from four East Greenland polar bears. Herein cortisol and corticosterone were successfully determined at levels of 0.32±0.02 ng/g hair and 0.13±0.02 ng/g hair, respectively. Aldosterone was below the limit of detection (LOD<0.17 ng/g). The cortisol hair concentration found in these East Greenland polar bears was consistent with cortisol levels previously determined in the Southern Hudson Bay and James Bay in Canada using ELISA kits. Copyright © 2016 Elsevier B.V. All rights reserved.
A noninvasive, direct real-time PCR method for sex determination in multiple avian species
Brubaker, Jessica L.; Karouna-Renier, Natalie K.; Chen, Yu; Jenko, Kathryn; Sprague, Daniel T.; Henry, Paula F.P.
2011-01-01
Polymerase chain reaction (PCR)-based methods to determine the sex of birds are well established and have seen few modifications since they were first introduced in the 1990s. Although these methods allowed for sex determination in species that were previously difficult to analyse, they were not conducive to high-throughput analysis because of the laboriousness of DNA extraction and gel electrophoresis. We developed a high-throughput real-time PCR-based method for analysis of sex in birds, which uses noninvasive sample collection and avoids DNA extraction and gel electrophoresis.
A comparison of methods for teaching receptive language to toddlers with autism.
Vedora, Joseph; Grandelski, Katrina
2015-01-01
The use of a simple-conditional discrimination training procedure, in which stimuli are initially taught in isolation with no other comparison stimuli, is common in early intensive behavioral intervention programs. Researchers have suggested that this procedure may encourage the development of faulty stimulus control during training. The current study replicated previous work that compared the simple-conditional and the conditional-only methods to teach receptive labeling of pictures to young children with autism spectrum disorder. Both methods were effective, but the conditional-only method required fewer sessions to mastery. © Society for the Experimental Analysis of Behavior.
Development of a method of alignment between various SOLAR MAXIMUM MISSION experiments
NASA Technical Reports Server (NTRS)
1977-01-01
Results of an engineering study of the methods of alignment between various experiments for the solar maximum mission are described. The configuration studied consists of the instruments, mounts and instrument support platform located within the experiment module. Hardware design, fabrication methods and alignment techniques were studied with regard to optimizing the coalignment between the experiments and the fine sun sensor. The proposed hardware design was reviewed with regard to loads, stress, thermal distortion, alignment error budgets, fabrication techniques, alignment techniques and producibility. Methods of achieving comparable alignment accuracies on previous projects were also reviewed.
The cardiac muscle duplex as a method to study myocardial heterogeneity
Solovyova, O.; Katsnelson, L.B.; Konovalov, P.V.; Kursanov, A.G.; Vikulova, N.A.; Kohl, P.; Markhasin, V.S.
2014-01-01
This paper reviews the development and application of paired muscle preparations, called duplex, for the investigation of mechanisms and consequences of intra-myocardial electro-mechanical heterogeneity. We illustrate the utility of the underlying combined experimental and computational approach for conceptual development and integration of basic science insight with clinically relevant settings, using previously published and new data. Directions for further study are identified. PMID:25106702
Objective Method for Pain Detection/Diagnosis
2013-11-01
implications. We developed a prototype of the Finger Sensor by combining a wireless pulse oximeter with the previously discussed Shimmer GSR sensor (Figure… NeuroSky and have executed a Developer Agreement. We elected to use a Bluetooth Finger Pulse Oximeter for recording blood oxygen saturation and… pulse. This pulse oximeter was chosen because it met all five of our selection criteria. The Bluetooth functionality makes it easy for us to
NASA Astrophysics Data System (ADS)
Lee, Minsuk; Won, Youngjae; Park, Byungjun; Lee, Seungrag
2017-02-01
Not only the static but also the dynamic characteristics of red blood cells (RBCs) contain useful information for blood diagnosis. Quantitative phase imaging (QPI) can capture sample images with subnanometer-scale depth resolution and millisecond-scale temporal resolution. Various studies have used QPI for RBC diagnosis, and many recent efforts have applied parallel computing algorithms to decrease the processing time of RBC information extraction with QPI; however, previous studies have concentrated on static parameters such as cell morphology, or on simple dynamic parameters such as the root mean square (RMS) of the membrane fluctuations. Previously, we presented a practical blood test method using time-series correlation analysis of RBC membrane flickering with QPI. However, that method proved to have limited clinical applicability because of its long computation time. In this study, we present an accelerated time-series correlation analysis of RBC membrane flickering using a parallel computing algorithm. This method yielded fractal scaling exponent results for the surrounding medium and for normal RBCs that are consistent with our previous research.
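The acceleration described here is a parallel-computing implementation; as a rough illustration of the same idea, the per-pixel time-series correlation of a QPI stack can be computed in one batched FFT instead of a per-pixel loop (a minimal sketch, not the authors' implementation):

```python
import numpy as np

def pixelwise_autocorrelation(phase_stack):
    """Autocorrelation of membrane fluctuation at every pixel, vectorized.

    phase_stack: array of shape (T, H, W), a time series of QPI phase maps.
    Returns the normalized autocorrelation, shape (T, H, W). Uses the
    Wiener-Khinchin theorem so all pixels are processed in one batched FFT.
    """
    x = phase_stack - phase_stack.mean(axis=0, keepdims=True)
    T = x.shape[0]
    # Zero-pad to 2T so the correlation is linear, not circular.
    f = np.fft.rfft(x, n=2 * T, axis=0)
    acf = np.fft.irfft(f * np.conj(f), axis=0)[:T]
    return acf / acf[0]   # normalize so the lag-0 correlation is 1
```

A fractal scaling exponent can then be estimated from the decay of this correlation at each pixel; the GPU version of the same computation is what buys the reported speed-up.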
NASA Astrophysics Data System (ADS)
El Akbar, R. Reza; Anshary, Muhammad Adi Khairul; Hariadi, Dennis
2018-02-01
Model MACP for HE ver.1 is a model that describes how to measure and monitor performance in higher education. Based on a review of the research related to the model, several of its components warrant further development, so this research has four main objectives. The first is to differentiate the CSF (critical success factor) components of the previous model; the second is to explore the KPIs (key performance indicators) of the previous model; the third, building on the first two objectives, is to design a new and more detailed model. The final objective is to design a prototype application for performance measurement in higher education, based on the new model. The methods used are explorative research and application design using the prototype method. The results of this study are, first, a new and more detailed model for measuring and monitoring performance in higher education, obtained through differentiation and exploration of Model MACP for HE ver.1; second, a dictionary of higher education performance measurement, compiled by re-evaluating the existing indicators; and third, the design of a prototype application for performance measurement in higher education.
Immersed boundary-simplified lattice Boltzmann method for incompressible viscous flows
NASA Astrophysics Data System (ADS)
Chen, Z.; Shu, C.; Tan, D.
2018-05-01
An immersed boundary-simplified lattice Boltzmann method is developed in this paper for simulations of two-dimensional incompressible viscous flows with immersed objects. Assisted by the fractional step technique, the problem is resolved in a predictor-corrector scheme. The predictor step solves the flow field without considering immersed objects, and the corrector step imposes the effect of immersed boundaries on the velocity field. Different from the previous immersed boundary-lattice Boltzmann method which adopts the standard lattice Boltzmann method (LBM) as the flow solver in the predictor step, a recently developed simplified lattice Boltzmann method (SLBM) is applied in the present method to evaluate intermediate flow variables. Compared to the standard LBM, SLBM requires lower virtual memories, facilitates the implementation of physical boundary conditions, and shows better numerical stability. The boundary condition-enforced immersed boundary method, which accurately ensures no-slip boundary conditions, is implemented as the boundary solver in the corrector step. Four typical numerical examples are presented to demonstrate the stability, the flexibility, and the accuracy of the present method.
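A minimal sketch of the corrector step of an immersed boundary method of this general type is shown below: the predicted velocity is interpolated to the Lagrangian markers with a discrete delta kernel, the velocity deficit relative to the prescribed wall velocity is computed, and the correction is spread back to the grid. For brevity this sketch applies one explicit correction pass, whereas the boundary condition-enforced IBM used in the paper solves a small linear system so that no-slip is satisfied exactly; the hat kernel is also an illustrative simplification of the usual smoothed delta functions.

```python
import numpy as np

def delta_kernel(r):
    """Hat kernel with support |r| < 1 (a simple stand-in for Peskin-type
    smoothed delta functions)."""
    r = np.abs(r)
    return np.where(r < 1.0, 1.0 - r, 0.0)

def correct_velocity(u, v, xg, yg, xb, yb, ub, vb, h):
    """One velocity-correction pass of an immersed boundary method (sketch).

    u, v   : predicted velocity fields on the Eulerian grid, indexed [y, x]
    xg, yg : 1-D grid coordinate arrays; h is the grid spacing
    xb, yb : Lagrangian boundary marker coordinates
    ub, vb : prescribed wall velocity at the markers
    """
    Wx = delta_kernel((xg[None, :] - xb[:, None]) / h)   # (Nb, Nx) weights
    Wy = delta_kernel((yg[None, :] - yb[:, None]) / h)   # (Nb, Ny) weights

    def interp(f):   # velocity interpolated to the markers
        return np.einsum('bi,bj,ji->b', Wx, Wy, f)

    du_b = ub - interp(u)          # velocity deficit at the markers
    dv_b = vb - interp(v)
    # Spread the correction back to the grid (the boundary condition-enforced
    # scheme instead solves a small linear system for exact no-slip).
    u = u + np.einsum('b,bi,bj->ji', du_b, Wx, Wy)
    v = v + np.einsum('b,bi,bj->ji', dv_b, Wx, Wy)
    return u, v
```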
Three novel approaches to structural identifiability analysis in mixed-effects models.
Janzén, David L I; Jirstrand, Mats; Chappell, Michael J; Evans, Neil D
2016-05-06
Structural identifiability is a concept that considers whether the structure of a model together with a set of input-output relations uniquely determines the model parameters. In the mathematical modelling of biological systems, structural identifiability is an important concept since biological interpretations are typically made from the parameter estimates. For a system defined by ordinary differential equations, several methods have been developed to analyse whether the model is structurally identifiable or otherwise. Another well-used modelling framework, which is particularly useful when the experimental data are sparsely sampled and the population variance is of interest, is mixed-effects modelling. However, established identifiability analysis techniques for ordinary differential equations are not directly applicable to such models. In this paper, we present and apply three different methods that can be used to study structural identifiability in mixed-effects models. The first method, called the repeated measurement approach, is based on applying a set of previously established statistical theorems. The second method, called the augmented system approach, is based on augmenting the mixed-effects model to an extended state-space form. The third method, called the Laplace transform mixed-effects extension, is based on considering the moment invariants of the system's transfer function as functions of random variables. To illustrate, compare and contrast the application of the three methods, they are applied to a set of mixed-effects models. Three structural identifiability analysis methods applicable to mixed-effects models have been presented in this paper. As method development of structural identifiability techniques for mixed-effects models has been given very little attention, despite mixed-effects models being widely used, the methods presented in this paper provide a way of handling structural identifiability in mixed-effects models that was previously not possible. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
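As a concrete illustration of the transfer-function (Laplace) viewpoint in the fixed-effects setting, which the third method extends by treating the invariants as functions of random variables, consider the classic one-compartment model with first-order absorption. Its moment invariants determine the parameters only up to the well-known absorption/elimination "flip-flop", i.e., the model is locally but not globally identifiable (a textbook example, not taken from this paper):

```python
import sympy as sp

ka, ke, V = sp.symbols('k_a k_e V', positive=True)       # model parameters
p1, p2, p3 = sp.symbols('phi1 phi2 phi3', positive=True)  # observed invariants

# Transfer function G(s) = k_a / (V (s + k_a)(s + k_e)); its coefficient
# invariants are k_a + k_e, k_a*k_e, and k_a/V.
eqs = [sp.Eq(ka + ke, p1),   # denominator s-coefficient
       sp.Eq(ka * ke, p2),   # denominator constant term
       sp.Eq(ka / V, p3)]    # numerator coefficient
sols = sp.solve(eqs, [ka, ke, V], dict=True)
print(len(sols))   # 2 -> two parameter sets give identical input-output data:
for s_ in sols:    # the absorption/elimination "flip-flop", i.e. local but
    print(s_)      # not global structural identifiability
```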
Feedback systems for nontraditional medicines: a case for the signal flow diagram.
Tice, B S
1998-11-01
The signal flow diagram is a graphic method for representing the complex data found in the field of biology and hence in medicine. The signal flow diagram is compared against a table of the same data and a flow chart of the data, and evaluated on the clarity and simplicity with which it imparts this information. The data modeled are from previous clinical studies and from nontraditional medicine in Africa, China, and South America. This report is a development from previous presentations of the signal flow diagram [1-4].
Huynh, Dao; Zhou, Shao Jia; Gibson, Robert; Palmer, Lyndon; Muhlhausler, Beverly
2015-01-01
In this study a novel method to determine iodine concentrations in human breast milk was developed and validated. The iodine was analyzed by inductively coupled plasma mass spectrometry (ICPMS) following tetramethylammonium hydroxide (TMAH) extraction at 90°C in disposable polypropylene tubes. While similar approaches have been used previously, this method adopted a shorter extraction time (1 h vs. 3 h) and used antimony (Sb) as the internal standard, which exhibited greater stability in breast milk and milk powder matrices compared to tellurium (Te). Method validation included: defining iodine linearity up to 200 μg/L; confirming recovery of iodine from NIST 1549 milk powder. A recovery of 94-98% was also achieved for the NIST 1549 milk powder and human breast milk samples spiked with sodium iodide and thyroxine (T4) solutions. The method quantitation limit (MQL) for human breast milk was 1.6 μg/L. The intra-assay and inter-assay coefficients of variation for the breast milk samples and NIST powder were <1% and <3.5%, respectively. NIST 1549 milk powder, human breast milk samples and calibration standards spiked with the internal standard were all stable for at least 2.5 months after extraction. The results of the validation process confirmed that this newly developed method provides greater accuracy and precision in the assessment of iodine concentrations in human breast milk than previous methods and therefore offers a more reliable approach for assessing iodine concentrations in human breast milk. Copyright © 2014 Elsevier GmbH. All rights reserved.
A new method of artificial latent fingerprint creation using artificial sweat and inkjet printer.
Hong, Sungwook; Hong, Ingi; Han, Aleum; Seo, Jin Yi; Namgung, Juyoung
2015-12-01
In order to study fingerprinting in the field of forensic science, it is very important to have two or more latent fingerprints with identical chemical composition and intensity. In reality, however, it is impossible to obtain identical fingerprints, because a fingerprint comes out slightly differently every time. A previous study proposed an artificial fingerprint creation method in which inkjet ink was replaced with a solution of amino acids and sodium chloride: the components of human sweat. However, this method had some drawbacks: divalent cations were not added when formulating the artificial sweat solution, and diluted solutions were used for creating weakly deposited latent fingerprints. In this study, a method was developed to overcome the drawbacks of the previous study. Several divalent cations were added because the amino acid-ninhydrin (or some of its analogues) complex is known to react with divalent cations to produce a photoluminescent product and, similarly, the amino acid-1,2-indanedione complex is known to be catalyzed by a small amount of zinc ions to produce a highly photoluminescent product. Also, a new technique was developed that makes it possible to adjust the intensity when printing the latent fingerprint patterns. In this method, image processing software is used to control the intensity of the master fingerprint patterns, which adjusts the printing intensity of the latent fingerprints. This new method opens the way to producing more realistic artificial fingerprints of various strengths with one artificial sweat working solution. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Gatenby, Piers; Bhattacharjee, Santanu; Wall, Christine; Caygill, Christine; Watson, Anthony
2016-01-01
AIM To clarify risk based upon segment length, diagnostic histological findings, patient age and year of surveillance, duration of surveillance and gender. METHODS Patients registered with the United Kingdom Barrett's Oesophagus Registry from 9 United Kingdom centers were included. The outcome measures were (1) development of all grades of dysplasia; (2) development of high-grade dysplasia or adenocarcinoma; and (3) development of adenocarcinoma. Prevalent cases and subjects with < 1 year of follow-up were excluded. The covariates examined were segment length, previous biopsy findings, age at surveillance, duration of surveillance, year of surveillance and gender. RESULTS One thousand one hundred and thirty-six patients were included (total 6474 patient-years). Fifty-four patients developed adenocarcinoma (0.83% per annum), 70 developed high-grade dysplasia/adenocarcinoma (1.1% per annum) and 190 developed any grade of dysplasia (3.5% per annum). High-grade dysplasia and adenocarcinoma increased with age and duration of surveillance. The risk of developing low-grade dysplasia was not dependent on age at surveillance. Segment length and previous biopsy findings were also significant factors for the development of dysplasia and adenocarcinoma. CONCLUSION The risk of developing low-grade dysplasia is independent of age at surveillance, but high-grade dysplasia and adenocarcinoma were more commonly found at older ages. Segment length and previous biopsy findings are also markers of risk. This study did not demonstrate stabilisation of the metaplastic segment with prolonged surveillance. PMID:28082811
NASA Technical Reports Server (NTRS)
Chang, Sin-Chung
1995-01-01
A new numerical framework for solving conservation laws is being developed. This new framework differs substantially in both concept and methodology from the well-established methods, i.e., finite difference, finite volume, finite element, and spectral methods. It is conceptually simple and designed to overcome several key limitations of the above traditional methods. A two-level scheme for solving the convection-diffusion equation is constructed and used to illuminate the major differences between the present method and those previously mentioned. This explicit scheme, referred to as the a-mu scheme, has two independent marching variables.
Researcher’s Perspective of Substitution Method on Text Steganography
NASA Astrophysics Data System (ADS)
Zamir Mansor, Fawwaz; Mustapha, Aida; Azah Samsudin, Noor
2017-08-01
Studies of linguistic steganography are still at the stage of development and empowerment of practice. This paper presents several text steganography substitution methods from the researchers' perspective; the relevant scholarly papers are analysed and compared. The objective of this paper is to give basic information on the substitution methods of text-domain steganography that have been applied by previous researchers. The typical ways these methods work are also identified, in order to reveal the most effective method in text-domain steganography. Finally, the general advantages and drawbacks of these techniques are also presented.
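As a minimal illustration of the substitution idea, the sketch below hides one bit per word drawn from a synonym pair (the pair list is hypothetical; practical systems use curated synonym dictionaries and must preserve grammar and style):

```python
# Hypothetical synonym pairs; a real system would use a curated dictionary.
PAIRS = [("big", "large"), ("quick", "fast"), ("begin", "start")]

def embed(cover_words, bits):
    """Hide one bit at every word belonging to a synonym pair:
    bit 0 -> first synonym, bit 1 -> second synonym."""
    lookup = {w: pair for pair in PAIRS for w in pair}
    out, i = [], 0
    for w in cover_words:
        if w in lookup and i < len(bits):
            out.append(lookup[w][bits[i]])
            i += 1
        else:
            out.append(w)
    return out

def extract(stego_words, n_bits):
    lookup = {w: idx for pair in PAIRS for idx, w in enumerate(pair)}
    bits = [lookup[w] for w in stego_words if w in lookup]
    return bits[:n_bits]

stego = embed("the quick dog will begin a big task".split(), [1, 0, 1])
print(stego)              # ['the', 'fast', 'dog', 'will', 'begin', 'a', 'large', 'task']
print(extract(stego, 3))  # [1, 0, 1]
```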
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
Method 1664 was developed by the United States Environmental Protection Agency Office of Science and Technology to replace previously used gravimetric procedures that employed Freon-113, a Class I CFC, as the extraction solvent for the determination of oil and grease and petroleum hydrocarbons. Method 1664 is a performance-based method applicable to aqueous matrices that requires the use of n-hexane as the extraction solvent and gravimetry as the determinative technique. In addition, QC procedures designed to monitor precision and accuracy have been incorporated into Method 1664.
Wiegers, Ann L
2003-07-01
Third-party accreditation is a valuable tool to demonstrate a laboratory's competence to conduct testing. Accreditation, internationally and in the United States, has been discussed previously. However, accreditation is only one part of establishing data credibility. A validated test method is the first component of a valid measurement system. Validation is defined as confirmation by examination and the provision of objective evidence that the particular requirements for a specific intended use are fulfilled. The international and national standard ISO/IEC 17025 recognizes the importance of validated methods and requires that laboratory-developed methods or methods adopted by the laboratory be appropriate for the intended use. Validated methods are therefore required, and their use must be agreed to by the client (i.e., end users of the test results such as veterinarians, animal health programs, and owners). ISO/IEC 17025 also requires that the introduction of methods developed by the laboratory for its own use be a planned activity conducted by qualified personnel with adequate resources. This article discusses considerations and recommendations for the conduct of veterinary diagnostic test method development, validation, evaluation, approval, and transfer to the user laboratory in the ISO/IEC 17025 environment. These recommendations are based on those of nationally and internationally accepted standards and guidelines, as well as those of reputable and experienced technical bodies. They are also based on the author's experience in the evaluation of method development and transfer projects, validation data, and the implementation of quality management systems in the area of method development.
Terrain and refractivity effects on non-optical paths
NASA Astrophysics Data System (ADS)
Barrios, Amalia E.
1994-07-01
The split-step parabolic equation (SSPE) has been used extensively to model tropospheric propagation over the sea, but recent efforts have extended this method to propagation over arbitrary terrain. At the Naval Command, Control and Ocean Surveillance Center (NCCOSC), Research, Development, Test and Evaluation Division, a split-step Terrain Parabolic Equation Model (TPEM) has been developed that takes into account variable terrain and range-dependent refractivity profiles. While TPEM has been previously shown to compare favorably with measured data and other existing terrain models, two alternative methods to model radiowave propagation over terrain, implemented within TPEM, will be presented that give a two to ten-fold decrease in execution time. These two methods are also shown to agree well with measured data.
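For reference, a single range step of a standard narrow-angle split-step Fourier PE solver looks as follows (a generic sketch; TPEM adds terrain boundary handling and range-dependent refractivity on top of this kernel):

```python
import numpy as np

def sspe_step(u, dx, dz, k0, n_profile):
    """Advance the PE field u(z) by one range step dx (narrow-angle SSPE sketch).

    u         : complex field on a vertical grid of spacing dz
    k0        : free-space wavenumber, 2*pi / wavelength
    n_profile : refractive index n(z) on the same grid (range-dependent in general)
    """
    N = u.size
    kz = 2 * np.pi * np.fft.fftfreq(N, d=dz)   # vertical wavenumbers
    # Free-space diffraction step applied in the spectral domain...
    U = np.fft.fft(u) * np.exp(-1j * kz**2 * dx / (2 * k0))
    u = np.fft.ifft(U)
    # ...followed by a refraction phase screen in the spatial domain.
    return u * np.exp(1j * k0 * (n_profile - 1.0) * dx)
```

Marching this step in range, with the terrain enforced as a boundary condition at each step, is the basic structure of terrain PE models of this kind.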
ERIC Educational Resources Information Center
Goodlin-Jones, Beth L.; Waters, Sara; Anders, Thomas F.
2009-01-01
Objective: This study investigated the association between preschool children's sleep patterns measured by actigraphy and parent-reported hyperactivity symptoms. Many previous studies have reported sleep problems in children with attention deficit hyperactivity disorder (ADHD)-like symptoms. Methods: This study examined a cross-sectional sample of…
Changing Epistemological Beliefs: The Unexpected Impact of a Short-Term Intervention
ERIC Educational Resources Information Center
Kienhues, Dorothe; Bromme, Rainer; Stahl, Elmar
2008-01-01
Background: Previous research has shown that sophisticated epistemological beliefs exert a positive influence on students' learning strategies and learning outcomes. This gives a clear educational relevance to studies on the development of methods for promoting a change in epistemological beliefs and making them more sophisticated. Aims: To…
Using Mixed-Effects Structural Equation Models to Study Student Academic Development.
ERIC Educational Resources Information Center
Pike, Gary R.
1992-01-01
A study at the University of Tennessee Knoxville used mixed-effect structural equation models incorporating latent variables as an alternative to conventional methods of analyzing college students' (n=722) first-year-to-senior academic gains. Results indicate, contrary to previous analysis, that coursework and student characteristics interact to…
Perspectives on Rural Health Workforce Issues: Illinois-Arkansas Comparison
ERIC Educational Resources Information Center
MacDowell, Martin; Glasser, Michael; Fitts, Michael; Fratzke, Mel; Peters, Karen
2009-01-01
Context: Past research has documented rural physician and health care professional shortages. Purpose: Rural hospital chief executive officers' (CEOs') reported shortages of health professionals and perceptions about recruiting and retention are compared in Illinois and Arkansas. Methods: A survey, previously developed and sent to 28 CEOs in…
Hierarchical Bayesian Models of Subtask Learning
ERIC Educational Resources Information Center
Anglim, Jeromy; Wynton, Sarah K. A.
2015-01-01
The current study used Bayesian hierarchical methods to challenge and extend previous work on subtask learning consistency. A general model of individual-level subtask learning was proposed focusing on power and exponential functions with constraints to test for inconsistency. To study subtask learning, we developed a novel computer-based booking…
Change Blindness as a Means of Studying Expertise in Physics
ERIC Educational Resources Information Center
Feil, Adam; Mestre, Jose P.
2010-01-01
Previous studies examining expertise have used a wide range of methods. Beyond characterizing expert and novice behavior in different contexts and circumstances, many studies have examined the processes that comprise the behavior itself and, more recently, processes that comprise training and practice that develop expertise. Other studies, dating…
Gilligan's Moral Orientation Hypothesis: Strategies of Justification and Practical Deliberation.
ERIC Educational Resources Information Center
Keefer, Matthew Wilks
Previous studies failed to determine whether Gilligan's (1982) justice and care perspectives represent two distinct orientations of moral reasoning. Using methods developed in research on reasoning and discourse processes, a study used a discursive framework to validate an alternate methodology for the investigation of moral orientation reasoning.…
Leisure Service Career Programs Model. Final Report.
ERIC Educational Resources Information Center
Twining, Marilyn
This report identifies leisure career occupations, determines the occupational outlook, and develops primary core competencies as well as specialized, optional competencies for entry level employment. The main method of inquiry is described as a needs assessment based on an audit at Moraine Valley Community College, two previous studies by the…
ERIC Educational Resources Information Center
Narr, Katherine L.; Woods, Roger P.; Lin, James; Kim, John; Phillips, Owen R.; Del'Homme, Melissa; Caplan, Rochelle; Toga, Arthur W.; McCracken, James T.; Levitt, Jennifer G.
2009-01-01
Objective: This cross-sectional study sought to confirm the presence and regional profile of previously reported changes in laminar cortical thickness in children and adolescents with attention-deficit/hyperactivity disorder (ADHD) compared with typically developing control subjects. Method: High-resolution magnetic resonance images were obtained…
Renting Rooms in Three Canadian Cities: Accepting and Rejecting the AIDS Patient.
ERIC Educational Resources Information Center
Page, Stewart
Following methods previously developed, this study investigated the social stigma associated with Acquired Immunodeficiency Syndrome (AIDS) by placing 90 telephone calls to landlords advertising rooms for rent in each of three Canadian cities: Windsor, Toronto, and Halifax. Compared to control conditions, calls ostensibly from AIDS patients were…
FaculTea: Professional Development for Learning Centered Academic Advising
ERIC Educational Resources Information Center
Voller, Julie Givans
2013-01-01
The theory of learning centered academic advising states that the purpose of advising is to teach undergraduate students about the logic and purpose of their education. Previous scholarship on learning centered advising has focused on the theoretical or on implementation by faculty at small colleges and universities. Methods for supporting…
Emotion Regulation Predicts Attention Bias in Maltreated Children At-Risk for Depression
ERIC Educational Resources Information Center
Romens, Sarah E.; Pollak, Seth D.
2012-01-01
Background: Child maltreatment is associated with heightened risk for depression; however, not all individuals who experience maltreatment develop depression. Previous research indicates that maltreatment contributes to an attention bias for emotional cues, and that depressed individuals show attention bias for sad cues. Method: The present study…
Scale of Academic Emotion in Science Education: Development and Validation
ERIC Educational Resources Information Center
Chiang, Wen-Wei; Liu, Chia-Ju
2014-01-01
Contemporary research into science education has generally been conducted from the perspective of "conceptual change" in learning. This study sought to extend previous work by recognizing that human rationality can be influenced by the emotions generated by the learning environment and specific actions related to learning. Methods used…
Classical Civilization (Greece-Hellenistic-Rome). Teacher's Manual. 1968 Edition.
ERIC Educational Resources Information Center
Leppert, Ella C.; Smith, Rozella B.
This secondary teachers guide builds upon a previous sequential course described in SO 003 173, and consists of three sections on the classical civilizations--Greek, Hellenistic, and Rome. Major emphasis is upon students gaining an understanding of cultural development and transmission. Using an analytic method, students learn to examine primary…
The Effect of a Brief Training in Motivational Interviewing on Trainee Skill Development
ERIC Educational Resources Information Center
Young, Tabitha L.; Hagedorn, W. Bryce
2012-01-01
Motivational interviewing (MI) is an empirically based practice that provides counselors with methods for working with resistant and ambivalent clients. Whereas previous research has demonstrated the effectiveness of training current clinicians in this evidenced-based practice, no research has investigated the efficacy of teaching MI to…
USDA-ARS?s Scientific Manuscript database
Ehrlichiosis, a potentially fatal infection, is caused by rickettsial bacteria transmitted by the lone star tick, Amblyomma americanum. We previously analyzed the chemosensory appendage proteome of A. americanum as part of a project to develop new chemosensory-based vector control methods. Among the...
Soares, Cristina M Dias; Alves, Rita C; Casal, Susana; Oliveira, M Beatriz P P; Fernandes, José Oliveira
2010-04-01
The present study describes the development and validation of a new method based on a matrix solid-phase dispersion (MSPD) sample preparation procedure followed by GC-MS for the determination of acrylamide levels in coffee (ground coffee and brewed coffee) and coffee substitute samples. Samples were dispersed in C(18) sorbent and the mixture was further packed into a preconditioned custom-made ISOLUTE bilayered SPE column (C(18)/Multimode; 1 g + 1 g). Acrylamide was subsequently eluted with water, then derivatized with bromine and quantified by GC-MS in SIM mode. The MSPD/GC-MS method presented an LOD of 5 microg/kg and an LOQ of 10 microg/kg. Intra- and interday precisions ranged from 2% to 4% and 4% to 10%, respectively. To evaluate the performance of the method, 11 samples of ground and brewed coffee and coffee substitutes were simultaneously analyzed by the developed method and by a previously validated method based on a liquid-extraction (LE) procedure, and the results were compared, showing a high correlation between them.
The development of a primary dental care outreach course.
Waterhouse, P; Maguire, A; Tabari, D; Hind, V; Lloyd, J
2008-02-01
The aim of this work was to develop the first north-east-based primary dental care outreach (PDCO) course for clinical dental undergraduate students at Newcastle University. The process of course design is described; it involved a review of the existing Bachelor of Dental Surgery (BDS) degree course in relation to previously published learning outcomes. Areas were identified where the existing BDS course did not fully meet these outcomes. This was followed by setting the PDCO course aims and objectives, intended learning outcomes, curriculum and structure. The educational strategy and methods of teaching and learning were subsequently developed, together with a strategy for overall quality control of the teaching and learning experience. The newly developed curriculum was aligned with appropriate student assessment methods, including summative, formative and ipsative elements.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, Judith C.
The purpose of this grant is to develop multi-scale theoretical methods to describe the nanoscale oxidation of metal thin films, as the PI (Yang) has extensive previous experience in the experimental elucidation of the initial stages of Cu oxidation, primarily by in situ transmission electron microscopy methods. Through the use and development of computational tools at varying length (and time) scales, from atomistic quantum mechanical calculations and force-field mesoscale simulations to large-scale kinetic Monte Carlo (KMC) modeling, the fundamental underpinnings of the initial stages of Cu oxidation have been elucidated. The development of computational modeling tools allows for accelerated materials discovery. The theoretical tools developed from this program impact a wide range of technologies that depend on surface reactions, including corrosion, catalysis, and nanomaterials fabrication.
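As a flavor of the largest-scale tool mentioned, a rejection-free (Gillespie-type) kinetic Monte Carlo loop for a toy two-event surface oxidation model might look as follows (the rates and the 1-D lattice are illustrative assumptions, not the project's Cu oxidation model):

```python
import numpy as np

def kmc_oxidation(n_sites=100, t_end=1.0, k_ads=5.0, k_ox=1.0, seed=0):
    """Minimal rejection-free KMC sketch with two event types on a 1-D
    surface: O adsorption on empty sites (rate k_ads per site) and oxide
    conversion of adsorbed O (rate k_ox per site)."""
    rng = np.random.default_rng(seed)
    state = np.zeros(n_sites, dtype=int)   # 0 empty, 1 O adsorbed, 2 oxide
    t = 0.0
    while t < t_end:
        rates = np.where(state == 0, k_ads, np.where(state == 1, k_ox, 0.0))
        total = rates.sum()
        if total == 0:                     # fully oxidized: nothing left to do
            break
        t += rng.exponential(1.0 / total)            # time to the next event
        site = rng.choice(n_sites, p=rates / total)  # which event fires
        state[site] += 1                             # empty -> O, or O -> oxide
    return t, state

t, state = kmc_oxidation()
print(f"t = {t:.3f}, oxide fraction = {(state == 2).mean():.2f}")
```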
Stevens, Katherine; Palfreyman, Simon
2012-12-01
To describe how qualitative methods can be used in the development of descriptive systems of preference-based measures (PBMs) of health-related quality of life. The requirements of the National Institute for Health and Clinical Excellence and other agencies together with the increasing use of patient-reported outcome measures has led to an increase in the demand for PBMs. Recently, interest has grown in developing new PBMs and while previous research on PBMs has mainly focused on the methods of valuation, research into the methods of developing descriptive systems is an emerging field. Traditionally, descriptive systems of PBMs were developed by using top-down methods, where content was derived from existing measures, the literature, or health surveys. A contrasting approach is a bottom-up methodology, which takes the views of patients or laypeople on how their life is affected by their health. This approach generally requires the use of qualitative methods. Qualitative methods lend themselves well to the development of PBMs. They also ensure that the measure has appropriate language, content validity, and responsiveness to change. While the use of qualitative methods in the development of non-PBMs is fairly standard, their use in developing PBMs was until recently nonexistent. In this article, we illustrate the use of qualitative methods by presenting two case studies of recently developed PBMs, one generic and one condition specific. We outline the stages involved, discuss the strengths and weaknesses of the approach, and compare with the top-down approach used in the majority of PBMs to date. Copyright © 2012 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
Automated three-component synthesis of a library of γ-lactams
Fenster, Erik; Hill, David; Reiser, Oliver
2012-01-01
A three-component method for the synthesis of γ-lactams from commercially available maleimides, aldehydes, and amines was adapted to parallel library synthesis. Improvements to the chemistry over previous efforts include the optimization of the method to a one-pot process, the management of by-products and excess reagents, the development of an automated parallel sequence, and the adaptation of the method to permit the preparation of enantiomerically enriched products. These efforts culminated in the preparation of a library of 169 γ-lactams. PMID:23209515
The application of quadratic optimal cooperative control synthesis to a CH-47 helicopter
NASA Technical Reports Server (NTRS)
Townsend, Barbara K.
1986-01-01
A control-system design method, Quadratic Optimal Cooperative Control Synthesis (CCS), is applied to the design of a Stability and Control Augmentation System (SCAS). The CCS design method differs from other design methods in that it does not require detailed a priori design criteria, but instead relies on an explicit optimal pilot model to create the desired performance. The design model, developed previously for fixed-wing aircraft, is simplified and modified for application to a Boeing Vertol CH-47 helicopter. Two SCAS designs are developed using the CCS design methodology. The resulting CCS designs are then compared with designs obtained using classical/frequency-domain methods and Linear Quadratic Regulator (LQR) theory in a piloted fixed-base simulation. Results indicate that the CCS method, with slight modifications, can be used to produce controller designs which compare favorably with the frequency-domain approach.
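For context, the LQR baseline used in the comparison amounts to solving a continuous-time algebraic Riccati equation for a state-feedback gain. A minimal sketch on a placeholder two-state model (illustrative numbers, not CH-47 dynamics, and not the CCS method itself):

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Placeholder 2-state linear model (illustrative numbers only).
A = np.array([[0.0,  1.0],
              [-2.0, -0.5]])
B = np.array([[0.0],
              [1.0]])
Q = np.diag([10.0, 1.0])   # state weighting
R = np.array([[1.0]])      # control weighting

P = solve_continuous_are(A, B, Q, R)    # Riccati solution
K = np.linalg.solve(R, B.T @ P)         # LQR gain: u = -K x
print("LQR gain:", K)
print("closed-loop eigenvalues:", np.linalg.eigvals(A - B @ K))
```

The CCS approach augments this kind of quadratic synthesis with an explicit optimal pilot model so that desired handling qualities, rather than weighting matrices alone, drive the design.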
Noncontact evaluation for interface states by photocarrier counting
NASA Astrophysics Data System (ADS)
Furuta, Masaaki; Shimizu, Kojiro; Maeta, Takahiro; Miyashita, Moriya; Izunome, Koji; Kubota, Hiroshi
2018-03-01
We have developed a noncontact measurement method that enables in-line measurement and does not require any test element group (TEG) formation. In this method, the number of photocarriers excited from the interface states is counted ("photocarrier counting"), and the energy distribution of the interface state density (Dit) is then evaluated by spectral light excitation. In our previous experiment, a preliminary contact measurement was made at the oxide on top of the Si wafer. Here, we have developed a Dit measurement method that is fully noncontact, with a gap between the probes and the wafer. The shallow trench isolation (STI) sidewall has more localized interface states than the region under the gate electrode. We demonstrate the noncontact measurement of carriers trapped in interface states using wafers of three different crystal plane orientations. This demonstration paves the way for evaluating STI sidewall interface states in future studies.
NASA Astrophysics Data System (ADS)
Teodor, V. G.; Baroiu, N.; Susac, F.; Oancea, N.
2016-11-01
Modelling the curl of surfaces associated with a pair of rolling centrodes, when the profile of the rack-gear's teeth is known by direct measurement as a coordinate matrix, has as its goal determining the generation quality for an imposed kinematics of the relative motion of the tool with respect to the blank. In this way, it is possible to determine the geometrical generation error, which forms the basis of the total error. Modelling the generation process allows potential errors of the generating tool to be highlighted, so that its profile can be corrected before the tool is used in the machining process. A method developed in CATIA is proposed, based on a new approach, namely the method of "relative generating trajectories". The analytical foundations are presented, as well as some applications to known models of rack-gear-type tools used on Maag gear-cutting machines.
Method for phosphorothioate antisense DNA sequencing by capillary electrophoresis with UV detection.
Froim, D; Hopkins, C E; Belenky, A; Cohen, A S
1997-11-01
The progress of antisense DNA therapy demands development of reliable and convenient methods for sequencing short single-stranded oligonucleotides. A method of phosphorothioate antisense DNA sequencing analysis using UV detection coupled to capillary electrophoresis (CE) has been developed based on a modified chain termination sequencing method. The proposed method reduces the sequencing cost since it uses affordable CE-UV instrumentation and requires no labeling with minimal sample processing before analysis. Cycle sequencing with ThermoSequenase generates quantities of sequencing products that are readily detectable by UV. Discrimination of undesired components from sequencing products in the reaction mixture, previously accomplished by fluorescent or radioactive labeling, is now achieved by bringing concentrations of undesired components below the UV detection range which yields a 'clean', well defined sequence. UV detection coupled with CE offers additional conveniences for sequencing since it can be accomplished with commercially available CE-UV equipment and is readily amenable to automation.
Holzhütter, H G; Genschow, E; Diener, W; Schlede, E
2003-05-01
The acute toxic class (ATC) methods were developed for determining LD(50)/LC(50) estimates of chemical substances with significantly fewer animals than are needed for conventional LD(50)/LC(50) tests. The ATC methods are sequential stepwise procedures with fixed starting doses/concentrations and a maximum of six animals used per dose/concentration. The numbers of dead/moribund animals determine whether further testing is necessary or whether the test is terminated. In recent years we have developed classification procedures for the oral, dermal and inhalation routes of administration by using biometric methods. The biometric approach assumes a probit model for the mortality probability of a single animal and assigns the chemical to the toxicity class for which the best concordance is achieved between the statistically expected and the observed numbers of dead/moribund animals at the various steps of the test procedure. In previous publications we have demonstrated the validity of the biometric ATC methods on the basis of data obtained for the oral ATC method in two animal ring studies with 15 participants from six countries. Although the test procedures and biometric evaluations for the dermal and inhalation ATC methods have already been published, there was a need to adapt the classification schemes to the starting doses/concentrations of the Globally Harmonized Classification System (GHS) recently adopted by the Organization for Economic Co-operation and Development (OECD). Here we present the biometric evaluation of the dermal and inhalation ATC methods for the starting doses/concentrations of the GHS and of some other international classification systems still in use. We have developed new test procedures and decision rules for the dermal and inhalation ATC methods, which require significantly fewer animals to provide predictions of toxicity classes that are as good as or better than those achieved by the conventional LD(50)/LC(50) methods. In order to cope with the rather narrow dose/concentration classes of the GHS we have, as in our previous publications, combined the outcomes of all results that can be obtained during testing for the allocation to one of the defined toxicity classes of the GHS. Our results strongly recommend the deletion of the dermal LD(50) and inhalation LC(50) tests as regulatory tests and the adoption of the dermal and inhalation ATC methods as internationally accepted alternatives.
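The biometric evaluation rests on a probit model for single-animal mortality. A minimal sketch of that building block (the tolerance spread sigma is an assumed illustrative value) shows how the expected distribution of deaths among the six animals at one test step can be computed:

```python
import numpy as np
from scipy.stats import norm, binom

def p_death(dose, ld50, sigma=0.5):
    """Probit model (illustrative assumption): the log10-dose tolerance is
    normal with median log10(LD50) and spread sigma."""
    return norm.cdf((np.log10(dose) - np.log10(ld50)) / sigma)

# Expected deaths among six animals tested at a 300 mg/kg step for a
# substance with a hypothetical true LD50 of 500 mg/kg:
p = p_death(300.0, 500.0)
print("P(death per animal) =", round(p, 3))
print("P(k deaths out of 6):", [round(binom.pmf(k, 6, p), 3) for k in range(7)])
```

Comparing such expected counts with the observed counts at every step of the sequential procedure is what drives the allocation to a toxicity class.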
Relationship between behavioral and physiological spectral-ripple discrimination.
Won, Jong Ho; Clinard, Christopher G; Kwon, Seeyoun; Dasika, Vasant K; Nie, Kaibao; Drennan, Ward R; Tremblay, Kelly L; Rubinstein, Jay T
2011-06-01
Previous studies have found a significant correlation between spectral-ripple discrimination and speech and music perception in cochlear implant (CI) users. This relationship could be of use to clinicians and scientists who are interested in using spectral-ripple stimuli in the assessment and habilitation of CI users. However, previous psychoacoustic tasks used to assess spectral discrimination are not suitable for all populations, and it would be beneficial to develop methods that could be used to test all age ranges, including pediatric implant users. Additionally, it is important to understand how ripple stimuli are processed in the central auditory system and how their neural representation contributes to behavioral performance. For this reason, we developed a single-interval, yes/no paradigm that could potentially be used both behaviorally and electrophysiologically to estimate spectral-ripple threshold. In experiment 1, behavioral thresholds obtained using the single-interval method were compared to thresholds obtained using a previously established three-alternative forced-choice method. A significant correlation was found (r = 0.84, p = 0.0002) in 14 adult CI users. The spectral-ripple threshold obtained using the new method also correlated with speech perception in quiet and noise. In experiment 2, the effect of the number of vocoder-processing channels on the behavioral and physiological threshold in normal-hearing listeners was determined. Behavioral thresholds, using the new single-interval method, as well as cortical P1-N1-P2 responses changed as a function of the number of channels. Better behavioral and physiological performance (i.e., better discrimination ability at higher ripple densities) was observed as more channels were added. In experiment 3, the relationship between behavioral and physiological data was examined. Amplitudes of the P1-N1-P2 "change" responses were significantly correlated with d' values from the single-interval behavioral procedure. Results suggest that the single-interval procedure with spectral-ripple phase inversion in ongoing stimuli is a valid approach for measuring behavioral or physiological spectral resolution.
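In a single-interval yes/no task, sensitivity is conventionally summarized as d' computed from hit and false-alarm rates; a minimal sketch (the small-sample correction is a common convention, an assumption rather than this paper's exact procedure):

```python
from scipy.stats import norm

def d_prime(hits, misses, false_alarms, correct_rejections):
    """Yes/no sensitivity: d' = z(hit rate) - z(false-alarm rate).
    A log-linear-style correction keeps rates away from 0 and 1."""
    n_signal = hits + misses
    n_noise = false_alarms + correct_rejections
    h = (hits + 0.5) / (n_signal + 1.0)
    f = (false_alarms + 0.5) / (n_noise + 1.0)
    return norm.ppf(h) - norm.ppf(f)

# E.g., 20 ripple-inverted and 20 standard trials at one ripple density:
print(d_prime(18, 2, 4, 16))
```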
A second-order accurate immersed boundary-lattice Boltzmann method for particle-laden flows
NASA Astrophysics Data System (ADS)
Zhou, Qiang; Fan, Liang-Shih
2014-07-01
A new immersed boundary-lattice Boltzmann method (IB-LBM) is presented for fully resolved simulations of incompressible viscous flows laden with rigid particles. The immersed boundary method (IBM) recently developed by Breugem (2012) [19] is adopted in the present method, including the retraction technique, the multi-direct forcing method and the direct accounting of the inertia of the fluid contained within the particles. The present IB-LBM is, however, further improved by implementing high-order Runge-Kutta schemes in the coupled fluid-particle interaction. The major challenge in implementing high-order Runge-Kutta schemes in the LBM is that flow information such as density and velocity cannot be obtained directly at a fractional time step, since the LBM only provides the flow information at integer time steps. In the present IB-LBM, however, this challenge is overcome by extrapolating the flow field around particles from the known flow field at the previous integer time step. The newly calculated fluid-particle interactions from the previous fractional time steps of the current integer time step are also accounted for in the extrapolation. The IB-LBM with high-order Runge-Kutta schemes developed in this study is validated by several benchmark applications. It is demonstrated, for the first time, that the IB-LBM has the capacity to resolve the translational and rotational motion of particles with second-order accuracy. The optimal retraction distances for spheres and tubes that help the method achieve the second-order accuracy are found to be around 0.30 and -0.47 times the lattice spacing, respectively. Simulations of the Stokes flow through a simple cubic lattice of rotating spheres indicate that the lift force produced by the Magnus effect can be very significant relative to the drag force at practical rotating speeds of the spheres. This finding may lead to more comprehensive studies of the effect of particle rotation on fluid-solid drag laws. It is also demonstrated that, when the third-order or fourth-order Runge-Kutta scheme is used, the numerical stability of the present IB-LBM is better than that of all methods in the literature, including previous IB-LBMs and methods combining the IBM with a traditional incompressible Navier-Stokes solver.
NASA Astrophysics Data System (ADS)
Xiong, J. P.; Zhang, A. L.; Ji, K. F.; Feng, S.; Deng, H.; Yang, Y. F.
2016-01-01
Photospheric bright points (PBPs) are tiny, short-lived phenomena that can be seen within dark inter-granular lanes. In this paper, we develop a new method to identify and track PBPs in a three-dimensional data cube. Unlike previous approaches such as Detection-Before-Tracking, this method is based on Tracking-While-Detection. Using this method, the whole lifetime of a PBP can be accurately measured even when the PBP would otherwise be split into several detections by the Laplacian and morphological dilation (LMD) method because of its occasionally weak intensity. Applying the method to G-band PBPs observed by Hinode/SOT (Solar Optical Telescope) for more than two hours, we find that isolated PBPs have an average lifetime of 3 minutes, with the longest reaching 27 minutes; these values are greater than those obtained with the previous LMD method. Furthermore, we find that the mean intensity of PBPs is 1.02 times the mean photospheric intensity, which is less than the value obtained with the LMD method, and that the intensity of a PBP oscillates with a 2-3 minute period over its lifetime.
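A minimal way to convey the Tracking-While-Detection idea is to label bright features as connected components in the full (t, y, x) cube, so that detection and temporal linking happen in a single pass (a sketch only; the authors' method handles weak-intensity frames more carefully):

```python
import numpy as np
from scipy import ndimage

def track_bright_points(cube, threshold):
    """cube: (T, H, W) intensity stack. Returns labeled 3-D components and
    per-feature lifetimes in frames."""
    mask = cube > threshold
    # Full 3-D connectivity links a feature to itself in adjacent frames,
    # so detection and tracking are one labeling operation.
    labels, n = ndimage.label(mask, structure=np.ones((3, 3, 3)))
    slices = ndimage.find_objects(labels)
    lifetimes = np.array([s[0].stop - s[0].start for s in slices])
    return labels, lifetimes
```

Because a component survives as long as it overlaps itself from frame to frame, a bright point that briefly dims below a per-frame detector's sensitivity can still be bridged in time, which is why this style of tracking yields longer lifetimes than frame-by-frame detection.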
Optics-Only Calibration of a Neural-Net Based Optical NDE Method for Structural Health Monitoring
NASA Technical Reports Server (NTRS)
Decker, Arthur J.
2004-01-01
A calibration process is presented that uses optical measurements alone to calibrate a neural-net based NDE method. The method itself detects small changes in the vibration mode shapes of structures. The optics-only calibration process confirms previous work showing that the sensitivity to vibration-amplitude changes can be as small as 10 nanometers. A more practical value in an NDE service laboratory is shown to be 50 nanometers. Both model-generated and experimental calibrations are demonstrated using two implementations of the calibration technique. The implementations are based on previously published demonstrations of the NDE method and an alternative calibration procedure that depends on comparing neural-net and point-sensor measurements. The optics-only calibration method, unlike the alternative method, does not require modifications of the structure being tested or the creation of calibration objects. The calibration process can be used to test improvements in the NDE process and to develop a vibration-mode independence of damage-detection sensitivity. The calibration effort was intended to support NASA's objective to promote safety in the operation of ground test facilities, and aviation safety in general, by allowing the detection of the gradual onset of structural changes and damage.
A discussion on the origin of quantum probabilities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Holik, Federico, E-mail: olentiev2@gmail.com; Departamento de Matemática - Ciclo Básico Común, Universidad de Buenos Aires - Pabellón III, Ciudad Universitaria, Buenos Aires; Sáenz, Manuel
We study the origin of quantum probabilities as arising from non-Boolean propositional-operational structures. We apply the method developed by Cox to non-distributive lattices and develop an alternative formulation of non-Kolmogorovian probability measures for quantum mechanics. By generalizing the method presented in previous works, we outline a general framework for the deduction of probabilities in general propositional structures represented by lattices (including the non-distributive case). -- Highlights: •Several recent works use a derivation similar to that of R.T. Cox to obtain quantum probabilities. •We apply Cox's method to the lattice of subspaces of the Hilbert space. •We obtain a derivation of quantum probabilities which includes mixed states. •The method presented in this work is susceptible to generalization. •It includes quantum mechanics and classical mechanics as particular cases.
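For orientation, the contrast drawn in the abstract can be written in standard notation: a Kolmogorovian probability is sigma-additive on a Boolean algebra of events, while a quantum (non-Kolmogorovian) measure is additive only over mutually orthogonal projections and, by Gleason's theorem, takes the familiar trace form (standard background, not this paper's derivation):

```latex
% Kolmogorov: a sigma-additive measure on a Boolean sigma-algebra Sigma
\mu:\Sigma\to[0,1],\qquad \mu(\Omega)=1,\qquad
\mu\Big(\bigcup_i A_i\Big)=\sum_i \mu(A_i)\ \text{ for pairwise disjoint } A_i\in\Sigma .

% Quantum: a measure on the non-distributive lattice of projections P(H),
% additive only over mutually orthogonal families; Gleason's theorem
% (dim H >= 3) forces s(P) = Tr(rho P) for a density operator rho.
s:\mathcal{P}(\mathcal{H})\to[0,1],\qquad s(\mathbf{1})=1,\qquad
s\Big(\bigvee_i P_i\Big)=\sum_i s(P_i)\ \text{ for } P_i P_j = 0\ (i\neq j).
```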
Simultaneous Separation of Actinium and Radium Isotopes from a Proton Irradiated Thorium Matrix
Mastren, Tara; Radchenko, Valery; Owens, Allison; ...
2017-08-15
A new method has been developed for the isolation of 223,224,225Ra, in high yield and purity, from a proton-irradiated 232Th matrix. We report an all-aqueous process using multiple solid-supported adsorption steps, including a citrate chelation method developed to remove >99.9% of the barium contaminants by activity from the final radium product. Moreover, we developed a procedure involving the use of three columns in succession, and the separation of 223,224,225Ra from the thorium matrix was obtained with an overall recovery yield of 91 ± 3%, an average radiochemical purity of 99.9%, and production yields that correspond to physical yields based on previously measured excitation functions.
Novel metaheuristic for parameter estimation in nonlinear dynamic biological systems
Rodriguez-Fernandez, Maria; Egea, Jose A; Banga, Julio R
2006-01-01
Background We consider the problem of parameter estimation (model calibration) in nonlinear dynamic models of biological systems. Due to the frequent ill-conditioning and multi-modality of many of these problems, traditional local methods usually fail (unless initialized with very good guesses of the parameter vector). In order to surmount these difficulties, global optimization (GO) methods have been suggested as robust alternatives. Currently, deterministic GO methods cannot solve problems of realistic size within this class in reasonable computation times. In contrast, certain types of stochastic GO methods have shown promising results, although the computational cost remains large. Rodriguez-Fernandez and coworkers have presented hybrid stochastic-deterministic GO methods which could reduce computation time by one order of magnitude while guaranteeing robustness. Our goal here was to further reduce the computational effort without losing robustness. Results We have developed a new procedure based on the scatter search methodology for nonlinear optimization of dynamic models of arbitrary (or even unknown) structure (i.e. black-box models). In this contribution, we describe and apply this novel metaheuristic, inspired by recent developments in the field of operations research, to a set of complex identification problems and we make a critical comparison with respect to the previous (above mentioned) successful methods. Conclusion Robust and efficient methods for parameter estimation are of key importance in systems biology and related areas. The new metaheuristic presented in this paper aims to ensure the proper solution of these problems by adopting a global optimization approach, while keeping the computational effort under reasonable values. This new metaheuristic was applied to a set of three challenging parameter estimation problems of nonlinear dynamic biological systems, outperforming very significantly all the methods previously used for these benchmark problems. PMID:17081289
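The core combine-and-update loop of a scatter search can be sketched in a few lines (a deliberately minimal version: the published metaheuristic adds a diverse generation method, memory structures, and calls to a local solver):

```python
import numpy as np

def scatter_search(f, lb, ub, ref_size=10, iters=100, seed=0):
    """Minimal scatter-search sketch for box-constrained minimization."""
    rng = np.random.default_rng(seed)
    dim = len(lb)
    # Diverse initial population; keep the best solutions as the reference set.
    pop = lb + rng.random((5 * ref_size, dim)) * (ub - lb)
    ref = pop[np.argsort([f(x) for x in pop])[:ref_size]]
    for _ in range(iters):
        trials = []
        for i in range(ref_size):
            for j in range(i + 1, ref_size):
                d = (ref[j] - ref[i]) / 2.0
                x = ref[i] + (2 * rng.random(dim) - 1) * d   # combine a pair
                trials.append(np.clip(x, lb, ub))
        allpts = np.vstack([ref, np.array(trials)])
        ref = allpts[np.argsort([f(x) for x in allpts])[:ref_size]]
    return ref[0], f(ref[0])

# Toy calibration: recover (a, b) in y = a*exp(-b*t) from noiseless data.
t = np.linspace(0, 5, 20)
y = 2.0 * np.exp(-0.7 * t)
best, cost = scatter_search(lambda p: np.sum((p[0] * np.exp(-p[1] * t) - y)**2),
                            lb=np.array([0.0, 0.0]), ub=np.array([5.0, 5.0]))
print(best, cost)   # should approach [2.0, 0.7]
```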
Jasińska, Kaja K; Guei, Sosthène
2018-02-02
Portable neuroimaging approaches provide new advances to the study of brain function and brain development with previously inaccessible populations and in remote locations. This paper shows the development of field functional near-infrared spectroscopy (fNIRS) imaging for the study of child language, reading, and cognitive development in a rural village setting in Côte d'Ivoire. Innovation in methods and the development of culturally appropriate neuroimaging protocols allow a first-time look into brain development and children's learning outcomes in understudied environments. This paper demonstrates protocols for transporting and setting up a mobile laboratory, discusses considerations for field versus laboratory neuroimaging, and presents a guide for developing neuroimaging consent procedures and building meaningful long-term collaborations with local government and science partners. Portable neuroimaging methods can be used to study complex child development contexts, including the impact of significant poverty and adversity on brain development. The protocol presented here has been developed for use in Côte d'Ivoire, the world's primary source of cocoa, where reports of child labor in the cocoa sector are common. Yet little is known about the impact of child labor on brain development and learning. Field neuroimaging methods have the potential to yield new insights into such urgent issues and into the development of children globally.
NASA Astrophysics Data System (ADS)
Karemore, Gopal; Nielsen, Mads; Karssemeijer, Nico; Brandt, Sami S.
2014-11-01
It is now well understood that changes in the mammographic parenchymal pattern are an indicator of breast cancer risk, and we have developed a statistical method that estimates the mammogram regions where parenchymal changes due to breast cancer occur. This region of interest is computed from a score map by utilising the anatomical breast coordinate system developed in our previous work. The method also makes an automatic scale selection to avoid overfitting, while the region estimates are computed by a nested cross-validation scheme. In this way, it is possible to recover those mammogram regions that show a significant difference in classification scores between the cancer and the control group. Our experiments suggested that the most significant mammogram region is the region behind the nipple, which is supported by previous findings from other research groups. This result was obtained from cross-validation experiments on independent training, validation and testing sets drawn from a case-control study of 490 women, of whom 245 were diagnosed with breast cancer within a period of 2-4 years after the baseline mammograms. We additionally generalised the estimated region to another study, mini-MIAS, and showed that the transferred region estimate gives at least a similar classification result when compared to using the whole breast region. In all, following our method will most likely improve both preclinical and follow-up breast cancer screening, but a larger study population will be required to test this hypothesis.
NASA Technical Reports Server (NTRS)
Della-Corte, Christopher
2012-01-01
Foil gas bearings are a key technology in many commercial and emerging oil-free turbomachinery systems. These bearings are nonlinear and have been difficult to model analytically in terms of performance characteristics such as load capacity, power loss, stiffness, and damping. Previous investigations led to an empirically derived method to estimate load capacity. This method has been a valuable tool in system development. The current work extends this tool concept to include rules for stiffness and damping coefficient estimation. It is expected that these rules will further accelerate the development and deployment of advanced oil-free machines operating on foil gas bearings.
[Analysis on origin and evolution of mumps].
Zhao, Yan
2004-10-01
A preliminary understanding of mumps emerged between the Qin-Han and Sui-Tang dynasties, laying a foundation for scholarly development of the topic in later generations. The disease acquired its name in the Song-Jin-Yuan period, with a gradual deepening of its principle-method-formula-medication system, a great advance in understanding compared with previous ages. In the Ming-Qing dynasties the understanding became even more systematic, with certain breakthroughs in the principle-method-formula-medication system. In the modern age, these experiences have been inherited, developed, and integrated with modern biomedicine, so that both theory and clinical practice have become still more complete.
Digital Mapping Techniques '09-Workshop Proceedings, Morgantown, West Virginia, May 10-13, 2009
Soller, David R.
2011-01-01
As in the previous years' meetings, the objective was to foster informal discussion and exchange of technical information, principally in order to develop more efficient methods for digital mapping, cartography, GIS analysis, and information management. At this meeting, oral and poster presentations and special discussion sessions emphasized (1) methods for creating and publishing map products (here, "publishing" includes Web-based release); (2) field data capture software and techniques, including the use of LiDAR; (3) digital cartographic techniques; (4) migration of digital maps into ArcGIS Geodatabase format; (5) analytical GIS techniques; and (6) continued development of the National Geologic Map Database.
High-Fidelity Micromechanics Model Developed for the Response of Multiphase Materials
NASA Technical Reports Server (NTRS)
Aboudi, Jacob; Pindera, Marek-Jerzy; Arnold, Steven M.
2002-01-01
A new high-fidelity micromechanics model has been developed under funding from the NASA Glenn Research Center for predicting the response of multiphase materials with arbitrary periodic microstructures. The model's analytical framework is based on the homogenization technique, but the method of solution for the local displacement and stress fields borrows concepts previously employed in constructing the higher order theory for functionally graded materials. The resulting closed-form macroscopic and microscopic constitutive equations, valid for both uniaxial and multiaxial loading of periodic materials with elastic and inelastic constitutive phases, can be incorporated into a structural analysis computer code. Consequently, this model now provides an alternative, accurate method.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kidd, M.E.C.
1997-02-01
The goal of our work is to provide a high level of confidence that critical software driven event sequences are maintained in the face of hardware failures, malevolent attacks and harsh or unstable operating environments. This will be accomplished by providing dynamic fault management measures directly to the software developer and to their varied development environments. The methodology employed here is inspired by previous work in path expressions. This paper discusses the perceived problems, a brief overview of path expressions, the proposed methods, and a discussion of the differences between the proposed methods and traditional path expression usage and implementation.
The costs of nurse turnover: part 1: an economic perspective.
Jones, Cheryl Bland
2004-12-01
Nurse turnover is costly for healthcare organizations. Administrators and nurse executives need a reliable estimate of nurse turnover costs and the origins of those costs if they are to develop effective measures of reducing nurse turnover and its costs. However, determining how to best capture and quantify nurse turnover costs can be challenging. Part 1 of this series conceptualizes nurse turnover via human capital theory and presents an update of a previously developed method for determining the costs of nurse turnover, the Nursing Turnover Cost Calculation Method. Part 2 (January 2005) presents a recent application of the methodology in an acute care hospital.
An approximate Riemann solver for magnetohydrodynamics (that works in more than one dimension)
NASA Technical Reports Server (NTRS)
Powell, Kenneth G.
1994-01-01
An approximate Riemann solver is developed for the governing equations of ideal magnetohydrodynamics (MHD). The Riemann solver has an eight-wave structure, where seven of the waves are those used in previous work on upwind schemes for MHD, and the eighth wave is related to the divergence of the magnetic field. The structure of the eighth wave is not immediately obvious from the governing equations as they are usually written, but arises from a modification of the equations that is presented in this paper. The addition of the eighth wave allows multidimensional MHD problems to be solved without the use of staggered grids or a projection scheme, one or the other of which was necessary in previous work on upwind schemes for MHD. A test problem made up of a shock tube with rotated initial conditions is solved to show that the two-dimensional code yields answers consistent with the one-dimensional methods developed previously.
NASA Astrophysics Data System (ADS)
Anding, K.; Kuritcyn, P.; Garten, D.
2016-11-01
In this paper a new method for the automatic visual inspection of metallic surfaces is proposed, using Convolutional Neural Networks (CNN). Different combinations of network parameters were developed and tested. The CNN results were analysed and compared with the results of our previous investigations, which used color and texture features as input parameters for a Support Vector Machine. Advantages and disadvantages of the different classification methods are explained.
Automated product recovery in a HG-196 photochemical isotope separation process
Grossman, Mark W.; Speer, Richard
1992-01-01
A method of removing deposited product from a photochemical reactor used in the enrichment of ¹⁹⁶Hg has been developed and shown to be effective for rapid recycling of the reactor system. Unlike previous methods, relatively low temperatures are used in a gas and vapor phase process of removal. Importantly, the recovery process is understood in a quantitative manner so that scaling design to larger capacity systems can be easily carried out.
Automated product recovery in a Hg-196 photochemical isotope separation process
Grossman, M.W.; Speer, R.
1992-07-21
A method of removing deposited product from a photochemical reactor used in the enrichment of ¹⁹⁶Hg has been developed and shown to be effective for rapid recycling of the reactor system. Unlike previous methods, relatively low temperatures are used in a gas and vapor phase process of removal. Importantly, the recovery process is understood in a quantitative manner so that scaling design to larger capacity systems can be easily carried out.
Warren, Jamie M; Pawliszyn, Janusz
2011-12-16
For air/headspace analysis, needle trap devices (NTDs) are applicable for sampling a wide range of volatiles, such as benzene and alkanes, and semi-volatile particulate-bound compounds such as pyrene. This paper describes a new NTD that is simpler to produce and improves performance relative to previous NTD designs. An NTD utilizing a side-hole needle used a modified tip, which removed the need for epoxy glue to hold sorbent particles inside the NTD. This design also improved the seal between the NTD and the narrow neck liner of the GC injector, thereby improving the desorption efficiency. A new packing method was developed and evaluated that uses solvent to pack the device, and is compared to NTDs prepared using the previous vacuum aspiration method. The slurry packing method reduced preparation time and improved reproducibility between NTDs. To evaluate the NTDs, automated headspace extraction was completed using benzene, toluene, ethylbenzene, p-xylene (BTEX), anthracene, and pyrene (PAH). NTD geometries evaluated include: blunt tip with side-hole needle, tapered tip with side-hole needle, slider tip with side-hole, dome tapered tip with side-hole, and blunt with no side-hole needle (expanded desorptive flow). Results demonstrate that the tapered and slider tip NTDs performed with improved desorption efficiency.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, Kelly Porter
Key goals towards national biosecurity include methods for analyzing pathogens, predicting their emergence, and developing countermeasures. These goals are served by studying bacterial genes that promote pathogenicity and the pathogenicity islands that mobilize them. Cyberinfrastructure promoting an island database advances this field and enables deeper bioinformatic analysis that may identify novel pathogenicity genes. New automated methods and rich visualizations were developed for identifying pathogenicity islands, based on the principle that islands occur sporadically among closely related strains. The chromosomally-ordered pan-genome organizes all genes from a clade of strains; gaps in this visualization indicate islands, and decorations of the gene matrix facilitate exploration of island gene functions. A "learned phyloblocks" method was developed for automated island identification that trains on the phylogenetic patterns of islands identified by other methods. Learned phyloblocks better defined termini of previously identified islands in multidrug-resistant Klebsiella pneumoniae ATCC BAA-2146, and found its only antibiotic resistance island.
NASA Technical Reports Server (NTRS)
Gallardo, V. C.; Storace, A. S.; Gaffney, E. F.; Bach, L. J.; Stallone, M. J.
1981-01-01
The component element method was used to develop a transient dynamic analysis computer program which is essentially based on modal synthesis combined with a central, finite difference, numerical integration scheme. The methodology leads to a modular or building-block technique that is amenable to computer programming. To verify the analytical method, the turbine engine transient response analysis (TETRA) program was applied to two blade-out test vehicles that had been previously instrumented and tested. Comparison of the time-dependent test data with those predicted by TETRA led to recommendations for refinement or extension of the analytical method to improve its accuracy and overcome its shortcomings. The development of the working equations, their discretization, the numerical solution scheme, the modular concept of engine modelling, the program's logical structure, and some illustrative results are discussed. The blade-loss test vehicles (rig and full engine), the type of measured data, and the engine structural model are described.
User Requirements Based Development of a Web Portal for Chronic Patients.
Kopanitsa, Georgy
2017-01-01
In the current study, we tried to identify practices that help overcome data-entry and operational barriers, and involved patients and doctors in the development process to improve the acceptance of Web portals for chronic patients. This paper presents a follow-up project implementing a Web portal for chronic patients that takes account of previously studied barriers and opportunities. The following methods were applied to facilitate the acceptance of the portal: 1) a joint use case definition and discussion session before starting the development; 2) involvement of the users in prototyping the portal; 3) training of doctors and patients together before the implementation. During the first week of the portal's operation we measured the number of data transactions and the number of active users to compare them with previous experience. In the first weeks of operating the portal, we observed an active contribution of doctors and patients, who submitted vital signs data and recommendations to the portal.
Quantification of false positive reduction in nucleic acid purification on hemorrhagic fever DNA.
DOE Office of Scientific and Technical Information (OSTI.GOV)
James, Conrad D.; Pohl, Kenneth Roy; Derzon, Mark Steven
2006-11-01
Columbia University has developed a sensitive highly multiplexed system for genetic identification of nucleic acid targets. The primary obstacle to implementing this technology is the high rate of false positives due to high levels of unbound reporters that remain within the system after hybridization. The ability to distinguish between free reporters and reporters bound to targets limits the use of this technology. We previously demonstrated a new electrokinetic method for binary separation of kb pair long DNA molecules and oligonucleotides. The purpose of this project 99864 is to take these previous demonstrations and further develop the technique and hardware for field use. Specifically, our objective was to implement separation in a heterogeneous sample (containing target DNA and background oligo), to perform the separation in a flow-based device, and to develop all of the components necessary for field testing a breadboard prototype system.
Bialas, Andrzej
2011-01-01
Intelligent sensors experience security problems very similar to those inherent to other kinds of IT products or systems. The assurance for these products or systems creation methodologies, like Common Criteria (ISO/IEC 15408) can be used to improve the robustness of the sensor systems in high risk environments. The paper presents the background and results of the previous research on patterns-based security specifications and introduces a new ontological approach. The elaborated ontology and knowledge base were validated on the IT security development process dealing with the sensor example. The contribution of the paper concerns the application of the knowledge engineering methodology to the previously developed Common Criteria compliant and pattern-based method for intelligent sensor security development. The issue presented in the paper has a broader significance in terms that it can solve information security problems in many application domains. PMID:22164064
Patwary, Nurmohammed; Preza, Chrysanthe
2015-01-01
A depth-variant (DV) image restoration algorithm for wide field fluorescence microscopy, using an orthonormal basis decomposition of DV point-spread functions (PSFs), is investigated in this study. The efficient PSF representation is based on a previously developed principal component analysis (PCA), which is computationally intensive. We present an approach developed to reduce the number of DV PSFs required for the PCA computation, thereby making the PCA-based approach computationally tractable for thick samples. Restoration results from both synthetic and experimental images show consistency and that the proposed algorithm addresses efficiently depth-induced aberration using a small number of principal components. Comparison of the PCA-based algorithm with a previously-developed strata-based DV restoration algorithm demonstrates that the proposed method improves performance by 50% in terms of accuracy and simultaneously reduces the processing time by 64% using comparable computational resources. PMID:26504634
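As a rough illustration of the PSF representation step described above, the sketch below builds an orthonormal basis for a stack of depth-variant PSFs with PCA and checks how well a few components reconstruct the stack. The synthetic Gaussian PSFs are stand-ins for measured wide-field PSFs; this is not the authors' code.

```python
import numpy as np
from sklearn.decomposition import PCA

# Synthetic depth-variant PSFs: Gaussians whose width grows with depth
# (stand-ins for measured wide-field fluorescence PSFs).
n_depth, size = 40, 33
yy, xx = np.mgrid[-16:17, -16:17]
psfs = np.stack([np.exp(-(xx**2 + yy**2) / (2 * (1.5 + 0.05 * d) ** 2))
                 for d in range(n_depth)])
X = psfs.reshape(n_depth, -1)                 # one flattened PSF per row

pca = PCA(n_components=4).fit(X)              # orthonormal basis of PSF space
coeffs = pca.transform(X)                     # depth-dependent coefficients
approx = pca.inverse_transform(coeffs)        # PSFs rebuilt from 4 components

err = np.linalg.norm(approx - X) / np.linalg.norm(X)
print(f"relative reconstruction error with 4 components: {err:.2e}")
```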
Quantifying Impacts of Urban Growth Potential on Army Training Capabilities
2017-09-12
Building on previous studies of urban growth and population effects on U.S. military installations and combat teams, CAA has developed an iterative process that builds on Military Value Analysis (MVA) models that include a set of attributes. Methods and tools were developed to support a nationwide analysis. This study focused on installations operating training areas that were high…
Design Criteria for Low Profile Flange Calculations
NASA Technical Reports Server (NTRS)
Leimbach, K. R.
1973-01-01
An analytical method and a design procedure to develop flanged separable pipe connectors are discussed. A previously established algorithm is the basis for calculating low profile flanges. The characteristics and advantages of the low profile flange are analyzed. The use of aluminum, titanium, and plastics for flange materials is described. Mathematical models are developed to show the mechanical properties of various flange configurations. A computer program for determining the structural stability of the flanges is described.
Feasibility study for automatic reduction of phase change imagery
NASA Technical Reports Server (NTRS)
Nossaman, G. O.
1971-01-01
The feasibility of automatically reducing a form of pictorial aerodynamic heating data is discussed. The imagery, depicting the melting history of a thin coat of fusible temperature indicator painted on an aerodynamically heated model, was previously reduced by manual methods. Careful examination of various lighting theories and approaches led to an experimentally verified illumination concept capable of yielding high-quality imagery. Both digital and video image processing techniques were applied to reduction of the data, and it was demonstrated that either method can be used to develop superimposed contours. Mathematical techniques were developed to find the model-to-image and the inverse image-to-model transformation using six conjugate points, and methods were developed using these transformations to determine heating rates on the model surface. A video system was designed which is able to reduce the imagery rapidly, economically and accurately. Costs for this system were estimated. A study plan was outlined whereby the mathematical transformation techniques developed to produce model coordinate heating data could be applied to operational software, and methods were discussed and costs estimated for obtaining the digital information necessary for this software.
Glossary of reference terms for alternative test methods and their validation.
Ferrario, Daniele; Brustio, Roberta; Hartung, Thomas
2014-01-01
This glossary was developed to provide technical references to support work in the field of alternatives to animal testing. It was compiled from various existing reference documents coming from different sources and is meant to be a point of reference on alternatives to animal testing. Given the ever-increasing number of alternative test methods and approaches developed over recent decades, a combination, revision, and harmonization of earlier published collections of terms used in the validation of such methods is required. The need to update previous glossary efforts arose from the acknowledgement that new words have emerged with the development of new approaches, while others have become obsolete, and the meaning of some terms has partially changed over time. With this glossary we intend to provide guidance on issues related to the validation of new or updated testing methods consistent with current approaches. Moreover, because of new developments and technologies, a glossary needs to be a living, constantly updated document. An Internet-based version of this compilation may be found at http://altweb.jhsph.edu/, allowing the addition of new material.
Ijzerman, Maarten J; Steuten, Lotte M G
2011-09-01
Worldwide, billions of dollars are invested in medical product development, and there is increasing pressure to maximize the revenues of these investments. That is, governments need to be informed about the benefits of spending public resources, companies need more information to manage their product development portfolios, and even universities may need to direct their research programmes in order to maximize societal benefits. Assuming that all medical products need to be adopted by the heavily regulated healthcare market at some point in time, it is worthwhile to look at the logic behind healthcare decision making, specifically decisions on the coverage of medical products and decisions on the use of these products under competing and uncertain conditions. With the growing tension between leveraging economic growth through R&D spending on the one hand and stricter control of healthcare budgets on the other, several attempts have been made to apply the health technology assessment (HTA) methodology to earlier stages of technology development and implementation. For instance, horizon scanning was introduced to systematically assess emerging technologies in order to inform health policy. Others have introduced iterative economic evaluation, e.g. economic evaluations in earlier stages of clinical research. However, most of these methods are primarily intended to support governments in making decisions regarding potentially expensive new medical products. They do not really inform biomedical product developers about the probability of return on investment, nor do they inform about the market needs and specific requirements of technologies in development. It is precisely this aspect that increasingly receives attention, i.e. whether HTA tools and methods can be used to inform biomedical product development and to anticipate further development and market access. Several methods have been used in previous decades, but have never been compiled in a comprehensive review. The main objective of this article was to provide an overview of previous work and methods in the field of early HTA, and to put these approaches in perspective through a conceptual framework introduced in this paper. A particular goal of the review was to familiarize decision makers with available techniques that can be employed in early-stage decision making, and to identify opportunities for further methodological growth in this emerging field of HTA.
NASA Technical Reports Server (NTRS)
Ford, Hugh; Turner, C. E.; Fenner, R. T.; Curr, R. M.; Ivankovic, A.
1995-01-01
The objects of the first, exploratory, stage of the project were listed as: (1) to make a detailed and critical review of the Boundary Element method as already published and with regard to elastic-plastic fracture mechanics, to assess its potential for handling present concepts in two-dimensional and three-dimensional cases. To this was subsequently added the Finite Volume method and certain aspects of the Finite Element method for comparative purposes; (2) to assess the further steps needed to apply the methods so far developed to the general field, covering a practical range of geometries, work hardening materials, and composites: to consider their application under higher temperature conditions; (3) to re-assess the present stage of development of the energy dissipation rate, crack tip opening angle and J-integral models in relation to the possibilities of producing a unified technology with the previous two items; and (4) to report on the feasibility and promise of this combined approach and, if appropriate, make recommendations for the second stage aimed at developing a generalized crack growth technology for its application to real-life problems.
Fluid-structure interaction simulations of deformable structures with non-linear thin shell elements
NASA Astrophysics Data System (ADS)
Asgharzadeh, Hafez; Hedayat, Mohammadali; Borazjani, Iman; Scientific Computing; Biofluids Laboratory Team
2017-11-01
Large deformation of structures in a fluid is simulated using a strongly coupled partitioned fluid-structure interaction (FSI) approach which is stabilized with under-relaxation and the Aitken acceleration technique. The fluid is simulated using a recently developed implicit Newton-Krylov method with a novel analytical Jacobian. Structures are simulated using a triangular thin-shell finite element formulation, which considers only translational degrees of freedom. The thin-shell method is built on top of a previously implemented membrane finite element formulation. A sharp-interface immersed boundary method is used to handle structures in the fluid domain. The developed FSI framework is validated against two three-dimensional experiments: (1) flexible aquatic vegetation in a fluid and (2) a heaving flexible panel in a fluid. Furthermore, the developed FSI framework is used to simulate tissue heart valves, which involve large deformations and non-linear material properties. This work was supported by American Heart Association (AHA) Grant 13SDG17220022 and the Center of Computational Research (CCR) of University at Buffalo.
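The under-relaxation with Aitken acceleration mentioned above follows a standard dynamic-relaxation recipe for partitioned coupling. A generic sketch, with a contractive toy map standing in for the fluid and structure solvers (not the authors' solver), is:

```python
import numpy as np

def aitken_fsi(fluid_then_structure, d0, omega0=0.5, tol=1e-8, max_iter=100):
    """Fixed-point FSI iteration d <- S(F(d)) with Aitken dynamic relaxation.
    `fluid_then_structure` maps an interface displacement to a new one."""
    d = d0.copy()
    omega = omega0
    r_prev = None
    for k in range(max_iter):
        r = fluid_then_structure(d) - d          # interface residual
        if np.linalg.norm(r) < tol:
            return d, k
        if r_prev is not None:
            dr = r - r_prev
            # Aitken update: omega_k = -omega_{k-1} * (r_{k-1} . dr) / |dr|^2
            omega = -omega * (r_prev @ dr) / (dr @ dr)
        d = d + omega * r                        # under-relaxed update
        r_prev = r
    return d, max_iter

# Toy "coupled solver": a contractive map standing in for fluid + structure
g = lambda d: 0.3 * np.tanh(d) + 1.0
sol, iters = aitken_fsi(g, d0=np.zeros(3))
print(sol, iters)
```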
Combinatorial Strategies for the Development of Bulk Metallic Glasses
NASA Astrophysics Data System (ADS)
Ding, Shiyan
The systematic identification of multi-component alloys out of the vast composition space is still a daunting task, especially in the development of bulk metallic glasses, which are typically based on three or more elements. In order to address this challenge, combinatorial approaches have been proposed. However, previous attempts have not successfully coupled the synthesis of combinatorial libraries with high-throughput characterization methods. The goal of my dissertation is to develop efficient high-throughput characterization methods, optimized to identify glass formers systematically. Here, two innovative approaches have been developed. One is to measure the nucleation temperature in parallel for up to 800 compositions. The composition with the lowest nucleation temperature agrees reasonably well with the best-known glass-forming composition. In addition, the thermoplastic formability of a metallic glass forming system is determined by blow molding a compositional library. Our results reveal that the composition with the largest thermoplastic deformation correlates well with the best-known formability composition. I have demonstrated that both methods are powerful tools for developing new bulk metallic glasses.
Exploiting MeSH indexing in MEDLINE to generate a data set for word sense disambiguation
2011-01-01
Background Evaluation of Word Sense Disambiguation (WSD) methods in the biomedical domain is difficult because the available resources are either too small or too focused on specific types of entities (e.g. diseases or genes). We present a method that can be used to automatically develop a WSD test collection using the Unified Medical Language System (UMLS) Metathesaurus and the manual MeSH indexing of MEDLINE. We demonstrate the use of this method by developing such a data set, called MSH WSD. Methods In our method, the Metathesaurus is first screened to identify ambiguous terms whose possible senses consist of two or more MeSH headings. We then use each ambiguous term and its corresponding MeSH heading to extract MEDLINE citations where the term and only one of the MeSH headings co-occur. The term found in the MEDLINE citation is automatically assigned the UMLS CUI linked to the MeSH heading. Each instance has been assigned a UMLS Concept Unique Identifier (CUI). We compare the characteristics of the MSH WSD data set to the previously existing NLM WSD data set. Results The resulting MSH WSD data set consists of 106 ambiguous abbreviations, 88 ambiguous terms and 9 which are a combination of both, for a total of 203 ambiguous entities. For each ambiguous term/abbreviation, the data set contains a maximum of 100 instances per sense obtained from MEDLINE. We evaluated the reliability of the MSH WSD data set using existing knowledge-based methods and compared their performance to that of the results previously obtained by these algorithms on the pre-existing data set, NLM WSD. We show that the knowledge-based methods achieve different results but keep their relative performance except for the Journal Descriptor Indexing (JDI) method, whose performance is below the other methods. Conclusions The MSH WSD data set allows the evaluation of WSD algorithms in the biomedical domain. Compared to previously existing data sets, MSH WSD contains a larger number of biomedical terms/abbreviations and covers the largest set of UMLS Semantic Types. Furthermore, the MSH WSD data set has been generated automatically reusing already existing annotations and, therefore, can be regenerated from subsequent UMLS versions. PMID:21635749
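The core labeling rule of the method can be paraphrased in a few lines: a MEDLINE citation yields an instance for an ambiguous term only when exactly one of the term's candidate MeSH headings appears in its indexing, and that heading's linked CUI becomes the sense label. The toy corpus, headings, and CUIs below are illustrative, not taken from the actual data set.

```python
# Hedged sketch of the MSH WSD labeling rule on a toy corpus: a citation is
# usable for an ambiguous term iff exactly one of the term's candidate MeSH
# headings appears among its indexing; that heading's CUI becomes the label.
term = "cold"
sense_headings = {"Common Cold": "C0009443", "Cold Temperature": "C0009264"}

citations = [
    {"text": "cold symptoms resolved after a week", "mesh": {"Common Cold"}},
    {"text": "cells stored under cold conditions", "mesh": {"Cold Temperature"}},
    {"text": "cold exposure and the common cold", "mesh": {"Common Cold", "Cold Temperature"}},
]

instances = []
for cit in citations:
    hits = cit["mesh"] & sense_headings.keys()
    if term in cit["text"] and len(hits) == 1:   # exactly one candidate heading
        heading = hits.pop()
        instances.append((cit["text"], sense_headings[heading]))

print(instances)   # the third citation is discarded as still ambiguous
```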
van Dijk, Kor-jent; Mellors, Jane; Waycott, Michelle
2014-01-01
• Premise of the study: New microsatellites were developed for the seagrass Thalassia hemprichii (Hydrocharitaceae), a long-lived seagrass species that is found throughout the shallow waters of tropical and subtropical Indo-West Pacific. Three multiplex PCR panels were designed utilizing new and previously developed markers, resulting in a toolkit for generating a 16-locus genotype. • Methods and Results: Through the use of microsatellite enrichment and next-generation sequencing, 16 new, validated, polymorphic microsatellite markers were isolated. Diversity was between two and four alleles per locus totaling 36 alleles. These markers, plus previously developed microsatellite markers for T. hemprichii and T. testudinum, were tested for suitability in multiplex PCR panels. • Conclusions: The generation of an easily replicated suite of multiplex panels of codominant molecular markers will allow for high-resolution and detailed genetic structure analysis and clonality assessment with minimal genotyping costs. We suggest the establishment of a T. hemprichii primer convention for the unification of future data sets. PMID:25383269
Advances in dual-tone development for pitch frequency doubling
NASA Astrophysics Data System (ADS)
Fonseca, Carlos; Somervell, Mark; Scheer, Steven; Kuwahara, Yuhei; Nafus, Kathleen; Gronheid, Roel; Tarutani, Shinji; Enomoto, Yuuichiro
2010-04-01
Dual-tone development (DTD) has previously been proposed as a potential cost-effective double patterning technique [1]. DTD was reported as early as the late 1990s [2]. The basic principle of dual-tone imaging involves processing exposed resist latent images in both positive tone (aqueous base) and negative tone (organic solvent) developers. Conceptually, DTD has attractive cost benefits since it enables pitch doubling without the need for multiple etch steps of patterned resist layers. While the concept of the DTD technique is simple to understand, there are many challenges that must be overcome and understood in order to make it a manufacturing solution. Previous work by the authors demonstrated feasibility of DTD imaging for 50 nm half-pitch features at 0.80 NA (k1 = 0.21) and discussed challenges lying ahead for printing sub-40 nm half-pitch features with DTD. While previous experimental results suggested that clever processing on the wafer track can be used to enable DTD beyond 50 nm half-pitch, they also suggest that identifying suitable resist materials or chemistries is essential for achieving successful imaging results with novel resist processing methods on the wafer track. In this work, we present recent advances in the search for resist materials that work in conjunction with novel resist processing methods on the wafer track to enable DTD. Recent experimental results with new resist chemistries, specifically designed for DTD, are presented. We also present simulation studies that help identify resist properties that could enable DTD imaging, ultimately leading to viable DTD resist materials.
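The pitch-doubling principle stated above can be seen in a toy numerical model: clearing the resist both where the aerial-image dose exceeds a positive-tone threshold and where it falls below a negative-tone threshold leaves two resist lines per illumination period. The sinusoidal dose profile and thresholds below are illustrative assumptions, not a calibrated resist model.

```python
import numpy as np

# One period of a sinusoidal aerial image (normalized dose)
x = np.linspace(0, 2 * np.pi, 400, endpoint=False)
dose = 0.5 + 0.5 * np.sin(x)

t_pos, t_neg = 0.7, 0.3          # positive- and negative-tone thresholds
removed = (dose > t_pos) | (dose < t_neg)    # cleared by either developer
resist = ~removed                            # what remains on the wafer

# Count remaining resist lines per illumination period (rising edges,
# with the wrap-around joining the segments at the two ends)
edges = np.diff(resist.astype(int))
print("lines per period:", np.sum(edges == 1))   # 2 -> pitch is doubled
```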
Use of Web-Based Portfolios as Tools for Reflection in Preservice Teacher Education
ERIC Educational Resources Information Center
Oner, Diler; Adadan, Emine
2011-01-01
This mixed-methods study examined the use of web-based portfolios for developing preservice teachers' reflective skills. Building on the work of previous research, the authors proposed a set of reflection-based tasks to enrich preservice teachers' internship experiences. Their purpose was to identify (a) whether preservice teachers demonstrated…
A Mixed Method Case Study on Learner Engagement in e-Learning Professional Training
ERIC Educational Resources Information Center
Zhao, Jane Yanfeng
2014-01-01
Previous research showed that learners' reluctance in participating in e-Learning training is a major obstacle in achieving training objectives. This study focused on learners' e-Learning engagement in professional training in the chapter of the American Society for Training and Development (ASTD). The participants were 21 chapter members.…
ERIC Educational Resources Information Center
Ford, Julie Dyke; Bracken, Jennifer L.; Wilson, Gregory D.
2009-01-01
This article addresses previous arguments that call for increased emphasis on research in technical communication programs. Focusing on the value of scholarly-based research at the undergraduate level, we present New Mexico Tech's thesis model as an example of helping students develop familiarity with research skills and methods. This two-semester…
Updated generalized biomass equations for North American tree species
David C. Chojnacky; Linda S. Heath; Jennifer C. Jenkins
2014-01-01
Historically, tree biomass at large scales has been estimated by applying dimensional analysis techniques and field measurements such as diameter at breast height (dbh) in allometric regression equations. Equations often have been developed using differing methods and applied only to certain species or isolated areas. We previously had compiled and combined (in meta-...
Asteroid mass estimation with Markov-chain Monte Carlo
NASA Astrophysics Data System (ADS)
Siltala, L.; Granvik, M.
2017-09-01
We have developed a new Markov-chain Monte Carlo-based algorithm for asteroid mass estimation based on mutual encounters and tested it for several different asteroids. Our results are in line with previous literature values but suggest that uncertainties of prior estimates may be misleading as a consequence of using linearized methods.
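For orientation, the MCMC machinery referred to above reduces in its simplest form to a random-walk Metropolis sampler over the mass parameter. The sketch below uses a toy Gaussian log-posterior in place of the real astrometric likelihood of mutual encounters; units and numbers are illustrative.

```python
import numpy as np

def metropolis(log_post, x0, step, n=20000, rng=None):
    """Random-walk Metropolis sampler for a 1-D posterior."""
    rng = rng or np.random.default_rng(1)
    chain = np.empty(n)
    x, lp = x0, log_post(x0)
    for i in range(n):
        xp = x + step * rng.standard_normal()
        lpp = log_post(xp)
        if np.log(rng.random()) < lpp - lp:   # accept/reject
            x, lp = xp, lpp
        chain[i] = x
    return chain

# Toy problem: posterior of an asteroid mass (arbitrary units) given a
# pseudo-observation; a Gaussian stands in for the real orbital likelihood.
log_post = lambda m: -0.5 * ((m - 4.2) / 0.6) ** 2 if m > 0 else -np.inf
chain = metropolis(log_post, x0=1.0, step=0.5)
burn = chain[5000:]                            # discard burn-in
print(f"mass = {burn.mean():.2f} +/- {burn.std():.2f}")
```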
ERIC Educational Resources Information Center
Haebig, Eileen; Leonard, Laurence; Usler, Evan; Deevy, Patricia; Weber, Christine
2018-01-01
Purpose: Previous behavioral studies have found deficits in lexical--semantic abilities in children with specific language impairment (SLI), including reduced depth and breadth of word knowledge. This study explored the neural correlates of early emerging familiar word processing in preschoolers with SLI and typical development. Method: Fifteen…
Female Anastrepha suspensa (Loew) response to the vibration component of male wing-fanning signals
USDA-ARS?s Scientific Manuscript database
Anastrepha suspensa (Loew) is an important pest of fruit crops in Florida and islands in the Caribbean region. Courtship and mating behaviors have been analyzed in previous studies to develop control methods. During courtship, males group in leks on leaves of host trees, fan their wings, and release...
Rapid purification of fluorescent enzymes by ultrafiltration
NASA Technical Reports Server (NTRS)
Benjaminson, M. A.; Satyanarayana, T.
1983-01-01
In order to expedite the preparation of fluorescently tagged enzymes for histo-cytochemistry, a previously developed method employing gel column purification was compared with a more rapid modern technique using the Millipore Immersible CX-ultrafilter. Microscopic evaluation of the resulting conjugates showed comparable products. Much time and effort is saved using the new technique.
School Nutrition Directors are Receptive to Web-Based Training Opportunities: A National Survey
ERIC Educational Resources Information Center
Zoellner, Jamie; Carr, Deborah H.
2009-01-01
Purpose/Objective: The purpose of this study was to investigate school nutrition directors' (SNDs) previous experience with web-based training (WBT), interest in utilizing WBT within 14 functional areas, and logistical issues (time, price, educational credits, etc.) of developing and delivering WBT learning modules. Methods: A survey was developed…
Deal or No Deal? Evaluating Big Deals and Their Journals
ERIC Educational Resources Information Center
Blecic, Deborah D.; Wiberley, Stephen E., Jr.; Fiscella, Joan B.; Bahnmaier-Blaszczak, Sara; Lowery, Rebecca
2013-01-01
This paper presents methods to develop metrics that compare Big Deal journal packages and the journals within those packages. Deal-level metrics guide selection of a Big Deal for termination. Journal-level metrics guide selection of individual subscriptions from journals previously provided by a terminated deal. The paper argues that, while the…
Cooperative-Experiential Learning: Using Student-Developed Games to Increase Knowledge Retention
ERIC Educational Resources Information Center
Camp, Kerri M.; Avery, Sherry; Lirely, Roger
2012-01-01
Previous literature has discussed the use of cooperative and experiential learning as a means of augmenting student involvement in the learning process. Teamwork has been one method of employing cooperative learning and having students play games has been used extensively in experiential learning approaches. Often the two pedagogies are employed…
Do Fine Motor Skills Contribute to Early Reading Development?
ERIC Educational Resources Information Center
Suggate, Sebastian; Pufke, Eva; Stoeger, Heidrun
2018-01-01
Background: Little is known about how fine motor skills (FMS) relate to early literacy skills, especially over and above cognitive variables. Moreover, a lack of distinction between FMS, grapho-motor and writing skills may have hampered previous work. Method: In Germany, kindergartners (n = 144, aged 6;1) were recruited before beginning formal…
The Influence of Trust in Principals' Mentoring Experiences across Different Career Phases
ERIC Educational Resources Information Center
Bakioglu, Aysen; Hacifazlioglu, Ozge; Ozcan, Kenan
2010-01-01
The purpose of this study is to examine the perceptions of primary school principals about the influence of "trust" in their mentoring experiences. Both quantitative and qualitative methods were used in the study. The Primary School Principals' Mentoring Questionnaire previously developed by the researchers was applied to 1462 primary…
ERIC Educational Resources Information Center
Hasselbring, Ted S.; Lewis, Preston; Bausch, Margaret E.; Axelson, Mary; Kay, Ken; Honey, Margaret
2005-01-01
This final edition focuses on two global issues that encompass most, if not all, of the topics explored in previous issues. With an emphasis on academic success for all children, the authors look at a method of universally designed assessment developed by the Kentucky Department of Education and based on the tenets of universal design for…
NASA Astrophysics Data System (ADS)
Eckert, R.; Neyhart, J. T.; Burd, L.; Polikar, R.; Mandayam, S. A.; Tseng, M.
2003-03-01
Mammography is the best non-invasive technique available for the early detection of breast cancer. The radiographic appearance of the female breast consists of radiolucent (dark) regions due to fat and radiodense (light) regions due to connective and epithelial tissue. The amount of radiodense tissue can be used as a marker for predicting breast cancer risk. Previously, we have shown that the use of statistical models is a reliable technique for segmenting radiodense tissue. This paper presents improvements in the model that allow for further development of an automated system for segmentation of radiodense tissue. The segmentation algorithm employs a two-step process. In the first step, tissue and non-tissue regions of a digitized X-ray mammogram image are identified using a radial basis function neural network. The second step uses a constrained Neyman-Pearson algorithm, developed especially for this research work, to determine the amount of radiodense tissue. Results obtained using the algorithm have been validated by comparison with estimates provided by a radiologist employing previously established methods.
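Generically, a Neyman-Pearson formulation fixes the false-positive rate and maximizes detection subject to that constraint. A toy thresholding sketch of that idea, with synthetic pixel scores standing in for the real score map (not the authors' constrained algorithm), is:

```python
import numpy as np

rng = np.random.default_rng(0)
# Per-pixel scores for known non-dense and dense tissue (toy stand-ins)
scores_neg = rng.normal(0.0, 1.0, 5000)
scores_pos = rng.normal(1.8, 1.0, 5000)

alpha = 0.05   # allowed false-positive rate (the Neyman-Pearson constraint)
thresh = np.quantile(scores_neg, 1 - alpha)   # caps FPR at alpha
tpr = np.mean(scores_pos > thresh)            # detection rate at that threshold
print(f"threshold={thresh:.2f}, detection rate={tpr:.2f}")
```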
Hybrid finite element and Brownian dynamics method for charged particles
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huber, Gary A., E-mail: ghuber@ucsd.edu; Miao, Yinglong; Zhou, Shenggao
2016-04-28
Diffusion is often the rate-determining step in many biological processes. Currently, the two main computational methods for studying diffusion are stochastic methods, such as Brownian dynamics, and continuum methods, such as the finite element method. A previous study introduced a new hybrid diffusion method that couples the strengths of each of these two methods, but was limited by the lack of interactions among the particles; the force on each particle had to be from an external field. This study further develops the method to allow charged particles. The method is derived for a general multidimensional system and is presented using a basic test case for a one-dimensional linear system with one charged species and a radially symmetric system with three charged species.
Flexible functional regression methods for estimating individualized treatment regimes.
Ciarleglio, Adam; Petkova, Eva; Tarpey, Thaddeus; Ogden, R Todd
2016-01-01
A major focus of personalized medicine is on the development of individualized treatment rules. Good decision rules have the potential to significantly advance patient care and reduce the burden of a host of diseases. Statistical methods for developing such rules are progressing rapidly, but few methods have considered the use of pre-treatment functional data to guide decision-making. Furthermore, those methods that do allow for the incorporation of functional pre-treatment covariates typically make strong assumptions about the relationships between the functional covariates and the response of interest. We propose two approaches for using functional data to select an optimal treatment that address some of the shortcomings of previously developed methods. Specifically, we combine the flexibility of functional additive regression models with Q-learning or A-learning in order to obtain treatment decision rules. Properties of the corresponding estimators are discussed. Our approaches are evaluated in several realistic settings using synthetic data and are applied to real data arising from a clinical trial comparing two treatments for major depressive disorder in which baseline imaging data are available for subjects who are subsequently treated.
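In the single-stage case, Q-learning for treatment selection amounts to regressing the outcome on covariates, treatment, and their interaction, then picking the treatment with the larger predicted outcome. A minimal sketch with a scalar covariate standing in for the functional predictors (an illustrative simplification of the paper's functional additive models, with simulated data) is:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=(n, 1))              # scalar stand-in for a functional covariate
a = rng.integers(0, 2, size=n)           # randomized treatment 0/1
y = 1.0 + 0.5 * x[:, 0] + a * (0.8 - 1.5 * x[:, 0]) + rng.normal(scale=0.5, size=n)

# Q-learning, single stage: model E[Y | x, a] with a treatment interaction
X = np.column_stack([x[:, 0], a, a * x[:, 0]])
q = LinearRegression().fit(X, y)

def best_treatment(x_new):
    """Pick the treatment with the larger predicted outcome."""
    q0 = q.predict([[x_new, 0, 0.0]])[0]
    q1 = q.predict([[x_new, 1, x_new]])[0]
    return int(q1 > q0)

print(best_treatment(-1.0), best_treatment(1.0))   # rule flips near x ~ 0.53
```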
Generalization of the Poincare sphere to process 2D displacement signals
NASA Astrophysics Data System (ADS)
Sciammarella, Cesar A.; Lamberti, Luciano
2017-06-01
Traditionally the multiple-phase method has been considered an essential tool for phase information recovery. The in-quadrature phase method, theoretically an alternative pathway to the same goal, has failed in actual applications. In a previous paper dealing with 1D signals, the authors showed that, properly implemented, the in-quadrature method yields phase values with the same accuracy as the multiple-phase method. The present paper extends the methodology developed in 1D to 2D. This extension is not a straightforward process and requires the introduction of a number of additional concepts and developments. The concept of the monogenic function provides the tools required for the extension process. The monogenic function has a graphic representation through the Poincare sphere, familiar in the field of Photoelasticity and, through the developments introduced in this paper, connected to the analysis of displacement fringe patterns. The paper is illustrated with examples of application that show that the multiple-phase and in-quadrature methods are two aspects of the same basic theoretical model.
A comparison of high-frequency cross-correlation measures
NASA Astrophysics Data System (ADS)
Precup, Ovidiu V.; Iori, Giulia
2004-12-01
On a high-frequency scale, time series are not homogeneous, so standard correlation measures cannot be applied directly to the raw data. There are two ways to deal with this problem. The time series can be homogenised through an interpolation method (An Introduction to High-Frequency Finance, Academic Press, NY, 2001), either linear or previous-tick, and the Pearson correlation statistic then computed. Recently, methods that can handle raw non-synchronous time series have been developed (Int. J. Theor. Appl. Finance 6(1) (2003) 87; J. Empirical Finance 4 (1997) 259). This paper compares two traditional methods that use interpolation with an alternative method applied directly to the actual time series.
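The interpolation route can be sketched in a few lines of pandas: homogenize each irregular tick series onto a regular grid with previous-tick (last-value) interpolation, then compute the Pearson correlation of the aligned returns. The synthetic tick generator below is an illustrative stand-in for real data; independent random walks should give a correlation near zero.

```python
import numpy as np
import pandas as pd

def ticks(n, seed):
    """Irregularly time-stamped price series (toy stand-in for tick data)."""
    r = np.random.default_rng(seed)
    t = pd.to_datetime("2024-01-02 09:30") + pd.to_timedelta(
        np.cumsum(r.exponential(2.0, n)), unit="s")
    return pd.Series(100 + np.cumsum(r.normal(0, 0.01, n)), index=t)

a, b = ticks(5000, 1), ticks(5000, 2)

# Previous-tick homogenization onto a regular 5-second grid
grid_a = a.resample("5s").last().ffill()
grid_b = b.resample("5s").last().ffill()

ra, rb = grid_a.pct_change().dropna(), grid_b.pct_change().dropna()
common = ra.index.intersection(rb.index)
print("Pearson correlation:", ra[common].corr(rb[common]))
```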
Comparison of Four Methods for Teaching Phases of the Moon
NASA Astrophysics Data System (ADS)
Upton, Brianna; Cid, Ximena; Lopez, Ramon
2008-03-01
Previous studies have shown that many students have misconceptions about basic concepts in astronomy. As a consequence, various interactive engagement methods have been developed for introductory astronomy. We will present the results of a study that compares four different teaching methods for the subject of the phases of the Moon, which is well known to produce student difficulties. We compare a fairly traditional didactic approach, the use of manipulatives (moonballs) in lecture, the University of Arizona Lecture Tutorials, and an interactive computer program used in a didactic fashion. We use pre- and post-testing with the Lunar Phase Concept Inventory to determine the relative effectiveness of these methods.
Zone plate method for electronic holographic display using resolution redistribution technique.
Takaki, Yasuhiro; Nakamura, Junya
2011-07-18
The resolution redistribution (RR) technique can increase the horizontal viewing-zone angle and screen size of electronic holographic display. The present study developed a zone plate method that would reduce hologram calculation time for the RR technique. This method enables calculation of an image displayed on a spatial light modulator by performing additions of the zone plates, while the previous calculation method required performing the Fourier transform twice. The derivation and modeling of the zone plate are shown. In addition, the look-up table approach was introduced for further reduction in computation time. Experimental verification using a holographic display module based on the RR technique is presented.
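A rough numerical illustration of the additions-only idea: precompute one Fresnel zone plate per quantized depth in a look-up table, then build the hologram of a point cloud by shifting and summing table entries. The parameters, kernel form (the real part of the point-source kernel), and point cloud below are illustrative assumptions, not the display module's actual configuration.

```python
import numpy as np

W, H, pitch, wl = 256, 256, 8e-6, 532e-9     # SLM size, pixel pitch, wavelength
yy, xx = np.mgrid[0:H, 0:W]
x = (xx - W / 2) * pitch
y = (yy - H / 2) * pitch

def zone_plate(z):
    """Fresnel zone plate (real part of the point-source kernel) at depth z."""
    return np.cos(np.pi * (x**2 + y**2) / (wl * z))

# Look-up table: one zone plate per quantized depth
depths = np.linspace(0.05, 0.15, 11)
lut = {round(z, 4): zone_plate(z) for z in depths}

# Hologram of a 3-D point cloud = shifted/added LUT entries (additions only)
points = [(-20, 10, 0.05), (30, -5, 0.10), (0, 0, 0.15)]   # (dx px, dy px, z m)
holo = np.zeros((H, W))
for dx, dy, z in points:
    holo += np.roll(lut[round(z, 4)], shift=(dy, dx), axis=(0, 1))
print(holo.shape, holo.min(), holo.max())
```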
Efficient Construction of Discrete Adjoint Operators on Unstructured Grids Using Complex Variables
NASA Technical Reports Server (NTRS)
Nielsen, Eric J.; Kleb, William L.
2005-01-01
A methodology is developed and implemented to mitigate the lengthy software development cycle typically associated with constructing a discrete adjoint solver for aerodynamic simulations. The approach is based on a complex-variable formulation that enables straightforward differentiation of complicated real-valued functions. An automated scripting process is used to create the complex-variable form of the set of discrete equations. An efficient method for assembling the residual and cost function linearizations is developed. The accuracy of the implementation is verified through comparisons with a discrete direct method as well as a previously developed handcoded discrete adjoint approach. Comparisons are also shown for a large-scale configuration to establish the computational efficiency of the present scheme. To ultimately demonstrate the power of the approach, the implementation is extended to high temperature gas flows in chemical nonequilibrium. Finally, several fruitful research and development avenues enabled by the current work are suggested.
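The complex-variable formulation this abstract relies on is the complex-step derivative: for a real-analytic f, f'(x) = Im(f(x + ih))/h + O(h²), with no subtractive cancellation, so h can be made arbitrarily small. A minimal demonstration on a classic test function (not the authors' adjoint machinery) is:

```python
import numpy as np

def complex_step(f, x, h=1e-30):
    """Derivative via the complex step: f'(x) = Im(f(x + i*h)) / h + O(h^2)."""
    return np.imag(f(x + 1j * h)) / h

# Classic test function: smooth but prone to round-off in finite differences
f = lambda x: np.exp(x) / np.sqrt(np.sin(x) ** 3 + np.cos(x) ** 3)

x0 = 1.5
print(complex_step(f, x0))            # derivative to full double precision
fd = (f(x0 + 1e-8) - f(x0)) / 1e-8    # forward difference: ~half the digits
print(fd)
```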
Assurance of Complex Electronics. What Path Do We Take?
NASA Technical Reports Server (NTRS)
Plastow, Richard A.
2007-01-01
Many of the methods used to develop software bear a close resemblance to Complex Electronics (CE) development. CE are now programmed to perform tasks that were previously handled in software, such as communication protocols. For instance, Field Programmable Gate Arrays (FPGAs) can have over a million logic gates, while system-on-chip (SOC) devices can combine a microprocessor, input and output channels, and sometimes an FPGA for programmability. With this increased intricacy, the possibility of "software-like" bugs, such as incorrect design or logic and unexpected interactions within the logic, is great. Since CE devices are obscuring the hardware/software boundary, we propose that mature software methodologies may be utilized, with slight modifications, to develop these devices. By using standardized software engineering methods such as checklists, missing requirements and bugs can be detected earlier in the development cycle, creating a development process for CE that is easily maintained and configurable based on the device used.
The Factors that Affect Science Teachers' Participation in Professional Development
NASA Astrophysics Data System (ADS)
Roux, Judi Ann
Scientific literacy for our students and the possibilities for careers available in Science, Technology, Engineering, and Mathematics (STEM) areas are important topics for economic growth as well as global competitiveness. The achievement of students in science learning depends upon science teachers' effectiveness, and experienced science teachers depend upon relevant professional development experiences to support their learning. In order to understand how to improve student learning in science, the learning of science teachers must also be understood. Previous research studies on teacher professional development have been conducted in other states, but Minnesota science teachers comprise a new and different population from those previously studied. The purpose of this two-phase mixed-methods study was to identify the current types of professional development in which experienced Minnesota secondary science teachers participated and the factors that affect their participation in professional development activities. The mixed-methods approach utilized an initial online survey followed by qualitative interviews with five survey respondents. The results of the quantitative survey and the qualitative interviews indicated the quality of professional development experiences and the factors which affected the science teachers' participation in professional development activities. The supporting and inhibiting factors involved the availability of resources such as time and money, external relationships with school administrators, teacher colleagues, and family members, and personal intrinsic attributes such as desires to learn and help students. This study also describes implications for science teachers, school administrators, policymakers, and professional development providers. Recommendations for future research include the following areas: relationships between and among intrinsic and extrinsic factors, science-related professional development activities within local school districts, the use of formal and informal professional development, and the needs of rural science teachers compared to urban and suburban teachers.
Identification of informative features for predicting proinflammatory potentials of engine exhausts.
Wang, Chia-Chi; Lin, Ying-Chi; Lin, Yuan-Chung; Jhang, Syu-Ruei; Tung, Chun-Wei
2017-08-18
The immunotoxicity of engine exhausts is of high concern to human health due to the increasing prevalence of immune-related diseases. However, the evaluation of immunotoxicity of engine exhausts is currently based on expensive and time-consuming experiments. It is therefore desirable to develop efficient methods for immunotoxicity assessment. To accelerate the development of safe alternative fuels, this study proposed a computational method for identifying informative features for predicting proinflammatory potentials of engine exhausts. A principal component regression (PCR) algorithm was applied to develop prediction models. The informative features were identified by a sequential backward feature elimination (SBFE) algorithm. A total of 19 informative chemical and biological features were successfully identified by the SBFE algorithm. The informative features were utilized to develop a computational method named FS-CBM for predicting proinflammatory potentials of engine exhausts. The FS-CBM model achieved high performance, with correlation coefficient values of 0.997 and 0.943 obtained on training and independent test sets, respectively. The FS-CBM model was developed for predicting proinflammatory potentials of engine exhausts with a large improvement in prediction performance compared with our previous CBM model. The proposed method could be further applied to construct models for bioactivities of mixtures.
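A hedged sketch of the pipeline named above: principal component regression as the learner, with sequential backward feature elimination that drops a feature whenever doing so does not reduce the cross-validated correlation. The synthetic data stand in for the chemical and biological features; this is not the published FS-CBM code.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 12))                  # 12 candidate chemical/bio features
y = X[:, 0] - 2 * X[:, 3] + rng.normal(scale=0.3, size=60)

def cv_corr(cols):
    """Cross-validated correlation of a PCR model restricted to `cols`."""
    model = make_pipeline(PCA(n_components=min(5, len(cols))), LinearRegression())
    pred = cross_val_predict(model, X[:, cols], y, cv=5)
    return np.corrcoef(pred, y)[0, 1]

cols = list(range(X.shape[1]))
score = cv_corr(cols)
while len(cols) > 1:                           # sequential backward elimination
    trials = [(cv_corr([c for c in cols if c != d]), d) for d in cols]
    best, drop = max(trials)
    if best < score:                           # stop when removals start to hurt
        break
    score, cols = best, [c for c in cols if c != drop]

print("informative features:", cols, "cv corr:", round(score, 3))
```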
Chen, C L; Kaber, D B; Dempsey, P G
2000-06-01
A new and improved method for feedforward neural network (FNN) development, applied to data classification problems such as the prediction of levels of low-back disorder (LBD) risk associated with industrial jobs, is presented. Background on FNN development for data classification is provided, along with discussions of previous research and neighborhood (local) solution search methods for hard combinatorial problems. An analytical study is presented which compared the prediction accuracy of a FNN based on an error-back propagation (EBP) algorithm with the accuracy of a FNN developed by considering results of local solution search (simulated annealing) for classifying industrial jobs as posing low or high risk for LBDs. The comparison demonstrated superior performance of the FNN generated using the new method. The architecture of this FNN included fewer input (predictor) variables and hidden neurons than the FNN developed based on the EBP algorithm. Independent variable selection methods and the phenomenon of 'overfitting' in FNN (and statistical model) generation for data classification are discussed. The results are supportive of the use of the new approach to FNN development for applications to musculoskeletal disorders and risk forecasting in other domains.
Center index method-an alternative for wear measurements with radiostereometry (RSA).
Dahl, Jon; Figved, Wender; Snorrason, Finnur; Nordsletten, Lars; Röhrl, Stephan M
2013-03-01
Radiostereometry (RSA) is considered to be the most precise and accurate method for wear measurements in total hip replacement. Post-operative stereoradiographs have so far been necessary for wear measurement; hence, the use of RSA has been limited to studies planned for RSA measurements. We compared a new RSA method for wear measurements that does not require previous radiographs with conventional RSA. Instead of comparing present stereoradiographs with post-operative ones, we developed a method for calculating the post-operative position of the center of the femoral head on the present examination and using this as the index measurement. We compared this alternative method to conventional RSA in 27 hips in an ongoing RSA study. We found a high degree of agreement between the methods for both mean proximal (1.19 mm vs. 1.14 mm) and mean 3D wear (1.52 mm vs. 1.44 mm) after 10 years. Intraclass correlation coefficients (ICC) were 0.958 and 0.955, respectively (p<0.001 for both ICCs). The results were also within the limits of agreement when plotted subject-by-subject in a Bland-Altman plot. Our alternative method for wear measurements with RSA offers results comparable to conventional RSA measurements and allows precise wear measurement without previous radiological examinations. Copyright © 2012 Orthopaedic Research Society.
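The agreement statistics reported above (intraclass correlation and Bland-Altman limits of agreement) are standard and can be reproduced with a few lines of code. A self-contained Python sketch, assuming the two-way, single-measure, absolute-agreement form ICC(2,1); the abstract does not state which ICC variant was used:

import numpy as np

def bland_altman_limits(a, b):
    """Mean difference +/- 1.96 SD: the Bland-Altman limits of agreement."""
    d = np.asarray(a, float) - np.asarray(b, float)
    return d.mean() - 1.96 * d.std(ddof=1), d.mean() + 1.96 * d.std(ddof=1)

def icc_2_1(a, b):
    """Two-way, single-measure ICC for absolute agreement, ICC(2,1)."""
    x = np.column_stack([a, b]).astype(float)
    n, k = x.shape
    ms_r = k * x.mean(axis=1).var(ddof=1)            # between-subject mean square
    ms_c = n * x.mean(axis=0).var(ddof=1)            # between-method mean square
    ss_t = ((x - x.mean()) ** 2).sum()               # total sum of squares
    ms_e = (ss_t - ms_r * (n - 1) - ms_c * (k - 1)) / ((n - 1) * (k - 1))
    return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)

Passing the 27 paired wear values from the two methods would reproduce figures of the kind quoted above.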
Stout, Peter R; Horn, Carl K; Klette, Kevin L
2002-01-01
To facilitate analysis of high sample volumes, an extraction, derivatization, and gas chromatographic-mass spectrometric analysis method was developed to simultaneously determine amphetamine (AMP), methamphetamine (MAMP), 3,4-methylenedioxyamphetamine (MDA), 3,4-methylenedioxymethamphetamine (MDMA), and 3,4-methylenedioxyethylamphetamine (MDEA) in urine. This method utilized a positive-pressure manifold cation-exchange polymer-based solid-phase extraction followed by elution directly into automated liquid sampler (ALS) vials. Rapid derivatization was accomplished using heptafluorobutyric anhydride (HFBA). Recoveries averaged 90% or greater for each of the compounds. Limits of detection were 62.5 ng/mL (AMP and MDEA), 15.6 ng/mL (MAMP), and 31.3 ng/mL (MDA and MDMA) using a 2-mL sample volume. The method was linear to 5000 ng/mL for all compounds using MDMA-d5 and MAMP-d14 as internal standards. Over 200 human urine samples previously determined to contain the target analytes were analyzed using the method. Excellent agreement was seen with previous quantitations. The method was challenged with 75 potentially interfering compounds, including ephedrine, pseudoephedrine, phenylpropanolamine, and phenethylamine, and no interferences were seen. The method resulted in dramatic reductions in processing time and waste production.
Proteomic analysis of mare follicular fluid during late follicle development
2011-01-01
Background Follicular fluid accumulates in the antrum of the follicle from the early stage of follicle development. Studies on its components may contribute to a better understanding of the mechanisms underlying follicular development and oocyte quality. With this objective, we performed a proteomic analysis of mare follicular fluid. First, we hypothesized that proteins in follicular fluid may differ from those in the serum, and also may change during follicle development. Second, we used four different immunodepletion approaches and one enrichment method in order to overcome the masking effect of high-abundance proteins present in the follicular fluid and to identify those present in lower abundance. Finally, we compared our results with previous studies performed in mono-ovulant (human) and poly-ovulant (porcine and canine) species in an attempt to identify common and/or species-specific proteins. Methods Follicular fluid samples were collected from ovaries at three different stages of follicle development (early dominant, late dominant and preovulatory). Blood samples were also collected at each time. The proteomic analysis was carried out on crude, depleted and enriched follicular fluid by 2D-PAGE, 1D-PAGE and mass spectrometry. Results A total of 459 protein spots were visualized by 2D-PAGE of crude mare follicular fluid, with no difference among the three physiological stages. Thirty proteins were observed as differentially expressed between serum and follicular fluid. The enrichment method was found to be the most powerful method for detection and identification of low-abundance proteins from follicular fluid: we were able to identify 18 proteins in the crude follicular fluid, and as many as 113 in the enriched follicular fluid. Inhibins and a few other proteins involved in reproduction could only be identified after enrichment of follicular fluid, demonstrating the power of the method used. The comparison of proteins found in mare follicular fluid with proteins previously identified in human, porcine and canine follicular fluids led to the identification of 12 common proteins and of several species-specific proteins. Conclusions This study provides the first description of the mare follicular fluid proteome during the late follicle development stages. We identified several proteins from crude, depleted and enriched follicular fluid. Our results demonstrate that the enrichment method, combined with 2D-PAGE and mass spectrometry, can be successfully used to visualize and further identify the low-abundance proteins in the follicular fluid. PMID:21923925
Validating a Prognostic Scoring System for Postmastectomy Locoregional Recurrence in Breast Cancer
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cheng, Skye Hung-Chun, E-mail: skye@kfsyscc.org; Clinical Research Office, Koo Foundation Sun Yat-Sen Cancer Center, Taipei, Taiwan; Department of Radiation Oncology, Duke University Medical Center, Durham, North Carolina
2013-03-15
Purpose: This study is designed to validate a previously developed locoregional recurrence risk (LRR) scoring system and further define which groups of patients with breast cancer would benefit from postmastectomy radiation therapy (PMRT). Methods and Materials: An LRR risk scoring system was developed previously at our institution using breast cancer patients initially treated with modified radical mastectomy between 1990 and 2001. The LRR score comprised 4 factors: patient age, lymphovascular invasion, estrogen receptor negativity, and number of involved lymph nodes. We sought to validate the original study by examining a new dataset of 1545 patients treated between 2002 and 2007. Results: The 1545 patients were scored according to the previously developed criteria: 920 (59.6%) were low risk (score 0-1), 493 (31.9%) intermediate risk (score 2-3), and 132 (8.5%) high risk (score ≥4). The 5-year locoregional control rates with and without PMRT in the low-risk, intermediate-risk, and high-risk groups were 98% versus 97% (P=.41), 97% versus 91% (P=.0005), and 89% versus 50% (P=.0002), respectively. Conclusions: This analysis of an additional 1545 patients treated between 2002 and 2007 validates our previously reported LRR scoring system and suggests appropriate patients for whom PMRT will be beneficial. Independent validation of this scoring system by other institutions is recommended.
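The abstract gives the score cutoffs (0-1 low, 2-3 intermediate, >=4 high) but not the points assigned to each of the 4 factors, so the per-factor values in the sketch below are hypothetical placeholders. A minimal Python illustration of the stratification step:

def lrr_risk_group(age_pts, lvi_pts, er_neg_pts, node_pts):
    """Stratify by the published cutoffs; the per-factor point values passed
    in are hypothetical -- the abstract reports only the cutoffs."""
    score = age_pts + lvi_pts + er_neg_pts + node_pts
    if score <= 1:
        return "low"
    if score <= 3:
        return "intermediate"
    return "high"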
Kim, Huiyong; Hwang, Sung June; Lee, Kwang Soon
2015-02-03
Among various CO2 capture processes, the aqueous amine-based absorption process is considered the most promising for near-term deployment. However, the performance evaluation of newly developed solvents still requires complex and time-consuming procedures, such as pilot plant tests or the development of a rigorous simulator. The absence of an accurate and simple method for calculating energy performance at an early stage of process development has lengthened, and increased the expense of, the development of economically feasible CO2 capture processes. In this paper, a novel but simple method to reliably calculate the regeneration energy in a standard amine-based carbon capture process is proposed. Careful examination of stripper behavior and exploitation of energy balance equations around the stripper allowed the regeneration energy to be calculated using only vapor-liquid equilibrium and caloric data. The reliability of the proposed method was confirmed by comparison against rigorous simulations for two well-known solvents, monoethanolamine (MEA) and piperazine (PZ). The proposed method can predict the regeneration energy at various operating conditions with greater simplicity, greater speed, and higher accuracy than methods proposed in previous studies. This enables faster and more precise screening of solvents and faster optimization of process variables, and can eventually accelerate the development of economically deployable CO2 capture processes.
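The abstract does not reproduce the energy balance itself. The sketch below shows only the standard textbook decomposition of reboiler duty into desorption, sensible-heat, and stripping-steam terms, which the paper's stripper balance refines; all argument names and units are illustrative, not the authors':

def regeneration_energy_per_mol_co2(dh_desorption, cp_solvent, dt_stripper,
                                    solvent_per_co2, dh_vap_h2o, steam_per_co2):
    """Reboiler duty per mole of CO2 as the sum of the three standard terms:
    desorption enthalpy + sensible heat of the circulating solvent + latent
    heat of the stripping steam lost overhead. All inputs per mole of CO2."""
    return (dh_desorption
            + cp_solvent * dt_stripper * solvent_per_co2
            + dh_vap_h2o * steam_per_co2)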
NASA Astrophysics Data System (ADS)
Viger, R. J.; Van Beusekom, A. E.
2016-12-01
The treatment of glaciers in modeling requires information about their shape and extent. This presentation discusses new methods and their application in a new glacier-capable variant of the USGS PRMS model, a physically based, spatially distributed, daily time-step model designed to simulate the runoff and evolution of glaciers through time. In addition to developing parameters describing PRMS land surfaces (hydrologic response units, HRUs), several of the analyses and products are likely of interest to the cryospheric science community in general. The first method is a fully automated variation of logic previously presented in the literature for definition of the glacier centerline. Given that the surface of a glacier might be convex, using traditional topographic analyses based on a DEM to trace a path down the glacier is not reliable; instead, a path is derived based on a cost function. Although only a single path is presented in our results, the method can be easily modified to delineate a branched network of centerlines for each glacier. The second method extends the glacier terminus downslope by an arbitrary distance, according to local surface topography. This product can be used to explore possible, if unlikely, scenarios under which glacier area grows. More usefully, this method can be used to approximate glacier extents from previous years without needing historical imagery. The final method presents an approach for segmenting the glacier into altitude-based HRUs. Successful integration of this information with traditional approaches for discretizing the non-glacierized portions of a basin requires several additional steps, including synthesizing the glacier centerline network with one developed from a traditional DEM analysis and ensuring that flow can be routed under and beyond glaciers to a basin outlet. Results are presented based on analysis of the Copper River Basin, Alaska.
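A cost-function centerline of this kind is commonly implemented as a least-cost path across a raster. A Python sketch, assuming scikit-image's route_through_array as the path solver and a purely illustrative cost surface (both assumptions, not the authors' stated toolchain):

import numpy as np
from skimage.graph import route_through_array

# Hypothetical cost surface: cheap along the glacier axis, more expensive
# toward the margins; a DEM- or mask-derived term could be added without
# changing the call below.
cost = np.ones((200, 120))
cost += np.abs(np.arange(120) - 60)[None, :] / 60.0   # penalize distance from axis

head, terminus = (5, 60), (195, 60)
path, total_cost = route_through_array(cost, head, terminus,
                                       fully_connected=True, geometric=True)

Running the solver from several heads to a common terminus would yield the branched centerline network mentioned above.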
Detection and 3D representation of pulmonary air bubbles in HRCT volumes
NASA Astrophysics Data System (ADS)
Silva, Jose S.; Silva, Augusto F.; Santos, Beatriz S.; Madeira, Joaquim
2003-05-01
Bubble emphysema is a disease characterized by the presence of air bubbles within the lungs. With the purpose of identifying pulmonary air bubbles, two alternative methods were developed, using High Resolution Computed Tomography (HRCT) exams. The search volume is confined to the pulmonary volume through a previously developed pulmonary contour detection algorithm. The first detection method follows a slice-by-slice approach and uses selection criteria based on the Hounsfield levels, dimensions, shape, and localization of the bubbles. Candidate regions that do not exhibit axial coherence along at least two sections are excluded. Intermediate sections are interpolated for a more realistic representation of lungs and bubbles. The second detection method, after the pulmonary volume delimitation, follows a fully 3D approach: a global threshold is applied to the entire lung volume, returning candidate regions, and 3D morphologic operators are used to remove spurious structures and to circumscribe the bubbles. Bubble representation is accomplished by two alternative methods. The first generates bubble surfaces based on the voxel volumes previously detected; the second assumes that bubbles are approximately spherical and, in order to obtain better 3D representations, fits super-quadrics to the bubble volume. The fitting process is based on a non-linear least squares optimization method, where a super-quadric is adapted to a regular grid of points defined on each bubble. All methods were applied to real and semi-synthetic data where artificial, randomly deformed bubbles were embedded in the interior of healthy lungs. Quantitative results regarding bubble geometric features are either similar to the a priori known values used in simulation tests, or indicate clinically acceptable dimensions and locations when dealing with real data.
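The fully 3D pipeline described (global threshold inside the lung mask, morphological clean-up, candidate labeling) maps directly onto standard array operations. A minimal Python sketch with scipy.ndimage; the -950 HU cutoff and the minimum size are illustrative values, not the paper's:

import numpy as np
from scipy import ndimage

def detect_bubbles(hu_volume, lung_mask, hu_max=-950, min_voxels=30):
    """Global threshold inside the lung mask, 3D opening, then labeling.
    The HU cutoff and size floor are illustrative, not the paper's values."""
    candidates = (hu_volume < hu_max) & lung_mask
    cleaned = ndimage.binary_opening(candidates, structure=np.ones((3, 3, 3)))
    labels, n = ndimage.label(cleaned)
    sizes = ndimage.sum(cleaned, labels, range(1, n + 1))
    keep = np.isin(labels, 1 + np.flatnonzero(sizes >= min_voxels))
    return np.where(keep, labels, 0)   # labeled bubble candidates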
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhou, Yuyu; Smith, Steven J.; Elvidge, Christopher
Accurate information on urban areas at regional and global scales is important for both the science and policy-making communities. The Defense Meteorological Satellite Program/Operational Linescan System (DMSP/OLS) nighttime stable light data (NTL) provide a potential way to map urban area and its dynamics economically and in a timely manner. In this study, we developed a cluster-based method to estimate the optimal thresholds and map urban extents from the DMSP/OLS NTL data in five major steps: data preprocessing, urban cluster segmentation, logistic model development, threshold estimation, and urban extent delineation. Unlike previous fixed-threshold methods, which suffer from over- and under-estimation, in our method the optimal thresholds are estimated based on cluster size and overall nightlight magnitude in the cluster, and they vary with clusters. Two large countries, the United States and China, with different urbanization patterns were selected for mapping urban extents using the proposed method. The results indicate that urbanized area occupies about 2% of total land area in the US, ranging from lower than 0.5% to higher than 10% at the state level, and less than 1% in China, ranging from lower than 0.1% to about 5% at the province level, with some municipalities as high as 10%. The derived thresholds and urban extents were evaluated using high-resolution land cover data at the cluster and regional levels. It was found that our method can map urban area in both countries efficiently and accurately. Compared to previous threshold techniques, our method reduces the over- and under-estimation issues when mapping urban extent over a large area. More importantly, our method shows its potential to map global urban extents and temporal dynamics using the DMSP/OLS NTL data in a timely, cost-effective way.
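The abstract specifies that each cluster receives its own threshold as a function of cluster size and overall nightlight magnitude, but not the fitted model itself. The Python sketch below therefore uses a purely hypothetical log-linear form with made-up coefficients, only to show where the output of the fitted logistic model would plug in:

import numpy as np
from scipy import ndimage

def cluster_thresholds(ntl, seg_floor=12.0, b0=-1.5, b1=0.8, b2=0.6):
    """Segment lit clusters, then assign each its own DN threshold from
    cluster size and total light; the form and coefficients are illustrative."""
    clusters, n = ndimage.label(ntl > seg_floor)
    thresholds = np.zeros(n + 1)
    for i in range(1, n + 1):
        size = (clusters == i).sum()
        magnitude = ntl[clusters == i].sum()
        thresholds[i] = b0 + b1 * np.log(size) + b2 * np.log(magnitude)
    return clusters, thresholds   # per-cluster urban/non-urban cutoffs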
A New Principle of Sound Frequency Analysis
NASA Technical Reports Server (NTRS)
Theodorsen, Theodore
1932-01-01
In connection with the study of aircraft and propeller noises, the National Advisory Committee for Aeronautics has developed an instrument for sound-frequency analysis which differs fundamentally from previous types, and which, owing to its simplicity of principle, construction, and operation, has proved to be of value in this investigation. The method is based on the well-known fact that the Ohmic loss in an electrical resistance is equal to the sum of the losses of the harmonic components of a complex wave, except for the case in which any two components approach or attain vectorial identity, in which case the Ohmic loss is increased by a definite amount. The principle of frequency analysis has been presented mathematically and a number of distinct advantages relative to previous methods have been pointed out. An automatic recording instrument embodying this principle is described in detail. It employs a beat-frequency oscillator as a source of variable frequency. A large number of experiments have verified the predicted superiority of the method. A number of representative records are presented.
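The underlying principle (the total ohmic loss equals the sum of the component losses, except when two components approach vectorial identity, where the loss increases by a definite amount) is easy to verify numerically. A short Python demonstration with arbitrary frequencies and amplitudes:

import numpy as np

t = np.linspace(0.0, 1.0, 100000, endpoint=False)
a = np.sin(2 * np.pi * 440 * t)                 # one harmonic component
b = 0.5 * np.sin(2 * np.pi * 523 * t)           # a second, distinct component

def power(x):
    return np.mean(x ** 2)                       # ohmic loss in a 1-ohm resistor

print(power(a + b), power(a) + power(b))         # equal: losses simply add
c = np.sin(2 * np.pi * 440 * t)                  # vectorially identical to a
print(power(a + c), power(a) + power(c))         # loss jumps by a definite amount

Sweeping the frequency of one component past the other, as the beat-frequency oscillator does, makes the dissipated power spike exactly at coincidence, which is what the recording instrument registers.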
Octave spanning supercontinuum in an As₂S₃ taper using ultralow pump pulse energy.
Hudson, Darren D; Dekker, Stephen A; Mägi, Eric C; Judge, Alexander C; Jackson, Stuart D; Li, Enbang; Sanghera, J S; Shaw, L B; Aggarwal, I D; Eggleton, Benjamin J
2011-04-01
An octave spanning spectrum is generated in an As₂S₃ taper via 77 pJ pulses from an ultrafast fiber laser. Using a previously developed tapering method, we construct a 1.3 μm taper that has a zero-dispersion wavelength around 1.4 μm. The low two-photon absorption of sulfide-based chalcogenide fiber allows for higher input powers than previous efforts in selenium-based chalcogenide tapered fibers. This higher power handling capability combined with input pulse chirp compensation allows an octave spanning spectrum to be generated directly from the taper using the unamplified laser output.
Web client and ODBC access to legacy database information: a low cost approach.
Sanders, N. W.; Mann, N. H.; Spengler, D. M.
1997-01-01
A new method has been developed for the Department of Orthopaedics of Vanderbilt University Medical Center to access departmental clinical data. Previously, this data was stored only in the medical center's mainframe DB2 database; it is now additionally stored in a departmental SQL database. Access to this data is available via any ODBC-compliant front-end or a web client. With a small budget and no full-time staff, we were able to give our department on-line access to many years' worth of patient data that was previously inaccessible. PMID:9357735
Validation of an asthma questionnaire for use in healthcare workers
Delclos, G L; Arif, A A; Aday, L; Carson, A; Lai, D; Lusk, C; Stock, T; Symanski, E; Whitehead, L W; Benavides, F G; Antó, J M
2006-01-01
Background Previous studies have described increased occurrence of asthma among healthcare workers, but to our knowledge there are no validated survey questionnaires with which to study this occupational group. Aims To develop, validate, and refine a new survey instrument on asthma for use in epidemiological studies of healthcare workers. Methods An initial draft questionnaire, designed by a multidisciplinary team, used previously validated questions where possible; the occupational exposure section was developed by updating health services specific chemical lists through hospital walk-through surveys and review of material safety data sheets. A cross-sectional validation study was conducted in 118 non-smoking subjects, who also underwent bronchial challenge testing, an interview with an industrial hygienist, and measurement of specific IgE antibodies to common aeroallergens. Results The final version consisted of 43 main questions in four sections. Time to completion of the questionnaire ranged from 13 to 25 minutes. Test-retest reliability of asthma and allergy items ranged from 75% to 94%, and internal consistency for these items was excellent (Cronbach's α ⩾ 0.86). Against methacholine challenge, an eight-item combination of asthma-related symptoms had a sensitivity of 71% and specificity of 70%; against a physician diagnosis of asthma, this same combination showed a sensitivity of 79% and specificity of 98%. Agreement between self-reported exposures and industrial hygienist review was similar to previous studies and only moderate, indicating the need to incorporate more reliable methods of exposure assessment. Against the aeroallergen panel, the best combinations of sensitivity and specificity were obtained for a history of allergies to dust, dust mite, and animals. Conclusions Initial evaluation of this new questionnaire indicates good validity and reliability, and further field testing and cross-validation in a larger healthcare worker population is in progress. The need for development of more reliable occupational exposure assessment methods that go beyond self-report is underscored. PMID:16497858
High-resolution mapping of vehicle emissions in China in 2008
NASA Astrophysics Data System (ADS)
Zheng, B.; Huo, H.; Zhang, Q.; Yao, Z. L.; Wang, X. T.; Yang, X. F.; Liu, H.; He, K. B.
2014-09-01
This study is the first in a series of papers that aim to develop high-resolution emission databases for different anthropogenic sources in China; here we focus on on-road transportation. Because of the increasing impact of on-road transportation on regional air quality, developing an accurate and high-resolution vehicle emission inventory is important for both the research community and air quality management. This work proposes a new inventory methodology to improve the spatial and temporal accuracy and resolution of vehicle emissions in China. We calculate, for the first time, the monthly vehicle emissions for 2008 in 2364 counties (an administrative unit one level lower than city) by developing a set of approaches to estimate vehicle stock and monthly emission factors at the county level, and technology distribution at the provincial level. We then introduce allocation weights for the vehicle kilometers traveled to assign the county-level emissions onto 0.05° × 0.05° grids based on the China Digital Road-network Map (CDRM). The new methodology overcomes the common shortcomings of previous inventory methods, including neglecting the geographical differences between key parameters and using surrogates that are weakly related to vehicle activities to allocate vehicle emissions. The new method has great advantages over previous methods in depicting the spatial distribution characteristics of vehicle activities and emissions. This work provides a better understanding of the spatial representation of vehicle emissions in China and can benefit both air quality modeling and management with improved spatial accuracy.
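The final allocation step is a straightforward proportional spread of each county total over its grid cells. A minimal Python sketch, with a hypothetical vehicle-kilometers-traveled (VKT) weight grid standing in for the CDRM-derived weights:

import numpy as np

def allocate_county_emissions(county_total, vkt_grid):
    """Spread one county's monthly emission total over its 0.05-degree cells
    in proportion to the VKT estimated for each cell."""
    weights = vkt_grid / vkt_grid.sum()
    return county_total * weights

# hypothetical example: a county emitting 120 t/month over a 2x2 patch of cells
print(allocate_county_emissions(120.0, np.array([[4.0, 1.0], [3.0, 2.0]])))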
O'Dor, Sarah L; Grasso, Damion J; Forbes, Danielle; Bates, John E; McCarthy, Kimberly J; Wakschlag, Lauren S; Briggs-Gowan, Margaret J
2017-04-01
Elucidating the complex mechanisms by which harsh parenting increases risk of child psychopathology is key to targeted prevention. This requires nuanced methods that capture the varied perceptions and experiences of diverse families. The Family Socialization Interview-Revised (FSI-R), adapted from an interview developed by Dodge et al. (Child Development, 65, 649-665, 1994), is a comprehensive, semi-structured interview for characterizing methods of parental discipline used with young children. The FSI-R coding system systematically rates parenting style, usual discipline techniques, and most intense physical and psychological discipline based on rater judgment across two eras: (1) birth to the previous year, and (2) the previous year to present. The current study examined the psychometric properties of the FSI-R in a diverse, high-risk community sample of 386 mothers and their children, ages 3 to 6 years. Interrater reliability was good to excellent for codes capturing physically and psychologically harsh parenting, and restrictive/punitive parenting styles. Findings supported the FSI-R's convergent and incremental validity. Importantly, the FSI-R demonstrated incremental utility, explaining unique variance in children's externalizing and internalizing symptoms beyond that explained by traditional surveys and observed parenting. The FSI-R appeared particularly promising for capturing risk associated with young children's depressive symptoms, as these were generally not significantly associated with other measures of harsh parenting. Overall, findings support the added value of the FSI-R within a multi-method assessment of disciplinary practices across early child development. Future implications for prevention are discussed.
Wysocki, William P; Ruiz-Sanchez, Eduardo; Yin, Yanbin; Duvall, Melvin R
2016-05-20
Next-generation sequencing now allows for total RNA extracts to be sequenced in non-model organisms such as bamboos, an economically and ecologically important group of grasses. Bamboos are divided into three lineages, two of which are woody perennials with bisexual flowers, which undergo gregarious monocarpy. The third lineage, which are herbaceous perennials, possesses unisexual flowers that undergo annual flowering events. Transcriptomes were assembled using both reference-based and de novo methods. These two methods were tested by characterizing transcriptome content using sequence alignment to previously characterized reference proteomes and by identifying Pfam domains. Because of the striking differences in floral morphology and phenology between the herbaceous and woody bamboo lineages, MADS-box genes, transcription factors that control floral development and timing, were characterized and analyzed in this study. Transcripts were identified using phylogenetic methods and categorized as A, B, C, D or E-class genes, which control floral development, or SOC or SVP-like genes, which control the timing of flowering events. Putative nuclear orthologues were also identified in bamboos to use as phylogenetic markers. Instances of gene copies exhibiting topological patterns that correspond to shared phenotypes were observed in several gene families including floral development and timing genes. Alignments and phylogenetic trees were generated for 3,878 genes and for all genes in a concatenated analysis. Both the concatenated analysis and those of 2,412 separate gene trees supported monophyly among the woody bamboos, which is incongruent with previous phylogenetic studies using plastid markers.
Computation of transonic viscous-inviscid interacting flow
NASA Technical Reports Server (NTRS)
Whitfield, D. L.; Thomas, J. L.; Jameson, A.; Schmidt, W.
1983-01-01
Transonic viscous-inviscid interaction is considered using the Euler and inverse compressible turbulent boundary-layer equations. Certain improvements in the inverse boundary-layer method are mentioned, along with experiences in using various Runge-Kutta schemes to solve the Euler equations. Numerical conditions imposed on the Euler equations at a surface for viscous-inviscid interaction using the method of equivalent sources are developed, and numerical solutions are presented and compared with experimental data to illustrate essential points. Previously announced in STAR N83-17829
Modeling Innovations Advance Wind Energy Industry
NASA Technical Reports Server (NTRS)
2009-01-01
In 1981, Glenn Research Center scientist Dr. Larry Viterna developed a model that predicted certain elements of wind turbine performance with far greater accuracy than previous methods. The model was met with derision from others in the wind energy industry, but years later, Viterna discovered it had become the most widely used method of its kind, enabling significant wind energy technologies, such as the fixed-pitch turbines produced by manufacturers like Aerostar Inc. of Westport, Massachusetts, that are providing sustainable, climate-friendly energy sources today.
Practical uncertainty reduction and quantification in shock physics measurements
Akin, M. C.; Nguyen, J. H.
2015-04-20
We report the development of a simple error analysis sampling method for identifying intersections and inflection points to reduce total uncertainty in experimental data. This technique was used to reduce uncertainties in sound speed measurements by 80% over conventional methods. Here, we focused on its impact on a previously published set of Mo sound speed data and possible implications for phase transition and geophysical studies. However, this technique's application can be extended to a wide range of experimental data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dobson, Patrick; Houseworth, James
2013-11-22
The objective of this report is to build upon previous compilations of shale formations within many of the major sedimentary basins in the US by developing GIS data delineating isopach and structural depth maps for many of these units. These data are being incorporated into the LANL digital GIS database being developed for determining host rock distribution and depth/thickness parameters consistent with repository design. Methods were developed to assess hydrological and geomechanical properties and conditions for shale formations based on sonic velocity measurements.
Automated Measurement of Patient-Specific Tibial Slopes from MRI
Amerinatanzi, Amirhesam; Summers, Rodney K.; Ahmadi, Kaveh; Goel, Vijay K.; Hewett, Timothy E.; Nyman, Edward
2017-01-01
Background: Multi-planar proximal tibial slopes may be associated with increased likelihood of osteoarthritis and anterior cruciate ligament injury, due in part to their role in checking the anterior-posterior stability of the knee. Established methods suffer repeatability limitations and lack computational efficiency for intuitive clinical adoption. The aims of this study were to develop a novel automated approach and to compare the repeatability and computational efficiency of the approach against previously established methods. Methods: Tibial slope geometries were obtained via MRI and measured using an automated Matlab-based approach. Data were compared for repeatability and evaluated for computational efficiency. Results: Mean lateral tibial slope (LTS) for females (7.2°) was greater than for males (1.66°). Mean LTS in the lateral concavity zone was greater for females (7.8° for females, 4.2° for males). Mean medial tibial slope (MTS) for females was greater (9.3° vs. 4.6°). Along the medial concavity zone, female subjects demonstrated greater MTS. Conclusion: The automated method was more repeatable and computationally efficient than previously identified methods and may aid in the clinical assessment of knee injury risk, inform surgical planning, and implant design efforts. PMID:28952547
Improved regulatory element prediction based on tissue-specific local epigenomic signatures
DOE Office of Scientific and Technical Information (OSTI.GOV)
He, Yupeng; Gorkin, David U.; Dickel, Diane E.
Accurate enhancer identification is critical for understanding the spatiotemporal transcriptional regulation during development as well as the functional impact of disease-related noncoding genetic variants. Computational methods have been developed to predict the genomic locations of active enhancers based on histone modifications, but the accuracy and resolution of these methods remain limited. Here, we present an algorithm, regulatory element prediction based on tissue-specific local epigenetic marks (REPTILE), which integrates histone modification and whole-genome cytosine DNA methylation profiles to identify the precise location of enhancers. We tested the ability of REPTILE to identify enhancers previously validated in reporter assays. Compared with existing methods, REPTILE shows consistently superior performance across diverse cell and tissue types, and the enhancer locations are significantly more refined. We show that, by incorporating base-resolution methylation data, REPTILE greatly improves upon current methods for annotation of enhancers across a variety of cell and tissue types.
Automatic segmentation of cortical vessels in pre- and post-tumor resection laser range scan images
NASA Astrophysics Data System (ADS)
Ding, Siyi; Miga, Michael I.; Thompson, Reid C.; Garg, Ishita; Dawant, Benoit M.
2009-02-01
Measurement of intra-operative cortical brain movement is necessary to drive mechanical models developed to predict sub-cortical shift. At our institution, this is done with a tracked laser range scanner. This device acquires both 3D range data and 2D photographic images. 3D cortical brain movement can be estimated if 2D photographic images acquired over time can be registered. Previously, we developed a method which permits this registration using vessels visible in the images, but vessel segmentation required the localization of starting and ending points for each vessel segment. Here, we propose a method which automates the segmentation process further. This method involves several steps: (1) correction of lighting artifacts, (2) vessel enhancement, and (3) vessel centerline extraction. Results obtained on 5 images acquired in the operating room suggest that our method is robust and able to segment vessels reliably.
Sequencing Cyclic Peptides by Multistage Mass Spectrometry
Mohimani, Hosein; Yang, Yu-Liang; Liu, Wei-Ting; Hsieh, Pei-Wen; Dorrestein, Pieter C.; Pevzner, Pavel A.
2012-01-01
Some of the most effective antibiotics (e.g., Vancomycin and Daptomycin) are cyclic peptides produced by non-ribosomal biosynthetic pathways. While hundreds of biomedically important cyclic peptides have been sequenced, the computational techniques for sequencing cyclic peptides are still in their infancy. Previous methods for sequencing peptide antibiotics and other cyclic peptides are based on Nuclear Magnetic Resonance spectroscopy and require large amounts (milligrams) of purified material that, for most compounds, cannot be obtained. Recently, the development of mass spectrometry based methods has provided some hope for accurate sequencing of cyclic peptides using picograms of material. In this paper we develop a method for sequencing cyclic peptides by multistage mass spectrometry and show its advantages over single-stage mass spectrometry. The method is tested on known and new cyclic peptides from Bacillus brevis, Dianthus superbus and Streptomyces griseus, as well as a new family of cyclic peptides produced by marine bacteria. PMID:21751357
Control theory based airfoil design for potential flow and a finite volume discretization
NASA Technical Reports Server (NTRS)
Reuther, J.; Jameson, A.
1994-01-01
This paper describes the implementation of optimization techniques based on control theory for airfoil design. In previous studies it was shown that control theory could be used to devise an effective optimization procedure for two-dimensional profiles in which the shape is determined by a conformal transformation from a unit circle, and the control is the mapping function. The goal of our present work is to develop a method which does not depend on conformal mapping, so that it can be extended to treat three-dimensional problems. Therefore, we have developed a method which can address arbitrary geometric shapes through the use of a finite volume method to discretize the potential flow equation. Here the control law serves to provide computationally inexpensive gradient information to a standard numerical optimization method. Results are presented, where both target speed distributions and minimum drag are used as objective functions.
NASA Technical Reports Server (NTRS)
1991-01-01
Induction heating technology, a magnetic non-deforming process, was developed by Langley researchers to join plastic and composite components in space. Under NASA license, Inductron Corporation uses the process to produce induction heating systems and equipment for numerous applications. The Torobonder, a portable system, comes with a number of interchangeable heads for aircraft repair. Other developments are the E Heating Head, the Toroid Joining Gun, and the Torobrazer. These products perform bonding applications more quickly, safely and efficiently than previous methods.
Teen Sized Humanoid Robot: Archie
NASA Astrophysics Data System (ADS)
Baltes, Jacky; Byagowi, Ahmad; Anderson, John; Kopacek, Peter
This paper describes our first teen-sized humanoid robot, Archie. This robot has been developed in conjunction with Prof. Kopacek's lab at the Technical University of Vienna. Archie uses brushless motors and harmonic gears with a novel approach to position encoding. Based on our previous experience with small humanoid robots, we developed software to create, store, and play back motions, as well as control methods which automatically balance the robot using feedback from an inertial measurement unit (IMU).
Thomas, Jonathan V.; Stanton, Gregory P.; Bumgarner, Johnathan R.; Pearson, Daniel K.; Teeple, Andrew; Houston, Natalie A.; Payne, Jason; Musgrove, MaryLynn
2013-01-01
Several previous studies have been done to compile or collect physical and chemical data, describe the hydrogeologic processes, and develop conceptual and numerical groundwater-flow models of the Edwards-Trinity aquifer in the Trans-Pecos region. Documented methods were used to compile and collect groundwater, surface-water, geochemical, geophysical, and geologic information that subsequently were used to develop this conceptual model.
Carter, Jacoby; Merino, Sergio
2018-03-19
This report provides an overview of the pilot study and description of the techniques developed for a future mitigation study of Pomacea maculata (giant applesnail) at the U.S. Fish and Wildlife Service Mandalay National Wildlife Refuge, Louisiana (MNWR). Egg mass suppression is a potential strategy for the mitigation of the invasive giant applesnail. In previous studies at Langan Municipal Park in Mobile, Alabama (LMP), and National Park Service Jean Lafitte National Park-Barataria Unit, Louisiana (JLNP), we determined that spraying food-grade oil (coconut oil or Pam™ spray) on egg masses significantly reduced egg hatching. At JLNP we also developed methods to estimate snail population size. The purpose of this pilot study was to adapt techniques developed for previous studies to the circumstances of MNWR in preparation for a larger experiment whereby we will test the effectiveness of egg mass suppression as an applesnail mitigation tool. We selected four canals that will be used as treatment and control sites for the experiment (two each). We established that an efficient way to destroy egg masses is to knock them down with a high-velocity stream of water pumped directly from the canal. The traps used at JLNP had to be modified to accommodate the greater range of water-level fluctuation at MNWR. One of the three marking methods used at JLNP was selected for use at MNWR.
Jo, Bum Seak; Myong, Jun Pyo; Rhee, Chin Kook; Yoon, Hyoung Kyu; Koo, Jung Wan; Kim, Hyoung Ryoul
2018-01-15
The present study aimed to update the prediction equations for spirometry and their lower limits of normal (LLN) by using the lambda, mu, sigma (LMS) method and to compare the outcomes with the values of previous spirometric reference equations. Spirometric data of 10,249 healthy non-smokers (8,776 females) were extracted from the fourth and fifth versions of the Korea National Health and Nutrition Examination Survey (KNHANES IV, 2007-2009; V, 2010-2012). Reference equations were derived using the LMS method, which allows modeling skewness (lambda [L]), mean (mu [M]), and coefficient of variation (sigma [S]). The outcome equations were compared with previous reference values. Prediction equations were presented in the following form: predicted value = e^(a + b × ln(height) + c × ln(age) + M-spline). The new predicted values for spirometry and their LLN derived using the LMS method were shown to more accurately reflect transitions in pulmonary function in young adults than previous prediction equations derived using conventional regression analysis in 2013. There were partial discrepancies between the new reference values and the reference values from the Global Lung Function Initiative in 2012. The results should be interpreted with caution for young adults and elderly males, particularly in terms of the LLN for forced expiratory volume in one second/forced vital capacity in elderly males. Serial spirometry follow-up, together with correlations with other clinical findings, should be emphasized in evaluating the pulmonary function of individuals. Future studies are needed to improve the accuracy of reference data and to develop continuous reference values for spirometry across all ages. © 2018 The Korean Academy of Medical Sciences.
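The stated equation form, together with the usual LMS machinery, implies a simple pair of functions: the median from the fitted coefficients, and the LLN from L, M, and S at the 5th percentile (z = -1.645, the standard convention). A Python sketch; the coefficients are placeholders, not the published ones:

import numpy as np

def lms_predicted(a, b, c, height_cm, age_yr, m_spline=0.0):
    """Median (M) in the paper's stated form; a, b, c are placeholder values."""
    return np.exp(a + b * np.log(height_cm) + c * np.log(age_yr) + m_spline)

def lms_lln(m, lam, s, z=-1.645):
    """5th-percentile lower limit of normal under the LMS model."""
    return m * (1.0 + lam * s * z) ** (1.0 / lam)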
An improved, robust, axial line singularity method for bodies of revolution
NASA Technical Reports Server (NTRS)
Hemsch, Michael J.
1989-01-01
The failures encountered in attempts to increase the range of applicability of the axial line singularity method for representing incompressible, inviscid flow about an inclined, slender body of revolution are noted to be common to all efforts to solve Fredholm equations of the first kind. It is shown that a previously developed smoothing technique yields a robust method for numerical solution of the governing equations; this technique is easily retrofitted to existing codes and allows the number of singularities to be increased until the most accurate line singularity solution is obtained.
A Synthetic Quadrature Phase Detector/Demodulator for Fourier Transform Spectrometers
NASA Technical Reports Server (NTRS)
Campbell, Joel
2008-01-01
A method is developed to demodulate (velocity correct) Fourier transform spectrometer (FTS) data taken with an analog-to-digital converter that digitizes at equally spaced time intervals. This method makes it possible to use simple, low-cost, high-resolution audio digitizers to record high-quality data without the need for an event timer or quadrature laser hardware, and makes it possible to use a metrology laser of any wavelength. The reduced parts count and simple implementation make it an attractive alternative in space-based applications when compared to previous methods such as the Brault algorithm.
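The core idea, synthesizing the missing quadrature channel so that the metrology laser's phase can serve as an optical-path clock, can be sketched compactly. A Python illustration using a Hilbert transform for the synthetic quadrature and linear interpolation for the resampling; the function and argument names are illustrative, and the published algorithm may differ in detail:

import numpy as np
from scipy.signal import hilbert

def resample_equal_opd(ir_signal, laser_fringes, samples_per_fringe=2):
    """Equal-time samples -> equal-OPD samples using the laser channel.
    The analytic signal supplies the missing quadrature; its unwrapped
    phase acts as the optical-path clock."""
    phase = np.unwrap(np.angle(hilbert(laser_fringes)))
    step = 2 * np.pi / samples_per_fringe
    grid = np.arange(phase[0], phase[-1], step)
    return np.interp(grid, phase, ir_signal)

The interpolation step assumes the unwrapped phase is monotonic, i.e., a unidirectional mirror scan.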
Simple method for assembly of CRISPR synergistic activation mediator gRNA expression array.
Vad-Nielsen, Johan; Nielsen, Anders Lade; Luo, Yonglun
2018-05-20
When studying complex interconnected regulatory networks, effective methods for simultaneously manipulating the expression of multiple genes are paramount. Previously, we developed a simple method for generation of an all-in-one CRISPR gRNA expression array. We here present a Golden Gate Assembly-based system for constructing synergistic activation mediator (SAM)-compatible CRISPR/dCas9 gRNA expression arrays for the simultaneous activation of multiple genes. Using this system, we demonstrated the simultaneous activation of the transcription factors TWIST, SNAIL, SLUG, and ZEB1 in a human breast cancer cell line. Copyright © 2018 Elsevier B.V. All rights reserved.
NASA Technical Reports Server (NTRS)
Lamar, J. E.
1976-01-01
A new subsonic method has been developed by which the mean camber surface can be determined for trimmed noncoplanar planforms with minimum vortex drag. This method uses a vortex lattice and overcomes previous difficulties with chord loading specification. A Trefftz plane analysis is utilized to determine the optimum span loading for minimum drag; the mean camber surface of the wing that provides the required loading is then solved for. Sensitivity studies, comparisons with other theories, and applications to configurations which include a tandem wing and a wing-winglet combination have been made and are presented.
Pereira, Elenilda J.; Carvalho, Lucia M. J.; Dellamora-Ortiz, Gisela M.; Cardoso, Flávio S. N.; Carvalho, José L. V.; Viana, Daniela S.; Freitas, Sidinea C.; Rocha, Maurisrael M.
2014-01-01
Background Because iron deficiency anemia is prevalent in developing countries, determining the levels of iron and zinc in beans, the second most consumed staple food in Brazil, is essential, especially for the low-income people who experience a deficiency of these minerals in their diet. Objectives This study aimed to evaluate the effect of cooking methods by measuring the iron and zinc contents in cowpea cultivars before and after soaking to determine the retention of these minerals. Methods The samples were cooked in both regular pans and pressure cookers, with and without previous soaking. Mineral analyses were carried out by inductively coupled plasma (ICP) spectrometry. Results The results showed high contents of iron and zinc in raw samples as well as in cooked ones, with the use of a regular pan resulting in a greater percentage of iron retention and the use of a pressure cooker ensuring higher retention of zinc. Conclusions The best retention of iron was found in the BRS Aracê cultivar prepared in a regular pan with previous soaking. This cultivar may be indicated for cultivation and human consumption. The best retention of zinc was found for the BRS Tumucumaque cultivar prepared in a pressure cooker without previous soaking. PMID:24624050
NASA Astrophysics Data System (ADS)
Cherry, M.; Dierken, J.; Boehnlein, T.; Pilchak, A.; Sathish, S.; Grandhi, R.
2018-01-01
A new technique for performing quantitative scanning acoustic microscopy imaging of Rayleigh surface wave (RSW) velocity was developed based on b-scan processing. In this technique, the focused acoustic beam is moved through many defocus distances over the sample and excited with an impulse excitation, and advanced algorithms based on frequency filtering and the Hilbert transform are used to post-process the b-scans to estimate the RSW velocity. The new method was used to estimate the RSW velocity on an optically flat E6 glass sample; the velocity was measured to within ±2 m/s with a scanning time per point on the order of 1.0 s, both improvements over the previous two-point defocus method. The new method was also applied to the analysis of two titanium samples, and the velocity was estimated with very low standard deviation in certain large grains on the sample. A new behavior was observed with the b-scan analysis technique, where the amplitude of the surface wave decayed dramatically for certain crystallographic orientations. The new technique was also compared with previous results and was found to be much more reliable and to have higher contrast than previously possible with impulse excitation.
A method to assess social sustainability of capture fisheries: An application to a Norwegian trawler
DOE Office of Scientific and Technical Information (OSTI.GOV)
Veldhuizen, L.J.L., E-mail: linda.veldhuizen@wur.nl; Berentsen, P.B.M.; Bokkers, E.A.M.
Social sustainability assessment of capture fisheries is, both in terms of method development and measurement, not well developed. The objective of this study, therefore, was to develop a method consisting of indicators and rubrics (i.e. categories that articulate levels of performance) to assess social sustainability of capture fisheries. This method was applied to a Norwegian trawler that targets cod and haddock in the northeast Atlantic. Based on previous research, 13 social sustainability issues were selected. To measure the state of these issues, 17 process and outcome indicators were determined. To interpret indicator values, rubrics were developed for each indicator, using standards set by international conventions or data retrieved from national statistics, industry agreements or scientific publications that explore rubric scales. The indicators and rubrics were subsequently used in a social sustainability assessment of a Norwegian trawler. This assessment indicated that overall, social sustainability of this trawler is relatively high, with high rubric scores, for example, for worker safety, provisions aboard for the crew and companies' salary levels. The assessment also indicated that the trawler could improve on healthy working environment, product freshness and fish welfare during capture. This application demonstrated that our method provides insight into social sustainability at the level of the vessel and can be used to identify potential room for improvement. This method is also promising for social sustainability assessment of other capture fisheries. - Highlights: • A method was developed for social sustainability assessment of capture fisheries. • This method entailed determining outcome and process indicators for important issues. • To interpret indicator values, a rubric was developed for each indicator. • Use of this method gives insight into social sustainability and improvement options. • This method is promising for social sustainability assessment of capture fisheries.
The Fluorescent-Oil Film Method and Other Techniques for Boundary-Layer Flow Visualization
NASA Technical Reports Server (NTRS)
Loving, Donald L.; Katzoff, S.
1959-01-01
A flow-visualization technique, known as the fluorescent-oil film method, has been developed which appears to be generally simpler and to require less experience and development of technique than previously published methods. The method is especially adapted to use in the large high-powered wind tunnels which require considerable time to reach the desired test conditions. The method consists of smearing a film of fluorescent oil over a surface and observing where the thickness is affected by the shearing action of the boundary layer. These films are detected and identified, and their relative thicknesses are determined by use of ultraviolet light. Examples are given of the use of this technique. Other methods that show promise in the study of boundary-layer conditions are described. These methods include the use of a temperature-sensitive fluorescent paint and the use of a radiometer that is sensitive to the heat radiation from a surface. Some attention is also given to methods that can be used with a spray apparatus in front of the test model.
Development of the Ion Exchange-Gravimetric Method for Sodium in Serum as a Definitive Method
Moody, John R.; Vetter, Thomas W.
1996-01-01
An ion exchange-gravimetric method, previously developed as a National Committee for Clinical Laboratory Standards (NCCLS) reference method for the determination of sodium in human serum, has been re-evaluated and improved. Sources of analytical error in this method have been examined more critically and the overall uncertainties decreased. Additionally, greater accuracy and repeatability have been achieved by the application of this definitive method to a sodium chloride reference material. In this method sodium in serum is ion-exchanged, selectively eluted and converted to a weighable precipitate as Na2SO4. Traces of sodium eluting before or after the main fraction, and precipitate contaminants are determined instrumentally. Co-precipitating contaminants contribute less than 0.1 % while the analyte lost to other eluted ion-exchange fractions contributes less than 0.02 % to the total precipitate mass. With improvements, the relative expanded uncertainty (k = 2) of the method, as applied to serum, is 0.3 % to 0.4 % and is less than 0.1 % when applied to a sodium chloride reference material. PMID:27805122
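The gravimetric step itself reduces to a molar-mass ratio: the weighed Na2SO4 precipitate carries a fixed mass fraction of sodium. A worked example in Python using standard atomic weights:

NA, S, O = 22.98977, 32.065, 15.9994        # standard atomic weights, g/mol
M_NA2SO4 = 2 * NA + S + 4 * O               # 142.042 g/mol

def sodium_from_precipitate(na2so4_mass_g):
    """Gravimetric factor: mass fraction of Na in the Na2SO4 precipitate."""
    return na2so4_mass_g * (2 * NA) / M_NA2SO4

print(sodium_from_precipitate(1.0000))      # ~0.3237 g Na per gram of Na2SO4

The instrumentally determined corrections (co-precipitating contaminants, sodium in other eluted fractions) enter as small additive adjustments to this figure.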
NASA Technical Reports Server (NTRS)
Ratcliffe, James G.
2010-01-01
This technical publication details part of an effort focused on the development of a standardized facesheet/core peel debonding test procedure. The purpose of the test is to characterize facesheet/core peel in sandwich structure, accomplished through the measurement of the critical strain energy release rate associated with the debonding process. Following an examination of previously developed tests and a recent evaluation of a selection of these methods, a single cantilever beam (SCB) specimen was identified as being a promising candidate for establishing such a standardized test procedure. The objective of the work described here was to begin development of a protocol for conducting a SCB test that will render the procedure suitable for standardization. To this end, a sizing methodology was developed to ensure appropriate SCB specimen dimensions are selected for a given sandwich system. Application of this method to actual sandwich systems yielded SCB specimen dimensions that would be practical for use. This study resulted in the development of a practical SCB specimen sizing method, which should be well-suited for incorporation into a standardized testing protocol.
Methods for determining time of death.
Madea, Burkhard
2016-12-01
Medicolegal death time estimation must estimate the time since death reliably. Reliability can only be provided empirically by statistical analysis of errors in field studies. Determining the time since death requires the calculation of measurable data along a time-dependent curve back to the starting point. Various methods are used to estimate the time since death. The current gold standard for death time estimation is a previously established nomogram method based on the two-exponential model of body cooling. Great experimental and practical achievements have been realized using this nomogram method. To reduce the margin of error of the nomogram method, a compound method was developed based on electrical and mechanical excitability of skeletal muscle, pharmacological excitability of the iris, rigor mortis, and postmortem lividity. Further increasing the accuracy of death time estimation involves the development of conditional probability distributions for death time estimation based on the compound method. Although many studies have evaluated chemical methods of death time estimation, such methods play a marginal role in daily forensic practice. However, increased precision of death time estimation has recently been achieved by considering various influencing factors (i.e., preexisting diseases, duration of terminal episode, and ambient temperature). Putrefactive changes may be used for death time estimation in water-immersed bodies. Furthermore, recently developed technologies, such as ¹H magnetic resonance spectroscopy, can be used to quantitatively study decompositional changes. This review addresses the gold standard method of death time estimation in forensic practice and promising technological and scientific developments in the field.
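The nomogram's two-exponential cooling model can be inverted numerically for the time since death. A Python sketch using the commonly published Marshall-Hoare/Henssge form for ambient temperatures up to about 23 °C (constants as usually quoted; an illustration, not a forensic tool):

import math
from scipy.optimize import brentq

def time_since_death_hours(t_rectal, t_ambient, body_mass_kg):
    """Invert the two-exponential (Marshall-Hoare/Henssge) cooling model,
    standard-conditions form for ambient <= 23 C."""
    q = (t_rectal - t_ambient) / (37.2 - t_ambient)   # normalized cooling
    b = -1.2815 * body_mass_kg ** -0.625 + 0.0284
    f = lambda t: 1.25 * math.exp(b * t) - 0.25 * math.exp(5 * b * t) - q
    return brentq(f, 0.05, 120.0)                     # root in hours

print(round(time_since_death_hours(32.0, 18.0, 75.0), 1), "hours post-mortem")

In practice the nomogram also applies corrective factors for clothing, wetness, and air movement, which this sketch omits.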
Comellas, L; Portillo, J L; Vaquero, M T
1993-12-24
A procedure for determining linear alkylbenzenesulphonates (LASs) in sewage sludge and amended soils has been developed. Extraction by sample treatment with 0.5 M potassium hydroxide in methanol under reflux was compared with a previously described Soxhlet extraction procedure using methanol with solid sodium hydroxide in the sample. Repeatability results were similar, with savings in extraction time, solvents and evaporation time. A clean-up method involving a C18 cartridge has been developed. Analytes were quantified by a reversed-phase HPLC method with UV and fluorescence detectors. Recoveries obtained were higher than 84%. The resulting procedure was applied to soils amended with high doses of sewage sludge (15%) and increasing quantities of added LASs. Degradation data for a 116-day period are presented.
NASA Technical Reports Server (NTRS)
Goldberg, Robert K.
2001-01-01
A research program is in progress to develop strain rate dependent deformation and failure models for the analysis of polymer matrix composites subjected to impact loads. Previously, strain rate dependent inelastic constitutive equations developed to model the polymer matrix were incorporated into a mechanics-of-materials-based micromechanics method. In the current work, the micromechanics method is revised such that the composite unit cell is divided into a number of slices. Micromechanics equations are then developed for each slice, with laminate theory applied to determine the elastic properties, effective stresses, and effective inelastic strains for the unit cell. Verification studies are conducted using two representative polymer matrix composites with a nonlinear, strain rate dependent deformation response. The computed results compare well to experimentally obtained values.
Pearson, Brooke; Mills, Alexander; Tucker, Madeline; Gao, Siyue; McLandsborough, Lynne; He, Lili
2018-06-01
Bacterial foodborne illness continues to be a pressing issue in our food supply. Rapid detection methods are needed for perishable foods due to their short shelf lives and significant contribution to foodborne illness. Previously, a sensitive and reliable surface-enhanced Raman spectroscopy (SERS) sandwich assay based on 3-mercaptophenylboronic acid (3-MBPA) as a capturer and indicator molecule was developed for rapid bacteria detection. In this study, we explored the advantages and constraints of this assay over the conventional aerobic plate count (APC) method and further developed methods for detection in real environmental and food matrices. The SERS sandwich assay was able to detect environmental bacteria in pond water and on spinach leaves at higher levels than the APC method. In addition, the SERS assay appeared to be more sensitive for quantifying bacteria in the stationary phase. On the other hand, the APC method was more sensitive to cell viability. Finally, a method to detect bacteria in a challenging high-sugar juice matrix was developed to enhance bacteria capture. This study advances the SERS technique toward real applications in environmental and food matrices. Copyright © 2017 Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Jutte, Christine V.; Ko, William L.; Stephens, Craig A.; Bakalyar, John A.; Richards, W. Lance
2011-01-01
A ground loads test of a full-scale wing (175-ft span) was conducted using a fiber optic strain-sensing system to obtain distributed surface strain data. These data were input into previously developed deformed shape equations to calculate the wing's bending and twist deformation. A photogrammetry system measured actual shape deformation. The wing deflections reached 100 percent of the positive design limit load (equivalent to 3 g) and 97 percent of the negative design limit load (equivalent to -1 g). The calculated wing bending results were in excellent agreement with the actual bending; tip deflections were within +/- 2.7 in. (out of 155-in. max deflection) for 91 percent of the load steps. Experimental testing revealed valuable opportunities for improving the robustness of the deformed shape equations to real-world (not perfect) strain data, which previous analytical testing did not detect. These improvements, which include filtering methods developed in this work, minimize errors due to numerical anomalies discovered in the remaining 9 percent of the load steps. As a result, all load steps attained +/- 2.7 in. accuracy. Wing twist results were very sensitive to errors in bending and require further development. A sensitivity analysis and recommendations for fiber implementation practices, along with effective filtering methods, are included.
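The deformed shape equations themselves are not given in the abstract; the sketch below shows one generic strain-to-deflection scheme in the same spirit: measured surface strains are filtered (to suppress the kind of point anomalies the abstract mentions), converted to curvature via the local distance to the neutral axis, and integrated twice along the span. All names, dimensions, and the strain distribution are hypothetical.

```python
import numpy as np

def deflection_from_strain(x, strain, half_depth, window=5):
    """Estimate bending deflection from surface strain along a beam-like wing.

    Curvature is approximated as kappa = strain / c (c: distance from the
    neutral axis to the sensing surface), then integrated twice with the
    trapezoidal rule. A moving-average filter suppresses point anomalies in
    the measured strain before integration. Root is assumed clamped.
    """
    kernel = np.ones(window) / window
    smoothed = np.convolve(strain, kernel, mode="same")   # simple noise filter
    kappa = smoothed / half_depth                          # curvature, 1/in
    slope = np.concatenate(
        ([0.0], np.cumsum(np.diff(x) * 0.5 * (kappa[1:] + kappa[:-1]))))
    defl = np.concatenate(
        ([0.0], np.cumsum(np.diff(x) * 0.5 * (slope[1:] + slope[:-1]))))
    return defl

x = np.linspace(0.0, 1050.0, 211)        # in; illustrative semispan stations
strain = 1e-4 * (1.0 - x / x[-1])        # illustrative strain distribution
print(f"tip deflection: {deflection_from_strain(x, strain, 12.0)[-1]:.2f} in")
```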
NASA Technical Reports Server (NTRS)
DellaCorte, Christopher
2010-01-01
Foil gas bearings are a key technology in many commercial and emerging Oil-Free turbomachinery systems. These bearings are non-linear and have been difficult to model analytically in terms of performance characteristics such as load capacity, power loss, stiffness, and damping. Previous investigations led to an empirically derived method, a rule-of-thumb, to estimate load capacity. This method has been a valuable tool in system development. The current paper extends this tool concept to include rules for stiffness and damping coefficient estimation. It is expected that these rules will further accelerate the development and deployment of advanced Oil-Free machines operating on foil gas bearings.
Kokubun, Hideya; Ouki, Makiko; Matoba, Motohiro; Kubo, Hiroaki; Hoka, Sumio; Yago, Kazuo
2005-03-01
We developed an HPLC procedure using electrochemical detection for the quantitation of oxycodone and hydrocotarnine in cancer patients' serum. An eluent of methanol:acetonitrile:5 mM pH 8 phosphate buffer (2:1:7) was used as the mobile phase. The calibration curve was linear in the range from 10 ng/mL to 100 ng/mL. The recovery of oxycodone and hydrocotarnine was 97.2% and 90.5%, respectively. The within-run and between-run relative standard deviations for the assay of oxycodone or hydrocotarnine were less than 4.8%. The method developed here performed better than the previously reported method.
Zhai, Di-Hua; Xia, Yuanqing
2018-02-01
This paper addresses adaptive control for task-space teleoperation systems with a constrained predefined synchronization error, for which a novel switched control framework is investigated. Based on the multiple Lyapunov-Krasovskii functionals method, the stability of the resulting closed-loop system is established in the sense of state-independent input-to-output stability. Compared with previous work, the developed method can simultaneously handle unknown kinematics/dynamics, asymmetric varying time delays, and prescribed performance control in a unified framework. It is shown that the developed controller can guarantee the prescribed transient-state and steady-state synchronization performances between the master and slave robots, which is demonstrated by the simulation study.
Epithelial Membrane Protein-2 Expression is an Early Predictor of Endometrial Cancer Development
Habeeb, Omar; Goodglick, Lee; Soslow, Robert A.; Rao, Rajiv; Gordon, Lynn K.; Schirripa, Osvaldo; Horvath, Steve; Braun, Jonathan; Seligson, David B.; Wadehra, Madhuri
2010-01-01
BACKGROUND Endometrial cancer (EC) is a common malignancy worldwide. It is often preceded by endometrial hyperplasia, whose management and risk of neoplastic progression vary. Previously, we have shown that the tetraspan protein Epithelial Membrane Protein-2 (EMP2) is a prognostic indicator for EC aggressiveness and survival. Here we validate the expression of EMP2 in EC, and further examine whether EMP2 expression within preneoplastic lesions is an early prognostic biomarker for EC development. METHODS A tissue microarray (TMA) was constructed with a wide representation of benign and malignant endometrial samples. The TMA contains a metachronous cohort of cases from individuals who either developed or did not develop EC. Intensity and frequency of EMP2 expression were assessed using immunohistochemistry. RESULTS There was a stepwise, statistically significant increase in average EMP2 expression from benign to hyperplasia to atypia to EC. Furthermore, detailed analysis of EMP2 expression in potentially premalignant cases demonstrated that EMP2 positivity was a strong predictor of EC development. CONCLUSION EMP2 is an early predictor of EC development in preneoplastic lesions. In addition, combined with our previous findings, these results validate EMP2 as a novel biomarker for EC development. PMID:20578181
Wang, Xin; Mair, Raydel; Hatcher, Cynthia; Theodore, M Jordan; Edmond, Karen; Wu, Henry M; Harcourt, Brian H; Carvalho, Maria da Gloria S; Pimenta, Fabiana; Nymadawa, Pagbajab; Altantsetseg, Dorjpurev; Kirsch, Mariah; Satola, Sarah W; Cohn, Amanda; Messonnier, Nancy E; Mayer, Leonard W
2011-04-01
Since the introduction of the Haemophilus influenzae serotype b (Hib) vaccine, other serotypes and non-typeable strains have taken on greater importance as a cause of Hi disease. A rapid and accurate method is needed to detect all Hi regardless of encapsulation status. We developed 2 real-time PCR (rt-PCR) assays to detect specific regions of the protein D gene (hpd). Both hpd assays are very specific and sensitive for detection of Hi. Of the 63 non-Hi isolates representing 21 bacterial species, none was detected by the hpd #1 assay, and only one of 2 H. aphrophilus isolates was detected by the hpd #3 assay. The hpd #1 and #3 assays detected 97% (229/237) and 99% (234/237) of Hi isolates, respectively, and were superior for detection of both typeable and non-typeable Hi isolates compared to previously developed rt-PCR assays targeting ompP2 or bexA. The diagnostic sensitivity and specificity of these rt-PCR assays were assessed on cerebrospinal fluid specimens collected as part of meningitis surveillance in Ulaanbaatar, Mongolia. The etiology (Neisseria meningitidis, Hi, and Streptococcus pneumoniae) of 111 suspected meningitis cases was determined by conventional methods (culture and latex agglutination), previously developed rt-PCR assays, and the new hpd assays. The rt-PCR assays were more sensitive for detection of meningitis pathogens than the classical methods and improved detection from 50% (56/111) to 75% (83/111). The hpd #3 assay identified a non-b Hi that was missed by the bexA assay and other methods. A sensitive rt-PCR assay that detects both typeable and non-typeable Hi is a useful tool for improving Hi disease surveillance, especially after Hib vaccine introduction. Published by Elsevier GmbH.
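The quoted detection rates (229/237 and 234/237) are simple proportions; a small helper like the one below reproduces them, with a Wilson score interval added for context (the interval is our addition, not reported in the study).

```python
from math import sqrt

def detection_rate(detected, total, z=1.96):
    """Point estimate and Wilson 95% score interval for an assay hit rate."""
    p = detected / total
    denom = 1 + z**2 / total
    center = (p + z**2 / (2 * total)) / denom
    half = z * sqrt(p * (1 - p) / total + z**2 / (4 * total**2)) / denom
    return p, (center - half, center + half)

# Figures quoted in the abstract for the two hpd assays
for name, hit, n in [("hpd #1", 229, 237), ("hpd #3", 234, 237)]:
    p, ci = detection_rate(hit, n)
    print(f"{name}: {p:.1%} (95% CI {ci[0]:.1%}-{ci[1]:.1%})")
```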
Soucek, David J; Dickinson, Amy
2015-09-01
Although insects occur in nearly all freshwater ecosystems, few sensitive insect models exist for use in determining the toxicity of contaminants. The objectives of the present study were to adapt previously developed culturing and toxicity testing methods for the mayfly Neocloeon triangulifer (Ephemeroptera: Baetidae), and to further develop a method for chronic toxicity tests spanning organism ages of less than 24 h post hatch to adult emergence, using a laboratory cultured diatom diet. The authors conducted 96-h fed acute tests and full-life chronic toxicity tests with sodium chloride, sodium nitrate, and sodium sulfate. The authors generated 96-h median lethal concentrations (LC50s) of 1062 mg Cl/L (mean of 3 tests), 179 mg N-NO3/L, and 1227 mg SO4/L. Acute to chronic ratios ranged from 2.1 to 6.4 for chloride, 2.5 to 5.1 for nitrate, and 2.3 to 8.5 for sulfate. The endpoints related to survival and development time were consistently the most sensitive in the tests. The chronic values generated for chloride were in the same range as those generated by others using natural foods. Furthermore, our weight-versus-fecundity plots were similar to those previously published using the food culturing method on which the present authors' method was based, indicating good potential for standardization. The authors believe that the continued use of this sensitive mayfly species in laboratory studies will help to close the gap in understanding between standard laboratory toxicity test results and field-based observations of community impairment. © 2015 SETAC.
Jochems, Arthur; El-Naqa, Issam; Kessler, Marc; Mayo, Charles S; Jolly, Shruti; Matuszak, Martha; Faivre-Finn, Corinne; Price, Gareth; Holloway, Lois; Vinod, Shalini; Field, Matthew; Barakat, Mohamed Samir; Thwaites, David; de Ruysscher, Dirk; Dekker, Andre; Lambin, Philippe
2018-02-01
Early death after a treatment can be seen as a therapeutic failure. Accurate prediction of patients at risk for early mortality is crucial to avoid unnecessary harm and reduce costs. The goal of our work is twofold: first, to evaluate the performance of a previously published model for early death in our cohorts; second, to develop a prognostic model for early death prediction following radiotherapy. Patients with NSCLC treated with chemoradiotherapy or radiotherapy alone were included in this study. Four different cohorts from different countries were available for this work (N = 1540). The previous model used age, gender, performance status, tumor stage, income deprivation, no previous treatment given (yes/no), and body mass index to make predictions. A random forest model was developed by learning on the Maastro cohort (N = 698). The new model used performance status, age, gender, T and N stage, total tumor volume (cc), total tumor dose (Gy), and chemotherapy timing (none, sequential, concurrent) to make predictions. Death within 4 months of receiving the first radiotherapy fraction was used as the outcome. Early death rates ranged from 6 to 11% within the four cohorts. The previous model performed with AUC values ranging from 0.54 to 0.64 on the validation cohorts. Our newly developed model had improved AUC values ranging from 0.62 to 0.71 on the validation cohorts. Using advanced machine learning methods and informative variables, prognostic models for early mortality can be developed. Development of accurate prognostic tools for early mortality is important to inform patients about treatment options and optimize care.
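A hedged sketch of the modeling step: a random forest classifier trained on one cohort and scored by AUC on held-out data. The feature matrix is random stand-in data whose columns mirror the predictors named in the abstract; nothing else is taken from the study.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Hypothetical stand-ins for the predictors named in the abstract:
# performance status, age, gender, T stage, N stage, tumor volume (cc),
# total dose (Gy), chemotherapy timing (0=none, 1=sequential, 2=concurrent).
X = rng.normal(size=(698, 8))
y = rng.integers(0, 2, size=698)           # 1 = death within 4 months

model = RandomForestClassifier(n_estimators=500, random_state=0)
model.fit(X[:500], y[:500])                # "training cohort"
risk = model.predict_proba(X[500:])[:, 1]  # hold-out risk scores
print("AUC:", roc_auc_score(y[500:], risk))
```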
Attention to Social Stimuli and Facial Identity Recognition Skills in Autism Spectrum Disorder
ERIC Educational Resources Information Center
Wilson, C. E.; Brock, J.; Palermo, R.
2010-01-01
Background: Previous research suggests that individuals with autism spectrum disorder (ASD) have a reduced preference for viewing social stimuli in the environment and impaired facial identity recognition. Methods: Here, we directly tested a link between these two phenomena in 13 ASD children and 13 age-matched typically developing (TD) controls.…
Planning Your Journey in Coaching: Building a Network for Success
ERIC Educational Resources Information Center
Van Mullem, Pete; Croft, Chris
2015-01-01
A coach develops his or her craft by reflecting on previous playing experiences (Erickson, Côté, & Fraser-Thomas, 2007) and continuing to seek learning opportunities through a variety of informal and non-formal learning methods (e.g. discussion with other coaches, trial and error, observation, advice of a mentor, clinics, web sites, books and…
ERIC Educational Resources Information Center
Hitt, Fernando; González-Martín, Alejandro S.
2015-01-01
Semiotic representations have been an important topic of study in mathematics education. Previous research implicitly placed more importance on the development of institutional representations of mathematical concepts in students rather than other types of representations. In the context of an extensive research project, in progress since 2005,…
ERIC Educational Resources Information Center
Orgiles, Mireia; Johnson, Blair T.; Huedo-Medina, Tania B.; Espada, Jose P.
2012-01-01
Introduction: According to previous studies, when parents divorce it may increase the vulnerability of children to develop personal problems, such as lowering academic performance. This research examines the academic performance of Spanish children with divorced parents and its relation to academic self-concept and social anxiety. Method: The…
Unit area control--its development and application
William E. Hallin
1954-01-01
Thirty years of research in the ponderosa pine and mixed conifer forests of the Sierra of California by Dunning and his associates have shown that previous methods of silviculture were not providing adequate restocking of pine. Analysis of records indicated a new approach was necessary. From this work, new procedures that gave most promise of success were formulated...
ERIC Educational Resources Information Center
Lancioni, Giulio E.; Bellini, Domenico; Oliva, Doretta; Singh, Nirbhay N.; O'Reilly, Mark F.; Sigafoos, Jeff; Lang, Russell
2012-01-01
Background: A camera-based microswitch technology was recently developed to monitor small facial responses of persons with multiple disabilities and allow those responses to control environmental stimulation. This study assessed such a technology with 2 new participants using slight variations of previous responses. Method: The technology involved…
Using a previously developed method to measure OH production, formation rates were obtained for several water systems. Employing an amino-nitroxide probe and DMSO, an action spectrum for the product consistent with the production of OH by quinone moieties within humic material...
ERIC Educational Resources Information Center
Jones, Anna Marie; Punia, Mandeep; Young, Shannan; Huegli, Carol Chase; Zidenberg-Cherr, Sheri
2013-01-01
Purpose/Objectives: The objective of this study was to determine the perceived training needs of California school nutrition personnel. Methods: A questionnaire was developed using items from previous questionnaires administered to similar populations. New items were written based on feedback from stakeholders. Respondents were asked to rate their…
Formation of Verbal Behavior of Deaf-Blind Children.
ERIC Educational Resources Information Center
Umezu, Hachizo
The monograph describes the development of verbal behavior over a 20-year period in two deaf Japanese children (5 and 7 years old when first contacted by the author) with whom previous training attempts had failed. It is noted that prior training methods which had succeeded with Laura Bridgman and Helen Keller failed with these two children. A…
ERIC Educational Resources Information Center
Trujillo, Caleb M.; Anderson, Trevor R.; Pelaez, Nancy J.
2016-01-01
When undergraduate biology students learn to explain biological mechanisms, they face many challenges and may overestimate their understanding of living systems. Previously, we developed the MACH model of four components used by expert biologists to explain mechanisms: Methods, Analogies, Context, and How. This study explores the implementation of…
Women with previous stress fractures show reduced bone material strength
Duarte Sosa, Daysi; Fink Eriksen, Erik
2016-01-01
Background and purpose — Bone fragility is determined by bone mass, bone architecture, and the material properties of bone. Microindentation has been introduced as a measurement method that reflects bone material properties. The pathogenesis underlying stress fractures, in particular the role of impaired bone material properties, is still poorly understood. Based on the hypothesis that impaired bone material strength might play a role in the development of stress fractures, we used microindentation in patients with stress fractures and in controls. Patients and methods — We measured bone material strength index (BMSi) by microindentation in 30 women with previous stress fractures and in 30 normal controls. Bone mineral density by DXA and levels of the bone markers C-terminal cross-linking telopeptide of type-1 collagen (CTX) and N-terminal propeptide of type-1 procollagen (P1NP) were also determined. Results — Mean (SD) BMSi in stress fracture patients was significantly lower than in the controls (72 (8.7) vs. 77 (7.2); p = 0.02). The fracture subjects also had a significantly lower mean bone mineral density (BMD) than the controls (0.9 (0.02) vs. 1.0 (0.06); p = 0.03). Bone turnover—as reflected in serum levels of the bone marker CTX—was similar in both groups, while P1NP levels were significantly higher in the women with stress fractures (55 μg/L vs. 42 μg/L; p = 0.03). There was no correlation between BMSi and BMD or bone turnover. Interpretation — BMSi was inferior in patients with previous stress fracture, but was unrelated to BMD and bone turnover. The lower values of BMSi in patients with previous stress fracture, combined with a lower BMD, may contribute to the increased propensity to develop stress fractures in these patients. PMID:27321443
Kozar, Mark D.; Kahle, Sue C.
2013-01-01
This report documents the standard procedures, policies, and field methods used by the U.S. Geological Survey’s (USGS) Washington Water Science Center staff for activities related to the collection, processing, analysis, storage, and publication of groundwater data. This groundwater quality-assurance plan changes through time to accommodate new methods and requirements developed by the Washington Water Science Center and the USGS Office of Groundwater. The plan is based largely on requirements and guidelines provided by the USGS Office of Groundwater, or the USGS Water Mission Area. Regular updates to this plan represent an integral part of the quality-assurance process. Because numerous policy memoranda have been issued by the Office of Groundwater since the previous groundwater quality assurance plan was written, this report is a substantial revision of the previous report, supplants it, and contains significant additional policies not covered in the previous report. This updated plan includes information related to the organization and responsibilities of USGS Washington Water Science Center staff, training, safety, project proposal development, project review procedures, data collection activities, data processing activities, report review procedures, and archiving of field data and interpretative information pertaining to groundwater flow models, borehole aquifer tests, and aquifer tests. Important updates from the previous groundwater quality assurance plan include: (1) procedures for documenting and archiving of groundwater flow models; (2) revisions to procedures and policies for the creation of sites in the Groundwater Site Inventory database; (3) adoption of new water-level forms to be used within the USGS Washington Water Science Center; (4) procedures for future creation of borehole geophysics, surface geophysics, and aquifer-test archives; and (5) use of the USGS Multi Optional Network Key Entry System software for entry of routine water-level data collected as part of long-term water-level monitoring networks.
The Complex Admixture History and Recent Southern Origins of Siberian Populations
Pugach, Irina; Matveev, Rostislav; Spitsyn, Viktor; Makarov, Sergey; Novgorodov, Innokentiy; Osakovsky, Vladimir; Stoneking, Mark; Pakendorf, Brigitte
2016-01-01
Although Siberia was inhabited by modern humans at an early stage, there is still debate over whether it remained habitable during the extreme cold of the Last Glacial Maximum or whether it was subsequently repopulated by peoples with recent shared ancestry. Previous studies of the genetic history of Siberian populations were hampered by the extensive admixture that appears to have taken place among these populations, because commonly used methods assume a tree-like population history and at most single admixture events. Here we analyze geogenetic maps and use other approaches to distinguish the effects of shared ancestry from prehistoric migrations and contact, and develop a new method based on the covariance of ancestry components, to investigate the potentially complex admixture history. We furthermore adapt a previously devised method of admixture dating for use with multiple events of gene flow, and apply these methods to whole-genome genotype data from over 500 individuals belonging to 20 different Siberian ethnolinguistic groups. The results of these analyses indicate that there have been multiple layers of admixture detectable in most of the Siberian populations, with considerable differences in the admixture histories of individual populations. Furthermore, most of the populations of Siberia included here, even those settled far to the north, appear to have a southern origin, with the northward expansions of different populations possibly being driven partly by the advent of pastoralism, especially reindeer domestication. These newly developed methods to analyze multiple admixture events should aid in the investigation of similarly complex population histories elsewhere. PMID:26993256
Methods Development for Spectral Simplification of Room-Temperature Rotational Spectra
NASA Astrophysics Data System (ADS)
Kent, Erin B.; Shipman, Steven
2014-06-01
Room-temperature rotational spectra are dense and difficult to assign, and so we have been working to develop methods to accelerate this process. We have tested two different methods with our waveguide-based spectrometer, which operates from 8.7 to 26.5 GHz. The first method, based on previous work by Medvedev and De Lucia, was used to estimate lower state energies of transitions by performing relative intensity measurements at a range of temperatures between -20 and +50 °C. The second method employed hundreds of microwave-microwave double resonance measurements to determine level connectivity between rotational transitions. The relative intensity measurements were not particularly successful in this frequency range (the reasons for this will be discussed), but the information gleaned from the double-resonance measurements can be incorporated into other spectral search algorithms (such as autofit or genetic algorithm approaches) via scoring or penalty functions to help with the spectral assignment process. I.R. Medvedev, F.C. De Lucia, Astrophys. J. 656, 621-628 (2007).
Stuebner, Michael; Haider, Mansoor A
2010-06-18
A new and efficient method for numerical solution of the continuous spectrum biphasic poroviscoelastic (BPVE) model of articular cartilage is presented. Development of the method is based on a composite Gauss-Legendre quadrature approximation of the continuous spectrum relaxation function that leads to an exponential series representation. The separability property of the exponential terms in the series is exploited to develop a numerical scheme that can be reduced to an update rule requiring retention of the strain history at only the previous time step. The cost of the resulting temporal discretization scheme is O(N) for N time steps. Application and calibration of the method is illustrated in the context of a finite difference solution of the one-dimensional confined compression BPVE stress-relaxation problem. Accuracy of the numerical method is demonstrated by comparison to a theoretical Laplace transform solution for a range of viscoelastic relaxation times that are representative of articular cartilage. Copyright (c) 2010 Elsevier Ltd. All rights reserved.
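A minimal sketch of the key idea: once the relaxation function is approximated by an exponential series, the hereditary integral can be advanced recursively, retaining only the previous step's internal state (O(N) total cost for N steps). The series weights, relaxation times, and loading below are illustrative, and the piecewise-linear-strain update is one standard choice, not necessarily the authors' exact scheme.

```python
import numpy as np

# Quadrature-derived exponential series G(t) = sum_k g[k] * exp(-t / tau[k])
g   = np.array([0.30, 0.20, 0.10])   # illustrative spectrum weights
tau = np.array([0.01, 0.10, 1.00])   # relaxation times (s)
G_inf, dt = 0.40, 1e-3               # equilibrium modulus, time step (s)

def stress_history(strain):
    """O(N) hereditary-integral update: each exponential term is advanced
    recursively, so only the previous step's internal state is retained."""
    h = g * strain[0]                     # internal variables, one per term
    decay = np.exp(-dt / tau)
    gain = g * tau / dt * (1.0 - decay)   # exact for linear strain per step
    out = np.empty_like(strain)
    out[0] = G_inf * strain[0] + h.sum()
    prev = strain[0]
    for n in range(1, len(strain)):
        h = decay * h + gain * (strain[n] - prev)
        prev = strain[n]
        out[n] = G_inf * strain[n] + h.sum()
    return out

eps = np.minimum(np.arange(5000) * dt / 0.5, 1.0) * 0.05  # ramp-and-hold
print(f"relaxed stress: {stress_history(eps)[-1]:.4f}")
```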
A forward model-based validation of cardiovascular system identification
NASA Technical Reports Server (NTRS)
Mukkamala, R.; Cohen, R. J.
2001-01-01
We present a theoretical evaluation of a cardiovascular system identification method that we previously developed for the analysis of beat-to-beat fluctuations in noninvasively measured heart rate, arterial blood pressure, and instantaneous lung volume. The method provides a dynamical characterization of the important autonomic and mechanical mechanisms responsible for coupling the fluctuations (inverse modeling). To carry out the evaluation, we developed a computational model of the cardiovascular system capable of generating realistic beat-to-beat variability (forward modeling). We applied the method to data generated from the forward model and compared the resulting estimated dynamics with the actual dynamics of the forward model, which were either precisely known or easily determined. We found that the estimated dynamics corresponded to the actual dynamics and that this correspondence was robust to forward model uncertainty. We also demonstrated the sensitivity of the method in detecting small changes in parameters characterizing autonomic function in the forward model. These results provide confidence in the performance of the cardiovascular system identification method when applied to experimental data.
Goldstein, S J; Hensley, C A; Armenta, C E; Peters, R J
1997-03-01
Recent developments in extraction chromatography have simplified the separation of americium from complex matrices in preparation for alpha-spectroscopy relative to traditional methods. Here we present results of procedures developed/adapted for water, air, and bioassay samples with less than 1 g of inorganic residue. Prior analytical methods required the use of a complex, multistage procedure for separation of americium from these matrices. The newer, simplified procedure requires only a single 2 mL extraction chromatographic separation for isolation of Am and lanthanides from other components of the sample. This method has been implemented on an extensive variety of "real" environmental and bioassay samples from the Los Alamos area, and consistently reliable and accurate results with appropriate detection limits have been obtained. The new method increases analytical throughput by a factor of approximately 2 and decreases environmental hazards from acid and mixed-waste generation relative to the prior technique. Analytical accuracy, reproducibility, and reliability are also significantly improved over the more complex and laborious method used previously.
3DHZETRN: Inhomogeneous Geometry Issues
NASA Technical Reports Server (NTRS)
Wilson, John W.; Slaba, Tony C.; Badavi, Francis F.
2017-01-01
Historical methods for assessing radiation exposure inside complicated geometries for space applications were limited by computational constraints and lack of knowledge associated with nuclear processes occurring over a broad range of particles and energies. Various methods were developed and utilized to simplify geometric representations and enable coupling with simplified but efficient particle transport codes. Recent transport code development efforts, leading to 3DHZETRN, now enable such approximate methods to be carefully assessed to determine if past exposure analyses and validation efforts based on those approximate methods need to be revisited. In this work, historical methods of representing inhomogeneous spacecraft geometry for radiation protection analysis are first reviewed. Two inhomogeneous geometry cases, previously studied with 3DHZETRN and Monte Carlo codes, are considered with various levels of geometric approximation. Fluence, dose, and dose equivalent values are computed in all cases and compared. It is found that although these historical geometry approximations can induce large errors in neutron fluences up to 100 MeV, errors on dose and dose equivalent are modest (<10%) for the cases studied here.
Advanced Testing Method for Ground Thermal Conductivity
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Xiaobing; Clemenzi, Rick; Liu, Su
A new method is developed that can quickly and more accurately determine the effective ground thermal conductivity (GTC) based on thermal response test (TRT) results. Ground thermal conductivity is an important parameter for sizing ground heat exchangers (GHEXs) used by geothermal heat pump systems. The conventional GTC test method usually requires a TRT for 48 hours with a very stable electric power supply throughout the entire test. In contrast, the new method reduces the required test time by 40%–60% or more, and it can determine GTC even with an unstable or intermittent power supply. Consequently, it can significantly reduce the cost of GTC testing and increase its use, which will enable optimal design of geothermal heat pump systems. Further, this new method provides more information about the thermal properties of the GHEX and the ground than previous techniques. It can verify the installation quality of GHEXs and has the potential, if developed, to characterize the heterogeneous thermal properties of the ground formation surrounding the GHEXs.
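For context, the conventional analysis that the new method improves upon typically fits the late-time, infinite-line-source approximation, in which mean fluid temperature grows linearly in ln(t) with slope q/(4πkL). A sketch of that baseline estimate on synthetic data (all values illustrative):

```python
import numpy as np

def ground_conductivity(t_hours, t_fluid, q_watts, borehole_len_m):
    """Infinite-line-source estimate of ground thermal conductivity from a
    thermal response test: fit the late-time slope of mean fluid temperature
    versus ln(t); k = q / (4 * pi * L * slope)."""
    late = t_hours > 10.0                          # discard early-time data
    slope = np.polyfit(np.log(t_hours[late] * 3600.0), t_fluid[late], 1)[0]
    return q_watts / (4.0 * np.pi * borehole_len_m * slope)

t = np.linspace(0.5, 48.0, 96)                     # h
k_true, q, L = 2.5, 5000.0, 100.0                  # W/m-K, W, m
temp = 15.0 + q / (4 * np.pi * k_true * L) * np.log(t * 3600.0)
print(f"estimated k: {ground_conductivity(t, temp, q, L):.2f} W/m-K")
```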
Region of influence regression for estimating the 50-year flood at ungaged sites
Tasker, Gary D.; Hodge, S.A.; Barks, C.S.
1996-01-01
Five methods of developing regional regression models to estimate flood characteristics at ungaged sites in Arkansas are examined. The methods differ in the manner in which the State is divided into subregions. Each successive method (A to E) is computationally more complex than the previous method. Method A makes no subdivision. Methods B and C define two and four geographic subregions, respectively. Method D uses cluster/discriminant analysis to define subregions on the basis of similarities in watershed characteristics. Method E, the new region of influence method, defines a unique subregion for each ungaged site. Split-sample results indicate that, in terms of root-mean-square error, method E (38 percent error) is best. Methods C and D (42 and 41 percent error) were in a virtual tie for second, and methods B (44 percent error) and A (49 percent error) were fourth and fifth best.
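A hedged sketch of the region-of-influence idea as described: for each ungaged site, select the gaged basins most similar in watershed-characteristic space and fit a site-specific regression. The similarity metric, the log-log model form, the number of influencing sites, and the data are all assumptions for illustration.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

def roi_flood_estimate(gaged_X, gaged_q50, ungaged_x, n_influence=20):
    """Region-of-influence regression: fit a log-log model on the n gaged
    basins nearest (in standardized watershed characteristics) to the
    ungaged site, then predict its 50-year flood."""
    mu, sd = gaged_X.mean(0), gaged_X.std(0)
    dist = np.linalg.norm((gaged_X - mu) / sd - (ungaged_x - mu) / sd, axis=1)
    idx = np.argsort(dist)[:n_influence]           # the region of influence
    model = LinearRegression().fit(np.log(gaged_X[idx]), np.log(gaged_q50[idx]))
    return float(np.exp(model.predict(np.log(ungaged_x[None, :]))[0]))

rng = np.random.default_rng(1)
X = rng.uniform([10, 20], [500, 60], size=(80, 2))   # area (mi2), rainfall (in)
q50 = 30.0 * X[:, 0] ** 0.8 * X[:, 1] ** 0.5 * rng.lognormal(0, 0.1, 80)
print(f"Q50 estimate: {roi_flood_estimate(X, q50, np.array([120.0, 45.0])):.0f}")
```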
2010-01-01
Background The development of new wireless communication technologies that emit radio frequency electromagnetic fields (RF-EMF) is ongoing, but little is known about the RF-EMF exposure distribution in the general population. Previous attempts to measure personal exposure to RF-EMF have used different measurement protocols and analysis methods, making comparisons between exposure situations across different study populations very difficult. As a result, observed differences in exposure levels between study populations may not reflect real exposure differences but may be partly or wholly due to methodological differences. Methods The aim of this paper is to develop a study protocol for future personal RF-EMF exposure studies based on experience drawn from previous research. Using the current knowledge base, we propose procedures for the measurement of personal exposure to RF-EMF, data collection, data management and analysis, and methods for the selection and instruction of study participants. Results We have identified two basic types of personal RF-EMF measurement studies: population surveys and microenvironmental measurements. In the case of a population survey, the unit of observation is the individual, and a randomly selected representative sample of the population is needed to obtain reliable results. For microenvironmental measurements, study participants are selected in order to represent typical behaviours in different microenvironments. These two study types require different methods and procedures. Conclusion Applying our proposed common core procedures in future personal measurement studies will allow direct comparisons of personal RF-EMF exposures in different populations and study areas. PMID:20487532
NASA Astrophysics Data System (ADS)
Villanueva Perez, Carlos Hernan
Computational design optimization provides designers with automated techniques to develop novel and non-intuitive optimal designs. Topology optimization is a design optimization technique that allows for the evolution of a broad variety of geometries in the optimization process. Traditional density-based topology optimization methods often lack a sufficient resolution of the geometry and physical response, which prevents direct use of the optimized design in manufacturing and the accurate modeling of the physical response of boundary conditions. The goal of this thesis is to introduce a unified topology optimization framework that uses the Level Set Method (LSM) to describe the design geometry and the eXtended Finite Element Method (XFEM) to solve the governing equations and measure the performance of the design. The methodology is presented as an alternative to density-based optimization approaches, and is able to accommodate a broad range of engineering design problems. The framework presents state-of-the-art methods for immersed boundary techniques to stabilize the systems of equations and enforce the boundary conditions, and is studied with applications in 2D and 3D linear elastic structures, incompressible flow, and energy and species transport problems to test the robustness and the characteristics of the method. A comparison of the framework against density-based topology optimization approaches is studied with regards to convergence, performance, and the capability to manufacture the designs. Furthermore, the ability to control the shape of the design to operate within manufacturing constraints is developed and studied. The analysis capability of the framework is validated quantitatively through comparison against previous benchmark studies, and qualitatively through its application to topology optimization problems. The optimization problems converge to intuitive designs that closely resemble the results of previous 2D and density-based studies.
Lehotay, Steven J; Mastovska, Katerina; Lightfield, Alan R; Nuñez, Alberto; Dutko, Terry; Ng, Chilton; Bluhm, Louis
2013-10-25
A high-throughput qualitative screening and identification method for 9 aminoglycosides of regulatory interest has been developed, validated, and implemented for bovine kidney, liver, and muscle tissues. The method involves extraction at previously validated conditions, cleanup using disposable pipette extraction, and analysis by a 3 min ultrahigh-performance liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS) method. The drug analytes include neomycin, streptomycin, dihydrostreptomycin, and spectinomycin, which have residue tolerances in bovine tissues in the US, and kanamycin, gentamicin, apramycin, amikacin, and hygromycin, which do not have US tolerances established in bovine tissues. Tobramycin was used as an internal standard. An additional drug, paromomycin, also was validated in the method, but it was dropped during implementation due to conversion of neomycin into paromomycin. Proposed fragmentation patterns for the monitored ions of each analyte were elucidated with the aid of high-resolution MS using a quadrupole-time-of-flight instrument. Recoveries from spiking experiments at regulatory levels of concern showed that all analytes averaged 70-120% recoveries in all tissues, except hygromycin, which averaged 61% recovery. The lowest calibrated levels were as low as 0.005 μg/g in matrix extracts, which approximately corresponded to the limit of detection for screening purposes. Drug identifications at levels <0.05 μg/g were made in spiked and/or real samples for all analytes and tissues tested. Analyses of 60 samples from 20 slaughtered cattle previously screened positive for aminoglycosides showed that this method worked well in practice. The UHPLC-MS/MS method has several advantages compared to the previous microbial inhibition screening assay, especially for distinguishing individual drugs from a mixture and improving identification of gentamicin in tissue samples. Published by Elsevier B.V.
NASA Astrophysics Data System (ADS)
Jovanović, J.; Petronijević, R. B.; Lukić, M.; Karan, D.; Parunović, N.; Branković-Lazić, I.
2017-09-01
During the previous development of a chemometric method for estimating the amount of added colorant in meat products, it was noticed that the natural colorant most commonly added to boiled sausages, E 120, shows different CIE-LAB behavior compared to the artificial colors used for the same purpose. This opened the possibility of transforming the developed method into a method for identifying the addition of natural or synthetic colorants in boiled sausages based on measurement of the color of the cross-section. After recalibration of the CIE-LAB method using linear discriminant analysis, verification was performed on 76 boiled sausages of either the frankfurter or Parisian sausage type. The accuracy and reliability of the classification were confirmed by comparison with the standard HPLC method. Results showed that the LDA + CIE-LAB method can be applied with high accuracy (93.42%) to estimate the colorant type in boiled sausages. Natural orange colors can give false positive results. Pigments from spice mixtures had no significant effect on CIE-LAB results.
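A minimal sketch of the recalibrated classification step, assuming linear discriminant analysis applied to cross-section CIE-LAB coordinates. The class means and measurements below are fabricated placeholders, not the study's data.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(2)

# Hypothetical CIE-LAB cross-section measurements: natural (E 120) vs synthetic
lab_natural = rng.normal([55.0, 22.0, 14.0], [3.0, 2.0, 2.0], size=(40, 3))
lab_synthetic = rng.normal([58.0, 28.0, 10.0], [3.0, 2.0, 2.0], size=(36, 3))
X = np.vstack([lab_natural, lab_synthetic])
y = np.array([0] * 40 + [1] * 36)          # 0 = natural, 1 = synthetic

lda = LinearDiscriminantAnalysis().fit(X, y)
print("training accuracy:", lda.score(X, y))
print("class for L*a*b* = (56, 25, 12):", lda.predict([[56.0, 25.0, 12.0]])[0])
```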
NASA Astrophysics Data System (ADS)
Kim, D.; Lee, H.; Yu, H.; Beighley, E.; Durand, M. T.; Alsdorf, D. E.; Hwang, E.
2017-12-01
River discharge is a prerequisite for an understanding of flood hazard and water resource management, yet our knowledge of it is poor, especially over remote basins. Previous studies have successfully used classic hydraulic geometry, at-many-stations hydraulic geometry (AMHG), and Manning's equation to estimate river discharge. The theoretical bases of these empirical methods were introduced by Leopold and Maddock (1953) and Manning (1889), and they have long been used in the fields of hydrology, water resources, and geomorphology. However, methods to estimate river discharge from remotely sensed data essentially require bathymetric information for the river or are not applicable to braided rivers. Furthermore, the methods used in previous studies assumed steady, uniform flow conditions. Consequently, those methods have limitations in estimating river discharge in complex and unsteady flows in nature. In this study, we developed a novel approach to estimating river discharge by applying the weak learner method (here termed WLQ), one of the ensemble methods using multiple classifiers, to remotely sensed measurements of water levels from Envisat altimetry, effective river widths from PALSAR images, and multi-temporal surface water slopes over a part of the mainstem Congo. Compared with the methods used in previous studies, the root mean square error (RMSE) decreased from 5,089 m3s-1 to 3,701 m3s-1, and the relative RMSE (RRMSE) improved from 12% to 8%. It is expected that our method can provide improved estimates of river discharge in complex and unsteady flow conditions based on a data-driven, machine-learned prediction model (i.e., WLQ), even when bathymetric data are not available or the river is braided. Moreover, it is expected that the WLQ can be applied to measurements of river levels, slopes, and widths from the future Surface Water and Ocean Topography (SWOT) mission to be launched in 2021.
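The abstract does not specify the weak learner ensemble in detail; the sketch below uses boosted shallow regression trees as a stand-in, trained on hypothetical level/width/slope features, and reports RMSE and relative RMSE as in the study. All data and model settings are assumptions for illustration.

```python
import numpy as np
from sklearn.ensemble import AdaBoostRegressor
from sklearn.tree import DecisionTreeRegressor
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(3)

# Hypothetical remote-sensing features: water level (m), effective width (m),
# surface slope (cm/km); target is in-situ discharge (m3/s).
X = np.column_stack([rng.uniform(2, 10, 400),
                     rng.uniform(800, 3000, 400),
                     rng.uniform(2, 8, 400)])
Q = 5.0 * X[:, 0] ** 1.5 * X[:, 1] ** 0.5 * X[:, 2] ** 0.3 \
    * rng.lognormal(0.0, 0.08, 400)

ensemble = AdaBoostRegressor(DecisionTreeRegressor(max_depth=3),
                             n_estimators=300, random_state=0)
ensemble.fit(X[:300], Q[:300])
rmse = mean_squared_error(Q[300:], ensemble.predict(X[300:])) ** 0.5
print(f"RMSE: {rmse:.0f} m3/s, RRMSE: {rmse / Q[300:].mean():.1%}")
```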
Development efforts to improve curved-channel microchannel plates
NASA Technical Reports Server (NTRS)
Corbett, M. B.; Feller, W. B.; Laprade, B. N.; Cochran, R.; Bybee, R.; Danks, A.; Joseph, C.
1993-01-01
Curved-channel microchannel plate (C-plate) improvements resulting from an ongoing NASA STIS microchannel plate (MCP) development program are described. Performance limitations of previous C-plates led to a development program in support of the STIS MAMA UV photon counter, a second generation instrument on the Hubble Space Telescope. C-plate gain, quantum detection efficiency, dark noise, and imaging distortion, which are influenced by channel curvature non-uniformities, have all been improved through use of a new centrifuge fabrication technique. This technique will be described, along with efforts to improve older, more conventional shearing methods. Process optimization methods used to attain targeted C-plate performance goals will be briefly characterized. Newly developed diagnostic measurement techniques to study image distortion, gain uniformity, input bias angle, channel curvature, and ion feedback, will be described. Performance characteristics and initial test results of the improved C-plates will be reported. Future work and applications will also be discussed.
Efficient detection of dangling pointer error for C/C++ programs
NASA Astrophysics Data System (ADS)
Zhang, Wenzhe
2017-08-01
Dangling pointer errors are pervasive in C/C++ programs and are very hard to detect. This paper introduces an efficient detector for dangling pointer errors in C/C++ programs. By selectively leaving some memory accesses unmonitored, our method reduces the memory-monitoring overhead and thus achieves better performance than previous methods. Experiments show that our method achieves an average speedup of 9% over a previous compiler-instrumentation-based method and more than 50% over a previous page-protection-based method.
Formal verification of mathematical software
NASA Technical Reports Server (NTRS)
Sutherland, D.
1984-01-01
Methods are investigated for formally specifying and verifying the correctness of mathematical software (software which uses floating point numbers and arithmetic). Previous work in the field was reviewed. A new model of floating point arithmetic called the asymptotic paradigm was developed and formalized. Two different conceptual approaches to program verification, the classical Verification Condition approach and the more recently developed Programming Logic approach, were adapted to use the asymptotic paradigm. These approaches were then used to verify several programs; the programs chosen were simplified versions of actual mathematical software.
Double arch mirror study. Part 3: Fabrication and test report
NASA Technical Reports Server (NTRS)
Vukobratovich, D.; Hillman, D.
1983-01-01
A method of mounting a cryogenically cooled, lightweight, double arch, glass mirror was developed for infrared astronomical telescopes such as the Space Infrared Telescope Facility (SIRTF). A 50 cm fused silica mirror that had been fabricated previously was modified for use with the newly developed mount configuration. The modification of the mirror, the fabrication of the mirror mount, and the room temperature testing of the mounted mirror are reported. A design for a SIRTF-class primary mirror is suggested.
Numerical Technology for Large-Scale Computational Electromagnetics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sharpe, R; Champagne, N; White, D
The key bottleneck of implicit computational electromagnetics tools for large complex geometries is the solution of the resulting linear system of equations. The goal of this effort was to research and develop critical numerical technology that alleviates this bottleneck for large-scale computational electromagnetics (CEM). The mathematical operators and numerical formulations used in this arena of CEM yield linear equations that are complex valued, unstructured, and indefinite. Also, simultaneously applying multiple mathematical modeling formulations to different portions of a complex problem (hybrid formulations) results in a mixed structure linear system, further increasing the computational difficulty. Typically, these hybrid linear systems are solved using a direct solution method, which was acceptable for Cray-class machines but does not scale adequately for ASCI-class machines. Additionally, LLNL's previously existing linear solvers were not well suited for the linear systems that are created by hybrid implicit CEM codes. Hence, a new approach was required to make effective use of ASCI-class computing platforms and to enable the next generation design capabilities. Multiple approaches were investigated, including the latest sparse-direct methods developed by our ASCI collaborators. In addition, approaches that combine domain decomposition (or matrix partitioning) with general-purpose iterative methods and special purpose pre-conditioners were investigated. Special-purpose pre-conditioners that take advantage of the structure of the matrix were adapted and developed based on intimate knowledge of the matrix properties. Finally, new operator formulations were developed that radically improve the conditioning of the resulting linear systems, thus greatly reducing solution time. The goal was to enable the solution of CEM problems that are 10 to 100 times larger than our previous capability.
Solving the shrinkage-induced PDMS alignment registration issue in multilayer soft lithography
NASA Astrophysics Data System (ADS)
Moraes, Christopher; Sun, Yu; Simmons, Craig A.
2009-06-01
Shrinkage of polydimethylsiloxane (PDMS) complicates alignment registration between layers during multilayer soft lithography fabrication. This often hinders the development of large-scale microfabricated arrayed devices. Here we report a rapid method to construct large-area, multilayered devices with stringent alignment requirements. This technique, which exploits a previously unrecognized aspect of sandwich mold fabrication, improves device yield, enables highly accurate alignment over large areas of multilayered devices and does not require strict regulation of fabrication conditions or extensive calibration processes. To demonstrate this technique, a microfabricated Braille display was developed and characterized. High device yield and accurate alignment within 15 µm were achieved over three layers for an array of 108 Braille units spread over a 6.5 cm2 area, demonstrating the fabrication of well-aligned devices with greater ease and efficiency than previously possible.
NASA Astrophysics Data System (ADS)
Butler, S. L.
2017-12-01
The electrical resistivity method is now highly developed, with 2D and even 3D surveys routinely performed and fast inversion software available. However, simple rule-of-thumb formulas for important quantities like depth of investigation, horizontal position, and resolution have not previously been available; such formulas would be useful for survey planning, preliminary interpretation, and general education about the method. In this contribution, I will show that the sensitivity function of the resistivity method for a homogeneous half-space can be analyzed in terms of its first and second moments, which yield simple mathematical formulas. The first moment gives the sensitivity-weighted center of an apparent resistivity measurement, with the vertical center being an estimate of the depth of investigation. I will show that this depth of investigation estimate works at least as well as previous estimates based on the peak and median of the depth sensitivity function, which must be calculated numerically for a general four-electrode array. The vertical and horizontal first moments can also be used as pseudopositions when plotting 1, 2, and 3D pseudosections. The appropriate horizontal plotting point for a pseudosection was not previously obvious for nonsymmetric arrays. The second moments of the sensitivity function give estimates of the spatial extent of the region contributing to an apparent resistivity measurement and hence are measures of the resolution. These also have simple mathematical formulas.
ECHO: A reference-free short-read error correction algorithm
Kao, Wei-Chun; Chan, Andrew H.; Song, Yun S.
2011-01-01
Developing accurate, scalable algorithms to improve data quality is an important computational challenge associated with recent advances in high-throughput sequencing technology. In this study, a novel error-correction algorithm, called ECHO, is introduced for correcting base-call errors in short-reads, without the need of a reference genome. Unlike most previous methods, ECHO does not require the user to specify parameters of which optimal values are typically unknown a priori. ECHO automatically sets the parameters in the assumed model and estimates error characteristics specific to each sequencing run, while maintaining a running time that is within the range of practical use. ECHO is based on a probabilistic model and is able to assign a quality score to each corrected base. Furthermore, it explicitly models heterozygosity in diploid genomes and provides a reference-free method for detecting bases that originated from heterozygous sites. On both real and simulated data, ECHO is able to improve the accuracy of previous error-correction methods by severalfold to an order of magnitude, depending on the sequence coverage depth and the position in the read. The improvement is most pronounced toward the end of the read, where previous methods become noticeably less effective. Using a whole-genome yeast data set, it is demonstrated here that ECHO is capable of coping with nonuniform coverage. Also, it is shown that using ECHO to perform error correction as a preprocessing step considerably facilitates de novo assembly, particularly in the case of low-to-moderate sequence coverage depth. PMID:21482625
Measurement of breast volume using body scan technology(computer-aided anthropometry).
Veitch, Daisy; Burford, Karen; Dench, Phil; Dean, Nicola; Griffin, Philip
2012-01-01
Assessment of breast volume is an important tool for preoperative planning in various breast surgeries and other applications, such as bra development. Accurate assessment can improve the consistency and quality of surgery outcomes. This study outlines a non-invasive method to measure breast volume using a whole body 3D laser surface anatomy scanner, the Cyberware WBX. It expands on a previous publication where this method was validated against patients undergoing mastectomy. It specifically outlines and expands the computer-aided anthropometric (CAA) method for extracting breast volumes in a non-invasive way from patients enrolled in a breast reduction study at Flinders Medical Centre, South Australia. This step-by-step description allows others to replicate this work and provides an additional tool to assist them in their own clinical practice and development of designs.
A new method of Curie depth evaluation from magnetic data: Theory
NASA Technical Reports Server (NTRS)
Won, I. J. (Principal Investigator)
1981-01-01
An approach to estimating the Curie point isotherm uses the classical Gauss method inverting a system of nonlinear equations. The method, slightly modified by a differential correction technique, directly inverts filtered Magsat data to calculate the crustal structure above the Curie depth, which is modeled as a magnetized layer of varying thickness and susceptibility. Since the depth below the layer is assumed to be nonmagnetic, the bottom of the layer is interpreted as the Curie depth. The method, once fully developed, tested, and compared with previous work by others, is to be applied to a portion of the eastern U.S. when sufficient Magsat data are accumulated for the region.
Gaze Estimation Method Using Analysis of Electrooculogram Signals and Kinect Sensor
Tanno, Koichi
2017-01-01
A gaze estimation system is one communication method for severely disabled people who cannot perform gestures or speech. We previously developed an eye tracking method using compact, lightweight electrooculogram (EOG) measurements, but its accuracy is not very high. In the present study, we conducted experiments to investigate the EOG component most strongly correlated with changes in eye movement. The experiments were of two types: viewing objects using eye movements only, and viewing objects using combined face and eye movements. The experimental results show the feasibility of an eye tracking method using EOG signals and a Kinect sensor. PMID:28912800
Zaazaa, Hala E; Elzanfaly, Eman S; Soudi, Aya T; Salem, Maissa Y
2015-05-15
A ratio difference spectrophotometric method was developed for the determination of ibuprofen and famotidine in their binary mixture. Ibuprofen and famotidine were determined in the presence of each other by the ratio difference (RD) spectrophotometric method, with linearity obtained from 50 to 600 μg/mL for ibuprofen and from 2.5 to 25 μg/mL for famotidine. The suggested method was validated according to ICH guidelines and successfully applied to the analysis of ibuprofen and famotidine in their pharmaceutical dosage forms without interference from any additives or excipients. Copyright © 2015 Elsevier B.V. All rights reserved.
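A minimal numeric sketch of the ratio difference principle: dividing the mixture spectrum by a standard spectrum of one component renders that component's contribution constant in the ratio spectrum, so the amplitude difference between two wavelengths responds only to the other component. The spectra below are synthetic Gaussians and the wavelengths, concentrations, and band parameters are fabricated for illustration.

```python
import numpy as np

def rd_amplitude(a_mixture, a_divisor, i1, i2):
    """Ratio-difference signal: in the ratio spectrum a_mixture / a_divisor,
    the divisor component contributes a constant, so the difference between
    two wavelengths depends only on the other analyte."""
    ratio = a_mixture / a_divisor
    return ratio[i1] - ratio[i2]

wl = np.arange(220, 321)                       # nm
band = lambda c, mu, w: c * np.exp(-0.5 * ((wl - mu) / w) ** 2)
a_ibu, a_famo = band(1.0, 264, 12), band(1.0, 286, 15)   # unit spectra
mixture = 300.0 * a_ibu + 10.0 * a_famo        # ug/mL scale (illustrative)

i1, i2 = 30, 60                                # two chosen wavelengths
slope = rd_amplitude(a_ibu, a_famo, i1, i2)    # signal per ug/mL ibuprofen
print("ibuprofen estimate:", rd_amplitude(mixture, a_famo, i1, i2) / slope)
```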
NASA Astrophysics Data System (ADS)
Abedi, Maysam
2015-06-01
This reply discusses the results of two previously developed approaches in mineral prospectivity/potential mapping (MPM), i.e., ELECTRE III and PROMETHEE II as well-known methods in multi-criteria decision-making (MCDM) problems. Various geo-data sets are integrated to prepare MPM in which generated maps have acceptable matching with the drilled boreholes. Equal performance of the applied methods is indicated in the studied case. Complementary information of these methods is also provided in order to help interested readers to implement them in MPM process.
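For readers unfamiliar with PROMETHEE II, a compact sketch of the net-flow ranking it performs is given below, using the simple "usual" preference function. The criteria weights and evidence-layer scores are hypothetical, not the case study's data.

```python
import numpy as np

def promethee_ii(scores, weights):
    """PROMETHEE II net outranking flows. `scores` is (alternatives x
    criteria), higher is better; the 'usual' preference function
    (1 if strictly better, else 0) is assumed for brevity."""
    n = scores.shape[0]
    pi = np.zeros((n, n))                          # aggregated preference
    for j, w in enumerate(weights):
        d = scores[:, j][:, None] - scores[:, j][None, :]
        pi += w * (d > 0)                          # usual criterion
    phi_plus = pi.sum(axis=1) / (n - 1)            # leaving flow
    phi_minus = pi.sum(axis=0) / (n - 1)           # entering flow
    return phi_plus - phi_minus                    # net flow: rank descending

# Hypothetical evidence-layer scores for 4 target cells (geology, geochem,
# geophysics, remote sensing), weighted by expert judgment.
scores = np.array([[0.8, 0.6, 0.7, 0.5],
                   [0.4, 0.9, 0.5, 0.6],
                   [0.7, 0.5, 0.9, 0.8],
                   [0.3, 0.4, 0.2, 0.4]])
weights = np.array([0.4, 0.2, 0.3, 0.1])
print(np.argsort(-promethee_ii(scores, weights)))  # most to least prospective
```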
Nonlinear PET parametric image reconstruction with MRI information using kernel method
NASA Astrophysics Data System (ADS)
Gong, Kuang; Wang, Guobao; Chen, Kevin T.; Catana, Ciprian; Qi, Jinyi
2017-03-01
Positron Emission Tomography (PET) is a functional imaging modality widely used in oncology, cardiology, and neurology. It is highly sensitive, but suffers from relatively poor spatial resolution, as compared with anatomical imaging modalities, such as magnetic resonance imaging (MRI). With the recent development of combined PET/MR systems, we can improve the PET image quality by incorporating MR information. Previously we have used kernel learning to embed MR information in static PET reconstruction and direct Patlak reconstruction. Here we extend this method to direct reconstruction of nonlinear parameters in a compartment model by using the alternating direction of multiplier method (ADMM) algorithm. Simulation studies show that the proposed method can produce superior parametric images compared with existing methods.
A proof of the DBRF-MEGN method, an algorithm for deducing minimum equivalent gene networks
2011-01-01
Background We previously developed the DBRF-MEGN (difference-based regulation finding-minimum equivalent gene network) method, which deduces the most parsimonious signed directed graphs (SDGs) consistent with expression profiles of single-gene deletion mutants. However, until the present study, we have not presented the details of the method's algorithm or a proof of the algorithm. Results We describe in detail the algorithm of the DBRF-MEGN method and prove that the algorithm deduces all of the exact solutions of the most parsimonious SDGs consistent with expression profiles of gene deletion mutants. Conclusions The DBRF-MEGN method provides all of the exact solutions of the most parsimonious SDGs consistent with expression profiles of gene deletion mutants. PMID:21699737
A Mapmark method of standard setting as implemented for the National Assessment Governing Board.
Schulz, E Matthew; Mitzel, Howard C
2011-01-01
This article describes a Mapmark standard setting procedure, developed under contract with the National Assessment Governing Board (NAGB). The procedure enhances the bookmark method with spatially representative item maps, holistic feedback, and an emphasis on independent judgment. A rationale for these enhancements, and for the bookmark method, is presented, followed by a detailed description of the materials and procedures used in a meeting to set standards for the 2005 National Assessment of Educational Progress (NAEP) in Grade 12 mathematics. The use of difficulty-ordered content domains to provide holistic feedback is a particularly novel feature of the method. Process evaluation results comparing Mapmark to Angoff-based methods previously used for NAEP standard setting are also presented.
A contracting-interval program for the Danilewski method. Ph.D. Thesis - Va. Univ.
NASA Technical Reports Server (NTRS)
Harris, J. D.
1971-01-01
The concept of contracting-interval programs is applied to finding the eigenvalues of a matrix. The development is a three-step process in which (1) a program is developed for the reduction of a matrix to Hessenberg form, (2) a program is developed for the reduction of a Hessenberg matrix to colleague form, and (3) the characteristic polynomial with interval coefficients is readily obtained from the interval of colleague matrices. This interval polynomial is then factored into quadratic factors so that the eigenvalues may be obtained. To develop a contracting-interval program for factoring this polynomial with interval coefficients it is necessary to have an iteration method which converges even in the presence of controlled rounding errors. A theorem is stated giving sufficient conditions for the convergence of Newton's method when both the function and its Jacobian cannot be evaluated exactly but errors can be made proportional to the square of the norm of the difference between the previous two iterates. This theorem is applied to prove the convergence of the generalization of the Newton-Bairstow method that is used to obtain quadratic factors of the characteristic polynomial.
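For concreteness, a compact version of the Bairstow-type iteration for extracting one quadratic factor (in plain floating-point arithmetic, without the interval machinery of the thesis) might look like this:

```python
import numpy as np

def bairstow_factor(a, r=0.1, s=0.1, tol=1e-12, max_iter=100):
    """Find a quadratic factor x^2 - r*x - s of a polynomial.

    a: coefficients a[0] + a[1]*x + ... + a[n]*x^n, with n >= 3.
    Newton-type corrections (dr, ds) come from the standard
    synthetic-division recurrences; this is an ordinary-arithmetic
    sketch, not the contracting-interval program itself.
    """
    n = len(a) - 1
    for _ in range(max_iter):
        b = [0.0] * (n + 1)
        c = [0.0] * (n + 1)
        b[n], b[n - 1] = a[n], a[n - 1] + r * a[n]
        for i in range(n - 2, -1, -1):
            b[i] = a[i] + r * b[i + 1] + s * b[i + 2]
        c[n], c[n - 1] = b[n], b[n - 1] + r * b[n]
        for i in range(n - 2, 0, -1):
            c[i] = b[i] + r * c[i + 1] + s * c[i + 2]
        # Linearized remainder equations for the corrections (dr, ds)
        dr, ds = np.linalg.solve(np.array([[c[2], c[3]], [c[1], c[2]]]),
                                 np.array([-b[1], -b[0]]))
        r, s = r + dr, s + ds
        if abs(dr) + abs(ds) < tol:
            break
    return r, s  # roots follow from x^2 - r*x - s = 0
```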
Rogers, Richard S; Abernathy, Michael; Richardson, Douglas D; Rouse, Jason C; Sperry, Justin B; Swann, Patrick; Wypych, Jette; Yu, Christopher; Zang, Li; Deshpande, Rohini
2017-11-30
Today, we are experiencing unprecedented growth and innovation within the pharmaceutical industry. Established protein therapeutic modalities, such as recombinant human proteins, monoclonal antibodies (mAbs), and fusion proteins, are being used to treat previously unmet medical needs. Novel therapies such as bispecific T cell engagers (BiTEs), chimeric antigen T cell receptors (CARTs), siRNA, and gene therapies are paving the path towards increasingly personalized medicine. This advancement of new indications and therapeutic modalities is paralleled by development of new analytical technologies and methods that provide enhanced information content in a more efficient manner. Recently, a liquid chromatography-mass spectrometry (LC-MS) multi-attribute method (MAM) has been developed and designed for improved simultaneous detection, identification, quantitation, and quality control (monitoring) of molecular attributes (Rogers et al. MAbs 7(5):881-90, 2015). Based on peptide mapping principles, this powerful tool represents a true advancement in testing methodology that can be utilized not only during product characterization, formulation development, stability testing, and development of the manufacturing process, but also as a platform quality control method in dispositioning clinical materials for both innovative biotherapeutics and biosimilars.
NASA Technical Reports Server (NTRS)
Lih, Shyh-Shiuh; Bar-Cohen, Yoseph; Lee, Hyeong Jae; Takano, Nobuyuki; Bao, Xiaoqi
2013-01-01
An advanced signal processing methodology is being developed to monitor the height of condensed water through the wall of a steel pipe operating at temperatures as high as 250 °C. A previous study using existing techniques indicated that, when the water height is low or there is disturbance in the environment, the predicted water height may not be accurate. In recent years, the use of autocorrelation and envelope techniques in signal processing has been demonstrated to be a very useful tool for practical applications. In this paper, various signal processing techniques, including autocorrelation, the Hilbert transform, and the Shannon energy envelope method, were studied and implemented to determine the water height in the steam pipe. The results show that the developed method provides a good capability for monitoring the height under regular conditions. For shallow-water or no-water conditions, an alternative hybrid method based on the Hilbert transform (HT) with a high-pass filter and an optimized windowing technique is suggested. Further development of the reported methods would provide a powerful tool for identifying disturbances of the water height inside the pipe.
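A minimal sketch of the HT-plus-high-pass step might look as follows; the filter order and cutoff frequency are illustrative placeholders rather than reported parameters.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def echo_envelope(trace, fs, highpass_hz=50e3):
    """Hilbert-transform envelope of a high-pass filtered ultrasonic trace."""
    b, a = butter(4, highpass_hz / (fs / 2), btype="high")
    filtered = filtfilt(b, a, trace)   # zero-phase high-pass filtering
    return np.abs(hilbert(filtered))   # analytic-signal envelope
```

Peaks of this envelope (or of the trace's autocorrelation) give the echo delays from which the water height is inferred.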
A Noise Spectroscopy-Based Selective Gas Sensing with MOX Gas Sensors
NASA Astrophysics Data System (ADS)
Gomri, S.; Seguin, J.; Contaret, T.; Fiorido, T.; Aguir, K.
We propose a new method for obtaining a fluctuation-enhanced sensing (FES) signature of a gas using a single metal oxide (MOX) gas microsensor. Starting from our previously developed model of adsorption-desorption (A-D) noise, we show theoretically that the product of frequency and the power spectral density (PSD) of the gas-sensing layer resistance fluctuations often has a maximum that is characteristic of the gas. This property was experimentally confirmed for the detection of NO2 and O3 using a WO3 sensing layer. This method could be useful for classifying gases. Furthermore, our noise measurements confirm our previous model showing that the PSD of the A-D noise in a MOX gas sensor is a combination of Lorentzians having a low-frequency magnitude and a cut-off frequency that depends on the nature of the detected gas.
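A sketch of extracting such a signature from a measured resistance record, assuming uniformly sampled data and a Welch PSD estimate (the segment length is a placeholder choice):

```python
import numpy as np
from scipy.signal import welch

def fes_signature(resistance, fs):
    """Frequency at which f * PSD of the resistance fluctuations peaks.

    For A-D noise composed of Lorentzians, f*S(f) peaks near the
    gas-dependent cut-off frequency (illustrative, not the exact
    processing chain of the study).
    """
    f, psd = welch(resistance, fs=fs, nperseg=4096)
    f, psd = f[1:], psd[1:]          # drop the DC bin
    return f[np.argmax(f * psd)]
```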
In-situ measurement of electroosmotic drag coefficient in Nafion membrane for the PEMFC.
Peng, Zhe; Morin, Arnaud; Huguet, Patrice; Schott, Pascal; Pauchet, Joël
2011-11-10
A new method based on a hydrogen pump has been developed to measure the electroosmotic drag coefficient under representative PEMFC operating conditions. It eliminates the back-flow of water, which led to errors in the calculation of this coefficient with previously reported electrochemical methods. Measurements have been performed on 50 μm thick Nafion membranes, both extruded and recast. Contrary to what has been described in most previously published works, the electroosmotic drag coefficient decreases as the membrane water content increases. The same trend is observed for temperatures between 25 and 80 °C. For the same membrane water content, the electroosmotic drag coefficient increases with temperature. Under the same conditions, there is no difference in drag coefficient between extruded Nafion N112 and recast Nafion NRE212. These results are discussed on the basis of the two commonly accepted proton transport mechanisms, namely the Grotthuss and vehicular mechanisms.
Constructing service-oriented architecture adoption maturity matrix using Kano model
NASA Astrophysics Data System (ADS)
Hamzah, Mohd Hamdi Irwan; Baharom, Fauziah; Mohd, Haslina
2017-10-01
Organizations commonly adopt Service-Oriented Architecture (SOA) because it provides flexible reconfiguration and can reduce development time and cost. To guide SOA adoption, industry and academia have previously constructed SOA maturity models. However, there is a limited number of works on how to construct the matrix in these SOA maturity models. Therefore, this study provides a method that can be used to construct the matrix in an SOA maturity model. The study adapts the Kano model to construct a cross-evaluation matrix focused on the IT and business benefits of SOA adoption. It was found that the Kano model provides a suitable and appropriate method for constructing the cross-evaluation matrix in an SOA maturity model. The Kano model can also be used to plot, organize and better represent the evaluation dimensions for evaluating SOA adoption.
NASA Astrophysics Data System (ADS)
Miyajima, Hiroyuki; Yuhara, Naohiro
Regenerative Life Support Systems (RLSS), which maintain human lives by recycling substances essential for living, are composed of humans, plants, and material circulation systems. The plants supply food to the humans and regenerate water and gases by photosynthesis, while the material circulation systems physicochemically recycle and circulate substances discarded by humans and plants. RLSS attracts attention as manned space activities shift from short trips to long-term stays at bases such as a space station, a lunar base, or a Mars base. The typical present space base is the International Space Station (ISS), a manned experimental base for prolonged stays, where RLSS recycles only water and air. To accommodate prolonged and extended manned activity in future space bases, RLSS that also implements food production and regeneration of resources using plants is expected to be developed. The configuration of an RLSS should be designed to suit its own duty, which may give rise to design requirements for RLSS with unprecedented configurations. Accordingly, it is necessary to establish a conceptual design method for generalized RLSS. It is difficult, however, to systematize the design process by analyzing previous designs, because there are only a few ground-experimental facilities, namely CEEF (Closed Ecology Experiment Facilities) in Japan, BIO-Plex (Bioregenerative Planetary Life Support Systems Test Complex) in the U.S., and BIOS3 in Russia. For these reasons, a conceptual design method that does not rely on previous design examples is required for generalized RLSS. This study formalizes a conceptual design process and develops a conceptual design support tool for RLSS based on this process.
Niu, Yanyan; Li, Sensen; Lin, Zongtao; Liu, Meixian; Wang, Daidong; Wang, Hong; Chen, Shizhong
2016-09-09
Fufang Banbianlian Injection (FBI) has been widely used as an anti-inflammatory and anti-tumor prescription. To understand the relationships between its bioactive ingredients and pharmacological efficacies, our previous study successfully identified some DNA-binding compounds in FBI using an established on-line screening system in which 4',6-diamidino-2-phenylindole (DAPI) was used as a probe. However, DAPI can only be used to screen AT-specific DNA minor-groove binders, leaving the potential active intercalators in FBI unknown. As a continuation of our studies on FBI, here we present a sensitive analytical method for rapid identification and evaluation of DNA intercalators using propidium iodide (PI) as a fluorescent probe. We first established a high-performance liquid chromatography-diode-array detection-multistage mass spectrometry-deoxyribonucleic acid-propidium iodide-fluorescence detection (HPLC-DAD-MS(n)-DNA-PI-FLD) system. As a result, 38 of 58 previously identified compounds in FBI were active as DNA intercalators. Interestingly, all previously reported DNA binders also showed intercalative activity, suggesting that they are dual-mode DNA binders. Quantitative study showed that flavonoid glycosides and chlorogenic acids were the main active compounds in FBI and displayed similar DNA-binding ability with either DAPI or PI. In addition, 13 active compounds were used to establish structure-activity relationships. In this study, PI was developed into an on-line method for identifying DNA intercalators for the first time, and it should be a useful high-throughput screening technique for other related samples. Copyright © 2016 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Szeląg, Bartosz; Barbusiński, Krzysztof; Studziński, Jan; Bartkiewicz, Lidia
2017-11-01
In this study, models developed using data mining methods are proposed for predicting wastewater quality indicators (biochemical and chemical oxygen demand, total suspended solids, total nitrogen and total phosphorus) at the inflow to a wastewater treatment plant (WWTP). The models are based on values measured in previous time steps and on daily wastewater inflows. Independent prediction systems that can be used in case of monitoring device malfunction are also provided. Models of the wastewater quality indicators were developed using the MARS (multivariate adaptive regression spline) method, artificial neural networks (ANN) of the multilayer perceptron type combined with a classification model (SOM), and cascade neural networks (CNN). The lowest absolute and relative errors were obtained using ANN+SOM, whereas the MARS method produced the highest errors. It was shown that for the analysed WWTP it is possible to obtain continuous prediction of selected wastewater quality indicators using the two developed independent prediction systems. Such models can ensure reliable WWTP operation when wastewater quality monitoring systems become inoperable or are under maintenance.
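A hedged sketch of one such predictor, using lagged indicator values plus inflow as features; the network size, lag depth, and variable names are assumptions, not the calibrated models of the study.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

def make_lagged(series, inflow, lags=3):
    """Features: the indicator at t-1..t-lags plus the daily inflow at t."""
    X = np.column_stack(
        [series[i:len(series) - lags + i] for i in range(lags)]
        + [inflow[lags:]]
    )
    return X, series[lags:]

# cod: measured COD series, q: daily inflows (hypothetical arrays)
# X, y = make_lagged(cod, q)
# model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000).fit(X, y)
```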
NASA Astrophysics Data System (ADS)
Liu, Ruiwen; Jiao, Binbin; Kong, Yanmei; Li, Zhigang; Shang, Haiping; Lu, Dike; Gao, Chaoqun; Chen, Dapeng
2013-09-01
Micro-devices with a bi-material-cantilever (BMC) commonly suffer initial curvature due to the mismatch of residual stress. Traditional corrective methods to reduce the residual stress mismatch generally involve the development of different material deposition recipes. In this paper, a new method for reducing residual stress mismatch in a BMC is proposed based on various previously developed deposition recipes. An initial material film is deposited using two or more developed deposition recipes. This first film is designed to introduce a stepped stress gradient, which is then balanced by overlapping a second material film on the first and using appropriate deposition recipes to form a nearly stress-balanced structure. A theoretical model is proposed based on both the moment balance principle and total equal strain at the interface of two adjacent layers. Experimental results and analytical models suggest that the proposed method is effective in producing multi-layer micro cantilevers that display balanced residual stresses. The method provides a generic solution to the problem of mismatched initial stresses which universally exists in micro-electro-mechanical systems (MEMS) devices based on a BMC. Moreover, the method can be incorporated into a MEMS design automation package for efficient design of various multiple material layer devices from MEMS material library and developed deposition recipes.
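In schematic form (a generic statement of the balance conditions, not the paper's full model), a stack of films with residual stresses $\sigma_i(z)$ is curvature-free when both the net force and the net bending moment of those stresses vanish,

$$\sum_i \int_{h_i} \sigma_i(z)\, dz = 0, \qquad \sum_i \int_{h_i} \sigma_i(z)\, z\, dz = 0,$$

together with strain continuity at each interface; the stepped stress gradient of the first film is chosen so that the second film's contribution cancels it.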
NASA Astrophysics Data System (ADS)
Jayanimitta, M. E.; Puspasari, D. A.; Widyahantari, R.; Kristina, D.; Ratnaningtyas, T.; Setionurjaya, A.; Anindita, Y. A.
2018-02-01
Vulnerability assessment is usually used to evaluate an area's ability to face disasters. Previous studies of vulnerability assessment applied only quantitative methods to show the vulnerability level; this study therefore adds qualitative reviews. Kemijen is one of the administrative villages in the northern part of Semarang City affected by climate change. Residents have had to adapt by renovating and elevating their houses and other infrastructure to avoid floods. Several development programs have been held by the government, NGOs, and corporations, such as the Banger Polder development and PLPBK. Two questions arise: how vulnerable is Kemijen to flood disasters, and how do these projects affect local adaptive capacity? To answer them, this research uses a mixed-method approach: the vulnerability assessment uses a quantitative method, scoring indicators of exposure, sensitivity, and adaptive capacity, while the development impact is examined qualitatively. The data were collected through interviews and FGDs conducted in a joint studio course between Diponegoro University and the University of Hawaii in October 2016. Non-physical programs such as community empowerment have more positive impacts on local adaptive capacity in Kemijen. Community participation is important for environmental sustainability, although educating the community cannot be done in a short time.
Guinney, Justin; Wang, Tao; Laajala, Teemu D; Winner, Kimberly Kanigel; Bare, J Christopher; Neto, Elias Chaibub; Khan, Suleiman A; Peddinti, Gopal; Airola, Antti; Pahikkala, Tapio; Mirtti, Tuomas; Yu, Thomas; Bot, Brian M; Shen, Liji; Abdallah, Kald; Norman, Thea; Friend, Stephen; Stolovitzky, Gustavo; Soule, Howard; Sweeney, Christopher J; Ryan, Charles J; Scher, Howard I; Sartor, Oliver; Xie, Yang; Aittokallio, Tero; Zhou, Fang Liz; Costello, James C
2016-01-01
Summary Background Improvements to prognostic models in metastatic castration-resistant prostate cancer have the potential to augment clinical trial design and guide treatment strategies. In partnership with Project Data Sphere, a not-for-profit initiative allowing data from cancer clinical trials to be shared broadly with researchers, we designed an open-data, crowdsourced, DREAM (Dialogue for Reverse Engineering Assessments and Methods) challenge to not only identify a better prognostic model for prediction of survival in patients with metastatic castration-resistant prostate cancer but also engage a community of international data scientists to study this disease. Methods Data from the comparator arms of four phase 3 clinical trials in first-line metastatic castration-resistant prostate cancer were obtained from Project Data Sphere, comprising 476 patients treated with docetaxel and prednisone from the ASCENT2 trial, 526 patients treated with docetaxel, prednisone, and placebo in the MAINSAIL trial, 598 patients treated with docetaxel, prednisone or prednisolone, and placebo in the VENICE trial, and 470 patients treated with docetaxel and placebo in the ENTHUSE 33 trial. Datasets consisting of more than 150 clinical variables were curated centrally, including demographics, laboratory values, medical history, lesion sites, and previous treatments. Data from ASCENT2, MAINSAIL, and VENICE were released publicly to be used as training data to predict the outcome of interest—namely, overall survival. Clinical data were also released for ENTHUSE 33, but data for outcome variables (overall survival and event status) were hidden from the challenge participants so that ENTHUSE 33 could be used for independent validation. Methods were evaluated using the integrated time-dependent area under the curve (iAUC). The reference model, based on eight clinical variables and a penalised Cox proportional-hazards model, was used to compare method performance. Further validation was done using data from a fifth trial—ENTHUSE M1—in which 266 patients with metastatic castration-resistant prostate cancer were treated with placebo alone. Findings 50 independent methods were developed to predict overall survival and were evaluated through the DREAM challenge. The top performer was based on an ensemble of penalised Cox regression models (ePCR), which uniquely identified predictive interaction effects with immune biomarkers and markers of hepatic and renal function. Overall, ePCR outperformed all other methods (iAUC 0·791; Bayes factor >5) and surpassed the reference model (iAUC 0·743; Bayes factor >20). Both the ePCR model and reference models stratified patients in the ENTHUSE 33 trial into high-risk and low-risk groups with significantly different overall survival (ePCR: hazard ratio 3·32, 95% CI 2·39–4·62, p<0·0001; reference model: 2·56, 1·85–3·53, p<0·0001). The new model was validated further on the ENTHUSE M1 cohort with similarly high performance (iAUC 0·768). Meta-analysis across all methods confirmed previously identified predictive clinical variables and revealed aspartate aminotransferase as an important, albeit previously under-reported, prognostic biomarker. Interpretation Novel prognostic factors were delineated, and the assessment of 50 methods developed by independent international teams establishes a benchmark for development of methods in the future. 
The results of this effort show that data-sharing, when combined with a crowdsourced challenge, is a robust and powerful framework to develop new prognostic models in advanced prostate cancer. Funding Sanofi US Services, Project Data Sphere. PMID:27864015
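As a flavor of the winning model family, a single penalised Cox fit can be sketched with the lifelines library; the covariates and data below are entirely hypothetical, and ePCR itself averages many such fits.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical frame: clinical covariates plus survival columns.
df = pd.DataFrame({
    "age": [64, 71, 58, 69, 62, 75],
    "psa": [40.1, 12.3, 88.0, 25.4, 61.2, 9.8],
    "ast": [22, 31, 45, 28, 39, 24],   # aspartate aminotransferase
    "os_months": [18.2, 30.1, 9.7, 24.5, 12.9, 33.0],
    "event": [1, 0, 1, 1, 1, 0],
})
cph = CoxPHFitter(penalizer=0.1)       # ridge-penalised partial likelihood
cph.fit(df, duration_col="os_months", event_col="event")
# An ensemble (ePCR-style) would average risk scores from many such
# models trained on resampled data and/or different penalties.
```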
Human Factors Engineering as a System in the Vision for Exploration
NASA Technical Reports Server (NTRS)
Whitmore, Mihriban; Smith, Danielle; Holden, Kritina
2006-01-01
In order to accomplish NASA's Vision for Exploration, while assuring crew safety and productivity, human performance issues must be well integrated into system design from mission conception. To that end, a two-year Technology Development Project (TDP) was funded by NASA Headquarters to develop a systematic method for including the human as a system in NASA's Vision for Exploration. The specific goals of this project are to review current Human Systems Integration (HSI) standards (i.e., industry, military, NASA) and tailor them to selected NASA Exploration activities. Once the methods are proven in the selected domains, a plan will be developed to expand the effort to a wider scope of Exploration activities. The methods will be documented for inclusion in NASA-specific documents (such as the Human Systems Integration Standards, NASA-STD-3000) to be used in future space systems. The current project builds on a previous TDP dealing with Human Factors Engineering (HFE) processes. That project identified the key phases of the current NASA design lifecycle, and outlined the recommended HFE activities that should be incorporated at each phase. The project also resulted in a prototype of a web-based HFE process tool that could be used to support an ideal HFE development process at NASA. This will help to augment the limited human factors resources available by providing a web-based tool that explains the importance of human factors, teaches a recommended process, and then provides the instructions, templates and examples to carry out the process steps. The HFE activities identified by the previous TDP are being tested in situ for the current effort through support to a specific NASA Exploration activity. Currently, HFE personnel are working with systems engineering personnel to identify HSI impacts for lunar exploration by facilitating the generation of system-level Concepts of Operations (ConOps). For example, medical operations scenarios have been generated for lunar habitation in order to identify HSI requirements for the lunar communications architecture. Throughout these ConOps exercises, HFE personnel are testing various tools and methodologies that have been identified in the literature. A key part of the effort is the identification of optimal processes, methods, and tools for these early development-phase activities, such as ConOps, requirements development, and early conceptual design. An overview of the activities completed thus far, as well as the tools and methods investigated, will be presented.
Proteomic analysis of mare follicular fluid during late follicle development.
Fahiminiya, Somayyeh; Labas, Valérie; Roche, Stéphane; Dacheux, Jean-Louis; Gérard, Nadine
2011-09-17
Follicular fluid accumulates in the antrum of the follicle from the early stage of follicle development. Studies on its components may contribute to a better understanding of the mechanisms underlying follicular development and oocyte quality. With this objective, we performed a proteomic analysis of mare follicular fluid. First, we hypothesized that proteins in follicular fluid may differ from those in the serum, and also may change during follicle development. Second, we used four different immunodepletion approaches and one enrichment method in order to overcome the masking effect of high-abundance proteins present in the follicular fluid, and to identify those present in lower abundance. Finally, we compared our results with previous studies performed in mono-ovulant (human) and poly-ovulant (porcine and canine) species in an attempt to identify common and/or species-specific proteins. Follicular fluid samples were collected from ovaries at three different stages of follicle development (early dominant, late dominant and preovulatory). Blood samples were also collected at each time. The proteomic analysis was carried out on crude, depleted and enriched follicular fluid by 2D-PAGE, 1D-PAGE and mass spectrometry. A total of 459 protein spots were visualized by 2D-PAGE of crude mare follicular fluid, with no difference among the three physiological stages. Thirty proteins were observed as differentially expressed between serum and follicular fluid. The enrichment method was found to be the most powerful for detection and identification of low-abundance proteins in follicular fluid. Indeed, we were able to identify 18 proteins in the crude follicular fluid, and as many as 113 in the enriched follicular fluid. Inhibins and a few other proteins involved in reproduction could only be identified after enrichment of follicular fluid, demonstrating the power of the method used. The comparison of proteins found in mare follicular fluid with proteins previously identified in human, porcine and canine follicular fluids led to the identification of 12 common proteins and of several species-specific proteins. This study provides the first description of the mare follicular fluid proteome during the late follicle development stages. We identified several proteins from crude, depleted and enriched follicular fluid. Our results demonstrate that the enrichment method, combined with 2D-PAGE and mass spectrometry, can be successfully used to visualize and further identify the low-abundance proteins in the follicular fluid.
RAPID METHOD FOR DETERMINATION OF RADIOSTRONTIUM IN EMERGENCY MILK SAMPLES
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maxwell, S.; Culligan, B.
2008-07-17
A new rapid separation method for radiostrontium in emergency milk samples was developed at the Savannah River Site (SRS) Environmental Bioassay Laboratory (Aiken, SC, USA) that allows rapid separation and measurement of Sr-90 within 8 hours. The new method uses calcium phosphate precipitation, nitric acid dissolution of the precipitate to coagulate residual fat/proteins, and a rapid strontium separation using Sr Resin (Eichrom Technologies, Darien, IL, USA) with vacuum-assisted flow rates. The method is much faster than previous methods that use calcination or cation-exchange pretreatment, has excellent chemical recovery, and effectively removes beta interferences. When a 100 mL sample aliquot is used, the method has a detection limit of 0.5 Bq/L, well below generic emergency action levels.
Development of a Multiplex Single Base Extension Assay for Mitochondrial DNA Haplogroup Typing
Nelson, Tahnee M.; Just, Rebecca S.; Loreille, Odile; Schanfield, Moses S.; Podini, Daniele
2007-01-01
Aim To provide a screening tool to reduce time and sample consumption when attempting mtDNA haplogroup typing. Methods A single base primer extension assay was developed to enable typing, in a single reaction, of twelve mtDNA haplogroup-specific polymorphisms. For validation purposes a total of 147 samples were tested, including 73 samples successfully haplogroup typed using mtDNA control region (CR) sequence data, 21 samples inconclusively haplogroup typed by CR data, 20 samples previously haplogroup typed using restriction fragment length polymorphism (RFLP) analysis, and 31 samples of known ancestral origin without previous haplogroup typing. Additionally, two highly degraded human bones embalmed and buried in the early 1950s were analyzed using the single nucleotide polymorphism (SNP) multiplex. Results When the SNP multiplex was used to type the 96 previously CR-sequenced specimens, an increase in haplogroup or macrohaplogroup assignment relative to conventional CR sequence analysis was observed. The single base extension assay was also successfully used to assign a haplogroup to decades-old, embalmed skeletal remains dating to World War II. Conclusion The SNP multiplex was successfully used to obtain the haplogroup status of highly degraded human bones, and demonstrated the ability to eliminate possible contributors. The SNP multiplex provides a low-cost, high-throughput method for typing of mtDNA haplogroups A, B, C, D, E, F, G, H, L1/L2, L3, M, and N that could be useful for screening purposes in human identification efforts and anthropological studies. PMID:17696300
NASA Astrophysics Data System (ADS)
McLaughlin, Joyce; Renzi, Daniel
2006-04-01
Transient elastography and supersonic imaging are promising new techniques for characterizing the elasticity of soft tissues. In these techniques, an 'ultrafast imaging' system (up to 10 000 frames per second) follows in real time the propagation of a low-frequency shear wave. The displacement of the propagating shear wave is measured as a function of time and space. Here we develop a fast level-set-based algorithm for finding the shear wave speed from the interior positions of the propagating front. We compare the performance of the level curve methods developed here with our previously developed distance methods (McLaughlin J and Renzi D 2006 Shear wave speed recovery in transient elastography and supersonic imaging using propagating fronts Inverse Problems 22 681-706). We give reconstruction examples from synthetic data and from data obtained from a phantom experiment performed by Mathias Fink's group (Laboratoire Ondes et Acoustique, ESPCI, Université Paris VII).
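The final step, recovering speed from an arrival-time map, can be sketched directly: for a propagating front the Eikonal relation |grad T| = 1/c holds, so the speed follows from finite-difference gradients (a sketch, not the authors' level set solver):

```python
import numpy as np

def speed_from_arrival_time(T, dx, dy):
    """Local shear wave speed c = 1/|grad T| from an arrival-time map T(y, x)."""
    Ty, Tx = np.gradient(T, dy, dx)            # rows = y, columns = x
    grad_mag = np.sqrt(Tx ** 2 + Ty ** 2)
    return 1.0 / np.maximum(grad_mag, 1e-12)   # guard against flat regions
```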
Shin, Jae-Wook; Kim, Kyeong-Jun; Yoon, Jinho; Jo, Jinhee; El-Said, Waleed Ahmed; Choi, Jeong-Woo
2017-01-01
Several neurological disorders such as Alzheimer’s disease and Parkinson’s disease have become a serious impediment to aging people nowadays. One of the efficient methods used to monitor these neurological disorders is the detection of neurotransmitters such as dopamine. Metal materials, such as gold and platinum, are widely used in this electrochemical detection method; however, low sensitivity and linearity at low dopamine concentrations limit the use of these materials. To overcome these limitations, a silver nanoparticle (SNP) modified electrode covered by graphene oxide for the detection of dopamine was newly developed in this study. For the first time, the surface of an indium tin oxide (ITO) electrode was modified using SNPs and graphene oxide sequentially through the electrochemical deposition method. The developed biosensor provided electrochemical signal enhancement at low dopamine concentrations in comparison with previous biosensors. Therefore, our newly developed SNP modified electrode covered by graphene oxide can be used to monitor neurological diseases through electrochemical signal enhancement at low dopamine concentrations. PMID:29186040
Hello World! - Experiencing Usability Methods without Usability Expertise
NASA Astrophysics Data System (ADS)
Eriksson, Elina; Cajander, Åsa; Gulliksen, Jan
How do you do usability work when no usability expertise is available? What happens in an organization when system developers with no previous HCI knowledge start applying usability methods, particularly field studies, after a 3-day course? To answer these questions, qualitative data were gathered through participatory observations, a feedback survey, field study documentation and interviews with 47 system developers from a public authority. Our results suggest that field studies enhance the developers' understanding of the user perspective and provide a more holistic overview of the use situation, but that some developers were unable to interpret their observations and see solutions to the users' problems. The field study method was much appreciated and has now become standard operating procedure within the organization. However, although field studies may be useful, they do not replace the need for usability professionals, as their knowledge is essential for more complex observations and analysis and for keeping the focus on usability.
Dispersive shock waves in systems with nonlocal dispersion of Benjamin-Ono type
NASA Astrophysics Data System (ADS)
El, G. A.; Nguyen, L. T. K.; Smyth, N. F.
2018-04-01
We develop a general approach to the description of dispersive shock waves (DSWs) for a class of nonlinear wave equations with a nonlocal Benjamin-Ono type dispersion term involving the Hilbert transform. Integrability of the governing equation is not a prerequisite for the application of this method, which represents a modification of the DSW fitting method previously developed for dispersive-hydrodynamic systems of Korteweg-de Vries (KdV) type (i.e. reducible to the KdV equation in the weakly nonlinear, long wave, unidirectional approximation). The developed method is applied to the Calogero-Sutherland dispersive hydrodynamics, for which the classification of all solution types arising from the Riemann step problem is constructed and the key physical parameters (DSW edge speeds, lead soliton amplitude, intermediate shelf level) of all but one solution type are obtained in terms of the initial step data. The analytical results are shown to be in excellent agreement with results of direct numerical simulations.
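For orientation, the classical Benjamin-Ono equation, a representative member of this class (sign and scaling conventions vary across the literature), reads

$$u_t + 2\,u\,u_x + \mathcal{H}[u_{xx}] = 0, \qquad \mathcal{H}[f](x) = \frac{1}{\pi}\,\mathrm{p.v.}\int_{-\infty}^{\infty} \frac{f(x')}{x' - x}\, dx',$$

where $\mathcal{H}$ is the Hilbert transform supplying the nonlocal dispersion.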
Electric field feedback for Magneto(elasto)Electric magnetometer development
NASA Astrophysics Data System (ADS)
Yang, M.-T.; Zhuang, X.; Sing, M. Lam Chok; Dolabdjian, C.; Finkel, P.; Li, J.; Viehland, D.
2017-12-01
Magneto(elasto)Electric (ME) sensors based on magnetostrictive-piezoelectric composites have been investigated to evaluate their performance in sensing magnetic signals. Previous results have shown that dielectric loss noise in the piezoelectric layer is the dominant intrinsic noise at low frequencies, which limits the sensor performance. Such sensors also intrinsically lack DC capability. To mitigate part of this limitation, modulation detection methods are evaluated through a frequency up-conversion technique [1-4]. Moreover, classical magnetic field feedback techniques can be used to increase the dynamic range, the sensing stability, and the system linearity. In this paper, we propose a new method of closing the feedback loop using both magneto-capacitance modulation and an electric field feedback technique. Our development shows the feasibility of the method, and the results match the theoretical description and material properties. Even if the present results are not fully satisfactory, they give a proof of concept and point a way toward the development of very low power magnetometers.
NASA Astrophysics Data System (ADS)
Fujimoto, Kazuhiro J.
2012-07-01
A transition-density-fragment interaction (TDFI) approach combined with a transfer integral (TI) method is proposed. The TDFI method was previously developed for describing the electronic Coulomb interaction, and was applied to excitation-energy transfer (EET) [K. J. Fujimoto and S. Hayashi, J. Am. Chem. Soc. 131, 14152 (2009)] and exciton-coupled circular dichroism spectra [K. J. Fujimoto, J. Chem. Phys. 133, 124101 (2010)]. In the present study, the TDFI method is extended to the exchange interaction, and hence is combined with the TI method for application to EET via charge-transfer (CT) states. In this scheme, the overlap correction is also taken into account. To check the accuracy of TDFI-TI, several test calculations are performed on an ethylene dimer. As a result, the TDFI-TI method gives a much improved description of the electronic coupling compared with the previous TDFI method. Based on this successful description of the electronic coupling, a decomposition analysis is also performed with the TDFI-TI method. The analysis clearly shows a large contribution from the Coulomb interaction in most cases, and a significant influence of the CT states at small separations. In addition, the exchange interaction is found to be small in this system. The present approach is useful for analyzing and understanding the mechanism of EET.
Van Duren, B H; Pandit, H; Beard, D J; Murray, D W; Gill, H S
2009-04-01
The recent development in Oxford lateral unicompartmental knee arthroplasty (UKA) design requires a valid method of assessing its kinematics, in particular the use of single-plane fluoroscopy to reconstruct the 3D kinematics of the implanted knee. The method has been used previously to investigate the kinematics of UKA, but mostly it has been used in conjunction with total knee arthroplasty (TKA); no accuracy assessment of the method when used for UKA has previously been reported. In this study we performed computer simulation tests to investigate the effect that the different geometry of the unicompartmental implant has on the accuracy of the method in comparison to total knee implants. A phantom was built to perform in vitro tests to determine the accuracy of the method for UKA. The computer simulations suggested that the use of the method for UKA would prove less accurate than for TKA. The rotational degrees of freedom for the femur showed the greatest disparity between the UKA and TKA. The phantom tests showed that the in-plane translations were accurate to <0.5 mm RMS, while the out-of-plane translations were less accurate, at 4.1 mm RMS. The rotational accuracies were between 0.6° and 2.3°, which is less accurate than the values reported in the literature for TKA; however, the method is sufficient for studying overall knee kinematics.
Nonlinear calculations of the time evolution of black hole accretion disks
NASA Technical Reports Server (NTRS)
Luo, C.
1994-01-01
Based on previous works on black hole accretion disks, I continue to explore the disk dynamics using the finite difference method to solve the highly nonlinear problem of the time-dependent alpha-disk equations. Here a radially zoned model is used to develop a computational scheme that accommodates functional dependence of the viscosity parameter alpha on the disk scale height and/or surface density. This work builds on the author's previous work on the steady disk structure and the linear analysis of disk dynamics, with the aim of application to X-ray emissions from black hole candidates (i.e., multiple-state spectra, instabilities, QPOs, etc.).
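As a point of reference, the standard thin-disk evolution equation (Pringle 1981) can be time-stepped with a simple explicit finite-difference scheme; this is a generic sketch, not the radially zoned scheme of the paper, and the boundary treatment and time step are placeholder choices.

```python
import numpy as np

def step_surface_density(sigma, r, nu, dt):
    """One explicit step of d(sigma)/dt = (3/r) d/dr[ sqrt(r) d/dr( nu*sigma*sqrt(r) ) ].

    nu may itself depend on alpha(H, sigma), which is where the
    nonlinearity enters. Explicit stepping needs a small dt for stability.
    """
    g = nu * sigma * np.sqrt(r)
    flux = np.sqrt(r) * np.gradient(g, r)
    sigma_new = sigma + dt * (3.0 / r) * np.gradient(flux, r)
    sigma_new[0], sigma_new[-1] = sigma_new[1], 0.0   # crude boundary conditions
    return sigma_new
```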
Dynamic one-dimensional modeling of secondary settling tanks and system robustness evaluation.
Li, Ben; Stenstrom, M K
2014-01-01
One-dimensional secondary settling tank models are widely used in current engineering practice for design and optimization, and usually can be expressed as a nonlinear hyperbolic or nonlinear strongly degenerate parabolic partial differential equation (PDE). Reliable numerical methods are needed to produce approximate solutions that converge to the exact analytical solutions. In this study, we introduced a reliable numerical technique, the Yee-Roe-Davis (YRD) method, as the governing PDE solver, and compared its reliability with the prevalent Stenstrom-Vitasovic-Takács (SVT) method by assessing their simulation results at various operating conditions. The YRD method also produced a similar solution to the previously developed Method G and the Engquist-Osher method. The YRD and SVT methods were also used for a time-to-failure evaluation, and the results show that the choice of numerical method can greatly impact the solution. Reliable numerical methods, such as the YRD method, are strongly recommended.
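The flavor of a reliable scheme for the scalar settling equation c_t + f(c)_x = 0 can be conveyed by the Godunov numerical flux; the sampling-based min/max below and the Vesilind-type flux parameters are illustrative stand-ins, not the YRD scheme itself.

```python
import numpy as np

def godunov_flux(cl, cr, f):
    """Godunov interface flux: min of f over [cl, cr] if cl <= cr, else max.

    Approximated here by dense sampling of the interval, which is robust
    for the non-convex batch settling fluxes used in clarifier models.
    """
    cs = np.linspace(min(cl, cr), max(cl, cr), 201)
    return f(cs).min() if cl <= cr else f(cs).max()

# Illustrative Vesilind-type settling flux (placeholder parameters):
vesilind = lambda c: 4.0 * c * np.exp(-0.5 * c)
```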
Wang, Wenyi; Kim, Marlene T.; Sedykh, Alexander
2015-01-01
Purpose Experimental Blood–Brain Barrier (BBB) permeability models for drug molecules are expensive and time-consuming. As alternative methods, several traditional Quantitative Structure-Activity Relationship (QSAR) models have been developed previously. In this study, we aimed to improve the predictivity of traditional QSAR BBB permeability models by employing relevant public bio-assay data in the modeling process. Methods We compiled a BBB permeability database consisting of 439 unique compounds from various resources. The database was split into a modeling set of 341 compounds and a validation set of 98 compounds. Consensus QSAR modeling workflow was employed on the modeling set to develop various QSAR models. A five-fold cross-validation approach was used to validate the developed models, and the resulting models were used to predict the external validation set compounds. Furthermore, we used previously published membrane transporter models to generate relevant transporter profiles for target compounds. The transporter profiles were used as additional biological descriptors to develop hybrid QSAR BBB models. Results The consensus QSAR models have R2=0.638 for fivefold cross-validation and R2=0.504 for external validation. The consensus model developed by pooling chemical and transporter descriptors showed better predictivity (R2=0.646 for five-fold cross-validation and R2=0.526 for external validation). Moreover, several external bio-assays that correlate with BBB permeability were identified using our automatic profiling tool. Conclusions The BBB permeability models developed in this study can be useful for early evaluation of new compounds (e.g., new drug candidates). The combination of chemical and biological descriptors shows a promising direction to improve the current traditional QSAR models. PMID:25862462
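A minimal sketch of the consensus step, assuming a descriptor matrix X (chemical descriptors, optionally stacked with transporter-profile descriptors) and logBB values y; the learner choices are illustrative, not the paper's exact workflow.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_predict
from sklearn.svm import SVR

def consensus_cv_predictions(X, y):
    """Average five-fold cross-validated predictions of several QSAR learners."""
    learners = [RandomForestRegressor(n_estimators=500), SVR(C=10.0)]
    preds = [cross_val_predict(m, X, y, cv=5) for m in learners]
    return np.mean(preds, axis=0)
```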
Animal models in the research of abdominal aortic aneurysms development.
Patelis, N; Moris, D; Schizas, D; Damaskos, C; Perrea, D; Bakoyiannis, C; Liakakos, T; Georgopoulos, S
2017-12-20
Abdominal aortic aneurysm (AAA) is a prevalent and potentially life-threatening disease. Many animal models have been developed to simulate the natural history of the disease or to test preclinical endovascular devices and surgical procedures. The aim of this review is to describe different methods of AAA induction in animal models and report on their effectiveness in inducing an analogue of a human AAA. The PubMed database was searched for publications with titles containing the terms "animal" or "animal model(s)" and the keywords "research", "aneurysm(s)", "aorta", "pancreatic elastase", "Angiotensin", "AngII", "calcium chloride" or "CaCl(2)". The starting date for this search was set to 2004, since the earlier bibliography was already covered by the review of Daugherty and Cassis (2004). We focused on animal studies that reported a model of aneurysm development and progression. A number of different approaches to AAA induction in animal models have been developed, used and combined since the first report in the 1960s. Although specific methods are successful in AAA induction in animal models, it is necessary that these methods and their results are in line with the pathophysiology and the mechanisms involved in human AAA development. Researchers should know the advantages and disadvantages of each animal model and choose the appropriate model.
O:2-CRM(197) conjugates against Salmonella Paratyphi A.
Micoli, Francesca; Rondini, Simona; Gavini, Massimiliano; Lanzilao, Luisa; Medaglini, Donata; Saul, Allan; Martin, Laura B
2012-01-01
Enteric fevers remain a common and serious disease, affecting mainly children and adolescents in developing countries. Salmonella enterica serovar Typhi was believed to cause most enteric fever episodes, but several recent reports have shown an increasing incidence of S. Paratyphi A, encouraging the development of a bivalent vaccine to protect against both serovars, especially considering that at present there is no vaccine against S. Paratyphi A. The O-specific polysaccharide (O:2) of S. Paratyphi A is a protective antigen, and clinical data have previously demonstrated the potential of O:2 conjugate vaccines. Here we describe a new conjugation chemistry to link O:2 to the carrier protein CRM(197) through the terminal 3-deoxy-D-manno-octulosonic acid (KDO) residue, thus leaving the O:2 chain unmodified. The new conjugates were tested in mice and compared with other O:2-antigen conjugates synthesized by previously described methods using CRM(197) as the carrier protein. The newly developed conjugation chemistry yielded immunogenic conjugates with strong serum bactericidal activity against S. Paratyphi A.
Atomistic cluster alignment method for local order mining in liquids and glasses
NASA Astrophysics Data System (ADS)
Fang, X. W.; Wang, C. Z.; Yao, Y. X.; Ding, Z. J.; Ho, K. M.
2010-11-01
An atomistic cluster alignment method is developed to identify and characterize the local atomic structural order in liquids and glasses. With the “order mining” idea for structurally disordered systems, the method can detect the presence of any type of local order in the system and can quantify the structural similarity between a given set of templates and the aligned clusters in a systematic and unbiased manner. Moreover, population analysis can also be carried out for various types of clusters in the system. The advantages of the method in comparison with other previously developed analysis methods are illustrated by performing the structural analysis for four prototype systems (i.e., pure Al, pure Zr, Zr35Cu65 , and Zr36Ni64 ). The results show that the cluster alignment method can identify various types of short-range orders (SROs) in these systems correctly while some of these SROs are difficult to capture by most of the currently available analysis methods (e.g., Voronoi tessellation method). Such a full three-dimensional atomistic analysis method is generic and can be applied to describe the magnitude and nature of noncrystalline ordering in many disordered systems.
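The core alignment step assumed by such a method, rigid superposition of a candidate cluster onto a template, can be sketched with the Kabsch algorithm (atom correspondence is taken as given here; the full method also searches over correspondences and collects population statistics):

```python
import numpy as np

def kabsch_rmsd(cluster, template):
    """Best-fit RMSD after centering and optimal rotation (Kabsch algorithm)."""
    P = cluster - cluster.mean(axis=0)
    Q = template - template.mean(axis=0)
    U, _, Vt = np.linalg.svd(P.T @ Q)
    d = np.sign(np.linalg.det(U @ Vt))          # avoid improper rotations
    R = U @ np.diag([1.0, 1.0, d]) @ Vt
    return np.sqrt(np.mean(np.sum((P @ R - Q) ** 2, axis=1)))
```

A score built from many such alignments against each template, accumulated over sampled clusters, is what quantifies the structural similarity referred to above.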
Strongly Coupled Fluid-Body Dynamics in the Immersed Boundary Projection Method
NASA Astrophysics Data System (ADS)
Wang, Chengjie; Eldredge, Jeff D.
2014-11-01
A computational algorithm is developed to simulate dynamically coupled interaction between fluid and rigid bodies. The basic computational framework is built upon a multi-domain immersed boundary method library, whirl, developed in previous work. In this library, the Navier-Stokes equations for incompressible flow are solved on a uniform Cartesian grid by the vorticity-based immersed boundary projection method of Colonius and Taira. A solver for the dynamics of rigid-body systems is also included. The fluid and rigid-body solvers are strongly coupled with an iterative approach based on the block Gauss-Seidel method. The interfacial force, with its intimate connection to the Lagrange multipliers used in the fluid solver, is used as the primary iteration variable. Relaxation, derived from a stability analysis of the iterative scheme, is used to achieve convergence in only 2-4 iterations per time step. Several two- and three-dimensional numerical tests are conducted to validate and demonstrate the method, including flapping of flexible wings, self-excited oscillations of a system of linked plates, and three-dimensional propulsion of a flexible fluked tail. This work has been supported by AFOSR, under Award FA9550-11-1-0098.
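The coupling loop itself can be sketched abstractly; the solver callables, relaxation factor, and convergence test below are placeholders, not the whirl API.

```python
import numpy as np

def coupled_step(fluid_solve, body_solve, f0, omega=0.5, tol=1e-6, max_iter=20):
    """One time step of strongly coupled FSI via relaxed block Gauss-Seidel.

    body_solve(force) -> body motion; fluid_solve(motion) -> interfacial
    force. The interfacial force is the primary iterate, as in the paper.
    """
    f = np.asarray(f0, dtype=float)
    for _ in range(max_iter):
        motion = body_solve(f)                   # advance rigid-body system
        f_new = fluid_solve(motion)              # fluid solve with new boundary
        if np.linalg.norm(f_new - f) < tol * (1.0 + np.linalg.norm(f)):
            return f_new
        f = (1.0 - omega) * f + omega * f_new    # under-relaxation
    return f
```

With a well-chosen relaxation factor this converges in a few iterations per step, consistent with the 2-4 iterations reported above.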