Wide-Field Infrared Survey Telescope (WFIRST) Integrated Modeling
NASA Technical Reports Server (NTRS)
Liu, Kuo-Chia; Blaurock, Carl
2017-01-01
Contents: introduction to WFIRST (Wide-Field Infrared Survey Telescope) and integrated modeling; WFIRST stability requirement summary; instability mitigation strategies; dynamic jitter results; STOP (structural-thermal-optical performance) (thermal distortion) results; STOP and jitter capability limitations; model validation philosophy.
In-situ Testing of the EHT High Gain and Frequency Ultra-Stable Integrators
NASA Astrophysics Data System (ADS)
Miller, Kenneth; Ziemba, Timothy; Prager, James; Slobodov, Ilia; Lotz, Dan
2014-10-01
Eagle Harbor Technologies (EHT) has developed a long-pulse integrator that exceeds the ITER specification for integration error and pulse duration. During the Phase I program, EHT improved the RPPL short-pulse integrators, added a fast digital reset, and demonstrated that the new integrators exceed the ITER integration error and pulse duration requirements. In Phase II, EHT developed Field Programmable Gate Array (FPGA) software that allows for integrator control and real-time signal digitization and processing. In the second year of Phase II, the EHT integrator will be tested at a validation platform experiment (HIT-SI) and tokamak (DIII-D). In the Phase IIB program, EHT will continue development of the EHT integrator to reduce overall cost per channel. EHT will test lower cost components, move to surface mount components, and add an onboard Field Programmable Gate Array and data acquisition to produce a stand-alone system with lower cost per channel and increased channel density. EHT will test the Phase IIB integrator at a validation platform experiment (HIT-SI) and tokamak (DIII-D). Work supported by the DOE under Contract Number DE-SC0006281.
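As a rough illustration of why offset correction (the role played by a fast digital reset) matters for long-pulse integration, consider the toy numerical sketch below. It is not the EHT hardware design; all signal values and window sizes are invented.

```python
import numpy as np

dt = 1e-4                    # sample period in seconds (illustrative)
n = 10_000
v_true = np.zeros(n)
v_true[2500:] = 1.0          # 1 V pulse on the pickup loop, 0.75 s long
v_meas = v_true + 5e-3       # plus a hypothetical 5 mV amplifier/ADC offset

def drift_corrected_integral(v, dt, baseline_samples):
    # Estimate the DC offset from a pre-pulse window where the true
    # signal is zero, subtract it, then integrate (rectangle rule).
    offset = v[:baseline_samples].mean()
    return np.cumsum(v - offset) * dt

naive = np.cumsum(v_meas) * dt
corrected = drift_corrected_integral(v_meas, dt, baseline_samples=2500)
# True flux at the end of the record is 1 V * 0.75 s = 0.75 V*s; the
# uncorrected integral drifts by offset * record length on top of that.
print(naive[-1], corrected[-1])
```

Over ITER-scale pulse durations the uncorrected drift term grows linearly with time, which is why integration error specifications are stated per unit pulse length.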
Guseinov, Israfil
2004-02-01
In this study, using complete orthonormal sets of Psi(alpha)-ETOs (where alpha=1, 0, -1, -2, ...) introduced by the author, a large number of series expansion formulae for the multicenter electronic attraction (EA), electric field (EF) and electric field gradient (EFG) integrals of the Yukawa-like screened Coulomb potentials (SCPs) are presented through the new central and noncentral potentials and the overlap integrals with the same screening constants. The final results obtained are valid for arbitrary locations of STOs and their parameters.
The Challenge of Grounding Planning in Simulation with an Interactive Model Development Environment
NASA Technical Reports Server (NTRS)
Clement, Bradley J.; Frank, Jeremy D.; Chachere, John M.; Smith, Tristan B.; Swanson, Keith J.
2011-01-01
A principal obstacle to fielding automated planning systems is the difficulty of modeling. Physical systems are modeled conventionally based on specification documents and the modeler's understanding of the system. Thus, the model is developed in a way that is disconnected from the system's actual behavior and is vulnerable to manual error. Another obstacle to fielding planners is testing and validation. For a space mission, generated plans must be validated often by translating them into command sequences that are run in a simulation testbed. Testing in this way is complex and onerous because of the large number of possible plans and states of the spacecraft. However, if used as a source of domain knowledge, the simulator can ease validation. This paper poses a challenge: to ground planning models in the system physics represented by simulation. A proposed interactive model development environment illustrates the integration of planning and simulation to meet the challenge. This integration reveals research paths for automated model construction and validation.
Baseline Assessment and Prioritization Framework for IVHM Integrity Assurance Enabling Capabilities
NASA Technical Reports Server (NTRS)
Cooper, Eric G.; DiVito, Benedetto L.; Jacklin, Stephen A.; Miner, Paul S.
2009-01-01
Fundamental to vehicle health management is the deployment of systems incorporating advanced technologies for predicting and detecting anomalous conditions in highly complex and integrated environments. Integrated structural integrity health monitoring, statistical algorithms for detection, estimation, prediction, and fusion, and diagnosis supporting adaptive control are examples of advanced technologies that present considerable verification and validation challenges. These systems necessitate interactions between physical and software-based systems that are highly networked with sensing and actuation subsystems, and incorporate technologies that are, in many respects, different from those employed in civil aviation today. A formidable barrier to deploying these advanced technologies in civil aviation is the lack of enabling verification and validation tools, methods, and technologies. The development of new verification and validation capabilities will not only enable the fielding of advanced vehicle health management systems, but will also provide new assurance capabilities for verification and validation of current generation aviation software which has been implicated in anomalous in-flight behavior. This paper describes the research focused on enabling capabilities for verification and validation underway within NASA's Integrated Vehicle Health Management project, discusses the state of the art of these capabilities, and includes a framework for prioritizing activities.
ERIC Educational Resources Information Center
Sanetti, Lisa M. Hagermoser; DiGennaro Reed, Florence D.
2012-01-01
Treatment integrity data are essential to drawing valid conclusions in treatment outcome studies. Such data, however, are not always included in peer-reviewed research articles in school psychology or related fields. To gain a better understanding of why treatment integrity data are lacking in school psychology research, we surveyed the…
Madsen, Kristoffer H; Ewald, Lars; Siebner, Hartwig R; Thielscher, Axel
2015-01-01
Field calculations for transcranial magnetic stimulation (TMS) are increasingly implemented online in neuronavigation systems and in more realistic offline approaches based on finite-element methods. They are often based on simplified and/or non-validated models of the magnetic vector potential of the TMS coils. Our aim was to develop an approach to reconstruct the magnetic vector potential from automated measurements. We implemented a setup that simultaneously measures the three components of the magnetic field with high spatial resolution. This is complemented by a novel approach to determine the magnetic vector potential via volume integration of the measured field. The integration approach reproduces the vector potential with very good accuracy. The vector potential distribution of a standard figure-of-eight shaped coil determined with our setup corresponds well with that calculated using a model reconstructed from x-ray images. The setup can supply validated models for existing and newly appearing TMS coils. Copyright © 2015 Elsevier Inc. All rights reserved.
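For reference, one standard Coulomb-gauge formula that recovers a vector potential from a measured, divergence-free field by volume integration — consistent in spirit with, though not necessarily identical to, the authors' method — is:

```latex
\mathbf{A}(\mathbf{r}) \;=\; \frac{1}{4\pi}\int_{V}
  \frac{\mathbf{B}(\mathbf{r}')\times(\mathbf{r}-\mathbf{r}')}
       {\lvert \mathbf{r}-\mathbf{r}'\rvert^{3}}\, d^{3}r',
\qquad \nabla\times\mathbf{A} = \mathbf{B}
```

The curl identity holds when the measured field satisfies ∇·B = 0 and decays sufficiently fast toward the boundary of the integration volume.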
Magnetic force microscopy method and apparatus to detect and image currents in integrated circuits
Campbell, Ann. N.; Anderson, Richard E.; Cole, Jr., Edward I.
1995-01-01
A magnetic force microscopy method and improved magnetic tip for detecting and quantifying internal magnetic fields resulting from current of integrated circuits. Detection of the current is used for failure analysis, design verification, and model validation. The interaction of the current on the integrated chip with a magnetic field can be detected using a cantilevered magnetic tip. Enhanced sensitivity for both ac and dc current and voltage detection is achieved with voltage by an ac coupling or a heterodyne technique. The techniques can be used to extract information from analog circuits.
Magnetic force microscopy method and apparatus to detect and image currents in integrated circuits
Campbell, A.N.; Anderson, R.E.; Cole, E.I. Jr.
1995-11-07
A magnetic force microscopy method and improved magnetic tip for detecting and quantifying internal magnetic fields resulting from current of integrated circuits are disclosed. Detection of the current is used for failure analysis, design verification, and model validation. The interaction of the current on the integrated chip with a magnetic field can be detected using a cantilevered magnetic tip. Enhanced sensitivity for both ac and dc current and voltage detection is achieved with voltage by an ac coupling or a heterodyne technique. The techniques can be used to extract information from analog circuits. 17 figs.
Neutron Reference Benchmark Field Specification: ACRR Free-Field Environment (ACRR-FF-CC-32-CL).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vega, Richard Manuel; Parma, Edward J.; Griffin, Patrick J.
2015-07-01
This report was put together to support the International Atomic Energy Agency (IAEA) REAL-2016 activity to validate the dosimetry community’s ability to use a consistent set of activation data and to derive consistent spectral characterizations. The report captures details of integral measurements taken in the Annular Core Research Reactor (ACRR) central cavity free-field reference neutron benchmark field. The field is described and an “a priori” calculated neutron spectrum is reported, based on MCNP6 calculations, and a subject matter expert (SME) based covariance matrix is given for this “a priori” spectrum. The results of 31 integral dosimetry measurements in the neutron field are reported.
Electromagnetic Fields Exposure Limits
2018-01-01
…analysis, synthesis, integration and validation of knowledge derived through the scientific method. In NATO, S&T is addressed using different… Panels include the NMSG (NATO Modelling and Simulation Group), the SAS (System Analysis and Studies) Panel, the SCI (Systems Concepts and Integration) Panel, and the SET… …integrity or morphology. They later also failed to find direct DNA damage in human blood (strand breaks, alkali-labile sites, and incomplete…
Comprehensive Cocurricular Support Promotes Persistence of Community College STEM Students
ERIC Educational Resources Information Center
Shadduck, Peggy
2017-01-01
Despite good career prospects in science, technology, engineering, and mathematics (STEM) fields, persistence of students in STEM fields of study at the community college and transfer to universities to pursue STEM majors is often quite low. Theories of persistence emphasize the importance of engagement, integration, validation, and financial…
ERIC Educational Resources Information Center
Sanetti, Lisa M. Hagermoser; Gritter, Katie L.; Dobey, Lisa M.
2011-01-01
Increased accountability in education has resulted in a focus on implementing interventions with strong empirical support. Both student outcome and treatment integrity data are needed to draw valid conclusions about intervention effectiveness. Reviews of the literature in other fields (e.g., applied behavior analysis, prevention science) suggest…
NASA Astrophysics Data System (ADS)
Gatlin, P. N.; Conover, H.; Berendes, T.; Maskey, M.; Naeger, A. R.; Wingo, S. M.
2017-12-01
A key component of NASA's Earth observation system is its field experiments, for intensive observation of particular weather phenomena, or for ground validation of satellite observations. These experiments collect data from a wide variety of airborne and ground-based instruments, on different spatial and temporal scales, often in unique formats. The field data are often used with high volume satellite observations that have very different spatial and temporal coverage. The challenges inherent in working with such diverse datasets make it difficult for scientists to rapidly collect and analyze the data for physical process studies and validation of satellite algorithms. The newly funded VISAGE project will address these issues by combining and extending nascent efforts to provide on-line data fusion, exploration, analysis and delivery capabilities. A key building block is the Field Campaign Explorer (FCX), which allows users to examine data collected during field campaigns and simplifies data acquisition for event-based research. VISAGE will extend FCX's capabilities beyond interactive visualization and exploration of coincident datasets, to provide interrogation of data values and basic analyses such as ratios and differences between data fields. The project will also incorporate new, higher level fused and aggregated analysis products from the System for Integrating Multi-platform data to Build the Atmospheric column (SIMBA), which combines satellite and ground-based observations into a common gridded atmospheric column data product; and the Validation Network (VN), which compiles a nationwide database of coincident ground- and satellite-based radar measurements of precipitation for larger scale scientific analysis. The VISAGE proof-of-concept will target "golden cases" from Global Precipitation Measurement Ground Validation campaigns. This presentation will introduce the VISAGE project, initial accomplishments and near term plans.
ERIC Educational Resources Information Center
Brückner, Sebastian; Pellegrino, James W.
2016-01-01
The Standards for Educational and Psychological Testing indicate that validation of assessments should include analyses of participants' response processes. However, such analyses typically are conducted only to supplement quantitative field studies with qualitative data, and seldom are such data connected to quantitative data on student or item…
Coupled NASTRAN/boundary element formulation for acoustic scattering
NASA Technical Reports Server (NTRS)
Everstine, Gordon C.; Henderson, Francis M.; Schuetz, Luise S.
1987-01-01
A coupled finite element/boundary element capability is described for calculating the sound pressure field scattered by an arbitrary submerged 3-D elastic structure. Structural and fluid impedances are calculated with no approximation other than discretization. The surface fluid pressures and normal velocities are first calculated by coupling a NASTRAN finite element model of the structure with a discretized form of the Helmholtz surface integral equation for the exterior field. Far field pressures are then evaluated from the surface solution using the Helmholtz exterior integral equation. The overall approach is illustrated and validated using a known analytic solution for scattering from submerged spherical shells.
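For reference, the exterior Helmholtz integral representation underlying the two-step procedure (surface solution first, then far-field evaluation) can be written, for an exterior field point x, as:

```latex
p(\mathbf{x}) \;=\; \int_{S}\!\left[
    p(\mathbf{y})\,\frac{\partial G(\mathbf{x},\mathbf{y})}{\partial n_{y}}
  \;-\; G(\mathbf{x},\mathbf{y})\,\frac{\partial p(\mathbf{y})}{\partial n_{y}}
\right] dS_{y},
\qquad
G(\mathbf{x},\mathbf{y}) = \frac{e^{ikr}}{4\pi r},\quad
r = \lvert\mathbf{x}-\mathbf{y}\rvert
```

On the wetted surface, ∂p/∂n = iωρ vₙ links the pressure gradient to the structural normal velocity (assuming the e^{−iωt} time convention; sign conventions vary between references).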
Optimization of Pockels electric field in transverse modulated optical voltage sensor
NASA Astrophysics Data System (ADS)
Huang, Yifan; Xu, Qifeng; Chen, Kun-Long; Zhou, Jie
2018-05-01
This paper investigates the possibilities of optimizing the Pockels electric field in a transverse modulated optical voltage sensor with a spherical electrode structure. The simulations show that due to the edge effect and the electric field concentrations and distortions, the electric field distributions in the crystal are non-uniform. In this case, a tiny variation in the light path leads to an integral error of more than 0.5%. Moreover, a 2D model cannot effectively represent the edge effect, so a 3D model is employed to optimize the electric field distributions. Furthermore, a new method to attach a quartz crystal to the electro-optic crystal along the electric field direction is proposed to improve the non-uniformity of the electric field. The integral error is therefore reduced from 0.5% to 0.015% or less. The proposed method is simple, practical and effective, and it has been validated by numerical simulations and experimental tests.
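A toy calculation illustrates how a small light-path shift through a non-uniform field produces a path-integral error of this order. The field profile below is invented for illustration; it is not the paper's simulated distribution.

```python
import numpy as np

L = 10.0                                # crystal length along the light path (arbitrary units)
z = np.linspace(0.0, L, 2001)

def field(x, z):
    # Invented profile: uniform core field plus fringing terms that
    # grow off-axis near the crystal end faces (the "edge effect").
    return 1.0 + 0.2 * x * (np.exp(-z) + np.exp(-(L - z)))

def path_integral(x):
    f = field(x, z)
    return float(np.sum((f[:-1] + f[1:]) * np.diff(z)) / 2)  # trapezoid rule

# Shift the light path off-axis by 0.05 units and compare integrals.
err = abs(path_integral(0.05) - path_integral(0.0)) / path_integral(0.0)
print(f"relative integral error for a 0.05 path shift: {err:.3%}")
```

Because the fringing terms are concentrated near the end faces, the error scales with the off-axis displacement, which is why homogenizing the field (here, via the quartz attachment) directly shrinks the integral error.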
Instruments Measuring Integrated Care: A Systematic Review of Measurement Properties.
Bautista, Mary Ann C; Nurjono, Milawaty; Lim, Yee Wei; Dessers, Ezra; Vrijhoef, Hubertus Jm
2016-12-01
Policy Points: Investigations on systematic methodologies for measuring integrated care should coincide with the growing interest in this field of research. A systematic review of instruments provides insights into integrated care measurement, including setting the research agenda for validating available instruments and informing the decision to develop new ones. This study is the first systematic review of instruments measuring integrated care with an evidence synthesis of the measurement properties. We found 209 index instruments measuring different constructs related to integrated care; the strength of evidence on the adequacy of the majority of their measurement properties remained largely unassessed. Integrated care is an important strategy for increasing health system performance. Despite its growing significance, detailed evidence on the measurement properties of integrated care instruments remains vague and limited. Our systematic review aims to provide evidence on the state of the art in measuring integrated care. Our comprehensive systematic review framework builds on the Rainbow Model for Integrated Care (RMIC). We searched MEDLINE/PubMed for published articles on the measurement properties of instruments measuring integrated care and identified eligible articles using a standard set of selection criteria. We assessed the methodological quality of every validation study reported using the COSMIN checklist and extracted data on study and instrument characteristics. We also evaluated the measurement properties of each examined instrument per validation study and provided a best evidence synthesis on the adequacy of measurement properties of the index instruments. From the 300 eligible articles, we assessed the methodological quality of 379 validation studies from which we identified 209 index instruments measuring integrated care constructs. 
The majority of studies reported on instruments measuring constructs related to care integration (33%) and patient-centered care (49%); fewer studies measured care continuity/comprehensive care (15%) and care coordination/case management (3%). We mapped 84% of the measured constructs to the clinical integration domain of the RMIC, with fewer constructs related to the domains of professional (3.7%), organizational (3.4%), and functional (0.5%) integration. Only 8% of the instruments were mapped to a combination of domains; none were mapped exclusively to the system or normative integration domains. The majority of instruments were administered to either patients (60%) or health care providers (20%). Of the measurement properties, responsiveness (4%), measurement error (7%), and criterion (12%) and cross-cultural validity (14%) were less commonly reported. We found <50% of the validation studies to be of good or excellent quality for any of the measurement properties. Only a minority of index instruments showed strong evidence of positive findings for internal consistency (15%), content validity (19%), and structural validity (7%); with moderate evidence of positive findings for internal consistency (14%) and construct validity (14%). Our results suggest that the quality of measurement properties of instruments measuring integrated care is in need of improvement with the less-studied constructs and domains to become part of newly developed instruments. © 2016 Milbank Memorial Fund.
An algorithm to estimate unsteady and quasi-steady pressure fields from velocity field measurements.
Dabiri, John O; Bose, Sanjeeb; Gemmell, Brad J; Colin, Sean P; Costello, John H
2014-02-01
We describe and characterize a method for estimating the pressure field corresponding to velocity field measurements such as those obtained by using particle image velocimetry. The pressure gradient is estimated from a time series of velocity fields for unsteady calculations or from a single velocity field for quasi-steady calculations. The corresponding pressure field is determined based on median polling of several integration paths through the pressure gradient field in order to reduce the effect of measurement errors that accumulate along individual integration paths. Integration paths are restricted to the nodes of the measured velocity field, thereby eliminating the need for measurement interpolation during this step and significantly reducing the computational cost of the algorithm relative to previous approaches. The method is validated by using numerically simulated flow past a stationary, two-dimensional bluff body and a computational model of a three-dimensional, self-propelled anguilliform swimmer to study the effects of spatial and temporal resolution, domain size, signal-to-noise ratio and out-of-plane effects. Particle image velocimetry measurements of a freely swimming jellyfish medusa and a freely swimming lamprey are analyzed using the method to demonstrate the efficacy of the approach when applied to empirical data.
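The core idea — integrating a noisy pressure-gradient field along many grid-restricted paths and taking the median — can be sketched as follows. This is a minimal reconstruction on synthetic data, not the authors' code; the pressure field and noise level are invented.

```python
import numpy as np

rng = np.random.default_rng(0)
n, h = 21, 0.05                          # grid nodes and spacing
x = np.arange(n) * h
X, Y = np.meshgrid(x, x, indexing="ij")
P = X**2 + Y**2                          # "true" pressure, used only for validation
gx = 2 * X + rng.normal(0.0, 0.05, (n, n))   # noisy measured pressure gradient
gy = 2 * Y + rng.normal(0.0, 0.05, (n, n))

def staircase_integral(i, j, order):
    """Integrate the gradient from node (0,0) to (i,j) along one monotone
    staircase path; `order` is a shuffled sequence of 'x'/'y' steps."""
    ci = cj = 0
    total = 0.0
    for step in order:
        if step == "x":
            total += 0.5 * (gx[ci, cj] + gx[ci + 1, cj]) * h
            ci += 1
        else:
            total += 0.5 * (gy[ci, cj] + gy[ci, cj + 1]) * h
            cj += 1
    return total

i = j = n - 1
samples = []
for _ in range(64):                      # "median polling" over many paths
    order = np.array(["x"] * i + ["y"] * j)
    rng.shuffle(order)
    samples.append(staircase_integral(i, j, order))
estimate = float(np.median(samples))
print(estimate, P[i, j] - P[0, 0])       # estimate should be close to 2.0
```

Restricting paths to measurement nodes, as in the paper, avoids interpolation entirely; the median suppresses the occasional path that happens to accumulate large noise errors.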
Optical fringe-reflection deflectometry with sparse representation
NASA Astrophysics Data System (ADS)
Xiao, Yong-Liang; Li, Sikun; Zhang, Qican; Zhong, Jianxin; Su, Xianyu; You, Zhisheng
2018-05-01
Optical fringe-reflection deflectometry is a surprisingly attractive scratch detection technique for specular surfaces owing to its unparalleled local sensitivity. Full-field surface topography is obtained from a measured normal field using gradient integration. However, there may not be an ideal measured gradient field for deflectometry reconstruction in practice. Both the non-integrability condition and various kinds of image noise distributions, which are present in the indirectly measured gradient field, may lead to ambiguity about the scratches on specular surfaces. In order to reduce misjudgment of scratches, sparse representation is introduced into the Southwell curl equation for deflectometry. The curl can be represented as a linear combination of the given redundant dictionary for curl and the sparsest solution for gradient refinement. The non-integrability condition and noise perturbations can be overcome with sparse representation for gradient refinement. Numerical simulations demonstrate that the accuracy of scratch identification can be enhanced with sparse representation compared to standard least-squares integration. Preliminary experiments are performed with the application of practical measured deflectometric data to verify the validity of the algorithm.
Integrable Time-Dependent Quantum Hamiltonians
NASA Astrophysics Data System (ADS)
Sinitsyn, Nikolai A.; Yuzbashyan, Emil A.; Chernyak, Vladimir Y.; Patra, Aniket; Sun, Chen
2018-05-01
We formulate a set of conditions under which the nonstationary Schrödinger equation with a time-dependent Hamiltonian is exactly solvable analytically. The main requirement is the existence of a non-Abelian gauge field with zero curvature in the space of system parameters. Known solvable multistate Landau-Zener models satisfy these conditions. Our method provides a strategy to incorporate time dependence into various quantum integrable models while maintaining their integrability. We also validate some prior conjectures, including the solution of the driven generalized Tavis-Cummings model.
A novel integrated framework and improved methodology of computer-aided drug design.
Chen, Calvin Yu-Chian
2013-01-01
Computer-aided drug design (CADD) is a critical initiating step of drug development, but a single model capable of covering all design aspects has yet to be established. Hence, we developed a drug design modeling framework that integrates multiple approaches, including machine learning based quantitative structure-activity relationship (QSAR) analysis, 3D-QSAR, Bayesian network, pharmacophore modeling, and structure-based docking algorithm. Restrictions for each model were defined for improved individual and overall accuracy. An integration method was applied to join the results from each model to minimize bias and errors. In addition, the integrated model adopts both static and dynamic analysis to validate the intermolecular stabilities of the receptor-ligand conformation. The proposed protocol was applied to identifying HER2 inhibitors from traditional Chinese medicine (TCM) as an example for validating our new protocol. Eight potent leads were identified from six TCM sources. A joint validation system comprised of comparative molecular field analysis, comparative molecular similarity indices analysis, and molecular dynamics simulation further characterized the candidates into three potential binding conformations and validated the binding stability of each protein-ligand complex. A ligand-pathway analysis was also performed to predict ligand entry into and exit from the binding site. In summary, we propose a novel systematic CADD methodology for the identification, analysis, and characterization of drug-like candidates.
NASA Astrophysics Data System (ADS)
Parise, M.
2018-01-01
A highly accurate analytical solution is derived to the electromagnetic problem of a short vertical wire antenna located on a stratified ground. The derivation consists of three steps. First, the integration path of the integrals describing the fields of the dipole is deformed and wrapped around the pole singularities and the two vertical branch cuts of the integrands located in the upper half of the complex plane. This makes it possible to decompose the radiated field into its three contributions, namely the above-surface ground wave, the lateral wave, and the trapped surface waves. Next, the square root terms responsible for the branch cuts are extracted from the integrands of the branch-cut integrals. Finally, the extracted square roots are replaced with their rational representations according to Newton's square root algorithm, and the residue theorem is applied to give explicit expressions, in series form, for the fields. The rigorous integration procedure and the convergence of the square root algorithm ensure that the obtained formulas converge to the exact solution. Numerical simulations are performed to show the validity and robustness of the developed formulation, as well as its advantages in terms of time cost over standard numerical integration procedures.
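The square-root replacement step rests on the fact that Newton's iteration for sqrt(s) produces exactly rational iterates that converge quadratically, so occurrences of the square root can be traded for rational expressions to any desired accuracy. A generic sketch of the iteration (not the paper's specific rational representation):

```python
from fractions import Fraction

def newton_sqrt(s, steps, x0=1):
    """Newton iterates x <- (x + s/x) / 2 for sqrt(s).  Starting from a
    rational guess, every iterate is an exact rational number."""
    x, s = Fraction(x0), Fraction(s)
    for _ in range(steps):
        x = (x + s / x) / 2
    return x

approx = newton_sqrt(2, 5)
print(approx.numerator, "/", approx.denominator)
print(float(approx))   # agrees with sqrt(2) to double precision
```

Quadratic convergence roughly doubles the number of correct digits per step, so only a handful of iterations are needed before the rational substitute is indistinguishable from the exact root at working precision.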
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vega, Richard Manuel; Parma, Edward J.; Griffin, Patrick J.
2015-07-01
This report was put together to support the International Atomic Energy Agency (IAEA) REAL-2016 activity to validate the dosimetry community’s ability to use a consistent set of activation data and to derive consistent spectral characterizations. The report captures details of integral measurements taken in the Annular Core Research Reactor (ACRR) central cavity with the 44 inch Lead-Boron (LB44) bucket, reference neutron benchmark field. The field is described and an “a priori” calculated neutron spectrum is reported, based on MCNP6 calculations, and a subject matter expert (SME) based covariance matrix is given for this “a priori” spectrum. The results of 31 integral dosimetry measurements in the neutron field are reported.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Karcher, Sandra; Willighagen, Egon L.; Rumble, John
Many groups within the broad field of nanoinformatics are already developing data repositories and analytical tools driven by their individual organizational goals. Integrating these data resources across disciplines and with non-nanotechnology resources can support multiple objectives by enabling the reuse of the same information. Integration can also serve as the impetus for novel scientific discoveries by providing the framework to support deeper data analyses. This article discusses current data integration practices in nanoinformatics and in comparable mature fields, and nanotechnology-specific challenges impacting data integration. Based on results from a nanoinformatics-community-wide survey, recommendations for achieving integration of existing operational nanotechnology resources are presented. Nanotechnology-specific data integration challenges, if effectively resolved, can foster the application and validation of nanotechnology within and across disciplines. This paper is one of a series of articles by the Nanomaterial Data Curation Initiative that address data issues such as data curation workflows, data completeness and quality, curator responsibilities, and metadata.
Samoudi, Amine M; Van Audenhaege, Karen; Vermeeren, Günter; Poole, Michael; Tanghe, Emmeric; Martens, Luc; Van Holen, Roel; Joseph, Wout
2015-12-01
We investigated the temporal variation of the induced magnetic field due to the transverse and the longitudinal gradient coils in tungsten collimators arranged in hexagonal and pentagonal geometries with and without gaps between the collimators. We modeled x-, y-, and z-gradient coils and different arrangements of single-photon emission computed tomography (SPECT) collimators using FEKO, a three-dimensional electromagnetic simulation tool. A time analysis approach was used to generate the pulsed magnetic field gradient. The approach was validated with measurements using a 7T MRI scanner. Simulations showed an induced magnetic field representing 4.66% and 0.87% of the applied gradient field (gradient strength = 500 mT/m) for longitudinal and transverse gradient coils, respectively. These values can be reduced by 75% by adding gaps between the collimators for the pentagonal arrangement, bringing the maximum induced magnetic field to less than 2% of the applied gradient for all of the gradient coils. Characterization of the maximum induced magnetic field shows that by adding gaps between the collimators for an integrated SPECT/MRI system, eddy currents can be corrected by the MRI system to avoid artifacts. The numerical model was validated and was proposed as a tool for studying the effect of a SPECT collimator within the MRI gradient coils. © 2014 Wiley Periodicals, Inc.
DOT National Transportation Integrated Search
2016-10-01
Railways are an important component of a multi-modal freight transport network. The structural integrity of rail substructure and problematic railway elements can be compromised leading to track instability and ultimately, train derailments. Because ...
NASA Astrophysics Data System (ADS)
Malard, J. J.; Rojas, M.; Adamowski, J. F.; Anandaraja, N.; Tuy, H.; Melgar-Quiñonez, H.
2016-12-01
While several well-validated crop growth models are currently widely used, very few crop pest models of the same caliber have been developed or applied, and pest models that take trophic interactions into account are even rarer. This may be due to several factors, including 1) the difficulty of representing complex agroecological food webs in a quantifiable model, and 2) the general belief that pesticides effectively remove insect pests from immediate concern. However, pests currently claim a substantial share of harvests every year (and account for additional control costs), and the impact of insects and of their trophic interactions on agricultural crops cannot be ignored, especially in the context of changing climates and increasing pressures on crops across the globe. Unfortunately, most integrated pest management frameworks rely on very simple models (if any at all), and most examples of successful agroecological management remain more anecdotal than scientifically replicable. In light of this, there is a need for validated and robust agroecological food web models that allow users to predict the response of these webs to changes in management, crops or climate, both to anticipate future pest problems under a changing climate and to develop effective integrated management plans. Here we present Tiko'n, a Python-based software package whose API allows users to rapidly build and validate trophic web agroecological models that predict pest dynamics in the field. The programme uses a Bayesian inference approach to calibrate the models against field data, allowing for the reuse of literature data from various sources and reducing the need for extensive field data collection.
We apply the model to the coconut black-headed caterpillar (Opisina arenosella) and associated parasitoid data from Sri Lanka, showing how the modeling framework can be used to rapidly develop, calibrate and validate models that elucidate how the internal structures of food webs determine their behaviour and allow users to evaluate different integrated management options.
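Tiko'n itself is not reproduced here, but the calibration idea the abstract describes — fitting a trophic model to field counts by Bayesian-style inference — can be sketched in miniature. The sketch below assumes a classic Nicholson-Bailey host-parasitoid model and a brute-force grid search over a flat prior; the model choice, parameter ranges, and the `observed` counts are all illustrative, not taken from the paper.

```python
import math

def nicholson_bailey(h0, p0, r, a, steps):
    """Simulate host (pest) and parasitoid densities over discrete generations."""
    h, p = h0, p0
    traj = []
    for _ in range(steps):
        escape = math.exp(-a * p)                # fraction of hosts escaping parasitism
        h, p = r * h * escape, h * (1 - escape)  # next-generation densities
        traj.append((h, p))
    return traj

def log_likelihood(obs, sim, sigma=5.0):
    """Gaussian log-likelihood of observed host counts given simulated hosts."""
    return sum(-0.5 * ((o - s[0]) / sigma) ** 2 for o, s in zip(obs, sim))

# Hypothetical field counts of the pest over five generations (illustrative only).
observed = [12.0, 18.0, 9.0, 14.0, 11.0]

# Brute-force "posterior" under a flat prior: score (r, a) pairs on a grid
# and keep the maximum-likelihood pair.
best, best_ll = None, -math.inf
for r in [x / 10 for x in range(10, 31)]:      # growth rate r in 1.0 .. 3.0
    for a in [x / 100 for x in range(1, 21)]:  # searching efficiency a in 0.01 .. 0.20
        ll = log_likelihood(observed, nicholson_bailey(10.0, 5.0, r, a, 5))
        if ll > best_ll:
            best, best_ll = (r, a), ll

print("maximum-likelihood (r, a):", best)
```

A full Bayesian treatment would sample the posterior (e.g. with MCMC) rather than keep only the best grid point, but the structure — a forward ecological model scored against field observations — is the same.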
Deriving all p-brane superalgebras via integrability
NASA Astrophysics Data System (ADS)
Grasso, D. T.; McArthur, I. N.
2018-03-01
In previous work we demonstrated that the enlarged super-Poincare algebras which underlie p-brane and D-brane actions in superstring theory can be directly determined based on the integrability of supersymmetry transformations assigned to fields appearing in Wess-Zumino terms. In that work we derived p-brane superalgebras for p = 2 and 3. Here we extend our previous results and give a compact expression for superalgebras for all valid p.
Applications of GPS technologies to field sports.
Aughey, Robert J
2011-09-01
Global positioning system (GPS) technology was made possible after the invention of the atomic clock. The first suggestion that GPS could be used to assess the physical activity of humans followed some 40 y later. There was a rapid uptake of GPS technology, with the literature concentrating on validation studies and the measurement of steady-state movement. The first attempts were made to validate GPS for field sport applications in 2006. While GPS has been validated for team-sport applications, some doubts remain about its appropriateness for measuring short, high-velocity movements. GPS has nevertheless been applied extensively in Australian football, cricket, hockey, rugby union and league, and soccer. The GPS literature contains extensive information on the activity profiles of athletes in field sports, including total distance covered by players and distance in velocity bands. GPS has also been applied to detect fatigue in matches, to identify the most intense periods of play, and to characterize activity profiles by position, competition level, and sport. More recent research has integrated GPS data with the physical capacity or fitness test scores of athletes, game-specific tasks, or tactical and strategic information. The future of GPS analysis will involve further miniaturization of devices, longer battery life, and integration of other inertial sensor data to more effectively quantify the effort of athletes.
Lee, Jungpyo; Bonoli, Paul; Wright, John
2011-01-01
The quasilinear diffusion coefficient assuming a constant magnetic field along the electron orbit is widely used to describe electron Landau damping of waves in a tokamak, where the magnitude of the magnetic field varies on a flux surface. To understand the impact of violating the constant-magnetic-field assumption, we introduce the effect of a broad-bandwidth wave spectrum, which has been used in the past to validate quasilinear theory for the fast decorrelation process between resonances. By re-evaluating the diffusion coefficient at the level of the phase integral for tokamak geometry with the broad-band wave effect included, we identify the three acceptable errors for the use of the quasilinear diffusion coefficient.
32 CFR 806b.7 - Responsibilities.
Code of Federal Regulations, 2010 CFR
2010-07-01
... is the senior Air Force Privacy Official with overall responsibility for the Air Force Privacy Act... Data Integrity Board. (5) Provides guidance and assistance to Major Commands, field operating agencies... validate currency. (6) Evaluate the health of the program at regular intervals using this part as guidance...
2017-04-01
The objective of this project was to demonstrate and validate the integrated Training Range Environmental Evaluation and Characterization ... Defense (DoD) training and testing ranges. The Training Range Environmental Evaluation and Characterization System (TREECS™), with its Environmental Fate Simulator, was developed to
DOE Office of Scientific and Technical Information (OSTI.GOV)
Borg, Lori; Tobin, David; Reale, Anthony
This IOP has been a coordinated effort involving the U.S. Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Climate Research Facility, the University of Wisconsin (UW)-Madison, and the JPSS project to validate SNPP NOAA Unique Combined Atmospheric Processing System (NUCAPS) temperature and moisture sounding products from the Cross-track Infrared Sounder (CrIS) and the Advanced Technology Microwave Sounder (ATMS). In this arrangement, funding for radiosondes was provided by the JPSS project to ARM. These radiosondes were launched coincident with the SNPP satellite overpasses (OP) at four of the ARM field sites beginning in July 2012 and running through September 2017. Combined with other ARM data, an assessment of the radiosonde data quality was performed and post-processing corrections were applied, producing an ARM site Best Estimate (BE) product. The SNPP targeted radiosondes were integrated into the NOAA Products Validation System (NPROVS+), which collocated the radiosondes with satellite products (NOAA, National Aeronautics and Space Administration [NASA], European Organisation for the Exploitation of Meteorological Satellites [EUMETSAT], Geostationary Operational Environmental Satellite [GOES], Constellation Observing System for Meteorology, Ionosphere, and Climate [COSMIC]) and Numerical Weather Prediction (NWP) forecasts for use in product assessment and algorithm development. This work was a fundamental, integral, and cost-effective part of the SNPP validation effort and provided critical accuracy assessments of the SNPP temperature and water vapor soundings.
2015-06-24
physically. While not distinct from IH models, they require inner boundary magnetic field and plasma property values, the latter not currently measured ... initialization for the computational grid. Model integration continues until a physically consistent steady-state is attained. Because of the more ... physical basis and greater likelihood of realistic solutions, only MHD-type coronal models were considered in the review. There are two major types of
NASA Astrophysics Data System (ADS)
Daniele, Vito G.; Lombardi, Guido; Zich, Rodolfo S.
2017-12-01
Complex scattering problems often involve composite structures in which wedges and penetrable substrates interact in the near field. In this paper (Part 1), together with its companion paper (Part 2), we study the canonical problem of a Perfectly Electrically Conducting (PEC) wedge lying on a grounded dielectric slab, using a comprehensive mathematical model based on the Generalized Wiener-Hopf Technique (GWHT) with the help of equivalent circuit representations for linear homogeneous regions (angular and layered regions). The proposed procedure is valid for the general case, and the papers focus on E-polarization. The solution is obtained using analytical and semianalytical approaches that reduce the Wiener-Hopf factorization to integral equations. Several numerical test cases validate the proposed method. The scope of Part 1 is to present the method and its validation as applied to the problem. The companion paper, Part 2, focuses on the properties of the solution and presents physical and engineering insights such as Geometrical Theory of Diffraction (GTD)/Uniform Theory of Diffraction (UTD) coefficients, total far fields, modal fields, and the excitation of surface and leaky waves for different kinds of sources. The structure is of interest in antenna technologies and electromagnetic compatibility (a tip on a substrate with guiding and antenna properties).
A novel configuration for a brushless DC motor with an integrated planetary gear train
NASA Astrophysics Data System (ADS)
Yan, Hong-Sen; Wu, Yi-Chang
2006-06-01
This paper presents a novel configuration of a brushless DC (BLDC) motor with an integrated planetary gear train, which provides further functional and structural integration to overcome inherent drawbacks of traditional designs. The effects of gear teeth on the magnetic field and performance of the BLDC motor are investigated. Two standard gear profile systems integrated on the stator with feasible numbers of gear teeth are introduced to reduce the cogging torque. An equivalent magnetic circuit model and an air-gap permeance model are applied to analyze the magnetic field analytically, and the validity is verified by the 2-D finite-element method (FEM). Furthermore, the motor performance is discussed and compared with an existing design. The results show that the present design has lower cogging torque and torque ripple than the conventional design, which benefits a wide range of applications requiring accurate motion and position control with BLDC motors.
Using Focus Groups to Validate a Pharmacy Vaccination Training Program.
Bushell, Mary; Morrissey, Hana; Ball, Patrick
2015-06-12
Introduction: Focus group methodology is commonly used to quickly collate integrated views from a variety of stakeholders. This paper provides an example of how focus groups can be employed to collate expert opinion informing amendments to a newly developed training program for integration into undergraduate pharmacy curricula. Materials and methods: Four focus groups were conducted, across three continents, to determine the appropriateness and reliability of a developed vaccination training program with nested injection skills training. All focus groups were composed of recognized experts in the field of vaccination, medicine and/or pharmacy. Results: Themes that emerged across focus groups informed amendments giving rise to a validated version of the training program. Discussion: The rigorous validation of the vaccination training program offers generalizable lessons to inform the design and validation of future training programs intended for the health sector and/or pharmacy curricula. Using the knowledge and experience of focus group participants fostered collaborative problem solving and validation of material and concept development. The group dynamics of a focus group allowed synthesis of feedback in an inter-professional manner. Conclusions: This paper provides a demonstration of how focus groups can be structured and used by health researchers to validate a newly developed training program.
Opto-mechanical design of an image slicer for the GRIS spectrograph at GREGOR
NASA Astrophysics Data System (ADS)
Vega Reyes, N.; Esteves, M. A.; Sánchez-Capuchino, J.; Salaun, Y.; López, R. L.; Gracia, F.; Estrada Herrera, P.; Grivel, C.; Vaz Cedillo, J. J.; Collados, M.
2016-07-01
An image slicer has been proposed for the Integral Field Spectrograph [1] of the 4-m European Solar Telescope (EST) [2]. The image slicer for EST is called MuSICa (Multi-Slit Image slicer based on Collimator-Camera) [3]; it is a telecentric system with diffraction-limited optical quality, offering the possibility of high-resolution integral-field solar spectroscopy or spectro-polarimetry by coupling a polarimeter after the generated slit (or slits). Considering the technical complexity of the proposed Integral Field Unit (IFU), a prototype has been designed for the GRIS spectrograph at the GREGOR telescope at Teide Observatory (Tenerife), composed of the optical elements of the image slicer itself, a scanning system (to cover a larger field of view with sequential adjacent measurements) and an appropriate re-imaging system. All these subsystems are placed on a bench specially designed to facilitate their alignment, integration and verification, and their easy installation in front of the spectrograph. This communication describes the opto-mechanical solution adopted to upgrade GRIS while ensuring repeatability between the observational modes, IFU and long-slit. Results from several tests performed to validate the opto-mechanical prototypes are also presented.
HDU Deep Space Habitat (DSH) Overview
NASA Technical Reports Server (NTRS)
Kennedy, Kriss J.
2011-01-01
This paper gives an overview of the National Aeronautics and Space Administration (NASA) led multi-center Habitat Demonstration Unit (HDU) project Deep Space Habitat (DSH) analog that will be field-tested during the 2011 Desert Research and Technologies Studies (D-RATS) field tests. The HDU project is a technology pull project that integrates technologies and innovations from multiple NASA centers. This project will repurpose the HDU Pressurized Excursion Module (PEM) that was field tested in the 2010 D-RATS, adding habitation functionality to the prototype unit. The 2010 configuration of the HDU-PEM consisted of a lunar surface laboratory module that was used to bring over 20 habitation-related technologies together in a single platform that could be tested as an advanced habitation analog in the context of mission architectures and surface operations. The 2011 HDU-DSH configuration will build upon the PEM work, and emphasize validation of crew operations (habitation and living, etc.), EVA operations, mission operations, logistics operations, and science operations that might be required in a deep space context for Near Earth Object (NEO) exploration mission architectures. The HDU project consists of a multi-center team brought together in a skunkworks approach to quickly build and validate hardware in analog environments. The HDU project is part of the strategic plan from the Exploration Systems Mission Directorate (ESMD) Directorate Integration Office (DIO) and the Exploration Mission Systems Office (EMSO) to test destination elements in analog environments. The 2011 analog field test will include Multi Mission Space Exploration Vehicles (MMSEV) and the DSH among other demonstration elements to be brought together in a mission architecture context. This paper will describe overall objectives, various habitat configurations, the strategic plan, and technology integration as it pertains to the 2011 field tests.
Whelan, Maurice; Eskes, Chantra
Validation is essential for the translation of newly developed alternative approaches to animal testing into tools and solutions suitable for regulatory applications. Formal approaches to validation have emerged over the past 20 years or so and although they have helped greatly to progress the field, it is essential that the principles and practice underpinning validation continue to evolve to keep pace with scientific progress. The modular approach to validation should be exploited to encourage more innovation and flexibility in study design and to increase efficiency in filling data gaps. With the focus now on integrated approaches to testing and assessment that are based on toxicological knowledge captured as adverse outcome pathways, and which incorporate the latest in vitro and computational methods, validation needs to adapt to ensure it adds value rather than hinders progress. Validation needs to be pursued both at the method level, to characterise the performance of in vitro methods in relation to their ability to detect any association of a chemical with a particular pathway or key toxicological event, and at the methodological level, to assess how integrated approaches can predict toxicological endpoints relevant for regulatory decision making. To facilitate this, more emphasis needs to be given to the development of performance standards that can be applied to classes of methods and integrated approaches that provide similar information. Moreover, the challenge of selecting the right reference chemicals to support validation needs to be addressed more systematically, consistently and in a manner that better reflects the state of the science. Above all however, validation requires true partnership between the development and user communities of alternative methods and the appropriate investment of resources.
Kinematic validation of a quasi-geostrophic model for the fast dynamics in the Earth's outer core
NASA Astrophysics Data System (ADS)
Maffei, S.; Jackson, A.
2017-09-01
We derive a quasi-geostrophic (QG) system of equations suitable for the description of the Earth's core dynamics on interannual to decadal timescales. Over these timescales, rotation is assumed to be the dominant force and fluid motions are strongly invariant along the direction parallel to the rotation axis. The diffusion-free QG system derived here is similar to that derived in Canet et al., but the projection of the governing equations on the equatorial disc is handled via vertical integration and mass conservation is applied to the velocity field. Here we carefully analyse the properties of the resulting equations and validate them by neglecting the action of the Lorentz force in the momentum equation. We derive a novel analytical solution describing the evolution of the magnetic field under these assumptions in the presence of a purely azimuthal flow, and an alternative formulation that allows us to numerically solve the evolution equations with a finite element method. The excellent agreement we found with the analytical solution proves that numerical integration of the QG system is possible and that it preserves important physical properties of the magnetic field. Implementation of magnetic diffusion is also briefly considered.
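For context, the kinematic setting of the analytical validation can be stated compactly. In the diffusion-free limit the magnetic field obeys the ideal induction equation, and for a purely azimuthal flow a z-invariant scalar component such as $B_z$ is simply advected in azimuth. The relations below are a generic statement of such advection, given here for illustration; the paper's full analytical solution is not reproduced:

```latex
\frac{\partial \mathbf{B}}{\partial t} = \nabla \times (\mathbf{u} \times \mathbf{B}),
\qquad
\mathbf{u} = s\,\omega(s)\,\hat{\mathbf{e}}_{\phi}
\;\;\Longrightarrow\;\;
\frac{\partial B_z}{\partial t} + \omega(s)\,\frac{\partial B_z}{\partial \phi} = 0,
\qquad
B_z(s,\phi,t) = B_z\bigl(s,\;\phi - \omega(s)\,t,\;0\bigr).
```

Differential rotation $\omega(s)$ thus shears the initial field pattern without diffusion, which is exactly the regime in which an analytical solution can serve as a benchmark for the numerical integration.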
NASA Technical Reports Server (NTRS)
Tri, Terry O.; Kennedy, Kriss J.; Toups, Larry; Gill, Tracy R.; Howe, A. Scott
2011-01-01
This paper describes the construction, assembly, subsystem integration, transportation, and field testing operations associated with the Habitat Demonstration Unit (HDU) Pressurized Excursion Module (PEM) and discusses lessons learned. In a one-year period beginning summer 2009, a tightly scheduled design-develop-build process was utilized by a small NASA "tiger team" to produce the functional HDU-PEM prototype in time to participate in the 2010 Desert Research and Technology Studies (Desert RATS) field campaign. The process required the coordination of multiple teams, subcontractors, facility management and safety staff. It also required a well-choreographed material handling and transportation process to deliver the finished product from the NASA-Johnson Space Center facilities to the remote Arizona desert locations of the field test. Significant findings of this paper include the team's greater understanding of the HDU-PEM's many integration issues and the in-field training the team acquired, which will enable the implementation of the next generation of improvements and the development of high-fidelity field operations in a harsh environment. The Desert RATS analog environment is being promoted by NASA as an efficient means to design, build, and integrate multiple technologies in a mission architecture context, with the eventual goal of evolving the technologies into robust flight hardware systems. The HDU-PEM in-field demonstration at Desert RATS 2010 provided a validation process for the integration team, which has already begun to retool for the 2011 field tests that require an adapted architecture.
An introduction to generalized functions with some applications in aerodynamics and aeroacoustics
NASA Technical Reports Server (NTRS)
Farassat, F.
1994-01-01
In this paper, we start with the definition of generalized functions as continuous linear functionals on the space of infinitely differentiable functions with compact support. The concept of generalized differentiation is introduced next. This is the most important concept in generalized function theory, and the applications we present utilize mainly this concept. First, some of the results of classical analysis, such as the Leibniz rule of differentiation under the integral sign and the divergence theorem, are derived using generalized function theory. It is shown that the divergence theorem remains valid for discontinuous vector fields provided that the derivatives are all viewed as generalized derivatives. This implies that all conservation laws of fluid mechanics are valid as they stand for discontinuous fields with all derivatives treated as generalized derivatives. Once these derivatives are written as ordinary derivatives plus jumps in the field parameters across discontinuities, the jump conditions can be easily found. For example, the unsteady shock jump conditions can be derived from the mass and momentum conservation laws; using generalized function theory, this derivation becomes trivial. Other applications of generalized function theory in aerodynamics discussed in this paper are the derivation of general transport theorems for deriving governing equations of fluid mechanics, the interpretation of the finite part of divergent integrals, the derivation of the Oswatitsch integral equation of transonic flow, and the analysis of velocity field discontinuities as sources of vorticity. Applications in aeroacoustics presented here include the derivation of the Kirchhoff formula for moving surfaces, the noise from moving surfaces, and shock noise source strength based on the Ffowcs Williams-Hawkings equation.
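The mechanism behind the jump conditions can be stated in one line. For a quantity $q$ with jump $\Delta q$ across a surface defined by $f(\mathbf{x}) = 0$, the generalized derivative (written with an overbar, as is common in this literature) equals the ordinary derivative plus a delta-function term:

```latex
\frac{\bar{\partial} q}{\partial x_i}
= \frac{\partial q}{\partial x_i}
+ \Delta q \, \frac{\partial f}{\partial x_i}\, \delta(f) .
```

Writing a conservation law with generalized derivatives and requiring the $\delta(f)$ terms to balance is what produces the jump conditions. For mass conservation across a shock whose surface moves with normal velocity $v_n$, this yields the familiar relation

```latex
\rho_1 \,\bigl(u_{n1} - v_n\bigr) = \rho_2 \,\bigl(u_{n2} - v_n\bigr),
```

where subscripts 1 and 2 denote the two sides of the discontinuity and $u_n$ the fluid velocity normal to it.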
Gauge and integrable theories in loop spaces
NASA Astrophysics Data System (ADS)
Ferreira, L. A.; Luchini, G.
2012-05-01
We propose an integral formulation of the equations of motion of a large class of field theories which leads in a quite natural and direct way to the construction of conservation laws. The approach is based on generalized non-abelian Stokes theorems for p-form connections, and its appropriate mathematical language is that of loop spaces. The equations of motion are written as the equality of a hyper-volume ordered integral to a hyper-surface ordered integral on the border of that hyper-volume. The approach applies to integrable field theories in (1+1) dimensions, Chern-Simons theories in (2+1) dimensions, and non-abelian gauge theories in (2+1) and (3+1) dimensions. The results presented in this paper are relevant for the understanding of global properties of those theories. As a special byproduct we solve a long standing problem in (3+1)-dimensional Yang-Mills theory, namely the construction of conserved charges, valid for any solution, which are invariant under arbitrary gauge transformations.
Aeroacoustic Validation of Installed Low Noise Propulsion for NASA's N+2 Supersonic Airliner
NASA Technical Reports Server (NTRS)
Bridges, James
2018-01-01
An aeroacoustic test was conducted at NASA Glenn Research Center on an integrated propulsion system designed to meet noise regulations of ICAO Chapter 4 with 10 EPNdB cumulative margin. The test had two objectives: to demonstrate that the aircraft design did meet the noise goal, and to validate the acoustic design tools used in the design. Variations in the propulsion system design and its installation were tested and the results compared against predictions. Far-field arrays of microphones measured the acoustic spectral directivity, which was transformed to full scale as noise certification levels. Phased array measurements confirmed that the shielding of the installation model adequately simulated the full aircraft and provided data for validating RANS-based noise prediction tools. Particle image velocimetry confirmed that the flow field around the nozzle on the jet rig mimicked that of the full aircraft and produced flow data to validate the RANS solutions used in the noise predictions. The far-field acoustic measurements confirmed the empirical predictions for the noise. Results provided here detail the steps taken to ensure accuracy of the measurements and give insights into the physics of exhaust noise from installed propulsion systems in future supersonic vehicles.
NASA Astrophysics Data System (ADS)
Costantini, Mario; Malvarosa, Fabio; Minati, Federico
2010-03-01
Phase unwrapping and integration of finite differences are key problems in several technical fields. In SAR interferometry, and in differential and persistent scatterer interferometry, digital elevation models and displacement measurements can be obtained after unambiguously determining the phase values and reconstructing the mean velocities and elevations of the observed targets, which can be performed by integrating differential estimates of these quantities (finite differences between neighboring points). In this paper we propose a general formulation for robust and efficient integration of finite differences and phase unwrapping, which includes standard techniques as sub-cases. The proposed approach yields more reliable and accurate solutions by exploiting redundant differential estimates (not only between nearest neighboring points) and multi-dimensional information (e.g. multi-temporal, multi-frequency, multi-baseline observations), or external data (e.g. GPS measurements). The proposed approach requires the solution of linear or quadratic programming problems, for which computationally efficient algorithms exist. Validation tests on real SAR data confirm the validity of the method, which was integrated in our production chain and has been used successfully in large-scale production as well.
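As a toy illustration of the redundant-integration idea (not the authors' production algorithm, which solves linear or quadratic programs), node values can be reconstructed from noisy pairwise difference estimates by least squares. The 4-node network, noise level, and edge set below are invented for the example, and NumPy is assumed to be available.

```python
import numpy as np

# Hypothetical 4-node network with 5 redundant difference estimates d_k ≈ x[j] - x[i]
# (edges beyond nearest neighbours included, as in the redundant formulation).
edges = [(0, 1), (1, 2), (2, 3), (0, 2), (1, 3)]
true_x = np.array([0.0, 1.0, 3.0, 6.0])
rng = np.random.default_rng(0)
d = np.array([true_x[j] - true_x[i] for i, j in edges])
d += rng.normal(0.0, 0.01, size=len(edges))    # measurement noise on each estimate

# Incidence matrix A with A @ x ≈ d.  Node 0 is pinned to zero to remove the
# integration constant (the gauge freedom of any difference network).
A = np.zeros((len(edges), len(true_x)))
for k, (i, j) in enumerate(edges):
    A[k, i], A[k, j] = -1.0, 1.0
sol, *_ = np.linalg.lstsq(A[:, 1:], d, rcond=None)  # least squares for nodes 1..3
x = np.concatenate([[0.0], sol])
print("reconstructed node values:", x)
```

Because the extra edges over-determine the system, the least-squares fit averages out independent noise on each difference estimate — the same benefit the abstract attributes to exploiting redundant differential estimates.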
Simulation of Jet Noise with OVERFLOW CFD Code and Kirchhoff Surface Integral
NASA Technical Reports Server (NTRS)
Kandula, M.; Caimi, R.; Voska, N. (Technical Monitor)
2002-01-01
An acoustic prediction capability for supersonic axisymmetric jets was developed on the basis of OVERFLOW Navier-Stokes CFD (Computational Fluid Dynamics) code of NASA Langley Research Center. Reynolds-averaged turbulent stresses in the flow field are modeled with the aid of Spalart-Allmaras one-equation turbulence model. Appropriate acoustic and outflow boundary conditions were implemented to compute time-dependent acoustic pressure in the nonlinear source-field. Based on the specification of acoustic pressure, its temporal and normal derivatives on the Kirchhoff surface, the near-field and the far-field sound pressure levels are computed via Kirchhoff surface integral, with the Kirchhoff surface chosen to enclose the nonlinear sound source region described by the CFD code. The methods are validated by a comparison of the predictions of sound pressure levels with the available data for an axisymmetric turbulent supersonic (Mach 2) perfectly expanded jet.
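For a stationary Kirchhoff surface $S$, a commonly used form of the surface integral is given below; the paper's exact implementation is not shown in the abstract, so this is the standard stationary-surface result, stated for reference:

```latex
p'(\mathbf{x}, t) \;=\; \frac{1}{4\pi} \oint_{S}
\left[
\frac{p'}{r^{2}} \frac{\partial r}{\partial n}
\;-\; \frac{1}{r} \frac{\partial p'}{\partial n}
\;+\; \frac{1}{c\, r} \frac{\partial r}{\partial n} \frac{\partial p'}{\partial \tau}
\right]_{\tau \,=\, t - r/c} \mathrm{d}S ,
```

where $r = |\mathbf{x} - \mathbf{y}|$ is the distance from the surface point $\mathbf{y}$ to the observer $\mathbf{x}$, $n$ is the outward surface normal, $c$ the ambient sound speed, and the bracket is evaluated at the retarded time $\tau = t - r/c$. This is consistent with the abstract's statement that the acoustic pressure together with its temporal and normal derivatives on the Kirchhoff surface determines the near- and far-field sound pressure levels.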
NASA Technical Reports Server (NTRS)
Kandula, Max; Caimi, Raoul; Steinrock, T. (Technical Monitor)
2001-01-01
An acoustic prediction capability for supersonic axisymmetric jets was developed on the basis of OVERFLOW Navier-Stokes CFD (Computational Fluid Dynamics) code of NASA Langley Research Center. Reynolds-averaged turbulent stresses in the flow field are modeled with the aid of Spalart-Allmaras one-equation turbulence model. Appropriate acoustic and outflow boundary conditions were implemented to compute time-dependent acoustic pressure in the nonlinear source-field. Based on the specification of acoustic pressure, its temporal and normal derivatives on the Kirchhoff surface, the near-field and the far-field sound pressure levels are computed via Kirchhoff surface integral, with the Kirchhoff surface chosen to enclose the nonlinear sound source region described by the CFD code. The methods are validated by a comparison of the predictions of sound pressure levels with the available data for an axisymmetric turbulent supersonic (Mach 2) perfectly expanded jet.
Developing a New Field-Validated Methodology for Landfill Methane Emissions in California
USDA-ARS?s Scientific Manuscript database
This project was initiated in the US by the California Energy Commission (CEC) in cooperation with the California Integrated Waste Management Board (CIWMB) to develop improved methods for landfill methane emissions for the California greenhouse gas inventory. This 3-year project (2007-2010) is devel...
Outcome of the First wwPDB Hybrid/Integrative Methods Task Force Workshop
Sali, Andrej; Berman, Helen M.; Schwede, Torsten; Trewhella, Jill; Kleywegt, Gerard; Burley, Stephen K.; Markley, John; Nakamura, Haruki; Adams, Paul; Bonvin, Alexandre M.J.J.; Chiu, Wah; Dal Peraro, Matteo; Di Maio, Frank; Ferrin, Thomas E.; Grünewald, Kay; Gutmanas, Aleksandras; Henderson, Richard; Hummer, Gerhard; Iwasaki, Kenji; Johnson, Graham; Lawson, Catherine L.; Meiler, Jens; Marti-Renom, Marc A.; Montelione, Gaetano T.; Nilges, Michael; Nussinov, Ruth; Patwardhan, Ardan; Rappsilber, Juri; Read, Randy J.; Saibil, Helen; Schröder, Gunnar F.; Schwieters, Charles D.; Seidel, Claus A. M.; Svergun, Dmitri; Topf, Maya; Ulrich, Eldon L.; Velankar, Sameer; Westbrook, John D.
2016-01-01
Summary Structures of biomolecular systems are increasingly computed by integrative modeling that relies on varied types of experimental data and theoretical information. We describe here the proceedings and conclusions from the first wwPDB Hybrid/Integrative Methods Task Force Workshop held at the European Bioinformatics Institute in Hinxton, UK, October 6 and 7, 2014. At the workshop, experts in various experimental fields of structural biology, experts in integrative modeling and visualization, and experts in data archiving addressed a series of questions central to the future of structural biology. How should integrative models be represented? How should the data and integrative models be validated? What data should be archived? How should the data and models be archived? What information should accompany the publication of integrative models? PMID:26095030
Valentijn, Pim P.; Schepman, Sanneke M.; Opheij, Wilfrid; Bruijnzeels, Marc A.
2013-01-01
Introduction Primary care has a central role in integrating care within a health system. However, conceptual ambiguity regarding integrated care hampers a systematic understanding. This paper proposes a conceptual framework that combines the concepts of primary care and integrated care, in order to understand the complexity of integrated care. Methods The search method involved a combination of electronic database searches, hand searches of reference lists (snowball method) and contacting researchers in the field. The process of synthesizing the literature was iterative, to relate the concepts of primary care and integrated care. First, we identified the general principles of primary care and integrated care. Second, we connected the dimensions of integrated care and the principles of primary care. Finally, to improve content validity we held several meetings with researchers in the field to develop and refine our conceptual framework. Results The conceptual framework combines the functions of primary care with the dimensions of integrated care. Person-focused and population-based care serve as guiding principles for achieving integration across the care continuum. Integration plays complementary roles on the micro (clinical integration), meso (professional and organisational integration) and macro (system integration) level. Functional and normative integration ensure connectivity between the levels. Discussion The presented conceptual framework is a first step to achieve a better understanding of the inter-relationships among the dimensions of integrated care from a primary care perspective. PMID:23687482
NASA Technical Reports Server (NTRS)
Starr, David
1999-01-01
The EOS Terra mission will be launched in July 1999. This mission has great relevance to the atmospheric radiation community and global change issues. Terra instruments include ASTER, CERES, MISR, MODIS and MOPITT. In addition to the fundamental radiance data sets, numerous global science data products will be generated, including various Earth radiation budget, cloud and aerosol parameters, as well as land surface, terrestrial ecology, ocean color, and atmospheric chemistry parameters. Significant investments have been made in on-board calibration to ensure the quality of the radiance observations. A key component of the Terra mission is the validation of the science data products. This is essential for a mission focused on global change issues and the underlying processes. The Terra algorithms have been subject to extensive pre-launch testing with field data whenever possible. Intensive efforts will be made to validate the Terra data products after launch. These include vicarious calibration experiments, instrument and cross-platform comparisons, routine collection of high quality correlative data from ground-based networks, such as AERONET, and intensive sites, such as the SGP ARM site, as well as a variety of field experiments, cruises, etc. Airborne simulator instruments have been developed for the field experiment and underflight activities, including the MODIS Airborne Simulator (MAS), AirMISR, MASTER (MODIS-ASTER), and MOPITT-A. All are integrated on the NASA ER-2, though low altitude platforms are more typically used for MASTER. MATR is an additional sensor used for MOPITT algorithm development and validation. The intensive validation activities planned for the first year of the Terra mission will be described with emphasis on derived geophysical parameters of most relevance to the atmospheric radiation community.
Detailed information about the EOS Terra validation program can be found on the EOS Validation Program homepage (http://ospso.gsfc.nasa.gov/validation/valpage.html).
Duff, Anthony P.; Durand, Dominique; Gabel, Frank; Hendrickson, Wayne A.; Hura, Greg L.; Jacques, David A.; Kirby, Nigel M.; Kwan, Ann H.; Pérez, Javier; Pollack, Lois; Ryan, Timothy M.; Sali, Andrej; Schneidman-Duhovny, Dina; Vachette, Patrice; Westbrook, John
2017-01-01
In 2012, preliminary guidelines were published addressing sample quality, data acquisition and reduction, presentation of scattering data and validation, and modelling for biomolecular small-angle scattering (SAS) experiments. Biomolecular SAS has since continued to grow and authors have increasingly adopted the preliminary guidelines. In parallel, integrative/hybrid determination of biomolecular structures is a rapidly growing field that is expanding the scope of structural biology. For SAS to contribute maximally to this field, it is essential to ensure open access to the information required for evaluation of the quality of SAS samples and data, as well as the validity of SAS-based structural models. To this end, the preliminary guidelines for data presentation in a publication are reviewed and updated, and the deposition of data and associated models in a public archive is recommended. These guidelines and recommendations have been prepared in consultation with the members of the International Union of Crystallography (IUCr) Small-Angle Scattering and Journals Commissions, the Worldwide Protein Data Bank (wwPDB) Small-Angle Scattering Validation Task Force and additional experts in the field. PMID:28876235
Barros, Wilson; Gochberg, Daniel F.; Gore, John C.
2009-01-01
The description of the nuclear magnetic resonance magnetization dynamics in the presence of long-range dipolar interactions, which is based upon approximate solutions of Bloch–Torrey equations including the effect of a distant dipolar field, has been revisited. New experiments show that approximate analytic solutions have a broader regime of validity as well as dependencies on pulse-sequence parameters that seem to have been overlooked. In order to explain these experimental results, we developed a new method consisting of calculating the magnetization via an iterative formalism where both diffusion and distant dipolar field contributions are treated as integral operators incorporated into the Bloch–Torrey equations. The solution can be organized as a perturbative series, whereby access to higher order terms allows one to set better boundaries on validity regimes for analytic first-order approximations. Finally, the method legitimizes the use of simple analytic first-order approximations under less demanding experimental conditions, it predicts new pulse-sequence parameter dependencies for the range of validity, and clarifies weak points in previous calculations. PMID:19425789
NASA Astrophysics Data System (ADS)
Bock, O.; Doerflinger, E.; Masson, F.; Walpersdorf, A.; Van-Baelen, J.; Tarniewicz, J.; Troller, M.; Somieski, A.; Geiger, A.; Bürki, B.
A dense network of 17 dual frequency GPS receivers was operated for two weeks during June 2001 within a 20 km × 20 km area around Marseille, France, as part of the ESCOMPTE field campaign ([Cros et al., 2004. The ESCOMPTE program: an overview. Atmos. Res. 69, 241-279]; http://medias.obs-mip.fr/escompte). The goal of this GPS experiment was to provide GPS data allowing for tomographic inversions and their validation within a well-documented observing period (the ESCOMPTE campaign). Simultaneous water vapor radiometer, solar spectrometer, Raman lidar and radiosonde data are used for comparison and validation. In this paper, we present the motivation and issues, and describe the GPS field experiment. First results of integrated water vapor retrievals from GPS and the other sensing techniques are presented. The strategies for GPS data processing and tomographic inversions are discussed.
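The conversion from GPS zenith wet delay (ZWD) to integrated water vapor (IWV) that underlies such retrievals can be sketched with the standard Bevis-type formula. The constants below are commonly quoted literature values in SI units, and the sample inputs are illustrative; this is not necessarily the exact processing used in the campaign.

```python
# Sketch of the standard zenith-wet-delay (ZWD) -> integrated water vapor
# (IWV) conversion used in GPS meteorology. Constants are commonly quoted
# literature values (SI units), not this campaign's specific processing.
Rv  = 461.5     # specific gas constant of water vapor, J kg^-1 K^-1
k2p = 0.221     # refractivity constant k2', K Pa^-1
k3  = 3.739e3   # refractivity constant k3,  K^2 Pa^-1

def iwv_from_zwd(zwd_m, tm_kelvin):
    """IWV (kg m^-2) from zenith wet delay (m) and the water-vapor
    weighted mean temperature Tm (K)."""
    return zwd_m / (1e-6 * Rv * (k2p + k3 / tm_kelvin))

iwv = iwv_from_zwd(0.150, 270.0)  # 15 cm wet delay at a cool mid-latitude Tm
```

A 15 cm wet delay maps to roughly 23 kg m^-2 of IWV, consistent with the familiar rule of thumb IWV ≈ ZWD/6.5.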
Turner, Todd J.; Shade, Paul A.; Bernier, Joel V.; ...
2016-11-18
High-Energy Diffraction Microscopy (HEDM) is a 3D X-ray characterization method that is uniquely suited to measuring the evolving micromechanical state and microstructure of polycrystalline materials during in situ processing. The near-field and far-field configurations provide complementary information; orientation maps computed from the near-field measurements provide grain morphologies, while the high angular resolution of the far-field measurements provides intergranular strain tensors. The ability to measure these data during deformation in situ makes HEDM an ideal tool for validating micro-mechanical deformation models that make their predictions at the scale of individual grains. Crystal Plasticity Finite Element Models (CPFEM) are one such class of micro-mechanical models. While there have been extensive studies validating homogenized CPFEM response at a macroscopic level, a lack of detailed data measured at the level of the microstructure has hindered more stringent model validation efforts. We utilize an HEDM dataset from an alpha-titanium alloy (Ti-7Al), collected at the Advanced Photon Source, Argonne National Laboratory, under in situ tensile deformation. The initial microstructure of the central slab of the gage section, measured via near-field HEDM, is used to inform a CPFEM model. The predicted intergranular stresses for 39 internal grains are then directly compared to data from 4 far-field measurements taken between ~4% and ~80% of the macroscopic yield strength. In conclusion, the intergranular stresses from the CPFEM model and far-field HEDM measurements up to incipient yield are shown to be in good agreement, and implications for the application of such an integrated computational/experimental approach to phenomena such as fatigue and crack propagation are discussed.
Omnetric Group Demonstrates Distributed Grid-Edge Control Hierarchy at NREL
At NREL's Energy Systems Integration Facility (ESIF), OMNETRIC Group demonstrated a distributed control hierarchy based on an open field message bus (OpenFMB) framework for coordinating distributed energy resources. OMNETRIC Group first developed and validated the system in the ESIF.
ERIC Educational Resources Information Center
Akkermans, Jos; Brenninkmeijer, Veerle; Huibers, Marthe; Blonk, Roland W. B.
2013-01-01
A new and promising area of research has recently emerged in the field of career development: career competencies. The present article provides a framework of career competencies that integrates several perspectives from the literature. The framework distinguishes between reflective, communicative, and behavioral career competencies. Six career…
Video Games for Neuro-Cognitive Optimization.
Mishra, Jyoti; Anguera, Joaquin A; Gazzaley, Adam
2016-04-20
Sophisticated video games that integrate engaging cognitive training with real-time biosensing and neurostimulation have the potential to optimize cognitive performance in health and disease. We argue that technology development must be paired with rigorous scientific validation and discuss academic and industry opportunities in this field. Copyright © 2016 Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Generazio, Ed; Burke, Eric
2015-01-01
The current activities in the National Aeronautics and Space Administration Nondestructive Evaluation (NDE) Program are presented. The topics covered include organizational communications, orbital weld inspection, electric field imaging, fracture critical probability of detection validation, monitoring of thermal protection systems, physical and document standards, image quality indicators, integrity of composite pressure vessels, and NDE for additively manufactured components.
Harmonics analysis of the ITER poloidal field converter based on a piecewise method
NASA Astrophysics Data System (ADS)
Xudong, WANG; Liuwei, XU; Peng, FU; Ji, LI; Yanan, WU
2017-12-01
Poloidal field (PF) converters provide controlled DC voltage and current to PF coils. The many harmonics generated by the PF converter flow into the power grid and seriously affect power systems and electric equipment. Due to the complexity of the system, the traditional integral operation in Fourier analysis is complicated and inaccurate. This paper presents a piecewise method to calculate the harmonics of the ITER PF converter. The relationship between the grid input current and the DC output current of the ITER PF converter is deduced. The grid current is decomposed into the sum of some simple functions. By calculating simple function harmonics based on the piecewise method, the harmonics of the PF converter under different operation modes are obtained. In order to examine the validity of the method, a simulation model is established based on Matlab/Simulink and a relevant experiment is implemented in the ITER PF integration test platform. Comparative results are given. The calculated results are found to be consistent with simulation and experiment. The piecewise method is proved correct and valid for calculating the system harmonics.
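The idea of decomposing the grid current into simple segments and summing their individual harmonic contributions can be illustrated on an idealized six-pulse bridge current. This is a textbook simplification for illustration, not the ITER PF converter waveform or the authors' decomposition.

```python
import numpy as np

def harmonic_amp(n, pieces, period=2*np.pi, npts=4001):
    """Amplitude of harmonic n of a piecewise-defined waveform, obtained by
    integrating each piece separately (trapezoidal rule) and summing."""
    total = 0.0 + 0.0j
    for t0, t1, value in pieces:
        t = np.linspace(t0, t1, npts)
        f = value * np.exp(-2j * np.pi * n * t / period)
        total += ((f[:-1] + f[1:]) / 2 * np.diff(t)).sum()  # trapezoid
    return abs(2 * total / period)

# idealized six-pulse line current: +1 pu for 120 deg, -1 pu for 120 deg
d = np.pi / 3
pieces = [(d/2, d/2 + 2*d, 1.0), (d/2 + 3*d, d/2 + 5*d, -1.0)]

h1, h3, h5, h7 = (harmonic_amp(n, pieces) for n in (1, 3, 5, 7))
```

The result reproduces the classic six-pulse spectrum: triplen harmonics vanish and the characteristic 6k±1 harmonics appear with amplitude h1/n.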
A multilevel finite element method for Fredholm integral eigenvalue problems
NASA Astrophysics Data System (ADS)
Xie, Hehu; Zhou, Tao
2015-12-01
In this work, we propose a multigrid finite element (MFE) method for solving Fredholm integral eigenvalue problems. The main motivation for such studies is to compute the Karhunen-Loève expansions of random fields, which play an important role in applications of uncertainty quantification. In our MFE framework, solving the eigenvalue problem is converted into a series of integral iterations combined with eigenvalue solves on the coarsest mesh. Any existing efficient integration scheme can then be used for the associated integration process. Error estimates are provided, and the computational complexity is analyzed. Notably, the total computational work of our method is comparable with a single integration step on the finest mesh. Several numerical experiments are presented to validate the efficiency of the proposed numerical method.
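As a minimal illustration of the underlying eigenproblem (using a simple single-level Nyström discretization rather than the paper's multigrid finite element scheme), the Karhunen-Loève modes of an exponential covariance kernel can be approximated as follows; the correlation length 0.3 is an arbitrary choice.

```python
import numpy as np

def kl_modes(cov, n=200, a=0.0, b=1.0):
    """Approximate the Fredholm eigenproblem  int_a^b cov(s,t) phi(t) dt
    = lam * phi(s)  by midpoint-rule (Nystrom) discretization."""
    h = (b - a) / n
    s = a + h * (np.arange(n) + 0.5)    # quadrature nodes
    K = cov(s[:, None], s[None, :])     # symmetric kernel matrix
    lam, phi = np.linalg.eigh(h * K)    # discrete eigenpairs
    idx = np.argsort(lam)[::-1]         # dominant KL modes first
    return lam[idx], phi[:, idx], s

# exponential covariance of a random field, correlation length 0.3
lam, phi, s = kl_modes(lambda x, y: np.exp(-np.abs(x - y) / 0.3))
```

The eigenvalues are positive and decay, and their sum equals the integrated variance ∫ cov(s, s) ds = 1 — a quick sanity check on any KL discretization.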
Electromagnetic Field Penetration Studies
NASA Technical Reports Server (NTRS)
Deshpande, M.D.
2000-01-01
A numerical method is presented to determine the electromagnetic shielding effectiveness of a rectangular enclosure with apertures on its wall used for input and output connections, control panels, visual-access windows, ventilation panels, etc. Expressing EM fields in terms of the cavity Green's function inside the enclosure and the free-space Green's function outside the enclosure, integral equations with aperture tangential electric fields as unknown variables are obtained by enforcing the continuity of tangential electric and magnetic fields across the apertures. Using the Method of Moments, the integral equations are solved for the unknown aperture fields. From these aperture fields, the EM field inside a rectangular enclosure due to external electromagnetic sources is determined. Numerical results on electric field shielding of a rectangular cavity with a thin rectangular slot obtained using the present method are compared with results obtained using the simple transmission-line technique for code validation. The present technique is applied to determine field penetration inside a Boeing-757 by approximating its passenger cabin as a rectangular cavity filled with a homogeneous medium and its passenger windows by rectangular apertures. Preliminary results for two windows, one on each side of the fuselage, were considered. Numerical results for the Boeing-757 at frequencies 26 MHz, 171-175 MHz, and 428-432 MHz are presented.
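The "simple transmission-line technique" used above for cross-checking can be illustrated with the textbook first-order slot formula SE = 20·log10(λ/2l). This is a generic hand estimate, not the paper's Method of Moments formulation, and the frequencies and slot length below are illustrative.

```python
import math

C0 = 3.0e8  # free-space speed of light, m/s

def slot_shielding_db(freq_hz, slot_len_m):
    """First-order shielding effectiveness of a thin slot,
    SE = 20*log10(lambda / (2*l)), valid for l < lambda/2."""
    lam = C0 / freq_hz
    if slot_len_m >= lam / 2:
        return 0.0  # slot at or beyond resonance: essentially no shielding
    return 20 * math.log10(lam / (2 * slot_len_m))

se_hi = slot_shielding_db(430e6, 0.25)  # 25 cm aperture near 430 MHz
se_lo = slot_shielding_db(26e6, 0.25)   # same aperture at 26 MHz
```

At 430 MHz a 25 cm slot is close to half a wavelength, so the estimate drops to a few dB, while at 26 MHz the same slot still shields well; this is why penetration is studied across the frequency bands listed above.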
Outcome of the First wwPDB Hybrid/Integrative Methods Task Force Workshop.
Sali, Andrej; Berman, Helen M; Schwede, Torsten; Trewhella, Jill; Kleywegt, Gerard; Burley, Stephen K; Markley, John; Nakamura, Haruki; Adams, Paul; Bonvin, Alexandre M J J; Chiu, Wah; Peraro, Matteo Dal; Di Maio, Frank; Ferrin, Thomas E; Grünewald, Kay; Gutmanas, Aleksandras; Henderson, Richard; Hummer, Gerhard; Iwasaki, Kenji; Johnson, Graham; Lawson, Catherine L; Meiler, Jens; Marti-Renom, Marc A; Montelione, Gaetano T; Nilges, Michael; Nussinov, Ruth; Patwardhan, Ardan; Rappsilber, Juri; Read, Randy J; Saibil, Helen; Schröder, Gunnar F; Schwieters, Charles D; Seidel, Claus A M; Svergun, Dmitri; Topf, Maya; Ulrich, Eldon L; Velankar, Sameer; Westbrook, John D
2015-07-07
Structures of biomolecular systems are increasingly computed by integrative modeling that relies on varied types of experimental data and theoretical information. We describe here the proceedings and conclusions from the first wwPDB Hybrid/Integrative Methods Task Force Workshop held at the European Bioinformatics Institute in Hinxton, UK, on October 6 and 7, 2014. At the workshop, experts in various experimental fields of structural biology, experts in integrative modeling and visualization, and experts in data archiving addressed a series of questions central to the future of structural biology. How should integrative models be represented? How should the data and integrative models be validated? What data should be archived? How should the data and models be archived? What information should accompany the publication of integrative models? Copyright © 2015 Elsevier Ltd. All rights reserved.
Nosal, Eva-Marie; Hodgson, Murray; Ashdown, Ian
2004-08-01
This paper explores acoustical (or time-dependent) radiosity--a geometrical-acoustics sound-field prediction method that assumes diffuse surface reflection. The literature of acoustical radiosity is briefly reviewed and the advantages and disadvantages of the method are discussed. A discrete form of the integral equation that results from meshing the enclosure boundaries into patches is presented and used in a discrete-time algorithm. Furthermore, an averaging technique is used to reduce computational requirements. To generalize to nonrectangular rooms, a spherical-triangle method is proposed as a means of evaluating the integrals over solid angles that appear in the discrete form of the integral equation. The evaluation of form factors, which also appear in the numerical solution, is discussed for rectangular and nonrectangular rooms. This algorithm and associated methods are validated by comparison of the steady-state predictions for a spherical enclosure to analytical solutions.
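A minimal sketch of the steady-state spherical-enclosure check described above: inside a sphere the form factor from any patch to any other is F_ij = A_j/A_total, so a Jacobi-style radiosity iteration B = E + ρFB must converge to the analytic value E/(1 − ρ) for uniform emission. The patch areas and reflection coefficient below are arbitrary illustrative values.

```python
import numpy as np

# Steady-state radiosity B = E + rho * F @ B solved by fixed-point iteration.
# Inside a sphere the patch-to-patch form factor is F_ij = A_j / A_total,
# giving a closed-form check: uniform E yields B = E / (1 - rho).
n = 50
rng = np.random.default_rng(0)
A = rng.uniform(0.5, 1.5, n)        # patch areas (arbitrary units)
F = np.tile(A / A.sum(), (n, 1))    # sphere form-factor matrix (rows sum to 1)
rho = 0.7                           # diffuse reflection coefficient
E = np.ones(n)                      # uniform emitted energy

B = E.copy()
for _ in range(200):                # contraction with factor rho < 1
    B = E + rho * (F @ B)
```

Because ρ < 1 the iteration is a contraction, and the converged radiosities agree with the analytic 1/(1 − ρ), the same kind of analytic comparison the paper uses for validation.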
An Instrument to Measure Maturity of Integrated Care: A First Validation Study
2018-01-01
Introduction: Lessons captured from interviews with 12 European regions are represented in a new instrument, the B3-Maturity Model (B3-MM). B3-MM aims to assess maturity along 12 dimensions reflecting the various aspects that need to be managed in order to deliver integrated care. The objective of the study was to test the content validity of B3-MM as part of SCIROCCO (Scaling Integrated Care into Context), a European Union funded project. Methods: A literature review was conducted to compare B3-MM’s 12 dimensions and their measurement scales with existing measures and instruments that focus on assessing the development of integrated care. Subsequently, a three-round survey conducted through a Delphi study with international experts in the field of integrated care was performed to test the relevance of: 1) the dimensions, 2) the maturity indicators and 3) the assessment scale used in B3-MM. Results: The 11 articles included in the literature review confirmed all the dimensions described in the original version of B3-MM. The Delphi study rounds resulted in various phrasing amendments of indicators and assessment scale. Full agreement among the experts on the relevance of the 12 B3-MM dimensions, their indicators, and assessment scale was reached after the third Delphi round. Conclusion and discussion: The B3-MM dimensions, maturity indicators and assessment scale showed satisfactory content validity. While the B3-MM is a unique instrument based on existing knowledge and experiences of regions in integrated care, further testing is needed to explore other measurement properties of B3-MM. PMID:29588644
Building a Better Grid, in Partnership with the OMNETRIC Group and Siemens
DOE Office of Scientific and Technical Information (OSTI.GOV)
Waight, Jim; Grover, Shailendra; Wiedetz, Clark
In collaboration with Siemens and the National Renewable Energy Laboratory (NREL), OMNETRIC Group developed a distributed control hierarchy—based on an open field message bus (OpenFMB) framework—that allows control decisions to be made at the edge of the grid. The technology was validated and demonstrated at NREL’s Energy Systems Integration Facility.
An efficient and robust method for predicting helicopter rotor high-speed impulsive noise
NASA Technical Reports Server (NTRS)
Brentner, Kenneth S.
1996-01-01
A new formulation for the Ffowcs Williams-Hawkings quadrupole source, which is valid for a far-field in-plane observer, is presented. The far-field approximation is new and unique in that no further approximation of the quadrupole source strength is made and integrands with r^-2 and r^-3 dependence are retained. This paper focuses on the development of a retarded-time formulation in which time derivatives are analytically taken inside the integrals to avoid unnecessary computational work when the observer moves with the rotor. The new quadrupole formulation is similar to Farassat's thickness and loading formulation 1A. Quadrupole noise prediction is carried out in two parts: a preprocessing stage in which the previously computed flow field is integrated in the direction normal to the rotor disk, and a noise computation stage in which quadrupole surface integrals are evaluated for a particular observer position. Preliminary predictions for hover and forward flight agree well with experimental data. The method is robust and requires computer resources comparable to thickness and loading noise prediction.
NASA Astrophysics Data System (ADS)
Madsen, Lars Bojer; Jensen, Frank; Dnestryan, Andrey I.; Tolstikhin, Oleg I.
2017-07-01
In the leading-order approximation of the weak-field asymptotic theory (WFAT), the dependence of the tunneling ionization rate of a molecule in an electric field on its orientation with respect to the field is determined by the structure factor of the ionizing molecular orbital. The WFAT yields an expression for the structure factor in terms of a local property of the orbital in the asymptotic region. However, in general quantum chemistry approaches molecular orbitals are expanded in a Gaussian basis which does not reproduce their asymptotic behavior correctly. This hinders the application of the WFAT to polyatomic molecules, which are attracting increasing interest in strong-field physics. Recently, an integral-equation approach to the WFAT for tunneling ionization of one electron from an arbitrary potential has been developed. The structure factor is expressed in an integral form as a matrix element involving the ionizing orbital. The integral is not sensitive to the asymptotic behavior of the orbital, which resolves the difficulty mentioned above. Here, we extend the integral representation for the structure factor to many-electron systems treated within the Hartree-Fock method and show how it can be implemented on the basis of standard quantum chemistry software packages. We validate the methodology by considering noble-gas atoms and the CO molecule, for which accurate structure factors exist in the literature. We also present benchmark results for CO2 and for NH3 in the pyramidal and planar geometries.
Can we estimate total magnetization directions from aeromagnetic data using Helbig's integrals?
Phillips, J.D.
2005-01-01
An algorithm that implements Helbig's (1963) integrals for estimating the vector components (mx, my, mz) of the magnetic dipole moment from the first-order moments of the vector magnetic field components (ΔX, ΔY, ΔZ) is tested on real and synthetic data. After a grid of total-field aeromagnetic data is converted to vector component grids using Fourier filtering, Helbig's infinite integrals are evaluated as finite integrals in small moving windows using a quadrature algorithm based on the 2-D trapezoidal rule. Prior to integration, best-fit planar surfaces must be removed from the component data within the data windows in order to make the results independent of the coordinate system origin. Two different approaches are described for interpreting the results of the integration. In the "direct" method, results from pairs of different window sizes are compared to identify grid nodes where the angular difference between solutions is small. These solutions provide valid estimates of total magnetization directions for compact sources such as spheres or dipoles, but not for horizontally elongated or 2-D sources. In the "indirect" method, which is more forgiving of source geometry, results of the quadrature analysis are scanned for solutions that are parallel to a specified total magnetization direction.
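The two numerical ingredients named above, removal of a best-fit plane inside each window followed by 2-D trapezoidal quadrature of the first moments, can be sketched as follows. The Helbig proportionality constants relating the moments to mx, my, mz are deliberately omitted, and the grid and component field here are synthetic.

```python
import numpy as np

def trapz2d(f, dx, dy):
    """2-D trapezoidal rule on a regular grid (half weights on edges)."""
    w = np.ones_like(f, dtype=float)
    w[0, :] *= 0.5; w[-1, :] *= 0.5
    w[:, 0] *= 0.5; w[:, -1] *= 0.5
    return (w * f).sum() * dx * dy

def detrend_plane(z, x, y):
    """Remove the best-fit plane a*x + b*y + c inside a data window,
    as required before evaluating the finite moment integrals."""
    G = np.column_stack([x.ravel(), y.ravel(), np.ones(x.size)])
    coef, *_ = np.linalg.lstsq(G, z.ravel(), rcond=None)
    return z - (G @ coef).reshape(z.shape)

# synthetic window: regular grid, component field = anomaly + planar trend
x, y = np.meshgrid(np.linspace(-1, 1, 101), np.linspace(-1, 1, 101))
dz = np.exp(-(x**2 + y**2) / 0.1) + 0.5*x - 0.2*y + 3.0

dz0 = detrend_plane(dz, x, y)
mx_moment = trapz2d(x * dz0, dx=0.02, dy=0.02)  # first moment  iint x*dz dxdy
```

For this symmetric synthetic anomaly the x-moment vanishes, confirming that the plane removal eliminates the trend's spurious contribution to the moment.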
Exact Mass-Coupling Relation for the Homogeneous Sine-Gordon Model.
Bajnok, Zoltán; Balog, János; Ito, Katsushi; Satoh, Yuji; Tóth, Gábor Zsolt
2016-05-06
We derive the exact mass-coupling relation of the simplest multiscale quantum integrable model, i.e., the homogeneous sine-Gordon model with two mass scales. The relation is obtained by comparing the perturbed conformal field theory description of the model valid at short distances to the large distance bootstrap description based on the model's integrability. In particular, we find a differential equation for the relation by constructing conserved tensor currents, which satisfy a generalization of the Θ sum rule Ward identity. The mass-coupling relation is written in terms of hypergeometric functions.
A Fully Implantable, NFC Enabled, Continuous Interstitial Glucose Monitor
Anabtawi, Nijad; Freeman, Sabrina; Ferzli, Rony
2017-01-01
This work presents an integrated system-on-chip (SoC) that forms the core of a long-term, fully implantable, battery-assisted, passive continuous glucose monitor. It integrates an amperometric glucose sensor interface, a near-field communication (NFC) wireless front-end and a fully digital switched-mode power management unit for supply regulation and on-board battery charging. It uses the 13.56 MHz (ISM) band to harvest energy and backscatter data to an NFC reader. The system was implemented in 14 nm CMOS technology and validated with post-layout simulations. PMID:28702512
Richter, Jack; McFarland, Lela; Bredfeldt, Christine
2012-01-01
Background/Aims Integrating data across systems can be a daunting process. The traditional method of moving data to a common location, mapping fields with different formats and meanings, and performing data cleaning activities to ensure valid and reliable integration across systems can be both expensive and extremely time consuming. As the scope of needed research data increases, the traditional methodology may not be sustainable. Data Virtualization provides an alternative to traditional methods that may reduce the effort required to integrate data across disparate systems. Objective Our goal was to survey new methods in data integration, cloud computing, enterprise data management and virtual data management for opportunities to increase the efficiency of producing VDW and similar data sets. Methods Kaiser Permanente Information Technology (KPIT), in collaboration with the Mid-Atlantic Permanente Research Institute (MAPRI) reviewed methodologies in the burgeoning field of Data Virtualization. We identified potential strengths and weaknesses of new approaches to data integration. For each method, we evaluated its potential application for producing effective research data sets. Results Data Virtualization provides opportunities to reduce the amount of data movement required to integrate data sources on different platforms in order to produce research data sets. Additionally, Data Virtualization also includes methods for managing “fuzzy” matching used to match fields known to have poor reliability such as names, addresses and social security numbers. These methods could improve the efficiency of integrating state and federal data such as patient race, death, and tumors with internal electronic health record data. Discussion The emerging field of Data Virtualization has considerable potential for increasing the efficiency of producing research data sets. 
An important next step will be to develop a proof-of-concept project that will help us understand the benefits and drawbacks of these techniques.
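As a toy illustration of the kind of "fuzzy" field matching mentioned above, using only Python's standard library; real patient-matching systems rely on far more robust, validated methods, and the 0.85 threshold is an arbitrary choice.

```python
from difflib import SequenceMatcher

def fuzzy_match(a, b, threshold=0.85):
    """Declare two field values a match when their normalized similarity
    ratio exceeds a threshold; 0.85 is an arbitrary illustrative choice."""
    ratio = SequenceMatcher(None, a.lower(), b.lower()).ratio()
    return ratio >= threshold

# a one-character spelling variant matches; an unrelated name does not
same = fuzzy_match("Jonathan Smith", "Jonathon Smith")
diff = fuzzy_match("Jonathan Smith", "Mary Jones")
```

In practice such string similarity would be only one signal, combined with dates of birth, addresses, and probabilistic record-linkage weights.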
The first experiments in SST-1
NASA Astrophysics Data System (ADS)
Pradhan, S.; Khan, Z.; Tanna, V. L.; Sharma, A. N.; Doshi, K. J.; Prasad, U.; Masand, H.; Kumar, Aveg; Patel, K. B.; Bhandarkar, M. K.; Dhongde, J. R.; Shukla, B. K.; Mansuri, I. A.; Varadarajulu, A.; Khristi, Y. S.; Biswas, P.; Gupta, C. N.; Sharma, D. K.; Raval, D. C.; Srinivasan, R.; Pandya, S. P.; Atrey, P. K.; Sharma, P. K.; Patel, P. J.; Patel, H. S.; Santra, P.; Parekh, T. J.; Dhanani, K. R.; Paravastu, Y.; Pathan, F. S.; Chauhan, P. K.; Khan, M. S.; Tank, J. K.; Panchal, P. N.; Panchal, R. N.; Patel, R. J.; George, S.; Semwal, P.; Gupta, P.; Mahesuriya, G. I.; Sonara, D. P.; Jayswal, S. P.; Sharma, M.; Patel, J. C.; Varmora, P. P.; Patel, D. J.; Srikanth, G. L. N.; Christian, D. R.; Garg, A.; Bairagi, N.; Babu, G. R.; Panchal, A. G.; Vora, M. M.; Singh, A. K.; Sharma, R.; Raju, D.; Kulkarni, S. V.; Kumar, M.; Manchanda, R.; Joisa, S.; Tahiliani, K.; Pathak, S. K.; Patel, K. M.; Nimavat, H. D.; Shah, P. R.; Chudasma, H. H.; Raval, T. Y.; Sharma, A. L.; Ojha, A.; Parghi, B. R.; Banaudha, M.; Makwana, A. R.; Chowdhuri, M. B.; Ramaiya, N.; kumar, A.; Raval, J. V.; Gupta, S.; Purohit, S.; Kaur, R.; Adhiya, A. N.; Jha, R.; Kumar, S.; Nagora, U. C.; Siju, V.; Thomas, J.; Chaudhari, V. R.; Patel, K. G.; Ambulkar, K. K.; Dalakoti, S.; Virani, C. G.; Parmar, P. R.; Thakur, A. L.; Das, A.; Bora, D.; the SST-1 Team
2015-10-01
A steady state superconducting tokamak (SST-1) has been commissioned after the successful experimental and engineering validations of its critical sub-systems. During the ‘engineering validation phase’ of SST-1, the cryostat was demonstrated to be leak-tight in all operational scenarios, 80 K thermal shields were demonstrated to be uniformly cooled without regions of ‘thermal runaway and hot spots’, and the superconducting toroidal field magnets were demonstrated to be cooled to their nominal operational conditions and charged up to 1.5 T of field at the major radius. The engineering validations further demonstrated the assembled SST-1 machine shell to be a graded, stress-strain optimized and distributed thermo-mechanical device, apart from the integrated vacuum vessel being validated to be UHV compatible. Subsequently, ‘field error components’ in SST-1 were measured to be acceptable for plasma discharges. A successful breakdown was obtained in SST-1 in June 2013, assisted with electron cyclotron pre-ionization in the second harmonic mode, thus marking the ‘first plasma’ in SST-1 and the arrival of SST-1 into the league of contemporary steady state devices. Subsequent to the first plasma, repeatable plasma start-ups with E ≈ 0.4 V m^-1 and plasma current in excess of 70 kA for 400 ms, assisted with electron cyclotron heating pre-ionization at a field of 1.5 T, have so far been achieved in SST-1. Lengthening the plasma pulse duration with lower hybrid current drive, confinement and transport in SST-1 plasmas, and magnetohydrodynamic activities typical of large aspect ratio SST-1 discharges are presently being investigated in SST-1.
In parallel, SST-1 has uniquely demonstrated reliable cryo-stable high field operation of superconducting TF magnets in the two-phase cooling mode, operation of vapour-cooled current leads with cold gas instead of liquid helium, and dc joint resistances in superconducting magnet winding packs an order of magnitude lower at high transport currents. SST-1 is also being continually upgraded with first wall integration, superconducting central solenoid installation and over-loaded MgB2-brass based current leads. Phase-1 of the SST-1 upgrade is scheduled for the first half of 2015, after which long pulse plasma experiments in both circular and elongated configurations are planned.
Forward Bay Cover Separation Modeling and Testing for the Orion Multi-Purpose Crew Vehicle
NASA Technical Reports Server (NTRS)
Ali, Yasmin; Chuhta, Jesse D.; Hughes, Michael P.; Radke, Tara S.
2015-01-01
Spacecraft multi-body separation events during atmospheric descent require complex testing and analysis to validate the flight separation dynamics models used to verify no re-contact. The NASA Orion Multi-Purpose Crew Vehicle (MPCV) architecture includes a highly-integrated Forward Bay Cover (FBC) jettison assembly design that combines parachutes and piston thrusters to separate the FBC from the Crew Module (CM) and avoid re-contact. A multi-disciplinary team across numerous organizations examined key model parameters and risk areas to develop a robust but affordable test campaign in order to validate and verify the FBC separation event for Exploration Flight Test-1 (EFT-1). The FBC jettison simulation model is highly complex, consisting of dozens of parameters varied simultaneously, with numerous multi-parameter interactions (coupling and feedback) among the various model elements, and encompassing distinct near-field, mid-field, and far-field regimes. The test campaign was composed of component-level testing (for example, gas-piston thrusters and parachute mortars), ground FBC jettison tests, and FBC jettison air-drop tests. Three ground jettison tests isolated the testing of mechanisms and structures to anchor the simulation models excluding aerodynamic effects. Subsequently, two air-drop tests added aerodynamic and parachute elements, and served as integrated system demonstrations, which had been preliminarily explored during the Orion Pad Abort-1 (PA-1) flight test in May 2010. Both ground and drop tests provided extensive data to validate analytical models and to verify the FBC jettison event for EFT-1. Additional testing will be required to support human certification of this separation event, for which NASA and Lockheed Martin are applying knowledge from Apollo and EFT-1 testing and modeling to develop a robust human-rated FBC separation event.
Delaney, Aogán; Tamás, Peter A; Crane, Todd A; Chesterman, Sabrina
2016-01-01
There is increasing interest in using systematic review to synthesize evidence on the social and environmental effects of and adaptations to climate change. Use of systematic review for evidence in this field is complicated by the heterogeneity of methods used and by uneven reporting. In order to facilitate synthesis of results and design of subsequent research, a method, construct-centered methods aggregation, was designed to 1) provide a transparent, valid and reliable description of research methods, 2) support comparability of primary studies and 3) contribute to a shared empirical basis for improving research practice. Rather than taking research reports at face value, research designs are reviewed through inductive analysis. This involves bottom-up identification of constructs, definitions and operationalizations; assessment of concepts' commensurability through comparison of definitions; identification of theoretical frameworks through patterns of construct use; and integration of transparently reported and valid operationalizations into ideal-type research frameworks. Through the integration of reliable bottom-up inductive coding from operationalizations and top-down coding driven from stated theory with expert interpretation, construct-centered methods aggregation enabled both resolution of heterogeneity within identically named constructs and merging of differently labeled but identical constructs. These two processes allowed transparent, rigorous and contextually sensitive synthesis of the research presented in an uneven set of reports undertaken in a heterogeneous field. If adopted more broadly, construct-centered methods aggregation may contribute to the emergence of a valid, empirically-grounded description of methods used in primary research. 
These descriptions may function as a set of expectations that improves the transparency of reporting and as an evolving comprehensive framework that supports both interpretation of existing and design of future research.
NASA Astrophysics Data System (ADS)
Yi, Dake; Wang, TzuChiang
2018-06-01
In this paper, a new procedure is proposed to investigate three-dimensional fracture problems of a thin elastic plate with a long through-the-thickness crack under remote uniform tensile loading. The new procedure combines a new analytical method with highly accurate finite element simulations. In the theoretical analysis, three-dimensional Maxwell stress functions are employed in order to derive the three-dimensional crack tip fields. Based on the theoretical analysis, an equation describing the relationship among the three-dimensional J-integral J(z), the stress intensity factor K(z) and the tri-axial stress constraint level Tz(z) is derived first. In the finite element simulations, a fine mesh of 153,360 elements is constructed to compute the stress field near the crack front, J(z) and Tz(z). Numerical results show that in the plane very close to the free surface, the K field solution is still valid for the in-plane stresses. Comparison with the numerical results shows that the analytical results are valid.
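For context, any relation among J(z), K(z) and Tz(z) of this kind must recover the classical two-dimensional J-K relations in the limiting cases (plane stress at a free surface, plane strain deep inside a thick plate):

```latex
J = \frac{K^2}{E} \quad \text{(plane stress, } T_z = 0\text{)}, \qquad
J = \frac{(1-\nu^2)\,K^2}{E} \quad \text{(plane strain, } T_z = \nu\text{)}
```

where E is Young's modulus and ν is Poisson's ratio; the paper's three-dimensional equation interpolates between these limits through the through-thickness constraint Tz(z).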
Label-free hyperspectral dark-field microscopy for quantitative scatter imaging
NASA Astrophysics Data System (ADS)
Cheney, Philip; McClatchy, David; Kanick, Stephen; Lemaillet, Paul; Allen, David; Samarov, Daniel; Pogue, Brian; Hwang, Jeeseong
2017-03-01
A hyperspectral dark-field microscope has been developed for imaging spatially distributed diffuse reflectance spectra from light-scattering samples. In this report, quantitative scatter spectroscopy is demonstrated with a uniform scattering phantom, namely a solution of polystyrene microspheres. A Monte Carlo-based inverse model was used to calculate the reduced scattering coefficients of samples of different microsphere concentrations from the wavelength-dependent backscattered signals measured by the dark-field microscope. The results are compared with measurements from a NIST double-integrating sphere system for validation. Ongoing efforts involve quantitative mapping of scattering and absorption coefficients in samples with spatially heterogeneous optical properties.
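As a rough illustration of the quantity being inverted for, the reduced scattering coefficient of a dilute microsphere suspension can be estimated from the sphere geometry; the scattering efficiency and anisotropy below are assumed placeholder values (in practice they come from Mie theory for the given size and wavelength), not the paper's phantom parameters:

```python
import math

# Illustrative estimate of the reduced scattering coefficient mu_s'
# of a polystyrene microsphere suspension.
diameter_um = 1.0        # assumed sphere diameter
volume_fraction = 0.01   # assumed solids volume fraction
Qsca = 2.0               # assumed Mie scattering efficiency
g = 0.9                  # assumed anisotropy factor

r_cm = diameter_um * 1e-4 / 2                # radius in cm
sphere_volume = (4 / 3) * math.pi * r_cm**3  # cm^3 per sphere
N = volume_fraction / sphere_volume          # spheres per cm^3
sigma_s = Qsca * math.pi * r_cm**2           # scattering cross-section, cm^2
mu_s = N * sigma_s                           # scattering coefficient, 1/cm
mu_s_prime = mu_s * (1 - g)                  # reduced scattering coeff, 1/cm
```

For these values mu_s' works out to about 30 cm^-1, a typical tissue-mimicking level.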
Rotorcraft Brownout: Advanced Understanding, Control and Mitigation
2008-12-31
the Gauss-Seidel iterative method. The overall steps of the SIMPLER algorithm can be summarized as: 1. Guess velocity field, 2. Calculate the momentum...techniques and numerical methods, and the team will begin to develop a methodology that is capable of integrating these solutions and highlighting...rotorcraft design optimization techniques will then be undertaken using the validated computational methods.
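The Gauss-Seidel method mentioned above can be sketched in a few lines; this is a generic illustration of the iteration on a small diagonally dominant system, not the report's actual SIMPLER pressure solver:

```python
def gauss_seidel(A, b, x0=None, tol=1e-10, max_iter=500):
    """Solve A x = b by Gauss-Seidel sweeps; convergence is guaranteed
    when A is diagonally dominant (or symmetric positive definite)."""
    n = len(b)
    x = list(x0) if x0 is not None else [0.0] * n
    for _ in range(max_iter):
        max_diff = 0.0
        for i in range(n):
            # Use already-updated components of x within the same sweep.
            s = sum(A[i][j] * x[j] for j in range(n) if j != i)
            new = (b[i] - s) / A[i][i]
            max_diff = max(max_diff, abs(new - x[i]))
            x[i] = new
        if max_diff < tol:
            break
    return x

A = [[4.0, -1.0, 0.0],
     [-1.0, 4.0, -1.0],
     [0.0, -1.0, 4.0]]
b = [2.0, 4.0, 10.0]
x = gauss_seidel(A, b)   # exact solution is [1, 2, 3]
```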
Integrated tokamak modeling: when physics informs engineering and research planning
NASA Astrophysics Data System (ADS)
Poli, Francesca
2017-10-01
Simulations that integrate virtually all the relevant engineering and physics aspects of a real tokamak experiment are a powerful tool for experimental interpretation, model validation and planning for both present and future devices. This tutorial guides the reader through the building blocks of an ``integrated'' tokamak simulation, such as magnetic flux diffusion; thermal, momentum and particle transport; external heating and current drive sources; and wall particle sources and sinks. Emphasis is given to the connection and interplay between external actuators and plasma response, between the slow time scales of current diffusion and the fast time scales of transport, and to how reduced and high-fidelity models can contribute to simulating a whole device. To illustrate the potential and limitations of integrated tokamak modeling for discharge prediction, a helium plasma scenario for the ITER pre-nuclear phase is taken as an example. This scenario presents challenges because it requires core-edge integration and advanced models for the interaction between waves and fast ions, which are subject to a limited experimental database for validation and guidance. Starting from a scenario obtained by re-scaling parameters from the demonstration inductive ``ITER baseline'', it is shown how self-consistent simulations that encompass both core and edge plasma regions, as well as high-fidelity heating and current drive source models, are needed to set constraints on the density, magnetic field and heating scheme. This tutorial aims to demonstrate how integrated modeling, when used with an adequate level of criticism, can not only support the design of operational scenarios, but also help to assess the limitations and gaps in the available models, thus indicating where improved modeling tools are required and how present experiments can help their validation and inform research planning. Work supported by DOE under DE-AC02-09CH1146.
NASA Astrophysics Data System (ADS)
Zhang, Sheng; Rao, Jia-Yu; Tai, Wen-Si; Wang, Ting; Liu, Fa-Lin
2016-09-01
In this paper, a quasi eighth substrate integrated waveguide resonator (QESIWR) with a defected fractal structure (DFS) is proposed for the first time. Compared with the eighth substrate integrated waveguide resonator (ESIWR), this resonator has a lower resonant frequency (f0), an acceptable unloaded quality factor (Qu) and an almost unchanged electric field distribution. In order to validate the properties of the QESIWR, a cascaded quadruplet QESIWR filter is designed and optimized. By using cross coupling and gap coupling compensation, this filter has two transmission zeros (TZs) on each side of the passband. Meanwhile, in comparison with conventional designs, its size is reduced by over 90%. The measured results agree well with the simulated ones.
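For orientation, the resonant frequency of the dominant TE101 mode of a full rectangular SIW cavity follows the standard dielectric-filled cavity formula; the effective dimensions and permittivity below are assumed example values, not those of the fabricated filter:

```python
import math

# TE101 resonant frequency of a rectangular dielectric-filled cavity:
# f101 = c / (2 * sqrt(eps_r)) * sqrt((1/W_eff)^2 + (1/L_eff)^2)
# W_eff and L_eff are the via-corrected effective width and length.
c = 299792458.0     # speed of light, m/s
eps_r = 2.2         # assumed substrate relative permittivity
W_eff = 0.012       # assumed effective cavity width, m
L_eff = 0.012       # assumed effective cavity length, m

f101 = c / (2 * math.sqrt(eps_r)) * math.sqrt((1 / W_eff)**2 + (1 / L_eff)**2)
# For these dimensions f101 is roughly 12 GHz.
```

Folded variants such as eighth-mode and quasi-eighth-mode cavities shrink the footprint while shifting f0 downward, which is the size reduction the paper exploits.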
Plasma RNA integrity analysis: methodology and validation.
Wong, Blenda C K; Lo, Y M Dennis
2006-09-01
The detection of cell-free RNA in plasma and serum of human subjects has found increasing applications in the field of medical diagnostics. However, many questions regarding the biology of circulating RNA remain to be addressed. One issue concerns the molecular nature of these circulating RNA species. We have recently developed a simple and quantitative method to investigate the integrity of plasma RNA. Our results have suggested that cell-free RNA in plasma is generally present as fragmented molecules instead of intact transcripts, with a predominance of 5' fragments. In this article, we summarize the basic principles in the experimental design for plasma RNA integrity analysis and highlight some of the important technical considerations for this type of investigation.
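The measurement rests on comparing amplicon quantities targeting different regions of the same transcript; a minimal sketch of that ratio idea, with hypothetical copy numbers (the specific index used in the authors' assay may differ):

```python
def integrity_ratio(copies_3p, copies_5p):
    """Ratio of 3' to 5' amplicon copies for one transcript.
    A ratio near 1 suggests intact transcripts; a ratio much below 1
    suggests fragmentation with a predominance of 5' fragments."""
    return copies_3p / copies_5p

# Hypothetical RT-qPCR copy numbers, not data from the article:
plasma = integrity_ratio(copies_3p=120.0, copies_5p=1000.0)    # fragmented
cellular = integrity_ratio(copies_3p=950.0, copies_5p=1000.0)  # mostly intact
```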
Cuevas, Soledad
Agriculture is a major contributor to greenhouse gas emissions, an important part of which is associated to deforestation and indirect land use change. Appropriate and coherent food policies can play an important role in aligning health, economic and environmental goals. From the point of view of policy analysis, however, this requires multi-sectoral, interdisciplinary approaches which can be highly complex. Important methodological advances in the area are not exempted from limitations and criticism. We argue that there is scope for further developments in integrated quantitative and qualitative policy analysis combining existing methods, including mathematical modelling and stakeholder analysis. We outline methodological trends in the field, briefly characterise integrated mixed methods policy analysis and identify contributions, challenges and opportunities for future research. In particular, this type of approach can help address issues of uncertainty and context-specific validity, incorporate multiple perspectives and help advance meaningful interdisciplinary collaboration in the field. Substantial challenges remain, however, such as the integration of key issues related to non-communicable disease, or the incorporation of a broader range of qualitative approaches that can address important cultural and ethical dimensions of food.
A Comprehensive Validation Approach Using The RAVEN Code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alfonsi, Andrea; Rabiti, Cristian; Cogliati, Joshua J
2015-06-01
The RAVEN computer code, developed at the Idaho National Laboratory, is a generic software framework to perform parametric and probabilistic analysis based on the response of complex system codes. RAVEN is a multi-purpose probabilistic and uncertainty quantification platform, capable of communicating with any system code. A natural extension of the RAVEN capabilities is the implementation of an integrated validation methodology, involving several different metrics, that represents an evolution of the methods currently used in the field. State-of-the-art validation approaches use neither exploration of the input space through sampling strategies, nor a comprehensive variety of the metrics needed to interpret the code responses with respect to experimental data. The RAVEN code addresses both of these shortcomings. In the following sections, the employed methodology, and its application to the newly developed thermal-hydraulic code RELAP-7, is reported. The validation approach has been applied to an integral effect experiment representing natural circulation, based on the activities performed by EG&G Idaho. Four different experiment configurations have been considered and nodalized.
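A minimal sketch of the sampling-plus-metrics idea described above: run the system code at sampled inputs, then compare the ensemble of responses against an experimental value. The function and data are purely illustrative and are not RAVEN's API:

```python
import statistics

def validate(code_responses, experiment, tolerance):
    """Crude validation summary for an ensemble of sampled code
    responses against one experimental measurement: report the bias
    and whether the measurement falls within the ensemble spread
    plus the experimental tolerance."""
    mean = statistics.fmean(code_responses)
    std = statistics.stdev(code_responses)
    bias = mean - experiment
    covered = abs(bias) <= tolerance + std
    return {"bias": bias, "std": std, "covered": covered}

# Hypothetical sampled responses and measurement:
result = validate([2.9, 3.1, 3.0, 3.2, 2.8], experiment=3.05, tolerance=0.1)
```

Real frameworks add input-space exploration strategies (Monte Carlo, Latin hypercube, adaptive sampling) and a richer set of metrics, e.g. distributional distances rather than a single bias.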
Robust multiscale field-only formulation of electromagnetic scattering
NASA Astrophysics Data System (ADS)
Sun, Qiang; Klaseboer, Evert; Chan, Derek Y. C.
2017-01-01
We present a boundary integral formulation of electromagnetic scattering by homogeneous bodies that are characterized by linear constitutive equations in the frequency domain. By working with the Cartesian components of the electric E and magnetic H fields and with the scalar functions (r · E) and (r · H), where r is a position vector, the problem can be cast as having to solve a set of scalar Helmholtz equations for the field components that are coupled by the usual electromagnetic boundary conditions at material boundaries. This facilitates a direct solution for the surface values of E and H rather than having to work with surface currents or surface charge densities as intermediate quantities in existing methods. Consequently, our formulation is free of the well-known numerical instability that occurs in the zero-frequency or long-wavelength limit in traditional surface integral solutions of Maxwell's equations and our numerical results converge uniformly to the static results in the long-wavelength limit. Furthermore, we use a formulation of the scalar Helmholtz equation that is expressed as classically convergent integrals and does not require the evaluation of principal value integrals or any knowledge of the solid angle. Therefore, standard quadrature and higher order surface elements can readily be used to improve numerical precision for the same number of degrees of freedom. In addition, near and far field values can be calculated with equal precision, and multiscale problems in which the scatterers possess characteristic length scales that are both large and small relative to the wavelength can be easily accommodated. From this we obtain results for the scattering and transmission of electromagnetic waves at dielectric boundaries that are valid for any ratio of the local surface curvature to the wave number. 
This is a generalization of the familiar Fresnel formula and Snell's law, valid at planar dielectric boundaries, for the scattering and transmission of electromagnetic waves at surfaces of arbitrary curvature. Implementation details are illustrated with scattering by multiple perfect electric conductors as well as dielectric bodies with complex geometries and composition.
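The planar limit that the formulation generalizes is given by the standard Fresnel amplitude coefficients, which can be evaluated directly (the refractive indices and angle below are arbitrary example values):

```python
import cmath
import math

def fresnel(n1, n2, theta_i):
    """Fresnel reflection amplitude coefficients (rs, rp) for a plane
    wave incident from medium n1 onto a planar boundary with medium n2
    at angle theta_i (radians). Standard textbook formulas."""
    sin_t = n1 / n2 * math.sin(theta_i)   # Snell's law
    cos_i = math.cos(theta_i)
    cos_t = cmath.sqrt(1 - sin_t**2)      # complex beyond total internal reflection
    rs = (n1 * cos_i - n2 * cos_t) / (n1 * cos_i + n2 * cos_t)
    rp = (n2 * cos_i - n1 * cos_t) / (n2 * cos_i + n1 * cos_t)
    return rs, rp

rs, rp = fresnel(1.0, 1.5, math.radians(0.0))
# At normal incidence |rs| = |rp| = (n2 - n1) / (n2 + n1) = 0.2
```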
The development of the ICME supply-chain: Route to ICME implementation and sustainment
NASA Astrophysics Data System (ADS)
Furrer, David; Schirra, John
2011-04-01
Over the past twenty years, integrated computational materials engineering (ICME) has emerged as a key engineering field with great promise. Models simulating materials-related phenomena have been developed and are being validated for industrial application. The integration of computational methods into material, process and component design has been a challenge, however, in part due to the complexities in the development of an ICME "supply-chain" that supports, sustains and delivers this emerging technology. ICME touches many disciplines, which results in a requirement for many types of computational-based technology organizations to be involved to provide tools that can be rapidly developed, validated, deployed and maintained for industrial applications. The need for, and the current state of an ICME supply-chain along with development and future requirements for the continued pace of introduction of ICME into industrial design practices will be reviewed within this article.
Valentijn, Pim P; Bruijnzeels, Marc A; de Leeuw, Rob J; Schrijvers, Guus J.P
2012-01-01
Purpose Capacity problems and political pressures have led to a rapid change in the organization of primary care, from monodisciplinary small businesses to complex inter-organizational relationships. It is assumed that inter-organizational collaboration is the driving force to achieve integrated (primary) care. Despite the importance of collaboration and integration of services in primary care, there is no unambiguous definition of either concept. The purpose of this study is to examine and link the conceptualisation and validation of the terms inter-organizational collaboration and integrated primary care using a theoretical framework. Theory The theoretical framework is based on the complex collaboration process of negotiation among multiple stakeholder groups in primary care. Methods A literature review of health sciences and business databases, and targeted grey literature sources. Based on the literature review we operationalized the constructs of inter-organizational collaboration and integrated primary care in a theoretical framework. The framework is being validated in an explorative study of 80 primary care projects in the Netherlands. Results and conclusions Integrated primary care is considered a multidimensional construct based on a continuum of integration, extending from segregation to integration. The synthesis of current theories and concepts of inter-organizational collaboration is insufficient to deal with the complexity of collaborative issues in primary care. One coherent and integrated theoretical framework was found that could make the complex collaboration process in primary care transparent. The theoretical framework presented in this study is a first step towards understanding the patterns of successful collaboration and integration in primary care services. These patterns can give insights into the organizational forms needed to create a well-functioning integrated (primary) care system that fits the local needs of a population. 
Preliminary data of the patterns of collaboration and integration will be presented.
The Reduction of Ducted Fan Engine Noise Via A Boundary Integral Equation Method
NASA Technical Reports Server (NTRS)
Tweed, J.; Dunn, M.
1997-01-01
The development of a Boundary Integral Equation Method (BIEM) for the prediction of ducted fan engine noise is discussed. The method is motivated by the need for an efficient and versatile computational tool to assist in parametric noise reduction studies. In this research, the work in reference 1 was extended to include passive noise control treatment on the duct interior. The BIEM considers the scattering of incident sound generated by spinning point thrust dipoles in a uniform flow field by a thin cylindrical duct. The acoustic field is written as a superposition of spinning modes. Modal coefficients of acoustic pressure are calculated term by term. The BIEM theoretical framework is based on Helmholtz potential theory. A boundary value problem is converted to a boundary integral equation formulation with unknown single and double layer densities on the duct wall. After solving for the unknown densities, the acoustic field is easily calculated. The main feature of the BIEM is the ability to compute any portion of the sound field without the need to compute the entire field. Other noise prediction methods, such as CFD and finite element methods, lack this property. Additional BIEM attributes include versatility, ease of use, rapid noise predictions, coupling of propagation and radiation both forward and aft, implementability on midrange personal computers, and validity over a wide range of frequencies.
Mapping global cropland and field size.
Fritz, Steffen; See, Linda; McCallum, Ian; You, Liangzhi; Bun, Andriy; Moltchanova, Elena; Duerauer, Martina; Albrecht, Fransizka; Schill, Christian; Perger, Christoph; Havlik, Petr; Mosnier, Aline; Thornton, Philip; Wood-Sichra, Ulrike; Herrero, Mario; Becker-Reshef, Inbal; Justice, Chris; Hansen, Matthew; Gong, Peng; Abdel Aziz, Sheta; Cipriani, Anna; Cumani, Renato; Cecchi, Giuliano; Conchedda, Giulia; Ferreira, Stefanus; Gomez, Adriana; Haffani, Myriam; Kayitakire, Francois; Malanding, Jaiteh; Mueller, Rick; Newby, Terence; Nonguierma, Andre; Olusegun, Adeaga; Ortner, Simone; Rajak, D Ram; Rocha, Jansle; Schepaschenko, Dmitry; Schepaschenko, Maria; Terekhov, Alexey; Tiangwa, Alex; Vancutsem, Christelle; Vintrou, Elodie; Wenbin, Wu; van der Velde, Marijn; Dunwoody, Antonia; Kraxner, Florian; Obersteiner, Michael
2015-05-01
A new 1 km global IIASA-IFPRI cropland percentage map for the baseline year 2005 has been developed which integrates a number of individual cropland maps at global to regional to national scales. The individual map products include existing global land cover maps such as GlobCover 2005 and MODIS v.5, regional maps such as AFRICOVER and national maps from mapping agencies and other organizations. The different products are ranked at the national level using crowdsourced data from Geo-Wiki to create a map that reflects the likelihood of cropland. Calibration with national and subnational crop statistics was then undertaken to distribute the cropland within each country and subnational unit. The new IIASA-IFPRI cropland product has been validated using very high-resolution satellite imagery via Geo-Wiki and has an overall accuracy of 82.4%. It has also been compared with the EarthStat cropland product and shows a lower root mean square error on an independent data set collected from Geo-Wiki. The first ever global field size map was produced at the same resolution as the IIASA-IFPRI cropland map based on interpolation of field size data collected via a Geo-Wiki crowdsourcing campaign. A validation exercise of the global field size map revealed satisfactory agreement with control data, particularly given the relatively modest size of the field size data set used to create the map. Both are critical inputs to global agricultural monitoring in the frame of GEOGLAM and will serve the global land modelling and integrated assessment community, in particular for improving land use models that require baseline cropland information. These products are freely available for downloading from the http://cropland.geo-wiki.org website. © 2015 John Wiley & Sons Ltd.
Effects of Lambertian sources design on uniformity and measurements
NASA Astrophysics Data System (ADS)
Cariou, Nadine; Durell, Chris; McKee, Greg; Wilks, Dylan; Glastre, Wilfried
2014-10-01
Integrating sphere (IS) based uniform sources are a primary tool for ground-based calibration, characterization and testing of flight radiometric equipment. The idea of a Lambertian field of energy is a very useful tool in radiometric testing, but this concept is being challenged by newly lowered uncertainty goals. At an uncertainty goal of 2%, one needs to carefully assess uniformity in addition to calibration uncertainties, as even sources with 0.5% uniformity now constitute a substantial proportion of uncertainty budgets. This paper explores integrating sphere design options for achieving 99.5% and better uniformity of exit port radiance and spectral irradiance created by an integrating sphere. Uniformity in the broad spectrum and in spectral bands is explored. We discuss mapping techniques and results as a function of observed uniformity, as well as laboratory testing results customized to match the customer's instrumentation field of view. We also discuss recommendations based on the basic commercial instrumentation we have used to validate, inspect, and improve correlation of uniformity measurements with the intended application.
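One common way to quote port uniformity from a mapped radiance image is the peak-to-peak contrast measure 1 - (max - min)/(max + min); the exact definition varies between laboratories, and the sample map below is illustrative:

```python
def uniformity(levels):
    """Peak-to-peak uniformity of a set of mapped radiance values:
    1.0 means perfectly uniform; 0.995 corresponds to the 99.5%
    level discussed above. One of several definitions in use."""
    hi, lo = max(levels), min(levels)
    return 1.0 - (hi - lo) / (hi + lo)

# Hypothetical normalized radiance samples across an exit port:
radiance_map = [0.998, 1.000, 0.999, 1.001, 0.997]
u = uniformity(radiance_map)   # about 0.998, i.e. 99.8%
```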
NASA Technical Reports Server (NTRS)
Starr, David
2000-01-01
The EOS Terra mission will be launched in July 1999. This mission has great relevance to the atmospheric radiation community and global change issues. Terra instruments include the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER), Clouds and Earth's Radiant Energy System (CERES), Multi-Angle Imaging Spectroradiometer (MISR), Moderate Resolution Imaging Spectroradiometer (MODIS) and Measurements of Pollution in the Troposphere (MOPITT). In addition to the fundamental radiance data sets, numerous global science data products will be generated, including various Earth radiation budget, cloud and aerosol parameters, as well as land surface, terrestrial ecology, ocean color, and atmospheric chemistry parameters. Significant investments have been made in on-board calibration to ensure the quality of the radiance observations. A key component of the Terra mission is the validation of the science data products. This is essential for a mission focused on global change issues and the underlying processes. The Terra algorithms have been subject to extensive pre-launch testing with field data whenever possible. Intensive efforts will be made to validate the Terra data products after launch. These include validation of instrument calibration (vicarious calibration) experiments, instrument and cross-platform comparisons, routine collection of high quality correlative data from ground-based networks, such as AERONET, and intensive sites, such as the SGP ARM site, as well as a variety of field experiments, cruises, etc. Airborne simulator instruments have been developed for the field experiment and underflight activities, including the MODIS Airborne Simulator (MAS), AirMISR, MASTER (MODIS-ASTER), and MOPITT-A. All are integrated on the NASA ER-2, although low altitude platforms are more typically used for MASTER. MATR is an additional sensor used for MOPITT algorithm development and validation. 
The intensive validation activities planned for the first year of the Terra mission will be described with emphasis on derived geophysical parameters of most relevance to the atmospheric radiation community.
NASA Astrophysics Data System (ADS)
Hirt, Christian; Reußner, Elisabeth; Rexer, Moritz; Kuhn, Michael
2016-09-01
Over the past years, spectral techniques have become a standard to model Earth's global gravity field to 10 km scales, with the EGM2008 geopotential model being a prominent example. For some geophysical applications of EGM2008, particularly Bouguer gravity computation with spectral techniques, a topographic potential model of adequate resolution is required. However, current topographic potential models have not yet been successfully validated to degree 2160, and notable discrepancies between spectral modeling and Newtonian (numerical) integration well beyond the 10 mGal level have been reported. Here we accurately compute and validate gravity implied by a degree 2160 model of Earth's topographic masses. Our experiments are based on two key strategies, both of which require advanced computational resources. First, we construct a spectrally complete model of the gravity field which is generated by the degree 2160 Earth topography model. This involves expansion of the topographic potential to the 15th integer power of the topography and modeling of short-scale gravity signals to ultrahigh degree of 21,600, translating into unprecedented fine scales of 1 km. Second, we apply Newtonian integration in the space domain with high spatial resolution to reduce discretization errors. Our numerical study demonstrates excellent agreement (8 μGal RMS) between gravity from both forward modeling techniques and provides insight into the convergence process associated with spectral modeling of gravity signals at very short scales (few km). As key conclusion, our work successfully validates the spectral domain forward modeling technique for degree 2160 topography and increases the confidence in new high-resolution global Bouguer gravity maps.
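A minimal sketch of the space-domain Newtonian integration being validated: sum the vertical attraction of topographic mass columns, here crudely approximated as point masses on a toy grid (grid, density and geometry are illustrative, not the paper's degree-2160 setup):

```python
import math

G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
rho = 2670.0    # conventional topographic density, kg/m^3
cell = 1000.0   # grid spacing, m

# Toy flat topography: 10 x 10 grid of 100 m high columns.
heights = [[100.0] * 10 for _ in range(10)]

# Computation point P above the centre of the grid.
px, py, pz = 5000.0, 5000.0, 3000.0

gz = 0.0
for i, row in enumerate(heights):
    for j, h in enumerate(row):
        m = rho * cell * cell * h                 # mass of one column
        x = i * cell + cell / 2                   # column centre
        y = j * cell + cell / 2
        z = h / 2
        dx, dy, dz = px - x, py - y, pz - z
        r = math.sqrt(dx * dx + dy * dy + dz * dz)
        gz += G * m * dz / r**3                   # vertical attraction at P
```

Production codes replace the point-mass kernel with exact prism or tesseroid formulas near the computation point, which is where the discretization errors mentioned above arise.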
A New Method for Analyzing Near-Field Faraday Probe Data in Hall Thrusters
NASA Technical Reports Server (NTRS)
Huang, Wensheng; Shastry, Rohit; Herman, Daniel A.; Soulas, George C.; Kamhawi, Hani
2013-01-01
This paper presents a new method for analyzing near-field Faraday probe data obtained from Hall thrusters. Traditional methods spawned from far-field Faraday probe analysis rely on assumptions that are not applicable to near-field Faraday probe data. In particular, arbitrary choices for the point of origin and limits of integration have made interpretation of the results difficult. The new method, called iterative pathfinding, uses the evolution of the near-field plume with distance to provide feedback for determining the location of the point of origin. Although still susceptible to the choice of integration limits, this method presents a systematic approach to determining the origin point for calculating the divergence angle. The iterative pathfinding method is applied to near-field Faraday probe data taken in a previous study from the NASA-300M and NASA-457Mv2 Hall thrusters. Since these two thrusters use centrally mounted cathodes, the current density associated with the cathode plume is removed before applying iterative pathfinding. A procedure is presented for removing the cathode plume. The results of the analysis are compared to far-field probe analysis results. This paper ends with checks on the validity of the new method and discussions on the implications of the results.
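To make the role of the integration concrete, a current-weighted divergence half-angle can be computed from a radial current-density profile at one axial station; the profile below is hypothetical, and the origin-point choice that iterative pathfinding determines systematically is simply fixed on the thruster axis here:

```python
import math

# Hypothetical radial current-density profile j(r) at axial distance z,
# with the origin point assumed on axis at the exit plane.
radii = [0.00, 0.01, 0.02, 0.03, 0.04, 0.05]   # m
j =     [8.0,  7.0,  5.0,  3.0,  1.5,  0.5]    # A/m^2
z = 0.10                                        # axial distance, m

# Current-weighted divergence angle: integrals of j(r) cos(theta) r dr
# and j(r) r dr, approximated as simple weighted sums.
num = 0.0
den = 0.0
for r, ji in zip(radii, j):
    theta = math.atan2(r, z)          # angle from origin to probe location
    num += ji * math.cos(theta) * r
    den += ji * r
divergence_deg = math.degrees(math.acos(num / den))
```

Moving the assumed origin upstream or downstream changes every theta, and hence the divergence angle, which is why the choice of origin matters so much in the near field.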
Initial operation of the Lockheed Martin T4B experiment
NASA Astrophysics Data System (ADS)
Garrett, M. L.; Blinzer, A.; Ebersohn, F.; Gucker, S.; Heinrich, J.; Lohff, C.; McGuire, T.; Montecalvo, N.; Raymond, A.; Rhoads, J.; Ross, P.; Sommers, B.; Strandberg, E.; Sullivan, R.; Walker, J.
2017-10-01
The T4B experiment is a linear, encapsulated ring-cusp confinement device, designed to develop a physics and technology basis for a follow-on high-beta (β ∼ 1) machine. The experiment consists of 13 magnetic field coils (11 external, 2 internal) that produce a series of on-axis field nulls surrounded by modest magnetic fields of up to 0.3 T. The primary plasma source used on T4B is a lanthanum hexaboride (LaB6) cathode, capable of coupling over 100 kW into the plasma. Initial testing focused on commissioning of components and integration of diagnostics. Diagnostics include both long- and short-wavelength interferometry, bolometry, visible and X-ray spectroscopy, Langmuir and B-dot probes, Thomson scattering, flux loops, and fast camera imagery. Low-energy discharges were used to begin validation of physics models and simulation efforts. Following the initial machine check-out, neutral beam injection (NBI) was integrated onto the device. Detailed results will be presented. © 2017 Lockheed Martin Corporation. All Rights Reserved.
Embedded performance validity testing in neuropsychological assessment: Potential clinical tools.
Rickards, Tyler A; Cranston, Christopher C; Touradji, Pegah; Bechtold, Kathleen T
2018-01-01
The article aims to suggest clinically useful tools in neuropsychological assessment for efficient use of embedded measures of performance validity. To accomplish this, we integrated available validity-related and statistical research from the literature, consensus statements, and survey-based data from practicing neuropsychologists. We provide recommendations for 1) cutoffs for embedded performance validity tests, including Reliable Digit Span, California Verbal Learning Test (Second Edition) Forced Choice Recognition, Rey-Osterrieth Complex Figure Test Combination Score, Wisconsin Card Sorting Test Failure to Maintain Set, and the Finger Tapping Test; 2) selecting the number of performance validity measures to administer in an assessment; and 3) hypothetical clinical decision-making models for the use of performance validity testing in a neuropsychological assessment, collectively considering behavior, patient reporting, and data indicating invalid or noncredible performance. Performance validity testing helps inform the clinician about an individual's general approach to tasks: response to failure, task engagement and persistence, and compliance with task demands. These data-driven clinical suggestions provide a resource for clinicians, are intended to instigate conversation within the field toward more uniform, testable decisions, and can guide future research in this area.
NASA Technical Reports Server (NTRS)
Smith, Andrew; LaVerde, Bruce; Fulcher, Clay; Hunt, Ron
2012-01-01
An approach for predicting the vibration, strain, and force responses of a flight-like vehicle panel assembly to acoustic pressures is presented. Important validation for the approach is provided by comparison to ground test measurements in a reverberant chamber. The test article and the corresponding analytical model were assembled in several configurations to demonstrate the suitability of the approach for response predictions when the vehicle panel is integrated with equipment. Critical choices in the analysis necessary for convergence of the predicted and measured responses are illustrated through sensitivity studies. The methodology includes representation of spatial correlation of the pressure field over the panel surface. Therefore, it is possible to demonstrate the effects of hydrodynamic coincidence in the response. The sensitivity to pressure patch density clearly illustrates the onset of coincidence effects on the panel response predictions.
Xian, George Z.; Homer, Collin G.; Rigge, Matthew B.; Shi, Hua; Meyer, Debbie
2015-01-01
Accurate and consistent estimates of shrubland ecosystem components are crucial to a better understanding of ecosystem conditions in arid and semiarid lands. An innovative approach was developed by integrating multiple sources of information to quantify shrubland components as continuous field products within the National Land Cover Database (NLCD). The approach consists of several procedures including field sample collections, high-resolution mapping of shrubland components using WorldView-2 imagery and regression tree models, Landsat 8 radiometric balancing and phenological mosaicking, medium-resolution estimates of shrubland components across different climate zones using Landsat 8 phenological mosaics and regression tree models, and product validation. Fractional covers of nine shrubland components were estimated: annual herbaceous, bare ground, big sagebrush, herbaceous, litter, sagebrush, shrub, sagebrush height, and shrub height. Our study area included the footprint of six Landsat 8 scenes in the northwestern United States. Results show that most components have relatively significant correlations with validation data, have small normalized root mean square errors, and correspond well with expected ecological gradients. While some uncertainties remain with height estimates, the model formulated in this study provides a cross-validated, unbiased, and cost-effective approach to quantify shrubland components at a regional scale and advances knowledge of horizontal and vertical variability of these components.
Estimating Tropical Cyclone Surface Wind Field Parameters with the CYGNSS Constellation
NASA Astrophysics Data System (ADS)
Morris, M.; Ruf, C. S.
2016-12-01
A variety of parameters can be used to describe the wind field of a tropical cyclone (TC). Of particular interest to the TC forecasting and research community are the maximum sustained wind speed (VMAX), radius of maximum wind (RMW), 34-, 50-, and 64-kt wind radii, and integrated kinetic energy (IKE). The RMW is the distance separating the storm center and the VMAX position. IKE integrates the square of surface wind speed over the entire storm. These wind field parameters can be estimated from observations made by the Cyclone Global Navigation Satellite System (CYGNSS) constellation. The CYGNSS constellation consists of eight small satellites in a 35-degree-inclination circular orbit. These satellites will be operating in standard science mode by the 2017 Atlantic TC season. CYGNSS will provide estimates of ocean surface wind speed under all precipitating conditions with high temporal and spatial sampling in the tropics. TC wind field data products can be derived from the level-2 CYGNSS wind speed product. CYGNSS-based TC wind field science data products are developed and tested in this paper. Their performance is validated prelaunch using a mission simulator.
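The IKE definition above can be made concrete with a short sketch. In the common convention, IKE sums ½ρU² over a 1-m-deep layer of air (ρ ≈ 1 kg/m³) for cells where the surface wind meets the 34-kt (18 m/s) tropical-storm threshold; the grid here is invented for illustration:

```python
import numpy as np

def integrated_kinetic_energy(u, cell_area, rho=1.0, threshold=18.0):
    """IKE in joules: sum of 0.5*rho*U^2 over a 1-m-deep air layer,
    counting only grid cells whose wind speed U meets the threshold."""
    u = np.asarray(u, dtype=float)
    mask = u >= threshold
    return float(np.sum(0.5 * rho * u[mask] ** 2 * cell_area))

# Hypothetical 2x2 grid of 10 km x 10 km cells: only the 30 and 40 m/s
# cells exceed the 18 m/s threshold and contribute.
u = np.array([[10.0, 30.0],
              [17.9, 40.0]])
ike = integrated_kinetic_energy(u, cell_area=1.0e8)  # area in m^2
```

Because IKE weights wind speed quadratically over area, it is far less sensitive to a single VMAX pixel than the maximum-wind metric, which is part of its appeal for satellite-derived wind fields.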
Microscopic theory of linear light scattering from mesoscopic media and in near-field optics.
Keller, Ole
2005-08-01
On the basis of quantum mechanical response theory, a microscopic propagator theory of linear light scattering from mesoscopic systems is presented. The central integral equation problem is transferred to a matrix equation problem by discretization in transitions between pairs of (many-body) energy eigenstates. The local-field calculation which appears from this approach is valid down to the microscopic region. Previous theories based on the (macroscopic) dielectric constant concept make use of spatial (geometrical) discretization and cannot in general be trusted on the mesoscopic length scale. The present theory can be applied to light scattering studies in near-field optics. After a brief discussion of the macroscopic integral equation problem, a microscopic potential description of the scattering process is established. In combination with the use of microscopic electromagnetic propagators, the formalism allows one to make contact with the macroscopic theory of light scattering and with the spatial photon localization problem. The quantum structure of the microscopic conductivity response tensor enables one to establish a clear physical picture of the origin of local-field phenomena in mesoscopic and near-field optics. The Huygens scalar propagator formalism is revisited and its generality in microscopic physics is pointed out.
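The general move of transferring an integral-equation problem to a matrix-equation problem by discretization can be illustrated with a standard Nyström scheme for a Fredholm equation of the second kind. This is a textbook device chosen for concreteness, not Keller's many-body eigenstate discretization:

```python
import numpy as np

def nystrom_solve(kernel, g, a, b, n):
    """Solve f(x) = g(x) + integral_a^b K(x,y) f(y) dy on n quadrature
    nodes by converting the integral equation into the linear system
    (I - K W) f = g, where W holds the quadrature weights."""
    x = np.linspace(a, b, n)
    w = np.full(n, (b - a) / n)              # simple equal-weight quadrature
    K = kernel(x[:, None], x[None, :])       # kernel sampled on the node grid
    A = np.eye(n) - K * w[None, :]
    return x, np.linalg.solve(A, g(x))

# Sanity check: with a zero kernel (no scattering coupling), the solution
# must reduce to the source term g itself.
x, f = nystrom_solve(lambda x, y: np.zeros_like(x * y), np.sin, 0.0, np.pi, 50)
```

In the paper's setting the "nodes" are transitions between pairs of energy eigenstates rather than spatial quadrature points, but the structure (discretize, assemble a matrix, solve a linear system) is the same.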
Haregu, Tilahun Nigatu; Setswe, Geoffrey; Elliott, Julian; Oldenburg, Brian
2014-01-01
Introduction: Although there are several models of integrated architecture, we still lack models and theories about the integration process of health system responses to HIV/AIDS and NCDs. Objective: The overall purpose of this study is to design an action model, a systematic approach, for the integration of health system responses to HIV/AIDS and NCDs in developing countries. Methods: An iterative and progressive approach to model development using inductive qualitative evidence synthesis techniques was applied. As evidence about integration is spread across different fields, synthesis of evidence from a broad range of disciplines was conducted. Results: An action model of integration having 5 underlying principles, 4 action fields, and a 9-step action cycle is developed. The INTEGRATE model is an acronym of the 9 steps of the integration process: 1) Interrelate the magnitude and distribution of the problems, 2) Navigate the linkage between the problems, 3) Testify to individual-level co-occurrence of the problems, 4) Examine the similarities and understand the differences between the response functions, 5) Glance over the health system's environment for integration, 6) Repackage and share evidence in a useable form, 7) Ascertain the plan for integration, 8) Translate the plan into action, 9) Evaluate and Monitor the integration. Conclusion: Our model provides a basis for integration of health system responses to HIV/AIDS and NCDs in the context of developing countries. We propose that future empirical work is needed to refine and test the validity and applicability of the model. PMID:24373260
Advanced ballistic range technology
NASA Technical Reports Server (NTRS)
Yates, Leslie A.
1993-01-01
Optical images, such as experimental interferograms, schlieren, and shadowgraphs, are routinely used to identify and locate features in experimental flow fields and for validating computational fluid dynamics (CFD) codes. Interferograms can also be used for comparing experimental and computed integrated densities. By constructing these optical images from flow-field simulations, one-to-one comparisons of computation and experiment are possible. During the period from February 1, 1992, to November 30, 1992, work has continued on the development of CISS (Constructed Interferograms, Schlieren, and Shadowgraphs), a code that constructs images from ideal- and real-gas flow-field simulations. In addition, research connected with the automated film-reading system and the proposed reactivation of the radiation facility has continued.
NASA Astrophysics Data System (ADS)
Deng, B.; Xiao, L.; Zhao, X.; Baker, E.; Gong, D.; Guo, D.; He, H.; Hou, S.; Liu, C.; Liu, T.; Sun, Q.; Thomas, J.; Wang, J.; Xiang, A. C.; Yang, D.; Ye, J.; Zhou, W.
2018-05-01
Two optical data link data transmission Application Specific Integrated Circuits (ASICs), the baseline and its backup, have been designed for the ATLAS Liquid Argon (LAr) Calorimeter Phase-I trigger upgrade. The latency of each ASIC and that of its corresponding receiver implemented in a back-end Field-Programmable Gate Array (FPGA) are critical specifications. In this paper, we present the latency measurements and simulation of two ASICs. The measurement results indicate that both ASICs achieve their design goals and meet the latency specifications. The consistency between the simulation and measurements validates the ASIC latency characterization.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ahmed E. Hassan
2006-01-24
Models have an inherent uncertainty. The difficulty in fully characterizing the subsurface environment makes uncertainty an integral component of groundwater flow and transport models, which dictates the need for continuous monitoring and improvement. Building and sustaining confidence in closure decisions and monitoring networks based on models of subsurface conditions require developing confidence in the models through an iterative process. The definition of model validation is postulated as a confidence-building and long-term iterative process (Hassan, 2004a). Model validation should be viewed as a process, not an end result. Following Hassan (2004b), an approach is proposed for the validation process of stochastic groundwater models. The approach is briefly summarized herein, and detailed analyses of acceptance criteria for stochastic realizations and of using validation data to reduce input parameter uncertainty are presented and applied to two case studies. During the validation process for stochastic models, a question arises as to the sufficiency of the number of acceptable model realizations (in terms of conformity with validation data). Using a hierarchical approach to make this determination is proposed. This approach is based on computing five measures or metrics and following a decision tree to determine if a sufficient number of realizations attain satisfactory scores regarding how they represent the field data used for calibration (old) and used for validation (new). The first two of these measures are applied to hypothetical scenarios using the first case study, assuming field data either consistent with the model or significantly different from the model results. In both cases it is shown how the two measures would lead to the appropriate decision about the model performance. Standard statistical tests are used to evaluate these measures, with the results indicating they are appropriate measures for evaluating model realizations.
The use of validation data to constrain model input parameters is shown for the second case study using a Bayesian approach known as Markov Chain Monte Carlo. The approach shows great potential to be helpful in the validation process and in incorporating prior knowledge with new field data to derive posterior distributions for both model input and output.
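The realization-acceptance step described above, scoring each stochastic realization against validation data and counting how many pass, can be sketched minimally as follows. The RMSE metric, threshold, and synthetic ensemble here are hypothetical stand-ins, not the report's five measures:

```python
import numpy as np

def acceptable_fraction(realizations, observed, rmse_threshold):
    """Fraction of model realizations whose RMSE against the validation
    data falls at or below a prescribed acceptance threshold."""
    realizations = np.asarray(realizations, dtype=float)
    rmse = np.sqrt(np.mean((realizations - observed) ** 2, axis=1))
    return float(np.mean(rmse <= rmse_threshold))

# Synthetic ensemble: 200 realizations of 20 monitoring-point values,
# scored against a (hypothetical) zero-valued validation dataset.
rng = np.random.default_rng(0)
observed = np.zeros(20)
real = rng.normal(0.0, 1.0, size=(200, 20))
frac = acceptable_fraction(real, observed, rmse_threshold=1.0)
```

In a hierarchical scheme this fraction would feed a decision tree: a high enough fraction builds confidence in the model, while a low fraction triggers recalibration with the new field data.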
Integrated Syntactic/Semantic XML Data Validation with a Reusable Software Component
ERIC Educational Resources Information Center
Golikov, Steven
2013-01-01
Data integration is a critical component of enterprise system integration, and XML data validation is the foundation for sound data integration of XML-based information systems. Since B2B e-commerce relies on data validation as one of the critical components for enterprise integration, it is imperative for financial industries and e-commerce…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miyao, Tadahiro; Spohn, Herbert
The retarded van der Waals potential, as first obtained by Casimir and Polder, is usually computed on the basis of nonrelativistic quantum electrodynamics. The Hamiltonian describes two infinitely heavy nuclei, charge e, separated by a distance R, and two spinless electrons, charge -e, nonrelativistically coupled to the quantized radiation field. Casimir and Polder used the dipole approximation and small coupling to the Maxwell field. We employ here the full Hamiltonian and determine the asymptotic strength of the leading -R^(-7) potential, which is valid for all e. Our computation is based on a path integral representation and expands in 1/R, rather than in e.
NASA Astrophysics Data System (ADS)
Naserpour, Mahin; Zapata-Rodríguez, Carlos J.
2018-01-01
The evaluation of vector wave fields can be accurately performed by means of diffraction integrals, differential equations and also series expansions. In this paper, a Bessel series expansion whose basis relies on the exact solution of the Helmholtz equation in cylindrical coordinates is theoretically developed for the straightforward yet accurate description of low-numerical-aperture focal waves. The validity of this approach is confirmed by explicit application to Gaussian beams and apertured focused fields in the paraxial regime. Finally, we discuss how our procedure can be favorably implemented in scattering problems.
Wang, Yixin; Guo, Fang
2014-01-01
A large number of studies show that real-world studies have stronger external validity than traditional randomized controlled trials and can evaluate the effect of interventions in a real clinical setting, which opens up a new path for research on integrative medicine in coronary heart disease. However, clinical data on integrative medicine in coronary heart disease are large in amount and complex in data types, making the search for an appropriate methodology a hot topic. Data mining techniques analyze and dig out useful information and knowledge from mass data to guide people's practices. The present review provides insights into the main features of data mining and its applications in integrative medical studies of coronary heart disease, aiming to analyze the progress and prospects in this field. PMID:25544853
Vortex Lattice UXO Mobility Model Integration
2015-03-01
...predictions of the fate and transport of a broad-field UXO population are extremely sensitive to the initial state of that population, specifically: the...limit the model's computational domain. This revised model software was built on the concept of interconnected geomorphic control cells consisting of
Reverse engineering biomolecular systems using -omic data: challenges, progress and opportunities.
Quo, Chang F; Kaddi, Chanchala; Phan, John H; Zollanvari, Amin; Xu, Mingqing; Wang, May D; Alterovitz, Gil
2012-07-01
Recent advances in high-throughput biotechnologies have led to rapidly growing research interest in reverse engineering of biomolecular systems (REBMS). 'Data-driven' approaches, i.e. data mining, can be used to extract patterns from large volumes of biochemical data at molecular-level resolution, while 'design-driven' approaches, i.e. systems modeling, can be used to simulate emergent system properties. Consequently, both data- and design-driven approaches applied to -omic data may lead to novel insights in reverse engineering biological systems that could not be expected before using low-throughput platforms. However, several challenges exist in this fast-growing field: (i) integrating heterogeneous biochemical data for data mining, (ii) combining top-down and bottom-up approaches for systems modeling and (iii) validating system models experimentally. In addition to reviewing progress made by the community and opportunities encountered in addressing these challenges, we explore the emerging field of synthetic biology, which is an exciting approach to validate and analyze theoretical system models directly through experimental synthesis, i.e. analysis-by-synthesis. The ultimate goal is to address the present and future challenges in reverse engineering biomolecular systems using an integrated workflow of data mining, systems modeling and synthetic biology.
28 CFR 25.5 - Validation and data integrity of records in the system.
Code of Federal Regulations, 2010 CFR
2010-07-01
... INFORMATION SYSTEMS The National Instant Criminal Background Check System § 25.5 Validation and data integrity of records in the system... verify that the information provided to the NICS Index remains valid and correct. (b) Each data source...
Bednarz, Bryan; Xu, X George
2012-01-01
There is a serious and growing concern about the increased risk of radiation-induced second cancers and late tissue injuries associated with radiation treatment. To better understand and to more accurately quantify non-target organ doses due to scatter and leakage radiation from medical accelerators, a detailed Monte Carlo model of the medical linear accelerator is needed. This paper describes the development and validation of a detailed accelerator model of the Varian Clinac operating at 6 and 18 MV beam energies. Over 100 accelerator components have been defined and integrated using the Monte Carlo code MCNPX. A series of in-field and out-of-field dose validation studies were performed. In-field dose distributions calculated using the accelerator models were tuned to match measurement data that are considered the de facto ‘gold standard’ for the Varian Clinac accelerator provided by the manufacturer. Field sizes of 4 cm × 4 cm, 10 cm × 10 cm, 20 cm × 20 cm and 40 cm × 40 cm were considered. The local difference between calculated and measured dose on the percent depth dose curve was less than 2% for all locations. The local difference between calculated and measured dose on the dose profile curve was less than 2% in the plateau region and less than 2 mm in the penumbra region for all locations. Out-of-field dose profiles were calculated and compared to measurement data for both beam energies for field sizes of 4 cm × 4 cm, 10 cm × 10 cm and 20 cm × 20 cm. For all field sizes considered in this study, the average local difference between calculated and measured dose for the 6 and 18 MV beams was 14 and 16%, respectively. In addition, a method for determining neutron contamination in the 18 MV operating model was validated by comparing calculated in-air neutron fluence with reported calculations and measurements. The average difference between calculated and measured neutron fluence was 20%. 
As one of the most detailed accelerator models for both in-field and out-of-field dose calculations, the model will be combined with anatomically realistic computational patient phantoms into a computational framework to calculate non-target organ doses to patients from various radiation treatment plans. PMID:19141879
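The in-field acceptance criterion quoted above (local difference below 2% along the percent-depth-dose curve) is a pointwise comparison between calculated and measured curves. A minimal sketch with invented curves, not the study's measurement data:

```python
import numpy as np

def max_local_difference(calculated, measured):
    """Maximum local percent difference between calculated and measured
    dose, evaluated point by point along the curve."""
    c = np.asarray(calculated, dtype=float)
    m = np.asarray(measured, dtype=float)
    return float(np.max(np.abs(c - m) / m) * 100.0)

# Hypothetical percent-depth-dose values and a calculation that runs
# uniformly 1.5% high: the local difference is 1.5% everywhere, so the
# 2% acceptance criterion would be satisfied.
measured = np.array([100.0, 98.0, 90.0, 75.0, 60.0])
calculated = measured * 1.015
worst = max_local_difference(calculated, measured)
```

Local (pointwise) difference is the stricter convention; a global normalization can hide large errors at depth, which is why the validation quotes local values.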
Implementation and application of an interactive user-friendly validation software for RADIANCE
NASA Astrophysics Data System (ADS)
Sundaram, Anand; Boonn, William W.; Kim, Woojin; Cook, Tessa S.
2012-02-01
RADIANCE extracts CT dose parameters from dose sheets using optical character recognition and stores the data in a relational database. To facilitate validation of RADIANCE's performance, a simple user interface was initially implemented and about 300 records were evaluated. Here, we extend this interface to achieve a wider variety of functions and perform a larger-scale validation. The validator uses some data from the RADIANCE database to prepopulate quality-testing fields, such as correspondence between calculated and reported total dose-length product. The interface also displays relevant parameters from the DICOM headers. A total of 5,098 dose sheets were used to test the performance accuracy of RADIANCE in dose data extraction. Several search criteria were implemented. All records were searchable by accession number, study date, or dose parameters beyond chosen thresholds. Validated records were searchable according to additional criteria from validation inputs. An error rate of 0.303% was demonstrated in the validation. Dose monitoring is increasingly important and RADIANCE provides an open-source solution with a high level of accuracy. The RADIANCE validator has been updated to enable users to test the integrity of their installation and verify that their dose monitoring is accurate and effective.
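One representative prepopulated quality-testing field, agreement between the summed per-series dose-length product and the reported study total, amounts to a simple tolerance check. A sketch with hypothetical numbers and a hypothetical tolerance, not RADIANCE source code:

```python
def dlp_consistent(series_dlp, reported_total, tolerance=0.05):
    """True if the summed per-series dose-length product (mGy*cm) matches
    the scanner-reported study total to within a fractional tolerance."""
    calculated = sum(series_dlp)
    return abs(calculated - reported_total) <= tolerance * reported_total

# Hypothetical CT study: three series whose DLPs should sum to the
# reported total on the dose sheet.
ok = dlp_consistent([120.5, 340.2, 95.8], reported_total=556.5)
```

A mismatch here typically flags an OCR extraction error rather than a dosimetry problem, which is exactly the kind of record the validator surfaces for human review.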
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vega, Richard Manuel; Parm, Edward J.; Griffin, Patrick J.
2015-07-01
This report was put together to support the International Atomic Energy Agency (IAEA) REAL-2016 activity to validate the dosimetry community's ability to use a consistent set of activation data and to derive consistent spectral characterizations. The report captures details of integral measurements taken in the Annular Core Research Reactor (ACRR) central cavity with the Polyethylene-Lead-Graphite (PLG) bucket, a reference neutron benchmark field. The field is described and an “a priori” calculated neutron spectrum is reported, based on MCNP6 calculations, and a subject matter expert (SME) based covariance matrix is given for this “a priori” spectrum. The results of 37 integral dosimetry measurements in the neutron field are reported.
NASA Technical Reports Server (NTRS)
Menrad, Robert J.; Larson, Wiley J.
2008-01-01
This paper shares the findings of NASA's Integrated Learning and Development Program (ILDP) in its effort to reinvigorate the hands-on practice of space systems engineering and project/program management through focused coursework, training opportunities, on-the-job learning, and special assignments. Prior to March 2005, NASA responsibility for technical workforce development (the program/project manager, systems engineering, discipline engineering, and associated communities) was executed by two parallel organizations. In March 2005 these organizations merged. The resulting program, ILDP, was chartered to implement an integrated competency-based development model capable of enhancing NASA's technical workforce performance as they face the complex challenges of Earth science, space science, aeronautics, and human spaceflight missions. Results developed in collaboration with NASA Field Centers are reported. This work led to the definition of the agency's first integrated technical workforce development model, known as the Requisite Occupation Competence and Knowledge (the ROCK). Critical processes and products are presented, including 'validation' techniques to guide model development, the Design-A-CUrriculuM (DACUM) process, and creation of the agency's first systems engineering body of knowledge. Findings were validated via nine focus groups from industry and government and with over 17 space-related organizations, at an estimated cost exceeding $300,000 (US). Masters-level programs and training programs have evolved to address the needs of these practitioner communities based upon these results. The ROCK reintroduced rigor and depth to practitioner development in these critical disciplines, enabling practitioners to take mission concepts from imagination to reality.
Mu, Tingkui; Pacheco, Shaun; Chen, Zeyu; Zhang, Chunmin; Liang, Rongguang
2017-02-13
In this paper, the design and experimental demonstration of a snapshot linear-Stokes imaging spectropolarimeter (SLSIS) is presented. The SLSIS, which is based on division-of-focal-plane polarimetry with four parallel linear polarization channels and integral field spectroscopy with numerous slit dispersive paths, has no moving parts and provides video-rate Stokes-vector hyperspectral datacubes. It does not need any scanning in the spectral, spatial or polarization dimension and offers significant advantages of rapid reconstruction without heavy computation during post-processing. The principle and the experimental setup of the SLSIS are described in detail. The image registration, Stokes spectral reconstruction and calibration procedures are included, and the system is validated using measurements of tungsten light and a static scene. The SLSIS's snapshot ability to resolve polarization spectral signatures is demonstrated using measurements of a dynamic scene.
NASA Astrophysics Data System (ADS)
Vlasov, Vladimir; Rosenblum, Michael; Pikovsky, Arkady
2016-08-01
As has been shown by Watanabe and Strogatz (WS) (1993 Phys. Rev. Lett. 70 2391), a population of identical phase oscillators, sine-coupled to a common field, is a partially integrable system: for any ensemble size its dynamics reduce to equations for three collective variables. Here we develop a perturbation approach for weakly nonidentical ensembles. We calculate corrections to the WS dynamics for two types of perturbations: those due to a distribution of natural frequencies and of forcing terms, and those due to small white noise. We demonstrate that in both cases, the complex mean field for which the dynamical equations are written is close to the Kuramoto order parameter, up to the leading order in the perturbation. This supports the validity of the dynamical reduction suggested by Ott and Antonsen (2008 Chaos 18 037113) for weakly inhomogeneous populations.
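The Kuramoto order parameter referred to here is the modulus of the complex mean field Z = (1/N) Σ_j exp(iθ_j), which the paper shows stays close to the WS mean field to leading order in the perturbation. A short numerical check of its limiting values:

```python
import numpy as np

def order_parameter(theta):
    """Complex Kuramoto mean field Z = (1/N) * sum_j exp(i*theta_j);
    |Z| = 1 for full synchrony, |Z| ~ 0 for evenly spread phases."""
    return np.mean(np.exp(1j * np.asarray(theta)))

# Fully synchronized population: all phases identical, |Z| = 1.
r_sync = abs(order_parameter(np.full(100, 0.3)))

# Incoherent population: phases spread uniformly on the circle, |Z| ~ 0
# (the 100 points are 100th roots of unity, whose sum vanishes).
r_spread = abs(order_parameter(np.linspace(0, 2 * np.pi, 100, endpoint=False)))
```

The WS theory works with three collective variables rather than Z directly, but for weakly inhomogeneous ensembles the two descriptions coincide to leading order, which is the paper's main point.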
A general low frequency acoustic radiation capability for NASTRAN
NASA Technical Reports Server (NTRS)
Everstine, G. C.; Henderson, F. M.; Schroeder, E. A.; Lipman, R. R.
1986-01-01
A new capability called NASHUA is described for calculating the radiated acoustic sound pressure field exterior to a harmonically-excited arbitrary submerged 3-D elastic structure. The surface fluid pressures and velocities are first calculated by coupling a NASTRAN finite element model of the structure with a discretized form of the Helmholtz surface integral equation for the exterior fluid. After the fluid impedance is calculated, most of the required matrix operations are performed using the general matrix manipulation package (DMAP) available in NASTRAN. Far field radiated pressures are then calculated from the surface solution using the Helmholtz exterior integral equation. Other output quantities include the maximum sound pressure levels in each of the three coordinate planes, the rms and average surface pressures and normal velocities, the total radiated power and the radiation efficiency. The overall approach is illustrated and validated using known analytic solutions for submerged spherical shells subjected to both uniform and nonuniform applied loads.
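The final step described above, evaluating far-field pressure from the surface solution via the exterior Helmholtz integral, can be caricatured as a sum of free-space monopole contributions q·e^{ikR}/(4πR). This is a toy sketch of that quadrature idea, not the NASHUA/DMAP implementation:

```python
import numpy as np

def farfield_pressure(sources, strengths, obs, k):
    """Pressure at `obs` from discrete monopoles: sum of
    q * exp(i*k*R) / (4*pi*R), the free-space Green's function of the
    Helmholtz equation, over all source points."""
    R = np.linalg.norm(np.asarray(sources) - np.asarray(obs), axis=1)
    return np.sum(strengths * np.exp(1j * k * R) / (4.0 * np.pi * R))

# Single unit-strength source 2 m from the observer: the magnitude must
# equal 1 / (4*pi*2), independent of the wavenumber k.
p = farfield_pressure([[0.0, 0.0, 0.0]], np.array([1.0]),
                      [2.0, 0.0, 0.0], k=5.0)
```

The actual exterior Helmholtz integral also carries a dipole (normal-derivative) term weighted by the surface velocities; the monopole sum above only conveys the structure of the surface-to-far-field quadrature.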
Unmanned Aerial Mass Spectrometer Systems for In-Situ Volcanic Plume Analysis
NASA Astrophysics Data System (ADS)
Diaz, Jorge Andres; Pieri, David; Wright, Kenneth; Sorensen, Paul; Kline-Shoder, Robert; Arkin, C. Richard; Fladeland, Matthew; Bland, Geoff; Buongiorno, Maria Fabrizia; Ramirez, Carlos; Corrales, Ernesto; Alan, Alfredo; Alegria, Oscar; Diaz, David; Linick, Justin
2015-02-01
Technology advances in the field of small unmanned aerial vehicles, and their integration with a variety of sensor packages and instruments such as miniature mass spectrometers, have expanded the possibilities and applications of what are now called unmanned aerial systems (UAS). With such technology, in situ and proximal remote sensing measurements of volcanic plumes are now possible without risking the lives of the scientists and personnel in charge of close monitoring of volcanic activity. These methods provide unprecedented, and otherwise unobtainable, data very close in space and time to eruptions, to better understand the role of gas volatiles in magma and in subsequent eruption products. Small mass spectrometers, together with the world's smallest turbomolecular pump, have been integrated into NASA and University of Costa Rica UAS platforms to be field-tested for in situ volcanic plume analysis and in support of the calibration and validation of satellite-based remote sensing data. These new UAS-MS systems are combined with existing UAS flight-tested payloads and assets, such as temperature, pressure, relative humidity, SO2, H2S, CO2 and GPS sensors, on-board data storage, and telemetry. Such payloads are capable of generating real-time 3D concentration maps of the active plume of Turrialba volcano in Costa Rica, while remote sensing data are simultaneously collected from the ASTER and OMI space-borne instruments for comparison. The primary goal is to improve understanding of the chemical and physical properties of emissions for mitigation of local volcanic hazards, for the validation of species detection and abundance retrievals based on remote sensing, and to validate transport models.
A finite element conjugate gradient FFT method for scattering
NASA Technical Reports Server (NTRS)
Collins, Jeffery D.; Ross, Dan; Jin, J.-M.; Chatterjee, A.; Volakis, John L.
1991-01-01
Validated results are presented for the new 3D body-of-revolution finite element boundary integral code. A Fourier series expansion of the vector electric and magnetic fields is employed to reduce the dimensionality of the system, and the exact boundary condition is employed to terminate the finite element mesh. The mesh termination boundary is chosen such that it leads to convolutional boundary operators of low O(n) memory demand. Improvements of this code are discussed along with the proposed formulation for a full 3D implementation of the finite element boundary integral method in conjunction with a conjugate gradient fast Fourier transform (CGFFT) solution.
Colaert, Niklaas; Barsnes, Harald; Vaudel, Marc; Helsens, Kenny; Timmerman, Evy; Sickmann, Albert; Gevaert, Kris; Martens, Lennart
2011-08-05
The Thermo Proteome Discoverer program integrates both peptide identification and quantification into a single workflow for peptide-centric proteomics. Furthermore, its close integration with Thermo mass spectrometers has made it increasingly popular in the field. Here, we present a Java library to parse the msf files that constitute the output of Proteome Discoverer. The parser is also implemented in a graphical user interface allowing convenient access to the information found in the msf files, and in Rover, a program to analyze and validate quantitative proteomics information. All code, binaries, and documentation are freely available at http://thermo-msf-parser.googlecode.com.
Linear diffusion into a Faraday cage.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Warne, Larry Kevin; Lin, Yau Tang; Merewether, Kimball O.
2011-11-01
Linear lightning diffusion into a Faraday cage is studied. An early-time integral valid for large ratios of enclosure size to enclosure thickness and small relative permeability (μ/μ₀ ≤ 10) is used for this study. Existing solutions for nearby-lightning impulse responses of electrically thick-wall enclosures are refined and extended to calculate the nearby-lightning magnetic field (H) and time-derivative magnetic field (HDOT) inside enclosures of varying thickness caused by a decaying exponential excitation. For a direct-strike scenario, the early-time integral for a worst-case line source outside the enclosure caused by an impulse is simplified and numerically integrated to give the interior H and HDOT at the location closest to the source as well as a function of distance from the source. H and HDOT enclosure response functions for decaying exponentials are considered for an enclosure wall of any thickness. Simple formulas are derived to provide a description of enclosure interior H and HDOT as well. Direct-strike voltage and current bounds for a single-turn optimally-coupled loop for all three waveforms are also given.
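Because the enclosure responds linearly, its response to the decaying-exponential waveform follows from the impulse response by convolution. A schematic sketch using a stand-in, hypothetical diffusion-like impulse response rather than the paper's early-time integral:

```python
import numpy as np

dt = 1e-6
t = np.arange(0.0, 5e-3, dt)
# Stand-in thick-wall impulse response (illustrative shape only, not the
# paper's derived response function).
h_imp = t * np.exp(-t / 2e-4)
# Decaying-exponential lightning drive.
f = np.exp(-t / 5e-4)
# Interior field = (f * h_imp)(t), evaluated by discrete convolution.
h_interior = np.convolve(f, h_imp)[: t.size] * dt
```

The interior waveform rises from zero and peaks well after the excitation does, which is the qualitative signature of diffusion through a conducting wall.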
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stoynov, Y.; Dineva, P.
The stress, magnetic and electric field analysis of multifunctional composites weakened by impermeable cracks is of fundamental importance for their structural integrity and reliable service performance. The aim is to study the dynamic behavior of a plane of functionally graded magnetoelectroelastic composite containing more than one crack. The coupled material properties vary exponentially in an arbitrary direction. The plane is subjected to an anti-plane mechanical and in-plane electric and magnetic load. The boundary value problem, described by partial differential equations with variable coefficients, is reduced to a non-hypersingular traction boundary integral equation using an appropriate functional transform and a frequency-dependent fundamental solution derived in closed form by the Radon transform. A software code based on the boundary integral equation method (BIEM) is developed, validated and used in numerical simulations. The obtained results show the sensitivity of the dynamic stress, magnetic and electric field concentration in the cracked plane to the type and characteristics of the dynamic load, to the location and disposition of the cracks, to the wave-crack-crack interactions, and to the magnitude and direction of the material gradient.
Achievements and challenges in structural bioinformatics and computational biophysics.
Samish, Ilan; Bourne, Philip E; Najmanovich, Rafael J
2015-01-01
The field of structural bioinformatics and computational biophysics has undergone a revolution in the last 10 years; these developments are captured annually through the 3DSIG meeting, upon which this article reflects. An increase in accessible data, computational resources and methodology has resulted in an increase in the size and resolution of studied systems and in the complexity of the questions amenable to research. Concomitantly, the parameterization and efficiency of the methods have markedly improved, along with their cross-validation against other computational and experimental results. The field exhibits an ever-increasing integration with biochemistry, biophysics and other disciplines. In this article, we discuss recent achievements along with current challenges within the field. © The Author 2014. Published by Oxford University Press.
Bridging the divide: a model-data approach to Polar and Alpine microbiology.
Bradley, James A; Anesio, Alexandre M; Arndt, Sandra
2016-03-01
Advances in microbial ecology in the cryosphere continue to be driven by empirical approaches including field sampling and laboratory-based analyses. Although mathematical models are commonly used to investigate the physical dynamics of Polar and Alpine regions, they are rarely applied in microbial studies. Yet integrating modelling approaches with ongoing observational and laboratory-based work is ideally suited to Polar and Alpine microbial ecosystems given their harsh environmental and biogeochemical characteristics, simple trophic structures, distinct seasonality, often difficult accessibility, geographical expansiveness and susceptibility to accelerated climate changes. In this opinion paper, we explain how mathematical modelling ideally complements field and laboratory-based analyses. We thus argue that mathematical modelling is a powerful tool for the investigation of these extreme environments and that fully integrated, interdisciplinary model-data approaches could help the Polar and Alpine microbiology community address some of the great research challenges of the 21st century (e.g. assessing global significance and response to climate change). However, a better integration of field and laboratory work with model design and calibration/validation, as well as a stronger focus on quantitative information is required to advance models that can be used to make predictions and upscale processes and fluxes beyond what can be captured by observations alone. © FEMS 2016.
Hoeijmakers, H J; Arts, M L J; Snik, F; Keller, C U; Kuiper, J M
2016-09-19
We provide a proof of the technical feasibility of LOUPE, the first integral-field snapshot spectropolarimeter, designed to monitor the reflected flux and polarization spectrum of Earth. These are to be used as benchmark data for the retrieval of biomarkers and atmospheric and surface characteristics from future direct observations of exoplanets. We perform a design trade-off for an implementation in which LOUPE performs snapshot integral-field spectropolarimetry at visible wavelengths. We used off-the-shelf optics to construct a polarization modulator, in which polarization information is encoded into the spectrum as a wavelength-dependent modulation, while spatial resolution is maintained using a micro-lens array. The performance of this design concept is validated in a laboratory setup. Our proof-of-concept is capable of measuring a grid of 50 × 50 polarization spectra between 610 and 780 nm of a mock target planet - proving the merit of this design. The measurements are affected by systematic noise on the percent level, and we discuss how to mitigate this in future iterations. We conclude that LOUPE can be small and robust while meeting the science goals of this particular space application, and note the many potential applications that may benefit from our concept for doing snapshot integral-field spectropolarimetry.
2016-08-01
area denial environments. Near-peer adversaries continue to develop low observable aircraft, proliferate counter-precision guided munition systems ... when the Air Force had significantly more control over its requirements validation and acquisition processes. The only tactical aircraft currently in ... systems such as the F-35A. Interestingly, upgrades to these previously fielded aircraft also take longer after JCIDS was implemented than it did to
Gary Achtemeier
2012-01-01
A cellular automata fire model represents "elements" of fire by autonomous agents. A few simple algebraic expressions substituted for complex physical and meteorological processes and solved iteratively yield simulations for "super-diffusive" fire spread and coupled surface-layer (2-m) fire-atmosphere processes. Pressure anomalies, which are integrals of the thermal...
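The cellular-automata idea, fire elements as autonomous agents updated by simple local rules, can be sketched in a few lines. This is a deterministic toy rule for illustration, not Achtemeier's model, which couples the spread rules to surface-layer meteorology:

```python
import numpy as np

def step(state):
    """One CA step: UNBURNED(0) cells with a BURNING(1) 4-neighbour ignite;
    BURNING cells burn out to BURNED(2). Deterministic toy rule."""
    burning = state == 1
    neigh = np.zeros_like(state, dtype=bool)
    neigh[1:, :] |= burning[:-1, :]
    neigh[:-1, :] |= burning[1:, :]
    neigh[:, 1:] |= burning[:, :-1]
    neigh[:, :-1] |= burning[:, 1:]
    new = state.copy()
    new[(state == 0) & neigh] = 1   # ignition of exposed unburned cells
    new[burning] = 2                # burning cells are consumed
    return new

grid = np.zeros((5, 5), dtype=int)
grid[2, 2] = 1                      # single ignition point
after = step(step(grid))            # two iterations of spread
```

Stochastic ignition probabilities and wind-biased neighbourhoods are the usual ways such a rule is extended toward super-diffusive spread.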
ASSP Advanced Sensor Signal Processor.
1984-06-01
Field of Regard: ±150. Time to Impact: 4 seconds. Guidance Stop: 3.5 seconds after start (blind range of seeker at 100 feet above target). The control ... These will be integrated across the engagement time in open-loop fashion and will typically lead to terminal impact inaccuracies. The validation was ... with a natural frequency of around 3 Hz. The frequency and damping do not change substantially over the flight regime, where impact velocities
A Multi-Purpose Simulation Environment for UAV Research
2003-05-01
Unmanned aerial vehicles (UAVs) are playing an important role in today's military initiatives. UAVs have proven to be invaluable in ... battlefield commanders. Integration of new technologies necessitates simulation prior to fielding new systems in order to avoid costly errors. The unique ...
Integrative biological analysis for neuropsychopharmacology.
Emmett, Mark R; Kroes, Roger A; Moskal, Joseph R; Conrad, Charles A; Priebe, Waldemar; Laezza, Fernanda; Meyer-Baese, Anke; Nilsson, Carol L
2014-01-01
Although advances in psychotherapy have been made in recent years, drug discovery for brain diseases such as schizophrenia and mood disorders has stagnated. The need for new biomarkers and validated therapeutic targets in the field of neuropsychopharmacology is widely unmet. The brain is the most complex part of human anatomy from the standpoint of number and types of cells, their interconnections, and circuitry. To better meet patient needs, improved methods to approach brain studies by understanding functional networks that interact with the genome are being developed. The integrated biological approaches--proteomics, transcriptomics, metabolomics, and glycomics--have a strong record in several areas of biomedicine, including neurochemistry and neuro-oncology. Published applications of an integrated approach to projects of neurological, psychiatric, and pharmacological natures are still few but show promise to provide deep biological knowledge derived from cells, animal models, and clinical materials. Future studies that yield insights based on integrated analyses promise to deliver new therapeutic targets and biomarkers for personalized medicine.
Classical electromagnetic fields from quantum sources in heavy-ion collisions
NASA Astrophysics Data System (ADS)
Holliday, Robert; McCarty, Ryan; Peroutka, Balthazar; Tuchin, Kirill
2017-01-01
Electromagnetic fields are generated in high energy nuclear collisions by spectator valence protons. These fields are traditionally computed by integrating the Maxwell equations with point sources. One might expect that such an approach is valid at distances much larger than the proton size and thus such a classical approach should work well for almost the entire interaction region in the case of heavy nuclei. We argue that, in fact, the contrary is true: due to the quantum diffusion of the proton wave function, the classical approximation breaks down at distances of the order of the system size. We compute the electromagnetic field created by a charged particle described initially as a Gaussian wave packet of width 1 fm and evolving in vacuum according to the Klein-Gordon equation. We completely neglect the medium effects. We show that the dynamics, magnitude and even sign of the electromagnetic field created by classical and quantum sources are different.
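The diffusion time scale at which the classical point-source picture breaks down can be estimated from the familiar non-relativistic spreading of a Gaussian wave packet; the paper itself evolves a Klein-Gordon packet, so this is only an order-of-magnitude analogue in natural units (ħ = c = 1):

```python
import numpy as np

hbar_c = 0.1973        # GeV*fm conversion constant
m = 0.938              # proton mass in GeV
sigma0 = 1.0 / hbar_c  # initial width of 1 fm, expressed in GeV^-1

def width(t):
    """Schrodinger Gaussian-packet width sigma(t), t in GeV^-1."""
    return sigma0 * np.sqrt(1.0 + (t / (2.0 * m * sigma0**2)) ** 2)

tau = 2.0 * m * sigma0**2   # time for the width to grow by sqrt(2)
tau_fm = tau * hbar_c       # ~9.5 fm/c, comparable to the fireball size
```

That the spreading time is of the same order as the system size and lifetime is consistent with the abstract's claim that the classical approximation fails over essentially the whole interaction region.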
MUSE optical alignment procedure
NASA Astrophysics Data System (ADS)
Laurent, Florence; Renault, Edgard; Loupias, Magali; Kosmalski, Johan; Anwand, Heiko; Bacon, Roland; Boudon, Didier; Caillier, Patrick; Daguisé, Eric; Dubois, Jean-Pierre; Dupuy, Christophe; Kelz, Andreas; Lizon, Jean-Louis; Nicklas, Harald; Parès, Laurent; Remillieux, Alban; Seifert, Walter; Valentin, Hervé; Xu, Wenli
2012-09-01
MUSE (Multi Unit Spectroscopic Explorer) is a second-generation VLT integral field spectrograph (1×1 arcmin² field of view) developed for the European Southern Observatory (ESO), operating in the visible wavelength range (0.465-0.93 μm). A consortium of seven institutes is currently assembling and testing MUSE in the Integration Hall of the Observatoire de Lyon for the Preliminary Acceptance in Europe, scheduled for 2013. MUSE is composed of several subsystems, each under the responsibility of one institute. The fore optics derotates and anamorphoses the image at the focal plane. Splitting and relay optics feed the 24 identical integral field units (IFU), which are mounted within a large monolithic instrument mechanical structure. Each IFU incorporates an image slicer, a fully refractive spectrograph with VPH grating, and a detector system connected to a global vacuum and cryogenic system. During 2011, all MUSE subsystems were integrated, aligned and tested independently in each institute. After validation, the systems were shipped to the P.I. institute at Lyon and were assembled in the Integration Hall. This paper describes the end-to-end optical alignment procedure of the MUSE instrument. The design strategy, mixing optical alignment by manufacturing (a plug-and-play approach) with a few adjustments on key components, is presented. We describe the alignment method for identifying the optical axis using several references located in pupil and image planes. All tools required to perform the global alignment between each subsystem are described. The success of this alignment approach is demonstrated by the good MUSE image quality. MUSE commissioning at the VLT (Very Large Telescope) is planned for 2013.
Gómez-Ros, J M; Bedogni, R; Bortot, D; Domingo, C; Esposito, A; Introini, M V; Lorenzoli, M; Mazzitelli, G; Moraleda, M; Pola, A; Sacco, D
2017-04-01
This communication describes two new instruments, based on multiple active thermal neutron detectors arranged within a single moderator, that make it possible to unfold the neutron spectrum (from thermal to hundreds of MeV) and to determine the corresponding integral quantities with only one exposure. This makes them especially advantageous for neutron field characterisation and workplace monitoring in neutron-producing facilities. One of the devices has spherical geometry and nearly isotropic response; the other has cylindrical symmetry and is only sensitive to neutrons incident along the cylinder axis. In both cases, active detectors have been specifically developed according to the criteria of miniaturisation, high sensitivity, linear response and good photon rejection. The calculated response matrix has been validated by experimental irradiations in neutron reference fields with a global uncertainty of 3%. The measurements performed in realistic neutron fields permitted determination of the neutron spectra and the integral quantities, in particular H*(10). © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
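Unfolding reduces to inverting detector counts through the response matrix. A minimal sketch with an invented, well-conditioned 3-group response matrix (real unfolding codes use many groups, regularization and non-negativity constraints):

```python
import numpy as np

# Hypothetical response matrix R: counts per unit group fluence for each of
# three detectors in three energy groups (illustrative numbers only).
R = np.array([[0.90, 0.30, 0.05],
              [0.20, 0.80, 0.30],
              [0.05, 0.20, 0.70]])
phi_true = np.array([2.0, 1.0, 0.5])   # group fluences of the "true" spectrum
counts = R @ phi_true                  # the single-exposure detector readings
# Unfold: solve R @ phi = counts in the least-squares sense.
phi, *_ = np.linalg.lstsq(R, counts, rcond=None)
```

Integral quantities such as H*(10) then follow by folding the unfolded group fluences with the appropriate conversion coefficients.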
NASA Technical Reports Server (NTRS)
Uslenghi, Piergiorgio L. E.; Laxpati, Sharad R.; Kawalko, Stephen F.
1993-01-01
The third phase of the development of the computer codes for scattering by coated bodies that has been part of an ongoing effort in the Electromagnetics Laboratory of the Electrical Engineering and Computer Science Department at the University of Illinois at Chicago is described. The work reported discusses the analytical and numerical results for the scattering of an obliquely incident plane wave by impedance bodies of revolution with phi variation of the surface impedance. Integral equation formulation of the problem is considered. All three types of integral equations, electric field, magnetic field, and combined field, are considered. These equations are solved numerically via the method of moments with parametric elements. Both TE and TM polarizations of the incident plane wave are considered. The surface impedance is allowed to vary both along the profile of the scatterer and in the phi direction. The computer code developed for this purpose determines the electric surface current as well as the bistatic radar cross section. The results obtained with this code were validated by comparison with available results for specific scatterers such as the perfectly conducting sphere. Results for the cone-sphere and cone-cylinder-sphere for the case of an axially incident plane wave were validated by comparing them with those obtained in the first phase of this project. Results for body-of-revolution scatterers with an abrupt change in the surface impedance both along the profile of the scatterer and in the phi direction are presented.
NASA Astrophysics Data System (ADS)
Hidayati, A.; Rahmi, A.; Yohandri; Ratnawulan
2018-04-01
The importance of teaching materials suited to the characteristics of students is the main motivation for developing a Basic Electronics I module integrating character values based on the conceptual change teaching model. Module development in this research follows Plomp's development procedure, which includes preliminary research, a prototyping phase and an assessment phase. In the first year of this research, the module was validated. Content validity reflects the conformity of the module with development theory and with the demands of the learning model's characteristics. Construct validity reflects the linkage and consistency of each module component with the characteristics of the learning model integrating character values, as obtained through validator assessment. The average validation score assigned by the validators falls into the very valid category. Based on the validator assessment, the Basic Electronics I module integrating character values based on the conceptual change teaching model was then revised.
Eshelby's problem of non-elliptical inclusions
NASA Astrophysics Data System (ADS)
Zou, Wennan; He, Qichang; Huang, Mojia; Zheng, Quanshui
2010-03-01
The Eshelby problem consists in determining the strain field of an infinite linearly elastic homogeneous medium due to a uniform eigenstrain prescribed over a subdomain, called inclusion, of the medium. The salient feature of Eshelby's solution for an ellipsoidal inclusion is that the strain tensor field inside the latter is uniform. This uniformity has the important consequence that the solution to the fundamental problem of determination of the strain field in an infinite linearly elastic homogeneous medium containing an embedded ellipsoidal inhomogeneity and subjected to remote uniform loading can be readily deduced from Eshelby's solution for an ellipsoidal inclusion upon imposing appropriate uniform eigenstrains. Based on this result, most of the existing micromechanics schemes dedicated to estimating the effective properties of inhomogeneous materials have been nevertheless applied to a number of materials of practical interest where inhomogeneities are in reality non-ellipsoidal. Aiming to examine the validity of the ellipsoidal approximation of inhomogeneities underlying various micromechanics schemes, we first derive a new boundary integral expression for calculating Eshelby's tensor field (ETF) in the context of two-dimensional isotropic elasticity. The simple and compact structure of the new boundary integral expression leads us to obtain the explicit expressions of ETF and its average for a wide variety of non-elliptical inclusions including arbitrary polygonal ones and those characterized by the finite Laurent series. 
In light of these new analytical results, we show that: (i) the elliptical approximation to the average of ETF is valid for a convex non-elliptical inclusion but becomes unacceptable for a non-convex non-elliptical inclusion; (ii) in general, the Eshelby tensor field inside a non-elliptical inclusion is quite non-uniform and cannot be replaced by its average; (iii) the substitution of the generalized Eshelby tensor involved in various micromechanics schemes by the average Eshelby tensor for non-elliptical inhomogeneities is in general inadmissible.
Flow Mapping Based on the Motion-Integration Errors of Autonomous Underwater Vehicles
NASA Astrophysics Data System (ADS)
Chang, D.; Edwards, C. R.; Zhang, F.
2016-02-01
Knowledge of a flow field is crucial in the navigation of autonomous underwater vehicles (AUVs) since the motion of AUVs is affected by ambient flow. Due to the imperfect knowledge of the flow field, it is typical to observe a difference between the actual and predicted trajectories of an AUV, which is referred to as a motion-integration error (also known as a dead-reckoning error if an AUV navigates via dead-reckoning). The motion-integration error has been essential for an underwater glider to compute its flow estimate from the travel information of the last leg and to improve navigation performance by using the estimate for the next leg. However, the estimate by nature exhibits a phase difference compared to ambient flow experienced by gliders, prohibiting its application in a flow field with strong temporal and spatial gradients. In our study, to mitigate the phase problem, we have developed a local ocean model by combining the flow estimate based on the motion-integration error with flow predictions from a tidal ocean model. Our model has been used to create desired trajectories of gliders for guidance. Our method is validated by Long Bay experiments in 2012 and 2013 in which we deployed multiple gliders on the shelf of South Atlantic Bight and near the edge of Gulf Stream. In our recent study, the application of the motion-integration error is further extended to create a spatial flow map. Considering that the motion-integration errors of AUVs accumulate along their trajectories, the motion-integration error is formulated as a line integral of ambient flow which is then reformulated into algebraic equations. By solving an inverse problem for these algebraic equations, we obtain the knowledge of such flow in near real time, allowing more effective and precise guidance of AUVs in a dynamic environment. This method is referred to as motion tomography. We provide the results of non-parametric and parametric flow mapping from both simulated and experimental data.
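The motion-tomography idea, each leg's motion-integration error is a line integral of the ambient flow, leads to a linear inverse problem once the flow is discretized into cells. A minimal sketch with invented transit times and a two-cell flow field (the actual method handles 2D vector flows and curved trajectories):

```python
import numpy as np

# Time (s) each of three AUV legs spends in each of two flow cells
# (hypothetical geometry for illustration).
T = np.array([[100.0,   0.0],
              [  0.0, 100.0],
              [ 50.0,  50.0]])
u_true = np.array([0.2, -0.1])   # eastward flow in each cell (m/s)
# Motion-integration error of each leg ~ line integral of the flow = T @ u.
errors = T @ u_true
# Inverse problem: recover the cell flows from the observed errors.
u_est, *_ = np.linalg.lstsq(T, errors, rcond=None)
```

With enough crossing trajectories the system becomes well-posed, which is why multiple gliders sampling the same region enable a near-real-time flow map.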
NASA Astrophysics Data System (ADS)
Isotta Cristofori, Elena; Demarchi, Alessandro; Facello, Anna; Cámaro, Walther; Hermosilla, Fernando; López, Jaime
2016-04-01
The study and validation of tidal current patterns relies on the combination of several data sources such as numerical weather prediction models, hydrodynamic models, weather stations, current drifters and remote sensing observations. Assessing the accuracy and reliability of the produced patterns, and communicating the results, including an easy-to-understand visualization of the data, is crucial for a variety of stakeholders, including decision-makers. The wide availability of geospatial equipment such as GPS receivers, current drifters and aerial photogrammetry makes it possible to collect data in the field using mobile and portable devices with relatively limited time and economic resources. These real-time measurements are essential for validating the models, and specifically for assessing the skill of a model during critical environmental conditions. Moreover, the considerable development of remote sensing technologies, cartographic services and GPS applications has enabled the creation of Geographic Information Systems (GIS) capable of storing, analyzing, managing and integrating spatial or geographical information with hydro-meteorological data. This valuable contribution of information and geospatial technologies can benefit many decision-makers, including high-level sport athletes. While the numerical approach commonly used to validate models with in-situ data is familiar to scientific users, high-level sport users are not familiar with numerical representations of data. Therefore, the integration of data collected in the field into a GIS allows an immediate visualization of the performed analysis in geographic maps. This visualization is a particularly effective way to communicate current-pattern assessment results and the uncertainty in the information, increasing the confidence level in the forecast.
The aim of this paper is to present the methodology, set up in collaboration with the Austrian Sailing Federation, for the study of the tidal current patterns of Guanabara Bay, venue of the sailing competitions of the Rio 2016 Olympic Games. The methodology relies on the integration into a GIS of a substantial amount of data collected in the field, hydrodynamic model output, cartography and "key signs" visible on the water, and proved particularly useful for simplifying the final information, supporting the learning process and improving decision-making.
Booly: a new data integration platform.
Do, Long H; Esteves, Francisco F; Karten, Harvey J; Bier, Ethan
2010-10-13
Data integration is an escalating problem in bioinformatics. We have developed a web tool and warehousing system, Booly, that features a simple yet flexible data model coupled with the ability to perform powerful comparative analysis, including the use of Boolean logic to merge datasets together, and an integrated aliasing system to decipher differing names of the same gene or protein. Furthermore, Booly features a collaborative sharing system and a public repository so that users can retrieve new datasets while contributors can easily disseminate new content. We illustrate the uses of Booly with several examples including: the versatile creation of homebrew datasets, the integration of heterogeneous data to identify genes useful for comparing avian and mammalian brain architecture, and generation of a list of Food and Drug Administration (FDA) approved drugs with possible alternative disease targets. The Booly paradigm for data storage and analysis should facilitate integration between disparate biological and medical fields and result in novel discoveries that can then be validated experimentally. Booly can be accessed at http://booly.ucsd.edu.
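The two mechanisms the abstract highlights, Boolean merging of datasets and alias resolution for gene names, can be sketched in a few lines. Everything below (the alias table, the gene names, the function names) is illustrative; it is not Booly's actual schema or API:

```python
# Hypothetical alias table mapping synonyms to a canonical gene symbol.
ALIASES = {"p53": "TP53", "Trp53": "TP53", "HER2": "ERBB2", "neu": "ERBB2"}

def canonical(name):
    """Resolve a gene name to its canonical symbol (identity if unknown)."""
    return ALIASES.get(name, name)

def normalize(dataset):
    """Map every entry of a dataset through the alias table."""
    return {canonical(g) for g in dataset}

def boolean_merge(a, b, op="AND"):
    """Combine two datasets with Boolean logic after alias resolution."""
    a, b = normalize(a), normalize(b)
    if op == "AND":
        return a & b      # genes present in both datasets
    if op == "OR":
        return a | b      # union of both datasets
    if op == "NOT":
        return a - b      # genes in a but not in b
    raise ValueError(f"unknown operator: {op}")

drug_targets = {"p53", "ERBB2", "EGFR"}
brain_genes = {"TP53", "neu", "FOXP2"}

print(sorted(boolean_merge(drug_targets, brain_genes, "AND")))  # ['ERBB2', 'TP53']
```

Without the aliasing step, "p53" and "Trp53" would fail to intersect even though they denote the same gene, which is exactly the problem an integrated aliasing system solves.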
Rakotonarivo, O Sarobidy; Schaafsma, Marije; Hockley, Neal
2016-12-01
While discrete choice experiments (DCEs) are increasingly used in the field of environmental valuation, they remain controversial because of their hypothetical nature and the contested reliability and validity of their results. We systematically reviewed evidence on the validity and reliability of environmental DCEs from the past thirteen years (Jan 2003-February 2016). 107 articles met our inclusion criteria. These studies provide limited and mixed evidence of the reliability and validity of DCEs. Valuation results were susceptible to small changes in survey design in 45% of outcomes reporting reliability measures. DCE results were generally consistent with those of other stated preference techniques (convergent validity), but hypothetical bias was common. Evidence supporting theoretical validity (consistency with the assumptions of rational choice theory) was limited. In content validity tests, 2-90% of respondents protested against a feature of the survey, and a considerable proportion found DCEs to be incomprehensible or inconsequential (17-40% and 10-62%, respectively). The DCE remains useful for non-market valuation, but its results should be used with caution. Given the sparse and inconclusive evidence base, we recommend that tests of reliability and validity be more routinely integrated into DCE studies and suggest how this might be achieved. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
The bottom-up approach to integrative validity: a new perspective for program evaluation.
Chen, Huey T
2010-08-01
The Campbellian validity model and the traditional top-down approach to validity have had a profound influence on research and evaluation. That model comprises the concepts of internal and external validity and, within it, the preeminence of internal validity as demonstrated in the top-down approach. Evaluators and researchers have, however, increasingly recognized that over-emphasis on internal validity reduces an evaluation's usefulness and contributes to the gulf between the academic and practical communities regarding interventions. This article examines the limitations of the Campbellian validity model and the top-down approach and provides a comprehensive alternative, known as the integrative validity model for program evaluation. The integrative validity model includes the concept of viable validity, which is predicated on a bottom-up approach to validity. This approach better reflects stakeholders' evaluation views and concerns, makes external validity workable, and is therefore a preferable alternative for the evaluation of health promotion/social betterment programs. The integrative validity model and the bottom-up approach enable evaluators to meet scientific and practical requirements, facilitate advances in external validity, and gain a new perspective on methods. The new perspective also furnishes a balanced view of credible evidence, and offers an alternative perspective for funding. Copyright (c) 2009 Elsevier Ltd. All rights reserved.
FDTD Modeling and Counteraction to Scintillation Effects in the Ionosphere
2014-04-05
NASA Technical Reports Server (NTRS)
Kavaya, Michael J.; Singh, Upendra N.; Koch, Grady J.; Yu, Jirong; Amzajerdian, Farzin; Trieu, Bo C.; Petros, Mulugeta
2006-01-01
A new project, selected in 2005 by NASA's Science Mission Directorate (SMD), under the Instrument Incubator Program (IIP), will be described. The 3-year effort is intended to design, fabricate, and demonstrate a packaged, rugged, compact, space-qualifiable coherent Doppler wind lidar (DWL) transceiver capable of future validation in an aircraft and/or Unmanned Aerial Vehicle (UAV). The state-of-the-art 2-micron coherent DWL breadboard at NASA/LaRC will be engineered and compactly packaged consistent with future aircraft flights. The packaged transceiver will be integrated into a coherent DWL system test bed at LaRC. Atmospheric wind measurements will be made to validate the packaged technology. This will greatly advance the coherent part of the hybrid DWL solution to the need for global tropospheric wind measurements.
Clinical Assessment of Risk Management: an INtegrated Approach (CARMINA).
Tricarico, Pierfrancesco; Tardivo, Stefano; Sotgiu, Giovanni; Moretti, Francesca; Poletti, Piera; Fiore, Alberto; Monturano, Massimo; Mura, Ida; Privitera, Gaetano; Brusaferro, Silvio
2016-08-08
Purpose - The European Union recommendations for patient safety call for shared clinical risk management (CRM) safety standards able to guide organizations in CRM implementation. The purpose of this paper is to develop a self-evaluation tool to measure healthcare organization performance on CRM and guide improvements over time. Design/methodology/approach - A multi-step approach was implemented, including: a systematic literature review; consensus meetings with an expert panel from eight Italian leader organizations to reach agreement on the first version; field testing to assess the instrument's feasibility and flexibility; and a Delphi strategy with a second expert panel for content validation and development of a balanced scoring system. Findings - The self-assessment tool, Clinical Assessment of Risk Management: an INtegrated Approach (CARMINA), includes seven areas (governance, communication, knowledge and skills, safe environment, care processes, adverse event management, learning from experience) and 52 standards. Each standard is evaluated according to four performance levels: minimum; monitoring; outcomes; and improvement actions. The result is a feasible, flexible and valid instrument that can be used across different organizations. Practical implications - The tool allows practitioners to assess their CRM activities against minimum levels, monitor performance, benchmark with other institutions and disseminate results to different stakeholders. Originality/value - The multi-step approach allowed us to identify core minimum CRM levels in a field where no consensus had been reached. Most standards may be easily adopted in other countries.
ERIC Educational Resources Information Center
Lee, Hee-Sun; Liu, Ou Lydia; Linn, Marcia C.
2011-01-01
This study explores measurement of a construct called knowledge integration in science using multiple-choice and explanation items. We use construct and instructional validity evidence to examine the role multiple-choice and explanation items play in measuring students' knowledge integration ability. For construct validity, we analyze item…
28 CFR 25.5 - Validation and data integrity of records in the system.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 28 Judicial Administration 1 2011-07-01 2011-07-01 false Validation and data integrity of records in the system. 25.5 Section 25.5 Judicial Administration DEPARTMENT OF JUSTICE DEPARTMENT OF JUSTICE INFORMATION SYSTEMS The National Instant Criminal Background Check System § 25.5 Validation and data integrity...
28 CFR 25.5 - Validation and data integrity of records in the system.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 28 Judicial Administration 1 2013-07-01 2013-07-01 false Validation and data integrity of records in the system. 25.5 Section 25.5 Judicial Administration DEPARTMENT OF JUSTICE DEPARTMENT OF JUSTICE INFORMATION SYSTEMS The National Instant Criminal Background Check System § 25.5 Validation and data integrity...
28 CFR 25.5 - Validation and data integrity of records in the system.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 28 Judicial Administration 1 2012-07-01 2012-07-01 false Validation and data integrity of records in the system. 25.5 Section 25.5 Judicial Administration DEPARTMENT OF JUSTICE DEPARTMENT OF JUSTICE INFORMATION SYSTEMS The National Instant Criminal Background Check System § 25.5 Validation and data integrity...
28 CFR 25.5 - Validation and data integrity of records in the system.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 28 Judicial Administration 1 2014-07-01 2014-07-01 false Validation and data integrity of records in the system. 25.5 Section 25.5 Judicial Administration DEPARTMENT OF JUSTICE DEPARTMENT OF JUSTICE INFORMATION SYSTEMS The National Instant Criminal Background Check System § 25.5 Validation and data integrity...
NASA Astrophysics Data System (ADS)
Kubina, Stanley J.
1989-09-01
The review of the status of computational electromagnetics by Miller and the exposition by Burke of the developments in one of the more important computer codes in the application of the electric field integral equation method, the Numerical Electromagnetics Code (NEC), coupled with Molinet's summary of progress in techniques based on the Geometrical Theory of Diffraction (GTD), provide a clear perspective on the maturity of the modern discipline of computational electromagnetics and its potential. Audone's exposition of the application to the computation of radar cross-section (RCS) indicates the breadth of practical applications, and his exploitation of modern near-field measurement techniques reminds one of progress in the measurement discipline, which is essential to the validation or calibration of computational modeling methodology when applied to complex structures such as aircraft and ships. The latter monograph also presents some comparisons with computational models. Some of the results presented for scale-model and flight measurements show serious disagreements in the lobe structure that would require detailed examination. This also applies to the radiation patterns obtained by flight measurement compared with those obtained using wire-grid models and integral equation modeling methods. In the examples that follow, an attempt is made to match measurement results over the entire 2 to 30 MHz HF range for antennas on a large patrol aircraft. The problems of validating computer models of HF antennas on a helicopter, and of using computer models to generate radiation-pattern information that cannot be obtained by measurement, are discussed. The use of NEC computer models to analyze top-side ship configurations, where measurement results are not available and only self-validation measures or, at best, comparisons with an alternate GTD computer modeling technique are possible, is also discussed.
NASA Astrophysics Data System (ADS)
Song, Yi; Ma, Mingguo; Li, Xin; Wang, Xufeng
2011-11-01
This research developed a daytime integration method with the help of the Simple Biosphere Model, Version 2 (SiB2). The field observations employed in this study were obtained at the Yingke (YK) oasis super-station, which includes an Automatic Meteorological Station (AMS), an eddy covariance (EC) system and a Soil Moisture and Temperature Measuring System (SMTMS). The station is located in the Heihe River Basin, the second largest inland river basin in China. The remotely sensed data and field observations employed in this study were derived from the Watershed Allied Telemetry Experimental Research (WATER) experiment. Daily variations of the evaporative fraction (EF) at temporal and spatial scales were detected using SiB2. An instantaneous midday EF was calculated from a remote-sensing-based estimation of the surface energy budget, and the assumed invariance of the daytime EF was examined using this instantaneous value. The integration was carried out using the constant-EF method in the intervals with a steady EF; intervals with an inconsistent EF were identified, and the ET in those intervals was integrated separately. Ground-truth validation of land surface ET at the satellite pixel scale was carried out using the measurements of the eddy covariance (EC) system.
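The constant-EF integration step the abstract describes can be sketched as follows: EF = LE/(Rn - G) is assumed steady over the daytime, so a single midday EF (e.g. from a satellite overpass) scales the available energy in each interval to give latent heat flux, hence ET. The numbers and function names below are illustrative assumptions, not WATER/Yingke data:

```python
LAMBDA = 2.45e6  # latent heat of vaporization, J/kg (approx. near 20 C)

def daytime_et(ef_midday, rn, g, dt_s=1800.0):
    """Integrate ET (mm of water) over daytime intervals with a constant EF.

    ef_midday : instantaneous midday evaporative fraction LE / (Rn - G)
    rn, g     : lists of net radiation and soil heat flux samples (W/m^2)
    dt_s      : interval length in seconds (default 30 min)
    """
    et_mm = 0.0
    for rn_i, g_i in zip(rn, g):
        le = ef_midday * max(rn_i - g_i, 0.0)  # latent heat flux, W/m^2
        et_mm += le * dt_s / LAMBDA            # kg/m^2 is equivalent to mm
    return et_mm

# Half-hourly available-energy samples over a short midday window:
rn = [450.0, 500.0, 520.0, 480.0]
g = [60.0, 70.0, 75.0, 65.0]
print(round(daytime_et(0.7, rn, g), 3))  # 0.864
```

Intervals where EF is known to drift would be excluded from this loop and integrated separately, as the abstract indicates.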
NASA Technical Reports Server (NTRS)
Dunagan, Stephen E.; Norman, Thomas R.
1987-01-01
A wind tunnel experiment simulating a steady three-dimensional helicopter rotor blade/vortex interaction is reported. The experimental configuration consisted of a vertical semispan vortex-generating wing, mounted upstream of a horizontal semispan rotor blade airfoil. A three-dimensional laser velocimeter was used to measure the velocity field in the region of the blade. Sectional lift coefficients were calculated by integrating the velocity field to obtain the bound vorticity. Total lift values, obtained by using an internal strain-gauge balance, verified the laser velocimeter data. Parametric variations of vortex strength, rotor blade angle of attack, and vortex position relative to the rotor blade were explored. These data are reported (with attention to experimental limitations) to provide a dataset for the validation of analytical work.
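The lift-from-circulation step described above can be sketched numerically: integrate the measured velocity around a closed contour enclosing the blade section to get the bound circulation, then apply the Kutta-Joukowski relation Cl = 2*Gamma/(U*c). Here a synthetic point-vortex-plus-freestream field stands in for laser-velocimeter data, and all numbers are illustrative:

```python
import math

GAMMA_TRUE = 2.0   # circulation of the synthetic bound vortex, m^2/s
U_INF = 30.0       # freestream speed, m/s
CHORD = 0.2        # blade chord, m

def velocity(x, y):
    """Freestream plus a point vortex at the origin (synthetic data)."""
    r2 = x * x + y * y
    u = U_INF - GAMMA_TRUE / (2 * math.pi) * y / r2
    v = GAMMA_TRUE / (2 * math.pi) * x / r2
    return u, v

def circulation(radius=0.5, n=2000):
    """Line integral of velocity around a circle enclosing the section."""
    gamma = 0.0
    for k in range(n):
        theta = 2 * math.pi * k / n
        x, y = radius * math.cos(theta), radius * math.sin(theta)
        u, v = velocity(x, y)
        # tangential unit vector is (-sin, cos); dl = radius * dtheta
        gamma += (-u * math.sin(theta) + v * math.cos(theta)) * radius * 2 * math.pi / n
    return gamma

cl = 2 * circulation() / (U_INF * CHORD)  # Kutta-Joukowski theorem
print(round(cl, 4))  # 0.6667
```

The recovered circulation matches the vortex strength (the freestream contributes nothing to a closed-contour integral), which is the principle that lets a measured velocity field yield sectional lift.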
NASA Astrophysics Data System (ADS)
Astuti, Sri Rejeki Dwi; Suyanta, LFX, Endang Widjajanti; Rohaeti, Eli
2017-05-01
The demands on assessment in the learning process have been affected by policy changes. Nowadays, assessment emphasizes not only knowledge but also skills and attitudes; in reality, however, there are many obstacles to measuring them. This paper aimed to describe how an integrated assessment instrument was developed and to verify its validity, namely content validity and construct validity. The instrument development used the test development model of McIntire, and development data were acquired at each test development step. The initial product was reviewed by three peer reviewers and six expert judges (two subject-matter experts, two evaluation experts and two chemistry teachers) to establish content validity. The research involved 376 first-grade students of two senior high schools in Bantul Regency to establish construct validity. Content validity was analyzed using Aiken's formula, and construct validity was verified by exploratory factor analysis using SPSS ver. 16.0. The results show that all constructs in the integrated assessment instrument are valid in terms of both content validity and construct validity. The integrated assessment instrument is therefore suitable for measuring the critical thinking abilities and science process skills of senior high school students on electrolyte solution matter.
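Aiken's formula, named as the content-validity analysis, is V = sum(s) / (n*(c-1)) with s = r - lo, where r is each rater's score, lo the lowest scale point, c the number of scale points, and n the number of raters. A minimal sketch, with illustrative ratings rather than the study's data:

```python
def aikens_v(ratings, lo=1, c=5):
    """Aiken's V for one item; ranges from 0 (no validity) to 1."""
    n = len(ratings)
    s = sum(r - lo for r in ratings)
    return s / (n * (c - 1))

# Nine hypothetical judgments on a 1-5 scale (three peer reviewers plus
# six expert judges, mirroring the panel sizes in the abstract):
ratings = [5, 4, 5, 4, 5, 5, 4, 4, 5]
print(round(aikens_v(ratings), 3))  # 0.889
```

An item's V is then compared against a critical value (which depends on n and c) to decide whether the item is content-valid.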
The sound field of a rotating dipole in a plug flow.
Wang, Zhao-Huan; Belyaev, Ivan V; Zhang, Xiao-Zheng; Bi, Chuan-Xing; Faranosov, Georgy A; Dowell, Earl H
2018-04-01
An analytical far field solution for a rotating point dipole source in a plug flow is derived. The shear layer of the jet is modelled as an infinitely thin cylindrical vortex sheet and the far field integral is calculated by the stationary phase method. Four numerical tests are performed to validate the derived solution as well as to assess the effects of sound refraction from the shear layer. First, the calculated results using the derived formulations are compared with the known solution for a rotating dipole in a uniform flow to validate the present model in this fundamental test case. After that, the effects of sound refraction for different rotating dipole sources in the plug flow are assessed. Then the refraction effects on different frequency components of the signal at the observer position, as well as the effects of the motion of the source and of the type of source are considered. Finally, the effect of different sound speeds and densities outside and inside the plug flow is investigated. The solution obtained may be of particular interest for propeller and rotor noise measurements in open jet anechoic wind tunnels.
Beam-tracing model for predicting sound fields in rooms with multilayer bounding surfaces
NASA Astrophysics Data System (ADS)
Wareing, Andrew; Hodgson, Murray
2005-10-01
This paper presents the development of a wave-based room-prediction model for predicting steady-state sound fields in empty rooms with specularly reflecting, multilayer surfaces. It involves a triangular beam-tracing model with phase and a transfer-matrix approach to model the surfaces. Room surfaces were modeled as multilayers of fluid, solid, or porous materials; Biot theory was used in the transfer-matrix formulation of the porous layer. The new model consists of the transfer-matrix model integrated into the beam-tracing algorithm. The transfer-matrix model was validated by comparing predictions with theory and with experiment. The test surfaces were a glass plate, double drywall panels, double steel panels, a carpeted floor, and a suspended acoustical ceiling. The beam-tracing model was validated in the cases of three idealized room configurations (a small office, a corridor, and a small industrial workroom) with simple boundary conditions. The number of beams, the reflection order, and the frequency resolution required to obtain accurate results were investigated. Beam-tracing predictions were compared with those of a method-of-images model with phase. The model will be used to study sound fields in rooms with local- or extended-reaction multilayer surfaces.
Multiplexed protein measurement: technologies and applications of protein and antibody arrays
Kingsmore, Stephen F.
2006-01-01
The ability to measure the abundance of many proteins precisely and simultaneously in experimental samples is an important, recent advance for static and dynamic, as well as descriptive and predictive, biological research. The value of multiplexed protein measurement is being established in applications such as comprehensive proteomic surveys, studies of protein networks and pathways, validation of genomic discoveries and clinical biomarker development. As standards do not yet exist that bridge all of these applications, the current recommended best practice for validation of results is to approach study design in an iterative process and to integrate data from several measurement technologies. This review describes current and emerging multiplexed protein measurement technologies and their applications, and discusses the remaining challenges in this field. PMID:16582876
I -Love- Q relations for white dwarf stars
NASA Astrophysics Data System (ADS)
Boshkayev, K.; Quevedo, H.; Zhami, B.
2017-02-01
We investigate the equilibrium configurations of uniformly rotating white dwarfs, using Chandrasekhar and Salpeter equations of state in the framework of Newtonian physics. The Hartle formalism is applied to integrate the field equation together with the hydrostatic equilibrium condition. We consider the equations of structure up to the second order in the angular velocity, and compute all basic parameters of rotating white dwarfs to test the so-called moment of inertia, rotational Love number, and quadrupole moment (I-Love-Q) relations. We found that the I-Love-Q relations are also valid for white dwarfs regardless of the equation of state and nuclear composition. In addition, we show that the moment of inertia, quadrupole moment, and eccentricity (I-Q-e) relations are valid as well.
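The non-rotating background that the Hartle formalism perturbs can be sketched by integrating the Newtonian structure equations, dm/dr = 4*pi*r^2*rho and dP/dr = -G*m*rho/r^2. As a stand-in for the Chandrasekhar equation of state, the sketch below uses its low-density polytropic limit P = K*rho^(5/3) with an assumed K for mean molecular weight per electron mu_e = 2; the paper's full EOS and the second-order rotational corrections are not reproduced:

```python
import math

G = 6.674e-8   # gravitational constant, CGS
K = 3.166e12   # assumed polytropic constant for mu_e = 2, CGS

def white_dwarf(rho_c, dr=1e6):
    """Euler-integrate dm/dr = 4 pi r^2 rho and dP/dr = -G m rho / r^2."""
    r = dr
    m = 4.0 / 3.0 * math.pi * dr ** 3 * rho_c   # innermost mass shell
    rho = rho_c
    p = K * rho ** (5.0 / 3.0)
    while rho > 1.0:                # stop near the low-density surface
        p -= G * m * rho / r ** 2 * dr
        if p <= 0.0:
            break
        rho = (p / K) ** 0.6        # invert P = K rho^(5/3)
        m += 4.0 * math.pi * r ** 2 * rho * dr
        r += dr
    return r, m

radius_cm, mass_g = white_dwarf(1e6)  # central density 1e6 g/cm^3
print(f"R = {radius_cm / 1e5:.0f} km, M = {mass_g / 1.989e33:.2f} Msun")
```

Repeating this over a range of central densities gives the mass-radius sequence from which the moment of inertia and the I-Love-Q quantities are then computed in the rotating case.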
Williams, Jessica A R; Nelson, Candace C; Cabán-Martinez, Alberto J; Katz, Jeffrey N; Wagner, Gregory R; Pronk, Nicolaas P; Sorensen, Glorian; McLellan, Deborah L
2015-09-01
To conduct validation analyses for a new measure of the integration of worksite health protection and health promotion approaches developed in earlier research. A survey of small- to medium-sized employers located in the United States was conducted between October 2013 and March 2014 (n = 111). Cronbach α coefficient was used to assess reliability, and Pearson correlation coefficients were used to assess convergent validity. The integration score was positively associated with the measures of occupational safety and health and health promotion activities/policies-supporting its convergent validity (Pearson correlation coefficients of 0.32 to 0.47). Cronbach α coefficient was 0.94, indicating excellent reliability. The integration score seems to be a promising tool for assessing integration of health promotion and health protection. Further work is needed to test its dimensionality and validate its use in other samples.
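The two statistics reported, Cronbach's alpha for reliability and Pearson's r for convergent validity, can be sketched directly. The item scores below are illustrative, not the survey's data (the study's n was 111):

```python
import statistics

def cronbach_alpha(items):
    """Cronbach's alpha; items is a list of per-item score lists."""
    k = len(items)
    item_vars = sum(statistics.variance(it) for it in items)
    totals = [sum(scores) for scores in zip(*items)]  # per-respondent total
    return k / (k - 1) * (1 - item_vars / statistics.variance(totals))

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient."""
    mx, my = statistics.mean(x), statistics.mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

# Five hypothetical respondents, three integration-score items:
items = [[4, 5, 3, 4, 5], [4, 4, 3, 5, 5], [5, 5, 2, 4, 4]]
safety = [3, 4, 2, 4, 5]  # a health-protection activity score

print(round(cronbach_alpha(items), 2), round(pearson_r(items[0], safety), 2))  # 0.81 0.89
```

Note that `statistics.variance` is the sample variance; alpha values near the study's 0.94 indicate that the items covary strongly relative to the total-score variance.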
NASA Astrophysics Data System (ADS)
Widodo, W.; Sudibyo, E.; Sari, D. A. P.
2018-04-01
This study aims to develop student worksheets for higher education that apply integrated science learning to issues about motion in humans. The worksheets guide students to solve problems about human movement; students must integrate their knowledge of biology, physics, and chemistry to do so. The worksheet was validated by three experts in integrated natural science, especially on the human movement topic. The aspects of the validation were the feasibility of the content, the construction, and the language. The research used a Likert scale to measure the validity of each aspect: 4.00 for very good, 3.00 for good, 2.00 for moderate, and 1.00 for poor validity criteria. The data showed that the validity of each aspect was in the range of good to very good (3.33 to 3.67 for the content aspect, 2.33 to 4.00 for the construction aspect, and 3.33 to 4.00 for the language aspect), although one part of the construction aspect needed improvement. Overall, the students' worksheet can be applied in the classroom after some revisions based on suggestions from the validators.
Formal verification and testing: An integrated approach to validating Ada programs
NASA Technical Reports Server (NTRS)
Cohen, Norman H.
1986-01-01
An integrated set of tools called a validation environment is proposed to support the validation of Ada programs by a combination of methods. A Modular Ada Validation Environment (MAVEN) is described which proposes a context in which formal verification can fit into the industrial development of Ada software.
GPS Water Vapor Tomography Based on Accurate Estimations of the GPS Tropospheric Parameters
NASA Astrophysics Data System (ADS)
Champollion, C.; Masson, F.; Bock, O.; Bouin, M.; Walpersdorf, A.; Doerflinger, E.; van Baelen, J.; Brenot, H.
2003-12-01
The Global Positioning System (GPS) is now a common technique for the retrieval of zenith integrated water vapor (IWV). Further applications in meteorology also need slant integrated water vapor (SIWV), which makes it possible to characterize precisely the high variability of tropospheric water vapor at different temporal and spatial scales. Only precise estimations of IWV and horizontal gradients allow the estimation of accurate SIWV. We present studies developed to improve the estimation of tropospheric water vapor from GPS data. Results are obtained from several field experiments (MAP, ESCOMPTE, OHM-CV, IHOP, ...). First, IWV is estimated using different GPS processing strategies and the results are compared to radiosondes. The role of the reference frame and of the a priori constraints on the coordinates of the fiducial and local stations is generally underestimated; it seems to be of first order in the estimation of the IWV. Second, we validate the estimated horizontal gradients by comparing zenith delay gradients and single-site gradients. IWV, gradients and post-fit residuals are used to construct slant integrated water delays. Validation of the SIWV is in progress, comparing GPS SIWV, lidar measurements and high-resolution meteorological models (Meso-NH). A careful analysis of the post-fit residuals is needed to separate the tropospheric signal from multipath. The slant tropospheric delays are used to study the 3D heterogeneity of the troposphere. We have developed tomographic software to model the three-dimensional distribution of tropospheric water vapor from GPS data. The software is applied to the ESCOMPTE field experiment, a dense network of 17 dual-frequency GPS receivers operated in southern France. Three inversions have been successfully compared to three successive radiosonde launches. Good resolution is obtained up to heights of 3000 m.
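The way the abstract assembles slant quantities, zenith wet delay (ZWD) plus horizontal gradients plus post-fit residuals, mapped to the line of sight, can be sketched as follows. The simple 1/sin(e) mapping and the conversion factor are textbook approximations, not the paper's actual processing (which would use dedicated mapping functions):

```python
import math

PI_FACTOR = 0.15  # assumed kg/m^2 per mm of wet delay (roughly 1/6.5)

def slant_wet_delay(zwd_mm, grad_n_mm, grad_e_mm, elev_deg, azim_deg, resid_mm=0.0):
    """Slant wet delay (mm) from zenith estimates, gradients and residual."""
    e = math.radians(elev_deg)
    a = math.radians(azim_deg)
    m_wet = 1.0 / math.sin(e)                   # crude wet mapping function
    m_grad = 1.0 / (math.sin(e) * math.tan(e))  # crude gradient mapping
    return (m_wet * zwd_mm
            + m_grad * (grad_n_mm * math.cos(a) + grad_e_mm * math.sin(a))
            + resid_mm)

def siwv(zwd_mm, grad_n_mm, grad_e_mm, elev_deg, azim_deg, resid_mm=0.0):
    """Slant integrated water vapor in kg/m^2."""
    return PI_FACTOR * slant_wet_delay(zwd_mm, grad_n_mm, grad_e_mm,
                                       elev_deg, azim_deg, resid_mm)

# A 150 mm ZWD seen at 30 deg elevation, 45 deg azimuth, with assumed
# 1 mm north and 0.5 mm east delay gradients:
print(round(siwv(150.0, 1.0, 0.5, 30.0, 45.0), 1))
```

Collecting such slant values from many satellite-receiver pairs is what feeds the tomographic inversion for the 3D water vapor field.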
Development of a Trans-disciplinary Intervention Module for Adolescent Girls on Self-awareness.
John, Jasmine Mary; Navneetham, Janardhan; Nagendra, H R
2017-08-01
Mental health promotion among adolescents has been a key area of intervention for professionals working with children and adolescents. The opinions of experts in the field of mental health were taken to frame a trans-disciplinary intervention for adolescent girls on self-awareness. The aim was to discuss the development and validation of a structured intervention combining knowledge from different disciplines to help adolescents enhance self-awareness. Both qualitative and quantitative methodologies were followed for the development and validation of the module. The first phase was the framing of the intervention module after conducting in-depth interviews with experts in both the mental health and yoga fields. Six experts each from the mental health and yoga fields were chosen for interview through convenience sampling, and validated interview guides were used for the process. The framed intervention module was given to six mental health experts and six yoga experts for content validation; the experts rated the usefulness of the intervention on a scale of 0-4 (4 = extremely helpful). The themes derived from the interviews were the importance of self-awareness, autonomy of self, the physical level of self-understanding, self-regulation of emotions and self-monitoring. The interviews were consolidated to frame an intervention module consisting of eight sessions with two parts each: part one comprises activities and interactions on mental health, and part two comprises guided instructions for body-focused meditation. Sessions were finalized with ratings and suggestions from the experts. The final version of the module was pilot tested and found to enhance self-awareness among adolescent girls. The integration of multiple disciplines brought novel perspectives into the intervention.
Developing smartphone apps for behavioural studies: The AlcoRisk app case study.
Smith, Anthony; de Salas, Kristy; Lewis, Ian; Schüz, Benjamin
2017-08-01
Smartphone apps have emerged as valuable research tools to sample human behaviours at their time of occurrence within natural environments. Human behaviour sampling methods, such as Ecological Momentary Assessment (EMA), aim to facilitate research that is situated in ecologically valid real-world environments rather than laboratory environments. Researchers have trialled a range of EMA smartphone apps to sample human behaviours such as dieting, physical activity and smoking. Software development processes for EMA smartphone apps, however, are not widely documented, and little guidance is provided for the integration of complex multidisciplinary behavioural and technical fields. In this paper, the AlcoRisk app for studying alcohol consumption and risk-taking tendencies is presented alongside a software development process that integrates these multidisciplinary fields. The software development process consists of three stages: requirements analysis; feature and interface design; and app implementation. Results from a preliminary feasibility study support the efficacy of the AlcoRisk app's software development process. Copyright © 2017 Elsevier Inc. All rights reserved.
A fast analytical undulator model for realistic high-energy FEL simulations
NASA Astrophysics Data System (ADS)
Tatchyn, R.; Cremer, T.
1997-02-01
A number of leading FEL simulation codes used for modeling gain in the ultralong undulators required for SASE saturation in the <100 Å range employ simplified analytical models both for field and error representations. Although it is recognized that both the practical and theoretical validity of such codes could be enhanced by incorporating realistic undulator field calculations, the computational cost of doing this can be prohibitive, especially for point-to-point integration of the equations of motion through each undulator period. In this paper we describe a simple analytical model suitable for modeling realistic permanent magnet (PM), hybrid/PM, and non-PM undulator structures, and discuss selected techniques for minimizing computation time.
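The kind of simplified analytical model the abstract refers to starts from the ideal planar-undulator on-axis field B_y(z) = B0*cos(k_u z), whose strength is summarized by the deflection parameter K = e*B0*lambda_u/(2*pi*m_e*c) (about 0.934*B0[T]*lambda_u[cm]). A minimal sketch of K and the resulting on-axis resonant wavelength follows; the beam and undulator numbers are illustrative, and none of the paper's error models or off-axis terms are reproduced:

```python
import math

E_CHARGE = 1.602e-19  # electron charge, C
M_E = 9.109e-31       # electron mass, kg
C_LIGHT = 2.998e8     # speed of light, m/s

def k_parameter(b0_tesla, lambda_u_m):
    """Dimensionless undulator deflection parameter K."""
    return E_CHARGE * b0_tesla * lambda_u_m / (2 * math.pi * M_E * C_LIGHT)

def fel_wavelength(lambda_u_m, gamma, k):
    """On-axis resonant wavelength of the FEL fundamental."""
    return lambda_u_m / (2 * gamma ** 2) * (1 + k ** 2 / 2)

k = k_parameter(1.0, 0.03)             # assumed 1 T peak field, 3 cm period
lam = fel_wavelength(0.03, 28000, k)   # assumed gamma ~ 28000 (~14 GeV beam)
print(round(k, 2), f"{lam * 1e10:.1f} Angstrom")
```

With these assumed parameters the fundamental lands near 1 Angstrom, i.e. within the sub-100 Angstrom SASE regime the abstract targets; realistic field-error terms perturb the electron trajectory and hence the gain around this ideal case.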
Imagination and society: the role of visual sociology.
Cipriani, Roberto; Del Re, Emanuela C
2012-10-01
The paper presents the field of Visual Sociology as an approach that makes use of photographs, films, documentaries and videos to capture and assess aspects of social life and social signals. It overviews some relevant works in the field and deals with methodological and epistemological issues, raising the question of the relation between the observer and the observed; it also refers to some methods of analysis, such as those proposed by Grounded Theory, and to connected tools for automatic qualitative analysis, like NVivo. The relevance of visual sociology to the study of social signals lies in the fact that it can validly integrate the information, introducing a multi-modal approach to the analysis of social signals.
Creating and field-testing diagnostic criteria for partner and child maltreatment.
Heyman, Richard E; Smith Slep, Amy M
2006-09-01
An integrated set of diagnostic criteria for partner abuse and child abuse and neglect was developed and tested in 4 studies conducted with a branch of America's largest family maltreatment protection agency (i.e., the U.S. military's Family Advocacy Program). The maltreatment criteria then in force were found to have adequate content validity, but experts' and users' feedback indicated ambiguities and poorly specified criteria that undermined reliable application. Criteria incorporating elements of the best existing civilian and military operationalizations were developed and evaluated in two field trials. The final definitions were found to support very high levels of agreement (92%) between base adjudicating committees and master reviewers. Copyright (c) 2006 APA, all rights reserved.
Compact, Engineered 2-Micron Coherent Doppler Wind Lidar Prototype for Field and Airborne Evaluation
NASA Technical Reports Server (NTRS)
Kavaya, Michael J.; Amzajerdian, Farzin; Koch, Grady J.
2006-01-01
The state-of-the-art 2-micron coherent Doppler wind lidar breadboard at NASA/LaRC will be engineered and compactly packaged consistent with future aircraft flights. The packaged transceiver will be integrated into a coherent Doppler wind lidar system test bed at LaRC. Atmospheric wind measurements will be made to validate the packaged technology. This will greatly advance the coherent part of the hybrid Doppler wind lidar solution to the need for global tropospheric wind measurements.
NASA Technical Reports Server (NTRS)
Abbott, Mark R.
1996-01-01
The objectives of the last six months were: (1) complete the sensitivity analysis of fluorescence line height algorithms; (2) deliver fluorescence algorithm code and test data to the University of Miami for integration; (3) complete analysis of bio-optical data from the Southern Ocean cruise; (4) conduct laboratory experiments based on analyses of field data; (5) analyze data from the bio-optical mooring off Hawaii; (6) develop a calibration/validation plan for MODIS fluorescence data; (7) respond to the Japanese Research Announcement for GLI; and (8) continue to review plans for EOSDIS and assist the ECS contractor.
Mingus Discontinuous Multiphysics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pat Notz, Dan Turner
Mingus provides hybrid coupled local/non-local mechanics analysis capabilities that extend several traditional methods to applications with inherent discontinuities. Its primary features include adaptations of solid mechanics, fluid dynamics and digital image correlation that naturally accommodate disjointed data or irregular solution fields by assimilating a variety of discretizations (such as control volume finite elements, peridynamics and meshless control point clouds). The goal of this software is to provide an analysis framework for multiphysics engineering problems with an integrated image correlation capability that can be used for experimental validation and model calibration.
Wave excitation at Lindblad resonances using the method of multiple scales
NASA Astrophysics Data System (ADS)
Horák, Jiří
2017-12-01
In this note, the method of multiple scales is adapted to the problem of excitation of non-axisymmetric acoustic waves in a vertically integrated disk by tidal gravitational fields. We derive a formula describing the waveform of the excited wave that is uniformly valid in the whole disk as long as only a single Lindblad resonance is present. Our formalism is subsequently applied to two classical problems: trapped p-mode oscillations in relativistic accretion disks and the excitation of waves in infinite disks.
NASA Astrophysics Data System (ADS)
Deng, Baoqing; Si, Yinbing; Wang, Jia
2017-12-01
Transient storage may vary along a stream due to stream hydraulic conditions and the characteristics of the storage zones. Analytical solutions of transient storage models in the literature do not cover spatially non-uniform storage. A novel integral transform strategy is presented that simultaneously transforms the concentrations in the stream and in the storage zones using a single set of eigenfunctions derived from the advection-diffusion equation of the stream. The semi-analytical solution of the multiple-zone transient storage model with spatially non-uniform storage is obtained by applying the generalized integral transform technique to all partial differential equations in the model. The derived semi-analytical solution is validated against field data from the literature, and good agreement between the computed data and the field data is obtained. Some illustrative examples are formulated to demonstrate applications of the present solution. It is shown that solute transport can be greatly affected by variation of the mass exchange coefficient and the ratio of cross-sectional areas. When the ratio of cross-sectional areas is large or the mass exchange coefficient is small, more reaches are recommended for calibrating the parameters.
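The governing equations behind such models (stream advection-diffusion plus first-order exchange with a storage zone) can be sketched numerically. Below is a minimal explicit finite-difference step for a one-zone model with a spatially varying exchange coefficient; the scheme, periodic boundaries, and parameters are illustrative only and are not the paper's integral-transform method:

```python
import numpy as np

def transient_storage_step(C, Cs, u, D, alpha, area_ratio, dx, dt):
    """One explicit step of a one-zone transient storage model:
      dC/dt  = -u dC/dx + D d2C/dx2 + alpha * (Cs - C)
      dCs/dt = alpha * (A/As) * (C - Cs)
    `alpha` may vary along the reach (spatially non-uniform storage).
    Periodic boundaries via np.roll, for brevity of the sketch."""
    adv = -u * (C - np.roll(C, 1)) / dx                     # first-order upwind
    dif = D * (np.roll(C, -1) - 2 * C + np.roll(C, 1)) / dx**2
    exch = alpha * (Cs - C)
    C_new = C + dt * (adv + dif + exch)
    Cs_new = Cs + dt * alpha * area_ratio * (C - Cs)
    return C_new, Cs_new
```

With periodic boundaries the quantity A*C + As*Cs is conserved exactly, which is a convenient sanity check on the scheme.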
A Numerical Method of Calculating Propeller Noise Including Acoustic Nonlinear Effects
NASA Technical Reports Server (NTRS)
Korkan, K. D.
1985-01-01
Using the transonic flow field(s) generated by the NASPROP-E computer code for an eight-blade SR3-series propeller, a theoretical method is investigated to calculate the total noise values and frequency content in the acoustic near and far fields without using the Ffowcs Williams-Hawkings equation. The flow field is numerically generated using an implicit three-dimensional Euler equation solver in weak conservation law form. Numerical damping is required by the differencing method for stability in three dimensions, and the influence of the damping on the calculated acoustic values is investigated. The acoustic near field is solved by integrating with respect to time the pressure oscillations induced at a stationary observer location. The acoustic far field is calculated from the near-field primitive variables generated by the NASPROP-E code using a method involving a perturbation velocity potential, as suggested by Hawkings, to obtain the acoustic pressure time history at a specified far-field observer location. The methodologies described are valid for calculating total noise levels and are applicable to any propeller geometry for which a flow field solution is available.
The GPM Ground Validation Program: Pre to Post-Launch
NASA Astrophysics Data System (ADS)
Petersen, W. A.
2014-12-01
NASA GPM Ground Validation (GV) activities have transitioned from the pre- to the post-launch era. Prior to launch, direct validation networks and associated partner institutions were identified world-wide, covering a plethora of precipitation regimes. In the U.S., direct GV efforts focused on use of new operational products such as the NOAA Multi-Radar Multi-Sensor suite (MRMS) for TRMM validation and GPM radiometer algorithm database development. In the post-launch era, MRMS products including precipitation rate, types and data quality are being routinely generated to facilitate statistical GV of instantaneous and merged GPM products. To assess precipitation column impacts on product uncertainties, range-gate to pixel-level validation of both Dual-Frequency Precipitation Radar (DPR) and GPM Microwave Imager data is performed using GPM Validation Network (VN) ground radar and satellite data processing software. VN software ingests quality-controlled volumetric radar datasets and geo-matches those data to coincident DPR and radiometer level-II data. When combined, MRMS and VN datasets enable more comprehensive interpretation of ground-satellite estimation uncertainties. To support physical validation efforts, eight field campaigns were conducted in the pre-launch era and one in the post-launch era. The campaigns span regimes from northern-latitude cold-season snow to warm tropical rain. Most recently, the Integrated Precipitation and Hydrology Experiment (IPHEx) took place in the mountains of North Carolina and involved combined airborne and ground-based measurements of orographic precipitation and hydrologic processes underneath the GPM Core satellite. One more U.S. GV field campaign (OLYMPEX) is planned for late 2015 and will address cold-season precipitation estimation, process and hydrology in the orographic and oceanic domains of western Washington State.
Finally, continuous direct and physical validation measurements are also being conducted at the NASA Wallops Flight Facility multi-radar, gauge and disdrometer facility located in coastal Virginia. This presentation will summarize the evolution of the NASA GPM GV program from pre to post-launch eras and highlight early evaluations of GPM satellite datasets.
NASA Astrophysics Data System (ADS)
Andrades-Filho, Clódis de Oliveira; Rossetti, Dilce de Fátima; Bezerra, Francisco Hilario Rego; Medeiros, Walter Eugênio; Valeriano, Márcio de Morisson; Cremon, Édipo Henrique; Oliveira, Roberto Gusmão de
2014-12-01
Neogene and late Quaternary sedimentary deposits corresponding respectively to the Barreiras Formation and Post-Barreiras Sediments are abundant along the Brazilian coast. Such deposits are valuable for reconstructing sea level fluctuations and recording tectonic reactivation along the passive margin of South America. Despite this relevance, much effort remains to be invested in discriminating these units in their various areas of occurrence. The main objective of this work is to develop and test a new methodology for semi-automated mapping of Neogene and late Quaternary sedimentary deposits in northeastern Brazil, integrating geophysical and remote sensing data. The central onshore Paraíba Basin was selected due to the recent availability of a detailed map based on the integration of surface and subsurface geological data. We used airborne gamma-ray spectrometry (i.e., potassium (K) and thorium (Th) concentrations) and morphometric data (i.e., relief dissection, slope and elevation) extracted from the digital elevation model (DEM) generated by the Shuttle Radar Topography Mission (SRTM). The procedures included: (a) data integration using geographic information systems (GIS); (b) exploratory statistical analyses, including the definition of parameters and thresholds for class discrimination for a set of sample plots; and (c) development and application of a decision-tree classification. Data validation was based on: (i) statistical analysis of geochemical and airborne gamma-ray spectrometry data consisting of K and Th concentrations; and (ii) map validation with the support of a confusion matrix, overall accuracy, and quantity disagreement and allocation disagreement for accuracy assessment based on field points. The concentration of K successfully separated the sedimentary units of the basin from Precambrian basement rocks. The relief-dissection morphometric variable allowed discrimination between the Barreiras Formation and the Post-Barreiras Sediments. 
In addition, two units of the latter (i.e., PB1 and PB2) previously mapped in the field were promptly separated based on Th concentration. A regression analysis indicated that the relationship between geophysical and geochemical values obtained for the PB1, PB2 and Barreiras Formation is significant (R-squared = 0.91; p-value <0.05). Map validation presented a high overall accuracy of 84%, with a coefficient of quantity disagreement of 12% and a coefficient of allocation disagreement of 8%. These results indicate that the methodology applied in the central onshore Paraíba Basin can be successfully used for mapping the Barreiras Formation and Post-Barreiras Sediments in other areas of the Brazilian coast. The ability to rapidly and precisely map these units using such methodology could reveal their geographic distribution along the northeastern coast of Brazil.
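The decision-tree classification described above amounts to a small set of threshold rules on K, Th, and relief dissection. A sketch follows; every threshold value below is a hypothetical placeholder, since the actual values were derived from exploratory statistics on the study's sample plots:

```python
def classify_unit(k_conc, th_conc, relief_dissection,
                  k_thresh=1.5, th_thresh=8.0, rd_thresh=0.35):
    """Decision-tree style classification of mapping units.
    Thresholds are illustrative placeholders, not the study's values."""
    if k_conc > k_thresh:
        return "Precambrian basement"        # high K separates basement rocks
    if relief_dissection > rd_thresh:
        return "Barreiras Formation"         # more dissected relief
    # Post-Barreiras Sediments: Th concentration separates the two subunits
    return "Post-Barreiras PB2" if th_conc > th_thresh else "Post-Barreiras PB1"
```

In a GIS workflow the same rules would be applied per pixel to the co-registered gamma-ray and SRTM-derived rasters.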
Mini-UAV based sensory system for measuring environmental variables in greenhouses.
Roldán, Juan Jesús; Joossen, Guillaume; Sanz, David; del Cerro, Jaime; Barrientos, Antonio
2015-02-02
This paper describes the design, construction and validation of a mobile sensory platform for greenhouse monitoring. The complete system consists of a sensory system on board a small quadrotor (i.e., a four rotor mini-UAV). The goals of this system include taking measures of temperature, humidity, luminosity and CO2 concentration and plotting maps of these variables. These features could potentially allow for climate control, crop monitoring or failure detection (e.g., a break in a plastic cover). The sensors have been selected by considering the climate and plant growth models and the requirements for their integration onboard the quadrotor. The sensors layout and placement have been determined through a study of quadrotor aerodynamics and the influence of the airflows from its rotors. All components of the system have been developed, integrated and tested through a set of field experiments in a real greenhouse. The primary contributions of this paper are the validation of the quadrotor as a platform for measuring environmental variables and the determination of the optimal location of sensors on a quadrotor.
Satellite-driven modeling approach for monitoring lava flow hazards during the 2017 Etna eruption
NASA Astrophysics Data System (ADS)
Del Negro, C.; Bilotta, G.; Cappello, A.; Ganci, G.; Herault, A.; Zago, V.
2017-12-01
The integration of satellite data and modeling represents an efficient strategy that may provide immediate answers to the main issues raised at the onset of a new effusive eruption. Satellite-based thermal remote sensing of hotspots related to effusive activity can effectively provide a variety of products suited to timing, locating, and tracking the radiant character of lava flows. Hotspots show the location and occurrence of eruptive events (vents). Discharge rate estimates may indicate the current intensity (effusion rate) and potential magnitude (volume). High-spatial-resolution multispectral satellite data can complement field observations for monitoring the front position (length) and extension of flows (area). Physics-based models driven, or validated, by satellite-derived parameters are now capable of fast and accurate forecasts of lava flow inundation scenarios (hazard). Here, we demonstrate the potential of the integrated application of satellite remote-sensing techniques and lava flow models during the 2017 effusive eruption at Mount Etna in Italy. This combined approach provided insights into lava flow field evolution by supplying detailed views of flow field construction (e.g., the opening of ephemeral vents) that were useful for more accurate and reliable forecasts of eruptive activity. Moreover, we gave a detailed chronology of the lava flow activity based on field observations and satellite images, assessed the potential extent of impacted areas, mapped the evolution of the lava flow field, and executed hazard projections. The downside of this combination is the high sensitivity of lava flow inundation scenarios to uncertainties in vent location, discharge rate, and other parameters, which can make interpreting hazard forecasts difficult during an effusive crisis. However, such integration at last makes timely forecasts of lava flow hazards during effusive crises possible at the great majority of volcanoes for which no monitoring exists.
Integrative Biological Analysis For Neuropsychopharmacology
Emmett, Mark R; Kroes, Roger A; Moskal, Joseph R; Conrad, Charles A; Priebe, Waldemar; Laezza, Fernanda; Meyer-Baese, Anke; Nilsson, Carol L
2014-01-01
Although advances in psychotherapy have been made in recent years, drug discovery for brain diseases such as schizophrenia and mood disorders has stagnated. The need for new biomarkers and validated therapeutic targets in the field of neuropsychopharmacology is widely unmet. The brain is the most complex part of human anatomy from the standpoint of the number and types of cells, their interconnections, and circuitry. To better meet patient needs, improved methods to approach brain studies by understanding functional networks that interact with the genome are being developed. The integrated biological approaches (proteomics, transcriptomics, metabolomics, and glycomics) have a strong record in several areas of biomedicine, including neurochemistry and neuro-oncology. Published applications of an integrated approach to projects of neurological, psychiatric, and pharmacological natures are still few but show promise to provide deep biological knowledge derived from cells, animal models, and clinical materials. Future studies that yield insights based on integrated analyses promise to deliver new therapeutic targets and biomarkers for personalized medicine. PMID:23800968
Integrating legacy medical data sensors in a wireless network infrastructure.
Dembeyiotis, S; Konnis, G; Koutsouris, D
2005-01-01
In the process of developing a wireless networking solution to provide effective field-deployable communications and telemetry support for rescuers during major natural disasters, we are faced with the task of interfacing the multitude of medical and other legacy data collection sensors to the network grid. In this paper, we detail a number of solutions, with particular attention given to the issue of data security. The chosen implementation allows for sensor control and management from remote network locations, while the sensors can wirelessly transmit their data to nearby network nodes securely, utilizing the latest commercially available cryptography solutions. Initial testing validates the design choices, while the network-enabled sensors are being integrated in the overall wireless network security framework.
Predicting Flory-Huggins χ from Simulations
NASA Astrophysics Data System (ADS)
Zhang, Wenlin; Gomez, Enrique D.; Milner, Scott T.
2017-07-01
We introduce a method, based on a novel thermodynamic integration scheme, to extract Flory-Huggins χ parameters as small as 10^-3 kBT for polymer blends from molecular dynamics (MD) simulations. We obtain χ for the archetypical coarse-grained model of nonpolar polymer blends: flexible bead-spring chains with different Lennard-Jones interactions between A and B monomers. Using these χ values and a lattice version of self-consistent field theory (SCFT), we predict the shape of planar interfaces for phase-separated binary blends. Our SCFT results agree with MD simulations, validating both the predicted χ values and our thermodynamic integration method. Combined with atomistic simulations, our method can be applied to predict χ for new polymers from their chemical structures.
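At its core, thermodynamic integration reduces to a quadrature over ensemble averages of dU/dλ sampled at several coupling values in separate MD runs. A minimal sketch follows; the mapping from mixing free energy to χ shown here assumes a symmetric blend and is a simplification of the paper's actual scheme:

```python
def free_energy_difference(lambdas, dU_dlambda_means):
    """Thermodynamic integration: Delta F = integral_0^1 <dU/dlambda>_lambda dlambda,
    evaluated by trapezoidal quadrature over ensemble averages that would be
    sampled at a few lambda points in separate MD runs."""
    total = 0.0
    for i in range(1, len(lambdas)):
        total += 0.5 * (dU_dlambda_means[i] + dU_dlambda_means[i - 1]) \
                     * (lambdas[i] - lambdas[i - 1])
    return total

def chi_from_mixing_free_energy(dF_mix_per_monomer_kT, phiA=0.5):
    """Flory-Huggins estimate: the excess mixing free energy per monomer is
    chi * phiA * phiB (in units of kT), so chi = dF_mix / (phiA * phiB).
    This mapping is an illustrative simplification, not the paper's scheme."""
    phiB = 1.0 - phiA
    return dF_mix_per_monomer_kT / (phiA * phiB)
```

Resolving χ of order 10^-3 this way hinges on the ensemble averages <dU/dλ> being converged to comparable precision at each λ point.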
Fei, Ding-Yu; Zhao, Xiaoming; Boanca, Cosmin; Hughes, Esther; Bai, Ou; Merrell, Ronald; Rafiq, Azhar
2010-07-01
To design and test an embedded biomedical sensor system that can monitor astronauts' comprehensive physiological parameters and provide real-time data display during extra-vehicular activities (EVA) in space exploration. An embedded system was developed with an array of biomedical sensors that can be integrated into the spacesuit. Wired communications were tested for physiological data acquisition and data transmission to a computer mounted on the spacesuit during task performances simulating EVA sessions. The sensor integration, data collection and communication, and real-time data monitoring were successfully validated in NASA field tests. The developed system may serve as an embedded system for monitoring health status during long-term space missions. Copyright 2010 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Pavlov, Al. A.; Shevchenko, A. M.; Khotyanovsky, D. V.; Pavlov, A. A.; Shmakov, A. S.; Golubev, M. P.
2017-10-01
We present a method for and results of determination of the field of integral density in the structure of flow corresponding to the Mach interaction of shock waves at Mach number M = 3. The optical diagnostics of flow was performed using an interference technique based on self-adjusting Zernike filters (SA-AVT method). Numerical simulations were carried out using the CFS3D program package for solving the Euler and Navier-Stokes equations. Quantitative data on the distribution of integral density on the path of probing radiation in one direction of 3D flow transillumination in the region of Mach interaction of shock waves were obtained for the first time.
Integrated bio-photonics to revolutionize health care enabled through PIX4life and PIXAPP
NASA Astrophysics Data System (ADS)
Jans, Hilde; O'Brien, Peter; Artundo, Iñigo; Porcel, Marco A. G.; Hoofman, Romano; Geuzebroek, Douwe; Dumon, Pieter; van der Vliet, Marcel; Witzens, Jeremy; Bourguignon, Eric; Van Dorpe, Pol; Lagae, Liesbet
2018-02-01
Photonics has become critical to life sciences. However, the field is far from benefiting fully from photonics' capabilities. Today, bulky and expensive optical systems dominate biomedical photonics, even though robust optical functionality can be realized cost-effectively on single photonic integrated circuits (PICs). Such chips are commercially available mostly for telecom applications and at infrared wavelengths. Although proof-of-concept demonstrations for PICs in life sciences using visible wavelengths are abundant, the gating factor for wider adoption is limited resource capacity. Two European pilot lines, PIX4life and PIXAPP, were established to facilitate European R&D in biophotonics by helping European companies and universities bridge the gap between research and industrial development. Through the creation of an open-access model, PIX4life aims to lower barriers to entry for prototyping and validating biophotonics concepts for larger-scale production. In addition, PIXAPP enables the assembly and packaging of photonic integrated circuits.
Integration of XNAT/PACS, DICOM, and Research Software for Automated Multi-modal Image Analysis.
Gao, Yurui; Burns, Scott S; Lauzon, Carolyn B; Fong, Andrew E; James, Terry A; Lubar, Joel F; Thatcher, Robert W; Twillie, David A; Wirt, Michael D; Zola, Marc A; Logan, Bret W; Anderson, Adam W; Landman, Bennett A
2013-03-29
Traumatic brain injury (TBI) is an increasingly important public health concern. While there are several promising avenues of intervention, clinical assessments are relatively coarse and comparative quantitative analysis is an emerging field. Imaging data provide potentially useful information for evaluating TBI across functional, structural, and microstructural phenotypes. Integration and management of disparate data types are major obstacles. In a multi-institution collaboration, we are collecting electroencephalography (EEG), structural MRI, diffusion tensor MRI (DTI), and single photon emission computed tomography (SPECT) from a large cohort of US Army service members exposed to mild or moderate TBI who are undergoing experimental treatment. We have constructed a robust informatics backbone for this project centered on the DICOM standard and eXtensible Neuroimaging Archive Toolkit (XNAT) server. Herein, we discuss (1) optimization of data transmission, validation and storage, (2) quality assurance and workflow management, and (3) integration of high performance computing with research software.
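The abstract does not specify how transfer validation under item (1) is implemented; one common pattern, comparing content digests of the source and stored copy, can be sketched as:

```python
import hashlib

def sha256_of(path, chunk_size=1 << 20):
    """Stream a file through SHA-256 in 1 MiB chunks so large imaging
    volumes need not fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def transfer_is_valid(source, destination):
    """Validate a completed transfer by comparing content digests.
    Illustrative pattern only; the project's actual checks are not
    described in the abstract."""
    return sha256_of(source) == sha256_of(destination)
```

The same digest can be stored alongside the archived file so later integrity audits do not require re-reading the original.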
Toward multidomain integrated network management for ATM and SDH networks
NASA Astrophysics Data System (ADS)
Galis, Alex; Gantenbein, Dieter; Covaci, Stefan; Bianza, Carlo; Karayannis, Fotis; Mykoniatis, George
1996-12-01
ACTS Project AC080 MISA has embarked upon the task of realizing and validating, via European field trials, integrated end-to-end management of hybrid SDH and ATM networks in the framework of open network provision. This paper reflects the initial work of the project and gives an overview of the proposed MISA system architecture and initial design. We describe our understanding of the underlying enterprise model in the network management context, including the concept of the MISA Global Broadband Connectivity Management service. It supports Integrated Broadband Communication by defining an end-to-end broadband connection service in a multi-domain business environment. Its implementation by the MISA consortium within trials across Europe aims for efficient management of network resources of the SDH and ATM infrastructure, considering optimum end-to-end quality of service and the needs of a number of telecommunication actors: customers, value-added service providers, and network providers.
Integrating Buddhism and HIV prevention in U.S. southeast Asian communities.
Loue, S; Lane, S D; Lloyd, L S; Loh, L
1999-02-01
Asian Pacific Islander communities in the United States have experienced an alarming increase in HIV infection over the past few years, possibly due to a lack of knowledge and the relative absence of appropriate educational interventions. The authors propose a new approach to the development of HIV prevention programs in U.S. southeast Asian communities. This article reviews the cultural and economic factors that may facilitate HIV transmission within these communities. Relying on the basic precepts of Buddhism, the dominant religion of many southeast Asian populations in the United States, the health belief model is utilized to demonstrate how recognizable, acceptable religious constructs can be integrated into the content of HIV prevention messages. This integration of religious concepts with HIV prevention messages may increase the likelihood that the message audience will accept the prevention messages as relevant. This nuanced approach to HIV prevention must be validated and refined through field research.
NASA Astrophysics Data System (ADS)
Well, Reinhard; Böttcher, Jürgen; Butterbach-Bahl, Klaus; Dannenmann, Michael; Deppe, Marianna; Dittert, Klaus; Dörsch, Peter; Horn, Marcus; Ippisch, Olaf; Mikutta, Robert; Müller, Carsten; Müller, Christoph; Senbayram, Mehmet; Vogel, Hans-Jörg; Wrage-Mönnig, Nicole
2016-04-01
Robust denitrification data suitable for validating soil N2 fluxes in denitrification models are scarce due to methodological limitations and the extreme spatio-temporal heterogeneity of denitrification in soils. Numerical models have become essential tools to predict denitrification at different scales. Model performance could be tested for total gaseous flux (NO + N2O + N2), for individual denitrification products (e.g. N2O and/or NO), or for the effect of denitrification factors (e.g. C availability, respiration, diffusivity, anaerobic volume, etc.). While there are numerous examples of validating N2O fluxes, there are neither robust field data of N2 fluxes nor sufficiently resolved measurements of the control factors used as state variables in the models. To the best of our knowledge, only one validation of modelled soil N2 flux has been published to date, using a laboratory data set to validate an ecosystem model. Hence there is a need for validation data at both the mesocosm and the field scale, including validation of individual denitrification controls. Here we present the concept for collecting model validation data, which is part of the DFG research unit "Denitrification in Agricultural Soils: Integrated Control and Modelling at Various Scales (DASIM)" starting this year. We will use novel approaches including analysis of stable isotopes, microbial communities, pore structure and organic matter fractions to provide denitrification data sets comprising as much detail on activity and regulation as possible, as a basis to validate existing and calibrate new denitrification models that are applied and/or developed by DASIM subprojects. The basic idea is to simulate "field-like" conditions as far as possible in an automated mesocosm system without plants, in order to mimic processes in the soil parts not significantly influenced by the rhizosphere (rhizosphere soils are studied by other DASIM projects). 
Hence, to allow model testing over a wide range of conditions, denitrification control factors will be varied in the initial settings (pore volume, plant residues, mineral N, pH) and also over time, where moisture, temperature, and mineral N will be manipulated according to typical time patterns in the field. This will be realized by imposing precipitation events, fertilization (via irrigation), drainage (via water potential) and temperature changes in the course of the incubations. Moreover, oxygen concentration will be varied to simulate anaerobic events. These data will be used to calibrate the newly developed DASIM models as well as existing denitrification models. One goal of DASIM is to create a public database as a joint basis for model testing by denitrification modellers. We therefore invite contributions of suitable data sets from the scientific community; the requirements will be briefly outlined.
The B-dot Earth Average Magnetic Field
NASA Technical Reports Server (NTRS)
Capo-Lugo, Pedro A.; Rakoczy, John; Sanders, Devon
2013-01-01
The average Earth magnetic field is usually computed with complex mathematical models based on a mean-square integral. Depending on the selection of the Earth magnetic model, the average field can take different values. This paper presents a simple technique that takes advantage of the damping effects of the b-dot controller and does not depend on the Earth magnetic model; it does, however, depend on the satellite's magnetic torquers, which are not taken into account in the known mathematical models. Moreover, the technique is simple enough to implement that the flight software can be updated during flight, so the control system can use current gains for the magnetic torquers. Finally, the technique is verified and validated using flight data from a satellite that has been in orbit for three years.
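For orientation only, a minimal sketch of the classical b-dot detumbling law the abstract builds on: the commanded magnetic dipole opposes the measured rate of change of the body-frame field, and the torquers then produce a damping torque. The gain value and field samples below are hypothetical, not from the paper.

```python
import numpy as np

def bdot_dipole(b_now, b_prev, dt, k=1e4):
    """B-dot control law: command a dipole moment (A*m^2) opposing the
    finite-difference estimate of dB/dt in the body frame (gain k is
    an illustrative value, not a flight-tuned one)."""
    b_dot = (b_now - b_prev) / dt
    return -k * b_dot

def control_torque(m, b):
    """Torque delivered by the magnetic torquers: tau = m x B (N*m)."""
    return np.cross(m, b)
```

Because the dipole always opposes the apparent field rate, the resulting torque removes rotational kinetic energy regardless of which Earth magnetic model generated the field, which is the property the paper's averaging technique exploits.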
A generalized sound extrapolation method for turbulent flows
NASA Astrophysics Data System (ADS)
Zhong, Siyang; Zhang, Xin
2018-02-01
Sound extrapolation methods are often used to compute acoustic far-field directivities from near-field flow data in aeroacoustics applications. The results may be erroneous if the volume integrals are neglected (to save computational cost) while non-acoustic fluctuations are collected on the integration surfaces. In this work, we develop a new sound extrapolation method based on an acoustic analogy using Taylor's hypothesis (Taylor 1938 Proc. R. Soc. Lond. A 164, 476-490. (doi:10.1098/rspa.1938.0032)). A convection operator is used to filter out the acoustically inefficient components of the turbulent flow, and an acoustically dominant indirect variable D_c p' is solved for. The sound pressure p' in the far field is then computed from D_c p' using the asymptotic properties of the Green's function. Validation results for benchmark problems with well-defined sources match the exact solutions well. For aeroacoustics applications, the sound predictions for aerofoil-gust interaction are close to those of an earlier method developed specifically to remove the effect of vortical fluctuations (Zhong & Zhang 2017 J. Fluid Mech. 820, 424-450. (doi:10.1017/jfm.2017.219)); for vortex-shedding noise from a cylinder, the off-body predictions of the proposed method match the on-body Ffowcs Williams and Hawkings result well; and for a co-flowing jet case using an established direct numerical simulation database, different integration surfaces yield close predictions of both spectra and far-field directivities. The results suggest that the method is a potential candidate for sound projection in aeroacoustics applications.
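In outline, the indirect-variable formulation can be sketched as follows; the operator form is the standard reading of Taylor's hypothesis with an assumed mean convection velocity, and the exact source terms are those of the paper, not reproduced here:

```latex
% Convective filtering of the pressure fluctuation (sketch):
% U is the assumed mean convection velocity from Taylor's hypothesis.
\[
  D_c \equiv \frac{\partial}{\partial t} + \mathbf{U}\cdot\nabla,
  \qquad
  D_c\, p' = \text{(acoustically dominant source terms)} ,
\]
% The far-field sound pressure p' is then recovered from D_c p'
% via the asymptotic form of the free-space Green's function.
```

Convecting disturbances that are "frozen" in the sense of Taylor's hypothesis are annihilated by D_c, which is why the operator suppresses the acoustically inefficient vortical fluctuations on the integration surface.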
NASA Astrophysics Data System (ADS)
Battistella, C.; Robinson, D.; McQuarrie, N.; Ghoshal, S.
2017-12-01
Multiple valid balanced cross sections can be produced from mapped surface and subsurface data. By integrating low-temperature thermochronologic data, we are better able to predict subsurface geometries. Existing valid balanced cross sections for far western Nepal are few (Robinson et al., 2006) and do not incorporate thermochronologic data, because such data did not exist. The data published since then along the Simikot cross section along the Karnali River include muscovite Ar, zircon U-Th/He and apatite fission track ages. We present new mapping and a new valid balanced cross section that takes into account the new field data as well as the limitations that the thermochronologic data place on the kinematics of the cross section. Additional constraints include new geomorphology data acquired since 2006 that indicate areas of increased vertical uplift, which point to the locations of buried ramps in the Main Himalayan thrust and guide the placement of Lesser Himalayan ramps in the balanced cross section. Future work will include flexural modeling, new low-temperature thermochronometric data, and 2-D thermokinematic models from sequentially forward-modeled balanced cross sections in far western Nepal.
NASA Astrophysics Data System (ADS)
To, Albert C.; Liu, Wing Kam; Olson, Gregory B.; Belytschko, Ted; Chen, Wei; Shephard, Mark S.; Chung, Yip-Wah; Ghanem, Roger; Voorhees, Peter W.; Seidman, David N.; Wolverton, Chris; Chen, J. S.; Moran, Brian; Freeman, Arthur J.; Tian, Rong; Luo, Xiaojuan; Lautenschlager, Eric; Challoner, A. Dorian
2008-09-01
Microsystems have become an integral part of our lives and can be found in homeland security, medical science, aerospace applications and beyond. Many critical microsystem applications are in harsh environments, in which long-term reliability needs to be guaranteed and repair is not feasible. For example, gyroscope microsystems on satellites need to function for over 20 years under severe radiation, thermal cycling, and shock loading. Hence predictive-science-based, verified and validated computational models and algorithms to predict the performance and materials integrity of microsystems in these situations are needed. Confidence in these predictions is improved by quantifying uncertainties and approximation errors. With no full-system testing and limited sub-system testing, petascale computing is certainly necessary to span both time and space scales and to reduce the uncertainty in the prediction of long-term reliability. This paper presents the necessary steps to develop a predictive-science-based multiscale modeling and simulation system. The development of this system will be focused on the prediction of the long-term performance of a gyroscope microsystem. The environmental effects to be considered include radiation, thermo-mechanical cycling and shock. Since there will be many material performance issues, attention is restricted to creep resulting from thermal aging and radiation-enhanced mass diffusion, material instability due to radiation and thermo-mechanical cycling, and damage and fracture due to shock. To meet these challenges, we aim to develop an integrated multiscale software analysis system that spans length scales from the atomistic scale to the scale of the device. The proposed software system will include molecular mechanics, phase-field evolution, micromechanics and continuum mechanics software, along with state-of-the-art model identification strategies in which atomistic properties are calibrated by quantum calculations. 
We aim to predict the long-term (in excess of 20 years) integrity of the resonator, electrode base, multilayer metallic bonding pads, and vacuum seals over a prescribed mission. Although multiscale simulations are efficient in the sense that they focus the most computationally intensive models and methods on only the portions of the space-time domain where they are needed, executing the multiscale simulations associated with evaluating materials and device integrity for aerospace microsystems will require the application of petascale computing. A component-based software strategy will be used in the development of our massively parallel multiscale simulation system. This approach will allow us to take full advantage of existing single-scale modeling components. An extensive, pervasive thrust in the software system development is verification, validation, and uncertainty quantification (UQ). Each component, and the integrated software system, needs to be carefully verified. A UQ methodology that determines the quality of the predictive information available from experimental measurements, and packages that information in a form suitable for UQ at the various scales, needs to be developed. Experiments to validate the model at the nanoscale, microscale, and macroscale are proposed. The development of a petascale, predictive-science-based multiscale modeling and simulation system will advance the field of predictive multiscale science so that it can be used to reliably analyze problems of unprecedented complexity, where limited testing resources can be adequately replaced by petascale computational power and advanced verification, validation, and UQ methodologies.
NASA Technical Reports Server (NTRS)
Gore, Brian F.
2011-01-01
As automation and advanced technologies are introduced into transport systems ranging from the Next Generation Air Transportation System (NextGen), to advanced surface transportation systems as exemplified by Intelligent Transportation Systems, to future systems designed for space exploration, there is an increased need to validly predict how the future systems will be vulnerable to error given the demands imposed by the assistive technologies. One formalized approach to study the impact of assistive technologies on the human operator in a safe and non-obtrusive manner is through the use of human performance models (HPMs). HPMs play an integral role when complex human-system designs are proposed, developed, and tested. One HPM tool, the Man-machine Integration Design and Analysis System (MIDAS), is a NASA Ames Research Center HPM software tool that has been applied to predict human-system performance in various domains since 1986. MIDAS is a dynamic, integrated HPM and simulation environment that facilitates the design, visualization, and computational evaluation of complex man-machine system concepts in simulated operational environments. This chapter discusses a range of aviation-specific applications, including an approach used to model human error for NASA's Aviation Safety Program, and what-if analyses to evaluate flight deck technologies for NextGen operations. The chapter culminates by raising two challenges for the field of predictive HPMs for complex human-system designs that evaluate assistive technologies: (1) model transparency and (2) model validation.
Probing sensorimotor integration during musical performance.
Furuya, Shinichi; Furukawa, Yuta; Uehara, Kazumasa; Oku, Takanori
2018-03-10
An integration of afferent sensory information from the visual, auditory, and proprioceptive systems into the execution and updating of motor programs plays a crucial role in the control and acquisition of skillful sequential movements in musical performance. However, the conventional behavioral and neurophysiological techniques that have been applied to simplistic motor behaviors limit elucidation of the online sensorimotor integration processes underlying skillful musical performance. Here, we propose two novel techniques developed to investigate the roles of auditory and proprioceptive feedback in piano performance. First, a closed-loop noninvasive brain stimulation system consisting of transcranial magnetic stimulation, a motion sensor, and a microcomputer enabled assessment of the time-varying cortical processes subserving auditory-motor integration during piano playing. Second, a force-field system capable of manipulating the weight of a piano key allowed for characterizing movement adaptation based on the feedback obtained, which can shed light on the formation of an internal representation of the piano. Results of neurophysiological and psychophysical experiments provided evidence validating these systems as effective means for disentangling the computational and neural processes of sensorimotor integration in musical performance. © 2018 New York Academy of Sciences.
Suitable RF spectrum in ISM band for 2-way advanced metering network in India
NASA Astrophysics Data System (ADS)
Mishra, A.; Khan, M. A.; Gaur, M. S.
2013-01-01
The ISM (Industrial, Scientific and Medical) bands in the radio frequency space in India offer two alternative spectra for implementing a wireless network for advanced metering infrastructure (AMI). These bands lie in the 2.4 GHz range and the sub-GHz range of 865 to 867 MHz. This paper examines the suitability of both options by designing and executing experiments in the laboratory as well as carrying out field trials on electricity meters to validate the selected option. A parameter, the communication effectiveness index (CEI), is defined to measure the effectiveness of two-way data communication (packet exchange) between two points under different building and free-space scenarios. Both 2.4 GHz and sub-GHz designs were implemented to compare the results. The experiments were conducted across three floors of a building. Validation of the selected option was carried out in a field trial by integrating the selected radio frequency (RF) modem into single-phase electricity meters and installing these meters across three floors of the building. The methodology, implementation details, observations and resulting analytical conclusions are described in the paper.
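The abstract defines the effectiveness index only as a measure of two-way packet exchange; a plausible minimal reading, sketched below purely for illustration, is the fraction of attempted exchanges that complete a full round trip. Both the definition and the function name are assumptions, not the paper's formula.

```python
def communication_effectiveness(attempts):
    """Hypothetical CEI: fraction of attempted two-way packet exchanges
    (request sent, acknowledgement received) that complete a round trip.
    `attempts` is a list of (sent, acked) booleans. The paper's exact
    definition may differ."""
    ok = sum(1 for sent, acked in attempts if sent and acked)
    total = len(attempts)
    return ok / total if total else 0.0
```

Computed per scenario (floor-to-floor, through walls, free space), such a ratio would allow the 2.4 GHz and sub-GHz designs to be compared on a common scale.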
Design and simulation of a 800 Mbit/s data link for magnetic resonance imaging wearables.
Vogt, Christian; Buthe, Lars; Petti, Luisa; Cantarella, Giuseppe; Munzenrieder, Niko; Daus, Alwin; Troster, Gerhard
2015-08-01
This paper presents the optimization of electronic circuitry for operation in the harsh electromagnetic (EM) environment of a magnetic resonance imaging (MRI) scan. As a demonstrator, a device small enough to be worn during the scan is optimized. Based on finite element method (FEM) simulations, the induced current densities due to magnetic field changes of 200 T s^-1 were reduced from 1 × 10^10 A m^-2 by one order of magnitude, predicting error-free operation of the 1.8 V logic employed. The simulations were validated using a bit error rate test, which showed no bit errors during an MRI scan sequence. Therefore, neither the logic nor the utilized 800 Mbit s^-1 low voltage differential swing (LVDS) data link of the optimized wearable device was significantly influenced by the EM interference. Next, the influence of ferromagnetic components on the static magnetic field, and consequently the image quality, was simulated, showing an MRI image loss of approximately 2 cm radius around a commercial integrated circuit of 1 × 1 cm^2. This was subsequently validated by a conventional MRI scan.
Optimized formulas for the gravitational field of a tesseroid
NASA Astrophysics Data System (ADS)
Grombein, Thomas; Seitz, Kurt; Heck, Bernhard
2013-07-01
Various tasks in geodesy, geophysics, and related geosciences require precise information on the impact of mass distributions on gravity field-related quantities, such as the gravitational potential and its partial derivatives. Using forward modeling based on Newton's integral, mass distributions are generally decomposed into regular elementary bodies. In classical approaches, prisms or point mass approximations are mostly utilized. Considering the effect of the sphericity of the Earth, alternative mass modeling methods based on tesseroid bodies (spherical prisms) should be taken into account, particularly in regional and global applications. Expressions for the gravitational field of a point mass are relatively simple when formulated in Cartesian coordinates. In the case of integrating over a tesseroid volume bounded by geocentric spherical coordinates, it will be shown that it is also beneficial to represent the integral kernel in terms of Cartesian coordinates. This considerably simplifies the determination of the tesseroid's potential derivatives in comparison with previously published methodologies that make use of integral kernels expressed in spherical coordinates. Based on this idea, optimized formulas for the gravitational potential of a homogeneous tesseroid and its derivatives up to second order are elaborated in this paper. These new formulas do not suffer from the polar singularity of the spherical coordinate system and can, therefore, be evaluated for any position on the globe. Since integrals over tesseroid volumes cannot be solved analytically, the numerical evaluation is achieved by means of expanding the integral kernel in a Taylor series with fourth-order error in the spatial coordinates of the integration point. As the structure of the Cartesian integral kernel is substantially simplified, Taylor coefficients can be represented in a compact and computationally attractive form. 
Thus, the use of the optimized tesseroid formulas particularly benefits from a significant decrease in computation time by about 45 % compared to previously used algorithms. In order to show the computational efficiency and to validate the mathematical derivations, the new tesseroid formulas are applied to two realistic numerical experiments and are compared to previously published tesseroid methods and the conventional prism approach.
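The paper's optimized Taylor-coefficient formulas are not reproduced here, but the underlying computation, Newton's integral over a tesseroid with the 1/l kernel evaluated between Cartesian positions, can be sketched by brute-force midpoint quadrature. The density value and grid resolution below are illustrative assumptions; this is a reference evaluation, not the optimized algorithm.

```python
import numpy as np

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2

def sph_to_cart(r, phi, lam):
    """Geocentric spherical (radius r, latitude phi, longitude lam) -> Cartesian."""
    return np.array([r * np.cos(phi) * np.cos(lam),
                     r * np.cos(phi) * np.sin(lam),
                     r * np.sin(phi)])

def tesseroid_potential(p, r_lim, phi_lim, lam_lim, rho=2670.0, n=20):
    """Potential of a homogeneous tesseroid at Cartesian point p via
    midpoint quadrature of Newton's integral. The Euclidean distance l
    is computed in Cartesian coordinates, mirroring the paper's key idea,
    while the bounds and volume element stay spherical."""
    rs = np.linspace(*r_lim, n + 1)
    phis = np.linspace(*phi_lim, n + 1)
    lams = np.linspace(*lam_lim, n + 1)
    dr, dphi, dlam = np.diff(rs)[0], np.diff(phis)[0], np.diff(lams)[0]
    V = 0.0
    for r in (rs[:-1] + dr / 2):
        for phi in (phis[:-1] + dphi / 2):
            w = r * r * np.cos(phi) * dr * dphi * dlam  # spherical volume element
            for lam in (lams[:-1] + dlam / 2):
                l = np.linalg.norm(p - sph_to_cart(r, phi, lam))
                V += w / l
    return G * rho * V
```

Far from the tesseroid, the result should approach the point-mass potential G*M/l of the same total mass, which gives a simple sanity check on any implementation.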
Status Update on the GPM Ground Validation Iowa Flood Studies (IFloodS) Field Experiment
NASA Astrophysics Data System (ADS)
Petersen, Walt; Krajewski, Witold
2013-04-01
The overarching objective of integrated hydrologic ground validation activities supporting the Global Precipitation Measurement Mission (GPM) is to provide better understanding of the strengths and limitations of the satellite products, in the context of hydrologic applications. To this end, the GPM Ground Validation (GV) program is conducting the first of several hydrology-oriented field efforts: the Iowa Flood Studies (IFloodS) experiment. IFloodS will be conducted in the central to northeastern part of Iowa in Midwestern United States during the months of April-June, 2013. Specific science objectives and related goals for the IFloodS experiment can be summarized as follows: 1. Quantify the physical characteristics and space/time variability of rain (rates, DSD, process/"regime") and map to satellite rainfall retrieval uncertainty. 2. Assess satellite rainfall retrieval uncertainties at instantaneous to daily time scales and evaluate propagation/impact of uncertainty in flood-prediction. 3. Assess hydrologic predictive skill as a function of space/time scales, basin morphology, and land use/cover. 4. Discern the relative roles of rainfall quantities such as rate and accumulation as compared to other factors (e.g. transport of water in the drainage network) in flood genesis. 5. Refine approaches to "integrated hydrologic GV" concept based on IFloodS experiences and apply to future GPM Integrated GV field efforts. These objectives will be achieved via the deployment of the NASA NPOL S-band and D3R Ka/Ku-band dual-polarimetric radars, University of Iowa X-band dual-polarimetric radars, a large network of paired rain gauge platforms with attendant soil moisture and temperature probes, a large network of both 2D Video and Parsivel disdrometers, and USDA-ARS gauge and soil-moisture measurements (in collaboration with the NASA SMAP mission). 
The aforementioned measurements will be used to complement existing operational WSR-88D S-band polarimetric radar measurements, USGS streamflow, and Iowa Flood Center stream monitoring measurements. Coincident satellite datasets will be archived from current microwave imaging and sounding radiometers flying on NOAA, DMSP, NASA, and EU (METOP) low-earth orbiters, and rapid-scanned IR datasets collected from geostationary (GOES) platforms. Collectively the observational assets will provide a means to create high quality (time and space sampling) ground "reference" rainfall and stream flow datasets. The ground reference radar and rainfall datasets will provide a means to assess uncertainties in both satellite algorithms (physics) and products. Subsequently, the impact of uncertainties in the satellite products can be evaluated in coupled weather, land-surface and distributed hydrologic modeling frameworks as related to flood prediction.
Software Independent Verification and Validation (SIV&V) Simplified
2006-12-01
Configuration Item I/O Input/Output I2V2 Independent Integrated Verification and Validation IBM International Business Machines ICD Interface... IPT Integrated Product Team IRS Interface Requirements Specification ISD Integrated System Diagram ITD Integrated Test Description ITP... programming languages such as COBOL (Common Business Oriented Language) (CODASYL committee 1960) and FORTRAN (FORmula TRANslator) (IBM 1952) (Robat 11
Validity of Integrity Tests for Predicting Drug and Alcohol Abuse
1993-08-31
Winkler and Sheridan (1989) found that employees who entered employee assistance programs for treating drug addiction were more likely to be absent... This research used psychometric meta-analysis (Hunter & Schmidt, 1990b) to examine the validity of integrity tests for predicting drug and
Rossi, Gina; Videler, Arjan; van Alphen, S P J
2018-04-01
Since older adults often show an atypical presentation of (mal)adaptive personality traits and pathological states, the articles in this special issue concisely discuss some perennial issues in the clinical assessment of older adults and outline the main challenges this domain faces. By bringing together empirical work and meta-analytic studies from leading scholars in the field of geropsychology, the articles also address these challenges by reporting the latest developments in the field. In this way, we hope to reshape the way clinicians and researchers assess (mal)adaptive personality and pathological states in older adults into a more reliable and valid assessment practice that integrates the specific biopsychosocial context of older age.
NASA Astrophysics Data System (ADS)
Delbary, Fabrice; Aramini, Riccardo; Bozza, Giovanni; Brignone, Massimo; Piana, Michele
2008-11-01
Microwave tomography is a non-invasive approach to the early diagnosis of breast cancer. However, the problem of visualizing tumors from diffracted microwaves is a difficult nonlinear, ill-posed inverse scattering problem. We propose a qualitative approach to its solution, whereby the shape and location of cancerous tissues are detected by combining the Reciprocity Gap Functional method with the Linear Sampling method. We validate this approach on synthetic near-field data produced by a finite element method for boundary integral equations, where the breast is mimicked by the axial view of two nested cylinders, the external one representing the skin and the internal one the fat tissue.
Neuroimaging Study Designs, Computational Analyses and Data Provenance Using the LONI Pipeline
Dinov, Ivo; Lozev, Kamen; Petrosyan, Petros; Liu, Zhizhong; Eggert, Paul; Pierce, Jonathan; Zamanyan, Alen; Chakrapani, Shruthi; Van Horn, John; Parker, D. Stott; Magsipoc, Rico; Leung, Kelvin; Gutman, Boris; Woods, Roger; Toga, Arthur
2010-01-01
Modern computational neuroscience employs diverse software tools and multidisciplinary expertise to analyze heterogeneous brain data. The classical problems of gathering meaningful data, fitting specific models, and discovering appropriate analysis and visualization tools give way to a new class of computational challenges—management of large and incongruous data, integration and interoperability of computational resources, and data provenance. We designed, implemented and validated a new paradigm for addressing these challenges in the neuroimaging field. Our solution is based on the LONI Pipeline environment [3], [4], a graphical workflow environment for constructing and executing complex data processing protocols. We developed study-design, database and visual language programming functionalities within the LONI Pipeline that enable the construction of complete, elaborate and robust graphical workflows for analyzing neuroimaging and other data. These workflows facilitate open sharing and communication of data and metadata, concrete processing protocols, result validation, and study replication among different investigators and research groups. The LONI Pipeline features include distributed grid-enabled infrastructure, virtualized execution environment, efficient integration, data provenance, validation and distribution of new computational tools, automated data format conversion, and an intuitive graphical user interface. We demonstrate the new LONI Pipeline features using large scale neuroimaging studies based on data from the International Consortium for Brain Mapping [5] and the Alzheimer's Disease Neuroimaging Initiative [6]. User guides, forums, instructions and downloads of the LONI Pipeline environment are available at http://pipeline.loni.ucla.edu. PMID:20927408
NASA Astrophysics Data System (ADS)
Choi, Jin-Ha; Lee, Jaewon; Shin, Woojung; Choi, Jeong-Woo; Kim, Hyun Jung
2016-10-01
Nanotechnology and bioengineering have converged over the past decades, by which the application of multi-functional nanoparticles (NPs) has emerged in clinical and biomedical fields. NPs primed to detect disease-specific biomarkers or to deliver biopharmaceutical compounds have been validated in conventional in vitro culture models, including two-dimensional (2D) cell cultures and 3D organoid models. However, the lack of experimental models with strong human physiological relevance has hampered accurate validation of the safety and functionality of NPs. Alternatively, biomimetic human "Organs-on-Chips" microphysiological systems recapitulate the mechanically dynamic 3D tissue interfaces of the human organ microenvironment, in which the transport, cytotoxicity, biocompatibility, and therapeutic efficacy of NPs and their conjugates may be more accurately validated. Finally, integrating NP-guided diagnostic detection and targeted nanotherapeutics with human organs-on-chips can provide a novel avenue to accelerate NP-based drug development as well as the rapid detection of cellular secretomes associated with pathophysiological processes.
Integration and Test Flight Validation Plans for the Pulsed Plasma Thruster Experiment on EO- 1
NASA Technical Reports Server (NTRS)
Zakrzwski, Charles; Benson, Scott; Sanneman, Paul; Hoskins, Andy; Bauer, Frank H. (Technical Monitor)
2002-01-01
The Pulsed Plasma Thruster (PPT) Experiment on the Earth Observing One (EO-1) spacecraft has been designed to demonstrate the capability of a new-generation PPT to perform spacecraft attitude control. The PPT is a small, self-contained pulsed electromagnetic propulsion system capable of delivering high specific impulse (900-1200 s) and very small impulse bits (10-1000 µN·s) at low average power (less than 1 W to 100 W). Teflon fuel is ablated and slightly ionized by means of a capacitive discharge. The discharge also generates electromagnetic fields that accelerate the plasma by means of the Lorentz force. EO-1 has a single PPT that can produce thrust in either the positive or negative pitch direction. The flight validation has been designed to demonstrate the ability of the PPT to provide precision pointing accuracy, response and stability, and to confirm benign plume and EMI effects. This paper documents the success of the flight validation.
Iannuzzi, David; Grant, Andrew; Corriveau, Hélène; Boissy, Patrick; Michaud, Francois
2016-12-01
The objective of this study was to design an effectively integrated information architecture for a mobile teleoperated robot providing remote assistance in the delivery of home health care. Three role classes were identified related to the deployment of a telerobot, namely engineer, technology integrator, and health professional. Patients and natural caregivers were considered indirectly, this being a component of future field studies. Interviewing representatives of each class provided the functions, and the information content and flows, for each function. Interview transcripts enabled the formulation of UML (Unified Modeling Language) diagrams for feedback from participants. The proposed information architecture was validated with a use-case scenario. The integrated information architecture incorporates progressive design, ergonomic integration, and home care needs from the medical specialist, nursing, physiotherapy, occupational therapy, and social worker care perspectives. The iterative design of the integrated architecture promoted insight among participants, and the use-case scenario evaluation showed the design's robustness. A complex innovation such as a telerobot must mesh coherently with the needs of health-care service delivery. Deploying an integrated information architecture that bridges development with specialist and home care applications is necessary for home care technology innovation. It enables the continuing evolution of the robot and of novel health information design within the same integrated architecture, while accounting for the patient's ecological needs.
Integration of design and inspection
NASA Astrophysics Data System (ADS)
Simmonds, William H.
1990-08-01
Developments in advanced computer integrated manufacturing technology, coupled with the emphasis on Total Quality Management, are exposing needs for new techniques to integrate all functions from design through to support of the delivered product. One critical functional area that must be integrated into design is that embracing the measurement, inspection and test activities necessary for validation of the delivered product. This area is being tackled by a collaborative project supported by the UK Government Department of Trade and Industry. The project is aimed at developing techniques for analysing validation needs and for planning validation methods. Within the project an experimental Computer Aided Validation Expert system (CAVE) is being constructed. This operates with a generalised model of the validation process and helps with all design stages: specification of product requirements; analysis of the assurance provided by a proposed design and method of manufacture; development of the inspection and test strategy; and analysis of feedback data. The kernel of the system is a knowledge base containing knowledge of the manufacturing process capabilities and of the available inspection and test facilities. The CAVE system is being integrated into a real life advanced computer integrated manufacturing facility for demonstration and evaluation.
Enhancements and Evolution of the Real Time Mission Monitor
NASA Technical Reports Server (NTRS)
Goodman, Michael; Blakeslee, Richard; Hardin, Danny; Hall, John; He, Yubin; Regner, Kathryn
2008-01-01
The Real Time Mission Monitor (RTMM) is a visualization and information system that fuses multiple Earth science data sources, to enable real time decision-making for airborne and ground validation experiments. Developed at the National Aeronautics and Space Administration (NASA) Marshall Space Flight Center, RTMM is a situational awareness, decision-support system that integrates satellite imagery, radar, surface and airborne instrument data sets, model output parameters, lightning location observations, aircraft navigation data, soundings, and other applicable Earth science data sets. The integration and delivery of this information is made possible using data acquisition systems, network communication links, network server resources, and visualizations through the Google Earth virtual globe application. RTMM has proven extremely valuable for optimizing individual Earth science airborne field experiments. Flight planners, mission scientists, instrument scientists and program managers alike appreciate the contributions that RTMM makes to their flight projects. We have received numerous plaudits from a wide variety of scientists who used RTMM during recent field campaigns including the 2006 NASA African Monsoon Multidisciplinary Analyses (NAMMA), 2007 Tropical Composition, Cloud, and Climate Coupling (TC4), 2008 Arctic Research of the Composition of the Troposphere from Aircraft and Satellites (ARCTAS) missions, the 2007-2008 NOAA-NASA Aerosonde Hurricane flights and the 2008 Soil Moisture Active-Passive Validation Experiment (SMAP-VEX). Improving and evolving RTMM is a continuous process. RTMM recently integrated the Waypoint Planning Tool, a Java-based application that enables aircraft mission scientists to easily develop a pre-mission flight plan through an interactive point-and-click interface. 
Individual flight legs are automatically calculated for altitude, latitude, longitude, flight leg distance, cumulative distance, flight leg time, cumulative time, and satellite overpass intersections. The resultant flight plan is then generated in KML and quickly posted to the Google Earth-based RTMM for interested scientists to view the planned flight track and then compare it to the actual real time flight progress. A description of the system architecture, components, and applications along with reviews and animations of RTMM during the field campaigns, plus planned enhancements and future opportunities will be presented.
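The leg-by-leg bookkeeping described above can be sketched in a few lines. The waypoints, airspeed, and minimal KML skeleton below are hypothetical stand-ins, not the Waypoint Planning Tool's actual implementation (which, per the abstract, is Java-based):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    # Great-circle distance between two waypoints in km (Earth radius ~6371 km).
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = p2 - p1
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * 6371.0 * math.asin(math.sqrt(a))

def flight_plan(waypoints, airspeed_kmh):
    """Per-leg and cumulative distance/time for (lat, lon, alt_m) waypoints."""
    legs, cum_d, cum_t = [], 0.0, 0.0
    for (la1, lo1, _), (la2, lo2, _) in zip(waypoints, waypoints[1:]):
        d = haversine_km(la1, lo1, la2, lo2)
        t = d / airspeed_kmh  # leg time in hours
        cum_d += d
        cum_t += t
        legs.append({"dist_km": d, "time_h": t, "cum_km": cum_d, "cum_h": cum_t})
    return legs

def to_kml(waypoints):
    # Minimal KML LineString; note KML orders coordinates as lon,lat,alt.
    coords = " ".join(f"{lo},{la},{alt}" for la, lo, alt in waypoints)
    return ("<?xml version='1.0'?><kml xmlns='http://www.opengis.net/kml/2.2'>"
            f"<Placemark><LineString><coordinates>{coords}"
            "</coordinates></LineString></Placemark></kml>")

# Hypothetical three-waypoint track near Huntsville, AL.
wps = [(34.65, -86.67, 8000.0), (35.00, -85.00, 8000.0), (36.00, -84.00, 8000.0)]
plan = flight_plan(wps, airspeed_kmh=700.0)
kml = to_kml(wps)
```

Satellite-overpass intersection checks would additionally require an ephemeris lookup, which is outside this sketch.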
The S2 UAS, a Modular Platform for Atmospheric Science
NASA Astrophysics Data System (ADS)
Elston, J. S.; Stachura, M.; Bland, G.
2017-12-01
Black Swift Technologies, LLC (BST) developed and refined the S2 in partnership with NASA. The S2 is a novel small Unmanned Aircraft System (sUAS) specifically designed to meet the needs of atmospheric and earth observing scientific field campaigns. This tightly integrated system consists of an airframe, avionics, and sensors designed to measure atmospheric parameters (e.g., temperature, pressure, humidity, and 3D winds) as well as carry up to 2.3 kg (5 lb) of additional payload. At the core of the sensing suite is a custom-designed multi-hole probe being developed to provide accurate measurements of the u, v, and w wind components while remaining simple to integrate as well as low-cost. The S2 relies on the commercially available SwiftCore Flight Management System (FMS), which has been proven in the field to provide a cost-effective, powerful, and easy-to-operate solution to meet the demanding requirements of nomadic scientific field campaigns. The airframe capabilities are currently being expanded to achieve high-altitude flights through strong winds and damaging airborne particulates. Additionally, the well-documented power and data interfaces of the S2 will be employed to integrate the sensors required for the measurement of soil moisture content, atmospheric volcanic phenomena, and fire weather, as well as to provide satellite calibration via multispectral cameras. Extensive flight testing has been planned to validate the S2 system's ability to operate in difficult terrain, including mountainside takeoff and recovery and flights up to 6000 m above sea level.
Forward Bay Cover Separation Modeling and Testing for the Orion Multi-Purpose Crew Vehicle
NASA Technical Reports Server (NTRS)
Ali, Yasmin; Radke, Tara; Chuhta, Jesse; Hughes, Michael
2014-01-01
Spacecraft multi-body separation events during atmospheric descent require complex testing and analysis to validate the flight separation dynamics model and to verify no recontact. NASA Orion Multi-Purpose Crew Vehicle (MPCV) teams examined key model parameters and risk areas to develop a robust but affordable test campaign in order to validate and verify the Forward Bay Cover (FBC) separation event for Exploration Flight Test-1 (EFT-1). The FBC jettison simulation model is highly complex, consisting of dozens of parameters varied simultaneously, with numerous multi-parameter interactions (coupling and feedback) among the various model elements, and encompassing distinct near-field, mid-field, and far-field regimes. The test campaign was composed of component-level testing (for example, gas-piston thrusters and parachute mortars), ground FBC jettison tests, and FBC jettison air-drop tests, accomplished by a highly multi-disciplinary team. Three ground jettison tests isolated the testing of mechanisms and structures to anchor the simulation models while excluding aerodynamic effects. Subsequently, two air-drop tests added aerodynamic and parachute parameters and served as integrated system demonstrations, building on work preliminarily explored during the Orion Pad Abort-1 (PA-1) flight test in May 2010. Both ground and drop tests provided extensive data to validate analytical models and to verify the FBC jettison event for EFT-1, but more testing is required to support human certification; NASA and Lockheed Martin are applying knowledge from Apollo and EFT-1 testing and modeling to develop a robust but affordable human spacecraft capability.
Dellaserra, Carla L; Gao, Yong; Ransdell, Lynda
2014-02-01
Integrated technology (IT), which includes accelerometers, global positioning systems (GPSs), and heart rate monitors, has been used frequently in public health. More recently, IT data have been used in sports settings to assess training and performance demands. However, the impact of IT in sports settings is yet to be evaluated, particularly in field-based team sports. This narrative-qualitative review provides an overview of the emerging impact of IT in sports settings. Twenty electronic databases (e.g., Medline, SPORTdiscus, and ScienceDirect), print publications (e.g., Signal Processing Magazine and Catapult Innovations news releases), and internet resources were searched using different combinations of the following keywords: accelerometers, heart rate monitors, GPS, sport training, and field-based sports, for relevant articles published from 1990 to the present. A total of 114 publications were identified, and the 39 that examined a field-based team sport using a form of IT were analyzed. The uses of IT can be divided into 4 categories: (a) quantifying movement patterns (n = 22), (b) assessing the differences between demands of training and competition (n = 12), (c) measuring physiological and metabolic responses (n = 16), and (d) determining a valid definition for velocity and a sprint effort (n = 8). Most studies used elite adult male athletes as participants and analyzed the sports of Australian Rules football, field hockey, cricket, and soccer, with sample sizes between 5 and 20 participants. The limitations of IT in a sports setting include scalability issues, cost, and the inability to receive signals within indoor environments. Integrated technology can contribute to significant improvements in the preparation, training, and recovery aspects of field-based team sports.
Future research should focus on using IT with female athlete populations and developing resources to use IT indoors to further enhance individual and team performances.
Barriers to Implementing Treatment Integrity Procedures: Survey of Treatment Outcome Researchers
ERIC Educational Resources Information Center
Perepletchikova, Francheska; Hilt, Lori M.; Chereji, Elizabeth; Kazdin, Alan E.
2009-01-01
Treatment integrity refers to implementing interventions as intended. Treatment integrity is critically important for experimental validity and for drawing valid inferences regarding the relationship between treatment and outcome. Yet, it is rarely adequately addressed in psychotherapy research. The authors examined barriers to treatment integrity…
Celio, Mark A; Vetter-O'Hagen, Courtney S; Lisman, Stephen A; Johansen, Gerard E; Spear, Linda P
2011-12-01
Field methodologies offer a unique opportunity to collect ecologically valid data on alcohol use and its associated problems within natural drinking environments. However, limitations in follow-up data collection methods have left unanswered questions regarding the psychometric properties of field-based measures. The aim of the current study is to evaluate the reliability of self-report data collected in a naturally occurring environment - as indexed by the Alcohol Use Disorders Identification Test (AUDIT) - compared to self-report data obtained through an innovative web-based follow-up procedure. Individuals recruited outside of bars (N=170; mean age=21; range 18-32) provided a BAC sample and completed a self-administered survey packet that included the AUDIT. BAC feedback was provided anonymously through a dedicated web page. Upon sign-in, follow-up participants (n=89; 52%) were again asked to complete the AUDIT before receiving their BAC feedback. Reliability analyses demonstrated that AUDIT scores - both continuous and dichotomized at the standard cut-point - were stable across field- and web-based administrations. These results suggest that self-report data obtained from acutely intoxicated individuals in naturally occurring environments are reliable when compared to web-based data obtained after a brief follow-up interval. Furthermore, the results demonstrate the feasibility, utility, and potential of integrating field methods and web-based data collection procedures.
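The reliability checks the abstract describes (continuous scores and scores dichotomized at a cut-point) can be illustrated with a small sketch. The simulated scores below are synthetic stand-ins for the study's data, and the cut-point of 8 is the conventional AUDIT threshold, used here only as an assumption:

```python
import numpy as np

def pearson_r(x, y):
    # Test-retest reliability of the continuous scores.
    return float(np.corrcoef(x, y)[0, 1])

def cohens_kappa(a, b):
    # Chance-corrected agreement between two binary classifications.
    a, b = np.asarray(a), np.asarray(b)
    po = np.mean(a == b)                                  # observed agreement
    pe = np.mean(a) * np.mean(b) + (1 - np.mean(a)) * (1 - np.mean(b))
    return float((po - pe) / (1 - pe))

# Synthetic field scores and a noisy web-based retest for 89 participants.
rng = np.random.default_rng(0)
field = rng.integers(0, 25, 89)
web = np.clip(field + rng.integers(-2, 3, 89), 0, 40)

r = pearson_r(field, web)                 # continuous reliability
kappa = cohens_kappa(field >= 8, web >= 8)  # agreement at the cut-point
```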
Developments in Sensitivity Methodologies and the Validation of Reactor Physics Calculations
Palmiotti, Giuseppe; Salvatores, Massimo
2012-01-01
Sensitivity methodologies have a remarkable record of success in the reactor physics field. Sensitivity coefficients can be used for different objectives, such as uncertainty estimates, design optimization, determination of target accuracy requirements, adjustment of input parameters, and evaluation of the representativity of an experiment with respect to a reference design configuration. A review of the methods used is provided, and several examples illustrate the success of the methodology in reactor physics. A new application, the improvement of basic nuclear parameters using integral experiments, is also described.
NASA Technical Reports Server (NTRS)
Vlahopoulos, Nickolas; Lyle, Karen H.; Burley, Casey L.
1998-01-01
An algorithm for generating appropriate velocity boundary conditions for an acoustic boundary element analysis from the kinematics of an operating propeller is presented. It constitutes the initial phase of integrating sophisticated rotorcraft models into a conventional boundary element analysis. Currently, the pressure field is computed by a linear approximation. An initial validation of the developed process was performed by comparing numerical results to test data for the external acoustic pressure on the surface of a tilt-rotor aircraft for one flight condition.
Rahmati, Omid; Tahmasebipour, Naser; Haghizadeh, Ali; Pourghasemi, Hamid Reza; Feizizadeh, Bakhtiar
2017-02-01
Despite the importance of soil erosion to sustainable development goals in arid and semi-arid areas, the geo-environmental conditions and factors influencing gully erosion occurrence are rarely studied. To address this challenge, the main objective of this study is to apply an integrated approach of Geographic Object-Based Image Analysis (GEOBIA) together with high-spatial-resolution imagery (SPOT-5) for detecting gully erosion features in the Kashkan-Poldokhtar watershed, Iran. We also aimed to apply a Conditional Probability (CP) model for establishing the spatial relationship between gullies and the Geo-Environmental Factors (GEFs). The gully erosion inventory map prepared using GEOBIA and field surveying was randomly partitioned into two subsets: (1) a training dataset containing 70% of the records, used in the training phase of the CP model; and (2) a validation dataset (the remaining 30%), used to validate the model and confirm its accuracy. Prediction performances of the GEOBIA and CP model were checked by overall accuracy and Receiver Operating Characteristic (ROC) curve methods, respectively. In addition, the influence of all GEFs on gully erosion was evaluated by performing a sensitivity analysis. The validation findings illustrated that the overall accuracy for the GEOBIA approach and the area under the ROC curve for the CP model were 92.4% and 89.9%, respectively. Also, based on the sensitivity analysis, soil texture, drainage density, and lithology have significant effects on gully erosion occurrence. This study has shown that the integrated framework can be successfully used for modeling gully erosion occurrence in a data-poor environment.
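The ROC-based validation mentioned above reduces to computing the area under the curve. A minimal rank-based AUC sketch (not the authors' code; ties in scores are not handled by rank averaging here) is:

```python
import numpy as np

def roc_auc(labels, scores):
    """AUC via the rank-sum (Mann-Whitney) formulation: the probability that
    a randomly chosen positive sample outranks a randomly chosen negative.
    Assumes no tied scores (ties are not rank-averaged in this sketch)."""
    labels = np.asarray(labels, bool)
    order = np.argsort(scores)
    ranks = np.empty(len(order))
    ranks[order] = np.arange(1, len(order) + 1)   # ranks start at 1
    n_pos = labels.sum()
    n_neg = (~labels).sum()
    return float((ranks[labels].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg))
```

For a gully-erosion map, `labels` would mark observed gully cells in the validation subset and `scores` the CP model's predicted susceptibility.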
Panel acoustic contribution analysis.
Wu, Sean F; Natarajan, Logesh Kumar
2013-02-01
Formulations are derived to analyze the relative panel acoustic contributions of a vibrating structure. The essence of this analysis is to correlate the acoustic power flow from each panel to the radiated acoustic pressure at any field point. The acoustic power is obtained by integrating the normal component of the surface acoustic intensity, which is the product of the surface acoustic pressure and normal surface velocity reconstructed by using the Helmholtz equation least squares based nearfield acoustical holography, over each panel. The significance of this methodology is that it enables one to analyze and rank relative acoustic contributions of individual panels of a complex vibrating structure to acoustic radiation anywhere in the field based on a single set of the acoustic pressures measured in the near field. Moreover, this approach is valid for both interior and exterior regions. Examples of using this method to analyze and rank the relative acoustic contributions of a scaled vehicle cabin are demonstrated.
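The per-panel power computation the abstract describes (normal surface intensity integrated over each panel) can be sketched as follows. The patch data are synthetic and the function names are ours, not the authors'; the hard part of the real method, reconstructing surface pressure and velocity by HELS-based nearfield acoustical holography, is taken as given:

```python
import numpy as np

def panel_powers(pressure, velocity_n, areas, panel_ids):
    """Time-averaged acoustic power per panel from reconstructed surface data.
    pressure, velocity_n: complex amplitudes at surface patches; power density
    per patch is 0.5 * Re(p * conj(v_n)), integrated (summed) over each panel."""
    intensity = 0.5 * np.real(pressure * np.conj(velocity_n))
    power = intensity * areas
    return {pid: float(power[panel_ids == pid].sum())
            for pid in np.unique(panel_ids)}

# Three surface patches grouped into two panels (synthetic values).
p = np.array([1 + 0j, 1 + 0j, 2 + 0j])     # surface pressures (Pa)
vn = np.array([1 + 0j, 1 + 0j, 1 + 0j])    # normal velocities (m/s)
areas = np.array([0.5, 0.5, 1.0])          # patch areas (m^2)
ids = np.array([0, 0, 1])                  # patch-to-panel assignment

powers = panel_powers(p, vn, areas, ids)
ranking = sorted(powers, key=powers.get, reverse=True)  # rank contributions
```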
Inhibiting the HIV Integration Process: Past, Present, and the Future
2013-01-01
HIV integrase (IN) catalyzes the insertion of viral DNA produced by the retrotranscription process into the genome of the infected human cell. The discovery of raltegravir validated IN as a new target in the field of anti-HIV drug research. The mechanism of catalysis of IN is depicted, and the characteristics of the inhibitors of the catalytic site of this viral enzyme are reported. The role played by resistance is elucidated, as well as the possibility of bypassing this problem. New approaches to block the integration process are depicted as future perspectives, such as the development of allosteric IN inhibitors, dual inhibitors targeting both IN and other enzymes, inhibitors of enzymes that activate IN, activators of IN activity, and gene therapy approaches. PMID:24025027
Couples coping with cardiovascular disease: A systematic review.
Trump, Lisa J; Mendenhall, Tai J
2017-03-01
Cardiovascular disease (CVD) is the leading cause of death for both men and women. Its potential ramifications on all aspects of life, for patients and partners, are just beginning to be understood. Although research has focused on the individual who has received the diagnosis, relatively little is known about how couples manage CVD. This article presents a systematic review of literature that focuses on how couples cope with one partner's CVD diagnosis. A systematic review is warranted to orient practitioners, policy makers, and researchers to the state of existing knowledge and its gaps and to identify what still needs to be done. Data were extracted from 25 peer-reviewed articles that met our inclusion criteria. Content examined included theory integration, coping constructs and instruments, samples, analyses, and findings. Most articles successfully integrated theory in the studies' respective conceptualizations and designs. Most used valid and reliable instruments to measure coping. Principal limitations included problematic sampling strategies and analysis techniques, thereby limiting external validity. Principal implications of this review's findings relate to our field's need to provide more care focused on dyads (vs. individual patients), adopt an integrated model in health care, and conduct systemic, longitudinal research to gain a better grasp on how coping changes over time. Doing so will serve to better equip providers in the support of patients and partners living with CVD.
NASA Technical Reports Server (NTRS)
Stonesifer, R. B.; Atluri, S. N.
1982-01-01
The development of valid creep fracture criteria is considered. Two path-independent integral parameters which show some degree of promise are the C* and (ΔT)_c integrals. The mathematical aspects of these parameters are reviewed by deriving generalized vector forms of the parameters using conservation laws which are valid for arbitrary, three-dimensional, cracked bodies with crack surface tractions (or applied displacements), body forces, inertial effects, and large deformations. Two principal conclusions are that (ΔT)_c has an energy rate interpretation whereas C* does not. The development and application of fracture criteria often involves the solution of boundary/initial value problems associated with deformation and stresses. The finite element method is used for this purpose. An efficient, small displacement, infinitesimal strain, displacement-based finite element model is specialized to two-dimensional plane stress and plane strain and to power-law creep constitutive relations. A mesh shifting/remeshing procedure is used for simulating crack growth. The model is implemented with the quarter-point node technique and also with specially developed, conforming, crack-tip singularity elements which provide for the r^(-n/(n+1)) strain singularity associated with the HRR crack-tip field. Comparisons are made with a variety of analytical solutions and alternate numerical solutions for a number of problems.
The Development and Validation of the Religious/Spiritually Integrated Practice Assessment Scale
ERIC Educational Resources Information Center
Oxhandler, Holly K.; Parrish, Danielle E.
2016-01-01
Objective: This article describes the development and validation of the Religious/Spiritually Integrated Practice Assessment Scale (RSIPAS). The RSIPAS is designed to assess social work practitioners' self-efficacy, attitudes, behaviors, and perceived feasibility concerning the assessment or integration of clients' religious and spiritual beliefs…
Collection of LAI and FPAR Data Over The Terra Core Sites
NASA Technical Reports Server (NTRS)
Myneni, Ranga B.; Knjazihhin, J.; Tian, Y.; Wang, Y.
2001-01-01
The objective of our effort was to collect and archive data on LAI (leaf area index) and FPAR (fraction of photosynthetically active radiation absorbed by vegetation) at the EOS Core validation sites, as well as to validate and evaluate global fields of LAI and FPAR derived from atmospherically corrected MODIS (Moderate Resolution Imaging Spectroradiometer) surface reflectance data by comparing these fields with the EOS Core validation data set. The above has been accomplished by: (a) participation in selected field campaigns within the EOS Validation Program; (b) processing of the collected data so that suitable comparisons between field measurements and the MODIS LAI/FPAR fields can be made; and (c) comparison of the MODIS LAI/FPAR fields with the EOS Terra Core validation data set.
Measurement and numerical simulation of high intensity focused ultrasound field in water
NASA Astrophysics Data System (ADS)
Lee, Kang Il
2017-11-01
In the present study, the acoustic field of a high intensity focused ultrasound (HIFU) transducer in water was measured by using a commercially available needle hydrophone intended for HIFU use. To validate the results of hydrophone measurements, numerical simulations of HIFU fields were performed by integrating the axisymmetric Khokhlov-Zabolotskaya-Kuznetsov (KZK) equation from the frequency-domain perspective with the help of a MATLAB-based software package developed for HIFU simulation. Quantitative values for the focal waveforms, the peak pressures, and the size of the focal spot were obtained in various regimes of linear, quasilinear, and nonlinear propagation up to the source pressure levels when the shock front was formed in the waveform. The numerical results with the HIFU simulator solving the KZK equation were compared with the experimental data and found to be in good agreement. This confirms that the numerical simulation based on the KZK equation is capable of capturing the nonlinear pressure field of therapeutic HIFU transducers well enough to make it suitable for HIFU treatment planning.
NASA Astrophysics Data System (ADS)
Kikuchi, Ryota; Misaka, Takashi; Obayashi, Shigeru
2016-04-01
An integrated method consisting of a proper orthogonal decomposition (POD)-based reduced-order model (ROM) and a particle filter (PF) is proposed for real-time prediction of an unsteady flow field. The proposed method is validated using identical twin experiments of an unsteady flow field around a circular cylinder for Reynolds numbers of 100 and 1000. In this study, a PF is employed (ROM-PF) to modify the temporal coefficients of the ROM based on observation data, because the prediction capability of the ROM alone is limited due to stability issues. The proposed method reproduces the unsteady flow field several orders of magnitude faster than a reference numerical simulation based on the Navier-Stokes equations. Furthermore, the effects of parameters related to observation and simulation on the prediction accuracy are studied. Most of the energy modes of the unsteady flow field are captured, and it is possible to stably predict the long-term evolution with ROM-PF.
Dual metal gate tunneling field effect transistors based on MOSFETs: A 2-D analytical approach
NASA Astrophysics Data System (ADS)
Ramezani, Zeinab; Orouji, Ali A.
2018-01-01
A 2-D analytical drain current model of a novel Dual Metal Gate Tunnel Field Effect Transistor based on a MOSFET (DMG-TFET) is presented in this paper. The proposed tunneling FET is derived from a MOSFET structure by employing an additional electrode in the source region with an appropriate work function to induce holes in the N+ source region and hence convert it into a P+ source region. The electric field is derived and utilized to extract an expression for the drain current by analytically integrating the band-to-band tunneling generation rate over the tunneling region, based on the potential profile obtained by solving Poisson's equation. Through this model, the effects of the thin-film thickness and gate voltage on the potential and electric field, and the effect of the thin-film thickness on the tunneling current, can be studied. To validate the present model, the analytical results have been compared with SILVACO ATLAS device simulations, and good agreement is found.
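The band-to-band tunneling integration step can be illustrated numerically with a Kane-type generation rate. The constants A and B and the field profile below are illustrative assumptions for the sketch, not the paper's calibrated values or its closed-form result:

```python
import numpy as np

q = 1.602e-19  # elementary charge, C

def kane_generation(E, A=4.0e14, B=1.9e7):
    """Kane-type band-to-band tunneling rate G = A * E^2 * exp(-B / E).
    A [cm^-3 s^-1 (V/cm)^-2] and B [V/cm] are illustrative constants."""
    E = np.asarray(E, float)
    return A * E**2 * np.exp(-B / E)

def tunneling_current_density(x, E_profile):
    """J = q * integral of G(E(x)) dx over the tunneling path
    (trapezoidal rule), mimicking the analytical integration in the model."""
    G = kane_generation(E_profile)
    return q * float(np.sum(0.5 * (G[1:] + G[:-1]) * np.diff(x)))

# Synthetic field profile over a 10 nm tunneling region (x in cm, E in V/cm).
x = np.linspace(0.0, 10e-7, 200)
E = 1.5e6 + 5.0e5 * np.exp(-x / 3e-7)
J = tunneling_current_density(x, E)
```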
An Integrated Approach to Establish Validity and Reliability of Reading Tests
ERIC Educational Resources Information Center
Razi, Salim
2012-01-01
This study presents the processes of developing and establishing the reliability and validity of a reading test by administering an integrative approach, since conventional reliability and validity measures reveal the difficulty of a reading test only superficially. In this respect, analysing the vocabulary frequency of the test is regarded as a more eligible way…
Analytic Formulation and Numerical Implementation of an Acoustic Pressure Gradient Prediction
NASA Technical Reports Server (NTRS)
Lee, Seongkyu; Brentner, Kenneth S.; Farassat, Fereidoun
2007-01-01
The scattering of rotor noise is an area that has received little attention over the years, yet the limited work that has been done has shown that both the directivity and intensity of the acoustic field may be significantly modified by the presence of scattering bodies. One of the inputs needed to compute the scattered acoustic field is the acoustic pressure gradient on a scattering surface. Two new analytical formulations of the acoustic pressure gradient have been developed and implemented in the PSU-WOPWOP rotor noise prediction code. These formulations are presented in this paper. The first formulation is derived by taking the gradient of Farassat's retarded-time Formulation 1A. Although this formulation is relatively simple, it requires numerical time differentiation of the acoustic integrals. In the second formulation, the time differentiation is taken inside the integrals analytically. The acoustic pressure gradient predicted by these new formulations is validated through comparison with the acoustic pressure gradient determined by a purely numerical approach for two model rotors. The agreement between the analytic formulations and the numerical method is excellent for both stationary and moving observer cases.
High order Nyström method for elastodynamic scattering
NASA Astrophysics Data System (ADS)
Chen, Kun; Gurrala, Praveen; Song, Jiming; Roberts, Ron
2016-02-01
Elastic waves in solids find important applications in ultrasonic non-destructive evaluation. The scattering of elastic waves has been treated using many approaches, such as the finite element method, the boundary element method, and the Kirchhoff approximation. In this work, we propose a novel, accurate, and efficient high order Nyström method to solve the boundary integral equations for elastodynamic scattering problems. This approach employs a high order geometry description for each element and high order interpolation for the fields inside each element. Compared with the boundary element method, this approach bases the choice of interpolation nodes on Gaussian quadrature, which renders matrix elements for far-field interactions free from integration and also greatly simplifies the treatment of singularities and near singularities. The proposed approach employs a novel, efficient near-singularity treatment that enables the solver to handle extreme geometries such as very thin penny-shaped cracks. Numerical results are presented to validate the approach. By using the frequency-domain response and performing the inverse Fourier transform, we also report the time-domain response of flaw scattering.
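A minimal Nyström solver for a scalar Fredholm equation of the second kind illustrates the core idea, namely that quadrature nodes double as collocation points so matrix entries need no further integration. The elastodynamic, high-order-geometry machinery of the paper is far beyond this sketch:

```python
import numpy as np

def nystrom_solve(kernel, f, a, b, n):
    """Solve u(x) - int_a^b K(x, y) u(y) dy = f(x) by Nystrom collocation
    at n Gauss-Legendre nodes: (I - K .* w) u = f at the nodes."""
    xs, ws = np.polynomial.legendre.leggauss(n)
    x = 0.5 * (b - a) * xs + 0.5 * (b + a)   # map nodes from [-1, 1] to [a, b]
    w = 0.5 * (b - a) * ws                   # mapped quadrature weights
    A = np.eye(n) - kernel(x[:, None], x[None, :]) * w[None, :]
    return x, np.linalg.solve(A, f(x))

# Manufactured example: K(x, y) = x*y on [0, 1] with exact solution u(x) = x.
# Then u(x) - int_0^1 x*y*u(y) dy = x - x/3 = (2/3) x, so take f(x) = (2/3) x.
x, u = nystrom_solve(lambda x, y: x * y, lambda x: (2.0 / 3.0) * x, 0.0, 1.0, 12)
```

Because the quadrature is exact for the polynomial integrand here, the recovered `u` matches the exact solution at the nodes to machine precision.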
The SAMI Galaxy Survey: can we trust aperture corrections to predict star formation?
NASA Astrophysics Data System (ADS)
Richards, S. N.; Bryant, J. J.; Croom, S. M.; Hopkins, A. M.; Schaefer, A. L.; Bland-Hawthorn, J.; Allen, J. T.; Brough, S.; Cecil, G.; Cortese, L.; Fogarty, L. M. R.; Gunawardhana, M. L. P.; Goodwin, M.; Green, A. W.; Ho, I.-T.; Kewley, L. J.; Konstantopoulos, I. S.; Lawrence, J. S.; Lorente, N. P. F.; Medling, A. M.; Owers, M. S.; Sharp, R.; Sweet, S. M.; Taylor, E. N.
2016-01-01
In the low-redshift Universe (z < 0.3), our view of galaxy evolution is primarily based on fibre optic spectroscopy surveys. Elaborate methods have been developed to address aperture effects when fixed aperture sizes only probe the inner regions for galaxies of ever decreasing redshift or increasing physical size. These aperture corrections rely on assumptions about the physical properties of galaxies. The adequacy of these aperture corrections can be tested with integral-field spectroscopic data. We use integral-field spectra drawn from 1212 galaxies observed as part of the SAMI Galaxy Survey to investigate the validity of two aperture correction methods that attempt to estimate a galaxy's total instantaneous star formation rate. We show that biases arise when assuming that instantaneous star formation is traced by broad-band imaging, and when the aperture correction is built only from spectra of the nuclear region of galaxies. These biases may be significant depending on the selection criteria of a survey sample. Understanding the sensitivities of these aperture corrections is essential for correct handling of systematic errors in galaxy evolution studies.
The IFS for WFIRST CGI: Science Requirements to Design
NASA Astrophysics Data System (ADS)
Groff, Tyler; Gong, Qian; Mandell, Avi M.; Zimmerman, Neil; Rizzo, Maxime; McElwain, Michael; Harvey, David; Saxena, Prabal; Cady, Eric; Mejia Prada, Camilo
2018-01-01
Direct imaging of exoplanets using a coronagraph has become a major field of research both on the ground and in space. Key to the science of direct imaging are the spectroscopic capabilities of the instrument: our ability to extract spectra and measure the abundance of molecular species such as methane. To take these spectra, the WFIRST coronagraph instrument (CGI) uses an integral field spectrograph (IFS), which encodes the spectrum into a two-dimensional image on the detector. This results in more efficient detection and characterization of targets, and the spectral information is critical to achieving detection limits below the speckle floor of the imager. The CGI IFS operates in three 18% bands spanning 600 nm to 970 nm at a nominal spectral resolution of R ~ 50. We present the current science and engineering requirements for the IFS design, the instrument design, anticipated performance, and how the calibration is integrated into the focal-plane wavefront control algorithms. We also highlight the role of the Prototype Imaging Spectrograph for Coronagraphic Exoplanet Studies (PISCES) at the JPL High Contrast Imaging Testbed in demonstrating performance and validating calibration methodologies for the flight instrument.
NASA Astrophysics Data System (ADS)
Steer, D. N.; Iverson, E. A.; Manduca, C. A.
2013-12-01
This research seeks to develop valid and reliable questions that faculty can use to assess geoscience literacy across the curriculum. We are particularly interested in the effects of curricula developed to teach Earth, Climate, Atmospheric, and Ocean Science concepts in the context of societal issues across the disciplines. This effort is part of the InTeGrate project, designed to create a population of college graduates who are poised to use geoscience knowledge in developing solutions to current and future environmental and resource challenges. Details concerning the project are found at http://serc.carleton.edu/integrate/index.html. The Geoscience Literacy Exam (GLE) under development presently includes 90 questions. Each big idea from each literacy document can be probed using one or more of three independent questions: 1) a single-answer multiple-choice question aimed at basic understanding or application of key concepts; 2) a multiple-correct-answer multiple-choice question targeting the applying-to-analyzing cognitive levels; and 3) a short essay question that tests the analysis or evaluation cognitive levels. We anticipate multiple-choice scores and the detail and sophistication of essay responses will increase as students engage with the curriculum. As part of the field testing of InTeGrate curricula, faculty collected student responses from classes that involved over 700 students. These responses included eight pre- and post-test multiple-choice questions that covered various concepts across the four literacies. Discrimination indices calculated from the data suggest that the eight tested questions provide a valid measure of literacy within the scope of the concepts covered. Student normalized gains across an academic term with limited InTeGrate exposure (typically two or fewer weeks of InTeGrate curriculum out of 14 weeks) averaged 16%.
A small set of control data (250 students in classes from one institution where no InTeGrate curricula were used) was also collected from a larger bank of test questions. Discrimination indices across the full bank showed variation, and additional work is underway to refine these questions and field test them in other settings in the absence of InTeGrate curricula. When complete, faculty will be able to assemble sets of questions to track progress toward meeting literacy goals. In addition to covering geoscience content knowledge and understanding, a complementary attitudinal pre/post survey was also developed with the intent to probe InTeGrate students' ability and motivation to use their geoscience expertise to address problems of environmental sustainability. The final instruments will be made available to the geoscience education community as an assessment to be used in conjunction with InTeGrate teaching materials or as a stand-alone tool for departments to measure student learning and attitudinal gains across the major.
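The 16% average gain quoted above is most naturally read as a Hake-style normalized gain, i.e. the fraction of the possible pre-to-post improvement that students actually realized. Assuming that definition (the study does not state its formula explicitly), the calculation is a one-liner; the function name and the example scores below are illustrative, not taken from the study:

```python
def normalized_gain(pre_pct, post_pct):
    """Hake normalized gain: fraction of possible improvement realized.

    pre_pct and post_pct are class-average percent-correct scores (0-100).
    """
    if pre_pct >= 100:
        raise ValueError("pre-test score already at ceiling")
    return (post_pct - pre_pct) / (100.0 - pre_pct)

# Hypothetical class averaging 40% pre-test and 49.6% post-test:
g = normalized_gain(40.0, 49.6)
print(f"normalized gain = {g:.2f}")  # prints "normalized gain = 0.16"
```

A gain of 0.16 on this scale matches the reported 16% average for classes with limited InTeGrate exposure.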
Why does trigonometric substitution work?
NASA Astrophysics Data System (ADS)
Cunningham, Daniel W.
2018-05-01
Modern calculus textbooks carefully illustrate how to perform integration by trigonometric substitution. Unfortunately, most of these books do not adequately justify this powerful technique of integration. In this article, we present an accessible proof that establishes the validity of integration by trigonometric substitution. The proof offers calculus instructors a simple argument that can be used to show their students that trigonometric substitution is a valid technique of integration.
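A canonical instance of the technique shows what such a proof must license: substituting an invertible trigonometric function and using a Pythagorean identity to clear the radical. The restriction of θ to (−π/2, π/2), where cos θ > 0 and sine is invertible, is exactly the hypothesis the proof has to justify:

```latex
% Substitute x = \sin\theta with \theta \in (-\pi/2, \pi/2),
% so that dx = \cos\theta\,d\theta and \sqrt{1-\sin^{2}\theta} = \cos\theta > 0:
\int \frac{dx}{\sqrt{1-x^{2}}}
  = \int \frac{\cos\theta \, d\theta}{\sqrt{1-\sin^{2}\theta}}
  = \int d\theta
  = \theta + C
  = \arcsin x + C .
```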
Development and validation of the crew-station system-integration research facility
NASA Technical Reports Server (NTRS)
Nedell, B.; Hardy, G.; Lichtenstein, T.; Leong, G.; Thompson, D.
1986-01-01
The various issues associated with the use of integrated flight management systems in aircraft were discussed. To address these issues, a fixed-base integrated flight research (IFR) simulation of a helicopter was developed to support experiments that contribute to the understanding of design criteria for rotorcraft cockpits incorporating advanced integrated flight management systems. A validation experiment was conducted that demonstrates the main features of the facility and the capability to conduct crew/system integration research.
Phase contrast STEM for thin samples: Integrated differential phase contrast.
Lazić, Ivan; Bosch, Eric G T; Lazar, Sorin
2016-01-01
It has been known since the 1970s that the movement of the center of mass (COM) of a convergent beam electron diffraction (CBED) pattern is linearly related to the (projected) electric field in the sample. We re-derive a contrast transfer function (CTF) for a scanning transmission electron microscopy (STEM) imaging technique based on this movement from the point of view of image formation, and continue by performing a two-dimensional integration of the two images based on the two components of the COM movement. The resulting integrated COM (iCOM) STEM technique yields a scalar image that is linear in the phase shift caused by the sample, and therefore also in the local (projected) electrostatic potential of a thin sample. We confirm that the differential phase contrast (DPC) STEM technique using a segmented detector with four quadrants (4Q) yields a good approximation of the COM movement. Performing a two-dimensional integration, just as for the COM, we obtain an integrated DPC (iDPC) image which is approximately linear in the phase of the sample. Besides deriving the CTFs of iCOM and iDPC, we clearly point out the objects of the two corresponding imaging techniques, and highlight the differences from the objects corresponding to COM-, DPC-, and (HA)ADF-STEM. The theory is validated with simulations, and we present first experimental results of the iDPC-STEM technique showing its capability for imaging both light and heavy elements with atomic resolution and a good signal-to-noise ratio (SNR).
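The two-dimensional integration step described above (recovering a scalar phase image from the two COM/DPC component images) is commonly carried out in Fourier space by solving ∇φ = g for φ. Below is a minimal numpy sketch of that idea; it is not the authors' implementation, and the function name, sampling convention, and handling of the undetermined mean are illustrative assumptions:

```python
import numpy as np

def integrate_dpc(gx, gy, pixel=1.0):
    """Integrate a two-component DPC/COM signal (gx, gy) into a scalar map.

    Solves grad(phi) = (gx, gy) in Fourier space:
        phi_k = (kx*Gx_k + ky*Gy_k) / (i * (kx^2 + ky^2)).
    The k = 0 (mean) component is undetermined and set to zero.
    """
    ny, nx = gx.shape
    kx = 2 * np.pi * np.fft.fftfreq(nx, d=pixel)
    ky = 2 * np.pi * np.fft.fftfreq(ny, d=pixel)
    KX, KY = np.meshgrid(kx, ky)
    k2 = KX**2 + KY**2
    k2[0, 0] = 1.0                                   # avoid division by zero at DC
    num = KX * np.fft.fft2(gx) + KY * np.fft.fft2(gy)
    num[0, 0] = 0.0                                  # mean of phi is undetermined
    return np.real(np.fft.ifft2(num / (1j * k2)))
```

Applied to the two orthogonal difference signals of a 4Q detector, the result approximates the iDPC image, linear in the sample's projected potential for a thin specimen.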
NASA Technical Reports Server (NTRS)
Litt, Jonathan S.; Sowers, T Shane; Liu, Yuan; Owen, A. Karl; Guo, Ten-Huei
2015-01-01
The National Aeronautics and Space Administration (NASA) has developed independent airframe and engine models that have been integrated into a single real-time aircraft simulation for piloted evaluation of propulsion control algorithms. In order to have confidence in the results of these evaluations, the integrated simulation must be validated to demonstrate that its behavior is realistic and that it meets the appropriate Federal Aviation Administration (FAA) certification requirements for aircraft. The paper describes the test procedures and results, demonstrating that the integrated simulation generally meets the FAA requirements and is thus a valid testbed for evaluation of propulsion control modes.
Desert Research and Technology Studies (DRATS) 2010 Education and Public Outreach (EPO)
NASA Astrophysics Data System (ADS)
Paul, Heather L.
2013-10-01
The Exploration Systems Mission Directorate, Directorate Integration Office conducts analog field test activities, such as Desert Research and Technology Studies (DRATS), to validate exploration system architecture concepts and conduct technology demonstrations. Education and Public Outreach (EPO) activities have been a part of DRATS missions in the past to engage students, educators, and the general public in analog activities. However, in 2010, for the first time, EPO was elevated as a principal task for the mission and metrics were collected for all EPO activities. EPO activities were planned well in advance of the mission, with emphasis on creating a multitude of activities to attract students of all ages. Web-based and social media interaction between August 31 and September 14, 2010 resulted in 62,260 DRATS Flickr views; 10,906 views of DRATS videos on YouTube; 1,483 new DRATS Twitter followers; and a 111% increase in DRATS Facebook fan interactions. Over 7,000 outreach participants were directly involved in the DRATS 2010 analog mission via student visitations at both the integrated dry-runs prior to the field mission and during the field mission; by participating in live, interactive webcasts and virtual events; and online voting to determine a traverse site as part of the NASA initiative for Participatory Exploration (PE).
NASA Astrophysics Data System (ADS)
Campbell, Todd; Abd-Hamid, Nor Hashidah
2013-08-01
This study describes the development of an instrument to investigate the extent to which technology is integrated into science instruction in ways aligned to the science reform outlined in standards documents. The instrument was developed by: (a) creating items consistent with the five dimensions identified in the science education literature, (b) establishing content validity with both national and international content experts, (c) refining the item pool based on content expert feedback, (d) pilot testing of the instrument, (e) checking statistical reliability and item analysis, and (f) subsequent refinement and finalization of the instrument. The TUSI was administered in a field test across eleven classrooms by three observers, with a total of 33 TUSI ratings completed. The finalized instrument was found to have acceptable inter-rater intraclass correlation reliability estimates. After the final stage of development, the TUSI instrument consisted of 26 items separated into the original five categories, which aligned with the exploratory factor analysis clustering of the items. Additionally, concurrent validity of the TUSI was established with the Reformed Teaching Observation Protocol. Finally, a subsequent set of 17 different classrooms was observed during the spring of 2011, and for the 9 classrooms where technology integration was observed, an overall Cronbach alpha reliability coefficient of 0.913 was found. Based on the analyses completed, the TUSI appears to be a useful instrument for measuring how technology is integrated into science classrooms and is seen as one mechanism for measuring the intersection of technological, pedagogical, and content knowledge in science classrooms.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ghila, A; Steciw, S; Fallone, B
Purpose: Integrated linac-MR systems are uniquely suited for real-time tumor tracking during radiation treatment. Understanding magnetic field dose effects and incorporating them in treatment planning is paramount for linac-MR clinical implementation. We experimentally validated EGSnrc dose calculations in the presence of a magnetic field parallel to the direction of radiation beam travel. Methods: Two cylindrical bore electromagnets produced a 0.21 T magnetic field parallel to the central axis of a 6 MV photon beam. A parallel-plate ion chamber was used to measure the PDD in a polystyrene phantom placed inside the bore in two setups: phantom top surface coinciding with the magnet bore center (183 cm SSD), and with the magnet bore's top surface (170 cm SSD). We measured the field of the magnet at several points and included the exact dimensions of the coils to generate a 3D magnetic field map in a finite element model. BEAMnrc and DOSXYZnrc simulated the PDD experiments in a parallel magnetic field (i.e., with the 3D magnetic field map included) and with no magnetic field. Results: With the phantom surface at the top of the electromagnet, the surface dose increased by 10% (compared to no magnetic field) due to electrons being focused by the smaller fringe fields of the electromagnet. With the phantom surface at the bore center, the surface dose increased by 30%, since an extra 13 cm air column was in a relatively higher magnetic field (>0.13 T) inside the magnet bore. The EGSnrc Monte Carlo code correctly calculated the radiation dose with and without the magnetic field, and all points passed the 2%, 2 mm gamma criterion when the ion chamber's entrance window and air cavity were included in the simulated phantom. Conclusion: A parallel magnetic field increases the surface and buildup dose during irradiation. The EGSnrc package can model these magnetic field dose effects accurately. Dr. Fallone is a co-founder and CEO of MagnetTx Oncology Solutions (under discussions to license the Alberta bi-planar linac-MR for commercialization).
The large-scale gravitational bias from the quasi-linear regime.
NASA Astrophysics Data System (ADS)
Bernardeau, F.
1996-08-01
It is known that in gravitational instability scenarios the nonlinear dynamics induces non-Gaussian features in cosmological density fields that can be investigated with perturbation theory. Here, I derive the expression for the joint moments of cosmological density fields taken at two different locations. The results are valid when the density fields are filtered with a top-hat window function, and when the distance between the two cells is large compared to the smoothing length. In particular, I show that it is possible to obtain the generating function of the coefficients $C_{p,q}$ defined by $\langle \delta^{p}(\mathbf{x}_1)\,\delta^{q}(\mathbf{x}_2)\rangle_{c} = C_{p,q}\,\langle \delta^{2}(\mathbf{x})\rangle^{p+q-2}\,\langle \delta(\mathbf{x}_1)\,\delta(\mathbf{x}_2)\rangle$, where $\delta(\mathbf{x})$ is the local smoothed density field. It is then possible to reconstruct the joint density probability distribution function (PDF), generalizing for two points what has been obtained previously for the one-point density PDF. I discuss the validity of the large-separation approximation in an explicit numerical Monte Carlo integration of the $C_{2,1}$ parameter as a function of $|\mathbf{x}_1-\mathbf{x}_2|$. A straightforward application is the calculation of the large-scale ``bias'' properties of the over-dense (or under-dense) regions. The properties and the shape of the bias function are presented in detail and successfully compared with numerical results obtained in an N-body simulation with CDM initial conditions.
NASA Astrophysics Data System (ADS)
Andromeda, A.; Lufri; Festiyed; Ellizar, E.; Iryani, I.; Guspatni, G.; Fitri, L.
2018-04-01
This research and development study aims to produce a valid and practical experiment-integrated, guided-inquiry-based module on the topic of colloidal chemistry. The 4D instructional design model was selected for this study. A limited trial of the product was conducted at SMAN 7 Padang. The instruments used were validity and practicality questionnaires. Validity and practicality data were analyzed using the Kappa moment. Analysis of the data shows that the Kappa moment for validity was 0.88, indicating a very high degree of validity. Kappa moments for practicality from students and teachers were 0.89 and 0.95 respectively, indicating a high degree of practicality. Analysis of the modules filled in by students shows that 91.37% of students could correctly answer the critical thinking, exercise, prelab, postlab, and worksheet questions asked in the module. These findings indicate that the experiment-integrated, guided-inquiry-based module on the topic of colloidal chemistry was valid and practical for chemistry learning in senior high school.
NASA Technical Reports Server (NTRS)
Belcastro, Celeste M.; Fischl, Robert; Kam, Moshe
1992-01-01
This paper presents a strategy for dynamically monitoring digital controllers in the laboratory for susceptibility to electromagnetic disturbances that compromise control integrity. The integrity of digital control systems operating in harsh electromagnetic environments can be compromised by upsets caused by induced transient electrical signals. Digital system upset is a functional error mode that involves no component damage, can occur simultaneously in all channels of a redundant control computer, and is software dependent. The motivation for this work is the need to develop tools and techniques that can be used in the laboratory to validate and/or certify critical aircraft controllers operating in electromagnetically adverse environments that result from lightning, high-intensity radiated fields (HIRF), and nuclear electromagnetic pulses (NEMP). The detection strategy presented in this paper provides dynamic monitoring of a given control computer for degraded functional integrity resulting from redundancy management errors, control calculation errors, and control correctness/effectiveness errors. In particular, this paper discusses the use of Kalman filtering, data fusion, and statistical decision theory in monitoring a given digital controller for control calculation errors.
Application Agreement and Integration Services
NASA Technical Reports Server (NTRS)
Driscoll, Kevin R.; Hall, Brendan; Schweiker, Kevin
2013-01-01
Application agreement and integration services are required by distributed, fault-tolerant, safety-critical systems to assure required performance. An analysis of distributed and hierarchical agreement strategies is developed against the backdrop of observed agreement failures in fielded systems. The documented work was performed under NASA Task Order NNL10AB32T, Validation And Verification of Safety-Critical Integrated Distributed Systems Area 2. This document is intended to satisfy the requirements for deliverable 5.2.11 under Task 4.2.2.3. This report discusses the challenges of maintaining application agreement and integration services. A literature search is presented that documents previous work in the area of replica determinism. Sources of non-deterministic behavior are identified, and examples are presented where system-level agreement failed to be achieved. We then explore how TTEthernet services can be extended to supply some interesting application agreement frameworks. This document assumes that the reader is familiar with the TTEthernet protocol. The reader is advised to read the TTEthernet protocol standard [1] before reading this document. This document does not reiterate the content of the standard.
GPM Ground Validation: Pre to Post-Launch Era
NASA Astrophysics Data System (ADS)
Petersen, Walt; Skofronick-Jackson, Gail; Huffman, George
2015-04-01
NASA GPM Ground Validation (GV) activities have transitioned from the pre- to the post-launch era. Prior to launch, direct validation networks and associated partner institutions were identified worldwide, covering a plethora of precipitation regimes. In the U.S., direct GV efforts focused on use of new operational products such as the NOAA Multi-Radar Multi-Sensor suite (MRMS) for TRMM validation and GPM radiometer algorithm database development. In the post-launch era, MRMS products including precipitation rate, accumulation, types, and data quality are being routinely generated to facilitate statistical GV of instantaneous (e.g., Level II orbit) and merged (e.g., IMERG) GPM products. Toward assessing precipitation column impacts on product uncertainties, range-gate to pixel-level validation of both Dual-Frequency Precipitation Radar (DPR) and GPM Microwave Imager data is performed using GPM Validation Network (VN) ground radar and satellite data processing software. VN software ingests quality-controlled volumetric radar datasets and geo-matches those data to coincident DPR and radiometer Level-II data. When combined, MRMS and VN datasets enable more comprehensive interpretation of both ground- and satellite-based estimation uncertainties. To support physical validation efforts, eight field campaigns were conducted in the pre-launch era and one in the post-launch era. The campaigns span regimes from northern-latitude cold-season snow to warm tropical rain. Most recently, the Integrated Precipitation and Hydrology Experiment (IPHEx) took place in the mountains of North Carolina and involved combined airborne and ground-based measurements of orographic precipitation and hydrologic processes underneath the GPM Core satellite. One more U.S. GV field campaign (OLYMPEX) is planned for late 2015 and will address cold-season precipitation estimation, processes, and hydrology in the orographic and oceanic domains of western Washington State.
Finally, continuous direct and physical validation measurements are also being conducted at the NASA Wallops Flight Facility multi-radar, gauge and disdrometer facility located in coastal Virginia. This presentation will summarize the evolution of the NASA GPM GV program from pre to post-launch eras and place focus on evaluation of year-1 post-launch GPM satellite datasets including Level II GPROF, DPR and Combined algorithms, and Level III IMERG products.
Vadose Zone Transport Field Study: Detailed Test Plan for Simulated Leak Tests
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ward, Anderson L.; Gee, Glendon W.
2000-06-23
This report describes controlled transport experiments at well-instrumented field tests to be conducted during FY 2000 in support of DOE's Vadose Zone Transport Field Study (VZTFS). The VZTFS supports the Groundwater/Vadose Zone Integration Project Science and Technology Initiative. The field tests will improve understanding of field-scale transport and lead to the development or identification of efficient and cost-effective characterization methods. These methods will capture the extent of contaminant plumes using existing steel-cased boreholes. Specific objectives are to 1) identify mechanisms controlling transport processes in soils typical of the hydrogeologic conditions of Hanford's waste disposal sites; 2) reduce uncertainty in conceptual models; 3) develop a detailed and accurate database of hydraulic and transport parameters for validation of three-dimensional numerical models; and 4) identify and evaluate advanced, cost-effective characterization methods with the potential to assess changing conditions in the vadose zone, particularly as surrogates of currently undetectable high-risk contaminants. Pacific Northwest National Laboratory (PNNL) manages the VZTFS for DOE.
Impacts of petroleum development in the Arctic
Robertson, S. B.
1989-01-01
In their article “Cumulative impacts of oil fields on northern Alaskan landscapes” (1), D. A. Walker et al. document some direct and indirect impacts of petroleum development in the Arctic on selected portions of the Prudhoe Bay oil field. While most of the kinds of impacts they discuss are valid points to consider in designing an arctic oil field, the magnitude of what they describe is not representative of the Prudhoe Bay field in general, or of newer oil fields, such as Kuparuk to the west of Prudhoe. It is even less applicable in areas of higher topographic relief, such as the coastal plain of the Arctic National Wildlife Refuge (ANWR). Any development will cause an impact to the land. In the Arctic, as noted by Walker et al., gravel roads and pads have been built that are thick enough to support facilities while the thermal integrity of the underlying permafrost is maintained. Decision-makers must evaluate whether or not the gains of development are worth the impacts incurred. Accurate assessment of both direct and indirect impacts is essential.
NASA Technical Reports Server (NTRS)
Ong, Cindy; Mueller, Andreas; Thome, Kurtis; Pierce, Leland E.; Malthus, Timothy
2016-01-01
Calibration is the process of quantitatively defining a system's responses to known, controlled signal inputs, and validation is the process of assessing, by independent means, the quality of the data products derived from those system outputs [1]. Similar to other Earth observation (EO) sensors, the calibration and validation of spaceborne imaging spectroscopy sensors is a fundamental underpinning activity. Calibration and validation determine the quality and integrity of the data provided by spaceborne imaging spectroscopy sensors and have enormous downstream impacts on the accuracy and reliability of products generated from these sensors. At least five imaging spectroscopy satellites are planned to be launched within the next five years, with the two most advanced scheduled to be launched in the next two years [2]. The launch of these sensors requires the establishment of suitable, standardized, and harmonized calibration and validation strategies to ensure that high-quality data are acquired and comparable between these sensor systems. Such activities are extremely important for the community of imaging spectroscopy users. Recognizing the need to focus on this underpinning topic, the Geoscience Spaceborne Imaging Spectroscopy (previously, the International Spaceborne Imaging Spectroscopy) Technical Committee launched a calibration and validation initiative at the 2013 International Geoscience and Remote Sensing Symposium (IGARSS) in Melbourne, Australia, and a post-conference activity of a vicarious calibration field trip at Lake Lefroy in Western Australia.
NASA Technical Reports Server (NTRS)
Wolf, David B.; Tokay, Ali; Petersen, Walt; Williams, Christopher; Gatlin, Patrick; Wingo, Mathew
2010-01-01
Proper characterization of the precipitation drop size distribution (DSD) is integral to providing realistic and accurate space- and ground-based precipitation retrievals. Current technology allows for the development of DSD products from a variety of platforms, including disdrometers, vertical profilers, and dual-polarization radars. Up to now, however, the dissemination or availability of such products has been limited to individual sites and/or field campaigns, in a variety of formats, often using inconsistent algorithms for computing the integral DSD parameters, such as the median- and mass-weighted drop diameters, total number concentration, liquid water content, rain rate, etc. We propose to develop a framework for the generation and dissemination of DSD characteristic products using a unified structure, capable of handling the myriad collection of disdrometer, profiler, and dual-polarization radar data currently available and to be collected during several upcoming GPM Ground Validation field campaigns. This DSD super-structure paradigm is an adaptation of the radar super-structure developed for NASA's Radar Software Library (RSL) and RSL_in_IDL. The goal is to provide the DSD products in a well-documented format, most likely NetCDF, along with tools to ingest and analyze the products. In so doing, we can develop a robust archive of DSD products from multiple sites and platforms, which should greatly benefit the development and validation of precipitation retrieval algorithms for GPM and other precipitation missions. An outline of this proposed framework will be provided, as well as a discussion of the algorithms used to calculate the DSD parameters.
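As a sketch of the kind of unified algorithm such a framework would standardize, the common integral DSD parameters can be computed directly from moments of a binned size distribution. This is a generic textbook formulation with illustrative names and units, not code from the proposed library:

```python
import numpy as np

def dsd_parameters(nd, d, dd, rho_w=1000.0):
    """Integral DSD parameters from binned disdrometer data.

    nd    : N(D), number concentration per bin [m^-3 mm^-1]
    d     : bin-center diameters [mm]
    dd    : bin widths [mm]
    rho_w : water density [kg m^-3]
    Returns (total number concentration [m^-3],
             liquid water content [g m^-3],
             mass-weighted mean diameter [mm]).
    """
    nt = np.sum(nd * dd)                     # 0th moment: total concentration
    m3 = np.sum(nd * d**3 * dd)              # 3rd moment [mm^3 m^-3]
    m4 = np.sum(nd * d**4 * dd)              # 4th moment [mm^4 m^-3]
    # LWC = (pi/6) * rho_w * m3, converting mm^3 -> m^3 (1e-9) and kg -> g (1e3)
    lwc = (np.pi / 6.0) * rho_w * 1e-9 * m3 * 1e3
    dm = m4 / m3                             # mass-weighted mean diameter
    return nt, lwc, dm
```

Rain rate follows the same moment pattern with a fall-speed weighting inside the sum, which is where retrieval algorithms most often diverge and where a shared implementation would help.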
NASA Technical Reports Server (NTRS)
Boersma, J.; Rahmat-Samii, Y.
1980-01-01
The diffraction of an arbitrary cylindrical wave by a half-plane has been treated by Rahmat-Samii and Mittra who used a spectral domain approach. In this paper, their exact solution for the total field is expressed in terms of a new integral representation. For large wave number k, two rigorous procedures are described for the exact uniform asymptotic expansion of the total field solution. The uniform expansions obtained are valid in the entire space, including transition regions around the shadow boundaries. The final results are compared with the formulations of two leading uniform theories of edge diffraction, namely, the uniform asymptotic theory and the uniform theory of diffraction. Some unique observations and conclusions are made in relating the two theories.
Development and Validation of a Polarimetric-MCScene 3D Atmospheric Radiation Model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Berk, Alexander; Hawes, Frederick; Fox, Marsha
2016-03-15
Polarimetric measurements can substantially enhance the ability of both spectrally resolved and single-band imagery to detect the proliferation of weapons of mass destruction, providing data for locating and identifying facilities, materials, and processes of undeclared and proliferant nuclear weapons programs worldwide. Unfortunately, models do not exist that efficiently and accurately predict spectral polarized signatures for the materials of interest embedded in complex 3D environments. Having such a model would enable one to test hypotheses and optimize both the enhancement of scene contrast and the signal processing for spectral signature extraction. Phase I set the groundwork for development of fully validated polarimetric spectral signature and scene simulation models. This has been accomplished 1. by (a) identifying and downloading state-of-the-art surface and atmospheric polarimetric data sources, (b) implementing tools for generating custom polarimetric data, and (c) identifying and requesting US Government funded field measurement data for use in validation; 2. by formulating an approach for upgrading the radiometric spectral signature model MODTRAN to generate polarimetric intensities through (a) ingestion of the polarimetric data, (b) polarimetric vectorization of existing MODTRAN modules, and (c) integration of a newly developed algorithm for computing polarimetric multiple scattering contributions; 3. by generating an initial polarimetric model that demonstrates calculation of polarimetric solar and lunar single-scatter intensities arising from the interaction of incoming irradiances with molecules and aerosols; 4. by developing a design and implementation plan to (a) automate polarimetric scene construction and (b) efficiently sample polarimetric scattering and reflection events, for use in a to-be-developed polarimetric version of the existing first-principles synthetic scene simulation model, MCScene; and 5. by planning a validation field measurement program in collaboration with the Remote Sensing and Exploitation group at Sandia National Laboratories (SNL), in which data from their ongoing polarimetric field and laboratory measurement program will be shared and, to the extent allowed, tailored for model validation in exchange for model predictions under conditions and for geometries outside of their measurement domain.
NASA Astrophysics Data System (ADS)
Aklan, B.; Jakoby, B. W.; Watson, C. C.; Braun, H.; Ritt, P.; Quick, H. H.
2015-06-01
A simulation toolkit, GATE (Geant4 Application for Tomographic Emission), was used to develop an accurate Monte Carlo (MC) simulation of a fully integrated 3T PET/MR hybrid imaging system (Siemens Biograph mMR). The PET/MR components of the Biograph mMR were simulated in order to allow a detailed study of the effects of variations in system design on PET performance, which are not easy to access and measure on a real PET/MR system. The 3T static magnetic field of the MR system was taken into account in all Monte Carlo simulations. Validation of the MC model was carried out against actual measurements performed on the PET/MR system by following the NEMA (National Electrical Manufacturers Association) NU 2-2007 standard. The comparison of simulated and experimental performance measurements included spatial resolution, sensitivity, scatter fraction, and count rate capability. The validated system model was then used for two different applications. The first application investigated the effect of an extension of the PET field-of-view on the PET performance of the PET/MR system. The second application simulated a modified system timing resolution and coincidence time window of the PET detector electronics in order to model time-of-flight (TOF) PET detection. A dedicated phantom was modeled to investigate the impact of TOF on overall PET image quality. Simulation results showed that the overall divergence between simulated and measured data was less than 10%. Varying the detector geometry showed that the system sensitivity and noise-equivalent count rate of the PET/MR system increased progressively with an increasing number of axial detector block rings, as is to be expected. TOF-based PET reconstructions of the modeled phantom showed an improvement in signal-to-noise ratio and image contrast relative to conventional non-TOF PET reconstructions.
In conclusion, the validated MC simulation model of an integrated PET/MR system with an overall accuracy error of less than 10% can now be used for further MC simulation applications such as development of hardware components as well as for testing of new PET/MR software algorithms, such as assessment of point-spread function-based reconstruction algorithms.
Quantum gravity in timeless configuration space
NASA Astrophysics Data System (ADS)
Gomes, Henrique
2017-12-01
On the path towards quantum gravity we find friction between temporal relations in quantum mechanics (QM) (where they are fixed and field-independent), and in general relativity (where they are field-dependent and dynamic). This paper aims to attenuate that friction, by encoding gravity in the timeless configuration space of spatial fields with dynamics given by a path integral. The framework demands that boundary conditions for this path integral be uniquely given, but unlike other approaches where they are prescribed—such as the no-boundary and the tunneling proposals—here I postulate basic principles to identify boundary conditions in a large class of theories. Uniqueness arises only if a reduced configuration space can be defined and if it has a profoundly asymmetric fundamental structure. These requirements place strong restrictions on the field and symmetry content of theories encompassed here; shape dynamics is one such theory. When these constraints are met, any emerging theory will have a Born rule given merely by a particular volume element built from the path integral in (reduced) configuration space. Also as in other boundary proposals, time, including space-time, emerges as an effective concept; valid for certain curves in configuration space but not assumed from the start. When some such notion of time becomes available, conservation of (positive) probability currents ensues. I show that, in the appropriate limits, a Schrödinger equation dictates the evolution of weakly coupled source fields on a classical gravitational background. Due to the asymmetry of reduced configuration space, these probabilities and currents avoid a known difficulty of standard WKB approximations for the Wheeler-DeWitt equation in minisuperspace: the selection of a unique Hamilton-Jacobi solution to serve as background. I illustrate these constructions with a simple example of a full quantum gravitational theory (i.e.
not in minisuperspace) for which the formalism is applicable, and give a formula for calculating gravitational semi-classical relative probabilities in it.
I-15 San Diego, California, model validation and calibration report.
DOT National Transportation Integrated Search
2010-02-01
The Integrated Corridor Management (ICM) initiative requires the calibration and validation of simulation models used in the Analysis, Modeling, and Simulation of Pioneer Site proposed integrated corridors. This report summarizes the results and proc...
The Integrated Airframe/Propulsion Control System Architecture program (IAPSA)
NASA Technical Reports Server (NTRS)
Palumbo, Daniel L.; Cohen, Gerald C.; Meissner, Charles W.
1990-01-01
The Integrated Airframe/Propulsion Control System Architecture program (IAPSA) is a two-phase program which was initiated by NASA in the early 1980s. The first phase, IAPSA 1, studied different architectural approaches to the problem of integrating engine control systems with airframe control systems in an advanced tactical fighter. One of the conclusions of IAPSA 1 was that the technology to construct a suitable system was available, yet the ability to create these complex computer architectures had outpaced the ability to analyze the resulting system's performance. With this in mind, the second phase of IAPSA approached the same problem with the added constraint that the system be designed for validation. The intent of the design-for-validation requirement is that validation requirements should be shown to be achievable early in the design process. IAPSA 2 has demonstrated that, despite diligent efforts, integrated systems can retain characteristics which are difficult to model and, therefore, difficult to validate.
Flight evaluation of advanced third-generation midwave infrared sensor
NASA Astrophysics Data System (ADS)
Shen, Chyau N.; Donn, Matthew
1998-08-01
In FY-97 the Counter Drug Optical Upgrade (CDOU) demonstration program was initiated by the Program Executive Office for Counter Drug to increase the detection and classification ranges of P-3 counter drug aircraft by using advanced staring infrared sensors. The demonstration hardware is a 'pin-for-pin' replacement of the AAS-36 Infrared Detection Set (IRDS) located under the nose radome of a P-3 aircraft. The hardware consists of a third-generation mid-wave infrared (MWIR) sensor integrated into a three-axis-stabilized turret. The sensor, when installed on the P-3, has a hemispheric field of regard, and analysis has shown it will be capable of detecting and classifying Suspected Drug Trafficking Aircraft and Vessels at ranges several times greater than the current IRDS. This paper discusses the CDOU system and its lab, ground, and flight evaluation results. Test targets included target templates, range targets, dedicated target boats, and targets of opportunity at the Naval Air Warfare Center Aircraft Division and at operational test sites. The objectives of these tests were to: (1) validate the integration concept of the CDOU package into the P-3 aircraft; (2) validate the end-to-end functionality of the system, including sensor/turret controls and recording of imagery during flight; (3) evaluate the system sensitivity and resolution on a set of verified resolution target templates; and (4) validate the ability of the third-generation MWIR sensor to detect and classify targets at a significantly increased range.
NASA Astrophysics Data System (ADS)
Haddad, Bouchra; Palacios, David; Pastor, Manuel; Zamorano, José Juan
2016-09-01
Lahars are among the most catastrophic volcanic processes, and the ability to model them is central to mitigating their effects. Several lahars recently generated by the Popocatépetl volcano (Mexico) moved downstream through the Huiloac Gorge towards the village of Santiago Xalitzintla. The most dangerous was the 2001 lahar, in which the destructive power of the debris flow was maintained throughout the extent of the flow. Hazard zones can be identified using either numerical or empirical models, but a calibration and validation process is required to ensure hazard map quality. The Geoflow-SPH depth-integrated numerical model used in this study to reproduce the 2001 lahar was derived from the velocity-pressure version of the Biot-Zienkiewicz model, and was discretized using the smoothed particle hydrodynamics (SPH) method. The results of the calibrated SPH model were validated by comparing the simulated deposit depth with the field depth measured at 16 cross sections distributed strategically along the gorge channel. Moreover, the dependency of the results on topographic mesh resolution and on initial lahar mass shape and dimensions was also investigated. The results indicate that to accurately reproduce the 2001 lahar flow dynamics the channel topography needed to be discretized using a mesh with a minimum 5 m resolution, and an initial lahar mass shape that adopted the source area morphology. Field validation of the calibrated model showed that there was a satisfactory relationship between the simulated and field depths, the error being less than 20% for 11 of the 16 cross sections. This study demonstrates that the Geoflow-SPH model not only accurately reproduced the lahar path and the extent of the flow, but also reproduced other parameters, including flow velocity and deposit depth.
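The field validation described above reduces to a relative-error check of simulated against measured deposit depths at each cross section. A minimal sketch of that comparison, assuming hypothetical depth values since the abstract does not list the measurements:

```python
import numpy as np

def validation_summary(simulated, measured, tol=0.20):
    """Relative deposit-depth error at each cross section and the number
    of sections whose error falls within the given tolerance."""
    simulated = np.asarray(simulated, dtype=float)
    measured = np.asarray(measured, dtype=float)
    rel_err = np.abs(simulated - measured) / measured
    return rel_err, int(np.sum(rel_err < tol))

# Hypothetical depths (m) at five of the sixteen cross sections
sim = [1.9, 2.4, 1.1, 0.8, 1.5]
obs = [2.0, 2.1, 1.0, 1.1, 1.6]
errors, n_within = validation_summary(sim, obs)
```

With these illustrative numbers, four of the five sections fall inside the 20% tolerance, mirroring the kind of per-section tally reported in the study.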
Aguilera Díaz, Jerónimo; Arias, Antonio Eduardo; Budalich, Cintia Mabel; Benítez, Sonia Elizabeth; López, Gastón; Borbolla, Damián; Plazzotta, Fernando; Luna, Daniel; de Quirós, Fernán González Bernaldo
2010-01-01
This paper describes the development and implementation of a web-based electronic health record for the Homecare Service program at the Hospital Italiano de Buenos Aires. It reviews the process of integrating the new electronic health record into the hospital information system, allowing physicians to access the clinical data repository from their PCs at home, with the capability of consulting the patient's past and present health care history, orders, tests, and referrals to other professionals through the new electronic health record. We also discuss how workflow processes were changed and improved for the physicians, nurses, and administrative personnel of the Homecare Service, and the educational methods used to improve acceptance and adoption of these new technologies. We also briefly describe the validation of physicians and their field work with electronic signatures.
Pressure loadings in a rectangular cavity with and without a captive store
Barone, Matthew; Arunajatesan, Srinivasan
2016-05-31
Simulations of the flow past a rectangular cavity containing a model captive store are performed using a hybrid Reynolds-averaged Navier–Stokes/large-eddy simulation model. Calculated pressure fluctuation spectra are validated using measurements made on the same configuration in a trisonic wind tunnel at Mach numbers of 0.60, 0.80, and 1.47. The simulation results are used to calculate unsteady integrated forces and moments acting on the store. Spectra of the forces and moments, along with correlations calculated for force/moment pairs, reveal that a complex relationship exists between the unsteady integrated forces and the measured resonant cavity modes, as indicated in the cavity wall pressure measurements. As a result, the structure of identified cavity resonant tones is examined by visualization of filtered surface pressure fields.
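Pressure-fluctuation spectra of the kind validated here are estimated from wall-pressure time series. A minimal numpy periodogram sketch with a synthetic signal (the 520 Hz tone and the sampling rate are illustrative placeholders, not values from the study; in practice Welch-style segment averaging would be used to reduce variance):

```python
import numpy as np

fs = 50000.0                                   # sampling rate (Hz, illustrative)
t = np.arange(0, 1.0, 1.0 / fs)
rng = np.random.default_rng(1)
# Synthetic wall-pressure record: one resonant tone buried in broadband noise
p = np.sin(2 * np.pi * 520.0 * t) + 0.5 * rng.standard_normal(t.size)

spec = np.abs(np.fft.rfft(p)) ** 2             # raw periodogram
freq = np.fft.rfftfreq(p.size, 1.0 / fs)
f_peak = freq[np.argmax(spec[1:]) + 1]         # skip the DC bin
```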
NASA Astrophysics Data System (ADS)
Pei, C.; Bieber, J. W.; Burger, R. A.; Clem, J.
2010-12-01
We present a detailed description of our newly developed stochastic approach for solving Parker's transport equation, which we believe is the first attempt to solve it with time dependence in 3-D, evolving from our 3-D steady state stochastic approach. Our formulation of this method is general and is valid for any type of heliospheric magnetic field, although we choose the standard Parker field as an example to illustrate the steps to calculate the transport of galactic cosmic rays. Our 3-D stochastic method is different from other stochastic approaches in the literature in several ways. For example, we employ spherical coordinates to integrate directly, which makes the code much more efficient by reducing coordinate transformations. What is more, the equivalence between our stochastic differential equations and Parker's transport equation is guaranteed by Ito's theorem in contrast to some other approaches. We generalize the technique for calculating particle flux based on the pseudoparticle trajectories for steady state solutions and for time-dependent solutions in 3-D. To validate our code, first we show that good agreement exists between solutions obtained by our steady state stochastic method and a traditional finite difference method. Then we show that good agreement also exists for our time-dependent method for an idealized and simplified heliosphere which has a Parker magnetic field and a simple initial condition for two different inner boundary conditions.
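The pseudoparticle trajectories described above are obtained by integrating stochastic differential equations. A minimal 1-D Euler-Maruyama sketch of that idea, assuming toy drift and diffusion coefficients rather than the paper's Parker-field coefficients:

```python
import numpy as np

def euler_maruyama_paths(r0, drift, diffusion, dt, n_steps, n_paths, rng):
    """Integrate dR = drift(R) dt + sqrt(2 diffusion(R)) dW for an ensemble
    of pseudoparticles in a 1-D radial toy model."""
    r = np.full(n_paths, r0, dtype=float)
    for _ in range(n_steps):
        dw = rng.normal(0.0, np.sqrt(dt), n_paths)
        r += drift(r) * dt + np.sqrt(2.0 * diffusion(r)) * dw
        r = np.maximum(r, 1e-6)   # keep pseudoparticles off the origin
    return r

rng = np.random.default_rng(0)
final_r = euler_maruyama_paths(
    r0=1.0,                                     # start at 1 AU (illustrative)
    drift=lambda r: 2.7e-6 * np.ones_like(r),   # toy outward convection (AU/s)
    diffusion=lambda r: 1.0e-5 * r,             # toy radial diffusion (AU^2/s)
    dt=1.0e3, n_steps=100, n_paths=500, rng=rng)
```

The flux estimate in the paper's method is then built from statistics over such pseudoparticle ensembles; the coefficients here are placeholders chosen only to keep the sketch self-contained.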
Linking Fine-Scale Observations and Model Output with Imagery at Multiple Scales
NASA Astrophysics Data System (ADS)
Sadler, J.; Walthall, C. L.
2014-12-01
The development and implementation of a system for seasonal worldwide agricultural yield estimates is underway with the international Group on Earth Observations GeoGLAM project. GeoGLAM includes a research component to continually improve and validate its algorithms. There is a history of field measurement campaigns going back decades to draw upon for ways of linking surface measurements and model results with satellite observations. Ground-based, in-situ measurements collected by interdisciplinary teams include yields, model inputs, and factors affecting scene radiation. Data that are comparable across space and time, with careful attention to calibration, are essential for the development and validation of agricultural applications of remote sensing. Data management to ensure stewardship, availability, and accessibility of the data is best accomplished when considered an integral part of the research. The expense and logistical challenges of field measurement campaigns can be prohibitive, and because of short research funding cycles, access to consistent, stable study sites can be lost. The use of dedicated staff for baseline data needed by multiple investigators, and the conduct of measurement campaigns using existing measurement networks such as the USDA Long Term Agroecosystem Research network, can fulfill these needs and ensure long-term access to study sites.
Wireless Fidelity Electromagnetic Field Exposure Monitoring With Wearable Body Sensor Networks.
Lecoutere, Jeroen; Thielens, Arno; Agneessens, Sam; Rogier, Hendrik; Joseph, Wout; Puers, Robert
2016-06-01
With the breakthrough of the Internet of Things and the steady increase of wireless applications in the daily environment, the assessment of radio frequency electromagnetic field (RF-EMF) exposure is key in determining possible health effects of exposure to certain levels of RF-EMF. This paper presents the first experimental validation of a novel personal exposimeter system based on a distributed measurement approach to achieve higher measurement quality and lower measurement variability than the commonly used single-point measurement approach of existing exposimeters. An important feature of the system is the integration of inertial sensors in order to determine activity and posture during exposure measurements. The system is designed to assess exposure to frequencies within the 389 to 464, 779 to 928, and 2400 to 2483.5 MHz bands using only two transceivers per node. In this study, the 2400 to 2483.5 MHz band is validated. Every node provides antenna diversity for the different bands in order to achieve higher sensitivity at these frequencies. Two AAA batteries power each standalone node and as such determine the node hardware size of this proof of concept (53 mm × 25 mm × 15 mm), making it smaller than any other commercially available exposimeter.
NASA Technical Reports Server (NTRS)
Duncan, L. M.; Reddell, J. P.; Schoonmaker, P. B.
1975-01-01
Techniques and support software for the efficient performance of simulation validation are discussed. Overall validation software structure, the performance of validation at various levels of simulation integration, guidelines for check case formulation, methods for real-time acquisition and formatting of data from an all-up operational simulator, and methods and criteria for comparison and evaluation of simulation data are included. Vehicle subsystem modules, module integration, special test requirements, and reference data formats are also described.
76 FR 28664 - Method 301-Field Validation of Pollutant Measurement Methods From Various Waste Media
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-18
... ENVIRONMENTAL PROTECTION AGENCY 40 CFR Part 63 [OAR-2004-0080, FRL-9306-8] RIN 2060-AF00 Method 301--Field Validation of Pollutant Measurement Methods From Various Waste Media AGENCY: Environmental Protection Agency (EPA). ACTION: Final rule. SUMMARY: This action amends EPA's Method 301, Field Validation...
NASA Technical Reports Server (NTRS)
Sweeney, Christopher; Bunnell, John; Chung, William; Giovannetti, Dean; Mikula, Julie; Nicholson, Bob; Roscoe, Mike
2001-01-01
Joint Shipboard Helicopter Integration Process (JSHIP) is a Joint Test and Evaluation (JT&E) program sponsored by the Office of the Secretary of Defense (OSD). Under the JSHIP program is a simulation effort referred to as the Dynamic Interface Modeling and Simulation System (DIMSS). The purpose of DIMSS is to develop and test the processes and mechanisms that facilitate ship-helicopter interface testing via man-in-the-loop ground-based flight simulators. Specifically, the DIMSS charter is to develop an accredited process for using a flight simulator to determine the wind-over-the-deck (WOD) launch and recovery flight envelope for the UH-60A ship/helicopter combination. DIMSS is a collaborative effort between the NASA Ames Research Center and OSD. OSD determines the T&E and warfighter training requirements, provides the programmatics and dynamic interface T&E experience, and conducts ship/aircraft interface tests for validating the simulation. NASA provides the research and development element, simulation facility, and simulation technical experience. This paper will highlight the benefits of the NASA/JSHIP collaboration and detail achievements of the project in terms of modeling and simulation. The Vertical Motion Simulator (VMS) at NASA Ames Research Center offers the capability to simulate a wide range of simulation cueing configurations, which include visual, aural, and body-force cueing devices. The system flexibility enables switching configurations to allow back-to-back evaluation and comparison of different levels of cueing fidelity in determining minimum training requirements. The investigation required development and integration of several major simulation systems at the VMS. A new UH-60A Black Hawk interchangeable cab that provides an out-the-window (OTW) field-of-view (FOV) of 220 degrees in azimuth and 70 degrees in elevation was built.
Modeling efforts involved integrating Computational Fluid Dynamics (CFD) generated data of an LHA ship airwake and integrating a real-time ship motion model developed from a batch model from the Naval Surface Warfare Center. Engineering development and integration of a three-degrees-of-freedom (DOF) dynamic seat, to simulate high-frequency rotor-dynamics-dependent motion cues for use in conjunction with the large motion system, was accomplished. An LHA visual model with several different levels of resolution was developed, along with an aural cueing system in which three separate fidelity levels could be selected. VMS also integrated a PC-based E&S simFUSION system to investigate cost-effective IG alternatives. The DIMSS project consists of three phases that follow an approved Verification, Validation and Accreditation (VV&A) process. The first phase will support the accreditation of the individual subsystems and models. The second will follow the verification and validation of the integrated subsystems and models, and will address fidelity requirements of the integrated models and subsystems. The third and final phase will allow the verification and validation of the full system integration. This VV&A process will address the utility of the simulated WOD launch and recovery envelope. Simulations supporting the first two stages have been completed, and the data are currently being reviewed and analyzed.
Validation of the Community Integration Questionnaire in the adult burn injury population.
Gerrard, Paul; Kazis, Lewis E; Ryan, Colleen M; Shie, Vivian L; Holavanahalli, Radha; Lee, Austin; Jette, Alan; Fauerbach, James A; Esselman, Peter; Herndon, David; Schneider, Jeffrey C
2015-11-01
With improved survival, long-term effects of burn injuries on quality of life, particularly community integration, are important outcomes. This study aims to assess the Community Integration Questionnaire's psychometric properties in the adult burn population. Data were obtained from a multicenter longitudinal data set of burn survivors. The psychometric properties of the Community Integration Questionnaire (n = 492) were examined. The questionnaire items were evaluated for clinical and substantive relevance; validation procedures were conducted on different samples of the population; construct validity was assessed using exploratory factor analysis; internal consistency reliability was examined using Cronbach's α statistics; and item response theory was applied to the final models. The CIQ-15 was reduced by two questions to form the CIQ-13, with a two-factor structure, interpreted as self/family care and social integration. Item response theory testing suggests that Factor 2 captures a wider range of community integration levels. Cronbach's α was 0.80 for Factor 1, 0.77 for Factor 2, and 0.79 for the test as a whole. The CIQ-13 demonstrates validity and reliability in the adult burn survivor population addressing issues of self/family care and social integration. This instrument is useful in future research of community reintegration outcomes in the burn population.
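Internal consistency reliability of the kind reported above is computed with Cronbach's α. A minimal sketch of the standard formula, assuming an illustrative score matrix rather than the CIQ data:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1.0)) * (1.0 - item_vars.sum() / total_var)

# Perfectly correlated items give alpha = 1 (upper bound of the statistic)
x = np.array([1.0, 2.0, 3.0, 4.0])
alpha_max = cronbach_alpha(np.column_stack([x, x, x]))
```

Values such as the 0.80 and 0.77 reported for the two CIQ-13 factors come from applying this formula to the respondents' item scores within each factor.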
Agricultural crop harvest progress monitoring by fully polarimetric synthetic aperture radar imagery
NASA Astrophysics Data System (ADS)
Yang, Hao; Zhao, Chunjiang; Yang, Guijun; Li, Zengyuan; Chen, Erxue; Yuan, Lin; Yang, Xiaodong; Xu, Xingang
2015-01-01
Dynamic mapping and monitoring of crop harvest on a large spatial scale will provide critical information for the formulation of optimal harvesting strategies. This study evaluates the feasibility of C-band polarimetric synthetic aperture radar (PolSAR) for monitoring the harvesting progress of oilseed rape (Brassica napus L.) fields. Five multitemporal, quad-pol Radarsat-2 images and one optical ZY-1 02C image were acquired over a farmland area in China during the 2013 growing season. Typical polarimetric signatures were obtained relying on polarimetric decomposition methods. Temporal evolutions of these signatures of harvested fields were compared with the ones of unharvested fields in the context of the entire growing cycle. Significant sensitivity was observed between the specific polarimetric parameters and the harvest status of oilseed rape fields. Based on this sensitivity, a new method that integrates two polarimetric features was devised to detect the harvest status of oilseed rape fields using a single image. The validation results are encouraging even for the harvested fields covered with high residues. This research demonstrates the capability of PolSAR remote sensing in crop harvest monitoring, which is a step toward more complex applications of PolSAR data in precision agriculture.
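Polarimetric signatures of the kind used above are typically derived from the eigenvalues of the coherency matrix. As a hedged illustration (the abstract does not specify which parameters were used), here is the standard Cloude-Pottier entropy H, one common decomposition output:

```python
import numpy as np

def polarimetric_entropy(T):
    """Cloude-Pottier entropy H from the eigenvalues of a 3x3 Hermitian
    coherency matrix T; H lies in [0, 1], with 1 for fully random scattering."""
    lam = np.clip(np.linalg.eigvalsh(T).real, 1e-12, None)
    p = lam / lam.sum()
    return float(-np.sum(p * np.log(p)) / np.log(3.0))

H_random = polarimetric_entropy(np.eye(3) / 3.0)          # depolarized: H = 1
H_pure = polarimetric_entropy(np.diag([1.0, 0.0, 0.0]))   # single mechanism: H ~ 0
```

A harvested field with exposed soil and residue tends to scatter more randomly than a closed vegetation canopy, which is the kind of contrast such eigenvalue-based parameters capture.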
Healing of voids in the aluminum metallization of integrated circuit chips
NASA Technical Reports Server (NTRS)
Cuddihy, Edward F.; Lawton, Russell A.; Gavin, Thomas R.
1990-01-01
The thermal stability of GaAs modulation-doped field effect transistors (MODFETs) is evaluated in order to identify failure mechanisms and validate the reliability of these devices. The transistors were exposed to thermal step-stress and characterized at ambient temperatures to indicate device reliability, especially that of the transistor ohmic contacts with and without molybdenum diffusion barriers. The devices without molybdenum exhibited significant transconductance degradation. MODFETs with molybdenum diffusion barriers were tolerant to temperatures above 300 C. This tolerance indicates that thermally activated failure mechanisms are slow at operational temperatures. Therefore, high-reliability MODFET-based circuits are possible.
Freud's superpotential in general relativity and in Einstein-Cartan theory
NASA Astrophysics Data System (ADS)
Böhmer, Christian G.; Hehl, Friedrich W.
2018-02-01
The identification of a suitable gravitational energy in theories of gravity has a long history, and it is well known that a unique answer cannot be given. In the first part of this paper we present a streamlined version of the derivation of Freud's superpotential in general relativity. It is obtained by integrating the gravitational field equation once by parts. This allows us to extend these results directly to the Einstein-Cartan theory. Interestingly, Freud's original expression, first stated in 1939, remains valid even when considering gravitational theories in Riemann-Cartan or, more generally, in metric-affine spacetimes.
Baker, Ruth E.; Schnell, Santiago; Maini, Philip K.
2014-01-01
In this article we will discuss the integration of developmental patterning mechanisms with waves of competency that control the ability of a homogeneous field of cells to react to pattern forming cues and generate spatially heterogeneous patterns. We base our discussion around two well known patterning events that take place in the early embryo: somitogenesis and feather bud formation. We outline mathematical models to describe each patterning mechanism, present the results of numerical simulations and discuss the validity of each model in relation to our example patterning processes. PMID:19557684
A multi-frequency iterative imaging method for discontinuous inverse medium problem
NASA Astrophysics Data System (ADS)
Zhang, Lei; Feng, Lixin
2018-06-01
The inverse medium problem with discontinuous refractive index is a challenging inverse problem. We employ primal-dual theory and fast solution of integral equations, and propose a new iterative imaging method. The selection criterion for the regularization parameter is given by the method of generalized cross-validation. Based on multi-frequency measurements of the scattered field, a recursive linearization algorithm is presented with respect to the frequency, proceeding from low to high. We also discuss the initial guess selection strategy based on semi-analytical approaches. Numerical experiments are presented to show the effectiveness of the proposed method.
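Generalized cross-validation chooses the regularization parameter by minimizing GCV(λ) = ||A x_λ − b||² / trace(I − A A_λ⁺)². A minimal Tikhonov/SVD sketch of the criterion, assuming an illustrative test matrix (the paper applies the criterion inside its integral-equation solver, not to this toy problem):

```python
import numpy as np

def gcv_score(A, b, lam):
    """GCV score for Tikhonov regularization min ||Ax-b||^2 + lam^2 ||x||^2,
    evaluated via the SVD filter factors f_i = s_i^2 / (s_i^2 + lam^2)."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    f = s**2 / (s**2 + lam**2)
    beta = U.T @ b
    # Full residual: filtered in-range part plus the out-of-range part of b
    resid = np.sum(((1.0 - f) * beta) ** 2) + np.sum((b - U @ beta) ** 2)
    dof = A.shape[0] - np.sum(f)          # effective residual degrees of freedom
    return resid / dof**2

def pick_lambda(A, b, lambdas):
    """Regularization parameter minimizing the GCV score over a candidate grid."""
    return min(lambdas, key=lambda lam: gcv_score(A, b, lam))

# Mildly ill-conditioned example with a small deterministic perturbation
A = np.vander(np.linspace(0.0, 1.0, 8), 5)
b = A @ np.ones(5) + 0.01 * np.sin(np.arange(8.0))
lams = np.logspace(-6, 1, 20)
best = pick_lambda(A, b, lams)
```

The attraction of GCV in this setting is that it requires no prior estimate of the noise level, only the data misfit and the effective degrees of freedom.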
General Series Solutions for Stresses and Displacements in an Inner-fixed Ring
NASA Astrophysics Data System (ADS)
Jiao, Yongshu; Liu, Shuo; Qi, Dexuan
2018-03-01
A general series solution approach is provided to obtain the stress and displacement fields in an inner-fixed ring. After choosing an Airy stress function in series form, the stresses are expressed through infinite sets of coefficients. Displacements are obtained by integrating the geometric equations. For an inner-fixed ring, the arbitrary loads acting on the outer edge are expanded into two sets of Fourier series. The zero-displacement boundary conditions on the inner surface are then applied, and the stress (and displacement) coefficients are expressed in terms of the loading coefficients. A numerical example shows the validity of this approach.
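The Fourier expansion of the outer-edge loads can be sketched numerically. A minimal example computing the load coefficients a_n, b_n (the pinch load cos 2θ is an illustrative choice, not the paper's numerical example):

```python
import numpy as np

def fourier_load_coeffs(load, n_terms, n_samples=2048):
    """Coefficients a_n, b_n so that the edge load satisfies
    p(theta) ~ a_0/2 + sum_{n>=1} (a_n cos(n theta) + b_n sin(n theta))."""
    theta = np.linspace(0.0, 2.0 * np.pi, n_samples, endpoint=False)
    p = load(theta)
    dtheta = 2.0 * np.pi / n_samples
    a = np.array([np.sum(p * np.cos(n * theta)) * dtheta / np.pi
                  for n in range(n_terms)])
    b = np.array([np.sum(p * np.sin(n * theta)) * dtheta / np.pi
                  for n in range(n_terms)])
    return a, b

# A pinch load p(theta) = cos(2 theta) has a_2 = 1 and all other coefficients 0
a, b = fourier_load_coeffs(lambda t: np.cos(2.0 * t), n_terms=4)
```

Once the loading coefficients are known, the series approach expresses each stress and displacement coefficient in terms of them, harmonic by harmonic.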
Electronic labelling in recycling of manufactured articles.
Olejnik, Lech; Krammer, Alfred
2002-12-01
The concept of a recycling system aiming at the recovery of resources from manufactured articles is proposed. The system integrates electronic labels for product identification and the Internet for global data exchange. A prototype for the recycling of electric motors has been developed, which implements a condition-based recycling decision system to automatically select the environmentally and economically appropriate recycling strategy, thereby opening a potential market for second-hand motors and creating a profitable recycling process in itself. The project has been designed to evaluate the feasibility of electronic identification applied to a large number of motors and to validate the system in real field conditions.
Luce, T. C.; Petty, C. C.; Meyer, W. H.; ...
2016-11-02
An approximate method to correct the motional Stark effect (MSE) spectroscopy for the effects of intrinsic plasma electric fields has been developed. The motivation for using an approximate method is to incorporate electric field effects for between-pulse or real-time analysis of the current density or safety factor profile. The toroidal velocity term in the momentum balance equation is normally the dominant contribution to the electric field orthogonal to the flux surface over most of the plasma. When this approximation is valid, the correction to the MSE data can be included in a form like that used when electric field effects are neglected. This allows measurements of the toroidal velocity to be integrated into the interpretation of the MSE polarization angles without changing how the data is treated in existing codes. In some cases, such as the DIII-D system, the correction is especially simple, due to the details of the neutral beam and MSE viewing geometry. The correction method is compared using DIII-D data in a variety of plasma conditions to analysis that assumes no radial electric field is present and to analysis that uses the standard correction method, which involves significant human intervention for profile fitting. The comparison shows that the new correction method is close to the standard one, and in all cases appears to offer a better result than use of the uncorrected data. Lastly, the method has been integrated into the standard DIII-D equilibrium reconstruction code in use for analysis between plasma pulses and is sufficiently fast that it will be implemented in real-time equilibrium analysis for control applications.
Designing and validating the joint battlespace infosphere
NASA Astrophysics Data System (ADS)
Peterson, Gregory D.; Alexander, W. Perry; Birdwell, J. Douglas
2001-08-01
Fielding and managing the dynamic, complex information systems infrastructure necessary for defense operations presents significant opportunities for revolutionary improvements in capabilities. An example of this technology trend is the creation and validation of the Joint Battlespace Infosphere (JBI) being developed by the Air Force Research Lab. The JBI is a system of systems that integrates, aggregates, and distributes information to users at all echelons, from the command center to the battlefield. The JBI is a key enabler of the Air Force's Joint Vision 2010 core competencies, such as Information Superiority, by providing increased situational awareness, planning capabilities, and dynamic execution. At the same time, creating this new operational environment introduces significant risk due to an increased dependency on computational and communications infrastructure combined with more sophisticated and frequent threats. Hence, the challenge facing the nation is to find the most effective means of exploiting new computational and communications technologies while mitigating the impact of attacks, faults, and unanticipated usage patterns.
Development of Airport Surface Required Navigation Performance (RNP)
NASA Technical Reports Server (NTRS)
Cassell, Rick; Smith, Alex; Hicok, Dan
1999-01-01
The U.S. and international aviation communities have adopted the Required Navigation Performance (RNP) process for defining aircraft performance when operating in the en-route, approach, and landing phases of flight. RNP consists primarily of the following key parameters: accuracy, integrity, continuity, and availability. The processes and analytical techniques employed to define en-route, approach, and landing RNP have been applied in the development of RNP for the airport surface. To validate the proposed RNP requirements, several methods were used. Operational and flight demonstration data were analyzed for conformance with the proposed requirements, as were several aircraft flight simulation studies. The pilot failure risk component was analyzed through several hypothetical scenarios. Additional simulator studies are recommended to better quantify crew reactions to failures, as well as additional simulator and field testing to validate achieved accuracy performance. This research was performed in support of the NASA Low Visibility Landing and Surface Operations Programs.
NASA-LaRc Flight-Critical Digital Systems Technology Workshop
NASA Technical Reports Server (NTRS)
Meissner, C. W., Jr. (Editor); Dunham, J. R. (Editor); Crim, G. (Editor)
1989-01-01
The outcome of a Flight-Critical Digital Systems Technology Workshop held at NASA-Langley on December 13-15, 1988 is documented. The purpose of the workshop was to elicit the aerospace industry's view of the issues which must be addressed for the practical realization of flight-critical digital systems. The workshop was divided into three parts: an overview session; three half-day meetings of seven working groups addressing aeronautical and space requirements, system design for validation, failure modes, system modeling, reliable software, and flight test; and a half-day summary of the research issues presented by the working group chairmen. The issues that generated the most consensus across the workshop were: (1) the lack of effective design and validation methods with support tools to enable engineering of highly integrated, flight-critical digital systems, and (2) the lack of high-quality laboratory and field data on system failures, especially those due to the electromagnetic environment (EME).
Equivalent radiation source of 3D package for electromagnetic characteristics analysis
NASA Astrophysics Data System (ADS)
Li, Jun; Wei, Xingchang; Shu, Yufei
2017-10-01
An equivalent radiation source method is proposed in this paper to characterize the electromagnetic emission and interference of complex three-dimensional integrated circuits (ICs). The method utilizes amplitude-only near-field scanning data to reconstruct an equivalent magnetic dipole array, and the differential evolution optimization algorithm is used to extract the locations, orientations, and moments of those dipoles. By importing the equivalent dipole model into a 3D full-wave simulator together with the victim circuit model, electromagnetic interference issues in mixed RF/digital systems can be well predicted. A commercial IC is used to validate the accuracy and efficiency of the proposed method. The coupled power at the victim antenna port calculated from the equivalent radiation source is compared with the measured data. Good consistency is obtained, which confirms the validity and efficiency of the method. Project supported by the National Natural Science Foundation of China (No. 61274110).
NASA Astrophysics Data System (ADS)
Moriarty, Patrick; Sanz Rodrigo, Javier; Gancarski, Pawel; Chuchfield, Matthew; Naughton, Jonathan W.; Hansen, Kurt S.; Machefaux, Ewan; Maguire, Eoghan; Castellani, Francesco; Terzi, Ludovico; Breton, Simon-Philippe; Ueda, Yuko
2014-06-01
Researchers within the International Energy Agency (IEA) Task 31: Wakebench have created a framework for the evaluation of wind farm flow models operating at the microscale level. The framework consists of a model evaluation protocol integrated with a web-based portal for model benchmarking (www.windbench.net). This paper provides an overview of the building-block validation approach applied to wind farm wake models, including best practices for the benchmarking and data processing procedures for validation datasets from wind farm SCADA and meteorological databases. A hierarchy of test cases has been proposed for wake model evaluation, from similarity theory of the axisymmetric wake and the idealized infinite wind farm, to single-wake wind tunnel (UMN-EPFL) and field experiments (Sexbierum), to wind farm arrays in offshore (Horns Rev, Lillgrund) and complex terrain conditions (San Gregorio). A summary of results from the axisymmetric wake, Sexbierum, Horns Rev, and Lillgrund benchmarks is used to discuss the state of the art of wake model validation and highlight the most relevant issues for future development.
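Among the engineering wake models commonly benchmarked in such exercises is the Jensen (Park) model. A minimal sketch, with illustrative parameter values; the abstract does not single out this particular model:

```python
import numpy as np

def jensen_deficit(ct, x, d0, k=0.05):
    """Top-hat fractional velocity deficit of the Jensen/Park wake model at
    downstream distance x behind a rotor of diameter d0, with thrust
    coefficient ct and wake decay constant k."""
    dw = d0 + 2.0 * k * x                 # linearly expanding wake diameter
    return (1.0 - np.sqrt(1.0 - ct)) * (d0 / dw) ** 2

# Deficit just behind the rotor vs. five diameters downstream (illustrative)
near = jensen_deficit(ct=0.8, x=0.0, d0=80.0)
far = jensen_deficit(ct=0.8, x=5 * 80.0, d0=80.0)
```

Validation against single-wake datasets such as Sexbierum typically compares deficits like these with measured velocity recovery downstream of the turbine.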
Williams, Jessica A R; Schult, Tamara M; Nelson, Candace C; Cabán-Martinez, Alberto J; Katz, Jeffrey N; Wagner, Gregory R; Pronk, Nicolaas P; Sorensen, Glorian; McLellan, Deborah L
2016-05-01
To conduct validation and dimensionality analyses for an existing measure of the integration of worksite health protection and health promotion approaches. A survey of small- to medium-sized employers located in the United States was conducted between October 2013 and March 2014 (N = 115). A survey of Department of Veterans Affairs (VA) administrative parents was also conducted from June to July 2014 (N = 140). Exploratory factor analysis (EFA) was used to determine the dimensionality of the Integration Score in each sample. In both samples, EFA indicated the presence of one unified factor. The VA survey indicated that customization improves the relevance of the Integration Score for different types of organizations. The Integration Score is a valid index for assessing the integration of worksite health protection and health promotion approaches, is customizable by industry, and may be used as a single metric for assessing such integration in differing work contexts.
Refining and validating a conceptual model of Clinical Nurse Leader integrated care delivery.
Bender, Miriam; Williams, Marjory; Su, Wei; Hites, Lisle
2017-02-01
To empirically validate a conceptual model of Clinical Nurse Leader integrated care delivery. There is limited evidence of frontline care delivery models that consistently achieve quality patient outcomes. Clinical Nurse Leader integrated care delivery is a promising nursing model with a growing record of success. However, theoretical clarity is necessary to generate causal evidence of effectiveness. Sequential mixed methods. A preliminary Clinical Nurse Leader practice model was refined and survey items developed to correspond with model domains, using focus groups and a Delphi process with a multi-professional expert panel. The survey was administered in 2015 to clinicians and administrators involved in Clinical Nurse Leader initiatives. Confirmatory factor analysis and structural equation modelling were used to validate the measurement and model structure. Final sample n = 518. The model incorporates 13 components organized into five conceptual domains: 'Readiness for Clinical Nurse Leader integrated care delivery'; 'Structuring Clinical Nurse Leader integrated care delivery'; 'Clinical Nurse Leader Practice: Continuous Clinical Leadership'; 'Outcomes of Clinical Nurse Leader integrated care delivery'; and 'Value'. Sample data showed good fit with the specified model and two-level measurement structure. All hypothesized pathways were significant, with strong coefficients suggesting good fit between theorized and observed path relationships. The validated model articulates an explanatory pathway of Clinical Nurse Leader integrated care delivery, including Clinical Nurse Leader practices that result in improved care dynamics and patient outcomes. The validated model provides a basis for testing in practice to generate evidence that can be deployed across the healthcare spectrum. © 2016 John Wiley & Sons Ltd.
A Translational Model of Research-Practice Integration
Vivian, Dina; Hershenberg, Rachel; Teachman, Bethany A.; Drabick, Deborah A. G.; Goldfried, Marvin R.; Wolfe, Barry
2013-01-01
We propose a four-level, recursive Research-Practice Integration framework as a heuristic to (a) integrate and reflect on the articles in this Special Section as contributing to a bidirectional bridge between research and practice, and (b) consider additional opportunities to address the research–practice gap. Level 1 addresses Treatment Validation studies and includes an article by Lochman and colleagues concerning the programmatic adaptation, implementation, and dissemination of the empirically supported Coping Power treatment program for youth aggression. Level 2 translation, Training in Evidence-Based Practice, includes a paper by Hershenberg, Drabick, and Vivian, which focuses on the critical role that predoctoral training plays in bridging the research–practice gap. Level 3 addresses the Assessment of Clinical Utility and Feedback to Research aspects of translation. The articles by Lambert and Youn, Kraus, and Castonguay illustrate the use of commercial outcome packages that enable psychotherapists to integrate ongoing client assessment, thus enhancing the effectiveness of treatment implementation and providing data that can be fed back to researchers. Lastly, Level 4 translation, the Cross-Level Integrative Research and Communication, concerns research efforts that integrate data from clinical practice and all other levels of translation, as well as communication efforts among all stakeholders, such as researchers, psychotherapists, and clients. Using a two-chair technique as a framework for his discussion, Wolfe's article depicts the struggle inherent in research–practice integration efforts and proposes a rapprochement that highlights advancements in the field. PMID:22642522
NASA Astrophysics Data System (ADS)
Šprlák, Michal; Novák, Pavel
2017-02-01
New spherical integral formulas between components of the second- and third-order gravitational tensors are formulated in this article. First, we review the nomenclature and basic properties of the second- and third-order gravitational tensors. Initial points of the mathematical derivations, i.e., the second- and third-order differential operators defined in the spherical local North-oriented reference frame and the analytical solutions of the gradiometric boundary-value problem, are also summarized. Secondly, we apply the third-order differential operators to the analytical solutions of the gradiometric boundary-value problem, which gives 30 new integral formulas transforming (1) vertical-vertical, (2) vertical-horizontal and (3) horizontal-horizontal second-order gravitational tensor components onto their third-order counterparts. Using spherical polar coordinates, the related sub-integral kernels can efficiently be decomposed into azimuthal and isotropic parts. Both spectral and closed forms of the isotropic kernels are provided and their limits are investigated. Thirdly, numerical experiments are performed to test the consistency of the new integral transforms and to investigate properties of the sub-integral kernels. The new mathematical apparatus is valid for any harmonic potential field and may be exploited, e.g., when gravitational/magnetic second- and third-order tensor components become available in the future. The new integral formulas also extend the well-known Meissl diagram and enrich the theoretical apparatus of geodesy.
Shi, X; Zhou, J L; Zhao, H; Hou, L; Yang, Y
2014-09-01
The polar organic chemical integrative sampler (POCIS) was used to assess the occurrence and risk of 12 widely used antibiotics and 5 of the most potent endocrine disrupting chemicals (EDCs) in the Yangtze Estuary, China. During laboratory validation, the kinetics of pollutant uptake by POCIS were linear, and the sampling rates of most compounds increased with flow rate and salinity, reaching the highest values at a salinity of 14‰. The sampling rates varied with the target compounds, with the EDCs showing the highest values (overall average = 0.123 L/day), followed by chloramphenicols (0.100 L/day), macrolides (0.089 L/day), and finally sulfonamides (0.056 L/day). Validation in the Yangtze Estuary in 2013 showed that the field sampling rates were significantly greater for all compounds except bisphenol A in comparison to the laboratory results, and that high-frequency spot sampling is critical for fully validating the passive sampler. The field studies show that antibiotics were widely detected in the Yangtze Estuary, with concentrations varying from below quantification to 1613 ng/L, suggesting their widespread use and persistence in estuarine waters. The dominant pollutants were sulfonamides in July, with a total concentration of 258 ng/L, and macrolides in October, with a total concentration of 350 ng/L. The calculated risk quotients suggested that sulfapyridine, sulfaquinoxaline and erythromycin-H2O may have caused medium-level harm to sensitive organisms such as fish. Copyright © 2014. Published by Elsevier Ltd.
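The integrative-sampling arithmetic behind such results is simple: under linear (integrative) uptake, the time-weighted average water concentration follows from the mass accumulated on the sorbent and the sampling rate, and a risk quotient compares it with a no-effect concentration. The numbers below are hypothetical; only the 0.089 L/day macrolide sampling rate echoes the abstract:

```python
def twa_concentration(mass_ng, Rs_L_per_day, days):
    """Time-weighted average water concentration (ng/L) inferred from the
    mass accumulated on a POCIS sorbent, assuming linear uptake:
    C_w = N / (Rs * t)."""
    return mass_ng / (Rs_L_per_day * days)

def risk_quotient(mec_ng_L, pnec_ng_L):
    """RQ = measured environmental concentration / predicted no-effect
    concentration; 0.1 <= RQ < 1 is commonly read as 'medium' risk."""
    return mec_ng_L / pnec_ng_L

# Hypothetical 14-day deployment of one sampler
c = twa_concentration(mass_ng=25.0, Rs_L_per_day=0.089, days=14)
print(round(c, 1))                                  # ~20.1 ng/L
print(round(risk_quotient(c, pnec_ng_L=40.0), 2))   # ~0.5 -> medium risk band
```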
NASA Astrophysics Data System (ADS)
Schuetze, C.; Sauer, U.; Dietrich, P.
2015-12-01
Reliable detection and assessment of near-surface CO2 emissions from natural or anthropogenic sources require the application of various monitoring tools at different spatial scales. Optical remote sensing tools for atmospheric monitoring, in particular, have the potential to measure CO2 emissions integrally over larger areas (>10,000 m2). Within the framework of the MONACO project ("Monitoring approach for geological CO2 storage sites using a hierarchical observation concept"), an integrative hierarchical monitoring concept was developed and validated at different field sites, with the aim of establishing a modular observation strategy including investigations in the shallow subsurface, at ground surface level and in the lower atmospheric boundary layer. The main aims of the atmospheric monitoring using optical remote sensing were to observe gas dispersion into the near-surface atmosphere, to determine maximum concentration values and to identify the main challenges of monitoring extended emission sources with the proposed methodological setup under typical environmental conditions. The presentation gives an overview of several case studies using the integrative approach of Open-Path Fourier Transform Infrared spectroscopy (OP-FTIR) in combination with in situ measurements. As a main result, the method was validated as a possible approach for continuous monitoring of the atmospheric composition, both for integral determination of GHG concentrations and for identifying target areas that need to be investigated in more detail. Data interpretation should closely consider the micrometeorological conditions, and technical aspects concerning robust equipment, experimental setup and fast data-processing algorithms have to be taken into account for enhanced automation of atmospheric monitoring.
2015-07-01
steps to identify and mitigate potential challenges; (2) the extent to which the services' efforts to validate gender-neutral occupational standards are ... to address statutory and Joint Staff requirements for validating gender-neutral occupational standards. GAO identified five elements required for ... SOCOM Have Studies Underway to Validate Gender-Neutral Occupational Standards ... DOD Is Providing Oversight of Integration Efforts, but Has Not
1992-06-01
predicting both job performance and counterproductive behaviors on the job such as theft, disciplinary problems, and absenteeism. Validities were found to ... be generalizable. The estimated mean operational predictive validity of integrity tests for supervisory ratings of job performance is .41. For the ... (Performing organization: University of Iowa; report 92-1.)
Cytotoxicity of metal and semiconductor nanoparticles indicated by cellular micromotility.
Tarantola, Marco; Schneider, David; Sunnick, Eva; Adam, Holger; Pierrat, Sebastien; Rosman, Christina; Breus, Vladimir; Sönnichsen, Carsten; Basché, Thomas; Wegener, Joachim; Janshoff, Andreas
2009-01-27
In the growing field of nanotechnology, there is an urgent need to sensitively determine the toxicity of nanoparticles, since many technical and medical applications are based on controlled exposure to particles, for example as contrast agents or for drug delivery. Before in vivo implementation, in vitro cell experiments are required to achieve a detailed knowledge of toxicity and biodegradation as a function of the nanoparticles' physical and chemical properties. In this study, we show that the micromotility of animal cells as monitored by electrical cell-substrate impedance sensing (ECIS) is highly suitable to quantify the in vitro cytotoxicity of semiconductor quantum dots and gold nanorods. The method is validated by conventional cytotoxicity testing and accompanied by fluorescence and dark-field microscopy to visualize changes in cytoskeleton integrity and to determine the location of the particles within the cell.
Identifying differentially expressed genes in cancer patients using a non-parameter Ising model.
Li, Xumeng; Feltus, Frank A; Sun, Xiaoqian; Wang, James Z; Luo, Feng
2011-10-01
Identification of genes and pathways involved in diseases and physiological conditions is a major task in systems biology. In this study, we developed a novel non-parameter Ising model to integrate a protein-protein interaction network and microarray data for identifying differentially expressed (DE) genes. We also proposed a simulated annealing algorithm to find the optimal configuration of the Ising model. The Ising model was applied to two breast cancer microarray data sets. The results showed that more cancer-related DE sub-networks and genes were identified by the Ising model than by the Markov random field model. Furthermore, cross-validation experiments showed that DE genes identified by the Ising model can improve classification performance compared with DE genes identified by the Markov random field model. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
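A minimal sketch of this approach on a toy network: each gene carries a spin (+1 for differentially expressed, -1 otherwise), edges of the protein-protein interaction network couple neighbouring spins, and per-gene expression scores act as an external field. The energy function and annealing schedule below are generic textbook choices, not the authors' exact formulation:

```python
import math, random

def anneal_ising(adj, h, T0=2.0, cooling=0.995, steps=20000, seed=0):
    """Simulated annealing for a network Ising model.

    adj : dict node -> set of neighbouring nodes (the PPI network)
    h   : dict node -> external field (e.g. a gene's expression score;
          positive values favour the 'differentially expressed' state +1)
    Energy E = -sum_{(i,j) in edges} s_i*s_j - sum_i h_i*s_i, so connected
    genes prefer the same state and high-scoring genes prefer +1.
    """
    rng = random.Random(seed)
    s = {i: rng.choice((-1, 1)) for i in adj}
    T = T0
    for _ in range(steps):
        i = rng.choice(list(adj))
        # Energy change from flipping spin i
        dE = 2 * s[i] * (sum(s[j] for j in adj[i]) + h[i])
        if dE <= 0 or rng.random() < math.exp(-dE / T):
            s[i] = -s[i]
        T *= cooling
    return s

# Toy network: a high-scoring connected triad plus one low-scoring node
adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2}}
h = {0: 2.0, 1: 2.0, 2: 2.0, 3: -3.0}
state = anneal_ising(adj, h)
print(state)   # the triad 0-2 settles at +1 (a DE sub-network), node 3 at -1
```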
Using Landsat satellite data to support pesticide exposure assessment in California
Maxwell, Susan K.; Airola, Matthew; Nuckols, John R.
2010-01-01
We found the combination of Landsat 5 and 7 image data would clearly benefit pesticide exposure assessment in this region by 1) providing information on crop field conditions at or near the time when pesticides are applied, and 2) providing information for validating the CDWR map. The Landsat image time-series was useful for identifying idle, single-, and multi-cropped fields. Landsat data will be limited during the winter months due to cloud cover, and for years prior to the Landsat 7 launch (1999) when only one satellite was operational at any given time. We suggest additional research to determine the feasibility of integrating CDWR land use maps and Landsat data to derive crop maps in locations and time periods where maps are not available, which will allow for substantial improvements to chemical exposure estimation.
From Sommerfeld and Brillouin forerunners to optical precursors
NASA Astrophysics Data System (ADS)
Macke, Bruno; Ségard, Bernard
2013-04-01
The Sommerfeld and Brillouin forerunners generated in a single-resonance absorbing medium by an incident step-modulated pulse are theoretically considered in the double limit where the susceptibility of the medium is weak and the resonance is narrow. Combining direct Laplace-Fourier integration and calculations by the saddle-point method, we establish an explicit analytical expression of the transmitted field valid at any time, even when the two forerunners significantly overlap. We examine how their complete overlapping, occurring for shorter propagation distances, originates the formation of the unique transient currently named resonant precursor or dynamical beat. We obtain an expression of this transient identical to that usually derived within the slowly varying envelope approximation in spite of the initial discontinuity of the incident field envelope. The dynamical beats and 0π pulses generated by ultrashort incident pulses are also briefly examined.
Coupled BE/FE/BE approach for scattering from fluid-filled structures
NASA Technical Reports Server (NTRS)
Everstine, Gordon C.; Cheng, Raymond S.
1990-01-01
NASHUA is a coupled finite element/boundary element capability built around NASTRAN for calculating the low frequency far-field acoustic pressure field radiated or scattered by an arbitrary, submerged, three-dimensional, elastic structure subjected to either internal time-harmonic mechanical loads or external time-harmonic incident loadings. Described here are the formulation and use of NASHUA for solving such structural acoustics problems when the structure is fluid-filled. NASTRAN is used to generate the structural finite element model and to perform most of the required matrix operations. Both fluid domains are modeled using the boundary element capability in NASHUA, whose matrix formulation (and the associated NASTRAN DMAP) for evacuated structures can be used with suitable interpretation of the matrix definitions. After computing surface pressures and normal velocities, far-field pressures are evaluated using an asymptotic form of the Helmholtz exterior integral equation. The proposed numerical approach is validated by comparing the acoustic field scattered from a submerged fluid-filled spherical thin shell to that obtained with a series solution, which is also derived here.
Electromagnetic Field Assessment as a Smart City Service: The SmartSantander Use-Case
Diez, Luis; Agüero, Ramón; Muñoz, Luis
2017-01-01
Despite the increasing presence of wireless communications in everyday life, there exist some voices raising concerns about their adverse effects. One particularly relevant example is the potential impact of the electromagnetic field they induce on the population’s health. Traditionally, very specialized methods and devices (dosimetry) have been used to assess the strength of the E-field, with the main objective of checking whether it respects the corresponding regulations. In this paper, we propose a complete novel approach, which exploits the functionality leveraged by a smart city platform. We deploy a number of measuring probes, integrated as sensing devices, to carry out a characterization embracing large areas, as well as long periods of time. This unique platform has been active for more than one year, generating a vast amount of information. We process such information, and the obtained results validate the whole methodology. In addition, we discuss the variation of the E-field caused by cellular networks, considering additional information, such as usage statistics. Finally, we establish the exposure that can be attributed to the base stations within the scenario under analysis. PMID:28561783
LANL* V1.0: a radiation belt drift shell model suitable for real-time and reanalysis applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Koller, Josep; Reeves, Geoffrey D; Friedel, Reiner H W
2008-01-01
Space weather modeling, forecasts, and predictions, especially for the radiation belts in the inner magnetosphere, require detailed information about the Earth's magnetic field. Results depend on the magnetic field model and the L* (pronounced L-star) values which are used to describe particle drift shells. Space weather models require integrating particle motions along trajectories that encircle the Earth. Numerical integration typically takes on the order of 10^5 calls to a magnetic field model, which makes the L* calculations very slow, in particular when using a dynamic and more accurate magnetic field model. Researchers currently tend to pick simplistic models over more accurate ones, risking large inaccuracies and even wrong conclusions. For example, magnetic field models affect the calculation of electron phase space density by applying adiabatic invariants including the drift shell value L*. We present here a new method using a surrogate model based on a neural network technique to replace the time-consuming L* calculations made with modern magnetic field models. The advantage of surrogate models (or meta-models) is that they can compute the same output in a fraction of the time while adding only a marginal error. Our drift shell model LANL* (Los Alamos National Lab L-star) is based on L* calculation using the TSK03 model. The surrogate model has currently been tested and validated only for geosynchronous regions, but the method is generally applicable to any satellite orbit. Computations with the new model are several million times faster compared to the standard integration method while adding less than 1% error. Currently, real-time applications for forecasting and even nowcasting inner magnetospheric space weather are limited partly due to the long computing time of accurate L* values; without them, real-time applications are limited in accuracy.
Reanalysis applications use past conditions in the inner magnetosphere to understand physical processes and their effects. Without sufficiently accurate L* values, the interpretation of reanalysis results becomes difficult and uncertain. However, with a method that can calculate accurate L* values orders of magnitude faster, analyzing whole solar cycles' worth of data suddenly becomes feasible.
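The surrogate-model recipe is general: train a cheap regressor on input-output pairs from the expensive calculation, then evaluate the regressor instead. The sketch below replaces the real field-line integration with a hypothetical smooth stand-in function and uses a small scikit-learn MLP; it is not the LANL* network or the TSK03 model:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Hypothetical stand-in for an expensive drift-shell integration: any
# smooth map from input parameters to an L*-like value.
def expensive_Lstar(x):
    return 4.0 + np.sin(x[:, 0]) + 0.3 * x[:, 1]

rng = np.random.default_rng(0)
X = rng.uniform([0.0, 0.0], [2 * np.pi, 5.0], size=(2000, 2))
y = expensive_Lstar(X)

# Train the surrogate once, offline, on precomputed samples...
net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000,
                   random_state=0).fit(X[:1500], y[:1500])

# ...then evaluate it cheaply instead of re-integrating every time
pred = net.predict(X[1500:])
rel_err = np.abs(pred - y[1500:]) / y[1500:]
print(f"mean relative error: {rel_err.mean():.3%}")
```

The same trade-off as in the abstract applies: a one-off training cost buys evaluation that is orders of magnitude faster, at the price of a small, quantifiable approximation error.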
NASA Astrophysics Data System (ADS)
Irwanto, Rohaeti, Eli; LFX, Endang Widjajanti; Suyanta
2017-05-01
This research aims to develop an integrated assessment instrument and determine its characteristics. The research uses the 4-D model, which includes the define, design, develop, and disseminate stages. The primary product was validated by expert judgment, tested for readability by students, and assessed for feasibility by chemistry teachers. The research involved 246 grade XI students from four senior high schools in Yogyakarta, Indonesia. Data collection techniques included interviews, questionnaires, and tests; the instruments included an interview guideline, an item validation sheet, a users' response questionnaire, an instrument readability questionnaire, and an essay test. The results show that the integrated assessment instrument has an Aiken validity value of 0.95, item reliability of 0.99 and person reliability of 0.69. Teachers' response to the integrated assessment instrument is very good. The instrument is therefore feasible for measuring students' analytical thinking and science process skills.
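The Aiken validity value quoted above refers to Aiken's V coefficient, which is straightforward to compute from expert ratings. The panel below is hypothetical (five judges on a 1-5 scale), not the study's data:

```python
def aiken_v(ratings, lo, hi):
    """Aiken's content-validity coefficient V for one item.

    ratings : ratings given by the n expert judges
    lo, hi  : lowest and highest points of the rating scale
    V = sum(r - lo) / (n * (hi - lo)), ranging from 0 (no validity
    agreement) to 1 (perfect agreement on the top rating).
    """
    n = len(ratings)
    return sum(r - lo for r in ratings) / (n * (hi - lo))

# Hypothetical five-judge panel rating one item on a 1-5 scale
print(aiken_v([5, 5, 4, 5, 5], lo=1, hi=5))   # -> 0.95
```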
Comelli, M; Colonna, N; Martini, L; Licitra, G
2009-12-01
An integrated system to evaluate exposure to the magnetic field generated by power lines has been developed using a specific simulation model (PLEIA-EMF). This is part of a software toolset that has undergone internal suitability verification and in-field validation. A state indicator for each span has been determined using data extracted from digital cartography, the magnetic field calculated by PLEIA and the number of people living in the nearest buildings. In this way, it is possible to identify potential criticalities in the considered area, focusing attention on the cases with the highest exposure levels and the largest number of people involved. A campaign of inspections has been planned using PLEIA simulations. The reliability of the stored technical data and the real population exposure levels have been evaluated in critical cases identified through the described methodology. The procedures leading to the indicator determination and the modalities of the on-site inspections are presented here.
2011-09-01
The transfer of new technologies (e.g., evidence-based practices) into substance abuse treatment organizations often occurs long after they have been developed and shown to be effective. Transfer is slowed, in part, due to a lack of clear understanding about all that is needed to achieve full implementation of these technologies. Such misunderstanding is exacerbated by inconsistent terminology and overlapping models of an innovation, including its development and validation, dissemination to the public, and implementation or use in the field. For this reason, a workgroup of the Addiction Technology Transfer Center (ATTC) Network developed a field-driven conceptual model of the innovation process that more precisely defines relevant terms and concepts and integrates them into a comprehensive taxonomy. The proposed definitions and conceptual framework will allow for improved understanding and consensus regarding the distinct meaning and conceptual relationships between dimensions of the technology transfer process and accelerate the use of evidence-based practices. Copyright © 2011 Elsevier Inc. All rights reserved.
DOE R&D Accomplishments Database
Lamb, W. E. Jr.
1978-11-01
This report describes research on the theory of isotope separation produced by the illumination of polyatomic molecules by intense infrared laser radiation. Newton's equations of motion were integrated for the atoms of the SF6 molecule including the laser field interaction. The first year's work has been largely dedicated to obtaining a suitable interatomic potential valid for arbitrary configurations of the seven particles. This potential gives the correct symmetry of the molecule, the equilibrium configuration, the frequencies of the six distinct normal modes of oscillation and the correct (or assumed) value of the total potential energy of the molecule. Other conditions can easily be imposed in order to obtain a more refined potential energy function, for example, by making allowance for anharmonicity data. A suitable expression was also obtained for the interaction energy between a laser field and the polyatomic molecule. The electromagnetic field is treated classically, and it would be easily possible to treat the cases of time-dependent pulses, frequency modulation and noise.
A self-sensing magnetorheological damper with power generation
NASA Astrophysics Data System (ADS)
Chen, Chao; Liao, Wei-Hsin
2012-02-01
Magnetorheological (MR) dampers are promising for semi-active vibration control of various dynamic systems. In the current MR damper systems, a separate power supply and dynamic sensor are required. To enable the MR damper to be self-powered and self-sensing in the future, in this paper we propose and investigate a self-sensing MR damper with power generation, which integrates energy harvesting, dynamic sensing and MR damping technologies into one device. This MR damper has self-contained power generation and velocity sensing capabilities, and is applicable to various dynamic systems. It combines the advantages of energy harvesting—reusing wasted energy, MR damping—controllable damping force, and sensing—providing dynamic information for controlling system dynamics. This multifunctional integration would bring great benefits such as energy saving, size and weight reduction, lower cost, high reliability, and less maintenance for the MR damper systems. In this paper, a prototype of the self-sensing MR damper with power generation was designed, fabricated, and tested. Theoretical analyses and experimental studies on power generation were performed. A velocity-sensing method was proposed and experimentally validated. The magnetic-field interference among three functions was prevented by a combined magnetic-field isolation method. Modeling, analysis, and experimental results on damping forces are also presented.
Aircraft/island/ship/satellite intercomparison: Preliminary results from July 16, 1987
NASA Technical Reports Server (NTRS)
Hanson, Howard P.; Davidson, Ken; Gerber, Herman; Khalsa, Siri Jodha Singh; Kloesel, Kevin A.; Schwiesow, Ronald; Snider, Jack B.; Wielicki, Bruce M.; Wylie, Donald P.
1990-01-01
The First ISCCP Regional Experiment (FIRE) objective of validating and improving satellite algorithms for inferring cloud properties from satellite radiances was one of the central motivating factors in the design of the specific field experimental strategies used in the July, 1987 marine stratocumulus intensive field observations (IFO). The in situ measuring platforms were deployed to take maximum advantage of redundant measurements (for intercomparison of the in situ sensors) and to provide optimal coverage within satellite images. One of the most ambitious of these strategies was the attempt to coordinate measurements from San Nicolas Island (SNI), the R/V Pt. Sur, the meteorological aircraft, and the satellites. For the most part, this attempt was frustrated by flight restrictions in the vicinity of SNI. The exception was the mission of July 16, 1987, which achieved remarkable success in the coordination of the platforms. This presentation concerns operations conducted by the National Center for Atmospheric Research (NCAR) Electra and how data from the Electra can be integrated with and compared to data from the Pt. Sur, SNI, and the satellites. The focus is on the large-scale, integrated picture of the conditions on July 16 from the perspective of the Electra's flight operations.
NASA Astrophysics Data System (ADS)
Kido, Kentaro; Kasahara, Kento; Yokogawa, Daisuke; Sato, Hirofumi
2015-07-01
In this study, we reported the development of a new quantum mechanics/molecular mechanics (QM/MM)-type framework to describe chemical processes in solution by combining standard molecular-orbital calculations with a three-dimensional formalism of integral equation theory for molecular liquids (multi-center molecular Ornstein-Zernike (MC-MOZ) method). The theoretical procedure is very similar to the 3D-reference interaction site model self-consistent field (RISM-SCF) approach. Since the MC-MOZ method is highly parallelized for computation, the present approach has the potential to be one of the most efficient procedures to treat chemical processes in solution. Benchmark tests to check the validity of this approach were performed for two solute (solute water and formaldehyde) systems and a simple SN2 reaction (Cl- + CH3Cl → ClCH3 + Cl-) in aqueous solution. The results for solute molecular properties and solvation structures obtained by the present approach were in reasonable agreement with those obtained by other hybrid frameworks and experiments. In particular, the results of the proposed approach are in excellent agreements with those of 3D-RISM-SCF.
Validating empirical force fields for molecular-level simulation of cellulose dissolution
USDA-ARS?s Scientific Manuscript database
The calculations presented here, which include dynamics simulations using analytical force fields and first principles studies, indicate that the COMPASS force field is preferred over the Dreiding and Universal force fields for studying dissolution of large cellulose structures. The validity of thes...
The development of the time dependence of the nuclear EMP electric field
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eng, C
The nuclear electromagnetic pulse (EMP) electric field calculated with the legacy code CHAP is compared with the field given by an integral solution of Maxwell's equations, also known as the Jefimenko equation, to aid our current understanding of the factors that affect the time dependence of the EMP. For a fair comparison the CHAP current density is used as a source in the Jefimenko equation. At first, the comparison is simplified by neglecting the conduction current and replacing the standard atmosphere with a constant density air slab. The simplicity of the resultant current density aids in determining the factors that affect the rise, peak and tail of the EMP electric field versus time. The three dimensional nature of the radiating source, i.e. sources off the line-of-sight, and the time dependence of the derivative of the current density with respect to time are found to play significant roles in shaping the EMP electric field time dependence. These results are found to hold even when the conduction current and the standard atmosphere are properly accounted for. Comparison of the CHAP electric field with the Jefimenko electric field offers a direct validation of the high-frequency/outgoing wave approximation.
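For reference, the integral solution of Maxwell's equations referred to above (the Jefimenko electric-field equation) has the standard form, with R = |r − r'| and retarded time t_r = t − R/c:

```latex
\mathbf{E}(\mathbf{r},t) = \frac{1}{4\pi\varepsilon_0}\int
\left[
  \frac{\rho(\mathbf{r}',t_r)}{R^{2}}\,\hat{\mathbf{R}}
  + \frac{\hat{\mathbf{R}}}{cR}\,\frac{\partial\rho(\mathbf{r}',t_r)}{\partial t}
  - \frac{1}{c^{2}R}\,\frac{\partial\mathbf{J}(\mathbf{r}',t_r)}{\partial t}
\right]\mathrm{d}^{3}r'
```

The last term, involving the time derivative of the current density, is the radiative contribution; the abstract's finding that ∂J/∂t shapes the EMP waveform corresponds to this term dominating in the far field.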
Psikuta, Agnes; Koelblen, Barbara; Mert, Emel; Fontana, Piero; Annaheim, Simon
2017-12-07
Following the growing interest in the further development of manikins to simulate human thermal behaviour more adequately, thermo-physiological human simulators have been developed by coupling a thermal sweating manikin with a thermo-physiology model. Despite their availability and obvious advantages, the number of studies involving these devices is only marginal, which plausibly results from the high complexity of the development and evaluation process and need of multi-disciplinary expertise. The aim of this paper is to present an integrated approach to develop, validate and operate such devices including technical challenges and limitations of thermo-physiological human simulators, their application and measurement protocol, strategy for setting test scenarios, and the comparison to standard methods and human studies including details which have not been published so far. A physical manikin controlled by a human thermoregulation model overcame the limitations of mathematical clothing models and provided a complementary method to investigate thermal interactions between the human body, protective clothing, and its environment. The opportunities of these devices include not only realistic assessment of protective clothing assemblies and equipment but also potential application in many research fields ranging from biometeorology, automotive industry, environmental engineering, and urban climate to clinical and safety applications. PMID: 28966294
Validation of the F-18 high alpha research vehicle flight control and avionics systems modifications
NASA Technical Reports Server (NTRS)
Chacon, Vince; Pahle, Joseph W.; Regenie, Victoria A.
1990-01-01
The verification and validation process is a critical portion of the development of a flight system. Verification, the steps taken to assure the system meets the design specification, has become a reasonably understood and straightforward process. Validation is the method used to ensure that the system design meets the needs of the project. As systems become more integrated and more critical in their functions, the validation process becomes more complex and important. The tests, tools, and techniques which are being used for the validation of the high alpha research vehicle (HARV) thrust vectoring control system (TVCS) are discussed, and the problems and their solutions are documented. The emphasis of this paper is on the validation of integrated systems.
Robust numerical electromagnetic eigenfunction expansion algorithms
NASA Astrophysics Data System (ADS)
Sainath, Kamalesh
This thesis summarizes developments in rigorous, full-wave, numerical spectral-domain (integral plane wave eigenfunction expansion [PWE]) evaluation algorithms concerning time-harmonic electromagnetic (EM) fields radiated by generally-oriented and positioned sources within planar and tilted-planar layered media exhibiting general anisotropy, thickness, layer number, and loss characteristics. The work is motivated by the need to accurately and rapidly model EM fields radiated by subsurface geophysical exploration sensors probing layered, conductive media, where complex geophysical and man-made processes can lead to micro-laminate and micro-fractured geophysical formations exhibiting, at the lower (sub-2MHz) frequencies typically employed for deep EM wave penetration through conductive geophysical media, bulk-scale anisotropic (i.e., directional) electrical conductivity characteristics. When the planar-layered approximation (layers of piecewise-constant material variation and transversely-infinite spatial extent) is locally, near the sensor region, considered valid, numerical spectral-domain algorithms are suitable due to their strong low-frequency stability characteristic, and ability to numerically predict time-harmonic EM field propagation in media with response characterized by arbitrarily lossy and (diagonalizable) dense, anisotropic tensors. If certain practical limitations are addressed, PWE can robustly model sensors with general position and orientation that probe generally numerous, anisotropic, lossy, and thick layers. 
The main thesis contributions, leading to a sensor and geophysical environment-robust numerical modeling algorithm, are as follows: (1) Simple, rapid estimator of the region (within the complex plane) containing poles, branch points, and branch cuts (critical points) (Chapter 2), (2) Sensor and material-adaptive azimuthal coordinate rotation, integration contour deformation, integration domain sub-region partition and sub-region-dependent integration order (Chapter 3), (3) Integration partition-extrapolation-based (Chapter 3) and Gauss-Laguerre Quadrature (GLQ)-based (Chapter 4) evaluations of the deformed, semi-infinite-length integration contour tails, (4) Robust in-situ-based (i.e., at the spectral-domain integrand level) direct/homogeneous-medium field contribution subtraction and analytical curbing of the source current spatial spectrum function's ill behavior (Chapter 5), and (5) Analytical re-casting of the direct-field expressions when the source is embedded within a NBAM, short for non-birefringent anisotropic medium (Chapter 6). The benefits of these contributions are, respectively, (1) Avoiding computationally intensive critical-point location and tracking (computation time savings), (2) Sensor and material-robust curbing of the integrand's oscillatory and slow decay behavior, as well as preventing undesirable critical-point migration within the complex plane (computation speed, precision, and instability-avoidance benefits), (3) sensor and material-robust reduction (or, for GLQ, elimination) of integral truncation error, (4) robustly stable modeling of scattered fields and/or fields radiated from current sources modeled as spatially distributed (10 to 1000-fold compute-speed acceleration also realized for distributed-source computations), and (5) numerically stable modeling of fields radiated from sources within NBAM layers. Having addressed these limitations, are PWE algorithms applicable to modeling EM waves in tilted planar-layered geometries too? 
This question is explored in Chapter 7 using a Transformation Optics-based approach, allowing one to model wave propagation through layered media that (in the sensor's vicinity) possess tilted planar interfaces. The technique leads to spurious wave scattering, however, whose induced computation accuracy degradation requires analysis. The mathematical exposition of this novel tilted-layer modeling formulation, together with an exhaustive simulation-based study and analysis of its limitations, is Chapter 7's main contribution.
Validation and Application of the ReaxFF Reactive Force Field to Hydrocarbon Oxidation Kinetics
2016-06-23
AFRL-AFOSR-VA-TR-2016-0278: Validation and Application of the ReaxFF Reactive Force Field to Hydrocarbon Oxidation Kinetics. Adrianus van Duin. Grant number FA9550-14-1-0355. Distribution A: approved for public release.
Mermelstein, Robin J.; Revenson, Tracey A.
2013-01-01
Basic social psychological theories have much to contribute to our understanding of health problems and health-related behaviors and may provide potential avenues for intervention development. However, for these theories to have broader reach and applicability to the field of health psychology, more work needs to be done in integrating contexts into these theories and addressing more specifically their application across settings, behaviors, and populations. We argue that integration of these theories into a broader multi-disciplinary and multi-level ecological framework is needed to enhance their translation into real-world applications. To enhance this translation, we make several recommendations, including breaking down silos between disciplinary perspectives and enhancing bidirectional communication and translation; analyzing boundary conditions of theories; expanding research approaches to move outside the laboratory and maintain a focus on external validity; and conducting efficacy testing of theories with meaningful, relevant endpoints. PMID:23646843
Chihi, Asma; Ben Azza, Hechmi; Jemli, Mohamed; Sellami, Anis
2017-09-01
The aim of this paper is to provide high-performance control of a pumping system. The proposed method is an indirect field-oriented control based on the Sliding Mode (SM) technique. The first contribution of this work is the design of modified switching surfaces, obtained by adding an integral action to the controlled variables. Then, in order to prevent the chattering phenomenon, a modified nonlinear component is developed. The SM concept and a Lyapunov function are combined to compute the Sliding Mode Control (SMC) gains. In addition, the motor performance is validated by numerical simulations and real-time implementation using a dSpace system with a DS1104 controller board. To show the effectiveness of the proposed approach, the obtained results are compared with other techniques such as conventional PI, Proportional Sliding Mode (PSM) and backstepping controls. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
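As a sketch of the idea (the symbols here are generic illustrations, not the paper's notation): for a tracking error e(t), a switching surface augmented with integral action, and a boundary-layer function in place of the discontinuous sign term to curb chattering, one may take

```latex
s(t) = e(t) + \lambda \int_{0}^{t} e(\tau)\,\mathrm{d}\tau,
\qquad
u(t) = u_{\mathrm{eq}}(t) - K\,\operatorname{sat}\!\left(\frac{s(t)}{\phi}\right)
```

with λ > 0 weighting the integral action, K a gain chosen via a Lyapunov argument, and φ the boundary-layer width that smooths the switching.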
Streaming and particle motion in acoustically-actuated leaky systems
NASA Astrophysics Data System (ADS)
Nama, Nitesh; Barnkob, Rune; Jun Huang, Tony; Kahler, Christian; Costanzo, Francesco
2017-11-01
The integration of acoustics with microfluidics has shown great promise for applications within biology, chemistry, and medicine. A commonly employed system to achieve this integration consists of a fluid-filled, polymer-walled microchannel that is acoustically actuated via standing surface acoustic waves. However, despite significant experimental advancements, the precise physical understanding of such systems remains a work in progress. In this work, we investigate the nature of the acoustic fields that are set up inside the microchannel as well as the fundamental driving mechanism governing the fluid and particle motion in these systems. We provide an experimental benchmark using state-of-the-art 3D measurements of fluid and particle motion and present a Lagrangian velocity based temporal multiscale numerical framework to explain the experimental observations. Following verification and validation, we employ our numerical model to reveal the presence of a pseudo-standing acoustic wave that drives the acoustic streaming and particle motion in these systems.
Scene recognition based on integrating active learning with dictionary learning
NASA Astrophysics Data System (ADS)
Wang, Chengxi; Yin, Xueyan; Yang, Lin; Gong, Chengrong; Zheng, Caixia; Yi, Yugen
2018-04-01
Scene recognition is a significant topic in the field of computer vision. Most of the existing scene recognition models require a large amount of labeled training samples to achieve good performance. However, labeling images manually is a time-consuming task and often unrealistic in practice. In order to obtain satisfying recognition results when labeled samples are insufficient, this paper proposes a scene recognition algorithm named Integrating Active Learning and Dictionary Learning (IALDL). IALDL adopts projective dictionary pair learning (DPL) as the classifier and introduces an active learning mechanism into DPL to improve its performance. When constructing the sampling criterion in active learning, IALDL considers both uncertainty and representativeness, so as to effectively select useful unlabeled samples from a given sample set to expand the training dataset. Experimental results on three standard databases demonstrate the feasibility and validity of the proposed IALDL.
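The combined sampling criterion described above can be sketched as follows. This is a generic illustration of uncertainty-plus-representativeness selection, not the paper's IALDL implementation: the scoring function, weights, and similarity measure are assumptions.

```python
import numpy as np

def select_informative(scores, X_unlabeled, k=10, beta=0.5):
    """Pick k unlabeled samples that rank high on both sampling criteria.

    scores: (n, n_classes) classifier outputs for the unlabeled pool.
    Uncertainty: small margin between the two highest class scores.
    Representativeness: mean cosine similarity to the rest of the pool.
    """
    part = np.sort(scores, axis=1)
    margin = part[:, -1] - part[:, -2]  # small margin = uncertain sample
    uncertainty = 1.0 - (margin - margin.min()) / (np.ptp(margin) + 1e-12)

    # Cosine similarity of each sample to every other pool sample
    Xn = X_unlabeled / (np.linalg.norm(X_unlabeled, axis=1, keepdims=True) + 1e-12)
    rep = (Xn @ Xn.T).mean(axis=1)
    rep = (rep - rep.min()) / (np.ptp(rep) + 1e-12)

    combined = uncertainty + beta * rep
    return np.argsort(combined)[::-1][:k]  # indices to label and add to training
```

The selected samples would then be labeled and appended to the DPL training set before retraining the classifier.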
Design and control of compliant tensegrity robots through simulation and hardware validation.
Caluwaerts, Ken; Despraz, Jérémie; Işçen, Atıl; Sabelhaus, Andrew P; Bruce, Jonathan; Schrauwen, Benjamin; SunSpiral, Vytas
2014-09-06
To better understand the role of tensegrity structures in biological systems and their application to robotics, the Dynamic Tensegrity Robotics Lab at NASA Ames Research Center, Moffett Field, CA, USA, has developed and validated two software environments for the analysis, simulation and design of tensegrity robots. These tools, along with new control methodologies and the modular hardware components developed to validate them, are presented as a system for the design of actuated tensegrity structures. As evidenced from their appearance in many biological systems, tensegrity ('tensile-integrity') structures have unique physical properties that make them ideal for interaction with uncertain environments. Yet, these characteristics make design and control of bioinspired tensegrity robots extremely challenging. This work presents the progress our tools have made in tackling the design and control challenges of spherical tensegrity structures. We focus on this shape since it lends itself to rolling locomotion. The results of our analyses include multiple novel control approaches for mobility and terrain interaction of spherical tensegrity structures that have been tested in simulation. A hardware prototype of a spherical six-bar tensegrity, the Reservoir Compliant Tensegrity Robot, is used to empirically validate the accuracy of simulation. © 2014 The Author(s) Published by the Royal Society. All rights reserved.
Copenhagen Psychosocial Questionnaire - A validation study using the Job Demand-Resources model.
Berthelsen, Hanne; Hakanen, Jari J; Westerlund, Hugo
2018-01-01
This study aims at investigating the nomological validity of the Copenhagen Psychosocial Questionnaire (COPSOQ II) by using an extension of the Job Demands-Resources (JD-R) model with aspects of work ability as outcome. The study design is cross-sectional. All staff working at public dental organizations in four regions of Sweden were invited to complete an electronic questionnaire (75% response rate, n = 1345). The questionnaire was based on COPSOQ II scales, the Utrecht Work Engagement scale, and the one-item Work Ability Score in combination with a proprietary item. The data was analysed by Structural Equation Modelling. This study contributed to the literature by showing that: A) The scale characteristics were satisfactory and the construct validity of the COPSOQ instrument could be integrated in the JD-R framework; B) Job resources arising from leadership may be a driver of the two processes included in the JD-R model; and C) Both the health impairment and motivational processes were associated with work ability, and the results suggested that leadership may impact work ability, in particular by securing task resources. In conclusion, the nomological validity of COPSOQ was supported, as the JD-R model can be operationalized by the instrument. This may be helpful for transferral of complex survey results and work life theories to practitioners in the field.
Broadband radiometric LED measurements
NASA Astrophysics Data System (ADS)
Eppeldauer, G. P.; Cooksey, C. C.; Yoon, H. W.; Hanssen, L. M.; Podobedov, V. B.; Vest, R. E.; Arp, U.; Miller, C. C.
2016-09-01
At present, broadband radiometric LED measurements with uniform and low-uncertainty results are not available. Currently, either complicated and expensive spectral radiometric measurements or broadband photometric LED measurements are used. The broadband photometric measurements are based on the CIE standardized V(λ) function, which cannot be used in the UV range and leads to large errors when blue or red LEDs are measured in its wings, where the realization is always poor. Reference irradiance meters with spectrally constant response and high-intensity LED irradiance sources were developed here to implement the previously suggested broadband radiometric LED measurement procedure [1, 2]. Using a detector with spectrally constant response, the broadband radiometric quantities of any LEDs or LED groups can be simply measured with low uncertainty without using any source standard. The spectral flatness of filtered-Si detectors and low-noise pyroelectric radiometers are compared. Examples are given for integrated irradiance measurement of UV and blue LED sources using the reference (standard) pyroelectric irradiance meters introduced here. For validation, the broadband measured integrated irradiance of several LED-365 sources was compared with the spectrally determined integrated irradiance derived from an FEL spectral irradiance lamp standard. Integrated responsivity transfer from the reference irradiance meter to transfer-standard and field UV irradiance meters is discussed.
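The "spectrally determined integrated irradiance" used for validation is simply the spectral irradiance integrated over wavelength. A minimal numeric sketch, where the Gaussian LED-365 band shape and every parameter value are illustrative assumptions rather than measured data:

```python
import numpy as np

# Hypothetical spectral irradiance of an LED-365 source, modeled as a
# Gaussian band centered at 365 nm; all values are illustrative, not measured.
wavelength_nm = np.linspace(330.0, 400.0, 200)
peak, center, width = 0.05, 365.0, 5.0  # W m^-2 nm^-1, nm, nm
E_spectral = peak * np.exp(-0.5 * ((wavelength_nm - center) / width) ** 2)

# Integrated irradiance in W m^-2: the quantity a detector with spectrally
# constant response reports directly, with no source standard required.
# (Trapezoidal rule written out explicitly.)
E_integrated = float(np.sum(0.5 * (E_spectral[1:] + E_spectral[:-1])
                            * np.diff(wavelength_nm)))
```

A spectrally flat radiometer measures this integral in one reading; the spectral route requires the full E(λ) curve and a lamp standard to realize it.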
Fan, Tingbo; Liu, Zhenbo; Chen, Tao; Li, Faqi; Zhang, Dong
2011-09-01
In this work, the authors propose a modeling approach to compute the nonlinear acoustic field generated by a flat piston transmitter with an attached aluminum lens. In this approach, the geometrical parameters (radius and focal length) of a virtual source are initially determined by Snell's refraction law and then adjusted based on the Rayleigh integral result in the linear case. This virtual source is then used with the nonlinear spheroidal beam equation (SBE) model to predict the nonlinear acoustic field in the focal region. To examine the validity of this approach, the calculated nonlinear result is compared with those from the Westervelt and Khokhlov-Zabolotskaya-Kuznetsov (KZK) equations for a focal intensity of 7 kW/cm². Results indicate that this approach accurately describes the nonlinear acoustic field in the focal region; compared with the Westervelt equation, its computation time is significantly reduced. It might also be applicable to the widely used concave focused transmitter with a large aperture angle.
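The Rayleigh integral used to tune the virtual source in the linear step has the standard form for a baffled planar source S with normal velocity distribution u(r'):

```latex
p(\mathbf{r}) = \frac{j\rho_{0}ck}{2\pi}\int_{S}
u(\mathbf{r}')\,\frac{e^{-jkR}}{R}\,\mathrm{d}S,
\qquad R = |\mathbf{r}-\mathbf{r}'|
```

Here ρ₀ is the ambient density, c the sound speed, and k the wavenumber. The virtual source's radius and focal length are adjusted until this linear prediction matches the lensed-piston field; the nonlinear SBE model then propagates from that equivalent source.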
The Hyper-X Flight Systems Validation Program
NASA Technical Reports Server (NTRS)
Redifer, Matthew; Lin, Yohan; Bessent, Courtney Amos; Barklow, Carole
2007-01-01
For the Hyper-X/X-43A program, the development of a comprehensive validation test plan played an integral part in the success of the mission. The goal was to demonstrate hypersonic propulsion technologies by flight testing an airframe-integrated scramjet engine. Preparation for flight involved both verification and validation testing. By definition, verification is the process of assuring that the product meets design requirements; whereas validation is the process of assuring that the design meets mission requirements for the intended environment. This report presents an overview of the program with emphasis on the validation efforts. It includes topics such as hardware-in-the-loop, failure modes and effects, aircraft-in-the-loop, plugs-out, power characterization, antenna pattern, integration, combined systems, captive carry, and flight testing. Where applicable, test results are also discussed. The report provides a brief description of the flight systems onboard the X-43A research vehicle and an introduction to the ground support equipment required to execute the validation plan. The intent is to provide validation concepts that are applicable to current, follow-on, and next generation vehicles that share the hybrid spacecraft and aircraft characteristics of the Hyper-X vehicle.
Portal dosimetry for VMAT using integrated images obtained during treatment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bedford, James L., E-mail: James.Bedford@icr.ac.uk; Hanson, Ian M.; Hansen, Vibeke Nordmark
2014-02-15
Purpose: Portal dosimetry provides an accurate and convenient means of verifying dose delivered to the patient. A simple method for carrying out portal dosimetry for volumetric modulated arc therapy (VMAT) is described, together with phantom measurements demonstrating the validity of the approach. Methods: Portal images were predicted by projecting dose in the isocentric plane through to the portal image plane, with exponential attenuation and convolution with a double-Gaussian scatter function. Appropriate parameters for the projection were selected by fitting the calculation model to portal images measured on an iViewGT portal imager (Elekta AB, Stockholm, Sweden) for a variety of phantom thicknesses and field sizes. This model was then used to predict the portal image resulting from each control point of a VMAT arc. Finally, all these control point images were summed to predict the overall integrated portal image for the whole arc. The calculated and measured integrated portal images were compared for three lung and three esophagus plans delivered to a thorax phantom, and three prostate plans delivered to a homogeneous phantom, using a gamma index for 3% and 3 mm. A 0.6 cm³ ionization chamber was used to verify the planned isocentric dose. The sensitivity of this method to errors in monitor units, field shaping, gantry angle, and phantom position was also evaluated by means of computer simulations. Results: The calculation model for portal dose prediction was able to accurately compute the portal images due to simple square fields delivered to solid water phantoms. The integrated images of VMAT treatments delivered to phantoms were also correctly predicted by the method. The proportion of the images with a gamma index of less than unity was 93.7% ± 3.0% (1SD) and the difference between isocenter dose calculated by the planning system and measured by the ionization chamber was 0.8% ± 1.0%.
The method was highly sensitive to errors in monitor units and field shape, but less sensitive to errors in gantry angle or phantom position. Conclusions: This method of predicting integrated portal images provides a convenient means of verifying dose delivered using VMAT, with minimal image acquisition and data processing requirements.
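The forward model described (exponential attenuation followed by convolution with a double-Gaussian scatter kernel) can be sketched in a few lines. All parameter values below are illustrative placeholders, not the fitted values from the paper:

```python
import numpy as np

def double_gaussian_blur(image, s1, s2, w):
    """Convolve image with the kernel w*G(s1) + (1-w)*G(s2), where G(s) is a
    unit-area Gaussian of standard deviation s pixels (FFT, periodic edges)."""
    ny, nx = image.shape
    fy = np.fft.fftfreq(ny)[:, None]
    fx = np.fft.fftfreq(nx)[None, :]
    def G(sig):  # Fourier transform of a unit-area Gaussian
        return np.exp(-2.0 * (np.pi * sig) ** 2 * (fx ** 2 + fy ** 2))
    H = w * G(s1) + (1.0 - w) * G(s2)
    return np.real(np.fft.ifft2(np.fft.fft2(image) * H))

# Illustrative parameters only -- not the fitted values from the paper.
mu = 0.05         # effective attenuation coefficient, cm^-1
thickness = 20.0  # phantom thickness, cm
dose_plane = np.zeros((64, 64))
dose_plane[24:40, 24:40] = 1.0                  # a simple square field
primary = dose_plane * np.exp(-mu * thickness)  # exponential attenuation
portal = double_gaussian_blur(primary, s1=1.0, s2=6.0, w=0.7)
```

For a VMAT arc, one such predicted image per control point would be summed to give the integrated portal image compared against the measurement.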
Rui, Bruno R; Angrimani, Daniel S R; Losano, João Diego A; Bicudo, Luana de Cássia; Nichi, Marcílio; Pereira, Ricardo J G
2017-12-01
Several methods have been developed to evaluate spermatozoa function in birds but many of these are complicated, costly and not applicable to field studies (i.e., performed within poultry breeding facilities). The objective was, therefore, to validate efficient, practical and inexpensive procedures to determine DNA fragmentation, acrosomal integrity, and mitochondrial activity in poultry spermatozoa. Initially, ejaculates were individually diluted and divided into control (4°C, 4 h) and UV-irradiated aliquots (room temperature, 4 h), and then samples containing different percentages of DNA-damaged spermatozoa (0%, 25%, 50%, 75% and 100%) were subjected to Toluidine Blue (TB) and Sperm Chromatin Dispersion (SCD) assessments. Fast Green-Rose Bengal (FG-RB) and FITC-PSA staining protocols were subsequently used to assess acrosome status in aliquots comprising assorted amounts of acrosome-reacted spermatozoa. Furthermore, to validate the 3,3'-diaminobenzidine (DAB) assay, ejaculates containing different gradients of spermatozoa with high mitochondrial activity were concurrently evaluated using DAB and JC-1 stains. The proportion of spermatozoa with abnormal DNA integrity evaluated using the TB assessment correlated significantly with the expected percentages of UV-irradiated spermatozoa and with SCD results. A significant linear regression coefficient was also observed between expected amounts of acrosome-intact spermatozoa and FG-RB readings, and there was a significant correlation of the data when FG-RB and FITC-PSA were used. Likewise, the DAB assay enabled accurate determination of the percentages of rooster spermatozoa with greater and lesser mitochondrial function, and the results were highly correlated with those obtained by JC-1 staining. Altogether, the findings of the present study indicate that acrosomal status, DNA integrity and mitochondrial activity in rooster spermatozoa can be easily and reliably determined using FG-RB, TB and DAB stains.
Copyright © 2017 Elsevier B.V. All rights reserved.
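The validation logic above, regressing assay readings against the known mixture percentages, can be sketched with hypothetical numbers (the observed values below are made up for illustration; they are not the study's data):

```python
import numpy as np

# Known mixture fractions of UV-irradiated (DNA-damaged) spermatozoa
expected = np.array([0.0, 25.0, 50.0, 75.0, 100.0])
# Hypothetical Toluidine Blue readings (% abnormal DNA) -- illustrative only
observed = np.array([3.0, 27.5, 48.0, 73.5, 96.0])

# Pearson correlation and the linear regression (slope, intercept)
r = np.corrcoef(expected, observed)[0, 1]
slope, intercept = np.polyfit(expected, observed, 1)
```

A slope near one, small intercept, and correlation near one are what support the claim that the inexpensive stain tracks the true damaged fraction.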
NASA Astrophysics Data System (ADS)
Kuncarayakti, H.; Galbany, L.; Anderson, J. P.; Krühler, T.; Hamuy, M.
2016-09-01
Context. Stellar populations are the building blocks of galaxies, including the Milky Way. Most, if not all, extragalactic studies rely on stellar population models, given the unresolved nature of the observations. Extragalactic systems contain multiple stellar populations with complex star formation histories. However, studies of these systems are mainly based upon the principles of simple stellar populations (SSP). Hence, it is critical to examine the validity of SSP models. Aims: This work aims to empirically test the validity of SSP models. This is done by comparing SSP models against observations of a spatially resolved young stellar population in the determination of its physical properties, that is, age and metallicity. Methods: Integral field spectroscopy of a young stellar cluster in the Milky Way, NGC 3603, was used to study the properties of the cluster as both a resolved and unresolved stellar population. The unresolved stellar population was analysed using the Hα equivalent width as an age indicator and the ratio of strong emission lines to infer metallicity. In addition, spectral energy distribution (SED) fitting using STARLIGHT was used to infer these properties from the integrated spectrum. Independently, the resolved stellar population was analysed using the colour-magnitude diagram (CMD) to determine age and metallicity. As the SSP model represents the unresolved stellar population, the derived age and metallicity were tested to determine whether they agree with those derived from resolved stars. Results: The age and metallicity estimates of NGC 3603 derived from integrated spectroscopy are confirmed to be within the range of those derived from the CMD of the resolved stellar population, including other estimates found in the literature. The result from this pilot study supports the reliability of SSP models for studying unresolved young stellar populations.
Based on observations collected at the European Organisation for Astronomical Research in the Southern Hemisphere under ESO programme 60.A-9344.
The effects of solarization on the performance of a gas turbine
NASA Astrophysics Data System (ADS)
Homann, Christiaan; van der Spuy, Johan; von Backström, Theodor
2016-05-01
Various hybrid solar gas turbine configurations exist. The Stellenbosch University Solar Power Thermodynamic (SUNSPOT) cycle consists of a heliostat field, solar receiver, primary Brayton gas turbine cycle, thermal storage and secondary Rankine steam cycle. This study investigates the effect of the solarization of a gas turbine on its performance and details the integration of a gas turbine into a solar power plant. A Rover 1S60 gas turbine was modelled in Flownex, a thermal-fluid system simulation and design code, and validated against a one-dimensional thermodynamic model at design input conditions. The performance map of a newly designed centrifugal compressor was created and implemented in Flownex. The effect of the improved compressor on the performance of the gas turbine was evident. The gas turbine cycle was expanded to incorporate different components of a CSP plant, such as a solar receiver and heliostat field. The solarized gas turbine model simulates the gas turbine performance when subjected to a typical variation in solar resource. Site conditions at the Helio100 solar field were investigated and the possibility of integrating a gas turbine within this system was evaluated. Heat addition due to solar irradiation resulted in a decreased fuel consumption rate. The influence of the additional pressure drop over the solar receiver was evident, as it led to a decreased net power output. The new compressor increased the overall performance of the gas turbine and compensated for pressure losses incurred by the addition of solar components. The simulated integration of the solarized gas turbine at Helio100 showed potential, although the solar irradiation is too low to run the gas turbine on solar heat alone. The simulation evaluates the feasibility of solarizing a gas turbine and predicts plant performance for such a turbine cycle.
Bhattacharya, S.; Doveton, J.H.; Carr, T.R.; Guy, W.R.; Gerlach, P.M.
2005-01-01
Small independent operators produce most of the Mississippian carbonate fields in the United States mid-continent, where a lack of integrated characterization studies precludes maximization of hydrocarbon recovery. This study uses integrative techniques to leverage extant data in an Osagian and Meramecian (Mississippian) cherty carbonate reservoir in Kansas. Available data include petrophysical logs of varying vintages, a limited number of cores, and production histories from each well. A consistent set of assumptions was used to extract well-level porosity and initial saturations, from logs of different types and vintages, to build a geomodel. Lacking regularly recorded well shut-in pressures, an iterative technique, based on material balance formulations, was used to estimate average reservoir-pressure decline that matched available drillstem test data and validated log-analysis assumptions. Core plugs representing the principal reservoir petrofacies provide critical inputs for characterization and simulation studies. However, assigning plugs among multiple reservoir petrofacies is difficult in complex (carbonate) reservoirs. In a bottom-up approach, raw capillary pressure (Pc) data were plotted on the Super-Pickett plot, and log- and core-derived saturation-height distributions were reconciled to group plugs by facies, to identify core plugs representative of the principal reservoir facies, and to discriminate facies in the logged interval. Pc data from representative core plugs were used for effective pay evaluation to estimate water cut from completions, in infill and producing wells, and to guide selective perforations for economic exploitation of mature fields. The results from this study were used to drill 22 infill wells. Techniques demonstrated here can be applied in other fields and reservoirs. Copyright © 2005. The American Association of Petroleum Geologists. All rights reserved.
PISCES: An Integral Field Spectrograph Technology Demonstration for the WFIRST Coronagraph
NASA Technical Reports Server (NTRS)
McElwain, Michael W.; Mandell, Avi M.; Gong, Qian; Llop-Sayson, Jorge; Brandt, Timothy; Chambers, Victor J.; Grammer, Bryan; Greeley, Bradford; Hilton, George; Perrin, Marshall D.;
2016-01-01
We present the design, integration, and test of the Prototype Imaging Spectrograph for Coronagraphic Exoplanet Studies (PISCES) integral field spectrograph (IFS). The PISCES design meets the science requirements for the Wide-Field InfraRed Survey Telescope (WFIRST) Coronagraph Instrument (CGI). PISCES was integrated and tested in the integral field spectroscopy laboratory at NASA Goddard. In June 2016, PISCES was delivered to the Jet Propulsion Laboratory (JPL) where it was integrated with the Shaped Pupil Coronagraph (SPC) High Contrast Imaging Testbed (HCIT). The SPC/PISCES configuration will demonstrate high contrast integral field spectroscopy as part of the WFIRST CGI technology development program.
PISCES: an integral field spectrograph technology demonstration for the WFIRST coronagraph
NASA Astrophysics Data System (ADS)
McElwain, Michael W.; Mandell, Avi M.; Gong, Qian; Llop-Sayson, Jorge; Brandt, Timothy; Chambers, Victor J.; Grammer, Bryan; Greeley, Bradford; Hilton, George; Perrin, Marshall D.; Stapelfeldt, Karl R.; Demers, Richard; Tang, Hong; Cady, Eric
2016-07-01
We present the design, integration, and test of the Prototype Imaging Spectrograph for Coronagraphic Exoplanet Studies (PISCES) integral field spectrograph (IFS). The PISCES design meets the science requirements for the Wide-Field InfraRed Survey Telescope (WFIRST) Coronagraph Instrument (CGI). PISCES was integrated and tested in the integral field spectroscopy laboratory at NASA Goddard. In June 2016, PISCES was delivered to the Jet Propulsion Laboratory (JPL) where it was integrated with the Shaped Pupil Coronagraph (SPC) High Contrast Imaging Testbed (HCIT). The SPC/PISCES configuration will demonstrate high contrast integral field spectroscopy as part of the WFIRST CGI technology development program.
Tang, Zhentao; Hou, Wenqian; Liu, Xiuming; Wang, Mingfeng; Duan, Yixiang
2016-08-26
Integral analysis plays an important role in the study and quality control of substances with complex matrices in our daily life. As a preliminary step toward the integral analysis of substances with complex matrices, developing a relatively comprehensive and sensitive methodology can offer more informative and reliable characteristic components. Flavoring mixtures, representative substances with complex matrices, are now widely used in various fields. To better study and control the quality of flavoring mixtures as additives in the food industry, an in-house fabricated solid-phase microextraction (SPME) fiber was prepared based on sol-gel technology in this work. The active organic component of the fiber coating was multi-walled carbon nanotubes (MWCNTs) functionalized with hydroxyl-terminated polydimethyldiphenylsiloxane, which integrates the non-polar and polar chains of both materials. In this way, more sensitive extraction capability for a wider range of compounds can be obtained in comparison with commercial SPME fibers. Preliminary integral analysis of three similar types of samples was realized by the optimized SPME-GC-MS method. With the obtained GC-MS data, a valid and well-fit model was established by partial least squares discriminant analysis (PLS-DA) for classification of these samples (R2X=0.661, R2Y=0.996, Q2=0.986). The validation of the model (R2=0.266, Q2=-0.465) also confirmed the potential to predict the "belongingness" of new samples. With the PLS-DA and SPSS methods, further screening of markers among the three similar batches of samples may be helpful for monitoring and controlling the quality of flavoring mixtures used as additives in the food industry. Conversely, the reliability and effectiveness of the GC-MS data have verified the comprehensive and efficient extraction performance of the in-house fabricated fiber. Copyright © 2016 Elsevier B.V. All rights reserved.
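The PLS-DA classification step described above can be sketched with a minimal one-component PLS model. This is a generic illustration on synthetic two-class data, not the paper's actual GC-MS peak tables or its multi-component model; all names and values here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 2-class data standing in for GC-MS feature vectors:
# 40 samples, 10 variables, with class separation on one variable.
n, p = 40, 10
X = rng.normal(size=(n, p))
y = np.repeat([0.0, 1.0], n // 2)
X[y == 1, 0] += 4.0

# First PLS component (NIPALS-style): weight vector from the X-y covariance
Xc = X - X.mean(axis=0)                  # center predictors
yc = y - y.mean()                        # center class labels
w = Xc.T @ yc
w /= np.linalg.norm(w)
t = Xc @ w                               # scores on the first latent variable

# Discriminant step of PLS-DA: threshold scores at the class-mean midpoint
threshold = 0.5 * (t[y == 0].mean() + t[y == 1].mean())
pred = (t > threshold).astype(float)
accuracy = (pred == y).mean()
print(f"training accuracy = {accuracy:.2f}")
```

A real analysis would use several latent variables, cross-validated R2/Q2 statistics, and permutation testing, as the abstract's reported figures imply.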
This Integrated Summary Report (ISR) summarizes, in a single document, the results from an international multi-laboratory validation study conducted for two in vitro estrogen receptor (ER) binding assays. These assays both use human recombinant estrogen receptor, alpha subtype (h...
ERIC Educational Resources Information Center
Davidson, William B.; Beck, Hall P.; Milligan, Meg
2009-01-01
The investigators reviewed the retention literature and developed a 53-item questionnaire and tested its validity. Component analysis of the responses of 2,022 students at four schools yielded six reliable factors: Institutional Commitment, Degree Commitment, Academic Integration, Social Integration, Support Services Satisfaction, and Academic…
NASA Technical Reports Server (NTRS)
Magee, Todd E.; Fugal, Spencer R.; Fink, Lawrence E.; Adamson, Eric E.; Shaw, Stephen G.
2015-01-01
This report describes the work conducted under NASA funding for the Boeing N+2 Supersonic Experimental Validation project to experimentally validate the conceptual design of a supersonic airliner feasible for entry into service in the 2018-to-2020 timeframe (NASA N+2 generation). The primary goal of the project was to develop a low-boom configuration optimized for minimum sonic boom signature (65 to 70 PLdB). This was a very aggressive goal that could be achieved only through integrated multidisciplinary optimization tools validated in relevant ground and, later, flight environments. The project was split into two phases. Phase I of the project covered the detailed aerodynamic design of a low-boom airliner as well as the wind tunnel tests to validate that design (ref. 1). This report covers Phase II of the project, which continued the design methodology development of Phase I with a focus on the propulsion integration aspects as well as the testing involved to validate those designs. One of the major airplane configuration features of the Boeing N+2 low-boom design was the overwing nacelle. The location of the nacelle allowed for a minimal effect on the boom signature; however, it added a level of difficulty to designing an inlet with acceptable performance in the overwing flow field. Using the Phase I work as the starting point, the goals of the Phase II project were to design and verify inlet performance while maintaining a low-boom signature. The Phase II project was successful in meeting all contract objectives. New modular nacelles were built for the larger Performance Model along with a propulsion rig with an electrically actuated mass flow plug. Two new mounting struts were built for the smaller Boom Model, along with new nacelles. Propulsion integration testing was performed using an instrumented fan face and a mass flow plug, while boom signatures were measured using a wall-mounted pressure rail.
A side study of testing in different wind tunnels was completed as a precursor to the selection of the facilities used for validation testing. As facility schedules allowed, the propulsion testing was done at the NASA Glenn Research Center (GRC) 8- by 6-Foot wind tunnel, while boom and force testing was done at the NASA Ames Research Center (ARC) 9- by 7-Foot wind tunnel. During boom testing, a live balance was used for gathering force data. This report is broken down into nine sections. The first technical section (Section 2) covers the general scope of the Phase II activities, goals, a description of the design and testing efforts, and the project plan and schedule. Section 3 covers the details of the propulsion system concepts and design evolution. A series of short tests to evaluate the suitability of different wind tunnels for boom, propulsion, and force testing was also performed under the Phase II effort, with the results covered in Section 4. The propulsion integration testing is covered in Section 5 and the boom and force testing in Section 6. CFD comparisons and analyses are included in Section 7. Section 8 includes the conclusions and lessons learned.
A comparison of four embedded validity indices for the RBANS in a memory disorders clinic.
Paulson, Daniel; Horner, Michael David; Bachman, David
2015-05-01
This examination of four embedded validity indices for the Repeatable Battery for the Assessment of Neuropsychological Status (RBANS) explores the potential utility of integrating cognitive and self-reported depressive measures. Examined indices include the proposed RBANS Performance Validity Index (RBANS PVI) and the Charleston Revised Index of Effort for the RBANS (CRIER). The CRIER represented the novel integration of cognitive test performance and depression self-report information. The sample included 234 patients without dementia who could be identified as having demonstrated either valid or invalid responding, based on standardized criteria. Sensitivity and specificity for invalid responding varied widely, with the CRIER emerging as the best all-around index (sensitivity = 0.84, specificity = 0.90, AUC = 0.94). Findings support the use of embedded response validity indices, and suggest that the integration of cognitive and self-report depression data may optimize detection of invalid responding among older Veterans. © The Author 2015. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, Yidong, E-mail: yidongyang@med.miami.edu; Wang, Ken Kang-Hsin; Wong, John W.
2015-04-15
Purpose: The cone beam computed tomography (CBCT) guided small animal radiation research platform (SARRP) has been developed for focal tumor irradiation, allowing laboratory researchers to test basic biological hypotheses that can modify radiotherapy outcomes in ways that were not feasible previously. CBCT provides excellent bone to soft tissue contrast, but is incapable of differentiating tumors from surrounding soft tissue. Bioluminescence tomography (BLT), in contrast, allows direct visualization of even subpalpable tumors and quantitative evaluation of tumor response. Integration of BLT with CBCT offers complementary image information, with CBCT delineating anatomic structures and BLT differentiating luminescent tumors. This study is to develop a systematic method to calibrate an integrated CBCT and BLT imaging system which can be adopted onboard the SARRP to guide focal tumor irradiation. Methods: The integrated imaging system consists of CBCT, diffuse optical tomography (DOT), and BLT. The anatomy acquired from CBCT and optical properties acquired from DOT serve as a priori information for the subsequent BLT reconstruction. Phantoms were designed and procedures were developed to calibrate the CBCT, DOT/BLT, and the entire integrated system. Geometrical calibration was performed to calibrate the CBCT system. Flat field correction was performed to correct the nonuniform response of the optical imaging system. Absolute emittance calibration was performed to convert the camera readout to the emittance at the phantom or animal surface, which enabled the direct reconstruction of the bioluminescence source strength. Phantom and mouse imaging were performed to validate the calibration. Results: All calibration procedures were successfully performed. Both CBCT of a thin wire and a euthanized mouse revealed no spatial artifact, validating the accuracy of the CBCT calibration.
The absolute emittance calibration was validated with a 650 nm laser source, resulting in a 3.0% difference between simulated and measured signal. The calibration of the entire system was confirmed through the CBCT and BLT reconstruction of a bioluminescence source placed inside a tissue-simulating optical phantom. Using a spatial region constraint, the source position was reconstructed with less than 1 mm error and the source strength reconstructed with less than 24% error. Conclusions: A practical and systematic method has been developed to calibrate an integrated x-ray and optical tomography imaging system, including the respective CBCT and optical tomography system calibration and the geometrical calibration of the entire system. The method can be modified and adopted to calibrate CBCT and optical tomography systems that are operated independently or hybrid x-ray and optical tomography imaging systems.
Yang, Yidong; Wang, Ken Kang-Hsin; Eslami, Sohrab; Iordachita, Iulian I.; Patterson, Michael S.; Wong, John W.
2015-01-01
Purpose: The cone beam computed tomography (CBCT) guided small animal radiation research platform (SARRP) has been developed for focal tumor irradiation, allowing laboratory researchers to test basic biological hypotheses that can modify radiotherapy outcomes in ways that were not feasible previously. CBCT provides excellent bone to soft tissue contrast, but is incapable of differentiating tumors from surrounding soft tissue. Bioluminescence tomography (BLT), in contrast, allows direct visualization of even subpalpable tumors and quantitative evaluation of tumor response. Integration of BLT with CBCT offers complementary image information, with CBCT delineating anatomic structures and BLT differentiating luminescent tumors. This study is to develop a systematic method to calibrate an integrated CBCT and BLT imaging system which can be adopted onboard the SARRP to guide focal tumor irradiation. Methods: The integrated imaging system consists of CBCT, diffuse optical tomography (DOT), and BLT. The anatomy acquired from CBCT and optical properties acquired from DOT serve as a priori information for the subsequent BLT reconstruction. Phantoms were designed and procedures were developed to calibrate the CBCT, DOT/BLT, and the entire integrated system. Geometrical calibration was performed to calibrate the CBCT system. Flat field correction was performed to correct the nonuniform response of the optical imaging system. Absolute emittance calibration was performed to convert the camera readout to the emittance at the phantom or animal surface, which enabled the direct reconstruction of the bioluminescence source strength. Phantom and mouse imaging were performed to validate the calibration. Results: All calibration procedures were successfully performed. Both CBCT of a thin wire and a euthanized mouse revealed no spatial artifact, validating the accuracy of the CBCT calibration. 
The absolute emittance calibration was validated with a 650 nm laser source, resulting in a 3.0% difference between simulated and measured signal. The calibration of the entire system was confirmed through the CBCT and BLT reconstruction of a bioluminescence source placed inside a tissue-simulating optical phantom. Using a spatial region constraint, the source position was reconstructed with less than 1 mm error and the source strength reconstructed with less than 24% error. Conclusions: A practical and systematic method has been developed to calibrate an integrated x-ray and optical tomography imaging system, including the respective CBCT and optical tomography system calibration and the geometrical calibration of the entire system. The method can be modified and adopted to calibrate CBCT and optical tomography systems that are operated independently or hybrid x-ray and optical tomography imaging systems. PMID:25832060
The Habitat Demonstration Unit Project Overview
NASA Technical Reports Server (NTRS)
Kennedy, Kriss J.; Grill, Tracy R.; Tri, Terry O.; Howe, Alan S.
2010-01-01
This paper will describe an overview of the National Aeronautics and Space Administration (NASA) led multi-center Habitat Demonstration Unit (HDU) Project. The HDU project is a "technology-pull" project that integrates technologies and innovations from numerous NASA centers. This project will be used to investigate and validate surface architectures, operations concepts, and requirements definition of various habitation concepts. The first habitation configuration this project will build and test is the Pressurized Excursion Module (PEM). This habitat configuration - the PEM - is based on the Constellation Architecture Scenario 12.1 concept of a vertically oriented habitat module. The HDU project will be tested as part of the 2010 Desert Research and Technology Studies (D-RATS) test objectives. The purpose of this project is to develop, integrate, test, and evaluate a habitat configuration in the context of the mission architectures and surface operation concepts. A multi-center approach will be leveraged to build, integrate, and test the PEM through a shared collaborative effort of multiple NASA centers. The HDU project is part of the strategic plan from the Exploration Systems Mission Directorate (ESMD) Directorate Integration Office (DIO) and the Lunar Surface Systems Project Office (LSSPO) to test surface elements in a surface analog environment. The 2010 analog field test will include two Lunar Electric Rovers (LER) and the PEM among other surface demonstration elements. This paper will describe the project's overall objectives, its various habitat configurations, strategic plan, and technology integration as they pertain to the 2010 and 2011 field analog tests. To accomplish the development of the PEM from conception in June 2009 to rollout for operations in July 2010, the HDU project team is using a set of design standards to define the interfaces between the various systems of PEM and to the payloads, such as the Geology Lab, that those systems will support.
Scheduled activities such as early fit-checks and the utilization of a habitat avionics test bed prior to equipment installation into PEM are planned to facilitate the integration process.
Integrated Modeling of Time Evolving 3D Kinetic MHD Equilibria and NTV Torque
NASA Astrophysics Data System (ADS)
Logan, N. C.; Park, J.-K.; Grierson, B. A.; Haskey, S. R.; Nazikian, R.; Cui, L.; Smith, S. P.; Meneghini, O.
2016-10-01
New analysis tools and integrated modeling of plasma dynamics developed in the OMFIT framework are used to study kinetic MHD equilibria evolution on the transport time scale. The experimentally observed profile dynamics following the application of 3D error fields are described using a new OMFITprofiles workflow that directly addresses the need for rapid and comprehensive analysis of dynamic equilibria for next-step theory validation. The workflow treats all diagnostic data as fundamentally time dependent, provides physics-based manipulations such as ELM phase data selection, and is consistent across multiple machines - including DIII-D and NSTX-U. The seamless integration of tokamak data and simulation is demonstrated by using the self-consistent kinetic EFIT equilibria and profiles as input into 2D particle, momentum and energy transport calculations using TRANSP as well as 3D kinetic MHD equilibrium stability and neoclassical transport modeling using General Perturbed Equilibrium Code (GPEC). The result is a smooth kinetic stability and NTV torque evolution over transport time scales. Work supported by DE-AC02-09CH11466.
Sweeping the Floor or Putting a Man on the Moon: How to Define and Measure Meaningful Work.
Both-Nwabuwe, Jitske M C; Dijkstra, Maria T M; Beersma, Bianca
2017-01-01
Meaningful work is integral to well-being and a flourishing life. The construct of "meaningful work" is, however, consistently affected by conceptual ambiguity. Although there is substantial support for arguments to maintain the status of conceptual ambiguity, we make a case for the benefits of having consensus on a definition and scale of meaningful work in the context of paid work. The objective of this article, therefore, was twofold. Firstly, we wanted to develop a more integrative definition of meaningful work. Secondly, we wanted to establish a corresponding operationalization. We reviewed the literature on the existing definitions of meaningful work and the scales designed to measure it. We found 14 definitions of meaningful work. Based on these definitions, we identified four categories of definitions, which led us to propose an integrative and comprehensive definition of meaningful work. We identified two validated scales that were partly aligned with the proposed definition. Based on our review, we conclude that scholars in this field should coalesce rather than diverge their efforts to conceptualize and measure meaningful work.
Active C4 Electrodes for Local Field Potential Recording Applications
Wang, Lu; Freedman, David; Sahin, Mesut; Ünlü, M. Selim; Knepper, Ronald
2016-01-01
Extracellular neural recording, with multi-electrode arrays (MEAs), is a powerful method used to study neural function at the network level. However, in a high density array, it can be costly and time consuming to integrate the active circuit with the expensive electrodes. In this paper, we present a 4 mm × 4 mm neural recording integrated circuit (IC) chip, utilizing IBM C4 bumps as recording electrodes, which enables seamless integration of the active chip and the electrodes. The IC chip was designed and fabricated in a 0.13 μm BiCMOS process for both in vitro and in vivo applications. It has an input-referred noise of 4.6 μVrms for the bandwidth of 10 Hz to 10 kHz and a power dissipation of 11.25 mW at 2.5 V, or 43.9 μW per input channel. This prototype is scalable for implementing larger and higher-density electrode arrays. To validate the functionality of the chip, electrical testing results and acute in vivo recordings from a rat barrel cortex are presented. PMID:26861324
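The two power figures quoted above are mutually consistent: dividing the total dissipation by the per-channel figure implies the channel count. This is a quick arithmetic check, not a specification from the paper.

```python
# Consistency check of the quoted power figures:
# 11.25 mW total at 2.5 V, and 43.9 uW per input channel.
total_power_w = 11.25e-3
per_channel_w = 43.9e-6

channels = total_power_w / per_channel_w
print(f"implied channel count ~ {channels:.0f}")  # ~256 channels
```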
Recent advances in nanoplasmonic biosensors: applications and lab-on-a-chip integration
NASA Astrophysics Data System (ADS)
Lopez, Gerardo A.; Estevez, M.-Carmen; Soler, Maria; Lechuga, Laura M.
2017-01-01
Motivated by the recent progress in the nanofabrication field and the increasing demand for cost-effective, portable, and easy-to-use point-of-care platforms, localized surface plasmon resonance (LSPR) biosensors have attracted great scientific interest in the last few years. The progress observed in the research of this nanoplasmonic technology is remarkable not only from a nanostructure fabrication point of view but also in the complete development and integration of operative devices and their application. The potential benefits that LSPR biosensors can offer, such as sensor miniaturization, multiplexing opportunities, and enhanced performance, have quickly positioned them as an interesting candidate in the design of lab-on-a-chip (LOC) optical biosensor platforms. This review covers specifically the most significant achievements of recent years towards the integration of this technology in compact devices, with a view to obtaining LOC devices. We also discuss the most relevant examples of the use of nanoplasmonic biosensors for real bioanalytical and clinical applications, from assay development and validation to the identification of the implications, requirements, and challenges that must be overcome to achieve fully operative devices.
Integrated optical 3D digital imaging based on DSP scheme
NASA Astrophysics Data System (ADS)
Wang, Xiaodong; Peng, Xiang; Gao, Bruce Z.
2008-03-01
We present a scheme of integrated optical 3-D digital imaging (IO3DI) based on a digital signal processor (DSP), which can acquire range images independently without PC support. This scheme is based on a parallel hardware structure with the aid of a DSP and a field programmable gate array (FPGA) to realize 3-D imaging. In this integrated scheme of 3-D imaging, phase measurement profilometry is adopted. To realize pipeline processing of the fringe projection, image acquisition and fringe pattern analysis, we present a multi-threaded application program developed under the DSP/BIOS RTOS (real-time operating system) environment. Since the RTOS provides a preemptive kernel and a powerful configuration tool, we are able to achieve real-time scheduling and synchronization. To accelerate automatic fringe analysis and phase unwrapping, we make use of software optimization techniques. The proposed scheme can reach a performance of 39.5 f/s (frames per second), so it is well suited to real-time fringe-pattern analysis and can implement fast 3-D imaging. Experimental results are also presented to show the validity of the proposed scheme.
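The phase-measurement-profilometry core that the DSP/FPGA pipeline above accelerates can be sketched as a standard four-step phase-shifting calculation: four fringe images shifted by π/2 yield the wrapped phase through an arctangent, which is then unwrapped. The fringes below are synthetic; all names and numbers are illustrative, not the paper's implementation.

```python
import numpy as np

# Four-step phase-shifting sketch (1-D cut through the fringe pattern).
x = np.linspace(0.0, 4.0 * np.pi, 256)
phase_true = 0.5 * np.sin(x)                 # "object" phase modulation
carrier = 2.0 * x                            # fringe carrier

# Four fringe images I1..I4, phase-shifted by pi/2 each
frames = [1.0 + np.cos(carrier + phase_true + k * np.pi / 2.0)
          for k in range(4)]

# Wrapped phase: atan2(I4 - I2, I1 - I3) recovers carrier + object phase
wrapped = np.arctan2(frames[3] - frames[1], frames[0] - frames[2])
unwrapped = np.unwrap(wrapped)

recovered = unwrapped - carrier                      # remove the known carrier
recovered -= recovered.mean() - phase_true.mean()    # remove constant offset
err = np.max(np.abs(recovered - phase_true))
print(f"max phase error = {err:.2e} rad")
```

In the hardware pipeline described above, the camera capture, the arctangent stage, and the unwrapping stage run concurrently as separate threads, which is what the RTOS scheduling enables.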
Immittance Data Validation by Kramers‐Kronig Relations – Derivation and Implications
2017-01-01
Abstract Explicitly based on causality, linearity (superposition) and stability (time invariance), and implicitly on continuity (consistency), finiteness (convergence) and uniqueness (single-valuedness) in the time domain, Kramers-Kronig (KK) integral transform (KKT) relations for immittances are derived as pure mathematical constructs in the complex frequency domain using the two-sided (bilateral) Laplace integral transform (LT), reduced to the Fourier domain for sufficiently rapidly exponentially decaying, bounded immittances. Novel anti-KK relations are also derived to distinguish LTI (linear, time-invariant) systems from non-linear, unstable and acausal systems. All relations can be used to test KK transformability on the LTI principles of linearity, stability and causality of measured and model data by Fourier transform (FT) in immittance spectroscopy (IS). Integral transform relations are also provided to estimate (conjugate) immittances at zero and infinite frequency, particularly useful to normalise and compare data. Important implications for IS are presented and suggestions for consistent data analysis are made, which generally apply likewise to complex-valued quantities in many fields of engineering and the natural sciences. PMID:29577007
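For reference, the KK relations discussed above can be stated in their standard Hilbert-transform form for a causal, stable, linear response function χ(ω) = χ'(ω) + iχ''(ω); the notation here is the generic textbook one, not the paper's immittance notation:

```latex
\chi'(\omega) = \frac{1}{\pi}\,\mathcal{P}\!\int_{-\infty}^{\infty}
  \frac{\chi''(\omega')}{\omega' - \omega}\,d\omega',
\qquad
\chi''(\omega) = -\frac{1}{\pi}\,\mathcal{P}\!\int_{-\infty}^{\infty}
  \frac{\chi'(\omega')}{\omega' - \omega}\,d\omega'
```

Here P denotes the Cauchy principal value; the real and imaginary parts of any KK-transformable spectrum must satisfy both integrals, which is what a KK validation test checks.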
Sweeping the Floor or Putting a Man on the Moon: How to Define and Measure Meaningful Work
Both-Nwabuwe, Jitske M. C.; Dijkstra, Maria T. M.; Beersma, Bianca
2017-01-01
Meaningful work is integral to well-being and a flourishing life. The construct of “meaningful work” is, however, consistently affected by conceptual ambiguity. Although there is substantial support for arguments to maintain the status of conceptual ambiguity, we make a case for the benefits of having consensus on a definition and scale of meaningful work in the context of paid work. The objective of this article, therefore, was twofold. Firstly, we wanted to develop a more integrative definition of meaningful work. Secondly, we wanted to establish a corresponding operationalization. We reviewed the literature on the existing definitions of meaningful work and the scales designed to measure it. We found 14 definitions of meaningful work. Based on these definitions, we identified four categories of definitions, which led us to propose an integrative and comprehensive definition of meaningful work. We identified two validated scales that were partly aligned with the proposed definition. Based on our review, we conclude that scholars in this field should coalesce rather than diverge their efforts to conceptualize and measure meaningful work. PMID:29033867
Nonparaxial Dark-Hollow Gaussian Beams
NASA Astrophysics Data System (ADS)
Gao, Zeng-Hui; Lü, Bai-Da
2006-01-01
The concept of nonparaxial dark-hollow Gaussian beams (DHGBs) is introduced. By using the Rayleigh-Sommerfeld diffraction integral, the analytical propagation equation of DHGBs in free space is derived. The on-axis intensity, far-field equation and, in particular, paraxial expressions are given and treated as special cases of our result. It is shown that the parameter f = 1/(kw0), with k being the wave number and w0 the waist width, determines the nonparaxiality of DHGBs. However, the parameter range within which the paraxial approach is valid depends on the propagation distance. The beam order affects the beam profile and position of maximum on-axis intensity.
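The nonparaxiality parameter above reduces to f = 1/(kw0) = λ/(2πw0), so it grows as the waist shrinks toward the wavelength scale. The beam values below are illustrative, not taken from the paper.

```python
import numpy as np

# Nonparaxiality parameter f = 1/(k*w0) for a tightly focused beam.
# Wavelength and waist are hypothetical example values.
wavelength = 632.8e-9      # He-Ne wavelength, m
w0 = 1.0e-6                # waist width, m

k = 2.0 * np.pi / wavelength
f = 1.0 / (k * w0)
print(f"f = {f:.3f}")      # f ~ 0.1: nonparaxial corrections not negligible
```

For f << 1 (waist much larger than the wavelength) the paraxial expressions apply; as the abstract notes, the exact range of validity also depends on the propagation distance.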
NASA Technical Reports Server (NTRS)
Fabinsky, Beth
2006-01-01
WISE, the Wide Field Infrared Survey Explorer, is scheduled for launch in June 2010. The mission operations system for WISE requires a software modeling tool to help plan, integrate and simulate all spacecraft pointing and verify that no attitude constraints are violated. In the course of developing the requirements for this tool, an investigation was conducted into the design of similar tools for other space-based telescopes. This paper summarizes the ground software and processes used to plan and validate pointing for a selection of space telescopes; with this information as background, the design for WISE is presented.
Monogenic Mouse Models of Autism Spectrum Disorders: Common Mechanisms and Missing Links
Hulbert, Samuel W.; Jiang, Yong-hui
2016-01-01
Autism Spectrum Disorders (ASDs) present unique challenges in the fields of genetics and neurobiology because of the clinical and molecular heterogeneity underlying these disorders. Genetic mutations found in ASD patients provide opportunities to dissect the molecular and circuit mechanisms underlying autistic behaviors using animal models. Ongoing studies of genetically modified models have offered critical insight into possible common mechanisms arising from different mutations, but links between molecular abnormalities and behavioral phenotypes remain elusive. The challenges encountered in modeling autism in mice demand a new analytic paradigm that integrates behavioral analysis with circuit-level analysis in genetically modified models with strong construct validity. PMID:26733386
magnum.fe: A micromagnetic finite-element simulation code based on FEniCS
NASA Astrophysics Data System (ADS)
Abert, Claas; Exl, Lukas; Bruckner, Florian; Drews, André; Suess, Dieter
2013-11-01
We have developed magnum.fe, a finite-element micromagnetic simulation code based on the FEniCS package. Here we describe the numerical methods that are applied as well as their implementation with FEniCS. We apply a transformation method for the solution of the demagnetization-field problem. A semi-implicit weak formulation is used for the integration of the Landau-Lifshitz-Gilbert equation. Numerical experiments show the validity of simulation results. magnum.fe is open source and well documented. The broad feature range of the FEniCS package makes magnum.fe a good choice for the implementation of novel micromagnetic finite-element algorithms.
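The equation being integrated can be sketched without finite elements at all. The toy below time-steps the Landau-Lifshitz-Gilbert equation for a single macrospin with an explicit midpoint rule and renormalization; the field, damping constant, and step size are illustrative assumptions, not magnum.fe defaults, and the code stands in for (rather than reproduces) the paper's semi-implicit weak formulation.

```python
import numpy as np

GAMMA = 2.211e5  # gyromagnetic ratio (m/(A*s))
ALPHA = 0.1      # Gilbert damping (assumed)

def llg_rhs(m, h):
    """Right-hand side of the LLG equation in Landau-Lifshitz form."""
    pre = -GAMMA / (1.0 + ALPHA**2)
    return pre * (np.cross(m, h) + ALPHA * np.cross(m, np.cross(m, h)))

def integrate(m0, h, dt=1e-13, steps=20000):
    """Explicit midpoint steps with renormalization (|m| must stay 1)."""
    m = np.asarray(m0, dtype=float)
    m /= np.linalg.norm(m)
    for _ in range(steps):
        k1 = llg_rhs(m, h)
        k2 = llg_rhs(m + 0.5 * dt * k1, h)
        m = m + dt * k2
        m /= np.linalg.norm(m)  # project back onto the unit sphere
    return m

# A spin tilted almost into the plane relaxes toward the +z applied field.
m_final = integrate([1.0, 0.0, 0.01], h=np.array([0.0, 0.0, 8e5]))
print(m_final)
```

The renormalization step plays the role that the weak formulation's unit-norm constraint plays in the full finite-element scheme.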
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stanley, Eugene; Liu, Li
In this project, we target three primary objectives: (1) Molecular Dynamics (MD) code development for Fe-Cr alloys, which can provide thermodynamic and kinetic properties as inputs to mesoscale Phase Field (PF) simulations; (2) validation and implementation of the MD code to explain thermal ageing and radiation damage; and (3) an integrated modeling platform for MD and PF simulations. These two simulation tools, MD and PF, will ultimately be merged to understand and quantify the kinetics and mechanisms of microstructure and property evolution of Fe-Cr alloys under various thermal and irradiation environments.
NASA Astrophysics Data System (ADS)
Liu, Yang; D'Angelo, Ralph M.; Sinha, Bikash K.; Zeroug, Smaine
2017-02-01
Modeling and understanding the complex elastic-wave physics prevalent in solid-fluid cylindrically-layered structures is of importance in many NDE fields, and most pertinently in the domain of well integrity evaluation of cased holes in the oil and gas industry. Current sonic measurements provide viable techniques for well integrity evaluation, yet their practical effectiveness is hampered by a lack of knowledge of acoustic wave fields, particularly in complicated cased-hole geometries where, for instance, two or more nested steel strings are present in the borehole. In this article, we propose and implement a Sweeping Frequency Finite Element Method (SFFEM) for acoustic guided-wave simulation in complex geometries that include double steel strings cemented to each other and to the formation, and where the strings may be non-concentric. Transient dynamic finite element models are constructed with sweeping-frequency signals applied as the excitation sources. The source and receiver dispositions simulate current sonic measurement tools deployed in the oilfield. Synthetic wavetrains are recorded and processed with a modified matrix pencil method to isolate both the dispersive and non-dispersive propagating guided wave modes. Scaled experiments of fluid-filled double strings with dimensions mimicking the real ones encountered in the field have also been carried out to generate reference data. A comparison of the experimental and numerical results indicates that the SFFEM is capable of accurately reproducing the rich and intricate higher-order multiple wave fields observed experimentally in the fluid-filled double-string geometries.
Incompressible Deformation Estimation Algorithm (IDEA) from Tagged MR Images
Liu, Xiaofeng; Abd-Elmoniem, Khaled Z.; Stone, Maureen; Murano, Emi Z.; Zhuo, Jiachen; Gullapalli, Rao P.; Prince, Jerry L.
2013-01-01
Measuring the three-dimensional motion of muscular tissues, e.g., the heart or the tongue, using magnetic resonance (MR) tagging is typically carried out by interpolating the two-dimensional motion information measured on orthogonal stacks of images. The incompressibility of muscle tissue is an important constraint on the reconstructed motion field and can significantly help to counter the sparsity and incompleteness of the available motion information. Previous methods utilizing this fact produced incompressible motions with limited accuracy. In this paper, we present an incompressible deformation estimation algorithm (IDEA) that reconstructs a dense representation of the three-dimensional displacement field from tagged MR images and the estimated motion field is incompressible to high precision. At each imaged time frame, the tagged images are first processed to determine components of the displacement vector at each pixel relative to the reference time. IDEA then applies a smoothing, divergence-free, vector spline to interpolate velocity fields at intermediate discrete times such that the collection of velocity fields integrate over time to match the observed displacement components. Through this process, IDEA yields a dense estimate of a three-dimensional displacement field that matches our observations and also corresponds to an incompressible motion. The method was validated with both numerical simulation and in vivo human experiments on the heart and the tongue. PMID:21937342
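The incompressibility constraint central to IDEA rests on a standard identity: a velocity field constructed as the curl of a vector potential is divergence-free, so the motion it integrates to preserves volume. The sketch below verifies this numerically on a grid; the smooth potential is an arbitrary example, not the divergence-free vector spline IDEA actually fits.

```python
import numpy as np

# Build v = curl(A) on a 32^3 grid and check that div(v) vanishes.
n = 32
x = np.linspace(0.0, 1.0, n)
X, Y, Z = np.meshgrid(x, x, x, indexing="ij")
h = x[1] - x[0]

# Arbitrary smooth vector potential A = (Ax, Ay, Az).
Ax = np.sin(np.pi * Y) * Z
Ay = np.cos(np.pi * Z) * X
Az = np.sin(np.pi * X) * Y

def d(f, axis):
    """Finite-difference partial derivative along one grid axis."""
    return np.gradient(f, h, axis=axis)

vx = d(Az, 1) - d(Ay, 2)
vy = d(Ax, 2) - d(Az, 0)
vz = d(Ay, 0) - d(Ax, 1)

div = d(vx, 0) + d(vy, 1) + d(vz, 2)
print(float(np.abs(div).max()))  # zero up to floating-point rounding
```

Because difference operators along distinct axes commute, the discrete divergence of a discrete curl cancels identically, mirroring the high-precision incompressibility the abstract reports.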
Berger, Cezar; Freitas, Renato; Malafaia, Osvaldo; Pinto, José Simão de Paula; Mocellin, Marcos; Macedo, Evaldo; Fagundes, Marina Serrato Coelho
2012-01-01
Introduction: In the health field, computerization has become increasingly necessary in professional practice, since it facilitates data recovery and assists in the development of research with greater scientific rigor. Objective: The present work aimed to develop, apply, and validate specific electronic protocols for patients referred for rhinoplasty. Methods: The prospective research had 3 stages: (1) preparation of theoretical data bases; (2) creation of a master protocol using the Integrated System of Electronic Protocol (SINPE©); and (3) elaboration, application, and validation of a specific protocol for the nose and sinuses regarding rhinoplasty. Results: After the preparation of the master protocol, which covered the entire field of otorhinolaryngology, we designed a specific protocol containing all matters related to the patient; in particular, the aesthetic and functional nasal complaints referred for surgical treatment (i.e., rhinoplasty) were organized into 6 main hierarchical categories: anamnesis, physical examination, complementary exams, diagnosis, treatment, and outcome. This protocol utilized these categories and their sub-items: finality; access; surgical maneuvers on the nasal dorsum, tip, and base; clinical evolution after 3, 6, and 12 months; revisional surgery; and quantitative and qualitative evaluations. Conclusion: The developed specific electronic protocol is feasible and important for registering information from patients referred for rhinoplasty. PMID:25991979
LANL*V2.0: global modeling and validation
NASA Astrophysics Data System (ADS)
Koller, J.; Zaharia, S.
2011-08-01
We describe in this paper the new version of LANL*, an artificial neural network (ANN) for calculating the magnetic drift invariant L*. This quantity is used for modeling radiation belt dynamics and for space weather applications. We have implemented the following enhancements in the new version: (1) we have removed the limitation to geosynchronous orbit and the model can now be used for a much larger region. (2) The new version is based on the improved magnetic field model by Tsyganenko and Sitnov (2005) (TS05) instead of the older model by Tsyganenko et al. (2003). We have validated the model and compared our results to L* calculations with the TS05 model based on ephemerides for CRRES, Polar, GPS, a LANL geosynchronous satellite, and a virtual RBSP type orbit. We find that the neural network performs very well for all these orbits with an error typically ΔL* < 0.2 which corresponds to an error of 3 % at geosynchronous orbit. This new LANL* V2.0 artificial neural network is orders of magnitudes faster than traditional numerical field line integration techniques with the TS05 model. It has applications to real-time radiation belt forecasting, analysis of data sets involving decades of satellite of observations, and other problems in space weather.
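The quoted accuracy figures are mutually consistent and easy to sanity-check: geosynchronous orbit sits near L* ≈ 6.6 Earth radii, so an absolute error of ΔL* = 0.2 is roughly the 3 % relative error the abstract states.

```python
# Relative error implied by dL* = 0.2 at geosynchronous orbit (L* ~ 6.6).
L_geo = 6.6
dL = 0.2

rel = dL / L_geo
print(f"{rel:.1%}")  # ~3.0%
```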
Structural biology data archiving - where we are and what lies ahead.
Kleywegt, Gerard J; Velankar, Sameer; Patwardhan, Ardan
2018-05-10
For almost 50 years, structural biology has endeavoured to conserve and share its experimental data and their interpretations (usually, atomistic models) through global public archives such as the Protein Data Bank, Electron Microscopy Data Bank and Biological Magnetic Resonance Data Bank (BMRB). These archives are treasure troves of freely accessible data that document our quest for molecular or atomic understanding of biological function and processes in health and disease. They have prepared the field to tackle new archiving challenges as more and more (combinations of) techniques are being utilized to elucidate structure at ever increasing length scales. Furthermore, the field has made substantial efforts to develop validation methods that help users to assess the reliability of structures and to identify the most appropriate data for their needs. In this Review, we present an overview of public data archives in structural biology and discuss the importance of validation for users and producers of structural data. Finally, we sketch our efforts to integrate structural data with bioimaging data and with other sources of biological data. This will make relevant structural information available and more easily discoverable for a wide range of scientists. © 2018 The Authors. FEBS Letters published by John Wiley & Sons Ltd on behalf of Federation of European Biochemical Societies.
Validation of a pulsed electric field process to pasteurize strawberry puree
USDA-ARS?s Scientific Manuscript database
An inexpensive data acquisition method was developed to validate the exact number and shape of the pulses applied during pulsed electric fields (PEF) processing. The novel validation method was evaluated in conjunction with developing a pasteurization PEF process for strawberry puree. Both buffered...
Fast solver for large scale eddy current non-destructive evaluation problems
NASA Astrophysics Data System (ADS)
Lei, Naiguang
Eddy current testing plays a very important role in non-destructive evaluations of conducting test samples. Based on Faraday's law, an alternating magnetic field source generates induced currents, called eddy currents, in an electrically conducting test specimen. The eddy currents generate induced magnetic fields that oppose the direction of the inducing magnetic field in accordance with Lenz's law. In the presence of discontinuities in material property or defects in the test specimen, the induced eddy current paths are perturbed and the associated magnetic fields can be detected by coils or magnetic field sensors, such as Hall elements or magneto-resistance sensors. Due to the complexity of the test specimen and the inspection environments, the availability of theoretical simulation models is extremely valuable for studying the basic field/flaw interactions in order to obtain a fuller understanding of non-destructive testing phenomena. Theoretical models of the forward problem are also useful for training and validation of automated defect detection systems. Theoretical models generate defect signatures that are expensive to replicate experimentally. In general, modelling methods can be classified into two categories: analytical and numerical. Although analytical approaches offer closed-form solutions, these are generally not obtainable, largely due to the complex sample and defect geometries, especially in three-dimensional space. Numerical modelling has become popular with advances in computer technology and computational methods. However, due to the huge time consumption in the case of large scale problems, accelerations/fast solvers are needed to enhance numerical models. This dissertation describes a numerical simulation model for eddy current problems using finite element analysis. Validation of the accuracy of this model is demonstrated via comparison with experimental measurements of steam generator tube wall defects.
These simulations, which generate two-dimensional raster-scan data, typically take one to two days on a dedicated eight-core PC. A novel direct integral solver for eddy current problems with a GPU-based implementation is also investigated in this research to reduce the computational time.
Why do health and social care providers co-operate?
van Raak, Arno; Paulus, Aggie; Mur-Veeman, Ingrid
2005-09-28
Within Europe, although there are numerous examples of poor co-ordination in the delivery of integrated care, many providers do co-operate. We wanted to know why providers are moved to co-operate. In terms of systematic research, this is a new field; researchers have only begun to theorise about the rationales for co-operation. Practically, the issue of achieving co-operation attracts much attention from policymakers. Understanding the reasons for co-operation is a prerequisite for developing effective policy in support of integrated care. Our aim is to explore the comparative validity of different theoretical perspectives on the reasons for co-operation, to indicate directions for further study and for policy making. We used data from three successive studies to perform pattern matching with six established theoretical perspectives: transaction costs economics, strategic choice theory, resource dependence theory, learning theory, stakeholder theory and institutional theory. Insights from the studies were compared for validating purposes (triangulation). The first study concerned the evaluation of the Dutch 'National Home Health Care Programme' according to the case study methodology. The second and third studies were surveys among project directors: questionnaires were based on the concepts derived from the first study. Researchers should combine normative institutional theory, resource dependence theory and stakeholder theory into one perspective, in order to study relationship formation in health and social care. The concept of institutions (rules) is the linchpin between the theories. Policy makers must map the institutions of stakeholders and enable integrated care policy to correspond with these institutions as much as possible.
Temporal integration of soil N2O fluxes: validation of IPNOA station automatic chamber prototype.
Laville, P; Bosco, S; Volpi, I; Virgili, G; Neri, S; Continanza, D; Bonari, E
2017-09-04
The assessment of nitrous oxide (N2O) fluxes from agricultural soil surfaces still poses a major challenge to the scientific community. Integrated soil fluxes of N2O are difficult to evaluate owing to their low emissions compared with CO2. These emissions are also sporadic, as environmental conditions act as a limiting factor. A station prototype was developed to integrate annual N2O and CO2 emissions using an automatic chamber technique and infrared spectrometers within the LIFE project (IPNOA: LIFE11 ENV/IT/00032). It was installed from June 2014 to October 2015 in an experimental maize field in Tuscany. The detection limits for the fluxes were evaluated at 1.6 ng N-N2O m-2 s-1 and 0.3 μg C-CO2 m-2 s-1. A cross-comparison carried out in September 2015 with the "mobile IPNOA prototype", a high-sensitivity transportable instrument already validated, provided evidence of very similar values and highlighted flux-assessment limitations depending on the gas analyzers used. The permanent monitoring device showed that the temporal distribution of N2O fluxes can be very large and discontinuous over short periods of less than 10 days, and that N2O fluxes were below the detection limit of the instrumentation during approximately 70% of the measurement time. The N2O emission factors were estimated at 1.9% in 2014 and 1.7% in 2015, within the range of IPCC assessments.
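An emission factor such as the quoted 1.7-1.9% is conventionally derived from the cumulative seasonal N2O-N flux, corrected for an unfertilized control, divided by the nitrogen applied. The sketch below shows the arithmetic; every number in it is an illustrative assumption, not one of the IPNOA station's measurements.

```python
# Illustrative emission-factor (EF) calculation for a fertilized maize plot.
n_applied = 200.0          # kg N/ha applied as fertilizer (assumed)
emitted_fertilized = 4.1   # kg N2O-N/ha, cumulative seasonal flux (assumed)
emitted_control = 0.3      # kg N2O-N/ha from an unfertilized control (assumed)

ef = (emitted_fertilized - emitted_control) / n_applied * 100.0
print(f"EF = {ef:.1f}%")  # EF = 1.9%
```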
Jet Engine Fan Response to Inlet Distortions Generated by Ingesting Boundary Layer Flow
NASA Astrophysics Data System (ADS)
Giuliani, James Edward
Future civil transport designs may incorporate engines integrated into the body of the aircraft to take advantage of efficiency increases due to weight and drag reduction. Additional increases in engine efficiency are predicted if the inlets ingest the lower momentum boundary layer flow that develops along the surface of the aircraft. Previous studies have shown, however, that the efficiency benefits of Boundary Layer Ingesting (BLI) inlets are very sensitive to the magnitude of fan and duct losses, and blade structural response to the non-uniform flow field that results from a BLI inlet has not been studied in-depth. This project represents an effort to extend the modeling capabilities of TURBO, an existing rotating turbomachinery unsteady analysis code, to include the ability to solve the external and internal flow fields of a BLI inlet. The TURBO code has been a successful tool in evaluating fan response to flow distortions for traditional engine/inlet integrations. Extending TURBO to simulate the external and inlet flow field upstream of the fan will allow accurate pressure distortions that result from BLI inlet configurations to be computed and used to analyze fan aerodynamics and structural response. To validate the modifications for the BLI inlet flow field, an experimental NASA project to study flush-mounted S-duct inlets with large amounts of boundary layer ingestion was modeled. Results for the flow upstream and in the inlet are presented and compared to experimental data for several high Reynolds number flows to validate the modifications to the solver. Once the inlet modifications were validated, a hypothetical compressor fan was connected to the inlet, matching the inlet operating conditions so that the effect on the distortion could be evaluated. Although the total pressure distortion upstream of the fan was symmetrical for this geometry, the pressure rise generated by the fan blades was not, because of the velocity non-uniformity of the distortion. 
Total pressure profiles at various axial locations are computed to identify the overall distortion pattern, how the distortion evolves through the blade passages and mixes out downstream of the blades, and where any critical performance concerns might be. Stall cells are identified that are stationary in the absolute frame and are fixed to the inlet distortion. Flow paths around the blades are examined to study the stall mechanism. Rather than a static airfoil stall, it is observed that the non-uniform pressure loading promotes a three-dimensional dynamic stall. The stall occurs at a point of rapid incidence angle oscillation, observed when a blade passes through the distortion, and re-attaches when the blade leaves the distortion.
Identification of sea ice types in spaceborne synthetic aperture radar data
NASA Technical Reports Server (NTRS)
Kwok, Ronald; Rignot, Eric; Holt, Benjamin; Onstott, R.
1992-01-01
This study presents an approach for identification of sea ice types in spaceborne SAR image data. The unsupervised classification approach involves cluster analysis for segmentation of the image data followed by cluster labeling based on previously defined look-up tables containing the expected backscatter signatures of different ice types measured by a land-based scatterometer. Extensive scatterometer observations and experience accumulated in field campaigns during the last 10 yr were used to construct these look-up tables. The classification approach, its expected performance, the dependence of this performance on radar system performance, and expected ice scattering characteristics are discussed. Results using both aircraft and simulated ERS-1 SAR data are presented and compared to limited field ice property measurements and coincident passive microwave imagery. The importance of an integrated postlaunch program for the validation and improvement of this approach is discussed.
A CCD experimental platform for large telescope in Antarctica based on FPGA
NASA Astrophysics Data System (ADS)
Zhu, Yuhua; Qi, Yongjun
2014-07-01
The CCD detector is one of the key components of an astronomical telescope, and a large telescope in Antarctica requires a CCD detector system with large format, high sensitivity, and low noise. Because the site is extremely cold and unattended, system maintenance and software and hardware upgrades are difficult. This paper introduces a general CCD controller experimental platform based on a Field Programmable Gate Array (FPGA), which is in effect a large-scale, field-reconfigurable array. Taking advantage of how easily such a system can be modified, the platform implements the driving circuit, the digital signal processing module, the network communication interface, control algorithm validation, and a remote reconfiguration module. With the concept of integrated hardware and software, the paper discusses the key technologies for building a scientific CCD system suited to the special working environment in Antarctica, focusing on a method for remotely reconfiguring the controller via the network, and offers a feasible hardware and software solution.
Adiabatic model and design of a translating field reversed configuration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Intrator, T. P.; Siemon, R. E.; Sieck, P. E.
We apply an adiabatic evolution model to predict the behavior of a field reversed configuration (FRC) during decompression and translation, as well as during boundary compression. Semi-empirical scaling laws, which were developed and benchmarked primarily for collisionless FRCs, are expected to remain valid even for the collisional regime of the FRX-L experiment. We use this approach to outline the design implications for FRX-L, the high density translated FRC experiment at Los Alamos National Laboratory. A conical theta coil is used to accelerate the FRC to the largest practical velocity so it can enter a mirror bounded compression region, where it must be a suitable target for a magnetized target fusion (MTF) implosion. FRX-L provides the physics basis for the integrated MTF plasma compression experiment at the Shiva-Star pulsed power facility at Kirtland Air Force Research Laboratory, where the FRC will be compressed inside a flux conserving cylindrical shell.
Field Analysis of Microbial Contamination Using Three Molecular Methods in Parallel
NASA Technical Reports Server (NTRS)
Morris, H.; Stimpson, E.; Schenk, A.; Kish, A.; Damon, M.; Monaco, L.; Wainwright, N.; Steele, A.
2010-01-01
Advanced technologies with the capability of detecting microbial contamination remain an integral tool for the next stage of space agency proposed exploration missions. To maintain a clean, operational spacecraft environment with minimal potential for forward contamination, such technology is a necessity; in particular, the ability to analyze samples near the point of collection and in real-time, both for conducting biological scientific experiments and for performing routine monitoring operations. Multiple molecular methods for detecting microbial contamination are available, but many are either too large or not validated for use on spacecraft. Two methods, the adenosine triphosphate (ATP) and Limulus Amebocyte Lysate (LAL) assays, have been approved by the NASA Planetary Protection Office for the assessment of microbial contamination on spacecraft surfaces. We present the first parallel field analysis of microbial contamination pre- and post-cleaning using these two methods as well as universal primer-based polymerase chain reaction (PCR).
Parametric Characterization of TES Detectors Under DC Bias
NASA Technical Reports Server (NTRS)
Chiao, Meng P.; Smith, Stephen James; Kilbourne, Caroline A.; Adams, Joseph S.; Bandler, Simon R.; Betancourt-Martinez, Gabriele L.; Chervenak, James A.; Datesman, Aaron M.; Eckart, Megan E.; Ewin, Audrey J.;
2016-01-01
The X-ray Integral Field Unit (X-IFU) on the European Space Agency's (ESA's) Athena mission will be the first high-resolution X-ray spectrometer in space using a large-format transition-edge sensor microcalorimeter array. Motivated by optimization of detector performance for X-IFU, we have conducted an extensive campaign of parametric characterization on transition-edge sensor (TES) detectors with nominal geometries and physical properties in order to establish sensitivity trends relative to magnetic field, dc bias on detectors, and operating temperature, and to improve our understanding of detector behavior relative to its fundamental properties such as thermal conductivity, heat capacity, and transition temperature. These results were used for validation of a simple linear detector model in which a small perturbation can be introduced to one or multiple parameters to estimate the error budget for X-IFU. We show here the results of our parametric characterization of TES detectors and briefly discuss the comparison with the TES model.
Overview of the present progress and activities on the CFETR
NASA Astrophysics Data System (ADS)
Wan, Yuanxi; Li, Jiangang; Liu, Yong; Wang, Xiaolin; Chan, Vincent; Chen, Changan; Duan, Xuru; Fu, Peng; Gao, Xiang; Feng, Kaiming; Liu, Songlin; Song, Yuntao; Weng, Peide; Wan, Baonian; Wan, Farong; Wang, Heyi; Wu, Songtao; Ye, Minyou; Yang, Qingwei; Zheng, Guoyao; Zhuang, Ge; Li, Qiang; CFETR Team
2017-10-01
The China Fusion Engineering Test Reactor (CFETR) is the next device in the roadmap for the realization of fusion energy in China, which aims to bridge the gaps between the fusion experimental reactor ITER and the demonstration reactor (DEMO). CFETR will be operated in two phases. Steady-state operation and self-sufficiency will be the two key issues for Phase I with a modest fusion power of up to 200 MW. Phase II aims for DEMO validation with a fusion power over 1 GW. Advanced H-mode physics, high magnetic fields up to 7 T, high frequency electron cyclotron resonance heating and lower hybrid current drive together with off-axis negative-ion neutral beam injection will be developed for achieving steady-state advanced operation. The recent detailed design, research and development (R&D) activities including integrated modeling of operation scenarios, high field magnet, material, tritium plant, remote handling and future plans are introduced in this paper.
ARN Integrated Retail Module (IRM) & 3D Whole Body Scanner System at Fort Carson, Colorado
2006-12-01
3) Integrate 3D Whole Body scanning technology with the ARN Integrated Retail Module (IRM) for clothing issue at the Central Issue Facility (CIF), Ft. Carson, CO; and 4) Develop and validate dynamic local tariffs. The main goals of the ARN 3D scanning research initiative at the Ft
NASA Astrophysics Data System (ADS)
Sari, Anggi Ristiyana Puspita; Suyanta, LFX, Endang Widjajanti; Rohaeti, Eli
2017-05-01
Recognizing the importance of developing critical thinking and science process skills, an assessment instrument should attend to the characteristics of chemistry; constructing an accurate instrument for measuring those skills is therefore important. However, integrated assessment instruments are limited in number. The purpose of this study is to validate an integrated assessment instrument for measuring students' critical thinking and science process skills on acid-base matter. The development of the test instrument adapted the McIntire model. The sample consisted of 392 second-grade high school students in the academic year 2015/2016 in Yogyakarta. Exploratory Factor Analysis (EFA) was conducted to explore construct validity, whereas content validity was substantiated by Aiken's formula. The results show a KMO measure of 0.714, indicating adequate sampling for factor analysis, and a significant Bartlett test (significance value less than 0.05). Furthermore, the content validity coefficient, based on ratings from 8 experts, is 0.85. The findings support the integrated assessment instrument for measuring critical thinking and science process skills on acid-base matter.
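The content-validity figure based on expert judgments is typically computed with Aiken's V, V = Σ(r_i − lo) / (n·(c − 1)), where r_i are the expert ratings, lo is the lowest rating category, and c the number of categories. The ratings below are invented for illustration; the study's own coefficient of 0.85 came from its 8 experts' actual ratings.

```python
# Aiken's V for one item rated by 8 experts on an assumed 1-5 scale.
ratings = [5, 4, 5, 4, 5, 4, 5, 4]
lo, c = 1, 5
n = len(ratings)

V = sum(r - lo for r in ratings) / (n * (c - 1))
print(round(V, 3))  # 0.875
```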
NASA Astrophysics Data System (ADS)
Vidal, A.; San-Blas, A. A.; Quesada-Pereira, F. D.; Pérez-Soler, J.; Gil, J.; Vicente, C.; Gimeno, B.; Boria, V. E.
2015-07-01
A novel technique for the full-wave analysis of 3-D complex waveguide devices is presented. This new formulation, based on the Boundary Integral-Resonant Mode Expansion (BI-RME) method, allows the rigorous full-wave electromagnetic characterization of 3-D arbitrarily shaped metallic structures making use of extremely low CPU resources (both time and memory). The unknown electric current density on the surface of the metallic elements is represented by means of Rao-Wilton-Glisson basis functions, and an algebraic procedure based on a singular value decomposition is applied to transform such functions into the classical solenoidal and nonsolenoidal basis functions needed by the original BI-RME technique. The developed tool also provides an accurate computation of the electromagnetic fields at an arbitrary observation point of the considered device, so it can be used for predicting high-power breakdown phenomena. In order to validate the accuracy and efficiency of this novel approach, several new designs of band-pass waveguide filters are presented. The obtained results (S-parameters and electromagnetic fields) are successfully compared both to experimental data and to numerical simulations provided by a commercial software tool based on the finite element technique. The results show that the new technique is especially suitable for the efficient full-wave analysis of complex waveguide devices with an integrated coaxial excitation, where the coaxial probes may be in contact with the metallic insets of the component.
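The SVD-based basis change can be sketched in miniature. If D is a discrete "divergence" operator acting on basis coefficients, its SVD splits coefficient space into a nonsolenoidal part (nonzero singular values) and a solenoidal part (the null space, D v = 0). The matrix below is a random rank-deficient stand-in, not an actual RWG divergence operator, so this illustrates only the linear-algebra step, not the full BI-RME machinery.

```python
import numpy as np

rng = np.random.default_rng(1)
n, rank = 8, 5
D = rng.normal(size=(rank, n))  # rank-5 stand-in "divergence" map on an 8-dim basis

# Full SVD: rows of Vt beyond the rank span the null space of D.
U, s, Vt = np.linalg.svd(D)
nonsolenoidal = Vt[:rank].T   # directions with D v != 0
solenoidal = Vt[rank:].T      # null-space directions: D v = 0

print(np.allclose(D @ solenoidal, 0.0))  # True
```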
Hsieh, Pi-Jung
2015-07-01
Cloud computing technology has recently been seen as an important milestone in medical informatics development. Despite its great potential, there are gaps in our understanding of how users evaluate change in relation to the health cloud and how they decide to resist it. Integrating technology acceptance and status quo bias perspectives, this study develops an integrated model to explain healthcare professionals' intention to use the health cloud service and their intention to resist it. A field survey was conducted in Taiwan to collect data from healthcare professionals; a structural equation model was used to examine the data. A valid sample of 209 healthcare professionals was collected for data analysis. The results show that healthcare professionals' resistance to the use of the health cloud is the result of regret avoidance, inertia, perceived value, switching costs, and perceived threat. Attitude, subjective norm, and perceived behavior control are shown to have positive and direct effects on healthcare professionals' intention to use the health cloud. The results also indicate a significant negative effect in the relationship between healthcare professionals' intention and resistance to using the health cloud. Our study illustrates the importance of incorporating user resistance in technology acceptance studies in general and in health technology usage studies in particular. This study also identifies key factors for practitioners and hospitals to make adoption decisions in relation to the health cloud. Further, the study provides a useful reference for future studies in this subject field. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Real-Time fMRI Pattern Decoding and Neurofeedback Using FRIEND: An FSL-Integrated BCI Toolbox
Sato, João R.; Basilio, Rodrigo; Paiva, Fernando F.; Garrido, Griselda J.; Bramati, Ivanei E.; Bado, Patricia; Tovar-Moll, Fernanda; Zahn, Roland; Moll, Jorge
2013-01-01
The demonstration that humans can learn to modulate their own brain activity based on feedback of neurophysiological signals opened up exciting opportunities for fundamental and applied neuroscience. Although EEG-based neurofeedback has been long employed both in experimental and clinical investigation, functional MRI (fMRI)-based neurofeedback emerged as a promising method, given its superior spatial resolution and ability to gauge deep cortical and subcortical brain regions. In combination with improved computational approaches, such as pattern recognition analysis (e.g., Support Vector Machines, SVM), fMRI neurofeedback and brain decoding represent key innovations in the field of neuromodulation and functional plasticity. Expansion in this field and its applications critically depend on the existence of freely available, integrated and user-friendly tools for the neuroimaging research community. Here, we introduce FRIEND, a graphic-oriented user-friendly interface package for fMRI neurofeedback and real-time multivoxel pattern decoding. The package integrates routines for image preprocessing in real-time, ROI-based feedback (single-ROI BOLD level and functional connectivity) and brain decoding-based feedback using SVM. FRIEND delivers an intuitive graphic interface with flexible processing pipelines involving optimized procedures embedding widely validated packages, such as FSL and libSVM. In addition, a user-defined visual neurofeedback module allows users to easily design and run fMRI neurofeedback experiments using ROI-based or multivariate classification approaches. FRIEND is open-source and free for non-commercial use. Processing tutorials and extensive documentation are available. PMID:24312569
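The decoding feedback described above rests on SVM classification of incoming brain volumes. The sketch below illustrates that general pattern with scikit-learn on synthetic data; the voxel counts, conditions and tanh feedback mapping are illustrative assumptions, not FRIEND's actual pipeline.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Hypothetical training data: masked voxel patterns (n_volumes x n_voxels)
# from two localizer conditions recorded before the neurofeedback run.
n_vox = 200
train_a = rng.normal(0.5, 1.0, (40, n_vox))   # condition A patterns
train_b = rng.normal(-0.5, 1.0, (40, n_vox))  # condition B patterns
X = np.vstack([train_a, train_b])
y = np.array([1] * 40 + [0] * 40)

clf = SVC(kernel="linear").fit(X, y)          # train once, offline

# During the run, each incoming preprocessed volume is classified, and the
# decision value (signed distance from the hyperplane) drives the display.
new_volume = rng.normal(0.5, 1.0, (1, n_vox))
label = clf.predict(new_volume)[0]
score = clf.decision_function(new_volume)[0]
feedback = float(np.tanh(score))              # squash to (-1, 1) for display
```

In a real-time setting the `fit` step happens between runs, while `predict`/`decision_function` must complete within one repetition time per volume.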
NASA Astrophysics Data System (ADS)
Salomir, Rares; Rata, Mihaela; Lafon, Cyril; Melodelima, David; Chapelon, Jean-Yves; Mathias, Adrien; Cotton, François; Bonmartin, Alain; Cathignol, Dominique
2006-05-01
Contact application of high intensity ultrasound has been demonstrated to be suitable for thermal ablation of sectorial tumours of the digestive duct. Experimental validation of a new MR compatible ultrasonic device, dedicated to the minimally invasive therapy of localized colorectal cancer, is described here. The device is a cylindrical 1D 64-element phased array transducer of 14 mm diameter and 25 mm height (Imasonic, France) allowing electronic rotation of the acoustic beam. The operating frequency ranges from 3.5 to 4.0 MHz, and up to 5 effective electrical watts per element are available. A plane wave is reconstructed by simultaneous excitation of eight adjacent elements with an appropriate phase law. The driving electronics operate outside the Faraday cage of the scanner and provide fast switching capabilities. Excellent passive and active compatibility with the MRI data acquisition has been demonstrated. In addition, the feasibility of active temperature control has been demonstrated based on real-time data export out of the MR scanner and a PID feedback algorithm. Further studies will address the in-vivo validation and the integration of a miniature NMR coil for increased SNR in the near field.
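A PID loop of the kind described closes around the MR thermometry readings and sets the acoustic power. The sketch below is a minimal illustration only: the gains, the 1 s update interval, the power limits, and the first-order tissue response are assumptions, not the authors' implementation.

```python
# Minimal PID temperature-control sketch. The "measured" temperature would
# come from real-time MR thermometry maps exported from the scanner; the
# controller output sets the electrical power per element.

def make_pid(kp, ki, kd, dt, out_min, out_max):
    state = {"i": 0.0, "prev": None}
    def step(setpoint, measured):
        err = setpoint - measured
        deriv = 0.0 if state["prev"] is None else (err - state["prev"]) / dt
        state["prev"] = err
        u = kp * err + ki * state["i"] + kd * deriv
        if out_min < u < out_max:      # anti-windup: freeze the integral
            state["i"] += err * dt     # while the actuator is saturated
        return min(out_max, max(out_min, u))
    return step

pid = make_pid(kp=0.8, ki=0.4, kd=0.05, dt=1.0,   # one MR map per second
               out_min=0.0, out_max=5.0)          # 0-5 W per element

temp, target = 37.0, 57.0                         # degrees C
for _ in range(120):                              # two minutes of closed loop
    power = pid(target, temp)
    # toy first-order tissue response: heating from power, cooling toward 37 C
    temp += 0.5 * power - 0.05 * (temp - 37.0)
```

With these illustrative gains the simulated temperature settles close to the 57 °C setpoint within the two-minute window.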
A passive integrative sampler for mercury vapor in air and neutral mercury species in water
Brumbaugh, W.G.; Petty, J.D.; May, T.W.; Huckins, J.N.
2000-01-01
A passive integrative mercury sampler (PIMS) based on a sealed polymeric membrane was effective for the collection and preconcentration of Hg(0). Because the Hg is both oxidized and stabilized in the PIMS, sampling intervals of weeks to months are possible. The effective air sampling rate for a 15 x 2.5 cm device was about 2 L-equivalents/day (0.002 m3/day), and the detection limit for 4-week sampling was about 2 ng/m3 for conventional ICP-MS determination without clean-room preparation. Sampling precision was ≤5% RSD for laboratory exposures and 5-10% RSD for field exposures. These results suggest that the PIMS could be useful for screening assessments of Hg contamination and exposure in the environment, the laboratory, and the workplace. The PIMS approach may be particularly useful for applications requiring unattended sampling for extended periods at remote locations. Preliminary results indicate that sampling for dissolved gaseous mercury (DGM) and potentially other neutral mercury species from water is also feasible. Rigorous validation of the sampler performance is currently in progress.
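The concentration arithmetic implied by these figures is the standard passive-sampler calculation C = m / (R · t). A minimal sketch, using the sampling rate and deployment length from the abstract with a hypothetical recovered mass:

```python
# Time-weighted average (TWA) air concentration from a passive sampler:
# C = m / (R * t). The sampling rate (0.002 m3/day for the 15 x 2.5 cm
# device) and the 4-week deployment come from the abstract; the recovered
# mass is a made-up illustrative value.

sampling_rate_m3_per_day = 0.002   # effective air sampling rate
exposure_days = 28                 # 4-week deployment
recovered_ng = 1.12                # hypothetical Hg mass measured by ICP-MS

air_sampled_m3 = sampling_rate_m3_per_day * exposure_days   # 0.056 m3
concentration_ng_m3 = recovered_ng / air_sampled_m3         # 20 ng/m3
```

Note that at the stated ~2 ng/m3 detection limit, a 4-week deployment can only resolve masses above roughly 0.1 ng.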
Validation of two innovative methods to measure contaminant mass flux in groundwater
NASA Astrophysics Data System (ADS)
Goltz, Mark N.; Close, Murray E.; Yoon, Hyouk; Huang, Junqi; Flintoft, Mark J.; Kim, Sehjong; Enfield, Carl
2009-04-01
The ability to quantify the mass flux of a groundwater contaminant that is leaching from a source area is critical to enable us to: (1) evaluate the risk posed by the contamination source and prioritize cleanup, (2) evaluate the effectiveness of source remediation technologies or natural attenuation processes, and (3) quantify a source term for use in models that may be applied to predict maximum contaminant concentrations in downstream wells. Recently, a number of new methods have been developed and subsequently applied to measure contaminant mass flux in groundwater in the field. However, none of these methods has been validated at larger than laboratory scale through a comparison of the measured mass flux with a known flux introduced into flowing groundwater. Two innovative flux measurement methods, the tandem circulation well (TCW) and modified integral pumping test (MIPT) methods, have recently been proposed. The TCW method can measure mass flux integrated over a large subsurface volume without extracting water. The TCW method may be implemented using two different techniques. One technique, the multi-dipole technique, is relatively simple and inexpensive, requiring only measurement of heads, while the second technique requires conducting a tracer test. The MIPT method is an easily implemented method of obtaining volume-integrated flux measurements. In the current study, flux measurements obtained using these two methods are compared with known mass fluxes in a three-dimensional, artificial aquifer. Experiments in the artificial aquifer show that the TCW multi-dipole and tracer test techniques accurately estimated flux, within 2% and 16%, respectively, although the good results obtained using the multi-dipole technique may be fortuitous. The MIPT method was not as accurate as the TCW method, underestimating flux by as much as 70%.
MIPT method inaccuracies may arise because the method's assumptions (two-dimensional steady groundwater flow to fully-screened wells) were not well-approximated. While fluxes measured using the MIPT method were consistently underestimated, the method's simplicity and applicability in the field may compensate for the inaccuracies observed in this artificial aquifer test.
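The accuracy figures quoted above are simple signed percent errors of each estimate against the known injected flux. A minimal helper, with hypothetical flux values chosen only to mirror the reported ranges:

```python
def percent_error(measured_flux, known_flux):
    """Signed percent deviation of an estimated mass flux from the known flux."""
    return 100.0 * (measured_flux - known_flux) / known_flux

# Hypothetical fluxes (g/day), not the experiment's actual numbers:
known = 10.0
tcw_dipole_err = percent_error(9.8, known)   # about -2%  (TCW multi-dipole)
tcw_tracer_err = percent_error(8.4, known)   # about -16% (TCW tracer test)
mipt_err = percent_error(3.0, known)         # about -70% (MIPT underestimate)
```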
Stateless and stateful implementations of faithful execution
Pierson, Lyndon G; Witzke, Edward L; Tarman, Thomas D; Robertson, Perry J; Eldridge, John M; Campbell, Philip L
2014-12-16
A faithful execution system includes system memory, a target processor, and protection engine. The system memory stores a ciphertext including value fields and integrity fields. The value fields each include an encrypted executable instruction and the integrity fields each include an encrypted integrity value for determining whether a corresponding one of the value fields has been modified. The target processor executes plaintext instructions decoded from the ciphertext while the protection engine is coupled between the system memory and the target processor. The protection engine includes logic to retrieve the ciphertext from the system memory, decrypt the value fields into the plaintext instructions, perform an integrity check based on the integrity fields to determine whether any of the corresponding value fields have been modified, and provide the plaintext instructions to the target processor for execution.
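The fetch path described (retrieve ciphertext, decrypt the value field, check the integrity field, release the plaintext instruction) can be modeled in a few lines. The XOR keystream and truncated HMAC below are illustrative stand-ins for the patented encryption and integrity scheme, and the 4-byte instruction word is arbitrary.

```python
import hmac
import hashlib

KEY = b"protection-engine-key"   # hypothetical engine key

def keystream(addr):
    # per-address keystream, so identical instructions encrypt differently
    return hashlib.sha256(KEY + addr.to_bytes(8, "big")).digest()

def encrypt_word(addr, instr):
    """Produce the (value field, integrity field) pair stored in memory."""
    value = bytes(a ^ b for a, b in zip(instr, keystream(addr)))
    integrity = hmac.new(KEY, addr.to_bytes(8, "big") + value,
                         hashlib.sha256).digest()[:8]
    return value, integrity

def fetch_and_decode(addr, value, integrity):
    """Protection-engine fetch: verify the integrity field, then decrypt."""
    expected = hmac.new(KEY, addr.to_bytes(8, "big") + value,
                        hashlib.sha256).digest()[:8]
    if not hmac.compare_digest(expected, integrity):
        raise RuntimeError("integrity fault: value field was modified")
    return bytes(a ^ b for a, b in zip(value, keystream(addr)))

value, tag = encrypt_word(0x1000, b"\x13\x05\x05\x93")   # arbitrary 4-byte word
assert fetch_and_decode(0x1000, value, tag) == b"\x13\x05\x05\x93"

tampered = bytes([value[0] ^ 0xFF]) + value[1:]          # flip bits in memory
try:
    fetch_and_decode(0x1000, tampered, tag)
except RuntimeError:
    pass                                                 # fault detected
```

Binding the integrity value to the address as well as the value field (as above) is what prevents an attacker from relocating a valid ciphertext word to a different address.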
NASA Astrophysics Data System (ADS)
Drouin, Ariane; Michaud, Aubert; Sylvain, Jean-Daniel; N'Dayegamiye, Adrien; Gasser, Marc-Olivier; Nolin, Michel; Perron, Isabelle; Grenon, Lucie; Beaudin, Isabelle; Desjardins, Jacques; Côté, Noémi
2013-04-01
This project aims at developing and validating an operational, integrated and localized management approach at field scale using remote sensing data. It is carried out to support the competitiveness of agricultural businesses, to ensure soil productivity in the long term, and to prevent diffuse contamination of surface waters. Our intention is to help agri-environmental advisors and farmers take the spatial variability of soil properties into account in the management of fields. The proposed approach to soil property recognition is based on the combination of elevation data and multispectral satellite imagery (Landsat) within statistical models. The method relies on the largest possible number of satellite images in order to cover the widest range of soil moisture variability. Several spectral indices are calculated for each image (normalized brightness index, soil color index, organic matter index, etc.). Soils are assigned through a calibration procedure that uses the spatial soil database available in Canada, which includes soil profile point data associated with a database containing the information collected in the field. Three soil properties are predicted and mapped: A horizon texture, B horizon texture and drainage class. All the spectral indices, elevation data and soil data are combined in a discriminant analysis that produces discriminant functions. These are then used to produce maps of soil properties. In addition, from the mapped soil properties, management zones are delineated within the field. The delineation of management zones with relatively similar soil properties enables farmers to manage their fertilizers while taking greater account of their soils. This localized, or precision, management aims to adjust the application of fertilizer according to the real needs of soils and to reduce costs for farmers and the export of nutrients to streams.
Mapping of soil properties will be validated in three agricultural regions in Quebec through an experimental field protocol (spatial sampling by management zones). Soils will be sampled, and crop yields under different nitrogen rates will also be assessed. Specifically, in each of the management zones defined, five different doses of nitrogen will be applied (0, 50, 100, 150, 200 kg N/ha) on corn fields. In fall, the corn will be harvested to assess differences in yields between the management zones and also across doses of nitrogen. Ultimately, on the basis of well-established management zones showing contrasting soil properties, the farmer will be able to ensure optimal correction of soil acidity, nitrogen fertilization, and richness of soil in P and K, and to improve soil drainage and physical properties. Environmentally, the principles of integrated and localized management carry significant benefits, particularly in terms of reducing diffuse nutrient pollution.
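The discriminant-analysis step (spectral indices plus elevation in, soil class out) can be sketched with scikit-learn. Everything below is synthetic: the feature values, the drainage rule standing in for field truth, and the sample sizes are fabricated for illustration only.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(1)

# Hypothetical calibration set: per-pixel predictors are spectral indices
# plus elevation; the response is the surveyed drainage class at
# soil-profile points.
n = 300
X = np.column_stack([
    rng.normal(0.3, 0.1, n),    # normalized brightness index
    rng.normal(0.1, 0.05, n),   # soil color index
    rng.normal(0.2, 0.08, n),   # organic matter index
    rng.normal(50, 10, n),      # elevation (m)
])
# Toy rule standing in for field truth: poorly drained where elevation is
# low and organic matter is high.
y = np.where(X[:, 3] - 200 * X[:, 2] < 5, "poor", "good")

# Fit the discriminant functions, then apply them pixel-by-pixel to map
# the drainage class across the field.
lda = LinearDiscriminantAnalysis().fit(X, y)
drainage_map = lda.predict(X)
accuracy = (drainage_map == y).mean()
```

In practice the fitted functions would be applied to every raster cell, and the resulting class map would then be segmented into management zones.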
Methodology and issues of integral experiments selection for nuclear data validation
NASA Astrophysics Data System (ADS)
Tatiana, Ivanova; Ivanov, Evgeny; Hill, Ian
2017-09-01
Nuclear data validation involves a large suite of Integral Experiments (IEs) for criticality, reactor physics and dosimetry applications [1]. Often benchmarks are taken from international handbooks [2, 3]. Depending on the application, IEs have different degrees of usefulness in validation, and the use of a single benchmark is usually not advised; indeed, it may lead to erroneous interpretations and results [1]. This work aims at quantifying the importance of the benchmarks used in application-dependent cross section validation. The approach is based on the well-known Generalized Linear Least-Squares Method (GLLSM), extended to establish biases and uncertainties for given cross sections (within a given energy interval). The statistical treatment results in a vector of weighting factors for the integral benchmarks. These factors characterize the value added by a benchmark to nuclear data validation for the given application. The methodology is illustrated by one example: selecting benchmarks for 239Pu cross section validation. The studies were performed in the framework of Subgroup 39 (Methods and approaches to provide feedback from nuclear and covariance data adjustment for improvement of nuclear data files) established at the Working Party on International Nuclear Data Evaluation Cooperation (WPEC) of the Nuclear Science Committee under the Nuclear Energy Agency (NEA/OECD).
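As a loose illustration of how an information-weighting scheme of this kind ranks benchmarks, consider a single cross section group and three hypothetical benchmarks. This is a deliberately simplified stand-in, not the paper's actual GLLSM formulation: the "weight" here is just each benchmark's share of the inverse-variance information it contributes.

```python
import numpy as np

# One cross section group, three hypothetical benchmarks. s[i] is the
# sensitivity of benchmark i's k_eff to the cross section; v[i] is the
# benchmark's experimental variance. All numbers are made up.
prior_var = 0.04                   # prior relative variance of the cross section
s = np.array([0.9, 0.5, 0.2])      # sensitivities
v = np.array([0.01, 0.02, 0.05])   # experimental variances

info = s**2 / v                            # information each benchmark adds
post_var = 1.0 / (1.0 / prior_var + info.sum())   # adjusted (posterior) variance
weights = info / info.sum()                # relative value of each benchmark
```

In this toy case the high-sensitivity, low-uncertainty benchmark dominates the weighting, which is the qualitative behavior the paper's weighting factors are designed to capture rigorously.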
FDA 2011 process validation guidance: lifecycle compliance model.
Campbell, Cliff
2014-01-01
This article has been written as a contribution to the industry's efforts in migrating from a document-driven to a data-driven compliance mindset. A combination of target product profile, control engineering, and general sum principle techniques is presented as the basis of a simple but scalable lifecycle compliance model in support of modernized process validation. Unit operations and significant variables occupy pole position within the model, documentation requirements being treated as a derivative or consequence of the modeling process. The quality system is repositioned as a subordinate of system quality, the latter being defined as the integral of related "system qualities". The article represents a structured interpretation of the U.S. Food and Drug Administration's 2011 Guidance for Industry on Process Validation and is based on the author's educational background and his manufacturing/consulting experience in the validation field. The U.S. Food and Drug Administration's Guidance for Industry on Process Validation (2011) provides a wide-ranging and rigorous outline of compliant drug manufacturing requirements relative to its 20th-century predecessor (1987). Its declared focus is patient safety, and it identifies three inter-related (and obvious) stages of the compliance lifecycle. Firstly, processes must be designed, from both a technical and a quality perspective. Secondly, processes must be qualified, providing evidence that the manufacturing facility is fully "roadworthy" and fit for its intended purpose. Thirdly, processes must be verified, meaning that commercial batches must be monitored to ensure that processes remain in a state of control throughout their lifetime.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lundstrom, Blake; Chakraborty, Sudipta; Lauss, Georg
This paper presents a concise description of state-of-the-art real-time simulation-based testing methods and demonstrates how they can be used independently and/or in combination as an integrated development and validation approach for smart grid distributed energy resources (DERs) and systems. A three-part case study demonstrating the application of this integrated approach at the different stages of development and validation of a system-integrated smart photovoltaic (PV) inverter is also presented. Laboratory testing results and perspectives from two international research laboratories are included in the case study.
Excitation of a Parallel Plate Waveguide by an Array of Rectangular Waveguides
NASA Technical Reports Server (NTRS)
Rengarajan, Sembiam
2011-01-01
This work addresses the problem of excitation of a parallel plate waveguide by an array of rectangular waveguides, which arises in applications such as the continuous transverse stub (CTS) antenna and dual-polarized parabolic cylindrical reflector antennas excited by a scanning line source. In order to design the junction region between the parallel plate waveguide and the linear array of rectangular waveguides, waveguide sizes have to be chosen so that the input match is adequate for the range of scan angles for both polarizations. Electromagnetic scattering at the junction between the parallel plate waveguide and the array of rectangular waveguides is analyzed by formulating coupled integral equations for the aperture electric field at the junction. The integral equations are solved by the method of moments (MoM). In order to make the computational process efficient and accurate, the method of weighted averaging was used to evaluate the rapidly oscillating integrals encountered in the moment matrix. In addition, the real-axis spectral integral is evaluated on a deformed contour for speed and accuracy. The MoM results for a large finite array have been validated by comparing its reflection coefficients with corresponding results for an infinite array generated by the commercial finite element code HFSS. Once the aperture electric field is determined by MoM, the input reflection coefficients at each waveguide port, and the coupling for each polarization over the range of useful scan angles, are easily obtained. Results for the input impedance and coupling characteristics for both the vertical and horizontal polarizations are presented over a range of scan angles. It is shown that the scan range is limited to about 35° for both polarizations, and therefore the optimum waveguide is a square of side equal to about 0.62 free-space wavelengths.
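The moment-matrix entries involve slowly converging oscillatory spectral integrals, which is why an acceleration scheme is needed. The paper's method of weighted averaging is more general than what follows; the sketch below shows only the simplest member of that family (repeated pairwise averaging of partial sums, i.e. the Euler transformation) on a standard test case, the integral of sin(x)/x from 0 to infinity, which equals pi/2.

```python
import math

def simpson(f, a, b, n=64):
    """Composite Simpson quadrature over [a, b] with an even number of panels."""
    h = (b - a) / n
    s = f(a) + f(b)
    s += 4 * sum(f(a + (2 * i - 1) * h) for i in range(1, n // 2 + 1))
    s += 2 * sum(f(a + 2 * i * h) for i in range(1, n // 2))
    return s * h / 3

f = lambda x: math.sin(x) / x if x else 1.0

# Half-period contributions alternate in sign and decay slowly, so the raw
# partial sums converge slowly toward pi/2.
chunks = [simpson(f, n * math.pi, (n + 1) * math.pi) for n in range(12)]
partial = [sum(chunks[: k + 1]) for k in range(len(chunks))]

# Repeated pairwise averaging of the alternating partial sums converges far
# faster than the last raw partial sum.
accel = partial[:]
while len(accel) > 1:
    accel = [(a + b) / 2 for a, b in zip(accel, accel[1:])]

error_plain = abs(partial[-1] - math.pi / 2)
error_accel = abs(accel[0] - math.pi / 2)
```

Twelve half-period terms taken raw leave an error of a few percent, while the averaged sequence is accurate to well below 1e-3; production weighted-average schemes choose non-uniform weights matched to the integrand's asymptotic decay.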
NASA Astrophysics Data System (ADS)
Riera, Enrique; Blanco, Alfonso; García, José; Benedito, José; Mulet, Antonio; Gallego-Juárez, Juan A.; Blasco, Miguel
2010-01-01
Oil is an important component of almonds and other vegetable substrates that can influence human health. This work presents the development and validation of an innovative, robust, stable, reliable and efficient pilot-scale ultrasonic system to assist supercritical CO2 extraction of oils from different substrates. In the extraction procedure, ultrasonic energy is an efficient way of producing deep agitation that enhances mass transfer, acting through several mechanisms (radiation pressure, acoustic streaming, agitation, high-amplitude vibrations, etc.). Previous work demonstrated the feasibility of integrating an ultrasonic field inside a supercritical extractor without losing a significant volume fraction. This pioneering method accelerated mass transfer, thereby shortening supercritical extraction times. To develop the new procedure commercially and fulfil industrial requirements, a new device configuration has been designed, implemented, tested and successfully validated for supercritical fluid extraction of oil from different vegetable substrates.
Spring 2013 Graduate Engineering Internship Summary
NASA Technical Reports Server (NTRS)
Ehrlich, Joshua
2013-01-01
In the spring of 2013, I participated in the National Aeronautics and Space Administration (NASA) Pathways Intern Employment Program at the Kennedy Space Center (KSC) in Florida. This was my final internship opportunity with NASA, a third consecutive extension from a summer 2012 internship. Since the start of my tenure here at KSC, I have gained an invaluable depth of engineering knowledge and extensive hands-on experience. These opportunities have granted me the ability to enhance my systems engineering approach in the field of payload design and testing as well as develop a strong foundation in the area of composite fabrication and testing for repair design on space vehicle structures. As a systems engineer, I supported the systems engineering and integration team with final acceptance testing of the Vegetable Production System, commonly referred to as Veggie. Verification and validation (V and V) of Veggie was carried out prior to qualification testing of the payload, which incorporated the process of confirming the system's design requirements dependent on one or more validation methods: inspection, analysis, demonstration, and testing.
Exact finite volume expectation values of local operators in excited states
NASA Astrophysics Data System (ADS)
Pozsgay, B.; Szécsényi, I. M.; Takács, G.
2015-04-01
We present a conjecture for the exact expression of finite volume expectation values in excited states in integrable quantum field theories, which is an extension of an earlier conjecture to the case of general diagonal factorized scattering with bound states and a nontrivial bootstrap structure. The conjectured expression is a spectral expansion which uses the exact form factors and the excited state thermodynamic Bethe Ansatz as building blocks. The conjecture is proven for the case of the trace of the energy-moment tensor. Concerning its validity for more general operators, we provide numerical evidence using the truncated conformal space approach. It is found that the expansion fails to be well-defined for small values of the volume in cases when the singularity structure of the TBA equations undergoes a non-trivial rearrangement under some critical value of the volume. Despite these shortcomings, the conjectured expression is expected to be valid for all volumes for most of the excited states, and as an expansion above the critical volume for the rest.
Rocket-Based Combined Cycle Engine Technology Development: Inlet CFD Validation and Application
NASA Technical Reports Server (NTRS)
DeBonis, J. R.; Yungster, S.
1996-01-01
A CFD methodology has been developed for inlet analyses of Rocket-Based Combined Cycle (RBCC) Engines. A full Navier-Stokes analysis code, NPARC, was used in conjunction with pre- and post-processing tools to obtain a complete description of the flow field and integrated inlet performance. This methodology was developed and validated using results from a subscale test of the inlet to a RBCC 'Strut-Jet' engine performed in the NASA Lewis 1 x 1 ft. supersonic wind tunnel. Results obtained from this study include analyses at flight Mach numbers of 5 and 6 for super-critical operating conditions. These results showed excellent agreement with experimental data. The analysis tools were also used to obtain pre-test performance and operability predictions for the RBCC demonstrator engine planned for testing in the NASA Lewis Hypersonic Test Facility. This analysis calculated the baseline fuel-off internal force of the engine which is needed to determine the net thrust with fuel on.
Generation and validation of a universal perinatal database and biospecimen repository: PeriBank.
Antony, K M; Hemarajata, P; Chen, J; Morris, J; Cook, C; Masalas, D; Gedminas, M; Brown, A; Versalovic, J; Aagaard, K
2016-11-01
There is a dearth of biospecimen repositories available to perinatal researchers. In order to address this need, here we describe the methodology used to establish such a resource. With the collaboration of MedSci.net, we generated an online perinatal database with 847 fields of clinical information. Simultaneously, we established a biospecimen repository of the same clinical participants. The demographic and clinical outcomes data are described for the first 10 000 participants enrolled. The demographic characteristics are consistent with the demographics of the delivery hospitals. Quality analysis of the biospecimens reveals variation in very few analytes. Furthermore, since the creation of PeriBank, we have demonstrated validity of the database and tissue integrity of the biospecimen repository. Here we establish that the creation of a universal perinatal database and biospecimen collection is not only possible, but allows for the performance of state-of-the-science translational perinatal research and is a potentially valuable resource to academic perinatal researchers.
Validation of the procedures. [integrated multidisciplinary optimization of rotorcraft
NASA Technical Reports Server (NTRS)
Mantay, Wayne R.
1989-01-01
Validation strategies are described for procedures aimed at improving the rotor blade design process through a multidisciplinary optimization approach. Validation of the basic rotor environment prediction tools and the overall rotor design are discussed.
Integrated cellular network of transcription regulations and protein-protein interactions
2010-01-01
Background With the accumulation of increasing omics data, a key goal of systems biology is to construct networks at different cellular levels to investigate the cellular machinery of the cell. However, there is currently no satisfactory method to construct an integrated cellular network that combines the gene regulatory network and the signaling regulatory pathway. Results In this study, we integrated different kinds of omics data and developed a systematic method to construct the integrated cellular network based on coupling dynamic models and statistical assessments. The proposed method was applied to S. cerevisiae stress responses, elucidating the stress response mechanism of the yeast. From the resulting integrated cellular network under hyperosmotic stress, highly connected hubs functionally relevant to the stress response were identified. Beyond hyperosmotic stress, the integrated networks under heat shock and oxidative stress were also constructed, and the crosstalk of these networks was analyzed, highlighting the significance of some transcription factors as decision-making devices at the center of the bow-tie structure and the crucial role of a rapid adaptation scheme in responding to stress. In addition, the predictive power of the proposed method was also demonstrated. Conclusions We successfully constructed the integrated cellular network, which is validated by literature evidence. The integration of transcription regulations and protein-protein interactions gives more insight into the actual biological network and is more predictive than networks without integration. The method is shown to be powerful and flexible and can be used under different conditions and for different species. The coupling dynamic models of the whole integrated cellular network are very useful for theoretical analyses and for further experiments in the fields of network biology and synthetic biology. PMID:20211003
Integrated cellular network of transcription regulations and protein-protein interactions.
Wang, Yu-Chao; Chen, Bor-Sen
2010-03-08
With the accumulation of increasing omics data, a key goal of systems biology is to construct networks at different cellular levels to investigate the cellular machinery of the cell. However, there is currently no satisfactory method to construct an integrated cellular network that combines the gene regulatory network and the signaling regulatory pathway. In this study, we integrated different kinds of omics data and developed a systematic method to construct the integrated cellular network based on coupling dynamic models and statistical assessments. The proposed method was applied to S. cerevisiae stress responses, elucidating the stress response mechanism of the yeast. From the resulting integrated cellular network under hyperosmotic stress, highly connected hubs functionally relevant to the stress response were identified. Beyond hyperosmotic stress, the integrated networks under heat shock and oxidative stress were also constructed, and the crosstalk of these networks was analyzed, highlighting the significance of some transcription factors as decision-making devices at the center of the bow-tie structure and the crucial role of a rapid adaptation scheme in responding to stress. In addition, the predictive power of the proposed method was also demonstrated. We successfully constructed the integrated cellular network, which is validated by literature evidence. The integration of transcription regulations and protein-protein interactions gives more insight into the actual biological network and is more predictive than networks without integration. The method is shown to be powerful and flexible and can be used under different conditions and for different species. The coupling dynamic models of the whole integrated cellular network are very useful for theoretical analyses and for further experiments in the fields of network biology and synthetic biology.
The WATERMED field experiment: validation of the AATSR LST product with in situ measurements
NASA Astrophysics Data System (ADS)
Noyes, E.; Soria, G.; Sobrino, J.; Remedios, J.; Llewellyn-Jones, D.; Corlett, G.
The Advanced Along-Track Scanning Radiometer (AATSR), onboard ESA's Envisat satellite, is the third in a series of precision radiometers designed to measure Sea Surface Temperature (SST) with accuracies of better than ±0.3 K (1-sigma). Since its launch in March 2002, a prototype AATSR Land Surface Temperature (LST) product has been produced for validation purposes only, with the product becoming operational from mid-2004. The (A)ATSR instrument design is unique in that it has both a nadir and a forward view, allowing the Earth's surface to be viewed along two different atmospheric path lengths, thus enabling an improved atmospheric correction to be made when retrieving surface temperature. It also uses an innovative and exceptionally stable on-board calibration system for its infrared channels, which, together with actively cooled detectors, gives extremely high radiometric sensitivity and precision. In this presentation, results from a comparison of the prototype LST product with ground-based measurements obtained at the WATERMED (WATer use Efficiency in natural vegetation and agricultural areas by Remote sensing in the MEDiterranean basin) field site near Marrakech, Morocco, are presented. The comparison shows that the AATSR has a positive bias of +1.5 K, with a standard deviation of 0.7 K, indicating that the product is operating within the target specification (±2.5 K) over the WATERMED field site. However, several anomalous validation points were observed during the analysis, and we will discuss possible reasons for the occurrence of these data, including their coincidence with the presence of an Envisat blanking pulse (indicating the presence of a radar pulse at the time of AATSR pixel integration). Further investigation into this matter is required, as previous investigations have always indicated that the presence of a payload radar pulse does not have any effect on (A)ATSR data quality.
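The bias and standard deviation quoted above are the usual matchup statistics over satellite-minus-ground temperature differences. A minimal sketch with hypothetical matchups (chosen only to be consistent with the reported +1.5 K bias, not the WATERMED data):

```python
import statistics

# Hypothetical AATSR-minus-in-situ LST differences in kelvin, one per matchup.
diffs = [1.2, 2.1, 0.9, 1.8, 1.4, 2.3, 0.8, 1.5]

bias = statistics.mean(diffs)                      # mean difference (warm bias)
spread = statistics.stdev(diffs)                   # sample standard deviation
within_spec = all(abs(d) <= 2.5 for d in diffs)    # target specification ±2.5 K
```

In a real validation, anomalous matchups (such as those coincident with a blanking pulse) would be flagged before these statistics are computed, since a few outliers can dominate both the bias and the spread.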
NASA Astrophysics Data System (ADS)
McCorkel, J.; Kuester, M. A.; Johnson, B. R.; Krause, K.; Kampe, T. U.; Moore, D. J.
2011-12-01
The National Ecological Observatory Network (NEON) is a research facility under development by the National Science Foundation to improve our understanding of and ability to forecast the impacts of climate change, land-use change, and invasive species on ecology. The infrastructure, designed to operate over 30 years or more, includes site-based flux tower and field measurements, coordinated with airborne remote sensing observations to observe key ecological processes over a broad range of temporal and spatial scales. NEON airborne data on vegetation biochemical, biophysical, and structural properties and on land use and land cover will be captured at 1 to 2 meter resolution by an imaging spectrometer, a small-footprint waveform-LiDAR and a high-resolution digital camera. Annual coverage of the 60 NEON sites and capacity to support directed research flights or respond to unexpected events will require three airborne observation platforms (AOP). The integration of field and airborne data with satellite observations and other national geospatial data for analysis, monitoring and input to ecosystem models will extend NEON observations to regions across the United States not directly sampled by the observatory. The different spatial scales and measurement methods make quantitative comparisons between remote sensing and field data, typically collected over small sample plots (e.g. < 0.2 ha), difficult. New approaches to developing temporal and spatial scaling relationships between these data are necessary to enable validation of airborne and satellite remote sensing data and for incorporation of these data into continental or global scale ecological models. In addition to consideration of the methods used to collect ground-based measurements, careful calibration of the remote sensing instrumentation and an assessment of the accuracy of algorithms used to derive higher-level science data products are needed. 
Furthermore, long-term consistency of the data collected by all three airborne instrument packages over the NEON sites requires traceability of the calibration to national standards, field-based verification of instrument calibration and stability in the aircraft environment, and an independent assessment of the quality of derived data products. This work describes the development of the calibration laboratory, early evaluation of field-based vicarious calibration, development of scaling relationships, and test flights. Complementary laboratory- and field-based calibration of the AOP in addition to consistency with on-board calibration methods provide confidence that low-level data such as radiance and surface reflectance measurements are accurate and comparable among different sensors. Algorithms that calculate higher-level data products including essential climate variables will be validated against equivalent ground- and satellite-based results. Such a validated data set across multiple spatial and temporal scales is key to enabling ecosystem models to forecast the effects of climate change, land-use change and invasive species on the continental scale.
González-Cutre, David; Sicilia, Álvaro; Fernández, Alberto
2010-11-01
The purpose of this study was to validate the Behavioural Regulation in Exercise Questionnaire in the Spanish context, including items to measure integrated regulation. Participants were 524 exercisers, mean age 29.59 years. The results revealed acceptable fit indices in the confirmatory factor analysis and good internal consistency (with a Cronbach alpha of .87 for integrated regulation). The diverse subscales also conformed to a simplex pattern and the factor structure was invariant across gender and age. Integrated regulation reflected high temporal stability over a 4-week period (ICC=.90). The criterion validity analysis of integrated regulation indicated that this variable was positively predicted by satisfaction of the needs for competence and autonomy. The results regarding the importance of measuring integrated regulation in exercise are discussed.
Numerical Simulation of a High-Lift Configuration Embedded with High Momentum Fluidic Actuators
NASA Technical Reports Server (NTRS)
Vatsa, Veer N.; Duda, Benjamin; Fares, Ehab; Lin, John C.
2016-01-01
Numerical simulations have been performed for a vertical tail configuration with deflected rudder. The suction surface of the main element of this configuration, just upstream of the hinge line, is embedded with an array of 32 fluidic actuators that produce oscillating sweeping jets. Such oscillating jets have been found to be very effective for flow control applications in the past. In the current paper, a high-fidelity computational fluid dynamics (CFD) code known as the PowerFLOW® code is used to simulate the entire flow field associated with this configuration, including the flow inside the actuators. A fully compressible version of the PowerFLOW® code valid for high-speed flows is used for the present simulations to accurately represent the transonic flow regimes encountered in the flow field due to the actuators operating at the higher mass flow (momentum) rates required to mitigate reverse flow regions on a highly-deflected rudder surface. The computed results for the surface pressure and integrated forces compare favorably with measured data. In addition, numerical solutions predict the correct trends in forces with active flow control compared to the no-control case. The effect of varying the rudder deflection angle on integrated forces and surface pressures is also presented.
NASA Technical Reports Server (NTRS)
Peddle, Derek R.; Huemmrich, K. Fred; Hall, Forrest G.; Masek, Jeffrey G.; Soenen, Scott A.; Jackson, Chris D.
2011-01-01
Canopy reflectance model inversion using look-up table approaches provides powerful and flexible options for deriving improved forest biophysical structural information (BSI) compared with traditional statistical empirical methods. The BIOPHYS algorithm is an improved, physically-based inversion approach for deriving BSI for independent use and validation and for monitoring, inventory and quantifying forest disturbance as well as input to ecosystem, climate and carbon models. Based on the multiple-forward mode (MFM) inversion approach, BIOPHYS results were summarized from different studies (Minnesota/NASA COVER; Virginia/LEDAPS; Saskatchewan/BOREAS), sensors (airborne MMR; Landsat; MODIS) and models (GeoSail; GOMS). Applications output included forest density, height, crown dimension, branch and green leaf area, canopy cover, disturbance estimates based on multi-temporal chronosequences, and structural change following recovery from forest fires over the last century. Good correspondences with validation field data were obtained. Integrated analyses of multiple solar and view angle imagery further improved retrievals compared with single pass data. Quantifying ecosystem dynamics such as the area and percent of forest disturbance, early regrowth and succession provide essential inputs to process-driven models of carbon flux. BIOPHYS is well suited for large-area, multi-temporal applications involving multiple image sets and mosaics for assessing vegetation disturbance and quantifying biophysical structural dynamics and change. It is also suitable for integration with forest inventory, monitoring, updating, and other programs.
Integrated System Health Management: Pilot Operational Implementation in a Rocket Engine Test Stand
NASA Technical Reports Server (NTRS)
Figueroa, Fernando; Schmalzel, John L.; Morris, Jonathan A.; Turowski, Mark P.; Franzl, Richard
2010-01-01
This paper describes a credible implementation of integrated system health management (ISHM) capability, as a pilot operational system. Important core elements that make possible fielding and evolution of ISHM capability have been validated in a rocket engine test stand, encompassing all phases of operation: stand-by, pre-test, test, and post-test. The core elements include an architecture (hardware/software) for ISHM, gateways for streaming real-time data from the data acquisition system into the ISHM system, automated configuration management employing transducer electronic data sheets (TEDSs) adhering to the IEEE 1451.4 Standard for Smart Sensors and Actuators, broadcasting and capture of sensor measurements and health information adhering to the IEEE 1451.1 Standard for Smart Sensors and Actuators, user interfaces for management of redlines/bluelines, and establishment of a health assessment database system (HADS) and browser for extensive post-test analysis. The ISHM system was installed in the Test Control Room, where test operators were exposed to the capability. All functionalities of the pilot implementation were validated during testing and in post-test data streaming through the ISHM system. The implementation enabled significant improvements in awareness about the status of the test stand, and about events and their causes/consequences. The architecture and software elements embody a systems engineering, knowledge-based approach in conjunction with object-oriented environments. These qualities permit systematic augmentation of the capability and scaling to encompass other subsystems.
Copenhagen Psychosocial Questionnaire - A validation study using the Job Demand-Resources model
Hakanen, Jari J.; Westerlund, Hugo
2018-01-01
Aim This study aims at investigating the nomological validity of the Copenhagen Psychosocial Questionnaire (COPSOQ II) by using an extension of the Job Demands-Resources (JD-R) model with aspects of work ability (WA) as the outcome. Material and methods The study design is cross-sectional. All staff working at public dental organizations in four regions of Sweden were invited to complete an electronic questionnaire (75% response rate, n = 1345). The questionnaire was based on COPSOQ II scales, the Utrecht Work Engagement Scale, and the one-item Work Ability Score in combination with a proprietary item. The data were analysed by structural equation modelling. Results This study contributed to the literature by showing that: A) the scale characteristics were satisfactory and the construct validity of the COPSOQ instrument could be integrated in the JD-R framework; B) job resources arising from leadership may be a driver of the two processes included in the JD-R model; and C) both the health impairment and motivational processes were associated with WA, and the results suggested that leadership may impact WA, in particular by securing task resources. Conclusion In conclusion, the nomological validity of COPSOQ was supported, as the JD-R model can be operationalized by the instrument. This may be helpful for transferring complex survey results and work-life theories to practitioners in the field. PMID:29708998
High-efficiency non-uniformity correction for wide dynamic linear infrared radiometry system
NASA Astrophysics Data System (ADS)
Li, Zhou; Yu, Yi; Tian, Qi-Jie; Chang, Song-Tao; He, Feng-Yun; Yin, Yan-He; Qiao, Yan-Feng
2017-09-01
A wide-dynamic-range infrared radiometry system with linear, continuously variable integration time is typically operated at several different integration times; traditional calibration-based non-uniformity correction (NUC) must therefore be performed separately for each one and requires several calibration sources, which makes calibration and the NUC process time-consuming. In this paper, the differences between the NUC coefficients at different integration times are discussed, and a novel method called high-efficiency NUC, which builds on traditional calibration-based NUC, is proposed. It obtains the correction coefficients for all integration times over the whole linear dynamic range by recording only three images of a standard blackbody. The mathematical basis of the proposed method is first validated, and its performance is then demonstrated on a 400 mm diameter ground-based infrared radiometry system. Experimental results show that the proposed method reduces the mean Normalized Root Mean Square (NRMS) value from 3.78% to 0.24%. In addition, results at 4 ms and 70 °C show that the method is more accurate than traditional calibration-based NUC, while a good correction effect is retained at other integration times and temperatures. Moreover, it greatly reduces the number of integration-time and temperature sampling points required, offers good real-time performance, and is suitable for field measurement.
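As a generic illustration of the calibration-based correction the method builds on (a sketch of classical two-point NUC, not the authors' algorithm; all array names and signal levels here are hypothetical), a per-pixel gain and offset can be derived from two uniform blackbody frames:

```python
import numpy as np

def two_point_nuc(low_img, high_img, low_level, high_level):
    """Per-pixel gain and offset from two uniform blackbody frames."""
    gain = (high_level - low_level) / (high_img - low_img)
    offset = low_level - gain * low_img
    return gain, offset

def correct(raw, gain, offset):
    """Apply the correction to a raw frame."""
    return gain * raw + offset

# Synthetic demo: each pixel has its own (hypothetical) gain and offset,
# i.e. fixed-pattern non-uniformity on an otherwise linear detector.
rng = np.random.default_rng(0)
g_true = 1.0 + 0.05 * rng.standard_normal((8, 8))
o_true = 10.0 * rng.standard_normal((8, 8))
scene = lambda level: g_true * level + o_true   # simulated detector response

low, high = scene(100.0), scene(1000.0)
gain, offset = two_point_nuc(low, high, 100.0, 1000.0)
print(np.allclose(correct(scene(500.0), gain, offset), 500.0))  # True
```

Because the response is linear, the two-point coefficients remove the fixed pattern at every intermediate level; the high-efficiency method in the abstract extends this idea so that one set of blackbody images serves all integration times.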
Shioda, Ai; Tadaka, Etsuko; Okochi, Ayako
2017-01-01
Community integration is an essential right for people with schizophrenia that affects their well-being and quality of life, but no valid instrument exists to measure it in Japan. The aim of the present study is to develop and evaluate the reliability and validity of the Japanese version of the Community Integration Measure (CIM) for people with schizophrenia. The Japanese version of the CIM was developed as a self-administered questionnaire based on the original version of the CIM, which was developed by McColl et al. This study of the Japanese CIM had a cross-sectional design. Construct validity was determined using a confirmatory factor analysis (CFA) and data from 291 community-dwelling people with schizophrenia in Japan. Internal consistency was calculated using Cronbach's alpha. The Lubben Social Network Scale (LSNS-6), the Rosenberg Self-Esteem Scale (RSE) and the UCLA Loneliness Scale, version 3 (UCLALS) were administered to assess the criterion-related validity of the Japanese version of the CIM. The participants were 263 people with schizophrenia who provided valid responses. The Cronbach's alpha was 0.87, and CFA identified one domain with ten items that demonstrated the following values: goodness of fit index = 0.924, adjusted goodness of fit index = 0.881, comparative fit index = 0.925, and root mean square error of approximation = 0.085. The correlation coefficients were 0.43 (p < 0.001) with the LSNS-6, 0.42 (p < 0.001) with the RSE, and -0.57 (p < 0.001) with the UCLALS. The Japanese version of the CIM demonstrated adequate reliability and validity for assessing community integration for people with schizophrenia in Japan.
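For reference, the internal-consistency statistic reported here, Cronbach's alpha, can be computed directly from item-level scores; the data below are a toy example, not the study's:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha: items is a 2D array, rows = respondents,
    columns = scale items. alpha = k/(k-1) * (1 - sum(item var)/total var)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_vars / total_var)

# Hypothetical responses: 5 respondents, 3-item scale.
scores = [[4, 5, 4], [2, 2, 3], [5, 5, 5], [3, 4, 3], [1, 2, 2]]
print(round(cronbach_alpha(scores), 2))  # → 0.96
```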
Validation and Comprehension: An Integrated Overview
ERIC Educational Resources Information Center
Kendeou, Panayiota
2014-01-01
In this article, I review and discuss the work presented in this special issue while focusing on a number of issues that warrant further investigation in validation research. These issues pertain to the nature of the validation processes, the processes and mechanisms that support validation during comprehension, the factors that influence…
Validity: Applying Current Concepts and Standards to Gynecologic Surgery Performance Assessments
ERIC Educational Resources Information Center
LeClaire, Edgar L.; Nihira, Mikio A.; Hardré, Patricia L.
2015-01-01
Validity is critical for meaningful assessment of surgical competency. According to the Standards for Educational and Psychological Testing, validation involves the integration of data from well-defined classifications of evidence. In the authoritative framework, data from all classifications support construct validity claims. The two aims of this…
Verification and Validation Studies for the LAVA CFD Solver
NASA Technical Reports Server (NTRS)
Moini-Yekta, Shayan; Barad, Michael F; Sozer, Emre; Brehm, Christoph; Housman, Jeffrey A.; Kiris, Cetin C.
2013-01-01
The verification and validation of the Launch Ascent and Vehicle Aerodynamics (LAVA) computational fluid dynamics (CFD) solver is presented. A modern strategy for verification and validation is described incorporating verification tests, validation benchmarks, continuous integration and version control methods for automated testing in a collaborative development environment. The purpose of the approach is to integrate the verification and validation process into the development of the solver and improve productivity. This paper uses the Method of Manufactured Solutions (MMS) for the verification of 2D Euler equations, 3D Navier-Stokes equations as well as turbulence models. A method for systematic refinement of unstructured grids is also presented. Verification using inviscid vortex propagation and flow over a flat plate is highlighted. Simulation results using laminar and turbulent flow past a NACA 0012 airfoil and ONERA M6 wing are validated against experimental and numerical data.
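The Method of Manufactured Solutions mentioned above can be illustrated on a one-dimensional Poisson problem (our minimal sketch, unrelated to the LAVA code itself): pick an exact solution, derive the source term analytically, and confirm that the discrete solver converges at its design order.

```python
import numpy as np

def solve_poisson(n):
    """Second-order finite-difference solve of -u'' = f on (0, 1) with
    u(0) = u(1) = 0, where f is manufactured from the chosen exact
    solution u = sin(pi x). Returns the max-norm error."""
    h = 1.0 / n
    x = np.linspace(0.0, 1.0, n + 1)
    f = np.pi**2 * np.sin(np.pi * x[1:-1])       # manufactured source term
    # Tridiagonal stencil (-1, 2, -1) / h^2 on the interior points.
    A = (np.diag(2.0 * np.ones(n - 1)) - np.diag(np.ones(n - 2), 1)
         - np.diag(np.ones(n - 2), -1)) / h**2
    u = np.linalg.solve(A, f)
    return np.abs(u - np.sin(np.pi * x[1:-1])).max()

# The error should shrink ~4x per grid doubling: second-order convergence.
e1, e2 = solve_poisson(32), solve_poisson(64)
print(round(e1 / e2, 1))  # → 4.0
```

Observing the expected order on refined grids is the verification evidence MMS provides; a ratio far from 4 would flag a coding or discretization error.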
Full-wave Nonlinear Inverse Scattering for Acoustic and Electromagnetic Breast Imaging
NASA Astrophysics Data System (ADS)
Haynes, Mark Spencer
Acoustic and electromagnetic full-wave nonlinear inverse scattering techniques are explored in both theory and experiment with the ultimate aim of noninvasively mapping the material properties of the breast. There is evidence that benign and malignant breast tissue have different acoustic and electrical properties and imaging these properties directly could provide higher quality images with better diagnostic certainty. In this dissertation, acoustic and electromagnetic inverse scattering algorithms are first developed and validated in simulation. The forward solvers and optimization cost functions are modified from traditional forms in order to handle the large or lossy imaging scenes present in ultrasonic and microwave breast imaging. An antenna model is then presented, modified, and experimentally validated for microwave S-parameter measurements. Using the antenna model, a new electromagnetic volume integral equation is derived in order to link the material properties of the inverse scattering algorithms to microwave S-parameters measurements allowing direct comparison of model predictions and measurements in the imaging algorithms. This volume integral equation is validated with several experiments and used as the basis of a free-space inverse scattering experiment, where images of the dielectric properties of plastic objects are formed without the use of calibration targets. These efforts are used as the foundation of a solution and formulation for the numerical characterization of a microwave near-field cavity-based breast imaging system. The system is constructed and imaging results of simple targets are given. Finally, the same techniques are used to explore a new self-characterization method for commercial ultrasound probes. The method is used to calibrate an ultrasound inverse scattering experiment and imaging results of simple targets are presented. 
This work has demonstrated the feasibility of quantitative microwave inverse scattering by way of a self-consistent characterization formalism, and has made headway in the same area for ultrasound.
MASH Suite Pro: A Comprehensive Software Tool for Top-Down Proteomics*
Cai, Wenxuan; Guner, Huseyin; Gregorich, Zachery R.; Chen, Albert J.; Ayaz-Guner, Serife; Peng, Ying; Valeja, Santosh G.; Liu, Xiaowen; Ge, Ying
2016-01-01
Top-down mass spectrometry (MS)-based proteomics is arguably a disruptive technology for the comprehensive analysis of all proteoforms arising from genetic variation, alternative splicing, and posttranslational modifications (PTMs). However, the complexity of top-down high-resolution mass spectra presents a significant challenge for data analysis. In contrast to the well-developed software packages available for data analysis in bottom-up proteomics, the data analysis tools in top-down proteomics remain underdeveloped. Moreover, despite recent efforts to develop algorithms and tools for the deconvolution of top-down high-resolution mass spectra and the identification of proteins from complex mixtures, a multifunctional software platform, which allows for the identification, quantitation, and characterization of proteoforms with visual validation, is still lacking. Herein, we have developed MASH Suite Pro, a comprehensive software tool for top-down proteomics with multifaceted functionality. MASH Suite Pro is capable of processing high-resolution MS and tandem MS (MS/MS) data using two deconvolution algorithms to optimize protein identification results. In addition, MASH Suite Pro allows for the characterization of PTMs and sequence variations, as well as the relative quantitation of multiple proteoforms in different experimental conditions. The program also provides visualization components for validation and correction of the computational outputs. Furthermore, MASH Suite Pro facilitates data reporting and presentation via direct output of the graphics. Thus, MASH Suite Pro significantly simplifies and speeds up the interpretation of high-resolution top-down proteomics data by integrating tools for protein identification, quantitation, characterization, and visual validation into a customizable and user-friendly interface. We envision that MASH Suite Pro will play an integral role in advancing the burgeoning field of top-down proteomics. PMID:26598644
NASA Technical Reports Server (NTRS)
Bauer, Frank H. (Technical Monitor); Dennehy, Neil; Gambino, Joel; Maynard, Andrew; Brady, T.; Buckley, S.; Zinchuk, J.
2003-01-01
The Inertial Stellar Compass (ISC) is a miniature, low power, stellar inertial attitude determination system with an accuracy of better than 0.1 degree (1 sigma) in three axes. The ISC consumes only 3.5 Watts of power and is contained in a 2.5 kg package. With its embedded on-board processor, the ISC provides attitude quaternion information and has Lost-in-Space (LIS) initialization capability. The attitude accuracy and LIS capability are provided by combining a wide field of view Active Pixel Sensor (APS) star camera and Micro- ElectroMechanical System (MEMS) inertial sensor information in an integrated sensor system. The performance and small form factor make the ISC a useful sensor for a wide range of missions. In particular, the ISC represents an enabling, fully integrated, micro-satellite attitude determination system. Other applications include using the ISC as a single sensor solution for attitude determination on medium performance spacecraft and as a bolt on independent safe-hold sensor or coarse acquisition sensor for many other spacecraft. NASA's New Millennium Program (NMP) has selected the ISC technology for a Space Technology 6 (ST6) flight validation experiment scheduled for 2004. NMP missions, such a s ST6, are intended to validate advanced technologies that have not flown in space in order to reduce the risk associated with their infusion into future NASA missions. This paper describes the design, operation, and performance of the ISC and outlines the technology validation plan. A number of mission applications for the ISC technology are highlighted, both for the baseline ST6 ISC configuration and more ambitious applications where ISC hardware and software modifications would be required. These applications demonstrate the wide range of Space and Earth Science missions that would benefit from infusion of the ISC technology.
Automated Identification and Shape Analysis of Chorus Elements in the Van Allen Radiation Belts
NASA Astrophysics Data System (ADS)
Sen Gupta, Ananya; Kletzing, Craig; Howk, Robin; Kurth, William; Matheny, Morgan
2017-12-01
An important goal of the Van Allen Probes mission is to understand wave-particle interaction by chorus emissions in the terrestrial Van Allen radiation belts. To test models, statistical characterization of chorus properties, such as amplitude variation and sweep rates, is an important scientific goal. The Electric and Magnetic Field Instrument Suite and Integrated Science (EMFISIS) instrumentation suite provides measurements of wave electric and magnetic fields as well as DC magnetic fields for the Van Allen Probes mission. However, manual inspection across terabytes of EMFISIS data is not feasible, and limited manual inspection risks human confirmation bias. We present signal processing techniques for automated identification, shape analysis, and sweep rate characterization of high-amplitude whistler-mode chorus elements in the Van Allen radiation belts. Specifically, we develop signal processing techniques based on the Radon transform that disambiguate chorus elements with a dominant sweep rate against hiss-like chorus. We present representative results validating our techniques and also provide statistical characterization of detected chorus elements across a case study of a 6 s epoch.
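As an illustration of the underlying idea (a toy sketch, not the EMFISIS processing chain; the synthetic spectrogram and parameters are our assumptions), a discrete Radon-style slant stack can recover the dominant sweep rate of a chirped element in a time-frequency image:

```python
import numpy as np

def dominant_sweep_rate(spec, slopes):
    """Slant-stack (discrete Radon) over a time-frequency image.

    spec: 2D array, rows = frequency bins, columns = time bins.
    slopes: candidate sweep rates in frequency-bins per time-bin.
    Returns the slope whose best line integral is largest.
    """
    n_f, n_t = spec.shape
    t = np.arange(n_t)
    best = []
    for s in slopes:
        # For every starting frequency, integrate along f = f0 + s * t.
        scores = []
        for f0 in range(n_f):
            f = np.round(f0 + s * t).astype(int)
            ok = (f >= 0) & (f < n_f)
            scores.append(spec[f[ok], t[ok]].sum() / max(ok.sum(), 1))
        best.append(max(scores))
    return slopes[int(np.argmax(best))]

# Synthetic "chorus element": a rising tone sweeping 2 frequency bins
# per time bin in an otherwise empty spectrogram.
spec = np.zeros((64, 16))
for ti in range(16):
    spec[5 + 2 * ti, ti] = 1.0
print(dominant_sweep_rate(spec, [0.0, 1.0, 2.0, 3.0]))  # 2.0
```

A rising chirp concentrates its energy along one line in the time-frequency plane, so its slant-stack score peaks sharply at the true slope, whereas hiss-like emission spreads energy over all slopes.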
[Evaluation of nursing processes in psychiatry--a field study project].
Bertram, Mathias; Ostermann, Thomas; Adam, Klaus; Alvater, Thomas; Gude, Dieter; Heinis, Renate; Löber, Reinhard; Roknic, Marko; Steger, Lucia
2009-10-01
Professional nursing care can be characterised as support in handling and coping with the activities of daily living. While somatic care mainly focusses on the compensation of physical complaints, the needs in the field of psychiatry leading to nursing interventions are quite different. The descriptive and explorative study presented here aims at finding typical needs for psychiatric nursing care and characteristic intervention patterns of nurses. Within the setting of a qualitative field study in an anthroposophically oriented hospital for mental disorders, heuristic methods (Grounded Theory) were applied to evaluate those nursing events which produced a unique and specific nursing benefit. The narrative data from 59 episodes generated this way yielded four typical patterns of interpretation of nursing needs: integration versus autonomy, and routine nursing versus crisis intervention. Our results are limited by the fact that only data from one hospital were analysed. However, given the quality control of internal and external validity and the alignment of the results with current findings, it can be concluded that the study results have heuristic potential for describing and refining other settings within and outside psychiatric care.
Developing a regional canopy fuels assessment strategy using multi-scale lidar
Peterson, Birgit E.; Nelson, Kurtis
2011-01-01
Accurate assessments of canopy fuels are needed by fire scientists to understand fire behavior and to predict future fire occurrence. A key descriptor for canopy fuels is canopy bulk density (CBD). CBD is closely linked to the structure of the canopy; therefore, lidar measurements are particularly well suited to assessments of CBD. LANDFIRE scientists are exploring methods to integrate airborne and spaceborne lidar datasets into a national mapping effort. In this study, airborne lidar, spaceborne lidar, and field data are used to map CBD in the Yukon Flats Ecoregion, with the airborne lidar serving as a bridge between the field data and the spaceborne observations. The field-based CBD was positively correlated with airborne lidar observations (R2=0.78). Mapped values of CBD using the airborne lidar dataset were significantly correlated with spaceborne lidar observations when analyzed by forest type (R2=0.62, evergreen and R2=0.71, mixed). Though continued research is necessary to validate these results, they do support the feasibility of airborne and, most importantly, spaceborne lidar data for canopy fuels assessment.
Vorticity and Vertical Motions Diagnosed from Satellite Deep-Layer Temperatures. Revised
NASA Technical Reports Server (NTRS)
Spencer, Roy W.; Lapenta, William M.; Robertson, Franklin R.
1994-01-01
Spatial fields of satellite-measured deep-layer temperatures are examined in the context of quasigeostrophic theory. It is found that midtropospheric geostrophic vorticity and quasigeostrophic vertical motions can be diagnosed from microwave temperature measurements of only two deep layers. The lower- (1000-400 hPa) and upper- (400-50 hPa) layer temperatures are estimated from limb-corrected TIROS-N Microwave Sounding Unit (MSU) channel 2 and 3 data, spatial fields of which can be used to estimate the midtropospheric thermal wind and geostrophic vorticity fields. Together with Trenberth's simplification of the quasigeostrophic omega equation, these two quantities can then be used to estimate the geostrophic vorticity advection by the thermal wind, which is related to the quasigeostrophic vertical velocity in the midtroposphere. Critical to the technique is the observation that geostrophic vorticity fields calculated from the channel 3 temperature features are very similar to those calculated from traditional, 'bottom-up' integrated height fields from radiosonde data. This suggests a lack of cyclone-scale height features near the top of the channel 3 weighting function, making the channel 3 cyclone-scale 'thickness' features approximately the same as height features near the bottom of the weighting function. Thus, the MSU data provide observational validation of the LID (level of insignificant dynamics) assumption of Hirshberg and Fritsch.
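The core diagnostic step can be sketched numerically (a minimal illustration under assumed values, not the authors' code): the geostrophic vorticity implied by a height or thickness field is proportional to its horizontal Laplacian, ζ = (g/f) ∇²Z.

```python
import numpy as np

g, f = 9.81, 1.0e-4          # gravity; midlatitude Coriolis parameter (s^-1)
dx = 100e3                   # grid spacing, m (assumed)

def laplacian(Z, d):
    """Five-point finite-difference Laplacian on the interior points."""
    return (Z[2:, 1:-1] + Z[:-2, 1:-1] + Z[1:-1, 2:] + Z[1:-1, :-2]
            - 4.0 * Z[1:-1, 1:-1]) / d**2

# Synthetic height field with a circular low (hypothetical amplitudes):
# the Laplacian is positive at the minimum, giving cyclonic vorticity.
x = np.arange(-5, 6) * dx
X, Y = np.meshgrid(x, x, indexing="ij")
Z = 5500.0 - 100.0 * np.exp(-(X**2 + Y**2) / (2 * (300e3) ** 2))
zeta = (g / f) * laplacian(Z, dx)
print(zeta[4, 4] > 0)  # True: cyclonic vorticity at the centre of the low
```

In the paper's setting the same operation is applied to deep-layer "thickness" fields derived from MSU channel 3, which is why cyclone-scale thickness features can stand in for height features in the vorticity diagnosis.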
Development of Ocean Noise "Budgets"
NASA Astrophysics Data System (ADS)
D'Spain, G. L.; Miller, J. H.; Frisk, G. V.; Bradley, D. L.
2003-12-01
The National Oceanographic Partnership Program recently sponsored the third U.S. National Academy of Sciences study on the potential impact of manmade sound on the marine environment. Several recommendations for future research are made by the 11-member committee in their report titled Ocean Noise and Marine Mammals (National Academies Press, 2003). This presentation will focus on the subset of recommendations related to a "noise budget", i.e., an accounting of the relative contributions of various sources to the ocean noise field. A noise budget is defined in terms of a specific metric of the sound field. The metric, or budget "currency", typically considered is the acoustic pressure spectrum integrated over space and time, which is proportional to the total mechanical energy in the acoustic field. However, this currency may not be the only one of relevance to marine animals. Each of the various ways in which sound can potentially impact these animals, e.g., temporary threshold shift, masking, behavior disruption, etc, probably depends upon a different property, or set of properties, of the sound field. Therefore, a family of noise budgets based on various currencies will be required for complete evaluation of the potential impact of manmade noise on the marine environment. Validation of noise budgets will require sustained, long term measurements of the underwater noise field.
Helicopter noise in hover: Computational modelling and experimental validation
NASA Astrophysics Data System (ADS)
Kopiev, V. F.; Zaytsev, M. Yu.; Vorontsov, V. I.; Karabasov, S. A.; Anikin, V. A.
2017-11-01
The aeroacoustic characteristics of a helicopter rotor are calculated by a new method to assess its applicability to the evaluation of rotor performance in hover. Direct solution of the Euler equations in a noninertial coordinate system is used to calculate the near-field flow around the spinning rotor. The far-field noise is calculated by the Ffowcs Williams-Hawkings (FW-H) method using permeable control surfaces that include the blade. For a multiblade rotor, the signal obtained is duplicated and shifted in phase for each successive blade. By that means, the spectral characteristics of the far-field noise may be obtained. To determine the integral aerodynamic characteristics of the rotor, software is written to calculate the thrust and torque characteristics from the near-field flow solution. The results of numerical simulation are compared with experimental acoustic and aerodynamic data for a large-scale model of a helicopter main rotor in an open test facility. Two- and four-blade configurations of the rotor are considered, in different hover conditions. The proposed method satisfactorily predicts the aerodynamic characteristics of the blades in such conditions and gives good estimates for the first harmonics of the noise. That permits the practical use of the proposed method, not only for hovering but also for forward flight.
Effects of population density on corticosterone levels of prairie voles in the field
Blondel, Dimitri V.; Wallace, Gerard N.; Calderone, Stefanie; Gorinshteyn, Marija; St. Mary, Colette M.; Phelps, Steven M.
2015-01-01
High population density is often associated with increased levels of stress-related hormones, such as corticosterone (CORT). Prairie voles (Microtus ochrogaster) are a socially monogamous species known for their large population density fluctuations in the wild. Although CORT influences the social behavior of prairie voles in the lab, the effect of population density on CORT has not previously been quantified in this species in the field. We validated a non-invasive hormone assay for measuring CORT metabolites in prairie vole feces. We then used semi-natural enclosures to experimentally manipulate population density, and measured density effects on male space use and fecal CORT levels. Our enclosures generated patterns of space use and social interaction that were consistent with previous prairie vole field studies. Contrary to the positive relationship between CORT and density typical of other taxa, we found that lower population densities (80 animals/ha) produced higher fecal CORT than high densities (240/ha). Combined with prior work in the lab and field, the data suggest that high prairie vole population densities indicate favorable environments, perhaps through reduced predation risk. Lastly, we found that field animals had lower fecal CORT levels than laboratory-living animals. The data emphasize the usefulness of prairie voles as models for integrating ecological, evolutionary and mechanistic questions in social behavior. PMID:26342968
Buckow, Roman; Semrau, Julius; Sui, Qian; Wan, Jason; Knoerzer, Kai
2012-01-01
A computational fluid dynamics (CFD) model describing the flow, electric field and temperature distribution of a laboratory-scale pulsed electric field (PEF) treatment chamber with co-field electrode configuration was developed. The predicted temperature increase was validated by means of integral temperature studies using thermocouples at the outlet of each flow cell for grape juice and salt solutions. Simulations of PEF treatments revealed intensity peaks of the electric field and laminar flow conditions in the treatment chamber causing local temperature hot spots near the chamber walls. Furthermore, thermal inactivation kinetics of lactoperoxidase (LPO) dissolved in simulated milk ultrafiltrate were determined with a glass capillary method at temperatures ranging from 65 to 80 °C. Temperature dependence of first-order inactivation rate constants was accurately described by the Arrhenius equation, yielding an activation energy of 597.1 kJ mol⁻¹. The thermal impact of different PEF processes on LPO activity was estimated by coupling the derived Arrhenius model with the CFD model, and the predicted enzyme inactivation was compared to experimental measurements. Results indicated that LPO inactivation during combined PEF/thermal treatments was largely due to thermal effects, but 5-12% enzyme inactivation may be related to other electro-chemical effects occurring during PEF treatments. Copyright © 2012 American Institute of Chemical Engineers (AIChE).
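The reported Arrhenius temperature dependence can be sketched as follows; the activation energy is the value quoted above, but the reference rate constant and temperature are hypothetical placeholders, not the paper's fitted values:

```python
import math

R = 8.314          # gas constant, J mol^-1 K^-1
Ea = 597.1e3       # activation energy reported for LPO, J mol^-1
# Reference rate constant: an assumed value for illustration only.
k_ref, T_ref = 0.05, 273.15 + 72.5   # s^-1 at 72.5 °C (hypothetical)

def k(T_celsius):
    """First-order inactivation rate constant via the Arrhenius equation:
    k(T) = k_ref * exp(-Ea/R * (1/T - 1/T_ref))."""
    T = 273.15 + T_celsius
    return k_ref * math.exp(-Ea / R * (1.0 / T - 1.0 / T_ref))

def residual_activity(T_celsius, t_seconds):
    """Surviving fraction under first-order kinetics: A/A0 = exp(-k t)."""
    return math.exp(-k(T_celsius) * t_seconds)

# With an Ea this large, a 5 °C rise accelerates inactivation sharply:
print(round(k(77.5) / k(72.5), 1))  # roughly a 20-fold increase
```

The very steep temperature sensitivity implied by Ea = 597.1 kJ mol⁻¹ is what lets the coupled CFD/Arrhenius model attribute most of the observed LPO inactivation to the local thermal hot spots rather than to the field itself.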
Broadband Radiometric LED Measurements
Eppeldauer, G. P.; Cooksey, C. C.; Yoon, H. W.; Hanssen, L. M.; Podobedov, V. B.; Vest, R. E.; Arp, U.; Miller, C. C.
2017-01-01
At present, broadband radiometric measurements of LEDs with uniform and low-uncertainty results are not available. Currently, either complicated and expensive spectral radiometric measurements or broadband photometric LED measurements are used. The broadband photometric measurements are based on the CIE standardized V(λ) function, which cannot be used in the UV range and leads to large errors when blue or red LEDs are measured in its wings, where the realization is always poor. Reference irradiance meters with spectrally constant response and high-intensity LED irradiance sources were developed here to implement the previously suggested broadband radiometric LED measurement procedure [1, 2]. Using a detector with spectrally constant response, the broadband radiometric quantities of any LEDs or LED groups can be simply measured with low uncertainty without using any source standard. The spectral flatness of filtered-Si detectors and low-noise pyroelectric radiometers are compared. Examples are given for integrated irradiance measurement of UV and blue LED sources using the reference (standard) pyroelectric irradiance meters introduced here. For validation, the broadband measured integrated irradiance of several LED-365 sources was compared with the spectrally determined integrated irradiance derived from an FEL spectral irradiance lamp-standard. Integrated responsivity transfer from the reference irradiance meter to transfer standard and field UV irradiance meters is discussed. PMID:28649167
A hierarchy of effective teaching and learning to acquire competence in evidence-based medicine
Khan, Khalid S; Coomarasamy, Arri
2006-01-01
Background A variety of methods exists for teaching and learning evidence-based medicine (EBM). However, there is much debate about the effectiveness of various EBM teaching and learning activities, resulting in a lack of consensus as to what methods constitute the best educational practice. There is a need for a clear hierarchy of educational activities to effectively impart and acquire competence in EBM skills. This paper develops such a hierarchy based on current empirical and theoretical evidence. Discussion EBM requires that health care decisions be based on the best available valid and relevant evidence. To achieve this, teachers delivering EBM curricula need to inculcate amongst learners the skills to gain, assess, apply, integrate and communicate new knowledge in clinical decision-making. Empirical and theoretical evidence suggests that there is a hierarchy of teaching and learning activities in terms of their educational effectiveness: Level 1, interactive and clinically integrated activities; Level 2(a), interactive but classroom based activities; Level 2(b), didactic but clinically integrated activities; and Level 3, didactic, classroom or standalone teaching. Summary All health care professionals need to understand and implement the principles of EBM to improve care of their patients. Interactive and clinically integrated teaching and learning activities provide the basis for the best educational practice in this field. PMID:17173690
NASA Astrophysics Data System (ADS)
Aboulbanine, Zakaria; El Khayati, Naïma
2018-04-01
The use of phase space in medical linear accelerator Monte Carlo (MC) simulations significantly improves the execution time and leads to results comparable to those obtained from full calculations. The classical representation of phase space stores the information of millions of particles directly, producing bulky files. This paper presents a virtual source model (VSM) based on a reconstruction algorithm, taking as input a compressed file of roughly 800 kB derived from phase space data freely available in the International Atomic Energy Agency (IAEA) database. This VSM includes two main components, primary and scattered particle sources, with a specific reconstruction method developed for each. Energy spectra and other relevant variables were extracted from the IAEA phase space and stored in the input description data file for both sources. The VSM was validated for three photon beams: Elekta Precise 6 MV/10 MV and a Varian TrueBeam 6 MV. Extensive calculations in water and comparisons between dose distributions of the VSM and the IAEA phase space were performed to estimate the VSM precision. The Geant4 MC toolkit in multi-threaded mode (Geant4-[mt]) was used for fast dose calculations and optimized memory use. Four field configurations were chosen for dose calculation validation to test field-size and symmetry effects: square fields of several sizes and an asymmetric rectangular field. Good agreement in terms of the gamma-index formalism, for 3%/3 mm and 2%/3 mm criteria, was obtained for each evaluated radiation field and photon beam within a computation time of 60 h on a single workstation for a 3 mm voxel matrix. Analysis of the VSM's precision in high-dose-gradient regions, using the distance-to-agreement (DTA) concept, also showed satisfactory results. In all investigated cases, the mean DTA was less than 1 mm in build-up and penumbra regions.
Regarding calculation efficiency, event processing is six times faster with Geant4-[mt] than with sequential Geant4 when running the same simulation code. The developed VSM for these widely used 6 MV/10 MV beams is a general concept that is easy to adapt to reconstruct comparable beam qualities for various linac configurations, facilitating its integration for MC treatment planning purposes.
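The 3%/3 mm and 2%/3 mm comparisons above are gamma-index criteria that combine a dose-difference tolerance with a distance-to-agreement tolerance. A minimal 1-D sketch of such a test (globally normalised, brute-force search; an illustration, not the paper's implementation):

```python
import numpy as np

def gamma_1d(ref_pos, ref_dose, eval_pos, eval_dose, dose_crit=0.03, dist_crit=3.0):
    """1-D gamma index, globally normalised: for each reference point, the
    minimum over all evaluated points of sqrt(dose term^2 + distance term^2),
    each term expressed in units of its criterion (e.g. 3% / 3 mm)."""
    d_max = ref_dose.max()
    gam = np.empty(len(ref_pos))
    for i, (pos, dose) in enumerate(zip(ref_pos, ref_dose)):
        dd = (eval_dose - dose) / (dose_crit * d_max)  # dose-difference term
        dx = (eval_pos - pos) / dist_crit              # distance term (mm)
        gam[i] = np.sqrt(dd**2 + dx**2).min()
    return gam

x = np.linspace(0.0, 100.0, 101)          # positions in mm
d = np.exp(-((x - 50.0) / 20.0) ** 2)     # synthetic dose profile
g = gamma_1d(x, d, x, d)                  # identical profiles pass trivially
pass_rate = np.mean(g <= 1.0)             # fraction of points with gamma <= 1
```

A point passes when gamma ≤ 1; clinical comparisons typically report the pass rate over all points above a low-dose threshold.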
ERIC Educational Resources Information Center
Johnson, Will L.
2011-01-01
Objective: Analysis of the validity and implementation of a child maltreatment actuarial risk assessment model, the California Family Risk Assessment (CFRA). Questions addressed: (1) Is there evidence of the validity of the CFRA under field operating conditions? (2) Do actuarial risk assessment results influence child welfare workers' service…
Wagenlehner, Florian Martin Erich; Fröhlich, Oliver; Bschleipfer, Thomas; Weidner, Wolfgang; Perletti, Gianpaolo
2014-06-01
Anatomical damage to pelvic floor structures may cause multiple symptoms. The Integral Theory System Questionnaire (ITSQ) is a holistic questionnaire that uses symptoms to help locate damage in specific connective tissue structures as a guide to reconstructive surgery. It is based on the integral theory, which states that pelvic floor symptoms and prolapse are both caused by lax suspensory ligaments. The aim of the present study was to psychometrically validate the ITSQ. Established psychometric properties including validity, reliability, and responsiveness were considered for evaluation. Criterion validity was assessed in a cohort of 110 women with pelvic floor dysfunction by analyzing the correlation of questionnaire responses with objective clinical data. Test-retest reliability was assessed with questionnaires from 47 patients. Cronbach's alpha and split-half reliability coefficients were calculated for internal consistency analysis. The psychometric properties of the ITSQ were comparable to those of previously validated pelvic floor questionnaires. Face validity and content validity were approved by an expert group of the International Collaboration of Pelvic Floor surgeons. Convergent validity assessed using a Bayesian method was at least as accurate as the expert assessment of anatomical defects. Objective data measured in patients demonstrated significant correlations with ITSQ domains, fulfilling criterion validity. Internal consistency values ranged from 0.85 to 0.89 in different scenarios. The ITSQ proved accurate and can serve as a holistic pelvic floor questionnaire directing symptoms to site-specific pelvic floor reconstructive surgery.
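The internal-consistency statistic reported above, Cronbach's alpha, can be computed from an item-score matrix as follows. This is a generic sketch with made-up scores, unrelated to the ITSQ data:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_vars / total_var)

# Perfectly consistent (parallel) items give alpha = 1.
scores = np.array([[1, 1, 1],
                   [2, 2, 2],
                   [3, 3, 3],
                   [4, 4, 4]])
alpha = cronbach_alpha(scores)
```

Values in the 0.85-0.89 range, as found for the ITSQ, are conventionally regarded as good internal consistency.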
The Role of Integrated Modeling in the Design and Verification of the James Webb Space Telescope
NASA Technical Reports Server (NTRS)
Mosier, Gary E.; Howard, Joseph M.; Johnston, John D.; Parrish, Keith A.; Hyde, T. Tupper; McGinnis, Mark A.; Bluth, Marcel; Kim, Kevin; Ha, Kong Q.
2004-01-01
The James Webb Space Telescope (JWST) is a large, infrared-optimized space telescope scheduled for launch in 2011. System-level verification of critical optical performance requirements will rely on integrated modeling to a considerable degree. In turn, requirements for accuracy of the models are significant. The size of the lightweight observatory structure, coupled with the need to test at cryogenic temperatures, effectively precludes validation of the models and verification of optical performance with a single test in 1-g. Rather, a complex series of steps is planned by which the components of the end-to-end models are validated at various levels of subassembly, and the ultimate verification of optical performance is by analysis using the assembled models. This paper describes the critical optical performance requirements driving the integrated modeling activity, shows how the error budget is used to allocate and track contributions to total performance, and presents examples of integrated modeling methods and results that support the preliminary observatory design. Finally, the concepts for model validation and the role of integrated modeling in the ultimate verification of the observatory are described.
Integrating and Visualizing Tropical Cyclone Data Using the Real Time Mission Monitor
NASA Technical Reports Server (NTRS)
Goodman, H. Michael; Blakeslee, Richard; Conover, Helen; Hall, John; He, Yubin; Regner, Kathryn
2009-01-01
The Real Time Mission Monitor (RTMM) is a visualization and information system that fuses multiple Earth science data sources, to enable real time decision-making for airborne and ground validation experiments. Developed at the NASA Marshall Space Flight Center, RTMM is a situational awareness, decision-support system that integrates satellite imagery, radar, surface and airborne instrument data sets, model output parameters, lightning location observations, aircraft navigation data, soundings, and other applicable Earth science data sets. The integration and delivery of this information is made possible using data acquisition systems, network communication links, network server resources, and visualizations through the Google Earth virtual globe application. RTMM is extremely valuable for optimizing individual Earth science airborne field experiments. Flight planners, scientists, and managers appreciate the contributions that RTMM makes to their flight projects. A broad spectrum of interdisciplinary scientists used RTMM during field campaigns including the hurricane-focused 2006 NASA African Monsoon Multidisciplinary Analyses (NAMMA), 2007 NOAA-NASA Aerosonde Hurricane Noel flight, 2007 Tropical Composition, Cloud, and Climate Coupling (TC4), plus a soil moisture (SMAP-VEX) and two arctic research experiments (ARCTAS) in 2008. Improving and evolving RTMM is a continuous process. RTMM recently integrated the Waypoint Planning Tool, a Java-based application that enables aircraft mission scientists to easily develop a pre-mission flight plan through an interactive point-and-click interface. Individual flight legs are automatically calculated "on the fly". The resultant flight plan is then immediately posted to the Google Earth-based RTMM for interested scientists to view the planned flight track and subsequently compare it to the actual real time flight progress. 
We are planning additional capabilities to RTMM including collaborations with the Jet Propulsion Laboratory in the joint development of a Tropical Cyclone Integrated Data Exchange and Analysis System (TC IDEAS) which will serve as a web portal for access to tropical cyclone data, visualizations and model output.
Quantum noise in the mirror-field system: A field theoretic approach
NASA Astrophysics Data System (ADS)
Hsiang, Jen-Tsung; Wu, Tai-Hung; Lee, Da-Shin; King, Sun-Kun; Wu, Chun-Hsien
2013-02-01
We revisit the quantum noise problem in the mirror-field system with a field-theoretic approach. Here a perfectly reflecting mirror is illuminated by a single-mode coherent state of the massless scalar field. The associated radiation pressure is described by a surface integral of the stress tensor of the field. The read-out field is measured by a monopole detector, from which the effective distance between the detector and mirror can be obtained. In the slow-motion limit of the mirror, this field-theoretic approach makes it possible to identify various sources of quantum noise that together lead to uncertainty in the read-out measurement. In addition to the well-known shot noise and radiation pressure fluctuations, a new source of noise is found from field fluctuations modified by the mirror's displacement. Correlation between different sources of noise can be established in the read-out measurement as a consequence of interference between the incident field and the field reflected off the mirror. In the case of negative correlation, we find that the uncertainty can be lowered below the value predicted by the standard quantum limit. Since the particle-number approach is often used in quantum optics, we compare results obtained by both approaches and examine their validity. We also derive a Langevin equation that describes the stochastic dynamics of the mirror. The underlying fluctuation-dissipation relation is briefly mentioned. Finally, we discuss the backreaction induced by the radiation pressure. It will alter the mean displacement of the mirror, but we argue that this backreaction can be ignored for a slowly moving mirror.
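For context, the standard quantum limit invoked above is conventionally written, for monitoring the position of a free mass m over a measurement time τ (up to factors of order unity), as

```latex
\Delta x_{\mathrm{SQL}} = \sqrt{\frac{\hbar\,\tau}{m}}
```

This is the balance point at which shot noise and radiation-pressure noise contribute equally; the negative cross-correlation discussed in the abstract is what allows the total read-out uncertainty to dip below this value.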
Experience in Evaluating AAL Solutions in Living Labs
Colomer, Juan Bautista Montalvá; Salvi, Dario; Cabrera-Umpierrez, Maria Fernanda; Arredondo, Maria Teresa; Abril, Patricia; Jimenez-Mixco, Viveca; García-Betances, Rebeca; Fioravanti, Alessio; Pastorino, Matteo; Cancela, Jorge; Medrano, Alejandro
2014-01-01
Ambient assisted living (AAL) is a complex field, where different technologies are integrated to offer solutions for the benefit of different stakeholders. Several evaluation techniques are commonly applied that tackle specific aspects of AAL; however, holistic evaluation approaches are lacking when addressing the needs of both developers and end-users. Living labs have often been used as real-life test and experimentation environments for co-designing AAL technologies and validating them with relevant stakeholders. During the last five years, we have been evaluating AAL systems and services in the framework of various research projects. This paper presents the lessons learned from this experience and proposes a set of harmonized guidelines to conduct evaluations in living labs. PMID:24763209
The Application of Hardware in the Loop Testing for Distributed Engine Control
NASA Technical Reports Server (NTRS)
Thomas, George L.; Culley, Dennis E.; Brand, Alex
2016-01-01
The essence of a distributed control system is the modular partitioning of control function across a hardware implementation. This type of control architecture requires embedding electronics in a multitude of control element nodes for the execution of those functions, and their integration as a unified system. As the field of distributed aeropropulsion control moves toward reality, questions about building and validating these systems remain. This paper focuses on the development of hardware-in-the-loop (HIL) test techniques for distributed aero engine control, and the application of HIL testing as it pertains to potential advanced engine control applications that may now be possible due to the intelligent capability embedded in the nodes.
Applications of magnetic resonance image segmentation in neurology
NASA Astrophysics Data System (ADS)
Heinonen, Tomi; Lahtinen, Antti J.; Dastidar, Prasun; Ryymin, Pertti; Laarne, Paeivi; Malmivuo, Jaakko; Laasonen, Erkki; Frey, Harry; Eskola, Hannu
1999-05-01
After the introduction of digital imaging devices in medicine, computerized tissue recognition and classification have become important in research and clinical applications. Segmented data can be applied in numerous research fields, including volumetric analysis of particular tissues and structures, construction of anatomical models, 3D visualization, and multimodal visualization, making segmentation essential in modern image analysis. In this research project, several PC-based software tools were developed to segment medical images, to visualize raw and segmented images in 3D, and to produce EEG brain maps in which MR images and EEG signals were integrated. The software package was tested and validated in numerous clinical research projects in a hospital environment.
A Three-Year Field Validation Study to Improve the Integrated Pest Management of Hot Pepper
Kim, Ji-Hoon; Yun, Sung-Chul
2013-01-01
To improve the integrated pest management (IPM) of hot pepper, a field study was conducted in Hwasung from 2010 to 2012 and an IPM system was developed to help growers decide when to apply pesticides to control anthracnose, tobacco budworm, Phytophthora blight, bacterial wilt, and bacterial leaf spot. The three field treatments consisted of IPM sprays following the forecast model advisory, a periodic spray at 7-to-10-day intervals, and no spray (control). The number of annual pesticide applications for the IPM treatment ranged from six to eight, whereas the plots subjected to the periodic treatment received pesticide 11 or 12 times annually for three years. Compared to the former strategy, our improved IPM strategy features more intense pest management, with frequent spraying for anthracnose and mixed spraying for tobacco budworm or Phytophthora blight. Disease and pest incidences in the unsprayed control in 2010, 2011, and 2012 were 91, 97.6, and 41.4%, respectively. Conversely, the incidences for the IPM treatment in those years were 7.6, 62.6, and 2%, and the yields from IPM-treated plots were 48.6 kg, 12.1 kg, and 48.8 kg. The incidence and yield in the IPM-treated plots were almost the same as those of the periodic treatment except in 2011, and no unnecessary sprays were given, meaning that the IPM control was quite successful. A review of eight years of field work shows that sophisticated forecasts optimizing pesticide spray timing can reduce reliance on pesticides without compromising yield. Eco-friendly strategies can thus be implemented in the pest management of hot pepper. PMID:25288956
Field Validation of POCIS for Monitoring at Underwater Munitions Sites.
Rosen, Gunther; Lotufo, Guilherme R; George, Robert D; Wild, Bill; Rabalais, Lauren K; Morrison, Shane; Belden, Jason B
2018-04-24
The present study evaluated Polar Organic Chemical Integrative Samplers (POCIS) for quantification of conventional munitions constituents (MC), including trinitrotoluene (TNT), aminodinitrotoluenes, diaminonitrotoluenes, dinitrotoluene, and hexahydro-1,3,5-trinitro-1,3,5-triazine (RDX) in a field setting. POCIS were deployed at varying distances from the commonly used explosive formulation Composition B (39.5% TNT, 59.5% RDX, 1% wax) in an embayment of Santa Rosa Sound (Florida, USA). Time-weighted averaged (TWA) water concentrations from a 13-day deployment ranged from 9-103 ng/L for TNT and RDX approximately 0.3 to 2 m from the source. Concentrations decreased with increasing distance from the source to below quantitation limits (5-7 ng/L) at stations greater than 2 m away. Moderate biofouling of POCIS membranes after 13-d led to a subsequent effort to quantify potential effects of biofouling on sampling rate for MC. After biofouling was allowed to occur for periods of 0, 7, 14 or 28 days at the field site, POCIS were transferred to aquaria spiked with MC. No significant differences in uptake of TNT or RDX were observed across a gradient of biofouling presence, although mass of fouling organisms on the membranes was statistically greater for the 28-d field exposure. The present study verified the high sensitivity and integrative nature of POCIS for relevant MC potentially present in aquatic environments, indicating that application at underwater military munitions sites may be useful for ecological risk assessment. This article is protected by copyright. All rights reserved.
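The time-weighted average concentrations reported above follow from the standard integrative-sampler relation C_w = M / (R_s · t). A minimal sketch, with hypothetical mass and sampling-rate values; real POCIS sampling rates are compound-specific and must come from calibration (e.g. the spiked-aquaria exposures described in the abstract):

```python
def twa_concentration(mass_ng, sampling_rate_l_per_day, days):
    """Time-weighted average water concentration (ng/L) from an integrative
    passive sampler: C_w = M / (R_s * t), with M the mass on the sorbent."""
    return mass_ng / (sampling_rate_l_per_day * days)

# Illustrative numbers only (not from the study): 130 ng accumulated over a
# 13-day deployment with an assumed sampling rate of 0.2 L/day.
c_w = twa_concentration(mass_ng=130.0, sampling_rate_l_per_day=0.2, days=13.0)
```

Because the sampler integrates exposure over the whole deployment, C_w smooths over episodic concentration spikes that grab samples would miss.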
Reisner, A. T.; Khitrov, M. Y.; Chen, L.; Blood, A.; Wilkins, K.; Doyle, W.; Wilcox, S.; Denison, T.; Reifman, J.
2013-01-01
Summary Background Advanced decision-support capabilities for prehospital trauma care may prove effective at improving patient care. Such functionality would be possible if an analysis platform were connected to a transport vital-signs monitor. In practice, there are technical challenges to implementing such a system. Not only must each individual component be reliable, but, in addition, the connectivity between components must be reliable. Objective We describe the development, validation, and deployment of the Automated Processing of Physiologic Registry for Assessment of Injury Severity (APPRAISE) platform, intended to serve as a test bed to help evaluate the performance of decision-support algorithms in a prehospital environment. Methods We describe the hardware selected and the software implemented, and the procedures used for laboratory and field testing. Results The APPRAISE platform met performance goals in both laboratory testing (using a vital-sign data simulator) and initial field testing. After its field testing, the platform has been in use on Boston MedFlight air ambulances since February of 2010. Conclusion These experiences may prove informative to other technology developers and to healthcare stakeholders seeking to invest in connected electronic systems for prehospital as well as in-hospital use. Our experiences illustrate two sets of important questions: are the individual components reliable (e.g., physical integrity, power, core functionality, and end-user interaction) and is the connectivity between components reliable (e.g., communication protocols and the metadata necessary for data interpretation)? While all potential operational issues cannot be fully anticipated and eliminated during development, thoughtful design and phased testing steps can reduce, if not eliminate, technical surprises. PMID:24155791
ARC: An open-source library for calculating properties of alkali Rydberg atoms
NASA Astrophysics Data System (ADS)
Šibalić, N.; Pritchard, J. D.; Adams, C. S.; Weatherill, K. J.
2017-11-01
We present an object-oriented Python library for the computation of properties of highly-excited Rydberg states of alkali atoms. These include single-body effects such as dipole matrix elements, excited-state lifetimes (radiative and black-body limited) and Stark maps of atoms in external electric fields, as well as two-atom interaction potentials accounting for dipole and quadrupole coupling effects valid at both long and short range for arbitrary placement of the atomic dipoles. The package is cross-referenced to precise measurements of atomic energy levels and features extensive documentation to facilitate rapid upgrade or expansion by users. This library has direct application in the field of quantum information and quantum optics which exploit the strong Rydberg dipolar interactions for two-qubit gates, robust atom-light interfaces and simulating quantum many-body physics, as well as the field of metrology using Rydberg atoms as precise microwave electrometers. Program Files doi:http://dx.doi.org/10.17632/hm5n8w628c.1 Licensing provisions: BSD-3-Clause Programming language: Python 2.7 or 3.5, with C extension External Routines: NumPy [1], SciPy [1], Matplotlib [2] Nature of problem: Calculating atomic properties of alkali atoms including lifetimes, energies, Stark shifts and dipole-dipole interaction strengths using matrix elements evaluated from radial wavefunctions. Solution method: Numerical integration of radial Schrödinger equation to obtain atomic wavefunctions, which are then used to evaluate dipole matrix elements. Properties are calculated using second order perturbation theory or exact diagonalisation of the interaction Hamiltonian, yielding results valid even at large external fields or small interatomic separation. Restrictions: External electric field fixed to be parallel to quantisation axis. Supplementary material: Detailed documentation (.html), and Jupyter notebook with examples and benchmarking runs (.html and .ipynb). [1] T.E. Oliphant, Comput. Sci. Eng. 9, 10 (2007). http://www.scipy.org/. [2] J.D. Hunter, Comput. Sci. Eng. 9, 90 (2007). http://matplotlib.org/.
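The solution method quoted above, numerical integration of the radial Schrödinger equation, is commonly implemented with a Numerov integrator. The sketch below is not ARC code; it solves the analytically known hydrogen 1s case in atomic units, where u(r) = 2 r exp(-r) is available as a check, integrating inward from the classically forbidden tail, which is the numerically stable direction for bound states:

```python
import numpy as np

def numerov_inward(r, e, l, potential):
    """Solve u'' = f(r) u for u(r) = r*R(r) by inward Numerov integration
    (stable for bound states), with f = l(l+1)/r^2 + 2*(V(r) - E), atomic units."""
    h = r[1] - r[0]
    f = l * (l + 1) / r**2 + 2.0 * (potential(r) - e)
    c = h * h / 12.0
    u = np.zeros_like(r)
    u[-1], u[-2] = 0.0, 1e-8          # seed the exponentially decaying tail
    for i in range(len(r) - 2, 0, -1):
        u[i - 1] = (2.0 * u[i] * (1.0 + 5.0 * c * f[i])
                    - u[i + 1] * (1.0 - c * f[i + 1])) / (1.0 - c * f[i - 1])
    return u

# Hydrogen 1s: V = -1/r, E = -0.5 hartree; analytic u(r) = 2 r exp(-r).
r = np.linspace(1e-4, 20.0, 4000)
u = numerov_inward(r, -0.5, 0, lambda rr: -1.0 / rr)
u /= np.sqrt(np.sum(u**2) * (r[1] - r[0]))   # crude quadrature normalisation
analytic = 2.0 * r * np.exp(-r)
```

For alkali atoms the Coulomb potential would be replaced by a model core potential with quantum-defect corrections, and the resulting wavefunctions used to evaluate dipole matrix elements, as the library description states.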
Mills, Jeremy F; Gray, Andrew L
2013-11-01
This study is an initial validation study of the Two-Tiered Violence Risk Estimates instrument (TTV), a violence risk appraisal instrument designed to support an integrated-actuarial approach to violence risk assessment. The TTV was scored retrospectively from file information on a sample of violent offenders. Construct validity was examined by comparing the TTV with instruments that have shown utility to predict violence that were prospectively scored: The Historical-Clinical-Risk Management-20 (HCR-20) and Lifestyle Criminality Screening Form (LCSF). Predictive validity was examined through a long-term follow-up of 12.4 years with a sample of 78 incarcerated offenders. Results show the TTV to be highly correlated with the HCR-20 and LCSF. The base rate for violence over the follow-up period was 47.4%, and the TTV was equally predictive of violent recidivism relative to the HCR-20 and LCSF. Discussion centers on the advantages of an integrated-actuarial approach to the assessment of violence risk.
Independent validation of Swarm Level 2 magnetic field products and `Quick Look' for Level 1b data
NASA Astrophysics Data System (ADS)
Beggan, Ciarán D.; Macmillan, Susan; Hamilton, Brian; Thomson, Alan W. P.
2013-11-01
Magnetic field models are produced on behalf of the European Space Agency (ESA) by an independent scientific consortium known as the Swarm Satellite Constellation Application and Research Facility (SCARF), through the Level 2 Processing System (L2PS). The consortium primarily produces magnetic field models for the core, lithosphere, ionosphere and magnetosphere. Typically, for each magnetic product, two magnetic field models are produced in separate chains using complementary data selection and processing techniques. Hence, the magnetic field models from the complementary processing chains will be similar but not identical. The final step in the overall L2PS therefore involves inspection and validation of the magnetic field models against each other and against data from (semi-) independent sources (e.g. ground observatories). We describe the validation steps for each magnetic field product and the comparison against independent datasets, and we show examples of the output of the validation. In addition, the L2PS also produces a daily set of `Quick Look' output graphics and statistics to monitor the overall quality of Level 1b data issued by ESA. We describe the outputs of the `Quick Look' chain.
The GODAE High Resolution Sea Surface Temperature Pilot Project (GHRSST-PP)
NASA Astrophysics Data System (ADS)
Donlon, C.; Ghrsst-Pp Science Team
2003-04-01
This paper summarises the Development and Implementation Plan of the GODAE High Resolution Sea Surface Temperature Pilot Project (GHRSST-PP). The aim of the GHRSST-PP is to coordinate a new generation of global, multi-sensor, high-resolution (better than 10 km and 12 hours) SST products for the benefit of the operational and scientific community and for those with a potential interest in the products of GODAE. The GHRSST-PP project will deliver a demonstration system that integrates data from existing international satellite and in situ data sources using state-of-the-art communications and analysis tools. Primary GHRSST-PP products will be generated by fusing infrared and microwave satellite data obtained from sensors in near-polar, geostationary and low earth orbits, constrained by in situ observations. Surface skin SST, sub-surface SST and SST at depth will be produced as both merged and analysed data products. Merged data products share a common grid, with each input dataset retaining its own error statistics, whereas analysed data products combine all data into a single best-estimate field with one set of error statistics. Merged SST fields will not be interpolated, thereby preserving the integrity of the source data as much as possible. Products will first be produced and validated using in situ observations for regional areas by regional data assembly centres (RDAC) and sent to a global data analysis centre (GDAC) for integration with other data to provide global coverage. GDAC and RDAC will be connected together with other data using a virtual dynamic distributed database (DDD). The GDAC will merge and analyse RDAC data together with other data (from the GTS and space agencies) to provide global coverage every 12 hours in real time. In all cases data products will be accurate to better than 0.5 K, validated using data collected at globally distributed diagnostic data set (DDS) sites.
A user information service (UIS) will work together with user applications and services (AUS) to ensure that the GHRSST-PP is able to respond appropriately to user demands. In addition, the GDAC will provide product validation and dissemination services as well as the means for researchers to test and use the In situ and Satellite Data Integration Processing Model (ISDI-PM) operational demonstration code using a large supercomputer.
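The distinction drawn above between merged products (inputs kept with their own error statistics) and analysed products (a single best estimate with one set of error statistics) can be illustrated with the simplest possible analysis step, an inverse-variance weighted combination. Real GHRSST analyses use far more elaborate schemes such as optimal interpolation; the retrieval values and uncertainties below are hypothetical:

```python
import numpy as np

def best_estimate(values, sigmas):
    """Inverse-variance weighted combination of independent estimates,
    returning the best estimate and its combined standard error."""
    values = np.asarray(values, dtype=float)
    w = 1.0 / np.asarray(sigmas, dtype=float) ** 2
    return np.sum(w * values) / np.sum(w), 1.0 / np.sqrt(np.sum(w))

# Hypothetical SST retrievals (K) of one grid cell: infrared, microwave,
# and a drifting-buoy match-up, each with its own standard error.
sst, sigma = best_estimate([288.4, 288.9, 288.6], [0.3, 0.6, 0.2])
```

The combined standard error is always smaller than that of the best single input, which is the statistical motivation for analysed products carrying one consolidated set of error statistics.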
NASA Technical Reports Server (NTRS)
Chavez, Carlos; Hammel, Bruce; Hammel, Allan; Moore, John R.
2014-01-01
Unmanned Aircraft Systems (UAS) represent a new capability that will provide a variety of services in the government (public) and commercial (civil) aviation sectors. The growth of this potential industry has not yet been realized due to the lack of a common understanding of what is required to safely operate UAS in the National Airspace System (NAS). To address this deficiency, NASA has established a project called UAS Integration in the NAS (UAS in the NAS), under the Integrated Systems Research Program (ISRP) of the Aeronautics Research Mission Directorate (ARMD). This project provides an opportunity to transition concepts, technology, algorithms, and knowledge to the Federal Aviation Administration (FAA) and other stakeholders to help them define the requirements, regulations, and issues for routine UAS access to the NAS. The safe, routine, and efficient integration of UAS into the NAS requires new radio frequency (RF) spectrum allocations and a new data communications system which is both secure and scalable with increasing UAS traffic without adversely impacting the Air Traffic Control (ATC) communication system. These data communications, referred to as Control and Non-Payload Communications (CNPC), exchange information between the unmanned aircraft and the ground control station to ensure safe, reliable, and effective unmanned aircraft flight operation. A Communications Subproject within the UAS in the NAS Project has been established to address issues related to CNPC development, certification and fielding. The focus of the Communications Subproject is on validating and allocating new RF spectrum and data link communications to enable civil UAS integration into the NAS. The goal is to validate secure, robust data links within the allocated frequency spectrum for UAS.
A vision, architectural concepts, and seed requirements for the future commercial UAS CNPC system have been developed by RTCA Special Committee 203 (SC-203) in the process of determining formal recommendations to the FAA in its role provided for under the Federal Advisory Committee Act. NASA intends to conduct its research and development in keeping with this vision and associated architectural concepts. The prototype communication systems developed and tested by NASA will be used to validate and update the initial SC-203 requirements in order to provide a foundation for SC-203's Minimum Aviation System Performance Standards (MASPS).
Electro-gravity via geometric chronon field
NASA Astrophysics Data System (ADS)
Suchard, Eytan H.
2017-05-01
In De Sitter / Anti De Sitter space-time and in other geometries, reference sub-manifolds, from which proper time is measured along integral curves, are described as events. We introduce here a foliation with the help of a scalar field. The scalar field need not be unique, but from the gradient of the scalar field an intrinsic Reeb vector of the foliations perpendicular to the gradient vector is calculated. The Reeb vector describes the acceleration of a physical particle that moves along the integral curves that are formed by the gradient of the scalar field. The Reeb vector appears as a component of an anti-symmetric matrix which is part of a rank-2 2-form. The 2-form is extended into a non-degenerate 4-form and into a rank-4 matrix of a 2-form, which, when multiplied by the velocity of a particle, becomes the acceleration of the particle. The matrix has one U(1) degree of freedom and additional SU(2) degrees of freedom in two vectors that span the plane perpendicular to the gradient of the scalar field and to the Reeb vector. In total, there are U(1) x SU(2) degrees of freedom. SU(3) degrees of freedom arise from three-dimensional foliations but require an additional symmetry to exist in order to have a valid covariant meaning. Matter in the Einstein Grossmann equation is replaced by the action of the acceleration field, i.e. by a geometric action which is not anticipated by the metric alone. This idea leads to a new formalism that replaces the conventional stress-energy-momentum tensor. The formalism will be mainly developed for classical physics but will also be discussed for quantized physics based on events instead of particles. The result is that a positive charge manifests small attracting gravity and a stronger but small repelling acceleration field that repels even uncharged particles that have a rest mass. Negative charge manifests a repelling anti-gravity but also a stronger acceleration field that attracts even uncharged particles that have rest mass.
Preliminary version: http://sciencedomain.org/abstract/9858
Static Vented Chamber and Eddy Covariance Methane Flux Comparisons in Mid-South US Rice
NASA Astrophysics Data System (ADS)
Reba, M. L.; Fong, B.; Adviento-Borbe, A.; Runkle, B.; Suvocarev, K.; Rival, I.
2017-12-01
Rice cultivation contributes substantial GHG emissions (CO2 and CH4) because of flooded field conditions. A comparison between eddy covariance and static vented flux chamber measurement techniques is presented. Rice GHG emissions measured with plot-level chambers may not accurately describe the aggregate effects of all the soil and micrometeorological variations across a production field. Eddy covariance (EC) is a direct, integrated measurement of trace gases at the field scale. Flux measurements were collected in NE Arkansas production-size rice fields (16 ha, 40 ac) during the 2015 and 2016 production seasons (June-August) under continuous flood (CF) irrigation. The study objectives included quantifying the difference between chamber and EC measurements and relating flux behavior to growth stage and field history. EC daily average emissions correlated with chamber measurements (R2 = 0.27-0.54) more strongly than the 09:00-12:00 average, which encompassed the chamber measurement times (R2 = 0.23-0.32). Maximum methane emissions occurred in the late afternoon from 14:00-18:00, corresponding with maximum soil heat flux and air temperature. The total emissions from the study fields ranged from 27-117 kg CH4-C ha-1 season-1. The emission profile was lower in 2015, most likely due to higher rainfall and cooler temperatures during the growing season compared to 2016. These findings improve our understanding of GHG emissions at the field scale under typical production practices and of the validity of chamber and EC flux measurement techniques.
Validation of farm-scale methane emissions using nocturnal boundary layer budgets
NASA Astrophysics Data System (ADS)
Stieger, J.; Bamberger, I.; Buchmann, N.; Eugster, W.
2015-08-01
This study provides the first experimental validation of Swiss agricultural methane emission estimates at the farm scale. We measured CH4 concentrations at a Swiss farmstead during two intensive field campaigns in August 2011 and July 2012 to (1) quantify the source strength of livestock methane emissions using a tethered balloon system and (2) validate inventory emission estimates via nocturnal boundary layer (NBL) budgets. Field measurements were performed at a distance of 150 m from the nearest farm buildings with a tethered balloon system in combination with gradient measurements at eight heights on a 10 m tower to better resolve the near-surface concentrations. Vertical profiles of air temperature, relative humidity, CH4 concentration, wind speed and wind direction showed that the NBL was strongly influenced by local transport processes and by the valley wind system. Methane concentrations showed a pronounced time course, with highest concentrations in the second half of the night. NBL budget flux estimates were obtained via a time-space kriging approach. Main uncertainties of NBL budget flux estimates were associated with nonstationary atmospheric conditions and the estimate of the inversion height zi (top of volume integration). The mean NBL budget fluxes of 1.60 ± 0.31 μg CH4 m-2 s-1 (1.40 ± 0.50 and 1.66 ± 0.20 μg CH4 m-2 s-1 in 2011 and 2012, respectively) were in good agreement with local inventory estimates based on current livestock number and default emission factors, with 1.29 ± 0.47 and 1.74 ± 0.63 μg CH4 m-2 s-1 for 2011 and 2012, respectively. This indicates that emission factors used for the national inventory reports are adequate, and we conclude that the NBL budget approach is a useful tool to validate emission inventory estimates.
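The core of the NBL budget can be sketched numerically: the surface flux equals the time rate of change of the vertically integrated concentration below the inversion height zi. The sketch below is a minimal illustration of that idea; the study itself used time-space kriging of balloon and tower profiles, and all profile values here are invented for the example.

```python
import numpy as np

def nbl_budget_flux(z, c1, c2, dt_seconds):
    """Surface flux from the change in the vertically integrated concentration
    profile below the inversion height zi = z[-1].

    z          : heights above ground (m)
    c1, c2     : concentration profiles (ug CH4 m^-3) at times t1 < t2
    dt_seconds : t2 - t1 (s)
    Returns the mean flux in ug CH4 m^-2 s^-1.
    """
    dc = c2 - c1
    # trapezoidal integration of the storage change over height
    storage_change = float(np.sum(0.5 * (dc[1:] + dc[:-1]) * np.diff(z)))
    return storage_change / dt_seconds

# Invented example: surface-weighted CH4 build-up over one hour below zi = 100 m
z = np.linspace(0.0, 100.0, 51)
c1 = np.full_like(z, 1300.0)            # ug m^-3, well-mixed at dusk
c2 = c1 + 20.0 * np.exp(-z / 30.0)      # one hour later
flux = nbl_budget_flux(z, c1, c2, 3600.0)
```

In the real analysis the dominant uncertainties noted in the abstract (nonstationarity and the choice of zi) enter through the profiles and the upper integration limit.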
40 CFR 86.1341-98 - Test cycle validation criteria.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 20 2012-07-01 2012-07-01 false Test cycle validation criteria. 86... Procedures § 86.1341-98 Test cycle validation criteria. Section 86.1341-98 includes text that specifies...-90 (d)(4), shall be excluded from both cycle validation and the integrated work used for emissions...
40 CFR 86.1341-98 - Test cycle validation criteria.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 20 2013-07-01 2013-07-01 false Test cycle validation criteria. 86... Procedures § 86.1341-98 Test cycle validation criteria. Section 86.1341-98 includes text that specifies...-90 (d)(4), shall be excluded from both cycle validation and the integrated work used for emissions...
40 CFR 86.1341-98 - Test cycle validation criteria.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 19 2011-07-01 2011-07-01 false Test cycle validation criteria. 86... Procedures § 86.1341-98 Test cycle validation criteria. Section 86.1341-98 includes text that specifies...-90 (d)(4), shall be excluded from both cycle validation and the integrated work used for emissions...
An online tool for tracking soil nitrogen
NASA Astrophysics Data System (ADS)
Wang, J.; Umar, M.; Banger, K.; Pittelkow, C. M.; Nafziger, E. D.
2016-12-01
Near real-time crop models can be useful tools for optimizing agricultural management practices. For example, model simulations can potentially provide current estimates of nitrogen availability in soil, helping growers decide whether more nitrogen needs to be applied in a given season. Traditionally, crop models have been used at point locations (i.e. single fields) with homogeneous soil, climate and initial conditions. However, estimates of nitrogen availability across fields with varied weather and soil conditions at a regional or national level are necessary to guide better management decisions. This study presents the development of a publicly available, online tool that automates the integration of high-spatial-resolution forecast and past weather and soil data in DSSAT to estimate nitrogen availability for individual fields in Illinois. The model has been calibrated with field experiments from the past year at six research corn fields across Illinois. These sites were treated with different N fertilizer timings and amounts. The tool requires minimal management information from growers and yet has the capability to simulate nitrogen-water-crop interactions with calibrated parameters that are more appropriate for Illinois. The results from the tool will be combined with incoming field experiment data from 2016 for model validation and further improvement of the model's predictive accuracy. The tool has the potential to help guide better nitrogen management practices to maximize economic and environmental benefits.
NASA Astrophysics Data System (ADS)
Kabiri, K.
2017-09-01
The capabilities of Sentinel-2A imagery to determine bathymetric information in shallow coastal waters were examined. In this regard, two Sentinel-2A images (acquired in February and March 2016 in calm weather and relatively low turbidity) were selected from Nayband Bay, located in the northern Persian Gulf. In addition, a precise and accurate bathymetric map for the study area was obtained and used both for calibrating the models and for validating the results. Traditional linear and ratio transform techniques, as well as a novel integrated method, were employed to determine depth values. All possible combinations of the three bands (Band 2: blue (458-523 nm), Band 3: green (543-578 nm), and Band 4: red (650-680 nm); spatial resolution: 10 m) were considered (11 options) using the traditional linear and ratio transform techniques, together with 10 model options for the integrated method. The accuracy of each model was assessed by comparing the determined bathymetric information with field-measured values. The correlation coefficients (R2) and root mean square errors (RMSE) for validation points were calculated for all models and both satellite images. When compared with the linear transform method, the method employing ratio transformation with a combination of all three bands yielded more accurate results (R2Mar = 0.795, R2Feb = 0.777, RMSEMar = 1.889 m, and RMSEFeb = 2.039 m). Although most of the integrated transform methods (specifically the method including all bands and band ratios) yielded the highest accuracy, the improvements were not significant; hence the ratio transformation was selected as the optimum method.
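The ratio transform mentioned above is commonly implemented in the log-ratio form of Stumpf et al.; the sketch below shows that form with a least-squares calibration against field-measured depths. The band values, the constant n, and the synthetic truth are illustrative assumptions, not the paper's calibrated coefficients.

```python
import numpy as np

def ratio_transform_depth(R_blue, R_green, m1, m0, n=1000.0):
    """Stumpf-style ratio transform: depth from the log-ratio of two bands."""
    return m1 * np.log(n * R_blue) / np.log(n * R_green) - m0

def calibrate(R_blue, R_green, depths, n=1000.0):
    """Least-squares fit of (m1, m0) to field-measured depths."""
    x = np.log(n * R_blue) / np.log(n * R_green)
    A = np.column_stack([x, -np.ones_like(x)])
    (m1, m0), *_ = np.linalg.lstsq(A, depths, rcond=None)
    return m1, m0

# Synthetic reflectances built from a known truth (m1 = 30, m0 = 28) so the
# calibration can be checked; all numbers here are invented.
n = 1000.0
x_true = np.linspace(0.95, 1.05, 25)
R_green = np.full_like(x_true, 0.05)
R_blue = np.exp(x_true * np.log(n * R_green)) / n
depths = 30.0 * x_true - 28.0
m1_hat, m0_hat = calibrate(R_blue, R_green, depths)
```

A linear-transform model would instead regress depth on log(n * R) of each band separately; the log-ratio form is popular because the ratio partly cancels bottom-albedo variability.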
A test of the validity of the motivational interviewing treatment integrity code.
Forsberg, Lars; Berman, Anne H; Kallmén, Håkan; Hermansson, Ulric; Helgason, Asgeir R
2008-01-01
To evaluate the Swedish version of the Motivational Interviewing Treatment Code (MITI), MITI coding was applied to tape-recorded counseling sessions. Construct validity was assessed using factor analysis on 120 MITI-coded sessions. Discriminant validity was assessed by comparing MITI coding of motivational interviewing (MI) sessions with information- and advice-giving sessions as well as by comparing MI-trained practitioners with untrained practitioners. A principal-axis factoring analysis yielded some evidence for MITI construct validity. MITI differentiated between practitioners with different levels of MI training as well as between MI practitioners and advice-giving counselors, thus supporting discriminant validity. MITI may be used as a training tool together with supervision to confirm and enhance MI practice in clinical settings. MITI can also serve as a tool for evaluating MI integrity in clinical research.
High Sensitivity Detection of Broadband Acoustic Vibration Using Optical Demodulation Method
NASA Astrophysics Data System (ADS)
Zhang, Zhen
Measuring high-frequency acoustic vibrations is of fundamental interest for revealing the intrinsic dynamic characteristics of a broad range of systems, such as the growth of a fetus, blood flow in human palms, and vibrations of carbon nanotubes. However, acoustic wave detection capability is limited by the detection bandwidth and sensitivity of the commonly used piezoelectric-based ultrasound detectors. To overcome these limitations, this thesis focuses on exploring optical demodulation methods for highly sensitive detection of broadband acoustic vibration. First, a transparent optical ultrasonic detector has been developed using a micro-ring resonator (MRR) made of soft polymeric materials. It outperforms traditional piezoelectric detectors with broader detection bandwidth, miniaturized size and wide angular sensitivity. Its ease of integration into a photoacoustic microscopy system has resulted in great improvement of the imaging resolution. A theoretical framework has been developed to establish a quantitative understanding of its unique distance- and angle-dependent detection characteristics and was subsequently validated experimentally. The framework fully accounts for the trade-offs between axial and lateral resolution, working distance, and field of view, providing a guideline for developing optimal imaging performance for a wide range of biological and clinical applications. The MRR-based ultrasonic detector is further integrated into confocal fluorescence microscopy to realize simultaneous imaging of fluorescence and optical absorption of the retinal pigment epithelium, achieving multi-contrast imaging at the sub-cellular level. The need to resolve fine details of biological specimens with resolution beyond the diffraction limit further motivated the development of an optically demodulated ultrasonic detection method based on near-field scanning optical microscopy (NSOM).
The nano-focusing probe was developed for adiabatic focusing of surface plasmon polaritons to the probe apex with high energy efficiency, and suppression of the background noise was accomplished through the implementation of a harmonic demodulation technique. Collectively, this system is capable of delivering an intense near-field illumination source while effectively suppressing the background signal due to far-field scattering, and thus allows for quantitative mapping of the local evanescent field with enhanced contrast and improved resolution. The performance of the developed NSOM system has been validated through experimental measurements of the surface plasmon polariton mode. This new NSOM system enables optically demodulated ultrasound detection at nanoscale spatial resolution. Using it to detect the ultrasound signal within the acoustic near-field has led to the successful experimental demonstration of sub-surface photoacoustic imaging of buried objects with sub-diffraction-limited resolution and high sensitivity. Such a new ultrasound detection method holds promising potential for super-resolution ultrasound imaging.
Frequency Response Studies using Receptance Coupling Approach in High Speed Spindles
NASA Astrophysics Data System (ADS)
Shaik, Jakeer Hussain; Ramakotaiah, K.; Srinivas, J.
2018-01-01
In order to assess the stability of high-speed machining, estimating the frequency response at the tool tip is of great importance. Evaluating the dynamic response of several combinations of integrated spindle-tool holder-tool assemblies would consume a lot of time. This paper presents the coupled-field dynamic response at the tool tip for the entire integrated spindle-tool unit. The spindle unit is assumed to be supported on front and rear bearings and is analyzed using Timoshenko beam theory to arrive at the receptances at different locations of the spindle-tool unit. The responses are further validated with a conventional finite element model as well as with experiments. This approach permits quick outputs without losing accuracy of solution, and these methods are further utilized to analyze the effect of various design variables on system dynamics. The results obtained through this analysis are needed to design a better spindle unit in an attempt to reduce the frequency amplitudes at the tool tip and improve milling stability during the cutting process.
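Receptance coupling joins substructure frequency response functions into an assembly response. For rigid coupling at a single coordinate, the assembled drive-point receptance is G = (Ha^-1 + Hb^-1)^-1. The sketch below illustrates this with two undamped single-DOF stand-ins (parameters invented) and checks the result against direct assembly; the paper's spindle-holder-tool model is of course far richer (Timoshenko beams, bearing supports, matrix-valued receptances).

```python
import numpy as np

def sdof_receptance(m, k, w):
    """Drive-point receptance 1/(k - m w^2) of an undamped 1-DOF substructure."""
    return 1.0 / (k - m * w**2)

def couple(Ha, Hb):
    """Rigid receptance coupling at one coordinate: G = (Ha^-1 + Hb^-1)^-1."""
    return 1.0 / (1.0 / Ha + 1.0 / Hb)

w = np.linspace(1.0, 190.0, 2000)        # rad/s, kept below the first pole
Ha = sdof_receptance(2.0, 8.0e4, w)      # spindle-side stand-in (invented)
Hb = sdof_receptance(0.5, 5.0e4, w)      # holder/tool-side stand-in (invented)
G = couple(Ha, Hb)

# Consistency check: rigidly joining the two grounded 1-DOF systems at their
# masses simply adds masses and stiffnesses, so the assembled receptance is:
G_direct = sdof_receptance(2.5, 1.3e5, w)
```

The practical appeal, as in the paper, is that substructure receptances can be swapped (different holders, different tool overhangs) without re-solving the whole assembly.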
Critical success factors for achieving superior m-health success.
Dwivedi, A; Wickramasinghe, N; Bali, R K; Naguib, R N G
2007-01-01
Recent healthcare trends clearly show significant investment by healthcare institutions into various types of wired and wireless technologies to facilitate and support superior healthcare delivery. This trend has been spurred by the shift in the concept and growing importance of the role of health information and the influence of fields such as bio-informatics, biomedical and genetic engineering. The demand is currently for integrated healthcare information systems; however, for such initiatives to be successful, it is necessary to adopt a macro model and appropriate methodology with respect to wireless initiatives. The key contribution of this paper is the presentation of one such integrative model for mobile health (m-health) known as the Wi-INET Business Model, along with a detailed Adaptive Mapping to Realisation (AMR) methodology. The AMR methodology details how the Wi-INET Business Model can be implemented. Further validation of the concepts detailed in the Wi-INET Business Model and the AMR methodology is offered via a short vignette on a toolkit based on a leading UK-based healthcare information technology solution.
Stress estimation in reservoirs using an integrated inverse method
NASA Astrophysics Data System (ADS)
Mazuyer, Antoine; Cupillard, Paul; Giot, Richard; Conin, Marianne; Leroy, Yves; Thore, Pierre
2018-05-01
Estimating the stress in reservoirs and their surroundings prior to production is a key issue for reservoir management planning. In this study, we propose an integrated inverse method to estimate this initial stress state. The 3D stress state is constructed with the displacement-based finite element method, assuming linear isotropic elasticity and small perturbations in the current geometry of the geological structures. The Neumann boundary conditions are defined as piecewise linear functions of depth. The discontinuous functions are determined with the CMA-ES (Covariance Matrix Adaptation Evolution Strategy) optimization algorithm to fit wellbore stress data deduced from leak-off tests and breakouts. The disregard of the geological history and the simplified rheological assumptions mean that only a stress field that is statically admissible and matches the wellbore data should be exploited. The spatial domain of validity of this statement is assessed by comparing the stress estimations for a synthetic folded structure of finite amplitude with a history constructed assuming a viscous response.
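The inversion fits boundary-condition parameters by minimizing a misfit to wellbore stress data. As a sketch of that loop, the code below uses a bare-bones (mu, lambda) evolution strategy with geometric step-size decay as a stand-in for CMA-ES (which additionally adapts the full search covariance) to recover a linear-in-depth stress profile from synthetic, invented data; in the paper each candidate would instead require a finite element solve.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic minimum-horizontal-stress "wellbore data", linear in depth.
# All values are invented for illustration, not taken from the paper.
depth = np.array([800.0, 1200.0, 1600.0, 2000.0, 2400.0])   # m
z0 = depth.mean()                                           # centre the depths
sigma_obs = 2.0 + 0.017 * depth                             # MPa, synthetic truth

def misfit(p):
    """Sum-of-squares mismatch between modelled and observed stresses."""
    a, b = p            # a: stress at z0 (MPa), b: gradient (MPa/m)
    return float(np.sum((a + b * (depth - z0) - sigma_obs) ** 2))

# Bare-bones (mu, lambda) evolution strategy: sample, select, recombine, cool.
mean = np.array([sigma_obs.mean(), 0.0])    # data-informed starting point
scale = np.array([1.0, 0.01])               # per-parameter step sizes
best_p, best_f = mean.copy(), misfit(mean)
for gen in range(300):
    pop = mean + scale * rng.standard_normal((16, 2))   # lambda = 16 candidates
    fvals = np.array([misfit(p) for p in pop])
    mean = pop[np.argsort(fvals)[:4]].mean(axis=0)      # recombine mu = 4 best
    if fvals.min() < best_f:
        best_f = fvals.min()
        best_p = pop[np.argmin(fvals)]
    scale = scale * 0.97                                # geometric cooling
a_hat, b_hat = best_p    # recovered mid-depth stress and gradient
```

A real CMA-ES run (e.g. via the pycma package) replaces the fixed cooling schedule with covariance and step-size adaptation, which matters when parameters are strongly correlated, as stress boundary conditions typically are.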
Supersymmetric and non-supersymmetric models without catastrophic Goldstone bosons
NASA Astrophysics Data System (ADS)
Braathen, Johannes; Goodsell, Mark D.; Staub, Florian
2017-11-01
The calculation of the Higgs mass in general renormalisable field theories has been plagued by the so-called "Goldstone Boson Catastrophe," where light (would-be) Goldstone bosons give infra-red divergent loop integrals. In supersymmetric models, previous approaches included a workaround that ameliorated the problem for most, but not all, parameter space regions; while giving divergent results everywhere for non-supersymmetric models! We present an implementation of a general solution to the problem in the public code SARAH, along with new calculations of some necessary loop integrals and generic expressions. We discuss the validation of our code in the Standard Model, where we find remarkable agreement with the known results. We then show new applications in Split SUSY, the NMSSM, the Two-Higgs-Doublet Model, and the Georgi-Machacek model. In particular, we take some first steps to exploring where the habit of using tree-level mass relations in non-supersymmetric models breaks down, and show that the loop corrections usually become very large well before naive perturbativity bounds are reached.
Fluid-structure interaction of turbulent boundary layer over a compliant surface
NASA Astrophysics Data System (ADS)
Anantharamu, Sreevatsa; Mahesh, Krishnan
2016-11-01
Turbulent flows induce unsteady loads on surfaces in contact with them, which affect material stresses, surface vibrations and far-field acoustics. We are developing a numerical methodology to study the coupled interaction of a turbulent boundary layer with the underlying surface. The surface is modeled as a linear elastic solid, while the fluid follows the spatially filtered incompressible Navier-Stokes equations. An incompressible Large Eddy Simulation finite volume flow approach based on the algorithm of Mahesh et al. is used in the fluid domain. The discrete kinetic-energy-conserving property of the method ensures robustness at high Reynolds number. The linear elastic model in the solid domain is integrated in space using the finite element method and in time using the Newmark time integration method. The fluid and solid domain solvers are coupled using both weak and strong coupling methods. Details of the algorithm, validation, and relevant results will be presented. This work is supported by NSWCCD, ONR.
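The solid-domain update mentioned above (finite elements in space, Newmark in time) can be sketched for a generic linear system M a + C v + K d = F(t). The average-acceleration variant (beta = 1/4, gamma = 1/2) shown here is unconditionally stable for linear problems; the matrices and the oscillator test case are illustrative, not the paper's model.

```python
import numpy as np

def newmark_linear(M, C, K, F, d0, v0, dt, nsteps, beta=0.25, gamma=0.5):
    """Newmark-beta integration of the linear system M a + C v + K d = F(t)."""
    d, v = d0.astype(float), v0.astype(float)
    a = np.linalg.solve(M, F(0.0) - C @ v - K @ d)   # consistent initial accel
    Keff = K + (gamma / (beta * dt)) * C + M / (beta * dt**2)
    hist = [d.copy()]
    for i in range(1, nsteps + 1):
        # effective load: new external force plus inertia/damping history terms
        rhs = (F(i * dt)
               + M @ (d / (beta * dt**2) + v / (beta * dt) + (0.5 / beta - 1.0) * a)
               + C @ ((gamma / (beta * dt)) * d + (gamma / beta - 1.0) * v
                      + dt * (0.5 * gamma / beta - 1.0) * a))
        d_new = np.linalg.solve(Keff, rhs)
        a_new = (d_new - d) / (beta * dt**2) - v / (beta * dt) - (0.5 / beta - 1.0) * a
        v = v + dt * ((1.0 - gamma) * a + gamma * a_new)
        d, a = d_new, a_new
        hist.append(d.copy())
    return np.array(hist)

# Illustrative check on an undamped oscillator with natural period 1 s:
M = np.array([[1.0]])
C = np.zeros((1, 1))
K = np.array([[(2.0 * np.pi) ** 2]])
hist = newmark_linear(M, C, K, lambda t: np.zeros(1),
                      np.array([1.0]), np.array([0.0]), dt=0.005, nsteps=400)
```

In a strongly coupled FSI scheme this solid update would sit inside a sub-iteration loop with the fluid solver, exchanging interface tractions and displacements until convergence at each time step.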
A computational workflow for designing silicon donor qubits
Humble, Travis S.; Ericson, M. Nance; Jakowski, Jacek; ...
2016-09-19
Developing devices that can reliably and accurately demonstrate the principles of superposition and entanglement is an on-going challenge for the quantum computing community. Modeling and simulation offer attractive means of testing early device designs and establishing expectations for operational performance. However, the complex integrated material systems required by quantum device designs are not captured by any single existing computational modeling method. We examine the development and analysis of a multi-staged computational workflow that can be used to design and characterize silicon donor qubit systems with modeling and simulation. Our approach integrates quantum chemistry calculations with electrostatic field solvers to perform detailed simulations of a phosphorus dopant in silicon. We show how atomistic details can be synthesized into an operational model for the logical gates that define quantum computation in this particular technology. In conclusion, the resulting computational workflow realizes a design tool for silicon donor qubits that can help verify and validate current and near-term experimental devices.
Mobile health: the power of wearables, sensors, and apps to transform clinical trials.
Munos, Bernard; Baker, Pamela C; Bot, Brian M; Crouthamel, Michelle; de Vries, Glen; Ferguson, Ian; Hixson, John D; Malek, Linda A; Mastrototaro, John J; Misra, Veena; Ozcan, Aydogan; Sacks, Leonard; Wang, Pei
2016-07-01
Mobile technology has become a ubiquitous part of everyday life, and the practical utility of mobile devices for improving human health is only now being realized. Wireless medical sensors, or mobile biosensors, are one such technology that is allowing the accumulation of real-time biometric data that may hold valuable clues for treating even some of the most devastating human diseases. From wearable gadgets to sophisticated implantable medical devices, the information retrieved from mobile technology has the potential to revolutionize how clinical research is conducted and how disease therapies are delivered in the coming years. Encompassing the fields of science and engineering, analytics, health care, business, and government, this report explores the promise that wearable biosensors, along with integrated mobile apps, hold for improving the quality of patient care and clinical outcomes. The discussion focuses on groundbreaking device innovation, data optimization and validation, commercial platform integration, clinical implementation and regulation, and the broad societal implications of using mobile health technologies. © 2016 New York Academy of Sciences.
A methodology for extending domain coverage in SemRep.
Rosemblat, Graciela; Shin, Dongwook; Kilicoglu, Halil; Sneiderman, Charles; Rindflesch, Thomas C
2013-12-01
We describe a domain-independent methodology to extend SemRep coverage beyond the biomedical domain. SemRep, a natural language processing application originally designed for biomedical texts, uses the knowledge sources provided by the Unified Medical Language System (UMLS©). Ontological and terminological extensions to the system are needed in order to support other areas of knowledge. We extended SemRep's application by developing a semantic representation of a previously unsupported domain. This was achieved by adapting well-known ontology engineering phases and integrating them with the UMLS knowledge sources on which SemRep crucially depends. While the process to extend SemRep coverage has been successfully applied in earlier projects, this paper presents in detail the step-wise approach we followed and the mechanisms implemented. A case study in the field of medical informatics illustrates how the ontology engineering phases have been adapted for optimal integration with the UMLS. We provide qualitative and quantitative results, which indicate the validity and usefulness of our methodology. Published by Elsevier Inc.
Toolboxes for a standardised and systematic study of glycans
2014-01-01
Background Recent progress in method development for characterising the branched structures of complex carbohydrates has now enabled higher throughput technology. Automation of structure analysis then calls for software development since adding meaning to large data collections in reasonable time requires corresponding bioinformatics methods and tools. Current glycobioinformatics resources do cover information on the structure and function of glycans, their interaction with proteins or their enzymatic synthesis. However, this information is partial, scattered and often difficult to find for non-glycobiologists. Methods Following our diagnosis of the causes of the slow development of glycobioinformatics, we review the "objective" difficulties encountered in defining adequate formats for representing complex entities and developing efficient analysis software. Results Various solutions already implemented and strategies defined to bridge glycobiology with different fields and integrate the heterogeneous glyco-related information are presented. Conclusions Despite the initial stage of our integrative efforts, this paper highlights the rapid expansion of glycomics, the validity of existing resources and the bright future of glycobioinformatics. PMID:24564482
Airborne Detection and Tracking of Geologic Leakage Sites
NASA Astrophysics Data System (ADS)
Jacob, Jamey; Allamraju, Rakshit; Axelrod, Allan; Brown, Calvin; Chowdhary, Girish; Mitchell, Taylor
2014-11-01
Safe storage of CO2 to reduce greenhouse gas emissions without adversely affecting energy use or hindering economic growth requires development of monitoring technology that is capable of validating storage permanence while ensuring the integrity of sequestration operations. Soil gas monitoring has difficulty accurately distinguishing gas flux signals related to leakage from those associated with meteorologically driven changes of soil moisture and temperature. Integrated ground and airborne monitoring systems are being deployed that are capable of directly detecting CO2 concentration in storage sites. Two complementary approaches to detecting leaks in carbon sequestration fields are presented. The first approach focuses on reducing the requisite network communication for fusing individual Gaussian Process (GP) CO2 sensing models into a global GP CO2 model. The GP fusion approach learns how to optimally allocate the static and mobile sensors. The second approach leverages a hierarchical GP-Sigmoidal Gaussian Cox Process for airborne predictive mission planning to optimally reduce the entropy of the global CO2 model. Results from the approaches will be presented.
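Fusing local GP CO2 models into a global model is often done by precision-weighted (product-of-experts style) combination of the local predictive distributions; the abstract does not state the exact fusion rule used, so treat the sketch below as one plausible fusion step with invented numbers.

```python
import numpy as np

def fuse_gp_predictions(means, variances):
    """Precision-weighted (product-of-experts) fusion of independent Gaussian
    predictions of the same latent CO2 field.

    means, variances : arrays of shape (n_models, n_points)
    Returns fused means and variances of shape (n_points,).
    """
    means = np.asarray(means, dtype=float)
    prec = 1.0 / np.asarray(variances, dtype=float)   # per-model precisions
    fused_var = 1.0 / prec.sum(axis=0)                # precisions add
    fused_mean = fused_var * (prec * means).sum(axis=0)
    return fused_mean, fused_var

# Two local models predicting CO2 (ppm) at three sites: a confident ground
# sensor model and a broader airborne model (all values illustrative).
mu = [[410.0, 415.0, 430.0],
      [412.0, 414.0, 420.0]]
var = [[1.0, 4.0, 25.0],
       [4.0, 1.0, 4.0]]
fused_mu, fused_var = fuse_gp_predictions(mu, var)
```

Because only per-point means and variances cross the network rather than full covariance matrices or raw observations, this style of fusion directly serves the communication-reduction goal described in the first approach.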
NREL Spectrum of Clean Energy Innovation (Brochure)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
2011-09-01
This brochure describes the NREL Spectrum of Clean Energy Innovation, which includes analysis and decision support, fundamental science, market relevant research, systems integration, testing and validation, commercialization and deployment. Through deep technical expertise and an unmatched breadth of capabilities, the National Renewable Energy Laboratory (NREL) leads an integrated approach across the spectrum of renewable energy innovation. From scientific discovery to accelerating market deployment, NREL works in partnership with private industry to drive the transformation of our nation's energy systems. NREL integrates the entire spectrum of innovation, including fundamental science, market relevant research, systems integration, testing and validation, commercialization, and deployment. Our world-class analysis and decision support informs every point on the spectrum. The innovation process at NREL is inter-dependent and iterative. Many scientific breakthroughs begin in our own laboratories, but new ideas and technologies may come to NREL at any point along the innovation spectrum to be validated and refined for commercial use.
ERIC Educational Resources Information Center
Perkmen, Serkan; Antonenko, Pavlo; Caracuel, Alfonso
2016-01-01
The main purpose of this study was to examine the validity of the Teacher Intentions to Integrate Technology in Education Scale using pre-service teacher samples from three countries on three continents--Turkey, Spain and the United States. Study participants were 550 pre-service teachers from three universities in Turkey, Spain and the USA (219,…
NASA Astrophysics Data System (ADS)
Scipioni, A.; Tagliaferri, F.
2009-04-01
The objective of this document is to define lines of development and distribution of services to support the detection, prevention and planning of protection of agricultural, forest and rural land against fire. The services will be a valid support for the Regional and National Administrations involved in agricultural and forestry activities (Ministry of Agricultural and Forestry Policies, National Forest Police, etc.), through the employment of the SIAN "National Agricultural Information System", the integrated national information system for the entire agriculture, forestry and fisheries Administration. The proposed services would be distributed through the GIS (Geographic Information System) of the SIAN; the GIS database is a single nation-wide digital graphic database consisting of: - Orthophotos: aerial images of approx. 45 km² each with a ground resolution of 50 cm; - Cadastral maps: land maps; - Thematic layers: land use and crop identification. The GIS services can take full advantage of the SIAN architectural model, designed for best integration and interoperability with other central and local public administration bodies, whose main features are: - integration of information from different sources; - maintenance of the internal coherence of any integrated information; - flexibility with respect to technical or organizational changes. The "innovative" services described below could be useful to support the development of the institutional tasks of public agencies and administrations (e.g. Regions or Civil Protection agencies), as provided for by D.Lgs. 173/98. Services supporting the management of wildland fires: the activities outlined below do not follow a linear, fixed temporal sequence, but are integrated dynamically over time. This guarantees not only the integrated use of the various information sources, but also the value of every product in terms of accuracy, coherence and timeliness of the information.
Description of the four main services proposed. • Rapid alert: identification and rapid location of fires, possibly while still in their starting phase, carried out using satellite data at high and very high revisit rates (every 15 minutes) to help organize a more effective response before the fire spreads; • Perimeter mapping of the burned area, with generation of polygons (at a scale compatible with cadastral maps and data) through photo-interpretation of spectral images, colour and infrared, at the highest resolution (50 cm), acquired through dedicated aerial missions planned during the summer season, replacing or complementing field surveys for large fires, zones difficult to reach, and isolated uneven areas (reference scale from 200 to 400 km²); • Validation activity: services for quality control and validation of the detailed detection and burned-area perimeter surveys, carried out through the employment and integration of data acquired from ground, aerial and satellite surveys in application of law 353/2000. Data are supplied to the municipalities, regions and prefectures for institutional adoption; • Damage statistics: services supporting the generation of statistics through analysis of damage and vegetation recovery in relation to forest type, using different observation platforms (satellite, aerial and ground) for temporal analysis.
U.S. Department of Energy's Regional Carbon Sequestration Partnership Program: Overview
Litynski, J.; Plasynski, S.; Spangler, L.; Finley, R.; Steadman, E.; Ball, D.; Nemeth, K.J.; McPherson, B.; Myer, L.
2009-01-01
The U.S. Department of Energy (DOE) has formed a nationwide network of seven regional partnerships to help determine the best approaches for capturing and permanently storing gases that can contribute to global climate change. The Regional Carbon Sequestration Partnerships (RCSPs) are tasked with determining the most suitable technologies, regulations, and infrastructure for carbon capture, transport, and storage in their areas of the country and parts of Canada. The seven partnerships include more than 350 state agencies, universities, national laboratories, private companies, and environmental organizations, spanning 42 states, two Indian nations, and four Canadian provinces. The Regional Partnerships initiative is being implemented in three phases: • Characterization Phase (2003-2005): The objective was to collect data on CO2 sources and sinks and develop the human capital to support and enable future carbon sequestration field tests and deployments. The completion of this phase was marked by release of the Carbon Sequestration Atlas of the United States and Canada, Version 1, which included a common methodology for capacity assessment and reported over 3,000 Gt of storage capacity in saline formations, depleted oil and gas fields, and coal seams. • Validation Phase (2005-2009): The objective is to plan and implement small-scale (<1 million tons CO2) field testing of storage technologies in areas determined to be favorable for carbon storage. The partnerships are currently conducting over 20 small-scale geologic field tests and 11 terrestrial field tests. • Development Phase (2008-2018): The primary objective is the development of large-scale (>1 million tons of CO2) Carbon Capture and Storage (CCS) projects, which will demonstrate that large volumes of CO2 can be injected safely, permanently, and economically into geologic formations representative of large storage capacity.
Even though the RCSP Program is being implemented in three phases, it should be viewed as an integrated whole, with many of the goals and objectives transitioning from one phase to the next. Accomplishments and results from the Characterization Phase have helped to refine goals and activities in the Validation and Deployment Phases. The RCSP Program encourages and requires open information sharing among its members by sponsoring both general workshops and meetings to facilitate information exchange. Although each RCSP has its own objectives and field tests, mutual cooperation has been an important part of the Program thus far. The primary goal of the RCSP initiative is to promote the development of a regional framework and the infrastructure necessary to validate and deploy carbon sequestration technologies within each Partnership's region. © 2009 Elsevier Ltd. All rights reserved.
Huang, Yanqi; He, Lan; Dong, Di; Yang, Caiyun; Liang, Cuishan; Chen, Xin; Ma, Zelan; Huang, Xiaomei; Yao, Su; Liang, Changhong; Tian, Jie; Liu, Zaiyi
2018-02-01
To develop and validate a radiomics prediction model for individualized prediction of perineural invasion (PNI) in colorectal cancer (CRC). After computed tomography (CT) radiomics features extraction, a radiomics signature was constructed in derivation cohort (346 CRC patients). A prediction model was developed to integrate the radiomics signature and clinical candidate predictors [age, sex, tumor location, and carcinoembryonic antigen (CEA) level]. Apparent prediction performance was assessed. After internal validation, independent temporal validation (separate from the cohort used to build the model) was then conducted in 217 CRC patients. The final model was converted to an easy-to-use nomogram. The developed radiomics nomogram that integrated the radiomics signature and CEA level showed good calibration and discrimination performance [Harrell's concordance index (c-index): 0.817; 95% confidence interval (95% CI): 0.811-0.823]. Application of the nomogram in validation cohort gave a comparable calibration and discrimination (c-index: 0.803; 95% CI: 0.794-0.812). Integrating the radiomics signature and CEA level into a radiomics prediction model enables easy and effective risk assessment of PNI in CRC. This stratification of patients according to their PNI status may provide a basis for individualized auxiliary treatment.
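The Harrell's c-index reported above measures how often the model ranks a patient with PNI above one without. For a binary endpoint it reduces to the fraction of case/control pairs ordered correctly by the predicted risk, with ties counted as half. A minimal sketch with invented risk scores (not data from the study):

```python
# Concordance index (c-index) for a binary outcome -- a standard
# definition; the risk scores and outcomes below are invented.
def c_index(risk, event):
    """Fraction of (case, control) pairs where the case gets the
    higher predicted risk; tied risks contribute 0.5."""
    pairs = conc = 0.0
    for i in range(len(event)):
        for j in range(len(event)):
            if event[i] == 1 and event[j] == 0:
                pairs += 1
                if risk[i] > risk[j]:
                    conc += 1
                elif risk[i] == risk[j]:
                    conc += 0.5
    return conc / pairs

risk  = [0.9, 0.7, 0.4, 0.3, 0.2]   # hypothetical nomogram outputs
event = [1,   0,   1,   0,   0]     # hypothetical PNI status
print(c_index(risk, event))         # 5 of 6 case/control pairs concordant
```

A c-index of 0.5 corresponds to random ranking and 1.0 to perfect discrimination, which is why the reported 0.817 and 0.803 indicate good performance.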
THE NEW YORK CITY URBAN DISPERSION PROGRAM MARCH 2005 FIELD STUDY: TRACER METHODS AND RESULTS.
DOE Office of Scientific and Technical Information (OSTI.GOV)
WATSON, T.B.; HEISER, J.; KALB, P.
The Urban Dispersion Program March 2005 Field Study tracer releases, sampling, and analytical methods are described in detail. There were two days on which tracer releases and sampling were conducted. A total of 16.0 g of six tracers was released during the first test day, or Intensive Observation Period (IOP) 1, and 15.7 g during IOP 2. Three types of sampling instruments were used in this study. Sequential air samplers (SAS) collected six-minute samples, while Brookhaven atmospheric tracer samplers (BATS) and personal air samplers (PAS) collected thirty-minute samples. There were a total of 1300 samples resulting from the two IOPs. Confidence limits in the sampling and analysis method were 20% as determined from 100 duplicate samples. The sample recovery rate was 84%. The integrally averaged 6-minute samples were compared to the 30-minute samples. The agreement was found to be good in most cases. The validity of using a background tracer to calculate sample volumes was examined and also found to have a confidence level of 20%. Methods for improving sampling and analysis are discussed. The data described in this report are available as Excel files. An additional Excel file of quality-assured tracer data for use in model validation efforts is also available. The file consists of extensively quality-assured BATS tracer data with background concentrations subtracted.
LANL* V2.0: global modeling and validation
NASA Astrophysics Data System (ADS)
Koller, J.; Zaharia, S.
2011-03-01
We describe in this paper the new version of LANL*. Like the previous version, V2.0 of LANL* is an artificial neural network (ANN) for calculating the magnetic drift invariant, L*, which is used for modeling radiation belt dynamics and for other space weather applications. We have implemented the following enhancements in the new version: (1) we have removed the limitation to geosynchronous orbit, and the model can now be used for any type of orbit; (2) the new version is based on the improved magnetic field model by Tsyganenko and Sitnov (2005) (TS05) instead of the older model by Tsyganenko et al. (2003). We have validated the model and compared our results to L* calculations with the TS05 model based on ephemerides for CRRES, Polar, GPS, a LANL geosynchronous satellite, and a virtual RBSP-type orbit. We find that the neural network performs very well for all these orbits, with an error typically ΔL* < 0.2, which corresponds to an error of 3% at geosynchronous orbit. This new LANL* V2.0 artificial neural network is orders of magnitude faster than traditional numerical field-line integration techniques with the TS05 model. It has applications to real-time radiation belt forecasting, analysis of data sets involving decades of satellite observations, and other problems in space weather.
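The speedup rests on a standard surrogate-modeling idea: train a small feed-forward network once on expensive field-line integrations, then evaluate it cheaply thereafter. The toy below mirrors that workflow on an invented target function; the architecture, data, and learning rate are illustrative and unrelated to the actual LANL* network.

```python
# Toy illustration (NOT the LANL* network): a small feed-forward net
# trained to emulate an expensive function. Inputs stand in for orbit
# and field-model parameters; the target stands in for L*.
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, (200, 2))        # invented inputs
y = np.sin(3 * X[:, 0]) * X[:, 1]       # invented "expensive" target

# one hidden layer with tanh activation
W1 = rng.normal(0, 0.5, (2, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, (16, 1)); b2 = np.zeros(1)

losses, lr = [], 0.05
for _ in range(2000):
    H = np.tanh(X @ W1 + b1)            # forward pass
    pred = (H @ W2 + b2).ravel()
    err = pred - y
    losses.append(np.mean(err**2))
    # backpropagation by hand (gradients of the MSE loss)
    gH = err[:, None] @ W2.T * (1 - H**2)
    W2 -= lr * H.T @ err[:, None] / len(X)
    b2 -= lr * err.mean()
    W1 -= lr * X.T @ gH / len(X)
    b1 -= lr * gH.mean(0)

print(f"MSE {losses[0]:.4f} -> {losses[-1]:.4f}")
```

Once trained, evaluating the network is a handful of matrix products, which is why such surrogates can be orders of magnitude faster than numerically integrating field lines for every query.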
NASA Astrophysics Data System (ADS)
Hank, Tobias B.; Bach, Heike; Danner, Martin; Hodrius, Martina; Mauser, Wolfram
2016-08-01
Nitrogen, being the basic element for the construction of plant proteins and pigments, is one of the most important production factors for agricultural cultivation. High-resolution and near real-time information on nitrogen status in the soil is thus of highest interest for economically and ecologically optimized fertilizer planning and application. Unfortunately, nitrogen storage in the soil column cannot be directly observed with Earth Observation (EO) instruments. Advanced EO-supported process modelling approaches therefore must be applied that allow tracing the spatiotemporal dynamics of nitrogen transformation, translocation and transport in the soil and in the canopy. Before these models can be applied as decision support tools for smart farming, they must be carefully parameterized and validated. This study applies an advanced land surface process model (PROMET) to selected winter cereal fields in Southern Germany and correlates the model outputs to destructively sampled nitrogen data from the growing season of 2015 (17 sampling dates, 8 sample locations). The spatial parametrization of the process model is thereby supported by assimilating eight satellite images (5 Landsat 8 OLI and 3 RapidEye acquisitions). It was found that the model is capable of realistically tracing the temporal and spatial dynamics of aboveground nitrogen uptake and allocation (R² = 0.84, RMSE = 31.3 kg ha⁻¹).
Alvarez, D.A.; Petty, J.D.; Huckins, J.N.; Jones-Lepp, T. L.; Getting, D.T.; Goddard, J.P.; Manahan, S.E.
2004-01-01
Increasingly it is being realized that a holistic hazard assessment of complex environmental contaminant mixtures requires data on the concentrations of hydrophilic organic contaminants, including new-generation pesticides, pharmaceuticals, personal care products, and many chemicals associated with household, industrial, and agricultural wastes. To address this issue, we developed a passive in situ sampling device (the polar organic chemical integrative sampler [POCIS]) that integratively concentrates trace levels of complex mixtures of hydrophilic environmental contaminants, enables the determination of their time-weighted average water concentrations, and provides a method of estimating the potential exposure of aquatic organisms to the complex mixture of waterborne contaminants. Using a prototype sampler, linear uptake of selected herbicides and pharmaceuticals with log Kow < 4.0 was observed for up to 56 d. Estimation of the ambient water concentrations of chemicals of interest is achieved by using appropriate uptake models and determination of POCIS sampling rates for appropriate exposure conditions. Use of POCIS in field validation studies targeting the herbicide diuron in the United Kingdom resulted in the detection of the chemical at estimated concentrations of 190 to 600 ng/L. These values are in agreement with reported levels found in traditional grab samples taken concurrently.
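The uptake model behind the time-weighted average (TWA) estimate is, during the linear-uptake phase, C_w = N / (R_s · t), where N is the analyte mass accumulated by the sampler, R_s its chemical-specific sampling rate, and t the deployment time. A minimal sketch with invented numbers (the sampling rate and accumulated mass below are not from the study):

```python
# TWA water concentration from a passive sampler, assuming the device
# is still in its linear (integrative) uptake phase. Inputs invented.
def twa_concentration(mass_ng, sampling_rate_l_per_day, days):
    """TWA concentration (ng/L) = accumulated mass / (R_s * t)."""
    return mass_ng / (sampling_rate_l_per_day * days)

# e.g. 2400 ng accumulated at an assumed R_s = 0.2 L/day over 28 days
print(twa_concentration(2400, 0.2, 28))  # ~428.6 ng/L
```

A result in this range would sit inside the 190 to 600 ng/L interval reported for diuron, illustrating how sampling rates calibrated for the exposure conditions turn accumulated mass into an ambient concentration estimate.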
NASA Astrophysics Data System (ADS)
Zhang, S. Q.; Li, H. N.; Schmidt, R.; Müller, P. C.
2014-02-01
Thin-walled piezoelectric integrated smart structures are easily excited to vibrate by unknown disturbances. In order to design and simulate a control strategy, firstly, an electro-mechanically coupled dynamic finite element (FE) model of smart structures is developed based on first-order shear deformation (FOSD) hypothesis. Linear piezoelectric constitutive equations and the assumption of constant electric field through the thickness are considered. Based on the dynamic FE model, a disturbance rejection (DR) control with proportional-integral (PI) observer using step functions as the fictitious model of disturbances is developed for vibration suppression of smart structures. In order to achieve a better dynamic behavior of the fictitious model of disturbances, the PI observer is extended to generalized proportional-integral (GPI) observer, in which sine or polynomial functions can be used to represent disturbances resulting in better dynamics. Therefore the disturbances can be estimated either by PI or GPI observer, and then the estimated signals are fed back to the controller. The DR control is validated by various kinds of unknown disturbances, and compared with linear-quadratic regulator (LQR) control. The results illustrate that the vibrations are better suppressed by the proposed DR control.
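The core of the PI-observer idea can be shown on a first-order toy plant: augment the state with a step-disturbance state (d' = 0, matching the "step functions as the fictitious model of disturbances" above), run a Luenberger observer on the augmented system, and feed the disturbance estimate back. Everything below (plant, gains, time step) is invented for illustration; the paper applies the scheme to a coupled FE model of a smart structure.

```python
# Minimal numeric sketch of a PI (proportional-integral) disturbance
# observer on a scalar plant x' = a*x + u + d with unknown constant d.
# The disturbance is modeled as a step (d' = 0) and estimated online.
import numpy as np

dt, a, d_true = 0.01, -1.0, 0.5          # plant and invented disturbance
x, xh, dh = 0.0, 0.0, 0.0                # true state, state/disturbance estimates
L = np.array([5.0, 40.0])                # invented observer gains

for _ in range(3000):
    u = -dh                              # cancel the estimated disturbance
    x += dt * (a * x + u + d_true)       # true plant (Euler step)
    e = x - xh                           # output error (full state measured)
    xh += dt * (a * xh + u + dh + L[0] * e)   # proportional correction
    dh += dt * (L[1] * e)                # integral action estimates d

print(round(dh, 3))                      # converges toward d_true = 0.5
```

The error dynamics here have stable eigenvalues, so the disturbance estimate converges and the feedback u = -dh rejects it; the GPI extension mentioned above swaps the step model for sine or polynomial disturbance models to track richer signals.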
Investigating Integration Capabilities Between Ifc and Citygml LOD3 for 3d City Modelling
NASA Astrophysics Data System (ADS)
Floros, G.; Pispidikis, I.; Dimopoulou, E.
2017-10-01
Smart cities are applied to an increasing number of application fields. This evolution, though, urges data collection and integration, hence major issues arise that need to be tackled. One of the most important challenges is the heterogeneity of collected data, especially if those data derive from different standards and vary in terms of geometry, topology and semantics. Another key challenge is the efficient analysis and visualization of spatial data, which, due to the complexity of physical reality in the modern world, 2D GIS struggles to cope with. So, in order to facilitate data analysis and enhance the role of smart cities, the 3rd dimension needs to be implemented. Standards such as CityGML and IFC fulfill that necessity, but they present major differences in their schemas that render their integration a challenging task. This paper focuses on addressing those differences, examining up-to-date research work and investigating an alternative methodology to bridge the gap between these standards. Within this framework, a generic IFC model is generated and converted to a CityGML model, which is validated and evaluated for its geometrical correctness and semantic coherence. General results as well as future research considerations are presented.
A DBMS architecture for global change research
NASA Astrophysics Data System (ADS)
Hachem, Nabil I.; Gennert, Michael A.; Ward, Matthew O.
1993-08-01
The goal of this research is the design and development of an integrated system for the management of very large scientific databases, cartographic/geographic information processing, and exploratory scientific data analysis for global change research. The system will represent both spatial and temporal knowledge about natural and man-made entities on the earth's surface, following an object-oriented paradigm. A user will be able to derive, modify, and apply procedures to perform operations on the data, including comparison, derivation, prediction, validation, and visualization. This work represents an effort to extend database technology with an intrinsic class of operators, which is extensible and responds to the growing needs of scientific research. Of significance is the integration of many diverse forms of data into the database, including cartography, geography, hydrography, hypsography, images, and urban planning data. Equally important is the maintenance of metadata, that is, data about the data, such as coordinate transformation parameters, map scales, and audit trails of previous processing operations. This project will impact the fields of geographical information systems and global change research as well as the database community. It will provide an integrated database management testbed for scientific research, and a testbed for the development of analysis tools to understand and predict global change.
NASA GPM GV Science Implementation
NASA Technical Reports Server (NTRS)
Petersen, W. A.
2009-01-01
Pre-launch algorithm development & post-launch product evaluation: The GPM GV paradigm moves beyond traditional direct validation/comparison activities by incorporating improved algorithm physics & model applications (end-to-end validation) in the validation process. Three approaches: 1) National Network (surface): Operational networks to identify and resolve first-order discrepancies (e.g., bias) between satellite and ground-based precipitation estimates. 2) Physical Process (vertical column): Cloud system and microphysical studies geared toward testing and refinement of physically-based retrieval algorithms. 3) Integrated (4-dimensional): Integration of satellite precipitation products into coupled prediction models to evaluate strengths/limitations of those products.
NASA Astrophysics Data System (ADS)
Helma, H.; Mirna, M.; Edizon, E.
2018-04-01
Mathematics is often applied in physics, chemistry, economics, engineering, and other fields, and is also used in everyday life. Learning mathematics in school should therefore be associated with other sciences and with everyday life. In this way, the learning of mathematics is more realistic, interesting, and meaningful. A needs analysis shows that contextual mathematics teaching materials that integrate related sciences and realistic contexts are required for learning mathematics. The purpose of this research is to produce a valid and practical contextual mathematics teaching material that integrates related sciences and realistic contexts. This research is development research. The result is a valid and practical contextual mathematics teaching material that integrates related sciences and realistic contexts.
The role of non-governmental organizations in the social and the health system.
Piotrowicz, Maria; Cianciara, Dorota
2013-01-01
The article presents the definitions, objectives, fields and tasks of non-governmental organizations in social life, the health system and health policy. In addition, the article addresses the issue of effectiveness and quality of NGOs' activity. The term "NGOs" (Non-governmental Organizations) includes different categories of entities that operate not to obtain financial gain, and also do not belong to the government sector. Non-governmental organizations' fields of activity were described in the International Classification of Non-Profit Organizations (ICNPO). NGOs are an integral part of a democratic society. Sociological sciences emphasize their importance in enhancing social integration, implementation of the principle of subsidiarity, building civil society, social dialogue and participatory democracy. The main tasks of NGOs in the health system are providing services and health advocacy. Provision of services includes medical, social and psychological services as well as integration activities, care and nursing, material and financial support, educational and information services and training. Health advocacy is a combination of individual and social actions designed to gain political commitment, policy support, social acceptance and systems support for a particular health goal or program. An important task carried out by NGOs is participation in the formation of health policy. The increasing role of NGOs in providing social services and the participation in political processes result in the need to confirm the validity and credibility of their operation. One of the ways could be to introduce mechanisms to assess quality and efficiency, such as registration as a part of a legal system, self-regulatory activities (card rules, codes of ethics), certification, participation in networks, monitoring and audit.
Uncovering the structure of (super)conformal field theories
NASA Astrophysics Data System (ADS)
Liendo, Pedro
Conformal field theories (CFTs) are of central importance in modern theoretical physics, with applications that range from condensed matter physics to particle theory phenomenology. In this Ph.D. thesis we study CFTs from two somewhat orthogonal (but complementary) points of view. In the first approach we concentrate our efforts on two specific examples: the Veneziano limit of N = 2 and N = 1 superconformal QCD. The addition of supersymmetry makes these theories amenable to analytical analysis. In particular, we use the correspondence between single-trace operators and states of a spin chain to study the integrability properties of each theory. Our results indicate that these theories are not completely integrable, but they do contain some subsectors in which integrability might hold. In the second approach, we consider the so-called "bootstrap program", which is the ambitious idea that the restrictions imposed by conformal symmetry (crossing symmetry in particular) are so powerful that, starting from a few basic assumptions, one should be able to fix the form of a theory. In this thesis we apply bootstrap techniques to CFTs in the presence of a boundary. We study two-point functions using analytical and numerical methods. One-loop results were re-obtained from crossing symmetry alone, and a variety of numerical bounds for conformal dimensions of operators were obtained. These bounds are quite general and valid for any CFT in the presence of a boundary, in contrast to our first approach, where a specific set of theories was studied. A natural continuation of this work is to apply bootstrap techniques to supersymmetric theories. Some preliminary results along these lines are presented.
Bi-centenary of successes of Fourier theorem: its power and limitations in optical system designs
NASA Astrophysics Data System (ADS)
Roychoudhuri, Chandrasekhar
2007-09-01
We celebrate the two hundred years of successful use of the Fourier theorem in optics. However, there is a great enigma associated with the Fourier transform integral. It is one of the most pervasively productive and useful tools of physics and optics because its foundation is based on the superposition of harmonic functions, and yet we have never declared it a principle of physics, for valid reasons. Yet there are a good number of situations where we pretend it is equivalent to the superposition principle of physics, creating epistemological problems of enormous magnitude. The purpose of the paper is to elucidate these problems while underscoring the successes and the elegance of the Fourier theorem, which are not explicitly discussed in the literature. We make our point by taking six major engineering fields of optics and showing in each case why it works and under what restricted conditions, by bringing in the relevant physics principles. The fields are (i) optical signal processing, (ii) Fourier transform spectrometry, (iii) classical spectrometry of pulsed light, (iv) coherence theory, (v) laser mode locking and (vi) pulse broadening. We underscore that mathematical Fourier frequencies, not being physical frequencies, cannot generate real physical effects on our detectors. Appreciation of this fundamental issue will open up ways to be innovative in many new optical instrument designs. We underscore the importance of always validating our design platforms based on valid physics principles (actual processes occurring in nature) captured by an appropriate hypothesis based on diverse observations. This paper is a comprehensive view of the power and limitations of the Fourier transform, summarizing a series of SPIE conference papers presented during 2003-2007.
Integrating Validity Theory with Use of Measurement Instruments in Clinical Settings
Kelly, P Adam; O'Malley, Kimberly J; Kallen, Michael A; Ford, Marvella E
2005-01-01
Objective To present validity concepts in a conceptual framework useful for research in clinical settings. Principal Findings We present a three-level decision rubric for validating measurement instruments, to guide health services researchers step-by-step in gathering and evaluating validity evidence within their specific situation. We address construct precision, the capacity of an instrument to measure constructs it purports to measure and differentiate from other, unrelated constructs; quantification precision, the reliability of the instrument; and translation precision, the ability to generalize scores from an instrument across subjects from the same or similar populations. We illustrate with specific examples, such as an approach to validating a measurement instrument for veterans when prior evidence of instrument validity for this population does not exist. Conclusions Validity should be viewed as a property of the interpretations and uses of scores from an instrument, not of the instrument itself: how scores are used and the consequences of this use are integral to validity. Our advice is to liken validation to building a court case, including discovering evidence, weighing the evidence, and recognizing when the evidence is weak and more evidence is needed. PMID:16178998
Brunckhorst, Oliver; Shahid, Shahab; Aydin, Abdullatif; McIlhenny, Craig; Khan, Shahid; Raza, Syed Johar; Sahai, Arun; Brewin, James; Bello, Fernando; Kneebone, Roger; Khan, Muhammad Shamim; Dasgupta, Prokar; Ahmed, Kamran
2015-09-01
Current training modalities within ureteroscopy have been extensively validated and must now be integrated within a comprehensive curriculum. Additionally, non-technical skills often cause surgical error and little research has been conducted to combine this with technical skills teaching. This study therefore aimed to develop and validate a curriculum for semi-rigid ureteroscopy, integrating both technical and non-technical skills teaching within the programme. Delphi methodology was utilised for curriculum development and content validation, with a randomised trial then conducted (n = 32) for curriculum evaluation. The developed curriculum consisted of four modules; initially developing basic technical skills and subsequently integrating non-technical skills teaching. Sixteen participants underwent the simulation-based curriculum and were subsequently assessed, together with the control cohort (n = 16) within a full immersion environment. Both technical (Time to completion, OSATS and a task specific checklist) and non-technical (NOTSS) outcome measures were recorded with parametric and non-parametric analyses used depending on the distribution of our data as evaluated by a Shapiro-Wilk test. Improvements within the intervention cohort demonstrated educational value across all technical and non-technical parameters recorded, including time to completion (p < 0.01), OSATS scores (p < 0.001), task specific checklist scores (p = 0.011) and NOTSS scores (p < 0.001). Content validity, feasibility and acceptability were all demonstrated through curriculum development and post-study questionnaire results. The current developed curriculum demonstrates that integrating both technical and non-technical skills teaching is both educationally valuable and feasible. Additionally, the curriculum offers a validated simulation-based training modality within ureteroscopy and a framework for the development of other simulation-based programmes.
Asadi-Lari, Mohsen; Ahmadi Pishkuhi, Mahin; Almasi-Hashiani, Amir; Safiri, Saeid; Sepidarkish, Mahdi
2015-07-01
Developing a tool for measuring patients' needs is a vital step in the process of cancer treatment and research. In recent years, the European Organization for Research and Treatment of Cancer (EORTC) developed a questionnaire to measure the information received by cancer patients. Since the validity and reliability of any instrument should be evaluated in a new environment and culture, the aim of this study was to assess the validity and reliability of the EORTC QLQ-INFO25 in Iranian cancer patients. One hundred seventy-three patients with different stages of cancer completed the EORTC QLQ-INFO25, EORTC QLQ-C30, and EORTC IN-PATSAT32 questionnaires. Twenty-five patients answered the questionnaire twice at an interval of 2 weeks. Reliability and validity of the questionnaire were measured by Cronbach's alpha, intraclass correlation, test-retest, inter-rater agreement (IRA), and exploratory factor analyses. Using a conservative approach, the IRA for the overall relevancy and clarity of the tool was 87.86% and 83.33%, respectively. Overall appropriateness and clarity were 94.13% and 91.87%, respectively. Overall integrity of the instrument was determined to be 85%. Cronbach's alpha coefficients for all domains and the total inventory were above 0.70 and 0.90, respectively. The intraclass correlation index ranged between 0.708 and 0.965. Exploratory factor analyses identified six fields suitable for the instrument. Correlation between areas of the EORTC QLQ-INFO25 and EORTC IN-PATSAT32 questionnaires represents the convergent validity of the questionnaire. Also, results show a standard divergent validity in all domains of the questionnaire (Rho < 0.3). Low correlation between the areas of the EORTC QLQ-INFO25 and EORTC QLQ-C30 questionnaires (<0.3) demonstrates the divergent validity of the questionnaire. The results showed that the Persian version of the EORTC QLQ-INFO25 is a reliable and valid instrument for measuring the perception of information in cancer patients.
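Cronbach's alpha, used above for internal consistency, is a standard formula: alpha = k/(k-1) · (1 − Σ item variances / total-score variance), where k is the number of items. A minimal sketch on an invented item matrix (rows = respondents, columns = questionnaire items):

```python
# Cronbach's alpha for internal-consistency reliability.
# The score matrix below is invented purely for illustration.
import numpy as np

def cronbach_alpha(items):
    """alpha = k/(k-1) * (1 - sum of item variances / total variance)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                              # number of items
    item_vars = items.var(axis=0, ddof=1).sum()     # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)       # variance of total scores
    return k / (k - 1) * (1 - item_vars / total_var)

scores = [[3, 4, 3], [2, 2, 3], [4, 5, 5], [1, 2, 2], [3, 3, 4]]
print(round(cronbach_alpha(scores), 3))
```

Values above about 0.70 are conventionally read as acceptable reliability and above 0.90 as excellent, which is the benchmark the reported coefficients clear.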
A review of the available urology skills training curricula and their validation.
Shepherd, William; Arora, Karan Singh; Abboudi, Hamid; Shamim Khan, Mohammed; Dasgupta, Prokar; Ahmed, Kamran
2014-01-01
The transforming field of urological surgery continues to demand development of novel training devices and curricula for its trainees. Contemporary trainees have to balance workplace demands while overcoming the cognitive barriers of acquiring skills in rapidly multiplying and advancing surgical techniques. This article provides a brief review of the process involved in developing a surgical curriculum and the current status of real and simulation-based curricula in the 4 subgroups of urological surgical practice: open, laparoscopic, endoscopic, and robotic. An informal literature review was conducted to provide a snapshot of the variety of simulation training tools available for technical and nontechnical urological surgical skills within all subgroups of urological surgery, using the following keywords: "urology, surgery, training, curriculum, validation, non-technical skills, technical skills, LESS, robotic, laparoscopy, animal models." Validated training tools explored in research were tabulated and summarized. A total of 20 studies exploring validated training tools were identified. Considerable variation was noted in the types of validity sought by researchers, and incorporation of these tools into curricula was suboptimal across the subgroups of urological surgery. The following key recommendations emerge from the review: adoption of simulation-based curricula in training; better integration of dedicated training time in simulated environments within a trainee's working hours; better incentivization for educators and assessors to improvise, research, and deliver teaching using the technologies available; and continued emphasis on developing nontechnical skills in tandem with technical operative skills. © 2013 Association of Program Directors in Surgery.
Systematic review of serious games for medical education and surgical skills training.
Graafland, M; Schraagen, J M; Schijven, M P
2012-10-01
The application of digital games for training medical professionals is on the rise. So-called 'serious' games are training tools that provide a challenging simulated environment, ideal for future surgical training. Ultimately, serious games are directed at reducing medical error and subsequent healthcare costs. The aim was to review current serious games for training medical professionals and to evaluate the validity testing of such games. PubMed, Embase, the Cochrane Database of Systematic Reviews, PsycINFO and CINAHL were searched using predefined inclusion criteria for available studies up to April 2012. The primary endpoint was validation according to current criteria. A total of 25 articles were identified, describing a total of 30 serious games. The games were divided into two categories: those developed for specific educational purposes (17) and commercial games also useful for developing skills relevant to medical personnel (13). Pooling of data was not performed owing to the heterogeneity of study designs and serious games. Six serious games were identified that had undergone a process of validation. Of these six, three games were developed for team training in critical care and triage, and three were commercially available games applied to train laparoscopic psychomotor skills. None of the serious games had completed a full validation process for the purpose of use. Blended and interactive learning by means of serious games may be applied to train both technical and non-technical skills relevant to the surgical field. Games developed or used for this purpose need validation before integration into surgical teaching curricula. Copyright © 2012 British Journal of Surgery Society Ltd. Published by John Wiley & Sons, Ltd.
On space of integrable quantum field theories
Smirnov, F. A.; Zamolodchikov, A. B.
2016-12-21
Here, we study deformations of 2D Integrable Quantum Field Theories (IQFT) which preserve integrability (the existence of infinitely many local integrals of motion). The IQFTs are understood as "effective field theories", with finite ultraviolet cutoff. We show that for any such IQFT there are infinitely many integrable deformations generated by scalar local fields X_s, which are in one-to-one correspondence with the local integrals of motion; moreover, the scalars X_s are built from the components of the associated conserved currents in a universal way. The first of these scalars, X_1, coincides with the composite field (TT̄) built from the components of the energy-momentum tensor. The deformations of quantum field theories generated by X_1 are "solvable" in a certain sense, even if the original theory is not integrable. In a massive IQFT the deformations X_s are identified with the deformations of the corresponding factorizable S-matrix via the CDD factor. The situation is illustrated by explicit construction of the form factors of the operators X_s in sine-Gordon theory. Lastly, we also make some remarks on the problem of UV completeness of such integrable deformations.
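For reference, the composite field appearing above (rendered illegibly by the database extraction) is Zamolodchikov's point-split TT̄ operator; in standard notation (conventions assumed here, not quoted from the paper):

```latex
(T\bar{T})(x) \;=\; \lim_{y \to x}\Big( T(x)\,\bar{T}(y) \;-\; \Theta(x)\,\Theta(y) \Big),
\qquad
T = T_{zz},\quad \bar{T} = T_{\bar{z}\bar{z}},\quad \Theta = T_{z\bar{z}},
```

built from the components of the energy-momentum tensor, with the limit well defined up to total derivatives.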
NASA Astrophysics Data System (ADS)
Campbell, J. L.; Burrows, S.; Gower, S. T.; Cohen, W. B.
1999-09-01
The BigFoot Project is funded by the Earth Science Enterprise to collect and organize data to be used in the EOS Validation Program. The data collected by the BigFoot Project are unique in being ground-based observations coincident with satellite overpasses. In addition to collecting data, the BigFoot project will develop and test new algorithms for scaling point measurements to the same spatial scales as the EOS satellite products. This BigFoot Field Manual will be used to achieve completeness and consistency of data collected at four initial BigFoot sites and at future sites that may collect similar validation data. Therefore, validation datasets submitted to the ORNL DAAC that have been compiled in a manner consistent with the field manual will be especially valuable in the validation program.
Computation of Sound Propagation by Boundary Element Method
NASA Technical Reports Server (NTRS)
Guo, Yueping
2005-01-01
This report documents the development of a Boundary Element Method (BEM) code for the computation of sound propagation in uniform mean flows. The basic formulation and implementation follow the standard BEM methodology; the convective wave equation and the boundary conditions on the surfaces of the bodies in the flow are formulated into an integral equation, and the method of collocation is used to discretize this equation into a matrix equation to be solved numerically. New features discussed here include the formulation of the additional terms due to the effects of the mean flow and the treatment of the numerical singularities in the collocation implementation. The mean-flow effects introduce terms in the integral equation that contain the gradients of the unknown, which is undesirable: treating the gradients as additional unknowns greatly increases the size of the matrix equation, while approximating them by numerical differentiation introduces numerical error into the computation. It is shown that these terms can be reformulated in terms of the unknown itself, making the integral equation very similar to the no-flow case and simple for numerical implementation. To avoid the asymptotic analysis conventionally used to treat numerical singularities in the method of collocation, we perform the surface integrations in the integral equation using sub-triangles, so that the field point never coincides with the evaluation points on the surfaces. This simplifies the formulation and greatly facilitates the implementation. To validate the method and the code, three canonical problems are studied: sound scattering by a sphere, sound reflection by a plate in uniform mean flow, and sound propagation over a hump of irregular shape in uniform flow.
The first two have analytical solutions and the third is solved by the method of Computational Aeroacoustics (CAA); these solutions are used for comparison with the BEM results. The comparisons show very good agreement and validate the accuracy of the BEM approach implemented here.
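For orientation, in the no-flow limit the integral equation described above reduces to the standard Helmholtz boundary integral equation (a generic textbook form, not the report's convective formulation; the mean-flow terms modify the kernels):

```latex
c(\mathbf{x})\,\phi(\mathbf{x})
  = \int_{S}\left[ G(\mathbf{x},\mathbf{y})\,\frac{\partial \phi}{\partial n}(\mathbf{y})
  - \phi(\mathbf{y})\,\frac{\partial G}{\partial n}(\mathbf{y}) \right] dS(\mathbf{y}),
\qquad
G(\mathbf{x},\mathbf{y}) = \frac{e^{ikr}}{4\pi r},\quad r = |\mathbf{x}-\mathbf{y}|,
```

with c(x) = 1/2 for x on a smooth portion of S; collocating this equation at the surface nodes yields the matrix equation that is solved numerically.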
González-Soltero, Rocío; Learte, Ana Isabel R; Sánchez, Ana Mª; Gal, Beatriz
2017-11-29
Establishing innovative teaching programs in biomedical education involves dealing with several national and supra-national (i.e. European) regulations as well as with new pedagogical and demographic demands. We aimed to develop and validate a suitable instrument to integrate activities across preclinical years in all Health Science Degrees while meeting requirements of national quality agencies. The new approach was conceived at two different levels: first, we identified potentially integrative units from different fields according to national learning goals established for each preclinical year (national quality agency regulations). Secondly, we implemented a new instrument that combines active methodologies in Work Station Learning Activities (WSLA), using clinical scenarios as a guiding common thread to instruct students from an integrated perspective. We evaluated students' perception through a Likert-type survey of a total of 118 students enrolled in the first year of the Bachelor's Degree in Medicine. Our model of integrated activities through WSLA is feasible, scalable and manageable with large groups of students and a minimum number of instructors, two major limitations in many medical schools. Students' perception of WSLA was positive in overall terms. Seventy-nine percent of participants stated that WSLA sessions were more useful than non-integrated activities. Eighty-three percent confirmed that the WSLA methodology was effective at integrating concepts covered by different subjects. The WSLA approach is a flexible and scalable instrument for moving towards integrated curricula, and it can be successfully adapted to teach basic subjects in preclinical years of Health Science degrees. WSLA can be applied to large groups of students in a variety of contexts or environments using clinical cases as connecting threads.
Refaat, Tamer F; Singh, Upendra N; Yu, Jirong; Petros, Mulugeta; Remus, Ruben; Ismail, Syed
2016-05-20
Field experiments were conducted to test and evaluate the initial atmospheric carbon dioxide (CO2) measurement capability of airborne, high-energy, double-pulsed, 2-μm integrated path differential absorption (IPDA) lidar. This IPDA was designed, integrated, and operated at the NASA Langley Research Center on-board the NASA B-200 aircraft. The IPDA was tuned to the CO2 strong absorption line at 2050.9670 nm, which is the optimum for lower tropospheric weighted column measurements. Flights were conducted over land and ocean under different conditions. The first validation experiments of the IPDA for atmospheric CO2 remote sensing, focusing on low surface reflectivity oceanic surface returns during full day background conditions, are presented. In these experiments, the IPDA measurements were validated by comparison to airborne flask air-sampling measurements conducted by the NOAA Earth System Research Laboratory. IPDA performance modeling was conducted to evaluate measurement sensitivity and bias errors. The IPDA signals and their variation with altitude compare well with predicted model results. In addition, off-off-line testing was conducted, with fixed instrument settings, to evaluate the IPDA systematic and random errors. Analysis shows an altitude-independent differential optical depth offset of 0.0769. Optical depth measurement uncertainty of 0.0918 compares well with the predicted value of 0.0761. IPDA CO2 column measurement compares well with model-driven, near-simultaneous air-sampling measurements from the NOAA aircraft at different altitudes. With a 10-s shot average, CO2 differential optical depth measurement of 1.0054±0.0103 was retrieved from a 6-km altitude and a 4-GHz on-line operation. As compared to CO2 weighted-average column dry-air volume mixing ratio of 404.08 ppm, derived from air sampling, IPDA measurement resulted in a value of 405.22±4.15 ppm with 1.02% uncertainty and 0.28% additional bias. 
Sensitivity analysis of environmental systematic errors correlates the additional bias to water vapor. IPDA ranging resulted in a measurement uncertainty of <3 m.
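The column retrieval described above follows the generic double-pulse IPDA relation; the sketch below uses hypothetical pulse energies and an assumed precomputed integrated weighting function (IWF), not the flight values quoted in the abstract:

```python
import math

def daod(p_on, p_off, e_on, e_off):
    """Differential absorption optical depth from surface returns.

    p_on/p_off: received on-line/off-line pulse energies,
    e_on/e_off: transmitted on-line/off-line pulse energies.
    """
    return 0.5 * math.log((p_off / e_off) * (e_on / p_on))

def xco2_ppm(tau, iwf):
    """Column-average dry-air mixing ratio (ppm) given an integrated
    weighting function IWF (assumed precomputed from met data)."""
    return tau / iwf * 1e6

# Hypothetical energies (arbitrary units)
tau = daod(p_on=0.8, p_off=2.2, e_on=1.0, e_off=1.0)   # ~0.506
```

In practice the IWF is integrated along the column from the spacecraft or aircraft altitude to the surface using meteorological profiles; the bias and random-error terms discussed above enter through the measured energy ratios.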
NASA Astrophysics Data System (ADS)
Delle Monache, L.; Rodriguez, L. M.; Meech, S.; Hahn, D.; Betancourt, T.; Steinhoff, D.
2016-12-01
It is necessary to accurately estimate the initial source characteristics in the event of an accidental or intentional release of a Chemical, Biological, Radiological, or Nuclear (CBRN) agent into the atmosphere. Accurate estimation of the source characteristics is important because they are often unknown, and Atmospheric Transport and Dispersion (AT&D) models rely heavily on these estimates to create hazard assessments. To correctly assess the source characteristics in an operational environment where time is critical, the National Center for Atmospheric Research (NCAR) has developed a Source Term Estimation (STE) method, known as the Variational Iterative Refinement STE algorithm (VIRSA). VIRSA consists of a combination of modeling systems. These systems include an AT&D model, its corresponding STE model, a Hybrid Lagrangian-Eulerian Plume Model (H-LEPM), and its mathematical adjoint model. In an operational scenario where we have information regarding the infrastructure of a city, the AT&D model used is the Urban Dispersion Model (UDM), and when using this model in VIRSA we refer to the system as uVIRSA. In all other scenarios where we do not have the city infrastructure information readily available, the AT&D model used is the Second-order Closure Integrated PUFF model (SCIPUFF) and the system is referred to as sVIRSA. VIRSA was originally developed using SCIPUFF 2.4 for the Defense Threat Reduction Agency and integrated into the Hazard Prediction and Assessment Capability and Joint Program for Information Systems Joint Effects Model. The results discussed here are the verification and validation of the upgraded system with SCIPUFF 3.0 and the newly implemented UDM capability. To verify uVIRSA and sVIRSA, synthetic concentration observation scenarios were created in urban and rural environments, and the results of this verification are shown.
Finally, we validate the STE performance of uVIRSA using scenarios from the Joint Urban 2003 (JU03) experiment, which was held in Oklahoma City and also validate the performance of sVIRSA using scenarios from the FUsing Sensor Integrated Observing Network (FUSION) Field Trial 2007 (FFT07), held at Dugway Proving Grounds in rural Utah.
Small UAS-Based Wind Feature Identification System Part 1: Integration and Validation
Rodriguez Salazar, Leopoldo; Cobano, Jose A.; Ollero, Anibal
2016-01-01
This paper presents a system for identification of wind features, such as gusts and wind shear. These are of particular interest in the context of energy-efficient navigation of Small Unmanned Aerial Systems (UAS). The proposed system generates real-time wind vector estimates and a novel algorithm to generate wind field predictions. Estimations are based on the integration of an off-the-shelf navigation system and airspeed readings in a so-called direct approach. Wind predictions use atmospheric models to characterize the wind field with different statistical analyses. During the prediction stage, the system is able to incorporate, in a big-data approach, wind measurements from previous flights in order to enhance the approximations. Wind estimates are classified and fitted into a Weibull probability density function. A Genetic Algorithm (GA) is utilized to determine the shaping and scale parameters of the distribution, which are employed to determine the most probable wind speed at a certain position. The system uses this information to characterize a wind shear or a discrete gust and also utilizes a Gaussian Process regression to characterize continuous gusts. The knowledge of the wind features is crucial for computing energy-efficient trajectories with low cost and payload. Therefore, the system provides a solution that does not require any additional sensors. The system architecture presents a modular decentralized approach, in which the main parts of the system are separated in modules and the exchange of information is managed by a communication handler to enhance upgradeability and maintainability. Validation is done providing preliminary results of both simulations and Software-In-The-Loop testing. Telemetry data collected from real flights, performed in the Seville Metropolitan Area in Andalusia (Spain), was used for testing. Results show that wind estimation and predictions can be calculated at 1 Hz and a wind map can be updated at 0.4 Hz. 
Predictions show a convergence time with a 95% confidence interval of approximately 30 s. PMID:28025531
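The Weibull-fit-by-GA step described in the abstract can be sketched as follows; the GA operators here (tournament selection, blend crossover, Gaussian mutation) and all constants are generic assumptions, exercised on synthetic wind speeds rather than flight telemetry:

```python
import numpy as np

rng = np.random.default_rng(0)

def weibull_loglik(params, v):
    """Log-likelihood of wind speeds v under Weibull(shape k, scale lam)."""
    k, lam = params
    if k <= 0 or lam <= 0:
        return -np.inf
    z = v / lam
    return np.sum(np.log(k / lam) + (k - 1) * np.log(z) - z**k)

def ga_fit(v, pop=60, gens=80, bounds=(0.2, 15.0)):
    """Tiny real-coded GA maximizing the Weibull log-likelihood."""
    lo, hi = bounds
    P = rng.uniform(lo, hi, size=(pop, 2))
    for _ in range(gens):
        fit = np.array([weibull_loglik(p, v) for p in P])
        i, j = rng.integers(0, pop, (2, pop))          # tournament selection
        parents = np.where((fit[i] > fit[j])[:, None], P[i], P[j])
        mates = parents[rng.permutation(pop)]
        w = rng.uniform(size=(pop, 1))                 # blend crossover
        children = w * parents + (1 - w) * mates
        children += rng.normal(0, 0.1, children.shape)  # Gaussian mutation
        children = np.clip(children, lo, hi)
        children[0] = P[np.argmax(fit)]                # elitism
        P = children
    fit = np.array([weibull_loglik(p, v) for p in P])
    return P[np.argmax(fit)]

# Synthetic wind speeds (m/s) drawn with known shape k=2, scale lam=6
v = 6.0 * rng.weibull(2.0, size=2000)
k_hat, lam_hat = ga_fit(v)
```

The fitted shape and scale parameters then give the most probable wind speed at a location, as described above; a Gaussian Process regression would be layered on top for continuous gusts.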
NASA Astrophysics Data System (ADS)
Ahmad, Waqas; Kim, Soohyun; Kim, Dongkyun
2017-04-01
Land subsidence and crustal deformation associated with groundwater abstraction is a gradually developing phenomenon. Interferometric Synthetic Aperture Radar (InSAR), for land subsidence velocity, and the Gravity Recovery and Climate Experiment (GRACE), for change in groundwater storage, have great potential for addressing this problem, among other applications. In this paper we used an integrated approach combining InSAR and GRACE solutions to show that the land subsidence velocity in a rapidly urbanizing and groundwater-dependent basin in Pakistan is largely attributable to overexploitation of the groundwater aquifer. We analyzed a total of 28 Sentinel-1 interferograms generated for the period October 2014 to November 2016 to quantify the level of land subsidence in the study area. To increase the accuracy of the interferometry results, we then applied an Amplitude Dispersion Index (ADI) filter to confine the spatial extent of the land subsidence analysis to persistently scattering pixels. For the GRACE experiment, we took the average of the change in Total Water Storage (TWS) solutions provided by the Center for Space Research (CSR), the German Research Centre for Geosciences (GFZ), and the Jet Propulsion Laboratory (JPL), and validated this mean TWS for the study area using a network of observed groundwater-level time series. Validation of the GRACE TWS field shows that, although the GRACE footprint is spatially larger than the extent of the study area, the significant change in water storage contributes to the overall trend of declining water storage. Finally, we compared the InSAR land subsidence velocities with the GRACE TWS change field. A strong dependence of the land subsidence on the temporal change in TWS suggests that most of the land subsidence can be attributed to the unchecked exploitation of the groundwater aquifer.
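The ADI filter mentioned above is the ratio of temporal standard deviation to mean amplitude per pixel; low-dispersion pixels are kept as persistent-scatterer candidates. A minimal sketch on a synthetic amplitude stack (the 0.25 threshold is a common choice in the persistent-scatterer literature, assumed here rather than taken from the paper):

```python
import numpy as np

def amplitude_dispersion(stack: np.ndarray) -> np.ndarray:
    """ADI per pixel over an (n_scenes, rows, cols) SAR amplitude stack."""
    mu = stack.mean(axis=0)
    sigma = stack.std(axis=0, ddof=1)
    return sigma / mu

rng = np.random.default_rng(1)
# Synthetic 28-scene stack: stable scatterers (left half) vs noisy background
stack = rng.normal(100.0, 5.0, size=(28, 50, 50))            # stable pixels
stack[:, :, 25:] = rng.gamma(2.0, 50.0, size=(28, 50, 25))   # unstable half
adi = amplitude_dispersion(stack)
mask = adi < 0.25   # persistent-scatterer candidates
```

Subsidence velocities would then be estimated only at the masked pixels, reducing decorrelation noise in the time series.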
Numerical Simulation of Boundary Layer Ingesting (BLI) Inlet-Fan Interaction
NASA Technical Reports Server (NTRS)
Giuliani, James; Chen, Jen-Ping; Beach, Timothy; Bakhle, Milind
2014-01-01
Future civil transport designs may incorporate engine inlets integrated into the body of the aircraft to take advantage of efficiency increases due to weight and drag reduction. Additional increases in engine efficiency are predicted if the inlet ingests the lower momentum boundary layer flow. Previous studies have shown, however, that the efficiency benefits of boundary layer ingestion (BLI) are very sensitive to the magnitude of fan and duct losses, and blade structural response to the non-uniform flow field that results from a BLI inlet has not been studied in depth. This paper presents an effort to extend the modeling capabilities of an existing rotating turbomachinery unsteady analysis code to include the ability to solve the external and internal flow fields of a BLI inlet. The TURBO code has been a successful tool in evaluating fan response to flow distortions for traditional engine/inlet integrations, such as the development of rotating stall and inlet distortion through compressor stages. This paper describes the first phase of an effort to extend the TURBO model to calculate the external and inlet flowfield upstream of the fan so that accurate pressure distortions that result from BLI configurations can be computed and used to analyze fan aerodynamics and structural response. To validate the TURBO program modifications for the BLI flowfield, experimental test data obtained by NASA for a flush-mounted S-duct with large amounts of boundary layer ingestion were modeled. Results for the flow upstream and in the inlet are presented and compared to experimental data for several high Reynolds number flows to validate the modifications to the solver. Quantitative data are presented that indicate good predictive capability of the model in the upstream flow. A representative fan is attached to the inlet and results are presented for the coupled inlet/fan model. The impact on the total pressure distortion at the AIP after the fan is attached is examined.
Lefschetz thimbles in fermionic effective models with repulsive vector-field
NASA Astrophysics Data System (ADS)
Mori, Yuto; Kashiwa, Kouji; Ohnishi, Akira
2018-06-01
We discuss two problems in complexified auxiliary fields in fermionic effective models, the auxiliary sign problem associated with the repulsive vector-field and the choice of the cut for the scalar field appearing from the logarithmic function. In the fermionic effective models with attractive scalar and repulsive vector-type interaction, the auxiliary scalar and vector fields appear in the path integral after the bosonization of fermion bilinears. When we make the path integral well-defined by the Wick rotation of the vector field, the oscillating Boltzmann weight appears in the partition function. This "auxiliary" sign problem can be solved by using the Lefschetz-thimble path-integral method, where the integration path is constructed in the complex plane. Another serious obstacle in the numerical construction of Lefschetz thimbles is caused by singular points and cuts induced by multivalued functions of the complexified scalar field in the momentum integration. We propose a new prescription which fixes gradient flow trajectories on the same Riemann sheet in the flow evolution by performing the momentum integration in the complex domain.
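For context, the thimble decomposition referenced above is generated by the holomorphic gradient flow (written here for a generic action S[z], in the standard convention rather than the model-specific form):

```latex
\frac{d z_i}{d t} \;=\; \overline{\left(\frac{\partial S[z]}{\partial z_i}\right)},
\qquad
\frac{d}{dt}\,\mathrm{Re}\,S \ge 0,
\qquad
\frac{d}{dt}\,\mathrm{Im}\,S = 0,
```

so the phase of the Boltzmann weight e^{-S} is constant on each thimble, which is what tames the auxiliary sign problem; the cut structure of the logarithm then determines on which Riemann sheet each flow trajectory lives, motivating the prescription proposed in the abstract.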
Hervás, Gonzalo; Vázquez, Carmelo
2013-04-22
We introduce the Pemberton Happiness Index (PHI), a new integrative measure of well-being in seven languages, detailing the validation process and presenting psychometric data. The scale includes eleven items related to different domains of remembered well-being (general, hedonic, eudaimonic, and social well-being) and ten items related to experienced well-being (i.e., positive and negative emotional events that possibly happened the day before); the sum of these items produces a combined well-being index. A distinctive characteristic of this study is that, to construct the scale, an initial pool of items covering the remembered and experienced well-being domains was subjected to a complete selection and validation process. These items were based on widely used scales (e.g., PANAS, Satisfaction With Life Scale, Subjective Happiness Scale, and Psychological Well-Being Scales). Both the initial items and reference scales were translated into seven languages and completed via Internet by participants (N = 4,052) aged 16 to 60 years from nine countries (Germany, India, Japan, Mexico, Russia, Spain, Sweden, Turkey, and USA). Results from this initial validation study provided very good support for the psychometric properties of the PHI (i.e., internal consistency, a single-factor structure, and convergent and incremental validity). Given the PHI's good psychometric properties, this simple and integrative index could be used as an instrument to monitor changes in well-being. We discuss the utility of this integrative index to explore well-being in individuals and communities.
[Development of a scale to measure Korean ego-integrity in older adults].
Chang, Sung Ok; Kong, Eun Sook; Kim, Kwuy Bun; Kim, Nam Cho; Kim, Ju Hee; Kim, Chun Gill; Kim, Hee Kyung; Song, Mi Soon; Ahn, Soo Yeon; Lee, Kyung Ja; Lee, Young Whee; Chon, Si Ja; Cho, Nam Ok; Cho, Myung Ok; Choi, Kyung Sook
2007-04-01
Ego-integrity in older adults is the central concept related to quality of life in later life. Therefore, for effective interventions to enhance the quality of later life, a scale to measure ego-integrity in older adults is necessary. This study was carried out to develop such a scale. Cronbach's alpha was used to analyze the reliability of the collected data, and an expert group, factor analysis, and item analysis were used to analyze validity. Seventeen items were selected from a total of 21 items. Cronbach's alpha coefficient for internal consistency was .88 for the 17 items of the ego-integrity scale for older adults. Three factors emerged from the factor analysis, which together explained 50.71% of the total variance. The scale for measuring ego-integrity in Korean older adults developed in this study was evaluated as a tool with a high degree of reliability and validity.
Experimental Validation of an Integrated Controls-Structures Design Methodology
NASA Technical Reports Server (NTRS)
Maghami, Peiman G.; Gupta, Sandeep; Elliot, Kenny B.; Walz, Joseph E.
1996-01-01
The first experimental validation of an integrated controls-structures design methodology for a class of large order, flexible space structures is described. Integrated redesign of the controls-structures-interaction evolutionary model, a laboratory testbed at NASA Langley, was described earlier. The redesigned structure was fabricated, assembled in the laboratory, and experimentally tested against the original structure. Experimental results indicate that the structure redesigned using the integrated design methodology requires significantly less average control power than the nominal structure with control-optimized designs, while maintaining the required line-of-sight pointing performance. Thus, the superiority of the integrated design methodology over the conventional design approach is experimentally demonstrated. Furthermore, amenability of the integrated design structure to other control strategies is evaluated, both analytically and experimentally. Using Linear-Quadratic-Gaussian optimal dissipative controllers, it is observed that the redesigned structure leads to significantly improved performance with alternate controllers as well.
Beckwith, Jonathan G; Chu, Jeffrey J; Greenwald, Richard M
2007-08-01
Although the epidemiology and mechanics of concussion in sports have been investigated for many years, the biomechanical factors that contribute to mild traumatic brain injury remain unclear because of the difficulties in measuring impact events in the field. The purpose of this study was to validate an instrumented boxing headgear (IBH) that can be used to measure impact severity and location during play. The instrumented boxing headgear data were processed to determine linear and rotational acceleration at the head center of gravity, impact location, and impact severity metrics, such as the Head Injury Criterion (HIC) and Gadd Severity Index (GSI). The instrumented boxing headgear was fitted to a Hybrid III (HIII) head form and impacted with a weighted pendulum to characterize accuracy and repeatability. Fifty-six impacts over 3 speeds and 5 locations were used to simulate blows most commonly observed in boxing. A high correlation between the HIII and instrumented boxing headgear was established for peak linear and rotational acceleration (r2 = 0.91), HIC (r2 = 0.88), and GSI (r2 = 0.89). Mean location error was 9.7 +/- 5.2 degrees. Based on this study, the IBH is a valid system for measuring head acceleration and impact location that can be integrated into training and competition.
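The HIC and GSI metrics mentioned above are standard functionals of the resultant head acceleration a(t) (in g); a minimal discrete-time sketch on a synthetic half-sine pulse (hypothetical parameters, not the pendulum data):

```python
import numpy as np

def gsi(a, dt):
    """Gadd Severity Index: integral of a(t)^2.5 (a in g, dt in s)."""
    return float(np.sum(a**2.5) * dt)

def hic(a, dt, max_window=0.015):
    """Head Injury Criterion: max over windows (up to 15 ms here) of
    (t2 - t1) * [mean acceleration over the window]^2.5."""
    n = len(a)
    cum = np.concatenate(([0.0], np.cumsum(a) * dt))  # running integral of a
    wmax = int(round(max_window / dt))
    best = 0.0
    for i in range(n):
        for j in range(i + 1, min(i + wmax, n) + 1):
            T = (j - i) * dt
            avg = (cum[j] - cum[i]) / T
            best = max(best, T * avg**2.5)
    return best

# Synthetic half-sine impact: 100 g peak, 10 ms duration
dt = 1e-4
t = np.arange(0.0, 0.010, dt)
a = 100.0 * np.sin(np.pi * t / 0.010)
```

By Jensen's inequality HIC never exceeds GSI for the same pulse, which is why both are reported side by side when characterizing impact severity.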
A call for change: clinical evaluation of student registered nurse anesthetists.
Collins, Shawn; Callahan, Margaret Faut
2014-02-01
The ability to integrate theory with practice is integral to a student's success. A common reason for attrition from a nurse anesthesia program is clinical issues. To document clinical competence, students are evaluated using various tools. If a clinical evaluation tool is to serve as possible evidence for a student's dismissal, instrument validity is an essential psychometric property to ensure. Clinical evaluation instruments of nurse anesthesia programs are not standardized among programs, which suggests a lack of instrument validity. The lack of established validity of the instruments used to evaluate students' clinical progress brings into question their ability to detect a student who is truly in jeopardy of attrition. Given this possibility, clinical instrument validity warrants research, both to be fair to students and to address attrition on the basis of valid data. This ex post facto study evaluated a 17-item clinical evaluation tool to demonstrate the need for validity of clinical evaluation tools. It also compared clinical scores with scores on the National Certification Examination.