Teaching contact metamorphism, isograds, and mixed-volatile reactions: A suite-based approach
NASA Astrophysics Data System (ADS)
Peck, W. H.
2003-12-01
An important goal of teaching Introductory Petrology is to demonstrate how different kinds of approaches are integrated in studying petrologic problems. Depending on the goals of the study, the data used can come from the field, hand sample, microscope, electron beam instrument, or mass spectrometer. A suite of samples with a known geographical and geological context can help students draw connections between different petrologic approaches, as the `geologic story' of the samples becomes a unifying theme. For teaching a unit on calc-silicates I use a suite of siliceous dolomite samples collected from the Ubehebe contact aureole (Death Valley, CA) as well as published data (Roselle et al., 1997; 1999) in a linked series of laboratory exercises and problem sets. The geology of the contact aureole is introduced in a three-hour laboratory exercise, where students identify the appearance of tremolite, forsterite, and periclase/brucite and the disappearance of quartz as the intrusion is approached. A concurrent problem set uses simplified mineral assemblage maps from the aureole. In the problem set students delineate isograds and determine the balanced metamorphic reactions by which the metamorphic minerals formed. Lecture material during this unit focuses on the physical properties of fluids in the crust and the mineralogical evidence for fluid flow (with an emphasis on mixed-volatile reactions and T-XCO2 diagrams). A concrete field example helps focus student attention on the interrelation of the disparate approaches by which petrologic problems are addressed. The Ubehebe suite then becomes a unifying theme throughout the course: the specimens or regional geology are used in subsequent laboratories and lectures when introducing concepts such as grain nucleation and growth, reaction overstepping, and replacement textures. A virtual field trip of the Alta aureole, UT (using field photographs, maps, and photomicrographs) concludes the unit.
The geology of the Alta aureole is similar to that of Ubehebe, and the virtual field trip acts as a review that emphasizes the general usefulness of the approaches discussed.
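The balanced mixed-volatile reactions described above (for example, the classic tremolite-forming reaction in siliceous dolomite) can also be found numerically as a null vector of the element-by-species composition matrix. A minimal sketch of that linear-algebra approach (the mineral formulas are standard; the numerical method is an illustration, not part of the exercises described in the abstract):

```python
import numpy as np

# Element-count vectors over (Ca, Mg, Si, C, H, O) for the species in the
# classic tremolite-forming reaction in siliceous dolomite.
species = {
    "dolomite":  [1, 1, 0, 2, 0, 6],   # CaMg(CO3)2
    "quartz":    [0, 0, 1, 0, 0, 2],   # SiO2
    "H2O":       [0, 0, 0, 0, 2, 1],
    "tremolite": [2, 5, 8, 0, 2, 24],  # Ca2Mg5Si8O22(OH)2
    "calcite":   [1, 0, 0, 1, 0, 3],   # CaCO3
    "CO2":       [0, 0, 0, 1, 0, 2],
}

# A balanced reaction is a null vector of the element-by-species matrix A.
A = np.array(list(species.values()), dtype=float).T
_, _, vt = np.linalg.svd(A)
x = vt[-1]             # the null space is one-dimensional for this reaction
coeffs = x / x[2]      # normalize so H2O carries coefficient 1

print({name: round(c, 6) for name, c in zip(species, coeffs)})
```

Positive coefficients are reactants and negative ones products, recovering 5 dolomite + 8 quartz + 1 H2O = 1 tremolite + 3 calcite + 7 CO2.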
Some problems of selection and evaluation of the Martian suit enclosure concept
NASA Astrophysics Data System (ADS)
Abramov, Isaak; Moiseyev, Nikolay; Stoklitsky, Anatoly
2005-12-01
One of the most important tasks in preparing a future manned mission to Mars is to create a space suit that ensures efficient and safe operation on the planet's surface. The concept of space suit (SS) utilisation on the Mars surface will be determined mainly by the Mars mission scenario. Currently, preference is given to the utilisation of robotics, with the crew driving a Mars rover vehicle, whereby the suit will be used solely as an additional safety means. However, one cannot exclude the necessity of a prolonged, self-contained stay outside a pressurised compartment, for instance to pick up soil samples or do certain repair work in case of an emergency. The requirements for the Mars suit, and especially for the personal self-contained life support system (LSS), will depend in many respects on the Mars environmental conditions, the space vehicle system concept and performance characteristics, the airlock and its interface design, the availability of expendable elements for the LSS, etc. The paper reviews the principal problems that must be solved during development of the Martian suit. Special attention is paid to the issue of suited-man mobility while traversing the planet surface. The paper also reviews the arguments for applying a semi-rigid suit design concept and evaluates the potential for using certain elements of the existing "Orlan" type suit. The paper presents results of a number of studies on selection of the planetary SS enclosure concept and on experimental evaluation of the mobility of the lower torso and leg enclosures in conjunction with a specially designed prototype (tentative model) of the SS enclosure.
A suite of benchmark and challenge problems for enhanced geothermal systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
White, Mark; Fu, Pengcheng; McClure, Mark
A diverse suite of numerical simulators is currently being applied to predict or understand the performance of enhanced geothermal systems (EGS). To build confidence and identify critical development needs for these analytical tools, the United States Department of Energy, Geothermal Technologies Office sponsored a Code Comparison Study (GTO-CCS), with participants from universities, industry, and national laboratories. A principal objective of the study was to create a community forum for improvement and verification of numerical simulators for EGS modeling. Each participating team brought unique numerical simulation capabilities to bear on the problems. Two classes of problems were developed during the study: benchmark problems and challenge problems. The benchmark problems were structured to test the ability of the collection of numerical simulators to solve various combinations of coupled thermal, hydrologic, geomechanical, and geochemical processes. This class of problems was strictly defined in terms of properties, driving forces, initial conditions, and boundary conditions. The challenge problems were based on the enhanced geothermal systems research conducted at Fenton Hill, near Los Alamos, New Mexico, between 1974 and 1995. The problems spanned two phases of research, covering stimulation, development, and circulation in two separate reservoirs. The challenge problems posed specific questions to be answered via numerical simulation in three topical areas: 1) reservoir creation/stimulation, 2) reactive and passive transport, and 3) thermal recovery.
Whereas the benchmark class of problems was designed to test capabilities for modeling coupled processes under strictly specified conditions, the stated objective for the challenge class of problems was to demonstrate what new understanding of the Fenton Hill experiments could be realized via the application of modern numerical simulation tools by recognized expert practitioners. We present the suite of benchmark and challenge problems developed for the GTO-CCS, providing problem descriptions and sample solutions.
Characterization of Carbon Dioxide Washout Measurement Techniques in the Mark-III Space Suit
NASA Technical Reports Server (NTRS)
Meginnis, I; Norcross, J.; Bekdash, O.
2016-01-01
It is essential to provide adequate carbon dioxide (CO2) washout in a space suit to reduce the risks associated with manned operations in space suits. Symptoms of elevated CO2 range from reduced cognitive performance and headache to unconsciousness and death at high concentrations. Because of this, NASA imposes limits on inspired CO2 levels for space suits when they are used in space and for ground testing. Testing and/or analysis must be performed to verify that a space suit meets CO2 washout requirements. Testing for developmental space suits has traditionally used an oronasal mask that collects CO2 samples at the left and right sides of the mouth. Testing with this mask resulted in artificially elevated CO2 concentration measurements, most likely due to the dead space volume at the front of the mask. The mask also extends outward into the supply gas stream, which may disrupt the washout effect of the suit supply gas. To mitigate these problems, a nasal cannula was investigated as a method for measuring inspired CO2, based on the assumptions that it is low profile and would not interfere with the designed suit gas flow path, and that it has reduced dead space. This test series compared the performance of a nasal cannula to the oronasal mask in the Mark III space suit. Inspired CO2 levels were measured with subjects at rest and at metabolic workloads of 1000, 2000, and 3000 BTU/hr. Workloads were achieved by use of an arm ergometer or treadmill. Test points were conducted at air flow rates of 2, 4, and 6 actual cubic feet per minute, with a suit pressure of 4.3 psid. Results from this test series will be used to evaluate the accuracy and repeatability across subjects of the nasal cannula collection method, providing rationale for using a nasal cannula as the new method for measuring inspired CO2 in a space suit.
Proper characterization of sampling methods and of suit CO2 washout capability will better inform requirements definition and verification techniques for future CO2 washout limits in space suits.
NASA Astrophysics Data System (ADS)
Othman, Rozmie R.; Ahmad, Mohd Zamri Zahir; Ali, Mohd Shaiful Aziz Rashid; Zakaria, Hasneeza Liza; Rahman, Md. Mostafijur
2015-05-01
Consuming 40 to 50 percent of software development cost, software testing is one of the most resource-consuming activities in the software development lifecycle. To ensure an acceptable level of quality and reliability of a typical software product, it is desirable to test every possible combination of input data under various configurations. Due to the combinatorial explosion problem, exhaustive testing is practically impossible. Resource constraints, cost factors, and strict time-to-market deadlines are among the main factors that inhibit such consideration. Earlier work suggests that a sampling strategy based on t-way parameter interaction (called t-way testing) can effectively reduce the number of test cases without affecting fault detection capability. However, for a very large system, even a t-way strategy will produce a large test suite that needs to be executed. In the end, only part of the planned test suite can be executed in order to meet the aforementioned constraints. Here, there is a need for test engineers to measure the effectiveness of a partially executed test suite in order to assess the risk they have to take. Motivated by this problem, this paper presents an effectiveness comparison of partially executed t-way test suites generated by existing strategies using the tuples coverage method. With it, test engineers can predict the effectiveness of the testing process if only part of the original test cases is executed.
Summary of Documentation for DYNA3D-ParaDyn's Software Quality Assurance Regression Test Problems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zywicz, Edward
The Software Quality Assurance (SQA) regression test suite for DYNA3D (Zywicz and Lin, 2015) and ParaDyn (DeGroot, et al., 2015) currently contains approximately 600 problems divided into 21 suites, and is a required component of ParaDyn's SQA plan (Ferencz and Oliver, 2013). The regression suite allows developers to ensure that software modifications do not unintentionally alter the code response. The entire regression suite is run prior to permanently incorporating any software modification or addition. When code modifications alter test problem results, the specific cause must be determined and fully understood before the software changes and revised test answers can be incorporated. The regression suite is executed on LLNL platforms using a Python script and an associated data file. The user specifies the DYNA3D or ParaDyn executable, the number of processors to use, the test problems to run, and other options to the script. The data file details how each problem and its answer extraction scripts are executed. For each problem in the regression suite there exists an input deck, an eight-processor partition file, an answer file, and various extraction scripts. These scripts assemble a temporary answer file in a specific format from the simulation results. The temporary and stored answer files are compared to a specific level of numerical precision, and when differences are detected the test problem is flagged as failed. Presently, numerical results are stored and compared to 16 digits. At this accuracy level, different processor types, compilers, numbers of partitions, etc. impact the results to various degrees. Thus, for consistency the regression suite is run with ParaDyn using 8 processors on machines with a specific processor type (currently the Intel Xeon E5530 processor). For non-parallel regression problems, i.e., the two XFEM problems, DYNA3D is used instead.
When environments or platforms change, executables using the current source code and the new resource are created and the regression suite is run. If differences in answers arise, the new answers are retained provided that the differences are inconsequential. This bootstrap approach allows the test suite answers to evolve in a controlled manner with a high level of confidence. Developers also run the entire regression suite with (serial) DYNA3D. While these results normally differ from the stored (parallel) answers, abnormal termination or wildly different values are strong indicators of potential issues.
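The stored-versus-temporary answer comparison described above can be sketched in a few lines. The key names, data layout, and pass/fail policy here are illustrative stand-ins (the actual LLNL extraction scripts and answer-file format are not public); only the idea of comparing to a chosen number of significant digits comes from the abstract:

```python
def compare_answers(stored, computed, digits=16):
    """Return the keys whose values differ beyond `digits` significant
    digits, using a relative comparison; an empty list means the test
    passes.  Illustrative sketch, not the DYNA3D/ParaDyn harness."""
    failures = []
    for key in stored:
        a, b = stored[key], computed[key]
        scale = max(abs(a), abs(b), 1e-300)  # guard against zero values
        if abs(a - b) / scale > 10.0 ** (-digits):
            failures.append(key)
    return failures

# Hypothetical extracted quantities from a stored answer file and a
# freshly assembled temporary answer file.
stored   = {"peak_stress": 1.2345678901234567e8, "tip_disp": -0.04567}
computed = {"peak_stress": 1.2345678901234569e8, "tip_disp": -0.04567}
print(compare_answers(stored, computed, digits=12))  # -> [] (agreement)
```

Tightening `digits` toward 16 is exactly what makes processor type, compiler, and partition count visible, which motivates the fixed-platform policy the abstract describes.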
Craig M. Thompson; J. Andrew Royle; James D. Garner
2012-01-01
Wildlife management often hinges upon an accurate assessment of population density. Although undeniably useful, many of the traditional approaches to density estimation such as visual counts, live-trapping, or mark-recapture suffer from a suite of methodological and analytical weaknesses. Rare, secretive, or highly mobile species exacerbate these problems through the...
Sierra/SolidMechanics 4.46 Example Problems Manual.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Plews, Julia A.; Crane, Nathan K; de Frias, Gabriel Jose
Presented in this document are tests that exist in the Sierra/SolidMechanics example problem suite, which is a subset of the Sierra/SM regression and performance test suite. These examples showcase common and advanced code capabilities. A wide variety of other regression and verification tests exist in the Sierra/SM test suite that are not included in this manual.
A Comparative Study of Optimization Algorithms for Engineering Synthesis.
1983-03-01
The ADS program demonstrates the flexibility a design engineer would have in selecting an optimization algorithm best suited to solve a particular problem. The ADS library of design optimization algorithms was developed by Vanderplaats.
Gaussian process surrogates for failure detection: A Bayesian experimental design approach
NASA Astrophysics Data System (ADS)
Wang, Hongqiao; Lin, Guang; Li, Jinglai
2016-05-01
An important task of uncertainty quantification is to identify the probability of undesired events, in particular, system failures, caused by various sources of uncertainties. In this work we consider the construction of Gaussian process surrogates for failure detection and failure probability estimation. In particular, we consider the situation that the underlying computer models are extremely expensive, and in this setting, determining the sampling points in the state space is of essential importance. We formulate the problem as an optimal experimental design for Bayesian inferences of the limit state (i.e., the failure boundary) and propose an efficient numerical scheme to solve the resulting optimization problem. In particular, the proposed limit-state inference method is capable of determining multiple sampling points at a time, and thus it is well suited for problems where multiple computer simulations can be performed in parallel. The accuracy and performance of the proposed method are demonstrated by both academic and practical examples.
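The surrogate-plus-design loop can be sketched with plain numpy. Everything below is a stand-in: the limit state g(x) is a cheap invented function, the kernel and its length scale are fixed rather than inferred, and the acquisition shown is the common U-function of AK-MCS, used here only as a placeholder for the paper's own limit-state design criterion:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "expensive" model: failure whenever g(x) < 0.
def g(x):
    return 1.5 - np.abs(x[:, 0]) - 0.5 * x[:, 1] ** 2

def rbf(A, B, ell=1.0):
    """Squared-exponential kernel with unit signal variance."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ell ** 2)

# A small batch of model evaluations serves as GP training data.
X = rng.uniform(-3, 3, size=(40, 2))
y = g(X)
K = rbf(X, X) + 1e-8 * np.eye(len(X))      # jitter for conditioning
alpha = np.linalg.solve(K, y)

# Cheap Monte Carlo on the surrogate mean estimates the failure
# probability under a standard-normal input distribution.
X_mc = rng.standard_normal((50_000, 2))
K_star = rbf(X_mc, X)
mean = K_star @ alpha
p_fail = float(np.mean(mean < 0.0))

# U-function acquisition on a candidate subset: prefer the point whose
# predicted sign (safe vs failed) is most uncertain.
v = np.linalg.solve(K, K_star[:1000].T)
var = np.clip(1.0 - np.sum(K_star[:1000].T * v, axis=0), 1e-12, None)
next_x = X_mc[np.argmin(np.abs(mean[:1000]) / np.sqrt(var))]
print(f"p_fail ~= {p_fail:.3f}; next design point: {next_x}")
```

In the paper's setting the new point (or a batch of them, exploiting parallel simulations) would be run through the expensive model and folded back into the training set before re-estimating the failure probability.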
Vigelius, Matthias; Meyer, Bernd
2012-01-01
For many biological applications, a macroscopic (deterministic) treatment of reaction-drift-diffusion systems is insufficient. Instead, one has to properly handle the stochastic nature of the problem and generate true sample paths of the underlying probability distribution. Unfortunately, stochastic algorithms are computationally expensive and, in most cases, the large number of participating particles renders the relevant parameter regimes inaccessible. In an attempt to address this problem we present a genuinely stochastic, multi-dimensional algorithm that solves the inhomogeneous, non-linear, drift-diffusion problem on a mesoscopic level. Our method improves on existing implementations in being multi-dimensional and handling inhomogeneous drift and diffusion. The algorithm is well suited for an implementation on data-parallel hardware architectures such as general-purpose graphics processing units (GPUs). We integrate the method into an operator-splitting approach that decouples chemical reactions from the spatial evolution. We demonstrate the validity and applicability of our algorithm with a comprehensive suite of standard test problems that also serve to quantify the numerical accuracy of the method. We provide a freely available, fully functional GPU implementation. Integration into Inchman, a user-friendly web service that allows researchers to perform parallel simulations of reaction-drift-diffusion systems on GPU clusters, is underway. PMID:22506001
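The operator-splitting idea can be illustrated with a minimal 1D sketch that alternates a stochastic reaction step (first-order decay) with a biased random-walk drift-diffusion step on a lattice. This is a toy stand-in with invented rates, not the authors' algorithm or their GPU implementation:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy system: species A decays (A -> 0 at rate k) while drifting right
# and diffusing on a 1D lattice.  Operator splitting: within each step
# the reaction update and the spatial update are applied in turn.
n_cells, dt, steps = 100, 0.01, 200
k, p_hop, bias = 0.5, 0.4, 0.6      # decay rate, hop prob., right bias

state = np.zeros(n_cells, dtype=np.int64)
state[50] = 10_000                  # initial pulse of particles

for _ in range(steps):
    # 1) reaction step: each particle survives with prob exp(-k*dt)
    state = rng.binomial(state, np.exp(-k * dt))
    # 2) drift-diffusion step: movers split into right/left hops;
    #    hops off the ends of the lattice are absorbed
    movers = rng.binomial(state, p_hop)
    right = rng.binomial(movers, bias)
    left = movers - right
    state = state - movers
    state[1:] += right[:-1]
    state[:-1] += left[1:]

# After t = 2: ~exp(-1) of the particles survive, pulse drifted right.
print(state.sum(), int(np.argmax(state)))
```

Each substep touches every lattice cell independently, which is what makes the scheme map naturally onto data-parallel hardware such as GPUs.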
Comet composition and density analyzer
NASA Technical Reports Server (NTRS)
Clark, B. C.
1982-01-01
Distinctions between cometary material and other extraterrestrial materials (meteorite suites and stratospherically captured cosmic dust) are addressed. The technique of X-ray fluorescence (XRF) is employed for analysis of elemental composition. Concomitant with these investigations, the problem of collecting representative samples of comet dust (for rendezvous missions) was solved, and several related techniques such as mineralogic analysis (X-ray diffraction), direct analysis of the nucleus without docking (electron macroprobe), dust flux rate measurement, and test sample preparation were evaluated. An explicit experiment concept based upon X-ray fluorescence analysis of biased and unbiased sample collections was scoped and proposed for a future rendezvous mission with a short-period comet.
Orbit Determination Using Vinti’s Solution
2016-09-15
...Surveillance Network; STK: Systems Tool Kit; TBP: Two-Body Problem; TLE: Two-Line Element Set; UKF: Unscented Kalman Filter; WPAFB: Wright... simplicity, stability, and speed. On the other hand, Kalman filters would be best suited for sequential estimation of stochastic or random components of a... likened to how an Unscented Kalman Filter samples a system's nonlinearities directly, avoiding linearizing the dynamics in the partials matrices.
NASA Technical Reports Server (NTRS)
Mandra, Salvatore
2017-01-01
We study the performance of the D-Wave 2X quantum annealing machine on systems with well-controlled ground-state degeneracy. While obtaining the ground state of a spin-glass benchmark instance represents a difficult task, the gold standard for any optimization algorithm or machine is to sample all solutions that minimize the Hamiltonian with more or less equal probability. Our results show that while naive transverse-field quantum annealing on the D-Wave 2X device can find the ground-state energy of the problems, it is not well suited to identifying all degenerate ground-state configurations associated with a particular instance. Even worse, some states are exponentially suppressed, in agreement with previous studies on toy model problems [New J. Phys. 11, 073021 (2009)]. These results suggest that more complex driving Hamiltonians are needed in future quantum annealing machines to ensure fair sampling of the ground-state manifold.
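The fair-sampling criterion is easy to state on a toy instance: an antiferromagnetic Ising triangle has six degenerate ground states, and an ideal sampler should return each with roughly equal frequency. The sketch below (an illustration of the criterion only, not a model of the D-Wave hardware) enumerates the ground manifold exactly and checks a sampler's frequencies against the uniform expectation:

```python
import collections
import itertools
import numpy as np

# Frustrated toy instance: antiferromagnetic triangle,
# H = J * (s0*s1 + s1*s2 + s0*s2) with J > 0.
J = 1.0
def energy(s):
    return J * (s[0] * s[1] + s[1] * s[2] + s[0] * s[2])

states = list(itertools.product([-1, 1], repeat=3))
energies = {s: energy(s) for s in states}
e_min = min(energies.values())
ground = [s for s in states if energies[s] == e_min]
print(len(ground))   # six degenerate ground states at E = -J

# A perfectly fair sampler (here, a uniform stand-in) should hit each
# ground state with frequency ~1/6; an annealer that exponentially
# suppresses some states would fail this check badly.
rng = np.random.default_rng(2)
draws = rng.integers(len(ground), size=6000)
counts = collections.Counter(ground[i] for i in draws)
freqs = np.array([counts[g] / 6000 for g in ground])
print(np.round(freqs, 3))
```

Replacing the uniform stand-in with samples from real annealing hardware, and testing the observed frequencies against uniformity, is the essence of the benchmark the abstract describes.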
Algorithm and Architecture Independent Benchmarking with SEAK
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tallent, Nathan R.; Manzano Franco, Joseph B.; Gawande, Nitin A.
2016-05-23
Many applications of high performance embedded computing are limited by performance or power bottlenecks. We have designed the Suite for Embedded Applications & Kernels (SEAK), a new benchmark suite, (a) to capture these bottlenecks in a way that encourages creative solutions; and (b) to facilitate rigorous, objective, end-user evaluation for their solutions. To avoid biasing solutions toward existing algorithms, SEAK benchmarks use a mission-centric (abstracted from a particular algorithm) and goal-oriented (functional) specification. To encourage solutions that are any combination of software or hardware, we use an end-user black-box evaluation that can capture tradeoffs between performance, power, accuracy, size, and weight. The tradeoffs are especially informative for procurement decisions. We call our benchmarks future proof because each mission-centric interface and evaluation remains useful despite shifting algorithmic preferences. It is challenging to create both concise and precise goal-oriented specifications for mission-centric problems. This paper describes the SEAK benchmark suite and presents an evaluation of sample solutions that highlights power and performance tradeoffs.
The Suite for Embedded Applications and Kernels
DOE Office of Scientific and Technical Information (OSTI.GOV)
2016-05-10
Many applications of high performance embedded computing are limited by performance or power bottlenecks. We have designed SEAK, a new benchmark suite, (a) to capture these bottlenecks in a way that encourages creative solutions; and (b) to facilitate rigorous, objective, end-user evaluation for their solutions. To avoid biasing solutions toward existing algorithms, SEAK benchmarks use a mission-centric (abstracted from a particular algorithm) and goal-oriented (functional) specification. To encourage solutions that are any combination of software or hardware, we use an end-user black-box evaluation that can capture tradeoffs between performance, power, accuracy, size, and weight. The tradeoffs are especially informative for procurement decisions. We call our benchmarks future proof because each mission-centric interface and evaluation remains useful despite shifting algorithmic preferences. It is challenging to create both concise and precise goal-oriented specifications for mission-centric problems. This paper describes the SEAK benchmark suite and presents an evaluation of sample solutions that highlights power and performance tradeoffs.
How Not To Drown in Data: A Guide for Biomaterial Engineers.
Vasilevich, Aliaksei S; Carlier, Aurélie; de Boer, Jan; Singh, Shantanu
2017-08-01
High-throughput assays that produce hundreds of measurements per sample are powerful tools for quantifying cell-material interactions. With advances in automation and miniaturization in material fabrication, hundreds of biomaterial samples can be rapidly produced, which can then be characterized using these assays. However, the resulting deluge of data can be overwhelming. To the rescue are computational methods that are well suited to these problems. Machine learning techniques provide a vast array of tools to make predictions about cell-material interactions and to find patterns in cellular responses. Computational simulations allow researchers to pose and test hypotheses and perform experiments in silico. This review describes approaches from these two domains that can be brought to bear on the problem of analyzing biomaterial screening data.
DYNA3D/ParaDyn Regression Test Suite Inventory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lin, Jerry I.
2016-09-01
The following table constitutes an initial assessment of feature coverage across the regression test suite used for DYNA3D and ParaDyn. It documents the regression test suite at the time of preliminary release 16.1 in September 2016. The columns of the table represent groupings of functionalities, e.g., material models. Each problem in the test suite is represented by a row in the table. All features exercised by the problem are denoted by a check mark (√) in the corresponding column. The definition of "feature" has not been subdivided to its smallest unit of user input, e.g., algorithmic parameters specific to a particular type of contact surface. This represents a judgment to provide code developers and users a reasonable impression of feature coverage without expanding the width of the table by several multiples. All regression testing is run in parallel, typically with eight processors, except for problems involving features only available in serial mode. Many are strictly regression tests acting as a check that the codes continue to produce adequately repeatable results as development unfolds, compilers change, and platforms are replaced. A subset of the tests represents true verification problems that have been checked against analytical or other benchmark solutions. Users are welcome to submit documented problems for inclusion in the test suite, especially if they are heavily exercising, and dependent upon, features that are currently underrepresented.
Improved radiation dose efficiency in solution SAXS using a sheath flow sample environment
Kirby, Nigel; Cowieson, Nathan; Hawley, Adrian M.; Mudie, Stephen T.; McGillivray, Duncan J.; Kusel, Michael; Samardzic-Boban, Vesna; Ryan, Timothy M.
2016-01-01
Radiation damage is a major limitation to synchrotron small-angle X-ray scattering analysis of biomacromolecules. Flowing the sample during exposure helps to reduce the problem, but its effectiveness in the laminar-flow regime is limited by slow flow velocity at the walls of sample cells. To overcome this limitation, the coflow method was developed, where the sample flows through the centre of its cell surrounded by a flow of matched buffer. The method permits an order-of-magnitude increase of X-ray incident flux before sample damage, improves measurement statistics and maintains low sample concentration limits. The method also efficiently handles sample volumes of a few microlitres, can increase sample throughput, is intrinsically resistant to capillary fouling by sample and is suited to static samples and size-exclusion chromatography applications. The method unlocks further potential of third-generation synchrotron beamlines to facilitate new and challenging applications in solution scattering. PMID:27917826
Carbon in weathered ordinary chondrites from Roosevelt County
NASA Technical Reports Server (NTRS)
Ash, R. D.; Pillinger, C. T.
1993-01-01
A suite of Roosevelt County ordinary chondrites of known terrestrial age has been analyzed for carbon content and isotopic composition. Initial results indicate that significant carbon contamination is evident only in samples with a terrestrial age greater than 40 ka. These samples are of weathering grade D and E and contain three times more carbon than the less weathered samples. The soil in which they were preserved has a carbon content of ca. 1.5 percent. Over 200 meteorites have been recovered from a series of soil depleted areas of New Mexico and West Texas. Most have been recovered from blowouts near Clovis in Roosevelt County (RC) on the high plains of New Mexico. The mineralogical and petrological effects of weathering upon these samples have been studied previously and show that the degree of weathering is largely dependent upon the terrestrial residence time. The study was undertaken to determine the effects of prolonged exposure to the soil and climate of Roosevelt County upon ordinary chondrites in the hope that this will enable a better understanding of the problems associated with the collection of meteorite falls. A suite of ten grade 4 to 6 H, L, and LL ordinary chondrites were analyzed for carbon content and isotopic composition.
Visibility-Based Goal Oriented Metrics and Application to Navigation and Path Planning Problems
2017-12-14
University of Texas at Austin, USA. Research supported by NSF and ARO. Sponsoring/monitoring agency: U.S. Army Research Office, Research Triangle Park.
Sharing clinical information across care settings: the birth of an integrated assessment system
Gray, Leonard C; Berg, Katherine; Fries, Brant E; Henrard, Jean-Claude; Hirdes, John P; Steel, Knight; Morris, John N
2009-01-01
Background: Population ageing, the emergence of chronic illness, and the shift away from institutional care challenge conventional approaches to assessment systems, which traditionally are problem and setting specific. Methods: From 2002, the interRAI research collaborative undertook development of a suite of assessment tools to support assessment and care planning of persons with chronic illness, frailty, disability, or mental health problems across care settings. The suite constitutes an early example of a "third generation" assessment system. Results: The rationale and development strategy for the suite are described, together with a description of potential applications. To date, ten instruments comprise the suite, each built from "core" items shared among the majority of instruments and "optional" items that are specific to particular care settings or situations. Conclusion: This comprehensive suite offers the opportunity for integrated multi-domain assessment, enabling electronic clinical records, data transfer, ease of interpretation and streamlined training. PMID:19402891
AX-5 space suit bearing torque investigation
NASA Technical Reports Server (NTRS)
Loewenthal, Stuart; Vykukal, Vic; Mackendrick, Robert; Culbertson, Philip, Jr.
1990-01-01
The symptoms and eventual resolution of a torque increase problem occurring with ball bearings in the joints of the AX-5 space suit are described. Starting torques that rose to 5 to 10 times initial levels were observed in crew evaluation tests of the suit in a zero-g water tank. This bearing problem was identified as a blocking torque anomaly, observed previously in oscillatory gimbal bearings. A large matrix of lubricants, ball separator designs, and materials was evaluated. None of these combinations showed sufficient tolerance to lubricant washout when repeatedly cycled in water. The problem was resolved by retrofitting a pressure-compensated, water exclusion seal to the outboard side of the bearing cavity. The symptoms and possible remedies for blocking are discussed.
The Sample Analysis at Mars Investigation and Instrument Suite
NASA Technical Reports Server (NTRS)
Mahaffy, Paul; Webster, Chris R.; Cabane, M.; Conrad, Pamela G.; Coll, Patrice; Atreya, Sushil K.; Arvey, Robert; Barciniak, Michael; Benna, Mehdi; Bleacher, L.;
2012-01-01
The Sample Analysis at Mars (SAM) investigation of the Mars Science Laboratory (MSL) addresses the chemical and isotopic composition of the atmosphere and volatiles extracted from solid samples. The SAM investigation is designed to contribute substantially to the mission goal of quantitatively assessing the habitability of Mars as an essential step in the search for past or present life on Mars. SAM is a 40 kg instrument suite located in the interior of MSL's Curiosity rover. The SAM instruments are a quadrupole mass spectrometer, a tunable laser spectrometer, and a 6-column gas chromatograph, all coupled through solid and gas processing systems to provide complementary information on the same samples. The SAM suite is able to measure a suite of light isotopes and to analyze volatiles directly from the atmosphere or thermally released from solid samples. In addition to measurements of simple inorganic compounds and noble gases, SAM will conduct a sensitive search for organic compounds with either thermal or chemical extraction from sieved samples delivered by the sample processing system on the Curiosity rover's robotic arm.
Chan, Karen K; Neighbors, Clayton; Marlatt, G Alan
2004-12-01
Employee assistance programs (EAPs) are widely available to assist employees with a variety of problems. This research examined factors related to utilization and outcome by individuals with addictive behaviors (ABs) versus other problem areas. The specific aims of this study were to evaluate referral source and treatment outcome by gender and presenting problem. The sample included 3890 men and women who attended the EAP for a variety of concerns. Men were less likely than women to self-refer and more likely to be mandated to the EAP. Men were also much more likely to present with ABs. Relative to clients presenting with other issues, individuals with ABs were less likely to self-refer, have their problems resolved in the EAP, and were seen for fewer sessions. These results suggest that EAPs may be well suited for implementation of brief interventions (BIs) that have been empirically supported in other contexts.
Evaluation of Ages in the Lunar Highlands with Implications for the Evolution of the Moon
NASA Astrophysics Data System (ADS)
Borg, L. E.; Gaffney, A. M.; Carlson, R. W.
2012-12-01
The lunar highlands are composed of rocks from the ferroan anorthosite (FAN) and Mg-suites. These samples have been extensively studied because they record most of the major events associated with the formation and evolution of the Earth's Moon. Despite their potential to constrain the timing of these events, chronologic investigations are often ambiguous, in most cases because absolute ages and/or initial isotopic compositions are inconsistent with stratigraphic and petrologic relationships of various rock suites inferred from mineralogical and geochemical studies. The problem is exacerbated by the fact that most samples are difficult to date due to their small size and nearly monomineralic nature, as well as isotopic disturbances associated with impacts. Here several criteria are used to assess the reliability of lunar ages, including: (1) concordance between multiple chronometers, (2) linearity of individual isochrons, (3) resistance of the chronometers to disruption by impact or contamination, (4) consistency between initial isotopic compositions and the petrogenesis of samples, and (5) reasonableness of the elemental concentrations of mineral fractions. If only those samples that meet 4 out of 5 of these criteria are used to constrain lunar chronology, many of the apparent conflicts between chronometry and petrology disappear. For example, this analysis demonstrates that the most ancient ages reported for lunar samples are some of the least reliable. The oldest ages determined with confidence on both FAN and Mg-suite highland rocks are in fact ~4.35 Ga. This age is concordant with 142Nd mare source formation ages and a peak in zircon ages, suggesting it represents a major event at ~4.35 Ga. In contrast, several apparently reliable KREEP model ages are older at ~4.48 Ga.
If these older model ages are correct, they may represent the solidification age of the Moon, whereas the 4.35 Ga event could reflect secondary magmatism and cumulate re-equilibration associated with density overturn of primordial magma ocean cumulates.
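The isochron systematics behind such age determinations can be sketched in a few lines. The following is an illustrative Rb-Sr example using synthetic mineral-fraction data, not measurements from any lunar sample; only the 87Rb decay constant is a standard value.

```python
import numpy as np

# Hypothetical mineral-fraction data for illustration (not from any real sample):
# x = 87Rb/86Sr, y = 87Sr/86Sr measured today.
rb_sr = np.array([0.05, 0.20, 0.50, 0.90, 1.40])
LAMBDA_RB87 = 1.42e-11            # 87Rb decay constant, 1/yr
t_assumed = 4.35e9                # assume a 4.35 Ga closure age
initial = 0.69900                 # assumed initial 87Sr/86Sr ratio
sr_sr = initial + rb_sr * (np.exp(LAMBDA_RB87 * t_assumed) - 1.0)

# Isochron fit: slope = exp(lambda*t) - 1, intercept = initial ratio.
slope, intercept = np.polyfit(rb_sr, sr_sr, 1)
age_ga = np.log(slope + 1.0) / LAMBDA_RB87 / 1e9
print(f"age = {age_ga:.2f} Ga, initial = {intercept:.5f}")
```

Criterion (2) above, linearity of the isochron, corresponds to how well the data fall on this fitted line; criterion (4) is a check on the fitted intercept.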
Enhanced verification test suite for physics simulation codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kamm, James R.; Brock, Jerry S.; Brandon, Scott T.
2008-09-01
This document discusses problems with which to augment, in quantity and in quality, the existing tri-laboratory suite of verification problems used by Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and Sandia National Laboratories (SNL). The purpose of verification analysis is to demonstrate whether the numerical results of the discretization algorithms in physics and engineering simulation codes provide correct solutions of the corresponding continuum equations.
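The kind of check such verification problems support can be illustrated with a grid-refinement study: the observed order of accuracy is estimated from errors on successively refined grids. The error values below are hypothetical.

```python
import math

# Sketch of a standard verification check: estimate the observed order of
# accuracy p from errors measured against an exact solution on refined grids.
h = [0.1, 0.05, 0.025]                 # grid spacings, refinement ratio 2
err = [4.0e-3, 1.0e-3, 2.5e-4]         # hypothetical L2 errors vs exact solution

for (h1, e1), (h2, e2) in zip(zip(h, err), zip(h[1:], err[1:])):
    p = math.log(e1 / e2) / math.log(h1 / h2)
    print(f"h={h1}->{h2}: observed order p = {p:.2f}")
```

For a second-order discretization, p should approach 2 as the grid is refined; a persistent mismatch signals a coding or convergence problem.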
Spherical Harmonic Solutions to the 3D Kobayashi Benchmark Suite
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, P.N.; Chang, B.; Hanebutte, U.R.
1999-12-29
Spherical harmonic solutions of order 5, 9 and 21 on spatial grids containing up to 3.3 million cells are presented for the Kobayashi benchmark suite. This suite of three problems with simple geometry of a pure absorber with a large void region was proposed by Professor Kobayashi at an OECD/NEA meeting in 1996. Each of the three problems contains a source, a void and a shield region. Problem 1 can best be described as a box-in-a-box problem, where a source region is surrounded by a square void region which itself is embedded in a square shield region. Problems 2 and 3 represent a shield with a void duct: problem 2 has a straight duct and problem 3 a dog-leg-shaped duct. A pure absorber and a 50% scattering case are considered for each of the three problems. The solutions have been obtained with Ardra, a scalable, parallel neutron transport code developed at Lawrence Livermore National Laboratory (LLNL). The Ardra code takes advantage of a two-level parallelization strategy, which combines message passing between processing nodes and thread-based parallelism amongst processors on each node. All calculations were performed on the IBM ASCI Blue-Pacific computer at LLNL.
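In the pure-absorber limit, reference solutions for problems like these reduce to ray-traced uncollided flux. A minimal point-kernel sketch follows, with illustrative geometry and cross sections rather than the actual Kobayashi specification:

```python
import math

# Uncollided flux from an isotropic point source through absorber and void:
# phi(r) = S * exp(-tau) / (4*pi*r^2), where tau is the optical depth
# accumulated along the ray (zero inside a void). Numbers are illustrative.
def uncollided_flux(source, r, segments):
    """segments: list of (path_length_cm, sigma_t_per_cm) along the ray."""
    tau = sum(length * sigma for length, sigma in segments)
    return source * math.exp(-tau) / (4.0 * math.pi * r**2)

# Ray crossing 10 cm of void then 20 cm of absorber (sigma_t = 0.1 /cm):
phi = uncollided_flux(source=1.0, r=30.0, segments=[(10.0, 0.0), (20.0, 0.1)])
print(f"{phi:.3e}")
```

A spherical harmonic or discrete ordinates code is then checked against values of this kind at selected detector points.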
A writer's guide to education scholarship: Qualitative education scholarship (part 2).
Chan, Teresa M; Ting, Daniel K; Hall, Andrew Koch; Murnaghan, Aleisha; Thoma, Brent; McEwen, Jill; Yarris, Lalena M
2018-03-01
Education scholarship can be conducted using a variety of methods, from quantitative experiments to qualitative studies. Qualitative methods are less commonly used in emergency medicine (EM) education research but are well-suited to explore complex educational problems and generate hypotheses. We aimed to review the literature to provide resources to guide educators who wish to conduct qualitative research in EM education. We conducted a scoping review to outline: 1) a list of journals that regularly publish qualitative educational papers; 2) an aggregate set of quality markers for qualitative educational research and scholarship; and 3) a list of quality checklists for qualitative educational research and scholarship. We found nine journals that have published more than one qualitative educational research paper in EM. From the literature, we identified 39 quality markers that were grouped into 10 themes: Initial Grounding Work (preparation, background); Goals, Problem Statement, or Question; Methods (general considerations); Sampling Techniques; Data Collection Techniques; Data Interpretation and Theory Generation; Measures to Optimize Rigour and Trustworthiness; Relevance to the Field; Evidence of Reflective Practice; Dissemination and Reporting. Lastly, five quality checklists were found for guiding educators in reporting their qualitative work. Many problems that EM educators face are well-suited to exploration using qualitative methods. The results of our scoping review provide publication venues, quality indicators, and checklists that may be useful to EM educators embarking on qualitative projects.
The Sample Analysis at Mars Investigation and Instrument Suite
NASA Technical Reports Server (NTRS)
Mahaffy, Paul; Webster, Christopher R.; Conrad, Pamela G.; Arvey, Robert; Bleacher, Lora; Brinckerhoff, William B.; Eigenbrode, Jennifer L.; Chalmers, Robert A.; Dworkin, Jason P.; Errigo, Therese;
2012-01-01
The Sample Analysis at Mars (SAM) investigation of the Mars Science Laboratory (MSL) addresses the chemical and isotopic composition of the atmosphere and volatiles extracted from solid samples. The SAM investigation is designed to contribute substantially to the mission goal of quantitatively assessing the habitability of Mars as an essential step in the search for past or present life on Mars. SAM is a 40 kg instrument suite located in the interior of MSL's Curiosity rover. The SAM instruments are a quadrupole mass spectrometer, a tunable laser spectrometer, and a 6-column gas chromatograph, all coupled through solid and gas processing systems to provide complementary information on the same samples. The SAM suite is able to measure a suite of light isotopes and to analyze volatiles directly from the atmosphere or thermally released from solid samples. In addition to measurements of simple inorganic compounds and noble gases, SAM will conduct a sensitive search for organic compounds with either thermal or chemical extraction from sieved samples delivered by the sample processing system on the Curiosity rover's robotic arm.
Sierra/SolidMechanics 4.48 Verification Tests Manual.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Plews, Julia A.; Crane, Nathan K; de Frias, Gabriel Jose
2018-03-01
Presented in this document is a small portion of the tests that exist in the Sierra/SolidMechanics (Sierra/SM) verification test suite. Most of these tests are run nightly with the Sierra/SM code suite, and the results of the test are checked versus the correct analytical result. For each of the tests presented in this document, the test setup, a description of the analytic solution, and a comparison of the Sierra/SM code results to the analytic solution are provided. Mesh convergence is also checked on a nightly basis for several of these tests. This document can be used to confirm that a given code capability is verified or referenced as a compilation of example problems. Additional example problems are provided in the Sierra/SM Example Problems Manual. Note, many other verification tests exist in the Sierra/SM test suite, but have not yet been included in this manual.
Assessment and Management of the Risks of Debris Hits During Space Station EVAs
NASA Technical Reports Server (NTRS)
Pate-Cornell, Elisabeth; Sachon, Marc
1997-01-01
The risk of EVAs is critical to the decision of whether or not to automate a large part of the construction of the International Space Station (ISS). Furthermore, the choice of the technologies of the space suit and the life support system will determine (1) the immediate safety of these operations, and (2) the long-run costs and risks of human presence in space, not only in low Earth orbit (as is the case for the ISS) but also, perhaps, outside these orbits or on the surface of other planets. The problem is therefore both an immediate one and a long-term one. The fundamental question is how and when to shift from the existing EMU system (suit, helmet, gloves and life support system) to another type (e.g. a hard suit), given the potential trade-offs among life-cycle costs, risks to the astronauts, performance of tasks, and uncertainties about new systems' safety inherent to such a shift in technology. A more immediate issue is how to manage the risks of EVAs during the construction and operation of the ISS in order to make the astronauts (in the words of the NASA Administrator) "as safe outside as inside". For the moment (June 1997), the plan is to construct the Space Station using the low-pressure space suits that have been developed for the space shuttle. In the following, we will refer to this suit assembly as EMU (Extravehicular Mobility Unit). It is the product of a long evolution, starting from the U.S. Air Force pilot suits through the various versions and changes that occurred for the purpose of NASA space exploration, in particular during the Gemini and Apollo programs. The Shuttle EMU is composed of both soft fabrics and hard plates. As an alternative to the shuttle suit, at least two hard suits were developed by NASA: the AX-5 and the Mark III. The problem of producing hard suits for space exploration is very similar to that of producing deep-sea diving suits.
There was thus an opportunity to develop a suit that could be manufactured for both purposes with the economies of scale that could be gained from a two-branch manufacturing line (space and deep sea). Of course, the space suit would need to be space qualified. Some of the problems in adopting one of the hard suits were first that the testing had to be completed, and second that it required additional storage space. The decision was made not to develop a hard suit in time for the construction and operation of the ISS. Instead, to improve the safety of the current suit, it was decided to reinforce the soft parts of the shuttle EMU with KEVLAR linings to strengthen it against debris impacts. Test results, however, show that this advanced suit design has little effect on the penetration characteristics.
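The debris-hit risk that drives such assessments is conventionally modeled as a Poisson process over exposed area and EVA time. A minimal sketch, with every flux and exposure number purely illustrative:

```python
import math

# Poisson hit model: P(at least one penetration) = 1 - exp(-N), where
# N = flux * exposed area * exposure time. All values below are invented
# for illustration, not actual debris-environment or EVA figures.
def hit_probability(flux_per_m2_yr, area_m2, hours):
    expected_hits = flux_per_m2_yr * area_m2 * (hours / 8760.0)
    return 1.0 - math.exp(-expected_hits)

# e.g. an assumed flux of 1e-4 penetrating particles per m^2 per year,
# 3 m^2 of exposed suit area, 1500 EVA-hours of construction work:
p = hit_probability(1e-4, 3.0, 1500.0)
print(f"P(hit) = {p:.2e}")
```

The model makes clear why suit-material penetration resistance and total EVA hours are the two levers a risk manager can pull.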
Recent advances on terrain database correlation testing
NASA Astrophysics Data System (ADS)
Sakude, Milton T.; Schiavone, Guy A.; Morelos-Borja, Hector; Martin, Glenn; Cortes, Art
1998-08-01
Terrain database correlation is a major requirement for interoperability in distributed simulation. There are numerous situations in which terrain database correlation problems can occur that, in turn, lead to lack of interoperability in distributed training simulations. Examples are the use of different run-time terrain databases derived from inconsistent source data, the use of different resolutions, and the use of different data models between databases for both terrain and culture data. IST has been developing a suite of software tools, named ZCAP, to address terrain database interoperability issues. In this paper we discuss recent enhancements made to this suite, including improved algorithms for sampling and calculating line-of-sight, an improved method for measuring terrain roughness, and the application of a sparse matrix method to the terrain remediation solution developed at the Visual Systems Lab of the Institute for Simulation and Training. We review the application of some of these new algorithms to the terrain correlation measurement processes. The application of these new algorithms improves our support for very large terrain databases, and provides the capability for performing test replications to estimate the sampling error of the tests. With this set of tools, a user can quantitatively assess the degree of correlation between large terrain databases.
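One of the correlation primitives mentioned above, a line-of-sight test, can be sketched by sampling a sight line against a heightfield. This is illustrative code, not the ZCAP implementation:

```python
# A minimal line-of-sight test over a heightfield, the kind of primitive a
# terrain-correlation suite evaluates many times on each database.
def line_of_sight(height, a, b, samples=100):
    """True if the straight segment from a to b clears the terrain.
    height(x, y) -> terrain elevation; a, b are (x, y, z) observer/target."""
    (xa, ya, za), (xb, yb, zb) = a, b
    for i in range(1, samples):
        t = i / samples
        x, y = xa + t * (xb - xa), ya + t * (yb - ya)
        z = za + t * (zb - za)              # elevation of the sight line
        if height(x, y) > z:                # terrain blocks the ray
            return False
    return True

# Flat terrain with a single ridge near x = 5:
terrain = lambda x, y: 10.0 if 4.5 < x < 5.5 else 0.0
print(line_of_sight(terrain, (0, 0, 2), (10, 0, 2)))    # blocked by the ridge
print(line_of_sight(terrain, (0, 0, 12), (10, 0, 12)))  # clears the ridge
```

Two databases are "correlated" for this primitive when such queries return the same answer on both; replicating the test over many random ray pairs gives the sampling-error estimate the abstract refers to.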
Lunar Meteorites: A Global Geochemical Dataset
NASA Technical Reports Server (NTRS)
Zeigler, R. A.; Joy, K. H.; Arai, T.; Gross, J.; Korotev, R. L.; McCubbin, F. M.
2017-01-01
To date, the world's meteorite collections contain over 260 lunar meteorite stones representing at least 120 different lunar meteorites. Additionally, there are 20-30 as yet unnamed stones currently in the process of being classified. Collectively these lunar meteorites likely represent 40-50 distinct sampling locations from random locations on the Moon. Although the exact provenance of each individual lunar meteorite is unknown, collectively the lunar meteorites represent the best global average of the lunar crust. The Apollo sites are all within or near the Procellarum KREEP Terrane (PKT), thus lithologies from the PKT are overrepresented in the Apollo sample suite. Nearly all of the lithologies present in the Apollo sample suite are found within the lunar meteorites (high-Ti basalts are a notable exception), and the lunar meteorites contain several lithologies not present in the Apollo sample suite (e.g., magnesian anorthosite). This chapter will not be a sample-by-sample summary of each individual lunar meteorite. Rather, the chapter will summarize the different types of lunar meteorites and their relative abundances, comparing and contrasting the lunar meteorite sample suite with the Apollo sample suite. This chapter will act as one of the introductory chapters to the volume, introducing lunar samples in general and setting the stage for more detailed discussions in later more specialized chapters. The chapter will begin with a description of how lunar meteorites are ejected from the Moon, from what depths samples are being excavated, what the likely pairing relationships are among the lunar meteorite samples, and how the lunar meteorites can help to constrain the impactor flux in the inner solar system. There will be a discussion of the biases inherent to the lunar meteorite sample suite in terms of underrepresented lithologies or regions of the Moon, and an examination of the contamination and limitations of lunar meteorites due to terrestrial weathering.
The bulk of the chapter will use examples from the lunar meteorite suite to examine important recent advances in lunar science, including (but not limited to) the following: (1) understanding the global compositional diversity of the lunar surface; (2) understanding the formation of the ancient lunar primary crust; (3) understanding the diversity and timing of mantle melting, and secondary crust formation; (4) comparing KREEPy lunar meteorites to KREEPy Apollo samples as evidence of variability within the PKT; and (5) a better understanding of the South Pole-Aitken Basin through lunar meteorites whose provenance is within that Terrane.
Hybrid Enhanced Epidermal SpaceSuit Design Approaches
NASA Astrophysics Data System (ADS)
Jessup, Joseph M.
A space suit that does not rely on gas pressurization is a multi-faceted problem that requires major stability controls to be incorporated during design and construction. The Hybrid Epidermal Enhancement (HEE) space suit concept integrates evolved human anthropomorphic and physiological adaptations into its functionality, using commercially available bio-medical technologies to address shortcomings of conventional gas pressure suits and the impracticalities of MCP suits. The prototype HEE space suit explored integumentary homeostasis, thermal control and mobility using advanced bio-medical materials technology and construction concepts. The goal was a space suit that functions as an enhanced, multi-functional bio-mimic of the human epidermal layer that works in attunement with the wearer rather than as a separate system. In addressing human physiological requirements for design and construction of the HEE suit, testing regimes were devised and integrated into the prototype, which was then subjected to a series of detailed tests using both anatomical reproduction methods and human subjects.
NASA Technical Reports Server (NTRS)
Elardo, S. M.; Shearer, C. K.; McCubbin, F. M.
2017-01-01
The lunar magnesian-suite, or Mg-suite, is a series of ancient plutonic rocks from the lunar crust. They have received a considerable amount of attention from lunar scientists since their discovery for three primary reasons: 1) their ages and geochemistry indicate they represent pristine magmatic samples that crystallized very soon after the formation of the Moon; 2) their ages often overlap with ages of the ferroan anorthosite (FAN) crust; and 3) planetary-scale processes are needed in formation models to account for their unique geochemical features. Taken as a whole, the Mg-suite samples, as magmatic cumulate rocks, approximate a fractional crystallization sequence in the low-pressure forsterite-anorthite-silica system, and thus these samples are generally thought to be derived from layered mafic intrusions which crystallized very slowly from magmas that intruded the anorthositic crust. However, no direct linkages have been established between different Mg-suite samples based either on field relationships or geochemistry. The model for the origin of the Mg-suite, which best fits the limited available data, is one where Mg-suite magmas form from melting of a hybrid cumulate package consisting of deep mantle dunite, crustal anorthosite, and KREEP (potassium-rare earth elements-phosphorus) at the base of the crust under the Procellarum KREEP Terrane (PKT). In this model, these three LMO (Lunar Magma Ocean) cumulate components are brought into close proximity by the cumulate overturn process. Deep mantle dunitic cumulates with an Mg number of approximately 90 rise to the base of the anorthositic crust due to their buoyancy relative to colder, more dense Fe- and Ti-rich cumulates. This hybridized source rock melts to form Mg-suite magmas, saturated in Mg-rich olivine and anorthitic plagioclase, that have a substantial KREEP component.
NASA Astrophysics Data System (ADS)
Ransom, Stephen; Böttcher, Jörg; Steinsiek, Frank
The Astrium Space Infrastructure Division has begun an in-house research activity of an Earth-based simulation facility supporting future manned missions to Mars. This research unit will help to prepare and support planned missions in the following ways: 1) to enable the investigation and analysis of contamination issues in advance of a human visit to Mars; 2) as a design tool to investigate and simulate crew operations; 3) to simulate crew operation during an actual mission; 4) to enable on-surface scientific operations without leaving the shirt-sleeve habitation environment ("glove box principle"). The MESA module is a surface EVA facility attached to the main habitation or laboratory module, or mobile pressurized rover. It will be sealed, but not pressurized, and provide protection against the harsh Martian environment. This module will include a second crew airlock for safety reasons. The compartment can also be used to provide an external working bench and experiment area for the crew. A simpler MESA concept provides only an open shelter against wind and dust. This concept does not incorporate working and experimental areas. The principal idea behind the MESA concept is to tackle the issue of contamination by minimizing the decontamination processes needed to clean surface equipment and crew suit surfaces after an EVA excursion prior to the astronaut re-entering the habitable area. The technical solution envisages the use of a dedicated crew suit airlock. This airlock uses an EVA suit which is externally attached by its back-pack to the EVA compartment area facing the Martian environment. The crew dons the suit from inside the habitable volume through the airlock on the back of the suit. The surface EVA can be accomplished after closing the back-pack and detaching the suit. A special technical design concept foresees an extendable suit back-pack, so that the astronaut can operate outside and in the vicinity of the module.
The key driver in the investigation is the problem of contamination of the habitable volume by EVA and sampling activities and the transport of Earth-generated contaminants to Mars.
Mei, Suyu; Zhu, Hao
2015-01-26
Protein-protein interaction (PPI) prediction is generally treated as a problem of binary classification wherein negative data sampling is still an open problem to be addressed. The commonly used random sampling is prone to yield less representative negative data with considerable false negatives. Meanwhile, rational constraints are seldom exerted on model selection to reduce the risk of false positive predictions for most of the existing computational methods. In this work, we propose a novel negative data sampling method based on one-class SVM (support vector machine) to predict proteome-wide protein interactions between the HTLV retrovirus and Homo sapiens, wherein one-class SVM is used to choose reliable and representative negative data, and two-class SVM is used to yield proteome-wide outcomes as predictive feedback for rational model selection. Computational results suggest that one-class SVM is better suited to be used as a negative data sampling method than a two-class PPI predictor, and the predictive feedback constrained model selection helps to yield a rational predictive model that reduces the risk of false positive predictions. Some predictions have been validated by the recent literature. Lastly, gene ontology based clustering of the predicted PPI networks is conducted to provide valuable cues for the pathogenesis of the HTLV retrovirus.
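The negative-sampling idea can be sketched with a simple centroid-distance score standing in for the one-class SVM decision function; all data below are synthetic, and the threshold choice is arbitrary:

```python
import numpy as np

# Candidate negatives that look too similar to known positives are discarded
# as likely false negatives; the rest serve as "reliable" negative data.
rng = np.random.default_rng(0)
positives = rng.normal(loc=1.0, scale=0.3, size=(50, 8))    # known PPI pairs
candidates = rng.normal(loc=0.0, scale=1.0, size=(200, 8))  # unlabeled pairs

centroid = positives.mean(axis=0)
dist = np.linalg.norm(candidates - centroid, axis=1)
threshold = np.quantile(dist, 0.5)           # keep the least positive-like half
reliable_negatives = candidates[dist > threshold]
print(len(reliable_negatives))
```

A one-class SVM replaces the centroid distance with a learned boundary around the positive class, but the filtering logic is the same: train the final two-class model only on negatives that fall clearly outside that boundary.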
Local classifier weighting by quadratic programming.
Cevikalp, Hakan; Polikar, Robi
2008-10-01
It has been widely accepted that the classification accuracy can be improved by combining outputs of multiple classifiers. However, how to combine multiple classifiers with various (potentially conflicting) decisions is still an open problem. A rich collection of classifier combination procedures -- many of which are heuristic in nature -- have been developed for this goal. In this brief, we describe a dynamic approach to combine classifiers that have expertise in different regions of the input space. To this end, we use local classifier accuracy estimates to weight classifier outputs. Specifically, we estimate local recognition accuracies of classifiers near a query sample by utilizing its nearest neighbors, and then use these estimates to find the best weights of classifiers to label the query. The problem is formulated as a convex quadratic optimization problem, which returns optimal nonnegative classifier weights with respect to the chosen objective function, and the weights ensure that locally most accurate classifiers are weighted more heavily for labeling the query sample. Experimental results on several data sets indicate that the proposed weighting scheme outperforms other popular classifier combination schemes, particularly on problems with complex decision boundaries. Hence, the results indicate that local classification-accuracy-based combination techniques are well suited for decision making when the classifiers are trained by focusing on different regions of the input space.
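The local-accuracy weighting scheme can be sketched as follows. Note the paper formulates a convex QP; this simplified stand-in just normalizes k-nearest-neighbour accuracies into nonnegative weights that sum to one, and all data are synthetic:

```python
import numpy as np

# Dynamic classifier weighting: each classifier is weighted by how often it
# was correct on the query's k nearest training neighbours.
def local_weights(train_X, train_y, preds, query, k=5):
    """preds: (n_classifiers, n_train) array of each classifier's training
    predictions; returns nonnegative weights summing to 1."""
    dist = np.linalg.norm(train_X - query, axis=1)
    nn = np.argsort(dist)[:k]                         # query's neighbourhood
    acc = (preds[:, nn] == train_y[nn]).mean(axis=1)  # local accuracy per clf
    acc = np.maximum(acc, 1e-9)                       # avoid all-zero weights
    return acc / acc.sum()

rng = np.random.default_rng(1)
X = rng.normal(size=(30, 2))
y = (X[:, 0] > 0).astype(int)
preds = np.stack([y, 1 - y])           # clf 0 always right, clf 1 always wrong
w = local_weights(X, y, preds, query=np.array([0.1, 0.0]))
print(w)
```

The QP formulation in the paper optimizes these weights against a chosen objective instead of normalizing accuracies directly, but both approaches guarantee that locally more accurate classifiers dominate the combined label.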
Proof of Concept for the Rewrite Rule Machine: Interensemble Studies
1994-02-23
[Figure 1: Concurrent Rewriting of Fibonacci Expressions] ... exploit a problem's parallelism at several levels. We call this ... property multigrain concurrency; it makes the RRM very well suited for solving not only homogeneous problems, but also complex, locally homogeneous but ... interprocessor message passing over a network is not well suited to data parallelism. A key goal of the RRM is to combine the best of these two approaches in a
Enhanced Verification Test Suite for Physics Simulation Codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kamm, J R; Brock, J S; Brandon, S T
2008-10-10
This document discusses problems with which to augment, in quantity and in quality, the existing tri-laboratory suite of verification problems used by Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and Sandia National Laboratories (SNL). The purpose of verification analysis is to demonstrate whether the numerical results of the discretization algorithms in physics and engineering simulation codes provide correct solutions of the corresponding continuum equations. The key points of this document are: (1) Verification deals with mathematical correctness of the numerical algorithms in a code, while validation deals with physical correctness of a simulation in a regime of interest. This document is about verification. (2) The current seven-problem Tri-Laboratory Verification Test Suite, which has been used for approximately five years at the DOE WP laboratories, is limited. (3) Both the methodology for and the technology used in verification analysis have evolved and been improved since the original test suite was proposed. (4) The proposed test problems are in three basic areas: (a) hydrodynamics; (b) transport processes; and (c) dynamic strength-of-materials. (5) For several of the proposed problems we provide a 'strong sense verification benchmark', consisting of (i) a clear mathematical statement of the problem with sufficient information to run a computer simulation, (ii) an explanation of how the code result and benchmark solution are to be evaluated, and (iii) a description of the acceptance criterion for simulation code results. (6) It is proposed that the set of verification test problems with which any particular code is evaluated include some of the problems described in this document. Analysis of the proposed verification test problems constitutes part of a necessary, but not sufficient, step that builds confidence in physics and engineering simulation codes.
More complicated test cases, including physics models of greater sophistication or other physics regimes (e.g., energetic material response, magneto-hydrodynamics), would represent a scientifically desirable complement to the fundamental test cases discussed in this report. The authors believe that this document can be used to enhance the verification analyses undertaken at the DOE WP Laboratories and, thus, to improve the quality, credibility, and usefulness of the simulation codes that are analyzed with these problems.
BioPreDyn-bench: a suite of benchmark problems for dynamic modelling in systems biology.
Villaverde, Alejandro F; Henriques, David; Smallbone, Kieran; Bongard, Sophia; Schmid, Joachim; Cicin-Sain, Damjan; Crombach, Anton; Saez-Rodriguez, Julio; Mauch, Klaus; Balsa-Canto, Eva; Mendes, Pedro; Jaeger, Johannes; Banga, Julio R
2015-02-20
Dynamic modelling is one of the cornerstones of systems biology. Many research efforts are currently being invested in the development and exploitation of large-scale kinetic models. The associated problems of parameter estimation (model calibration) and optimal experimental design are particularly challenging. The community has already developed many methods and software packages which aim to facilitate these tasks. However, there is a lack of suitable benchmark problems which allow a fair and systematic evaluation and comparison of these contributions. Here we present BioPreDyn-bench, a set of challenging parameter estimation problems which aspire to serve as reference test cases in this area. This set comprises six problems including medium and large-scale kinetic models of the bacterium E. coli, baker's yeast S. cerevisiae, the vinegar fly D. melanogaster, Chinese Hamster Ovary cells, and a generic signal transduction network. The level of description includes metabolism, transcription, signal transduction, and development. For each problem we provide (i) a basic description and formulation, (ii) implementations ready-to-run in several formats, (iii) computational results obtained with specific solvers, (iv) a basic analysis and interpretation. This suite of benchmark problems can be readily used to evaluate and compare parameter estimation methods. Further, it can also be used to build test problems for sensitivity and identifiability analysis, model reduction and optimal experimental design methods. The suite, including codes and documentation, can be freely downloaded from the BioPreDyn-bench website: https://sites.google.com/site/biopredynbenchmarks/.
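The parameter-estimation task these benchmarks pose can be illustrated on a one-parameter toy model. The data below are synthetic and noise-free; the real benchmark problems are far larger and are attacked with metaheuristic or multistart local solvers rather than a grid search.

```python
import numpy as np

# Toy calibration task: recover the kinetic parameter k of dx/dt = -k*x,
# whose analytic solution is x(t) = x0 * exp(-k*t), by minimizing the sum
# of squared errors against pseudo-measurements.
t = np.linspace(0.0, 5.0, 20)
k_true, x0 = 0.8, 2.0
data = x0 * np.exp(-k_true * t)          # noise-free pseudo-measurements

def sse(k):
    return np.sum((x0 * np.exp(-k * t) - data) ** 2)

# Crude global search over a parameter grid:
grid = np.linspace(0.1, 2.0, 1901)
k_hat = grid[np.argmin([sse(k) for k in grid])]
print(f"estimated k = {k_hat:.3f}")
```

Scaling this objective to hundreds of parameters of a stiff ODE system, with noisy and partially observed data, is precisely what makes the benchmark problems challenging.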
Development of a space activity suit
NASA Technical Reports Server (NTRS)
Annis, J. F.; Webb, P.
1971-01-01
The development of a series of prototype space activity suit (SAS) assemblies is discussed. The SAS is a new type of pressure suit designed especially for extravehicular activity. It consists of a set of carefully tailored elastic fabric garments which have been engineered to supply sufficient counterpressure to the body to permit subjects to breathe O2 at pressures up to 200 mm Hg without circulatory difficulty. A closed, positive pressure breathing system (PPBS) and a full bubble helmet were also developed to complete the system. The ultimate goal of the SAS is to improve the range of activity and decrease the energy cost of work associated with wearing conventional gas-filled pressure suits. Results are presented from both laboratory (1 atmosphere) and altitude chamber tests with subjects wearing various SAS assemblies. In laboratory tests lasting up to three hours, the SAS was worn while subjects breathed O2 at pressures up to 170 mm Hg without developing physiological problems. The only physiological symptoms apparent were a moderate tachycardia related to breathing pressures above 130 mm Hg, and a small collection of edema fluid in the hands. Both problems were considered to be related to areas of under-pressurization by the garments. These problems, it is suggested, can ultimately be corrected by the development of new elastic fabrics and tailoring techniques. Energy cost of activity, and mobility and dexterity of subjects in the SAS, were found to be superior to those in comparable tests on subjects in full pressure suits.
Scalable approximate policies for Markov decision process models of hospital elective admissions.
Zhu, George; Lizotte, Dan; Hoey, Jesse
2014-05-01
To demonstrate the feasibility of using stochastic simulation methods for the solution of a large-scale Markov decision process model of on-line patient admissions scheduling. The problem of admissions scheduling is modeled as a Markov decision process in which the states represent the numbers of patients using each of a number of resources. We investigate current state-of-the-art real-time planning methods to compute solutions to this Markov decision process. Due to the complexity of the model, traditional model-based planners are limited in scalability because they require an explicit enumeration of the model dynamics. To overcome this challenge, we apply sample-based planners along with efficient simulation techniques that, given an initial start state, generate an action on demand while avoiding portions of the model that are irrelevant to the start state. We also propose a novel variant of a popular sample-based planner that is particularly well suited to the elective admissions problem. Results show that the stochastic simulation methods allow the problem size to be scaled by a factor of almost 10 in the action space, and exponentially in the state space. We have demonstrated our approach on a problem with 81 actions, four specialities and four treatment patterns, and shown that we can generate near-optimal solutions in about 100 s. Sample-based planners are a viable alternative to state-based planners for large Markov decision process models of elective admissions scheduling. Copyright © 2014 Elsevier B.V. All rights reserved.
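A minimal sketch of the sample-based idea, using a toy admissions model invented here (not the paper's 81-action model): given a start state, each candidate action is scored by averaging sampled rollout returns, so only the parts of the model reachable from that state are ever touched.

```python
import random

random.seed(0)

# Toy model: state = number of occupied beds (capacity 10).
# Action = number of elective patients admitted this step (0, 1, or 2).
CAPACITY, GAMMA = 10, 0.9

def step(state, action):
    """Sample one transition: admissions arrive, some patients discharge."""
    occupied = min(CAPACITY, state + action)
    discharges = sum(random.random() < 0.2 for _ in range(occupied))
    # Reward admissions; penalize demand that exceeds capacity.
    reward = action - 5 * max(0, state + action - CAPACITY)
    return occupied - discharges, reward

def rollout(state, action, depth=15):
    """Return one sampled discounted return for taking (state, action)."""
    total, discount = 0.0, 1.0
    for _ in range(depth):
        state, r = step(state, action)
        total += discount * r
        discount *= GAMMA
        action = random.choice([0, 1, 2])  # random default policy thereafter
    return total

def plan(state, n_rollouts=300):
    """On-demand action selection: average sampled returns per action."""
    q = {a: sum(rollout(state, a) for _ in range(n_rollouts)) / n_rollouts
         for a in [0, 1, 2]}
    return max(q, key=q.get)
```

From an empty ward, `plan(0)` favors admitting the maximum number of patients, since overflow is unlikely; no transition matrix over the full state space is ever enumerated, which is what makes this style of planner scale.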
Sekiguchi, Yuki; Yamamoto, Masaki; Oroguchi, Tomotaka; Takayama, Yuki; Suzuki, Shigeyuki; Nakasako, Masayoshi
2014-11-01
Using our custom-made diffraction apparatus KOTOBUKI-1 and two multiport CCD detectors, cryogenic coherent X-ray diffraction imaging experiments have been undertaken at the SPring-8 Angstrom Compact free electron LAser (SACLA) facility. To perform experiments and data processing efficiently, two software suites with user-friendly graphical user interfaces have been developed. The first is a program suite named IDATEN, which was developed to easily conduct four procedures during experiments: aligning KOTOBUKI-1, loading a flash-cooled sample into the cryogenic goniometer stage inside the vacuum chamber of KOTOBUKI-1, adjusting the sample position with respect to the X-ray beam using a pair of telescopes, and collecting diffraction data by raster-scanning the sample with X-ray pulses. The second, named G-SITENNO, is an automated version of the original SITENNO suite, which was designed for processing diffraction data. These user-friendly software suites are now indispensable for collecting a large number of diffraction patterns and for processing the patterns immediately after data collection within a limited beam time.
Multi-scale image segmentation and numerical modeling in carbonate rocks
NASA Astrophysics Data System (ADS)
Alves, G. C.; Vanorio, T.
2016-12-01
Numerical methods based on computational simulations can be an important tool in estimating the physical properties of rocks. These can complement experimental results, especially when time constraints and sample availability are a problem. However, computational models created at different scales can yield results that conflict with physical laboratory measurements. This problem is exacerbated in carbonate rocks due to their heterogeneity at all scales. We developed a multi-scale approach performing segmentation of the rock images and numerical modeling across several scales, accounting for those heterogeneities. As a first step, we measured the porosity and the elastic properties of a group of carbonate samples with varying micrite content. Then, samples were imaged by Scanning Electron Microscope (SEM) as well as optical microscope at different magnifications. We applied three different image segmentation techniques to create numerical models from the SEM images and performed numerical simulations of the elastic wave equation. Our results show that a multi-scale approach can efficiently account for micro-porosities in tight micrite-supported samples, yielding acoustic velocities comparable to those obtained experimentally. Nevertheless, in high-porosity samples characterized by a larger grain/micrite ratio, results show that SEM-scale images tend to overestimate velocities, mostly due to their inability to capture macro- and/or intragranular porosity. This suggests that, for high-porosity carbonate samples, optical microscope images would be better suited for numerical simulations.
Sm-Nd Isotopic Systematics of Troctolite 76335
NASA Technical Reports Server (NTRS)
Edmunson, J.; Nyquist, L. E.; Borg, L. E.
2007-01-01
A study of the Sm-Nd isotopic systematics of lunar Mg-suite troctolite 76335 was undertaken to further establish the early chronology of lunar magmatism. Because the Rb-Sr isotopic systematics of the similar sample 76535 yielded an age of 4570 +/- 70 Ma [2; lambda = 1.402 x 10^-11 yr^-1], 76335 was expected to yield an old age. In contrast, the Sm-Nd and K-Ar ages of 76535 indicate that the sample is approximately 4260 Ma old, one of the youngest ages obtained for a Mg-suite rock. This study establishes the age of 76335 and discusses the constraints placed on its petrogenesis by its Sm-Nd isotopic systematics. The Sm-Nd isotopic system of lunar Mg-suite troctolite 76335 indicates an age of 4278 +/- 60 Ma with an initial epsilon(143Nd) value of 0.06 +/- 0.39. These values are consistent with the Sm-Nd isotopic systematics of the similar sample 76535. Thus, it appears that a robust Sm-Nd age can be determined from a highly brecciated lunar sample. The Sm-Nd isotopic systematics of troctolites 76335 and 76535 appear to be different from those dominating the Mg-suite norites and KREEP basalts. Further analysis of the Mg-suite must be completed to reveal the isotopic relationships of these early lunar rocks.
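Ages like those quoted above come from isochron slopes via the standard decay relation t = ln(1 + slope) / lambda. A small sketch using the Rb-Sr decay constant quoted in the abstract (the slope value below is chosen for illustration, not taken from the paper):

```python
import math

# 87Rb decay constant quoted in the abstract (per year).
LAMBDA_RB = 1.402e-11

def isochron_age(slope):
    """Age in years from an isochron slope: t = ln(1 + slope) / lambda."""
    return math.log(1.0 + slope) / LAMBDA_RB

# A slope near 0.0662 corresponds to an age of about 4.57 Ga,
# the Rb-Sr age quoted for sample 76535.
age = isochron_age(0.0662)
```

The same relation applies to the Sm-Nd system with its own decay constant, which is how the 4278 +/- 60 Ma age for 76335 is derived from its mineral isochron.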
The experience in operation and improving the Orlan-type space suits.
Abramov, I P
1995-07-01
Significant experience has now been gained in Russia with extravehicular activity (EVA) by cosmonauts wearing semi-rigid space suits of the "Orlan" type. The conditions for the cosmonauts' vital activities, the operational and ergonomic features of the space suit, and its reliability are the most critical factors defining the efficiency of the scheduled operations to be performed by the cosmonaut, as well as his safety. As the missions performed by cosmonauts during EVA become more and more elaborate, the requirements for EVA space suits and their systems become more and more demanding, resulting in their consistent advancement. This paper provides results from the space suit's operation and an analysis of its major problems as applied to the Salyut and Mir orbiting stations. The modification steps of the space suit in the course of operation (Orlan-D, Orlan-DM, Orlan-DMA) and its specific features are presented. The concept of suited-cosmonaut safety is described, as well as trends for future space suit improvements.
Surgical suite environmental control system. [using halothane absorbing filter
NASA Technical Reports Server (NTRS)
Higginbotham, E. J.; Jacobs, M. L.
1974-01-01
Theoretical and experimental work for a systems-analysis approach to the problem of surgical suite exhaust systems centered on the evaluation of halothane-absorbing filters. An activated charcoal-alumina-charcoal combination proved to be the best filter for eliminating halothane through multilayer adsorption of gas molecules.
NASA Astrophysics Data System (ADS)
Eric, L.; Vrugt, J. A.
2010-12-01
Spatially distributed hydrologic models potentially contain hundreds of parameters that need to be derived by calibration against a historical record of input-output data. The quality of this calibration strongly determines the predictive capability of the model and thus its usefulness for science-based decision making and forecasting. Unfortunately, high-dimensional optimization problems are typically difficult to solve. Here we present our recent developments to the Differential Evolution Adaptive Metropolis (DREAM) algorithm (Vrugt et al., 2009) to enable efficient solution of high-dimensional parameter estimation problems. The algorithm samples from an archive of past states (Ter Braak and Vrugt, 2008) and uses multiple-try Metropolis sampling (Liu et al., 2000) to decrease the required burn-in time for each individual chain and increase the efficiency of posterior sampling. This approach is hereafter referred to as MT-DREAM. We present results for two synthetic mathematical case studies and two real-world examples involving between 10 and 240 parameters. Results for these cases show that our multiple-try sampler, MT-DREAM, can consistently find better solutions than other Bayesian MCMC methods. Moreover, MT-DREAM is admirably suited to be implemented and run on a parallel machine and is therefore a powerful method for posterior inference.
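The core proposal mechanism of DREAM-family samplers, taking scaled differences of other chains' states (Ter Braak and Vrugt, 2008), can be sketched on a toy one-dimensional target. This is a simplified illustration, not the MT-DREAM implementation, and it omits the multiple-try step and the past-states archive:

```python
import math
import random

random.seed(1)

def log_post(x):
    """Toy posterior: standard normal, up to an additive constant."""
    return -0.5 * x * x

# Three parallel chains; each proposal is a scaled difference of two other
# chains' states plus a small jitter, the differential-evolution MCMC idea.
chains = [random.uniform(-3.0, 3.0) for _ in range(3)]
gamma = 1.68                      # ~ 2.38 / sqrt(2 d) for dimension d = 1
samples = []
for _ in range(5000):
    for i in range(3):
        a, b = random.sample([j for j in range(3) if j != i], 2)
        prop = chains[i] + gamma * (chains[a] - chains[b]) + random.gauss(0, 0.05)
        # Standard Metropolis accept/reject step.
        if math.log(random.random()) < log_post(prop) - log_post(chains[i]):
            chains[i] = prop
        samples.append(chains[i])

burn = samples[len(samples) // 2:]            # discard first half as burn-in
mean = sum(burn) / len(burn)
var = sum((s - mean) ** 2 for s in burn) / len(burn)
```

Because proposal steps are built from the chains' own spread, the sampler adapts its step size to the posterior scale automatically, which is part of why DREAM-style methods work well in high dimensions.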
Plastic toy shark drifts through airlock in front of EMU suited MS Lenoir
NASA Technical Reports Server (NTRS)
1982-01-01
A plastic toy shark drifts through the airlock and around extravehicular mobility unit (EMU)-suited Mission Specialist (MS) Lenoir, who watches as the shark drifts past his left hand. Lenoir donned the EMU in preparation for a scheduled extravehicular activity (EVA), which was cancelled due to EMU problems.
DOT National Transportation Integrated Search
1996-01-01
An integrated cockpit display suite, the T-NASA (Taxiway Navigation and : Situation Awareness) system, is under development for NASA's Terminal Area : Productivity (TAP) Low-Visibility Landing and Surface Operations (LVLASO) : program. This system ha...
Multisample conversion of water to hydrogen by zinc for stable isotope determination
Kendall, C.; Coplen, T.B.
1985-01-01
Two techniques for the conversion of water to hydrogen for stable isotope ratio determination have been developed that are especially suited for automated multisample analysis. Both procedures involve reaction of zinc shot with a water sample at 450 °C. In one method, designed for water samples in bottles, the water is placed in capillaries and reduced by zinc in reaction vessels; overall savings in sample preparation labor of 75% have been realized over the standard uranium reduction technique. The second technique, for waters evolved under vacuum, is a sealed-tube method employing 9 mm o.d. quartz tubing. Problems inherent in zinc reduction include surface inhomogeneity of the zinc and exchange of hydrogen both with the zinc and with the glass walls of the vessels. For best results, the water/zinc and water/glass surface area ratios of the vessels should be kept as large as possible.
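Once the water is converted to H2 and measured, the isotope results are conventionally reported in delta notation relative to a standard. A small sketch (the VSMOW D/H ratio used below is the commonly quoted approximate value, not a number from this paper):

```python
# Delta notation for hydrogen isotopes, in per mil relative to a standard
# (VSMOW for waters): delta = (R_sample / R_standard - 1) * 1000.
R_VSMOW = 155.76e-6   # approximate D/H ratio of the VSMOW standard

def delta_d(r_sample):
    """delta-D in per mil for a sample with D/H ratio r_sample."""
    return (r_sample / R_VSMOW - 1.0) * 1000.0
```

A sample with a D/H ratio 10% below the standard thus reports as delta-D of -100 per mil, the kind of value typical of mid-latitude meteoric waters.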
Enhanced sampling techniques in molecular dynamics simulations of biological systems.
Bernardi, Rafael C; Melo, Marcelo C R; Schulten, Klaus
2015-05-01
Molecular dynamics has emerged as an important research methodology covering systems up to the level of millions of atoms. However, insufficient sampling often limits its application. The limitation is due to rough energy landscapes, with many local minima separated by high-energy barriers, which govern biomolecular motion. In the past few decades, methods have been developed that address the sampling problem, such as replica-exchange molecular dynamics, metadynamics and simulated annealing. Here we present an overview of these sampling methods in an attempt to shed light on which should be selected depending on the type of system property studied. Enhanced sampling methods have been employed for a broad range of biological systems, and the choice of a suitable method is connected to the biological and physical characteristics of the system, in particular system size. While metadynamics and replica-exchange molecular dynamics are the most widely adopted sampling methods for studying biomolecular dynamics, simulated annealing is well suited to characterizing very flexible systems. The use of annealing methods was long restricted to simulations of small proteins; however, a variant of the method, generalized simulated annealing, can be employed at relatively low computational cost for large macromolecular complexes. Molecular dynamics trajectories frequently do not reach all relevant conformational substates, for example those connected with biological function, a problem that can be addressed by employing enhanced sampling algorithms. This article is part of a Special Issue entitled Recent developments of molecular dynamics. Copyright © 2014 Elsevier B.V. All rights reserved.
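Of the methods named above, simulated annealing is the simplest to sketch: accept uphill moves with probability exp(-dE/T) while the temperature T is gradually lowered, so the search can cross energy barriers early and settle into a deep minimum late. A minimal, hypothetical one-dimensional example (a rugged test function, not a biomolecular force field):

```python
import math
import random

random.seed(42)

def energy(x):
    """Rugged 1-D landscape: global minimum of 0 at x = 2, local minima nearby."""
    return (x - 2.0) ** 2 + 1.0 - math.cos(5.0 * (x - 2.0))

x = -5.0                            # start far from the global minimum
best, t = x, 2.0
for _ in range(20000):
    cand = x + random.gauss(0, 0.5)
    dE = energy(cand) - energy(x)
    # Accept downhill moves always, uphill moves with probability exp(-dE/T).
    if dE < 0 or random.random() < math.exp(-dE / t):
        x = cand
    t *= 0.9995                     # geometric cooling schedule
    if energy(x) < energy(best):
        best = x
```

The cooling schedule is the key tuning knob: cool too fast and the walker freezes in a local minimum, which is the trap that replica exchange and generalized simulated annealing are designed to avoid in high-dimensional biomolecular landscapes.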
Overview of the Development for a Suite of Low-Thrust Trajectory Analysis Tools
NASA Technical Reports Server (NTRS)
Kos, Larry D.; Polsgrove, Tara; Hopkins, Randall; Thomas, Dan; Sims, Jon A.
2006-01-01
A NASA intercenter team has developed a suite of low-thrust trajectory analysis tools that makes a significant improvement in three major facets of low-thrust trajectory and mission analysis: 1) ease of use, 2) ability to converge more robustly to solutions, and 3) higher-fidelity modeling and accuracy of results. Due mostly to the short duration of the development, the team concluded that a suite of tools was preferable to one integrated tool. This tool suite, the tools' characteristics, and their applicability are described. Trajectory analysts can read this paper and determine which tool is most appropriate for their problem.
NASA Astrophysics Data System (ADS)
Snyder, Gregory A.; Taylor, Lawrence A.; Halliday, Alex N.
1993-03-01
Several rocks of alkalic affinity from the western highlands of the Moon have been analyzed for their Nd and Sr isotopic compositions. One sample yields a Sm-Nd mineral isochron of 4110 +/- 41 Ma. This age, in conjunction with U-Pb zircon ages on two other alkalic rocks from the Apollo 14 landing site, suggests a distinct western highlands 'event' that was approximately 100 Ma in duration. Since the last dregs of the lunar magma ocean likely crystallized prior to 4.3 Ga, this alkalic 'event' may have included the re-melting of evolved plutons or the remobilization of urKREEP trapped liquid from upper-mantle cumulates. Alkalic lithologies such as granites and felsites have been known from the Moon since the earliest days of the Apollo lunar sample returns. However, not until 1977 were alkali-rich rocks recognized from typical highlands suites such as the ferroan anorthosites (FAN) and the norites and Mg-suite rocks. In the intervening years, several other alkali-suite samples have been discovered and characterized, mostly through labor-intensive breccia pull-apart studies of clasts and analyses of coarse-fine fractions of soils. We speculate on the origins of this suite of lunar highlands rocks.
Mass loss of shuttle space suit orthofabric under simulated ionospheric atomic oxygen bombardment
NASA Technical Reports Server (NTRS)
Miller, W. L.
1985-01-01
Many polymeric materials used for thermal protection and insulation on spacecraft degrade significantly under prolonged bombardment by ionospheric atomic oxygen. The covering fabric of the multilayered shuttle space suit is composed of a loose weave of GORE-TEX fibers, Nomex, and Kevlar-29, all of which are polymeric materials. A complete evaluation of suit fabric degradation from ionospheric atomic oxygen is important in reevaluating suit lifetime and inspection procedures. The mass loss and visible physical changes of each test sample were determined. Kapton control samples and data from previous asher and flight tests were used to scale the results to reflect ionospheric conditions at about 220 km altitude. It is predicted that the orthofabric loses mass in the ionosphere at a rate of about 66% of the original orthofabric mass per year. The outer layer of the two-layer orthofabric test samples shows few easily visible signs of degradation, even when observed at 440x magnification. It is concluded that the orthofabric could suffer significant loss of performance after much less than a year of total exposure time, while the degradation might be undetectable in post-flight visual examinations of space suits.
Predicting outcome of Internet-based treatment for depressive symptoms.
Warmerdam, Lisanne; Van Straten, Annemieke; Twisk, Jos; Cuijpers, Pim
2013-01-01
In this study we explored predictors and moderators of response to Internet-based cognitive behavioral therapy (CBT) and Internet-based problem-solving therapy (PST) for depressive symptoms. The sample consisted of 263 participants with moderate to severe depressive symptoms. Of those, 88 were randomized to CBT, 88 to PST and 87 to a waiting list control condition. Outcomes were improvement and clinically significant change in depressive symptoms after 8 weeks. Higher baseline depression and higher education predicted improvement, while higher education, less avoidance behavior and decreased rational problem-solving skills predicted clinically significant change across all groups. No variables were found that differentially predicted outcome between Internet-based CBT and Internet-based PST. More research is needed with sufficient power to investigate predictors and moderators of response to reveal for whom Internet-based therapy is best suited.
Mineralogy, petrology and chemistry of ANT-suite rocks from the lunar highlands
NASA Technical Reports Server (NTRS)
Prinz, M.; Keil, K.
1977-01-01
Anorthositic-noritic-troctolitic (ANT) rocks are the oldest and most abundant rocks of the lunar surface, comprising about 90% of the lunar highlands suite. Consideration is given to the mineralogy, petrology, bulk chemistry, and origin of ANT-suite rocks. Problems in classifying and labeling lunar highland rocks, caused by textural complexities arising from impact modification, are discussed. The mineralogy of ANT-suite rocks, dominated by plagioclase, olivine and pyroxene, and containing various minor minerals, is outlined. The petrology of ANT-suite rocks is reviewed along with the major-element bulk composition of these rocks, noting that they are extremely depleted in K2O and P2O5. Various models for the origin of ANT-suite rocks are summarized, and it is suggested that this origin involves a parental liquid of high-alumina basalt composition with low Fe/(Fe+Mg).
The ZPIC educational code suite
NASA Astrophysics Data System (ADS)
Calado, R.; Pardal, M.; Ninhos, P.; Helm, A.; Mori, W. B.; Decyk, V. K.; Vieira, J.; Silva, L. O.; Fonseca, R. A.
2017-10-01
Particle-in-Cell (PIC) codes are used in almost all areas of plasma physics, such as fusion energy research, plasma accelerators, space physics, ion propulsion, and plasma processing. In this work, we present the ZPIC educational code suite, a new initiative to foster training in plasma physics using computer simulations. Leveraging our expertise and experience from the development and use of the OSIRIS PIC code, we have developed a suite of 1D/2D fully relativistic electromagnetic PIC codes, as well as a 1D electrostatic code. These codes are self-contained and require only a standard laptop/desktop computer with a C compiler to run. The output files are written in a new file format called ZDF that can be easily read using the supplied routines in a number of languages, such as Python and IDL. The code suite also includes a number of example problems that can be used to illustrate several textbook and advanced plasma mechanisms, including instructions for parameter-space exploration. We also invite contributions to this repository of test problems, which will be made freely available to the community provided the input files comply with the format defined by the ZPIC team. The code suite is freely available and hosted on GitHub at https://github.com/zambzamb/zpic. Work partially supported by PICKSC.
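One building block of any PIC code, electromagnetic or electrostatic, is the leapfrog particle push, with velocity staggered half a time step from position. A minimal sketch (in Python for readability; ZPIC itself is written in C, and the field here is a fixed uniform E rather than one solved on a grid, so this shows only the push stage of the PIC cycle):

```python
QM, DT, E = -1.0, 0.01, 0.5   # charge/mass ratio, time step, uniform E field

def push(x, v, steps):
    """Leapfrog push: v lives at half-integer steps, x at integer steps."""
    v -= 0.5 * QM * E * DT     # shift v back half a step (staggering)
    for _ in range(steps):
        v += QM * E * DT       # kick: dv/dt = (q/m) E
        x += v * DT            # drift: dx/dt = v
    v += 0.5 * QM * E * DT     # unstagger v to the final integer step
    return x, v

# For a uniform field the scheme reproduces constant acceleration exactly:
# after t = 10, v = (q/m) E t = -5 and x = 0.5 (q/m) E t^2 = -25.
x, v = push(0.0, 0.0, 1000)
```

A full PIC step adds charge deposition onto the grid, a field solve, and interpolation of the field back to the particles; the leapfrog staggering above is what gives the method its second-order accuracy and good energy behavior.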
2013-04-08
This illustration shows the instruments and subsystems of the Sample Analysis at Mars (SAM) suite on the Curiosity rover of NASA's Mars Science Laboratory Project. SAM analyzes the gases in the Martian atmosphere.
Applying Bayesian statistics to the study of psychological trauma: A suggestion for future research.
Yalch, Matthew M
2016-03-01
Several contemporary researchers have noted the virtues of Bayesian methods of data analysis. Although debates continue about whether conventional or Bayesian statistics is the "better" approach for researchers in general, there are reasons why Bayesian methods may be well suited to the study of psychological trauma in particular. This article describes how Bayesian statistics offers practical solutions to the problems of data non-normality, small sample size, and missing data common in research on psychological trauma. After a discussion of these problems and the effects they have on trauma research, this article explains the basic philosophical and statistical foundations of Bayesian statistics and how it provides solutions to these problems using an applied example. Results of the literature review and the accompanying example indicate the utility of Bayesian statistics in addressing problems common in trauma research. Bayesian statistics provides a set of methodological tools and a broader philosophical framework that is useful for trauma researchers. Methodological resources are also provided so that interested readers can learn more. (c) 2016 APA, all rights reserved.
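For a concrete flavor of the small-sample case, a conjugate Beta-Binomial update gives an exact posterior with no large-n asymptotic assumptions; the numbers below are invented for illustration, not drawn from any study cited here:

```python
# Conjugate Beta-Binomial update: with a Beta(a, b) prior on a proportion
# and k successes observed in n trials, the posterior is Beta(a + k, b + n - k).
def posterior(a, b, k, n):
    return a + k, b + n - k

def beta_mean(a, b):
    """Mean of a Beta(a, b) distribution."""
    return a / (a + b)

# Hypothetical small sample: 3 of 8 participants report symptom remission,
# starting from a flat Beta(1, 1) prior.
a_post, b_post = posterior(1.0, 1.0, k=3, n=8)
```

The posterior Beta(4, 6) has mean 0.4 and a full distribution to quantify uncertainty, which is exactly the kind of inference a frequentist proportion test struggles to support at n = 8.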
Greenhouse Gas Source Attribution: Measurements Modeling and Uncertainty Quantification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Zhen; Safta, Cosmin; Sargsyan, Khachik
2014-09-01
In this project we have developed atmospheric measurement capabilities and a suite of atmospheric modeling and analysis tools that are well suited for verifying emissions of greenhouse gases (GHGs) on an urban-through-regional scale. We have for the first time applied the Community Multiscale Air Quality (CMAQ) model to simulate atmospheric CO2. This will allow for the examination of regional-scale transport and distribution of CO2, along with air pollutants traditionally studied using CMAQ, at relatively high spatial and temporal resolution, with the goal of leveraging emissions verification efforts for both air quality and climate. We have developed a bias-enhanced Bayesian inference approach that can remedy the well-known problem of transport model errors in atmospheric CO2 inversions. We have tested the approach using data and model outputs from the TransCom3 global CO2 inversion comparison project. We have also performed two prototyping studies on inversion approaches in the generalized convection-diffusion context. One of these studies employed Polynomial Chaos Expansion to accelerate the evaluation of a regional transport model and enable efficient Markov Chain Monte Carlo sampling of the posterior for Bayesian inference. The other approach uses deterministic inversion of a convection-diffusion-reaction system in the presence of uncertainty. These approaches should, in principle, be applicable to realistic atmospheric problems with moderate adaptation. We outline a regional greenhouse gas source inference system that integrates (1) two approaches to atmospheric dispersion simulation and (2) a class of Bayesian inference and uncertainty quantification algorithms. We use two different and complementary approaches to simulate atmospheric dispersion: a Eulerian chemical transport model (CMAQ) and a Lagrangian particle dispersion model (FLEXPART-WRF).
These two models share the same WRF-assimilated meteorology fields, making it possible to perform a hybrid simulation in which the Eulerian model (CMAQ) computes the initial condition needed by the Lagrangian model, while the source-receptor relationships for a large state vector are efficiently computed using the Lagrangian model in its backward mode. In addition, CMAQ has a complete treatment of the atmospheric chemistry of a suite of traditional air pollutants, many of which could help attribute GHGs to different sources. The inference of emissions sources using atmospheric observations is cast as a Bayesian model calibration problem, which is solved using a variety of Bayesian techniques, such as the bias-enhanced Bayesian inference algorithm, which accounts for intrinsic model deficiency; Polynomial Chaos Expansion, to accelerate model evaluation and Markov Chain Monte Carlo sampling; and Karhunen-Loève (KL) expansion, to reduce the dimensionality of the state space. We have established an atmospheric measurement site in Livermore, CA and are collecting continuous measurements of CO2, CH4, and other species that are typically co-emitted with these GHGs. Measurements of co-emitted species can assist in attributing the GHGs to different emissions sectors. Automatic calibrations using traceable standards are performed routinely for the gas-phase measurements. We are also collecting standard meteorological data at the Livermore site, as well as planetary boundary layer height measurements using a ceilometer. The location of the measurement site is well suited to sample air transported between the San Francisco Bay area and the California Central Valley.
Clay-mineral suites, sources, and inferred dispersal routes: Southern California continental shelf
Hein, J.R.; Dowling, J.S.; Schuetze, A.; Lee, H.J.
2003-01-01
Clay mineralogy is useful in determining the distribution, sources, and dispersal routes of fine-grained sediments. In addition, clay minerals, especially smectite, may control the degree to which contaminants are adsorbed by the sediment. We analyzed 250 shelf-sediment samples, 24 river-suspended-sediment samples, and 12 river-bed samples for clay-mineral contents in the Southern California Borderland from Point Conception to the Mexico border. In addition, six samples were analyzed from the Palos Verdes Headland in order to characterize the clay minerals contributed to the offshore from that point source. The <2 μm size fraction was isolated, Mg-saturated, and glycolated before analysis by X-ray diffraction. Semi-quantitative percentages of smectite, illite, and kaolinite plus chlorite were calculated using peak areas and standard weighting factors. Most fine-grained sediment is supplied to the shelf by rivers during major winter storms, especially during El Niño years. The largest sediment fluxes to the region are from the Santa Ynez and Santa Clara Rivers, which drain the Transverse Ranges. The mean clay-mineral suite for the entire shelf sediment data set (26% smectite, 50% illite, 24% kaolinite+chlorite) is closely comparable to the mean for all the rivers (31% smectite, 49% illite, 20% kaolinite+chlorite), indicating that the main source of shelf fine-grained sediments is the adjacent rivers. However, regional variations do exist, and the shelf is divided into four provinces with characteristic clay-mineral suites. The means of the clay-mineral suites of the two southernmost provinces are within analytical error of the mineral suites of adjacent rivers. The next province to the north includes Santa Monica Bay and has a suite of clay minerals derived from mixing of fine-grained sediments from several sources, both from the north and south.
The northernmost province clay-mineral suite matches moderately well that of the adjacent rivers, but does indicate some mixing from sources in adjacent provinces.
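The semi-quantitative calculation mentioned above (peak areas times standard weighting factors, normalized to 100%) can be sketched as follows. The weighting factors shown are the commonly used Biscaye-style values (smectite x1, illite x4, kaolinite+chlorite x2), assumed here for illustration since the abstract does not state the exact factors used:

```python
# Biscaye-style weighting factors for glycolated-run XRD peak areas
# (assumed values; the paper's exact factors are not given).
WEIGHTS = {"smectite": 1.0, "illite": 4.0, "kaolinite+chlorite": 2.0}

def clay_percentages(peak_areas):
    """Weight each mineral's peak area, then normalize to 100%."""
    weighted = {m: peak_areas[m] * w for m, w in WEIGHTS.items()}
    total = sum(weighted.values())
    return {m: 100.0 * v / total for m, v in weighted.items()}

# Hypothetical peak areas chosen so the result matches the shelf-wide mean
# suite quoted in the abstract (26% smectite, 50% illite, 24% kaol+chl).
pct = clay_percentages({"smectite": 52.0, "illite": 25.0,
                        "kaolinite+chlorite": 24.0})
```

The weighting compensates for the very different diffraction intensities of the clay species, which is why raw peak areas cannot be compared directly.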
Spectral Characterization of Analog Samples in Anticipation of OSIRIS-REx's Arrival at Bennu
NASA Technical Reports Server (NTRS)
Donaldson Hanna, K. L.; Schrader, D. L.; Bowles, N. E.; Clark, B. E.; Cloutis, E. A.; Connolly, H. C., Jr.; Hamilton, V. E.; Keller, L. P.; Lauretta, D. S.; Lim, L. F.;
2017-01-01
NASA's Origins, Spectral Interpretation, Resource Identification, and Security-Regolith Explorer (OSIRIS-REx) mission successfully launched on September 8th, 2016. During its rendezvous with near-Earth asteroid (101955) Bennu beginning in 2018, OSIRIS-REx will characterize the asteroid's physical, mineralogical, and chemical properties in an effort to globally map the properties of Bennu, a primitive carbonaceous asteroid, and choose a sampling location [e.g. 1]. In preparation for these observations, we spectrally characterized a suite of analog samples across visible, near-infrared, and thermal-infrared wavelengths and used these in initial tests of phase-detection and abundance-determination software algorithms. Here we present the thermal-infrared laboratory measurements of the analog sample suite measured under asteroid-like conditions, which are relevant to the interpretation of spectroscopic observations by the OSIRIS-REx Thermal Emission Spectrometer (OTES) [2, 3]. This suite of laboratory measurements of asteroid analogs under asteroid-like conditions is the first of its kind.
PFLOTRAN Verification: Development of a Testing Suite to Ensure Software Quality
NASA Astrophysics Data System (ADS)
Hammond, G. E.; Frederick, J. M.
2016-12-01
In scientific computing, code verification ensures the reliability and numerical accuracy of a model simulation by comparing the simulation results to experimental data or known analytical solutions. The model is typically defined by a set of partial differential equations with initial and boundary conditions, and verification ensures that the mathematical model is solved correctly by the software. Code verification is especially important if the software is used to model high-consequence systems which cannot be physically tested in a fully representative environment [Oberkampf and Trucano (2007)]. Justified confidence in a particular computational tool requires clarity in the exercised physics and transparency in its verification process with proper documentation. We present a quality assurance (QA) testing suite developed by Sandia National Laboratories that performs code verification for PFLOTRAN, an open source, massively parallel subsurface simulator. PFLOTRAN solves systems of generally nonlinear partial differential equations describing multiphase, multicomponent and multiscale reactive flow and transport processes in porous media. PFLOTRAN's QA test suite compares the numerical solutions of benchmark problems in heat and mass transport against known, closed-form, analytical solutions, including documentation of the exercised physical process models implemented in each PFLOTRAN benchmark simulation. The QA test suite development strives to follow the recommendations given by Oberkampf and Trucano (2007), which describe four essential elements of high-quality verification benchmark construction: (1) conceptual description, (2) mathematical description, (3) accuracy assessment, and (4) additional documentation and user information.
Several QA tests within the suite will be presented, including details of the benchmark problems and their closed-form analytical solutions, implementation of benchmark problems in PFLOTRAN simulations, and the criteria used to assess PFLOTRAN's performance in the code verification procedure. References Oberkampf, W. L., and T. G. Trucano (2007), Verification and Validation Benchmarks, SAND2007-0853, 67 pgs., Sandia National Laboratories, Albuquerque, NM.
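The core of such a verification test can be sketched in a few lines. The following minimal, hypothetical example (not PFLOTRAN code, and simpler than any of its benchmarks) solves the 1D diffusion equation with an explicit finite-difference scheme and compares the result against the closed-form solution for a semi-infinite domain:

```python
import math

# Hedged sketch of code verification: compare a numerical solver against the
# closed-form solution T(x, t) = erfc(x / (2*sqrt(D*t))) of dT/dt = D*d2T/dx2
# on a semi-infinite domain with T(0, t) = 1 and T(x, 0) = 0.

def analytical(x, t, D=1.0):
    return math.erfc(x / (2.0 * math.sqrt(D * t)))

def solve_numerical(D=1.0, L=10.0, nx=201, t_end=1.0):
    dx = L / (nx - 1)
    dt = 0.25 * dx * dx / D            # well inside the explicit stability limit
    steps = round(t_end / dt)
    T = [0.0] * nx
    T[0] = 1.0                         # Dirichlet boundary at x = 0
    for _ in range(steps):
        Tn = T[:]
        for i in range(1, nx - 1):
            T[i] = Tn[i] + D * dt / (dx * dx) * (Tn[i+1] - 2.0*Tn[i] + Tn[i-1])
    return [i * dx for i in range(nx)], T, steps * dt

def max_abs_error():
    # maximum pointwise discrepancy between numerical and analytical solutions
    xs, T, t = solve_numerical()
    return max(abs(T[i] - analytical(xs[i], t)) for i in range(1, len(xs) - 1))
```

An accuracy assessment in the spirit of element (3) above would then check that this error falls below a stated tolerance and shrinks under grid refinement.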
The physician's reaction to a malpractice suit.
Lavery, J P
1988-01-01
A malpractice suit can have a devastating impact on a practitioner's professional and personal life. The physician's reaction to this event is profound, affecting his own life-style and that of family, colleagues, and patients. This commentary presents an analogy between the physician's reaction to a malpractice suit and the stages of grief described by Elisabeth Kübler-Ross: the sequence of denial, anger, bargaining, depression, and acceptance. Understanding the psychodynamics of this reaction can help physicians to cope with the problems inherent in a malpractice suit and to maintain a greater stability in their personal lives. Adverse effects on medical practice and private life-style, and on the legal proceedings, can be minimized.
Characterization of Carbon Dioxide Washout Measurement Techniques in the Mark-III Space Suit
NASA Technical Reports Server (NTRS)
Norcross, J.; Bekdash, O.; Meginnis, I.
2016-01-01
Providing adequate carbon dioxide (CO2) washout is essential to reducing the risk of suited operations. Long-term CO2 exposure can lead to symptoms such as headache, lethargy, and dizziness, and in severe cases can lead to unconsciousness and death. Thus, maintaining adequate CO2 washout both in ground testing and during in-flight EVAs is a requirement of current and future suit designs. It is necessary to understand the inspired CO2 of suit wearers so that future requirements for space suits appropriately address the risk of inadequate washout. Testing conducted by the EVA Physiology Laboratory (EPL) at the NASA Johnson Space Center aimed to characterize a method for noninvasively measuring inspired oronasal CO2 under pressurized suited conditions in order to better inform requirements definition and verification techniques for future CO2 washout limits in space suits. Prior work conducted by the EPL examined several different wearable, respirator-style masks that could be used to sample air from the vicinity surrounding the nose and mouth of a suited test subject. Previously published studies used these masks, some commercial products and some novel designs, to monitor CO2 under various exercise and flow conditions, with mixed results for repeatability and consistency between subjects. Based on a meta-analysis of those studies, it was decided to test a nasal cannula, a commercially available device that is placed directly in the flow path of the user as they breathe. A nasal cannula was used to sample air inhaled by the test subjects during both rest and exercise conditions. Eight subjects were tasked with walking on a treadmill or operating an arm ergometer to reach target metabolic rates of 1000, 2000, and 3000 BTU/hr. Suit pressure was maintained at 4.3 psid for all tests, with supply flow rates of 6, 4, and 2 actual cubic feet per minute depending on the test condition. 
Each test configuration was conducted twice, with subjects breathing either through the nose only or however they felt comfortable. By restricting breathing to a single orifice, we were able to define more precisely which flow stream the sampled CO2 was taken from. Oronasal CO2 was monitored using real-time infrared gas analyzers fed via sample tubes connected to the nasal cannula within the suit. Two additional sampling tubes were placed at the head and chin of the test subject in an effort to capture CO2 concentrations across the entire flow stream of the Mark-III vent system (the flow path runs head to neck). Metabolic rate was calculated from the exhaust CO2 concentration and used to adjust subject workload on either the treadmill or arm ergometer until the target was reached. Forward work will aim to characterize the historically accepted minimum in-suit ppCO2 during EVA by repeating this study in the Extravehicular Mobility Unit (EMU) space suit. This will help to define washout requirements for future suits, whether designed by NASA (e.g. Z-2) or by Commercial Crew partners. Additionally, it is important to determine the functional consequences of CO2 exposure during EVA. Severe CO2 symptoms are the result of very high-concentration, acute exposures. While long-term, low-concentration exposures have been shown to result in slight cognitive decline, symptoms resolve quickly upon returning to nominal concentrations, and the impact that minor deficits in cognitive performance may have on EVA performance remains unknown.
[Research progress of thermal control system for extravehicular activity space suit].
Wu, Z Q; Shen, L P; Yuan, X G
1999-08-01
Recent research progress on thermal control systems for Extravehicular Activity (EVA) space suits developed abroad is presented. The characteristics of several thermal control systems are analyzed in detail. Research trends and problems that deserve special attention are discussed. Finally, the authors' view of future thermal control systems is put forward.
Interaction of Space Suits with Windblown Soil: Preliminary Mars Wind Tunnel Results
NASA Astrophysics Data System (ADS)
Marshall, J.; Bratton, C.; Kosmo, J.; Trevino, R.
1999-09-01
Experiments in the Mars Wind Tunnel at NASA Ames Research Center show that under Mars conditions, spacesuit materials are highly susceptible to dust contamination when exposed to windblown soil. This effect was suspected from knowledge of the interaction of electrostatically adhesive dust with solid surfaces in general. However, it is important to evaluate the respective roles of materials, meteorological and radiation effects, and the character of the soil. The tunnel permits evaluation of dust contamination and sand abrasion of space suits by simulating both pressure and wind conditions on Mars. The long-term function of space suits on Mars will be primarily threatened by dust contamination. Lunar EVA activities caused heavy contamination of space suits, but the problem never seriously manifested because the suits were used only briefly and were never reused. Electrostatically adhering dust grains have various detrimental effects: (1) penetration and subsequent wear of suit fabrics, (2) obscuration of viewing through visors and scratching/pitting of visor surfaces, (3) penetration, wear, and subsequent seizing-up of mechanical suit joints, (4) changes in albedo and therefore in the radiation properties of external heat-exchanger systems, and (5) changes in electrical conductivity of suit surfaces, which may affect tribocharging of suits and create spurious discharge effects detrimental to suit electronics/radio systems. Additional information is contained in the original.
The Mars Science Laboratory Organic Check Material
NASA Technical Reports Server (NTRS)
Conrad, Pamela G.; Eigenbrode, J. E.; Mogensen, C. T.; VonderHeydt, M. O.; Glavin, D. P.; Mahaffy, P. M.; Johnson, J. A.
2011-01-01
The Organic Check Material (OCM) has been developed for use on the Mars Science Laboratory mission to serve as a sample standard for verification of organic cleanliness and characterization of potential sample alteration as a function of the sample acquisition and portioning process on the Curiosity rover. OCM samples will be acquired using the same procedures for drilling, portioning and delivery as are used to study martian samples with The Sample Analysis at Mars (SAM) instrument suite during MSL surface operations. Because the SAM suite is highly sensitive to organic molecules, the mission can better verify the cleanliness of Curiosity's sample acquisition hardware if a known material can be processed through SAM and compared with the results obtained from martian samples.
Simplified Abrasion Test Methodology for Candidate EVA Glove Lay-Ups
NASA Technical Reports Server (NTRS)
Rabel, Emily; Aitchison, Lindsay
2015-01-01
During the Apollo Program, space suit outer-layer fabrics were badly abraded after just a few extravehicular activities (EVAs). For example, the Apollo 12 commander reported abrasive wear on the boots that penetrated the outer-layer fabric into the thermal protection layers after less than 8 hrs of surface operations. Current plans for exploration planetary space suits require them to support hundreds of hours of EVA on a lunar or Martian surface, creating a challenge for space suit designers to utilize materials advances made over the last 40 years and improve on the space suit fabrics used in the Apollo Program. Over the past 25 years the NASA Johnson Space Center Crew and Thermal Systems Division has focused on tumble testing as a means of simulating wear on the outer layer of the space suit fabric. Most recently, in 2009, testing was performed on 4 different candidate outer layers to gather baseline data for future use in the design of planetary space suit outer layers. In support of the High Performance EVA Glove Element of the Next Generation Life Support Project, a new test configuration was recently attempted that required only 10% of the fabric per replicate needed in 2009. The smaller fabric samples reduced per-sample cost and gave the flexibility to test small samples from manufacturers without the overhead of a completed production run. Data collected from this iteration were compared to those taken in 2009 to validate the new test method. The method was also used to evaluate the fabrics and fabric lay-ups used in a prototype thermal micrometeoroid garment (TMG) developed for EVA gloves under the NASA High Performance EVA Glove Project. This paper provides a review of previous abrasion studies on space suit fabrics, details the methodologies used for abrasion testing in this particular study, and presents the results of the validation study and of the TMG testing.
Vadose zone flow convergence test suite
DOE Office of Scientific and Technical Information (OSTI.GOV)
Butcher, B. T.
Performance Assessment (PA) simulations for engineered disposal systems at the Savannah River Site involve highly contrasting materials and moisture conditions at and near saturation. These conditions cause severe convergence difficulties that typically result in convergence failures, long simulation times, or excessive analyst effort. Adequate convergence is usually achieved in a trial-and-error manner by applying under-relaxation to the Saturation or Pressure variable, in a series of ever-decreasing relaxation values. SRNL would like a more efficient scheme implemented inside PORFLOW to achieve flow convergence in a more reliable and efficient manner. To this end, a suite of test problems that illustrate these convergence problems is provided to facilitate diagnosis and development of an improved convergence strategy. The attached files describe the test problems and the proposed resolution.
Biclustering as a method for RNA local multiple sequence alignment.
Wang, Shu; Gutell, Robin R; Miranker, Daniel P
2007-12-15
Biclustering is a clustering method that simultaneously clusters both the domain and range of a relation. A challenge in multiple sequence alignment (MSA) is that the alignment of sequences is often intended to reveal groups of conserved functional subsequences. Simultaneously, the grouping of the sequences can impact the alignment; this is precisely the kind of dual situation biclustering is intended to address. We define a representation of the MSA problem enabling the application of biclustering algorithms. We develop a computer program for local MSA, BlockMSA, that combines biclustering with divide-and-conquer. BlockMSA simultaneously finds groups of similar sequences and locally aligns subsequences within them. Further alignment is accomplished by dividing both the set of sequences and their contents. The net result is both a multiple sequence alignment and a hierarchical clustering of the sequences. BlockMSA was tested on the subsets of the BRAliBase 2.1 benchmark suite that display high variability and on an extension of that suite to larger problem sizes. Alignments of two large datasets of current biological interest, T-box sequences and Group IC1 introns, were also evaluated. The results were compared with alignments computed by the ClustalW, MAFFT, MUSCLE, and PROBCONS alignment programs using the Sum-of-Pairs score (SPS) and Consensus Count. Results for the benchmark suite are sensitive to problem size. On problems of 15 or more sequences, BlockMSA is consistently the best. On none of the problems in the test suite are there appreciable differences in scores among BlockMSA, MAFFT, and PROBCONS. On the T-box sequences, BlockMSA does the most faithful job of reproducing known annotations; MAFFT and PROBCONS do not. On the intron sequences, BlockMSA, MAFFT, and MUSCLE are comparable at identifying conserved regions. BlockMSA is implemented in Java. Source code and supplementary datasets are available at http://aug.csres.utexas.edu/msa/
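The Sum-of-Pairs score used in the comparison above has a compact definition: the fraction of residue pairs aligned in a reference alignment that are also aligned in the test alignment. A small hedged sketch (not the paper's code; sequences are rows of equal length, '-' marks a gap):

```python
def sum_of_pairs_score(test, ref):
    """Fraction of reference-aligned residue pairs recovered by `test`."""
    def aligned_pairs(aln):
        # Map each column to the residues it aligns, identifying each residue
        # by (sequence index, position within the ungapped sequence).
        pos = [0] * len(aln)
        pairs = set()
        for col in range(len(aln[0])):
            members = []
            for s in range(len(aln)):
                if aln[s][col] != '-':
                    members.append((s, pos[s]))
                    pos[s] += 1
            for i in range(len(members)):
                for j in range(i + 1, len(members)):
                    pairs.add((members[i], members[j]))
        return pairs

    ref_pairs = aligned_pairs(ref)
    test_pairs = aligned_pairs(test)
    return len(ref_pairs & test_pairs) / len(ref_pairs) if ref_pairs else 1.0
```

For example, the test alignment ["AB", "AB"] pairs residue 0 with residue 0, so it recovers none of the pairs of the staggered reference ["AB-", "-AB"] and scores 0.0, while a copy of the reference scores 1.0.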
Astronaut L. Gordon Cooper Jr. - Misc. - Gemini-Titan (GT)-5 - Suiting-Up - Prime Crew - Cape
1965-08-19
S65-46367 (19 Aug. 1965) --- Astronauts Charles Conrad Jr. (right) and L. Gordon Cooper Jr. are pictured during suiting up operations before Gemini-5 spaceflight. Editor's note: The scheduled Aug. 19 launch was postponed due to weather conditions and problems with loading cryogenic fuel for the fuel cell. The launch occurred on Aug. 21, 1965.
ERIC Educational Resources Information Center
Cooper, Melanie M.; Cox, Charles T., Jr.; Nammouz, Minory; Case, Edward; Stevens, Ronald
2008-01-01
Improving students' problem-solving skills is a major goal for most science educators. While a large body of research on problem solving exists, assessment of meaningful problem solving is very difficult, particularly for courses with large numbers of students in which one-on-one interactions are not feasible. We have used a suite of software…
A Collection of Problems for Physics Teaching
ERIC Educational Resources Information Center
Grober, S.; Jodl, H. -J.
2010-01-01
Problems are an important instrument for teachers to mediate physics content and for learners to adopt this content. This collection of problems is not only suited to traditional teaching and learning in lectures or student labs, but also to all kinds of new ways of teaching and learning, such as self-study, long-distance teaching,…
Improving Students' Problem Solving in a Virtual Chemistry Simulation through Metacognitive Messages
ERIC Educational Resources Information Center
Beal, Carole R.; Stevens, Ronald H.
2011-01-01
Recent assessments indicate that American students do not score well on tests of scientific problem solving, relative to students in other nations. IMMEX is a web-based virtual environment that provides students with opportunities to solve science problems by viewing information resources through a suite of menu options, developing a hypothesis…
Heim, Brett C; Ivy, Jamie A; Latch, Emily K
2012-01-01
The addax (Addax nasomaculatus) is a critically endangered antelope that is currently maintained in zoos through regional conservation breeding programs. As with many captive species, incomplete pedigree data currently impedes the ability of addax breeding programs to confidently manage the genetics of captive populations and to select appropriate animals for reintroduction. Molecular markers are often used to improve pedigree resolution, thereby improving the long-term effectiveness of genetic management. When developing a suite of molecular markers, it is important to consider the source of DNA, as the utility of markers may vary across DNA sources. In this study, we optimized a suite of microsatellite markers for genotyping captive addax blood samples collected on FTA cards. We amplified 66 microsatellite loci previously described in other artiodactyls. Sixteen markers amplified a single product in addax, but only 5 of these were found to be polymorphic in 37 addax sampled from a captive herd at Fossil Rim Wildlife Center in the US. The suite of microsatellite markers developed in this study provides a new tool for the genetic management of captive addax, and demonstrates that FTA cards can be a useful means of sample storage, provided appropriate loci are used in downstream analyses. © 2011 Wiley Periodicals, Inc.
Terrestrial EVA Suit = Fire Fighter's Protective Clothing
NASA Technical Reports Server (NTRS)
Foley, Tico; Brown, Robert G.; Burrell, Eddie; DelRosso, Dominic; Krishen, Kumar; Moffitt, Harold; Orndoff, Evelyne; Santos, Beatrice; Butzer, Melissa; Dasgupta, Rajib
1999-01-01
Firefighters want to go to work, do their job well, and go home alive and uninjured. For their most important job, saving lives, firefighters want protective equipment that will allow more extended and effective time at fire scenes in order to perform victim search and rescue. A team, including engineers at NASA JSC and firefighters from Houston, has developed a list of problem areas for which NASA technology and know-how can recommend improvements for firefighter suits and gear. Prototypes for solutions have been developed and are being evaluated. This effort will spin back to NASA as improvements for lunar and planetary suits.
Suites of dwarfs around Nearby giant galaxies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Karachentsev, Igor D.; Kaisina, Elena I.; Makarov, Dmitry I., E-mail: ikar@sao.ru, E-mail: kei@sao.ru, E-mail: dim@sao.ru
2014-01-01
The Updated Nearby Galaxy Catalog (UNGC) contains the most comprehensive summary of distances, radial velocities, and luminosities for 800 galaxies located within 11 Mpc of us. The high density of observables in the UNGC makes this sample indispensable for checking results of N-body simulations of cosmic structures on a ∼1 Mpc scale. The environment of each galaxy in the UNGC was characterized by a tidal index Θ_1, depending on the separation and mass of the galaxy's main disturber (MD). We grouped UNGC galaxies with a common MD into suites, and ranked suite members according to their Θ_1. All suite members with positive Θ_1 are assumed to be physical companions of the MD. About 58% of the sample are members of physical groups. The distribution of suites by the number of members, n, follows the relation N(n) ∼ n^(-2). The 20 most populated suites contain 468 galaxies, i.e., 59% of the UNGC sample. The fraction of MDs among the brightest galaxies is almost 100% and drops to 50% at M_B = -18^m. We discuss various properties of MDs, as well as galaxies belonging to their suites. The suite abundance practically does not depend on the morphological type, linear diameter, or hydrogen mass of the MD, the tightest correlation being with the MD dynamical mass. Dwarf galaxies around MDs exhibit well-known segregation effects: the population of the outskirts has later morphological types, richer H I contents, and higher rates of star formation activity. Nevertheless, there are some intriguing cases where dwarf spheroidal galaxies occur at the far periphery of the suites, as well as some late-type dwarfs residing close to MDs. Comparing simulation results with galaxy groups, most studies assume the Local Group is fairly typical. However, we recognize that the nearby groups significantly differ from each other and there is considerable variation in their properties. 
The suites of companions around the Milky Way and M31, which together constitute the Local Group, do not appear to be a typical nearby group. The multiplicity of nearby groups in terms of the number of their physical members can be described by a Hirsch-like index h_g = 9, indicating that the Local Volume contains nine groups with populations exceeding nine companions to their MDs.
Ramchand, Rajeev; Rudavsky, Rena; Grant, Sean; Tanielian, Terri; Jaycox, Lisa
2015-05-01
This review summarizes the epidemiology of posttraumatic stress disorder (PTSD) and related mental health problems among persons who served in the armed forces during the Iraq and Afghanistan conflicts, as reflected in the literature published between 2009 and 2014. One hundred sixteen research studies are reviewed, most of which are among non-treatment-seeking US service members or treatment-seeking US veterans. Evidence is provided for demographic, military, and deployment-related risk factors for PTSD, though most derives from cross-sectional studies and few control for combat exposure, which is a primary risk factor for mental health problems in this cohort. Evidence is also provided linking PTSD with outcomes in the following domains: physical health, suicide, housing and homelessness, employment and economic well-being, social well-being, and aggression, violence, and criminality. Also included is evidence about the prevalence of mental health service use in this cohort. In many instances, the current suite of studies replicates findings observed in civilian samples, but new findings emerge of relevance to both military and civilian populations, such as the link between PTSD and suicide. Future research should make an effort to control for combat exposure and use longitudinal study designs; promising areas for investigation are non-treatment-seeking samples of US veterans and the role of social support in preventing or mitigating mental health problems in this group.
Comparative Ergonomic Evaluation of Spacesuit and Space Vehicle Design
NASA Technical Reports Server (NTRS)
England, Scott; Cowley, Matthew; Benson, Elizabeth; Harvill, Lauren; Blackledge, Christopher; Perez, Esau; Rajulu, Sudhakar
2012-01-01
With the advent of the latest human spaceflight objectives, a series of prototype architectures was developed for a new launch and reentry spacesuit that would be suited to the new mission goals. Four prototype suits were evaluated to compare their performance and enable the selection of the preferred suit components and designs. A consolidated approach to testing was taken: concurrently collecting suit mobility data, seat-suit-vehicle interface clearances, and qualitative assessments of suit performance within the volume of a Multi-Purpose Crew Vehicle mockup. It was necessary to maintain high fidelity in a mockup and use advanced motion-capture technologies in order to achieve the objectives of the study. These seemingly mutually exclusive goals were accommodated with the construction of an optically transparent and fully adjustable frame mockup. The construction of the mockup was such that it could be dimensionally validated rapidly with the motion-capture system. This paper describes the method used to create a space vehicle mockup compatible with use of an optical motion-capture system, the consolidated approach for evaluating spacesuits in action, and a way to use the complex data set resulting from a limited number of test subjects to generate hardware requirements for an entire population. Kinematics, hardware clearance, anthropometry (suited and unsuited), and subjective feedback data were recorded on 15 unsuited and 5 suited subjects. Unsuited subjects were selected chiefly based on their anthropometry in an attempt to find subjects who fell within predefined criteria for medium male, large male, and small female subjects. The suited subjects were selected as a subset of the unsuited medium male subjects and were tested in both unpressurized and pressurized conditions. 
The prototype spacesuits were each fabricated in a single size to accommodate an approximately average-sized male, so select findings from the suit testing were systematically extrapolated to the extremes of the population to anticipate likely problem areas. This extrapolation was achieved by first comparing suited subjects' performance with their unsuited performance, and then applying the results to the entire range of the population. The use of a transparent space vehicle mockup enabled the collection of large amounts of data during human-in-the-loop testing. Mobility data revealed that most of the tested spacesuits had sufficient ranges of motion for the selected tasks to be performed successfully. A suited subject's inability to perform a task most often stemmed from poor field of view in a seated position, poor dexterity of the pressurized gloves, or suit/vehicle interface issues. Seat ingress and egress testing showed that problems with anthropometric accommodation did not occur exclusively with the largest or smallest subjects, but also with specific combinations of measurements that led to narrower seat ingress/egress clearance.
NASA Technical Reports Server (NTRS)
Smith, James T.
2008-01-01
The development of the in-house Miniaturized Double Latching Solenoid Valve, or Microvalve, for the Gas Processing System (GPS) of the Sample Analysis at Mars (SAM) instrument suite is described. The Microvalve is a double latching solenoid valve that actuates a pintle shaft axially to hermetically seal an orifice. The key requirements and the design innovations implemented to meet them are described.
Two-question depression-screeners - the solution to all problems?
Albani, Cornelia; Bailer, Harald; Blaser, Gerd; Brähler, Elmar; Geyer, Michael; Grulke, Norbert
2006-04-01
Depression constitutes a considerable issue in medicine, and it is anticipated that the number of people suffering from affective disorders will increase significantly. It would be useful to have a simple, fast screening procedure to help detect depression. Four recently published articles recommend a two-question depression screener. Sensitivity, specificity, likelihood ratios, and negative and positive predictive values were compared. For four different clinical samples and one sample representative of the German population, the prevalence of depression ranged from 6.9% to 18.1%. Sensitivity and specificity reached values from 72.6% to 96.6% and from 56.9% to 90.0%, respectively. All negative predictive values were high (> 97%), as opposed to the positive predictive values (17.8% to 38.5%). Overall, it seems that two-question screeners are well suited for the exclusion of a major depression. It is possible that regular screening could further lower the percentage of undiagnosed cases.
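The pattern of high negative and low positive predictive values at low prevalence follows directly from Bayes' rule. A small sketch using illustrative numbers within the ranges reported above (not the study's own data):

```python
def predictive_values(sensitivity, specificity, prevalence):
    """Positive and negative predictive values from the 2x2 screening table."""
    tp = sensitivity * prevalence              # true positives
    fn = (1 - sensitivity) * prevalence        # false negatives
    fp = (1 - specificity) * (1 - prevalence)  # false positives
    tn = specificity * (1 - prevalence)        # true negatives
    ppv = tp / (tp + fp)   # P(depressed | screen positive)
    npv = tn / (tn + fn)   # P(not depressed | screen negative)
    return ppv, npv

# Illustrative: sensitivity 90%, specificity 70%, prevalence 10%
ppv, npv = predictive_values(0.90, 0.70, 0.10)
```

With these numbers PPV = 0.25 and NPV ≈ 0.98: a negative screen rules depression out quite reliably, while a positive screen mostly flags people for further assessment, which is the same pattern the review reports.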
A Research Methodology for Studying What Makes Some Problems Difficult to Solve
ERIC Educational Resources Information Center
Gulacar, Ozcan; Fynewever, Herb
2010-01-01
We present a quantitative model for predicting the level of difficulty subjects will experience with specific problems. The model explicitly accounts for the number of subproblems a problem can be broken into and the difficulty of each subproblem. Although the model builds on previously published models, it is uniquely suited for blending with…
The Canadian space agency planetary analogue materials suite
NASA Astrophysics Data System (ADS)
Cloutis, Edward A.; Mann, Paul; Izawa, Matthew R. M.; Applin, Daniel M.; Samson, Claire; Kruzelecky, Roman; Glotch, Timothy D.; Mertzman, Stanley A.; Mertzman, Karen R.; Haltigin, Timothy W.; Fry, Christopher
2015-12-01
The Canadian Space Agency (CSA) recently commissioned the development of a suite of over fifty well-characterized planetary analogue materials. These materials are terrestrial rocks and minerals that are similar to those known or suspected to occur on the lunar or martian surfaces. These include: Mars analogue sedimentary, hydrothermal, igneous and low-temperature alteration rock suites; lunar analogue basaltic and anorthositic rock suites; and a generic impactite rock suite from a variety of terrestrial impact structures. Representative thin sections of the materials have been characterized by optical microscopy and electron probe microanalysis (EPMA). Reflectance spectra have been collected in the ultraviolet, visible, near-infrared and mid-infrared, covering 0.2-25 μm. Thermal infrared emission spectra were collected from 5 to 50 μm. Raman spectra with 532 nm excitation, and laser-induced fluorescence spectra with 405 nm excitation were also measured. Bulk chemical analysis was carried out using X-ray fluorescence, with Fe valence determined by wet chemistry. Chemical and mineralogical data were collected using a field-portable Terra XRD-XRF instrument similar to CheMin on the MSL Curiosity rover. Laser-induced breakdown spectroscopy (LIBS) data similar to those measured by ChemCam on MSL were collected for powdered samples, cut slab surfaces, and as depth profiles into weathered surfaces where present. Three-dimensional laser camera images of rock textures were collected for selected samples. The CSA intends to make available sample powders (<45 μm and 45-1000 μm grain sizes), thin sections, and bulk rock samples, and all analytical data collected in the initial characterisation study to the broader planetary science community. Aiming to complement existing planetary analogue rock and mineral libraries, the CSA suite represents a new resource for planetary scientists and engineers. 
We envision many potential applications for these materials in the definition, development and testing of new analytical instruments for use in planetary missions, as well as possible calibration and ground-truthing of remote sensing data sets. These materials may also be useful as reference materials for cross-calibration between different instruments and laboratories. Comparison of the analytical data for selected samples is useful for highlighting the relative strengths, weaknesses and synergies of different analytical techniques.
Stochastic Evolutionary Algorithms for Planning Robot Paths
NASA Technical Reports Server (NTRS)
Fink, Wolfgang; Aghazarian, Hrand; Huntsberger, Terrance; Terrile, Richard
2006-01-01
A computer program implements stochastic evolutionary algorithms for planning and optimizing collision-free paths for robots and their jointed limbs. Stochastic evolutionary algorithms can be made to produce acceptably close approximations to exact, optimal solutions for path-planning problems while often demanding much less computation than do exhaustive-search and deterministic inverse-kinematics algorithms that have been used previously for this purpose. Hence, the present software is better suited for application aboard robots having limited computing capabilities (see figure). The stochastic aspect lies in the use of simulated annealing to (1) prevent trapping of an optimization algorithm in local minima of an energy-like error measure by which the fitness of a trial solution is evaluated while (2) ensuring that the entire multidimensional configuration and parameter space of the path-planning problem is sampled efficiently with respect to both robot joint angles and computation time. Simulated annealing is an established technique for avoiding local minima in multidimensional optimization problems, but has not, until now, been applied to planning collision-free robot paths by use of low-power computers.
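The annealing loop described above can be sketched compactly. The following minimal, hypothetical example (not the NASA software) plans a 2D path with fixed endpoints by perturbing intermediate waypoints, scoring each candidate with an energy-like error measure, and applying the Metropolis acceptance rule under a cooling schedule; the obstacle geometry and all parameter values are illustrative assumptions:

```python
import math
import random

def energy(path, obstacle=(0.5, 0.0), radius=0.3):
    # Energy-like error measure: path length plus a smooth penalty that grows
    # with a waypoint's penetration depth into a hypothetical circular obstacle.
    length = sum(math.dist(path[i], path[i + 1]) for i in range(len(path) - 1))
    penalty = sum(100.0 * max(0.0, radius - math.dist(p, obstacle)) for p in path)
    return length + penalty

def anneal(n_way=8, iters=20000, seed=1):
    rng = random.Random(seed)
    start, goal = (0.0, 0.0), (1.0, 0.0)
    # initial straight-line guess passes through the obstacle
    path = [start] + [((i + 1) / (n_way + 1), 0.0) for i in range(n_way)] + [goal]
    e = energy(path)
    for k in range(iters):
        temp = 0.999 ** k                    # geometric cooling schedule
        i = rng.randrange(1, len(path) - 1)  # never move the endpoints
        cand = path[:]
        x, y = cand[i]
        cand[i] = (x + rng.gauss(0.0, 0.05), y + rng.gauss(0.0, 0.05))
        de = energy(cand) - e
        # Metropolis criterion: always accept downhill moves, and accept
        # uphill moves with probability exp(-de/temp) to escape local minima
        if de < 0.0 or rng.random() < math.exp(-de / max(temp, 1e-12)):
            path, e = cand, e + de
    return path, e
```

The temperature-dependent acceptance of uphill moves is what lets the initial straight-line guess climb out of the penalized region instead of freezing in the nearest local minimum.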
Thermal Performance Of Space Suit Elements With Aerogel Insulation For Moon And Mars Exploration
NASA Technical Reports Server (NTRS)
Tang, Henry H.; Orndoff, Evelyne S.; Trevino, Luis A.
2006-01-01
Flexible fiber-reinforced aerogel composites were studied for use as insulation materials in a future space suit for Moon and Mars exploration. The high flexibility and good thermal insulation properties of fiber-reinforced silica aerogel composites at both high and low vacuum conditions make them promising insulation candidates for the space suit application. This paper first presents the results of a durability (mechanical cycling) study of these aerogel composites in the context of retaining their thermal performance. The study shows that some of these aerogel materials retained most of their insulation performance after up to 250,000 cycles of mechanical flexing. This paper also examines the problem of integrating these flexible aerogel composites into the current space suit elements. Thermal conductivity evaluations are proposed for different types of aerogel space suit elements to identify the lay-up concept that may have the best overall thermal performance for both Moon and Mars environments. Potential solutions for mitigating the silica dusting issue related to the application of these aerogel materials in space suit elements are also discussed.
Cycle life machine for AX-5 space suit
NASA Technical Reports Server (NTRS)
Schenberger, Deborah S.
1990-01-01
In order to accurately test the AX-5 space suit, a complex series of motions needed to be performed which provided a unique opportunity for mechanism design. The cycle life machine design showed how 3-D computer images can enhance mechanical design as well as help in visualizing mechanisms before manufacturing them. In the early stages of the design, potential problems in the motion of the joint and in the four bar linkage system were resolved using CAD. Since these problems would have been very difficult and tedious to solve on a drawing board, they would probably not have been addressed prior to fabrication, thus limiting the final design or requiring design modification after fabrication.
Characterizing the Effect of Shock on Isotopic Ages. 2; Mg-Suite Troctolite Major Elements
NASA Technical Reports Server (NTRS)
Edmunson, Jennifer; Cohen, Barbara
2009-01-01
Two troctolites from the lunar magnesium suite (Mg-suite), 76335 and 76535, have Sm-147-Nd-143 and Rb-87-Sr-87 ages that disagree within each sample. In the case of 76335, the Sm-147-Nd-143 age is 4278 +/- 60 Ma, but the Rb-87-Sr-87 data do not define an isochron. For 76535, the Sm-147-Nd-143 age (4260 +/- 60 Ma) is significantly younger than the Rb-87-Sr-87 age (4570 +/- 70 Ma, Lambda = 1.402x10(exp -11)). This study was designed to discover why the Sm-147-Nd-143 and Rb-87-Sr-87 ages do not match for each individual sample.
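For readers unfamiliar with how such ages are obtained: both chronometers reduce to the standard isochron relation t = ln(slope + 1)/λ, where the slope comes from the isochron regression. A minimal sketch using the Rb-87 decay constant quoted above; the slope value here is back-calculated from the quoted 4570 Ma age for illustration, not measured data.

```python
import math

LAMBDA_RB87 = 1.402e-11  # Rb-87 decay constant quoted in the abstract (per year)

def isochron_age(slope):
    """Age from an isochron slope: t = ln(slope + 1) / lambda."""
    return math.log(slope + 1.0) / LAMBDA_RB87

# What isochron slope corresponds to the quoted 4570 Ma Rb-Sr age?
slope = math.exp(LAMBDA_RB87 * 4.57e9) - 1.0
age = isochron_age(slope)
```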
Evaluation of Methods for Multidisciplinary Design Optimization (MDO). Part 2
NASA Technical Reports Server (NTRS)
Kodiyalam, Srinivas; Yuan, Charles; Sobieski, Jaroslaw (Technical Monitor)
2000-01-01
A new MDO method, BLISS, and two variants of the method, BLISS/RS and BLISS/S, have been implemented using iSIGHT's scripting language and are evaluated in this report on multidisciplinary problems. All of these methods are based on decomposing a system optimization problem into several subtask optimizations, which may be executed concurrently, and a system-level optimization that coordinates the subtask optimizations. The BLISS method and its variants are well suited to exploiting the concurrent processing capabilities of a multiprocessor machine. Several steps, including local sensitivity analysis, local optimization, and response surface construction and updating, are all ideally suited for concurrent processing. Algorithms that can effectively exploit the concurrent processing capabilities of compute servers will be a key requirement for solving large-scale industrial design problems, such as the automotive vehicle problem detailed in Section 3.4.
NASA Astrophysics Data System (ADS)
Alemany, Kristina
Electric propulsion has recently become a viable technology for spacecraft, enabling shorter flight times, fewer required planetary gravity assists, larger payloads, and/or smaller launch vehicles. With the maturation of this technology, however, comes a new set of challenges in the area of trajectory design. Because low-thrust trajectory optimization has historically required long run-times and significant user-manipulation, mission design has relied on expert-based knowledge for selecting departure and arrival dates, times of flight, and/or target bodies and gravitational swing-bys. These choices are generally based on known configurations that have worked well in previous analyses or simply on trial and error. At the conceptual design level, however, the ability to explore the full extent of the design space is imperative to locating the best solutions in terms of mass and/or flight times. Beginning in 2005, the Global Trajectory Optimization Competition posed a series of difficult mission design problems, all requiring low-thrust propulsion and visiting one or more asteroids. These problems all had large ranges on the continuous variables---launch date, time of flight, and asteroid stay times (when applicable)---as well as being characterized by millions or even billions of possible asteroid sequences. Even with recent advances in low-thrust trajectory optimization, full enumeration of these problems was not possible within the stringent time limits of the competition. This investigation develops a systematic methodology for determining a broad suite of good solutions to the combinatorial, low-thrust, asteroid tour problem. The target application is for conceptual design, where broad exploration of the design space is critical, with the goal being to rapidly identify a reasonable number of promising solutions for future analysis. The proposed methodology has two steps. 
The first step applies a three-level heuristic sequence developed from the physics of the problem, which allows for efficient pruning of the design space. The second phase applies a global optimization scheme to locate a broad suite of good solutions to the reduced problem. The global optimization scheme developed combines a novel branch-and-bound algorithm with a genetic algorithm and an industry-standard low-thrust trajectory optimization program to solve for the following design variables: asteroid sequence, launch date, times of flight, and asteroid stay times. The methodology is developed based on a small sample problem, which is enumerated and solved so that all possible discretized solutions are known. The methodology is then validated by applying it to a larger intermediate sample problem, which also has a known solution. Next, the methodology is applied to several larger combinatorial asteroid rendezvous problems, using previously identified good solutions as validation benchmarks. These problems include the 2nd and 3rd Global Trajectory Optimization Competition problems. The methodology is shown to be capable of achieving a reduction in the number of asteroid sequences of 6-7 orders of magnitude, in terms of the number of sequences that require low-thrust optimization as compared to the number of sequences in the original problem. More than 70% of the previously known good solutions are identified, along with several new solutions that were not previously reported by any of the competitors. Overall, the methodology developed in this investigation provides an organized search technique for the low-thrust mission design of asteroid rendezvous problems.
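The genetic-algorithm component of such a scheme can be illustrated on a toy combinatorial problem. The cost function and operators below are illustrative stand-ins for a sequence search, not the competition problems or the author's implementation:

```python
import random

def genetic_search(cost, n_items, pop_size=40, generations=120, seed=1):
    """Minimal genetic algorithm over permutations (toy stand-in for an
    asteroid-sequence search): tournament selection, order crossover,
    swap mutation, and elitism."""
    rng = random.Random(seed)

    def random_perm():
        p = list(range(n_items))
        rng.shuffle(p)
        return p

    def crossover(a, b):
        # Order crossover (OX): copy a slice of `a`, fill the rest from `b`.
        i, j = sorted(rng.sample(range(n_items), 2))
        child = [None] * n_items
        child[i:j] = a[i:j]
        fill = [g for g in b if g not in child]
        k = 0
        for idx in range(n_items):
            if child[idx] is None:
                child[idx] = fill[k]
                k += 1
        return child

    def mutate(p):
        i, j = rng.sample(range(n_items), 2)
        p[i], p[j] = p[j], p[i]

    pop = [random_perm() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)
        next_pop = pop[:4]  # elitism: carry the best four forward
        while len(next_pop) < pop_size:
            a = min(rng.sample(pop, 3), key=cost)  # tournament selection
            b = min(rng.sample(pop, 3), key=cost)
            child = crossover(a, b)
            if rng.random() < 0.3:
                mutate(child)
            next_pop.append(child)
        pop = next_pop
    return min(pop, key=cost)

# Toy cost: sum of adjacent differences; a sorted sequence is optimal.
cost = lambda p: sum(abs(p[k + 1] - p[k]) for k in range(len(p) - 1))
best = genetic_search(cost, n_items=8)
```

In the methodology above, each candidate sequence would instead be scored by the low-thrust trajectory optimizer, with the branch-and-bound layer pruning sequences before they reach this stage.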
Hettich, Robert L.; Pan, Chongle; Chourey, Karuna; Giannone, Richard J.
2013-01-01
The availability of extensive genome information for many different microbes, including unculturable species in mixed communities from environmental samples, has enabled systems-biology interrogation by providing a means to access genomic, transcriptomic, and proteomic information. To this end, metaproteomics exploits the power of high-performance mass spectrometry for extensive characterization of the complete suite of proteins expressed by a microbial community in an environmental sample. PMID:23469896
Collaborative Problem Writing in the Multicultural Classroom.
ERIC Educational Resources Information Center
Padula, Janice; Nin, Lucy; Lam, Sucy
1998-01-01
Suggests that teachers could write problems to suit student populations in their schools and in the process of doing so encourage discussion, reflection, and writing by their students on mathematical language. Presents a series of games for the creation of equations for Year 7 students. (Contains 13 references.) (ASK)
Sequences for Student Investigation
ERIC Educational Resources Information Center
Barton, Jeffrey; Feil, David; Lartigue, David; Mullins, Bernadette
2004-01-01
We describe two classes of sequences that give rise to accessible problems for undergraduate research. These problems may be understood with virtually no prerequisites and are well suited for computer-aided investigation. The first sequence is a variation of one introduced by Stephen Wolfram in connection with his study of cellular automata. The…
Teachers' Use of Agricultural Laboratories in Secondary Agricultural Education
ERIC Educational Resources Information Center
Shoulders, Catherine W.; Myers, Brian E.
2012-01-01
Trends in the agriculture industry require students to have the ability to solve problems associated with scientific content. Agricultural laboratories are considered a main component of secondary agricultural education, and are well suited to provide students with opportunities to develop problem-solving skills through experiential learning. This…
Digesting Student-Authored Story Problems
ERIC Educational Resources Information Center
Alexander, Cathleen M.; Ambrose, Rebecca C.
2010-01-01
When students are asked to write original story problems about fractional amounts, it can illustrate their misunderstandings about fractions. Think about the situations students would describe to model 1/2 + 2/3. Three elements, in particular, challenge students: (1) Which of three models (region, or area; measure; or set) is best suited for a…
Challenging Aerospace Problems for Intelligent Systems
NASA Technical Reports Server (NTRS)
Krishnakumar, Kalmanje; Kanashige, John; Satyadas, A.; Clancy, Daniel (Technical Monitor)
2002-01-01
In this paper we highlight four problem domains that are well suited and challenging for intelligent system technologies. The problems are defined and an outline of a probable approach is presented. No attempt is made to define the problems as test cases. In other words, no data or set of equations that a user can code and get results are provided. The main idea behind this paper is to motivate intelligent system researchers to examine problems that will elevate intelligent system technologies and applications to a higher level.
Challenging Aerospace Problems for Intelligent Systems
NASA Technical Reports Server (NTRS)
KrishnaKumar, K.; Kanashige, J.; Satyadas, A.
2003-01-01
In this paper we highlight four problem domains that are well suited and challenging for intelligent system technologies. The problems are defined and an outline of a probable approach is presented. No attempt is made to define the problems as test cases. In other words, no data or set of equations that a user can code and get results are provided. The main idea behind this paper is to motivate intelligent system researchers to examine problems that will elevate intelligent system technologies and applications to a higher level.
Ages of pristine noritic clasts from lunar breccias 15445 and 15455
NASA Technical Reports Server (NTRS)
Shih, C.-Y.; Nyquist, L. E.; Dasch, E. J.; Bogard, D. D.; Bansal, B. M.; Wiesmann, H.
1993-01-01
The Rb-Sr and Sm-Nd isotopic ages were determined for two Apollo 15 pristine lunar breccias, 15445 and 15455, collected near Spur Crater on the Apennine Front. The analyses of mineral separates from two norite samples in breccia 15445 showed that the Sm-Nd isotopic system for both norites from the large Clast B of 15445 was well defined, yielding precise ages of 4.28 +/- 0.03 Ga and 4.46 +/- 0.07 Ga and suggesting that Clast B is a mixture of two or more lithologies. The overall age results indicate that some Mg-suite rocks are as old as ferroan-anorthosite-suite rocks. Moreover, age data for three major crustal rock types (a Mg suite, a ferroan-anorthosite suite, and an evolved suite) show that they all have variable ages.
Demonstration of the Capabilities of the KINEROS2 – AGWA 3.0 Suite of Modeling Tools
This poster and computer demonstration illustrates a sampling of the wide range of applications that are possible using the KINEROS2 - AGWA suite of modeling tools. Applications include: 1) Incorporation of Low Impact Development (LID) features; 2) A real-time flash flood forecas...
NASA Technical Reports Server (NTRS)
Stern, Jennifer C.; Navarro-Gonzalez, Rafael; Freissinet, Caroline; McKay, Christopher P.; Archer, P. Douglas, Jr.; Buch, Arnaud; Coll, Patrice; Eigenbrode, Jennifer L.; Franz, Heather B.; Glavin, Daniel P.;
2014-01-01
The Sample Analysis at Mars (SAM) instrument suite on the Mars Science Laboratory (MSL) Curiosity rover detected both reduced and oxidized nitrogen-bearing compounds during the pyrolysis of surface materials from three sites at Gale Crater. Preliminary detections of nitrogen species include NO, HCN, ClCN, and TFMA (trifluoro-N-methyl-acetamide). Confirmation of indigenous Martian nitrogen-bearing compounds requires quantifying the N contribution from the terrestrial derivatization reagents carried for SAM's wet chemistry experiment, which contribute to the SAM background. Nitrogen species detected in the SAM solid-sample analyses can also be produced during laboratory pyrolysis experiments in which these reagents are heated in the presence of perchlorate, a compound that has also been identified by SAM in Mars solid samples.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anastas, M.Y.; Caplan, P.E.; Froehlich, P.A.
An on-site visit was made to the Ortho Pharmaceutical Corporation (OPC), Raritan, New Jersey, to evaluate methods of controlling exposure to hazardous materials during the manufacturing of medications. OPC produced oral-contraceptive tablets containing norethindrone (NOR), mestranol, and ethynylestradiol (EE). Ventilation was an important engineering control at this site. Other engineering controls included the isolation of work procedures and automation of work practices for weighing ingredients, granulation of substances, tableting, and packaging. Area samples were taken for air monitoring of steroid concentration levels in each manufacturing area. Access to the work areas was only through the locker rooms. Samples taken in the locker rooms revealed no detectable contaminant levels. Workers performing high-risk activities wore air-supplied vinyl suits and disposable rubber gloves. The vinyl suits had overshoes attached. For moderate-risk activities the workers wore a disposable suit, rubber gloves, and shoe covers. Appropriate respirators were provided. Workers in low-risk activities wore disposable rubber gloves and appropriate respirators. Sampling indicated that processing workers experienced breathing-zone levels outside their vinyl suits of 16.40 and 0.36 micrograms/cubic meter of NOR and EE, respectively.
The MCNP6 Analytic Criticality Benchmark Suite
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Forrest B.
2016-06-16
Analytical benchmarks provide an invaluable tool for verifying computer codes used to simulate neutron transport. Several collections of analytical benchmark problems [1-4] are used routinely in the verification of production Monte Carlo codes such as MCNP® [5,6]. Verification of a computer code is a necessary prerequisite to the more complex validation process. The verification process confirms that a code performs its intended functions correctly. The validation process involves determining the absolute accuracy of code results vs. nature. In typical validations, results are computed for a set of benchmark experiments using a particular methodology (code, cross-section data with uncertainties, and modeling) and compared to the measured results from the set of benchmark experiments. The validation process determines bias, bias uncertainty, and possibly additional margins. Verification is generally performed by the code developers, while validation is generally performed by code users for a particular application space. The VERIFICATION_KEFF suite of criticality problems [1,2] was originally a set of 75 criticality problems found in the literature for which exact analytical solutions are available. Even though the spatial and energy detail is necessarily limited in analytical benchmarks, typically to a few regions or energy groups, the exact solutions obtained can be used to verify that the basic algorithms, mathematics, and methods used in complex production codes perform correctly. The present work has focused on revisiting this benchmark suite. A thorough review of the problems resulted in discarding some of them as not suitable for MCNP benchmarking. For the remaining problems, many of them were reformulated to permit execution in either multigroup mode or in the normal continuous-energy mode for MCNP. Execution of the benchmarks in continuous-energy mode provides a significant advance to MCNP verification methods.
Development of paper-based electrochemical sensors for water quality monitoring
NASA Astrophysics Data System (ADS)
Smith, Suzanne; Bezuidenhout, Petroné; Mbanjwa, Mesuli; Zheng, Haitao; Conning, Mariette; Palaniyandy, Nithyadharseni; Ozoemena, Kenneth; Land, Kevin
2016-02-01
We present a method for the development of paper-based electrochemical sensors for detection of heavy metals in water samples. Contaminated water leads to serious health problems and environmental issues. Paper is ideally suited for point-of-care testing, as it is low cost, disposable, and multi-functional. Initial sensor designs were manufactured on paper substrates using combinations of inkjet printing and screen printing technologies using silver and carbon inks. Bismuth onion-like carbon nanoparticle ink was manufactured and used as the active material of the sensor for both commercial and paper-based sensors, which were compared using standard electrochemical analysis techniques. The results highlight the potential of paper-based sensors to be used effectively for rapid water quality monitoring at the point-of-need.
NASA Technical Reports Server (NTRS)
Allen, Carlton; Sellar, Glenn; Nunez, Jorge; Mosie, Andrea; Schwarz, Carol; Parker, Terry; Winterhalter, Daniel; Farmer, Jack
2009-01-01
Astronauts on long-duration lunar missions will need the capability to high-grade their samples to select the highest value samples for transport to Earth and to leave others on the Moon. We are supporting studies to define the necessary and sufficient measurements and techniques for high-grading samples at a lunar outpost. A glovebox, dedicated to testing instruments and techniques for high-grading samples, is in operation at the JSC Lunar Experiment Laboratory. A reference suite of lunar rocks and soils, spanning the full compositional range found in the Apollo collection, is available for testing in this laboratory. Thin sections of these samples are available for direct comparison. The Lunar Sample Compendium, on-line at http://www-curator.jsc.nasa.gov/lunar/compendium.cfm, summarizes previous analyses of these samples. The laboratory, sample suite, and Compendium are available to the lunar research and exploration community. In the first test of possible instruments for lunar sample high-grading, we imaged 18 lunar rocks and four soils from the reference suite using the Multispectral Microscopic Imager (MMI) developed by Arizona State University and JPL (see Farmer et al. abstract). The MMI is a fixed-focus digital imaging system with a resolution of 62.5 microns/pixel, a field size of 40 x 32 mm, and a depth-of-field of approximately 5 mm. Samples are illuminated sequentially by 21 light emitting diodes in discrete wavelengths spanning the visible to shortwave infrared. Measurements of reflectance standards and background allow calibration to absolute reflectance. ENVI-based software is used to produce spectra for specific minerals as well as multi-spectral images of rock textures.
Geochemical and isotopic water results, Barrow, Alaska, 2012-2013
Heikoop, Jeff; Wilson, Cathy; Newman, Brent
2012-07-18
Data include a large suite of analytes (geochemical and isotopic) for samples collected in Barrow, Alaska (2012-2013). Sample types are indicated, and include soil pore waters, drainage waters, snowmelt, precipitation, and permafrost samples.
System Aware Cybersecurity: A Multi-Sentinel Scheme to Protect a Weapons Research Lab
2015-12-07
In the simplified deployment scenario, some sensors report their output over a wireless link and other sensors are connected via CAT 5 (Ethernet...cable to reduce the chance of a wireless ‘jamming’ event impacting ALL sensors. In addition to this first sensor suite (Sensor Suite “A”), the team...generating wind turbines, and video reconnaissance systems on unmanned aerial vehicles (UAVs). The most basic decision problem in designing a systems
Dame, Brittany E; Solomon, D Kip; Evans, William C.; Ingebritsen, Steven E.
2015-01-01
Helium (He) concentration and 3He/4He anomalies in soil gas and spring water are potentially powerful tools for investigating hydrothermal circulation associated with volcanism and could perhaps serve as part of a hazards warning system. However, in operational practice, He and other gases are often sampled only after volcanic unrest is detected by other means. A new passive diffusion sampler suite, intended to be collected after the onset of unrest, has been developed and tested as a relatively low-cost method of determining He-isotope composition pre- and post-unrest. The samplers, each with a distinct equilibration time, passively record He concentration and isotope ratio in springs and soil gas. Once collected and analyzed, the He concentrations in the samplers are used to deconvolve the time history of the He concentration and the 3He/4He ratio at the collection site. The current suite, consisting of three samplers, is sufficient to deconvolve both the magnitude and the timing of a step change in in situ concentration if the suite is collected within 100 h of the change. The effects of temperature and prolonged deployment on the suite's capability of recording He anomalies have also been evaluated. The suite has captured a significant 3He/4He soil gas anomaly at Horseshoe Lake near Mammoth Lakes, California. The passive diffusion sampler suite appears to be an accurate and affordable alternative for determining He anomalies associated with volcanic unrest.
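The deconvolution step can be sketched with a simple forward model. Assuming each sampler equilibrates as a first-order system dC/dt = (c(t) - C)/tau (an assumption for illustration; the paper's actual sampler physics may differ), the magnitude and timing of a step change can be recovered from three readings with distinct equilibration times by a grid search. All numbers below are invented.

```python
import math

def sampler_reading(tau, t_collect, c0, c1, t_step):
    """First-order passive sampler (dC/dt = (c(t) - C)/tau), initially
    equilibrated at background c0, exposed to a step to c1 at t_step."""
    if t_collect <= t_step:
        return c0
    return c1 + (c0 - c1) * math.exp(-(t_collect - t_step) / tau)

taus = (10.0, 30.0, 90.0)            # three distinct equilibration times (hours)
c0, c1_true, t_step_true = 1.0, 5.0, 40.0
t_collect = 80.0                     # suite collected within 100 h of the change
readings = [sampler_reading(tau, t_collect, c0, c1_true, t_step_true) for tau in taus]

# Deconvolve magnitude and timing by least-squares grid search over (c1, t_step).
def misfit(c1, t_step):
    return sum((sampler_reading(tau, t_collect, c0, c1, t_step) - r) ** 2
               for tau, r in zip(taus, readings))

best = min(((c1, ts)
            for c1 in [1.0 + 0.1 * k for k in range(100)]
            for ts in [0.5 * k for k in range(160)]),
           key=lambda p: misfit(*p))
```

Three samplers give three equations for the two unknowns, which is why a suite of three suffices to pin down both the magnitude and the timing of the step.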
Concordancers and Dictionaries as Problem-Solving Tools for ESL Academic Writing
ERIC Educational Resources Information Center
Yoon, Choongil
2016-01-01
The present study investigated how 6 Korean ESL graduate students in Canada used a suite of freely available reference resources, consisting of Web-based corpus tools, Google search engines, and dictionaries, for solving linguistic problems while completing an authentic academic writing assignment in English. Using a mixed methods design, the…
Racism; A Film Course Study Guide.
ERIC Educational Resources Information Center
Kernan, Margot
The medium of film is uniquely suited to the representation of social problems such as racism. By stressing major issues of racism--slavery, the black cultural heritage, black power, and the black civil rights movement--and coupling these issues with films which give a realistic view of the substance and problems of racism, both concepts…
Assessment of a hybrid finite element and finite volume code for turbulent incompressible flows
Xia, Yidong; Wang, Chuanjin; Luo, Hong; ...
2015-12-15
Hydra-TH is a hybrid finite-element/finite-volume incompressible/low-Mach flow simulation code based on the Hydra multiphysics toolkit being developed and used for thermal-hydraulics applications. In the present work, a suite of verification and validation (V&V) test problems for Hydra-TH was defined to meet the design requirements of the Consortium for Advanced Simulation of Light Water Reactors (CASL). The intent for this test problem suite is to provide baseline comparison data that demonstrates the performance of the Hydra-TH solution methods. The simulation problems vary in complexity from laminar to turbulent flows. A set of RANS and LES turbulence models were used in the simulation of four classical test problems. Numerical results obtained by Hydra-TH agreed well with either the available analytical solution or experimental data, indicating the verified and validated implementation of these turbulence models in Hydra-TH. Where possible, we have attempted some form of solution verification to identify sensitivities in the solution methods, and to suggest best practices when using the Hydra-TH code.
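The kind of verification described above can be illustrated on the simplest laminar case. This is a minimal, self-contained sketch (not Hydra-TH): a second-order finite-difference solve of plane Poiseuille channel flow, checked against the exact quadratic profile, which a second-order scheme reproduces to round-off.

```python
def poiseuille_analytic(y, G, mu, h):
    """Exact laminar channel-flow profile u(y) = G/(2 mu) * y * (h - y)."""
    return G / (2.0 * mu) * y * (h - y)

def poiseuille_fd(n, G, mu, h):
    """Second-order finite-difference solve of mu * u'' = -G with
    u(0) = u(h) = 0, via the Thomas algorithm for the tridiagonal system."""
    dy = h / (n + 1)
    a = [1.0] * n                    # sub-diagonal
    b = [-2.0] * n                   # diagonal
    c = [1.0] * n                    # super-diagonal
    d = [-G / mu * dy * dy] * n      # right-hand side
    # Forward elimination
    for i in range(1, n):
        m = a[i] / b[i - 1]
        b[i] -= m * c[i - 1]
        d[i] -= m * d[i - 1]
    # Back substitution
    u = [0.0] * n
    u[-1] = d[-1] / b[-1]
    for i in range(n - 2, -1, -1):
        u[i] = (d[i] - c[i] * u[i + 1]) / b[i]
    return [(i + 1) * dy for i in range(n)], u

ys, u_num = poiseuille_fd(n=49, G=1.0, mu=1.0, h=1.0)
err = max(abs(u - poiseuille_analytic(y, 1.0, 1.0, 1.0)) for y, u in zip(ys, u_num))
```

Because the exact solution is quadratic, the central-difference stencil is exact for it, so the verification check should pass at machine precision; any larger error would flag a bug in the solver.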
NASA Technical Reports Server (NTRS)
Snyder, Gregory A.; Neal, Clive R.; Taylor, Lawrence A.; Halliday, Alex N.
1995-01-01
The earliest evolution of the Moon likely included the formation of a magma ocean and the subsequent development of anorthositic flotation cumulates. This primary anorthositic crust was then intruded by mafic magmas which crystallized to form the lunar highlands magnesian suite. The present study is a compilation of petrologic, mineral-chemical, and geochemical information on all pristine magnesian-suite plutonic rocks and the interpretation of this data in light of 18 'new' samples. Of these 18 clasts taken from Apollo 14 breccias, 12 are probably pristine and include four dunites, two norites, four troctolites, and two anorthosites. Radiogenic isotopic whole rock data also are reported for one of the 'probably pristine' anorthositic troctolites, sample 14303,347. The relatively low Rb content and high Sm and Nd abundances of 14303,347 suggest that this cumulate rock was derived from a parental magma which had these chemical characteristics. Trace element, isotopic, and mineral-chemical data are used to interpret the total highlands magnesian suite as crustal precipitates of a primitive KREEP (possessing a K-, rare earth element (REE)-, and P-enriched chemical signature) basalt magma. This KREEP basalt was created by the mixing of ascending ultramafic melts from the lunar interior with urKREEP (the late, K-, REE-, and P-enriched residuum of the lunar magma ocean). A few samples of the magnesian suite with extremely elevated large-ion lithophile elements (5-10x other magnesian-suite rocks) cannot be explained by this model or any other model of autometasomatism, equilibrium crystallization, or 'local melt-pocket equilibrium' without recourse to an extremely large-ion lithophile element-enriched parent liquid. 
It is difficult to generate parental liquids which are 2-4x higher in the REE than average lunar KREEP, unless the liquids are the basic complement of a liquid-liquid pair, i.e., the so-called 'REEP-fraction,' from the silicate liquid immiscibility of urKREEP. Scarce age information on lunar rocks suggests that magnesian-suite magmatism was initiated at progressively more recent times from the northeast to the southwest on the lunar nearside, from 4.45 to 4.25 Ga.
A bicriteria heuristic for an elective surgery scheduling problem.
Marques, Inês; Captivo, M Eugénia; Vaz Pato, Margarida
2015-09-01
Resource rationalization and reduction of waiting lists for surgery are two main guidelines for hospital units outlined in the Portuguese National Health Plan. This work is dedicated to an elective surgery scheduling problem arising in a Lisbon public hospital. In order to increase the surgical suite's efficiency and to reduce the waiting lists for surgery, two objectives are considered: maximize surgical suite occupation and maximize the number of surgeries scheduled. This elective surgery scheduling problem consists of assigning an intervention date, an operating room and a starting time for elective surgeries selected from the hospital waiting list. Accordingly, a bicriteria surgery scheduling problem arising in the hospital under study is presented. To search for efficient solutions of the bicriteria optimization problem, the minimization of a weighted Chebyshev distance to a reference point is used. A constructive and improvement heuristic procedure specially designed to address the objectives of the problem is developed and results of computational experiments obtained with empirical data from the hospital are presented. This study shows that by using the bicriteria approach presented here it is possible to build surgical plans with very good performance levels. This method can be used within an interactive approach with the decision maker. It can also be easily adapted to other hospitals with similar scheduling conditions.
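The weighted Chebyshev scalarization used above can be sketched directly. The candidate plans, reference point, and weights below are invented for illustration, not the hospital's data:

```python
def chebyshev_score(objectives, reference, weights):
    """Weighted Chebyshev distance to a reference (ideal) point for a
    maximization problem: smaller is better, and minimizing it over the
    feasible set yields (weakly) Pareto-optimal solutions."""
    return max(w * (r - z) for z, r, w in zip(objectives, reference, weights))

# Toy candidate schedules: (operating-room occupation %, surgeries scheduled).
candidates = {
    "plan_a": (90, 12),
    "plan_b": (70, 18),
    "plan_c": (85, 16),
}
reference = (100, 20)   # ideal point: full occupation, all surgeries scheduled
weights = (1.0, 5.0)    # emphasize the number of surgeries scheduled

best = min(candidates,
           key=lambda k: chebyshev_score(candidates[k], reference, weights))
```

Varying the weights traces out different efficient solutions, which is what makes the scalarization usable inside an interactive approach with the decision maker.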
Vijaysegaran, Praveen; Knibbs, Luke D; Morawska, Lidia; Crawford, Ross W
2018-05-01
The role of space suits in the prevention of orthopedic prosthetic joint infection remains unclear. Recent evidence suggests that space suits may in fact contribute to increased infection rates, with bioaerosol emissions from space suits identified as a potential cause. This study aimed to compare the particle and microbiological emission rates (PER and MER) of space suits and standard surgical clothing. A comparison of emission rates between space suits and standard surgical clothing was performed in a simulated surgical environment during 5 separate experiments. Particle counts were analyzed with 2 separate particle counters capable of detecting particles between 0.1 and 20 μm. An Andersen impactor was used to sample bacteria, with culture counts performed at 24 and 48 hours. Four experiments consistently showed statistically significant increases in both PER and MER when space suits are used compared with standard surgical clothing. One experiment showed inconsistent results, with a trend toward increases in both PER and MER when space suits are used compared with standard surgical clothing. Space suits cause increased PER and MER compared with standard surgical clothing. This finding provides mechanistic evidence to support the increased prosthetic joint infection rates observed in clinical studies. Copyright © 2017 Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Hennessy, Mary J.
1992-01-01
The Electron Microscopy Abrasion Analysis of Candidate Fabrics for Planetary Space Suit Protective Overgarment Application is in support of the Abrasion Resistance Materials Screening Test. The fundamental assumption made for the SEM abrasion analysis was that woven fabrics to be used as the outermost layer of the protective overgarment in the design of the future, planetary space suits perform best when new. It is the goal of this study to determine which of the candidate fabrics was abraded the least in the tumble test. The sample that was abraded the least will be identified at the end of the report as the primary candidate fabric for further investigation. In addition, this analysis will determine if the abrasion seen by the laboratory tumbled samples is representative of actual EVA Apollo abrasion.
Development of the DL/H-1 full pressure suit for private spaceflight
NASA Astrophysics Data System (ADS)
León, Pablo de; Harris, Gary L.
2010-06-01
The objective of this paper is to detail the need for full pressure suits to protect spaceflight participants during the experimental phases of flight testing of new space vehicles. It also details the objectives, historical background, basis for design, problems encountered by the designers and final development of the DL/H-1 full pressure suit. It will include justification for its use and results of the initial tests in the high altitude chamber and spacecraft simulator at the J.D. Odegard School of Aerospace Sciences at the University of North Dakota. For the test flights of early commercial space vehicles and tourist suborbital spacecraft, emergency protection from the rarefied air of the upper atmosphere and the vacuum of low Earth orbit almost certainly will be a requirement. Suborbital vehicles could be operating in "space equivalent conditions" for as long as 30 min to as much as several hours. In the case of cabin pressure loss, without personal protection, catastrophic loss of crew and vehicle could result. This paper explains the different steps taken by the authors, who designed and built a preflight hardware pressure suit that can meet the physiological and comfort requirements of the tourist suborbital industry and the early commercial private spaceflight community. The suborbital tourist and commercial spaceflight industry have unique problems confronting the pressure suit builder, such as unpressurized comfort, reasonable expense, unique sizing of the general population, decompression complications of persons not fitting a past military physiology profile, and equipment weight issues. In addition, the lack of a certifying agency or guidance from international or national aviation authorities has created the opportunity for the emerging civilian pressure suit industry to create a new safety standard by which it can regulate itself in the same way the recreational SCUBA diving industry has since the late 1950s.
Major Volatiles Released from the Fourth John Klein Portion
2013-04-08
As the Sample Analysis at Mars (SAM) suite of instruments on NASA's Curiosity Mars rover heats a sample, gases are released, or evolved, from the sample and can be identified using SAM's quadrupole mass spectrometer.
Unsteady Solution of Non-Linear Differential Equations Using Walsh Function Series
NASA Technical Reports Server (NTRS)
Gnoffo, Peter A.
2015-01-01
Walsh functions form an orthonormal basis set consisting of square waves. The discontinuous nature of square waves makes the system well suited for representing functions with discontinuities. The product of any two Walsh functions is another Walsh function - a feature that can radically change an algorithm for solving non-linear partial differential equations (PDEs). The solution algorithm for non-linear differential equations using Walsh function series is unique in that integrals and derivatives may be computed using simple matrix multiplication of series representations of functions. Solutions to PDEs are derived as functions of wave component amplitude. Three sample problems are presented to illustrate the Walsh function series approach to solving unsteady PDEs: an advection equation, a Burgers equation, and a Riemann problem. The sample problems demonstrate the use of the Walsh function solution algorithms, exploiting Fast Walsh Transforms in multiple dimensions (O(N log N)). Details of a Fast Walsh Reciprocal, defined here for the first time, enable inversion of a Walsh symmetric matrix in O(N log N) operations. Walsh functions have been derived using a fractal recursion algorithm, and these fractal patterns are observed in the progression of pairs of wave number amplitudes in the solutions. These patterns are most easily observed in a remapping defined as a fractal fingerprint (FFP). A prolongation of existing solutions to the next highest order exploits these patterns. The algorithms presented here are considered a work in progress that provides new alternatives and new insights into the solution of non-linear PDEs.
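The O(N log N) fast transform mentioned in this abstract can be sketched in a few lines. Below is a minimal fast Walsh-Hadamard transform in Hadamard (natural) ordering; true Walsh (sequency) ordering differs only by a permutation of the outputs, and the function name and unnormalized convention are illustrative assumptions, not taken from the paper.

```python
def fwht(a):
    """Unnormalized fast Walsh-Hadamard transform (Hadamard ordering).

    Runs in O(N log N) for N a power of two; sequency (Walsh)
    ordering differs only by a permutation of the outputs.
    """
    a = list(a)                      # work on a copy
    n = len(a)
    assert n and (n & (n - 1)) == 0, "length must be a power of two"
    h = 1
    while h < n:
        # Butterfly pass: combine elements h apart, as in an FFT.
        for i in range(0, n, h * 2):
            for j in range(i, i + h):
                x, y = a[j], a[j + h]
                a[j], a[j + h] = x + y, x - y
        h *= 2
    return a
```

Since the unnormalized transform matrix H satisfies H·H = N·I, applying `fwht` twice returns the input scaled by N, which is a convenient sanity check.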
Pérez-Del-Olmo, A; Montero, F E; Fernández, M; Barrett, J; Raga, J A; Kostadinova, A
2010-10-01
We address the effect of spatial scale and temporal variation on model generality when forming predictive models for fish assignment, applying a new data mining approach, Random Forests (RF), to variable biological markers (parasite community data). Models were implemented for a fish host-parasite system sampled along the Mediterranean and Atlantic coasts of Spain and were validated using independent datasets. We considered 2 basic classification problems in evaluating the importance of variations in parasite infracommunities for assignment of individual fish to their populations of origin: a multiclass task (2-5 population models, using 2 seasonal replicates from each of the populations) and a 2-class task (using 4 seasonal replicates each from 1 Atlantic and 1 Mediterranean population). The main results are that (i) RF are well suited for multiclass population assignment using parasite communities in non-migratory fish; (ii) RF provide an efficient means for model cross-validation on the baseline data, which allows sample size limitations in parasite tag studies to be tackled effectively; (iii) the performance of RF is dependent on the complexity and spatial extent/configuration of the problem; and (iv) the development of predictive models is strongly influenced by seasonal change, which stresses the importance of both temporal replication and model validation in parasite tagging studies.
NASA Technical Reports Server (NTRS)
Papike, J. J.; Fowler, G. W.; Shearer, C. K.
1994-01-01
The lunar Mg suite, which includes dunites, troctolites, and norites, could make up 20-30% of the Moon's crust down to a depth of 60 km; the remainder is largely anorthositic. This report focuses on norites because we have found that the chemical characteristics of orthopyroxene are effective recorders of their parental melt compositions. Many of the samples representing the Mg suite are small and unrepresentative. In addition, they are cumulates and thus are difficult to study by whole-rock techniques. Therefore, we decided to study these rocks by SIMS techniques, analyzing a suite of trace elements in orthopyroxene. The 12 norite samples were selected from a recent compilation by Warren, who attempted to select the best candidate samples from the standpoint of their pristine character. Our present database includes more than 300 high-quality electron microprobe (EMP) analyses and more than 50 secondary ion mass spectrometry (SIMS) analyses for 8 rare earth elements (REE), Zr, Y, and Sr. The Mg#s for the parental melts calculated from Mg#s in orthopyroxene show that most melts have Mg#s in the range of 0.36-0.60. This compares with a range of Mg#s for lunar volcanic picritic glass beads of 0.4-0.68. Therefore, although the cumulate whole-rock compositions of the Mg suite can be extremely magnesian, the calculated parental melts are not anomalously high in Mg. A chemical characteristic of the Mg-suite norites that is more difficult to explain is the high KREEP content of the calculated parental melts. The calculated norite parental melts have REE contents that match or exceed the high-K KREEP component of Warren. Therefore, mixing of a KREEP component and a picritic melt cannot, by itself, explain the high estimated REE contents of the melts parental to norites. Advanced crystallization following KREEP incorporation, especially of plagioclase, may also be required.
Spacesuit and Space Vehicle Comparative Ergonomic Evaluation
NASA Technical Reports Server (NTRS)
England, Scott; Benson, Elizabeth; Cowley, Matthew; Harvill, Lauren; Blackledge, Christopher; Perez, Esau; Rajulu, Sudhakar
2011-01-01
With the advent of the latest manned spaceflight objectives, a series of prototype launch and reentry spacesuit architectures were evaluated for eventual down-selection by NASA, based on performance in a set of designated tasks. A consolidated approach was taken to testing, concurrently collecting suit mobility data, seat-suit-vehicle interface clearances, and movement strategies within the volume of a Multi-Purpose Crew Vehicle mockup. To achieve the objectives of the test, a requirement was set forth to maintain high mockup fidelity while using advanced motion capture technologies. These seemingly mutually exclusive goals were accommodated with the construction of an optically transparent and fully adjustable frame mockup. The mockup was constructed such that it could be dimensionally validated rapidly with the motion capture system. This paper describes the method used to create a motion-capture-compatible space vehicle mockup, the consolidated approach for evaluating spacesuits in action, and the methods for generating hardware requirements for an entire population from the resulting complex data set using a limited number of test subjects. Kinematics, hardware clearance, suited anthropometry, and subjective feedback data were recorded on fifteen unsuited and five suited subjects. Unsuited subjects were selected chiefly by anthropometry, in an attempt to find individuals who fell within predefined criteria for medium males, large males, and small females. The suited subjects were selected as a subset of the unsuited subjects and tested in both unpressurized and pressurized conditions. Since the prototype spacesuits were fabricated in a single size to accommodate an approximately average-sized male, the findings from the suit testing were systematically extrapolated to the extremes of the population to anticipate likely problem areas.
This extrapolation was achieved by first performing a population analysis comparing suited subjects' performance to their unsuited performance and then applying the results to the entire population range. The use of a transparent space vehicle mockup enabled the collection of large amounts of data during human-in-the-loop testing. Mobility data revealed that most of the tested spacesuits had sufficient ranges of motion for tasks to be performed successfully. A failed task by a suited subject most often stemmed from a combination of poor field of view while seated and poor dexterity of the gloves when pressurized, or from suit/vehicle interface issues. Seat ingress/egress testing showed that problems with anthropometric accommodation do not occur exclusively with the largest or smallest subjects, but rather with specific combinations of measurements that lead to narrower seat ingress/egress clearance.
Automated model optimisation using the Cylc workflow engine (Cyclops v1.0)
NASA Astrophysics Data System (ADS)
Gorman, Richard M.; Oliver, Hilary J.
2018-06-01
Most geophysical models include many parameters that are not fully determined by theory and can be tuned to improve the model's agreement with available data. We might attempt to automate this tuning process in an objective way by employing an optimisation algorithm to find the set of parameters that minimises a cost function derived from comparing model outputs with measurements. A number of algorithms are available for solving optimisation problems, in various programming languages, but interfacing such software to a complex geophysical model simulation presents certain challenges. To tackle this problem, we have developed an optimisation suite (Cyclops) based on the Cylc workflow engine that implements a wide selection of optimisation algorithms from the NLopt Python toolbox (Johnson, 2014). The Cyclops optimisation suite can be used to calibrate any modelling system that has itself been implemented as a (separate) Cylc model suite, provided it includes computation and output of the desired scalar cost function. A growing number of institutions are using Cylc to orchestrate complex distributed suites of interdependent cycling tasks within their operational forecast systems, and in such cases application of the optimisation suite is particularly straightforward. As a test case, we applied Cyclops to calibrate a global implementation of the WAVEWATCH III (v4.18) third-generation spectral wave model, forced by ERA-Interim input fields. This was calibrated over a 1-year period (1997), before applying the calibrated model to a full (1979-2016) wave hindcast. The chosen error metric was the spatial average of the root mean square error of hindcast significant wave height compared with collocated altimeter records. We describe the results of a calibration in which up to 19 parameters were optimised.
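The core idea of treating a model run as a black-box cost function can be illustrated without Cylc or NLopt. The sketch below uses a simple derivative-free compass (pattern) search in place of the NLopt algorithms the paper actually wraps, and the quadratic `toy_cost` is a hypothetical stand-in for the wave-height RMSE a real model suite would compute.

```python
def compass_search(cost, x0, step=0.5, tol=1e-6, max_evals=1000):
    """Minimal derivative-free pattern search: probe +/- step along each
    coordinate, accept any improvement, and halve the step when a full
    sweep fails to improve."""
    x = list(x0)
    fx = cost(x)
    evals = 1
    while step > tol and evals < max_evals:
        improved = False
        for i in range(len(x)):
            for delta in (step, -step):
                trial = list(x)
                trial[i] += delta
                ft = cost(trial)
                evals += 1
                if ft < fx:
                    x, fx, improved = trial, ft, True
                    break
        if not improved:
            step *= 0.5
    return x, fx

# Hypothetical stand-in for the scalar error metric (e.g. wave-height
# RMSE) that a full Cylc model suite would compute for each trial run.
def toy_cost(p):
    return (p[0] - 1.2) ** 2 + (p[1] + 0.7) ** 2

best, err = compass_search(toy_cost, [0.0, 0.0])
```

In Cyclops itself each `cost(trial)` evaluation corresponds to a full model-suite run orchestrated by Cylc, so derivative-free methods that tolerate expensive, noisy evaluations are the natural fit.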
Design and Evaluation of a Ventilated Garment for Use in Temperatures up to 200°C
Crockford, G. W.; Hellon, R. F.
1964-01-01
The protection of personnel against high air and radiant temperatures is a problem that has confronted industry for many years, and for many industrial situations it still has not been solved. The experiments reported here were intended to determine the most suitable form of insulation for a hot entry suit for use primarily in furnace wrecking, where mean radiant temperatures of 200°C are met and where heat-reflecting garments are unsuitable owing to the rapid deterioration of the reflecting surface. From a preliminary consideration of the problem it was concluded that a ventilated garment was required, and that conventional ventilated garments in which air is induced to flow parallel to the body surfaces (axial ventilation) are basically unsound in design, as the air is not utilized to transfer heat in the most efficient manner. A new form of ventilation was therefore developed in which air flows out through a permeable suit (radial ventilation). This form of ventilation produces what is called dynamic insulation, and this method of insulation, when compared with two alternative methods on a physical model, was found to be very effective. The model experiments were confirmed by comparative trials of three ventilated suits, each using one of three different forms of insulation thought to be suitable for use in heat-protective clothing. Physiological measurements made on the subjects and physical measurements made on the suits confirmed that dynamic insulation is the most suitable insulation for a hot entry suit for furnace wrecking. With the air flows used in these experiments, dynamic insulation had a thermal conductance one-fifth that of conventional static insulation, and sweat losses and oral temperature rises were reduced by one-third and one-half, respectively. PMID:14180476
EVA Suit Microbial Leakage Investigation Project
NASA Technical Reports Server (NTRS)
Falker, Jay; Baker, Christopher; Clayton, Ronald; Rucker, Michelle
2016-01-01
The objective of this project is to collect microbial samples from various EVA suits to determine how much microbial contamination is typically released during simulated planetary exploration activities. Data will be released to the planetary protection and science communities and to advanced EVA system designers. In the best-case scenario, we will discover that very little microbial contamination leaks from our current or prototype suit designs; in the worst-case scenario, we will identify leak paths and learn more about what affects leakage. Either way, we will have a new, flight-certified swab tool for our EVA toolbox.
Elastic-Tether Suits for Artificial Gravity and Exercise
NASA Technical Reports Server (NTRS)
Torrance, Paul; Biesinger, Paul; Rybicki, Daniel D.
2005-01-01
Body suits harnessed to systems of elastic tethers have been proposed as a means of approximating the effects of normal Earth gravitation on crewmembers of spacecraft in flight, to help preserve the crewmembers' physical fitness. The suits could also be used on Earth to increase effective gravitational loads for purposes of athletic training. The suit according to the proposal would include numerous small tether-attachment fixtures distributed over its outer surface so as to distribute the artificial gravitational force as nearly evenly as possible over the wearer's body. Elastic tethers would be connected between these fixtures and a single attachment fixture on a main elastic tether that would be anchored to a fixture on or under a floor. This fixture might include multiple pulleys to make the effective length of the main tether great enough that normal motions of the wearer cause no more than acceptably small variations in the total artificial gravitational force. Among the problems in designing the suit would be equalizing the load in the shoulder area and keeping tethers out of the way below the knees to prevent tripping; the solution would likely include running tethers through rings on the sides. Body suits with a weight or water ballast system are also proposed for slowly spinning space-station scenarios, in which case the proposed body suits would easily be able to provide the equivalent of a 1-G or even greater load.
NASA Technical Reports Server (NTRS)
Stern, J. C.; Malespin, C. A.; Eigenbrode, J.; Graham, H. V.; Archer, P. D.; Brunner, A.; Freissinet, C.; Franz, H. B.; Fuentes, J.; Glavin, D. P.;
2014-01-01
The combustion experiment on the Sample Analysis at Mars (SAM) suite on Curiosity will heat a sample of Mars regolith in the presence of oxygen and measure the composition of the evolved gases using quadrupole mass spectrometry (QMS) and tunable laser spectrometry (TLS). QMS will enable detection of combustion products such as CO, CO2, NO, and other oxidized species, while TLS will enable precision measurements of the abundance and carbon isotopic composition (delta C-13) of the evolved CO2 and the hydrogen isotopic composition (delta D) of H2O. SAM will perform a two-step combustion to isolate combustible materials below approx. 550°C and above approx. 550°C.
Virtual reality in laparoscopic surgery.
Uranüs, Selman; Yanik, Mustafa; Bretthauer, Georg
2004-01-01
Although its many advantages have made laparoscopic surgery an established technique, training in laparoscopic surgery poses problems not encountered in conventional surgical training. Virtual reality simulators open up new perspectives for training in laparoscopic surgery. Under realistic conditions in real time, trainees can tailor their sessions with the VR simulator to suit their needs and goals, and can repeat exercises as often as they wish. VR simulators reduce the number of experimental animals needed for training purposes and are well suited to the pursuit of research in laparoscopic surgery.
Seeking Signs of Life on Mars: The Importance of Sedimentary Suites as Part of Mars Sample Return
NASA Astrophysics Data System (ADS)
iMOST Team; Mangold, N.; McLennan, S. M.; Czaja, A. D.; Ori, G. G.; Tosca, N. J.; Altieri, F.; Amelin, Y.; Ammannito, E.; Anand, M.; Beaty, D. W.; Benning, L. G.; Bishop, J. L.; Borg, L. E.; Boucher, D.; Brucato, J. R.; Busemann, H.; Campbell, K. A.; Carrier, B. L.; Debaille, V.; Des Marais, D. J.; Dixon, M.; Ehlmann, B. L.; Farmer, J. D.; Fernandez-Remolar, D. C.; Fogarty, J.; Glavin, D. P.; Goreva, Y. S.; Grady, M. M.; Hallis, L. J.; Harrington, A. D.; Hausrath, E. M.; Herd, C. D. K.; Horgan, B.; Humayun, M.; Kleine, T.; Kleinhenz, J.; Mackelprang, R.; Mayhew, L. E.; McCubbin, F. M.; McCoy, J. T.; McSween, H. Y.; Moser, D. E.; Moynier, F.; Mustard, J. F.; Niles, P. B.; Raulin, F.; Rettberg, P.; Rucker, M. A.; Schmitz, N.; Sefton-Nash, E.; Sephton, M. A.; Shaheen, R.; Shuster, D. L.; Siljestrom, S.; Smith, C. L.; Spry, J. A.; Steele, A.; Swindle, T. D.; ten Kate, I. L.; Usui, T.; Van Kranendonk, M. J.; Wadhwa, M.; Weiss, B. P.; Werner, S. C.; Westall, F.; Wheeler, R. M.; Zipfel, J.; Zorzano, M. P.
2018-04-01
Sedimentary, and especially lacustrine, depositional environments are high-priority geological/astrobiological settings for Mars Sample Return. We review the detailed investigations, measurements, and sample types required to evaluate such settings.
Shkolyar, Svetlana; Eshelman, Evan J; Farmer, Jack D; Hamilton, David; Daly, Michael G; Youngbull, Cody
2018-04-01
The Mars 2020 mission will analyze samples in situ and identify any that could have preserved biosignatures in ancient habitable environments for later return to Earth. Highest priority targeted samples include aqueously formed sedimentary lithologies. On Earth, such lithologies can contain fossil biosignatures as aromatic carbon (kerogen). In this study, we analyzed nonextracted kerogen in a diverse suite of natural, complex samples using colocated UV excitation (266 nm) time-gated (UV-TG) Raman and laser-induced fluorescence spectroscopies. We interrogated kerogen and its host matrix in samples to (1) explore the capabilities of UV-TG Raman and fluorescence spectroscopies for detecting kerogen in high-priority targets in the search for possible biosignatures on Mars; (2) assess the effectiveness of time gating and UV laser wavelength in reducing fluorescence in Raman spectra; and (3) identify sample-specific issues that could challenge rover-based identifications of kerogen using UV-TG Raman spectroscopy. We found that ungated UV Raman spectroscopy is suited to identify diagnostic kerogen Raman bands without interfering fluorescence and that UV fluorescence spectroscopy is suited to identify kerogen. These results highlight the value of combining colocated Raman and fluorescence spectroscopies, similar to those obtainable by SHERLOC on Mars 2020, to strengthen the confidence of kerogen detection as a potential biosignature in complex natural samples. Key Words: Raman spectroscopy-Laser-induced fluorescence spectroscopy-Mars Sample Return-Mars 2020 mission-Kerogen-Biosignatures. Astrobiology 18, 431-453.
Basalt generation at the Apollo 12 site. Part 1: New data, classification, and re-evaluation
NASA Technical Reports Server (NTRS)
Neal, Clive R.; Hacker, Matthew D.; Snyder, Gregory A.; Taylor, Lawrence A.; Liu, Yun-Gang; Schmitt, Roman A.
1994-01-01
New data are reported from five previously unanalyzed Apollo 12 mare basalts and are incorporated into an evaluation of previous petrogenetic models and classification schemes for these basalts. This paper proposes a classification for Apollo 12 mare basalts on the basis of whole-rock Mg# (molar 100*(Mg/(Mg+Fe))) and Rb/Sr ratio (analyzed by isotope dilution), whereby the ilmenite, olivine, and pigeonite basalt groups are readily distinguished from each other. Scrutiny of the Apollo 12 feldspathic 'suite' demonstrates that two of the three basalts previously assigned to this group (12031, 12038, 12072) can be reclassified: 12031 is a plagioclase-rich pigeonite basalt, and 12072 is an olivine basalt. Only basalt 12038 stands out as a unique sample at the Apollo 12 site, but whether it represents a single sample from another flow at the Apollo 12 site or is exotic to the site is equivocal. The question of whether the olivine and pigeonite basalt suites are co-magmatic is addressed by incompatible trace-element chemistry: the trends defined by these two suites when Co/Sm and Sm/Eu ratios are plotted against Rb/Sr ratio demonstrate that the two basaltic types cannot be co-magmatic. Crystal fractionation/accumulation paths have been calculated and show that none of the pigeonite, olivine, and ilmenite basalt suites are related by this process. Each suite requires a distinct and separate source region. This study also examines sample heterogeneity and the degree to which whole-rock analyses are representative, which is critical when petrogenetic interpretation is undertaken. Sample heterogeneity has been investigated petrographically (inhomogeneous mineral distribution), with consideration of duplicate analyses and of whether a specific sample (using average data) plots consistently upon a fractionation trend when a number of different compositional parameters are considered.
Using these criteria, four basalts have been identified whose reported analyses are not representative of the whole-rock composition: 12005, an ilmenite basalt; 12006 and 12036, olivine basalts; and 12031, previously classified as a feldspathic basalt but reclassified here as part of the pigeonite suite.
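The whole-rock Mg# used in this classification, molar 100*(Mg/(Mg+Fe)), is simple to compute from oxide weight percentages. The sketch below converts MgO and FeO wt% to molar proportions using standard molar masses; the function name and the input values are illustrative, not measured Apollo 12 compositions.

```python
MGO_MOLAR_MASS = 40.304   # g/mol (Mg 24.305 + O 15.999)
FEO_MOLAR_MASS = 71.844   # g/mol (Fe 55.845 + O 15.999)

def mg_number(mgo_wt_pct, feo_wt_pct):
    """Whole-rock Mg# = molar 100*(Mg/(Mg+Fe)), from oxide wt%."""
    mg = mgo_wt_pct / MGO_MOLAR_MASS   # mol Mg per 100 g of rock
    fe = feo_wt_pct / FEO_MOLAR_MASS   # mol Fe(2+) per 100 g of rock
    return 100.0 * mg / (mg + fe)

# Illustrative oxide values only, not a measured Apollo 12 composition:
print(round(mg_number(10.0, 20.0), 1))
```

Because the conversion works in moles, equal weight percentages of MgO and FeO do not give Mg# = 50: the lighter MgO contributes more moles per gram.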
Unstructured Adaptive (UA) NAS Parallel Benchmark. Version 1.0
NASA Technical Reports Server (NTRS)
Feng, Huiyu; VanderWijngaart, Rob; Biswas, Rupak; Mavriplis, Catherine
2004-01-01
We present a complete specification of a new benchmark for measuring the performance of modern computer systems when solving scientific problems featuring irregular, dynamic memory accesses. It complements the existing NAS Parallel Benchmark suite. The benchmark involves the solution of a stylized heat transfer problem in a cubic domain, discretized on an adaptively refined, unstructured mesh.
Drugs, Devices, and Desires: A Problem-Based Learning Course in the History of Medicine
ERIC Educational Resources Information Center
Levitt, Sarah; McKeage, Anne; Rangachari, P. K.
2013-01-01
Problem-based learning (PBL) is well suited for courses in the history of medicine, where multiple perspectives exist and information has to be gleaned from different sources. A student, an archivist, and a teacher offer three perspectives about a senior level course where students explored the antecedents and consequences of medical technology.…
Solving the Curriculum Sequencing Problem with DNA Computing Approach
ERIC Educational Resources Information Center
Debbah, Amina; Ben Ali, Yamina Mohamed
2014-01-01
In e-learning systems, a learning path is a sequence of learning materials linked to each other to help learners achieve their learning goals. As it is impossible to have a single learning path that suits different learners, the Curriculum Sequencing (CS) problem consists of the generation of a personalized learning path for each…
NASA Astrophysics Data System (ADS)
Azezan, Nur Arif; Ramli, Mohammad Fadzli; Masran, Hafiz
2017-11-01
In this paper, we discuss the literature on blood collection and distribution based on the vehicle routing problem. This problem emerges when the process from collection to stocking must be completed in a timely manner. We also modified the mathematical model so that it suits the general collection of blood. The algorithm and solution methods are also discussed briefly in this paper.
Development of a coupled expert system for the spacecraft attitude control problem
NASA Technical Reports Server (NTRS)
Kawamura, K.; Beale, G.; Schaffer, J.; Hsieh, B.-J.; Padalkar, S.; Rodriguezmoscoso, J.; Vinz, F.; Fernandez, K.
1987-01-01
A majority of the current expert systems focus on the symbolic-oriented logic and inference mechanisms of artificial intelligence (AI). Common rule-based systems employ empirical associations and are not well suited to deal with problems often arising in engineering. Described is a prototype expert system which combines both symbolic and numeric computing. The expert system's configuration is presented and its application to a spacecraft attitude control problem is discussed.
NASA Astrophysics Data System (ADS)
Louro, Vinicius; Cawood, Peter; Mantovani, Marta
2016-04-01
The Jauru Terrain hosts the Figueira Branca Intrusive Suite (FBS) in the SW of the Amazon Craton (Brazil). The FBS is a series of 1425 Ma layered mafic intrusions, previously interpreted as anorogenic. The FBS area is located in the foreland of the Santa Helena orogen, which formed by the subduction of the Rio Alegre Terrain under the Jauru Terrain. Potential field methods (magnetic and gravity), gamma-ray spectrometry, and geochemical and isotope data were used to characterize and model the extent of FBS magmatism and the distribution of faults and shear zones in the area, to evaluate affinities of the magmatic activity, and to assess the relation between the FBS and the Santa Helena orogen. The geophysical methods identified three anomalies corresponding with FBS outcrops. A fourth anomaly with significantly higher amplitude was observed to the north of the three anomalies. From south to north, the anomalies were named Indiavaí, Azteca, Figueira Branca, and Jauru. These anomalies were modeled and indicated a northwest-southeast trend, parallel to regional shear zones. The gamma-ray data guided the collection of 50 samples from the FBS rocks, the Alto Jauru group that hosts the FBS, nearby intrusive suites, and the Rio Alegre Terrain. The 30 freshest samples were analyzed by X-ray fluorescence for oxides and some trace elements, 20 by ICP-MS for rare earth elements, and 10 for Nd-Sr isotope analyses. The FBS samples were gabbros and gabbro-norites with Nb/Yb-Th/Yb and TiO2/Yb-Nb/Yb ratios indicating varying degrees of crustal interaction. The TiO2/Yb-Nb/Yb data suggested a subduction-related component, and the ɛNd-ɛSr data indicated a juvenile source. Samples from coeval adjacent intermediate magma suites displayed similar characteristics, which suggests derivation from a bimodal source probably related to the subduction of the Rio Alegre Terrain.
We interpret the tectonic setting of the FBS as the result of roll-back of the subducted slab, which caused rejuvenation of the mantle not only under the Jauru Terrain, forming the FBS and nearby suites, but also under the subducting Rio Alegre Terrain, producing the magma that formed the 1412 Ma to 1380 Ma Santa Rita Suite in the Rio Alegre Terrain, which has a juvenile ɛNd(t) signature (+3.6). In summary, our data indicate that the FBS was part of the origin and evolution of the magmatic suites of the Rio Alegre and Jauru Terrains.
What was uniform about the fin-de-siècle sailor suit?
Rose, Clare
2011-01-01
The sailor suits widely worn by children in late-nineteenth-century Britain have been interpreted, both at the time and since, as expressions of an Imperial ethos. Yet a closer examination of the ways these garments were produced by mass manufacturers, mediated by advertisers and fashion advisors, and consumed by families calls this characterization into question. Manufacturers treated sailor suits not as unchanging uniforms but as fashion items responding to seasonal changes. Consumers used them to assert social identities and social distinctions, selecting from the multiple variants available. Cultural commentators described sailor suits as emulating Royal practice—but also as ‘common’ and to be avoided. A close analysis of large samples of images and texts from the period 1870–1900 reveals how these different meanings overlapped, making the fin-de-siècle sailor suit a garment that undermines many of our assumptions.
NASA Technical Reports Server (NTRS)
Elardo, S. M.; Shearer, C. K.; McCubbin, F. M.
2018-01-01
The lunar magnesian-suite, or Mg-suite, is a series of ancient plutonic rocks from the lunar crust with ages and compositions indicating that they represent crust-building magmatism occurring immediately after the end of magma ocean crystallization. Samples of the Mg-suite were found at every Apollo landing site except 11 and ubiquitously have geochemical characteristics indicating the involvement of KREEP in their petrogenesis. This observation has led to the suggestion that the presence of the KREEP reservoir under the lunar nearside was responsible for this episode of crust building. The lack of any readily identifiable Mg-suite rocks in meteoritic regolith breccias sourced from outside the Procellarum KREEP Terrane (PKT) seemingly supports this interpretation.
Özkan Tuncay, Fatma; Mollaoğlu, Mukadder
2017-12-01
To determine the effects of a cooling suit on fatigue and activities of daily living in individuals with multiple sclerosis. Fatigue is one of the most common symptoms in people with multiple sclerosis and adversely affects their activities of daily living. Studies evaluating fatigue associated with multiple sclerosis have reported that most fatigue is related to increases in body temperature and that cooling therapy is effective in coping with fatigue. This study used a two-sample, control-group design. The study sample comprised 75 individuals who met the inclusion criteria. Data were collected with study forms. After the baseline data were collected, the cooling suit treatment was administered to the experimental group. During home visits at the fourth and eighth weeks after the intervention, the aforementioned scales were re-administered to the participants in the experimental and control groups. The analyses demonstrated that the severity of fatigue experienced by the participants in the experimental group wearing the cooling suit decreased. The experimental group also exhibited a significant improvement in the participants' levels of independence in activities of daily living. The cooling suit worn by individuals with multiple sclerosis was determined to significantly improve the participants' levels of fatigue and independence in activities of daily living. Cooling suit therapy was found to be an effective intervention for the debilitating fatigue suffered by many multiple sclerosis patients, significantly improving their level of independence in activities of daily living. © 2017 John Wiley & Sons Ltd.
Li, Xiao-jun; Yi, Eugene C; Kemp, Christopher J; Zhang, Hui; Aebersold, Ruedi
2005-09-01
There is an increasing interest in the quantitative proteomic measurement of the protein contents of substantially similar biological samples, e.g. for the analysis of cellular response to perturbations over time or for the discovery of protein biomarkers from clinical samples. Technical limitations of current proteomic platforms, such as limited reproducibility and low throughput, make this a challenging task. A new LC-MS-based platform is able to generate complex peptide patterns from the analysis of proteolyzed protein samples at high throughput and represents a promising approach for quantitative proteomics. A crucial component of the LC-MS approach is the accurate evaluation of the abundance of detected peptides over many samples and the identification of peptide features that can stratify samples with respect to their genetic, physiological, or environmental origins. We present here a new software suite, SpecArray, that generates a peptide-versus-sample array from a set of LC-MS data. A peptide array stores the relative abundance of thousands of peptide features in many samples and is in a format identical to that of a gene expression microarray. A peptide array can be subjected to an unsupervised clustering analysis to stratify samples or to a discriminant analysis to identify discriminatory peptide features. We applied SpecArray to analyze two sets of LC-MS data: one from four repeat LC-MS analyses of the same glycopeptide sample, and another from LC-MS analysis of serum samples of five male and five female mice. We demonstrate through these two study cases that the SpecArray software suite can serve as an effective software platform in the LC-MS approach for quantitative proteomics.
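The unsupervised stratification step described in this abstract can be illustrated on a toy peptide-versus-sample array. The 2-means routine below is a generic stand-in, not SpecArray's actual algorithm, and the `profiles` matrix is invented for illustration.

```python
def kmeans_2(samples, iters=20):
    """Tiny 2-means clustering (squared Euclidean distance) that
    assigns each sample profile to one of two groups; centroids are
    seeded deterministically with the first and last samples."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))

    centroids = [list(samples[0]), list(samples[-1])]
    labels = [0] * len(samples)
    for _ in range(iters):
        labels = [0 if dist2(s, centroids[0]) <= dist2(s, centroids[1]) else 1
                  for s in samples]
        for k in (0, 1):
            members = [s for s, lab in zip(samples, labels) if lab == k]
            if members:  # column-wise mean of the cluster members
                centroids[k] = [sum(col) / len(members) for col in zip(*members)]
    return labels

# Invented peptide-abundance profiles: rows are samples, columns are
# peptide features, with two clearly separated groups.
profiles = [[1.0, 0.1], [0.9, 0.2], [0.1, 1.0], [0.2, 0.9]]
print(kmeans_2(profiles))   # two groups of two samples
```

In a real analysis the matrix would have thousands of peptide-feature columns, and hierarchical or model-based clustering would typically replace this minimal 2-means sketch.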
Results and Analysis from Space Suit Joint Torque Testing
NASA Technical Reports Server (NTRS)
Matty, Jennifer E.; Aitchison, Lindsay
2009-01-01
A space suit's mobility is critical to an astronaut's ability to perform work efficiently. As mobility increases, the astronaut can perform tasks for longer durations with less fatigue. The term mobility, with respect to space suits, is defined in terms of two key components: joint range of motion and joint torque. Individually, these measures describe the path through which a joint travels and the force required to move it through that path. Previous space suit mobility requirements were defined as the collective result of these two measures and verified by the completion of discrete functional tasks. While a valid way to impose mobility requirements, such a method necessitates a solid understanding of the operational scenarios in which the final suit will be performing. Because the Constellation space suit system requirements are being finalized with a relatively immature concept of operations, the Space Suit Element team elected to define mobility in terms of its constituent parts to increase the likelihood that the future pressure garment will be mobile enough to enable a broad scope of undefined exploration activities. The range of motion requirements were defined by measuring the ranges of motion test subjects achieved while performing a series of joint-maximizing tasks in a variety of flight and prototype space suits. The definition of joint torque requirements has proved more elusive. NASA evaluated several different approaches to the problem before deciding to generate requirements based on unmanned joint torque evaluations of six different space suit configurations articulated through 16 separate joint movements. This paper discusses the experiment design, data analysis and results, and the process used to determine the final values for the Constellation pressure garment joint torque requirements.
NASA Technical Reports Server (NTRS)
Stern, J. C.; Navarro-Gonzales, R.; Freissinet, C.; McKay, C. P.; Archer, P. D., Jr.; Buch, A.; Brunner, A. E.; Coll, P.; Eigenbrode, J. L.; Franz, H. B.;
2014-01-01
The Sample Analysis at Mars (SAM) instrument suite on the Mars Science Laboratory (MSL) Curiosity Rover detected both reduced and oxidized nitrogen-bearing compounds during the pyrolysis of surface materials at Yellowknife Bay in Gale Crater. Preliminary detections of nitrogen species include NO, HCN, ClCN, CH3CN, and TFMA (trifluoro-N-methyl-acetamide). Confirmation of indigenous Martian N-bearing compounds requires quantifying N contribution from the terrestrial derivatization reagents (e.g. N-methyl-N-tertbutyldimethylsilyltrifluoroacetamide, MTBSTFA and dimethylformamide, DMF) carried for SAM's wet chemistry experiment that contribute to the SAM background. Nitrogen species detected in the SAM solid sample analyses can also be produced during laboratory pyrolysis experiments where these reagents are heated in the presence of perchlorate, a compound that has also been identified by SAM in Mars solid samples.
Analysis of glow discharges for understanding the process of film formation
NASA Technical Reports Server (NTRS)
Venugopalan, M.; Avni, R.
1984-01-01
The physical and chemical processes which occur during the formation of different types of films in a variety of glow discharge plasmas are discussed. Emphasis is placed on plasma diagnostic experiments using spectroscopic methods, probe analysis, mass spectrometric sampling and magnetic resonance techniques which are well suited to investigate the neutral and ionized gas phase species as well as some aspects of plasma surface interactions. The results on metallic, semi-conducting and insulating films are reviewed in conjunction with proposed models and the problem encountered under film deposition conditions. It is concluded that the understanding of film deposition process requires additional experimental information on plasma surface interactions of free radicals and the synergetic effects where photon, electron and ion bombardment change the reactivity of the incident radical with the surface.
Checkout and Standard Use Procedures for the Mark III Space Suit Assembly
NASA Technical Reports Server (NTRS)
Valish, Dana J.
2012-01-01
The operational pressure range is the range to which the suit can be nominally operated for manned testing. The top end of the nominal operational pressure range is equivalent to 1/2 the proof pressure. Structural pressure is 1.5 times the specified test pressure for any given test. Proof pressure is the maximum unmanned pressure to which the suit was tested by the vendor prior to delivery. The maximum allowable working pressure (MAWP) is 90% of the proof pressure. The pressure system's relief valves (RVs) are set to keep components below their MAWPs. If the suit is pressurized over its MAWP, the suit will be taken out of service and an in-depth inspection/review of the suit will be performed before the suit is put back in service. The procedures outlined in this document should be followed as written. However, the suit test engineer (STE) may make redline changes in real time, provided those changes are recorded in the anomaly section of the test data sheet. If technicians supporting suit build-up, check-out, and/or test execution believe that a procedure can be improved, they should notify their lead. If procedures are incorrect to the point of potentially causing hardware damage or affecting safety, bring the problem to the technician lead's and/or STE's attention and stop work until a solution (temporary or permanent) is authorized. Certain steps in the procedure are marked with a "DV", for Designated Verifier. The Designated Verifier for this procedure is an Advanced Space Suit Technology Development Laboratory technician, not directly involved in performing the procedural steps, who will verify that the step was performed as stated. The steps to be verified by the DV were selected based on one or more of the following criteria: the step was deemed significant in ensuring the safe performance of the test, the data recorded in the step is of specific interest in monitoring the suit system operation, or the step has a strong influence on the successful completion of test objectives.
Prior to all manned test activities, Advanced Suit Test Data Sheet (TDS) Parts A-E shall be completed to verify that the system and team are ready for test. Advanced Suit TDS Parts F-G shall be completed at the end of the suited activity. Appendix B identifies the appropriate Mark III suit emergency event procedures.
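The pressure definitions above are simple multiples of the vendor proof pressure, and can be sketched as follows (the 16 psig proof pressure is a hypothetical example, not an actual Mark III rating):

```python
def suit_pressure_limits(proof_psig: float) -> dict:
    """Derive suit pressure limits from proof pressure, per the definitions above."""
    return {
        "max_operational": 0.5 * proof_psig,  # top of nominal manned operating range
        "mawp": 0.9 * proof_psig,             # maximum allowable working pressure
    }

# Hypothetical example: a suit proof-tested to 16 psig.
limits = suit_pressure_limits(16.0)
print(limits["max_operational"])  # 8.0
print(limits["mawp"])             # 14.4
```

Structural pressure is defined differently: 1.5 times the specified test pressure for a given test, not a fixed fraction of proof pressure.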
An evaluation of Ada for AI applications
NASA Technical Reports Server (NTRS)
Wallace, David R.
1986-01-01
Expert system technology seems to be the most promising type of Artificial Intelligence (AI) application for Ada. An expert system implemented with an expert system shell provides a highly structured approach that fits well with the structured approach found in Ada systems. The current commercial expert system shells use Lisp. In this highly structured situation, a shell could be built that used Ada just as well. On the other hand, if it is necessary to deal with some AI problems that are not suited to expert systems, the use of Ada becomes more problematic. Ada was not designed as an AI development language and is not well suited to that role. It is possible that an application developed in, say, Common Lisp could be translated to Ada for actual use in a particular application, but this could be difficult. Some standard Ada packages could be developed to make such a translation easier. If the most general AI programs need to be dealt with, a Common Lisp system integrated with the Ada environment is probably necessary. Aside from problems with language features, Ada, by itself, is not well suited to the prototyping and incremental development that is well supported by Lisp.
2013-02-20
This frame from an animation of NASA's Curiosity rover shows the complicated suite of operations involved in conducting the rover's first rock-sample drilling on Mars and transferring the sample to the rover's scoop for inspection.
An Excursion in Applied Mathematics.
ERIC Educational Resources Information Center
von Kaenel, Pierre A.
1981-01-01
An excursion in applied mathematics is detailed in a lesson deemed well-suited for the high school student or undergraduate. The problem focuses on an experimental missile guidance system simulated in the laboratory. (MP)
Manufactured Porous Ambient Surface Simulants
NASA Technical Reports Server (NTRS)
Carey, Elizabeth M.; Peters, Gregory H.; Chu, Lauren; Zhou, Yu Meng; Cohen, Brooklin; Panossian, Lara; Green, Jacklyn R.; Moreland, Scott; Backes, Paul
2016-01-01
The planetary science decadal survey for 2013-2022 (Vision and Voyages, NRC 2011) has promoted mission concepts for sample acquisition from small solar system bodies. Numerous comet-sampling tools are in development to meet this standard. Manufactured Porous Ambient Surface Simulants (MPASS) materials provide an opportunity to simulate variable features at ambient temperatures and pressures to appropriately test potential sample acquisition systems for comets, asteroids, and planetary surfaces. The original "flavor" of MPASS materials is known as Manufactured Porous Ambient Comet Simulants (MPACS), which was developed in parallel with the Biblade Comet Sampling System (Backes et al., in review). The current suite of MPACS materials was developed through research into the physical and mechanical properties of comets, drawing on the results of past comet missions and modeling efforts, coordination with the science community at the Jet Propulsion Laboratory, and testing of a wide range of materials and formulations. These simulants were required to represent the physical and mechanical properties of cometary nuclei, based on the current understanding of the science community. Working with cryogenic simulants can be tedious and costly; thus MPACS is a suite of ambient simulants that yields a brittle failure mode similar to that of cryogenic icy materials. Here we describe our suite of comet simulants known as MPACS that will be used to test and validate the Biblade Comet Sampling System (Backes et al., in review).
SSAGES: Software Suite for Advanced General Ensemble Simulations.
Sidky, Hythem; Colón, Yamil J; Helfferich, Julian; Sikora, Benjamin J; Bezik, Cody; Chu, Weiwei; Giberti, Federico; Guo, Ashley Z; Jiang, Xikai; Lequieu, Joshua; Li, Jiyuan; Moller, Joshua; Quevillon, Michael J; Rahimi, Mohammad; Ramezani-Dakhel, Hadi; Rathee, Vikramjit S; Reid, Daniel R; Sevgen, Emre; Thapar, Vikram; Webb, Michael A; Whitmer, Jonathan K; de Pablo, Juan J
2018-01-28
Molecular simulation has emerged as an essential tool for modern-day research, but obtaining proper results and making reliable conclusions from simulations requires adequate sampling of the system under consideration. To this end, a variety of methods exist in the literature that can enhance sampling considerably, and increasingly sophisticated, effective algorithms continue to be developed at a rapid pace. Implementation of these techniques, however, can be challenging for experts and non-experts alike. There is a clear need for software that provides rapid, reliable, and easy access to a wide range of advanced sampling methods and that facilitates implementation of new techniques as they emerge. Here we present SSAGES, a publicly available Software Suite for Advanced General Ensemble Simulations designed to interface with multiple widely used molecular dynamics simulation packages. SSAGES allows facile application of a variety of enhanced sampling techniques (including adaptive biasing force, string methods, and forward flux sampling) that extract meaningful free energy and transition path data from all-atom and coarse-grained simulations. A noteworthy feature of SSAGES is a user-friendly framework that facilitates further development and implementation of new methods and collective variables. In this work, the use of SSAGES is illustrated in the context of simple representative applications involving distinct methods and different collective variables that are available in the current release of the suite. The code may be found at: https://github.com/MICCoM/SSAGES-public.
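To illustrate the kind of enhanced-sampling logic such a suite wraps (this is a toy sketch, not SSAGES code), the following metadynamics-style example deposits Gaussian bias hills along a one-dimensional collective variable so a Metropolis walker can escape a double-well potential; all parameter values are illustrative:

```python
import math, random

random.seed(1)

def potential(x):
    return (x * x - 1.0) ** 2  # double well: minima at x = +/-1, barrier ~1 at x = 0

hills = []            # centers of deposited Gaussian bias hills
W, SIGMA = 0.1, 0.2   # hill height and width (illustrative values)

def bias(x):
    return sum(W * math.exp(-(x - c) ** 2 / (2 * SIGMA ** 2)) for c in hills)

x, beta, step = -1.0, 5.0, 0.1
visited_right_well = False
for i in range(20000):
    # Metropolis move on the biased energy surface.
    x_new = x + random.uniform(-step, step)
    dE = (potential(x_new) + bias(x_new)) - (potential(x) + bias(x))
    if dE <= 0.0 or random.random() < math.exp(-beta * dE):
        x = x_new
    if i % 100 == 0:
        hills.append(x)   # periodically deposit a hill at the current CV value
    if x > 0.9:
        visited_right_well = True

print(visited_right_well)  # the accumulated bias lets the walker cross the barrier
```

Production codes such as SSAGES apply the same idea to real molecular dynamics forces and multidimensional collective variables, which is exactly the machinery that is hard to reimplement by hand.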
The Impact of Residence Design on Freshman Outcomes: Dormitories versus Suite-Style Residences
ERIC Educational Resources Information Center
Rodger, Susan C.; Johnson, Andrew W.
2005-01-01
This study was designed to measure affective, behavioral, and cognitive variables in a sample of 3159 first-year students, and to compare these variables by the type of residence building in which the student lived. Students living in suite-style buildings reported a greater sense of belonging, and higher activity levels than students living in…
Testing of materials for passive thermal control of space suits
NASA Technical Reports Server (NTRS)
Squire, Bernadette
1988-01-01
An effort is underway to determine the coating material of choice for the AX-5 prototype hard space suit. Samples of 6061 aluminum have been coated with one of 10 selected metal coatings, and subjected to corrosion, abrasion, and thermal testing. Changes in reflectance after exposure are documented. Plated gold exhibited minimal degradation of optical properties. A computer model is used in evaluating coating thermal performance in the EVA environment. The model is verified with an experiment designed to measure the heat transfer characteristics of coated space suit parts in a thermal vacuum chamber. Details of this experiment are presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xia, Yidong; Andrs, David; Martineau, Richard Charles
This document presents the theoretical background for a hybrid finite-element / finite-volume fluid flow solver, namely BIGHORN, based on the Multiphysics Object Oriented Simulation Environment (MOOSE) computational framework developed at the Idaho National Laboratory (INL). An overview of the numerical methods used in BIGHORN is given, followed by a presentation of the formulation details. The document begins with the governing equations for compressible fluid flow, with an outline of the requisite constitutive relations. A second-order finite volume method used for solving compressible fluid flow problems is presented next, as is a Pressure-Corrected Implicit Continuous-fluid Eulerian (PCICE) formulation for time integration. The multi-fluid formulation is still under development, but BIGHORN has been designed to handle multi-fluid problems. Due to the flexibility of the underlying MOOSE framework, BIGHORN is quite extensible and can accommodate both multi-species and multi-phase formulations. This document also presents a suite of verification and validation benchmark test problems for BIGHORN. The intent of this suite of problems is to provide baseline comparison data that demonstrate the performance of the BIGHORN solution methods on problems that vary in complexity from laminar to turbulent flows. Wherever possible, some form of solution verification has been attempted to identify sensitivities in the solution methods and suggest best practices when using BIGHORN.
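The finite-volume method class referenced above can be illustrated with a minimal one-dimensional example (this is a generic first-order upwind scheme for linear advection, not BIGHORN code; grid size and CFL number are arbitrary choices):

```python
import numpy as np

# First-order upwind finite-volume scheme for linear advection u_t + a*u_x = 0
# on a periodic unit domain -- a minimal illustration of the method class.
a, nx, cfl = 1.0, 100, 0.5
dx = 1.0 / nx
dt = cfl * dx / a
x = (np.arange(nx) + 0.5) * dx          # cell-center coordinates
u = np.exp(-100.0 * (x - 0.5) ** 2)     # initial Gaussian pulse
mass0 = u.sum() * dx                    # total "mass" (conserved quantity)

for _ in range(int(round(1.0 / dt))):   # advect for one full period
    # Conservative update: u_i -= (dt/dx) * (F_{i+1/2} - F_{i-1/2}),
    # with upwind face flux F_{i+1/2} = a * u_i for a > 0.
    u = u - cfl * (u - np.roll(u, 1))

print(abs(u.sum() * dx - mass0) < 1e-12)  # conservative up to roundoff
```

Because the update is written in flux-difference form, total mass is preserved exactly; the first-order scheme does, however, diffuse the pulse, which is one motivation for the second-order method the document describes.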
The effects of microgravity on the skeletal system--a review.
Droppert, P M
1990-01-01
Exposure of astronauts to microgravity leads to the loss of calcium from weightbearing bones. Prolonged exposure, e.g., during a journey to Mars, may present problems on return to Earth, with increased risk of fractures and premature osteoporosis in later life. The precise mechanisms of calcium loss have yet to be determined although a key feature is the absence of mechanical loading. Countermeasures aimed at reducing calcium loss to acceptable levels include the use of exercise, drugs, dietary modifications and inertia suits such as the Soviet "Penguin" suit. Missions of a number of years may, however, require the development of artificial gravity on a spacecraft. The country that first solves the physiological problems of man in space and, in particular, skeletal calcium loss, will almost certainly be the first to be able to put a man on Mars.
The Paradox of Public Service Jefferson, Education, and the Problem of Plato's Cave
ERIC Educational Resources Information Center
Holowchak, M. Andrew
2013-01-01
Plato noticed a sizeable problem apropos of establishing his republic--that there was always a ready pool of zealous potential rulers, lying in wait for a suitable opportunity to rule on their own tyrannical terms. He also recognized that those persons best suited to rule, those persons with foursquare and unimpeachable virtue, would be least…
Wilson, Justin; Dai, Manhong; Jakupovic, Elvis; Watson, Stanley; Meng, Fan
2007-01-01
Modern video cards and game consoles typically have much better performance to price ratios than that of general purpose CPUs. The parallel processing capabilities of game hardware are well-suited for high throughput biomedical data analysis. Our initial results suggest that game hardware is a cost-effective platform for some computationally demanding bioinformatics problems.
Multicriteria analysis of ontologically represented information
NASA Astrophysics Data System (ADS)
Wasielewska, K.; Ganzha, M.; Paprzycki, M.; Bǎdicǎ, C.; Ivanovic, M.; Lirkov, I.
2014-11-01
Our current work concerns the development of a decision support system for the software selection problem. The main idea is to utilize expert knowledge to help the user select the best software / method / computational resource to solve a computational problem. Obviously, this involves multicriteria decision making, and the key open question is which method to choose. The context of the work is provided by the Agents in Grid (AiG) project, where the software selection (and thus the multicriteria analysis) is to be realized when all information concerning the problem, the hardware, and the software is ontologically represented. Initially, we considered the Analytic Hierarchy Process (AHP), which is well suited to hierarchical data structures (e.g., those formulated in terms of ontologies). However, due to its well-known shortcomings, we decided to extend our search for the multicriteria analysis method best suited to the problem in question. In this paper we report the results of our search, which involved: (i) TOPSIS (Technique for Order Preference by Similarity to Ideal Solution), (ii) PROMETHEE, and (iii) GRIP (Generalized Regression with Intensities of Preference). We also briefly argue why other methods were not considered valuable candidates.
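Of the methods compared, TOPSIS is compact enough to sketch in full: vector-normalize the decision matrix, weight it, and rank alternatives by relative closeness to the ideal and anti-ideal solutions. The decision matrix, weights, and criteria below are invented for illustration and are not from the AiG project:

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Rank alternatives (rows) over criteria (columns).
    benefit[j] is True if higher is better for criterion j."""
    M = np.asarray(matrix, dtype=float)
    # Vector-normalize each criterion column, then apply the weights.
    V = M / np.linalg.norm(M, axis=0) * np.asarray(weights, dtype=float)
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti  = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_pos = np.linalg.norm(V - ideal, axis=1)  # distance to ideal solution
    d_neg = np.linalg.norm(V - anti, axis=1)   # distance to anti-ideal solution
    return d_neg / (d_pos + d_neg)             # relative closeness: higher is better

# Hypothetical software-selection problem: 3 packages scored on
# speed (benefit), cost (cost), and accuracy (benefit).
scores = topsis(
    matrix=[[80, 120, 0.90],
            [65,  60, 0.85],
            [90, 200, 0.95]],
    weights=[0.4, 0.3, 0.3],
    benefit=[True, False, True],
)
print(scores.argmax())  # index of the best-ranked alternative
```

PROMETHEE and GRIP require pairwise preference functions and preference-intensity information, respectively, so they do not reduce to a comparably short closed-form computation.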
Solving Large-Scale Inverse Magnetostatic Problems using the Adjoint Method
Bruckner, Florian; Abert, Claas; Wautischer, Gregor; Huber, Christian; Vogler, Christoph; Hinze, Michael; Suess, Dieter
2017-01-01
An efficient algorithm for the reconstruction of the magnetization state within magnetic components is presented. The occurring inverse magnetostatic problem is solved by means of an adjoint approach, based on the Fredkin-Koehler method for the solution of the forward problem. Due to the use of hybrid FEM-BEM coupling combined with matrix compression techniques the resulting algorithm is well suited for large-scale problems. Furthermore the reconstruction of the magnetization state within a permanent magnet as well as an optimal design application are demonstrated. PMID:28098851
NASA Astrophysics Data System (ADS)
Zheng, Genrang; Lin, ZhengChun
The problem of winner determination in combinatorial auctions is a hot topic in electronic business and an NP-hard problem. A Hybrid Artificial Fish Swarm Algorithm (HAFSA), combining the First Suite Heuristic Algorithm (FSHA) with the Artificial Fish Swarm Algorithm (AFSA), is proposed to solve the problem, based on the theory of AFSA. Experimental results show that HAFSA is a fast and efficient algorithm for winner determination. Compared with the Ant Colony Optimization algorithm, it exhibits good performance and broad application prospects.
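The winner determination problem itself is easy to state: choose a set of non-overlapping bids that maximizes revenue. A simple greedy heuristic (not the paper's HAFSA; the bids below are invented) shows the structure of the problem:

```python
def greedy_winner_determination(bids):
    """bids: list of (item_set, price) pairs. Select pairwise-disjoint bids
    to maximize revenue. Greedy by price per item -- a heuristic, not exact."""
    allocated, revenue, winners = set(), 0.0, []
    for items, price in sorted(bids, key=lambda b: b[1] / len(b[0]), reverse=True):
        if allocated.isdisjoint(items):   # bid compatible with current allocation
            allocated |= items
            revenue += price
            winners.append((items, price))
    return winners, revenue

bids = [
    (frozenset({"a", "b"}), 10.0),
    (frozenset({"b", "c"}), 12.0),
    (frozenset({"c"}), 7.0),
    (frozenset({"a"}), 4.0),
]
winners, revenue = greedy_winner_determination(bids)
print(revenue)  # 17.0
```

Because the problem is NP-hard, such greedy heuristics can be arbitrarily far from optimal on adversarial inputs, which is why metaheuristics like AFSA-based hybrids are studied.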
Variation of Nb-Ta, Zr-Hf, Th-U and K-Cs in two diabase-granophyre suites
Gottfried, D.; Greenland, L.P.; Campbell, E.Y.
1968-01-01
Concentrations of Nb, Ta, Zr, Hf, Th, U and Cs have been determined in samples of igneous rocks representing the diabase-granophyre suites from Dillsburg, Pennsylvania, and Great Lake, Tasmania. Niobium and tantalum show a three- to fourfold increase with differentiation in each of the suites. The chilled margin of the Great Lake intrusion contains half the niobium and tantalum content (5.3 ppm and 0.4 ppm, respectively) of the chilled basalt from Dillsburg (10 ppm and 0.9 ppm, respectively). The twofold difference between the suites is correlated with differences in their titanium content. The average Nb/Ta ratios for each suite are similar: 13.5 for the Great Lake suite, and 14.4 for the Dillsburg suite. The zirconium content of the two suites is essentially the same and increases from 50-60 ppm in the chilled margins to 240-300 ppm in the granophyres. Hafnium is low in the early formed rocks (0.5-1.5 ppm) and achieves a maximum in the granophyres (5-8 ppm). The Zr/Hf ratio decreases from 68 to 33 with progressive differentiation. In the Dillsburg suite thorium and uranium increase from 2.6 ppm and 0.6 ppm, respectively, in the chilled samples to 11.8 ppm and 3.1 ppm in the granophyres. The chilled margin of the Great Lake suite contains 3.2 ppm thorium and 9.8 ppm uranium; the granophyre contains 11.2 ppm thorium and 2.8 ppm uranium. The average Th/U ratios of the Dillsburg and Great Lake suites are nearly the same: 4.1 and 4.4, respectively. Within each suite the Th/U ratio remains quite constant. Cesium and the K/Cs ratio do not vary systematically in the Dillsburg suite, possibly because of redistribution or loss of cesium by complex geologic processes. Except for the chilled margin of the Great Lake suite, the variation of Cs and the K/Cs ratio is in accord with theoretical considerations. Cesium increases from about 0.6 ppm in the lower zone to 3.5 ppm in the granophyre; the K/Cs ratio varies from 10 × 10³ in the lower zone to 6 × 10³ in the granophyre.
A comparison of the abundance of some of these elements is made with those reported on oceanic tholeiites from the Atlantic and Pacific oceans. Trace elements with large ionic radii (Th, U, Cs) are present in significantly greater concentrations in the two continental tholeiitic series than in the oceanic tholeiites. However, this does not seem to be true for lithophilic elements of smaller ionic radii (Zr and Nb). These trace element distribution patterns, when considered with other minor element and isotopic studies, indicate that (1) crustal contamination does not entirely account for differences between continental and oceanic tholeiites, and (2) the oceanic tholeiites do not necessarily delimit the geochemical characteristics of the mantle. © 1968.
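The interelement ratios quoted above are simple quotients of the measured concentrations; a trivial check using the chilled-margin values (note that the chilled-margin ratios differ somewhat from the quoted suite averages, which span many samples):

```python
# Chilled-margin Nb and Ta concentrations quoted above (ppm).
great_lake = {"Nb": 5.3, "Ta": 0.4}
dillsburg  = {"Nb": 10.0, "Ta": 0.9}

for name, suite in (("Great Lake", great_lake), ("Dillsburg", dillsburg)):
    print(name, suite["Nb"] / suite["Ta"])  # Nb/Ta ratio of the chilled margin
```

The same arithmetic applies to the Zr/Hf, Th/U, and K/Cs ratios discussed in the abstract.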
Solid-solid phase change thermal storage application to space-suit battery pack
NASA Astrophysics Data System (ADS)
Son, Chang H.; Morehouse, Jeffrey H.
1989-01-01
High cell temperatures are seen as the primary safety problem in the Li-BCX space battery. The exothermic heat from the chemical reactions could raise the temperature of the lithium electrode above the melting temperature. Also, high temperature causes the cell efficiency to decrease. Solid-solid phase-change materials were used as a thermal storage medium to lower this battery cell temperature by utilizing their phase-change (latent heat storage) characteristics. Solid-solid phase-change materials focused on in this study are neopentyl glycol and pentaglycerine. Because of their favorable phase-change characteristics, these materials appear appropriate for space-suit battery pack use. The results of testing various materials are reported as thermophysical property values, and the space-suit battery operating temperature is discussed in terms of these property results.
NASA Technical Reports Server (NTRS)
Williamson, Rebecca; Carbo, Jorge; Luna, Bernadette; Webbon, Bruce W.
1998-01-01
Wearing impermeable garments for hazardous materials clean up can often present a health and safety problem for the wearer. Even short duration clean up activities can produce heat stress injuries in hazardous materials workers. It was hypothesized that an internal cooling system might increase worker productivity and decrease likelihood of heat stress injuries in typical HazMat operations. Two HazMat protective ensembles were compared during treadmill exercise. The different ensembles were created using two different suits: a Trelleborg VPS suit representative of current HazMat suits and a prototype suit developed by NASA engineers. The two life support systems used were a current technology Interspiro Spirolite breathing apparatus and a liquid air breathing system that also provided convective cooling. Twelve local members of a HazMat team served as test subjects. They were fully instrumented to allow a complete physiological comparison of their thermal responses to the different ensembles. Results showed that cooling from the liquid air system significantly decreased thermal stress. The results of the subjective evaluations of new design features in the prototype suit were also highly favorable. Incorporation of these new design features could lead to significant operational advantages in the future.
NASA Astrophysics Data System (ADS)
Martin, P.; Ehlmann, B. L.; Blaney, D. L.; Bhartia, R.; Allwood, A.
2015-12-01
Using the recently developed Ultra Compact Imaging Spectrometer (UCIS) (0.4-2.5 μm) to generate outcrop-scale infrared images and compositional maps, a Mars-relevant field site near China Ranch in the Mojave Desert has been surveyed and sampled to analyze the synergies between instruments in the Mars 2020 rover instrument suite. The site is broadly comprised of large lacustrine gypsum beds with fine-grained gypsiferous mudstones and interbedded volcanic ashes deposited in the Pleistocene, with a carbonate unit atop the outcrop. Alteration products such as clays and iron oxides are pervasive throughout the sequence. Mineralogical mapping of the outcrop was performed using UCIS. As the 2020 rover will have an onboard multispectral camera and IR point spectrometer, Mastcam-Z and SuperCam, this process of spectral analysis leading to the selection of sites for more detailed investigation is similar to the process by which samples will be selected for increased scrutiny during the 2020 mission. The infrared image is resampled (spatially and spectrally) to the resolutions of Mastcam-Z and SuperCam to simulate data from the Mars 2020 rover. Hand samples were gathered in the field (guided by the prior infrared compositional mapping), capturing samples of spectral and mineralogical variance in the scene. After collection, a limited number of specimens were chosen for more detailed analysis. The hand samples are currently being analyzed using JPL prototypes of the Mars 2020 arm-mounted contact instruments, specifically PIXL (Planetary Instrument for X-ray Lithochemistry) and SHERLOC (Scanning Habitable Environments with Raman & Luminescence). The geologic story as told by the Mars 2020 instrument data will be analyzed and compared to the full suite of data collected by hyperspectral imaging and terrestrial techniques (e.g. XRD) applied to the collected hand samples. 
This work will shed light on the potential uses and synergies of the Mars 2020 instrument suite, especially with regards to spectral (i.e. remote) recognition of important and interesting samples on which to do contact science.
MARS: bringing the automation of small-molecule bioanalytical sample preparations to a new frontier.
Li, Ming; Chou, Judy; Jing, Jing; Xu, Hui; Costa, Aldo; Caputo, Robin; Mikkilineni, Rajesh; Flannelly-King, Shane; Rohde, Ellen; Gan, Lawrence; Klunk, Lewis; Yang, Liyu
2012-06-01
In recent years, there has been a growing interest in automating small-molecule bioanalytical sample preparations specifically using the Hamilton MicroLab(®) STAR liquid-handling platform. In the most extensive work reported thus far, multiple small-molecule sample preparation assay types (protein precipitation extraction, SPE and liquid-liquid extraction) have been integrated into a suite that is composed of graphical user interfaces and Hamilton scripts. Using that suite, bioanalytical scientists have been able to automate various sample preparation methods to a great extent. However, there are still areas that could benefit from further automation, specifically, the full integration of analytical standard and QC sample preparation with study sample extraction in one continuous run, real-time 2D barcode scanning on the Hamilton deck and direct Laboratory Information Management System database connectivity. We developed a new small-molecule sample-preparation automation system that improves in all of the aforementioned areas. The improved system presented herein further streamlines the bioanalytical workflow, simplifies batch run design, reduces analyst intervention and eliminates sample-handling error.
Complexity of Fit, with Application to Space Suits
NASA Technical Reports Server (NTRS)
Rajulu, Sudhakar; Benson, Elizabeth
2009-01-01
Although fitting a garment is often considered more of an art than a science, experts suggest that a subjectively poor fit is a symptom of inappropriate ease, the space between the wearer and the garment. The condition of poor suit fit is a unique problem for the space program, attributable primarily to: a) NASA's policy of accommodating a wide variety of people (males and females from the 1st to 99th percentile range and with various shapes and sizes) and b) its requirement to deploy a minimum number of suit sizes for logistical reasons. These factors make space suit fit difficult to assess: a wide range of people must be fit by the minimum possible number of suits, and yet fit is crucial for operability and safety. Existing simplistic sizing schemes do not account for wide variations in shape within a diverse population with very limited sizing options. The complex issue of fit has been addressed by a variety of methods, many of which have been developed by the military, which has always had a keen interest in fitting its diverse population, but with a multitude of sizing options. The space program has significantly fewer sizing options, so a combination of these advanced methods should be used to optimize space suit size and assess space suit fit. Multivariate methods can be used to develop sizing schemes that better reflect the wearer population, and integrated sizing systems can form a compromise between fitting men and women. Range of motion and operability testing can be combined with subjective feedback to provide a comprehensive evaluation of fit. The amount of ease can be tailored using these methods to provide enough extra room where it is needed, without compromising mobility and comfort. This paper discusses the problem of fit in one of its most challenging applications: providing a safe and comfortable spacesuit that will protect its wearer from the extreme environment of space.
It will discuss the challenges and necessity of closely fitting its potential wearers, a group of people from a broad spectrum of the population, and will detail some of the methods that can be employed to ensure and validate a good fit.
Quantifying MCMC exploration of phylogenetic tree space.
Whidden, Chris; Matsen, Frederick A
2015-05-01
In order to gain an understanding of the effectiveness of phylogenetic Markov chain Monte Carlo (MCMC), it is important to understand how quickly the empirical distribution of the MCMC converges to the posterior distribution. In this article, we investigate this problem on phylogenetic tree topologies with a metric that is especially well suited to the task: the subtree prune-and-regraft (SPR) metric. This metric directly corresponds to the minimum number of MCMC rearrangements required to move between trees in common phylogenetic MCMC implementations. We develop a novel graph-based approach to analyze tree posteriors and find that the SPR metric is much more informative than simpler metrics that are unrelated to MCMC moves. In doing so, we show conclusively that topological peaks do occur in Bayesian phylogenetic posteriors from real data sets as sampled with standard MCMC approaches, investigate the efficiency of Metropolis-coupled MCMC (MCMCMC) in traversing the valleys between peaks, and show that conditional clade distribution (CCD) can have systematic problems when there are multiple peaks. © The Author(s) 2015. Published by Oxford University Press, on behalf of the Society of Systematic Biologists.
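A practical diagnostic related to the convergence question studied here is to compare split (clade) frequencies between independent MCMC runs; large discrepancies suggest the chains are stuck on different topological peaks. The sketch below uses hypothetical sampled splits, not the paper's SPR-graph method:

```python
from collections import Counter

def split_freqs(samples):
    """samples: list of tree topologies, each encoded as a frozenset of splits."""
    counts = Counter(s for tree in samples for s in tree)
    return {s: c / len(samples) for s, c in counts.items()}

def avg_freq_discrepancy(run1, run2):
    """Average absolute split-frequency difference between two MCMC runs."""
    f1, f2 = split_freqs(run1), split_freqs(run2)
    splits = set(f1) | set(f2)
    return sum(abs(f1.get(s, 0.0) - f2.get(s, 0.0)) for s in splits) / len(splits)

# Hypothetical splits, encoded as frozensets of taxon labels.
ab, cd, ac = frozenset("ab"), frozenset("cd"), frozenset("ac")
run1 = [frozenset({ab, cd})] * 9 + [frozenset({ac})] * 1  # mostly one topology
run2 = [frozenset({ab, cd})] * 5 + [frozenset({ac})] * 5  # split between peaks

print(avg_freq_discrepancy(run1, run2))  # large value flags disagreement
```

The article's point is that such summary diagnostics can miss multimodality that a direct SPR-distance analysis of the sampled topologies reveals.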
NASA Astrophysics Data System (ADS)
Cartwright, Ian; Cendón, Dioni; Currell, Matthew; Meredith, Karina
2017-12-01
Documenting the location and magnitude of groundwater recharge is critical for understanding groundwater flow systems. Radioactive tracers, notably 14C, 3H, 36Cl, and the noble gases, together with other tracers whose concentrations vary over time, such as the chlorofluorocarbons or sulfur hexafluoride, are commonly used to estimate recharge rates. This review discusses some of the advantages and problems of using these tracers to estimate recharge rates. The suite of tracers allows recharge to be estimated over timescales ranging from a few years to several hundred thousand years, which allows both the long-term and modern behaviour of groundwater systems to be documented. All tracers record mean residence times and mean recharge rates rather than a specific age and date of recharge. The timescale over which recharge rates are averaged increases with the mean residence time. This is an advantage in providing representative recharge rates but presents a problem in comparing recharge rates derived from these tracers with those from other techniques, such as water table fluctuations or lysimeters. In addition to issues relating to the sampling and interpretation of specific tracers, macroscopic dispersion and mixing in groundwater flow systems limit how precisely groundwater residence times and recharge rates may be estimated. Additionally, many recharge studies have utilised existing infrastructure that may not be ideal for this purpose (e.g., wells with long screens that sample groundwater several kilometres from the recharge area). Ideal recharge studies would collect sufficient information to optimise the use of specific tracers and minimise the problems of mixing and dispersion.
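As a minimal illustration of how a radioactive tracer yields a mean residence time and a recharge rate, the sketch below applies the standard piston-flow age equation for 14C and a simple depth/age recharge estimate; it ignores the geochemical corrections and dispersion effects the review discusses, and the numbers are purely illustrative:

```python
import math

def piston_flow_age(activity, initial_activity=100.0, half_life=5730.0):
    """Apparent mean residence time (years) from a decaying tracer such as 14C.
    Assumes piston flow and no geochemical dilution of the tracer."""
    decay_const = math.log(2) / half_life
    return -math.log(activity / initial_activity) / decay_const

def recharge_rate(porosity, depth_m, age_years):
    """Vertical recharge rate (m/yr) from the age of groundwater sampled at a
    given depth below the water table, assuming uniform vertical recharge."""
    return porosity * depth_m / age_years

age = piston_flow_age(50.0)           # half the initial activity -> one half-life
print(round(age))                      # 5730 years
print(recharge_rate(0.3, 30.0, age))   # m/yr
```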
A Volunteer Computing Project for Solving Geoacoustic Inversion Problems
NASA Astrophysics Data System (ADS)
Zaikin, Oleg; Petrov, Pavel; Posypkin, Mikhail; Bulavintsev, Vadim; Kurochkin, Ilya
2017-12-01
A volunteer computing project aimed at solving computationally hard inverse problems in underwater acoustics is described. This project was used to study the possibilities of sound speed profile reconstruction in a shallow-water waveguide using a dispersion-based geoacoustic inversion scheme. The computational capabilities provided by the project allowed us to investigate the accuracy of the inversion for different mesh sizes of the sound speed profile discretization grid. This problem is well suited to volunteer computing because it can be easily decomposed into independent, simpler subproblems.
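The decomposability that makes such inversions suitable for volunteer computing can be sketched as splitting an exhaustive parameter search into independent work units, each a forward-model evaluation on a few grid points; the grid values below are hypothetical, not from the project:

```python
from itertools import product

def make_tasks(param_grid, chunk_size):
    """Split an exhaustive parameter search into independent work units --
    the property that lets each volunteer host evaluate its chunk in isolation."""
    names = list(param_grid.keys())
    points = list(product(*param_grid.values()))
    tasks = []
    for i in range(0, len(points), chunk_size):
        chunk = [dict(zip(names, p)) for p in points[i:i + chunk_size]]
        tasks.append(chunk)
    return tasks

# Hypothetical 2-node sound-speed profile grid (values in m/s)
grid = {"c_surface": [1440, 1450, 1460], "c_bottom": [1470, 1480]}
tasks = make_tasks(grid, chunk_size=4)
print(len(tasks), sum(len(t) for t in tasks))  # 2 tasks covering all 6 grid points
```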
Iterative repair for scheduling and rescheduling
NASA Technical Reports Server (NTRS)
Zweben, Monte; Davis, Eugene; Deale, Michael
1991-01-01
An iterative repair search method called constraint-based simulated annealing is described. Simulated annealing is a hill-climbing search technique capable of escaping local minima. The utility of the constraint-based framework is shown by comparing search performance with and without the constraint framework on a suite of randomly generated problems. Results are also shown of applying the technique to the NASA Space Shuttle ground processing problem. These experiments show that the search method scales to complex, real-world problems and exhibits interesting anytime behavior.
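A minimal sketch of iterative repair with simulated annealing on a toy scheduling problem (this is not the paper's constraint framework, and all parameter values are assumptions): start from a conflicting schedule, repeatedly move one task, and accept cost increases with probability exp(-delta/T) so the search can escape local minima.

```python
import math, random

def anneal_repair(n_tasks=20, n_slots=20, steps=5000, t0=2.0, seed=0):
    """Iterative repair by simulated annealing on a toy scheduling problem:
    place tasks into slots so that no two tasks share a slot."""
    rng = random.Random(seed)
    slot = [rng.randrange(n_slots) for _ in range(n_tasks)]

    def conflicts(s):
        # Cost = number of pairs of tasks sharing a slot (a toy constraint)
        counts = {}
        for x in s:
            counts[x] = counts.get(x, 0) + 1
        return sum(c * (c - 1) // 2 for c in counts.values())

    cost = conflicts(slot)
    for i in range(steps):
        temp = t0 * (1 - i / steps) + 1e-9   # linear cooling schedule
        task, new = rng.randrange(n_tasks), rng.randrange(n_slots)
        old = slot[task]
        slot[task] = new
        new_cost = conflicts(slot)
        delta = new_cost - cost
        if delta <= 0 or rng.random() < math.exp(-delta / temp):
            cost = new_cost                   # accept the repair move
        else:
            slot[task] = old                  # reject: undo the move
    return cost

print(anneal_repair())  # remaining conflicts after annealing
```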
Guide star catalogue data retrieval software 2
NASA Technical Reports Server (NTRS)
Smirnov, O. M.; Malkov, O. YU.
1992-01-01
The Guide Star Catalog (GSC), being the largest astronomical catalog to date, is widely used by the astronomical community for all sorts of applications, such as statistical studies of certain sky regions, searches for counterparts to observational phenomena, and generation of finder charts. Its format (2 CD-ROMs) requires minimum hardware and is ideally suited for all sorts of conditions, especially observations. Unfortunately, the actual GSC data is not easily accessible. It takes the form of FITS tables, and the coordinates of the objects are given in one coordinate system (equinox 2000). The included reading software is rudimentary at best. Thus, even generation of a simple finder chart is not a trivial undertaking. To solve this problem, at least for PC users, GUIDARES was created. GUIDARES is a user-friendly program that lets you look directly at the data in the GSC, either as a graphical sky map or as a text table. GUIDARES can read a sampling of GSC data from a given sky region, store this sampling in a text file, and display a graphical map of the sampled region in projected celestial coordinates (perfect for finder charts). GUIDARES supports rectangular and circular regions defined by coordinates in the equatorial, ecliptic (any equinox) or galactic systems.
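Generating a finder chart like GUIDARES does requires projecting catalog coordinates onto a plane. A common choice, sketched here under the assumption of a gnomonic (tangent-plane) projection, is:

```python
import math

def gnomonic(ra_deg, dec_deg, ra0_deg, dec0_deg):
    """Project equatorial coordinates onto a tangent plane centered at
    (ra0, dec0) -- the standard projection for finder charts.
    Returns standard coordinates (xi, eta) in degrees."""
    ra, dec = math.radians(ra_deg), math.radians(dec_deg)
    ra0, dec0 = math.radians(ra0_deg), math.radians(dec0_deg)
    cos_c = (math.sin(dec0) * math.sin(dec)
             + math.cos(dec0) * math.cos(dec) * math.cos(ra - ra0))
    xi = math.cos(dec) * math.sin(ra - ra0) / cos_c
    eta = (math.cos(dec0) * math.sin(dec)
           - math.sin(dec0) * math.cos(dec) * math.cos(ra - ra0)) / cos_c
    return math.degrees(xi), math.degrees(eta)

# A star at the chart center projects to the origin
print(gnomonic(150.0, 20.0, 150.0, 20.0))  # (0.0, 0.0)
```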
Rosenblum, Michael A; Laan, Mark J van der
2009-01-07
The validity of standard confidence intervals constructed in survey sampling is based on the central limit theorem. For small sample sizes, the central limit theorem may give a poor approximation, resulting in confidence intervals that are misleading. We discuss this issue and propose methods for constructing confidence intervals for the population mean tailored to small sample sizes. We present a simple approach for constructing confidence intervals for the population mean based on tail bounds for the sample mean that are correct for all sample sizes. Bernstein's inequality provides one such tail bound. The resulting confidence intervals have guaranteed coverage probability under much weaker assumptions than are required for standard methods. A drawback of this approach, as we show, is that these confidence intervals are often quite wide. In response to this, we present a method for constructing much narrower confidence intervals, which are better suited for practical applications, and that are still more robust than confidence intervals based on standard methods, when dealing with small sample sizes. We show how to extend our approaches to much more general estimation problems than estimating the sample mean. We describe how these methods can be used to obtain more reliable confidence intervals in survey sampling. As a concrete example, we construct confidence intervals using our methods for the number of violent deaths between March 2003 and July 2006 in Iraq, based on data from the study "Mortality after the 2003 invasion of Iraq: A cross sectional cluster sample survey," by Burnham et al. (2006).
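A sketch of the Bernstein-based construction: for i.i.d. variables with |X − μ| ≤ b and variance σ², Bernstein's inequality bounds the tail of the sample mean, and setting the bound equal to α yields a quadratic whose positive root is the interval half-width. The function names and parameter values below are illustrative, not the paper's:

```python
import math

def bernstein_halfwidth(n, variance, bound, alpha=0.05):
    """Half-width t of a two-sided confidence interval for the mean of
    i.i.d. variables with |X - mu| <= bound, from Bernstein's inequality:
        P(|mean - mu| >= t) <= 2 exp(-n t^2 / (2 var + (2/3) bound t)).
    Setting the right-hand side to alpha gives a quadratic in t."""
    L = math.log(2.0 / alpha)
    a = n
    b = -(2.0 / 3.0) * bound * L
    c = -2.0 * variance * L
    return (-b + math.sqrt(b * b - 4 * a * c)) / (2 * a)

def normal_halfwidth(n, variance, z=1.96):
    """Standard CLT-based half-width, for comparison."""
    return z * math.sqrt(variance / n)

# Guaranteed coverage costs width: the Bernstein interval is wider,
# markedly so at small sample sizes.
for n in (10, 100, 10000):
    print(n, bernstein_halfwidth(n, variance=1.0, bound=1.0),
          normal_halfwidth(n, variance=1.0))
```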
Using pseudoalignment and base quality to accurately quantify microbial community composition
Novembre, John
2018-01-01
Pooled DNA from multiple unknown organisms arises in a variety of contexts, for example microbial samples from ecological or human health research. Determining the composition of pooled samples can be difficult, especially at the scale of modern sequencing data and reference databases. Here we propose a novel method for taxonomic profiling in pooled DNA that combines the speed and low-memory requirements of k-mer based pseudoalignment with a likelihood framework that uses base quality information to better resolve multiply mapped reads. We apply the method to the problem of classifying 16S rRNA reads using a reference database of known organisms, a common challenge in microbiome research. Using simulations, we show the method is accurate across a variety of read lengths, with different length reference sequences, at different sample depths, and when samples contain reads originating from organisms absent from the reference. We also assess performance in real 16S data, where we reanalyze previous genetic association data to show our method discovers a larger number of quantitative trait associations than other widely used methods. We implement our method in the software Karp, for k-mer based analysis of read pools, to provide a novel combination of speed and accuracy that is uniquely suited for enhancing discoveries in microbial studies. PMID:29659582
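The role of base quality in resolving multiply mapped reads can be sketched with the standard Phred relation e = 10^(−Q/10): a high-quality mismatch is strong evidence against a candidate reference, while a low-quality one is nearly uninformative. This is an illustrative likelihood model, not necessarily Karp's exact formulation:

```python
import math

def phred_error(q):
    """Probability that a base with Phred quality q was miscalled."""
    return 10.0 ** (-q / 10.0)

def read_log_likelihood(read, quals, reference):
    """Log-likelihood that `read` originated from `reference`, using base
    qualities: a matching base contributes (1 - e), a mismatch e/3
    (assuming miscalls to the three other bases are equally likely)."""
    ll = 0.0
    for base, q, ref_base in zip(read, quals, reference):
        e = phred_error(q)
        ll += math.log(1.0 - e) if base == ref_base else math.log(e / 3.0)
    return ll

# A high-quality mismatch is penalized far more than a low-quality one
hi = read_log_likelihood("ACGT", [30, 30, 30, 30], "ACGA")
lo = read_log_likelihood("ACGT", [30, 30, 30, 2], "ACGA")
print(hi, lo)
```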
Task-driven dictionary learning.
Mairal, Julien; Bach, Francis; Ponce, Jean
2012-04-01
Modeling data with linear combinations of a few elements from a learned dictionary has been the focus of much recent research in machine learning, neuroscience, and signal processing. For signals such as natural images that admit such sparse representations, it is now well established that these models are well suited to restoration tasks. In this context, learning the dictionary amounts to solving a large-scale matrix factorization problem, which can be done efficiently with classical optimization tools. The same approach has also been used for learning features from data for other purposes, e.g., image classification, but tuning the dictionary in a supervised way for these tasks has proven to be more difficult. In this paper, we present a general formulation for supervised dictionary learning adapted to a wide variety of tasks, and present an efficient algorithm for solving the corresponding optimization problem. Experiments on handwritten digit classification, digital art identification, nonlinear inverse image problems, and compressed sensing demonstrate that our approach is effective in large-scale settings, and is well suited to supervised and semi-supervised classification, as well as regression tasks for data that admit sparse representations.
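The inner sparse-coding step of dictionary learning can be sketched with ISTA (iterative soft thresholding); this illustrates only the unsupervised subproblem, not the paper's task-driven formulation, and the dimensions and parameters are arbitrary:

```python
import numpy as np

def ista(D, x, lam=0.1, n_iter=200):
    """Sparse coding by ISTA: minimize 0.5*||x - D a||^2 + lam*||a||_1.
    This is the inner problem of dictionary learning; the outer problem
    updates the dictionary D given the codes."""
    L = np.linalg.norm(D, 2) ** 2            # Lipschitz constant of the gradient
    a = np.zeros(D.shape[1])
    for _ in range(n_iter):
        grad = D.T @ (D @ a - x)
        z = a - grad / L
        a = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold
    return a

rng = np.random.default_rng(0)
D = rng.standard_normal((20, 50))
D /= np.linalg.norm(D, axis=0)               # unit-norm atoms, as is conventional
true = np.zeros(50)
true[[3, 17]] = [1.0, -2.0]                  # a 2-sparse ground-truth code
x = D @ true
a = ista(D, x, lam=0.05)
print(np.count_nonzero(np.abs(a) > 1e-3))    # the recovered code is sparse
```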
Xia, Yidong; Podgorney, Robert; Huang, Hai
2016-03-17
FALCON (“Fracturing And Liquid CONvection”) is a hybrid continuous/discontinuous Galerkin finite element geothermal reservoir simulation code based on the MOOSE (“Multiphysics Object-Oriented Simulation Environment”) framework being developed and used for multiphysics applications. In the present work, a suite of verification and validation (“V&V”) test problems for FALCON was defined to meet the design requirements, and solved in the interests of enhanced geothermal system (“EGS”) design. Furthermore, the intent of this test problem suite is to provide baseline comparison data that demonstrates the performance of the FALCON solution methods. The simulation problems vary in complexity from single mechanical or thermal processes to coupled thermo-hydro-mechanical processes in geological porous media. Numerical results obtained by FALCON agreed well with either the available analytical solution or experimental data, indicating the verified and validated implementation of these capabilities in FALCON. Some form of solution verification has been attempted to identify sensitivities in the solution methods, where possible, and suggest best practices when using the FALCON code.
Progressive Stochastic Reconstruction Technique (PSRT) for cryo electron tomography.
Turoňová, Beata; Marsalek, Lukas; Davidovič, Tomáš; Slusallek, Philipp
2015-03-01
Cryo Electron Tomography (cryoET) plays an essential role in Structural Biology, as it is the only technique that allows the structure of large macromolecular complexes to be studied in their close-to-native environment in situ. The reconstruction methods currently in use, such as Weighted Back Projection (WBP) or the Simultaneous Iterative Reconstruction Technique (SIRT), deliver noisy and low-contrast reconstructions, which complicates the application of high-resolution protocols, such as Subtomogram Averaging (SA). We propose a Progressive Stochastic Reconstruction Technique (PSRT) - a novel iterative approach to tomographic reconstruction in cryoET based on Monte Carlo random walks guided by a Metropolis-Hastings sampling strategy. We design a progressive reconstruction scheme to suit the conditions present in cryoET and apply it successfully to reconstructions of macromolecular complexes from both synthetic and experimental datasets. We show how to integrate PSRT into SA, where it provides an elegant solution to the region-of-interest problem and delivers high-contrast reconstructions that significantly improve template-based localization without any loss of high-resolution structural information. Furthermore, the locality of SA is exploited to design an importance sampling scheme which significantly speeds up the otherwise slow Monte Carlo approach. Finally, we design a new memory-efficient solution for the specimen-level interior problem of cryoET, removing all associated artifacts. Copyright © 2015 Elsevier Inc. All rights reserved.
EV space suit gloves (passive)
NASA Technical Reports Server (NTRS)
Fletcher, E. G.; Dodson, J. D.; Elkins, W.; Tickner, E. G.
1975-01-01
A pair of pressure and thermal insulating overgloves to be used with an Extravehicular (EV) suit assembly was designed, developed, fabricated, and tested. The design features extensive use of Nomex felt materials in lieu of the multiple layer insulation formerly used with the Apollo thermal glove. The glove theoretically satisfies all of the thermal requirements. The presence of the thermal glove does not degrade pressure glove tactility by more than the acceptable 10% value. On the other hand, the thermal glove generally degrades pressure glove mobility by more than the acceptable 10% value, primarily in the area of the fingers. Life cycling tests were completed with minimal problems. The thermal glove/pressure glove ensemble was also tested for comfort; the test subjects found no problems with the thermal glove although they did report difficulties with pressure points on the pressure glove which were independent of the thermal glove.
NASA Astrophysics Data System (ADS)
Kovalevsky, Louis; Langley, Robin S.; Caro, Stephane
2016-05-01
Due to the high cost of experimental EMI measurements, significant attention has been focused on numerical simulation. Classical methods such as the Method of Moments or Finite Difference Time Domain are not well suited to this type of problem, as they require a fine discretisation of space and fail to take uncertainties into account. In this paper, the authors show that Statistical Energy Analysis (SEA) is well suited to this type of application. SEA is a statistical approach employed to solve high-frequency problems of electromagnetically reverberant cavities at a reduced computational cost. The key aspects of this approach are (i) to consider an ensemble of systems that share the same gross parameters, and (ii) to avoid solving Maxwell's equations inside the cavity by using the power balance principle. The output is an estimate of the field magnitude distribution in each cavity. The method is applied to a typical aircraft structure.
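The power-balance principle behind SEA can be sketched as a small linear system for the mean energy of each cavity, in place of a full field solution of Maxwell's equations; the loss and coupling factors below are hypothetical placeholders, not values from the paper:

```python
import numpy as np

def sea_energies(omega, loss, coupling, power_in):
    """Statistical Energy Analysis power balance for coupled cavities:
        P_i = omega * (eta_i * E_i + sum_j (eta_ij * E_i - eta_ji * E_j))
    where eta_i are loss factors and eta_ij coupling loss factors.
    Solving this linear system gives the mean energy E_i of each cavity."""
    n = len(loss)
    A = np.zeros((n, n))
    for i in range(n):
        A[i, i] = loss[i] + sum(coupling[i][j] for j in range(n) if j != i)
        for j in range(n):
            if j != i:
                A[i, j] = -coupling[j][i]
    return np.linalg.solve(omega * A, np.asarray(power_in, float))

# Two hypothetical cavities; power is injected only into the first
E = sea_energies(omega=2 * np.pi * 1e9,
                 loss=[1e-3, 2e-3],
                 coupling=[[0.0, 5e-4], [2e-4, 0.0]],
                 power_in=[1.0, 0.0])
print(E)  # the driven cavity holds more energy than the coupled one
```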
Hwang, Sang Mee; Lee, Ki Chan; Lee, Min Seob; Park, Kyoung Un
2018-01-01
Transition to next generation sequencing (NGS) for BRCA1/BRCA2 analysis in clinical laboratories is ongoing, but different platforms and/or data analysis pipelines give different results, causing difficulties in implementation. We have evaluated the Ion Personal Genome Machine (PGM) platforms (Ion PGM, Ion PGM Dx, Thermo Fisher Scientific) for the analysis of BRCA1/2. The results of Ion PGM with OTG-snpcaller, a pipeline based on Torrent mapping alignment program and Genome Analysis Toolkit, from 75 clinical samples and 14 reference DNA samples were compared with Sanger sequencing for BRCA1/BRCA2. Ten clinical samples and 14 reference DNA samples were additionally sequenced by Ion PGM Dx with Torrent Suite. Fifty types of variants, including 18 pathogenic or variants of unknown significance, were identified from 75 clinical samples, and known variants of the reference samples were confirmed by Sanger sequencing and/or NGS. One false-negative result was present for Ion PGM/OTG-snpcaller: an indel variant misidentified as a single nucleotide variant. However, eight discordant results were present for Ion PGM Dx/Torrent Suite, with both false-positive and -negative results. A 40-bp deletion, a 4-bp deletion, and a 1-bp deletion variant were not called, and a false-positive deletion was identified. Four other variants were misidentified as another variant. Ion PGM/OTG-snpcaller showed acceptable performance with good concordance with Sanger sequencing. However, Ion PGM Dx/Torrent Suite showed many discrepant results not suitable for use in a clinical laboratory, requiring further optimization of the data analysis for calling variants.
TOUGH Simulations of the Updegraff's Set of Fluid and Heat Flow Problems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moridis, G.J.; Pruess
1992-11-01
The TOUGH code [Pruess, 1987] for two-phase flow of water, air, and heat in permeable media has been exercised on a suite of test problems originally selected and simulated by C. D. Updegraff [1989]. These include five 'verification' problems for which analytical or numerical solutions are available, and three 'validation' problems that model laboratory fluid and heat flow experiments. All problems could be run without any code modifications. Good and efficient numerical performance, as well as accurate results, were obtained throughout. Additional code verification and validation problems from the literature are briefly summarized, and suggestions are given for proper applications of TOUGH and related codes.
Verification of MCNP6.2 for Nuclear Criticality Safety Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Forrest B.; Rising, Michael Evan; Alwin, Jennifer Louise
2017-05-10
Several suites of verification/validation benchmark problems were run in early 2017 to verify that the new production release of MCNP6.2 performs correctly for nuclear criticality safety (NCS) applications. MCNP6.2 results for several NCS validation suites were compared to the results from MCNP6.1 [1] and MCNP6.1.1 [2]. MCNP6.1 is the production version of MCNP® released in 2013, and MCNP6.1.1 is the update released in 2014. MCNP6.2 includes all of the standard features for NCS calculations that have been available for the past 15 years, along with new features for sensitivity-uncertainty based methods for NCS validation [3]. Results from the benchmark suites were compared with results from previous verification testing [4-8]. Criticality safety analysts should consider testing MCNP6.2 on their particular problems and validation suites. No further development of MCNP5 is planned. MCNP6.1 is now 4 years old, and MCNP6.1.1 is now 3 years old. In general, released versions of MCNP are supported only for about 5 years, due to resource limitations. All future MCNP improvements, bug fixes, user support, and new capabilities are targeted only to MCNP6.2 and beyond.
Thermal comparison of aircrew clothing aboard OV-10 aircraft.
Constable, R; Webster, R L; Nunneley, S A
1988-12-01
Thermal evaluation of aircrew clothing in the field usually involves conditions which make it difficult to distinguish clothing effects from other, confounding factors. This paper reports a field study designed to solve this problem. The question concerned possible heat stress effects caused by adding an oxygen mask, an anti-g suit, or both to the usual clothing worn by pilots of OV-10 (twin turboprop) aircraft during 20 hot-weather flights (actual experimental temperatures of Tdb = 28-38 degrees C, rh = 18-20%). The four test ensembles were flown simultaneously, allowing side-by-side comparison to compensate for variations in flight profile and weather. Subjects (n = 10) encountered noticeable heat stress in flight, with rectal temperature = 37.4-37.6 degrees C, skin temperature = 35.0-35.5 degrees C, and weight losses = 2.1-2.4 kg. There were no measurable differences among the four clothing outfits, indicating that the anti-g suit does not present a heat stress problem. The mask and anti-g suit did contribute to aircrew discomfort as their impermeable materials prevented evaporation of sweat and caused 100% skin wetting in covered areas. Under these dry, desert conditions, the body was apparently able to compensate for the loss of evaporative surface area.
PMLB: a large benchmark suite for machine learning evaluation and comparison.
Olson, Randal S; La Cava, William; Orzechowski, Patryk; Urbanowicz, Ryan J; Moore, Jason H
2017-01-01
The selection, development, or comparison of machine learning methods in data mining can be a difficult task based on the target problem and goals of a particular study. Numerous publicly available real-world and simulated benchmark datasets have emerged from different sources, but their organization and adoption as standards have been inconsistent. As such, selecting and curating specific benchmarks remains an unnecessary burden on machine learning practitioners and data scientists. The present study introduces an accessible, curated, and developing public benchmark resource to facilitate identification of the strengths and weaknesses of different machine learning methodologies. We compare meta-features among the current set of benchmark datasets in this resource to characterize the diversity of available data. Finally, we apply a number of established machine learning methods to the entire benchmark suite and analyze how datasets and algorithms cluster in terms of performance. From this study, we find that existing benchmarks lack the diversity to properly benchmark machine learning algorithms, and there are several gaps in benchmarking problems that still need to be considered. This work represents another important step towards understanding the limitations of popular benchmarking suites and developing a resource that connects existing benchmarking standards to more diverse and efficient standards in the future.
Akner, Gunnar; Järhult, Bengt
2016-05-17
The international trend »value-based care« (VbC) started with a book by Porter and Olmsted Teisberg in 2006 followed by Porter's 7-point proposal for value-based reform of health care in 2009. VbC may have relevance for delimited, procedure-related health problems with a foreseeable course of development. Most health problems in health care, however, do not involve such delimited problems. VbC is probably not suited as a steering model for chronic health conditions or for multiple health problems. VbC is being rapidly introduced to steer health care without scientific evidence.
Sm-Nd systematics of lunar ferroan anorthositic suite rocks: Constraints on lunar crust formation
NASA Astrophysics Data System (ADS)
Boyet, Maud; Carlson, Richard W.; Borg, Lars E.; Horan, Mary
2015-01-01
We have measured Sm-Nd systematics, including the short-lived 146Sm-142Nd chronometer, in lunar ferroan anorthositic suite (FAS) whole rocks (15415, 62236, 62255, 65315, 60025). At least some members of the suite are thought to be primary crystallization products formed by plagioclase flotation during crystallization of the lunar magma ocean (LMO). Most of these samples, except 62236, have not been exposed to galactic cosmic rays for a long period and thus require minimal correction to their 142Nd isotope composition. These samples all have measured deficits in 142Nd relative to the JNdi-1 terrestrial standard in the range -45 to -21 ppm. The range is -45 to -15 ppm once the 62236 142Nd/144Nd ratio is corrected from neutron-capture effects. Analyzed FAS samples do not define a single isochron in either 146Sm-142Nd or 147Sm-143Nd systematics, suggesting that they either do not have the same crystallization age, come from different sources, or have suffered isotopic disturbance. Because the age is not known for some samples, we explore the implications of their initial isotopic compositions for crystallization ages in the first 400 Ma of solar system history, a timing interval that covers all the ages determined for the ferroan anorthositic suite whole rocks as well as different estimates for the crystallization of the LMO. 62255 has the largest deficit in initial 142Nd and does not appear to have followed the same differentiation path as the other FAS samples. The large deficit in 142Nd of FAN 62255 may suggest a crystallization age around 60-125 Ma after the beginning of solar system accretion. This result provides essential information about the age of the giant impact forming the Moon. The initial Nd isotopic compositions of FAS samples can be matched either with a bulk-Moon with chondritic Sm/Nd ratio but enstatite-chondrite-like initial 142Nd/144Nd (e.g. 10 ppm below modern terrestrial), or a bulk-Moon with superchondritic Sm/Nd ratio and initial 142Nd/144Nd similar to ordinary chondrites.
Sm-Nd systematics of lunar ferroan anorthositic suite rocks: Constraints on lunar crust formation
Boyet, Maud; Carlson, Richard W.; Borg, Lars E.; ...
2014-09-28
Here, we have measured Sm–Nd systematics, including the short-lived 146Sm–142Nd chronometer, in lunar ferroan anorthositic suite (FAS) whole rocks (15415, 62236, 62255, 65315, 60025). At least some members of the suite are thought to be primary crystallization products formed by plagioclase flotation during crystallization of the lunar magma ocean (LMO). Most of these samples, except 62236, have not been exposed to galactic cosmic rays for a long period and thus require minimal correction to their 142Nd isotope composition. These samples all have measured deficits in 142Nd relative to the JNdi-1 terrestrial standard in the range –45 to –21 ppm. The range is –45 to –15 ppm once the 62236 142Nd/144Nd ratio is corrected from neutron-capture effects. Analyzed FAS samples do not define a single isochron in either 146Sm–142Nd or 147Sm–143Nd systematics, suggesting that they either do not have the same crystallization age, come from different sources, or have suffered isotopic disturbance. Because the age is not known for some samples, we explore the implications of their initial isotopic compositions for crystallization ages in the first 400 Ma of solar system history, a timing interval that covers all the ages determined for the ferroan anorthositic suite whole rocks as well as different estimates for the crystallization of the LMO. 62255 has the largest deficit in initial 142Nd and does not appear to have followed the same differentiation path as the other FAS samples. The large deficit in 142Nd of FAN 62255 may suggest a crystallization age around 60–125 Ma after the beginning of solar system accretion. This result provides essential information about the age of the giant impact forming the Moon. The initial Nd isotopic compositions of FAS samples can be matched either with a bulk-Moon with chondritic Sm/Nd ratio but enstatite-chondrite-like initial 142Nd/144Nd (e.g. 10 ppm below modern terrestrial), or a bulk-Moon with superchondritic Sm/Nd ratio and initial 142Nd/144Nd similar to ordinary chondrites.
Irena : tool suite for modeling and analysis of small-angle scattering.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ilavsky, J.; Jemian, P.
2009-04-01
Irena, a tool suite for analysis of both X-ray and neutron small-angle scattering (SAS) data within the commercial Igor Pro application, brings together a comprehensive suite of tools useful for investigations in materials science, physics, chemistry, polymer science and other fields. In addition to Guinier and Porod fits, the suite combines a variety of advanced SAS data evaluation tools for the modeling of size distribution in the dilute limit using maximum entropy and other methods, dilute limit small-angle scattering from multiple non-interacting populations of scatterers, the pair-distance distribution function, a unified fit, the Debye-Bueche model, the reflectivity (X-ray and neutron) using Parratt's formalism, and small-angle diffraction. There are also a number of support tools, such as a data import/export tool supporting a broad sampling of common data formats, a data modification tool, a presentation-quality graphics tool optimized for small-angle scattering data, and a neutron and X-ray scattering contrast calculator. These tools are brought together into one suite with consistent interfaces and functionality. The suite allows robust automated note recording and saving of parameters during export.
CO2 Washout Testing of the REI and EM-ACES Space Suits
NASA Technical Reports Server (NTRS)
Mitchell, Kathryn C.; Norcross, Jason
2012-01-01
When a space suit is used during ground testing, adequate carbon dioxide (CO2) washout must be provided for the suited subject. Symptoms of acute CO2 exposure depend on partial pressure of CO2 (ppCO2), metabolic rate of the subject, and other factors. This test was done to characterize inspired oronasal ppCO2 in the Rear Entry I-Suit (REI) and the Enhanced Mobility Advanced Crew Escape Suit (EM-ACES) for a range of workloads and flow rates for which ground testing is nominally performed. Three subjects were tested in each suit. In all but one case, each subject performed the test twice. Suit pressure was maintained at 4.3 psid. Subjects wore the suit while resting, performing arm ergometry, and walking on a treadmill to generate metabolic workloads of about 500 to 3000 BTU/hr. Supply airflow was varied between 6, 5, and 4 actual cubic feet per minute (ACFM) at each workload. Subjects wore an oronasal mask with an open port in front of the mouth and were allowed to breathe freely. Oronasal ppCO2 was monitored in real time by gas analyzers with sampling tubes connected to the mask. Metabolic rate was calculated from the total CO2 production measured by an additional gas analyzer at the suit air outlet. Real-time metabolic rate was used to adjust the arm ergometer or treadmill workload to meet target metabolic rates. In both suits, inspired CO2 was affected mainly by the metabolic rate of the subject: increased metabolic rate significantly (P < 0.05) increased inspired ppCO2. Decreased air flow caused small increases in inspired ppCO2. The effect of flow was more evident at metabolic rates ≥ 2000 BTU/hr. CO2 washout values of the EM-ACES were slightly but not significantly better than those of the REI suit. Regression equations were developed for each suit to predict the mean inspired ppCO2 as a function of metabolic rate and suit flow rate. 
This paper provides detailed descriptions of the test hardware, methodology, and results as well as implications for future ground testing in the REI-suit and EM-ACES.
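The kind of regression model described (mean inspired ppCO2 as a function of metabolic rate and suit flow rate) can be fit by ordinary least squares. The data and resulting coefficients below are fabricated purely for illustration and are not the paper's equations:

```python
import numpy as np

# Hypothetical observations: (metabolic rate in BTU/hr, flow in ACFM, ppCO2 in mmHg).
# These values are illustrative only -- the paper's actual data and coefficients differ.
data = np.array([
    [500, 6, 2.1], [500, 4, 2.6], [1000, 6, 3.4], [1000, 4, 4.1],
    [2000, 6, 5.9], [2000, 4, 7.0], [3000, 6, 8.8], [3000, 4, 10.2],
])
X = np.column_stack([np.ones(len(data)), data[:, 0], data[:, 1]])
y = data[:, 2]
coef, *_ = np.linalg.lstsq(X, y, rcond=None)    # ppCO2 ~ b0 + b1*rate + b2*flow
b0, b1, b2 = coef
print(f"ppCO2 = {b0:.2f} + {b1:.4f}*rate + {b2:.2f}*flow")
# b1 > 0 (inspired ppCO2 rises with workload); b2 < 0 (more flow improves washout)
```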
SSAGES: Software Suite for Advanced General Ensemble Simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sidky, Hythem; Colón, Yamil J.; Helfferich, Julian
Molecular simulation has emerged as an essential tool for modern-day research, but obtaining proper results and making reliable conclusions from simulations requires adequate sampling of the system under consideration. To this end, a variety of methods exist in the literature that can enhance sampling considerably, and increasingly sophisticated, effective algorithms continue to be developed at a rapid pace. Implementation of these techniques, however, can be challenging for experts and non-experts alike. There is a clear need for software that provides rapid, reliable, and easy access to a wide range of advanced sampling methods, and that facilitates implementation of new techniquesmore » as they emerge. Here we present SSAGES, a publicly available Software Suite for Advanced General Ensemble Simulations designed to interface with multiple widely used molecular dynamics simulations packages. SSAGES allows facile application of a variety of enhanced sampling techniques—including adaptive biasing force, string methods, and forward flux sampling—that extract meaningful free energy and transition path data from all-atom and coarse grained simulations. A noteworthy feature of SSAGES is a user-friendly framework that facilitates further development and implementation of new methods and collective variables. In this work, the use of SSAGES is illustrated in the context of simple representative applications involving distinct methods and different collective variables that are available in the current release of the suite.« less
SSAGES: Software Suite for Advanced General Ensemble Simulations
NASA Astrophysics Data System (ADS)
Sidky, Hythem; Colón, Yamil J.; Helfferich, Julian; Sikora, Benjamin J.; Bezik, Cody; Chu, Weiwei; Giberti, Federico; Guo, Ashley Z.; Jiang, Xikai; Lequieu, Joshua; Li, Jiyuan; Moller, Joshua; Quevillon, Michael J.; Rahimi, Mohammad; Ramezani-Dakhel, Hadi; Rathee, Vikramjit S.; Reid, Daniel R.; Sevgen, Emre; Thapar, Vikram; Webb, Michael A.; Whitmer, Jonathan K.; de Pablo, Juan J.
2018-01-01
Molecular simulation has emerged as an essential tool for modern-day research, but obtaining proper results and making reliable conclusions from simulations requires adequate sampling of the system under consideration. To this end, a variety of methods exist in the literature that can enhance sampling considerably, and increasingly sophisticated, effective algorithms continue to be developed at a rapid pace. Implementation of these techniques, however, can be challenging for experts and non-experts alike. There is a clear need for software that provides rapid, reliable, and easy access to a wide range of advanced sampling methods and that facilitates implementation of new techniques as they emerge. Here we present SSAGES, a publicly available Software Suite for Advanced General Ensemble Simulations designed to interface with multiple widely used molecular dynamics simulations packages. SSAGES allows facile application of a variety of enhanced sampling techniques—including adaptive biasing force, string methods, and forward flux sampling—that extract meaningful free energy and transition path data from all-atom and coarse-grained simulations. A noteworthy feature of SSAGES is a user-friendly framework that facilitates further development and implementation of new methods and collective variables. In this work, the use of SSAGES is illustrated in the context of simple representative applications involving distinct methods and different collective variables that are available in the current release of the suite. The code may be found at: https://github.com/MICCoM/SSAGES-public.
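Enhanced sampling schemes of the kind SSAGES implements typically work by adding a bias along a chosen collective variable (CV). As a minimal illustration of the idea only (this is not the SSAGES API; the function name and constants are hypothetical), a harmonic umbrella bias on a scalar CV looks like:

```python
def umbrella_bias(xi, xi0, k=50.0):
    """Harmonic umbrella bias on a collective variable xi (illustrative
    sketch, not the SSAGES API). Returns the bias energy added to the
    Hamiltonian and the corresponding force along xi."""
    energy = 0.5 * k * (xi - xi0) ** 2
    force = -k * (xi - xi0)
    return energy, force

# Example: restrain a distance-like CV near a target value of 1.2
e, f = umbrella_bias(1.5, 1.2)
```

Running many such restrained windows at different targets, then reweighting (e.g. with WHAM), recovers the free energy profile along the CV.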
Idempotent Methods for Continuous Time Nonlinear Stochastic Control
2012-09-13
Ben G. Fitzpatrick, Tempest Technologies (Stochastech Corporation dba Tempest Technologies), 8939 S. Sepulveda Boulevard, Suite 506, Los Angeles, CA 90045.
Petrology of the Western Highland Province: Ancient crust formation at the Apollo 14 site
NASA Astrophysics Data System (ADS)
Shervais, John W.; McGee, James J.
1999-03-01
Plutonic rocks found at the Apollo 14 site comprise four lithologic suites: the magnesian suite, the alkali suite, evolved lithologies, and the ferroan anorthosite suite (FAN). Rocks of the magnesian suite include troctolite, anorthosite, norite, dunite, and harzburgite; they are characterized by plagioclase ~An95 and mafic minerals with mg#s 82-92. Alkali suite rocks and evolved rocks generally have plagioclase ~An90 to ~An40, and mafic minerals with mg#s 82-40. Lithologies include anorthosite, norite, quartz monzodiorite, granite, and felsite. Ferroan anorthosites have plagioclase ~An96 and mafic minerals with mg#s 45-70. Whole rock geochemical data show that most magnesian suite samples and all alkali anorthosites are cumulates with little or no trapped liquid component. Norites may contain significant trapped liquid component, and some alkali norites may represent cumulate-enriched, near-liquid compositions, similar to KREEP basalt 15386. Evolved lithologies include evolved partial cumulates related to alkali suite fractionation (quartz monzodiorite), immiscible melts derived from these evolved magmas (granites), and impact melts of preexisting granite (felsite). Plots of whole rock mg# versus whole rock Ca/(Ca+Na+K) show a distinct gap between rocks of the magnesian suite and rocks of the alkali suite, suggesting either distinct parent magmas or distinct physical processes of formation. Chondrite-normalized rare earth element (REE) patterns show that rocks of both the magnesian suite and alkali suite have similar ranges, despite the large difference in major element chemistry. Current models for the origin of the magnesian suite call for a komatiitic parent magma derived from early magma ocean cumulates; these melts must assimilate plagiophile elements to form troctolites at low pressures and must assimilate a highly enriched KREEP component so that the resulting mixture has REE concentrations similar to high-K KREEP. 
There are as yet no plausible scenarios that can explain these unusual requirements. We propose that partial melting of a primitive lunar interior and buffering of these melts by ultramagnesian early magma ocean cumulates provides a more reasonable pathway to form magnesian troctolites. Alkali anorthosites and norites formed by crystallization of a parent magma with major element compositions similar to KREEP basalt 15386. If the parent magma of the alkali suite and evolved rocks is related to the magnesian suite, then that magma must have evolved through combined assimilation-fractional crystallization processes to form the alkali suite cumulates.
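The mg# and Ca/(Ca+Na+K) values that separate these suites follow from standard molar conversions of oxide weight percents. A small sketch of the arithmetic (molar masses rounded; input values illustrative):

```python
def mg_number(MgO_wt, FeO_wt):
    """Molar mg# = 100*Mg/(Mg+Fe) from oxide weight percents.
    Molar masses: MgO 40.30, FeO 71.84 g/mol."""
    mg = MgO_wt / 40.30
    fe = FeO_wt / 71.84
    return 100.0 * mg / (mg + fe)

def anorthite_content(CaO_wt, Na2O_wt, K2O_wt):
    """An = 100*Ca/(Ca+Na+K); Na2O and K2O carry two cations per
    formula unit. Molar masses: CaO 56.08, Na2O 61.98, K2O 94.20 g/mol."""
    ca = CaO_wt / 56.08
    na = 2.0 * Na2O_wt / 61.98
    k = 2.0 * K2O_wt / 94.20
    return 100.0 * ca / (ca + na + k)
```

For example, a troctolitic composition with 20 wt% MgO and 10 wt% FeO has mg# near 78, and a calcic plagioclase with 19 wt% CaO and little alkali gives An above 90, in the range quoted for the magnesian suite.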
Sierra/Aria 4.48 Verification Manual.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sierra Thermal Fluid Development Team
Presented in this document is a portion of the tests that exist in the Sierra Thermal/Fluids verification test suite. Each of these tests is run nightly with the Sierra/TF code suite, and the results are checked under mesh refinement against the correct analytic result. For each of the tests presented in this document, the test setup, the derivation of the analytic solution, and a comparison of the code results to the analytic solution are provided. This document can be used to confirm that a given code capability is verified or referenced as a compilation of example problems.
Linguistic Preprocessing and Tagging for Problem Report Trend Analysis
NASA Technical Reports Server (NTRS)
Beil, Robert J.; Malin, Jane T.
2012-01-01
Mr. Robert Beil, Systems Engineer at Kennedy Space Center (KSC), requested the NASA Engineering and Safety Center (NESC) develop a prototype tool suite that combines complementary software technology used at Johnson Space Center (JSC) and KSC for problem report preprocessing and semantic tag extraction, to improve input to data mining and trend analysis. This document contains the outcome of the assessment and the Findings, Observations and NESC Recommendations.
USGS invasive species solutions
Simpson, Annie
2011-01-01
Land managers must meet the invasive species challenge every day, starting with identification of problem species, then the collection of best practices for their control, and finally the implementation of a plan to remove the problem. At each step of the process, the availability of reliable information is essential to success. The U.S. Geological Survey (USGS) has developed a suite of resources for early detection and rapid response, along with data management and sharing.
Influx: A Tool and Framework for Reasoning under Uncertainty
2015-09-01
Interfaces to external programs: not all types of problems are naturally suited to being entirely modelled and implemented within Influx1. In general... development pertaining to the implementation of the reasoning tool and specific applications is not included in this document. RELEASE LIMITATION...in which case a probability is supposed to reflect the subjective belief of an agent for the problem at hand (based on its experience and/or current state
A Stabilized Sparse-Matrix U-D Square-Root Implementation of a Large-State Extended Kalman Filter
NASA Technical Reports Server (NTRS)
Boggs, D.; Ghil, M.; Keppenne, C.
1995-01-01
The full nonlinear Kalman filter sequential algorithm is, in theory, well suited to the four-dimensional data assimilation problem in large-scale atmospheric and oceanic models. However, this algorithm can be very sensitive to computer roundoff, and results may cease to be meaningful as time advances. Implementations of a modified Kalman filter are given.
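The roundoff sensitivity arises largely in the covariance measurement update, which in its textbook form can lose symmetry and positive definiteness. A hedged sketch of the Joseph-form update, one standard stabilization (the U-D square-root factorization of the title takes this idea further; the numbers below are illustrative):

```python
import numpy as np

def joseph_update(P, H, R, K):
    """Joseph-form covariance measurement update. Algebraically equal to
    the textbook P - K H P for the optimal gain, but it preserves
    symmetry and positive definiteness much better under roundoff."""
    I = np.eye(P.shape[0])
    A = I - K @ H
    return A @ P @ A.T + K @ R @ K.T

# Illustrative 2-state, 1-observation update
P = np.diag([1.0, 2.0])
H = np.array([[1.0, 0.0]])
R = np.array([[0.5]])
K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)   # Kalman gain
P_new = joseph_update(P, H, R, K)
```

The extra matrix products cost more per step but keep the filter meaningful over long assimilation runs, which is exactly the failure mode described above.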
Sample Analysis at Mars Instrument, Side Panels Off
2012-08-27
An instrument suite that will analyze the chemical ingredients in samples of Martian atmosphere, rocks, and soil during the mission of NASA's Mars rover Curiosity is shown here during assembly at NASA Goddard Space Flight Center, Greenbelt, Md., in 2010.
Ghysels, Pieter; Li, Xiaoye S.; Rouet, Francois -Henry; ...
2016-10-27
Here, we present a sparse linear system solver that is based on a multifrontal variant of Gaussian elimination and exploits low-rank approximation of the resulting dense frontal matrices. We use hierarchically semiseparable (HSS) matrices, which have low-rank off-diagonal blocks, to approximate the frontal matrices. For HSS matrix construction, a randomized sampling algorithm is used together with interpolative decompositions. The combination of the randomized compression with a fast ULV HSS factorization leads to a solver with lower computational complexity than the standard multifrontal method for many applications, resulting in speedups up to 7-fold for problems in our test suite. The implementation targets many-core systems by using task parallelism with dynamic runtime scheduling. Numerical experiments show performance improvements over state-of-the-art sparse direct solvers. The implementation achieves high performance and good scalability on a range of modern shared memory parallel systems, including the Intel Xeon Phi (MIC). The code is part of a software package called STRUMPACK - STRUctured Matrices PACKage, which also has a distributed memory component for dense rank-structured matrices.
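The randomized-sampling idea used to compress the HSS off-diagonal blocks can be sketched in a few lines: multiply by a random Gaussian test matrix, orthonormalize the result, and project. This is a generic sketch of the technique, not STRUMPACK's implementation:

```python
import numpy as np

def randomized_lowrank(A, rank, oversample=10, seed=0):
    """Randomized range finder: approximate A as Q @ B, where Q has
    rank + oversample orthonormal columns, using only matrix products
    with a random Gaussian sketch."""
    rng = np.random.default_rng(seed)
    Omega = rng.standard_normal((A.shape[1], rank + oversample))
    Q, _ = np.linalg.qr(A @ Omega)   # orthonormal basis for the sampled range
    return Q, Q.T @ A                # A is approximately Q @ B

# An exactly rank-5 test matrix is recovered to machine precision
rng = np.random.default_rng(1)
A = rng.standard_normal((100, 5)) @ rng.standard_normal((5, 80))
Q, B = randomized_lowrank(A, 5)
```

In a multifrontal solver the same sketch is applied to frontal-matrix blocks whose off-diagonal parts have rapidly decaying singular values, which is what makes the low-rank compression pay off.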
[Epigenetic research of cognitive deficit in schizophrenia: some methodological considerations].
Lezheiko, T V; Alfimova, M V
To highlight the problems of assessing cognitive deficits in schizophrenia that are relevant to epigenetic research, as well as to a wide range of other approaches to the search for the biological bases of cognition. The literature on weaknesses in the evaluation of cognitive functions in patients with schizophrenia is summarized and discussed. The analysis is illustrated by our experience in developing a cognitive battery and a sample to examine relationships between DNA methylation in blood cells and cognitive deficits in schizophrenia. It has been shown that, to assess cognitive deficits in patients and to reduce the influence of confounders in epigenetic analysis, it is necessary (1) to use a battery with existing co-normative data in the target population, which allows evaluation of the representativeness of the controls and patients included in the study sample, (2) to verify the theoretically driven battery structure using a normative population and a cohort of patients, (3) to balance the groups of cases and controls in number, age, and sex, for which individual matching of cases and controls is best suited, and (4) to conduct additional statistical analysis controlling for education and smoking.
Feldspathic Lunar Meteorite Graves Nunataks 06157, a Magnesian Piece of the Lunar Highlands Crust
NASA Technical Reports Server (NTRS)
Zeigler, Ryan A.; Korotev, R. L.
2012-01-01
To date, 49 feldspathic lunar meteorites (FLMs) have been recovered, likely representing a minimum of 35 different sample locations in the lunar highlands. The compositional variability among FLMs far exceeds the variability observed among highland samples in the Apollo and Luna sample suites. Here we will discuss in detail one of the compositional end members of the FLM suite, Graves Nunataks (GRA) 06157, which was collected by the 2006-2007 ANSMET field team. At 0.79 g, GRA 06157 is the smallest lunar meteorite so far recovered. Despite its small size, its highly feldspathic and highly magnesian composition is intriguing. Although preliminary bulk compositions have been reported, thus far no petrographic descriptions are in the literature. Here we expand upon the bulk compositional data, including major-element compositions, and provide a detailed petrographic description of GRA 06157.
Provenance studies by fission-track dating of zircon-etching and counting procedures
Naeser, N.D.; Zeitler, P.K.; Naeser, C.W.; Cerveny, P.F.
1987-01-01
In sedimentary rocks that have not been heated to high enough temperatures to anneal fission tracks in zircon (greater than ≈ 160°C), fission-track ages of individual detrital zircon grains provide valuable information about the source rocks eroded to form the sediments. The success of such studies depends, however, on the degree to which the ages determined from the detrital suite accurately portray the range of grain ages that are present in the suite. This in turn depends to a large extent on using counting and, in particular, etching procedures that permit proper sampling of grains with a wide range of age and uranium concentrations. Results are reported here of an experimental study of a ‘detrital’ zircon suite manufactured from several zircon populations of known age. This study suggests that multiple etches are required when a complete spectrum of ages in a zircon suite is desired.
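The sampling bias the authors warn about can be made concrete with a toy simulation: if a single etch leaves grains below some track-density threshold uncountable, the mean age of the counted grains drifts away from the true mean of the suite. All numbers below are hypothetical and for illustration only:

```python
import numpy as np

rng = np.random.default_rng(42)

# A synthetic 'detrital' suite: a young low-uranium population mixed
# with an old high-uranium population. Spontaneous track density scales
# (schematically) with age * uranium content.
ages = np.concatenate([np.full(200, 20.0), np.full(200, 300.0)])   # Ma
uranium = np.concatenate([rng.uniform(50, 150, 200),                # ppm
                          rng.uniform(200, 600, 200)])
track_density = ages * uranium

# Suppose a single short etch leaves the lowest-density grains uncountable:
countable = track_density > np.percentile(track_density, 40)
biased_mean = ages[countable].mean()   # overestimates the true mean
true_mean = ages.mean()
```

Because the excluded grains are preferentially young and uranium-poor, the counted spectrum no longer portrays the full range of source ages, which is why multiple etches are needed.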
Pyrolysis-mass spectrometry/pattern recognition on a well-characterized suite of humic samples
MacCarthy, P.; DeLuca, S.J.; Voorhees, K.J.; Malcolm, R.L.; Thurman, E.M.
1985-01-01
A suite of well-characterized humic and fulvic acids of freshwater, soil and plant origin was subjected to pyrolysis-mass spectrometry and the resulting data were analyzed by pattern recognition and factor analysis. A factor analysis plot of the data shows that the humic acids and fulvic acids can be segregated into two distinct classes. Carbohydrate and phenolic components are more pronounced in the pyrolysis products of the fulvic acids, and saturated and unsaturated hydrocarbons contribute more to the humic acid pyrolysis products. A second factor analysis plot shows a separation which appears to be based primarily on whether the samples are of aquatic or soil origin. © 1985.
Joint Geophysical Inversion With Multi-Objective Global Optimization Methods
NASA Astrophysics Data System (ADS)
Lelievre, P. G.; Bijani, R.; Farquharson, C. G.
2015-12-01
Pareto multi-objective global optimization (PMOGO) methods generate a suite of solutions that minimize multiple objectives (e.g. data misfits and regularization terms) in a Pareto-optimal sense. Providing a suite of models, as opposed to a single model that minimizes a weighted sum of objectives, allows a more complete assessment of the possibilities and avoids the often difficult choice of how to weight each objective. We are applying PMOGO methods to three classes of inverse problems. The first class are standard mesh-based problems where the physical property values in each cell are treated as continuous variables. The second class of problems are also mesh-based but cells can only take discrete physical property values corresponding to known or assumed rock units. In the third class we consider a fundamentally different type of inversion in which a model comprises wireframe surfaces representing contacts between rock units; the physical properties of each rock unit remain fixed while the inversion controls the position of the contact surfaces via control nodes. This third class of problem is essentially a geometry inversion, which can be used to recover the unknown geometry of a target body or to investigate the viability of a proposed Earth model. Joint inversion is greatly simplified for the latter two problem classes because no additional mathematical coupling measure is required in the objective function. PMOGO methods can solve numerically complicated problems that could not be solved with standard descent-based local minimization methods. This includes the latter two classes of problems mentioned above. There are significant increases in the computational requirements when PMOGO methods are used but these can be ameliorated using parallelization and problem dimension reduction strategies.
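The core operation behind any PMOGO method is the non-dominated (Pareto) filter over candidate models. A minimal sketch for objectives to be minimized (function and variable names are hypothetical):

```python
def pareto_front(solutions):
    """Return the non-dominated subset of candidate solutions. Each
    solution is a tuple of objective values to be minimized, e.g.
    (data_misfit, regularization_term)."""
    front = []
    for s in solutions:
        dominated = any(
            all(o <= v for o, v in zip(other, s)) and other != s
            for other in solutions
        )
        if not dominated:
            front.append(s)
    return front

# Five candidate models scored on (misfit, model norm)
candidates = [(1.0, 5.0), (2.0, 3.0), (3.0, 4.0), (4.0, 1.0), (5.0, 5.0)]
front = pareto_front(candidates)
```

Returning the whole front, rather than the single minimizer of a weighted sum, is what lets the user assess the trade-off between data fit and regularization after the fact instead of weighting the objectives in advance.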
Electrostatic point charge fitting as an inverse problem: Revealing the underlying ill-conditioning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ivanov, Maxim V.; Talipov, Marat R.; Timerghazin, Qadir K., E-mail: qadir.timerghazin@marquette.edu
2015-10-07
Atom-centered point charge (PC) model of the molecular electrostatics—a major workhorse of the atomistic biomolecular simulations—is usually parameterized by least-squares (LS) fitting of the point charge values to a reference electrostatic potential, a procedure that suffers from numerical instabilities due to the ill-conditioned nature of the LS problem. To reveal the origins of this ill-conditioning, we start with a general treatment of the point charge fitting problem as an inverse problem and construct an analytical model with the point charges spherically arranged according to Lebedev quadrature which is naturally suited for the inverse electrostatic problem. This analytical model is contrasted to the atom-centered point-charge model that can be viewed as an irregular quadrature poorly suited for the problem. This analysis shows that the numerical problems of the point charge fitting are due to the decay of the curvatures corresponding to the eigenvectors of LS sum Hessian matrix. In part, this ill-conditioning is intrinsic to the problem and is related to decreasing electrostatic contribution of the higher multipole moments, that are, in the case of Lebedev grid model, directly associated with the Hessian eigenvectors. For the atom-centered model, this association breaks down beyond the first few eigenvectors related to the high-curvature monopole and dipole terms; this leads to even wider spread-out of the Hessian curvature values. Using these insights, it is possible to alleviate the ill-conditioning of the LS point-charge fitting without introducing external restraints and/or constraints. Also, as the analytical Lebedev grid PC model proposed here can reproduce multipole moments up to a given rank, it may provide a promising alternative to including explicit multipole terms in a force field.
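The ill-conditioning described here is easy to reproduce numerically: the LS design matrix has entries 1/|r_i - R_j|, and for closely spaced atom centers viewed from a distant potential surface its columns are nearly collinear. A small sketch under illustrative geometry (atomic units; numbers are not from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)

# Three collinear 'atoms' one bohr apart, and 200 reference potential
# points sampled on a radius-4 spherical shell around them
atoms = np.array([[-1.0, 0.0, 0.0], [0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
pts = rng.standard_normal((200, 3))
pts = 4.0 * pts / np.linalg.norm(pts, axis=1, keepdims=True)

# Design matrix of the LS fit: potential at pts from unit charges at atoms
A = 1.0 / np.linalg.norm(pts[:, None, :] - atoms[None, :, :], axis=2)

# A wide spread of singular values signals the ill-conditioning of the fit
cond = np.linalg.cond(A)
```

The dominant singular vector corresponds to the well-determined total charge (monopole), while the trailing ones carry the poorly determined higher-multipole information, mirroring the curvature-decay argument above.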
Gray, F.; Page, N.J.; Carlson, C.A.; Wilson, S.A.; Carlson, R.R.
1986-01-01
Analyses for platinum-group elements of the varied rock suites of three Alaskan-type ultramafic to mafic multi-intrusive bodies are reported. Ir and Ru are less than analytical sensitivities of 100 and 20 ppb; Rh is less than or near 1 ppb. Average Pd assays vary among the rocks within intrusive complexes and between the three complexes (6.3, 13.7, 36.4 ppb); average Pt assays vary little among the same samples (27.9, 60.9, 34.0 ppb). Statistically adjusted Pt/(Pt + Pd) ratios increase in each suite from gabbro through clinopyroxenite to olivine-rich rocks, possibly owing to Pd fractionation.
Metabolic and Subjective Results Review of the Integrated Suit Test Series
NASA Technical Reports Server (NTRS)
Norcross, J.R.; Stroud, L.C.; Klein, J.; Desantis, L.; Gernhardt, M.L.
2009-01-01
Crewmembers will perform a variety of exploration and construction activities on the lunar surface. These activities will be performed while inside an extravehicular activity (EVA) spacesuit. In most cases, human performance is compromised while inside an EVA suit as compared to a crewmember's unsuited performance baseline. Subjects completed different EVA-type tasks, ranging from ambulation to geology and construction activities, in different lunar analog environments including overhead suspension, underwater and 1-g lunar-like terrain, in both suited and unsuited conditions. In the suited condition, the Mark III (MKIII) EVA technology demonstrator suit was used, and suit pressure and suit weight were the parameters tested. In the unsuited conditions, weight, mass, center of gravity (CG), terrain type and navigation were the parameters. Tests were not fully crossed; to the extent possible, one parameter was varied while all others were left in the most nominal setting. Oxygen consumption (VO2), modified Cooper-Harper (CH) ratings of operator compensation and ratings of perceived exertion (RPE) were measured for each trial. For each variable, a lower value correlates to more efficient task performance. Due to a low sample size, statistical significance was not attainable. Initial findings indicate that suit weight, CG and the operational environment can have a large impact on human performance during EVA. Systematic, prospective testing series such as those performed to date will enable a better understanding of the crucial interactions of the human, the EVA suit system and their environment. However, work remains to be done to confirm these findings. These data have been collected using only unsuited subjects and one EVA suit prototype that is known to fit poorly on a large demographic of the astronaut population.
Key findings need to be retested using an EVA suit prototype better suited to a larger anthropometric portion of the astronaut population, and elements tested only in the unsuited condition need to be evaluated with an EVA suit and appropriate analog environment.
Carbon Dioxide Washout Testing Using Various Inlet Vent Configurations in the Mark-III Space Suit
NASA Technical Reports Server (NTRS)
Korona, F. Adam; Norcross, Jason; Conger, Bruce; Navarro, Moses
2014-01-01
Requirements for using a space suit during ground testing include providing adequate carbon dioxide (CO2) washout for the suited subject. Acute CO2 exposure can lead to symptoms including headache, dyspnea, lethargy, and eventually unconsciousness or even death. Symptoms depend on several factors including inspired partial pressure of CO2 (ppCO2), duration of exposure, metabolic rate of the subject, and physiological differences between subjects. Computational Fluid Dynamics (CFD) analysis has predicted that the configuration of the suit inlet vent has a significant effect on oronasal CO2 concentrations. The main objective of this test was to characterize inspired oronasal ppCO2 for a variety of inlet vent configurations in the Mark-III suit across a range of workload and flow rates. Data and trends observed during testing along with refined CFD models will be used to help design an inlet vent configuration for the Z-2 space suit. The testing methodology used in this test builds upon past CO2 washout testing performed on the Z-1 suit, Rear Entry I-Suit, and the Enhanced Mobility Advanced Crew Escape Suit. Three subjects performed two test sessions each in the Mark-III suit to allow for comparison between tests. Six different helmet inlet vent configurations were evaluated during each test session. Suit pressure was maintained at 4.3 psid. Suited test subjects walked on a treadmill to generate metabolic workloads of approximately 2000 and 3000 BTU/hr. Supply airflow rates of 6 and 4 actual cubic feet per minute were tested at each workload. Subjects wore an oronasal mask with an open port in front of the mouth and were allowed to breathe freely. Oronasal ppCO2 was monitored real-time via gas analyzers with sampling tubes connected to the oronasal mask. Metabolic rate was calculated from the CO2 production measured by an additional gas analyzer at the air outlet from the suit. 
Real-time metabolic rate measurements were used to adjust the treadmill workload to meet target metabolic rates. This paper provides detailed descriptions of the test hardware, methodology and results, as well as implications for future inlet vent designs and ground testing.
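Deriving a metabolic rate from measured CO2 production requires assuming a respiratory quotient to recover oxygen consumption; the abstract does not give the exact calculation used in the test, so the sketch below uses the standard Weir equation with an assumed RQ (all values illustrative):

```python
def metabolic_rate_btu_hr(vco2_l_min, rq=0.85):
    """Estimate metabolic rate (BTU/hr) from CO2 production (L/min)
    via the Weir equation, kcal/min = 3.941*VO2 + 1.106*VCO2, with
    VO2 inferred as VCO2 / RQ. The RQ value is an assumption."""
    vo2 = vco2_l_min / rq
    kcal_per_min = 3.941 * vo2 + 1.106 * vco2_l_min
    return kcal_per_min * 60.0 * 3.96567   # kcal/hr -> BTU/hr

# A VCO2 of ~1.7 L/min lands near the 2000 BTU/hr target workload
rate = metabolic_rate_btu_hr(1.7)
```

This kind of closed-loop estimate is what allows the treadmill workload to be adjusted in real time toward the 2000 and 3000 BTU/hr targets.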
Durable Suit Bladder with Improved Water Permeability for Pressure and Environment Suits
NASA Technical Reports Server (NTRS)
Bue, Grant C.; Kuznetz, Larry; Orndoff, Evelyne; Tang, Henry; Aitchison, Lindsay; Ross, Amy
2009-01-01
Water vapor permeability is shown to be useful in rejecting heat and managing moisture accumulation in launch-and-entry pressure suits. Currently this is accomplished through a porous Gore-Tex layer in the Advanced Crew and Escape Suit (ACES) and in the baseline design of the Constellation Suit System Element (CSSE) Suit 1. Non-porous dense monolithic membranes (DMMs) that are available offer potential improvements for water vapor permeability with reduced gas leak. Accordingly, three different pressure bladder materials were investigated for water vapor permeability and oxygen leak: Elasthane™ 80A (thermoplastic polyether urethane) provided from stock polymer material, and two custom thermoplastic polyether urethanes. Water vapor, carbon dioxide and oxygen permeability of the DMMs was measured in a 0.13 mm thick stand-alone layer, and in 0.08 mm and 0.05 mm thick layers each bonded to two different nylon and polyester woven reinforcing materials. Additional water vapor permeability and mechanical compression measurements were made with the reinforced 0.05 mm thick layers, further bonded with a polyester wicking layer and overlaid with moistened polyester fleece thermal underwear. This simulated the pressure from a supine crew person. The 0.05 mm thick nylon-reinforced sample with polyester wicking layer was further mechanically tested for wear and abrasion. Concepts for incorporating these materials in launch/entry and Extravehicular Activity pressure suits are presented.
NASA Technical Reports Server (NTRS)
Gaier, James R.; deLeon, Pablo G.; Lee, Pascal; McCue, Terry R.; Hodgson, Edward W.; Thrasher, Jeff
2010-01-01
In August 2009 YAP Films (Toronto) received permission from all entities involved to create a documentary film illustrating what it might be like to be on the surface of Mars in a space suit during a dust storm or in a dust devil. The science consultants on this project utilized this opportunity to collect data which could be helpful to assess the durability of current space suit construction to the Martian environment. The NDX-1 prototype planetary space suit developed at the University of North Dakota was used in this study. The suit features a hard upper torso garment, and a soft lower torso and boots assembly. On top of that, a nylon-cotton outer layer is used to protect the suit from dust. Unmanned tests were carried out in the Martian Surface Wind Tunnel (MARSWIT) at the NASA Ames Research Center, with the suit pressurized to 10 kPa gauge. These tests blasted the space suit upper torso and helmet, and a collection of nine candidate outer layer fabrics, with wind-borne simulant for five different 10 minute tests under both terrestrial and Martian surface pressures. The infiltration of the dust through the outer fabric of the space suit was photographically documented. The nine fabric samples were analyzed under light and electron microscopes for abrasion damage. Manned tests were carried out at Showbiz Studios (Van Nuys, CA) with the pressure maintained at 20±2 kPa gauge. A large fan-created vortex lifted Martian dust simulant (Fuller's Earth or JSC Mars-1) off of the floor, and one of the authors (Lee) wearing the NDX-1 space suit walked through it to judge both subjectively and objectively how the suit performed under these conditions. Both the procedures to scale the tests to Martian conditions and the results of the infiltration and abrasion studies will be discussed.
Ultrafiltrate and microdialysis DL probe in vitro recoveries: electrolytes and metabolites
NASA Technical Reports Server (NTRS)
Janle, E. M.; Cregor, M.
1996-01-01
Ultrafiltration (UF) and microdialysis (DL) probes are well suited for sampling interstitial concentrations of ions and metabolites in peripheral tissue. The first step in utilization of membrane sampling techniques is to determine the recovery characteristics of the probes in vitro.
NASA Astrophysics Data System (ADS)
Xu, Li; Liu, Lanlan; Niu, Jie; Tang, Li; Li, Jinliang; Zhou, Zhanfan; Long, Chenhai; Yang, Qi; Yi, Ziqi; Guo, Hao; Long, Yang; Fu, Yanyi
2017-05-01
As societal requirements for power supply reliability keep rising, live-line work on distribution networks (working with the power uninterrupted) has been widely carried out, but the high-temperature operating environment in summer can easily cause physical discomfort for operators and, in turn, lead to safety incidents. To address this problem, this paper proposes an air-conditioning suit for live-line work on distribution networks, explains its structure and cooling principle, and reports its on-site application. The results showed that the cooling effect of the air-conditioning suit was remarkable and that it effectively improved the working environment for operators, which is of great significance for raising the level of live-line work in China, reducing the probability of accidents, and enhancing the reliability of the power supply.
Livermore Compiler Analysis Loop Suite
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hornung, R. D.
2013-03-01
LCALS is designed to evaluate compiler optimizations and performance of a variety of loop kernels and loop traversal software constructs. Some of the loop kernels are pulled directly from "Livermore Loops Coded in C", developed at LLNL (see item 11 below for details of earlier code versions). The older suites were used to evaluate floating-point performance of hardware platforms prior to porting larger application codes. The LCALS suite is geared toward assessing C++ compiler optimizations and platform performance related to SIMD vectorization, OpenMP threading, and advanced C++ language features. LCALS contains 20 of 24 loop kernels from the older Livermore Loop suites, plus various others representative of loops found in current production application codes at LLNL. The latter loops emphasize more diverse loop constructs and data access patterns than the others, such as multi-dimensional difference stencils. The loops are included in a configurable framework, which allows control of compilation, loop sampling for execution timing, which loops are run, and their lengths. It generates timing statistics for analysis and for comparing variants of individual loops. Also, it is easy to add loops to the suite as desired.
The application of a novel optical SPM in biomedicine
NASA Astrophysics Data System (ADS)
Li, Yinli; Chen, Haibo; Wu, Shifa; Song, Linfeng; Zhang, Jian
2005-01-01
As an analysis tool, SPM has been broadly used in biomedicine in recent years; instruments such as AFM and SNOM are effective for detecting the nanostructures of living material at the atomic level. The atomic force and photon scanning tunneling microscope (AF/PSTM) is one member of the SPM family; it can obtain a sample's optical and atomic force images in a single scan, including the transmissivity image, refractive index image, and topography image. This report mainly introduces the application of AF/PSTM to red blood cell membranes and the effect of different sample preparation processes on the experimental results. The materials for preparing red cell membrane samples were anticoagulated blood, isotonic phosphate buffer solution (PBS), and fresh double-distilled water. The AF/PSTM images faithfully represented the biological samples regardless of the preparation process, which proves that AF/PSTM is well suited to imaging biological samples. At the same time, the optical images and the topography image of the same sample are complementary, which will make AF/PSTM a facile tool for analyzing the nanostructure of biological samples. As a further example, this paper presents the application of AF/PSTM to immunoassay; the result shows that AF/PSTM is well suited to analyzing biological samples and will become a new tool for biomedical testing.
ten Kate, Inge L; Canham, John S; Conrad, Pamela G; Errigo, Therese; Katz, Ira; Mahaffy, Paul R
2008-06-01
The objective of the 2009 Mars Science Laboratory (MSL), which is planned to follow the Mars Exploration Rovers and the Phoenix lander to the surface of Mars, is to explore and assess quantitatively a site on Mars as a potential habitat for present or past life. Specific goals include an assessment of the past or present biological potential of the target environment and a characterization of its geology and geochemistry. Included in the 10 investigations of the MSL rover is the Sample Analysis at Mars (SAM) instrument suite, which is designed to obtain trace organic measurements, measure water and other volatiles, and measure several light isotopes with experiment sequences designed for both atmospheric and solid-phase samples. SAM integrates a gas chromatograph, a mass spectrometer, and a tunable laser spectrometer supported by sample manipulation tools both within and external to the suite. The sub-part-per-billion sensitivity of the suite for trace species, particularly organic molecules, along with a mobile platform that will contain many kilograms of organic materials, presents a considerable challenge due to the potential for terrestrial contamination to mask the signal of martian organics. We describe the effort presently underway to understand and mitigate, wherever possible within the resource constraints of the mission, terrestrial contamination in MSL and SAM measurements.
Knott, J.R.; Sarna-Wojcicki, A. M.; Montanez, I.P.; Wan, E.
2007-01-01
Volcanic glass samples from the same volcanic center (intra-source) often have similar major-element compositions. Thus, it can be difficult to distinguish between individual tephra layers, particularly when using similarity coefficients calculated from electron microprobe major-element measurements. Minor/trace-element concentrations in glass can be determined by solution inductively coupled plasma mass spectrometry (S-ICP-MS), but have not been shown to be suitable for use in large tephrochronologic databases. Here, we present minor/trace-element concentrations measured by S-ICP-MS and compare these data by similarity coefficients, the method commonly used in large databases. Trial samples from the Bishop Tuff, the upper and lower tuffs of Glass Mountain, and the tuffs of Mesquite Spring suites from eastern California, USA, which have indistinguishable major-element compositions, were analyzed using S-ICP-MS. The resulting minor/trace-element similarity coefficients clearly separated the suites of tephra layers and, in most cases, individual tephra layers within each suite. Comparisons with previous instrumental neutron activation analysis (INAA) elemental measurements were marginally successful. This is an important step toward quantitative correlation in large tephrochronologic databases, definitive identification of volcanic glass samples, and high-resolution age determinations. © 2007 Elsevier Ltd and INQUA.
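The similarity-coefficient comparison described above is conventionally computed in Borchardt's form: the mean, over the compared elements, of the min/max concentration ratio for each element, so identical samples score 1.0. A minimal sketch of that calculation (the element list and concentration values below are hypothetical, not data from the study):

```python
def similarity_coefficient(sample_a, sample_b):
    """Borchardt-style similarity coefficient: mean of min/max ratios over
    shared element concentrations. 1.0 means identical compositions."""
    ratios = [min(a, b) / max(a, b)
              for a, b in zip(sample_a, sample_b) if max(a, b) > 0]
    return sum(ratios) / len(ratios)

# Hypothetical trace-element concentrations (ppm) for three glass samples
bishop_a = [120.0, 4.5, 33.0, 210.0]
bishop_b = [118.0, 4.7, 31.0, 205.0]   # same layer, slight analytical scatter
glass_mtn = [80.0, 9.1, 55.0, 150.0]   # different eruptive suite

sc_same = similarity_coefficient(bishop_a, bishop_b)   # high (same layer)
sc_diff = similarity_coefficient(bishop_a, glass_mtn)  # lower (different suite)
```

A fixed cutoff (often around 0.92-0.95 in tephrochronology practice) then decides whether two samples are treated as correlative.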
Božičević, Alen; Dobrzyński, Maciej; De Bie, Hans; Gafner, Frank; Garo, Eliane; Hamburger, Matthias
2017-12-05
The technological development of LC-MS instrumentation has led to significant improvements in performance and sensitivity, enabling high-throughput analysis of complex samples, such as plant extracts. Most software suites allow preprocessing of LC-MS chromatograms to obtain comprehensive information on single constituents. However, more advanced processing needs, such as the systematic and unbiased comparative metabolite profiling of large numbers of complex LC-MS chromatograms, remain a challenge. Currently, users have to rely on different tools to perform such data analyses. We developed a two-step protocol comprising a comparative metabolite profiling tool integrated in ACD/MS Workbook Suite, and a web platform developed in R designed for clustering and visualization of chromatographic data. Initially, all relevant chromatographic and spectroscopic data (retention time, molecular ions with their respective ion abundances, and sample names) are automatically extracted and assembled in an Excel spreadsheet. The file is then loaded into an online web application that includes various statistical algorithms and provides the user with tools to compare and visualize the results in intuitive 2D heatmaps. We applied this workflow to LC-ESIMS profiles obtained from 69 honey samples. Within a few hours of calculation on a standard PC, the honey samples were preprocessed and organized in clusters based on their metabolite profile similarities, highlighting the common metabolite patterns and distributions among samples. Implementation in the ACD/Laboratories software package enables subsequent integration of other analytical data and in silico prediction tools for modern drug discovery.
Mathematical Structure of Electromagnetic Terrain Feature Canopy Models.
1982-11-01
One of the problems in this formulation is how to introduce canopy abstraction and how to project the foliage area index (Suits, 1972). ...the extinction coefficient of light through the vegetation canopy will determine how the beam will be depleted with depth. The intensity of light reaching the... describes how layer i responds to flux incident from below. The flux... formulations of the canopy reflectance problem are being attempted, most notably by Verhoef
An educational approach to problem-based learning.
Chen, Nan-Chieh
2008-03-01
This paper provides an analysis of the educational framework of problem-based learning (PBL). As known and used, PBL finds its root in the Structuralism and Pragmatism schools of philosophy. In this paper, the three main requirements of PBL, namely learning by doing, learning in context, and focusing on the student, are discussed within the context of these two schools of thought. Given these attributes, PBL also seems ideally suited for use in learning bioethics.
ERIC Educational Resources Information Center
Lee, Sooin Tim
2012-01-01
There is a hunger for effective teacher equipping programs for adult volunteer teachers in the educational ministry of today's churches. In addition, these programs for volunteer teachers need to be well-suited for adult learners and relevant to their real-life situations. The purpose of this qualitative study is to explore the effects of…
Pettingill, H.S.; Sinha, A.K.; Tatsumoto, M.
1984-01-01
Rb-Sr isotopic data for anorthosites, charnockites, ferrodioritic to quartz monzonitic plutons, and high-grade gneisses of the Blue Ridge of central Virginia show evidence of post-emplacement metamorphism, but in some cases retain Grenville ages. The Pedlar River Charnockite Suite yields an isochron age of 1021 +/-36 Ma (initial 87Sr/86Sr ratio of 0.7047 +/-6), which agrees with published U-Pb zircon ages. Five samples of that unit which contain Paleozoic mylonitic fabrics define a regression line of 683 Ma, interpreted as a mixing line with no age significance. Samples of the Roseland Anorthosite Complex show excessive scatter on a Rb-Sr evolution diagram, probably due to Paleozoic (475 Ma) metamorphism. Data from the ferrodioritic to quartz monzonitic plutons of the area yield an age of 1009 +/-26 Ma (initial ratio = 0.7058 +/-4), which is in the range of the U-Pb zircon ages of 1000-1100 Ma. The Stage Road Layered Gneiss yields an age of 1147 +/-34 Ma (initial ratio of 0.7047 +/-5). Sm-Nd data for the Pedlar River Charnockite Suite reflect a pre-Grenville age of 1489 +/-118 Ma (εNd = +6.7 +/-1.2). Data for the Roseland Anorthosite Complex and the ferrodioritic to quartz monzonitic plutons yield Grenville isochron ages of 1045 +/-44 Ma (εNd = +1.0 +/-0.3) and 1027 +/-101 Ma (εNd = +1.4 +/-1.0), respectively. Two Roseland Anorthosite samples plot far above the isochron, demonstrating the effects of post-emplacement disturbance of Sm-Nd systematics, while mylonitized Pedlar River Charnockite Suite samples show no evidence of Sm-Nd redistribution. The disparity of the Sm-Nd age and other isotopic ages for the Pedlar River Charnockite Suite probably reflects a Sm-Nd "source" age, suggesting the presence of an older crust within this portion of the ca. 1 Ga old basement. © 1984 Springer-Verlag.
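The isochron ages quoted above follow from the slope of an isotope-ratio evolution diagram via t = ln(slope + 1)/λ. A sketch of that relation for the Rb-Sr system (the decay constant below is the commonly used 87Rb value of 1.42e-11 yr⁻¹; the slope is computed from the quoted age for a round-trip check, not taken from the paper's data tables):

```python
import math

LAMBDA_RB87 = 1.42e-11  # 87Rb decay constant, yr^-1 (value standard in 1980s geochronology)

def isochron_age_ma(slope):
    """Age in Ma from the slope of an 87Sr/86Sr vs 87Rb/86Sr isochron."""
    return math.log(slope + 1.0) / LAMBDA_RB87 / 1e6

def isochron_slope(age_ma):
    """Inverse relation: isochron slope implied by an age in Ma."""
    return math.exp(LAMBDA_RB87 * age_ma * 1e6) - 1.0

# Slope implied by the Pedlar River Charnockite Suite age of 1021 Ma
pedlar_slope = isochron_slope(1021.0)
```

The same t = ln(slope + 1)/λ relation underlies the Sm-Nd ages in the abstract, with the 147Sm decay constant substituted.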
Location uncertainty and the tri-areal design
Francis A. Roesch
2007-01-01
The U.S. Department of Agriculture Forest Service Forest Inventory and Analysis Program (FIA) uses a field plot design that incorporates multiple sample selection mechanisms. Not all of the five FIA units currently use the entire suite of available sample selection mechanisms. These sampling selection mechanisms could be described in a number of ways with respect to...
Ninety six gasoline samples were collected from around the U.S. in Autumn 2004. A detailed hydrocarbon analysis was performed on each sample resulting in a data set of approximately 300 chemicals per sample. Statistical analyses were performed on the entire suite of reported chem...
Location uncertainty and the tri-areal design
Francis A. Roesch
2005-01-01
The U.S. Department of Agriculture Forest Service Forest Inventory and Analysis Program (FIA) uses a field plot design that incorporates multiple sample selection mechanisms. Not all of the five FIA units currently use the entire suite of available sample selection mechanisms. These sampling selection mechanisms could be described in a number of ways with respect to...
The adaptive, cut-cell Cartesian approach (warts and all)
NASA Technical Reports Server (NTRS)
Powell, Kenneth G.
1995-01-01
Solution-adaptive methods based on cutting bodies out of Cartesian grids are gaining popularity now that the ways of circumventing the accuracy problems associated with small cut cells have been developed. Researchers are applying Cartesian-based schemes to a broad class of problems now, and, although there is still development work to be done, it is becoming clearer which problems are best suited to the approach (and which are not). The purpose of this paper is to give a candid assessment, based on applying Cartesian schemes to a variety of problems, of the strengths and weaknesses of the approach as it is currently implemented.
Bruhn, Peter; Geyer-Schulz, Andreas
2002-01-01
In this paper, we introduce genetic programming over context-free languages with linear constraints for combinatorial optimization, apply this method to several variants of the multidimensional knapsack problem, and discuss its performance relative to Michalewicz's genetic algorithm with penalty functions. With respect to Michalewicz's approach, we demonstrate that genetic programming over context-free languages with linear constraints improves convergence. A final result is that genetic programming over context-free languages with linear constraints is ideally suited to modeling complementarities between items in a knapsack problem: The more complementarities in the problem, the stronger the performance in comparison to its competitors.
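The penalty-function baseline referenced above evaluates infeasible knapsack selections by subtracting a term proportional to total constraint violation, so the search can traverse infeasible regions while still being steered toward feasibility. A minimal sketch of such a fitness function for the multidimensional knapsack (the profits, weights, capacities, and penalty weight are illustrative, not from the paper):

```python
def penalized_fitness(selection, profits, weights, capacities, penalty=100.0):
    """Penalty-style fitness for the multidimensional knapsack:
    total profit minus a penalty proportional to total capacity violation.
    selection is a 0/1 vector; weights[i][d] is item i's weight in dimension d."""
    profit = sum(p for p, s in zip(profits, selection) if s)
    violation = 0.0
    for dim, cap in enumerate(capacities):
        load = sum(w[dim] for w, s in zip(weights, selection) if s)
        violation += max(0.0, load - cap)
    return profit - penalty * violation

profits = [10.0, 7.0, 4.0]
weights = [[3.0, 2.0], [2.0, 3.0], [2.0, 2.0]]  # per-item weights in each dimension
capacities = [5.0, 5.0]

feasible = penalized_fitness([1, 1, 0], profits, weights, capacities)    # loads (5, 5)
infeasible = penalized_fitness([1, 1, 1], profits, weights, capacities)  # loads (7, 7)
```

The grammar-based genetic programming alternative discussed in the paper avoids this penalty entirely by generating only selections that a linearly constrained grammar can derive.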
Advanced Curation: Solving Current and Future Sample Return Problems
NASA Technical Reports Server (NTRS)
Fries, M.; Calaway, M.; Evans, C.; McCubbin, F.
2015-01-01
Advanced Curation is a wide-ranging and comprehensive research and development effort at NASA Johnson Space Center that identifies and remediates sample-related issues. For current collections, Advanced Curation investigates new cleaning, verification, and analytical techniques to assess their suitability for improving curation processes. Specific needs are also assessed for future sample return missions. For each need, a written plan is drawn up to achieve the requirement. The plan draws upon current Curation practices, input from Curators, the analytical expertise of the Astromaterials Research and Exploration Science (ARES) team, and suitable standards maintained by ISO, IEST, NIST, and other institutions. Additionally, new technologies are adopted on the basis of need and availability. Implementation plans are tested using customized trial programs with statistically robust courses of measurement, and are iterated if necessary until an implementable protocol is established. Upcoming and potential NASA missions such as OSIRIS-REx, the Asteroid Retrieval Mission (ARM), sample return missions in the New Frontiers program, and Mars sample return (MSR) all feature new difficulties and specialized sample handling requirements. The Mars 2020 mission in particular poses a suite of challenges since the mission will cache martian samples for possible return to Earth. In anticipation of future MSR, the following problems are among those under investigation: What is the most efficient means to achieve the less than 1.0 ng/sq cm total organic carbon (TOC) cleanliness required for all sample handling hardware? How do we maintain and verify cleanliness at this level? The Mars 2020 Organic Contamination Panel (OCP) predicts that organic carbon, if present, will be present at the "one to tens" of ppb level in martian near-surface samples. The same samples will likely contain wt% perchlorate salts, or approximately 1,000,000x as much perchlorate oxidizer as organic carbon.
The chemical kinetics of this reaction are poorly understood at present under the conditions of cached or curated martian samples. Among other parameters, what is the maximum temperature allowed during storage in order to preserve native martian organic compounds for analysis? What is the best means to collect headspace gases from cached martian (and other) samples? This gas will contain not only martian atmosphere but also off-gassed volatiles from the cached solids.
Engineering Antimicrobials Refractory to Resistance
USDA-ARS?s Scientific Manuscript database
Multi-drug resistant superbugs are a persistent problem in modern health care, demonstrating the need for a new class of antimicrobials that can address this concern. Triple-acting peptidoglycan hydrolase fusions are a novel class of antimicrobials which have qualities well suited to avoiding resis...
The Use of Returned Martian Samples to Evaluate the Possibility of Extant Life on Mars
NASA Astrophysics Data System (ADS)
iMOST Team; ten Kate, I. L.; Mackelprang, R.; Rettberg, P.; Smith, C. L.; Altieri, F.; Amelin, Y.; Ammannito, E.; Anand, M.; Beaty, D. W.; Benning, L. G.; Bishop, J. L.; Borg, L. E.; Boucher, D.; Brucato, J. R.; Busemann, H.; Campbell, K. A.; Carrier, B. L.; Czaja, A. D.; Debaille, V.; Des Marais, D. J.; Dixon, M.; Ehlmann, B. L.; Farmer, J. D.; Fernandez-Remolar, D. C.; Fogarty, J.; Glavin, D. P.; Goreva, Y. S.; Grady, M. M.; Hallis, L. J.; Harrington, A. D.; Hausrath, E. M.; Herd, C. D. K.; Horgan, B.; Humayun, M.; Kleine, T.; Kleinhenz, J.; Mangold, N.; Mayhew, L. E.; McCoy, J. T.; McCubbin, F. M.; McLennan, S. M.; McSween, H. Y.; Moser, D. E.; Moynier, F.; Mustard, J. F.; Niles, P. B.; Ori, G. G.; Raulin, F.; Rucker, M. A.; Schmitz, N.; Sefton-Nash, E.; Sephton, M. A.; Shaheen, R.; Shuster, D. L.; Siljestrom, S.; Spry, J. A.; Steele, A.; Swindle, T. D.; Tosca, N. J.; Usui, T.; Van Kranendonk, M. J.; Wadhwa, M.; Weiss, B. P.; Werner, S. C.; Westall, F.; Wheeler, R. M.; Zipfel, J.; Zorzano, M. P.
2018-04-01
The astrobiological community is highly interested in interrogating returned martian samples for evidence of extant life. A single observation with one method will not constitute evidence of extant life — it will require a suite of investigations.
NASA Technical Reports Server (NTRS)
Glavin, D. P.; Conrad, P.; Dworkin, J. P.; Eigenbrode, J.; Mahaffy, P. R.
2011-01-01
One key goal for the future exploration of Mars is the search for chemical biomarkers, including complex organic compounds important in life on Earth. The Sample Analysis at Mars (SAM) instrument suite on the Mars Science Laboratory (MSL) will provide the most sensitive measurements of the organic composition of rocks and regolith samples ever carried out in situ on Mars. SAM consists of a gas chromatograph (GC), quadrupole mass spectrometer (QMS), and tunable laser spectrometer to measure volatiles in the atmosphere and released from rock powders heated up to 1000 C. The measurement of organics in solid samples will be accomplished by three experiments: (1) pyrolysis QMS to identify alkane fragments and simple aromatic compounds; (2) pyrolysis GCMS to separate and identify complex mixtures of larger hydrocarbons; and (3) chemical derivatization and GCMS to extract less volatile compounds, including amino and carboxylic acids, that are not detectable by the other two experiments.
Petrographic and petrological studies of lunar rocks. [Apollo 15 breccias and Russian tektites
NASA Technical Reports Server (NTRS)
Winzer, S. R.
1978-01-01
Clasts, rind glass, matrix glass, and matrix minerals from five Apollo 15 glass-coated breccias (15255, 15286, 15465, 15466, and 15505) were studied optically and with the SEM/microprobe. Rind glass compositions differ from sample to sample, but are identical, or nearly so, to the local soil, suggesting their origin by fusion of that soil. Most breccia samples contain green or colorless glass spheres identical to the Apollo 15 green glasses. These glasses, along with other glass shards and fragments, indicate a large soil component is present in the breccias. Clast populations include basalts and gabbros containing phases highly enriched in iron, indicative of extreme differentiation or fractional crystallization. Impact melts, anorthosites, and minor amounts of ANT suite material are also present among the clasts. Tektite glasses, impact melts, and breccias from the Zhamanshin structure, USSR, were also studied. Basic tektite glasses were found to be identical in composition to impact melts from the structure, but no satisfactory parent material has been identified in the limited suite of samples available.
A Diagnostic Assessment of Evolutionary Multiobjective Optimization for Water Resources Systems
NASA Astrophysics Data System (ADS)
Reed, P.; Hadka, D.; Herman, J.; Kasprzyk, J.; Kollat, J.
2012-04-01
This study contributes a rigorous diagnostic assessment of state-of-the-art multiobjective evolutionary algorithms (MOEAs) and highlights key advances that the water resources field can exploit to better discover the critical tradeoffs constraining our systems. This study provides the most comprehensive diagnostic assessment of MOEAs for water resources to date, exploiting more than 100,000 MOEA runs and trillions of design evaluations. The diagnostic assessment measures the effectiveness, efficiency, reliability, and controllability of ten benchmark MOEAs for a representative suite of water resources applications addressing rainfall-runoff calibration, long-term groundwater monitoring (LTM), and risk-based water supply portfolio planning. The suite of problems encompasses a range of challenging problem properties including (1) many-objective formulations with 4 or more objectives, (2) multi-modality (or false optima), (3) nonlinearity, (4) discreteness, (5) severe constraints, (6) stochastic objectives, and (7) non-separability (also called epistasis). The applications are representative of the dominant problem classes that have shaped the history of MOEAs in water resources and that will be dominant foci in the future. Recommendations are provided for which modern MOEAs should serve as tools and benchmarks in the future water resources literature.
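Diagnostic assessments of MOEAs typically score the approximation sets each algorithm produces with indicators such as hypervolume: the objective-space volume dominated by a front relative to a reference point. A sketch of the two-objective minimization case, computed by a simple sweep (the front and reference point are illustrative, not from the study):

```python
def hypervolume_2d(front, ref):
    """Hypervolume dominated by a two-objective (minimization) front
    relative to reference point ref = (r1, r2). Larger is better."""
    hv, prev_f2 = 0.0, ref[1]
    for f1, f2 in sorted(front):      # sweep in ascending objective 1
        if f2 < prev_f2:              # skip dominated or duplicate points
            hv += (ref[0] - f1) * (prev_f2 - f2)
            prev_f2 = f2
    return hv

# Illustrative nondominated front and reference point
front = [(1.0, 4.0), (2.0, 2.0), (4.0, 1.0)]
hv = hypervolume_2d(front, ref=(5.0, 5.0))
```

Controllability and reliability in such assessments are then summarized by how this indicator varies across many random seeds and parameterizations of each algorithm.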
Heat strain and heat stress for workers wearing protective suits at a hazardous waste site
DOE Office of Scientific and Technical Information (OSTI.GOV)
Paull, J.M.; Rosenthal, F.S.
1987-05-01
In order to evaluate the effects of heat stress when full body protective suits are worn, heart rates, oral temperatures, and environmental parameters were measured for five unacclimatized male workers (25-33 years of age) who performed sampling activities during hazardous waste clean-up operations. The protective ensembles included laminated PVC-Tyvek chemical-resistant hooded suits with rubber boots, gloves, full-facepiece dual-cartridge respirators, and hard hats. For comparison, measurements also were performed when the men worked at a similar level of activity while they wore ordinary work clothes. A comparison of the heart rates for the men working with and without suits indicated that wearing the suits imposed a heat stress equivalent to adding 6° to 11°C (11° to 20°F) to the ambient WBGT index. A similar result was obtained by calculating the WBGT in the microclimate inside the suits and comparing it to the ambient WBGT. These results indicate the following: 1) there exists a significant risk of heat injury during hazardous waste work when full body protective clothing is worn, and 2) threshold limit values for heat stress established by the ACGIH must be lowered substantially before extending them to cover workers under these conditions.
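The WBGT index referred to above is a fixed weighted average of natural wet-bulb, globe, and dry-bulb temperatures, and the study's finding amounts to adding 6-11 °C to the ambient value when full-body suits are worn. A sketch using the standard outdoor (solar load) weighting (the temperature readings are hypothetical, not the study's site data):

```python
def wbgt_outdoor(t_nwb, t_globe, t_dry):
    """Outdoor WBGT index (ISO 7243 weighting, degrees C):
    0.7 * natural wet-bulb + 0.2 * globe + 0.1 * dry-bulb."""
    return 0.7 * t_nwb + 0.2 * t_globe + 0.1 * t_dry

# Hypothetical site readings on a hot summer day
ambient = wbgt_outdoor(t_nwb=24.0, t_globe=40.0, t_dry=32.0)

# The study found suits added roughly 6-11 degC to the effective WBGT:
effective_in_suit_low = ambient + 6.0
effective_in_suit_high = ambient + 11.0
```

Comparing these effective values against ACGIH work-rest threshold limit values is what motivates the paper's conclusion that the published limits must be lowered for suited work.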
Scanning electron microscopy of clays and clay minerals
Bohor, B.F.; Hughes, R.E.
1971-01-01
The scanning electron microscope (SEM) proves to be ideally suited for studying the configuration, texture, and fabric of clay samples. Growth mechanics of crystalline units (interpenetration and interlocking of crystallites, crystal habits, twinning, helical growth, and topotaxis) also are uniquely revealed by the SEM. Authigenic kaolins make up the bulk of the examples because their larger crystallite size, better crystallinity, and open texture make them more suited to examination by the SEM than most other clay mineral types.
Fuzzy Logic Approaches to Multi-Objective Decision-Making in Aerospace Applications
NASA Technical Reports Server (NTRS)
Hardy, Terry L.
1994-01-01
Fuzzy logic allows for the quantitative representation of multi-objective decision-making problems which have vague or fuzzy objectives and parameters. As such, fuzzy logic approaches are well-suited to situations where alternatives must be assessed by using criteria that are subjective and of unequal importance. This paper presents an overview of fuzzy logic and provides sample applications from the aerospace industry. Applications include an evaluation of vendor proposals, an analysis of future space vehicle options, and the selection of a future space propulsion system. On the basis of the results provided in this study, fuzzy logic provides a unique perspective on the decision-making process, allowing the evaluator to assess the degree to which each option meets the evaluation criteria. Future decision-making should take full advantage of fuzzy logic methods to complement existing approaches in the selection of alternatives.
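The fuzzy approach described above scores each alternative by its degree of membership in each vaguely defined criterion and then aggregates, here with the conservative min operator, so an option is only as good as its worst criterion. A minimal sketch (the membership functions, scales, and option scores are hypothetical, not those of the aerospace studies in the paper):

```python
def low_cost_membership(cost_musd):
    """Degree (0-1) to which a cost counts as 'low' on a hypothetical $100M scale."""
    return max(0.0, min(1.0, (100.0 - cost_musd) / 80.0))

def high_thrust_membership(thrust_kn):
    """Degree (0-1) to which a thrust counts as 'high' on a hypothetical 200 kN scale."""
    return max(0.0, min(1.0, thrust_kn / 200.0))

def min_aggregate(memberships):
    """Conservative fuzzy AND: an option is only as good as its worst criterion."""
    return min(memberships)

# Hypothetical candidate propulsion options: cost in $M, thrust in kN
options = {
    "option_a": [low_cost_membership(40.0), high_thrust_membership(70.0)],
    "option_b": [low_cost_membership(80.0), high_thrust_membership(180.0)],
}
best = max(options, key=lambda name: min_aggregate(options[name]))
```

Weighted aggregation operators can replace the plain min when criteria are of unequal importance, which is the situation the abstract highlights.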
Many-body optimization using an ab initio monte carlo method.
Haubein, Ned C; McMillan, Scott A; Broadbelt, Linda J
2003-01-01
Advances in computing power have made it possible to study solvated molecules using ab initio quantum chemistry. Inclusion of discrete solvent molecules is required to determine geometric information about solute/solvent clusters. Monte Carlo methods are well suited to finding minima in many-body systems, and ab initio methods are applicable to the widest range of systems. A first principles Monte Carlo (FPMC) method was developed to find minima in many-body systems, and emphasis was placed on implementing moves that increase the likelihood of finding minimum energy structures. Partial optimization and molecular interchange moves aid in finding minima and overcome the incomplete sampling that is unavoidable when using ab initio methods. FPMC was validated by studying the boron trifluoride-water system, and then the method was used to examine the methyl carbenium ion in water to demonstrate its application to solvation problems.
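The core of such a first-principles Monte Carlo search is Metropolis sampling: downhill moves are always accepted, uphill moves with Boltzmann probability, while the best structure seen is recorded. A sketch on a toy one-dimensional double-well "energy" standing in for an expensive ab initio energy call (all parameters are illustrative):

```python
import math
import random

def energy(x):
    """Toy double-well potential (minima at x = +/-1, E = 0),
    standing in for an ab initio energy evaluation."""
    return (x * x - 1.0) ** 2

def metropolis_minimize(steps=20000, temp=0.1, step_size=0.5, seed=1):
    """Metropolis Monte Carlo minimization: accept downhill moves always,
    uphill moves with probability exp(-dE/T); track the best point seen."""
    rng = random.Random(seed)
    x = 3.0  # start far from either minimum
    best_x, best_e = x, energy(x)
    for _ in range(steps):
        trial = x + rng.uniform(-step_size, step_size)
        d_e = energy(trial) - energy(x)
        if d_e <= 0 or rng.random() < math.exp(-d_e / temp):
            x = trial
            if energy(x) < best_e:
                best_x, best_e = x, energy(x)
    return best_x, best_e

best_x, best_e = metropolis_minimize()
```

The paper's partial-optimization and molecular-interchange moves play the role of the simple displacement proposal here, chosen to raise the odds of landing in minimum-energy basins of a many-body system.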
Laser Time-of-Flight Mass Spectrometry for Future In Situ Planetary Missions
NASA Technical Reports Server (NTRS)
Getty, S. A.; Brinckerhoff, W. B.; Cornish, T.; Ecelberger, S. A.; Li, X.; Floyd, M. A. Merrill; Chanover, N.; Uckert, K.; Voelz, D.; Xiao, X.;
2012-01-01
Laser desorption/ionization time-of-flight mass spectrometry (LD-TOF-MS) is a versatile, low-complexity instrument class that holds significant promise for future landed in situ planetary missions that emphasize compositional analysis of surface materials. Here we describe a 5kg-class instrument that is capable of detecting and analyzing a variety of analytes directly from rock or ice samples. Through laboratory studies of a suite of representative samples, we show that detection and analysis of key mineral composition, small organics, and particularly, higher molecular weight organics are well suited to this instrument design. A mass range exceeding 100,000 Da has recently been demonstrated. We describe recent efforts in instrument prototype development and future directions that will enhance our analytical capabilities targeting organic mixtures on primitive and icy bodies. We present results on a series of standards, simulated mixtures, and meteoritic samples.
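In a time-of-flight analyzer, the mass assignment follows from the drift relation t = L·sqrt(m / 2zeV): heavier ions arrive later, and inverting the relation recovers m/z from arrival time. A sketch with illustrative instrument parameters (the drift length and accelerating voltage below are not those of the instrument described):

```python
import math

E_CHARGE = 1.602176634e-19  # elementary charge, C
AMU = 1.66053906660e-27     # atomic mass constant, kg

def flight_time(mass_amu, accel_volts, drift_m, charge=1):
    """Ideal field-free drift time: t = L * sqrt(m / (2 z e V))."""
    m = mass_amu * AMU
    v = math.sqrt(2.0 * charge * E_CHARGE * accel_volts / m)
    return drift_m / v

def mass_from_time(t, accel_volts, drift_m, charge=1):
    """Invert the drift relation to recover m/z in amu from arrival time."""
    return 2.0 * charge * E_CHARGE * accel_volts * (t / drift_m) ** 2 / AMU

# Hypothetical 0.5 m drift tube with 5 kV acceleration
t100 = flight_time(100.0, accel_volts=5000.0, drift_m=0.5)
```

Real instruments calibrate t² against m/z with reference standards rather than relying on nominal L and V, but the quadratic time-mass relation is the same.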
Sabar, Nasser R; Ayob, Masri; Kendall, Graham; Qu, Rong
2015-02-01
Hyper-heuristics are search methodologies that aim to provide high-quality solutions across a wide variety of problem domains, rather than developing tailor-made methodologies for each problem instance/domain. A traditional hyper-heuristic framework has two levels, namely, the high-level strategy (heuristic selection mechanism and acceptance criterion) and the low-level heuristics (a set of problem-specific heuristics). Due to the different landscape structures of different problem instances, the high-level strategy plays an important role in the design of a hyper-heuristic framework. In this paper, we propose a new high-level strategy for a hyper-heuristic framework. The proposed high-level strategy utilizes a dynamic multiarmed bandit-extreme value-based reward as an online heuristic selection mechanism to select the appropriate heuristic to be applied at each iteration. In addition, we propose a gene expression programming framework to automatically generate the acceptance criterion for each problem instance, instead of using human-designed criteria. Two well-known, and very different, combinatorial optimization problems, one static (exam timetabling) and one dynamic (dynamic vehicle routing), are used to demonstrate the generality of the proposed framework. Compared with state-of-the-art hyper-heuristics and other bespoke methods, empirical results demonstrate that the proposed framework generalizes well across both domains. We obtain competitive, if not better, results when compared to the best known results obtained by other methods presented in the scientific literature. We also compare our approach against the recently released hyper-heuristic competition test suite, again demonstrating the generality of our approach against other methods that have utilized the same six benchmark datasets from this test suite.
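The multiarmed-bandit heuristic selection described above can be illustrated with a UCB-style rule: each low-level heuristic is an arm, scored by its mean observed reward plus an exploration bonus that shrinks as the arm is tried more often. This is a generic UCB1 sketch, not the paper's dynamic multiarmed bandit-extreme value formulation; the heuristic "success rates" are simulated:

```python
import math
import random

def ucb_select(counts, rewards, c=0.5):
    """Pick the arm (low-level heuristic) maximizing mean reward plus
    a UCB1-style exploration bonus."""
    total = sum(counts)
    def score(i):
        if counts[i] == 0:
            return float("inf")  # ensure every heuristic is tried at least once
        return rewards[i] / counts[i] + c * math.sqrt(math.log(total) / counts[i])
    return max(range(len(counts)), key=score)

# Simulated online selection among three hypothetical heuristics whose
# probabilities of improving the incumbent solution differ.
rng = random.Random(0)
true_rates = [0.2, 0.8, 0.5]
counts, rewards = [0, 0, 0], [0.0, 0.0, 0.0]
for _ in range(2000):
    arm = ucb_select(counts, rewards)
    counts[arm] += 1
    rewards[arm] += 1.0 if rng.random() < true_rates[arm] else 0.0
```

Over many iterations the selection concentrates on the heuristic that most often improves the solution, while still occasionally probing the others in case the landscape changes.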
Optical control and diagnostics sensors for gas turbine machinery
NASA Astrophysics Data System (ADS)
Trolinger, James D.; Jenkins, Thomas P.; Heeg, Bauke
2012-10-01
There exists a vast range of optical techniques that have been under development for solving complex measurement problems related to gas-turbine machinery and phenomena. For instance, several optical techniques are ideally suited for studying fundamental combustion phenomena in laboratory environments. Yet other techniques hold significant promise for use as either on-line gas turbine control sensors, or as health monitoring diagnostics sensors. In this paper, we briefly summarize these and discuss, in more detail, some of the latter class of techniques, including phosphor thermometry, hyperspectral imaging and low coherence interferometry, which are particularly suited for control and diagnostics sensing on hot section components with ceramic thermal barrier coatings (TBCs).
Human-Centric Teaming in a Multi-Agent EVA Assembly Task
NASA Technical Reports Server (NTRS)
Rehnmark, Fredrik; Currie, Nancy; Ambrose, Robert O.; Culbert, Christopher
2004-01-01
NASA's Human Space Flight program depends heavily on spacewalks performed by pairs of suited human astronauts. These Extra-Vehicular Activities (EVAs) are severely restricted in both duration and scope by consumables and available manpower. An expanded multi-agent EVA team combining the information-gathering and problem-solving skills of human astronauts with the survivability and physical capabilities of highly dexterous space robots is proposed. A 1-g test featuring two NASA/DARPA Robonaut systems working side-by-side with a suited human subject is conducted to evaluate human-robot teaming strategies in the context of a simulated EVA assembly task based on the STS-61B ACCESS flight experiment.
NASA Astrophysics Data System (ADS)
Richman, B. A.; Hsiao, G. S.; Rella, C.
2010-12-01
Optical spectroscopy based CRDS technology for isotopic analysis of δD and δ18O directly from liquid water has greatly increased the number and type of liquid samples analyzed. This increase has also revealed a previously unrecognized sample contamination problem. Recently West[1] and Brand[2] identified samples containing ethanol, methanol, plant extracts and other organic compounds analyzed by CRDS and other spectroscopy based techniques as yielding erroneous results for δD and δ18O (especially δD) due to spectroscopic interference. Not all organic compounds generate interference. Thus, identifying which samples are contaminated by which organic compounds is of key importance for data credibility and correction. To address this problem a new approach in the form of a software suite, ChemCorrect™, has been developed. A chemometrics component uses a spectral library of water isotopologues and interfering organic compounds to best fit the measured spectra. The best fit values provide a quantitative assay of the actual concentrations of the various species and are then evaluated to generate a visual flag indicating samples affected by organic contamination. Laboratory testing of samples spiked with known quantities of interfering organic compounds such as methanol, ethanol, and terpenes was performed. The software correctly flagged and identified type of contamination for all the spiked samples without any false positives. Furthermore the reported values were a linear function of actual concentration with an R^2>0.99 even for samples which contained multiple organic compounds. Further testing was carried out against a range of industrial chemical compounds which can contaminate ground water as well as a variety of plant derived waters and juices which were also analyzed by IRMS. The excellent results obtained give good insight into which organic compounds cause interference and which classes of plants are likely to contain interfering compounds. 
Finally, approaches to minimize the effect of interfering compounds will be discussed, including methods to assess the confidence level of an isotopic value obtained from a contaminated sample. [1] Rapid Commun. Mass Spectrom. 2010; 24: 1-7. [2] Rapid Commun. Mass Spectrom. 2009; 23: 1879-1884. Figure: results from laboratory samples, most of which were spiked with interfering organic compounds; samples are color coded as blue = standard, green = no contamination, yellow = slight contamination, red = heavily contaminated.
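The chemometric step described above can be sketched as linear least-squares unmixing against a spectral library. Everything below is an invented illustration: the line shapes, positions, and contamination threshold are assumptions, and the actual ChemCorrect™ library and fitting routine are not reproduced here.

```python
import numpy as np

# Fit a "measured" spectrum as a linear combination of library spectra (water
# isotopologues plus a candidate organic interferent), then flag the sample if
# the organic coefficient exceeds a threshold. All numbers are invented.
wavenumbers = np.linspace(7180.0, 7200.0, 200)

def peak(center, width):
    """Idealized Lorentzian line shape standing in for a library spectrum."""
    return 1.0 / (1.0 + ((wavenumbers - center) / width) ** 2)

library = np.column_stack([
    peak(7184.0, 0.8),   # H2(16)O (assumed position)
    peak(7190.0, 0.8),   # HDO (assumed position)
    peak(7195.0, 2.5),   # methanol: broad organic feature (assumed)
])

true_conc = np.array([1.0, 0.3, 0.05])        # sample with slight methanol
measured = library @ true_conc + 1e-4 * np.random.default_rng(0).standard_normal(200)

# Ordinary least squares stands in for the proprietary fitting routine.
conc, *_ = np.linalg.lstsq(library, measured, rcond=None)

ORGANIC_THRESHOLD = 0.01                      # flagging threshold (assumption)
flag = "contaminated" if conc[2] > ORGANIC_THRESHOLD else "clean"
print(conc.round(3), flag)
```

Because the library columns are well separated, the recovered coefficients track the true concentrations linearly, which is the behavior the R² > 0.99 result above reflects.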
Wing, Steve; Richardson, David B; Hoffmann, Wolfgang
2011-04-01
In April 2010, the U.S. Nuclear Regulatory Commission asked the National Academy of Sciences to update a 1990 study of cancer risks near nuclear facilities. Prior research on this topic has suffered from problems in hypothesis formulation and research design. We review epidemiologic principles used in studies of generic exposure-response associations and in studies of specific sources of exposure. We then describe logical problems with assumptions, formation of testable hypotheses, and interpretation of evidence in previous research on cancer risks near nuclear facilities. Advancement of knowledge about cancer risks near nuclear facilities depends on testing specific hypotheses grounded in physical and biological mechanisms of exposure and susceptibility while considering sample size and ability to adequately quantify exposure, ascertain cancer cases, and evaluate plausible confounders. Next steps in advancing knowledge about cancer risks near nuclear facilities require studies of childhood cancer incidence, focus on in utero and early childhood exposures, use of specific geographic information, and consideration of pathways for transport and uptake of radionuclides. Studies of cancer mortality among adults, cancers with long latencies, large geographic zones, and populations that reside at large distances from nuclear facilities are better suited for public relations than for scientific purposes.
A parametric model order reduction technique for poroelastic finite element models.
Lappano, Ettore; Polanz, Markus; Desmet, Wim; Mundo, Domenico
2017-10-01
This research presents a parametric model order reduction approach for vibro-acoustic problems in the frequency domain of systems containing poroelastic materials (PEM). The method is applied to the Finite Element (FE) discretization of the weak u-p integral formulation based on the Biot-Allard theory and makes use of reduced basis (RB) methods typically employed for parametric problems. The parametric reduction is obtained by rewriting the Biot-Allard FE equations for poroelastic materials using an affine representation of the frequency (thereby allowing for RB methods) and by projecting the frequency-dependent PEM system onto a global reduced-order basis generated with the proper orthogonal decomposition instead of standard modal approaches. This basis has proven better suited to describing the nonlinear frequency dependence and the strong coupling introduced by damping. The methodology presented is tested on two three-dimensional systems: in the first experiment, the surface impedance of a PEM layer sample is calculated and compared with results from the literature; in the second, the reduced-order model of a multilayer system coupled to an air cavity is assessed and the results are compared to those of the reference FE model.
A mosaic-based approach is needed to conserve biodiversity in disturbed freshwater ecosystems
Hitchman, Sean M.; Mather, Martha E.; Smith, Joseph M.; Fencl, Jane S.
2017-01-01
Conserving native biodiversity in the face of human‐ and climate‐related impacts is a challenging and globally important ecological problem that requires an understanding of spatially connected, organismal‐habitat relationships. Globally, a suite of disturbances (e.g., agriculture, urbanization, climate change) degrades habitats and threatens biodiversity. A mosaic approach (in which connected, interacting collections of juxtaposed habitat patches are examined) provides a scientific foundation for addressing many disturbance‐related, ecologically based conservation problems. For example, if specific habitat types disproportionately increase biodiversity, these keystones should be incorporated into research and management plans. Our sampling of fish biodiversity and aquatic habitat along ten 3‐km sites within the Upper Neosho River subdrainage, KS, from June‐August 2013 yielded three generalizable ecological insights. First, specific types of mesohabitat patches (i.e., pool, riffle, run, and glide) were physically distinct and created unique mosaics of mesohabitats that varied across sites. Second, species richness was higher in riffle mesohabitats when mesohabitat size reflected field availability. Furthermore, habitat mosaics that included more riffles had greater habitat diversity and more fish species. Thus, riffles (<5% of sampled area) acted as keystone habitats. Third, additional conceptual development, which we initiate here, can broaden the identification of keystone habitats across ecosystems and further operationalize this concept for research and conservation. Thus, adopting a mosaic approach can increase scientific understanding of organismal‐habitat relationships, maintain natural biodiversity, advance spatial ecology, and facilitate effective conservation of native biodiversity in human‐altered ecosystems.
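The keystone-habitat criterion above (riffles held a disproportionate share of species for <5% of sampled area) can be made concrete with a toy calculation. The numbers and the 3× threshold below are invented for illustration, not the Neosho River data.

```python
# Flag a "keystone habitat" as a mesohabitat type whose share of total species
# richness far exceeds its share of sampled area. All values are hypothetical.
samples = {
    # mesohabitat: (sampled area in m^2, species observed there)
    "pool":   (4200, {"a", "b", "c", "d"}),
    "run":    (3100, {"a", "b", "c"}),
    "glide":  (2300, {"a", "b"}),
    "riffle": (400,  {"a", "c", "e", "f", "g"}),  # small area, many species
}

total_area = sum(area for area, _ in samples.values())
all_species = set().union(*(spp for _, spp in samples.values()))

keystones = []
for habitat, (area, spp) in samples.items():
    area_share = area / total_area
    richness_share = len(spp) / len(all_species)
    if richness_share > 3 * area_share:           # threshold is an assumption
        keystones.append(habitat)
    print(f"{habitat:7s} area {area_share:5.1%}  richness {richness_share:5.1%}")

print("keystone habitats:", keystones)
```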
A glimpse from the inside of a space suit: What is it really like to train for an EVA?
NASA Astrophysics Data System (ADS)
Gast, Matthew A.; Moore, Sandra K.
2011-01-01
The beauty of the view from the office of a spacewalking astronaut gives the impression of simplicity, but few beyond the astronauts, and those who train them, know what it really takes to get there. Extravehicular Activity (EVA) training is an intense process that utilizes NASA's Neutral Buoyancy Laboratory (NBL) to develop a very specific skill set needed to safely construct and maintain the orbiting International Space Station. To qualify for flight assignments, astronauts must demonstrate the ability to work safely and efficiently in the physically demanding environment of the space suit, possess an acute ability to resolve unforeseen problems, and implement proper tool protocols to ensure no tools will be lost in space. Through the insights and the lessons learned by actual EVA astronauts and EVA instructors, this paper will take you on a journey through an astronaut's earliest experiences working in the space suit, termed the Extravehicular Mobility Unit (EMU), in the underwater training environment of the NBL. This work details an actual Suit Qualification NBL training event, outlines the numerous challenges the astronauts face throughout their initial training, and the various ways they adapt their own abilities to overcome them. The goal of this paper is to give everyone a small glimpse into what it is really like to work in a space suit.
NASA Astrophysics Data System (ADS)
Shervais, John W.; Vetter, Scott K.
1993-05-01
Many current models for the origin of lunar highland rocks feature as an essential component the assimilation of KREEPy material by primitive magmas parental to the Mg-rich suite and alkali suite plutonic rocks. Similar models have also been proposed for the origin of various mare basalt suites. However, any model which considers assimilation of KREEP an important petrologic process must sooner or later deal with the question: what is KREEP? Because pristine KREEP basalts are rare, and most known samples are small (e.g., 15382/15386), the geochemical variability of KREEP basalts is poorly known. Other KREEP compositions which are commonly used in these models include the hypothetical 'high-K KREEP' component of Warren and Wasson, which is derived from Apollo 14 soil data, and the 'superKREEP' quartz-monzodiorite 15405. Lunar breccia 15205 is a polymict regolith breccia that consists of approximately 20% KREEP basalt clasts and 20% quartz-normative basalt clasts in a KREEP-rich matrix. Bulk rock mixing calculations show that this sample comprises about 84% KREEP. The clasts range up to 1 cm in size, but most are considerably smaller. The primary aim is to characterize pristine KREEP basalts petrographically, to establish the range in chemical compositions of KREEP basalts, and to test models that were proposed for their origin. In addition, we may be able to extend the compositional range recognized in the quartz-normative basalt suite and cast some light on its origin as well. Preliminary whole rock geochemical data on the KREEP basalts are presented in a companion paper by M.M. Lindstrom and co-workers. Here we concentrate on the petrography and mineral chemistry of these clasts, and on the implications these data have for the origin of the different melt rock suites.
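A bulk-rock mixing calculation of the kind invoked above can be sketched as a least-squares fit of endmember proportions to a bulk composition. The oxide values below are fabricated for illustration; they are not the 15205 data.

```python
import numpy as np

# Solve for the proportions of endmember components that best reproduce a
# measured bulk composition (oxide wt%). All compositions are invented.
#                  SiO2  TiO2  Al2O3  FeO   MgO   K2O
kreep = np.array([51.0, 2.00, 15.0, 10.0, 8.0, 0.60])
qnb   = np.array([48.0, 1.00, 10.0, 19.0, 9.0, 0.05])  # quartz-normative basalt
bulk  = np.array([50.5, 1.84, 14.2, 11.4, 8.2, 0.51])

A = np.column_stack([kreep, qnb])
x, *_ = np.linalg.lstsq(A, bulk, rcond=None)
x = x / x.sum()                                # normalize proportions to sum to 1
print(f"KREEP {x[0]:.0%}, QNB {x[1]:.0%}")
```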
Development of Emergency Intravehicular Spacesuit (EIS) assembly
NASA Technical Reports Server (NTRS)
1973-01-01
A program was undertaken to develop and test two prototype pressure suits to operate at pressures up to 413 mm Hg (8.0 PSIG). The units were designated Emergency Intravehicular Spacesuits (EIS). Performance requirements, design evolution, testing performed, problems encountered, and final EIS configuration are reported.
Cooling of Electric Motors Used for Propulsion on SCEPTOR
NASA Technical Reports Server (NTRS)
Christie, Robert J.; Dubois, Arthur; Derlaga, Joseph M.
2017-01-01
NASA is developing a suite of hybrid-electric propulsion technologies for aircraft. These technologies offer lower emissions, diminished noise, increased efficiency, and reduced fuel burn, which translate into lower operating costs for aircraft operators. Replacing internal combustion engines with distributed electric propulsion is a keystone of this technology suite, but presents many new problems to aircraft system designers. One of these problems is how to cool the electric motors without adding significant aerodynamic drag, cooling system weight, or fan power. This paper discusses the options evaluated for cooling the motors on SCEPTOR (Scalable Convergent Electric Propulsion Technology and Operations Research): a project that will demonstrate Distributed Electric Propulsion technology in flight. Options for external and internal cooling, inlet and exhaust locations, ducting and adjustable cowling, and axial and centrifugal fans were evaluated. The final design was based on a trade among effectiveness, simplicity, robustness, mass, and performance over a range of ground and flight operating environments.
Performance Metrics for Monitoring Parallel Program Executions
NASA Technical Reports Server (NTRS)
Sarukkai, Sekkar R.; Gotwais, Jacob K.; Yan, Jerry; Lum, Henry, Jr. (Technical Monitor)
1994-01-01
Existing tools for debugging the performance of parallel programs provide either graphical representations of program execution or profiles of program executions. However, for performance debugging tools to be useful, such information has to be augmented with information that highlights the cause of poor program performance. Identifying the cause of poor performance requires not only determining the impact of various performance problems on the execution time of the program, but also considering the effect of interprocessor communication on individual source-level data structures. In this paper, we present a suite of normalized indices which provide a convenient mechanism for focusing on a region of code with poor performance and which highlight the cause of the problem in terms of processor, procedure, and data structure interactions. All the indices are generated from trace files augmented with data structure information. Further, we show with the help of examples from the NAS benchmark suite that the indices help in detecting potential causes of poor performance, based on augmented execution traces obtained by monitoring the program.
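The flavor of such a normalized index can be sketched as follows. The trace records, names, and total time below are invented; the paper's exact index definitions are not reproduced here.

```python
# Attribute traced communication time to (procedure, data structure) pairs and
# normalize by total execution time, so the largest index points at the
# dominant cause of poor performance. All records are hypothetical.
trace = [
    # (procedure, data_structure, comm_seconds)
    ("exchange_halo", "grid_u", 4.2),
    ("exchange_halo", "grid_v", 3.9),
    ("reduce_norm",   "resid",  0.7),
    ("exchange_halo", "grid_u", 4.0),
]
total_time = 30.0  # total program execution time in seconds (assumed)

index = {}
for proc, ds, t in trace:
    index[(proc, ds)] = index.get((proc, ds), 0.0) + t / total_time

worst = max(index, key=index.get)
print(worst, round(index[worst], 3))
```

Because every index is a fraction of total execution time, indices are directly comparable across procedures and data structures, which is what lets the tool rank causes rather than merely list timings.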
Multi-Tasking Non-Destructive Laser Technology in Conservation Diagnostic Procedures
NASA Astrophysics Data System (ADS)
Tornari, V.; Tsiranidou, E.; Orphanos, Y.; Falldorf, C.; Klattenhof, R.; Esposito, E.; Agnani, A.; Dabu, R.; Stratan, A.; Anastassopoulos, A.; Schipper, D.; Hasperhoven, J.; Stefanaggi, M.; Bonnici, H.; Ursu, D.
Laser metrology provides techniques that have been successfully applied in industrial structural diagnostic fields but have not yet been refined and optimised for the special investigative requirements found in cultural heritage applications. A major impediment is the partial applicability of various optical coherent techniques, each one narrowing its use down to a specific application. This characteristic is not well suited to a field that encounters a great variety of diagnostic problems, ranging from movable, multiple-composition museum objects, to immovable multi-layered wall paintings, statues and wood carvings, to monumental constructions and outdoor cultural heritage sites. Various diagnostic techniques have been suggested, each uniquely suited to one of the problems mentioned, but it is this fragmented suitability that obstructs the technology transfer. Because optical coherent metrology techniques rest on common fundamental principles and use similar procedures to generate informative signals for data collection, these limits motivate our aim of identifying complementary capabilities that together provide the needed functionality.
Rapid equilibrium sampling initiated from nonequilibrium data.
Huang, Xuhui; Bowman, Gregory R; Bacallado, Sergio; Pande, Vijay S
2009-11-24
Simulating the conformational dynamics of biomolecules is extremely difficult due to the rugged nature of their free energy landscapes and multiple long-lived, or metastable, states. Generalized ensemble (GE) algorithms, which have become popular in recent years, attempt to facilitate crossing between states at low temperatures by inducing a random walk in temperature space. Enthalpic barriers may be crossed more easily at high temperatures; however, entropic barriers will become more significant. This poses a problem because the dominant barriers to conformational change are entropic for many biological systems, such as the short RNA hairpin studied here. We present a new efficient algorithm for conformational sampling, called the adaptive seeding method (ASM), which uses nonequilibrium GE simulations to identify the metastable states, and seeds short simulations at constant temperature from each of them to quantitatively determine their equilibrium populations. Thus, the ASM takes advantage of the broad sampling possible with GE algorithms but generally crosses entropic barriers more efficiently during the seeding simulations at low temperature. We show that only local equilibrium is necessary for ASM, so very short seeding simulations may be used. Moreover, the ASM may be used to recover equilibrium properties from existing datasets that failed to converge, and is well suited to running on modern computer clusters.
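The population-estimation step of a seeding approach like the ASM can be sketched with a toy transition-count calculation. The state labels and counts below are invented (this is not the RNA-hairpin data), and the stationary-eigenvector step is a standard Markov-model estimate standing in for the paper's procedure.

```python
import numpy as np

# Short constant-temperature trajectories seeded in each metastable state yield
# transition counts; equilibrium populations are then the stationary
# distribution of the estimated transition matrix.
counts = np.array([
    [90.0,  8.0,  2.0],   # trajectories seeded in state 0
    [ 4.0, 94.0,  2.0],   # seeded in state 1
    [ 1.0,  3.0, 96.0],   # seeded in state 2
])
T = counts / counts.sum(axis=1, keepdims=True)   # row-stochastic transition matrix

# Stationary distribution: left eigenvector of T with eigenvalue 1.
evals, evecs = np.linalg.eig(T.T)
pi = np.real(evecs[:, np.argmax(np.real(evals))])
pi = pi / pi.sum()                               # normalize (also fixes sign)
print(pi.round(3))
```

Only local equilibrium within each state is needed for the counts to be meaningful, which is why the seeding trajectories can be very short.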
Tracking Multiple People Online and in Real Time
2015-12-21
Approved for public release; distribution is unlimited. Duke University, 2200 West Main Street, Suite 710, Durham, NC 27705-4010. We cast the problem of tracking several people online and in real time as a graph partitioning problem that takes the form of an NP-hard binary ...
1982-01-01
... concepts. Fatunla (1981) proposed symmetric hybrid schemes well suited to periodic initial value problems. A generalization of this idea is proposed ... one time step to another was kept below a prescribed value. Obviously this limits the truncation error only in some vague, general sense. The schemes ... stiffly stable linear multistep methods ... S.O. Fatunla, Trinity College, Dublin: P-stable hybrid schemes for initial value problems, April 13, 1982 ...
Revel8or: Model Driven Capacity Planning Tool Suite
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhu, Liming; Liu, Yan; Bui, Ngoc B.
2007-05-31
Designing complex multi-tier applications that must meet strict performance requirements is a challenging software engineering problem. Ideally, the application architect could derive accurate performance predictions early in the project life-cycle, leveraging initial application design-level models and a description of the target software and hardware platforms. To this end, we have developed a capacity planning tool suite for component-based applications, called Revel8or. The tool adheres to the model driven development paradigm and supports benchmarking and performance prediction for J2EE, .Net and Web services platforms. The suite is composed of three different tools: MDAPerf, MDABench and DSLBench. MDAPerf allows annotation of design diagrams and derives performance analysis models. MDABench allows a customized benchmark application to be modeled in the UML 2.0 Testing Profile and automatically generates a deployable application, with measurement automatically conducted. DSLBench allows the same benchmark modeling and generation to be conducted using a simple performance engineering Domain Specific Language (DSL) in Microsoft Visual Studio. DSLBench integrates with Visual Studio and reuses its load testing infrastructure. Together, the tool suite can assist capacity planning across platforms in an automated fashion.
Introduction to the Apollo collections. Part 1: Lunar igneous rocks
NASA Technical Reports Server (NTRS)
Mcgee, P. E.; Warner, J. L.; Simonds, C. H.
1977-01-01
The basic petrographic, chemical, and age data are presented for a representative suite of igneous rocks gathered during the six Apollo missions. Tables are given for 69 samples: 32 igneous rocks and 37 impactites (breccias). A description is given of 26 basalts, four plutonic rocks, and two pyroclastic samples. The textural-mineralogic name assigned each sample is included.
CO2 Washout Testing of the REI and EM-ACES Space Suits
NASA Technical Reports Server (NTRS)
Mitchell, Kate; Norcross, Jason
2011-01-01
Requirements for using a space suit during ground testing include providing adequate carbon dioxide (CO2) washout for the suited subject. Acute CO2 exposure can lead to symptoms including headache, dyspnea, lethargy and eventually unconsciousness or even death. Symptoms depend on several factors including partial pressure of CO2 (ppCO2), duration of exposure, metabolic rate of the subject and physiological differences between subjects. The objective of this test was to characterize inspired oronasal ppCO2 in the Rear Entry I-Suit (REI) and the Enhanced Mobility Advanced Crew Escape Suit (EM-ACES) across a range of workloads and flow rates for which ground testing is nominally performed. Three subjects were tested in each suit. In all but one case, each subject performed the test twice to allow for comparison between tests. Suit pressure was maintained at 4.3 psid. Subjects wore the suit while resting, performing arm ergometry, and walking on a treadmill to generate metabolic workloads of approximately 500 to 3000 BTU/hr. Supply airflow was varied at 6, 5 and 4 actual cubic feet per minute (ACFM) at each workload. Subjects wore an oronasal mask with an open port in front of the mouth and were allowed to breathe freely. Oronasal ppCO2 was monitored real-time via gas analyzers with sampling tubes connected to the oronasal mask. Metabolic rate was calculated from the total CO2 production measured by an additional gas analyzer at the air outlet from the suit. Real-time metabolic rate was used to adjust the arm ergometer or treadmill workload to meet target metabolic rates. In both suits, inspired CO2 was primarily affected by the metabolic rate of the subject, with increased metabolic rate resulting in increased inspired ppCO2. Suit flow rate also affected inspired ppCO2, with decreased flow causing small increases in inspired ppCO2. The effect of flow was more evident at metabolic rates greater than or equal to 2000 BTU/hr. 
Results were consistent between suits, with the EM-ACES demonstrating slightly better CO2 washout than the REI suit, although the difference was not statistically significant. Regression equations were developed for each suit to predict the mean inspired ppCO2 as a function of metabolic rate and suit flow rate. This paper provides detailed descriptions of the test hardware, methodology and results, as well as implications for future ground testing in the REI and EM-ACES.
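The regression step mentioned above can be sketched as an ordinary least-squares fit of inspired ppCO2 against metabolic rate and flow rate. The coefficients and data below are fabricated for illustration; they are not the paper's fitted equations.

```python
import numpy as np

# Fit mean inspired ppCO2 (mmHg, assumed units) as a linear function of
# metabolic rate (BTU/hr) and suit flow rate (ACFM). Synthetic data only.
met  = np.array([500, 1000, 2000, 3000, 500, 1000, 2000, 3000], dtype=float)
flow = np.array([6, 6, 6, 6, 4, 4, 4, 4], dtype=float)
ppco2 = 0.5 + 0.0020 * met - 0.30 * flow                    # assumed true model
ppco2 += np.array([0.02, -0.01, 0.03, -0.02, 0.01, 0.0, -0.03, 0.02])  # "noise"

X = np.column_stack([np.ones_like(met), met, flow])
coef, *_ = np.linalg.lstsq(X, ppco2, rcond=None)

def predict(met_rate, flow_rate):
    """Predicted mean inspired ppCO2 from the fitted equation."""
    return coef[0] + coef[1] * met_rate + coef[2] * flow_rate

print(coef.round(4), round(predict(2000.0, 5.0), 2))
```

The positive coefficient on metabolic rate and negative coefficient on flow mirror the trends reported above: higher workload raises inspired ppCO2, while higher supply flow lowers it.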
Mid-IR Spectra of HED Meteorites and Synthetic Pyroxenes: Reststrahlen Features (9-12 micron)
NASA Technical Reports Server (NTRS)
Lim, Lucy F.; Emery, Joshua P.; Moskovitz, Nicholas A.
2010-01-01
In an earlier study, Hamilton (2000) mapped the behavior of the 9-12 micron reststrahlen structures with composition in a suite of primarily natural terrestrial pyroxenes. Here we examine the same set of reststrahlen features in the spectra of diogenites and eucrites and place them in the context of the terrestrial samples and of a suite of well-characterized synthetic pyroxenes. The results will be useful to the interpretation of mid-IR spectra of 4 Vesta and other basaltic asteroids.
CO2 Washout Testing Using Various Inlet Vent Configurations in the Mark-III Space Suit
NASA Technical Reports Server (NTRS)
Korona, F. Adam; Norcross, Jason; Conger, Bruce; Navarro, Moses
2014-01-01
Requirements for using a space suit during ground testing include providing adequate carbon dioxide (CO2) washout for the suited subject. Acute CO2 exposure can lead to symptoms including headache, dyspnea, lethargy and eventually unconsciousness or even death. Symptoms depend on several factors including inspired partial pressure of CO2 (ppCO2), duration of exposure, metabolic rate of the subject and physiological differences between subjects. Computational Fluid Dynamic (CFD) analysis has predicted that the configuration of the suit inlet vent has a significant effect on oronasal CO2 concentrations. The main objective of this test was to characterize inspired oronasal ppCO2 for a variety of inlet vent configurations in the Mark-III suit across a range of workload and flow rates. Data and trends observed during testing along with refined CFD models will be used to help design an inlet vent configuration for the Z-2 space suit. The testing methodology used in this test builds upon past CO2 washout testing performed on the Z-1 suit, Rear Entry I-Suit (REI) and the Enhanced Mobility Advanced Crew Escape Suit (EM-ACES). Three subjects performed two test sessions each in the Mark-III suit to allow for comparison between tests. Six different helmet inlet vent configurations were evaluated during each test session. Suit pressure was maintained at 4.3 psid. Suited test subjects walked on a treadmill to generate metabolic workloads of approximately 2000 and 3000 BTU/hr. Supply airflow rates of 6 and 4 actual cubic feet per minute (ACFM) were tested at each workload. Subjects wore an oronasal mask with an open port in front of the mouth and were allowed to breathe freely. Oronasal ppCO2 was monitored real-time via gas analyzers with sampling tubes connected to the oronasal mask. Metabolic rate was calculated from the total oxygen consumption and CO2 production measured by additional gas analyzers at the air outlet from the suit. 
Real-time metabolic rate measurements were used to adjust the treadmill workload to meet target metabolic rates. This paper provides detailed descriptions of the test hardware, methodology and results, as well as implications for future inlet vent designs and ground testing.
ERIC Educational Resources Information Center
Tolmach, Judy
This publication describes economical ways to provide better housing for students and provides examples of workable solutions to the problem at various colleges and universities. For more economical and better housing, the author recommends (1) humanizing existing dormitories by changing the standard double rooms into suites of bedrooms with a…
Evaluation of Learning Gains through Integrated STEM Projects
ERIC Educational Resources Information Center
Corlu, Mehmet Ali; Aydin, Emin
2016-01-01
New approaches to instruction are needed in all educational levels in order to develop the skills suited to the twenty-first century (i.e., inquiry, problem solving, innovation, entrepreneurship, technological communication, experimental design, and investigativeness). This research evaluated the outcomes of an approach aiming to develop such…
Mitochondrial Disease: Possible Symptoms
... problems, fatigue, short stature. The UMDF mission: to promote research and education for the diagnosis, treatment, and cure of mitochondrial disorders and to ...
Materials science approaches to solve problems with emerging mycotoxins in corn
USDA-ARS?s Scientific Manuscript database
Materials science technology is an attractive, cost effective, and robust alternative to address the limitations of highly selective natural receptors. These materials are especially well suited to address issues with emerging toxins for which a better understanding is needed to establish levels of ...
Fabrication and performance analysis of a DEA cuff designed for dry-suit applications
NASA Astrophysics Data System (ADS)
Ahmadi, S.; Camacho Mattos, A.; Barbazza, A.; Soleimani, M.; Boscariol, P.; Menon, C.
2013-03-01
A method for manufacturing a cylindrical dielectric elastomer actuator (DEA) is presented. The cylindrical DEA can be used in fabricating the cuff area of dry-suits where the garment is very tight and wearing the suit is difficult. When electrically actuated, the DEA expands radially and the suit can be worn more comfortably. In order to study the performance of the DEA, a customized testing setup was designed, and silicone-made cuff samples with different material stiffnesses were tested. Analytical and FEM modeling were considered to evaluate the experimental output. The results revealed that although the stiffness of the DEA material has a direct relationship with the radial constrictive pressure caused by mechanically stretching the DEA, it has a minor effect on the actuation pressure. It was also found that stacking multiple layers of the DEA to fabricate a laminated structure enabled the attainment of a desired variation of pressure required for the implementation of an electrically tunable cuff.
Spacesuit Soft Upper Torso Sizing Systems
NASA Technical Reports Server (NTRS)
Graziosi, David; Splawn, Keith
2011-01-01
The passive sizing system consists of a series of low-profile pulleys attached to the front and back of the shoulder bearings on a spacesuit soft upper torso (SUT), textile cord or stainless steel cable, and a modified commercial ratchet mechanism. The cord/cable is routed through the pulleys and attached to the ratchet mechanism mounted on the front of the spacesuit within reach of the suited subject. Upon actuating the ratchet mechanism, the shoulder bearing breadth is changed, providing variable upper torso sizing. The active system consists of a series of pressurizable nastic cells embedded into the fabric layers of a spacesuit SUT. These cells are integrated into the front and back of the SUT and are connected to an air source with a variable regulator. When inflated, the nastic cells provide a change in the overall shoulder bearing breadth of the spacesuit and thus torso sizing. The research focused on the development of a high-performance sizing and actuation system. This technology has application as a suit-sizing mechanism to allow easier suit entry and more accurate suit fit with fewer torso sizes than the existing EMU (Extravehicular Mobility Unit) suit system. This advanced SUT will support NASA's Advanced EMU Evolutionary Concept of a two-sizes-fit-all upper torso for replacement of the current EMU hard upper torso (HUT). Both the passive and nastic sizing system approaches provide astronauts with real-time upper torso sizing, which translates into a more comfortable suit, providing enhanced fit resulting in improved crewmember performance during extravehicular activity. These systems will also benefit NASA by reducing flight logistics as well as overall suit system cost. The nastic sizing system approach provides additional structural redundancy over existing SUT designs by embedding additional coated fabric and uncoated fabric layers. Two sizing systems were selected to build into a prototype SUT: one active and one passive.
From manned testing, it was found that both systems offer good solutions to sizing a SUT to fit a crewmember. This new system provided improved suit don/doff over existing spacesuit designs as well as providing better fit at suit operational pressure resulting in improved comfort and mobility. It was found that a SUT with a sizing system may solve several problems that have plagued existing HUT designs, and that a SUT with a sizing system may be a viable option for advanced suit architectures.
Veliz-Cuba, Alan; Aguilar, Boris; Hinkelmann, Franziska; Laubenbacher, Reinhard
2014-06-26
A key problem in the analysis of mathematical models of molecular networks is the determination of their steady states. The present paper addresses this problem for Boolean network models, an increasingly popular modeling paradigm for networks lacking detailed kinetic information. For small models, the problem can be solved by exhaustive enumeration of all state transitions. But for larger models this is not feasible, since the size of the phase space grows exponentially with the dimension of the network. The dimension of published models is growing to over 100, so that efficient methods for steady state determination are essential. Several methods have been proposed for large networks, some of them heuristic. While these methods represent a substantial improvement in scalability over exhaustive enumeration, the problem for large networks is still unsolved in general. This paper presents an algorithm that consists of two main parts. The first is a graph theoretic reduction of the wiring diagram of the network, while preserving all information about steady states. The second part formulates the determination of all steady states of a Boolean network as a problem of finding all solutions to a system of polynomial equations over the finite field with two elements. This problem can be solved with existing computer algebra software. This algorithm compares favorably with several existing algorithms for steady state determination. One advantage is that it is not heuristic or reliant on sampling, but rather determines algorithmically and exactly all steady states of a Boolean network. The code for the algorithm, as well as the test suite of benchmark networks, is available upon request from the corresponding author. The algorithm presented in this paper reliably determines all steady states of sparse Boolean networks with up to 1000 nodes. The algorithm is effective at analyzing virtually all published models, even those of moderate connectivity.
The problem for large Boolean networks with high average connectivity remains an open problem.
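The fixed-point formulation described above can be sketched on a toy network: a state x is steady exactly when every update rule satisfies f_i(x) = x_i, i.e. the polynomial x_i + f_i(x) vanishes over the field with two elements. The three update rules below are hypothetical, and plain enumeration stands in for the computer-algebra step that makes the paper's approach scale:

```python
from itertools import product

# Hypothetical 3-node Boolean network: rule f_i maps the current
# state vector to the next value of node i.
rules = [
    lambda x: x[1] and not x[2],   # f_0
    lambda x: x[0],                # f_1
    lambda x: x[0] or x[2],        # f_2
]

def steady_states(rules):
    """A state x is steady iff f_i(x) == x_i for every node i,
    i.e. each GF(2) polynomial x_i + f_i(x) vanishes at x."""
    n = len(rules)
    return [x for x in product([0, 1], repeat=n)
            if all(int(f(x)) == x[i] for i, f in enumerate(rules))]

print(steady_states(rules))  # → [(0, 0, 0), (0, 0, 1)]
```

For large sparse networks the paper replaces this exhaustive loop with a wiring-diagram reduction followed by polynomial system solving in a computer algebra system.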
Rowe, Cynthia L.
2010-01-01
Adolescent substance abuse rarely occurs without other psychiatric and developmental problems, yet it is often treated and researched as if it can be isolated from comorbid conditions. Few comprehensive interventions are available that effectively address the range of co-occurring problems associated with adolescent substance abuse. This article reviews the clinical interventions and research evidence supporting the use of Multidimensional Family Therapy (MDFT) for adolescents with substance abuse and co-occurring problems. MDFT is uniquely suited to address adolescent substance abuse and related disorders given its comprehensive interventions that systematically target the multiple interacting risk factors underlying many developmental disruptions of adolescence. PMID:20682221
EC5 Space Suit Assembly Team Internship
NASA Technical Reports Server (NTRS)
Maicke, Andrew
2016-01-01
There were three main projects in this internship. The first pertained to the Bearing Dust Cycle Test, in particular automating the test to allow for easier administration. The second concerned modifying the communication system setup in the Z2 suit, where speakers and mics were adjusted to allow for more space in the helmet. And finally, the last project concerned the tensile strength testing of fabrics deemed candidates for space suit materials, which were to be sent off for radiation testing. The major duties were split among the projects detailed above. For the Bearing Dust Cycle Test, the first objective was to find a way to automate administration of the test, as the previous version was long and tedious to perform. In order to do this, it was necessary to introduce additional electronics and perform programming to control the automation. Once this was done, it would be necessary to update documents concerning the test setup, procedure, and potential hazards. Finally, I was tasked with running tests using the new system to confirm system performance. For the Z2 communication system modifications, it was necessary to investigate alternative speakers and microphones which may have better performance than those currently used in the suit. Further, new speaker and microphone positions needed to be identified to keep them out of the way of the suit user. Once this was done, appropriate hardware (such as speaker or microphone cases and holders) could be prototyped and fabricated. For the suit material strength testing, the first task was to gather and document various test fabrics to identify the best suit material candidates. Then, it was necessary to prepare samples for testing to establish baseline measurements and specify a testing procedure. Once the data was fully collected, additional test samples would be prepared and sent off-site to undergo irradiation before being tested again to observe changes in strength performance.
For the Bearing Dust Cycle Test, automation was achieved through use of a servo motor and code written in LabVIEW. With this, a small electrical servo controller was constructed and added to the system. For the Z2 communication modifications, speaker cases were developed and printed, and new speakers and mics were selected. This allowed us to move the speakers and mics to locations out of the suit user's way. For the suit material strength testing, five material candidates were identified and test samples were created. These samples underwent testing, and baseline test results were gathered, though these results are currently being investigated for accuracy. The main process efficiency developed during the course of this internship comes from automation of the Bearing Dust Cycle Test. In particular, many hours of human involvement and precise operation are replaced with a simple motor setup. Thus it is no longer required to man the test, saving valuable employee time. This internship has confirmed a few things for me, namely that I want to work as an engineer for an aerospace firm and, in particular, for the Johnson Space Center. It also confirmed my desire to work with electronics, though I was surprised to enjoy prototyping 3D CAD designs as much as I did. Therefore, I will make an effort to build my skills in this area so that I can continue to design mechanical models. In fact, I found the process of hands-on prototyping to be perhaps the most fun aspect of my time working here. This internship has also furthered my excitement for continuing education, and I will hopefully be pursuing a master's in my field in the near future.
Learning With Case-Injected Genetic Algorithms
2004-08-01
The job shop scheduling problem (JSSP) offers an example of a problem that is suited to an order-based encoding, where the order of the alleles matters. The JSSP is then used to propose and evaluate a solution similarity distance metric for order-based encodings. [Fig. 7 caption: an individual represents an allocation of platforms to targets.] The JSSP has been well studied elsewhere.
Hamilton's Equations with Euler Parameters for Rigid Body Dynamics Modeling. Chapter 3
NASA Technical Reports Server (NTRS)
Shivarama, Ravishankar; Fahrenthold, Eric P.
2004-01-01
A combination of Euler parameter kinematics and Hamiltonian mechanics provides a rigid body dynamics model well suited for use in strongly nonlinear problems involving arbitrarily large rotations. The model is unconstrained, free of singularities, includes a general potential energy function and a minimum set of momentum variables, and takes an explicit state space form convenient for numerical implementation. The general formulation may be specialized to address particular applications, as illustrated in several three dimensional example problems.
Burner liner thermal-structural load modeling
NASA Technical Reports Server (NTRS)
Maffeo, R.
1986-01-01
The software package Transfer Analysis Code to Interface Thermal/Structural Problems (TRANCITS) was developed. The TRANCITS code is used to interface temperature data between thermal and structural analytical models. The use of this transfer module allows the heat transfer analyst to select the thermal mesh density and thermal analysis code best suited to solve the thermal problem and gives the same freedoms to the stress analyst, without the efficiency penalties associated with common meshes and the accuracy penalties associated with the manual transfer of thermal data.
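The kind of mesh-to-mesh temperature transfer TRANCITS performs can be illustrated in one dimension; the sketch below (hypothetical node locations and temperatures, not TRANCITS itself) maps a denser thermal mesh onto a coarser structural mesh by piecewise-linear interpolation:

```python
def transfer_temperatures(thermal_x, thermal_T, structural_x):
    """Map nodal temperatures from a (denser) thermal mesh onto the
    structural mesh by piecewise-linear interpolation; points outside
    the thermal mesh are clamped to the nearest end value."""
    out = []
    for x in structural_x:
        if x <= thermal_x[0]:
            out.append(thermal_T[0]); continue
        if x >= thermal_x[-1]:
            out.append(thermal_T[-1]); continue
        # find the bracketing thermal interval and interpolate linearly
        for x0, x1, T0, T1 in zip(thermal_x, thermal_x[1:],
                                  thermal_T, thermal_T[1:]):
            if x0 <= x <= x1:
                out.append(T0 + (T1 - T0) * (x - x0) / (x1 - x0))
                break
    return out

# Dense thermal mesh vs. coarse structural mesh (made-up values, kelvin)
thermal_x = [0.0, 0.25, 0.5, 0.75, 1.0]
thermal_T = [300.0, 450.0, 600.0, 700.0, 750.0]
print(transfer_temperatures(thermal_x, thermal_T, [0.0, 0.5, 0.9]))
```

Letting the thermal analyst and stress analyst each keep their own mesh, with an automatic transfer step like this in between, is the efficiency gain the abstract describes.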
Implementation and Performance Issues in Collaborative Optimization
NASA Technical Reports Server (NTRS)
Braun, Robert; Gage, Peter; Kroo, Ilan; Sobieski, Ian
1996-01-01
Collaborative optimization is a multidisciplinary design architecture that is well-suited to large-scale multidisciplinary optimization problems. This paper compares this approach with other architectures, examines the details of the formulation, and evaluates some aspects of its performance. A particular version of the architecture is proposed to better accommodate the occurrence of multiple feasible regions. The use of system-level inequality constraints is shown to increase the convergence rate. A series of simple test problems, demonstrated to challenge related optimization architectures, is successfully solved with collaborative optimization.
Optimal shortening of uniform covering arrays
Rangel-Valdez, Nelson; Avila-George, Himer; Carrizalez-Turrubiates, Oscar
2017-01-01
Software test suites based on the concept of interaction testing are very useful for testing software components in an economical way. Test suites of this kind may be created using mathematical objects called covering arrays. A covering array, denoted by CA(N; t, k, v), is an N × k array over Z_v = {0, …, v−1} with the property that every N × t sub-array covers all t-tuples of Z_v^t at least once. Covering arrays can be used to test systems in which failures occur as a result of interactions among components or subsystems. They are often used in areas such as hardware Trojan detection, software testing, and network design. Because system testing is expensive, it is critical to reduce the amount of testing required. This paper addresses the Optimal Shortening of Covering ARrays (OSCAR) problem, an optimization problem whose objective is to construct, from an existing covering array matrix of uniform level, an array with dimensions of (N − δ) × (k − Δ) such that the number of missing t-tuples is minimized. Two applications of the OSCAR problem are (a) to produce smaller covering arrays from larger ones and (b) to obtain quasi-covering arrays (covering arrays in which the number of missing t-tuples is small) to be used as input to a meta-heuristic algorithm that produces covering arrays. In addition, it is proven that the OSCAR problem is NP-complete, and twelve different algorithms are proposed to solve it. An experiment was performed on 62 problem instances, and the results demonstrate the effectiveness of solving the OSCAR problem to facilitate the construction of new covering arrays. PMID:29267343
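The coverage condition, and the "missing t-tuples" objective of the OSCAR problem, can be checked directly. The sketch below counts uncovered t-tuples; the CA(4; 2, 3, 2) example array is a standard strength-2 binary construction, not one taken from the paper:

```python
from itertools import combinations

def missing_tuples(array, t, v):
    """Count uncovered t-tuples: for every choice of t columns, each of
    the v**t value combinations must appear in at least one row."""
    k = len(array[0])
    missing = 0
    for cols in combinations(range(k), t):
        seen = {tuple(row[c] for c in cols) for row in array}
        missing += v ** t - len(seen)
    return missing

# CA(4; 2, 3, 2): every pair of the 3 columns covers all 4 binary pairs
ca = [
    (0, 0, 0),
    (0, 1, 1),
    (1, 0, 1),
    (1, 1, 0),
]
print(missing_tuples(ca, 2, 2))       # → 0, a true covering array
print(missing_tuples(ca[:3], 2, 2))   # shortening by one row leaves gaps
```

Minimizing exactly this count after deleting δ rows and Δ columns is the OSCAR objective.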
A Model of Objective Weighting for EIA.
ERIC Educational Resources Information Center
Ying, Long Gen; Liu, You Ci
1995-01-01
In the research of environmental impact assessment (EIA), the problem of weight distribution for a set of parameters has not yet been properly solved. Presents an approach to objective weighting using a procedure of Pij principal component-factor analysis (Pij PCFA), which is specifically suited to those parameters measured directly by physical…
Implementing and Assessing Computational Modeling in Introductory Mechanics
ERIC Educational Resources Information Center
Caballero, Marcos D.; Kohlmyer, Matthew A.; Schatz, Michael F.
2012-01-01
Students taking introductory physics are rarely exposed to computational modeling. In a one-semester large lecture introductory calculus-based mechanics course at Georgia Tech, students learned to solve physics problems using the VPython programming environment. During the term, 1357 students in this course solved a suite of 14 computational…
Can't Do Maths--Understanding Students' Maths Anxiety
ERIC Educational Resources Information Center
Metje, N.; Frank, H. L.; Croft, P.
2007-01-01
The number of students continuing with their mathematics education post-GCSE has declined in recent years, and hence the number of students entering Engineering degrees is falling. The University of Birmingham recognized this problem and introduced the Suite of Technology programme (STP), which no longer requires students to have A-level mathematics.…
Public University Responses to Academic Dishonesty: Disciplinary or Academic.
ERIC Educational Resources Information Center
Roberts, Robert N.
1986-01-01
Reviews court decisions in cases involving suspension or dismissal of public university students for academic dishonesty. The courts have required universities defending such suits to meet the procedural and due process standards for nonacademic disciplinary proceedings. Discusses the constitutional due process problems raised by the suspension or…
Music Learning: Greater than the Sum of Its Parts.
ERIC Educational Resources Information Center
Zentz, Donald M.
1992-01-01
Discusses that Gestalt principles are especially well suited to teaching music. Identifies the laws of proximity, similarity, common direction, and simplicity in the notation system. Suggests that music teachers use these principles by following a logical progression to teach students to improve musical skills, solve problems, and think in…
Enhancing Student Performance Using Tablet Computers
ERIC Educational Resources Information Center
Enriquez, Amelito G.
2010-01-01
Tablet PCs have the potential to change the dynamics of classroom interaction through wireless communication coupled with pen-based computing technology that is suited for analyzing and solving engineering problems. This study focuses on how tablet PCs and wireless technology can be used during classroom instruction to create an Interactive…
Negligence--When Is the Principal Liable? A Legal Memorandum.
ERIC Educational Resources Information Center
Stern, Ralph D., Ed.
Negligence, a tort liability, is defined, discussed, and reviewed in relation to several court decisions involving school principals. The history of liability suits against school principals suggests that a reasonable, prudent principal can avoid legal problems. Ten guidelines are presented to assist principals in avoiding charges of negligence.…
ERIC Educational Resources Information Center
Radack, Shirley M.
1994-01-01
Examines the role of the National Institute of Standards and Technology (NIST) in the development of the National Information Infrastructure (NII). Highlights include the standards process; voluntary standards; Open Systems Interconnection problems; Internet Protocol Suite; consortia; government's role; and network security. (16 references) (LRW)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pastore, Giovanni; Rabiti, Cristian; Pizzocri, Davide
PolyPole is a numerical algorithm for the calculation of intra-granular fission gas release. In particular, the algorithm solves the gas diffusion problem in a fuel grain in time-varying conditions. The program has been extensively tested. PolyPole combines a high accuracy with a high computational efficiency and is ideally suited for application in fuel performance codes.
EOS Interpolation and Thermodynamic Consistency
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gammel, J. Tinka
2015-11-16
As discussed in LA-UR-08-05451, the current interpolator used by Grizzly, OpenSesame, EOSPAC, and similar routines is the rational function interpolator from Kerley. While the rational function interpolator is well-suited for interpolation on sparse grids with logarithmic spacing and preserves monotonicity in 1-D, it has some known problems.
Approaches in highly parameterized inversion - GENIE, a general model-independent TCP/IP run manager
Muffels, Christopher T.; Schreuder, Willem A.; Doherty, John E.; Karanovic, Marinko; Tonkin, Matthew J.; Hunt, Randall J.; Welter, David E.
2012-01-01
GENIE is a model-independent suite of programs that can be used to generally distribute, manage, and execute multiple model runs via the TCP/IP infrastructure. The suite consists of a file distribution interface, a run manager, a run executor, and a routine that can be compiled as part of a program and used to exchange model runs with the run manager. Because communication is via a standard protocol (TCP/IP), any computer connected to the Internet can serve in any of the capacities offered by this suite. Model independence is consistent with the existing template and instruction file protocols of the widely used PEST parameter estimation program. This report describes (1) the problem addressed; (2) the approach used by GENIE to queue, distribute, and retrieve model runs; and (3) user instructions, classes, and functions developed. It also includes (4) an example to illustrate the linking of GENIE with Parallel PEST using the interface routine.
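The queue/distribute/retrieve pattern GENIE implements can be sketched with raw sockets. The protocol below is a made-up stand-in (plain run IDs and result strings), not GENIE's actual interface, and the "model run" is a trivial placeholder:

```python
import socket
import threading

# Hypothetical run manager: hands out model-run IDs over TCP and
# collects results; any machine that can reach it could be a worker.
def manage_runs(srv, runs, results):
    while runs:
        conn, _ = srv.accept()
        with conn:
            conn.sendall(str(runs.pop(0)).encode())   # distribute one run
            results.append(conn.recv(1024).decode())  # retrieve its result

def worker(host, port):
    with socket.create_connection((host, port)) as s:
        run_id = int(s.recv(1024).decode())
        s.sendall(f"run {run_id} done".encode())      # stand-in for a model run

srv = socket.socket()
srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
srv.bind(("127.0.0.1", 0))            # let the OS pick a free port
srv.listen()
port = srv.getsockname()[1]

runs, results = [1, 2], []
t = threading.Thread(target=manage_runs, args=(srv, runs, results))
t.start()
for _ in range(2):
    worker("127.0.0.1", port)
t.join()
srv.close()
print(results)
```

Because only a standard TCP connection is required, the workers need share nothing with the manager beyond the host/port pair, which is the model-independence point the abstract makes.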
NASA Astrophysics Data System (ADS)
Deckard, Michael; Ratib, Osman M.; Rubino, Gregory
2002-05-01
Our project was to design and implement a ceiling-mounted multi-monitor display unit for use in a high-field MRI surgical suite. The system is designed to simultaneously display images/data from four different digital and/or analog sources with: minimal interference from the adjacent high magnetic field, minimal signal-to-noise/artifact contribution to the MRI images, and compliance with codes and regulations for the sterile neuro-surgical environment. Provisions were also made to accommodate the importing and exporting of video information via PACS and remote processing/display for clinical and education uses. Commercial fiber optic receivers/transmitters were implemented along with supporting video processing and distribution equipment to solve the video communication problem. A new generation of high-resolution color flat panel displays was selected for the project. A custom-made monitor mount and in-suite electronics enclosure were designed and constructed at UCLA. Difficulties with implementing an isolated AC power system are discussed and a work-around solution presented.
Implementation and verification of global optimization benchmark problems
NASA Astrophysics Data System (ADS)
Posypkin, Mikhail; Usov, Alexander
2017-12-01
The paper considers the implementation and verification of a test suite containing 150 benchmarks for global deterministic box-constrained optimization. A C++ library for describing standard mathematical expressions was developed for this purpose. The library automates the process of generating the value of a function and its gradient at a given point, and the interval estimates of a function and its gradient on a given box, from a single description. Based on this functionality, we have developed a collection of tests for automatic verification of the proposed benchmarks. The verification has shown that literature sources contain mistakes in the descriptions of the benchmarks. The library and the test suite are available for download and can be used freely.
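One way a single description of an expression can yield both its value and its gradient, as the library described above does, is forward-mode differentiation. The minimal dual-number sketch below is illustrative only (it is not the authors' C++ implementation, and the Rosenbrock-style test function is just a classic benchmark chosen here):

```python
class Dual:
    """Minimal forward-mode dual number: a value plus a derivative part."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def _lift(self, o):
        return o if isinstance(o, Dual) else Dual(o)
    def __add__(self, o):
        o = self._lift(o)
        return Dual(self.val + o.val, self.dot + o.dot)
    __radd__ = __add__
    def __sub__(self, o):
        o = self._lift(o)
        return Dual(self.val - o.val, self.dot - o.dot)
    def __rsub__(self, o):
        return Dual(o) - self
    def __mul__(self, o):  # product rule for the derivative part
        o = self._lift(o)
        return Dual(self.val * o.val, self.dot * o.val + self.val * o.dot)
    __rmul__ = __mul__

def value_and_grad(f, x):
    """Evaluate f and its gradient at x from one description of f."""
    val = f([Dual(xi) for xi in x]).val
    grad = []
    for i in range(len(x)):  # seed one coordinate direction at a time
        seeds = [Dual(xj, 1.0 if j == i else 0.0) for j, xj in enumerate(x)]
        grad.append(f(seeds).dot)
    return val, grad

# Rosenbrock function, a classic box-constrained benchmark
f = lambda x: 100 * (x[1] - x[0] * x[0]) * (x[1] - x[0] * x[0]) \
    + (1 - x[0]) * (1 - x[0])
print(value_and_grad(f, [0.0, 0.0]))  # → (1.0, [-2.0, 0.0])
```

Interval estimates on a box, the library's other output, can be produced from the same expression description by swapping interval arithmetic in for the dual-number arithmetic.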
John F. Caratti
2006-01-01
The FIREMON Line Intercept (LI) method is used to assess changes in plant species cover for a macroplot. This method uses multiple line transects to sample within plot variation and quantify statistically valid changes in plant species cover and height over time. This method is suited for most forest and rangeland communities, but is especially useful for sampling...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Preston, D.R.; Schaub, S.A.
1991-09-01
A literature and market search of existing technology for the detection, identification, and quantification of microorganisms in water was conducted. Based upon the availability of technologies and their configurations, an assessment of the appropriate strategies to pursue for the near- and long-term development of the Rapid Field Bacteriology Test Kit was performed. Near-term technologies to improve the Army's capability to detect microorganisms would appear to be essentially improvements in versatility and measurement of coliform indicator organisms. New chromogenic and fluorogenic indicator substances associated with new substrates appear to be best suited for test kit development, either for quantitative membrane filter tests or presence/absence and multiple fermentation tests. Test times, incubator requirements, and operator involvement appear to be similar to older technologies. Long-term development would appear to favor such technologies as genetic probes with amplification of the hybridized nucleic acid materials of positive samples, and some immunological based systems such as enzyme-linked immunosorbent assays. In both cases, the major problems would appear to be sample preparation and development of signal strengths from the reactions which would allow the user to see results in 1 hour.
Extending substructure based iterative solvers to multiple load and repeated analyses
NASA Technical Reports Server (NTRS)
Farhat, Charbel
1993-01-01
Direct solvers currently dominate commercial finite element structural software, but do not scale well in the fine granularity regime targeted by emerging parallel processors. Substructure based iterative solvers (often also called domain decomposition algorithms) lend themselves better to parallel processing, but must overcome several obstacles before earning their place in general purpose structural analysis programs. One such obstacle is the solution of systems with many or repeated right hand sides. Such systems arise, for example, in multiple load static analyses and in implicit linear dynamics computations. Direct solvers are well-suited for these problems because after the system matrix has been factored, the multiple or repeated solutions can be obtained through relatively inexpensive forward and backward substitutions. On the other hand, iterative solvers in general are ill-suited for these problems because they often must restart from scratch for every different right hand side. In this paper, we present a methodology for extending the range of applications of domain decomposition methods to problems with multiple or repeated right hand sides. Basically, we formulate the overall problem as a series of minimization problems over K-orthogonal and supplementary subspaces, and tailor the preconditioned conjugate gradient algorithm to solve them efficiently. The resulting solution method is scalable, whereas direct factorization schemes and forward and backward substitution algorithms are not. We illustrate the proposed methodology with the solution of static and dynamic structural problems, and highlight its potential to outperform forward and backward substitutions on parallel computers.
As an example, we show that for a linear structural dynamics problem with 11640 degrees of freedom, every time-step beyond time-step 15 is solved in a single iteration and consumes 1.0 second on a 32 processor iPSC-860 system; for the same problem and the same parallel processor, a pair of forward/backward substitutions at each step consumes 15.0 seconds.
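The cost asymmetry the abstract describes is easy to see in code: factor the matrix once, then each additional right-hand side costs only a cheap forward and backward substitution. A pure-Python sketch (Doolittle LU without pivoting, adequate for this small symmetric positive definite example):

```python
def lu_factor(A):
    """Doolittle LU factorization (no pivoting; fine for this SPD demo).
    This is the expensive O(n^3) step, paid once per matrix."""
    n = len(A)
    L = [[0.0] * n for _ in range(n)]
    U = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i, n):
            U[i][j] = A[i][j] - sum(L[i][k] * U[k][j] for k in range(i))
        L[i][i] = 1.0
        for j in range(i + 1, n):
            L[j][i] = (A[j][i] - sum(L[j][k] * U[k][i]
                                     for k in range(i))) / U[i][i]
    return L, U

def lu_solve(L, U, b):
    """Forward then backward substitution: O(n^2), reused per new RHS."""
    n = len(b)
    y = [0.0] * n
    for i in range(n):
        y[i] = b[i] - sum(L[i][k] * y[k] for k in range(i))
    x = [0.0] * n
    for i in reversed(range(n)):
        x[i] = (y[i] - sum(U[i][k] * x[k]
                           for k in range(i + 1, n))) / U[i][i]
    return x

A = [[4.0, 1.0], [1.0, 3.0]]
L, U = lu_factor(A)                      # factor once ...
for b in ([1.0, 2.0], [5.0, 6.0]):       # ... solve many right-hand sides
    print(lu_solve(L, U, b))
```

Iterative solvers have no analogous factor to reuse, which is exactly the gap the paper's subspace-reuse conjugate gradient method closes.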
Application of high-performance computing to numerical simulation of human movement
NASA Technical Reports Server (NTRS)
Anderson, F. C.; Ziegler, J. M.; Pandy, M. G.; Whalen, R. T.
1995-01-01
We have examined the feasibility of using massively-parallel and vector-processing supercomputers to solve large-scale optimization problems for human movement. Specifically, we compared the computational expense of determining the optimal controls for the single support phase of gait using a conventional serial machine (SGI Iris 4D25), a MIMD parallel machine (Intel iPSC/860), and a parallel-vector-processing machine (Cray Y-MP 8/864). With the human body modeled as a 14 degree-of-freedom linkage actuated by 46 musculotendinous units, computation of the optimal controls for gait could take up to 3 months of CPU time on the Iris. Both the Cray and the Intel are able to reduce this time to practical levels. The optimal solution for gait can be found with about 77 hours of CPU on the Cray and with about 88 hours of CPU on the Intel. Although the overall speeds of the Cray and the Intel were found to be similar, the unique capabilities of each machine are better suited to different portions of the computational algorithm used. The Intel was best suited to computing the derivatives of the performance criterion and the constraints whereas the Cray was best suited to parameter optimization of the controls. These results suggest that the ideal computer architecture for solving very large-scale optimal control problems is a hybrid system in which a vector-processing machine is integrated into the communication network of a MIMD parallel machine.
A collection of problems for physics teaching
NASA Astrophysics Data System (ADS)
Gröber, S.; Jodl, H.-J.
2010-07-01
Problems are an important instrument for teachers to mediate physics content and for learners to adopt this content. This collection of problems is not only suited to traditional teaching and learning in lectures or student labs, but also to all kinds of new ways of teaching and learning, such as self-study, long-distance teaching, project-oriented learning and the use of remote labs/web experiments. We focus on Rutherford's scattering experiment, electron diffraction, Millikan's experiment and the use of pendulums to measure the dependence of gravitational acceleration on latitude. The collection contains about 50 problems with 160 subtasks and solutions, altogether 100 pages. Structure, content, range and the added value of the problems are described. The whole collection can be downloaded for free from http://rcl.physik.uni-kl.de.
NASA Astrophysics Data System (ADS)
Guenthner, W. R.; Reiners, P. W.
2009-12-01
Despite widespread use of zircon (U-Th)/He thermochronometry in many geologic applications, our understanding of the kinetics of He diffusion in this system is rudimentary. Previous studies have shown that both radiation damage and crystallographic anisotropy may strongly influence diffusion kinetics and ages. We present observations of zircon He ages from multiple single-grain analyses from both detrital and bedrock suites from a wide variety of locations, showing relationships consistent with effects arising from the interaction of radiation damage and anisotropy. Individual zircons in each suite have experienced the same post-depositional or exhumational t-T history but grains appear to have experienced differential He loss that is correlated with effective uranium (eU) content, a proxy for the relative extent of radiation damage within each suite. Several suites of zircons heated to partial resetting upon burial or that have experienced slow cooling show positive correlations between age and eU. Examples of partially reset detrital samples include Cretaceous Sevier foreland basin sandstones buried to ~6-8 km depth, with ages ranging from 88-309 Ma across an eU range of 215-1453 ppm, and Apennines and Olympics greywackes heated to >~120 °C, showing similar trends. Some slowly-cooled bedrock samples also show positive age-eU correlations, suggesting increasing closure temperature with higher extents of radiation damage. Conversely, zircons from cratonal bedrock samples with high levels of radiation damage—measured as accumulated alpha dosage (in this case >~10^18 α/g)—generally show negative age-eU correlations. We interpret these contrasting age-eU relationships as a manifestation of the interaction of radiation damage and anisotropic diffusion: at low damage, He diffusivity is relatively high and preferentially through c-axis-parallel channels. 
As suggested by Farley (2007), however, with increasing damage, channels are progressively blocked and He diffusivity decreases. Eventually, a crystal reaches a threshold level (>~10^18 α/g) wherein radiation damage is so extensive that damage zones become interconnected and He diffusivity increases once again. In order to evaluate these assertions, we conducted a series of step-heating experiments on several pairs of zircon slabs. Individual slabs were crystallographically oriented either orthogonal or parallel to the c-axis and each pair possessed varying degrees of radiation damage. Results from these experiments provide new closure temperature estimates, explain age-eU correlations within a data set, and allow us to construct diffusion models that more accurately describe the t-T history of a given sample.
Bishop, P A; Lee, S M; Conza, N E; Clapp, L L; Moore, A D; Williams, W J; Guilliams, M E; Greenisen, M C
1999-07-01
In the event of an emergency on landing, Space Shuttle crewmembers wearing the Launch and Entry Suit (LES) must stand, move to the hatch, exit the spacecraft with the helmet visor closed breathing 100% O2, and walk or run unassisted to a distance of 380 m upwind from the vehicle. The purpose of this study was to characterize the inspired CO2 and metabolic requirements during a simulated unaided egress from the Space Shuttle in healthy subjects wearing the LES. As a simulation of a Shuttle landing with an unaided egress, 12 male subjects completed a 6-min seated pre-breathe with 100% O2 followed by a 2-min stand and a 5-min walk at 1.56 m·s⁻¹ (5.6 km·h⁻¹, 3.5 mph) with the helmet visor closed. During walks at four different G-suit pressures (0.0, 0.5, 1.0, 1.5 psi; 0.0, 3.4, 6.9, 10.3 kPa), inspired CO2 and walking time were measured. After a 10-min seated recovery, subjects repeated the 5-min walk with the same G-suit pressure and the helmet visor open for the measurement of metabolic rate (VO2). When G-suit inflation levels were 1.0 or 1.5 psi, only one-third of our subjects were able to complete the 5-min visor-closed walk after a 6-min pre-breathe. Inspired CO2 levels measured at the mouth were routinely greater than 4% (30 mmHg) during walking. The metabolic cost at 1.5 psi G-suit inflation was over 135% of the metabolic cost at 0.0 psi. During unaided egress, G-suit inflation pressures of 1.0 and 1.5 psi resulted in elevated CO2 in the LES helmet and an increased metabolic cost of walking, both of which may impair unaided egress performance. Neither the LES, the LES helmet, nor the G-suit was designed for ambulation. Data from this investigation suggest that adapting flight equipment for uses other than those for which it was originally designed can result in unforeseen problems.
Advanced Curation Preparation for Mars Sample Return and Cold Curation
NASA Technical Reports Server (NTRS)
Fries, M. D.; Harrington, A. D.; McCubbin, F. M.; Mitchell, J.; Regberg, A. B.; Snead, C.
2017-01-01
NASA Curation is tasked with the care and distribution of NASA's sample collections, such as the Apollo lunar samples and cometary material collected by the Stardust spacecraft. Curation is also mandated to perform Advanced Curation research and development, which includes improving the curation of existing collections as well as preparing for future sample return missions. Advanced Curation has identified a suite of technologies and techniques that will require attention ahead of Mars sample return (MSR) and missions with cold curation (CCur) requirements, perhaps including comet sample return missions.
Problem- and case-based learning in science: an introduction to distinctions, values, and outcomes.
Allchin, Douglas
2013-01-01
Case-based learning and problem-based learning have demonstrated great promise in reforming science education. Yet an instructor, in newly considering this suite of interrelated pedagogical strategies, faces a number of important instructional choices. Different features and their related values and learning outcomes are profiled here, including: the level of student autonomy; instructional focus on content, skills development, or nature-of-science understanding; the role of history, or known outcomes; scope, clarity, and authenticity of problems provided to students; extent of collaboration; complexity, in terms of number of interpretive perspectives; and, perhaps most importantly, the role of applying versus generating knowledge.
Problem- and Case-Based Learning in Science: An Introduction to Distinctions, Values, and Outcomes
Allchin, Douglas
2013-01-01
Case-based learning and problem-based learning have demonstrated great promise in reforming science education. Yet an instructor, in newly considering this suite of interrelated pedagogical strategies, faces a number of important instructional choices. Different features and their related values and learning outcomes are profiled here, including: the level of student autonomy; instructional focus on content, skills development, or nature-of-science understanding; the role of history, or known outcomes; scope, clarity, and authenticity of problems provided to students; extent of collaboration; complexity, in terms of number of interpretive perspectives; and, perhaps most importantly, the role of applying versus generating knowledge. PMID:24006385
Shared direct memory access on the Explorer 2-LX
NASA Technical Reports Server (NTRS)
Musgrave, Jeffrey L.
1990-01-01
Advances in expert system technology and artificial intelligence have provided a framework for applying automated intelligence to the solution of problems generally perceived as intractable using more classical approaches. As a result, hybrid architectures and parallel processing capability have become more common in computing environments. The Texas Instruments Explorer II-LX is an example of a machine which combines a symbolic processing environment and a computationally oriented environment in a single chassis for integrated problem solutions. This user's manual is an attempt to make these capabilities more accessible to a wider range of engineers and programmers with problems well suited to solution in such an environment.
Code IN Exhibits - Supercomputing 2000
NASA Technical Reports Server (NTRS)
Yarrow, Maurice; McCann, Karen M.; Biswas, Rupak; VanderWijngaart, Rob F.; Kwak, Dochan (Technical Monitor)
2000-01-01
The creation of parameter study suites has recently become a more challenging problem as the parameter studies have become multi-tiered and the computational environment has become a supercomputer grid. The parameter spaces are vast, the individual problem sizes are getting larger, and researchers are seeking to combine several successive stages of parameterization and computation. Simultaneously, grid-based computing offers immense resource opportunities but at the expense of great difficulty of use. We present ILab, an advanced graphical user interface approach to this problem. Our novel strategy stresses intuitive visual design tools for parameter study creation and complex process specification, and also offers programming-free access to grid-based supercomputer resources and process automation.
Characterizing the Effect of Shock on Isotopic Ages I: Ferroan Anorthosite Major Elements
NASA Technical Reports Server (NTRS)
Edmunson, J.; Cohen, B. A.; Spilde, M. N.
2009-01-01
A study underway at Marshall Space Flight Center is further characterizing the effects of shock on isotopic ages. The study was inspired by the work of L. Nyquist et al. [1, 2], but goes beyond their work by investigating the spatial distribution of elements in lunar ferroan anorthosites (FANs) and magnesium-suite (Mg-suite) rocks in order to understand the processes that may influence the radioisotope ages obtained on early lunar samples. This paper discusses the first data set (major elements) obtained on FANs 62236 and 67075.
Assessment of a hybrid finite element and finite volume code for turbulent incompressible flows
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xia, Yidong, E-mail: yidong.xia@inl.gov; Wang, Chuanjin; Luo, Hong
Hydra-TH is a hybrid finite-element/finite-volume incompressible/low-Mach flow simulation code based on the Hydra multiphysics toolkit being developed and used for thermal-hydraulics applications. In the present work, a suite of verification and validation (V&V) test problems for Hydra-TH was defined to meet the design requirements of the Consortium for Advanced Simulation of Light Water Reactors (CASL). The intent of this test problem suite is to provide baseline comparison data that demonstrate the performance of the Hydra-TH solution methods. The simulation problems vary in complexity from laminar to turbulent flows. A set of RANS and LES turbulence models was used in the simulation of four classical test problems. Numerical results obtained by Hydra-TH agreed well with either the available analytical solution or experimental data, indicating a verified and validated implementation of these turbulence models in Hydra-TH. Where possible, some form of solution verification was attempted to identify sensitivities in the solution methods and to suggest best practices when using the Hydra-TH code. Highlights: •We performed a comprehensive study to verify and validate the turbulence models in Hydra-TH. •Hydra-TH delivers 2nd-order grid convergence for the incompressible Navier-Stokes equations. •Hydra-TH can accurately simulate laminar boundary layers. •Hydra-TH can accurately simulate turbulent boundary layers with RANS turbulence models. •Hydra-TH delivers high-fidelity LES capability for simulating turbulent flows in confined space.
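The grid-convergence claim in the Highlights rests on a standard V&V calculation: with solutions on three systematically refined grids, the observed order of accuracy follows from the ratio of successive differences. A minimal sketch, using hypothetical functional values rather than Hydra-TH results:

```python
import math

def observed_order(f_coarse, f_medium, f_fine, r=2.0):
    """Observed order of accuracy p from solutions on three grids with a
    constant refinement ratio r: p = ln[(f1 - f2) / (f2 - f3)] / ln r."""
    return math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)

# Hypothetical functional values (e.g. a drag coefficient) on coarse,
# medium, and fine grids; a second-order scheme should give p close to 2.
p = observed_order(1.1200, 1.0300, 1.0075)
print(round(p, 2))  # → 2.0
```

This is the same estimate that underlies grid-convergence-index reporting in CFD verification studies.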
It looks easy! Heuristics for combinatorial optimization problems.
Chronicle, Edward P; MacGregor, James N; Ormerod, Thomas C; Burr, Alistair
2006-04-01
Human performance on instances of computationally intractable optimization problems, such as the travelling salesperson problem (TSP), can be excellent. We have proposed a boundary-following heuristic to account for this finding. We report three experiments with TSPs where the capacity to employ this heuristic was varied. In Experiment 1, participants free to use the heuristic produced solutions significantly closer to optimal than did those prevented from doing so. Experiments 2 and 3 together replicated this finding in larger problems and demonstrated that a potential confound had no effect. In all three experiments, performance was closely matched by a boundary-following model. The results implicate global rather than purely local processes. Humans may have access to simple, perceptually based, heuristics that are suited to some combinatorial optimization tasks.
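The boundary-following idea is related to a classical construction heuristic for the Euclidean TSP: the optimal tour visits convex-hull vertices in hull order, so one can start from the hull and insert interior points cheaply. The sketch below is an illustration of that hull-based family of heuristics, not the authors' cognitive model:

```python
import math

def convex_hull(points):
    # Andrew's monotone chain; returns hull vertices in order.
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    def cross(o, a, b):
        return (a[0]-o[0])*(b[1]-o[1]) - (a[1]-o[1])*(b[0]-o[0])
    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def hull_insertion_tour(points):
    """Start from the convex hull, then add each remaining city at the
    position of cheapest insertion."""
    tour = convex_hull(points)
    remaining = [p for p in points if p not in tour]
    while remaining:
        best = None
        for p in remaining:
            for i in range(len(tour)):
                a, b = tour[i], tour[(i + 1) % len(tour)]
                cost = dist(a, p) + dist(p, b) - dist(a, b)
                if best is None or cost < best[0]:
                    best = (cost, p, i + 1)
        _, p, i = best
        tour.insert(i, p)
        remaining.remove(p)
    return tour

pts = [(0, 0), (4, 0), (4, 4), (0, 4), (2, 2), (1, 3)]
tour = hull_insertion_tour(pts)
print(len(tour))  # → 6
```

The hull step mirrors the experimental manipulation in the paper: when participants can exploit the boundary, good tours come cheaply.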
Final report on mid-polarity analytes in food matrix: mid-polarity pesticides in tea
NASA Astrophysics Data System (ADS)
Sin, Della W. M.; Li, Hongmei; Wong, S. K.; Lo, M. F.; Wong, Y. L.; Wong, Y. C.; Mok, C. S.
2015-01-01
At the Paris meeting in April 2011, the CCQM Working Group on Organic Analysis (OAWG) agreed on a suite of Track A studies meant to support the assessment of measurement capabilities needed for the delivery of measurement services within the scope of the OAWG Terms of Reference. One of the ten Track A studies discussed and agreed upon to support the 5-year plan of the CCQM Core Competence assessment was CCQM-K95 'Mid-Polarity Analytes in Food Matrix: Mid-Polarity Pesticides in Tea'. This key comparison was co-organized by the Government Laboratory of the Hong Kong Special Administrative Region (GL) and the National Institute of Metrology, China (NIM). To allow wider participation, a pilot study, CCQM-P136, was run in parallel. Participants' capabilities in measuring mid-polarity analytes in a food matrix were demonstrated through this key comparison. Most of the participating NMIs/DIs successfully measured beta-endosulfan and endosulfan sulphate in the sample; however, there is room for further improvement for some participants. This key comparison involved not only extraction, clean-up, analytical separation and selective detection of the analytes in a complex food matrix, but also the pre-treatment procedures applied to the material before the extraction process. The problem of incomplete extraction of the incurred analytes from the sample matrix may not be observed simply by using spike recovery. The relative standard deviations for the data included in the KCRV calculation in this key comparison were less than 7%, which was acceptable given the complexity of the matrix, the level of the analytes and the complexity of the analytical procedure. This text appears in Appendix B of the BIPM key comparison database, kcdb.bipm.org.
The final report has been peer-reviewed and approved for publication by CCQM, according to the provisions of the CIPM Mutual Recognition Arrangement (CIPM MRA).
Liu, Hui; Chen, Fu; Sun, Huiyong; Li, Dan; Hou, Tingjun
2017-04-11
By means of estimators based on non-equilibrium work, equilibrium free energy differences or potentials of mean force (PMFs) of a system of interest can be computed from biased molecular dynamics (MD) simulations. The approach, however, is often plagued by slow conformational sampling and poor convergence, especially when the solvent effects are taken into account. Here, as a possible way to alleviate the problem, several widely used implicit-solvent models, which are derived from the analytic generalized Born (GB) equation and implemented in the AMBER suite of programs, were employed in free energy calculations based on non-equilibrium work and evaluated for their abilities to emulate explicit water. As a test case, pulling MD simulations were carried out on an alanine polypeptide with different solvent models and protocols, followed by comparisons of the reconstructed PMF profiles along the unfolding coordinate. The results show that when employing the non-equilibrium work method, sampling with an implicit-solvent model is several times faster and, more importantly, converges more rapidly than that with explicit water due to reduction of dissipation. Among the assessed GB models, the Neck variants outperform the OBC and HCT variants in terms of accuracy, whereas their computational costs are comparable. In addition, for the best-performing models, the impact of the solvent-accessible surface area (SASA) dependent nonpolar solvation term was also examined. The present study highlights the advantages of implicit-solvent models for non-equilibrium sampling.
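The simplest estimator based on non-equilibrium work is the Jarzynski equality, ΔF = -kT ln⟨exp(-W/kT)⟩, averaged over repeated pulls. A minimal numerical sketch with synthetic Gaussian work values (not simulation data; the temperature and work parameters are illustrative):

```python
import numpy as np

kT = 0.593  # kcal/mol at roughly 298 K

def jarzynski_free_energy(work, kT):
    """Jarzynski estimator ΔF = -kT ln ⟨exp(-W/kT)⟩, computed with a
    log-sum-exp shift for numerical stability."""
    w = np.asarray(work, dtype=float)
    wmin = w.min()
    return wmin - kT * np.log(np.mean(np.exp(-(w - wmin) / kT)))

# For Gaussian-distributed work the exact answer is ΔF = <W> - σ²/(2kT),
# here 3.0 - 0.25/(2·0.593), about 2.79 kcal/mol.
rng = np.random.default_rng(1)
work = rng.normal(loc=3.0, scale=0.5, size=200_000)
dF = jarzynski_free_energy(work, kT)
print(round(dF, 2))
```

The estimate always lies at or below the mean work (the dissipated part is subtracted), which is why slow convergence with broad, dissipative work distributions is the practical problem the abstract addresses.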
Therapeutic Exercise for Body Alignment and Function.
ERIC Educational Resources Information Center
Daniels, Lucille; Worthingham, Catherine
This textbook is designed for the use of persons dealing with the problems of body alignment and function, primarily the physical therapist, the physical educator, and the physician. Those procedures are included that appear to be best suited for prevention of disability, improvement of impaired function, and maintenance of the optimum level of…
Pen-Enabled, Real-Time Student Engagement for Teaching in STEM Subjects
ERIC Educational Resources Information Center
Urban, Sylvia
2017-01-01
The introduction of pen-enabling devices has been demonstrated to increase a student's ability to solve problems, communicate, and learn during note taking. For the science, technology, engineering, and mathematics subjects that are considered to be symbolic in nature, pen interfaces are better suited for visual-spatial content and also provide a…
Dan Says - Continuum Magazine | NREL
Dan Says: Leading Energy Systems Integration. [Headshot photo and credit omitted.] ...dedicated to solving the complex problems associated with energy systems integration (ESI) on a national scale. Our 185,000-square-foot Energy Systems Integration Facility (ESIF) is designed to provide a
Heaping-Induced Bias in Regression-Discontinuity Designs. NBER Working Paper No. 17408
ERIC Educational Resources Information Center
Barreca, Alan I.; Lindo, Jason M.; Waddell, Glen R.
2011-01-01
This study uses Monte Carlo simulations to demonstrate that regression-discontinuity designs arrive at biased estimates when attributes related to outcomes predict heaping in the running variable. After showing that our usual diagnostics are poorly suited to identifying this type of problem, we provide alternatives. We also demonstrate how the…
Ten Problems in Artificial Intelligence.
1987-01-01
[OCR-garbled distribution list omitted.] ...old ones lightly in order to fit new data. A good model for this kind of knowledge structure revision is ... When people have a predictive...
The Problem of Defensive Medicine
ERIC Educational Resources Information Center
Barondess, Jeremiah A.; Tancredi, Laurence R.
1978-01-01
Defensive medicine (the use of diagnostic and end-treatment measures explicitly for the purposes of averting malpractice suits) is frequently cited as one of the least desirable effects of the current rise in medical litigation. It is claimed that defensive medicine is responsible for the rising cost of health care and the exposure of patients to…
Rendering of Foreign Language Inclusions in the Russian Translations of the Novels by Graham Greene
ERIC Educational Resources Information Center
Valeeva, Roza A.; Martynova, Irina N.
2016-01-01
The importance of the problem under discussion is preconditioned by the scientific inquiry into the best variants of translating foreign language inclusions, variants which would suit the original narration in the source text stylistically, emotionally and conceptually, and would also fully project the author's communicative intention in every particular case. The…
The Monty Hall Problem as a Class Activity Using Clickers
ERIC Educational Resources Information Center
Irons, Stephen H.
2012-01-01
Demonstrating probabilistic outcomes using real-time data is especially well-suited to larger lecture classes where one can generate large data sets easily. The difficulty comes in quickly collecting, analyzing, and displaying the information. With the advent of wireless polling technology (clickers), this difficulty is removed. In this paper we…
Applied Behavior Analysis: Beyond Discrete Trial Teaching
ERIC Educational Resources Information Center
Steege, Mark W.; Mace, F. Charles; Perry, Lora; Longenecker, Harold
2007-01-01
We discuss the problem of autism-specific special education programs representing themselves as Applied Behavior Analysis (ABA) programs when the only ABA intervention employed is Discrete Trial Teaching (DTT), and often for limited portions of the school day. Although DTT has many advantages to recommend its use, it is not well suited to teach…
Education and Middle Manpower Development in Malaysia.
ERIC Educational Resources Information Center
Harris, Norman C.
One of the essential factors in the economic development of nations is the attainment of a manpower mix which is strategically suited to current development problems, and which will also provide a catalyst for improvement and change. A review of the literature indicates that, although education per se is important, individual countries must…
ERIC Educational Resources Information Center
Ercolani, Gianfranco
2005-01-01
The finite-difference boundary-value method is a numerical method suited to the solution of the one-dimensional Schrödinger equation encountered in problems of hindered rotation. The application of the method, in combination with experimental results, to the evaluation of the rotational energy barrier in ethane is also presented.
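A minimal sketch of the approach: discretize the torsional angle φ on a periodic grid, build the finite-difference Hamiltonian for -B ψ'' + V(φ)ψ = Eψ with a threefold barrier V(φ) = (V3/2)(1 - cos 3φ), and diagonalize. The B and V3 values are illustrative, not the ethane parameters evaluated in the article:

```python
import numpy as np

N = 200
phi = np.linspace(0.0, 2.0 * np.pi, N, endpoint=False)
h = phi[1] - phi[0]
B = 10.0      # effective rotational constant in cm^-1 (assumed value)
V3 = 1000.0   # threefold barrier height in cm^-1 (assumed value)
V = 0.5 * V3 * (1.0 - np.cos(3.0 * phi))

# Second-derivative operator with periodic boundary conditions.
D2 = -2.0 * np.eye(N) + np.eye(N, k=1) + np.eye(N, k=-1)
D2[0, -1] = D2[-1, 0] = 1.0

H = -B * D2 / h**2 + np.diag(V)
E = np.linalg.eigvalsh(H)   # eigenvalues in ascending order
print(E[:4])  # lowest torsional levels; the bottom three are nearly degenerate
```

Below the barrier the levels cluster into near-degenerate tunneling multiplets, which is the signature one fits against torsional spectra to extract the barrier height.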
ERIC Educational Resources Information Center
Ma, Angela Kit Fong; O'Toole, John Mitchell
2013-01-01
The study described in this paper investigated how the major stakeholders of a teacher education institution responded to a particular suite of educational products that involved video-based educational learning objects. It aims to look into stakeholder attitudes to potential technological development in fostering student-centred learning in…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Edmunson, J; E.Borg, L; Nyquist, L E
2008-11-17
Lunar Mg-suite norite 78238 was dated using the Sm-Nd, Rb-Sr, and U-Pb isotopic systems in order to constrain the age of lunar magma ocean solidification and the beginning of Mg-suite magmatism, as well as to provide a direct comparison between the three isotopic systems. The Sm-Nd isotopic system yields a crystallization age for 78238 of 4334 ± 37 Ma and an initial ε¹⁴³Nd value of -0.27 ± 0.74. The age-initial ε¹⁴³Nd (T-I) systematics of a variety of KREEP-rich samples, including 78238 and other Mg-suite rocks, KREEP basalts, and olivine cumulate NWA 773, suggest that lunar differentiation was completed by 4492 ± 61 Ma, assuming a Chondritic Uniform Reservoir bulk composition for the Moon. The Rb-Sr isotopic systematics of 78238 were disturbed by post-crystallization processes. Nevertheless, selected data points yield two Rb-Sr isochrons. One is concordant with the Sm-Nd crystallization age, 4366 ± 53 Ma. The other is 4003 ± 95 Ma and is concordant with an Ar-Ar age for 78236. The ²⁰⁷Pb-²⁰⁶Pb age of 4333 ± 59 Ma is concordant with the Sm-Nd age. The U-Pb isotopic systematics of 78238 yield linear arrays equivalent to younger ages than the Pb-Pb system, and may reflect fractionation of U and Pb during sample handling. Despite the disturbed nature of the U-Pb systems, a time-averaged μ (²³⁸U/²⁰⁴Pb) value of the source can be estimated at 27 ± 30 from the Pb-Pb isotopic systematics. Because KREEP-rich samples are likely to be derived from source regions with the highest U/Pb ratios, the relatively low μ value calculated for the 78238 source suggests the bulk Moon does not have an exceedingly high μ value.
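For context, an isochron age follows from the isochron slope via t = ln(1 + slope)/λ. A quick numerical round trip at the Sm-Nd crystallization age reported above, using the conventional ¹⁴⁷Sm decay constant:

```python
import math

LAMBDA_SM147 = 6.54e-12   # conventional 147Sm decay constant, per year

def isochron_age(slope, lam):
    """Age from an isochron slope: t = ln(1 + slope) / λ."""
    return math.log1p(slope) / lam

def isochron_slope(age_yr, lam):
    """Inverse relation: slope = exp(λt) - 1."""
    return math.expm1(lam * age_yr)

# Round trip at 4334 Ma, the Sm-Nd crystallization age of 78238.
slope = isochron_slope(4.334e9, LAMBDA_SM147)
age_Ma = isochron_age(slope, LAMBDA_SM147) / 1e6
print(round(age_Ma))  # → 4334
```

The same relation, with the appropriate decay constant, applies to the Rb-Sr and Pb-Pb systems discussed in the abstract.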
Sivanandan, Indu; Bowker, Karen E; Bannister, Gordon C; Soar, Jasmeet
2011-02-01
Surgical site infections are one of the most important causes of healthcare associated infections (HCAI), accounting for 20% of all HCAIs. Surgical site infections affect 1% of joint replacement operations. This study was designed to assess whether theatre clothing is contaminated more inside or outside the theatre suite. Petri dishes filled with horse blood agar were pressed on theatre clothes at 0, 2, 4, 6 and 8 hours to sample bacterial contamination in 20 doctors whilst working in and outside the theatre suite. The results showed that there was greater bacterial contamination when outside the theatre suite at 2 hours. There were no differences in the amount of contamination at 4, 6 and 8 hours. This study suggests that the level of contamination of theatre clothes is similar both inside and outside the theatre setting.
Design of a device to remove lunar dust from space suits for the proposed lunar base
NASA Technical Reports Server (NTRS)
Harrington, David; Havens, Jack; Hester, Daniel
1990-01-01
The National Aeronautics and Space Administration plans to begin construction of a lunar base soon after the turn of the century. During the Apollo missions, lunar dust proved to be a problem because the dust adhered to all exposed material surfaces. Since lunar dust will be a problem during the establishment and operation of this base, the need exists for a device to remove the dust from space suits before the astronauts enter clean environments. The physical properties of lunar dust were characterized and energy methods for removing the dust were identified. Eight alternate designs were developed to remove the dust. The final design uses a brush and gas jet to remove the dust. The brush bristles are made from Kevlar fibers and the gas jet uses pressurized carbon dioxide from a portable tank. A throttling valve allows variable gas flow. Also, the tank is insulated with Kapton and electrically heated to prevent condensation of the carbon dioxide when the tank is exposed to the cold (-240 °F) lunar night.
Methods for biological data integration: perspectives and challenges
Gligorijević, Vladimir; Pržulj, Nataša
2015-01-01
Rapid technological advances have led to the production of different types of biological data and enabled construction of complex networks with various types of interactions between diverse biological entities. Standard network data analysis methods were shown to be limited in dealing with such heterogeneous networked data and consequently, new methods for integrative data analyses have been proposed. The integrative methods can collectively mine multiple types of biological data and produce more holistic, systems-level biological insights. We survey recent methods for collective mining (integration) of various types of networked biological data. We compare different state-of-the-art methods for data integration and highlight their advantages and disadvantages in addressing important biological problems. We identify the important computational challenges of these methods and provide a general guideline for which methods are suited for specific biological problems, or specific data types. Moreover, we propose that recent non-negative matrix factorization-based approaches may become the integration methodology of choice, as they are well suited and accurate in dealing with heterogeneous data and have many opportunities for further development. PMID:26490630
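The non-negative matrix factorization the survey favors can be sketched with the classic Lee-Seung multiplicative updates. This toy single-matrix version illustrates the mechanics only; the integrative, multi-network machinery the survey discusses is omitted, and the data matrix is synthetic:

```python
import numpy as np

def nmf(X, k, iters=500, seed=0):
    """Basic NMF, X ≈ W @ H with W, H >= 0, by Lee-Seung multiplicative
    updates minimizing the Frobenius reconstruction error."""
    rng = np.random.default_rng(seed)
    n, m = X.shape
    W = rng.random((n, k)) + 1e-3
    H = rng.random((k, m)) + 1e-3
    eps = 1e-9  # guards against division by zero
    for _ in range(iters):
        H *= (W.T @ X) / (W.T @ W @ H + eps)
        W *= (X @ H.T) / (W @ (H @ H.T) + eps)
    return W, H

# Toy "association matrix" with two obvious blocks, recovered at rank 2.
X = np.array([[5., 5., 0., 0.],
              [5., 5., 0., 0.],
              [0., 0., 3., 3.],
              [0., 0., 3., 3.]])
W, H = nmf(X, 2)
err = np.linalg.norm(X - W @ H) / np.linalg.norm(X)
print(err < 0.05)
```

The multiplicative form keeps the factors non-negative at every step, which is what makes the factors interpretable as additive parts, the property the survey highlights for heterogeneous biological data.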
As the Human Exposure Program focuses on the exposure of children to pesticides, there are concerns about the effect, or perceived effect, of components of the sampling procedure on the health and well-being of the infant and the ability to collect pesticide residues.
One...
NASA Technical Reports Server (NTRS)
Merrick, E. B.
1979-01-01
An alternative space suit insulation concept using a monolayer woven pile material is discussed. The material reduces cost and improves the durability of the overgarment, while providing protection similar to that provided by multilayer insulation (MLI). Twelve samples of different configurations were fabricated and tested for compressibility and thermal conductivity as a function of compression loading. Two samples which showed good results in the initial tests were further tested for thermal conductivity with respect to ambient pressure and temperature. Results of these tests were similar to results of the MLI tests, indicating the potential of the monolayer fabric to replace the present MLI. A seaming study illustrated that the fabric can be sewn in a structurally sound seam with minimal heat loss. It is recommended that a prototype thermal meteoroid garment be fabricated.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moran, James; Alexander, Thomas; Aalseth, Craig
2017-08-01
Previous measurements have demonstrated the wealth of information that tritium (T) can provide on environmentally relevant processes. We present modifications to sample preparation approaches that enable T measurement by proportional counting on small sample sizes equivalent to 120 mg of water and demonstrate the accuracy of these methods on a suite of standardized water samples. This enhanced method should provide the analytical flexibility needed to address persistent knowledge gaps in our understanding of T behavior in the environment.
Ultraviolet Testing of Space Suit Materials for Mars
NASA Technical Reports Server (NTRS)
Larson, Kristine; Fries, Marc
2017-01-01
Human missions to Mars may require radical changes in the approach to extravehicular activity (EVA) suit design. A major challenge is building a suit robust enough to complete multiple EVAs under intense ultraviolet (UV) light exposure without losing mechanical strength or compromising the suit's mobility. To study how the materials degrade on Mars in situ, the Jet Propulsion Laboratory (JPL) invited the Advanced Space Suit team at NASA's Johnson Space Center (JSC) to place space suit materials on the calibration target of the Scanning Habitable Environments with Raman & Luminescence for Organics and Chemicals (SHERLOC) instrument on the Mars 2020 rover. In order to select materials for the rover and understand the effects of Mars-equivalent UV exposure, JSC conducted ground testing on both current and new space suit materials exposed to 2500 hours of Mars mission equivalent UV. To complete this testing, JSC partnered with NASA's Marshall Space Flight Center to utilize their UV vacuum chambers. Materials tested were Orthofabric, polycarbonate, Teflon, Dacron, Vectran, Spectra, bladder material, nGimat-coated Teflon, and nGimat-coated Orthofabric. All samples were measured for mass, tensile strength, and chemical composition before and after irradiation. Mass loss was insignificant (less than 0.5%) among the materials. Most materials lost tensile strength after irradiation and became more brittle, with a loss of elongation. Changes in chemical composition were seen in all irradiated materials through spectral analysis. Results from this testing helped select the materials that will fly on the Mars 2020 rover. In addition, JSC can use these data to correlate the chemical changes after irradiation, which is what the rover will send back from Mars, to the mechanical changes, such as tensile strength.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shervais, J.W.; McGee, J.J.
1998-09-01
Most of the Moon's highland crust comprises Fe-rich anorthosites with calcic plagioclase compositions. Subsequent evolution of the highland crust was dominated by troctolites, anorthosites, and norites of the Mg-suite. This plutonic series is characterized by calcic plagioclase, and by mafic minerals with high mg# (= 100 × Mg/[Mg + Fe]). In an effort to distinguish the origin of this important lunar rock series, the authors have analyzed the REE content of primary cumulus phases in ten Mg-suite cumulates using SIMS, along with their major and minor element compositions by electron microprobe analysis. Nine of these samples have high mg#s, consistent with their formation from the most primitive parent melts of the Mg-suite. The data presented here show that Mg-suite troctolites and anorthosites preserve major and trace element characteristics acquired during their formation as igneous cumulate rocks and that these characteristics can be used to reconstruct related aspects of the parent magma composition. Data show that primitive cumulates of the Mg-suite crystallized from magmas with REE contents similar to high-K KREEP in both concentration and relative abundance. The highly enriched nature of this parent magma contrasts with its primitive major element characteristics, as pointed out by previous workers. This enigma is best explained by the mixing of residual magma ocean urKREEP melts with ultramagnesian komatiitic partial melts from the deep lunar interior. The data do not support earlier models that invoke crustal metasomatism to enrich the Mg-suite cumulates after formation, or models which call for a superKREEP parent for the troctolites and anorthosites.
Charnockites and granites of the western Adirondacks, New York, USA: a differentiated A-type suite
Whitney, P.R.
1992-01-01
Granitic rocks in the west-central Adirondack Highlands of New York State include both relatively homogeneous charnockitic and hornblende granitic gneisses (CG), which occur in thick stratiform bodies and elliptical domes, and heterogeneous leucogneisses (LG), which are commonly interlayered with metasedimentary rocks. Major- and trace-element geochemical analyses were obtained for 115 samples, including both types of granitoids. Data for CG fail to show the presence of more than one distinct group based on composition. Most of the variance within the CG sample population is consistent with magmatic differentiation combined with incomplete separation of early crystals of alkali feldspar, plagioclase, and pyroxenes or amphibole from the residual liquid. Ti, Fe, Mg, Ca, P, Sr, Ba, and Zr decrease with increasing silica, while Rb and K increase. Within CG, the distinction between charnockitic (orthopyroxene-bearing) and granitic gneisses is correlated with bulk chemistry. The charnockites are consistently more mafic than the hornblende granitic gneisses, although forming a continuum with them. The leucogneisses, while generally more felsic than the charnockites and granitic gneisses, are otherwise geochemically similar to them. The data are consistent with the LG suite being an evolved extrusive equivalent of the intrusive CG suite. Both CG and LG suites are metaluminous to mildly peraluminous and display an A-type geochemical signature, enriched in Fe, K, Ce, Y, Nb, Zr, and Ga and depleted in Ca, Mg, and Sr relative to I- and S-type granites. Rare earth element patterns show moderate LREE enrichment and a negative Eu anomaly throughout the suite. The geochemical data suggest an origin by partial melting of biotite- and plagioclase-rich crustal rocks. Emplacement occurred in an anorogenic or post-collisional tectonic setting, probably at relatively shallow depths.
Deformation and granulite-facies metamorphism, with some partial melting, followed during the Ottawan phase of the Grenville Orogeny, yielding the present migmatitic granitic and charnockitic gneisses.
Guturu, Parthasarathy; Dantu, Ram
2008-06-01
Many graph- and set-theoretic problems, because of their tremendous application potential and theoretical appeal, have been well investigated by researchers in complexity theory and found to be NP-hard. Since the combinatorial complexity of these problems does not permit exhaustive searches for optimal solutions, only near-optimal solutions can be explored, using either various problem-specific heuristic strategies or metaheuristic global-optimization methods such as simulated annealing and genetic algorithms. In this paper, we propose a unified evolutionary algorithm (EA) for the problems of maximum clique finding, maximum independent set, minimum vertex cover, subgraph and double subgraph isomorphism, set packing, set partitioning, and set cover. In the proposed approach, we first map these problems onto the maximum clique-finding problem (MCP), which is later solved using an evolutionary strategy. The proposed impatient EA with probabilistic tabu search (IEA-PTS) for the MCP integrates the best features of earlier successful approaches with a number of new heuristics that we developed to yield a performance that advances the state of the art in EAs for the exploration of maximum cliques in a graph. Results of experimentation with the 37 DIMACS benchmark graphs, and comparative analyses with six state-of-the-art algorithms, including two from the smaller EA community and four from the larger metaheuristics community, indicate that the IEA-PTS outperforms the EAs with respect to a Pareto-lexicographic ranking criterion and offers competitive performance on some graph instances when individually compared to the other heuristic algorithms. It has also successfully set a new benchmark on one graph instance. On another benchmark suite, called Benchmarks with Hidden Optimal Solutions, IEA-PTS ranks second, after a very recent algorithm called COVER, among its peers that have experimented with this suite.
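The core reduction used by the unified EA, mapping each problem onto maximum clique finding, can be illustrated with the textbook case: a maximum independent set of a graph is a maximum clique of its complement. A minimal brute-force sketch (illustrative only; the paper's IEA-PTS is a far more capable stochastic search):

```python
from itertools import combinations

def max_clique(n, edges):
    """Brute-force maximum clique on an n-vertex graph (small n only)."""
    adj = {(min(u, v), max(u, v)) for u, v in edges}
    for size in range(n, 0, -1):
        for cand in combinations(range(n), size):
            if all((min(u, v), max(u, v)) in adj for u, v in combinations(cand, 2)):
                return list(cand)
    return []

def max_independent_set(n, edges):
    """Reduce MIS to maximum clique on the complement graph."""
    present = {(min(u, v), max(u, v)) for u, v in edges}
    complement = [(u, v) for u, v in combinations(range(n), 2)
                  if (u, v) not in present]
    return max_clique(n, complement)

# 5-cycle: both the maximum clique and the maximum independent set have size 2.
cycle = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]
print(len(max_clique(5, cycle)))           # 2
print(len(max_independent_set(5, cycle)))  # 2
```

The same complement trick covers minimum vertex cover as well, since the complement of an independent set within the vertex set is a vertex cover.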
An event-based architecture for solving constraint satisfaction problems
Mostafa, Hesham; Müller, Lorenz K.; Indiveri, Giacomo
2015-01-01
Constraint satisfaction problems are ubiquitous in many domains. They are typically solved using conventional digital computing architectures that do not reflect the distributed nature of many of these problems, and are thus ill-suited for solving them. Here we present a parallel analogue/digital hardware architecture specifically designed to solve such problems. We cast constraint satisfaction problems as networks of stereotyped nodes that communicate using digital pulses, or events. Each node contains an oscillator implemented using analogue circuits. The non-repeating phase relations among the oscillators drive the exploration of the solution space. We show that this hardware architecture can yield state-of-the-art performance on random SAT problems under reasonable assumptions on the implementation. We present measurements from a prototype electronic chip to demonstrate that a physical implementation of the proposed architecture is robust to practical non-idealities and to validate the theory proposed. PMID:26642827
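For contrast with the analogue oscillator network described above, the same problem class is conventionally attacked in software by stochastic local search. The sketch below is a generic random-walk SAT solver, offered only as a software baseline for the problem class, not the paper's hardware method:

```python
import random

def walk_sat(clauses, n_vars, max_flips=10000, seed=1):
    """Stochastic local search for SAT. Clauses are lists of signed
    integers: 3 means variable 3 is true, -3 means variable 3 is false.
    Repeatedly flip a variable from some unsatisfied clause."""
    rng = random.Random(seed)
    assign = {v: rng.random() < 0.5 for v in range(1, n_vars + 1)}
    sat = lambda lit: assign[abs(lit)] == (lit > 0)
    for _ in range(max_flips):
        unsat = [c for c in clauses if not any(sat(l) for l in c)]
        if not unsat:
            return assign                      # all clauses satisfied
        flip = abs(rng.choice(rng.choice(unsat)))
        assign[flip] = not assign[flip]
    return None                                # gave up

# (x1 or x2) and (not x1 or x3) and (not x2 or not x3) and (x1 or not x3)
formula = [[1, 2], [-1, 3], [-2, -3], [1, -3]]
model = walk_sat(formula, 3)
print(model is not None)  # True
```

Where the hardware architecture lets non-repeating oscillator phases drive the exploration of the solution space, this software baseline uses explicit pseudorandom flips.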
Wing, Steve; Richardson, David B.; Hoffmann, Wolfgang
2011-01-01
Background: In April 2010, the U.S. Nuclear Regulatory Commission asked the National Academy of Sciences to update a 1990 study of cancer risks near nuclear facilities. Prior research on this topic has suffered from problems in hypothesis formulation and research design. Objectives: We review epidemiologic principles used in studies of generic exposure–response associations and in studies of specific sources of exposure. We then describe logical problems with assumptions, formation of testable hypotheses, and interpretation of evidence in previous research on cancer risks near nuclear facilities. Discussion: Advancement of knowledge about cancer risks near nuclear facilities depends on testing specific hypotheses grounded in physical and biological mechanisms of exposure and susceptibility while considering sample size and ability to adequately quantify exposure, ascertain cancer cases, and evaluate plausible confounders. Conclusions: Next steps in advancing knowledge about cancer risks near nuclear facilities require studies of childhood cancer incidence, focus on in utero and early childhood exposures, use of specific geographic information, and consideration of pathways for transport and uptake of radionuclides. Studies of cancer mortality among adults, cancers with long latencies, large geographic zones, and populations that reside at large distances from nuclear facilities are better suited for public relations than for scientific purposes. PMID:21147606
Ethical aspects of aging research.
Seppet, Enn; Pääsuke, Mati; Conte, Maria; Capri, Miriam; Franceschi, Claudio
2011-12-01
During the last 50-60 years, owing to the development of medical care and hygienically safe living conditions, the average life span of European citizens has substantially increased, with rapid growth of the population older than 65 years. This trend places an ever-growing medical and economic burden on society, as many older subjects suffer from age-related diseases and frailty. Coping with these problems requires not only appropriate medical treatment and social support but also extensive research in many fields of aging, from biology to sociology, with the involvement of older people as research subjects. This work anticipates the development and application of ethical standards suited to dynamic advances in aging research. The aim of this review is to update knowledge of the ethical requirements for the recruitment of older research subjects, the obtaining of informed consent, the collection of biological samples, and the use of stem cells in preclinical and clinical settings. It is concluded that applying an adequate ethical platform markedly facilitates the recruitment of older persons for participation in research. Currently, the basic ethical concepts are subject to extensive discussion, with the participation of all interested parties, in order to guarantee successful research on problems of human aging, protect older people from undesired interference, and afford them benefits through supporting innovations in research, therapy, and care.
CCP4i2: the new graphical user interface to the CCP4 program suite.
Potterton, Liz; Agirre, Jon; Ballard, Charles; Cowtan, Kevin; Dodson, Eleanor; Evans, Phil R; Jenkins, Huw T; Keegan, Ronan; Krissinel, Eugene; Stevenson, Kyle; Lebedev, Andrey; McNicholas, Stuart J; Nicholls, Robert A; Noble, Martin; Pannu, Navraj S; Roth, Christian; Sheldrick, George; Skubak, Pavol; Turkenburg, Johan; Uski, Ville; von Delft, Frank; Waterman, David; Wilson, Keith; Winn, Martyn; Wojdyr, Marcin
2018-02-01
The CCP4 (Collaborative Computational Project, Number 4) software suite for macromolecular structure determination by X-ray crystallography brings together many programs and libraries that, by means of well-established conventions, interoperate effectively without adhering to strict design guidelines. Because of this inherent flexibility, users are often presented with diverse, even divergent, choices for solving every type of problem. Recently, CCP4 introduced CCP4i2, a modern graphical interface designed to help structural biologists to navigate the process of structure determination, with an emphasis on pipelining and the streamlined presentation of results. In addition, CCP4i2 provides a framework for writing structure-solution scripts that can be built up incrementally to create increasingly automatic procedures.
Cleaning By Blasting With Pellets Of Dry Ice
NASA Technical Reports Server (NTRS)
Fody, Jody
1993-01-01
Dry process strips protective surface coats from parts to be cleaned, without manual scrubbing. Does not involve use of flammable or toxic solvents. Used to remove coats from variety of materials, including plastics, ceramics, ferrous and nonferrous metals, and composites. Adds no chemical-pollution problem to problem of disposal of residue of coating material. Process consists of blasting solid carbon dioxide (dry ice) pellets at surface to be cleaned. Pellets sublime on impact and pass into atmosphere as carbon dioxide gas. Size, hardness, velocity, and quantity of pellets adjusted to suit coating material and substrate.
A comparison of the finite difference and finite element methods for heat transfer calculations
NASA Technical Reports Server (NTRS)
Emery, A. F.; Mortazavi, H. R.
1982-01-01
The finite difference method and finite element method for heat transfer calculations are compared by describing their bases and their application to some common heat transfer problems. In general it is noted that neither method is clearly superior, and in many instances, the choice is quite arbitrary and depends more upon the codes available and upon the personal preference of the analyst than upon any well defined advantages of one method. Classes of problems for which one method or the other is better suited are defined.
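The finite difference side of such a comparison is easy to sketch: an explicit scheme for the 1D heat equation u_t = α u_xx replaces derivatives with difference quotients on a grid, whereas a finite element code would instead assemble mass and stiffness matrices from local basis functions. A minimal sketch with assumed (textbook) boundary and initial conditions:

```python
import math

def heat_fd(nx=21, nt=200, alpha=1.0, L=1.0, dt=None):
    """Explicit finite-difference solution of u_t = alpha * u_xx on [0, L]
    with u = 0 at both ends and a sine initial profile."""
    dx = L / (nx - 1)
    if dt is None:
        dt = 0.4 * dx * dx / alpha   # respect stability limit dt <= dx^2/(2*alpha)
    u = [math.sin(math.pi * i * dx / L) for i in range(nx)]
    for _ in range(nt):
        un = u[:]
        for i in range(1, nx - 1):   # interior nodes; boundaries stay at 0
            u[i] = un[i] + alpha * dt / dx**2 * (un[i + 1] - 2 * un[i] + un[i - 1])
    return u, nt * dt

u, t = heat_fd()
# exact separated solution at the midpoint: exp(-pi^2 * alpha * t) * sin(pi/2)
exact = math.exp(-math.pi**2 * t)
print(abs(u[10] - exact) < 1e-2)  # True
```

The explicit stencil above is trivial to code but stability-limited in its time step; that trade-off is one of the practical considerations behind the "choice is quite arbitrary" conclusion.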
A new asymptotic method for jump phenomena
NASA Technical Reports Server (NTRS)
Reiss, E. L.
1980-01-01
Physical phenomena involving rapid and sudden transitions, such as snap buckling of elastic shells, explosions, and earthquakes, are characterized mathematically as a small disturbance causing a large-amplitude response. Because of this, standard asymptotic and perturbation methods are ill-suited to these problems. In the present paper, a new method of analyzing jump phenomena is proposed. The principal feature of the method is the representation of the response in terms of rational functions. For illustration, the method is applied to the snap buckling of an elastic arch and to a simple combustion problem.
An implementation and analysis of the Abstract Syntax Notation One and the basic encoding rules
NASA Technical Reports Server (NTRS)
Harvey, James D.; Weaver, Alfred C.
1990-01-01
The details of the Abstract Syntax Notation One (ASN.1) standard and the Basic Encoding Rules (BER) standard, which collectively solve the problem of data transfer across incompatible host environments, are presented, and a compiler that was built to automate their use is described. Experiences with this compiler are also discussed, providing a quantitative analysis of the performance costs associated with the application of these standards. An evaluation is offered as to how well suited ASN.1 and BER are to solving the common data representation problem.
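BER's role can be illustrated on the simplest case, an ASN.1 INTEGER, which encodes as a tag octet (0x02), a length octet, and big-endian two's-complement content octets. A minimal sketch (short-form lengths only; real BER also defines long-form and indefinite lengths):

```python
def ber_encode_integer(value):
    """Minimal BER/DER encoding of an ASN.1 INTEGER (tag 0x02):
    shortest two's-complement big-endian content octets."""
    if value == 0:
        content = b"\x00"
    else:
        length = (value.bit_length() + 8) // 8   # leaves room for the sign bit
        content = value.to_bytes(length, "big", signed=True)
        # strip redundant leading octets while keeping the sign unambiguous
        while len(content) > 1 and (
            (content[0] == 0x00 and content[1] < 0x80)
            or (content[0] == 0xFF and content[1] >= 0x80)
        ):
            content = content[1:]
    return bytes([0x02, len(content)]) + content

print(ber_encode_integer(42).hex())    # 02012a
print(ber_encode_integer(-128).hex())  # 020180
print(ber_encode_integer(256).hex())   # 02020100
```

Because the encoding is machine-independent (fixed byte order, explicit sign convention), two hosts with different native integer layouts can exchange these octets directly, which is exactly the incompatible-host problem the standards address.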
Automated measurement of retinal vascular tortuosity.
Hart, W. E.; Goldbaum, M.; Côté, B.; Kube, P.; Nelson, M. R.
1997-01-01
Automatic measurement of blood vessel tortuosity is a useful capability for automatic ophthalmological diagnostic tools. We describe a suite of automated tortuosity measures for blood vessel segments extracted from RGB retinal images. The tortuosity measures were evaluated in two classification tasks: (1) classifying the tortuosity of blood vessel segments and (2) classifying the tortuosity of blood vessel networks. These tortuosity measures were able to achieve a classification rate of 91% for the first problem and 95% on the second problem, which confirms that they capture much of the ophthalmologists' notion of tortuosity. PMID:9357668
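The abstract does not spell out the individual measures, but a classical member of such a suite is the arc-length-over-chord-length ratio, sketched here as an assumed example (curvature-based indices are the other common family):

```python
import math

def arc_chord_tortuosity(points):
    """Arc length of a sampled vessel centerline divided by the chord
    length between its endpoints: 1.0 for a straight segment, larger
    for more tortuous ones."""
    arc = sum(math.dist(p, q) for p, q in zip(points, points[1:]))
    chord = math.dist(points[0], points[-1])
    return arc / chord

straight = [(0, 0), (1, 0), (2, 0), (3, 0)]
wavy = [(0, 0), (1, 1), (2, 0), (3, 1)]
print(arc_chord_tortuosity(straight))    # 1.0
print(arc_chord_tortuosity(wavy) > 1.0)  # True
```

A classifier for segments or whole networks would then combine several such scalar indices into a feature vector, in the spirit of the suite the paper evaluates.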
Lateral Coherence and Mixing in the Coastal Ocean: Adaptive Sampling using Gliders
2012-09-30
R. Kipp Shearman, Jonathan D. Nash, James N. Moum, John A. Barth (College of Oceanic & Atmospheric Sciences, Oregon State...). ...persistent on O(3 day) timescales, so are ideally suited to be adaptively sampled by autonomous gliders that actively report both turbulent and... ...plan to deploy 4 AUV gliders to perform intensive, adaptive surveys. Newly-enhanced to measure turbulent mixing, water-column currents and dye...
Lateral Coherence and Mixing in the Coastal Ocean: Adaptive Sampling using Gliders
2011-09-30
R. Kipp Shearman, Jonathan D. Nash, James N. Moum, John A. Barth (College of...). These structures evolve yet are often persistent on O(3 day) timescales, so are ideally suited to be adaptively sampled by autonomous gliders that... ...processes driving lateral dispersion, we plan to deploy 4 AUV gliders to perform intensive, adaptive surveys. Newly-enhanced to measure turbulent mixing...
Sample Manipulation System for Sample Analysis at Mars
NASA Technical Reports Server (NTRS)
Mumm, Erik; Kennedy, Tom; Carlson, Lee; Roberts, Dustyn
2008-01-01
The Sample Analysis at Mars (SAM) instrument will analyze Martian samples collected by the Mars Science Laboratory Rover with a suite of spectrometers. This paper discusses the driving requirements, design, and lessons learned in the development of the Sample Manipulation System (SMS) within SAM. The SMS stores and manipulates 74 sample cups to be used for solid sample pyrolysis experiments. Focus is given to the unique mechanism architecture developed to deliver a high packing density of sample cups in a reliable, fault tolerant manner while minimizing system mass and control complexity. Lessons learned are presented on contamination control, launch restraint mechanisms for fragile sample cups, and mechanism test data.
NASA Astrophysics Data System (ADS)
Lehnert, K. A.; Goldstein, S. L.; Vinayagamoorthy, S.; Lenhardt, W. C.
2005-12-01
Data on samples represent a primary foundation of Geoscience research across disciplines, ranging from the study of climate change, to biogeochemical cycles, to mantle and continental dynamics and are key to our knowledge of the Earth's dynamical systems and evolution. Different data types are generated for individual samples by different research groups, published in different papers, and stored in different databases on a global scale. The utility of these data is critically dependent on their integration. Such integration can be achieved within a Geoscience Cyberinfrastructure, but requires unambiguous identification of samples. Currently, naming of samples is arbitrary and inconsistent and therefore severely limits our ability to share, link, and integrate sample-based data. Major problems include name duplication, and changing of names as a sample is passed along over many years to different investigators. SESAR, the System for Earth Sample Registration (http://www.geosamples.org), addresses this problem by building a registry that generates and administers globally unique identifiers for Geoscience samples: the International Geo Sample Number (IGSN). Implementation of the IGSN in data publication and digital data management will dramatically advance interoperability among information systems for sample-based data, opening an extensive range of new opportunities for discovery and for interdisciplinary approaches in research. The IGSN will also facilitate the ability of investigators to build on previously collected data on samples as new measurements are made or new techniques are developed. With potentially broad application to all types of Geoscience samples, SESAR is global in scope. It is a web-based system that can be easily accessed by individual users through an interactive web interface and by distributed client systems via standard web services. 
Samples can be registered individually or in batches, and at various levels of granularity: from entire cores, dredges, or sample suites, to individual samples, to sub-samples such as splits and separates. Relationships between `parent' and `child' samples are tracked. The system generates bar codes that users can download as images for labeling purposes. SESAR released a beta version of the registry in April 2005 that allows users to register a limited range of sample types; identifiers generated by the beta version will remain valid when SESAR moves into its operational stage. Since then, more than 3700 samples have been registered in SESAR. Registration of samples at a central clearinghouse will automatically build a global catalog of Geoscience samples, a hugely valuable resource for the Geoscience community: it will allow more efficient planning of field and laboratory projects and facilitate the sharing of samples, helping to build more comprehensive data sets for individual samples. The SESAR catalog will provide links to sample profiles on external systems that hold data about samples, thereby enabling users to easily obtain complete information about samples.
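The registry mechanics described above, unique identifiers plus parent/child tracking at several granularities, can be sketched in a few lines. Note that the `XYZ` prefix and zero-padded counter below are hypothetical, not the actual IGSN syntax:

```python
import itertools

class SampleRegistry:
    """Toy registry: globally unique sample IDs with parent/child links.
    The identifier format here is an illustrative assumption only."""
    def __init__(self, prefix="XYZ"):
        self.prefix = prefix
        self._counter = itertools.count(1)
        self.samples = {}                      # id -> {"name": ..., "parent": ...}

    def register(self, name, parent=None):
        if parent is not None and parent not in self.samples:
            raise KeyError(f"unknown parent sample {parent}")
        sid = f"{self.prefix}{next(self._counter):06d}"
        self.samples[sid] = {"name": name, "parent": parent}
        return sid

    def children(self, sid):
        return [s for s, rec in self.samples.items() if rec["parent"] == sid]

reg = SampleRegistry()
core = reg.register("dredge core 12")
split = reg.register("split A", parent=core)
print(core, split)         # XYZ000001 XYZ000002
print(reg.children(core))  # ['XYZ000002']
```

A central service issuing such identifiers is what lets data on the same physical sample, published by different groups in different databases, be linked unambiguously.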
Thermal conductance of space suit insulations, thermal micrometeoroid garments, and other insulations
NASA Technical Reports Server (NTRS)
Richardson, D. L.; Stevens, J. M.
1976-01-01
The thermal protection capabilities of developmental and operational thermal micrometeoroid garments and other insulations were evaluated. The relationship among sample thermal conductance, surface temperature, and compressive loads was empirically defined.
The planetary data system educational CD-ROM
NASA Technical Reports Server (NTRS)
Guinness, E. A.; Arvidson, R. E.; Martin, M.; Dueck, S.
1993-01-01
The Planetary Data System (PDS) is producing a special educational CD-ROM that contains samples of PDS datasets and is expected to be released in 1993. The CD-ROM will provide university-level instructors with PDS-compatible materials and information that can be used to construct student problem sets using real datasets. The main purposes of the CD-ROM are to facilitate wide use of planetary data and to introduce a large community to the PDS. To meet these objectives the Educational CD-ROM will also contain software to manipulate the data, background discussions about scientific questions that can be addressed with the data, and a suite of exercises that illustrate analysis techniques. Students will also be introduced to the SPICE concept, which is a new way of maintaining geometry and instrument information. The exercises will be presented at the freshman through graduate student levels. With simplification, some of the material should also be of use at the high school level.
Covariance Matrix Adaptation Evolutionary Strategy for Drift Correction of Electronic Nose Data
NASA Astrophysics Data System (ADS)
Di Carlo, S.; Falasconi, M.; Sanchez, E.; Sberveglieri, G.; Scionti, A.; Squillero, G.; Tonda, A.
2011-09-01
Electronic Noses (ENs) might represent a simple, fast, high-sample-throughput and economical alternative to conventional analytical instruments [1]. However, gas sensor drift still limits EN adoption in real industrial setups due to the high recalibration effort and cost [2]. In fact, pattern recognition (PaRC) models built in the training phase become useless after a period of time, in some cases a few weeks. Although algorithms to mitigate drift date back to the early 1990s, this is still a challenging issue for the chemical sensor community [3]. Among other approaches, adaptive drift correction methods adjust the PaRC model in parallel with data acquisition, without the need for periodic calibration. Self-Organizing Maps (SOMs) [4] and Adaptive Resonance Theory (ART) networks [5] have already been tested in the past with fair success. This paper presents and discusses an original methodology based on the Covariance Matrix Adaptation Evolution Strategy (CMA-ES) [6], suited for stochastic optimization of complex problems.
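As a rough illustration of the evolution-strategy family the paper draws on, here is a (1+1)-ES with the classic 1/5 success rule. True CMA-ES additionally adapts a full covariance matrix of the mutation distribution, which this deliberately simplified sketch omits:

```python
import random

def one_plus_one_es(f, x0, sigma=1.0, iters=800, seed=0):
    """(1+1) evolution strategy with 1/5-success-rule step-size control:
    keep a single parent, mutate it with Gaussian noise, accept if no
    worse, and adapt the step size from the success history."""
    rng = random.Random(seed)
    x, fx = list(x0), f(x0)
    for _ in range(iters):
        cand = [xi + sigma * rng.gauss(0, 1) for xi in x]
        fc = f(cand)
        if fc <= fx:
            x, fx = cand, fc
            sigma *= 1.22     # expand the step on success...
        else:
            sigma *= 0.95     # ...shrink it on failure (roughly the 1/5 rule)
    return x, fx

sphere = lambda v: sum(t * t for t in v)   # simple test objective
best, val = one_plus_one_es(sphere, [3.0, -2.0, 1.5])
print(val < 1e-2)
```

In a drift-correction setting, the objective `f` would instead score candidate PaRC-model corrections against newly acquired sensor data, which is where the paper's methodology goes beyond this toy.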
Comparison of a 3-D CFD-DSMC Solution Methodology With a Wind Tunnel Experiment
NASA Technical Reports Server (NTRS)
Glass, Christopher E.; Horvath, Thomas J.
2002-01-01
A solution method for problems that contain both continuum and rarefied flow regions is presented. The methodology is applied to flow about the 3-D Mars Sample Return Orbiter (MSRO) that has a highly compressed forebody flow, a shear layer where the flow separates from a forebody lip, and a low density wake. Because blunt body flow fields contain such disparate regions, employing a single numerical technique to solve the entire 3-D flow field is often impractical, or the technique does not apply. Direct simulation Monte Carlo (DSMC) could be employed to solve the entire flow field; however, the technique requires inordinate computational resources for continuum and near-continuum regions, and is best suited for the wake region. Computational fluid dynamics (CFD) will solve the high-density forebody flow, but continuum assumptions do not apply in the rarefied wake region. The CFD-DSMC approach presented herein may be a suitable way to obtain a higher fidelity solution.
NASA Astrophysics Data System (ADS)
Resano, Martín; Flórez, María del Rosario; Queralt, Ignasi; Marguí, Eva
2015-03-01
This work investigates the potential of high-resolution continuum source graphite furnace atomic absorption spectrometry for the direct determination of Pd, Pt and Rh in two samples of very different nature. While analysis of active pharmaceutical ingredients is straightforward, and matrix effects can be minimized to the point that calibration can be carried out against aqueous standard solutions, the analysis of used automobile catalysts is more challenging, requiring the addition of a chemical modifier (NH4F·HF) to help release the analytes, a more vigorous temperature program and the use of a solid standard (CRM ERM®-EB504) for calibration. However, in both cases it was possible to obtain accurate results and precision values typically better than 10% RSD in a fast and simple way, and only two determinations are needed for the three analytes, since Pt and Rh can be monitored simultaneously in both types of samples. Overall, the proposed methods seem suited to the determination of these analytes in such samples, offering a greener and faster alternative that circumvents the traditional problems associated with sample digestion, requires only a small amount of sample (0.05 mg per replicate for catalysts, and a few milligrams for the pharmaceuticals) and provides sufficient sensitivity to easily comply with regulations. The LODs achieved were 6.5 μg g⁻¹ (Pd), 8.3 μg g⁻¹ (Pt) and 9.3 μg g⁻¹ (Rh) for catalysts, decreasing to 0.08 μg g⁻¹ (Pd), 0.15 μg g⁻¹ (Pt) and 0.10 μg g⁻¹ (Rh) for pharmaceuticals.
Chang, Cecily C.Y.; Langston, J.; Riggs, M.; Campbell, D.H.; Silva, S.R.; Kendall, C.
1999-01-01
Recently, methods have been developed to analyze NO3- for δ15N and δ18O, improving our ability to identify NO3- sources and transformations. However, none of the existing methods is suited for waters with low NO3- concentrations (0.7-10 µM). We describe an improved method for collecting and recovering NO3- on exchange columns. To overcome the lengthy loading times imposed by the large sample volumes (7-70 L), the sample was prefiltered (0.45 µm) with a large-surface-area filter. Switching to AG2X anion resin and using a coarser mesh size (100-200) than previous methods also enhanced sample flow. Placement of a cation column in front of the anion column minimized clogging of the anion column by dissolved organic carbon (DOC) accumulation. This also minimized the transfer of unwanted oxygen atoms from DOC to the NO3- sample, which would contaminate the sample and shift its δ18O. The cat-AG2X method is suited for on-site sample collection, making it possible to collect and recover NO3- from low-ionic-strength waters with modest DOC concentrations (80-800 µM); it relieves the investigator of transporting large volumes of water back to the laboratory and offers a means of sampling rain, snow, snowmelt, and stream samples from access-limited sites.
Characterization of viable bacteria from Siberian permafrost by 16S rDNA sequencing
NASA Technical Reports Server (NTRS)
Shi, T.; Reeves, R. H.; Gilichinsky, D. A.; Friedmann, E. I.
1997-01-01
Viable bacteria were found in permafrost core samples from the Kolyma-Indigirka lowland of northeast Siberia. The samples were obtained at different depths; the deepest was about 3 million years old. The average temperature of the permafrost is -10 degrees C. Twenty-nine bacterial isolates were characterized by 16S rDNA sequencing and phylogenetic analysis, cell morphology, Gram staining, endospore formation, and growth at 30 degrees C. The majority of the bacterial isolates were rod shaped and grew well at 30 degrees C; but two of them did not grow at or above 28 degrees C, and had optimum growth temperatures around 20 degrees C. Thirty percent of the isolates could form endospores. Phylogenetic analysis revealed that the isolates fell into four categories: high-GC Gram-positive bacteria, beta-proteobacteria, gamma-proteobacteria, and low-GC Gram-positive bacteria. Most high-GC Gram-positive bacteria and beta-proteobacteria, and all gamma-proteobacteria, came from samples with an estimated age of 1.8-3.0 million years (Olyor suite). Most low-GC Gram-positive bacteria came from samples with an estimated age of 5,000-8,000 years (Alas suite).
Three-Dimensional Passive-Source Reverse-Time Migration of Converted Waves: The Method
NASA Astrophysics Data System (ADS)
Li, Jiahang; Shen, Yang; Zhang, Wei
2018-02-01
At seismic discontinuities in the crust and mantle, part of the compressional wave energy converts to shear wave energy, and vice versa. These converted waves have been widely used in receiver function (RF) studies to image discontinuity structures in the Earth. While generally successful, the conventional RF method has its limitations and is suited mostly to flat or gently dipping structures. Among the efforts to overcome the limitations of the conventional RF method is the development of wave-theory-based, passive-source reverse-time migration (PS-RTM) for imaging complex seismic discontinuities and scatterers. To date, PS-RTM has been implemented only in 2D in Cartesian coordinates for local problems and thus has limited applicability. In this paper, we introduce a 3D PS-RTM approach in spherical coordinates, which is better suited for regional and global problems. New computational procedures are developed to reduce artifacts and enhance migrated images, including back-propagating the main arrival and the coda containing the converted waves separately, using a modified Helmholtz decomposition operator to separate the P and S modes in the back-propagated wavefields, and applying an imaging condition that maintains a consistent polarity for a given velocity contrast. Our new approach allows us to use migration velocity models with realistic velocity discontinuities, improving the accuracy of the migrated images. We present several synthetic experiments to demonstrate the method, using regional and teleseismic sources. The results show that both regional and teleseismic sources can illuminate complex structures and that this method is well suited for imaging dipping interfaces and sharp lateral changes in discontinuity structures.
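The P/S mode separation step mentioned above is conventionally built on a Helmholtz-type decomposition of the back-propagated displacement field u; in its standard (unmodified) form the two modes are extracted as

```latex
P(\mathbf{x},t) = \nabla \cdot \mathbf{u}(\mathbf{x},t),
\qquad
\mathbf{S}(\mathbf{x},t) = \nabla \times \mathbf{u}(\mathbf{x},t),
```

where the divergence isolates the curl-free (compressional) part of the wavefield and the curl isolates the divergence-free (shear) part. The modified operator used in the paper further corrects polarity and amplitude behavior; its exact form is not given in the abstract.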
An inverse dynamics approach to trajectory optimization and guidance for an aerospace plane
NASA Technical Reports Server (NTRS)
Lu, Ping
1992-01-01
The optimal ascent problem for an aerospace plane is formulated as an optimal inverse dynamics problem. Both minimum-fuel and minimax performance indices are considered. Some important features of the optimal trajectory and controls are used to construct a nonlinear feedback midcourse controller, which not only greatly simplifies the difficult constrained optimization problem and yields improved solutions, but is also suited for onboard implementation. Robust ascent guidance is obtained by using a combination of feedback compensation and onboard generation of control through the inverse dynamics approach. Accurate orbital insertion can be achieved with near-optimal control of the rocket through inverse dynamics, even in the presence of disturbances.
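The inverse dynamics idea, computing the control that reproduces a prescribed trajectory and wrapping it in feedback to absorb disturbances, can be miniaturized to a double integrator. This toy example and its gains are illustrative assumptions, not the paper's ascent model:

```python
import math

def track(T=10.0, dt=0.001, kp=25.0, kd=10.0):
    """Inverse-dynamics tracking for a double integrator x'' = u: the
    reference acceleration supplies the nominal (inverse-dynamics)
    control, and PD feedback removes the initial offset."""
    x, v = 0.5, 0.0                              # start off the reference x_r = sin(t)
    for k in range(int(round(T / dt))):
        t = k * dt
        xr, vr, ar = math.sin(t), math.cos(t), -math.sin(t)
        u = ar + kd * (vr - v) + kp * (xr - x)   # inverse dynamics + feedback
        x += v * dt                              # Euler integration step
        v += u * dt
    return abs(x - math.sin(T))                  # final tracking error

print(track() < 1e-3)
```

With these gains the error dynamics are critically damped (e'' + 10e' + 25e = 0), so the 0.5 initial offset decays away and the state follows the prescribed trajectory, which is the miniature analogue of the robust guidance behavior claimed above.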
NASA Technical Reports Server (NTRS)
Yarrow, Maurice; McCann, Karen M.; Biswas, Rupak; VanderWijngaart, Rob; Yan, Jerry C. (Technical Monitor)
2000-01-01
The creation of parameter study suites has recently become a more challenging problem as the parameter studies have now become multi-tiered and the computational environment has become a supercomputer grid. The parameter spaces are vast, the individual problem sizes are getting larger, and researchers are now seeking to combine several successive stages of parameterization and computation. Simultaneously, grid-based computing offers great resource opportunity but at the expense of great difficulty of use. We present an approach to this problem which stresses intuitive visual design tools for parameter study creation and complex process specification, and also offers programming-free access to grid-based supercomputer resources and process automation.
CO2 Washout Testing Using Various Inlet Vent Configurations in the Mark-III Space Suit
NASA Technical Reports Server (NTRS)
Korona, F. Adam; Norcross, Jason; Conger, Bruce; Navarro, Moses
2014-01-01
Requirements for using a space suit during ground testing include providing adequate carbon dioxide (CO2) washout for the suited subject. Acute CO2 exposure can lead to symptoms including headache, dyspnea, lethargy and eventually unconsciousness or even death. Symptoms depend on several factors including inspired partial pressure of CO2 (ppCO2), duration of exposure, metabolic rate of the subject and physiological differences between subjects. Computational Fluid Dynamic (CFD) analysis has predicted that the configuration of the suit inlet vent has a significant effect on oronasal CO2 concentrations. The main objective of this test is to characterize inspired oronasal ppCO2 for a variety of inlet vent configurations in the Mark-III space suit across a range of workload and flow rates. As a secondary objective, results will be compared to the predicted CO2 concentrations and used to refine existing CFD models. These CFD models will then be used to help design an inlet vent configuration for the Z-2 space suit, which maximizes oronasal CO2 washout. This test has not been completed, but is planned for January 2014. The results of this test will be incorporated into this paper. The testing methodology used in this test builds upon past CO2 washout testing performed on the Z-1 suit, Rear Entry I-Suit (REI) and the Enhanced Mobility Advanced Crew Escape Suit (EM-ACES). Three subjects will be tested in the Mark-III space suit with each subject performing two test sessions to allow for comparison between tests. Six different helmet inlet vent configurations will be evaluated during each test session. Suit pressure will be maintained at 4.3 psid. Subjects will wear the suit while walking on a treadmill to generate metabolic workloads of approximately 2000 and 3000 BTU/hr. Supply airflow rates of 6 and 4 actual cubic feet per minute (ACFM) will be tested at each workload. 
Subjects will wear an oronasal mask with an open port in front of the mouth and will be allowed to breathe freely. Oronasal ppCO2 will be monitored real-time via gas analyzers with sampling tubes connected to the oronasal mask. Metabolic rate will be calculated from the total oxygen consumption and CO2 production measured by additional gas analyzers at the air outlet from the suit. Real-time metabolic rate measurements will be used to adjust the treadmill workload to meet target metabolic rates. This paper provides detailed descriptions of the test hardware, methodology and results, as well as implications for future inlet vent design and ground testing in the Mark-III.
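As an illustration of how metabolic rate can be derived from the gas-exchange measurements described above, a common choice is the abbreviated Weir equation; the abstract does not state which formula was used, so the conversion chain below is an assumption:

```python
def metabolic_rate_btu_hr(vo2_lpm, vco2_lpm):
    """Metabolic rate from measured O2 uptake and CO2 output (L/min, STPD)
    via the abbreviated Weir equation, converted to BTU/hr."""
    kcal_per_min = 3.941 * vo2_lpm + 1.106 * vco2_lpm  # Weir (1949)
    return kcal_per_min * 60.0 * 3.9657                # kcal/min -> BTU/hr

# A subject consuming ~1 L/min O2 at a respiratory exchange ratio of 0.85
# is working at roughly 1150-1200 BTU/hr:
rate = metabolic_rate_btu_hr(1.0, 0.85)
print(round(rate))
```

Real-time values like this are what would let the test team adjust the treadmill to hold the 2000 and 3000 BTU/hr target workloads.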
2013-01-01
Chemical cross-linking of proteins combined with mass spectrometry provides an attractive and novel method for the analysis of native protein structures and protein complexes. Analysis of the data, however, is complex. Only a small number of cross-linked peptides are produced during sample preparation and must be identified against a background of more abundant native peptides. To facilitate the search for and identification of cross-linked peptides, we have developed a novel software suite named Hekate. Hekate addresses the challenges involved in analyzing protein cross-linking experiments combined with mass spectrometry: it is an integrated pipeline for automating the data-analysis workflow and provides a novel scoring system based on principles of linear peptide analysis. In addition, it provides a tool for visualizing identified cross-links on three-dimensional models, which is particularly useful when combining chemical cross-linking with other structural techniques. Hekate was validated by comparative analysis of cytochrome c (bovine heart) against previously reported data [1]. Further validation was carried out on known structural elements of DNA polymerase III, the catalytic α-subunit of the Escherichia coli DNA replisome, along with new insight into the previously uncharacterized C-terminal domain of the protein. PMID:24010795
NASA Astrophysics Data System (ADS)
Ishikawa, Akira; Suzuki, Katsuhiko; Collerson, Kenneth D.; Liu, Jingao; Pearson, D. Graham; Komiya, Tsuyoshi
2017-11-01
We determined highly siderophile element (HSE: Os, Ir, Ru, Pt, Pd, and Re) concentrations and 187Os/188Os ratios for ultramafic rocks distributed over the Eoarchean gneiss complex of the Saglek-Hebron area in northern Labrador, Canada, in order to constrain the extent to which variations in HSE abundances are recorded in Early Archean mantle that has well-resolved 182W isotope anomalies relative to the present-day mantle (∼+11 ppm; Liu et al., 2016). The samples analysed here have been previously classified into two suites: mantle-derived peridotites occurring as tectonically-emplaced slivers of lithospheric mantle, and metakomatiites comprising mostly pyroxenitic layers in supracrustal units dominated by amphibolites. Although previous Sm-Nd and Pb-Pb isotope studies provided whole-rock isochrons indicative of ∼3.8 Ga protolith formation for both suites, our whole-rock Re-Os isotope data on a similar set of samples yield considerably younger errorchrons, with ages of 3612 ± 130 Ma (MSWD = 40) and 3096 ± 170 Ma (MSWD = 10.2) for the metakomatiite and lithospheric mantle suites, respectively. The respective initial 187Os/188Os ratios of 0.10200 ± 18 for metakomatiites and 0.1041 ± 18 for lithospheric mantle rocks are within the range of chondrites. Re-depletion Os model ages for unradiogenic samples from the two suites are consistent with the respective Re-Os errorchrons (metakomatiite TRD = 3.4-3.6 Ga; lithospheric mantle TRD = 2.8-3.3 Ga). These observations suggest that the two ultramafic suites are not coeval. However, the estimated mantle sources for the two ultramafic suites are similar in terms of their broadly chondritic evolution of 187Os/188Os and their relative HSE patterns. 
In detail, both mantle sources show a small excess of Ru/Ir similar to that in modern primitive mantle, but a ∼20% deficit in absolute HSE abundances relative to that in modern primitive mantle (metakomatiite 74 ± 18% of PUM; lithospheric mantle 82 ± 10% of PUM), consistent with the ∼3.8 Ga Isua mantle source and Neoarchean komatiite sources around the world (∼70-86% of PUM). This demonstrates that the lower HSE abundances are not unique to the sources of komatiites, but rather might be a ubiquitous feature of Archean convecting mantle. This tentatively suggests that chondritic late accretion components boosted the convecting mantle HSE inventory after core separation in the Hadean, and that the Eoarchean to Neoarchean convecting mantle was depleted in its HSE content relative to that of today. Further investigation of Archean mantle-derived rocks is required to explore this hypothesis.
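The Re-Os errorchron ages quoted above follow from the standard isochron relation: in 187Os/188Os vs. 187Re/188Os space, an isochron has slope exp(λt) − 1 and its intercept gives the initial 187Os/188Os. A minimal sketch of the slope-to-age conversion, using synthetic illustrative data rather than the paper's measurements:

```python
import numpy as np

# Hedged sketch (synthetic data, not the paper's measurements): an
# isochron in 187Os/188Os vs 187Re/188Os space has slope exp(lambda*t) - 1,
# so the age follows from t = ln(slope + 1) / lambda.
LAMBDA_RE187 = 1.666e-11  # 187Re decay constant, 1/yr

def isochron_age_ga(slope):
    """Age in Ga implied by a Re-Os isochron slope."""
    return np.log(slope + 1.0) / LAMBDA_RE187 / 1.0e9

# Synthetic whole-rock suite: 3.6 Ga old, initial 187Os/188Os = 0.102.
re_os = np.array([0.5, 1.0, 2.0, 4.0])                        # 187Re/188Os
os_os = 0.102 + re_os * (np.exp(LAMBDA_RE187 * 3.6e9) - 1.0)  # 187Os/188Os
slope, intercept = np.polyfit(re_os, os_os, 1)

print(f"age = {isochron_age_ga(slope):.2f} Ga")  # 3.60 Ga
print(f"initial 187Os/188Os = {intercept:.4f}")  # 0.1020
```

In practice an errorchron fit also weights each point by its analytical uncertainty and reports the MSWD; the unweighted least-squares fit here is only a schematic.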
Pepe, Vera Lúcia Edais; Ventura, Miriam; Sant'ana, João Maurício Brambati; Figueiredo, Tatiana Aragão; Souza, Vanessa Dos Reis de; Simas, Luciana; Osorio-de-Castro, Claudia Garcia Serpa
2010-03-01
Recognition of the right to health raises two practical issues: the government's ethical and legal duty to ensure comprehensive health care and citizens' recourse to legal action to guarantee this right. This study focused on lawsuits to demand "essential" medicines, filed at the State Court of Appeals in Rio de Janeiro, Brazil, in 2006. One hundred and eighty-five suits were examined, and the claims were granted in all but three cases. Median times between filing the suit, the injunction, first ruling, and appellate ruling were 7, 239, and 478 days respectively. In 80.6% of the 98 suits in which the specific medicines could be identified, at least one drug did not belong to any publicly funded list of medicines. This could indicate that lawsuits demanding "essential" drugs were motivated not only by problems in procurement, distribution, and dispensing of medicines but also by non-inclusion of medicines on official lists. Most of the medicines demanded through lawsuits were for conditions involving the cardiovascular and nervous systems.
Pedagogies of engagement in science: A comparison of PBL, POGIL, and PLTL*
Eberlein, Thomas; Kampmeier, Jack; Minderhout, Vicky; Moog, Richard S; Platt, Terry; Varma-Nelson, Pratibha; White, Harold B
2008-01-01
Problem-based learning, process-oriented guided inquiry learning, and peer-led team learning are student-centered, active-learning pedagogies commonly used in science education. The characteristic features of each are compared and contrasted to enable new practitioners to decide which approach or combination of approaches will suit their particular situation. PMID:19381266
Using multi-criteria analysis of simulation models to understand complex biological systems
Maureen C. Kennedy; E. David Ford
2011-01-01
Scientists frequently use computer-simulation models to help solve complex biological problems. Typically, such models are highly integrated, they produce multiple outputs, and standard methods of model analysis are ill suited for evaluating them. We show how multi-criteria optimization with Pareto optimality allows for model outputs to be compared to multiple system...
How Does PISA Measure Students' Ability to Collaborate? PISA in Focus. No. 77
ERIC Educational Resources Information Center
OECD Publishing, 2017
2017-01-01
Solving unfamiliar problems on one's own is important, but in today's increasingly interconnected world, people are often required to collaborate in order to achieve their goals. Teamwork has numerous benefits, from a diverse range of opinions to synergies among team members, and assigning tasks to those who are best suited to them. Collaboration…
Toward an Analysis of Video Games for Mathematics Education
ERIC Educational Resources Information Center
Offenholley, Kathleen
2011-01-01
Video games have tremendous potential in mathematics education, yet there is a push to simply add mathematics to a video game without regard to whether the game structure suits the mathematics, and without regard to the level of mathematical thought being learned in the game. Are students practicing facts, or are they problem-solving? This paper…
Diagrams and Math Notation in E-Learning: Growing Pains of a New Generation
ERIC Educational Resources Information Center
Smith, Glenn Gordon; Ferguson, David
2004-01-01
Current e-learning environments are ill-suited to college mathematics. Instructors/students struggle to post diagrams and math notation. A new generation of math-friendly e-learning tools, including WebEQ, bundled with Blackboard 6, and NetTutor's Whiteboard, address these problems. This paper compares these two systems using criteria for ideal…
ERIC Educational Resources Information Center
Khanlari, Ahmad
2016-01-01
Twenty-first century education systems should create an environment wherein students encounter critical learning components (such as problem-solving, teamwork, and communication skills) and embrace lifelong learning. A review of literature demonstrates that new technologies, in general, and robotics, in particular, are well suited for this aim.…
Are Deep Strategic Learners Better Suited to PBL? A Preliminary Study
ERIC Educational Resources Information Center
Papinczak, Tracey
2009-01-01
The aim of this study was to determine if medical students categorised as having deep and strategic approaches to their learning find problem-based learning (PBL) enjoyable and supportive of their learning, and achieve well in the first-year course. Quantitative and qualitative data were gathered from first-year medical students (N = 213). All…
ERIC Educational Resources Information Center
Galisson, Robert
1977-01-01
Discusses causes for the problems in language instruction in France, and offers suggestions for improving the situation, including greater communication among language professionals, greater flexibility to instructional change, and greater attention to the student. (AM)
Teaching Statistical Inference for Causal Effects in Experiments and Observational Studies
ERIC Educational Resources Information Center
Rubin, Donald B.
2004-01-01
Inference for causal effects is a critical activity in many branches of science and public policy. The field of statistics is the one field most suited to address such problems, whether from designed experiments or observational studies. Consequently, it is arguably essential that departments of statistics teach courses in causal inference to both…
The Characterization of Biosignatures in Caves Using an Instrument Suite.
Uckert, Kyle; Chanover, Nancy J; Getty, Stephanie; Voelz, David G; Brinckerhoff, William B; McMillan, Nancy; Xiao, Xifeng; Boston, Penelope J; Li, Xiang; McAdam, Amy; Glenar, David A; Chavez, Arriana
2017-12-01
The search for life and habitable environments on other Solar System bodies is a major motivator for planetary exploration. Due to the difficulty and significance of detecting extant or extinct extraterrestrial life in situ, several independent measurements from multiple instrument techniques will bolster the community's confidence in making any such claim. We demonstrate the detection of subsurface biosignatures using a suite of instrument techniques including IR reflectance spectroscopy, laser-induced breakdown spectroscopy, and scanning electron microscopy/energy dispersive X-ray spectroscopy. We focus our measurements on subterranean calcium carbonate field samples, whose biosignatures are analogous to those that might be expected on some high-interest astrobiology targets. In this work, we discuss the feasibility and advantages of using each of the aforementioned instrument techniques for the in situ search for biosignatures and present results on the autonomous characterization of biosignatures using multivariate statistical analysis techniques. Key Words: Biosignature suites-Caves-Mars-Life detection. Astrobiology 17, 1203-1218.
de Vreede, Gert-Jan; Briggs, Robert O; Reiter-Palmon, Roni
2010-04-01
The aim of this study was to compare the results of two different modes of using multiple groups (instead of one large group) to identify problems and develop solutions. Many of the complex problems facing organizations today require the use of very large groups or collaborations of groups from multiple organizations. There are many logistical problems associated with the use of such large groups, including the ability to bring everyone together at the same time and location. A field study involving two different organizations compared the productivity and satisfaction of groups. The approaches included (a) multiple small groups, each completing the entire process from start to end and combining the results at the end (parallel mode); and (b) multiple subgroups, each building on the work provided by previous subgroups (serial mode). Groups using the serial mode produced more elaborations than parallel groups, whereas parallel groups produced more unique ideas than serial groups. No significant differences were found in satisfaction with process and outcomes between the two modes. The preferred mode depends on the type of task facing the group. Parallel groups are better suited for tasks for which a variety of new ideas are needed, whereas serial groups are best suited when elaboration and in-depth thinking on the solution are required. Results of this research can guide the development of facilitated sessions of large groups or "teams of teams."
DOE Office of Scientific and Technical Information (OSTI.GOV)
Madrid, V.; Singleton, M. J.; Visser, A.
This report combines and summarizes results for two groundwater-sampling events (October 2012 and October/November 2015) from the Sandia National Laboratories/New Mexico (SNL/NM) Burn Site Groundwater (BSG) Area of Concern (AOC) located in the Lurance Canyon Arroyo southeast of Albuquerque, NM in the Manzanita Mountains. The first phase of groundwater sampling occurred in October 2012 and included samples from 19 wells at three separate sites that were analyzed by the Environmental Radiochemistry Laboratory at Lawrence Livermore National Laboratory as part of a nitrate Monitored Natural Attenuation (MNA) evaluation. The three sites (BSG, Technical Area-V, and Tijeras Arroyo) are shown on the regional hydrogeologic map and described in the Sandia Annual Groundwater Monitoring Report. The first phase of groundwater sampling included six monitoring wells at the Burn Site, eight monitoring wells at Technical Area-V, and five monitoring wells at Tijeras Arroyo. Each groundwater sample was analyzed using two specialized analytical suites: age-dating and denitrification. In September 2015, a second phase of groundwater sampling took place at the Burn Site, in which 10 wells were sampled and analyzed using the same two analytical suites. Five of the six wells sampled in 2012 were resampled in 2015. This report summarizes results from the two sampling events in order to evaluate evidence for in situ denitrification, the average age of the groundwater, and the extent of recent recharge of the bedrock fracture system beneath the BSG AOC.
MANGROVE-DERIVED NUTRIENTS AND CORAL REEFS
Understanding the consequences of the declining global cover of mangroves due to anthropogenic disturbance necessitates consideration of how mangrove-derived nutrients contribute to threatened coral reef systems. We sampled potential sources of organic matter and a suite of sessi...
NASA Astrophysics Data System (ADS)
Krumholz, Mark R.; Fumagalli, Michele; da Silva, Robert L.; Rendahl, Theodore; Parra, Jonathan
2015-09-01
Stellar population synthesis techniques for predicting the observable light emitted by a stellar population have extensive applications in numerous areas of astronomy. However, accurate predictions for small populations of young stars, such as those found in individual star clusters, star-forming dwarf galaxies, and small segments of spiral galaxies, require that the population be treated stochastically. Conversely, accurate deductions of the properties of such objects also require consideration of stochasticity. Here we describe a comprehensive suite of modular, open-source software tools for tackling these related problems. These include the following: a greatly-enhanced version of the SLUG code introduced by da Silva et al., which computes spectra and photometry for stochastically or deterministically sampled stellar populations with nearly arbitrary star formation histories, clustering properties, and initial mass functions; CLOUDY_SLUG, a tool that automatically couples SLUG-computed spectra with the CLOUDY radiative transfer code in order to predict stochastic nebular emission; BAYESPHOT, a general-purpose tool for performing Bayesian inference on the physical properties of stellar systems based on unresolved photometry; and CLUSTER_SLUG and SFR_SLUG, a pair of tools that use BAYESPHOT on a library of SLUG models to compute the mass, age, and extinction of mono-age star clusters, and the star formation rate of galaxies, respectively. The latter two tools make use of an extensive library of pre-computed stellar population models, which are included in the software. The complete package is available at http://www.slugsps.com.
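The stochasticity problem that SLUG addresses can be illustrated with a toy version of the core sampling step. This sketch (the parameters and algorithm are illustrative assumptions, not SLUG's actual implementation) draws stellar masses from a Salpeter IMF until a target cluster mass is reached, showing why small clusters must be treated stochastically:

```python
import random

# Toy illustration (not SLUG's actual implementation): build a star
# cluster by drawing stellar masses from a Salpeter IMF, dN/dM ~ M^-2.35,
# truncated to 0.1-100 Msun, until a target total mass is reached.
ALPHA, M_LO, M_HI = 2.35, 0.1, 100.0

def draw_imf_mass(rng):
    """Inverse-transform sample of the truncated power-law IMF."""
    a = 1.0 - ALPHA
    u = rng.random()
    return (M_LO**a + u * (M_HI**a - M_LO**a)) ** (1.0 / a)

def sample_cluster(target_mass, rng):
    """Draw stars until the cluster reaches the target mass."""
    stars, total = [], 0.0
    while total < target_mass:
        m = draw_imf_mass(rng)
        stars.append(m)
        total += m
    return stars

rng = random.Random(42)
small = sample_cluster(100.0, rng)  # small cluster: strongly stochastic
big = sample_cluster(1.0e5, rng)    # massive cluster: near-deterministic

# Small clusters rarely sample the upper IMF, so their integrated light
# (dominated by the most massive star present) scatters from run to run.
print(len(small), round(max(small), 2))
print(len(big), round(max(big), 2))
```

In the stochastic regime the most massive star drawn varies strongly from realization to realization, which is why deterministic population synthesis breaks down for ~10^2-10^3 Msun objects and a library of Monte Carlo realizations is needed instead.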
Sample Return Robot Centennial Challenge
2012-06-16
Visitors, some with their dogs, line up to make their photo inside a space suit exhibit during the Worcester Polytechnic Institute (WPI) "TouchTomorrow" education and outreach event that was held in tandem with the NASA-WPI Sample Return Robot Centennial Challenge on Saturday, June 16, 2012 in Worcester, Mass. The NASA-WPI challenge tasked robotic teams to build autonomous robots that can identify, collect and return samples. NASA needs autonomous robotic capability for future planetary exploration. Photo Credit: (NASA/Bill Ingalls)
Fully automatic hp-adaptivity for acoustic and electromagnetic scattering in three dimensions
NASA Astrophysics Data System (ADS)
Kurtz, Jason Patrick
We present an algorithm for fully automatic hp-adaptivity for finite element approximations of elliptic and Maxwell boundary value problems in three dimensions. The algorithm automatically generates a sequence of coarse grids, and a corresponding sequence of fine grids, such that the energy norm of the error decreases exponentially with respect to the number of degrees of freedom in either sequence. At each step, we employ a discrete optimization algorithm to determine the refinements for the current coarse grid such that the projection-based interpolation error for the current fine grid solution decreases with an optimal rate with respect to the number of degrees of freedom added by the refinement. The refinements are restricted only by the requirement that the resulting mesh is at most 1-irregular, but they may be anisotropic in both element size h and order of approximation p. While we cannot prove that our method converges at all, we present numerical evidence of exponential convergence for a diverse suite of model problems from acoustic and electromagnetic scattering. In particular we show that our method is well suited to the automatic resolution of exterior problems truncated by the introduction of a perfectly matched layer. To enable and accelerate the solution of these problems on commodity hardware, we include a detailed account of three critical aspects of our implementation, namely an efficient implementation of sum factorization, several efficient interfaces to the direct multi-frontal solver MUMPS, and some fast direct solvers for the computation of a sequence of nested projections.
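Sum factorization, mentioned among the implementation details above, exploits the tensor-product structure of hexahedral shape functions: a 3D operator that is a Kronecker product of 1D factors can be applied one coordinate direction at a time. A minimal NumPy sketch (the operator and data are synthetic, not taken from the dissertation):

```python
import numpy as np

# Illustrative sketch of sum factorization: apply the tensor-product
# operator A1 (x) A1 (x) A1 one direction at a time, at O(n^4) cost per
# element instead of the O(n^6) cost of the assembled Kronecker matrix.
p = 8                        # polynomial order (illustrative)
n = p + 1                    # 1D basis size
A1 = np.random.rand(n, n)    # a 1D operator factor
u = np.random.rand(n, n, n)  # element coefficient array

# Naive reference: build the full (n^3 x n^3) matrix and apply it.
A = np.kron(np.kron(A1, A1), A1)
v_naive = (A @ u.ravel()).reshape(n, n, n)

# Sum factorization: three small contractions, one per coordinate.
v = np.einsum('ai,ijk->ajk', A1, u)
v = np.einsum('bj,ajk->abk', A1, v)
v = np.einsum('ck,abk->abc', A1, v)

print(np.allclose(v, v_naive))  # True
```

For anisotropic p the three 1D factors simply differ in size; the same direction-by-direction structure carries over, which is what makes the approach attractive for hp meshes.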
Audit method suited for DSS in clinical environment.
Vicente, Javier
2015-01-01
This chapter presents a novel online method to audit predictive models using a Bayesian perspective. The auditing model has been specifically designed for Decision Support Systems (DSSs) suited for clinical or research environments. Taking as a starting point the working diagnosis supplied by the clinician, this method compares and evaluates the predictive skills of those models able to answer that diagnosis. The approach consists in calculating the posterior odds of a model through the composition of a prior odds, a static odds, and a dynamic odds. To do so, this method estimates the posterior odds from the cases that the comparing models had in common during the design stage and from the cases already viewed by the DSS after deployment in the clinical site. In addition, if an ontology of the classes is available, this method can audit models answering related questions, which reinforces the decisions the user has already taken and gives orientation on further diagnostic steps. The main technical novelty of this approach lies in the design of an audit model adapted to suit the decision workflow of a clinical environment. The audit model allows deciding which classifier best suits each particular case under evaluation and allows the detection of possible misbehaviours due to population differences or data shifts in the clinical site. We show the efficacy of our method for the problem of brain tumor diagnosis with Magnetic Resonance Spectroscopy (MRS).
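The odds composition described in the abstract can be sketched as follows. The multiplicative combination and all numeric values here are assumptions for illustration, not the paper's exact formulation:

```python
# Minimal sketch of composing audit odds for two candidate models.
# Assumption (hypothetical): the posterior odds of model A over model B
# is the product of a prior odds, a static odds (shared design-stage
# cases), and a dynamic odds (cases seen after deployment).
def posterior_odds(prior_odds, static_odds, dynamic_odds):
    return prior_odds * static_odds * dynamic_odds

# No prior preference; A slightly better on design cases; A clearly
# better on cases accumulated at the clinical site (numbers invented).
odds = posterior_odds(prior_odds=1.0, static_odds=1.5, dynamic_odds=3.0)
prob_a = odds / (1.0 + odds)               # convert odds to probability
print(f"posterior odds A:B = {odds:.1f}")  # 4.5
print(f"P(prefer A) = {prob_a:.2f}")       # 0.82
```

Working in odds space makes the "dynamic" term easy to update incrementally as the deployed DSS accumulates new cases, which matches the online auditing workflow the chapter describes.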
Sampling bias in blending validation and a different approach to homogeneity assessment.
Kraemer, J; Svensson, J R; Melgaard, H
1999-02-01
Sampling of batches studied for validation is reported. A thief particularly suited for granules, rather than cohesive powders, was used in the study. It is shown, as has been demonstrated in the past, that traditional 1x to 3x thief sampling of a blend is biased, and that the bias decreases as the sample size increases. It is shown that taking 50 samples of tablets after blending and testing this subpopulation for normality is a discriminating manner of testing for homogeneity. As a criterion, it is better than sampling at mixer or drum stage would be even if an unbiased sampling device were available.
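The homogeneity criterion described above (test a 50-tablet subpopulation for normality) could be implemented along these lines. The choice of the Shapiro-Wilk test and the synthetic assay values are assumptions for illustration, not details given in the abstract:

```python
import numpy as np
from scipy import stats

# One possible implementation of the described homogeneity check
# (assumed details): test whether 50 post-blend tablet assays are
# consistent with a normal distribution using Shapiro-Wilk.
rng = np.random.default_rng(0)
tablets = rng.normal(loc=100.0, scale=2.0, size=50)  # % of label claim (synthetic)

w_stat, p_value = stats.shapiro(tablets)
homogeneous = p_value > 0.05  # fail to reject normality at the 5% level
print(f"W = {w_stat:.3f}, p = {p_value:.3f}, homogeneous: {homogeneous}")
```

Testing the tablet population directly sidesteps the thief-sampling bias the abstract reports, since tablets are formed from the blend without an intermediate, bias-prone sampling device.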
Coish, Raymond; Kim, Jonathan; Twelker, Evan; Zolkos, Scott P.; Walsh, Gregory J.
2015-01-01
The Moretown Formation, exposed as a north-trending unit that extends from northern Vermont to Connecticut, is located along a critical Appalachian litho-tectonic zone between the paleomargin of Laurentia and accreted oceanic terranes. Remnants of magmatic activity, in part preserved as metamorphosed mafic rocks in the Moretown Formation and the overlying Cram Hill Formation, are a key to further understanding the tectonic history of the northern Appalachians. Field relationships suggest that the metamorphosed mafic rocks might have formed during and after Taconian deformation, which occurred at ca. 470 to 460 Ma. Geochemistry indicates that the sampled metamorphosed mafic rocks were mostly basalts or basaltic andesites. The rocks have moderate TiO2 contents (1–2.5 wt %), are slightly enriched in the light-rare earth elements relative to the heavy rare earths, and have negative Nb-Ta anomalies in MORB-normalized extended rare earth element diagrams. Their chemistry is similar to compositions of basalts from western Pacific extensional basins near volcanic arcs. The metamorphosed mafic rocks of this study are similar in chemistry to both the pre-Silurian Mount Norris Intrusive Suite of northern Vermont, and also to some of Late Silurian rocks within the Lake Memphremagog Intrusive Suite, particularly the Comerford Intrusive Complex of Vermont and New Hampshire. Both suites may be represented among the samples of this study. The geochemistry of all samples indicates that parental magmas were generated in supra-subduction extensional environments during lithospheric delamination.
Tracking Hadean processes in modern basalts with 142-Neodymium
NASA Astrophysics Data System (ADS)
Horan, M. F.; Carlson, R. W.; Walker, R. J.; Jackson, M.; Garçon, M.; Norman, M.
2018-02-01
The short-lived 146Sm→142Nd isotope system (t1/2 = 103 Ma) provides constraints on the timing and processes of terrestrial silicate fractionation during the early Hadean. Although some Archean terranes preserve variability in 142Nd/144Nd, no anomalies have been resolved previously in young rocks. This study provides high precision 142Nd/144Nd data on a suite of ocean island basalts from Samoa and Hawaii previously shown to have variable depletions in 182W/184W that are inversely correlated with 3He/4He ratios. Improved analytical techniques and multiple replicate analyses of Nd show a variation in μ142Nd values between -1.3 and +2.7 in the suite, relative to the JNdi standard. Given the reproducibility of the standard (±2.9 ppm, 2 SD), two Samoan samples exhibit resolved variability in their 142Nd/144Nd ratios outside of their 95% confidence intervals, suggesting minor variability in the Samoan hotspot. One sample from Samoa has a higher μ142Nd of +2.7, outside the 95% confidence interval (±1.0 ppm) of the average of the JNdi standard. Limited, but resolved, variation in 142Nd/144Nd within the suite suggests the preservation of early Hadean silicate differentiation in the sources of at least some basalts from Samoa. Larger variations of 182W/184W and 3He/4He ratios in the same samples suggest that metal-silicate separation and mantle outgassing left a more persistent imprint on the accessible mantle compared to 142Nd/144Nd ratios, which are impacted by early silicate differentiation.
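The μ-notation used for the Nd anomalies is the parts-per-million deviation of a sample's 142Nd/144Nd from the JNdi terrestrial standard. A minimal sketch (the reference ratio shown is approximate and purely illustrative):

```python
# Sketch of the mu-notation for 142Nd anomalies: mu142Nd is the ppm
# deviation of a sample's 142Nd/144Nd ratio from the JNdi standard.
# The reference value below is approximate and illustrative only.
JNDI_142ND_144ND = 1.141837

def mu_142nd(sample_ratio, standard_ratio=JNDI_142ND_144ND):
    """ppm deviation of sample 142Nd/144Nd from the standard."""
    return (sample_ratio / standard_ratio - 1.0) * 1e6

# A sample 2.7 ppm above the standard, like the Samoan basalt reported:
sample = JNDI_142ND_144ND * (1.0 + 2.7e-6)
print(f"mu142Nd = {mu_142nd(sample):+.1f} ppm")  # +2.7 ppm
```

Resolving ppm-level anomalies of this size requires the external reproducibility of the standard (±2.9 ppm, 2 SD here) to be beaten down by many replicate analyses, which is why the 95% confidence interval of the mean, not the 2 SD scatter, sets the detection threshold.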
Hard X-ray full field microscopy and magnifying microtomography using compound refractive lenses
NASA Astrophysics Data System (ADS)
Schroer, Christian G.; Günzler, Til Florian; Benner, Boris; Kuhlmann, Marion; Tümmler, Johannes; Lengeler, Bruno; Rau, Christoph; Weitkamp, Timm; Snigirev, Anatoly; Snigireva, Irina
2001-07-01
For hard X-rays, parabolic compound refractive lenses (PCRLs) are genuine imaging devices like glass lenses for visible light. Based on these new lenses, a hard X-ray full field microscope has been constructed that is ideally suited to image the interior of opaque samples with a minimum of sample preparation. As a result of a large depth of field, CRL micrographs are sharp projection images of most samples. To obtain 3D information about a sample, tomographic techniques are combined with magnified imaging.
Chapin, Thomas P.; Todd, Andrew S.
2012-01-01
Abandoned hard-rock mines can be a significant source of acid mine drainage (AMD) and toxic metal pollution to watersheds. In Colorado, USA, abandoned mines are often located in remote, high elevation areas that are snowbound for 7–8 months of the year. The difficulty in accessing these remote sites, especially during winter, creates challenging water sampling problems and major hydrologic and toxic metal loading events are often under sampled. Currently available automated water samplers are not well suited for sampling remote snowbound areas so the U.S. Geological Survey (USGS) has developed a new water sampler, the MiniSipper, to provide long-duration, high-resolution water sampling in remote areas. The MiniSipper is a small, portable sampler that uses gas bubbles to separate up to 250 five milliliter acidified samples in a long tubing coil. The MiniSipper operates for over 8 months unattended in water under snow/ice, reduces field work costs, and greatly increases sampling resolution, especially during inaccessible times. MiniSippers were deployed in support of an U.S. Environmental Protection Agency (EPA) project evaluating acid mine drainage inputs from the Pennsylvania Mine to the Snake River watershed in Summit County, CO, USA. MiniSipper metal results agree within 10% of EPA-USGS hand collected grab sample results. Our high-resolution results reveal very strong correlations (R2 > 0.9) between potentially toxic metals (Cd, Cu, and Zn) and specific conductivity at the Pennsylvania Mine site. The large number of samples collected by the MiniSipper over the entire water year provides a detailed look at the effects of major hydrologic events such as snowmelt runoff and rainstorms on metal loading from the Pennsylvania Mine. MiniSipper results will help guide EPA sampling strategy and remediation efforts in the Snake River watershed.
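The reported metal-conductivity correlations (R2 > 0.9) are ordinary linear correlations between dissolved metal concentration and specific conductivity. A sketch with invented data of the same flavor (none of these numbers come from the study):

```python
import numpy as np

# Synthetic illustration of the kind of metal-vs-conductivity
# relationship the abstract reports (R^2 > 0.9 for Cd, Cu, Zn).
# All values here are invented for demonstration.
rng = np.random.default_rng(1)
cond = rng.uniform(200.0, 800.0, 60)           # specific conductivity, uS/cm
zn = 0.005 * cond + rng.normal(0.0, 0.05, 60)  # dissolved Zn, mg/L

slope, intercept = np.polyfit(cond, zn, 1)     # Zn predicted from conductivity
r = np.corrcoef(cond, zn)[0, 1]
print(f"Zn ~ {slope:.4f} * cond + {intercept:.3f}, R^2 = {r**2:.3f}")
```

A strong regression of this kind lets an easily logged proxy (conductivity) stand in for discrete metal samples between MiniSipper collections, which is one reason such correlations are useful at remote sites.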
NASA Technical Reports Server (NTRS)
Stamnes, K.; Lie-Svendsen, O.; Rees, M. H.
1991-01-01
The linear Boltzmann equation can be cast in a form mathematically identical to the radiation-transport equation. A multigroup procedure is used to reduce the energy (or velocity) dependence of the transport equation to a series of one-speed problems. Each of these one-speed problems is equivalent to the monochromatic radiative-transfer problem, and existing software is used to solve this problem in slab geometry. The numerical code conserves particles in elastic collisions. Generic examples are provided to illustrate the applicability of this approach. Although this formalism can, in principle, be applied to a variety of test particle or linearized gas dynamics problems, it is particularly well-suited to study the thermalization of suprathermal particles interacting with a background medium when the thermal motion of the background cannot be ignored. Extensions of the formalism to include external forces and spherical geometry are also feasible.
Beverage Cans Used for Sediment Collection.
ERIC Educational Resources Information Center
Studlick, Joseph R. J.; Trautman, Timothy A.
1979-01-01
Beverage cans are well suited for use as sediment collection and storage containers. Advantages include being free, readily available, and the correct size for many samples. Instructions for the selection, preparation, and use of cans in sediment collection and storage are provided. (RE)
Seeking the Signs of Life: Assessing the Presence of Biosignatures in the Returned Sample Suite
NASA Astrophysics Data System (ADS)
iMOST Team; Des Marais, D. J.; Grady, M. M.; Shaheen, R.; Steele, A.; Westall, F.; Altieri, F.; Amelin, Y.; Ammannito, E.; Anand, M.; Beaty, D. W.; Benning, L. G.; Bishop, J. L.; Borg, L. E.; Boucher, D.; Brucato, J. R.; Busemann, H.; Campbell, K. A.; Carrier, B. L.; Czaja, A. D.; Debaille, V.; Dixon, M.; Ehlmann, B. L.; Farmer, J. D.; Fernandez-Remolar, D. C.; Fogarty, J.; Glavin, D. P.; Goreva, Y. S.; Hallis, L. J.; Harrington, A. D.; Hausrath, E. M.; Herd, C. D. K.; Horgan, B.; Humayun, M.; Kleine, T.; Kleinhenz, J.; Mangold, N.; Mackelprang, R.; Mayhew, L. E.; McCubbin, F. M.; McCoy, J. T.; McLennan, S. M.; McSween, H. Y.; Moser, D. E.; Moynier, F.; Mustard, J. F.; Niles, P. B.; Ori, G. G.; Raulin, F.; Rettberg, P.; Rucker, M. A.; Schmitz, N.; Sefton-Nash, E.; Sephton, M. A.; Shuster, D. L.; Siljestrom, S.; Smith, C. L.; Spry, J. A.; Swindle, T. D.; ten Kate, I. L.; Tosca, N. J.; Usui, T.; Van Kranendonk, M. J.; Wadhwa, M.; Weiss, B. P.; Werner, S. C.; Wheeler, R. M.; Zipfel, J.; Zorzano, M. P.
2018-04-01
Biosignatures are objects, substances, and/or patterns whose origins require life. They occur as organic compounds, stable isotope patterns, minerals, and morphologies. Each type requires particular modes of preservation and analytical measurements.
Signs of Perchlorates and Sulfur Containing Compounds
2012-12-03
NASA Mars rover Curiosity has detected sulfur, chlorine, and oxygen compounds in fine grains scooped by the rover at a wind drift site called Rocknest. The grains were heated and analyzed using the rover Sample Analysis at Mars instrument suite.
OCAMS: The OSIRIS-REx Camera Suite
NASA Astrophysics Data System (ADS)
Rizk, B.; Drouet d'Aubigny, C.; Golish, D.; Fellows, C.; Merrill, C.; Smith, P.; Walker, M. S.; Hendershot, J. E.; Hancock, J.; Bailey, S. H.; DellaGiustina, D. N.; Lauretta, D. S.; Tanner, R.; Williams, M.; Harshman, K.; Fitzgibbon, M.; Verts, W.; Chen, J.; Connors, T.; Hamara, D.; Dowd, A.; Lowman, A.; Dubin, M.; Burt, R.; Whiteley, M.; Watson, M.; McMahon, T.; Ward, M.; Booher, D.; Read, M.; Williams, B.; Hunten, M.; Little, E.; Saltzman, T.; Alfred, D.; O'Dougherty, S.; Walthall, M.; Kenagy, K.; Peterson, S.; Crowther, B.; Perry, M. L.; See, C.; Selznick, S.; Sauve, C.; Beiser, M.; Black, W.; Pfisterer, R. N.; Lancaster, A.; Oliver, S.; Oquest, C.; Crowley, D.; Morgan, C.; Castle, C.; Dominguez, R.; Sullivan, M.
2018-02-01
The OSIRIS-REx Camera Suite (OCAMS) will acquire images essential to collecting a sample from the surface of Bennu. During proximity operations, these images will document the presence of satellites and plumes, record spin state, enable an accurate model of the asteroid's shape, and identify any surface hazards. They will confirm the presence of sampleable regolith on the surface, observe the sampling event itself, and image the sample head in order to verify its readiness to be stowed. They will document Bennu's history as an example of early solar system material, as a microgravity body with a planetesimal size-scale, and as a carbonaceous object. OCAMS is fitted with three cameras. The MapCam will record color images of Bennu as a point source on approach to the asteroid in order to connect Bennu's ground-based point-source observational record to later higher-resolution surface spectral imaging. The SamCam will document the sample site before, during, and after it is disturbed by the sample mechanism. The PolyCam, using its focus mechanism, will observe the sample site at sub-centimeter resolutions, revealing surface texture and morphology. While their imaging requirements divide naturally between the three cameras, they preserve a strong degree of functional overlap. OCAMS and the other spacecraft instruments will allow the OSIRIS-REx mission to collect a sample from a microgravity body on the same visit during which it was first optically acquired from long range, a useful capability as humanity reaches out to explore near-Earth, Main-Belt and Jupiter Trojan asteroids.
CCP4i2: the new graphical user interface to the CCP4 program suite
Potterton, Liz; Ballard, Charles; Dodson, Eleanor; Evans, Phil R.; Keegan, Ronan; Krissinel, Eugene; Stevenson, Kyle; Lebedev, Andrey; McNicholas, Stuart J.; Noble, Martin; Pannu, Navraj S.; Roth, Christian; Sheldrick, George; Skubak, Pavol; Uski, Ville
2018-01-01
The CCP4 (Collaborative Computational Project, Number 4) software suite for macromolecular structure determination by X-ray crystallography brings together many programs and libraries that, by means of well-established conventions, interoperate effectively without adhering to strict design guidelines. Because of this inherent flexibility, users are often presented with diverse, even divergent, choices for solving every type of problem. Recently, CCP4 introduced CCP4i2, a modern graphical interface designed to help structural biologists to navigate the process of structure determination, with an emphasis on pipelining and the streamlined presentation of results. In addition, CCP4i2 provides a framework for writing structure-solution scripts that can be built up incrementally to create increasingly automatic procedures. PMID:29533233
Northwest Africa 1401: A Polymict Cumulate Eucrite with a Unique Ferroan Heteradcumulate Mafic Clast
NASA Technical Reports Server (NTRS)
Mittlefehldt, David W.; Killgore, Marvin
2003-01-01
The howardite, eucrite and diogenite (HED) clan is the largest suite of achondrites available for study. The suite gives us a unique view of the magmatism that affected some asteroids early in solar system history. One problem with mining the HED clan for petrogenetic information is that there is only limited petrologic diversity among the rock types. Thus, discovering unusual HED materials holds the potential for revealing new insights into the petrologic evolution of the HED parent asteroid. Here we report on a petrologic study of an unusual 27-gram polymict eucrite, Northwest Africa (NWA) 1401. The thin section studied (approx. 20 x 10 mm) contains one large, ferroan clast described separately. The remainder of the rock, including mineral fragments and other, smaller lithic clasts, forms the host breccia.
Prevention of decompression sickness during extravehicular activity in space: a review.
Tokumaru, O
1997-12-01
Extended and more frequent extravehicular activity (EVA) is planned in NASA's future space programs. The more EVAs that are conducted, the higher the anticipated incidence of decompression sickness (DCS). Since Japan is also promoting the Space Station Freedom project with NASA, DCS during EVA will be an inevitable complication. The author reviewed the pathophysiology of DCS and detailed four possible ways of preventing decompression sickness during EVA in space: (1) higher-pressure suit technology; (2) preoxygenation/prebreathing; (3) staged decompression; and (4) habitat or vehicle pressurization. Among these measures, development of zero-prebreathe higher-pressure suit technology seems most ideal, but for economic and technical reasons, and for cases of emergency, the other methods must also be improved. Unsolved problems such as repeated decompression and oxygen toxicity are also discussed.
Use of ancillary data to improve the analysis of forest health indicators
Dave Gartner
2013-01-01
In addition to its standard suite of mensuration variables, the Forest Inventory and Analysis (FIA) program of the U.S. Forest Service also collects data on forest health variables formerly measured by the Forest Health Monitoring program. FIA obtains forest health information on a subset of the base sample plots. Due to the sample size differences, the two sets of...
ERIC Educational Resources Information Center
Vogt, Dawne S.; Proctor, Susan P.; King, Daniel W.; King, Lynda A.; Vasterling, Jennifer J.
2008-01-01
The Deployment Risk and Resilience Inventory (DRRI) is a suite of scales that can be used to assess deployment-related factors implicated in the health and well-being of military veterans. Although initial evidence for the reliability and validity of DRRI scales based on Gulf War veteran samples is encouraging, evidence with respect to a more…
A robust set of black walnut microsatellites for parentage and clonal identification
Rodney L. Robichaud; Jeffrey C. Glaubitz; Olin E. Rhodes; Keith Woeste
2006-01-01
We describe the development of a robust and powerful suite of 12 microsatellite marker loci for use in genetic investigations of black walnut and related species. These 12 loci were chosen from a set of 17 candidate loci used to genotype 222 trees sampled from a 38-year-old black walnut progeny test. The 222 genotypes represent a sampling from the broad geographic...
2000-11-13
Water column profiles and samples from Ashumet Pond were collected by the School for Marine Science and Technology (SMAST) under a water sampling plan for nutrient analyses. Alkaline phosphatase is well suited as an indicator of phosphate limitation in natural waters; in this study, alkaline phosphatase is used to understand the nutrient limitation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boyet, Maud; Carlson, Richard W.; Borg, Lars E.
Here, we have measured Sm–Nd systematics, including the short-lived 146Sm–142Nd chronometer, in lunar ferroan anorthositic suite (FAS) whole rocks (15415, 62236, 62255, 65315, 60025). At least some members of the suite are thought to be primary crystallization products formed by plagioclase flotation during crystallization of the lunar magma ocean (LMO). Most of these samples, except 62236, have not been exposed to galactic cosmic rays for a long period and thus require minimal correction to their 142Nd isotope composition. These samples all have measured deficits in 142Nd relative to the JNdi-1 terrestrial standard in the range –45 to –21 ppm. The range is –45 to –15 ppm once the 62236 142Nd/144Nd ratio is corrected for neutron-capture effects. Analyzed FAS samples do not define a single isochron in either 146Sm–142Nd or 147Sm–143Nd systematics, suggesting that they either do not have the same crystallization age, come from different sources, or have suffered isotopic disturbance. Because the age is not known for some samples, we explore the implications of their initial isotopic compositions for crystallization ages in the first 400 Ma of solar system history, a time interval that covers all the ages determined for the ferroan anorthositic suite whole rocks as well as different estimates for the crystallization of the LMO. 62255 has the largest deficit in initial 142Nd and does not appear to have followed the same differentiation path as the other FAS samples. The large deficit in 142Nd of FAN 62255 may suggest a crystallization age around 60–125 Ma after the beginning of solar system accretion. This result provides essential information about the age of the giant impact that formed the Moon. The initial Nd isotopic compositions of FAS samples can be matched either with a bulk Moon with chondritic Sm/Nd ratio but enstatite-chondrite-like initial 142Nd/144Nd (e.g. 10 ppm below modern terrestrial), or a bulk Moon with superchondritic Sm/Nd ratio and initial 142Nd/144Nd similar to ordinary chondrites.
Validating a large geophysical data set: Experiences with satellite-derived cloud parameters
NASA Technical Reports Server (NTRS)
Kahn, Ralph; Haskins, Robert D.; Knighton, James E.; Pursch, Andrew; Granger-Gallegos, Stephanie
1992-01-01
We are validating the global cloud parameters derived from the satellite-borne HIRS2 and MSU atmospheric sounding instrument measurements, and are using the analysis of these data as one prototype for studying large geophysical data sets in general. The HIRS2/MSU data set contains a total of 40 physical parameters, filling 25 MB/day; raw HIRS2/MSU data are available for a period exceeding 10 years. Validation involves developing a quantitative sense for the physical meaning of the derived parameters over the range of environmental conditions sampled. This is accomplished by comparing the spatial and temporal distributions of the derived quantities with similar measurements made using other techniques, and with model results. The data handling needed for this work is possible only with the help of a suite of interactive graphical and numerical analysis tools. Level 3 (gridded) data is the common form in which large data sets of this type are distributed for scientific analysis. We find that Level 3 data is inadequate for the data comparisons required for validation. Level 2 data (individual measurements in geophysical units) is needed. A sampling problem arises when individual measurements, which are not uniformly distributed in space or time, are used for the comparisons. Standard 'interpolation' methods involve fitting the measurements for each data set to surfaces, which are then compared. We are experimenting with formal criteria for selecting geographical regions, based upon the spatial frequency and variability of measurements, that allow us to quantify the uncertainty due to sampling. As part of this project, we are also dealing with ways to keep track of constraints placed on the output by assumptions made in the computer code. 
The need to work with Level 2 data introduces a number of other data handling issues, such as accessing data files across machine types, meeting large data storage requirements, accessing other validated data sets, processing speed and throughput for interactive graphical work, and problems relating to graphical interfaces.
Hierarchical Protein Free Energy Landscapes from Variationally Enhanced Sampling.
Shaffer, Patrick; Valsson, Omar; Parrinello, Michele
2016-12-13
In recent work, we demonstrated that it is possible to obtain approximate representations of high-dimensional free energy surfaces with variationally enhanced sampling (Shaffer, P.; Valsson, O.; Parrinello, M. Proc. Natl. Acad. Sci. 2016, 113, 17). The high-dimensional space considered in that work was the set of backbone dihedral angles of a small peptide, Chignolin, and the high-dimensional free energy surface was approximated as the sum of many two-dimensional terms plus an additional term representing an initial estimate. In this paper, we build on that work and demonstrate that we can calculate high-dimensional free energy surfaces of very high accuracy by incorporating additional terms. The additional terms apply to a set of collective variables that are coarser than the base set of collective variables. In this way, it is possible to build hierarchical free energy surfaces composed of terms that act on different length scales. We test the accuracy of these free energy landscapes for the proteins Chignolin and Trp-cage by constructing simple coarse-grained models and comparing results from the coarse-grained model to results from atomistic simulations. The approach described in this paper is ideally suited for problems in which the free energy surface has important features on different length scales or in which there is some natural hierarchy.
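The hierarchical decomposition described in this abstract can be written schematically; the notation below is ours, not the authors' (base collective variables s_i, pairwise terms over the base set, and corrective terms over coarser collective variables c_k):

```latex
% Schematic hierarchical ansatz for a high-dimensional free energy
% surface over base collective variables s = (s_1, ..., s_D).
% F_0: initial estimate; F_{ij}: pairwise terms over the base CVs;
% G_k: corrective terms acting on coarser CVs c_k(s).
F(\mathbf{s}) \;\approx\; F_0(\mathbf{s})
   \;+\; \sum_{i<j} F_{ij}(s_i, s_j)
   \;+\; \sum_{k} G_k\!\left(c_k(\mathbf{s})\right)
```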
Koehler Leman, Julia; Bonneau, Richard
2018-04-03
Membrane proteins composed of soluble and membrane domains are often studied one domain at a time. However, to understand the biological function of entire protein systems and their interactions with each other and drugs, knowledge of full-length structures or models is required. Although few computational methods exist that could potentially be used to model full-length constructs of membrane proteins, none of these methods are perfectly suited for the problem at hand. Existing methods require an interface or knowledge of the relative orientations of the domains or are not designed for domain assembly, and none of them are developed for membrane proteins. Here we describe the first domain assembly protocol specifically designed for membrane proteins that assembles intra- and extracellular soluble domains and the transmembrane domain into models of the full-length membrane protein. Our protocol does not require an interface between the domains and samples possible domain orientations based on backbone dihedrals in the flexible linker regions, created via fragment insertion, while keeping the transmembrane domain fixed in the membrane. For five examples tested, our method mp_domain_assembly, implemented in RosettaMP, samples domain orientations close to the known structure and is best used in conjunction with experimental data to reduce the conformational search space.
Xinyang Li; Poli, Riccardo; Valenza, Gaetano; Scilingo, Enzo Pasquale; Citi, Luca
2017-07-01
Assessment and recognition of perceived well-being has wide applications in the development of assistive healthcare systems for people with physical and mental disorders. In practical data collection, these systems need to be less intrusive, and respect users' autonomy and willingness as much as possible. As a result, self-reported data are not necessarily available at all times. Conventional classifiers, which usually require feature vectors of a prefixed dimension, are not well suited for this problem. To address the issue of non-uniformly sampled measurements, in this study we propose a method for the modelling and prediction of self-reported well-being scores based on a linear dynamic system. Within the model, we formulate different features as observations, making predictions even in the presence of inconsistent and irregular data. We evaluate the proposed method with synthetic data, as well as real data from two patients diagnosed with cancer. In the latter, self-reported scores from three well-being-related scales were collected over a period of approximately 60 days. Prompted each day, the patients had the choice whether to respond or not. Results show that the proposed model is able to track and predict the patients' perceived well-being dynamics despite the irregularly sampled data.
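The abstract does not give the model equations, so as a hedged illustration: a minimal local-level (random-walk) state-space model with a Kalman filter can track a scalar self-report score and simply skip the update step on days with no response. All parameter values and names below are our own assumptions, not the authors' model.

```python
import numpy as np

def filter_scores(y, q=0.05, r=1.0):
    """Local-level Kalman filter for an irregularly self-reported score.

    y : daily scores, with np.nan on days the patient chose not to
        respond (the missing-observation case described above).
    q : assumed process-noise variance (day-to-day well-being drift).
    r : assumed observation-noise variance of a single self-report.
    Returns filtered state estimates and their variances.
    """
    x, P = 0.0, 1e6            # diffuse prior: the first report dominates
    xs, Ps = [], []
    for obs in y:
        P = P + q              # predict: random-walk state model
        if not np.isnan(obs):  # update only when a report was given
            K = P / (P + r)    # Kalman gain
            x = x + K * (obs - x)
            P = (1.0 - K) * P
        xs.append(x)
        Ps.append(P)
    return np.array(xs), np.array(Ps)

# Example: a short well-being trace with missing days
y = np.array([5.0, np.nan, 5.2, np.nan, np.nan, 5.1, 5.0, np.nan, 5.3])
est, var = filter_scores(y)
```

Note how the state variance grows through unanswered days and shrinks at each report, which is what lets the model make predictions "even in the presence of inconsistent and irregular data."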
Rethinking the lecture: the application of problem based learning methods to atypical contexts.
Rogal, Sonya M M; Snider, Paul D
2008-05-01
Problem based learning is a teaching and learning strategy that uses a problematic stimulus as a means of motivating and directing students to develop and acquire knowledge. Problem based learning is typically used with small groups attending a series of sessions. This article describes the principles of problem based learning and its application in atypical contexts: large groups attending discrete, stand-alone sessions. The principles of problem based learning are based on Socratic teaching, constructivism and group facilitation. To demonstrate the application of problem based learning in an atypical setting, this article focuses on the graduate nurse intake of a teaching hospital, where the groups are relatively large and meet for single-day sessions. The modified applications of problem based learning to meet the needs of atypical groups are described, and the article provides a step-by-step guide to constructing a problem based learning package for large, single-session groups. Nurse educators facing similar groups will find they can modify problem based learning to suit their teaching context.
Carbon Isotopic Measurements of Amino Acids in Stardust-Returned Samples
NASA Technical Reports Server (NTRS)
Elsila, Jamie
2009-01-01
NASA's Stardust spacecraft returned samples from comet 81P/Wild 2 to Earth in January 2006. Preliminary examinations revealed the presence of a suite of organic compounds including several amines and amino acids, but the origin of these compounds could not be identified. Here, we present the carbon isotopic ratios of glycine and ε-aminocaproic acid (EACA), the two most abundant amino acids, in Stardust-returned foil samples, measured by gas chromatography-combustion-isotope ratio mass spectrometry coupled with quadrupole mass spectrometry (GC-QMS/IRMS).
NASA Technical Reports Server (NTRS)
Houseman, John; Patzold, Jack D.; Jackson, Julie R.; Brown, Pamela R.
1999-01-01
The loading of spacecraft with hydrazine-type fuels has long been recognized as a hazardous operation. This has led to safety strategies that include the use of SCAPE protective suits for personnel. The use of SCAPE suits has an excellent safety record; however, there are associated drawbacks. These include the high cost of maintaining and cleaning the suits, reduced mobility and dexterity when wearing the suits, the requirement for extensive specialized health and safety training, and the need to rotate personnel every two hours. A study was undertaken to look at procedures and/or equipment to eliminate or reduce the time spent in SCAPE-type operations. The major conclusions are drawn from observations of the loading of the JPL/NASA spacecraft Deep Space One (DS1) at KSC and the loading of a commercial communications satellite by Motorola at Vandenberg Air Force Base. The DS1 operations require extensive use of SCAPE suits, while the Motorola operation uses only SPLASH attire with a two-man team on standby in SCAPE. The Motorola team used very different loading equipment and procedures based on an integrated approach involving the propellant supplier. Overall, the Motorola approach was very clean, much faster, and simpler than the DS1 procedure. The DS1 spacecraft used a bladder in the propellant tank, whereas the Motorola spacecraft used a Propellant Management Device (PMD). The Motorola approach cannot be used for tanks with bladders. To overcome this problem, some new procedures and new equipment are proposed to enable tanks with bladders to be loaded without using SCAPE, using a modified Motorola approach. Overall, it appears feasible to adopt the non-SCAPE approach while maintaining a very high degree of safety and reliability.
apGA: An adaptive parallel genetic algorithm
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liepins, G.E.; Baluja, S.
1991-01-01
We develop apGA, a parallel variant of the standard generational GA, that combines aggressive search with perpetual novelty, yet is able to preserve enough genetic structure to optimally solve variably scaled, non-uniform block deceptive and hierarchical deceptive problems. apGA combines elitism, adaptive mutation, adaptive exponential scaling, and temporal memory. We present empirical results for six classes of problems, including the DeJong test suite. Although we have not investigated hybrids, we note that apGA could be incorporated into other recent GA variants such as GENITOR, CHC, and the recombination stage of mGA. 12 refs., 2 figs., 2 tabs.
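As a hedged illustration of the ingredients named above (a generational GA with elitism and adaptive mutation), a minimal sketch on the OneMax toy problem might look like the following; the diversity-triggered mutation rule and all parameter values are our own choices, not apGA's:

```python
import numpy as np

def ga_onemax(n_bits=40, pop_size=60, gens=200, seed=1):
    """Toy generational GA: elitism + diversity-triggered adaptive mutation."""
    rng = np.random.default_rng(seed)
    pop = rng.integers(0, 2, (pop_size, n_bits))
    for _ in range(gens):
        fit = pop.sum(axis=1)                  # OneMax fitness: count of 1s
        elite = pop[fit.argmax()].copy()       # elitism: remember the best
        p = fit / fit.sum()                    # fitness-proportionate selection
        parents = pop[rng.choice(pop_size, size=2 * pop_size, p=p)]
        cut = rng.integers(1, n_bits, pop_size)
        kids = np.empty_like(pop)
        for i in range(pop_size):              # one-point crossover
            a, b = parents[2 * i], parents[2 * i + 1]
            kids[i, :cut[i]] = a[:cut[i]]
            kids[i, cut[i]:] = b[cut[i]:]
        # adaptive mutation: raise the rate when population diversity collapses
        diversity = pop.std(axis=0).mean()
        rate = 1.0 / n_bits if diversity > 0.05 else 5.0 / n_bits
        kids ^= (rng.random(kids.shape) < rate).astype(kids.dtype)
        kids[0] = elite                        # the elite survives unmutated
        pop = kids
    return int(pop.sum(axis=1).max())

best = ga_onemax()
```

Elitism guarantees the best fitness never decreases across generations, while the diversity trigger supplies the "perpetual novelty" the abstract refers to when the population begins to converge.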
Inductrack III configuration--a maglev system for high loads
Post, Richard F
2015-03-24
Inductrack III configurations are suited for use in transporting heavy freight loads. Inductrack III addresses a problem associated with the cantilevered track of the Inductrack II configuration. The use of a cantilevered track could present mechanical design problems in attempting to achieve a strong enough track system such that it would be capable of supporting very heavy loads. In Inductrack III, the levitating portion of the track can be supported uniformly from below, as the levitating Halbach array used on the moving vehicle is a single-sided one, thus does not require the cantilevered track as employed in Inductrack II.
Validation of US3D for Capsule Aerodynamics using 05-CA Wind Tunnel Test Data
NASA Technical Reports Server (NTRS)
Schwing, Alan
2012-01-01
RANS is ill-suited for analysis of these problems. For transonic and supersonic cases, US3D shows fairly good agreement using DES across all cases. Separation prediction and the resulting backshell pressure are problems across all portions of this analysis, and they become more of an issue at lower Mach numbers: stagnation pressures are not as large, so the wake and backshell are more significant, and errors on the shoulder act on a large area, so small discrepancies manifest as large changes. Subsonic comparisons are mixed with regard to integrated loads and merit more attention, though the dominant unsteady behavior (wake shedding) is resolved well.
Code Verification Results of an LLNL ASC Code on Some Tri-Lab Verification Test Suite Problems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anderson, S R; Bihari, B L; Salari, K
As scientific codes become more complex and involve larger numbers of developers and algorithms, chances for algorithmic implementation mistakes increase. In this environment, code verification becomes essential to building confidence in the code implementation. This paper will present first results of a new code verification effort within LLNL's B Division. In particular, we will show results of code verification of the LLNL ASC ARES code on the test problems: Su Olson non-equilibrium radiation diffusion, Sod shock tube, Sedov point blast modeled with shock hydrodynamics, and Noh implosion.
Adapting the Recreation Opportunity Spectrum (ROS) for states lands planning
Susan Bulmer; Linda Henzel; Ann Mates; Matt Moore; Thomas A. More
2002-01-01
The huge population increases anticipated over the next century make the problem of identifying and conserving open space critical. While the Recreation Opportunity Spectrum is undoubtedly the most sophisticated recreation inventory system established to date, it was designed for, and is best suited to, the large tracts of public lands in the western U.S. In this paper...
Philip A. Loring; F. Stuart Chapin; S. Craig Gerlach
2008-01-01
Computational thinking (CT) is a way to solve problems and understand complex systems that draws on concepts fundamental to computer science and is well suited to the challenges that face researchers of complex, linked social-ecological systems. This paper explores CT's usefulness to sustainability science through the application of the services-oriented...
A SOCIO-ECONOMIST LOOKS AT THE CURRENT VALUES AND CHANGING NEEDS OF YOUTH. FINAL DRAFT.
ERIC Educational Resources Information Center
THEOBALD, ROBERT
Man has achieved the power to create an environment suited to his needs. This power comes from developments in the utilization of energy, advancements in chemistry, an increase in scientific problem-solving ability, and computer technology. These sources of power result in the drive toward the development of destructive power, the capability of…
1993-03-01
A joint time-frequency representation is needed to characterize such a signature. The pseudo Wigner-Ville distribution is ideally suited for portraying a non-stationary signal, representing its features jointly in time and frequency. Subject terms: pseudo Wigner-Ville distribution, analytic signal, Hilbert transform.
ERIC Educational Resources Information Center
Barry, Adam E.; Misra, Ranjita; Dennis, Maurice
2006-01-01
Driving a vehicle under the influence of alcohol is a major public health concern. By distinguishing the type of individuals violating driving while intoxicated (DWI) sanctions, intervention programs will be better suited to reduce drinking and driving. The purpose of this study was to examine the personal characteristics of DWI offenders and…
A Case Study of a Reluctant Word Processor: A Look at One Student in a Word Processing Classroom.
ERIC Educational Resources Information Center
Sloane, Sarah
A case study examined the writing problems of Jay, a freshman composition student at the University of Massachusetts, to determine how teachers should handle students whose composing styles are not suited to writing with word processors. Interviews, classroom observation, and careful analyses of Jay's essays in progress and logsheets were…
Team Training through Communications Control
1982-02-01
Topics include the operational environment, team training research issues, the training approach, team communications, and models of operator behavior. Candidate technologies discussed include a text-to-speech voice generation system (Votrax has recently marketed such a device, and others may soon follow suit), which, if on the market soon, would be investigated carefully for its applicability to the team training problem, and a speech replay system designed to produce speech from…
The Fraser Experimental Forest ... its work and aims
L. D. Love
1960-01-01
In 1937 the Fraser Experimental Forest was established in the heart of the Central Rocky Mountains near the Continental Divide 65 miles north and west of Denver. This 36-square-mile outdoor research laboratory is well suited for the study of pressing problems related to water yield from high-elevation forests and alpine areas. (Originally published in 1952; revised in...
ERIC Educational Resources Information Center
Kim, Min Kyu
2012-01-01
It is generally accepted that the cognitive development for a wide range of students can be improved through adaptive instruction-learning environments optimized to suit individual needs (e.g., Cronbach, Am Psychol 12:671-684, 1957; Lee and Park, in Handbook of research for educational communications and technology, Taylor & Francis Group,…
Interferometer using a 3 × 3 coupler and Faraday mirrors
NASA Astrophysics Data System (ADS)
Breguet, J.; Gisin, N.
1995-06-01
A new interferometric setup using a 3 × 3 coupler and two Faraday mirrors is presented. It has the advantages of being built only with passive components, of freedom from the polarization fading problem, and of operation with an LED. It is well suited for sensing time-dependent signals and does not depend on reciprocal or nonreciprocal constant perturbations.
Roles of Human Factors and Ergonomics in Meeting the Challenge of Terrorism
ERIC Educational Resources Information Center
Nickerson, Raymond S.
2011-01-01
Human factors and ergonomics research focuses on questions pertaining to the design of devices, systems, and procedures with the goal of making sure that they are well suited to human use and focuses on studies of the interaction of people with simple and complex systems and machines. Problem areas studied include the allocation of function to…
The Cognitive Consequences of Patterns of Information Flow
NASA Technical Reports Server (NTRS)
Hutchins, Edwin
1999-01-01
The flight deck of a modern commercial airliner is a complex system consisting of two or more crew and a suite of technological devices. The flight deck of the state-of-the-art Boeing 747-400 is shown. When everything goes right, all modern flight decks are easy to use. When things go sour, however, automated flight decks provide opportunities for new kinds of problems. A recent article in Aviation Week cited industry concern over the problem of verifying the safety of complex systems on automated, digital aircraft, stating that the industry must "guard against the kind of incident in which people and the automation seem to mismanage a minor occurrence or non-routine situation into larger trouble." The design of automated flight deck systems that flight crews find easy to use safely is a challenge in part because this design activity requires a theoretical perspective which can simultaneously cover the interactions of people with each other and with technology. In this paper, some concepts are introduced that can be used to understand the flight deck as a system composed of two or more pilots and a complex suite of automated devices.
Wavelet-Smoothed Interpolation of Masked Scientific Data for JPEG 2000 Compression
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brislawn, Christopher M.
2012-08-13
How should we manage scientific data with 'holes'? Some applications, like JPEG 2000, expect logically rectangular data, but some sources, like the Parallel Ocean Program (POP), generate data that isn't defined on certain subsets. We refer to grid points that lack well-defined, scientifically meaningful sample values as 'masked' samples. Wavelet smoothing is a highly scalable interpolation scheme for regions with complex boundaries on logically rectangular grids. Computation is based on forward/inverse discrete wavelet transforms, so runtime complexity and memory scale linearly with respect to sample count. Efficient state-of-the-art minimal realizations yield small constants (O(10)) for arithmetic complexity scaling, and in-situ implementation techniques make optimal use of memory. Implementation in two dimensions using tensor product filter banks is straightforward and should generalize routinely to higher dimensions. No hand-tuning is required when the interpolation mask changes, making the method attractive for problems with time-varying masks and well suited for interpolating undefined samples prior to JPEG 2000 encoding. The method outperforms global mean interpolation, as judged by both SNR rate-distortion performance and low-rate artifact mitigation, for data distributions whose histograms do not take the form of sharply peaked, symmetric, unimodal probability density functions. These performance advantages can hold even for data whose distribution differs only moderately from the peaked unimodal case, as demonstrated by POP salinity data. The interpolation method is very general and is not tied to any particular class of applications; it could be used for more generic smooth interpolation.
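The scheme described here rests on iterating a forward/inverse discrete wavelet transform while reimposing the known samples. A toy one-dimensional version of that idea, using Haar wavelets with a damping factor and iteration count chosen by us (not taken from the report), can be sketched as:

```python
import numpy as np

def haar_fwd(a, levels):
    """Multi-level orthonormal Haar DWT (length divisible by 2**levels)."""
    details, s = [], a.copy()
    for _ in range(levels):
        d = (s[0::2] - s[1::2]) / np.sqrt(2.0)
        s = (s[0::2] + s[1::2]) / np.sqrt(2.0)
        details.append(d)
    return s, details

def haar_inv(s, details):
    """Exact inverse of haar_fwd."""
    for d in reversed(details):
        a = np.empty(2 * len(s))
        a[0::2] = (s + d) / np.sqrt(2.0)
        a[1::2] = (s - d) / np.sqrt(2.0)
        s = a
    return s

def wavelet_fill(x, valid, levels=3, damp=0.3, iters=100):
    """Fill masked samples by iterating: damp the wavelet detail
    coefficients (smoothing), then reimpose the known samples."""
    y = x.copy()
    y[~valid] = x[valid].mean()        # initial guess inside the holes
    for _ in range(iters):
        s, details = haar_fwd(y, levels)
        details = [damp * d for d in details]
        y = haar_inv(s, details)
        y[valid] = x[valid]            # known samples stay exact
    return y

# Example: a linear ramp with an 8-sample hole
x = np.linspace(0.0, 1.0, 64)
valid = np.ones(64, dtype=bool)
valid[28:36] = False
filled = wavelet_fill(x, valid)
```

Because each iteration is a wavelet transform pair, this keeps the linear cost in sample count that the abstract emphasizes; the production method presumably uses better filter banks than plain Haar.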
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mohamed, Abdul Aziz; Al Rashid Megat Ahmad, Megat Harun; Md Idris, Faridah
2010-01-05
The Malaysian Nuclear Agency's (Nuclear Malaysia) Small Angle Neutron Scattering (SANS) facility, MYSANS, utilizes the low flux of thermal neutrons at the agency's 1 MW TRIGA reactor. By design, the 8 m SANS facility allows object resolutions in the range between 5 and 80 nm to be obtained. It can be used to study alloys, ceramics and polymers in problem areas that involve samples containing strong scatterers or contrast. The current SANS system is capable of measuring Q only over a limited range, with a PSD (128x128) fixed at 4 m from the sample. The layout of the existing reactor hall prohibits rebuilding MYSANS, so the distances between the wavelength selector (HOPG), the sample, and the PSD cannot be increased for a wider Q range. The neutron flux at the current sample holder is very low, around 10^3 n/cm^2/sec, so it is important to rebuild MYSANS to maximize the utilization of neutrons. Over the years, the facility has undergone maintenance and some changes have been made; modification of the secondary shutter and its control has been carried out to improve the safety level of the instrument. A compact micro-focus SANS method, together with an improved cryostat system, can suit this objective. This paper explains some design concepts and approaches for achieving higher flux and the modifications needed to establish the micro-focused SANS.
Saido, Katsuhiko; Koizumi, Koshiro; Sato, Hideto; Ogawa, Naoto; Kwon, Bum Gun; Chung, Seon-Yong; Kusui, Takashi; Nishimura, Masahiko; Kodera, Yoichi
2014-03-01
Pollution caused by plastic debris is an environmental problem of increasing concern in the oceans. Among plastic polymers, polystyrene (PS) is one of the most problematic due to the direct public health risk associated with its dispersion, as well as the numerous adverse environmental impacts which arise both directly from the plastic and from its degradation products. Little is known about their potential distribution characteristics throughout the oceans. For the first time, we report here on the regional distribution of styrene monomer (SM), styrene dimers (SD; 2,4-diphenyl-1-butene, SD1; 1,3-diphenyl propane, SD2), and styrene trimer (2,4,6-triphenyl-1-hexene: ST1), as products of PS decomposition, determined from samples of sand and seawater from the shorelines of the North-West Pacific Ocean. In order to quantitatively determine SM, SD (=SD1+SD2), and ST1, a new analytical method was developed. The detection limit was 3.3 μg L(-1), based on a signal-to-noise ratio of three, which was well suited to quantifying the levels of SM, SD, and ST1 in the samples. Surprisingly, the concentrations of SM, SD, and ST1 in sand samples from the shorelines were consistently greater than those in seawater samples from the same locations. The results of this study suggest that SM, SD, and ST1 can be widely dispersed throughout the North-West Pacific Ocean. Copyright © 2013 Elsevier B.V. All rights reserved.
Compressive sensing based wireless sensor for structural health monitoring
NASA Astrophysics Data System (ADS)
Bao, Yuequan; Zou, Zilong; Li, Hui
2014-03-01
Data loss is a common problem for monitoring systems based on wireless sensors. Reliable communication protocols, which enhance communication reliability by repetitively transmitting unreceived packets, are one approach to tackling the problem of data loss. An alternative approach tolerates data loss to some extent and seeks to recover the lost data from an algorithmic point of view. Compressive sensing (CS) provides such a data-loss recovery technique. This technique can be embedded into smart wireless sensors and effectively increases wireless communication reliability without retransmitting the data. The basic idea of the CS-based approach is that, instead of transmitting the raw signal acquired by the sensor, a transformed signal, generated by projecting the raw signal onto a random matrix, is transmitted. Some data loss may occur during the transmission of this transformed signal. However, according to the theory of CS, the raw signal can be effectively reconstructed from the received incomplete transformed signal, provided that the raw signal is compressible in some basis and the data loss ratio is low. This CS-based technique is implemented on the Imote2 smart sensor platform using the foundation of the Illinois Structural Health Monitoring Project (ISHMP) Service Tool-suite. To overcome the constraints of the limited onboard resources of wireless sensor nodes, a method called the random demodulator (RD) is employed to provide memory- and power-efficient construction of the random sampling matrix. Adaptations of the RD sampling matrix are made to accommodate data loss in wireless transmission and meet the objectives of the data recovery. The embedded program was tested in a series of sensing and communication experiments. Examples and a parametric study are presented to demonstrate the applicability of the embedded program as well as to show the efficacy of CS-based data loss recovery for real wireless SHM systems.
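The reconstruction idea can be sketched with a toy example: a sparse signal is projected onto a random matrix, and recovered from the (fewer-than-n) projections by orthogonal matching pursuit. This is an illustrative stand-in, not the Imote2/ISHMP implementation; the dimensions, Gaussian matrix, and greedy solver choice are all assumptions for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, k = 64, 32, 3              # signal length, measurements received, sparsity
x = np.zeros(n)
x[rng.choice(n, k, replace=False)] = rng.standard_normal(k)   # k-sparse signal

Phi = rng.standard_normal((m, n)) / np.sqrt(m)   # random projection matrix
y = Phi @ x                                      # measurements that survived transmission

def omp(Phi, y, k):
    """Orthogonal matching pursuit: greedily pick the column most
    correlated with the residual, then re-fit by least squares."""
    residual, support = y.copy(), []
    for _ in range(k):
        support.append(int(np.argmax(np.abs(Phi.T @ residual))))
        coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        residual = y - Phi[:, support] @ coef
    x_hat = np.zeros(Phi.shape[1])
    x_hat[support] = coef
    return x_hat

x_hat = omp(Phi, y, k)           # reconstruction from incomplete projections
```

With m well above the information-theoretic minimum for this sparsity, the recovery is exact up to numerical precision.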
Kilambi, Krishna Praneeth; Pacella, Michael S; Xu, Jianqing; Labonte, Jason W; Porter, Justin R; Muthu, Pravin; Drew, Kevin; Kuroda, Daisuke; Schueler-Furman, Ora; Bonneau, Richard; Gray, Jeffrey J
2013-12-01
Rounds 20-27 of the Critical Assessment of PRotein Interactions (CAPRI) provided a testing platform for computational methods designed to address a wide range of challenges. The diverse targets drove the creation of new computational tools and new combinations of existing ones. In this study, RosettaDock and other novel Rosetta protocols were used to successfully predict four of the 10 blind targets. For example, for the DNase domain of the Colicin E2-Im2 immunity protein complex, RosettaDock and RosettaLigand were used to predict the positions of water molecules at the interface, recovering 46% of the native water-mediated contacts. For the α-repeat Rep4-Rep2 and g-type lysozyme-PliG inhibitor complexes, homology models were built and standard and pH-sensitive docking algorithms were used to generate structures with interface RMSD values of 3.3 Å and 2.0 Å, respectively. A novel flexible sugar-protein docking protocol was also developed and used for structure prediction of the BT4661-heparin-like saccharide complex, recovering 71% of the native contacts. Challenges remain in the generation of accurate homology models for protein mutants and in sampling during global docking. On proteins designed to bind influenza hemagglutinin, only about half of the mutations affecting binding were identified (T55: 54%; T56: 48%). The prediction of the structure of the xylanase complex, involving homology modeling and multidomain docking, pushed the limits of global conformational sampling and did not result in any successful prediction. The diversity of problems at hand requires computational algorithms to be versatile; the recent additions to the Rosetta suite expand its capabilities to encompass more biologically realistic docking problems. Copyright © 2013 Wiley Periodicals, Inc.
Making Individual Prognoses in Psychiatry Using Neuroimaging and Machine Learning.
Janssen, Ronald J; Mourão-Miranda, Janaina; Schnack, Hugo G
2018-04-22
Psychiatric prognosis is a difficult problem. Making a prognosis requires looking far into the future, as opposed to making a diagnosis, which is concerned with the current state. During the follow-up period, many factors will influence the course of the disease. Combined with the usually scarcer longitudinal data and the variability in the definition of outcomes/transition, this makes prognostic predictions a challenging endeavor. Employing neuroimaging data in this endeavor introduces the additional hurdle of high dimensionality. Machine-learning techniques are especially suited to tackle this challenging problem. This review starts with a brief introduction to machine learning in the context of its application to clinical neuroimaging data. We highlight a few issues that are especially relevant for prediction of outcome and transition using neuroimaging. We then review the literature that discusses the application of machine learning for this purpose. Critical examination of the studies and their results with respect to the relevant issues revealed the following: 1) there is growing evidence for the prognostic capability of machine-learning-based models using neuroimaging; and 2) reported accuracies may be too optimistic owing to small sample sizes and the lack of independent test samples. Finally, we discuss options to improve the reliability of (prognostic) prediction models. These include new methodologies and multimodal modeling. Paramount, however, is our conclusion that future work will need to provide properly (cross-)validated accuracy estimates of models trained on sufficiently large datasets. Nevertheless, with the technological advances enabling acquisition of large databases of patients and healthy subjects, machine learning represents a powerful tool in the search for psychiatric biomarkers. Copyright © 2018 Society of Biological Psychiatry. Published by Elsevier Inc. All rights reserved.
Veerasami, Maroudam; Venkataraman, K; Karuppannan, Chitra; Shanmugam, Arun Attur; Prudhvi, Mallepaddi Chand; Holder, Thomas; Rathnagiri, Polavarapu; Arunmozhivarman, K; Raj, Gopal Dhinakar; Vordermeier, Martin; Mohana Subramanian, B
2018-03-01
Tuberculosis is a significant problem globally for domestic animals as well as captive and free-ranging wildlife. Rapid point-of-care (POC) serology kits are well suited for the diagnosis of TB in wild animals. However, wild animals are invariably exposed to environmental non-pathogenic Mycobacterium species, with the consequent development of cross-reacting antibodies. In the present study, a POC TB diagnosis kit was developed using a combination of pathogenic Mycobacteria-specific recombinant antigens and purified protein derivatives of pathogenic and non-pathogenic Mycobacteria. To benchmark the TB antibody detection kit, particularly with respect to specificity, which could not be determined in wildlife owing to the lack of samples from confirmed uninfected animals, we first tested well-characterized sera from 100 M. bovis-infected and 100 uninfected cattle. We then investigated the kit's performance using sera from wildlife, namely sloth bears (n = 74), elephants (n = 9), Cervidae (n = 14), Felidae (n = 21), Cape buffalo (n = 2), wild boar (n = 1), and wild dog (n = 1). In cattle, a sensitivity of 81% and a specificity of 90% were obtained. The diagnostic sensitivity of the kit was 94% when tested against known TB-positive sloth bear sera. Of the in-contact sloth bears, 47.4% turned seropositive using the rapid POC TB diagnostic kit. Seropositivity in other wild animals was 25% when their sera were tested with the kit. A point-of-care TB sero-diagnostic kit combining these proteins was thus developed and validated using sera from wild animals.
Rauscher, Sarah; Neale, Chris; Pomès, Régis
2009-10-13
Generalized-ensemble algorithms in temperature space have become popular tools to enhance conformational sampling in biomolecular simulations. A random walk in temperature leads to a corresponding random walk in potential energy, which can be used to cross over energetic barriers and overcome the problem of quasi-nonergodicity. In this paper, we introduce two novel methods: simulated tempering distributed replica sampling (STDR) and virtual replica exchange (VREX). These methods are designed to address the practical issues inherent in the replica exchange (RE), simulated tempering (ST), and serial replica exchange (SREM) algorithms. RE requires a large, dedicated, and homogeneous cluster of CPUs to function efficiently when applied to complex systems. ST and SREM both have the drawback of requiring extensive initial simulations, possibly adaptive, for the calculation of weight factors or potential energy distribution functions. STDR and VREX alleviate the need for lengthy initial simulations, and for synchronization and extensive communication between replicas. Both methods are therefore suitable for distributed or heterogeneous computing platforms. We perform an objective comparison of all five algorithms in terms of both implementation issues and sampling efficiency. We use disordered peptides in explicit water as test systems, for a total simulation time of over 42 μs. Efficiency is defined in terms of both structural convergence and temperature diffusion, and we show that these definitions of efficiency are in fact correlated. Importantly, we find that ST-based methods exhibit faster temperature diffusion and correspondingly faster convergence of structural properties compared to RE-based methods. Within the RE-based methods, VREX is superior to both SREM and RE. On the basis of our observations, we conclude that ST is ideal for simple systems, while STDR is well-suited for complex systems.
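All five algorithms compared above rest on the same Metropolis criterion for moving configurations between temperatures. As a hedged sketch (reduced units, illustrative values; this is the textbook replica-exchange rule, not the STDR/VREX bookkeeping itself), the acceptance test for exchanging configurations with energies E_i, E_j between temperatures T_i, T_j is:

```python
import math
import random

def swap_accept(E_i, E_j, T_i, T_j, kB=1.0):
    """Metropolis criterion for a temperature swap: accept with
    probability min(1, exp[(1/kB T_i - 1/kB T_j)(E_i - E_j)])."""
    delta = (1.0 / (kB * T_i) - 1.0 / (kB * T_j)) * (E_i - E_j)
    return delta >= 0 or random.random() < math.exp(delta)
```

A swap that moves the higher-energy configuration to the higher temperature is always accepted; the reverse is accepted only stochastically, which is what drives the random walk in temperature.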
Assessing the condition of bayous and estuaries: Bayou Chico Gulf of Mexico demonstration study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dickson, K.; Acevedo, M.; Waller, T.
1995-12-31
A demonstration study was conducted in May 1994 on Bayou Chico to assess the utility of various assessment and measurement endpoints in determining the condition of bayous and estuaries. Bayou Chico has water quality problems attributed to its low flushing rate and urban/industrial land use in its watershed. The sampling scheme assessed the within-sampling-station and spatial variability of measurement endpoints. Fourteen sampling stations in Bayou Chico and 3 stations in Pensacola Bay were selected based on an intensified EMAP sampling grid. Time- and space-coordinated sampling was conducted for: sediment contaminants and properties, sediment toxicity, water quality, benthic infauna, zooplankton and phytoplankton populations. Fish and crabs were also collected and analyzed for a suite of biomarkers and organic chemical residues. Primary productivity was measured via the light bottle/dark bottle oxygen method and via diurnal oxygen measurements made with continuous recording data sondes. Stream sites were evaluated for water and sediment quality, water and sediment toxicity, benthic invertebrates, and fish. Watershed analyses included assessment of land use/land cover (via SPOT and TM images), soils, pollution sources (point and non-point), and hydrography. These data were coordinated via an Arc/Info GIS system for display and spatial analysis. The 1994 survey data were used to parameterize environmental fate models such as SWMM (Storm Water Management Model), DYNHYD5 (WASP5 hydrodynamics model) and WASP5 (Water Quality Analysis Simulation Program) to make predictions about the dynamics and fate of chemical contaminants in Bayou Chico. This paper will present an overview and report on the results with regard to within-site and spatial variability in Bayou Chico. Conclusions on the efficacy of the assessment and measurement endpoints in evaluating the condition (health) of Bayou Chico will be presented.
The Effects of Lunar Dust on EVA Systems During the Apollo Missions
NASA Technical Reports Server (NTRS)
Gaier, James R.
2005-01-01
Mission documents from the six Apollo missions that landed on the lunar surface have been studied in order to catalog the effects of lunar dust on Extra-Vehicular Activity (EVA) systems, primarily the Apollo surface space suit. It was found that the effects could be sorted into nine categories: vision obscuration, false instrument readings, dust coating and contamination, loss of traction, clogging of mechanisms, abrasion, thermal control problems, seal failures, and inhalation and irritation. Although simple dust mitigation measures were sufficient to mitigate some of the problems (i.e., loss of traction) it was found that these measures were ineffective to mitigate many of the more serious problems (i.e., clogging, abrasion, diminished heat rejection). The severity of the dust problems were consistently underestimated by ground tests, indicating a need to develop better simulation facilities and procedures.
A Comparison of Genetic Programming Variants for Hyper-Heuristics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harris, Sean
Modern society is faced with ever more complex problems, many of which can be formulated as generate-and-test optimization problems. General-purpose optimization algorithms are not well suited for real-world scenarios where many instances of the same problem class need to be repeatedly and efficiently solved, such as routing vehicles over highways with constantly changing traffic flows, because they are not targeted to a particular scenario. Hyper-heuristics automate the design of algorithms to create a custom algorithm for a particular scenario. Hyper-heuristics typically employ Genetic Programming (GP) and this project has investigated the relationship between the choice of GP and performance in Hyper-heuristics. Results are presented demonstrating the existence of problems for which there is a statistically significant performance differential between the use of different types of GP.
Ordination of the estuarine environment: What the organism experiences
Investigators customarily schedule estuary sampling trips with regard to a variety of criteria, especially tide stage and the day-night cycle. However, estuarine organisms experience a wide suite of continuously changing tide and light conditions. Such organisms may undertake i...
A cubic spline approximation for problems in fluid mechanics
NASA Technical Reports Server (NTRS)
Rubin, S. G.; Graves, R. A., Jr.
1975-01-01
A cubic spline approximation is presented which is suited for many fluid-mechanics problems. This procedure provides a high degree of accuracy, even with a nonuniform mesh, and leads to an accurate treatment of derivative boundary conditions. The truncation errors and stability limitations of several implicit and explicit integration schemes are presented. For two-dimensional flows, a spline-alternating-direction-implicit method is evaluated. The spline procedure is assessed, and results are presented for the one-dimensional nonlinear Burgers' equation, as well as the two-dimensional diffusion equation and the vorticity-stream function system describing the viscous flow in a driven cavity. Comparisons are made with analytic solutions for the first two problems and with finite-difference calculations for the cavity flow.
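The accuracy claim for nonuniform meshes can be illustrated with a minimal sketch (using a general-purpose cubic spline routine, not the authors' spline-ADI scheme): interpolate a smooth function on a mesh clustered near one end and compare the spline's derivative with the exact derivative.

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Nonuniform mesh on [0, pi], clustered near x = 0
x = np.pi * np.linspace(0.0, 1.0, 40) ** 1.5
f = np.sin(x)

spline = CubicSpline(x, f)       # cubic spline through the mesh values
df = spline(x, 1)                # first derivative evaluated from the spline
err = np.max(np.abs(df - np.cos(x)))   # compare with the exact derivative
```

Even with strongly varying spacing, the derivative recovered from the spline remains accurate, which is the property that makes spline treatments of derivative boundary conditions attractive.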
Enabling fast, stable and accurate peridynamic computations using multi-time-step integration
Lindsay, P.; Parks, M. L.; Prakash, A.
2016-04-13
Peridynamics is a nonlocal extension of classical continuum mechanics that is well-suited for solving problems with discontinuities such as cracks. This paper extends the peridynamic formulation to decompose a problem domain into a number of smaller overlapping subdomains and to enable the use of different time steps in different subdomains. This approach allows regions of interest to be isolated and solved at a small time step for increased accuracy while the rest of the problem domain can be solved at a larger time step for greater computational efficiency. Lastly, performance of the proposed method in terms of stability, accuracy, and computational cost is examined and several numerical examples are presented to corroborate the findings.
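The subcycling idea can be sketched on a classical 1-D diffusion problem rather than a peridynamic model (the splitting point, step ratio, and linear-in-time interface interpolation below are illustrative assumptions, not the authors' scheme): one subdomain advances with the coarse step while the other takes several substeps per cycle, with the shared interface value interpolated in time.

```python
import numpy as np

def diffusion_step(u, r, left, right):
    """One explicit finite-difference diffusion step on a subdomain,
    with ghost values 'left'/'right' supplied at its boundaries."""
    padded = np.concatenate(([left], u, [right]))
    return u + r * (padded[:-2] - 2.0 * padded[1:-1] + padded[2:])

def mts_solve(coarse, fine, r, steps, sub):
    """Multi-time-step driver: 'coarse' takes one step of ratio r per
    cycle; 'fine' takes 'sub' substeps of ratio r/sub, with the coarse
    interface value interpolated linearly across the cycle."""
    for _ in range(steps):
        c_old_iface = coarse[-1]
        coarse_new = diffusion_step(coarse, r, 0.0, fine[0])  # outer end fixed at 0
        for s in range(sub):
            frac = s / sub
            ghost = (1.0 - frac) * c_old_iface + frac * coarse_new[-1]
            fine = diffusion_step(fine, r / sub, ghost, 0.0)
        coarse = coarse_new
    return coarse, fine

# Smooth initial profile on a unit rod; split its 20 interior cells in half
x = np.linspace(0.0, 1.0, 22)[1:-1]
u0 = np.sin(np.pi * x)
coarse, fine = mts_solve(u0[:10].copy(), u0[10:].copy(), 0.25, 40, 4)
u = np.concatenate([coarse, fine])
```

Because both step ratios satisfy the explicit stability bound, the combined solution decays monotonically, mirroring the stability property examined in the paper.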
GASPACHO: a generic automatic solver using proximal algorithms for convex huge optimization problems
NASA Astrophysics Data System (ADS)
Goossens, Bart; Luong, Hiêp; Philips, Wilfried
2017-08-01
Many inverse problems (e.g., demosaicking, deblurring, denoising, image fusion, HDR synthesis) share various similarities: degradation operators are often modeled by a specific data fitting function while image prior knowledge (e.g., sparsity) is incorporated by additional regularization terms. In this paper, we investigate automatic algorithmic techniques for evaluating proximal operators. These algorithmic techniques also enable efficient calculation of adjoints from linear operators in a general matrix-free setting. In particular, we study the simultaneous-direction method of multipliers (SDMM) and the parallel proximal algorithm (PPXA) solvers and show that the automatically derived implementations are well suited for both single-GPU and multi-GPU processing. We demonstrate this approach for an Electron Microscopy (EM) deconvolution problem.
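Proximal splitting solvers like SDMM and PPXA are built from closed-form proximal operators of the individual terms. As a hedged sketch (two standard textbook operators, not GASPACHO's automatically derived GPU implementations), here are the proximal operators of an L1 regularizer and of a quadratic data-fitting term:

```python
import numpy as np

def prox_l1(v, lam):
    """Proximal operator of lam * ||x||_1: elementwise soft-thresholding."""
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def prox_l2sq(v, y, lam):
    """Proximal operator of (lam/2) * ||x - y||^2: a weighted average
    of the input point v and the data y."""
    return (v + lam * y) / (1.0 + lam)
```

A solver alternates applications of such operators (plus linear-operator adjoints) to minimize the sum of the terms; the closed forms are what make each iteration cheap.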
The August Krogh principle applies to plants
NASA Technical Reports Server (NTRS)
Wayne, R.; Staves, M. P.
1996-01-01
The Krogh principle holds that, for the many problems of physiology, researchers should draw on a correspondingly wide range of organisms, choosing for each problem the organism best suited to its study, rather than limiting study to a single organism for all problems. The same principle applies to plants. The authors are concerned with the recent trend in plant biology of using Arabidopsis thaliana as the "organism of choice." Arabidopsis is an excellent organism for molecular genetic research, but other plants are superior models for other research areas of plant biology. The authors present examples of the successful use of the Krogh principle in plant cell biology research, emphasizing the particular characteristics of the selected research organisms that make them the appropriate choice.
Calibrating LOFAR using the Black Board Selfcal System
NASA Astrophysics Data System (ADS)
Pandey, V. N.; van Zwieten, J. E.; de Bruyn, A. G.; Nijboer, R.
2009-09-01
The Black Board SelfCal (BBS) system is designed as the final processing system to carry out the calibration of LOFAR in an efficient way. In this paper we give a brief description of its architectural and software design, including its distributed computing approach. A confusion-limited deep all-sky image (from 38-62 MHz), made by calibrating LOFAR test data with the BBS suite, is shown as a sample result. The present status and future directions of development of the BBS suite are also touched upon. Although BBS is mainly developed for LOFAR, it may also be used to calibrate other instruments once their specific algorithms are plugged in.
ARIES NDA Robot operators' manual
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scheer, N.L.; Nelson, D.C.
1998-05-01
The ARIES NDA Robot is an automation device for servicing the material movements for a suite of non-destructive assay (NDA) instruments. This suite of instruments includes a calorimeter, a gamma isotopic system, a segmented gamma scanner (SGS), and a neutron coincidence counter (NCC). Objects moved by the robot include sample cans, standard cans, and instrument plugs. The robot computer has an RS-232 connection with the NDA Host computer, which coordinates robot movements and instrument measurements. The instruments are expected to perform measurements under the direction of the Host without operator intervention. This user's manual describes system startup, use of the main menu, manual operation, and error recovery.
NASA Astrophysics Data System (ADS)
Alexander, Louise; Snape, Joshua F.; Joy, Katherine H.; Downes, Hilary; Crawford, Ian A.
2016-09-01
Lunar mare basalts provide insights into the compositional diversity of the Moon's interior. Basalt fragments from the lunar regolith can potentially sample lava flows from regions of the Moon not previously visited, thus increasing our understanding of lunar geological evolution. As part of a study of basaltic diversity at the Apollo 12 landing site, detailed petrological and geochemical data are provided here for 13 basaltic chips. In addition to bulk chemistry, we have analyzed the major, minor, and trace element chemistry of mineral phases which highlight differences between basalt groups. Where samples contain olivine, the equilibrium parent melt magnesium number (Mg#; atomic Mg/[Mg + Fe]) can be calculated to estimate parent melt composition. Ilmenite and plagioclase chemistry can also determine differences between basalt groups. We conclude that samples of approximately 1-2 mm in size can be categorized provided that appropriate mineral phases (olivine, plagioclase, and ilmenite) are present. Where samples are fine-grained (grain size <0.3 mm), a "paired samples t-test" can provide a statistical comparison between a particular sample and known lunar basalts. Of the fragments analyzed here, three are found to belong to each of the previously identified olivine and ilmenite basalt suites, four to the pigeonite basalt suite, one is an olivine cumulate, and two could not be categorized because of their coarse grain sizes and lack of appropriate mineral phases. Our approach introduces methods that can be used to investigate small sample sizes (i.e., fines) from future sample return missions to investigate lava flow diversity and petrological significance.
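The paired-samples t-test comparison described above can be sketched as follows (the element concentrations here are hypothetical, not the paper's Apollo 12 data): element-by-element values for a fine-grained fragment are paired with those of a reference basalt, and a large p-value fails to reject compositional equality.

```python
import numpy as np
from scipy import stats

# Hypothetical concentrations for the same suite of elements (same order)
# in an unknown fine-grained fragment and a reference basalt composition.
fragment  = np.array([45.1, 11.2, 9.8, 21.3, 2.9, 0.4])
reference = np.array([44.8, 11.5, 9.6, 21.0, 3.1, 0.5])

# Paired test: each pair is the same element measured in the two rocks
t_stat, p_value = stats.ttest_rel(fragment, reference)
```

A p-value above the chosen significance level means the fragment is statistically indistinguishable from that reference group, which is how a fine-grained chip would be assigned to a known suite.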
Personal consequences of malpractice lawsuits on American surgeons.
Balch, Charles M; Oreskovich, Michael R; Dyrbye, Lotte N; Colaiano, Joseph M; Satele, Daniel V; Sloan, Jeff A; Shanafelt, Tait D
2011-11-01
Our objective was to identify the prevalence of recent malpractice litigation against American surgeons and evaluate associations with personal well-being. Although malpractice lawsuits are often filed against American surgeons, the personal consequences with respect to burnout, depression, and career satisfaction are poorly understood. Members of the American College of Surgeons were sent an anonymous, cross-sectional survey in October 2010. Surgeons were asked if they had been involved in a malpractice suit during the previous 2 years. The survey also evaluated demographic variables, practice characteristics, career satisfaction, burnout, and quality of life. Of the 25,073 surgeons sampled, 7,164 (29%) returned surveys. Involvement in a recent malpractice suit was reported by 1,764 of 7,164 (24.6%) responding surgeons. Surgeons involved in a recent malpractice suit were younger, worked longer hours, had more night call, and were more likely to be in private practice (all p < 0.0001). Recent malpractice suits were strongly related to burnout (p < 0.0001), depression (p < 0.0001), and recent thoughts of suicide (p < 0.0001) among surgeons. In multivariable modeling, both depression (odds ratio = 1.273; p = 0.0003) and burnout (odds ratio = 1.168; p = 0.0306) were independently associated with a recent malpractice suit after controlling for all other personal and professional characteristics. Hours worked, nights on call, subspecialty, and practice setting were also independently associated with recent malpractice suits. Surgeons who had experienced a recent malpractice suit reported less career satisfaction and were less likely to recommend a surgical or medical career to their children (p < 0.0001). Malpractice lawsuits are common and have potentially profound personal consequences for US surgeons. Additional research is needed to identify individual, organizational, and societal interventions to support surgeons subjected to malpractice litigation.
Copyright © 2011 American College of Surgeons. Published by Elsevier Inc. All rights reserved.
EVA-Compatible Microbial Swab Tool
NASA Technical Reports Server (NTRS)
Rucker, Michelle A.
2016-01-01
When we send humans to search for life on Mars, we'll need to know what we brought with us versus what may already be there. To ensure our crewed spacecraft meet planetary protection requirements—and to protect our science from human contamination—we'll need to know whether micro-organisms are leaking/venting from our ships and spacesuits. This is easily done by swabbing external vents and suit surfaces for analysis, but requires a specialized tool for the job. Engineers at the National Aeronautics and Space Administration (NASA) recently developed an Extravehicular Activity (EVA)-compatible swab tool that can be used to sample current space suits and life support systems. Data collected now will influence Mars life support and EVA hardware early in the planning process, before design changes become difficult and expensive. NASA's EVA swab tool pairs a Space Shuttle-era tool handle with a commercially available swab tip mounted into a custom-designed end effector. A glove-compatible release mechanism allows the handle to quickly switch between swab tips, much like a shaving razor handle can snap onto a disposable blade cartridge. Swab tips are stowed inside individual sterile containers, each fitted with a microbial filter that allows the container to equalize atmospheric pressure, but prevents cabin contaminants from rushing into the container when passing from the EVA environment into a pressurized cabin. A bank of containers arrayed inside a tool caddy allows up to six individual samples to be collected during a given spacewalk. NASA plans to use the tool in 2016 to collect samples from various spacesuits during ground testing to determine what (if any) human-borne microbial contamination leaks from the suit under simulated thermal vacuum conditions. Next, the tool will be used on board the International Space Station to assess the types of microbial contaminants found on external environmental control and life support system vents. 
Data will support advanced EVA and life support system maturation studies, helping to answer questions such as “how close can an EVA-suited crew member approach an area of scientific interest without compromising the science?”
Field protection effectiveness of chemical protective suits and gloves evaluated by biomonitoring
Chang, F K; Chen, M L; Cheng, S F; Shih, T S; Mao, I F
2007-01-01
Objectives To determine the effectiveness of protective suits and gloves by biomonitoring. Methods Fifteen male spray painters at a ship coating factory were studied for two weeks. Workers wore no protective clothing during the first week and wore protective suits and gloves during the second week. Sampling was conducted on four consecutive working days each week. Ethyl benzene and xylene in the air were collected using 3M 3500 organic vapour monitors. Urine was collected before and after each work shift. Results Urinary mandelic acid (MA) and methyl hippuric acid (MHA) levels were divided by the personal exposure concentrations of ethyl benzene and xylene, respectively. Mean (SE) corrected MA and MHA concentrations in the first week were 1.07 (0.18) and 2.66 (0.68) (mg/g creatinine)/(mg/m3), and the corresponding concentrations in the second week were 0.50 (0.12) and 1.76 (0.35) (mg/g creatinine)/(mg/m3), respectively. Both MA and MHA concentrations in the second week (when spray painters wore protective suits and gloves) were lower than in the first week (p<0.001 and p = 0.011, respectively). Mean decreases in the MA and MHA biomarkers were 69% and 49%, respectively. Conclusion This study successfully evaluated the effectiveness of chemical protective suits and gloves by using urinary MA and MHA as biomarkers. This method is feasible for determining the performance of workers wearing personal protective equipment. Moreover, the experimental results suggest that dermal exposure may be the major contributor to total body burden of solvents in spray painters without protective suits and gloves. PMID:17522137
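The exposure correction above is a simple ratio: urinary metabolite level (mg/g creatinine) divided by the personal airborne concentration (mg/m³). As a sketch using the reported week means for MA (note this ratio-of-group-means calculation is illustrative and will not reproduce the paper's per-worker mean decrease):

```python
def corrected(urinary_mg_per_g, air_mg_per_m3):
    """Exposure-corrected biomarker: urinary metabolite divided by
    personal air concentration, in (mg/g creatinine)/(mg/m^3)."""
    return urinary_mg_per_g / air_mg_per_m3

# Reported group means of the corrected MA biomarker (from the abstract)
ma_week1 = 1.07   # no protective clothing
ma_week2 = 0.50   # protective suits and gloves worn

# Percent reduction between the two group means
reduction = 100.0 * (ma_week1 - ma_week2) / ma_week1
```

The normalization removes the effect of day-to-day differences in airborne exposure, so the remaining reduction reflects the dermal protection provided by the suits and gloves.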
Characterization of Carbon Dioxide Washout Measurement Techniques in the Mark-III Space Suit
NASA Technical Reports Server (NTRS)
Meginnis, Ian M.; Norcross, Jason; Bekdash, Omar; Ploutz-Snyder, Robert
2016-01-01
A space suit must provide adequate carbon dioxide (CO2) washout inside the helmet to prevent symptoms of hypercapnia. In the past, an oronasal mask has been used to measure the inspired air of suited subjects to determine a space suit's CO2 washout capability. While sufficient for super-ambient pressure testing of space suits, the oronasal mask fails to meet several human factors and operational criteria needed for future sub-ambient pressure testing (e.g., compatibility with a Valsalva device). This paper describes the evaluation of a nasal cannula as a device for measuring inspired air within a space suit. Eight test subjects were tasked with walking on a treadmill or operating an arm ergometer to achieve target metabolic rates of 1000, 2000, and 3000 British thermal units per hour (BTU/hr), at flow rates of 2, 4, and 6 actual cubic feet per minute (ACFM). Each test configuration was conducted twice, with subjects instructed to breathe either through their nose only, or however they felt comfortable. The test data show that the nasal cannula provides more statistically consistent data across test subjects than the oronasal mask used in previous tests. The data also show that inhaling and exhaling through only the nose provides a lower sample variance than a normal breathing style. Nose-only breathing yields better CO2 washout for several possible reasons, including a decreased respiratory rate, an increased tidal volume, and the fact that nose-only breathing directs all of the exhaled CO2 down and away from the oronasal region. The test subjects in this study provided feedback that the nasal cannula is comfortable and can be used with the Valsalva device.
Isotopic variation in the Tuolumne Intrusive Suite, central Sierra Nevada, California
Kistler, R.W.; Chappell, B.W.; Peck, D.L.; Bateman, P.C.
1986-01-01
Granitoid rocks of the compositionally zoned Late Cretaceous Tuolumne Intrusive Suite in the central Sierra Nevada, California, have initial ⁸⁷Sr/⁸⁶Sr values (Sri) and ¹⁴³Nd/¹⁴⁴Nd values (Ndi) that vary from 0.7057 to 0.7067 and from 0.51239 to 0.51211, respectively. The observed variation of both Sri and Ndi and of chemical composition in rocks of the suite cannot be due to crystal fractionation of magma solely under closed-system conditions. The largest variation in chemistry, Ndi, and Sri is present in the outermost equigranular units of the Tuolumne Intrusive Suite. Sri varies positively with SiO2, Na2O, K2O, and Rb concentrations, and negatively with Ndi, Al2O3, Fe2O3, MgO, FeO, CaO, MnO, P2O5, TiO2, and Sr concentrations. This covariation of Sri, Ndi, and chemistry can be modeled by a process of simple mixing of basaltic and granitic magmas having weight percent SiO2 of 48.0 and 73.3, respectively. Isotopic characteristics of the mafic magma are Sri = 0.7047, Ndi = 0.51269, and δ¹⁸O = 6.0, and of the felsic magma are Sri = 0.7068, Ndi = 0.51212, and δ¹⁸O = 8.9. The rocks sampled contain from 50 to 80% of the felsic component. An aplite in the outer equigranular unit of the Tuolumne Intrusive Suite apparently was derived by fractional crystallization of plagioclase and hornblende from a magma of granodiorite composition that was a product of mixing of the magmas described above. Siliceous magmas derived from the lower crust, having a maximum of 15 percent mantle-derived mafic component, are represented by the inner porphyritic units of the Tuolumne Intrusive Suite. © 1986 Springer-Verlag.
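The two-component mixing model invoked above can be sketched numerically. The end-member SiO2 and Sri values come from the abstract; the Sr concentrations of the end members are hypothetical assumptions, needed because mixing of an isotope ratio must be weighted by elemental concentration, which the abstract does not report.

```python
# Two-component magma mixing: f_felsic is the weight fraction of the
# felsic end member (the sampled rocks span roughly 0.5 to 0.8).

def mix_sio2(f_felsic):
    """Bulk SiO2 (wt%) of a mixture of the 73.3 wt% felsic and
    48.0 wt% mafic end members (values from the abstract)."""
    return f_felsic * 73.3 + (1.0 - f_felsic) * 48.0

def mix_sri(f_felsic, sr_felsic=300.0, sr_mafic=600.0):
    """Initial 87Sr/86Sr of the mixture, weighted by Sr concentration
    (ppm values here are hypothetical end-member concentrations)."""
    w_f = f_felsic * sr_felsic
    w_m = (1.0 - f_felsic) * sr_mafic
    return (w_f * 0.7068 + w_m * 0.7047) / (w_f + w_m)

sio2 = mix_sio2(0.65)   # a sample with 65% felsic component
sri = mix_sri(0.65)
```

The mixture's Sri necessarily falls between the two end-member values, and because the mafic end member is assumed richer in Sr, the ratio is pulled toward the mafic value relative to a simple mass-weighted average.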
NASA Astrophysics Data System (ADS)
Montefalco de Lira Santos, Lauro Cézar; Dantas, Elton Luiz; Cawood, Peter A.; José dos Santos, Edilton; Fuck, Reinhardt A.
2017-11-01
Pre-Brasiliano rocks in the Borborema Province (NE Brazil) are concentrated in basement blocks, such as the Alto Moxotó Terrane. Petrographic, geochemical, and U-Pb and Sm-Nd isotopic data from two basement metagranitic suites within the terrane provide evidence for Neoarchean (2.6 Ga) and Paleoproterozoic (2.1 Ga) subduction-related events. The Riacho das Lajes Suite is made of medium- to coarse-grained hornblende- and biotite-bearing metatonalites and metamonzogranites. Whole-rock geochemical data indicate that these rocks represent calcic, magnesian, and meta- to peraluminous magmas, and have unequivocal affinities with high-Al, low-REE tonalite-trondhjemite-granodiorites (TTG). Zircon U-Pb data from two samples of this suite indicate that they were emplaced at 2.6 Ga, representing the first Archean crust discovered in the central portion of the province. The suite has Neoarchean depleted-mantle model ages (TDM) and slightly negative to positive εNd(t), indicating slight crustal contamination. The overall geochemical and isotopic data indicate a Neoarchean intraoceanic setting for the genesis of the Riacho das Lajes magma via melting of basaltic oceanic crust subjected to high-pressure eclogite-facies conditions. The Floresta Suite, on the other hand, comprises metaigneous rocks that are mostly tonalitic and granodioritic in composition. Geochemical data indicate that this suite shares similarities with calcic to calc-alkalic magmas with magnesian and metaluminous to slightly peraluminous characteristics. Other geochemical features include anomalous Ni, V, and Cr contents, as well as high large-ion lithophile element (LILE) values. The suite yields U-Pb zircon ages of approximately 2.1 Ga, Archean to Paleoproterozoic TDM ages, and negative to positive εNd(t) values, suggesting both new crust formation and reworking of Archean crust, in addition to mantle metasomatism, reflecting mixed sources. 
The most likely tectonic setting for the Floresta Suite magmas involved crustal thickening by terrane accretion, coeval with slab break-off. Our results provide new insights into proto-Western Gondwana crustal evolution.
Higher order explicit symmetric integrators for inseparable forms of coordinates and momenta
NASA Astrophysics Data System (ADS)
Liu, Lei; Wu, Xin; Huang, Guoqing; Liu, Fuyao
2016-06-01
Pihajoki proposed extended phase-space second-order explicit symmetric leapfrog methods for inseparable Hamiltonian systems. Building on this work, we investigate the critical problem of how to mix the variables in the extended phase space. Numerical tests show that sequent permutations of coordinates and momenta make the leapfrog-like methods yield the most accurate results and the best long-term stabilized error behaviour. We also present a novel method for constructing fourth-order extended phase-space explicit symmetric integration schemes. Each scheme is a symmetric product of six standard second-order leapfrogs and consists of four segments: the permuted coordinates, a triple product of the standard second-order leapfrog without permutations, the permuted momenta, and another triple product of the standard second-order leapfrog without permutations. Extended phase-space sixth-, eighth-, and higher-order explicit symmetric algorithms can be constructed in the same way. Using several inseparable Hamiltonian examples, such as the post-Newtonian approach to non-spinning compact binaries, we show that one of the proposed fourth-order methods is more efficient than existing methods, including the fourth-order explicit symplectic integrators of Chin and the fourth-order explicit and implicit mixed symplectic integrators of Zhong et al. Given a moderate choice for the related mixing and projection maps, the extended phase-space explicit symplectic-like methods are well suited to various inseparable Hamiltonian problems. Examples of such problems include the algorithmic regularization of gravitational systems with velocity-dependent perturbations in the Solar system and post-Newtonian Hamiltonian formulations of spinning compact objects.
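The extended phase-space idea can be sketched in a few lines. Below is an illustrative second-order leapfrog in the doubled phase space (not the authors' fourth-order compositions, and without the mixing maps they study), applied to a made-up bounded inseparable Hamiltonian:

```python
# Toy inseparable Hamiltonian H(q, p) = (q**2 + p**2 + q**2 * p**2) / 2.
def H(q, p): return 0.5 * (q * q + p * p + q * q * p * p)
def dHdq(q, p): return q * (1.0 + p * p)
def dHdp(q, p): return p * (1.0 + q * q)

def extended_leapfrog(q, p, Q, P, h, n):
    """Second-order leapfrog in the doubled phase space (q, p, Q, P),
    using Pihajoki's splitting H(q, P) + H(Q, p): under H(q, P) only
    p and Q move, under H(Q, p) only q and P move, so both sub-flows
    are exactly solvable."""
    for _ in range(n):
        p -= 0.5 * h * dHdq(q, P); Q += 0.5 * h * dHdp(q, P)  # half step of H(q, P)
        P -= h * dHdq(Q, p); q += h * dHdp(Q, p)              # full step of H(Q, p)
        p -= 0.5 * h * dHdq(q, P); Q += 0.5 * h * dHdp(q, P)  # half step of H(q, P)
    return q, p, Q, P

q, p, Q, P = extended_leapfrog(1.0, 0.5, 1.0, 0.5, h=0.01, n=1000)
print(abs(H(q, p) - H(1.0, 0.5)))   # energy drift of the (q, p) copy stays small
```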
Zamli, Kamal Z.; Din, Fakhrud; Bures, Miroslav
2018-01-01
The sine-cosine algorithm (SCA) is a new population-based meta-heuristic algorithm. In addition to exploiting sine and cosine functions to perform local and global searches (hence the name sine-cosine), the SCA introduces several random and adaptive parameters to facilitate the search process. Although it shows promising results, the search process of the SCA is vulnerable to local minima/maxima due to the adoption of a fixed switch probability and the bounded magnitude of the sine and cosine functions (from -1 to 1). In this paper, we propose a new hybrid Q-learning sine-cosine-based strategy, called the Q-learning sine-cosine algorithm (QLSCA). Within the QLSCA, we eliminate the switching probability. Instead, we rely on the Q-learning algorithm (based on the penalty and reward mechanism) to dynamically identify the best operation during runtime. Additionally, we integrate two new operations (Lévy flight motion and crossover) into the QLSCA to facilitate jumping out of local minima/maxima and enhance the solution diversity. To assess its performance, we adopt the QLSCA for the combinatorial test suite minimization problem. Experimental results reveal that the QLSCA is statistically superior with regard to test suite size reduction compared to recent state-of-the-art strategies, including the original SCA, the particle swarm test generator (PSTG), adaptive particle swarm optimization (APSO) and the cuckoo search strategy (CS) at the 95% confidence level. However, concerning the comparison with discrete particle swarm optimization (DPSO), there is no significant difference in performance at the 95% confidence level. On a positive note, the QLSCA statistically outperforms the DPSO in certain configurations at the 90% confidence level. PMID:29771918
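The core QLSCA mechanism, a Q-table choosing among operations under a penalty/reward rule in place of a fixed switch probability, can be sketched on a toy one-dimensional minimization. The operator implementations and all constants below are simplified stand-ins, not the paper's exact formulations:

```python
import math
import random

# Toy sketch of the QLSCA idea: a single-state Q-table over four operations
# replaces SCA's fixed switch probability; a penalty/reward signal picks the
# operation at runtime. Operators here are crude illustrative stand-ins.
random.seed(1)
OPS = ("sine", "cosine", "levy", "crossover")

def apply_op(op, x, best):
    r1, r2 = 2.0 * random.random(), 2.0 * math.pi * random.random()
    if op == "sine":   return x + r1 * math.sin(r2) * abs(best - x)
    if op == "cosine": return x + r1 * math.cos(r2) * abs(best - x)
    if op == "levy":   return x + random.gauss(0.0, 1.0)   # crude random jump
    return 0.5 * (x + best)                                # crossover with best

def qlsca_minimize(f, x0, iters=300, alpha=0.5, gamma=0.8):
    Q = {op: 0.0 for op in OPS}
    x, best_x, best_f = x0, x0, f(x0)
    for _ in range(iters):
        m = max(Q.values())                                # greedy selection,
        op = random.choice([o for o in OPS if Q[o] == m])  # random tie-break
        x = apply_op(op, x, best_x)
        fx = f(x)
        reward = 1.0 if fx < best_f else -1.0              # reward/penalty rule
        Q[op] += alpha * (reward + gamma * max(Q.values()) - Q[op])
        if fx < best_f:
            best_x, best_f = x, fx
    return best_x, best_f

bx, bf = qlsca_minimize(lambda x: x * x, 3.0)
print(bx, bf)
```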
Sidler, Dominik; Schwaninger, Arthur; Riniker, Sereina
2016-10-21
In molecular dynamics (MD) simulations, free-energy differences are often calculated using free energy perturbation or thermodynamic integration (TI) methods. However, both techniques are only suited to calculating free-energy differences between two end states. Enveloping distribution sampling (EDS) presents an attractive alternative that allows the calculation of multiple free-energy differences in a single simulation. In EDS, a reference state is simulated which "envelopes" the end states. The challenge of this methodology is the determination of optimal reference-state parameters to ensure equal sampling of all end states. Currently, the automatic determination of the reference-state parameters for multiple end states is an unsolved issue that limits the application of the methodology. To resolve this, we have generalised the replica-exchange EDS (RE-EDS) approach, introduced by Lee et al. [J. Chem. Theory Comput. 10, 2738 (2014)] for constant-pH MD simulations. By exchanging configurations between replicas with different reference-state parameters, the complexity of the parameter-choice problem can be substantially reduced. A new robust scheme to estimate the reference-state parameters from a short initial RE-EDS simulation with default parameters was developed, which allowed the calculation of 36 free-energy differences between nine small-molecule inhibitors of phenylethanolamine N-methyltransferase from a single simulation. The resulting free-energy differences were in excellent agreement with values obtained previously by TI and two-state EDS simulations.
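The kind of quantity EDS generalizes, a free-energy difference between end states, can be illustrated with the classic one-step perturbation estimate. The Gaussian energy-difference samples below are synthetic, chosen so that the exact answer is known in closed form:

```python
import math
import random

# One-step free-energy-perturbation estimate:
#   dF = -kT * ln < exp(-(U_B - U_A)/kT) >_A
# Energy differences are drawn from a synthetic Gaussian, for which the
# analytic answer is mu - sigma**2 / (2*kT) = 0.875 in these units.
random.seed(0)
kT, mu, sigma = 1.0, 1.0, 0.5
dU = [random.gauss(mu, sigma) for _ in range(100_000)]   # U_B - U_A samples
dF = -kT * math.log(sum(math.exp(-u / kT) for u in dU) / len(dU))
print(dF)
```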
Path planning in uncertain flow fields using ensemble method
NASA Astrophysics Data System (ADS)
Wang, Tong; Le Maître, Olivier P.; Hoteit, Ibrahim; Knio, Omar M.
2016-10-01
An ensemble-based approach is developed to conduct optimal path planning in unsteady ocean currents under uncertainty. We focus our attention on two-dimensional steady and unsteady uncertain flows, and adopt a sampling methodology that is well suited to operational forecasts, where an ensemble of deterministic predictions is used to model and quantify uncertainty. In an operational setting, much about dynamics, topography, and forcing of the ocean environment is uncertain. To address this uncertainty, the flow field is parametrized using a finite number of independent canonical random variables with known densities, and the ensemble is generated by sampling these variables. For each of the resulting realizations of the uncertain current field, we predict the path that minimizes the travel time by solving a boundary value problem (BVP), based on the Pontryagin maximum principle. A family of backward-in-time trajectories starting at the end position is used to generate suitable initial values for the BVP solver. This allows us to examine and analyze the performance of the sampling strategy and to develop insight into extensions dealing with general circulation ocean models. In particular, the ensemble method enables us to perform a statistical analysis of travel times and consequently develop a path planning approach that accounts for these statistics. The proposed methodology is tested for a number of scenarios. We first validate our algorithms by reproducing simple canonical solutions, and then demonstrate our approach in more complex flow fields, including idealized, steady and unsteady double-gyre flows.
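The ensemble logic (without the BVP/Pontryagin machinery) can be sketched as follows: each member is one realization of an uncertain current, each run is deterministic, and planning statistics are taken over the runs. Speeds and distances below are arbitrary illustrative values:

```python
import random
import statistics

# Minimal sketch of the ensemble idea only, not the path-planning solver:
# sample realizations of an uncertain along-track current, compute the
# (here trivial) deterministic travel time for each, and plan from the
# resulting statistics.
random.seed(0)
BOAT_SPEED, DISTANCE, N = 2.0, 10.0, 1000

currents = [random.uniform(-0.5, 0.5) for _ in range(N)]   # ensemble members
times = sorted(DISTANCE / (BOAT_SPEED + u) for u in currents)

mean_t = statistics.mean(times)
p90 = times[int(0.9 * N)]       # a pessimistic planning time
print(f"mean travel time {mean_t:.2f}, 90th percentile {p90:.2f}")
```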
Co-elution phenomena in polymer mixtures studied by asymmetric flow field-flow fractionation.
Zielke, Claudia; Fuentes, Catalina; Piculell, Lennart; Nilsson, Lars
2018-01-12
Most polymers have complex characteristics. Analysis and understanding of these characteristics are crucial, as they, for instance, influence functionality. Separation and analysis of polymer samples, biopolymers in particular, is challenging, since they often display broad distributions in size, structure, and molar mass (M) and/or a tendency to form aggregates. Only a few analytical techniques are suitable for the task. Asymmetric flow field-flow fractionation with multi-angle light scattering and differential refractive index detection (AF4-MALS-dRI) is highly suited to it, but the analysis can nevertheless be especially challenging for heterogeneous mixtures of polymers that exhibit wide size distributions or aggregation. For such systems, systematic and thorough method development is clearly a requirement. This is the purpose of the present work, where we approach the problem of heterogeneous polymer samples systematically by analyzing mixtures of two different polymers which are also characterized individually. An often-observed phenomenon in AF4 of samples with a high polydispersity is a downturn in M vs. elution time, especially common at high retention. This result is often dismissed as an artifact attributed to various errors in detection and data processing. In this work, we utilize AF4-MALS-dRI to separate and analyze binary mixtures of the well-known polysaccharides pullulan and glycogen, or pullulan and poly(ethylene oxide), respectively, in solution. The results show that an observed downturn, or even an upturn, in M can be a correct result, caused by inherent properties of the analyzed polymers. Copyright © 2017 Elsevier B.V. All rights reserved.
Sampling in Qualitative Research
LUBORSKY, MARK R.; RUBINSTEIN, ROBERT L.
2011-01-01
In gerontology the most recognized and elaborate discourse about sampling is generally thought to be in quantitative research associated with survey research and medical research. But sampling has long been a central concern in social and humanistic inquiry, albeit in a different guise suited to different goals. There is a need for more explicit discussion of qualitative sampling issues. This article outlines the guiding principles, rationales, features, and practices of sampling in qualitative research, and then describes common questions about sampling in qualitative research. In conclusion, it proposes the concept of qualitative clarity as a set of principles (analogous to statistical power) to guide assessments of qualitative sampling in a particular study or proposal. PMID:22058580
NASA Astrophysics Data System (ADS)
Shkolyar, S.; Eshelman, E.; Farmer, J. D.; Hamilton, D.; Daly, M. G.; Youngbull, C.
2017-12-01
The Mars 2020 mission will analyze samples in situ and identify any that could have preserved biosignatures in ancient habitable environments for later return to Earth. The highest-priority targeted samples include aqueously formed sedimentary lithologies containing fossil biosignatures as aromatic carbon (kerogen). In this study, we analyze non-extracted, naturally preserved kerogen in a diverse suite of realistic Mars analogs using combined UV-excitation time-gated (UV-TG) Raman and laser-induced fluorescence spectroscopy. We interrogated kerogen and its host matrix in samples to: (1) explore the capabilities of UV-TG Raman and fluorescence spectroscopy for detecting kerogen in high-priority targets in the search for a Martian fossil record; (2) assess the effectiveness of time-gating and UV laser wavelength in reducing fluorescence; and (3) identify sample-specific issues that could challenge rover-based identifications of kerogen using UV-TG Raman spectroscopy. We found that ungated UV Raman is suited to identifying diagnostic kerogen Raman bands without interfering fluorescence and that fluorescence features indicating kerogen are detectable. These data highlight the value of using co-located Raman and fluorescence data sets together to strengthen the confidence of kerogen detection as a potential biosignature, and both are obtainable by SHERLOC onboard Mars 2020.
Do recreational cannabis users, unlicensed and licensed medical cannabis users form distinct groups?
Sznitman, Sharon R
2017-04-01
This study aims to gain a more nuanced perspective on the differences between recreationally and medically motivated cannabis use by distinguishing between people who use cannabis for recreational purposes, unlicensed medical users, and licensed medical users. Data collection was conducted online from a convenience sample of 1479 Israeli cannabis users. Multinomial regression analysis compared unlicensed medical users (38%) with recreational (42%) and licensed medical (5.6%) users in terms of sociodemographics, mode of use, frequency of use, and problematic cannabis use. There were more variables distinguishing unlicensed from licensed users than there were distinguishing unlicensed from recreational users. Recreational users were more likely to be male, and less likely than unlicensed users to eat cannabis, to use cannabis frequently, to use alone, and to use before midday. Licensed users were older than unlicensed users; they reported fewer hours feeling stoned and fewer cannabis use problems, and they were more likely to report cannabis use patterns analogous to medication administration for chronic problems (frequent use, vaping, use alone, and use before midday). This study suggests that a sizable proportion of cannabis users in Israel self-prescribe cannabis and that licensed medical cannabis users differ from unlicensed users. This is, in turn, suggestive of a rigorous medicalized cannabis program that does not function as a backdoor for legal access to recreational use. However, due to methodological limitations this conclusion is only suggestive. The most meaningful differences across recreational, unlicensed, and licensed users were mode and patterns of use rather than cannabis use problems. Current screening tools for cannabis use problems may, however, not be well suited to assessing such problems in medically motivated users. Indeed, when screening for problematic cannabis use there is a need for more careful consideration of whether or not cannabis use is medically motivated. 
Copyright © 2016 Elsevier B.V. All rights reserved.
Probabilistic Open Set Recognition
NASA Astrophysics Data System (ADS)
Jain, Lalit Prithviraj
Real-world tasks in computer vision, pattern recognition and machine learning often touch upon the open set recognition problem: multi-class recognition with incomplete knowledge of the world and many unknown inputs. An obvious way to approach such problems is to develop a recognition system that thresholds probabilities to reject unknown classes. Traditional rejection techniques are not about the unknown; they are about the uncertain boundary and rejection around that boundary. Thus, traditional techniques represent only the "known unknowns". However, a proper open set recognition algorithm is needed to reduce the risk from the "unknown unknowns". This dissertation examines this concept and finds that existing probabilistic multi-class recognition approaches are ineffective for true open set recognition. We hypothesize that the cause is weak ad hoc assumptions combined with the closed-world assumptions made by existing calibration techniques. Intuitively, if we could accurately model just the positive data for any known class without overfitting, we could reject the large set of unknown classes even under this assumption of incomplete class knowledge. To this end, we formulate the problem as one of modeling positive training data by invoking statistical extreme value theory (EVT) near the decision boundary of positive data with respect to negative data. We provide a new algorithm called the PI-SVM for estimating the unnormalized posterior probability of class inclusion. This dissertation also introduces a new open set recognition model called Compact Abating Probability (CAP), in which the probability of class membership decreases in value (abates) as points move from known data toward open space. We show that CAP models improve open set recognition for multiple algorithms. 
Leveraging the CAP formulation, we go on to describe the novel Weibull-calibrated SVM (W-SVM) algorithm, which combines the useful properties of statistical EVT for score calibration with one-class and binary support vector machines. Building from the success of statistical EVT-based recognition methods such as PI-SVM and W-SVM on the open set problem, we present a new general supervised learning algorithm for multi-class classification and multi-class open set recognition called the Extreme Value Local Basis (EVLB). The design of this algorithm is motivated by the observation that extrema from known negative class distributions are the closest negative points to any positive sample during training, and thus should be used to define the parameters of a probabilistic decision model. In the EVLB, the kernel distribution for each positive training sample is estimated via an EVT distribution fit over the distances to the separating hyperplane between that positive training sample and the closest negative samples, with a subset of the overall positive training data retained to form a probabilistic decision boundary. Using this subset as a frame of reference, the probability of a sample at test time decreases as it moves away from the positive class. Possessing this property, the EVLB is well suited to open set recognition problems where samples from unknown or novel classes are encountered at test time. Our experimental evaluation shows that the EVLB provides a substantial improvement in scalability compared to standard radial basis function kernel machines, as well as to PI-SVM and W-SVM, with improved accuracy in many cases. We evaluate our algorithm on open set variations of the standard visual learning benchmarks, as well as with an open subset of classes from Caltech 256 and ImageNet. Our experiments show that PI-SVM, W-SVM and EVLB provide significant advances over the previous state-of-the-art solutions for the same tasks.
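The abating-probability behaviour of a CAP model can be sketched directly: membership probability decays with distance from the known positive data, so distant "open space" samples are rejected by thresholding. The Weibull shape and scale below are illustrative constants where the real algorithms would fit them by EVT:

```python
import math

# Compact Abating Probability (CAP) style score: class-membership
# probability abates with distance from the known positive data, so
# samples deep in open space can be rejected. The Weibull shape k and
# scale lam would normally be fit by extreme value theory; here they
# are illustrative constants.
def cap_score(x, positives, k=2.0, lam=1.0):
    d = min(abs(x - p) for p in positives)    # distance to nearest positive
    return math.exp(-((d / lam) ** k))        # Weibull-style survival form

positives = [0.0, 0.2, 0.4]
near = cap_score(0.3, positives)   # close to training data: high score
far = cap_score(5.0, positives)    # deep in open space: near-zero score
print(near, far)
```

Thresholding the score (e.g., reject when it falls below 0.5) then implements the open-set rejection decision.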
Determination of Caffeine in Beverages by High Performance Liquid Chromatography.
ERIC Educational Resources Information Center
DiNunzio, James E.
1985-01-01
Describes the equipment, procedures, and results for the determination of caffeine in beverages by high performance liquid chromatography. The method is simple, fast, accurate, and, because sample preparation is minimal, it is well suited for use in a teaching laboratory. (JN)
Wavelet extractor: A Bayesian well-tie and wavelet extraction program
NASA Astrophysics Data System (ADS)
Gunning, James; Glinsky, Michael E.
2006-06-01
We introduce a new open-source toolkit for the well-tie or wavelet extraction problem of estimating seismic wavelets from seismic data, time-to-depth information, and well-log suites. The wavelet extraction model is formulated as a Bayesian inverse problem, and the software simultaneously estimates wavelet coefficients, other parameters associated with uncertainty in the time-to-depth mapping, positioning errors in the seismic imaging, and useful amplitude-variation-with-offset (AVO) related parameters in multi-stack extractions. It is capable of multi-well, multi-stack extractions, and uses continuous seismic data-cube interpolation to cope with the problem of arbitrary well paths. Velocity constraints in the form of checkshot data, interpreted markers, and sonic logs are integrated in a natural way. The Bayesian formulation allows computation of full posterior uncertainties of the model parameters, and the important problem of the uncertain wavelet span is addressed using a multi-model posterior developed from Bayesian model selection theory. The wavelet extraction tool is distributed as part of the Delivery seismic inversion toolkit. A simple log and seismic viewing tool is included in the distribution. The code is written in Java, and thus platform-independent, but the Seismic Unix (SU) data model makes the inversion particularly suited to Unix/Linux environments. It is a natural companion piece of software to Delivery, having the capacity to produce maximum-likelihood wavelet and noise estimates, but it will also be of significant utility to practitioners wanting to produce wavelet estimates for other inversion codes or purposes. The generation of full parameter uncertainties is a crucial function for workers wishing to investigate questions of wavelet stability before proceeding to more advanced inversion studies.
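The linear core of any well-tie model, a seismic trace equal to the wavelet convolved with reflectivity plus noise, admits a maximum-likelihood sketch in a few lines; the Bayesian machinery described above (priors, span selection, registration and AVO terms) sits on top of this. All data below are synthetic:

```python
import numpy as np

# Maximum-likelihood wavelet estimate for the toy convolutional model
# s = conv(r, w) + noise, with the reflectivity r assumed known from logs.
rng = np.random.default_rng(0)
n, m = 200, 11                                   # trace length, wavelet span
r = rng.standard_normal(n)                       # synthetic reflectivity log
w_true = np.exp(-0.5 * ((np.arange(m) - m // 2) / 2.0) ** 2)  # smooth wavelet
s = np.convolve(r, w_true, mode="same") + 0.01 * rng.standard_normal(n)

# Convolution as a linear operator G: column j holds r shifted by m//2 - j,
# zero-padded at the edges to match numpy's "same" convolution.
G = np.zeros((n, m))
for j in range(m):
    off = m // 2 - j
    if off >= 0:
        G[: n - off, j] = r[off:]
    else:
        G[-off:, j] = r[: n + off]

w_est, *_ = np.linalg.lstsq(G, s, rcond=None)    # least-squares wavelet
print(np.max(np.abs(w_est - w_true)))            # small recovery error
```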
Evaluating Contaminants of Emerging Concern as tracers of wastewater from septic systems.
James, C Andrew; Miller-Schulze, Justin P; Ultican, Shawn; Gipe, Alex D; Baker, Joel E
2016-09-15
Bacterial and nutrient contamination from anthropogenic sources impacts fresh and marine waters, reducing water quality and restricting recreational and commercial activities. In many cases the source of this contamination is ambiguous, and a tracer or set of tracers linking contamination to source would be valuable. In this work, the effectiveness of utilizing a suite of Contaminants of Emerging Concern (CECs) as tracers of bacteria from human septic system effluent is investigated. Field sampling was performed at more than 20 locations over approximately 18 months and analyzed for a suite of CECs and fecal coliform bacteria. The sampling locations included seeps and small freshwater discharges to the shoreline. Sites were selected and grouped according to level of impact by septic systems as determined by previous field sampling programs. A subset of selected locations had been positively identified as being impacted by effluent from failing septic systems through dye testing. The CECs were selected based on their predominant use, their frequency of use, and putative fate and transport properties. In addition, two rounds of focused sampling were performed at selected sites to characterize short-term variations in CEC and fecal coliform concentrations, and to evaluate environmental persistence following source correction activities. The results indicate that a suite of common use compounds are suitable as generalized tracers of bacterial contamination from septic systems and that fate and transport properties are important in tracer selection. Highly recalcitrant or highly labile compounds likely follow different loss profiles in the subsurface compared to fecal bacteria and are not suitable tracers. The use of more than one tracer compound is recommended due to source variability of septic systems and to account for variations in the subsurface condition. 
In addition, concentrations of some CECs were measured in receiving waters at levels which suggested the potential for environmental harm, indicating that the possible risk presented from these sources warrants further investigation. Copyright © 2016 Elsevier Ltd. All rights reserved.
[The picture of malingered symptom presentation in public opinion].
Schlicht, D; Merten, Thomas
2014-09-01
The views held by health experts on certain topics may differ drastically from what appears obvious from daily observation or public opinion. This is true for a number of myths that continue to haunt the literature with respect to feigned health problems. Such myths tend to ignore or distort the results of modern research. We performed two pilot studies: first, a content analysis of 67 German- and English-language articles from newspapers, magazines, or internet journals; and second, a survey on experiences of, and beliefs related to, exaggeration and symptom invention in health care and forensic assessment. A non-representative sample of 15 adults from the general population was interviewed. Most of them reported their own experiences of, or incidents in their social networks involving, feigned health problems. Base-rate estimates of malingering in five prototypical contexts ranged between 46 and 67 percent of cases. While the participants showed a preference for an adaptational explanatory model of malingering (selected for about 53% of cases of malingering), journalistic sources often employed pejorative language and combat rhetoric, aiming to arouse indignation or outrage in the readership. The majority of articles were classified as adhering to a criminological explanatory model. While the pilot character of the studies limits their generalisability, the results may be suited to questioning the validity of some long-held expert beliefs.
Reclamation of abandoned mined lands along the Upper Illinois Waterway using dredged material
DOE Office of Scientific and Technical Information (OSTI.GOV)
Van Luik, A; Harrison, W
1982-01-01
Sediments were sampled and characterized from 28 actual or proposed maintenance-dredging locations in the Upper Illinois Waterway, that is, the Calumet-Sag Channel, the Des Plaines River downstream of its confluence with the Calumet-Sag Channel, and the Illinois River from the confluence of the Kankakee and Des Plaines rivers to Havana, Illinois. Sufficient data on chemical constituents and physical sediments were obtained to allow the classification of these sediments by currently applicable criteria of the Illinois Environmental Protection Agency for the identification of hazardous, persistent, and potentially hazardous wastes. By these criteria, the potential dredged materials studied were not hazardous, persistent, or potentially hazardous; they are a suitable topsoil/reclamation medium. A study of problem abandoned surface-mined land sites (problem lands are defined as being acidic and/or sparsely vegetated) along the Illinois River showed that three sites were particularly well suited to the needs of the Corps of Engineers (COE) for a dredged-material disposal/reclamation site. These sites were a pair of municipally owned sites in Morris, Illinois, and a small corporately owned site east of Ottawa, Illinois, adjacent to the Illinois River. Other sites were also ranked as to suitability for COE involvement in their reclamation. Reclamation disposal was found to be an economically competitive alternative to near-source confined disposal for Upper Illinois Waterway dredged material.
Strict Constraint Feasibility in Analysis and Design of Uncertain Systems
NASA Technical Reports Server (NTRS)
Crespo, Luis G.; Giesy, Daniel P.; Kenny, Sean P.
2006-01-01
This paper proposes a methodology for the analysis and design optimization of models subject to parametric uncertainty, where hard inequality constraints are present. Hard constraints are those that must be satisfied for all parameter realizations prescribed by the uncertainty model. Emphasis is given to uncertainty models prescribed by norm-bounded perturbations from a nominal parameter value, i.e., hyper-spheres, and by sets of independently bounded uncertain variables, i.e., hyper-rectangles. These models make it possible to consider sets of parameters having comparable as well as dissimilar levels of uncertainty. Two alternative formulations for hyper-rectangular sets are proposed, one based on a transformation of variables and another based on an infinity-norm approach. The suite of tools developed enables us to determine whether the satisfaction of hard constraints is feasible by identifying critical combinations of uncertain parameters. Since this practice is performed without sampling or partitioning the parameter space, the resulting assessments of robustness are analytically verifiable. Strategies that enable the comparison of the robustness of competing design alternatives, the approximation of the robust design space, and the systematic search for designs with improved robustness characteristics are also proposed. Since the problem formulation is generic and the solution methods only require standard optimization algorithms for their implementation, the tools developed are applicable to a broad range of problems in several disciplines.
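For the special case of a constraint that is linear in the uncertain parameters, the sampling-free worst case over a hyper-rectangle has a closed form, which illustrates why such assessments can be analytically verifiable; the general methodology covers far broader cases, and the coefficients below are made up:

```python
# Sampling-free worst case of a linear hard constraint g(x) = a.x + b <= 0
# over the hyper-rectangle |x_i - c_i| <= r_i. Each term attains its
# maximum at a vertex, so max g = b + sum(a_i*c_i + |a_i|*r_i), an
# analytically verifiable certificate in the spirit of the paper's
# infinity-norm formulation. Coefficients are illustrative only.
def worst_case_linear(a, b, c, r):
    return b + sum(ai * ci + abs(ai) * ri for ai, ci, ri in zip(a, c, r))

a, b = [2.0, -1.0], -3.0          # constraint g(x) = 2*x1 - x2 - 3
c, r = [0.5, 0.0], [0.25, 0.5]    # x1 in [0.25, 0.75], x2 in [-0.5, 0.5]
g_max = worst_case_linear(a, b, c, r)
print(g_max)                      # <= 0 certifies feasibility for all realizations
```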
Johnson, Michelle J; Feng, Xin; Johnson, Laura M; Winters, Jack M
2007-03-01
There is a need to improve semi-autonomous stroke therapy in home environments often characterized by low supervision by clinical experts and low extrinsic motivation. Our distributed-device approach to this problem consists of an integrated suite of low-cost robotic/computer-assistive technologies driven by a novel universal-access software framework called UniTherapy. Our design strategy for personalizing the therapy, providing extrinsic motivation, and assessing outcomes is presented and evaluated. Three studies were conducted to evaluate the potential of the suite. A conventional force-reflecting joystick, a modified joystick therapy platform (TheraJoy), and a steering wheel platform (TheraDrive) were tested separately with the UniTherapy software. Stroke subjects with hemiparesis and able-bodied subjects completed tracking activities with the devices in different positions. We quantify motor performance across subject groups and across device platforms, and muscle activation across devices at two positions in the arm workspace. Trends in the assessment metrics were consistent across devices, with able-bodied and high-functioning stroke subjects being significantly more accurate and quicker in their motor performance than low-functioning subjects. Muscle activation patterns were different for shoulder and elbow across different devices and locations. The Robot/CAMR suite has potential for stroke rehabilitation. By manipulating hardware and software variables, we can create personalized therapy environments that engage patients, address their therapy needs, and track their progress. A larger longitudinal study is still needed to evaluate these systems in under-supervised environments such as the home.
An advanced software suite for the processing and analysis of silicon luminescence images
NASA Astrophysics Data System (ADS)
Payne, D. N. R.; Vargas, C.; Hameiri, Z.; Wenham, S. R.; Bagnall, D. M.
2017-06-01
Luminescence imaging is a versatile characterisation technique used for a broad range of research and industrial applications, particularly in the field of photovoltaics, where photoluminescence and electroluminescence imaging are routinely carried out for materials analysis and quality control. Luminescence imaging can reveal a wealth of material information, as detailed in extensive literature, yet these techniques are often only used qualitatively instead of being utilised to their full potential. Part of the reason for this is the time and effort required for image processing and analysis in order to convert image data to more meaningful results. In this work, a custom-built, Matlab-based software suite is presented which aims to dramatically simplify luminescence image processing and analysis. The suite includes four individual programs which can be used in isolation or in conjunction to achieve a broad array of functionality, including but not limited to point spread function determination and deconvolution, automated sample extraction, image alignment and comparison, minority carrier lifetime calibration, and iron impurity concentration mapping.
Mitigating Insider Threat Using Human Behavior Influence Models
2006-06-01
such as misdemeanors, lawsuits, divorce proceedings, or child custody suits. Major Legal Activity occurs when an employee experiences major legal... [Table residue: the source tabulates life events by year (e.g., child born, pay raise, family trouble, lawsuit, health problem) against influence categories such as Change in Family (positive/negative), Change in work environment (positive), and Legal Activity (minor).]
Inheritance of germinative energy and germinative capacity in Douglas-fir
Thomas E. Greathouse
1966-01-01
In the West, foresters have had considerable difficulty in reforesting south-facing slopes. We considered this problem when we selected plus-trees for the first Douglas-fir seed orchard in Region 6. We were, however, faced with the need to answer such questions as these: (1) Should we try to produce seed inherently suited for south slopes? (2) If so, should we strive to...
A compact two-wave dichrometer of an optical biosensor analytical system for medicine
NASA Astrophysics Data System (ADS)
Chulkov, D. P.; Gusev, V. M.; Kompanets, O. N.; Vereschagin, F. V.; Skuridin, S. G.; Yevdokimov, Yu. M.
2017-01-01
An experimental model has been developed of a compact two-wave dichrometer based on LEDs that is well suited to work with "liquid" DNA nanoconstructions as biosensing units. The mobile and inexpensive device is intended for use in a biosensor analytical system for the rapid determination of biologically active compounds in liquids, to solve practical problems of clinical medicine and pharmacology.
1998-07-31
The advantage in utilizing shape-memory cables made of Nitinol for size reduction of the remote control actuator system is that Nitinol is well suited for a submarine environment because of its non-magnetic and corrosion-resistance properties. Use of thermoelastic Nitinol introduces other problems because of the cooling and resetting properties of Nitinol cables. It is therefore an important object of the present invention to...
Global Behavior in Large Scale Systems
2013-12-05
AIR FORCE RESEARCH LABORATORY, AF OFFICE OF SCIENTIFIC RESEARCH (AFOSR)/RSL, Arlington, Virginia 22203, December 3, 2013. Abstract: This research attained two main achievements: 1) ... microscopic random interactions among the agents. Introduction: In this research we considered two main problems: 1) large deviation error performance in...
PETSc Users Manual Revision 3.7
DOE Office of Scientific and Technical Information (OSTI.GOV)
Balay, Satish; Abhyankar, S.; Adams, M.
This manual describes the use of PETSc for the numerical solution of partial differential equations and related problems on high-performance computers. The Portable, Extensible Toolkit for Scientific Computation (PETSc) is a suite of data structures and routines that provide the building blocks for the implementation of large-scale application codes on parallel (and serial) computers. PETSc uses the MPI standard for all message-passing communication.
PETSc Users Manual Revision 3.8
DOE Office of Scientific and Technical Information (OSTI.GOV)
Balay, S.; Abhyankar, S.; Adams, M.
This manual describes the use of PETSc for the numerical solution of partial differential equations and related problems on high-performance computers. The Portable, Extensible Toolkit for Scientific Computation (PETSc) is a suite of data structures and routines that provide the building blocks for the implementation of large-scale application codes on parallel (and serial) computers. PETSc uses the MPI standard for all message-passing communication.
Using Piezoelectric Ceramics for Dust Mitigation of Space Suits
NASA Technical Reports Server (NTRS)
Angel, Heather K.
2004-01-01
The particles that make up moon dust and Mars soil can be hazardous to an astronaut's health if not handled properly. In the near future, while exploring outer space, astronauts plan to wander the surfaces of unknown planets. During these explorations, dust and soil will cling to their space suits and become embedded in the fabric. The astronauts will track moon dust and Mars soil back into their living quarters. This not only will create a mess, with millions of tiny airborne particles floating around, but will also be dangerous if the fine particles are breathed in and become trapped in an astronaut's lungs. In order to mitigate this problem, engineers and scientists at the NASA Glenn Research Center are investigating ways to remove these particles from space suits. This problem is very difficult due to the nature of the particles: they are extremely small and have jagged edges which can easily latch onto the fibers of the fabric. For the past summer, I have been involved in researching the potential problems, investigating ways to remove the particles, and conducting experiments to validate the techniques. The current technique under investigation uses piezoelectric ceramics embedded in the fabric that vibrate and shake the particles free. The particles will be left on the planet's surface or collected by a vacuum to be disposed of later. The ceramics vibrate when connected to an AC voltage supply and create a small-scale motion similar to what people use at the beach to shake sand off of a beach towel. Because the particles are so small, similar to volcanic ash, caution must be taken to make sure that this technique does not further embed them in the fabric and make removal more difficult. Only a very precise range of frequency and voltage will produce a suitable vibration. My summer project involved many experiments to determine the correct range. Analysis involved hands-on experience with oscilloscopes, amplifiers, piezoelectrics, a high-speed camera, microscopes, and computers. Further research and experiments are planned to better understand and ultimately perfect this technology. Someday, vibration to remove dust may be a vital component of the space exploration program.
NASA Astrophysics Data System (ADS)
Yuan, Chao; Chareyre, Bruno; Darve, Félix
2016-09-01
A pore-scale model is introduced for two-phase flow in dense packings of polydisperse spheres. The model is developed as a component of a more general hydromechanical coupling framework based on the discrete element method, which will be elaborated in future papers and will apply to various processes of interest in soil science, geomechanics, and oil and gas production. Here the emphasis is on the generation of a network of pores mapping the void space between spherical grains, and on the definition of local criteria governing the primary drainage process. The pore space is decomposed by Regular Triangulation, from which a set of pores connected by throats is identified. A local entry capillary pressure is evaluated for each throat, based on the balance of capillary pressure and surface tension at equilibrium. The model reflects the possible entrapment of disconnected patches of the receding wetting phase. It is validated by comparison with drainage experiments. In the last part of the paper, a series of simulations is reported to illustrate size and boundary effects, key questions when studying small samples made of spherical particles, be it in simulations or experiments. Repeated tests on samples of different sizes give evolutions of water content that are not only scattered but also strongly biased for small sample sizes. More than 20,000 spheres are needed to reduce the bias on saturation below 0.02. Additional statistics are generated by subsampling a large sample of 64,000 spheres. They suggest that the minimal sampling volume for evaluating saturation is one hundred times greater than the sampling volume needed for measuring porosity with the same accuracy. This requirement in terms of sample size induces a need for efficient computer codes. The method described herein has a low algorithmic complexity in order to satisfy this requirement. It will be well suited to further developments toward coupled flow-deformation problems in which evolution of the microstructure requires frequent updates of the pore network.
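The drainage criterion described above (a throat is invaded when the applied capillary pressure exceeds its local entry pressure, and only pores connected to the invading phase can be reached) can be sketched as a graph traversal on a toy pore network. This is a minimal illustrative sketch, not the paper's triangulation-based model; the network topology and entry pressures below are arbitrary.

```python
from collections import deque

def drain(throats, inlet, p_c):
    """Invade all pores reachable from the inlet pore through throats whose
    entry capillary pressure is exceeded by the applied pressure p_c.
    throats: dict mapping (pore_a, pore_b) -> entry capillary pressure."""
    adj = {}
    for (a, b), pe in throats.items():
        adj.setdefault(a, []).append((b, pe))
        adj.setdefault(b, []).append((a, pe))
    invaded = {inlet}
    queue = deque([inlet])
    while queue:
        a = queue.popleft()
        for b, pe in adj.get(a, []):
            # local criterion: throat opens only if p_c >= entry pressure
            if b not in invaded and p_c >= pe:
                invaded.add(b)
                queue.append(b)
    return invaded

# toy 4-pore network; entry pressures chosen arbitrarily for illustration
throats = {(0, 1): 1.0, (1, 2): 3.0, (0, 2): 5.0, (2, 3): 2.0}
print(sorted(drain(throats, 0, 3.5)))
```

At p_c = 3.5 the non-wetting phase reaches pore 2 through the (0,1)-(1,2) path even though the direct throat (0,2) stays closed, which is the connectivity effect that makes drainage a network problem rather than a per-throat one. Saturation estimates follow from the invaded pore volumes.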
Moyle, Phillip R.; Causey, J. Douglas
2001-01-01
This report provides chemical analyses for 31 samples collected from various phosphate mine sites in southeastern Idaho (25), northern Utah (2), and western Wyoming (4). The sampling effort was undertaken as a reconnaissance and does not constitute a characterization of mine wastes. Twenty-five samples were collected from waste rock dumps, 2 from stockpiles, and 1 each from slag, tailings, mill shale, and an outcrop. All samples were analyzed for a suite of major, minor, and trace elements. Although the analytical data set for the 31 samples is too small for detailed statistical analysis, a summary of general observations is made.
Potential Nucleosynthetic Sources of the Titanium Isotope Variations in Solar System Materials
NASA Astrophysics Data System (ADS)
Williams, N. H.; Fehr, M. A.; Akram, W. M.; Parkinson, I. J.; Schönbächler, M.
2012-09-01
The Ti isotope ratios of a comprehensive sample suite of solar system materials were analyzed by MC-ICPMS. These data were then used to evaluate nucleosynthetic models for the source of isotopic correlations observed in iron-group elements and Zr.
ANALYTICAL METHODS FOR FUEL OXYGENATES
MTBE (and potentially any other oxygenate) may be present at any petroleum UST site, whether the release is new or old, virtually anywhere in the United States. Consequently, it is prudent to analyze samples for the entire suite of oxygenates as identified in this protocol (i.e....
NASA Astrophysics Data System (ADS)
Wurstner White, S.; Brandenberger, J. M.; Kulongoski, J. T.; Aalseth, C.; Williams, R. M.; Mace, E. K.; Humble, P.; Seifert, A.; Cloutier, J. M.
2015-12-01
Argon-39 has a half-life of 269 years, making it an ideal tracer for groundwater dating in the age range of 50-1000 years. In September 2014, two production wells within the San Joaquin Valley Aquifer System, located in Fresno, CA, were sampled and analyzed for a suite of inorganic and organic contaminants and isotopic constituents. The radiotracers 3H (< 50 years) and 14C (> 1000 years) are routinely measured as part of the U.S. Geological Survey (USGS) National Water Quality Assessment (NAWQA) Enhanced Trends Network project. Adding 39Ar to the suite of tracers provides age data in the intermediate range to refine the groundwater age distribution of mixed waters and establishes groundwater residence times and flow rates. Characterizing the groundwater recharge and flow rate is of particular interest at these wells for determining the sources and movement of contaminants in groundwater, particularly nitrate, DBCP, and perchlorate. The sampled wells were pumped and purged. The sample collection for the 39Ar measurements required extracting the dissolved gases from 3000-5000 L of groundwater using a membrane degasification system with a maximum flow rate of 50 gpm (11.4 m^3/hr). The membranes are hydrophobic plastic hollow fibers. The gas was collected in duplicate large aluminum-coated plastic sample bags. The gas was purified and then counted via direct beta counting using ultra-low-background proportional counters loaded with a mixture of geologic Ar and methane to enhance the sensitivity for Ar measurements. The activity of modern 39Ar is 1.01 Bq/kg Ar, corresponding to an abundance of 0.808 ppq. The measured 39Ar concentrations of the samples from the two groundwater wells were 23.3 and 27.0 percent of modern Ar, from which absolute ages were estimated. The comparison of the groundwater residence times determined using the suite of radiotracers (3H, 39Ar, and 14C) highlighted the value of knowing the intermediate age of groundwater when determining contaminant fate and transport pathways.
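The conversion from percent-modern 39Ar to an apparent age is a direct application of the radioactive decay law with the 269-year half-life quoted above. The sketch below is an illustrative piston-flow (unmixed) age calculation; real interpretation must account for mixing across the age distribution, as the abstract notes.

```python
import math

T_HALF = 269.0  # 39Ar half-life in years (from the abstract)

def ar39_age(percent_modern):
    # Radioactive decay: A/A0 = exp(-lambda * t) with lambda = ln2 / T_HALF,
    # so the apparent (piston-flow) age is t = (T_HALF / ln 2) * ln(A0 / A).
    return T_HALF / math.log(2) * math.log(100.0 / percent_modern)

for pm in (23.3, 27.0):  # the two measured values reported in the abstract
    print(f"{pm}% modern Ar -> apparent age {ar39_age(pm):.0f} yr")
```

Both apparent ages fall squarely in the 50-1000 year window between the 3H and 14C ranges, which is the gap 39Ar is meant to fill.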
Derived heuristics-based consistent optimization of material flow in a gold processing plant
NASA Astrophysics Data System (ADS)
Myburgh, Christie; Deb, Kalyanmoy
2018-01-01
Material flow in a chemical processing plant often follows complicated control laws and involves plant capacity constraints. Importantly, the process involves discrete scenarios which, when modelled in a programming format, involve if-then-else statements. Therefore, the formulation of an optimization problem for such processes becomes complicated, with nonlinear and non-differentiable objective and constraint functions. In handling such problems using classical point-based approaches, users often have to resort to modifications and indirect ways of representing the problem to suit the restrictions associated with classical methods. For a particular gold processing plant optimization problem, these facts are demonstrated by showing results from MATLAB®'s well-known fmincon routine. Thereafter, a customized evolutionary optimization procedure capable of handling all the complexities offered by the problem is developed. Although the evolutionary approach produced results with comparatively less variance over multiple runs, the performance has been enhanced by introducing derived heuristics associated with the problem. In this article, the development and usage of derived heuristics in a practical problem are presented, and their importance for quick convergence of the overall algorithm is demonstrated.
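The point about if-then-else objectives defeating gradient-based solvers, while evolutionary search copes, can be sketched with a toy. The objective below is hypothetical (not the plant model): a smooth cost plus a discontinuous capacity penalty, minimized by a minimal elitist (mu+lambda) evolutionary loop.

```python
import random

def plant_cost(x):
    # Hypothetical stand-in for a plant objective with if-then-else logic:
    # non-differentiable, so gradient-based solvers struggle near the jumps.
    flow, reagent = x
    cost = (flow - 3.0) ** 2 + (reagent - 1.0) ** 2
    if flow > 4.0:            # discrete capacity scenario (step discontinuity)
        cost += 10.0
    if flow + reagent > 4.5:  # capacity constraint handled via penalty
        cost += 100.0 * (flow + reagent - 4.5)
    return cost

def evolve(f, bounds, pop=30, gens=200, seed=1):
    # elitist (mu+lambda) evolution strategy: keep the best half, mutate it
    rng = random.Random(seed)
    xs = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop)]
    for _ in range(gens):
        xs.sort(key=f)
        parents = xs[: pop // 2]
        children = [[min(max(v + rng.gauss(0, 0.1), lo), hi)
                     for v, (lo, hi) in zip(p, bounds)] for p in parents]
        xs = parents + children
    return min(xs, key=f)

best = evolve(plant_cost, [(0.0, 6.0), (0.0, 3.0)])
print([round(v, 2) for v in best], round(plant_cost(best), 3))
```

Because selection only compares cost values, the discontinuities pose no difficulty; this is the property that motivates the customized evolutionary procedure in the article (which additionally injects problem-derived heuristics to speed convergence).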
Messeder, Ana Márcia; Osorio-de-Castro, Claudia Garcia Serpa; Luiza, Vera Lucia
2005-01-01
There are increasing numbers of legal suits concerning access to medicines brought against the Rio de Janeiro State Health Department. The situation indicated the need for a study to clarify the underlying issues. A sample of 389 court suits from January 1991 to December 2001 (stratified by year) was used. A cross-sectional design was used to describe and analyze the legal suits in relation to the responsibilities defined under the Unified National Health System (SUS). Results suggest major delays in court decisions. Most suits are filed by the Public Defender's Office for users of the National Health System. The most frequent cases involve medicines for the cardiovascular and nervous systems, many of which involve continuous use. Prescribing practices are institutionalized through the inclusion of the most frequently prescribed drugs in public financing lists, which makes rational drug use difficult to achieve. Municipalities are not fulfilling their responsibility to supply medicines to users, and the State is thus encumbered with these responsibilities. However, the State does not adequately supply medicines to the municipalities. The apparent lack of awareness among both lawyers and clients generates stress between the Executive and Judiciary branches and limits the resources for collective pharmaceutical services.
Description and use of LSODE, the Livermore Solver for Ordinary Differential Equations
NASA Technical Reports Server (NTRS)
Radhakrishnan, Krishnan; Hindmarsh, Alan C.
1993-01-01
LSODE, the Livermore Solver for Ordinary Differential Equations, is a package of FORTRAN subroutines designed for the numerical solution of the initial value problem for a system of ordinary differential equations. It is particularly well suited for 'stiff' differential systems, for which the backward differentiation formula method of orders 1 to 5 is provided. The code includes the Adams-Moulton method of orders 1 to 12, so it can be used for nonstiff problems as well. In addition, the user can easily switch methods to increase computational efficiency for problems that change character. For both methods a variety of corrector iteration techniques is included in the code. Also, to minimize computational work, both the step size and method order are varied dynamically. This report presents complete descriptions of the code and integration methods, including their implementation. It also provides a detailed guide to the use of the code, as well as an illustrative example problem.
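The stiff/nonstiff distinction that motivates LSODE's two method families can be illustrated with the simplest member of each: forward Euler as an explicit stand-in (the Adams family's order-1 case) and backward Euler, which is the order-1 BDF. This is a toy sketch, not LSODE itself, on the standard stiff test problem y' = -50y.

```python
def explicit_euler(lam, y0, h, steps):
    y = y0
    for _ in range(steps):
        y = y + h * lam * y          # forward Euler: unstable if |1 + h*lam| > 1
    return y

def implicit_euler(lam, y0, h, steps):
    y = y0
    for _ in range(steps):
        y = y / (1.0 - h * lam)      # backward Euler (order-1 BDF): A-stable
    return y

lam, h, steps = -50.0, 0.1, 10       # stiff test problem y' = -50 y, y(0) = 1
print("explicit:", explicit_euler(lam, 1.0, h, steps))
print("implicit:", implicit_euler(lam, 1.0, h, steps))
```

With h = 0.1 the explicit update multiplies the solution by -4 each step and explodes, while the implicit update damps it toward the true (essentially zero) solution; an explicit method would need h < 0.04 here. This step-size restriction, growing with stiffness, is why LSODE provides BDF methods and lets the user switch families as the problem changes character.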
Fourier-Accelerated Nodal Solvers (FANS) for homogenization problems
NASA Astrophysics Data System (ADS)
Leuschner, Matthias; Fritzen, Felix
2017-11-01
Fourier-based homogenization schemes are useful to analyze heterogeneous microstructures represented by 2D or 3D image data. These iterative schemes involve discrete periodic convolutions with global ansatz functions (mostly fundamental solutions). The convolutions are efficiently computed using the fast Fourier transform. FANS operates on nodal variables on regular grids and converges to finite element solutions. Compared to established Fourier-based methods, the number of convolutions is reduced by FANS. Additionally, fast iterations are possible by assembling the stiffness matrix. Due to the related memory requirement, the method is best suited for medium-sized problems. A comparative study involving established Fourier-based homogenization schemes is conducted for a thermal benchmark problem with a closed-form solution. Detailed technical and algorithmic descriptions are given for all methods considered in the comparison. Furthermore, many numerical examples focusing on convergence properties for both thermal and mechanical problems, including also plasticity, are presented.
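The structure shared by Fourier-based homogenization schemes (a fixed-point iteration whose periodic Green-operator convolution is applied in Fourier space) can be sketched in the simplest setting: 1D thermal conduction through a periodic two-phase laminate, where the exact effective conductivity is the harmonic mean. This is the basic fixed-point scheme in miniature, with a naive DFT for self-containment, not FANS or any published code; grid size and conductivities are illustrative.

```python
import cmath

def dft(x):
    n = len(x)
    return [sum(x[j] * cmath.exp(-2j * cmath.pi * k * j / n) for j in range(n))
            for k in range(n)]

def idft(X):
    n = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * j / n) for k in range(n)) / n
            for j in range(n)]

def homogenize_1d(k, E=1.0, k0=None, iters=100):
    # Basic fixed-point scheme (1D thermal case):
    #   e <- E - Gamma0 * ((k - k0) e),  Gamma0(xi) = 1/k0 for xi != 0,
    # where e is the local temperature-gradient field and k0 a reference medium.
    n = len(k)
    if k0 is None:
        k0 = 0.5 * (min(k) + max(k))
    e = [E] * n
    for _ in range(iters):
        tau = [(k[j] - k0) * e[j] for j in range(n)]   # polarization field
        T = dft(tau)
        Enew = [0j] * n
        Enew[0] = complex(E * n)       # enforce the prescribed mean gradient
        for xi in range(1, n):
            Enew[xi] = -T[xi] / k0     # Green operator applied in Fourier space
        e = [z.real for z in idft(Enew)]
    # effective conductivity from the mean flux
    return sum(k[j] * e[j] for j in range(n)) / (n * E)

k = [1.0, 1.0, 4.0, 4.0]               # two-phase laminate in series
print(round(homogenize_1d(k), 6))
```

The iteration converges to a field with spatially constant flux, giving the harmonic mean (1.6 for this laminate). Methods like FANS refine exactly this template: different ansatz functions reduce the number of convolutions per iteration and improve the discretization, but the FFT-applied Green operator remains the core.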
NASA Astrophysics Data System (ADS)
Massioni, Paolo; Massari, Mauro
2018-05-01
This paper describes an interesting and powerful approach to the constrained fuel-optimal control of spacecraft in close relative motion. The proposed approach is well suited for problems governed by linear dynamic equations, and therefore fits perfectly the case of spacecraft flying in close relative motion. If the solution of the optimisation is approximated as a polynomial with respect to the time variable, then the problem can be approached with a technique developed in the control engineering community, known as "Sum of Squares" (SOS), and the constraints can be reduced to bounds on the polynomials. Such a technique allows rewriting polynomial bounding problems in the form of convex optimisation problems, at the cost of a certain amount of conservatism. The principles of the technique are explained and some applications related to spacecraft flying in close relative motion are shown.
Postpartum depression: a chronicle of health policy development.
Glasser, Saralee
2010-01-01
The current report presents an example of the path taken from identification of a public health problem at the primary health service level, to conducting research documenting the scope of the problem and nature of the risk factors, disseminating the findings, and fostering development and application of relevant policy. The example presented is the case of postpartum depression, an issue with bio-psycho-social implications. Public health nurses identified the problem, prompting epidemiological research. The findings encouraged the Ministry of Health (MOH) to conduct a pilot program for screening and early intervention among pregnant and postpartum women reporting depressive symptoms. Based on the results of the pilot program, the MOH is expanding the program to all Mother-Child Health (MCH) clinics. Israel's largest Health Maintenance Organization has followed suit and is including this program in its own clinics. This Israeli experience may serve as an instructive example of a locally identified problem evolving into a national policy.
Bayesian seismic tomography by parallel interacting Markov chains
NASA Astrophysics Data System (ADS)
Gesret, Alexandrine; Bottero, Alexis; Romary, Thomas; Noble, Mark; Desassis, Nicolas
2014-05-01
The velocity field estimated by first-arrival traveltime tomography is commonly used as a starting point for further seismological, mineralogical, tectonic or similar analysis. In order to interpret the results quantitatively, the tomography uncertainty values as well as their spatial distribution are required. The estimated velocity model is obtained through inverse modeling by minimizing an objective function that compares observed and computed traveltimes. This step is often performed by gradient-based optimization algorithms. The major drawback of such local optimization schemes, beyond the possibility of being trapped in a local minimum, is that they do not account for the multiple possible solutions of the inverse problem. They are therefore unable to assess the uncertainties linked to the solution. Within a Bayesian (probabilistic) framework, solving the tomography inverse problem amounts to estimating the posterior probability density function of the velocity model using a global sampling algorithm. Markov chain Monte Carlo (MCMC) methods are known to produce samples of virtually any distribution. In such a Bayesian inversion, the total number of simulations we can afford is closely tied to the computational cost of the forward model. Although fast algorithms have recently been developed for computing first-arrival traveltimes of seismic waves, a complete exploration of the posterior distribution of the velocity model is hardly feasible, especially when it is high dimensional and/or multimodal. In the latter case, the chain may even stay stuck in one of the modes. In order to improve the mixing properties of a classical single MCMC chain, we propose to make several Markov chains at different temperatures interact. This method can make efficient use of large CPU clusters without increasing the global computational cost with respect to classical MCMC, and is therefore particularly suited for Bayesian inversion. The exchanges between the chains allow a precise sampling of the high-probability zones of the model space while preventing the chains from becoming stuck in a probability maximum. This approach thus supplies a robust way to analyze the tomography imaging uncertainties. The interacting MCMC approach is illustrated on two synthetic examples of tomography of calibration shots such as encountered in induced-microseismic studies. In the second application, a wavelet-based model parameterization is presented that significantly reduces the dimension of the problem, making the algorithm efficient even for a complex velocity model.
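The interacting-chain (parallel tempering) mechanism described above can be sketched on a 1D bimodal target, the textbook failure case for a single chain. This is a minimal illustrative sketch, not the authors' tomography sampler; the target density, temperature ladder, and proposal scale are all assumptions chosen for the demo.

```python
import math
import random

def logp(x):
    # bimodal target: equal-weight unit-variance Gaussian modes at -4 and +4
    return math.log(math.exp(-0.5 * (x - 4.0) ** 2) +
                    math.exp(-0.5 * (x + 4.0) ** 2))

def parallel_tempering(n_iter=20000, temps=(1.0, 4.0, 16.0), seed=0):
    rng = random.Random(seed)
    xs = [0.0] * len(temps)
    samples = []
    for _ in range(n_iter):
        # Metropolis update within each tempered chain (target^(1/T))
        for i, T in enumerate(temps):
            prop = xs[i] + rng.gauss(0, 1.0)
            if math.log(rng.random()) < (logp(prop) - logp(xs[i])) / T:
                xs[i] = prop
        # propose a state swap between a random pair of adjacent temperatures
        i = rng.randrange(len(temps) - 1)
        d = (logp(xs[i + 1]) - logp(xs[i])) * (1.0 / temps[i] - 1.0 / temps[i + 1])
        if math.log(rng.random()) < d:
            xs[i], xs[i + 1] = xs[i + 1], xs[i]
        samples.append(xs[0])            # keep only the T = 1 chain
    return samples

s = parallel_tempering()
frac_right = sum(1 for x in s if x > 0) / len(s)
print(round(frac_right, 2))              # fraction of T = 1 samples in the +4 mode
```

A single T = 1 chain started at one mode essentially never crosses the probability barrier, whereas the hot chains cross freely and hand good states down through the swaps, so the cold chain visits both modes; the swap rule preserves the product of the tempered posteriors, leaving the T = 1 marginal exact.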
Physical properties of sidewall cores from Decatur, Illinois
Morrow, Carolyn A.; Kaven, Joern; Moore, Diane E.; Lockner, David A.
2017-10-18
To better assess the reservoir conditions influencing the induced seismicity hazard near a carbon dioxide sequestration demonstration site in Decatur, Ill., core samples from three deep drill holes were tested to determine a suite of physical properties including bulk density, porosity, permeability, Young’s modulus, Poisson’s ratio, and failure strength. Representative samples of the shale cap rock, the sandstone reservoir, and the Precambrian basement were selected for comparison. Physical properties were strongly dependent on lithology. Bulk density was inversely related to porosity, with the cap rock and basement samples being both least porous (
Lunar Reference Suite to Support Instrument Development and Testing
NASA Technical Reports Server (NTRS)
Allen, Carlton; Sellar, Glenn; Nunez, Jorge I.; Winterhalter, Daniel; Farmer, Jack
2010-01-01
Astronauts on long-duration lunar missions will need the capability to "high-grade" their samples - to select the highest value samples for transport to Earth - and to leave others on the Moon. Instruments that may be useful for such high-grading are under development. Instruments are also being developed for possible use on future lunar robotic landers, for lunar field work, and for more sophisticated analyses at a lunar outpost. The Johnson Space Center Astromaterials Acquisition and Curation Office (JSC Curation) will support such instrument testing by providing lunar sample "ground truth".
Carbon Isotopic Ratios of Amino Acids in Stardust-Returned Samples
NASA Technical Reports Server (NTRS)
Elsila, Jamie E.; Glavin, Daniel P.; Dworkin, Jason P.
2009-01-01
NASA's Stardust spacecraft returned to Earth samples from comet 81P/Wild 2 in January 2006. Preliminary examinations revealed the presence of a suite of organic compounds including several amines and amino acids, but the origin of these compounds could not be identified. Here, we present the carbon isotopic ratios of glycine and ε-aminocaproic acid (EACA), the two most abundant amino acids observed, in Stardust-returned foil samples measured by gas chromatography-combustion-isotope ratio mass spectrometry coupled with quadrupole mass spectrometry (GC-QMS/IRMS).
Hawes, Emily M; Pinelli, Nicole R; Sanders, Kimberly A; Lipshutz, Andrew M; Tong, Gretchen; Sievers, Lauren S; Chao, Sarah; Gwynne, Mark
2018-01-01
BACKGROUND Medication-related problems occur at high rates during care transitions. Evidence suggests that pharmacists are well-suited to identify and resolve medication-related problems during hospital admission and at discharge. Additional evidence is needed to understand the impact of face-to-face pharmacist visits in primary care after discharge. The purpose of the study was to describe medication-related problems found during face-to-face pharmacist visits in a medical home after hospital discharge. METHODS A retrospective cohort study was conducted within an academic primary care center staffed by family medicine trained physicians that evaluated patients who attended a hospital follow-up visit with pharmacist-enhanced care (N = 86) versus usual care (N = 86). The primary objective was to describe medication-related problems identified by pharmacists using a modified individualized Medication Assessment and Planning tool for patients receiving pharmacist-enhanced care. Secondary analyses were also conducted to compare 30-day and 60-day hospital readmission and emergency department visit rates in those exposed to pharmacist-enhanced care versus those who were not. RESULTS At baseline, the mean hospitalizations in the prior year were 1.1 ± 1.7 (pharmacist-enhanced care) and 0.76 ± 1.2 (usual care), indicating a low initial readmission risk. Of patients receiving pharmacist-enhanced care, 97.7% were found to have at least 1 medication-related problem, with an average of 4.36 medication-related problems per patient. The 30-day readmission rate was lower, but not significantly different between groups (8.1% for pharmacist-enhanced care versus 12.8% for usual care; adjusted odds ratio (OR), 0.47; 95% confidence interval (CI), 0.16-1.36). LIMITATIONS Limitations include the retrospective cohort study design and small sample size. Medication-related problems were identified and collected prospectively during pharmacist visits. 
CONCLUSION Medication-related problems are ubiquitous after hospital discharge. Larger prospective studies will be needed to understand the potential value of pharmacist-enhanced care during hospital follow-up visits on readmission rates in low-risk patient populations receiving care within a primary care medical home. ©2018 by the North Carolina Institute of Medicine and The Duke Endowment. All rights reserved.
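The reported comparison (8.1% vs. 12.8% readmission, adjusted OR 0.47, CI 0.16-1.36) can be made concrete with an unadjusted odds-ratio sketch. The event counts below (7/86 and 11/86) are an assumption: they are the integer counts consistent with the reported rates, not figures taken from the paper, and the resulting crude OR differs from the paper's covariate-adjusted 0.47.

```python
import math

# Assumed 2x2 table for 30-day readmission (counts inferred from the rates):
a, b = 7, 86 - 7      # pharmacist-enhanced care: events, non-events (8.1%)
c, d = 11, 86 - 11    # usual care: events, non-events (12.8%)

odds_ratio = (a * d) / (b * c)
# Wald 95% confidence interval on the log odds ratio
se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"unadjusted OR = {odds_ratio:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
```

As in the study, the interval spans 1, which is exactly what "lower, but not significantly different" means: with 11 or fewer events per arm the study is underpowered, motivating the larger prospective study the authors call for.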
NASA Astrophysics Data System (ADS)
Hao, Hongda; Campbell, Ian H.; Park, Jung-Woo; Cooke, David R.
2017-11-01
Recent studies have shown that platinum-group elements (PGE) can be used to constrain the timing of sulfide saturation in evolving felsic systems. In this study, we report trace-element, PGE, Re and Au data for the barren and ore-associated suites of intermediate to felsic rocks from the Northparkes Cu-Au porphyry region, emphasizing the timing of sulfide saturation and its influence on the tenor of the associated hydrothermal mineralization. Two barren suites, the Goonumbla and Wombin Volcanics and associated intrusive rocks, are found in the region. Geochemical modelling shows that the barren suites are dominated by plagioclase-pyroxene fractionation, whereas the ore-associated Northparkes Cu-Au porphyry suite is characterized by plagioclase-amphibole fractionation, which requires the ore-bearing suite to have crystallized from a wetter magma than the barren suites. The concentrations of PGE, Re and Au in the barren suites decrease continuously during fractional crystallization. This is attributed to early sulfide saturation, with the fraction of immiscible sulfide precipitation required to produce the observed trend being 0.13 and 0.16 wt.% for the Goonumbla and Wombin suites, respectively. The calculated partition coefficients for Au and Pd required to model the observed change in these elements with MgO are well below published values, indicating that R, the mass ratio of silicate to sulfide melt, played a significant role in controlling the rate of decline of these elements with fractionation. Palladium in the ore-associated suite, in contrast, first increases with fractionation then decreases abruptly at 1.2 wt.% MgO. The sharp decrease is attributed to the onset of sulfide precipitation. Platinum on the other hand shows a moderate decrease, starting from the highest MgO sample analysed, but then decreasing strongly from 1.2 wt.% MgO.
The initial Pt decrease is attributed to precipitation of a platinum-group mineral (PGM), probably a Pt-Fe alloy, and the sharp decrease of both Pt and Pd at 1.2 wt.% MgO to sulfide saturation. We suggest that the Goonumbla and Wombin suites are barren because early sulfide saturation locked most of the Cu and Au in a sulfide phase in the cumulus pile of a deep parental magma chamber, well before volatile saturation, so that when the magma reached volatile saturation, it did not have access to the Cu and Au. This contrasts with the relatively late sulfide saturation in the ore-associated suite, which was followed shortly afterwards by volatile saturation. Rayleigh fractionation concentrated incompatible Cu and Au by at least a factor of five before volatile saturation. The short crystallization interval between immiscible sulfide and volatile saturation allowed some Au and Cu to be stripped from the evolving magma. Gold, with its higher partition coefficient into immiscible sulfide melts, was more affected than Cu. The result is a Cu-Au deposit. Our study also suggests that Rayleigh fractionation is at least as important as the initial concentration of chalcophile elements in the parent magma in determining the fertility of felsic magma suites.
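The rise-then-collapse of Pd described above follows directly from Rayleigh fractionation, C = C0 * F^(D-1): the bulk partition coefficient D is near zero until sulfide saturates, then a tiny mass fraction of immiscible sulfide with a very large sulfide-melt partition coefficient drives D far above 1. The sketch below is a toy with hypothetical numbers (initial Pd, melt fractions, sulfide fraction, and partition coefficients are all assumptions), not a fit to the Northparkes data.

```python
def rayleigh(c0, f, d):
    # Rayleigh fractionation: C = C0 * F**(D - 1), F = melt fraction remaining
    return c0 * f ** (d - 1.0)

# Hypothetical Pd evolution: incompatible (D ~ 0) before sulfide saturation
x_sulf = 2e-4            # assumed wt. fraction sulfide in crystallizing assemblage
d_sulfide = 30000.0      # assumed sulfide-melt partition coefficient for Pd
d_before = 0.01          # bulk D before saturation
d_after = x_sulf * d_sulfide   # bulk D jumps once sulfide joins the assemblage

c_sat = rayleigh(10.0, 0.2, d_before)             # ppb Pd at F = 0.2 (saturation)
c_final = c_sat * (0.1 / 0.2) ** (d_after - 1.0)  # continue fractionating to F = 0.1
print(round(c_sat, 1), round(c_final, 2))
```

Even a sulfide fraction of a few hundred ppm is enough to flip Pd from strong enrichment to a sharp decline, which is why the inflection in Pd (and Pt) pins the onset of sulfide saturation so precisely, and why its timing relative to volatile saturation controls fertility.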
Geological evolution of the Neoproterozoic Bemarivo Belt, northern Madagascar
Thomas, Ronald J.; De Waele, B.; Schofield, D.I.; Goodenough, K.M.; Horstwood, M.; Tucker, R.; Bauer, W.; Annells, R.; Howard, K. J.; Walsh, G.; Rabarimanana, M.; Rafahatelo, J.-M.; Ralison, A.V.; Randriamananjara, T.
2009-01-01
The broadly east-west trending, Late Neoproterozoic Bemarivo Belt in northern Madagascar has been re-surveyed at 1:100 000 scale as part of a large multi-disciplinary World Bank-sponsored project. The work included acquisition of 14 U-Pb zircon dates and whole-rock major and trace element geochemical data of representative rocks. The belt has previously been modelled as a juvenile Neoproterozoic arc and our findings broadly support that model. The integrated datasets indicate that the Bemarivo Belt is separated by a major ductile shear zone into northern and southern "terranes", each with different lithostratigraphy and ages. However, both formed as Neoproterozoic arc/marginal basin assemblages that were translated southwards over the north-south trending domains of "cratonic" Madagascar, during the main collisional phase of the East African Orogeny at ca. 540 Ma. The older, southern terrane consists of a sequence of high-grade paragneisses (Sahantaha Group), which were derived from a Palaeoproterozoic source and formed a marginal sequence to the Archaean cratons to the south. These rocks are intruded by an extensive suite of arc-generated metamorphosed plutonic rocks, known as the Antsirabe Nord Suite. Four samples from this suite yielded U-Pb SHRIMP ages at ca. 750 Ma. The northern terrane consists of three groups of metamorphosed supracrustal rocks, including a possible Archaean sequence (Betsiaka Group: maximum depositional age approximately 2477 Ma) and two volcano-sedimentary sequences (high-grade Milanoa Group: maximum depositional age approximately 750 Ma; low grade Daraina Group: extrusive age = 720-740 Ma). These supracrustal rocks are intruded by another suite of arc-generated metamorphosed plutonic rocks, known as the Manambato Suite, 4 samples of which gave U-Pb SHRIMP ages between 705 and 718 Ma. Whole-rock geochemical data confirm the calc-alkaline, arc-related nature of the plutonic rocks. 
The volcanic rocks of the Daraina and Milanoa groups also show characteristics of arc-related magmatism, but include both calc-alkaline and tholeiitic compositions. It is not certain when the two Bemarivo terranes were juxtaposed, but ages from metamorphic rims on zircon suggest that both the northern and southern terranes were accreted to the northern cratonic margin of Madagascar at about 540-530 Ma. Terrane accretion included the assembly of the Archaean Antongil and Antananarivo cratons and the high-grade Neoproterozoic Anaboriana Belt. Late- to post-tectonic granitoids of the Maevarano Suite, the youngest plutons of which gave ca. 520 Ma ages, intrude all terranes in northern Madagascar showing that terrane accretion was completed by this time. © 2009 Natural Environment Research Council (NERC).
Global Optimization of Interplanetary Trajectories in the Presence of Realistic Mission Constraints
NASA Technical Reports Server (NTRS)
Hinckley, David, Jr.; Englander, Jacob; Hitt, Darren
2015-01-01
Interplanetary missions are often subject to difficult constraints, such as solar phase angle upon arrival at the destination, velocity at arrival, and altitudes for flybys. Preliminary design of such missions is often conducted by solving the unconstrained problem and then filtering away solutions which do not naturally satisfy the constraints. However, this can bias the search into non-advantageous regions of the solution space, so it can be better to conduct preliminary design with the full set of constraints imposed. In this work, two stochastic global search methods are developed which are well suited to the constrained global interplanetary trajectory optimization problem.
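The abstract does not detail the paper's specific methods, but the general approach it argues for, imposing constraints directly within a stochastic global search, can be illustrated with a quadratic penalty that folds constraint violations into the objective. The toy objective, constraint, box bounds, and penalty weight below are all assumptions chosen purely for illustration, not the paper's formulation.

```python
import random

def penalized(objective, constraints, x, weight=1e3):
    """Objective plus a quadratic penalty for each violated constraint.

    Each constraint is a callable g(x) required to satisfy g(x) <= 0;
    infeasible points pay weight * g(x)**2.
    """
    violation = sum(max(0.0, g(x)) ** 2 for g in constraints)
    return objective(x) + weight * violation

def random_search(objective, constraints, bounds, iters=20000, seed=1):
    """Toy stochastic global search: uniform sampling over box bounds."""
    rng = random.Random(seed)
    best_x, best_f = None, float("inf")
    for _ in range(iters):
        x = [rng.uniform(lo, hi) for lo, hi in bounds]
        f = penalized(objective, constraints, x)
        if f < best_f:
            best_x, best_f = x, f
    return best_x, best_f

# Toy problem: minimize (x-1)^2 + (y-2)^2 subject to x + y <= 2.
obj = lambda x: (x[0] - 1.0) ** 2 + (x[1] - 2.0) ** 2
cons = [lambda x: x[0] + x[1] - 2.0]  # g(x) <= 0 form
x_best, f_best = random_search(obj, cons, bounds=[(-5, 5), (-5, 5)])
```

Because the penalty is part of the search objective from the start, the sampler is steered toward the feasible region rather than discarding infeasible solutions after the fact, which is the bias the abstract warns about.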
NASA Technical Reports Server (NTRS)
Stehura, Aaron; Rozek, Matthew
2013-01-01
The complexity of the Mars Science Laboratory (MSL) mission presented the Entry, Descent, and Landing systems engineering team with many challenges in its Verification and Validation (V&V) campaign. This paper describes some of the logistical hurdles related to managing a complex set of requirements, test venues, test objectives, and analysis products in the implementation of a specific portion of the overall V&V program to test the interaction of flight software with the MSL avionics suite. Application-specific solutions to these problems are presented herein, which can be generalized to other space missions and to similar formidable systems engineering problems.
Some spectral approximation of one-dimensional fourth-order problems
NASA Technical Reports Server (NTRS)
Bernardi, Christine; Maday, Yvon
1989-01-01
Some spectral-type collocation methods well suited to the approximation of fourth-order systems are proposed. The model problem is the biharmonic equation, in one and two dimensions, when the boundary conditions are periodic in one direction. It is proved that the standard Gauss-Lobatto nodes are not the best choice for the collocation points. Then, a new set of nodes related to some generalized Gauss-type quadrature formulas is proposed. A complete analysis of these formulas is also provided, including some new results on the asymptotic behavior of the weights, and these results are applied to the analysis of the collocation method.
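For reference, the one-dimensional model problem mentioned above can be written out explicitly; the clamped boundary conditions shown here are one standard choice for the biharmonic problem (the abstract itself also treats a case with periodic conditions in one direction):

```latex
u^{(4)}(x) = f(x), \qquad x \in (-1, 1),
\qquad u(\pm 1) = u'(\pm 1) = 0 .
```

A collocation method enforces this equation exactly at a finite set of interior nodes; the paper's point is that the choice of those nodes (Gauss-Lobatto versus generalized Gauss-type) materially affects the approximation.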
NASA Astrophysics Data System (ADS)
Qin, Chen; Ren, Bin; Guo, Longfei; Dou, Wenhua
2014-11-01
Multi-projector three-dimensional (3D) display is a promising multi-view, glasses-free 3D display technology that can produce full-colour, high-definition 3D images on its screen. One key problem of multi-projector 3D display is how to acquire the source images for the projector array while avoiding the pseudoscopic problem. This paper first analyses the display characteristics of multi-projector 3D displays and then proposes a projector content synthesis method using a tetrahedral transform. A 3D video format based on a stereo image pair and an associated disparity map is presented; it is well suited to any type of multi-projector 3D display and has the advantage of reduced storage usage. Experimental results show that our method solves the pseudoscopic problem.
New evidence favoring multilevel decomposition and optimization
NASA Technical Reports Server (NTRS)
Padula, Sharon L.; Polignone, Debra A.
1990-01-01
The issue of the utility of multilevel decomposition and optimization remains controversial. To date, only the structural optimization community has actively developed and promoted multilevel optimization techniques. However, even this community acknowledges that multilevel optimization is ideally suited for a rather limited set of problems. It is warned that decomposition typically requires eliminating local variables by using global variables and that this in turn causes ill-conditioning of the multilevel optimization by adding equality constraints. The purpose is to suggest a new multilevel optimization technique. This technique uses behavior variables, in addition to design variables and constraints, to decompose the problem. The new technique removes the need for equality constraints, simplifies the decomposition of the design problem, simplifies the programming task, and improves the convergence speed of multilevel optimization compared to conventional optimization.
Multiple crack detection in 3D using a stable XFEM and global optimization
NASA Astrophysics Data System (ADS)
Agathos, Konstantinos; Chatzi, Eleni; Bordas, Stéphane P. A.
2018-02-01
A numerical scheme is proposed for the detection of multiple cracks in three dimensional (3D) structures. The scheme is based on a variant of the extended finite element method (XFEM) and a hybrid optimizer solution. The proposed XFEM variant is particularly well-suited for the simulation of 3D fracture problems, and as such serves as an efficient solution to the so-called forward problem. A set of heuristic optimization algorithms are recombined into a multiscale optimization scheme. The introduced approach proves effective in tackling the complex inverse problem involved, where identification of multiple flaws is sought on the basis of sparse measurements collected near the structural boundary. The potential of the scheme is demonstrated through a set of numerical case studies of varying complexity.
A revised burial dose estimation procedure for optical dating of young and modern-age sediments
Arnold, L.J.; Roberts, R.G.; Galbraith, R.F.; DeLong, S.B.
2009-01-01
The presence of genuinely zero-age or near-zero-age grains in modern-age and very young samples poses a problem for many existing burial dose estimation procedures used in optical (optically stimulated luminescence, OSL) dating. This difficulty currently necessitates consideration of relatively simplistic and statistically inferior age models. In this study, we investigate the potential for using modified versions of the statistical age models of Galbraith et al. [Galbraith, R.F., Roberts, R.G., Laslett, G.M., Yoshida, H., Olley, J.M., 1999. Optical dating of single and multiple grains of quartz from Jinmium rock shelter, northern Australia: Part I, experimental design and statistical models. Archaeometry 41, 339-364.] to provide reliable equivalent dose (De) estimates for young and modern-age samples that display negative, zero or near-zero De estimates. For this purpose, we have revised the original versions of the central and minimum age models, which are based on log-transformed De values, so that they can be applied to un-logged De estimates and their associated absolute standard errors. The suitability of these 'un-logged' age models is tested using a series of known-age fluvial samples deposited within two arroyo systems from the American Southwest. The un-logged age models provide accurate burial doses and final OSL ages for roughly three-quarters of the total number of samples considered in this study. Sensitivity tests reveal that the un-logged versions of the central and minimum age models are capable of producing accurate burial dose estimates for modern-age and very young (<350 yr) fluvial samples that contain (i) more than 20% of well-bleached grains in their De distributions, or (ii) smaller sub-populations of well-bleached grains for which the De values are known with high precision. 
Our results indicate that the original (log-transformed) versions of the central and minimum age models are still preferable for most routine dating applications, since these age models are better suited to the statistical properties of typical single-grain and multi-grain single-aliquot De datasets. However, the unique error properties of modern-age samples, combined with the problems of calculating natural logarithms of negative or zero-Gy De values, mean that the un-logged versions of the central and minimum age models currently offer the most suitable means of deriving accurate burial dose estimates for very young and modern-age samples. © 2009 Elsevier Ltd. All rights reserved.
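The core difficulty the abstract describes, that log-based age models cannot handle zero or negative De values, can be illustrated with a simple inverse-variance weighted mean computed on un-logged doses. This sketch is only an illustration of why un-logged statistics remain well defined for modern-age distributions; it is NOT the full un-logged central age model of the paper, which also fits an overdispersion parameter by maximum likelihood. The toy De values and errors are assumptions.

```python
def weighted_mean_unlogged(de_values, abs_errors):
    """Inverse-variance weighted mean of equivalent doses (De, in Gy).

    Works directly on un-logged De values, so zero and negative doses
    (common in modern-age samples) cause no difficulty, whereas a
    log-transformed model would require log(De) and fail for De <= 0.
    """
    weights = [1.0 / e ** 2 for e in abs_errors]
    total = sum(weights)
    mean = sum(w * d for w, d in zip(weights, de_values)) / total
    std_err = (1.0 / total) ** 0.5  # standard error of the weighted mean
    return mean, std_err

# Toy modern-age De distribution (Gy): near-zero, zero and negative values.
de = [-0.05, 0.00, 0.04, 0.10, 0.02]
err = [0.05, 0.04, 0.03, 0.08, 0.04]
mean, se = weighted_mean_unlogged(de, err)
```

Attempting the same calculation on log-transformed values would fail at the first non-positive dose, which is precisely the motivation for the revised un-logged models.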
Thornber, Carl R.; Budahn, James R.; Ridley, W. Ian; Unruh, Daniel M.
2003-01-01
This open-file report serves as a repository for geochemical data referred to in U.S. Geological Survey Professional Paper 1676 (Heliker, Swanson, and Takahashi, eds., 2003), which includes multidisciplinary research papers pertaining to the first twenty years of Puu Oo Kupaianaha eruption activity. Details of eruption characteristics and nomenclature are provided in the introductory chapter of that volume (Heliker and Mattox, 2003). Geochemical relations of this data are depicted and interpreted by Thornber (2003), Thornber and others (2003a) and Thornber (2001). This report supplements Thornber and others (2003b) in which whole-rock and glass major-element data on ~1000 near-vent lava samples collected during the 1983 to 2001 eruptive interval of Kilauea Volcano, Hawai'i, are presented. Herein, we present whole-rock trace element compositions of 85 representative samples collected from January 1983 to May 2001; glass trace-element compositions of 39 Pele’s Tear (tephra) samples collected from September 1995 to September 1996, and whole-rock Nd, Sr and Pb isotopic analyses of 10 representative samples collected from September 1983 to September 1993. Thornber and others (2003b) provide a specific record of sample characteristics, location, etc., for each of the samples reported here. Spreadsheets of both reports may be integrated and sorted based upon time of formation or sample numbers. General information pertaining to the selectivity and petrologic significance of this sample suite is presented by Thornber and others (2003b). As justified in that report, this select suite of time-constrained geochemical data is suitable for constructing petrologic models of pre-eruptive magmatic processes associated with prolonged rift zone eruption of Hawaiian shield volcanoes.
Leadership style and organisational commitment among nursing staff in Saudi Arabia.
Al-Yami, Mansour; Galdas, Paul; Watson, Roger
2018-03-23
To examine how nurse managers' leadership styles relate to nurses' organisational commitment in Saudi Arabia. Effective leadership is influential in staff retention; however, recruiting and retaining nurses is an increasing problem in Saudi Arabia. Using a survey design, the Multifactor Leadership Questionnaire and the Organisational Commitment Questionnaire were distributed to a sample of 219 nurses and nurse managers from two hospitals in Saudi Arabia. Transformational leadership was the most dominant leadership style. After controlling for the influence of manager/staff status, nationality and hospitals, transformational leadership was the strongest contributor to organisational commitment. Perceptions of both transformational and transactional leadership styles increased with age for nurse managers and nursing staff. Introducing the Full Range of Leadership model to the Saudi nursing workforce could help to prepare Saudi nurses for positions as nurse managers and leaders. The study provides insight into the type of leadership that is best suited to the dynamic and changing health care system in Saudi Arabia. It is possible that transformational leaders could influence and induce positive changes in nursing. © 2018 The Authors. Journal of Nursing Management Published by John Wiley & Sons Ltd.