Sample records for RSIL lab code

  1. Determination of the delta(2H/1H) of Water: RSIL Lab Code 1574

    USGS Publications Warehouse

    Revesz, Kinga; Coplen, Tyler B.

    2008-01-01

    Reston Stable Isotope Laboratory (RSIL) lab code 1574 describes a method used to determine the relative hydrogen isotope-ratio delta(2H/1H), abbreviated hereafter as d2H, of water. The d2H measurement of water also is a component of the National Water Quality Laboratory (NWQL) schedules 1142 and 1172. The water is collected unfiltered in a 60-mL glass bottle and capped with a Polyseal cap. In the laboratory, the water sample is equilibrated with gaseous hydrogen using a platinum catalyst (Horita, 1988; Horita and others, 1989; Coplen and others, 1991). The reaction for the exchange of one hydrogen atom is shown in equation 1.
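The delta notation used throughout these records can be sketched in a few lines. A minimal illustration, assuming VSMOW as the reference water; the function name and the sample ratio are ours, not part of the RSIL method:

```python
# Minimal sketch of delta notation as used in these records.
# The 2H/1H ratio of the VSMOW reference water is an accepted constant;
# the sample ratio in the example is illustrative only.

VSMOW_2H_1H = 0.00015576  # n(2H)/n(1H) of Vienna Standard Mean Ocean Water

def delta_per_mil(r_sample: float, r_reference: float) -> float:
    """Relative isotope-ratio difference, in per mill (parts per thousand)."""
    return (r_sample / r_reference - 1.0) * 1000.0

# A water sample depleted in 2H relative to VSMOW yields a negative d2H:
print(round(delta_per_mil(0.00014800, VSMOW_2H_1H), 1))  # prints -49.8
```

A sample with the reference ratio itself gives exactly 0 per mill, which is what anchors the scale.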

  2. Determination of the δ15N of nitrate in solids; RSIL lab code 2894

    USGS Publications Warehouse

    Coplen, Tyler B.; Qi, Haiping; Revesz, Kinga; Casciotti, Karen; Hannon, Janet E.

    2007-01-01

    The purpose of the Reston Stable Isotope Laboratory (RSIL) lab code 2894 is to determine the δ15N of nitrate (NO3-) in solids. The nitrate fraction of the nitrogen species is dissolved by water (called leaching) and can be analyzed by the bacterial method covered in RSIL lab code 2899. After leaching, the δ15N of the dissolved NO3- is analyzed by conversion of the NO3- to nitrous oxide (N2O), which serves as the analyte for mass spectrometry. A culture of denitrifying bacteria is used in the enzymatic conversion of NO3- to N2O, which follows the pathway shown in equation 1: NO3- → NO2- → NO → 1/2 N2O (1) Because the bacteria Pseudomonas aureofaciens lack N2O reductive activity, the reaction stops at N2O, unlike the typical denitrification reaction that goes to N2. After several hours, the conversion is complete, and the N2O is extracted from the vial, separated from volatile organic vapor and water vapor by an automated -65 °C isopropanol-slush trap, a Nafion drier, a CO2 and water removal unit (Costech #021020 carbon dioxide absorbent with Mg(ClO4)2), and trapped in a small-volume trap immersed in liquid nitrogen with a modified Finnigan MAT (now Thermo Scientific) GasBench 2 introduction system. After the N2O is released, it is further purified by gas chromatography before introduction to the isotope-ratio mass spectrometer (IRMS). The IRMS is a Thermo Scientific Delta V Plus continuous flow IRMS (CF-IRMS). It has a universal triple collector, consisting of two wide cups with a narrow cup in the middle; it is capable of simultaneously measuring mass/charge (m/z) of the N2O molecule 44, 45, and 46. The ion beams from these m/z values are as follows: m/z = 44 = N2O = 14N14N16O; m/z = 45 = N2O = 14N15N16O or 14N14N17O; m/z = 46 = N2O = 14N14N18O. The 17O contributions to the m/z 44 and m/z 45 ion beams are accounted for before δ15N values are reported.
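The 17O accounting mentioned in the last sentence can be sketched as a mass balance on the m/z 45 beam: the measured 45/44 ratio sums the 15N and 17O contributions, and the 17O term is estimated from 18O through a mass-dependent relation. The exponent and reference ratios below are common literature values used for illustration, not the RSIL calibration:

```python
# Sketch of the 17O correction on the m/z 45 beam of N2O:
#   (45/44) measured = R15 + R17
# where R15 is the combined 15N/14N contribution and R17 is 17O/16O.
# R17 is estimated from the measured 18O/16O ratio via a mass-dependent
# power law; the exponent 0.516 and the VSMOW ratios are illustrative.

R17_VSMOW = 0.0003799  # 17O/16O of the VSMOW reference
R18_VSMOW = 0.0020052  # 18O/16O of the VSMOW reference

def r15_corrected(r45_measured: float, r18_measured: float,
                  exponent: float = 0.516) -> float:
    """Remove the estimated 17O contribution from the measured 45/44 ratio."""
    r17 = R17_VSMOW * (r18_measured / R18_VSMOW) ** exponent
    return r45_measured - r17

# A sample whose 18O/16O equals the reference has exactly R17_VSMOW removed:
print(r15_corrected(0.0077, R18_VSMOW))
```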

  3. Determination of the δ13C of dissolved inorganic carbon in water; RSIL lab code 1710

    USGS Publications Warehouse

    Singleton, Glenda L.; Revesz, Kinga; Coplen, Tyler B.

    2012-01-01

    The purpose of the Reston Stable Isotope Laboratory (RSIL) lab code 1710 is to present a method to determine the δ13C of dissolved inorganic carbon (DIC) of water. The DIC of water is precipitated using ammoniacal strontium chloride (SrCl2) solution to form strontium carbonate (SrCO3). The δ13C is analyzed by reacting SrCO3 with 100-percent phosphoric acid (H3PO4) to liberate carbon quantitatively as carbon dioxide (CO2), which is collected, purified by vacuum sublimation, and analyzed by dual inlet isotope-ratio mass spectrometry (DI-IRMS). The DI-IRMS is a DuPont double-focusing mass spectrometer. One ion beam passes through a slit in a forward collector and is collected in the rear collector. The other measurable ion beams are collected in the front collector. By changing the ion-accelerating voltage under computer control, the instrument is capable of measuring mass/charge (m/z) 45 or 46 in the rear collector and m/z 44 and 46 or 44 and 45, respectively, in the front collector. The ion beams from these m/z values are as follows: m/z 44 = CO2 = 12C16O16O, m/z 45 = CO2 = 13C16O16O primarily, and m/z 46 = CO2 = 12C16O18O primarily. The data acquisition and control software calculates δ13C values.

  4. Determination of the δ15N and δ18O of nitrate in solids; RSIL lab code 2897

    USGS Publications Warehouse

    Coplen, Tyler B.; Qi, Haiping; Revesz, Kinga; Casciotti, Karen; Hannon, Janet E.

    2007-01-01

    The purpose of the Reston Stable Isotope Laboratory (RSIL) lab code 2897 is to determine the δ15N and δ18O of nitrate (NO3-) in solids. The NO3- fraction of the nitrogen species is dissolved by water (called leaching) and can be analyzed by the bacterial method covered in RSIL lab code 2900. After leaching, the δ15N and δ18O of the dissolved NO3- are analyzed by conversion of the NO3- to nitrous oxide (N2O), which serves as the analyte for mass spectrometry. A culture of denitrifying bacteria is used in the enzymatic conversion of NO3- to N2O, which follows the pathway shown in equation 1: NO3- → NO2- → NO → 1/2 N2O (1) Because the bacteria Pseudomonas aureofaciens lack N2O reductive activity, the reaction stops at N2O, unlike the typical denitrification reaction that goes to N2. After several hours, the conversion is complete, and the N2O is extracted from the vial, separated from volatile organic vapor and water vapor by an automated -65 °C isopropanol-slush trap, a Nafion drier, a CO2 and water removal unit (Costech #021020 carbon dioxide absorbent with Mg(ClO4)2), and trapped in a small-volume trap immersed in liquid nitrogen with a modified Finnigan MAT (now Thermo Scientific) GasBench 2 introduction system. After the N2O is released, it is further purified by gas chromatography before introduction to the isotope-ratio mass spectrometer (IRMS). The IRMS is a Thermo Scientific Delta V Plus continuous flow IRMS (CF-IRMS). It has a universal triple collector, consisting of two wide cups with a narrow cup in the middle; it is capable of simultaneously measuring mass/charge (m/z) of the N2O molecule 44, 45, and 46. The ion beams from these m/z values are as follows: m/z = 44 = N2O = 14N14N16O; m/z = 45 = N2O = 14N15N16O or 14N14N17O; m/z = 46 = N2O = 14N14N18O. The 17O contributions to the m/z 44 and m/z 45 ion beams are accounted for before δ15N values are reported.

  5. Automated determination of the stable carbon isotopic composition (δ13C) of total dissolved inorganic carbon (DIC) and total nonpurgeable dissolved organic carbon (DOC) in aqueous samples: RSIL lab codes 1851 and 1852

    USGS Publications Warehouse

    Révész, Kinga M.; Doctor, Daniel H.

    2014-01-01

    The purposes of the Reston Stable Isotope Laboratory (RSIL) lab codes 1851 and 1852 are to determine the total carbon mass and the ratio of the stable isotopes of carbon (δ13C) for total dissolved inorganic carbon (DIC, lab code 1851) and total nonpurgeable dissolved organic carbon (DOC, lab code 1852) in aqueous samples. The analysis procedure is automated according to a method that utilizes a total carbon analyzer as a peripheral sample preparation device for analysis of carbon dioxide (CO2) gas by a continuous-flow isotope ratio mass spectrometer (CF-IRMS). The carbon analyzer produces CO2 and determines the carbon mass in parts per million (ppm) of DIC and DOC in each sample separately, and the CF-IRMS determines the carbon isotope ratio of the produced CO2. This configuration provides a fully automated analysis of total carbon mass and δ13C with no operator intervention, additional sample preparation, or other manual analysis. To determine the DIC, the carbon analyzer transfers a specified sample volume to a heated (70 °C) reaction vessel with a preprogrammed volume of 10% phosphoric acid (H3PO4), which allows the carbonate and bicarbonate species in the sample to dissociate to CO2. The CO2 from the reacted sample is subsequently purged with a flow of helium gas that sweeps the CO2 through an infrared CO2 detector and quantifies the CO2. The CO2 is then carried through a high-temperature (650 °C) scrubber reactor, a series of water traps, and ultimately to the inlet of the mass spectrometer. For the analysis of total dissolved organic carbon, the carbon analyzer performs a second step on the sample in the heated reaction vessel during which a preprogrammed volume of sodium persulfate (Na2S2O8) is added, and the hydroxyl radicals oxidize the organics to CO2. Samples containing 2 ppm to 30,000 ppm of carbon are analyzed. The precision of the carbon isotope analysis is within 0.3 per mill for DIC, and within 0.5 per mill for DOC.
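The working range and precision figures stated in this abstract suggest simple acceptance checks. A sketch under our own naming; reading "precision" as the allowed spread between replicate measurements is our assumption:

```python
# Sketch of acceptance checks implied by the stated figures: samples must
# contain 2-30,000 ppm carbon, and the carbon isotope precision is 0.3 per
# mill for DIC and 0.5 per mill for DOC. Function names are illustrative.

TOLERANCE_PER_MIL = {"DIC": 0.3, "DOC": 0.5}

def in_working_range(ppm_carbon: float) -> bool:
    """True when the carbon mass lies inside the analyzable 2-30,000 ppm range."""
    return 2.0 <= ppm_carbon <= 30_000.0

def replicates_agree(fraction, d13c_values) -> bool:
    """True when replicate delta13C values spread no wider than the tolerance."""
    return max(d13c_values) - min(d13c_values) <= TOLERANCE_PER_MIL[fraction]

# Example: two DIC replicates 0.2 per mill apart pass the 0.3 per mill check.
print(replicates_agree("DIC", [-12.1, -12.3]))
```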

  6. Determination of the δ15N of total nitrogen in solids; RSIL lab code 2893

    USGS Publications Warehouse

    Revesz, Kinga; Qi, Haiping; Coplen, Tyler B.

    2006-01-01

    The purpose of the Reston Stable Isotope Laboratory (RSIL) lab code 2893 is to determine the δ(15N/14N), abbreviated as δ15N, of total nitrogen in solid samples. A Carlo Erba NC 2500 elemental analyzer (EA) is used to convert total nitrogen in a solid sample into N2 gas. The EA is connected to a continuous flow isotope-ratio mass spectrometer (CF-IRMS), which determines the relative difference in the isotope-amount ratios of stable nitrogen isotopes (15N/14N) of the product N2 gas. The combustion is quantitative; no isotopic fractionation is involved. Samples are placed in a tin capsule and loaded into the Costech Zero Blank Autosampler of the EA. Under computer control, samples are dropped into a heated reaction tube that contains an oxidant, where the combustion takes place in a helium atmosphere containing an excess of oxygen gas. Combustion products are transported by a helium carrier through a reduction tube to remove excess oxygen and convert all nitrous oxides into N2 and through a drying tube to remove water. The gas-phase products, mainly CO2 and N2, are separated by a gas chromatograph. The gas is then introduced into the isotope-ratio mass spectrometer (IRMS) through a Finnigan MAT (now Thermo Scientific) ConFlo II interface, which also is used to inject N2 reference gas and helium for sample dilution. The IRMS is a Thermo Scientific Delta V Plus CF-IRMS. It has a universal triple collector, two wide cups with a narrow cup in the middle, capable of measuring mass/charge (m/z) 28, 29, 30, simultaneously. The ion beams from N2 are as follows: m/z 28 = N2 = 14N14N; m/z 29 = N2 = 14N15N primarily; m/z 30 = NO = 14N16O primarily, which is a sign of contamination or incomplete reduction.
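The link between the 15N abundance and the m/z 29/28 beam ratio follows from random pairing of N atoms into N2 molecules. A short sketch; the atmospheric value in the example is a standard figure, not from this record:

```python
# Sketch linking the 15N atom fraction x to the N2 ion-beam ratio 29/28.
# With random pairing of N atoms, P(14N14N) = (1-x)^2 and
# P(14N15N) = 2x(1-x), so (m/z 29)/(m/z 28) = 2x(1-x)/(1-x)^2 = 2x/(1-x).

def beam_ratio_29_28(x15: float) -> float:
    """m/z 29 to m/z 28 ion-beam ratio of N2 for a 15N atom fraction x15."""
    return 2.0 * x15 / (1.0 - x15)

# Atmospheric N2 has a 15N atom fraction near 0.003663:
print(round(beam_ratio_29_28(0.003663), 6))  # prints 0.007353
```

Note the factor of two: a single 15N substitution can occur at either position in the molecule, which is why the beam ratio is roughly twice the atom ratio.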

  7. Determination of the δ34S of Total Sulfur in Solids: RSIL Lab Code 1800

    USGS Publications Warehouse

    Revesz, Kinga; Coplen, Tyler B.

    2006-01-01

    The purpose of Reston Stable Isotope Laboratory (RSIL) lab code 1800 is to determine the δ(34S/32S), abbreviated as δ34S, of total sulfur in a solid sample. A Carlo Erba NC 2500 elemental analyzer (EA) is used to convert total sulfur in a solid sample into SO2 gas. The EA is connected to a continuous flow isotope-ratio mass spectrometer (CF-IRMS), which determines the relative difference in stable sulfur isotope-amount ratio (34S/32S) of the product SO2 gas. The combustion is quantitative; no isotopic fractionation is involved. Samples are placed in tin capsules and loaded into a Costech Zero-Blank Autosampler on the EA. Under computer control, samples are dropped into a heated reaction tube that combines both the oxidation and the reduction reactions. The combustion takes place in a He atmosphere that contains an excess of oxygen gas at the oxidation zone at the top of the reaction tube. Combustion products are transported by a He carrier through the reduction zone at the bottom of the reaction tube to remove excess oxygen and through a separate drying tube to remove any water. The gas-phase products, mainly CO2, N2, and SO2, are separated by a gas chromatograph (GC). The gas is then introduced into the isotope-ratio mass spectrometer (IRMS) through a Thermo-Finnigan ConFlo II interface, which also is used to inject SO2 reference gas and He for sample dilution. The IRMS is a Thermo-Finnigan DeltaPlus CF-IRMS. It has a universal triple collector with two wide cups and a narrow cup in the middle. It is capable of measuring mass/charge (m/z) 64 and 66 simultaneously. The ion beams from SO2 are as follows: m/z 64 = SO2 = 32S16O16O; and m/z 66 = SO2 = 34S16O16O primarily.

  8. Determination of the delta(18O/16O) of Water: RSIL Lab Code 489

    USGS Publications Warehouse

    Revesz, Kinga; Coplen, Tyler

    2008-01-01

    The purpose of the technique described by the Reston Stable Isotope Laboratory (RSIL) lab code 489 is to present a method to determine the delta(18O/16O), abbreviated as delta-18O, of water. This delta-18O measurement of water also is a component of the USGS National Water Quality Laboratory (NWQL) schedules 1142 and 1172. Water samples are loaded into glass sample containers on a vacuum manifold to equilibrate gaseous CO2 at constant temperature (25 °C) with water samples. After loading water samples on the vacuum manifold, air is evacuated through a capillary to avoid evaporation, and CO2 is added. The samples are shaken to increase the equilibration rate of water and CO2. When isotopic equilibrium has been attained, an aliquot of CO2 is extracted sequentially from each sample container, separated from water vapor by means of a dry ice trap, and introduced into a dual-inlet isotope-ratio mass spectrometer (DI-IRMS) for determination of the delta-18O value. There is oxygen isotopic fractionation between water and CO2, but it is constant at constant temperature. The DI-IRMS is a DuPont double-focusing mass spectrometer. It has a double collector. One ion beam passes through a slit in a forward collector and is collected in the rear collector. The other ion beams are collected in the front collector. The instrument is capable of measuring mass/charge (m/z) 44 and 45 or 44 and 46 by changing the ion-accelerating voltage under computer control. The ion beams from these m/z values are as follows: m/z 44 = CO2 = 12C16O16O, m/z 45 = CO2 = 13C16O16O primarily, and m/z 46 = CO2 = 12C16O18O primarily. The data acquisition and control software calculates delta-18O values.
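The constant CO2-water fractionation at 25 °C noted above is what lets the water's delta-18O be recovered from the measured CO2. A sketch; the fractionation factor 1.0412 is a commonly cited literature value used here for illustration, not the RSIL calibration:

```python
# Sketch of recovering delta-18O of the water from the equilibrated CO2,
# using a constant CO2-water fractionation factor alpha at 25 °C.
# alpha = 1.0412 is a commonly cited value, assumed here for illustration.

ALPHA_CO2_H2O_25C = 1.0412

def d18o_water(d18o_co2_per_mil: float,
               alpha: float = ALPHA_CO2_H2O_25C) -> float:
    """Convert delta-18O of the equilibrated CO2 to delta-18O of the water."""
    return ((d18o_co2_per_mil / 1000.0 + 1.0) / alpha - 1.0) * 1000.0

# Under this alpha, CO2 at +41.2 per mill corresponds to water near 0 per mill.
print(round(d18o_water(41.2), 3))
```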

  9. Determination of the δ15N of nitrate in water; RSIL lab code 2899

    USGS Publications Warehouse

    Coplen, Tyler B.; Qi, Haiping; Revesz, Kinga; Casciotti, Karen; Hannon, Janet E.

    2007-01-01

    The purpose of the Reston Stable Isotope Laboratory (RSIL) lab code 2899 is to determine the δ15N of nitrate (NO3-) in water. The δ15N of the dissolved NO3- is analyzed by conversion of the NO3- to nitrous oxide (N2O), which serves as the analyte for mass spectrometry. A culture of denitrifying bacteria is used in the enzymatic conversion of the NO3- to N2O, which follows the pathway shown in equation 1: NO3- → NO2- → NO → 1/2 N2O (1) Because the bacteria Pseudomonas aureofaciens lack N2O reductive activity, the reaction stops at N2O, unlike the typical denitrification reaction that goes to N2. After several hours, the conversion is complete, and the N2O is extracted from the vial, separated from volatile organic vapor and water vapor by an automated -65 °C isopropanol-slush trap, a Nafion drier, a CO2 and water removal unit (Costech #021020 carbon dioxide absorbent with Mg(ClO4)2), and trapped in a small-volume trap immersed in liquid nitrogen with a modified Finnigan MAT (now Thermo Scientific) GasBench 2 introduction system. After the N2O is released, it is further purified by gas chromatography before introduction to the isotope-ratio mass spectrometer (IRMS). The IRMS is a Thermo Scientific Delta V Plus continuous flow IRMS (CF-IRMS). It has a universal triple collector, consisting of two wide cups with a narrow cup in the middle; it is capable of simultaneously measuring mass/charge (m/z) of the N2O molecule 44, 45, and 46. The ion beams from these m/z values are as follows: m/z = 44 = N2O = 14N14N16O; m/z = 45 = N2O = 14N15N16O or 14N14N17O; m/z = 46 = N2O = 14N14N18O. The 17O contributions to the m/z 44 and m/z 45 ion beams are accounted for before δ15N values are reported.

  10. Determination of the δ34S of low-concentration sulfate in water; RSIL lab code 1949

    USGS Publications Warehouse

    Revesz, Kinga; Qi, Haiping; Coplen, Tyler B.

    2006-01-01

    The purpose of the Reston Stable Isotope Laboratory (RSIL) lab code 1949 is to determine the δ(34S/32S), abbreviated as δ34S, of dissolved sulfate having a concentration less than 20 milligrams per liter. Dissolved sulfate is collected on an anion-exchange resin in the field, eluted in the laboratory with 3 M KCl, and precipitated with BaCl2 at pH 3 to 4 as BaSO4. The precipitated BaSO4 is filtered and dried before introduction into an elemental analyzer (EA) Carlo Erba NC 2500. The EA is used to convert sulfur in a BaSO4 solid sample into SO2 gas, and the EA is connected to a continuous flow isotope-ratio mass spectrometer (CF-IRMS), which determines differences in the isotope-amount ratios of stable sulfur isotopes (34S/32S) of the product SO2 gas. The combustion is quantitative; no isotopic fractionation is involved. Samples are placed in a tin capsule and loaded into the Costech Zero Blank Autosampler of the EA. Under computer control, samples are dropped into a heated reaction tube that combines the oxidation and reduction reactions. The combustion takes place in a helium atmosphere containing an excess of oxygen gas at the oxidation zone at the top of the reaction tube. Combustion products are transported by a helium carrier through the reduction zone at the bottom of the reaction tube to remove excess oxygen and through a separate drying tube to remove any water. The gas-phase products, mainly CO2, N2, and SO2, are separated by a gas chromatograph. The gas is then introduced into the isotope-ratio mass spectrometer (IRMS) through a Finnigan MAT (now Thermo Scientific) ConFlo II interface, which is also used to inject SO2 reference gas and helium for sample dilution. The IRMS is a Thermo Scientific Delta V Plus CF-IRMS. It has a universal triple collector with two wide cups and a narrow cup in the middle. It is capable of measuring mass/charge (m/z) 64 and 66 simultaneously. 
The ion beams from SO2 are as follows: m/z 64 = SO2 = 32S16O16O; m/z 66 = SO2 = 34S16O16O primarily.

  11. Determination of the δ15N and δ18O of nitrate in water; RSIL lab code 2900

    USGS Publications Warehouse

    Coplen, Tyler B.; Qi, Haiping; Revesz, Kinga; Casciotti, Karen; Hannon, Janet E.

    2007-01-01

    The purpose of the Reston Stable Isotope Laboratory (RSIL) lab code 2900 is to determine the δ15N and δ18O of nitrate (NO3-) in water. The δ15N and δ18O of the dissolved NO3- are analyzed by converting the NO3- to nitrous oxide (N2O), which serves as the analyte for mass spectrometry. A culture of denitrifying bacteria is used in the enzymatic conversion of the NO3- to N2O, which follows the pathway shown in equation 1: NO3- → NO2- → NO → 1/2 N2O (1) Because the bacteria Pseudomonas aureofaciens lack N2O reductive activity, the reaction stops at N2O, unlike the typical denitrification reaction that goes to N2. After several hours, the conversion is complete, and the N2O is extracted from the vial, separated from volatile organic vapor and water vapor by an automated -65 °C isopropanol-slush trap, a Nafion drier, a CO2 and water removal unit (Costech #021020 carbon dioxide absorbent with Mg(ClO4)2), and trapped in a small-volume trap immersed in liquid nitrogen with a modified Finnigan MAT (now Thermo Scientific) GasBench 2 introduction system. After the N2O is released, it is further purified by gas chromatography before introduction to the isotope-ratio mass spectrometer (IRMS). The IRMS is a Thermo Scientific Delta V Plus continuous flow IRMS (CF-IRMS). It has a universal triple collector, consisting of two wide cups with a narrow cup in the middle; it is capable of simultaneously measuring mass/charge (m/z) of the N2O molecule 44, 45, and 46. The ion beams from these m/z values are as follows: m/z = 44 = N2O = 14N14N16O; m/z = 45 = N2O = 14N15N16O or 14N14N17O; m/z = 46 = N2O = 14N14N18O. The 17O contributions to the m/z 44 and m/z 45 ion beams are accounted for before δ15N values are reported.

  12. Determination of the δ34S of sulfate in water; RSIL lab code 1951

    USGS Publications Warehouse

    Revesz, Kinga; Qi, Haiping; Coplen, Tyler B.

    2006-01-01

    The purpose of the Reston Stable Isotope Laboratory (RSIL) lab code 1951 is to determine the δ(34S/32S), abbreviated as δ34S, of dissolved sulfate. Dissolved sulfate is collected in the field and precipitated with BaCl2 at pH 3 to 4 as BaSO4 in the laboratory. However, the dissolved organic sulfur (DOS) is oxidized to SO2, and the carbonate is acidified to CO2. Both are degassed from the water sample before the sulfate is precipitated. The precipitated BaSO4 is filtered and dried before introduction into an elemental analyzer (EA) Carlo Erba NC 2500. The EA is used to convert sulfur in a BaSO4 solid sample into SO2 gas, and the EA is connected to a continuous flow isotope-ratio mass spectrometer (CF-IRMS), which determines the differences in the isotope-amount ratios of stable sulfur isotopes (34S/32S) of the product SO2 gas. The combustion is quantitative; no isotopic fractionation is involved. Samples are placed in a tin capsule and loaded into the Costech Zero Blank Autosampler of the EA. Under computer control, samples are dropped into a heated reaction tube that combines the oxidation and reduction reactions. The combustion takes place in a helium atmosphere containing an excess of oxygen gas at the oxidation zone at the top of the reaction tube. Combustion products are transported by a helium carrier through the reduction zone at the bottom of the reaction tube to remove excess oxygen and through a separate drying tube to remove any water. The gas-phase products, mainly CO2, N2, and SO2, are separated by a gas chromatograph. The gas is then introduced into the isotope-ratio mass spectrometer (IRMS) through a Finnigan MAT (now Thermo Scientific) ConFlo II interface, which also is used to inject SO2 reference gas and helium for sample dilution. The IRMS is a Thermo Scientific Delta V Plus CF-IRMS. It has a universal triple collector with two wide cups and a narrow cup in the middle. It is capable of measuring mass/charge (m/z) 64 and 66 simultaneously. 
The ion beams from SO2 are as follows: m/z 64 = SO2 = 32S16O16O; m/z 66 = SO2 = 34S16O16O primarily.

  13. Determination of the δ15N and δ13C of total nitrogen and carbon in solids; RSIL lab code 1832

    USGS Publications Warehouse

    Revesz, Kinga; Qi, Haiping; Coplen, Tyler B.

    2006-01-01

    The purpose of the Reston Stable Isotope Laboratory (RSIL) lab code 1832 is to determine the δ(15N/14N), abbreviated as δ15N, and the δ(13C/12C), abbreviated as δ13C, of total nitrogen and carbon in a solid sample. A Carlo Erba NC 2500 elemental analyzer (EA) is used to convert total nitrogen and carbon in a solid sample into N2 and CO2 gas. The EA is connected to a continuous flow isotope-ratio mass spectrometer (CF-IRMS), which determines the relative difference in stable nitrogen isotope-amount ratio (15N/14N) of the product N2 gas and the relative difference in stable carbon isotope-amount ratio (13C/12C) of the product CO2 gas. The combustion is quantitative; no isotopic fractionation is involved. Samples are placed in tin capsules and loaded into a Costech Zero Blank Autosampler on the EA. Under computer control, samples then are dropped into a heated reaction tube that contains an oxidant, where combustion takes place in a helium atmosphere containing an excess of oxygen gas. Combustion products are transported by a helium carrier through a reduction furnace to remove excess oxygen and to convert all nitrous oxides into N2 and through a drying tube to remove water. The gas-phase products, mainly CO2 and N2, are separated by a gas chromatograph. The gas is then introduced into the IRMS through a Finnigan MAT (now Thermo Scientific) ConFlo II interface. The Finnigan MAT ConFlo II interface is used for introducing not only sample into the IRMS but also N2 and CO2 reference gases and helium for sample dilution. The IRMS is a Thermo Scientific Delta V CF-IRMS. It has a universal triple collector, two wide cups with a narrow cup in the middle; it is capable of measuring mass/charge (m/z) 28, 29, 30 or with a magnet current change 44, 45, 46, simultaneously. 
The ion beams from these m/z values are as follows: m/z 28 = N2 = 14N14N; m/z 29 = N2 = 14N15N primarily; m/z 30 = NO = 14N16O primarily, which is a sign of contamination or incomplete reduction; m/z 44 = CO2 = 12C16O16O; m/z 45 = CO2 = 13C16O16O primarily; and m/z 46 = CO2 = 12C16O18O primarily.

  14. Is Single-Port Laparoscopy More Precise and Faster with the Robot?

    PubMed

    Fransen, Sofie A F; van den Bos, Jacqueline; Stassen, Laurents P S; Bouvy, Nicole D

    2016-11-01

    Single-port laparoscopy is a step forward toward nearly scarless surgery. Concern has been raised that single-incision laparoscopic surgery (SILS) is technically more challenging because of the lack of triangulation and the clashing of instruments. Robotic single-incision laparoscopic surgery (RSILS) in a chopstick setting might overcome these problems. This study evaluated the outcome in time and errors of two tasks of the Fundamentals of Laparoscopic Surgery on a dry platform, in two settings: SILS versus RSILS. Nine experienced laparoscopic surgeons performed two tasks, peg transfer and a suturing task, on a standard box trainer. All participants practiced each task three times in both settings: SILS and RSILS. The assessment scores (time and errors) were recorded. For the first task, peg transfer, RSILS was significantly better in time (124 versus 230 seconds, P = .0004) and errors (0.80 errors versus 2.60 errors, P = .024) at the first run, compared to the SILS setting. At the third and final run, RSILS still proved to be significantly better in errors (0.10 errors versus 0.80 errors, P = .025) compared to the SILS group. RSILS was faster in the third run, but not significantly so (116 versus 157 seconds, P = .08). For the second task, suturing, only 3 participants in the SILS group were able to perform the task within the set time frame of 600 seconds. There was no significant difference in time across the three runs between SILS and RSILS for the 3 participants who fulfilled both tasks within the 600 seconds. This study shows that robotic single-port surgery seems to make basic tasks of the Fundamentals of Laparoscopic Surgery easier, faster, and more precise to perform. For the more complex task of suturing, only the single-port robotic setting enabled all participants to fulfill the task within the set time frame.

  15. Determination of the delta(15N/14N)of Ammonium (NH4+) in Water: RSIL Lab Code 2898

    USGS Publications Warehouse

    Hannon, Janet E.; Böhlke, John Karl

    2008-01-01

    The purpose of the technique described by Reston Stable Isotope Laboratory (RSIL) lab code 2898 is to determine the N isotopic composition, delta(15N/14N), abbreviated as d15N, of ammonium (NH4+) in water (freshwater and saline water). The procedure involves converting dissolved NH4+ into NH3 gas by raising the pH of the sample to above 9 with MgO and subsequently trapping the gas quantitatively as (NH4)2SO4 on a glass fiber (GF) filter. The GF filter is saturated with NaHSO4 and pressure sealed between two gas-permeable polypropylene filters. The GF filter 'sandwich' floats on the surface of the water sample in a closed bottle. NH3 diffuses from the water through the polypropylene filter and reacts with NaHSO4, forming (NH4)2SO4 on the GF filter. The GF filter containing (NH4)2SO4 is dried and then combusted with a Carlo Erba NC 2500 elemental analyzer (EA), which is used to convert total nitrogen in a solid sample into N2 gas. The EA is connected to a continuous-flow isotope-ratio mass spectrometer (CF-IRMS), which determines the relative difference in ratios of the amounts of the stable isotopes of nitrogen (15N and 14N) of the product N2 gas and a reference N2 gas. The filters containing the samples are compressed in tin capsules and loaded into a Costech Zero-Blank Autosampler on the EA. Under computer control, samples then are dropped into a heated reaction tube that contains an oxidant, where combustion takes place in a He atmosphere containing an excess of O2 gas. To remove S-O gases produced from the NaHSO4, a plug of Ag-coated Cu wool is inserted at the bottom of the reaction tube. Combustion products are transported by a He carrier through a reduction furnace to remove excess O2, to convert all nitrogen oxides to N2, and to remove any remaining S-O gases. The gases then pass through a drying tube to remove water. The gas-phase products, mainly N2 and a small amount of background CO2, are separated by a gas chromatograph (GC). 
The gas is then introduced into the IRMS through a Finnigan ConFlo II interface. The ConFlo II interface is used to introduce not only sample into the IRMS but also N2 reference gas and He for sample dilution. The flash combustion is quantitative, so no isotopic fractionation is involved. The IRMS is a Finnigan Delta V CF-IRMS with 10 cups and is capable of detecting ion beams with mass/charge (m/z) 28, 29, 30. The ion beams from N2 are as follows: m/z 28 = 14N14N, m/z 29 = 14N15N, and m/z 30 = 15N15N. The ion beam with m/z 30 also represents 14N16O, which may indicate contamination or incomplete reduction.

  16. Advanced Stirling Convertor Dual Convertor Controller Testing at NASA Glenn Research Center in the Radioisotope Power Systems System Integration Laboratory

    NASA Technical Reports Server (NTRS)

    Dugala, Gina M.; Taylor, Linda M.; Bell, Mark E.; Dolce, James L.; Fraeman, Martin; Frankford, David P.

    2015-01-01

    NASA Glenn Research Center developed a nonnuclear representation of a Radioisotope Power System (RPS) consisting of a pair of Advanced Stirling Convertors (ASCs), Dual Convertor Controller (DCC) EMs (engineering models) 2 and 3, and associated support equipment, which were tested in the Radioisotope Power Systems System Integration Laboratory (RSIL). The DCC was designed by the Johns Hopkins University Applied Physics Laboratory (JHU/APL) to actively control a pair of ASCs. The first phase of testing included a Dual Advanced Stirling Convertor Simulator (DASCS), which was developed by JHU/APL and simulates the operation and electrical behavior of a pair of ASCs in real time via a combination of hardware and software. RSIL provides insight into the electrical interactions between a representative radioisotope power generator, its associated control schemes, and realistic electric system loads. The first phase of integration testing included the following spacecraft bus configurations: capacitive, battery, and super-capacitor. A load profile, created based on data from several missions, tested the RPS's and RSIL's ability to maintain operation during load demands above and below the power provided by the RPS. The integration testing also confirmed the DCC's ability to disconnect from the spacecraft when the bus voltage dipped below 22 volts or exceeded 36 volts. Once operation was verified with the DASCS, the tests were repeated with actual operating ASCs. The goal of this integration testing was to verify operation of the DCC when connected to a spacecraft and to verify the functionality of the newly designed RSIL. The results of these tests are presented in this paper.
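The 22-36 V disconnect window described above reduces to a simple threshold rule. A minimal sketch; the thresholds come from the text, while the function name and exact boundary handling are ours:

```python
# Sketch of the DCC bus-voltage disconnect rule described above: disconnect
# when the spacecraft bus voltage drops below 22 V or exceeds 36 V.
# Thresholds are from the text; the function itself is illustrative.

LOW_CUTOFF_V = 22.0
HIGH_CUTOFF_V = 36.0

def should_disconnect(bus_voltage: float) -> bool:
    """True when the bus voltage falls outside the 22-36 V operating window."""
    return bus_voltage < LOW_CUTOFF_V or bus_voltage > HIGH_CUTOFF_V

# Undervoltage and overvoltage both trip the rule; mid-range does not:
print(should_disconnect(21.9), should_disconnect(28.0), should_disconnect(36.1))
```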

  17. Advanced Stirling Convertor Dual Convertor Controller Testing at NASA Glenn Research Center in the Radioisotope Power Systems System Integration Laboratory

    NASA Technical Reports Server (NTRS)

    Dugala, Gina M.; Taylor, Linda M.; Bell, Mark E.; Dolce, James L.; Fraeman, Martin; Frankford, David P.

    2015-01-01

    NASA Glenn Research Center (GRC) developed a nonnuclear representation of a Radioisotope Power System (RPS) consisting of a pair of Advanced Stirling Convertors (ASCs), Dual Convertor Controller (DCC) EMs (engineering models) 2 and 3, and associated support equipment, which were tested in the Radioisotope Power Systems System Integration Laboratory (RSIL). The DCC was designed by the Johns Hopkins University Applied Physics Laboratory (JHU/APL) to actively control a pair of ASCs. The first phase of testing included a Dual Advanced Stirling Convertor Simulator (DASCS), which was developed by JHU/APL and simulates the operation and electrical behavior of a pair of ASCs in real time via a combination of hardware and software. RSIL provides insight into the electrical interactions between a representative radioisotope power generator, its associated control schemes, and realistic electric system loads. The first phase of integration testing included the following spacecraft bus configurations: capacitive, battery, and supercapacitor. A load profile, created based on data from several missions, tested the RPS's and RSIL's ability to maintain operation during load demands above and below the power provided by the RPS. The integration testing also confirmed the DCC's ability to disconnect from the spacecraft when the bus voltage dipped below 22 V or exceeded 36 V. Once operation was verified with the DASCS, the tests were repeated with actual operating ASCs. The goal of this integration testing was to verify operation of the DCC when connected to a spacecraft and to verify the functionality of the newly designed RSIL. The results of these tests are presented in this paper.
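
    The under/over-voltage disconnect described above reduces to a threshold check. A minimal sketch in Python, assuming a simple stateless rule; the thresholds come from the text, but the real DCC logic (including any hysteresis or timing) is not described here:

```python
# Hypothetical sketch of the bus-voltage disconnect rule described above:
# the DCC disconnects from the spacecraft bus below 22 V or above 36 V.
# Only the thresholds come from the text; the rest is illustrative.

UNDER_VOLTAGE_V = 22.0
OVER_VOLTAGE_V = 36.0

def should_disconnect(bus_voltage: float) -> bool:
    """Return True if the controller should isolate from the bus."""
    return bus_voltage < UNDER_VOLTAGE_V or bus_voltage > OVER_VOLTAGE_V
```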

  18. Determination of the δ2H and δ18O of soil water and water in plant matter; RSIL lab code 1700

    USGS Publications Warehouse

    Revesz, Kinga M.; Buck, Bryan; Coplen, Tyler B.

    2012-01-01

    The purpose of the Reston Stable Isotope Laboratory (RSIL) lab code 1700 is to determine the relative hydrogen isotope ratio, δ(2H/1H), abbreviated as δ2H, and the relative oxygen isotope ratio, δ(18O/16O), abbreviated as δ18O, of soil water and water in plant matter. This method is based on the observation that water and toluene form an azeotropic mixture at 84.1 °C. This temperature is substantially lower than the boiling points of water (100 °C) and toluene (110 °C), but water and toluene are immiscible at ambient temperature. The water content of a soil or plant is determined by weighing, drying, and reweighing a small amount of sample. Sufficient sample to collect 3 to 5 milliliters of water after distillation is loaded into a distillation flask. Sufficient toluene is added so that the sample is immersed throughout the entire distillation to minimize evaporation of water, which would affect the δ2H and δ18O values. The mixture of sample and toluene is heated in a flask to its boiling point (84.1 °C) so that water from the sample and toluene can distill together into a specially designed collection funnel. The temperature of 84.1 °C is maintained until the water has been quantitatively transferred to the collection funnel, at which time the temperature is raised to the boiling point of the remaining component (toluene, 110 °C). The collection funnel is maintained at ambient temperature so that the sample water and toluene can be separated physically. After separation, the sample water is purified by addition of paraffin wax to the container with the sample water, capping the container, and heating to approximately 60 °C to melt the wax. Trace amounts of toluene will dissolve in the wax, purifying the sample water for isotopic analysis. The isotopic composition of the purified water is then determined by equilibration with gaseous hydrogen or carbon dioxide, followed by dual-inlet isotope-ratio mass spectrometry.
Because laser-absorption spectrometry is sensitive to organic compounds, such as trace toluene remaining in the sample, these water samples should be analyzed for isotopic composition only by mass spectrometry and not by laser-absorption spectrometry.
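
    Two bookkeeping steps in the method above, gravimetric water content from the weigh/dry/reweigh cycle and the per-mil delta notation used to report isotope ratios, can be sketched as follows; the function names are illustrative:

```python
# Sketch of two calculations underlying the method above. Variable and
# function names are illustrative, not RSIL's actual worksheet.

def water_content_fraction(wet_mass_g: float, dry_mass_g: float) -> float:
    """Mass fraction of water lost on drying (weigh, dry, reweigh)."""
    return (wet_mass_g - dry_mass_g) / wet_mass_g

def delta_per_mil(r_sample: float, r_standard: float) -> float:
    """delta = (R_sample / R_standard - 1) * 1000, reported in per mil."""
    return (r_sample / r_standard - 1.0) * 1000.0
```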

  19. NHEXAS PHASE I ARIZONA STUDY--STANDARD OPERATING PROCEDURE FOR CODING: ARIZONA LAB DATA (UA-D-13.0)

    EPA Science Inventory

    The purpose of this SOP is to define the coding strategy for Arizona Lab Data. This strategy was developed for use in the Arizona NHEXAS project and the "Border" study. Keywords: data; coding; lab data forms.

    The National Human Exposure Assessment Survey (NHEXAS) is a federal ...

  20. U.S.-MEXICO BORDER PROGRAM ARIZONA BORDER STUDY--STANDARD OPERATING PROCEDURE FOR CODING: ARIZONA LAB DATA (UA-D-13.0)

    EPA Science Inventory

    The purpose of this SOP is to define the coding strategy for Arizona Lab Data. This strategy was developed for use in the Arizona NHEXAS project and the Border study. Keywords: data; coding; lab data forms.

    The U.S.-Mexico Border Program is sponsored by the Environmental Healt...

  1. Pion and Kaon Lab Frame Differential Cross Sections for Intermediate Energy Nucleus-Nucleus Collisions

    NASA Technical Reports Server (NTRS)

    Norbury, John W.; Blattnig, Steve R.

    2008-01-01

    Space radiation transport codes require accurate models for hadron production in intermediate energy nucleus-nucleus collisions. Codes require cross sections to be written in terms of lab frame variables and it is important to be able to verify models against experimental data in the lab frame. Several models are compared to lab frame data. It is found that models based on algebraic parameterizations are unable to describe intermediate energy differential cross section data. However, simple thermal model parameterizations, when appropriately transformed from the center of momentum to the lab frame, are able to account for the data.
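
    The center-of-momentum-to-lab transformation mentioned above is, at its core, a longitudinal Lorentz boost of each particle's energy and momentum. A minimal kinematics sketch (illustrative only; not the paper's thermal-model parameterization):

```python
import math

# Minimal sketch of the center-of-momentum -> lab frame step mentioned
# above: a longitudinal Lorentz boost of a particle's energy and
# longitudinal momentum. This illustrates the kinematics only, not the
# paper's thermal-model parameterization of the cross section.

def boost_to_lab(E_cm, p_cm, cos_theta_cm, beta):
    """Boost (E, p_z) by frame velocity beta; returns lab-frame E, p_z.

    Natural units (c = 1); transverse momentum is unchanged by the boost.
    """
    gamma = 1.0 / math.sqrt(1.0 - beta**2)
    pz_cm = p_cm * cos_theta_cm
    E_lab = gamma * (E_cm + beta * pz_cm)
    pz_lab = gamma * (pz_cm + beta * E_cm)
    return E_lab, pz_lab
```

As a sanity check, the boost must preserve the invariant mass E² − p² for purely longitudinal momentum.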

  2. Microsoft Licenses Berkeley Lab's Home Energy Saver Code for Its Energy

    Science.gov Websites

    A Web-based tool for calculating energy use in residential buildings. About one million people visit the Home Energy Saver ...

  3. Improving Care for Veterans with PTSD: Comparing Risks and Benefits of Antipsychotics Versus Other Medications to Augment First-Line Pharmacologic Therapy

    DTIC Science & Technology

    2017-10-01

    for all project Aims. Timeline- months 3-6. Status: completed. Task 6: Complete primary analyses and hypothesis testing for Aim 2, including...glucose. For each of these lab tests, each VA site can name them something different and can change names over time. Labs should be linked to Logical Observation Identifiers Names (LOINC) codes, an international standard system that assigns a numeric code to specific lab tests. However, VA data

  4. PFMCal: Photonic force microscopy calibration extended for its application in high-frequency microrheology

    NASA Astrophysics Data System (ADS)

    Butykai, A.; Domínguez-García, P.; Mor, F. M.; Gaál, R.; Forró, L.; Jeney, S.

    2017-11-01

    The present document is an update of the previously published MatLab code for the calibration of optical tweezers in the high-resolution detection of the Brownian motion of non-spherical probes [1]. In this instance, an alternative version of the original code, based on the same physical theory [2] but focused on automating the calibration of measurements using spherical probes, is outlined. The newly added code is useful for high-frequency microrheology studies, where the probe radius is known but the viscosity of the surrounding fluid may not be. This extended calibration methodology is automatic, without the need for a user interface. A code for calibration by means of thermal noise analysis [3] is also included; this is a method that can be applied when using viscoelastic fluids if the trap stiffness is previously estimated [4]. The new code can be executed in MatLab and in GNU Octave. Program Files doi:http://dx.doi.org/10.17632/s59f3gz729.1 Licensing provisions: GPLv3 Programming language: MatLab 2016a (MathWorks Inc.) and GNU Octave 4.0 Operating system: Linux and Windows. Supplementary material: A new document README.pdf includes basic running instructions for the new code. Journal reference of previous version: Computer Physics Communications, 196 (2015) 599 Does the new version supersede the previous version?: No. It adds alternative but compatible code while providing similar calibration factors. Nature of problem: The original code uses a MatLab-provided user interface, which is not available in GNU Octave, and cannot be used outside proprietary software such as MatLab. In addition, the calibration of spherical probes needs an automated method when processing large amounts of data for microrheology. Solution method: The new code can be executed in the latest version of MatLab and in GNU Octave, a free and open-source alternative to MatLab.
This code implements an automatic calibration process that requires only writing the input data in the main script. Additionally, we include a calibration method based on thermal noise statistics, which can be used with viscoelastic fluids if the trap stiffness is previously estimated. Reasons for the new version: This version extends the functionality of PFMCal for the particular case of spherical probes and unknown fluid viscosities. The extended code is automatic, works in different operating systems, and is compatible with GNU Octave. Summary of revisions: The original MatLab program in the previous version, which is executed by PFMCal.m, is not changed. Here, we have added two additional main scripts named PFMCal_auto.m and PFMCal_histo.m, which implement automatic calculation of the calibration process and calibration through Boltzmann statistics, respectively. The process of calibration using this code for spherical beads is described in the README.pdf file provided in the new code submission. Here, we obtain different calibration factors, β (given in μm/V), according to [2], related to two statistical quantities: the mean-squared displacement (MSD), βMSD, and the velocity autocorrelation function (VAF), βVAF. Using that methodology, the trap stiffness, k, and the zero-shear viscosity of the fluid, η, can be calculated if the value of the particle's radius, a, is known beforehand. For comparison, we include in the extended code the method of calibration using the corner frequency of the power-spectral density (PSD) [5], providing a calibration factor βPSD. In addition, with a prior estimate of the trap stiffness, along with the known value of the particle's radius, we can use thermal noise statistics to obtain calibration factors, β, according to the quadratic form of the optical potential, βE, and related to the Gaussian distribution of the bead's positions, βσ2.
This method has been demonstrated to be applicable to the calibration of optical tweezers when using non-Newtonian viscoelastic polymeric liquids [4]. An example of the results using this calibration process is summarized in Table 1. Using the data provided in the new code submission, for water and acetone, we calculate all the calibration factors using the original PFMCal.m and the new non-GUI code PFMCal_auto.m and PFMCal_histo.m. Regarding the new code, PFMCal_auto.m returns η, k, βMSD, βVAF and βPSD, while PFMCal_histo.m provides βσ2 and βE. Table 1 shows that we obtain the expected viscosity of the two fluids at this temperature and that the different methods provide good agreement between trap stiffnesses and calibration factors. Additional comments (restrictions and unusual features): The original code, PFMCal.m, runs under MatLab using the Statistics Toolbox. The extended code, PFMCal_auto.m and PFMCal_histo.m, can be executed without modification in MatLab or GNU Octave. The code has been tested on Linux and Windows operating systems.
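
    The corner-frequency method referenced above rests on the textbook relation between the Lorentzian PSD corner frequency, the Stokes drag, and the trap stiffness. A minimal sketch in Python, assuming a spherical bead in a Newtonian fluid (not PFMCal's actual code):

```python
import math

# Sketch of the PSD corner-frequency calibration mentioned above: for an
# optically trapped sphere, the position PSD is a Lorentzian with corner
# frequency f_c = k / (2*pi*gamma), where gamma = 6*pi*eta*a is the
# Stokes drag. Inverting gives the trap stiffness k. This is the
# standard textbook relation, not PFMCal's implementation.

def stokes_drag(eta_pa_s: float, radius_m: float) -> float:
    """Stokes drag coefficient for a sphere: gamma = 6*pi*eta*a (kg/s)."""
    return 6.0 * math.pi * eta_pa_s * radius_m

def trap_stiffness_from_corner(f_c_hz: float, eta_pa_s: float,
                               radius_m: float) -> float:
    """Trap stiffness k = 2*pi*gamma*f_c (N/m)."""
    return 2.0 * math.pi * stokes_drag(eta_pa_s, radius_m) * f_c_hz
```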

  5. Tri-Lab Co-Design Milestone: In-Depth Performance Portability Analysis of Improved Integrated Codes on Advanced Architecture.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoekstra, Robert J.; Hammond, Simon David; Richards, David

    2017-09-01

    This milestone is a tri-lab deliverable supporting ongoing Co-Design efforts impacting applications in the Integrated Codes (IC) and Advanced Technology Development and Mitigation (ATDM) program elements. In FY14, the tri-labs looked at porting proxy applications to technologies of interest for ATS procurements. In FY15, a milestone was completed evaluating proxy applications in multiple programming models, and in FY16, a milestone was completed focusing on the migration of lessons learned back into production code development. This year, the co-design milestone focuses on migrating the knowledge gained and code revisions back into production applications.

  6. Spatial-Gain Improvement Resulting from Left/Right Discriminating Elements of an Underwater Towed Array

    DTIC Science & Technology

    1981-09-15

    SPATIAL-GAIN IMPROVEMENT RESULTING FROM LEFT/RIGHT DISCRIMINATING ELEMENTS OF AN UNDERWATER TOWED ARRAY by Ronald A. Wagstaff and Pietro Zanasca, 15 September 1981. ABSTRACT: The improvement in

  7. TReacLab: An object-oriented implementation of non-intrusive splitting methods to couple independent transport and geochemical software

    NASA Astrophysics Data System (ADS)

    Jara, Daniel; de Dreuzy, Jean-Raynald; Cochepin, Benoit

    2017-12-01

    Reactive transport modeling contributes to understanding geophysical and geochemical processes in subsurface environments. Operator splitting methods have been proposed as non-intrusive coupling techniques that optimize the use of existing chemistry and transport codes. In this spirit, we propose a coupler relying on external geochemical and transport codes with appropriate operator segmentation that enables possible development of additional splitting methods. We provide an object-oriented implementation in TReacLab, developed in the MATLAB environment as a free, open-source framework with an accessible repository. TReacLab contains classical coupling methods, template interfaces, and calling functions for two classical transport and reactive software packages (PHREEQC and COMSOL). It is tested on four classical benchmarks with homogeneous and heterogeneous reactions at equilibrium or kinetically controlled. We show that full decoupling down to the implementation level has a cost in terms of accuracy compared to more integrated and optimized codes. Use of non-intrusive implementations like TReacLab is still justified for coupling independent transport and chemical software with minimal development effort, but it should be systematically and carefully assessed.
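
    The sequential, non-iterative operator splitting that such couplers implement can be sketched in a few lines: within each time step, a transport operator and a chemistry operator are applied one after the other. The upwind-advection and first-order-decay operators below are illustrative stand-ins, not TReacLab's interface:

```python
import numpy as np

# Minimal sketch of sequential (non-iterative) operator splitting: each
# time step applies transport, then chemistry, to the concentration
# field. Transport here is first-order upwind advection and chemistry is
# first-order decay; both operators and parameters are illustrative.

def advect_upwind(c, velocity, dx, dt):
    """First-order upwind advection with zero inflow at the left edge."""
    c_new = c.copy()
    c_new[1:] -= velocity * dt / dx * (c[1:] - c[:-1])
    c_new[0] -= velocity * dt / dx * c[0]
    return c_new

def react_decay(c, k, dt):
    """Exact first-order decay over one step."""
    return c * np.exp(-k * dt)

def step_sequential(c, velocity, dx, dt, k):
    """One split step: transport first, then chemistry."""
    return react_decay(advect_upwind(c, velocity, dx, dt), k, dt)
```

With the pulse away from the outflow boundary, one step conserves advected mass and scales the total by exp(−k·dt).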

  8. A Series of Computational Neuroscience Labs Increases Comfort with MATLAB.

    PubMed

    Nichols, David F

    2015-01-01

    Computational simulations allow for a low-cost, reliable means to demonstrate complex and often inaccessible concepts to undergraduates. However, students without prior computer programming training may find working with code-based simulations to be intimidating and distracting. A series of computational neuroscience labs involving the Hodgkin-Huxley equations, an Integrate-and-Fire model, and a Hopfield Memory network were used in an undergraduate neuroscience laboratory component of an introductory level course. Using short focused surveys before and after each lab, student comfort levels were shown to increase drastically, from a majority of students being uncomfortable or neutral about working in the MATLAB environment to a vast majority of students being comfortable working in the environment. Though change was reported within each lab, a series of labs was necessary in order to establish a lasting high level of comfort. Comfort working with code is important as a first step in acquiring computational skills that are required to address many questions within neuroscience.
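
    The Integrate-and-Fire model used in one of these labs fits in a few lines of code; the parameter values below are illustrative, not those from the course materials, and the sketch is in Python rather than MATLAB:

```python
import numpy as np

# A minimal leaky integrate-and-fire neuron of the kind used in the labs
# described above. All parameter values are illustrative defaults.

def simulate_lif(i_ext, dt=0.1, tau=10.0, v_rest=-65.0, v_thresh=-50.0,
                 v_reset=-65.0):
    """Euler-integrate tau*dV/dt = -(V - v_rest) + i_ext.

    Returns the membrane-voltage trace (mV) and spike times (ms).
    """
    v = v_rest
    trace, spikes = [], []
    for step, i in enumerate(i_ext):
        v += dt / tau * (-(v - v_rest) + i)
        if v >= v_thresh:          # threshold crossing -> spike and reset
            spikes.append(step * dt)
            v = v_reset
        trace.append(v)
    return np.array(trace), spikes
```

A constant suprathreshold drive produces regular spiking, which is the standard first exercise with this model.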

  9. A Series of Computational Neuroscience Labs Increases Comfort with MATLAB

    PubMed Central

    Nichols, David F.

    2015-01-01

    Computational simulations allow for a low-cost, reliable means to demonstrate complex and often inaccessible concepts to undergraduates. However, students without prior computer programming training may find working with code-based simulations to be intimidating and distracting. A series of computational neuroscience labs involving the Hodgkin-Huxley equations, an Integrate-and-Fire model, and a Hopfield Memory network were used in an undergraduate neuroscience laboratory component of an introductory level course. Using short focused surveys before and after each lab, student comfort levels were shown to increase drastically, from a majority of students being uncomfortable or neutral about working in the MATLAB environment to a vast majority of students being comfortable working in the environment. Though change was reported within each lab, a series of labs was necessary in order to establish a lasting high level of comfort. Comfort working with code is important as a first step in acquiring computational skills that are required to address many questions within neuroscience. PMID:26557798

  10. Projectile and Lab Frame Differential Cross Sections for Electromagnetic Dissociation

    NASA Technical Reports Server (NTRS)

    Norbury, John W.; Adamczyk, Anne; Dick, Frank

    2008-01-01

    Differential cross sections for electromagnetic dissociation in nuclear collisions are calculated for the first time. In order to be useful for three-dimensional transport codes, these cross sections have been calculated in both the projectile and lab frames. The formulas for these cross sections are such that they can be immediately used in space radiation transport codes. Only a limited amount of data exists, but the comparison between theory and experiment is good.

  11. A&M. TAN607 second floor plan for cold assembly area. Metallurgical ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    A&M. TAN-607 second floor plan for cold assembly area. Metallurgical lab, chemistry lab, nuclear instrument lab, equipment rooms. Ralph M. Parsons 902-ANP-607-A 102. Date: December 1952. Approved by INEEL Classification Office for public release. INEEL index code no. 034-0607-693-106754 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  12. Inertial Impact Switches for Artillery Fuzes. Part III. Rocket Application

    DTIC Science & Technology

    1975-04-01

    ... ENGR LAB COMMANDER NAVAL SURFACE WEAPONS CENTER WHITE OAK, MD 20910 ATTN CODE 043, PROJ MGR, FUZES ATTN CODE 511, ATTN CODE 512, ATTN CODE 522

  13. NSDann2BS, a neutron spectrum unfolding code based on neural networks technology and two bonner spheres

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ortiz-Rodriguez, J. M.; Reyes Alfaro, A.; Reyes Haro, A.

    In this work a neutron spectrum unfolding code based on artificial intelligence technology is presented. The code, called "Neutron Spectrometry and Dosimetry with Artificial Neural Networks and two Bonner spheres" (NSDann2BS), was designed in a graphical user interface under the LabVIEW programming environment. The main features of this code are to use an embedded artificial neural network architecture optimized with the "Robust design of artificial neural networks methodology" and to use two Bonner spheres as the only piece of information. In order to build the code presented here, once the net topology was optimized and properly trained, knowledge stored in synaptic weights was extracted and, using a graphical framework built on the LabVIEW programming environment, the NSDann2BS code was designed. This code is friendly, intuitive and easy to use for the end user. The code is freely available upon request to the authors. To demonstrate the use of the neural net embedded in the NSDann2BS code, the rate counts of 252Cf, 241AmBe and 239PuBe neutron sources measured with a Bonner spheres system were used.

  14. NSDann2BS, a neutron spectrum unfolding code based on neural networks technology and two bonner spheres

    NASA Astrophysics Data System (ADS)

    Ortiz-Rodríguez, J. M.; Reyes Alfaro, A.; Reyes Haro, A.; Solís Sánches, L. O.; Miranda, R. Castañeda; Cervantes Viramontes, J. M.; Vega-Carrillo, H. R.

    2013-07-01

    In this work a neutron spectrum unfolding code based on artificial intelligence technology is presented. The code, called "Neutron Spectrometry and Dosimetry with Artificial Neural Networks and two Bonner spheres" (NSDann2BS), was designed in a graphical user interface under the LabVIEW programming environment. The main features of this code are to use an embedded artificial neural network architecture optimized with the "Robust design of artificial neural networks methodology" and to use two Bonner spheres as the only piece of information. In order to build the code presented here, once the net topology was optimized and properly trained, knowledge stored in synaptic weights was extracted and, using a graphical framework built on the LabVIEW programming environment, the NSDann2BS code was designed. This code is friendly, intuitive and easy to use for the end user. The code is freely available upon request to the authors. To demonstrate the use of the neural net embedded in the NSDann2BS code, the rate counts of 252Cf, 241AmBe and 239PuBe neutron sources measured with a Bonner spheres system were used.
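
    At run time such an unfolding reduces to a forward pass of the trained network: two Bonner-sphere count rates in, an unfolded spectrum out. A sketch with placeholder random weights and an assumed 8-bin output; the real code uses weights fixed by prior training and its own optimized topology:

```python
import numpy as np

# Sketch of the forward pass of a small feedforward net like the one
# embedded in NSDann2BS: two Bonner-sphere count rates in, an unfolded
# spectrum (here an assumed 8 energy bins) out. The weights below are
# random placeholders; the real code uses weights fixed by training.

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(8, 2)), rng.normal(size=8)   # hidden layer
W2, b2 = rng.normal(size=(8, 8)), rng.normal(size=8)   # output layer

def unfold(count_rates):
    """Map two count rates to an 8-bin spectrum via a tanh hidden layer."""
    h = np.tanh(W1 @ np.asarray(count_rates) + b1)
    return W2 @ h + b2
```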

  15. Laboratory Accreditation Bureau (L-A-B)

    DTIC Science & Technology

    2011-03-28

    to all Technical Advisors. Must agree with code of conduct, confidentiality and our mission. DoD ELAP Program: ISO/IEC 17025:2005 and DoD QSM...Additional DoD QSM requirements fit well in the current 17025 process ... just much, much more. Sector specific. Outcome (L-A-B case)

  16. Gravitational and Magnetic Anomaly Inversion Using a Tree-Based Geometry Representation

    DTIC Science & Technology

    2009-06-01

    find successive minimized vectors. Throughout this paper, the term iteration refers to a single loop through a stage of the global scheme, not...BOX 12211 RESEARCH TRIANGLE PARK NC 27709-2211 5 NAVAL RESEARCH LAB E R FRANCHI CODE 7100 M H ORR CODE 7120 J A BUCARO CODE 7130

  17. Program Processes Thermocouple Readings

    NASA Technical Reports Server (NTRS)

    Quave, Christine A.; Nail, William, III

    1995-01-01

    Digital Signal Processor for Thermocouples (DART) computer program implements a precise and fast method of converting voltage to temperature for large-temperature-range thermocouple applications. Written using LabVIEW software. DART is available only as object code for use on Macintosh II FX or higher-series computers running System 7.0 or later and IBM PC-series and compatible computers running Microsoft Windows 3.1. The Macintosh version of DART (SSC-00032) requires LabVIEW 2.2.1 or 3.0 for execution. The IBM PC version (SSC-00031) requires LabVIEW 3.0 for Windows 3.1. LabVIEW is a software product of National Instruments and is not included with the program.
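
    The voltage-to-temperature step a program like DART performs can be sketched as interpolation over a calibration table. The table below is made up purely for illustration, not real thermocouple data; real conversions use standard reference functions such as the NIST ITS-90 polynomials:

```python
import numpy as np

# Sketch of thermocouple voltage-to-temperature conversion by table
# interpolation. The calibration points below are HYPOTHETICAL values
# for illustration only; real conversion uses NIST ITS-90 reference
# functions for the thermocouple type in question.

VOLTS_MV = np.array([0.0, 1.0, 2.0, 4.0, 8.0])       # hypothetical
TEMPS_C  = np.array([0.0, 25.0, 49.0, 98.0, 197.0])  # hypothetical

def voltage_to_temp(mv: float) -> float:
    """Piecewise-linear interpolation of the calibration table."""
    return float(np.interp(mv, VOLTS_MV, TEMPS_C))
```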

  18. Advanced Stirling Convertor Control Unit Testing at NASA Glenn Research Center in the Radioisotope Power Systems System Integration Laboratory

    NASA Technical Reports Server (NTRS)

    Dugala, Gina M.; Taylor, Linda M.; Kussmaul, Michael; Casciani, Michael; Brown, Gregory; Wiser, Joel

    2017-01-01

    Future NASA missions could include establishing Lunar or Martian base camps, exploring Jupiter's moons, and traveling beyond, where generating power from sunlight may be limited. Radioisotope Power Systems (RPS) provide a dependable power source for missions where inadequate sunlight or operational requirements make other power systems impractical. Over the past decade, NASA Glenn Research Center (GRC) has been supporting the development of RPSs. The Advanced Stirling Radioisotope Generator (ASRG) utilized a pair of Advanced Stirling Convertors (ASCs). While flight development of the ASRG has been cancelled, much of the technology and hardware continued development and testing to guide future activities. Specifically, a controller for the convertor(s) is an integral part of a Stirling-based RPS. For the ASRG design, the controller maintains stable operation of the convertors, regulates the alternating current produced by the linear alternator of each convertor, provides a specified direct current output voltage for the spacecraft, and synchronizes the piston motion of the two convertors to minimize vibration while maintaining operation with a stable piston amplitude and hot-end temperature. It not only provides power to the spacecraft but also must regulate convertor operation to avoid damage to internal components and maintain safe thermal conditions after fueling. Lockheed Martin Coherent Technologies has designed, developed and tested an Engineering Development Unit (EDU) Advanced Stirling Convertor Control Unit (ACU) to support this effort. GRC used the ACU EDU as part of its non-nuclear representation of an RPS, which also consists of a Dual Advanced Stirling Convertor Simulator (DASCS) and associated support equipment, to perform a test in the Radioisotope Power Systems System Integration Laboratory (RSIL). The RSIL was designed and built to evaluate hardware utilizing RPS technology.
The RSIL provides insight into the electrical interactions between as many as three radioisotope power generators, associated control strategies, and typical electric system loads. The first phase of testing included a DASCS, which was developed by the Johns Hopkins University Applied Physics Laboratory and simulates the operation and electrical behavior of a pair of ASCs in real time via a combination of hardware and software. Testing included the following spacecraft electrical energy storage configurations: capacitive, battery, and supercapacitor. Testing of the DASCS and ACU in each energy storage configuration included simulation of a typical mission profile, and transient voltage and current data during load turn-on/turn-off. Testing for these devices also included the initiation of several system faults such as short circuits, electrical bus over-voltage, under-voltage, and a dead bus recovery to restore normal power operations. The goal of this testing was to verify operation of the ACU(s) when connected to a spacecraft electrical bus.

  19. Incorporation of a Variable Discharge Coefficient for the Primary Orifice into the Benet Labs Recoil Analysis Model via Results from Quasi-Steady State Simulations Using Computational Fluid Dynamics

    DTIC Science & Technology

    2008-03-01

    Appendix 82 MatLab© Cd Calculator Routine FORTRAN© Subroutine of the Variable Cd Model ii ABBREVIATIONS & ACRONYMS Cd...Figure 29. Overview Flowchart of Benét Labs Recoil Analysis Code Figure 30. Overview Flowchart of Recoil Brake Subroutine Figure 31...Detail Flowchart of Recoil Pressure/Force Calculations Figure 32. Detail Flowchart of Variable Cd Subroutine Figure 33. Simulated Brake

  20. RatLab: an easy to use tool for place code simulations

    PubMed Central

    Schönfeld, Fabian; Wiskott, Laurenz

    2013-01-01

    In this paper we present the RatLab toolkit, a software framework designed to set up and simulate a wide range of studies targeting the encoding of space in rats. It provides open access to our modeling approach to establish place and head direction cells within unknown environments and it offers a set of parameters to allow for the easy construction of a variety of enclosures for a virtual rat as well as controlling its movement pattern over the course of experiments. Once a spatial code is formed RatLab can be used to modify aspects of the enclosure or movement pattern and plot the effect of such modifications on the spatial representation, i.e., place and head direction cell activity. The simulation is based on a hierarchical Slow Feature Analysis (SFA) network that has been shown before to establish a spatial encoding of new environments using visual input data only. RatLab encapsulates such a network, generates the visual training data, and performs all sampling automatically—with each of these stages being further configurable by the user. RatLab was written with the intention to make our SFA model more accessible to the community and to that end features a range of elements to allow for experimentation with the model without the need for specific programming skills. PMID:23908627
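
    The slowness principle at the core of RatLab's network can be illustrated with plain linear Slow Feature Analysis: whiten the input, then take the directions whose temporal derivatives have the least variance. This one-layer linear sketch is generic, not RatLab's hierarchical (and nonlinear) implementation:

```python
import numpy as np

# Minimal linear Slow Feature Analysis: whiten the signal, then find
# the directions in whitened space whose temporal derivative has the
# least variance. Generic one-layer sketch; RatLab's actual network is
# hierarchical and uses expanded (nonlinear) feature spaces.

def linear_sfa(x):
    """x: (T, n) multivariate signal. Returns features, slowest first."""
    x = x - x.mean(axis=0)
    # Whiten: project onto eigenvectors scaled to unit variance.
    cov = np.cov(x, rowvar=False)
    evals, evecs = np.linalg.eigh(cov)
    z = x @ (evecs / np.sqrt(evals))
    # Minimize the variance of the time derivative in whitened space.
    dz = np.diff(z, axis=0)
    devals, devecs = np.linalg.eigh(np.cov(dz, rowvar=False))
    return z @ devecs  # eigh sorts ascending: column 0 is slowest
```

Applied to a linear mixture of a slow and a fast sinusoid, the first output column varies most slowly.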

  1. Numerical simulation of nonlinear feedback model of saccade generation circuit implemented in the LabView graphical programming language.

    PubMed

    Jackson, M E; Gnadt, J W

    1999-03-01

    The object-oriented graphical programming language LabView was used to implement the numerical solution to a computational model of saccade generation in primates. The computational model simulates the activity and connectivity of anatomical structures known to be involved in saccadic eye movements. The LabView program provides a graphical user interface to the model that makes it easy to observe and modify the behavior of each element of the model. Essential elements of the source code of the LabView program are presented and explained. A copy of the model is available for download from the internet.
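
    The feedback idea in this family of saccade models, a burst signal driven by the remaining motor error until the error reaches zero, can be sketched with Euler integration. The gain and time step below are illustrative, not the published model's fitted values:

```python
# Euler-integrated sketch of a local-feedback saccade generator: a
# burst cell drives the eye plant, and internal feedback shuts the
# burst off as the motor error approaches zero. The gain and time step
# are illustrative, not the published model's parameters.

def simulate_saccade(target_deg, gain=60.0, dt=0.001, t_max=0.2):
    """Return the eye-position trace (deg) for one saccade to target_deg."""
    eye, trace = 0.0, []
    for _ in range(int(t_max / dt)):
        motor_error = target_deg - eye   # internal feedback signal
        burst = gain * motor_error       # burst-cell firing rate
        eye += burst * dt                # plant integrates the burst
        trace.append(eye)
    return trace
```

The trace rises toward the target and the burst self-terminates as the motor error vanishes.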

  2. LPT. Shield test facility (TAN645 and 646). Calibration lab shield ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    LPT. Shield test facility (TAN-645 and -646). Calibration lab shield door. Ralph M. Parsons 1229-17 ANP/GE-6-645-MS-1. April 1957. Approved by INEEL Classification Office for public release. INEEL index code no. 037-0645-40-693-107369 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  3. The NASA Neutron Star Grand Challenge: The coalescences of Neutron Star Binary System

    NASA Astrophysics Data System (ADS)

    Suen, Wai-Mo

    1998-04-01

    NASA funded a Grand Challenge Project (9/1996-1999) for the development of a multi-purpose numerical treatment for relativistic astrophysics and gravitational wave astronomy. The coalescence of binary neutron stars was chosen as the model problem for the code development. The institutions involved are Argonne Lab, Livermore Lab, the Max Planck Institute at Potsdam, Stony Brook, the U of Illinois, and Washington U. We have recently succeeded in constructing a highly optimized parallel code which is capable of solving the full Einstein equations coupled with relativistic hydrodynamics, running at over 50 GFLOPS on a T3E (the second milestone point of the project). We are presently working on the head-on collisions of two neutron stars and the inclusion of realistic equations of state into the code. The code will be released to the relativity and astrophysics community in April of 1998. With the full dynamics of the spacetime, relativistic hydro, and microphysics all combined into a unified 3D code for the first time, many interesting large-scale calculations in general relativistic astrophysics can now be carried out on massively parallel computers.

  4. Code Verification Results of an LLNL ASC Code on Some Tri-Lab Verification Test Suite Problems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, S R; Bihari, B L; Salari, K

    As scientific codes become more complex and involve larger numbers of developers and algorithms, the chances of algorithmic implementation mistakes increase. In this environment, code verification becomes essential to building confidence in the code implementation. This paper will present first results of a new code verification effort within LLNL's B Division. In particular, we will show results of code verification of the LLNL ASC ARES code on the following test problems: Su-Olson non-equilibrium radiation diffusion, the Sod shock tube, the Sedov point blast modeled with shock hydrodynamics, and the Noh implosion.

  5. Microfluidic CODES: a scalable multiplexed electronic sensor for orthogonal detection of particles in microfluidic channels.

    PubMed

    Liu, Ruxiu; Wang, Ningquan; Kamili, Farhan; Sarioglu, A Fatih

    2016-04-21

    Numerous biophysical and biochemical assays rely on spatial manipulation of particles/cells as they are processed on lab-on-a-chip devices. Analysis of spatially distributed particles on these devices typically requires microscopy, negating the cost and size advantages of microfluidic assays. In this paper, we introduce a scalable electronic sensor technology, called microfluidic CODES, that utilizes resistive pulse sensing to orthogonally detect particles in multiple microfluidic channels from a single electrical output. Combining techniques from telecommunications and microfluidics, we route three coplanar electrodes on a glass substrate to create multiple Coulter counters producing distinct orthogonal digital codes when they detect particles. We specifically design a digital code set using the mathematical principles of Code Division Multiple Access (CDMA) telecommunication networks and can decode signals from different microfluidic channels with >90% accuracy through computation even if these signals overlap. As a proof of principle, we use this technology to detect human ovarian cancer cells in four different microfluidic channels fabricated using soft lithography. Microfluidic CODES offers a simple, all-electronic interface that is well suited to create integrated, low-cost lab-on-a-chip devices for cell- or particle-based assays in resource-limited settings.
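    The CDMA-style encoding and decoding idea can be illustrated with mutually orthogonal Walsh-Hadamard codes. This Python sketch is illustrative only; the code length, channel count, and decoding procedure in the paper differ:

    ```python
    def hadamard(n):
        """Walsh-Hadamard matrix of order n (n a power of 2); its rows
        are mutually orthogonal +/-1 code sequences."""
        H = [[1]]
        while len(H) < n:
            H = [row + row for row in H] + [row + [-x for x in row] for row in H]
        return H

    codes = hadamard(4)[1:]  # skip the all-ones row; three channel codes

    # Two particles detected simultaneously in channels 0 and 2:
    # their code waveforms simply add on the shared electrical output.
    overlapped = [codes[0][i] + codes[2][i] for i in range(4)]

    def correlate(signal, code):
        """Matched-filter decoding: orthogonality means each correlator
        responds only to its own channel's code."""
        return sum(s * c for s, c in zip(signal, code))

    scores = [correlate(overlapped, c) for c in codes]
    ```

    Here the correlators for channels 0 and 2 respond strongly while channel 1 stays at zero, which is how overlapping detection events can be disentangled from a single output.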

  6. Simulation and visualization of fundamental optics phenomenon by LabVIEW

    NASA Astrophysics Data System (ADS)

    Lyu, Bohan

    2017-08-01

    Most instructors teach complex phenomena using equations and static illustrations, without interactive multimedia. Students usually memorize the phenomena by taking notes. However, notes and complex formulas alone cannot help users visualize the behavior of a photonic system. LabVIEW is a good tool for automated measurement. The simplicity of coding in LabVIEW, however, makes it suitable not only for automated measurement but also for simulation and visualization of fundamental optics phenomena. In this paper, five simple optics phenomena are discussed and simulated with LabVIEW: Snell's law, Hermite-Gaussian beam transverse modes, square and circular aperture diffraction, polarized waves and the Poincaré sphere, and finally the Fabry-Pérot etalon in the spectral domain.
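    The first of these phenomena, Snell's law, reduces to a few lines in any language. A Python sketch, with refractive indices chosen for an air-glass interface purely as an example:

    ```python
    import math

    def refract(theta_i_deg, n1=1.0, n2=1.5):
        """Snell's law: n1*sin(theta_i) = n2*sin(theta_t).
        Returns the transmitted angle in degrees, or None when the
        incidence angle exceeds the critical angle (total internal
        reflection)."""
        s = n1 / n2 * math.sin(math.radians(theta_i_deg))
        if abs(s) > 1.0:
            return None  # total internal reflection
        return math.degrees(math.asin(s))
    ```

    For 30° incidence from air into glass this gives a transmitted angle of about 19.5°; reversing the media at 45° incidence returns None, visualizing the total-internal-reflection regime.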

  7. Differential Cross Section Kinematics for 3-dimensional Transport Codes

    NASA Technical Reports Server (NTRS)

    Norbury, John W.; Dick, Frank

    2008-01-01

    In support of the development of 3-dimensional transport codes, this paper derives the relevant relativistic particle kinematic theory. Formulas are given for invariant, spectral, and angular distributions in both the lab (spacecraft) and center-of-momentum frames, for collisions involving 2-, 3-, and n-body final states.
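    The central invariant connecting the lab and center-of-momentum frames can be sketched numerically. The following Python fragment, with beam momentum and particle mass chosen purely for illustration, computes the invariant mass and the boost of the CM frame relative to the lab:

    ```python
    import math

    def invariant_mass(e1, p1, e2, p2):
        """sqrt(s) for a two-body collision with momenta along the beam
        axis, in natural units (GeV, c = 1).  s = (E1+E2)^2 - (p1+p2)^2
        takes the same value in the lab and center-of-momentum frames."""
        return math.sqrt((e1 + e2)**2 - (p1 + p2)**2)

    def cm_boost(e_total, p_total):
        """Velocity (beta) and Lorentz factor (gamma) of the CM frame
        as seen from the lab frame."""
        beta = p_total / e_total
        return beta, 1.0 / math.sqrt(1.0 - beta**2)

    # Example: proton (m = 0.938 GeV) with 10 GeV/c momentum hitting a
    # proton at rest (fixed-target, i.e. lab-frame, configuration).
    m = 0.938
    e_beam = math.sqrt(10.0**2 + m**2)
    s_root = invariant_mass(e_beam, 10.0, m, 0.0)
    ```

    Transforming a distribution between frames then amounts to applying the boost returned by `cm_boost` to each particle's energy-momentum four-vector.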

  8. Recent Trends in Business Casual Attire and Their Effects on Student Job Seekers

    ERIC Educational Resources Information Center

    Kiddie, Thomas

    2009-01-01

    When the author introduces the unit on job hunting in his business communication course, he begins by relating his experiences searching for his first "real" job. He points out that the deciding factor for him in accepting a position at Bell Labs, instead of IBM, was Bell Labs' casual dress code. When he decided to retire from the former Bell…

  9. Anisotropic Effective Moduli of Microcrack Damaged Media

    DTIC Science & Technology

    2010-01-01

    Applying L'Hospital's rule to Eq. (18) in the limit h2 → h1 yields a closed-form limiting expression for the effective stiffness coefficients C44 and C55.

  10. Comparative genomic and plasmid analysis of beer-spoiling and non-beer-spoiling Lactobacillus brevis isolates.

    PubMed

    Bergsveinson, Jordyn; Ziola, Barry

    2017-12-01

    Beer-spoilage-related lactic acid bacteria (BSR LAB) belong to multiple genera and species; however, beer-spoilage capacity is isolate-specific and partially acquired via horizontal gene transfer within the brewing environment. Thus, the extent to which genus-, species-, or environment- (i.e., brewery-) level genetic variability influences beer-spoilage phenotype is unknown. Publicly available Lactobacillus brevis genomes were analyzed via BlAst Diagnostic Gene findEr (BADGE) for BSR genes and assessed for pangenomic relationships. Also analyzed were functional coding capacities of plasmids of LAB inhabiting extreme niche environments. Considerable genetic variation was observed in L. brevis isolated from clinical samples, whereas 16 candidate genes distinguish BSR and non-BSR L. brevis genomes. These genes are related to nutrient scavenging of gluconate or pentoses, mannose, and metabolism of pectin. BSR L. brevis isolates also have higher average nucleotide identity and stronger pangenome association with one another, though isolation source (i.e., specific brewery) also appears to influence the plasmid coding capacity of BSR LAB. Finally, it is shown that niche-specific adaptation and phenotype are plasmid-encoded for both BSR and non-BSR LAB. The ultimate combination of plasmid-encoded genes dictates the ability of L. brevis to survive in the most extreme beer environment, namely, gassed (i.e., pressurized) beer.

  11. Extended Plate and Beam Wall System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gunderson, Patti

    Home Innovation Research Labs studied the extended plate and beam wall (EP&B) system during a two-year period from mid-2015 to mid-2017 to determine the wall's structural performance, moisture durability, constructability, and cost-effectiveness for use as a high-R enclosure system for energy code minimum and above-code performance in climate zones 4-8.

  12. Migration of Hazardous Substances through Soil. Part 4. Development of a Serial Batch Extraction Method and Application to the Accelerated Testing of Seven Industrial Wastes

    DTIC Science & Technology

    1987-09-01

    Report documentation fields only: Hazardous Waste Environmental Research Lab; Dugway Proving Ground, Utah 84022-5000; Aberdeen Proving Ground; distribution unclassified/unlimited.

  13. Insensitive Munitions Modeling Improvement Efforts

    DTIC Science & Technology

    2010-10-01

    The codes most commonly used by munition designers are CTH and the SIERRA suite of codes produced by Sandia National Labs (SNL), and ALE3D produced by Lawrence Livermore National Laboratory (LLNL). ALE3D, an LLNL-developed code, is also used by various DoD participants. It was, however, designed differently than either CTH or Sierra.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    TESP combines existing domain simulators in the electric power grid with new transactive agents, growth models, and evaluation scripts. The existing domain simulators include GridLAB-D for the distribution grid and single-family residential buildings, MATPOWER for transmission and bulk generation, and EnergyPlus for large buildings. More are planned for subsequent versions of TESP. The new elements are: TEAgents - simulate market participants and transactive systems for market clearing; some of this functionality was extracted from GridLAB-D and implemented in Python for customization by PNNL and others. Growth Model - a means for simulating system changes over a multiyear period, including both normal load growth and specific investment decisions; customizable in Python code. Evaluation Script - a means of evaluating different transactive systems through customizable post-processing in Python code. TESP provides a method for other researchers and vendors to design transactive systems and test them in a virtual environment. It allows customization of the key components by modifying Python code.

  15. Probability of illness definition for the Skylab flight crew health stabilization program

    NASA Technical Reports Server (NTRS)

    1974-01-01

    Management and analysis of crew and environmental microbiological data from SMEAT and Skylab are discussed. Samples were collected from ten different body sites on each SMEAT and Skylab crew member on approximately 50 occasions, and since several different organisms could be isolated from each sample, several thousand lab reports were generated. These lab reports were coded and entered into a computer file, and from the file various tabular summaries were constructed.

  16. Phase Field Fracture Mechanics.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robertson, Brett Anthony

    For this assignment, a newer fracture mechanics technique using a phase-field approach will be examined and compared with experimental data for a bend test and a tension test. The software being used is Sierra Solid Mechanics, an implicit/explicit finite element code developed at Sandia National Labs in Albuquerque, New Mexico. The bend test experimental data were also obtained at Sandia Labs, while the tension test data were found in a report online from Purdue University.
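    A widely used variational formulation behind phase-field fracture is the regularization of Griffith's energy; the report's exact functional and degradation function are not given here, so the following is a representative sketch:

    ```latex
    E(u,d) = \int_\Omega g(d)\,\psi_e\bigl(\varepsilon(u)\bigr)\,\mathrm{d}V
           + G_c \int_\Omega \left( \frac{d^2}{2\ell}
           + \frac{\ell}{2}\,\lvert\nabla d\rvert^2 \right) \mathrm{d}V,
    \qquad g(d) = (1-d)^2,
    ```

    where d ∈ [0, 1] is the damage (phase) field, ℓ a regularization length, ψ_e the elastic strain energy density, and G_c the critical energy release rate; crack growth is obtained by minimizing E over the displacement u and the field d, so no explicit crack-surface tracking is needed.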

  17. Open source posturography.

    PubMed

    Rey-Martinez, Jorge; Pérez-Fernández, Nicolás

    2016-12-01

    The goal was to develop and validate posturography software and to share its source code in open-source terms. In a prospective non-randomized validation study, 20 consecutive adults underwent two balance assessment tests: six-condition posturography was performed using clinically approved software and force platform, and the same conditions were measured using the newly developed open-source software (RombergLab) with a low-cost force platform. The intra-class correlation of the sway area, obtained from the center-of-pressure variations in both devices for the six conditions, was the main variable used for validation. Excellent concordance between RombergLab and the clinically approved force platform was obtained (intra-class correlation coefficient = 0.94), and a Bland and Altman graphic concordance plot was also obtained. The proposed validation goal of 0.9 in intra-class correlation coefficient was therefore reached, and we consider the developed software a validated balance assessment tool, noting that its reliability depends on the technical specifications of the force platform used. The source code used to develop RombergLab was published in open-source terms.
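    One common way to compute a sway area from center-of-pressure samples is the area of their convex hull. A self-contained Python sketch follows; RombergLab's exact sway-area definition may differ:

    ```python
    def sway_area(cop_points):
        """Area of the convex hull enclosing center-of-pressure samples,
        one common 'sway area' posturographic summary.
        cop_points: list of (x, y) tuples, e.g. in cm."""
        pts = sorted(set(cop_points))
        if len(pts) < 3:
            return 0.0

        def cross(o, a, b):
            return (a[0]-o[0])*(b[1]-o[1]) - (a[1]-o[1])*(b[0]-o[0])

        # Andrew's monotone-chain convex hull
        lower, upper = [], []
        for p in pts:
            while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
                lower.pop()
            lower.append(p)
        for p in reversed(pts):
            while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
                upper.pop()
            upper.append(p)
        hull = lower[:-1] + upper[:-1]

        # shoelace formula on the hull vertices
        area = 0.0
        for i in range(len(hull)):
            x1, y1 = hull[i]
            x2, y2 = hull[(i + 1) % len(hull)]
            area += x1 * y2 - x2 * y1
        return abs(area) / 2.0
    ```

    Feeding both devices' center-of-pressure traces through the same summary statistic is what makes the per-condition intra-class correlation comparison meaningful.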

  18. Developing Avionics Hardware and Software for Rocket Engine Testing

    NASA Technical Reports Server (NTRS)

    Aberg, Bryce Robert

    2014-01-01

    My summer was spent working as an intern at Kennedy Space Center in the Propulsion Avionics Branch of the NASA Engineering Directorate Avionics Division. The work that I was involved with was part of Rocket University's Project Neo, a small scale liquid rocket engine test bed. I began by learning about the layout of Neo in order to more fully understand what was required of me. I then developed software in LabView to gather and scale data from two flowmeters and integrated that code into the main control software. Next, I developed more LabView code to control an igniter circuit and integrated that into the main software, as well. Throughout the internship, I performed work that mechanics and technicians would do in order to maintain and assemble the engine.

  19. Fully-Implicit Navier-Stokes (FIN-S)

    NASA Technical Reports Server (NTRS)

    Kirk, Benjamin S.

    2010-01-01

    FIN-S is a SUPG finite element code for flow problems under active development at NASA Lyndon B. Johnson Space Center and within PECOS: a) The code is built on top of the libMesh parallel, adaptive finite element library. b) The initial implementation of the code targeted supersonic/hypersonic laminar calorically perfect gas flows & conjugate heat transfer. c) Initial extension to thermochemical nonequilibrium about 9 months ago. d) The technologies in FIN-S have been enhanced through a strongly collaborative research effort with Sandia National Labs.

  20. Attracting Assault: Victims' Nonverbal Cues.

    ERIC Educational Resources Information Center

    Grayson, Betty; Stein, Morris I.

    1981-01-01

    Describes a study in which prison inmates convicted of assault identified potential victims from videotapes. A lab analysis code was used to determine which nonverbal body movement categories differentiated victims and nonvictims. (JMF)

  1. A Tutorial on Interfacing the Object Management Group (OMG) Data Distribution Service (DDS) with LabView

    NASA Technical Reports Server (NTRS)

    Smith, Kevin

    2011-01-01

    This tutorial will explain the concepts and steps for interfacing a National Instruments LabView virtual instrument (VI) running on a Windows platform with another computer via the Object Management Group (OMG) Data Distribution Service (DDS) as implemented by the Twin Oaks Computing CoreDX. This paper is for educational purposes only; therefore, the referenced source code will be simplistic and devoid of all error checking. Implementation will be accomplished using the C programming language.

  2. Comparison of ATLOG and Xyce for Bell Labs Electromagnetic Pulse Excitation of Finite-Long Dissipative Conductors over a Ground Plane.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Campione, Salvatore; Warne, Larry K.; Schiek, Richard

    This report details the modeling results for the response of a finite-length dissipative conductor interacting with a conducting ground to the Bell Labs electromagnetic pulse excitation. We use both a frequency-domain and a time-domain method based on transmission line theory through a code we call ATLOG - Analytic Transmission Line Over Ground. Results are compared to the circuit simulator Xyce for selected cases.

  3. Search Codes for a Bibliography for the Study of African International Relations

    DTIC Science & Technology

    1975-01-01

    Soja (1970). A high-speed in-house reference system was needed to facilitate storage, search, and retrieval of bibliographic reference material. Sample search codes include: KOR KOREA, DEMOCRATIC PEOPLE'S REPUBLIC OF (NORTH); KOS KOREA (SOUTH); KUW KUWAIT; LAB LABOR UNIONS, INTERNATIONAL ASPECTS; LAN LAND LOCKED NATIONS; LAO LAOS; LAT LATIN AMERICA; LAW LAW, INTERNATIONAL.

  4. Labview Interface Concepts Used in NASA Scientific Investigations and Virtual Instruments

    NASA Technical Reports Server (NTRS)

    Roth, Don J.; Parker, Bradford H.; Rapchun, David A.; Jones, Hollis H.; Cao, Wei

    2001-01-01

    This article provides an overview of several software control applications developed for NASA using LabVIEW. The applications covered here include (1) an Ultrasonic Measurement System for nondestructive evaluation of advanced structural materials, (2) an X-ray Spectral Mapping System for characterizing the quality and uniformity of developing photon detector materials, (3) a Life Testing System for these same materials, and (4) the instrument panel for an aircraft-mounted Cloud Absorption Radiometer that measures the light scattered by clouds in multiple spectral bands. Many of the software interface concepts employed are explained. Panel layout and block diagram (code) strategies for each application are described. In particular, some of the more unique features of the applications' interfaces and source code are highlighted. This article assumes that the reader has a beginner-to-intermediate understanding of LabVIEW methods.

  5. Affordable Imaging Lab for Noninvasive Analysis of Biomass and Early Vigour in Cereal Crops

    PubMed Central

    2018-01-01

    Plant phenotyping by imaging allows automated analysis of plants for various morphological and physiological traits. In this work, we developed a low-cost RGB imaging phenotyping lab (LCP lab) for low-throughput imaging and analysis using affordable imaging equipment and freely available software. The LCP lab, comprising an RGB imaging and analysis pipeline, is set up and demonstrated with early vigour analysis in wheat. Using this lab, a few hundred pots can be photographed in a day, and the pots are tracked with QR codes. The software pipeline for both imaging and analysis is built from freely available software. The LCP lab was evaluated for early vigour analysis of five wheat cultivars. A high coefficient of determination (R² = 0.94) was obtained between the dry weight and the projected leaf area of 20-day-old wheat plants, and an R² of 0.90 for the relative growth rate between 10 and 20 days of plant growth. A detailed description for setting up such a lab is provided together with custom scripts built for imaging and analysis. The LCP lab is an affordable alternative for analysis of cereal crops when access to a high-throughput phenotyping facility is unavailable or when experiments require growing plants in highly controlled climate chambers. The protocols described in this work are useful for building an affordable imaging system for small-scale research projects and for education. PMID:29850536
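    Projected leaf area from a top-down RGB image is commonly estimated by thresholding a greenness index. The following NumPy sketch uses the excess-green index with an illustrative threshold and pixel calibration, not the paper's values or scripts:

    ```python
    import numpy as np

    def projected_leaf_area(rgb, mm2_per_pixel, excess_green_thresh=20):
        """Projected plant area from a top-down RGB image using the
        excess-green index ExG = 2G - R - B.  The threshold and the
        pixel-to-area calibration are illustrative assumptions.
        rgb: HxWx3 uint8 array."""
        img = rgb.astype(np.int32)                  # avoid uint8 overflow
        exg = 2 * img[..., 1] - img[..., 0] - img[..., 2]
        mask = exg > excess_green_thresh            # plant pixels
        return mask.sum() * mm2_per_pixel, mask

    # synthetic 4x4 image: a green patch of 6 pixels on a gray background
    img = np.full((4, 4, 3), 120, dtype=np.uint8)
    img[0:2, 0:3] = (40, 180, 40)                   # plant-like green
    area_mm2, mask = projected_leaf_area(img, mm2_per_pixel=0.25)
    ```

    Summing the masked pixels and multiplying by a per-pixel area from a calibration target is what links image counts to the biological trait (projected area) regressed against dry weight above.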

  6. LabVIEW: a software system for data acquisition, data analysis, and instrument control.

    PubMed

    Kalkman, C J

    1995-01-01

    Computer-based data acquisition systems play an important role in clinical monitoring and in the development of new monitoring tools. LabVIEW (National Instruments, Austin, TX) is a data acquisition and programming environment that allows flexible acquisition and processing of analog and digital data. The main feature that distinguishes LabVIEW from other data acquisition programs is its highly modular graphical programming language, "G," and a large library of mathematical and statistical functions. The advantage of graphical programming is that the code is flexible, reusable, and self-documenting. Subroutines can be saved in a library and reused without modification in other programs. This dramatically reduces development time and enables researchers to develop or modify their own programs. LabVIEW uses a large amount of processing power and computer memory, thus requiring a powerful computer. A large-screen monitor is desirable when developing larger applications. LabVIEW is excellently suited for testing new monitoring paradigms, analysis algorithms, or user interfaces. The typical LabVIEW user is the researcher who wants to develop a new monitoring technique, a set of new (derived) variables by integrating signals from several existing patient monitors, closed-loop control of a physiological variable, or a physiological simulator.

  7. StagLab: Post-Processing and Visualisation in Geodynamics

    NASA Astrophysics Data System (ADS)

    Crameri, Fabio

    2017-04-01

    Despite being simplifications of nature, today's geodynamic numerical models can, often do, and sometimes have to become very complex. Additionally, a steadily increasing amount of raw model data results from more elaborate numerical codes and the still continuously increasing computational power available for their execution. The current need for efficient post-processing and sensible visualisation is thus apparent. StagLab (www.fabiocrameri.ch/software) provides such much-needed, strongly automated post-processing in combination with state-of-the-art visualisation. Written in MatLab, StagLab is simple, flexible, efficient and reliable. It produces figures and movies that are both fully reproducible and publication-ready. StagLab's post-processing capabilities include numerous diagnostics for plate tectonics and mantle dynamics. Featured are accurate plate-boundary identification, slab-polarity recognition, plate-bending derivation, mantle-plume detection, and surface-topography component splitting. These and many other diagnostics are derived conveniently from only a few parameter fields thanks to powerful image-processing tools and other capable algorithms. Additionally, StagLab aims to prevent scientific-visualisation pitfalls that are, unfortunately, still too common in the Geodynamics community. The misinterpretation of raw data and the exclusion of colour-blind readers caused by the continued use of the rainbow (a.k.a. jet) colour scheme is just one, but a dramatic, example (e.g., Rogowitz and Treinish, 1998; Light and Bartlein, 2004; Borland and Taylor, 2007). StagLab is currently optimised for binary StagYY output (e.g., Tackley, 2008), but is adjustable for potential use with other geodynamic codes. Additionally, StagLab's post-processing routines are open-source. REFERENCES Borland, D., and R. M. Taylor II (2007), Rainbow color map (still) considered harmful, IEEE Computer Graphics and Applications, 27(2), 14-17. Light, A., and P. J. Bartlein (2004), The end of the rainbow? Color schemes for improved data graphics, Eos Trans. AGU, 85(40), 385-391. Rogowitz, B. E., and L. A. Treinish (1998), Data visualization: the end of the rainbow, IEEE Spectrum, 35(12), 52-59, doi:10.1109/6.736450. Tackley, P. J. (2008), Modelling compressible mantle convection with large viscosity contrasts in a three-dimensional spherical shell using the yin-yang grid, Physics of the Earth and Planetary Interiors, 171(1-4), 7-18.

  8. RivGen, Igiugig Deployment, Control System Specifications and Models

    DOE Data Explorer

    Forbush, Dominic; Cavagnaro, Robert J.; Guerra, Maricarmen; Donegan, James; McEntee, Jarlath; Thomson, Jim; Polagye, Brian; Fabien, Brian; Kilcher, Levi

    2016-03-21

    Control system simulation models, case studies, and processing codes for analyzing field data. Raw data files included from VFD and SCADA. MATLAB and Simulink are required to open some data files and all model files.

  9. An RNA Phage Lab: MS2 in Walter Fiers' laboratory of molecular biology in Ghent, from genetic code to gene and genome, 1963-1976.

    PubMed

    Pierrel, Jérôme

    2012-01-01

    The importance of viruses as model organisms is well-established in molecular biology and Max Delbrück's phage group set standards in the DNA phage field. In this paper, I argue that RNA phages, discovered in the 1960s, were also instrumental in the making of molecular biology. As part of experimental systems, RNA phages stood for messenger RNA (mRNA), genes and genome. RNA was thought to mediate information transfers between DNA and proteins. Furthermore, RNA was more manageable at the bench than DNA due to the availability of specific RNases, enzymes used as chemical tools to analyse RNA. Finally, RNA phages provided scientists with a pure source of mRNA to investigate the genetic code, genes and even a genome sequence. This paper focuses on Walter Fiers' laboratory at Ghent University (Belgium) and their work on the RNA phage MS2. When setting up his Laboratory of Molecular Biology, Fiers planned a comprehensive study of the virus with a strong emphasis on the issue of structure. In his lab, RNA sequencing, now a little-known technique, evolved gradually from a means to solve the genetic code, to a tool for completing the first genome sequence. Thus, I follow the research pathway of Fiers and his 'RNA phage lab' with their evolving experimental system from 1960 to the late 1970s. This study illuminates two decisive shifts in post-war biology: the emergence of molecular biology as a discipline in the 1960s in Europe and of genomics in the 1990s.

  10. A new chapter in environmental sensing: The Open-Source Published Environmental Sensing (OPENS) laboratory

    NASA Astrophysics Data System (ADS)

    Selker, J. S.; Roques, C.; Higgins, C. W.; Good, S. P.; Hut, R.; Selker, A.

    2015-12-01

    The confluence of 3-dimensional printing; low-cost solid-state sensors; low-cost, low-power digital controllers (e.g., Arduinos); and open-source publishing (e.g., GitHub) is poised to transform environmental sensing. The Open-Source Published Environmental Sensing (OPENS) laboratory has launched and is available for all to use. OPENS combines cutting-edge technologies and makes them available to the global environmental sensing community. OPENS includes a Maker lab space where people may collaborate in person or virtually via an on-line forum for the publication and discussion of environmental sensing technology (Corvallis, Oregon, USA; please feel free to request a free reservation for space and equipment use). The physical lab houses a test-bed for sensors, as well as a complete classical machine shop, 3-D printers, electronics development benches, and workstations for code development. OPENS will provide a web-based formal publishing framework wherein global students and scientists can peer-review publish (with DOI) novel and evolutionary advancements in environmental sensor systems. This curated and peer-reviewed digital collection will include complete sets of "printable" parts and operating computer code for sensing systems. The physical lab will include all of the machines required to produce these sensing systems. These tools can be addressed in person or virtually, creating a truly global venue for advancement in monitoring earth's environment and agricultural systems. In this talk we will present, as an example of the design and publication process, the design and data from the OPENS-Permeameter. The publication includes 3-D printing code, Arduino (or other control/logging platform) operational code, sample data sets, and a full discussion of the design set in the scientific context of previous related devices. Editors for the peer-review process are currently sought - contact John.Selker@Oregonstate.edu or Clement.Roques@Oregonstate.edu.

  11. The NASA Langley Isolator Dynamics Research Lab

    NASA Technical Reports Server (NTRS)

    Middleton, Troy F.; Balla, Robert J.; Baurle, Robert A.; Humphreys, William M.; Wilson, Lloyd G.

    2010-01-01

    The Isolator Dynamics Research Lab (IDRL) is under construction at the NASA Langley Research Center in Hampton, Virginia. A unique test apparatus is being fabricated to support both wall and in-stream measurements for investigating the internal flow of a dual-mode scramjet isolator model. The test section is 24 inches long with a 1-inch by 2-inch cross sectional area and is supplied with unheated, dry air through a Mach 2.5 converging-diverging nozzle. The test section is being fabricated with two sets (glass and metallic) of interchangeable sidewalls to support flow visualization and laser-based measurement techniques as well as static pressure, wall temperature, and high frequency pressure measurements. During 2010, a CFD code validation experiment will be conducted in the lab in support of NASA's Fundamental Aerodynamics Program. This paper describes the mechanical design of the Isolator Dynamics Research Lab test apparatus and presents a summary of the measurement techniques planned for investigating the internal flow field of a scramjet isolator model.

  12. Space lab system analysis

    NASA Technical Reports Server (NTRS)

    Rives, T. B.; Ingels, F. M.

    1988-01-01

    An analysis of the Automated Booster Assembly Checkout System (ABACS) has been conducted. A computer simulation of the Ethernet LAN has been written. The simulation allows one to investigate different structures of the ABACS system. The simulation code is in Pascal and is VAX compatible.
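    A LAN simulation of this kind boils down to discrete-event bookkeeping. The following Python sketch models a single shared bus with Poisson frame arrivals and FIFO transmission; all rates and frame times are illustrative assumptions, not ABACS parameters:

    ```python
    import random

    def simulate_lan(n_frames=1000, arrival_rate=50.0, frame_time=0.012, seed=1):
        """Minimal discrete-event model of a shared bus: Poisson frame
        arrivals, one frame transmitted at a time, FIFO queueing.
        Returns the mean queueing delay in seconds.  The arrival rate
        (frames/s) and frame time (s) are illustrative only."""
        rng = random.Random(seed)
        t, arrivals = 0.0, []
        for _ in range(n_frames):
            t += rng.expovariate(arrival_rate)   # Poisson interarrival gaps
            arrivals.append(t)
        bus_free, total_delay = 0.0, 0.0
        for a in arrivals:
            start = max(a, bus_free)             # wait until the bus is idle
            total_delay += start - a             # queueing delay for this frame
            bus_free = start + frame_time        # bus busy while frame transmits
        return total_delay / n_frames

    mean_delay = simulate_lan()
    ```

    Sweeping the arrival rate shows the familiar sharp growth in delay as bus utilization (arrival_rate × frame_time) approaches 1, which is the kind of structural question such a simulation lets one investigate.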

  13. Computations for Truck Sliding with TRUCK 3.1 Code

    DTIC Science & Technology

    1989-08-01

    REFERENCES: 1. Lu, William N., Hobbs, Norman P., and Atkinson, Michael. TRUCK 3.1 - An Improved Digital Computer Program for Calculating the Response...

  14. Beyond the First Optical Depth: Fusing Optical Data From Ocean Color Imagery and Gliders

    DTIC Science & Technology

    2009-01-01

    ... extreme weather (e.g., hurricanes), becoming a safe and efficient alternative to shipboard surveys. Despite these benefits, data streams provided by ... (ECO-triplet puck, WetLabs). Unlike other glider types (e.g., Spray, Seaglider), the use of Slocums was especially advantageous in the WAP region to ...

  15. Performance and Architecture Lab Modeling Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2014-06-19

Analytical application performance models are critical for diagnosing performance-limiting resources, optimizing systems, and designing machines. Creating models, however, is difficult. Furthermore, models are frequently expressed in forms that are hard to distribute and validate. The Performance and Architecture Lab Modeling tool, or Palm, is a modeling tool designed to make application modeling easier. Palm provides a source code modeling annotation language. Not only does the modeling language divide the modeling task into subproblems, it formally links an application's source code with its model. This link is important because a model's purpose is to capture application behavior. Furthermore, this link makes it possible to define rules for generating models according to source code organization. Palm generates hierarchical models according to well-defined rules: given an application, a set of annotations, and a representative execution environment, Palm will generate the same model. A generated model is an executable program whose constituent parts directly correspond to the modeled application. Palm generates models by combining top-down (human-provided) semantic insight with bottom-up static and dynamic analysis. A model's hierarchy is defined by static and dynamic source code structure. Because Palm coordinates models and source code, Palm's models are 'first-class' and reproducible. Palm automates common modeling tasks. For instance, Palm incorporates measurements to focus attention, represent constant behavior, and validate models. Palm's workflow is as follows. The workflow's input is source code annotated with Palm modeling annotations. The most important annotation models an instance of a block of code. Given annotated source code, the Palm Compiler produces executables and the Palm Monitor collects a representative performance profile. The Palm Generator synthesizes a model based on the static and dynamic mapping of annotations to program behavior. The model, an executable program, is a hierarchical composition of annotation functions, synthesized functions, statistics for runtime values, and performance measurements.
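
    Palm's central claim, that a generated model is itself an executable program whose parts mirror annotated code regions, can be illustrated with a toy sketch. All names and cost constants below are invented for illustration; this is not Palm's actual output or API.

```python
# Hypothetical sketch: a hierarchical performance model as an executable
# composition of per-block cost functions, mirroring the idea that a
# generated model is a program whose parts map to annotated code regions.

def loop_body(n):
    # annotation-derived cost of one inner iteration (illustrative constant)
    return 2.5e-9 * n

def outer_loop(n, iters):
    # hierarchy: the outer model composes the inner one
    return iters * loop_body(n)

def app_model(n, iters, startup=1e-3):
    # top of the hierarchy: startup cost plus the composed loop model
    return startup + outer_loop(n, iters)

print(app_model(n=1_000_000, iters=10))
```

    Because the model is ordinary code, it can be distributed, executed, and validated against measured runtimes, which is the property the abstract emphasizes.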

  16. Science Lab Report Writing in Postsecondary Education: Mediating Teaching and Learning Strategies between Students and Instructors

    NASA Astrophysics Data System (ADS)

    Kalaskas, Anthony Bacaoat

    The lab report is a genre commonly assigned by lab instructors and written by science majors in undergraduate science programs. The teaching and learning of the lab report, however, is a complicated and complex process that both instructors and students regularly contend with. This thesis is a qualitative study that aims to mediate the mismatch between students and instructors by ascertaining their attitudes, beliefs, and values regarding lab report writing. In this way, this thesis may suggest changes to teaching and learning strategies that lead to an improvement of lab report writing done by students. Given that little research has been conducted in this area thus far, this thesis also serves as a pilot study. A literature review is first conducted on the history of the lab report to delineate its development since its inception into American postsecondary education in the late 19th century. Genre theory and Vygotsky's zone of proximal development (ZPD) serve as the theoretical lenses for this thesis. Surveys and interviews are conducted with biology majors and instructors in the Department of Biology at George Mason University. Univariate analysis and coding are applied to elucidate responses from participants. The findings suggest that students may lack the epistemological background to understand lab reports as a process of doing science. This thesis also finds that both instructors and students consider the lab report primarily as a pedagogical genre as opposed to an apprenticeship genre. Additionally, although instructors were found to have utilized an effective piecemeal teaching strategy, there remains a lack of empathy among instructors for students. Collectively, these findings suggest that instructors should modify teaching strategies to determine and address student weaknesses more directly.

  17. LabVIEW Task Manager v. 1.10.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vargo, Timothy D.

LabVIEW Task Manager is a debugging tool for use during code development in the National Instruments (NI) LabVIEW® IDE. While providing a dynamic & big-picture view of running code, an expandable/collapsible tree diagram displays detailed information (both static and dynamic) on all VIs in memory, belonging to a selected project/target. It allows for interacting with single or multiple selected VIs at a time, providing significant benefits while troubleshooting, and has the following features: Look & Feel similar to Windows® Task Manager; Selection of project/target; Lists all VIs in memory, grouped by class/library; Searches for and enumerates clones in memory; DropIn VI for including dynamically referenced clones (Clone Beacon); 'Refresh Now' (F5) re-reads all VIs in memory and adds new ones to the tree; Displays VI name, owning class/library, state, path, data size & code size; Displays VI FP Behavior, Reentrant?, Reentrancy Type, Paused? & Highlight?; Sort by any column, including by library name; Filter by item types vi, ctl, and vit/ctt; Filter out vi.lib and global VIs; Tracking of, and ability to toggle, execution highlighting on multiple selected VIs; Tracking of paused VIs with ability to Pause/Resume/TogglePause multiple selected VIs; DropIn VI for pausing on a condition; If a clone initiates a pause, a different pause symbol is used for all clones of that same reentrant original VI; Select multiple VIs and open or close their FPs or BDs; Double Click a VI from the tree to bring the BD (first choice) or FP to front, if already open; and Select multiple top-level VIs and Abort them.

  18. A text input system developed by using lips image recognition based LabVIEW for the seriously disabled.

    PubMed

    Chen, S C; Shao, C L; Liang, C K; Lin, S W; Huang, T H; Hsieh, M C; Yang, C H; Luo, C H; Wuo, C M

    2004-01-01

In this paper, we present a text input system for the seriously disabled that uses lips image recognition based on LabVIEW. This system can be divided into a software subsystem and a hardware subsystem. In the software subsystem, we adopted image-processing techniques to recognize whether the mouth is open or closed depending on the relative distance between the upper lip and the lower lip. In the hardware subsystem, the parallel port built into the PC is used to transmit the recognized mouth status to the Morse-code text input system. Integrating the software subsystem with the hardware subsystem, we implement a text input system using lips image recognition programmed in the LabVIEW language. We hope the system can help the seriously disabled communicate with other people more easily.
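
    The recognition logic described above can be sketched as follows. The pixel threshold, duration cutoff, and function names are illustrative assumptions, not the authors' implementation (which is in LabVIEW):

```python
# Illustrative sketch: classify mouth open/closed from the vertical
# distance between upper- and lower-lip landmarks, then map mouth-open
# durations to Morse dots and dashes.

def mouth_state(upper_y, lower_y, threshold=15):
    """Return 'open' if the lip gap exceeds a pixel threshold."""
    return 'open' if (lower_y - upper_y) > threshold else 'closed'

def to_morse(open_durations, dash_cutoff=0.3):
    """Map each mouth-open duration (seconds) to a dot or dash."""
    return ''.join('-' if d >= dash_cutoff else '.' for d in open_durations)

print(mouth_state(100, 130))      # gap of 30 px -> 'open'
print(to_morse([0.1, 0.5, 0.1]))  # '.-.'
```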

  19. MineLoC: A Rapid Production of Lab-on-a-Chip Biosensors Using 3D Printer and the Sandbox Game, Minecraft.

    PubMed

    Kim, Kyukwang; Kim, Hyeongkeun; Kim, Seunggyu; Jeon, Jessie S

    2018-06-10

Here, MineLoC is described as a pipeline developed to generate 3D printable models of master templates for Lab-on-a-Chip (LoC) devices by using the popular multi-player sandbox game “Minecraft”. The user can draw a simple diagram describing the channels and chambers of the Lab-on-a-Chip device with pre-registered color codes which indicate the height of the generated structure. MineLoC converts the diagram into large chunks of blocks (equal-sized cube units composing every object in the game) in the game world. The user and co-workers can simultaneously access the game and edit, modify, or review the design, a feature not generally supported by conventional design software. Once the review is complete, the resultant structure can be exported into a stereolithography (STL) file for additive manufacturing, and the Lab-on-a-Chip device can then be fabricated from the printed master by the standard protocol. The simple polydimethylsiloxane (PDMS) device for bacterial growth measurement used in previous research was reproduced by the proposed method. The error calculation by a 3D model comparison showed an accuracy of 86%. It is anticipated that this work will facilitate more use of 3D printer-based Lab-on-a-Chip fabrication, which greatly lowers the entry barrier in the field of Lab-on-a-Chip research.
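
    The color-code-to-height mapping at the core of the pipeline might be sketched like this; the color table and block counts are invented for illustration and are not MineLoC's registered codes:

```python
# Hedged sketch of the MineLoC mapping: a 2D diagram of color codes
# becomes a heightmap, which expands into voxel blocks of the kind a
# Minecraft world is built from.

COLOR_HEIGHT = {'red': 1, 'green': 2, 'blue': 4}  # blocks per color code

def diagram_to_heightmap(diagram):
    """Convert a 2D grid of color codes into a grid of block heights."""
    return [[COLOR_HEIGHT.get(c, 0) for c in row] for row in diagram]

def to_blocks(heightmap):
    """Expand the heightmap into (x, y, z) unit-cube positions."""
    return [(x, y, z)
            for y, row in enumerate(heightmap)
            for x, h in enumerate(row)
            for z in range(h)]

hm = diagram_to_heightmap([['red', 'blue'], ['green', 'green']])
print(hm)                  # [[1, 4], [2, 2]]
print(len(to_blocks(hm)))  # 1 + 4 + 2 + 2 = 9 blocks
```

    An STL export step would then emit one cuboid per block, merged into a watertight mesh; that step is omitted here.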

  20. General 3D Airborne Antenna Radiation Pattern Code Users Manual.

    DTIC Science & Technology

    1983-02-01

AD-A 30 359. General 3D Airborne Antenna Radiation Pattern Code User's Manual, The Ohio State University ElectroScience Laboratory (H. H. Chung et al.), February 1983; prepared for RADC under contract F30602-79-C-0068. The report describes a computer program and how it may be used.

  1. Numerical modeling of immiscible two-phase flow in micro-models using a commercial CFD code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crandall, Dustin; Ahmadia, Goodarz; Smith, Duane H.

    2009-01-01

Off-the-shelf CFD software is being used to analyze everything from flow over airplanes to lab-on-a-chip designs. So, how accurately can two-phase immiscible flow be modeled flowing through small-scale models of porous media? We evaluate the capability of the CFD code FLUENT™ to model immiscible flow in micro-scale, bench-top stereolithography models. By comparing the flow results to experimental models, we show that accurate 3D modeling is possible.

  2. Chemical reactivity and spectroscopy explored from QM/MM molecular dynamics simulations using the LIO code

    NASA Astrophysics Data System (ADS)

    Marcolongo, Juan P.; Zeida, Ari; Semelak, Jonathan A.; Foglia, Nicolás O.; Morzan, Uriel N.; Estrin, Dario A.; González Lebrero, Mariano C.; Scherlis, Damián A.

    2018-03-01

    In this work we present the current advances in the development and the applications of LIO, a lab-made code designed for density functional theory calculations in graphical processing units (GPU), that can be coupled with different classical molecular dynamics engines. This code has been thoroughly optimized to perform efficient molecular dynamics simulations at the QM/MM DFT level, allowing for an exhaustive sampling of the configurational space. Selected examples are presented for the description of chemical reactivity in terms of free energy profiles, and also for the computation of optical properties, such as vibrational and electronic spectra in solvent and protein environments.

  3. Graphical coding data and operational guidance for implementation or modification of a LabVIEW®-based pHstat system for the cultivation of microalgae.

    PubMed

    Golda, Rachel L; Golda, Mark D; Peterson, Tawnya D; Needoba, Joseph A

    2017-06-01

    The influence of pH on phytoplankton physiology is an important facet of the body of research on ocean acidification. We provide data developed during the design and implementation of a novel pHstat system capable of maintaining both static and dynamic pH environments in a laboratory setting. These data both help improve functionality of the system, and provide specific coding blocks for controlling the pHstat using a LabVIEW® virtual instrument (VI). The data in this paper support the research article "Development of an economical, autonomous pHstat system for culturing phytoplankton under steady state or dynamic conditions" (Golda et al. [2]). These data will be of interest to researchers studying the effects of changing pH on phytoplankton in a laboratory context, and to those desiring to build their own pHstat system(s). These data can also be used to facilitate modification of the pHstat system to control salinity, temperature, or other environmental factors.

  4. From big data analysis in the cloud to robotic pot drumming: tales from the Met Office Informatics Lab

    NASA Astrophysics Data System (ADS)

    Robinson, Niall; Tomlinson, Jacob; Prudden, Rachel; Hilson, Alex; Arribas, Alberto

    2017-04-01

    The Met Office Informatics Lab is a small multidisciplinary team which sits between science, technology and design. Our mission is simply "to make Met Office data useful" - a deliberately broad objective. Our prototypes often trial cutting edge technologies, and so far have included projects such as virtual reality data visualisation in the web browser, bots and natural language interfaces, and artificially intelligent weather warnings. In this talk we focus on our latest project, Jade, a big data analysis platform in the cloud. It is a powerful, flexible and simple to use implementation which makes extensive use of technologies such as Jupyter, Dask, containerisation, Infrastructure as Code, and auto-scaling. Crucially, Jade is flexible enough to be used for a diverse set of applications: it can present weather forecast information to meteorologists and allow climate scientists to analyse big data sets, but it is also effective for analysing non-geospatial data. As well as making data useful, the Informatics Lab also trials new working practises. In this presentation, we will talk about our experience of making a group like the Lab successful.

  5. ASC Tri-lab Co-design Level 2 Milestone Report 2015

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hornung, Rich; Jones, Holger; Keasler, Jeff

    2015-09-23

In 2015, the three Department of Energy (DOE) National Laboratories that make up the Advanced Scientific Computing (ASC) Program (Sandia, Lawrence Livermore, and Los Alamos) collaboratively explored performance portability programming environments in the context of several ASC co-design proxy applications as part of a tri-lab L2 milestone executed by the co-design teams at each laboratory. The programming environments that were studied included Kokkos (developed at Sandia), RAJA (LLNL), and Legion (Stanford University). The proxy apps studied included: miniAero, LULESH, CoMD, Kripke, and SNAP. These programming models and proxy apps are described herein. Each lab focused on a particular combination of abstractions and proxy apps, with the goal of assessing performance portability using those. Performance portability was determined by: a) the ability to run a single application source code on multiple advanced architectures, b) comparing runtime performance between …

  6. Spike-train acquisition, analysis and real-time experimental control using a graphical programming language (LabView).

    PubMed

    Nordstrom, M A; Mapletoft, E A; Miles, T S

    1995-11-01

    A solution is described for the acquisition on a personal computer of standard pulses derived from neuronal discharge, measurement of neuronal discharge times, real-time control of stimulus delivery based on specified inter-pulse interval conditions in the neuronal spike train, and on-line display and analysis of the experimental data. The hardware consisted of an Apple Macintosh IIci computer and a plug-in card (National Instruments NB-MIO16) that supports A/D, D/A, digital I/O and timer functions. The software was written in the object-oriented graphical programming language LabView. Essential elements of the source code of the LabView program are presented and explained. The use of the system is demonstrated in an experiment in which the reflex responses to muscle stretch are assessed for a single motor unit in the human masseter muscle.
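
    The real-time triggering rule described above (deliver a stimulus when the inter-pulse interval in the spike train satisfies a condition) can be sketched in Python, although the original system was implemented in LabView; the function name and interval bounds are illustrative:

```python
# Illustrative sketch of IPI-conditional stimulus delivery: scan a spike
# train and report the spike times at which a stimulus would be triggered
# because the preceding inter-pulse interval (IPI) lies in a target window.

def stimulus_times(spike_times, min_ipi, max_ipi):
    """Return spike times whose preceding IPI falls in [min_ipi, max_ipi]."""
    triggers = []
    for prev, cur in zip(spike_times, spike_times[1:]):
        ipi = cur - prev
        if min_ipi <= ipi <= max_ipi:
            triggers.append(cur)
    return triggers

spikes = [0.00, 0.05, 0.30, 0.33, 0.60]          # seconds
print(stimulus_times(spikes, 0.02, 0.10))        # [0.05, 0.33]
```

    In the real system this test runs as each pulse arrives, so the stimulus follows the qualifying spike with minimal latency rather than being computed offline.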

  7. Recent skyshine calculations at Jefferson Lab

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Degtyarenko, P.

    1997-12-01

New calculations of the skyshine dose distribution of neutrons and secondary photons have been performed at Jefferson Lab using the Monte Carlo method. The dose dependence on neutron energy, distance to the neutron source, polar angle of a source neutron, and azimuthal angle between the observation point and the momentum direction of a source neutron have been studied. The azimuthally asymmetric term in the skyshine dose distribution is shown to be important in the dose calculations around high-energy accelerator facilities. A parameterization formula and corresponding computer code have been developed which can be used for detailed calculations of the skyshine dose maps.

  8. Approximation for Bayesian Ability Estimation.

    DTIC Science & Technology

    1987-02-18


  9. 77 FR 68891 - Medicare Program; Revisions to Payment Policies Under the Physician Fee Schedule, DME Face-to...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-16

... ICD, International Classification of Diseases; IMRT, Intensity Modulated Radiation Therapy; IOM ... SBRT, stereotactic body radiation therapy; SGR, sustainable growth rate; TC, technical component; TIN, tax identification number ... the Clinical Lab Fee Schedule, which is unaffected by the misvalued code initiative. Radiation therapy centers ...

  10. Los Alamos and Lawrence Livermore National Laboratories Code-to-Code Comparison of Inter Lab Test Problem 1 for Asteroid Impact Hazard Mitigation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weaver, Robert P.; Miller, Paul; Howley, Kirsten

The NNSA Laboratories have entered into an interagency collaboration with the National Aeronautics and Space Administration (NASA) to explore strategies for prevention of Earth impacts by asteroids. Assessment of such strategies relies upon use of sophisticated multi-physics simulation codes. This document describes the task of verifying and cross-validating, between Lawrence Livermore National Laboratory (LLNL) and Los Alamos National Laboratory (LANL), modeling capabilities and methods to be employed as part of the NNSA-NASA collaboration. The approach has been to develop a set of test problems and then to compare and contrast results obtained by use of a suite of codes, including MCNP, RAGE, Mercury, Ares, and Spheral. This document provides a short description of the codes, an overview of the idealized test problems, and discussion of the results for deflection by kinetic impactors and stand-off nuclear explosions.

  11. A User’s Manual for Electromagnetic Surface Patch (ESP) Code. Version II. Polygonal Plates and Wires.

    DTIC Science & Technology

    1983-09-01

AD-A135 837. A User's Manual for Electromagnetic Surface Patch (ESP) Code, Version II: Polygonal Plates and Wires, The Ohio State University ElectroScience Laboratory. The recoverable abstract fragment cites limitations to geometries not large in terms of wavelength and the lack of analytical results which can provide physical insight into the problem; the remainder of the indexed text is OCR residue from a FORTRAN listing in the report appendix.

  12. Terminal Ballistic Application of Hydrodynamic Computer Code Calculations.

    DTIC Science & Technology

    1977-04-01

Ballistic Research Labs, Aberdeen Proving Ground, April 1977. To address this shortcoming of the code, design solutions using a combined calculational and empirical design procedure were tried. In this calculation, the explosive was confined on its periphery by a steel casing; the calculated liner shape is shown at 18 microseconds af…

  13. An Evaluation of Characteristics Contributing towards Ease of User- Computer Interface in a Computer-Aided Instruction Exercise

    DTIC Science & Technology

    1987-12-31

Final report by Kent, E., Hamel, Cheryl J., and Shrestha, Lisa B.; the remainder of the indexed text is residue from the report documentation page and distribution list.

  14. Blood collection kit for Space Lab 1

    NASA Image and Video Library

    1981-02-02

    S81-26158 (Feb 1981) --- A close-up view of a training version of a STS-40/SLS-1 blood kit. Blood samples from crewmembers are critical to a number of Space Life Sciences-1 (SLS-1) investigations. One day's collection equipment, color coded for each crewmember, is neatly organized in the kit.

  15. Lab Fire Extinguishers: Here Today, Gone Tomorrow?

    ERIC Educational Resources Information Center

    Roy, Ken

    2010-01-01

    When renovations or new construction occur, fire extinguishers sometimes get lost in the mix. Unfortunately, whether to save money or because the fire code is misinterpreted, some schools do not install fire extinguishers in laboratories and other areas of the building. Let's set the record straight! If flammables are present, the fire code…

  16. NRL Fact Book 1992-1993

    DTIC Science & Technology

    1993-06-01

Administers contractual support for lab-wide or multiple buys of ADP systems, software, and services; computer systems are located in the Central Computing Facility. The remainder of the indexed text is a personnel roster from the fact book.

  17. Building Automatic Grading Tools for Basic of Programming Lab in an Academic Institution

    NASA Astrophysics Data System (ADS)

    Harimurti, Rina; Iwan Nurhidayat, Andi; Asmunin

    2018-04-01

The skill of computer programming is a core competency that must be mastered by students majoring in computer science. The best way to improve this skill is through the practice of writing many programs to solve various problems, from simple to complex. It takes hard work and a long time to check and evaluate the results of student labs one by one, especially if the number of students is large. Based on these constraints, we propose Automatic Grading Tools (AGT), an application that can evaluate and deeply check source code in C and C++. The application architecture consists of students, a web-based application, compilers, and the operating system. Automatic Grading Tools (AGT) implements the MVC architecture using open source software: the Laravel framework version 5.4, PostgreSQL 9.6, Bootstrap 3.3.7, and the jQuery library. Automatic Grading Tools has also been tested on real problems by submitting source code in the C/C++ language and then compiling it. The test results show that the AGT application runs well.
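
    A grading step of the kind AGT performs might look like the following sketch. The real AGT compiles C/C++ submissions inside a Laravel web application; for a self-contained illustration this sketch runs a Python "submission" with the current interpreter, but the flow (save the source, run it against a test case, compare stdout) is the same:

```python
# Hedged sketch of an autograder judge step: save the submission, execute
# it with the given stdin, and compare its stdout to the expected answer.
import os
import subprocess
import sys
import tempfile

def grade(source_code, stdin_data, expected_stdout, timeout=5):
    with tempfile.TemporaryDirectory() as d:
        src = os.path.join(d, 'submission.py')
        with open(src, 'w') as f:
            f.write(source_code)
        try:
            run = subprocess.run([sys.executable, src], input=stdin_data,
                                 capture_output=True, text=True,
                                 timeout=timeout)
        except subprocess.TimeoutExpired:
            return 'time limit exceeded'
        if run.returncode != 0:
            return 'runtime error'
        return 'pass' if run.stdout.strip() == expected_stdout else 'fail'

submission = "print(int(input()) * 2)"
print(grade(submission, "21", "42"))  # 'pass'
print(grade(submission, "21", "43"))  # 'fail'
```

    For C/C++ the only extra step is invoking the compiler first and returning a compile-error verdict on a nonzero exit status.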

  18. Control code for laboratory adaptive optics teaching system

    NASA Astrophysics Data System (ADS)

    Jin, Moonseob; Luder, Ryan; Sanchez, Lucas; Hart, Michael

    2017-09-01

    By sensing and compensating wavefront aberration, adaptive optics (AO) systems have proven themselves crucial in large astronomical telescopes, retinal imaging, and holographic coherent imaging. Commercial AO systems for laboratory use are now available in the market. One such is the ThorLabs AO kit built around a Boston Micromachines deformable mirror. However, there are limitations in applying these systems to research and pedagogical projects since the software is written with limited flexibility. In this paper, we describe a MATLAB-based software suite to interface with the ThorLabs AO kit by using the MATLAB Engine API and Visual Studio. The software is designed to offer complete access to the wavefront sensor data, through the various levels of processing, to the command signals to the deformable mirror and fast steering mirror. In this way, through a MATLAB GUI, an operator can experiment with every aspect of the AO system's functioning. This is particularly valuable for tests of new control algorithms as well as to support student engagement in an academic environment. We plan to make the code freely available to the community.

  19. Recovery Act: An Integrated Experimental and Numerical Study: Developing a Reaction Transport Model that Couples Chemical Reactions of Mineral Dissolution/Precipitation with Spatial and Temporal Flow Variations.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saar, Martin O.; Seyfried, Jr., William E.; Longmire, Ellen K.

    2016-06-24

A total of 12 publications and 23 abstracts were produced as a result of this study. In particular, the compilation of a thermodynamic database utilizing consistent, current thermodynamic data is a major step toward accurately modeling multi-phase fluid interactions with solids. Existing databases designed for aqueous fluids did not mesh well with existing solid phase databases. Addition of a second liquid phase (CO2) magnifies the inconsistencies between aqueous and solid thermodynamic databases. Overall, the combination of high temperature and pressure lab studies (task 1), using a purpose built apparatus, and solid characterization (task 2), using XRCT and more developed technologies, allowed observation of dissolution and precipitation processes under CO2 reservoir conditions. These observations were combined with results from PIV experiments on multi-phase fluids (task 3) in typical flow path geometries. The results of tasks 1, 2, and 3 were compiled and integrated into numerical models utilizing Lattice-Boltzmann simulations (task 4) to realistically model the physical processes and were ultimately folded into the TOUGH2 code for reservoir scale modeling (task 5). Compilation of the thermodynamic database assisted comparisons to PIV experiments (task 3) and greatly improved Lattice Boltzmann (task 4) and TOUGH2 simulations (task 5). PIV (task 3) and the experimental apparatus (task 1) have identified problem areas in the TOUGHREACT code. Additional lab experiments and coding work have been integrated into an improved numerical modeling code.

  20. Neutron production cross sections for (d,n) reactions at 55 MeV

    NASA Astrophysics Data System (ADS)

    Wakasa, T.; Goto, S.; Matsuno, M.; Mitsumoto, S.; Okada, T.; Oshiro, H.; Sakaguchi, S.

    2017-08-01

The cross sections for (d,n) reactions on targets from natC to 197Au have been measured at a bombarding energy of 55 MeV and a laboratory scattering angle of θ_lab = 9.5°. The angular distributions for the natC(d,n) reaction have also been obtained at θ_lab = 0°-40°. The neutron energy spectra are dominated by deuteron breakup contributions, and their peak positions can be reasonably reproduced by considering Coulomb force effects. The data are compared with the TENDL-2015 nuclear data and Particle and Heavy Ion Transport code System (PHITS) calculations. Both calculations fail to reproduce the measured energy spectra and angular distributions.

  1. The Design of Integrated Information System for High Voltage Metering Lab

    NASA Astrophysics Data System (ADS)

    Ma, Yan; Yang, Yi; Xu, Guangke; Gu, Chao; Zou, Lida; Yang, Feng

    2018-01-01

With the development of the smart grid, intelligent and informatized management of the high-voltage metering lab becomes increasingly urgent. In this paper we design an integrated information system that automates the whole workflow, from accepting instruments, performing experiments, and generating reports to report signature and instrument claims. By creating a database for all calibrated instruments, using two-dimensional codes, integrating report templates in advance, establishing bookmarks, and transmitting electronic signatures online, manual procedures are largely reduced. These techniques simplify the complex process of account management and report transmission. After more than a year of operation, work efficiency has improved by about forty percent on average, and accuracy and data reliability are much higher as well.

  2. Optimal patch code design via device characterization

    NASA Astrophysics Data System (ADS)

    Wu, Wencheng; Dalal, Edul N.

    2012-01-01

    In many color measurement applications, such as those for color calibration and profiling, "patch code" has been used successfully for job identification and automation to reduce operator errors. A patch code is similar to a barcode, but is intended primarily for use in measurement devices that cannot read barcodes due to limited spatial resolution, such as spectrophotometers. There is an inherent tradeoff between decoding robustness and the number of code levels available for encoding. Previous methods have attempted to address this tradeoff, but those solutions have been sub-optimal. In this paper, we propose a method to design optimal patch codes via device characterization. The tradeoff between decoding robustness and the number of available code levels is optimized in terms of printing and measurement efforts, and decoding robustness against noises from the printing and measurement devices. Effort is drastically reduced relative to previous methods because print-and-measure is minimized through modeling and the use of existing printer profiles. Decoding robustness is improved by distributing the code levels in CIE Lab space rather than in CMYK space.
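
    The core idea, distributing code levels evenly in CIE Lab (the space the measurement device actually reports) rather than in CMYK, can be sketched with a toy one-channel characterization. The ink-to-L* curve below is invented for illustration, not a real device profile:

```python
# Illustrative sketch: choose patch-code ink levels so that the *measured*
# L* values are equally spaced, maximizing separation between code levels
# as seen by the spectrophotometer.

def ink_to_L(ink):
    """Hypothetical printer characterization: ink fraction (0..1) -> L*."""
    return 100.0 - 70.0 * ink ** 0.5   # non-linear, as real devices are

def levels_uniform_in_L(n):
    """Choose n ink levels whose characterized L* values are equally spaced."""
    L_lo, L_hi = ink_to_L(1.0), ink_to_L(0.0)            # 30 .. 100
    targets = [L_lo + i * (L_hi - L_lo) / (n - 1) for i in range(n)]
    # invert the characterization analytically for this toy curve
    return [((100.0 - L) / 70.0) ** 2 for L in targets]

inks = levels_uniform_in_L(4)
Ls = [ink_to_L(i) for i in inks]
print([round(L, 1) for L in Ls])  # equally spaced in L*, not in ink
```

    Naive levels uniform in ink fraction would crowd together in L* at high coverage, which is exactly the decoding-robustness loss the paper's characterization-driven design avoids.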

  3. A Remote Lab for Experiments with a Team of Mobile Robots

    PubMed Central

    Casini, Marco; Garulli, Andrea; Giannitrapani, Antonio; Vicino, Antonio

    2014-01-01

    In this paper, a remote lab for experimenting with a team of mobile robots is presented. Robots are built with the LEGO Mindstorms technology and user-defined control laws can be directly coded in the Matlab programming language and validated on the real system. The lab is versatile enough to be used for both teaching and research purposes. Students can easily go through a number of predefined mobile robotics experiences without having to worry about robot hardware or low-level programming languages. More advanced experiments can also be carried out by uploading custom controllers. The capability to have full control of the vehicles, together with the possibility to define arbitrarily complex environments through the definition of virtual obstacles, makes the proposed facility well suited to quickly test and compare different control laws in a real-world scenario. Moreover, the user can simulate the presence of different types of exteroceptive sensors on board of the robots or a specific communication architecture among the agents, so that decentralized control strategies and motion coordination algorithms can be easily implemented and tested. A number of possible applications and real experiments are presented in order to illustrate the main features of the proposed mobile robotics remote lab. PMID:25192316

  4. A remote lab for experiments with a team of mobile robots.

    PubMed

    Casini, Marco; Garulli, Andrea; Giannitrapani, Antonio; Vicino, Antonio

    2014-09-04

    In this paper, a remote lab for experimenting with a team of mobile robots is presented. Robots are built with the LEGO Mindstorms technology and user-defined control laws can be directly coded in the Matlab programming language and validated on the real system. The lab is versatile enough to be used for both teaching and research purposes. Students can easily go through a number of predefined mobile robotics experiences without having to worry about robot hardware or low-level programming languages. More advanced experiments can also be carried out by uploading custom controllers. The capability to have full control of the vehicles, together with the possibility to define arbitrarily complex environments through the definition of virtual obstacles, makes the proposed facility well suited to quickly test and compare different control laws in a real-world scenario. Moreover, the user can simulate the presence of different types of exteroceptive sensors on board of the robots or a specific communication architecture among the agents, so that decentralized control strategies and motion coordination algorithms can be easily implemented and tested. A number of possible applications and real experiments are presented in order to illustrate the main features of the proposed mobile robotics remote lab.
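
    A user-defined control law of the sort the lab accepts might look like the following (sketched in Python rather than Matlab; the gains and names are illustrative, not part of the lab's API):

```python
# Minimal go-to-goal control law for a unicycle-type mobile robot:
# drive forward in proportion to distance, turn in proportion to
# heading error toward the goal.
import math

def go_to_goal(x, y, theta, gx, gy, k_v=0.5, k_w=2.0):
    """Return (linear velocity, angular velocity) steering toward (gx, gy)."""
    dx, dy = gx - x, gy - y
    dist = math.hypot(dx, dy)
    heading_err = math.atan2(dy, dx) - theta
    # wrap the heading error to [-pi, pi]
    heading_err = math.atan2(math.sin(heading_err), math.cos(heading_err))
    return k_v * dist, k_w * heading_err

v, w = go_to_goal(0.0, 0.0, 0.0, 1.0, 0.0)
print(v, w)  # goal dead ahead: positive speed, zero turn rate
```

    Uploaded to the lab, such a controller would be evaluated at each sampling instant against the vehicle pose reported by the facility's localization system.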

  5. Validating a Monotonically-Integrated Large Eddy Simulation Code for Subsonic Jet Acoustics

    NASA Technical Reports Server (NTRS)

    Ingraham, Daniel; Bridges, James

    2017-01-01

The results of subsonic jet validation cases for the Naval Research Lab's Jet Engine Noise REduction (JENRE) code are reported. Two set points from the Tanna matrix, set point 3 (Ma = 0.5, unheated) and set point 7 (Ma = 0.9, unheated), are attempted on three different meshes. After a brief discussion of the JENRE code and the meshes constructed for this work, the turbulent statistics for the axial velocity are presented and compared to experimental data, with favorable results. Preliminary simulations for set point 23 (Ma = 0.5, Tj/T∞ = 1.764) on one of the meshes are also described. Finally, the proposed configuration for the farfield noise prediction with JENRE's Ffowcs Williams-Hawkings solver is detailed.

  6. JLIFE: THE JEFFERSON LAB INTERACTIVE FRONT END FOR THE OPTICAL PROPAGATION CODE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Watson, Anne M.; Shinn, Michelle D.

    2013-08-01

We present details on a graphical interface for the open source software program Optical Propagation Code, or OPC. This interface, written in Java, allows a user with no knowledge of OPC to create an optical system, with lenses, mirrors, apertures, etc. and the appropriate drifts between them. The Java code creates the appropriate Perl script that serves as the input for OPC. The mode profile is then output at each optical element. The display can be either an intensity profile along the x axis, or an isometric 3D plot which can be tilted and rotated. These profiles can be saved. Examples of the input and output will be presented.

  7. Implementation of an Unequal Path Length, Heterodyne Interferometer on the MOCHI LabJet Experiment

    NASA Astrophysics Data System (ADS)

    Card, Alexander Harrison

    The MOCHI LabJet experiment aims to explore the stability of magnetic flux tubes through the medium of laboratory astrophysical plasmas. The boundary conditions of large gravitational bodies, namely accretion disks, are replicated and allowed to influence a plasma over short timescales. Observation of the plasma is enabled through use of a variety of fast diagnostics, including an unequal path length, heterodyne, quadrature phase differential interferometer, the development and implementation of which is described in detail. The LabJet gun, a triple-electrode planar plasma gun featuring azimuthally symmetric gas injection, achieves a new, long-duration, highly-stabilized, jet plasma formation. The line-integrated density in this new LabJet formation is found to be ne = (6 ± 3)×10^20 m^-2. By observing the axial expansion rate of the jet over multiple chord locations (all perpendicular to the propagation axis), the interferometer provides an Alfvén velocity measurement of vA = 41.3 ± 5.4 km/s, which at the jet density observed indicates an axial magnetic field strength of Bz = 0.15 ± 0.04 T. Various other laboratory components are also detailed, such as a shot-based MDSplus data storage architecture implemented into the LabVIEW experiment control code, and the production and performance of ten fast neutral gas injection valves which, when fired in unison, provide a total particle inventory of (7.8 ± 0.6)×10^23 HI particles.
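    The quoted field strength follows from the Alfvén relation vA = B / sqrt(μ0 ρ). A sketch of the arithmetic in Python, assuming a hydrogen plasma and an illustrative 0.1 m chord length to convert the line-integrated density to a volume density (the chord length is not stated in the abstract, so the numbers are only order-of-magnitude consistent with it):

```python
import math

MU0 = 4 * math.pi * 1e-7   # vacuum permeability, T*m/A
M_PROTON = 1.6726e-27      # proton mass, kg

def axial_field(v_alfven, line_density, chord_length):
    """Infer B from the Alfven relation v_A = B / sqrt(mu0 * rho).

    line_density is the interferometer's line-integrated density (m^-2);
    chord_length (m) is an assumed value used to form a volume density.
    """
    n_volume = line_density / chord_length   # m^-3
    rho = n_volume * M_PROTON                # mass density for hydrogen, kg/m^3
    return v_alfven * math.sqrt(MU0 * rho)

# v_A = 41.3 km/s, line density 6e20 m^-2, assumed 0.1 m chord:
b_z = axial_field(41.3e3, 6e20, 0.1)   # roughly 0.15 T
```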

  8. Lightweight UDP Pervasive Protocol in Smart Home Environment Based on Labview

    NASA Astrophysics Data System (ADS)

    Kurniawan, Wijaya; Hannats Hanafi Ichsan, Mochammad; Rizqika Akbar, Sabriansyah; Arwani, Issa

    2017-04-01

    TCP (Transmission Control Protocol) works well in a reliable environment, but a Smart Home network is connected only locally. Pervasive protocols that employ TCP transmit data more slowly, because they must first perform a handshaking process, and they cannot broadcast data. A smart home environment, such as a smart-home strain-monitoring system, does not require the large, complex data transmissions needed between a monitoring site and a monitoring center. UDP (User Datagram Protocol) makes the data transmission process quick and simple: because UDP requires no handshaking, it can broadcast messages, and it uses memory more efficiently. LabVIEW is a programming-language software for processing and visualizing data in the field of data acquisition. This paper examines a pervasive UDP protocol implementation in a smart home environment based on LabVIEW. The UDP protocol was coded in LabVIEW, experiments were performed on a PC, and the implementation works properly.
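    The handshake-free, broadcast-capable behavior the paper exploits can be sketched with standard sockets. This Python fragment illustrates the UDP mechanism only; it is not the paper's LabVIEW code, and the port number and payload format are invented:

```python
import socket

def send_datagram(payload: bytes, addr=("255.255.255.255", 5005), broadcast=True):
    """Fire-and-forget UDP send: no handshake, so sendto() returns as
    soon as the datagram is handed to the network stack.
    """
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    if broadcast:
        # Datagrams can be broadcast to every listener on the subnet,
        # something TCP's connection-oriented model cannot do.
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    try:
        sock.sendto(payload, addr)
    finally:
        sock.close()

# e.g. announce a sensor reading to all smart-home nodes:
# send_datagram(b"temp=21.5C")
```

The trade-off is that delivery is not guaranteed or ordered; for periodic sensor readings in a local smart-home network, a lost datagram is simply superseded by the next one.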

  9. Attitude identification for SCOLE using two infrared cameras

    NASA Technical Reports Server (NTRS)

    Shenhar, Joram

    1991-01-01

    An algorithm is presented that incorporates real time data from two infrared cameras and computes the attitude parameters of the Spacecraft COntrol Lab Experiment (SCOLE), a lab apparatus representing an offset feed antenna attached to the Space Shuttle by a flexible mast. The algorithm uses camera position data of three miniature light emitting diodes (LEDs), mounted on the SCOLE platform, permitting arbitrary camera placement and an on-line attitude extraction. The continuous nature of the algorithm allows identification of the placement of the two cameras with respect to some initial position of the three reference LEDs, followed by on-line six degrees of freedom attitude tracking, regardless of the attitude time history. A description is provided of the algorithm in the camera identification mode as well as the mode of target tracking. Experimental data from a reduced size SCOLE-like lab model, reflecting the performance of the camera identification and the tracking processes, are presented. Computer code for camera placement identification and SCOLE attitude tracking is listed.
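    Recovering a rigid-body attitude from three non-collinear marker points can be sketched with a generic triad construction: build an orthonormal frame from the reference LED positions, build the same frame from the measured positions, and combine them into a rotation matrix. This Python sketch is a textbook method, not the SCOLE code listed in the report, and the point coordinates are illustrative:

```python
import math

def _sub(a, b): return [a[i] - b[i] for i in range(3)]
def _dot(a, b): return sum(a[i] * b[i] for i in range(3))
def _cross(a, b):
    return [a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0]]
def _unit(a):
    m = math.sqrt(_dot(a, a))
    return [x / m for x in a]

def triad(p1, p2, p3):
    """Right-handed orthonormal frame spanned by three non-collinear points."""
    e1 = _unit(_sub(p2, p1))
    v = _sub(p3, p1)
    e2 = _unit([v[i] - _dot(v, e1) * e1[i] for i in range(3)])  # Gram-Schmidt
    e3 = _cross(e1, e2)
    return [e1, e2, e3]   # rows are the basis vectors

def attitude(ref_pts, meas_pts):
    """3x3 rotation carrying the reference LED configuration into the
    measured one: R = M_meas^T @ M_ref (both frames are orthonormal)."""
    m_ref = triad(*ref_pts)
    m_mea = triad(*meas_pts)
    return [[sum(m_mea[k][i] * m_ref[k][j] for k in range(3))
             for j in range(3)] for i in range(3)]
```

Running this once per camera frame yields the six-degree-of-freedom tracking described above (the translation is just the motion of the first LED).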

  10. Low-Cost High-Speed Techniques for Real-Time Simulation of Power Electronic Systems

    DTIC Science & Technology

    2007-06-01

    first implemented on the RT-Lab using Simulink S-functions. An effort was then initiated to code at least part of the simulation on the available FPGA. It...time simulation, and the use of simulation packages such as Matlab and Spice. The primary purpose of these calculations was to confirm that the

  11. Adaptation of Flux-Corrected Transport Algorithms for Modeling Dusty Flows.

    DTIC Science & Technology

    1983-12-20


  12. 78 FR 39283 - Forum on Environmental Measurements Announcement of Competency Policy for Assistance Agreements...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-01

    ... originally approved on December 12, 2012 by the Science Technology Policy Council (STPC). Because... materials to aid with implementation are available on the FEM Web site ( http://www.epa.gov/fem/lab_comp.htm..., Science Advisor. [FR Doc. 2013-15753 Filed 6-28-13; 8:45 am] BILLING CODE 6560-50-P ...

  13. Resonant Terahertz Absorption Using Metamaterial Structures

    DTIC Science & Technology

    2012-12-01

    Naval Postgraduate School thesis, Monterey, California; approved for public release, distribution is unlimited. Second Reader: James H. Newman. The recoverable abstract fragment reads: "The Sensor Research Lab at the Naval Postgraduate..."

  14. GOES satellite time code dissemination

    NASA Technical Reports Server (NTRS)

    Beehler, R. E.

    1983-01-01

    The GOES time code system, the performance achieved to date, and some potential improvements in the future are discussed. The disseminated time code is originated from a triply redundant set of atomic standards, time code generators and related equipment maintained by NBS at NOAA's Wallops Island, VA satellite control facility. It is relayed by two GOES satellites located at 75 W and 135 W longitude on a continuous basis to users within North and South America (with overlapping coverage) and well out into the Atlantic and Pacific ocean areas. Downlink frequencies are near 468 MHz. The signals from both satellites are monitored and controlled from the NBS labs at Boulder, CO with additional monitoring input from geographically separated receivers in Washington, D.C. and Hawaii. Performance experience with the received time codes for periods ranging from several years to one day is discussed. Results are also presented for simultaneous, common-view reception by co-located receivers and by receivers separated by several thousand kilometers.

  15. The Instrumentation of a Microfluidic Analyzer Enabling the Characterization of the Specific Membrane Capacitance, Cytoplasm Conductivity, and Instantaneous Young's Modulus of Single Cells.

    PubMed

    Wang, Ke; Zhao, Yang; Chen, Deyong; Huang, Chengjun; Fan, Beiyuan; Long, Rong; Hsieh, Chia-Hsun; Wang, Junbo; Wu, Min-Hsien; Chen, Jian

    2017-06-19

    This paper presents the instrumentation of a microfluidic analyzer enabling the characterization of single-cell biophysical properties, which includes seven key components: a microfluidic module, a pressure module, an imaging module, an impedance module, two LabVIEW platforms for instrument operation and raw data processing, respectively, and a Python code for data translation. Under the control of the LabVIEW platform for instrument operation, the pressure module flushes single cells into the microfluidic module with raw biophysical parameters sampled by the imaging and impedance modules and processed by the LabVIEW platform for raw data processing, which were further translated into intrinsic cellular biophysical parameters using the code developed in Python. Based on this system, specific membrane capacitance, cytoplasm conductivity, and instantaneous Young's modulus of three cell types were quantified as 2.76 ± 0.57 μF/cm², 1.00 ± 0.14 S/m, and 3.79 ± 1.11 kPa for A549 cells (ncell = 202); 1.88 ± 0.31 μF/cm², 1.05 ± 0.16 S/m, and 3.74 ± 0.75 kPa for 95D cells (ncell = 257); 2.11 ± 0.38 μF/cm², 0.87 ± 0.11 S/m, and 5.39 ± 0.89 kPa for H460 cells (ncell = 246). As a semi-automatic instrument with a throughput of roughly 1 cell per second, this prototype instrument can be potentially used for the characterization of cellular biophysical properties.

  16. The Instrumentation of a Microfluidic Analyzer Enabling the Characterization of the Specific Membrane Capacitance, Cytoplasm Conductivity, and Instantaneous Young’s Modulus of Single Cells

    PubMed Central

    Wang, Ke; Zhao, Yang; Chen, Deyong; Huang, Chengjun; Fan, Beiyuan; Long, Rong; Hsieh, Chia-Hsun; Wang, Junbo; Wu, Min-Hsien; Chen, Jian

    2017-01-01

    This paper presents the instrumentation of a microfluidic analyzer enabling the characterization of single-cell biophysical properties, which includes seven key components: a microfluidic module, a pressure module, an imaging module, an impedance module, two LabVIEW platforms for instrument operation and raw data processing, respectively, and a Python code for data translation. Under the control of the LabVIEW platform for instrument operation, the pressure module flushes single cells into the microfluidic module with raw biophysical parameters sampled by the imaging and impedance modules and processed by the LabVIEW platform for raw data processing, which were further translated into intrinsic cellular biophysical parameters using the code developed in Python. Based on this system, specific membrane capacitance, cytoplasm conductivity, and instantaneous Young’s modulus of three cell types were quantified as 2.76 ± 0.57 μF/cm2, 1.00 ± 0.14 S/m, and 3.79 ± 1.11 kPa for A549 cells (ncell = 202); 1.88 ± 0.31 μF/cm2, 1.05 ± 0.16 S/m, and 3.74 ± 0.75 kPa for 95D cells (ncell = 257); 2.11 ± 0.38 μF/cm2, 0.87 ± 0.11 S/m, and 5.39 ± 0.89 kPa for H460 cells (ncell = 246). As a semi-automatic instrument with a throughput of roughly 1 cell per second, this prototype instrument can be potentially used for the characterization of cellular biophysical properties. PMID:28629175

  17. Members of the Science and Technology Partnership Forum Listen to a Presentation about In-Space Assembly and Satellite Servicing

    NASA Image and Video Library

    2017-09-06

    WASHINGTON, D.C.---S&T Partnership Forum In-Space Assembly Technical Interchange Meeting-On September 6th 2017, many of the United States government experts on In-Space Assembly met at the U.S. Naval Research Lab to discuss both technology development and in-space applications that would advance national capabilities in this area. Expertise from NASA, USAF, NRO, DARPA and NRL met in this meeting which was coordinated by the NASA Headquarters, Office of the Chief Technologist. This technical interchange meeting was the second meeting of the members of this Science and Technology Partnership Forum. Glen Henshaw of Code 8231 talks to the group in the Space Robotics Lab.

  18. Computer aided statistical process control for on-line instrumentation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meils, D.E.

    1995-01-01

    On-line chemical process instrumentation historically has been used for trending. Recent technological advances in on-line instrumentation have improved its accuracy and reliability. However, little attention has been given to validating and verifying on-line instrumentation. This paper presents two practical approaches for validating instrument performance by comparison of on-line instrument response to either another portable instrument or another bench instrument. Because the comparison of two instruments' performance to each other requires somewhat complex statistical calculations, a computer code (Lab Stats Pack®) is used to simplify the calculations. Lab Stats Pack® also develops control charts that may be used for continuous verification of on-line instrument performance.
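    The control-chart side of such a comparison can be sketched with Shewhart-style ±3σ limits on the paired differences between the two instruments. This Python fragment is a generic illustration, not the Lab Stats Pack® algorithm, and the difference data are made up:

```python
import statistics

def control_limits(readings, n_sigma=3.0):
    """Center line and +/- n_sigma control limits for a data series.

    A simple Shewhart-style individuals chart: a point outside
    (lcl, ucl) flags a difference between the two instruments that
    is unlikely to be random variation.
    """
    center = statistics.fmean(readings)
    s = statistics.stdev(readings)
    return center - n_sigma * s, center, center + n_sigma * s

# Paired differences (on-line minus bench instrument), illustrative values:
diffs = [0.2, -0.1, 0.0, 0.3, -0.2, 0.1, 0.0, -0.1]
lcl, center, ucl = control_limits(diffs)
```

A center line far from zero would indicate a systematic bias between the on-line and reference instruments, while points outside the limits indicate loss of control.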

  19. Glen Henshaw Briefs NASA Chief and Deputy Chief Technologists at the In-Space Assembly Technical Interchange Meeting on September 6, 2017

    NASA Image and Video Library

    2017-09-06

    WASHINGTON, D.C.---S&T Partnership Forum In-Space Assembly Technical Interchange Meeting-On September 6th 2017, many of the United States government experts on In-Space Assembly met at the U.S. Naval Research Lab to discuss both technology development and in-space applications that would advance national capabilities in this area. Expertise from NASA, USAF, NRO, DARPA and NRL met in this meeting which was coordinated by the NASA Headquarters, Office of the Chief Technologist. This technical interchange meeting was the second meeting of the members of this Science and Technology Partnership Forum. Glen Henshaw of Code 8231 talks to the group in the Space Robotics Lab.

  20. Girls In STEM White Coat Ceremony 2017

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Neal, Kelsey Ann Elizabeth; Coronado, Elizabeth

    Like working with children? Who doesn’t? Girls in STEM has myriad opportunities for you to help local Title 1 students in the classroom. You can choose to volunteer from the Lab, from the Bradbury Science Museum, or to travel to Abiquiu Elementary School (car provided) to do a science demonstration. The best part is that you can use the Community Outreach Partnership Code.

  1. A CRISPR Path to Engineering New Genetic Mouse Models for Cardiovascular Research

    PubMed Central

    Miano, Joseph M.; Zhu, Qiuyu Martin; Lowenstein, Charles J.

    2016-01-01

    Previous efforts to target the mouse genome for the addition, subtraction, or substitution of biologically informative sequences required complex vector design and a series of arduous steps only a handful of labs could master. The facile and inexpensive clustered regularly interspaced short palindromic repeats (CRISPR) method has now superseded traditional means of genome modification such that virtually any lab can quickly assemble reagents for developing new mouse models for cardiovascular research. Here we briefly review the history of CRISPR in prokaryotes, highlighting major discoveries leading to its formulation for genome modification in the animal kingdom. Core components of CRISPR technology are reviewed and updated. Practical pointers for two-component and three-component CRISPR editing are summarized with a number of applications in mice including frameshift mutations, deletion of enhancers and non-coding genes, nucleotide substitution of protein-coding and gene regulatory sequences, incorporation of loxP sites for conditional gene inactivation, and epitope tag integration. Genotyping strategies are presented and topics of genetic mosaicism and inadvertent targeting discussed. Finally, clinical applications and ethical considerations are addressed as the biomedical community eagerly embraces this astonishing innovation in genome editing to tackle previously intractable questions. PMID:27102963

  2. Extending software repository hosting to code review and testing

    NASA Astrophysics Data System (ADS)

    Gonzalez Alvarez, A.; Aparicio Cotarelo, B.; Lossent, A.; Andersen, T.; Trzcinska, A.; Asbury, D.; Høimyr, N.; Meinhard, H.

    2015-12-01

    We will describe how CERN's services around Issue Tracking and Version Control have evolved, and what the plans for the future are. We will describe the services' main design, integration and structure, giving special attention to the new requirements from the community of users in terms of collaboration and integration tools, and how we address this challenge when defining new services based on GitLab for collaboration and Code Review, replacing our current Gitolite service, and on Jenkins for Continuous Integration. These new services complement the existing ones to create a new global "development tool stack" where each working group can place its particular development work-flow.

  3. A high order approach to flight software development and testing

    NASA Technical Reports Server (NTRS)

    Steinbacher, J.

    1981-01-01

    The use of a software development facility is discussed as a means of producing a reliable and maintainable ECS software system, and as a means of providing efficient use of the ECS hardware test facility. Principles applied to software design are given, including modularity, abstraction, hiding, and uniformity. The general objectives of each phase of the software life cycle are also given, including testing, maintenance, code development, and requirement specifications. Software development facility tools are summarized, and tool deficiencies recognized in the code development and testing phases are considered. Due to limited lab resources, the functional simulation capabilities may be indispensable in the testing phase.

  4. LabVIEW control software for scanning micro-beam X-ray fluorescence spectrometer.

    PubMed

    Wrobel, Pawel; Czyzycki, Mateusz; Furman, Leszek; Kolasinski, Krzysztof; Lankosz, Marek; Mrenca, Alina; Samek, Lucyna; Wegrzynek, Dariusz

    2012-05-15

    A confocal micro-beam X-ray fluorescence microscope was constructed. The system was assembled from commercially available components - a low power X-ray tube source, polycapillary X-ray optics and a silicon drift detector - controlled by in-house developed LabVIEW software. A video camera coupled to an optical microscope was utilized to display the area excited by the X-ray beam. The camera image calibration and scan area definition software were also based entirely on LabVIEW code. Presently, the main area of application of the newly constructed spectrometer is 2-dimensional mapping of element distribution in environmental, biological and geological samples with micrometer spatial resolution. The hardware and the developed software can already handle volumetric 3-D confocal scans. In this work, the front panel graphical user interface as well as communication protocols between hardware components are described. Two applications of the spectrometer, to homogeneity testing of titanium layers and to imaging of various types of grains in air particulate matter collected on membrane filters, are presented.

  5. Developing automated analytical methods for scientific environments using LabVIEW.

    PubMed

    Wagner, Christoph; Armenta, Sergio; Lendl, Bernhard

    2010-01-15

    The development of new analytical techniques often requires the building of specially designed devices, each requiring its own dedicated control software. Especially in the research and development phase, LabVIEW has proven to be one highly useful tool for developing this software. Yet, it is still common practice to develop individual solutions for different instruments. In contrast to this, we present here a single LabVIEW-based program that can be directly applied to various analytical tasks without having to change the program code. Driven by a set of simple script commands, it can control a whole range of instruments, from valves and pumps to full-scale spectrometers. Fluid sample (pre-)treatment and separation procedures can thus be flexibly coupled to a wide range of analytical detection methods. Here, the capabilities of the program have been demonstrated by using it for the control of both a sequential injection analysis - capillary electrophoresis (SIA-CE) system with UV detection, and an analytical setup for studying the inhibition of enzymatic reactions using a SIA system with FTIR detection.
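    The script-driven idea, one generic engine dispatching simple commands to instrument handlers so that no program code changes between setups, can be sketched as follows. The command names and handler signatures are invented for illustration; the paper's actual LabVIEW script language is not described in this abstract:

```python
def run_script(script: str, handlers: dict):
    """Dispatch 'VERB arg ...' lines to device handler callables.

    Adding support for a new instrument means registering a new
    handler, not changing this engine - the design idea behind
    script-driven instrument control.
    """
    results = []
    for line in script.strip().splitlines():
        if not line.strip() or line.strip().startswith("#"):
            continue                       # skip blank and comment lines
        verb, *args = line.split()
        results.append(handlers[verb](*args))
    return results

# One hypothetical step of a flow-analysis sequence:
log = run_script(
    """
    VALVE open
    PUMP 250
    VALVE close
    """,
    {
        "VALVE": lambda state: f"valve {state}",
        "PUMP": lambda ul: f"pumped {ul} uL",
    },
)
```

In a real system the handlers would talk to valves, pumps, and spectrometers over their respective buses; the script stays a plain text file that an analyst can edit.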

  6. Simulations of a FIR Oscillator with Large Slippage parameter at Jefferson Lab for FIR/UV pump-probe experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Benson, Stephen V.; Campbell, L. T.; McNeil, B.W.T.

    We previously proposed a dual FEL configuration on the UV Demo FEL at Jefferson Lab that would allow simultaneous lasing at FIR and UV wavelengths. The FIR source would be an FEL oscillator with a short wiggler providing diffraction-limited pulses with pulse energy exceeding 50 microJoules, using the exhaust beam from a UV FEL as the input electron beam. Since the UV FEL requires very short pulses, the input to the FIR FEL is extremely short compared to a slippage length and the usual Slowly Varying Envelope Approximation (SVEA) does not apply. We use a non-SVEA code to simulate this system both with a small energy spread (UV laser off) and with large energy spread (UV laser on).

  7. Effects of Norfolk Harbor Deepening on Management of Craney Island Disposal Area.

    DTIC Science & Technology

    1983-04-01


  8. Multiple Independent File Parallel I/O with HDF5

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, M. C.

    2016-07-13

    The HDF5 library has supported the I/O requirements of HPC codes at Lawrence Livermore National Labs (LLNL) since the late 90's. In particular, HDF5 used in the Multiple Independent File (MIF) parallel I/O paradigm has supported LLNL codes' scalable I/O requirements and has recently been gainfully used at scales as large as O(10^6) parallel tasks.
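    In the MIF paradigm, tasks are divided into a fixed number of groups with one file per group, so the file count stays constant as the task count grows; within a group, members take turns writing. A sketch of the rank-to-file mapping only, with the turn-serializing baton pass omitted (the contiguous grouping scheme is an assumption, not taken from the LLNL code):

```python
import math

def mif_assignment(rank: int, n_tasks: int, n_files: int):
    """Map an MPI rank to (file_index, turn) under a simple MIF scheme.

    Ranks are split into n_files contiguous groups; 'turn' is the
    rank's position in the write rotation for its group's file.
    Real MIF code passes a baton so only one group member writes
    to the shared HDF5 file at a time.
    """
    group_size = math.ceil(n_tasks / n_files)
    return rank // group_size, rank % group_size
```

The appeal is that concurrency is bounded by `n_files` regardless of scale, avoiding both the metadata storm of file-per-task and the lock contention of a single shared file.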

  9. Overview of codes and tools for nuclear engineering education

    NASA Astrophysics Data System (ADS)

    Yakovlev, D.; Pryakhin, A.; Medvedeva, L.

    2017-01-01

    Recent world trends in nuclear education have developed in the direction of social education, networking, virtual tools and codes. MEPhI, as a global leader on the world education market, implements new advanced technologies for distance and online learning and for student research work. MEPhI has produced special codes, tools and web resources based on an internet platform to support education in the field of nuclear technology. At the same time, MEPhI actively uses codes and tools from third parties. Several types of tools are considered: calculation codes, nuclear data visualization tools, virtual labs, PC-based educational simulators for nuclear power plants (NPP), CLP4NET, education web-platforms, and distance courses (MOOCs and controlled and managed content systems). The university pays special attention to integrated products such as CLP4NET, which is not a learning course but serves to automate the process of learning through distance technologies. CLP4NET organizes all tools in the same information space. Up to now, MEPhI has achieved significant results in the field of distance education and online system implementation.

  10. NASA Data Acquisition System Software Development for Rocket Propulsion Test Facilities

    NASA Technical Reports Server (NTRS)

    Herbert, Phillip W., Sr.; Elliot, Alex C.; Graves, Andrew R.

    2015-01-01

    Current NASA propulsion test facilities include Stennis Space Center in Mississippi, Marshall Space Flight Center in Alabama, Plum Brook Station in Ohio, and White Sands Test Facility in New Mexico. Within and across these centers, a diverse set of data acquisition systems exist with different hardware and software platforms. The NASA Data Acquisition System (NDAS) is a software suite designed to operate and control many critical aspects of rocket engine testing. The software suite combines real-time data visualization, data recording to a variety of formats, short-term and long-term acquisition system calibration capabilities, test stand configuration control, and a variety of data post-processing capabilities. Additionally, data stream conversion functions exist to translate test facility data streams to and from downstream systems, including engine customer systems. The primary design goals for NDAS are flexibility, extensibility, and modularity. Providing a common user interface for a variety of hardware platforms helps drive consistency and error reduction during testing. In addition, with an understanding that test facilities have different requirements and setups, the software is designed to be modular. One engine program may require real-time displays and data recording; others may require more complex data stream conversion, measurement filtering, or test stand configuration management. The NDAS suite allows test facilities to choose which components to use based on their specific needs. The NDAS code is primarily written in LabVIEW, a graphical, data-flow driven language. Although LabVIEW is a general-purpose programming language, large-scale software development in the language is relatively rare compared to more commonly used languages. The NDAS software suite also makes extensive use of a new, advanced development framework called the Actor Framework.
The Actor Framework provides a level of code reuse and extensibility that has previously been difficult to achieve using LabVIEW. The

  11. Logical qubit fusion

    NASA Astrophysics Data System (ADS)

    Moussa, Jonathan; Ryan-Anderson, Ciaran

    The canonical modern plan for universal quantum computation is a Clifford+T gate set implemented in a topological error-correcting code. This plan has the basic disparity that logical Clifford gates are natural for codes in two spatial dimensions while logical T gates are natural in three. Recent progress has reduced this disparity by proposing logical T gates in two dimensions with doubled, stacked, or gauge color codes, but these proposals lack an error threshold. An alternative universal gate set is Clifford+F, where a fusion (F) gate converts two logical qubits into a logical qudit. We show that logical F gates can be constructed by identifying compatible pairs of qubit and qudit codes that stabilize the same logical subspace, much like the original Bravyi-Kitaev construction of magic state distillation. The simplest example of high-distance compatible codes results in a proposal that is very similar to the stacked color code with the key improvement of retaining an error threshold. Sandia National Labs is a multi-program laboratory managed and operated by Sandia Corp, a wholly owned subsidiary of Lockheed Martin Corp, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.

  12. Preliminary investigation of parasitic radioisotope production using the LANL IPF secondary neutron flux

    NASA Astrophysics Data System (ADS)

    Engle, J. W.; Kelsey, C. T.; Bach, H.; Ballard, B. D.; Fassbender, M. E.; John, K. D.; Birnbaum, E. R.; Nortier, F. M.

    2012-12-01

    In order to ascertain the potential for radioisotope production and material science studies using the Isotope Production Facility at Los Alamos National Lab, a two-pronged investigation has been initiated. The Monte Carlo N-Particle eXtended (MCNPX) code has been used in conjunction with the CINDER 90 burnup code to predict neutron flux energy distributions as a result of routine irradiations and to estimate yields of radioisotopes of interest for hypothetical irradiation conditions. A threshold foil activation experiment is planned to study the neutron flux using measured yields of radioisotopes, quantified by HPGe gamma spectroscopy, from representative nuclear reactions with known thresholds up to 50 MeV.
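    Yield estimates of this kind ultimately rest on the standard activation equation A = N σ φ (1 − e^(−λt)). A sketch of that calculation in Python; every numeric value below is illustrative and none is taken from the LANL study:

```python
import math

def activation_rate(n_atoms, sigma_cm2, flux, half_life_s, t_irr_s):
    """End-of-bombardment activity (Bq) for a single-reaction product,
    from A = N * sigma * phi * (1 - exp(-lambda * t)).

    n_atoms:     target atoms exposed to the flux
    sigma_cm2:   reaction cross section (cm^2; 1 barn = 1e-24 cm^2)
    flux:        neutron flux (n / cm^2 / s)
    half_life_s: product half-life (s)
    t_irr_s:     irradiation time (s)
    """
    lam = math.log(2) / half_life_s
    return n_atoms * sigma_cm2 * flux * (1.0 - math.exp(-lam * t_irr_s))

# Illustrative: 1e23 target atoms, 1-barn cross section, 1e13 n/cm^2/s flux,
# 1-hour half-life, irradiated long enough to reach saturation:
a_sat = activation_rate(1e23, 1e-24, 1e13, 3600.0, 1e6)
```

The saturation factor (1 − e^(−λt)) is why short-lived products gain nothing from irradiations much longer than a few half-lives, a key constraint when scheduling parasitic irradiations.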

  13. Computational Accelerator Physics. Proceedings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bisognano, J.J.; Mondelli, A.A.

    1997-04-01

    The sixty-two papers appearing in this volume were presented at CAP96, the Computational Accelerator Physics Conference held in Williamsburg, Virginia from September 24-27, 1996. Science Applications International Corporation (SAIC) and the Thomas Jefferson National Accelerator Facility (Jefferson Lab) jointly hosted CAP96, with financial support from the U.S. Department of Energy's Office of Energy Research and the Office of Naval Research. Topics ranged from descriptions of specific codes to advanced computing techniques and numerical methods. Update talks were presented on nearly all of the accelerator community's major electromagnetic and particle tracking codes. Among all papers, thirty of them are abstracted for the Energy Science and Technology database. (AIP)

  14. Guide to Using Sierra

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shaw, Ryan Phillip; Agelastos, Anthony Michael; Miller, Joel D.

    2015-03-01

    Sierra is an engineering mechanics simulation code suite supporting the Nation's Nuclear Weapons mission as well as other customers. It has explicit ties to Sandia National Labs' workflow, including geometry and meshing, design and optimization, and visualization. Distinguishing strengths include "application aware" development, scalability, SQA and V&V, multiple scales, and multi-physics coupling. This document is intended to help new and existing users of Sierra as a user manual and troubleshooting guide.

  15. An Empirical Model-based MOE for Friction Reduction by Slot-Ejected Polymer Solutions in an Aqueous Environment

    DTIC Science & Technology

    2007-12-21

    of hydrodynamics and the physical characteristics of the polymers. The physics models include both analytical models and numerical simulations ...the experimental observations. The numerical simulations also succeed in replicating some experimental measurements. However, there is still no...become quite significant. 4.5 Documentation The complete model is coded in MatLab . In the model, all units are cgs, so distances are in

  16. Guide to Using Sierra

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shaw, Ryan Phillip; Agelastos, Anthony Michael; Miller, Joel D.

    2017-04-01

    Sierra is an engineering mechanics simulation code suite supporting the Nation's Nuclear Weapons mission as well as other customers. It has explicit ties to Sandia National Labs' workflow, including geometry and meshing, design and optimization, and visualization. Distinguishing strengths include "application aware" development, scalability, SQA and V&V, multiple scales, and multi-physics coupling. This document is intended to help new and existing users of Sierra as a user manual and troubleshooting guide.

  17. 113. ARAI Hot cell (ARA626) Building wall sections and details ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    113. ARA-I Hot cell (ARA-626) Building wall sections and details of radio chemistry lab. Shows high-bay roof over hot cells and isolation rooms, below-grade storage pit for fuel elements. Norman Engineering Company: 961-area/SF-626-A-4. Date: January 1959. INEEL index code no. 068-0626-00-613-102724. - Idaho National Engineering Laboratory, Army Reactors Experimental Area, Scoville, Butte County, ID

  18. Unmanned Systems: A Lab Based Robotic Arm for Grasping Phase II

    DTIC Science & Technology

    2016-12-01

    Keywords: Leap Motion Controller, inverse kinematics, DH parameters. Recoverable abstract fragments: "...robotic actuator. Inverse kinematics and Denavit-Hartenberg (DH) parameters will be briefly explained." The position analysis uses the "inverse kinematic" method, which allows us to calculate the actuator's position in order to move the robot's end effector to a specific point in space.
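    For a planar two-link arm, the inverse-kinematics step described above has a well-known closed form. A sketch in Python (elbow-down solution; the unit link lengths are illustrative, not the thesis arm's actual DH parameters):

```python
import math

def two_link_ik(x, y, l1=1.0, l2=1.0):
    """Closed-form inverse kinematics for a planar two-link arm.

    Returns joint angles (theta1, theta2) that place the end effector
    at (x, y); elbow-down branch. Raises if the target is unreachable.
    """
    r2 = x * x + y * y
    c2 = (r2 - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)  # law of cosines
    if not -1.0 <= c2 <= 1.0:
        raise ValueError("target out of reach")
    theta2 = math.acos(c2)
    theta1 = math.atan2(y, x) - math.atan2(l2 * math.sin(theta2),
                                           l1 + l2 * math.cos(theta2))
    return theta1, theta2
```

A hand-tracking front end like the Leap Motion Controller would supply the target (x, y); the solver converts it to joint commands for the actuator.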

  19. CPMC-Lab: A MATLAB package for Constrained Path Monte Carlo calculations

    NASA Astrophysics Data System (ADS)

    Nguyen, Huy; Shi, Hao; Xu, Jie; Zhang, Shiwei

    2014-12-01

    We describe CPMC-Lab, a MATLAB program for the constrained-path and phaseless auxiliary-field Monte Carlo methods. These methods have allowed applications ranging from the study of strongly correlated models, such as the Hubbard model, to ab initio calculations in molecules and solids. The present package implements the full ground-state constrained-path Monte Carlo (CPMC) method in MATLAB with a graphical interface, using the Hubbard model as an example. The package can perform calculations in finite supercells in any dimensions, under periodic or twist boundary conditions. Importance sampling and all other algorithmic details of a total energy calculation are included and illustrated. This open-source tool allows users to experiment with various model and run parameters and visualize the results. It provides a direct and interactive environment to learn the method and study the code with minimal overhead for setup. Furthermore, the package can be easily generalized for auxiliary-field quantum Monte Carlo (AFQMC) calculations in many other models for correlated electron systems, and can serve as a template for developing a production code for AFQMC total energy calculations in real materials. Several illustrative studies are carried out in one- and two-dimensional lattices on total energy, kinetic energy, potential energy, and charge- and spin-gaps.
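A hypothetical companion calculation (not part of CPMC-Lab, which is written in MATLAB): the noninteracting, U = 0 limit of an open 1D Hubbard chain can be solved exactly by diagonalizing the one-body hopping matrix, and such limits are a standard sanity check for a CPMC total-energy run. A minimal Python sketch:

```python
import numpy as np

# Illustrative only: exact ground-state energy of a U = 0 Hubbard chain,
# obtained by filling the lowest single-particle levels of the hopping matrix.
def free_fermion_energy(n_sites, n_up, n_dn, t=1.0):
    """Exact ground-state kinetic energy of an open chain with hopping t."""
    h = np.zeros((n_sites, n_sites))
    for i in range(n_sites - 1):
        h[i, i + 1] = h[i + 1, i] = -t    # nearest-neighbor hopping
    levels = np.sort(np.linalg.eigvalsh(h))
    # Fill the lowest single-particle levels independently per spin species.
    return levels[:n_up].sum() + levels[:n_dn].sum()

e0 = free_fermion_energy(n_sites=4, n_up=2, n_dn=2)   # exactly -2*sqrt(5)
```

For an open 4-site chain the hopping eigenvalues are -2t cos(kπ/5), so the half-filled energy is -2√5, a closed form a Monte Carlo result can be compared against.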

  20. Teaching the Thrill of Discovery: Student Exploration of the Large-Scale Structures of the Universe

    NASA Astrophysics Data System (ADS)

    Juneau, Stephanie; Dey, Arjun; Walker, Constance E.; NOAO Data Lab

    2018-01-01

    In collaboration with the Teen Astronomy Cafes program, the NOAO Data Lab is developing online Jupyter Notebooks as a free and publicly accessible tool for students and teachers. Each interactive activity teaches students simultaneously about coding and astronomy with a focus on large datasets. Therefore, students learn state-of-the-art techniques at the intersection of astronomy and data science. During the activity entitled “Our Vast Universe”, students use real spectroscopic data to measure the distance to galaxies before moving on to a catalog with distances to over 100,000 galaxies. Exploring this dataset gives students an appreciation of the large number of galaxies in the universe (2 trillion!), and leads them to discover how galaxies are located in large and impressive filamentary structures. During the Teen Astronomy Cafes program, the notebook is supplemented with visual material conducive to discussion, and hands-on activities involving cubes representing model universes. These steps help build the students’ physical intuition and give them a better grasp of the concepts before using software and coding. At the end of the activity, students have made their own measurements, and have experienced scientific research directly. More information is available online for the Teen Astronomy Cafes (teensciencecafe.org/cafes) and the NOAO Data Lab (datalab.noao.edu).
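The distance measurement described above can be sketched in a few lines. This hypothetical Python snippet (not the Data Lab notebook itself; the emission line and Hubble constant are illustrative assumptions) converts an observed line wavelength into a redshift and then a small-z Hubble-law distance:

```python
# Illustrative values, not from the notebook.
C_KM_S = 299_792.458        # speed of light, km/s
H0 = 70.0                   # assumed Hubble constant, km/s/Mpc
HALPHA_REST_NM = 656.28     # rest-frame H-alpha wavelength, nm

def redshift(observed_nm, rest_nm=HALPHA_REST_NM):
    # Fractional shift of a known spectral line.
    return observed_nm / rest_nm - 1.0

def hubble_distance_mpc(z):
    # Small-z approximation: recession velocity ~ c * z.
    return C_KM_S * z / H0

z = redshift(669.4)          # H-alpha observed at 669.4 nm -> z ~ 0.02
d_mpc = hubble_distance_mpc(z)
```

Applied to a catalog of 100,000 such measurements, the resulting distances are what reveal the filamentary large-scale structure the activity explores.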

  1. Staphylococcus aureus undergoes major transcriptional reorganization during growth with Enterococcus faecalis in milk.

    PubMed

    Viçosa, Gabriela Nogueira; Botta, Cristian; Ferrocino, Ilario; Bertolino, Marta; Ventura, Marco; Nero, Luís Augusto; Cocolin, Luca

    2018-08-01

    Previous studies have demonstrated the antagonistic potential of lactic acid bacteria (LAB) present in raw milk microbiota against Staphylococcus aureus, although the molecular mechanisms underlying this inhibitory effect are not fully understood. In this study, we compared the behavior of S. aureus ATCC 29213 alone and in the presence of a cheese-isolated LAB strain, Enterococcus faecalis 41FL1, in skimmed milk at 30 °C for 24 h using phenotypical and molecular approaches. Phenotypic analysis showed the absence of classical staphylococcal enterotoxins in co-culture with a 1.2-log decrease in S. aureus final population compared to single culture. Transcriptional activity of several exotoxins and global regulators, including agr, was negatively impacted in co-culture, contrasting with the accumulation of transcripts coding for surface proteins. After 24 h, the number of transcripts coding for several metabolite responsive elements, as well as enzymes involved in glycolysis and acetoin metabolism, was increased in co-culture. The present study discusses the complexity of the transcriptomic mechanisms possibly leading to S. aureus attenuated virulence in the presence of E. faecalis and provides insights into this interspecies interaction in a simulated food context. Copyright © 2018 Elsevier Ltd. All rights reserved.

  2. 2016 CSSE L3 Milestone: Deliver In Situ to XTD End Users

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patchett, John M.; Nouanesengsy, Boonthanome; Fasel, Patricia Kroll

    This report summarizes the activities in FY16 toward satisfying the CSSE 2016 L3 milestone to deliver in situ to XTD end users of EAP codes. The milestone was accomplished with ongoing work to ensure the capability is maintained and developed. Two XTD end users used the in situ capability in Rage. A production ParaView capability was created in the HPC and desktop environment. Two new capabilities were added to ParaView in support of an EAP in situ workflow. We also worked with various support groups at the lab to deploy a production ParaView in the LANL environment for both desktop and HPC systems. In addition, for this milestone, we moved two VTK-based filters from research objects into the production ParaView code to support a variety of standard visualization pipelines for our EAP codes.

  3. Validity of International Classification of Diseases (ICD) coding for dengue infections in hospital discharge records in Malaysia.

    PubMed

    Woon, Yuan-Liang; Lee, Keng-Yee; Mohd Anuar, Siti Fatimah Zahra; Goh, Pik-Pin; Lim, Teck-Onn

    2018-04-20

    Hospitalization due to dengue illness is an important measure of dengue morbidity. However, few studies are based on administrative databases because the validity of the diagnosis codes is unknown. We validated the International Classification of Diseases, 10th revision (ICD) diagnosis coding for dengue infections in the Malaysian Ministry of Health's (MOH) hospital discharge database. This validation study involves retrospective review of available hospital discharge records and hand-searching of medical records for the years 2010 and 2013. We randomly selected 3219 hospital discharge records coded with dengue and non-dengue infections as their discharge diagnoses from the national hospital discharge database. We then randomly sampled 216 and 144 records for patients with and without codes for dengue, respectively, in keeping with their relative frequency in the MOH database, for chart review. The ICD codes for dengue were validated against a lab-based diagnostic standard (NS1 or IgM). The ICD-10-CM codes for dengue had a sensitivity of 94%, modest specificity of 83%, positive predictive value of 87%, and negative predictive value of 92%. These results were stable between 2010 and 2013. However, specificity decreased substantially when patients manifested with bleeding or low platelet count. The diagnostic performance of the ICD codes for dengue in the MOH's hospital discharge database is adequate for use in health services research on dengue.
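The four validity measures reported above all follow from a 2x2 confusion matrix of ICD coding against the lab-based standard. A minimal Python sketch with invented counts (not the study's data) shows the relationships:

```python
# Invented 2x2 counts for illustration only; not the study's data.
def diagnostic_metrics(tp, fp, fn, tn):
    return {
        "sensitivity": tp / (tp + fn),  # coded dengue among true dengue
        "specificity": tn / (tn + fp),  # uncoded among true non-dengue
        "ppv": tp / (tp + fp),          # true dengue among coded dengue
        "npv": tn / (tn + fn),          # true non-dengue among uncoded
    }

m = diagnostic_metrics(tp=94, fp=14, fn=6, tn=86)
```

Note that PPV and NPV, unlike sensitivity and specificity, depend on how common dengue is in the sampled records, which is why the study sampled in keeping with the relative frequency in the MOH database.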

  4. Results from the Veterans Health Administration ICD-10-CM/PCS Coding Pilot Study.

    PubMed

    Weems, Shelley; Heller, Pamela; Fenton, Susan H

    2015-01-01

    The Veterans Health Administration (VHA) of the US Department of Veterans Affairs has been preparing for the October 1, 2015, conversion to the International Classification of Diseases, Tenth Revision, Clinical Modification and Procedural Coding System (ICD-10-CM/PCS) for more than four years. The VHA's Office of Informatics and Analytics ICD-10 Program Management Office established an ICD-10 Learning Lab to explore expected operational challenges. This study was conducted to determine the effects of the classification system conversion on coding productivity. ICD codes are integral to VHA business processes and are used for purposes such as clinical studies, performance measurement, workload capture, cost determination, Veterans Equitable Resource Allocation (VERA) determination, morbidity and mortality classification, indexing of hospital records by disease and operations, data storage and retrieval, research purposes, and reimbursement. The data collection for this study occurred in multiple VHA sites across several months using standardized methods. It is commonly accepted that coding productivity will decrease with the implementation of ICD-10-CM/PCS. The findings of this study suggest that the decrease will be more significant for inpatient coding productivity (64.5 percent productivity decrease) than for ambulatory care coding productivity (6.7 percent productivity decrease). This study reveals the following important points regarding ICD-10-CM/PCS coding productivity: 1. Ambulatory care ICD-10-CM coding productivity is not expected to decrease as significantly as inpatient ICD-10-CM/PCS coding productivity. 2. Coder training and type of record (inpatient versus outpatient) affect coding productivity. 3. Inpatient coding productivity is decreased when a procedure requiring ICD-10-PCS coding is present. It is highly recommended that organizations perform their own analyses to determine the effects of ICD-10-CM/PCS implementation on coding productivity.

  5. Results from the Veterans Health Administration ICD-10-CM/PCS Coding Pilot Study

    PubMed Central

    Weems, Shelley; Heller, Pamela; Fenton, Susan H.

    2015-01-01

    The Veterans Health Administration (VHA) of the US Department of Veterans Affairs has been preparing for the October 1, 2015, conversion to the International Classification of Diseases, Tenth Revision, Clinical Modification and Procedural Coding System (ICD-10-CM/PCS) for more than four years. The VHA's Office of Informatics and Analytics ICD-10 Program Management Office established an ICD-10 Learning Lab to explore expected operational challenges. This study was conducted to determine the effects of the classification system conversion on coding productivity. ICD codes are integral to VHA business processes and are used for purposes such as clinical studies, performance measurement, workload capture, cost determination, Veterans Equitable Resource Allocation (VERA) determination, morbidity and mortality classification, indexing of hospital records by disease and operations, data storage and retrieval, research purposes, and reimbursement. The data collection for this study occurred in multiple VHA sites across several months using standardized methods. It is commonly accepted that coding productivity will decrease with the implementation of ICD-10-CM/PCS. The findings of this study suggest that the decrease will be more significant for inpatient coding productivity (64.5 percent productivity decrease) than for ambulatory care coding productivity (6.7 percent productivity decrease). This study reveals the following important points regarding ICD-10-CM/PCS coding productivity: 1. Ambulatory care ICD-10-CM coding productivity is not expected to decrease as significantly as inpatient ICD-10-CM/PCS coding productivity. 2. Coder training and type of record (inpatient versus outpatient) affect coding productivity. 3. Inpatient coding productivity is decreased when a procedure requiring ICD-10-PCS coding is present. It is highly recommended that organizations perform their own analyses to determine the effects of ICD-10-CM/PCS implementation on coding productivity. PMID:26396553

  6. Simulated combined abnormal environment fire calculations for aviation impacts.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Alexander L.

    2010-08-01

    Aircraft impacts at flight speeds are relevant environments for aircraft safety studies. This type of environment pertains to normal environments such as wildlife impacts and rough landings, but also the abnormal environment that has more recently been evidenced in cases such as the Pentagon and World Trade Center events of September 11, 2001, and the FBI building impact in Austin. For more severe impacts, the environment is combined because it involves not just the structural mechanics, but also the release of the fuel and the subsequent fire. Impacts normally last on the order of milliseconds to seconds, whereas the fire dynamics may last for minutes to hours, or longer. This presents a serious challenge for physical models that employ discrete time stepping to model the dynamics with accuracy. Another challenge is that the capabilities to model the fire and structural impact are seldom found in a common simulation tool. Sandia National Labs maintains two codes under a common architecture that have been used to model the dynamics of aircraft impact and fire scenarios. Only recently have these codes been coupled directly to provide a fire prediction that is better informed on the basis of a detailed structural calculation. To enable this technology, several facilitating models are necessary, as is a methodology for determining and executing the transfer of information from the structural code to the fire code. A methodology has been developed and implemented. Previous test programs at the Sandia National Labs sled track provide unique data for the dynamic response of an aluminum tank of liquid water impacting a barricade at flight speeds. These data are used to validate the modeling effort, and suggest reasonable accuracy for the dispersion of a non-combustible fluid in an impact environment. The capability is also demonstrated with a notional impact of a fuel-filled container at flight speed. Both of these scenarios are used to evaluate numeric approximations, and help provide an understanding of the quantitative accuracy of the modeling methods.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Three side-by-side lab houses were built, instrumented and monitored in an effort to determine through field testing and analysis the relative contributions of select technologies toward reducing energy use in new manufactured homes. The lab houses in Russellville, Alabama compared the performance of three homes built to varying levels of thermal integrity and HVAC equipment: a baseline HUD-code home equipped with an electric furnace and a split system air conditioner; an ENERGY STAR manufactured home with an enhanced thermal envelope and traditional split system heat pump; and a house designed to qualify for Zero Energy Ready Home designation with a ductless mini-split heat pump with transfer fan distribution system in place of the traditional duct system for distribution. Experiments were conducted in the lab houses to evaluate impact on energy and comfort of interior door position, window blind position and transfer fan operation. The report describes results of tracer gas and co-heating tests and presents calculation of the heat pump coefficient of performance for both the traditional heat pump and the ductless mini-split. A series of calibrated energy models was developed based on measured data and run in three locations in the Southeast to compare annual energy usage of the three homes.
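The heat pump coefficient of performance mentioned above is, at its simplest, heat delivered divided by electrical energy consumed over the same period. A minimal sketch with invented numbers (not the report's measured values):

```python
# Invented energies for illustration; the report derives its COP from
# co-heating test measurements.
def cop(heat_delivered_kwh, electric_input_kwh):
    """Coefficient of performance: useful heat out per unit electricity in."""
    return heat_delivered_kwh / electric_input_kwh

traditional_cop = cop(heat_delivered_kwh=30.0, electric_input_kwh=12.0)  # 2.5
mini_split_cop = cop(heat_delivered_kwh=30.0, electric_input_kwh=9.0)
```

Comparing the two ratios under identical delivered heat is one way to express the efficiency advantage of the ductless unit, though field COP depends strongly on outdoor conditions and distribution losses.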

  8. [A quick algorithm of dynamic spectrum photoelectric pulse wave detection based on LabVIEW].

    PubMed

    Lin, Ling; Li, Na; Li, Gang

    2010-02-01

    Dynamic spectrum (DS) detection is attractive among the numerous noninvasive blood component detection methods because it eliminates the main interference of individual discrepancy and measurement conditions. DS is a kind of spectrum extracted from the photoelectric pulse wave and closely related to the arterial blood; it can be used in noninvasive examination of blood component concentrations. The key issues in DS detection are high detection precision and high operation speed. Measurement precision can be improved by applying over-sampling and lock-in amplification to the pick-up of the photoelectric pulse wave. In the present paper, the theoretical expression for the over-sampling and lock-in amplification method is first deduced. Then, to overcome the large data volume and heavy computation this technique entails, a quick algorithm based on LabVIEW and a method of using external C code in the pick-up of the photoelectric pulse wave are presented. Experimental verification was conducted in the LabVIEW environment. The results show that with the presented method, the operation speed is greatly increased and the memory required is largely reduced.
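The lock-in amplification step can be illustrated in a few lines. This Python sketch shows only the principle (the paper's actual implementation is LabVIEW with external C code): the oversampled signal is correlated with reference sine and cosine waves at the modulation frequency, recovering the amplitude at that frequency while rejecting uncorrelated noise:

```python
import math

# Principle-only sketch; not the authors' LabVIEW/C implementation.
def lock_in_amplitude(samples, f_ref, f_samp):
    """Recover the amplitude of the component of `samples` at f_ref Hz."""
    n = len(samples)
    x = sum(s * math.cos(2 * math.pi * f_ref * i / f_samp)
            for i, s in enumerate(samples)) / n
    y = sum(s * math.sin(2 * math.pi * f_ref * i / f_samp)
            for i, s in enumerate(samples)) / n
    # Factor 2: sin^2 and cos^2 average to 1/2 over whole periods.
    return 2.0 * math.hypot(x, y)

# A unit-amplitude 5 Hz tone sampled at 1 kHz for 1 s is recovered exactly.
sig = [math.sin(2 * math.pi * 5 * i / 1000) for i in range(1000)]
amp = lock_in_amplitude(sig, f_ref=5, f_samp=1000)
```

The per-sample multiply-accumulate structure also makes clear why a naive implementation over oversampled data is computation-heavy, which is the cost the paper's quick algorithm targets.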

  9. Naval Research Lab Review 1999

    DTIC Science & Technology

    1999-01-01

    ...Center offers high-quality output from computer-generated files in EPS, PostScript, PICT, TIFF, Photoshop, and PowerPoint. ... Further information on the research described in this NRL Review can be obtained from the Public Affairs Office, Code 1230, (202) 767-2541. Information concerning Technology

  10. Progress report on Nuclear Density project with Lawrence Livermore National Lab Year 2010

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, C W; Krastev, P; Ormand, W E

    2011-03-11

    The main goal for year 2010 was to improve parallelization of the configuration interaction code BIGSTICK, co-written by W. Erich Ormand (LLNL) and Calvin W. Johnson (SDSU), with the parallelization carried out primarily by Plamen Krastev, a postdoc at SDSU funded in part by this grant. The central computational algorithm is the Lanczos algorithm, which consists of a matrix-vector multiplication (matvec), followed by a Gram-Schmidt reorthogonalization.
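The Lanczos kernel described above, one matvec followed by Gram-Schmidt reorthogonalization per iteration, can be sketched with a dense toy matrix (an illustrative NumPy stand-in, not BIGSTICK's distributed Fortran code):

```python
import numpy as np

# Toy dense Lanczos; the lowest eigenvalue of the small tridiagonal matrix T
# converges to the ground-state energy of the Hermitian matrix A.
def lanczos_lowest(A, n_iter, seed=0):
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(A.shape[0])
    v /= np.linalg.norm(v)
    basis, alphas, betas = [v], [], []
    for _ in range(n_iter):
        w = A @ basis[-1]                 # matvec: the dominant cost
        alphas.append(basis[-1] @ w)
        for q in basis:                   # full Gram-Schmidt reorthogonalization
            w -= (q @ w) * q
        beta = np.linalg.norm(w)
        if beta < 1e-12:                  # Krylov space exhausted
            break
        betas.append(beta)
        basis.append(w / beta)
    k = len(alphas)
    T = np.diag(alphas) + np.diag(betas[:k - 1], 1) + np.diag(betas[:k - 1], -1)
    return np.linalg.eigvalsh(T)[0]

# Toy Hermitian "Hamiltonian" with known lowest eigenvalue 1.0.
e_min = lanczos_lowest(np.diag(np.arange(1.0, 21.0)), n_iter=20)
```

In a production nuclear-structure code the matvec is applied to a vector distributed across many nodes and the matrix is never stored densely, which is precisely what makes the parallelization work nontrivial.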

  11. OSSE Observations of 3C 273

    DTIC Science & Technology

    1995-01-01

    SN 1991T, which was discovered on 1991 April 13 (Waagen & Knight 1991) in the spiral galaxy NGC 4527, on the edge of the Virgo cluster. The OSSE observing... Strickman, E. O. Hulburt Center for Space Research, Code 7650, Naval Research Lab, Washington DC 20375; K. McNaron-Brown, George Mason University, Fairfax, VA 22030; E. Jourdain, Centre d'Etude Spatiale des Rayonnements, Toulouse, France; G. V. Jung, Universities Space Research Association, Washington DC

  12. The Concept of Fit in Contingency Theory.

    DTIC Science & Technology

    1984-11-01


  13. LabKey Server: an open source platform for scientific data integration, analysis and collaboration.

    PubMed

    Nelson, Elizabeth K; Piehler, Britt; Eckels, Josh; Rauch, Adam; Bellew, Matthew; Hussey, Peter; Ramsay, Sarah; Nathe, Cory; Lum, Karl; Krouse, Kevin; Stearns, David; Connolly, Brian; Skillman, Tom; Igra, Mark

    2011-03-09

    Broad-based collaborations are becoming increasingly common among disease researchers. For example, the Global HIV Enterprise has united cross-disciplinary consortia to speed progress towards HIV vaccines through coordinated research across the boundaries of institutions, continents and specialties. New, end-to-end software tools for data and specimen management are necessary to achieve the ambitious goals of such alliances. These tools must enable researchers to organize and integrate heterogeneous data early in the discovery process, standardize processes, gain new insights into pooled data and collaborate securely. To meet these needs, we enhanced the LabKey Server platform, formerly known as CPAS. This freely available, open source software is maintained by professional engineers who use commercially proven practices for software development and maintenance. Recent enhancements support: (i) Submitting specimens requests across collaborating organizations (ii) Graphically defining new experimental data types, metadata and wizards for data collection (iii) Transitioning experimental results from a multiplicity of spreadsheets to custom tables in a shared database (iv) Securely organizing, integrating, analyzing, visualizing and sharing diverse data types, from clinical records to specimens to complex assays (v) Interacting dynamically with external data sources (vi) Tracking study participants and cohorts over time (vii) Developing custom interfaces using client libraries (viii) Authoring custom visualizations in a built-in R scripting environment. Diverse research organizations have adopted and adapted LabKey Server, including consortia within the Global HIV Enterprise. Atlas is an installation of LabKey Server that has been tailored to serve these consortia. It is in production use and demonstrates the core capabilities of LabKey Server. Atlas now has over 2,800 active user accounts originating from approximately 36 countries and 350 organizations. 
It tracks roughly 27,000 assay runs, 860,000 specimen vials and 1,300,000 vial transfers. Sharing data, analysis tools and infrastructure can speed the efforts of large research consortia by enhancing efficiency and enabling new insights. The Atlas installation of LabKey Server demonstrates the utility of the LabKey platform for collaborative research. Stable, supported builds of LabKey Server are freely available for download at http://www.labkey.org. Documentation and source code are available under the Apache License 2.0.

  14. LabKey Server: An open source platform for scientific data integration, analysis and collaboration

    PubMed Central

    2011-01-01

    Background Broad-based collaborations are becoming increasingly common among disease researchers. For example, the Global HIV Enterprise has united cross-disciplinary consortia to speed progress towards HIV vaccines through coordinated research across the boundaries of institutions, continents and specialties. New, end-to-end software tools for data and specimen management are necessary to achieve the ambitious goals of such alliances. These tools must enable researchers to organize and integrate heterogeneous data early in the discovery process, standardize processes, gain new insights into pooled data and collaborate securely. Results To meet these needs, we enhanced the LabKey Server platform, formerly known as CPAS. This freely available, open source software is maintained by professional engineers who use commercially proven practices for software development and maintenance. Recent enhancements support: (i) Submitting specimens requests across collaborating organizations (ii) Graphically defining new experimental data types, metadata and wizards for data collection (iii) Transitioning experimental results from a multiplicity of spreadsheets to custom tables in a shared database (iv) Securely organizing, integrating, analyzing, visualizing and sharing diverse data types, from clinical records to specimens to complex assays (v) Interacting dynamically with external data sources (vi) Tracking study participants and cohorts over time (vii) Developing custom interfaces using client libraries (viii) Authoring custom visualizations in a built-in R scripting environment. Diverse research organizations have adopted and adapted LabKey Server, including consortia within the Global HIV Enterprise. Atlas is an installation of LabKey Server that has been tailored to serve these consortia. It is in production use and demonstrates the core capabilities of LabKey Server. 
Atlas now has over 2,800 active user accounts originating from approximately 36 countries and 350 organizations. It tracks roughly 27,000 assay runs, 860,000 specimen vials and 1,300,000 vial transfers. Conclusions Sharing data, analysis tools and infrastructure can speed the efforts of large research consortia by enhancing efficiency and enabling new insights. The Atlas installation of LabKey Server demonstrates the utility of the LabKey platform for collaborative research. Stable, supported builds of LabKey Server are freely available for download at http://www.labkey.org. Documentation and source code are available under the Apache License 2.0. PMID:21385461

  15. Next generation sequencing yields the complete mitochondrial genome of the Endangered Chilean silverside Basilichthys microlepidotus (Jenyns, 1841) (Teleostei, Atherinopsidae), validated with RNA-seq.

    PubMed

    Véliz, David; Vega-Retter, Caren; Quezada-Romegialli, Claudio

    2016-01-01

    The complete sequence of the mitochondrial genome for the Chilean silverside Basilichthys microlepidotus is reported for the first time. The entire mitochondrial genome was 16,544 bp in length (GenBank accession no. KM245937); gene composition and arrangement conformed to those reported for most fishes and contained the typical structure of 2 rRNAs, 13 protein-coding genes, 22 tRNAs, and a non-coding region. The assembled mitogenome was validated against sequences of COI and the control region previously sequenced in our lab, functional genes from RNA-Seq data for the same species, and the mitogenomes of two other atherinopsid species available in GenBank.

  16. Towards an Integrated QR Code Biosensor: Light-Driven Sample Acquisition and Bacterial Cellulose Paper Substrate.

    PubMed

    Yuan, Mingquan; Jiang, Qisheng; Liu, Keng-Ku; Singamaneni, Srikanth; Chakrabartty, Shantanu

    2018-06-01

    This paper addresses two key challenges toward an integrated forward error-correcting biosensor based on our previously reported self-assembled quick-response (QR) code. The first challenge involves the choice of the paper substrate for printing and self-assembling the QR code. We compared four different substrates: regular printing paper, Whatman filter paper, nitrocellulose membrane, and lab-synthesized bacterial cellulose. We report that, of the four substrates, bacterial cellulose outperforms the others in terms of probe (gold nanorod) and ink retention capability. The second challenge involves remote activation of the analyte sampling and the QR code self-assembly process. In this paper, we use light as a trigger signal and a graphite layer as a light-absorbing material. The resulting change in temperature due to infrared absorption leads to a temperature gradient that exerts a diffusive force driving the analyte toward the regions of self-assembly. The working principle is verified using assembled biosensor prototypes, where we demonstrate a higher sample flow rate due to light-induced thermal gradients.

  17. An open-source LabVIEW application toolkit for phasic heart rate analysis in psychophysiological research.

    PubMed

    Duley, Aaron R; Janelle, Christopher M; Coombes, Stephen A

    2004-11-01

    The cardiovascular system has been extensively measured in a variety of research and clinical domains. Despite technological and methodological advances in cardiovascular science, the analysis and evaluation of phasic changes in heart rate persists as a way to assess numerous psychological concomitants. Some researchers, however, have pointed to constraints on data analysis when evaluating cardiac activity indexed by heart rate or heart period. Thus, an off-line application toolkit for heart rate analysis is presented. The program, written with National Instruments' LabVIEW, incorporates a variety of tools for off-line extraction and analysis of heart rate data. Current methods and issues concerning heart rate analysis are highlighted, and how the toolkit provides a flexible environment to ameliorate common problems that typically lead to trial rejection is discussed. Source code for this program may be downloaded from the Psychonomic Society Web archive at www.psychonomic.org/archive/.
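The basic quantity such toolkits operate on can be shown simply. This hypothetical Python snippet (not the LabVIEW toolkit's code) converts a series of inter-beat intervals to instantaneous heart rate, the series whose phasic, trial-by-trial changes are then analyzed:

```python
# Illustration only; the toolkit itself is written in LabVIEW.
def ibi_to_bpm(ibis_ms):
    """Convert inter-beat (R-R) intervals in ms to instantaneous HR in bpm."""
    return [60_000.0 / ibi for ibi in ibis_ms]

rates = ibi_to_bpm([1000, 800, 750])   # -> [60.0, 75.0, 80.0]
```

The choice between analyzing heart rate (bpm) and heart period (ms) is one of the data-analysis constraints the paper discusses, since the reciprocal transform is nonlinear and the two measures can yield different phasic results.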

  18. The Updated AGU Ethics Policy: Supporting Inclusive and Diverse Field and Lab Environments within the Geosciences.

    NASA Astrophysics Data System (ADS)

    Williams, B. M.; McPhaden, M. J.; Gundersen, L. C.

    2017-12-01

    The American Geophysical Union (AGU), a scientific society of >60,000 members worldwide, has established a set of scientific integrity and professional ethics guidelines for the actions of its members, for the governance of the union in its internal activities, and for the operations of and participation in its publications and scientific meetings. More recently, AGU has undertaken actions to help address the issue of harassment in the sciences and other work-climate issues; applied more broadly as a code of standard behavior, these guidelines will also help address related issues of diversity and inclusion. This presentation will highlight the proposed policy changes and additional resources now in place, as they apply to field and lab environments. Progress to date and remaining challenges of this effort will be discussed, including AGU's work to provide additional program strength in the areas of Ethics, Diversity and Inclusion.

  19. Computer Labs | College of Engineering & Applied Science

    Science.gov Websites


  20. Research Labs | College of Engineering & Applied Science

    Science.gov Websites


  1. Modeling Item Responses When Different Subjects Employ Different Solution Strategies.

    DTIC Science & Technology

    1987-10-01


  2. Modular Unix (Trade Name)-Based Vulnerability Estimation Suite (MUVES) Analyst's Guide

    DTIC Science & Technology

    1991-12-01


  3. Letter to the editor : Impartial review is key.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crabtree, G. W.; Materials Science Division

    The News Feature, 'Misconduct in physics: Time to wise up?' [Nature 418, 120-121; 2002], raises important issues that the physical-science community must face. Argonne National Laboratory's code of ethics calls for a response very similar to that of Bell Labs, namely: 'The Laboratory director may appoint an ad-hoc scientific review committee to investigate internal or external charges of scientific misconduct, fraud, falsification of data, misinterpretation of data, or other activities involving scientific or technical matters.'

  4. Terascale spectral element algorithms and implementations.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fischer, P. F.; Tufo, H. M.

    1999-08-17

    We describe the development and implementation of an efficient spectral element code for multimillion gridpoint simulations of incompressible flows in general two- and three-dimensional domains. We review basic and recently developed algorithmic underpinnings that have resulted in good parallel and vector performance on a broad range of architectures, including the terascale computing systems now coming online at the DOE labs. Sustained performance of 219 GFLOPS has recently been achieved on 2048 nodes of the Intel ASCI-Red machine at Sandia.

  5. HMPT: Hazardous Waste Transportation Live 27928, Test 27929

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simpson, Lewis Edward

    2016-03-17

    HMPT: Hazardous Waste Transportation (Live 27928, suggested one time and associated Test 27929, required initially and every 36 months) addresses the Department of Transportation (DOT) function-specific training requirements of the hazardous materials packagings and transportation (HMPT) Los Alamos National Laboratory (LANL) lab-wide training. This course addresses the requirements of the DOT that are unique to hazardous waste shipments. Appendix B provides the Title 40 Code of Federal Regulations (CFR) reference material needed for this course.

  6. Open source software to control Bioflo bioreactors.

    PubMed

    Burdge, David A; Libourel, Igor G L

    2014-01-01

    Bioreactors are designed to support highly controlled environments for growth of tissues, cell cultures or microbial cultures. A variety of bioreactors are commercially available, often including sophisticated software to enhance the functionality of the bioreactor. However, experiments that the bioreactor hardware can support, but that were not envisioned during the software design, cannot be performed without developing custom software. In addition, support for third party or custom designed auxiliary hardware is often sparse or absent. This work presents flexible open source freeware for the control of bioreactors of the Bioflo product family. The functionality of the software includes setpoint control, data logging, and protocol execution. Auxiliary hardware can be easily integrated and controlled through an integrated plugin interface without altering existing software. Simple experimental protocols can be entered as a CSV scripting file, and a Python-based protocol execution model is included for more demanding conditional experimental control. The software was designed to be a more flexible and free open source alternative to the commercially available solution. The source code and various auxiliary hardware plugins are publicly available for download from https://github.com/LibourelLab/BiofloSoftware. In addition to the source code, the software was compiled and packaged as a self-installing file for 32- and 64-bit Windows operating systems. The compiled software will be able to control a Bioflo system, and will not require the installation of LabVIEW.
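
A minimal sketch of how a CSV-scripted protocol like the one described might be represented and parsed. The column layout (time_s, parameter, setpoint) is an assumption for illustration; the actual Bioflo software's file format may differ.

```python
import csv
import io

def parse_protocol(csv_text):
    """Parse a CSV protocol into chronologically ordered setpoint steps."""
    reader = csv.DictReader(io.StringIO(csv_text))
    steps = [(float(r["time_s"]), r["parameter"], float(r["setpoint"]))
             for r in reader]
    steps.sort(key=lambda s: s[0])  # execute steps in time order
    return steps

# Hypothetical protocol: set agitation and temperature at t=0,
# then lower the temperature after one hour.
example = """time_s,parameter,setpoint
0,agitation_rpm,200
3600,temperature_C,30
0,temperature_C,37
"""
protocol = parse_protocol(example)
```

A scheduler loop would then walk this list, applying each setpoint when its time arrives; conditional logic beyond simple timed setpoints is what the Python-based execution model mentioned above would handle.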

  7. Open Source Software to Control Bioflo Bioreactors

    PubMed Central

    Burdge, David A.; Libourel, Igor G. L.

    2014-01-01

    Bioreactors are designed to support highly controlled environments for growth of tissues, cell cultures or microbial cultures. A variety of bioreactors are commercially available, often including sophisticated software to enhance the functionality of the bioreactor. However, experiments that the bioreactor hardware can support, but that were not envisioned during the software design, cannot be performed without developing custom software. In addition, support for third party or custom designed auxiliary hardware is often sparse or absent. This work presents flexible open source freeware for the control of bioreactors of the Bioflo product family. The functionality of the software includes setpoint control, data logging, and protocol execution. Auxiliary hardware can be easily integrated and controlled through an integrated plugin interface without altering existing software. Simple experimental protocols can be entered as a CSV scripting file, and a Python-based protocol execution model is included for more demanding conditional experimental control. The software was designed to be a more flexible and free open source alternative to the commercially available solution. The source code and various auxiliary hardware plugins are publicly available for download from https://github.com/LibourelLab/BiofloSoftware. In addition to the source code, the software was compiled and packaged as a self-installing file for 32- and 64-bit Windows operating systems. The compiled software will be able to control a Bioflo system, and will not require the installation of LabVIEW. PMID:24667828

  8. Publicly Releasing a Large Simulation Dataset with NDS Labs

    NASA Astrophysics Data System (ADS)

    Goldbaum, Nathan

    2016-03-01

    Optimally, all publicly funded research should be accompanied by the tools, code, and data necessary to fully reproduce the analysis performed in journal articles describing the research. This ideal can be difficult to attain, particularly when dealing with large (>10 TB) simulation datasets. In this lightning talk, we describe the process of publicly releasing a large simulation dataset to accompany the submission of a journal article. The simulation was performed using Enzo, an open source, community-developed N-body/hydrodynamics code and was analyzed using a wide range of community-developed tools in the scientific Python ecosystem. Although the simulation was performed and analyzed using an ecosystem of sustainably developed tools, we enable sustainable science using our data by making it publicly available. Combining the data release with the NDS Labs infrastructure allows a substantial amount of added value, including web-based access to analysis and visualization using the yt analysis package through an IPython notebook interface. In addition, we are able to accompany the paper submission to the arXiv preprint server with links to the raw simulation data as well as interactive real-time data visualizations that readers can explore on their own or share with colleagues during journal club discussions. It is our hope that the value added by these services will substantially increase the impact and readership of the paper.

  9. Track-A-Worm, An Open-Source System for Quantitative Assessment of C. elegans Locomotory and Bending Behavior

    PubMed Central

    Wang, Sijie Jason; Wang, Zhao-Wen

    2013-01-01

    A major challenge of neuroscience is to understand the circuit and gene bases of behavior. C. elegans is commonly used as a model system to investigate how various gene products function at specific tissue, cellular, and synaptic foci to produce complicated locomotory and bending behavior. The investigation generally requires quantitative behavioral analyses using an automated single-worm tracker, which constantly records and analyzes the position and body shape of a freely moving worm at a high magnification. Many single-worm trackers have been developed to meet lab-specific needs, but none has been widely implemented for various reasons, such as hardware that is difficult to assemble, or software lacking sufficient functionality, having closed source code, or using a programming language that is not broadly accessible. The lack of a versatile system convenient for wide implementation makes data comparisons difficult and compels other labs to develop new worm trackers. Here we describe Track-A-Worm, a system rich in functionality, open in source code, and easy to use. The system includes plug-and-play hardware (a stereomicroscope, a digital camera and a motorized stage), custom software written to run with Matlab in Windows 7, and a detailed user manual. Grayscale images are automatically converted to binary images followed by head identification and placement of 13 markers along a deduced spline. The software can extract and quantify a variety of parameters, including distance traveled, average speed, distance/time/speed of forward and backward locomotion, frequency and amplitude of dominant bends, overall bending activities measured as root mean square, and sum of all bends. It also plots worm travel path, bend trace, and bend frequency spectrum. All functionality is performed through graphical user interfaces and data is exported to clearly-annotated and documented Excel files. These features make Track-A-Worm a good candidate for implementation in other labs. PMID:23922769
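
The bend-angle bookkeeping described above can be sketched in a few lines: given the (x, y) markers placed along the deduced spline, compute the signed bend at each interior marker and the frame's root-mean-square bending activity. This is an illustration of the idea, not the Track-A-Worm source (which is Matlab).

```python
import math

def bend_angles(markers):
    """Signed angle (degrees) between successive segments at each interior marker."""
    angles = []
    for i in range(1, len(markers) - 1):
        (x0, y0), (x1, y1), (x2, y2) = markers[i - 1], markers[i], markers[i + 1]
        d = math.degrees(math.atan2(y2 - y1, x2 - x1) - math.atan2(y1 - y0, x1 - x0))
        angles.append((d + 180.0) % 360.0 - 180.0)  # wrap into [-180, 180)
    return angles

def rms_bend(angles):
    """Root mean square of the bend angles: one overall activity scalar per frame."""
    return math.sqrt(sum(a * a for a in angles) / len(angles))
```

With 13 markers this yields 11 interior bend angles per frame; tracking those over time gives the bend trace and frequency spectrum the abstract mentions.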

  10. IDENTIFYING GENETIC ASSOCIATIONS WITH VARIABILITY IN METABOLIC HEALTH AND BLOOD COUNT LABORATORY VALUES: DIVING INTO THE QUANTITATIVE TRAITS BY LEVERAGING LONGITUDINAL DATA FROM AN EHR.

    PubMed

    Verma, Shefali S; Lucas, Anastasia M; Lavage, Daniel R; Leader, Joseph B; Metpally, Raghu; Krishnamurthy, Sarathbabu; Dewey, Frederick; Borecki, Ingrid; Lopez, Alexander; Overton, John; Penn, John; Reid, Jeffrey; Pendergrass, Sarah A; Breitwieser, Gerda; Ritchie, Marylyn D

    2017-01-01

    A wide range of patient health data is recorded in Electronic Health Records (EHR). This data includes diagnoses, surgical procedures, clinical laboratory measurements, and medication information. Together this information reflects the patient's medical history. Many studies have efficiently used this data from the EHR to find associations that are clinically relevant, either by utilizing International Classification of Diseases, version 9 (ICD-9) codes or laboratory measurements, or by designing phenotype algorithms to extract case and control status with accuracy from the EHR. Here we developed a strategy to utilize longitudinal quantitative trait data from the EHR at Geisinger Health System, focusing on outpatient metabolic and complete blood panel data as a starting point. The Comprehensive Metabolic Panel (CMP) as well as Complete Blood Counts (CBC) are part of routine care and provide a comprehensive, high-level screening picture of patients' overall health and disease. We randomly split our data into two datasets to allow for discovery and replication. We first conducted a genome-wide association study (GWAS) with median values of 25 different clinical laboratory measurements to identify variants from Human Omni Express Exome beadchip data that are associated with these measurements. We identified 687 variants that associated and replicated with the tested clinical measurements at p < 5×10^-8. Since longitudinal data from the EHR provides a record of a patient's medical history, we utilized this information to further investigate the ICD-9 codes that might be associated with differences in variability of the measurements in the longitudinal dataset. We identified low- and high-variance patients by looking at changes within their individual longitudinal EHR laboratory results for each of the 25 clinical lab values (thus creating 50 groups: a high-variance and a low-variance group for each lab variable). We then performed a PheWAS analysis with ICD-9 diagnosis codes, separately in the high-variance group and the low-variance group for each lab variable. We found 717 PheWAS associations that replicated at a p-value less than 0.001. Next, we evaluated the results of this study by comparing the association results between the high- and low-variance groups. For example, we found 39 SNPs (in multiple genes) associated with ICD-9 250.01 (Type-I diabetes) in patients with high variance of plasma glucose levels, but not in patients with low variance in plasma glucose levels. Another example is the association of 4 SNPs in UMOD with chronic kidney disease in patients with high variance for aspartate aminotransferase (discovery p-value: 8.71×10^-9; replication p-value: 2.03×10^-6). In general, we see a pattern of many more statistically significant associations from patients with high variance in the quantitative lab variables, in comparison with the low-variance group, across all of the 25 laboratory measurements. This study is one of the first of its kind to utilize quantitative trait variance from longitudinal laboratory data to find associations among genetic variants and clinical phenotypes obtained from an EHR, integrating laboratory values and diagnosis codes to understand the genetic complexities of common diseases.
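
The grouping step described above can be sketched as: compute each patient's variance across their longitudinal results for one lab value, then split patients at the median variance into high- and low-variance groups. This is a simplified illustration; the study's actual windowing and exclusion rules are not reproduced here, and the patient data below is invented.

```python
import statistics

def variance_groups(patient_series):
    """Split patients into high/low variance groups at the median per-patient variance."""
    variances = {pid: statistics.pvariance(vals)
                 for pid, vals in patient_series.items() if len(vals) >= 2}
    cutoff = statistics.median(variances.values())
    high = {pid for pid, v in variances.items() if v > cutoff}
    low = {pid for pid, v in variances.items() if v <= cutoff}
    return high, low

# Hypothetical longitudinal plasma glucose results per patient.
glucose = {
    "A": [100, 100, 100],   # flat series -> low variance
    "B": [80, 140, 200],    # swinging series -> high variance
    "C": [95, 105],
    "D": [50, 250],
    "E": [110],             # a single result cannot show variability
}
high, low = variance_groups(glucose)
```

Each group is then taken forward separately into the PheWAS against ICD-9 diagnosis codes.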

  11. Mean Flow and Noise Prediction for a Separate Flow Jet With Chevron Mixers

    NASA Technical Reports Server (NTRS)

    Koch, L. Danielle; Bridges, James; Khavaran, Abbas

    2004-01-01

    Experimental and numerical results are presented here for a separate flow nozzle employing chevrons arranged in an alternating pattern on the core nozzle. Comparisons of these results demonstrate that the combination of the WIND/MGBK suite of codes can predict the noise reduction trends measured between separate flow jets with and without chevrons on the core nozzle. Mean flow predictions were validated against Particle Image Velocimetry (PIV), pressure, and temperature data, and noise predictions were validated against acoustic measurements recorded in the NASA Glenn Aeroacoustic Propulsion Lab. Comparisons are also made to results from the CRAFT code. The work presented here is part of an on-going assessment of the WIND/MGBK suite for use in designing the next generation of quiet nozzles for turbofan engines.

  12. Insertion device calculations with mathematica

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carr, R.; Lidia, S.

    1995-02-01

    The design of accelerator insertion devices such as wigglers and undulators has usually been aided by numerical modeling on digital computers, using code in high-level languages like Fortran. In the present era, there are higher-level programming environments like IDL®, MatLab®, and Mathematica® in which these calculations may be performed by writing much less code, and in which standard mathematical techniques are very easily used. The authors present a suite of standard insertion device modeling routines in Mathematica to illustrate the new techniques. These routines include a simple way to generate magnetic fields using blocks of CSEM materials, trajectory solutions from the Lorentz force equations for given magnetic fields, Bessel function calculations of radiation for wigglers and undulators, and general radiation calculations for undulators.
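
The kind of trajectory calculation these routines perform can be sketched compactly: integrate the transverse Lorentz-force equation for an electron in an idealized planar undulator field B_y = B0·cos(ku·z), in the small-angle approximation vz ≈ c. This is a generic illustration (here in Python rather than Mathematica), and the parameter values are purely illustrative.

```python
import math

C_LIGHT = 2.99792458e8      # speed of light, m/s
E_CHARGE = 1.602176634e-19  # elementary charge, C
M_E = 9.1093837015e-31      # electron mass, kg

def max_deflection_angle(B0, period, gamma, n_steps=20000):
    """Euler-integrate dvx = -(e/(gamma*m)) * B_y(z) * dz over one undulator period.

    (dvx/dt = -(e/(gamma*m)) * vz * B_y with dz = vz*dt, so the vz factors cancel.)
    Returns the maximum deflection angle |vx|/c along the trajectory.
    """
    ku = 2.0 * math.pi / period
    dz = period / n_steps
    vx, max_angle = 0.0, 0.0
    for i in range(n_steps):
        z = i * dz
        vx -= (E_CHARGE / (gamma * M_E)) * B0 * math.cos(ku * z) * dz
        max_angle = max(max_angle, abs(vx) / C_LIGHT)
    return max_angle
```

A standard sanity check: the analytic maximum deflection angle is K/γ, with deflection parameter K = eB0/(mₑ·c·ku), which the numerical result should reproduce closely.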

  13. GenomeDiagram: a python package for the visualization of large-scale genomic data.

    PubMed

    Pritchard, Leighton; White, Jennifer A; Birch, Paul R J; Toth, Ian K

    2006-03-01

    We present GenomeDiagram, a flexible, open-source Python module for the visualization of large-scale genomic, comparative genomic and other data with reference to a single chromosome or other biological sequence. GenomeDiagram may be used to generate publication-quality vector graphics, rastered images and in-line streamed graphics for webpages. The package integrates with datatypes from the BioPython project, and is available for Windows, Linux and Mac OS X systems. GenomeDiagram is freely available as source code (under GNU Public License) at http://bioinf.scri.ac.uk/lp/programs.html, and requires Python 2.3 or higher, and recent versions of the ReportLab and BioPython packages. A user manual, example code and images are available at http://bioinf.scri.ac.uk/lp/programs.html.

  14. LOINC, a universal standard for identifying laboratory observations: a 5-year update.

    PubMed

    McDonald, Clement J; Huff, Stanley M; Suico, Jeffrey G; Hill, Gilbert; Leavelle, Dennis; Aller, Raymond; Forrey, Arden; Mercer, Kathy; DeMoor, Georges; Hook, John; Williams, Warren; Case, James; Maloney, Pat

    2003-04-01

    The Logical Observation Identifier Names and Codes (LOINC) database provides a universal code system for reporting laboratory and other clinical observations. Its purpose is to identify observations in electronic messages such as Health Level Seven (HL7) observation messages, so that when hospitals, health maintenance organizations, pharmaceutical manufacturers, researchers, and public health departments receive such messages from multiple sources, they can automatically file the results in the right slots of their medical records, research, and/or public health systems. For each observation, the database includes a code (of which 25 000 are laboratory test observations), a long formal name, a "short" 30-character name, and synonyms. The database comes with a mapping program called Regenstrief LOINC Mapping Assistant (RELMA(TM)) to assist the mapping of local test codes to LOINC codes and to facilitate browsing of the LOINC results. Both LOINC and RELMA are available at no cost from http://www.regenstrief.org/loinc/. The LOINC medical database carries records for >30 000 different observations. LOINC codes are being used by large reference laboratories and federal agencies, e.g., the CDC and the Department of Veterans Affairs, and are part of the Health Insurance Portability and Accountability Act (HIPAA) attachment proposal. Internationally, they have been adopted in Switzerland, Hong Kong, Australia, and Canada, and by the German national standards organization, the Deutsches Institut für Normung. Laboratories should include LOINC codes in their outbound HL7 messages so that clinical and research clients can easily integrate these results into their clinical and research repositories. Laboratories should also encourage instrument vendors to deliver LOINC codes in their instrument outputs and demand LOINC codes in HL7 messages they get from reference laboratories to avoid the need to lump so many referral tests under the "send out lab" code.
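
The mapping the abstract recommends can be sketched as rewriting the observation identifier field (OBX-3) of a pipe-delimited HL7 v2 segment. The local codes and message text below are invented for illustration; the two LOINC codes shown are real entries from the LOINC table, and "LN" is HL7's identifier for the LOINC coding system.

```python
# Hypothetical local test codes mapped to (LOINC code, LOINC long name).
LOCAL_TO_LOINC = {
    "GLU": ("2345-7", "Glucose [Mass/volume] in Serum or Plasma"),
    "K": ("2823-3", "Potassium [Moles/volume] in Serum or Plasma"),
}

def map_obx_segment(segment):
    """Return the segment with OBX-3 rewritten to a LOINC triplet, if known."""
    fields = segment.split("|")
    if fields[0] != "OBX" or len(fields) < 4:
        return segment  # not an observation segment; leave untouched
    local_code = fields[3].split("^")[0]
    if local_code in LOCAL_TO_LOINC:
        code, name = LOCAL_TO_LOINC[local_code]
        fields[3] = "^".join([code, name, "LN"])
    return "|".join(fields)
```

In practice this lookup table is exactly what RELMA helps a laboratory build from its local test dictionary.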

  15. Adapting NBODY4 with a GRAPE-6a Supercomputer for Web Access, Using NBodyLab

    NASA Astrophysics Data System (ADS)

    Johnson, V.; Aarseth, S.

    2006-07-01

    A demonstration site has been developed by the authors that enables researchers and students to experiment with the capabilities and performance of NBODY4 running on a GRAPE-6a over the web. NBODY4 is a sophisticated open-source N-body code for high accuracy simulations of dense stellar systems (Aarseth 2003). In 2004, NBODY4 was successfully tested with a GRAPE-6a, yielding an unprecedented low-cost tool for astrophysical research. The GRAPE-6a is a supercomputer card developed by astrophysicists to accelerate high accuracy N-body simulations with a cluster or a desktop PC (Fukushige et al. 2005, Makino & Taiji 1998). The GRAPE-6a card became commercially available in 2004, runs at 125 Gflops peak, has a standard PCI interface and costs less than $10,000. Researchers running the widely used NBODY6 (which does not require GRAPE hardware) can compare their own PC or laptop performance with simulations run on http://www.NbodyLab.org. Such comparisons may help justify acquisition of a GRAPE-6a. For workgroups such as university physics or astronomy departments, the demonstration site may be replicated or serve as a model for a shared computing resource. The site was constructed using an NBodyLab server-side framework.
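
The core computation GRAPE hardware accelerates is the O(N²) direct-summation force loop, sketched minimally below (units with G = 1; a small softening eps avoids the r → 0 singularity). NBODY4 itself adds far more: regularization of close encounters, a Hermite integrator, and individual time steps.

```python
import math

def accelerations(masses, positions, eps=1e-6):
    """Gravitational acceleration on every particle from all others (G = 1)."""
    n = len(masses)
    acc = [[0.0, 0.0, 0.0] for _ in range(n)]
    for i in range(n):
        xi, yi, zi = positions[i]
        for j in range(n):
            if i == j:
                continue
            dx = positions[j][0] - xi
            dy = positions[j][1] - yi
            dz = positions[j][2] - zi
            r2 = dx * dx + dy * dy + dz * dz + eps * eps
            f = masses[j] / (r2 * math.sqrt(r2))  # m_j / r^3
            acc[i][0] += f * dx
            acc[i][1] += f * dy
            acc[i][2] += f * dz
    return acc
```

Every pair is visited, so the cost grows as N², which is why dense-cluster simulations benefit so much from offloading this loop to special-purpose hardware.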

  16. A Comparison of the Effects of Random Versus Fixed Order of Item Presentation Via the Computer

    DTIC Science & Technology

    1989-02-01

    Copies) Dr. Hans Crombag Dr. Stephen Dunbar University of Leyden Lindquist Center Education Research Center for Measurement Boerhaavelaan 2 University of Iowa 2334 EN Leyden Iowa City, IA 52242 The NETHERLANDS Dr. James A. Earles Dr. Timothy Davey Air Force Human Resources Lab Educational Testing...Montague 4401 Ford Avenue NPRDC Code 13 P.O. Box 16268 San Diego, CA 92152-6800 Alexandria, VA 22302-0268 Ms. Kathleen Moreno Dr. William L. Maloy

  17. Software Development for a Satellite Signal Analyzer

    DTIC Science & Technology

    1979-12-01

    uses this mode index. BUTLIS(18,6) I*4 is an array of button numbers used by the subroutine KTLSIM to simulate the user pushing buttons by...either the program SATCOM will be executed, or if HOLD was previously pushed, the array BUTLIS is used to simulate the pushing of the buttons on...in branch 1 to the AN/WSC-3 has to be on. Branches 1-3 provide power to the lab benches, rack 0-6, and uninterruptable power to the time code

  18. Individual Differences in Attentional Flexibility.

    DTIC Science & Technology

    1978-05-15

    NAVAL ACADEMY ANNAPOLIS, MD 21402 CDR PAUL NELSON NAVAL MEDICAL R&D COMMAND 1 Mr. Arnold I. Rubinstein CODE 1514 Human Resources Program Manager...ARMY RESEARCH INSTITUTE Head Human Factors Engineering Div. 5001 EISENHOWER AVENUE Naval Air Development Center ALEXANDRIA, VA 22333 Warminster...WEDNESDAY, MAY 3, 1978 09:53:00-PDT PAGE 14 Army Air Force Dr. Joseph Ward 1 Air Force Human Resources Lab U.S. Army Research Institute AFHRL/PED 5001

  19. Operation SANDSTONE: 1948

    DTIC Science & Technology

    1983-12-19

    [Garbled map of Enewetak Atoll islets, including ENJEBI, MIJIKADREK, and LUJOR] YOKE (49 KT). TOWER...MSRB-60 12 Cy ATTN: DD Merchant Marine Academy Field Command ATTN: Director of Libraries Defense Nuclear Agency Naval Historical Center ATTN: FCLS, MAJ...Judge Adv Gen ATTN: OMA, DP-22 ATTN: Code 73 Nevada Operations Office U.S. Merchant Marine Academy ATTN: Health Physics Div ATTN: Librarian 2 cy ATTN: R

  20. U.S. Marine Corps FY 82 Exploratory Development Program.

    DTIC Science & Technology

    1982-01-25

    1. It is requested that the cover of the reference be pen changed to reflect "FY 82" vice "FY 81". DISTRIBUTION: By direction (see attached pages) i4...Falls Church, VA 22041 3 Marine Corps Liaison Officer Naval Weapons Center China Lake, CA 93555 Marine Corps Liaison Officer HQ MASSTER Ft. Hood, TX...Center, Hawaii Lab P. 0. Box 997 Kaihua, Hawaii 96734 Mr. Paul H. Amundson Code 3304 Naval Weapons Center China Lake, CA 93555 Naval Surface Weapons

  1. An Analysis of the Navy's Fiscal Year 2016 Shipbuilding Plan

    DTIC Science & Technology

    2015-12-01

    19b. TELEPHONE NUMBER (Include area code) 12/01/2014 Technical Report - Congressional Testimony An Analysis of the Navy’s Fiscal Year 2016 Shipbuilding ...Release 12/4/2015 No U U U CONGRESS OF THE UNITED STATES Testimony An Analysis of the Navy’s Fiscal Year 2016 Shipbuilding Plan Eric J. Labs Senior...Subcommittee, thank you for the opportunity to testify on the Navy’s 2016 shipbuilding plan and the 2014 update to the service’s 2012 force structure

  2. Report of the 1992 AFOSR Workshop on the Future of EEG and MEG

    DTIC Science & Technology

    1993-02-02

    Systems Lab, 51 Federal St., San Francisco, CA 94107 *Department of Physics, 2 Washington Place, New York University, New York, NY 10003 1. Introduction A workshop on the prospects of the...undoubtedly be utilized include the diagnosis and treatment of diseases of the brain such as epilepsy, Alzheimer's, and schizophrenia; the monitoring and

  3. Optical Guiding in the Separable Beam Limit,

    DTIC Science & Technology

    1987-09-01

    UNIV COLLEGE PARK LAB FOR PLASMA AND FUSION ENERGY STUDIES T M ANTONSEN ET AL SEP 87 UMLPF-BB-Bui UNCLASSIFIED N8884-6-K-2 85 F/G 9/2...University of Maryland, Laboratory for Plasma and Fusion Energy Studies Availability Codes DISTRIBUTION STATEMENT A Approved for public release; Distribution Unlimited OPTICAL GUIDING IN THE SEPARABLE BEAM LIMIT T. M. Antonsen, Jr. and B. Levush Laboratory for Plasma and Fusion Energy Studies University

  4. Goddard's Astrophysics Science Division Annual Report 2014

    NASA Technical Reports Server (NTRS)

    Weaver, Kimberly (Editor); Reddy, Francis (Editor); Tyler, Pat (Editor)

    2015-01-01

    The Astrophysics Science Division (ASD, Code 660) is one of the world's largest and most diverse astronomical organizations. Space flight missions are conceived, built and launched to observe the entire range of the electromagnetic spectrum, from gamma rays to centimeter waves. In addition, experiments are flown to gather data on high-energy cosmic rays, and plans are being made to detect gravitational radiation from space-borne missions. To enable these missions, we have vigorous programs of instrument and detector development. Division scientists also carry out preparatory theoretical work and subsequent data analysis and modeling. In addition to space flight missions, we have a vibrant suborbital program with numerous sounding rocket and balloon payloads in development or operation. The ASD is organized into five labs: the Astroparticle Physics Lab, the X-ray Astrophysics Lab, the Gravitational Astrophysics Lab, the Observational Cosmology Lab, and the Exoplanets and Stellar Astrophysics Lab. The High Energy Astrophysics Science Archive Research Center (HEASARC) is an Office at the Division level. Approximately 400 scientists and engineers work in ASD. Of these, 80 are civil servant scientists, while the rest are resident university-based scientists, contractors, postdoctoral fellows, graduate students, and administrative staff. We currently operate the Swift Explorer mission and the Fermi Gamma-ray Space Telescope. In addition, we provide data archiving and operational support for the XMM mission (jointly with ESA) and the Suzaku mission (with JAXA). We are also a partner with Caltech on the NuSTAR mission. The Hubble Space Telescope Project is headquartered at Goddard, and ASD provides Project Scientists to oversee operations at the Space Telescope Science Institute. 
Projects in development include the Neutron Interior Composition Explorer (NICER) mission, an X-ray timing experiment for the International Space Station; the Transiting Exoplanet Survey Satellite (TESS) Explorer mission, in collaboration with MIT (Ricker, PI); the Soft X-ray Spectrometer (SXS) for the Astro-H mission in collaboration with JAXA; and the James Webb Space Telescope (JWST). The Wide-Field Infrared Survey Telescope (WFIRST), the highest-ranked mission in the 2010 decadal survey, is in a pre-phase A study, and we are supplying study scientists for that mission.

  5. ChromaStarPy: A Stellar Atmosphere and Spectrum Modeling and Visualization Lab in Python

    NASA Astrophysics Data System (ADS)

    Short, C. Ian; Bayer, Jason H. T.; Burns, Lindsey M.

    2018-02-01

    We announce ChromaStarPy, an integrated general stellar atmospheric modeling and spectrum synthesis code written entirely in Python 3. ChromaStarPy is a direct port of the ChromaStarServer (CSServ) Java modeling code described in earlier papers in this series, and many of the associated JavaScript (JS) post-processing procedures have been ported and incorporated into CSPy so that students have access to ready-made data products. A Python integrated development environment (IDE) allows a student in a more advanced course to experiment with the code and to graphically visualize intermediate and final results, ad hoc, as they are running it. CSPy allows students and researchers to compare modeled to observed spectra in the same IDE in which they are processing observational data, while having complete control over the stellar parameters affecting the synthetic spectra. We also take the opportunity to describe improvements that have been made to the related codes, ChromaStar (CS), CSServ, and ChromaStarDB (CSDB), that, where relevant, have also been incorporated into CSPy. The application may be found at the home page of the OpenStars project: http://www.ap.smu.ca/OpenStars/.
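
As a flavor of the Python scripting such a lab course builds on (this is not ChromaStarPy code), the most basic ingredient of any spectrum-synthesis exercise is the Planck function, sketched here with a Wien-law cross-check:

```python
import math

H_PLANCK = 6.62607015e-34  # Planck constant, J s
C_LIGHT = 2.99792458e8     # speed of light, m/s
K_B = 1.380649e-23         # Boltzmann constant, J/K

def planck_lambda(wavelength_m, T):
    """Blackbody spectral radiance B_lambda(T) in W m^-3 sr^-1."""
    x = H_PLANCK * C_LIGHT / (wavelength_m * K_B * T)
    return (2.0 * H_PLANCK * C_LIGHT ** 2 / wavelength_m ** 5) / math.expm1(x)

def peak_wavelength(T):
    """Wien displacement law: lambda_max = b / T."""
    return 2.897771955e-3 / T
```

A student can check that for a Sun-like effective temperature of about 5772 K the peak falls near 502 nm, and that the radiance falls off on both sides of it.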

  6. Assessment of the dose distribution inside a cardiac cath lab using TLD measurements and Monte Carlo simulations

    NASA Astrophysics Data System (ADS)

    Baptista, M.; Teles, P.; Cardoso, G.; Vaz, P.

    2014-11-01

    Over the last decade, there was a substantial increase in the number of interventional cardiology procedures worldwide, and the corresponding ionizing radiation doses for both the medical staff and patients became a subject of concern. Interventional procedures in cardiology are normally very complex, resulting in long exposure times. Also, these interventions require the operator to work near the patient and, consequently, close to the primary X-ray beam. Moreover, due to the scattered radiation from the patient and the equipment, the medical staff is also exposed to a non-uniform radiation field that can lead to a significant exposure of sensitive body organs and tissues, such as the eye lens, the thyroid and the extremities. In order to better understand the spatial variation of the dose and dose rate distributions during an interventional cardiology procedure, the dose distribution around a C-arm fluoroscopic system, in operation in a cardiac cath lab at a Portuguese hospital, was estimated using both Monte Carlo (MC) simulations and dosimetric measurements. To model and simulate the cardiac cath lab, including the fluoroscopic equipment used to execute interventional procedures, the state-of-the-art MC radiation transport code MCNPX 2.7.0 was used. Subsequently, Thermo-Luminescent Detector (TLD) measurements were performed, in order to validate and support the simulation results obtained for the cath lab model. The preliminary results presented in this study reveal that the cardiac cath lab model was successfully validated, taking into account the good agreement between MC calculations and TLD measurements. The simulated results for the isodose curves related to the C-arm fluoroscopic system are also consistent with the dosimetric information provided by the equipment manufacturer (Siemens).
The adequacy of the implemented computational model used to simulate complex procedures and map dose distributions around the operator and the medical staff is discussed, in view of the optimization principle (and the associated ALARA objective), one of the pillars of the international system of radiological protection.

  7. COSMOS: Python library for massively parallel workflows

    PubMed Central

    Gafni, Erik; Luquette, Lovelace J.; Lancaster, Alex K.; Hawkins, Jared B.; Jung, Jae-Yoon; Souilmi, Yassine; Wall, Dennis P.; Tonellato, Peter J.

    2014-01-01

    Summary: Efficient workflows to shepherd clinically generated genomic data through the multiple stages of a next-generation sequencing pipeline are of critical importance in translational biomedical science. Here we present COSMOS, a Python library for workflow management that allows formal description of pipelines and partitioning of jobs. In addition, it includes a user interface for tracking the progress of jobs, abstraction of the queuing system and fine-grained control over the workflow. Workflows can be created on traditional computing clusters as well as cloud-based services. Availability and implementation: Source code is available for academic non-commercial research purposes. Links to code and documentation are provided at http://lpm.hms.harvard.edu and http://wall-lab.stanford.edu. Contact: dpwall@stanford.edu or peter_tonellato@hms.harvard.edu. Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24982428

  8. COSMOS: Python library for massively parallel workflows.

    PubMed

    Gafni, Erik; Luquette, Lovelace J; Lancaster, Alex K; Hawkins, Jared B; Jung, Jae-Yoon; Souilmi, Yassine; Wall, Dennis P; Tonellato, Peter J

    2014-10-15

    Efficient workflows to shepherd clinically generated genomic data through the multiple stages of a next-generation sequencing pipeline are of critical importance in translational biomedical science. Here we present COSMOS, a Python library for workflow management that allows formal description of pipelines and partitioning of jobs. In addition, it includes a user interface for tracking the progress of jobs, abstraction of the queuing system and fine-grained control over the workflow. Workflows can be created on traditional computing clusters as well as cloud-based services. Source code is available for academic non-commercial research purposes. Links to code and documentation are provided at http://lpm.hms.harvard.edu and http://wall-lab.stanford.edu. dpwall@stanford.edu or peter_tonellato@hms.harvard.edu. Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press.
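
The core idea COSMOS builds on, formally describing a pipeline as named stages plus dependencies and executing it in order, can be illustrated with a toy depth-first scheduler (this is not the COSMOS API; the stage names below are invented):

```python
def run_pipeline(stages, deps):
    """stages: name -> callable; deps: name -> list of prerequisite names."""
    done, order = set(), []

    def visit(name, path=()):
        if name in done:
            return
        if name in path:
            raise ValueError("dependency cycle involving " + name)
        for d in deps.get(name, []):
            visit(d, path + (name,))
        stages[name]()  # run only after all prerequisites finished
        done.add(name)
        order.append(name)

    for name in stages:
        visit(name)
    return order

log = []
order = run_pipeline(
    {"align": lambda: log.append("align"),
     "call_variants": lambda: log.append("call_variants"),
     "qc": lambda: log.append("qc")},
    {"align": ["qc"], "call_variants": ["align"]},
)
```

A real workflow manager adds what this sketch omits: partitioning jobs across a cluster or cloud queue, tracking progress, and resuming after failures.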

  9. Achieving behavioral control with millisecond resolution in a high-level programming environment.

    PubMed

    Asaad, Wael F; Eskandar, Emad N

    2008-08-30

    The creation of psychophysical tasks for the behavioral neurosciences has generally relied upon low-level software running on a limited range of hardware. Despite the availability of software that allows the coding of behavioral tasks in high-level programming environments, many researchers are still reluctant to trust the temporal accuracy and resolution of programs running in such environments, especially when they run atop non-real-time operating systems. Thus, the creation of behavioral paradigms has been slowed by the intricacy of the coding required and their dissemination across labs has been hampered by the various types of hardware needed. However, we demonstrate here that, when proper measures are taken to handle the various sources of temporal error, accuracy can be achieved at the 1 ms time-scale that is relevant for the alignment of behavioral and neural events.
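    The kind of check the abstract describes — quantifying how far a software timing loop drifts from a 1 ms target on a non-real-time operating system — can be sketched in a few lines. This is a hypothetical stdlib-only illustration, not the authors' implementation:

    ```python
    import time

    def measure_loop_jitter(target_ms=1.0, n_iters=200):
        """Run a fixed-period loop and record how far each iteration lands
        past its target period, in milliseconds."""
        target_s = target_ms / 1000.0
        errors = []
        t_prev = time.perf_counter()
        for _ in range(n_iters):
            deadline = t_prev + target_s
            # Busy-waiting avoids the coarse granularity of time.sleep()
            while time.perf_counter() < deadline:
                pass
            t_now = time.perf_counter()
            errors.append((t_now - t_prev - target_s) * 1000.0)
            t_prev = t_now
        return errors

    errors = measure_loop_jitter()
    print(f"max overshoot: {max(errors):.3f} ms; mean: {sum(errors)/len(errors):.3f} ms")
    ```

    A real psychophysics rig would validate such numbers against an external hardware clock rather than the same software clock being tested.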

  10. LHCb migration from Subversion to Git

    NASA Astrophysics Data System (ADS)

    Clemencic, M.; Couturier, B.; Closier, J.; Cattaneo, M.

    2017-10-01

Due to user demand and to support new development workflows based on code review and multiple development streams, LHCb decided to port its source code management from Subversion to Git, using the CERN GitLab hosting service. Although tools exist for this kind of migration, LHCb specificities and development models required careful planning of the migration, development of migration tools, changes to the development model, and redefinition of the release procedures. Moreover, we had to support a hybrid situation with some software projects hosted in Git and others still in Subversion, or even branches of one project hosted in different systems. We present the way we addressed the special LHCb requirements, the technical details of migrating large non-standard Subversion repositories, and how we managed to smoothly migrate the software projects following the schedule of each project manager.

  11. Development of flight experiment work performance and workstation interface requirements, part 1. Technical report and appendices A through G

    NASA Technical Reports Server (NTRS)

    Hatterick, R. G.

    1973-01-01

    A skill requirement definition method was applied to the problem of determining, at an early stage in system/mission definition, the skills required of on-orbit crew personnel whose activities will be related to the conduct or support of earth-orbital research. The experiment data base was selected from proposed experiments in NASA's earth orbital research and application investigation program as related to space shuttle missions, specifically those being considered for Sortie Lab. Concepts for two integrated workstation consoles for Sortie Lab experiment operations were developed, one each for earth observations and materials sciences payloads, utilizing a common supporting subsystems core console. A comprehensive data base of crew functions, operating environments, task dependencies, task-skills and occupational skills applicable to a representative cross section of earth orbital research experiments is presented. All data has been coded alphanumerically to permit efficient, low cost exercise and application of the data through automatic data processing in the future.

  12. Optimized holographic femtosecond laser patterning method towards rapid integration of high-quality functional devices in microchannels.

    PubMed

    Zhang, Chenchu; Hu, Yanlei; Du, Wenqiang; Wu, Peichao; Rao, Shenglong; Cai, Ze; Lao, Zhaoxin; Xu, Bing; Ni, Jincheng; Li, Jiawen; Zhao, Gang; Wu, Dong; Chu, Jiaru; Sugioka, Koji

    2016-09-13

Rapid integration of high-quality functional devices in microchannels is in high demand for miniature lab-on-a-chip applications. This paper demonstrates the embellishment of existing microfluidic devices with integrated micropatterns via femtosecond laser MRAF-based holographic patterning (MHP) microfabrication, which proves two-photon polymerization (TPP) based on a spatial light modulator (SLM) to be a rapid and powerful technology for chip functionalization. An optimized mixed region amplitude freedom (MRAF) algorithm has been used to generate a high-quality shaped focus field. Based on the optimized parameters, a single-exposure approach is developed to fabricate 200 × 200 μm microstructure arrays in less than 240 ms. Moreover, microtraps, QR codes and letters are integrated into a microdevice by the advanced method for particle capture and device identification. These results indicate that such holographic laser embellishment of microfluidic devices is simple, flexible and easy to access, and has great potential in lab-on-a-chip applications such as biological culture, chemical analyses and optofluidic devices.

  13. Optimized holographic femtosecond laser patterning method towards rapid integration of high-quality functional devices in microchannels

    NASA Astrophysics Data System (ADS)

    Zhang, Chenchu; Hu, Yanlei; Du, Wenqiang; Wu, Peichao; Rao, Shenglong; Cai, Ze; Lao, Zhaoxin; Xu, Bing; Ni, Jincheng; Li, Jiawen; Zhao, Gang; Wu, Dong; Chu, Jiaru; Sugioka, Koji

    2016-09-01

Rapid integration of high-quality functional devices in microchannels is in high demand for miniature lab-on-a-chip applications. This paper demonstrates the embellishment of existing microfluidic devices with integrated micropatterns via femtosecond laser MRAF-based holographic patterning (MHP) microfabrication, which proves two-photon polymerization (TPP) based on a spatial light modulator (SLM) to be a rapid and powerful technology for chip functionalization. An optimized mixed region amplitude freedom (MRAF) algorithm has been used to generate a high-quality shaped focus field. Based on the optimized parameters, a single-exposure approach is developed to fabricate 200 × 200 μm microstructure arrays in less than 240 ms. Moreover, microtraps, QR codes and letters are integrated into a microdevice by the advanced method for particle capture and device identification. These results indicate that such holographic laser embellishment of microfluidic devices is simple, flexible and easy to access, and has great potential in lab-on-a-chip applications such as biological culture, chemical analyses and optofluidic devices.

  14. Techniques for hot structures testing

    NASA Technical Reports Server (NTRS)

    Deangelis, V. Michael; Fields, Roger A.

    1990-01-01

Hot structures testing has been going on since the early 1960s, beginning with the Mach 6 X-15 airplane. Early hot structures test programs at NASA-Ames-Dryden focused on the operational testing required to support the X-15 flight test program, and early hot structures research projects focused on developing lab test techniques to simulate flight thermal profiles. More recent efforts involved numerous large and small hot structures test programs that served to develop test methods and measurement techniques and to provide data that promoted the correlation of test data with results from analytical codes. In Nov. 1988 a workshop was sponsored that focused on the correlation of hot structures test data with analysis. Limited material is drawn from the workshop, and more formal documentation is provided for topics that focus on hot structures test techniques used at NASA-Ames-Dryden. Topics covered include data acquisition and test control, the quartz lamp heater systems, current strain and temperature sensors, and hot structures test techniques used to simulate the flight thermal environment in the lab.

  15. Visualization of yeast chromosomal DNA

    NASA Technical Reports Server (NTRS)

    Lubega, Seth

    1990-01-01

The DNA molecule is the most significant molecule of life since it encodes the blueprint for the structural and functional molecules of all living organisms. Agarose gel electrophoresis is now being widely used to separate the DNA of viruses, bacteria, and lower eukaryotes. The task was undertaken of reviewing the existing methods of DNA fractionation and of microscopic visualization of individual chromosomal DNA molecules during gel electrophoresis, as a basis for a proposed study to investigate the feasibility of separating DNA molecules in free fluids as an alternative to gel electrophoresis. Various techniques were studied. On the molecular level, agarose gel electrophoresis is being widely used to separate chromosomal DNA according to molecular weight. Carle and Olson separated and characterized the entire karyotype of a lab strain of Saccharomyces cerevisiae. Smith et al. and Schwartz and Koval independently reported the visualization of individual DNA molecules migrating through an agarose gel matrix during electrophoresis. The techniques used by these researchers are being reviewed in the lab as a basis for the proposed studies.

  16. Development and program implementation of elements for identification of the electromagnet condition for movable element position control

    NASA Astrophysics Data System (ADS)

    Leukhin, R. I.; Shaykhutdinov, D. V.; Shirokov, K. M.; Narakidze, N. D.; Vlasov, A. S.

    2017-02-01

Developing experimental designs of new types of electromagnetic constructions in engineering industry enterprises requires solving two major problems: setup of the regulator's parameters and comprehensive testing of the electromagnets. A weber-ampere characteristic was selected as the data source for identifying the electromagnet condition. The present article focuses on the development and implementation of software for an electromagnetic drive control system based on weber-ampere characteristic measurement. Software for weber-ampere characteristic data processing based on an artificial neural network is developed. The results of the design have been integrated into program code in the LabVIEW environment, using the licensed LabVIEW graphical programming package. The hardware was chosen and its suitability for the control system implementation was demonstrated. The trained artificial neural network determines the electromagnetic drive effector position with minimal error. The developed system allows control of an electromagnetic drive powered by a voltage source, a current source, or hybrid sources.

  17. An interactive computer lab of the galvanic cell for students in biochemistry.

    PubMed

    Ahlstrand, Emma; Buetti-Dinh, Antoine; Friedman, Ran

    2018-01-01

We describe an interactive module that can be used to teach basic concepts in electrochemistry and thermodynamics to first-year natural science students. The module is used together with an experimental laboratory and improves the students' understanding of thermodynamic quantities such as ΔrG, ΔrH, and ΔrS that are calculated but not directly measured in the lab. We also discuss how new technologies can substitute for some parts of experimental chemistry courses and improve the accessibility of course material. Cloud computing platforms such as CoCalc facilitate the distribution of computer codes and allow students to access and apply interactive course tools beyond the course's scope. Despite some limitations imposed by cloud computing, the students appreciated the approach and the enhanced opportunities to discuss study questions with their classmates and instructor as facilitated by the interactive tools. © 2017 The International Union of Biochemistry and Molecular Biology, 46(1):58-65, 2018.
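    The thermodynamic quantities named in the abstract follow from textbook electrochemistry (ΔrG = −nFE, ΔrS = nF·dE/dT, ΔrH = ΔrG + T·ΔrS). A minimal numeric sketch of these relations, using the Daniell cell with an illustrative temperature coefficient — not the module's actual code:

    ```python
    F = 96485.33  # Faraday constant, C/mol

    def cell_thermodynamics(E, dE_dT, n, T=298.15):
        """Thermodynamic quantities of a galvanic cell from its EMF.
        E in volts, dE_dT in V/K, n = electrons transferred."""
        dG = -n * F * E       # Gibbs energy of reaction, J/mol
        dS = n * F * dE_dT    # entropy of reaction, J/(mol K)
        dH = dG + T * dS      # enthalpy of reaction, J/mol
        return dG, dS, dH

    # Daniell cell: n = 2, E ~ 1.10 V; the temperature coefficient here is illustrative
    dG, dS, dH = cell_thermodynamics(E=1.10, dE_dT=-4.0e-4, n=2)
    print(f"ΔrG = {dG/1000:.1f} kJ/mol, ΔrS = {dS:.1f} J/(mol K), ΔrH = {dH/1000:.1f} kJ/mol")
    ```

    With these inputs ΔrG comes out near −212 kJ/mol, the familiar textbook value for the Daniell cell.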

  18. Simplifying BRDF input data for optical signature modeling

    NASA Astrophysics Data System (ADS)

    Hallberg, Tomas; Pohl, Anna; Fagerström, Jan

    2017-05-01

Scene simulations of optical signature properties using signature codes normally require input of various parameterized measurement data of surfaces and coatings in order to achieve realistic scene object features. Some of the most important parameters are used in the model of the Bidirectional Reflectance Distribution Function (BRDF) and are normally determined by surface reflectance and scattering measurements. Reflectance measurements of the spectral Directional Hemispherical Reflectance (DHR) at various incident angles can normally be performed in most spectroscopy labs, while measuring the BRDF is more complicated and may not be available at all in many optical labs. We present a method for deriving the necessary BRDF data directly from DHR measurements for modeling software that uses the Sandford-Robertson BRDF model. The accuracy of the method is tested by modeling a test surface and comparing results obtained with estimated versus measured BRDF data as model input. These results show that the method gives no significant loss in modeling accuracy.
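    The quantity being inverted can be made concrete: the DHR is the BRDF integrated over the outgoing hemisphere, DHR(θi) = ∫∫ f_r(θi, θr, φ) cos θr sin θr dθr dφ. The sketch below checks this relation numerically for a Lambertian surface; it does not implement the Sandford-Robertson fit itself:

    ```python
    import math

    def dhr_from_brdf(brdf, theta_i, n_theta=200, n_phi=200):
        """Directional-hemispherical reflectance by midpoint-rule integration
        of a BRDF over the outgoing hemisphere."""
        total = 0.0
        d_theta = (math.pi / 2) / n_theta
        d_phi = (2 * math.pi) / n_phi
        for i in range(n_theta):
            theta_r = (i + 0.5) * d_theta
            # projected solid-angle weight: cos(theta_r) sin(theta_r) dtheta dphi
            w = math.cos(theta_r) * math.sin(theta_r) * d_theta * d_phi
            for j in range(n_phi):
                phi = (j + 0.5) * d_phi
                total += brdf(theta_i, theta_r, phi) * w
        return total

    # Sanity check: a Lambertian BRDF f_r = rho/pi has DHR = rho at every incidence angle
    rho = 0.3
    lambertian = lambda ti, tr, phi: rho / math.pi
    dhr = dhr_from_brdf(lambertian, theta_i=0.0)
    print(round(dhr, 4))
    ```

    Inverting this integral relation for a parametric BRDF model is, in essence, what the DHR-to-BRDF estimation has to do.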

  19. Identification of beer-spoilage bacteria using matrix-assisted laser desorption/ionization time-of-flight mass spectrometry.

    PubMed

    Wieme, Anneleen D; Spitaels, Freek; Aerts, Maarten; De Bruyne, Katrien; Van Landschoot, Anita; Vandamme, Peter

    2014-08-18

    Applicability of matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS) for identification of beer-spoilage bacteria was examined. To achieve this, an extensive identification database was constructed comprising more than 4200 mass spectra, including biological and technical replicates derived from 273 acetic acid bacteria (AAB) and lactic acid bacteria (LAB), covering a total of 52 species, grown on at least three growth media. Sequence analysis of protein coding genes was used to verify aberrant MALDI-TOF MS identification results and confirmed the earlier misidentification of 34 AAB and LAB strains. In total, 348 isolates were collected from culture media inoculated with 14 spoiled beer and brewery samples. Peak-based numerical analysis of MALDI-TOF MS spectra allowed a straightforward species identification of 327 (94.0%) isolates. The remaining isolates clustered separately and were assigned through sequence analysis of protein coding genes either to species not known as beer-spoilage bacteria, and thus not present in the database, or to novel AAB species. An alternative, classifier-based approach for the identification of spoilage bacteria was evaluated by combining the identification results obtained through peak-based cluster analysis and sequence analysis of protein coding genes as a standard. In total, 263 out of 348 isolates (75.6%) were correctly identified at species level and 24 isolates (6.9%) were misidentified. In addition, the identification results of 50 isolates (14.4%) were considered unreliable, and 11 isolates (3.2%) could not be identified. The present study demonstrated that MALDI-TOF MS is well-suited for the rapid, high-throughput and accurate identification of bacteria isolated from spoiled beer and brewery samples, which makes the technique appropriate for routine microbial quality control in the brewing industry. Copyright © 2014 Elsevier B.V. All rights reserved.

  20. Seismic Waveform Modeling of Broadband Data From a Temporary High-Density Deployment in the Los Angeles Basin

    NASA Astrophysics Data System (ADS)

    Herrman, M.; Polet, J.

    2016-12-01

    A total of 73 broadband seismometers were deployed for a passive source seismic experiment called the Los Angeles Syncline Seismic Interferometry Experiment (LASSIE) from September to November of 2014. The purpose of this experiment was to collect high density seismic data for the Los Angeles Basin (LAB) to better understand basin structure and response. This research will use the data collected from LASSIE to assess and refine current velocity models of the LAB using a full waveform modeling approach. To this end we will compare seismograms recorded by LASSIE for a subset of the 53 earthquakes and quarry blasts located by the Southern California Seismic Network (SCSN) that occurred within or near the LAB during the deployment period to synthetic seismograms generated by the Frequency-Wavenumber (FK) code developed by Zhu and Rivera (2002). A first analysis of the data indicates that roughly 25 of the 53 events have waveforms with sufficiently high signal to noise ratio, providing approximately 500 seismograms that are of suitable quality for comparison. We observe significant changes in waveform characteristics between stations with a very small separation distance of approximately 1 km. Focal mechanisms for most of these events have been obtained from Dr. Egill Hauksson (personal communication). We will show comparisons between the broadband velocity waveforms recorded by stations across the LASSIE array and FK synthetics determined for a variety of 1D velocity models that have been developed for the LAB area (such as Hadley and Kanamori, 1977; Hauksson, 1989, 1995 and Magistrale, 1992). The results of these comparisons will be analyzed to provide additional constraints on the subsurface seismic velocity structure within the Los Angeles basin.
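    A common way to quantify the fit between a recorded waveform and an FK synthetic is a zero-lag normalized cross-correlation. A stdlib-only sketch on toy traces — an illustration, not the authors' workflow:

    ```python
    import math

    def normalized_cross_correlation(obs, syn):
        """Zero-lag normalized cross-correlation between an observed and a
        synthetic seismogram (plain lists of equal length); 1.0 = identical
        up to amplitude scale."""
        n = len(obs)
        mo = sum(obs) / n
        ms = sum(syn) / n
        num = sum((o - mo) * (s - ms) for o, s in zip(obs, syn))
        den = math.sqrt(sum((o - mo) ** 2 for o in obs) *
                        sum((s - ms) ** 2 for s in syn))
        return num / den

    # Toy "waveforms": a damped sinusoid and a synthetic matching it up to scale
    obs = [math.sin(0.1 * t) * math.exp(-0.01 * t) for t in range(500)]
    syn = [2.5 * x for x in obs]
    r = normalized_cross_correlation(obs, syn)
    print(round(r, 6))
    ```

    Because the measure is amplitude-invariant, it isolates waveform shape, which is what constrains the velocity model.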

  1. In vitro and in vivo probiotic assessment of Leuconostoc mesenteroides P45 isolated from pulque, a Mexican traditional alcoholic beverage.

    PubMed

    Giles-Gómez, Martha; Sandoval García, Jorge Giovanni; Matus, Violeta; Campos Quintana, Itzia; Bolívar, Francisco; Escalante, Adelfo

    2016-01-01

Pulque is a Mexican traditional alcoholic, non-distilled, fermented beverage produced by fermentation of the sap, known as aguamiel, extracted from several maguey (Agave) species. Pulque has traditionally been considered a healthy beverage due to its nutrient content and also a traditional medicine for the treatment of gastrointestinal disorders and intestinal infections. During pulque fermentation, the development of acidity, alcohol and viscosity defines its final sensorial properties, creating an enriched environment where dominant lactic acid bacteria (LAB), including diverse Leuconostoc species, are present. Because traditional pulque is consumed directly from the fermentation vessel, the naturally associated LAB are ingested and reach the human small intestine alive. Here, we report the in vitro and in vivo probiotic assessment of Leuconostoc mesenteroides strain P45 isolated from pulque. This LAB isolate exhibited resistance to lysozyme, acid (pH 3.5) and bile salts (0.1 and 0.3 % oxgall). Antibacterial activity against the pathogens Listeria monocytogenes, enteropathogenic Escherichia coli, Salmonella enterica serovar Typhi and S. enterica serovar Typhimurium was observed in assays involving cell-to-cell contact, cell-free 2× concentrated supernatants and cell-to-cell contact under exopolysaccharide-producing conditions. The in vivo probiotic assessment showed an anti-infective activity of L. mesenteroides P45 against S. enterica serovar Typhimurium in challenged male and female BALB/c mice. Analysis of the available genome sequence of strain P45 allowed the identification of a prebacteriocin-coding gene and six peptidoglycan hydrolase enzymes probably involved in the antimicrobial activity of this strain. The results presented in this study support some potential microbial mechanisms associated with the beneficial effects on human health of this LAB involved in the fermentation of pulque.

  2. A large-scale analysis of sex differences in facial expressions

    PubMed Central

    Kodra, Evan; el Kaliouby, Rana; LaFrance, Marianne

    2017-01-01

    There exists a stereotype that women are more expressive than men; however, research has almost exclusively focused on a single facial behavior, smiling. A large-scale study examines whether women are consistently more expressive than men or whether the effects are dependent on the emotion expressed. Studies of gender differences in expressivity have been somewhat restricted to data collected in lab settings or which required labor-intensive manual coding. In the present study, we analyze gender differences in facial behaviors as over 2,000 viewers watch a set of video advertisements in their home environments. The facial responses were recorded using participants’ own webcams. Using a new automated facial coding technology we coded facial activity. We find that women are not universally more expressive across all facial actions. Nor are they more expressive in all positive valence actions and less expressive in all negative valence actions. It appears that generally women express actions more frequently than men, and in particular express more positive valence actions. However, expressiveness is not greater in women for all negative valence actions and is dependent on the discrete emotional state. PMID:28422963

  3. Cone-beam micro-CT system based on LabVIEW software.

    PubMed

    Ionita, Ciprian N; Hoffmann, Kenneth R; Bednarek, Daniel R; Chityala, Ravishankar; Rudin, Stephen

    2008-09-01

    Construction of a cone-beam computed tomography (CBCT) system for laboratory research usually requires integration of different software and hardware components. As a result, building and operating such a complex system require the expertise of researchers with significantly different backgrounds. Additionally, writing flexible code to control the hardware components of a CBCT system combined with designing a friendly graphical user interface (GUI) can be cumbersome and time consuming. An intuitive and flexible program structure, as well as the program GUI for CBCT acquisition, is presented in this note. The program was developed in National Instruments' Laboratory Virtual Instrument Engineering Workbench (LabVIEW) graphical language and is designed to control a custom-built CBCT system but has also been used in a standard angiographic suite. The hardware components are commercially available to researchers and are in general provided with software drivers which are LabVIEW compatible. The program structure was designed as a sequential chain. Each step in the chain takes care of one or two hardware commands at a time; the execution of the sequence can be modified according to the CBCT system design. We have scanned and reconstructed over 200 specimens using this interface and present three examples which cover different areas of interest encountered in laboratory research. The resulting 3D data are rendered using a commercial workstation. The program described in this paper is available for use or improvement by other researchers.
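    LabVIEW's graphical dataflow cannot be reproduced in text, but the sequential-chain structure the note describes — each step issuing one or two hardware commands against shared state, with the sequence easy to reorder — can be sketched in Python. The step names here are hypothetical:

    ```python
    from typing import Callable, Dict, List

    def run_chain(steps: List[Callable[[Dict], None]]) -> Dict:
        """Execute acquisition steps strictly in sequence, sharing one state
        dict -- a text analogue of a sequential-chain program structure."""
        state = {"log": []}
        for step in steps:
            step(state)
        return state

    # Hypothetical CBCT steps; a real system would call vendor driver APIs here
    def init_detector(s): s["log"].append("detector ready")
    def rotate_gantry(s): s["log"].append("gantry at next angle")
    def expose_and_grab(s): s["log"].append("frame acquired")

    state = run_chain([init_detector, rotate_gantry, expose_and_grab])
    print(state["log"])
    ```

    Reordering or inserting steps in the list is the textual equivalent of rewiring the chain for a different CBCT system design.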

  4. Isolation and characterization of bacteriocinogenic lactic bacteria from M-Tuba and Tepache, two traditional fermented beverages in México

    PubMed Central

    de la Fuente-Salcido, Norma M; Castañeda-Ramírez, José Cristobal; García-Almendárez, Blanca E; Bideshi, Dennis K; Salcedo-Hernández, Rubén; Barboza-Corona, José E

    2015-01-01

    Mexican Tuba (M-Tuba) and Tepache are Mexican fermented beverages prepared mainly with coconut palm sap and pineapple pulp, respectively. At present, reports on the microbiota and nutritional effects of both beverages are lacking. The purpose of this study was to determine whether M-Tuba and Tepache contain cultivable lactic acid bacteria (LAB) capable of producing bacteriocins. Tepache and M-Tuba contain mesophilic aerobic bacteria, LAB, and yeast. Bacillus subtilis, Listeria monocytogenes, Listeria innocua, Streptococcus agalactiae, Staphylococcus aureus, Escherichia coli, Klebsiella pneumoniae, Salmonella typhimurium, and Salmonella spp. were the microorganisms most susceptible to metabolites produced by bacterial isolates. M-Tuba and Tepache contain bacteria that harbor genes coding for nisin and enterocin, but not pediocin. The presence of Lactococcus lactis and Enterococcus faecium in M-Tuba and Tepache was identified by 16S rDNA. These bacteria produced bacteriocins of ∼3.5 kDa and 4.0–4.5 kDa, respectively. Partially purified bacteriocins showed an inhibitory effect against Micrococcus luteus, L. monocytogenes, L. innocua, Str. agalactiae, S. aureus, Bacillus cereus, B. subtilis, E. faecalis, and K. pneumoniae. We characterized, for the first time, the cultivable microbiota of M-Tuba and Tepache, and specifically identified candidate lactic acid bacteria (LAB) present in these beverages that were capable of synthesizing antimicrobial peptides, which collectively could provide food preservative functions. PMID:26405529

  5. Abstract of talk for Silicon Valley Linux Users Group

    NASA Technical Reports Server (NTRS)

    Clanton, Sam

    2003-01-01

    The use of Linux for research at NASA Ames is discussed. Topics include: work with the Atmospheric Physics branch on software for a spectrometer to be used in the CRYSTAL-FACE mission this summer; and work in the Neuroengineering Lab with Code IC, including an introduction to the extension-of-the-human-senses project, advantages of using Linux for real-time biological data processing, algorithms utilized on a Linux system, goals of the project, slides of people with Neuroscan caps on, and progress that has been made and how Linux has helped.

  6. Proceedings of the IDA Workshop on Formal Specification and Verification of Ada (Trade Name) (1st) Held in Alexandria, Virginia on 18-20 March 1985.

    DTIC Science & Technology

    1985-12-01

    [OCR fragment from the proceedings front matter: the Ada Verification Workshop was held March 18-20, 1985; a list of participants follows, including Bernard Abrams (Grumman Aerospace Corporation), Mark R. Cornwell (Code 7590, Naval Research Lab, Washington, D.C.), and Jeff Facemire, together with part of the distribution list for M-146.]

  7. Rapid white blood cell detection for peritonitis diagnosis

    NASA Astrophysics Data System (ADS)

    Wu, Tsung-Feng; Mei, Zhe; Chiu, Yu-Jui; Cho, Sung Hwan; Lo, Yu-Hwa

    2013-03-01

    A point-of-care and home-care lab-on-a-chip (LoC) system that integrates a microfluidic spiral device as a concentrator with an optical-coding device as a cell enumerator is demonstrated. The LoC system enumerates white blood cells from dialysis effluent of patients receiving peritoneal dialysis. The preliminary results show that the white blood cell counts from our system agree well with the results from commercial flow cytometers. The LoC system can potentially bring significant benefits to end stage renal disease (ESRD) patients that are on peritoneal dialysis (PD).

  8. X-Ray Diffraction Contrast Tomography in micro-CT Lab Source Systems

    DTIC Science & Technology

    2014-05-16

    [OCR fragment: a figure caption describes (d) the grain microstructure as determined from DCT and (e) a surface mesh representing the fracture surface, colour coded with respect to its crystallographic orientation. The text defers the mathematics of the projection models to Appendix A and, from the definition of the dot product, obtains cos θ = B·G / (‖B‖ ‖G‖) = B·G / f (Eq. 1.9); given sin²θ + cos²θ = 1, sin θ can also be... (text truncated).]

  9. Interactive web-based identification and visualization of transcript shared sequences.

    PubMed

    Azhir, Alaleh; Merino, Louis-Henri; Nauen, David W

    2018-05-12

    We have developed TraC (Transcript Consensus), a web-based tool for detecting and visualizing shared sequences among two or more mRNA transcripts such as splice variants. Results including exon-exon boundaries are returned in a highly intuitive, data-rich, interactive plot that permits users to explore the similarities and differences of multiple transcript sequences. The online tool (http://labs.pathology.jhu.edu/nauen/trac/) is free to use. The source code is freely available for download (https://github.com/nauenlab/TraC). Copyright © 2018 Elsevier Inc. All rights reserved.
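    The core operation — finding maximal shared blocks between transcript sequences, such as exons common to splice variants — can be approximated with Python's difflib. This is a toy analogue, not TraC's actual algorithm, and the "exon" strings are hypothetical:

    ```python
    from difflib import SequenceMatcher

    def shared_blocks(a: str, b: str, min_len: int = 4):
        """Maximal matching blocks between two sequences, filtered by a
        minimum length; returns (start_in_a, start_in_b, shared_string)."""
        sm = SequenceMatcher(None, a, b, autojunk=False)
        return [(m.a, m.b, a[m.a:m.a + m.size])
                for m in sm.get_matching_blocks() if m.size >= min_len]

    # Two hypothetical splice variants sharing blocks "ATGGCC" and "TTCAAG"
    t1 = "ATGGCCGGGTTTCAAG"
    t2 = "ATGGCCTTCAAG"
    blocks = shared_blocks(t1, t2)
    for start1, start2, seq in blocks:
        print(start1, start2, seq)
    ```

    The block boundaries are the kind of information an interactive plot of exon-exon structure would visualize.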

  10. Microstructures and Mechanical Responses of Powder Metallurgy Noncombustive Magnesium Extruded Alloy by Rapid Solidification Process in Mass Production

    DTIC Science & Technology

    2010-05-01

    [OCR fragment of the report's reference list and distribution list, including: "Equal-Channel Angular Pressing for the Processing of Ultra-Fine Grained Materials", Scripta Mater. 1996, 35, 143-146 (Saito, Y.; Tsuji, N.; ...); Šplíchal, K.; Jurkech, L., "Comparison of Oxidation of Cast and Sintered ... Mg-Al-Rare Earth Alloys", J. Alloy. Compd. 1996, 232, 264-268; followed by distribution-list entries (West Bethesda, MD; Air Force Armament Lab AFATL, Eglin AFB, FL).]

  11. Aerodynamic Performance Predictions of a SA- 2 Missile Using Missile DATCOM

    DTIC Science & Technology

    2009-09-01

    ...transformation that is given by Eqs. (4) and (5). Eqs. (8)-(10) show the formulation in body- and wind-axis terminology; Eq. (8) is the drag polar C_D = C_{A,0} + k·C_L², and Eq. (10) involves a cosine term (OCR fragment). ...by Teo (2008) using the Missile LAB code. However, the missile geometry then was set up from a rudimentary drawing and not one that represented a high... provided by MSIC. These particular cases were run forcing turbulent flow with a surface roughness of 0.001016 cm, which was found by Teo (2008) to

  12. Evaluating the performance of two neutron spectrum unfolding codes based on iterative procedures and artificial neural networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ortiz-Rodriguez, J. M.; Reyes Alfaro, A.; Reyes Haro, A.

    In this work the performance of two neutron spectrum unfolding codes based on iterative procedures and artificial neural networks is evaluated. The first code, based on traditional iterative procedures and called Neutron Spectrometry and Dosimetry from the Universidad Autonoma de Zacatecas (NSDUAZ), uses the SPUNIT iterative algorithm and was designed to unfold the neutron spectrum and calculate 15 dosimetric quantities and 7 IAEA survey meters. The main feature of this code is the automated selection of the initial guess spectrum through a compendium of neutron spectra compiled by the IAEA. The second code, known as Neutron Spectrometry and Dosimetry with Artificial Neural Networks (NSDann), is designed using neural-net technology. The artificial-intelligence approach of a neural net does not solve mathematical equations: by using the knowledge stored in the synaptic weights of a properly trained neural net, the code is capable of unfolding the neutron spectrum and simultaneously calculating 15 dosimetric quantities, needing as input only the count rates measured with a Bonner sphere system. Similarities of the NSDUAZ and NSDann codes: they follow the same easy and intuitive user philosophy and were designed with a graphical interface in the LabVIEW programming environment. Both codes unfold the neutron spectrum expressed in 60 energy bins, calculate 15 dosimetric quantities and generate a full report in HTML format. Differences of these codes: the NSDUAZ code was designed using classical iterative approaches and needs an initial guess spectrum in order to initiate the iterative procedure; in NSDUAZ, a programming routine was designed to calculate 7 IAEA instrument survey meters using fluence-to-dose conversion coefficients. The NSDann code uses artificial neural networks to solve the ill-conditioned equation system of the neutron spectrometry problem through the synaptic weights of a properly trained neural network. Contrary to iterative procedures, in the neural-net approach it is possible to reduce the count rates used to unfold the neutron spectrum. To evaluate these codes, a computer tool called Neutron Spectrometry and Dosimetry Computer Tool was designed. The results obtained with this package are shown. The codes mentioned here are freely available upon request to the authors.
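    The flavor of an iterative unfolding of counts c = R·φ can be sketched with a multiplicative (MLEM-style) correction on a toy response matrix: start from a flat guess spectrum and scale each energy bin by the ratio of measured to recomputed counts. This is an illustration in the spirit of SPUNIT-type algorithms, not the NSDUAZ implementation:

    ```python
    def unfold(counts, R, n_iter=500):
        """Multiplicative iterative unfolding of counts = R * phi from a flat
        initial guess; each bin gets an MLEM-style ratio correction."""
        n_det, n_bin = len(R), len(R[0])
        phi = [1.0] * n_bin  # flat initial guess spectrum
        for _ in range(n_iter):
            calc = [sum(R[i][j] * phi[j] for j in range(n_bin)) for i in range(n_det)]
            for j in range(n_bin):
                num = sum(R[i][j] * counts[i] / calc[i] for i in range(n_det))
                den = sum(R[i][j] for i in range(n_det))
                phi[j] *= num / den  # ratio of measured to recomputed counts
        return phi

    # Toy 3-sphere, 3-bin response matrix and a known spectrum to recover
    R = [[0.8, 0.3, 0.1],
         [0.4, 0.7, 0.3],
         [0.1, 0.4, 0.9]]
    true_phi = [2.0, 1.0, 3.0]
    counts = [sum(R[i][j] * true_phi[j] for j in range(3)) for i in range(3)]
    phi = unfold(counts, R)
    print([round(x, 2) for x in phi])
    ```

    A real Bonner-sphere problem is ill-conditioned (few spheres, 60 energy bins), which is why the initial guess spectrum matters so much for iterative codes and why a trained neural network is an attractive alternative.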

  13. Evaluating the performance of two neutron spectrum unfolding codes based on iterative procedures and artificial neural networks

    NASA Astrophysics Data System (ADS)

    Ortiz-Rodríguez, J. M.; Reyes Alfaro, A.; Reyes Haro, A.; Solís Sánches, L. O.; Miranda, R. Castañeda; Cervantes Viramontes, J. M.; Vega-Carrillo, H. R.

    2013-07-01

    In this work, the performance of two neutron spectrum unfolding codes based on iterative procedures and artificial neural networks is evaluated. The first code, based on traditional iterative procedures and called Neutron Spectrometry and Dosimetry from the Universidad Autonoma de Zacatecas (NSDUAZ), uses the SPUNIT iterative algorithm and was designed to unfold the neutron spectrum and calculate 15 dosimetric quantities and the readings of 7 IAEA survey meters. The main feature of this code is the automated selection of the initial guess spectrum from a compendium of neutron spectra compiled by the IAEA. The second code, known as Neutron Spectrometry and Dosimetry with Artificial Neural Networks (NSDann), is designed using neural network technology. The artificial intelligence approach of the neural network does not solve mathematical equations; using the knowledge stored in the synaptic weights of a properly trained neural network, the code unfolds the neutron spectrum and simultaneously calculates 15 dosimetric quantities, requiring as input only the count rates measured with a Bonner sphere system. The two codes are similar in that they follow the same easy and intuitive user philosophy and were designed with a graphical interface in the LabVIEW programming environment. Both codes unfold the neutron spectrum expressed in 60 energy bins, calculate 15 dosimetric quantities, and generate a full report in HTML format. They differ in that NSDUAZ was designed using classical iterative approaches and needs an initial guess spectrum to initiate the iterative procedure; in NSDUAZ, a programming routine calculates the readings of 7 IAEA survey instruments using fluence-to-dose conversion coefficients. NSDann instead uses artificial neural networks to solve the ill-conditioned equation system of the neutron spectrometry problem through the synaptic weights of a properly trained network.
    In contrast to iterative procedures, the neural network approach makes it possible to reduce the number of count rates used to unfold the neutron spectrum. To evaluate these codes, a computer tool called the Neutron Spectrometry and Dosimetry computer tool was designed. The results obtained with this package are shown. The codes mentioned here are freely available upon request to the authors.
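    Neither abstract lists the SPUNIT update rule. As a hedged illustration, iterative unfolding codes of this family commonly use a SAND-II-style multiplicative correction; the sketch below assumes that general form (the response matrix, count rates, and update rule are illustrative, not the NSDUAZ implementation):

```python
import numpy as np

def unfold_iterative(R, counts, phi0, n_iter=200):
    """SAND-II-style multiplicative unfolding sketch (illustrative).

    R      : (m, n) response matrix (sphere i x energy bin j)
    counts : (m,) measured count rates from the Bonner spheres
    phi0   : (n,) initial guess spectrum (e.g. from a spectrum compendium)
    """
    phi = np.asarray(phi0, dtype=float).copy()
    for _ in range(n_iter):
        predicted = R @ phi  # modelled count rates for the current spectrum
        # weight each energy bin by how much each sphere over/under-predicts
        correction = (R * (counts / predicted)[:, None]).sum(axis=0) / R.sum(axis=0)
        phi *= correction
    return phi
```

    Because every factor in the update is positive whenever the responses, counts, and initial guess are positive, the unfolded spectrum stays non-negative by construction.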

  14. Semi-automated Anatomical Labeling and Inter-subject Warping of High-Density Intracranial Recording Electrodes in Electrocorticography.

    PubMed

    Hamilton, Liberty S; Chang, David L; Lee, Morgan B; Chang, Edward F

    2017-01-01

    In this article, we introduce img_pipe, our open-source Python package for preprocessing imaging data for intracranial electrocorticography (ECoG) and intracranial stereo-EEG analyses. The process of electrode localization, labeling, and warping for ECoG currently varies widely across laboratories and is usually performed with custom, lab-specific code. This Python package aims to provide a standardized interface for these procedures, as well as code to plot and display results on 3D cortical surface meshes. It gives the user an easy interface for creating anatomically labeled electrodes that can also be warped to an atlas brain, starting with only a preoperative T1 MRI scan and a postoperative CT scan. We describe the full capabilities of our imaging pipeline and present a step-by-step protocol for users.
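    The warping step described above amounts to applying a registration transform to electrode coordinates. A minimal NumPy sketch, assuming a 4x4 affine produced by a subject-to-atlas registration (the helper below is hypothetical and is not part of img_pipe's actual API):

```python
import numpy as np

def warp_electrodes(coords, affine):
    """Apply a 4x4 affine (e.g. from a subject-T1 -> atlas registration)
    to an (n, 3) array of electrode coordinates.

    Hypothetical helper for illustration only; img_pipe's real pipeline
    derives its transforms from the T1/CT registrations it performs.
    """
    coords = np.asarray(coords, dtype=float)
    homogeneous = np.c_[coords, np.ones(len(coords))]  # (n, 4)
    return (homogeneous @ affine.T)[:, :3]
```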

  15. HEPLIB '91: International users meeting on the support and environments of high energy physics computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnstad, H.

    The purpose of this meeting is to discuss current and future HEP computing support and environments from the perspective of new horizons in accelerator, physics, and computing technologies. Topics of interest to the meeting include (but are not limited to): the forming of the HEPLIB world user group for High Energy Physics computing; mandate, desirables, coordination, organization, funding; user experience, international collaboration; the roles of national labs, universities, and industry; range of software: Monte Carlo, mathematics, physics, interactive analysis, text processors, editors, graphics, database systems, code management tools; program libraries, frequency of updates, distribution; distributed and interactive computing, database systems, user interface, UNIX operating systems, networking, compilers, Xlib, X-Graphics; documentation, updates, availability, distribution; code management in large collaborations, keeping track of program versions; and quality assurance, testing, conventions, standards.

  17. Increasing productivity through Total Reuse Management (TRM)

    NASA Technical Reports Server (NTRS)

    Schuler, M. P.

    1991-01-01

    Total Reuse Management (TRM) is a new concept currently being promoted by the NASA Langley Software Engineering and Ada Lab (SEAL). It uses concepts similar to those promoted in Total Quality Management (TQM). Both technical and management personnel are continually encouraged to think in terms of reuse. Reuse is not something that is aimed for after a product is completed, but rather it is built into the product from inception through development. Lowering software development costs, reducing risk, and increasing code reliability are the more prominent goals of TRM. Procedures and methods used to adopt and apply TRM are described. Reuse is frequently thought of as only being applicable to code. However, reuse can apply to all products and all phases of the software life cycle. These products include management and quality assurance plans, designs, and testing procedures. Specific examples of successfully reused products are given and future goals are discussed.

  18. A verification and validation effort for high explosives at Los Alamos National Lab (u)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scovel, Christina A; Menikoff, Ralph S

    2009-01-01

    We have started a project to verify and validate ASC codes used to simulate detonation waves in high explosives. Since there are no non-trivial analytic solutions, we are going to compare simulated results with experimental data that cover a wide range of explosive phenomena. The intent is to compare both different codes and different high explosive (HE) models. The first step is to test the products equation of state used for the HE models. For this purpose, the cylinder test, flyer plate, and plate-push experiments are being used. These experiments sample different regimes in thermodynamic phase space: the CJ isentrope for the cylinder tests, the isentrope behind an overdriven detonation wave for the flyer plate experiment, and expansion following a reflected CJ detonation for the plate-push experiment, which is sensitive to the Gruneisen coefficient. The results of our findings for PBX 9501 are presented here.
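    Products equations of state of the kind being tested are commonly written in Jones-Wilkins-Lee (JWL) form. A sketch of the standard JWL pressure evaluation follows; the parameter values in the usage example are illustrative placeholders, not a calibrated PBX 9501 set:

```python
import math

def jwl_pressure(V, E, A, B, R1, R2, omega):
    """Jones-Wilkins-Lee equation of state for detonation products:

        p = A(1 - w/(R1 V)) exp(-R1 V) + B(1 - w/(R2 V)) exp(-R2 V) + w E / V

    V : relative volume v/v0 (dimensionless)
    E : internal energy per unit initial volume (same units as A, B)
    Returns pressure in the units of A and B.
    """
    return (A * (1.0 - omega / (R1 * V)) * math.exp(-R1 * V)
            + B * (1.0 - omega / (R2 * V)) * math.exp(-R2 * V)
            + omega * E / V)
```

    On expansion (larger V, lower residual energy) the products pressure should fall monotonically toward zero, which is what cylinder-test data constrain.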

  19. Achieving behavioral control with millisecond resolution in a high-level programming environment

    PubMed Central

    Asaad, Wael F.; Eskandar, Emad N.

    2008-01-01

    The creation of psychophysical tasks for the behavioral neurosciences has generally relied upon low-level software running on a limited range of hardware. Despite the availability of software that allows the coding of behavioral tasks in high-level programming environments, many researchers are still reluctant to trust the temporal accuracy and resolution of programs running in such environments, especially when they run atop non-real-time operating systems. Thus, the creation of behavioral paradigms has been slowed by the intricacy of the coding required and their dissemination across labs has been hampered by the various types of hardware needed. However, we demonstrate here that, when proper measures are taken to handle the various sources of temporal error, accuracy can be achieved at the one millisecond time-scale that is relevant for the alignment of behavioral and neural events. PMID:18606188
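    One of the "proper measures" for handling temporal error on a non-real-time OS is to replace a bare sleep with a sleep-then-spin wait: sleep coarsely (cheap but jittery), then busy-wait the last couple of milliseconds. The original toolbox runs in a different high-level environment, so this Python sketch only illustrates the technique, not the paper's code:

```python
import time

def wait_until(deadline, spin_margin=0.002):
    """Sleep coarsely, then busy-wait the final ~2 ms to hit `deadline`
    (a time.perf_counter timestamp) with sub-millisecond accuracy."""
    remaining = deadline - time.perf_counter()
    if remaining > spin_margin:
        time.sleep(remaining - spin_margin)   # coarse, OS-scheduled portion
    while time.perf_counter() < deadline:     # fine, CPU-bound portion
        pass

start = time.perf_counter()
target = start + 0.010                        # a 10 ms behavioral interval
wait_until(target)
error_ms = (time.perf_counter() - target) * 1000.0
```

    The busy-wait can only overshoot, never undershoot, so alignment error is one-sided and typically far below a millisecond.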

  20. Semi-automated Anatomical Labeling and Inter-subject Warping of High-Density Intracranial Recording Electrodes in Electrocorticography

    PubMed Central

    Hamilton, Liberty S.; Chang, David L.; Lee, Morgan B.; Chang, Edward F.

    2017-01-01

    In this article, we introduce img_pipe, our open-source Python package for preprocessing imaging data for intracranial electrocorticography (ECoG) and intracranial stereo-EEG analyses. The process of electrode localization, labeling, and warping for ECoG currently varies widely across laboratories and is usually performed with custom, lab-specific code. This Python package aims to provide a standardized interface for these procedures, as well as code to plot and display results on 3D cortical surface meshes. It gives the user an easy interface for creating anatomically labeled electrodes that can also be warped to an atlas brain, starting with only a preoperative T1 MRI scan and a postoperative CT scan. We describe the full capabilities of our imaging pipeline and present a step-by-step protocol for users. PMID:29163118

  1. Sirepo: a web-based interface for physical optics simulations - its deployment and use at NSLS-II

    NASA Astrophysics Data System (ADS)

    Rakitin, Maksim S.; Chubar, Oleg; Moeller, Paul; Nagler, Robert; Bruhwiler, David L.

    2017-08-01

    "Sirepo" is an open-source, cloud-based software framework that provides a convenient and user-friendly web interface for scientific codes such as Synchrotron Radiation Workshop (SRW) running on a local machine or on a remote server. SRW is a physical optics code that simulates the synchrotron radiation from various insertion devices (undulators and wigglers) and bending magnets. Another feature of SRW is its support for high-accuracy simulation of fully and partially coherent radiation propagation through X-ray optical beamlines, facilitated by the so-called "Virtual Beamline" module. In the present work, we discuss the most important features of the Sirepo/SRW interface with emphasis on their use for commissioning of beamlines and simulation of experiments at the National Synchrotron Light Source II. In particular, the "Flux through Finite Aperture" and "Intensity" reports, visualizing the results of the corresponding SRW calculations, are routinely used for commissioning of undulators and X-ray optical elements. Material properties of crystals, compound refractive lenses, and some other optical elements can be obtained dynamically for the desired photon energy from databases publicly available at Argonne National Lab and Lawrence Berkeley Lab. In collaboration with the Center for Functional Nanomaterials (CFN) at BNL, a library of samples for coherent scattering experiments has been implemented in SRW, and the corresponding Sample optical element was added to Sirepo. Electron microscope images of artificially created nanoscale samples can be uploaded to Sirepo to simulate the scattering patterns created by synchrotron radiation in the different experimental schemes that can be realized at beamlines.
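    The undulator calculations mentioned above rest on the standard on-axis resonance relation, lambda_n = (lambda_u / (2 n gamma^2)) (1 + K^2/2). A small sketch of that formula (this is not SRW code, and the beam/undulator parameters in the example are illustrative):

```python
import math

def undulator_photon_energy_eV(E_GeV, lambda_u_m, K, harmonic=1):
    """On-axis resonant photon energy of a planar undulator:
    lambda_n = (lambda_u / (2 n gamma^2)) * (1 + K^2 / 2), E = hc / lambda."""
    gamma = E_GeV * 1e9 / 510998.95               # electron energy / (m_e c^2 in eV)
    lam = lambda_u_m / (2.0 * harmonic * gamma**2) * (1.0 + K**2 / 2.0)
    return 1239.841984e-9 / lam                   # hc = 1239.84 eV*nm
```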

  2. Fostering Team Awareness in Earth System Modeling Communities

    NASA Astrophysics Data System (ADS)

    Easterbrook, S. M.; Lawson, A.; Strong, S.

    2009-12-01

    Existing Global Climate Models are typically managed and controlled at a single site, with varied levels of participation by scientists outside the core lab. As these models evolve to encompass a wider set of earth systems, this central control of the modeling effort becomes a bottleneck. But such models cannot evolve to become fully distributed open source projects unless they address the imbalance in the availability of communication channels: scientists at the core site have access to regular face-to-face communication with one another, while those at remote sites have access to only a subset of these conversations - e.g. formally scheduled teleconferences and user meetings. Because of this imbalance, critical decision making can be hidden from many participants, their code contributions can interact in unanticipated ways, and the community loses awareness of who knows what. We have documented some of these problems in a field study at one climate modeling centre, and started to develop tools to overcome these problems. We report on one such tool, TracSNAP, which analyzes the social network of the scientists contributing code to the model by extracting the data in an existing project code repository. The tool presents the results of this analysis to modelers and model users in a number of ways: recommendations for who has expertise on particular code modules, suggestions for code sections that are related to files being worked on, and visualizations of team communication patterns. The tool is currently available as a plugin for the Trac bug tracking system.
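    The repository analysis TracSNAP performs can be approximated from commit metadata alone: count who touches each file to recommend experts, and count which files change together to suggest related code sections. A simplified sketch on toy commit tuples (TracSNAP's actual extraction from Trac is not shown here):

```python
from collections import Counter, defaultdict
from itertools import combinations

def analyze_commits(commits):
    """commits: iterable of (author, [files]) pairs, a simplified stand-in
    for records mined from a real project repository.

    Returns (expertise, related):
      expertise : file -> Counter of authors who touched it
      related   : frozenset({f1, f2}) -> number of commits changing both
    """
    expertise = defaultdict(Counter)
    related = Counter()
    for author, files in commits:
        for f in files:
            expertise[f][author] += 1
        for pair in combinations(sorted(set(files)), 2):
            related[frozenset(pair)] += 1
    return expertise, related

def expert_for(expertise, filename):
    """Most frequent committer to `filename`."""
    return expertise[filename].most_common(1)[0][0]
```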

  3. Real-time blood flow visualization using the graphics processing unit

    NASA Astrophysics Data System (ADS)

    Yang, Owen; Cuccia, David; Choi, Bernard

    2011-01-01

    Laser speckle imaging (LSI) is a technique in which coherent light incident on a surface produces a reflected speckle pattern that is related to the underlying movement of optical scatterers, such as red blood cells, indicating blood flow. Image-processing algorithms can be applied to produce speckle flow index (SFI) maps of relative blood flow. We present a novel algorithm that employs the NVIDIA Compute Unified Device Architecture (CUDA) platform to perform laser speckle image processing on the graphics processing unit. Software written in C was combined with CUDA and integrated into a LabVIEW Virtual Instrument (VI) interfaced with a monochrome CCD camera able to acquire high-resolution raw speckle images at nearly 10 fps. With the CUDA code integrated into the LabVIEW VI, SFI images were also processed and displayed at ~10 fps. We present three video examples depicting real-time flow imaging during a reactive hyperemia maneuver, fluid flow through an in vitro phantom, and a demonstration of real-time LSI during laser surgery of a port wine stain birthmark.
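    The per-pixel arithmetic behind an SFI map is local speckle contrast, K = sigma/mean over a sliding window, mapped to a flow index (commonly 1/K^2). A CPU-side NumPy sketch of that arithmetic; the paper's contribution is the CUDA/GPU implementation, which is not reproduced here, and the window size and 1/K^2 convention are assumptions:

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def speckle_flow_index(raw, window=7):
    """Local speckle contrast K = std/mean in a sliding window,
    mapped to a simple flow index 1/K^2 (one common convention)."""
    pad = window // 2
    img = np.pad(np.asarray(raw, dtype=float), pad, mode="reflect")
    win = sliding_window_view(img, (window, window))   # (H, W, k, k)
    mean = win.mean(axis=(-2, -1))
    std = win.std(axis=(-2, -1))
    K = std / np.maximum(mean, 1e-12)                  # speckle contrast
    # higher flow blurs the speckle -> lower K -> higher flow index
    return np.where(K > 0, 1.0 / np.maximum(K, 1e-12) ** 2, 0.0)
```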

  4. Optimized holographic femtosecond laser patterning method towards rapid integration of high-quality functional devices in microchannels

    PubMed Central

    Zhang, Chenchu; Hu, Yanlei; Du, Wenqiang; Wu, Peichao; Rao, Shenglong; Cai, Ze; Lao, Zhaoxin; Xu, Bing; Ni, Jincheng; Li, Jiawen; Zhao, Gang; Wu, Dong; Chu, Jiaru; Sugioka, Koji

    2016-01-01

    Rapid integration of high-quality functional devices in microchannels is in high demand for miniature lab-on-a-chip applications. This paper demonstrates the embellishment of existing microfluidic devices with integrated micropatterns via femtosecond laser MRAF-based holographic patterning (MHP) microfabrication, showing that two-photon polymerization (TPP) based on a spatial light modulator (SLM) is a rapid and powerful technology for chip functionalization. An optimized mixed-region amplitude freedom (MRAF) algorithm is used to generate a high-quality shaped focal field. Based on the optimized parameters, a single-exposure approach is developed to fabricate 200 × 200 μm microstructure arrays in less than 240 ms. Moreover, microtraps, QR codes, and letters are integrated into a microdevice by this method for particle capture and device identification. These results indicate that such holographic laser embellishment of microfluidic devices is simple, flexible, and easy to access, and has great potential in lab-on-a-chip applications such as biological culture, chemical analysis, and optofluidic devices. PMID:27619690

  5. Real-time blood flow visualization using the graphics processing unit

    PubMed Central

    Yang, Owen; Cuccia, David; Choi, Bernard

    2011-01-01

    Laser speckle imaging (LSI) is a technique in which coherent light incident on a surface produces a reflected speckle pattern that is related to the underlying movement of optical scatterers, such as red blood cells, indicating blood flow. Image-processing algorithms can be applied to produce speckle flow index (SFI) maps of relative blood flow. We present a novel algorithm that employs the NVIDIA Compute Unified Device Architecture (CUDA) platform to perform laser speckle image processing on the graphics processing unit. Software written in C was combined with CUDA and integrated into a LabVIEW Virtual Instrument (VI) interfaced with a monochrome CCD camera able to acquire high-resolution raw speckle images at nearly 10 fps. With the CUDA code integrated into the LabVIEW VI, SFI images were also processed and displayed at ∼10 fps. We present three video examples depicting real-time flow imaging during a reactive hyperemia maneuver, fluid flow through an in vitro phantom, and a demonstration of real-time LSI during laser surgery of a port wine stain birthmark. PMID:21280915

  6. Visualizing the geography of genetic variants.

    PubMed

    Marcus, Joseph H; Novembre, John

    2017-02-15

    One of the key characteristics of any genetic variant is its geographic distribution. The geographic distribution can shed light on where an allele first arose, what populations it has spread to, and in turn on how migration, genetic drift, and natural selection have acted. The geographic distribution of a genetic variant can also be of great utility for medical and clinical geneticists, and collectively many genetic variants can reveal population structure. Here we develop an interactive visualization tool for rapidly displaying the geographic distribution of genetic variants. Through a REST API and dynamic front-end, the Geography of Genetic Variants (GGV) browser (http://popgen.uchicago.edu/ggv/) provides maps of allele frequencies in populations distributed across the globe. GGV is implemented as a website (http://popgen.uchicago.edu/ggv/) which employs an API to access frequency data (http://popgen.uchicago.edu/freq_api/). Python and JavaScript source code for the website and the API are available at http://github.com/NovembreLab/ggv/ and http://github.com/NovembreLab/ggv-api/. Contact: jnovembre@uchicago.edu. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.
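    Programmatic access goes through the frequency API noted above. The sketch below only constructs a query URL; the parameter names (chr, pos, data) are hypothetical placeholders, so consult the ggv-api source on GitHub for the real interface:

```python
from urllib.parse import urlencode

API_BASE = "http://popgen.uchicago.edu/freq_api/"

def freq_query_url(chrom, pos, data="1000genomes"):
    """Build a query URL for the GGV frequency API.

    The parameter names used here (chr, pos, data) are hypothetical;
    see http://github.com/NovembreLab/ggv-api/ for the actual interface."""
    return API_BASE + "?" + urlencode({"chr": chrom, "pos": pos, "data": data})
```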

  7. A comparison of life prediction methodologies for titanium matrix composites subjected to thermomechanical fatigue

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Calcaterra, J.R.; Johnson, W.S.; Neu, R.W.

    1997-12-31

    Several methodologies have been developed to predict the lives of titanium matrix composites (TMCs) subjected to thermomechanical fatigue (TMF). This paper reviews and compares five life prediction models developed at NASA-LaRC and Wright Laboratories. Three of the models are based on a single parameter, the fiber stress in the load-carrying, or 0°, direction. The two other models, both developed at Wright Labs, are multi-parameter models. These can account for long-term damage, which is beyond the scope of the single-parameter models, but this benefit is offset by the additional complexity of the methodologies. Each of the methodologies was used to model data generated at NASA-LeRC, Wright Labs, and Georgia Tech for the SCS-6/Timetal 21-S material system. VISCOPLY, a micromechanical stress analysis code, was used to determine the constituent stress state for each test and for each model to maintain consistency. The predictive capabilities of the models are compared, and the ability of each model to accurately predict the responses of tests dominated by differing damage mechanisms is addressed.
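    The single-parameter models key on the 0° fiber stress. As a simplified stand-in for VISCOPLY's micromechanics, an iso-strain rule-of-mixtures estimate can be sketched; the moduli and volume fraction in the usage example are illustrative, not calibrated SCS-6/Timetal 21-S values:

```python
def fiber_stress(sigma_composite, V_f, E_f, E_m):
    """0-degree fiber stress from the iso-strain rule of mixtures:

        sigma_f = sigma_c * E_f / (V_f * E_f + (1 - V_f) * E_m)

    Fibers and matrix share the same axial strain, so the stiffer fiber
    carries a disproportionate share of the applied composite stress."""
    E_c = V_f * E_f + (1.0 - V_f) * E_m   # composite axial modulus
    return sigma_composite * E_f / E_c
```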

  8. PHYLUCE is a software package for the analysis of conserved genomic loci.

    PubMed

    Faircloth, Brant C

    2016-03-01

    Targeted enrichment of conserved and ultraconserved genomic elements allows universal collection of phylogenomic data from hundreds of species at multiple time scales (<5 Ma to > 300 Ma). Prior to downstream inference, data from these types of targeted enrichment studies must undergo preprocessing to assemble contigs from sequence data; identify targeted, enriched loci from the off-target background data; align enriched contigs representing conserved loci to one another; and prepare and manipulate these alignments for subsequent phylogenomic inference. PHYLUCE is an efficient and easy-to-install software package that accomplishes these tasks across hundreds of taxa and thousands of enriched loci. PHYLUCE is written for Python 2.7. PHYLUCE is supported on OSX and Linux (RedHat/CentOS) operating systems. PHYLUCE source code is distributed under a BSD-style license from https://www.github.com/faircloth-lab/phyluce/ PHYLUCE is also available as a package (https://binstar.org/faircloth-lab/phyluce) for the Anaconda Python distribution that installs all dependencies, and users can request a PHYLUCE instance on iPlant Atmosphere (tag: phyluce). The software manual and a tutorial are available from http://phyluce.readthedocs.org/en/latest/ and test data are available from doi: 10.6084/m9.figshare.1284521. brant@faircloth-lab.org Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  9. JOVE NASA-FIT program: Microgravity and aeronomy projects

    NASA Technical Reports Server (NTRS)

    Patterson, James D.; Mantovani, James G.; Rassoul, Hamid K.

    1994-01-01

    This semi-annual status report is divided into two sections: Scanning Tunneling Microscopy Lab and Aeronomy Lab. The Scanning Tunneling Microscopy (STM) research involves studying solar cell materials using the STM built at Florida Tech with a portion of our initial JOVE equipment funding. One result of participation in the FSEC project will be the design and construction of a portable STM system. This could serve as a prototype STM system for use on the Space Shuttle during a Spacelab mission or onboard the proposed Space Station. The scanning tunneling microscope can image only the surface structure of electrically conductive crystals; by building an atomic force microscope (AFM), the surface structure of any sample, regardless of its conductivity, can be imaged. With regard to the Aeronomy Lab, a total of four different mesospheric oxygen emission codes were created to calculate the intensity along the line of sight of the shuttle observations for the 2972 Å, Herzberg I, Herzberg II, and Chamberlain bands. The thermosphere-ionosphere coupling project was completed with two major accomplishments: collection of 500 data points on the modulation of neutral wind with geophysical variables, and establishment of constraints on the behavior of the height of the ionosphere resulting from the interaction between geophysical and geometrical factors. The magnetotail plasma project has centered on familiarization with the subject in the form of a literature search and preprocessing of IMP-8 data.

  10. Online SVT Commissioning and Monitoring using a Service-Oriented Architecture Framework

    NASA Astrophysics Data System (ADS)

    Ruger, Justin; Gotra, Yuri; Weygand, Dennis; Ziegler, Veronique; Heddle, David; Gore, David

    2014-03-01

    Silicon Vertex Tracker detectors are devices used in high energy experiments for precision measurement of charged tracks close to the collision point. Early detection of faulty hardware is essential, and therefore so is the development of monitoring and commissioning software. The computing framework for the CLAS12 experiment at Jefferson Lab is a service-oriented architecture that allows efficient data flow from one service to another through loose coupling. I will present the strategy and development of services for CLAS12 Silicon Tracker data monitoring and commissioning within this framework, as well as preliminary results using test data.

  11. Development of a Computer Program for Store Airloads Prediction Technique

    DTIC Science & Technology

    1976-10-01


  12. A&M. TAN607 floor plans. Shows three floor levels of pool, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    A&M. TAN-607 floor plans. Shows three floor levels of pool, hot shop, and warm shop. Includes view of pool vestibule, personnel labyrinth, location of floor rails, and room numbers of office areas, labs, instrument rooms, and stairways. This drawing was re-drawn to show as-built features in 1993. Ralph M. Parsons 902-3-ANP-607-A 96. Date of original: December 1952. Approved by INEEL Classification Office for public release. INEEL index code no. 034-0607-00-693-106748 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  13. Proceedings of the IDA (Institute for Defense Analyses) Workshop on Formal Specification and Verification of Ada (Trade Name) (2nd) Held in Alexandria, Virginia on July 23-25, 1985.

    DTIC Science & Technology

    1985-11-01


  14. An introduction to scripting in Ruby for biologists.

    PubMed

    Aerts, Jan; Law, Andy

    2009-07-16

    The Ruby programming language has a lot to offer to any scientist with electronic data to process. Not only is the initial learning curve very shallow, but its reflection and meta-programming capabilities allow for the rapid creation of relatively complex applications while still keeping the code short and readable. This paper provides a gentle introduction to this scripting language for researchers without formal informatics training such as many wet-lab scientists. We hope this will provide such researchers an idea of how powerful a tool Ruby can be for their data management tasks and encourage them to learn more about it.

  15. Lake water quality mapping from LANDSAT

    NASA Technical Reports Server (NTRS)

    Scherz, J. P.

    1977-01-01

    The lakes in three LANDSAT scenes were mapped with the Bendix MDAS multispectral analysis system. Field checking of the maps by three separate individuals revealed approximately 90-95% correct classification for the lake categories selected. Variation between observers was about 5%. From the MDAS color-coded maps, the lake with the worst algae problem was easily located. This lake was closely checked, and a pollution source of 100 cows was found at the springs which fed the lake. The theory, lab work, and field work that made this demonstration project a practical lake classification procedure are presented.

  16. Theoretical Studies of the Interface Electronic Properties of Tetrahedrally Coordinated Semiconductors.

    DTIC Science & Technology

    1987-09-29


  17. Characterization of Proxy Application Performance on Advanced Architectures. UMT2013, MCB, AMG2013

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Howell, Louis H.; Gunney, Brian T.; Bhatele, Abhinav

    2015-10-09

    Three codes were tested at LLNL as part of a Tri-Lab effort to make detailed assessments of several proxy applications on various advanced architectures, with the eventual goal of extending these assessments to codes of programmatic interest running more realistic simulations. Teams from Sandia and Los Alamos tested proxy apps of their own. The focus in this report is on the LLNL codes UMT2013, MCB, and AMG2013. We present weak and strong MPI scaling results and studies of OpenMP efficiency on a large BG/Q system at LLNL, with comparison against similar tests on an Intel Sandy Bridge TLCC2 system. The hardware counters on BG/Q provide detailed information on many aspects of on-node performance, while information from the mpiP tool gives insight into the reasons for the differing scaling behavior on these two different architectures. Results from three more speculative tests are also included: one that exploits NVRAM as extended memory, one that studies performance under a power bound, and one that illustrates the effects of changing the torus network mapping on BG/Q.
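    The weak and strong scaling results referred to above are conventionally summarized as parallel efficiencies; a sketch of the two standard definitions (the timing numbers in the test are made up for illustration):

```python
def strong_scaling_efficiency(t1, tp, p):
    """Strong scaling: the same total problem on p ranks.
    Ideal time is t1/p, so efficiency = t1 / (p * tp)."""
    return t1 / (p * tp)

def weak_scaling_efficiency(t1, tp):
    """Weak scaling: problem size grows with p (fixed work per rank).
    Ideal time is unchanged, so efficiency = t1 / tp."""
    return t1 / tp
```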

  18. T-38 Primary Flight Display Prototyping and HIVE Support Abstract & Summary

    NASA Technical Reports Server (NTRS)

    Boniface, Andrew

    2015-01-01

    This fall I worked in EV3 within NASA's Johnson Space Center in the HIVE (Human Integrated Vehicles & Environments). The HIVE is responsible for human-in-the-loop testing, getting new technologies in front of astronauts, operators, and users early in the development cycle to make the interfaces more human friendly. Some projects the HIVE is working on include user interfaces for future spacecraft, wearables to alert astronauts about important information, and test beds to simulate mock missions. During my internship I created a prototype for T-38 aircraft displays using LabVIEW, learned how to use microcontrollers, and helped out with other small tasks in the HIVE. The purpose of developing a prototype for T-38 displays in LabVIEW is to analyze functions of the display, such as navigation, in a cost- and time-effective manner. The LabVIEW prototypes allow Ellington Field AOD to easily make adjustments to the display before hardcoding the final product. LabVIEW was used to create a user interface for simulation almost identical to the real aircraft display. Goals for the T-38 PFD (Primary Flight Display) prototype included recreating the T-38 PFD hardware display in a software environment, designing navigation for the menus, incorporating vertical and horizontal navigation bars, and adding a heading bug for compass controls connected to the HSI (Horizontal Situation Indicator). To get started with the project, measurements of the entire display were taken, which enabled an accurate model of the hardware display to be created. Navigating the menus required some exploration of the different buttons on the display. The T-38 simulator and aircraft were used for examining the display. After one piece of the prototype was finished, another trip to the simulator took place. This was done until all goals for the prototype were complete.
Some possible integration ideas for displays in the near future are autopilot selection, touch screen displays, and crew member preferences. Complete navigation, control, and function customization will be achievable once a display is fully developed. Other than the T-38 prototyping, I spent time learning how to design small circuits and write the code that makes them function. This was done by adding electronic components to a breadboard and microcontroller, then writing code to communicate with those components through the microcontroller. I went through an Arduino starter kit to build circuits and write software that drove the hardware. This work was planned to assist in a lighting project this fall, but another solution was found for that project. Other tasks I assisted with included hands-on work such as mock-up construction/removal, logic analyzer repairs, and soldering circuits. The unique opportunity to be involved in work with NASA has significantly changed my educational and career goals. This opportunity has opened the door to my career in engineering. I have learned over the span of this internship that I am fascinated by the type of work that NASA does. My desire to work in the aerospace industry has increased immensely. I hope to return to NASA to be more involved in the advancement of science, engineering, and spaceflight. My interests for my future education and career lie in NASA’s work - pioneering the future in space exploration, scientific discovery and aeronautics research.

  19. Continuum Vlasov Simulation in Four Phase-space Dimensions

    NASA Astrophysics Data System (ADS)

    Cohen, B. I.; Banks, J. W.; Berger, R. L.; Hittinger, J. A.; Brunner, S.

    2010-11-01

    In the VALHALLA project, we are developing scalable algorithms for the continuum solution of the Vlasov-Maxwell equations in two spatial and two velocity dimensions. We use fourth-order temporal and spatial discretizations of the conservative form of the equations and a finite-volume representation to enable adaptive mesh refinement and nonlinear oscillation control [1]. The code has been implemented with and without adaptive mesh refinement, and with electromagnetic and electrostatic field solvers. A goal is to study the efficacy of continuum Vlasov simulations in four phase-space dimensions for laser-plasma interactions. We have verified the code in examples such as the two-stream instability, the weak beam-plasma instability, Landau damping, electron plasma waves with electron trapping and nonlinear frequency shifts [2] extended from 1D to 2D propagation, and light wave propagation. We will report progress on code development, computational methods, and physics applications. This work was performed under the auspices of the U.S. DOE by LLNL under contract no. DE-AC52-07NA27344. This work was funded by the Lab. Dir. Res. and Dev. Prog. at LLNL under project tracking code 08-ERD-031. [1] J.W. Banks and J.A.F. Hittinger, to appear in IEEE Trans. Plas. Sci. (Sept., 2010). [2] G.J. Morales and T.M. O'Neil, Phys. Rev. Lett. 28, 417 (1972); R.L. Dewar, Phys. Fluids 15, 712 (1972).
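
    The conservative finite-volume idea mentioned above can be illustrated with a toy example. The sketch below advects a 1D profile with simple first-order upwind fluxes (not the fourth-order discretization VALHALLA uses) and checks the property that motivates the conservative form: the total integral of the solution is preserved.

```python
# Toy analogue of a conservative finite-volume update: each cell average is
# changed only by the difference of interface fluxes, so the total "mass"
# telescopes and is conserved (periodic boundaries, wave speed c > 0).

def advect_fv(u, c, dx, dt, steps):
    """Advance u_i -= (dt/dx) * (F_{i+1/2} - F_{i-1/2}) with upwind fluxes."""
    u = list(u)
    n = len(u)
    for _ in range(steps):
        flux = [c * u[i] for i in range(n)]  # upwind flux at interface i+1/2
        u = [u[i] - dt / dx * (flux[i] - flux[i - 1]) for i in range(n)]
    return u

u0 = [1.0 if 3 <= i <= 6 else 0.0 for i in range(20)]
u1 = advect_fv(u0, c=1.0, dx=0.1, dt=0.05, steps=40)
print(sum(u0), sum(u1))  # totals agree to machine precision: conservative
```

    The demo uses a CFL number of 0.5, so the first-order scheme stays stable and monotone; the point is only the telescoping-flux structure, not accuracy.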

  20. Genomics dataset on unclassified published organism (patent US 7547531).

    PubMed

    Khan Shawan, Mohammad Mahfuz Ali; Hasan, Md Ashraful; Hossain, Md Mozammel; Hasan, Md Mahmudul; Parvin, Afroza; Akter, Salina; Uddin, Kazi Rasel; Banik, Subrata; Morshed, Mahbubul; Rahman, Md Nazibur; Rahman, S M Badier

    2016-12-01

    Nucleotide (DNA) sequence analysis provides important clues regarding the characteristics and taxonomic position of an organism; consequently, DNA sequence analysis is crucial for learning about the hierarchical classification of that organism. This dataset (patent US 7547531) was chosen to simplify the complex raw data buried in undisclosed DNA sequences, which helps to open doors for new collaborations. In this dataset, a total of 48 unidentified DNA sequences from patent US 7547531 were selected and their complete sequences were retrieved from the NCBI BioSample database. Quick response (QR) codes of those DNA sequences were constructed with the DNA BarID tool. A QR code is useful for the identification and comparison of isolates with other organisms. The AT/GC content of the DNA sequences was determined using the ENDMEMO GC Content Calculator, which indicates their stability at different temperatures. The highest GC content was observed in GP445188 (62.5%), followed by GP445198 (61.8%) and GP445189 (59.44%), while the lowest was in GP445178 (24.39%). In addition, the New England BioLabs (NEB) database was used to identify cleavage codes indicating 5′, 3′ and blunt ends, and enzyme codes indicating the methylation sites of the DNA sequences were also shown. These data will be helpful for the construction of the organisms' hierarchical classification, determination of their phylogenetic and taxonomic position, and revelation of their molecular characteristics.
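
    The AT/GC-content figures above come from simple base counting. A minimal sketch of that arithmetic (the paper used the ENDMEMO web calculator; this reproduces the computation, not its code):

```python
def gc_content(seq):
    """Percentage of G and C bases in a DNA sequence (case-insensitive)."""
    seq = seq.upper()
    return 100.0 * (seq.count("G") + seq.count("C")) / len(seq)

print(gc_content("ATGCGC"))  # -> 66.66...
```

    Higher GC content means more triple-bonded base pairs, hence the link to thermal stability noted in the abstract.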

  1. Combustion Stability Analyses for J-2X Gas Generator Development

    NASA Technical Reports Server (NTRS)

    Hulka, J. R.; Protz, C. S.; Casiano, M. J.; Kenny, R. J.

    2010-01-01

    The National Aeronautics and Space Administration (NASA) is developing a liquid oxygen/liquid hydrogen rocket engine for upper stage and trans-lunar applications of the Ares vehicles for the Constellation program. This engine, designated the J-2X, is a higher pressure, higher thrust variant of the Apollo-era J-2 engine. Development was contracted to Pratt & Whitney Rocketdyne in 2006. Over the past several years, development of the gas generator for the J-2X engine has progressed through a variety of workhorse injector, chamber, and feed system configurations. Several of these configurations have resulted in injection-coupled combustion instability of the gas generator assembly at the first longitudinal mode of the combustion chamber. In this paper, the longitudinal mode combustion instabilities observed on the workhorse test stand are discussed in detail. Aspects of this combustion instability have been modeled at the NASA Marshall Space Flight Center with several codes, including the Rocket Combustor Interaction Design and Analysis (ROCCID) code and a new lumped-parameter MatLab model. To accurately predict the instability characteristics of all the chamber and injector geometries and test conditions, several features of the submodels in the ROCCID suite of calculations required modification. Finite-element analyses were conducted of several complicated combustion chamber geometries to determine how to model and anchor the chamber response in ROCCID. A large suite of sensitivity calculations was conducted to determine how to model and anchor the injector response in ROCCID. These modifications and their ramifications for future stability analyses of this type are discussed in detail. The lumped-parameter MatLab model of the gas generator assembly was created as an alternative calculation to the ROCCID methodology. This paper also describes this model and the stability calculations.
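
    As a back-of-the-envelope illustration of the first-longitudinal-mode physics discussed above, the "1L" acoustic frequency of a chamber follows from its sound speed and length. The numbers below are placeholders for illustration, not J-2X gas generator data.

```python
def first_longitudinal_mode_hz(sound_speed_m_s, chamber_length_m):
    """f_1L = a / (2 L) for a chamber treated as acoustically closed-closed."""
    return sound_speed_m_s / (2.0 * chamber_length_m)

# Illustrative values only: a hot-gas sound speed of 1200 m/s in a 0.6 m chamber.
print(first_longitudinal_mode_hz(1200.0, 0.6))  # -> 1000.0 Hz
```

    Codes such as ROCCID compare an injector response against a chamber response at frequencies like this one to assess stability margins.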

  2. Intra-/inter-laboratory validation study on reactive oxygen species assay for chemical photosafety evaluation using two different solar simulators.

    PubMed

    Onoue, Satomi; Hosoi, Kazuhiro; Toda, Tsuguto; Takagi, Hironori; Osaki, Naoto; Matsumoto, Yasuhiro; Kawakami, Satoru; Wakuri, Shinobu; Iwase, Yumiko; Yamamoto, Toshinobu; Nakamura, Kazuichi; Ohno, Yasuo; Kojima, Hajime

    2014-06-01

    A previous multi-center validation study demonstrated high transferability and reliability of the reactive oxygen species (ROS) assay for photosafety evaluation. The present validation study was undertaken to further verify the applicability of different solar simulators and assay performance. In 7 participating laboratories, 2 standards and 42 coded chemicals, including 23 phototoxins and 19 non-phototoxic drugs/chemicals, were assessed by the ROS assay using two different solar simulators (Atlas Suntest CPS series, 3 labs; and Seric SXL-2500V2, 4 labs). Irradiation conditions could be optimized using quinine and sulisobenzone as positive and negative standards to offer consistent assay outcomes. With both solar simulators, the intra- and inter-day precisions (coefficient of variation; CV) for quinine were found to be below 10%. The inter-laboratory CV for quinine averaged 15.4% (Atlas Suntest CPS) and 13.2% (Seric SXL-2500V2) for singlet oxygen and 17.0% (Atlas Suntest CPS) and 7.1% (Seric SXL-2500V2) for superoxide, suggesting high inter-laboratory reproducibility even though different solar simulators were employed for the ROS assay. In the ROS assay on 42 coded chemicals, some chemicals (ca. 19-29%) were unevaluable because of limited solubility and spectral interference. Although several false positives appeared, with positive predictivity of ca. 76-92% (Atlas Suntest CPS) and ca. 75-84% (Seric SXL-2500V2), there were no false negative predictions with either solar simulator. This multi-center validation study on the ROS assay demonstrated satisfactory transferability, accuracy, precision, and predictivity, as well as the applicability of other solar simulators. Copyright © 2013 Elsevier Ltd. All rights reserved.
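
    The precision figures quoted above are coefficients of variation. A one-function sketch of that statistic (sample standard deviation over mean, as a percentage; the replicate readings below are made up for illustration):

```python
import statistics

def cv_percent(values):
    """Coefficient of variation: sample standard deviation / mean, in %."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

print(cv_percent([9.0, 10.0, 11.0]))  # -> 10.0
```

    An intra-day CV is computed over replicates within one run; the inter-day and inter-laboratory CVs apply the same formula across runs or across labs.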

  3. Three-dimensional brain reconstruction of in vivo electrode tracks for neuroscience and neural prosthetic applications

    PubMed Central

    Markovitz, Craig D.; Tang, Tien T.; Edge, David P.; Lim, Hubert H.

    2012-01-01

    The brain is a densely interconnected network that relies on populations of neurons within and across multiple nuclei to code for features leading to perception and action. However, the neurophysiology field is still dominated by the characterization of individual neurons, rather than simultaneous recordings across multiple regions, without consistent spatial reconstruction of their locations for comparisons across studies. There are sophisticated histological and imaging techniques for performing brain reconstructions. However, what is needed is a method that is relatively easy and inexpensive to implement in a typical neurophysiology lab and provides consistent identification of electrode locations to make it widely used for pooling data across studies and research groups. This paper presents our initial development of such an approach for reconstructing electrode tracks and site locations within the guinea pig inferior colliculus (IC) to identify its functional organization for frequency coding relevant for a new auditory midbrain implant (AMI). Encouragingly, the spatial error associated with different individuals reconstructing electrode tracks for the same midbrain was less than 65 μm, corresponding to an error of ~1.5% relative to the entire IC structure (~4–5 mm diameter sphere). Furthermore, the reconstructed frequency laminae of the IC were consistently aligned across three sampled midbrains, demonstrating the ability to use our method to combine location data across animals. Hopefully, through further improvements in our reconstruction method, it can be used as a standard protocol across neurophysiology labs to characterize neural data not only within the IC but also within other brain regions to help bridge the gap between cellular activity and network function. Clinically, correlating function with location within and across multiple brain regions can guide optimal placement of electrodes for the growing field of neural prosthetics. PMID:22754502
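
    The ~1.5% figure follows directly from the numbers quoted: a 65 μm reconstruction error relative to a ~4-5 mm diameter structure. A quick check, taking 4.5 mm as the midpoint of the quoted range:

```python
def relative_error_percent(error_um, diameter_mm):
    """Reconstruction error as a percentage of the structure's diameter."""
    return 100.0 * (error_um / 1000.0) / diameter_mm

print(relative_error_percent(65.0, 4.5))  # about 1.44, consistent with ~1.5%
```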

  4. SU-E-T-157: CARMEN: A MatLab-Based Research Platform for Monte Carlo Treatment Planning (MCTP) and Customized System for Planning Evaluation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baeza, J.A.; Ureba, A.; Jimenez-Ortega, E.

    Purpose: Although several radiotherapy research platforms exist, such as CERR, the most widely used and referenced; SlicerRT, which allows treatment plan comparison from various sources; and MMCTP, a full MCTP system; a full MCTP toolset is still needed that gives users complete control of calculation grids, interpolation methods and filters in order to “fairly” compare results from different TPSs, supporting verification with experimental measurements. Methods: This work presents CARMEN, a MatLab-based platform including multicore and GPGPU accelerated functions for loading RT data, designing treatment plans, and evaluating dose matrices and experimental data. CARMEN supports anatomic and functional imaging in DICOM format, as well as RTSTRUCT, RTPLAN and RTDOSE. Besides, it contains numerous tools to accomplish the MCTP process, managing egs4phant and phase space files. The CARMEN planning mode assists in designing IMRT, VMAT and MERT treatments via both inverse and direct optimization. The evaluation mode contains a comprehensive toolset (e.g. 2D/3D gamma evaluation, difference matrices, profiles, DVH, etc.) to compare datasets from commercial TPSs, MC simulations (i.e. 3ddose) and radiochromic film in a user-controlled manner. Results: CARMEN has been validated against commercial TPSs and well-established evaluation tools, showing coherent behavior of its multiple algorithms. Furthermore, the CARMEN platform has been used to generate competitive complex treatments that have been published in comparative studies. Conclusion: A new research-oriented MCTP platform with a customized validation toolset has been presented. Despite being coded in a high-level programming language, CARMEN is agile due to the use of parallel algorithms. The widespread use of MatLab provides straightforward access to CARMEN's algorithms for most researchers. Similarly, our platform can benefit from the MatLab community's scientific developments such as filters, registration algorithms, etc. Finally, CARMEN highlights the importance of grid and filtering control in treatment plan comparison.
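
    One of the evaluation tools listed above, the gamma evaluation, can be sketched in simplified 1D form. This is the generic textbook gamma index with hypothetical 3%/3 mm criteria, not CARMEN's MatLab implementation, which works in 2D/3D and searches a spatial neighbourhood of each point:

```python
import math

def gamma_index(ref, eval_, dx_mm, dose_tol=0.03, dist_tol_mm=3.0):
    """Per-point gamma: min over j of sqrt((dD/dose_tol)^2 + (dr/dist_tol)^2).
    Dose differences are normalized to the global maximum of the reference."""
    gammas = []
    for i, d_ref in enumerate(ref):
        best = math.inf
        for j, d_eval in enumerate(eval_):
            dd = (d_eval - d_ref) / (dose_tol * max(ref))  # dose-difference term
            dr = (j - i) * dx_mm / dist_tol_mm             # distance-to-agreement term
            best = min(best, math.hypot(dd, dr))
        gammas.append(best)
    return gammas

# Identical profiles pass everywhere (gamma = 0 <= 1 at every point).
print(gamma_index([1.0, 2.0, 3.0], [1.0, 2.0, 3.0], dx_mm=1.0))
```

    A point passes the comparison when its gamma value is at most 1; plans are typically judged by the fraction of passing points.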

  5. DDS as middleware of the Southern African Large Telescope control system

    NASA Astrophysics Data System (ADS)

    Maartens, Deneys S.; Brink, Janus D.

    2016-07-01

    The Southern African Large Telescope (SALT) software control system is realised as a distributed control system, implemented predominantly in National Instruments' LabVIEW. The telescope control subsystems communicate using cyclic, state-based messages. Currently, transmitting a message is accomplished by performing an HTTP PUT request to a WebDAV directory on a centralised Apache web server, while receiving is based on polling the web server for new messages. While the method works, it presents a number of drawbacks; a scalable distributed communication solution with minimal overhead is a better fit for control systems. This paper describes our exploration of the Data Distribution Service (DDS). DDS is a formal standard specification, defined by the Object Management Group (OMG), that presents a data-centric publish-subscribe model for distributed application communication and integration. It provides an infrastructure for platform-independent many-to-many communication. A number of vendors provide implementations of the DDS standard; RTI, in particular, provides a DDS toolkit for LabVIEW. This toolkit has been evaluated against the needs of SALT, and a few deficiencies have been identified. We have developed our own implementation that interfaces LabVIEW to DDS in order to address our specific needs. Our LabVIEW DDS interface implementation is built against the RTI DDS Core component, provided by RTI under their Open Community Source licence. Our needs dictate that the interface implementation be platform independent. Since we have access to the RTI DDS Core source code, we are able to build the RTI DDS libraries for any of the platforms on which we require support. The communications functionality is based on UDP multicasting. Multicasting is an efficient communications mechanism with low overhead that avoids duplicated point-to-point transmission of data on a network where there are multiple recipients of the data. In the paper we present a performance evaluation of DDS against the current HTTP-based implementation as well as the historical DataSocket implementation. We conclude with a summary and describe future work.
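
    The data-centric publish-subscribe model at the heart of DDS can be sketched in a few lines. This in-process toy only illustrates the pattern (named topics, many writers, many readers); none of the names below come from the DDS API, and the real SALT interface runs over RTI's DDS Core with UDP multicast rather than in-process callbacks.

```python
from collections import defaultdict

class Bus:
    """Toy topic-based publish-subscribe bus (illustration of the DDS model)."""

    def __init__(self):
        self._subs = defaultdict(list)  # topic name -> list of reader callbacks

    def subscribe(self, topic, callback):
        """Register a reader for a topic; many readers per topic are allowed."""
        self._subs[topic].append(callback)

    def publish(self, topic, sample):
        """Deliver a data sample to every reader subscribed to the topic."""
        for cb in self._subs[topic]:
            cb(sample)

# Hypothetical topic name and payload, for illustration only.
bus = Bus()
received = []
bus.subscribe("tcs.status", received.append)
bus.publish("tcs.status", {"state": "TRACKING"})
print(received)
```

    The key property is decoupling: publishers know only topic names, never the identity or count of subscribers, which is what makes many-to-many distribution scale.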

  6. Migration to Current Open Source Technologies by MagIC Enables a More Responsive Website, Quicker Development Times, and Increased Community Engagement

    NASA Astrophysics Data System (ADS)

    Jarboe, N.; Minnett, R.; Koppers, A.; Constable, C.; Tauxe, L.; Jonestrask, L.

    2017-12-01

    The Magnetics Information Consortium (MagIC) supports an online database for the paleo, geo, and rock magnetic communities ( https://earthref.org/MagIC ). Researchers can upload data into the archive and download selected data with a sophisticated search system. MagIC has completed the transition from an Oracle-backed, Perl-based, server-oriented website to an ElasticSearch-backed, Meteor-based thick-client technology stack. Using JavaScript on both the server and the client enables increased code reuse and allows many computational operations to be offloaded to the client for faster response. On-the-fly data validation, column header suggestion, and online spreadsheet editing are some new features available with the new system. The 3.0 data model, method codes, and vocabulary lists can be browsed via the MagIC website and more easily updated. Source code for MagIC is publicly available on GitHub ( https://github.com/earthref/MagIC ). The MagIC file format is natively compatible with the PmagPy ( https://github.com/PmagPy/PmagPy ) paleomagnetic analysis software. MagIC files can now be downloaded from the database and viewed and interpreted in the PmagPy GUI-based tool, pmag_gui. Changes or interpretations of the data can then be saved by pmag_gui in the MagIC 3.0 data format and easily uploaded to the MagIC database. The rate of new contributions to the database has been increasing, with many labs contributing measurement-level data for the first time in the last year. Over a dozen file-format conversion scripts are available for translating non-MagIC measurement data files into the MagIC format for easy uploading. We will continue to work with more labs until the whole community has a manageable workflow for contributing their measurement-level data.
MagIC will continue to provide a global repository for archiving and retrieving paleomagnetic and rock magnetic data and, with the new system in place, be able to more quickly respond to the community's requests for changes and improvements.
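
    As a hedged sketch of what such a file-format conversion script does, the example below renames the columns of a lab CSV and emits tab-delimited rows. Both the input columns and the renaming map are hypothetical illustrations; the real conversion scripts target the actual MagIC 3.0 data model and its column names.

```python
import csv
import io

def csv_to_tab(csv_text, column_map):
    """Rename columns per column_map and emit tab-delimited text
    (a minimal stand-in for a measurement-file conversion script)."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    cols = [column_map.get(c, c) for c in rows[0].keys()]  # renamed header
    out = ["\t".join(cols)]
    for r in rows:
        out.append("\t".join(r[c] for c in r.keys()))      # data rows unchanged
    return "\n".join(out)

# Hypothetical lab export with a column renamed to an illustrative target name.
print(csv_to_tab("specimen,moment\ns1,1.2e-9\n", {"moment": "magn_moment"}))
```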

  7. FPGA based digital phase-coding quantum key distribution system

    NASA Astrophysics Data System (ADS)

    Lu, XiaoMing; Zhang, LiJun; Wang, YongGang; Chen, Wei; Huang, DaJun; Li, Deng; Wang, Shuang; He, DeYong; Yin, ZhenQiang; Zhou, Yu; Hui, Cong; Han, ZhengFu

    2015-12-01

    Quantum key distribution (QKD) is a technology with the potential capability to achieve information-theoretic security. Phase coding is an important approach to developing practical QKD systems in fiber channels. In order to improve the phase-coding modulation rate, we propose a new digital-modulation method in this paper and constructed a compact and robust prototype QKD system using components currently available in our lab to demonstrate the effectiveness of the method. The system was deployed in a laboratory environment over a 50 km fiber and operated continuously for 87 h without manual interaction. The quantum bit error rate (QBER) of the system was stable, with an average value of 3.22%, and the secure key generation rate was 8.91 kbps. Although the photon modulation rate of the demo system was only 200 MHz, limited by the Faraday-Michelson interferometer (FMI) structure, the proposed method and the field programmable gate array (FPGA) based electronics scheme have great potential for high-speed QKD systems with Giga-bits/second modulation rates.
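
    The reported figures follow from the standard definitions, sketched below. The bit counts used in the demo are illustrative round numbers chosen to reproduce the abstract's values, not measured data from the prototype.

```python
def qber_percent(error_bits, sifted_bits):
    """Quantum bit error rate: errored fraction of the sifted key, in %."""
    return 100.0 * error_bits / sifted_bits

def key_rate_kbps(secure_key_bits, seconds):
    """Secure key generation rate in kilobits per second."""
    return secure_key_bits / seconds / 1000.0

print(qber_percent(322, 10000))  # -> 3.22
print(key_rate_kbps(8910, 1.0))  # -> 8.91
```

    Keeping the QBER stable matters because the secure key rate collapses once the error rate approaches the security threshold of the protocol in use.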

  8. Computational Biology Methods for Characterization of Pluripotent Cells.

    PubMed

    Araúzo-Bravo, Marcos J

    2016-01-01

    Pluripotent cells are a powerful tool for regenerative medicine and drug discovery. Several techniques have been developed to induce pluripotency, or to extract pluripotent cells from different tissues and biological fluids. However, the characterization of pluripotency requires tedious, expensive, time-consuming, and not always reliable wet-lab experiments; thus, an easy, standard quality-control protocol for pluripotency assessment remains to be established. High-throughput techniques can help here, in particular gene expression microarrays, which have become a complementary technique for cellular characterization. Research has shown that transcriptomic comparison with a reference Embryonic Stem Cell (ESC) is a good approach to assessing pluripotency. Under the premise that the best protocol is computer software source code, here I propose and explain line by line a software protocol coded in R-Bioconductor for pluripotency assessment based on the comparison of transcriptomic data of pluripotent cells with a reference ESC. I provide advice on experimental design, warnings about possible pitfalls, and guidance on interpreting results.
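
    The core of the comparison step is a similarity measure between a sample's expression profile and the ESC reference profile. The protocol itself is written in R-Bioconductor; this Python sketch only shows the idea, using a Pearson correlation over hypothetical expression vectors:

```python
import math

def pearson(x, y):
    """Pearson correlation between two equal-length expression profiles."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Proportional profiles correlate perfectly (values are illustrative).
print(pearson([1.0, 2.0, 3.0], [2.0, 4.0, 6.0]))  # ≈ 1.0
```

    A sample whose transcriptome correlates strongly with the ESC reference, on an appropriate gene set, is evidence for pluripotency; the published protocol adds normalization and gene selection around this core.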

  9. Development of the STPX Spheromak System

    NASA Astrophysics Data System (ADS)

    Williams, R. L.; Clark, J.; Weatherford, C. A.

    2015-11-01

    The progress made in starting up the STPX Spheromak system, which is now installed at Florida A&M University, is reviewed. Experimental, computational and theoretical activities are underway. The control system for firing the magnetized coaxial plasma gun and for collecting data from the diagnostic probes, based on LabVIEW, is being tested and adapted. Preliminary results of testing the installed magnetic field probes, Langmuir triple probes, cylindrical ion probes, and optical diagnostics will be discussed. Progress in modeling this spheromak using simulation codes, such as NIMROD, will be discussed. Progress in investigating the use of algebraic topology to describe this spheromak will be reported.

  10. An introduction to scripting in Ruby for biologists

    PubMed Central

    Aerts, Jan; Law, Andy

    2009-01-01

    The Ruby programming language has a lot to offer to any scientist with electronic data to process. Not only is the initial learning curve very shallow, but its reflection and meta-programming capabilities allow for the rapid creation of relatively complex applications while still keeping the code short and readable. This paper provides a gentle introduction to this scripting language for researchers without formal informatics training such as many wet-lab scientists. We hope this will provide such researchers an idea of how powerful a tool Ruby can be for their data management tasks and encourage them to learn more about it. PMID:19607723

  11. Intel NX to PVM 3.2 message passing conversion library

    NASA Technical Reports Server (NTRS)

    Arthur, Trey; Nelson, Michael L.

    1993-01-01

    NASA Langley Research Center has developed a library that allows Intel NX message passing codes to be executed under the more popular and widely supported Parallel Virtual Machine (PVM) message passing library. PVM was developed at Oak Ridge National Labs and has become the de facto standard for message passing. This library will allow the many programs that were developed on the Intel iPSC/860 or Intel Paragon in a Single Program Multiple Data (SPMD) design to be ported to the numerous architectures that PVM (version 3.2) supports. Also, the library adds global operations capability to PVM. A familiarity with Intel NX and PVM message passing is assumed.
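
    The conversion-library idea can be illustrated with a toy shim: keep the NX-style call names as thin wrappers over a different transport, so existing call sites do not change. Here the "transport" is an in-process queue purely for illustration; the real library mapped these calls onto PVM 3.2 primitives, and the signatures below are simplified from the actual NX API.

```python
import queue

_mailboxes = {}  # per-task message queues standing in for the real transport

def csend(msg_type, payload, dest):
    """NX-style tagged send (signature simplified from the real csend)."""
    _mailboxes.setdefault(dest, queue.Queue()).put((msg_type, payload))

def crecv(msg_type, me):
    """NX-style blocking receive of the next message, checked against the tag."""
    tag, payload = _mailboxes.setdefault(me, queue.Queue()).get()
    assert tag == msg_type, "out-of-order tag matching omitted in this sketch"
    return payload

csend(7, b"hello", dest=0)
print(crecv(7, me=0))
```

    The design point is that porting cost stays low: SPMD programs keep their NX call structure while the wrapper layer owns the mapping to the new message-passing substrate.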

  12. Using clinical data to predict high-cost performance coding issues associated with pressure ulcers: a multilevel cohort model.

    PubMed

    Padula, William V; Gibbons, Robert D; Pronovost, Peter J; Hedeker, Donald; Mishra, Manish K; Makic, Mary Beth F; Bridges, John Fp; Wald, Heidi L; Valuck, Robert J; Ginensky, Adam J; Ursitti, Anthony; Venable, Laura Ruth; Epstein, Ziv; Meltzer, David O

    2017-04-01

    Hospital-acquired pressure ulcers (HAPUs) have a mortality rate of 11.6%, are costly to treat, and result in Medicare reimbursement penalties. Medicare codes HAPUs according to Agency for Healthcare Research and Quality Patient-Safety Indicator 3 (PSI-03), but they are sometimes inappropriately coded. The objective is to use electronic health records to predict pressure ulcers and to identify coding issues leading to penalties. We evaluated all hospitalized patient electronic medical records at an academic medical center data repository between 2011 and 2014. These data contained patient encounter level demographic variables, diagnoses, prescription drugs, and provider orders. HAPUs were defined by PSI-03: stages III, IV, or unstageable pressure ulcers not present on admission as a secondary diagnosis, excluding cases of paralysis. Random forests reduced data dimensionality. Multilevel logistic regression of patient encounters evaluated associations between covariates and HAPU incidence. The approach produced a sample population of 21,153 patients with 1549 PSI-03 cases. The greatest odds ratio (OR) of HAPU incidence was among patients diagnosed with spinal cord injury (ICD-9 907.2: OR = 14.3; P < .001), and 71% of spinal cord injuries were not properly coded for paralysis, leading to a PSI-03 flag. Other high ORs included bed confinement (ICD-9 V49.84: OR = 3.1, P < .001) and a provider-ordered pre-albumin lab (OR = 2.5, P < .001). This analysis identifies spinal cord injuries as high risk for HAPUs and as being often inappropriately coded without paralysis, leading to PSI-03 flags. The resulting statistical model can be tested to predict HAPUs during hospitalization. Inappropriate coding of conditions leads to poor hospital performance measures and Medicare reimbursement penalties. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved.
For Permissions, please email: journals.permissions@oup.com
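
    The odds ratios quoted above come from multilevel logistic regression, which adjusts for covariates. The unadjusted 2x2-table version below shows the underlying arithmetic; the counts are illustrative, not the study's data.

```python
def odds_ratio(exposed_cases, exposed_noncases, unexposed_cases, unexposed_noncases):
    """Unadjusted odds ratio from a 2x2 table: (a/b) / (c/d) = a*d / (b*c)."""
    return (exposed_cases * unexposed_noncases) / (exposed_noncases * unexposed_cases)

# Illustrative counts: exposure roughly doubles the odds of the outcome.
print(odds_ratio(10, 90, 5, 95))  # ≈ 2.11
```

    An OR of 14.3 for spinal cord injury therefore means the odds of a HAPU among those patients were about fourteen times the odds among patients without that diagnosis, after adjustment.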

  13. The burden of clostridium difficile infection: estimates of the incidence of CDI from U.S. Administrative databases.

    PubMed

    Olsen, Margaret A; Young-Xu, Yinong; Stwalley, Dustin; Kelly, Ciarán P; Gerding, Dale N; Saeed, Mohammed J; Mahé, Cedric; Dubberke, Erik R

    2016-04-22

    Many administrative data sources are available to study the epidemiology of infectious diseases, including Clostridium difficile infection (CDI), but few publications have compared CDI event rates across databases using similar methodology. We used comparable methods with multiple administrative databases to compare the incidence of CDI in older and younger persons in the United States. We performed a retrospective study using three longitudinal data sources (Medicare, OptumInsight LabRx, and the Healthcare Cost and Utilization Project State Inpatient Database (SID)) and two hospital encounter-level data sources (the Nationwide Inpatient Sample (NIS) and the Premier Perspective database) to identify CDI in adults aged 18 and older, with calculation of CDI incidence rates/100,000 person-years of observation (pyo) and CDI categorization (onset and association). The incidence of CDI ranged from 66/100,000 in persons under 65 years (LabRx) to 383/100,000 in elderly persons (SID) and 677/100,000 in elderly persons (Medicare). Ninety percent of CDI episodes in the LabRx population were characterized as community-onset compared to 41% in the Medicare population. The majority of CDI episodes in the Medicare and LabRx databases were identified based on only a CDI diagnosis, whereas almost three-quarters of encounters coded for CDI in the Premier hospital data were confirmed with a positive test result plus treatment with metronidazole or oral vancomycin. Using only the Medicare inpatient data to calculate encounter-level CDI events resulted in 553 CDI events/100,000 persons, virtually the same as the encounter proportion calculated using the NIS (544/100,000 persons). We found that the incidence of CDI was 35% higher in the Medicare data and fewer episodes were attributed to hospital acquisition when all medical claims were used to identify CDI, compared to only inpatient data lacking information on diagnosis and treatment in the outpatient setting.
The incidence of CDI was 10-fold lower and the proportion of community-onset CDI was much higher in the privately insured younger LabRx population compared to the elderly Medicare population. The methods we developed to identify incident CDI can be used by other investigators to study the incidence of other infectious diseases and adverse events using large generalizable administrative datasets.
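
    The rates above use the standard epidemiological definition: cases per 100,000 person-years of observation (pyo). A minimal sketch, with numbers chosen to mirror the Medicare figure in the abstract rather than the actual denominators:

```python
def incidence_per_100k_py(cases, person_years):
    """Incidence rate per 100,000 person-years of observation (pyo)."""
    return 100000.0 * cases / person_years

print(incidence_per_100k_py(677, 100000))  # -> 677.0
```

    Person-years rather than raw headcounts keep the rates comparable across databases whose members are observed for different lengths of time.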

  14. Cyto-sensing in electrochemical lab-on-paper cyto-device for in-situ evaluation of multi-glycan expressions on cancer cells.

    PubMed

    Su, Min; Ge, Lei; Kong, Qingkun; Zheng, Xiaoxiao; Ge, Shenguang; Li, Nianqiang; Yu, Jinghua; Yan, Mei

    2015-01-15

    A novel electrochemical lab-on-paper cyto-device (ELPCD) was fabricated to demonstrate sensitive and specific cancer cell detection as well as in-situ monitoring of multi-glycans on living cancer cells. In this ELPCD, an aptamer-modified three-dimensional macroporous Au-paper electrode (Au-PE) was employed as the working electrode for specific and efficient cancer cell capture. Using a sandwich format, sensitive and reproducible cell detection was achieved in this ELPCD on the basis of the electrochemical signal amplification of the Au-PE and the horseradish peroxidase-lectin electrochemical probe. The ELPCD displayed excellent analytical performance for the detection of K562 cells, with a wide linear calibration range from 550 to 2.0×10⁷ cells mL⁻¹. Then, this ELPCD was successfully applied to determine cell-surface multi-glycans in parallel and in-situ monitor multi-glycan expression on living cells in response to drug treatment through in-electrode 3D cell culture. The proposed method shows promise for application in deciphering the glycomic code as well as for clinical diagnosis and treatment in the early stages of cancer. Copyright © 2014 Elsevier B.V. All rights reserved.

  15. Recent optimization of the beam-optical characteristics of the 6 MV van de Graaff accelerator for high brightness beams at the iThemba LABS NMP facility

    NASA Astrophysics Data System (ADS)

    Conradie, J. L.; Eisa, M. E. M.; Celliers, P. J.; Delsink, J. L. G.; Fourie, D. T.; de Villiers, J. G.; Maine, P. M.; Springhorn, K. A.; Pineda-Vargas, C. A.

    2005-04-01

    With the aim of improving the reliability and stability of the beams delivered to the nuclear microprobe at iThemba LABS, as well as optimising the beam characteristics along the van de Graaff accelerator beamlines in general, relevant modifications have been implemented since the beginning of 2003. The design and layout of the beamlines were revised. The beam-optical characteristics through the accelerator, from the ion source up to the analysing magnet directly after the accelerator, were calculated and the design optimised using the computer codes TRANSPORT, IGUN and TOSCA. The ion source characteristics and optimal operating conditions were determined on an ion source test bench. The measured optimal emittance for 90% of the beam intensity was about 50π mm mrad for an extraction voltage of 6 kV. These changes allow operation of the nuclear microprobe at proton energies in the range 1 MeV-4 MeV with beam intensities of tenths of a pA at the target surface. The capabilities of the nuclear microprobe facility were evaluated in the improved beamline, with particular emphasis on biomedical samples.

  16. Adapting smart phone applications about physics education to blind students

    NASA Astrophysics Data System (ADS)

    Bülbül, M. Ş.; Yiğit, N.; Garip, B.

    2016-04-01

    Today, most of the necessary equipment in a physics laboratory is available to smartphone users via applications. Physics teachers may measure everything from acceleration to sound volume with a phone's internal sensors. These sensors collect data, and smartphone applications make the raw data visible. Teachers who do not have well-equipped laboratories at their schools may have an opportunity to conduct experiments with the help of smartphones. In this study, we analyzed available open source physics education applications in terms of blind users in inclusive learning environments. All apps are categorized as fully, partially or non-supported. The roles of a blind learner's friend during use of an application are categorized as reader, describer or user. The apps mentioned in the study are compared with respect to additional characteristics such as size and download rates. Beyond these apps, smartphones can also provide information such as the weather and other extra information for different experiments in the physics lab. QR-code reading and augmented reality are two other opportunities provided by smartphones for users in physics labs. We also summarized blind learners' smartphone experiences from the literature and listed some suggestions for application designers about concepts in physics.

  17. Payload Planning for the International Space Station

    NASA Technical Reports Server (NTRS)

    Johnson, Tameka J.

    1995-01-01

    A review of the evolution of the International Space Station (ISS) was performed for the purpose of understanding the project objectives. It was requested that an analysis of the current Office of Space Access and Technology (OSAT) Partnership Utilization Plan (PUP) traffic model be completed to monitor the process through which the scientific experiments called payloads are manifested for flight to the ISS. A viewing analysis of the ISS was also proposed to identify the capability to observe the United States Laboratory (US LAB) during the assembly sequence. Observations of the Drop-Tower experiment and nondestructive testing procedures were also performed to maximize the intern's technical experience. Contributions were made to the meeting in which the 1996 OSAT or Code X PUP traffic model was generated using the software tool, FileMaker Pro. The current OSAT traffic model satisfies the requirement for manifesting and delivering the proposed payloads to station. The current viewing capability of station provides the ability to view the US LAB during the station assembly sequence. The Drop-Tower experiment successfully simulates the effect of microgravity and conveniently documents the results for later use. The non-destructive test proved effective in determining stress in various components tested.

  18. Software reliability report

    NASA Technical Reports Server (NTRS)

    Wilson, Larry

    1991-01-01

    There are many software reliability models which try to predict future performance of software based on data generated by the debugging process. Unfortunately, the models appear to be unable to account for the random nature of the data. If the same code is debugged multiple times and one of the models is used to make predictions, intolerable variance is observed in the resulting reliability predictions. It is believed that data replication can remove this variance in lab-type situations and that it is less than scientific to talk about validating a software reliability model without considering replication. It is also believed that data replication may prove to be cost effective in the real world; thus the research centered on verification of the need for replication and on methodologies for generating replicated data in a cost-effective manner. The context of the debugging graph was pursued by simulation and experimentation. Simulation was done for the Basic model and the Log-Poisson model. Reasonable values of the parameters were assigned and used to generate simulated data, which were then processed by the models in order to determine limitations on their accuracy. These experiments exploit the existing software and program specimens in AIR-LAB to measure the performance of reliability models.
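The Log-Poisson (Musa-Okumoto) model mentioned above has mean-value function m(t) = (1/θ)·ln(1 + λ₀θt); replicated failure data can be simulated by mapping a unit-rate Poisson process through the inverse of m. A minimal sketch (parameter values are illustrative, not taken from the report):

```python
import math
import random

def simulate_log_poisson(lam0, theta, horizon, rng):
    """One replication of failure times from a Musa-Okumoto log-Poisson
    NHPP with mean m(t) = (1/theta)*ln(1 + lam0*theta*t).
    Unit-rate Poisson event times s are mapped through the inverse
    mean function m^{-1}(s) = (exp(theta*s) - 1) / (lam0*theta)."""
    times, s = [], 0.0
    while True:
        s += rng.expovariate(1.0)                           # next unit-rate event
        t = (math.exp(theta * s) - 1.0) / (lam0 * theta)    # m^{-1}(s)
        if t > horizon:
            return times
        times.append(t)

# Twenty replications of the same "debugging run": the spread in failure
# counts across replications is the variance the abstract refers to.
rng = random.Random(42)
reps = [simulate_log_poisson(lam0=10.0, theta=0.05, horizon=100.0, rng=rng)
        for _ in range(20)]
counts = [len(r) for r in reps]
```

Feeding each replication to a fitted model (rather than a single run) is what exposes the prediction variance discussed above.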

  19. Evaluation of the relationship between a chronic disease care management program and california pay-for-performance diabetes care cholesterol measures in one medical group.

    PubMed

    Cutler, Timothy W; Palmieri, James; Khalsa, Maninder; Stebbins, Marilyn

    2007-09-01

    Pay for performance (P4P) is a business model in which health plans pay provider organizations (medical groups) financial incentives based on attainment of clinical quality, patient experience, and use of information technology. The California P4P program is the largest P4P program in the United States and represents a potential revenue source for all participating medical groups. The clinical specifications for the California P4P program are based on the National Committee for Quality Assurance (NCQA) Health Plan Employer Data and Information Set (HEDIS), and each clinical measure has its own benchmark. In 2005, participating medical groups were paid on the basis of 9 clinical measures that were evaluated in the 2004 measurement year. The cholesterol testing measure represented 4.44%-7.14% of the total P4P dollars available to participating medical groups from the health plans. To (1) compare the percentage of medical group members aged 18 to 75 years with diabetes (type 1 or type 2) who received a low-density lipoprotein cholesterol (LDL-C) test and attained LDL-C control (<130 mg/dL) after enrolling in a chronic disease care management (CDCM) program with similar members managed by routine care, and to (2) assess the potential effect of CDCM on the quality performance ranking and financial reimbursement of a medical group reporting these measures in the 2004 California P4P measurement year. This is a retrospective database review of electronic laboratory (lab) values, medical and hospital claims, and encounter data collected between January 1, 2003, and December 31, 2004 at 1 California medical group comprising 160 multispecialty providers. Requirements were continuous patient enrollment in 1 of the 7 health plans participating in P4P during the measurement year (2004) with no more than 1 gap in enrollment of up to 45 days. 
Patients aged 18 to 75 years were included in the diabetes cholesterol measure (denominator) if they had at least 2 outpatient encounters coded for a primary, secondary, or tertiary diagnosis of diabetes (International Classification of Diseases, Ninth Revision, Clinical Modification code 250.xx, 357.2, 362.0, 366.41, 648.0) or 1 acute inpatient (Diagnosis Related Group code 294 or 295) or emergency room visit for diabetes. Lab values were obtained from multiple sources, including archived lab databases during the same measurement period (numerator). The CDCM program provided education and recommendations for diet, lifestyle, and medication modification delivered by a multidisciplinary team of nurses, pharmacists, and dieticians, and this intervention was compared with routine care for patients not enrolled in the CDCM program. Of the 54,000 health plan members enrolled in this medical group under capitated reimbursement, 1,859 patients (3.4%) met the California P4P specifications for eligibility for the diabetes cholesterol measures and were evaluated. Of these, 8.9% (165/1,859) were followed by the CDCM program and 91.1% (1,694/1,859) by routine care. The LDL-C lab testing rate for patients in the CDCM program was 91.5% (151/165), and the LDL-C goal rate was 78.2% (129/165) compared with 67.8% (1,148/1,694) and 55.7%, respectively, for routine care (P < 0.001 for both comparisons). If the LDL-C lab testing and goal attainment rates for the CDCM group were compared with rates for peer medical groups, this medical group would have scored in the 75th and 90th percentiles, respectively, corresponding to an annual revenue potential of $28,512 for this medical group if the total incentive payment from the health plan was $1 per member per month (PMPM), or $57,024 if the total incentive P4P payment was $2 PMPM. 
Preliminary data from 165 patients with diabetes managed in a CDCM program in a medical group operating under a small P4P financial incentive showed higher rates of LDL-C lab testing and goal attainment than from patients managed by routine care. Had these rates of LDL-C testing and goal attainment achieved in the CDCM program been extended to the entire P4P population with diabetes, this medical group would have generated incentive payments under the P4P program and ranked higher in publicly available quality scores.
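The reported rates and their significance can be reproduced from the published counts; a sketch (the two-proportion z-test here is a standard check, not necessarily the analysis the authors performed):

```python
import math

# Counts reported in the abstract (CDCM program vs. routine care)
cdcm_tested, cdcm_goal, cdcm_n = 151, 129, 165
routine_tested, routine_n = 1148, 1694

def pct(k, n):
    """Rate as a percentage, rounded to one decimal as in the abstract."""
    return round(100.0 * k / n, 1)

def two_prop_z(k1, n1, k2, n2):
    """Two-proportion z statistic with pooled variance:
    z = (p1 - p2) / sqrt(p*(1-p)*(1/n1 + 1/n2))."""
    p1, p2 = k1 / n1, k2 / n2
    p = (k1 + k2) / (n1 + n2)
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se
```

|z| > 3.29 corresponds to a two-sided p < 0.001, consistent with the significance level reported for both comparisons.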

  20. Lithospheric-Mantle Structure of the Kaapvaal Craton, South Africa, Derived from Thermodynamically Self-Consistent Modelling of Magnetotelluric, Surface-Wave Dispersion, S-wave Receiver Function, Heat-flow, Elevation and Xenolith Observations

    NASA Astrophysics Data System (ADS)

    Muller, Mark; Fullea, Javier; Jones, Alan G.; Adam, Joanne; Lebedev, Sergei; Piana Agostinetti, Nicola

    2013-04-01

    Results from recent geophysical and mantle-xenolith geochemistry studies of the Kaapvaal Craton appear, at times, to provide disparate views of the physical, chemical and thermal structure of the lithosphere. Models from our recent SAMTEX magnetotelluric (MT) surveys across the Kaapvaal Craton indicate a resistive, 220-240 km thick lithosphere for the central core of the craton. One published S-wave receiver function (SRF) study and other surface-wave studies suggest a thinner lithosphere characterised by a ~160 km thick high-velocity "lid" underlain by a low-velocity zone (LVZ) of between 65-150 km in thickness. Other seismic studies suggest that the (high-velocity) lithosphere is thicker, in excess of 220 km. Mantle xenolith pressure-temperature arrays from Mesozoic kimberlites require that the base of the "thermal" lithosphere (i.e., the depth above which a conductive geotherm is maintained - the tLAB) is at least 220 km deep, to account for mantle geotherms in the range 35-38 mWm-2. Richly diamondiferous kimberlites across the Kaapvaal Craton require a lithospheric thickness substantially greater than 160 km - the depth of the top of the diamond stability field. In this paper we use the recently developed LitMod software code to derive, thermodynamically consistently, a range of 1-D electrical resistivity, seismic velocity, density and temperature models from layered geochemical models of the lithosphere based on mantle xenolith compositions. In our work, the "petrological" lithosphere-asthenosphere boundary (pLAB) (i.e., the top of the fertile asthenospheric-mantle) and the "thermal" LAB (tLAB) are coincident. Lithospheric-mantle models are found simultaneously satisfying all geophysical observables: MT responses, new surface-wave dispersion data, published SRFs, surface elevation and heat-flow. Our results show: 1. All lithospheric-mantle models are characterised by a seismic LVZ with a minimum velocity at the depth of the petrological/thermal LAB. 
The top of the LVZ does not correspond with the LAB. 2. Thin (~160 km-thick) lithospheric-mantle models are consistent with surface elevation and heat-flow observations only for unreasonably low average crustal heat production values (~0.4 µWm-3). However, such models are inconsistent both with the surface-wave dispersion data and youngest (Group I) palaeo-geotherms defined by xenolith P-T arrays. 3. A three-layered geochemical model, with lithospheric thickness in excess of 230 km, is required to match all geophysical and xenolith constraints. 4. The chemical transition from a depleted harzburgitic composition (above) to a refertilised high-T lherzolitic composition (below) at 160 km depth produces a sharp onset of the seismic LVZ and a sharp increase in density. Synthetic SRFs indicate that this chemical transition is able to account for the reported S-to-P conversion event at 160 km depth. In this instance the 160 km deep SRF event does not represent the petrological/thermal LAB.
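The role of crustal heat production in point 2 follows from 1-D steady-state conduction: surface heat flow is the mantle heat flow plus the heat generated within the crust. A sketch with illustrative values (not the paper's model, which is the full LitMod computation):

```python
def surface_heat_flow(q_mantle_mW, A_uW_m3, crust_km):
    """1-D steady-state balance: q_s = q_m + A*h.
    Units work out neatly: (uW/m^3) * (km) = mW/m^2, so no explicit
    conversion factors are needed when inputs use these units."""
    return q_mantle_mW + A_uW_m3 * crust_km

# Illustrative: a 40 km crust at the low ~0.4 uW/m^3 heat production
# discussed above contributes only ~16 mW/m^2 to surface heat flow,
# so matching observed surface heat flow with a thin (hot) lithosphere
# forces the crustal contribution down to such low values.
q_low = surface_heat_flow(35.0, 0.4, 40.0)   # mW/m^2
q_typ = surface_heat_flow(35.0, 1.0, 40.0)   # more typical crustal production
```

The trade-off between mantle heat flow and crustal production is exactly why the thin-lithosphere models require the "unreasonably low" value quoted above.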

  1. A Roadmap to Continuous Integration for ATLAS Software Development

    NASA Astrophysics Data System (ADS)

    Elmsheuser, J.; Krasznahorkay, A.; Obreshkov, E.; Undrus, A.; ATLAS Collaboration

    2017-10-01

    The ATLAS software infrastructure facilitates the efforts of more than 1000 developers working on a code base of 2200 packages with 4 million lines of C++ and 1.4 million lines of Python code. The ATLAS offline code management system is a powerful, flexible framework for processing new package version requests, probing code changes in the Nightly Build System, migrating to new platforms and compilers, deploying production releases for worldwide access, and supporting physicists with tools and interfaces for efficient software use. It maintains a multi-stream, parallel development environment with about 70 multi-platform branches of nightly releases and provides vast opportunities for testing new packages, for verifying patches to existing software, and for migrating to new platforms and compilers. The system's evolution is currently aimed at the adoption of modern continuous integration (CI) practices focused on building nightly releases early and often, with rigorous unit and integration testing. This paper describes the CI incorporation program for the ATLAS software infrastructure. It brings modern open source tools such as Jenkins and GitLab into the ATLAS Nightly System, rationalizes hardware resource allocation and administrative operations, and provides improved feedback and the means for developers to fix broken builds promptly. Once adopted, ATLAS CI practices will improve and accelerate innovation cycles and result in increased confidence in new software deployments. The paper reports the status of Jenkins integration with the ATLAS Nightly System as well as short- and long-term plans for the incorporation of CI practices.

  2. Sirepo - Warp

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nagler, Robert; Moeller, Paul

    Sirepo is an open source framework for cloud computing. The graphical user interface (GUI) for Sirepo, also known as the client, executes in any HTML5 compliant web browser on any computing platform, including tablets. The client is built in JavaScript, making use of the following open source libraries: Bootstrap, which is fundamental for cross-platform web applications; AngularJS, which provides a model–view–controller (MVC) architecture and GUI components; and D3.js, which provides interactive plots and data-driven transformations. The Sirepo server is built on the following Python technologies: Flask, which is a lightweight framework for web development; Jinja, which is a secure and widely used templating language; and Werkzeug, a utility library that is compliant with the WSGI standard. We use Nginx as the HTTP server and proxy, which provides a scalable event-driven architecture. The physics codes supported by Sirepo execute inside a Docker container. One of the codes supported by Sirepo is Warp. Warp is a particle-in-cell (PIC) code designed to simulate high-intensity charged particle beams and plasmas in both the electrostatic and electromagnetic regimes, with a wide variety of integrated physics models and diagnostics. At present, Sirepo supports a small subset of Warp's capabilities. Warp is open source and is part of the Berkeley Lab Accelerator Simulation Toolkit.
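Werkzeug's WSGI compliance means the server core is, at bottom, a callable with the standard WSGI signature. A minimal stdlib-only sketch of that pattern (the route and JSON payload are hypothetical, not Sirepo's actual API):

```python
def simulation_status_app(environ, start_response):
    """Minimal WSGI application: any WSGI-compliant server (Werkzeug,
    wsgiref, uWSGI behind Nginx) invokes it with an environ dict and a
    start_response callable, and it returns an iterable of byte strings."""
    path = environ.get("PATH_INFO", "/")
    if path == "/status":                       # hypothetical route
        status, body = "200 OK", b'{"state": "running"}'
    else:
        status, body = "404 Not Found", b'{"error": "not found"}'
    start_response(status, [("Content-Type", "application/json"),
                            ("Content-Length", str(len(body)))])
    return [body]
```

Because the interface is only a dict and a callable, the same application object can be exercised directly in tests without starting a server.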

  3. In silico search for functionally similar proteins involved in meiosis and recombination in evolutionarily distant organisms.

    PubMed

    Bogdanov, Yuri F; Dadashev, Sergei Y; Grishaeva, Tatiana M

    2003-01-01

    Evolutionarily distant organisms have not only orthologs, but also nonhomologous proteins that build functionally similar subcellular structures. For instance, this is true with protein components of the synaptonemal complex (SC), a universal ultrastructure that ensures the successful pairing and recombination of homologous chromosomes during meiosis. We aimed at developing a method to search databases for genes that code for such nonhomologous but functionally analogous proteins. Advantage was taken of the ultrastructural parameters of the SC and the conformation of the SC proteins responsible for them. Proteins involved in the SC central space are known to be similar in secondary structure. Using published data, we found a highly significant correlation between the width of the SC central space and the length of the rod-shaped central domain of mammalian and yeast intermediate proteins forming transversal filaments in the SC central space. Based on this, we suggested a method for searching genome databases of distant organisms for genes whose virtual proteins meet the above correlation requirement. Our recent finding of the Drosophila melanogaster CG17604 gene coding for synaptonemal complex transversal filament protein received experimental support from another lab. With the same strategy, we showed that the Arabidopsis thaliana and Caenorhabditis elegans genomes contain unique genes coding for such proteins.
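The screening criterion rests on the correlation between SC central-space width and transverse-filament rod-domain length. A sketch of the correlation computation with illustrative (hypothetical) values, not the paper's measurements:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient, pure stdlib."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical illustrative data (NOT the published measurements):
# SC central-space width (nm) vs. predicted rod-domain length (residues)
# for several species; a candidate gene is retained if its virtual
# protein falls on the regression defined by such data.
widths = [100, 105, 112, 118, 120]
domain_lengths = [340, 360, 400, 430, 445]
```

In the published method, a candidate from a new genome is accepted when its predicted coiled-coil domain length is consistent with the species' measured SC width under this correlation.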

  4. Computational Fluid Dynamics Analysis Method Developed for Rocket-Based Combined Cycle Engine Inlet

    NASA Technical Reports Server (NTRS)

    1997-01-01

    Renewed interest in hypersonic propulsion systems has led to research programs investigating combined cycle engines that are designed to operate efficiently across the flight regime. The Rocket-Based Combined Cycle Engine is a propulsion system under development at the NASA Lewis Research Center. This engine integrates a high specific impulse, low thrust-to-weight, airbreathing engine with a low specific impulse, high thrust-to-weight rocket. From takeoff to Mach 2.5, the engine operates as an air-augmented rocket. At Mach 2.5, the engine becomes a dual-mode ramjet; and beyond Mach 8, the rocket is turned back on. One Rocket-Based Combined Cycle Engine variation known as the "Strut-Jet" concept is being investigated jointly by NASA Lewis, the U.S. Air Force, Gencorp Aerojet, General Applied Science Labs (GASL), and Lockheed Martin Corporation. Work thus far has included wind tunnel experiments and computational fluid dynamics (CFD) investigations with the NPARC code. The CFD method was initiated by modeling the geometry of the Strut-Jet with the GRIDGEN structured grid generator. Grids representing a subscale inlet model and the full-scale demonstrator geometry were constructed. These grids modeled one-half of the symmetric inlet flow path, including the precompression plate, diverter, center duct, side duct, and combustor. After the grid generation, full Navier-Stokes flow simulations were conducted with the NPARC Navier-Stokes code. The Chien low-Reynolds-number k-ε turbulence model was employed to simulate the high-speed turbulent flow. Finally, the CFD solutions were postprocessed with a Fortran code. This code provided wall static pressure distributions, pitot pressure distributions, mass flow rates, and internal drag. These results were compared with experimental data from a subscale inlet test for code validation; then they were used to help evaluate the demonstrator engine net thrust.
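Among the postprocessed quantities, pitot pressure in supersonic flow is conventionally related to freestream conditions through the Rayleigh pitot formula; a sketch (assuming a calorically perfect gas with γ = 1.4, which may differ from the NPARC postprocessor's exact method):

```python
def rayleigh_pitot_ratio(M, gamma=1.4):
    """Ratio of pitot pressure (stagnation pressure behind the normal
    shock that forms ahead of the probe) to freestream static pressure,
    for supersonic freestream Mach number M >= 1:

      p02/p1 = [ (g+1)^2 M^2 / (4 g M^2 - 2 (g-1)) ]^(g/(g-1))
               * [ (1 - g + 2 g M^2) / (g+1) ]
    """
    a = ((gamma + 1) ** 2 * M ** 2
         / (4 * gamma * M ** 2 - 2 * (gamma - 1))) ** (gamma / (gamma - 1))
    b = (1 - gamma + 2 * gamma * M ** 2) / (gamma + 1)
    return a * b
```

At M = 1 the shock is vanishingly weak, so the ratio reduces to the isentropic stagnation ratio (1.2)^3.5 ≈ 1.893 for air.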

  5. Performance analysis of an OAM multiplexing-based MIMO FSO system over atmospheric turbulence using space-time coding with channel estimation.

    PubMed

    Zhang, Yan; Wang, Ping; Guo, Lixin; Wang, Wei; Tian, Hongxin

    2017-08-21

    The average bit error rate (ABER) performance of an orbital angular momentum (OAM) multiplexing-based free-space optical (FSO) system with multiple-input multiple-output (MIMO) architecture has been investigated over atmospheric turbulence considering channel estimation and space-time coding. The impact of different types of space-time coding, modulation orders, turbulence strengths, and receive antenna numbers on the transmission performance of this OAM-FSO system is also taken into account. On the basis of the proposed system model, the analytical expressions of the received signals carried by the k-th OAM mode of the n-th receive antenna for the Vertical Bell Labs Layered Space-Time (V-Blast) and space-time block codes (STBC) are derived, respectively. With the help of a channel estimator based on the least squares (LS) algorithm, the zero-forcing with ordered successive interference cancellation (ZF-OSIC) equalizer of the V-Blast scheme and the Alamouti decoder of the STBC scheme are adopted to mitigate the performance degradation induced by the atmospheric turbulence. The results show that the ABERs obtained by channel estimation have excellent agreement with those of turbulence phase screen simulations. The ABERs of this OAM multiplexing-based MIMO system deteriorate with the increase of turbulence strengths. And both V-Blast and STBC schemes can significantly improve the system performance by mitigating the distortions of atmospheric turbulence as well as additive white Gaussian noise (AWGN). In addition, the ABER performances of both space-time coding schemes can be further enhanced by increasing the number of receive antennas for the diversity gain, and STBC outperforms V-Blast in this system for data recovery. This work is beneficial to OAM FSO system design.
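The Alamouti decoder referenced above admits a compact closed form for one receive antenna; a noiseless sketch (the paper's system applies this per OAM mode with LS-estimated channels rather than known ones):

```python
def alamouti_encode(s1, s2):
    """Alamouti STBC: two symbols over two antennas in two slots.
    Slot 1 transmits (s1, s2); slot 2 transmits (-s2*, s1*)."""
    return [(s1, s2), (-s2.conjugate(), s1.conjugate())]

def alamouti_decode(r1, r2, h1, h2):
    """Linear combining for one receive antenna, channel gains h1, h2
    assumed constant over both slots:
      s1_hat = (h1* r1 + h2 r2*) / (|h1|^2 + |h2|^2)
      s2_hat = (h2* r1 - h1 r2*) / (|h1|^2 + |h2|^2)
    In the noiseless case this recovers s1, s2 exactly."""
    g = abs(h1) ** 2 + abs(h2) ** 2
    s1_hat = (h1.conjugate() * r1 + h2 * r2.conjugate()) / g
    s2_hat = (h2.conjugate() * r1 - h1 * r2.conjugate()) / g
    return s1_hat, s2_hat
```

The cross terms cancel exactly in the combiner, which is why the two symbols decouple and decoding reduces to independent symbol-by-symbol detection.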

  6. Biology, literacy, and the African American voice: A case study of meaningful learning in the biology classroom

    NASA Astrophysics Data System (ADS)

    Reese, Keturah

    Under the direction of Sharon Murphy Augustine, Ph.D., Curriculum and Instruction. There was a substantial performance gap between African American students and other ethnic groups. Additionally, African American students in a Title I school were at a significantly high risk of not meeting or exceeding standards on performance tests in science. Past reports have shown average gains in some subject areas and declines in others (NCES, 2011; GADOE, 2012). Current instructional strategies and the lack of literacy within the biology classroom created a problem for African American high school students on national and state assessments. The purpose of this study was to examine the perceptions of African American students and teachers in the context of literacy and biology through the incorporation of an interactive notebook and other literacy strategies. The data were collected in three ways: field notes from a two-week observation period within the biology classroom, student and teacher interviews, and student work samples. During the observations, student work collection, and interviews, I looked for the following codes: active learning, constructive learning, collaborative learning, authentic learning, and intentional learning. In the process of coding for the pre-determined codes, three more codes emerged: organization, studying/student ownership, and student-teacher relationships. Students and teachers both solidified the notion that literacy and biology worked well together. The implemented literacy strategies were something that both teachers and students appreciated in their learning of biology. Overall, students and teachers perceived that the interactive notebook, along with Cornell notes, Thinking Maps, close reads, writing, lab experiments, and group work, created meaningful learning experiences within the biology classroom.

  7. Dual Active Bridge based DC Transformer LabVIEW FPGA Control Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    In the area of power electronics control, Field Programmable Gate Arrays (FPGAs) have the capability to outperform their Digital Signal Processor (DSP) counterparts due to the FPGA's ability to implement true parallel processing and therefore facilitate higher switching frequencies, higher control bandwidth, and/or enhanced functionality. National Instruments (NI) has developed two platforms, CompactRIO (cRIO) and Single-Board RIO (sbRIO), which combine a real-time processor with an FPGA. The FPGA can be programmed with a subset of the well-known LabVIEW graphical programming language. The candidate software implements complete control algorithms in LabVIEW FPGA for a DC Transformer (DCX) based on a dual active bridge (DAB). A DCX is an isolated bi-directional DC-DC converter designed to operate at unity conversion ratio, M = Vout/(n·Vin), where Vin is the primary-side DC bus voltage, Vout is the secondary-side DC bus voltage, and n is the turns ratio of the embedded high frequency transformer (HFX). The DCX based on a DAB incorporates two H-bridges, a resonant inductor, and an HFX to provide this functionality. The candidate software employs phase-shift modulation of the two H-bridges and a feedback loop to regulate the conversion ratio at unity. The software also includes alarm-handling capabilities as well as debugging and tuning tools. The software fits on the Xilinx Virtex-5 LX110 FPGA embedded in the NI cRIO-9118 FPGA chassis, and with a 40 MHz base clock, supports a modulation update rate of 40 MHz and user-settable switching frequencies and synchronized control loop update rates of tens of kHz.
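Under the standard single-phase-shift model (a textbook idealization, not necessarily the exact control law in the candidate software), the DAB transfers power proportional to φ(π − |φ|), and the regulated quantity is the conversion ratio M:

```python
import math

def dab_power(v_in, v_out, n, phi, f_sw, L):
    """Idealized lossless single-phase-shift DAB power transfer:
      P = n*Vin*Vout * phi*(pi - |phi|) / (2*pi^2 * f_sw * L)
    with phi the bridge-to-bridge phase shift in radians, f_sw the
    switching frequency (Hz), and L the series (resonant) inductance."""
    return (n * v_in * v_out * phi * (math.pi - abs(phi))
            / (2 * math.pi ** 2 * f_sw * L))

def conversion_ratio(v_in, v_out, n):
    """M = Vout / (n*Vin); the DCX feedback loop drives M toward 1."""
    return v_out / (n * v_in)
```

Power is zero at φ = 0 and maximal at φ = π/2, which is why practical controllers restrict φ to a fraction of that range where the power-vs-phase curve is monotonic.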

  8. The University of Connecticut Biomedical Engineering Mentoring Program for high school students.

    PubMed

    Enderle, John D; Liebler, Christopher M; Haapala, Stephenic A; Hart, James L; Thonakkaraparayil, Naomi T; Romonosky, Laura L; Rodriguez, Francisco; Trumbower, Randy D

    2004-01-01

    For the past four years, the Biomedical Engineering Program at the University of Connecticut has offered a summer mentoring program for high school students interested in biomedical engineering. To offer this program, we have partnered with the UConn Mentor Connection Program, the School of Engineering 2000 Program and the College of Liberal Arts and Sciences Summer Laboratory Apprentice Program. We typically have approximately 20-25 high school students learning about biomedical engineering each summer. The mentoring aspect of the program exists at many different levels, with the graduate students mentoring the undergraduate students, and these students mentoring the high school students. The program starts with a three-hour lecture on biomedical engineering to properly orient the students. An in-depth paper on an area in biomedical engineering is a required component, as well as a PowerPoint presentation on their research. All of the students build a device to record an EKG on a computer using LabView, including signal processing to remove noise. The students learn some rudimentary concepts on electrocardiography and the physiology and anatomy of the heart. The students also learn basic electronics and breadboarding circuits, PSpice, the building of a printed circuit board, the PIC microcontroller, the operation of multimeters and the oscilloscope, soldering, assembly of the EKG device, and writing LabView code to run their device on a PC. The students keep their EKG device, LabView program and a fully illustrated booklet on EKG to bring home with them, and hopefully bring back to their high school to share their experiences with other students and teachers. The students also work on several other projects during this summer experience as well as visit Hartford Hospital to learn about Clinical Engineering.
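The noise-removal step the students implement in LabView can be illustrated with the simplest smoothing filter; a Python sketch of the idea (the students' actual implementation is LabView code, and its filter design is not specified here):

```python
def moving_average(signal, window):
    """Moving-average smoother: a basic low-pass step of the kind used
    to reduce high-frequency noise in a sampled EKG trace. Returns
    len(signal) - window + 1 averaged samples, computed with a running
    sum so each output costs O(1)."""
    if window < 1 or window > len(signal):
        raise ValueError("window must be in [1, len(signal)]")
    out = []
    run = sum(signal[:window])
    out.append(run / window)
    for i in range(window, len(signal)):
        run += signal[i] - signal[i - window]   # slide the window by one
        out.append(run / window)
    return out
```

A real EKG front end would combine this kind of smoothing with a notch filter at the power-line frequency, but the sliding-window idea is the same.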

  9. High-Beta Electromagnetic Turbulence in LAPD Plasmas

    NASA Astrophysics Data System (ADS)

    Rossi, G.; Carter, T. A.; Pueschel, M. J.; Jenko, F.; Told, D.; Terry, P. W.

    2015-11-01

    The introduction of a new LaB6 cathode plasma source in the Large Plasma Device has enabled the study of pressure-gradient-driven turbulence and transport variations at significantly higher plasma β. Density fluctuations are observed to decrease with increasing β while magnetic fluctuations increase. Furthermore, the perpendicular magnetic fluctuations are seen to saturate while parallel (compressional) magnetic fluctuations increase continuously with β. These observations are compared to linear and nonlinear simulations with the GENE code. The results are consistent with the linear excitation of a Gradient-driven Drift Coupling mode (GDC) which relies on grad-B drift due to parallel magnetic fluctuations and can be driven by density or temperature gradients.
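Plasma β is the ratio of thermal to magnetic pressure; a sketch of the computation (the parameter values in the usage are illustrative, not LAPD measurements):

```python
import math

MU0 = 4e-7 * math.pi          # vacuum permeability (T*m/A)
EV = 1.602176634e-19          # 1 eV in joules

def plasma_beta(n_m3, T_eV, B_T):
    """beta = thermal pressure / magnetic pressure
            = n*k*T / (B^2 / (2*mu0)),
    with density in m^-3, temperature in eV, and field in tesla."""
    thermal = n_m3 * T_eV * EV          # n*k*T with T already in energy units
    magnetic = B_T ** 2 / (2 * MU0)
    return thermal / magnetic
```

Raising the plasma pressure (the LaB6 source) or lowering the field both increase β, which is how the higher-β regime described above is accessed.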

  10. Effects of die quench forming on sheet thinning and 3-point bend testing of AA7075-T6

    NASA Astrophysics Data System (ADS)

    Kim, Samuel; Omer, Kaab; Rahmaan, Taamjeed; Butcher, Clifford; Worswick, Michael

    2017-10-01

    Lab-scale AA7075 aluminum side impact beams were manufactured using the die quenching technique, in which the sheet was solutionized and then quenched in-die during forming to a supersaturated solid state. Sheet thinning measurements were taken at various locations throughout the length of the part, and the effect of lubricant on surface scoring and material pick-up on the die was evaluated. The as-formed beams were subjected to a T6 aging treatment and then tested in three-point bending. Simulations of the forming and mechanical testing experiments were performed using the LS-DYNA finite element code. The thinning and mechanical response were predicted well.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berry, Max; Lystig, Ted; Beard, Jonathan

    Purpose. To compare the learning of endovascular interventional skills by training on pig models versus virtual reality simulators. Methods. Twelve endovascular novices participated in a study consisting of a pig laboratory (P-Lab) and a virtual reality laboratory (VR-Lab). Subjects were stratified by experience and randomized into four training groups. Following 1 hr of didactic instruction, all attempted an iliac artery stenosis (IAS) revascularization in both laboratories. Onsite proctors evaluated performances using task-specific checklists and global rating scales, yielding a Total Score. Participants completed two training sessions of 3 hr each, using their group's assigned method (P-Lab x 2, P-Lab + VR-Lab, VR-Lab + P-Lab, or VR-Lab x 2) and were re-evaluated in both laboratories. A panel of two highly experienced interventional radiologists performed assessments from video recordings. ANCOVA analysis of Total Score against years of surgical, interventional radiology (IR) experience and cumulative number of P-Lab or VR-Lab sessions was conducted. Inter-rater reliability (IRR) was determined by comparing proctored scores with the video assessors in only the VR-Lab. Results. VR-Lab sessions improved the VR-Lab Total Score (β = 3.029, p = 0.0015) and P-Lab Total Score (β = 1.814, p = 0.0452). P-Lab sessions increased the P-Lab Total Score (β = 4.074, p < 0.0001) but had no effect on the VR-Lab Total Score. In the general statistical model, both P-Lab sessions (β = 2.552, p = 0.0010) and VR-Lab sessions (β = 2.435, p = 0.0032) significantly improved Total Score. Neither previous surgical experience nor IR experience predicted Total Score. VR-Lab scores were consistently higher than the P-Lab scores (δ = 6.659, p < 0.0001). VR-Lab IRR was substantial (r = 0.649, p < 0.0008). Conclusions. Endovascular skills learned in the virtual environment may be transferable to the real catheterization laboratory as modeled in the P-Lab.

  12. EGAC - HOME PAGE -

    Science.gov Websites

    Accreditation services recognized include Accreditation of Testing Labs, Calibration Labs, Medical Labs, Inspection Bodies, and Certification of QMS, EMS and FSMS.

  13. Department of Energy's Virtual Lab Infrastructure for Integrated Earth System Science Data

    NASA Astrophysics Data System (ADS)

    Williams, D. N.; Palanisamy, G.; Shipman, G.; Boden, T.; Voyles, J.

    2014-12-01

    The U.S. Department of Energy (DOE) Office of Biological and Environmental Research (BER) Climate and Environmental Sciences Division (CESD) produces a diversity of data, information, software, and model codes across its research and informatics programs and facilities. This information includes raw and reduced observational and instrumentation data, model codes, model-generated results, and integrated data products. Currently, most of this data and information are prepared and shared for program specific activities, corresponding to CESD organization research. A major challenge facing BER CESD is how best to inventory, integrate, and deliver these vast and diverse resources for the purpose of accelerating Earth system science research. This talk provides a concept for a CESD Integrated Data Ecosystem and an initial roadmap for its implementation to address this integration challenge in the "Big Data" domain. Towards this end, a new BER Virtual Laboratory Infrastructure will be presented, which will include services and software connecting the heterogeneous CESD data holdings, and constructed with open source software based on industry standards, protocols, and state-of-the-art technology.

  14. Time and Energy Characterization of a Neutron time of Flight Detector for Re-designing Line of Sight 270 at the Z Pulsed Power Facility.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Styron, Jedediah D.

    2016-11-01

    This work will focus on the characterization of NTOF detectors fielded on ICF experiments conducted at the Z experimental facility, with emphasis on the MagLIF and gas puff campaigns. Three experiments have been proposed. The first experiment will characterize the response of the PMT with respect to the amplitude and width of signals produced by single neutron events. A second experiment will characterize the neutron transit time through the scintillator, and the third will characterize the pulse amplitude for a very specific range of neutron-induced charged particle interactions within the scintillator. These experiments will cover incident neutron energies relevant to D-D and D-T fusion reactions. These measurements will be taken as a function of detector bias to cover the entire dynamic range of the detector. Throughout the characterization process, the development of a predictive capability is desired. A new post-processing code has been proposed that will calculate a neutron time-of-flight spectrum in units of MeVee. This code will couple the experimentally obtained values and the results obtained with the Monte Carlo code MCNP6. The motivation of this code is to correct for geometry issues when transferring the calibration results from a light-lab setting to the Z environment. This capability will be used to develop a hypothetical design of LOS270 such that more favorable neutron measurements, requiring less correction, can be made in the future.
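An NTOF spectrum maps flight time to neutron energy; for D-D (2.45 MeV) and D-T (14.1 MeV) neutrons, non-relativistic kinematics are adequate to roughly the percent level. A sketch (the flight-path length used below is illustrative, not the LOS270 geometry):

```python
import math

M_N_C2_MEV = 939.565        # neutron rest energy (MeV)
C = 2.99792458e8            # speed of light (m/s)

def tof_ns(E_MeV, distance_m):
    """Non-relativistic time of flight: v = c*sqrt(2E/(m*c^2)).
    Good to ~1% even at D-T energies (14.1 MeV)."""
    v = C * math.sqrt(2.0 * E_MeV / M_N_C2_MEV)
    return distance_m / v * 1e9

def energy_MeV(t_ns, distance_m):
    """Inverse mapping: neutron energy from a measured flight time."""
    v = distance_m / (t_ns * 1e-9)
    return 0.5 * M_N_C2_MEV * (v / C) ** 2
```

The slower D-D neutrons arrive well after D-T neutrons over the same path, which is what lets a single detector separate the two reaction components in time.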

  15. Integrated Graphics Operations and Analysis Lab Development of Advanced Computer Graphics Algorithms

    NASA Technical Reports Server (NTRS)

    Wheaton, Ira M.

    2011-01-01

    The focus of this project is to aid the IGOAL in researching and implementing algorithms for advanced computer graphics. First, this project focused on porting the current International Space Station (ISS) Xbox experience to the web. Previously, the ISS interior fly-around education and outreach experience ran only on an Xbox 360. One of the desires was to take this experience and make it into something that can be put on NASA's educational site for anyone to access. The current code works in the Unity game engine, which does have cross-platform capability but is not 100% compatible. The tasks for an intern to complete this portion consisted of gaining familiarity with Unity and the current ISS Xbox code, porting the Xbox code to the web as is, and modifying the code to work well as a web application. In addition, a procedurally generated cloud algorithm will be developed. Currently, the clouds used in AGEA animations and the Xbox experiences are texture maps. The desire is to create a procedurally generated cloud algorithm to provide dynamically generated clouds for both AGEA animations and the Xbox experiences. This task consists of gaining familiarity with AGEA and the plug-in interface, developing the algorithm, creating an AGEA plug-in to implement the algorithm inside AGEA, and creating a Unity script to implement the algorithm for the Xbox. This portion of the project could not be completed in the time frame of the internship; however, the IGOAL will continue to work on it in the future.
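
    The abstract does not specify the cloud algorithm itself; a common starting point for procedurally generated cloud textures is fractal (multi-octave) value noise. A minimal Python sketch, in which every name and parameter is an assumption rather than the IGOAL design:

```python
import math
import random

def _smooth(t):
    # smoothstep interpolation weight, avoids grid artifacts
    return t * t * (3 - 2 * t)

def value_noise(x, y, seed=0):
    """Deterministic 2-D lattice value noise in [0, 1)."""
    def lattice(ix, iy):
        # hash the lattice point into a reproducible pseudo-random value
        rnd = random.Random((ix * 73856093) ^ (iy * 19349663) ^ seed)
        return rnd.random()
    x0, y0 = math.floor(x), math.floor(y)
    tx, ty = _smooth(x - x0), _smooth(y - y0)
    a = lattice(x0, y0) + tx * (lattice(x0 + 1, y0) - lattice(x0, y0))
    b = lattice(x0, y0 + 1) + tx * (lattice(x0 + 1, y0 + 1) - lattice(x0, y0 + 1))
    return a + ty * (b - a)

def cloud_density(x, y, octaves=4):
    """Fractal sum of noise octaves (fBm), the usual basis for cloud textures."""
    total, amp, freq, norm = 0.0, 1.0, 1.0, 0.0
    for _ in range(octaves):
        total += amp * value_noise(x * freq, y * freq)
        norm += amp
        amp *= 0.5    # each octave contributes half the amplitude
        freq *= 2.0   # at twice the spatial frequency
    return total / norm  # normalized back to [0, 1)
```

Evaluating `cloud_density` per pixel (or per vertex in a Unity shader port) yields a tileable-looking, dynamically generated cloud field instead of a static texture map.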

  16. A Comparative Study on Real Lab and Simulation Lab in Communication Engineering from Students' Perspectives

    ERIC Educational Resources Information Center

    Balakrishnan, B.; Woods, P. C.

    2013-01-01

    Over the years, rapid development in computer technology has engendered simulation-based laboratory (lab) in addition to the traditional hands-on (physical) lab. Many higher education institutions adopt simulation lab, replacing some existing physical lab experiments. The creation of new systems for conducting engineering lab activities has raised…

  17. Development of a pentaplex PCR assay for the simultaneous detection of Streptococcus thermophilus, Lactobacillus delbrueckii subsp. bulgaricus, L. delbrueckii subsp. lactis, L. helveticus, L. fermentum in whey starter for Grana Padano cheese.

    PubMed

    Cremonesi, Paola; Vanoni, Laura; Morandi, Stefano; Silvetti, Tiziana; Castiglioni, Bianca; Brasca, Milena

    2011-03-30

    A pentaplex PCR assay for the rapid, selective and simultaneous detection of Lactobacillus helveticus, L. delbrueckii subsp. lactis, L. delbrueckii subsp. bulgaricus, Streptococcus thermophilus, and L. fermentum was developed. The target sequences were a group of genes coding for beta-galactosidase production (S. thermophilus and L. delbrueckii subsp. bulgaricus), for cell-envelope-associated proteinase synthesis (L. helveticus), for dipeptide transport system production (L. delbrueckii subsp. lactis) and for arginine-ornithine antiporter protein production (L. fermentum). The analytical specificity of the assay was evaluated with 5 reference strains and 140 lactic acid bacterial strains derived from raw milk cheeses and belonging to the Lactobacillus, Streptococcus, Lactococcus and Enterococcus genera. The identification limit for each target strain was 10^3 CFU/ml. This new molecular assay was used to investigate the LAB population by direct extraction of DNA from the 12 whey cultures for Grana Padano. The pentaplex PCR assay showed good correspondence with microbiological analyses and made it possible to identify even minor LAB community members, which can be out-competed in vitro by numerically more abundant microbial species. Copyright © 2011 Elsevier B.V. All rights reserved.

  18. Random lasers for lab-on-chip applications

    NASA Astrophysics Data System (ADS)

    Giehl, J. M.; Butzbach, F.; Jorge, K. C.; Alvarado, M. A.; Carreño, M. N. P.; Alayo, M. I.; Wetter, N. U.

    2016-04-01

    Random lasers are laser sources in which the feedback is provided by scattering instead of reflection and which, for this reason, do not require surfaces with optical finish such as mirrors. The investigation of such lasing action in a large variety of disordered materials is a subject of high interest, with very important applications such as three-dimensional and speckle-free imaging, detection of cancer tissue, and photonic coding and encryption. However, potential applications require optimization of random laser performance, especially with respect to optical efficiency and directionality or brightness. This work demonstrates such an optimization procedure with the goal of achieving a random laser with sufficient efficiency and brightness to be used in practical applications. Two random lasers are demonstrated, one solid and one liquid, that fulfil the directionality and efficiency requirements. The first consists of a neodymium-doped powder laser with a record slope efficiency of 1.6%. The second is a liquid random laser injected into a HC-ARROW waveguide, which uses a microchannel connected to a much larger reservoir in order to achieve the necessary directionality. Both devices can be produced by low-cost fabrication technologies and easily integrated into next-generation, lab-on-chip devices used for in-situ determination of infectious tropical diseases, which is the main goal of this project.

  19. Quarterly Research Performance Progress Report (2015 Q3). Ultrasonic Phased Arrays and Interactive Reflectivity Tomography for Nondestructive Inspection of Injection and Production Wells in Geothermal Energy Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Santos-Villalobos, Hector J; Polsky, Yarom; Kisner, Roger A

    2015-09-01

    For the past quarter, we have focused our effort on implementing the first version of the Model-Based Iterative Reconstruction (MBIR) algorithm, assembling and testing the electronics, designing transducer mounts, and defining our laboratory test samples. We have successfully developed the first implementation of MBIR for ultrasound imaging. The current algorithm was tested with synthetic data, and we are currently making new modifications for the reconstruction of real ultrasound data. Besides assembling and testing the electronics, we developed a LabView graphical user interface (GUI) to fully control the ultrasonic phased array, adjust the time delays of the transducers, and store the measured reflections. As part of preparing for a laboratory-scale demonstration, the design and fabrication of the laboratory samples has begun. Three cement blocks with embedded objects will be fabricated, characterized, and used to demonstrate the capabilities of the system. During the next quarter, we will continue to improve the current MBIR forward model and integrate the reconstruction code with the LabView GUI. In addition, we will define focal laws for the ultrasonic phased array and perform the laboratory demonstration. We expect to perform the laboratory demonstration by the end of October 2015.
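
    At its core, model-based iterative reconstruction means minimizing a regularized data-fit objective against a forward model. A toy Python sketch of that idea (plain gradient descent on ||y - Ax||² + λ||x||²; the matrix, values, and function name are illustrative, and a real ultrasound forward model is far richer):

```python
# Hedged sketch of an MBIR-style update: iteratively minimize
# ||y - A x||^2 + lam * ||x||^2 by gradient descent on x.

def mbir_gradient_descent(A, y, lam=0.1, step=0.01, iters=2000):
    rows, cols = len(A), len(A[0])
    x = [0.0] * cols
    for _ in range(iters):
        # residual r = A x - y against the forward model
        r = [sum(A[i][j] * x[j] for j in range(cols)) - y[i] for i in range(rows)]
        # gradient g_j = 2 (A^T r)_j + 2 lam x_j
        for j in range(cols):
            g = 2 * sum(A[i][j] * r[i] for i in range(rows)) + 2 * lam * x[j]
            x[j] -= step * g
    return x

A = [[1.0, 0.0], [0.0, 2.0], [1.0, 1.0]]  # toy forward model
y = [1.0, 2.0, 2.0]                       # measurements consistent with x = [1, 1]
x_hat = mbir_gradient_descent(A, y)       # regularization pulls x_hat slightly below [1, 1]
```

In the project's setting, `A` would encode ultrasonic wave propagation through cement, and the regularizer would be chosen to favor physically plausible reflectivity maps.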

  20. Geometry Calibration of the SVT in the CLAS12 Detector

    NASA Astrophysics Data System (ADS)

    Davies, Peter; Gilfoyle, Gerard

    2016-09-01

    A new detector called CLAS12 is being built in Hall B as part of the 12 GeV Upgrade at Jefferson Lab to learn how quarks and gluons form nuclei. The Silicon Vertex Tracker (SVT) is one of the subsystems designed to track the trajectory of charged particles as they are emitted from the target at large angles. The sensors of the SVT consist of long, narrow strips embedded in a silicon substrate. There are 256 strips in a sensor, with stereo angles of 0-3°. The location of the strips must be known to a precision of a few microns in order to accurately reconstruct particle tracks with the required resolution of 50-60 microns. Our first step toward achieving this resolution was to validate the nominal geometry relative to the design specification. We also resolved differences between the design and the CLAS12, Geant4-based simulation code GEMC. We developed software to apply alignment shifts to the nominal design geometry from a survey of fiducial points on the structure that supports each sensor. The final geometry will be generated by a common package written in Java to ensure consistency between the simulation and reconstruction codes. The code will be tested by studying the impact of known distortions of the nominal geometry in simulation. Work supported by the University of Richmond and the US Department of Energy.
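
    Applying a survey-derived alignment shift typically reduces to a small rigid-body transform of each sensor's nominal strip endpoints. A minimal Python sketch (the transform parameters, endpoint values, and function names are illustrative assumptions, not the CLAS12 geometry package):

```python
import math

# Hedged sketch: apply a survey-derived alignment shift (small rotation about
# the beam axis plus a translation) to a nominal strip endpoint (x, y, z).

def apply_alignment(point, shift, angle_rad):
    """Rotate (x, y) about the z axis by angle_rad, then translate by shift."""
    x, y, z = point
    dx, dy, dz = shift
    c, s = math.cos(angle_rad), math.sin(angle_rad)
    return (c * x - s * y + dx, s * x + c * y + dy, z + dz)

# illustrative nominal endpoints (mm) and a shift of a few tens of microns
shift = (0.02, -0.01, 0.0)   # mm
angle = 1e-4                 # rad, small rotation from the fiducial survey
p0 = apply_alignment((70.0, 0.0, 0.0), shift, angle)
p1 = apply_alignment((70.0, 0.0, 330.0), shift, angle)
```

Because the required track resolution is 50-60 microns, even shifts at this scale matter, which is why the same transform must be applied identically in the simulation and reconstruction codes.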

  1. Radiation Transport and Shielding for Space Exploration and High Speed Flight Transportation

    NASA Technical Reports Server (NTRS)

    Maung, Khin Maung; Trapathi, R. K.

    1997-01-01

    Transportation of ions and neutrons in matter is of direct interest in several technologically important and scientific areas, including space radiation, cosmic ray propagation studies in the galactic medium, nuclear power plants, and radiological effects that impact industrial and public health. For the proper assessment of radiation exposure, both reliable transport codes and accurate data are needed. Nuclear cross section data are one of the essential inputs to the transport codes. In order to obtain an accurate parametrization of cross section data, theoretical input is indispensable, especially for processes where there is little or no experimental data available. During this grant period, work has been done on the use of relativistic equations and their one-body limits. The results will be useful in choosing an appropriate effective one-body equation for reaction calculations. Work has also been done to improve the database needed for the transport codes used in the studies of radiation transport and shielding for space exploration and high-speed flight transportation. A phenomenological model was developed for the total absorption cross sections, valid for any system of charged and/or uncharged collision pairs over the entire energy range. The success of the model is gratifying. It is being used by other federal agencies, national labs and universities. A list of publications based on the work during the grant period is given below, and copies are enclosed with this report.

  2. Press touch code: A finger press based screen size independent authentication scheme for smart devices.

    PubMed

    Ranak, M S A Noman; Azad, Saiful; Nor, Nur Nadiah Hanim Binti Mohd; Zamli, Kamal Z

    2017-01-01

    Due to recent advancements and appealing applications, the purchase rate of smart devices is increasing at a higher rate. In parallel, security-related threats and attacks on these devices are also increasing at a greater rate. As a result, a considerable number of attacks have been noted in the recent past. To resist these attacks, many password-based authentication schemes have been proposed. However, most of these schemes are not screen-size independent, whereas smart devices come in different sizes. Specifically, they are not suitable for miniature smart devices due to the small screen size and/or lack of full-sized keyboards. In this paper, we propose a new screen-size-independent password-based authentication scheme, which also offers an affordable defense against shoulder surfing, brute force, and smudge attacks. In the proposed scheme, the Press Touch (PT)-a.k.a., Force Touch in Apple's MacBook, Apple Watch, ZTE's Axon 7 phone; 3D Touch in iPhone 6 and 7; and so on-is transformed into a new type of code, named Press Touch Code (PTC). We design and implement three variants of it, namely mono-PTC, multi-PTC, and multi-PTC with Grid, on the Android Operating System. An in-lab experiment and a comprehensive survey have been conducted on 105 participants to demonstrate the effectiveness of the proposed scheme.
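
    The paper does not give implementation details here, but one plausible reading of a mono-PTC check is a secret expressed as a sequence of discrete press-intensity levels, matched position by position against quantized pressure readings. A hedged Python sketch in which the level count, thresholds, and all function names are assumptions rather than the authors' design:

```python
# Hedged sketch (NOT the published PTC algorithm): match a sequence of
# quantized press-intensity levels against an enrolled secret.

def quantize(pressure, levels=3):
    """Map a normalized pressure reading in [0, 1] to a discrete level 0..levels-1."""
    return min(int(pressure * levels), levels - 1)

def ptc_matches(secret_levels, attempt_pressures):
    """True only if every press in the attempt quantizes to the enrolled level."""
    if len(secret_levels) != len(attempt_pressures):
        return False
    return all(quantize(p) == s for s, p in zip(secret_levels, attempt_pressures))

secret = [0, 2, 1, 2]                              # enrolled press-touch code
ok = ptc_matches(secret, [0.1, 0.9, 0.5, 0.8])     # light, hard, medium, hard
bad = ptc_matches(secret, [0.9, 0.9, 0.5, 0.8])    # first press too hard
```

Because the code depends on press intensity rather than on-screen position, a scheme of this shape is naturally screen-size independent and leaves no positional smudge pattern, which is consistent with the threat model the abstract describes.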

  3. Press touch code: A finger press based screen size independent authentication scheme for smart devices

    PubMed Central

    Ranak, M. S. A. Noman; Nor, Nur Nadiah Hanim Binti Mohd; Zamli, Kamal Z.

    2017-01-01

    Due to recent advancements and appealing applications, the purchase rate of smart devices is increasing at a higher rate. In parallel, security-related threats and attacks on these devices are also increasing at a greater rate. As a result, a considerable number of attacks have been noted in the recent past. To resist these attacks, many password-based authentication schemes have been proposed. However, most of these schemes are not screen-size independent, whereas smart devices come in different sizes. Specifically, they are not suitable for miniature smart devices due to the small screen size and/or lack of full-sized keyboards. In this paper, we propose a new screen-size-independent password-based authentication scheme, which also offers an affordable defense against shoulder surfing, brute force, and smudge attacks. In the proposed scheme, the Press Touch (PT)—a.k.a., Force Touch in Apple’s MacBook, Apple Watch, ZTE’s Axon 7 phone; 3D Touch in iPhone 6 and 7; and so on—is transformed into a new type of code, named Press Touch Code (PTC). We design and implement three variants of it, namely mono-PTC, multi-PTC, and multi-PTC with Grid, on the Android Operating System. An in-lab experiment and a comprehensive survey have been conducted on 105 participants to demonstrate the effectiveness of the proposed scheme. PMID:29084262

  4. Design of inquiry-oriented science labs: impacts on students' attitudes

    NASA Astrophysics Data System (ADS)

    Baseya, J. M.; Francis, C. D.

    2011-11-01

    Background: Changes in lab style can lead to differences in learning. Two inquiry-oriented lab styles are guided inquiry (GI) and problem-based (PB). Students' attitudes towards lab are important to consider when choosing between GI and PB styles during curriculum design. Purpose: We examined the degree to which lab experiences are explained by a GI or a PB lab style vs. students' attitudes towards specific aspects of the experience, reflected by perceived excitement (exc), difficulty (dif), time efficiency (eff) and association between lab and lecture material (help). Sample: Approximately 1000 students attending first-semester, college biology lab for science majors at the University of Colorado at Boulder, USA, participated in the study. Design and method: In 2007, two labs were run as GI and one as PB. Formats were switched in 2008. Attitudes were assessed with a post-semester survey. Results: Only the four attitude variables (not lab style) had a strong relationship with overall lab rating, which was most strongly related to exc, followed by dif and help/eff. Dif and eff had the greatest influence on attitudes for or against GI vs. PB labs, and help and exc had little influence on a GI vs. a PB lab. Also, when dif was low, students' attitudes were not significantly different between PB and GI labs, but when dif was high, students rated GI labs significantly higher than PB labs. Conclusions: Students' attitudes towards lab are more dependent on specific aspects of the experience than on lab style. Changes in GI vs. PB lab styles primarily influence dif and eff rather than exc and help. Dif may be an important factor to consider when implementing a lab in the PB vs. the GI format. A GI format may therefore be preferable when dif is high and a PB format when dif is low.

  5. Awakening interest in the natural sciences - BASF's Kids' Labs.

    PubMed

    Lang, Cinthia

    2012-01-01

    At BASF's Ludwigshafen headquarters, kids and young adults in grades 1-13 can learn about chemistry in the Kids' Labs. Different programs exist for different levels of knowledge. In the two 'Hands-on Lab H2O & Co.' Kids' Labs, students from grades 1-6 explore the secrets of chemistry. BASF Kids' Labs have now been set up in over 30 countries. In Switzerland alone, almost 2,000 students have taken part in the 'Water Loves Chemistry' Kids' Lab since it was started in 2011. In Alsace, 600 students have participated to date. In the Teens' Lab 'Xplore Middle School', middle school students explore five different programs with the themes 'substance labyrinth', 'nutrition', 'coffee, caffeine & co.', 'cosmetics' and 'energy'. Biotechnological methods are the focus of the Teens' Lab 'Xplore Biotech' for students taking basic and advanced biology courses. In the 'Xplore High School' Teens' Lab, chemistry teachers present their own experimental lab instruction for students in basic and advanced chemistry courses. The Virtual Lab has been expanding the offerings of the BASF Kids' Labs since 2011. The online lab was developed by the company for the International Year Of Chemistry and gives kids and young adults the opportunity to do interactive experiments outside of the lab.

  6. Interactive, Online, Adsorption Lab to Support Discovery of the Scientific Process

    NASA Astrophysics Data System (ADS)

    Carroll, K. C.; Ulery, A. L.; Chamberlin, B.; Dettmer, A.

    2014-12-01

    Science students require more than methods practice in lab activities; they must gain an understanding of the application of the scientific process through lab work. Large classes, time constraints, and funding may limit student access to science labs, denying students access to the types of experiential learning needed to motivate and develop new scientists. Interactive, discovery-based computer simulations and virtual labs provide an alternative, low-risk opportunity for learners to engage in lab processes and activities. Students can conduct experiments, collect data, draw conclusions, and even abort a session. We have developed an online virtual lab, through which students can interactively develop as scientists as they learn about scientific concepts, lab equipment, and proper lab techniques. Our first lab topic is adsorption of chemicals to soil, but the methodology is transferrable to other topics. In addition to learning the specific procedures involved in each lab, the online activities will prompt exploration and practice in key scientific and mathematical concepts, such as unit conversion, significant digits, assessing risks, evaluating bias, and assessing quantity and quality of data. These labs are not designed to replace traditional lab instruction, but to supplement instruction on challenging or particularly time-consuming concepts. To complement classroom instruction, students can engage in a lab experience outside the lab and over a shorter time period than often required with real-world adsorption studies. More importantly, students can reflect, discuss, review, and even fail at their lab experience as part of the process to see why natural processes and scientific approaches work the way they do. Our Media Productions team has completed a series of online digital labs available at virtuallabs.nmsu.edu and scienceofsoil.com, and these virtual labs are being integrated into coursework to evaluate changes in student learning.

  7. The laboratory report: A pedagogical tool in college science courses

    NASA Astrophysics Data System (ADS)

    Ferzli, Miriam

    When viewed as a product rather than a process that aids in student learning, the lab report may become rote, busywork for both students and instructors. Students fail to see the purpose of the lab report, and instructors see them as a heavy grading load. If lab reports are taught as part of a process rather than a product that aims to "get the right answer," they may serve as pedagogical tools in college science courses. In response to these issues, an in-depth, web-based tutorial named LabWrite (www.ncsu.edu/labwrite) was developed to help students and instructors (www.ncsu.edu/labwrite/instructors) understand the purpose of the lab report as grounded in the written discourse and processes of science. The objective of this post-test only quasi-experimental study was to examine the role that in-depth instruction such as LabWrite plays in helping students to develop skills characteristic of scientifically literate individuals. Student lab reports from an introductory-level biology course at NC State University were scored for overall understanding of scientific concepts and scientific ways of thinking. The study also looked at students' attitudes toward science and lab report writing, as well as students' perceptions of lab reports in general. Significant statistical findings from this study show that students using LabWrite were able to write lab reports that showed a greater understanding of scientific investigations (p < .003) and scientific ways of thinking (p < .0001) than students receiving traditional lab report writing instruction. LabWrite also helped students develop positive attitudes toward lab reports as compared to non-LabWrite users (p < .01). Students using LabWrite seemed to perceive the lab report as a valuable tool for determining learning objectives, understanding science concepts, revisiting the lab experience, and documenting their learning.

  8. Fuel Cell Technology Status Composite Data Products | Hydrogen and Fuel

    Science.gov Websites

    Durability composite data products (CDPs) include: Hours to 10% Stack Voltage Degradation (CDP LAB 01, 5/8/17); Durability Lab Data Projection Sensitivity to Voltage Degradation Levels (CDP LAB 02, 5/8/17); Field and Lab Durability Projection Comparison (LAB 04, 5/5/17); and Comparison of MHE Field and Lab Data Voltage Durability (CDP LAB 05, 5/3/2016).

  9. Status of chemistry lab safety in Nepal.

    PubMed

    Kandel, Krishna Prasad; Neupane, Bhanu Bhakta; Giri, Basant

    2017-01-01

    Chemistry labs can become a dangerous environment for students, as the lab exercises involve hazardous chemicals, glassware, and equipment. Approximately one hundred thousand students take chemistry laboratory classes annually in Nepal. We conducted a survey on chemical lab safety issues across Nepal. In this paper, we assess the safety policy and equipment, the protocols and procedures followed, and waste disposal in chemistry teaching labs. A significant proportion of the respondents believed that there is no monitoring of safety in their lab (p<0.001). Even though many labs do not allow food and beverages inside the lab and have first aid kits, they lack some basic safety equipment. There is no institutional mechanism to dispose of lab waste, and chemical waste is disposed of haphazardly. A majority of the respondents believed that safety training should be a part of educational training (p = 0.001) and that they would benefit from a short course and/or workshop on lab safety (p<0.001).

  10. Status of chemistry lab safety in Nepal

    PubMed Central

    Kandel, Krishna Prasad; Neupane, Bhanu Bhakta

    2017-01-01

    Chemistry labs can become a dangerous environment for students, as the lab exercises involve hazardous chemicals, glassware, and equipment. Approximately one hundred thousand students take chemistry laboratory classes annually in Nepal. We conducted a survey on chemical lab safety issues across Nepal. In this paper, we assess the safety policy and equipment, the protocols and procedures followed, and waste disposal in chemistry teaching labs. A significant proportion of the respondents believed that there is no monitoring of safety in their lab (p<0.001). Even though many labs do not allow food and beverages inside the lab and have first aid kits, they lack some basic safety equipment. There is no institutional mechanism to dispose of lab waste, and chemical waste is disposed of haphazardly. A majority of the respondents believed that safety training should be a part of educational training (p = 0.001) and that they would benefit from a short course and/or workshop on lab safety (p<0.001). PMID:28644869

  11. Student Use of Physics to Make Sense of Incomplete but Functional VPython Programs in a Lab Setting

    NASA Astrophysics Data System (ADS)

    Weatherford, Shawn A.

    2011-12-01

    Computational activities in Matter & Interactions, an introductory calculus-based physics course, have the instructional goal of providing students with the experience of applying the same small set of fundamental principles to model a wide range of physical systems. However, there are significant instructional challenges for students building computer programs under limited time constraints, especially for students who are unfamiliar with programming languages and concepts. Prior attempts at designing effective computational activities were successful at having students ultimately build working VPython programs under the tutelage of experienced teaching assistants in a studio lab setting. A pilot study revealed that students who completed these computational activities had significant difficulty repeating the exact same tasks and, further, had difficulty predicting the animation that would be produced by the example program after interpreting the program code. This study explores the interpretation and prediction tasks as part of an instructional sequence where students are asked to read and comprehend a functional, but incomplete, program. Rather than asking students to begin their computational tasks by modifying program code, we explicitly ask students to interpret an existing program that is missing key lines of code. The missing lines of code correspond to the algebraic form of fundamental physics principles or the calculation of forces which would exist between analogous physical objects in the natural world. Students are then asked to draw a prediction of what they would see in the simulation produced by the VPython program and ultimately run the program to evaluate the students' prediction. This study specifically looks at how the participants use physics while interpreting the program code and creating a whiteboard prediction.
This study also examines how students evaluate their understanding of the program and modification goals at the beginning of the modification task. While working in groups over the course of a semester, study participants were recorded while they completed three activities using these incomplete programs. Analysis of the video data showed that study participants had little difficulty interpreting physics quantities, generating a prediction, or determining how to modify the incomplete program. Participants did not base their prediction solely on the information from the incomplete program. When participants tried to predict the motion of the objects in the simulation, many turned to their knowledge of how the system would evolve if it represented an analogous real-world physical system. For example, participants attributed the real-world behavior of springs to helix objects even though the program did not include calculations for the spring to exert a force when stretched. Participants rarely interpreted lines of code in the computational loop during the first computational activity, but this changed during later computational activities, with most participants using their physics knowledge to interpret the computational loop. Computational activities in the Matter & Interactions curriculum were revised in light of these findings to include an instructional sequence of tasks to build a comprehension of the example program. The modified activities also ask students to create an additional whiteboard prediction for the time-evolution of the real-world phenomena which the example program will eventually model.
This thesis shows how comprehension tasks identified by Palincsar and Brown (1984) as effective in improving reading comprehension are also effective in helping students apply their physics knowledge to interpret a computer program that attempts to model a real-world phenomenon and to identify errors in their understanding of the use, or omission, of fundamental physics principles in a computational model.
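
    The computational loop in question applies the momentum principle (p_final = p_initial + F_net Δt) step by step. A sketch in plain Python (no graphics; the constants are illustrative, and the force line is exactly the kind of "missing line" students are asked to supply for a mass on a spring):

```python
# Hedged sketch of a Matter & Interactions-style computational loop for a
# mass on a spring, with the momentum principle applied iteratively.
# Constants are illustrative, not from the study's actual programs.

m = 0.5        # kg, mass
k = 10.0       # N/m, spring stiffness
L0 = 0.3       # m, relaxed spring length
x = 0.4        # m, spring initially stretched by 0.1 m
p = 0.0        # kg m/s, initial momentum
dt = 0.001     # s, time step

for _ in range(1000):           # simulate 1 s of motion
    s = x - L0                  # current stretch
    F = -k * s                  # spring force: the line students must supply
    p = p + F * dt              # momentum principle: p_final = p_initial + F dt
    x = x + (p / m) * dt        # position update from the new momentum
```

Without the `F = -k * s` line the helix object exerts no force and the mass never oscillates, which matches the study's observation that participants nonetheless predicted spring-like motion from real-world experience.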

  12. Reflections on Three Corporate Research Labs: Bell Labs, HP Labs, Agilent Labs

    NASA Astrophysics Data System (ADS)

    Hollenhorst, James

    2008-03-01

    This will be a personal reflection on corporate life and physics-based research in three industrial research labs over three decades, Bell Labs during the 1980's, HP Labs during the 1990's, and Agilent Labs during the 2000's. These were times of great change in all three companies. I'll point out some of the similarities and differences in corporate cultures and how this impacted the research and development activities. Along the way I'll mention some of the great products that resulted from physics-based R&D.

  13. Lithospheric-Mantle Structure of the Kaapvaal Craton, South Africa, Derived From Thermodynamically Self-Consistent Modelling of Seismic Surface-Wave and S-wave Receiver Function, Heat-flow, Elevation, Xenolith and Magnetotelluric Observations

    NASA Astrophysics Data System (ADS)

    Muller, M. R.; Fullea, J.; Jones, A. G.; Adam, J.; Lebedev, S.; Piana Agostinetti, N.

    2012-12-01

    Results from recent geophysical and mantle-xenolith geochemistry studies of the Kaapvaal Craton appear, at times, to provide disparate views of the physical, chemical and thermal structure of the lithosphere. Models from our recent SAMTEX magnetotelluric (MT) surveys across the Kaapvaal Craton indicate a resistive, 220-240 km thick lithosphere for the central core of the craton. One published S-wave receiver function (SRF) study and other surface-wave studies suggest a thinner lithosphere characterised by a ~160 km thick high-velocity "lid" underlain by a low-velocity zone (LVZ) of between 65-150 km in thickness. Other seismic studies suggest that the (high-velocity) lithosphere is thicker, in excess of 220 km. Mantle xenolith pressure-temperature arrays from Mesozoic kimberlites require that the base of the "thermal" lithosphere (i.e., the depth above which a conductive geotherm is maintained) is at least 220 km deep, to account for mantle geotherms in the range 35-38 mWm-2. Richly diamondiferous kimberlites across the Kaapvaal Craton require a lithospheric thickness substantially greater than 160 km - the depth of the top of the diamond stability field. In this paper we use the recently developed LitMod software code to derive, thermodynamically consistently, a range of 1-D seismic velocity, density, electrical resistivity and temperature models from layered geochemical models of the lithosphere based on mantle xenolith compositions. In our work, the "petrological" lithosphere-asthenosphere boundary (pLAB) (i.e., the top of the fertile asthenospheric-mantle) and the "thermal" LAB (tLAB as defined above) are coincident. Lithospheric-mantle models are found simultaneously satisfying all geophysical observables: new surface-wave dispersion data, published SRFs, MT responses, surface elevation and heat-flow. Our results show: 1. All lithospheric-mantle models are characterised by a seismic LVZ with a minimum velocity at the depth of the petrological/thermal LAB. 
The top of the LVZ does not correspond with the LAB. 2. Thin (~160 km-thick) lithospheric-mantle models are consistent with surface elevation and heat-flow observations only for unreasonably low average crustal heat production values (~0.4 μWm-3). However, such models are inconsistent both with the surface-wave dispersion data and youngest (Group I) palaeo-geotherms defined by xenolith P-T arrays. 3. A three-layered geochemical model (consistent with mantle xenoliths), with lithospheric thickness in excess of 220 km, is required to match all geophysical constraints. 4. The chemical transition from a depleted harzburgitic composition (above) to a refertilised high-T lherzolitic composition (below) at 160 km depth produces a sharp onset of the seismic LVZ and a sharp increase in density. Synthetic SRFs will assess whether this chemical transition may account for the reported S-to-P conversion event at 160 km depth. However, in this instance the SRF conversion event would not represent the petrological/thermal LAB.

  14. Kinematic Labs with Mobile Devices

    NASA Astrophysics Data System (ADS)

    Kinser, Jason M.

    2015-07-01

    This book provides 13 labs spanning the common topics in the first semester of university-level physics. Each lab is designed to use only the student's smartphone, laptop and items easily found in big-box stores or a hobby shop. Each lab contains theory, set-up instructions and basic analysis techniques. All of these labs can be performed outside of the traditional university lab setting, with initial costs averaging less than $8 per student, per lab.

  15. Automated classification of MMPI profiles into psychotic, neurotic or personality disorder types.

    PubMed

    Hatcher, W E

    1978-03-01

    A Fortran program has been developed which can objectively classify Minnesota Multiphasic Personality Inventory (MMPI) profiles as being psychotic, neurotic, personality disorder, or indeterminate types. The method used is a set of configural rules, 'Henrichs' rules for males'. The only input data required are K-corrected T scores, which are the end product of standard scoring techniques. To automate these rules it was necessary to rewrite them so that all decisions were the result of arithmetic comparisons or logical tests using only and, or and not. In particular, examination of the Welsh code, which many rules required, had to be simulated by the use of several sorted arrays. The program has been carefully tested and is in use in our computer lab.
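The rule encoding described above, arithmetic comparisons combined with and/or/not, can be sketched as follows. The rule shown is a hypothetical example, not one of Henrichs' actual rules, and the scale names and cutoffs are assumptions for illustration only.

```python
# Hypothetical configural rule expressed purely as arithmetic
# comparisons and and/or/not logic, in the spirit of the automated
# Henrichs rules. Input is a dict of K-corrected T scores; the real
# rule set also yields a "personality disorder" type, omitted here.

def classify(t):
    # Hypothetical "psychotic" indicator: Pa and Sc elevated, with Sc
    # exceeding the highest of the "neurotic triad" scales Hs, D, Hy.
    psychotic = (t["Pa"] >= 70 and t["Sc"] >= 70
                 and t["Sc"] > max(t["Hs"], t["D"], t["Hy"]))
    # Hypothetical "neurotic" indicator: triad elevated, not psychotic.
    neurotic = (not psychotic
                and max(t["Hs"], t["D"], t["Hy"]) >= 70)
    if psychotic:
        return "psychotic"
    if neurotic:
        return "neurotic"
    return "indeterminate"
```

Each decision reduces to comparisons and boolean connectives, which is exactly the property that made the original rules mechanizable in Fortran.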

  16. Wave journal bearing with compressible lubricant--Part 1: The wave bearing concept and a comparison to the plain circular bearing

    NASA Technical Reports Server (NTRS)

    Dimofte, Florin

    1995-01-01

    To improve hydrodynamic journal bearing steady-state and dynamic performance, a new bearing concept, the wave journal bearing, was developed at the author's lab. This concept features a waved inner bearing diameter. Compared to other alternative bearing geometries used to improve bearing performance, such as spiral or herring-bone grooves, steps, etc., the wave bearing's design is relatively simple and allows the shaft to rotate in either direction. A three-wave bearing operating with a compressible lubricant (i.e., gas) is analyzed using a numerical code. Its performance is compared to a plain (truly circular) bearing over a broad range of bearing working parameters, e.g., bearing numbers from 0.01 to 100.

  17. Simulations of Field-Emission Electron Beams from CNT Cathodes in RF Photoinjectors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mihalcea, Daniel; Faillace, Luigi; Panuganti, Harsha

    2015-06-01

    Average field emission currents of up to 700 mA were produced by Carbon Nano Tube (CNT) cathodes in a 1.3 GHz RF gun at Fermilab's High Brightness Electron Source Lab (HBESL). The CNT cathodes were manufactured at Xintek and tested under DC conditions at RadiaBeam. The electron beam intensity as well as the other beam properties are directly related to the time-dependent electric field at the cathode and the geometry of the RF gun. This report focuses on simulations of the electron beam generated through field emission, and the results are compared with experimental measurements. These simulations were performed with the time-dependent Particle In Cell (PIC) code WARP.
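Field-emitted current depends on the instantaneous surface field roughly through a Fowler-Nordheim-type law, which is why the beam tracks the time-dependent RF field at the cathode; the sketch below uses placeholder constants, not values from these WARP simulations.

```python
import math

# Simplified Fowler-Nordheim-type current density J = a * F**2 * exp(-b / F).
# The constants lump together work-function and field-enhancement factors;
# the values below are placeholders for illustration only.
A_FN = 1.5e-6   # assumed prefactor (arb. units)
B_FN = 6.8e9    # assumed exponential slope, V/m

def fn_current_density(field):
    """Current density (arb. units) for surface field strength `field` (V/m)."""
    return A_FN * field ** 2 * math.exp(-B_FN / field)
```

The strong exponential dependence concentrates emission near the crest of the RF field, so doubling the field raises the emitted current density by orders of magnitude rather than a factor of four.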

  18. Sandia’s Current Energy Conversion module for the Flexible-Mesh Delft3D flow solver v. 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chartand, Chris; Jagers, Bert

    The DOE has funded Sandia National Labs (SNL) to develop an open-source modeling tool to guide the design and layout of marine hydrokinetic (MHK) arrays to maximize power production while minimizing environmental effects. This modeling framework simulates flows through and around MHK arrays while quantifying environmental responses. SNL-Delft3D-CEC-FM is an augmented version of Delft3D, the environmental hydrodynamics code developed by the Dutch company Deltares; it adds a new module that simulates energy conversion (momentum withdrawal) by MHK current energy conversion devices, with commensurate changes in the turbulent kinetic energy and its dissipation rate. SNL-Delft3D-CEC-FM modifies the Delft3D flexible-mesh flow solver, DFlowFM.
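The "momentum withdrawal" such a module applies can be pictured with a simple actuator-disk sketch; the thrust and power coefficients, device area and flow speed below are illustrative assumptions, not SNL-Delft3D-CEC-FM inputs.

```python
# Actuator-disk sketch of an MHK device as a momentum sink: the thrust
# removed from the flow and the power converted scale with the square
# and cube of the flow speed. All values are illustrative assumptions.
RHO = 1025.0  # seawater density, kg/m^3

def thrust(ct, area, speed):
    """Thrust (N) withdrawn from the flow by a device of swept area
    `area` (m^2) with thrust coefficient ct at flow speed `speed` (m/s)."""
    return 0.5 * RHO * ct * area * speed ** 2

def power(cp, area, speed):
    """Power (W) converted by the device with power coefficient cp."""
    return 0.5 * RHO * cp * area * speed ** 3
```

In a flow solver this thrust is distributed over the grid cells occupied by the device, with matching source terms for turbulent kinetic energy and its dissipation rate.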

  19. Active imaging systems to see through adverse conditions: Light-scattering based models and experimental validation

    NASA Astrophysics Data System (ADS)

    Riviere, Nicolas; Ceolato, Romain; Hespel, Laurent

    2014-10-01

    Onera, the French aerospace lab, develops and models active imaging systems to understand the relevant physical phenomena affecting these systems' performance. As a consequence, effort has been devoted to the propagation of a pulse through the atmosphere and to target geometries and surface properties. These imaging systems must operate at night in all ambient illumination and weather conditions in order to perform strategic surveillance for various worldwide operations. We have implemented codes for 2D and 3D laser imaging systems. As we aim to image a scene in the presence of rain, snow, fog or haze, we introduce such light-scattering effects in our numerical models and compare simulated images with measurements provided by commercial laser scanners.
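One basic ingredient of any such scattering model is exponential (Beer-Lambert) attenuation of the laser pulse by the medium; the extinction coefficients below are assumed illustrative values, not Onera model parameters.

```python
import math

def two_way_transmission(extinction, range_m):
    """Fraction of pulse energy surviving a round trip to a target at
    range_m metres through a medium with the given extinction
    coefficient (1/m), per the Beer-Lambert law."""
    return math.exp(-2.0 * extinction * range_m)

# Illustrative extinction coefficients (assumed, not measured values):
clear = two_way_transmission(0.1e-3, 500.0)  # near-clear air
fog = two_way_transmission(10e-3, 500.0)     # dense fog
```

The round-trip factor of two is what makes active imagers so sensitive to adverse conditions: a modest one-way loss is squared on the return path.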

  20. Cloud prediction of protein structure and function with PredictProtein for Debian.

    PubMed

    Kaján, László; Yachdav, Guy; Vicedo, Esmeralda; Steinegger, Martin; Mirdita, Milot; Angermüller, Christof; Böhm, Ariane; Domke, Simon; Ertl, Julia; Mertes, Christian; Reisinger, Eva; Staniewski, Cedric; Rost, Burkhard

    2013-01-01

    We report the release of PredictProtein for the Debian operating system and derivatives, such as Ubuntu, Bio-Linux, and Cloud BioLinux. The PredictProtein suite is available as a standard set of open source Debian packages. The release covers the most popular prediction methods from the Rost Lab, including methods for the prediction of secondary structure and solvent accessibility (profphd), nuclear localization signals (predictnls), and intrinsically disordered regions (norsnet). We also present two case studies that successfully utilize PredictProtein packages for high performance computing in the cloud: the first analyzes protein disorder for whole organisms, and the second analyzes the effect of all possible single sequence variants in protein coding regions of the human genome.

  2. Instrumentation for laser physics and spectroscopy using 32-bit microcontrollers with an Android tablet interface

    NASA Astrophysics Data System (ADS)

    Eyler, E. E.

    2013-10-01

    Several high-performance lab instruments suitable for manual assembly have been developed using low-pin-count 32-bit microcontrollers that communicate with an Android tablet via a USB interface. A single Android tablet app accommodates multiple interface needs by uploading parameter lists and graphical data from the microcontrollers, which are themselves programmed with easily modified C code. The hardware design of the instruments emphasizes low chip counts and is highly modular, relying on small "daughter boards" for special functions such as USB power management, waveform generation, and phase-sensitive signal detection. In one example, a daughter board provides a complete waveform generator and direct digital synthesizer that fits on a 1.5 in. × 0.8 in. circuit card.
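The direct digital synthesizer mentioned above rests on simple phase-accumulator arithmetic, which can be sketched as follows; the register width, clock rate and table size are assumed values, not the specifications of the daughter board described in the abstract.

```python
import math

# Phase-accumulator sketch of a direct digital synthesizer (DDS).
# Register width, clock rate and table size are assumed values.
ACC_BITS = 32
F_CLOCK = 50_000_000  # assumed DDS clock rate, Hz

def tuning_word(f_out):
    """Phase increment giving f_out accumulator rollovers per second."""
    return round(f_out * 2 ** ACC_BITS / F_CLOCK)

def dds_samples(f_out, n, table):
    """First n lookup-table samples at output frequency f_out.
    The table length must be a power of two."""
    shift = ACC_BITS - (len(table).bit_length() - 1)
    acc, inc, out = 0, tuning_word(f_out), []
    for _ in range(n):
        out.append(table[acc >> shift])      # top accumulator bits index the table
        acc = (acc + inc) & (2 ** ACC_BITS - 1)  # wrap on overflow
    return out

sine = [math.sin(2 * math.pi * i / 256) for i in range(256)]
```

Because the output frequency is f_out = inc * F_CLOCK / 2**ACC_BITS, frequency resolution is set by the accumulator width alone, which is what makes a DDS attractive on a small daughter board.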

  3. Overall migration and specific migration of bisphenol A diglycidyl ether monomer and m-xylylenediamine hardener from an optimized epoxy-amine formulation into water-based food simulants.

    PubMed

    Simal Gándara, J; López Mahía, P; Paseiro Losada, P; Simal Lozano, J; Paz Abuín, S

    1993-01-01

    The overall and specific migrations of BADGE n = 0 monomer and m-XDA hardener from BEPOX LAB 889 (Gairesa internal code), an epoxy system cured at room temperature, into three water-based food simulants are studied. Hydrolysis of BADGE n = 0 was observed in all of these simulants, giving more polar products. We thus propose changing the EEC Directives, which at present only legislate for levels of BADGE n = 0 monomer in the simulants, to include the hydrolysis products of BADGE monomers. Another alternative would be to express all the migration levels due to BADGE and its derived products in terms of BADGE itself.

  4. nodeGame: Real-time, synchronous, online experiments in the browser.

    PubMed

    Balietti, Stefano

    2017-10-01

    nodeGame is a free, open-source JavaScript/HTML5 framework for conducting synchronous experiments online and in the lab directly in the browser window. It is specifically designed to support behavioral research along three dimensions: (i) larger group sizes, (ii) real-time (but also discrete time) experiments, and (iii) batches of simultaneous experiments. nodeGame has a modular source code, and defines an API (application programming interface) through which experimenters can create new strategic environments and configure the platform. With zero-install, nodeGame can run on a great variety of devices, from desktop computers to laptops, smartphones, and tablets. The current version of the software is 3.0, and extensive documentation is available on the wiki pages at http://nodegame.org.

  5. A comparative study on real lab and simulation lab in communication engineering from students' perspectives

    NASA Astrophysics Data System (ADS)

    Balakrishnan, B.; Woods, P. C.

    2013-05-01

    Over the years, rapid development in computer technology has engendered simulation-based laboratory (lab) in addition to the traditional hands-on (physical) lab. Many higher education institutions adopt simulation labs, replacing some existing physical lab experiments. The creation of new systems for conducting engineering lab activities has raised concerns among educators on the merits and shortcomings of both physical and simulation labs; at the same time, many arguments have been raised on the differences of both labs. Investigating the effectiveness of both labs is complicated, as there are multiple factors that should be considered. In view of this challenge, a study on students' perspectives on their experience related to key aspects on engineering laboratory exercise was conducted. In this study, the Visual, Auditory, Read/write and Kinesthetic (VARK) model was utilised to measure the students' cognitive styles. The investigation was done through a survey among participants from Multimedia University, Malaysia. The findings revealed that there are significant differences for most of the aspects in physical and simulation labs.

  6. Locations of serial reach targets are coded in multiple reference frames.

    PubMed

    Thompson, Aidan A; Henriques, Denise Y P

    2010-12-01

    Previous work from our lab, and elsewhere, has demonstrated that remembered target locations are stored and updated in an eye-fixed reference frame. That is, reach errors systematically vary as a function of gaze direction relative to a remembered target location, not only when the target is viewed in the periphery (Bock, 1986, known as the retinal magnification effect), but also when the target has been foveated, and the eyes subsequently move after the target has disappeared but prior to reaching (e.g., Henriques, Klier, Smith, Lowy, & Crawford, 1998; Sorrento & Henriques, 2008; Thompson & Henriques, 2008). These gaze-dependent errors, following intervening eye movements, cannot be explained by representations whose frame is fixed to the head, body or even the world. However, it is unknown whether targets presented sequentially would all be coded relative to gaze (i.e., egocentrically/absolutely), or if they would be coded relative to the previous target (i.e., allocentrically/relatively). It might be expected that the reaching movements to two targets separated by 5° would differ by that distance. But, if gaze were to shift between the first and second reaches, would the movement amplitude between the targets differ? If the target locations are coded allocentrically (i.e., the location of the second target coded relative to the first) then the movement amplitude should be about 5°. But, if the second target is coded egocentrically (i.e., relative to current gaze direction), then the reaches to this target and the distances between the subsequent movements should vary systematically with gaze as described above. 
We found that requiring an intervening saccade to the opposite side of 2 briefly presented targets between reaches to them resulted in a pattern of reaching error that systematically varied as a function of the distance between current gaze and target, and led to a systematic change in the distance between the sequential reach endpoints as predicted by an egocentric frame anchored to the eye. However, the amount of change in this distance was smaller than predicted by a pure eye-fixed representation, suggesting that relative positions of the targets or allocentric coding was also used in sequential reach planning. The spatial coding and updating of sequential reach target locations seems to rely on a combined weighting of multiple reference frames, with one of them centered on the eye. Copyright © 2010 Elsevier Ltd. All rights reserved.

  7. TangoLab-2 Card Troubleshooting

    NASA Image and Video Library

    2017-10-17

    iss053e105442 (Oct. 17, 2017) --- Flight Engineer Mark Vande Hei swaps out a payload card from the TangoLab-1 facility and places it into the TangoLab-2 facility. TangoLab provides a standardized platform and open architecture for experimental modules called CubeLabs. CubeLab modules may be developed for use in 3-dimensional tissue and cell cultures.

  8. MatLab Script and Functional Programming

    NASA Technical Reports Server (NTRS)

    Shaykhian, Gholam Ali

    2007-01-01

    MatLab Script and Functional Programming: MatLab is one of the most widely used very high level programming languages for scientific and engineering computations. It is very user-friendly and needs practically no formal programming knowledge. Presented here are MatLab programming aspects, not just MatLab commands, for scientists and engineers who lack formal programming training and have little time to spare for learning it to solve their real-world problems. Specifically provided are programs for visualization. The MatLab seminar covers the functional and script programming aspects of the MatLab language. Specific expectations are: a) Recognize MatLab commands, script and function. b) Create and run a MatLab function. c) Read, recognize, and describe MatLab syntax. d) Recognize decisions, loops and matrix operators. e) Evaluate scope among multiple files, and multiple functions within a file. f) Declare, define and use scalar variables, vectors and matrices.

  9. An Evaluation of Two Hands-On Lab Styles for Plant Biodiversity in Undergraduate Biology

    ERIC Educational Resources Information Center

    Basey, John M.; Maines, Anastasia P.; Francis, Clinton D.; Melbourne, Brett

    2014-01-01

    We compared learning cycle and expository formats for teaching about plant biodiversity in an inquiry-oriented university biology lab class (n = 465). Both formats had preparatory lab activities, a hands-on lab, and a postlab with reflection and argumentation. Learning was assessed with a lab report, a practical quiz in lab, and a multiple-choice…

  10. Crustal Fracturing Field and Presence of Fluid as Revealed by Seismic Anisotropy

    NASA Astrophysics Data System (ADS)

    Pastori, M.; Piccinini, D.; de Gori, P.; Margheriti, L.; Barchi, M. R.; di Bucci, D.

    2010-12-01

    In the last three years, we developed, tested and improved an automatic analysis code (Anisomat+) to calculate the shear wave splitting parameters: fast polarization direction (φ) and delay time (∂t). The code is a set of MatLab scripts able to retrieve crustal anisotropy parameters from three-component seismic recordings of local earthquakes using the horizontal-component cross-correlation method. The analysis procedure consists of choosing an appropriate frequency range that best highlights the signal containing the shear waves, and a time window on the seismogram centered on the S arrival (containing at least one cycle of the S wave). The code was compared to two other automatic analysis codes (SPY and SHEBA) and tested in three Italian areas along the Apennine mountains (Val d’Agri, the Tiber Valley and the area surrounding L’Aquila). For each region we used the anisotropic parameters resulting from the automatic computation as a tool to determine the fracture field geometries connected with the active stress field. We compare the temporal variations of the anisotropic parameters to the evolution of the vp/vs ratio for the same seismicity. The anisotropic fast directions are used to define the active stress field (EDA model), finding a general consistency between fast directions and the main stress indicators (focal mechanisms and borehole break-outs). The magnitude of the delay time is used to define the fracture field intensity, finding higher values in the volume where micro-seismicity occurs. Furthermore, we studied temporal variations of the anisotropic parameters and the vp/vs ratio in order to assess whether fluids play an important role in the earthquake generation process. The close association of variations in the anisotropic and vp/vs parameters with seismicity rate changes supports the hypothesis that the background seismicity is influenced by fluctuations of pore fluid pressure in the rocks.
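The delay-time half of the cross-correlation method can be sketched as below: the splitting delay is estimated as the lag maximizing the cross-correlation of the fast and slow rotated traces. The synthetic Gaussian pulses and sampling rate are assumptions for illustration, not Anisomat+ internals (which also grid-search the rotation angle to find φ).

```python
import math

def delay_time(fast, slow, dt):
    """Lag (s) maximizing the cross-correlation of two equal-length traces."""
    n = len(fast)
    best_lag, best_cc = 0, float("-inf")
    for lag in range(-(n - 1), n):
        # Sum over the overlapping part of the two traces at this lag.
        cc = sum(slow[i] * fast[i - lag]
                 for i in range(max(0, lag), min(n, n + lag)))
        if cc > best_cc:
            best_lag, best_cc = lag, cc
    return best_lag * dt

dt = 0.01  # 100 Hz sampling, assumed
pulse = [math.exp(-((i * dt - 0.5) / 0.05) ** 2) for i in range(200)]    # synthetic S pulse
delayed = [math.exp(-((i * dt - 0.6) / 0.05) ** 2) for i in range(200)]  # same pulse 0.1 s later
```

In a real splitting measurement this lag search is repeated over trial rotation angles of the horizontal components; the angle maximizing the correlation gives φ and the corresponding lag gives ∂t.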

  11. Value added or misattributed? A multi-institution study on the educational benefit of labs for reinforcing physics content

    NASA Astrophysics Data System (ADS)

    Holmes, N. G.; Olsen, Jack; Thomas, James L.; Wieman, Carl E.

    2017-06-01

    Instructional labs are widely seen as a unique, albeit expensive, way to teach scientific content. We measured the effectiveness of introductory lab courses at achieving this educational goal across nine different lab courses at three very different institutions. These institutions and courses encompassed a broad range of student populations and instructional styles. The nine courses studied had two key things in common: the labs aimed to reinforce the content presented in lectures, and the labs were optional. By comparing the performance of students who did and did not take the labs (with careful normalization for selection effects), we found universally and precisely no added value to learning course content from taking the labs as measured by course exam performance. This work should motivate institutions and departments to reexamine the goals and conduct of their lab courses, given their resource-intensive nature. We show why these results make sense when looking at the comparative mental processes of students involved in research and instructional labs, and offer alternative goals and instructional approaches that would make lab courses more educationally valuable.

  12. Telemetering and telecommunications research

    NASA Technical Reports Server (NTRS)

    Osborne, William P.

    1991-01-01

    The research center activities during the reporting period have focused on three areas: (1) developing the necessary equipment and test procedures to support the testing of 8PSK-TCM through TDRSS from the WSGT; (2) extending the theoretical decoder work to higher speeds with a design goal of 600 Mbps at 2 bits/Hz; and (3) completing the initial phase of the CPFSK Multi-H research and determining what subsets (if any) of these coding schemes are useful in the TDRSS environment. The equipment for the WSGT TCM testing has been completed and is functioning in the lab at NMSU. Measured results to date indicate that the uncoded system with the modified HRD and NMSU symbol sync operates at 1 to 1.5 dB from theory when processing encoded 8PSK. The NMSU pragmatic decoder when combined with these units produces approximately 2.9 dB of coding gain at 10^-5 BER. Our study of CPFSK with Multi-H coding has reached a critical stage. The principal conclusions reached in this activity are: (1) no scheme using Multi-H alone investigated by us or found in the literature produces power/bandwidth trades that are as good as TCM with filtered 8PSK; (2) when Multi-H is combined with convolutional coding, one can obtain better coding gain than with Multi-H alone but still no better power/bandwidth performance than TCM and these gains are available only with complex receivers; (3) the only advantage we can find for the CPFSK schemes over filtered MPSK with TCM is that they are constant envelope (however, constant envelope is of no benefit in a multiple access channel and of questionable benefit in a single access channel since driving the TWT to saturation in this situation is generally acceptable); and (4) based upon these results the center's research program will focus on concluding the existing CPFSK studies.

  13. Experimental benchmark of the NINJA code for application to the Linac4 H- ion source plasma

    NASA Astrophysics Data System (ADS)

    Briefi, S.; Mattei, S.; Rauner, D.; Lettry, J.; Tran, M. Q.; Fantz, U.

    2017-10-01

    For a dedicated performance optimization of negative hydrogen ion sources applied at particle accelerators, a detailed assessment of the plasma processes is required. Due to the compact design of these sources, diagnostic access is typically limited to optical emission spectroscopy, yielding only line-of-sight-integrated results. In order to allow for a spatially resolved investigation, the electromagnetic particle-in-cell Monte Carlo collision code NINJA has been developed for the Linac4 ion source at CERN. This code considers the RF field generated by the ICP coil as well as the external static magnetic fields and calculates self-consistently the resulting discharge properties. NINJA is benchmarked against the diagnostically well-accessible lab experiment CHARLIE (Concept studies for Helicon Assisted RF Low pressure Ion sourcEs) at varying RF power and gas pressure. A good general agreement is observed between experiment and simulation, although the simulated electron density trends for varying pressure and power as well as the absolute electron temperature values deviate slightly from the measured ones. This can be explained by the assumption of strong inductive coupling in NINJA, whereas the CHARLIE discharges show the characteristics of loosely coupled plasmas. For the Linac4 plasma, this assumption is valid. Accordingly, both the absolute values of the accessible plasma parameters and their trends for varying RF power agree well in measurement and simulation. At varying RF power, the H- current extracted from the Linac4 source peaks at 40 kW. For volume operation, this is reflected in the simulated processes in front of the extraction aperture, where the highest H- density is obtained at the same power level. In surface operation, the production of negative hydrogen ions at the converter surface can only be considered by specialized beam formation codes, which require plasma parameters as input. 
It has been demonstrated that this input can be provided reliably by the NINJA code.

  14. Teachers' Perspectives on Online Virtual Labs vs. Hands-On Labs in High School Science

    NASA Astrophysics Data System (ADS)

    Bohr, Teresa M.

    This study of online science teachers' opinions addressed the use of virtual labs in online courses. A growing number of schools use virtual labs that must meet mandated laboratory standards to ensure they provide learning experiences comparable to hands-on labs, which are an integral part of science curricula. The purpose of this qualitative case study was to examine teachers' perceptions of the quality and effectiveness of high school virtual labs. The theoretical foundation was constructivism, as labs provide student-centered activities for problem solving, inquiry, and exploration of phenomena. The research questions focused on experienced teachers' perceptions of the quality of virtual vs. hands-on labs. Data were collected through survey questions derived from the lab objectives of The Next Generation Science Standards. Eighteen teachers rated the degree of importance of each objective and also rated how they felt virtual labs met these objectives; these ratings were reported using descriptive statistics. Responses to open-ended questions were few and served to illustrate the numerical results. Many teachers stated that virtual labs are valuable supplements but could not completely replace hands-on experiences. Studies on the quality and effectiveness of high school virtual labs are limited despite widespread use. Comprehensive studies will ensure that online students have equal access to quality labs. School districts need to define lab requirements, and colleges need to specify the lab experience they require. This study has potential to inspire positive social change by assisting science educators, including those in the local school district, in evaluating and selecting courseware designed to promote higher order thinking skills, real-world problem solving, and development of strong inquiry skills, thereby improving science instruction for all high school students.

  15. The Influence of Tablet PCs on Students' Use of Multiple Representations in Lab Reports

    NASA Astrophysics Data System (ADS)

    Guelman, Clarisa Bercovich; De Leone, Charles; Price, Edward

    2009-11-01

    This study examined how different tools influenced students' use of representations in the Physics laboratory. In one section of a lab course, every student had a Tablet PC that served as a digital-ink based lab notebook. Students could seamlessly create hand-drawn graphics and equations, and write lab reports on the same computer used for data acquisition, simulation, and analysis. In another lab section, students used traditional printed lab guides, kept paper notebooks, and then wrote lab reports on regular laptops. Analysis of the lab reports showed differences between the sections' use of multiple representations, including an increased use of diagrams and equations by the Tablet users.

  16. Integrating Multiple On-line Knowledge Bases for Disease-Lab Test Relation Extraction.

    PubMed

    Zhang, Yaoyun; Soysal, Ergin; Moon, Sungrim; Wang, Jingqi; Tao, Cui; Xu, Hua

    2015-01-01

    A computable knowledge base containing relations between diseases and lab tests would be a great resource for many biomedical informatics applications. This paper describes our initial step towards establishing a comprehensive knowledge base of disease-lab test relations utilizing three public on-line resources. LabTestsOnline, MedlinePlus and Wikipedia are integrated to create a freely available, computable disease-lab test knowledge base. Disease and lab test concepts are identified using MetaMap, and relations between diseases and lab tests are determined based on source-specific rules. Experimental results demonstrate a high precision for relation extraction, with Wikipedia achieving the highest precision of 87%. Combining the three sources reached a recall of 51.40% when compared with a subset of disease-lab test relations extracted from a reference book. Moreover, we found additional disease-lab test relations from on-line resources, indicating they are complementary to existing reference books for building a comprehensive disease-lab test relation knowledge base.
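The merge-and-evaluate step described above can be sketched as set operations over (disease, lab test) pairs scored against a reference list; all the pairs below are invented placeholders, not relations from the actual knowledge base.

```python
# Toy sketch of merging per-source relation sets and scoring the union
# against a reference list. All disease/test pairs are invented.

def precision_recall(extracted, reference):
    """Precision and recall of an extracted relation set vs. a reference set."""
    tp = len(extracted & reference)  # true positives: relations in both sets
    precision = tp / len(extracted) if extracted else 0.0
    recall = tp / len(reference) if reference else 0.0
    return precision, recall

wiki = {("diabetes", "HbA1c"), ("anemia", "CBC")}
medline = {("diabetes", "fasting glucose"), ("anemia", "CBC")}
labtests = {("hypothyroidism", "TSH")}
combined = wiki | medline | labtests  # union across the three sources
reference = {("diabetes", "HbA1c"), ("anemia", "CBC"),
             ("hypothyroidism", "TSH"), ("gout", "uric acid")}
```

Relations found in a source but absent from the reference lower precision, while reference relations missed by every source lower recall, which is why combining complementary sources raises recall without necessarily preserving per-source precision.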

  17. Are Virtual Labs as Effective as Hands-on Labs for Undergraduate Physics? A Comparative Study at Two Major Universities

    ERIC Educational Resources Information Center

    Darrah, Marjorie; Humbert, Roxann; Finstein, Jeanne; Simon, Marllin; Hopkins, John

    2014-01-01

    Most physics professors would agree that the lab experiences students have in introductory physics are central to the learning of the concepts in the course. It is also true that these physics labs require time and money for upkeep, not to mention the hours spent setting up and taking down labs. Virtual physics lab experiences can provide an…

  18. Evaluation of Petrifilm Lactic Acid Bacteria Plates for Counting Lactic Acid Bacteria in Food.

    PubMed

    Kanagawa, Satomi; Ohshima, Chihiro; Takahashi, Hajime; Burenqiqige; Kikuchi, Misato; Sato, Fumina; Nakamura, Ayaka; Mohamed, Shimaa M; Kuda, Takashi; Kimura, Bon

    2018-06-01

    Although lactic acid bacteria (LAB) are used widely as starter cultures in the production of fermented foods, they are also responsible for food decay and deterioration. The undesirable growth of LAB in food causes spoilage, discoloration, and slime formation. Because of these adverse effects, food companies test for the presence of LAB in production areas and processed foods and consistently monitor the behavior of these bacteria. The 3M Petrifilm LAB Count Plates have recently been launched as time-saving, simple-to-use plates designed for detecting and quantifying LAB. This study compares the abilities of Petrifilm LAB Count Plates and the de Man Rogosa Sharpe (MRS) agar medium to determine the LAB count in a variety of foods and swab samples collected from a food production area. Bacterial strains isolated from Petrifilm LAB Count Plates were identified by 16S rDNA sequence analysis to confirm the specificity of these plates for LAB. The results showed no significant difference in bacterial counts measured by using Petrifilm LAB Count Plates and MRS medium. Furthermore, all colonies growing on Petrifilm LAB Count Plates were confirmed to be LAB, while yeast colonies also formed in MRS medium. Petrifilm LAB Count Plates eliminated the plate preparation and plate inoculation steps, and the cultures could be started as soon as a diluted food sample was available. Food companies are required to establish quality controls and perform tests to check the quality of food products; the use of Petrifilm LAB Count Plates can simplify this testing process for food companies.

  19. Theoretical analysis and simulation of the influence of self-bunching effects and longitudinal space charge effects on the propagation of keV electron bunch produced by a novel S-band Micro-Pulse electron Gun

    NASA Astrophysics Data System (ADS)

    Zhao, Jifei; Lu, Xiangyang; Zhou, Kui; Yang, Ziqin; Yang, Deyu; Luo, Xing; Tan, Weiwei; Yang, Yujia

    2016-06-01

    As an important electron source, the Micro-Pulse electron Gun (MPG), capable of steadily producing high-average-current, short-pulse, low-emittance electron bunches, holds promise as an electron source for Coherent Smith-Purcell Radiation (CSPR) and Free Electron Lasers (FEL). Stable output from S-band MPGs has been achieved in many labs. To establish a reliable foundation for its future application, the propagation of the picosecond electron bunches produced by the MPG should be studied in detail. In this article, an MPG working on the rising stage of the total effective Secondary Electron Yield (SEY) curve is introduced. The self-bunching mechanism is discussed in depth, both in the multipacting amplifying state and in the steady working state. The bunch-length broadening induced by longitudinal space-charge (SC) effects is investigated with different theoretical models in different regions. Simulations of the propagation with the 2D PIC code MAGIC and the beam-dynamics code TraceWin were also performed. The results show excellent agreement between simulation and theoretical analysis for the bunch-length evolution.

  20. Genome sequences of two closely related strains of Escherichia coli K-12 GM4792.

    PubMed

    Zhang, Yan-Cong; Zhang, Yan; Zhu, Bi-Ru; Zhang, Bo-Wen; Ni, Chuan; Zhang, Da-Yong; Huang, Ying; Pang, Erli; Lin, Kui

    2015-01-01

    Escherichia coli lab strains K-12 GM4792 Lac(+) and GM4792 Lac(-) carry opposite lactose markers, which are useful for distinguishing evolved lines as they produce different colored colonies. The two closely related strains are chosen as ancestors for our ongoing studies of experimental evolution. Here, we describe the genome sequences, annotation, and features of GM4792 Lac(+) and GM4792 Lac(-). GM4792 Lac(+) has a 4,622,342-bp long chromosome with 4,061 protein-coding genes and 83 RNA genes. Similarly, the genome of GM4792 Lac(-) consists of a 4,621,656-bp chromosome containing 4,043 protein-coding genes and 74 RNA genes. Genome comparison analysis reveals that the differences between GM4792 Lac(+) and GM4792 Lac(-) are minimal and limited to only the targeted lac region. Moreover, a previous study on competitive experimentation indicates the two strains are identical or nearly identical in survivability except for lactose utilization in a nitrogen-limited environment. Therefore, at both a genetic and a phenotypic level, GM4792 Lac(+) and GM4792 Lac(-), with opposite neutral markers, are ideal systems for future experimental evolution studies.
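A genome comparison that localizes differences between two near-identical sequences can be sketched as a scan for differing runs over an alignment. A toy Python example (the sequences are hypothetical, not the actual GM4792 genomes):

```python
def diff_regions(seq_a, seq_b):
    """Return (start, end) index pairs of contiguous runs where two
    equal-length aligned sequences differ (end is exclusive)."""
    assert len(seq_a) == len(seq_b)
    regions, start = [], None
    for i, (a, b) in enumerate(zip(seq_a, seq_b)):
        if a != b and start is None:
            start = i                      # a differing run begins
        elif a == b and start is not None:
            regions.append((start, i))     # the run just ended
            start = None
    if start is not None:
        regions.append((start, len(seq_a)))
    return regions

# Toy example: a single substituted block mimics a localized difference
# such as the targeted lac region.
print(diff_regions("AACGGTTA", "AACTCTTA"))  # → [(3, 5)]
```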

  1. CDAC Student Report: Summary of LLNL Internship

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Herriman, Jane E.

    Multiple objectives motivated me to apply for an internship at LLNL: I wanted to experience the work environment at a national lab, to learn about research and job opportunities at LLNL in particular, and to gain greater experience with code development, particularly within the realm of high performance computing (HPC). This summer I was selected to participate in LLNL's Computational Chemistry and Material Science Summer Institute (CCMS). CCMS is a 10-week program hosted by the Quantum Simulations group leader, Dr. Eric Schwegler. CCMS connects graduate students to mentors at LLNL involved in similar research and provides weekly seminars on a broad array of topics from within chemistry and materials science. Dr. Xavier Andrade and Dr. Erik Draeger served as my co-mentors over the summer, and Dr. Andrade continues to mentor me now that CCMS has concluded. Dr. Andrade is a member of the Quantum Simulations group within the Physical and Life Sciences directorate at LLNL, and Dr. Draeger leads the HPC group within the Center for Applied Scientific Computing (CASC). The two have worked together to develop Qb@ll, an open-source first-principles molecular dynamics code that was the platform for my summer research project.

  2. Reconstruction of Axial Energy Deposition in Magnetic Liner Inertial Fusion Based on PECOS Shadowgraph Unfolds Using the AMR Code FLASH

    NASA Astrophysics Data System (ADS)

    Adams, Marissa; Jennings, Christopher; Slutz, Stephen; Peterson, Kyle; Gourdain, Pierre; U. Rochester-Sandia Collaboration

    2017-10-01

    Magnetic Liner Inertial Fusion (MagLIF) experiments incorporate a laser to preheat a deuterium-filled capsule before compression by a magnetically imploding liner. In this work, we focus on the blast wave formed in the fuel during the laser-preheat stage of MagLIF, in which approximately 1 kJ of energy is deposited axially into the capsule over 3 ns before implosion. To model blast waves directly relevant to experiments such as MagLIF, we inferred the deposited energy from shadowgraphy of laser-only experiments performed at the PECOS target chamber using the Z-Beamlet laser. These energy profiles were used to initialize 2-dimensional simulations with the adaptive mesh refinement code FLASH. Gradients or asymmetries in the energy deposition may seed instabilities that alter the fuel's distribution, or promote mix, as the blast wave interacts with the liner wall. The AMR capabilities of FLASH allow us to study the development and dynamics of these instabilities within the fuel and their effect on the liner before implosion. Sandia National Laboratories is managed by National Technology and Engineering Solutions of Sandia, LLC, a subsidiary of Honeywell International, Inc., for the U.S. DOE's NNSA under contract DE-NA0003525.

  3. Equilibrium and Stability Properties of Low Aspect Ratio Mirror Systems: from Neutron Source Design to the Parker Spiral

    NASA Astrophysics Data System (ADS)

    Peterson, Ethan; Anderson, Jay; Clark, Mike; Egedal, Jan; Endrizzi, Douglass; Flanagan, Ken; Harvey, Robert; Lynn, Jacob; Milhone, Jason; Wallace, John; Waleffe, Roger; Mirnov, Vladimir; Forest, Cary

    2017-10-01

    Equilibrium reconstructions of rotating magnetospheres in the lab are computed using a user-friendly extended Grad-Shafranov solver written in Python together with various magnetic and kinetic measurements. The stability of these equilibria is investigated using the NIMROD code with two goals: understanding the onset of the classic ``wobble'' in the heliospheric current sheet and demonstrating proof-of-principle for a laboratory source of high-β turbulence. Using the same extended Grad-Shafranov solver, equilibria for an axisymmetric, non-paraxial magnetic mirror are used as a design foundation for a high-field magnetic-mirror neutron source. These equilibria are numerically shown to be stable to the m=1 flute instability, with higher modes likely stabilized by FLR effects; this provides stability to gross MHD modes in an axisymmetric configuration. Numerical results for RF heating and neutral beam injection (NBI) from the GENRAY/CQL3D code suite show neutron fluxes promising for medical radioisotope production as well as materials testing. Synergistic effects between NBI and high-harmonic fast-wave heating show large increases in neutron yield for a modest increase in RF power. Work funded by DOE, NSF, and NASA.

  4. Towards seamless workflows in agile data science

    NASA Astrophysics Data System (ADS)

    Klump, J. F.; Robertson, J.

    2017-12-01

    Agile workflows are a response to projects with requirements that may change over time. They prioritise rapid and flexible responses to change, preferring to adapt to changes in requirements rather than predict them before a project starts. This suits the needs of research very well because research is inherently agile in its methodology. The adoption of agile methods has made collaborative data analysis much easier in a research environment fragmented across institutional data stores, HPC, personal and lab computers and, more recently, cloud environments. Agile workflows use tools that share a common worldview: in an agile environment, there may be more than one valid version of data, code or environment in play at any given time. All of these versions need references and identifiers. For example, a team of developers following the git-flow conventions (github.com/nvie/gitflow) may have several active branches, one for each strand of development. These workflows allow rapid and parallel iteration while maintaining identifiers pointing to individual snapshots of data and code and allowing rapid switching between strands. In contrast, the current focus of versioning in research data management is geared towards managing data for reproducibility and long-term preservation of the record of science. While both are important goals in the persistent curation domain of the institutional research data infrastructure, current tools emphasise planning over adaptation and can introduce unwanted rigidity by insisting on a single valid version or point of truth. In the collaborative curation domain of a research project, things are more fluid. However, there is no equivalent to the "versioning iso-surface" of the git protocol for the management and versioning of research data.
At CSIRO we are developing concepts and tools for the agile management of software code and research data for virtual research environments, based on our experiences of actual data analytics projects in the geosciences. We use code management that allows researchers to interact with the code through tools like Jupyter Notebooks while data are held in an object store. Our aim is an architecture allowing seamless integration of code development, data management, and data processing in virtual research environments.
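The snapshot identifiers discussed above can be made content-addressed, as git does for its objects. A minimal Python sketch of git's blob-ID construction (SHA-1 over a `blob <size>\0` header plus the raw bytes), offered only to illustrate how stable identifiers can point at immutable snapshots of data:

```python
import hashlib

def blob_id(content: bytes) -> str:
    """Content-addressed identifier for a data snapshot, using the same
    construction git uses for blob objects: SHA-1 over a small header
    plus the raw bytes. Identical content always yields the same ID."""
    header = b"blob %d\x00" % len(content)
    return hashlib.sha1(header + content).hexdigest()

# Matches `git hash-object` for the same bytes.
print(blob_id(b"hello\n"))  # → ce013625030ba8dba906f756967f9e9ca394464a
```

Because the identifier is a pure function of the content, any number of parallel strands can exchange and compare snapshots without coordinating a central version counter.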

  5. MEG and EEG data analysis with MNE-Python.

    PubMed

    Gramfort, Alexandre; Luessi, Martin; Larson, Eric; Engemann, Denis A; Strohmeier, Daniel; Brodbeck, Christian; Goj, Roman; Jas, Mainak; Brooks, Teon; Parkkonen, Lauri; Hämäläinen, Matti

    2013-12-26

    Magnetoencephalography and electroencephalography (M/EEG) measure the weak electromagnetic signals generated by neuronal activity in the brain. Using these signals to characterize and locate neural activation in the brain is a challenge that requires expertise in physics, signal processing, statistics, and numerical methods. As part of the MNE software suite, MNE-Python is an open-source software package that addresses this challenge by providing state-of-the-art algorithms implemented in Python that cover multiple methods of data preprocessing, source localization, statistical analysis, and estimation of functional connectivity between distributed brain regions. All algorithms and utility functions are implemented in a consistent manner with well-documented interfaces, enabling users to create M/EEG data analysis pipelines by writing Python scripts. Moreover, MNE-Python is tightly integrated with the core Python libraries for scientific computation (NumPy, SciPy) and visualization (matplotlib and Mayavi), as well as the greater neuroimaging ecosystem in Python via the Nibabel package. The code is provided under the new BSD license allowing code reuse, even in commercial products. Although MNE-Python has only been under heavy development for a couple of years, it has rapidly evolved with expanded analysis capabilities and pedagogical tutorials because multiple labs have collaborated during code development to help share best practices. MNE-Python also gives easy access to preprocessed datasets, helping users to get started quickly and facilitating reproducibility of methods by other researchers. Full documentation, including dozens of examples, is available at http://martinos.org/mne.
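The kind of pipeline step such software automates can be illustrated, independently of MNE-Python's actual API, by the basic epoching-and-averaging operation behind evoked responses. A minimal pure-Python sketch with a toy one-channel signal (this is not MNE-Python code; see the package documentation for its real interfaces):

```python
def epoch_and_average(signal, events, pre, post):
    """Cut fixed-length windows around event samples and average them.

    signal : list of samples from one channel
    events : sample indices of stimulus onsets
    pre/post : samples to keep before/after each event
    Returns the averaged (evoked-like) window.
    """
    epochs = [signal[e - pre : e + post] for e in events
              if e - pre >= 0 and e + post <= len(signal)]
    n = len(epochs)
    # Average sample-by-sample across epochs.
    return [sum(col) / n for col in zip(*epochs)]

# Toy signal: an evoked bump at each event riding on a zero baseline.
signal = [0, 0, 1, 0, 0, 0, 0, 1, 0, 0]
evoked = epoch_and_average(signal, events=[2, 7], pre=1, post=2)
print(evoked)  # → [0.0, 1.0, 0.0]
```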

  6. Computational strategies for three-dimensional flow simulations on distributed computer systems

    NASA Technical Reports Server (NTRS)

    Sankar, Lakshmi N.; Weed, Richard A.

    1995-01-01

    This research effort is directed towards an examination of the issues involved in porting large computational fluid dynamics codes in use within the industry to a distributed computing environment. It addresses strategies for implementing distributed computing in a device-independent fashion and for load balancing. A flow solver called TEAM, presently in use at Lockheed Aeronautical Systems Company, was acquired to start this effort. The following tasks were completed: (1) The TEAM code was ported to a number of distributed computing platforms, including a cluster of HP workstations located in the School of Aerospace Engineering at Georgia Tech; a cluster of DEC Alpha workstations in the graphics visualization lab located at Georgia Tech; a cluster of SGI workstations located at NASA Ames Research Center; and an IBM SP-2 system located at NASA ARC. (2) A number of communication strategies were implemented; specifically, the manager-worker strategy and the worker-worker strategy were tested. (3) A variety of load balancing strategies were investigated; specifically, static load balancing, task-queue balancing and the Crutchfield algorithm were coded and evaluated. (4) The classical explicit Runge-Kutta scheme in the TEAM solver was replaced with an LU implicit scheme. (5) The implicit TEAM-PVM solver was extensively validated through studies of unsteady transonic flow over an F-5 wing undergoing combined bending and torsional motion. These investigations are documented in extensive detail in the dissertation, 'Computational Strategies for Three-Dimensional Flow Simulations on Distributed Computing Systems', enclosed as an appendix.
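The task-queue balancing strategy evaluated in (3) can be sketched with a shared queue feeding a pool of workers, so that idle workers naturally pull the next block of work. A minimal Python illustration (threads stand in for the PVM processes of the original work; the task function is hypothetical):

```python
import queue
import threading

def manager_worker(tasks, work_fn, n_workers=4):
    """Task-queue load balancing: a shared queue feeds idle workers,
    so faster workers naturally pick up more tasks."""
    q = queue.Queue()
    for t in tasks:
        q.put(t)
    results = []
    lock = threading.Lock()

    def worker():
        while True:
            try:
                t = q.get_nowait()   # grab the next pending task
            except queue.Empty:
                return               # no work left; worker retires
            r = work_fn(t)
            with lock:
                results.append(r)

    threads = [threading.Thread(target=worker) for _ in range(n_workers)]
    for th in threads:
        th.start()
    for th in threads:
        th.join()
    return results

# Each "task" here stands in for one block of the flow domain.
print(sorted(manager_worker(range(10), lambda i: i * i)))
```

Completion order is nondeterministic, which is exactly why the results carry their own task identity in a real solver; sorting here only makes the demo output stable.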

  7. MEG and EEG data analysis with MNE-Python

    PubMed Central

    Gramfort, Alexandre; Luessi, Martin; Larson, Eric; Engemann, Denis A.; Strohmeier, Daniel; Brodbeck, Christian; Goj, Roman; Jas, Mainak; Brooks, Teon; Parkkonen, Lauri; Hämäläinen, Matti

    2013-01-01

    Magnetoencephalography and electroencephalography (M/EEG) measure the weak electromagnetic signals generated by neuronal activity in the brain. Using these signals to characterize and locate neural activation in the brain is a challenge that requires expertise in physics, signal processing, statistics, and numerical methods. As part of the MNE software suite, MNE-Python is an open-source software package that addresses this challenge by providing state-of-the-art algorithms implemented in Python that cover multiple methods of data preprocessing, source localization, statistical analysis, and estimation of functional connectivity between distributed brain regions. All algorithms and utility functions are implemented in a consistent manner with well-documented interfaces, enabling users to create M/EEG data analysis pipelines by writing Python scripts. Moreover, MNE-Python is tightly integrated with the core Python libraries for scientific computation (NumPy, SciPy) and visualization (matplotlib and Mayavi), as well as the greater neuroimaging ecosystem in Python via the Nibabel package. The code is provided under the new BSD license allowing code reuse, even in commercial products. Although MNE-Python has only been under heavy development for a couple of years, it has rapidly evolved with expanded analysis capabilities and pedagogical tutorials because multiple labs have collaborated during code development to help share best practices. MNE-Python also gives easy access to preprocessed datasets, helping users to get started quickly and facilitating reproducibility of methods by other researchers. Full documentation, including dozens of examples, is available at http://martinos.org/mne. PMID:24431986

  8. Computational strategies for three-dimensional flow simulations on distributed computer systems

    NASA Astrophysics Data System (ADS)

    Sankar, Lakshmi N.; Weed, Richard A.

    1995-08-01

    This research effort is directed towards an examination of the issues involved in porting large computational fluid dynamics codes in use within the industry to a distributed computing environment. It addresses strategies for implementing distributed computing in a device-independent fashion and for load balancing. A flow solver called TEAM, presently in use at Lockheed Aeronautical Systems Company, was acquired to start this effort. The following tasks were completed: (1) The TEAM code was ported to a number of distributed computing platforms, including a cluster of HP workstations located in the School of Aerospace Engineering at Georgia Tech; a cluster of DEC Alpha workstations in the graphics visualization lab located at Georgia Tech; a cluster of SGI workstations located at NASA Ames Research Center; and an IBM SP-2 system located at NASA ARC. (2) A number of communication strategies were implemented; specifically, the manager-worker strategy and the worker-worker strategy were tested. (3) A variety of load balancing strategies were investigated; specifically, static load balancing, task-queue balancing and the Crutchfield algorithm were coded and evaluated. (4) The classical explicit Runge-Kutta scheme in the TEAM solver was replaced with an LU implicit scheme. (5) The implicit TEAM-PVM solver was extensively validated through studies of unsteady transonic flow over an F-5 wing undergoing combined bending and torsional motion. These investigations are documented in extensive detail in the dissertation, 'Computational Strategies for Three-Dimensional Flow Simulations on Distributed Computing Systems', enclosed as an appendix.

  9. Hydrogen analysis depth calibration by CORTEO Monte-Carlo simulation

    NASA Astrophysics Data System (ADS)

    Moser, M.; Reichart, P.; Bergmaier, A.; Greubel, C.; Schiettekatte, F.; Dollinger, G.

    2016-03-01

    Hydrogen imaging with sub-μm lateral resolution and sub-ppm sensitivity has become possible with coincident proton-proton (pp) scattering analysis (Reichart et al., 2004). Depth information is evaluated from the energy-sum signal with respect to the energy loss of both protons on their paths through the sample. To first order, there is no angular dependence due to elastic scattering. To second order, a path-length effect due to different energy losses on the paths of the two protons causes an angular dependence of the energy sum. Therefore, the energy-sum signal has to be deconvolved, depending on the matrix composition, i.e. mainly the atomic number Z, in order to get a depth-calibrated hydrogen profile. Although the path effect can be calculated analytically to first order, multiple scattering effects lead to significant deviations in the depth profile. Hence, in our new approach, we use the CORTEO Monte-Carlo code (Schiettekatte, 2008) to calculate the depth of a coincidence event depending on the scattering angle. The code takes the individual detector geometry into account. In this paper we show that the code correctly reproduces measured pp-scattering energy spectra with roughness effects considered. With more than 100 μm thick Mylar-sandwich targets (Si, Fe, Ge) we demonstrate the deconvolution of the energy spectra on our current multistrip detector at the microprobe SNAKE at the Munich tandem accelerator lab. As a result, hydrogen profiles can be evaluated with an accuracy in depth of about 1% of the sample thickness.
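The first-order path-length picture described above can be written down directly if one assumes constant stopping powers and ignores the opening angle of the scattered pair; both assumptions are illustrative and are exactly what the Monte-Carlo deconvolution relaxes. A minimal Python sketch with hypothetical numbers:

```python
def depth_from_energy_sum(delta_e, s_in, s_out, thickness):
    """First-order depth estimate for transmission pp scattering.

    The incoming proton loses s_in per unit depth down to the
    scattering depth d; both outgoing protons then lose s_out per unit
    length over the remaining (thickness - d), so the measured
    energy-sum deficit is
        delta_e = s_in * d + 2 * s_out * (thickness - d).
    Solving for d gives the depth. Constant stopping powers and
    straight-through exit paths are illustrative simplifications.
    """
    return (delta_e - 2 * s_out * thickness) / (s_in - 2 * s_out)

# Hypothetical stopping powers (keV/um) and a 100 um sample.
d = depth_from_energy_sum(delta_e=112.0, s_in=1.0, s_out=0.6, thickness=100.0)
print(round(d, 6))  # → 40.0
```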

  10. e-Learning - Physics Labs

    NASA Astrophysics Data System (ADS)

    Mohottala, Hashini

    2014-03-01

    The general student population enrolled in any college-level class is highly diverse. An increasing number of ``nontraditional'' students return to college, and most of these students follow distance-learning degree programs while juggling their other commitments, work and family. However, those students tend to avoid taking science courses with labs, largely because the lab components of such courses cannot be completed remotely. To address this issue, we have developed a method by which introductory-level physics labs can be taught remotely. In this process, a lab kit with the critical, easily accessible lab components is conveniently packed into a box and distributed among students at the beginning of the semester. Once the students are given the apparatus, they perform the experiments at home and gather data. All communications with reference to the lab were done through an interactive, user-friendly webpage, Wikispaces (WikiS). Students who create pages on WikiS can submit their lab write-ups, embed videos of the experiments they perform, post pictures, and direct questions to the lab instructor. Students enrolled in the same lab can interact with each other through WikiS to discuss labs and even get assistance.

  11. Lactobacillus backii and Pediococcus damnosus isolated from 170-year-old beer recovered from a shipwreck lack the metabolic activities required to grow in modern lager beer.

    PubMed

    Kajala, Ilkka; Bergsveinson, Jordyn; Friesen, Vanessa; Redekop, Anna; Juvonen, Riikka; Storgårds, Erna; Ziola, Barry

    2018-01-01

    In 2010, bottles of beer containing viable bacteria of the common beer-spoilage species Lactobacillus backii and Pediococcus damnosus were recovered from a shipwreck near the Åland Islands, Finland. The 170-year quiescent state maintained by the shipwreck bacteria presented a unique opportunity to study lactic acid bacteria (LAB) evolution vis-a-vis growth and survival in the beer environment. Three shipwreck bacteria (one L. backii strain and two P. damnosus strains) and modern-day beer-spoilage isolates of the same two species were genome sequenced, characterized for hop iso-α-acid tolerance, and growth in degassed lager and wheat beer. In addition, plasmid variants of the modern-day P. damnosus strain were analyzed for the effect of plasmid-encoded genes on growth in lager beer. Coding content on two plasmids was identified as essential for LAB growth in modern lager beer. Three chromosomal regions containing genes related to sugar transport and cell wall polysaccharides were shared by pediococci able to grow in beer. Our results show that the three shipwreck bacteria lack the necessary plasmid-located genetic content to grow in modern lager beer, but carry additional genes related to acid tolerance and biofilm formation compared to their modern counterparts. © FEMS 2017. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  12. Microarray Data Analysis of Space Grown Arabidopsis Leaves for Genes Important in Vascular Patterning. Analysis of Space Grown Arabidopsis with Microarray Data from GeneLab: Identification of Genes Important in Vascular Patterning

    NASA Technical Reports Server (NTRS)

    Weitzel, A. J.; Wyatt, S. E.; Parsons-Wingerter, P.

    2016-01-01

    Venation patterning in leaves is a major determinant of photosynthesis efficiency because of its dependency on vascular transport of photo-assimilates, water, and minerals. Arabidopsis thaliana grown in microgravity show delayed growth and leaf maturation. Gene expression data from the roots, hypocotyl, and leaves of A. thaliana grown during spaceflight vs. ground control analyzed by Affymetrix microarray are available through NASA's GeneLab (GLDS-7). We analyzed the data for differential expression of genes in leaves resulting from the effects of spaceflight on vascular patterning. Two genes were found by preliminary analysis to be up-regulated during spaceflight that may be related to vascular formation. The genes are responsible for coding an ARGOS (Auxin-Regulated Gene Involved in Organ Size)-like protein (potentially affecting cell elongation in the leaves), and an F-box/kelch-repeat protein (possibly contributing to protoxylem specification). Further analysis that will focus on raw data quality assessment and a moderated t-test may further confirm up-regulation of the two genes and/or identify other gene candidates. Plants defective in these genes will then be assessed for phenotype by the mapping and quantification of leaf vascular patterning by NASA's VESsel GENeration (VESGEN) software to model specific vascular differences of plants grown in spaceflight.

  13. Early childhood cortisol reactivity moderates the effects of parent-child relationship quality on the development of children’s temperament in early childhood

    PubMed Central

    Kopala-Sibley, Daniel C.; Dougherty, Lea R.; Dyson, Margret W.; Laptook, Rebecca S.; Olino, Thomas M.; Bufferd, Sara J.; Klein, Daniel N.

    2017-01-01

    Positive parenting has been related both to lower cortisol reactivity and more adaptive temperament traits in children, whereas elevated cortisol reactivity may be related to maladaptive temperament traits, such as higher negative emotionality (NE) and lower positive emotionality (PE). However, no studies have examined whether hypothalamic-pituitary-adrenal axis activity, as measured by cortisol reactivity, moderates the effect of the quality of the parent-child relationship on changes in temperament in early childhood. In this study, 126 3-year-olds were administered the Laboratory Temperament Assessment Battery (Lab-TAB; Goldsmith et al., 1995) as a measure of temperamental NE and PE. Salivary cortisol was collected from the child at 4 time points during this task. The primary parent and the child completed the Teaching Tasks battery (Egeland et al., 1995), from which the quality of the relationship was coded. At age 6, children completed the Lab-TAB again. From age 3 to 6, adjusting for age 3 PE or NE, a better quality relationship with their primary parent predicted decreases in NE for children with elevated cortisol reactivity and predicted increases in PE for children with low cortisol reactivity. Results have implications for our understanding of the interaction of biological stress systems and the parent-child relationship in the development of temperament in childhood. PMID:26689860

  14. Early childhood cortisol reactivity moderates the effects of parent-child relationship quality on the development of children's temperament in early childhood.

    PubMed

    Kopala-Sibley, Daniel C; Dougherty, Lea R; Dyson, Margret W; Laptook, Rebecca S; Olino, Thomas M; Bufferd, Sara J; Klein, Daniel N

    2017-05-01

    Positive parenting has been related both to lower cortisol reactivity and more adaptive temperament traits in children, whereas elevated cortisol reactivity may be related to maladaptive temperament traits, such as higher negative emotionality (NE) and lower positive emotionality (PE). However, no studies have examined whether hypothalamic-pituitary-adrenal axis activity, as measured by cortisol reactivity, moderates the effect of the quality of the parent-child relationship on changes in temperament in early childhood. In this study, 126 3-year-olds were administered the Laboratory Temperament Assessment Battery (Lab-TAB; Goldsmith et al., 1995) as a measure of temperamental NE and PE. Salivary cortisol was collected from the child at 4 time points during this task. The primary parent and the child completed the Teaching Tasks battery (Egeland et al., 1995), from which the quality of the relationship was coded. At age 6, children completed the Lab-TAB again. From age 3 to 6, adjusting for age 3 PE or NE, a better quality relationship with their primary parent predicted decreases in NE for children with elevated cortisol reactivity and predicted increases in PE for children with low cortisol reactivity. Results have implications for our understanding of the interaction of biological stress systems and the parent-child relationship in the development of temperament in childhood. © 2015 John Wiley & Sons Ltd.
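The moderation effect reported here corresponds statistically to an interaction term in a regression model: the coefficient on the product of the two predictors measures how the effect of one depends on the level of the other. A minimal Python sketch with simulated, noise-free scores (not the study's data):

```python
import numpy as np

# Simulated scores: parenting quality (x1), cortisol reactivity (x2),
# and a temperament outcome generated with a known interaction effect.
rng = np.random.default_rng(0)
x1 = rng.normal(size=200)
x2 = rng.normal(size=200)
y = 1.0 + 2.0 * x1 + 3.0 * x2 + 4.0 * x1 * x2  # noise-free for clarity

# Design matrix with an interaction column; its coefficient captures
# how the effect of x1 on y varies with the level of x2 (moderation).
X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.round(coef, 6))  # recovers the coefficients [1, 2, 3, 4]
```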

  15. PWL 1.0 Personal WaveLab: an object-oriented workbench for seismogram analysis on Windows systems

    NASA Astrophysics Data System (ADS)

    Bono, Andrea; Badiali, Lucio

    2005-02-01

    Personal WaveLab 1.0 is intended as the starting point for the ex novo development of seismic time-series analysis procedures for Windows-based personal computers. Our objective is two-fold. First, as a stand-alone application, it allows the user to perform basic analysis of digital or digitised seismic waveforms. Second, thanks to its architectural characteristics, it can serve as the basis for the development of more complex and fully featured applications. An expanded version of PWL, called SisPick!, is currently in use at the Istituto Nazionale di Geofisica e Vulcanologia (Italian Institute of Geophysics and Volcanology) for real-time monitoring with Civil Protection purposes. This means that about 90 users tested the application for more than 1 year, making its features more robust and efficient. SisPick! was also employed in the United Nations Nyragongo Project, in Congo, and during the Stromboli emergency in the summer of 2002. The main appeals of the application package are: ease of use, object-oriented design, good computational speed, minimal disk-space requirements and the complete absence of third-party components (including ActiveX). The Windows environment spares the user scripting or complex interaction with the system. The system is under constant development in response to the needs and suggestions of its users. The Microsoft Visual Basic 6 source code, installation package, test data sets and documentation are available at no cost.

  16. WetLab-2: Providing Quantitative PCR Capabilities on ISS

    NASA Technical Reports Server (NTRS)

    Parra, Macarena; Jung, Jimmy Kar Chuen; Almeida, Eduardo; Boone, Travis David; Schonfeld, Julie; Tran, Luan Hoang

    2015-01-01

    The objective of NASA Ames Research Center's WetLab-2 Project is to place on the ISS a system capable of conducting gene expression analysis via quantitative real-time PCR (qRT-PCR) of biological specimens sampled or cultured on orbit. The WetLab-2 system is capable of processing sample types ranging from microbial cultures to animal tissues dissected on-orbit. The project has developed an RNA preparation module that can lyse cells and extract RNA of sufficient quality and quantity for use as templates in qRT-PCR reactions. Our protocol has the advantage that it uses no toxic chemicals, alcohols or other organics. The resulting RNA is transferred into a pipette and then dispensed into reaction tubes that contain all the lyophilized reagents needed to perform qRT-PCR reactions. These reaction tubes are mounted on rotors, and the liquid is centrifuged to the reaction window of the tube using a cordless drill. System operations require simple and limited crew actions, including syringe pushes, valve turns and pipette dispenses. The resulting process takes less than 30 min to have tubes ready for loading into the qRT-PCR unit. The project has selected a Commercial-Off-The-Shelf (COTS) qRT-PCR unit, the Cepheid SmartCycler, that will fly in its COTS configuration. The SmartCycler has a number of advantages, including modular design (16 independent PCR modules), low power consumption, rapid thermal ramp times and four-color detection. The ability to detect up to four fluorescent channels will enable multiplex assays that can be used to normalize for RNA concentration and integrity, and to study multiple genes of interest in each module. The WetLab-2 system will have the capability to downlink data from the ISS to the ground after a completed run and to uplink new programs. The ability to conduct qRT-PCR on-orbit eliminates the confounding effects on gene expression of reentry stresses and shock acting on live cells and organisms, as well as the concern of RNA degradation in fixed samples. 
The system can be used to validate terrestrial analyses of samples returned from the ISS by providing on-orbit gene expression benchmarking prior to sample return. The ability to get on-orbit data will give investigators the opportunity to adjust experimental parameters in real time for subsequent trials, without the need for sample return and re-flight to sample multigenerational changes. The system can also be used for analysis of air, surface, water, and clinical samples to monitor environmental contaminants and crew health. The verification flight of the instrument is scheduled to launch on SpaceX-7 in June 2015. The WetLab-2 Project is supported by NASA's ISS Program at JSC, Code OZ.
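For relative quantification, qRT-PCR data are commonly reduced with the 2^(-ΔΔCt) method, normalizing the gene of interest to a reference gene in each condition and then comparing conditions; whether WetLab-2 analyses use exactly this reduction is not stated here. A minimal Python sketch with hypothetical cycle-threshold values:

```python
def fold_change_ddct(ct_target_test, ct_ref_test, ct_target_ctrl, ct_ref_ctrl):
    """Relative gene expression by the standard 2^-ddCt method:
    normalize the gene of interest to a reference gene within each
    condition, then compare the two conditions."""
    d_ct_test = ct_target_test - ct_ref_test    # normalize test condition
    d_ct_ctrl = ct_target_ctrl - ct_ref_ctrl    # normalize control
    return 2.0 ** -(d_ct_test - d_ct_ctrl)

# Hypothetical cycle-threshold values from a multiplexed run.
print(fold_change_ddct(24.0, 20.0, 26.0, 20.0))  # → 4.0 (4-fold up)
```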

  17. Berkeley Lab Training

    Science.gov Websites

    Welcome to Berkeley Lab Training! Login to access your LBNL Training Profile. This provides quick access to all of the courses you need. Look below to learn about different types of training available at

  18. Berkeley Lab Scientist Named MacArthur "Genius" Fellow for Audio

    Science.gov Websites

    Preservation Research | Berkeley Lab ... to digitally recover a 128-year-old recording of Alexander Graham Bell's voice, enabling people to

  19. Designing an Innovative Data Architecture for the Los Angeles Data Resource (LADR).

    PubMed

    Mukherjee, Sukrit; Jenders, Robert A; Delta, Sebastien

    2015-01-01

    The Los Angeles Data Resource (LADR) is a joint project of major Los Angeles health care provider organizations. LADR helps clinical investigators explore the size of potential research study cohorts using operational clinical data across all participating institutions. The Charles R. Drew University of Medicine and Science (CDU) LADR team sought to develop an innovative data architecture that would aggregate de-identified clinical data from safety-net providers in the community into the CDU LADR node. This node, in turn, would be federated with the other LADR nodes for a shared view in a way that was never available before. The result is a self-service system for assessing how many patients match study criteria at each medical center and for searching patients by demographics, ICD-9 codes, lab results and medications.
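A federated cohort query of the kind described, in which only an aggregate count leaves each node, can be sketched as follows; the record schema, diagnosis code, and lab threshold are hypothetical and are not LADR's actual design:

```python
# Hypothetical de-identified records held at two LADR-style nodes.
NODE_A = [
    {"age": 54, "icd9": ["250.00"], "labs": {"glucose": 142}},
    {"age": 61, "icd9": ["401.9"], "labs": {"glucose": 98}},
]
NODE_B = [
    {"age": 47, "icd9": ["250.00", "401.9"], "labs": {"glucose": 133}},
]

def cohort_count(records, icd9_code, lab, threshold):
    """Count patients matching an ICD-9 code and a lab criterion;
    only the aggregate count ever leaves the node."""
    return sum(1 for r in records
               if icd9_code in r["icd9"] and r["labs"].get(lab, 0) > threshold)

# Federated query: each node returns a count, the portal sums them.
total = sum(cohort_count(n, "250.00", "glucose", 126) for n in (NODE_A, NODE_B))
print(total)  # → 2
```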

  20. X-ray simulation algorithms used in ISP

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sullivan, John P.

    ISP is a simulation code which is sometimes used in the USNDS program. ISP is maintained by Sandia National Lab. However, the X-ray simulation algorithm used by ISP was written by scientists at LANL, mainly by Ed Fenimore with some contributions from John Sullivan, George Neuschaefer, and probably others. In email to John Sullivan on July 25, 2016, Jill Rivera, ISP project lead, said "ISP uses the function xdosemeters_sim from the xgen library." This is a Fortran subroutine which is also used to simulate the X-ray response in consim (a descendant of xgen). Therefore, no separate documentation of the X-ray simulation algorithms in ISP has been written; the documentation for the consim simulation can be used.

  1. PyPWA: A partial-wave/amplitude analysis software framework

    NASA Astrophysics Data System (ADS)

    Salgado, Carlos

    2016-05-01

    The PyPWA project aims to develop a software framework for partial-wave and amplitude analysis of data, providing the user with software tools to identify resonances from multi-particle final states in photoproduction. Most of the code is written in Python. The software is divided into two main branches: a general shell in which amplitude parameters (or any parametric model) are estimated from the data. This branch also includes software to produce simulated data sets using the fitted amplitudes. The second branch contains a specific realization of the isobar model (with room to include Deck-type and other isobar-model extensions) to perform PWA with an interface into the computing resources at Jefferson Lab. We are currently implementing parallelism and vectorization using Intel's Xeon Phi family of coprocessors.
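As a rough illustration of the kind of parameter estimation described above (and not PyPWA's actual interface), a toy two-wave intensity model can be fitted by a chi-square grid search. The model, parameter values, and grid are all invented for this sketch.

```python
import numpy as np

# Toy partial-wave fit: the intensity is the squared modulus of a coherent
# sum of two waves, I(theta) = |a + b*exp(i*delta)*cos(theta)|^2, and we
# estimate (b, delta) from the observed distribution by a chi-square grid
# search. Illustrative only -- not PyPWA's actual API or method of choice.

theta = np.linspace(0.0, np.pi, 200)

def intensity(b, delta, a=1.0):
    amp = a + b * np.exp(1j * delta) * np.cos(theta)
    return np.abs(amp) ** 2

data = intensity(b=0.8, delta=0.5)          # "observed" distribution

b_grid = np.arange(0.5, 1.1, 0.01)
d_grid = np.arange(0.0, 1.0, 0.01)
chi2 = np.array([[np.sum((intensity(b, d) - data) ** 2) for d in d_grid]
                 for b in b_grid])
ib, id_ = np.unravel_index(np.argmin(chi2), chi2.shape)
print(round(b_grid[ib], 2), round(d_grid[id_], 2))  # → 0.8 0.5
```

Real PWA fits replace the grid search with a maximum-likelihood optimizer over event-level data, which is where the parallelism and vectorization mentioned above pay off.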

  2. Sustains--direct access for the patient to the medical record over the Internet.

    PubMed

    Eklund, Benny; Joustra-Enquist, Ingrid

    2004-01-01

    The basic idea of Sustains III is to emulate Internet banking for health care: instead of an "Internet bank account," the user has a "health care account." The user logs in using a one-time password, which is sent to the user's mobile phone as an SMS three seconds after the PIN code is entered. Thus, personal information can be transferred both ways securely and with acceptable privacy. The user can then explore the medical record in detail and get a full and complete list of prescriptions, lab results, etc. It is also an easy way to exchange written information between the doctor and the patient. So far, Sustains has shown that patients are very satisfied, and the system is also beneficial for the physicians.
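The login flow described above (PIN first, then an SMS-delivered one-time password) can be sketched roughly as follows. Sustains' actual implementation is not published, so the function names, the 6-digit format, and the in-memory store are assumptions.

```python
import hmac
import secrets

# Sketch of an SMS one-time-password step, issued after the user's PIN has
# been verified. Illustrative only: Sustains' real OTP scheme is not
# described in the abstract; `issue_otp`, the 6-digit format, and the
# in-memory store are assumptions. A production system would persist OTPs
# with an expiry and deliver them via an SMS gateway.

_pending = {}  # user id -> currently valid OTP

def issue_otp(user_id: str) -> str:
    """Create a 6-digit OTP and record it for later verification."""
    otp = f"{secrets.randbelow(10**6):06d}"
    _pending[user_id] = otp
    return otp  # in the real system this string would be sent by SMS

def verify_otp(user_id: str, submitted: str) -> bool:
    """Check the submitted OTP in constant time and invalidate it."""
    expected = _pending.pop(user_id, None)
    return expected is not None and hmac.compare_digest(expected, submitted)
```

Popping the stored value on verification is what makes the password one-time: a second submission of the same code fails.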

  3. Investigations of the Rayleigh-Taylor and Richtmyer-Meshkov Instabilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Riccardo Bonazza; Mark Anderson; Jason Oakley

    2008-03-14

    The present program is centered on the experimental study of shock-induced interfacial fluid instabilities. Both 2-D (near-sinusoids) and 3-D (spheres) initial conditions are studied in a large, vertical square shock tube facility. The evolution of the interface shape, its distortion, the modal growth rates and the mixing of the fluids at the interface are all objectives of the investigation. In parallel to the experiments, calculations are performed using the Raptor code, on platforms made available by LLNL. These flows are of great relevance to both ICF and stockpile stewardship. The involvement of four graduate students is in line with the national laboratories' interest in the education of scientists and engineers in disciplines and technologies consistent with the labs' missions and activities.

  4. Investigation of the Richtmyer-Meshkov instability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Riccardo Bonazza; Mark Anderson; Jason Oakley

    2008-12-22

    The present program is centered on the experimental study of shock-induced interfacial fluid instabilities. Both 2-D (near-sinusoids) and 3-D (spheres) initial conditions are studied in a large, vertical square shock tube facility. The evolution of the interface shape, its distortion, the modal growth rates and the mixing of the fluids at the interface are all objectives of the investigation. In parallel to the experiments, calculations are performed using the Raptor code, on platforms made available by LLNL. These flows are of great relevance to both ICF and stockpile stewardship. The involvement of three graduate students is in line with the national laboratories' interest in the education of scientists and engineers in disciplines and technologies consistent with the labs' missions and activities.

  5. Stress Physiology of Lactic Acid Bacteria

    PubMed Central

    Papadimitriou, Konstantinos; Alegría, Ángel; Bron, Peter A.; de Angelis, Maria; Gobbetti, Marco; Kleerebezem, Michiel; Lemos, José A.; Linares, Daniel M.; Ross, Paul; Stanton, Catherine; Turroni, Francesca; van Sinderen, Douwe; Varmanen, Pekka; Ventura, Marco; Zúñiga, Manuel; Tsakalidou, Effie

    2016-01-01

    SUMMARY Lactic acid bacteria (LAB) are important starter, commensal, or pathogenic microorganisms. The stress physiology of LAB has been studied in depth for over 2 decades, fueled mostly by the technological implications of LAB robustness in the food industry. Survival of probiotic LAB in the host and the potential relatedness of LAB virulence to their stress resilience have intensified interest in the field. Thus, a wealth of information concerning stress responses exists today for strains as diverse as starter (e.g., Lactococcus lactis), probiotic (e.g., several Lactobacillus spp.), and pathogenic (e.g., Enterococcus and Streptococcus spp.) LAB. Here we present the state of the art for LAB stress behavior. We describe the multitude of stresses that LAB are confronted with, and we present the experimental context used to study the stress responses of LAB, focusing on adaptation, habituation, and cross-protection as well as on self-induced multistress resistance in stationary phase, biofilms, and dormancy. We also consider stress responses at the population and single-cell levels. Subsequently, we concentrate on the stress defense mechanisms that have been reported to date, grouping them according to their direct participation in preserving cell energy, defending macromolecules, and protecting the cell envelope. Stress-induced responses of probiotic LAB and commensal/pathogenic LAB are highlighted separately due to the complexity of the peculiar multistress conditions to which these bacteria are subjected in their hosts. Induction of prophages under environmental stresses is then discussed. Finally, we present systems-based strategies to characterize the “stressome” of LAB and to engineer new food-related and probiotic LAB with improved stress tolerance. PMID:27466284

  6. Decision Analysis with Value Focused Thinking as a Methodology to Select Buildings for Deconstruction

    DTIC Science & Technology

    2007-03-01

    [Indexed excerpt consists of table fragments listing candidate buildings for deconstruction, including the Congress Facility, a Hazardous Material Storage Shed, Aircraft Research Labs (buildings 20447 and 20449), a Reserve Forces facility, the Engineering Admin. Building, and the Area B Gas Station, with associated scores and cost figures.]

  7. Love the Lab, Hate the Lab Report?

    ERIC Educational Resources Information Center

    Bjorn, Genevive

    2018-01-01

    In the author's large, urban high school, enrollment in a laboratory science is mandatory. While the student participation rate for lab activities is over 98%, the turn-in rate for traditional lab reports averages just 35% to 85%. Those students who don't produce a lab report miss a critical opportunity to improve their skills in scientific…

  8. Assessing Usage and Maximizing Finance Lab Impact: A Case Exploration

    ERIC Educational Resources Information Center

    Noguera, Magdy; Budden, Michael Craig; Silva, Alberto

    2011-01-01

    This paper reports the results of a survey conducted to assess students' usage and perceptions of a finance lab. Finance labs differ from simple computer labs as they typically contain data boards, streaming market quotes, terminals and software that allow for real-time financial analyses. Despite the fact that such labs represent significant and…

  9. Planning a Computer Lab: Considerations To Ensure Success.

    ERIC Educational Resources Information Center

    IALL Journal of Language Learning Technologies, 1994

    1994-01-01

    Presents points to consider when organizing a computer laboratory. These include the lab's overall objectives and how best to meet them; what type of students will use the lab; where the lab will be located; and what software and hardware can best meet the lab's overall objectives, population, and location requirements. Other factors include time,…

  10. Design of Inquiry-Oriented Science Labs: Impacts on Students' Attitudes

    ERIC Educational Resources Information Center

    Baseya, J. M.; Francis, C. D.

    2011-01-01

    Background: Changes in lab style can lead to differences in learning. Two inquiry-oriented lab styles are guided inquiry (GI) and problem-based (PB). Students' attitudes towards lab are important to consider when choosing between GI and PB styles during curriculum design. Purpose: We examined the degree to which lab experiences are explained by a…

  11. Seismic Analysis Code (SAC): Development, porting, and maintenance within a legacy code base

    NASA Astrophysics Data System (ADS)

    Savage, B.; Snoke, J. A.

    2017-12-01

    The Seismic Analysis Code (SAC) is the result of the toil of many developers over an almost 40-year history. Initially a Fortran-based code, it has undergone major transitions in underlying bit size, from 16 to 32 in the 1980s and from 32 to 64 in 2009, as well as a change in language from Fortran to C in the late 1990s. Maintenance of SAC, the program and its associated libraries, has tracked changes in hardware and operating systems, including the advent of Linux in the early 1990s, the emergence and demise of Sun/Solaris, variants of OSX processors (PowerPC and x86), and Windows (Cygwin). Traces of these systems are still visible in the source code and associated comments. A major concern while improving and maintaining a routinely used legacy code is a fear of introducing bugs or inadvertently removing favorite features of long-time users. Prior to 2004, SAC was maintained and distributed by LLNL (Lawrence Livermore National Lab). In that year, the license was transferred from LLNL to IRIS (Incorporated Research Institutions for Seismology), but the license is not open source. However, there have been thousands of downloads of the package per year, either source code or binaries for specific systems. Starting in 2004, the co-authors have maintained the SAC package for IRIS. In our updates, we fixed bugs, incorporated newly introduced seismic analysis procedures (such as EVALRESP), added new, accessible features (plotting and parsing), and improved the documentation (now in HTML and PDF formats). Moreover, we have added modern software engineering practices to the development of SAC, including use of recent source control systems, high-level tests, and scripted, virtualized environments for rapid testing and building. Finally, a "sac-help" listserv (administered by IRIS) was set up for SAC-related issues and is the primary avenue for users seeking advice and reporting bugs. Attempts are always made to respond to issues and bugs in a timely fashion.
For the past thirty-plus years, SAC files have contained a fixed-length header. Time- and distance-related values are stored in single precision, which has become a problem with the increase in desired precision for data compared to thirty years ago. A future goal is to address this precision problem, but in a backward-compatible manner. We would also like to transition SAC to a more open-source license.
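The single-precision limitation can be illustrated directly. The numbers below are a generic property of IEEE-754 float32, not taken from the SAC documentation.

```python
import numpy as np

# Single-precision floats have a 24-bit significand, so their resolution
# degrades as values grow. For a time offset around 1e6 s (~11.6 days),
# the spacing between adjacent float32 values is 1/16 s -- far coarser
# than modern sample timing requires -- while float64 still resolves
# sub-nanosecond differences at the same magnitude.

t = 1.0e6
print(np.spacing(np.float32(t)))  # 0.0625 s: float32 resolution near 1e6
print(np.spacing(np.float64(t)))  # ~1.16e-10 s: float64 resolution near 1e6
```

This is why long time series or large distances stored in a fixed single-precision header lose precision, and why any fix must still read the old 32-bit headers to stay backward compatible.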

  12. Introductory labs; what they don't, should, and can teach (and why)

    NASA Astrophysics Data System (ADS)

    Wieman, Carl

    2016-03-01

    Introductory physics labs are widely used and expensive. They have a wide variety of potential learning goals, but these are seldom specified, and whether they are achieved is even less often measured. We cover three different research projects on introductory labs: 1) We have performed cognitive task analyses of both experimental research in physics and instructional labs; the striking differences explain much of the unhappiness students express with labs. 2) We have measured the effectiveness of two introductory physics lab courses specifically intended to teach the physics content covered in standard introductory courses on mechanics and E & M; as measured by course exams, the benefit is 0 ± 2% for both. 3) We show how it is possible to use lab courses to teach students to correctly evaluate physical models with uncertain data. Such quantitative critical thinking is an important skill that is not learned in typical lab courses but is well learned with our modified lab instruction.

  13. Stress Physiology of Lactic Acid Bacteria.

    PubMed

    Papadimitriou, Konstantinos; Alegría, Ángel; Bron, Peter A; de Angelis, Maria; Gobbetti, Marco; Kleerebezem, Michiel; Lemos, José A; Linares, Daniel M; Ross, Paul; Stanton, Catherine; Turroni, Francesca; van Sinderen, Douwe; Varmanen, Pekka; Ventura, Marco; Zúñiga, Manuel; Tsakalidou, Effie; Kok, Jan

    2016-09-01

    Lactic acid bacteria (LAB) are important starter, commensal, or pathogenic microorganisms. The stress physiology of LAB has been studied in depth for over 2 decades, fueled mostly by the technological implications of LAB robustness in the food industry. Survival of probiotic LAB in the host and the potential relatedness of LAB virulence to their stress resilience have intensified interest in the field. Thus, a wealth of information concerning stress responses exists today for strains as diverse as starter (e.g., Lactococcus lactis), probiotic (e.g., several Lactobacillus spp.), and pathogenic (e.g., Enterococcus and Streptococcus spp.) LAB. Here we present the state of the art for LAB stress behavior. We describe the multitude of stresses that LAB are confronted with, and we present the experimental context used to study the stress responses of LAB, focusing on adaptation, habituation, and cross-protection as well as on self-induced multistress resistance in stationary phase, biofilms, and dormancy. We also consider stress responses at the population and single-cell levels. Subsequently, we concentrate on the stress defense mechanisms that have been reported to date, grouping them according to their direct participation in preserving cell energy, defending macromolecules, and protecting the cell envelope. Stress-induced responses of probiotic LAB and commensal/pathogenic LAB are highlighted separately due to the complexity of the peculiar multistress conditions to which these bacteria are subjected in their hosts. Induction of prophages under environmental stresses is then discussed. Finally, we present systems-based strategies to characterize the "stressome" of LAB and to engineer new food-related and probiotic LAB with improved stress tolerance. Copyright © 2016, American Society for Microbiology. All Rights Reserved.

  14. Designing virtual science labs for the Islamic Academy of Delaware

    NASA Astrophysics Data System (ADS)

    AlZahrani, Nada Saeed

    Science education is a basic part of the curriculum in modern-day classrooms. Instructional approaches to science education can take many forms, but hands-on application of theory via science laboratory activities is common. Not all schools have the resources to provide the laboratory environment necessary for hands-on application of science theory; some settings rely on technology to provide a virtual laboratory experience instead. The Islamic Academy of Delaware (IAD), a typical community-based organization, was formed to support and meet the essential needs of the Muslim community of Delaware. IAD provides science education as part of the overall curriculum but cannot provide laboratory activities as part of the science program. Virtual science labs may be a successful model for students at IAD. This study was conducted to investigate the potential of implementing virtual science labs at IAD and to develop an implementation plan for integrating the virtual labs. The literature has shown us that the lab experience is a valuable part of the science curriculum (NBPTS, 2013; Wolf, 2010; National Research Council, 1997, 2012). The National Research Council (2012) stressed the inclusion of laboratory investigations in the science curriculum. The literature also supports the use of virtual labs as an effective substitute for classroom labs (Babateen, 2011; National Science Teachers Association, 2008). Pyatt and Simms (2011) found evidence that virtual labs were as good as, if not better than, physical lab experiences in some respects. Although not identical in experience to a live lab, the virtual lab has been shown to provide the student with an effective laboratory experience in situations where the live lab is not possible. The results of the IAD teacher interviews indicate that the teachers are well prepared for, and supportive of, the implementation of virtual labs to improve the science education curriculum.
The investigator believes that with the support of the literature and the readiness of the IAD administration and teachers, a recommendation to implement virtual labs into the curriculum can be made.

  15. Using "Saccharomyces cerevisiae" to Test the Mutagenicity of Household Compounds: An Open Ended Hypothesis-Driven Teaching Lab

    ERIC Educational Resources Information Center

    Marshall, Pamela A.

    2007-01-01

    In our Fundamentals of Genetics lab, students perform a wide variety of labs to reinforce and extend the topics covered in lecture. I developed an active-learning lab to augment the lecture topic of mutagenesis. In this lab exercise, students determine if a compound they bring from home is a mutagen. Students are required to read extensive…

  16. Incorporating inquiry and the process of science into introductory astronomy labs at the George Washington University

    NASA Astrophysics Data System (ADS)

    Cobb, Bethany E.

    2018-01-01

    Since 2013, the Physics Department at GWU has used student-centered active learning in the introductory astronomy course “Introduction to the Cosmos.” Class time is spent in groups on questions, math problems, and hands-on activities, with multiple instructors circulating to answer questions and engage with the students. The students have responded positively to this active learning. Unfortunately, in transitioning to active learning there was no time to rewrite the labs. Very quickly, the contrast between the dynamic classroom and the traditional labs became apparent. The labs were almost uniformly “cookie-cutter” in that the procedure and analysis were specified step-by-step and there was just one right answer. Students rightly criticized the labs for lacking a clear purpose and including busy-work. Furthermore, this class fulfills the GWU scientific reasoning general education requirement and thus includes learning objectives related to understanding the scientific method, testing hypotheses with data, and considering uncertainty, but the traditional labs did not require these skills. I set out to rejuvenate the lab sequence by writing new inquiry labs based on both topic-specific and scientific reasoning learning objectives. While inquiry labs can be challenging for the students, as they require active thinking and creativity, these labs engage the students more thoroughly in the scientific process. In these new labs, whenever possible, I include real astronomical data and ask the students to use digital tools (SDSS SkyServer, SOHO archive) as if they were real astronomers. To allow students to easily plot, manipulate, and analyze data, I built “smart” Excel files using formulas, dropdown menus, and macros. The labs are now much more authentic and thought-provoking.
Whenever possible, students independently develop questions, hypotheses, and procedures and the scientific method is “scaffolded” over the semester by providing more guidance in the early labs and more independence later on. Finally, in every lab, students must identify and reflect on sources of error. These labs are more challenging for the instructors to run and to grade, but they are much more satisfying when it comes to student learning.

  17. Advanced LabVIEW Labs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, Eric D.

    1999-06-17

    In the world of computer-based data acquisition and control, the graphical interface program LabVIEW from National Instruments is so ubiquitous that in many ways it has almost become the laboratory standard. To date, there have been approximately fifteen books concerning LabVIEW, but Professor Essick's treatise takes a completely different tack from all of the previous discussions. In the more standard treatments of the ways and wherefores of LabVIEW, such as LabVIEW Graphical Programming: Practical Applications in Instrumentation and Control by Gary W. Johnson (McGraw Hill, NY 1997), the emphasis has been on instructing the reader how to program LabVIEW to create a Virtual Instrument (VI) on the computer for interfacing to a particular instrument. LabVIEW is written in "G", a graphical programming language developed by National Instruments, and in the past the emphasis has been on training the experimenter to learn "G". Without going into details here, "G" incorporates the usual loops, arithmetic expressions, etc., found in many programming languages, but in an icon (graphical) environment. The net result is that LabVIEW contains all of the standard methods needed for interfacing to instruments, data acquisition, data analysis, and graphics, along with methodology to incorporate programs written in other languages into LabVIEW. Historically, according to Professor Essick, he developed a series of experiments for an upper-division laboratory course on computer-based instrumentation. His observation was that while many students had the necessary background in computer programming languages, there were students who had virtually no concept of writing a computer program, let alone a computer-based interfacing program. Thus began a concept for not only teaching computer-based instrumentation techniques but also providing a method for the beginner to experience writing a computer program.
Professor Essick saw LabVIEW as the "perfect environment in which to teach computer-based research skills." With this goal in mind, he has succeeded admirably. Advanced LabVIEW Labs presents a series of chapters devoted not only to introducing the reader to LabVIEW but also to the concepts necessary for writing a successful computer program. Each chapter is an assignment for the student, and the set is suitable for a ten-week course. The first topic introduces the while loop and waveform chart VIs. After learning how to launch LabVIEW, the student then learns how to use LabVIEW functions such as sine and cosine. The beauty of this and subsequent chapters is that the student is introduced immediately to computer-based instruction by learning how to display the results in graph form on the screen. At each point along the way, the student is introduced not only to another LabVIEW operation but also to such subjects as spreadsheets for data storage, numerical integration, Fourier transformations, curve-fitting algorithms, etc. The last few chapters conclude with the purpose of the learning module: computer-based instrumentation. Computer-based laboratory projects such as analog-to-digital conversion and digitizing oscilloscopes are treated. Advanced LabVIEW Labs finishes with a treatment of GPIB interfacing and, finally, the student is asked to create an operating VI for temperature control. This is an excellent text, not only as a treatise on LabVIEW but also as an introduction to computer programming logic. All programmers who are struggling not only to learn how to interface computers to instruments but also to understand top-down programming and other programming-language techniques should add Advanced LabVIEW Labs to their computer library.

  19. Virtual Simulations as Preparation for Lab Exercises: Assessing Learning of Key Laboratory Skills in Microbiology and Improvement of Essential Non-Cognitive Skills.

    PubMed

    Makransky, Guido; Thisgaard, Malene Warming; Gadegaard, Helen

    2016-01-01

    To investigate whether a virtual laboratory simulation (vLAB) could be used to replace a face-to-face tutorial (demonstration) to prepare students for a laboratory exercise in microbiology, a total of 189 students participating in an undergraduate biology course were randomly assigned to a vLAB or demonstration condition. In the vLAB condition, students could use a vLAB at home to 'practice' streaking out bacteria on agar plates in a virtual environment. In the demonstration condition, students were given a live demonstration by a lab tutor showing them how to streak out bacteria on agar plates. All students were blindly assessed on their ability to perform the streaking technique in the physical lab and were administered a pre- and post-test to determine their knowledge of microbiology, intrinsic motivation to study microbiology, and self-efficacy in the field of microbiology before and after the experiment. The results showed that there were no significant differences between the two groups in their lab scores, and both groups had similar increases in knowledge of microbiology, intrinsic motivation to study microbiology, and self-efficacy in the field of microbiology. Our data show that vLABs function just as well as face-to-face tutorials in preparing students for a physical lab activity in microbiology. The results imply that vLABs could be used instead of face-to-face tutorials and that a combination of virtual and physical lab exercises could be the future of science education.

  20. Cherenkov and scintillation light separation in organic liquid scintillators

    NASA Astrophysics Data System (ADS)

    Caravaca, J.; Descamps, F. B.; Land, B. J.; Yeh, M.; Orebi Gann, G. D.

    2017-12-01

    The CHErenkov/Scintillation Separation experiment (CHESS) has been used to demonstrate the separation of Cherenkov and scintillation light in both linear alkylbenzene (LAB) and LAB with 2 g/L of PPO as a fluor (LAB/PPO). This is the first successful demonstration of Cherenkov light detection from the more challenging LAB/PPO cocktail and improves on previous results for LAB. A time resolution of 338 ± 12 ps FWHM results in an efficiency for identifying Cherenkov photons in LAB/PPO of 70 ± 3% and 63 ± 8% for time- and charge-based separation, respectively, with scintillation contamination of 36 ± 5% and 38 ± 4%. The LAB/PPO data are consistent with a rise time of τ_r = 0.72 ± 0.33 ns.
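Time-based separation exploits the fact that Cherenkov light is prompt while scintillation follows the fluor's slower rise and decay. A toy classifier with an arbitrary 1 ns prompt cut (not CHESS's calibrated cut, and with invented hit times) might look like this:

```python
# Toy time-based Cherenkov/scintillation separation: hits inside an early
# time window relative to the event time are tagged as Cherenkov
# candidates. The 1 ns cut and the hit times below are illustrative
# assumptions, not CHESS's calibrated values.

def tag_cherenkov(hit_times_ns, prompt_cut_ns=1.0):
    """Return True for hits inside the prompt window."""
    return [t < prompt_cut_ns for t in hit_times_ns]

hits = [0.2, 0.6, 0.9, 2.5, 7.1, 18.0]   # ns after event time zero
tags = tag_cherenkov(hits)
print(sum(tags), "Cherenkov-like of", len(hits))  # → 3 Cherenkov-like of 6
```

In practice the achievable efficiency and contamination quoted above are set by how the detector's time resolution smears hits across any such cut.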

  1. Inexpensive DAQ based physics labs

    NASA Astrophysics Data System (ADS)

    Lewis, Benjamin; Clark, Shane

    2015-11-01

    Quality data acquisition (DAQ) based physics labs can be designed using microcontrollers and very low-cost sensors with minimal lab equipment. A prototype device with several sensors and documentation for a number of DAQ-based labs is showcased. The device connects to a computer through Bluetooth and uses a simple interface to control the DAQ and display real-time graphs, storing the data in .txt and .xls formats. A full device including a larger number of sensors, combined with a software interface and detailed documentation, would provide a high-quality physics lab education at minimal cost, for instance in high schools lacking lab equipment or for students taking online classes. An entire semester's lab course could be conducted using a single device with a manufacturing cost of under $20.
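A host-side sketch of the kind of streamed-data handling described above follows. The comma-separated wire format and the function names are assumptions; the abstract does not specify the prototype's actual protocol.

```python
# Sketch of host-side parsing for a microcontroller DAQ that streams
# readings as comma-separated lines, e.g. "12.50,accel_x,0.98"
# (time in s, sensor name, value). The wire format is an assumption --
# a real device would arrive over a Bluetooth serial port rather than
# as in-memory strings.

def parse_sample(line: str):
    """Parse one streamed line into (time_s, sensor, value)."""
    t, sensor, value = line.strip().split(",")
    return float(t), sensor, float(value)

def log_samples(lines, out_path="daq_log.txt"):
    """Append parsed samples to a tab-separated text log (.txt output)."""
    samples = [parse_sample(ln) for ln in lines if ln.strip()]
    with open(out_path, "a") as f:
        for t, sensor, v in samples:
            f.write(f"{t}\t{sensor}\t{v}\n")
    return samples

print(parse_sample("12.50,accel_x,0.98"))  # → (12.5, 'accel_x', 0.98)
```

A simple line-oriented text protocol like this keeps the microcontroller firmware trivial, which is consistent with the sub-$20 cost target.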

  2. A Constructivist Cloud Lab.

    ERIC Educational Resources Information Center

    Emery, Dave

    1996-01-01

    Describes a lab involving a cloud formation activity that uses the constructivist learning model to get students more involved in creating the lab. Enables students to develop a greater understanding of the concepts involved and more interest in the lab's outcomes. (JRH)

  3. Differences between Lab Completion and Non-Completion on Student Performance in an Online Undergraduate Environmental Science Program

    NASA Astrophysics Data System (ADS)

    Corsi, Gianluca

    2011-12-01

    Web-based technology has revolutionized the way education is delivered. Although the advantages of online learning appeal to large numbers of students, some concerns arise. One major concern in online science education is the value that participation in labs has on student performance. The purpose of this study was to assess the relationships between lab completion and student academic success as measured by test grades, scientific self-confidence, scientific skills, and concept mastery. A random sample of 114 volunteer undergraduate students, from an online Environmental Science program at the American Public University System, was tested. The study followed a quantitative, non-experimental research design. Paired sample t-tests were used for statistical comparison between pre-lab and post-lab test grades, two scientific skills quizzes, and two scientific self-confidence surveys administered at the beginning and at the end of the course. The results of the paired sample t-tests revealed statistically significant improvements on all post-lab test scores: Air Pollution lab, t(112) = 6.759, p < .001; Home Chemicals lab t(114) = 8.585, p < .001; Water Use lab, t(116) = 6.657, p < .001; Trees and Carbon lab, t(113) = 9.921, p < .001; Stratospheric Ozone lab, t(112) =12.974, p < .001; Renewable Energy lab, t(115) = 7.369, p < .001. The end of the course Scientific Skills quiz revealed statistically significant improvements, t(112) = 8.221, p < .001. The results of the two surveys showed a statistically significant improvement on student Scientific Self-Confidence because of lab completion, t(114) = 3.015, p < .05. Because age and gender were available, regression models were developed. The results indicated weak multiple correlation coefficients and were not statistically significant at alpha = .05. Evidence suggests that labs play a positive role in a student's academic success. 
It is recommended that lab experiences be included in all online Environmental Science programs, with emphasis on open-ended inquiries, and adoption of online tools to enhance hands-on experiences, such as virtual reality platforms and digital animations. Future research is encouraged to investigate possible correlations between socio-demographic attributes and academic success of students enrolled in online science programs in reference to lab completion.
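    The paired-sample t-tests reported above compare each student's pre- and post-lab scores. A minimal sketch of the statistic, using made-up scores rather than the study's data:

```python
from math import sqrt

def paired_t(pre, post):
    """Paired-sample t statistic: mean difference over its standard error."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean_d = sum(diffs) / n
    var_d = sum((d - mean_d) ** 2 for d in diffs) / (n - 1)
    return mean_d / sqrt(var_d / n), n - 1  # t value, degrees of freedom

# Illustrative pre/post-lab scores (not the study's data):
pre = [62, 70, 55, 68, 74, 60]
post = [70, 75, 60, 72, 80, 66]
t, df = paired_t(pre, post)
print(f"t({df}) = {t:.2f}")
```

    The reported values such as t(112) = 6.759 follow this same construction, with degrees of freedom equal to the number of paired observations minus one.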

  4. HARE: Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mckie, Jim

    2012-01-09

    This report documents the results of work done over a 6-year period under the FAST-OS programs. The first effort, Right-Weight Kernels (RWK), was concerned with improving measurements of OS noise so it could be treated quantitatively, and with evaluating the use of two operating systems, Linux and Plan 9, on HPC systems to determine how these operating systems needed to be extended or changed for HPC while still retaining their general-purpose nature. The second program, HARE, explored the creation of alternative runtime models, building on RWK. All of the HARE work was done on Plan 9. The HARE researchers were mindful of the very good Linux and LWK work being done at other labs and saw no need to recreate it. Even given this limited funding, the two efforts had outsized impact:
    - Helped Cray decide to use Linux, instead of a custom kernel, and provided the tools needed to make Linux perform well
    - Created a successor operating system to Plan 9, NIX, which has been taken in by Bell Labs for further development
    - Created a standard system measurement tool, Fixed Time Quantum (FTQ), which is widely used for measuring operating system impact on applications
    - Spurred the use of the 9p protocol in several organizations, including IBM
    - Built software in use at many companies, including IBM, Cray, and Google
    - Spurred the creation of alternative runtimes for use on HPC systems
    - Demonstrated that, with proper modifications, a general-purpose operating system can provide communications up to 3 times as effective as user-level libraries
    Open source was a key part of this work. The code developed for this project is in wide use and available in many places. The core Blue Gene code is available at https://bitbucket.org/ericvh/hare. We describe details of these impacts in the following sections.
    The rest of this report is organized as follows: first, we describe commercial impact; next, we describe the FTQ benchmark and its impact in more detail; operating systems and runtime research follows; we then discuss infrastructure software; and we close with a description of the new NIX operating system, future work, and conclusions.

  5. Towards a Manifesto for Living Lab Co-creation

    NASA Astrophysics Data System (ADS)

    Følstad, Asbjørn; Brandtzæg, Petter Bae; Gulliksen, Jan; Börjeson, Mikael; Näkki, Pirjo

    There is a growing interest in Living Labs for innovation and development in the field of information and communication technology. In particular, there seems to be a tendency for current Living Labs to involve users for co-creative purposes. However, the current literature on Living Lab co-creation is severely limited. Therefore, an Interact workshop has been arranged as a first step towards a manifesto for Living Lab co-creation.

  6. Alternation of Generations and Experimental Design: A Guided-Inquiry Lab Exploring the Nature of the "her1" Developmental Mutant of "Ceratopteris richardii" (C-Fern)

    ERIC Educational Resources Information Center

    Spiro, Mark D.; Knisely, Karin I.

    2008-01-01

    Inquiry-based labs have been shown to greatly increase student participation and learning within the biological sciences. One challenge is to develop effective lab exercises within the constraints of large introductory labs. We have designed a lab for first-year biology majors to address two primary goals: to provide effective learning of the…

  7. Open web system of Virtual labs for nuclear and applied physics

    NASA Astrophysics Data System (ADS)

    Saldikov, I. S.; Afanasyev, V. V.; Petrov, V. I.; Ternovykh, M. Yu

    2017-01-01

    An example of a virtual lab based on unique experimental equipment is presented. The virtual lab is software built on a model of the real equipment. Virtual labs can be used in the educational process in the nuclear safety and analysis field. The example presented is the virtual lab “Experimental determination of the material parameter depending on the pitch of a uranium-water lattice”, and this paper includes a general description of this lab. A database supporting laboratory work on unique experimental equipment, which is part of this work, and the development of its concept are also described.

  8. The contribution of weathering of the main Alpine rivers on the global carbon cycle

    NASA Astrophysics Data System (ADS)

    Donnini, Marco; Probst, Jean-Luc; Probst, Anne; Frondini, Francesco; Marchesini, Ivan; Guzzetti, Fausto

    2013-04-01

    On geological time-scales, the carbon fluxes from the solid Earth to the atmosphere mainly result from volcanism and metamorphic decarbonation processes, whereas the carbon fluxes from the atmosphere to the solid Earth mainly depend on weathering of silicates and carbonates, biogenic precipitation and removal of CaCO3 in the oceans, and volcanic gas - seawater interactions. Quantifying each contribution is critical. In this work, we estimate the atmospheric CO2 uptake by weathering in the Alps, using the dissolved loads transported by the 33 main Alpine rivers. The chemical composition of river water in unpolluted areas is a good indicator of surface weathering processes (Garrels and Mackenzie, 1971; Drever, 1982; Meybeck, 1984; Tardy, 1986; Berner and Berner, 1987; Probst et al., 1994). The dissolved load of streams originates from atmospheric input, pollution, evaporite dissolution, and weathering of carbonate and silicate rocks, and the application of mass balance calculations allows quantification of the different contributions. In this work, we applied the MEGA (Major Element Geochemical Approach) geochemical code (Amiotte Suchet, 1995; Amiotte Suchet and Probst, 1996) to the chemical compositions of the selected rivers in order to quantify the atmospheric CO2 consumed by weathering in the Alpine region. The drainage basins of the main Alpine rivers were sampled near the basin outlets during dry and flood seasons. The application of the MEGA geochemical code consisted of several steps. First, we subtracted the rain contribution to river waters, knowing the X/Cl (X = Na, K, Mg, Ca) ratios of the rain. Next, we considered that all (Na+K) came from silicate weathering. The average molar ratio Rsil = (Na+K)/(Ca+Mg) for rivers draining silicate terrains was estimated from unpolluted French stream waters draining small monolithological basins (Meybeck, 1986; 1987).
    For this purpose, we prepared a simplified geo-lithological map of the Alps according to the lithological classification of Meybeck (1986, 1987). Then, for each basin, we computed a weighted-average Rsil considering the surface area and the mean precipitation for each lithology. Lastly, we estimated the (Ca+Mg) originating from carbonate weathering as the cations remaining after the silicate correction. Depending on the time-scale of the phenomena (shorter than about 1 million years, i.e., correlated with the short-term carbon cycle, or longer than about 1 million years, i.e., correlated with the long-term carbon cycle), we considered different equations for the quantification of the atmospheric CO2 consumed by weathering (Huh, 2010). The results show the net predominance of carbonate weathering in fixing atmospheric CO2 and show that, considering the long-term carbon cycle, the amount of atmospheric CO2 uptake by weathering is about one order of magnitude lower than when considering the short-term carbon cycle. Moreover, considering the short-term carbon cycle, the mean CO2 consumed by Alpine basins is of the same order of magnitude as the mean CO2 consumed by weathering in the 60 largest rivers of the world estimated by Gaillardet et al. (1999). References: Amiotte-Suchet, P. "Cycle du carbone, érosion chimique des continents et transfert vers les océans." Sci. Géol. Mém. Strasbourg 97 (1995): 156. Amiotte-Suchet, P., and J.-L. Probst. "Origins of dissolved inorganic carbon in the Garonne river waters: seasonal and interannual variations." Sci. Géologiques Bull. Strasbourg 49, no. 1-4 (1996): 101-126. Berner, E.K., and R.A. Berner. The Global Water Cycle: Geochemistry and Environment. Englewood Cliffs, NJ: Prentice Hall, 1987. 
    Drever, J.I. The Geochemistry of Natural Waters. Prentice Hall, 1982. Gaillardet, J., B. Dupré, P. Louvat, and C.J. Allègre. "Global silicate weathering and CO2 consumption rates deduced from the chemistry of large rivers." Chemical Geology 159 (1999): 3-30. Garrels, R.M., and F.T. Mackenzie. Evolution of Sedimentary Rocks. New York: W.W. Norton, 1971. Huh, Y. "Estimation of atmospheric CO2 uptake by silicate weathering in the Himalayas and the Tibetan Plateau: a review of existing fluvial geochemical data." In Monsoon Evolution and Tectonics-Climate Linkage in Asia, 129-151. Geological Society of London, Special Publications 342. London, 2010. Meybeck, M. "Composition chimique naturelle des ruisseaux non pollués en France." Bulletin de la Société Géologique 39 (1986): 3-77. Meybeck, M. "Global chemical weathering of surficial rocks estimated from river dissolved load." Am. J. Sci. 287 (1987): 401-428. Probst, J.L., J. Mortatti, and Y. Tardy. "Carbon river fluxes and global weathering CO2 consumption in the Congo and Amazon river basins." Applied Geochemistry 9 (1994): 1-13. Tardy, Y. Le cycle de l'eau: climats, paléoclimats et géochimie globale. Paris: Masson, 1986.
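    The mass-balance steps described above (rain correction via X/Cl ratios, Na+K assigned to silicates, Rsil partitioning of Ca+Mg) can be sketched as follows. All concentrations and ratios below are invented for illustration and are not MEGA's calibrated values:

```python
# All concentrations in mmol/L; values are illustrative, not measured data.
river = {"Na": 0.30, "K": 0.05, "Ca": 1.20, "Mg": 0.40, "Cl": 0.10}
rain_ratio = {"Na": 0.85, "K": 0.02, "Ca": 0.04, "Mg": 0.10}  # X/Cl molar ratios in rain

# Step 1: subtract the atmospheric (rain) contribution via the X/Cl ratios.
corrected = {x: river[x] - rain_ratio[x] * river["Cl"] for x in rain_ratio}

# Step 2: all remaining Na + K is assigned to silicate weathering.
na_k = corrected["Na"] + corrected["K"]

# Step 3: silicate-derived Ca + Mg from the basin-averaged Rsil = (Na+K)/(Ca+Mg).
Rsil = 2.0  # hypothetical weighted average for the basin
ca_mg_silicate = na_k / Rsil

# Step 4: the remaining Ca + Mg is attributed to carbonate weathering.
ca_mg_carbonate = corrected["Ca"] + corrected["Mg"] - ca_mg_silicate
print(f"Carbonate-derived Ca+Mg: {ca_mg_carbonate:.4f} mmol/L")
```

    The carbonate- and silicate-derived cation budgets then feed the CO2-consumption equations, which differ between the short- and long-term carbon cycles as noted above.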

  9. Lorentz boosted frame simulation technique in Particle-in-cell methods

    NASA Astrophysics Data System (ADS)

    Yu, Peicheng

    In this dissertation, we systematically explore the use of a simulation method for modeling laser wakefield acceleration (LWFA) with the particle-in-cell (PIC) method, called the Lorentz boosted frame technique. In the lab frame, the plasma length is typically four orders of magnitude larger than the laser pulse length. Using this technique, simulations are performed in a Lorentz boosted frame in which the plasma length, which is Lorentz contracted, and the laser length, which is Lorentz expanded, are comparable. This technique has the potential to reduce the computational needs of an LWFA simulation by more than four orders of magnitude, and is useful if there is no or negligible reflection of the laser in the lab frame. To realize the potential of Lorentz boosted frame simulations for LWFA, the first obstacle to overcome is a robust and violent numerical instability, called the Numerical Cerenkov Instability (NCI), which leads to unphysical energy exchange between relativistically drifting particles and their radiation. This produces unphysical noise that dwarfs the real physical processes. In this dissertation, we first present a theoretical analysis of this instability, and show that the NCI comes from the unphysical coupling of the electromagnetic (EM) modes and Langmuir modes (both main and aliasing) of the relativistically drifting plasma. We then discuss methods to eliminate it, including a hybrid FFT/finite difference solver. However, the use of FFTs can lead to parallel scalability issues when there are many more cells along the drifting direction than in the transverse direction(s). We then describe an algorithm that has the potential to address this issue by using a higher order finite difference operator for the derivative in the plasma drifting direction, while using the standard second order operators in the transverse direction(s). 
The NCI for this algorithm is analyzed, and it is shown that the NCI can be eliminated using the same strategies that were used for the hybrid FFT/Finite Difference solver. This scheme also requires a current correction and filtering which require FFTs. However, we show that in this case the FFTs can be done locally on each parallel partition. We also describe how the use of the hybrid FFT/Finite Difference or the hybrid higher order finite difference/second order finite difference methods permit combining the Lorentz boosted frame simulation technique with another "speed up" technique, called the quasi-3D algorithm, to gain unprecedented speed up for the LWFA simulations. In the quasi-3D algorithm the fields and currents are defined on an r--z PIC grid and expanded in azimuthal harmonics. The expansion is truncated with only a few modes so it has similar computational needs of a 2D r--z PIC code. We show that NCI has similar properties in r--z as in z-x slab geometry and show that the same strategies for eliminating the NCI in Cartesian geometry can be effective for the quasi-3D algorithm leading to the possibility of unprecedented speed up. We also describe a new code called UPIC-EMMA that is based on fully spectral (FFT) solver. The new code includes implementation of a moving antenna that can launch lasers in the boosted frame. We also describe how the new hybrid algorithms were implemented into OSIRIS. Examples of LWFA using the boosted frame using both UPIC-EMMA and OSIRIS are given, including the comparisons against the lab frame results. We also describe how to efficiently obtain the boosted frame simulations data that are needed to generate the transformed lab frame data, as well as how to use a moving window in the boosted frame. The NCI is also a major issue for modeling relativistic shocks with PIC algorithm. In relativistic shock simulations two counter-propagating plasmas drifting at relativistic speeds are colliding against each other. 
    We show that the strategies for eliminating the NCI developed in this dissertation enable such simulations to run for much longer simulation times, which should open a path for major advances in relativistic shock research. (Abstract shortened by ProQuest.)
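    The frame-choice argument in the abstract can be made concrete: boosting with Lorentz factor γ_b contracts the plasma by γ_b while stretching the counter-propagating laser pulse by roughly γ_b(1 + β_b). A sketch in which the four-orders-of-magnitude scale separation comes from the abstract, but the specific lengths and γ_b are assumptions:

```python
from math import sqrt

def boosted_lengths(l_plasma, l_laser, gamma_b):
    """Plasma length Lorentz-contracts in the boosted frame, while the
    laser pulse (moving opposite to the boost) is Lorentz-expanded."""
    beta_b = sqrt(1 - 1 / gamma_b**2)
    return l_plasma / gamma_b, l_laser * gamma_b * (1 + beta_b)

# Illustrative lab-frame lengths: a 10 cm plasma and a 10 micron laser pulse,
# i.e. the four-orders-of-magnitude disparity mentioned in the abstract.
l_plasma, l_laser = 0.1, 1.0e-5  # metres
p_boost, l_boost = boosted_lengths(l_plasma, l_laser, gamma_b=70)
print(f"boosted plasma: {p_boost:.2e} m, boosted laser: {l_boost:.2e} m")
```

    At γ_b = 70 the two lengths become comparable (about 1.4 mm each), which is the origin of the potential four-orders-of-magnitude computational saving.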

  10. Cherenkov and scintillation light separation in organic liquid scintillators

    DOE PAGES

    Caravaca, J.; Descamps, F. B.; Land, B. J.; ...

    2017-11-29

    The CHErenkov/Scintillation Separation experiment (CHESS) has been used to demonstrate the separation of Cherenkov and scintillation light in both linear alkylbenzene (LAB) and LAB with 2 g/L of PPO as a fluor (LAB/PPO). This is the first successful demonstration of Cherenkov light detection from the more challenging LAB/PPO cocktail and improves on previous results for LAB. A time resolution of 338 ± 12 ps FWHM results in an efficiency for identifying Cherenkov photons in LAB/PPO of 70 ± 3% and 63 ± 8% for time- and charge-based separation, respectively, with scintillation contamination of 36 ± 5% and 38 ± 4%. LAB/PPO data are consistent with a rise time of τ_r = 0.72 ± 0.33 ns.

  11. Cherenkov and scintillation light separation in organic liquid scintillators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Caravaca, J.; Descamps, F. B.; Land, B. J.

    The CHErenkov/Scintillation Separation experiment (CHESS) has been used to demonstrate the separation of Cherenkov and scintillation light in both linear alkylbenzene (LAB) and LAB with 2 g/L of PPO as a fluor (LAB/PPO). This is the first successful demonstration of Cherenkov light detection from the more challenging LAB/PPO cocktail and improves on previous results for LAB. A time resolution of 338 ± 12 ps FWHM results in an efficiency for identifying Cherenkov photons in LAB/PPO of 70 ± 3% and 63 ± 8% for time- and charge-based separation, respectively, with scintillation contamination of 36 ± 5% and 38 ± 4%. LAB/PPO data are consistent with a rise time of τ_r = 0.72 ± 0.33 ns.

  12. Cost-effectiveness of the non-laboratory based Framingham algorithm in primary prevention of cardiovascular disease: A simulated analysis of a cohort of African American adults.

    PubMed

    Kariuki, Jacob K; Gona, Philimon; Leveille, Suzanne G; Stuart-Shor, Eileen M; Hayman, Laura L; Cromwell, Jerry

    2018-06-01

    The non-lab Framingham algorithm, which substitutes body mass index for lipids in the laboratory-based (lab-based) Framingham algorithm, has been validated among African Americans (AAs). However, its cost-effectiveness and economic tradeoffs have not been evaluated. This study examines the incremental cost-effectiveness ratio (ICER) of two cardiovascular disease (CVD) prevention programs guided by the non-lab versus lab-based Framingham algorithm. We simulated the World Health Organization CVD prevention guidelines on a cohort of 2690 AA participants in the Atherosclerosis Risk in Communities (ARIC) cohort. Costs were estimated using Medicare fee schedules (diagnostic tests, drugs, and visits), Bureau of Labor Statistics data (RN wages), and estimates for managing incident CVD events. Outcomes were assumed to be true positive cases detected at a data-driven treatment threshold. Both algorithms had the best balance of sensitivity/specificity at the moderate risk threshold (>10% risk). Over 12 years, 82% and 77% of 401 incident CVD events were accurately predicted via the non-lab and lab-based Framingham algorithms, respectively. There were 20 fewer false negative cases in the non-lab approach, translating into over $900,000 in savings over 12 years. The ICER was -$57,153 for every extra CVD event prevented when using the non-lab algorithm. The approach guided by the non-lab Framingham strategy dominated the lab-based approach with respect to both costs and predictive ability. Consequently, the non-lab Framingham algorithm could potentially provide a highly effective screening tool at lower cost to address the high burden of CVD, especially among AAs and in resource-constrained settings where lab tests are unavailable. Copyright © 2017 Elsevier Inc. All rights reserved.
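    The ICER quoted above is the standard incremental cost-effectiveness ratio. A minimal sketch with hypothetical program totals (not the study's figures, which yielded -$57,153 per extra event):

```python
def icer(cost_new, cost_old, effect_new, effect_old):
    """Incremental cost-effectiveness ratio: extra cost per extra unit of effect.
    A negative value with a positive effect difference means the new strategy
    dominates (cheaper and more effective)."""
    return (cost_new - cost_old) / (effect_new - effect_old)

# Hypothetical program totals, not the study's figures:
ratio = icer(cost_new=1_500_000, cost_old=1_900_000,
             effect_new=330, effect_old=310)  # CVD events correctly detected
print(f"ICER: ${ratio:,.0f} per extra event detected")
```

    A negative ICER with a positive effect difference is exactly the "dominant" situation the study reports: the non-lab strategy costs less and detects more events.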

  13. An evaluation of two hands-on lab styles for plant biodiversity in undergraduate biology.

    PubMed

    Basey, John M; Maines, Anastasia P; Francis, Clinton D; Melbourne, Brett

    2014-01-01

    We compared learning cycle and expository formats for teaching about plant biodiversity in an inquiry-oriented university biology lab class (n = 465). Both formats had preparatory lab activities, a hands-on lab, and a postlab with reflection and argumentation. Learning was assessed with a lab report, a practical quiz in lab, and a multiple-choice exam in the concurrent lecture. Attitudes toward biology and treatments were also assessed. We used linear mixed-effect models to determine impacts of lab style on lower-order cognition (LO) and higher-order cognition (HO) based on Bloom's taxonomy. Relative to the expository treatment, the learning cycle treatment had a positive effect on HO and a negative effect on LO included in lab reports; a positive effect on transfer of LO from the lab report to the quiz; negative impacts on LO quiz performance and on attitudes toward the lab; and a higher degree of perceived difficulty. The learning cycle treatment had no influence on transfer of HO from lab report to quiz or exam; quiz performance on HO questions; exam performance on LO and HO questions; and attitudes toward biology as a science. The importance of LO as a foundation for HO relative to these lab styles is addressed. © 2014 J. M. Basey et al. CBE—Life Sciences Education © 2014 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).

  14. Virtual Reality Lab Assistant

    NASA Technical Reports Server (NTRS)

    Saha, Hrishikesh; Palmer, Timothy A.

    1996-01-01

    The Virtual Reality Lab Assistant (VRLA) demonstration model is designed for engineering and materials science experiments to be performed by undergraduate and graduate students as a pre-lab simulation experience. This will help students get a preview of how to use the lab equipment and run experiments without using the lab hardware/software. The quality of the time available for laboratory experiments can be significantly improved through the use of virtual reality technology.

  15. Multi-User Facilities for Molecular Marine Biology and Biotechnology

    DTIC Science & Technology

    1990-04-06

    relatedness (or non-relatedness) of symbiotic zooxanthellae in corals. The DNA synthesizer is used to prepare universal primers for 18S rRNA. ...tunicates (Weissman lab); Robert Rowan - zooxanthellae (Powers lab); Lani West - sponges/barnacles (Powers lab); Jeff Mitton - Mytilus (Powers lab); Kristi

  16. Novartis School Lab: bringing young people closer to the world of research and discovering the excitement of science.

    PubMed

    Michel, Christiane Röckl; Standke, Gesche; Naef, Reto

    2012-01-01

    The Novartis School Lab (http://www.novartis.ch/schullabor) is an institution with a long tradition. The School Lab reaches about 5000 students through internal courses and an additional 5000 children at public science events, where they can enjoy hands-on science in disciplines of biomedical research. The subjects range from chemistry, physics, molecular biology and genetics to toxicology and medical topics. The Novartis School Lab offers a variety of activities for youngsters aged 10-20, ranging from lab courses for school classes, continuing education for teachers, and development of teaching kits to support for individual research projects and outreach at public science events. Innovation and adaptation to current needs are essential for the Novartis School Lab. Ongoing activities to shape the Novartis Biomedical Learning Lab include the design of new teaching experiments, exploration of additional disciplines of biomedical science, and the creation of a fascinating School Lab of the future.

  17. Integrating Robotic Observatories into Astronomy Labs

    NASA Astrophysics Data System (ADS)

    Ruch, Gerald T.

    2015-01-01

    The University of St. Thomas (UST) and a consortium of five local schools are using the UST Robotic Observatory, housing a 17-inch telescope, to develop labs and image processing tools that allow easy integration of observational labs into existing introductory astronomy curricula. Our lab design removes the burden of equipment ownership by sharing access to a common resource, and removes the burden of data processing by automating processing tasks that are not relevant to the learning objectives. Each laboratory exercise takes place over two lab periods. During period one, students design and submit observation requests via the lab website. Between periods, the telescope automatically acquires the data and our image processing pipeline produces data ready for student analysis. During period two, the students retrieve their data from the website and perform the analysis. The first lab, 'Weighing Jupiter,' was successfully implemented at UST and several of our partner schools. We are currently developing a second lab to measure the age of and distance to a globular cluster.

  18. An analysis of high school students' perceptions and academic performance in laboratory experiences

    NASA Astrophysics Data System (ADS)

    Mirchin, Robert Douglas

    This research study investigates student laboratory (i.e., lab) learning based on students' perceptions of their experiences, gathered through questionnaire data, and on their science-laboratory performance, measured with paper-and-pencil assessments using Maryland-mandated criteria, Montgomery County Public Schools (MCPS) criteria, and published laboratory questions. A 20-item questionnaire consisting of 18 Likert-scale items and 2 open-ended items addressing what students liked most and least about lab was administered to students before labs were observed. A pre-test and post-test assessing laboratory achievement were administered before and after the laboratory experiences. The three labs observed were soda distillation, stoichiometry, and separation of a mixture. Five significant results or correlations were found. For soda distillation, there were two positive correlations: student preference for analyzing data was positively correlated with achievement on the data analysis dimension of the lab rubric, and student preference for using numbers and graphs to analyze data was positively correlated with achievement on the analysis dimension of the lab rubric. For the separating-a-mixture lab data, the following pairs of correlations were significant: student preference for doing chemistry labs where numbers and graphs were used to analyze data had a positive correlation with writing a correctly worded hypothesis, and student responses that lab experiences help them learn science positively correlated with achievement on the data dimension of the lab rubric. The only negative correlation related to the first result: students' preference for computers was inversely correlated with their performance in analyzing data in their lab report. Other findings included the following: students like actual experimental work most, and the write-up and analysis of a lab the least. 
It is recommended that lab science instruction be inquiry-based, hands-on, and that students be tested for lab content acquisition. The final conclusion of the study is that students expressed a preference for working in groups and working with materials and equipment as opposed to individual, non-group work and analyzing data.

  19. MatLab Programming for Engineers Having No Formal Programming Knowledge

    NASA Technical Reports Server (NTRS)

    Shaykhian, Linda H.; Shaykhian, Gholam Ali

    2007-01-01

    MatLab is one of the most widely used very high-level programming languages for scientific and engineering computations. It is very user-friendly and requires practically no formal programming knowledge. Presented here are MatLab programming aspects, not just MatLab commands, for scientists and engineers who do not have formal programming training and have no significant time to spare for learning programming to solve their real-world problems. Specifically, programs for visualization are provided. Also stated are the current limitations of MatLab, which could possibly be addressed by MathWorks Inc. in a future version to make MatLab more versatile.

  20. Department of Chemistry and Biochemistry - University of Maryland,

    Science.gov Websites


  1. Design and Analysis of the Warm-to-Cold Suspension Links for Jefferson Lab's 11 GeV/c Super High Momentum Spectrometer

    NASA Astrophysics Data System (ADS)

    Sun, E.; Brindza, P.; Lassiter, S.; Fowler, M.

    2010-04-01

    This paper describes the design and analysis performed for the warm-to-cold suspension links of the warm-iron-yoke superconducting quadrupole magnets and the superconducting dipole magnet. The results of an investigation of titanium Ti-6Al-4V and Nitronic 50 stainless steel for the suspension links, which must support the cold mass, preloads, forces due to cryogenic temperature, and imbalanced magnetic forces from misalignments, are presented. Allowable stresses in normal-case and worst-case scenarios, space constraints, and heat leak considerations are discussed. Principles of the ASME Pressure Vessel Code were used to determine allowable stresses. Optimal angles of the suspension links were obtained by calculation and finite element methods. The stress levels of the suspension links in multiple scenarios are presented, discussed, and compared with the allowable stresses.

  2. MetaboAnalystR: an R package for flexible and reproducible analysis of metabolomics data.

    PubMed

    Chong, Jasmine; Xia, Jianguo

    2018-06-28

    The MetaboAnalyst web application has been widely used for metabolomics data analysis and interpretation. Despite its user-friendliness, the web interface has inherent limitations (especially for advanced users) with regard to flexibility in creating customized workflows, support for reproducible analysis, and capacity for dealing with large data. To address these limitations, we have developed a companion R package (MetaboAnalystR) based on the R code base of the web server. The package has been thoroughly tested to ensure that the same R commands will produce identical results from both interfaces. MetaboAnalystR complements the MetaboAnalyst web server to facilitate transparent, flexible and reproducible analysis of metabolomics data. MetaboAnalystR is freely available from https://github.com/xia-lab/MetaboAnalystR. Supplementary data are available at Bioinformatics online.

  3. Make safety awareness a priority: Use a login software in your research facility

    DOE PAGES

    Camino, Fernando E.

    2017-01-21

    We report on a facility login software whose objective is to improve safety in multi-user research facilities. Its most important safety features are: 1) it blocks users from entering the lab after being absent for more than a predetermined number of days; 2) it gives users a random safety quiz question, which they need to answer satisfactorily in order to use the facility; 3) it blocks unauthorized users from using the facility afterhours; and 4) it displays the current users in the facility. Besides restricting access by unauthorized users, the software keeps users mindful of key safety concepts. In addition, integration of the software with a door controller system can convert it into an effective physical safety mechanism. Depending on DOE approval, the code may be available as open source.
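    The checks listed above amount to simple gating logic at login. A sketch of two of them (the absence limit and afterhours authorization); the 30-day limit, closing hours, field names, and messages are assumptions, not details from the paper:

```python
from datetime import datetime, timedelta

ABSENCE_LIMIT_DAYS = 30   # assumed threshold; the paper calls it "predetermined"
AFTERHOURS = (20, 7)      # hypothetical: afterhours runs 8 pm to 7 am

def may_enter(user, now):
    """Mimic two of the login checks: recent activity and afterhours rights."""
    if now - user["last_visit"] > timedelta(days=ABSENCE_LIMIT_DAYS):
        return False, "retraining required after long absence"
    in_afterhours = now.hour >= AFTERHOURS[0] or now.hour < AFTERHOURS[1]
    if in_afterhours and not user["afterhours_authorized"]:
        return False, "afterhours access not authorized"
    return True, "ok"

now = datetime(2024, 5, 10, 14, 0)
user = {"last_visit": now - timedelta(days=10), "afterhours_authorized": False}
print(may_enter(user, now))
```

    The quiz question and the display of current users would sit on top of the same gate: a wrong quiz answer simply becomes another reason to return False before unlocking the door controller.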

  4. NASA/Army Rotorcraft Transmission Research, a Review of Recent Significant Accomplishments

    NASA Technical Reports Server (NTRS)

    Krantz, Timothy L.

    1994-01-01

    A joint helicopter transmission research program between NASA Lewis Research Center and the U.S. Army Research Lab has existed since 1970. Research goals are to reduce weight and noise while increasing life, reliability, and safety. These research goals are achieved by the NASA/Army Mechanical Systems Technology Branch through both in-house research and cooperative research projects with university and industry partners. Some recent significant technical accomplishments produced by this cooperative research are reviewed. The following research projects are reviewed: oil-off survivability of tapered roller bearings, design and evaluation of high contact ratio gearing, finite element analysis of spiral bevel gears, computer numerical control grinding of spiral bevel gears, gear dynamics code validation, computer program for life and reliability of helicopter transmissions, planetary gear train efficiency study, and the Advanced Rotorcraft Transmission (ART) program.

  5. MICA: Multiple interval-based curve alignment

    NASA Astrophysics Data System (ADS)

    Mann, Martin; Kahle, Hans-Peter; Beck, Matthias; Bender, Bela Johannes; Spiecker, Heinrich; Backofen, Rolf

    2018-01-01

    MICA enables the automatic synchronization of discrete data curves. To this end, characteristic points of the curves' shapes are identified. These landmarks are used within a heuristic curve registration approach to align profile pairs by mapping similar characteristics onto each other. In combination with a progressive alignment scheme, this enables the computation of multiple curve alignments. Multiple curve alignments are needed to derive meaningful representative consensus data of measured time or data series. MICA was already successfully applied to generate representative profiles of tree growth data based on intra-annual wood density profiles or cell formation data. The MICA package provides a command-line and graphical user interface. The R interface enables the direct embedding of multiple curve alignment computation into larger analysis pipelines. Source code, binaries and documentation are freely available at https://github.com/BackofenLab/MICA
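    Landmark-based registration of the kind MICA performs can be illustrated with a toy sketch: characteristic points (here, simply local extrema) are detected, and one curve's index axis is warped piecewise-linearly so its landmarks line up with a reference's. This is only an illustration under invented names, not MICA's actual algorithm:

```python
# Toy sketch of landmark-based curve registration in the spirit of MICA.
# This is not the MICA algorithm itself; function names are invented.

def landmarks(y):
    """Indices of interior local extrema, used as characteristic shape points."""
    return [i for i in range(1, len(y) - 1)
            if (y[i] - y[i - 1]) * (y[i + 1] - y[i]) < 0]

def warp_index(i, src_marks, ref_marks, n):
    """Piecewise-linear index mapping so src landmarks land on ref landmarks."""
    src = [0] + src_marks + [n - 1]
    ref = [0] + ref_marks + [n - 1]
    for a0, a1, b0, b1 in zip(src, src[1:], ref, ref[1:]):
        if a0 <= i <= a1:
            t = (i - a0) / (a1 - a0) if a1 > a0 else 0.0
            return b0 + t * (b1 - b0)
    return float(i)
```

    For two length-7 curves whose single peak sits at index 3 and index 5 respectively, `warp_index(3, [3], [5], 7)` maps the peak onto the reference peak at 5.0, while the endpoints stay fixed.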

  6. Biospark: scalable analysis of large numerical datasets from biological simulations and experiments using Hadoop and Spark.

    PubMed

    Klein, Max; Sharma, Rati; Bohrer, Chris H; Avelis, Cameron M; Roberts, Elijah

    2017-01-15

    Data-parallel programming techniques can dramatically decrease the time needed to analyze large datasets. While these methods have provided significant improvements for sequencing-based analyses, other areas of biological informatics have not yet adopted them. Here, we introduce Biospark, a new framework for performing data-parallel analysis on large numerical datasets. Biospark builds upon the open source Hadoop and Spark projects, bringing domain-specific features for biology. Source code is licensed under the Apache 2.0 open source license and is available at the project website: https://www.assembla.com/spaces/roberts-lab-public/wiki/Biospark. Contact: eroberts@jhu.edu. Supplementary data are available at Bioinformatics online.

  7. The Microsoft Biology Foundation Applications for High-Throughput Sequencing

    PubMed Central

    Mercer, S.

    2010-01-01

    The need for reusable libraries of bioinformatics functions has been recognized for many years, and a number of language-specific toolkits have been constructed. Such toolkits have served as valuable nucleation points for the community, promoting the sharing of code and establishing standards. The majority of DNA sequencing machines and many other standard pieces of lab equipment are controlled by PCs running Windows, and a Microsoft genomics toolkit would enable initial processing and quality control to happen closer to the instrumentation and provide opportunities for added-value services within core facilities. The Microsoft Biology Foundation (MBF) is an open source software library, freely available for both commercial and academic use, available as an early-stage beta from mbf.codeplex.com. This presentation will describe the structure and goals of MBF and demonstrate some of its uses.

  8. Bounce frequency fishbone analysis

    NASA Astrophysics Data System (ADS)

    White, Roscoe; Fredrickson, Eric; Chen, Liu

    2002-11-01

    Large amplitude bursting modes are observed on NSTX, which are identified as bounce frequency fishbone modes (PDX Group, Princeton Plasma Physics Lab, Phys. Rev. Lett. 50, 891 (1983); L. Chen, R. B. White, and M. N. Rosenbluth, Phys. Rev. Lett. 52, 1122 (1984)). The identification is carried out using numerical equilibria obtained from TRANSP (R. V. Budny, M. G. Bell, A. C. Janos et al., Nucl. Fusion 35, 1497 (1995)) and the numerical guiding center code ORBIT (R. B. White, Phys. Fluids B 2(4), 845 (1990)). These modes are important for high energy particle distributions which have large average bounce angle, such as the nearly tangentially injected beam ions in NSTX and isotropic alpha particle distributions. They are particularly important in high-q, low-shear advanced plasma scenarios. Different ignited plasma scenarios are investigated with these modes in view.

  9. Make safety awareness a priority: Use a login software in your research facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Camino, Fernando E.

    We report on a facility login software whose objective is to improve safety in multi-user research facilities. Its most important safety features are: 1) it blocks users from entering the lab after they have been absent for more than a predetermined number of days; 2) it presents users with a random safety quiz question, which they must answer satisfactorily in order to use the facility; 3) it blocks unauthorized users from using the facility after hours; and 4) it displays the current users in the facility. Besides restricting access to unauthorized users, the software keeps users mindful of key safety concepts. In addition, integration of the software with a door controller system can convert it into an effective physical safety mechanism. Depending on DOE approval, the code may be available as open source.

  10. The General Mission Analysis Tool (GMAT): Current Features And Adding Custom Functionality

    NASA Technical Reports Server (NTRS)

    Conway, Darrel J.; Hughes, Steven P.

    2010-01-01

    The General Mission Analysis Tool (GMAT) is a software system for trajectory optimization, mission analysis, trajectory estimation, and prediction developed by NASA, the Air Force Research Lab, and private industry. GMAT's design and implementation are based on four basic principles: open source visibility for both the source code and design documentation; platform independence; modular design; and user extensibility. The system, released under the NASA Open Source Agreement, runs on Windows, Mac and Linux. User extensions, loaded at run time, have been built for optimization, trajectory visualization, force model extension, and estimation, by parties outside of GMAT's development group. The system has been used to optimize maneuvers for the Lunar Crater Observation and Sensing Satellite (LCROSS) and ARTEMIS missions and is being used for formation design and analysis for the Magnetospheric Multiscale Mission (MMS).

  11. Effectiveness of a Lab Manual Delivered on CD-ROM

    ERIC Educational Resources Information Center

    Brickman, Peggy; Ketter, Catherine A. Teare; Pereira, Monica

    2005-01-01

    Although electronic instructional media are becoming increasingly prevalent in science classrooms, their worth remains unproven. Here, student perceptions and performance using CD-ROM delivery of lab materials are assessed. Numerous learning barriers that produced lower lab grades for students using a CD-ROM lab manual in comparison to a print…

  12. Student Plagiarism and Faculty Responsibility in Undergraduate Engineering Labs

    ERIC Educational Resources Information Center

    Parameswaran, Ashvin; Devi, Poornima

    2006-01-01

    In undergraduate engineering labs, lab reports are routinely copied. By ignoring this form of plagiarism, teaching assistants and lab technicians neglect their role responsibility. By designing courses that facilitate it, however inadvertently, professors neglect their causal responsibility. Using the case of one university, we show via interviews…

  13. Experiential Learning of Digital Communication Using LabVIEW

    ERIC Educational Resources Information Center

    Zhan, Wei; Porter, Jay R.; Morgan, Joseph A.

    2014-01-01

    This paper discusses the design and implementation of laboratories and course projects using LabVIEW in an instrumentation course. The pedagogical challenge is to enhance students' learning of digital communication using LabVIEW. LabVIEW was extensively used in the laboratory sessions, which better prepared students for the course projects. Two…

  14. Blood Count Tests - Multiple Languages

    MedlinePlus

    Patient education resource "Your Lab Tests," available as PDFs in English paired with translations in multiple languages, including Arabic (العربية) and Chinese, Traditional (Cantonese dialect) (繁體中文).

  15. Computers, Networks, and Desegregation at San Jose High Academy.

    ERIC Educational Resources Information Center

    Solomon, Gwen

    1987-01-01

    Describes magnet high school which was created in California to meet desegregation requirements and emphasizes computer technology. Highlights include local computer networks that connect science and music labs, the library/media center, business computer lab, writing lab, language arts skills lab, and social studies classrooms; software; teacher…

  16. Detailed Project Report and Environmental Assessment. Section 111. Shores East of Diked Disposal Area, Lorain Harbor, Ohio.

    DTIC Science & Technology

    1981-11-01

    Index entries reference the Presque Isle project and the Cleveland West Breakwater rehabilitation. Key words: beach erosion, diked disposal areas, shore erosion, Lake Erie. The report references House Document No. 229, 83rd Congress, "Appendix VIII, Ohio Shoreline of Lake Erie Between Vermilion and Sheffield Lake Village, Beach Erosion Control."

  17. Improving the Quality of Lab Reports by Using Them as Lab Instructions

    NASA Astrophysics Data System (ADS)

    Haagen-Schuetzenhoefer, Claudia

    2012-10-01

    Lab exercises are quite popular in teaching science. Teachers have numerous goals in mind when teaching science laboratories. Nevertheless, empirical research draws a heterogeneous picture of the benefits of lab work. Research has shown that it does not necessarily contribute to the enhancement of practical abilities or content knowledge. Lab activities are frequently based on recipe-like, step-by-step instructions ("cookbook style"), which do not motivate students to engage cognitively. Consequently, students put the emphasis on "task completion" or "manipulating equipment."

  18. GeneLab: Open Science For Exploration

    NASA Technical Reports Server (NTRS)

    Galazka, Jonathan

    2018-01-01

    The NASA GeneLab project capitalizes on multi-omic technologies to maximize the return on spaceflight experiments. The GeneLab project houses spaceflight and spaceflight-relevant multi-omics data in a publicly accessible data commons, and collaborates with NASA-funded principal investigators to maximize the omics data from spaceflight and spaceflight-relevant experiments. I will discuss the current status of GeneLab and give specific examples of how the GeneLab data system has been used to gain insight into how biology responds to spaceflight conditions.

  19. The Effect of Triage on Patient Flow in an Outpatient Clinic.

    DTIC Science & Technology

    1979-12-01

    returns the records to the reception desk, and then checks his in-basket to call his next patient. If lab tests or X-rays are indicated and the patient is... Measures tracked include waiting times; provider service time and total service time (from the start and end of provider service, including a second provider's start and end of service); the percentage of lab tests ordered by the screener; the percentage of lab tests ordered by the provider; and referrals.

  20. Results of a 90-day safety assurance study with rats fed grain from corn borer-protected corn.

    PubMed

    Hammond, B G; Dudek, R; Lemen, J K; Nemeth, M A

    2006-07-01

    The results of a 90-day rat feeding study with grain from MON 810 corn (YieldGard Cornborer -- YieldGard Cornborer is a registered trademark of Monsanto Technology, LLC) that is protected against feeding damage from corn and stalk boring lepidopteran insects are presented. Corn borer protection was accomplished through the introduction of cry1Ab coding sequences into the corn genome for in planta production of a bioactive form of Cry1Ab protein. Grain from MON 810 and its near-isogenic control was separately formulated into rodent diets at levels of 11% and 33% (w/w) by Purina Mills, Inc. (PMI). All diets were nutritionally balanced and conformed to PMI specifications for Certified LabDiet (PMI Certified LabDiet 5002 is a registered trademark of Purina Mills, Inc.) 5002. There were a total of 400 rats in the study divided into 10 groups of 20 rats/sex/group. The responses of rats fed diets containing MON 810 were compared to those of rats fed grain from conventional corn varieties. Overall health, body weight, food consumption, clinical pathology parameters (hematology, blood chemistry, urinalysis), organ weights, and gross and microscopic appearance of tissues were comparable between groups fed diets containing MON 810 and conventional corn varieties. This study complements extensive agronomic, compositional and farm animal feeding studies with MON 810 grain, confirming that it is as safe and nutritious as grain from existing commercial corn varieties.

  1. Airplane numerical simulation for the rapid prototyping process

    NASA Astrophysics Data System (ADS)

    Roysdon, Paul F.

    Airplane Numerical Simulation for the Rapid Prototyping Process is a comprehensive research investigation into the most up-to-date methods for airplane development and design. Uses of modern engineering software tools, like MATLAB and Excel, are presented with examples of batch and optimization algorithms that combine the computing power of MATLAB with robust aerodynamic tools like XFOIL and AVL. The resulting data are demonstrated in the development and use of a full non-linear six-degrees-of-freedom simulator. The applications for this numerical toolbox range from unmanned aerial vehicles to first-order analysis of manned aircraft. A blended-wing-body airplane is used for the analysis to demonstrate the flexibility of the code, from classic wing-and-tail configurations to less common configurations like the blended wing body. This configuration has been shown to have superior aerodynamic performance relative to classic wing-and-tube-fuselage counterparts, reduced sensitivity to aerodynamic flutter, and potential for increased engine noise abatement. Of course, without a classic tail elevator to damp the nose-up pitching moment, and without a vertical tail rudder to damp yaw and possible rolling aerodynamics, the challenges in lateral roll and yaw stability, as well as pitching moment, are not insignificant. This thesis work applies the tools necessary to perform airplane development and optimization on a rapid basis, demonstrating the strength of this toolbox through examples and comparison of the results to similar airplane performance characteristics published in the literature.

  2. A crowdsourcing workflow for extracting chemical-induced disease relations from free text

    PubMed Central

    Li, Tong Shu; Bravo, Àlex; Furlong, Laura I.; Good, Benjamin M.; Su, Andrew I.

    2016-01-01

    Relations between chemicals and diseases are among the most queried biomedical interactions. Although expert manual curation is the standard method for extracting these relations from the literature, it is expensive and impractical to apply to large numbers of documents, and therefore alternative methods are required. We describe here a crowdsourcing workflow for extracting chemical-induced disease relations from free text as part of the BioCreative V Chemical Disease Relation challenge. Five non-expert workers on the CrowdFlower platform were shown each potential chemical-induced disease relation highlighted in the original source text and asked to make binary judgments about whether the text supported the relation. Worker responses were aggregated through voting, and relations receiving four or more votes were predicted as true. On the official evaluation dataset of 500 PubMed abstracts, the crowd attained a 0.505 F-score (0.475 precision, 0.540 recall), with a maximum theoretical recall of 0.751 due to errors with named entity recognition. The total crowdsourcing cost was $1290.67 ($2.58 per abstract) and the work took a total of 7 h. A qualitative error analysis revealed that 46.66% of sampled errors were due to task limitations and gold standard errors, indicating that performance can still be improved. All code and results are publicly available at https://github.com/SuLab/crowd_cid_relex. Database URL: https://github.com/SuLab/crowd_cid_relex. PMID: 27087308
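    The aggregation rule and the headline score are easy to reproduce. A minimal sketch of the four-of-five voting rule, together with the standard F1 formula applied to the reported precision and recall (the function names are ours, not the paper's):

```python
# Sketch of the paper's vote aggregation and the standard F1 formula
# (function names are invented; thresholds and scores are from the abstract).

def aggregate(votes, threshold=4):
    """Predict a relation as true when at least `threshold` workers voted yes."""
    return sum(votes) >= threshold

def f_score(precision, recall):
    """Harmonic mean of precision and recall (F1)."""
    return 2 * precision * recall / (precision + recall)

print(round(f_score(0.475, 0.540), 3))  # reproduces the reported 0.505
```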

  3. One doll fits all: validation of the Leiden Infant Simulator Sensitivity Assessment (LISSA).

    PubMed

    Voorthuis, Alexandra; Out, Dorothée; van der Veen, Rixt; Bhandari, Ritu; van IJzendoorn, Marinus H; Bakermans-Kranenburg, Marian J

    2013-01-01

    Children vary hugely in how demanding of their caregivers they are. This creates differences in demands on parents during observation, making the comparison of sensitivity between parents difficult. It would therefore be of interest to create standard situations in which all caregivers are faced with the same level of demand. This study developed an ecologically valid but standardized setting using an infant simulator with interactive features, the Leiden Infant Simulator Sensitivity Assessment (LISSA). The infant simulator resembles a real infant in appearance and it produces crying sounds that are life-like. The simulator begins with fussing and progresses to more intense crying in case of no care or inappropriate care. It responds by being calm again if appropriate care is given. One hundred and eighty-one female participants took care of the infant simulator for two evenings and in a 30 min lab session with increasing competing demands. Sensitive parenting behavior during the lab session was coded with the Ainsworth Sensitivity Scale. Sensitivity ratings covered the whole range of the scale (1-9), and were stable across settings (free play, competing demands). Sensitivity was related to an increase of positive affect during caretaking, and insensitivity was related to intended harsh caregiving response during a computerized cry paradigm. Sensitivity was unrelated to social desirability and self-reported quality of care given to the infant simulator. We discuss the potentials of the infant simulator for research on sensitive parenting, for preventive interventions, and for clinical practices.

  4. Long Non-Coding RNA Malat1 Regulates Angiogenesis in Hindlimb Ischemia.

    PubMed

    Zhang, Xuejing; Tang, Xuelian; Hamblin, Milton H; Yin, Ke-Jie

    2018-06-11

    Angiogenesis is a complex process that depends on the delicate regulation of gene expression. Dysregulation of transcription during angiogenesis often leads to various human diseases. Emerging evidence has recently begun to show that long non-coding RNAs (lncRNAs) may mediate angiogenesis in both physiological and pathological conditions; concurrently, the underlying molecular mechanisms are largely unexplored. Previously, our lab identified metastasis-associated lung adenocarcinoma transcript 1 (Malat1) as an oxygen-glucose deprivation (OGD)-responsive endothelial lncRNA. Here we report that genetic deficiency of Malat1 leads to reduced blood vessel formation and local blood flow perfusion in mouse hind limbs at one to four weeks after hindlimb ischemia. Malat1 and vascular endothelial growth factor receptor 2 (VEGFR2) levels were found to be increased both in cultured mouse primary skeletal muscle microvascular endothelial cells (SMMECs) after 16 h OGD followed by 24 h reperfusion and in mouse gastrocnemius muscle that underwent hindlimb ischemia followed by 28 days of reperfusion. Moreover, Malat1 silencing by locked nucleic acid (LNA)-GapmeRs significantly reduced tube formation, cell migration, and cell proliferation in SMMEC cultures. Mechanistically, RNA subcellular isolation and RNA-immunoprecipitation experiments demonstrate that Malat1 directly targets VEGFR2 to facilitate angiogenesis. These results suggest that Malat1 regulates cell-autonomous angiogenesis through direct regulation of VEGFR2.

  5. Tools for open geospatial science

    NASA Astrophysics Data System (ADS)

    Petras, V.; Petrasova, A.; Mitasova, H.

    2017-12-01

    Open science uses open source to deal with reproducibility challenges in data and computational sciences. However, just using open source software or making the code public does not make the research reproducible. Moreover, the scientists face the challenge of learning new unfamiliar tools and workflows. In this contribution, we will look at a graduate-level course syllabus covering several software tools which make validation and reuse by a wider professional community possible. For the novices in the open science arena, we will look at how scripting languages such as Python and Bash help us reproduce research (starting with our own work). Jupyter Notebook will be introduced as a code editor, data exploration tool, and a lab notebook. We will see how Git helps us not to get lost in revisions and how Docker is used to wrap all the parts together using a single text file so that figures for a scientific paper or a technical report can be generated with a single command. We will look at examples of software and publications in the geospatial domain which use these tools and principles. Scientific contributions to GRASS GIS, a powerful open source desktop GIS and geoprocessing backend, will serve as an example of why and how to publish new algorithms and tools as part of a bigger open source project.

  6. Naturalistically observed conflict and youth asthma symptoms.

    PubMed

    Tobin, Erin T; Kane, Heidi S; Saleh, Daniel J; Naar-King, Sylvie; Poowuttikul, Pavadee; Secord, Elizabeth; Pierantoni, Wayne; Simon, Valerie A; Slatcher, Richard B

    2015-06-01

    To investigate the links between naturalistically observed conflict, self-reported caregiver-youth conflict, and youth asthma symptoms. Fifty-four youth with asthma (age range: 10-17 years) wore the Electronically Activated Recorder (EAR) for a 4-day period to assess interpersonal conflict and caregiver-youth conflict as they occur in daily life. Conflict also was assessed with baseline self-report questionnaires and daily diaries completed by youth participants and their caregivers. Asthma symptoms were assessed using daily diaries, baseline self-reports, and wheezing, as coded from the EAR. EAR-observed measures of conflict were strongly associated with self-reported asthma symptoms (both baseline and daily diaries) and wheezing coded from the EAR. Further, when entered together in regression analyses, youth daily reports of negative caregiver-youth interactions and EAR-observed conflict uniquely predicted asthma symptoms; only EAR-observed conflict was associated with EAR-observed wheezing. These findings demonstrate the potential impact of daily conflict on youth asthma symptoms and the importance of assessing conflict as it occurs in everyday life. More broadly, they point to the importance of formulating a clear picture of family interactions outside of the lab, which is essential for understanding how family relationships "get under the skin" to affect youth health.

  7. Cross-site comparison of ribosomal depletion kits for Illumina RNAseq library construction.

    PubMed

    Herbert, Zachary T; Kershner, Jamie P; Butty, Vincent L; Thimmapuram, Jyothi; Choudhari, Sulbha; Alekseyev, Yuriy O; Fan, Jun; Podnar, Jessica W; Wilcox, Edward; Gipson, Jenny; Gillaspy, Allison; Jepsen, Kristen; BonDurant, Sandra Splinter; Morris, Krystalynne; Berkeley, Maura; LeClerc, Ashley; Simpson, Stephen D; Sommerville, Gary; Grimmett, Leslie; Adams, Marie; Levine, Stuart S

    2018-03-15

    Ribosomal RNA (rRNA) comprises at least 90% of total RNA extracted from mammalian tissue or cell line samples. Informative transcriptional profiling using massively parallel sequencing technologies requires either enrichment of mature poly-adenylated transcripts or targeted depletion of the rRNA fraction. The latter method is of particular interest because it is compatible with degraded samples, such as those extracted from FFPE, and also captures transcripts that are not poly-adenylated, such as some non-coding RNAs. Here we provide a cross-site study that evaluates the performance of ribosomal RNA removal kits from Illumina, Takara/Clontech, Kapa Biosystems, Lexogen, New England Biolabs and Qiagen on intact and degraded RNA samples. We find that all of the kits are capable of performing significant ribosomal depletion, though there are differences in their ease of use. All kits were able to remove ribosomal RNA to below 20% with intact RNA and identify ~14,000 protein-coding genes from the Universal Human Reference RNA sample at >1 FPKM. Analysis of differentially detected genes between kits suggests that transcript length may be a key factor in library production efficiency. These results provide a roadmap for labs on the strengths of each of these methods and how best to utilize them.
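    The >1 FPKM detection threshold can be made concrete. A small sketch of the standard FPKM normalization (fragments per kilobase of transcript per million mapped fragments); the example numbers are invented, not taken from the study:

```python
# Sketch of the standard FPKM normalization behind the >1 FPKM threshold.
# The example fragment count, gene length, and library size are invented.

def fpkm(fragments, gene_length_bp, total_fragments):
    """Fragments per kilobase of transcript per million mapped fragments."""
    return fragments * 1e9 / (gene_length_bp * total_fragments)

def detected(fragments, gene_length_bp, total_fragments, threshold=1.0):
    """A gene counts as detected when its FPKM exceeds the threshold."""
    return fpkm(fragments, gene_length_bp, total_fragments) > threshold

# 50 fragments on a 2 kb transcript in a 20 M fragment library:
print(fpkm(50, 2000, 20_000_000))  # 1.25, so the gene counts as detected
```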

  8. Naturalistically-Observed Conflict and Youth Asthma Symptoms

    PubMed Central

    Tobin, Erin T.; Kane, Heidi S.; Saleh, Daniel J.; Naar-King, Sylvie; Poowuttikul, Pavadee; Secord, Elizabeth; Pierantoni, Wayne; Simon, Valerie; Slatcher, Richard B.

    2014-01-01

    Objective: To investigate the links between naturalistically observed conflict, self-reported caregiver-youth conflict, and youth asthma symptoms. Method: Fifty-four youth with asthma (aged 10-17) wore the Electronically Activated Recorder (EAR) for a 4-day period to assess interpersonal conflict and caregiver-youth conflict as they occur in daily life. Conflict also was assessed with baseline self-report questionnaires and daily diaries completed by the youth participants and their caregivers. Asthma symptoms were assessed via daily diaries, baseline self-reports, and wheezing as coded from the EAR. Results: EAR-observed measures of conflict were strongly associated with self-reported asthma symptoms (both baseline and daily diaries) and wheezing coded from the EAR. Further, when entered together in regression analyses, youth daily reports of negative caregiver-youth interactions and EAR-observed conflict uniquely predicted asthma symptoms; only EAR-observed conflict was associated with EAR-observed wheezing. Conclusions: These findings demonstrate the potential impact of daily conflict on youth asthma symptoms and the importance of assessing conflict as it occurs in everyday life. More broadly, they point to the importance of formulating a clear picture of family interactions outside of the lab, which is essential for understanding how family relationships “get under the skin” to affect youth health. PMID: 25222090

  9. Theoretical analysis and simulation of the influence of self-bunching effects and longitudinal space charge effects on the propagation of keV electron bunch produced by a novel S-band Micro-Pulse electron Gun

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhao, Jifei; Lu, Xiangyang, E-mail: xylu@pku.edu.cn; Yang, Ziqin

    As an important electron source, the Micro-Pulse electron Gun (MPG), which is capable of steadily producing high average current, short pulse, low emittance electron bunches, holds promise for use as an electron source for Coherent Smith-Purcell Radiation (CSPR) and Free Electron Lasers (FEL). Stable output of the S-band MPG has been achieved in many labs. To establish a reliable foundation for its future application, the propagation of the picosecond electron bunches produced by the MPG should be studied in detail. In this article, an MPG working on the rising stage of the total effective Secondary Electron Yield (SEY) curve was introduced. The self-bunching mechanism was discussed in depth, both in the multipacting amplifying state and in the steady working state. The bunch length broadening induced by longitudinal space-charge (SC) effects was investigated with different theoretical models in different regions. Simulations of the propagation were also performed with the 2D PIC code MAGIC and the beam dynamics code TraceWin. The results show excellent agreement between simulation and theoretical analysis for the bunch length evolution.

  10. MWIR hyperspectral imaging with the MIDAS instrument

    NASA Astrophysics Data System (ADS)

    Honniball, Casey I.; Wright, Rob; Lucey, Paul G.

    2017-02-01

    Hyperspectral imaging (HSI) in the Mid-Wave InfraRed (MWIR, 3-5 microns) can provide information on a variety of science applications, from determining the chemical composition of lava lakes on Jupiter's moon Io to investigating the amount of carbon liberated into the Earth's atmosphere during a wildfire. The limited signal available in the MWIR presents technical challenges to achieving high signal-to-noise ratios, and therefore it is typically necessary to cryogenically cool MWIR instruments. With recent improvements in microbolometer technology and emerging interferometric techniques, we have shown that uncooled microbolometers coupled with a Sagnac interferometer can achieve high signal-to-noise ratios for long-wave infrared HSI. To explore whether this technique can be applied to the MWIR, this project, with funding from NASA, has built the Miniaturized Infrared Detector of Atmospheric Species (MIDAS). Standard characterization tests are used to compare MIDAS against a cryogenically cooled photon detector to evaluate the MIDAS instrument's ability to quantify gas concentrations. Atmospheric radiative transfer codes are in development to explore the limitations of MIDAS and identify the range of science objectives at which MIDAS will most likely excel. We will simulate science applications with gas cells filled with varying gas concentrations and varying source temperatures to verify our results from lab characterization and our atmospheric modeling code.

  11. Indicators for the use of robotic labs in basic biomedical research: a literature analysis

    PubMed Central

    2017-01-01

    Robotic labs, in which experiments are carried out entirely by robots, have the potential to provide a reproducible and transparent foundation for performing basic biomedical laboratory experiments. In this article, we investigate whether these labs could be applicable in current experimental practice. We do this by text mining 1,628 papers for occurrences of methods that are supported by commercial robotic labs. Using two different concept recognition tools, we find that 86%–89% of the papers have at least one of these methods. This and our other results provide indications that robotic labs can serve as the foundation for performing many lab-based experiments. PMID:29134146
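    The core counting step reduces to asking, per paper, whether any supported method is mentioned. A toy sketch with an invented method list and naive substring matching, standing in for the concept recognition tools the study actually used:

```python
# Toy sketch of the counting step: the fraction of papers mentioning at least
# one supported method. The method list is invented, and naive substring
# matching stands in for the study's concept recognition tools.

SUPPORTED_METHODS = {"pcr", "elisa", "western blot"}

def mentions_supported_method(text):
    """True when any supported method name occurs in the paper's text."""
    text = text.lower()
    return any(method in text for method in SUPPORTED_METHODS)

def coverage(papers):
    """Fraction of papers with at least one supported method."""
    hits = sum(mentions_supported_method(p) for p in papers)
    return hits / len(papers)
```

    Applied to the study's corpus of 1,628 papers, this kind of count yielded the reported 86%-89% coverage, depending on the recognition tool.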

  12. Characterization of Lactic Acid Bacteria (LAB) isolated from Indonesian shrimp paste (terasi)

    NASA Astrophysics Data System (ADS)

    Amalia, U.; Sumardianto; Agustini, T. W.

    2018-02-01

    Shrimp paste is a fermented product that is popular as a taste enhancer in many dishes. It is produced by natural fermentation, which depends on the shrimp itself and on the presence of salt. The salt inhibits the growth of undesirable microorganisms and allows salt-tolerant lactic acid bacteria (LAB) to ferment the protein source to lactic acids. The objectives of this study were to characterize LAB isolated from Indonesian shrimp paste, or "terasi", at different fermentation times (30, 60 and 90 days). Vitech analysis showed that there were four strains of microorganisms referred to as lactic acid bacteria (named LABS1, LABS2, LABS3 and LABS4) with 95% sequence similarity. On the basis of biochemical characteristics, the four isolates represented Lactobacillus, for which the name Lactobacillus plantarum is proposed. L. plantarum plays a role in producing secondary metabolites, which give shrimp paste its umami flavor.

  13. Domain Adaptation Methods for Improving Lab-to-field Generalization of Cocaine Detection using Wearable ECG.

    PubMed

    Natarajan, Annamalai; Angarita, Gustavo; Gaiser, Edward; Malison, Robert; Ganesan, Deepak; Marlin, Benjamin M

    2016-09-01

    Mobile health research on illicit drug use detection typically involves a two-stage study design where data to learn detectors is first collected in lab-based trials, followed by a deployment to subjects in a free-living environment to assess detector performance. While recent work has demonstrated the feasibility of wearable sensors for illicit drug use detection in the lab setting, several key problems can limit lab-to-field generalization performance. For example, lab-based data collection often has low ecological validity, the ground-truth event labels collected in the lab may not be available at the same level of temporal granularity in the field, and there can be significant variability between subjects. In this paper, we present domain adaptation methods for assessing and mitigating potential sources of performance loss in lab-to-field generalization and apply them to the problem of cocaine use detection from wearable electrocardiogram sensor data.
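    The abstract does not detail the paper's specific methods; one standard domain-adaptation technique for lab-to-field covariate shift is importance weighting, sketched here on a toy one-dimensional feature (all values and bin edges are illustrative):

```python
# Covariate-shift importance weighting: lab samples are reweighted by
# w(x) = p_field(x) / p_lab(x) so that lab-derived statistics better
# match the field distribution.  (Illustrative only; not necessarily
# the exact method used in the paper.)

def histogram_density(samples, edges):
    """Normalized histogram density over fixed bin edges."""
    counts = [0] * (len(edges) - 1)
    for x in samples:
        for i in range(len(edges) - 1):
            if edges[i] <= x < edges[i + 1]:
                counts[i] += 1
                break
    total = sum(counts)
    return [c / total for c in counts]

def importance_weights(lab, field, edges):
    p_lab = histogram_density(lab, edges)
    p_field = histogram_density(field, edges)
    weights = []
    for x in lab:
        i = next(j for j in range(len(edges) - 1)
                 if edges[j] <= x < edges[j + 1])
        weights.append(p_field[i] / p_lab[i] if p_lab[i] > 0 else 0.0)
    return weights

# Toy heart-rate-like feature: lab sessions skew low, field data skews high
lab = [60, 62, 64, 66, 80, 82]
field = [60, 78, 80, 82, 84, 86]
edges = [55, 70, 90]
w = importance_weights(lab, field, edges)
weighted_mean = sum(wi * xi for wi, xi in zip(w, lab)) / sum(w)
```

    The reweighted lab mean lands close to the field mean, whereas the raw lab mean does not; the same idea extends to reweighting training examples for a detector.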

  14. Cantera and Cantera Electrolyte Thermodynamics Objects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    John Hewson, Harry Moffat

    Cantera is a suite of object-oriented software tools for problems involving chemical kinetics, thermodynamics, and/or transport processes. It is a multi-organizational effort to create and formulate high-quality 0D and 1D constitutive modeling tools for reactive transport codes. Institutions involved in the effort include Sandia, MIT, the Colorado School of Mines, U. Texas, NASA, and Oak Ridge National Labs. Specific to Sandia's contributions, the Cantera Electrolyte Thermo Objects (CETO) package comprises add-on routines for Cantera that handle electrolyte thermochemistry and reactions within the overall Cantera package. Cantera is a C++ Cal Tech code that handles gas-phase species transport, reaction, and thermodynamics. With this addition, Cantera can be extended to handle problems involving liquid-phase reactions and transport in electrolyte systems, and phase-equilibrium problems involving concentrated electrolytes and gas/solid phases. A full treatment of molten salt thermodynamics and transport has also been implemented in CETO. The routines themselves consist of .cpp and .h files containing C++ objects that are derived from parent Cantera objects representing thermodynamic functions. They are linked into the main Cantera libraries when requested by the user. As an addendum to the main thermodynamics objects, several utility applications are provided. The first is a multiphase Gibbs free energy minimizer based on the vcs algorithm, called vcs_cantera. This code allows for the calculation of thermodynamic equilibrium in multiple phases at constant temperature and pressure. Note that a similar code capability already exists in Cantera; this version follows the same algorithm but has a different code-base starting point, and is used as a research tool for algorithm development. The second program, cttables, prints out tables of thermodynamic and kinetic information for thermodynamic and kinetic objects within Cantera. This program serves as a "get the numbers out" utility for Cantera, and as such it is very useful as a verification tool. These add-on utilities are encapsulated into a directory structure named cantera_apps, whose installation uses autoconf and also utilizes Cantera's application environment (i.e., they utilize Cantera as a library).
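    The constant-temperature, constant-pressure Gibbs minimization that vcs_cantera performs can be illustrated without Cantera itself. The sketch below solves a hypothetical ideal-gas reaction A ⇌ 2B by brute-force minimization of the total Gibbs energy and checks the result against the law of mass action:

```python
import math

R = 8.314462618  # gas constant, J/(mol K)

def total_gibbs(x, dg_rxn, T, P=1.0, P0=1.0):
    """Dimensionless total Gibbs energy G/RT of an ideal-gas mixture
    for the toy reaction A <-> 2B, starting from 1 mol of A, at
    extent of reaction x.  dg_rxn (J/mol) is the standard reaction
    Gibbs energy, folded into the chemical potential of B."""
    n_a, n_b = 1.0 - x, 2.0 * x
    n_tot = n_a + n_b
    g = 0.0
    for n, g0 in ((n_a, 0.0), (n_b, dg_rxn / 2.0)):
        if n > 0.0:
            g += n * (g0 / (R * T) + math.log((n / n_tot) * (P / P0)))
    return g

def equilibrium_extent(dg_rxn, T, steps=20000):
    """Brute-force scan for the minimum of G(x) on (0, 1); production
    minimizers such as the VCS algorithm use smarter updates."""
    return min((i / steps for i in range(1, steps)),
               key=lambda x: total_gibbs(x, dg_rxn, T))

# At equilibrium the law of mass action holds:
# K = exp(-dG/RT) = y_B^2 / y_A at P = P0, so x = sqrt(K / (4 + K))
dg, T = 5000.0, 298.15
K = math.exp(-dg / (R * T))
x_analytic = math.sqrt(K / (4.0 + K))
x_numeric = equilibrium_extent(dg, T)
```

    The numerical minimum agrees with the analytic extent, which is the kind of cross-check a "get the numbers out" utility like cttables supports in practice.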

  15. Improving the Quality of Lab Reports by Using Them as Lab Instructions

    ERIC Educational Resources Information Center

    Haagen-Schuetzenhoefer, Claudia

    2012-01-01

    Lab exercises are quite popular in teaching science. Teachers have numerous goals in mind when teaching science laboratories. Nevertheless, empirical research draws a heterogeneous picture of the benefits of lab work. Research has shown that it does not necessarily contribute to the enhancement of practical abilities or content knowledge. Lab…

  16. The Development of MSFC Usability Lab

    NASA Technical Reports Server (NTRS)

    Cheng, Yiwei; Richardson, Sally

    2010-01-01

    This conference poster reviews the development of the usability lab at Marshall Space Flight Center (MSFC). The purpose of the lab was to integrate a fully functioning usability laboratory to provide a resource for future human factors assessments and to implement preliminary usability testing on an MSFC website to validate the functionality of the lab.

  17. LabLessons: Effects of Electronic Prelabs on Student Engagement and Performance

    ERIC Educational Resources Information Center

    Gryczka, Patrick; Klementowicz, Edward; Sharrock, Chappel; Maxfield, MacRae; Montclare, Jin Kim

    2016-01-01

    Lab instructors, for both high school and undergraduate college level courses, face issues of constricted time within the lab period and limited student engagement with prelab materials. To address these issues, an online prelab delivery system named LabLessons is developed and tested out in a high school chemistry classroom. The system…

  18. 75 FR 6997 - Federal Property Suitable as Facilities To Assist the Homeless

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-12

    ... Agency, 143 Billy Mitchell Blvd., Suite 1, San Antonio, TX 78226; (210) 925-3047; GSA: Gordon Creed... Research Lab Bldg. 247 Rome Lab Rome Co: Oneida NY 13441 Property Number: 18200340024 Status: Unutilized Comments: 13199 sq. ft., presence of asbestos, most recent use-- Electronic Research Lab Bldg. 248 Rome Lab...

  19. Lab at Home: Hardware Kits for a Digital Design Lab

    ERIC Educational Resources Information Center

    Oliver, J. P.; Haim, F.

    2009-01-01

    An innovative laboratory methodology for an introductory digital design course is presented. Instead of having traditional lab experiences, where students have to come to school classrooms, a "lab at home" concept is proposed. Students perform real experiments in their own homes, using hardware kits specially developed for this purpose. They…

  20. Outreach Science Education: Evidence-Based Studies in a Gene Technology Lab

    ERIC Educational Resources Information Center

    Scharfenberg, Franz-Josef; Bogner, Franz X.

    2014-01-01

    Nowadays, outreach labs are important informal learning environments in science education. After summarizing research to goals outreach labs focus on, we describe our evidence-based gene technology lab as a model of a research-driven outreach program. Evaluation-based optimizations of hands-on teaching based on cognitive load theory (additional…

  1. The Portable Usability Testing Lab: A Flexible Research Tool.

    ERIC Educational Resources Information Center

    Hale, Michael E.; And Others

    A group of faculty at the University of Georgia obtained funding for a research and development facility called the Learning and Performance Support Laboratory (LPSL). One of the LPSL's primary needs was obtaining a portable usability lab for software testing, so the facility obtained the "Luggage Lab 2000." The lab is transportable to…

  2. Can Graduate Teaching Assistants Teach Inquiry-Based Geology Labs Effectively?

    ERIC Educational Resources Information Center

    Ryker, Katherine; McConnell, David

    2014-01-01

    This study examines the implementation of teaching strategies by graduate teaching assistants (GTAs) in inquiry-based introductory geology labs at a large research university. We assess the degree of inquiry present in each Physical Geology lab and compare and contrast the instructional practices of new and experienced GTAs teaching these labs. We…

  3. Engaging Digital Natives

    ERIC Educational Resources Information Center

    Preusse-Burr, Beatrix

    2011-01-01

    Many classrooms have interactive whiteboards and several computers, and many schools are equipped with a computer lab and mobile labs. However, there typically are not enough computers for every student in each classroom; mobile labs are often shared between several members of a team and time in the computer labs needs to be scheduled in advance.…

  4. Constructing the Components of a Lab Report Using Peer Review

    ERIC Educational Resources Information Center

    Berry, David E.; Fawkes, Kelli L.

    2010-01-01

    A protocol that emphasizes lab report writing using a piecemeal approach coupled with peer review is described. As the lab course progresses, the focus of the report writing changes sequentially through the abstract and introduction, the discussion, and the procedure. Two styles of lab programs are presented. One style rotates the students through…

  5. Undergraduate Student Construction and Interpretation of Graphs in Physics Lab Activities

    ERIC Educational Resources Information Center

    Nixon, Ryan S.; Godfrey, T. J.; Mayhew, Nicholas T.; Wiegert, Craig C.

    2016-01-01

    Lab activities are an important element of an undergraduate physics course. In these lab activities, students construct and interpret graphs in order to connect the procedures of the lab with an understanding of the related physics concepts. This study investigated undergraduate students' construction and interpretation of graphs with best-fit…

  6. 77 FR 64143 - Manufacturer of Controlled Substances; Notice of Registration; Cambridge Isotope Lab

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-18

    ...; Notice of Registration; Cambridge Isotope Lab By Notice dated June 18, 2012, and published in the Federal Register on June 26, 2012, 77 FR 38086, Cambridge Isotope Lab, 50 Frontage Road, Andover, Massachusetts....C. 823(a) and determined that the registration of Cambridge Isotope Lab to manufacture the listed...

  7. Enhancing Communication Skills of Pre-service Physics Teacher through HOT Lab Related to Electric Circuit

    NASA Astrophysics Data System (ADS)

    Malik, A.; Setiawan, A.; Suhandi, A.; Permanasari, A.; Dirgantara, Y.; Yuniarti, H.; Sapriadil, S.; Hermita, N.

    2018-01-01

    This study aimed to investigate the improvement of pre-service teachers' communication skills through the Higher Order Thinking Laboratory (HOT Lab) on the electric circuit topic. This research used the quasi-experiment method with a pretest-posttest control group design. The research subjects were 60 students of Physics Education at UIN Sunan Gunung Djati Bandung. The sample was chosen by a random sampling technique. Students' communication skill data were collected using an essay-form communication skills test instrument and observation sheets. The results showed that pre-service teachers' communication skills using the HOT Lab were higher than with the verification lab. Students' communication skills in groups using the HOT Lab were not influenced by gender. Communication skills could increase because the HOT Lab is based on problem solving, which can develop communication through hands-on activities. Therefore, this research concludes that the application of the HOT Lab is more effective than the verification lab at improving the communication skills of pre-service teachers on the electric circuit topic, and that gender is not related to a person's communication skills.

  8. Experiences with lab-centric instruction

    NASA Astrophysics Data System (ADS)

    Titterton, Nathaniel; Lewis, Colleen M.; Clancy, Michael J.

    2010-06-01

    Lab-centric instruction emphasizes supervised, hands-on activities by substituting lab for lecture time. It combines a multitude of pedagogical techniques into the format of an extended, structured closed lab. We discuss the range of benefits for students, including increased staff interaction, frequent and varied self-assessments, integrated collaborative activities, and a systematic sequence of activities that gradually increases in difficulty. Instructors also benefit from a deeper window into student progress and understanding. We follow with discussion of our experiences in courses at U.C. Berkeley, and using data from some of these investigate the effects of lab-centric instruction on student learning, procrastination, and course pacing. We observe that the lab-centric format helped students on exams but hurt them on extended programming assignments, counter to our hypothesis. Additionally, we see no difference in self-ratings of procrastination and limited differences in ratings of course pace. We do find evidence that the students who choose to attend lab-centric courses are different in several important ways from students who choose to attend the same course in a non-lab-centric format.

  9. Learning Experience on Transformer Using HOT Lab for Pre-service Physics Teacher’s

    NASA Astrophysics Data System (ADS)

    Malik, A.; Setiawan, A.; Suhandi, A.; Permanasari, A.

    2017-09-01

    This study aimed at investigating pre-service teachers' critical thinking skills improvement through the Higher Order Thinking (HOT) Lab on transformer learning. This research used a mixed-methods approach with an embedded experimental model. The research subjects were 60 students of Physics Education at UIN Sunan Gunung Djati Bandung. Analysis of the practicum reports and observation sheets showed that students in the experimental group carried out the practicum better and could solve real problems, while the opposite held for the control group. The critical thinking skills of students applying the HOT Lab were higher than with the verification lab. Critical thinking skills could increase because the HOT Lab is based on problem solving, which can develop higher order thinking skills through laboratory activities. Therefore, it was concluded that the application of the HOT Lab was more effective than the verification lab at improving students' thinking skills in transformer topic learning. Finally, the HOT Lab can be implemented in other subject learning and could be used to improve other higher order thinking skills.

  10. Pathogen translocation and histopathological lesions in an experimental model of Salmonella Dublin infection in calves receiving lactic acid bacteria and lactose supplements

    PubMed Central

    Zbrun, María V.; Soto, Lorena P.; Bertozzi, Ezequiel; Sequeira, Gabriel J.; Marti, Luis E.; Signorini, Marcelo L.; Armesto, Roberto Rodríguez; Rosmini, Marcelo R.

    2012-01-01

    The purpose of this study was to evaluate the capacity of a lactic acid bacteria (LAB) inoculum to protect calves with or without lactose supplements against Salmonella Dublin infection by evaluating histopathological lesions and pathogen translocation. Fifteen calves were divided into three groups [control group (C-G), a group inoculated with LAB (LAB-G), and a group inoculated with LAB and given lactose supplements (L-LAB-G)] with five, six, and four animals, respectively. The inoculum, composed of Lactobacillus (L.) casei DSPV 318T, L. salivarius DSPV 315T, and Pediococcus acidilactici DSPV 006T, was administered with milk replacer. The LAB-G and L-LAB-G received a daily dose of 10^9 CFU/kg body weight of each strain throughout the experiment. Lactose was provided to the L-LAB-G in doses of 100 g/day. Salmonella Dublin (2 × 10^10 CFU) was orally administered to all animals on day 11 of the experiment. The microscopic lesion index values in target organs were 83%, 70%, and 64.3% (p < 0.05) for the C-G, LAB-G, and L-LAB-G, respectively. Administration of the probiotic inoculum was not fully effective against infection caused by Salmonella. Although probiotic treatment was unable to delay the arrival of pathogen to target organs, it was evident that the inoculum altered the response of animals against pathogen infection. PMID:23000583

  11. Hospital financing of ischaemic stroke: determinants of funding and usefulness of DRG subcategories based on severity of illness.

    PubMed

    Dewilde, Sarah; Annemans, Lieven; Pincé, Hilde; Thijs, Vincent

    2018-05-11

    Several Western and Arab countries, as well as over 30 States in the US, are using the "All-Patient Refined Diagnosis-Related Groups" (APR-DRGs) with four severity-of-illness (SOI) subcategories as a model for hospital funding. The aim of this study is to verify whether this is an adequate model for funding stroke hospital admissions, and to explore which risk factors and complications may influence the amount of funding. A bottom-up analysis of 2496 ischaemic stroke admissions in Belgium compares detailed in-hospital resource use (including length of stay, imaging, lab tests, visits and drugs) per SOI category and calculates total hospitalisation costs. A second analysis examines the relationship between the type and location of the index stroke, medical risk factors, patient characteristics, comorbidities and in-hospital complications on the one hand, and the funding level received by the hospital on the other hand. This dataset included 2513 hospitalisations reporting on 35,195 secondary diagnosis codes, all medically coded with the International Classification of Disease (ICD-9). Total costs per admission increased by SOI (€3710-€16,735), with severe patients costing proportionally more in bed days (86%), and milder patients costing more in medical imaging (24%). In all resource categories (bed days, medications, visits and imaging and laboratory tests), the absolute utilisation rate was higher among severe patients, but also showed more variability. SOI 1-2 was associated with vague, non-specific stroke-related ICD-9 codes as primary diagnosis (71-81% of hospitalisations). 24% of hospitalisations had, in addition to the primary diagnosis, other stroke-related codes as secondary diagnoses. Presence of lung infections, intracranial bleeding, severe kidney disease, and do-not-resuscitate status were each associated with extreme SOI (p < 0.0001).
APR-DRG with SOI subclassification is a useful funding model as it clusters stroke patients in homogenous groups in terms of resource use. The data on medical care utilisation can be used with unit costs from other countries with similar healthcare set-ups to 1) assess stroke-related hospital funding versus actual costs; 2) inform economic models on stroke prevention and treatment. The data on diagnosis codes can be used to 3) understand which factors influence hospital funding; 4) raise awareness about medical coding practices.

  12. Lactic acid bacteria contribution to gut microbiota complexity: lights and shadows

    PubMed Central

    Pessione, Enrica

    2012-01-01

    Lactic Acid Bacteria (LAB) are ancient organisms that cannot biosynthesize functional cytochromes and cannot get ATP from respiration. Besides sugar fermentation, they evolved electrogenic decarboxylations and ATP-forming deiminations. The right balance between sugar fermentation and decarboxylation/deimination ensures buffered environments, thus enabling LAB to survive in the human gastric tract and colonize the gut. A complex molecular cross-talk between LAB and the host exists. LAB moonlighting proteins are made in response to gut stimuli, promote bacterial adhesion to mucosa, and stimulate immune cells. Similarly, when LAB are present, human enterocytes activate the expression of specific genes only. Furthermore, LAB antagonistic relationships with other microorganisms constitute the basis for their anti-infective role. Histamine and tyramine are LAB bioactive catabolites that act on the CNS, causing hypertension and allergies. Nevertheless, some LAB biosynthesize both gamma-amino-butyrate (GABA), which has a relaxing effect on gut smooth muscles, and beta-phenylethylamine, which controls satiety and mood. Since LAB have reduced amino acid biosynthetic abilities, they developed a sophisticated proteolytic system that is also involved in the generation of antihypertensive and opioid peptides from milk proteins. Short-chain fatty acids are glycolytic and phosphoketolase end-products, regulating epithelial cell proliferation and differentiation. Nevertheless, they constitute a supplementary energy source for the host, causing weight gain. Human metabolism can also be affected by anabolic LAB products such as conjugated linoleic acids (CLA). Some CLA isomers reduce cancer cell viability and ameliorate insulin resistance, while others lower the HDL/LDL ratio and modify eicosanoid production, with detrimental health effects. A further appreciated LAB feature is the ability to fix selenium into seleno-cysteine, thus opening interesting perspectives for their utilization as antioxidant nutraceutical vectors. PMID:22919677

  13. Probiotics versus antibiotic decontamination of the digestive tract: infection and mortality.

    PubMed

    Oudhuis, Guy J; Bergmans, Dennis C; Dormans, Tom; Zwaveling, Jan-Harm; Kessels, Alfons; Prins, Martin H; Stobberingh, Ellen E; Verbon, Annelies

    2011-01-01

    Selective decontamination of the digestive tract (SDD) has been shown to decrease the infection rate and mortality in intensive care units (ICUs); Lactobacillus plantarum 299/299v plus fibre (LAB) has been used for infection prevention and does not harbour the potential disadvantages of antibiotics. The objective was to assess whether LAB is not inferior to SDD in infection prevention. Two hundred fifty-four consecutive ICU patients with expected mechanical ventilation ≥ 48 h and/or expected ICU stay ≥ 72 h were assigned to receive SDD: four times daily an oral paste (polymyxin E, gentamicin, amphotericin B), enteral solution (same antibiotics), intravenous cefotaxime (first 4 days) or LAB: two times daily L. plantarum 299/299v with rose-hip. The primary endpoint was infection rate. A difference <12% between both groups indicated non-inferiority of LAB. The trial was prematurely stopped after a study reporting increased mortality in critically ill pancreatitis patients receiving probiotics. No significant difference in infection rate [31% in the LAB group, 24% in the SDD group (OR 1.68, 95% CI 0.91-3.08; p = 0.10)] was found. ICU mortality was 26% and not significantly different between the LAB and SDD groups. Gram-positive cocci and Pseudomonas aeruginosa were significantly more frequently isolated from surveillance cultures in the SDD group compared to the LAB group (for sputum: 18 vs. 10% and 33 vs. 14%). Significantly more Enterobacteriaceae were found in the LAB group (23 vs. 50%). No increase in antibiotic resistance was found during and after SDD or LAB use. The trial could not demonstrate the non-inferiority of LAB compared with SDD in infection prevention. Results suggest no increased ICU mortality risk in the LAB group.
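    The trial's non-inferiority criterion (a difference in infection rates below 12%) can be sketched with a standard Wald interval for a risk difference. The per-arm sizes below are an assumption (the abstract reports only the 254-patient total), so the numbers are illustrative:

```python
import math

def risk_difference_ci(p1, n1, p2, n2, z=1.96):
    """Wald 95% confidence interval for the difference p1 - p2."""
    diff = p1 - p2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return diff - z * se, diff + z * se

# Reported rates: 31% infections with LAB, 24% with SDD; an equal
# split of the 254 patients is ASSUMED here for illustration.
lo, hi = risk_difference_ci(0.31, 127, 0.24, 127)
margin = 0.12  # non-inferiority margin from the trial design
noninferior = hi < margin  # upper CI bound must fall below the margin
```

    With these assumed arm sizes the upper bound of the interval exceeds the 12% margin, consistent with the trial's conclusion that non-inferiority of LAB could not be demonstrated.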

  14. Computer-based Astronomy Labs for Non-science Majors

    NASA Astrophysics Data System (ADS)

    Smith, A. B. E.; Murray, S. D.; Ward, R. A.

    1998-12-01

    We describe and demonstrate two laboratory exercises, Kepler's Third Law and Stellar Structure, which are being developed for use in an astronomy laboratory class aimed at non-science majors. The labs run with Microsoft's Excel 98 (Macintosh) or Excel 97 (Windows). They can be run in a classroom setting or in an independent learning environment. The intent of the labs is twofold; first and foremost, students learn the subject matter through a series of informational frames. Next, students enhance their understanding by applying their knowledge in lab procedures, while also gaining familiarity with the use and power of a widely-used software package and scientific tool. No mathematical knowledge beyond basic algebra is required to complete the labs or to understand the computations in the spreadsheets, although the students are exposed to the concepts of numerical integration. The labs are contained in Excel workbook files. In the files are multiple spreadsheets, which contain either a frame with information on how to run the lab, material on the subject, or one or more procedures. Excel's VBA macro language is used to automate the labs. The macros are accessed through button interfaces positioned on the spreadsheets. This is done intentionally so that students can focus on learning the subject matter and the basic spreadsheet features without having to learn advanced Excel features all at once. Students open the file and progress through the informational frames to the procedures. After each procedure, student comments and data are automatically recorded in a preformatted Lab Report spreadsheet. Once all procedures have been completed, the student is prompted for a filename in which to save their Lab Report. The lab reports can then be printed or emailed to the instructor. The files will have full worksheet and workbook protection, and will have a "redo" feature at the end of the lab for students who want to repeat a procedure.
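    The Kepler's Third Law exercise lends itself to a compact numerical check; here is a sketch in Python rather than the labs' Excel/VBA, using standard constants:

```python
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30     # solar mass, kg

def orbital_period_years(a_au):
    """Orbital period from Kepler's third law, T = 2*pi*sqrt(a^3 / GM)."""
    a_m = a_au * 1.496e11          # AU -> metres
    t_s = 2.0 * math.pi * math.sqrt(a_m**3 / (G * M_SUN))
    return t_s / (365.25 * 24 * 3600)

# In T^2 = a^3 form (a in AU, T in years): Mars at 1.524 AU
print(f"Mars: {orbital_period_years(1.524):.2f} yr")   # ~1.88 yr
```

    A spreadsheet version of the same formula is exactly the kind of procedure the lab workbooks automate, with the students' results captured into the preformatted Lab Report sheet.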

  15. Serum level of LOX-1 ligand containing ApoB is associated with increased carotid intima-media thickness in Japanese community-dwelling men, especially those with hypercholesterolemia LOX-1 ligand and IMT in Japanese.

    PubMed

    Okamura, Tomonori; Miura, Katsuyuki; Sawamura, Tatsuya; Kadota, Aya; Hisamatsu, Takashi; Fujiyoshi, Akira; Miyamatsu, Naomi; Takashima, Naoyuki; Miyagawa, Naoko; Kadowaki, Takashi; Ohkubo, Takayoshi; Murakami, Yoshitaka; Nakamura, Yasuyuki; Ueshima, Hirotsugu

    2016-01-01

    The serum level of LOX-1 ligand containing ApoB (LAB) may reflect atherogenicity better than usual lipid parameters; however, the relationship between LAB and carotid intima-media thickness (IMT) was not clear even in Asian populations. A total of 992 community-dwelling Japanese men, aged 40 to 79 years, were enrolled in the present study. Serum LAB levels were measured by enzyme-linked immunosorbent assays (ELISAs) with recombinant LOX-1 and monoclonal anti-apolipoprotein B antibody. Serum LAB levels (median [interquartile range], μg cs/L) were 5341 μg cs/L (4093-7125). The mean average IMT of the common carotid artery was highest in the fourth LAB quartile (842 μm) compared with the first quartile (797 μm) after adjustment for age, high-density lipoprotein cholesterol, triglyceride, body mass index, hypertension, diabetes, high sensitivity C-reactive protein, smoking, and alcohol drinking. However, this statistically significant difference was lost after further adjustment for total cholesterol (TC). After stratification using the combination of median LAB and hypercholesterolemia (serum TC ≥ 6.21 mmol/L and/or lipid-lowering medication), the adjusted mean average IMT (standard error) in the high LAB/hypercholesterolemia group was 886 μm (12.7), 856 μm (16.7) in the low LAB/hypercholesterolemia group, and 833 μm (8.4) in the low LAB/normal cholesterol group (P = .004). After further adjustment for TC, mean average IMT in the high LAB group was significantly higher than that measured in the low LAB group in hypercholesterolemic participants not taking lipid-lowering medication. Serum LAB was associated with an increased carotid IMT in Japanese men, especially those with hypercholesterolemia. Copyright © 2016 National Lipid Association. Published by Elsevier Inc. All rights reserved.

  16. Reducing unnecessary lab testing in the ICU with artificial intelligence.

    PubMed

    Cismondi, F; Celi, L A; Fialho, A S; Vieira, S M; Reti, S R; Sousa, J M C; Finkelstein, S N

    2013-05-01

    To reduce unnecessary lab testing by predicting when a proposed future lab test is likely to contribute information gain and thereby influence clinical management in patients with gastrointestinal bleeding. Recent studies have demonstrated that frequent laboratory testing does not necessarily relate to better outcomes. Data preprocessing, feature selection, and classification were performed and an artificial intelligence tool, fuzzy modeling, was used to identify lab tests that do not contribute an information gain. There were 11 input variables in total. Ten of these were derived from bedside monitor trends: heart rate, oxygen saturation, respiratory rate, temperature, blood pressure, and urine collections, as well as infusion products and transfusions. The final input variable was a previous value from one of the eight lab tests being predicted: calcium, PTT, hematocrit, fibrinogen, lactate, platelets, INR and hemoglobin. The outcome for each test was a binary framework defining whether a test result contributed information gain or not. Predictive modeling was applied to recognize unnecessary lab tests in a real world ICU database extract comprising 746 patients with gastrointestinal bleeding. Classification accuracy of necessary and unnecessary lab tests of greater than 80% was achieved for all eight lab tests. Sensitivity and specificity were satisfactory for all the outcomes. An average reduction of 50% of the lab tests was obtained. This is an improvement from previously reported similar studies with an average performance of 37% [1-3]. Reducing frequent lab testing and the potential clinical and financial implications are an important issue in intensive care. In this work we present an artificial intelligence method to predict the benefit of proposed future laboratory tests.
Using ICU data from 746 patients with gastrointestinal bleeding, and eleven measurements, we demonstrate high accuracy in predicting the likely information to be gained from proposed future lab testing for eight common GI related lab tests. Future work will explore applications of this approach to a range of underlying medical conditions and laboratory tests. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
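    The binary outcome framing, whether a repeat test contributes information gain, and its evaluation via sensitivity, specificity, and accuracy can be sketched as follows. The fixed change threshold is a hypothetical stand-in for the paper's fuzzy model over the 11 monitor-derived inputs:

```python
# Toy version of the binary outcome framing: a repeat lab test is
# labeled "informative" only if its value changes by more than a
# clinically meaningful delta from the previous value.

def informative(previous, current, delta):
    return abs(current - previous) > delta

def confusion_stats(pred, truth):
    """Sensitivity, specificity, accuracy from binary predictions."""
    tp = sum(p and t for p, t in zip(pred, truth))
    tn = sum((not p) and (not t) for p, t in zip(pred, truth))
    fp = sum(p and (not t) for p, t in zip(pred, truth))
    fn = sum((not p) and t for p, t in zip(pred, truth))
    return tp / (tp + fn), tn / (tn + fp), (tp + tn) / len(pred)

# Hematocrit pairs (previous, next) with ground-truth "informative" labels
pairs = [(30.0, 29.8), (30.0, 26.5), (28.0, 28.3), (27.0, 31.5)]
truth = [False, True, False, True]
pred = [informative(p, c, delta=2.0) for p, c in pairs]
sens, spec, acc = confusion_stats(pred, truth)
```

    In the actual study the classifier must predict informativeness before the test is drawn; the toy here only illustrates how the outcome labels and the reported accuracy metrics are defined.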

  17. Reducing unnecessary lab testing in the ICU with artificial intelligence

    PubMed Central

    Cismondi, F.; Celi, L.A.; Fialho, A.S.; Vieira, S.M.; Reti, S.R.; Sousa, J.M.C.; Finkelstein, S.N.

    2017-01-01

    Objectives To reduce unnecessary lab testing by predicting when a proposed future lab test is likely to contribute information gain and thereby influence clinical management in patients with gastrointestinal bleeding. Recent studies have demonstrated that frequent laboratory testing does not necessarily relate to better outcomes. Design Data preprocessing, feature selection, and classification were performed and an artificial intelligence tool, fuzzy modeling, was used to identify lab tests that do not contribute an information gain. There were 11 input variables in total. Ten of these were derived from bedside monitor trends: heart rate, oxygen saturation, respiratory rate, temperature, blood pressure, and urine collections, as well as infusion products and transfusions. The final input variable was a previous value from one of the eight lab tests being predicted: calcium, PTT, hematocrit, fibrinogen, lactate, platelets, INR and hemoglobin. The outcome for each test was a binary framework defining whether a test result contributed information gain or not. Patients Predictive modeling was applied to recognize unnecessary lab tests in a real world ICU database extract comprising 746 patients with gastrointestinal bleeding. Main results Classification accuracy of necessary and unnecessary lab tests of greater than 80% was achieved for all eight lab tests. Sensitivity and specificity were satisfactory for all the outcomes. An average reduction of 50% of the lab tests was obtained. This is an improvement from previously reported similar studies with an average performance of 37% [1–3]. Conclusions Reducing frequent lab testing and the potential clinical and financial implications are an important issue in intensive care. In this work we present an artificial intelligence method to predict the benefit of proposed future laboratory tests.
Using ICU data from 746 patients with gastrointestinal bleeding, and eleven measurements, we demonstrate high accuracy in predicting the likely information to be gained from proposed future lab testing for eight common GI related lab tests. Future work will explore applications of this approach to a range of underlying medical conditions and laboratory tests. PMID:23273628
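
The binary outcome described above can be illustrated with a minimal sketch: labeling a repeat test as "informative" when the new result moves by more than some fraction of the previous value. The rule and the threshold here are illustrative assumptions for exposition, not the paper's fuzzy-modeling method.

```python
# Illustrative sketch only: a simple relative-change rule for labeling a
# repeat lab test as contributing information, in the spirit of the binary
# information-gain outcome above. The 10% threshold is an assumption.

def is_informative(previous: float, current: float, rel_threshold: float = 0.1) -> bool:
    """Return True if the new result differs from the previous one by more
    than rel_threshold (fractional change), i.e. it plausibly adds information."""
    if previous == 0:
        return current != 0
    return abs(current - previous) / abs(previous) > rel_threshold

# A hematocrit that barely moved suggests the repeat test was uninformative;
# a large drop suggests it was informative.
print(is_informative(30.0, 30.5))
print(is_informative(30.0, 24.0))
```

In the paper this labeling is learned from data via fuzzy modeling over eleven inputs rather than a fixed threshold; the sketch only shows the shape of the outcome variable.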

  18. Strain-specific detection of orally administered canine jejunum-dominated Lactobacillus acidophilus LAB20 in dog faeces by real-time PCR targeted to the novel surface layer protein.

    PubMed

    Tang, Y; Saris, P E J

    2013-10-01

Lactobacillus acidophilus LAB20 has potential as a probiotic strain because it can be present at high numbers in the canine jejunum. To specifically detect LAB20 in dog faecal samples, a real-time PCR protocol was developed targeting the novel surface (S) layer protein gene of LAB20. The presence of the S-layer protein was verified by N-terminal sequencing of the approximately 50-kDa major band from an SDS-PAGE gel. The corresponding S-layer gene was amplified by inverse PCR using homology to known S-layers and sequenced. This novel S-layer protein has low sequence similarity to other S-layer proteins in the N-terminal region (32-211 aa, 7-39%), which enabled the design of strain-specific PCR primers. The primer set was used to study the intestinal persistence of LAB20 in a dog that was fed LAB20-fermented milk for 5 days. The results showed that LAB20 could still be detected in dog faecal samples at 10^4.53 DNA copies g^-1 six weeks post-administration. This suggests that LAB20 is a good candidate for studying the mechanisms behind its persistence and dominance in the canine intestine, and potentially for use as a canine probiotic. A real-time PCR method was developed to detect Lactobacillus acidophilus LAB20, a strain previously found to be dominant in the canine gastrointestinal (GI) tract. Quantitative detection was based on targeting the variable region of a novel S-layer protein found in LAB20, allowing specific enumeration of LAB20 in dog faeces. The results showed that the real-time PCR method was sensitive enough to be used in later intervention studies. Interestingly, LAB20 was found to persist in the dog GI tract for 6 weeks. Therefore, LAB20 is a good candidate for studying its colonization and potentially for use as a canine probiotic. © 2013 The Society for Applied Microbiology.
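
Quantitative real-time PCR results such as the copies-per-gram figure above are conventionally derived by inverting a linear standard curve, Ct = slope * log10(copies) + intercept. A minimal sketch follows; the slope and intercept values are illustrative assumptions (a slope near -3.32 corresponds to ~100% amplification efficiency), not the curve from this study.

```python
# Illustrative sketch: absolute quantification from a qPCR standard curve.
# slope/intercept are assumed example values, not taken from the paper.

def copies_from_ct(ct: float, slope: float = -3.32, intercept: float = 38.0) -> float:
    """Invert the linear standard curve Ct = slope * log10(copies) + intercept."""
    return 10 ** ((ct - intercept) / slope)

# Lower Ct (earlier amplification) implies more starting template.
print(f"{copies_from_ct(23.0):.3g}")
```

The per-gram value is then obtained by scaling for extraction volume and sample mass, which the sketch omits.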

  19. Is This Real Life? Is This Just Fantasy?: Realism and Representations in Learning with Technology

    NASA Astrophysics Data System (ADS)

    Sauter, Megan Patrice

Students often engage in hands-on activities during science learning; however, financial and practical constraints often limit the availability of these activities. Recent advances in technology have led to increases in the use of simulations and remote labs, which attempt to recreate hands-on science learning via computer. Remote labs and simulations are interesting from a cognitive perspective because they allow for different relations between representations and their referents. Remote labs are unique in that they provide a yoked representation, meaning that the representation of the lab on the computer screen is actually linked to that which it represents: a real scientific device. Simulations merely represent the lab and are not connected to any real scientific devices. However, the type of visual representations used in the lab may modify the effects of the lab technology. The purpose of this dissertation is to examine the relation between representation and technology and its effects on students' psychological experiences using online science labs. Undergraduates participated in two studies that investigated the relation between technology and representation. In the first study, participants performed either a remote lab or a simulation incorporating one of two visual representations, either a static image or a video of the equipment. Although participants in both lab conditions learned, participants in the remote lab condition had more authentic experiences. However, effects were moderated by the realism of the visual representation. Participants who saw a video were more invested and felt the experience was more authentic. In a second study, participants performed a remote lab and either saw the same video as in the first study, an animation, or the video and an animation. Most participants had an authentic experience because both representations evoked strong feelings of presence. 
However, participants who saw the video were more likely to believe the remote technology was real. Overall, the findings suggest that participants' experiences with technology were shaped by representation. Students had more authentic experiences using the remote lab than the simulation. However, incorporating visual representations that enhance presence made these experiences even more authentic and meaningful than afforded by the technology alone.

  20. KENNEDY SPACE CENTER, FLA. -- In the Space Life Sciences Lab, Lanfang Levine, with Dynamac Corp., transfers material into a sample bottle for analysis. She is standing in front of new equipment in the lab that will provide gas chromatography and mass spectrometry. The equipment will enable analysis of volatile compounds, such as from plants. The 100,000 square-foot facility houses labs for NASA’s ongoing research efforts, microbiology/microbial ecology studies and analytical chemistry labs. Also calling the new lab home are facilities for space flight-experiment and flight-hardware development, new plant growth chambers, and an Orbiter Environment Simulator that will be used to conduct ground control experiments in simulated flight conditions for space flight experiments. The SLS Lab, formerly known as the Space Experiment Research and Processing Laboratory or SERPL, provides space for NASA’s Life Sciences Services contractor Dynamac Corporation, Bionetics Corporation, and researchers from the University of Florida. NASA’s Office of Biological and Physical Research will use the facility for processing life sciences experiments that will be conducted on the International Space Station. The SLS Lab is the magnet facility for the International Space Research Park at KSC being developed in partnership with Florida Space Authority.

    NASA Image and Video Library

    2004-01-05


  1. acme: The Amendable Coal-Fire Modeling Exercise. A C++ Class Library for the Numerical Simulation of Coal-Fires

    NASA Astrophysics Data System (ADS)

    Wuttke, Manfred W.

    2017-04-01

At LIAG, we use numerical models to develop and enhance understanding of coupled transport processes and to predict the dynamics of the system under consideration. Topics include geothermal heat utilization, subrosion processes, and spontaneous underground coal fires. Although differences of detail make it inconvenient, if not impossible, to apply a single code implementation to all of these systems, their investigation follows similar paths: all depend on the solution of coupled transport equations. We thus saw a need for a modular code system with open access for the various communities, to maximize the shared synergistic effects. To this purpose we develop the oops! (open object-oriented parallel solutions) toolkit, a C++ class library for the numerical solution of mathematical models of coupled thermal, hydraulic, and chemical processes. This is used to develop problem-specific libraries such as acme (amendable coal-fire modeling exercise), a class library for the numerical simulation of coal fires, and applications such as kobra (Kohlebrand, German for coal fire), a numerical simulation code for standard coal-fire models. The basic principle of the oops! code system is the provision of data types for describing space- and time-dependent data fields, the terms of partial differential equations (PDEs), and their discretisation and solution methods. The coupling of different processes, each described by its particular PDE, is modeled by an automatic, timescale-ordered operator-splitting technique. acme is a derived, coal-fire-specific application library that depends on oops!. If specific functionalities of general interest are implemented and tested, they are assimilated into the main oops! library. Interfaces to external pre- and post-processing tools are easily implemented. The result is a construction kit that can be amended arbitrarily. 
With the kobra application constructed with acme, we study the processes and propagation of shallow coal-seam fires, in particular in Xinjiang, China, and analyze and interpret results from lab experiments.
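
The operator-splitting idea mentioned above, solving each coupled process separately over one time step, can be sketched on a toy 1-D diffusion-plus-source problem. This is a minimal illustration of sequential (Lie) splitting, not a reproduction of the oops!/acme classes or their automatic timescale ordering.

```python
# Illustrative sketch of sequential (Lie) operator splitting: advance the
# diffusion operator over dt, then the local source/reaction operator over dt.

def diffuse(u, dt, dx, kappa):
    """One explicit finite-difference step of du/dt = kappa * d2u/dx2
    with fixed (Dirichlet) boundary values."""
    r = kappa * dt / dx ** 2          # must satisfy r <= 0.5 for stability
    n = len(u)
    return [u[i] if i in (0, n - 1)
            else u[i] + r * (u[i - 1] - 2 * u[i] + u[i + 1])
            for i in range(n)]

def react(u, dt, source):
    """One step of the local source term du/dt = source."""
    return [ui + dt * source for ui in u]

def split_step(u, dt, dx, kappa, source):
    """Lie splitting: diffusion sub-step, then reaction sub-step."""
    return react(diffuse(u, dt, dx, kappa), dt, source)

u = [0.0] * 5 + [1.0] + [0.0] * 5     # initial hot spot in the middle
for _ in range(50):
    u = split_step(u, dt=0.1, dx=1.0, kappa=1.0, source=0.0)
```

With zero source the hot spot simply spreads and decays symmetrically; a nonzero `source` term would model heat release in the burning seam. In the real code system each process carries its own timescale, and the splitting order is chosen automatically.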

  2. Automated, Quantitative Cognitive/Behavioral Screening of Mice: For Genetics, Pharmacology, Animal Cognition and Undergraduate Instruction

    PubMed Central

    Gallistel, C. R.; Balci, Fuat; Freestone, David; Kheifets, Aaron; King, Adam

    2014-01-01

    We describe a high-throughput, high-volume, fully automated, live-in 24/7 behavioral testing system for assessing the effects of genetic and pharmacological manipulations on basic mechanisms of cognition and learning in mice. A standard polypropylene mouse housing tub is connected through an acrylic tube to a standard commercial mouse test box. The test box has 3 hoppers, 2 of which are connected to pellet feeders. All are internally illuminable with an LED and monitored for head entries by infrared (IR) beams. Mice live in the environment, which eliminates handling during screening. They obtain their food during two or more daily feeding periods by performing in operant (instrumental) and Pavlovian (classical) protocols, for which we have written protocol-control software and quasi-real-time data analysis and graphing software. The data analysis and graphing routines are written in a MATLAB-based language created to simplify greatly the analysis of large time-stamped behavioral and physiological event records and to preserve a full data trail from raw data through all intermediate analyses to the published graphs and statistics within a single data structure. The data-analysis code harvests the data several times a day and subjects it to statistical and graphical analyses, which are automatically stored in the "cloud" and on in-lab computers. Thus, the progress of individual mice is visualized and quantified daily. The data-analysis code talks to the protocol-control code, permitting the automated advance from protocol to protocol of individual subjects. The behavioral protocols implemented are matching, autoshaping, timed hopper-switching, risk assessment in timed hopper-switching, impulsivity measurement, and the circadian anticipation of food availability. Open-source protocol-control and data-analysis code makes the addition of new protocols simple. 
Eight test environments fit in a 48 in x 24 in x 78 in cabinet; two such cabinets (16 environments) may be controlled by one computer. PMID:24637442
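
The time-stamped event records described above lend themselves to simple derived measures. As a minimal sketch (not the authors' MATLAB-based code), here is how inter-event intervals per subject might be harvested from such records; the field names and event values are illustrative.

```python
# Illustrative sketch: deriving per-subject inter-event intervals from
# time-stamped behavioral records. Record layout is an assumption.

from collections import defaultdict

events = [  # (subject_id, time_s, event) -- illustrative example records
    ("m1", 10.0, "head_entry"),
    ("m2", 11.0, "head_entry"),
    ("m1", 12.5, "head_entry"),
    ("m1", 16.0, "head_entry"),
]

def inter_event_intervals(records, event_type="head_entry"):
    """Group time-stamped events by subject and return successive time
    differences, a building block for timing and impulsivity measures."""
    times = defaultdict(list)
    for subject, t, kind in sorted(records, key=lambda r: r[1]):
        if kind == event_type:
            times[subject].append(t)
    return {s: [b - a for a, b in zip(ts, ts[1:])] for s, ts in times.items()}

print(inter_event_intervals(events))
```

Keeping the raw events and deriving all statistics from them preserves the full data trail from raw records to published graphs, as the abstract emphasizes.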

  3. Automated, quantitative cognitive/behavioral screening of mice: for genetics, pharmacology, animal cognition and undergraduate instruction.

    PubMed

    Gallistel, C R; Balci, Fuat; Freestone, David; Kheifets, Aaron; King, Adam

    2014-02-26

    We describe a high-throughput, high-volume, fully automated, live-in 24/7 behavioral testing system for assessing the effects of genetic and pharmacological manipulations on basic mechanisms of cognition and learning in mice. A standard polypropylene mouse housing tub is connected through an acrylic tube to a standard commercial mouse test box. The test box has 3 hoppers, 2 of which are connected to pellet feeders. All are internally illuminable with an LED and monitored for head entries by infrared (IR) beams. Mice live in the environment, which eliminates handling during screening. They obtain their food during two or more daily feeding periods by performing in operant (instrumental) and Pavlovian (classical) protocols, for which we have written protocol-control software and quasi-real-time data analysis and graphing software. The data analysis and graphing routines are written in a MATLAB-based language created to simplify greatly the analysis of large time-stamped behavioral and physiological event records and to preserve a full data trail from raw data through all intermediate analyses to the published graphs and statistics within a single data structure. The data-analysis code harvests the data several times a day and subjects it to statistical and graphical analyses, which are automatically stored in the "cloud" and on in-lab computers. Thus, the progress of individual mice is visualized and quantified daily. The data-analysis code talks to the protocol-control code, permitting the automated advance from protocol to protocol of individual subjects. The behavioral protocols implemented are matching, autoshaping, timed hopper-switching, risk assessment in timed hopper-switching, impulsivity measurement, and the circadian anticipation of food availability. Open-source protocol-control and data-analysis code makes the addition of new protocols simple. 
Eight test environments fit in a 48 in x 24 in x 78 in cabinet; two such cabinets (16 environments) may be controlled by one computer.

  4. Seeing an Old Lab in a New Light: Transforming a Traditional Optics Lab into Full Guided Inquiry

    ERIC Educational Resources Information Center

    Maley, Tim; Stoll, Will; Demir, Kadir

    2013-01-01

    This paper describes the authors' experiences transforming a "cookbook" lab into an inquiry-based investigation and the powerful effect the inquiry-oriented lab had on our students' understanding of lenses. We found the inquiry-oriented approach led to richer interactions between students as well as a deeper conceptual…

  5. Problem Solvers: MathLab's Design Brings Professional Learning into the Classroom

    ERIC Educational Resources Information Center

    Morales, Sara; Sainz, Terri

    2017-01-01

    Imagine teachers, administrators, and university mathematicians and staff learning together in a lab setting where students are excited about attending a week-long summer math event because they are at the forefront of the experience. Piloted in three New Mexico classrooms during summer 2014, MathLab expanded into 17 lab settings over six…

  6. 76 FR 61744 - Xpedite Systems, LLC Deerfield Beach, Florida; Notice of Negative Determination on Reconsideration

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-05

    ... allegations: ``* * * there was a contract between Xpedite and AppLabs, an Indian company to do customer... customers that need custom data transfers. Xpedite was also using AppLabs for migration work. AppLabs... of events: ``AppLabs completes SOW (SOW template) reviewed by SE before going to customer'' ``SOW...

  7. Governing Methods: Policy Innovation Labs, Design and Data Science in the Digital Governance of Education

    ERIC Educational Resources Information Center

    Williamson, Ben

    2015-01-01

    Policy innovation labs are emerging knowledge actors and technical experts in the governing of education. The article offers a historical and conceptual account of the organisational form of the policy innovation lab. Policy innovation labs are characterised by specific methods and techniques of design, data science, and digitisation in public…

  8. Low Budget Biology 3: A Collection of Low Cost Labs and Activities.

    ERIC Educational Resources Information Center

    Wartski, Bert; Wartski, Lynn Marie

This document contains biology labs, demonstrations, and activities that use low budget materials. The goal is to get students involved in the learning process by experiencing biology. Each lab has a teacher preparation section which outlines the purpose of the lab, some basic information, a list of materials, and how to prepare the different…

  9. Sustainable dual-use labs: neurovascular interventional capabilities within the cath lab.

    PubMed

    Lang, Stacey

    2012-01-01

    The inclusion of neurovascular interventional capabilities within the cath lab setting can be key to optimal utilization of resources, increased staff efficiency, and streamlined operations. When considering an expansion, look beyond the patient population traditionally associated with cardiac cath labs and consider the integration of programs outside cardiac alone--to create a true dual-use lab space. With proper planning, quality dual purpose equipment, appropriately trained staff, capable physicians, and strong leadership, an organization willing to embrace the challenge can build a truly extraordinary service.

  10. On Laboratory Work

    NASA Astrophysics Data System (ADS)

    Olney, Dave

    1997-11-01

This paper offers some suggestions on making lab work for high school chemistry students more productive, with students taking an active role. They include (1) rewriting labs from manuals to better suit one's purpose, (2) the questionable use of canned data tables, (3) designing microscale labs that utilize the technique's unique features, such as safety and ease of repetition, (4) having students actually carry out experimental design on occasion, using a model from PRACTICE IN THINKING, and (5) using computers/calculators in the lab in meaningful ways. Many examples feature discovery-type labs the author has developed over the years.

  11. EPICS Channel Access Server for LabVIEW

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhukov, Alexander P.

It can be challenging to interface National Instruments LabVIEW (http://www.ni.com/labview/) with EPICS (http://www.aps.anl.gov/epics/). Such an interface is required when an instrument control program developed in LabVIEW must also be part of a global control system, as is frequently the case in large accelerator facilities. The Channel Access Server is written in LabVIEW, so it works on any hardware/software platform where LabVIEW is available. It provides full server functionality, so any EPICS client can communicate with it.

  12. Implementation of the Web-based laboratory

    NASA Astrophysics Data System (ADS)

    Ying, Liu; Li, Xunbo

    2005-12-01

With the rapid development of Internet technologies, remote access and control via the Internet are becoming a reality. A realization of a web-based laboratory (the W-LAB) is presented. The main target of the W-LAB is to allow users to easily access and conduct experiments via the Internet. To realize the remote communication, a system adopting a double client-server architecture was introduced, which gives the system better security and higher functionality. The experimental environment implemented in the W-LAB integrates both a virtual lab and a remote lab. Embedded technology is used in the W-LAB system as an economical and efficient way to build the distributed infrastructure network. Furthermore, a user authentication mechanism effectively secures the remote communication.
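
The abstract mentions a user authentication mechanism without describing it. As a generic sketch of how such a check is commonly built (salted password hashing with a constant-time comparison), and not the W-LAB's actual protocol:

```python
# Illustrative sketch only: salted password verification of the kind a
# remote-lab server might use. Parameters and names are assumptions.

import hashlib
import hmac
import os

def hash_password(password: str, salt: bytes) -> bytes:
    """Derive a key from the password with PBKDF2-HMAC-SHA256."""
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

salt = os.urandom(16)                       # stored alongside the hash
stored = hash_password("lab-user-secret", salt)

def authenticate(password: str) -> bool:
    """Constant-time comparison against the stored hash."""
    return hmac.compare_digest(stored, hash_password(password, salt))

print(authenticate("lab-user-secret"))
print(authenticate("wrong"))
```

The constant-time comparison avoids leaking information through timing, a standard precaution for any service exposed to the Internet.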

  13. Genomics of lactic acid bacteria: Current status and potential applications.

    PubMed

    Wu, Chongde; Huang, Jun; Zhou, Rongqing

    2017-08-01

Lactic acid bacteria (LAB) are widely used for the production of a variety of foods and feed raw materials, where they contribute to the flavor and texture of the fermented products. In addition, specific LAB strains are considered probiotic due to their health-promoting effects in consumers. Recently, genome sequencing of LAB has boomed, and the increasing amount of published genomic data brings an unprecedented opportunity to reveal the important traits of LAB. This review describes recent progress in LAB genomics, with special emphasis on understanding industry-related physiological features through genomic analysis. Moreover, strategies to engineer the metabolic capacity and stress tolerance of LAB for improved industrial performance are also discussed.

  14. Three-phase Four-leg Inverter LabVIEW FPGA Control Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

In the area of power electronics control, Field Programmable Gate Arrays (FPGAs) have the capability to outperform their Digital Signal Processor (DSP) counterparts due to the FPGA’s ability to implement true parallel processing and therefore facilitate higher switching frequencies, higher control bandwidth, and/or enhanced functionality. National Instruments (NI) has developed two platforms, Compact RIO (cRIO) and Single Board RIO (sbRIO), which combine a real-time processor with an FPGA. The FPGA can be programmed with a subset of the well-known LabVIEW graphical programming language. The use of cRIO and sbRIO for power electronics control has developed over the last few years to include control of three-phase inverters. Most three-phase inverter topologies include three switching legs. The addition of a fourth-leg to natively generate the neutral connection allows the inverter to serve single-phase loads in a microgrid or stand-alone power system and to balance the three-phase voltages in the presence of significant load imbalance. However, the control of a four-leg inverter is much more complex. In particular, instead of standard two-dimensional space vector modulation (SVM), the inverter requires three-dimensional space vector modulation (3D-SVM). The candidate software implements complete control algorithms in LabVIEW FPGA for a three-phase four-leg inverter. The software includes feedback control loops, three-dimensional space vector modulation gate-drive algorithms, advanced alarm handling capabilities, contactor control, power measurements, and debugging and tuning tools. The feedback control loops allow inverter operation in AC voltage control, AC current control, or DC bus voltage control modes based on external mode selection by a user or supervisory controller. The software includes the ability to synchronize its AC output to the grid or other voltage-source before connection. 
The software also includes provisions to allow inverter operation in parallel with other voltage regulating devices on the AC or DC buses. This flexibility allows the inverter to operate as a stand-alone voltage source, connected to the grid, or in parallel with other controllable voltage sources as part of a microgrid or remote power system. In addition, as the inverter is expected to operate under severe unbalanced conditions, the software includes algorithms to accurately compute real and reactive power for each phase based on definitions provided in the IEEE Standard 1459: IEEE Standard Definitions for the Measurement of Electric Power Quantities Under Sinusoidal, Nonsinusoidal, Balanced, or Unbalanced Conditions. Finally, the software includes code to output analog signals for debugging and for tuning of control loops. The software fits on the Xilinx Virtex V LX110 FPGA embedded in the NI cRIO-9118 FPGA chassis, and with a 40 MHz base clock, supports a modulation update rate of 40 MHz, user-settable switching frequencies and synchronized control loop update rates of tens of kHz, and reference waveform generation, including Phase Lock Loop (PLL), at an update rate of 100 kHz.
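
Three-dimensional SVM for a four-leg inverter builds on the abc-to-alpha/beta/gamma (Clarke) transform, whose zero-sequence (gamma) axis captures exactly the imbalance the fourth leg must serve. A minimal sketch of the amplitude-invariant transform follows; the full 3D-SVM sector search and dwell-time solution implemented in the LabVIEW FPGA code are omitted.

```python
# Illustrative sketch: amplitude-invariant Clarke transform with the
# zero-sequence (gamma) axis used by 3-D space vector modulation.

import math

def clarke_3d(a: float, b: float, c: float):
    """Map phase quantities (a, b, c) to (alpha, beta, gamma).
    gamma is the zero-sequence component, nonzero only under imbalance;
    it is what the fourth inverter leg must supply through the neutral."""
    alpha = (2.0 / 3.0) * (a - 0.5 * b - 0.5 * c)
    beta = (b - c) / math.sqrt(3.0)
    gamma = (a + b + c) / 3.0
    return alpha, beta, gamma

# A balanced three-phase set has no zero-sequence component.
print(clarke_3d(1.0, -0.5, -0.5))
```

Under balanced conditions gamma vanishes and the problem reduces to ordinary two-dimensional SVM; severe load imbalance produces a nonzero gamma reference that the 3D-SVM algorithm realizes with the fourth leg.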

  15. Measure, Then Show: Grasping Human Evolution Through an Inquiry-Based, Data-driven Hominin Skulls Lab.

    PubMed

    Bayer, Chris N; Luberda, Michael

    2016-01-01

Incomprehension and denial of the theory of evolution among high-school students have been observed when teachers are not equipped to deliver a compelling case for human evolution based on fossil evidence. This paper assesses the outcomes of a novel inquiry-based paleoanthropology lab teaching human evolution to high-school students. The inquiry-based Be a Paleoanthropologist for a Day lab placed a dozen hominin skulls into the hands of high-school students. Upon measuring three variables of human evolution, students explain what they have observed and discuss their findings. In the 2013/14 school year, 11 biology classes in 7 schools in the Greater New Orleans area participated in this lab. The interviewed teacher cohort unanimously agreed that the lab, featuring hominin skull replicas and stimulating student inquiry, was a pedagogically excellent method of delivering the subject of human evolution. First, the lab's learning path of transforming facts to data, information to knowledge, and knowledge to acceptance empowered students to themselves execute part of the science that underpins our understanding of deep-time hominin evolution. Second, although challenging, the hands-on format of the lab was accessible to high-school students, most of whom were readily able to engage with the lab's scientific process. Third, the lab's exciting and compelling pedagogy unlocked higher-order thinking skills, effectively activating the cognitive, psychomotor, and affective learning domains as defined in Bloom's taxonomy. Lastly, the lab afforded students a formative experience with a high degree of retention and epistemic depth. Further study is warranted to gauge the degree of these effects.

  16. My Green Car: The Adventure Begins (Ep. 1) – DOE Lab-Corps Video Series

    ScienceCinema

    Saxena, Samveg; Shah, Nihar; Hansen, Dana

    2018-06-12

One key difference between a great technology that stays in the lab and one that reaches the marketplace is customer interest. In Episode 1, the Lab’s MyGreenCar team gets ready to step outside the lab and test their technology’s value to consumers in a scientific way. What makes a new technology compelling enough to transition out of the lab and become a consumer product? That’s the question Berkeley Lab researchers Samveg Saxena, Nihar Shah, and Dana Hansen, plus industry mentor Russell Carrington, set out to answer for MyGreenCar, an app providing personalized fuel economy or electric vehicle range estimates for consumers researching new cars. DOE’s Lab-Corps program offered the technology team some answers. The EERE-funded program, based on the National Science Foundation’s I-Corps™ model for entrepreneurial training, provides tools and training to move energy-related inventions to the marketplace. During Lab-Corps’ intensive six-week session, technology teams interview 100 customer and value-chain members to discover which potential products based on their technologies will have significant market pull. A six-video series follows the MyGreenCar team’s Lab-Corps experience, from pre-training preparation with the Lab’s Innovation and Partnerships Office through the ups and downs of the customer discovery process. Will the app make it to the marketplace? You’ll just have to watch.

  17. Role of Broiler Carcasses and Processing Plant Air in Contamination of Modified-Atmosphere-Packaged Broiler Products with Psychrotrophic Lactic Acid Bacteria▿

    PubMed Central

    Vihavainen, Elina; Lundström, Hanna-Saara; Susiluoto, Tuija; Koort, Joanna; Paulin, Lars; Auvinen, Petri; Björkroth, K. Johanna

    2007-01-01

Some psychrotrophic lactic acid bacteria (LAB) are specific meat spoilage organisms in modified-atmosphere-packaged (MAP), cold-stored meat products. To determine if incoming broilers or the production plant environment is a source of spoilage LAB, a total of 86, 122, and 447 LAB isolates from broiler carcasses, production plant air, and MAP broiler products, respectively, were characterized using a library of HindIII restriction fragment length polymorphism (RFLP) patterns of the 16S and 23S rRNA genes as operational taxonomic units in numerical analyses. Six hundred thirteen of the 655 LAB isolates clustered in 29 groups considered to be species specific. Sixty-four percent of product isolates clustered either with Carnobacterium divergens or with Carnobacterium maltaromaticum type strains. The third major product-associated cluster (17% of isolates) was formed by unknown LAB. Representative strains from these three clusters were analyzed for the phylogeny of their 16S rRNA genes. This analysis verified that the two largest RFLP clusters consisted of carnobacteria and showed that the unknown LAB group consisted of Lactococcus spp. No product-associated LAB were detected in broiler carcasses sampled at the beginning of slaughter, whereas carnobacteria and lactococci, along with some other specific meat spoilage LAB, were recovered from processing plant air at many sites. This study reveals that incoming broiler chickens are not major sources of psychrotrophic spoilage LAB, whereas the detection of these organisms in the air of the processing environment highlights the role of processing facilities as sources of LAB contamination. PMID:17142357
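
The use of RFLP patterns as operational taxonomic units can be sketched in its simplest form: isolates sharing an identical fragment-size pattern fall into one group. Real numerical analyses use band-matching with similarity coefficients and clustering; this sketch, with illustrative fragment sizes, only shows the grouping idea.

```python
# Illustrative sketch: grouping isolates into operational taxonomic units
# (OTUs) by identical RFLP banding patterns. Fragment sizes are invented
# example values, not data from the study.

from collections import defaultdict

isolates = {  # isolate id -> HindIII fragment sizes (bp)
    "P01": (520, 980, 1450),
    "P02": (520, 980, 1450),
    "A07": (300, 760, 2100),
}

def cluster_by_pattern(patterns):
    """Isolates sharing an identical fragment-size pattern form one OTU."""
    otus = defaultdict(list)
    for name, pattern in patterns.items():
        otus[frozenset(pattern)].append(name)
    return list(otus.values())

print(cluster_by_pattern(isolates))
```

In practice, gel-to-gel variation means bands are matched within a tolerance and grouped by a similarity threshold rather than by exact equality.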

  18. Graduate student training and creating new physics labs for biology students, killing two birds with one stone.

    NASA Astrophysics Data System (ADS)

    Jones, Barbara

    2001-03-01

At UCSD biology majors are required to take 3 quarters of a calculus-based physics course. This is taught in a standard-format large lecture class, partly by faculty and partly by freeway flyers. We are working with physics graduate students who are also participating in our PFPF (Preparing Future Physics Faculty) program to write, review, and teach new weekly labs for these biology students. This provides an experience for the grad students that is both rewarding to them and useful to the department. The grad students participate in curriculum development, observe the students' behaviour in the labs, and assess the effectiveness of different lab formats. The labs are intended to provide an interactive, hands-on experience with a wide variety of equipment, most of which is both simple and inexpensive. Both students and grads find the labs to be engaging and fun. Based on group discussions, the labs are modified to try to create the best teaching environment. The biology students benefit from the improvements, both in the quality of the labs they do and from the enthusiasm of the TAs, who take an active interest in their learning. The ability to make significant changes to the material taught maintains the interest of the grad students and helps to make the labs a stable and robust environment.

  19. Lab notebooks as scientific communication: Investigating development from undergraduate courses to graduate research

    NASA Astrophysics Data System (ADS)

    Stanley, Jacob T.; Lewandowski, H. J.

    2016-12-01

    In experimental physics, lab notebooks play an essential role in the research process. For all of the ubiquity of lab notebooks, little formal attention has been paid to addressing what is considered "best practice" for scientific documentation and how researchers come to learn these practices in experimental physics. Using interviews with practicing researchers, namely, physics graduate students, we explore the different experiences researchers had in learning how to effectively use a notebook for scientific documentation. We find that very few of those interviewed thought that their undergraduate lab classes successfully taught them the benefit of maintaining a lab notebook. Most described training in lab notebook use as either ineffective or outright missing from their undergraduate lab course experience. Furthermore, a large majority of those interviewed explained that they did not receive any formal training in maintaining a lab notebook during their graduate school experience and received little to no feedback from their advisors on these records. Many of the interviewees describe learning the purpose of, and how to maintain, these kinds of lab records only after having a period of trial and error, having already started doing research in their graduate program. Despite the central role of scientific documentation in the research enterprise, these physics graduate students did not gain skills in documentation through formal instruction, but rather through informal hands-on practice.

  20. Incorporating a Literature-Based Learning Approach into a Lab Course to Increase Student Understanding

    ERIC Educational Resources Information Center

    Parent, Beth A.; Marbach-Ad, Gili; Swanson, Karen V.; Smith, Ann C.

    2010-01-01

    Scientific literature was used to give a research-oriented context to our immunology lab course. Immunology lab, a senior-level course (60 students/year), was formerly taught in a traditional mode, with exercises aimed at learning lab protocols. To engage students in understanding, we connected the protocols to their use as reported in research…

  1. Extreme Environments Test Capabilities at NASA GRC for Parker Hannifin Visit

    NASA Technical Reports Server (NTRS)

    Arnett, Lori

    2016-01-01

    The presentation includes general description on the following test facilities: Fuel Cell Testing Lab, Structural Dynamics Lab, Thermal Vacuum Test Facilities - including a description of the proposed Kinetic High Altitude Simulator concept, EMI Test Lab, and the Creek Road Cryogenic Complex - specifically the Small Multi-purpose Research Facility (SMiRF) and the Cryogenics Components Lab 7 (CCL-7).

  2. Do Policies that Encourage Better Attendance in Lab Change Students' Academic Behaviors and Performances in Introductory Science Courses?

    ERIC Educational Resources Information Center

    Moore, Randy; Jensen, Philip A.

    2008-01-01

    Science courses with hands-on investigative labs are a typical part of the general education requirements at virtually all colleges and universities. In these courses, labs that satisfy a curricular requirement for "lab experience" are important because they provide the essence of the scientific experience--that is, they give students…

  3. Value Added or Misattributed? A Multi-Institution Study on the Educational Benefit of Labs for Reinforcing Physics Content

    ERIC Educational Resources Information Center

    Holmes, N. G.; Olsen, Jack; Thomas, James L.; Wieman, Carl E.

    2017-01-01

    Instructional labs are widely seen as a unique, albeit expensive, way to teach scientific content. We measured the effectiveness of introductory lab courses at achieving this educational goal across nine different lab courses at three very different institutions. These institutions and courses encompassed a broad range of student populations and…

  4. Validity of Selected Lab and Field Tests of Physical Working Capacity.

    ERIC Educational Resources Information Center

    Burke, Edmund J.

    The validity of selected lab and field tests of physical working capacity was investigated. Forty-four male college students were administered a series of lab and field tests of physical working capacity. Lab tests include a test of maximum oxygen uptake, the PWC 170 test, the Harvard Step Test, the Progressive Pulse Ratio Test, Margaria Test of…

  5. Behind the Scenes at Berkeley Lab - The Mechanical Fabrication Facility

    ScienceCinema

    Wells, Russell; Chavez, Pete; Davis, Curtis; Bentley, Brian

    2018-04-16

    Part of the Behind the Scenes series at Berkeley Lab, this video highlights the lab's mechanical fabrication facility and its exceptional ability to produce unique tools essential to the lab's scientific mission. Through a combination of skilled craftsmanship and precision equipment, machinists and engineers work with scientists to create exactly what's needed - whether it's measured in microns or meters.

  6. The protein and peptide mediated syntheses of non-biologically-produced oxide materials

    NASA Astrophysics Data System (ADS)

    Dickerson, Matthew B.

    Numerous examples exist in nature of organisms which have evolved the ability to produce sophisticated structures composed of inorganic minerals. Studies of such biomineralizing organisms have suggested that specialized biomolecules are, in part, responsible for the controlled formation of these structures. The research detailed in this dissertation is focused on the use of biomolecules (i.e., peptides and proteins) to form non-biologically produced materials under mild reaction conditions (i.e., neutral pH, aqueous solutions, and room temperature). The peptides utilized in the studies detailed in this dissertation were identified through the screening of single-crystal rutile TiO2 substrates or Ge powder with a phage-displayed peptide library. Twenty-one peptides were identified which possessed an affinity for Ge. Three of these twenty-one peptides were tested for germania precipitation activity. Those peptides possessing a basic isoelectric point as well as hydroxyl- and imidazole-containing amino acid residues were found to be the most effective in precipitating amorphous germania from an alkoxide precursor. The phage-displayed peptide library screening of TiO2 substrates yielded twenty peptides. Four of these peptides, which were heavily enriched in histidine and/or basic amino acid residues, were found to possess significant titania precipitation activity. The activity of these peptides was found to correlate with the number of positive charges they carried. The sequence of the most active of the library-identified peptides was modified to yield two additional peptides. The titania precipitation activity of these designed peptides was higher than that of the parent peptide, with reduced pH dependence. The titania materials generated by the library-identified and designed peptides were found to be composed of amorphous titania as well as <10 nm anatase and/or monoclinic TiO2 crystallites.
The production of titania and zirconia resulting from the interaction of the cationic enzyme, hen egg white lysozyme, with Ti- or Zr-lactate precursors is also presented in this dissertation. Lysozyme was found to entrap itself in an active form within the nanoparticles of amorphous titania or zirconia precipitated by this protein under ambient conditions. The lysozyme-synthesized titania was observed to be superior to the lysozyme-zirconia materials in preserving the activity of the enzyme under denaturing conditions. Four recombinant proteins, derived from the amino acid sequences of proteins (silaffins) associated with biosilicification in diatoms, were also investigated for titania precipitation activity. The two most basic of these recombinant silaffins, rSil1L and rSilC, were able to induce the formation of titania. The titania precipitates generated by rSil1L were found to be similar to those produced by the phage-displayed library-identified peptides. The second recombinant silaffin, rSilC, was found to produce hollow spheres of titania, which, following dehydration, were observed to transform into larger, solid spheres composed of radially aligned columns of rutile TiO2. The highly repetitive nature of rSilC's amino acid sequence is believed to be responsible for the differences in TiO2 polymorph generated by the different recombinant silaffins and peptides. This dissertation also details research conducted on the formation of titania utilizing rSilC conjugated to synthetic and biogenic silica surfaces. These silica surfaces were functionalized with a newly developed dendritic growth technique. The dendritic functional-group amplification process was demonstrated to increase the loading of hexahistidine-tagged proteins on silica surfaces by more than 40%, as compared to traditional immobilization procedures.
The higher loadings of rSilC provided by this dendritic growth method were observed to have a positive impact on the extent of surface mineralization. The titania formed by immobilized rSilC was observed to be composed of amorphous and crystalline TiO2.
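    The reported correlation between titania precipitation activity and the number of positive charges a peptide carries can be illustrated with a residue count. A minimal sketch, assuming a hypothetical His/Lys-rich sequence that is not taken from the dissertation:

```python
# Count residues that are typically cationic near neutral pH.
# The sequence below is a hypothetical example, chosen only to illustrate
# the kind of charge tally the dissertation correlates with activity.
BASIC_RESIDUES = {"K", "R", "H"}  # lysine, arginine, histidine

def count_positive_residues(sequence: str) -> int:
    """Return the number of basic (potentially positively charged) residues."""
    return sum(1 for aa in sequence.upper() if aa in BASIC_RESIDUES)

peptide = "HKHSSGGHK"  # hypothetical peptide
print(count_positive_residues(peptide))  # 5
```

    Under this tally, a peptide with more His/Lys/Arg residues would be expected to show higher precipitation activity, consistent with the trend described above.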

  7. Antimicrobial activity of lactic acid bacteria against Listeria monocytogenes on frankfurters formulated with and without lactate/diacetate.

    PubMed

    Koo, Ok-Kyung; Eggleton, Mallory; O'Bryan, Corliss A; Crandall, Philip G; Ricke, Steven C

    2012-12-01

    Contamination by Listeria monocytogenes has been a constant public health threat for the ready-to-eat (RTE) meat industry due to the potential for high mortality from listeriosis. Lactic acid bacteria (LAB) have shown protective action against various pathogenic bacteria. The aim of this study was to evaluate the antilisterial activity of a combination of three LAB strains (Lactiguard®) on L. monocytogenes. The combination of the LAB was inhibitory to L. monocytogenes inoculated onto frankfurters not containing lactate/diacetate after 8 weeks of refrigerated storage (0.6 log reduction compared to the L. monocytogenes-only control), and when a cell-free supernatant (CFS) of the LAB was added with the LAB, even more inhibition was obtained (1.2 log reduction compared with L. monocytogenes only). In frankfurters containing lactate/diacetate, the LAB and the LAB plus CFS were more effective in reducing growth of L. monocytogenes after 8 weeks of refrigerated storage (2 and 3.3 log reductions, respectively). Copyright © 2012 Elsevier Ltd. All rights reserved.
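    The log reductions quoted in abstracts like this one are base-10 differences in viable counts between a treatment and its control. A quick sketch of the arithmetic, with hypothetical CFU values (not from the study):

```python
import math

def log_reduction(control_cfu: float, treated_cfu: float) -> float:
    """Log10 reduction of viable count relative to the untreated control."""
    return math.log10(control_cfu) - math.log10(treated_cfu)

# Hypothetical counts: a 2-log reduction corresponds to a 100-fold drop.
print(round(log_reduction(1e6, 1e4), 1))  # 2.0
```

    On this scale, the 3.3 log reduction reported for LAB plus CFS on lactate/diacetate frankfurters corresponds to roughly a 2000-fold drop in viable L. monocytogenes.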

  8. The experiment editor: supporting inquiry-based learning with virtual labs

    NASA Astrophysics Data System (ADS)

    Galan, D.; Heradio, R.; de la Torre, L.; Dormido, S.; Esquembre, F.

    2017-05-01

    Inquiry-based learning is a pedagogical approach in which students are motivated to pose their own questions when facing problems or scenarios. In physics learning, students are turned into scientists who carry out experiments, collect and analyze data, formulate and evaluate hypotheses, and so on. Lab experimentation is essential for inquiry-based learning, yet traditional hands-on labs have a drawback: the high costs associated with equipment, space, and maintenance staff. Virtual laboratories help to reduce these costs. This paper enriches the virtual lab ecosystem by providing an integrated environment to automate experimentation tasks. In particular, our environment supports: (i) scripting and running experiments on virtual labs, and (ii) collecting and analyzing data from the experiments. The current implementation supports virtual labs created with the authoring tool Easy Java/Javascript Simulations. Since there are public repositories with hundreds of freely available labs created with this tool, the potential applicability of our environment is considerable.
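    The scripted set-parameter/run/collect workflow such an environment automates can be mimicked in a few lines. This is not the Experiment Editor's actual API; the pendulum formula below is a hypothetical stand-in for a virtual lab model:

```python
import math

# Hypothetical stand-in for a virtual lab: the small-angle simple pendulum.
# A scripted "experiment" sweeps a parameter, runs the model, and records data.
def pendulum_period(length_m: float, g: float = 9.81) -> float:
    """Small-angle period of a simple pendulum, T = 2*pi*sqrt(L/g)."""
    return 2 * math.pi * math.sqrt(length_m / g)

# Parameter sweep: run the model for several lengths and collect results.
results = {L: pendulum_period(L) for L in (0.25, 0.5, 1.0)}
for length, period in results.items():
    print(f"L = {length:.2f} m -> T = {period:.2f} s")
```

    The collected `results` dictionary is then available for the analysis step (curve fitting, plotting, hypothesis checking) that inquiry-based labs ask students to perform.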

  9. Computational Labs Using VPython Complement Conventional Labs in Online and Regular Physics Classes

    NASA Astrophysics Data System (ADS)

    Bachlechner, Martina E.

    2009-03-01

    Fairmont State University has developed online physics classes for the high-school teaching certificate based on the textbook Matter and Interactions by Chabay and Sherwood. This led to using computational VPython labs in the traditional classroom setting as well, to complement conventional labs. The computational modeling process has proven to provide an excellent basis for the subsequent conventional lab and allows for a concrete experience of the difference between behavior according to a model and realistic behavior. Observations in the regular classroom setting feed back into the development of the online classes.
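    The modeling loop at the heart of a Matter and Interactions-style VPython lab can be sketched without graphics as an iterative momentum update. The values below are illustrative, not from the course materials:

```python
# Iterative momentum update in the style of a Matter and Interactions
# VPython lab, stripped of graphics so it runs anywhere. A ball in free
# fall: p -> p + F*dt, y -> y + (p/m)*dt. All values are illustrative.
m = 0.1          # mass (kg)
g = -9.8         # gravitational field (N/kg)
y, p = 2.0, 0.0  # initial height (m) and momentum (kg*m/s)
t, dt = 0.0, 0.01

while y > 0:
    F = m * g             # net force: gravity only
    p = p + F * dt        # momentum update
    y = y + (p / m) * dt  # position update using the new momentum
    t = t + dt

print(f"hit the ground after about {t:.2f} s")
```

    Comparing the loop's fall time with the analytic result sqrt(2h/g) ≈ 0.64 s gives students the model-versus-reality contrast the abstract describes, here between a finite-step model and the exact solution.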

  10. Monitoring of wheat lactic acid bacteria from the field until the first step of dough fermentation.

    PubMed

    Alfonzo, Antonio; Miceli, Claudia; Nasca, Anna; Franciosi, Elena; Ventimiglia, Giusi; Di Gerlando, Rosalia; Tuohy, Kieran; Francesca, Nicola; Moschetti, Giancarlo; Settanni, Luca

    2017-04-01

    The present work was carried out to retrieve the origin of lactic acid bacteria (LAB) in sourdough. To this purpose, wheat LAB were monitored from ear harvest until the first step of fermentation for sourdough development. The influence of the geographical area and variety on LAB species/strain composition was also determined. The ears of four Triticum durum varieties (Duilio, Iride, Saragolla and Simeto) were collected from several fields located within the Palermo province (Sicily, Italy) and microbiologically investigated. In order to trace the transfer of LAB during the consecutive steps of manipulation, ears were transformed aseptically and, after threshing, milling and fermentation, samples of kernels, semolinas and doughs, respectively, were analysed. LAB were not found to dominate the microbial communities of the raw materials. In general, kernels harboured lower levels of microorganisms than ears and ears than semolinas. Several samples showing no development of LAB colonies acidified the enrichment broth, suggesting the presence of LAB below the detection limit. After fermentation, LAB loads increased consistently for all doughs, reaching levels of 7.0-7.5 Log CFU/g on M17. The values of pH (5.0) and TTA (5.6 mL NaOH/10 g of dough) indicated the occurrence of the acidification process for several doughs. LAB were phenotypically and genotypically differentiated by randomly amplified polymorphic DNA (RAPD)-PCR into eight groups including 51 strains belonging to the species Lactobacillus brevis, Lactobacillus coryniformis, Lactobacillus plantarum, Lactococcus lactis, Lactococcus garvieae, Enterococcus casseliflavus, Enterococcus faecium, Leuconostoc citreum, and Pediococcus pentosaceus. Lactobacilli constituted a minority of the LAB community, while lactococci represented more than 50% of strains. Lower LAB complexity was found on kernels, while a richer biodiversity was observed in semolinas and fermented doughs.
To characterise the broader microbiota before fermentation, 16S rRNA gene fragment profiling was conducted on the unfermented doughs using Illumina MiSeq. The LAB group was represented by Enterococcus, Lactococcus and members of the Leuconostocaceae family, whose relative abundances differed according to both the geographical area and the variety of wheat. The culture-independent approach confirmed that pediococci and lactobacilli constituted low-abundance members of the semolina LAB microbiota and that, although some strains may pass from wheat ear to fermented doughs, most are likely to come from other sources. Copyright © 2016 Elsevier Ltd. All rights reserved.

  11. Research and Teaching. Effects of a Research-Based Ecology Lab Course: A Study of Nonvolunteer Achievement, Self-Confidence, and Perception of Lab Course Purpose

    ERIC Educational Resources Information Center

    Kloser, Matthew J.; Brownell, Sara E.; Shavelson, Richard J.; Fukami, Tadashi

    2013-01-01

    Undergraduate biology lab courses have long been criticized for engaging students in "cookbook" experiences in which students follow a given protocol to collect data that help answer a predetermined question. Recent reform documents in biology education have suggested that students should engage in lab courses that provide more authentic…

  12. Army Reserve Component Personal Empowerment Program #2t

    DTIC Science & Technology

    2013-10-01

    … rescheduling of appointments • Retrieved lab reports from hospital lab for 161 participants • Identified abnormal values and sent copies to campus nurse … recommendation of SHU Scientific Committee • Collaborated with SHU nurse to establish procedure for abnormal lab values • Implemented suggested … results were encouraged to discuss further with nurse as per protocol. • Researched literature concerning vitamin D to better understand lab results

  13. Introduction to Computing: Lab Manual. Faculty Guide [and] Student Guide.

    ERIC Educational Resources Information Center

    Frasca, Joseph W.

    This lab manual is designed to accompany a college course introducing students to computing. The exercises are designed to be completed by the average student in a supervised 2-hour block of time at a computer lab over 15 weeks. The intent of each lab session is to introduce a topic and have the student feel comfortable with the use of the machine…

  14. Effects of Implementing a Hybrid Wet Lab and Online Module Lab Curriculum into a General Chemistry Course: Impacts on Student Performance and Engagement with the Chemistry Triplet

    ERIC Educational Resources Information Center

    Irby, Stefan M.; Borda, Emily J.; Haupt, Justin

    2018-01-01

    Here, we describe the implementation of a hybrid general chemistry teaching laboratory curriculum that replaces a portion of a course's traditional "wet lab" experiences with online virtual lab modules. These modules intentionally utilize representations on all three levels of the chemistry triplet: macroscopic, submicroscopic, and symbolic.…

  15. Threshold-Switchable Particles (TSP) to Control Internal Hemorrhage

    DTIC Science & Technology

    2012-12-01

    the Liu lab (in collaboration with the Morrissey lab): Citrate gold nanoparticle synthesis (toward Task 3, Milestone 4) Gold nanoparticles with an…dimethylamino)propyl]carbodiimide). Different pH conditions were used to test the conjugation efficiency between PAAc and cystamine. An excess amount of…Studies from the Stucky lab (in collaboration with the Morrissey lab): Silica Nanoparticle (SNP) synthesis (toward Task 3, Milestone 4) In our

  16. KENNEDY SPACE CENTER, FLA. -- In the Space Life Sciences (SLS) Lab, Jan Bauer, with Dynamac Corp., places samples of onion tissue in the elemental analyzer, which analyzes for carbon, hydrogen, nitrogen and sulfur. The 100,000 square-foot SLS houses labs for NASA’s ongoing research efforts, microbiology/microbial ecology studies and analytical chemistry labs. Also calling the new lab home are facilities for space flight-experiment and flight-hardware development, new plant growth chambers, and an Orbiter Environment Simulator that will be used to conduct ground control experiments in simulated flight conditions for space flight experiments. The SLS Lab, formerly known as the Space Experiment Research and Processing Laboratory or SERPL, provides space for NASA’s Life Sciences Services contractor Dynamac Corporation, Bionetics Corporation, and researchers from the University of Florida. NASA’s Office of Biological and Physical Research will use the facility for processing life sciences experiments that will be conducted on the International Space Station. The SLS Lab is the magnet facility for the International Space Research Park at KSC being developed in partnership with Florida Space Authority.

    NASA Image and Video Library

    2004-01-05


  17. KENNEDY SPACE CENTER, FLA. -- Sharon Edney, with Dynamac Corp., measures photosynthesis on Bibb lettuce being grown hydroponically for study in the Space Life Sciences Lab. The 100,000 square-foot facility houses labs for NASA’s ongoing research efforts, microbiology/microbial ecology studies and analytical chemistry labs. Also calling the new lab home are facilities for space flight-experiment and flight-hardware development, new plant growth chambers, and an Orbiter Environment Simulator that will be used to conduct ground control experiments in simulated flight conditions for space flight experiments. The SLS Lab, formerly known as the Space Experiment Research and Processing Laboratory or SERPL, provides space for NASA’s Life Sciences Services contractor Dynamac Corporation, Bionetics Corporation, and researchers from the University of Florida. NASA’s Office of Biological and Physical Research will use the facility for processing life sciences experiments that will be conducted on the International Space Station. The SLS Lab is the magnet facility for the International Space Research Park at KSC being developed in partnership with Florida Space Authority.

    NASA Image and Video Library

    2004-01-05


  18. KENNEDY SPACE CENTER, FLA. -- Sharon Edney, with Dynamac Corp., checks the roots of green onions being grown hydroponically for study in the Space Life Sciences Lab. The 100,000 square-foot facility houses labs for NASA’s ongoing research efforts, microbiology/microbial ecology studies and analytical chemistry labs. Also calling the new lab home are facilities for space flight-experiment and flight-hardware development, new plant growth chambers, and an Orbiter Environment Simulator that will be used to conduct ground control experiments in simulated flight conditions for space flight experiments. The SLS Lab, formerly known as the Space Experiment Research and Processing Laboratory or SERPL, provides space for NASA’s Life Sciences Services contractor Dynamac Corporation, Bionetics Corporation, and researchers from the University of Florida. NASA’s Office of Biological and Physical Research will use the facility for processing life sciences experiments that will be conducted on the International Space Station. The SLS Lab is the magnet facility for the International Space Research Park at KSC being developed in partnership with Florida Space Authority.

    NASA Image and Video Library

    2004-01-05


  19. KENNEDY SPACE CENTER, FLA. -- Lanfang Levine, with Dynamac Corp., helps install a Dionex DX-500 IC/HPLC system in the Space Life Sciences Lab. The equipment will enable analysis of volatile compounds, such as from plants. The 100,000 square-foot facility houses labs for NASA’s ongoing research efforts, microbiology/microbial ecology studies and analytical chemistry labs. Also calling the new lab home are facilities for space flight-experiment and flight-hardware development, new plant growth chambers, and an Orbiter Environment Simulator that will be used to conduct ground control experiments in simulated flight conditions for space flight experiments. The SLS Lab, formerly known as the Space Experiment Research and Processing Laboratory or SERPL, provides space for NASA’s Life Sciences Services contractor Dynamac Corporation, Bionetics Corporation, and researchers from the University of Florida. NASA’s Office of Biological and Physical Research will use the facility for processing life sciences experiments that will be conducted on the International Space Station. The SLS Lab is the magnet facility for the International Space Research Park at KSC being developed in partnership with Florida Space Authority.

    NASA Image and Video Library

    2004-01-05


  20. KENNEDY SPACE CENTER, FLA. -- In the Space Life Sciences (SLS) Lab, Jan Bauer, with Dynamac Corp., weighs samples of onion tissue for processing in the elemental analyzer behind it. The equipment analyzes for carbon, hydrogen, nitrogen and sulfur. The 100,000 square-foot SLS houses labs for NASA’s ongoing research efforts, microbiology/microbial ecology studies and analytical chemistry labs. Also calling the new lab home are facilities for space flight-experiment and flight-hardware development, new plant growth chambers, and an Orbiter Environment Simulator that will be used to conduct ground control experiments in simulated flight conditions for space flight experiments. The SLS Lab, formerly known as the Space Experiment Research and Processing Laboratory or SERPL, provides space for NASA’s Life Sciences Services contractor Dynamac Corporation, Bionetics Corporation, and researchers from the University of Florida. NASA’s Office of Biological and Physical Research will use the facility for processing life sciences experiments that will be conducted on the International Space Station. The SLS Lab is the magnet facility for the International Space Research Park at KSC being developed in partnership with Florida Space Authority.

    NASA Image and Video Library

    2004-01-05


  1. KENNEDY SPACE CENTER, FLA. -- Sharon Edney, with Dynamac Corp., checks the growth of radishes being grown hydroponically for study in the Space Life Sciences Lab. The 100,000 square-foot facility houses labs for NASA’s ongoing research efforts, microbiology/microbial ecology studies and analytical chemistry labs. Also calling the new lab home are facilities for space flight-experiment and flight-hardware development, new plant growth chambers, and an Orbiter Environment Simulator that will be used to conduct ground control experiments in simulated flight conditions for space flight experiments. The SLS Lab, formerly known as the Space Experiment Research and Processing Laboratory or SERPL, provides space for NASA’s Life Sciences Services contractor Dynamac Corporation, Bionetics Corporation, and researchers from the University of Florida. NASA’s Office of Biological and Physical Research will use the facility for processing life sciences experiments that will be conducted on the International Space Station. The SLS Lab is the magnet facility for the International Space Research Park at KSC being developed in partnership with Florida Space Authority.

    NASA Image and Video Library

    2004-01-05


  2. An investigation of the effects of relevant samples and a comparison of verification versus discovery based lab design

    NASA Astrophysics Data System (ADS)

    Rieben, James C., Jr.

    This study focuses on the effects of relevance and lab design on student learning within the chemistry laboratory environment. A general chemistry conductivity of solutions experiment and an upper level organic chemistry cellulose regeneration experiment were employed. In the conductivity experiment, the two main variables studied were the effect of relevant (or "real world") samples on student learning and a verification-based lab design versus a discovery-based lab design. With the cellulose regeneration experiment, the effect of a discovery-based lab design vs. a verification-based lab design was the sole focus. Evaluation surveys consisting of six questions were used at three different times to assess student knowledge of experimental concepts. In the general chemistry laboratory portion of this study, four experimental variants were employed to investigate the effect of relevance and lab design on student learning. These variants consisted of a traditional (or verification) lab design, a traditional lab design using "real world" samples, a new lab design employing real world samples/situations using unknown samples, and the new lab design using real world samples/situations that were known to the student. Data used in this analysis were collected during the Fall 08, Winter 09, and Fall 09 terms. For the second part of this study a cellulose regeneration experiment was employed to investigate the effects of lab design. A demonstration creating regenerated cellulose "rayon" was modified and converted to an efficient and low-waste experiment. In the first variant students tested their products and verified a list of physical properties. In the second variant, students filled in a blank physical property chart with their own experimental results for the physical properties. 
Results from the conductivity experiment show significant student learning of the effects of concentration on conductivity and how to use conductivity to differentiate solution types with the use of real world samples. In the organic chemistry experiment, results suggest that the discovery-based design improved student retention of the chain length differentiation by physical properties relative to the verification-based design.

  3. New weather depiction technology for night vision goggle (NVG) training: 3D virtual/augmented reality scene-weather-atmosphere-target simulation

    NASA Astrophysics Data System (ADS)

    Folaron, Michelle; Deacutis, Martin; Hegarty, Jennifer; Vollmerhausen, Richard; Schroeder, John; Colby, Frank P.

    2007-04-01

    US Navy and Marine Corps pilots receive Night Vision Goggle (NVG) training as part of their overall training to maintain the superiority of our forces. This training must incorporate realistic targets, backgrounds, and representative atmospheric and weather effects they may encounter under operational conditions. An approach for pilot NVG training is to use the Night Imaging and Threat Evaluation Laboratory (NITE Lab) concept. The NITE Labs utilize a 10' by 10' static terrain model equipped with both natural and cultural lighting, used to demonstrate various illumination conditions and visual phenomena that might be experienced when using night vision goggles. With this technology, the military can safely, systematically, and reliably expose pilots to the large number of potentially dangerous environmental conditions that will be experienced in their NVG training flights. A previous SPIE presentation described our work for NAVAIR to add realistic atmospheric and weather effects to the NVG NITE Lab training facility using the NVG-WDT (Weather Depiction Technology) system (Colby, et al.). NVG-WDT consists of a high-end multiprocessor server with weather simulation software and several fixed and goggle-mounted Heads-Up Displays (HUDs). Atmospheric and weather effects are simulated using state-of-the-art computer codes such as the WRF (Weather Research & Forecasting) model and the US Air Force Research Laboratory MODTRAN radiative transport model. Imagery for a variety of natural and man-made obscurations (e.g., rain, clouds, snow, dust, smoke, chemical releases) is being calculated and injected into the scene observed through the NVG via the fixed and goggle-mounted HUDs. This paper expands on the work described in the previous presentation and describes the 3D Virtual/Augmented Reality Scene-Weather-Atmosphere-Target Simulation part of the NVG-WDT. 
The 3D virtual reality software is a complete simulation system that generates realistic target-background scenes and displays the results in a DirectX environment. This paper describes our approach and shows a brief demonstration of the software capabilities. The work is supported by the SBIR program under contract N61339-06-C-0113.

  4. Attracting STEM talent: do STEM students prefer traditional or work/life-interaction labs?

    PubMed

    DeFraine, William C; Williams, Wendy M; Ceci, Stephen J

    2014-01-01

    The demand for employees trained in science, technology, engineering, and mathematics (STEM) fields continues to increase, yet the number of Millennial students pursuing STEM is not keeping pace. We evaluated whether this shortfall is associated with Millennials' preference for flexibility and work/life-interaction in their careers, a preference that may be inconsistent with the traditional idea of a science career endorsed by many lab directors. Two contrasting approaches to running STEM labs and training students were explored, and we created a lab recruitment video depicting each. The work-focused video emphasized the traditional notions of a science lab, characterized by long work hours and a focus on individual achievement and conducting research above all else. In contrast, the work/life-interaction-focused video emphasized a more progressive view: lack of demarcation between work and non-work lives, flexible hours, and group achievement. In Study 1, 40 professors rated the videos, and the results confirmed that the two lab types reflected meaningful real-world differences in training approaches. In Study 2, we recruited 53 current and prospective graduate students in STEM fields who displayed high math-identification and a commitment to science careers. In a between-subjects design, they watched one of the two lab-recruitment videos, and then reported their anticipated sense of belonging to and desire to participate in the lab depicted in the video. Very large effects were observed on both primary measures: Participants who watched the work/life-interaction-focused video reported a greater sense of belonging to (d = 1.49) and desire to participate in (d = 1.33) the lab, relative to participants who watched the work-focused video. These results suggest Millennials possess a strong desire for work/life-interaction, which runs counter to the traditional lab-training model endorsed by many lab directors. 
We discuss implications of these findings for STEM recruitment.
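
The reported effect sizes are Cohen's d values: the difference between group means scaled by the pooled standard deviation. A minimal illustrative computation (our own sketch with made-up ratings, not the authors' analysis code):

```python
import math

def cohens_d(group1, group2):
    """Cohen's d: difference of means divided by the pooled standard deviation."""
    n1, n2 = len(group1), len(group2)
    m1, m2 = sum(group1) / n1, sum(group2) / n2
    v1 = sum((x - m1) ** 2 for x in group1) / (n1 - 1)  # sample variances
    v2 = sum((x - m2) ** 2 for x in group2) / (n2 - 1)
    pooled_sd = math.sqrt(((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd

# e.g. two hypothetical samples of belonging ratings
print(round(cohens_d([5, 6, 7, 8], [4, 5, 6, 7]), 3))  # prints 0.775
```

By this convention, the d = 1.49 and d = 1.33 reported above correspond to mean differences of well over one pooled standard deviation, which is why they are described as very large effects.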

  5. Co-segregation of hyperactivity, active coping styles, and cognitive dysfunction in mice selectively bred for low levels of anxiety.

    PubMed

    Yen, Yi-Chun; Anderzhanova, Elmira; Bunck, Mirjam; Schuller, Julia; Landgraf, Rainer; Wotjak, Carsten T

    2013-01-01

    We established mouse models of extremes in trait anxiety, based on selective breeding for low vs. normal vs. high open-arm exploration on the elevated plus-maze. Genetically selected low anxiety-related behavior (LAB) coincided with hyperactivity in the home cage. Given that several psychiatric disorders such as schizophrenia, mania, and attention deficit hyperactivity disorder (ADHD) share hyperactivity as a symptom, we systematically examined LAB mice with respect to unique and overlapping endophenotypes of the three diseases. To this end, Venn diagrams were used as an instrument for discriminating among possible models: we arranged the endophenotypes in Venn diagrams and translated them into different behavioral tests. LAB mice showed elevated levels of locomotion in the open field (OF) test with deficits in habituation, compared to mice bred for normal (NAB) and high anxiety-related behavior (HAB). Cross-breeding of hypoactive HAB and hyperactive LAB mice resulted in offspring showing a low level of locomotion comparable to HAB mice, indicating that the HAB alleles are dominant over LAB alleles in determining the level of locomotion. In a holeboard test, LAB mice spent less time in hole exploration, as shown in patients with schizophrenia and ADHD; however, LAB mice displayed no impairments in social interaction and prepulse inhibition (PPI), making LAB unlikely as an animal model of schizophrenia. Although LAB mice displayed hyperarousal, active coping styles, and cognitive deficits, symptoms shared by mania and ADHD, they failed to reveal the classic manic endophenotypes, such as increased hedonia and object interaction. The neuroleptic haloperidol reduced locomotor activity in all mouse lines. The mood stabilizer lithium and the psychostimulant amphetamine, in contrast, selectively reduced hyperactivity in LAB mice. 
Based on the behavioral and pharmacological profiles, LAB mice are suggested as a novel rodent model of ADHD-like symptoms.

  6. Attracting STEM Talent: Do STEM Students Prefer Traditional or Work/Life-Interaction Labs?

    PubMed Central

    DeFraine, William C.; Williams, Wendy M.; Ceci, Stephen J.

    2014-01-01

    The demand for employees trained in science, technology, engineering, and mathematics (STEM) fields continues to increase, yet the number of Millennial students pursuing STEM is not keeping pace. We evaluated whether this shortfall is associated with Millennials' preference for flexibility and work/life-interaction in their careers, a preference that may be inconsistent with the traditional idea of a science career endorsed by many lab directors. Two contrasting approaches to running STEM labs and training students were explored, and we created a lab recruitment video depicting each. The work-focused video emphasized the traditional notions of a science lab, characterized by long work hours and a focus on individual achievement and conducting research above all else. In contrast, the work/life-interaction-focused video emphasized a more progressive view: lack of demarcation between work and non-work lives, flexible hours, and group achievement. In Study 1, 40 professors rated the videos, and the results confirmed that the two lab types reflected meaningful real-world differences in training approaches. In Study 2, we recruited 53 current and prospective graduate students in STEM fields who displayed high math-identification and a commitment to science careers. In a between-subjects design, they watched one of the two lab-recruitment videos, and then reported their anticipated sense of belonging to and desire to participate in the lab depicted in the video. Very large effects were observed on both primary measures: Participants who watched the work/life-interaction-focused video reported a greater sense of belonging to (d = 1.49) and desire to participate in (d = 1.33) the lab, relative to participants who watched the work-focused video. These results suggest Millennials possess a strong desire for work/life-interaction, which runs counter to the traditional lab-training model endorsed by many lab directors. 
We discuss implications of these findings for STEM recruitment. PMID:24587044

  7. Co-segregation of hyperactivity, active coping styles, and cognitive dysfunction in mice selectively bred for low levels of anxiety

    PubMed Central

    Yen, Yi-Chun; Anderzhanova, Elmira; Bunck, Mirjam; Schuller, Julia; Landgraf, Rainer; Wotjak, Carsten T.

    2013-01-01

    We established mouse models of extremes in trait anxiety, based on selective breeding for low vs. normal vs. high open-arm exploration on the elevated plus-maze. Genetically selected low anxiety-related behavior (LAB) coincided with hyperactivity in the home cage. Given that several psychiatric disorders such as schizophrenia, mania, and attention deficit hyperactivity disorder (ADHD) share hyperactivity as a symptom, we systematically examined LAB mice with respect to unique and overlapping endophenotypes of the three diseases. To this end, Venn diagrams were used as an instrument for discriminating among possible models: we arranged the endophenotypes in Venn diagrams and translated them into different behavioral tests. LAB mice showed elevated levels of locomotion in the open field (OF) test with deficits in habituation, compared to mice bred for normal (NAB) and high anxiety-related behavior (HAB). Cross-breeding of hypoactive HAB and hyperactive LAB mice resulted in offspring showing a low level of locomotion comparable to HAB mice, indicating that the HAB alleles are dominant over LAB alleles in determining the level of locomotion. In a holeboard test, LAB mice spent less time in hole exploration, as shown in patients with schizophrenia and ADHD; however, LAB mice displayed no impairments in social interaction and prepulse inhibition (PPI), making LAB unlikely as an animal model of schizophrenia. Although LAB mice displayed hyperarousal, active coping styles, and cognitive deficits, symptoms shared by mania and ADHD, they failed to reveal the classic manic endophenotypes, such as increased hedonia and object interaction. The neuroleptic haloperidol reduced locomotor activity in all mouse lines. The mood stabilizer lithium and the psychostimulant amphetamine, in contrast, selectively reduced hyperactivity in LAB mice. 
Based on the behavioral and pharmacological profiles, LAB mice are suggested as a novel rodent model of ADHD-like symptoms. PMID:23966915

  8. ERLN Lab Compendium Fact Sheet

    EPA Pesticide Factsheets

    The Compendium is an online database of environmental testing laboratories nationwide. It enables labs to create profiles of their capabilities, so emergency responders can quickly identify a lab that will meet their support needs.

  9. The lithosphere-asthenosphere boundary beneath the South Island of New Zealand

    NASA Astrophysics Data System (ADS)

    Hua, Junlin; Fischer, Karen M.; Savage, Martha K.

    2018-02-01

    Lithosphere-asthenosphere boundary (LAB) properties beneath the South Island of New Zealand have been imaged by Sp receiver function common-conversion point stacking. In this transpressional boundary between the Australian and Pacific plates, dextral offset on the Alpine fault and convergence have occurred for the past 20 My, with the Alpine fault now bounded by Australian plate subduction to the south and Pacific plate subduction to the north. Using data from onland seismometers, especially the 29 broadband stations of the New Zealand permanent seismic network (GeoNet), we obtained 24,971 individual receiver functions by extended-time multi-taper deconvolution, and mapped them to three-dimensional space using a Fresnel zone approximation. Pervasive strong positive Sp phases are observed in the LAB depth range indicated by surface wave tomography. These phases are interpreted as conversions from a velocity decrease across the LAB. In the central South Island, the LAB is observed to be deeper and broader to the northwest of the Alpine fault. The deeper LAB to the northwest of the Alpine fault is consistent with models in which oceanic lithosphere attached to the Australian plate was partially subducted, or models in which the Pacific lithosphere has been underthrust northwest past the Alpine fault. Further north, a zone of thin lithosphere with a strong and vertically localized LAB velocity gradient occurs to the northwest of the fault, juxtaposed against a region of anomalously weak LAB conversions to the southeast of the fault. This structure could be explained by lithospheric blocks with contrasting LAB properties that meet beneath the Alpine fault, or by the effects of Pacific plate subduction. The observed variations in LAB properties indicate strong modification of the LAB by the interplay of convergence and strike-slip deformation along and across this transpressional plate boundary.

  10. Linear alkylbenzenes as tracers of sewage-sludge-derived inputs of organic matter, PCBs, and PAHs to sediments at the 106-mile deep water disposal site

    USGS Publications Warehouse

    Lamoureux, E.M.; Brownawell, Bruce J.; Bothner, Michael H.

    1996-01-01

    Linear alkylbenzenes (LABs) are sensitive source-specific tracers of sewage inputs to the marine environment. Because they are highly particle reactive and nonspecifically sorbed to organic matter, LABs are potential tracers of the transport of both sludge-derived organic matter and other low-solubility hydrophobic contaminants (e.g., PCBs and PAHs); sediment trap studies at the 106-Mile Site have shown LABs to be valuable in testing models of sludge deposition to the sea floor. In this study we report on the distributions of LABs, PCBs, PAHs, and Ag in surface sediments collected within a month of the complete cessation of dumping (July 1992) in the vicinity of the dump site. Total LAB concentrations were lower than those measured by Takada and coworkers in samples from nearby sites collected in 1989. LABs from both studies appear to be significantly depleted (6- to 25-fold) in surface sediments relative to excess Ag (another sludge tracer) when compared to sewage sludge and sediment trap compositions. Comparison of LAB sediment inventories to model predictions of sludge particle fluxes supports the contention that LABs have been lost from the bed. The use of LABs to examine the short- or long-term fate of sludge-derived materials in deep-sea sediments should therefore be questioned. The causes of this LAB depletion are unclear at this point, and we discuss several hypotheses. The concentrations of total PCBs and PAHs are both correlated with sludge tracers, suggesting that there may be a measurable contribution of sludge-derived inputs on top of other nonpoint sources of these contaminant classes. This possibility is consistent with the composition of these contaminants determined in recent and historical analyses of sewage sludge.

  11. Education and Outreach in the Life Sciences: Qualitative Analysis Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burbank, Roberta L.; John, Lisa; Mahy, Heidi A.

    The DOE's National Nuclear Security Administration (NNSA) asked Pacific Northwest National Laboratory (PNNL) to consider the role of individual scientists in upholding safety and security. The views of scientists were identified as being a critical component of this policy process. Therefore, scientists, managers, and representatives of Institutional Biosafety Committees (IBCs) at the national labs were invited to participate in a brief survey and a set of focus groups. In addition, three focus groups were conducted with scientists, managers, and IBC representatives to discuss some of the questions related to education, outreach, and codes of conduct in further detail and gather additional input on biosecurity and dual-use awareness at the laboratories. The overall purpose of this process was to identify concerns related to these topics and to gather suggestions for creating an environment where both the scientific enterprise and national security are enhanced.

  12. The alliance in a friendship coaching intervention for parents of children with ADHD.

    PubMed

    Lerner, Matthew D; Mikami, Amori Yee; McLeod, Bryce D

    2011-09-01

    The alliance between parent and therapist was observed in a group-based parent-training intervention to improve social competency among children with attention-deficit/hyperactivity disorder (ADHD). The intervention, called Parental Friendship Coaching (PFC), was delivered to 32 parents in small groups as part of a randomized clinical trial. PFC was delivered to parents in eight 90-minute sessions; there was no child treatment component. Observed parent-therapist alliance, recorded among 27 of the parents, was measured using the Therapy Process Observational Coding System-Alliance scale (TPOCS-A; McLeod, 2005). Early alliance and change in alliance over time predicted improvements in several parenting behaviors and child outcomes, including peer sociometrics in a lab-based playgroup. These preliminary findings lend support to the importance of examining the parent-therapist alliance in parent-training groups for youth social and behavioral problems. Copyright © 2011. Published by Elsevier Ltd.

  13. A 2.5D Computational Method to Simulate Cylindrical Fluidized Beds

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Tingwen; Benyahia, Sofiane; Dietiker, Jeff

    2015-02-17

    In this paper, the limitations of axisymmetric and Cartesian two-dimensional (2D) simulations of cylindrical gas-solid fluidized beds are discussed. A new method is proposed to carry out pseudo-two-dimensional (2.5D) simulations of a cylindrical fluidized bed by appropriately combining the computational domains of Cartesian 2D and axisymmetric simulations. The proposed method was implemented in the open-source code MFIX and applied to the simulation of a lab-scale bubbling fluidized bed, with the necessary sensitivity studies. After a careful grid study to ensure that the numerical results are grid-independent, detailed comparisons of the flow hydrodynamics were presented against axisymmetric and Cartesian 2D simulations. Furthermore, the 2.5D simulation results were compared to a three-dimensional (3D) simulation for evaluation. This new approach yields better agreement with the 3D simulation results than either the axisymmetric or the Cartesian 2D simulations.

  14. Software Verification of Orion Cockpit Displays

    NASA Technical Reports Server (NTRS)

    Biswas, M. A. Rafe; Garcia, Samuel; Prado, Matthew; Hossain, Sadad; Souris, Matthew; Morin, Lee

    2017-01-01

    NASA's latest spacecraft, Orion, is in development to take humans deeper into space. Orion is equipped with three main displays to monitor and control the spacecraft. To ensure that the software behind the glass displays operates without faults, rigorous testing is needed. To conduct such testing, the Rapid Prototyping Lab at NASA's Johnson Space Center, along with the University of Texas at Tyler, employed a software verification tool, EggPlant Functional by TestPlant. It is an image-based test automation tool that allows users to create scripts to verify the functionality within a program. An edge-key framework and a set of Common EggPlant Functions were developed to enable efficient creation of scripts. This framework standardized the way user inputs are coded and simulated in the verification process. Moreover, the Common EggPlant Functions can be reused in the verification of different displays.
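
The pattern described above, a small library of shared verification functions that standardizes how scripts check display elements and simulate user inputs, can be sketched roughly as follows. This is a hypothetical Python stand-in, not EggPlant's actual SenseTalk API; `DisplayUnderTest`, `verify_element`, and `run_edge_key_sequence` are illustrative names:

```python
class DisplayUnderTest:
    """Stand-in for a glass display: tracks which screen elements are visible."""
    def __init__(self, visible_elements):
        self.visible = set(visible_elements)
        self.log = []

    def find_image(self, name):
        return name in self.visible   # a real tool would do image matching here

    def press(self, key):
        self.log.append(key)          # a real tool would inject the user input

def verify_element(display, image_name):
    """Common function: check that an expected display element is on screen."""
    found = display.find_image(image_name)
    print(f"{image_name}: {'PASS' if found else 'FAIL'}")
    return found

def run_edge_key_sequence(display, keys, expected_image):
    """Common function: simulate edge-key presses, then verify the resulting screen."""
    for key in keys:
        display.press(key)
    return verify_element(display, expected_image)

display = DisplayUnderTest({"home_screen", "eps_summary"})
print(run_edge_key_sequence(display, ["EDGE_1"], "eps_summary"))  # eps_summary: PASS, then True
```

The point of such a layer is that individual display-verification scripts stay short and uniform: each one only lists the inputs to simulate and the images to expect.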

  15. G = MAT: linking transcription factor expression and DNA binding data.

    PubMed

    Tretyakov, Konstantin; Laur, Sven; Vilo, Jaak

    2011-01-31

    Transcription factors are proteins that bind to motifs on the DNA and thus affect gene expression regulation. The qualitative description of the corresponding processes is therefore important for a better understanding of essential biological mechanisms. However, wet lab experiments targeted at the discovery of the regulatory interplay between transcription factors and binding sites are expensive. We propose a new, purely computational method for finding putative associations between transcription factors and motifs. This method is based on a linear model that combines sequence information with expression data. We present various methods for model parameter estimation and show, via experiments on simulated data, that these methods are reliable. Finally, we examine the performance of this model on biological data and conclude that it can indeed be used to discover meaningful associations. The developed software is available as a web tool and Scilab source code at http://biit.cs.ut.ee/gmat/.
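
The linear model can be made concrete with a toy example: if E is a genes × samples expression matrix, M a genes × motifs count matrix, and T a TFs × samples expression matrix, one simple formulation models E ≈ M·A·T and estimates the motif-TF association matrix A by least squares. The sketch below uses our own notation and a pseudoinverse estimator, not necessarily the authors' exact formulation, and recovers a known A from simulated data:

```python
import numpy as np

rng = np.random.default_rng(0)
n_genes, n_motifs, n_tfs, n_samples = 60, 4, 3, 25

# Simulated sequence information (motif counts) and TF expression.
M = rng.poisson(2.0, size=(n_genes, n_motifs)).astype(float)
T = rng.normal(size=(n_tfs, n_samples))

# Ground-truth motif-TF associations and the implied (slightly noisy) expression.
A_true = rng.normal(size=(n_motifs, n_tfs))
E = M @ A_true @ T + 0.01 * rng.normal(size=(n_genes, n_samples))

# Least-squares estimate: A_hat = pinv(M) @ E @ pinv(T).
A_hat = np.linalg.pinv(M) @ E @ np.linalg.pinv(T)
print(np.max(np.abs(A_hat - A_true)) < 0.1)  # associations recovered closely
```

On simulated data like this the estimator is reliable, which mirrors the paper's observation that parameter estimation can be validated on simulations before being applied to biological data.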

  16. G = MAT: Linking Transcription Factor Expression and DNA Binding Data

    PubMed Central

    Tretyakov, Konstantin; Laur, Sven; Vilo, Jaak

    2011-01-01

    Transcription factors are proteins that bind to motifs on the DNA and thus affect gene expression regulation. The qualitative description of the corresponding processes is therefore important for a better understanding of essential biological mechanisms. However, wet lab experiments targeted at the discovery of the regulatory interplay between transcription factors and binding sites are expensive. We propose a new, purely computational method for finding putative associations between transcription factors and motifs. This method is based on a linear model that combines sequence information with expression data. We present various methods for model parameter estimation and show, via experiments on simulated data, that these methods are reliable. Finally, we examine the performance of this model on biological data and conclude that it can indeed be used to discover meaningful associations. The developed software is available as a web tool and Scilab source code at http://biit.cs.ut.ee/gmat/. PMID:21297945

  17. Quantitative criteria for assessment of gamma-ray imager performance

    NASA Astrophysics Data System (ADS)

    Gottesman, Steve; Keller, Kristi; Malik, Hans

    2015-08-01

    In recent years, gamma-ray imagers such as the GammaCam™ and Polaris have demonstrated good imaging performance in the field. Imager performance is often summarized as "resolution", either angular or spatial at some distance from the imager; however, the definition of resolution is not always related to the ability to image an object. It is difficult to quantitatively compare imagers without a common definition of image quality. This paper examines three categories of definition: point source, line source, and area source. It discusses the details of those definitions and which ones are more relevant for different situations. Metrics such as Full Width at Half Maximum (FWHM), variations on the Rayleigh criterion, and metrics analogous to the National Imagery Interpretability Rating Scale (NIIRS) are discussed. Performance against these metrics is evaluated for a high-resolution coded-aperture imager modeled using Monte Carlo N-Particle (MCNP), and for a medium-resolution imager measured in the lab.
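
Of the metrics mentioned, FWHM is the simplest to make concrete: the width of a point-source response at half its peak value. A minimal sketch (our own helper using linear interpolation between samples, not the paper's code):

```python
import math

def fwhm(xs, ys):
    """Full Width at Half Maximum of a single sampled peak.

    Locates the two half-maximum crossings by linear interpolation."""
    half = max(ys) / 2.0
    crossings = []
    for i in range(len(ys) - 1):
        y0, y1 = ys[i], ys[i + 1]
        if (y0 - half) * (y1 - half) < 0:   # sign change brackets a crossing
            t = (half - y0) / (y1 - y0)      # fraction of the way across the step
            crossings.append(xs[i] + t * (xs[i + 1] - xs[i]))
    return crossings[-1] - crossings[0]

# Demo: a Gaussian response with sigma = 1 has FWHM = 2*sqrt(2*ln 2) ≈ 2.355
xs = [i * 0.01 - 5.0 for i in range(1001)]
ys = [math.exp(-x * x / 2.0) for x in xs]
print(round(fwhm(xs, ys), 3))  # prints 2.355
```

The paper's point is that a single number like this describes the response to a point source but not necessarily the interpretability of an extended object, which is why line-source and area-source criteria are also considered.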

  18. The QUANTGRID Project (RO)—Quantum Security in GRID Computing Applications

    NASA Astrophysics Data System (ADS)

    Dima, M.; Dulea, M.; Petre, M.; Petre, C.; Mitrica, B.; Stoica, M.; Udrea, M.; Sterian, R.; Sterian, P.

    2010-01-01

    The QUANTGRID Project, financed through the National Center for Programme Management (CNMP-Romania), is the first attempt at using Quantum Crypted Communications (QCC) in large-scale operations, such as GRID computing, and conceivably in the years ahead in the banking sector and other security-tight communications. In relation to the GRID activities of the Center for Computing & Communications (Nat.'l Inst. Nucl. Phys.—IFIN-HH), the Quantum Optics Lab (Nat.'l Inst. Plasma and Lasers—INFLPR), and the Physics Dept. (University Polytechnica—UPB), the project will build a demonstrator infrastructure for this technology. The status of the project in its incipient phase is reported, featuring tests for communications in classical security mode: socket-level communications under AES (Advanced Encryption Standard), both implemented in proprietary C++ code. An outline of the planned undertaking of the project is communicated, highlighting its impact on quantum physics, coherent optics, and information technology.

  19. Active imaging systems to perform the strategic surveillance of an aircraft environment in bad weather conditions

    NASA Astrophysics Data System (ADS)

    Riviere, Nicolas; Hespel, Laurent; Ceolato, Romain; Drouet, Florence

    2011-11-01

    Onera, the French Aerospace Lab, develops and models active imaging systems to understand the relevant physical phenomena affecting their performance. As a consequence, efforts have been devoted both to the propagation of a pulse through the atmosphere (scintillation and turbulence effects) and to target geometries and their surface properties (radiometric and speckle effects). These imaging systems must operate at night, in all ambient illumination and weather conditions, in order to perform strategic surveillance of the environment for various worldwide operations or to perform enhanced navigation of an aircraft. Onera has implemented codes for 2D and 3D laser imaging systems. Because we aim to image a scene even in the presence of rain, snow, fog, or haze, Onera introduces such meteorological effects into these numerical models and compares simulated images with measurements provided by commercial imaging systems.

  20. Using PVM to host CLIPS in distributed environments

    NASA Technical Reports Server (NTRS)

    Myers, Leonard; Pohl, Kym

    1994-01-01

    It is relatively easy to enhance CLIPS (C Language Integrated Production System) to support multiple expert systems running in a distributed environment with heterogeneous machines. The task is minimized by using the PVM (Parallel Virtual Machine) code from Oak Ridge Labs to provide the distributed utility. PVM is a library of C and FORTRAN subprograms that supports distributed computing on many different UNIX platforms. A PVM daemon is easily installed on each CPU that enters the virtual machine environment. Any user with rsh or rexec access to a machine can use the one PVM daemon to obtain a generous set of distributed facilities. The ready availability of both CLIPS and PVM makes the combination of software particularly attractive for budget-conscious experimentation with heterogeneous distributed computing using multiple CLIPS executables. This paper presents a design that is sufficient to provide essential message-passing functions in CLIPS and enable the full range of PVM facilities.
